Concurrent, Parallel, Networked, Distributed Computing
Table of Contents
1 Too Many Adjectives?
- The adjectives Sequential, Concurrent, Parallel, Networked, Distributed are applicable to Computing.
- What do they mean?
- Are the definitions of Concurrent and Parallel parochial?
2 Sequential Computing
- Recall our definitions of `;`, `||`, and `[]`.
- Assume: Process creation operations are unavailable.
- Assume: Operators `||` and `[]` do not occur.
- Assume: Underlying processor has just one core.
- Assume: Not aware of other processes.
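Under these assumptions a program is just a fixed order of steps. A minimal Go sketch of what the sequential model permits (the variable names are illustrative only):

```go
package main

import "fmt"

// Purely sequential computing: statements compose with ";",
// there is no process creation, no || and no [], and the
// single core executes each step strictly after the previous one.
func main() {
	x := 1         // first action
	x = x + 2      // runs only after the assignment above completes
	fmt.Println(x) // always prints 3; the order is fully determined
}
```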
3 Networked Computing
- Processes communicate only via send and receive of messages.
- Processes do not have shared state/memory.
- How do they "know" each other?
- IP addresses, DNS, UDP/TCP port numbers, …
- RPC, RMI, … discovery mechanisms
- Independent lives? Cooperative? Contentious? Malicious?
- Loss of messages? Possible.
- Order of sends = Order of receives? May not be.
- Design and implement SMP (Synchronous Message Passing) only.
- Some nodes may provide AMP services based on SMP.
- Scalable? Yes, but …
- Nondeterminism (our fat-bar)
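As one concrete illustration, here is a minimal Go sketch of two such processes: they share no memory and interact only by sending and receiving bytes over TCP. The loopback address and port are arbitrary choices for this sketch, not part of any prescribed design.

```go
package main

import (
	"bufio"
	"fmt"
	"net"
)

func main() {
	ln, err := net.Listen("tcp", "127.0.0.1:9000")
	if err != nil {
		panic(err)
	}
	defer ln.Close()

	// "Server" node: receive one message, send a reply.
	go func() {
		conn, err := ln.Accept()
		if err != nil {
			return
		}
		defer conn.Close()
		msg, _ := bufio.NewReader(conn).ReadString('\n')
		fmt.Fprintf(conn, "ack: %s", msg) // msg still ends in '\n'
	}()

	// "Client" node: send one message, receive the reply.
	conn, err := net.Dial("tcp", "127.0.0.1:9000")
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	fmt.Fprintln(conn, "hello")
	reply, _ := bufio.NewReader(conn).ReadString('\n')
	fmt.Print(reply) // "ack: hello"
}
```

In a real networked setting the two ends would be separate programs on separate hosts; here they share one process only to keep the sketch self-contained.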
4 Distributed Computing
- Networked Computing + "Synergy"
4.1 Look Up Some Words
- Contemporary
- Simultaneous
- walk and chew gum
- Sharing and Ownership
- Synergy
4.2 Synergy-1
- syn·er·gy noun \ˈsi-nər-jē\ : the increased effectiveness that results when two or more people or businesses work together. Plural: synergies.
- synergism; broadly: combined action or operation
- a mutually advantageous conjunction or compatibility of distinct business participants or elements (as resources or efforts)
- Examples of SYNERGY:
- A synergy has developed among the different groups working on this project.
- two companies that have found synergy
4.3 Synergy-2
- Noun 1. synergy - the working together of two things (muscles or drugs for example) to produce an effect greater than the sum of their individual effects
- potentiation = (medicine) the synergistic effect of two drugs given simultaneously
4.4 Distributed Computing, contd.
- Networked Computing + Synergy
- Processes communicate only via send and receive.
- Loss of messages? No.
- Order of sends = Order of receives? Yes?
- Processes do not have shared state/memory.
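One toy Go sketch of how such a layer can present "no loss, receives in send order" on top of a network that may reorder: tag every message with a sender-assigned sequence number and deliver in sequence. The types and names are illustrative, not a real protocol.

```go
package main

import (
	"fmt"
	"sort"
)

// Each message carries a sequence number assigned by the sender.
type msg struct {
	seq  int
	body string
}

func main() {
	// Suppose these arrived from the network out of order.
	arrived := []msg{{2, "world"}, {1, "hello"}, {3, "!"}}

	// Deliver to the application in send order.
	sort.Slice(arrived, func(i, j int) bool { return arrived[i].seq < arrived[j].seq })
	for _, m := range arrived {
		fmt.Println(m.seq, m.body)
	}
}
```

A real layer would also buffer across gaps and request retransmission of missing sequence numbers; the sort stands in for that delivery queue.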
4.5 Distributed Computing-1
- Independent lives? Yes;
- Cooperative? Yes;
- Contentious? No;
- Malicious? No.
4.6 Distributed Computing-2
- How do they "know" each other?
- Established naming protocols
- PL support
- Scalable? Yes. (Note the "but" part in networked computing.)
- Nondeterminism (our fat-bar)
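Go's select statement is one concrete realization of the fat-bar's nondeterministic choice: when several communications are ready, which branch runs is not determined. A minimal sketch:

```go
package main

import "fmt"

func main() {
	a := make(chan string, 1)
	b := make(chan string, 1)
	a <- "from a"
	b <- "from b"

	// Both channels are ready, so either branch may be chosen first;
	// the two output lines can appear in either order.
	for i := 0; i < 2; i++ {
		select {
		case m := <-a:
			fmt.Println(m)
		case m := <-b:
			fmt.Println(m)
		}
	}
}
```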
4.7 Distributed Shared Memory
- `x := a` by P1 vs. `x := b` by P2 vs. `read x` by P3
- Must P3 be able to see a and/or b? When?
- Are we assuming priorities among P1, P2, P3?
- Global clock?
- Unclear semantics; implied expectations of "memory"
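The P1/P2/P3 scenario above, sketched in Go with goroutines standing in for the distributed processes. Using atomic.Int64 keeps the sketch free of undefined data races while leaving the interleaving, and hence what P3 observes, open:

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

func main() {
	var x atomic.Int64 // the "shared memory" cell, initially 0
	const a, b = 1, 2

	var wg sync.WaitGroup
	wg.Add(3)
	go func() { defer wg.Done(); x.Store(a) }() // P1: x := a
	go func() { defer wg.Done(); x.Store(b) }() // P2: x := b
	go func() {                                 // P3: read x
		defer wg.Done()
		fmt.Println("P3 saw", x.Load()) // 0, a, or b, depending on scheduling
	}()
	wg.Wait()
}
```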
4.8 Distributed Shared Data
- Large, Conceptually One, "Data Structure". Databases.
- Partitioned and stored across many nodes.
- Migrate the partitions as needed.
- Replicate some partitions for reads, track writes
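A minimal Go sketch of the partitioning idea: one conceptual key space, with each key's owner determined by hashing. The node names are placeholders, and real systems add migration and replica tracking on top of this mapping.

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// The nodes across which the one conceptual data structure is partitioned.
var nodes = []string{"node-0", "node-1", "node-2"}

// partition maps a key to the node that currently owns it.
func partition(key string) string {
	h := fnv.New32a()
	h.Write([]byte(key))
	return nodes[h.Sum32()%uint32(len(nodes))]
}

func main() {
	for _, k := range []string{"alice", "bob", "carol"} {
		fmt.Println(k, "->", partition(k))
	}
}
```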
5 Parallel Computing
- What is described here is a consensus view.
- Does not assume the absence of the other computing models.
5.1 Parallel Computing
- Assumes multiple processors/cores.
- Assumes shared memory.
- Assumes lifetimes of processes overlap.
- Assumes arbitrarily fine granularity
- memory read vs. write vs. write by different cores
- E.g., `a += b` and simultaneously `c -= d`
- Could a be an alias for c? (See the sketch below.)
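A Go sketch of that aliasing question: if a and c name the same memory cell, plain += and -= from different cores race at instruction granularity (read, modify, write can interleave). Atomic adds are one way to make both interleavings well-defined; this is an illustration, not the only remedy.

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

func main() {
	var cell atomic.Int64 // a and c both alias this one cell
	cell.Store(10)
	const b, d = 5, 3

	var wg sync.WaitGroup
	wg.Add(2)
	go func() { defer wg.Done(); cell.Add(b) }()  // a += b
	go func() { defer wg.Done(); cell.Add(-d) }() // c -= d
	wg.Wait()
	fmt.Println(cell.Load()) // 12 in either interleaving, because each update is atomic
}
```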
6 Concurrent Computing
- What is described here is a consensus view.
- Does not assume the absence of the other computing models.
6.1 Concurrent Computing-1
- Permits but does not assume multiple processors/cores.
- Permits but does not assume shared memory.
- Permits but does not assume lifetimes of processes overlap.
- Assumes "not-so-fine" granularity of execution of basic instructions.
6.2 Concurrent Computing-2
- Higher level abstract view
- Mutual exclusion of code segments
- Synchronization of independent processes
- Send and receive messages
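The items above in one minimal Go sketch: a mutex gives mutual exclusion of a code segment, and a channel carries send/receive messages that also synchronize the otherwise independent goroutines:

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var mu sync.Mutex
	counter := 0
	done := make(chan int)

	for i := 0; i < 3; i++ {
		go func(id int) {
			mu.Lock() // mutual exclusion of this code segment
			counter++
			mu.Unlock()
			done <- id // send a message
		}(i)
	}
	for i := 0; i < 3; i++ {
		fmt.Println("received from goroutine", <-done) // receive
	}
	fmt.Println("counter =", counter) // always 3
}
```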
7 Exercises
- Distributed but not concurrent?
- Client-Server always distributed?
- Peer-to-peer only in distributed systems?
- RPC/RMI inherently client-server?
- Order of importance: Efficient, Correct, Symmetric, Scalable, Deadlock-free, Livelock-free, Starvation-free, …?
8 References
- Rob Pike, "Concurrency Is Not Parallelism", 2013. Video: http://www.youtube.com/watch?v=cN_DpYBzKso; slides: http://concur.rspace.googlecode.com/hg/talk/concur.html; interview: http://www.infoq.com/interviews/pike-concurrency. Rob Pike is a Distinguished Engineer at Google, Inc. Highly recommended that you watch it.
- Robert Harper, "Parallelism is not concurrency", 2011, http://existentialtype.wordpress.com/2011/03/17/. Robert Harper is a CS professor at CMU. Highly recommended reading.