CEG 7370 Distributed Computing | Slides

Concurrent, Parallel, Networked, Distributed Computing

Table of Contents

1 Too Many Adjectives?

  1. The adjectives Sequential, Concurrent, Parallel, Networked, Distributed are applicable to Computing.
  2. What do they mean?
  3. Are the definitions of Concurrent and Parallel parochial?

2 Sequential Computing

  1. Recall our definitions of ;, ||, and []. (A Go rendering is sketched after this list.)
  2. Assume: Process creation operations are unavailable.
  3. Assume: Operators || and [] do not occur.
  4. Assume: Underlying processor has just one core.
  5. Assume: Not aware of other processes.
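
A minimal sketch in Go, assuming the course's ; corresponds to ordinary statement sequencing, || to spawning processes (go statements), and [] to nondeterministic choice (select). A sequential program uses only the first:

    package main

    import "fmt"

    // Purely sequential: only ";" (statement sequencing) appears.
    // No "go" statements (||), no "select" ([]); one core suffices
    // and the result is deterministic.
    func main() {
        sum := 0
        for i := 1; i <= 10; i++ {
            sum += i // each step finishes before the next starts
        }
        fmt.Println("sum =", sum) // always prints 55
    }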

3 Networked Computing

  1. Processes communicate only via send and receive of messages.
  2. Processes do not have shared state/memory.
  3. How do they "know" each other?
    1. IP addresses, DNS, UDP/TCP port numbers, …
    2. RPC, RMI, … discovery mechanisms
  4. Independent lives? Cooperative? Contentious? Malicious?
  5. Loss of messages? Possible.
  6. Order of sends = Order of receives? May not be.
  7. Design and implement SMP (Synchronous Message Passing) only.
  8. Some nodes may provide AMP (Asynchronous Message Passing) services based on SMP.
  9. Scalable? Yes, but …
  10. Nondeterminism (our fat-bar; see the sketch below)
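
A hedged sketch of these ideas in Go: unbuffered channels stand in for network links (a real deployment would use sockets, where loss and reordering are possible), a send blocks until the matching receive (SMP), and select plays the role of the fat-bar.

    package main

    import "fmt"

    // Two "nodes" that interact only by send and receive; there
    // is no shared state. Unbuffered channels model SMP: a send
    // blocks until the matching receive is ready.
    func node(name string, out chan<- string) {
        out <- "hello from " + name
    }

    func main() {
        a, b := make(chan string), make(chan string)
        go node("P1", a)
        go node("P2", b)
        // The fat-bar: nondeterministically accept whichever
        // message is ready first.
        for i := 0; i < 2; i++ {
            select {
            case m := <-a:
                fmt.Println(m)
            case m := <-b:
                fmt.Println(m)
            }
        }
    }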

4 Distributed Computing

  1. Networked Computing + "Synergy"

4.1 Lookup Some Words

  1. Contemporary
  2. Simultaneous
    1. walk and chew gum
  3. Sharing and Ownership
  4. Synergy

4.2 Synergy-1

www.merriam-webster.com

  1. syn·er·gy noun \ˈsi-nər-jē\ : the increased effectiveness that results when two or more people or businesses work together. Plural: synergies.
  2. synergism; broadly: combined action or operation
  3. a mutually advantageous conjunction or compatibility of distinct business participants or elements (as resources or efforts)
  4. Examples of SYNERGY:
    1. A synergy has developed among the different groups working on this project.
    2. two companies that have found synergy

4.3 Synergy-2

http://wordnet.princeton.edu/

  1. Noun 1. synergy - the working together of two things (muscles or drugs for example) to produce an effect greater than the sum of their individual effects
  2. potentiation = (medicine) the synergistic effect of two drugs given simultaneously

4.4 Distributed Computing, contd.

  1. Networked Computing + Synergy
  2. Processes communicate only via send and receive.
    1. Loss of messages? No.
    2. Order of sends = Order of receives? Yes?
  3. Processes do not have shared state/memory.

4.5 Distributed Computing-1

  1. Independent lives? Yes;
  2. Cooperative? Yes;
  3. Contentious? No;
  4. Malicious? No.

4.6 Distributed Computing-2

  1. How do they "know" each other?
    1. Established naming protocols
    2. PL support
  2. Scalable? Yes. (Note the "but" part in networked computing.)
  3. Nondeterminism (our fat-bar)

4.7 Distributed Shared Memory

  1. x := a by P1 vs x := b by P2 vs read x by P3
  2. Must P3 be able to see a and/or b? When?
  3. Are we assuming priorities among P1, P2, P3?
  4. Global clock?
  5. Unclear semantics; implied expectations of "memory". (See the sketch below.)
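
The P1/P2/P3 question, sketched in Go with goroutines standing in for distributed nodes; the values 1 and 2 stand in for a and b. Even with race-free atomic accesses, which value P3 observes depends entirely on scheduling:

    package main

    import (
        "fmt"
        "sync"
        "sync/atomic"
    )

    func main() {
        var x atomic.Int64 // the "shared" variable x, initially 0
        var wg sync.WaitGroup
        wg.Add(3)
        go func() { defer wg.Done(); x.Store(1) }() // P1: x := a
        go func() { defer wg.Done(); x.Store(2) }() // P2: x := b
        go func() { // P3: read x -- may observe 0, 1, or 2
            defer wg.Done()
            fmt.Println("P3 read:", x.Load())
        }()
        wg.Wait()
    }

In a genuinely distributed setting the question is harder still: without a global clock, "when" P3 must see a or b has no obvious meaning.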

4.8 Distributed Shared Data

  1. A large, conceptually single "data structure"; e.g., databases.
  2. Partitioned and stored across many nodes.
  3. Migrate the partitions as needed.
  4. Replicate some partitions for reads; track writes. (A partitioning sketch follows.)
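
A minimal partitioning sketch in Go; the node names and the hash-mod placement rule are illustrative assumptions, not a prescription:

    package main

    import (
        "fmt"
        "hash/fnv"
    )

    // partition maps a key of the conceptually-one data structure
    // to one of n nodes by hashing.
    func partition(key string, n int) int {
        h := fnv.New32a()
        h.Write([]byte(key))
        return int(h.Sum32()) % n
    }

    func main() {
        nodes := []string{"node0", "node1", "node2"}
        for _, k := range []string{"alice", "bob", "carol"} {
            // Reads could be served by a replica of this partition;
            // writes must be tracked so replicas stay consistent.
            fmt.Printf("%s -> %s\n", k, nodes[partition(k, len(nodes))])
        }
    }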

5 Parallel Computing

  1. What is described here is a consensus view.
  2. Does not assume the absence of the other computing models.

5.1 Parallel Computing

  1. Assumes multiple processors/cores.
  2. Assumes shared memory.
  3. Assumes lifetimes of processes overlap.
  4. Assumes arbitrarily fine granularity
    1. memory read vs. write vs. write by different cores
    2. E.g., a += b and simultaneously c -= d
      1. Could a be an alias for c? (See the sketch below.)
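
The aliasing question, sketched in Go: two goroutines execute a += b and c -= d, where c aliases a. Without synchronization this is a data race with an undefined outcome; the mutex below makes the result well-defined:

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        x := 0
        a, c := &x, &x // c is an alias for a
        b, d := 5, 3
        var mu sync.Mutex
        var wg sync.WaitGroup
        wg.Add(2)
        go func() { defer wg.Done(); mu.Lock(); *a += b; mu.Unlock() }()
        go func() { defer wg.Done(); mu.Lock(); *c -= d; mu.Unlock() }()
        wg.Wait()
        fmt.Println("x =", x) // 2 with locking; unpredictable without
    }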

6 Concurrent Computing

  1. What is described here is a consensus view.
  2. Does not assume the absence of the other computing models.

6.1 Concurrent Computing-1

  1. Permits but does not assume multiple processors/cores.
  2. Permits but does not assume shared memory.
  3. Permits but does not assume lifetimes of processes overlap.
  4. Assumes "not-so-fine" granularity of execution of basic instructions.

6.2 Concurrent Computing-2

  1. Higher level abstract view
  2. Mutual exclusion of code segments
  3. Synchronization of independent processes
  4. Send and receive messages (all three are sketched below)
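
Items 2-4 in miniature, as a hedged Go sketch: a mutex for mutual exclusion of a code segment, a WaitGroup to synchronize independent goroutines, and a channel for send and receive:

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var mu sync.Mutex
        var wg sync.WaitGroup
        done := make(chan string, 2)
        counter := 0
        for i := 0; i < 2; i++ {
            wg.Add(1)
            go func(id int) {
                defer wg.Done()
                mu.Lock() // critical section: at most one goroutine inside
                counter++
                mu.Unlock()
                done <- fmt.Sprintf("worker %d finished", id) // send
            }(i)
        }
        wg.Wait()
        close(done)
        for m := range done { // receive
            fmt.Println(m)
        }
        fmt.Println("counter =", counter) // always 2
    }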

7 Exercises

  1. Distributed but not concurrent?
  2. Client-Server always distributed?
  3. Peer-to-peer only in distributed systems?
  4. RPC/RMI inherently client-server?
  5. Order of importance: Efficient, Correct, Symmetric, Scalable, Deadlock-free, Livelock-free, Starvation-free, …?

8 References

  1. Rob Pike, "Concurrency Is Not Parallelism", 2013. Video: http://www.youtube.com/watch?v=cN_DpYBzKso; slides: http://concur.rspace.googlecode.com/hg/talk/concur.html; interview: http://www.infoq.com/interviews/pike-concurrency. Rob Pike is a Distinguished Engineer at Google, Inc. Highly recommended that you watch.
  2. Robert Harper, "Parallelism is not concurrency", 2011, http://existentialtype.wordpress.com/2011/03/17/. Robert Harper is a CS professor at CMU. Highly recommended reading.

Copyright © 2014 pmateti@wright.edu | www.wright.edu/~pmateti