What is the difference between concurrency and parallelism?

清歌不尽 2020-11-22 00:21

What is the difference between concurrency and parallelism?

Examples are appreciated.

30 Answers
  • 2020-11-22 01:11

    concurrency: multiple execution flows with the potential to share resources.

    Ex: two threads competing for an I/O port.

    parallelism: splitting a problem into multiple similar chunks.

    Ex: parsing a big file by running two processes, each on one half of the file.
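A minimal sketch of the first definition in Python, using a lock-guarded counter to stand in for the contested I/O port (the names here are illustrative, not from the answer):

```python
import threading

counter = 0
lock = threading.Lock()  # serializes access to the shared resource

def worker(n):
    global counter
    for _ in range(n):
        with lock:       # the two threads compete for the lock: concurrency
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 20000: the lock made the competing updates safe
```

Without the lock, the two flows would still be concurrent, but the shared counter could be corrupted by interleaved updates.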

  • 2020-11-22 01:11

    Concurrency is the generalized form of parallelism. For example, a parallel program can also be called concurrent, but the reverse is not true.

    1. Concurrent execution is possible on a single processor (multiple threads, managed by a scheduler or thread pool).

    2. Parallel execution is not possible on a single processor; it requires multiple processors. (One process per processor.)

    3. Distributed computing is a related topic: it can also be called concurrent computing, but the reverse is not true, just as with parallelism.

    For details, read the research paper Concepts of Concurrent Programming.
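Points 1 and 2 can be sketched in Python; this is a minimal illustration, assuming a system where `multiprocessing` can spawn worker processes:

```python
from concurrent.futures import ThreadPoolExecutor
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == "__main__":
    # Point 1 -- concurrency: one process, multiple threads interleaved
    # by the scheduler; this works even on a single processor.
    with ThreadPoolExecutor(max_workers=4) as ex:
        concurrent_result = list(ex.map(square, range(8)))

    # Point 2 -- parallelism: separate worker processes that can run
    # simultaneously on distinct processors.
    with Pool(processes=4) as pool:
        parallel_result = pool.map(square, range(8))

    print(concurrent_result == parallel_result)  # True: same answer either way
```

Both paths compute the same result; only the execution strategy differs, which is exactly the point of the distinction.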

  • 2020-11-22 01:11

    Concurrency vs Parallelism

    Rob Pike, in 'Concurrency Is Not Parallelism':

    Concurrency is about dealing with lots of things at once.

    Parallelism is about doing lots of things at once.

    In terms of concurrency theory: concurrency handles waiting operations, while parallelism handles running work on multiple threads at once.

    [Image: Sync vs Async — my vision of concurrency and parallelism]

  • 2020-11-22 01:12

    I like Rob Pike's talk: Concurrency is not Parallelism (it's better!) (slides) (talk)

    Rob usually talks about Go, and he addresses the question of concurrency vs. parallelism with a visual and intuitive explanation. Here is a short summary:

    Task: Let's burn a pile of obsolete language manuals! One at a time!

    [Image: Task]

    Concurrency: There are many possible concurrent decompositions of the task! One example:

    [Image: Gophers]

    Parallelism: The previous configuration runs in parallel if at least 2 gophers are working at the same time; with only one gopher active, it remains concurrent but not parallel.
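The gopher picture can be sketched in Python (the names `gopher`, `manuals`, and `burned` are illustrative, not from the talk): the decomposition into a shared pile plus any number of workers is the concurrent design; running it with two or more workers that execute simultaneously is parallelism.

```python
import queue
import threading

manuals = queue.Queue()
for i in range(6):
    manuals.put(f"manual-{i}")

burned = []
burned_lock = threading.Lock()

def gopher():
    while True:
        try:
            m = manuals.get_nowait()  # take the next manual from the pile
        except queue.Empty:
            return                    # pile is empty: this gopher is done
        with burned_lock:
            burned.append(m)          # "burn" it

# The same decomposition works with 1 gopher (concurrent, not parallel)
# or with 2+ gophers (potentially parallel, given enough processors).
workers = [threading.Thread(target=gopher) for _ in range(2)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(len(burned))  # 6: every manual is burned exactly once
```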

  • 2020-11-22 01:12

    Concurrent program execution comes in two forms: non-parallel concurrency and parallel concurrency (also known as parallelism).

    The key difference is that, to the human eye, threads in non-parallel concurrency appear to run at the same time, but in reality they don't: the threads rapidly switch and take turns on the processor through time-slicing. In parallelism, multiple processors are available, so multiple threads can run on different processors at the same time.

    Reference: Introduction to Concurrency in Programming Languages
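A small Python sketch of the time-slicing case (in CPython, the global interpreter lock actually forces threads into exactly this non-parallel form of concurrency):

```python
import threading

trace = []  # records which thread produced each item

def count(name, n):
    for i in range(n):
        trace.append((name, i))  # list.append is thread-safe in CPython

a = threading.Thread(target=count, args=("A", 5000))
b = threading.Thread(target=count, args=("B", 5000))
a.start(); b.start()
a.join(); b.join()

# Both threads finished all their work...
print(len(trace))  # 10000
# ...but the entries are typically interleaved: the scheduler switched
# between A and B mid-run instead of running them truly simultaneously.
```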

  • 2020-11-22 01:12

    (I'm quite surprised such a fundamental question is not resolved correctly and neatly for years...)

    In short, both concurrency and parallelism are properties of computing.

    As for the difference, here is the explanation from Robert Harper:

    The first thing to understand is parallelism has nothing to do with concurrency. Concurrency is concerned with nondeterministic composition of programs (or their components). Parallelism is concerned with asymptotic efficiency of programs with deterministic behavior.

    Concurrency is all about managing the unmanageable: events arrive for reasons beyond our control, and we must respond to them. A user clicks a mouse, the window manager must respond, even though the display is demanding attention. Such situations are inherently nondeterministic, but we also employ pro forma nondeterminism in a deterministic setting by pretending that components signal events in an arbitrary order, and that we must respond to them as they arise. Nondeterministic composition is a powerful program structuring idea.

    Parallelism, on the other hand, is all about dependencies among the subcomputations of a deterministic computation. The result is not in doubt, but there are many means of achieving it, some more efficient than others. We wish to exploit those opportunities to our advantage.

    They are, in a sense, orthogonal properties of programs. Read this blog post for additional illustrations, and this one, which discusses in slightly more detail the difference as it concerns components in programming, such as threads.

    Note that threading and multitasking are both implementations of computing that serve more concrete purposes. They can be related to parallelism and concurrency, but not in an essential way, so they are hardly good starting points for the explanation.

    One more highlight: (physical) "time" has almost nothing to do with the properties discussed here. Time is merely a way to implement measurements that show the significance of the properties; it is far from their essence. Think twice about the role of "time" in time complexity, which is similar in spirit, though there the measurement is often even more significant.
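Harper's distinction can be sketched in Python (the event names are illustrative): thread scheduling makes the order of observed events nondeterministic, while a parallel-style map produces a result that is never in doubt, however the work was scheduled.

```python
import threading
from multiprocessing.pool import ThreadPool

# Concurrency: nondeterministic composition -- the order of the events
# depends on scheduling that is outside the program's control.
events = []
def emit(name):
    events.append(name)

t1 = threading.Thread(target=emit, args=("click",))
t2 = threading.Thread(target=emit, args=("redraw",))
t1.start(); t2.start()
t1.join(); t2.join()
# events may be ["click", "redraw"] or ["redraw", "click"]

# Parallelism: a deterministic result with many ways of computing it.
with ThreadPool(4) as pool:
    chunks = pool.map(sum, [range(0, 250), range(250, 500),
                            range(500, 750), range(750, 1000)])
print(sum(chunks))  # always 499500, no matter how the chunks were scheduled
```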
