
An overview of parallelism, concurrency, and asynchrony.

2 min read
John Crickett
Code Challenge Writer

Parallelism, concurrency, and asynchrony are three ways of making software efficient and responsive, but they differ in how they work and in the problems they solve.

Parallelism refers to executing multiple tasks simultaneously by dividing them into smaller sub-tasks that can be processed at the same time on multiple CPU cores or processors. Parallelism is commonly used in high-performance computing and is a natural fit for modern computer architectures, which contain multiple cores or CPUs. A minimal sketch of the idea is shown below.
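To make this concrete, here is a small, illustrative sketch in Python using the standard library's concurrent.futures module. The function name count_primes and the chosen limits are hypothetical examples of CPU-bound work, not part of the original article; the point is that each sub-task runs in its own process and can occupy its own CPU core.

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit: int) -> int:
    """CPU-bound work: count the primes below `limit` by trial division."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    limits = [50_000, 60_000, 70_000, 80_000]
    # Each call runs in a separate process, so the four sub-tasks can
    # execute in parallel on separate CPU cores.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes, limits))
    print(results)
```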

Concurrency, on the other hand, refers to the ability of a program to manage multiple tasks at once, even if it only has a single CPU core. Concurrency is achieved by interleaving the execution of different tasks, allowing each to make progress in small steps, and by using synchronization mechanisms, such as locks and semaphores, to ensure the tasks don't interfere with each other; the sketch below shows this with threads and a lock.
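As a rough illustration, assuming Python's threading module, the threads below interleave on however many cores are available (in CPython, pure-Python work like this is effectively serialized by the interpreter, so it is concurrency rather than parallelism), and a lock keeps the shared counter consistent. The worker function and increment count are invented for the example.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n_increments: int) -> None:
    """Each task makes progress in small steps; the lock ensures the
    interleaved threads don't corrupt the shared counter."""
    global counter
    for _ in range(n_increments):
        with lock:
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000: the lock prevents lost updates
```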

Asynchrony is a programming model that allows tasks to proceed independently of one another, without waiting for each other to complete. Asynchronous programming is typically used to improve the responsiveness of a program, by allowing it to perform other tasks while waiting for slow operations, such as file I/O or network requests, to complete. In asynchronous programming, a task is started and then allowed to continue executing in the background, while the program continues to perform other tasks. When the background task completes, it signals the program, which can then handle the results.
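A hedged sketch of this model, using Python's asyncio: the fetch coroutine is a made-up stand-in for a slow operation such as a network request, with asyncio.sleep simulating the wait. While one task is waiting, the event loop runs the others, and each task's result is handled when it completes.

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    """Stand-in for a slow operation such as file I/O or a network request."""
    await asyncio.sleep(delay)  # yields control to other tasks while "waiting"
    return f"{name} done after {delay}s"

async def main() -> None:
    # Start all three tasks; none of them blocks the others while it waits.
    tasks = [
        asyncio.create_task(fetch("request-a", 1.0)),
        asyncio.create_task(fetch("request-b", 0.5)),
        asyncio.create_task(fetch("request-c", 1.5)),
    ]
    # Handle the results once the background tasks signal completion.
    for result in await asyncio.gather(*tasks):
        print(result)

asyncio.run(main())
```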

In summary, parallelism is about executing tasks simultaneously, concurrency is about managing multiple tasks at once by interleaving them, and asynchrony is about starting tasks and carrying on with other work instead of waiting for them to complete.