Understanding concurrency and parallelism

Tejaswi Kasat
4 min read · Jul 5, 2022

Concurrency

In computing, concurrency is the ability of a program to make progress on multiple computations or processes during overlapping time periods, though not necessarily at the same instant. A concurrent program is one in which multiple computations can proceed independently and possibly in parallel.

Concurrency has been a concern in programming languages since the early days of computing. Early programming languages were designed to support only a single computation at a time, but as computers became more powerful, it became clear that concurrent programming could provide a significant performance boost.

Today, concurrency is a central feature of many programming languages, including Java, C#, Go, and Erlang. Concurrent programming is so important that the Java platform includes a special set of libraries, the Java Concurrency Utilities, to support it.

Concurrency can be implemented in many different ways. The most common approach is to use threads. A thread is a unit of execution that can run independently of other threads. Threads are popular because they are relatively easy to use and they provide a high degree of flexibility.
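The thread approach can be sketched in Python (chosen here just for brevity; Java, Go, and the article's other languages have equivalent APIs). Two threads run independently and the main thread waits for both:

```python
import threading

results = {}

def worker(name, n):
    # Each thread runs this function independently of the others.
    results[name] = sum(range(n))

# A thread is a unit of execution that runs alongside the main thread.
t1 = threading.Thread(target=worker, args=("a", 10))
t2 = threading.Thread(target=worker, args=("b", 20))
t1.start()
t2.start()
t1.join()  # Wait for both threads to finish before reading results.
t2.join()
print(results["a"], results["b"])  # 45 190
```

Because both threads are joined before `results` is read, the output is deterministic even though the two computations overlap in time.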

Another approach to concurrency is to use processes. A process is a self-contained unit of execution that has its own memory space. Processes are more difficult to use than threads, but they offer several advantages, including isolation (a process cannot access the memory of another process) and protection (a process can be terminated without affecting other processes).
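The isolation property can be demonstrated in Python's `multiprocessing` module. This sketch uses the "fork" start method so it stays guard-free; portable code for platforms without fork (e.g. Windows) would use "spawn" plus an `if __name__ == "__main__"` guard:

```python
import multiprocessing

# "fork" gives the child a copy of the parent's memory at fork time;
# it is the simplest start method for a self-contained sketch.
ctx = multiprocessing.get_context("fork")

counter = 0

def child():
    global counter
    counter = 99  # Changes only the child's private copy of memory.

p = ctx.Process(target=child)
p.start()
p.join()
print(counter)  # Still 0: the parent's memory is untouched.
```

Contrast this with threads, where assigning to `counter` in one thread would be visible to all the others, because threads share one memory space.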

Still another approach is to use events. Events are notifications that something has happened, such as a key being pressed or a mouse button being clicked. Events can be used to trigger concurrent computations, but they are typically less flexible than threads or processes.
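The event style can be sketched as a tiny, hypothetical dispatcher (the names `on` and `fire` are illustrative, not from any real library): handlers register for named events and run when the event is fired.

```python
# A minimal event dispatcher: a mapping from event names to handlers.
handlers = {}

def on(event, fn):
    # Register a handler to be called when `event` is fired.
    handlers.setdefault(event, []).append(fn)

def fire(event, *args):
    # Notify every registered handler that the event happened.
    for fn in handlers.get(event, []):
        fn(*args)

log = []
on("key_pressed", lambda key: log.append(f"got {key}"))
fire("key_pressed", "x")
print(log)  # ['got x']
```

Real event-driven systems (GUI toolkits, `asyncio`, browser DOM events) elaborate on this same register-and-dispatch pattern.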

Concurrency is a powerful tool, but it must be used with care. If not managed properly, concurrency can lead to race conditions, deadlocks, and other problems. As a result, concurrent programming requires a high level of skill and experience.
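The race-condition hazard, and the standard fix, can be shown in a few lines of Python. The `counter += 1` below is a read-modify-write; without the lock, two threads could read the same old value and one update would be lost:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # The lock makes the read-modify-write atomic with respect
        # to the other threads, preventing lost updates.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000, deterministic because of the lock
```

Remove the `with lock:` line and the final count can come out lower than 400000, and differently on each run. That nondeterminism is exactly what makes races hard to debug.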

Parallelism

Parallelism is a technique in computer programming whereby two or more tasks are executed simultaneously, that is, literally at the same time on separate hardware.

One common form of parallelism is found in multiprocessing, where multiple processors are employed to execute tasks at the same time. A related form is multithreading, where multiple threads (tasks that can be executed independently but share common resources such as memory) run within a single process; on a single processor their execution is interleaved, while on multiple cores they can run truly in parallel.

Parallelism can be used to increase the performance of a computer program by reducing the amount of time required to execute a task. For example, if a task can be divided into two subtasks that can be executed concurrently, the overall time required to execute the task can be reduced by up to half, though coordination overhead usually eats into that ideal gain.
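Splitting one task into two subtasks can be sketched with Python's `concurrent.futures` (the "fork" context keeps the sketch guard-free; portable code would use an `if __name__ == "__main__"` guard):

```python
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def subtask(bounds):
    # Each worker sums its own half of the range independently.
    lo, hi = bounds
    return sum(range(lo, hi))

ctx = multiprocessing.get_context("fork")
n = 1_000_000
# Divide one task into two independent subtasks.
halves = [(0, n // 2), (n // 2, n)]
with ProcessPoolExecutor(max_workers=2, mp_context=ctx) as ex:
    total = sum(ex.map(subtask, halves))
print(total == sum(range(n)))  # True
```

For a sum this cheap, the cost of starting worker processes likely exceeds the saving; the structure, not the speedup, is the point of the sketch.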

To take advantage of parallelism, a computer program must be designed in such a way that it can be divided into multiple tasks that can be executed concurrently. This can be a difficult task, as it requires a deep understanding of the problem that the program is trying to solve.

Many different techniques can be used to achieve parallelism, and the choice of technique will depend on the nature of the problem that the program is trying to solve. Some of the more common techniques include:

  • Data parallelism, where the same operation is performed on multiple data items concurrently.
  • Instruction-level parallelism, where multiple instructions are executed concurrently.
  • Task parallelism, where multiple tasks are executed concurrently.
  • Pipeline parallelism, where multiple stages of a pipeline are executed concurrently.
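The first technique, data parallelism, is the easiest to sketch: the same operation applied to every item of a collection, with the items distributed across worker processes. Again the "fork" context is used only to keep the sketch guard-free:

```python
import multiprocessing

def square(x):
    return x * x  # The same operation, applied to each data item.

# Four workers split the input between them; map preserves order.
ctx = multiprocessing.get_context("fork")
with ctx.Pool(processes=4) as pool:
    squares = pool.map(square, range(8))
print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

`Pool.map` has the same shape as the built-in `map`, which is typical of data-parallel APIs: the parallelism is hidden behind an ordinary mapping operation.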

Parallelism can be used to speed up the execution of a computer program, but it is not a panacea. In some cases, parallelism can make a program run slower. This is because the overhead associated with parallelism can sometimes outweigh the benefits.

When deciding whether or not to use parallelism, it is important to consider the tradeoffs carefully. In general, parallelism should only be used if it is likely to result in a significant performance improvement.

Differences between concurrency and parallelism

There is a lot of confusion between the terms concurrency and parallelism, and for good reason: they are often used interchangeably, when in fact they are two different things. Concurrency is the ability of a program to be structured as multiple threads of execution whose lifetimes overlap. Parallelism is the ability of a program to split its work across multiple processors or cores so that the work finishes faster. So, concurrency is about dealing with multiple threads of execution, while parallelism is about using multiple processors.

Concurrency is a more general term that includes parallelism. All parallel programs are concurrent, but not all concurrent programs are parallel. In a concurrent program, multiple threads of execution are interleaved, meaning that each thread gets a turn to run, and the order in which the threads run is not predictable. In a parallel program, the threads are running simultaneously on different processors.
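The unpredictable interleaving can be observed directly in Python. Two threads each record three events; the order in which the events land is not fixed, but the set of events is:

```python
import threading

events = []

def task(name):
    for i in range(3):
        # Appends from the two threads may interleave in any order.
        events.append((name, i))

t1 = threading.Thread(target=task, args=("a",))
t2 = threading.Thread(target=task, args=("b",))
t1.start()
t2.start()
t1.join()
t2.join()
# The interleaving varies from run to run, but every event
# occurs exactly once, so the sorted view is stable.
print(sorted(events))
```

Running this repeatedly can print `events` in different orders, which is the interleaving the text describes; only after sorting is the result reproducible.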

Conclusion

There are several important differences between concurrency and parallelism. Perhaps the most important is that concurrency is about dealing with multiple things at the same time, while parallelism is about doing multiple things at the same time.

Another key difference is that concurrency is about structuring a program around independent tasks whose execution can overlap, while parallelism is about speeding up a single task by breaking it into smaller parts that can be executed in parallel.

Finally, it’s worth noting that concurrent programming is generally considered to be significantly more complex than parallel programming, due to the need to carefully manage shared state and avoid race conditions.
