Concurrency and Parallelism

Let's see how we can make the two less confusing
Published on 2024/03/15

I won't fit a complete differentiation between the two in one thought, but I figured I could give it a superficial go. There are definitely resources out there that go into much more depth about what concurrency is and what parallelism is. Just look at this video by Rob Pike, where he dedicates 30 minutes to explaining how concurrency is not parallelism. I can't believe it's 8 years old, wow!

I think early on I was more comfortable with the concept of "running things in parallel", which often just meant running the same script on different chunks of data so I could leverage as many cores as possible. I wasn't that far off, and concurrency was not really at play here. My code was written in a very linear way: get some input, do something, generate some output. The only things I had to worry about were that the slices of input were different and that the outputs were not overwriting one another. Then I learned about threads and thread pools, which I had to manage for my program to take advantage of the server it was running on.
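To make that mental model concrete, here's a minimal sketch in Go (the language the rest of this post ends up in) of the "same work, different chunks" pattern. The chunk size and the doubling are just placeholders I made up for whatever the real script did; the point is that each worker reads its own slice of the input and writes only to its own slots of the output.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	input := []int{1, 2, 3, 4, 5, 6, 7, 8}
	output := make([]int, len(input))

	chunk := 4 // each worker gets its own, non-overlapping slice
	var wg sync.WaitGroup
	for start := 0; start < len(input); start += chunk {
		end := start + chunk
		if end > len(input) {
			end = len(input)
		}
		wg.Add(1)
		go func(start, end int) {
			defer wg.Done()
			for i := start; i < end; i++ {
				output[i] = input[i] * 2 // write only to our own output slots
			}
		}(start, end)
	}
	wg.Wait()
	fmt.Println(output)
}
```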

While I hadn't touched concurrency much before, coming to Go was refreshing and empowering. The concurrency model is very accessible and simple to use (with some gotchas you learn along the way). I found myself able to write concurrent code, and I think that's what helped me grasp the difference quickly: I'm writing concurrent code, but there's no guarantee it will run in parallel. That's left to how the execution is managed, and it's beyond my control.

I like the way Katherine Cox-Buday puts it in "Concurrency in Go":

Concurrency is a property of the code; parallelism is a property of the running program.

That clicked right away for me. There are examples that, when pushed to one extreme or the other, can help settle your understanding. Let's think for a second about parallelism as a program/function executed on different processors at the same time. If our machine only has one processor, we can't achieve parallelism, but our code could still be structured to be executed concurrently. In this case the chance the program can run in parallel is 0%. At the other extreme, with an infinite number of processors, the chance the program CAN be executed in parallel is 100%.
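Go even lets you simulate the single-processor extreme. This is just a small hypothetical demo, not something from the book: with GOMAXPROCS set to 1, the goroutines below are still concurrent, but they can only ever be interleaved, never executed in parallel. Remove that call (or raise the number) and the exact same code gets a chance to run in parallel.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	runtime.GOMAXPROCS(1) // pretend the machine has a single processor

	var wg sync.WaitGroup
	for i := 0; i < 3; i++ {
		wg.Add(1)
		go func(n int) {
			defer wg.Done()
			fmt.Println("goroutine", n) // concurrent, but never parallel here
		}(i)
	}
	wg.Wait()
}
```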

Thoughts

You can do the work to structure your code in a concurrent manner so that you only have to worry about the communication between the concurrent parts. This gives you the possibility that it will execute in parallel, which simply means that two things happen simultaneously. Go gives you easy access to concurrency tools that are a big selling point of the language; they take some time to master, but it's simple to get started.
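As a closing sketch, and assuming a toy producer/consumer split that isn't from the post itself, this is roughly what "only worry about the communication" looks like in Go: the two concurrent parts only talk over a channel, and whether they ever overlap in time is the runtime's decision, not ours.

```go
package main

import "fmt"

// produce is one concurrent part: all it knows about is the channel it writes to.
func produce(out chan<- int) {
	for i := 1; i <= 3; i++ {
		out <- i
	}
	close(out)
}

func main() {
	ch := make(chan int)
	go produce(ch) // structured concurrently; parallel or not is not our call
	for v := range ch {
		fmt.Println("received", v) // the main goroutine is the other part
	}
}
```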
