Concurrency in Python: 5 Practical Ways to Get Things Running at the Same Time

Today I ran into a UI issue: a blocking I/O operation took too long, and the whole Python program froze and became unresponsive. That pushed me to revisit how concurrency in Python works and what options exist to avoid this kind of situation.

Python does not really have one concurrency model. It has several, and each one behaves differently depending on whether you are dealing with I/O, CPU work, or external processes. Here are five practical approaches, with what they are, what they do well, and where they fall short.

1. Threading

Threading runs multiple threads within the same process, so one thread can continue while another is stuck waiting on I/O.

Its main strengths are simplicity, good compatibility with existing blocking libraries, and a straightforward way to prevent UI freezing in I/O-heavy workloads. Its main drawbacks are the Global Interpreter Lock (GIL), which lets only one thread execute Python bytecode at a time and so rules out true CPU parallelism, and the risk of race conditions when sharing mutable state between threads.

It is most suitable for fixing UI freezes caused by blocking I/O, as well as API calls, database queries, and file operations.

import threading

# task runs in a background thread; the main thread stays responsive
threading.Thread(target=task).start()
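A slightly fuller sketch of the same idea, using a hypothetical `slow_io` function to stand in for a blocking call. The three calls overlap instead of running back to back:

```python
import threading
import time

results = []

def slow_io(n):
    # Stand-in for a blocking call (API request, DB query, file read, ...)
    time.sleep(0.1)
    results.append(n * 2)

# Start all three threads, then wait for them to finish
threads = [threading.Thread(target=slow_io, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # [0, 2, 4]
```

Note that appending to a list is safe here, but anything more elaborate shared between threads would need a lock.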

2. Asyncio

Asyncio uses an event loop where tasks cooperatively yield control when they are waiting on I/O.

Its main strengths are very high concurrency efficiency and low overhead, especially when handling many simultaneous I/O operations. Its main drawbacks are a steeper learning curve, and the fact that it does not work well with blocking libraries unless they have async versions.

It is most suitable for high-concurrency network applications, such as web servers, crawlers, and real-time systems.

import asyncio

async def main():
    # gather must be awaited inside a running event loop
    await asyncio.gather(task1(), task2())

asyncio.run(main())
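A runnable sketch with a hypothetical `fetch` coroutine: both awaits proceed concurrently, so the total time is roughly the longest delay rather than the sum, and `gather` returns results in argument order:

```python
import asyncio

async def fetch(name, delay):
    # Cooperatively yields control to the event loop while "waiting on I/O"
    await asyncio.sleep(delay)
    return name

async def main():
    # Both coroutines wait concurrently; gather preserves argument order
    return await asyncio.gather(fetch("a", 0.05), fetch("b", 0.01))

print(asyncio.run(main()))  # ['a', 'b']
```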

3. Multiprocessing

Multiprocessing runs tasks in separate processes, allowing true parallel execution across CPU cores.

Its main strengths are bypassing the GIL and enabling real CPU parallelism. Its main drawbacks are higher memory usage, slower startup, and the overhead of transferring data between processes.

It is most suitable for CPU-heavy workloads, such as data processing, simulation, and media or image processing, but it is not typically used to fix simple I/O freezing issues.

from multiprocessing import Pool

if __name__ == "__main__":  # required on platforms that spawn fresh interpreters
    with Pool(4) as pool:
        pool.map(work, items)
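A self-contained sketch with a hypothetical `square` function. The `__main__` guard matters because on Windows and macOS each worker process re-imports the module:

```python
from multiprocessing import Pool

def square(x):
    # CPU-bound work runs in a separate process, so the GIL is no obstacle
    return x * x

if __name__ == "__main__":  # guard required under the spawn start method
    with Pool(processes=2) as pool:
        results = pool.map(square, [1, 2, 3, 4])
    print(results)  # [1, 4, 9, 16]
```

Note that arguments and return values are pickled to cross the process boundary, which is the data-transfer overhead mentioned above.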

4. concurrent.futures

concurrent.futures provides a higher-level interface for managing thread and process pools.

Its main strengths are a clean and simple API, easier task management, and flexibility to switch between threads and processes. Its main drawbacks are less fine-grained control and the fact that performance still depends on whether you choose threads or processes underneath.

It is most suitable for quickly parallelizing blocking work, especially when you want a more structured and maintainable approach than raw threading or multiprocessing.

from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor() as pool:
    results = list(pool.map(task, items))
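A sketch with a hypothetical `fetch_length` task standing in for blocking work (real code might call `requests.get`). Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` is the one-line change that makes the same code use processes instead:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_length(url):
    # Hypothetical blocking task; a real one might download the page
    return len(url)

urls = ["https://example.com", "https://example.org/page"]

with ThreadPoolExecutor(max_workers=4) as pool:
    # map preserves input order even though tasks run concurrently
    lengths = list(pool.map(fetch_length, urls))

print(lengths)
```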

5. Subprocess

Subprocess runs external programs as separate processes and lets Python coordinate them.

Its main strengths are leveraging existing optimized tools and isolating heavy work outside Python. Its main drawbacks are more complex error handling, process management overhead, and dependency on external binaries.

It is most suitable for automation tasks, media pipelines, and situations where external CLI tools already solve the problem well.

import subprocess

# -i marks the input file; check=True raises CalledProcessError on failure
subprocess.run(["ffmpeg", "-i", "in.mp4", "out.mp4"], check=True)
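A self-contained sketch that needs no external binary: it launches a child Python interpreter as the external program and captures its output, which is the same pattern you would use to drive ffmpeg or any other CLI tool:

```python
import subprocess
import sys

# Run a child interpreter as an external process and capture what it prints
proc = subprocess.run(
    [sys.executable, "-c", "print('hello from child')"],
    capture_output=True,  # collect stdout/stderr instead of inheriting them
    text=True,            # decode output as str rather than bytes
    check=True,           # raise CalledProcessError on a non-zero exit code
)
print(proc.stdout.strip())  # hello from child
```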