
asyncio Basics

The event loop, coroutines, Tasks, gather, timeouts — everything you need to write clean concurrent Python programs.

"Think of asyncio like a single chef juggling multiple dishes. While one dish is in the oven, the chef works on another. No extra chefs needed — just smart switching."

— Shurai

The Big Picture — How asyncio Works

asyncio runs an event loop — a scheduler that manages all your coroutines. It constantly checks: "which coroutine is waiting for something slow? Which one is ready to continue?" and switches between them:

▶ How the event loop works (step by step):

1. You call asyncio.run(main()) — this starts the event loop and hands it your coroutine.
2. The loop runs main() until it hits an await — a point where it must wait for something slow.
3. Instead of blocking, the loop switches to another ready coroutine and runs it.
4. When the slow operation completes, the original coroutine is resumed right where it paused.
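The steps above can be seen directly in a minimal sketch: two coroutines record when they start and finish, and the loop interleaves them at the await points (the names here are illustrative, not part of asyncio):

```python
import asyncio

order = []  # records the interleaving the event loop produces

async def worker(name, delay):
    order.append(f"{name} start")
    await asyncio.sleep(delay)   # pause here; the loop switches away
    order.append(f"{name} done")

async def main():
    # Both workers start before either finishes — the loop switches
    # to B while A is "waiting for something slow"
    await asyncio.gather(worker("A", 0.2), worker("B", 0.1))

asyncio.run(main())
print(order)
# ['A start', 'B start', 'B done', 'A done']
```

B finishes first because its sleep is shorter — the loop resumed it as soon as its wait completed, exactly as in step 4.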

Coroutines vs Tasks

There are two ways to run coroutines concurrently — gather() and create_task(). Understanding the difference is key:

asyncio.gather()
- Run several coroutines together.
- Wait for all to finish.
- Returns all results as a list.
- Easiest and most common.

asyncio.create_task()
- Schedule a coroutine to run soon.
- Returns a Task you can cancel.
- More control over individual jobs.
- Good for fire-and-monitor patterns.
python — gather() in action
import asyncio

async def brew_tea():
    print("Boiling water...")
    await asyncio.sleep(3)    # wait 3 seconds
    print("Tea is ready!")
    return "chai"

async def toast_bread():
    print("Toasting bread...")
    await asyncio.sleep(2)    # wait 2 seconds
    print("Toast is ready!")
    return "toast"

async def make_breakfast():
    # Both run concurrently — not one after the other
    tea, bread = await asyncio.gather(
        brew_tea(),
        toast_bread(),
    )
    print(f"Breakfast: {tea} + {bread}")

asyncio.run(make_breakfast())
output (finishes in 3s, not 5s)
Boiling water...
Toasting bread...
Toast is ready!     ← after 2s
Tea is ready!       ← after 3s
Breakfast: chai + toast

create_task() — More Control

python
import asyncio

async def send_email(to):
    print(f"Sending email to {to}...")
    await asyncio.sleep(1)
    print(f"Email sent to {to}")

async def main():
    # Schedule tasks — they start as soon as the loop gets a chance to run them
    t1 = asyncio.create_task(send_email("riya@shurai.com"))
    t2 = asyncio.create_task(send_email("arjun@shurai.com"))

    print("Tasks created, doing other work...")
    await asyncio.sleep(0.1)        # yield control so tasks can start

    await t1                          # wait for task 1
    await t2                          # wait for task 2

asyncio.run(main())
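One of the controls a Task gives you that gather() doesn't is cancellation. A minimal sketch (the email names are illustrative):

```python
import asyncio

async def send_email(to):
    await asyncio.sleep(10)          # pretend this is very slow
    return f"sent to {to}"

async def main():
    task = asyncio.create_task(send_email("riya@shurai.com"))
    await asyncio.sleep(0.1)         # let the task start
    task.cancel()                    # request cancellation
    try:
        await task
    except asyncio.CancelledError:
        return "cancelled"

result = asyncio.run(main())
print(result)   # cancelled
```

Awaiting a cancelled Task raises asyncio.CancelledError — catch it where you decide what cancellation means for your program.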

Handling Timeouts

python — asyncio.wait_for()
import asyncio

async def slow_api():
    await asyncio.sleep(10)    # pretend this takes forever
    return "data"

async def main():
    try:
        result = await asyncio.wait_for(slow_api(), timeout=3.0)
    except asyncio.TimeoutError:
        print("Request timed out after 3 seconds")

asyncio.run(main())
# Request timed out after 3 seconds
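A detail worth knowing: wait_for() doesn't just give up — it cancels the slow coroutine, so it won't keep running in the background. A sketch that makes the cancellation visible:

```python
import asyncio

cancelled = False   # flag set when the inner coroutine is cancelled

async def slow_api():
    global cancelled
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        cancelled = True    # wait_for cancelled us on timeout
        raise               # always re-raise CancelledError

async def main():
    try:
        await asyncio.wait_for(slow_api(), timeout=0.1)
    except asyncio.TimeoutError:
        pass

asyncio.run(main())
print(f"inner coroutine cancelled: {cancelled}")   # True
```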

Real Example — Download Multiple URLs Concurrently

python — with aiohttp (pip install aiohttp)
import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as resp:
        return await resp.text()

async def download_all(urls):
    async with aiohttp.ClientSession() as session:
        tasks   = [fetch(session, url) for url in urls]
        results = await asyncio.gather(*tasks)
    return results

urls = [
    "https://httpbin.org/get",
    "https://httpbin.org/ip",
    "https://httpbin.org/uuid",
]
pages = asyncio.run(download_all(urls))
print(f"Downloaded {len(pages)} pages concurrently")
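In real downloads, some URLs fail. By default one exception makes gather() abort everything; passing return_exceptions=True returns exceptions as results instead, so the good downloads survive. A self-contained sketch (fetch_fake stands in for a real network call):

```python
import asyncio

async def fetch_fake(url):
    await asyncio.sleep(0.01)
    if "bad" in url:
        raise ValueError(f"cannot reach {url}")
    return f"contents of {url}"

async def download_all(urls):
    tasks = [fetch_fake(u) for u in urls]
    # Exceptions come back as list items instead of propagating
    return await asyncio.gather(*tasks, return_exceptions=True)

results = asyncio.run(download_all(["https://ok.example", "https://bad.example"]))
for r in results:
    print("failed:" if isinstance(r, Exception) else "ok:", r)
```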

asyncio is single-threaded

asyncio uses one thread and one CPU core. It achieves concurrency by smart cooperative switching — coroutines voluntarily yield at await points. This is perfect for I/O-bound tasks. For CPU-heavy work, use multiprocessing (lesson 82).
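The caveat matters in practice: a blocking call like time.sleep() freezes the whole loop, because no await point ever lets it switch. For blocking work you can't avoid, asyncio.to_thread() (Python 3.9+) runs it on a worker thread so the loop stays responsive:

```python
import asyncio
import time

def blocking_io():
    time.sleep(0.2)    # a blocking call — would freeze the event loop
    return "done"

async def main():
    start = time.perf_counter()
    # Push both blocking calls onto worker threads and run them together
    results = await asyncio.gather(
        asyncio.to_thread(blocking_io),
        asyncio.to_thread(blocking_io),
    )
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(results, f"{elapsed:.2f}s")   # both finish in ~0.2s, not 0.4s
```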

"asyncio's secret is that while your code waits for a network reply, it isn't sitting idle — it's running someone else's code. That's how one thread handles thousands of connections."

— Shurai

🧠 Quiz — Q1

What is the asyncio event loop?

🧠 Quiz — Q2

What is the difference between asyncio.gather() and asyncio.create_task()?

🧠 Quiz — Q3

Which function do you use to run the top-level coroutine and start the event loop?

🧠 Quiz — Q4

asyncio is best suited for which type of tasks?