Async Python with asyncio

You have two main options for concurrency in Python:

  1. threads
  2. asyncio (coroutine concurrency)

Traditionally, we used threads in Python for concurrency, but the release of the asyncio library (Python 3.4) and of the async/await syntax (Python 3.5) added support for native coroutines, which will feel familiar to users of the many other languages that support them. (history of asyncio)

asyncio (docs)

The topics we cover here:

  • Runners
  • Coroutines
  • Concurrent coroutines
  • Threads for converting sync code

Runners (docs)

You cannot use await at the top level of a file, so typically the main coroutine needs to be run with asyncio.run().

Python
import asyncio

async def main():
    await asyncio.sleep(1)
    print('hello')

asyncio.run(main())
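
On Python 3.11 and later, the runners docs also describe asyncio.Runner, a context manager that keeps one event loop alive across several top-level calls. A minimal sketch, assuming Python 3.11+:

Python
import asyncio

async def main():
    await asyncio.sleep(1)
    print('hello')

# Python 3.11+: reuse one event loop for multiple top-level calls
with asyncio.Runner() as runner:
    runner.run(main())
    runner.run(main())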

Coroutines (docs)

Note: calling a coroutine by itself returns a coroutine object and does not schedule it for execution.

Python
>>> main()
<coroutine object main at 0x1053bb7c8>

Q: How do you run a coroutine?
Besides asyncio.run, there are other ways to run a coroutine, including:

Python
# use await (inside another coroutine)
await kale()

# use create_task (schedules it on the running event loop)
asyncio.create_task(kale())
Python
# full example
import asyncio

async def kale(suffix):
    await asyncio.sleep(5)
    return "hello " + suffix

async def main():
    x = kale("1")                       # coroutine object, not scheduled yet
    y = kale("2")                       # coroutine object, not scheduled yet
    z = asyncio.create_task(kale("3"))  # wrapped in a Task and scheduled immediately
    print(x)  # <coroutine object kale at ...>
    print(y)  # <coroutine object kale at ...>
    print(z)  # <Task pending ...>
    # Python will also warn that the kale("1") and kale("2") coroutines were never awaited

asyncio.run(main())
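
To actually get the greetings back instead of the objects, you await them. A short sketch building on the kale example above:

Python
import asyncio

async def kale(suffix):
    await asyncio.sleep(5)
    return "hello " + suffix

async def main():
    x = kale("1")
    y = kale("2")
    z = asyncio.create_task(kale("3"))  # starts running right away
    print(await x)  # hello 1 (runs once we await it)
    print(await y)  # hello 2
    print(await z)  # hello 3 (was already running in the background)

asyncio.run(main())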

Concurrent Coroutines (with gather)

Python
import asyncio

async def factorial(name, number):
    f = 1
    for i in range(2, number + 1):
        await asyncio.sleep(1)  # stand-in for real async work
        f *= i
    print(f"Task {name}: factorial({number}) = {f}")
    return f

async def main():
    # Schedule three calls *concurrently*:
    L = await asyncio.gather(
        factorial("A", 2),
        factorial("B", 3),
        factorial("C", 4),
    )
    print(L)
    # list of values from the coroutines, in order:
    # [2, 6, 24]

asyncio.run(main())

Often you will build the list of coroutines with a list comprehension or other list-creation code.
This means you'll need to unpack the list into gather with the * operator:

Python
import asyncio

# assumes fetch_url is defined elsewhere as an async function
# (the Threads section below shows one way to build it)
async def main():
    urls = [
        "https://www.example.com",
        "https://www.example2.com"
    ]
    tasks = [fetch_url(url) for url in urls]

    # Note the asterisk *: it unpacks the list into separate arguments
    responses = await asyncio.gather(*tasks)

Threads

The requests library is naturally a synchronous library, so requests.get is blocking.
Here asyncio.to_thread pushes the execution to a separate thread, BUT you can await it just like a coroutine.

Python
import asyncio
import requests

# sync: requests.get blocks until the response arrives
def fetch_url(url):
    response = requests.get(url)
    return response.text

async def main():
    # to_thread runs the blocking call in a worker thread
    # and returns an awaitable we can await like a coroutine
    coro = asyncio.to_thread(fetch_url, "https://wherever.com")
    x = await coro

You can even wrap this in an async def function to make things prettier:

Python
# async version of fetch_url
async def fetch_url_async(url):
    return await asyncio.to_thread(fetch_url, url)

async def main():
    x = await fetch_url_async("https://wherever.com")
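
Because fetch_url_async is just another coroutine, it composes with gather from the previous section. A short sketch, reusing the definitions above and the placeholder URLs from earlier:

Python
import asyncio
import requests

def fetch_url(url):
    response = requests.get(url)
    return response.text

async def fetch_url_async(url):
    return await asyncio.to_thread(fetch_url, url)

async def main():
    urls = [
        "https://www.example.com",
        "https://www.example2.com"
    ]
    # each blocking requests.get runs in its own worker thread
    tasks = [fetch_url_async(url) for url in urls]
    pages = await asyncio.gather(*tasks)
    print([len(page) for page in pages])

asyncio.run(main())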

In the next post we will use these strategies to make concurrent OpenAI calls.