A synchronous program executes one step at a time, moving on to the next step only after the current one has completely finished.
An asynchronous program also takes on one step at a time, but it does not wait for the current step to finish before moving on to the next.
1. Synchronous Programming
The code below pulls individual pieces of work off a queue and runs them, one after another. A queue in Python is a FIFO (first-in, first-out) data structure: items stored with put() come back out of get() in the same order.
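As a minimal standalone sketch of that FIFO behavior (not part of the program below):

import queue

q = queue.Queue()
for item in [1, 2, 3]:
    q.put(item)        # items go in: 1, 2, 3

while not q.empty():
    print(q.get())     # items come out in the same order: 1, 2, 3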
import queue

def task(name, work_queue):
    if work_queue.empty():
        print(f"Task {name} nothing to do")
    else:
        while not work_queue.empty():
            count = work_queue.get()
            total = 0
            print(f"Task {name} running")
            for x in range(count):
                total += 1
            print(f"Task {name} total: {total}")

def main():
    """
    This is the main entry point for the program.
    """
    # Create the queue of 'work'
    work_queue = queue.Queue()

    # Put some 'work' in the queue
    for work in [15, 10, 5, 2]:
        work_queue.put(work)

    # Create some synchronous tasks
    tasks = [
        (task, "One", work_queue),
        (task, "Two", work_queue)
    ]

    # Run the tasks
    for t, n, q in tasks:
        t(n, q)

if __name__ == "__main__":
    main()
# => Task One running
# => Task One total: 15
# => Task One running
# => Task One total: 10
# => Task One running
# => Task One total: 5
# => Task One running
# => Task One total: 2
# => Task Two nothing to do
The task() function accepts a string and a queue as arguments. It repeatedly pulls a number off the queue and counts up to that number, until the queue is empty.
The main() function then runs two task() calls in sequence against the same queue.
As the output shows, Task One, which runs first, processes everything in the queue; only once the loop inside Task One exits does Task Two get its turn, and by then there is no work left to do.
2. Cooperation
The version below allows two worker functions to share the pieces of work sitting in the same queue, cooperating with each other.
import queue

def task(name, queue):
    while not queue.empty():
        count = queue.get()
        total = 0
        print(f"Task {name} running")
        for x in range(count):
            total += 1
            yield
        print(f"Task {name} total: {total}")

def main():
    """
    This is the main entry point for the program.
    """
    # Create the queue of 'work'
    work_queue = queue.Queue()

    # Put some 'work' in the queue
    for work in [15, 10, 5, 2]:
        work_queue.put(work)

    # Create some tasks
    tasks = [
        task("One", work_queue),
        task("Two", work_queue)
    ]

    # Run the tasks
    done = False
    while not done:
        for t in tasks:
            try:
                next(t)
            except StopIteration:
                tasks.remove(t)
            if len(tasks) == 0:
                done = True

if __name__ == "__main__":
    main()
# => Task One running
# => Task Two running
# => Task Two total: 10
# => Task Two running
# => Task One total: 15
# => Task One running
# => Task Two total: 5
# => Task One total: 2
The key piece of this code is the yield keyword, which turns task() into a generator.
A generator function is called just like any other Python function, but when the yield statement inside it executes, control is handed back to the function driving the generator (the caller).
The interesting part is that control can be handed back to the generator again by passing it to the built-in next() function. Combined with the yield inside the generator, this makes it possible to switch context back and forth within the program.
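As a minimal standalone sketch of this back-and-forth (the worker name and step counts here are made up for illustration):

def worker(name):
    for i in range(2):
        print(f"{name} step {i}")
        yield                    # hand control back to the caller

gen_a, gen_b = worker("A"), worker("B")
next(gen_a)   # => A step 0   (runs until the first yield)
next(gen_b)   # => B step 0
next(gen_a)   # => A step 1   (resumes right after the yield)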
The main() function above transfers control to one of the task() generators by calling next(t). The generator runs one piece of its work, and its yield then hands control back to main(). main() in turn calls next(t) on the other generator so it can pick up the next piece of work, and so on, until the work queue is empty.
The output shows that both Task One and Task Two take part in working through the queue.
Although Task Two total: 10 prints before Task One total: 15, this does not mean the program is running asynchronously; Task Two simply has fewer increments to count, so it finishes its piece of work first.
Logically, switching context between generator functions this way lets multiple workers participate in the same workload, but control still changes hands only after the current piece of work is finished. The program is therefore still synchronous.
3. Blocking Calls
The example below is essentially the same as the previous version, except that the counting work inside task() is replaced with time.sleep(delay), which simulates a blocking call.
A blocking call is code that keeps the CPU from doing anything else for some period of time: time.sleep(delay), for instance, stops the current thread dead until the delay has elapsed.
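As a standalone sketch, two blocking sleeps executed in sequence cost the sum of their delays:

import time

start = time.time()
time.sleep(1)    # blocks: nothing else runs in this thread
time.sleep(2)    # blocks again
print(f"elapsed: {time.time() - start:.1f}")   # => elapsed: 3.0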
import time
import queue

def task(name, queue):
    while not queue.empty():
        delay = queue.get()
        start_time = time.time()
        print(f"Task {name} running")
        time.sleep(delay)
        print(f"Task {name} total elapsed time: {time.time() - start_time:.1f}")
        yield

def main():
    """
    This is the main entry point for the program.
    """
    # Create the queue of 'work'
    work_queue = queue.Queue()

    # Put some 'work' in the queue
    for work in [15, 10, 5, 2]:
        work_queue.put(work)

    tasks = [
        task("One", work_queue),
        task("Two", work_queue)
    ]

    # Run the tasks
    start_time = time.time()
    done = False
    while not done:
        for t in tasks:
            try:
                next(t)
            except StopIteration:
                tasks.remove(t)
            if len(tasks) == 0:
                done = True

    print(f"\nTotal elapsed time: {time.time() - start_time:.1f}")

if __name__ == "__main__":
    main()
# => Task One running
# => Task One total elapsed time: 15.0
# => Task Two running
# => Task Two total elapsed time: 10.0
# => Task One running
# => Task One total elapsed time: 5.0
# => Task Two running
# => Task Two total elapsed time: 2.0
# => Total elapsed time: 32.0
The output shows that cooperating this way does nothing to speed the program up: the total run time, 32 seconds, is exactly the sum of the delays (15 + 10 + 5 + 2) introduced by the blocking calls.
4. Non-Blocking Calls
import asyncio
import time

async def task(name, work_queue):
    while not work_queue.empty():
        delay = await work_queue.get()
        start_time = time.time()
        print(f"Task {name} running")
        await asyncio.sleep(delay)
        print(f"Task {name} total elapsed time: {time.time() - start_time:.1f}")

async def main():
    """
    This is the main entry point for the program.
    """
    # Create the queue of 'work'
    work_queue = asyncio.Queue()

    # Put some 'work' in the queue
    for work in [15, 10, 5, 2]:
        await work_queue.put(work)

    # Run the tasks
    start_time = time.time()
    await asyncio.gather(
        asyncio.create_task(task("One", work_queue)),
        asyncio.create_task(task("Two", work_queue)),
    )

    print(f"\nTotal elapsed time: {time.time() - start_time:.1f}")

if __name__ == "__main__":
    asyncio.run(main())
# => Task One running
# => Task Two running
# => Task Two total elapsed time: 10.0
# => Task Two running
# => Task Two total elapsed time: 5.0
# => Task Two running
# => Task One total elapsed time: 15.0
# => Task Two total elapsed time: 2.0
# => Total elapsed time: 17.0
Unlike the previous version, this program uses await asyncio.sleep(delay) in place of both time.sleep(delay) and yield: it creates a non-blocking delay and returns control to the caller (the main() function).
The last line, asyncio.run(main()), creates an event loop and uses it to run main(), and through it the two task() instances.
The event loop is at the heart of Python's async machinery: it drives all of the working code. When a running task hits an await statement, control is handed back to the event loop.
The loop continuously monitors the state of every task and, as specific events fire, dispatches control back to the task waiting on each one. This keeps the CPU busy instead of letting some IO-bound operation park it in a waiting state.
await asyncio.sleep(delay) is non-blocking as far as the CPU is concerned. Rather than waiting out the delay, the program registers a sleep event with the event loop and immediately hands control back to the loop, which watches for events that fire (such as a delay expiring) and assigns work accordingly. That is why the total elapsed time drops from 32 seconds to 17: the two tasks sleep concurrently, and the run time is set by the longest chain of delays (10 + 5 + 2) rather than by their sum.
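A minimal standalone sketch makes the difference concrete: two non-blocking delays awaited concurrently cost roughly the longer delay, not the sum:

import asyncio
import time

async def main():
    start = time.time()
    # Both sleeps are registered with the event loop and run concurrently
    await asyncio.gather(asyncio.sleep(1), asyncio.sleep(2))
    print(f"elapsed: {time.time() - start:.1f}")   # => elapsed: 2.0

asyncio.run(main())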
5. Synchronous HTTP Requests
The version below swaps the simulated delays for real IO-bound work, fetching URLs over HTTP with the requests library, while keeping the generator-based cooperation. The code is as follows:
import queue
import requests
import time

def task(name, work_queue):
    with requests.Session() as session:
        while not work_queue.empty():
            url = work_queue.get()
            print(f"Task {name} getting URL: {url}")
            start = time.perf_counter()
            session.get(url)
            elapsed = time.perf_counter() - start
            print(f"Task {name} elapsed time: {elapsed}")
            yield

def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = queue.Queue()

    # Put some work in the queue
    for url in [
        "https://www.baidu.com",
        "https://www.jianshu.com/",
        "https://www.jd.com/",
        "https://www.tmall.com/",
        "https://www.zhihu.com/",
        "https://www.douban.com/",
        "https://www.bilibili.com/",
    ]:
        work_queue.put(url)

    tasks = [task("One", work_queue), task("Two", work_queue)]

    # Run the tasks
    done = False
    start = time.perf_counter()
    while not done:
        for t in tasks:
            try:
                next(t)
            except StopIteration:
                tasks.remove(t)
            if len(tasks) == 0:
                done = True

    elapsed = time.perf_counter() - start
    print(f"Total elapsed time: {elapsed}")

if __name__ == "__main__":
    main()
# => Task One getting URL: https://www.baidu.com
# => Task One elapsed time: 0.1149543260000001
# => Task Two getting URL: https://www.jianshu.com/
# => Task Two elapsed time: 0.09059067599999993
# => Task One getting URL: https://www.jd.com/
# => Task One elapsed time: 0.06162170499999986
# => Task Two getting URL: https://www.tmall.com/
# => Task Two elapsed time: 0.16310405699999975
# => Task One getting URL: https://www.zhihu.com/
# => Task One elapsed time: 5.111851316
# => Task Two getting URL: https://www.douban.com/
# => Task Two elapsed time: 0.31463871600000015
# => Task One getting URL: https://www.bilibili.com/
# => Task One elapsed time: 0.18364096799999974
# => Total elapsed time: 6.082241783
6. Asynchronous HTTP Requests
The asynchronous version replaces requests with aiohttp, an HTTP client library built on top of asyncio:
import asyncio
import aiohttp
import time

async def task(name, work_queue):
    async with aiohttp.ClientSession() as session:
        while not work_queue.empty():
            url = await work_queue.get()
            print(f"Task {name} getting URL: {url}")
            start = time.perf_counter()
            async with session.get(url) as response:
                await response.text()
            elapsed = time.perf_counter() - start
            print(f"Task {name} elapsed time: {elapsed}")

async def main():
    """
    This is the main entry point for the program
    """
    # Create the queue of work
    work_queue = asyncio.Queue()

    # Put some work in the queue
    for url in [
        "https://www.baidu.com",
        "https://www.jianshu.com/",
        "https://www.jd.com/",
        "https://www.tmall.com/",
        "https://www.zhihu.com/",
        "https://www.douban.com/",
        "https://www.bilibili.com/",
    ]:
        await work_queue.put(url)

    # Run the tasks
    start = time.perf_counter()
    await asyncio.gather(
        asyncio.create_task(task("One", work_queue)),
        asyncio.create_task(task("Two", work_queue)),
    )

    elapsed = time.perf_counter() - start
    print(f"Total elapsed time: {elapsed}")

if __name__ == "__main__":
    asyncio.run(main())
# => Task One getting URL: https://www.baidu.com
# => Task Two getting URL: https://www.jianshu.com/
# => Task One elapsed time: 0.21159282599999996
# => Task One getting URL: https://www.jd.com/
# => Task One elapsed time: 0.09301454900000006
# => Task One getting URL: https://www.tmall.com/
# => Task One elapsed time: 0.09578595199999995
# => Task One getting URL: https://www.zhihu.com/
# => Task Two elapsed time: 0.4948436540000001
# => Task Two getting URL: https://www.douban.com/
# => Task One elapsed time: 0.218484707
# => Task One getting URL: https://www.bilibili.com/
# => Task Two elapsed time: 0.2670722139999997
# => Task One elapsed time: 0.22822054000000014
# => Total elapsed time: 0.867767489
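Comparing the two runs, the asynchronous version fetched all seven URLs in well under a second (about 0.87s), while the synchronous version needed about six seconds (6.08s): while one request is waiting on the network, the event loop lets the other task start its own request instead of blocking.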