celery
How it works: producers push task messages onto a message queue (the broker); Celery workers listen on that queue, pull tasks as they arrive, and execute them concurrently (a process pool by default, or a coroutine pool such as eventlet), then save each task's return value in a result backend for the caller to read later
Implementing async tasks yourself: create a task queue (e.g. RabbitMQ/Redis) -> bind a worker program to the queue -> producers push tasks onto the queue -> the worker listens on the queue -> it runs the matching callback
Celery: no need to build the queue machinery ourselves; just configure the broker's address (IP and port) and Celery handles the rest
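The roll-your-own flow above can be sketched with stdlib primitives, using queue.Queue as a stand-in for RabbitMQ/Redis and a dict as the result store (all names here are hypothetical, for illustration only):

```python
import queue
import threading

task_queue = queue.Queue()  # stands in for the broker (RabbitMQ/Redis)
results = {}                # stands in for the result backend

def worker():
    # the "async program" bound to the queue: listen, run the callback, save the result
    while True:
        task_id, func, args = task_queue.get()
        try:
            results[task_id] = func(*args)
        finally:
            task_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

# producer side: push a task (id, callback, args) onto the queue
task_queue.put(("task-1", lambda x: x * 2, (21,)))
task_queue.join()  # block until the worker has drained the queue
print(results["task-1"])  # 42
```

Celery plays the role of the worker loop and the results store here, with a real broker in place of the in-process queue.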
Simple example
Define a task
# task_queue.py
import celery
import time

backend = 'redis://127.0.0.1:6379/1'  # where results are stored
broker = 'redis://127.0.0.1:6379/2'   # where the task queue lives

# configure the async task app
cel_buy = celery.Celery('buy', backend=backend, broker=broker)

# register an async task
@cel_buy.task
def buy(name):
    print("Buying item: {}".format(name))
    time.sleep(5)
    print("Buying item: {} done".format(name))
    # the return value is what gets saved to the result backend
    return name
Start the worker
celery worker -A task_queue -l info
# on Windows 10, add the option: -P eventlet
Other options:
-c N    set the concurrency level
On a successful start you should see output like:
...
- ** ---------- [config]
- ** ---------- .> app: buy:0x1fdb036b908
- ** ---------- .> transport: redis://127.0.0.1:6379/2
- ** ---------- .> results: redis://127.0.0.1:6379/1
- *** --- * --- .> concurrency: 4 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
# all registered tasks
[tasks]
. task_queue.buy
...
Enqueue a task and poll for the result
from task_queue import buy, cel_buy
from celery.result import AsyncResult
import time

def get_result(id, cel):
    while True:
        # look up the task's state in the given app's result backend by task ID
        async_result = AsyncResult(id=id, app=cel)
        if async_result.successful():
            result = async_result.get()
            return result
        elif async_result.failed():
            print('Task failed')
            break
        elif async_result.status == 'PENDING':
            print('Task is waiting to be executed...')
        elif async_result.status == 'RETRY':
            # keep polling: a retried task may still succeed
            print('Task raised an error and is being retried...')
        elif async_result.status == 'STARTED':
            print('Task has started executing...')
        time.sleep(1)

def main():
    # delay() pushes a call to the registered task onto the queue
    buy_task = buy.delay("aaa")
    result = get_result(buy_task.id, cel_buy)
    print(result)

main()
Note:
Leftover tasks: tasks that were enqueued but never executed remain in the broker waiting to run, so if you want a freshly restarted worker not to pick up stale tasks, clear out the old ones first
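One way to clear them (assuming the task module name from the example above) is Celery's built-in purge command, which discards every waiting message in the broker:

```shell
# delete all pending messages for this app's queue; -f skips the confirmation prompt
celery -A task_queue purge -f
```

Tasks already reserved by a running worker are not affected, and stored results in the backend are left untouched.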
Running multiple tasks in sequence
- Create multiple task apps:
# task_queue.py
import celery
import time

backend = 'redis://127.0.0.1:6379/1'  # where results are stored
broker = 'redis://127.0.0.1:6379/2'   # where the task queue lives

# app for the buy task
cel_buy = celery.Celery('buy', backend=backend, broker=broker)
# app for the notice task
cel_notice = celery.Celery('notice', backend=backend, broker=broker)

@cel_buy.task
def buy(name):
    print("Buying item: {}".format(name))
    time.sleep(5)
    print("Buying item: {} done".format(name))
    return name

@cel_notice.task
def notice(name):
    print("Notifying customer: item {} purchased successfully!".format(name))
    return "ok"
- Run the tasks and monitor progress:
from task_queue import buy, notice, cel_buy, cel_notice
from celery.result import AsyncResult
import time

def get_result(id, cel, callback=None):
    while True:
        async_result = AsyncResult(id=id, app=cel)
        if async_result.successful():
            result = async_result.get()
            if callback:
                return callback(result)
            return result
        elif async_result.failed():  # the task raised an error or was interrupted
            print('Task failed')
            break
        elif async_result.status == 'PENDING':
            print('Task is waiting to be executed...')
        elif async_result.status == 'RETRY':
            # keep polling: a retried task may still succeed
            print('Task raised an error and is being retried...')
        elif async_result.status == 'STARTED':
            print('Task has started executing...')
        time.sleep(1)

def callback(result):
    return notice.delay(result)

def main():
    # run the buy task first
    buy_task = buy.delay("aaa")
    # once the buy task succeeds, the callback enqueues the notice task
    notice_task = get_result(buy_task.id, cel_buy, callback)
    if notice_task:
        result = get_result(notice_task.id, cel_notice)
        print(result)

main()
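The poll-then-callback pattern in get_result is essentially a done-callback. The same chaining idea can be shown broker-free with stdlib futures (the buy/notice stand-ins below are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor
import time

pool = ThreadPoolExecutor(max_workers=2)
notified = []

def buy(name):
    # stand-in for the buy task
    time.sleep(0.1)
    return name

def on_buy_done(future):
    # fires when the buy "task" finishes; chains the notice step
    notified.append("notified: {}".format(future.result()))

future = pool.submit(buy, "aaa")
future.add_done_callback(on_buy_done)
pool.shutdown(wait=True)  # callbacks run in the worker thread before it exits
print(notified)  # ['notified: aaa']
```

Celery itself also ships primitives for this, e.g. buy.apply_async(('aaa',), link=notice.s()) or chain() from its canvas module; those assume both tasks are registered on a single app, whereas the example above spans two apps, hence the manual polling.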
Multi-task structure
- Schedule/config file:
# schedule.py
from datetime import timedelta
from celery import Celery
from celery.schedules import crontab

cel = Celery('tasks', broker='redis://127.0.0.1:6379/1', backend='redis://127.0.0.1:6379/2', include=[
    # every module that defines tasks
    'task_queue',
])

cel.conf.beat_schedule = {
    # name of the periodic task
    '6s-buy': {
        # task function, given as module.function
        'task': 'task_queue.buy',
        # enqueue this task every 6 seconds
        'schedule': timedelta(seconds=6),
        # positional arguments passed to the task
        'args': ('aaa',)
    },
}
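crontab is imported above but never used; it accepts cron-style fields in place of a timedelta. A sketch of an extra beat_schedule entry (the 'daily-notice' name is hypothetical; this is a fragment of schedule.py, so cel comes from above):

```python
from celery.schedules import crontab

cel.conf.beat_schedule['daily-notice'] = {
    'task': 'task_queue.notice',
    # fire every day at 08:30
    'schedule': crontab(hour=8, minute=30),
    'args': ('aaa',),
}
```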
- Task file:
# task_queue.py
from schedule import cel
import time

@cel.task
def buy(name):
    print("Buying item: {}".format(name))
    time.sleep(5)
    print("Buying item: {} done".format(name))
    return name

@cel.task
def notice(name):
    print("Notifying customer: item {} purchased successfully!".format(name))
    return "ok"
First run the command below to read the config file and enqueue tasks on schedule:
celery beat -A schedule
# schedule is the config module name
Then start the worker:
celery -A task_queue worker -l info -P eventlet
# task_queue is the task module name
More references:
https://www.cnblogs.com/pyedu/p/12461819.html
https://www.jianshu.com/p/57414db33c27