Today I ran into duplicate execution while running a Celery task: the task was executed 8 times (my machine has 8 cores).
A bit of googling turned up a Celery mechanism: if a task runs longer than visibility_timeout without finishing, the broker hands the task to another worker to start over. The default is one hour:
app.conf.broker_transport_options = {'visibility_timeout': 3600}
But my task certainly didn't run for an hour, so why was it still being re-executed, and 8 times at that?
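The count of 8 most likely comes from Celery's prefork pool, which by default starts one worker process per CPU core. One way to test that theory (the app name demo below is a placeholder, not from my project) is to cap concurrency when starting the worker:

```shell
# Start a single worker process instead of one per CPU core
celery -A demo worker --concurrency=1 --loglevel=info
```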
# task code
import json
import paho.mqtt.subscribe as subscribe
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from celery import task

channel_layer = get_channel_layer()

def on_message(client, userdata, msg):
    # if msg.topic == "result":
    payload = json.loads(msg.payload)
    print(payload)
    group_name = payload['group_name']
    async_to_sync(channel_layer.group_send)(
        group_name,
        {"type": "send.message", "message": payload['message']}
    )

@task
def listen():
    subscribe.callback(callback=on_message, topics='result')
subscribe.callback is a function from the paho-mqtt library; it blocks until a message arrives on the result topic.
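To make the callback's job concrete, here is the payload handling from on_message isolated into a small runnable sketch (the sample payload is invented for illustration):

```python
import json

def parse_payload(raw):
    # msg.payload arrives as bytes; json.loads accepts bytes on Python 3.6+
    payload = json.loads(raw)
    return payload['group_name'], payload['message']

raw = b'{"group_name": "room1", "message": "hello"}'
group_name, message = parse_payload(raw)
print(group_name, message)  # room1 hello
```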
That left me stuck for a while, until I found the celery-once library, which solved the duplicate-execution problem (GitHub link).
The detailed steps and explanation are in the repo's README; here is a quick rundown of my changes.
# demo/settings
from celery import Celery

celery = Celery('mytest.task', broker='redis://@127.0.0.1:6379/0')
celery.conf.ONCE = {
    'backend': 'celery_once.backends.Redis',
    'settings': {
        'url': 'redis://localhost:6379/0',
        'default_timeout': 60 * 60
    }
}
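To make the default_timeout setting concrete: celery-once records a lock in Redis when a task is queued and refuses to queue the same task again until the lock is released or expires. The class below is a toy in-memory version of that idea, not celery-once's actual implementation:

```python
import time

class SimpleOnceLock:
    """Toy in-memory stand-in for the Redis lock celery-once keeps:
    a task key can be acquired only when nobody holds it, and a held
    lock auto-expires after `timeout` seconds (the default_timeout above)."""

    def __init__(self, timeout=60 * 60):
        self.timeout = timeout
        self._locks = {}  # key -> expiry timestamp

    def acquire(self, key):
        now = time.time()
        expiry = self._locks.get(key)
        if expiry is not None and expiry > now:
            return False  # task already queued or running
        self._locks[key] = now + self.timeout
        return True

    def release(self, key):
        self._locks.pop(key, None)

lock = SimpleOnceLock(timeout=3600)
print(lock.acquire('listen'))  # True:  first worker gets the lock
print(lock.acquire('listen'))  # False: the duplicate is rejected
lock.release('listen')
print(lock.acquire('listen'))  # True:  free again after release
```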
# demo/celery
# This snippet comes straight from the official Celery docs; copy it as-is
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'whthas_home.settings')

app = Celery('portal')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
#demo/__init__.py
from .celery import app as celery_app
# mytest/task
import json
import paho.mqtt.subscribe as subscribe
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from celery_once import QueueOnce
# Import the celery object from settings here; decorating with a plain
# celery.task instead raises a "task not found" error
from simpleDemo.settings import celery

channel_layer = get_channel_layer()

def on_message(client, userdata, msg):
    # if msg.topic == "result":
    payload = json.loads(msg.payload)
    print(payload)
    group_name = payload['group_name']
    async_to_sync(channel_layer.group_send)(
        group_name,
        {"type": "send.message", "message": payload['message']}
    )

@celery.task(base=QueueOnce, once={'graceful': True})
def listen():
    subscribe.callback(callback=on_message, topics='result')
After that everything worked normally; the task is no longer executed repeatedly.
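As for once={'graceful': True}: per celery-once's documented behavior, a duplicate apply_async normally raises an AlreadyQueued exception, and graceful makes it return None silently instead. A minimal sketch of that dispatch choice (a hypothetical helper, not the library's code):

```python
class AlreadyQueued(Exception):
    pass

def dispatch(lock_acquired, graceful):
    # Mirrors celery-once's choice on a duplicate enqueue:
    # raise AlreadyQueued, unless graceful=True, then drop it silently
    if lock_acquired:
        return 'queued'
    if graceful:
        return None  # duplicate dropped without an error
    raise AlreadyQueued()

print(dispatch(True, graceful=True))   # queued
print(dispatch(False, graceful=True))  # None
```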