A simple Django scheduled-task plugin, crontab:
https://www.jianshu.com/p/f6e80e6125cc

References:
https://blog.youkuaiyun.com/lm_is_dc/article/details/82705450
https://www.colabug.com/4075504.html
1. Install celery with pip (the versions matter):

Python 3.6.4 (under Python 3.7, `async` became a reserved keyword, which breaks celery 3.x)
celery == 3.1.23
django-celery == 3.2.2
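Pinning both packages in one command (assuming `pip` points at the Python 3.6 environment):

```shell
pip install celery==3.1.23 django-celery==3.2.2
```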
2. Install RabbitMQ (I used Docker):

https://blog.youkuaiyun.com/saga_gallon/article/details/81482384  (RabbitMQ vhost command-line administration reference)
https://blog.youkuaiyun.com/leisure_life/article/details/78707338
https://blog.youkuaiyun.com/qq_16855077/article/details/81288552  (vhost management-UI reference)
docker pull rabbitmq:3.7.7-management
docker run -d --name rabbitmq3.7.7 -p 5672:5672 -p 15672:15672 \
    -v `pwd`/data:/var/lib/rabbitmq --hostname myRabbit \
    -e RABBITMQ_DEFAULT_VHOST=my_vhost -e RABBITMQ_DEFAULT_USER=admin \
    -e RABBITMQ_DEFAULT_PASS=admin rabbitmq:3.7.7-management
Parameter notes: RABBITMQ_DEFAULT_VHOST is the vhost name, RABBITMQ_DEFAULT_USER the username, and RABBITMQ_DEFAULT_PASS the password.
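Once the container is running, the broker can be sanity-checked from the host; the container name matches `--name` above, and the management UI is served at http://127.0.0.1:15672:

```shell
docker exec rabbitmq3.7.7 rabbitmqctl list_vhosts   # should include my_vhost
docker exec rabbitmq3.7.7 rabbitmqctl list_users    # should include admin
```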
3. Add the following lines to settings.py:
from __future__ import absolute_import  # must be at the very top of the file

# celery configuration
import djcelery
djcelery.setup_loader()
BROKER_URL = 'amqp://admin:admin@127.0.0.1:5672/my_vhost'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # database-backed periodic tasks
CELERY_ACCEPT_CONTENT = ['pickle', 'json']
CELERYD_CONCURRENCY = 8  # number of concurrent worker processes
CELERYD_FORCE_EXECV = True  # very important: prevents deadlocks in some situations
CELERYD_MAX_TASKS_PER_CHILD = 100  # recycle each worker after 100 tasks, guarding against memory leaks
CELERY_DISABLE_RATE_LIMITS = True  # disable task rate limiting

INSTALLED_APPS += ['djcelery']  # register djcelery
4. Configure the adminx.py file:
from __future__ import absolute_import, unicode_literals
from djcelery.models import (
TaskState, WorkerState,
PeriodicTask, IntervalSchedule, CrontabSchedule,
)
from xadmin.sites import site
site.register(IntervalSchedule)  # stores the intervals for recurring tasks
site.register(CrontabSchedule)   # stores the crontab schedules for timed tasks
site.register(PeriodicTask)      # stores the tasks themselves
site.register(TaskState)         # stores task execution state
site.register(WorkerState)       # stores the workers that execute tasks
5. Create celery.py in the same directory as settings.py:
from __future__ import absolute_import
import os
from celery import Celery, platforms
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project.settings')  # replace your_project with your project name
app = Celery('your_project')
platforms.C_FORCE_ROOT = True  # allow running the worker as root
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(packages=settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
6. Put the following in the project package's __init__.py:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ['celery_app']
7. Create tasks.py inside the app:
from __future__ import absolute_import
from celery import shared_task

@shared_task
def test_task():
    print("test succeeded")
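Once the worker is running, the task can be fired immediately with `.delay()`, or registered as a recurring job through djcelery's database-backed scheduler. A minimal sketch, to be run inside `python manage.py shell` so Django settings are loaded (`myapp` is a placeholder for your app name):

```python
# Run inside `python manage.py shell`.
from myapp.tasks import test_task  # `myapp` is a placeholder app name
from djcelery.models import PeriodicTask, IntervalSchedule

# Fire the task once, asynchronously, through the broker:
test_task.delay()

# Or register a recurring task that beat picks up from the database:
schedule, _ = IntervalSchedule.objects.get_or_create(every=10, period='seconds')
PeriodicTask.objects.get_or_create(
    name='run test_task every 10 seconds',
    task='myapp.tasks.test_task',
    interval=schedule,
)
```

The same entries can also be created by hand in the xadmin pages registered in step 4.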
8. Startup commands:

python manage.py makemigrations
python manage.py migrate

# Linux: one process runs both the consumer and the producer (beat drives the schedule)
python manage.py celery worker --beat

# Windows: start beat first, then the worker (replace your_project with the project name)
celery -A your_project beat
celery -A your_project worker