How to use django-celery for asynchronous tasks in Django
Install Celery
We can install it with pip inside a virtualenv:
pip install django-celery celery
django settings configuration
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://127.0.0.1:6379/2'  # use Redis as the message broker
Register the app
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'devops',
    'apps',
    'common',
    'djcelery',
]
Create a task.py file under apps
from celery import task

@task
def add(x, y):
    return x + y
@task
def pp():
    return 'ffffffffffffffffffffff'
Go back to settings and import the task module
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://127.0.0.1:6379/2'
CELERY_IMPORTS = ('apps.task',)
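A small detail worth noting in the setting above: in Python it is the trailing comma, not the parentheses, that makes a one-element tuple. `('apps.task')` is just a parenthesized string, and depending on the Celery version a bare string here may not be treated as a list of module names, so writing the tuple explicitly is safer:

```python
# The trailing comma, not the parentheses, creates a tuple.
just_a_string = ('apps.task')     # parentheses are only grouping -> str
one_item_tuple = ('apps.task',)   # trailing comma -> tuple

print(type(just_a_string).__name__)   # str
print(type(one_item_tuple).__name__)  # tuple
```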
Pick any view method and call the task to test it
# Project list
def project_list(request):
    admin = Admin.objects.get(id=get_current_admin_id(request))
    pt = admin.projects.all().order_by('-id')
    from apps import task
    tt = task.add.delay(2, 2)
    print 'vvvvvvvvvvvvvvvvvvvv', tt  # calls the add task whenever this view is hit
    page_objects = pages(pt, request, 5)  # pagination
    return render_to_response('project/project_list.html', locals())
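The key point in the view above is that `task.add.delay(2, 2)` returns immediately with a result handle while a worker computes the sum in the background. As a rough stand-alone analogy (using only the standard library, not Celery itself), the pattern looks like this:

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

# Submitting returns a handle at once, much like task.add.delay(2, 2);
# the actual work runs elsewhere (here: another thread, in Celery: a worker).
with ThreadPoolExecutor(max_workers=1) as pool:
    handle = pool.submit(add, 2, 2)   # returns immediately, does not block
    print(handle.result())            # block only when we ask for the value -> 4
```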
Install and start Redis: omitted here.
Start runserver and the Celery worker
python manage.py runserver
python manage.py celery worker --loglevel=info
Visit the corresponding page and check the log
[2017-11-10 14:55:17,016: INFO/MainProcess] Task apps.task.add[9bafe6d2-8411-4f5f-8eed-10444da0ae3a] succeeded in 0.00280212797225s: 4  # the worker log shows the task and its return value
Testing the task manually
Open a new terminal, activate the virtualenv, and switch to the Django project directory:
$ python manage.py shell
>>> from apps.task import add
>>> add.delay(2, 2)
You can now see the worker execute the task in the worker window:
[2014-10-07 08:47:08,076: INFO/MainProcess] Got task from broker: myapp.tasks.add[e080e047-b2a2-43a7-af74-d7d9d98b02fc]
[2014-10-07 08:47:08,299: INFO/MainProcess] Task myapp.tasks.add[e080e047-b2a2-43a7-af74-d7d9d98b02fc] succeeded in 0.183349132538s: 4
Eager mode
If you set CELERY_ALWAYS_EAGER = True in settings.py, Celery runs in eager mode and tasks no longer need delay to run:
# With eager mode enabled, the following two lines are equivalent
add.delay(2, 2)
add(2, 2)
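To illustrate what the eager flag changes, here is a toy decorator (not Celery's actual implementation) whose delay() simply runs the function inline when an eager flag is set, instead of serializing the call and sending it to a broker:

```python
ALWAYS_EAGER = True  # analogous to CELERY_ALWAYS_EAGER in settings.py

def task(fn):
    """Toy stand-in for Celery's @task: attaches a .delay() method."""
    def delay(*args, **kwargs):
        if ALWAYS_EAGER:
            return fn(*args, **kwargs)  # run synchronously, in-process
        raise RuntimeError('would serialize the call and send it to the broker here')
    fn.delay = delay
    return fn

@task
def add(x, y):
    return x + y

print(add.delay(2, 2))  # 4 -- same as calling add(2, 2) directly under eager mode
print(add(2, 2))        # 4
```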
dj-celery scheduled tasks
Add to the settings configuration
from datetime import timedelta

CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # scheduled tasks
CELERYBEAT_SCHEDULE = {
    'add-every-3-seconds': {
        'task': 'apps.task.pp',
        'schedule': timedelta(seconds=3),  # run the pp task every 3 seconds
    },
}
If you need to pass arguments, write it like this:
CELERYBEAT_SCHEDULE = {
    'add-every-3-minutes': {
        'task': 'apps.task.add',
        'schedule': timedelta(seconds=3),
        'args': (16, 16),
    },
}
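The 'args' tuple is what beat passes positionally when the task fires, i.e. effectively apps.task.add(*args). In plain Python terms:

```python
def add(x, y):
    return x + y

entry = {'task': add, 'args': (16, 16)}   # mirrors the schedule entry above
result = entry['task'](*entry['args'])    # how the args tuple is applied
print(result)  # 32
```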
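Conceptually, the beat scheduler keeps a "next run" timestamp per schedule entry and, once that time passes, enqueues the task and advances the timestamp by the entry's timedelta. A minimal sketch of that bookkeeping (not Celery's actual implementation) shows why the worker log fires at a steady 3-second cadence:

```python
from datetime import datetime, timedelta

schedule = timedelta(seconds=3)                 # same as the 'schedule' key above
last_run = datetime(2017, 11, 10, 16, 17, 25)   # a hypothetical first firing time

# The next three firing times, each 3 seconds after the previous one.
next_runs = [last_run + schedule * i for i in range(1, 4)]
for t in next_runs:
    print(t.time())  # 16:17:28, 16:17:31, 16:17:34
```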
Start celery beat
Scheduled tasks are executed through the celerybeat process. celerybeat keeps running, and whenever a scheduled task is due it pushes the task onto the queue. Unlike workers, only one celerybeat process is needed.
Start it with: python manage.py celery beat --loglevel=info
There is also a simpler way to start the worker and beat together:
python manage.py celery worker --loglevel=info --beat
Check the worker log
[2017-11-10 16:17:25,853: INFO/MainProcess] Received task: apps.task.pp[8a3af6fb-5189-4647-91f2-8aa07489dd1e]
[2017-11-10 16:17:25,858: INFO/MainProcess] Task apps.task.pp[8a3af6fb-5189-4647-91f2-8aa07489dd1e] succeeded in 0.00379144400358s: 'ffffffffffffffffffffff'
[2017-11-10 16:17:28,858: INFO/MainProcess] Received task: apps.task.pp[d87e4ea0-8881-449a-b993-e7657f50ef25]
[2017-11-10 16:17:28,864: INFO/MainProcess] Task apps.task.pp[d87e4ea0-8881-449a-b993-e7657f50ef25] succeeded in 0.0049942266196s: 'ffffffffffffffffffffff'
[2017-11-10 16:17:31,859: INFO/MainProcess] Received task: apps.task.pp[4d05b4f3-92ff-4922-a8f4-7e047749239a]
[2017-11-10 16:17:31,865: INFO/MainProcess] Task apps.task.pp[4d05b4f3-92ff-4922-a8f4-7e047749239a] succeeded in 0.00537821277976s: 'ffffffffffffffffffffff'
[2017-11-10 16:17:34,859: INFO/MainProcess] Received task: apps.task.pp[5b21afc1-ebf1-4858-be68-20b9bf318452]
[2017-11-10 16:17:34,865: INFO/MainProcess] Task apps.task.pp[5b21afc1-ebf1-4858-be68-20b9bf318452] succeeded in 0.00530335493386s: 'ffffffffffffffffffffff'
Reference: http://www.codeweblog.com/djcelery%E5%85%A5%E9%97%A8-%E5%AE%9E%E7%8E%B0%E8%BF%90%E8%A1%8C%E5%AE%9A%E6%97%B6%E4%BB%BB%E5%8A%A1/
django-celery reference docs:
http://blog.youkuaiyun.com/acm_zl/article/details/53188064
https://www.cnblogs.com/znicy/p/5626040.html
http://www.weiguda.com/blog/73/
https://www.cnblogs.com/Lin-Yi/p/7590971.html
As a supplement, there is another approach: using plain Celery directly. See the following reference for the configuration details:
http://www.jianshu.com/p/7a869a73b92f
From the ITPUB blog: http://blog.itpub.net/29096438/viewspace-2147120/ (please credit the source when reposting).