1. Install Redis.
2. Install Celery and eventlet:
pip install celery
pip install eventlet
3. Create tasks.py:
from celery import Celery

app = Celery('tasks', broker='redis://:****@127.0.0.1:6379/0')

@app.task
def add(x, y):
    print("running...", x, y)
    return x + y
4. Start Redis (with the password configured).
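One way to start a password-protected Redis instance (a sketch; `yourpassword` is a placeholder — the usual alternative is setting `requirepass` in redis.conf):

```shell
# Start Redis requiring clients to authenticate with the given password.
# Assumes redis-server is on PATH.
redis-server --requirepass yourpassword
```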
5. At a command prompt, run celery -A tasks worker -l info -P eventlet (tasks is the module name; -P eventlet selects the eventlet pool, since the default prefork pool does not work reliably on Windows):
D:\PycharmProjects\celery and redis>celery -A tasks worker -l info -P eventlet
-------------- celery@888888888 v4.4.7 (cliffs)
--- ***** -----
-- ******* ---- Windows-8888888 2020-11-18 16:00:27
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: tasks:88888888
- ** ---------- .> transport: redis://:**@127.0.0.1:6379/0
- ** ---------- .> results: disabled://
- *** --- * --- .> concurrency: 4 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. tasks.add
[2020-11-18 16:00:27,289: INFO/MainProcess] Connected to redis://:**@127.0.0.1:6379/0
[2020-11-18 16:00:27,298: INFO/MainProcess] mingle: searching for neighbors
[2020-11-18 16:00:28,319: INFO/MainProcess] mingle: all alone
[2020-11-18 16:00:28,342: INFO/MainProcess] pidbox: Connected to redis://:**@127.0.0.1:6379/0.
6. In another command prompt, change to the directory containing tasks.py and start a Python shell:
D:\PycharmProjects\celery and redis>python
Python 3.5.2 |Anaconda 4.2.0 (64-bit)| (default, Jul 5 2016, 11:41:13) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from tasks import add
>>> add.delay(2,2)
<AsyncResult: ed2338f6-b03c-4779-8b52-7b7f43cb4b44>
>>>
7. The worker console started in step 5 shows:
[2020-11-18 16:02:12,335: INFO/MainProcess] Received task: tasks.add[ed2338f6-b03c-4779-8b52-7b7f43cb4b44]
[2020-11-18 16:02:12,336: WARNING/MainProcess] running...
[2020-11-18 16:02:12,336: WARNING/MainProcess] 2
[2020-11-18 16:02:12,336: WARNING/MainProcess] 2
[2020-11-18 16:02:12,337: INFO/MainProcess] Task tasks.add[ed2338f6-b03c-4779-8b52-7b7f43cb4b44] succeeded in 0.0s: 4
This post walks through setting up and running a Celery task queue backed by Redis: install Redis, install Celery and eventlet via pip, define a task in tasks.py, start the password-protected Redis server, launch a worker from the command line with the celery command, and finally call the task from a Python shell in the tasks.py directory while watching it execute in the Celery worker.