django admin + celery beat: a simple scheduled task management platform

I had been wanting to build a platform like this for a while. My front-end skills are fairly weak, so I used Django's built-in admin as the back end to put together a scheduled task management platform.
The project structure is as follows:

djangotask/
├── app01
│   ├── admin.py
│   ├── apps.py
│   ├── __init__.py
│   ├── migrations
│   │   ├── __init__.py
│   │   └── __pycache__
│   │       └── __init__.cpython-38.pyc
│   ├── models.py
│   ├── __pycache__
│   │   ├── admin.cpython-38.pyc
│   │   ├── apps.cpython-38.pyc
│   │   ├── __init__.cpython-38.pyc
│   │   ├── models.cpython-38.pyc
│   │   ├── tasks.cpython-38.pyc
│   │   ├── urls.cpython-38.pyc
│   │   └── views.cpython-38.pyc
│   ├── tasks.py
│   ├── tests.py
│   ├── urls.py
│   └── views.py
├── django-db
├── djangotask
│   ├── asgi.py
│   ├── celery.py
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── celery.cpython-38.pyc
│   │   ├── config.cpython-38.pyc
│   │   ├── __init__.cpython-38.pyc
│   │   ├── settings.cpython-38.pyc
│   │   ├── urls.cpython-38.pyc
│   │   └── wsgi.cpython-38.pyc
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── manage.py
├── requirements.txt
└── scripts
    ├── __init__.py
    └── purgelog.sh

The required modules are installed with pip; there is not much to say about that. The two Django apps involved are:

'django_celery_beat',
'django_celery_results',

celery beat periodically scans the scheduled tasks we add (via its heartbeat), which is very convenient, and django_celery_results stores the execution results of our tasks. If you want something fancier, you can browse the scheduled tasks through rest_framework; I will not cover that here since earlier articles already do, and interested readers can look there.
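For reference, the install step might look roughly like this (the exact package list is my reading of INSTALLED_APPS plus the MySQL and Redis settings below; versions are left unpinned):

pip install django celery django-celery-beat django-celery-results djangorestframework redis mysqlclient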
This is just a simple functional test, so I created an app01 myself (django-admin startapp app01). The full settings.py is as follows:

"""
Django settings for djangotask project.

Generated by 'django-admin startproject' using Django 3.2.13.

For more information on this file, see
https://docs.djangoproject.com/en/3.2/topics/settings/

For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.2/ref/settings/
"""

from pathlib import Path

# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent


# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.2/howto/deployment/checklist/

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = 'django-insecure-ro4r=2^w8d2z_w)cc9n4wthi+xuplnp2k=*ksb91^r!je0@te8'

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True

ALLOWED_HOSTS = ['*']


# Application definition

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'app01',
    'django_celery_beat',
    'django_celery_results',
    'rest_framework',
]

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]

ROOT_URLCONF = 'djangotask.urls'

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]

WSGI_APPLICATION = 'djangotask.wsgi.application'


# Database
# https://docs.djangoproject.com/en/3.2/ref/settings/#databases

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'celeryapp',
        'USER': 'dbadmin',
        'PASSWORD': 'dbadmin',
        'HOST': '192.168.56.104',
        'PORT': '3306',
        'OPTIONS': {
            'init_command': "SET sql_mode='STRICT_TRANS_TABLES'",
            'charset': 'utf8mb4'
        }
    }
}


# Password validation
# https://docs.djangoproject.com/en/3.2/ref/settings/#auth-password-validators

AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]


# Internationalization
# https://docs.djangoproject.com/en/3.2/topics/i18n/

LANGUAGE_CODE = 'zh-Hans'

TIME_ZONE = 'Asia/Shanghai'

USE_I18N = True

USE_L10N = True

USE_TZ = False


# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.2/howto/static-files/

STATIC_URL = '/static/'

# Default primary key field type
# https://docs.djangoproject.com/en/3.2/ref/settings/#default-auto-field

DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'


# Celery settings
# Message broker
CELERY_BROKER_URL = "redis://192.168.56.104:6380/10"
CELERY_TIMEZONE = TIME_ZONE
CELERY_ENABLE_UTC = False
DJANGO_CELERY_BEAT_TZ_AWARE = False
# Where task state and results are stored
# CELERY_RESULT_BACKEND = "redis://10.6.3.10:6379/2"
# (Not read by the CELERY_ namespace; the result backend is set to 'django-db' in celery.py.)
RESULT_BACKEND = "redis://192.168.56.104:6380/11"
# Message format for task payloads and results, JSON by default
CELERY_ACCEPT_CONTENT = ['application/json', ]
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
# Task time limit in seconds. On timeout the task is aborted and the next one runs.
# CELERY_TASK_TIME_LIMIT = 5
# Expiry for stored results, 1 day by default. With beat running, Celery cleans them up daily.
# Set to 0 to keep results forever.
# CELERY_RESULT_EXPIRES = xx
# Task rate limiting
CELERY_TASK_ANNOTATIONS = {'tasks.add': {'rate_limit': '10/s'}}
# Worker concurrency, defaults to the number of CPU cores; can be left unset
CELERY_WORKER_CONCURRENCY = 20
# Recycle each worker process after this many tasks; unlimited by default
CELERY_WORKER_MAX_TASKS_PER_CHILD = 200
# Use the database-backed scheduler from django_celery_beat
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'

Everything from the "# Celery settings" comment onward is Celery configuration; these are the important Celery parameters for this project.
Next, add a celery.py file inside the djangotask package:

from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

from .settings import INSTALLED_APPS

# Make sure Django settings are loaded before the Celery app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'djangotask.settings')

app = Celery('djangotask')
# app.config_from_object('djangotask.config')
# Read every setting prefixed with CELERY_ from django.conf.settings.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Store task results through the Django ORM (django_celery_results).
app.conf.update(CELERY_RESULT_BACKEND='django-db')
print(app.conf.get('result_backend'))
# Scan every installed app for a tasks.py module.
app.autodiscover_tasks(lambda: INSTALLED_APPS)

The lambda at the end is an anonymous function: autodiscover_tasks scans every installed app for a tasks module. Why pass a callable? Looking at the source, Celery checks whether the packages argument is callable and, if it is, calls it to get the list of packages, so the list is evaluated lazily. A neat bit of source code:


    def _autodiscover_tasks_from_names(self, packages, related_name):
        # packages argument can be lazy
        return self.loader.autodiscover_tasks(
            packages() if callable(packages) else packages, related_name,
        )
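
One detail the listing above relies on: for the @shared_task decorators to bind to this app when Django starts, djangotask/__init__.py normally imports the Celery app as well. This is the standard pattern from the Celery documentation; I am assuming the project's __init__.py follows it:

# djangotask/__init__.py
from __future__ import absolute_import, unicode_literals

# Load the Celery app whenever Django starts so @shared_task binds to it.
from .celery import app as celery_app

__all__ = ('celery_app',)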

Create a tasks.py file under app01 with the following content:

from __future__ import absolute_import, unicode_literals
from subprocess import getstatusoutput

from celery import shared_task


@shared_task()
def add(x, y):
    print("invoke x+y")
    return x + y


@shared_task
def mul(x, y):
    print("invoke mul")
    return x * y


@shared_task()
def customize_task(*args, **kwargs):
    # Fetch the custom script command to run.
    cmd = kwargs.get('cmd')
    # Whether it is Python or shell, run it through subprocess.
    (status, result) = getstatusoutput(cmd)
    return status, result


Now start celery and celery beat. Note that the broker lives in Redis, while the result backend is django-db (i.e. the Django ORM). Many people report that after startup the broker has data in Redis, but nothing shows up in django-db, that is, in the model imported via
from django_celery_results.models import TaskResult
I ran into the same thing at first; starting the processes as follows solved it for me.
The celery beat start command:

celery -A djangotask beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler  
(python38) [root@mysql04 djangotask]# celery -A djangotask beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler 
django-db
celery beat v5.2.6 (dawn-chorus) is starting.
__    -    ... __   -        _
LocalTime -> 2022-06-16 11:32:25
Configuration ->
    . broker -> redis://192.168.56.104:6380/10
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> django_celery_beat.schedulers.DatabaseScheduler

    . logfile -> [stderr]@%INFO
    . maxinterval -> 5.00 seconds (5s)
[2022-06-16 11:32:25,591: INFO/MainProcess] beat: Starting...

The celery worker start command:

celery -A djangotask worker -s django -c 10 -E  --loglevel=info

Note that the command adds the extra -s django option.

(python38) [root@mysql04 djangotask]# celery -A djangotask worker -s django -c 10 -E  --loglevel=info
django-db
/virtual/python38/lib/python3.8/site-packages/celery/platforms.py:840: SecurityWarning: You're running the worker with superuser privileges: this is
absolutely not recommended!

Please specify a different user using the --uid option.

User information: uid=0 euid=0 gid=0 egid=0

  warnings.warn(SecurityWarning(ROOT_DISCOURAGED.format(
 
 -------------- celery@mysql04 v5.2.6 (dawn-chorus)
--- ***** ----- 
-- ******* ---- Linux-3.10.0-1160.24.1.el7.x86_64-x86_64-with-glibc2.17 2022-06-16 11:33:18
- *** --- * --- 
- ** ---------- [config]
- ** ---------- .> app:         djangotask:0x7f5f21c1a340
- ** ---------- .> transport:   redis://192.168.56.104:6380/10
- ** ---------- .> results:     
- *** --- * --- .> concurrency: 10 (prefork)
-- ******* ---- .> task events: ON
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . app01.tasks.add
  . app01.tasks.customize_task
  . app01.tasks.mul

[2022-06-16 11:33:18,768: INFO/MainProcess] Connected to redis://192.168.56.104:6380/10
[2022-06-16 11:33:18,770: INFO/MainProcess] mingle: searching for neighbors
[2022-06-16 11:33:19,786: INFO/MainProcess] mingle: all alone
[2022-06-16 11:33:19,793: WARNING/MainProcess] /virtual/python38/lib/python3.8/site-packages/celery/fixups/django.py:203: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  warnings.warn('''Using settings.DEBUG leads to a memory

[2022-06-16 11:33:19,793: INFO/MainProcess] celery@mysql04 ready.
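
With beat and the worker running, a quick way to confirm that results actually land in django_celery_results is to enqueue a task from a Django shell and check the TaskResult table. A small verification sketch (the sleep is only to give the worker time to finish):

# Run inside: python manage.py shell
import time

from app01.tasks import add
from django_celery_results.models import TaskResult

res = add.delay(2, 3)                      # goes through the Redis broker
time.sleep(5)                              # give the worker a moment
print(res.status, res.get(timeout=10))     # AsyncResult read back from django-db
print(TaskResult.objects.filter(task_id=res.task_id).values('status', 'result'))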

The app01/tasks.py file above contains one task in particular:

@shared_task()
def customize_task(*args, **kwargs):
    # Fetch the custom script command to run.
    cmd = kwargs.get('cmd')
    # Whether it is Python or shell, run it through subprocess.
    (status, result) = getstatusoutput(cmd)
    return status, result

Its purpose is to run custom scripts: the server can invoke any script we provide, so tasks can be tailored freely. Below I demonstrate this in the Django admin.
This interface should be familiar: it is Django's built-in admin. You can schedule with Intervals, Crontabs, or Clocked time options; here we create a task based on a crontab.
Then set the task content to fire at that time.
Opening the periodic task, you can pass arguments to it.
Pretty neat: you can define the task however you like.
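
For completeness, the same kind of entry can also be created programmatically through the django_celery_beat models instead of clicking through the admin. A rough sketch (the cron values, the task name 'purge logs', and the script path are placeholders of mine):

import json

from django_celery_beat.models import CrontabSchedule, PeriodicTask

# A crontab entry: every 5 minutes (placeholder schedule).
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='*/5',
    hour='*',
    day_of_week='*',
    day_of_month='*',
    month_of_year='*',
)

# Periodic task that runs the shell script through customize_task.
PeriodicTask.objects.update_or_create(
    name='purge logs',
    defaults={
        'task': 'app01.tasks.customize_task',
        'crontab': schedule,
        'kwargs': json.dumps({'cmd': 'bash scripts/purgelog.sh'}),
        'enabled': True,
    },
)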

The purgelog.sh script looks like this:

#!/bin/bash
ssh 192.168.56.104 ifconfig;
sleep 11;
echo "我爱北京天安门"

The task ran at the specified time as expected, which is exactly the result we wanted. Tasks can also be enqueued from a web request: the front end gets a response back quickly, while the heavier work (sending email, database backups, and so on) is handled in the background.
The views.py below shows this:

from django.http import JsonResponse
from django_celery_results.models import TaskResult

from app01 import tasks


def index(request, *args, **kwargs):
    # Enqueue the task and return immediately with its id.
    res = tasks.add.delay(*(1, 3))
    return JsonResponse({'status': 'successful', 'task_id': res.task_id})

The view returns a JsonResponse right away while the worker handles tasks.add in the background. That wraps up this project.
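
Since TaskResult is already imported in views.py, a companion view can also report the stored result for a given task_id once the worker finishes. A minimal sketch (the task_status view and its URL wiring are my additions, not part of the original project):

# Addition to app01/views.py (hypothetical helper)
from django.http import JsonResponse
from django_celery_results.models import TaskResult


def task_status(request, task_id):
    # Read the row written by the django-db result backend.
    try:
        tr = TaskResult.objects.get(task_id=task_id)
    except TaskResult.DoesNotExist:
        return JsonResponse({'status': 'PENDING', 'task_id': task_id})
    return JsonResponse({
        'status': tr.status,      # e.g. SUCCESS / FAILURE
        'result': tr.result,      # JSON-serialized return value
        'date_done': tr.date_done,
    })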
Download: the complete source code for this Django scheduled task system.
