Configuring Celery periodic tasks in the Django admin

Date: 2021-03-25 07:45:04

1. Install djcelery

 

pip install django-celery

 

2. Configure the Django project's settings.py

A. djcelery settings

# CELERY STUFF
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://localhost:6379'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # periodic tasks stored in the database
# CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'  # alternative: store results in the database
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Shanghai'
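With `DatabaseScheduler`, schedules come from the admin-managed tables configured in step 5. For comparison, Celery also supports a static schedule declared directly in settings; a hedged sketch (the timing and arguments here are examples, not from the original project):

```python
from celery.schedules import crontab

# Static alternative to the admin-driven DatabaseScheduler:
# run the collection task every 30 minutes (example timing/args).
CELERYBEAT_SCHEDULE = {
    'update-cmdb': {
        'task': 'saltstack.tasks.server_collects',
        'schedule': crontab(minute='*/30'),
        'args': ('*', 1),   # example: target pattern and server_id
    },
}
```

The advantage of the database scheduler is that entries like the one above become editable rows in the admin instead of requiring a redeploy.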

 

B. Add djcelery to INSTALLED_APPS

INSTALLED_APPS = (
    # 'bootstrap_admin',
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'djcelery',
)
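djcelery stores its periodic-task definitions and results in database tables, so after adding it to INSTALLED_APPS those tables must be created. The exact command depends on the Django version in use:

```shell
# Django < 1.7 (the django-celery era):
python manage.py syncdb
# Django >= 1.7:
python manage.py migrate djcelery
```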

 

3. Create a task that collects host information through the SaltStack API

from celery import task
from saltstack.saltapi import SaltAPI
from saltstack.models import Servers  # model import was omitted in the original snippet

@task
def server_collects(tgt, server_id):
    server_list = Servers.objects.all()  # assumed: the queryset returned in the task result
    contexts = {'server_list': server_list, 'server_id': server_id}
    try:
        sapi = SaltAPI(url="http://192.168.62.200:8000", username="kbson", password="kbson")
        # grains.items returns per-minion hardware/OS facts; manage.status lists up/down minions
        grains = sapi.SaltCmd(tgt=tgt, fun='grains.items', client='local')['return'][0]
        minions = sapi.key_list('manage.status', client='runner')['return'][0]
        if minions and grains:
            for i in grains.keys():
                try:
                    server = Servers.objects.get(local_ip=i)
                except Servers.DoesNotExist:
                    server = Servers()
                minions_status = '0' if i in minions['up'] else '1'
                server.hostname = grains[i]['host']
                server.local_ip = grains[i]['id']
                server.OS = '%s %s-%s' % (grains[i]['os'], grains[i]['osrelease'], grains[i]['osarch'])
                server.Mem = grains[i]['mem_total']
                server.Cpus = grains[i]['num_cpus']
                server.Cpu_type = grains[i]['cpu_model']
                server.minion_id = grains[i]['id']
                server.app = grains[i]['virtual']
                server.server_status = minions_status
                server.save()
            contexts.update({'success': u'%s collected successfully' % tgt})
        if not grains:
            contexts.update({'error': u'%s host does not exist or is offline' % tgt})
    except Exception as e:
        contexts.update({'error': '%s %s' % (tgt, e)})
    return contexts
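The up/down bookkeeping inside the task can be exercised in isolation. A minimal sketch with hypothetical sample data shaped like the `grains.items` and `manage.status` results (the helper `minion_statuses` is not part of the original code):

```python
def minion_statuses(grains, minions):
    """Map each minion key from grains to '0' (up) or '1' (down),
    mirroring the status bookkeeping inside server_collects."""
    return {ip: '0' if ip in minions['up'] else '1' for ip in grains}

# Hypothetical sample data
grains = {'192.168.62.201': {'host': 'web1'}, '192.168.62.202': {'host': 'db1'}}
minions = {'up': ['192.168.62.201'], 'down': ['192.168.62.202']}
print(minion_statuses(grains, minions))
# {'192.168.62.201': '0', '192.168.62.202': '1'}
```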

 

4. Use supervisor to daemonize Celery: starting the worker and beat processes

worker: the consumer that executes tasks. In practice, multiple workers are usually run across several servers to increase throughput.

beat: the task scheduler. The beat process reads the schedule configuration and periodically sends tasks that have come due to the task queue. It stays running, and whenever a periodic task is due it enqueues it. Unlike workers, only a single beat process is needed.
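The beat loop described above can be sketched in a few lines of plain Python. This is an illustrative model of the idea, not Celery's actual implementation:

```python
from queue import Queue

def run_beat(schedule, queue, now):
    """One tick of a beat-style scheduler: push due tasks onto the queue
    and record their run time. schedule maps task name ->
    (interval_seconds, last_run_timestamp); returns the updated schedule."""
    updated = {}
    for name, (interval, last_run) in schedule.items():
        if now - last_run >= interval:
            queue.put(name)                  # hand the due task to the queue
            updated[name] = (interval, now)
        else:
            updated[name] = (interval, last_run)
    return updated

q = Queue()
sched = {'update cmdb': (300, 0)}            # every 5 minutes, never run yet
sched = run_beat(sched, q, now=301)
print(q.get_nowait())  # update cmdb
```

Workers then consume from the queue independently, which is why beat must be unique while workers can scale out.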

[program:DJ.celeryd]
command=/usr/local/python27/bin/python /data/PRG/saltruler/manage.py celery worker --loglevel=info
user=root
numprocs=1
directory=/data/PRG/saltruler
stdout_logfile=/var/log/celery_worker.log
stderr_logfile=/var/log/celery_worker.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=120
priority=998

[program:DJ.celerybeat]
command=/usr/local/python/bin/python /data/PRG/saltruler/manage.py celery beat --schedule=/tmp/celerybeat-schedule --pidfile=/tmp/django_celerybeat.pid --loglevel=INFO
user=root
numprocs=1
directory=/data/PRG/saltruler
stdout_logfile=/var/log/celery_beat.log
stderr_logfile=/var/log/celery_beat.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=120
priority=998
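After dropping both program blocks into supervisor's configuration directory, reload supervisor and check that both processes are running (standard supervisorctl commands):

```shell
supervisorctl reread     # pick up the new program definitions
supervisorctl update     # start newly added programs
supervisorctl status DJ.celeryd DJ.celerybeat
```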

 

 

 

5. Configure Celery periodic tasks in the Django admin

Open the Django admin at http://192.168.62.200:8000/admin/djcelery/periodictask/

(screenshot: the djcelery PeriodicTask page in the Django admin)

 

Add a periodic task

(screenshot: adding a periodic task)

Review the periodic tasks you added

(screenshot: the list of configured periodic tasks)

 

Check the logs to confirm the tasks are executing; normal execution produces output similar to the following:

[2017-04-20 13:30:00,001: INFO/Beat] Scheduler: Sending due task update cmdb (saltstack.tasks.server_collects)
[2017-04-20 13:30:00,006: INFO/MainProcess] Received task: saltstack.tasks.server_collects[f46d2c70-1550-4dec-b5aa-036fbffd2a5f]
[2017-04-20 13:30:00,058: WARNING/Worker-18] https://192.168.62.200:8000
[2017-04-20 13:30:07,163: INFO/MainProcess] Task saltstack.tasks.server_collects[f46d2c70-1550-4dec-b5aa-036fbffd2a5f] succeeded in 7.15584068373s: {'server_id': 1L, 'success': u'* \u6536\u96c6\u6210\u529f', 'server_list': <QuerySet [<SaltServer: games -...
[2017-04-20 13:31:49,945: INFO/Beat] Writing entries (1)...