A personally tested, working environment.
If you install other versions along the way, you will run into countless problems, for example:
File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\kombu\transport\redis.py", line 146, in append
    pipe.zadd(self.unacked_index_key, delivery_tag, time()) \
File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\redis\client.py", line 2320, in zadd
    for pair in iteritems(mapping):
File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\redis\_compat.py", line 109, in iteritems
    return iter(x.items())
AttributeError: 'str' object has no attribute 'items'
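This AttributeError is the well-known redis-py 3.x incompatibility: redis-py 3.0 changed `zadd` to take a mapping of `{member: score}`, while the older kombu in this stack still calls `zadd(key, score, member)` positionally, so a plain string reaches code that expects a dict. A minimal pure-Python reproduction (no Redis server involved; `zadd_v3_style` is a hypothetical stand-in for redis-py's internals):

```python
# Hypothetical stand-in for redis-py 3.x's zadd internals, which iterate
# mapping.items(); no real Redis connection is involved.
def zadd_v3_style(mapping):
    return dict(mapping.items())

# redis-py 3.x style: members and scores passed as a dict -> works
print(zadd_v3_style({'delivery_tag': 1.0}))

# old kombu style: the score/member arrives positionally as a plain string,
# reproducing the AttributeError from the traceback above
try:
    zadd_v3_style('delivery_tag')
except AttributeError as exc:
    print(exc)  # 'str' object has no attribute 'items'
```

Pinning redis-py below 3.0, or upgrading kombu/celery to versions aware of the new signature, resolves the mismatch.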
File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\djcelery\management\commands\celery.py", line 11, in <module>
    class Command(CeleryCommand):
File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\djcelery\management\commands\celery.py", line 15, in Command
    base.get_options() +
TypeError: can only concatenate tuple (not "NoneType") to tuple
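This TypeError is the known django-celery/celery version mismatch: under celery 4.x, `get_options()` returns None by default (options moved to argparse), while djcelery's management command still concatenates its return value to a tuple. A pure-Python sketch of the failure (the `get_options_*` functions are hypothetical stand-ins for the two celery APIs):

```python
# Hypothetical stand-ins for the two celery generations.
def get_options_celery3():
    return ('--concurrency',)   # celery 3.x: a tuple of extra options

def get_options_celery4():
    return None                 # celery 4.x: returns None by default

base_options = ('--loglevel',)

print(base_options + get_options_celery3())  # fine under celery 3.x

try:
    base_options + get_options_celery4()     # the TypeError from the traceback
except TypeError as exc:
    print(exc)  # can only concatenate tuple (not "NoneType") to tuple
```

In short, djcelery 3.x only works alongside celery 3.x; installing celery 4+ next to it produces exactly this crash.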
Python exceeding the maximum recursion depth.
In any case, celery has a lot of pitfalls. The working code is below.
settings.py configuration
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'djcelery',
    'course',
]
from .celeryconfig import *
BROKER_BACKEND = 'redis'
BROKER_URL = 'redis://127.0.0.1:6379/1'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/2'
celeryconfig.py configuration
import djcelery
from datetime import timedelta  # needed by CELERYBEAT_SCHEDULE below

djcelery.setup_loader()

CELERY_QUEUES = {
    'beat_tasks': {
        'exchange': 'beat_tasks',
        'exchange_type': 'direct',
        'binding_key': 'beat_tasks'
    },
    'work_queue': {
        'exchange': 'work_queue',
        'exchange_type': 'direct',
        'binding_key': 'work_queue'
    }
}

CELERY_DEFAULT_QUEUE = 'work_queue'

CELERY_IMPORTS = (
    'course.tasks',  # the trailing comma matters: without it this is a str, not a tuple
)
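One easy-to-miss detail in CELERY_IMPORTS: without a trailing comma, the parentheses are just grouping and the value is a plain string rather than a one-element tuple, which celery may then iterate character by character:

```python
# Parentheses alone do not make a tuple; the trailing comma does.
not_a_tuple = ('course.tasks')
a_tuple = ('course.tasks',)

print(type(not_a_tuple).__name__)  # str
print(type(a_tuple).__name__)      # tuple
```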
# Can prevent deadlocks in some situations
CELERYD_FORCE_EXECV = True

# Number of concurrent worker processes
CELERYD_CONCURRENCY = 4

# Acknowledge a task only after it finishes, so failed tasks can be retried
CELERY_ACKS_LATE = True

# Recycle each worker after it has run 100 tasks, to guard against memory leaks
CELERYD_MAX_TASKS_PER_CHILD = 100

# Maximum runtime of a single task; the task is killed if it exceeds this
CELERYD_TASK_TIME_LIMIT = 12 * 30

# Periodic tasks
CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'course-task',
        'schedule': timedelta(seconds=5),  # run every 5 seconds
        'options': {
            'queue': 'beat_tasks'  # this periodic task runs on the beat_tasks queue
        }
    }
}
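The schedule above refers to a task named 'course-task', and views.py below imports CourseTask from course.tasks, but that module is not shown in the post. A hypothetical sketch of what it looks like in the djcelery/celery 3.x class-based style (the prints and structure are assumptions, not the author's actual file):

```python
# course/tasks.py (hypothetical sketch for the djcelery / celery 3.x era)
from celery.task import Task


class CourseTask(Task):
    # must match the 'task' key in CELERYBEAT_SCHEDULE
    name = 'course-task'

    def run(self, *args, **kwargs):
        print('start course task')
        print('end course task')
```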
views.py
from django.shortcuts import render
from django.http import JsonResponse

from course.tasks import CourseTask


def do(request):
    # Run the task asynchronously
    print('start do request')
    # CourseTask.delay()  # delay does the same job as apply_async; apply_async exposes more options
    CourseTask.apply_async(args=('hello',), queue='work_queue')
    print('end do request')
    return JsonResponse({'result': 'ok'})
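As the commented-out line notes, delay and apply_async do the same job: delay(*args, **kwargs) is shorthand for apply_async(args=args, kwargs=kwargs) with no routing options. A minimal mock makes the relationship explicit (FakeTask is an assumption for illustration, not celery's real class):

```python
# FakeTask is a hypothetical mock of celery's Task dispatch API, just to show
# how delay forwards to apply_async.
class FakeTask:
    def apply_async(self, args=None, kwargs=None, queue=None):
        return {'args': tuple(args or ()), 'kwargs': dict(kwargs or {}), 'queue': queue}

    def delay(self, *args, **kwargs):
        # delay(*a, **kw) is exactly apply_async(args=a, kwargs=kw)
        return self.apply_async(args=args, kwargs=kwargs)


t = FakeTask()
print(t.delay('hello'))                                    # no routing: queue is None
print(t.apply_async(args=('hello',), queue='work_queue'))  # explicit queue
```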
urls.py
from django.contrib import admin
from django.urls import path

from course.views import do

urlpatterns = [
    path('admin/', admin.site.urls),
    path('do/', do, name='do'),
]
python manage.py runserver
python manage.py celery worker -l info (the worker: starts a worker node to process tasks)
python manage.py celery beat -l info (the scheduler: starts the periodic tasks; when one is due, beat puts it into the broker and a worker picks it up and runs it)
python manage.py celery flower (monitoring with error logs; the worker must be started first)
supervisor is a process manager (pip install supervisor) that can manage flower, the worker, uwsgi, and so on; a post introducing supervisor will follow later.