Using djcelery and configuring the environment

Celery environment

Versions I have personally tested and verified to work together:

  • python3.6
    amqp==1.4.9
    anyjson==0.3.3
    Babel==2.7.0
    billiard==3.3.0.23
    celery==3.1.26.post2
    celery-with-redis==3.0
    Django==2.2.2
    django-celery==3.2.2
    djangorestframework==3.9.2
    dnspython==1.16.0
    eventlet==0.25.0
    flower==0.9.3
    greenlet==0.4.15
    kombu==3.0.37
    meld3==1.0.2
    monotonic==1.5
    PyMySQL==0.9.3
    pytz==2019.1
    redis==2.10.6
    six==1.12.0
    sqlparse==0.3.0
    supervisor==4.0.3
    tornado==5.1.1
    vine==1.3.0

If you end up installing different versions of any of these packages, you will run into countless problems, for example:

  File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\kombu\transport\redis.py", line 146, in append
    pipe.zadd(self.unacked_index_key, delivery_tag, time()) \
  File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\redis\client.py", line 2320, in zadd
    for pair in iteritems(mapping):
  File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\redis\_compat.py", line 109, in iteritems
    return iter(x.items())
AttributeError: 'str' object has no attribute 'items'

  File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\djcelery\management\commands\celery.py", line 11, in <module>
    class Command(CeleryCommand):
  File "C:\Users\Administrator\Envs\xmzdjango\lib\site-packages\djcelery\management\commands\celery.py", line 15, in Command
    base.get_options() +
TypeError: can only concatenate tuple (not "NoneType") to tuple

Python maximum recursion depth exceeded
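The first traceback is caused by a redis-py API change: in redis-py 3.0, `zadd` switched from positional score/member arguments to a `{member: score}` mapping, while kombu 3.x still calls it the old way. A toy reproduction of that mismatch (the function and argument names here are illustrative, not the real redis-py code):

```python
def zadd_v3(name, mapping):
    """Mimics redis-py >= 3.0: zadd expects a dict of {member: score}."""
    return [(member, score) for member, score in mapping.items()]

# New-style call works:
print(zadd_v3('unacked_index', {'delivery_tag': 1560000000.0}))

# Old-style call (what kombu 3.x does) passes a plain string where a
# mapping is expected -- reproducing "'str' object has no attribute 'items'":
try:
    zadd_v3('unacked_index', 'delivery_tag')
except AttributeError as e:
    print(e)
```

This is why pinning redis to 2.10.6, as in the list above, avoids the error.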

In short, Celery has a truly remarkable number of pitfalls.
The code follows below.
settings.py configuration

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'djcelery',
    'course'
]

# pull in the Celery settings defined below
from .celeryconfig import *
BROKER_BACKEND = 'redis'
BROKER_URL = 'redis://127.0.0.1:6379/1'  # broker on redis db 1
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/2'  # results on redis db 2

celeryconfig.py configuration

from datetime import timedelta

import djcelery

djcelery.setup_loader()


CELERY_QUEUES = {
    'beat_tasks': {
        'exchange': 'beat_tasks',
        'exchange_type': 'direct',
        'binding_key': 'beat_tasks'
    },
    'work_queue': {
        'exchange': 'work_queue',
        'exchange_type': 'direct',
        'binding_key': 'work_queue'
    }
}


CELERY_DEFAULT_QUEUE = 'work_queue'

CELERY_IMPORTS = (
    'course.tasks',  # trailing comma keeps this a tuple, not a plain string
)
#  In some cases this prevents deadlocks
CELERYD_FORCE_EXECV = True

#  Number of concurrent worker processes
CELERYD_CONCURRENCY = 4

# Acknowledge tasks only after they finish, so failed tasks can be retried
CELERY_ACKS_LATE = True

#  Recycle each worker after it has run 100 tasks, to guard against memory leaks
CELERYD_MAX_TASKS_PER_CHILD = 100

#  Maximum runtime (in seconds) for a single task; tasks exceeding it are killed
CELERYD_TASK_TIME_LIMIT = 12 * 30

#  Periodic tasks
CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'course-task',
        'schedule': timedelta(seconds=5),  # run every 5 seconds
        'options': {
            'queue': 'beat_tasks'  # run this periodic task on the beat_tasks queue
        }
    }
}

views.py

from django.shortcuts import render
from django.http import JsonResponse
from course.tasks import CourseTask

def do(request):

    #  Kick off the asynchronous task
    print('start do request')
    # CourseTask.delay()  # delay() does the same job as apply_async(); apply_async() is more full-featured
    CourseTask.apply_async(args=('hello', ), queue='work_queue')
    print('end do request')
    return JsonResponse({'result': 'ok'})

urls.py

from django.contrib import admin
from django.urls import path
from course.views import do
urlpatterns = [
    path('admin/', admin.site.urls),
    path('do/', do, name='do')
]
Commands
  • python manage.py runserver

  • python manage.py celery worker -l info — starts a worker node (the "worker") to process tasks

  • python manage.py celery beat -l info — starts the scheduler (the "leader"); when a task is due, beat puts it on the broker, and a worker picks it up and runs it

  • python manage.py celery flower — monitoring and error-log dashboard; the worker must be started before flower

For process management, supervisor (pip install supervisor) can manage flower, the workers, uwsgi, and so on; I will write a separate post about supervisor later.
