Error 1:
[2019-02-16 15:47:43,063: ERROR/MainProcess] Unrecoverable error: AttributeError("'str' object has no attribute 'items'",)
Traceback (most recent call last):
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/celery/worker/__init__.py", line 206, in start
self.blueprint.start(self)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/celery/bootsteps.py", line 123, in start
step.start(parent)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/celery/bootsteps.py", line 374, in start
return self.obj.start()
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/celery/worker/consumer.py", line 280, in start
blueprint.start(self)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/celery/bootsteps.py", line 123, in start
step.start(parent)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/celery/worker/consumer.py", line 884, in start
c.loop(*c.loop_args())
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/celery/worker/loops.py", line 76, in asynloop
next(loop)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/kombu/async/hub.py", line 340, in create_loop
cb(*cbargs)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/kombu/transport/redis.py", line 1019, in on_readable
self._callbacks[queue](message)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/kombu/transport/virtual/__init__.py", line 534, in _callback
self.qos.append(message, message.delivery_tag)
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/kombu/transport/redis.py", line 146, in append
pipe.zadd(self.unacked_index_key, delivery_tag, time()) \
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/redis/client.py", line 2320, in zadd
for pair in iteritems(mapping):
File "/home/damon96/.virtualenvs/oakvipENV/lib/python3.6/site-packages/redis/_compat.py", line 122, in iteritems
return iter(x.items())
AttributeError: 'str' object has no attribute 'items'
Cause: the version of the third-party redis package used by the project is too high.
Solution: downgrade redis, e.g. pip install redis==2.10.6
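The underlying API change: in redis-py 3.x, zadd takes a single mapping of member -> score instead of separate positional arguments, so the old kombu call passes a string where a dict is expected and .items() fails. A minimal sketch of the difference, assuming a local Redis instance (the key and tag names are just placeholders):
import time
import redis

r = redis.StrictRedis()
delivery_tag = 'example-tag'

# redis-py 2.x accepted the member and score as separate positional arguments;
# redis-py 3.x only accepts a mapping of member -> score:
r.zadd('unacked_index', {delivery_tag: time.time()})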
Error 2:
Running a worker with superuser privileges when the
worker accepts messages serialized with pickle is a very bad idea!
If you really want to continue then you have to set the C_FORCE_ROOT
environment variable (but please think about this before you do).
Cause: the Celery worker needs to be run as the root user, which Celery refuses to do by default.
Solution:
from celery import Celery, platforms

app = Celery('tasks')
platforms.C_FORCE_ROOT = True  # add this line

@app.task
def add(x, y):
    return x + y
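Alternatively, as the warning itself suggests, the same effect can be had by setting the environment variable in the shell that starts the worker:
export C_FORCE_ROOT="true"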
Error 3:
Traceback (most recent call last):
File "/Users/aromanovich/Envs/celery3.1/lib/python2.7/site-packages/celery/app/trace.py", line 218, in trace_task
R = retval = fun(*args, **kwargs)
File "/Users/aromanovich/Envs/celery3.1/lib/python2.7/site-packages/celery/app/trace.py", line 398, in __protected_call__
return self.run(*args, **kwargs)
File "/Users/aromanovich/Projects/celery/app.py", line 10, in add
manager = multiprocessing.Manager()
File "/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/__init__.py", line 99, in Manager
m.start()
File "/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/managers.py", line 524, in start
self._process.start()
File "/usr/local/Cellar/python/2.7.6/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/process.py", line 124, in start
'daemonic processes are not allowed to have children'
Cause: Celery's prefork pool runs tasks inside daemonic child processes, and daemonic processes are not allowed to spawn children of their own, so creating a multiprocessing.Manager() (which starts a new process) inside a task fails. The check is an assert statement, which is why disabling assertions works around it.
Solution: set the environment variable export PYTHONOPTIMIZE=1 (PYTHONOPTIMIZE strips assert statements).
Error 4:
Traceback (most recent call last):
File "manage.py", line 10, in
execute_from_command_line(sys.argv)
File "D:\Python27\lib\site-packages\django\core\management\__init__.py", line 354, in execute_from_command_line
utility.execute()
File "D:\Python27\lib\site-packages\django\core\management\__init__.py", line 346, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "D:\Python27\lib\site-packages\django\core\management\__init__.py", line 190, in fetch_command
klass = load_command_class(app_name, subcommand)
File "D:\Python27\lib\site-packages\django\core\management\__init__.py", line 40, in load_command_class
module = import_module('%s.management.commands.%s' % (app_name, name))
File "D:\Python27\lib\importlib\__init__.py", line 37, in import_module
__import__(name)
File "D:\Python27\lib\site-packages\djcelery\management\commands\celery.py", line 11, in
class Command(CeleryCommand):
File "D:\Python27\lib\site-packages\djcelery\management\commands\celery.py", line 15, in Command
base.get_options() +
TypeError: can only concatenate tuple (not "NoneType") to tuple
Cause: this error is caused by missing packages; check whether each of the following packages is installed (a quick way to check them is shown after the list).
Solution:
pip install celery
pip install celery-with-redis
pip install django-celery
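To see which of these are already installed, and at what version, something like the following can be run before reinstalling:
pip show celery celery-with-redis django-celery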
Error 5:
Traceback (most recent call last):
File "D:\Python27\lib\site-packages\celery\app\trace.py", line 240, in trace_task
R = retval = fun(*args, **kwargs)
File "D:\Python27\lib\site-packages\celery\app\trace.py", line 438, in __protected_call__
return self.run(*args, **kwargs)
File "F:\virtual\project\agrometeorological\agrometeorological\ganter\tasks.py", line 34, in get_access
g.account_set.all()
File "D:\Python27\lib\site-packages\django\db\models\manager.py", line 228, in all
return self.get_queryset()
File "D:\Python27\lib\site-packages\django\db\models\fields\related.py", line 712, in get_queryset
qs = qs.filter(**self.core_filters)
File "D:\Python27\lib\site-packages\django\db\models\query.py", line 679, in filter
return self._filter_or_exclude(False, *args, **kwargs)
File "D:\Python27\lib\site-packages\django\db\models\query.py", line 697, in _filter_or_exclude
clone.query.add_q(Q(*args, **kwargs))
File "D:\Python27\lib\site-packages\django\db\models\sql\query.py", line 1310, in add_q
clause, require_inner = self._add_q(where_part, self.used_aliases)
File "D:\Python27\lib\site-packages\django\db\models\sql\query.py", line 1338, in _add_q
allow_joins=allow_joins, split_subq=split_subq,
File "D:\Python27\lib\site-packages\django\db\models\sql\query.py", line 1150, in build_filter
lookups, parts, reffed_expression = self.solve_lookup_type(arg)
File "D:\Python27\lib\site-packages\django\db\models\sql\query.py", line 1036, in solve_lookup_type
_, field, _, lookup_parts = self.names_to_path(lookup_splitted, self.get_meta())
File "D:\Python27\lib\site-packages\django\db\models\sql\query.py", line 1373, in names_to_path
if field.is_relation and not field.related_model:
File "D:\Python27\lib\site-packages\django\utils\functional.py", line 59, in __get__
res = instance.__dict__[self.name] = self.func(instance)
File "D:\Python27\lib\site-packages\django\db\models\fields\related.py", line 110, in related_model
apps.check_models_ready()
File "D:\Python27\lib\site-packages\django\apps\registry.py", line 131, in check_models_ready
raise AppRegistryNotReady("Models aren't loaded yet.")
AppRegistryNotReady: Models aren't loaded yet.
Cause: I hit this error when doing a reverse foreign-key query inside a periodic task.
Solution:
# The Django app registry is not fully loaded. The fix is to call django.setup() at the very top of the imports in the periodic tasks file, like this:
# -*- coding:utf-8 -*-
import django
django.setup()
from celery.schedules import crontab_parser
from celery import shared_task
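Note that django.setup() needs to know which settings module to use; if the tasks file can be imported outside of manage.py, it is safer to set DJANGO_SETTINGS_MODULE first. A minimal sketch ('myproject.settings' is a placeholder for your real settings path):
# -*- coding:utf-8 -*-
import os
import django

# placeholder settings path; replace with your project's settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django.setup()  # load the Django app registry before any model imports

from celery import shared_task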
Error 6:
ERROR/MainProcess] Received unregistered task of type 'generate_static_sku_detail_html'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.
The full contents of the message body was:
b'[[2], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (78b)
Traceback (most recent call last):
File "/home/python/.virtualenvs/meiduo/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", line 558, in on_task_received
strategy = strategies[type_]
KeyError: 'generate_static_sku_detail_html'
Cause: the project was not running, so the module containing the task was never imported and the task was never registered with the worker.
Solution: start the project first, then start Celery.
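If the task is still reported as unregistered, a common way to make sure the worker sees it is to let the Celery app discover every app's tasks module explicitly. A minimal sketch of that setup (the project name is a placeholder, not taken from the original post):
from celery import Celery

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
# scan every installed Django app for a tasks.py so that tasks such as
# generate_static_sku_detail_html are registered when the worker starts
app.autodiscover_tasks()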
Tip 1: tasks being executed more than once
I ran into duplicate execution of scheduled tasks with Celery; at the time Redis was being used as both the broker and the result backend.
The official documentation describes the relevant behaviour.
In short: when a task is scheduled with an ETA further away than visibility_timeout, then every time visibility_timeout elapses Celery decides the task was never successfully executed by a worker and redelivers it to another worker.
The fix is to raise visibility_timeout so that it is comfortably larger than the longest ETA you use; Celery is primarily positioned as a real-time asynchronous queue, and its support for tasks scheduled far into the future is not great. A config sketch follows below.
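With a Redis broker, visibility_timeout is set through the broker transport options. A minimal sketch (the 43200-second value is only an example; pick something larger than your longest ETA):
app.conf.broker_transport_options = {'visibility_timeout': 43200}  # 12 hours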
Even with that, the task was executed again the next day...
In the end my solution was: after each run of the scheduled task, write a unique key with a timestamp into Redis; before the next run, read that key back, compare its value with the current time, and only execute when the required interval has passed. That guarantees the same task runs at most once within the configured window; a sketch follows below.
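A minimal sketch of that idea, assuming a redis-py client; the key name and the interval are illustrative placeholders:
import time
import redis

r = redis.StrictRedis()
DEDUP_KEY = 'mytask:last_run'   # placeholder key, one per task
MIN_INTERVAL = 60 * 60          # example: allow one run per hour

def should_run():
    # compare the timestamp left by the previous run with the current time
    last = r.get(DEDUP_KEY)
    return last is None or time.time() - float(last) >= MIN_INTERVAL

def mark_ran():
    # record when this run happened, for the next check
    r.set(DEDUP_KEY, time.time())
Inside the task, call should_run() at the top and mark_ran() once the work has finished.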
Tip 2: use multiple queues for your tasks
When you have many tasks, don't lazily put everything on the default queue: tasks interfere with each other and slow each other down, and important tasks can no longer be executed promptly. Everyone knows not to put all their eggs in one basket.
Configuration: CELERY_ROUTES = {'feed.tasks.import_feed': {'queue': 'feeds'}}
Usage: celery -A proj worker -Q feeds,celery
Declaring the routes is enough for the corresponding queues to be created automatically; then start the worker with -Q to pick the queues it consumes. The default queue is named celery; the official documentation explains how to rename it.
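A slightly fuller sketch with two routed queues plus the default one (the second task path is a made-up example):
CELERY_ROUTES = {
    'feed.tasks.import_feed': {'queue': 'feeds'},
    'web.tasks.send_email': {'queue': 'emails'},  # illustrative task
}
# one worker per queue, plus one for the default 'celery' queue:
# celery -A proj worker -Q feeds
# celery -A proj worker -Q emails
# celery -A proj worker -Q celery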
Tip 3: start multiple workers for different kinds of tasks
On the same machine, tasks of different priority are best handled by separate workers, for example real-time tasks separated from periodic tasks, and high-frequency tasks separated from low-frequency ones. This helps guarantee that high-priority tasks get their share of system resources; it also keeps the verbose logs of high-frequency real-time tasks out of the way, since each worker can write to its own log file, which makes the logs much easier to read.
$ celery -A proj worker --loglevel=INFO --concurrency=10 -n worker1.%h
$ celery -A proj worker --loglevel=INFO --concurrency=10 -n worker2.%h
$ celery -A proj worker --loglevel=INFO --concurrency=10 -n worker3.%h
Different workers can be started like this; %h is expanded to the hostname. See the official documentation for the details.
High-priority tasks can be given a larger concurrency, but more worker processes is not automatically better; it is enough that tasks do not pile up. A sketch of binding these workers to separate queues follows below.
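Combined with the queues from tip 2, each worker can be pinned to its own queue and given its own concurrency and log file; a sketch with illustrative queue and file names:
$ celery -A proj worker -Q realtime --loglevel=INFO --concurrency=20 -n realtime.%h --logfile=realtime.log
$ celery -A proj worker -Q periodic --loglevel=INFO --concurrency=5 -n periodic.%h --logfile=periodic.log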
Tip 4: decide whether you really need the task's execution state
This depends on the business scenario: if you do not care about the result, or the task itself modifies data in a way that already tells you whether it succeeded, there is no need to keep the Celery task's exit state. You can set:
Globally: CELERY_IGNORE_RESULT = True
Or for a single task:
@app.task(ignore_result=True)
def mytask(…):
    something()
However, if the business logic needs to react to the task's execution state, do not configure this.
Tip 5: guard against memory leaks
Celery workers that run for a long time may leak memory; this can be mitigated with a setting such as:
CELERYD_MAX_TASKS_PER_CHILD = 40  # a worker child process is replaced after executing this many tasks
Tip 6: after a crash and restart, put the interrupted tasks back on the queue
来源:https://www.waitig.com/%E8%A7%A3%E5%86%B3celery%E8%BF%9B%E7%A8%8B%E9%87%8D%E5%90%AF%E5%90%8E%EF%BC%8C%E6%AD%A3%E5%9C%A8%E8%BF%9B%E8%A1%8C%E4%B8%AD%E7%9A%84%E4%BB%BB%E5%8A%A1%E4%B8%A2%E5%A4%B1%E6%88%96%E8%80%85%E6%A0%87.html
task_reject_on_worker_lost = True
task_acks_late = True
With this configuration tasks are not lost: a task that was interrupted is executed again the next time the worker starts.
task_reject_on_worker_lost makes the task go back on the queue when the worker process exits unexpectedly.
task_acks_late makes the task be acknowledged only once the worker has finished it.
Note that the message broker should preferably be one that supports ACK semantics, such as RabbitMQ; a configuration sketch follows below.
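A minimal sketch of where these settings usually live, assuming a Celery 4 style app object (the project name and broker URL are placeholders):
from celery import Celery

app = Celery('proj', broker='amqp://localhost')  # RabbitMQ broker, which supports ACKs
app.conf.task_acks_late = True              # ack only after the task has finished
app.conf.task_reject_on_worker_lost = True  # requeue the task if the worker dies mid-run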