Celery Installation and Configuration (Windows)

  1. Install RabbitMQ. The installer prompts you to install Erlang first; just follow the prompts.
  2. Verify that RabbitMQ was installed successfully.
    a. Start the RabbitMQ Service.
    b. Open the RabbitMQ Command Prompt (sbin dir) and run: rabbitmqctl status. It failed with the following error:
    Status of node rabbit@DESKTOP-P6AC70L ...
    Error: unable to perform an operation on node 'rabbit@DESKTOP-P6AC70L'. Please see diagnostics information and suggestions below.
    
    Most common reasons for this are:
    
     * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues)
     * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server)
     * Target node is not running
    
    In addition to the diagnostics info below:
    
     * See the CLI, clustering and networking guides on http://rabbitmq.com/documentation.html to learn more
     * Consult server logs on node rabbit@DESKTOP-P6AC70L
    
    DIAGNOSTICS
    ===========
    
    attempted to contact: ['rabbit@DESKTOP-P6AC70L']
    
    rabbit@DESKTOP-P6AC70L:
      * connected to epmd (port 4369) on DESKTOP-P6AC70L
      * epmd reports node 'rabbit' uses port 25672 for inter-node and CLI tool traffic
      * TCP connection succeeded but Erlang distribution failed
    
      * Authentication failed (rejected by the remote node), please check the Erlang cookie
    
    
    Current node details:
     * node name: 'rabbitmqcli60@DESKTOP-P6AC70L'
     * effective user's home directory: C:\Users\junwe
     * Erlang cookie hash: aXFZrVZ2A9gF0kkNeMhRXA==
    
    c. Search for .erlang.cookie (there should be two copies, typically %USERPROFILE%\.erlang.cookie and C:\Windows\System32\config\systemprofile\.erlang.cookie); overwrite one with the other so both have identical contents.
    d. Reinstall the RabbitMQ service via the Start menu shortcut "RabbitMQ Service - (re)install".
    e. Repeat steps a and b; rabbitmqctl status now shows that the RabbitMQ service is running.
    f. Open the RabbitMQ Command Prompt (sbin dir) and run: rabbitmq-plugins enable rabbitmq_management
    g. Open a browser and go to http://127.0.0.1:15672; it should show the RabbitMQ management UI. (Without step f the page is not reachable.) A quick Python-level connectivity check is sketched below.
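    To double-check from Python that the broker is reachable, here is a minimal sketch using kombu (installed alongside Celery); the URL assumes the default guest account on the local machine:

    from kombu import Connection

    # Assumes RabbitMQ's default guest/guest account on localhost.
    conn = Connection('amqp://guest:guest@127.0.0.1:5672//')
    try:
        conn.ensure_connection(max_retries=1)  # raises if the broker cannot be reached
        print('Broker is reachable')
    finally:
        conn.release()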
  3. Install Celery. In your Python environment, run: pip install celery
  4. Create tasks.py:
    from celery import Celery
    import time
    
    app = Celery('tasks', backend='amqp', broker='amqp://')
    
    @app.task
    def add(x, y):
      time.sleep(5)
      return x + y
    
  5. Change into the directory containing tasks.py and run: celery -A tasks worker --loglevel=info
    It failed with the following error:
      -------------- celery@iZev0nc46ircftZ v4.1.0 (latentcall)
      ---- **** -----
      --- * ***  * -- Windows-10-10.0.14393-SP0 2018-04-19 21:24:48
      -- * - **** ---
      - ** ---------- [config]
      - ** ---------- .> app:         tasks:0x213343e6630
      - ** ---------- .> transport:   amqp://guest:**@localhost:5672//
      - ** ---------- .> results:     disabled://
      - *** --- * --- .> concurrency: 2 (prefork)
      -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
      --- ***** -----
       -------------- [queues]
                    .> celery           exchange=celery(direct) key=celery
      
      
      [tasks]
        . tasks.add
      
      [2018-04-19 21:24:48,634: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [WinError 10042] An unknown, invalid, or unsupported option or level was specified in a getsockopt or setsockopt call.
      Trying again in 2.00 seconds...
      
      [2018-04-19 21:24:49,322: INFO/SpawnPoolWorker-2] child process 2956 calling self.run()
      [2018-04-19 21:24:49,353: INFO/SpawnPoolWorker-1] child process 5768 calling self.run()
      [2018-04-19 21:24:50,697: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [WinError 10042] An unknown, invalid, or unsupported option or level was specified in a getsockopt or setsockopt call.
      Trying again in 4.00 seconds...
    
    A Google search turned up a bug that was only fixed in newer versions of py-amqp: https://github.com/celery/py-amqp/pull/136. Since I use an Anaconda environment, I ran conda update --all to bring every package up to date. After that, celery -A tasks worker --loglevel=info worked. (A quick way to check the installed py-amqp version is sketched below.)
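    A minimal sketch to confirm which py-amqp version is installed (the fix referenced above only ships in newer releases); in a plain pip environment, pip install --upgrade amqp is the equivalent of the conda update:

    import amqp
    import celery

    # Print the installed versions so you can tell whether an outdated py-amqp is still in use.
    print('py-amqp:', amqp.__version__)
    print('celery: ', celery.__version__)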
  6. On the caller side, run the following (a small caller script is sketched at the end of this step):
    >>> from tasks import add
    >>> res = add.delay(4,4)
    >>> res.ready()
    False
    >>> res.ready()
    True
    >>> res.result
    8
    >>>
    
    Switching back to the Celery worker console, I found it had raised an error:
        The AMQP result backend is scheduled for deprecation in     version 4.0 and removal in version v5.0.     Please use RPC backend or a persistent backend.
    
      alternative='Please use RPC backend or a persistent backend.')
    [2018-04-20 16:27:26,581: INFO/SpawnPoolWorker-1] child process 4104 calling self.run()
    [2018-04-20 16:27:26,596: WARNING/SpawnPoolWorker-1] c:\programdata\anaconda3\lib\site-packages\celery\backends\amqp.py:68: CPendingDeprecationWarning:
        The AMQP result backend is scheduled for deprecation in     version 4.0 and removal in version v5.0.     Please use RPC backend or a persistent backend.
    
      alternative='Please use RPC backend or a persistent backend.')
    [2018-04-20 16:27:26,800: INFO/MainProcess] mingle: all alone
    [2018-04-20 16:27:26,846: INFO/MainProcess] celery@iZ15r2bhksa4s0Z ready.
    [2018-04-20 16:28:23,878: INFO/MainProcess] Received task: tasks.add[0a988088-95da-44a1-8c04-7c4581ad6f25]
    [2018-04-20 16:28:23,956: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
    Traceback (most recent call last):
      File "c:\programdata\anaconda3\lib\site-packages\billiard\pool.py", line 358, in workloop
        result = (True, prepare_result(fun(*args, **kwargs)))
      File "c:\programdata\anaconda3\lib\site-packages\celery\app\trace.py", line 525, in _fast_trace_task
        tasks, accept, hostname = _loc
    ValueError: not enough values to unpack (expected 3, got 0)
    
    A Google search revealed that Celery dropped official Windows support after version 4.0: https://github.com/celery/celery/issues/4178. A workaround is:
    1. pip install eventlet
    2. celery -A tasks worker --pool=eventlet
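    For repeatability, the interactive session above can be collected into a small caller script; the filename caller.py and the 10-second timeout are illustrative choices, not part of the original setup:

    # caller.py -- assumes a worker started with: celery -A tasks worker --pool=eventlet --loglevel=info
    from tasks import add

    res = add.delay(4, 4)        # enqueue the task; returns an AsyncResult immediately
    print(res.ready())           # likely False while the worker is still sleeping
    print(res.get(timeout=10))   # block until the result arrives; should print 8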
  7. Use Redis as the backend
    a. pip install redis
    b. Change the code to: app = Celery('tasks', broker='redis://127.0.0.1:6379/0', backend='redis://127.0.0.1:6379/0')
    c. Install and start a Redis server
    d. Repeat steps 5 and 6 (a full tasks.py with the Redis configuration is sketched below)
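    Putting step 7 together, tasks.py looks roughly like this (a minimal sketch, assuming Redis listens on the default port 6379 and database 0):

    from celery import Celery
    import time

    # Redis serves as both the message broker and the result backend,
    # which also avoids the deprecated AMQP result backend warning.
    app = Celery('tasks',
                 broker='redis://127.0.0.1:6379/0',
                 backend='redis://127.0.0.1:6379/0')

    @app.task
    def add(x, y):
        time.sleep(5)
        return x + y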
  8. Test across two machines
    a. In tasks.py, change the IP address to the public address of the machine running RabbitMQ/Redis
    b. Copy tasks.py to the other machine
    c. On the other machine, run pip install celery and pip install redis
    d. Repeat step 6 (the URL change is sketched below)
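    Only the connection URLs in tasks.py need to change for the two-machine test; the address below is a placeholder. Note that out of the box Redis binds to 127.0.0.1 with protected mode enabled, and RabbitMQ's guest user may only log in from localhost, so the server side likely needs to be opened up (or a dedicated user created) before a remote caller can connect.

    from celery import Celery

    # Placeholder public address of the machine running Redis; replace with your own.
    BROKER_HOST = '203.0.113.10'

    app = Celery('tasks',
                 broker='redis://%s:6379/0' % BROKER_HOST,
                 backend='redis://%s:6379/0' % BROKER_HOST)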
