Deploying a Scrapy spider project with scrapyd

Installation

pip install scrapyd
pip install scrapyd-client
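
scrapyd is the server that receives and runs deployed projects, and scrapyd-client provides the scrapyd-deploy command. Before deploying, the scrapyd server has to be running on the target machine; by default it listens on port 6800 and serves a web UI at http://localhost:6800/:

scrapyd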

Configuring scrapy.cfg

[settings]
default = dangdang01.settings

[deploy]
url = http://localhost:6800/
project = dangdang01

Deployment

From the project root directory (where scrapy.cfg lives), run:

scrapyd-deploy
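
To confirm the upload succeeded, you can query scrapyd's listprojects.json endpoint; the project name should appear in the response:

curl http://localhost:6800/listprojects.json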

If there are multiple target hosts, configure a named deploy section for each:

[settings]
default = dangdang01.settings

[deploy:mac]
url = http://localhost:6800/
project = dangdang01

[deploy:tencent]
url = http://173.207.138.122:6800/
project = dangdang01
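
Each [deploy:NAME] section defines a named deploy target. You can check which targets scrapyd-deploy sees with its -l option:

scrapyd-deploy -l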

To deploy to the tencent machine, the command is:

scrapyd-deploy tencent
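
This assumes the scrapyd daemon on 173.207.138.122 is reachable from your machine. In recent scrapyd versions a quick way to check is the daemonstatus.json endpoint:

curl http://173.207.138.122:6800/daemonstatus.json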

Before deploying to a remote host, you first need to edit the scrapyd configuration file on that host, for example:
.../lib/python2.7/site-packages/scrapyd/default_scrapyd.conf
and change the bind address:

bind_address = 0.0.0.0
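
For reference, the relevant part of default_scrapyd.conf looks roughly like this (the exact contents vary between scrapyd versions); only bind_address needs to change from 127.0.0.1 to 0.0.0.0 so the daemon accepts connections from other machines:

[scrapyd]
bind_address = 0.0.0.0
http_port = 6800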

After modifying the remote configuration file and running the deploy command, the project egg is uploaded to the eggs directory on the remote host.
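
With the project in place on the remote host, a crawl can be started through scrapyd's schedule.json API and monitored with listjobs.json. The spider name dangdang below is only a placeholder; use the name defined in your own spider class:

curl http://173.207.138.122:6800/schedule.json -d project=dangdang01 -d spider=dangdang
curl "http://173.207.138.122:6800/listjobs.json?project=dangdang01"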
