Running Multiple Scrapy Spiders Simultaneously in Python
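
Scrapy's CrawlerProcess can run several spiders concurrently in a single process: schedule each spider with crawl(), then call start() exactly once. In the snippet below, 'spider_name_1' and 'spider_name_2' are placeholders for the name attributes of your own spiders (the spider's `name`, not its .py file name).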

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Load the project settings so CrawlerProcess can locate spiders by name
settings = get_project_settings()

process = CrawlerProcess(settings)

# Schedule each spider by its `name` attribute (placeholder names here)
process.crawl('spider_name_1')
process.crawl('spider_name_2')

# Start the Twisted reactor once; it runs all scheduled spiders concurrently
# and blocks until they finish. Calling start() a second time fails because
# the reactor cannot be restarted.
process.start()
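
If you prefer not to depend on a Scrapy project on disk, CrawlerProcess also accepts spider classes directly. A minimal self-contained sketch, with two hypothetical spiders (Spider1 and Spider2, crawling example.com and example.org) standing in for real ones:

import scrapy
from scrapy.crawler import CrawlerProcess

class Spider1(scrapy.Spider):
    # Hypothetical spider; replace with your own class
    name = 'spider1'
    start_urls = ['https://example.com']

    def parse(self, response):
        yield {'url': response.url, 'title': response.css('title::text').get()}

class Spider2(scrapy.Spider):
    # Hypothetical spider; replace with your own class
    name = 'spider2'
    start_urls = ['https://example.org']

    def parse(self, response):
        yield {'url': response.url, 'title': response.css('title::text').get()}

process = CrawlerProcess()

# Passing the classes directly avoids the project spider loader
process.crawl(Spider1)
process.crawl(Spider2)

# A single start() call runs both spiders in the same reactor
process.start()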
