How to stop Scrapy while it is running

Inside a spider, you can simply raise the CloseSpider exception:

from scrapy.exceptions import CloseSpider

def parse_page(self, response):
    # response.body is bytes, so compare against a bytes literal
    if b'Bandwidth exceeded' in response.body:
        raise CloseSpider('bandwidth_exceeded')

For other components (middlewares, pipelines, etc.), call the engine directly:

crawler.engine.close_spider(spider, 'log message')
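As a minimal sketch of the second approach, here is a hypothetical pipeline (the class name, `max_errors` threshold, and `error` item key are illustrative, not from the original) that obtains the crawler via the standard `from_crawler` classmethod and shuts the spider down after too many failed items:

```python
class StopOnErrorsPipeline:
    """Hypothetical pipeline: closes the spider after max_errors bad items."""

    def __init__(self, crawler, max_errors=10):
        self.crawler = crawler
        self.max_errors = max_errors
        self.errors = 0

    @classmethod
    def from_crawler(cls, crawler):
        # Scrapy calls this hook and passes in the running Crawler,
        # which gives the pipeline access to crawler.engine.
        return cls(crawler)

    def process_item(self, item, spider):
        if item.get("error"):
            self.errors += 1
            if self.errors >= self.max_errors:
                # Schedules a graceful shutdown; in-flight requests finish first.
                self.crawler.engine.close_spider(spider, "too_many_errors")
        return item
```

Note that `close_spider` initiates a graceful stop rather than killing the process, so a few already-scheduled requests may still complete.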

Source: https://stackoverflow.com/questions/9524923/how-can-i-make-scrapy-crawl-break-and-exit-when-encountering-the-first-exception
