Scrapy: setting different pipelines for each spider

import scrapy


class ExceptionspiderSpider(scrapy.Spider):
    name = 'exceptionSpider'
    # allowed_domains = ['baidu.com']
    start_urls = ['http://baidu.com/']

    # Per-spider settings: these override the project-wide settings.py
    # for this spider only.
    custom_settings = {
        'ITEM_PIPELINES': {
            'TestExceptionSpider.pipelines.TestexceptionspiderPipeline': 300,
            'TestExceptionSpider.exceptionPipeline.ExceptionPipeline': 400,
        }
    }

    def start_requests(self):
        for url in self.start_urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        pass
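
The keys in the ITEM_PIPELINES dict are dotted paths to pipeline classes in the project. Below is a minimal sketch of what the two classes referenced above might look like; the module paths and class names are taken from the custom_settings dict, but the method bodies are assumptions for illustration.

# TestExceptionSpider/pipelines.py (sketch; body is an assumption)
class TestexceptionspiderPipeline:
    def process_item(self, item, spider):
        # Normal item processing would go here.
        return item


# TestExceptionSpider/exceptionPipeline.py (sketch; body is an assumption)
class ExceptionPipeline:
    def process_item(self, item, spider):
        # Exception/error-handling logic would go here.
        return item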
 

Put the ITEM_PIPELINES setting inside the spider's custom_settings attribute. Values defined there override the project-wide settings.py for that spider only, so each spider in the project can run its own set of pipelines; a second spider with a different pipeline configuration is sketched below.
http://www.waitingfy.com/archives/3833
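
To show the per-spider idea end to end, here is a sketch of a second spider in the same project that enables only one of the pipelines. The spider name, URL, and pipeline choice are hypothetical; the point is only the custom_settings mechanism.

import scrapy


class OtherSpider(scrapy.Spider):
    # Hypothetical second spider: same project, different pipeline set.
    name = 'otherSpider'
    start_urls = ['http://example.com/']

    custom_settings = {
        'ITEM_PIPELINES': {
            # Only the default project pipeline is enabled for this spider.
            'TestExceptionSpider.pipelines.TestexceptionspiderPipeline': 300,
        }
    }

    def parse(self, response):
        pass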
