python scrapy twisted.web.error.SchemeNotSupported: Unsupported scheme: b'' and how to fix it

Problem description: while trying to use a proxy from a downloader middleware in the Scrapy framework, running the spider fails with the following error:

2018-12-26 00:39:30 [scrapy.core.scraper] ERROR: Error downloading 
Traceback (most recent call last):
  File "e:\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1416, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "e:\anaconda3\lib\site-packages\twisted\python\failure.py", line 491, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
  File "e:\anaconda3\lib\site-packages\scrapy\utils\defer.py", line 45, in mustbe_deferred
    result = f(*args, **kw)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 65, in download_request
    return handler.download_request(request, spider)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 67, in download_request
    return agent.download_request(request)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 331, in download_request
    method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 252, in request
    proxyEndpoint = self._getEndpoint(self._proxyURI)
  File "e:\anaconda3\lib\site-packages\twisted\web\client.py", line 1635, in _getEndpoint
    return self._endpointFactory.endpointForURI(uri)
  File "e:\anaconda3\lib\site-packages\twisted\web\client.py", line 1513, in endpointForURI
    raise SchemeNotSupported("Unsupported scheme: %r" % (uri.scheme,))
twisted.web.error.SchemeNotSupported: Unsupported scheme: b''

The middleware code was as follows:

import logging

class ProxyMiddleware(object):
    logger = logging.getLogger(__name__)

    def process_request(self, request, spider):
        self.logger.debug("Using Proxy")
        request.meta['proxy'] = '119.101.112.28:9999'
        return None

    def process_response(self, request, response, spider):
        response.status = 202
        return response

Solution:

The cause is that the proxy value is missing the http:// scheme prefix. Without a scheme, Twisted's endpointForURI cannot tell what kind of endpoint to build for the proxy, so it raises SchemeNotSupported with the empty parsed scheme b''.

Corrected code:

import logging

class ProxyMiddleware(object):
    logger = logging.getLogger(__name__)

    def process_request(self, request, spider):
        self.logger.debug("Using Proxy")
        # The proxy value must carry a scheme prefix such as http://
        request.meta['proxy'] = 'http://119.101.112.28:9999'
        return None

    def process_response(self, request, response, spider):
        response.status = 202
        return response

After this change, the spider runs successfully and the problem is resolved.
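To guard against this class of mistake, the middleware can normalize the proxy value before assigning it. Below is a minimal sketch, assuming an HTTP proxy; the host and port are the ones from this post and may no longer be live:

```python
import logging

class ProxyMiddleware(object):
    logger = logging.getLogger(__name__)

    # Proxy address from the post; assumed to be an HTTP proxy.
    PROXY = '119.101.112.28:9999'

    def process_request(self, request, spider):
        proxy = self.PROXY
        # Prepend a scheme if the value lacks one, so Twisted can
        # choose an endpoint type instead of raising SchemeNotSupported.
        if '://' not in proxy:
            proxy = 'http://' + proxy
        self.logger.debug("Using Proxy %s", proxy)
        request.meta['proxy'] = proxy
        return None
```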
