Using Scrapy downloader middleware to set a random User-Agent and proxy IP on request objects

Writing middlewares.py

from p5.settings import UserAgent_list
from p5.settings import IpAgent_list
import random

class RandomUserAgentMiddleware(object):
    # Called for every request sent to the target site; use it to process or modify the request object.
    def process_request(self, request, spider):
        ua = random.choice(UserAgent_list)
        ia = random.choice(IpAgent_list)
        request.headers["User-Agent"] = ua
        request.meta["proxy"] = ia
        return None

class CheckUserAgent(object):
    # Called before the downloaded response is passed back to the engine; use it to inspect or fix up
    # the response, e.g. retry a failed response or repair it before handing it to the engine.
    def process_response(self, request, response, spider):
        print(request.headers["User-Agent"])
        print(request.meta["proxy"])
        return response
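
As the comment on process_response notes, a failed response can be re-requested instead of being passed on. The class below is a minimal sketch of that idea and is not part of the original article: the 403/503 status codes, the retry cap of 3, and the proxy_retries meta key are all illustrative assumptions.

class RetryOnBadStatus(object):
    # Re-schedule the request when the response suggests the current proxy was rejected.
    def process_response(self, request, response, spider):
        if response.status in (403, 503):  # illustrative "proxy blocked" statuses
            retries = request.meta.get("proxy_retries", 0)
            if retries < 3:  # illustrative cap
                retry_req = request.copy()
                retry_req.dont_filter = True  # let the duplicate request through the dupefilter
                retry_req.meta["proxy_retries"] = retries + 1
                # Returning a Request makes Scrapy download it again, so
                # RandomUserAgentMiddleware will assign it a fresh UA and proxy.
                return retry_req
        return response

Like the two classes above, it would also need its own entry in DOWNLOADER_MIDDLEWARES to take effect.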

Enable the downloader middlewares in settings.py

DOWNLOADER_MIDDLEWARES = {
    'p5.middlewares.RandomUserAgentMiddleware': 543,
    'p5.middlewares.CheckUserAgent': 544,
}
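
The IpAgent_list that RandomUserAgentMiddleware imports is not shown in the article; below is a minimal sketch of what it could look like in settings.py. The addresses are placeholders to be replaced with real proxies, and each entry needs the scheme, because Scrapy's proxy meta key expects a full URL.

IpAgent_list = [
    'http://127.0.0.1:8888',   # placeholder, replace with reachable proxies
    'http://10.10.1.10:3128',
    'http://10.10.1.11:1080',
]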

A list of common User-Agent strings (defined in settings.py as the UserAgent_list imported above)

UserAgent_list = [
    "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)",
    "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0)",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
    "Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
    "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; en) Presto/2.8.131 Version/11.11",
    "Opera/9.80 (Windows NT 6.1; U; en) Presto/2.8.131 Version/11.11",
    "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SE 2.X MetaSr 1.0; SE 2.X MetaSr 1.0; .NET CLR 2.0.50727; SE 2.X MetaSr 1.0)",
]
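
To check that the User-Agent and proxy really rotate per request, a throwaway spider pointed at a header-echo service is enough. The spider name and the httpbin.org URL below are only for illustration, and the run assumes the proxies in IpAgent_list are reachable.

import scrapy

class CheckSpider(scrapy.Spider):
    name = "check_ua"

    def start_requests(self):
        # dont_filter lets identical URLs through, so each request gets its own random UA/proxy
        for _ in range(5):
            yield scrapy.Request("https://httpbin.org/headers", dont_filter=True)

    def parse(self, response):
        # httpbin echoes the request headers, so the rotated User-Agent appears in the body
        self.logger.info(response.text)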
