Web Scraping: Collecting and Storing Backup User-Agents

Code

This post scrapes a set of common User-Agent strings, collects them into a list, and saves them to a JSON file. The code is wrapped in a class so that, when scraping data later, you can dynamically rotate the User-Agent in the request headers, mimicking requests from a real browser and reducing interference from anti-scraping measures.

# *********************** User-Agent 爬取 ********************************
import requests
from bs4 import BeautifulSoup
import re
import json

class UserAgentSpider(object):
    def __init__(self):
        self.headers = {"User-Agent":"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36"}
        self.url = "https://blog.csdn.net/jamesaonier/article/details/89003053"
        self.user_agent_list = []
        self.validation_user_agent_list = []
    # 1. Send the network request and fetch the page content
    def data_request(self):
        return requests.get(self.url, headers=self.headers).content.decode("utf-8")
    
    # 2. Parse the page and extract the User-Agent strings
    def data_parse(self, data):
        parse_data = BeautifulSoup(data, "lxml")
        all_agent = parse_data.select("div p")[1:12]
        for agent in all_agent:
            pattern = re.compile(r"\n")
            each_agent = pattern.split(agent.get_text())[1:]
            for each in each_agent:
                self.user_agent_list.append(each)
    
    # 3. Check that each User-Agent actually works
    def test_validation(self):
        for agent in self.user_agent_list:
            try:
                url = "https://www.baidu.com/"
                headers = {"User-Agent": agent}
                requests.get(url, headers=headers, timeout=5)
                self.validation_user_agent_list.append(agent)
            except requests.exceptions.RequestException as error:
                print(error)
                
    # 4. Save the validated list to a JSON file
    def data_save(self):
        with open("./User-Agent.json", "w", encoding="utf-8") as fp:
            json.dump(self.validation_user_agent_list, fp)
            
    # 5. Orchestrate the full run
    def run(self):
        # 1. Request the page
        data = self.data_request()
        # 2. Parse out the User-Agent strings
        self.data_parse(data)
        # print(self.user_agent_list)
        # 3. Check each User-Agent's usability
        self.test_validation()
        # 4. Save the results
        self.data_save()
                
if __name__ == "__main__":
    UserAgentSpider().run()

Output of a sample run:

["Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36 OPR/26.0.1656.60", "Opera/8.0 (Windows NT 5.1; U; en)", "Mozilla/5.0 (Windows NT 5.1; U; en; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.50", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; en) Opera 9.50", "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:34.0) Gecko/20100101 Firefox/34.0", "Mozilla/5.0 (X11; U; Linux x86_64; zh-CN; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.57.2 (KHTML, like Gecko) Version/5.1.7 Safari/534.57.2", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36", "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11", "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.101 Safari/537.36", "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.11 TaoBrowser/2.0 Safari/536.11", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.71 Safari/537.1 LBBROWSER", "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; LBBROWSER)", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E; LBBROWSER)\"", "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; QQBrowser/7.0.3698.400)", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; 
.NET4.0E)", "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.84 Safari/535.11 SE 2.X MetaSr 1.0", "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SV1; QQDownload 732; .NET4.0C; .NET4.0E; SE 2.X MetaSr 1.0)", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Maxthon/4.4.3.4000 Chrome/30.0.1599.101 Safari/537.36", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.122 UBrowser/4.0.3214.0 Safari/537.36"]
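With the validated list saved to User-Agent.json, a later scraper can load it and pick a random entry per request, which is the dynamic rotation described at the top of the post. A minimal sketch (the helper names, the file path, and the target URL are my own assumptions, not part of the original code):

```python
import json
import random

import requests


def load_user_agents(path="./User-Agent.json"):
    """Load the validated User-Agent list saved by UserAgentSpider."""
    with open(path, "r", encoding="utf-8") as fp:
        return json.load(fp)


def get_with_random_ua(url, user_agents):
    """Send a GET request with a randomly chosen User-Agent header."""
    headers = {"User-Agent": random.choice(user_agents)}
    return requests.get(url, headers=headers, timeout=5)


if __name__ == "__main__":
    agents = load_user_agents()
    response = get_with_random_ua("https://www.baidu.com/", agents)
    print(response.status_code)
```

Because each call picks a fresh User-Agent, repeated requests no longer present an identical fingerprint to the server.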

by CyrusMay 2020 04 24

I used to just pretend to understand,
acting as if I understood it all;
now that I've learned the rules of survival,
I only wish I understood none of it.
—————— Mayday ——————
