aiohttp vs. requests: an efficiency comparison

I previously used scrapy to crawl a number of proxy sites for free proxies. Sites like Zhihu apply IP-based anti-scraping, and once an IP gets blocked you need a proxy to keep accessing them, so collecting some proxies ahead of time is good insurance. The trouble is that of every ten proxies a free site offers, maybe one or two actually work, so a small program to validate them is essential; this is also the basic idea behind a proxy pool. Verifying a proxy is simple: send an HTTP request through it and check whether the request succeeds. In Python the easiest way is requests, but validating many proxies one by one with requests is bound to be slow, so multithreading or an asynchronous approach is needed. Python can make asynchronous requests via coroutines, and the aiohttp framework wraps this up nicely. The code below compares the efficiency of requests and aiohttp when validating 20 proxy addresses (the request target is Baidu):

requests

import requests
import time

from requests import ConnectTimeout
from lear_scrapy.util.redisclient import RedisClient

api = 'http://www.baidu.com'
# the proxies are stored in Redis and fetched via RedisClient; its code is omitted here
rc = RedisClient()
proxies = rc.get_proxies(3500, 3520)
# print(proxies)
# record the start time
start = time.time()
for proxy in proxies:
    try:
        print(proxy.decode())
        # set a 5-second timeout so a dead proxy doesn't hang the connection
        resp = requests.get(api, proxies={'http': proxy.decode()}, timeout=5)
        print(proxy.decode(), ':', resp.status_code)
    except ConnectTimeout as e:
        print(e)
    except Exception as e:
        print(e)
end = time.time()

print('takes:', (end - start))

After looping through the 20 requests to Baidu:

takes: 74.56189751625061
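As an aside, the RedisClient used above is not shown in the post. Purely for illustration, a minimal sketch of what it might look like, assuming the proxies are stored as a Redis list (the key name, host, and port below are my assumptions, not the original code):

import redis


class RedisClient:
    def __init__(self, host='localhost', port=6379, key='proxies'):
        # assumed connection settings; adjust to the actual Redis deployment
        self.__db = redis.StrictRedis(host=host, port=port)
        self.__key = key

    def get_proxies(self, start, end):
        # LRANGE is inclusive on both ends, so end - 1 returns exactly
        # end - start entries, e.g. 20 for get_proxies(3500, 3520)
        return self.__db.lrange(self.__key, start, end - 1)

Note that redis-py returns list elements as bytes, which is why the calling code does proxy.decode().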

aiohttp

import aiohttp
import asyncio
import time
from lear_scrapy.util.redisclient import RedisClient


class ProxyValidate:
    def __init__(self, proxies, api):
        self.__proxies = proxies
        self.__api = api

    def validate_all(self):
        loop = asyncio.get_event_loop()
        useful_proxies = []
        # schedule one validation coroutine per proxy
        tasks = [self.validate_single(proxy.decode(), useful_proxies) for proxy in self.__proxies]
        # run them all concurrently and block until every one finishes
        loop.run_until_complete(asyncio.wait(tasks))
        print(useful_proxies)

    async def validate_single(self, proxy, useful_proxies):
        try:
            # a fresh session per request keeps the example simple; for heavy
            # use, aiohttp recommends reusing one ClientSession
            async with aiohttp.ClientSession() as session:
                async with session.get(self.__api, timeout=5, proxy=proxy) as resp:
                    if resp.status == 200:
                        print(proxy, ':useful')
                        useful_proxies.append(proxy)
        except aiohttp.ClientProxyConnectionError as error:
            print(proxy, ':bad')
            print(error)
        except Exception as e:
            print(proxy, ':bad')
            print(e)


if __name__ == '__main__':
    rc = RedisClient()
    proxies = rc.get_proxies(3500, 3520)
    # proxies = [b'http://125.40.238.181:56834', b'http://117.127.0.201:8080', b'http://221.2.174.6:8060', b'http://121.41.120.245:80']
    start = time.time()
    validate = ProxyValidate(proxies, 'http://www.baidu.com')
    validate.validate_all()
    end = time.time()
    print('validate proxies takes:', (end - start))
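One caveat about the code above: passing bare coroutines to asyncio.wait was deprecated in Python 3.8 and rejected outright in 3.11+. A sketch of validate_all rewritten with asyncio.gather and asyncio.run, which behaves the same here and stays compatible with newer versions:

    def validate_all(self):
        useful_proxies = []

        async def run_all():
            tasks = [self.validate_single(proxy.decode(), useful_proxies)
                     for proxy in self.__proxies]
            # gather accepts coroutines directly and waits for all of them
            await asyncio.gather(*tasks)

        asyncio.run(run_all())
        print(useful_proxies)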

Running the same 20 requests to Baidu:

validate proxies takes: 5.147261142730713

It took only about 5 seconds. The reason is simple: with requests the 20 validations run strictly one after another, each waiting up to the 5-second timeout, whereas aiohttp issues them all concurrently, so the total time is roughly that of the slowest single request.
Conclusion: oh my god.
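The introduction also mentioned multithreading as an option. For comparison, here is a sketch of the same validation done with requests plus concurrent.futures (the pool size of 20 is an arbitrary choice for this example):

import time
import requests
from concurrent.futures import ThreadPoolExecutor
from lear_scrapy.util.redisclient import RedisClient

API = 'http://www.baidu.com'


def check(proxy):
    try:
        # same 5-second timeout as the synchronous version
        resp = requests.get(API, proxies={'http': proxy}, timeout=5)
        return proxy, resp.status_code == 200
    except Exception:
        return proxy, False


proxies = [p.decode() for p in RedisClient().get_proxies(3500, 3520)]
start = time.time()
with ThreadPoolExecutor(max_workers=20) as pool:
    # run check() on up to 20 proxies at the same time
    results = list(pool.map(check, proxies))
print('takes:', time.time() - start)

With one thread per proxy, the total time should land in the same ballpark as the aiohttp version, since both approaches overlap the network waits.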
