python3 asyncio high-performance asynchronous crawler

Let's go straight to a small example.

import asyncio
import aiohttp
import time

start = time.time()
async def gets(url):
    # conn = aiohttp.TCPConnector()
    session = aiohttp.ClientSession()
    response = await session.get(url)
    result = await response.text()
    await session.close()   # note: if the session is not closed, a warning will appear (shown below)
    return result

async def request():
    url = 'http://127.0.0.1:5000'
    print('go go go ',url)
    result = await gets(url)

    print('Get response from',url,'Result:',result)



tasks = [asyncio.ensure_future(request()) for _ in range(5)]
loop  = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
end = time.time()
print('Cost time:', end - start)
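
The ensure_future/run_until_complete pattern above is the classic pre-3.7 style; since Python 3.7 the same thing can be written more compactly with asyncio.run and asyncio.gather. A minimal sketch, reusing the request() coroutine defined above:

async def main():
    # run the five download coroutines concurrently and wait for all of them
    await asyncio.gather(*(request() for _ in range(5)))

asyncio.run(main())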

Start a test server with Flask:

from flask import Flask
import time

app = Flask(__name__)


@app.route('/')
def index():
    time.sleep(3)  # simulate a slow response so the concurrency is visible
    return 'Hello!'


if __name__ == '__main__':
    app.run(threaded=True)  # threaded=True turns on Flask's multi-threaded mode; by default it serves with a single thread
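
For a rough sense of the speedup: index() sleeps for 3 seconds, so five sequential requests with the blocking requests library take about 15 seconds, while the asyncio version above finishes in roughly 3 seconds because all five coroutines wait on the server at the same time (and threaded=True lets Flask answer them in parallel). A minimal blocking sketch for comparison, assuming the same server on 127.0.0.1:5000:

import requests
import time

start = time.time()
for _ in range(5):
    # each call blocks for about 3 seconds while the server sleeps
    print(requests.get('http://127.0.0.1:5000').text)
print('Cost time:', time.time() - start)   # roughly 15 seconds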

If the session above is not closed, the warning looks like this:

Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x10bb31690>
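
The warning also goes away if the session and response are used as async context managers, which close them automatically. A sketch of gets() rewritten that way (same behavior, assuming aiohttp 3.x):

async def gets(url):
    # the context managers close the response and the session automatically
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()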
