[Python web scraping] An async coroutine code template for the aiohttp module

Using the aiohttp module:

import aiohttp
import asyncio

urls = [  # fill in the URLs of the files to download
    "",
    "",
    ""
]

async def download(url):
    # derive the local filename from the last path segment of the URL
    name = url.rsplit("/", 1)[1]
    # send the request
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            with open(name, mode="wb") as f:
                # reading the response body is asynchronous, so it must be awaited
                f.write(await resp.content.read())
    print(name, "done!")

async def main():
    tasks = []
    for url in urls:
        tasks.append(download(url))

    # asyncio.wait() no longer accepts bare coroutines (removed in Python 3.11);
    # asyncio.gather() takes them directly and waits for all of them to finish
    await asyncio.gather(*tasks)

if __name__ == '__main__':
    asyncio.run(main())
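The key point of the template is that all downloads run concurrently inside one event loop. A minimal, standard-library-only sketch of the same pattern (using `asyncio.sleep` as a stand-in for a real aiohttp request; the URLs and delay are placeholders):

```python
import asyncio
import time

async def fake_download(url, delay):
    # stand-in for a real aiohttp request; just yields to the event loop
    await asyncio.sleep(delay)
    return url

async def main():
    urls = ["a", "b", "c"]  # hypothetical placeholder URLs
    # gather accepts coroutines directly and preserves the input order
    return await asyncio.gather(*(fake_download(u, 0.1) for u in urls))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results)   # order matches the input list
print(elapsed)   # roughly 0.1s, not 0.3s: the three waits overlap
```

Because the three 0.1-second waits overlap, the whole run takes about 0.1 seconds instead of 0.3; that concurrency is exactly what the aiohttp template above relies on.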

