Python requests "Max retries exceeded with url" error

While writing a Python web crawler today I ran into a problem; the error reported is as follows:

HTTPConnectionPool(host='dds.cr.usgs.gov', port=80): Max retries exceeded with url: /ltaauth//sno18/ops/l1/2016/138/037/LC81380372016038LGN00.tar.gz?id=stfb9e0bgrpmc4j9lcg45ikrj1&iid=LC81380372016038LGN00&did=227966479&ver=production (Caused by NewConnectionError('0x105b9d210>: Failed to establish a new connection: [Errno 65] No route to host',))

After digging around, I found the cause of the problem: too many HTTP connections were opened and never closed.

Solutions:

1. Increase the number of connection retries

  requests.adapters.DEFAULT_RETRIES = 5
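As a sketch of how this setting can be applied (no request is actually sent here), note that `DEFAULT_RETRIES` is read when an adapter is constructed, so it should be set before the session is created; a more targeted alternative is to mount an `HTTPAdapter` with its own `max_retries` on the session:

```python
import requests
from requests.adapters import HTTPAdapter

# Option A: raise the module-wide default retry count.
# Set this BEFORE creating sessions/adapters, because the value
# is read at adapter construction time.
requests.adapters.DEFAULT_RETRIES = 5

# Option B (more explicit): mount an adapter with its own retry budget.
session = requests.Session()
adapter = HTTPAdapter(max_retries=3)  # retry connection errors up to 3 times
session.mount("http://", adapter)
session.mount("https://", adapter)
```

Option B is usually preferable because it scopes the retry policy to one session instead of changing a module-level global.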

2. Close surplus connections

requests uses the urllib3 library under the hood, and its HTTP connections are keep-alive by default; setting keep_alive to False on the session turns this off.

How to do it:

s = requests.session()
s.keep_alive = False
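A minimal sketch of this step follows. One caveat worth hedging: `keep_alive` is just an attribute set on the session object, not a documented requests API, so as a belt-and-braces measure you can also send a `Connection: close` header, which asks the server to close the connection after each response:

```python
import requests

s = requests.Session()
s.keep_alive = False  # widely circulated remedy; a plain attribute, not a documented requests API

# More reliable: ask the server to close the connection after each request.
s.headers["Connection"] = "close"
```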
