Crawler hits urllib.error.URLError: <urlopen error [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>

The error output is as follows:

Traceback (most recent call last):
  File "D:\py3.8\lib\urllib\request.py", line 1319, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "D:\py3.8\lib\http\client.py", line 1230, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "D:\py3.8\lib\http\client.py", line 1276, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "D:\py3.8\lib\http\client.py", line 1225, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "D:\py3.8\lib\http\client.py", line 1004, in _send_output
    self.send(msg)
  File "D:\py3.8\lib\http\client.py", line 944, in send
    self.connect()
  File "D:\py3.8\lib\http\client.py", line 915, in connect
    self.sock = self._create_connection(
  File "D:\py3.8\lib\socket.py", line 808, in create_connection
    raise err
  File "D:\py3.8\lib\socket.py", line 796, in create_connection
    sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\eclipse\works\pytest\pytest\test\cytest.py", line 89, in
    pcres=urllib.request.urlopen(url)
  File "D:\py3.8\lib\urllib\request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "D:\py3.8\lib\urllib\request.py", line 525, in open
    response = self._open(req, data)
  File "D:\py3.8\lib\urllib\request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "D:\py3.8\lib\urllib\request.py", line 502, in _call_chain
    result = func(*args)
  File "D:\py3.8\lib\urllib\request.py", line 1348, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "D:\py3.8\lib\urllib\request.py", line 1322, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>

This means the current request was terminated: the connection attempt timed out before the server responded.

There are two common causes: either the server is responding too slowly, so the connection times out, or the requests are sent too frequently, with too short an interval between them.
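
For the slow-response case, one option is to set an explicit timeout on urlopen and catch the resulting URLError instead of letting the script crash. A minimal sketch, where the URL is a placeholder and not from the original post:

import urllib.error
import urllib.request

url = "http://example.com/page"  # placeholder URL

try:
    # give up after 10 seconds instead of waiting for the OS-level timeout
    response = urllib.request.urlopen(url, timeout=10)
    html = response.read()
except urllib.error.URLError as e:
    # covers [WinError 10060] and other connection failures
    print("request failed:", e.reason)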

Put simply, a crawler should imitate a human visitor as closely as possible, for example by adding an interval between requests; after all, crawling with pauses and crawling without them put very different loads on the server.

The basic fix is to add a delay between requests with time.sleep(), as sketched below.
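
A minimal sketch of that fix, combining time.sleep() with a simple retry loop and a browser-like User-Agent header; the URLs, delays, and header value here are placeholder assumptions, not from the original post:

import time
import urllib.error
import urllib.request

urls = ["http://example.com/page1", "http://example.com/page2"]  # placeholders

for url in urls:
    # a browser-like User-Agent makes the request look less like a script
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    for attempt in range(3):  # up to 3 attempts per URL
        try:
            response = urllib.request.urlopen(req, timeout=10)
            print(url, response.getcode())
            break  # success, move on to the next URL
        except urllib.error.URLError as e:
            print("attempt", attempt + 1, "failed:", e.reason)
            time.sleep(5)  # back off before retrying
    time.sleep(2)  # pause between pages so requests are not too frequent

Lengthening the sleep or randomizing the interval makes the traffic look even less like an automated burst.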

If you are new to Python, check out py编程, which shares learning source code from time to time and is well suited to beginners.
