Too busy and too lazy, so I rarely write journal entries.
import requests
from time import sleep

# Naive version: every requests.post() call opens a brand-new TCP connection.
while True:
    requests.post(your_url, data={"a": 0})
    sleep(2)
shell> ss -ant | grep TIME-WAIT   (ss is faster than netstat)
Connections keep landing in the TIME-WAIT state: each request opens a new ephemeral port and then closes it, and the existing connection is never reused. That both wastes resources and wastes time on repeated TCP handshakes.
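To watch the sockets pile up while the naive loop runs, here is a minimal sketch that shells out to the same ss -ant command from Python and counts the TIME-WAIT lines (it assumes ss is installed, as above):

import subprocess
from time import sleep

while True:
    # Same idea as `ss -ant | grep TIME-WAIT`, just counted from Python.
    out = subprocess.check_output(['ss', '-ant']).decode()
    print('TIME-WAIT sockets:', out.count('TIME-WAIT'))
    sleep(2)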
See http://docs.python-requests.org/en/master/user/advanced/#keep-alive
and the requests source at /usr/lib64/python2.7/site-packages/requests/api.py:53
import requests
from time import sleep

# The Session is created once, outside the loop, so every request goes through
# the same pooled connection (keep-alive) instead of opening a new one.
sess = requests.Session()

while True:
    req = requests.Request('POST', your_url, data={"a": 0})
    prep = sess.prepare_request(req)
    sess.send(prep, stream=False)
    sleep(2)
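The prepare_request/send dance above is the low-level path; for this case, calling sess.post() directly should behave the same way, since Session methods go through the same pooled adapter. A minimal sketch of that simpler form:

import requests
from time import sleep

sess = requests.Session()  # one Session, reused for every request

while True:
    # Session.post() uses the session's pooled connection, so the TCP
    # connection is kept alive between iterations.
    sess.post(your_url, data={"a": 0})
    sleep(2)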
See
https://urllib3.readthedocs.io/en/latest/user-guide.html
https://urllib3.readthedocs.io/en/latest/advanced-usage.html
import json
import urllib3
from time import sleep

# num_pools caps how many per-host pools the manager caches; maxsize is the
# number of connections kept open per host (connections kept alive, not a
# hard usage limit), per the docs.
http = urllib3.PoolManager(num_pools=100, headers={'Connection': 'keep-alive'},
                           maxsize=100, block=True)

data = {"a": 0}
while True:
    encoded_body = json.dumps(data).encode('utf-8')
    http.request('POST', your_url, body=encoded_body)
    sleep(2)
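To stay on requests but tune the pool the way the PoolManager call above does, the HTTPAdapter mounted on a Session exposes matching knobs (pool_connections maps to num_pools, pool_maxsize to maxsize). A sketch, assuming the default pool size of 10 is too small:

import requests
from requests.adapters import HTTPAdapter
from time import sleep

sess = requests.Session()
# pool_connections corresponds to urllib3's num_pools, pool_maxsize to maxsize.
adapter = HTTPAdapter(pool_connections=100, pool_maxsize=100, pool_block=True)
sess.mount('http://', adapter)
sess.mount('https://', adapter)

while True:
    sess.post(your_url, data={"a": 0})
    sleep(2)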