[Info Gathering] A URL Liveness Check Script

After collecting and deduplicating the URLs, you still need to check which ones are actually alive — for all you know, none of them open at all... haha. I just wrote this one too, and it's used the same way as the previous script.

python testalive.py -f url.txt

I've tried it and the results are pretty good; it's just too slow... once I've learned threading I'll optimize it.

# testalive.py — filter a URL list down to the entries that answer HTTP 200
import argparse
import requests
import urllib3

parser = argparse.ArgumentParser(description="TestAlive")
parser.add_argument('-f', type=str, required=True, help='file with one URL per line')
args = parser.parse_args()

# Certificate verification is skipped below, so silence the TLS warnings
urllib3.disable_warnings()

TIMEOUT = 5
HEADERS = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 '
                  '(KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36',
    'Connection': 'close',
}

# Read the candidate URLs, one per line, dropping blank lines
with open(args.f, "r") as file:
    urls = [line.strip() for line in file if line.strip()]

alive = []
for url in urls:
    try:
        r = requests.get(url, headers=HEADERS, timeout=TIMEOUT, verify=False)
        if r.status_code == 200:
            alive.append(url)
            print(url)
    except requests.RequestException:
        # Timeout, refused connection, malformed URL, etc. — treat as dead
        pass

# Overwrite the input file with only the live URLs
with open(args.f, "w") as f:
    for url in alive:
        f.write(url + "\n")

print("\nTestAlive finished")
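The threading optimization mentioned above can be sketched with the standard library's `concurrent.futures.ThreadPoolExecutor`, which probes many URLs concurrently instead of one at a time. This is a minimal sketch, not the author's code: the names `is_alive` and `check_alive` are my own, and it uses the stdlib `urllib.request` so it stands alone (swapping in `requests.get` as in the script above works the same way).

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen


def is_alive(url, timeout=5):
    """Return the url if it answers HTTP 200 within the timeout, else None."""
    try:
        req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urlopen(req, timeout=timeout) as resp:
            return url if resp.status == 200 else None
    except Exception:
        # Any network or parsing error counts as "dead"
        return None


def check_alive(urls, workers=20, probe=is_alive):
    """Probe all urls in parallel; pool.map preserves the input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [u for u in pool.map(probe, urls) if u is not None]
```

Because `probe` is a parameter, the network call is easy to stub out when testing, and the worker count can be tuned to taste — 20 threads is an arbitrary starting point, not a recommendation.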
