Building an IP Proxy Pool

Contents

1. Common free IP proxy providers

2. Checking whether the page is statically loaded

3. Scraping IPs

4. Verifying that the IPs work

5. Full code


 

1. Common free IP proxy providers

Provider                     URL
Free proxy IP library        http://ip.jiangxianli.com
Kuaidaili                    https://www.kuaidaili.com
Yun Proxy (ip3366)           http://www.ip3366.net
Goubanjia                    http://www.goubanjia.com
66ip                         http://www.66ip.cn
2. Checking whether the page is statically loaded

Taking Kuaidaili as an example, in Firefox:

① Right-click the page and choose "View Page Source".

② Press Ctrl+F and type any IP address shown on the page into the search box at the bottom left. If it is found in the source, the page is statically loaded (the data is in the HTML itself rather than rendered by JavaScript); the short script below performs the same check.
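A minimal sketch of the same check in code, assuming the first Kuaidaili free-proxy page and a placeholder address sample_ip that you copy from the rendered page:

import requests

# Placeholder: replace with an IP you actually see on the rendered page
sample_ip = "1.2.3.4"

url = "https://www.kuaidaili.com/free/inha/1/"
html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}).text

# If the address appears in the raw HTML, the table is statically loaded;
# otherwise it is probably filled in by JavaScript after the page loads.
if sample_ip in html:
    print("static page: the IP appears in the page source")
else:
    print("not in source: the page is likely rendered dynamically")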


3. Scraping IPs

import requests
import random
import parsel
import time


user_agent = [
        "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
        "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0);",
        "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
        "Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
        "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; en) Presto/2.8.131 Version/11.11",
        "Opera/9.80 (Windows NT 6.1; U; en) Presto/2.8.131 Version/11.11",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; The World)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SE 2.X MetaSr 1.0; SE 2.X MetaSr 1.0; .NET CLR 2.0.50727; SE 2.X MetaSr 1.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Avant Browser)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
        "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36",
        "Mozilla/5.0 (X11; Linux x86_64; rv:76.0) Gecko/20100101 Firefox/76.0",
  ]

ip_list = list()

# Scrape proxies from pages 1 to 2999 of the free-proxy list
# (adjust the upper bound to however many pages you actually want)
for page in range(1,3000):
    print("Crawling page {}".format(page))
    url = "https://www.kuaidaili.com/free/inha/{}/".format(page)
    print(url)
    headers = {
        'User-Agent': random.choice(user_agent),  # rotate the User-Agent on every request
    }

    response = requests.get(url, headers=headers)
    data = response.text

    html_data = parsel.Selector(data)
    data_list = html_data.xpath('//table[@class="table table-bordered table-striped"]/tbody/tr')

    for tr in data_list:
        proxies_dict = dict()
        ip = tr.xpath("./td[1]/text()").extract_first()         # IP address
        ip_port = tr.xpath("./td[2]/text()").extract_first()    # port
        http_type = tr.xpath("./td[4]/text()").extract_first()  # protocol type (HTTP/HTTPS)
        # requests expects lowercase scheme keys ('http'/'https') in a proxies dict
        proxies_dict[http_type.lower()] = ip + ":" + ip_port
        ip_list.append(proxies_dict)
        print(proxies_dict)
    time.sleep(2)  # pause between pages to avoid hammering the site
print("###################end####################")
print(len(ip_list))
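Each entry appended to ip_list is already in the shape that requests expects for its proxies argument: the key is the URL scheme and the value is "ip:port". A minimal usage sketch (the proxy address below is only an illustration and has almost certainly expired):

# Hypothetical entry of the kind scraped above
proxy = {'http': '218.65.67.6:9000'}

# Requests whose URL scheme matches a key in the dict are routed through that proxy
resp = requests.get("http://httpbin.org/ip", proxies=proxy, timeout=5)
print(resp.text)  # should echo the proxy's address rather than your own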

4. Verifying that the IPs work

As an example, each proxy is verified by requesting the Tiantian Fund homepage (fund.eastmoney.com) through it.

# Save a working proxy to a text file
def write(proxy):
    with open('G:/id.txt', 'a+') as f:
        f.write("{}\n".format(proxy))

header = {
    'User-Agent': random.choice(user_agent),
    'Accept-Encoding': "gzip, deflate, br",
    'Accept-Language': 'zh-CN,zh;q=0.8,zh-TW;q=0.7,zh-HK;q=0.5,en-US;q=0.3,en;q=0.2'
}
count = 0
for proxy in ip_list:
    try:
        # A very short timeout keeps only the fastest proxies; raise it to accept more
        response = requests.get("https://fund.eastmoney.com/", headers=header, proxies=proxy, timeout=0.1)
        if response.status_code == 200:
            write(proxy)  # save the working proxy to the txt file
            print("*"*50)
            count += 1
            print(count)
    except Exception as e:
        print(e)
print("end")
print("&"*50)

5. Full code

import requests
import random
import parsel
import time

# Save a working proxy to a text file
def write(proxy):
    with open('G:/id2.txt', 'a+') as f:
        f.write("{}\n".format(proxy))

user_agent = [
        "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
        "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0);",
        "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
        "Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
        "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; en) Presto/2.8.131 Version/11.11",
        "Opera/9.80 (Windows NT 6.1; U; en) Presto/2.8.131 Version/11.11",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; The World)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SE 2.X MetaSr 1.0; SE 2.X MetaSr 1.0; .NET CLR 2.0.50727; SE 2.X MetaSr 1.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Avant Browser)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
        "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36",
        "Mozilla/5.0 (X11; Linux x86_64; rv:76.0) Gecko/20100101 Firefox/76.0",
  ]


ip_list = list()
for page in range(1,3000):
    print("Crawling page {}".format(page))
    url = "https://www.kuaidaili.com/free/inha/{}/".format(page)
    print(url)
    headers = {
        'User-Agent': random.choice(user_agent),  # rotate the User-Agent on every request
    }

    # Optionally crawl the proxy list through a proxy yourself; free proxies
    # expire quickly, so the address below may need to be replaced:
    # proxy = {
    #     'http': '218.65.67.6:9000',
    #     'https': '218.65.67.6:9000',
    # }
    # response = requests.get(url, headers=headers, proxies=proxy, timeout=1)

    response = requests.get(url, headers=headers)
    data = response.text

    html_data = parsel.Selector(data)
    data_list = html_data.xpath('//table[@class="table table-bordered table-striped"]/tbody/tr')

    for tr in data_list:
        proxies_dict = dict()
        ip = tr.xpath("./td[1]/text()").extract_first()         # IP address
        ip_port = tr.xpath("./td[2]/text()").extract_first()    # port
        http_type = tr.xpath("./td[4]/text()").extract_first()  # protocol type (HTTP/HTTPS)
        # requests expects lowercase scheme keys ('http'/'https') in a proxies dict
        proxies_dict[http_type.lower()] = ip + ":" + ip_port
        ip_list.append(proxies_dict)
        print(proxies_dict)
    time.sleep(2)  # pause between pages to avoid hammering the site
print("###################end####################")
print(len(ip_list))

print("###################ip验证####################")
header = {
    'User-Agent':user_agent[random.randint(1,21)],
    'Accept-Encoding':"gzip, deflate, br",
    'Accept-Language':'zh-CN,zh;q=0.8,zh-TW;q=0.7,zh-HK;q=0.5,en-US;q=0.3,en;q=0.2'  
}
count = 0
for proxy in ip_list:
    try:
        response = requests.get("https://fund.eastmoney.com/",headers=header,proxies=proxy,timeout=0.1)
        if response.status_code == 200:
            write(proxy)
            print("*"*50)
            count+=1
            print(count)
    except Exception as e:
        print(e)
print("end")
print("&"*50)

 
