Abstract: Using IP proxies with urllib — ProxyHandler() formats the proxy (the first parameter is the target scheme, http or https; set it to match), build_opener() initializes it, and install_opener() makes the proxy global so that urlopen() requests use it automatically.
Using an IP proxy
ProxyHandler() formats the proxy IP. Note that the dict key is the target scheme — http or https — and must match the URL you request.
build_opener() initializes an opener with the proxy.
install_opener() installs the proxy globally, so urlopen() requests automatically go through it.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib.request

ip = "180.115.8.212:39109"
proxy = urllib.request.ProxyHandler({"https": ip})  # format the proxy; note: the key must match the target scheme, http or https
opener = urllib.request.build_opener(proxy, urllib.request.HTTPHandler)  # initialize the opener with the proxy
urllib.request.install_opener(opener)  # install globally: urlopen() now uses the proxy automatically

# request
url = "https://www.baidu.com/"
data = urllib.request.urlopen(url).read().decode("utf-8")
print(data)
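If you prefer not to change global state, the opener can also be used directly instead of being installed — a minimal sketch (the proxy address below is a placeholder, not a known-live proxy; the actual request is left commented out):

```python
import urllib.request

ip = "180.115.8.212:39109"  # placeholder proxy address; substitute a live one
proxy = urllib.request.ProxyHandler({"https": ip})
opener = urllib.request.build_opener(proxy)

# opener.open() routes only this call through the proxy; the global
# urlopen() behaviour is left untouched.
# data = opener.open("https://www.baidu.com/").read().decode("utf-8")
```

This keeps the proxy scoped to the code that needs it, which matters once several functions in the same process want different network settings.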
Building an IP proxy pool, method 1
Suited to stable, long-lived proxy IPs: pick an IP at random from a hard-coded list.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib.request
import random  # used to pick a proxy from the pool

def dai_li_ip():
    ip = [
        "110.73.8.103:8123",
        "115.46.151.100:8123",
        "42.233.187.147:19",
    ]
    shui = random.choice(ip)
    print(shui)
    proxy = urllib.request.ProxyHandler({"https": shui})  # format the proxy; the key must match the target scheme, http or https
    opener = urllib.request.build_opener(proxy, urllib.request.HTTPHandler)  # initialize the opener with the proxy
    urllib.request.install_opener(opener)  # install globally: urlopen() now uses the proxy automatically

# request
dai_li_ip()  # activate a random proxy from the pool
url = "https://www.baidu.com/"
data = urllib.request.urlopen(url).read().decode("utf-8")
print(data)
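The pool logic above can be factored into a small helper that returns a fresh opener on each call, rather than mutating global state — a sketch (the addresses are the sample ones from the listing and are not guaranteed to be live):

```python
import random
import urllib.request

POOL = [
    "110.73.8.103:8123",
    "115.46.151.100:8123",
    "42.233.187.147:19",
]

def random_proxy_opener(pool):
    """Pick one proxy address at random and return an opener bound to it."""
    addr = random.choice(pool)
    handler = urllib.request.ProxyHandler({"https": addr, "http": addr})
    return urllib.request.build_opener(handler), addr

opener, chosen = random_proxy_opener(POOL)
print(chosen)  # the proxy selected for this opener
```

Returning the chosen address alongside the opener makes it easy to log which proxy a failed request went through.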
Building an IP proxy pool, method 2: API-based
Fetch a fresh IP from a third-party API on every call — suited to proxy IPs with short lifetimes.
Here we use http://http.zhimaruanjian.com...
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib.request
import json

def dai_li_ip():
    url = "http://http-webapi.zhimaruanjian.com/getip?num=1&type=2&pro=&city=0&yys=0&port=11&time=1&ts=0&ys=0&cs=0&lb=1&sb=0&pb=4&mr=1"
    data = urllib.request.urlopen(url).read().decode("utf-8")
    data2 = json.loads(data)  # parse the JSON string back into Python objects
    print(data2["data"][0])
    ip = str(data2["data"][0]["ip"])
    dkou = str(data2["data"][0]["port"])
    zh_ip = ip + ":" + dkou
    print(zh_ip)
    proxy = urllib.request.ProxyHandler({"https": zh_ip})  # format the proxy; the key must match the target scheme, http or https
    opener = urllib.request.build_opener(proxy, urllib.request.HTTPHandler)  # initialize the opener with the proxy
    urllib.request.install_opener(opener)  # install globally: urlopen() now uses the proxy automatically

# request
dai_li_ip()  # fetch and activate a proxy from the API
url = "https://www.baidu.com/"
data = urllib.request.urlopen(url).read().decode("utf-8")
print(data)
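The parsing step can be exercised offline with a hand-written payload in the same shape the code expects. The sample JSON below is an assumption about the API's response format, inferred only from the fields the listing reads:

```python
import json

# Hypothetical response body mimicking {"data": [{"ip": ..., "port": ...}]}
sample = '{"data": [{"ip": "1.2.3.4", "port": 8080}]}'

def extract_proxy(raw):
    """Return 'ip:port' from an API response in the assumed shape."""
    record = json.loads(raw)["data"][0]
    return "%s:%s" % (record["ip"], record["port"])

print(extract_proxy(sample))  # → 1.2.3.4:8080
```

Separating the parsing from the network call also makes it trivial to add error handling for malformed or empty API responses later.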
Combining a user-agent pool with an IP proxy
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib.request
import json
import random

def yh_dl():  # build the user-agent pool
    yhdl = [
        "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)",
        "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0)",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
        "Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
        "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; en) Presto/2.8.131 Version/11.11",
        "Opera/9.80 (Windows NT 6.1; U; en) Presto/2.8.131 Version/11.11",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; The World)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Avant Browser)",
        "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
        "Mozilla/5.0 (iPod; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
        "Mozilla/5.0 (iPad; U; CPU OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
        "Mozilla/5.0 (Linux; U; Android 2.3.7; en-us; Nexus One Build/FRF91) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1",
        "Opera/9.80 (Android 2.3.4; Linux; Opera Mobi/build-1107180945; U; en-GB) Presto/2.8.149 Version/11.10",
        "Mozilla/5.0 (Linux; U; Android 3.0; en-us; Xoom Build/HRI39) AppleWebKit/534.13 (KHTML, like Gecko) Version/4.0 Safari/534.13",
        "Mozilla/5.0 (BlackBerry; U; BlackBerry 9800; en) AppleWebKit/534.1+ (KHTML, like Gecko) Version/6.0.0.337 Mobile Safari/534.1+",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows Phone OS 7.5; Trident/5.0; IEMobile/9.0; HTC; Titan)",
        "UCWEB7.0.2.37/28/999",
        "NOKIA5700/ UCWEB7.0.2.37/28/999",
        "Openwave/ UCWEB7.0.2.37/28/999",
        "Mozilla/4.0 (compatible; MSIE 6.0; ) Opera/UCWEB7.0.2.37/28/999",
    ]
    thisua = random.choice(yhdl)  # pick a user agent at random
    headers = ("User-Agent", thisua)  # build the header tuple
    opener = urllib.request.build_opener()  # create an opener
    opener.addheaders = [headers]  # attach the header to the opener
    urllib.request.install_opener(opener)  # install globally: urlopen() now sends this header automatically

def dai_li_ip():  # build the IP proxy pool
    url = "http://http-webapi.zhimaruanjian.com/getip?num=1&type=2&pro=&city=0&yys=0&port=11&time=1&ts=0&ys=0&cs=0&lb=1&sb=0&pb=4&mr=1"
    data = urllib.request.urlopen(url).read().decode("utf-8")
    data2 = json.loads(data)  # parse the JSON string back into Python objects
    print(data2["data"][0])
    ip = str(data2["data"][0]["ip"])
    dkou = str(data2["data"][0]["port"])
    zh_ip = ip + ":" + dkou
    print(zh_ip)
    proxy = urllib.request.ProxyHandler({"https": zh_ip})  # format the proxy; the key must match the target scheme, http or https
    opener = urllib.request.build_opener(proxy, urllib.request.HTTPHandler)  # initialize the opener with the proxy
    urllib.request.install_opener(opener)  # install globally: urlopen() now uses the proxy automatically

# request
dai_li_ip()  # activate the IP proxy
yh_dl()  # activate the user-agent pool
gjci = "連衣裙"
zh_gjci = urllib.request.quote(gjci)  # percent-encode the keyword; URLs cannot contain raw Chinese characters
url = "https://s.taobao.com/search?q=%s&s=0" % (zh_gjci)
# print(url)
data = urllib.request.urlopen(url).read().decode("utf-8")
print(data)
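One thing to be aware of: urllib.request keeps only a single global opener, so each install_opener() call replaces the previous one — installing the user-agent opener after the proxy opener leaves only the header active. Bundling the proxy handler and the header into one opener avoids that. A sketch, with placeholder values:

```python
import urllib.request

def make_opener(proxy_addr, user_agent):
    """One opener carrying both the proxy and the User-Agent header.

    A single combined opener guarantees the proxy and the header
    apply together, instead of the later install_opener() call
    silently discarding the earlier one.
    """
    handler = urllib.request.ProxyHandler({"https": proxy_addr})
    opener = urllib.request.build_opener(handler)
    opener.addheaders = [("User-Agent", user_agent)]
    return opener

# placeholder proxy address and UA string, for illustration only
opener = make_opener("1.2.3.4:8080", "Mozilla/5.0 (test)")
urllib.request.install_opener(opener)
```

After this single install, urlopen() both routes through the proxy and sends the chosen User-Agent.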
Wrapping the combined user-agent and IP proxy logic into a module
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib.request
import urllib.error
import json
import random
import re

def hq_html(hq_url):
    """hq_html(): wrapped crawler function with user-agent and IP proxying enabled.
    Takes one argument, the url of the page to crawl; returns the html source."""
    def yh_dl():  # build the user-agent pool
        yhdl = [
            "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
            "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)",
            "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0)",
            "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
            "Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
            "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; en) Presto/2.8.131 Version/11.11",
            "Opera/9.80 (Windows NT 6.1; U; en) Presto/2.8.131 Version/11.11",
            "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)",
            "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)",
            "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
            "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; The World)",
            "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
            "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Avant Browser)",
            "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
            "Mozilla/5.0 (iPod; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
            "Mozilla/5.0 (iPad; U; CPU OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
            "Mozilla/5.0 (Linux; U; Android 2.3.7; en-us; Nexus One Build/FRF91) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1",
            "Opera/9.80 (Android 2.3.4; Linux; Opera Mobi/build-1107180945; U; en-GB) Presto/2.8.149 Version/11.10",
            "Mozilla/5.0 (Linux; U; Android 3.0; en-us; Xoom Build/HRI39) AppleWebKit/534.13 (KHTML, like Gecko) Version/4.0 Safari/534.13",
            "Mozilla/5.0 (BlackBerry; U; BlackBerry 9800; en) AppleWebKit/534.1+ (KHTML, like Gecko) Version/6.0.0.337 Mobile Safari/534.1+",
            "Mozilla/5.0 (compatible; MSIE 9.0; Windows Phone OS 7.5; Trident/5.0; IEMobile/9.0; HTC; Titan)",
            "UCWEB7.0.2.37/28/999",
            "NOKIA5700/ UCWEB7.0.2.37/28/999",
            "Openwave/ UCWEB7.0.2.37/28/999",
            "Mozilla/4.0 (compatible; MSIE 6.0; ) Opera/UCWEB7.0.2.37/28/999",
        ]
        thisua = random.choice(yhdl)  # pick a user agent at random
        headers = ("User-Agent", thisua)  # build the header tuple
        opener = urllib.request.build_opener()  # create an opener
        opener.addheaders = [headers]  # attach the header to the opener
        urllib.request.install_opener(opener)  # install globally: urlopen() now sends this header automatically

    def dai_li_ip(hq_url):  # build the IP proxy pool
        url = "http://http-webapi.zhimaruanjian.com/getip?num=1&type=2&pro=&city=0&yys=0&port=11&time=1&ts=0&ys=0&cs=0&lb=1&sb=0&pb=4&mr=1"
        if url:
            data = urllib.request.urlopen(url).read().decode("utf-8")
            data2 = json.loads(data)  # parse the JSON string back into Python objects
            # print(data2["data"][0])
            ip = str(data2["data"][0]["ip"])
            dkou = str(data2["data"][0]["port"])
            zh_ip = ip + ":" + dkou
            pat = r"(\w*):\w*"
            rst = re.compile(pat).findall(hq_url)  # regex match: is the target http or https?
            rst2 = rst[0]
            proxy = urllib.request.ProxyHandler({rst2: zh_ip})  # format the proxy; the key must match the target scheme, http or https
            opener = urllib.request.build_opener(proxy, urllib.request.HTTPHandler)  # initialize the opener with the proxy
            urllib.request.install_opener(opener)  # install globally: urlopen() now uses the proxy automatically
        else:
            pass

    # request
    try:
        dai_li_ip(hq_url)  # activate the IP proxy
        yh_dl()  # activate the user-agent pool
        data = urllib.request.urlopen(hq_url).read().decode("utf-8")
        return data
    except urllib.error.URLError as e:  # on error
        if hasattr(e, "code"):  # HTTP error code available
            # print(e.code)
            pass
        if hasattr(e, "reason"):  # error reason available
            # print(e.reason)
            pass

# a = hq_html("http://www.baid.com/")
# print(a)
Using the module
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib.request
import fzhpach

gjc = "廣告錄音"
gjc = urllib.request.quote(gjc)  # percent-encode the keyword; URLs cannot contain raw Chinese characters
url = "https://www.baidu.com/s?wd=%s&pn=0" % (gjc)
a = fzhpach.hq_html(url)
print(a)
[Reposted from: https://www.jianshu.com/u/3fe...]