The requests library is a third-party package; install it before use:
pip install requests
1. Common operations:
import requests

resp = requests.get('http://www.baidu.com')

# resp.text returns the response body as a decoded (Unicode) string
print(resp.text)

# resp.content returns the raw bytes of the response body;
# decode it with decode('utf-8') to get a string
print(resp.content.decode('utf-8'))

# The full URL of the request
print(resp.url)

# The character encoding used to decode resp.text
print(resp.encoding)
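The text/content distinction can be seen without a network call by building a Response by hand (this pokes at the private `_content` attribute purely for illustration; normally requests fills it in for you):

```python
import requests

# Construct a Response manually, for illustration only --
# _content is a private attribute normally set by requests itself.
resp = requests.models.Response()
resp._content = '中国'.encode('utf-8')
resp.encoding = 'utf-8'

print(resp.content)  # raw bytes: b'\xe4\xb8\xad\xe5\x9b\xbd'
print(resp.text)     # str decoded using resp.encoding: '中国'
```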
2. Adding headers and query parameters
import requests

kw = {'wd': '中国'}
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
                  'AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36'
}
# params is URL-encoded and appended to the URL as the query string
resp = requests.get('http://www.baidu.com/s', params=kw, headers=headers)
with open('baidu.html', 'w', encoding='utf-8') as f:
    f.write(resp.content.decode('utf-8'))
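How params end up in the URL can be checked offline with a PreparedRequest; the wd value is percent-encoded as UTF-8:

```python
import requests

# prepare() builds the final request without sending it,
# so we can inspect the URL requests would actually use
req = requests.Request('GET', 'http://www.baidu.com/s', params={'wd': '中国'})
prepared = req.prepare()
print(prepared.url)  # http://www.baidu.com/s?wd=%E4%B8%AD%E5%9B%BD
```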
3. Sending a POST request
import requests

url = 'http://www.iqianyue.com/mypost'
data = {
    'name': 'aa',
    'pass': '123456'
}
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) '
                  'Chrome/72.0.3626.121 Safari/537.36',
    'Referer': 'http://www.iqianyue.com/mypost'
}
# data is sent as a form-encoded request body
resp = requests.post(url=url, data=data, headers=headers)
# json() parses the response body as JSON (raises an error if it is not JSON)
print(resp.json())
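The form encoding that data produces can likewise be inspected offline with a PreparedRequest:

```python
import requests

# A data dict is serialized as application/x-www-form-urlencoded
req = requests.Request('POST', 'http://www.iqianyue.com/mypost',
                       data={'name': 'aa', 'pass': '123456'})
prepared = req.prepare()
print(prepared.body)                     # name=aa&pass=123456
print(prepared.headers['Content-Type'])  # application/x-www-form-urlencoded
```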
4. Using a proxy
import requests

# To use a proxy with requests, pass the proxies parameter
# to the request method (get/post)
url = 'http://httpbin.org/get'
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) '
                  'Chrome/72.0.3626.121 Safari/537.36'
}
# The dict key must match the URL scheme; the URL above is http,
# so the proxy is registered under 'http' (the original key 'https'
# would never match this request)
proxy = {
    'http': 'http://116.209.56.67:9999'
}
resp = requests.get(url, headers=headers, proxies=proxy)
with open('proxy.html', 'w', encoding='utf-8') as f:
    f.write(resp.text)
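The proxies mapping is keyed on URL scheme, so a proxy intended for all traffic needs one entry per scheme (the address below is the illustrative one from above):

```python
# One entry per scheme; requests picks the entry whose key
# matches the scheme of the URL being requested
proxies = {
    'http': 'http://116.209.56.67:9999',
    'https': 'http://116.209.56.67:9999',
}
# requests.get('http://httpbin.org/get', proxies=proxies)
# would route through the 'http' entry
```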
5. Using cookies
import requests

headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) '
                  'Chrome/72.0.3626.121 Safari/537.36',
    'Host': 'www.baidu.com'
}
response = requests.get('http://www.baidu.com', headers=headers)
# response.cookies is a RequestsCookieJar
print(response.cookies)
for key, value in response.cookies.items():
    print(key + '=' + value)
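To send cookies back automatically on later requests, use a Session, whose cookie jar persists across calls. A minimal offline sketch (the token cookie is made up; normally the server sets it via Set-Cookie):

```python
import requests

s = requests.Session()
# Set a cookie by hand for illustration; every subsequent
# request made through this session will carry it
s.cookies.set('token', 'abc123')
print(dict(s.cookies))  # {'token': 'abc123'}
```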