How can I write a Python script that scrapes Baidu's search-box dropdown suggestions and related-search keywords, automates the queries, records the results in a spreadsheet, and automatically removes duplicate phrases?
The following example code scrapes Baidu dropdown suggestions and related-search keywords, records the results in a spreadsheet, and removes duplicate phrases:
import requests
from bs4 import BeautifulSoup
import openpyxl

HEADERS = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
                  'AppleWebKit/537.36 (KHTML, like Gecko) '
                  'Chrome/58.0.3029.110 Safari/537.3'
}

def get_related_keywords(keyword):
    """Scrape the related-search terms from a Baidu results page."""
    url = 'https://www.baidu.com/s'
    params = {
        'wd': keyword,
        'rn': '10',
        'tn': 'baidurt',
        'ie': 'utf-8',
        'rqlang': 'cn',
        'rsv_dl': 'tb',
        'rsv_srlang': 'cn',
        'rsv_rq': 'off'
    }
    res = requests.get(url, params=params, headers=HEADERS)
    soup = BeautifulSoup(res.text, 'html.parser')
    # Related searches have historically lived in a container with id "rs";
    # Baidu's markup changes over time, so adjust the selector if this
    # returns nothing.
    return [a.get_text(strip=True) for a in soup.select('#rs a')]

def get_dropdown_keywords(keyword):
    """Fetch the search-box dropdown suggestions for a keyword."""
    url = 'https://www.baidu.com/sugrec'
    params = {'prod': 'pc', 'wd': keyword}
    res = requests.get(url, params=params, headers=HEADERS)
    # The sugrec endpoint returns JSON, not HTML: the suggestions have been
    # observed under the "g" key as {"q": "<suggestion>", ...} items.
    data = res.json()
    return [item['q'] for item in data.get('g', [])]

def save_to_excel(data):
    wb = openpyxl.Workbook()
    ws = wb.active
    ws.append(['Keyword', 'Related Keywords', 'Dropdown Keywords'])
    for row in data:
        ws.append(row)
    wb.save('keywords.xlsx')

if __name__ == '__main__':
    keywords = ['Python', '数据分析', '机器学习']
    data = []
    for keyword in keywords:
        # list(dict.fromkeys(...)) drops duplicate phrases while keeping
        # the order in which they were first seen.
        related_keywords = list(dict.fromkeys(get_related_keywords(keyword)))
        dropdown_keywords = list(dict.fromkeys(get_dropdown_keywords(keyword)))
        data.append([keyword,
                     ', '.join(related_keywords),
                     ', '.join(dropdown_keywords)])
    save_to_excel(data)
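The suggestion endpoint serves JSON rather than an HTML page, so no HTML parser is needed for it. A minimal parsing sketch, using a hard-coded sample payload in the shape the sugrec endpoint has been observed to return (the real response may differ and should be checked at runtime):

```python
# Sample payload shaped like an observed sugrec response; "g" holds the
# suggestion items, each with the suggested phrase under "q".
sample = {
    'q': 'python',
    'g': [
        {'type': 'sug', 'q': 'python 教程'},
        {'type': 'sug', 'q': 'python 下载'},
    ],
}

def extract_suggestions(payload):
    # A missing "g" key (no suggestions returned) yields an empty list.
    return [item['q'] for item in payload.get('g', [])]

print(extract_suggestions(sample))
# → ['python 教程', 'python 下载']
```

Using `payload.get('g', [])` instead of `payload['g']` keeps the scraper from crashing on keywords that have no suggestions.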
The code uses the requests library to fetch pages and BeautifulSoup to parse HTML. The get_related_keywords function extracts the related-search terms from a Baidu results page, and get_dropdown_keywords retrieves the keywords shown in Baidu's search-box dropdown. Finally, save_to_excel writes the results to an Excel workbook; note that duplicates must be removed explicitly when the keyword lists are assembled, since writing to Excel does not deduplicate anything by itself. To collect related and dropdown keywords from other search engines, adapt the request parameters and parsing logic to each site's page structure.
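The deduplication step can be isolated into a small helper. Since Python 3.7, `dict` preserves insertion order, so `dict.fromkeys` removes duplicates while keeping the order in which Baidu returned the phrases (unlike `set`, which would scramble it):

```python
def dedupe(words):
    # dict.fromkeys keeps only the first occurrence of each key and
    # preserves insertion order, so the original ordering survives.
    return list(dict.fromkeys(words))

print(dedupe(['python 教程', 'python 下载', 'python 教程', 'python 入门']))
# → ['python 教程', 'python 下载', 'python 入门']
```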