Build a Wallpaper Crawler with Python
Below is a Python wallpaper crawler that fetches wallpaper images from the Sogou Wallpaper site (bizhi.sogou.com) and saves them to a local folder:
import requests
from bs4 import BeautifulSoup
import os
from urllib.parse import urljoin

# Create a folder for saving the wallpaper images
if not os.path.exists('wallpapers'):
    os.mkdir('wallpapers')

# Fetch the wallpaper page
url = 'https://bizhi.sogou.com/'
response = requests.get(url, timeout=10)
soup = BeautifulSoup(response.text, 'html.parser')
img_tags = soup.find_all('img')

for img_tag in img_tags:
    img_url = img_tag.get('src')  # .get() avoids a KeyError on <img> tags without a src attribute
    if img_url and 'bizhi' in img_url:
        img_url = urljoin(url, img_url)  # resolve relative and scheme-relative URLs
        img_name = img_url.split('/')[-1]
        img_path = os.path.join('wallpapers', img_name)
        if not os.path.exists(img_path):
            img_response = requests.get(img_url, stream=True, timeout=10)
            with open(img_path, 'wb') as f:
                for chunk in img_response.iter_content(chunk_size=1024):
                    if chunk:
                        f.write(chunk)
            print(f'{img_name} saved successfully.')
Running the code above creates a folder named "wallpapers" in the current directory and saves the downloaded wallpaper images into it. Note that the script depends on the site's current HTML: if the page layout changes, or if images are loaded via JavaScript, the <img> tags may no longer contain direct image URLs.
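The URL handling in the loop can be tested without touching the network. Here is a minimal sketch of the same logic as a standalone helper, using only the standard library; the page URL and image path below are hypothetical example values, not taken from the live site:

```python
import os
from urllib.parse import urljoin, urlparse

def to_local_path(page_url, src, folder='wallpapers'):
    """Resolve a possibly-relative <img> src against the page URL and
    derive a local file path from the last segment of the URL path."""
    absolute = urljoin(page_url, src)  # handles '/a.jpg' and '//host/a.jpg' forms
    name = os.path.basename(urlparse(absolute).path)
    return absolute, os.path.join(folder, name)

# Hypothetical example values for illustration:
full, path = to_local_path('https://bizhi.sogou.com/', '/images/bizhi/cat.jpg')
# full -> 'https://bizhi.sogou.com/images/bizhi/cat.jpg'
# path -> os.path.join('wallpapers', 'cat.jpg')
```

Splitting the logic out like this makes it easy to verify the filename derivation before running the full crawler against the site.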
Original source: https://www.cveoy.top/t/topic/bAWS. Copyright belongs to the author. Please do not repost or scrape!