Hands-on Python web scraping: automatically collecting Taobao product data

Hey everyone~ Xixi here.

Development environment:

  • Python 3.8

  • PyCharm Professional

Third-party library:
  • DrissionPage >>> pip install DrissionPage

How to install a third-party Python module:

  1. Press Win + R, type cmd and click OK, then run the install command pip install <module name> (for example pip install requests) and press Enter

  2. In PyCharm, open the Terminal tab and run the same install command (a quick way to verify the install is shown right after this list)
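
If you want to confirm that DrissionPage installed correctly, a quick sanity check from any Python prompt or script is simply to import it (this check is a generic suggestion, not something from the original post):

import DrissionPage  # raises ModuleNotFoundError if the installation failed
print('DrissionPage is ready to use')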




Code walkthrough

Imports and the main scraping script

from DrissionPage import ChromiumPage
from info import USERNAME, PASSWORD  # local info.py holding the Taobao account credentials
import time
import re
import json
import csv

# Open a browser window and start listening for the search-result API packets
page = ChromiumPage()
page.listen.start('h5/mtop.relationrecommend.wirelessrecommend.recommend/2.0/')

# 1. Open the login page (the login steps below are kept commented out; they are only needed on the first run)
# page.get('https://login.taobao.com/member/login.jhtml')
# 2. Locate the account input box and type the username
# page.ele('xpath://input[@id="fm-login-id"]').input(USERNAME)
# 3. Locate the password input box and type the password
# page.ele('xpath://input[@id="fm-login-password"]').input(PASSWORD)
# 4. Click the login button
# page.ele('xpath://button[@class="fm-button fm-submit password-login"]').click()
# time.sleep(2)

# Open the search result page for the keyword "iphone 15 pro max"
# page.get('https://s.taobao.com/search?_input_charset=utf-8&commend=all&ie=utf8&initiative_id=tbindexz_20170306&q=iphone%2015%20pro%20max&search_type=item&source=suggest&sourceId=tb.index&spm=a21bo.jianhua.201856-taobao-item.2&ssid=s5-e&suggest=0_8&suggest_query=ip&wq=ip')
# packet = page.listen.wait()
# print(packet.response.body)

for i in range(10):
    # Click the "next page" button and wait for the data packet it triggers
    page.ele('xpath://button[@class="next-btn next-small next-btn-normal next-pagination-item next-next"]').click()
    pack = page.listen.wait()
    # The response is JSONP: strip the mtopjsonpN(...) wrapper to get plain JSON
    mtopjson = re.findall(r'mtopjsonp\d+\((.*)\)', pack.response.body)[0]
    print(mtopjson)
    mtopdict = json.loads(mtopjson)
    itemsArray = mtopdict['data']['itemsArray']
    for item in itemsArray:
        # Pull out the fields we care about for each product
        title = item['title']
        price = item['price']
        realSales = item['realSales']
        procity = item['procity']
        shop_name = item['shopInfo']['title']
        print(title, price, realSales, procity, shop_name)
        # Append one row per product to the CSV file
        with open('taobao.csv', mode='a', newline='', encoding='utf-8') as f:
            csv_writer = csv.writer(f)
            csv_writer.writerow([title, price, realSales, procity, shop_name])
    time.sleep(2)
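
A note on the second import: the script pulls USERNAME and PASSWORD from a local module named info that the post never shows. A minimal sketch of what that info.py file could contain, assuming it only holds the two constants the import asks for (the values below are placeholders, not real credentials):

# info.py: hypothetical credentials module assumed by "from info import USERNAME, PASSWORD"
USERNAME = 'your_taobao_account'    # placeholder: your Taobao login name
PASSWORD = 'your_taobao_password'   # placeholder: your Taobao password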
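
Why the regular expression is needed: the packet captured by page.listen.wait() is JSONP rather than plain JSON. The payload arrives wrapped in a call such as mtopjsonp123({...}), which json.loads cannot parse directly, so the pattern keeps only the text between the parentheses. A small self-contained illustration using a made-up response body:

import re
import json

# Made-up body shaped like the mtop JSONP responses the listener captures
sample_body = 'mtopjsonp3({"data": {"itemsArray": [{"title": "iPhone 15 Pro Max", "price": "9999"}]}})'

# Strip the mtopjsonpN( ... ) wrapper, then parse the JSON that is left
inner_json = re.findall(r'mtopjsonp\d+\((.*)\)', sample_body)[0]
data = json.loads(inner_json)
print(data['data']['itemsArray'][0]['title'])  # -> iPhone 15 Pro Max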
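
One design note on the saving step: the script reopens taobao.csv for every single product, and the file never gets a header row. If you prefer, the output file can be opened once and a header written before the paging loop starts; a small sketch of that variation:

import csv

# Open the output file once and write a header row; the paging loop from the
# script above would then reuse csv_writer instead of reopening the file per row.
with open('taobao.csv', mode='w', newline='', encoding='utf-8') as f:
    csv_writer = csv.writer(f)
    csv_writer.writerow(['title', 'price', 'realSales', 'procity', 'shop_name'])
    # for each scraped item:
    #     csv_writer.writerow([title, price, realSales, procity, shop_name])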

Closing words

Thanks for reading my article~ this flight ends here.

I hope it helped you and that you learned a little something~

Even the stars that hide away keep working hard to shine, so keep at it too (let's keep going together).

