Scraping the second page of a website in Python does not work

Submitted by 久未见 on 2019-12-11 11:43:08

Question


Let's say I want to scrape the data here.

I can do it nicely using urlopen and BeautifulSoup in Python 2.7.
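
For reference, here is a minimal sketch of that working first-page scrape, assuming the Amazon Best Sellers URL and the CSS selector used in the answer below (the original question only linked to the page):

from urllib2 import urlopen
from bs4 import BeautifulSoup

# Assumed URL, taken from the answer below
url = 'http://www.amazon.com/Best-Sellers-Books-Architecture/zgbs/books/173508/'
soup = BeautifulSoup(urlopen(url).read(), "html.parser")

# Print every book title on the first page of results
for title in soup.select("div.zg_itemImmersion div.zg_title a"):
    print title.get_text(strip=True)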

Now, if I try to scrape the data from the second page with this address, what I get is the data from the first page! I looked at the second page's source using Chrome's "view page source", and its content also belongs to the first page!

How can I scrape the data from the second page?


Answer 1:


The page is quite asynchronous in nature: the search results are formed by XHR requests, which you can simulate in your code using requests. Here is sample code as a starting point:

from bs4 import BeautifulSoup
import requests

url = 'http://www.amazon.com/Best-Sellers-Books-Architecture/zgbs/books/173508/#2'
ajax_url = "http://www.amazon.com/Best-Sellers-Books-Architecture/zgbs/books/173508/ref=zg_bs_173508_pg_2"

def get_books(data):
    # Parse the returned HTML fragment and print every book title in it
    soup = BeautifulSoup(data, "html.parser")

    for title in soup.select("div.zg_itemImmersion div.zg_title a"):
        print title.get_text(strip=True)


with requests.Session() as session:
    # Visit the regular page first so the session picks up any cookies
    session.get(url)

    # Present ourselves as a browser making an AJAX (XHR) request
    session.headers = {
        'User-Agent': 'Mozilla/5.0 (Linux; U; Android 4.0.3; ko-kr; LG-L160L Build/IML74K) AppleWebkit/534.30 (KHTML, like Gecko) Version/4.0 Mobile Safari/534.30',
        'X-Requested-With': 'XMLHttpRequest'
    }

    for page in range(1, 10):
        print "Page #%d" % page

        # Each page of results arrives in two XHR responses; the second
        # request, with isAboveTheFold=0, fetches the rest of the list
        params = {
            "_encoding": "UTF8",
            "pg": str(page),
            "ajax": "1"
        }
        response = session.get(ajax_url, params=params)
        get_books(response.content)

        params["isAboveTheFold"] = "0"
        response = session.get(ajax_url, params=params)
        get_books(response.content)

And don't forget to be a good web-scraping citizen and follow the Terms of Use.
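
As an aside (not part of the original answer), here is a minimal sketch of checking the site's robots.txt before crawling in a loop, using Python 2's standard robotparser module; the path is the one from the URLs above:

import robotparser  # urllib.robotparser in Python 3

rp = robotparser.RobotFileParser()
rp.set_url("http://www.amazon.com/robots.txt")
rp.read()

# True if the rules allow a generic crawler to fetch this path
print rp.can_fetch("*", "/Best-Sellers-Books-Architecture/zgbs/books/173508/")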



Source: https://stackoverflow.com/questions/30229316/scraping-the-second-page-of-a-website-in-python-does-not-work
