Adding a wait-for-element while performing a SplashRequest in python Scrapy

走远了吗 · Submitted on 2019-11-29 07:52:34

Question


I am trying to scrape a few dynamic websites using Splash for Scrapy in Python. However, I see that Splash fails to wait for the complete page to load in certain cases. A brute-force way to tackle this problem was to add a large wait time (e.g. 5 seconds in the snippet below). However, this is extremely inefficient and still fails to load certain data (sometimes it takes longer than 5 seconds for the content to load). Is there some sort of a wait-for-element condition that can be put through these requests?

yield SplashRequest(
    url,
    self.parse,
    args={
        'wait': 5,
        'User-Agent': "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36",
    },
)

Answer 1:


Yes, you can write a Lua script to do that. Something like this:

function main(splash)
  splash:set_user_agent(splash.args.ua)
  assert(splash:go(splash.args.url))

  -- requires Splash 2.3  
  while not splash:select('.my-element') do
    splash:wait(0.1)
  end
  return {html=splash:html()}
end

Before Splash 2.3 you can use splash:evaljs('!document.querySelector(".my-element")') instead of not splash:select('.my-element').
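For example, the wait loop in the script above would then become something like this (a sketch, reusing the same '.my-element' selector):

while splash:evaljs('!document.querySelector(".my-element")') do
  splash:wait(0.1)
end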

Save the full script to a variable (lua_script = """ ... """). Then you can send a request like this:

yield SplashRequest(
    url, 
    self.parse, 
    endpoint='execute',
    args={
        'lua_source': lua_script,
        'ua': "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36"
    }
)

See the Splash scripting tutorial and reference for more details on how to write Splash Lua scripts.




Answer 2:


I have a similar requirement, but with a timeout. My solution is a slight modification of the above:

function wait_css(splash, css, maxwait)
    if maxwait == nil then
        maxwait = 10     --default maxwait if not given
    end

    local i=0
    while not splash:select(css) do
       if i==maxwait then
           break     --times out at maxwait secs
       end
       i=i+1
       splash:wait(1)      --each loop has duration 1sec
    end
end
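A minimal sketch of how this helper might be called from main, assuming the same 'ua' argument and '.my-element' selector used in the first answer:

function main(splash)
  splash:set_user_agent(splash.args.ua)
  assert(splash:go(splash.args.url))
  wait_css(splash, '.my-element', 15)   -- give up after roughly 15 seconds
  return {html=splash:html()}
end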



Answer 3:


You can use a Lua script with JavaScript and splash:wait_for_resume (see the documentation).

function main(splash, args)
  splash.resource_timeout = 60

  assert(splash:go(splash.args.url))

  assert(splash:wait(1))
  splash.scroll_position = {y=500}

  result, error = splash:wait_for_resume([[
    function main(splash) {
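      // poll once a second until the element has text, then resume the Lua script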
      var checkExist = setInterval(function() {
        if (document.querySelector(".css-selector").innerText) {
          clearInterval(checkExist);
          splash.resume();
        }
      }, 1000);
    }
  ]], 30)

  assert(splash:wait(0.5))
  return splash:html()
end

If you use this without the scrapy-splash plugin, pay attention to splash.args.url in splash:go; it will be different (you have to pass the url argument yourself).
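With scrapy-splash, the request for this script could be sent the same way as in the first answer; a minimal sketch, assuming the Lua source above is saved in lua_script and the '.css-selector' placeholder has been adapted to the target page:

yield SplashRequest(
    url,
    self.parse,
    endpoint='execute',
    args={'lua_source': lua_script},
)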



Source: https://stackoverflow.com/questions/41075257/adding-a-wait-for-element-while-performing-a-splashrequest-in-python-scrapy
