python selenium - takes a lot of time when it does not find elements

Submitted by 十年热恋 on 2021-01-27 13:12:16

Question


My code scans many web pages with chromedriver and searches each page for the same element with `find_elements_by_xpath`:

Lines = driver.find_elements_by_xpath(
                    '//*[@id="top"]/div[contains(@style, "display: block;")]/'
                    'div[contains(@style, "display: block;")]//tbody//a[contains(@title, "Line")]')

When it finds one or more matches, it runs quickly. But when the XPath does not match anything, it blocks for 6-7 seconds before moving on.

Can I limit the search to 1 second, and simply move on if nothing is found by then? Is there a way to do this?


Answer 1:


Try using an explicit wait, as below:

from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait as wait
from selenium.common.exceptions import TimeoutException

try:
    Lines = wait(driver, 1).until(EC.presence_of_all_elements_located(
        (By.XPATH, '//*[@id="top"]/div[contains(@style, "display: block;")]/'
                   'div[contains(@style, "display: block;")]//tbody//a[contains(@title, "Line")]')))
except TimeoutException:
    Lines = []  # nothing matched within 1 second; avoid reusing a stale value

This waits up to 1 second for at least one matching element and gives you the list of required WebElements; on timeout it simply moves on instead of raising.
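Under the hood, `WebDriverWait.until` is just a polling loop: it re-evaluates the condition at a fixed interval (0.5 seconds by default) until the condition returns a truthy value or the timeout elapses. A minimal sketch of the same idea in plain Python (the `wait_until` helper here is hypothetical, not part of Selenium):

```python
import time

def wait_until(condition, timeout=1.0, poll=0.1):
    """Poll `condition` until it returns a truthy value or `timeout` expires.

    Returns the condition's result on success; raises TimeoutError otherwise.
    """
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within timeout")
        time.sleep(poll)
```

This is why a short timeout makes the "not found" case cheap: the loop gives up as soon as the deadline passes, rather than waiting out a long implicit-wait budget on every lookup.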



Source: https://stackoverflow.com/questions/45596974/python-selenium-takes-a-lot-of-time-when-it-does-not-find-elements
