Urllib2 & BeautifulSoup : Nice couple but too slow - urllib3 & threads?

Asked by 醉酒成梦 on 2021-01-31 06:38

I was looking for a way to optimize my code when I heard some good things about threads and urllib3. Apparently, people disagree on which solution is best.

The pro

3 Answers
  •  感情败类
     2021-01-31 07:15

    Hey guys,

    Some news on the problem! I've found this script, which might be useful! I'm testing it right now, and it's promising (6.03 s to run the script below).

    My idea is to find a way to mix that with urllib3. Indeed, I'm making requests to the same host many times.

    "The PoolManager will take care of reusing connections for you whenever you request the same host. This should cover most scenarios without significant loss of efficiency, but you can always drop down to a lower level component for more granular control." (urllib3 documentation)
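
    For illustration, here is a minimal sketch of what that reuse looks like with PoolManager, based on the documented API; the two URLs are just the first pages from the list below:

    import urllib3

    http = urllib3.PoolManager()

    # Both requests target www.bulats.org, so the second one reuses the
    # keep-alive connection that the first one opened.
    r1 = http.request('GET', 'http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All')
    r2 = http.request('GET', 'http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All&page=1')
    print r1.status, len(r1.data)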

    Anyway, it seems very interesting, and even if I can't yet see how to combine these two functionalities (urllib3 and the threading script below), I'm sure it's doable! :-)

    Thank you very much for taking the time to give me a hand with this. It looks promising!

    import Queue
    import threading
    import urllib2
    import time
    from bs4 import BeautifulSoup

    hosts = ["http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All",
             "http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All&page=1",
             "http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All&page=2",
             "http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All&page=3",
             "http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All&page=4",
             "http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All&page=5",
             "http://www.bulats.org//agents/find-an-agent?field_continent_tid=All&field_country_tid=All&page=6"]
    
    queue = Queue.Queue()
    out_queue = Queue.Queue()
    
    class ThreadUrl(threading.Thread):
        """Threaded Url Grab"""
        def __init__(self, queue, out_queue):
            threading.Thread.__init__(self)
            self.queue = queue
            self.out_queue = out_queue
    
        def run(self):
            while True:
                #grabs host from queue
                host = self.queue.get()
    
                # fetch the page body for this host
                response = urllib2.urlopen(host)
                chunk = response.read()
    
                #place chunk into out queue
                self.out_queue.put(chunk)
    
                #signals to queue job is done
                self.queue.task_done()
    
    class DatamineThread(threading.Thread):
        """Threaded HTML parsing"""
        def __init__(self, out_queue):
            threading.Thread.__init__(self)
            self.out_queue = out_queue

        def run(self):
            while True:
                # grabs a page chunk from the out queue
                chunk = self.out_queue.get()

                # parse the chunk with an explicit parser (avoids bs4's
                # "no parser specified" warning)
                soup = BeautifulSoup(chunk, "html.parser")

                # print every cell of the agents table
                tableau = soup.find('table')
                rows = tableau.findAll('tr')
                for tr in rows:
                    cols = tr.findAll('td')
                    for td in cols:
                        texte_bu = td.text.encode('utf-8')
                        print texte_bu

                # signals to queue job is done
                self.out_queue.task_done()
    
    start = time.time()
    def main():
    
        #spawn a pool of threads, and pass them queue instance
        for i in range(5):
            t = ThreadUrl(queue, out_queue)
            t.setDaemon(True)
            t.start()
    
        #populate queue with data
        for host in hosts:
            queue.put(host)
    
        for i in range(5):
            dt = DatamineThread(out_queue)
            dt.setDaemon(True)
            dt.start()
    
    
        #wait on the queue until everything has been processed
        queue.join()
        out_queue.join()
    
    main()
    print "Elapsed Time: %s" % (time.time() - start)
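
    In the meantime, here is a minimal sketch of how the mix could look, assuming urllib3 is installed: only ThreadUrl changes, and a single PoolManager is shared by all the workers (urllib3's pools are designed to be shared across threads). DatamineThread and main() from the script above stay exactly the same, since the out queue still receives raw HTML chunks.

    import Queue
    import threading
    import urllib3

    # One PoolManager shared by every worker thread; it keeps a pool of
    # keep-alive connections per host, so repeated requests to
    # www.bulats.org reuse sockets instead of reconnecting each time.
    http = urllib3.PoolManager(maxsize=5)  # maxsize matched to the thread count

    class ThreadUrl(threading.Thread):
        """Threaded Url Grab through a shared urllib3 PoolManager"""
        def __init__(self, queue, out_queue):
            threading.Thread.__init__(self)
            self.queue = queue
            self.out_queue = out_queue

        def run(self):
            while True:
                host = self.queue.get()

                # replaces urllib2.urlopen(host); the connection is reused
                response = http.request('GET', host)
                self.out_queue.put(response.data)

                self.queue.task_done()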
    
