urllib freezes if the URL is too big!

Submitted by 拥有回忆 on 2019-12-14 03:58:30

Question


OK, I'm trying to open a URL using urllib, but the problem is that the file is too big, so when I open the URL Python freezes. I'm also using wxPython, which freezes too when I open the URL; my CPU goes to almost 100% while the URL is open.

Any solutions? Is there a way I can read the URL in chunks, maybe with a time.sleep(0.5) in between, so it does not freeze? This is my code:

import urllib

# Reads the entire response into memory before writing, and blocks
# until the download finishes -- this is what freezes the UI.
f = open("hello.txt", 'wb')
datatowrite = urllib.urlopen(link).read()
f.write(datatowrite)
f.close()

Thanks


Answer 1:


You want to move the download into a separate thread, so your UI thread keeps responding while the download thread does the work on its own. That way you don't get the "freeze" while the download happens.

Read more about threading here:

http://docs.python.org/library/threading.html
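
As a minimal sketch of that approach, assuming Python 2's urllib (as in your code), where link is the URL from your question and the 8192-byte chunk size is an arbitrary choice:

import threading
import urllib

def download(link, filename):
    # Read the response in fixed-size chunks so memory stays bounded,
    # and do the work off the UI thread.
    response = urllib.urlopen(link)
    with open(filename, 'wb') as f:
        while True:
            chunk = response.read(8192)
            if not chunk:
                break
            f.write(chunk)

# Start the download without blocking the wxPython event loop.
t = threading.Thread(target=download, args=(link, "hello.txt"))
t.start()

If you need to update the wxPython UI when the download finishes, do it via wx.CallAfter rather than touching widgets directly from the worker thread.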

Alternatively, you could download the file outside of Python entirely, using a system tool such as curl or wget.
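
If you go that route, a sketch using Python's subprocess module might look like this (hello.txt and link are the placeholders from your question):

import subprocess

# Launch curl in a child process; Popen returns immediately, so the
# wxPython event loop is never blocked by the download.
subprocess.Popen(["curl", "-o", "hello.txt", link])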



Source: https://stackoverflow.com/questions/6565910/urllib-freeze-if-url-is-too-big
