Timeout a file download with Python urllib?

Front-end · unresolved · 3 answers · 662 views
一生所求 · 2020-12-09 19:24

Python beginner here. I want to be able to timeout my download of a video file if the process takes longer than 500 seconds.

import urllib

try:
    urllib.urlretrieve("http://www.videoURL.mp4", "filename.mp4")
except Exception:
    print("error")

3 Answers
  • 2020-12-09 19:43

    Although urlretrieve does not have this feature, you can still set the default timeout (in seconds) for all new socket objects.

    import socket
    import urllib

    # Applies to every socket created afterwards, including the ones
    # urllib opens internally.
    socket.setdefaulttimeout(15)

    try:
        urllib.urlretrieve("http://www.videoURL.mp4", "filename.mp4")
    except Exception as e:
        print("error:", e)
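    Since the default timeout is set process-wide, a minimal sanity check (no network needed) shows that it applies to every socket created afterwards, which is why it also covers the sockets urllib opens internally:

    ```python
    import socket

    # Set a 15-second default timeout for all sockets created from now on.
    socket.setdefaulttimeout(15)

    # New sockets inherit the default automatically.
    s = socket.socket()
    print(s.gettimeout())  # 15.0
    s.close()
    ```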
    
  • 2020-12-09 19:51

    A better way is to use requests: you can stream the response and easily check for timeouts:

    import requests

    # Request the file. timeout=10 makes the request fail if no data arrives
    # for 10 seconds, and stream=True avoids keeping the whole file in memory.
    response = requests.get('http://www.videoURL.mp4', timeout=10, stream=True)
    response.raise_for_status()

    # Open the output file in binary mode.
    with open('filename.mp4', 'wb') as fh:
        # Walk through the response in chunks of 1024 * 1024 bytes, i.e. 1 MiB.
        for chunk in response.iter_content(1024 * 1024):
            fh.write(chunk)
            # Optionally we can check here if the download is taking too long.
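    That last comment can be made concrete. A sketch of a small helper (the name `save_stream` is hypothetical, not part of requests) that enforces an overall wall-clock budget while streaming, since the `timeout` parameter alone only bounds the wait for each read, not the total download time:

    ```python
    import time

    def save_stream(chunks, path, budget_seconds):
        # Write an iterable of byte chunks to *path*, raising TimeoutError
        # if the overall wall-clock budget is exceeded.
        deadline = time.monotonic() + budget_seconds
        with open(path, "wb") as fh:
            for chunk in chunks:
                if time.monotonic() > deadline:
                    raise TimeoutError("download exceeded time budget")
                fh.write(chunk)

    # With requests (hypothetical URL) this would be used as:
    # response = requests.get("http://www.videoURL.mp4", timeout=10, stream=True)
    # save_stream(response.iter_content(1024 * 1024), "filename.mp4", 500)
    ```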
    
  • 2020-12-09 19:54

    urlretrieve does not have that option, but you can easily do the same with urlopen, writing the result to a file yourself, like so:

    from urllib.request import urlopen

    request = urlopen("http://www.videoURL.mp4", timeout=500)
    with open("filename.mp4", "wb") as f:
        try:
            f.write(request.read())
        except Exception:
            print("error")
    

    That's for Python 3, where urlopen lives in urllib.request. On Python 2, use urllib2.urlopen instead, which also accepts a timeout argument (Python 2's plain urllib.urlopen does not). Note that the timeout applies to each blocking socket operation, not to the total download time.
