How to `wget` a list of URLs in a text file?

余生分开走 · 2021-01-30 03:46

Let's say I have a text file of hundreds of URLs in one location, e.g.

http://url/file_to_download1.gz
http://url/file_to_download2.gz
http://url/file_to_downlo         


        
5 Answers
  •  一整个雨季 · 2021-01-30 04:21

    If you're on OpenWrt, or using an old version of wget that doesn't give you the -i option, you can loop over the file yourself:

    #!/bin/bash
    # Read the URL list line by line and download each one.
    input="text_file.txt"
    while IFS= read -r line
    do
      # Quote the variable so URLs with special characters survive word splitting.
      wget "$line"
    done < "$input"
    
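    For comparison, the -i option mentioned above is the usual one-liner on a full wget build; a minimal sketch, assuming the URL list is saved as text_file.txt:

    # Standard wget: read the URLs to download from a file, one per line.
    wget -i text_file.txt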

    Alternatively, if you don't have wget at all, you can use curl or whatever tool you normally use to download individual files.
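
    Here is a minimal sketch of the same loop with curl, again assuming the list is in text_file.txt (the file name is illustrative); -O saves each file under its remote name and -L follows redirects:

    #!/bin/bash
    # Download every URL listed in text_file.txt with curl.
    while IFS= read -r url
    do
      # -L follows redirects, -O saves under the remote file name.
      curl -L -O "$url"
    done < "text_file.txt"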
