Let's say I have a text file of hundreds of URLs, one per line, e.g.

```
http://url/file_to_download1.gz
http://url/file_to_download2.gz
http://url/file_to_downlo
```
If you're on OpenWrt or using an old version of wget that doesn't give you the `-i` option:
```bash
#!/bin/bash
input="text_file.txt"
while IFS= read -r line
do
    wget "$line"
done < "$input"
```
Furthermore, if you don't have `wget`, you can use `curl` or whatever tool you use for downloading individual files.
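For example, with curl the same loop only needs the download command swapped out. A minimal sketch, assuming the list is still in `text_file.txt` with one URL per line (the `download_all` function name is my own, for illustration):

```bash
#!/bin/sh
# Read URLs from a file (one per line) and fetch each with curl.
download_all() {
    while IFS= read -r url
    do
        # -f: fail on HTTP errors, -L: follow redirects,
        # -O: save under the remote filename
        curl -fLO "$url"
    done < "$1"
}

# Run only if the URL list actually exists.
if [ -f text_file.txt ]; then
    download_all text_file.txt
fi
```

Quoting `"$url"` matters in both versions: without it, URLs containing `&`, `?`, or spaces can be mangled by the shell before the downloader ever sees them.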