I'd like to download web pages while supplying URLs from stdin. Essentially, one process continuously produces URLs to stdout/a file and I want to pipe them to wget or curl.
You can do this with curl, but your input needs to be formatted as a curl config file. Example alfa.txt:
url example.com
output example.htm
url stackoverflow.com
output stackoverflow.htm
Alternate example:
url stackoverflow.com/questions
remote-name
url stackoverflow.com/documentation
remote-name
Example command:
cat alfa.txt | curl -K-
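If your producer emits bare URLs rather than config syntax, you can rewrite them on the fly with awk and pipe the result straight into `curl -K-`. A minimal sketch, where `produce_urls` is a stand-in for whatever process generates the URLs:

```shell
#!/bin/sh
# Placeholder for the process that continuously emits URLs, one per line.
produce_urls() {
  printf '%s\n' example.com stackoverflow.com
}

# Rewrite each bare URL into curl config syntax ("url = ..." plus
# "remote-name"), then feed the config to curl via -K- (read from stdin).
produce_urls |
  awk '{ printf "url = \"%s\"\nremote-name\n", $0 }' |
  curl -sK-
```

Because curl reads the config from stdin, it processes entries as they arrive, so this works with a long-running producer as well as a static file.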