Shell script using curl to loop through urls

囚心锁ツ 2021-02-05 05:41

I've been trying to create a simple script that will take a list of queries from a .txt file, append each query to a base URL variable, fetch the page with curl, and write the content to a text file.

2 Answers
  • 2021-02-05 06:30

    You've got nested quotes; try something like this:

    #!/bin/bash
    
    url="https://www.google.fr/?q="
    
    # Read query.txt line by line; -r keeps backslashes literal.
    while IFS= read -r query
    do
        # ${url} (not {$url}) expands the variable; quoting keeps the URL intact.
        # -s silences curl's progress meter.
        content=$(curl -s "${url}${query}")
        echo "$query"
        echo "$content" >> output.txt
    done < query.txt
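
    A quick way to try it, assuming the script above is saved as scrape.sh and query.txt holds one single-word search term per line (both file names are just examples):

    $ cat query.txt
    bash
    curl
    $ bash scrape.sh          # each query is echoed as it is fetched
    bash
    curl
    # output.txt now holds the HTML returned for each query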
    
  • 2021-02-05 06:32

    Use more quotes! A short demonstration of why they matter follows the links below.

    • http://mywiki.wooledge.org/Quotes
    • http://mywiki.wooledge.org/Arguments
    • http://wiki.bash-hackers.org/syntax/words
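
    As a minimal sketch of why the quotes matter (the variable name and values here are just for illustration):

    #!/bin/bash
    
    query="foo bar"
    
    # Unquoted: the shell word-splits the value, so the command sees two arguments.
    printf '<%s>\n' $query      # prints <foo> then <bar>
    
    # Quoted: the value is passed through as a single argument.
    printf '<%s>\n' "$query"    # prints <foo bar>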

    Try this instead:

    url="example.com/?q="
    # Loop over the queries; ${url}${i} (no stray slash) builds each request URL.
    for i in $(cat query.txt); do
        content="$(curl -s "${url}${i}")"
        echo "$content" >> output.txt
    done
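
    Note that "for i in $(cat query.txt)" splits the file on any whitespace, so a query containing spaces is broken into separate iterations; the "while read" loop from the other answer keeps such lines intact.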
    