Script to get the HTTP status code of a list of URLs?

2020-11-30 17:17

I have a list of URLS that I need to check, to see if they still work or not. I would like to write a bash script that does that for me.

I only need the returned HTTP status codes.

8 answers
  • 2020-11-30 18:01

    Extending the answer already provided by Phil: adding parallelism is straightforward in bash if you use xargs for the call.

    Here is the code:

    xargs -n1 -P 10 curl -o /dev/null --silent --head --write-out '%{url_effective}: %{http_code}\n' < url.lst
    

    -n1: use just one value (from the list) as an argument to each curl call

    -P 10: keep 10 curl processes alive at any time (i.e. 10 parallel connections)

    Check the --write-out option in the curl manual for more data you can extract with it (timings, etc.).
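
    For reference, xargs reads one URL per line from the input file, so url.lst might look like this (the example.com/example.org addresses are just placeholders):

    https://example.com/
    https://example.com/some-page
    https://example.org/

    Each finished request then prints one line such as https://example.com/: 200.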

    In case it helps someone, this is the call I'm currently using:

    xargs -n1 -P 10 curl -o /dev/null --silent --head --write-out '%{url_effective};%{http_code};%{time_total};%{time_namelookup};%{time_connect};%{size_download};%{speed_download}\n' < url.lst | tee results.csv
    

    It writes one semicolon-separated line per URL into results.csv, a CSV file that can be imported into any office tool.
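
    The CSV is also easy to post-process on the command line. A minimal sketch, assuming the semicolon-separated results.csv produced above, that prints every URL whose status code is not 200:

    # field 1 is the URL, field 2 the HTTP status code
    awk -F';' '$2 != 200 {print $1 " -> " $2}' results.csv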

  • 2020-11-30 18:02
    wget --spider -S "http://url/to/be/checked" 2>&1 | grep "HTTP/" | awk '{print $2}'
    

    This prints only the status code for you. Note that if the URL redirects, wget follows it and you get one code per response; pipe through tail -1 to keep only the final one.
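
    To run this over the whole list, here is a minimal wrapper sketch, assuming one URL per line in a file named url.lst:

    # check every URL in url.lst and print "url: code"
    while read -r url; do
        code=$(wget --spider -S "$url" 2>&1 | grep "HTTP/" | awk '{print $2}' | tail -1)
        echo "$url: $code"
    done < url.lst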
