Can I use wget to check, but not download?

失恋的感觉 asked 2021-01-30 04:53

Can I use wget to check for a 404 and not actually download the resource? If so how? Thanks

5 Answers
  • 2021-01-30 05:27

    You can use the --delete-after option to check for the file. Note that wget still downloads the resource and then deletes it, so nothing is kept on disk, but the transfer does happen:

    wget --delete-after URL
    
  • 2021-01-30 05:31

    There is the command-line parameter --spider for exactly this. In this mode, wget does not download the file, and its return value is zero if the resource was found and non-zero if it was not. Try this (in your favorite shell):

    wget -q --spider address
    echo $?
    

    Or if you want full output, leave the -q off, so just wget --spider address. -nv shows some output, but not as much as the default.
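    Because --spider reports success through the exit status, it drops neatly into scripts. A minimal sketch, assuming wget is installed; the helper name url_exists and the example URL are placeholders, not from the answer:

    ```shell
    #!/bin/sh
    # Return 0 if the URL is reachable, non-zero otherwise.
    # --spider makes wget check the resource without downloading it.
    url_exists() {
        wget -q --spider "$1"
    }

    if url_exists "https://example.com/"; then
        echo "resource found"
    else
        echo "resource missing"
    fi
    ```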

  • 2021-01-30 05:35

    If you want to check quietly via $? without the hassle of grepping wget's output, you can use:

    wget -q "http://blah.meh.com/my/path" -O /dev/null
    

    This works even on URLs with just a path, but it has the disadvantage that the resource is actually downloaded, so it is not recommended when checking large files for existence.
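    A minimal sketch of using that exit status in a script (the function name check_url and the URL are illustrative; keep in mind the body really is fetched, just discarded):

    ```shell
    #!/bin/sh
    # Check a URL by downloading its body to /dev/null and
    # testing wget's exit status. Avoid this for large files.
    check_url() {
        wget -q "$1" -O /dev/null
    }

    if check_url "https://example.com/"; then
        echo "exists"
    else
        echo "missing"
    fi
    ```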

  • 2021-01-30 05:39

    Yes, easily.

    wget --spider www.bluespark.co.nz
    

    That will give you output like:

    Resolving www.bluespark.co.nz... 210.48.79.121
    Connecting to www.bluespark.co.nz[210.48.79.121]:80... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: unspecified [text/html]
    200 OK
    
  • 2021-01-30 05:40

    If you run wget from a directory that only root can write to, a standard user account can simply run wget www.example.com/wget-test: the request still hits the URL, but the file can't be saved because the user has no write permission. This method works fine for me; I use it in a cron job. Thanks.
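    A sketch of the cron setup described above; the schedule, directory, and URL are hypothetical placeholders, and /usr/lib is assumed to be root-writable only:

    0 * * * * cd /usr/lib && wget -q http://www.example.com/wget-test

    Note that wget will report an error because it cannot create the output file, so the cron job still fetches the URL without leaving anything on disk. Using --spider or -O /dev/null, as in the other answers, is a cleaner way to get the same effect.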

