Question
Is there an HTTP client like `wget`/`lynx`/`GET` that is distributed by default in POSIX or *nix operating systems and could be used for maximum portability?
I know most systems have `wget` or `lynx` installed, but I seem to remember installing some Ubuntu server systems with default settings, and they had neither `wget` nor `lynx` in the base package.
I am writing a shell script for Linux (and probably Mac) that installs a piece of software onto the computer. To avoid having to distribute a couple of large files, I would like to fetch them from the internet instead of packaging them with the installer. Currently, the install script is distributed as a single file created with Makeself.
I'd like to avoid the install script being over 100 MB, which it would be if the files were included; also, the files may not be required at all if the person is upgrading or re-installing the software. Then again, maybe the most portable thing to do is to include the files in the package.
Right now I am just thinking of having the script check for `wget`, `lynx`, and `GET`, in that order, and use whichever one it finds for downloading, but I could avoid this altogether if there were one way to download the files that worked on all systems.
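For illustration, a rough sketch of that fallback logic in portable sh (the URL and output filename are hypothetical placeholders; `command -v` is the POSIX way to test for a tool):

```sh
#!/bin/sh
# Try the downloaders from the question in order.
# URL and output name below are placeholders.
url="https://example.com/installer-data.tar.gz"
out="installer-data.tar.gz"

if command -v wget >/dev/null 2>&1; then
    wget -O "$out" "$url"
elif command -v lynx >/dev/null 2>&1; then
    lynx -source "$url" > "$out"
elif command -v GET >/dev/null 2>&1; then
    GET "$url" > "$out"
else
    echo "no usable download tool found (tried wget, lynx, GET)" >&2
    exit 1
fi
```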
EDIT:
Does anyone know much about lwp-request (`GET`) and its availability? It seems to be readily available on several of the systems I have checked so far, and I remember it always being around 10+ years ago, going back to Red Hat.
Answer 1:
Edit on 2019-11-04: I'm rewriting my answer to reflect the importance of ensuring that a transfer isn't tampered with in flight. I'll leave my original answer below the rule.
I suggest using `rsync` over `ssh` to transfer your files. `rsync`'s interface may look overwhelming, but most users can simply pick `rsync -avzP`, and if you need more flexibility, rsync can adapt. Using `ssh` provides integrity, authenticity, and privacy for the connection.
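As a minimal sketch, assuming the files sit on a host you control (the hostname and paths below are hypothetical):

```sh
# Pull the payload over ssh; -a preserves attributes, -v is verbose,
# -z compresses in transit, -P shows progress and allows resuming.
rsync -avzP -e ssh user@files.example.com:/srv/installer-data/ ./installer-data/
```

Because rsync only transfers what has changed, re-running it during an upgrade or re-install is cheap.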
`curl` is the de facto standard for HTTP transfers; if plain HTTP or HTTPS is preferred, `curl` or tools based on `curl` are probably a good choice.
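For example, a guarded HTTPS download might look like this sketch (the URL and checksum are placeholders; on macOS, `shasum -a 256 -c` replaces `sha256sum -c`):

```sh
# Placeholder URL and checksum -- publish and substitute the real values.
url="https://example.com/payload.tar.gz"
sum="0000000000000000000000000000000000000000000000000000000000000000"

curl -fLo payload.tar.gz "$url"   # -f fails on HTTP errors, -L follows redirects
printf '%s  payload.tar.gz\n' "$sum" | sha256sum -c -
```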
In my experience, tools tend to be available in about this order:

1. `wget`
2. `curl`
3. `sftp`
4. `ftp`
5. `GET` (I use `HEAD` all the time and often forget it is just one tool in the suite)
6. `tftp`
7. `nc` (not as common as I wish)
8. `socat` (even less common)
The `bash` `/dev/tcp` facility is available on most systems I've used (some ship `dash` or `pdksh` instead), but using `echo` with `bash`, `nc`, or `socat` is going the long way around for HTTP access -- you'll have to handle the headers somehow, which reduces its elegance.
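To show what "handling the headers" means, here is a rough `/dev/tcp` sketch (the host and path are placeholders; plain HTTP only, no TLS, no redirects, no error handling):

```bash
#!/bin/bash
host="example.com"
path="/payload.txt"

# Open a TCP connection on fd 3 and send a minimal HTTP/1.0 request.
exec 3<>"/dev/tcp/$host/80"
printf 'GET %s HTTP/1.0\r\nHost: %s\r\nConnection: close\r\n\r\n' "$path" "$host" >&3

# Consume the response headers: read lines until the first blank one.
while IFS= read -r line <&3; do
    line=${line%$'\r'}
    [ -z "$line" ] && break
done

# Whatever is left on the socket is the body.
cat <&3 > payload.txt
exec 3<&-
```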
Answer 2:
Official list of POSIX 7 utilities
http://pubs.opengroup.org/onlinepubs/9699919799/utilities/contents.html
The following are not present in the list:
- wget
- curl
- ftp
The same goes for the LSB, which essentially only guarantees the POSIX utilities.
But I do think that POSIX C is enough to implement most of `netcat`'s functionality, so it is really a missed opportunity. E.g.: How to make an HTTP get request in C without libcurl?
Likely this is because network protocols like HTTP were deemed too specific, or didn't exist, while POSIX was still evolving, and POSIX has essentially been frozen ever since. Notably, HTTPS encryption is likely not trivial to implement.
Answer 3:
Curl is probably even more common than wget, at least in my experience, simply because more tools depend on it. But both curl and wget are super simple installs and will be available on any system.
Answer 4:
GNU awk (gawk) has built-in TCP handling, even on non-Linux systems: http://www.gnu.org/software/gawk/manual/gawkinet/html_node/
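A small sketch of that (the host and path are placeholders; gawk's `/inet/tcp/...` special files and the `|&` coprocess operator do the networking):

```sh
gawk 'BEGIN {
    site = "/inet/tcp/0/example.com/80"   # 0 = let gawk pick the local port
    printf "GET / HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n" |& site
    while ((site |& getline line) > 0)    # print the response, headers included
        print line
    close(site)
}'
```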
Answer 5:
I suggest using ftp or wget, as they are the most common in Linux distributions. The best practice might be to have your script check whether a command is available and, if not, move on to the next one.
Source: https://stackoverflow.com/questions/9490872/is-wget-or-similar-programs-always-available-on-posix-systems