I am looking to gzip multiple files (into multiple .gz files) in a directory while keeping the originals.
I can do individual files using these commands:
Your > in the last command gets parsed by the same shell which runs find. Use a nested shell:
find . -type f -name "*cache.html" -exec sh -c "gzip < {} > {}.gz" \;
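If the filenames might contain spaces or other shell metacharacters, a slightly more defensive variant (a sketch, not part of the original answer) passes each name to the inner shell as a positional parameter instead of splicing it into the command string:
# "$1" is the filename supplied by find; quoting it preserves spaces
find . -type f -name "*cache.html" -exec sh -c 'gzip < "$1" > "$1.gz"' sh {} \;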
I'd use bash(1)'s simple for construct for this:
for f in *cache.html ; do gzip -c "$f" > "$f.gz" ; done
If I knew the filenames were 'sane', I'd leave off the double quotes around the arguments, because I'm lazy, and my filenames are usually sane. But scripts don't have that luxury.
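If the loop should also descend into subdirectories, a recursive variant is possible (a sketch, assuming bash 4 or later with globstar available):
# nullglob: skip the loop entirely if nothing matches
# globstar: make ** match files in subdirectories as well
shopt -s nullglob globstar
for f in **/*cache.html ; do gzip -c "$f" > "$f.gz" ; done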
gzip 1.6 (June 2013) added the -k, --keep option, so now you can:
find . -type f -name "*cache.html" -exec gzip -k {} \;
or, without find, for the current directory only:
gzip -k *cache.html
or, for all files recursively, simply:
gzip -kr .
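Because -k only appeared in gzip 1.6, it can be worth confirming the installed version first; a quick check might look like this:
# the first line of the --version output includes the version number
gzip --version | head -n 1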
Found at: https://unix.stackexchange.com/questions/46786/how-to-tell-gzip-to-keep-original-file
Since you have multiple files, GNU Parallel might be useful:
find . -type f -name "*cache.html" | parallel gzip '<{} >{}.gz'
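If your gzip has -k, a simpler pipeline is possible (a sketch, using find -print0 together with GNU Parallel's -0/--null option so unusual filenames survive intact):
# NUL-delimited filenames are safe even with spaces or newlines in the names
find . -type f -name "*cache.html" -print0 | parallel -0 gzip -k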
Watch the intro video for a quick introduction: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1