Basically I want to take as input text from a file, remove a line from that file, and send the output back to the same file. Something along these lines, if that makes it any clearer:
grep -v 'seg[0-9]\{1,\}\.[0-9]\{1\}' file_name > file_name
Since this question is the top result in search engines, here's a one-liner based on https://serverfault.com/a/547331 that uses a subshell instead of sponge
(which often isn't part of a vanilla install like OS X):
echo "$(grep -v 'seg[0-9]\{1,\}\.[0-9]\{1\}' file_name)" > file_name
The general case is:
echo "$(cat file_name)" > file_name
Edit: the above solution has some caveats:
- printf '%s' should be used instead of echo so that files containing -n don't cause undesired behavior.
- Command substitution strips trailing newlines, so we should append a postfix character like x to the output and remove it on the outside via parameter expansion of a temporary variable like ${v%x} (a short demonstration follows the general solution below).
- Using a temporary variable $v stomps the value of any existing variable $v in the current shell environment, so we should nest the entire expression in parentheses to preserve the previous value.
- Command substitution also strips unprintable characters such as null from the output. I verified this by calling dd if=/dev/zero bs=1 count=1 >> file_name and viewing the file in hex with cat file_name | xxd -p: the null byte is there. But with echo $(cat file_name) | xxd -p it is stripped. So this answer should not be used on binary files or anything using unprintable characters, as Lynch pointed out.
The general solution (albeit slightly slower, more memory intensive, and still stripping unprintable characters) is:
(v=$(cat file_name; printf x); printf '%s' "${v%x}" > file_name)
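A quick way to see the trailing-newline caveat in action (again with a throwaway file_name), comparing the plain substitution against the x-postfix form:
printf 'a\n\n\n' > file_name
(v=$(cat file_name); printf '%s' "$v" | xxd -p)                  # 61       -- all trailing newlines stripped
(v=$(cat file_name; printf x); printf '%s' "${v%x}" | xxd -p)    # 610a0a0a -- trailing newlines preserved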
Test from https://askubuntu.com/a/752451:
printf "hello\nworld\n" > file_uniquely_named.txt && for ((i=0; i<1000; i++)); do (v=$(cat file_uniquely_named.txt; printf x); printf '%s' ${v%x} > file_uniquely_named.txt); done; cat file_uniquely_named.txt; rm file_uniquely_named.txt
Should print:
hello
world
Whereas calling cat file_uniquely_named.txt > file_uniquely_named.txt
in the current shell:
printf "hello\nworld\n" > file_uniquely_named.txt && for ((i=0; i<1000; i++)); do cat file_uniquely_named.txt > file_uniquely_named.txt; done; cat file_uniquely_named.txt; rm file_uniquely_named.txt
Prints an empty string, because the shell truncates file_uniquely_named.txt when it sets up the redirection, before cat ever gets to read it.
I haven't tested this on large files (probably over 2 or 4 GB).
I have borrowed this answer from Hart Simha and kos.