I have the following shell script. The purpose is to loop through each line of the target file (whose path is the input parameter to the script) and do work against each line.
This was happening to me because I had set -e and a grep in a loop was returning no output (which gives a non-zero exit status).
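For illustration, here is a minimal sketch of that pitfall (the pattern and file name are placeholders): under set -e, a grep that matches nothing exits with status 1 and aborts the script, so you have to guard the call if an empty result is acceptable.

#!/bin/bash
set -e
while read -r line; do
    # grep exits 1 when it finds no match; under set -e that would abort the
    # whole script, so treat "no match" as a non-fatal result.
    match=$(grep 'pattern' <<<"$line" || true)
    if [ -n "$match" ]; then
        echo "matched: $line"
    fi
done < input_list.txt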
The problem is that do_work.sh runs ssh commands, and by default ssh reads from stdin, which is your input file. As a result, you only see the first line processed, because ssh consumes the rest of the file and your while loop terminates.

To prevent this, pass the -n option to your ssh command to make it read from /dev/null instead of stdin.
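A sketch of what that fix looks like inside do_work.sh, assuming it contains something like the following (the remote command is a placeholder):

#!/bin/bash
# do_work.sh -- invoked once per line of the input file
line=$1

# -n makes ssh take its stdin from /dev/null, so it cannot swallow the
# remaining lines that the caller's while-read loop still needs.
ssh -n ubuntu@host_xyz "echo processing: $line"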
However, the ssh -n option prevents checking the exit status of ssh when the remote commands come from a here document and the output is piped to another program, so using /dev/null as stdin is preferred in that case:
#!/bin/bash
while read -r ONELINE ; do
    # The here document ties up ssh's stdin, so ssh cannot consume input_list.txt.
    ssh ubuntu@host_xyz </dev/null <<EOF 2>&1 | filter_pgm
echo "Hi, $ONELINE. You come here often?"
process_response_pgm
EOF
    # Capture ssh's exit status before the next command overwrites PIPESTATUS.
    ssh_status=${PIPESTATUS[0]}
    if [ "$ssh_status" -ne 0 ] ; then
        echo "aborting loop"
        exit "$ssh_status"
    fi
done < input_list.txt
More generally, a workaround which isn't specific to ssh is to redirect standard input for any command which might otherwise consume the while loop's input.
while read -r LINE; do
    let count++
    echo "$count $LINE"
    sh ./do_work.sh "$LINE" </dev/null
done < "$FILENAME"
The addition of </dev/null is the crucial point here (though the corrected quoting is also somewhat important; see also When to wrap quotes around a shell variable?). You will want to use read -r unless you specifically require the legacy, slightly odd behavior you get without -r.
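A quick illustration of the difference (the input string is just an example): without -r, read treats backslashes as escape characters and strips them; with -r, the line is preserved verbatim.

printf 'a\\b\n' | while read LINE; do echo "$LINE"; done      # prints: ab
printf 'a\\b\n' | while read -r LINE; do echo "$LINE"; done   # prints: a\b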
Another workaround of sorts, which is somewhat specific to ssh, is to make sure any ssh command has its standard input tied up, e.g. by changing

ssh otherhost some commands here

to instead read the commands from a here document, which conveniently (for this particular scenario) ties up the standard input of ssh for the commands:
ssh otherhost <<'____HERE'
some commands here
____HERE