What I'm trying to achieve is to read command line arguments from a file and invoke a command using them. So essentially I need to pass the arguments through a bash variable.
If you want to execute your command once for each line found in file.txt, so that each line is a separate argument set, you can do this:
xargs -L1 /some/command <file.txt
The xargs utility reads argument lists from standard input; with -L1 it invokes the command once per input line, splitting each line into arguments on blanks (quoted strings within a line are kept together). Without -L1, xargs would pack as many arguments as possible into each invocation. If the file contains only one line, the command is executed only once either way.
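A quick sketch of that behavior, using echo as a stand-in for /some/command and a hypothetical /tmp/args_demo.txt:

```shell
# Hypothetical sample file: two argument sets, one per line
printf '%s\n' 'alpha beta' 'gamma' > /tmp/args_demo.txt

# -L1 makes xargs invoke the command once per input line,
# splitting each line into separate arguments
xargs -L1 echo ARGS: < /tmp/args_demo.txt
# ARGS: alpha beta
# ARGS: gamma
```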
The following solution does the same, but works with functions too:
while IFS= read -r line
do
    eval "args=($line)"
    command_or_function "${args[@]}"
done < file.txt
Please note that this uses eval, which means that if file.txt contains malicious content, arbitrary code execution could result. You must be 100% certain that the data contained in the file is safe.
The idea with this technique is that you explode each line into an array (one array element is one argument), and then use an array expansion ("${args[@]}") that expands to a list of all its elements, properly quoted (the quotes around the expansion are important here).
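To illustrate, here is a minimal sketch of the loop with a hypothetical shell function (count_args) standing in for command_or_function, reading a trusted throwaway file; it requires bash for the array syntax:

```shell
# Hypothetical function standing in for command_or_function
count_args() { echo "$# args, first: $1"; }

# Trusted sample data: a quoted argument containing a space, plus a plain one
printf '%s\n' '"aaa bbb" ccc' > /tmp/file_demo.txt

while IFS= read -r line
do
    eval "args=($line)"       # explode the line into an array
    count_args "${args[@]}"   # pass each element as a separate argument
done < /tmp/file_demo.txt
# 2 args, first: aaa bbb
```

Note that the quoted "aaa bbb" arrives as a single argument.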
As an aside, the eval line could be replaced with:
declare -a args=\($line\)
But $line still gets expanded, so this is no safer than eval.
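Either way, the danger is concrete: a command substitution smuggled into the data executes as soon as the line is parsed. A minimal, harmless demonstration (using echo as the injected command):

```shell
# Malicious-looking data: a command substitution inside the "arguments"
line='$(echo injected)'

# eval re-parses the line, so the substitution actually runs --
# this is the arbitrary code execution risk described above
eval "args=($line)"
echo "${args[0]}"
# injected
```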
Use command substitution to expand the file contents as an argument to the command:
/some/command "$(<file.txt)"
As an example,
cat file
"aaa bbb" "xxx yyy"
Using printf on it incorrectly, with an unquoted command substitution, word-splits the contents:
printf "%s\n" $(cat file)
"aaa
bbb"
"xxx
yyy"
With proper quoting, the file contents are passed as a single argument, without being split:
printf "%s\n" "$(<file)"
"aaa bbb" "xxx yyy"
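To make the difference concrete, a hypothetical argument-counting function shows that the quoted form delivers the whole file as one argument, while the unquoted form splits it on whitespace (bash syntax, throwaway /tmp file):

```shell
# Sample file matching the example above
printf '%s' '"aaa bbb" "xxx yyy"' > /tmp/quoted_demo.txt

# Hypothetical helper: report how many arguments it received
count() { echo "$#"; }

count "$(< /tmp/quoted_demo.txt)"   # quoted: one argument
# 1
count $(< /tmp/quoted_demo.txt)     # unquoted: split on whitespace
# 4
```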