I have a while loop reading lines from a file, $hosts:
while read line
do
    ip=$line
    check
done < $hosts
My question is: can I run check for each host in parallel (in the background) to speed this up?
You can send tasks to the background by appending &. If you intend to wait for all of them to finish, you can use the wait command:
process_to_background &
echo Processing ...
wait
echo Done
You can get the PID of a task started in the background via $! if you want to wait for one (or a few) specific tasks:
important_process_to_background &
important_pid=$!
for i in {1..10}; do
less_important_process_to_background $i &
done
wait $important_pid
echo Important task finished
wait
echo All tasks finished
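wait $important_pid also returns that job's exit status, so you can branch on whether the important task succeeded; a small sketch:

important_process_to_background &
important_pid=$!
if wait "$important_pid"; then
    echo "Important task succeeded"
else
    echo "Important task failed with status $?"   # $? here is wait's (i.e. the job's) exit status
fi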
One note though: the background processes can mess up the output, as they run asynchronously. You might want to use a named pipe to collect their output.
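For example, here is a minimal sketch of funnelling all the children's output through one named pipe; the check body, the out_fifo name and the hosts file are assumptions, and the parent keeps a spare write end open so the reader does not see an early EOF while children are still starting:

check() { ping -c 1 -W 1 "$1" > /dev/null && echo "$1 up" || echo "$1 down"; }  # hypothetical check

mkfifo out_fifo
cat out_fifo &                  # single reader: each child's line comes out whole, not interleaved
reader=$!
exec 3> out_fifo                # hold a write end open for the whole run

pids=()
while read -r ip; do
    check "$ip" >&3 &           # every child writes its one-line result into the fifo
    pids+=("$!")
done < "$hosts"

wait "${pids[@]}"               # wait for the check children only
exec 3>&-                       # closing the last write end lets the reader finish
wait "$reader"
rm out_fifo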
Edit:
As asked in the comments, there might be a need to limit the number of background processes forked. In that case you can keep track of how many you have started and communicate with them through a named pipe.
mkfifo tmp # create the named pipe
counter=0
while read ip
do
    if [ $counter -lt 10 ]; then # we are under the limit
        { check $ip; echo 'done' > tmp; } &
        ((counter++))
    else
        read x < tmp # wait for a running process to finish
        { check $ip; echo 'done' > tmp; } &
    fi
done < $hosts
cat tmp > /dev/null # let all the background processes end
rm tmp # remove the fifo
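If your bash is 4.3 or newer, wait -n (wait for any one background job to finish) gives a simpler way to cap concurrency without a fifo; a sketch under that assumption, reusing the same check function:

max_jobs=10
while read ip; do
    check "$ip" &
    if (( $(jobs -rp | wc -l) >= max_jobs )); then
        wait -n                 # block until any single background job exits
    fi
done < "$hosts"
wait                            # wait for the remaining jobs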
Use GNU Parallel (:::: reads the arguments from the $hosts file, one host per line):
parallel check :::: $hosts
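Since check is a shell function, export it so the shells that parallel spawns can see it (assuming bash); -j caps the number of simultaneous jobs, mirroring the limit of 10 above:

export -f check                       # make the bash function available to parallel's sub-shells
parallel -j 10 check :::: "$hosts"    # at most 10 check jobs at a time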
You can start multiple processes, each calling the function check, and wait for them all to finish.
while read line
do
    ip=$line
    check &
done < $hosts
wait # wait for all child processes to finish
Whether this increases the speed depends on the available processors and on the implementation of check. You have to ensure there is no data dependency in check between iterations.
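For instance, a hypothetical check that only touches per-host state is safe to run this way; each backgrounded call gets its own copy of $ip and writes to its own file (the ping test and the result.$ip naming are assumptions):

check() {
    # $ip is copied into the child when "check &" forks, so iterations don't share state
    if ping -c 1 -W 1 "$ip" > /dev/null 2>&1; then
        echo "$ip is up"   > "result.$ip"
    else
        echo "$ip is down" > "result.$ip"
    fi
}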