I have many .sh scripts in a single folder and would like to run them one after another. A single script can be executed as:
bash wget-some_long_
I ran into this problem in a situation where I couldn't use loops and run-parts only worked with cron.
foo () {
    bash -H "$1"   # run the script passed as the first argument (quoted in case the name contains spaces)
    #echo "$1"
    #cat "$1"
}
cd /dat/dat1/files #change directory
export -f foo #export foo
parallel foo ::: *.sh #equivalent to putting a & in between each script
This uses GNU parallel, which executes every script in the directory, with the added benefit that the scripts run concurrently and therefore finish much faster. It isn't limited to script execution either: you can put any command in the function and it will work.
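If you need to cap how many scripts run at once (for example, so a batch of wget scripts doesn't saturate the network), GNU parallel's -j option limits the number of concurrent jobs. A minimal sketch, assuming the same foo function and directory as above:
cd /dat/dat1/files
export -f foo
parallel -j 4 foo ::: *.sh   # run at most 4 scripts at a time; -j 1 makes them strictly sequential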
Use this:
for f in *.sh; do # or wget-*.sh instead of *.sh
bash "$f" -H
done
If you want to stop the whole execution when a script fails:
for f in *.sh; do
bash "$f" -H || break # execute successfully or break
# Or more explicitly: if this execution fails, then stop the `for`:
# if ! bash "$f" -H; then break; fi
done
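If you would rather keep going and only report which scripts failed at the end, one possible variant (a sketch, not part of the original answer) collects the failing names in an array:
failed=()
for f in *.sh; do
    bash "$f" -H || failed+=("$f")   # remember scripts that exited non-zero
done
if [ "${#failed[@]}" -gt 0 ]; then
    printf 'Failed: %s\n' "${failed[@]}"
fi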
If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:
for i in $(seq 1 10); do
bash "x$i.sh" -H
done
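In bash you can also use brace expansion instead of seq, which avoids spawning an external command; a small equivalent sketch:
for i in {1..10}; do
    bash "x$i.sh" -H
done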
There is a much simpler way: you can use the run-parts command, which will execute all scripts in the folder:
run-parts /path/to/folder
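One caveat: on Debian-based systems, run-parts by default only runs files whose names consist of letters, digits, underscores, and hyphens, so scripts named like wget-something.sh are skipped unless you supply a matching --regex. You can preview what would run with --test. A sketch, assuming the Debian (debianutils) version of run-parts:
run-parts --test --regex '\.sh$' /path/to/folder   # list the scripts run-parts would execute
run-parts --regex '\.sh$' /path/to/folder          # actually run them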