Run all shell scripts in folder

醉话见心 2020-12-30 01:24

I have many .sh scripts in a single folder and would like to run them one after another. A single script can be executed as:

    bash wget-some_long_


        
3 Answers
  • 2020-12-30 01:39

    I ran into this problem in a case where I couldn't use loops and run-parts didn't work with cron.

    Answer:

    foo () {
        bash -H "$1"   # run a single script; quote "$1" to handle odd filenames
        #echo "$1"
        #cat "$1"
    }
    cd /dat/dat1/files    # change to the directory holding the scripts
    export -f foo         # export foo so parallel's subshells can call it
    parallel foo ::: *.sh # equivalent to putting a & between each script
    

    This uses GNU parallel, which executes everything in the directory, with the added benefit that the scripts run concurrently and the whole batch finishes much faster. And it isn't limited to script execution: you can put any command in the function and it will work.
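
    If you also want to cap how many scripts run at once, or keep the output in input order, a small variation of the same idea (my own sketch; -j and --keep-order are standard GNU parallel options) would be:

    # Sketch: reuse the same foo function, but run at most 4 scripts at a
    # time and print their output in the order of the input list.
    cd /dat/dat1/files
    export -f foo
    parallel -j 4 --keep-order foo ::: *.sh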

  • 2020-12-30 01:46

    Use this:

    for f in *.sh; do  # or wget-*.sh instead of *.sh
      bash "$f" -H 
    done
    

    If you want to stop the whole execution when a script fails:

    for f in *.sh; do
      bash "$f" -H || break  # execute successfully or break
      # Or more explicitly: if this execution fails, then stop the `for`:
      # if ! bash "$f" -H; then break; fi
    done
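
    If you also want to see which script failed and why before stopping, a slightly longer variant (my own sketch) captures the exit status:

    for f in *.sh; do
      bash "$f" -H
      status=$?
      if [ "$status" -ne 0 ]; then
        echo "Stopping: $f exited with status $status" >&2
        break
      fi
    done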
    

    If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:

    for i in $(seq 1 10); do
      bash "x$i.sh" -H
    done
    
  • 2020-12-30 01:46

    There is a much simpler way: you can use the run-parts command, which will execute all scripts in the folder:

    run-parts /path/to/folder
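
    One caveat (my own note; behavior can vary by distribution): the scripts must be executable, and Debian's run-parts by default only runs files whose names consist of letters, digits, underscores, and hyphens, so names ending in .sh are skipped unless you relax the filename filter, roughly like:

    # --regex overrides run-parts' default filename filter so that
    # scripts keeping a .sh suffix are picked up as well.
    run-parts --regex '\.sh$' /path/to/folder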
    