SLURM sbatch job array for the same script but with different input arguments run in parallel

别跟我提以往 2021-02-04 16:58

I have a problem where I need to launch the same script but with different input arguments.

Say I have a script that I invoke as myscript.py -p -i

3 Answers
  •  误落风尘
    2021-02-04 17:56

    If you use SLURM job arrays, you can linearise the indices of your two for loops and then compare the linearised loop index with the array task id:

    #!/bin/bash
    #SBATCH --job-name=cv_01
    #SBATCH --output=cv_analysis_eis-%j.out
    #SBATCH --error=cv_analysis_eis-%j.err
    #SBATCH --partition=gpu2
    #SBATCH --nodes=1
    #SBATCH --cpus-per-task=4
    #SBATCH -a 0-154
    
    # N x M = 5 * 31 = 155 tasks, array indices 0-154
    
    p1_arr=(0.05 0.075 0.1 0.25 0.5)
    
    # SLURM_ARRAY_TASK_ID=154 # comment in for testing
    
    for ip1 in {0..4} # 5 steps
    do
        for i in {0..150..5} # 31 steps
        do
            task_id=$(( i/5 + 31*ip1 ))
    
            # printf $task_id"\n" # comment in for testing
    
            if [ "$task_id" -eq "$SLURM_ARRAY_TASK_ID" ]
            then
              p1=${p1_arr[ip1]}
              # printf "python myscript.py -p $p1 -v $i\n" # comment in for testing
              python myscript.py -p "$p1" -v "$i"
            fi
        done
    done
    

    This answer is pretty similar to Carles's. I would therefore have preferred to write it as a comment, but I do not have enough reputation.
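
    The double loop in the script above exists only to invert the linearisation at run time. A simpler variant, sketched below under the same assumptions (a 5 x 31 grid, the same p1_arr values, and -v stepping by 5), recovers both loop variables directly from SLURM_ARRAY_TASK_ID with integer division and modulo, so each array task does no looping at all:

    ```shell
    #!/bin/bash
    # Fall back to task 0 when run outside of SLURM, for local testing.
    task_id=${SLURM_ARRAY_TASK_ID:-0}

    p1_arr=(0.05 0.075 0.1 0.25 0.5)

    # Invert task_id = (i/5) + 31*ip1:
    ip1=$(( task_id / 31 ))       # outer index: which p1 value (0..4)
    i=$(( (task_id % 31) * 5 ))   # inner value for -v (0, 5, ..., 150)

    p1=${p1_arr[ip1]}
    echo python myscript.py -p "$p1" -v "$i"
    ```

    With #SBATCH -a 0-154 this covers the same 155 combinations; replace the echo with the actual python call once the printed commands look right.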
