SLURM sbatch job array for the same script but with different input arguments run in parallel

别跟我提以往 2021-02-04 16:58

I have a problem where I need to launch the same script but with different input arguments.

Say I have a script invoked as myscript.py -p <p1> -v <v>, and I need to run it for many combinations of the two arguments.

3 Answers
  •  死守一世寂寞
    2021-02-04 17:38

    The best approach is to use job arrays.

    One option is to pass the parameter p1 at submission time: you keep a single job script, but you have to submit it multiple times, once for each p1 value.

    The code will be like this (untested):

    #!/bin/bash
    #SBATCH --job-name=cv_01
    #SBATCH --output=cv_analysis_eis-%A-%a.out
    #SBATCH --error=cv_analysis_eis-%A-%a.err
    #SBATCH --partition=gpu2
    #SBATCH --nodes=1
    #SBATCH --cpus-per-task=4
    # Task IDs 0, 5, 10, ..., 150 (step of 5)
    #SBATCH -a 0-150:5
    
    python myscript.py -p "$1" -v "$SLURM_ARRAY_TASK_ID"
    

    and you will submit it with:

    sbatch my_jobscript.sh 0.05
    sbatch my_jobscript.sh 0.075
    ...
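With this approach, the -a 0-150:5 directive fans each submission out into task IDs 0, 5, 10, ..., 150, so a single sbatch call covers every -v value for one p1. A quick local sketch of the commands one submission would expand to (plain bash, no SLURM required; 0.05 here just stands in for the $1 you would pass to sbatch):

```shell
# Simulate the task IDs generated by "#SBATCH -a 0-150:5" for one submission.
p1=0.05                                # stands in for $1, the submitted p1 value
cmds=()
for task_id in $(seq 0 5 150); do      # 0, 5, 10, ..., 150 -> 31 tasks
    cmds+=("python myscript.py -p $p1 -v $task_id")
done
printf '%s\n' "${cmds[@]}"
```

Each line printed corresponds to the command one array task would run.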
    

    Another approach is to define all the p1 parameters in a bash array and submit a single NxM job array (untested):

    #!/bin/bash
    #SBATCH --job-name=cv_01
    #SBATCH --output=cv_analysis_eis-%A-%a.out
    #SBATCH --error=cv_analysis_eis-%A-%a.err
    #SBATCH --partition=gpu2
    #SBATCH --nodes=1
    #SBATCH --cpus-per-task=4
    # Make the array size N*M (N p1 values times M v values)
    #SBATCH -a 0-150
    
    PARRAY=(0.05 0.075 0.1 0.25 0.5)
    
    # p1 is the element of the array at index ARRAY_ID mod P_ARRAY_LENGTH
    p1=${PARRAY[$((SLURM_ARRAY_TASK_ID % ${#PARRAY[@]}))]}
    # v is the integer division of the ARRAY_ID by the length of the array
    v=$((SLURM_ARRAY_TASK_ID / ${#PARRAY[@]}))
    python myscript.py -p "$p1" -v "$v"
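The mod/div mapping is easy to sanity-check outside SLURM by substituting a few hand-picked task IDs for $SLURM_ARRAY_TASK_ID (PARRAY as above; the IDs below are arbitrary examples):

```shell
# Reproduce the ARRAY_ID -> (p1, v) mapping locally, without SLURM.
PARRAY=(0.05 0.075 0.1 0.25 0.5)
for task_id in 0 4 5 7 149; do
    p1=${PARRAY[$((task_id % ${#PARRAY[@]}))]}   # cycles through PARRAY
    v=$((task_id / ${#PARRAY[@]}))               # increments every ${#PARRAY[@]} IDs
    echo "task $task_id -> -p $p1 -v $v"
done
```

Consecutive task IDs walk through all p1 values before v advances, so IDs 0-4 share v=0, IDs 5-9 share v=1, and so on.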
    
