Job arrays

Revision as of 20:03, 14 May 2018 by Dboss


This article is a draft

This is not a complete article: it is a work in progress intended for eventual publication in the main wiki, and should not yet be considered factual or authoritative.




A large number of tasks which differ only in some parameter can be conveniently submitted as a job array, also known as a task array or an array job. The individual tasks in the array are distinguished by an environment variable, $SLURM_ARRAY_TASK_ID, which Slurm sets to different values according to the range you supply with the --array parameter.

sbatch --array=0-7 ...      # $SLURM_ARRAY_TASK_ID will take values from 0 to 7 inclusive
sbatch --array=1,3,5,7 ...  # $SLURM_ARRAY_TASK_ID will take the listed values
sbatch --array=1-7:2 ...    # Step-size of 2, does the same as the previous example
sbatch --array=1-100%10 ... # Allow no more than 10 of the jobs to run simultaneously
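To see which task IDs a given range expression generates, the expansion can be simulated in plain bash outside of Slurm; `seq first step last` mirrors the `first-last:step` syntax. This is purely illustrative: Slurm performs the expansion itself when the job is submitted.

```shell
# Simulate the task IDs Slurm would assign for two of the specs above.
ids_simple="$(echo $(seq 0 7))"     # --array=0-7
ids_step="$(echo $(seq 1 2 7))"     # --array=1-7:2

echo "$ids_simple"                  # 0 1 2 3 4 5 6 7
echo "$ids_step"                    # 1 3 5 7
```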

See Job Array Support at SchedMD.com for detailed documentation.

A simple example

$ sbatch --array=1-10 runme
Submitted batch job 54321

Job 54321 will be scheduled as 10 independent tasks, which may start at different times on different hosts. Each task receives its own value of $SLURM_ARRAY_TASK_ID, which the script can reference to select an input file, for example, or to set a command-line argument for the application code:

my_app <input.$SLURM_ARRAY_TASK_ID
my_app $SLURM_ARRAY_TASK_ID some_arg another_arg

Here is an example of bash code that you can insert into your Slurm batch script; it uses $SLURM_ARRAY_TASK_ID to select a different input file from a directory named datadir. Note that bash arrays are indexed from zero, so submit such a job with an array range that starts at 0 (e.g. --array=0-9 for ten files):

FILES=(datadir/*)
input_file="${FILES[$SLURM_ARRAY_TASK_ID]}"
my_app "$input_file"
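The selection logic above can be checked locally, without Slurm, by setting $SLURM_ARRAY_TASK_ID by hand. The directory and file names below are made up for illustration, and `echo` stands in for the application:

```shell
# Local dry run of the file-selection logic (no Slurm needed).
workdir="$(mktemp -d)" && cd "$workdir"
mkdir datadir
touch datadir/a.dat datadir/b.dat datadir/c.dat

FILES=(datadir/*)                      # globs expand in sorted order
for SLURM_ARRAY_TASK_ID in 0 1 2; do   # the values Slurm would set
    input_file="${FILES[$SLURM_ARRAY_TASK_ID]}"
    echo "task $SLURM_ARRAY_TASK_ID reads $input_file"
done
```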

Using a job array instead of a large number of separate serial jobs has advantages for other users and the system as a whole. A waiting job array only produces one line of output in squeue, making it easier for you to read its output. The scheduler does not have to analyze job requirements for each array task separately, so it can run more efficiently too.

Running the same script in multiple directories

This example assumes that you have multiple directories, each with the same structure, and you want to run the same script in each of these directories. You could use the following bash script:

File : script.sh

#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --time=2:00:00

echo "Starting run at: $(date)"

# Build a list of all subdirectories of the current directory.
DIRS=($(find . -maxdepth 1 -type d \( ! -name . \)))
# Task IDs start at 1 but bash arrays start at 0, so subtract 1.
DIR=${DIRS[$SLURM_ARRAY_TASK_ID-1]}

cd "$DIR" || exit 1
pwd

# Place the code to execute here


and submit it using the following command:

[name@server ~]$ N=$(find . -maxdepth 1 -type d \( ! -name . \) | wc -l)
[name@server ~]$ sbatch --array=1-$N script.sh
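The mapping from task IDs to directories can be verified locally before submitting. The sketch below uses made-up directory names and adds a `sort` so the order is deterministic; note that bash arrays are zero-indexed, so task ID 1 maps to the first element of the list:

```shell
# Local check (no Slurm): create example run directories and show
# which directory each array task would work in.
workdir="$(mktemp -d)" && cd "$workdir"
mkdir run1 run2 run3

DIRS=($(find . -maxdepth 1 -type d \( ! -name . \) | sort))
N=${#DIRS[@]}                             # same count as the wc -l pipeline
for SLURM_ARRAY_TASK_ID in $(seq 1 $N); do
    DIR=${DIRS[$SLURM_ARRAY_TASK_ID-1]}   # bash arrays start at 0
    echo "task $SLURM_ARRAY_TASK_ID runs in $DIR"
done
```

This breaks if any directory name contains whitespace; for production use, prefer a find/read loop or `--null`-delimited handling.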