Hi Amjad,

Amjad Syed via slurm-users <slurm-users@lists.schedmd.com> writes:

> Hello
>
> I have the following scenario:
> I need to submit a sequence of up to 400 jobs where the even jobs depend on
> the preceding odd job to finish and every odd job depends on the presence of
> a file generated by the preceding even job (availability of the file for the
> first of those 400 jobs is guaranteed).
>
> If I just submit all those jobs via a loop using dependencies, then I end up
> with a lot of pending jobs which might later not even run, because no output
> file has been produced by the preceding jobs. Is there a way to pause the
> submission loop until the required file has been generated, so that at most
> two jobs are submitted at the same time?
>
> Here is a sample submission script showing what I want to achieve.
>
> for i in {1..200}; do
>     FILE=GHM_paramset_${i}.dat
>     # How can I pause the submission loop until FILE has been created????
>     #if test -f "$FILE"; then
>         # On the first iteration jobid3 is unset, so submit without a
>         # dependency; afterwards chain on the previous odd job.
>         jobid4=$(sbatch --parsable ${jobid3:+--dependency=afterok:$jobid3} job4_sub $i)
>         jobid3=$(sbatch --parsable --dependency=afterok:$jobid4 job3_sub $i)
>     #fi
> done
>
> Any help will be appreciated
>
> Amjad

You might find a job array useful for this.  For any large number of jobs
with identical resource requirements, a job array also helps the backfill
scheduler work efficiently, if backfill is enabled on your cluster.

With a job array you can specify how many jobs should run simultaneously
with the '%' notation:

  --array=1-200%2
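
As a sketch of how that might look: both steps for one parameter set
could be combined into a single array job script, so that the second
step only runs once the first has succeeded and produced its file.
(The names job4_sub, job3_sub and GHM_paramset_${i}.dat are taken from
your loop; adjust resource options to suit.)

```shell
#!/bin/bash
# Submit once with:  sbatch array_job.sh
# '%2' caps the number of simultaneously running tasks at two.
#SBATCH --array=1-200%2
#SBATCH --job-name=GHM_chain

i=${SLURM_ARRAY_TASK_ID}
FILE=GHM_paramset_${i}.dat

# Run both steps of one parameter set inside the same array task.
srun job4_sub "$i"           # assumed to produce $FILE
test -f "$FILE" || exit 1    # stop this task if the file is missing
srun job3_sub "$i"
```

Note that array tasks are independent of each other, so if a later
parameter set must wait for an earlier one to finish completely, you
would still need to add a dependency between tasks.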

Cheers,

Loris

-- 
Dr. Loris Bennett (Herr/Mr)
FUB-IT (ex-ZEDAT), Freie Universität Berlin

-- 
slurm-users mailing list -- slurm-users@lists.schedmd.com
To unsubscribe send an email to slurm-users-le...@lists.schedmd.com
