Hi,
I am using Slurm as a single-node job batching system. Slurm is perfect
for that use case and has worked flawlessly for a couple of years. Lately
I have been shuffling jobs around so that the long-running ones only run
daily, while other jobs run more frequently.

A question I had: is there a way to lock jobs so they don't run
multiple times? Or better: I have a list of jobs with heavy
dependencies, and I'd like to run this job list again only once all
of them have completed.

So I could create a lock and a cleanup job which removes that
lock and depends on all the other jobs I queue in this batch.
A rough sketch of what I mean is below.
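
Roughly what I have in mind (an untested sketch; the osm-*.sh script
names and the lock path are just placeholders, not my real job list):

        #!/bin/bash
        # Untested sketch of the lock + cleanup idea.

        touch /var/lock/osm-batch.lock

        # Submit the job list and collect the job ids.
        deps=""
        for script in osm-import.sh osm-render.sh osm-stats.sh; do
                jobid=$(sbatch --parsable "$script")
                deps="${deps}:${jobid}"
        done

        # The cleanup job starts only after all listed jobs have terminated
        # (afterany = regardless of exit status) and would remove the lock.
        sbatch --parsable --dependency="afterany${deps}" cleanup.sh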

Currently I have something like this in my cron scripts: it looks
into the job queue and, if it still finds matching jobs, does not
queue new ones.

        # Bail out while jobs matching "osm" are still running
        if squeue -l | grep -Eq "osm.*RUNNING"; then
                exit 0
        fi
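
The whole cron wrapper then looks roughly like this (again only a
sketch, the osm-*.sh names are placeholders):

        #!/bin/bash
        # Cron wrapper sketch: skip this run while the previous batch is
        # still in the queue, otherwise queue the whole job list again.

        if squeue -l | grep -Eq "osm.*RUNNING"; then
                exit 0
        fi

        for script in osm-import.sh osm-render.sh osm-stats.sh; do
                sbatch "$script"
        done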

So I run the cron job much more often than I can actually process all of
the data. This feels a bit like a hack.

Flo
-- 
Florian Lohoff                                                 f...@zz.de
        UTF-8 Test: The 🐈 ran after a 🐁, but the 🐁 ran away
