Dimuthu:

For gateways with specific applications, the application or executable needs to
be transferred to the Condor pool along with the user inputs, which may require
unpacking a tarball (or some other mechanism) and identifying the actual
executable via a driver script. The driver script is what the gateway currently
registers as the executable application, but the actual executable or a
container (the tarball above) also needs to be transferred before the execution
can be set up. These steps require a pre-job script or a way to add additional
transfer_input_files instructions at the connecting HTCondor host.
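
As a rough sketch (the file names here are purely illustrative, not from the
current gateway code), the submit description would stage the bundle via
transfer_input_files, and a small driver script would unpack it and invoke the
real executable:

    # excerpt of a hypothetical HTCondor submit description
    executable              = driver.sh
    transfer_input_files    = app.tar.gz, input.dat
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue

    #!/bin/bash
    # driver.sh (hypothetical): unpack the application bundle, then run it
    tar -xzf app.tar.gz
    ./app/bin/actual_executable input.dat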

Of course, DAGs are quite diverse and can be used for various other use cases,
such as parameter sweeps with different inputs (which, again, could be packaged
in tarballs and would need extraction after staging), etc.
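
For instance, a small sweep could be expressed in a DAG roughly like this (node
and file names are hypothetical), with sweep.sub referring to $(bundle) in its
transfer_input_files line:

    # sweep.dag (hypothetical): same submit file, different input bundle per node
    JOB  sweep0  sweep.sub
    VARS sweep0  bundle="params_0.tar.gz"
    JOB  sweep1  sweep.sub
    VARS sweep1  bundle="params_1.tar.gz"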

Thanks,
Sudhakar.

From: DImuthu Upeksha <dimuthu.upeks...@gmail.com>
Reply-To: "dev@airavata.apache.org" <dev@airavata.apache.org>
Date: Thursday, January 5, 2023 at 10:14 AM
To: "dev@airavata.apache.org" <dev@airavata.apache.org>
Subject: Re: Regarding Pre and Post job commands in Airavata

Hi Dinuka,

Sorry for the late reply. It is great to explore options for integrating DAG
capabilities into the job submitter. We already have some form of HTCondor
support in Airavata. Can you summarize the difference between what we already
have for HTCondor and this DAG feature? I am specifically looking for practical
use cases rather than technical differences.

Thanks
Dimuthu

On Sun, Jan 1, 2023 at 10:35 AM Dinuka De Silva
<l.dinukadesi...@gmail.com> wrote:
Hi,

The current implementation of mapping these (Pre, Post, and other job commands)
to the job scheduler script assumes that the scheduler script is a kind of
shell script. So the order of execution follows the order in which the commands
are listed in the script, as below.

- Module Commands
- Pre Job Commands
- Job Submitter Command
- Post Job Commands
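
For example, a SLURM script generated in that order ends up looking roughly
like the following (the specific commands are illustrative only):

    #!/bin/bash
    #SBATCH --job-name=sample_job
    #SBATCH --time=00:30:00

    # Module commands
    module load openmpi

    # Pre job commands
    tar -xzf inputs.tar.gz

    # Job submitter command
    mpirun -np 4 ./application input.dat

    # Post job commands
    tar -czf outputs.tar.gz results/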

The scheduler scripts for SLURM, FORK, LSF, UGE, and PBS are shell scripts,
while HTCondor (and perhaps some other job schedulers) uses a different file
type. The script grammar in HTCondor does not support appending shell commands
inside it. Now we need to support Pre, Post, and other commands for HTCondor,
and the current design doesn't support that.

In HTCondor there's an option [1] to configure pre- and post-scripts around a
job. But then the job has to be described as a DAG, and the pre-script,
post-script, and job script have to be separate files. So I tried a sample and
am planning to put this into Airavata.
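
For illustration, such a DAG could look roughly like the following (file names
are placeholders), where pre.sh and post.sh would carry the pre- and post-job
commands and job.sub is the ordinary submit description for the job itself:

    # sample.dag (illustrative)
    JOB  A  job.sub
    SCRIPT PRE  A  pre.sh
    SCRIPT POST A  post.sh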

[1] https://htcondor.readthedocs.io/en/latest/users-manual/dagman-workflows.html

Thanks & Regards,
Dinuka

