Dimuthu:

Yes, the working directory on the remote HPC cluster.

The workflow may look like this (a rough sketch of the lifecycle follows the steps):

1. The user launches a job.
2. The remote working directory, dynamically defined by Airavata during the launch of the experiment, is registered as an accessible remote disk.
3. The contents are made available read-only for users to read/download.
4. The directory is removed from the set of accessible disks when the experiment ends.
5. The rest of the Helix tasks continue.
…
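
To make the register / read-only access / unregister sequence concrete, here is a minimal sketch. All of the names in it (MftStorageRegistry, RemoteDirectoryHandle, registerReadOnly, unregister) are hypothetical placeholders, not the actual MFT or Airavata API; it only illustrates the lifecycle described above.

    // Hypothetical sketch only -- none of these names are the real Airavata/MFT API.
    public class WorkingDirAccessSketch {

        /** Handle returned when a remote directory is registered as a storage entry. */
        interface RemoteDirectoryHandle {
            String storageId();
        }

        /** Stand-in for whatever storage-registration service MFT ends up exposing. */
        interface MftStorageRegistry {
            // Register the job's working directory on the given host as a read-only entry.
            RemoteDirectoryHandle registerReadOnly(String hostName, String workingDirPath);

            // Remove the entry so the directory is no longer exposed.
            void unregister(RemoteDirectoryHandle handle);
        }

        /** Rough lifecycle matching steps 2-4 above. */
        static void exposeWorkingDirForExperiment(MftStorageRegistry registry,
                                                  String clusterHost,
                                                  String workingDir) {
            // Step 2: register the dynamically created working directory at launch.
            RemoteDirectoryHandle handle = registry.registerReadOnly(clusterHost, workingDir);
            try {
                // Step 3: while the job runs, users browse/download files read-only
                // through MFT using handle.storageId().
            } finally {
                // Step 4: when the experiment ends, stop exposing the directory.
                registry.unregister(handle);
            }
        }
    }

The try/finally is only there to emphasize that unregistration should happen even if a later Helix task fails.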


Thanks,
Sudhakar.

From: Dimuthu Upeksha <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Thursday, March 26, 2020 at 12:23 PM
To: Airavata Dev <[email protected]>
Subject: Re: MFT and data access for running jobs

Sudhakar,

I’m not sure whether I understood your point about this remote working directory
correctly. Are you talking about the working directory on the cluster? Could you
please explain the workflow in more detail?

Thanks,
Dimuthu

On Thu, Mar 26, 2020 at 10:21 AM Pamidighantam, Sudhakar 
<[email protected]<mailto:[email protected]>> wrote:
Dimuthu:

When MFT becomes available, would there be a way to define the remote
working directory as a device that provides access to the data there?
As you know, this has been a long-standing need, particularly for long-running jobs.

Thanks,
Sudhakar.
