There is a working S3 transport in my local copy; I will commit it once I have tested it properly. You can follow the same pattern for any cloud provider whose client supports streaming IO. Streaming between different transfer protocols inside an Agent is discussed in the last part of this document [1]. Try to get the conceptual idea from there and reverse engineer the SCP transport.
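To make the streaming idea concrete: inside an Agent, the receiving protocol writes bytes into a shared stream while the sending protocol reads from it, so nothing is staged on disk. Here is a minimal, self-contained Java sketch of that pattern using a pipe between two threads. The class name and the simplified flow are my own illustration, not the actual MFT transport API; a real S3 sender would hand the sink stream to something like the AWS SDK's putObject instead of printing it.

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.nio.charset.StandardCharsets;

public class StreamingPipeDemo {
    public static void main(String[] args) throws Exception {
        // Shared pipe connecting the two transport sides.
        PipedOutputStream source = new PipedOutputStream();
        PipedInputStream sink = new PipedInputStream(source);

        // "Receiver" thread: stands in for the SCP side reading from the
        // remote endpoint and writing into the pipe as bytes arrive.
        Thread receiver = new Thread(() -> {
            try (source) {
                source.write("hello from source transport"
                        .getBytes(StandardCharsets.UTF_8));
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        });
        receiver.start();

        // "Sender" side: stands in for the cloud client consuming the
        // stream (e.g., an S3 upload that takes an InputStream).
        byte[] buf = sink.readAllBytes();
        receiver.join();
        System.out.println(new String(buf, StandardCharsets.UTF_8));
    }
}
```

Because both sides run concurrently against the same pipe, the transfer is bounded by the slower endpoint rather than by local disk, which is why any cloud client with streaming IO can slot into the same pattern.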
[1] https://docs.google.com/document/d/1zrO4Z1dn7ENhm1RBdVCw-dDpWiebaZEWy66ceTWoOlo

Dimuthu

On Sat, Apr 4, 2020 at 9:22 PM Aravind Ramalingam <[email protected]> wrote:

> Hello,
>
> We were looking at the existing code in the project. We could find
> implementations only for local copy and SCP.
> We were confused about how to proceed with an external provider like S3 or
> Azure, since it would require integrating with their respective clients.
>
> Thank you
> Aravind Ramalingam
>
>> On Apr 4, 2020, at 21:15, Suresh Marru <[email protected]> wrote:
>>
>> Hi Aravind,
>>
>> I have to catch up with the code, but you may want to look at the S3
>> implementation and extend it to Azure, GCP, or other cloud services like
>> Box, Dropbox, and so on.
>>
>> There could be many use cases; here is an idea:
>>
>> * Compute a job on a supercomputer with SCP access and push the outputs
>> to a cloud storage.
>>
>> Suresh
>>
>>> On Apr 4, 2020, at 8:09 PM, Aravind Ramalingam <[email protected]> wrote:
>>>
>>> Hello,
>>>
>>> We set up the MFT project on a local system and tested SCP transfer
>>> between JetStream VMs. We were wondering how the support can be extended
>>> to AWS/GCS.
>>>
>>> As per our understanding, the current implementation supports
>>> two protocols, i.e. local-transport and scp-transport. Would we have to
>>> modify/add to the code base to extend support to AWS/GCS clients?
>>>
>>> Could you please provide suggestions for this use case.
>>>
>>> Thank you
>>> Aravind Ramalingam
