I haven't tested it, but the documentation says that each run() command
actually acts on an independent shell, so unless I can wrap everything
into a single script/command, I don't see how to do it...
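
For what it's worth, here is the kind of thing I mean by "a single
command" -- an untested sketch with made-up paths, just chaining the
remote steps with && so they all happen in one shell:

    from fabric.api import run

    def remote_step():
        # Each run() opens a fresh shell, so cwd and variables do not
        # carry over between calls; chaining with && keeps the steps in
        # one invocation and stops at the first failure.
        run("cd /srv/app && tar xzf /tmp/release.tgz && ./restart.sh")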

Right now, what I am doing is running a script that does everything
locally (commits, tags, compresses files), then copies the result to the
remote host (the entry point to the private network)... but I cannot
automate any further from there, so I print the remaining commands on
screen and copy-and-paste them to execute them all at once (uncompress,
copy to the other remote servers, update/restart applications, etc.).
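
Roughly, the flow above looks like this as a fabfile (host names and
paths are invented; this is only a sketch of what I have today):

    # fabfile.py
    from fabric.api import env, local, put

    env.hosts = ["gateway.example.com"]  # the single SSH entry point

    def prepare():
        # everything that runs on my own machine
        local("git commit -am 'release'")
        local("git tag v1.0.0")
        local("tar czf release.tgz app/")

    def push():
        # copy the bundle to the entry-point host
        put("release.tgz", "/tmp/release.tgz")
        # ...and this is where I stop and print the remaining commands
        # (uncompress, copy to the internal servers, restart) for
        # copy&paste.

Run with something like "fab prepare push".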

But I reckon this is a dirty workaround, so I wanted a better one, if
anyone has experience with it... then I found Fabric, which looked
promising... but I am not sure it would work in this particular
scenario, so I wanted to know before spending more time coding for it.
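
To make the scenario concrete, what I would need is something like the
sketch below (untested, invented host names), where Fabric talks only to
the gateway and the second hop is plain ssh/scp run *on* the gateway --
which I think is what Amit suggests below, assuming key-based auth is
already set up from the gateway to the internal hosts:

    from fabric.api import run

    INTERNAL_HOSTS = ["app01", "app02"]  # only reachable from the gateway

    def propagate():
        cmd = "cd /srv/app && tar xzf /tmp/release.tgz && ./restart.sh"
        for host in INTERNAL_HOSTS:
            # run() executes on the gateway; the inner scp/ssh does the
            # second hop into the private network.
            run("scp /tmp/release.tgz %s:/tmp/" % host)
            run("ssh %s '%s'" % (host, cmd))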

Any thoughts?

--
Braga, Bruno
www.brunobraga.net
bruno.br...@gmail.com


On Tue, Aug 7, 2012 at 11:58 AM, Amit Saha <amitks...@fedoraproject.org> wrote:

> On Tue, Aug 7, 2012 at 11:55 AM, BRAGA, Bruno <bruno.br...@gmail.com>
> wrote:
> > Hi,
> >
> > I have a system in which I can only get a single SSH entry point to the
> > private network, and from there, propagate application deployments across
> > multiple servers... Is there a way to achieve this with Fabric?
> Basically,
> > it is all about doing SSH within an SSH, and so on... It does not seem to
> > work well with bash (like automating commands to a single script
> execution).
>
> I will have to do something similar very soon. Won't putting in 'ssh'
> under run( ) work? (Just a guess).
>
> Cheers,
> Amit
>
>
> --
> http://echorand.me
>