On 12/17/2012 05:21 PM, Rajeev Prasad wrote:

The following is, I think, timing out when it is run from within my script:


( @cmdresult, $cmderr ) = $ssh->capture($CMD);


where $CMD is:   egrep "data_to_grep" *.data_file.txt

The output is about 300 MB of data.

Further, when the command is run directly on the remote system (after logging in),
it takes only about 30 seconds.

Also, when I run ssh "$CMD" > local.file.save from my localhost (bash shell),
it completes within 2 minutes...

Please advise.

ty.
Rajeev


300 MB of data may be too much for capturing, and unless you are very careful you will end up with a script requiring several GBs of memory to run.

Try saving the output to a file and processing it afterwards line by line, or use Net::OpenSSH's pipe_out method to read and process the data on the fly without storing it all in memory at once.
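A minimal sketch of the streaming approach, assuming a fresh Net::OpenSSH connection; the host, user, and what you do with each line are placeholders, not part of the original post:

#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;

# Placeholder connection details -- substitute your own user/host.
my $ssh = Net::OpenSSH->new('user@remote.host');
$ssh->error and die "ssh connection failed: " . $ssh->error;

my $CMD = 'egrep "data_to_grep" *.data_file.txt';

# pipe_out attaches a read handle to the remote command's stdout,
# so the output can be processed line by line instead of being
# slurped into memory all at once as capture() would do.
my ($out, $pid) = $ssh->pipe_out($CMD)
    or die "pipe_out failed: " . $ssh->error;

while (my $line = <$out>) {
    chomp $line;
    # process each line here, e.g. write the interesting bits to a local file
}

close $out;
waitpid $pid, 0;

With this, memory use stays roughly constant no matter how large the remote output is, since only one line is held at a time.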


