> ------------------------------------------------
> On Wed, 16 Jul 2003 16:49:40 -0500, [EMAIL PROTECTED] wrote:
> 
> > 
> > ------------------------------------------------
> > On Wed, 16 Jul 2003 15:58:39 -0500, "Dan Muey" <[EMAIL PROTECTED]> wrote:
> > 
> > > 
> > > Howdy all,
> > > 
> > > I've confirmed that my test script is choking on retr()
> > > 
> > > I've confirmed that the file exists
> > > 
> > > my $guts;
> > > my $size = $ftp->size($file);   # note: semicolon needed here
> > > if(defined $size) {
> > >   my $retr = $ftp->retr($file); # retr() returns undef on failure -- worth checking
> > >   $retr->read($guts, $size);
> > >   $retr->close();
> > >   $r .= "I am done trying to get $file";
> > > } else { $r .= "$file size is not defined"; }
> > > 
> > > This routine eventually does return $r; and if I change $file to a 
> > > fake name I get the "$file size is not defined" returned.
> > > Otherwise it hangs unless I comment out the 3 $retr lines.
> > > 
> > > Before I go and do a whole bunch more tests, could anyone tell me 
> > > if I'm even doing retr() correctly, and if the way I'm doing it 
> > > will place the contents of $file (on the server I'm connected to 
> > > via FTP) into the variable $guts like I'm expecting.
> > > 
> > > Any insights or experience?
> > > 
> > 
> > Naturally this is the worst possible way to do things (insecure, 
> > slow, bandwidth-wasting in two directions, error-prone), but whatever 
> > floats your cereal.  I am also assuming you have explored all the 
> > other options, e.g. just getting a shell account and using a regular 
> > 'cp' command, etc.
> > 
> > Now, obligatory stuff out of the way: it appears from the docs that 
> > 'retr' sets up the server to begin a file transfer, then you use 
> > 'read' to fill the buffer like you said.  You would then while-loop 
> > over the remote file, reading a buffer's worth at a time (though it 
> > appears you are doing it all at once, which may be a bad way to go 
> > memory-wise depending on file sizes); after the loop the contents of 
> > the file should be in 'guts'.  The easiest test would be to put a 
> > plain text file on the server and just print 'guts'; if it contains 
> > the file contents, then joy!
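
That buffered read loop might look something like the sketch below. This is a minimal illustration, not the poster's actual script: the host, credentials, and filename are placeholders, and error handling is bare die calls.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# Placeholder connection details -- substitute your own.
my $ftp = Net::FTP->new('ftp.example.com') or die "connect failed: $@";
$ftp->login('user', 'pass') or die 'login failed: ', $ftp->message;
$ftp->binary;   # binary mode avoids ASCII line-ending translation

# retr() returns a Net::FTP::dataconn object, or undef on failure.
my $data = $ftp->retr('somefile.txt') or die 'retr failed: ', $ftp->message;

# read() fills the buffer a chunk at a time; looping keeps memory
# usage at the buffer size regardless of the remote file's size.
my ($guts, $buf) = ('', '');
while ((my $n = $data->read($buf, 1024)) > 0) {
    $guts .= $buf;
}
$data->close();
print $guts;
$ftp->quit;
```

The 1024-byte chunk size is arbitrary; larger buffers mean fewer round trips through the loop at the cost of memory.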
> > 
> > To speed things up and keep the memory usage down I would set up two 
> > connections at the same time, one for your retr/read and the other 
> > for your stor/write, then loop over the remote file doing a buffer 
> > at a time; you would essentially set up a pipe.  Though that 
> > approach has its own problems: you need to be allowed two 
> > connections at a time, you risk partial file uploads, etc.
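
A sketch of that two-connection pipe, with both hosts, the credentials, and the filename as placeholder assumptions:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# Placeholder hosts and credentials -- substitute your own.
my $src = Net::FTP->new('ftp.source.example.com') or die "src connect: $@";
$src->login('user', 'pass') or die 'src login: ', $src->message;
$src->binary;

my $dst = Net::FTP->new('ftp.dest.example.com') or die "dst connect: $@";
$dst->login('user', 'pass') or die 'dst login: ', $dst->message;
$dst->binary;

# Open both data connections, then shuttle one buffer at a time:
# the file never has to fit in memory all at once.
my $in  = $src->retr('somefile.txt') or die 'retr failed: ', $src->message;
my $out = $dst->stor('somefile.txt') or die 'stor failed: ', $dst->message;

my $buf;
while ((my $n = $in->read($buf, 1024)) > 0) {
    $out->write($buf, $n) == $n or die 'short write';
}
$in->close();
$out->close();
$src->quit;
$dst->quit;
```

If either close() fails you may be left with the partial upload the poster warns about, so checking the return values of close() is worthwhile in real use.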
> > 
> > In any case, the easiest method to debug this is probably just to 
> > try it with plain text files and see what you get ....

I've tried that, it dies on the initial retr();

The files it's moving are pretty small (1-4K) as they are created by the 
same program, so I think size doesn't matter, well err, you know what I mean ;p

The source! Of course! Thanks for the kick in the pants, I bet I'll find my error there.

Thanks!

Dan

> > 
> 
> p.s. Have a look at the source for Net::FTP, particularly the get/put 
> methods, as they are implemented using this same technique 
> (retr/read and stor/write).
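
Since get/put wrap this same retr/read loop internally, the simplest route may be to let Net::FTP do the buffering itself. A minimal sketch, with host, credentials, and filenames as placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# Placeholder connection details -- substitute your own.
my $ftp = Net::FTP->new('ftp.example.com') or die "connect failed: $@";
$ftp->login('user', 'pass') or die 'login failed: ', $ftp->message;
$ftp->binary;

# get() does the retr/read/close dance internally, writing to a local
# file, so there is no buffer or dataconn object to manage yourself.
$ftp->get('somefile.txt', '/tmp/somefile.txt')
    or die 'get failed: ', $ftp->message;
$ftp->quit;
```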
> 
> > http://danconia.org
> 

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]