I'm accessing around 95 tar files on an FTP server, ranging in size from
10 to 40 MB apiece.

While I can certainly click on them and download them outside of R, I'd like
my script to do it.

Retrieving the FTP directory listing with RCurl works fine (about 90% of the time).
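For context, the listing step looks roughly like the sketch below (the server
URL and directory are placeholders for the real ones):

library(RCurl)

## Placeholder FTP location -- the real server and path differ.
ftp_dir <- "ftp://ftp.example.com/pub/archives/"

## dirlistonly asks libcurl for file names only; ftp.use.epsv = FALSE
## avoids EPSV negotiation problems on some servers.
listing   <- getURL(ftp_dir, ftp.use.epsv = FALSE, dirlistonly = TRUE)
files     <- strsplit(listing, "\r*\n")[[1]]
tar_files <- grep("\\.tar$", files, value = TRUE)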

Downloading the files by looping through them, however, fails unpredictably:
I may get anywhere from one to eight files downloaded before download.file()
throws the error "cannot open URL".

Sometimes I can get only one file before this error. With tryCatch() I've been
able to do some cleanup after the failure, but automating the whole download
process has turned into a bit of a circus.

The arguments (url, destfile, mode) in the download.file() call are all
correct, since a second attempt at the same URL will often succeed.
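
In rough terms, the loop I've been running wraps download.file() in tryCatch()
and retries each file a few times before giving up; ftp_dir and tar_files come
from the listing sketch above, and the local "downloads" directory is a
placeholder:

download_with_retry <- function(url, destfile, tries = 3) {
  for (i in seq_len(tries)) {
    ok <- tryCatch({
      ## download.file() returns 0 on success
      status <- download.file(url, destfile, mode = "wb")
      status == 0
    }, error = function(e) {
      message(sprintf("attempt %d for %s failed: %s",
                      i, url, conditionMessage(e)))
      FALSE
    })
    if (ok) return(TRUE)
    Sys.sleep(5)  # short pause before the next attempt
  }
  FALSE
}

dir.create("downloads", showWarnings = FALSE)
for (f in tar_files) {
  download_with_retry(paste0(ftp_dir, f), file.path("downloads", f))
}

The retries help, since second attempts usually go through, but it feels like
papering over whatever is actually failing.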

Is there any way to get a deeper look at the cause of the problem? I've tried
closing all connections between downloads. Any pointers would be welcome.
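
One thing I've been considering, in case it points at the cause, is pulling a
single problem file with RCurl and libcurl's verbose output turned on, roughly
like this (the file name is a placeholder):

library(RCurl)

## verbose = TRUE prints the FTP conversation as the transfer proceeds, which
## should show whether it's a timeout, a dropped connection, or something else.
bin <- getBinaryURL(paste0(ftp_dir, "some_archive.tar"),
                    verbose = TRUE, ftp.use.epsv = FALSE)
writeBin(bin, file.path("downloads", "some_archive.tar"))

## For download.file()'s internal method, lowering internet.info should also
## print more detail about each transfer.
options(internet.info = 0)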

