Dear all,

I am looking for a way to work out whether a file on the internet exists before 
attempting to download it with the function download.file(). For example,
using a URL that does not exist:

url <- "http://finance.yahoo.com/ftse.csv"
destfile <- tempfile()
download.file(url = url, destfile = destfile)

# gives the following response ...

trying URL 'http://finance.yahoo.com/ftse.csv'
Error in download.file(url = url, destfile = destfile) : 
  cannot open URL 'http://finance.yahoo.com/ftse.csv'
In addition: Warning message:
In download.file(url = url, destfile = destfile) :
  cannot open: HTTP status was '404 Not Found'
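Something along the lines of the sketch below is roughly the check I am after, 
though I am not sure it is a robust way to detect a missing file; url_exists() 
is just a name I have made up for the helper (base R only):

url_exists <- function(u) {
    # Try to open a read connection to the URL; a 404 (or any other
    # failure) raises an error, which is translated into FALSE.
    con <- url(u)
    on.exit(close(con))
    tryCatch({
        suppressWarnings(open(con, "rt"))
        TRUE
    }, error = function(e) FALSE)
}

url_exists("http://finance.yahoo.com/ftse.csv")   # expect FALSE here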

When I use download.file() in a loop over multiple URLs, the above error causes 
the loop to terminate, so I want to avoid this by first checking whether the 
file exists and then wrapping the subsequent calls in an if() statement. The 
original fault came from the function get.hist.quote() in the "tseries" 
package: I was iterating over various stocks, but some stocks listed on the 
Yahoo website have no downloadable data associated with them, which terminates 
the loop. The "workhorse" of get.hist.quote() is download.file().
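To make the looping problem concrete, the structure I would like to end up with 
is roughly the following, sketched here with tryCatch() so that a failing 
symbol is skipped rather than stopping the loop (the ticker symbols are made 
up purely for illustration):

library(tseries)

symbols <- c("^ftse", "nosuchticker1", "nosuchticker2")

results <- list()
for (s in symbols) {
    # get.hist.quote() calls download.file() internally, so a missing
    # file surfaces as an error; convert that into NULL and move on.
    q <- tryCatch(get.hist.quote(instrument = s, quote = "Close"),
                  error = function(e) NULL)
    if (is.null(q)) next        # skip stocks with no downloadable data
    results[[s]] <- q
}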

Kind Regards

Chib

