clair.crossup...@googlemail.com wrote:

Cheers Duncan, that worked great:

> getURL("http://uk.youtube.com", httpheader = c("User-Agent" = "R (2.8.1)"))
[1] "http://www.w3.org/TR/1999/REC-html401-19991224/loose.dtd\">\n\n\
[etc]

May I ask if there was a specific manual you read to learn these
things please? I do not think I could have
Oops, I meant:

> toString(readLines("http://uk.youtube.com"))
[1] "http://www.w3.org/TR/1999/REC-html401-19991224/loose.dtd\">, , ,
\t, , , , , \t,
\tYouTube - Broadcast Yourself.,
[etc]
Warning message:
In readLines("http://uk.youtube.com") :
  incomplete final line found on 'http://uk.youtube.com'
Some Web servers are strict. In this case, the server won't accept
a request without being told who is asking, i.e. the User-Agent.
If you use

getURL("http://www.youtube.com",
       httpheader = c("User-Agent" = "R (2.9.0)"))

you should get the contents of the page as expected.
(Or with URL u
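A minimal sketch of the same idea, assuming RCurl is installed: the
'useragent' curl option sets just the User-Agent header, while
'httpheader' can carry arbitrary extra headers alongside it (the
"Accept" header here is only an illustration, not something the
server was said to require):

```r
library(RCurl)

## Equivalent ways to identify the client to a strict server:
txt1 <- getURL("http://uk.youtube.com",
               useragent = "R (2.9.0)")

txt2 <- getURL("http://uk.youtube.com",
               httpheader = c("User-Agent" = "R (2.9.0)",
                              "Accept"     = "text/html"))
```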
Thank you. The output I get from that example is below:

> d = debugGatherer()
> getURL("http://uk.youtube.com",
+        debugfunction = d$update, verbose = TRUE)
[1] ""
> d$value()
text
"About to connect() to uk.youtube.com port 80 (#0)\n Trying
208.117.236.72... connected\nConnected to uk
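For what it's worth, the debug text accumulates inside the gatherer
between calls, so you can print it and then clear it before the next
request (a sketch using the debugGatherer interface, which also
provides a reset function):

```r
library(RCurl)

d <- debugGatherer()
getURL("http://uk.youtube.com",
       debugfunction = d$update, verbose = TRUE)

cat(d$value()["text"])   # the connection log, as shown above
d$reset()                # clear the accumulated text before reuse
```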
clair.crossup...@googlemail.com wrote:

Thank you Duncan.
I remember seeing in your documentation that you have used this
'verbose=TRUE' argument in functions before when trying to see what is
going on. This is good. However, I have not been able to get it to
work for me. Does the output appear in R or do you use some other
external window?
Duncan Temple Lang wrote:

clair.crossup...@googlemail.com wrote:

Dear R-help,
There seems to be a web page I am unable to download using RCurl. I
don't understand why it won't download:

library(RCurl)
my.url <-
"http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2"
getURL(my.url)
[1]
Hi, I ran your getURL example and had the same problem with
downloading the file.

## R Start..
> library(RCurl)
> toString(getURL("http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2"))
[1] ""
## R end.

However, it is interesting that if you manually save the
Dear R-help,
There seems to be a web page I am unable to download using RCurl. I
don't understand why it won't download:

> library(RCurl)
> my.url <-
+ "http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2"
> getURL(my.url)
[1] ""

Other web pages are ok to download
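A hedged guess at why this particular page comes back empty: the _r=
parameter suggests nytimes.com answers with a redirect, and getURL
does not follow redirects by default, so the response body is empty.
Turning on libcurl's redirect-following and cookie engine may recover
the page; this is a sketch, untested against the live site:

```r
library(RCurl)

my.url <- "http://www.nytimes.com/2009/01/07/technology/business-computing/07program.html?_r=2"

txt <- getURL(my.url,
              followlocation = TRUE,   # follow 3xx redirects
              cookiefile = "",         # enable the cookie engine (no file read)
              useragent = "R (2.9.0)") # identify the client, as above
```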