Hello, list!
wget's default behavior upon encountering a 404 message is to error
out and simply not fetch that page. It would be useful to have a
flag which, when specified, would actually download the 404 message
given to wget, rather than just failing out. I imagine that the flag
could
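For what it's worth, later wget releases added exactly this behaviour
as --content-on-error (around version 1.14, long after this thread).
A minimal sketch, with a hypothetical URL:

  # keep the body of the 404 page instead of discarding it
  # (--content-on-error requires a wget much newer than 1.8.x)
  wget --content-on-error http://www.example.com/no-such-page.html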
Hello, list!
I'd like to automate a download that has to happen a number of times
a week, and wget seems to be the best candidate for the job.
Unfortunately, the website requires a login to get at the file I'm
interested in, and once you're logged in, the site sets a number of
cookies. When I
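A sketch of how this is commonly scripted, assuming the login is a
plain form POST; the URLs and field names are hypothetical, and
--post-data plus --keep-session-cookies only appeared in later wget
releases, so an older wget would need the cookies exported from a
browser instead:

  # log in once and save the cookies the site sets
  # (--keep-session-cookies stops wget discarding session-only cookies)
  wget --save-cookies=cookies.txt --keep-session-cookies \
       --post-data='user=me&password=secret' \
       http://www.example.com/login

  # then reuse those cookies for the real download
  wget --load-cookies=cookies.txt \
       http://www.example.com/members/weekly-file.zip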
> I'd like to automate a download that has to happen a number of times
> a week, and wget seems to be the best candidate for the job.
...
I neglected to mention that I'm running wget 1.8.1, sorry!
-CJ
--
WOW: Nyctitropic | Let us rain some DOOM upon the filthy heads
On Mon, Mar 25, 2002 at 12:00:33PM -0500, Alan Eldridge wrote:
> Have you considered that it doesn't need just cookies? Sometimes
> webmasters get kind of assholish and require a specific referer page,
> too. Try setting the referer on the command line.
Ah, hadn't thought of that . . . As it turns out
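A minimal sketch of that suggestion, with hypothetical URLs
(--referer= is a standard wget option):

  # present a plausible referring page along with the saved cookies
  wget --referer=http://www.example.com/download-page \
       --load-cookies=cookies.txt \
       http://www.example.com/files/weekly.zip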
> I'd like to automate a download that has to happen a number of times
> a week, and wget seems to be the best candidate for the job.
> Unfortunately, the website requires a login to get at the file I'm
> interested in, and once you're logged in, the site sets a number of
...
Just as a followup,
promote your favorite browser, etc. I think many
developers, when not restrained by the need to sell the
product, are fearful of being perceived as stupid if they
make things too easy or clear. What is so hard about
answering: what? why? when to use it? Recommendations? Examples?
I don't know if I'd say
On Wed Oct 10 15:40:57 2001, Robin B. Lake wrote:
> How DOES Wget support HTTP cookies? I access a real-time stock quote
I've never used them myself, but the manpage had this to say (v 1.7):
  --cookies=on/off
      When set to off, disable the use of cookies. Cookies are
      a mechanism for maintaining server-side state.
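The useful companions in practice are --load-cookies and
--save-cookies, which I believe arrived with the 1.7 cookie support.
A sketch, assuming the quote page's session cookies were exported from
a browser in the Netscape cookies.txt format that wget reads (URL
hypothetical):

  # reuse the browser's cookies for the real-time quote page
  wget --load-cookies=cookies.txt http://quotes.example.com/realtime.csv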
On Thu Oct 4 04:55:53 2001, Ian Abbott wrote:
On 3 Oct 2001, at 16:01, CJ Kucera wrote:
> The closest I've come is (and there's lots of extraneous stuff in there):
>
>   wget -r -l inf -k -p --wait=1 -H \
>     --domains=theonion.com,graphics.theonion.com,www.theonion.com,theonionavclub.com
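If -D/--domains matches domain suffixes (which I believe it does), the
subdomain entries can collapse into the two parent domains. A sketch,
untested against that site:

  wget -r -l inf -k -p --wait=1 -H \
       --domains=theonion.com,theonionavclub.com \
       http://www.theonion.com/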
Greetings, humans!
I'd like to use wget to take a snapshot of www.theonion.com. On that
site, all of the graphics are served from graphics.theonion.com,
and there's a bunch of other sub-domains as well. Also, it links over
to www.theonionavclub.com, which I would also like to mirror.
How can I
I said:
> I'd like to use wget to take a snapshot of www.theonion.com. On that
> [snip!]
I forgot to mention I'm using wget 1.7. Sorry 'bout that.
-CJ
WOW: Rapacious | A priest advised Voltaire on his death bed to
apocalyptech.com/wow | renounce the devil. Replied Voltaire, "This
                     | is no time to make new enemies."