Ryan Schmidt wrote:
> On Jun 20, 2008, at 4:47 PM, [EMAIL PROTECTED] wrote:
>> I get the following error:
>>
>> --17:42:58-- http://ajax.googleapis.com/ajax/services/search/web?v=1.0
>>=> [EMAIL PROTECTED]'
>> Resolving ajax.googleapis.com
On Jun 20, 2008, at 4:47 PM, [EMAIL PROTECTED] wrote:
I am trying to use wget to access Google apis for automatically
downloading query results as given on http://code.google.com/apis/
ajaxsearch/documentation/
But when I type curl --referer=http://www.my-ajax-site.com 'http://
ajax.googlea
erable
program or batch file.
Can someone help me on this?
thanks,
Sarabjeet
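For what it's worth, the usual culprit with these API URLs is the shell, not wget: unquoted, `?` and `&` are shell metacharacters. A minimal sketch of the quoting (the query value is a made-up example):

```shell
# Single quotes keep '?', '&', and '=' from being interpreted by the
# shell, so the whole URL reaches wget as one argument:
url='http://ajax.googleapis.com/ajax/services/search/web?v=1.0&q=test'
printf '%s\n' "$url"
```

With the URL quoted like that, something along the lines of `wget --referer=http://www.my-ajax-site.com "$url"` should at least get past the shell; on Windows, double quotes serve the same purpose.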
Valentin wrote:
> Hi,
> I'm trying to mirror a site with this command:
> wget -nd -r -k -p -H -c -T 10
> -t 2 http://www.freesfonline.de/Magazines1.html
> It works fine, until at some point it tries to get
> http://www.booksense.com/robots.txt and core dumps. Is there some way to
increase wget's verbosity or another way of debugging this? I have
version 1.10.2-3ubuntu1.
Cheers,
Valentin
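On the verbosity question: wget already has a debug mode. A sketch, assuming the distribution build was compiled with debug support (the command is echoed rather than run, since it needs the network):

```shell
# -d enables debug output; -o sends all output to a log file whose
# tail can show what wget was doing when it crashed.
cmd='wget -d -o wget.log -nd -r -k -p -H -c -T 10 -t 2 http://www.freesfonline.de/Magazines1.html'
echo "$cmd"
```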
Amit Patel wrote:
> Ya. I have checked it properly. It checks certificates. If i don't
> specify /etc/ca-bundle.crt with my working version of 1.10.2, it
> provides "Self-signed certificate encountered" error and fails.
Er, I'm not sure, but I think I
where
> the problem is. If you can spare some time and help me to solve
> the issue, that would be great.
RedHat has been known to modify Wget fairly heavily. Recent versions of
RedHat's "Wget 1.10.2" differ very substantially from ours. It would be
more useful to see
Unable to establish SSL connection.
However, if I try the same with wget version 1.10.2, which is on another
Red Hat machine, it works perfectly. I am not able to understand where
the problem is. If you can spare some time and help me to solve
the issue, that would be great.
Following i
Gary Lubrani wrote:
>
> ok, almost there, here's what ran so far, seems to fail when I
> do MAKE INSTALL (shows error 2, whatever that is)
> /usr/bin/install -c wget /usr/local/bin/wget
> /usr/bin/install: cannot create regular file `/usr/local/bin/w
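"Error 2" is just make reporting that a child command failed; the real problem is the `install: cannot create regular file` line above it, which means /usr/local/bin is not writable by your user. A sketch of the usual sequence (printed here rather than executed; assumes sudo is set up on the machine):

```shell
# Build as a normal user, install as root; 'sudo' is one way to get
# root privileges for the final step.
steps='./configure
make
sudo make install'
printf '%s\n' "$steps"
```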
Gary Lubrani wrote:
> --
> Using Opera's revolutionary e-mail client: http://www.opera.com/mail/
A signature customarily goes _underneath_ the text of your message.
Otherwise, quoting your message in the response is a bit of a hassle.
> using Ubuntu
From: Gary Lubrani
Not the most descriptive subject I've ever seen.
> checking for C compiler default output file name...
> configure: error: C compiler cannot create executables
Apparently your C compiler is not working as expected.
> See `config.log' for more details.
Well? Any clu
found. Stop.
[EMAIL PROTECTED]:~/Desktop/wget-1.11$ install
install: missing file operand
Try `install --help' for more information.
[EMAIL PROTECTED]:~/Desktop/wget-1.11$ ./configure
configure: configuring for GNU Wget 1.11
checking build system type... i686-pc-linux-gnulibc1
checking host sys
t that off?
I doubt it has anything to do with the ?p, but it's really impossible to
say.
I'm afraid you really haven't provided enough information to help you
much; at a minimum, the version of Wget you are using, the relevant
contents of your .wgetrc file, and the command-line
On Feb 8, 2008 9:58 PM, Jacqui Lahr <[EMAIL PROTECTED]> wrote:
> hi .i've been trying to install opera on the olpc xo with info from
> wiki opera site
> and i get messages to contact you.iv'e tried both codes(?) with the
> "tar ball" and without. i have been using macs since the 512 and i
hi. I've been trying to install Opera on the OLPC XO with info from the wiki
Opera site,
and I get messages to contact you. I've tried both codes(?) with the "tar
ball" and without. I have been using Macs since the 512 and in my 75 yr. old
ignorance I thought I could just type it in a
Quotation marks around the text containing special characters should
work in Windows batch files.
- Original Message -
From: "Tony Godshall" <[EMAIL PROTECTED]>
To: "Uma Shankar" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Sunday, November 11
Sounds like a shell issue. Assuming you are on a *nix, try single-quoting
the password so the shell passes the weird chars through literally. If you
are on Windows, it's another story.
On 11/10/07, Uma Shankar <[EMAIL PROTECTED]> wrote:
> Hi -
> I've been struggling to download data from a protected site. The man pages
> intruc
Hi -
I've been struggling to download data from a protected site. The man pages
instruct me to use the --http-user=USER and --http-passwd=PASS options when
issuing the wget command to the URL. I get error messages when wget
encounters special chars in the password. Is there a way to get around this
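A hedged sketch of the usual Unix workaround: single-quote the password so the shell never interprets characters like `$`, `&`, or `!` (the password below is invented):

```shell
# Inside single quotes the shell performs no expansion at all, so the
# string reaches the program byte-for-byte:
pass='pa$$&w!rd'
printf '%s\n' "$pass"
```

So `wget --http-user=USER --http-passwd='pa$$&w!rd' URL` should pass the password through literally; characters the *server* itself rejects would still need percent-encoding.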
Josh Williams schrieb:
On 8/2/07, dmitry over <[EMAIL PROTECTED]> wrote:
Hi,
In `man wget` I see this text
---[ cut ]---
--http-user=user
--http-password=password
[..]
but in `wget --help` I see
--http-user=USER set http user to USER.
--http-passwd=PASS   set http password t
On 8/2/07, dmitry over <[EMAIL PROTECTED]> wrote:
> Hi,
>
> In `man wget` I see this text
> ---[ cut ]---
> --http-user=user
>--http-password=password
> [..]
> but in `wget --help` I see
>
> --http-user=USER set http user to USER.
> --http-p
Make sure to protect those files from other users with
"chmod". If the passwords are really important, do not leave
them lying in those files either---edit the files and delete them after Wget
has started the download.
---[ cut ]---
but in `wget --help` I see
--h
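The chmod advice quoted above looks like this in practice (the file name is illustrative):

```shell
# Create a credentials file readable only by its owner, then remove
# it once wget has started, as the manual suggests.
credfile=./wgetrc-creds
printf 'http_user = user\nhttp_passwd = secret\n' > "$credfile"
chmod 600 "$credfile"
perms=$(ls -l "$credfile" | cut -c1-10)
echo "$perms"   # -rw-------
rm -f "$credfile"
```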
Josh Williams wrote:
> Hmm. .org, maybe?
LOL. Do you know how many kewl domain names I had to go through before I
found one that didn't actually exist? Close to a dozen.
Tony
On 7/17/07, Tony Lewis <[EMAIL PROTECTED]> wrote:
Just forward the patch to [EMAIL PROTECTED] and let them test it. :-)
Hmm. .org, maybe?
Delivery to the following recipient failed permanently:
[EMAIL PROTECTED]
Technical details of permanent failure:
PERM_FAILURE: DNS Error: Domain nam
Josh Williams wrote:
> Let me know how it turns out. The only "testing" I did on it was
> checking to make sure my code compiled; I haven't actually tried the
> option.
That's the only testing a developer is *supposed* to do. Everything else is
QA's job!
;-)
Just forward the patch to [EMAIL PRO
On 7/16/07, Jaymz Goktug YUKSEL <[EMAIL PROTECTED]> wrote:
Hey Josh,
Thank you very much for that patch, this was what I was looking for, I think
this is going to solve my problem!
Thank you very much, and have a good one!
Cordially,
James
You're welcome :-)
Let me know how it turns out. Th
On 7/16/07, Jaymz Goktug YUKSEL <[EMAIL PROTECTED]> wrote:
Hello everyone,
Is there a command to override the
maximum redirections?
Attached is a patch for this problem. Let me know if you have any
problems with it. It was written for the latest trunk in
e than 20
> pages. I need to redirect more.
>
> Do you think you can help me with this? Is there a command to override
> the maximum redirections?
Unfortunately, there isn't. There should be, and I've submitted bug
20499 to address this in 1.12: https://savannah.gnu.org/bugs/inde
tion (20) reached. So it cannot redirect more than 20
> pages. I need to redirect more.
>
> Do you think you can help me with this? Is there a command to override
> the maximum redirections?
I never found such an option, but if you use the bash shell you can turn off
redirects and do
for ((
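The loop was cut off above; it presumably continued along these lines, driving the pagination from bash so no single wget run ever chases more than one hop (the URL pattern is invented, and the commands are echoed rather than executed):

```shell
# bash arithmetic for-loop: one wget invocation per page instead of
# one invocation following 650 redirects.
count=0
for ((i = 1; i <= 650; i++)); do
  echo "wget \"http://example.com/page?n=$i\"" > /dev/null
  count=$((count + 1))
done
echo "$count"   # 650
```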
supposed to go until it reaches 650.
However, when it comes to 19 it stops and it says
Maximum redirection (20) reached. So it cannot redirect more than 20 pages.
I need to redirect more.
Do you think you can help me with this? Is there a command to override the
maximum redirections?
I am the only
Mishari Al-Mishari wrote:
> Hi,
> when I run this command
> wget -p wwwladultfriendfinder.co
>
> I received the following error messages, even though I was able to
> successfully download the page using the browser:
> Resolving wwwladultfriendfinder.com.
Hi,
when I run this command
wget -p wwwladultfriendfinder.co
I received the following error messages, even though I was able to
successfully download the page using the browser:
Resolving wwwladultfriendfinder.com... 8.15.231.13
Connecting to wwwladultfriendfinder.com|8.15.231.13|:80... connected.
HTT
rahul kumar wrote:
> The thing is that i can't execute that command (wget Command)through
> Windows...RUN Utility. So could you please give me some idea what i
> need to do such that i can execute my Batch Application through
> Windows XP Operating System...
Wget is a console-only command, so
Again, how do I get the corresponding values, and how do I store these values into some
Excel file, say record.xls (C:\record.xls)?
Could you please help me... I will be really indebted to you for this.
My understanding after reading the PDF that is attached here:
http://w
From: shades13
> I have been having problems [...]
1. People with real names tend to get more respect than others.
2. As usual, it might help to know which wget version you're using
on which operating system.
I don't see any links on "http://www.talcomic.com
Please cc me in your posts
I have been having problems getting wget to follow links
in the following 2 types of websites. One type like
http://www.talcomic.com/ gives me a 403 error. The other
type like http://www.cad-comic.com/comic.php just ignores
most of the links. The command I am curr
box,
the quote, i.e. ", is replaced by the character m.
This is causing some problems.
I tried all options, but the issue still remains.
Can you provide any help?
anand patil
The server told wget that it was going to return 6K:
Content-Length: 6720
_
From: Smith, Dewayne R. [mailto:[EMAIL PROTECTED]
Sent: Thursday, March 01, 2007 8:05 AM
To: [EMAIL PROTECTED]
Subject: wget help on file download
Trying to download a 4mb file. it only retrieves 6k of it
Trying to download a 4 MB file. It only retrieves 6 KB of it.
I've tried without the added --options and it doesn't work.
Can you see any issues below?
Thx!
C:\Backup_CD\WGET>wget -dv -S --no-http-keep-alive --ignore-length
--secure-protocol=auto --no-check-certificate https://
server2.csci-va
I have a complaint: if I pass the flag "-rnpnd" to "wget" it complains
that "nn" is not a flag. Surely, since "n" is always followed by
another letter, it is expected that users thus bundle them up.
I only a few days ago unpacked "wget", and I want to do something that I
cannot work out how to do, to
Tate Mitchell ha scritto:
Would it be possible to download each lesson individually, so that as
lessons are added, or finished, I can download them w/out re-downloading
the whole site? Could someone tell me how please? Or would it be possible to
download the whole thing and just re-download pa
Hi: I am getting the following error when I run wget
on a https website
HTTP request sent, awaiting response... 401
Unauthorized
Authorization failed.
The wget command I am running is the following
wget --secure-protocol=auto --no-check-certificate
--http-user=abc --http-password=sprint
--post-da
Hello, I have a question about downloading this site: http://www.ncsu.edu/project/hindi_lessons/
I want to download it all, basically, and was reading in a tutorial about wget that it's kind of hard to re-download sites that have been added to since the previous download. I could be wrong. As you can
Any solution to this found yet?
my own testing:
it works using the 1.8 wget
Redhat 7.3 wget-1.8.1.4
Then tried compiling from source
wget1.9.1 works
wget1.10.1 fails
wget1.10.2 fails
Thanks,
A.P.
From: Mauro Tortonesi <[EMAIL PROTECTED]>
> perhaps we should make this clear in the manpage
Always a good idea.
> and provide an
> additional option which just renames saved files after download and
> postprocessing according to a given pattern. IIRC, hrvoje was just
> suggesting to do t
Steven M. Schweda wrote:
But the real question is: If a Web page has links to other files, how
is Wget supposed to package all that stuff into _one_ file (which _is_
what -O will do), and still make any sense out of it?
even more, how is Wget supposed to properly postprocess the saved data,
From: David David
> 3. Outputs the graph to ta.html (replacing original
> ta.html)... BAD.
On VMS, where (by default) it's harder to write to an open file, the
symptom is different:
ta.html: file currently locked by another user
But the real question is: If a Web page has links to other
Hi,
Don't know if this will be answered - but I had to
ask (since I DID read the man page! :)P )
Symptom : automating my stock research I type a
command as "wget -p -H -k -nd -nH -x -Ota.html
-Dichart.finance.yahoo.com -Pbtu
"http://finance.yahoo.com/q/ta?s=btu&t=6m&l=on&z=l&q=b&p=b,p,s,v&a
When I use wget
> http://www.srh.noaa.gov/ifps/MapClick.php?FcstType=text&textField1=34.03740&textField2=-84.35600&site=ffc&Radius=0&CiTemplate=0&TextType=1
>
> It fails. Is there a way to parse this with wget and save to a short file
> name?
>
> I am new to wget, Any help would be greatly appreciated.
>
>
> Thanks,
> Dan
>
text&textField1=34.03740&textField2=-84.35600&site=ffc&Radius=0&CiTemplate=0&TextType=1
It fails. Is there a way to parse this with wget and save to a short file name?
I am new to wget, Any help would be greatly appreciated.
Thanks,
Dan
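Two shell-level fixes are likely what's needed here: quote the URL so `&` is not treated as the background operator, and use `-O` to pick the short file name. A sketch (the output name is invented):

```shell
# Quoted, the query string survives intact; unquoted, the shell cuts
# the command at the first '&'.
url='http://www.srh.noaa.gov/ifps/MapClick.php?FcstType=text&textField1=34.03740&textField2=-84.35600&site=ffc&Radius=0&CiTemplate=0&TextType=1'
printf '%s\n' "$url"
echo "wget -O forecast.html \"$url\"" > /dev/null
```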
nicolas figaro wrote:
Hi,
there is a mistake in the French translation of wget --help (on Linux
Red Hat).
In English:
wget --help | grep spider
--spider don't download anything
was translated into French this way:
wget --help | grep spider
--s
Is it enough to report it to [EMAIL PROTECTED]? How can I check if
there is any development plan for fixing this bug?
Gang
Ray Rodriguez wrote:
I think it would be safe to call this a bug, but I seem to think I've seen
something about this before.
On Tue, 28 Mar 2006, gang wu wrote:
Than
Hi,
there is a mistake in the French translation of wget --help (on Linux
Red Hat).
In English:
wget --help | grep spider
--spider don't download anything
was translated into French this way:
wget --help | grep spider
--spider ne pas téléch
I think it would be safe to call this a bug, but I seem to think I've seen
something about this before.
On Tue, 28 Mar 2006, gang wu wrote:
> Thanks for your reply. Should we report this as a bug?
>
> Gang
>
>
> Ray Rodriguez wrote:
>
> >I tried the command on SCO Unix 5.0.4 with wget 1.10.2 and
Thanks for your reply. Should we report this as a bug?
Gang
Ray Rodriguez wrote:
I tried the command on SCO Unix 5.0.4 with wget 1.10.2 and had the same
result as you.
On Mon, 27 Mar 2006, gang wu wrote:
Hi,
Can anyone try the following line several times to see if it downloads
the fil
I tried the command on SCO Unix 5.0.4 with wget 1.10.2 and had the same
result as you.
On Mon, 27 Mar 2006, gang wu wrote:
> Hi,
>
> Can anyone try the following line several times to see if it downloads
> the file every time?
>
> wget -N
> ftp://ftp.ncbi.nih.gov/genomes/Arabidopsis_thaliana/CHR_
Hi,
Can anyone try the following line several times to see if it downloads
the file every time?
wget -N
ftp://ftp.ncbi.nih.gov/genomes/Arabidopsis_thaliana/CHR_V/NC_003076.gbk
With the -N option, should wget download the remote file only once if
the remote and local files have the same tim
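-N boils down to a timestamp (and size) comparison; the local half of that logic can be sketched with `test -nt` (file names invented):

```shell
# Emulate the -N decision: fetch only if the "remote" copy's mtime is
# newer than the local one's.
touch -t 200601010000 local.gbk   # stale local file
touch remote.gbk                  # stand-in for a newer remote file
if [ remote.gbk -nt local.gbk ]; then
  verdict='download'
else
  verdict='skip'
fi
echo "$verdict"   # download
rm -f local.gbk remote.gbk
```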
I am sorry.
You're right, of course.
==
Thank you very much I've got the test version working now. :-)
Right, now I just have to migrate it into the "live" environment...
I may be back but in case not, I appreciate the help.
Regards,
TeeJay
==
Jim Wright wrote:
Wildcards don't work is the accepted wisdom. I just realized that I
have been using downloads of the form "--accept AB06*a.T00,AB06*a.BNX"
for a long time and it works fine for me. Should it not?
Of course! I can't believe I wrote something so stupid. I was working on
regex
I'd suggest using your original accept, and not using a reject.
You know specifically what you want, and all the rest will be ignored.
Jim
On Mon, 20 Mar 2006, TeeJay wrote:
> Jim Wright unavco.org> writes:
>
> >
> > Wildcards don't work is the accepted wisdom. I just realized that I
> > ha
Jim Wright unavco.org> writes:
>
> Wildcards don't work is the accepted wisdom. I just realized that I
> have been using downloads of the form "--accept AB06*a.T00,AB06*a.BNX"
> for a long time and it works fine for me. Should it not?
>
> Looking at the lines below, the reject encompasses all
Mauro Tortonesi unife.it> writes:
you are using wildcards to specify which files to accept or reject, and wget
does not support them.
-
Thanks for the quick response Mauro!
I am surprised at the response though because fro
Wildcards don't work is the accepted wisdom. I just realized that I
have been using downloads of the form "--accept AB06*a.T00,AB06*a.BNX"
for a long time and it works fine for me. Should it not?
Looking at the lines below, the reject encompasses all of the accept,
so if reject is applied after
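For what it's worth, --accept patterns behave like shell globs, and a shell `case` statement can preview what a pattern list such as the one above would match (file names invented):

```shell
# Would a given name pass the accept list AB06*a.T00,AB06*a.BNX?
matches() {
  case "$1" in
    AB06*a.T00|AB06*a.BNX) echo yes ;;
    *) echo no ;;
  esac
}
matches AB0612a.T00   # yes
matches AB0612b.T00   # no
```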
TeeJay wrote:
Help for a newbie please.
I have created my wgetrc file. It contains the following variables:
input = C:\WGET\source.txt
user = (crossed out for security)
password = (crossed out for security)
check_certificate = off
recursive = on
reclevel = 2
no_parent = off
Help for a newbie please.
I have created my wgetrc file. It contains the following variables:
input = C:\WGET\source.txt
user = (crossed out for security)
password = (crossed out for security)
check_certificate = off
recursive = on
reclevel = 2
no_parent = off
dirstruct = off
kayode giwa <[EMAIL PROTECTED]> writes:
> am new to wget and I was wondering if any one out
> there can assist me with the following error messages
> in my config.log file,
> What do I need to do to get wget working ? please
> respond !!
>
>
>
> $ ./configure
>
>
> PATH: /usr/ucb
>
>
> ## --
On Thursday 04 August 2005 03:43 pm, kayode giwa wrote:
> am new to wget and I was wondering if any one out
> there can assist me with the following error messages
> in my config.log file,
> What do I need to do to get wget working ? please
> respond !!
i am not familiar with solaris, but it seems
I am new to wget and I was wondering if anyone out
there can assist me with the following error messages
in my config.log file.
What do I need to do to get wget working? Please
respond!
$ ./configure
PATH: /usr/ucb
## --- ##
## Core tests. ##
## --- ##
configure:150
Title: Help, Idea?
I wonder your opinions about the following system:
We want to search for some predefined words on a given set of web sites, continuously.
Thus, we will design two processes. One is the downloader (which uses wget),
the other is the searcher.
The downloader downloads the whole content of
Werner Schmitt <[EMAIL PROTECTED]> writes:
> on machine 2 with wget version: Wget 1.9.1
>
> i get error: not implemented !!
>
> on machine 1 with wget Version: Wget 1.9+cvs-dev
> everything is ok
That is because the other machine has a newer (CVS) version of Wget
that correctly implements HTTPS d
on machine 2 with wget version: Wget 1.9.1
i get error: not implemented !!
on machine 1 with wget Version: Wget 1.9+cvs-dev
everything is ok
here the command line:
wget -S -d -v --sslcafile=/env/config/cacert.pem
--directory-prefix=/env/update --http-user= --http-passwd=xx
--proxy=on
htt
Hrvoje Niksic <[EMAIL PROTECTED]> writes:
>> Can I have it not do the translation ??!
>
> Unfortunately, only by changing the source code as described in the
> previous mail.
BTW I've just changed the CVS code to not decode the % sequences.
Wget 1.10 will contain the fix.
l this down. I did a
> byte by byte comparison between two tcpdumps and nailed this down to
> the '.' literally !
The server is broken, as %2e really is completely equivalent to ".".
Unfortunately that won't help you in practice.
> Can I have it not do the transla
Will Kuhn <[EMAIL PROTECTED]> writes:
> I try to do something like
> wget "http://website.com/ ...
> login=username&domain=hotmail%2ecom&_lang=EN"
>
> But when wget sends the URL out, the "hotmail%2ecom"
> becomes "hotmail.com" !!! Is this the supposed
> behaviour ?
Yes.
> I saw this on the snif
I try to do something like
wget "http://website.com/ ...
login=username&domain=hotmail%2ecom&_lang=EN"
But when wget sends the URL out, the "hotmail%2ecom"
becomes "hotmail.com"!!! Is this the supposed
behaviour? I saw this on the sniffer. I suppose the
translation of "%2e" to "." is done by wget
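What wget does here is ordinary percent-decoding: %2e is simply the encoded form of '.', and RFC 3986 treats the two as equivalent, which is why the server, not wget, is at fault. The decoding itself is trivial:

```shell
# %2e decodes to '.', so hotmail%2ecom and hotmail.com name the same host.
encoded='hotmail%2ecom'
decoded=$(printf '%s\n' "$encoded" | sed 's/%2e/./g')
echo "$decoded"   # hotmail.com
```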
: Richard Emanilov [mailto:[EMAIL PROTECTED]
Sent: Monday, March 21, 2005 2:17 PM
To: Mauro Tortonesi
Cc: Tony Lewis; wget@sunsite.dk; [EMAIL PROTECTED]
Subject: RE: help!!!
/usr/local/bin/wget -dv --post-data="login=login&password=password"
https://login:[EMAIL PROTECTED]:8443/ft
DEBUG
PM
To: 'Mauro Tortonesi'
Cc: Tony Lewis; wget@sunsite.dk; [EMAIL PROTECTED]
Subject: RE: help!!!
/usr/local/bin/wget -dv --post-data="login=login&password=password"
https://login:[EMAIL PROTECTED]:8443/ft DEBUG output created by Wget 1.9.1 on
linux-gnu.
--17:11:35-- h
is issue?
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: Monday, March 21, 2005 4:11 PM
To: Richard Emanilov
Cc: Tony Lewis; wget@sunsite.dk; [EMAIL PROTECTED]
Subject: Re: help!!!
On Monday 21 March 2005 02:22 pm, Richard Emanilov wrote:
> Guys,
>
>
&g
On Monday 21 March 2005 02:22 pm, Richard Emanilov wrote:
> Guys,
>
>
> Thanks so much for your help, when running
>
> wget --http-user=login --http-passwd=passwd
> --post-data="login=login&password=passwd" https://site
>
> With version 1.9.1, I get the err
Guys,
Thanks so much for your help, when running
wget --http-user=login --http-passwd=passwd
--post-data="login=login&password=passwd" https://site
With version 1.9.1, I get the error message
"Site: Unsupported scheme."
Richard Emanilov
[EMAIL PROTECTED]
The --post-data option was added in version 1.9. You need to upgrade your
version of wget.
Tony
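Since the failing box reported GNU Wget 1.8.2 and --post-data appeared in 1.9, a script can gate on the version before trying the option; a sketch, assuming GNU coreutils' `sort -V` is available:

```shell
# Compare versions: if the lowest of the two is the required one,
# the installed wget is new enough for --post-data.
need=1.9
have=1.8.2   # the version the error report showed
lowest=$(printf '%s\n%s\n' "$need" "$have" | sort -V | head -n 1)
if [ "$lowest" = "$need" ]; then
  msg='ok: --post-data available'
else
  msg='upgrade required'
fi
echo "$msg"   # upgrade required
```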
-Original Message-
From: Richard Emanilov [mailto:[EMAIL PROTECTED]
Sent: Monday, March 21, 2005 8:49 AM
To: Tony Lewis; [EMAIL PROTECTED]
Cc: wget@sunsite.dk
Subject: RE: help!!!
wget
ogin=login&password=password" https:site
wget: unrecognized option `--http-post=login=login&password=passwd'
Usage: wget [OPTION]... [URL]...
Try `wget --help' for more options.
wget -V
GNU Wget 1.8.2
Richard Emanilov
[EMAIL PROTECTED]
-Original Message-
From: Tony L
Richard Emanilov wrote:
> Below is what I have tried with no success
>
> wget --http-user=login --http-passwd=passwd
--http-post="login=login&password=passwd"
That should be:
wget --http-user=login --http-passwd=passwd
--post-data="login=login&password=passwd"
Tony
On Mon, 21 Mar 2005, Richard Emanilov wrote:
Thanks for your response, can you please for an example of how the string
should look? Below is what I have tried with no success
wget --http-user=login --http-passwd=passwd
--http-post="login=login&password=passwd"
https://siteIamTryingToDownLoa
Thanks,
Richard Emanilov
-Original Message-
From: Daniel Stenberg [mailto:[EMAIL PROTECTED]
Sent: Monday, March 21, 2005 9:53 AM
To: Richard Emanilov
Cc: wget@sunsite.dk; [EMAIL PROTECTED]
Subject: Re: help!!!
On Mon, 21 Mar 2005, Richard Emanilov wrote:
> Is there a way I can concatenat
On Mon, 21 Mar 2005, Richard Emanilov wrote:
Is there a way I can concatenate both challenges into one string when trying
to d/l the page?
No, you should use both options. And be prepared that it'll respond with a
set of headers that set one or more cookies and then redirect you to another
page.
the web server
there is an application authentication via POST, so I found wget
--http-post="login=user&password=pw" http://www.yourclient.com/somepage.html on
the gnu site to be an option. Is there a way I can concatenate both challenges into one string when trying to d/l the page?
rick.
> "http://www.playagain.net/download/m.php?p=roms/zzyzzyx2.zip";
I am not sure about that "?" URL; can wget use it?
> Last help, i use a script that copy from an ftp the exactly structure of
> it,
> so i have on my pc many dir created from wget, is possible create
i normally use, can you tell me exactly
what i must add to the command?
-e robots=off -N -nH --referer="http://www.playagain.net/"; -U "Mozilla/5.0
(Windows; U; Windows NT 5.1; en-US; rv:1.6) Gecko/20040113" -Ga -A zip -r
"http://www.playagain.net/download/m.php?p=roms/
Hey Ted!
> If I view the html the links are formatted as if to provide local viewing
> however when I open the html file in my browser all the images are
> red-xed (empty).
It all seems ok for me with wget 1.9 beta and Win2K
> Wget is 1.9.1-complete from http://xoomer.virgilio.it/hherold/
not as of this writing
(slow e-mail server on my end) so please CC me replies.
Thanks for any help on this,
Ted Phillips
Hi all,
is there some "companion tool" for wget that can compile all downloaded
files into one single file, for example chm, mht or java-help format?
Yes, I know there are some Windows GUI tools capable of this, but is
there anything for the command line?
TIA
Gerhard
Title: Help with subscribing
Hi,
I need help subscribing to the mailing list. I sent two emails to [EMAIL PROTECTED] with the subject "subscribe" but haven't received any response.
Can somebody please tell me how to subscribe to the mailing list?
Thanks
PS: Obviously I'm not subsc
ROTECTED]>
To:
Sent: Monday, January 03, 2005 8:07 PM
Subject: Help! wget timing out under linux
Hi all,
I am trying to download files with wget under linux (new install of
Fedora Core 3) and it keeps timing out.
I'm trying to figure out if this is a wget problem or a connection
proble
Hi all,
I am trying to download files with wget under linux (new install of
Fedora Core 3) and it keeps timing out.
I'm trying to figure out if this is a wget problem or a connection
problem (I see this problem across servers, so I don't believe it is a
server problem). The machine I am using is
Need to download via http with wget, but the link is not
direct to the file; it's a php page that redirects to the file:
http://www.eprom.com/home/customer/product/download.php
How can I download it with wget?
And this one :
http://pub.supercom.ca/servlet/PriceFileDownloadSer
At 18:47 on Tuesday 2 November 2004, you wrote:
> Hello.
> Does wget have a nntp (Usenet newsgroups) support?
not at the moment.
> For example, I might want download all articles between
> numbers M and N. A date based system could be useful too.
> We just should agree how these queries are rep
Hello.
Does wget have NNTP (Usenet newsgroups) support?
For example, I might want download all articles between
numbers M and N. A date based system could be useful too.
We just should agree how these queries are represented to
wget.
I can dig out an old Usenet news downloader code if wget
does