In your problem report, I see version numbers for everything but
wget.
Does adding "-d" to the wget command tell you anything?
Anything in the Web server logs?
Steven M. Schweda [EMAIL PROTECTED]
Hello,
We are using wget to check whether our web application URLs are up on AIX 5.1
production systems.
Since I upgraded from IBM HTTP Server 1.3.28 (with which wget had no
problems, running with the WebSphere 5.1 plugin for HTTP Server 1.3.x) to
IBM HTTP Server 2.0.47 (running now w
ype
connect.c:530: warning: passing argument 3 of 'getpeername' from incompatible
pointer type
*** Error exit code 1
Stop.
*** Error exit code 1
Stop.
Regards,
Ting
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: 04 January 2007 00:40
To: Chung Ting Che
[EMAIL PROTECTED] wrote:
> Dear Mauro,
>
> Yes, we have installed those prerequisite packages but it still failed.
> We have tried the PA-RISC depot and it works, although we are using the
> Itanium platform. We have tried another development machine and
> the result is the same. So I suspect the depot
.
Regards,
Ting
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: 1/3/2007 [Wed] 17:58
To: Chung Ting Cheng (HIT, ODT)
Cc: [EMAIL PROTECTED]
Subject: Re: wget problem
[EMAIL PROTECTED] wrote:
> Dear Sir,
>
> I have installed wget 1.10.2 into HP
Dear Sir,
I have installed wget 1.10.2 on HP-UX 11.23 from
http://hpux.cs.utah.edu/hppd/hpux/Gnu/wget-1.10.2/. I have also installed the
runtime dependency packages (libgcc, gettext, libiconv and openssl). However,
when I run it to fetch some test web content, the following errors
Thanks a lot Pat.
I will try it out.
Paula
-Original Message-
From: Willener, Pat [mailto:[EMAIL PROTECTED]
Sent: Tuesday, June 27, 2006 8:40 PM
To: Paula; [EMAIL PROTECTED]
Subject: [SPAM] RE: [SPAM] RE: wget problem
Yes, wget is not a DOS command, but you can download the Windows
-
From: Willener, Pat [mailto:[EMAIL PROTECTED]
Sent: Tuesday, June 27, 2006 8:26 PM
To: [EMAIL PROTECTED]
Cc: Paula
Subject: [SPAM] RE: wget problem
..OR... specify the full path name - C:\> "C:\Program Files\wget\wget"
parameters
..OR... add the path to the %PATH% - C:\> PATH %
From: [EMAIL PROTECTED]
Sent: Wednesday, June 28, 2006 2:00 AM
To: Paula; [EMAIL PROTECTED]
Subject: RE: wget problem
Paula,
Go to the directory where you have wget installed, i.e. the directory in
which wget.exe is located. The error means Windows cannot find the
"wget" program.
Ranjit Sandhu
2:31 PM
To: [EMAIL PROTECTED]
Subject: wget problem
Hello GNU, when I type in C:\>wget http://.com I get the
message:
Wget is not recognized as an internal or external command, operable
program or batch file.
What am I doing wrong?
I start out from the C prompt.
Thank you.
Paula Van Berkom
On Monday 04 April 2005 09:48 am, Juhana Sadeharju wrote:
> Hello.
> The following document could not be downloaded at all:
> http://www.greyc.ensicaen.fr/~dtschump/greycstoration/
>
> If you succeed, please tell me how. I want all the HTML files
> and the images.
could you please tell us which c
Hello.
The following document could not be downloaded at all:
http://www.greyc.ensicaen.fr/~dtschump/greycstoration/
If you succeed, please tell me how. I want all the HTML files
and the images.
Juhana
--
http://music.columbia.edu/mailman/listinfo/linux-graphics-dev
I'm afraid Wget doesn't understand JavaScript. As your example
demonstrates, it is impossible to extract URLs from JavaScript by
merely parsing it -- you need to actually execute it.
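A workaround that sometimes helps: when the script only builds a static menu (as sidebar.js appears to), the link targets can often be scraped out of its string literals and fed back to wget via -i. The sample sidebar.js content below is made up to illustrate the shape of the FItem() calls; the sed prefix step assumes the links are relative to the tutorials directory.

```shell
# Stand-in for the real sidebar.js (invented sample entries):
cat > sidebar.js <<'EOF'
FItem("Beginner's Guide to Unreal", "beginners.html");
FItem("Advanced Scripting", "advanced.html");
EOF

# Pull every "something.html" string literal out of the file, one per line.
grep -o '"[^"]*\.html"' sidebar.js | tr -d '"' > urls.txt

# Prefix each relative path with the site URL and hand the list to wget
# (network step, shown as comments):
#   sed 's|^|http://www.planetunreal.com/wod/tutorials/|' urls.txt > full-urls.txt
#   wget -i full-urls.txt
cat urls.txt
```

This is only a heuristic, of course; it works when the URLs sit in plain string literals and fails as soon as the script assembles them dynamically.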
Hello.
One wget problem this time. I downloaded everything under
http://www.planetunreal.com/wod/tutorials/
but most of the files were not downloaded, because the URLs are
in the file
http://www.planetunreal.com/wod/tutorials/sidebar.js
in the following format:
FItem("Beginner's Guide to Un
Hi,
Using wget 1.8.2 (gnuwin32, but I think the problem would be the same
with Unix flavors), I saw that if I specify:
reject = lang=it
wget nonetheless downloads a URL like
http://www.myserver.com/[EMAIL PROTECTED]&source=search
I also observed that accept = lang=fr is not working either.
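If memory serves, the -A/-R (accept/reject) patterns are compared against the file name component, which is why a pattern aimed at a query-string fragment like lang=it can fail to match. Later wget releases (1.14 and up) added --reject-regex, which is applied to the complete URL, query string included. A sketch, with the crawl itself left as a comment and only the regex checked locally:

```shell
# Network step (illustration only; requires wget >= 1.14):
#   wget -r --reject-regex 'lang=it' http://www.myserver.com/
#
# Sanity-check the regex against a sample URL before running the crawl:
url='http://www.myserver.com/page?lang=it&source=search'
if printf '%s\n' "$url" | grep -q 'lang=it'; then
    verdict=rejected
else
    verdict=kept
fi
echo "$verdict"
```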
Hi,
I want to use wildcards because I need to download
jpg files that are named in a different format, such
as "[year][month][day].[number].jpg".
I could use this normal command:
[begin command]
wget http://www.website.com/20030510.0500.jpg -r -l1
--no-parent -w 1
[end]
But this did not work:
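Whatever the failing variant was, HTTP offers no server-side wildcards, so one common workaround is to generate the candidate URLs yourself and hand the list to wget -i. The host, date, and number range below are made up for illustration; widen the loops to cover the real naming scheme.

```shell
# Build a list of candidate URLs matching [day].[number].jpg:
base='http://www.website.com'
day=20030510
: > urls.txt
for num in 0100 0200 0300 0400 0500; do
    echo "$base/$day.$num.jpg" >> urls.txt
done

# Then fetch them with a 1-second pause, as in the original command
# (network step, shown as a comment):
#   wget -i urls.txt -w 1
cat urls.txt
```

wget will report a 404 for the numbers that do not exist and simply move on, so over-generating the list is harmless apart from the extra requests.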
Rajesh wrote:
> Thanks for your reply. I have tried using the command wget
> --user-agent="Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)", but it
> didn't work.
Adding the user agent helps some people -- I think most often with web
servers from the evil empire.
> I have one more question. I
time using
scp.
Thanks,
Rajesh.
>From: "Tony Lewis" <[EMAIL PROTECTED]>
>To: "Rajesh" <[EMAIL PROTECTED]>, <[EMAIL PROTECTED]>, <[EMAIL PROTECTED]>
>Subject: Re: wget problem
>Date: Thu, 3 Jul 2003 07:46:33 -0700
>MIME-Version: 1.0
>Content
Rajesh wrote:
> Wget is not mirroring the web site properly. For example, it is not copying
> symbolic links from the main web server. The target directories do exist
> on the mirror server.
wget can only mirror what can be seen from the web. Symbolic links will be
treated as hard references (assuming t
Hi All,
I have installed wget (version 1.8.2) on a Sun Solaris box to mirror a web
site. I am using the command "wget --mirror --no-host-directories
http://www.sl.nsw.gov.au/" to mirror the web site.
Wget is not mirroring the web site properly. For example, it is not copying
symbolic links from the main web server. The target directories do exist on
the mirror server.
Hi,
I'm trying to use wget 1.8.2 on Linux 6.2 to download an HTML
file (http://www.economagic.com/em-cgi/data.exe/fedst1/fedfunds+2);
each time I do so, only part of the file is downloaded.
What's missing is a table starting from line 59 ("after
. I tried several combinations, there is no indication
I have just installed the Windows port of wget. My question concerns
query strings. I see there is extra care in the version I
have (1.8.2b) to allow "?" in the URL, but what about "&"?
i.e.: http://www.mysite.com?user=me&password=me
I get a bad file descriptor error whenever I tr
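One likely culprit, offered as a guess: an unquoted "&" is special to the command interpreter. In cmd.exe it separates commands, and in Unix shells it backgrounds the command, so wget only ever sees the part of the URL before it. Quoting the whole URL keeps it as a single argument:

```shell
# Unquoted (what probably happens now):
#   wget http://www.mysite.com?user=me&password=me
#   -> wget receives only "...?user=me"; "password=me" goes to the shell.
# Quoted (the fix, works in both cmd.exe and Unix shells):
#   wget "http://www.mysite.com?user=me&password=me"
url='http://www.mysite.com?user=me&password=me'
printf '%s\n' "$url"
```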
Hi All,
Is there any way to get the following FTP mirroring
working with wget? I remember this working, but I am
not sure whether the domain has changed its server software.
#Mirroring FTP area
wget -r -nH --cut-dirs=1 -N -nr -l0 -np --cache=off \
ftp://user2:[EMAIL PROTECTED]//web \
-P