> To: Doug Kaufman
> Cc: Dan Mahoney, System Admin; [EMAIL PROTECTED]
> Subject: Re: arguably a bug
>
>
> On Wed, May 22, 2002 at 09:20:57PM -0700, Doug Kaufman wrote:
> > On Thu, 23 May 2002, Henrik van Ginhoven wrote: (no I did not ;)
> >
On Wed, May 22, 2002 at 09:20:57PM -0700, Doug Kaufman wrote:
> On Thu, 23 May 2002, Henrik van Ginhoven wrote: (no I did not ;)
>
> > On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney, System Admin wrote:
> > > On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
> > >
> > > Now, something occurs to ME...
On Thu, 23 May 2002, Henrik van Ginhoven wrote:
> On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney, System Admin wrote:
> > On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
> >
> > Now, something occurs to ME. There really SHOULD be an alternate means of
> > prompting the user for a password...
On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney, System Admin wrote:
> On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
>
> Now, something occurs to ME. There really SHOULD be an alternate means of
> prompting the user for a password (i.e. something which is not readily
> visible through ps...
On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
Now, something occurs to ME. There really SHOULD be an alternate means of
prompting the user for a password (i.e. something which is not readily
visible through ps, or saved to a history file). I mean REALLY. wget
shows username:*password* in...
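A no-echo prompt is simple enough on POSIX systems. A minimal sketch, assuming
a POSIX terminal and with illustrative names (this is not Wget's actual code):

#include <stdio.h>
#include <string.h>
#include <termios.h>
#include <unistd.h>

/* Read a password from stdin with terminal echo turned off. */
static int read_password(char *buf, int len)
{
    struct termios saved, noecho;

    if (tcgetattr(STDIN_FILENO, &saved) != 0)
        return -1;
    noecho = saved;
    noecho.c_lflag &= ~ECHO;                    /* stop echoing keystrokes */
    if (tcsetattr(STDIN_FILENO, TCSAFLUSH, &noecho) != 0)
        return -1;

    fputs("Password: ", stderr);
    if (fgets(buf, len, stdin) == NULL)
        buf[0] = '\0';
    buf[strcspn(buf, "\n")] = '\0';             /* strip trailing newline */

    tcsetattr(STDIN_FILENO, TCSAFLUSH, &saved); /* restore echo */
    fputc('\n', stderr);
    return 0;
}

The password then never appears on the command line, so neither ps nor the
shell history ever sees it.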
On Tue, 21 May 2002, Andrew Mayo wrote:
> Unlike ncftpget and ncftpput, wget appears to have no mechanism for
> controlling ftp authentication. It always appears to perform an
> anonymous ftp transfer with no supplied password.
>
> However, it would be desirable to allow a user and password override...
Unlike ncftpget and ncftpput, wget appears to have no mechanism for controlling ftp
authentication. It always appears to perform an anonymous ftp transfer with no
supplied password.
However, it would be desirable to allow a user and password override to allow wget to
retrieve from password-protected...
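For what it's worth, wget does accept credentials embedded in the URL (the
syntax its documentation calls out elsewhere in this archive); the host and
path here are made up:

wget ftp://user:password@ftp.example.com/pub/file.bin

The catch is exactly the one raised above: the password is then visible in
ps output and in the shell history.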
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Hello all!
the bug is very simple and causes a segfault on my i386 architecture.
Try this:
wget -r /foo
Segmentation fault
which causes an immediate segfault.
This is also inconsistent with the same command without the -r:
wget /foo
Hi,
I have tried to download this page [1] by following the links. The initial
page is saved correctly. But then this link [2] should be loaded, which
results in this HTTP query [3]. The actual problem is that '%2F' is decoded
to '@2F' (whereas e.g. '%5F' is correctly decoded to '_').
René
PS: I app
In message "Re: bug report and patch, HTTPS recursive get",
Ian Abbott wrote...
> Thanks again for the bug report and the proposed patch. I thought some
> of the scheme tests in recur.c were getting messy, so propose the
> following patch that uses a function to check
>
>> wget -r https://www.example.com/index.html
>
>wget gets http://www.wget.org/ and other urls which
>are linked from http://www.wget.org/.
Thanks again for the bug report and the proposed patch. I thought some
of the scheme tests in recur.c were getting messy, so propose the
following patch that uses a function to check...
On Wed, 15 May 2002 18:44:19 +0900, Kiyotaka Doumae <[EMAIL PROTECTED]>
wrote:
>I found a bug in wget with HTTPS recursive get, and propose
>a patch.
Thanks for the bug report and the proposed patch. The current scheme
comparison checks are getting messy, so I'll write a
On Fri, 3 May 2002 18:37:22 +0200, Emmanuel Jeandel
<[EMAIL PROTECTED]> wrote:
>ejeandel@yoknapatawpha:~$ wget -r a:b
>Segmentation fault
Patient: Doctor, it hurts when I do this
Doctor: Well don't do that then!
Seriously, this is already fixed in CVS.
ejeandel@yoknapatawpha:~$ wget -r a:b
Segmentation fault
ejeandel@yoknapatawpha:~$
I encountered this bug while I wanted to do wget ftp://a:b@c/, forgetting the
ftp://.
The bug is not present when -r is not there (a:b: Unsupported scheme).
Emmanuel
-- if the connection is not persistent (which in this case
> it isn't), it should close immediately after the data is received.
There is/was no problem with wget. Here is the solution/answer
from the bug reporter
--8<--quote--8<--
This bug is to do with `transparent' web proxies...
Noel Koethe <[EMAIL PROTECTED]> writes:
> the wget 1.8.1 manpage tells me:
>
>--progress=type
>    Select the type of the progress indicator you wish to
>    use. Legal indicators are ``dot'' and ``bar''.
>
>    The ``dot'' indicator is used by default. It traces
I'm afraid that downloading files larger than 2G is not supported by
Wget at the moment.
Tristan Horn <[EMAIL PROTECTED]> writes:
> tris.net/index.html: merge("http://tris.net/", "//www.arrl.org/") ->
> http://tris.net//www.arrl.org/
> (it should return http://www.arrl.org/)
>
> See page 11 of rfc1630 and page 11 of rfc2396 for more details. I
> may well be the only person using 'e
"Matt Jackson" <[EMAIL PROTECTED]> writes:
> I'm using the NT port of WGET 1.8.1.
>
> FTP retrieval of files works fine, retrieval of directory listings fails.
> The problem happens under certain conditions when connecting to OS2 FTP
> servers.
>
> For example, if the "current directory" on the FTP server at login time is...
Pascal Vuylsteker <[EMAIL PROTECTED]> writes:
> I've downloaded wget from http://macosx.forked.net/ as a port to
> MacOSX (package).
I'm not sure how internationalization works on MacOS X. Perhaps you
should ask the people who did the porting?
If you want Wget to print English (original) messages...
On Wed, 10 Apr 2002, Hrvoje Niksic wrote:
> You're right. I'll apply this patch, which I think should add enough
> warnings to educate the unwary.
Thanks for this patch. I will merge it into version 1.8.1
for Debian.
--
Noèl Köthe
Guillaume Morin <[EMAIL PROTECTED]> writes:
> I am forwarding Debian wishlist bug 21344
> http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21344&repeatmerged=yes
>
> In 1.8.1, the result is different. You get an overflow notice...
Yes, the negative amount has been fixed by
I believe this is already on the todo list. However, this is made
harder by the fact that, to implement this kind of reject, you have to
start downloading the file. This is very different from the
filename-based rejection, where the decision can be made at a very
early point in the download process...
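A sketch of where such a check would have to sit: only once the response
headers have arrived (i.e. after the transfer has started) is the
Content-Type known. The reject list and helper below are hypothetical, not
Wget code:

#include <string.h>

static const char *reject_types[] = { "application/zip", "video/mpeg", 0 };

/* Return nonzero if this Content-Type is on the user's reject list. */
static int reject_by_mime(const char *content_type)
{
    int i;
    for (i = 0; reject_types[i]; i++)
        if (strncmp(content_type, reject_types[i],
                    strlen(reject_types[i])) == 0)
            return 1;   /* abort the transfer, discard what was read */
    return 0;           /* keep downloading */
}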
Guillaume Morin <[EMAIL PROTECTED]> writes:
> When getting a file in a non-root directory from FTP with wget, wget
> always tries CWD to that directory before getting the
> file. Unfortunately sometimes you're not allowed to CWD to a
> directory, but you're still allowed to list or download files from...
[ Cc'ing to [EMAIL PROTECTED], as requested by Guillaume. ]
Guillaume Morin <[EMAIL PROTECTED]> writes:
> this is from the "advanced usage" section of examples (info docs):
>
>> * If you want to encode your own username and password to HTTP or
>> FTP, use the appropriate URL syntax (*note
Unfortunately, this bug is not easy to fix. The problem is that `-O'
was originally invented for streaming, i.e. for `-O -'. As a result,
many places in Wget's code assume that they can freely operate on the
file names, and -O seems more like an afterthought.
On the other hand...
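The streaming case that `-O' was designed for still works well; for example
(hypothetical URL):

wget -O - http://www.example.com/archive.tar.gz | tar xzf -

It is the named-file case, `-O file', that collides with the assumptions
about file names mentioned above.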
Guillaume Morin <[EMAIL PROTECTED]> writes:
> For example if a link to the URL "/foo?bar" is seen then the correct
> file is downloaded and saved with the name "foo?bar". When viewing
> the pages with Netscape the '?' character is seen to separate the
> URL and the arguments. This makes the lin
Guillaume Morin <[EMAIL PROTECTED]> writes:
> If wget fetches a url which redirects to another host, wget
> retrieves the file, and there's nothing that can be done to turn
> that off.
>
> So, if you do wget -r on a machine that happens to have a redirect to
> www.yahoo.com you'll wind up trying
Good point there. I wonder... is there a legitimate reason to require
atime to be set to the mtime time? If not, we could just make the
change without the new option. In general I'm careful not to add new
options unless they're really necessary.
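For reference, the change would be a one-liner at the point where timestamps
are applied after download. A sketch using POSIX utime(), with remote_mtime
standing in for the server-reported time (not the actual Wget code):

#include <time.h>
#include <utime.h>

/* Set mtime from the server; leave atime at "now" instead of remote mtime. */
static int touch_downloaded(const char *path, time_t remote_mtime)
{
    struct utimbuf times;
    times.actime  = time(NULL);      /* the proposed behaviour */
    times.modtime = remote_mtime;    /* as Wget already does */
    return utime(path, &times);
}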
Guillaume Morin <[EMAIL PROTECTED]> writes:
> if I use 'wget ftp://site.com/file1.txt ftp://site.com/file2.txt',
> wget will not reuse the ftp connection, but will open one for each
> document downloaded from the same site...
Yes, that's how Wget currently behaves
Guillaume Morin <[EMAIL PROTECTED]> writes:
> I am forwarding you this bug. I can reproduce this on 1.8.1
Thanks for the report. I believe this patch should fix it:
Index: src/ChangeLog
===
RCS file: /pack/anoncvs
? (You may want to limit the
> recursion depth and the maximum amount to download if repeating the
> test!)
I found out that more circumstances have to be fulfilled in order
to reproduce this bug. I have a local copy of the website where it
occurred, and it seems necessary to have nearly a
On 4 Apr 2002 at 13:21, Robert Mücke wrote:
> So it seems to be important to correct this behaviour. I think you only need
> to set up a test site (maybe with some subdirs) containing one file with
> an erroneous href="" tag to reproduce this (maybe only in parts
> depending on your server configuration...
>
> (it should return http://www.arrl.org/)
There haven't been any releases since 1.8.1, but this bug is fixed
in the current CVS version.
Hi,
Just wanted to point out that as of version 1.8.1, wget doesn't correctly
recognize "//"-style links.
tris.net/index.html: merge("http://tris.net/", "//www.arrl.org/") ->
http://tris.net//www.arrl.org/
(it should return http://www.arrl.org/)
See page 11 of rfc1630 and page 11 of rfc2396 for
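The failing input is what RFC 2396 calls a network-path reference: a link
beginning with "//" keeps only the scheme of the base URL. A sketch of that
special case alone (illustrative; not the actual merge code shown in the
debug output above):

#include <stdlib.h>
#include <string.h>

/* merge_net_path("http://tris.net/", "//www.arrl.org/")
   returns "http://www.arrl.org/"; returns NULL for other link forms. */
static char *merge_net_path(const char *base, const char *link)
{
    if (strncmp(link, "//", 2) == 0) {
        const char *colon = strchr(base, ':');    /* end of base scheme */
        if (colon) {
            size_t scheme_len = colon - base + 1; /* include the ':' */
            char *res = malloc(scheme_len + strlen(link) + 1);
            if (!res)
                return NULL;
            memcpy(res, base, scheme_len);        /* "http:" */
            strcpy(res + scheme_len, link);       /* + "//www.arrl.org/" */
            return res;
        }
    }
    return NULL;  /* fall through to the ordinary merge rules */
}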
Dear wget team,
I recently found a bug in version 1.8 of the wget program (recursive
retrieval) that did not occur in earlier versions (at least as far as
I can see, 1.7 is definitely not affected).
The new wget version treats single "?xxx" hrefs the same way as hrefs to
anchors...
I'm using the NT port of WGET 1.8.1.
FTP retrieval of files works fine, retrieval of directory listings fails.
The problem happens under certain conditions when connecting to OS2 FTP
servers.
For example, if the "current directory" on the FTP server at login time is
"e:/abc", the command "wget f
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Hallo specialists,
I used wget 1.8.1 on my system to mirror the site www.europa.eu.int.
Transfer was through a proxy and DSL overnight.
After about 12-13 hours I found the following situation:
Total download: about 1.8GB of data.
The wget process was increa
Hello,
I got another bug reported (http://bugs.debian.org/139059):
"Examples:
roadkill:~/wget-1.8.1/src# wget -r http://www:s/
Segmentation fault
roadkill:~/wget-1.8.1/src# wget -r iftp://www.example.org/
Segmentation fault
Doesn't look to be exploitable, I think. When recursive
Hi,
I've downloaded wget from http://macosx.forked.net/ as a port to MacOSX
(package).
It installed fine and even realized that my native language was French,
but had some issues with the printing of the help in French: the
accented characters are replaced by a ?
I would even prefer to have access...
I found a serious bug in wget, all versions
affected.
Description: It is highly addictive
Solution: You should include a warning about this
somewhere in the product :)
a windows user
Hello,
the wget 1.8.1 manpage tells me:
--progress=type
    Select the type of the progress indicator you wish to
    use. Legal indicators are ``dot'' and ``bar''.

    The ``dot'' indicator is used by default. It traces
    the retrieval by printing dots on
I'm using --html-extension to append files with the html extension.
Debug log is below. I'm not getting the expected result, and I'm hoping
someone can determine the problem. For testing purposes, I've got a cgi
script that generates the html for a page. The server that the cgi is
running on, h
fbsd1 --- http wget eshop.tar (3.3G) ---> fbsd2
command was:
# wget http://kamenica/eshop.tar
at the second G I got the following:
2097050K .. .. .. .. .. 431.03 KB/s
2097100K .. .. .. .. ..8.14 MB/s
2097150K
On Thursday 21 February 2002 05:44, Ian Abbott wrote:
> On 21 Feb 2002 at 1:31, Alan Eldridge wrote:
> > You can't get it to work for timing out a socket connection, because
> > that is a bit of code that hasn't been implemented yet.
> >
> > If no one else wants to, I can work up a patch for this
On 21 Feb 2002 at 1:31, Alan Eldridge wrote:
> You can't get it to work for timing out a socket connection, because
> that is a bit of code that hasn't been implemented yet.
>
> If no one else wants to, I can work up a patch for this next week.
> It's pretty standard coding, right out of Stevens
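The standard recipe (out of Stevens, as said) is a non-blocking connect()
followed by select() with the timeout. Roughly, with error handling and the
final SO_ERROR check trimmed (a sketch, not the eventual patch):

#include <errno.h>
#include <fcntl.h>
#include <sys/select.h>
#include <sys/socket.h>

static int connect_with_timeout(int fd, const struct sockaddr *sa,
                                socklen_t salen, int seconds)
{
    fd_set wset;
    struct timeval tv;
    int flags = fcntl(fd, F_GETFL, 0);

    fcntl(fd, F_SETFL, flags | O_NONBLOCK);
    if (connect(fd, sa, salen) < 0 && errno != EINPROGRESS)
        return -1;                    /* immediate, hard failure */

    FD_ZERO(&wset);
    FD_SET(fd, &wset);
    tv.tv_sec = seconds;
    tv.tv_usec = 0;
    if (select(fd + 1, NULL, &wset, NULL, &tv) <= 0)
        return -1;                    /* timed out (or select error) */

    fcntl(fd, F_SETFL, flags);        /* back to blocking mode */
    return 0;                         /* writable: check SO_ERROR to be sure */
}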
On Wed, Feb 20, 2002 at 10:20:38PM -0800, Partycrew Industries wrote:
>I might be wrong but I believe there is a bug in the
>--timeout=whatever syntax. I just can't get the
>program to obey it under any circumstances, I put in
No, that is not a correct statement. The program obeys...
I might be wrong but I believe there is a bug in the
--timeout=whatever syntax. I just can't get the
program to obey it under any circumstances, I put in
an IP that I know is non-existent and it takes it
forever to "figure that out". I've even tried
changing it in init.c wi
Hello,
http://bugs.debian.org/134765
--8<--
Package: wget
Version: 1.8.1-0.2
I have a ~/.wgetrc that contains the line
login = anonymous
and a .netrc that contains, among other things,
default login raj password xxx
The man page says that wget reads .wgetrc, but actually it also
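For reference, the usual ~/.netrc shapes side by side: the default entry as
quoted above plus a per-host one (hostname invented):

machine ftp.example.com login raj password yyy
default login raj password xxx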
> [The message I'm replying to was sent to <[EMAIL PROTECTED]>. I'm
> continuing the thread on <[EMAIL PROTECTED]> as there is no bug and
> I'm turning it into a discussion about features.]
>
> On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote:
> > I've
"Mr.Fritz" <[EMAIL PROTECTED]> writes:
> When I retrieve recursively a directory using a site with https protocol,
> it searches for http://sitename/robots.txt but the site has only port
> 443 (https) open, so there is a connection refused error. Wget thinks
> the site is down and aborts the transfer...
Peteris Krumins <[EMAIL PROTECTED]> writes:
> GNU Wget 1.8
>
> wget: progress.c:673: create_image: Assertion `p - bp->buffer <= bp->width' failed.
This problem has been fixed in Wget 1.8.1. Please upgrade.
[The message I'm replying to was sent to <[EMAIL PROTECTED]>. I'm
continuing the thread on <[EMAIL PROTECTED]> as there is no bug and
I'm turning it into a discussion about features.]
On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote:
> I
Hey there,
I want to download a file at Mustek's ftp site in America. This site has a 20
users limit. Have a look at this:
bash-2.05# wget --wait=30 --waitretry=30 -t 0
ftp://128.121.112.104/pub/1200UBXP/Web.EXE
--15:10:37-- ftp://128.121.112.104/pub/1200UBXP/Web.EXE
=> `Web.EXE'
Connecting to 128.121.112.104:21...
If you wish to continue to submit further information on your problem,
please send it to [EMAIL PROTECTED], as before.
Please do not reply to the address at the top of this message,
unless you wish to report a problem with the Bug-tracking system.
Debian bug tracking system administrator...
Guillaume Morin wrote:
>
> Forward of
> http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21588&repeatmerged=yes
>
>
>
> If I access a server not on the default port, wget does not write that
> port in the name of the directory it creates. Here is an example:...
Debian wishlist bug 105278
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=105278&repeatmerged=yes
It would be nice if, upon noticing that it's getting a lot of invalid
port errors, wget would automatically try a passive FTP download unless
there had been some explicit configu
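Until something like that exists, the workaround is to ask for passive mode
up front rather than waiting for the errors, on the command line or in
~/.wgetrc (hypothetical host):

wget --passive-ftp ftp://ftp.example.com/pub/file

# or, in ~/.wgetrc:
passive_ftp = on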
Debian wishlist bug 104122
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=104122&repeatmerged=yes
It would be extremly useful to have a 'quirks' mode which would do the
following (for instance, other things can be added):
- If a URL with \ characters gets a 404, try...
Forward of
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21588&repeatmerged=yes
If I access a server not on the default port, wget does not write that
port in the name of the directory it creates. Here is an example:
--13:43:40-- http://www.center.osaka-u.ac.jp:7080/ce
Hi,
I am forwarding Debian wishlist bug 21344
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21344&repeatmerged=yes
In 1.8.1, the result is different. You get an overflow notice...
When downloading more than 2 GB wget will give a negative number of
bytes in its summary at the end
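The negative number is plain 32-bit wraparound: the byte counters are signed
longs, which on 32-bit platforms wrap at 2^31 bytes = 2GB. An illustration,
assuming a 32-bit long:

#include <stdio.h>

int main(void)
{
    long total = 2147483647L;  /* 2^31 - 1, the largest 32-bit signed value */
    total += 1;                /* one more byte downloaded... */
    printf("%ld\n", total);    /* prints -2147483648 with a 32-bit long */
    return 0;
}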
Hi,
I am forwarding Debian wishlist bug 21148
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21148&repeatmerged=yes
While wget allows me to include/exclude documents based on their
extension, it doesn't allow me to do the same based on mime type
(for example, if I only want to
Hi,
I am forwarding Debian bug #131851
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=131851&repeatmerged=yes
I can reproduce it on 1.8.1.
When getting a file in a non-root directory from FTP with wget, wget
always tries CWD to that directory before getting the file. Unfortunately...
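As a hypothetical session (server replies invented), the exchange wget
insists on is the first one; the second would work on such servers:

CWD /private/dir
550 /private/dir: Permission denied.

RETR /private/dir/file.txt
150 Opening BINARY mode data connection for file.txt.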
Hi,
I am forwarding Debian bug 113281
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=113281&repeatmerged=yes
It still applies to 1.8.1. I am sure it is a bug though
wget doesn't wait when retrying to connect to an FTP server. Not sure
if this affects HTTP downloads.
In
My mistake, it is bug 106361. Please do not CC [EMAIL PROTECTED] but
[EMAIL PROTECTED]
TIA.
--
Guillaume Morin <[EMAIL PROTECTED]>
If you want the answers, you'd better get ready for the fire
(System of a Down)
Hi,
I am forwarding Debian bug 106391.
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=106361&repeatmerged=yes
The bug still applies.
this is from the "advanced usage" section of examples (info docs):
> * If you want to encode your own username and password to
Hi,
I am forwarding to you Debian bug 88176.
(http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=88176&repeatmerged=yes)
I can reproduce the problem with 1.8.1
The following transcript shows that wget can do the Bad Thing with
-O when timestamping.
It can result in a 0 byte long re
Hi,
I am forwarding you Debian bug 65971
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=65791&repeatmerged=yes
I can reproduce this problem with 1.8.1
With the '-k' option or the 'convert_links = on' option in .wgetrc, the
links in the downloaded HTML pages are
Hi,
I am forwarding Debian bug 55145
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=55145&repeatmerged=yes
I can reproduce it on 1.8.1
If wget fetches a url which redirects to another host, wget
retrieves the file, and there's nothing that can be done to turn
that off.
So,
Hi,
I am forwarding Debian wishlist bug #32712
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=32712&repeatmerged=yes
Mirrors default behaviour when running in mirror or time stamp mode is
to set both atime and mtime of the down loaded file to the remote
files mtime. This ca
Hi,
I am forwarding Debian bug 32523
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=32353&repeatmerged=yes
-
if I use 'wget ftp://site.com/file1.txt ftp://site.com/file2.txt', wget
will not reuse the ftp connection, but will open one for each document
downloaded from
I am forwarding you 15844. I can reproduce it on 1.8.1.
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=15844&repeatmerged=yes
-
I observed this behavior which is not documented, and I don't think
it should happen.
Suppose ftp://host.com/filename is a symlink to
ftp://host.com/rea
Hi,
I am forwarding you this bug. I can reproduce this on 1.8.1
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=117774&repeatmerged=yes
---
wget seems to always return 0 as return code even when it fails, but
AFAIK only when using some wildcard char in the URL. For example:
spiney:~ $
On 01/02/2002 12:10:59 "Mr.Fritz" wrote:
>After the https/robots.txt bug, doing a recursive wget to an https-only server
>gives me this error: it searches for http://servername/index.html but there
>is no server on port 80, so wget receives a Connection refused error and
After the https/robots.txt bug, doing a recursive wget to an https-only server
gives me this error: it searches for http://servername/index.html but there
is no server on port 80, so wget receives a Connection refused error and
quits. It should search for https://servername/index.html
/robots.txt !!
Wget is the best downloading program, so I hope this bug gets fixed
very soon. Thank you
> simply add a check routine in the 'tag_handle_meta' function.
Thanks for the report; this patch should fix the bug:
2002-02-01 Hrvoje Niksic <[EMAIL PROTECTED]>
* html-url.c (tag_handle_meta): Don't crash on <meta> where content is missing.
Index: src/html-url.c
Hi.
if an HTML document contains code like this:
<meta http-equiv="refresh">
wget may be crashed. It has 'refresh' but
does not have 'content'. Of course this is
incorrect HTML, but I found some pages like this on the web :)
Simply add a check routine in the 'tag_handle_meta' function.
An, Young Hun
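The guard amounts to checking for a missing content attribute before using
it. Sketched with made-up helper names rather than the real html-url.c
internals:

#include <stdio.h>
#include <strings.h>   /* strcasecmp */

/* Bail out of <meta http-equiv="refresh"> handling if content is absent. */
static void handle_meta_refresh(const char *http_equiv, const char *content)
{
    if (!http_equiv || strcasecmp(http_equiv, "refresh") != 0)
        return;                 /* not a refresh tag */
    if (!content)
        return;                 /* the crashing case: no content attribute */
    printf("refresh spec: %s\n", content);  /* parse delay and URL here */
}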
GNU Wget 1.8
wget: progress.c:673: create_image: Assertion `p - bp->buffer <= bp->width' failed.
I don't know what causes this bug; I get it if I run wget with: -c -r -nH
--cut-dirs=4 --http-user=user --http-passwd=pass http://server/dir/
or if I even run it with: -r --http-user=user --http-passwd=pass...
Лыков А.А. <[EMAIL PROTECTED]> writes:
> 0% [ ] 9,652 1.73K/s ETA 3:31:49
> assertion "p - bp->buffer <= bp->width" failed: file "progress.c", line 673
> Abort trap (core dumped)
The bug has been fixed in Wget 1.8.1. Please upgrade.
Hello,
A strange bug has occurred several times. Following is my screen dump.
=== Cut ===
bash-2.03$ wget -c
http://drivers.aver.com/software/AVerTVPVR/AVerTVStudioWin2K.zip
--09:14:43--
http://drivers.aver.com/software/AVerTVPVR/AVerTVStudioWin2K.zip
=> `AVerTVStudioWin2K.
On 21 Jan 2002 at 14:56, Thomas Lussnig wrote:
> >Why not just open the wgetrc file in text mode using
> >fopen(name, "r") instead of "rb"? Does that introduce other
> >problems?
> I think it has to do with comments, because the definition is that
> starting with '#' the rest of the line
> is ignored...
On 2002-01-21 18:53 +0100, Hrvoje Niksic wrote:
> "Ian Abbott" <[EMAIL PROTECTED]> writes:
>
> > Why not just open the wgetrc file in text mode using fopen(name,
> > "r") instead of "rb"? Does that introduce other problems?
>
> Not that I'm aware of. The reason we use "rb" now is the fact that
"Ian Abbott" <[EMAIL PROTECTED]> writes:
> Why not just open the wgetrc file in text mode using fopen(name,
> "r") instead of "rb"? Does that introduce other problems?
Not that I'm aware of. The reason we use "rb" now is the fact that we
handle the EOL problem ourselves, and it seems "safer" to
>
>
>>>WGet returns an error message when the .wgetrc file is terminated
>>>with an MS-DOS end-of-file mark (Control-Z). MS-DOS is the
>>>command-line language for all versions of Windows, so ignoring the
>>>end-of-file mark would make sense.
>>>
>>Ouch, I never thought of that. Wget opens files
On 17 Jan 2002 at 2:15, Hrvoje Niksic wrote:
> Michael Jennings <[EMAIL PROTECTED]> writes:
> > WGet returns an error message when the .wgetrc file is terminated
> > with an MS-DOS end-of-file mark (Control-Z). MS-DOS is the
> > command-line language for all versions of Windows, so ignoring the
>
Michael Jennings <[EMAIL PROTECTED]> writes:
> However, I have a comment: There is simple logic that would solve
> this problem. WGet, when it reads a line in the configuration file,
> probably now strips off trailing spaces (hex 20, decimal 32). I
> suggest that it strip off both trailing spaces
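Stripping at end of line is cheap to do by hand in any case. A sketch, not
the actual init.c code:

#include <string.h>

/* Trim trailing LF/CR, blanks, and a stray DOS ^Z (0x1A) end-of-file mark. */
static void clean_rc_line(char *line)
{
    size_t len = strlen(line);
    while (len > 0 && (line[len - 1] == '\n' || line[len - 1] == '\r'
                       || line[len - 1] == ' ' || line[len - 1] == '\t'
                       || line[len - 1] == '\x1a'))
        line[--len] = '\0';
}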
> From: Michael Jennings [mailto:[EMAIL PROTECTED]]
> Obviously, this is completely your decision. You are right,
> only DOS editors make the mistake. (It should be noted that
> DOS is MS Windows' only command line language. It isn't going
> away; even Microsoft supplies command line utilities w
Obviously, this is completely your decision. You are right, only DOS editors make the
mistake. (It should be noted that DOS is MS Windows' only command line language. It
isn't going away; even Microsoft supplies command line utilities with all versions of
its OSs. Yes, Windows will probably
Herold Heiko <[EMAIL PROTECTED]> writes:
> My personal idea is:
> As a matter of fact no *windows* text editor I know of, even the
> supplied windows ones (notepad, wordpad) AFAIK will add the ^Z at the
> end of file.txt. Wget is a *windows* program (although running in
> console mode), not a *DOS* one...
On 17/01/2002 07:34:05 Herold Heiko wrote:
[proper order restored]
>> -Original Message-
>> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
>> Sent: Thursday, January 17, 2002 2:15 AM
>> To: Michael Jennings
>> Cc: [EMAIL PROTECTED]
>> Subje
> -Original Message-
> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
> Sent: Thursday, January 17, 2002 2:15 AM
> To: Michael Jennings
> Cc: [EMAIL PROTECTED]
> Subject: Re: Bug rep
Michael Jennings <[EMAIL PROTECTED]> writes:
> 1) There is a very small bug in WGet version 1.8.1. The bug occurs
>when a .wgetrc file is edited using an MS-DOS text editor:
>
> WGet returns an error message when the .wgetrc file is terminated
> with an MS-DOS end-of
Ivan Buttinoni <[EMAIL PROTECTED]> writes:
> - for "recursive retrieval", multiple simultaneous gets
This is very hard to do, not easy at all.
> - last but not least: javascript support (eheheh)
And this is even harder. Javascript is a full programming language
which, as used by the sites,
Ivan Buttinoni <[EMAIL PROTECTED]> writes:
> Ack! I found these nice options:
>--sslcertfile=FILE optional client certificate.
>--sslcertkey=KEYFILE optional keyfile for this certificate.
>--egd-file=FILE    file name of the EGD socket.
> but there's no reference...
Brendan Ragan <[EMAIL PROTECTED]> writes:
> This is the problem I'm having with an older wget (1.5.3) when I
> enter the URL
>
> 'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
>
> it goes
>
> Connecting to www.tranceaddict.com:80... connected!
> HTTP request sent, awaiting response...
"Peter Gucwa @ IIS-RTP" <[EMAIL PROTECTED]> writes:
> option -k does not work in following call:
> wget -k -r -l 1 http://www.softcomputer.com/cgi/jobs.cgi
What version of Wget are you using?
How exactly does it not work? What did you expect to happen, and what
happened instead?
Ryan Daniels <[EMAIL PROTECTED]> writes:
> The following command line causes a Segfault on my system:
>
> wget -spider http://www.yahoo.com
Note that the correct syntax is `--spider', and that this (currently
defunct) option does not accept arguments.
But th
On Thursday 10 January 2002 16:51, you wrote:
> > - https support
>
> It has that, too. Since 1.7. You may need to recompile it yourself with
> the openssl libs.
Ack! I found these nice options:
--sslcertfile=FILE optional client certificate.
--sslcertkey=KEYFILE optional keyf
> - https support
It has that, too. Since 1.7. You may need to recompile it yourself with
the openssl libs.
---
Kim Scarborough http://www.unknown.nu/kim/