Micah Cowan wrote:
> Actually, I'll have to confirm this, but I think that current Wget will
> re-download it, but not overwrite the current content, until it arrives
> at some content corresponding to bytes beyond the current content.
>
I need to investigate further to see…
Maksim Ivanov wrote:
> I'm trying to download the same file from the same server, command line
> I use:
> wget --debug -o log -c -t 0 --load-cookies=cookie_file
> http://rapidshare.com/files/153131390/Blind-Test.rar
>
> Below…
> …would have to deal with two parameters.
It's clearly easier to deal with options that wget is already programmed to
support. For a primer on wget options, take a look at this page on the wiki:
http://wget.addictivecode.org/OptionsHowto
I suspect you will need to add support for a…
Hi,
* zanzi ([EMAIL PROTECTED]) wrote:
> I have tried to wget http://anidb.net/perl-bin/animedb.pl?show=main but all
> I seem to get is a file with unreadable characters (and not the HTML file
> I'm after).
> Is it because of some perl-script on the site?
This perl script as…
Hi
I have tried to wget http://anidb.net/perl-bin/animedb.pl?show=main but all
I seem to get is a file with unreadable characters (and not the HTML file
I'm after).
Is it because of some perl-script on the site?
Thanks!
ZZ
I'm trying to download the same file from the same server; the command line I
use:
wget --debug -o log -c -t 0 --load-cookies=cookie_file
http://rapidshare.com/files/153131390/Blind-Test.rar
Below are attached two files: a log with 1.9.1 and a log with 1.10.2.
Both logs were made when Blind-Test.rar was already fully downloaded.
Hello!
Starting with version 1.10, wget has a very annoying bug: if you try to download
an already fully downloaded file, wget begins downloading it all over again,
but 1.9.1 says: "Nothing to do", as it should.
Yours faithfully, Maksim Ivanov
And you'll probably have to do this again; I bet
yahoo expires the session cookies!
On Tue, Sep 9, 2008 at 2:18 PM, Donald Allen <[EMAIL PROTECTED]> wrote:
> After surprisingly little struggle, I got Plan B working -- logged into
> yahoo with wget, saved the cookies, including…
After surprisingly little struggle, I got Plan B working -- logged into
yahoo with wget, saved the cookies, including session cookies, and then
proceeded to fetch pages using the saved cookies. Those pages came back
logged in as me, with my customizations. Thanks to Tony, Daniel, and Micah
-- you…
Donald Allen wrote:
> On Tue, Sep 9, 2008 at 1:41 PM, Micah Cowan <[EMAIL PROTECTED]> wrote:
>
> Donald Allen wrote:
>>> I am doing the yahoo session login with firefox, not…
Donald Allen wrote:
>> I am doing the yahoo session login with firefox, not with wget, so I'm
>> using the first and easier of your two suggested methods. I'm guessing
>> you are thinking that I'm trying to login to the…
…the extra items firefox is sending
> > appear to be the difference, because I included them (from the
> > livehttpheaders output) when I tried sending the cookies manually with
> > --header, I got the same page back with wget that indicated that yahoo
> > knew I was logged in and formatted…
> …livehttpheaders output) when I tried sending the cookies manually with
> --header, I got the same page back with wget that indicated that yahoo
> knew I was logged in and formatted the page with my preferences.
Perhaps you missed this in my last message:
>> Probably there are session cookies…
…you need to work hard(er) when trying this with non-browsers.
>
> It's certainly still possible, even without using the browser to get the
> first cookie file. But it may take some effort.
I have not been able to retrieve a page with wget as if I were logged
in using --load-cookies and Micah's suggestion…
On Mon, 8 Sep 2008, Donald Allen wrote:
The page I get is what would be obtained if an un-logged-in user went to the
specified url. Opening that same url in Firefox *does* correctly indicate
that it is logged in as me and reflects my customizations.
First, LiveHTTPHeaders is the Firefox plugin…
2008/9/8 Tony Godshall <[EMAIL PROTECTED]>:
> I haven't done this but I can speculate that you need to
> have wget identify itself as firefox.
When I read this, I thought it looked promising, but it doesn't work.
I tried sending exactly the user-agent string firefox is sending…
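For reference, the option involved here is wget's --user-agent; a minimal sketch with placeholder values (as noted above, matching the browser's User-Agent string alone was not sufficient in this case):

```shell
# Present a browser-like User-Agent header; the string and URL are
# illustrative placeholders, not values taken from this thread.
wget --user-agent="Mozilla/5.0 (X11; U; Linux i686) Gecko/2008 Firefox/3.0" \
     "http://example.com/some/page"
```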
There was a recent discussion concerning using wget to obtain pages
from yahoo logged into yahoo as a particular user. Micah replied to
Rick Nakroshis with instructions describing two methods for doing
this. This information has also been added by Micah to the wiki.
I just tried the simpler of…
--
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer.
GNU Maintainer: wget, screen, teseq
http://micah.cowan.name/
> …help me out, I am a complete novice in
> this regard.
WinCVS won't work, because there _is_ in fact no CVS module for Wget.
Wget uses Mercurial as the source repository (and was using Subversion
prior to that). For more information about the Wget source repository
and its use, see http:/…
houda hocine wrote:
> Hi,
Hi houda.
This message was sent to the wget-notify, which was not the proper
forum. Wget-notify is reserved for bug-change and (previously) commit
notifications, and is not intended for discussion (though I obviously…
Hi all,
I need to check out the complete source onto my local hard disk. I am using
WinCVS; when I searched for the module, it said that there is no module
information out there. Could anyone help me out? I am a complete novice in
this regard.
Thanks,
VinothKumar.R
Hi
* Jinhui Li ([EMAIL PROTECTED]) wrote:
> I am browsing the source code. And want to debug it to figure out how it
> works.
>
> So, somebody please tell me how to debug ( with GDB ) or where can I find
> information that I need.
Compile Wget with debug information (the -g flag for gcc)…
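A sketch of that workflow, assuming a typical Wget source tree (the exact configure flags vary between versions, and http_loop is just one plausible breakpoint in http.c):

```shell
# Build with debug symbols and without optimization, then run under gdb
./configure
make CFLAGS="-g -O0"
# The built binary lives in src/; pass wget's arguments after --args
gdb --args src/wget --debug "http://example.com/"
# inside gdb: break http_loop, then run / next / print <variable>, etc.
```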
I am browsing the source code. And want to debug it to figure out how it
works.
So, somebody please tell me how to debug ( with GDB ) or where can I find
information that I need.
Sorry if I bother you.
> …also, when you mean rename,
>
> what is the function to rename with wget?
I mean, just use the "mv" or "rename" command on your operating system.
> …remove it from "profondeeur", and that will be fixed!
This issue appears to have been fixed with the latest French
translation. It will be released with Wget 1.12.
karlito wrote:
> Hello,
>
> First of all, I would like to thank you for your great tool.
>
> I have a request:
>
> I use this function to save a URL with absolute links, so it's very good:
>
> wget -k http://www.google.fr/
>
> but I want to save this file under another name than index.html, like…
Greetings,
Saw the address to this mailing list on the IRC topic & motd, so I thought
asking here might help. Please CC any replies to me.
I've recently been using wget, and got it working for the most part, but
there's one issue that's really been bugging me. One of the…
…1.12 to
1.13, and some items that have just been moved to 1.14 will get moved to
the new 1.13 target.
If you have bookmarks to the 1.13 set of bugs in Savannah, that link now
goes to 1.14.
I've been very happy with the progress and improvements that have been
made to Wget over the…
Micah Cowan wrote:
> The easiest way to do what you want may be to log in using your browser,
> and then tell Wget to use the cookies from your browser, using
Given the frequency of the "login and then download a file" use case, it
should probably be documented on the wiki. (Pe…
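A sketch of that use case, pieced together from this thread (URLs, form fields, and file names are placeholders; real login forms differ per site, and some sites also check JavaScript or extra headers, as discussed above):

```shell
# 1. Log in, saving all cookies -- including session cookies, which
#    --save-cookies omits unless --keep-session-cookies is given.
wget --save-cookies=cookies.txt --keep-session-cookies \
     --post-data='user=me&password=secret' \
     -O /dev/null "http://example.com/login"

# 2. Reuse the saved cookies for subsequent downloads.
wget --load-cookies=cookies.txt "http://example.com/members/page.html"
```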
…translationproject.org/team/fr.html; other translation teams are
listed at http://translationproject.org/team/index.html
Looks like it's still present in the latest fr.po file at
http://translationproject.org/latest/wget/fr.po
* Tom ([EMAIL PROTECTED]) wrote:
> Hello!
hello,
> I'd like to let you know about a key that stayed pressed a quarter of a
> second too long, it seems!
...
> Téléchargement récursif:
> -r, --recursive spécifer un téléchargement récursif.
> -l, --level=NOMBRE *profond…
Hello Tom,
Thanks for this information.
But could you tell us which version of Wget this is?
You can get that information with: wget --version
I recommend the latest version of Wget, available here:
http://wget.addictivecode.org/FrequentlyAskedQuestions#download
Hello!
I'd like to let you know about a key that stayed pressed a quarter of a second
too long, it seems!
In Wget's help (wget --help), we indeed find:
Téléchargement récursif:
-r, --recursive spécifer un téléchargement récursif.
-l, --le…
At 04:27 PM 8/10/2008, you wrote:
Rick Nakroshis wrote:
> Micah,
>
> If you will excuse a quick question about Wget, I'm trying to find out
> if I can use it to download a page from Yahoo that requires me to be
> logged in using my Yahoo profile name and password…
…my thinking :)
On Thu, 7 Aug 2008, Micah Cowan wrote:
"niwt" (which I like best so far: Nifty Integrated Web Tools).
But the grand question is: how would that be pronounced? Like newt? :-)
--
/ daniel.haxx.se
> …the real work;
> the "getter" would probably be a tool for communicating with the main driver.
...
> - Using existing tools to implement protocols Wget doesn't understand
> (want scp support? Just register it as an scp:// scheme handler), and
> instantly add support to Wget…
Hi!
I use wget to download files from a ftp server in a bash script.
For example:
touch last.time
wget -nc ftp://[]/*.txt .
find -newer last.time
This fails if the files on the FTP server are older than my last.time. So I want
wget to set the file date/time to the local creation time, not the…
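The timestamp check itself can be seen in isolation (a standalone sketch with fabricated dates, no FTP involved; `touch -d` as used here is the GNU coreutils form):

```shell
# Files whose mtime predates the marker are not matched by -newer,
# which is exactly why preserved old server-side dates defeat the
# touch/wget/find script above.
mkdir -p /tmp/newer-demo && cd /tmp/newer-demo
touch -d '2001-01-01' old.txt    # stands in for a file keeping its FTP date
touch -d '2002-01-01' last.time  # the marker file
touch -d '2003-01-01' new.txt    # stands in for a locally-stamped file
find . -newer last.time -name '*.txt'   # lists only ./new.txt
```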
> support for MetaLink
>
> Current Wget? I think someone's actually working on this. But, given Wget's
> current single-connection support, it couldn't be much more than falling back
> on one URL when another is broken.
> Pluggable/Library Wget (with multiple…
Okay, so there's been a lot of thought in the past, regarding better
extensibility features for Wget. Things like hooks for adding support
for traversal of new Content-Types besides text/html, or adding some
form of JavaScript support, or support…
Is there a reason I get this:
> [EMAIL PROTECTED] Pending $ wget -O foo
> "http://www.littlegolem.net/jsp/info/player_game_list_txt.jsp?plid=1107>id=hex";
> Cannot specify -r, -p or -N if -O is given.
> Usage: wget [OPTION]... [URL]...
> [EMAIL PROTECTED] Pending $
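For context on the error above: in Wget 1.11 and later, -O is documented as writing all retrieved documents to a single output stream, so it is rejected alongside options that imply multiple or conditionally-written files (-r, -p, -N) -- possibly enabled via a .wgetrc rather than the command line. A sketch of the two usages (placeholder URL):

```shell
# Single page to a chosen filename: -O alone is fine
wget -O foo "http://example.com/page.jsp?plid=1107"

# Recursive or timestamp-based retrieval: drop -O and let wget name files
wget -N "http://example.com/page.jsp?plid=1107"
```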
Micah Cowan wrote:
> The thing is, though, those two threads should be running wgets under
> separate processes
Yes, the two threads are running wgets under separate processes with "system".
> What operating system are you running? Vista?
mipsel-linux with kernel v2.4 built from gcc v3.3.5
> I reproduce this. But I can't make sure the real problem is in
> "resolve_bind_address." In the attached message, both
> api.yougotphoto.com and farm1.static.flickr.com get the same
> IP (74.124.203.218). The two wgets are called from two threads of a
> program.
Yeah,…
…"resolve_bind_address."
In the attached message, both api.yougotphoto.com and farm1.static.flickr.com
get the same IP (74.124.203.218).
The two wgets are called from two threads of a program.
Best regards,
k.c. chao
p.s.
The log is the following:
wget -4 -t 6
"http://api.yougotphoto.com/dev…
kuang-cheng chao wrote:
> Dear Micah:
>
> Thanks for your work of wget.
>
> There is a question about two wgets run simultaneously.
> In method resolve_bind_address, wget assumes that this is called once.
> However, this will…
Hi all:
Greeting! I'm new in this list, hope I can help here.
I found that there are some I18N issues with wget. For example, my OS is
using GBK; when wget tries to get a URL encoded in UTF-8, there's some
issue in the encoding translation.
I have managed to resolve my issue by cha…
Hor Meng Yoong wrote:
> Hi:
>
> I understand that you are a very busy person. Sorry to disturb you.
Hi; please use the mailing list for support requests. I've copied the
list in my response.
> I am using wget to mirror (usi…
HARPREET SAWHNEY wrote:
> Hi,
>
> Thanks for the prompt response.
>
> I am using
>
> GNU Wget 1.10.2
>
> I tried a few things on your suggestion but the problem remains.
>
> 1. I exported the cookies file in Internet…
HARPREET SAWHNEY wrote:
> Hi,
>
> I am getting a strange bug when I use wget to download a binary file
> from a URL versus when I manually download.
>
> The attached ZIP file contains two files:
>
> 05.upc --- manually…
> …Thanks!
>
> Robert
Thanks, Doug, for pointing that out.
Good suggestion, I'll try this one too...
Robert
_
From: Doug Kaufman [mailto:[EMAIL PROTECTED]
To: wget@sunsite.dk
Sent: Tue, 01 Jul 2008 01:32:09 -0700
Subject: Re: Release: GNU Wget 1.11.4
On Mon, 30 Jun 2008, Micah Cowan wrote:
> Robert Denton wrote:
> > Hi,
On Mon, 30 Jun 2008, Micah Cowan wrote:
> Robert Denton wrote:
> > Hi, I have sent a few emails to:
> >
> > [EMAIL PROTECTED]
> >
> > but they keep bouncing (blocked by SpamAssassin). Is there any other
> > way to get off this list? Thanks!
>
> I'm afraid there's nothing we can do here. :\ Please…
…nothing we can do here. :\ Please contact
[EMAIL PROTECTED] to fix this.
Hi, I have sent a few emails to:
[EMAIL PROTECTED]
but they keep bouncing (blocked by SpamAssassin). Is there any other way to
get off this list? Thanks!
Robert
_
From: Christopher G. Lewis [mailto:[EMAIL PROTECTED]
To: Wget [mailto:[EMAIL PROTECTED], [EMAIL PROTECTED]
Sent: Mon…
I've published the Windows versions of Wget 1.11.4 on my web site.
http://www.christopherlewis.com/Wget/WGetFiles.htm
Christopher G. Lewis
http://www.ChristopherLewis.com
Announcing GNU Wget 1.11.4, a bugfix release.
The source code is available at
- http://ftp.gnu.org/gnu/wget/
- ftp://ftp.gnu.org/gnu/wget/
Documentation is at
- http://www.gnu.org/software/wget/manual/
More information about Wget is on the…
Hello list!
I've just filed a bug report about the use of prefixes in wget [1]. I
write this email to start discussion about the topic.
I think it's important to use binary prefixes just to make things
clear and unambiguous.
Right now the notation 10MB can be read at least like:…
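The ambiguity is easy to quantify; the two readings of "10MB" differ by roughly 5% (shell arithmetic sketch):

```shell
# Decimal (SI) reading: 10 MB = 10 * 1000^2 bytes
echo $((10 * 1000 * 1000))   # prints 10000000
# Binary reading: 10 MiB = 10 * 1024^2 bytes
echo $((10 * 1024 * 1024))   # prints 10485760
```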
Sorry Guys - just an ID 10 T error on my part.
I think I need to change 2 things in the proxy server.
1. URLs in the HTML being returned to wget - this works OK
2. The "Content-Location" header used when the web server reports a
"301 Moved Permanently" response - I…
Coombe, Allan David (DPS) wrote:
> However, the case of the files on disk is still mixed - so I assume that
> wget is not using the URL it originally requested (harvested from the
> HTML?) to create directories and files on disk. So what is it using? A
> HTTP header (if so, which…
…and the data that
wget put on disk was fully lowercased - problem solved - or so I
thought.
However, the case of the files on disk is still mixed - so I assume that
wget is not using the URL it originally requested (harvested from the
HTML?) to create directories and files on disk. So what is it using?…
Hello Stefan,
I have a question:
On 2008-06-18 12:17:12, Stefan Nowak wrote:
> wget \
> --page-requisites \
> --html-extension \
> --convert-links \
> --span-hosts \
> --no-check-certificate \
> --debug \
> https://help.ubuntu.com/community/MacBookPro/…
mm w wrote:
> a simple url-rewriting conf should fix the problem, without touching the file
> system
> everything can be done server side
Why do you assume the user of wget has any control over the server from which
content is being downloaded?
…does anyone know of a proxy server that could
translate urls from mixed case to lower case? I thought that if we
downloaded using wget via such a proxy server we might get the
appropriate result.
The other alternative we were thinking of was to post-process the files
with symlinks for all mixed case versions of files and…
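The symlink alternative can be sketched as a post-processing pass over the mirror (hypothetical layout and names; the thread did not settle on this implementation, and this only creates the fully-lowercased alias, not every case variant):

```shell
# Give every mixed-case file a fully lowercased alias via symlink.
mkdir -p /tmp/case-demo/Dir && cd /tmp/case-demo
echo hello > Dir/File.HTML                    # stand-in for a mirrored file
find . -type f -name '*[A-Z]*' | while read -r p; do
  lower=$(printf '%s' "$p" | tr 'A-Z' 'a-z')  # lowercase the whole path
  mkdir -p "$(dirname "$lower")"              # lowercase directory skeleton
  ln -sf "$PWD/${p#./}" "$lower"              # absolute link to the original
done
cat ./dir/file.html                           # prints hello
```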
…same effect ([EMAIL PROTECTED] is copied to en_US.po).
Again, this is only in the mainline repo, and not in any release.
On Jun 18, 2008, at 5:17 AM, Stefan Nowak wrote:
where do I set the locale of the CLI environment of MacOSX?
You should set the LANG environment variable to the desired locale,
and one which is supported on your system; you can look at the
directories in /usr/share/locale to see what locales are available.
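A sketch of that (locale names vary by system, and the directory path may differ outside MacOSX and Linux):

```shell
# See which locales the system knows about, then export one for the session
ls /usr/share/locale
export LANG=en_US.UTF-8   # hypothetical choice; use one listed above
```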
Dear Stefan,
If you take a look at the source of the page, you'll see this:
Simply add "-e robots=off" to your arguments and wget will ignore any
robots.txt files or tags. With that it should download everything you
want. (I did not find this myself, credits go to sxav for po…
Dear Micah Cowan!
We started a conversation yesterday on the IRC, but couldn't get to
an end, so this is my follow up.
As I am not subscribed to the mailinglist, please CC me for this
thread. Thanks!
I am using Wget 1.10.2 installed through Fink 0.8.1 on MacOSX 10.4.11
It was impossible…
> In other words, those names are well within the standard when the server
> understands them. As far as I know, there is nothing in Internet standards
> restricting mixed case paths.
>
:) read again, nobody does except some punk-head folks
>> that's it, if the server manages…
Hello,
entering the following command results in an error:
--- command start ---
c:\Downloads\wget_v1.11.3b>wget
"ftp://ftp.mozilla.org/pub/mozilla.org/thunderbird/nightly/latest-mozilla1.8-l10n/";
-P c:\Downloads\
--- command end ---
wget can't convert ".listing"-…
…Internet standards
restricting mixed case paths.
> that's it, if the server manages non-standard URL, it's not my
> concern, for me it doesn't exist
Oh. I see. You're writing to say that wget should only implement features that
are meaningful to you. Thanks for your narcissistic input.
Tony
> …original site has the URI strings "/dir/file", "dir/File", "Dir/file",
> and "/Dir/File", the same local file will be returned. However, wget will
> treat those as unique directories and files and you wind up with four copies.
>
> Allan asked if there is a way to have wget just create one copy and proposed
> one way that might accomplish that goal.
>
> Tony
>
>
--
-mmw
Steven M. Schweda wrote:
> >From Tony Lewis:
> > To have the effect that Allan seeks, I think the option would have to
> > convert all URIs to lower case at an appropriate point in the process.
> I think that that's the wrong way to look at it. Implementation
> details like name hashing may al…
…original posting: how can one
conveniently mirror a site whose server uses case insensitive names onto a
server that uses case sensitive names.
If the original site has the URI strings "/dir/file", "dir/File", "Dir/file",
and "/Dir/File", the same local file will be returned…
In the VMS world, where file name case may matter, but usually
doesn't, the normal scheme is to preserve case when creating files, but
to do case-insensitive comparisons on file names.
>From Tony Lewis:
> To have the effect that Allan seeks, I think the option would have to
> convert all URIs to lower case at an appropriate point in the process…