Re: newbie question

2005-04-14 Thread Jens Rösner
Hi! Yes, I see now, I misread Alan's original post. I thought he would not even be able to download the single .pdf. Don't know why, as he clearly said that getting a single .pdf works. Sorry for the confusion! Jens > "Tony Lewis" <[EMAIL PROTECTED]> writes: > > > PS) Jens was mistaken when he

Re: from Hacene: bug report

2005-03-25 Thread Jens Rösner
Hello! I don't speak French (or hardly any at all)... > C:\wget>wget --proxy=on -x -r -l 2 -k -x -l > imit-rate=50k --tries=45 --directory-prefix=AsptDD I think it should be: C:\wget>wget --proxy=on -x -r -l 2 -k -x --limit-rate=50k --tries=45 --directory-prefix=AsptDD on a single line

Re: Feature Request: Directory URLs and Mime Content-Type Header

2005-03-21 Thread Jens Rösner
Hi Levander! I am not an expert by any means, "just another user", but what does the -E option do for you? -E = --html-extension > apache. Could wget, for URLs that end in slashes, read the > content-type header, and if it's text/xml, could wget create index.xml > inside the directory wget
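For what it's worth, -E keys on the text/html Content-Type: files served as text/html whose URLs lack an .html suffix get one appended. A minimal sketch (example.com is a placeholder):
    wget -r -E "http://example.com/dir/"
As far as I know it does not cover text/xml, so it would not create an index.xml by itself.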

Re: Bug

2005-03-20 Thread Jens Rösner
Hi Jorge! Current wget versions do not support large files >2GB. However, the CVS version does, and the fix will be merged into the regular wget source. Jens (just another user) > When downloading a file of 2GB and more, the counter gets crazy; probably > it should use a long instead of an int

Re: a little help for use wget

2005-02-18 Thread Jens Rösner
Hi LucMa! > I have found a command > for auto-skipping when the file on my PC has the same name as the file > on the ftp -nc I guess. > But now I need a command to overwrite the file on my PC if it is smaller > than the file on the ftp. > -e robots=off -N -N should do the trick. > "http://www.p
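Roughly, the two options behave like this (a sketch; the ftp URL is a placeholder):
    wget -nc "ftp://ftp.example.com/file.zip"
        (never re-download a file that already exists locally)
    wget -N "ftp://ftp.example.com/file.zip"
        (re-download when the remote file is newer or the sizes differ)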

Re: -X regex syntax? (repost)

2005-02-18 Thread Jens Rösner
Hi Vince! > I did give -X"*backup" a try, and > it didn't work for me either. :( Does -X"dir" work for you at all? If not, there might be a problem with MacOS. I hope one of the more knowledgeable people here can help you! > However, I would like to confirm something dumb - will wget fetch the

Re: -X regex syntax? (repost)

2005-02-17 Thread Jens Rösner
Hi Vince! > tip or two with regards to using -X? I'll try! > wget -r --exclude-directories='*.backup*' --no-parent \ > http://example.com/dir/stuff/ Well, I am using wget under Windows, and there you have to use "exp", not 'exp', to make it work. The *x* wildcard works as expected. I could not test
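To illustrate the quoting difference on Windows (example.com stands in for the real host):
    wget -r --no-parent -X "*backup*" "http://example.com/dir/stuff/"
cmd.exe does not strip single quotes the way Unix shells do, so with 'exp' wget would see the quote characters as part of the pattern.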

RE: -i problem: Invalid URL ¹h: Unsupported scheme

2005-02-13 Thread Jens Rösner
Hi Mike! Strange! I suspect that you have some kind of typo in your test.txt If you cannot spot one, try wget -d -o logi.txt -i test.txt as a command line and send the debug output. Good luck Jens (just another user) > a) I've verified that they both exist > b) All of the URLs are purely HTTP.

Re: command prompt closes immediately after opening

2005-02-12 Thread Jens Rösner
Hi Jon! > and added the text "wget http://xoomer.virgilio.it/hherold/index.html" > in the file and saved it. I then double-clicked on it and nothing > happened that I could see. Well, there should now be a file called index.html in your wget directory! Now replace the text in your wget.bat wi
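If the window still closes too quickly to read anything, a pause at the end keeps it open; a sketch of what wget.bat could contain:
    wget http://xoomer.virgilio.it/hherold/index.html
    pause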

Re: command prompt closes immediately after opening

2005-02-12 Thread Jens Rösner
Hi Jon! > Yes, I tried using the 'command prompt' (thru XP) and it replied: > " > 'wget' is not recognized as an internal or external command, operable > program or batch file > " Did you cd to the directory wget was unpacked to? If not, you either need to do that or add the wget directory to your "pat
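Assuming wget was unpacked to C:\wget (a placeholder path), that would be:
    cd C:\wget
    wget --version
or, to make it callable from anywhere in the current session:
    set PATH=%PATH%;C:\wget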

Re: command prompt closes immediately after opening

2005-02-12 Thread Jens Rösner
Hi Jon! > I downloaded the full version, but when I click and run the > exe, a command > prompt opens and then immediately closes. > I'm in safe mode now, but the > same thing happens. > I have XP SP2. Is there a way I can get the benefit of wget? Of course! Don't let anyone fool you! I've

Re: Help with wget

2005-02-11 Thread Jens Rösner
Hey Ted! > If I view the HTML, the links are formatted as if to provide local viewing; > however, when I open the HTML file in my browser, all the images are > red-xed (empty). It all seems OK for me with wget 1.9 beta and Win2K. > Wget is 1.9.1-complete from http://xoomer.virgilio.it/hherold/
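For reference, the combination that usually yields a locally viewable page is (placeholder URL):
    wget -p -k "http://example.com/page.html"
-p fetches the images and stylesheets the page needs, and -k rewrites the links so the local copy points at them.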

Re: timeouts not functioning

2005-01-04 Thread Jens Rösner
Hi Colin! I am just another user and rarely use "timeout". I hope I am not causing more trouble than I'm solving. > I'm attempting to use wget's --timeout flag to limit the time spent > downloading a file. However, wget seems to ignore this setting. For > example, if I set --timeout=2, the downl
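As far as I understand it, --timeout caps how long wget waits for the network to respond (DNS lookup, connect, read), not the total time a transfer may take, so a slow but active download is never cut off. A sketch (placeholder URL):
    wget --timeout=2 --tries=3 "http://example.com/file.iso"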

Re: wget -r is not recursive

2004-09-13 Thread Jens Rösner
Hi Helmut! I suspect there is a robots.txt that says "index, no follow". Try wget -nc -r -l0 -p -np -erobots=off "http://www.vatican.va/archive/DEU0035/_FA.HTM" it works for me. -l0 says: infinite recursion depth -p means page requisites (not really necessary) -erobots=off orders wget to ignore

Re: HELP: Can not load websites with frames

2004-06-11 Thread Jens Rösner
Hi all! François just told me that it works. :) I thought I should add why it does. ;) The original website sits on www.zurich-airport.com; the info frame, however, is loaded from http://www.uniqueairport.com As wget by default only downloads pages from the same server (which makes s
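A sketch of the resulting call (the exact flags depend on the rest of the job):
    wget -r -H -Dzurich-airport.com,uniqueairport.com "http://www.zurich-airport.com/"
-H lets wget span hosts, and -D restricts that spanning to the listed domains.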

Re: HELP: Can not load websites with frames

2004-06-11 Thread Jens Rösner
Hi François! Well, it seems to work for me. Here's how: Open the frame in another window (works in Mozilla easily), then you'll see the URL: http://www.uniqueairport.com/timetable/fplan_landung_imm.asp?ID_site=1&sp=en&le=2&ID_level1=1&ID_level2=2&ID_level3=7&ID_level4=&ID_level5=&d=timetable/fpla

Re: Maybe a bug or something else for wget

2004-05-23 Thread Jens Rösner
Hi Ben! Not a bug as far as I can see. Use -A to accept only certain files. Furthermore, the pdf and ppt files are spread across various servers; you need to allow wget to span hosts other than the original one with -H and then restrict it to certain ones with -D. wget -nc -x -r -l2 -p
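Put together, the call would look roughly like this (the domain list and URL are placeholders):
    wget -r -l2 -H -Dexample.com,files.example.com -A pdf,ppt "http://example.com/lectures.html"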

Re: Site Mirror

2004-05-11 Thread Jens Rösner
Hi Kelvin! I must admit that I am a bit puzzled. > I am trying to mirror a web site that has many > hierarchical levels. > I am using the command > wget -m -k $site > which allows me to view the site fine. > However, I wish the mirror to make a directory > structure that also mimics the web
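For what it's worth, -m should already write the mirror into a host-named directory tree; a sketch (placeholder URL):
    wget -m -k "http://example.com/"
creates example.com/... mirroring the server's hierarchy. If the hostname directory itself is unwanted, -nH drops it.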

Re: skip robots

2004-02-07 Thread Jens Rösner
Hi Hrvoje! > > PS: One note to the manual editor(s?): The -e switch could be > > (briefly?) mentioned also in the "wgetrc commands" paragraph. I > > think it would make sense to mention it there again without > > cluttering the manual too much. Currently it is only mentioned in > > "Basic Startup
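For context, -e executes a wgetrc-style command from the command line (placeholder URL):
    wget -e robots=off "http://example.com/"
which has the same effect as putting robots = off into .wgetrc.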

RE: apt-get via Windows with wget

2004-02-02 Thread Jens Rösner
Hello Heiko! > I added a wget-complete-stable.zip, if you want to link to a fixed url > use > that, I'll update it whenever needed. Currently it is the same archive > as the wget-wget-1.9.1b-complete.zip . Great! Thank you very much, Heiko. I think I'll use it on my wgetgui page as well! :) Bu

Re: apt-get via Windows with wget

2004-02-01 Thread Jens Rösner
Note: Mail redirected from bug to normal wget list. > H> For getting Wget you might want to link directly to > H> ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-complete.zip, > OK, but too bad there's no stable second link .../latest.zip so I > don't have to update my web page to follow t

Re: Spaces in directories/files are converted to '@' symbol.

2004-01-09 Thread Jens Rösner
Hi Tommy! Does this option, first shipped in 1.9.1 (I think), help you: --restrict-file-names=mode It controls file-name escaping. I'll mail the complete extract from the manual to your private mail address. You can download the current wget version from http://www.sunsite.dk/wget/ CU Jens > I
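A sketch of its use (placeholder URL; unix, windows and nocontrol are the modes I know of in 1.9.1):
    wget --restrict-file-names=nocontrol "http://example.com/some%20dir/file.html"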

Re: Maybe a bug?

2003-12-28 Thread Jens Rösner
Hi! Well, the message you got really tells you to have a look at the user agreement. So I did. http://www.quickmba.com/site/agreement/ clearly explains why your download failed, under the "Acceptable Use" section. As long as you have wget identifying itself as wget, you probably will not get any fi
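For completeness, the -U option changes the User-Agent header wget sends (placeholder values):
    wget -U "Mozilla/5.0" "http://example.com/page.html"
Whether a site's agreement permits that is a separate question, as the extract above suggests.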

Re: no escape

2003-12-07 Thread Jens Rösner
Hi Erez! Which version of wget are you using? In 1.9.1 there exists an option called --restrict-file-names=mode which should do what you want, if I understood you correctly. Cya Jens > Hi > > I am trying to mirror a site. > wget keeps escaping the filenames (e.g. I get '%20' instead of a s