RE: --convert-links vs. non-recursion

2005-03-02 Thread Belov, Charles
Original message corrected as [EMAIL PROTECTED] munged my URLs.

Please read [AT] as an at sign.

Hi -

I'm invoking wget 1.9.1 with the following options, among others:

--input-file _filename_
--restrict-file-names='windows'
--directory-prefix=/www/htdocs/newfolder1/newfolder2
--convert-links
--html-extension

but not recursion. 
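
For reference, the whole invocation looks roughly like this (urls.txt is just a
stand-in name for my input file, not the real one):

wget --input-file=urls.txt \
     --restrict-file-names=windows \
     --directory-prefix=/www/htdocs/newfolder1/newfolder2 \
     --convert-links \
     --html-extension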

The reason I'm using an input file instead of recursion is this documentation 
about --accept _acclist_:

"Note that these two options do not affect the downloading of HTML files; Wget 
must load all the HTMLs to know where to go at all--recursive retrieval would 
make no sense otherwise."

Well, not quite.  I want to retrieve all pages named
http://my.source.site/oldfolder/abc_pages.asp?id=n and
http://my.source.site/oldfolder/def_pages.asp?id=n and
http://my.source.site/oldfolder/ghi_pages.asp?id=n
but not pages named
http://my.source.site/oldfolder/jkl_pages.asp?id=n or
http://my.source.site/oldfolder/mno_pages.asp?id=n or
http://my.source.site/oldfolder/pqr_pages.asp?id=n . 

(Where n is the 5-digit number corresponding to the actual content.)

That is, they are all in the same directory, with different whatever.asp names. 
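
So the input file is just a flat list of the wanted pages, one URL per line,
along these lines (the id numbers here are made up for illustration):

http://my.source.site/oldfolder/abc_pages.asp?id=10001
http://my.source.site/oldfolder/def_pages.asp?id=10002
http://my.source.site/oldfolder/ghi_pages.asp?id=10003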

What's happening is that the pages in my input list are correctly getting 
copied to 

http://my.target.site/newfolder1/newfolder2/abc_pages[AT]id=n.html, etc.

but the links in the pages are untranslated from their original

/oldfolder/def_pages?id=n

instead of being translated to a working

def_pages[AT]id=n.html
or   
/newfolder1/newfolder2/def_pages[AT]id=n.html

and links to unwanted pages such as
/oldfolder/jkl_pages.asp?id=n 
are not being translated to
http://my.source.site/oldfolder/jkl_pages.asp?id=n
in the files on my new site. 

I'm guessing that --convert-links will only work with recursion, and that's why 
the links also aren't being fixed for the .html extension or the Windows file 
names.  Is there a way to get some of the HTML files and not others when they 
are in the same directory, but still get the links fully translated? Or will I 
need to post-edit my new files outside of wget to fix the links?
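
If post-editing is what it takes, this is the sort of thing I have in mind --
only a rough sketch, and it assumes the links in the saved pages really use the
/oldfolder/whatever.asp?id=n form and that the saved files are named
whatever[AT]id=n.html as above (the at sign is written literally in the
commands below):

cd /www/htdocs/newfolder1/newfolder2
for f in *.html; do
  # point links to wanted pages at the local .html copies, and point
  # links to unwanted pages back at the source site
  sed -e 's,/oldfolder/abc_pages\.asp?id=\([0-9]*\),abc_pages@id=\1.html,g' \
      -e 's,/oldfolder/def_pages\.asp?id=\([0-9]*\),def_pages@id=\1.html,g' \
      -e 's,/oldfolder/ghi_pages\.asp?id=\([0-9]*\),ghi_pages@id=\1.html,g' \
      -e 's,/oldfolder/jkl_pages\.asp,http://my.source.site/oldfolder/jkl_pages.asp,g' \
      -e 's,/oldfolder/mno_pages\.asp,http://my.source.site/oldfolder/mno_pages.asp,g' \
      -e 's,/oldfolder/pqr_pages\.asp,http://my.source.site/oldfolder/pqr_pages.asp,g' \
      "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done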

Note: The target site is on a Un*x box, but I have to be able to 
upload/download from a PC.

Thanks in advance,
Charles "Chas" Belov


Thanks RE: Makefile hassles

2005-03-02 Thread Belov, Charles
Up and running, thanks.

-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, March 02, 2005 12:34 PM
To: Belov, Charles
Cc: wget@sunsite.dk
Subject: Re: Makefile hassles


"Belov, Charles" <[EMAIL PROTECTED]> writes:

> I would like to use wget 1.9.1 instead of the wget 1.8.x which is
> installed on our server. I downloaded 1.9.1 from the Gnu ftp site,
> and issued the command:
>
> make -f Makefile.in wget191

You're not supposed to use Makefile.in directly.  Run `./configure'
and it should create Makefiles for you, after which you should be able
to just run `make' to get a usable Wget.


Re: Makefile hassles

2005-03-02 Thread Hrvoje Niksic
"Belov, Charles" <[EMAIL PROTECTED]> writes:

> I would like to use wget 1.9.1 instead of the wget 1.8.x which is
> installed on our server. I downloaded 1.9.1 from the Gnu ftp site,
> and issued the command:
>
> make -f Makefile.in wget191

You're not supposed to use Makefile.in directly.  Run `./configure'
and it should create Makefiles for you, after which you should be able
to just run `make' to get a usable Wget.
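
That is, something like the following from the unpacked source directory (the
--prefix value is only an example):

cd wget-1.9.1
./configure --prefix=/usr/local
make
make install    # or run the freshly built src/wget in place without installing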


Re: Large file support

2005-03-02 Thread Hrvoje Niksic
Noèl Köthe <[EMAIL PROTECTED]> writes:

> On Wednesday, 23.02.2005, at 23:13 +0100, Hrvoje Niksic wrote:
>
>> The most requested feature of the last several years finally arrives
>> -- large file support.  With this patch Wget should be able to
>> download files larger than 2GB on systems that support them.
>
> Don't know if it's still interesting, but there were no problems when
> building wget from cvs on different Linux architectures:
>
>   http://experimental.ftbfs.de/build.php?arch=&pkg=wget-cvs

It is still interesting, thanks.

Note, however, that all of the above architectures seem to run Debian
GNU/Linux, which, being my current development platform, is pretty
much guaranteed to build.  That is, unless I did something stupid that
would prevent compilation under a different CPU.

> Thanks for including LFS. :)

It's been fun.  And long overdue.  :-)


Re: Large file support

2005-03-02 Thread Noèl Köthe
On Wednesday, 23.02.2005, at 23:13 +0100, Hrvoje Niksic wrote:

Hello,

> The most requested feature of the last several years finally arrives
> -- large file support.  With this patch Wget should be able to
> download files larger than 2GB on systems that support them.

Don't know if it's still interesting, but there were no problems when
building wget from cvs on different Linux architectures:

http://experimental.ftbfs.de/build.php?arch=&pkg=wget-cvs

Thanks for including LFS. :)

-- 
Noèl Köthe
Debian GNU/Linux, www.debian.org


signature.asc
Description: This is a digitally signed message part


Re: Makefile hassles

2005-03-02 Thread Doug Kaufman
On Wed, 2 Mar 2005, Belov, Charles wrote:

> I would like to use wget 1.9.1 instead of the wget 1.8.x which is
> installed on our server. I downloaded 1.9.1 from the Gnu ftp site, and
> issued the command:
> 
> make -f Makefile.in wget191

You forgot to configure. See the file "install" for instructions.
Doug

-- 
Doug Kaufman
Internet: [EMAIL PROTECTED]



Makefile hassles

2005-03-02 Thread Belov, Charles
Hi -
I would like to use wget 1.9.1 instead of the wget 1.8.x which is installed on 
our server. I downloaded 1.9.1 from the Gnu ftp site, and issued the command:

make -f Makefile.in wget191

I got the response 

"/usr/local/apache/work/wget-1.9.1/Makefile.in", line 33: Need an operator
make: fatal errors encountered -- cannot continue

with the following lines in Makefile.in:

31
32  SHELL = /bin/sh
33  @SET_MAKE@
34
35  top_builddir = .

Any ideas?

On FreeBSD 4.7.

Charles "Chas" Belov


RE: Large file problem

2005-03-02 Thread Herold Heiko
> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
> Gisle Vanem <[EMAIL PROTECTED]> writes:
> 
> > It doesn't seem the patches to support >2GB files works on
> > Windows. Wget hangs indefinitely at the end of transfer.  E.g.
> [...]
> 
> I seem to be unable to repeat this.

Same here; I couldn't reproduce it. I successfully transferred several 3GB
files from Linux and Solaris/SPARC servers to NT4 by FTP and HTTP (on a LAN
though, with no restarts).
No help here, sorry.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax