RE: wGET - NTLM Support

2005-01-03 Thread Mudliar, Anand
Thanks.

I would appreciate it if you could let me know as soon as the code is
posted on the site.

It would be a great help if you could point me to a site/link where the
old version is posted.

Thanks,
Anand Mudliar
Intel

-----Original Message-----
From: Daniel Stenberg [mailto:[EMAIL PROTECTED]]
Sent: Friday, December 31, 2004 2:38 PM
To: Herold Heiko
Cc: Mudliar, Anand; [EMAIL PROTECTED]; 'Mauro Tortonesi'
Subject: RE: wGET - NTLM Support

On Fri, 31 Dec 2004, Herold Heiko wrote:

> Daniel, could you resend that code to the current co-maintainer Mauro
> Tortonesi [EMAIL PROTECTED]? Maybe sooner or later he finds some
> time for this.

The files I prepared are no longer around (I put them up on a site in
November 2003, and I fixed the FSF assignment stuff before 2003 ended).

I'll try to get some time off to fix up a new version of the files at the
beginning of next year.

-- 
  -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
   ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


Re: Help! wget timing out under linux

2005-01-03 Thread Vesselin Peev
Hi,
I also have a dual-boot configuration with WinXP and Fedora Core 3, both
fully updated. I recently downloaded a several-hundred-megabyte file via
wget under Fedora Core 3, and there were no problems.

-Vesko
----- Original Message -----
From: Paul Leppert [EMAIL PROTECTED]
To: wget@sunsite.dk
Sent: Monday, January 03, 2005 8:07 PM
Subject: Help! wget timing out under linux


Hi all,
I am trying to download files with wget under Linux (a new install of
Fedora Core 3) and it keeps timing out.

I'm trying to figure out whether this is a wget problem or a connection
problem (I see this problem across servers, so I don't believe it is a
server problem). The machine I am using dual-boots between Linux and
WinXP. If I run wget under WinXP, it works fine. My machine is
connected to a Netgear firewall (which acts as a DHCP server) and from
there to an internet gateway. So, given that wget works from WinXP and
not from Linux (on the same machine), I don't think it is a problem with
my connection or firewall (although it could be a problem with the Linux
configuration).

I have also tried uninstalling and reinstalling wget, with the same
results (so I don't think it is a bad install). I also have no problems
downloading large (multi-MB) files using the Firefox browser as well as
apt-get (updates, upgrades, and installs).
I am running version 1.9.1:
[EMAIL PROTECTED] mythtv]# rpm -qa | grep wget
wget-1.9.1-16.fc2
Below is a snippet of the output from a sample session (stopped after
the retry). Are there additional settings or diagnostics I can use to
determine what exactly is causing the timeout? My timeout setting for
this run was 30 seconds, but I get the same result with the default 15
minutes; once it stops downloading, it just sits until the timeout
expires. Also, my .wgetrc file only has settings for tries = 5,
timeout = 30, and debug = on (possibly also verbose = on, but I'm not
sure; I don't have the .wgetrc file in front of me), which I set up to
debug the problem (originally I didn't have a .wgetrc and still had the
problem).
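From memory, the file would look roughly like this (a sketch only, since
as noted I don't have it in front of me; the exact lines may differ):

# ~/.wgetrc -- reconstructed from memory
tries = 5
timeout = 30
debug = on
# possibly also:
verbose = on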
Thanks,
phlepper
[EMAIL PROTECTED] ~]$ wget
http://pvrguide.no-ip.com/files/0.16/mythtv-0.16.tar.bz2
DEBUG output created by Wget 1.9+cvs-stable (Red Hat modified) on linux-gnu.

--21:28:57--  http://pvrguide.no-ip.com/files/0.16/mythtv-0.16.tar.bz2
          => `mythtv-0.16.tar.bz2'
Resolving pvrguide.no-ip.com... 82.44.146.198
Caching pvrguide.no-ip.com => 82.44.146.198
Connecting to pvrguide.no-ip.com[82.44.146.198]:80... connected.
Created socket 3.
Releasing 0x95ca640 (new refcount 1).
---request begin---
GET /files/0.16/mythtv-0.16.tar.bz2 HTTP/1.0
User-Agent: Wget/1.9+cvs-stable (Red Hat modified)
Host: pvrguide.no-ip.com
Accept: */*
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response... HTTP/1.1 200 OK
Date: Mon, 03 Jan 2005 03:24:25 GMT
Server: Apache/2.0.49 (Gentoo/Linux) mod_ssl/2.0.49 OpenSSL/0.9.7d PHP/4.3.9
Last-Modified: Fri, 10 Sep 2004 19:03:32 GMT
ETag: 768010-cdf535-d0a900
Accept-Ranges: bytes
Content-Length: 13497653
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: application/x-tar

Found pvrguide.no-ip.com in host_name_addresses_map (0x95ca640)
Registered fd 3 for persistent reuse.
Length: 13,497,653 [application/x-tar]
 0% [                                        ] 87,204         8.52K/s    ETA 22:30
Closing fd 3
Releasing 0x95ca640 (new refcount 1).
Invalidating fd 3 from further reuse.
21:29:37 (9.73 KB/s) - Read error at byte 87,204/87,204 (Connection
timed out). Retrying.
--
I hear and I forget. I see and I remember. I do and I understand.
-- Confucius


RE: wGET - NTLM Support

2005-01-03 Thread Daniel Stenberg
On Mon, 3 Jan 2005, Mudliar, Anand wrote:
> I would appreciate it if you could let me know as soon as the code is
> posted on the site.
>
> It would be a great help if you could point me to a site/link where
> the old version is posted.
Let me just point out, loud and clear, that my files were not complete
patches that introduced NTLM support to wget. They were files ripped out
of the curl source tree that I re-licensed and handed to the wget project,
to allow someone with insight to adjust them to be usable. I'm not the
person to say how much time or effort this requires, or even whether
anyone wants to do it.

The old version is not available anymore, so posting the old URL is not
going to help anyone.

If you want to get a grasp of what the code looks like in its original shape, 
check the lib/http_ntlm.[ch] files in curl's source repository.

--
 -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
  ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


Re: new string module

2005-01-03 Thread Mauro Tortonesi
At 22:09 on Sunday, 2 January 2005, Jan Minar wrote:
> On Sun, Jan 02, 2005 at 01:37:36AM +0100, Mauro Tortonesi wrote:
>> I have just committed the new string.c module, which includes a
>> mechanism to fix the bug reported by Noël Köthe:
>>
>> http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=271931
>
> #271931 is:
>   From: Ambrose Li [EMAIL PROTECTED]
>   Subject: Weird escaping makes wget verbose output completely
>    unreadable in non-English locales
>   Message-ID: [EMAIL PROTECTED]
>
> Perhaps you meant [0] bug #261755?
>
> [0] http://bugs.debian.org/261755

Yes, both of them.

>> The code was inspired by Felix von Leitner's libowfat and by Jan
>> Minar's bug-fixing patch.
>>
>> Unfortunately, I haven't fixed the bug yet, since I don't like Jan
>> Minar's approach (changing logprintf in a not-so-portable way to
>> encode every string passed to the function as an argument) because of
>> its inefficiency.
>
> That was a hotfix. You know, the thing you do in order not to have a
> security hole Right Now.

OK, then I don't think you should be offended just because I said that I
think your patch is too inefficient to be merged into the wget CVS
repository, especially after you posted a bug report on Bugtraq (which
was more a personal attack than a professional bug report) saying that
the wget authors are all incompetent...

>> As Fumitoshi UKAI suggested, the best choice would be to escape only
>> the strings that need to be escaped. So I think we should probably
>> check together which strings passed to logprintf in the wget code
>> need to be escaped. Anyone willing to help?
>
> You don't want to check whether this or that string accidentally needs
> to be escaped or not. The right way is to sanitize *all* untrusted
> input before you even start thinking about using it.

Mmmh, I don't think so. Why would you, for example, want or need to
escape format strings (which are retrieved via gettext and are already in
your local charset), the URLs to download, or the configuration data read
from wgetrc?

Anyway, Simone Piunno and I have been talking a lot about this problem,
and we've found that, apart from a couple of minor problems (very easy to
fix), the current implementation of escape_buffer works fine. The real
problem is when you pass escaped multibyte strings as arguments to
printf: if these strings contain a 0x00 byte, printf will incorrectly
interpret it as a string termination character. Simone points out, for
example, that UTF-16 strings can contain null bytes.
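To illustrate the problem, here is a minimal, self-contained C sketch
(not wget code; the buffer is a contrived little-endian UTF-16 example):

#include <stdio.h>

int main(void)
{
    /* "AB" encoded as little-endian UTF-16: each ASCII code unit is
       followed by a 0x00 byte. */
    const char buf[] = { 'A', 0x00, 'B', 0x00 };
    unsigned long len = sizeof buf;   /* 4 bytes of real data */

    /* printf's %s stops at the first NUL, so only "A" comes out and
       the rest of the (already escaped) string is silently lost. */
    printf("via %%s: \"%s\" (expected %lu bytes)\n", buf, len);

    /* a length-based call such as fwrite does not care about embedded
       NULs, so all 4 bytes reach the output stream. */
    fwrite(buf, 1, len, stdout);
    putchar('\n');
    return 0;
}

which suggests that any real fix would have to carry explicit lengths
through the logging path instead of relying on NUL-terminated strings.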

I don't really have any clue how to solve this. Simone suggests changing
the internal format of strings in wget to UTF-8, but of course I would
prefer a less invasive solution if possible... I don't even know whether
we could keep using gettext in that case.

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi

University of Ferrara - Dept. of Eng.     http://www.ing.unife.it
Institute of Human & Machine Cognition    http://www.ihmc.us
Deep Space 6 - IPv6 for Linux             http://www.deepspace6.net
Ferrara Linux User Group                  http://www.ferrara.linux.it

