Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-21 Thread Mark Hindley
On Mon, Mar 21, 2011 at 08:54:41AM +0100, Eus wrote:
 Hi!
 
 On Sun, 2011-03-20 at 18:22 +0100, Eus wrote:
 
Suppose CURLOPT_FOLLOWLOCATION was set when the wireless connection
had been disconnected; would apt-cacher then have cached the login page
of the hotspot provider?
   
   Yes, I suspect so, but presumably with a 200 status and that would have 
   been returned to the apt-get/aptitude client which would not be happy!
 
 In my setting, it turns out that using only the FOLLOWLOCATION patch
 prevents the delivery of the <HTML><BODY><H2>Browser error!</H2>Browser
 does not support redirects!</BODY> line after the 302 HTTP header to
 apt-get. This makes apt-get move along happily, ignoring
 Translation-en.bz2 and printing "Ign ..." on the screen, _although_
 apt-cacher sends the login page to apt-get (apt-cacher has the login
 page in its packages directory under the name *Translation-en.bz2).
 Your second patch then acts as a garbage collector by removing the
 login page from the packages directory and returning 404 to apt-get,
 keeping apt-get moving along happily printing "Ign ...". The details
 follow:

Thanks.

I will queue this for the next upload which should be fairly shortly.

Mark



-- 
To UNSUBSCRIBE, email to debian-bugs-dist-requ...@lists.debian.org
with a subject of "unsubscribe". Trouble? Contact listmas...@lists.debian.org



Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Tadeus Prastowo
Package: apt-cacher
Version: 1.6.12ubuntu1
Severity: important

apt-get update returns "Bad header line". This problem also happens when 
using aptitude update and aptitude install, although apt-get install works 
fine.

Instrumenting the source code of apt-get: methods/http.cc as follows:
--- http.cc.orig	2011-03-20 11:28:12.347525680 +0100
+++ http.cc	2011-03-20 09:17:23.409707029 +0100
@@ -535,8 +535,11 @@ bool ServerState::HeaderLine(string Line
 {
    // Blah, some servers use connection:closes, evil.
    Pos = Line.find(':');
-   if (Pos == string::npos || Pos + 2 > Line.length())
-      return _error->Error(_("Bad header line"));
+   if (Pos == string::npos || Pos + 2 > Line.length()) {
+      if (Debug == true)
+         clog << "[Bad header line] The offending HTTP header line is " << Line << std::endl;
+      return _error->Error(_("Bad header line"));
+   }
    Pos++;
 }
and running apt-get as follows:
bin/apt-get -o Debug::Acquire::http=true -o Dir::Bin::methods=/tmp/apt-0.8.3ubuntu7/bin/methods -s update
I got the following error message, which I cut after the first error to keep
this report succinct:
--- 8< ---
NOTE: This is only a simulation!
  apt-get needs root privileges for real execution.
  Keep also in mind that locking is deactivated,
  so don't depend on the relevance to the real current situation!
GET /de.archive.ubuntu.com/ubuntu/dists/maverick/Release.gpg HTTP/1.1
Host: localhost:3142
Connection: keep-alive
Cache-Control: max-age=0
Range: bytes=197-
If-Range: Sun, 10 Oct 2010 10:18:55 GMT
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET /de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en_US.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/multiverse/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/multiverse/i18n/Translation-en_US.bz2
 HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/restricted/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/restricted/i18n/Translation-en_US.bz2
 HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/universe/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/universe/i18n/Translation-en_US.bz2
 HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET /de.archive.ubuntu.com/ubuntu/dists/maverick-updates/Release.gpg HTTP/1.1
Host: localhost:3142
Connection: keep-alive
Cache-Control: max-age=0
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


HTTP/1.0 200 OK
Connection: Keep-Alive
Accept-Ranges: bytes
Age: 785
ETag: "86081-c6-49240922fa1c0"
Content-Length: 198
Content-Type: text/plain
Last-Modified: Sun, 10 Oct 2010 10:18:55 GMT

GET /de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en_US.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/multiverse/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/multiverse/i18n/Translation-en_US.bz2
 HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/restricted/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/restricted/i18n/Translation-en_US.bz2
 HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/universe/i18n/Translation-en.bz2 
HTTP/1.1
Host: localhost:3142
Connection: keep-alive
User-Agent: Debian APT-HTTP/1.3 (0.8.3ubuntu7)


GET 
/de.archive.ubuntu.com/ubuntu/dists/maverick/universe/i18n/Translation-en_US.bz2
 HTTP/1.1
Host: 

Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Eus
Some clarifications:

On Sun, 2011-03-20 at 11:44 +0100, Tadeus Prastowo wrote:

 apt-get update returns "Bad header line". This problem also happens
 when using aptitude update and aptitude install, although apt-get
 install works fine.

Actually apt-get install does not work fine. It gives the following
message:
--- 8< ---
WARNING: The following packages cannot be authenticated!
  python-reportbug reportbug
Install these packages without verification [y/N]?
--- 8< ---

And aptitude install gives a similar message:
--- 8< ---
WARNING: untrusted versions of the following packages will be installed!

Untrusted packages could compromise your system's security.
You should only proceed with the installation if you are certain that
this is what you want to do.

  bash-doc 

Do you want to ignore this warning and proceed anyway?
To continue, enter Yes; to abort, enter No:
--- 8< ---

-- 
Best regards,
Eus (FSF member #4445)

In this digital era, where computing technology is pervasive, your
freedom depends on the software controlling those computing devices.

Join free software movement today! It is free as in freedom, not as in
free beer!

Join: http://www.fsf.org/







Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Mark Hindley
On Sun, Mar 20, 2011 at 11:44:16AM +0100, Tadeus Prastowo wrote:
 Package: apt-cacher
 Version: 1.6.12ubuntu1
 Severity: important
 
 apt-get update returns "Bad header line". This problem also happens when 
 using aptitude update and aptitude install, although apt-get install 
 works fine.

Thanks.

If you apply this patch to /usr/share/apt-cacher/apt-cacher, does it 
help?

Mark


commit 078bea72c89edf4287e92e46510aaa6211daf56f
Author: Mark Hindley m...@hindley.org.uk
Date:   Sun Mar 20 11:05:05 2011 +

Possible fix for #618996

diff --git a/apt-cacher b/apt-cacher
index 5179ef9..8d83af9 100755
--- a/apt-cacher2
+++ b/apt-cacher2
@@ -1274,6 +1274,7 @@ sub init_curl {
     $curl->setopt(CURLOPT_LOW_SPEED_TIME, $cfg->{fetch_timeout});
     $curl->setopt(CURLOPT_INTERFACE, $cfg->{interface}) if defined $cfg->{interface};
     $curl->setopt(CURLOPT_NOSIGNAL, 1);
+    $curl->setopt(CURLOPT_FOLLOWLOCATION, 1);
 
     # Callbacks
     $curl->setopt(CURLOPT_DEBUGFUNCTION, \&debug_callback);






Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Mark Hindley
On Sun, Mar 20, 2011 at 12:15:43PM +0100, Eus wrote:
 Some clarifications:
 
 On Sun, 2011-03-20 at 11:44 +0100, Tadeus Prastowo wrote:
 
  apt-get update returns "Bad header line". This problem also happens
  when using aptitude update and aptitude install, although apt-get
  install works fine.
 
 Actually apt-get install does not work fine. It gives the following
 message:
 --- 8< ---
 WARNING: The following packages cannot be authenticated!
   python-reportbug reportbug
 Install these packages without verification [y/N]?
 --- 8< ---
 
 And aptitude install gives similar message:
 --- 8< ---
 WARNING: untrusted versions of the following packages will be installed!

I think this is just a side effect of the previous apt-get update failing.

Mark






Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Eus
Hi!

On Sun, 2011-03-20 at 11:07 +, Mark Hindley wrote:

 Thanks.

It's my pleasure.

 If you apply this patch to /usr/share/apt-cacher/apt-cacher, does it 
 help?

No, it doesn't. I applied the patch to the mentioned file by changing
{a,b}/apt-cacher2 to {a,b}/apt-cacher; otherwise, the patch fails
since apt-cacher2 does not exist. Then, I restarted apt-cacher with
/etc/init.d/apt-cacher restart before firing bin/apt-get update.
The error message remains the same as that given in the first post.

 Mark










Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Mark Hindley
On Sun, Mar 20, 2011 at 02:09:33PM +0100, Eus wrote:
 Hi!
 
 On Sun, 2011-03-20 at 11:07 +, Mark Hindley wrote:
 
  Thanks.
 
 It's my pleasure.
 
  If you apply this patch to /usr/share/apt-cacher/apt-cacher, does it 
  help?
 
 No, it doesn't. I applied the patch to the mentioned file by changing
 {a,b}/apt-cacher2 to {a,b}/apt-cacher. Otherwise, the patch fails
 since apt-cacher2 does not exist. Then, I restarted apt-cacher by
 /etc/init.d/apt-cacher restart before firing bin/apt-get update 

OK, it is *possible* that that wasn't sufficient to restart the curl 
process with the patch. 

Having applied the patch, could you do /etc/init.d/apt-cacher stop and 
then ensure there isn't an apt-cacher [libcurl] thread still running (ps 
-lfC apt-cacher ought to show it)? If there is, kill that pid (or wait 5 
minutes for it to exit itself) and then restart.

Mark






Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Eus
Hi!

On Sun, 2011-03-20 at 14:53 +, Mark Hindley wrote:

 Having applied the patch could you do /etc/init.d/apt-cacher stop and 
 then ensure there isn't an apt-cacher [libcurl] thread still running (ps 
 -lfC apt-cacher ought to show it). If there is, kill that pid (or wait 5 
 minutes for it to exit itself) and then restart.

It still does not work. However, I think I've discovered the culprit.

After reading the apt-cacher source code, I sent SIGUSR1 to the daemon and
saw the following snippet in apt-cacher/error.log for the first "Bad
header line":
--- 8< ---
Sun Mar 20 15:48:27 2011|debug [13565]: Sending libcurl
http://de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2
Sun Mar 20 15:48:27 2011|debug [13565]: Entering critical section :
connect libcurl
Sun Mar 20 15:48:27 2011|debug [13565]: Connection to running libcurl
process found on /var/cache/apt-cacher/libcurl.socket
Sun Mar 20 15:48:27 2011|debug [13565]: Exiting critical section
Sun Mar 20 15:48:27 2011|debug [13566]: libcurl: connection from
IO::Socket::UNIX=GLOB(0x97da768)
Sun Mar 20 15:48:27 2011|debug [13566]: Libcurl: thawed request
http://de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2,
 1, 
Sun Mar 20 15:48:27 2011|debug [13566]: Init new libcurl object
Sun Mar 20 15:48:27 2011|debug [13566]: Add curl handle #2: for
http://de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2
Sun Mar 20 15:48:27 2011|debug [13566]: libcurl: setting up for HEAD
request
Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: About to connect()
to de.archive.ubuntu.com port 80 (#0)
Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]:   Trying
141.30.3.82... 
Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: Connected to
de.archive.ubuntu.com (141.30.3.82) port 80 (#0)
Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: HTTP 1.0, assume
close after body
Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: Expire cleared
Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: Closing connection
#0
Sun Mar 20 15:48:27 2011|debug [13566]: curl handle #2 completed,
status: 0
Sun Mar 20 15:48:27 2011|debug [13566]: libcurl active transfers: 0
Sun Mar 20 15:48:27 2011|debug [13565]: libcurl reading of headers
complete
Sun Mar 20 15:48:27 2011|debug [13565]: Found EOF marker and status
FrT;@2|$1|0$0|
Sun Mar 20 15:48:27 2011|debug [13565]: HEAD request error: 404 Not
Found
 Reusing existing file
Sun Mar 20 15:48:27 2011|debug [13565]: Entering critical section : file
download decision
Sun Mar 20 15:48:27 2011|debug [13565]: Exiting critical section
Sun Mar 20 15:48:27 2011|debug [13565]: checks done, can return now
Sun Mar 20 15:48:27 2011|debug [13565]: Entering critical section :
reading the header file
Sun Mar 20 15:48:27 2011|debug [13565]: Exiting critical section
Sun Mar 20 15:48:27 2011|debug [13565]: Header sent: HTTP/1.1 302 Moved
Temporarily
Connection: Keep-Alive
Content-Length: 0
Sun Mar 20 15:48:27 2011|debug [13565]: ready to send contents
of 
/var/cache/apt-cacher/packages/de.archive.ubuntu.com_ubuntu_dists_maverick_main_i18n_Translation-en.bz2
Sun Mar 20 15:48:27 2011|debug [13565]: read 924 bytes
Sun Mar 20 15:48:27 2011|debug [13565]: wrote 924 (sum: 924) bytes
Sun Mar 20 15:48:27 2011|debug [13565]: read 0 bytes
Sun Mar 20 15:48:27 2011|debug [13565]: fetcher released lock
Sun Mar 20 15:48:27 2011|debug [13565]: read 0 bytes
Sun Mar 20 15:48:27 2011|debug [13565]: Package sent
--- 8< ---

Specifically, the file to be downloaded does not exist, but instead of
passing 404 to the client, apt-cacher returns 302 by, I think, reading
an invalid cached header. I am wondering how the invalid header can
exist in the first place if the file never existed on the server.

 Mark








Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Mark Hindley
On Sun, Mar 20, 2011 at 04:22:43PM +0100, Eus wrote:
 Hi!
 
 On Sun, 2011-03-20 at 14:53 +, Mark Hindley wrote:
 
  Having applied the patch could you do /etc/init.d/apt-cacher stop and 
  then ensure there isn't an apt-cacher [libcurl] thread still running (ps 
  -lfC apt-cacher ought to show it). If there is, kill that pid (or wait 5 
  minutes for it to exit itself) and then restart.
 
 It still does not work. However, I think I've discovered the culprit.
 After reading the apt-cacher source code, I sent SIGUSR1 to the daemon
 and saw the following snippet in apt-cacher/error.log for the first
 "Bad header line":
 --- 8< ---
 Sun Mar 20 15:48:27 2011|debug [13565]: Sending libcurl
 http://de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2
 Sun Mar 20 15:48:27 2011|debug [13565]: Entering critical section :
 connect libcurl
 Sun Mar 20 15:48:27 2011|debug [13565]: Connection to running libcurl
 process found on /var/cache/apt-cacher/libcurl.socket
 Sun Mar 20 15:48:27 2011|debug [13565]: Exiting critical section
 Sun Mar 20 15:48:27 2011|debug [13566]: libcurl: connection from
 IO::Socket::UNIX=GLOB(0x97da768)
 Sun Mar 20 15:48:27 2011|debug [13566]: Libcurl: thawed request
 http://de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2,
  1, 
 Sun Mar 20 15:48:27 2011|debug [13566]: Init new libcurl object
 Sun Mar 20 15:48:27 2011|debug [13566]: Add curl handle #2: for
 http://de.archive.ubuntu.com/ubuntu/dists/maverick/main/i18n/Translation-en.bz2
 Sun Mar 20 15:48:27 2011|debug [13566]: libcurl: setting up for HEAD
 request
 Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: About to connect()
 to de.archive.ubuntu.com port 80 (#0)
 Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]:   Trying
 141.30.3.82... 
 Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: Connected to
 de.archive.ubuntu.com (141.30.3.82) port 80 (#0)
 Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: HTTP 1.0, assume
 close after body
 Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: Expire cleared
 Sun Mar 20 15:48:27 2011|debug CURLINFO_TEXT [13566]: Closing connection
 #0
 Sun Mar 20 15:48:27 2011|debug [13566]: curl handle #2 completed,
 status: 0
 Sun Mar 20 15:48:27 2011|debug [13566]: libcurl active transfers: 0
 Sun Mar 20 15:48:27 2011|debug [13565]: libcurl reading of headers
 complete
 Sun Mar 20 15:48:27 2011|debug [13565]: Found EOF marker and status
 FrT;@2|$1|0$0|
 Sun Mar 20 15:48:27 2011|debug [13565]: HEAD request error: 404 Not
 Found
 Reusing existing file
 Sun Mar 20 15:48:27 2011|debug [13565]: Entering critical section : file
 download decision
 Sun Mar 20 15:48:27 2011|debug [13565]: Exiting critical section
 Sun Mar 20 15:48:27 2011|debug [13565]: checks done, can return now
 Sun Mar 20 15:48:27 2011|debug [13565]: Entering critical section :
 reading the header file
 Sun Mar 20 15:48:27 2011|debug [13565]: Exiting critical section
 Sun Mar 20 15:48:27 2011|debug [13565]: Header sent: HTTP/1.1 302 Moved
 Temporarily
 Connection: Keep-Alive
 Content-Length: 0
 Sun Mar 20 15:48:27 2011|debug [13565]: ready to send contents
 of 
 /var/cache/apt-cacher/packages/de.archive.ubuntu.com_ubuntu_dists_maverick_main_i18n_Translation-en.bz2
 Sun Mar 20 15:48:27 2011|debug [13565]: read 924 bytes
 Sun Mar 20 15:48:27 2011|debug [13565]: wrote 924 (sum: 924) bytes
 Sun Mar 20 15:48:27 2011|debug [13565]: read 0 bytes
 Sun Mar 20 15:48:27 2011|debug [13565]: fetcher released lock
 Sun Mar 20 15:48:27 2011|debug [13565]: read 0 bytes
 Sun Mar 20 15:48:27 2011|debug [13565]: Package sent
 --- 8< ---
 
 Specifically, the file to be downloaded does not exist, but instead of
 passing 404 to the client, apt-cacher returns 302 by, I think, reading
 an invalid cached header. I am wondering how the invalid header can
 exist in the first place if the file never existed on the server.

I suspect it appeared because CURLOPT_FOLLOWLOCATION was not set.

So, can you delete 
/var/cache/apt-cacher/headers/archive.ubuntu.com_ubuntu_dists_maverick_main_i18n_Translation-en.bz2

and try again (with the patch)?

Mark






Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Eus
Hi!

On Sun, 2011-03-20 at 15:50 +, Mark Hindley wrote:

  Specifically, the file to be downloaded does not exist but instead of
  passing 404 to the client, apt-cacher returns 302 by, I think, reading
  an invalid cached header. I am wondering how can the invalid header
  exists in the very first place if the file never exists in the server.
 
 I suspect it appeared as the CURLOPT_FOLLOWLOCATION was not set.

It turns out that it is not about the redirection. The real problem is
as follows:

My machine is connected to the Internet via a controlled hotspot that
disconnects a wireless connection after some hours of being connected
and requires the user to re-login.

I think apt-get update's behavior is to probe several non-existent
files like Translation-en.bz2. If the wireless connection had been
active, apt-cacher would have contacted the server and returned 404.
This would have let apt-get continue probing happily. Unfortunately,
the wireless connection was not active.

So, apt-cacher sent a request for the non-existent file and got 302
from the hotspot's Coova (an OpenWRT derivative) in an attempt to tell
apt-cacher to go to the login page of the hotspot provider. This 302
header from Coova was then cached by apt-cacher and returned to
apt-get, causing "Bad header line".

Once I saw "Bad header line", I realized that the wireless connection
had been disconnected. I logged in again and restarted the apt-get
operation. Then, during the normal apt-get probe, the server returned
404 to apt-cacher and, because apt-cacher had a cached header file
from Coova, apt-cacher returned that header (containing 302) to
apt-get, preventing apt-get from happily continuing to probe.

A quick solution is to remove all 302 headers returned by Coova from
apt-cacher's cache.

But, if possible, I would like a long-term solution as well.

What semantics does 404 have in apt-cacher? Do the semantics allow for
not using the cached header when apt-cacher gets 404, as in the above
scenario where apt-get update probes for non-existent files?

Thanks.

 Mark








Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Eus
Hi!

On Sun, 2011-03-20 at 17:01 +0100, Eus wrote:

 On Sun, 2011-03-20 at 15:50 +, Mark Hindley wrote:
  
  I suspect it appeared as the CURLOPT_FOLLOWLOCATION was not set.

 But, if it is possible, I want to have a long-term solution as well.
 
 What semantic does 404 have in apt-cacher? Does the semantic allow for
 not using cached header in case apt-cacher gets 404 referring to the
 above scenario where apt-get update probes for non-existent files?

Suppose CURLOPT_FOLLOWLOCATION was set when the wireless connection
had been disconnected; would apt-cacher then have cached the login page
of the hotspot provider?

If that is the case, will apt-cacher return the cached login page when
it gets 404 during apt-get update's normal probe for non-existent
files?

Thanks.

  Mark








Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Mark Hindley
On Sun, Mar 20, 2011 at 05:09:53PM +0100, Eus wrote:
 Hi!
 
 On Sun, 2011-03-20 at 17:01 +0100, Eus wrote:
 
  On Sun, 2011-03-20 at 15:50 +, Mark Hindley wrote:
   
   I suspect it appeared as the CURLOPT_FOLLOWLOCATION was not set.
 
  But, if it is possible, I want to have a long-term solution as well.
  
  What semantic does 404 have in apt-cacher? Does the semantic allow for
  not using cached header in case apt-cacher gets 404 referring to the
  above scenario where apt-get update probes for non-existent files?
 
 Suppose CURLOPT_FOLLOWLOCATION was set when the wireless connection
 had been disconnected; would apt-cacher then have cached the login
 page of the hotspot provider?

Yes, I suspect so, but presumably with a 200 status and that would have 
been returned to the apt-get/aptitude client which would not be happy!

 If that is the case, will apt-cacher return the cached login page when
 it gets 404 during apt-get update's normal probe for non-existent
 files?

Again, I think so. apt-cacher would go into its 'offline' mode and do 
the best it can with cached files.

I can't think of a way for me to test either of those scenarios without 
your setup. Could you do it and check the behaviour is as expected?

Could you try this patch as well, which should check that the cached 
status is still valid.

Mark


commit 0f4dd6f8f220767ce9323c988b7b8f744e405e3a
Author: Mark Hindley m...@hindley.org.uk
Date:   Sun Mar 20 16:34:25 2011 +

Check result of HEAD matches status of cached header, else refresh

diff --git a/apt-cacher b/apt-cacher
index 8d83af9..77aaf8a 100755
--- a/apt-cacher
+++ b/apt-cacher
@@ -601,9 +601,9 @@ sub handle_connection {
 	}
 	else {
 	    # use HTTP timestamping/ETag
-	    my ($oldmod,$newmod,$oldtag,$newtag,$testfile);
+	    my ($oldmod,$newmod,$oldtag,$newtag,$oldstat,$testfile);
 	    my $response = ${libcurl($host, $uri, undef)}; # HEAD only
-	    if($response->is_success) {
+	    if($response->is_success || -f $cached_head) {
 		$newmod = $response->header('Last-Modified');
 		$newtag = $response->header('ETag');
 		if(($newmod||$newtag) && open($testfile, $cached_head)) {
@@ -619,12 +619,22 @@ sub handle_connection {
 		    elsif (/^.*ETag:\s*(.*)(?:\r|\n)/) {
 			$oldtag = $1;
 		    }
-		    last if $oldtag && $oldmod;
+		    elsif (/^HTTP\S+\s+(\d+)\s.*(?:\r|\n)/) {
+			$oldstat = $1;
+		    }
+		    last if $oldtag && $oldmod && $oldstat;
 		}
 		close($testfile);
 	    }
+
+	    # First check status
+	    if ($oldstat && $oldstat ne $response->code) {
+		debug_message("Cached header status changed from $oldstat to " . $response->code);
+		$cache_status = 'EXPIRED';
+		debug_message($cache_status);
+	    }
 	    # Don't use ETag by default for now: broken on some servers
-	    if($cfg->{use_etags} && $oldtag && $newtag) { # Try ETag first
+	    elsif($cfg->{use_etags} && $oldtag && $newtag) { # Try ETag first
 		if ($oldtag eq $newtag) {
 		    debug_message("ETag headers match, $oldtag -> $newtag. Cached file unchanged");
 		}





Bug#618996: apt-get gives "Bad header line" due to apt-cacher giving "Browser does not support redirects!"

2011-03-20 Thread Eus
Hi!

On Sun, 2011-03-20 at 16:45 +, Mark Hindley wrote:
 On Sun, Mar 20, 2011 at 05:09:53PM +0100, Eus wrote:
  Hi!
  
  On Sun, 2011-03-20 at 17:01 +0100, Eus wrote:
  
   On Sun, 2011-03-20 at 15:50 +, Mark Hindley wrote:

I suspect it appeared as the CURLOPT_FOLLOWLOCATION was not set.
  
   But, if it is possible, I want to have a long-term solution as well.
   
   What semantic does 404 have in apt-cacher? Does the semantic allow for
   not using cached header in case apt-cacher gets 404 referring to the
   above scenario where apt-get update probes for non-existent files?
  
  If suppose CURLOPT_FOLLOWLOCATION was set when the wireless connection
  had been disconnected, then would apt-cacher have cached the login page
  of the hotspot provider?
 
 Yes, I suspect so, but presumably with a 200 status and that would have 
 been returned to the apt-get/aptitude client which would not be happy!
 
  If that is the case, will apt-cacher return the cached login page when
  apt-cacher gets 404 during the normal non-existent file probe of
  apt-get update?
 
 Again, I think so. apt-cacher would go into its 'offline' mode and do 
 the best it can with cached files.
 
 I can't think of a way for me to test either of those scenarios without 
 your setup. Could you do it and check the behaviour is as expected

Sure.

 Could you try this patch as well which should check that the cached 
 status is still valid.

Yup.

BTW, what do you think about having apt-cacher simply ignore the cached
header when the cached header does not have status 200?

Thanks.

 Mark




