On Sat, 2004-10-23 at 05:45, Brian May wrote:
No, max_versions is not correct. It will only work if all my computers
use the same distribution; if some computers use unstable while others
use stable for example, then the stable version will get deleted after
n revisions of the unstable version
also sprach Wouter Verhelst [EMAIL PROTECTED] [2004.10.22.2121 +0200]:
Oh, absolutely. One way could be to talk to the apt-cacher and
apt-proxy developers and help fixing bugs in their software,
instead of calling your not even fully thought out idea (which
surely hasn't proven itself) the
On Sat, 2004-10-23 at 12:57 -0500, Manoj Srivastava wrote:
On Fri, 22 Oct 2004 23:04:32 -0700, Matt Zimmerman [EMAIL PROTECTED] said:
On Wed, Oct 20, 2004 at 02:11:44AM +0200, martin f krafft wrote:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting,
Wouter == Wouter Verhelst [EMAIL PROTECTED] writes:
Wouter It's not actually version 2 yet, but the current apt-proxy
Wouter in unstable is supposed to be apt-proxy v2.
This version isn't in testing, hence part of my confusion. The other
part comes from the fact apt-proxy 1.9.18 in
On Thu, Oct 21, 2004 at 02:59:17PM -0500, Manoj Srivastava wrote:
Hi,
I can mostly live with the current apt-proxy, except for the
fact that it does not seem to want to play nice with debootstrap:
debootstrap just hangs.
Happens here too.. my apt-proxy and debootstrap client
On Wed, Oct 20, 2004 at 02:11:44AM +0200, martin f krafft wrote:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
My position on special-purpose proxy caches for APT is that general-purpose
proxy caches (like squid) seem
Hi, martin f krafft wrote:
I will have to think about the premature EOF.
It's a file. Files don't have premature EOFs, so you need some sort of
lock, which in turn requires a (non-shell ;-) script.
In other words, this rapidly approaches the complexity of
apt-proxy-or-whatever.
--
Matthias
On Fri, 22 Oct 2004 23:04:32 -0700, Matt Zimmerman [EMAIL PROTECTED] said:
On Wed, Oct 20, 2004 at 02:11:44AM +0200, martin f krafft wrote:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
My position on
Chris == Chris Halls [EMAIL PROTECTED] writes:
Chris Hmm, seems you are talking about version 1, which has been
Chris rewritten. The new version isn't bug free yet but it does
Chris fix several problems. It doesn't use wget.
It would appear apt-proxy v2 isn't in Debian (or that I
On Sat, Oct 23, 2004 at 02:45:54PM +1000, Brian May wrote:
Chris == Chris Halls [EMAIL PROTECTED] writes:
Chris Hmm, seems you are talking about version 1, which has been
Chris rewritten. The new version isn't bug free yet but it does
Chris fix several problems. It doesn't use
On Fri, 2004-10-22 at 13:43 +1000, Paul Hampson wrote:
Is there anything such a system would want to fetch from a Debian
mirror that doesn't show up in Packages.gz or Sources.gz?
Yes, lots of things as I found out the hard way when I implemented
object type checking in apt-cacher - even plain
Manoj == Manoj Srivastava [EMAIL PROTECTED] writes:
Manoj Hi, I can mostly live with the current apt-proxy, except
Manoj for the fact that it does not seem to want to play nice
Manoj with debootstrap: debootstrap just hangs.
Strange. I have never had any problems with debootstrap
martin == martin f krafft [EMAIL PROTECTED] writes:
martin also sprach Jonathan Oxer [EMAIL PROTECTED]
martin [2004.10.21.0617 +0200]:
So it's necessary to keep fetching the Packages files within
their expiry time or the cache gets nuked.
martin Why delete them at all?
On Fri, Oct 22, 2004 at 02:21:17PM +1000, Jonathan Oxer wrote:
On Fri, 2004-10-22 at 13:43 +1000, Paul Hampson wrote:
Is there anything such a system would want to fetch from a Debian
mirror that doesn't show up in Packages.gz or Sources.gz?
Yes, lots of things as I found out the hard way
#include hallo.h
* Adeodato Simó [Fri, Oct 22 2004, 04:40:52AM]:
Further, I wish there could be pre-caching. That means: if a file was
downloaded and was mentioned in Packages file A, and after the
next update A lists a newer version of this package, then the new
version could be downloaded in advance.
also sprach Robert Collins [EMAIL PROTECTED] [2004.10.22.0019 +0200]:
store_avg_object_size should have no impact on what is and is not
cached.
Ah, interesting. I guess my testing results were influenced by my
expectations then.
Thanks for your tips!
--
Please do not CC me when replying to
Hi, martin f krafft wrote:
also sprach martin f krafft [EMAIL PROTECTED] [2004.10.20.1155 +0200]:
#!/bin/sh -e
echo "Status: 200 OK"
echo "Content-type: application/x-debian-package"
echo
exec wget -O - "$MIRROR/$RPATH" | tee "$LPATH"
Don't forget
mkdir -p $(dirname $LPATH)
The above pipe needs
also sprach Matthias Urlichs [EMAIL PROTECTED] [2004.10.22.2011 +0200]:
exec wget -O - "$MIRROR/$RPATH" | tee "$LPATH"
Don't forget
mkdir -p $(dirname $LPATH)
Why the extra two processes?
mkdir -p ${LPATH%/*}
The above pipe needs either bash 3 or a subshell, if you want to
be able to
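The `${LPATH%/*}` expansion strips the last path component inside the shell itself, avoiding the `$(dirname ...)` subshell and the extra processes. A minimal illustration (the `LPATH` value here is only an example):

```shell
#!/bin/sh
# ${VAR%/*} deletes the shortest suffix matching "/*", i.e. the last
# path component, without forking dirname(1).
LPATH="/var/cache/apt/pool/main/a/apt/apt_0.5.27_i386.deb"
parent="${LPATH%/*}"
echo "$parent"                      # /var/cache/apt/pool/main/a/apt
[ "$parent" = "$(dirname "$LPATH")" ] && echo "same as dirname"
```

One edge case worth knowing: for a bare filename with no slash, `${LPATH%/*}` leaves the string unchanged, whereas `dirname` returns `.`.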
On Fri, Oct 22, 2004 at 08:22:57PM +0200, martin f krafft wrote:
also sprach Matthias Urlichs [EMAIL PROTECTED] [2004.10.22.2011 +0200]:
This rapidly turns from a plain 404 error script into a somewhat
nontrivial Perl-or-Python-or-whatever document handler.
It's still rather simple. I will
On Thu, 2004-10-21 at 04:04, Brian May wrote:
* If the above point wasn't bad enough by itself, the apt-proxy binary has
hard coded:
WGET_CMD="$WGET --timestamping --no-host-directories --tries=5 \
    --no-directories -P $DL_DESTDIR"
Hmm, seems you are talking about version 1, which has been
On Fri, 22 Oct 2004 15:44:12 +1000, Brian May [EMAIL PROTECTED] said:
Manoj == Manoj Srivastava [EMAIL PROTECTED] writes:
Manoj Hi, I can mostly live with the current apt-proxy, except for
Manoj the fact that it does not seem to want to play nice with
Manoj debootstrap: debootstrap just
On Wed, 20 Oct 2004, martin f krafft wrote:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
rapt proxy is an actual HTTP proxy that caches Debian packages. It's
written in Ruby, and since all you have to do to use it
On Thu, 2004-10-21 at 13:04 +1000, Brian May wrote:
* No thought put into the file deletion algorithm. IMHO, deleting
files based on age is wrong (consider how long stable files
last). Deleting files based on number of different copies is also
wrong (consider if you have some systems set up
also sprach Tobias Hertkorn [EMAIL PROTECTED] [2004.10.20.1449 +0200]:
As a part of speeding up delivery, I programmed an Apache module called
mod_mirror.
For documentation look here:
http://hacktor.fs.uni-bayreuth.de/apt-got/docs/mod_mirror.html
Interesting! Though I am not sure this would
#include hallo.h
* Jonathan Oxer [Thu, Oct 21 2004, 02:17:49PM]:
is unstable). IMHO, the only correct way is to scan the most recently
downloaded Packages and Source index files and delete files that
aren't mentioned anymore.
That's how apt-cacher does it. Early versions of apt-cacher
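The scan-the-indices deletion approach can be sketched in a few lines of shell. This is an illustrative sketch, not apt-cacher's actual code; the flat cache layout and the `clean_cache` name are assumptions:

```shell
#!/bin/sh
# clean_cache CACHEDIR INDEX
# Remove cached .deb files whose basenames no longer appear in any
# "Filename:" field of the given (uncompressed) Packages index.
clean_cache() {
    cachedir=$1 index=$2
    # Basenames of every package still referenced by the index.
    keep=$(sed -n 's/^Filename: .*\///p' "$index" | sort -u)
    for f in "$cachedir"/*.deb; do
        [ -e "$f" ] || continue         # glob matched nothing
        base=${f##*/}
        printf '%s\n' "$keep" | grep -qxF "$base" || rm -- "$f"
    done
}
```

A real implementation would also honour Sources entries and concatenate the index files of every distribution being cached, which is exactly why deleting by age or by version count goes wrong for mixed stable/unstable clients.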
On Wed, 2004-10-20 at 11:11, martin f krafft wrote:
1. apt-proxy:
While I love the concept of apt-proxy, it works very unreliably.
Frequently, the proxy fails to download the package or imposes
very long delays (#272217, and others).
This seems to be the result of a patch that fixed one
Hi,
I can mostly live with the current apt-proxy, except for the
fact that it does not seem to want to play nice with debootstrap:
debootstrap just hangs.
manoj
--
Philogyny recapitulates erogeny; erogeny recapitulates philogyny.
Manoj Srivastava [EMAIL PROTECTED]
On Wed, 2004-10-20 at 12:11 +0200, martin f krafft wrote:
also sprach martin f krafft [EMAIL PROTECTED] [2004.10.20.0211 +0200]:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
3. squid:
Squid works reliably, but
On Thu, 2004-10-21 at 17:31 +0100, Chris Halls wrote:
There have been quite a lot of attempts to make a better apt-proxy, but
almost always the authors discovered the problem is rather difficult to
get right, especially when you start worrying about streaming while
downloading, multiple
- Original Message -
From: Robert Collins [EMAIL PROTECTED]
To: Chris Halls [EMAIL PROTECTED]
Cc: debian-devel@lists.debian.org
Sent: Friday, October 22, 2004 1:56 AM
Subject: Re: an idea for next generation APT archive caching
Caching for concurrent clients is non-trivial :). There's not a lot
On Fri, 2004-10-22 at 03:36 +0200, Tobias Hertkorn wrote:
a request for http://yourserver/testing//apache...deb will not create a
hit if requested as http://yourserver/sid//apache...deb . Furthermore,
requests to similar mirrors will not create cache hits. So everybody has to
use
On Thu, 2004-10-21 at 13:13 +0200, martin f krafft wrote:
also sprach Jonathan Oxer [EMAIL PROTECTED] [2004.10.21.0617 +0200]:
So it's necessary to keep fetching the Packages files within their
expiry time or the cache gets nuked.
Why delete them at all?
Because then they are never
On Fri, 2004-10-22 at 03:36 +0200, Tobias Hertkorn wrote:
One bad thing (amongst others) that happens if you use squid - first of all
you have to make your clients use the proxy settings
Set it up in reverse proxy mode (way easier in 3.0) and you don't use
proxy settings - you use it as your
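For reference, a reverse-proxy ("accelerator") setup along those lines might look like this in squid.conf. This is a sketch using squid 3.x accel syntax, and the mirror host name is only an example:

```text
# Listen as an origin server, forwarding misses to one Debian mirror.
http_port 80 accel defaultsite=ftp.debian.org
cache_peer ftp.debian.org parent 80 0 no-query originserver name=mirror
acl debmirror dstdomain ftp.debian.org
http_access allow debmirror
cache_peer_access mirror allow debmirror
```

Clients then just point their sources.list at this host (e.g. `deb http://yourserver/debian ...`); no proxy settings are needed on the client side.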
* Eduard Bloch [Thu, 21 Oct 2004 18:12:42 +0200]:
And though I like apt-cacher in general (it worked immediately while I
did not manage to make apt-proxy work within 15 minutes and dropped the
crap), this is the only method I do not like at all.
It could be done better. I suggest you switch
On Wed, Oct 20, 2004 at 02:11:44AM +0200, martin f krafft wrote:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
Based on a normal mirror layout, the idea is to use apache's 404
hook for packages. When an existing
Hello Martin,
Am 2004-10-20 02:11:44, schrieb martin f krafft:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
:-)
Based on a normal mirror layout, the idea is to use apache's 404
hook for packages. When an existing
also sprach Michelle Konzack [EMAIL PROTECTED] [2004.10.20.1107 +0200]:
Because in Apache you can manipulate error messages, it is
possible. The problem is only the timeout of 'apt-get' if the
requested server is not fast enough.
That's why there should be streaming going on. This has been my
also sprach martin f krafft [EMAIL PROTECTED] [2004.10.20.1155 +0200]:
#!/bin/sh -e
echo "Status: 200 OK"
echo "Content-type: application/x-debian-package"
echo
exec wget -O - "$MIRROR/$RPATH" | tee "$LPATH"
One might want to parse wget's error output and return 404 as before
if it returns 404. Then
also sprach martin f krafft [EMAIL PROTECTED] [2004.10.20.0211 +0200]:
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
Some people asked how this differs from existing methods. Here are
my experiences:
1. apt-proxy:
Martin,
On Wed, Oct 20, 2004 at 12:11:12PM +0200, martin f krafft wrote:
3. squid:
Squid works reliably, but it has no concept of the APT repository
and thus it is impossible to control what is cached and for how
long.
I've long wondered whether the best answer might not be to teach
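One common way to "teach" a general-purpose cache about the archive layout is squid's refresh_pattern: pool files are immutable once published (a new upload gets a new version in the filename), while the index files go stale with every mirror pulse. A sketch for squid.conf; the numbers (minutes) are illustrative, not recommendations:

```text
# Package files never change in place: cache them for up to 90 days.
refresh_pattern \.u?deb$      129600 100% 129600
# Index files change daily or faster: force frequent revalidation.
refresh_pattern Packages\.gz$      0  20%     60
refresh_pattern Sources\.gz$       0  20%     60
refresh_pattern Release$           0  20%     60
```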
On Wednesday 20 October 2004 12.11, martin f krafft wrote:
2. apt-cacher:
Also a very nice concept, but I have found it rather unusable. Clients
would time out as the streaming does not work reliably. Also,
after using it for a day or two, I found 30 or more Perl zombies
on the system
-
From: Adrian 'Dagurashibanipal' von Bidder [EMAIL PROTECTED]
To: debian developers debian-devel@lists.debian.org
Sent: Wednesday, October 20, 2004 2:08 PM
Subject: Re: an idea for next generation APT archive caching
But your approach of catching apache's 404 certainly sounds interesting
martin == martin f krafft [EMAIL PROTECTED] writes:
martin 1. apt-proxy: While I love the concept of apt-proxy, it
martin works very unreliably. Frequently, the proxy fails to
martin download the package or imposes very long delays (#272217,
martin and others).
apt-proxy is
Here's an idea I just had about apt-proxy/apt-cacher NG. Maybe this
could be interesting, maybe it's just crap. Your call.
Based on a normal mirror layout, the idea is to use apache's 404
hook for packages. When an existing package is requested, it is
served regularly. If the file is not found, a
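Wired into Apache, the 404 hook amounts to serving the cache directory normally and routing misses to a fetcher CGI via ErrorDocument. A minimal sketch; the paths and the fetch.cgi name are assumptions, not from the original post:

```apache
# Serve already-cached files straight from the mirror tree.
DocumentRoot /var/cache/apt-404
ScriptAlias /cgi-bin/ /usr/lib/apt-404/
# On a cache miss, hand the request to a CGI that streams the file
# from the upstream mirror while tee-ing it into DocumentRoot.
ErrorDocument 404 /cgi-bin/fetch.cgi
```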