Anthony Towns aj@azure.humbug.org.au writes:
Hrm, thinking about it, I guess zsync probably works by storing the
state of the gzip table at certain points in the file and doing a
rolling hash of the contents and recompressing each chunk of the file;
that'd result in the size of the .gz not
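The rolling hash in question is the rsync-style weak checksum: a windowed byte sum plus a weighted sum, both updatable in constant time as the window slides one byte. A minimal sketch in Python (illustrative only, not zsync's actual code):

    def weak_checksum(block):
        # rsync-style weak checksum over one window/block.
        a = sum(block) & 0xFFFF
        b = sum((len(block) - i) * x for i, x in enumerate(block)) & 0xFFFF
        return a, b

    def roll(a, b, out_byte, in_byte, blocksize):
        # Slide the window one byte: drop out_byte, take in in_byte.
        a = (a - out_byte + in_byte) & 0xFFFF
        b = (b - blocksize * out_byte + a) & 0xFFFF
        return a, b

    # Sanity check: rolling matches recomputing from scratch.
    data = b"hello world"
    a, b = weak_checksum(data[0:4])
    a, b = roll(a, b, data[0], data[4], 4)
    assert (a, b) == weak_checksum(data[1:5])

The weak checksum only finds candidate block matches cheaply; a stronger per-block hash then confirms each match.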
On Fri, 11 Nov 2005 14:51:30 +1000
Anthony Towns aj@azure.humbug.org.au wrote:
Anyway, if it's recompressing like I think, there's no way to get the
same compressed md5sum -- even if the information could be
transferred, there's no guarantee the local gzip _can_ produce the
same output as the
On Tue, Nov 01, 2005 at 09:54:09AM -0500, Michael Vogt wrote:
A problem is that zsync needs to be taught to deal with deb files (that
is, it needs to unpack the data.tar and use that for the syncs).
[Anthony Towns]
That seems kinda awkward -- you'd need to start by downloading the ar
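The ar container is at least trivial to parse: an 8-byte magic followed by members with 60-byte fixed-width text headers. A rough sketch of pulling data.tar.gz out of a .deb (assumes a well-formed archive, no error handling):

    import sys

    def ar_members(path):
        # .deb files are ar(1) archives: b"!<arch>\n", then members,
        # each prefixed by a 60-byte header (name, mtime, uid, gid,
        # mode, size, terminator), data padded to 2-byte alignment.
        with open(path, "rb") as f:
            assert f.read(8) == b"!<arch>\n"
            while (hdr := f.read(60)):
                name = hdr[0:16].decode("ascii").rstrip(" /")
                size = int(hdr[48:58])
                yield name, f.read(size)
                if size % 2:
                    f.read(1)

    for name, data in ar_members(sys.argv[1]):
        if name.startswith("data.tar"):
            open(name, "wb").write(data)

So the awkward part is less the unpacking itself than that zsync would have to understand the container, mapping blocks of data.tar back to byte ranges of the .deb it actually fetches.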
On Wed, Nov 09, 2005 at 04:26:59PM +0100, Goswin von Brederlow wrote:
Anthony Towns aj@azure.humbug.org.au writes:
On Sun, Oct 30, 2005 at 09:48:35AM +0100, Goswin von Brederlow wrote:
Zsync checksum files are, depending on block size, about 3% of the
file size. For the full archive that
On Tue, Nov 01, 2005 at 09:54:09AM -0500, Michael Vogt wrote:
My next test was to use only the data.tar.gz of the two
archives. Zsync will extract the gzip file then and use the tar as the
base. With that I got:
--8<--
Read data.tar.gz. Target 34.1%
Anthony Towns aj@azure.humbug.org.au writes:
On Sun, Oct 30, 2005 at 09:48:35AM +0100, Goswin von Brederlow wrote:
Zsync checksum files are, depending on block size, about 3% of the
file size. For the full archive that means under 10G more data. As
comparison adding amd64 needs ~30G. After
Michael Vogt [EMAIL PROTECTED] writes:
--8<--
Read data.tar.gz. Target 34.1% complete.
used 1056768 local, fetched 938415
--8<--
The size of the data.tar.gz is 1210514.
So your simple test shows 34% savings for a
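A quick sanity check on those figures, assuming zsync's "fetched" line counts bytes that actually went over the wire: the saving relative to downloading the whole .gz works out rather lower than the 34.1% figure suggests:

    # Back-of-the-envelope check of the zsync output quoted above.
    gz_size = 1210514    # full size of data.tar.gz
    fetched =  938415    # bytes zsync actually downloaded
    local   = 1056768    # bytes reused from the old local file

    print(f"wire saving vs. full download: {1 - fetched / gz_size:.1%}")      # ~22.5%
    print(f"data reconstructed from local: {local / (local + fetched):.1%}")  # ~53%

The 34.1% "Target complete" line is zsync's own progress metric after scanning the local file; it is not the same thing as bandwidth saved.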
On Sun, Oct 30, 2005 at 09:48:35AM +0100, Goswin von Brederlow wrote:
Zsync checksum files are, depending on block size, about 3% of the
file size. For the full archive that means under 10G more data. As
comparison adding amd64 needs ~30G. After the scc split there might be
enough space on
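The arithmetic behind "under 10G" is easy to reconstruct: at 3% overhead it implies an archive somewhere around 300GB, which fits with the ~30G quoted for adding a single architecture. A sketch (the archive size is the assumption here):

    # Rough check of the zsync-metadata space claim.
    archive_gb = 300      # assumed: implied by "under 10G at 3%"
    overhead   = 0.03     # zsync checksum files ~3% of the file they index
    print(f"extra mirror space: ~{archive_gb * overhead:.0f} GB")   # ~9 GB

Note that the 3% itself depends on the chosen block size: smaller blocks match more of the old file but cost proportionally more checksum data.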
On Thu, Oct 27, 2005 at 10:06:22AM +0200, Robert Lemmen wrote:
On Wed, Oct 26, 2005 at 09:15:38PM -0400, Joey Hess wrote:
(And yes, we still need a solution to speed up the actual deb file
downloads..)
[..]
if zsync were taught to handle .deb files as it does .gz files, and
a method for
Henrique de Moraes Holschuh [EMAIL PROTECTED] writes:
On Thu, 27 Oct 2005, Robert Lemmen wrote:
if zsync were taught to handle .deb files as it does .gz files, and
You are talking about a freaking lot of metadata here, and about changing some
key stuff to get --rsyncable compression.
I
Kurt Roeckx [EMAIL PROTECTED] writes:
On Wed, Oct 26, 2005 at 05:11:00AM -0700, Ian Bruce wrote:
If the .deb files were compressed using the gzip --rsyncable option,
then fetching them with zsync (or rsync) would be considerably more
efficient than straight HTTP transfers.
No it wouldn't.
On Wed, Oct 26, 2005 at 04:47:21PM -0700, Ian Bruce wrote:
As explained, I wish to use rsync (or preferably, zsync) to update the
local packages list; repeatedly downloading the 3.6MB Packages.gz file
over a 56kb/s link is highly undesirable. I am unable to understand why
this ambition is
On Wed, Oct 26, 2005 at 09:15:38PM -0400, Joey Hess wrote:
(And yes, we still need a solution to speed up the actual deb file
downloads..)
i think zsync is the way to go here. it would not cause the server load
that rsync does, and only require a few percent more mirror space.
if zsync
On Thu, 27 Oct 2005, Robert Lemmen wrote:
if zsync were taught to handle .deb files as it does .gz files, and
You are talking about a freaking lot of metadata here, and about changing some
key stuff to get --rsyncable compression.
I may not understand why most apt metadata is in .gz (Packages,
On 10/27/05, Henrique de Moraes Holschuh [EMAIL PROTECTED] wrote:
On Thu, 27 Oct 2005, Robert Lemmen wrote:
if zsync were taught to handle .deb files as it does .gz files, and
You are talking about a freaking lot of metadata here, and about changing some
key stuff to get --rsyncable
It seems that recently, the uncompressed version of the Packages file
has disappeared from the unstable archive on the Debian network
servers and all their mirrors.
http://ftp.debian.org/debian/dists/unstable/main/binary-i386/
On the other hand, the uncompressed file is still available for the
On Wed, 26 Oct 2005 21:32, Ian Bruce wrote:
It seems that recently, the uncompressed version of the Packages file
has disappeared from the unstable archive on the Debian network
servers and all their mirrors.
http://ftp.debian.org/debian/dists/unstable/main/binary-i386/
On the other hand,
Ian Bruce [EMAIL PROTECTED] writes:
Some related questions:
-- what is the purpose of the Packages.diff/ directory which has
appeared in the testing and unstable archives? Is there some piece
of software which makes use of this for updating the packages lists?
apt-get (experimental only
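For the curious: Packages.diff/ holds an Index file plus a series of ed-style diffs, applied oldest-to-newest until the local Packages matches the advertised checksum (apt does this in its rred method). A sketch of applying one ed-style patch; the file names below are made up:

    def apply_ed_patch(lines, patch):
        # Apply an ed-style diff (as produced by `diff --ed`) to a list
        # of lines.  Commands arrive highest-line-first, so earlier
        # line numbers stay valid while we edit in place.
        out = list(lines)
        it = iter(patch)
        for cmd in it:
            cmd = cmd.rstrip("\n")
            op, span = cmd[-1], cmd[:-1]       # op is one of a / c / d
            start, _, end = span.partition(",")
            lo, hi = int(start), int(end or start)
            body = []
            if op in "ac":                     # read text up to a lone "."
                for line in it:
                    if line.rstrip("\n") == ".":
                        break
                    body.append(line)
            if op == "a":
                out[hi:hi] = body              # append after line hi
            elif op == "c":
                out[lo - 1:hi] = body          # replace lines lo..hi
            else:
                del out[lo - 1:hi]             # delete lines lo..hi
        return out

    old = open("Packages.old").readlines()        # hypothetical names
    diff = open("2005-10-26-1200.58").readlines()
    new = apply_ed_patch(old, diff)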
On Wed, 26 Oct 2005 12:05:08 +0200
Goswin von Brederlow [EMAIL PROTECTED] wrote:
-- has there been any progress towards providing zsync access to the
archives? It would seem that this would result in greatly reduced
data traffic on the network servers, without increasing the
computational
On Wed, 26 Oct 2005, Ian Bruce wrote:
option was implemented. Perhaps it's thought that more testing is
required before it can be used for the archives; is there any other
reason not to use it?
The way gzip --rsyncable works is perfectly safe; it cannot cause data loss,
AFAIK. It just makes
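Mechanically, the patch keeps a running sum over a window of recent input bytes and resets the deflate stream whenever that sum hits a magic condition, so identical plaintext re-synchronises to identical compressed output regardless of what preceded it. A sketch of the trigger logic (constants and condition from memory of the Debian patch; treat as illustrative):

    RSYNC_WIN = 4096   # window size; the real constant lives in gzip's deflate.c

    def rsyncable_reset_points(data):
        # Offsets where an --rsyncable gzip would flush/reset the
        # compressor: wherever the rolling sum of the last RSYNC_WIN
        # bytes is divisible by RSYNC_WIN.
        s = 0
        for i, byte in enumerate(data):
            s += byte
            if i >= RSYNC_WIN:
                s -= data[i - RSYNC_WIN]
                if s % RSYNC_WIN == 0:
                    yield i

The cost is a slightly larger .gz, since the compressor keeps discarding its history at each reset point.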
On Wed, Oct 26, 2005 at 05:11:00AM -0700, Ian Bruce wrote:
If the .deb files were compressed using the gzip --rsyncable option,
then fetching them with zsync (or rsync) would be considerably more
efficient than straight HTTP transfers.
No it wouldn't. Remember that .deb files are never
On Wed, 26 Oct 2005 19:12:30 +0200
Kurt Roeckx [EMAIL PROTECTED] wrote:
If the .deb files were compressed using the gzip --rsyncable
option, then fetching them with zsync (or rsync) would be
considerably more efficient than straight HTTP transfers.
No it wouldn't. Remember that .deb
On 10454 March 1977, Ian Bruce wrote:
Returning to the original question: Does anybody know why the
uncompressed Packages file has disappeared from the unstable
archive?
Because relevant tools haven't used / shouldn't have used that file for years.
It was announced *long* ago that it would be removed in a few days, so
On Thu, 27 Oct 2005 00:24:36 +0200
Joerg Jaspert [EMAIL PROTECTED] wrote:
Returning to the original question: Does anybody know why the
uncompressed Packages file has disappeared from the unstable
archive?
Because relevant tools haven't used / shouldn't have used that file for years.
It was
Ian Bruce wrote:
As explained, I wish to use rsync (or preferably, zsync) to update the
local packages list; repeatedly downloading the 3.6MB Packages.gz file
over a 56kb/s link is highly undesirable. I am unable to understand why
this ambition is considered to be unreasonable.
Is there some
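The arithmetic makes the complaint concrete (idealised line, no protocol overhead):

    # Time to fetch Packages.gz over a 56 kbit/s modem.
    size_bits = 3.6e6 * 8     # 3.6 MB Packages.gz
    rate_bps  = 56e3          # 56 kbit/s, best case
    print(f"~{size_bits / rate_bps / 60:.0f} minutes per update")   # ~9 minutes

And that is per apt-get update, for a file that typically changes only a little from day to day.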