Re: Idea for structure of Apt-Get

2005-04-02 Thread Bob Proulx
Bob Proulx wrote:
> I switched to using socks and rsync as the transport protocol with
> debmirror to avoid the bad behavior of rsync.  That solved most of my
 ^ http proxy

  s/bad behavior of rsync/bad behavior of http proxy/

Drat.  I hate it when I do that.  I meant http proxy there.

Bob




Re: Idea for structure of Apt-Get

2005-04-02 Thread Bob Proulx
Goswin von Brederlow wrote:
> The problem with ISPs with transparent proxies is that many of them
> are broken. Broken to a point that you can't fix it.

Agreed.  I am currently working behind such a broken proxy.  It gives
me no end of trouble!  It is one of the reasons that I really must
maintain my own full mirror of the global depots.  At least then local
clients of the local mirror work, and problems getting offsite are
limited to updating the mirror.

I switched to using socks and rsync as the transport protocol with
debmirror to avoid the bad behavior of rsync.  That solved most of my
problems as the corp folks have not deduced a way to mess with that
access route.  But I dislike socks.  I really wish we had a modern
network at work and not something from twenty years ago.
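
For the record, the whole thing reduces to one debmirror run wrapped in
tsocks.  A rough sketch of it as a Python wrapper (the upstream host,
rsync module, and paths are placeholders, and debmirror's exact options
vary by version):

    import subprocess

    # Mirror the archive over rsync, tunnelled through the SOCKS proxy
    # via tsocks, so the http proxy never sees the traffic.
    subprocess.run([
        "tsocks", "debmirror",
        "--method=rsync",
        "--host=ftp.us.debian.org",          # placeholder upstream mirror
        "--root=:debian",                    # rsync module on that host
        "--dist=sarge",
        "--section=main,contrib,non-free",
        "--arch=i386",
        "/srv/mirror/debian",                # placeholder local target
    ], check=True)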

Now, before I hear from the 'net about how rsync is hard on servers
(and I agree it can be), here it is really only used in an expensive
way on the Packages files.  The other files are all named uniquely, so
the result is a simple file copy using about the same cpu as an http
transfer.  I don't believe it is hard on servers when used with
debmirror.

Bob




Re: Idea for structure of Apt-Get

2005-04-02 Thread Goswin von Brederlow
[EMAIL PROTECTED] (Lennart Sorensen) writes:

> On Fri, Apr 01, 2005 at 09:55:01PM +0200, Goswin von Brederlow wrote:
>> The ISPs with caching proxies are usually the ones that always have
>> problems with apt-get and basically any other http/ftp app, especially
>> ftp.  They start to cache stuff they aren't supposed to cache, or don't
>> notice file changes.
>> 
>> Just think what happens if you get today's Release file and yesterday's
>> Packages file.  Apt currently just fails to find packages, but with 0.6
>> it will suspect a network intrusion and loudly scream because the
>> checksums don't match.
>
> Transparent proxies that don't do a good job rechecking documents are
> broken.  If apt knows it is using a proxy it can request a check for
> updates on important files (and does so).  It works great.  Nothing
> wrong with apt or proxies, only with transparent proxies that aren't
> that transparent.
>
> Len Sorensen

How do you tell apt that it is using a transparent proxy?  How do you
tell an ftp proxy not to cache the files it proxies?

The problem with ISPs with transparent proxies is that many of them
are broken. Broken to a point that you can't fix it.

Even squid as an ftp proxy is broken and doesn't notice file changes.
How do you expect some proprietary proxy to do better?

Regards,
Goswin





Re: Idea for structure of Apt-Get

2005-04-01 Thread Lennart Sorensen
On Fri, Apr 01, 2005 at 09:55:01PM +0200, Goswin von Brederlow wrote:
> The ISPs with caching proxies are usually the ones that always have
> problems with apt-get and basically any other http/ftp app, especially
> ftp.  They start to cache stuff they aren't supposed to cache, or don't
> notice file changes.
> 
> Just think what happens if you get today's Release file and yesterday's
> Packages file.  Apt currently just fails to find packages, but with 0.6
> it will suspect a network intrusion and loudly scream because the
> checksums don't match.

Transparent proxies that don't do a good job rechecking documents are
broken.  If apt knows it is using a proxy it can request a check for
updates on important files (and does so).  It works great.  Nothing
wrong with apt or proxies, only with transparent proxies that aren't
that transparent.
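
For what it's worth, the revalidation request is plain HTTP.  A rough
Python sketch of the idea (apt's internals differ in detail, and the
URL is just a placeholder):

    import urllib.request

    # Ask any cache along the way to revalidate rather than serve a
    # stale copy of the index file.
    req = urllib.request.Request(
        "http://ftp.debian.org/debian/dists/sarge/Release",
        headers={"Cache-Control": "max-age=0"},
    )
    with urllib.request.urlopen(req) as resp:
        release = resp.read()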

Len Sorensen





Re: Idea for structure of Apt-Get

2005-04-01 Thread Goswin von Brederlow
Helge Hafting <[EMAIL PROTECTED]> writes:

> the protocols usually used by apt over the net.  Caching proxies have
> two big advantages over changing apt:
>
> * Nothing has to be done to apt at all!
> * Proxies also cache things other than Debian packages.
>
> Helge Hafting

The ISPs with caching proxies are usually the ones that always have
problems with apt-get and basically any other http/ftp app, especially
ftp.  They start to cache stuff they aren't supposed to cache, or don't
notice file changes.

Just think what happens if you get today's Release file and yesterday's
Packages file.  Apt currently just fails to find packages, but with 0.6
it will suspect a network intrusion and loudly scream because the
checksums don't match.
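
To illustrate, a rough Python sketch of that kind of check, assuming a
Release file that lists "<sha1> <size> <path>" entries (the file names
are placeholders):

    import hashlib

    def sha1_from_release(release_text, index_path):
        # Release lists "<sha1> <size> <path>" for each index file.
        for line in release_text.splitlines():
            fields = line.split()
            if len(fields) == 3 and fields[2] == index_path:
                return fields[0]
        return None

    with open("Release") as f:
        expected = sha1_from_release(f.read(), "main/binary-i386/Packages")
    with open("Packages", "rb") as f:
        actual = hashlib.sha1(f.read()).hexdigest()
    if expected != actual:
        raise SystemExit("checksum mismatch: stale or tampered Packages")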

Regards,
Goswin





Re: Idea for structure of Apt-Get

2005-03-29 Thread Helge Hafting
Patrick Carlson wrote:
> Hello.  I'm not sure if anyone has suggested something like this or
> not but I was thinking about the apt-get system and bittorrent today.
> What if the apt-get system was redesigned so that users could download
> updates and upgrades from other users?  This way they would trickle
> out to people, slowly at first, but then more and more people would
> have the update and thus more people could get it faster.  I know

Faster than what?  Today's system is very fast:
one user (the maintainer) uploads a new version, and everybody has
instant access to it as soon as they do an "apt-get upgrade".
No "slow trickle at first."

> there would probably be a lot of security issues involved but then
> maybe people wouldn't have to worry about setting up .deb mirrors and
> trying to get the latest upgrades.  Just a thought.  If it's a bad
> one, let me know. :)

Oh, you're worried about the internet slowing down as everybody
upgrades and downloads the same stuff?  There is a much better solution
to this, and it is called "caching proxies".  Many an ISP already has a
caching proxy that caches both ftp and http, which are the protocols
usually used by apt over the net.  Caching proxies have two big
advantages over changing apt:
* Nothing has to be done to apt at all!
* Proxies also cache things other than Debian packages.
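
And when the proxy is explicit rather than transparent, pointing apt at
it takes a single line in /etc/apt/apt.conf (the host and port here are
placeholders):

    Acquire::http::Proxy "http://proxy.example.net:3128/";
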
Helge Hafting


Re: Idea for structure of Apt-Get

2005-03-21 Thread Patrick Carlson
Somebody has thought of it! :)
Thanks for the link. 


On Mon, 21 Mar 2005 09:54:12 +0100, Kaare Hviid <[EMAIL PROTECTED]> wrote:
> On Sat, Mar 19, 2005 at 12:30:35AM -0600, Patrick Carlson wrote:
> >
> > Hello.  I'm not sure if anyone has suggested something like this or
> > not but I was thinking about the apt-get system and bittorrent today.
> 
> You might want to check out
> 
> http://sianka.free.fr/
> 
> which was mentioned in DWN on November 2nd, 2004.
> 
> -ukh
>





Re: Idea for structure of Apt-Get

2005-03-21 Thread Mattias Wadenstein
On Sat, 19 Mar 2005, Nat Tuck wrote:
[snip]
> I guess the real question is as follows:
> - Is there a big enough shortage in donated mirror bandwidth to put the effort
> into developing a peer to peer package distribution system and convincing a
> large percentage of users to share their bandwidth?
No.
/Mattias Wadenstein, admin, ftp.se.debian.org and bach.hpc2n.umu.se


Re: Idea for structure of Apt-Get

2005-03-21 Thread Kaare Hviid
On Sat, Mar 19, 2005 at 12:30:35AM -0600, Patrick Carlson wrote:
> 
> Hello.  I'm not sure if anyone has suggested something like this or
> not but I was thinking about the apt-get system and bittorrent today. 

You might want to check out

http://sianka.free.fr/

which was mentioned in DWN on November 2nd, 2004.

-ukh





Re: Idea for structure of Apt-Get

2005-03-19 Thread Javier Kohen
On Sun, 2005-03-20 at 00:10, James Titcumb wrote:
> I'd imagine some kind of measure would be placed to prevent that... a 
> simple MD5 checksum would probably do the trick...

I'd rather go with a hash algorithm that hasn't been broken. Anyway, as
Nat Tuck posted earlier, Debian packages are supposed to be signed
already.
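
To make that concrete, a rough Python sketch of checking one downloaded
package against its Packages stanza, using SHA-256 as an example of an
unbroken hash (stanza parsing is simplified, the field may be MD5Sum or
SHA1 in older indices, and the file names are placeholders):

    import hashlib

    def field(stanza, name):
        # Pull one "Name: value" header out of a Packages stanza.
        for line in stanza.splitlines():
            if line.startswith(name + ": "):
                return line.split(": ", 1)[1]
        return None

    stanza = open("Packages").read().split("\n\n")[0]   # first stanza
    deb = field(stanza, "Filename").rsplit("/", 1)[-1]  # local file name
    expected = field(stanza, "SHA256")
    with open(deb, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    print("ok" if actual == expected else "hash mismatch: reject this copy")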

Greetings,
-- 
Javier Kohen <[EMAIL PROTECTED]>
ICQ: blashyrkh #2361802
Jabber: [EMAIL PROTECTED]




Re: Idea for structure of Apt-Get

2005-03-19 Thread Patrick Carlson
I like your idea, Max.  With a subscription-based system, people
wouldn't really be anonymous anymore.  I think the bandwidth issue is
probably the biggest problem right now; I'm still sitting on a dialup
connection when I'm at home.  I wonder, though: with widespread use,
would the average upload amount for a user (or node) go down?

With some central tracker system, sending out a relatively small patch
to thousands of users would be fairly easy and quick, no?  The .deb
mirrors that are up now would basically become "super hosts", able to
start the distribution of a patch very quickly and then, as it picks up
speed, move on to other patches that need to get out.  The .deb mirrors
would send the patches to the people with the fastest connections
first.  What if people never needed to run apt-get update or apt-get
upgrade?  What if it was always on?  If a patch is released, you would
get it quickly.

Maybe I'm thinking of something that can't really be accomplished right
now, but with South Korea pumping a large amount of money into their
internet infrastructure, look where it has taken them.  Maybe we just
need to lobby for better internet access.  Just a thought.  If this is
the wrong place for this conversation, I'm sorry; maybe it should be
moved to off-topic or something.  Anyway, thanks for listening. :)


On Sun, 20 Mar 2005 00:10:14, Max Dyckhoff <[EMAIL PROTECTED]> wrote:
> I'm going to start off by admitting that I don't know exactly how the
> mirror system works; from my understanding, it literally relies on a user
> changing their sources.list file to point to a different download site?
> 
> Here's an idea that I've just whipped up as a hybrid of the mirror
> system and true p2p systems, taking into consideration points mentioned
> in this thread.
> 
> Rather than have each user become a peer on the download network, have
> it a subscription based thing, very similar to a mirror system. A user
> would sign up saying that they were willing to share their .deb files,
> and this would be registered with a central tracker. Other users would
> then point their sources.list to the tracker, and when requesting a .deb
> the tracker would provide them with a link to one of the subscribed
> sharing users.
> 
> Given that (correct me if I'm wrong) the size of .deb files is
> relatively small, I don't think it would make much sense to split them
> into different chunks and farm them out to different servers. Rather the
> tracker would just choose a server to point the user at, based either on
> historical traffic or a quick query of the server's speed. This would be
> the "difficult" part of the system to develop, and while I have some
> ideas I suspect someone with a network background would be better (mine
> is AI ;-) )
> 
> This would make it somewhat more user-friendly (as they would just have
> to add a single line to their sources.list; the address of the "tracker"
> for a set of packages, rather than choosing from a mirror), and it would
> have the benefits of distributed network use that you get from a p2p
> network.
> 
> Security would be dealt with easily - MD5 sums or some such solution - but
> there would potentially be some latency, waiting for the tracker to
> provide a download location.
> 
> I don't personally have any problems with the existing system, and any
> change would presumably be prompted by problems with the hosts of
> mirrors if bandwidth usage gets too great for them. I reckon it would be
> relatively easy to implement, and depending on whether current mirrors
> are feeling at all pressed for bandwidth, it might not be a bad idea for
> someone to at least prototype it.
> 
> Then again, this is probably completely the wrong list for discussing
> this, given it's an AMD64 place... I'm not subscribed to anywhere else
> though, and it's always fun to chat ;)
> 
> Max
> 
> 
> Nat Tuck wrote:
> > The security issues in this plan are solved pretty well. If you used the
> > actual bittorrent protocol then it would be as secure as the mirrors are now
> > - if not slightly more secure.
> >
> > The biggest issues here are
> > A.) unexpected bandwidth usage.
> > B.) horrible latency
> >
> > The first issue is mostly a real issue from a bad press perspective. People
> > will see not using upstream bandwidth as a feature and try to avoid/cheat
> > the system. I actually wish bittorrent-style update mechanisms were more
> > common - people might stop paying for connections with horrible upload
> > speeds.
> >
> > The second issue is most likely an engineering problem. The existing
> > bittorrent protocol has a bit of a delay finding peers and convincing them
> > to share - until you have a chunk or two of the file, you'll be stuck at a
> > super-low download rate (typically 1kb/sec). Since a bittorrent "chunk" is a
> > good percentage of the size of the average Debian package, some sort of
> > custom bittorrent-like protocol would need to be developed.
> >
> > I guess the real question is as follows:
> > - Is there a big enough shortage in donated mirror bandwidth to put the effort
> > into developing a peer to peer package distribution system and convincing a
> > large percentage of users to share their bandwidth?

Re: Idea for structure of Apt-Get

2005-03-19 Thread Nat Tuck
(Responding to both Javier Fernández García and Brett Viren)

(Security)

Bittorrent is similarly secure to the existing debian mirror system.

In a Bittorrent system, you'd be distributing .torrent files from trusted 
debian mirrors - torrent files contain cryptographic hashes for each block of 
the data. The bittorrent protocol is already error correcting using the 
cryptographic hashes - any bad data injected would be recognized and 
discarded (and the recipient may decide to not accept data from that sender 
again for the current session).
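
A rough Python sketch of that per-piece check, assuming the .torrent
metadata has already been bdecoded into a dict (the key names follow
the .torrent format):

    import hashlib

    def verify_pieces(data, info):
        piece_len = info["piece length"]
        digests = info["pieces"]            # concatenated 20-byte SHA1s
        for i in range(len(digests) // 20):
            piece = data[i * piece_len:(i + 1) * piece_len]
            if hashlib.sha1(piece).digest() != digests[i * 20:(i + 1) * 20]:
                return False                # bad data: discard and re-fetch
        return True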

There's actually a potential security gain: the set of trusted mirrors could 
be much smaller. On the other hand, aren't debian packages gpg signed anyway 
- making the whole security discussion redundant?

(Granularity)
The granularity issue and the latency issue are linked pretty closely. If you 
tried to implement this with existing bittorrent clients you'd run into 
exactly the problem that you mention - needing to have a separate instance of 
the client open for each .deb involved.

It would be possible with the current protocol to have one big .torrent file 
for the entire release, similar to the package list you download when you say 
"apt-get update", and then to only download the individual files that you 
need - the existing BitTornado client does this. I don't know if this would 
work for debian - it actually might.
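
Selecting one file out of such a big torrent just means mapping its
byte range onto pieces.  A rough Python sketch, assuming a bdecoded
info dict and a non-empty file (the example path is made up):

    def pieces_for_file(info, wanted):
        # Files in a multi-file torrent are laid out back to back, so a
        # file's pieces follow from its byte offset and length.
        piece_len = info["piece length"]
        offset = 0
        for entry in info["files"]:
            if entry["path"] == wanted:     # path is a list of components
                first = offset // piece_len
                last = (offset + entry["length"] - 1) // piece_len
                return range(first, last + 1)
            offset += entry["length"]
        return range(0)

    # e.g. pieces_for_file(info, ["pool", "main", "a", "apt", "apt_0.6.deb"])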

On Saturday 19 March 2005 06:54 pm, Javier Fernández García wrote:
> Hello.
>
> This thought looks pretty interesting.  I wonder if I could trust p2p...
> I'll explain myself:
>
> What would happen if someone changed the source code of a chunk of an
> application?  I mean, I trust the servers from which I download the
> packages, but can I trust any user that offers me a chunk?

On Saturday 19 March 2005 06:55 pm, Brett Viren wrote:
> One more:
>
> Nat Tuck <[EMAIL PROTECTED]> writes:
> > The biggest issues here are
> > A.) unexpected bandwidth usage.
> > B.) horrible latency
>
> C.) granularity
>
> Please correct me if this is wrong, but must not a torrent be predefined
> as a single file (or set of files)?  If so, one would need to
> decide at what granularity to build the torrents: one per .deb, one
> per category, one per section.
>
> If the .torrents are fine grained it means leaving many BT clients
> open.  OTOH, if they are made more inclusive, it means using more
> disk/bandwidth than really required.
>
> -Brett.



Re: Idea for structure of Apt-Get

2005-03-19 Thread James Titcumb
I'd imagine some kind of measure would be placed to prevent that... a 
simple MD5 checksum would probably do the trick...

Javier Fernández García wrote:
> Hello.
>
> This thought looks pretty interesting.  I wonder if I could trust
> p2p... I'll explain myself:
>
> What would happen if someone changed the source code of a chunk of an
> application?  I mean, I trust the servers from which I download the
> packages, but can I trust any user that offers me a chunk?
>
> In Spain the internet is worse... (we upload at 15 k/s with our
> affordable ADSL).




Re: Idea for structure of Apt-Get

2005-03-19 Thread Max Dyckhoff
I'm going to start off by admitting that I don't know exactly how the 
mirror system works; from my understanding, it literally relies on a user 
changing their sources.list file to point to a different download site?

Here's an idea that I've just whipped up as a hybrid of the mirror 
system and true p2p systems, taking into consideration points mentioned 
in this thread.

Rather than have each user become a peer on the download network, have 
it a subscription based thing, very similar to a mirror system. A user 
would sign up saying that they were willing to share their .deb files, 
and this would be registered with a central tracker. Other users would 
then point their sources.list to the tracker, and when requesting a .deb 
the tracker would provide them with a link to one of the subscribed 
sharing users.

Given that (correct me if I'm wrong) the size of .deb files is 
relatively small, I don't think it would make much sense to split them 
into different chunks and farm them out to different servers. Rather the 
tracker would just choose a server to point the user at, based either on 
historical traffic or a quick query of the server's speed. This would be 
the "difficult" part of the system to develop, and while I have some 
ideas I suspect someone with a network background would be better (mine 
is AI ;-) )
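
To sketch the tracker in Python (everything here is illustrative: the
speed metric, the data layout, all of it):

    import random

    class Tracker:
        def __init__(self):
            self.speed = {}   # host -> observed throughput (bytes/sec)
            self.have = {}    # package name -> set of hosts carrying it

        def subscribe(self, host, packages):
            self.speed.setdefault(host, 0.0)
            for pkg in packages:
                self.have.setdefault(pkg, set()).add(host)

        def report(self, host, throughput):
            # Exponential moving average of observed transfer speed.
            self.speed[host] = (0.8 * self.speed.get(host, 0.0)
                                + 0.2 * throughput)

        def locate(self, pkg):
            # Point the client at the fastest known host with the file.
            candidates = self.have.get(pkg, set())
            if not candidates:
                return None
            best = max(self.speed[h] for h in candidates)
            return random.choice(
                [h for h in candidates if self.speed[h] == best])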

This would make it somewhat more user-friendly (as they would just have 
to add a single line to their sources.list; the address of the "tracker" 
for a set of packages, rather than choosing from a mirror), and it would 
have the benefits of distributed network use that you get from a p2p 
network.

Security would be dealt with easily - MD5 sums or some such solution - but 
there would potentially be some latency, waiting for the tracker to 
provide a download location.

I don't personally have any problems with the existing system, and any 
change would presumably be prompted by problems with the hosts of 
mirrors if bandwidth usage gets too great for them. I reckon it would be 
relatively easy to implement, and depending on whether current mirrors 
are feeling at all pressed for bandwidth, it might not be a bad idea for 
someone to at least prototype it.

Then again, this is probably completely the wrong list for discussing 
this, given it's an AMD64 place... I'm not subscribed to anywhere else 
though, and it's always fun to chat ;)

Max

Nat Tuck wrote:
> The security issues in this plan are solved pretty well.  If you used the
> actual bittorrent protocol then it would be as secure as the mirrors are now
> - if not slightly more secure.
>
> The biggest issues here are
> A.) unexpected bandwidth usage.
> B.) horrible latency
>
> The first issue is mostly a real issue from a bad press perspective.  People
> will see not using upstream bandwidth as a feature and try to avoid/cheat
> the system.  I actually wish bittorrent-style update mechanisms were more
> common - people might stop paying for connections with horrible upload
> speeds.
>
> The second issue is most likely an engineering problem.  The existing
> bittorrent protocol has a bit of a delay finding peers and convincing them
> to share - until you have a chunk or two of the file, you'll be stuck at a
> super-low download rate (typically 1kb/sec).  Since a bittorrent "chunk" is a
> good percentage of the size of the average Debian package, some sort of
> custom bittorrent-like protocol would need to be developed.
>
> I guess the real question is as follows:
> - Is there a big enough shortage in donated mirror bandwidth to put the effort
> into developing a peer to peer package distribution system and convincing a
> large percentage of users to share their bandwidth?



Re: Idea for structure of Apt-Get

2005-03-19 Thread Lee Begg
On Sun, 20 Mar 2005 11:44, Nat Tuck wrote:

> The biggest issues here are
> A.) unexpected bandwidth usage.
> B.) horrible latency
>
> The first issue is mostly a real issue from a bad press perspective. People
> will see not using upstream bandwidth as a feature and try to avoid/cheat
> the system. I actually wish bittorrent-style update mechanisms were more
> common - people might stop paying for connections with horrible upload
> speeds.

Here in New Zealand, I have to pay for all traffic to and from my connection - 
there isn't any choice in it currently.  For some of us it isn't an issue of 
horrible upload speeds (although they are pathetic here anyway) but the cost 
per MB downloaded & uploaded.

Hopefully this situation will change, but until it does there are a number of 
people who will not support it.

I love bittorrent but I have to keep my traffic local to NZ.

Later
Lee Begg




Re: Idea for structure of Apt-Get

2005-03-19 Thread Javier Fernández García
Hello.

This thought looks pretty interesting.  I wonder if I could trust p2p... I'll 
explain myself:

What would happen if someone changed the source code of a chunk of an 
application?  I mean, I trust the servers from which I download the 
packages, but can I trust any user that offers me a chunk?

In Spain the internet is worse... (we upload at 15 k/s with our affordable ADSL).

-- 

Javier Fernández García (a.k.a calvin)
President of Core Dumped
http://hal9000.eui.upm.es



On Sunday, 20 March 2005 at 00:44, Nat Tuck wrote:
> The security issues in this plan are solved pretty well. If you used the
> actual bittorrent protocol then it would be as secure as the mirrors are
> now - if not slightly more secure.
>
> The biggest issues here are
> A.) unexpected bandwidth usage.
> B.) horrible latency
>
> The first issue is mostly a real issue from a bad press perspective. People
> will see not using upstream bandwidth as a feature and try to avoid/cheat
> the system. I actually wish bittorrent-style update mechanisms were more
> common - people might stop paying for connections with horrible upload
> speeds.
>
> The second issue is most likely an engineering problem. The existing
> bittorrent protocol has a bit of a delay finding peers and convincing them
> to share - until you have a chunk or two of the file, you'll be stuck at a
> super-low download rate (typically 1kb/sec). Since a bittorrent "chunk" is
> a good percentage of the size of the average Debian package, some sort of
> custom bittorrent-like protocol would need to be developed.
>
> I guess the real question is as follows:
> - Is there a big enough shortage in donated mirror bandwidth to put the
> effort into developing a peer to peer package distribution system and
> convincing a large percentage of users to share their bandwidth?
>
> -- Nat Tuck
>
> On Saturday 19 March 2005 02:21 pm, James Titcumb wrote:
> > Patrick,
> >
> > It seems a good idea, but I don't think it could work in practice for a
> > few reasons...
> >
> > Firstly, the UK internet is terrible. There are bandwidth constraints on
> > 90% of home users now, which means that we'd have to pay for more
> > bandwidth every month due to the number of uploads... Also, the price of
> > symmetrical DSL is not yet affordable for home users like myself, so
> > most of us are stuck on ADSL, with upload speeds of only around 30k/s.
> > Not to mention the appalling contention ratios of anywhere up to
> > 100:1... I'm lucky enough to live in the countryside where there are
> > only about 5 other users on the local exchange :)
> >
> > Secondly, as you said, I can see security issues galore :(... especially
> > for server systems which would supposedly be secure, yet a user may
> > hypothetically be able to start downloading other files...  unless of
> > course the theoretical apt-get "uploader" limits it to one directory.
> >
> > It's a nice concept, granted, but I think people are so used to mirrors
> > now.  As the saying goes, "if it ain't broke, don't fix it"... which I
> > never abide by, because I like to tinker with things, break them, then
> > fix them again...  :)
> >
> > James
> >
> > Patrick Carlson wrote:
> > >Hello.  I'm not sure if anyone has suggested something like this or
> > >not but I was thinking about the apt-get system and bittorrent today.
> > >What if the apt-get system was redesigned so that users could download
> > >updates and upgrades from other users?  This way they would trickle
> > >out to people, slowly at first, but then more and more people would
> > >have the update and thus more people could get it faster.  I know
> > >there would probably be a lot of security issues involved but then
> > >maybe people wouldn't have to worry about setting up .deb mirrors and
> > >trying to get the latest upgrades.  Just a thought.  If it's a bad
> > >one, let me know. :)
> > >
> > >-Patrick



Re: Idea for structure of Apt-Get

2005-03-19 Thread Brett Viren
One more:

Nat Tuck <[EMAIL PROTECTED]> writes:

> The biggest issues here are
> A.) unexpected bandwidth usage.
> B.) horrible latency
C.) granularity

Please correct me if this is wrong, but must not a torrent be predefined
as a single file (or set of files)?  If so, one would need to
decide at what granularity to build the torrents: one per .deb, one
per category, one per section.

If the .torrents are fine grained it means leaving many BT clients
open.  OTOH, if they are made more inclusive, it means using more
disk/bandwidth than really required.

-Brett.





Re: Idea for structure of Apt-Get

2005-03-19 Thread Nat Tuck
The security issues in this plan are solved pretty well. If you used the 
actual bittorrent protocol then it would be as secure as the mirrors are now 
- if not slightly more secure.

The biggest issues here are
A.) unexpected bandwidth usage.
B.) horrible latency

The first issue is mostly a real issue from a bad press perspective. People
will see not using upstream bandwidth as a feature and try to avoid/cheat
the system. I actually wish bittorrent-style update mechanisms were more
common - people might stop paying for connections with horrible upload
speeds.

The second issue is most likely an engineering problem. The existing 
bittorrent protocol has a bit of a delay finding peers and convincing them
to share - until you have a chunk or two of the file, you'll be stuck at a 
super-low download rate (typically 1kb/sec). Since a bittorrent "chunk" is a 
good percentage of the size of the average Debian package, some sort of
custom bittorrent-like protocol would need to be developed.

I guess the real question is as follows:
- Is there a big enough shortage in donated mirror bandwidth to put the effort 
into developing a peer to peer package distribution system and convincing a 
large percentage of users to share their bandwidth?

-- Nat Tuck

On Saturday 19 March 2005 02:21 pm, James Titcumb wrote:
> Patrick,
>
> It seems a good idea, but I don't think it could work in practice for a
> few reasons...
>
> Firstly, the UK internet is terrible. There are bandwidth constraints on
> 90% of home users now, which means that we'd have to pay for more
> bandwidth every month due to the number of uploads... Also, the price of
> symmetrical DSL is not yet affordable for home users like myself, so
> most of us are stuck on ADSL, with upload speeds of only around 30k/s.
> Not to mention the appalling contention ratios of anywhere up to
> 100:1... I'm lucky enough to live in the countryside where there are
> only about 5 other users on the local exchange :)
>
> Secondly, as you said, I can see security issues galore :(... especially
> for server systems which would supposedly be secure, yet a user may
> hypothetically be able to start downloading other files...  unless of
> course the theoretical apt-get "uploader" limits it to one directory.
>
> It's a nice concept, granted, but I think people are so used to mirrors
> now.  As the saying goes, "if it ain't broke, don't fix it"... which I
> never abide by, because I like to tinker with things, break them, then
> fix them again...  :)
>
> James
>
> Patrick Carlson wrote:
> >Hello.  I'm not sure if anyone has suggested something like this or
> >not but I was thinking about the apt-get system and bittorrent today.
> >What if the apt-get system was redesigned so that users could download
> >updates and upgrades from other users?  This way they would trickle
> >out to people, slowly at first, but then more and more people would
> >have the update and thus more people could get it faster.  I know
> >there would probably be a lot of security issues involved but then
> >maybe people wouldn't have to worry about setting up .deb mirrors and
> >trying to get the latest upgrades.  Just a thought.  If it's a bad
> >one, let me know. :)
> >
> >-Patrick





Re: Idea for structure of Apt-Get

2005-03-19 Thread James Titcumb
Patrick,
It seems a good idea, but I don't think it could work in practice for a 
few reasons...

Firstly, the UK internet is terrible. There are bandwidth constraints on 
90% of home users now, which means that we'd have to pay for more 
bandwidth every month due to the number of uploads... Also, the price of 
symmetrical DSL is not yet affordable for home users like myself, so 
most of us are stuck on ADSL, with upload speeds of only around 30k/s. 
Not to mention the appalling contention ratios of anywhere up to 
100:1... I'm lucky enough to live in the countryside where there are 
only about 5 other users on the local exchange :)

Secondly, as you said, I can see security issues galore :(... especially 
for server systems which would supposedly be secure, yet a user may 
hypothetically be able to start downloading other files...  unless of 
course the theoretical apt-get "uploader" limits it to one directory.

It's a nice concept, granted, but I think people are so used to mirrors 
now.  As the saying goes, "if it ain't broke, don't fix it"... which I 
never abide by, because I like to tinker with things, break them, then 
fix them again...  :)

James
Patrick Carlson wrote:
> Hello.  I'm not sure if anyone has suggested something like this or
> not but I was thinking about the apt-get system and bittorrent today.
> What if the apt-get system was redesigned so that users could download
> updates and upgrades from other users?  This way they would trickle
> out to people, slowly at first, but then more and more people would
> have the update and thus more people could get it faster.  I know
> there would probably be a lot of security issues involved but then
> maybe people wouldn't have to worry about setting up .deb mirrors and
> trying to get the latest upgrades.  Just a thought.  If it's a bad
> one, let me know. :)
>
> -Patrick




Idea for structure of Apt-Get

2005-03-18 Thread Patrick Carlson
Hello.  I'm not sure if anyone has suggested something like this or
not but I was thinking about the apt-get system and bittorrent today. 
What if the apt-get system was redesigned so that users could download
updates and upgrades from other users?  This way they would trickle
out to people, slowly at first, but then more and more people would
have the update and thus more people could get it faster.  I know
there would probably be a lot of security issues involved but then
maybe people wouldn't have to worry about setting up .deb mirrors and
trying to get the latest upgrades.  Just a thought.  If it's a bad
one, let me know. :)

-Patrick

