Greetings,

This is not a change I intend to make right now, but I wanted to start
a discussion on it.

Monday night, I had the opportunity to speak with a developer who was
quite knowledgeable about the operation of apt-get and other package
managers.

We were discussing the current method that ips uses to stream file
data during a filelist operation: grabbing files one at a time on the
server and sending them to the client as a single tar stream over the
connection.
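For anyone unfamiliar with that model, here is a rough sketch of the
idea (not the actual pkg code -- just Python's stdlib tarfile in
stream mode, with an in-memory buffer standing in for the socket and
made-up file names):

```python
import io
import tarfile

# Server side: add files one at a time to a tar stream written
# directly to the (here simulated) connection.
wire = io.BytesIO()  # stands in for the network socket
with tarfile.open(fileobj=wire, mode="w|") as tar:
    for name, data in [("url_1", b"aaa"), ("url_2", b"bbb")]:
        info = tarfile.TarInfo(name=name)
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))

# Client side: read the members back out of the stream in order.
wire.seek(0)
with tarfile.open(fileobj=wire, mode="r|") as tar:
    names = [m.name for m in tar]
print(names)
```

The "|" in the mode is what makes it a non-seekable stream, which is
the property that matters here: the client can start unpacking before
the server has finished sending.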

I was wondering if someone could expound a bit on that approach, and
why a tar stream was chosen.

The developer I talked to suggested that instead of doing a tar stream
from the server, we could simply allow the client to perform HTTP/1.1
pipeline requests for each individual file.

After looking into this a little bit, it looks like the change would
be from this:
* establish connection
* get url_1
* readresponse url_1
* close connection

to this:
* establish connection
* get url_1
* readresponse url_1
* get url_2
* readresponse url_2
* ...
* get url_n
* readresponse url_n
* close connection
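To make the proposed flow concrete, here is a minimal sketch using
Python's http.client over a throwaway local server (the host, port,
and paths are invented for illustration). Strictly speaking this is
sequential requests over one persistent connection, which is what
http.client supports; true HTTP/1.1 pipelining would queue several
GETs before reading any responses, but the connection-reuse win is
the same:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Tiny local server standing in for a pkg depot (hypothetical paths).
class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # enables persistent connections
    def do_GET(self):
        body = ("contents of %s" % self.path).encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One connection, many request/response pairs -- the proposed flow.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
bodies = []
for path in ("/url_1", "/url_2", "/url_n"):
    conn.request("GET", path)       # get url_i
    resp = conn.getresponse()       # readresponse url_i
    bodies.append(resp.read())      # drain before the next request
conn.close()
server.shutdown()
print(bodies)
```

The key requirement is that each response is fully read before the
next request goes out, otherwise the connection can't be reused.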

Any thoughts on this approach vs. the current one?

Are there certain advantages to a tar stream I might have missed?

Cheers,
-- 
Shawn Walker

"To err is human -- and to blame it on a computer is even more so." -
Robert Orben
_______________________________________________
pkg-discuss mailing list
[email protected]
http://mail.opensolaris.org/mailman/listinfo/pkg-discuss