Van Jacobson, one of the main proponents of multicast, wrote in 1995:

> "How to kill the Internet? Easy! Just invent the Web!"
> ftp://ftp.ee.lbl.gov/talks/vj-webflame.pdf

This reflects the web of the 1990s, which for the most part consisted of
static data. The caching problem, back then, was solved by HTTP proxies,
which every ISP provided and most people used to speed up their dial-up
web access. Today, nobody uses HTTP proxies for that purpose any more; in
a time when most information on the Internet is dynamically generated, the
issue has become obsolete. Multicast may, as you imply, have benefits in
some p2p distribution scenarios like BitTorrent, but only if the same data
is shared and simultaneously downloaded on a massive scale. It wouldn't
make a difference to the torrent download of a Stan Brakhage film from an
obscure film lovers' community tracker with maybe two seeds and three
peers.

In the end, it would mostly be big broadcasting stations profiting from IP
multicast, because they would have to pay much less for bandwidth, while
those packets would still generate the same transmission load on the rest
of the Internet and thus shift costs onto the users' ISPs. It would be
great for giants like Google because it would cut their bandwidth costs
when millions of people simultaneously watch Gangnam Style or a live
stream from the Oscar ceremony on YouTube. I still fail to see your
political argument.
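A rough back-of-envelope sketch of that asymmetry, with entirely made-up
numbers (the bitrate and audience size are illustrative assumptions, not
measurements):

```python
# Compare a broadcaster's egress bandwidth under unicast vs. multicast.
# Illustrative assumptions: one 4 Mbit/s video stream, one million
# simultaneous viewers.

stream_rate_mbps = 4           # assumed bitrate of a single stream
viewers = 1_000_000            # assumed simultaneous audience

# Unicast: the sender transmits one copy of the stream per viewer.
unicast_egress_mbps = stream_rate_mbps * viewers

# Multicast: the sender transmits a single copy; routers replicate it
# along the distribution tree toward the viewers.
multicast_egress_mbps = stream_rate_mbps

# Either way, each viewer's access ISP still delivers one full stream
# per subscriber on the last mile -- that load does not shrink.
isp_delivery_mbps = stream_rate_mbps * viewers

print(unicast_egress_mbps)     # sender-side load under unicast
print(multicast_egress_mbps)   # sender-side load under multicast
print(isp_delivery_mbps)       # aggregate last-mile load, unchanged
```

The sender's bill drops by a factor of a million, while the aggregate
delivery load on the access networks stays exactly the same.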


#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nett...@kein.org
