Battle for bandwidth as P2P goes mainstream

Just how the conflict plays out will determine whether P2P can
realise its potential to deliver high-quality video and software
direct to our PCs

Anil Ananthaswamy

WHEN Microsoft released the eagerly awaited Xbox 360 game Halo 3
last month, fans waited through the night outside stores to get
their hands on the first copies. How much more convenient it
would have been if the game had simply arrived on their computers
as soon as it was released. "If we had a delivery service, we
could deliver the content electronically and maybe offer a
discount," says Jin Li of Microsoft Research in Redmond,
Washington.

Unfortunately that wasn't possible. Microsoft's connections to the
internet would have been overwhelmed had they needed to send out
more than a million copies of the game. That could soon change if
the company decides to deliver games using a peer-to-peer (P2P)
delivery system, which alleviates such bandwidth burdens.

While the workings of P2P systems differ between applications, the
approach could go something like this: instead of every customer
downloading the game directly from Microsoft's servers, the
software would first be distributed to a small number of
computers. These "seed" computers would transmit the game to
purchasers, who would in turn pass the game to other purchasers,
or peers, all in a legal and accountable manner. Microsoft itself
would need far less bandwidth to deliver the software in this
manner than if everyone connected directly to its servers.
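
The arithmetic behind seeding can be pictured in a few lines of
code. In the hypothetical Python sketch below, the publisher
uploads the game only to a handful of seed machines, and every
computer holding a full copy passes it to one more peer per round;
the numbers and names are illustrative, not a description of
Microsoft's system.

# A minimal sketch (not Microsoft's actual system) of seed-based
# P2P distribution. Assumes each machine with a full copy uploads
# to one new peer per round; all figures are illustrative.

def rounds_to_reach(customers, initial_seeds=10, uploads_per_round=1):
    """Count copying rounds until every customer holds the file."""
    holders = initial_seeds      # copies uploaded directly by the publisher
    rounds = 0
    while holders < customers:
        # every current holder passes the file to `uploads_per_round` peers
        holders += holders * uploads_per_round
        rounds += 1
    return rounds

# One million purchasers served from just ten publisher-fed seeds:
print(rounds_to_reach(1_000_000))    # 17 rounds of peer-to-peer copying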

Microsoft is not alone. P2P networks were first popularised as the
technology behind music file-sharing network Napster. They now
look to be the future of high-quality content delivery. Warner
Brothers in the US is using the BitTorrent P2P system to deliver
video, the Canadian Broadcasting Corporation (CBC) is banking on
P2P software to deliver live TV, and universities are building
P2P systems to boost robustness at the core of the internet.

But while P2P applications remove the data bottleneck for the
organisation that originates the content, the surge in data
exchange between ordinary users' computers is consuming huge
swathes of internet bandwidth. The business models of the
internet service providers (ISPs) that supply that bandwidth have
yet to account for this growth in use. Feeling the pinch, some
are fighting back, and the way this plays out will determine
whether P2P can realise its potential in delivering high-quality
video and software directly to our PCs.

For most of us, most of the time, the internet operates according
to a "client-server" model. Each time you want to download a web
page, for example, an individual copy of that page is sent from
the web server to your computer. This has worked well for reading
news, accessing email, listening to radio and even viewing
low-quality video, since these applications require relatively
small amounts of data. But as the internet gears up to deliver
high-quality video and television, the client-server model is
beginning to creak.

Take the problem faced by the CBC. To upload content to users it
has to buy bandwidth, which can cost about $150,000 per year for
a 45 megabits/second "pipe". Under a client-server model, this
could stream high-quality video to up to 60 computers
simultaneously, so servicing the CBC's 6 million customers would
be prohibitively expensive, not to mention technically
challenging.
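
To see why, run the numbers quoted above. The short calculation
below assumes, for simplicity, that every customer tunes in at
once; the figures are the ones in this article, and the result is
only a back-of-the-envelope estimate.

# Rough cost of serving the CBC audience client-server style,
# using the figures quoted above and assuming everyone watches at once.
PIPE_MBPS = 45            # capacity of one leased "pipe"
PIPE_COST = 150_000       # dollars per year for that pipe
STREAMS_PER_PIPE = 60     # simultaneous high-quality streams it can carry
CUSTOMERS = 6_000_000

per_stream_mbps = PIPE_MBPS / STREAMS_PER_PIPE   # 0.75 Mbit/s per viewer
pipes_needed = CUSTOMERS / STREAMS_PER_PIPE      # 100,000 pipes
annual_cost = pipes_needed * PIPE_COST           # 15,000,000,000

print(per_stream_mbps, pipes_needed, f"${annual_cost:,.0f}")   # ... $15,000,000,000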

To get around these problems, Mohamed Hefeeda and colleagues at
Simon Fraser University in Burnaby, British Columbia, Canada, are,
like the researchers at Microsoft, building a P2P network - in
their case for the CBC. Great news for the broadcaster, but what
about the ISPs
that transport the content between peers?

Home computer users with broadband connections typically buy their
bandwidth from an ISP at a monthly flat rate. That connection
tends to lie unused most of the time: "The internet service
providers are counting on that," says Dan Wallach, a P2P expert
at Rice University in Houston, Texas. In contrast, P2P networks
are designed to squeeze every last drop of the network bandwidth
available to them. "Once you actually start using P2P networks,
you break the business model of the ISPs," says Wallach.

P2P data now accounts for 60 per cent of daytime internet traffic
and 90 per cent at night, according to Klaus Mochalski of German
internet-traffic management firm Ipoque, so it is a serious
problem for ISPs. Mobile bandwidth providers are especially
concerned, he says, because their networks are smaller.

Call to arms

So what options do ISPs have? Metering bandwidth and charging
users who exceed a certain limit is one option, but it is unpopular
with customers, who prefer to pay a flat rate. To conserve its
bandwidth, an ISP can also choose to cut a customer off if they
are generating levels of P2P-like traffic that exceed conditions
of fair use for a home broadband connection. Or they can restrict
the bandwidth available to that user - an action euphemistically
called "traffic shaping". The P2P software developers aren't
taking this sitting down, of course. "There is sort of an arms
race going on," says Wallach.

It used to be that P2P applications were easy to detect, because
they always used a particular "port" on the computer to
communicate. Later, P2P software developers got smarter,
constantly shifting the ports they used. In response, Sandvine,
Ipoque and other companies that produce programs to detect P2P
activity have resorted to something called "deep packet
inspection", a technique that allows them to examine the contents
of internet data packets and determine if they belong to a P2P
application. In retaliation, P2P software developers are trying
to ensure that the data they send does not show any
characteristic patterns, a technique called protocol obfuscation.

To make P2P packets even harder to detect, several P2P software
providers, including BitTorrent and eDonkey, have more recently
moved to total encryption of the packets, according to Mochalski.
"It's a cat-and-mouse game," he says.

While the battle rages, researchers are working on ways to
minimise P2P networks' impact on ISPs. Hefeeda's team, for
example, is looking to cut the financial burden on ISPs.
Peer-to-peer traffic mostly shuttles between a multitude of ISP
networks, and such transfers cost the ISPs money. By developing
"location aware" P2P applications that allow computers to
recognise which network they are on - and instruct them to share
data within their own network as much as possible - Hefeeda hopes
to lessen the cost to ISPs. "We try to find the local sender
closest to me, not just within the network, but also with the
least number of links," he says, keeping the load on the wider
internet to a minimum.
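
In code, the idea boils down to a preference order when choosing a
peer to download from. The sketch below is a hypothetical
illustration of that preference, not the CBC system itself: peers
on the requester's own ISP network win, and among those the one
reachable over the fewest links is picked.

# Hypothetical "location aware" peer selection: prefer peers on my own
# ISP's network, then the one reachable over the fewest links.
from typing import NamedTuple

class Peer(NamedTuple):
    name: str
    isp: str      # which ISP network the peer sits on
    hops: int     # rough distance in network links

def pick_sender(peers, my_isp):
    # False sorts before True, so same-ISP peers come first; ties by hops.
    return min(peers, key=lambda p: (p.isp != my_isp, p.hops))

peers = [
    Peer("far-away seed", "OtherNet", hops=12),
    Peer("neighbour",     "MyISP",    hops=2),
    Peer("same-city box", "MyISP",    hops=5),
]
print(pick_sender(peers, "MyISP"))   # -> the two-hop peer on the same ISP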

Li is researching another type of location-aware P2P network.
Instead of keeping traffic within an ISP's network, his P2P
solutions allow computers to exchange data only within a home or
office network. Once a computer within a network has downloaded a
software update or game from Microsoft, say, it only passes it to
computers on its network, bypassing the ISP entirely.

There should be a way to keep everyone happy. Content providers
want P2P because they can use it to deliver high-quality content
and charge extra for it. With such a burgeoning demand for their
services, ISPs will inevitably profit from the bandwidth
explosion - the devil lies in figuring out how to charge for it,
and how much.

Spread it around

Anil Ananthaswamy

P2P technology won't just deliver high-quality video, says Dan
Wallach at Rice University in Houston, Texas. "As P2P techniques
mature, you are going to see them used to implement core internet
services, or any of a variety of plumbing issues that make the
internet go."

One core service is the domain name system (DNS) or internet
"phonebook". It allows computers to find websites by translating
user-friendly addresses like www.newscientist.com to computer
addresses such as 81.144.183.95.

Currently these translations are stored by internet service
providers on relatively few servers. Now, researchers at
Princeton University have developed CoDNS, which stores copies of
the translations on PCs that are linked in a P2P network. If a
server is slow, the network uses these copies to find sites.
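
A hypothetical sketch of that fallback is below. The peer caches,
names and control flow are invented for illustration; this is not
the Princeton implementation, which among other things also times
out slow servers rather than waiting for them to fail.

import socket

# Translations previously resolved by other machines in the P2P network.
PEER_CACHES = {
    "peer-a": {"www.newscientist.com": "81.144.183.95"},
    "peer-b": {},
}

def cooperative_lookup(hostname):
    try:
        # The usual client-server path: ask the configured DNS server.
        return socket.gethostbyname(hostname)
    except OSError:
        # Local server unreachable: fall back to peers' cached answers.
        for cache in PEER_CACHES.values():
            if hostname in cache:
                return cache[hostname]
        raise

print(cooperative_lookup("www.newscientist.com"))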

CoDNS might also be more robust than relying on servers. When
Hurricane Isabel hit North Carolina in 2003, DNS servers at Duke
University went down, preventing the university's computers from
accessing the internet, except for a few that were running CoDNS.