Re: RTT from NY to New Delhi?

2007-05-16 Thread Gian Constantine
Seems pretty damned reasonable to me considering the shortest  
distance between these two locations is a little less than 14,000  
kilometers. Given the speed of light through glass, a convoluted  
fiber path, quite a few O/E - E/O conversions (EDFAs will only get  
you so far), and several switches, a 350ms RTT is decent.


Removing the gear and assuming an impossibly direct fiber path, would  
still give an RTT of about 140ms.
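As a sanity check, that propagation-only floor can be sketched in a few lines (the ~14,000 km path and a refractive index of ~1.5 are the assumptions stated above, not measurements):

```python
# Propagation-only RTT floor for the ~14,000 km path discussed above.
# Assumes light in silica fiber travels at roughly c / 1.5
# (refractive index ~1.5); ignores gear, O/E conversions and routing.

C_VACUUM_KM_S = 299_792        # speed of light in vacuum, km/s
FIBER_INDEX = 1.5              # typical refractive index of fiber
distance_km = 14_000           # figure used in the post above

v_fiber_km_s = C_VACUUM_KM_S / FIBER_INDEX     # ~200,000 km/s
rtt_ms = 2 * distance_km / v_fiber_km_s * 1000

print(f"~{rtt_ms:.0f} ms")     # ~140 ms
```

Everything above that floor (the other ~210ms of the observed 350ms) is equipment, conversions, and path indirection.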


Gian Anthony Constantine


On May 16, 2007, at 10:59 AM, Marshall Eubanks wrote:




On May 16, 2007, at 10:35 AM, Tim Franklin wrote:



On Wed, May 16, 2007 2:20 pm, Joe Maimon wrote:


What should I expect?

I am seeing ~350 from a vendor provided mpls cloud to a site in

Sukhrali Chowk, Gurgaon, Haryana, India


Seems not-unreasonable.  I remember getting about 150ms or 250ms from
London to Gurgaon depending on whether we were on the straight-across
cable or the round-the-bottom cable.  (Sorry, both my geography and my
cable-names are hazy).


The best recent data I have is from Bangalore to Tyco Road in  
Virginia through VSNL and Cogent.


Here is a sample (this goes through San Jose) :

Mon Mar  5 05:26:21 EST 2007
from Bangalore through the VSNL network
--- 63.105.122.1 ping statistics ---
10 packets transmitted, 10 packets received, 0% packet loss
round-trip min/avg/max/stddev = 285.495/319.649/395.330/38.576 ms

370 ms seems a little high but not unreasonable.

Regards
Marshall Eubanks




Going east from NY, you'd add 70 or 80ms to that - and a quick look
suggests routes going west instead.  (Test from home to .IN NS goes
London - NY - West Coast - Singtel - India, for ~370ms)

It's starting to head a bit towards walkie-talkie mode for VoIP, but not
too bad other than that...

Regards,
Tim.








Re: Thoughts on increasing MTUs on the internet

2007-04-12 Thread Gian Constantine
I agree. The throughput gains are small. You're talking about the  
difference between a 4% header overhead and a 1% header overhead  
(for TCP).


One could argue a decreased pps impact on intermediate systems, but  
when factoring in the existing packet size distribution on the  
Internet and the perceived adjustment seen by a migration to 4470 MTU  
support, the gains remain small.


Development costs and the OpEx costs of implementation and support  
will, likely, always outweigh the gains.


Gian Anthony Constantine


On Apr 12, 2007, at 7:50 AM, Saku Ytti wrote:



On (2007-04-12 11:20 +0200), Iljitsch van Beijnum wrote:


What do you guys think about a mechanism that allows hosts and
routers on a subnet to automatically discover the MTU they can use
towards other systems on the same subnet, so that:
1. It's no longer necessary to limit the subnet MTU to that of the
least capable system

2. It's no longer necessary to manage 1500 byte+ MTUs manually


To me this sounds like adding complexity for a rather small pay-off. And
then we'd have to ask IXP people, would they enable this feature
if it was available? If so, why don't they offer a high MTU VLAN
today?
And in the end, the pay-off of a larger MTU is quite small; perhaps
some interrupts are saved, but I'm not sure how relevant that is
in poll() based NIC drivers. Of course a bigger pay-off
would be that users could use tunneling and still offer 1500
to the LAN.

IXP peeps, why are you not offering a high MTU VLAN option?
From my point of view, this is the biggest reason why we today
generally don't have a higher end-to-end MTU.
I know that some IXPs do, e.g. NetNod, but generally it's
not offered even though many users would opt to use it.

Thanks,
--
  ++ytti




Re: Thoughts on increasing MTUs on the internet

2007-04-12 Thread Gian Constantine
I did a rough, top-of-the-head calculation, with ~60 bytes of header  
(ETH, IP, TCP) into 1500 and 4470 (a mistake, on my part, not to use 9216).


I still think the cost outweighs the gain, though there are some  
reasonable arguments for the increase.
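For what it's worth, that back-of-the-envelope figure works out as follows (a sketch using the ~60-byte combined header assumed above; real header sizes vary with options):

```python
# Rough header overhead per full-size frame, using the ~60 bytes of
# combined ETH + IP + TCP headers assumed in the message above.

HEADER_BYTES = 60

for mtu in (1500, 4470, 9216):
    print(f"MTU {mtu}: ~{HEADER_BYTES / mtu * 100:.1f}% header overhead")

# MTU 1500: ~4.0% header overhead
# MTU 4470: ~1.3% header overhead
# MTU 9216: ~0.7% header overhead
```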


Gian Anthony Constantine


On Apr 12, 2007, at 12:07 PM, Saku Ytti wrote:



On (2007-04-12 16:28 +0200), Iljitsch van Beijnum wrote:


On 12-apr-2007, at 16:04, Gian Constantine wrote:


I agree. The throughput gains are small. You're talking about the
difference between a 4% header overhead and a 1% header overhead
(for TCP).


6% including ethernet overhead and assuming the very common TCP
timestamp option.


Out of curiosity how is this calculated?
[EMAIL PROTECTED] ~]% echo 1450/(1+7+6+6+2+1500+4+12)*100|bc -l
94.27828348504551365400
[EMAIL PROTECTED] ~]% echo 8950/(1+7+6+6+2+9000+4+12)*100|bc -l
99.02633325957070148200
[EMAIL PROTECTED] ~]%

I calculated less than 5% from 1500 to 9000, with ethernet and
adding TCP timestamp. What did I miss?

Or compared without tcp timestamp and 1500 to 4470.
[EMAIL PROTECTED] ~]% echo 1460/(1+7+6+6+2+1500+4+12)*100|bc -l
94.92847854356306892000
[EMAIL PROTECTED] ~]% echo 4410/(1+7+6+6+2+4470+4+12)*100|bc -l
97.82608695652173913000

Less than 3%.
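The bc pipelines above can be reproduced in a few lines; the per-frame constant 1+7+6+6+2+4+12 is the Ethernet overhead besides the payload (preamble+SFD, two MAC addresses, EtherType, FCS, inter-frame gap):

```python
# Reproduces the bc arithmetic above: application bytes as a percentage
# of total bytes on the wire for one full-size Ethernet frame.
# Per-frame overhead besides the payload: preamble+SFD (1+7), two MAC
# addresses (6+6), EtherType (2), FCS (4), inter-frame gap (12).

ETH_WIRE_OVERHEAD = 1 + 7 + 6 + 6 + 2 + 4 + 12   # 38 bytes per frame

def goodput_pct(app_bytes, mtu):
    """Percent of on-the-wire bytes that are application data."""
    return app_bytes / (mtu + ETH_WIRE_OVERHEAD) * 100

print(goodput_pct(1450, 1500))   # 94.278... (with TCP timestamps)
print(goodput_pct(8950, 9000))   # 99.026... (with TCP timestamps)
print(goodput_pct(1460, 1500))   # 94.928... (no timestamps)
print(goodput_pct(4410, 4470))   # 97.826...
```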

However, I don't think it's relevant if it's 1% or 10%; a bigger
benefit would be to give 1500 end-to-end, even with e.g. IPsec
to the office.

--
  ++ytti




Re: IPv6 Finally gets off the ground

2007-04-10 Thread Gian Constantine
Yes. Silly of you. I think you may have missed more than the singular  
reference.


This back and forth has little to do with morality and more to do  
with opinion.


Yet it begs the question: how moral is an argument of 'my opinion is  
superior to your opinion'?


Such a lashing of another's opinion under the pretense of removing  
someone from their lofty perch to restore equality is hardly equality  
at all.


Everyone is entitled to their opinion. Though, I doubt Mr. Yao was  
expressing his so strongly.


Gian Anthony Constantine


On Apr 10, 2007, at 1:35 PM, Patrick W. Gilmore wrote:



On Apr 10, 2007, at 1:24 PM, Joseph S D Yao wrote:


On Tue, Apr 10, 2007 at 12:10:59PM -0400, Patrick W. Gilmore wrote:
...

Second, who said v6 was the heights?  ...


My, aren't we serious?  Too serious to realize that satellites are a
little higher than I, at least, can reach.


Guess I missed that reference.  Silly of me.  Fine imagery.  Just  
like the stuff you can get for free if you use a v6 stack :)


As for being serious, I do believe you were the one who claimed v6  
was going into the gutter, and the depth.  Pot, kettle, black?   
Actually, you went beyond being serious by implying some type of  
moral superiority.


Which is fine; your packets can be morally superior to mine.

--
TTFN,
patrick





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-21 Thread Gian Constantine

Actually, I acknowledged the calculation mistake in a subsequent post.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 21, 2007, at 11:11 AM, Petri Helenius wrote:


Gian Constantine wrote:


I agree with you. From a consumer standpoint, a trickle or off- 
peak download model is the ideal low-impact solution to content  
delivery. And absolutely, a 500GB drive would almost be overkill  
on space for disposable content encoded in H.264. Excellent SD  
(480i) content can be achieved at ~1200 to 1500kbps, resulting in  
about a 1GB file for a 90 minute title. HD is almost out of the  
question for internet download, given good 720p at ~5500kbps,  
resulting in a 30GB file for a 90 minute title.


Kilobits, not bytes. So it's 3.7GB for 720p 90 minutes at 5.5Mbps.  
Regularly transferred over the internet.
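The corrected arithmetic, as a quick sketch (the divide-by-8 is the bits-to-bytes conversion that was missed in the original estimate):

```python
# File size of a constant-bitrate stream: kilobits/s x seconds,
# divided by 8 bits per byte.

def file_size_gb(bitrate_kbps, minutes):
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1e9

print(f"{file_size_gb(1500, 90):.1f} GB")   # SD at 1500kbps: 1.0 GB
print(f"{file_size_gb(5500, 90):.1f} GB")   # 720p at 5500kbps: 3.7 GB
```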
Popular content in the size category 2-4GB has tens of thousands  
and in some cases hundreds of thousands of downloads from a single  
tracker. Saying it's out of question does not make it go away.  
But denial is usually the first phase anyway.


Pete






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-15 Thread Gian Constantine
The problem with this all (or mostly) VoD model is the entrenched  
culture. In countries outside of the U.S. with smaller channel  
lineups, an all VoD model might be easier to migrate to over time. In  
the U.S., where we have 200+ channel lineups, consumers have become  
accustomed to the massive variety and instant gratification of a  
linear lineup. If you leave it to the customer to choose their  
programs, and then wait for them to arrive and be viewed, the instant  
gratification aspect is lost. This is important to consumers here.


While I do not think an all or mostly VoD model will work for  
consumers in U.S. in the near term (next 5 years), it may work in the  
long term (7-10 years). There are so many obstacles in the way from a  
business side of things, though.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 15, 2007, at 9:31 AM, Joe Abley wrote:




On 15-Jan-2007, at 08:48, Michal Krsek wrote:

This system works perfectly in our linear-line distribution  
(channels). As a user you can choose the time you want to see the show,  
but not the show itself. Capacity on a PVR device is finite, and if  
you don't want to waste the space with any broadcasted content you  
have to program the device. I have ten channels in my cable TV and  
sometimes I'm confused about what to record. Being in the US and paying  
for ~100 channels would make me mad crawling channel schedules :-)


So the technology is nice, but not a "what you want is what you  
get." So you cannot address the long tail using this technology.


These are all UI details.

The (Scientific Atlanta, I think) PVRs that Rogers Cable gives  
subscribers here in Ontario let you specify the *names* of shows  
that you like, rather than selecting specific channels and times; I  
seem to think you can also tell it to automatically ditch old  
recorded material when disk space becomes low.


One thing that may not be obvious to people who haven't had the  
misfortune of consuming it at first hand is that North American TV,
awash with channels as it is, contains a lot of duplicated content.  
The same episode of the same show might be broadcast tens of times  
per week; the same advertisement might be broadcast tens of times  
per hour.


How much more programming would the existing networks support if  
they were able to reduce those retransmissions, relying on the  
ubiquity of set-top boxes with PVR functionality?



Joe





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-13 Thread Gian Constantine
The cable companies have been chomping at the bit for unbundled  
channels for years, so have consumers. The content providers will  
never let it happen. Their claim is the popular channels support the  
diversity of not-so-popular channels. Apparently, production costs  
are high all around (not surprising) and most channels do not support  
themselves entirely.


The MSOs have had a la carte on their Santa wish list for years and  
the content providers do not believe in Santa Claus. :-) They believe  
in Benjamin Franklin...lots and lots of Benjamin Franklin.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 13, 2007, at 7:14 AM, Marshall Eubanks wrote:

In the USA at least, the cable companies make you pay for bundles  
to get channels you want. I have to pay for
3 bundles to get 2 channels we actually want to watch. (One of  
these bundles is apparently only sold if you are already getting  
another, which we don't actually care about.) So, it actually costs
us $ 40 + / month to get the two channels we want (plus a bunch we  
don't.) So, it occurs to me that there is a business selling solo  
channels on the Internet, as is, with the ads, for order $ 5 - $ 10  
per subscriber per month, which should leave a substantial profit  
after the payments to the networks and bandwidth costs.




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Gian Constantine

Yes, the NCTC.

I have spoken with two of the vendors you mentioned. Neither have  
pass-through licensing rights. I still have to go directly to most of  
the content providers to get the proper licensing rights.


There are a few vendors out there who will help a company attain  
these rights, but the solution is not turnkey on licensing. To be  
clear, it is not turnkey for the major U.S. content providers.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 12, 2007, at 10:14 AM, Frank Bulk wrote:

You mean the NCTC?  Yes, they did close their doors to new membership, but
there are regional head ends, representing a larger number of ITCs, that
have been able to directly negotiate with the content providers.

And then there are the turnkey vendors: IPTV Americas, SES Americom's IP-PRIME,
and Falcon Communications.

It's not entirely impossible.

Frank



From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On  
Behalf Of Gian

Constantine
Sent: Wednesday, January 10, 2007 7:47 AM
To: [EMAIL PROTECTED]
Cc: Marshall Eubanks; nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day,  
continuously?



Many of the small carriers, who are doing IPTV in the U.S., have  
acquired
their content rights through a consortium, which has since closed  
its doors

to new membership.

I cannot stress this enough: content is the key to a good industry- 
changing

business model. Broad appeal content will gain broad interest. Broad
interest will change the playing field and compel content providers to
consider alternative consumption/delivery models.

The ILECs are going to do it. They have deep pockets. Look at how  
quickly

they were able to get franchising laws adjusted to allow them to offer
video.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Gian Constantine
I am pretty sure we are not becoming a VoD world. Linear programming  
is much better for advertisers. I do not think content providers, nor  
consumers, would prefer a VoD only service. A handful of consumers  
would love it, but many would not.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 12, 2007, at 10:05 AM, Frank Bulk wrote:



If we're becoming a VOD world, does multicast play any practical role in
video distribution?

Frank

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On  
Behalf Of

Michal Krsek
Sent: Wednesday, January 10, 2007 2:28 AM
To: Marshall Eubanks
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day,  
continuously?



Hi Marshall,


- the largest channel has 1.8% of the audience
- 50% of the audience is in the largest 2700 channels
- the least watched channel has ~ 10 simultaneous viewers
- the multicast bandwidth usage would be 3% of the unicast.


I'm a bit skeptical about the future of channels. For making money from the
long tail, you have to adapt your distribution to users' needs. It is not
only format, codec ... but also time frame. You can organise your programs
in channels, but they will not run simultaneously for all the users. I want
to control my TV; I don't want my TV to jockey my life.

For the distribution, you as content owner have to help the ISP find the
right way to distribute your content. For example: having a distribution center
in a Tier 1 ISP's network will make money from Tier 2 ISPs connected directly to
that Tier 1. Probably, having a CDN (your own or a paid service) will be the only
way for large scale non-synchronous programming.

Regards
Michal






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-12 Thread Gian Constantine
I have spoken with a colleague in the industry regarding 4com.  
Apparently, they have been able to acquire some sort of pass-through  
licensing on much of the content, but I have not spoken directly with  
4com. I heard the same of Broadstream and SES Americom, but both  
proved to be more of an aid in acquisition, and not outright pass- 
through rights.


VoD is one of the main drivers, along with HD, but neither is a full  
service alone. Consumers will demand linear programming. They have  
become accustomed to it. More importantly, the advertisers have  
become accustomed to it.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 12, 2007, at 5:29 PM, Michael Painter wrote:


- Original Message - From: Gian Constantine
Sent: Friday, January 12, 2007 5:24 AM
Subject: Re: Network end users to pull down 2 gigabytes a day,  
continuously?



Yes, the NCTC.
I have spoken with two of the vendors you mentioned. Neither have  
pass-through licensing rights. I still have to go directly to most  
of the content providers to get the proper licensing rights.
There are a few vendors out there who will help a company attain  
these rights, but the solution is not turnkey on licensing. To be  
clear, it is not turnkey for the major U.S. content providers.


Back in the 'day', these folks were great to work with, but I have  
no idea of how they would deal with IPTV.

http://www.4com.com/Company-Profile.html

Btw, I thought VoD was one of the main drivers of IPTV, at the  
local level at least.


--Michael







Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine
Many of the small carriers, who are doing IPTV in the U.S., have  
acquired their content rights through a consortium, which has since  
closed its doors to new membership.


I cannot stress this enough: content is the key to a good industry- 
changing business model. Broad appeal content will gain broad  
interest. Broad interest will change the playing field and compel  
content providers to consider alternative consumption/delivery models.


The ILECs are going to do it. They have deep pockets. Look at how  
quickly they were able to get franchising laws adjusted to allow them  
to offer video.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 10, 2007, at 2:30 AM, Christian Kuhtz wrote:


Marshall,

I completely agree, and due diligence on business models will show  
that fact very clearly.  And nothing much has changed here in terms  
of substance over the last 4+ yrs either.  Costs and opportunities  
have changed or evolved rather, but not the mechanics.


Infrastructure capital is very much the gating factor in every  
major video distribution infrastructure (and the reason why DOCSIS  
3.0 is just such a neato thing).  The carriage deals are merely  
table stakes, and that doesn't mean they're easy.  They are  
obtainable.


And some business models are just fundamentally broken.

As examples of infrastructure costs, the size of CSAs or the cost of  
upgrading CPE is a far bigger deal than carriage.  And if you can't  
get into RTs in an ILEC colo arrangement, that doesn't per se  
globally invalidate business models, but rather imposes unique  
challenges and limitations on a given specific business model.


What has changed is that ppl are actually 'doing it'.  And that  
proves that several models are viable for funding in all sorts of  
flavors and risks.


IPTV is fundamentally subject to the analog fallacies of VoIP  
replacing 1FR/1BR service on a 1:1 basis (toll arbitrage or anomalies  
aside).  There seems to be plenty of that.  A new IP service  
offering no unique features over specialized and depreciated  
infrastructure will not be viable until commoditized, and not at an  
early maturity level like where IPTV is today.


Unless an IPTV service offers a compelling cost advantage, mass  
adoption will not occur.  And any cost increase will have to be  
justifiable to consumers, and that cannot be underestimated.


But some just continue to ignore those fundamentals, and those  
business models will fail.  And we should be thankful for that  
self-cleansing action of a functioning market.


Enough rambling after a long day at CES, I suppose.  Thanks for  
reading this far.


Best regards,
Christian

--
Sent from my BlackBerry.

-Original Message-
From: Marshall Eubanks [EMAIL PROTECTED]
Date: Wed, 10 Jan 2007 01:52:06
To:Gian Constantine [EMAIL PROTECTED]
Cc:Bora Akyol [EMAIL PROTECTED],Simon Lockhart  
[EMAIL PROTECTED], [EMAIL PROTECTED],nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day,  
continuously?




On Jan 9, 2007, at 8:40 PM, Gian Constantine wrote:


It would not be any easier. The negotiations are very complex. The
issue is not one of infrastructure capex. It is one of jockeying
between content providers (big media conglomerates) and the video
service providers (cable companies).


Not necessarily. Depends on your business model.

Regards
Marshall



Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 9, 2007, at 7:57 PM, Bora Akyol wrote:



Simon

An additional point to consider is that it takes a lot of effort and
money to get a channel allocated to your content in a cable network.

This is much easier when TV is being distributed over the Internet.



-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Simon Lockhart
Sent: Tuesday, January 09, 2007 2:42 PM
To: [EMAIL PROTECTED]
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?


On Tue Jan 09, 2007 at 07:52:02AM +,
[EMAIL PROTECTED] wrote:

Given that the broadcast model for streaming content
is so successful, why would you want to use the
Internet for it? What is the benefit?


How many channels can you get on your (terrestrial) broadcast
receiver?

If you want more, your choices are satellite or cable. To get
cable, you
need to be in a cable area. To get satellite, you need to
stick a dish on
the side of your house, which you may not want to do, or may
not be allowed
to do.

With IPTV, you just need a phoneline (and be close enough to
the exchange/CO
to get decent xDSL rate). In the UK, I'm already delivering
40+ channels over
IPTV (over inter-provider multicast, to any UK ISP that wants it).

Simon












Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine

Ah-ha. You are mistaken. :-)

My focus is next-gen broadband and video. The wifi guys have their  
own department.


Good try, though. :-)

Personally, I am against the peer-to-peer method for business  
reasons, not technical ones. It will be difficult to get blessed by  
the content providers and painful to support (high opex).


I have confidence in creative engineers. I am sure any one of us  
could come up with a workable solution for P2P given the time and  
proper motivation.


All in all, P2P is really limited to a VoD model. It is hard to say  
whether or not VoD would ever become so important a service over  
the Internet as to press content providers into agreement.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 10, 2007, at 3:25 AM, Frank Coluccio wrote:



Gian wrote:

From a big picture standpoint, I would say P2P distribution is a  
non-starter,
too many reluctant parties to appease. From a detail standpoint, I  
would say P2P
distribution faces too many hurdles in existing network  
infrastructure to be
justified. Simply reference the discussion of upstream bandwidth  
caps and you

will have a wonderful example of those hurdles.

Speaking about upstream hurdles, just out of curiosity (since this  
is merely a
diversionary discussion at this point;) ...  wouldn't peer-to-peer  
be the LEAST
desirable approach for an SP that is launching WiFi nets as its  
primary first
mile platform? I note that Earthlink is launching a number of  
cityscale WiFi nets
as we speak, which is why I'm asking. Has this in any way, even  
subliminally,

been influential in the shaping of your opinions about P2P for content
distribution? I know that it would affect my views, a whole lot,  
since the
prospects for WiFi's shared upstream capabilities to improve are  
slim to none in
the short to intermediate terms. Whereas, CM and FTTx are known to  
raise their
down and up offerings periodically, gated only by their usual game  
of chicken

where each watches to see who'll be first.

Frank

On Mon Jan  8 22:26 , Gian Constantine  sent:

My contention is simple. The content providers will not allow P2P video as a
legal commercial service anytime in the near future. Furthermore, most ISPs are
going to side with the content providers on this one. Therefore, discussing it at
this point in time is purely academic, or more so, diversionary.

Personally, I am not one for throttling high use subscribers. Outside of the
fine print, which no one reads, they were sold a service of X kbps down and
Y kbps up. I could not care less how, when, or how often they use it. If you
paid for it, burn it up.

I have questions as to whether or not P2P video is really a smart distribution
method for a service provider who controls the access medium. Outside of being
a service provider, I think the economic model is weak, when there can be
little expectation of a large scale take rate.

Ultimately, my answer is: we're not there yet. The infrastructure isn't there.
The content providers aren't there. The market isn't there. The product needs
a motivator. This discussion has been putting the cart before the horse.

A lot of big picture pieces are completely overlooked. We fail to question
whether or not P2P sharing is a good method of delivering the product. There
are a lot of factors which play into this. Unfortunately, more interest has
been paid to the details of this delivery method than has been paid to whether
or not the method is even worthwhile.

From a big picture standpoint, I would say P2P distribution is a non-starter,
too many reluctant parties to appease. From a detail standpoint, I would say
P2P distribution faces too many hurdles in existing network infrastructure to
be justified. Simply reference the discussion of upstream bandwidth caps and
you will have a wonderful example of those hurdles.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 8, 2007, at 9:49 PM, Thomas Leavitt wrote:

So, kind of back to the original question: what is going to be the reaction of
your average service provider to the presence of an increasing number of
people sucking down massive amounts of video and spitting it back out again...
nothing? throttling all traffic of a certain type? shutting down customers who
exceed certain thresholds? or just throttling their traffic? massive upgrades
of internal network hardware?

Is it your contention that there's no economic model, given the architecture
of current networks, which would generate enough revenue to offset the cost of
traffic generated by P2P video?

Thomas

Gian Constantine wrote: There may have been a disconnect on my part, or at
least, a failure to disclose my position. I am looking at things from a
provider standpoint, whether as an ISP or a strict video service provider.

I agree with you. From a consumer standpoint, a trickle or off-peak download
model

Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine

All H.264?

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 10, 2007, at 4:41 AM, Richard Naylor wrote:



At 08:40 p.m. 9/01/2007 -0500, Gian Constantine wrote:
It would not be any easier. The negotiations are very complex. The  
issue is not one of infrastructure capex. It is one of jockeying  
between content providers (big media conglomerates) and the video  
service providers (cable companies).


We're seeing a degree of co-operation in this area. It's being  
driven by the market - see below.


snip
On Jan 9, 2007, at 7:57 PM, Bora Akyol wrote:
An additional point to consider is that it takes a lot of effort and
money to get a channel allocated to your content in a cable network.

This is much easier when TV is being distributed over the Internet.


The other bigger driver, is that for most broadcasters (both TV and  
Radio), advertising revenues are flat, *except* in the on-line  
area. So they are chasing on-line growth like crazy. Typically on- 
line revenues now make up around 25% of income.


So broadcasters are reacting and developing quite large systems for  
delivering content both new and old. We're seeing these as a  
mixture of live streams, on-demand streams, on-demand downloads and  
torrents. Basically, anything that works and is reliable and can be  
scaled. (we already do geographic distribution and anycast routing).


And the broadcasters won't pay flash transit charges. They are  
doing this stuff from within existing budgets. They will put  
servers in different countries if it makes financial sense. We have  
servers in the USA, and their biggest load is non-peering NZ-based  
ISPs.


And broadcasters aren't the only source of large content. My  
estimate is that they are only 25% of the source. Somewhere last  
year I heard John Chambers say that many corporates are seeing 500%  
growth in LAN traffic - fueled by video.


We do outside webcasting - to give you an idea of traffic, when we  
get a fiber connex, we allow for 6GBytes per day between an encoder  
and the server network - per programme. We often produce several  
different programmes from a site in different languages etc. Each  
one is 6GB. If we don't have fiber, it scales down to about 2GB per  
programme. (on fiber we crank out a full 2Mbps Standard Def stream,  
on satellite we only get 2Mbps per link). I have a chart by my  
phone that gives the minute/hour/day/month traffic impact of a  
whole range of streams and refer to it every day. Oh - we can do  
1080i on demand and can and do produce content in that format.  
They're 8Mbps streams. Not many viewers tho :-)   We're close to  
being able to webcast it live.
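A minimal version of the minute/hour/day traffic chart described above might look like this (a sketch assuming a constant-bitrate stream; real encoder output varies):

```python
# Traffic generated by a constant-bitrate stream over a given period,
# the kind of per-stream figure the chart above tabulates.

def stream_gb(bitrate_mbps, seconds):
    return bitrate_mbps * 1e6 * seconds / 8 / 1e9

for label, secs in (("minute", 60), ("hour", 3600), ("day", 86400)):
    print(f"2 Mbps per {label}: {stream_gb(2, secs):.3f} GB")

# 2 Mbps per minute: 0.015 GB
# 2 Mbps per hour: 0.900 GB
# 2 Mbps per day: 21.600 GB  (so ~6 GB covers roughly a 6-7 hour
# production day at 2 Mbps)
```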


We currently handle 50+ radio stations and 12 TV stations, serving  
around 1.5 to 2 million players a month, in a country with a  
population of 4 million. But then my stats could be lying..


Rich
(long time lurker)






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-10 Thread Gian Constantine
Sounds a little like low buffering and sparse I-frames, but I'm no  
MPEG expert. :-)


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 10, 2007, at 5:42 AM, Mikael Abrahamsson wrote:



On Tue, 9 Jan 2007, [EMAIL PROTECTED] wrote:

between handling 30K unicast streams, and 30K multicast streams  
that each have only one or at most 2-3 viewers?


My opinion on the downside of video multicast is that if you want  
it realtime, your SLA figures on acceptable packet loss go down  
from fractions of a percent into the thousandths of a percent, at  
least with current implementations of video.


Imagine internet multicast and having customers complain about bad  
video quality, and trying to chase down that last 1/10 packet  
loss that makes people's video pixelate every 20-30 minutes, when the  
video stream doesn't even originate in your network?


For multicast video to be easier to implement we need more robust  
video codecs that can handle jitter and packet loss that are  
currently present in networks and handled acceptably by TCP for  
unicast.


--
Mikael Abrahamssonemail: [EMAIL PROTECTED]




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
I am not sure what I was thinking. Mr. Bonomi was kind enough to point  
out a failed calculation for me. Obviously, an HD file would only be  
about 3.7GB for a 90 minute file at 5500kbps. In my haste, I  
neglected to convert bits to bytes. My apologies.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 9:07 PM, Gian Constantine wrote:

There may have been a disconnect on my part, or at least, a failure  
to disclose my position. I am looking at things from a provider  
standpoint, whether as an ISP or a strict video service provider.


I agree with you. From a consumer standpoint, a trickle or off-peak  
download model is the ideal low-impact solution to content  
delivery. And absolutely, a 500GB drive would almost be overkill on  
space for disposable content encoded in H.264. Excellent SD (480i)  
content can be achieved at ~1200 to 1500kbps, resulting in about a  
1GB file for a 90 minute title. HD is almost out of the question  
for internet download, given good 720p at ~5500kbps, resulting in a  
30GB file for a 90 minute title.


Service providers wishing to provide this service to their  
customers may see some success where they control the access medium  
(copper loop, coax, FTTH). Offering such a service to customers  
outside of this scope would prove very expensive, and likely, would  
never see a return on the investment without extensive peering  
arrangements. Even then, distribution rights would be very  
difficult to attain without very deep pockets and crippling revenue  
sharing. The studios really dislike the idea of transmission  
outside of a closed network. Don't forget. Even the titles you  
mentioned are still owned by very large companies interested in  
squeezing every possible dime from their assets. They would not be  
cheap to acquire.


Further, torrent-like distribution is a long long way away from  
sign off by the content providers. They see torrents as the number  
one tool of content piracy. This is a major reason I see the  
discussion of tripping upstream usage limits through content  
distribution as moot.


I am with you on the vision of massive content libraries at the  
fingertips of all, but I see many roadblocks in the way. And,  
almost none of them are technical in nature.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:



Please see my comments inline:


-Original Message-
From: Gian Constantine [mailto:[EMAIL PROTECTED]
Sent: Monday, January 08, 2007 4:27 PM
To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?


snip


I would also argue storage and distribution costs are not
asymptotically zero with scale. Well designed SANs are not
cheap. Well designed distribution systems are not cheap.
While price does decrease when scaled upwards, the cost of
such an operation remains hefty, and increases with additions
to the offered content library and a swelling of demand for
this content. I believe the graph becomes neither asymptotic,
nor anywhere near zero.


To the end user, there is no cost to downloading videos when they are
sleeping.
I would argue that other than sports (and some news) events, there is  
pretty much no content that needs to be real time. What the  
downloading (possibly 24x7) does is to stress the ISP network to its  
max, since the assumptions of statistical multiplexing go out the  
window. Think of a Tivo that downloads content off the Internet 24x7.

The user is still paying for only what they pay each month, and this  
is network neutrality 2.0 all over again.



You are correct on the long tail nature of music. But music
is not consumed in a similar manner as TV and movies.
Television and movies involve a little more commitment and
attention. Music is more for the moment and the mood. There
is an immediacy with music consumption. Movies and television
require a slight degree more patience from the consumer. The
freshness (debatable :-) ) of new release movies and TV can
often command the required patience from the consumer. Older
content rarely has the same pull.


I would argue against your distinction between visual and auditory  
content. There is a lot of content out there that a lot of people  
watch, and the content is 20-40+ years old. Think Brady Bunch,  
Bonanza, or archived games from NFL, MLB, etc. What about Smurfs (for  
those of us with kids)?

This is only the beginning.

If I can get a 500GB box and download MP4 content, that's a lot of
essentially free storage.

Coming back to NANOG content, I think video (not streamed but  
multi-path distributed video) is going to bring the networks down not  
by sheer bandwidth alone but by challenging the assumptions behind  
the engineering

Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
Those numbers are reasonably accurate for some networks at certain  
times. There is often a back and forth between BitTorrent and NNTP  
traffic. Many ISPs regulate BitTorrent traffic for this very reason.  
Massive increases in this type of traffic would not be looked upon  
favorably.


If you considered my previous posts, you would know I agree streaming  
is scary on a large scale, but unicast streaming is what I reference.  
Multicast streaming is the real solution. Ultimately, a global  
multicast network is the only way to deliver these services to a  
large market.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 11:01 AM, Joe Abley wrote:




On 8-Jan-2007, at 22:26, Gian Constantine wrote:

My contention is simple. The content providers will not allow P2P  
video as a legal commercial service anytime in the near future.  
Furthermore, most ISPs are going to side with the content  
providers on this one. Therefore, discussing it at this point in  
time is purely academic, or more so, diversionary.


There are some ISPs in North America who tell me that something  
like 80% of their traffic *today* is BitTorrent. I don't know how  
accurate their numbers are, or whether those ISPs form a  
representative sample, but it certainly seems possible that the  
traffic exists regardless of the legality of the distribution.


If the traffic is real, and growing, the question is neither  
academic nor diversionary.


However, if we close our eyes and accept for a minute that P2P  
video isn't happening, and all growth in video over the Internet  
will be in real-time streaming, then I think the future looks a lot  
more scary. When TSN.CA streamed the World Junior Hockey  
Championship final via Akamai last Friday, there were several ISPs  
in Toronto who saw their transit traffic *double* during the game.



Joe





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
You are correct. Today, IP multicast is limited to a few small closed  
networks. If we ever migrate to IPv6, this would instantly change.  
One of my previous assertions was the possibility of streaming video  
as the major motivator of IPv6 migration. Without it, video streaming  
to a large market, outside of multicasting in a closed network, is  
not scalable, and therefore, not feasible. Unicast streaming is a  
short-term bandwidth-hogging solution without a future at high take  
rates.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 11:47 AM, Joe Abley wrote:



On 9-Jan-2007, at 11:29, Gian Constantine wrote:

Those numbers are reasonably accurate for some networks at certain  
times. There is often a back and forth between BitTorrent and NNTP  
traffic. Many ISPs regulate BitTorrent traffic for this very  
reason. Massive increases in this type of traffic would not be  
looked upon favorably.


The act of regulating p2p traffic is a bit like playing  
whack-a-mole. At what point does it cost more to play that game than  
it costs to build out to carry the traffic?


If you considered my previous posts, you would know I agree  
streaming is scary on a large scale, but unicast streaming is what  
I reference. Multicast streaming is the real solution. Ultimately,  
a global multicast network is the only way to deliver these  
services to a large market.


The trouble with IP multicast is that it doesn't exist, in a  
wide-scale, deployed, inter-provider sense.



Joe





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
The available address space for multicast in IPv4 is limited. IPv6  
vastly expands this space. And here, I may have been guilty of  
putting the cart before the horse. Inter-AS multicast does not exist  
today because the motivators are not there. It is absolutely  
possible, but providers have to want to do it. Consumers need to see  
some benefit from it. Again, the benefit needs to be seen by a large  
market. Providers make decisions in the interest of their bottom  
line. A niche service is not a motivator for inter-AS multicast. If  
demand for variety in service provider selection grows with the  
proliferation of IPTV, we may see the required motivation for  
inter-AS multicast, which places us in a position to move to the  
large multicast space available in IPv6.
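The "vastly expands" claim is simple arithmetic. A sketch comparing the IPv4 multicast range (224.0.0.0/4) with the IPv6 multicast range (ff00::/8):

```python
# Group-address counts behind the IPv4 vs IPv6 multicast comparison.

ipv4_groups = 2 ** (32 - 4)     # a /4 of a 32-bit space -> 2^28 addresses
ipv6_groups = 2 ** (128 - 8)    # a /8 of a 128-bit space -> 2^120 addresses

print(f"IPv4 multicast addresses: 2^28 = {ipv4_groups:,}")
print(f"IPv6 multicast addresses: 2^120 (~{ipv6_groups:.2e})")
```

The raw address count was never the only obstacle to inter-AS multicast, as the rest of the thread argues, but the spaces do differ by 92 bits.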


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 9, 2007, at 1:09 PM, Joe Abley wrote:



On 9-Jan-2007, at 13:04, Gian Constantine wrote:

You are correct. Today, IP multicast is limited to a few small  
closed networks. If we ever migrate to IPv6, this would instantly  
change. One of my previous assertions was the possibility of  
streaming video as the major motivator of IPv6 migration. Without  
it, video streaming to a large market, outside of multicasting in  
a closed network, is not scalable, and therefore, not feasible.  
Unicast streaming is a short-term bandwidth-hogging solution  
without a future at high take rates.


So you are of the opinion that inter-domain multicast doesn't exist  
today for technical reasons, and those technical reasons are fixed  
in IPv6?



Joe





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
This is a little presumptuous on my part, but what other reason would  
motivate a migration to IPv6? I fail to see us running out of unicast  
addresses any time soon. I have been hearing IPv6 is coming for many  
years now. I think video service is really the only motivation for  
migrating.


I am wrong on plenty of things. This may very well be one of them. :-)

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 9, 2007, at 1:21 PM, Marshall Eubanks wrote:




On Jan 9, 2007, at 1:04 PM, Gian Constantine wrote:

You are correct. Today, IP multicast is limited to a few small  
closed networks. If we ever migrate to IPv6, this would instantly  
change.


I am curious. Why do you think that ?

Regards
Marshall

One of my previous assertions was the possibility of streaming  
video as the major motivator of IPv6 migration. Without it, video  
streaming to a large market, outside of multicasting in a closed  
network, is not scalable, and therefore, not feasible. Unicast  
streaming is a short-term bandwidth-hogging solution without a  
future at high take rates.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 11:47 AM, Joe Abley wrote:



On 9-Jan-2007, at 11:29, Gian Constantine wrote:

Those numbers are reasonably accurate for some networks at  
certain times. There is often a back and forth between  
BitTorrent and NNTP traffic. Many ISPs regulate BitTorrent  
traffic for this very reason. Massive increases in this type of  
traffic would not be looked upon favorably.


The act of regulating p2p traffic is a bit like playing  
whack-a-mole. At what point does it cost more to play that game  
than it costs to build out to carry the traffic?


If you considered my previous posts, you would know I agree  
streaming is scary on a large scale, but unicast streaming is  
what I reference. Multicast streaming is the real solution.  
Ultimately, a global multicast network is the only way to  
deliver these services to a large market.


The trouble with IP multicast is that it doesn't exist, in a  
wide-scale, deployed, inter-provider sense.



Joe









Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine

Fair enough. :-)

Nearly everything has a time and place, though.

Pretty much everything on this thread is speculative.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 9, 2007, at 2:13 PM, John Kristoff wrote:



On Tue, 9 Jan 2007 13:21:38 -0500
Marshall Eubanks [EMAIL PROTECTED] wrote:


You are correct. Today, IP multicast is limited to a few small
closed networks. If we ever migrate to IPv6, this would instantly
change.


I am curious. Why do you think that ?


I could have said the same thing, but with an opposite end meaning.
You take one 10+ year technology with minimal deployment and put it
on top of another 10+ year technology also far from being widely
deployed and you end up with something quickly approaching zero
deployment, instantly.  :-)

John




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
It would not be any easier. The negotiations are very complex. The  
issue is not one of infrastructure capex. It is one of jockeying  
between content providers (big media conglomerates) and the video  
service providers (cable companies).


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 9, 2007, at 7:57 PM, Bora Akyol wrote:



Simon

An additional point to consider is that it takes a lot of effort and  
money to get a channel allocated to your content in a cable network.

This is much easier when TV is being distributed over the Internet.



-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Simon Lockhart
Sent: Tuesday, January 09, 2007 2:42 PM
To: [EMAIL PROTECTED]
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?


On Tue Jan 09, 2007 at 07:52:02AM +,
[EMAIL PROTECTED] wrote:

Given that the broadcast model for streaming content
is so successful, why would you want to use the
Internet for it? What is the benefit?


How many channels can you get on your (terrestrial) broadcast
receiver?

If you want more, your choices are satellite or cable. To get
cable, you
need to be in a cable area. To get satellite, you need to
stick a dish on
the side of your house, which you may not want to do, or may
not be allowed
to do.

With IPTV, you just need a phoneline (and be close enough to
the exchange/CO
to get decent xDSL rate). In the UK, I'm already delivering
40+ channels over
IPTV (over inter-provider multicast, to any UK ISP that wants it).

Simon








Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-09 Thread Gian Constantine
There you go. SSM would be a great solution. Who the hell supports  
it, though?


We still get back to the issue of large scale market acceptance. High  
take rate will be limited to the more popular channels, which are run  
by large media conglomerates, who are reluctant to let streams out of  
a closed network.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 10, 2007, at 12:08 AM, Sean Donelan wrote:



On Tue, 9 Jan 2007, [EMAIL PROTECTED] wrote:
Multicast streaming may be a big win when you're only streaming the  
top 5 or 10 networks (for some value of 5 or 10). What are the  
performance characteristics if you have 300K customers, and at any  
given time, 10% are watching something from the long tail - what's  
the difference between handling 30K unicast streams, and 30K  
multicast streams that each have only one or at most 2-3 viewers?


1/2, 1/3, etc. the bandwidth for each additional viewer of the same  
stream? The worst case for a multicast stream is the same as the  
unicast stream, but the unicast stream is always the worst case.


Multicast doesn't have to be real-time. If you collect interested  
subscribers over a longer time period, e.g. scheduled downloads  
over the next hour, day, week, month, you can aggregate more  
multicast receivers through the same stream. TiVo collects its  
content using a broadcast schedule.

A long tail distribution includes not only the tail, but also the  
head. 30K unicast streams may be the same as 30K multicast streams,  
but 30K multicast streams is a lot better than 300,000 unicast  
streams. Although the long tail streams may have 1, 2, 3 receivers  
of a stream, the Pareto curve also has 1, 2, 3 streams with 50K,  
25K, 12K receivers.
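The head-vs-tail point can be put in numbers. A toy sketch, where the viewer counts and the 3.5 Mbps per-stream rate are invented for illustration:

```python
# Aggregate trunk bandwidth: unicast (one copy per viewer) vs
# multicast (one copy per stream). All figures are illustrative.

STREAM_MBPS = 3.5

# "Head": a few channels with many viewers; "tail": many niche streams.
head = [50_000, 25_000, 12_000]     # viewers on the 3 most popular streams
tail = [2] * 30_000                 # 30K long-tail streams, ~2 viewers each

def unicast_gbps(viewers):
    return sum(viewers) * STREAM_MBPS / 1000

def multicast_gbps(viewers):
    # One copy per stream, however many viewers have joined.
    return len(viewers) * STREAM_MBPS / 1000

print(f"unicast:   {unicast_gbps(head + tail):.1f} Gbps")
print(f"multicast: {multicast_gbps(head + tail):.1f} Gbps")
```

The tail alone looks like a wash (30K streams either way), but once the head is included, multicast carries the same audience in a fraction of the aggregate bandwidth.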


With Source-Specific Multicast addressing there isn't a shortage of  
multicast addresses for the typical broadcast usage. At least not  
until we also run out of IPv4 unicast addresses.

There is rarely only one way to solve a problem. There will be  
multiple ways to distribute data, video, voice, etc.
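The SSM point is also arithmetic: an SSM channel is the (source, group) pair, so every source gets the entire 232.0.0.0/8 group range to itself. A sketch:

```python
# Why SSM relieves multicast address pressure: group addresses are
# scoped per source, so channels scale with the number of senders.

SSM_GROUPS_PER_SOURCE = 2 ** 24     # 232.0.0.0/8 leaves 24 group bits

def ssm_channels(num_sources: int) -> int:
    """Distinct (S,G) channels available across num_sources senders."""
    return num_sources * SSM_GROUPS_PER_SOURCE

print(f"{SSM_GROUPS_PER_SOURCE:,} groups per source")
print(f"{ssm_channels(1000):,} channels across 1,000 broadcasters")
```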




Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Gian Constantine
Well, yes. My view on this subject is U.S.-centric. In fairness to  
me, this is NANOG, not AFNOG or EuroNOG or SANOG.


I would also argue storage and distribution costs are not  
asymptotically zero with scale. Well designed SANs are not cheap.  
Well designed distribution systems are not cheap. While price does  
decrease when scaled upwards, the cost of such an operation remains  
hefty, and increases with additions to the offered content library  
and a swelling of demand for this content. I believe the graph  
becomes neither asymptotic, nor anywhere near zero.


You are correct on the long tail nature of music. But music is not  
consumed in a similar manner as TV and movies. Television and movies  
involve a little more commitment and attention. Music is more for the  
moment and the mood. There is an immediacy with music consumption.  
Movies and television require a slight degree more patience from the  
consumer. The freshness (debatable :-) ) of new release movies and TV  
can often command the required patience from the consumer. Older  
content rarely has the same pull.


I agree there is a market for ethnic and niche content, but it is not  
the broad market many companies look for. The investment becomes much  
more of a gamble than marketing the latest and greatest (again  
debatable :-) ) to the larger market of...well...everyone.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 5:15 PM, Bora Akyol wrote:






-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Gian Constantine
Sent: Sunday, January 07, 2007 7:18 PM
To: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?


snip

In entertainment, content is king. More specifically, new
release content is king. While internet distribution may help
breathe life into the long tail market, it is hard to imagine
any major shift from existing distribution methods. People
simply like the latest TV shows and the latest movies.


What's new to you is very different from what's new to me.

I am very happy watching 1 year old episodes of Top Gear whereas
if you are located in the UK, you may consider this as old news.

The story here is about the cost of storing the video content (which  
is asymptotically zero) and the cost of distributing it (which is  
also asymptotically approaching zero, despite the ire of the SPs).



So, this leaves us with little more than what is already
offered by the MSOs: linear TV and VoD. This is where things
become complex.

The studios will never (not any time soon) allow for a
subscription based VoD on new content. They would instantly
be sued by Time Warner (HBO).


This is a very US-centric view of the world. I am sure there are  
hundreds of TV stations from India, Turkey, Greece, etc. that would  
love to put their content online and make money off the long tail.


I guess where I am going with all this is simply it is very
hard to make this work from a business and marketing side.
The network constraints are, likely, a minor issue for some
time to come. Interest is low in the public at large for
primary (or even major secondary) video service on the PC.



Again, your views are very US centric, and are mono-cultural.

If you open your horizons, I think there is a world of content out  
there that the content owners would be happy to license and sell at  
10 cents a pop. To them it is dead content, but it turns out that it  
is worth something to someone out there. This is what iTunes and  
Rhapsody are doing with music. And the day of the video is coming.

Bora

-- Off to raise some venture funds now. (Just kidding ;)





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Gian Constantine
There may have been a disconnect on my part, or at least, a failure  
to disclose my position. I am looking at things from a provider  
standpoint, whether as an ISP or a strict video service provider.


I agree with you. From a consumer standpoint, a trickle or off-peak  
download model is the ideal low-impact solution to content delivery.  
And absolutely, a 500GB drive would almost be overkill on space for  
disposable content encoded in H.264. Excellent SD (480i) content can  
be achieved at ~1200 to 1500kbps, resulting in about a 1GB file for a  
90 minute title. HD is almost out of the question for internet  
download, given good 720p at ~5500kbps, resulting in a 30GB file for  
a 90 minute title.


Service providers wishing to provide this service to their customers  
may see some success where they control the access medium (copper  
loop, coax, FTTH). Offering such a service to customers outside of  
this scope would prove very expensive, and likely, would never see a  
return on the investment without extensive peering arrangements. Even  
then, distribution rights would be very difficult to attain without  
very deep pockets and crippling revenue sharing. The studios really  
dislike the idea of transmission outside of a closed network. Don't  
forget. Even the titles you mentioned are still owned by very large  
companies interested in squeezing every possible dime from their  
assets. They would not be cheap to acquire.


Further, torrent-like distribution is a long long way away from sign  
off by the content providers. They see torrents as the number one  
tool of content piracy. This is a major reason I see the discussion  
of tripping upstream usage limits through content distribution as moot.


I am with you on the vision of massive content libraries at the  
fingertips of all, but I see many roadblocks in the way. And, almost  
none of them are technical in nature.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]



On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:



Please see my comments inline:


-Original Message-
From: Gian Constantine [mailto:[EMAIL PROTECTED]
Sent: Monday, January 08, 2007 4:27 PM
To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?


snip


I would also argue storage and distribution costs are not
asymptotically zero with scale. Well designed SANs are not
cheap. Well designed distribution systems are not cheap.
While price does decrease when scaled upwards, the cost of
such an operation remains hefty, and increases with additions
to the offered content library and a swelling of demand for
this content. I believe the graph becomes neither asymptotic,
nor anywhere near zero.


To the end user, there is no cost to downloading videos when they are
sleeping.
I would argue that other than sports (and some news) events, there is  
pretty much no content that needs to be real time. What the  
downloading (possibly 24x7) does is to stress the ISP network to its  
max, since the assumptions of statistical multiplexing go out the  
window. Think of a Tivo that downloads content off the Internet 24x7.
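The statistical-multiplexing concern in numbers: a sketch of the sustained rate implied by the thread's "2 gigabytes a day" figure (the 10,000-subscriber scale below is an illustrative assumption):

```python
# A box that pulls a fixed amount of content every day is a constant
# load, not a bursty one, which is what breaks the oversubscription math.

def sustained_kbps(gb_per_day: float) -> float:
    """Average rate needed to move gb_per_day (10^9 bytes) over 24 hours."""
    return gb_per_day * 1e9 * 8 / 86_400 / 1000

per_user = sustained_kbps(2)        # ~185 kbps, around the clock
print(f"{per_user:.0f} kbps sustained per subscriber")
print(f"{10_000 * per_user / 1e6:.2f} Gbps for 10,000 such subscribers")
```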

The user is still paying for only what they pay each month, and this  
is network neutrality 2.0 all over again.



You are correct on the long tail nature of music. But music
is not consumed in a similar manner as TV and movies.
Television and movies involve a little more commitment and
attention. Music is more for the moment and the mood. There
is an immediacy with music consumption. Movies and television
require a slight degree more patience from the consumer. The
freshness (debatable :-) ) of new release movies and TV can
often command the required patience from the consumer. Older
content rarely has the same pull.


I would argue against your distinction between visual and auditory  
content. There is a lot of content out there that a lot of people  
watch, and the content is 20-40+ years old. Think Brady Bunch,  
Bonanza, or archived games from NFL, MLB, etc. What about Smurfs (for  
those of us with kids)?

This is only the beginning.

If I can get a 500GB box and download MP4 content, that's a lot of
essentially free storage.

Coming back to NANOG content, I think video (not streamed but  
multi-path distributed video) is going to bring the networks down not  
by sheer bandwidth alone but by challenging the assumptions behind  
the engineering of the network. I don't think you need huge SANs per  
se to store the content either; since it is multi-source/multi-sink,  
the reliability is built-in.

The SPs like Verizon & AT&T moving fiber to the home hoping to get in  
on the value-add action are in for an awakening IMHO.

Regards

Bora
ps. I apologize for the tone of my previous email. That sounded  
grumpier than I usually am.






Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-08 Thread Gian Constantine
My contention is simple. The content providers will not allow P2P  
video as a legal commercial service anytime in the near future.  
Furthermore, most ISPs are going to side with the content providers  
on this one. Therefore, discussing it at this point in time is purely  
academic, or more so, diversionary.


Personally, I am not one for throttling high use subscribers. Outside  
of the fine print, which no one reads, they were sold a service of  
Xkbps down and Ykbps up. I could not care less how, when, or how  
often they use it. If you paid for it, burn it up.


I have questions as to whether or not P2P video is really a smart  
distribution method for service provider who controls the access  
medium. Outside of being a service provider, I think the economic  
model is weak, when there can be little expectation of a large scale  
take rate.


Ultimately, my answer is: we're not there yet. The infrastructure  
isn't there. The content providers aren't there. The market isn't  
there. The product needs a motivator. This discussion has been  
putting the cart before the horse.


A lot of big pictures pieces are completely overlooked. We fail to  
question whether or not P2P sharing is a good method in delivering  
the product. There are a lot of factors which play into this.  
Unfortunately, more interest has been paid to the details of this  
delivery method than has been paid to whether or not the method is  
even worthwhile.


From a big picture standpoint, I would say P2P distribution is a  
non-starter, too many reluctant parties to appease. From a detail  
standpoint, I would say P2P distribution faces too many hurdles in  
existing network infrastructure to be justified. Simply reference the  
discussion of upstream bandwidth caps and you will have a wonderful  
example of those hurdles.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 8, 2007, at 9:49 PM, Thomas Leavitt wrote:

So, kind of back to the original question: what is going to be the  
reaction of your average service provider to the presence of an  
increasing number of people sucking down massive amounts of video  
and spitting it back out again... nothing? throttling all traffic  
of a certain type? shutting down customers who exceed certain  
thresholds? or just throttling their traffic? massive upgrades of  
internal network hardware?


Is it your contention that there's no economic model, given the  
architecture of current networks, which would would generate enough  
revenue to offset the cost of traffic generated by P2P video?


Thomas

Gian Constantine wrote:
There may have been a disconnect on my part, or at least, a  
failure to disclose my position. I am looking at things from a  
provider standpoint, whether as an ISP or a strict video service  
provider.


I agree with you. From a consumer standpoint, a trickle or  
off-peak download model is the ideal low-impact solution to content  
delivery. And absolutely, a 500GB drive would almost be overkill  
on space for disposable content encoded in H.264. Excellent SD  
(480i) content can be achieved at ~1200 to 1500kbps, resulting in  
about a 1GB file for a 90 minute title. HD is almost out of the  
question for internet download, given good 720p at ~5500kbps,  
resulting in a 30GB file for a 90 minute title.


Service providers wishing to provide this service to their  
customers may see some success where they control the access  
medium (copper loop, coax, FTTH). Offering such a service to  
customers outside of this scope would prove very expensive, and  
likely, would never see a return on the investment without  
extensive peering arrangements. Even then, distribution rights  
would be very difficult to attain without very deep pockets and  
crippling revenue sharing. The studios really dislike the idea of  
transmission outside of a closed network. Don't forget. Even the  
titles you mentioned are still owned by very large companies  
interested in squeezing every possible dime from their assets.  
They would not be cheap to acquire.


Further, torrent-like distribution is a long long way away from  
sign off by the content providers. They see torrents as the number  
one tool of content piracy. This is a major reason I see the  
discussion of tripping upstream usage limits through content  
distribution as moot.


I am with you on the vision of massive content libraries at the  
fingertips of all, but I see many roadblocks in the way. And,  
almost none of them are technical in nature.


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
[EMAIL PROTECTED]




On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:



Please see my comments inline:


-Original Message-
From: Gian Constantine [mailto:[EMAIL PROTECTED]  
Sent: Monday, January 08, 2007 4:27 PM

To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re

Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-07 Thread Gian Constantine
You know, when it's all said and done, streaming video may be the  
motivator for migrating the large scale Internet to IPv6. I do not  
see unicast streaming as a long term solution for video service. In  
the short term, unicast streaming and PushVoD models may prevail, but  
the ultimate solution is Internet-wide multicasting.


I want my m6bone. :-)

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.


On Jan 6, 2007, at 1:52 AM, Thomas Leavitt wrote:

If this application takes off, I have to presume that everyone's  
baseline network usage metrics can be tossed out the window...


Thomas



From: David Farber [EMAIL PROTECTED]
Subject: Using Venice Project? Better get yourself a non-capping  
ISP...

Date: Fri, 5 Jan 2007 11:11:46 -0500



Begin forwarded message:

From: D.H. van der Woude [EMAIL PROTECTED]
Date: January 5, 2007 11:06:31 AM EST
To: [EMAIL PROTECTED]
Subject: Using Venice Project? Better get yourself a non-capping  
ISP...



I am one of Venice's beta testers. Works like a charm,
admittedly with a 20/1 Mbps ADSL2+ connection and
an unlimited-use ISP.

Even at sub-DVD quality the data use is staggering...

Venice Project would break many users' ISP conditions
http://www.out-law.com/page-7604
OUT-LAW News, 03/01/2007

Internet television system The Venice Project could break users'  
monthly internet bandwidth limits in hours, according to the team  
behind it.


It downloads 320 megabytes (MB) per hour from users' computers,  
meaning that users could reach their monthly download limits in  
hours and that it could be unusable for bandwidth-capped users.


The Venice Project is the new system being developed by Janus Friis  
and Niklas Zennström, the Scandinavian entrepreneurs behind the  
revolutionary services Kazaa and Skype. It is currently being used  
by 6,000 beta testers and is due to be launched next year.


The data transfer rate is revealed in the documentation sent to  
beta testers and the instructions make it very clear what the  
bandwidth requirements are so that users are not caught out.


Under a banner saying 'Important notice for users with limits on  
their internet usage', the document says: "The Venice Project is a  
streaming video application, and so uses a relatively high amount  
of bandwidth per hour. One hour of viewing is 320MB downloaded and  
105 Megabytes uploaded, which means that it will exhaust a 1  
Gigabyte cap in 10 hours. Also, the application continues to run in  
the background after you close the main window."


"For this reason, if you pay for your bandwidth usage per megabyte  
or have your usage capped by your ISP, you should be careful to  
always exit the Venice Project client completely when you are  
finished watching it," says the document.


Many ISPs offer broadband connections which are unlimited to use by  
time, but have limits on the amount of data that can be transferred  
over the connection each month. Though limits are 'advisory' and  
not strict, users who regularly far exceed the limits break the  
terms of their deals.


BT's most basic broadband package BT Total Broadband Package 1, for  
example, has a 2GB monthly 'usage guideline'. This would be reached  
after 20 hours of viewing.
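The quoted figures are easy to sanity-check. The short sketch below uses only the numbers given in the article; the observation that the 10-hour and 20-hour estimates line up with the 105 MB/hour upload rate alone is my inference, not something the article states.

```python
# Rates as quoted in the article.
DOWNLOAD_MB_PER_HR = 320
UPLOAD_MB_PER_HR = 105

def hours_to_cap(cap_mb: float, rate_mb_per_hr: float) -> float:
    """Hours of viewing before cap_mb of transfer is consumed at the given rate."""
    return cap_mb / rate_mb_per_hr

# Counting only the upload side reproduces the article's round figures:
print(round(hours_to_cap(1024, UPLOAD_MB_PER_HR), 1))  # 9.8  -> "10 hours" on a 1 GB cap
print(round(hours_to_cap(2048, UPLOAD_MB_PER_HR), 1))  # 19.5 -> "20 hours" on BT's 2 GB guideline

# Counting both directions, the same caps are exhausted much sooner:
total = DOWNLOAD_MB_PER_HR + UPLOAD_MB_PER_HR  # 425 MB/hour
print(round(hours_to_cap(1024, total), 1))  # 2.4 hours
print(round(hours_to_cap(2048, total), 1))  # 4.8 hours
```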


The software is also likely to transfer data even when not being  
used. The Venice system is going to run on a peer-to-peer (P2P)  
network, which means that users host and send the programmes to  
other users in an automated system.


OUT-LAW has seen screenshots from the system and talked to one of  
the testers of it, who reports very favourably on its use. "This is  
going to be the one. I've used some of the other software out there  
and it's fine, but my dad could use this, they've just got it  
right," he said. "It looks great, you fire it up and in two minutes  
you're live, you're watching television."


The source said that claims being made for the system being near  
high definition in terms of picture quality are wide of the mark.  
"It's not high definition. It's the same as normal television," he  
said.





-- "Private where private belongs, public where it's needed, and an  
admission that circumstances alter cases." Robert A. Heinlein, 1969


--
Thomas Leavitt - [EMAIL PROTECTED] - 831-295-3917 (cell)

*** Independent Systems and Network Consultant, Santa Cruz, CA ***





Re: Network end users to pull down 2 gigabytes a day, continuously?

2007-01-07 Thread Gian Constantine
I may have missed it in previous posts, but I think an important  
point is being missed in much of this discussion: take rate.


An assumption being made is one of widespread, long-duration usage. I  
would argue consumers have little interest in viewing content for  
more than a few hundred seconds on their PC. Further, existing  
solutions for media extension to the television are gaining very  
little foothold outside of technophiles. They tend to be more complex  
for the average user than many vendors seemingly realize. While Apple  
may help in this arena, there are many other obstacles to widespread  
usage of streaming video outside of media extension.


In entertainment, content is king. More specifically, new release  
content is king. While internet distribution may help breathe life  
into the long tail market, it is hard to imagine any major shift from  
existing distribution methods. People simply like the latest TV shows  
and the latest movies.


So, this leaves us with little more than what is already offered by  
the MSOs: linear TV and VoD. This is where things become complex.


The studios will never (not any time soon) allow for a subscription  
based VoD on new content. They would instantly be sued by Time Warner  
(HBO). This leaves us with a non-subscription VoD option, which still  
requires an agreement with each of the major studios, and would  
likely cost a fortune to obtain. CinemaNow and MovieLink have done  
this successfully, and use a PushVoD model to distribute their  
content. CinemaNow allows DVD burning for some of their content, but  
both companies are otherwise tied to the PC (without a media  
extender). Furthermore, the download wait is a pain. Their content is  
good-quality 1200-1500 kbps VC-1 *wince*. It is really hard to say  
when and if either of these will take off as a service. It is a good  
service, with a great product, and almost no market at the moment.  
Get it on the TV and things may change dramatically.


This leaves us with linear TV, which is another acquisition  
nightmare. It is very difficult to acquire pass-through/distribution  
rights for linear television, especially via IP. Without deep  
pockets, a company might be spinning their wheels trying to get  
popular channels onto their lineup. And good luck trying to acquire  
the rights to push linear TV outside of a closed network. The studios  
will hear none of it.


I guess where I am going with all this is simply that it is very hard to  
make this work from a business and marketing side. The network  
constraints are, likely, a minor issue for some time to come.  
Interest is low in the public at large for primary (or even major  
secondary) video service on the PC.


By the time interest in the product swells and content providers ease  
some of their more stringent rules for content distribution, a better  
solution for multicasting the content will have presented itself. I  
would argue streaming video across the Internet to a large audience,  
direct to subscribers, is probably 4+ years away at best.


I am not saying we throw in the towel on this problem, but I do think  
unicast streaming has a limited scope and a short lifespan for prime  
content. IPv6 multicast is the real long-term solution for Internet  
video to a wide audience.


Of course, there is the other argument. The ILECs and MSOs will keep  
it from ever getting beyond a unicast model. Why let the competition  
in, right? *sniff* I smell lobbyists and legislation. :-)


Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.