I discovered something this week and am trying to understand its
ramifications. I noticed lots of pixelation and motion blur over the last
two weeks of Heroes. NBC broadcasts at 1080i for HDTV. I checked the
statistics for the show I recorded via HD Homerun tuners using Comcast
cable, and NBC is averaging about 4.8 GB per hour for a 1080i show. I
thought that was a bit low, but was even more surprised when I checked out
shows on the other broadcast networks.

ABC     720p/60fps        6.3 GB/hr
NBC     1080i/29.97fps    4.8 GB/hr
CBS     1080i/29.97fps    5.6 GB/hr
PBS     720p/60fps        5.4 GB/hr
CW      1080i/29.97fps    7.9 GB/hr
FOX     720p/60fps        7.3 GB/hr

I find it strange that NBC has the lowest total file size while
broadcasting at 1080i, so I am assuming (and I know the drawback of that!)
that it is compressed more than the other channels, and am again assuming
that is why I am seeing the picture degradation. Calling Comcast is a joke,
so I wanted to do the math and calculate the bits-per-second for each case,
but am not exactly sure if I am doing this correctly. It would seem that
4.8 GB/hr would work out to:

4.8 GB/hr * 1 hr/60 min * 1 min/60 sec * 1024 MB/GB * 8 Mb/MB = 10.9 Mbps. 

One online source indicated that for quality 1080i you should have at least
15 Mbps.

For the FOX network, the calculation would give 16.6 Mbps, far better than
the 12 Mbps my online source gave for quality 720p broadcasts.
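To double-check myself, I also ran the same conversion for every network in
a few lines of Python. This is just a sketch using the 1024 MB/GB and
8 Mb/MB factors from the calculation above and the per-hour sizes from the
table:

    # Convert average GB-per-hour recording sizes into average Mbps.
    gb_per_hour = {
        "ABC": 6.3, "NBC": 4.8, "CBS": 5.6,
        "PBS": 5.4, "CW": 7.9, "FOX": 7.3,
    }

    for network, gb in gb_per_hour.items():
        mbps = gb * 1024 * 8 / 3600  # GB/hr -> MB/hr -> Mb/hr -> Mb/sec
        print(f"{network}: {mbps:.1f} Mbps")

That prints roughly 10.9 Mbps for NBC and 16.6 Mbps for FOX, which matches
the hand calculations above.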

I can't understand why the 720p broadcast is actually providing better
throughput than the 1080i. It seems backwards (which is why I am wondering
if my math is correct). I am not sure how to factor in the fps figures, if
at all.

If you can add some insight, it would be appreciated.

Thanks,

Jim Maki
jwm_maill...@comcast.net
