On 12/10/19 1:32 PM, Adam Moffett wrote:

I was discussing with a TV station engineer some sort of disturbance he's seeing in a video feed that crosses a section of our network. The path is a blend of fiber and Part 101 microwave, and it had been working fine for several years until this problem suddenly cropped up about a month ago.

His words, emphasis mine:

"We are seeing PCR clocking intolerance in our television data streams (~19.392685 Mbps, plus overhead; PCR is sent at a defined interval, at least once every 40ms, for each of five embedded streams with a drift tolerance of <10mHz and a /*jitter error of <25us per */ETSI TR 101 290), "

I know jack-all about TV broadcasting, but I mentioned that packet-to-packet delay variation of less than 1 millisecond is considered perfect in my world, and asked, "do I understand you correctly that you really need clock signals transmitted across the network with less than 25 /micro/seconds of jitter?" He seems to feel that yes, that is the case. Is this guy mistaken? I can't believe whatever converts the TV signal to Ethernet and back wouldn't have at least some minimal jitter buffer.


They really do need tight timing, but there's gotta be more to this story...
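
For reference, the interval and jitter numbers he's quoting are the PCR checks from TR 101 290. If you can get a raw capture of the transport stream, verifying the 40ms repetition claim yourself is pretty easy. Rough sketch below (my own, untested against his gear; the plain 188-byte-packet .ts capture is an assumption, adjust to taste):

#!/usr/bin/env python3
# Rough sketch: pull PCRs out of a raw .ts capture and flag any
# repetition gap over the 40 ms quoted from TR 101 290.
import sys

TS_PACKET = 188
PCR_CLOCK = 27_000_000  # PCR counts a 27 MHz clock per MPEG-2 systems

def iter_pcrs(data):
    """Yield (byte_offset, pid, pcr_seconds) for every TS packet carrying a PCR."""
    for off in range(0, len(data) - TS_PACKET + 1, TS_PACKET):
        pkt = data[off:off + TS_PACKET]
        if pkt[0] != 0x47:                      # lost sync; a real tool would resync
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if not (pkt[3] & 0x20) or pkt[4] < 7:   # no adaptation field, or too short for a PCR
            continue
        if not (pkt[5] & 0x10):                 # PCR_flag clear
            continue
        b = pkt[6:12]                           # 33-bit base + 6 reserved + 9-bit extension
        base = (b[0] << 25) | (b[1] << 17) | (b[2] << 9) | (b[3] << 1) | (b[4] >> 7)
        ext = ((b[4] & 0x01) << 8) | b[5]
        yield off, pid, (base * 300 + ext) / PCR_CLOCK

def check_repetition(filename, max_gap_s=0.040):
    data = open(filename, "rb").read()
    last = {}                                   # pid -> last PCR seen (seconds)
    for off, pid, pcr in iter_pcrs(data):
        gap = pcr - last.get(pid, pcr)
        if gap > max_gap_s:                     # ignores the ~26.5 h PCR wrap, fine for a spot check
            print(f"PID {pid}: {gap * 1000:.2f} ms PCR gap at offset {off}")
        last[pid] = pcr

if __name__ == "__main__":
    check_repetition(sys.argv[1])

If that comes up clean on captures taken at both ends of the link, the 40ms part of his complaint isn't the network's doing.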

If it's suddenly a new problem, perhaps they upgraded equipment to be future-proof for ATSC 3.0 and are now trying to rely on timing over the network to avoid placing a clock source at one or both ends. No Ethernet is going to meet timing requirements that tight. That's what the external sync inputs on such equipment are for.
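
If he wants to put a number on what the network is actually doing to his PCRs, a crude companion sketch (same caveats as above): compare each PCR against its capture arrival time, strip out the best-fit straight line (constant offset plus clock drift), and look at the residual spread. That spread is the figure to hold against his 25us budget. `samples` is assumed to be a list of (arrival_time_s, pcr_s) pairs, e.g. from a pcap of the feed plus the PCR extractor above.

def pcr_jitter_us(samples):
    """Peak-to-peak PCR-vs-arrival residual in microseconds, drift removed."""
    n = len(samples)
    xs = [a for a, _ in samples]
    ys = [p - a for a, p in samples]       # PCR minus arrival: offset + drift + jitter
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    resid = [y - (my + slope * (x - mx)) for x, y in zip(xs, ys)]
    return (max(resid) - min(resid)) * 1e6

# e.g. print(f"{pcr_jitter_us(samples):.1f} us peak-to-peak vs the 25 us budget")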
