Hi Ross,
I have a question. I see that in
ProxyServerMediaSubsession::createNewStreamSource, a normalizerFilter is set
up to adjust the PTS values coming from a source. Why can't the PTS values
from the source be used as-is, instead of being normalized? Is this due to
latency from proxying a camera stream?
// Then, add to the front of all data sources a filter that will 'normalize'
// their frames' presentation times, before the frames get re-transmitted by
// our server:
FramedFilter* normalizerFilter = sms->fPresentationTimeSessionNormalizer
    ->createNewPresentationTimeSubsessionNormalizer(fClientMediaSubsession.readSource(),
                                                    fClientMediaSubsession.rtpSource(),
                                                    fCodecName);
fClientMediaSubsession.addFilter(normalizerFilter);
In our case the proxy server is proxying an RTSP stream from a camera. We are
noticing that when we pull directly from the camera's stream, the PTS values
don't seem to have the large jumps we see when going through the proxy. I did
see your comment in the FAQ about the large timestamp jump, where you said
that it is normal. Is this normal jump due to the normalization?
I have a colleague working on a DVR application whose source is the proxy
server. He says he does not have issues with the PTS when coming directly from
a camera, only through the proxy, and that sometimes the timestamps jump/change
in the middle of his session (which I assume is a problem from the source).
So is the normalization needed for a source that already has valid timestamps?
Thanks for helping me understand.
Craig
_______________________________________________
live-devel mailing list
[email protected]
http://lists.live555.com/mailman/listinfo/live-devel