On Mar 30, 2013, at 4:16 AM, René J.V. Bertin <rjvber...@gmail.com> wrote:
> Seriously, why not post the information provided by QTKit, and how you 
> convert it? Seems it could be quite easy to confound QT's time scale and 
> FFmpeg's time_base units?

It appears I've discovered what the problem is; however, I'm not yet clear on
how to fix it. The problem is not in my pts or dts values, nor in the formulas
I'm using to convert QTSampleBuffer presentationTime and decodeTime into
time_base units. (In fact, as an aside, I commented all of that code out and
used the muxing.c pts-setting code instead, and it changed absolutely nothing
-- the same problem remained.)
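
For reference, the conversion amounts to something like the following (a
simplified sketch written just for this message; the function and variable
names are illustrative, not my actual code):

#include <stdint.h>
#include <libavutil/mathematics.h>
#include <libavutil/rational.h>

/* QTTime expresses a moment as timeValue ticks at timeScale ticks per
 * second (seconds = timeValue / timeScale); this rescales that into the
 * codec context's time_base units. */
static int64_t qt_time_to_pts(int64_t qt_time_value, long qt_time_scale,
                              AVRational time_base)
{
    return av_rescale_q(qt_time_value,
                        (AVRational){ 1, (int)qt_time_scale },
                        time_base);
}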

I am configuring QTKit with a minimumFrameRate of 24, which is the value I'm
using for time_base.den, per the documentation. What I discovered is that
despite configuring that frame rate in QTKit, it is not the rate actually being
received -- at runtime, capture is producing closer to 15 fps. I determined
this by reading the log of pts values up to the highest pts <= time_base.den:
by that point only about 15 frames had been processed, and this was consistent.
So I then manually hardcoded time_base.den to 15, and boom, both video and
audio are right on the money, completely in sync.
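
In other words, roughly the only change was this (using "c" for the video
codec context, as muxing.c does):

c->time_base.num = 1;
c->time_base.den = 15;   /* was 24, to match the configured minimumFrameRate */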

The problem is that I don't want to hard-code this value (or, more properly
put, I don't think doing so would be prudent or bug-free), because the frame
rate will likely vary in practice depending on the computer, camera, etc. At
present I've got a question out to the QuickTime API users mailing list,
because there does not appear to be any way to query the actual frame rate
being captured -- not from the sample buffer received, the capture device, or
the capture connection.

But this raises the question: what is the proper way to deal with a varying
frame rate during encoding, so that pts and dts are set properly? It appears
the intention is for a codec context's time_base to be set once, prior to the
encoding cycle. So I'd guess that even if I could obtain the actual frame rate
while capture was taking place, I couldn't change time_base.den on the fly
during encoding.
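
One idea that occurs to me (just a guess on my part, untested) would be to
stop tying time_base.den to a nominal frame rate at all: pick a fine-grained
time_base up front and derive each frame's pts from the capture timestamps,
something like this (names are placeholders):

#include <stdint.h>
#include <libavcodec/avcodec.h>
#include <libavutil/mathematics.h>

/* Untested idea: keep a fixed, fine-grained time_base and compute each
 * frame's pts from how much capture time has elapsed since the first
 * sample, rather than incrementing by one per frame. */
static int64_t first_time_value = -1;

static void init_video_time_base(AVCodecContext *c)
{
    c->time_base = (AVRational){ 1, 1000 };  /* 1 ms ticks, finer than any frame rate */
}

static int64_t frame_pts(int64_t qt_time_value, long qt_time_scale,
                         const AVCodecContext *c)
{
    if (first_time_value < 0)
        first_time_value = qt_time_value;
    /* elapsed capture ticks -> codec time_base ticks */
    return av_rescale_q(qt_time_value - first_time_value,
                        (AVRational){ 1, (int)qt_time_scale },
                        c->time_base);
}

But I don't know whether encoders are generally happy with a time_base that
isn't tied to the frame rate, which is really what I'm asking below.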

How would you suggest one deal with this? What happens if you set
time_base.den to the expected frame rate, such as 24, only to actually receive
15 (or some other number of) frames per second? How do you deliver proper
timings in that scenario?

Thanks, 

Brad 




