On Mon, 2010-06-07 at 19:55 -0400, [email protected] wrote:
I am working on integrating a new video encoder. One issue I am having is determining the frame rate of the input video stream. This is available in the AVFormatContext but not in the AVCodecContext. The AVCodecContext does contain a time_base value, but this does not seem to translate properly into a frame rate in all cases. There are some video streams where the time_base is 50/2997, which would come to 59.94 fps. However, the AVFormatContext indicates the frame rate is 29.92 fps. I believe the actual frame rate is more likely the 29.92 value; however, this is not available at the libavcodec scope. Is there any undocumented process for determining the actual frame rate at the libavcodec scope? I also tried using the pts value in the AVFrame structure from the decoded frame. This turned out not to be provided by the decoder and is always 0.
One example . . .
<<
  Duration: 00:02:42.36, start: 0.000000, bitrate: 820 kb/s
    Stream #0.0: Video: h264, yuv420p, 640x352 [PAR 1:1 DAR 20:11], 820 kb/s, 58.82 fps, 29.92 tbr, 1k tbn, 59.94 tbc
    Stream #0.1: Audio: aac, 22050 Hz, stereo, s16
Output #0, flv, to '':
    Stream #0.0: Video: libx264, yuv420p, 640x352, q=2-31, 650 kb/s, 58.82 fps, 29.92 tbr, 1k tbn, 59.94 tbc
    Stream #0.1: Audio: aac, 22050 Hz, stereo, s16
>>
Here's an even worse case . . .
<<
  Duration: 00:04:45.05, start: 0.000000, bitrate: 195 kb/s
    Stream #0.0, 21, 1/1000: Video: h264, yuv420p, 320x214 [PAR 1:1 DAR 160:107], 1/2000, 195 kb/s, 29.92 tbr, 1k tbn, 2k tbc
    Stream #0.1, 15, 1/1000: Audio: aac, 22050 Hz, stereo, s16
Output #0, flv, to '':
    Stream #0.0, 0, 1/1000: Video: libx264, yuv420p, 320x214, 1/2000, q=2-31, 650 kb/s, 29.92 tbr, 1k tbn, 2k tbc
    Stream #0.1, 0, 1/90000: Audio: aac, 22050 Hz, stereo, s16
>>
FLV has variable frame rate - the timestamps are always specified with a 1 kHz time base. It's fairly common for the "average" frame rate to vary. I've seen YouTube FLVs start at 3 fps, but switch to ~29.97 fps a few seconds later. In other words, in general lavf can't figure out the exact frame rate. Also, if I remember correctly the codec's time base is the field rate, which might in fact be useful (divide by 2 if interlaced).
One thing I can't figure out is how to use the time_base values as in the above two examples. The first has a value (50/2997) which would lead me to calculate a frame rate of 59.94 when it is really on the order of 30 fps. The decoded frame does not indicate interlaced. In the second case the time_base is (1/2000). How is 2000 fps a useful value?
Your best bet would be to preserve the timestamps and convert them to the muxer's time base when written (av_rescale_q() between the AVFormatContext time_base values). If you want constant frame rate output you'll have to drop and/or duplicate frames to reach the desired rate. If you're outputting to FLV again then you don't need to worry too much, since the output is VFR too. You will need to give the encoder a frame rate guess for it to perform proper rate control, though.
OK, I am sure I can eventually figure out how to do this in my application. The Dranger tutorial seems to have some useful details. However, what about the ffmpeg CLI scenario? How can I pass a frame rate for bit rate control if the decoded frame does not even provide the encoder a pts value? Will I be able to directly convert flv to flv using the built-in h264 decoder and my video encoder?
I hope this helps a bit at least.
Every little bit helps, thanks!
/Tomas
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user