Hi,

when specifying bitrate limits with the minrate/maxrate options, how does ffmpeg measure the bitrate in its rate control algorithm? Is it the bitrate averaged over a GOP, i.e. sum(packet sizes of GOP)/GOP duration, or simply packet size/packet duration, that ffmpeg tries to keep within the limits? The latter would seem odd for anything other than I-frame-only material, since an I-frame of a given quality is typically (depending on the content, I know) much larger than the P- or B-frames of the same quality that follow it. If the answer is "it depends on the codec", then I would like to know how it works for the MPEG-style codecs, e.g. mpeg2video.
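To make the distinction concrete, here is a rough Python sketch of the two measurements I have in mind, computed after the fact from ffprobe's packet output. This is of course not ffmpeg's internal rate control, just the two candidate metrics; it assumes ffprobe is on the PATH and uses the size/duration_time/flags fields of its JSON packet output:

import json
import subprocess
import sys

def probe_packets(path):
    # Dump the packets of the first video stream as JSON.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_packets", "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    return json.loads(out)["packets"]

def per_packet_bitrates(packets):
    # Metric 1: each packet's own size/duration, in bit/s.
    return [int(p["size"]) * 8 / float(p["duration_time"])
            for p in packets if float(p.get("duration_time", 0)) > 0]

def per_gop_bitrates(packets):
    # Metric 2: sum(packet sizes of GOP) / GOP duration, in bit/s.
    # A new GOP is assumed to start at each keyframe packet
    # (ffprobe sets 'K' in the packet's flags field).
    rates, gop = [], []
    def flush():
        dur = sum(float(q.get("duration_time", 0)) for q in gop)
        if dur > 0:
            rates.append(sum(int(q["size"]) for q in gop) * 8 / dur)
    for p in packets:
        if "K" in p.get("flags", "") and gop:
            flush()
            gop.clear()
        gop.append(p)
    flush()
    return rates

if __name__ == "__main__":
    pkts = probe_packets(sys.argv[1])
    print("max per-packet rate: %.0f bit/s" % max(per_packet_bitrates(pkts)))
    print("max per-GOP rate:    %.0f bit/s" % max(per_gop_bitrates(pkts)))

On typical long-GOP material the per-packet peaks (the I-frames) come out far above the per-GOP averages, which is why the per-packet interpretation of maxrate would surprise me.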
Thanks for any insights into this,
Robert