Hi all, I'm trying to understand how ffmpeg calculates the start time for an MPEG-2 file containing one video track and one audio track. For example, when I run ffprobe, I get this output:
Input #0, mpeg, from 'file.mpg':
  Duration: 00:00:11.06, start: 579.376000, bitrate: 10704 kb/s

As a guess, I looked at the system clock reference (SCR) in the first pack header and at the PTS/DTS values in the first PES headers. When I calculate those values by hand, or examine the same file in Wireshark, I get values that differ from the one ffmpeg reports. The first pack header gives me an SCR of 578.4, and the first video PES header gives PTS = DTS = 579.4. The first audio PES header gives 579.375655555556, which is close enough to the reported 579.376000 that the remaining difference may just be a matter of doing the division in the wrong mode (integer vs. floating point). I've put my exact arithmetic in a postscript below in case the error is there.

From those values, I would guess that ffmpeg takes the minimum PTS across the first PES headers of all streams, but I'm not confident about that. In av_dump_format() I can see the lines that calculate and print the seconds (ic->start_time / AV_TIME_BASE) and the microseconds (the same expression with % instead of /), and I'm working backwards from there.

Any information or pointers on where to look would be appreciated.

Thanks,
Harold
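P.S. Here is the exact hand calculation as a small standalone C program, in case I'm simply mis-decoding the field. The 5-byte timestamp layout is the one in ISO/IEC 13818-1 (PTS_DTS_flags == '0010'); the sample bytes are reconstructed from the 579.375655555556 figure above (579.375655555556 * 90000 == 52143809 ticks), not dumped from the file, so treat them as illustrative only.

/* pts_calc.c -- sanity-check my hand decoding of a PES PTS.
 * Build: cc -o pts_calc pts_calc.c
 * Sample bytes are reconstructed, NOT read from my file. */
#include <stdio.h>
#include <stdint.h>

/* Extract the 33-bit timestamp from the 5-byte PTS/DTS field,
 * skipping the 4-bit prefix and the three marker bits. */
static uint64_t parse_pes_timestamp(const uint8_t *p)
{
    return ((uint64_t)((p[0] >> 1) & 0x07) << 30) |  /* bits 32..30 */
           ((uint64_t)  p[1]               << 22) |  /* bits 29..22 */
           ((uint64_t) (p[2] >> 1)         << 15) |  /* bits 21..15 */
           ((uint64_t)  p[3]               <<  7) |  /* bits 14..7  */
           ((uint64_t) (p[4] >> 1));                 /* bits  6..0  */
}

int main(void)
{
    const uint8_t pts_bytes[5] = { 0x21, 0x0C, 0x6F, 0x4D, 0x83 };
    uint64_t ticks = parse_pes_timestamp(pts_bytes);

    printf("ticks           = %llu\n", (unsigned long long)ticks);
    printf("float seconds   = %.12f\n", ticks / 90000.0);  /* 579.375655555556 */
    printf("integer seconds = %llu\n",                     /* truncates to 579 */
           (unsigned long long)(ticks / 90000));
    return 0;
}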

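P.P.S. And to make the "minimum across streams" guess concrete, this is the arithmetic I imagine happens before av_dump_format() runs. Only the secs/us split at the end is something I can actually see in the source; the rescale-to-microseconds step, the take-the-minimum step, and both tick values are my own assumptions (the ticks are back-computed from the 579.375655... and 579.4 figures above).

/* start_guess.c -- my guess at how ic->start_time gets printed.
 * Build: cc -o start_guess start_guess.c */
#include <stdio.h>
#include <stdint.h>

#define AV_TIME_BASE 1000000

/* 90 kHz ticks -> AV_TIME_BASE units, rounded to nearest.
 * ASSUMPTION: this is roughly what the rescaling does. */
static int64_t ticks_to_us(int64_t ticks)
{
    return (ticks * AV_TIME_BASE + 45000) / 90000;
}

int main(void)
{
    int64_t audio = ticks_to_us(52143809);  /* from 579.375655555556 s */
    int64_t video = ticks_to_us(52146000);  /* from 579.4 s (rounded)  */
    int64_t start = audio < video ? audio : video;

    /* The split I can see in av_dump_format(): */
    int secs = (int)(start / AV_TIME_BASE);
    int us   = (int)(start % AV_TIME_BASE);
    printf("start: %d.%06d\n", secs, us);   /* 579.375656, not 579.376000 */
    return 0;
}

Note that this still prints 579.375656 rather than the 579.376000 that ffprobe shows me, which is exactly the discrepancy I'm trying to track down.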