Hi,

On 28.07.2010 11:52, august wrote:
[...]
> but, I have to say, time in gavl and bgav is somewhat confusing.  There
> is int64_t time and gavl_time_t, which is also an int64_t.
>
> Also, what are "stream ticks"?  Is there somewhere where this is
> explained?

It's a timestamp scaled with the stream timescale.

> Also, for audio timestamp, what is a "samplerate tic"?

The same. For audio streams, timescale is always the samplerate.

> How do I go from stream ticks to number of frames?

In the general case (variable framerate) you can't. There are several
possibilities, for example:

- PAL (timescale 25):     0 1 2 3....
- NTSC (timescale 30000): 0 1001 2002 3003...

- PAL in Flash (timescale 1000): 0 40 80 120...
- NTSC in Flash (timescale 1000): 0 33 67 100....

But I've seen even stranger files, e.g. a movie that shows a still image
at the beginning, coded as one video frame with a duration of several
seconds. One should be prepared for such cases :)

In a nutshell: avoid frame numbers in your code.

> For example, I call bgav_audio_duration and get the number of samples.
> But, if I call bgav_video_duration, I get stream ticks.  How do I go
> from that return value to number of frames?

As I said, in general you can't. In some simple cases, you can do:

if(video_format.framerate_mode == GAVL_FRAMERATE_CONSTANT)
   {
   frame_number = timestamp / video_format.frame_duration;
   }

A more generic approach, which also works for variable framerate streams,
is frame tables, which I explained here:

http://hirntier.blogspot.com/2009/09/frame-tables.html

They let you obtain the timestamps of all frames and allow you to do
timestamp <-> frame_number conversions even for variable framerate streams.

> I'm trying to keep audio and video in sync while looping, and am looking
> to keep track of the last video frame.
>
> any suggestions would be helpful.  What is your general model for
> keeping AV sync?   Do you use the audio and video timestamp?

Yes, what I do is the following:

- Get the first audio timestamp and initialize the player time with that

- The "player time" (with microsecond precision, gavl_time_t) is always:

   player_time = gavl_time_unscale(samplerate,
                                   first_audio_timestamp +
                                   samples_sent_to_soundcard -
                                   samples_buffered_in_soundcard);

- Read a video frame and get the video_frame_time (with microsecond
   precision, gavl_time_t) with:

   video_frame_time = gavl_time_unscale(video_format.timescale,
                                        frame->timestamp);

- Wait until it's time to display the frame:

   diff_time = video_frame_time - player_time;

   if(diff_time > 0)
     gavl_time_delay(&diff_time);

In the general case, you should be aware of the following:

- The video framerate might be variable

- The timestamps of a file don't have to start at zero (especially for
   MPEG type streams). In particular, the first audio sample doesn't have to
   correspond to the first video frame.

- An audio sample and a video frame are assumed to be in sync if:

   gavl_time_unscale(audio_format.samplerate, audio_frame->timestamp) ==
   gavl_time_unscale(video_format.timescale, video_frame->timestamp);

Hope this helps

Burkhard

_______________________________________________
Gmerlin-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/gmerlin-general
