On Monday, 11 June 2012 at 08:06 +0200, Anton Khirnov wrote:
> On Sun, 10 Jun 2012 18:16:48 +0200, David Girault <[email protected]> wrote:
> > > On Sunday, 10 June 2012 at 16:24 +0200, Anton Khirnov wrote:
> > > On Sun, 10 Jun 2012 15:08:35 +0200, David Girault <[email protected]> 
> > > wrote:
> > > > 
> > > > Yes, it was very similar (igsmenudec.c was started as a fork of
> > > > pgssubdec.c), and thus my first implementation used the subtitle API. But
> > > > I switched to another media type for two main reasons:
> > > > - I don't want to change the subtitle API,
> > > 
> > > That's not a very convincing reason without more information =p
> > > Why not?
> > 
> > First, the subtitle API may actually be used by a lot of players, so I'm
> > pretty cautious about touching it.
> > 
> 
> Well, you're not breaking anything. From what I see, you're only adding
> two fields. That is backward compatible and breaks neither API nor ABI.
> 
> Which reminds me -- what are those fields even for? They don't seem to
> be used in your patchset.
> 

AVOverlay includes a pointer to a table of AVOverlayRect instead of
AVSubtitleRect; that is where the incompatible changes are. I removed the
subtitle-specific fields. If we rework the API, we can maintain
compatibility easily.

Now, concerning the two extra fields in AVOverlay: they are needed to
return some data (interactive or not) to the player when a button is
selected or activated, or on other input (like timing). These data are
stream-type dependent and opaque to libavcodec.

This first implementation returns in this field the HDMV commands to
execute when a button is activated (the 'enter' key pressed). These
commands should be run in an HDMV virtual machine, which in response may
select another page in the current menu, change the played
title/playlist/streams, select another audio or subtitle stream, etc.
This is outside the scope of avplay, which is why these fields aren't
used in my patchset.

I think the extra field should be typed and be a table like rects,
because we may return many types of extra data:
- a sound effect to play (mixed with the currently played audio stream)
in response to the last action,
- window effects to apply (crop, fade-in/fade-out, etc.) if the decoder
doesn't do it by itself.

Other information may be added directly to AVOverlay:
- the base time for button animations (not supported yet, but it should
be later),
- the duration of the animation (maximum number of pictures for one
button).

With this in mind, AVOverlay becomes a lot more complex. avplay will
never use all the possibilities, but the API should allow:
- handling button animation either by receiving in ONE AVOverlay all the
pictures for an animated button, or by calling, on a time basis, a
function that updates the AVOverlay without a new frame, so that
animation/window effects are rendered by the stream decoder;
- handling multiple kinds of interactive overlay streams. The extra data
returned is stream dependent, so libavcodec should never deal with it.

A real Blu-ray player will include an HDMV virtual machine, open
index.bdmv and movieobject.bdmv, and execute the code found there in the
VM. This selects the playlist and clip-info files; the MPEG-TS files are
then opened and played.


Regards,
David

_______________________________________________
libav-devel mailing list
[email protected]
https://lists.libav.org/mailman/listinfo/libav-devel
