Hello,
I'm trying to track down a performance issue I'm hitting when using
libavxx.so. I've found a simple example that demonstrates it, but I'm
afraid that, although it is conceptually easy, it may be difficult for
other people to reproduce.
I have a "file" that delivers MPEG-TS, the TS has been partially demuxed (at
the
TS level) already so that only one Video and one Audio stream are present. When
this file is delivered to ffplay at the expected data rate that a DVB-T
receiver
would deliver it at (remember its has all the other TS streams removed and the
data rate has NOT speeded up as a consequence!) it takes about 10-15 seconds
for
video to show up on the screen. Using the same method mplayer takes a second to
deliver video to the screen. Unsurprisingly if the file is captured to a normal
file and played in ffplay video is seen on the screen quickly, as no throttling
of the TS data rate is being done (its a plain old file!).
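In case it helps anyone reproduce this without my setup: I believe
rate-limiting a plain capture of the stream should behave much the same
as my delivery does. Something like the following, with a hypothetical
~500 kB/s limit standing in for whatever the actual mux rate is:

    pv -L 500k capture.ts | ffplay pipe:0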
I do get the following output from ffplay:
FFplay version 0.6, Copyright (c) 2003-2010 the FFmpeg developers
built on Oct 12 2010 20:18:28 with gcc 4.4.3
configuration: --prefix=/opt
libavutil 50.15. 1 / 50.15. 1
libavcodec 52.72. 2 / 52.72. 2
libavformat 52.64. 2 / 52.64. 2
libavdevice 52. 2. 0 / 52. 2. 0
libswscale 0.11. 0 / 0.11. 0
[mpeg2video @ 0x8bafe50]mpeg_decode_postinit() failure
Last message repeated 8 times
[mpegts @ 0x8b8d100]max_analyze_duration reached
Video is seen on the screen for the first time right after the
"..duration reached" message. I'm sure it's a simple matter of telling
the libraries not to parse the stream for so long and just try to decode
it (mplayer, after all, has no such issue), but I can't find how, so I
thought I'd ask the experts!
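If it helps to show what I mean, here is an untested sketch of what I
was hoping to do, written against the lavf 52.64 API from the banner
above; max_analyze_duration and probesize are my guesses at the relevant
AVFormatContext fields, not something I've confirmed, and open_ts_quickly
is just a made-up helper name:

    #include <libavformat/avformat.h>

    /* Untested sketch: bound the probing that av_find_stream_info()
     * does before it returns. The two fields below are my guess at the
     * knobs involved, based on the "max_analyze_duration reached"
     * message above. */
    static int open_ts_quickly(const char *url, AVFormatContext **out)
    {
        AVFormatContext *ic = NULL;

        av_register_all();

        if (av_open_input_file(&ic, url, NULL, 0, NULL) < 0)
            return -1;

        /* Cap how much time/data may be spent analyzing the stream. */
        ic->max_analyze_duration = AV_TIME_BASE / 2; /* 0.5 s (in AV_TIME_BASE units) */
        ic->probesize = 32 * 1024;                   /* 32 KiB of input */

        if (av_find_stream_info(ic) < 0) {
            av_close_input_file(ic);
            return -1;
        }

        *out = ic;
        return 0;
    }

If ffplay itself forwards AVOptions to the demuxer, perhaps the matching
command-line options (something like -analyzeduration / -probesize)
would do the same job, but I haven't been able to confirm that 0.6
exposes them.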
Thanks in advance,
Andy