On Wed, Apr 13, 2011 at 01:33:47AM +0200, Alberto Delmás wrote:
> In the current code, the display extension info (if present) is used
> to set avctx->width and height. These in turn determine the size of
> the allocated picture buffer, so if they're too small the decoder will
> write past its end.

Unfortunately, that's the problem with the decoder, or rather with the
libraries around it. Display dimensions are supposed to describe the
picture the codec _shows_, not what it decodes. In theory,
avctx->coded_{width,height} should be used to set the picture
dimensions so that cropping to the display size can be done later. In
practice it's not.
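
Roughly, the "in theory" part would look something like this (untested
sketch; seq_w/seq_h and disp_w/disp_h are made-up names for values
parsed from the sequence header and the display extension):

    /* Sketch only: size the buffer from the coded dimensions and
     * treat the display extension as a crop rectangle. */
    avctx->coded_width  = seq_w;   /* buffer allocation uses these */
    avctx->coded_height = seq_h;
    if (disp_w > 0 && disp_h > 0 &&
        disp_w <= seq_w && disp_h <= seq_h) {
        avctx->width  = disp_w;    /* visible (cropped) area */
        avctx->height = disp_h;
    } else {
        avctx->width  = seq_w;     /* bogus display size: ignore it */
        avctx->height = seq_h;
    }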

See also https://roundup.libav.org/issue2076 for reference.
 
> I've uploaded a sample illustrating the issue to incoming/VC1_DEI_crash.wmv
> 
> This patch fixes the issue by using the DEI data just to set the
> aspect ratio (which is more or less its intended purpose).

Since display dimensions are not optional there (and are not allowed to
be zero either), one should blame {him/her/it}self for encoding bad
values (lavc failing silently is another issue).

As for the patch, the approach seems correct, but why is it done only
for standard aspect ratios?
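
For the non-standard case, couldn't the SAR be derived directly from
the display and coded dimensions? Rough sketch (untested; disp_w/disp_h
are made-up names for the display extension values, av_reduce() is from
libavutil):

    /* Sketch only: pick the SAR that stretches the coded picture to
     * the signalled display proportions. */
    av_reduce(&avctx->sample_aspect_ratio.num,
              &avctx->sample_aspect_ratio.den,
              (int64_t)disp_w * avctx->coded_height,
              (int64_t)disp_h * avctx->coded_width,
              1 << 30);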