When decoding VP9 SVC frames from an IVF or WebM file, the result differs depending on whether the default “Google VP9” (native) decoder or the “Libvpx VP9” decoder is used. The former renders only the lowest spatial layer, while the latter renders the highest. The “Libvpx” decoder detects the dimension change between layers; the “Google” decoder does not. Is this a bug? I can force ffplay to render correctly by specifying ‘-vcodec libvpx-vp9’, but VLC and Chrome (WebM) appear to use the default decoder and render only the lowest spatial layer.
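For reference, here is a sketch of the two ffplay invocations being compared, plus a way to list the VP9 decoders a given FFmpeg build provides (the filename svc_test.webm is a placeholder for your SVC test file):

```shell
# Default: ffplay selects FFmpeg's native VP9 decoder ("vp9"),
# which in this case renders only the lowest spatial layer:
ffplay svc_test.webm

# Force the libvpx-based decoder, which follows the mid-stream
# dimension change and renders the highest spatial layer:
ffplay -vcodec libvpx-vp9 svc_test.webm

# List the VP9 decoders available in this FFmpeg build:
ffmpeg -decoders | grep vp9
```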
_______________________________________________
Libav-user mailing list
Libav-user@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/libav-user

To unsubscribe, visit link above, or email
libav-user-requ...@ffmpeg.org with subject "unsubscribe".