On Thu, Aug 20, 2009 at 08:31:43AM +1000, Torgeir Veimo wrote:
> In my experience, you just have to do deinterlacing in vdpau with
> interlaced content, even when displaying on an interlaced display. If
> you try to output interlaced material directly, you get ghosting since
> the weaved frames are copied to the progressive surface, and the
> output resolution might be different than the weave pattern.
The main reason why Nvidia chooses to always deinterlace, even when you
use an interlaced video timing, is not the scaling problem you mention.
That could eventually be solved (albeit not perfectly) by scaling both
fields independently. The main reason is that even with VDPAU there is
still no synchronization between the stream and the video timing, so
the current VDPAU implementation loses or doubles an arbitrary number
of fields. Nvidia's workaround is to deinterlace all content in every
case; as a result, these field losses are (mostly) not directly visible
to the human eye. Turning the deinterlacing off here would immediately
reveal the field sequence problems caused by the missing stream
synchronization.

Cheers
Thomas
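PS: To make the missing synchronization concrete, here is a rough
sketch (my own illustration, not VDR code) of a per-field render and
present path. It assumes the VDPAU entry points were already resolved
via VdpGetProcAddress and that the mixer, queue and surfaces were
created elsewhere; field_period is a made-up parameter. The point is
the last call: VdpPresentationQueueDisplay only takes an *earliest*
presentation time, so there is no way to lock a field to the display's
field cadence.

/* Sketch only: entry points assumed resolved via VdpGetProcAddress. */
#include <vdpau/vdpau.h>

extern VdpVideoMixerRender         *mixer_render;
extern VdpPresentationQueueGetTime *queue_get_time;
extern VdpPresentationQueueDisplay *queue_display;

void present_field(VdpVideoMixer mixer,
                   VdpPresentationQueue queue,
                   VdpVideoSurface field,   /* decoded picture */
                   VdpOutputSurface out,    /* RGB target surface */
                   int top_field,
                   VdpTime field_period)    /* e.g. 20000000 ns at 50i */
{
    VdpTime now;

    /* The mixer consumes one field; with deinterlacing features
     * enabled this is where Nvidia hides dropped/doubled fields. */
    mixer_render(mixer,
                 VDP_INVALID_HANDLE, NULL,  /* no background */
                 top_field ? VDP_VIDEO_MIXER_PICTURE_STRUCTURE_TOP_FIELD
                           : VDP_VIDEO_MIXER_PICTURE_STRUCTURE_BOTTOM_FIELD,
                 0, NULL,                   /* no past surfaces */
                 field,
                 0, NULL,                   /* no future surfaces */
                 NULL,                      /* full source rect */
                 out, NULL, NULL,           /* full destination */
                 0, NULL);                  /* no layers */

    /* The whole timing interface: an earliest presentation time.
     * Nothing says "show this on the next top field of the display",
     * so the stream clock and the display's field cadence drift
     * freely, and fields are lost or repeated. */
    queue_get_time(queue, &now);
    queue_display(queue, out, 0, 0, now + field_period);
}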