At 16:24 07/06/2008, Younes Manton wrote:
>Encoding doesn't make sense on a GPU to be honest with you. The
>optimal end-point for GPU processing is the screen, so decoding fits
>perfectly, but for encoding we have to make a round trip from CPU to
>GPU and back to CPU. I think that would offset most/all speed gains
>you might get from having the GPU do the encoding. No clue about CUDA,
>but I'm sure it doesn't get around the fundamental problem that the
>round trip is sub-optimal. Dedicated encoding hardware is a different
>story, but I get the impression that VAAPI is only intended for
>decoding.

Actually I suspect that encoding on a GPU does make some sense. True, 
you have to get the data on and off the card, which is an overhead. 
However, encoding is much more compute-intensive than decoding, so the 
cost of those transfers is proportionately smaller. The GPU is also 
well suited to some motion estimation schemes, because the hardware 
was designed with motion compensation in mind.
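
To make the motion estimation point concrete, here is a rough CUDA 
sketch (my own toy code, nothing to do with the actual Schroedinger 
sources, and all the names are made up): a brute-force SAD block match 
with one thread block per 16x16 macroblock and one thread per 
candidate vector, so every candidate in a +/-8 search window is 
evaluated in parallel.

/* Hypothetical sketch, not Schroedinger code: full-search block-matching
 * motion estimation in CUDA. One thread block per 16x16 macroblock, one
 * thread per candidate motion vector (17x17 = 289 threads per block). */

#define BLOCK 16              /* macroblock size */
#define RANGE 8               /* search range in pixels, +/- */
#define CANDS (2 * RANGE + 1) /* candidates per axis */

__global__ void sad_search(const unsigned char *cur, const unsigned char *ref,
                           int width, int height, int *best)
{
    int mbx = blockIdx.x * BLOCK;        /* top-left of this macroblock */
    int mby = blockIdx.y * BLOCK;
    int dx  = (int)threadIdx.x - RANGE;  /* candidate motion vector */
    int dy  = (int)threadIdx.y - RANGE;

    unsigned int sad = 0;
    for (int y = 0; y < BLOCK; y++) {
        for (int x = 0; x < BLOCK; x++) {
            int cx = mbx + x, cy = mby + y;
            int rx = cx + dx, ry = cy + dy;
            rx = min(max(rx, 0), width - 1);   /* clamp at frame edges */
            ry = min(max(ry, 0), height - 1);
            sad += abs((int)cur[cy * width + cx] - (int)ref[ry * width + rx]);
        }
    }

    /* Pack (SAD, vector) into one int so a single atomicMin keeps the best
     * candidate per macroblock; the host pre-fills best[] with INT_MAX. */
    int packed = (int)((sad << 10) | (threadIdx.y << 5) | threadIdx.x);
    atomicMin(&best[blockIdx.y * gridDim.x + blockIdx.x], packed);
}

You would launch it with one block per macroblock, dim3(CANDS, CANDS) 
threads per block, and best[] pre-filled with INT_MAX, then unpack the 
winning vectors on the host. The per-block arithmetic is 
embarrassingly parallel, which is exactly the sort of work the 
hardware is built for; the only serial step is the small atomicMin 
reduction at the end.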

So don't abandon the idea of encoding on the GPU. On the other hand, 
decoding is probably where the action is at the moment.

Tim 

