Jan Ciger wrote:
> It is described in this post (that you have mentioned too):
> https://developer.oculus.com/blog/asynchronous-timewarp-examined/
> It was one of the reasons they had to go to Nvidia (the other being
> the direct mode support), because it is not possible to implement it
> without driver support.


I feel that you are misunderstanding what I am trying to say, so let me try to 
rephrase it.

First of all, there are two types of warping that can be performed:

1. Rotational - A pure 2D transformation; only a color texture is needed for this.

2. Positional - A 3D re-projection where every pixel has to be transformed with 
its depth taken into account. This requires a color texture, a depth texture and 
the projection matrices that were used; a rough sketch of both follows below.
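
To make the difference concrete, here is a minimal per-pixel sketch in C++ using 
GLM for the math (in practice this would live in a fragment shader). The function 
names, parameters and matrix conventions are my own assumptions for illustration, 
not taken from any SDK:

// Minimal sketch of the two warp types, per pixel. Hypothetical names,
// GL-style [0,1] texture coordinates and depth, GLM for the vector math.
#include <glm/glm.hpp>

// 1. Rotational: treat every pixel as a direction (point at infinity) and
//    re-project it with only the change in head orientation. No depth needed.
glm::vec2 rotationalWarpUV(glm::vec2 uv,
                           const glm::mat4& invOldProj,     // inverse of the projection used for rendering
                           const glm::mat4& deltaRotation,  // newView * inverse(oldView), rotation only
                           const glm::mat4& newProj)
{
    glm::vec4 ray    = invOldProj * glm::vec4(uv * 2.0f - 1.0f, 1.0f, 1.0f);        // view-space ray
    glm::vec4 warped = newProj * (deltaRotation * glm::vec4(glm::vec3(ray), 0.0f)); // w=0: direction only
    glm::vec2 ndc    = glm::vec2(warped) / warped.w;
    return ndc * 0.5f + 0.5f;                                // back to texture coordinates
}

// 2. Positional: reconstruct the pixel's world position from the depth texture,
//    then re-project it with the new view/projection. Needs color + depth + matrices.
glm::vec2 positionalWarpUV(glm::vec2 uv, float depth,       // depth sampled from the depth texture
                           const glm::mat4& invOldViewProj,
                           const glm::mat4& newViewProj)
{
    glm::vec4 clip   = glm::vec4(glm::vec3(uv, depth) * 2.0f - 1.0f, 1.0f);
    glm::vec4 world  = invOldViewProj * clip;
    world /= world.w;                                        // world-space position of the pixel
    glm::vec4 warped = newViewProj * world;
    glm::vec2 ndc    = glm::vec2(warped) / warped.w;
    return ndc * 0.5f + 0.5f;
}

Note that the rotational case never touches depth, which is why a single color 
texture is enough for it.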

Both techniques can be used either in sync with rendering (as was done with 
rotational timewarp in SDK 0.5 and earlier) or in a separate thread. The benefit 
of having it in a separate thread is that we can handle the case where rendering 
is not finished when the HMD requires a new image, but with the drawback that it 
requires low-level support in the OS/driver. 
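
Very roughly, the asynchronous variant boils down to a warp loop on its own 
thread that, at every HMD vsync, grabs the most recently completed frame and 
re-warps it with the current head pose, whether or not the renderer has finished 
a new one. Below is a purely illustrative sketch; all names are hypothetical, and 
the parts that actually require OS/driver support (direct mode, GPU preemption, 
presenting the warped image) are reduced to stubs:

// Hypothetical sketch of the asynchronous-timewarp thread structure.
#include <atomic>
#include <chrono>
#include <thread>

struct Frame {
    int id = 0;   // stand-in for the color/depth textures and the render pose
};

std::atomic<Frame*> latestCompletedFrame{nullptr};
std::atomic<bool>   running{true};

// Stubs for the vendor/driver dependent parts.
Frame* renderSceneIntoNewFrame() { static Frame f; ++f.id;
    std::this_thread::sleep_for(std::chrono::milliseconds(20)); return &f; }
void   waitForNextVSync() { std::this_thread::sleep_for(std::chrono::milliseconds(11)); } // ~90 Hz
void   presentWarpedFrame(const Frame&) {} // stub: would warp with the current pose and present to the HMD

void renderThread()
{
    while (running) {
        Frame* f = renderSceneIntoNewFrame();   // may take longer than one HMD refresh
        latestCompletedFrame.store(f, std::memory_order_release);
    }
}

void asyncWarpThread()
{
    while (running) {
        waitForNextVSync();                     // runs locked to the HMD refresh rate
        Frame* f = latestCompletedFrame.load(std::memory_order_acquire);
        if (f)
            presentWarpedFrame(*f);             // re-use the last finished frame if no new one is ready
    }
}

int main()
{
    std::thread render(renderThread), warp(asyncWarpThread);
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    render.join();
    warp.join();
}

The key point is the decoupling: the warp thread keeps running at the display 
rate no matter how long rendering takes, and doing that presentation step from a 
normal OpenGL application is exactly where the OS/driver support comes in.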

I concur with your conclusion that it is highly likely that Oculus are in fact 
using the asynchronous solution in SDK 0.6 and 0.7, BUT they have not stated 
officially that they are using it. In theory the positional re-projection could 
also be done synchronously with rendering (although that seems unlikely). 

/Björn

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=65069#65069




