>> The transformation into screen coordinates is of little issue. Ideally, you'd
>> want applications using multi-touch stuff to be aware of the events anyway,
>> in which case you'd just use the device coordinate space.
> What if, for example, you have a camera-based input device with a
> fisheye lens? I can't imagine that every frontend should do the radial
> undistortion itself...

True, but an app might want that information, for example to correct
for chromatic aberration.  It's always useful to be able to get at the
raw data when wanted, while having access to the transformed data 'by
default'.
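
As a rough illustration (not actual X server or driver code), here's a
minimal sketch of what I mean: the raw device coordinates get run through
a simple radial undistortion and scaled into screen space, but the raw
values stay available alongside the corrected ones.  The single-parameter
division model, the distortion coefficient and the device/screen sizes
below are all made-up assumptions, just to show the shape of it.

/* gcc -o undistort undistort.c */
#include <stdio.h>

struct touch_point {
    double raw_x, raw_y;        /* device coordinates, as reported */
    double x, y;                /* undistorted, in screen coordinates */
};

/* Undistort about the optical centre (cx, cy): r_u = r_d / (1 + k * r_d^2). */
static void undistort(double k, double cx, double cy,
                      double raw_x, double raw_y,
                      double *out_x, double *out_y)
{
    double dx = raw_x - cx;
    double dy = raw_y - cy;
    double r2 = dx * dx + dy * dy;
    double scale = 1.0 / (1.0 + k * r2);

    *out_x = cx + dx * scale;
    *out_y = cy + dy * scale;
}

int main(void)
{
    /* Assumed device and screen geometry, purely for the example. */
    const double dev_w = 640.0, dev_h = 480.0;
    const double screen_w = 1920.0, screen_h = 1080.0;
    const double k = 1e-6;      /* illustrative distortion coefficient */

    struct touch_point p = { .raw_x = 600.0, .raw_y = 50.0 };

    double ux, uy;
    undistort(k, dev_w / 2.0, dev_h / 2.0, p.raw_x, p.raw_y, &ux, &uy);

    /* Transform the corrected point into screen coordinates... */
    p.x = ux * screen_w / dev_w;
    p.y = uy * screen_h / dev_h;

    /* ...but keep the raw data around for clients that want it. */
    printf("raw:    %.1f %.1f\n", p.raw_x, p.raw_y);
    printf("screen: %.1f %.1f\n", p.x, p.y);
    return 0;
}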

John
_______________________________________________
xorg mailing list
xorg@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/xorg