On 3/1/2010 6:09 PM, Bradley T. Hughes wrote:
> On 03/01/2010 03:14 PM, ext Daniel Stone wrote:
>> And from the NSTouch (OS X) class documentation:
>>
>> Touches do not have a corresponding screen location. The first touch
>> of a touch collection is latched to the view underlying the cursor
>> using the same hit detection as mouse events. Additional touches on
>> the same device are also latched to the same view as any other
>> touching touches. A touch remains latched to its view until the
>> touch has either ended or is cancelled.

> Very quickly, since I forgot to mention it in my original reply:
>
> Bear in mind that this documentation is for OS X, and current Apple
> hardware only has multi-touch support via track-pads. There is no
> touch-screen hardware running Mac OS X today; Apple's touch screens
> ship on the iPhone, iPod touch, and the upcoming iPad. Those devices
> use a different API, UITouch instead of NSTouch, which actually does
> give on-screen locations (iirc).
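
Right, UITouch reports positions on screen directly. As a rough sketch
only (written in modern Swift/UIKit spellings, which postdate this
thread; class and view names are illustrative), the per-touch location
comes straight out of the event:

import UIKit

class TouchDemoView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            // UITouch gives absolute positions; no normalization needed.
            let inView = touch.location(in: self)   // view coordinates
            let inWindow = touch.location(in: nil)  // window coordinates
            print("touch at \(inView) in view, \(inWindow) in window")
        }
    }
}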

Note that NSTouch provides information about the device, in particular
deviceSize. Coupled with normalizedPosition, this gives us an on-screen
location.
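
As a sketch of reading those two properties (again in modern
Swift/AppKit spellings, purely illustrative), scaling normalizedPosition
by deviceSize yields a position on the touch device's surface:

import AppKit

class TrackpadView: NSView {
    // Assumes the view has opted in to touch events, e.g. via
    // allowedTouchTypes = [.indirect] (acceptsTouchEvents in older APIs).
    override func touchesBegan(with event: NSEvent) {
        for touch in event.touches(matching: .began, in: self) {
            // normalizedPosition is in [0, 1] x [0, 1], origin at the
            // bottom-left; deviceSize is the device's size in points.
            let p = touch.normalizedPosition
            let s = touch.deviceSize
            print("touch at (\(p.x * s.width), \(p.y * s.height)) on the device")
        }
    }
}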

Thanks,

Artem

> The multi-foci discussions really only apply to multi-touch capable
> touch screens, not to laptop track-pads or external multi-touch capable
> tablets like some (all?) of the Wacom Bamboo tablets.

_______________________________________________
xorg-devel mailing list
xorg-devel@lists.x.org
http://lists.x.org/mailman/listinfo/xorg-devel
