So what you are saying is that something like xscribble becomes central
to input in general on handheld devices: it takes relatively raw input
and then tells X what to do when it needs to. Building it into the
core X server seems wrong to me; by definition, we won't get it right.
I think we should regard it as we would a window manager: a separate
application providing some of the UI.
The issue I see is how to arbitrate among more than one application that
might want/need the high resolution digitizer device... We'll certainly
be running something like scribble continuously, but other apps may need
more than what the X core protocol gives them. If the XInput stuff can
adequately handle this situation (other applications in addition to
the scribble app), then we have a rational course of action.
I guess it is time to go look at the Input extension and see if it can
deal with digitizers adequately in the general case and whether it has
any grab/replay notion. My book is at home; I'll try to remember to get
a copy and stare at it some tomorrow.
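The grab/queue/replay arbitration Keith sketches below could be modeled
roughly like this. This is only an illustrative sketch: every name here
(PenArbiter, RawEvent, and both callbacks) is hypothetical, not an actual
X or XInput API, and the real decision logic (gesture classification,
timeouts, ink echo) is elided:

```python
# Sketch of the grab/queue/replay idea: on pen-down the server diverts
# raw digitizer events to an agent and queues them; the agent later
# decides whether to replay the queue as core pointer events or hand
# the whole stream to a handwriting recognizer (HWR).

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class RawEvent:
    """One raw digitizer sample (hypothetical representation)."""
    x: int
    y: int
    pressure: int


@dataclass
class PenArbiter:
    """Holds raw pen events server-side, like a core grab, until the
    agent classifies the gesture."""
    to_pointer: Callable[[RawEvent], None]        # replay path: core pointer queue
    to_hwr: Callable[[List[RawEvent]], None]      # recognition path
    queue: List[RawEvent] = field(default_factory=list)

    def pen_event(self, ev: RawEvent) -> None:
        # While the agent is undecided, events are queued, not delivered.
        self.queue.append(ev)

    def decide(self, is_text_input: bool) -> None:
        # Agent decision point: forward the entire queued stream either
        # to the pointer input queue or along for HWR, then drop the queue.
        if is_text_input:
            self.to_hwr(list(self.queue))
        else:
            for ev in self.queue:
                self.to_pointer(ev)
        self.queue.clear()
```

The point of queuing a copy of the stream rather than delivering
immediately is that nothing is lost whichever way the agent decides,
which is what makes the late pointer-vs-text choice possible at all.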
- Jim
> Sender: [EMAIL PROTECTED]
> From: Keith Packard <[EMAIL PROTECTED]>
> Date: Wed, 12 Jul 2000 11:05:44 -0700
> To: [EMAIL PROTECTED] (Jim Gettys)
> Cc: Mike Touloumtzis <[EMAIL PROTECTED]>, Charlie Flynn <[EMAIL PROTECTED]>,
> [EMAIL PROTECTED], [EMAIL PROTECTED]
> Subject: Re: [Handhelds] Re: Touch Screen Driver Generic Interface for all
> Linux SA11x0 platforms.
> -----
> > There are two interfaces in X that have to be built: one in which the touch
> > screen is being used as the primary pointer, and then some other way to
> > deal with it in all its glory, almost certainly via the X Input extension.
>
> I've been thinking more about input recently, and it's less than clear
> that the input extension is sufficient for pen-based devices. Because the
> input device is so limited, I believe that a significant amount of policy
> will be required to distinguish between gestures intended as text input
> and gestures intended for pointer input.
>
> In addition, we need support for a digital ink "overlay". One of the
> important HWR lessons I learned from the Itsy was that Graffiti with
> feedback is much more reliable than without.
>
> It's almost as if a separate app should be reading the touch screen,
> selectively synthesizing pointer and keyboard events, and drawing digital
> ink on the screen to help make HWR usable. Perhaps a portion of this app
> could be done within the X server using a grab/queuing mechanism similar
> to the core grabs.
>
> Here's an idea -- have pen-down events cause raw input to be directed at
> the external agent and simultaneously queued within the server. The agent
> could then monitor the gesture and at some point decide whether to echo
> with digital ink, and finally decide whether to forward the entire queued
> event stream on to the X pointer input queue or send it along for HWR. The
> server could echo the ink itself or we could let the agent generate X
> protocol for that.
>
> [EMAIL PROTECTED] XFree86 Core Team SuSE, Inc.
>
>
> _______________________________________________
> Handhelds mailing list
> [EMAIL PROTECTED]
> http://handhelds.org/mailman/listinfo/handhelds
--
Jim Gettys
Technology and Corporate Development
Compaq Computer Corporation
[EMAIL PROTECTED]