On Fri, Aug 3, 2012 at 6:12 PM, Charles Pritchard <ch...@jumis.com> wrote:

>  WebGL vectors map well to brush traces.
> One would process a trace group into an int or float array, then upload
> that to WebGL for rendering.
> Or one might use that array to render via Canvas 2D.
>
It's a little more complex than that, but as long as you can get the raw
data into an array buffer and process it in JS, it's good.
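For concreteness, a minimal sketch of that path in TypeScript (the
[x, y, pressure] sample shape is just my assumption, not anything from the
ink spec):

// Minimal sketch: pack [x, y, pressure] samples into a typed array and
// hand it to WebGL. The sample shape is an assumption for illustration.
type InkSample = { x: number; y: number; pressure: number };

function uploadTrace(gl: WebGLRenderingContext,
                     samples: InkSample[]): WebGLBuffer {
  const data = new Float32Array(samples.length * 3);
  samples.forEach((s, i) => {
    data[i * 3] = s.x;
    data[i * 3 + 1] = s.y;
    data[i * 3 + 2] = s.pressure;
  });
  const buffer = gl.createBuffer()!;
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);
  return buffer; // draw it as a line strip, point sprites, etc.
}

// The same array could just as well drive a Canvas 2D path instead.
function drawTrace2D(ctx: CanvasRenderingContext2D,
                     samples: InkSample[]): void {
  ctx.beginPath();
  samples.forEach((s, i) =>
    i === 0 ? ctx.moveTo(s.x, s.y) : ctx.lineTo(s.x, s.y));
  ctx.stroke();
}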


> The issue is that we have no mechanism to actually capture high resolution
> ink data.
> They've also hit sampling issues: initial drafts and implementations
> worked with polling.
> It seems that event callbacks are more appropriate.
>
> My main issue with plugins and ink, and serial polling, is that I lose
> data when I render: I can either have high-quality input with poor
> immediate rendering, or high-quality rendering while losing a lot of
> accuracy in the data stream.
>
It's a well-known problem for game applications; I'm a bit surprised this
wasn't done right from the start.

1) The host OS APIs support a hodgepodge of polling/queuing/selecting/event
mechanisms. Polling APIs usually require a very high polling frequency.
2) For this reason, game-related frameworks like SDL, pygame, pyglet etc.
abstract the underlying mechanism and fill event queues.
3) At user-chosen intervals the queued events are dispatched, either a) by
letting the user call a "get all events" function which clears the queue,
with the user responsible for working through it (the SDL/pygame model), or
b) by the framework dispatching events individually at regular intervals
(pyglet). A toy sketch of both models follows below.
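
To make the two flavors concrete, here's a toy sketch in TypeScript (the
names and shapes are mine, not any real framework's API):

// Toy illustration of the two dispatch models described above.
interface InputEventRecord { device: string; time: number; payload: unknown; }

class InputQueue {
  private queue: InputEventRecord[] = [];

  push(ev: InputEventRecord): void {
    this.queue.push(ev);
  }

  // Model a), SDL/pygame style: the caller drains the queue and is
  // responsible for working through every pending event itself.
  getEvents(): InputEventRecord[] {
    const pending = this.queue;
    this.queue = [];
    return pending;
  }

  // Model b), pyglet style: the framework walks the queue itself and
  // dispatches each event to a registered handler at intervals it chooses.
  dispatch(handler: (ev: InputEventRecord) => void): void {
    for (const ev of this.getEvents()) handler(ev);
  }
}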

In working with framework-dispatched events I hit some snags (such as
events that belong together not being correlated by the framework), which
renders devices the framework hasn't been coded for, tablets for instance,
unusable. I'd definitely prefer the flavor that lets the user explicitly
clear the queue and work through the events. It also makes it easier to
synchronize event processing with rendering and behavioral logic, and it
lets the ecosystem of API users come up with their own solutions for
correlating events for their favorite devices.
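
Using the toy queue from the sketch above, that explicit-drain flavor tied
to the render loop could look roughly like this (again, just an
illustration; the grouping by device id is my assumption):

// Each frame drains whatever accumulated, groups events by device so
// samples that belong together stay together, then renders.
function frame(queue: InputQueue, render: () => void): void {
  const byDevice = new Map<string, InputEventRecord[]>();
  for (const ev of queue.getEvents()) {
    const bucket = byDevice.get(ev.device) ?? [];
    bucket.push(ev);
    byDevice.set(ev.device, bucket);
  }
  // Behavioral logic walks byDevice here, e.g. appending tablet samples
  // to the current stroke, before anything is drawn.
  render();
  requestAnimationFrame(() => frame(queue, render));
}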

So I believe that with the gamepad API moving to this model (and by the
sound of it, it is), we're good on that front. There's still the question
of how to switch "modes", such as mouse emulation or capture to an area.
