On Wed, Oct 2, 2013 at 8:21 PM, Carsten Haitzler <ras...@rasterman.com> wrote:
> On Wed, 2 Oct 2013 13:59:47 -0700 Jason Gerecke <killert...@gmail.com> said:
>
> 3 things here.
>
> 1. for general device queries (get name, description, device classes etc.)
> there is already an evas_device api. right now though nothing populates the
> evas device information from lower levels (xi/xi2, etc. etc.), so it's
> unused. but it's there.
Would clients be expected to call this just once prior to receiving
input, or is there some way to notify them that a change has occurred?
I ask because the type information (e.g. pen tip vs. eraser or
airbrush vs. inking pen) is likely to change fairly regularly. We
would want to be sure that clients can get their hands on that data,
whether that means notifying them that they need to refresh their
understanding or sending the type information in each event.

> 2. i've talked with some people about this and the general take was that we
> need to add new pen events (ala multi) because they need to handle more than
> multi: e.g.:
>
>   * button number on pen pressed/released SEPARATELY from pen touching.
>   * pen touch vs eraser touch (ie indicate which "end" of N ends a pen presses
> down).
>   * some pens support a hover ability - so that means motion events without
> down/up begin/end points like multi, BUT we would need/want to report distance
> as a value during this hover.
>   * possibly other custom inputs on the pens themselves that are not accounted
> for.
>
> we COULD extend multi events and add fields, but i think you are mistaking
> habitual over-engineering in efl for intent for these to be pen events. a lot
> of stuff gets extended beyond its initial scope "in case". eg in case our
> touch surface can report size, pressure and angle of your finger... :) also
> it'd cause issues with existing multi event usage.
>
As I said, the multi event is interesting for its extra fields, but
might not be a good semantic match. A pen isn't anything more than a
fancy mouse: it has motion without a touch event and a set of buttons
to worry about. The only thing that makes it special is all those
extra axes, and most of those could be imagined on a
sufficiently-advanced mouse. Wacom actually makes such a "mouse" for
their Intuos tablets which reports things like height above the pad,
rotation about the z-axis, and the absolute position of a
spring-loaded fingerwheel.

The way most APIs handle pen input is to just pass the data alongside
the X/Y position you'd expect for a mouse. Throw in an enum or
function to let clients distinguish pen tip from eraser and you're
set.

> 3. if you want to talk about the extra buttons and what not that you find on
> pen tablets (i have a bamboo sitting on my desk at home - i know cheapo little
> pen tablet, but its indicative at a small scale of a lot of them), i believe
> these should just be "keys" like any keyboard. i don't think these belong in
> any specialized event system. as with #1 we CAN attach a special device handle
> to them though so you can differentiate where the key comes from... :)
>
Most implementations send these as "buttons" rather than "keys". In
the case of evdev, the Intuos and Cintiq tablets will send the
"meaningless" BTN_{0,1,2,3,...} buttons. For a Bamboo though, you'll
find the mouse buttons BTN_{LEFT,RIGHT,FORWARD,BACK} instead. In the X
driver these are all mapped to mouse buttons that clients can easily
understand. The first few mouse buttons usually have attached
semantics (e.g. button 1 => left click => primary action) which makes
them non-ideal, but "keys" don't fare much better since you'd need
clients to properly understand a non-standard keyboard layout. These
buttons are always a tricky issue.

> adding a new event type isn't too hard in evas - you add the structures, enums
> (at the end of the current list of callbacks), add the feed apis to feed the
> event in, add the appropriate routing (the same as we do for multi and mouse),
> and then that's done.
>
> then there is edje - likely you want to expose the pen events as signals like
> mouse. maybe not? optional really.
>
> and the important bit then ecore-evas + ecore-input + ecore-input-evas needs
> to gather the events from the next layer down and call the feed calls. the
> layer below could be evdev/console input devices (fb), x11 (xi/xi2 etc.), or
> wl.. or win32.. or anything... and then for these ecore-x, ecore-fb, ecore-wl
> etc. needs to gather the appropriate events from the display system/event
> input system below that. one thing missing in ecore-evas is what i describe
> in #1 - querying all available devices, monitoring for new devices being
> plugged/unplugged and appropriately populating/managing the evas_device tree
> per canvas. :) it's something we need to get around to doing some time... :)
>
Sounds like there'll be a bit of code spelunking in my future. I'll be
happy once I wrap my head around how each of these components (edje,
ecore, evas, etc.) fits together.

Jason
---
Now instead of four in the eights place /
you’ve got three, ‘Cause you added one  /
(That is to say, eight) to the two,     /
But you can’t take seven from three,    /
So you look at the sixty-fours....

_______________________________________________
enlightenment-devel mailing list
enlightenment-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/enlightenment-devel