Hello,

There's a blueprint covering sensor-related tasks on the Nexus 7:

https://blueprints.launchpad.net/ubuntu/+spec/desktop-r-arm-input-sensor-drivers

I'd like your input on where in our stack such features are best exposed
to other software and to users.

There's a small test app written in Go that runs on the Nexus 7; it
rotates the screen and sets the backlight brightness based on tablet
orientation and ambient light, respectively.

https://code.launchpad.net/~jani/+junk/nexus

On the Nexus 7: install golang, run "go build nexus.go", then
"sudo ./nexus -v"; alternatively, grab the prebuilt static binary from
http://people.canonical.com/~jani/a
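
For illustration, the essence of such an app is a poll loop like the
sketch below. This is not the actual nexus.go, just a stripped-down
illustration; the sysfs paths are placeholders and the real Nexus 7
nodes are device specific:

package main

import (
	"fmt"
	"io/ioutil"
	"os"
	"strconv"
	"strings"
	"time"
)

const (
	// Placeholder sysfs nodes; the real Nexus 7 attributes differ.
	luxPath       = "/sys/bus/iio/devices/iio:device0/in_illuminance_input"
	backlightPath = "/sys/class/backlight/backlight/brightness"
	maxBrightness = 255
)

// readInt reads and parses a single integer sysfs attribute.
func readInt(path string) (int, error) {
	b, err := ioutil.ReadFile(path)
	if err != nil {
		return 0, err
	}
	return strconv.Atoi(strings.TrimSpace(string(b)))
}

func main() {
	for {
		lux, err := readInt(luxPath)
		if err != nil {
			fmt.Fprintln(os.Stderr, "reading sensor:", err)
			os.Exit(1)
		}
		// Naive 1:1 lux-to-brightness mapping, clamped; a real
		// implementation would apply tunable thresholds instead.
		level := lux
		if level > maxBrightness {
			level = maxBrightness
		}
		err = ioutil.WriteFile(backlightPath, []byte(strconv.Itoa(level)), 0644)
		if err != nil {
			fmt.Fprintln(os.Stderr, "setting backlight:", err)
			os.Exit(1)
		}
		time.Sleep(2 * time.Second)
	}
}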

However, to avoid adding yet another daemon with a non-negligible memory
footprint (~1.8 MB), I think the behavior and the various thresholds
should be configurable from the Control Center GUI, with the
functionality itself folded into one or more of the existing C daemons.
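
To make "configurable thresholds" concrete, here's a rough sketch (in
Go for brevity, though the daemon-side code would be C) of a
lux-to-brightness table plus a hysteresis margin so sensor jitter
doesn't make the backlight flicker. The table and numbers are made up;
a host daemon could load such values from GSettings:

package main

import "fmt"

// band maps an upper lux bound to a backlight value; values invented.
type band struct {
	maxLux     int
	brightness int
}

var bands = []band{{50, 30}, {200, 120}, {1000, 200}, {1 << 30, 255}}

// pickBrightness returns the backlight value for a lux reading, but
// keeps the current value when the change is within margin, so small
// sensor jitter doesn't toggle the backlight back and forth.
func pickBrightness(lux, current, margin int) int {
	target := current
	for _, b := range bands {
		if lux <= b.maxLux {
			target = b.brightness
			break
		}
	}
	if target-current <= margin && current-target <= margin {
		return current
	}
	return target
}

func main() {
	cur := 120
	for _, lux := range []int{180, 210, 190, 900, 2000} {
		cur = pickBrightness(lux, cur, 40)
		fmt.Printf("lux=%d -> brightness=%d\n", lux, cur)
	}
}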
AFAIK there are no generic kernel or userland APIs for such sensors,
only device-specific sysfs knobs, so this may be a good opportunity to
think about how to expose such hardware features in a device-independent
manner in the future.
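
As a strawman for the device-independent angle, an interface like the
one below (all names invented for illustration; nothing like this exists
in the stack today) would let per-device sysfs details live in small
backends while the rest of the stack codes against a single API:

package main

import (
	"fmt"
	"io/ioutil"
	"strconv"
	"strings"
)

// AmbientLightSensor is an invented abstraction; each device would get
// a small backend that knows its own sysfs knobs.
type AmbientLightSensor interface {
	Lux() (int, error)
}

// sysfsALS reads a single sysfs attribute; the path would come from a
// per-device quirks table rather than being hardcoded.
type sysfsALS struct{ path string }

func (s sysfsALS) Lux() (int, error) {
	b, err := ioutil.ReadFile(s.path)
	if err != nil {
		return 0, err
	}
	return strconv.Atoi(strings.TrimSpace(string(b)))
}

func main() {
	// Placeholder path; the real Nexus 7 node is device specific.
	var als AmbientLightSensor = sysfsALS{path: "/sys/devices/.../lux"}
	if lux, err := als.Lux(); err == nil {
		fmt.Println("ambient light:", lux, "lux")
	} else {
		fmt.Println("read failed:", err)
	}
}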

Thoughts?

thanks
Jani


