Hi Vipul,

On 04/02/17 01:08, Vipul Rahane wrote:
Hello All,

I will be taking over SensorAPI stuff from Sterling.

Kevin:
The TSL2561 and LSM303 drivers that you have implemented are pretty good, and 
I used both of them. I saw the values coming in through the SensorAPI. I would 
also like to learn more about sensors from you or anybody else more 
experienced with them; I am fairly new to this area and have some experience, 
but not enough for a general understanding.
Happy to help where I can on my side. I'm currently on vacation, but can look into this more seriously when I'm back next week.
I am currently analyzing the API while looking at the LSM and TSL drivers 
simultaneously, to figure out whether anything is missing from the API. I had 
a few questions:

About LSM303:

* Are the +/-1g values orientation values, or is that actual acceleration?
+/-1g is actually 'gravity' (1g being 1 'unit' of standard Earth gravity), and you can determine orientation based on the influence of gravity which is being detected by the accelerometer. If you are getting +1g (or ~9.80665 m/s²) on the X axis, it means X is pointing 'straight up' since you are seeing the 'full' effect of gravity on that axis. If you are seeing -1g on X it means the device is 'flipped' 180° since you are seeing the negative value (meaning X is pointing 'straight down'). Depending on which axes you see values on, and the relationship between them, you can infer basic orientation on two axes with only an accelerometer (normally roll and pitch).

If you add a second sensor, typically a magnetometer, you can add a third axis for roll/pitch/heading. Adding a third sensor (a gyroscope), you can detect full 360° absolute orientation.

So, you are using an accelerometer and measuring acceleration, but in the case of orientation you normally aren't measuring 'acceleration' in the sense of movement, you're measuring gravity which is detected as acceleration since it presents itself as a constant 'pushing' force with a known value and direction, even though the device is potentially immobile.
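To make that concrete, here is a rough sketch (my own illustration, untested, and the axis conventions are an assumption that depends on how the part is mounted) of deriving roll and pitch from raw accel readings:

    #include <math.h>

    /*
     * Sketch only: derive roll/pitch (radians) from accel readings in g,
     * assuming the usual right-handed X/Y/Z convention. Axis names and
     * signs depend on the actual sensor orientation on the board.
     */
    static void
    accel_to_roll_pitch(float ax, float ay, float az,
                        float *roll, float *pitch)
    {
        /* Roll: rotation about X, from the Y/Z components of gravity. */
        *roll = atan2f(ay, az);
        /* Pitch: rotation about Y; using the Y/Z magnitude keeps the
         * estimate stable as pitch approaches +/-90 degrees. */
        *pitch = atan2f(-ax, sqrtf(ay * ay + az * az));
    }

Note that yaw/heading can't be recovered this way, since rotating around the gravity vector doesn't change what the accelerometer sees; that's where the magnetometer comes in.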

Does that help?
* Should we be separating out acceleration from orientation (maybe in 
get_interface()?)
I think just providing the raw acceleration values is good enough, and 'orientation' should actually be a higher level set of helper functions that read the raw accel/mag/gyro data and act accordingly. A new sensor type could also be implemented for this, but it should rely on the lower level raw sensor data types I think.

Separating gravity from acceleration is also problematic. For now, I'd keep it simple and just send out the sensor data and users can interpret that however they want themselves.
* Zero-g level offset (TyOff) is the deviation of the actual output signal of 
a sensor from the ideal output signal. Should we be thinking of calibration 
for sensors in the SensorAPI? I think in this case the sensor comes with 
factory-calibrated values which can be used by a developer. But those will 
change once the part is mounted on a PCB, hence the concern.
For orientation you will ALWAYS need to calibrate the sensors to get meaningful data (especially the mag, but also the gyro for the zero-offset level), and this will always have to be done in the final enclosure and environment. So yes, we should have functions to store and apply calibration coefficients for the accel/mag/gyro data types (plus perhaps some other sensor types, but definitely those three). I think these should be stored in config memory somewhere, but we need helper functions to assign and apply them, defaulting to '0' values.
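As a starting point, something along these lines (field names and the config-storage hookup are just assumptions for discussion, not an existing API):

    /*
     * Sketch: per-axis calibration coefficients for a 3-axis sensor.
     * Offsets default to 0 and scales to 1 so uncalibrated devices
     * pass data through unchanged.
     */
    struct sensor_cal {
        float offset[3];    /* zero-offset per axis */
        float scale[3];     /* per-axis scale factor */
    };

    /* Apply calibration in place to one raw 3-axis sample. */
    static void
    sensor_cal_apply(const struct sensor_cal *cal, float sample[3])
    {
        int i;

        for (i = 0; i < 3; i++) {
            sample[i] = (sample[i] - cal->offset[i]) * cal->scale[i];
        }
    }

The coefficients themselves could then be persisted via whatever config mechanism we settle on.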
* Operating modes of sensors: Low Power Mode vs. Normal Mode for the 
accelerometer, and Sleep Mode vs. Single Conversion Mode vs. Continuous 
Conversion Mode. Modes should also be a part of our SensorAPI, I believe.
I think having built-in power mode control at the API level is extremely valuable (and often neglected), but you also want to keep things manageable, so this might be pushed off to a second version? I do think we should keep power modes in mind with any API-level decisions, though. Sterling had some thoughts on this a while back and made an initial proposal that I found very sensible.
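Just to illustrate the kind of knob I mean (the names here are invented for discussion, not a proposal of an actual API):

    /*
     * Sketch: a generic power-mode selector at the sensor API level.
     * Individual drivers would map these onto whatever the silicon
     * actually supports.
     */
    typedef enum {
        SENSOR_PWR_OFF,       /* fully powered down */
        SENSOR_PWR_LOW,       /* reduced rate/resolution */
        SENSOR_PWR_NORMAL,    /* continuous conversion */
        SENSOR_PWR_ONESHOT    /* single conversion, then sleep */
    } sensor_pwr_mode_t;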
* Should filter mode selection also be part of the SensorAPI?
I think this should be a separate API, but also a key part of any system using sensor data. The key thing is to support compatible types between the sensor and filter APIs. float32 would be my lowest common denominator, especially since the Cortex M4F has excellent float32 performance (13 cycle divide, 3 cycle multiply off the top of my head ... which is as good as or better than fixed point), although int32_t is nice as a fallback. Float32 will be more painful on an M0, but I think most orientation systems will probably be implemented on an M4F, which has a lot more processing power per MHz for this kind of work.
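As a trivial example of what I mean by compatible types, a single-pole IIR low-pass filter in float32 (my own sketch, not an existing API) would consume sensor samples directly:

    /*
     * Sketch: single-pole IIR low-pass filter in float32, so the same
     * data type flows between the sensor and filter layers.
     */
    struct lpf_f32 {
        float alpha;    /* smoothing factor in (0, 1] */
        float state;    /* last filtered output */
    };

    static float
    lpf_f32_step(struct lpf_f32 *f, float sample)
    {
        f->state += f->alpha * (sample - f->state);
        return f->state;
    }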
* FIFO config and overrun interrupts, and their need in the SensorAPI?
Not sure here ... I think we'll see the need (or not) as we implement things.
* How are accelerometers and magnetometers generally used? In an 
interrupt-based fashion, where thresholds are specified and the values are 
then read in an ISR, or is it more of a polling-based architecture, where the 
sensors are sampled at a given rate?
I've always used them for orientation, OR for things like 'tap' detection, or perhaps to detect when a device is falling. Tap detection and falling would be interrupt based, where you set a threshold and trigger an interrupt when you start to 'fall' so you can react quickly and shut moving parts off. Orientation CAN be interrupt based, and that /may/ save some power depending on the sample rate, but it can also just be polling. Polling is probably more straightforward: just set up a timer to read at a fixed interval. No strong feelings on this on my side, though; it really depends on the use case.

As a minimum, I would start with timer based polling, and we can add interrupt support to drivers on an as-needed basis since every chip will have interrupts for different things, and there is already a good device config abstraction in the sensor API to provide a per-IC config struct.
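For timer-based polling in Mynewt, an os_callout on the default event queue would be the natural fit; roughly like this (read_sample() is a stub standing in for whatever the sensor API ends up exposing):

    #include "os/os.h"

    /* Sketch: poll a sensor at 10 Hz off the default event queue. */
    #define POLL_TICKS (OS_TICKS_PER_SEC / 10)

    static struct os_callout poll_callout;

    /* Placeholder: read and process one sample from the sensor. */
    static void
    read_sample(void)
    {
    }

    static void
    poll_timer_cb(struct os_event *ev)
    {
        read_sample();
        os_callout_reset(&poll_callout, POLL_TICKS);
    }

    static void
    poll_init(void)
    {
        os_callout_init(&poll_callout, os_eventq_dflt_get(),
                        poll_timer_cb, NULL);
        os_callout_reset(&poll_callout, POLL_TICKS);
    }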

About TSL2561:

* How are light sensors generally used? In an interrupt-based fashion, where 
thresholds are specified and the values are then read in an ISR, or is it 
more of a polling-based architecture, where the sensor is sampled at a given 
rate?
This is also hard to say since it depends on your use case. Most of our customers just poll, but I do like the interrupt based threshold option on this sensor which is why I added it to the Mynewt driver I implemented. I think where there is the development bandwidth, we should support both, though when I'm pressed for time I generally implement polling first since it's easier.
* The lux calculation function that is part of the Adafruit library does a 
great job of converting the values. The TAOS datasheet for the sensor has the 
same function, and it is recommended to use it without any changes. I have 
used it for converting the values, changing only the indentation to match 
Apache Mynewt’s coding style.
There may be some issues with the lux conversion code; I'm not sure off the top of my head, but the values can be fixed if there are any bugs. I've written a lot of light and color sensor drivers, and sometimes when we wrote them against very early silicon we didn't always have all the information from the manufacturer (device characterization wasn't complete at the time, etc.), but I think the TSL2561 should be good since it's a very old and popular sensor for us.
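For reference, the floating-point form of the datasheet's piecewise lux equation for the T/FN/CL package looks roughly like this, going from memory (the driver uses the fixed-point variant; double-check the coefficients against the TAOS datasheet before trusting them):

    #include <math.h>
    #include <stdint.h>

    /*
     * Sketch: TSL2561 lux approximation (T/FN/CL package), floating-
     * point form. ch0 = broadband channel, ch1 = IR channel.
     * Coefficients quoted from memory; verify against the datasheet.
     */
    static float
    tsl2561_calc_lux(uint16_t ch0, uint16_t ch1)
    {
        float ratio;

        if (ch0 == 0) {
            return 0.0f;
        }
        ratio = (float)ch1 / (float)ch0;

        if (ratio <= 0.50f) {
            return 0.0304f * ch0 - 0.062f * ch0 * powf(ratio, 1.4f);
        } else if (ratio <= 0.61f) {
            return 0.0224f * ch0 - 0.031f * ch1;
        } else if (ratio <= 0.80f) {
            return 0.0128f * ch0 - 0.0153f * ch1;
        } else if (ratio <= 1.30f) {
            return 0.00146f * ch0 - 0.00112f * ch1;
        }
        return 0.0f;    /* ratio > 1.30: out of expected range */
    }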
I might have some more questions with regard to the TSL2561.
As mentioned above, this is an older and very mature sensor so there is a lot of information out there, thankfully, and I know it quite well by now, which is why I always start with this one and the LSM303DLHC. They're both popular, useful, cheap and easy to work with.

So far it looks like the SensorAPI does a great job apart from the points above.
I like what is there so far. There are some gaps to fill in (DSP/filtering, composite sensor types like 'orientation' built on the lower-level raw sensor data, etc.), but it's very nice so far and moving things in a good, sustainable direction in my opinion!

Kevin
