On 14.03.2015, at 02:57 , Felix Baumann <[email protected]> wrote:
> Will they contribute/exchange their algorithms with Ichnaea to help Mozilla 
> improving MLS own accuracy further?
> I mean Ichnaea's current algorithms are pretty basic (it creates and 
> intersects bounding boxes).

We are starting this partnership by working on data exchange and all the 
technical details that this requires.
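
For context, the bounding-box approach mentioned above can be sketched roughly 
like this (illustrative Python, not Ichnaea's actual code; the coordinates, 
radii and the crude meters-to-degrees conversion are made up for the example):

```python
import math

def bbox(lat, lon, radius_m):
    # Crude meters-to-degrees conversion; good enough for a sketch.
    dlat = radius_m / 111_320.0
    dlon = radius_m / (111_320.0 * math.cos(math.radians(lat)))
    return (lat - dlat, lat + dlat, lon - dlon, lon + dlon)

def intersect(a, b):
    # Intersect two (south, north, west, east) boxes; None if disjoint.
    south, north = max(a[0], b[0]), min(a[1], b[1])
    west, east = max(a[2], b[2]), min(a[3], b[3])
    if south > north or west > east:
        return None
    return (south, north, west, east)

# One box per observed station, shrunk to the area they all cover.
boxes = [bbox(51.500, -0.100, 500), bbox(51.502, -0.098, 500)]
area = boxes[0]
for b in boxes[1:]:
    area = intersect(area, b)

# The position estimate is the center of the remaining box.
estimate = ((area[0] + area[1]) / 2, (area[2] + area[3]) / 2)
```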

For example, we need a defined API and data format to share the datasets with 
each other. I think this will grow into building up some shared knowledge of 
simple data validation rules, like the ones we established with the OpenCellID 
project (simple range checks on values, and identifying bad or special values, 
especially around cell identifiers). It might also extend into workarounds for 
Android problems or common hardware and radio driver mistakes. Maybe we can 
even establish some rules on data collection frequency on the client side. In 
short: anything about how to get good-quality data in an efficient manner.
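
To make the kind of checks I mean concrete, here is a minimal sketch 
(illustrative only, not Ichnaea's or OpenCellID's actual rules; the bounds 
follow the GSM identifier ranges, e.g. LAC 0 and 0xFFFF being reserved):

```python
def valid_gsm_cell(mcc, mnc, lac, cid):
    """Simple range checks for a GSM cell observation (sketch)."""
    # MCC is three digits; 0 is not a valid country code.
    if not (1 <= mcc <= 999):
        return False
    # MNC is two or three digits.
    if not (0 <= mnc <= 999):
        return False
    # LAC 0 and 0xFFFF are reserved values in GSM.
    if not (1 <= lac <= 0xFFFE):
        return False
    # CID 0 is a common bad/placeholder value; GSM cell IDs are 16-bit.
    if not (1 <= cid <= 0xFFFF):
        return False
    return True
```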

We are already implementing the same APIs for the geolocate / lookup case. My 
goal is to also establish a shared API for the geosubmit case, i.e. submitting 
learning data. If we get there, it should be possible for anyone to build a 
client-side app based on these APIs and let it talk to either of our services. 
If we can give out some additional rules and best practices for clients, 
that'd be an added bonus.
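
For the geolocate case, a lookup request looks roughly like this (field names 
follow the Google-compatible JSON format that MLS implements; the identifiers 
and coordinates below are made up for illustration):

```python
import json

# A geolocate-style lookup: the client reports the cells and wifi
# networks it currently sees, the service answers with a position.
request_body = {
    "cellTowers": [{
        "radioType": "gsm",
        "mobileCountryCode": 262,
        "mobileNetworkCode": 1,
        "locationAreaCode": 5126,
        "cellId": 16504,
    }],
    "wifiAccessPoints": [
        {"macAddress": "01:23:45:67:89:ab", "signalStrength": -51},
        {"macAddress": "01:23:45:67:89:cd"},
    ],
}
payload = json.dumps(request_body)

# A successful response carries an estimated position and accuracy
# radius in meters, shaped like:
# {"location": {"lat": 51.0, "lng": -0.1}, "accuracy": 1200.0}
```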

I think this is in itself an ambitious undertaking: trying to standardize the 
public APIs and data formats for basic geolocation services. Today each 
location service has its own APIs and there's no interoperability. If we could 
get us, Combain, OpenCellID, openbmap, radiocells and more commercial partners 
to all agree on the same set of basic APIs, that'd be a big win in my book.

The algorithmic side of how to process the data is something that I think will 
remain an area for projects and companies to distinguish themselves in. I see 
this as analogous to the openstreetmap ecosystem, where the base map data is 
shared. Companies distinguish themselves based on operational excellence and 
the cost reduction that potentially comes with it, algorithmic knowledge 
translating into quality of results, great customer service, or being a good 
partner for consulting services. And for mobile workloads there's a whole 
other area around battery and network efficiency, latency and intermittent / 
offline connectivity that is still seeing a lot of experimentation and 
innovation; something which doesn't yet seem ripe for standardization.

The geolocation space today is in a similar state as the mapping world was 
before openstreetmap came along. Owning the data and the coverage that comes 
with it is still seen as a competitive advantage by most of the players. I'd 
very much like to shift away from this and see innovation and competition move 
towards the upper layers.

To me this project succeeds if there's a viable ecosystem of interoperable 
geolocation services, some commercial, some not; and as a consequence users 
can choose the kind of service they want. MLS won't be the most accurate 
service and we won't be focusing on quality of results beyond basic validation 
checks, leaving this to partners like Combain with years of hard-won 
experience.

Sorry for going on a tangent here :)
Hanno
_______________________________________________
dev-geolocation mailing list
[email protected]
https://lists.mozilla.org/listinfo/dev-geolocation