Hi all,

This came up during an import discussion on talk-nl and I am curious
about your thoughts and ideas.

Below is an edited down version of a question I posted on the GIS
StackExchange[1]:

We currently deal with object versioning in a rudimentary way: each
object has an integer version number, and only the object with the
highest version is exposed in the live database. The database uses
optimistic locking, so users must manually resolve any conflicts that
occur when uploading contributions.
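
To make the current model concrete, here is a minimal Python sketch of
that optimistic-locking check (the names live_db, upload and Conflict
are invented for illustration, they are not the actual API):

class Conflict(Exception):
    """Raised when an edit is not based on the current live version."""

live_db = {}  # hypothetical store: object id -> (version, data)

def upload(obj_id, based_on_version, new_data):
    current_version, _ = live_db.get(obj_id, (0, None))
    if based_on_version != current_version:
        # The object changed since the contributor downloaded it;
        # the contributor has to merge and retry by hand.
        raise Conflict("object %s: live is v%d, edit is based on v%d"
                       % (obj_id, current_version, based_on_version))
    live_db[obj_id] = (current_version + 1, new_data)
    return current_version + 1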

This all works reasonably well as long as human contributions through
the editors are the only mode of contribution - but they aren't.
Increasingly, though, open public sector data is being imported as
well, and these imports raise more complex versioning issues. Consider
the following scenario:

1. A building object is imported from an open public sector dataset.
2. The building receives some modifications by human contributors
   (attributes, geometry, or both).
3. A new version of the public sector data becomes available and is imported.

Currently, in step 3 the human contributions would be lost, unless
each building that received community modifications is manually merged
with the new import.

How can we deal with this situation? Do we need to look at distributed
version control as practiced in software development? How can its
methods be adapted to the distributed maintenance of spatial data?
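
As a starting point for discussion, here is what a DVCS-style
three-way merge could look like for tags: treat the previously
imported version as the common base, the current community-edited
version as "ours" and the new import as "theirs", and only fall back
to manual resolution where both sides changed the same tag. A rough
Python sketch (an illustration of the idea, not an existing OSM tool):

def merge_tags(base, ours, theirs):
    merged, conflicts = {}, {}
    for key in set(base) | set(ours) | set(theirs):
        b, o, t = base.get(key), ours.get(key), theirs.get(key)
        if o == t:            # both sides agree (or both deleted the tag)
            if o is not None:
                merged[key] = o
        elif o == b:          # only the new import changed it -> take theirs
            if t is not None:
                merged[key] = t
        elif t == b:          # only the community changed it -> keep ours
            if o is not None:
                merged[key] = o
        else:                 # both changed it differently -> manual merge
            conflicts[key] = (o, t)
    return merged, conflicts

Geometry would need its own merge strategy, but the same
base/ours/theirs idea applies.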

[1] 
http://gis.stackexchange.com/questions/10493/how-to-deal-with-versioning-in-openstreetmap

-- 
Martijn van Exel
http://about.me/mvexel

