On Fri, May 14, 2010 at 1:09 PM, Nakor <nakor....@gmail.com> wrote:
>
>> Did you download all existing data in that area and run the validator
>> before uploading your data? This should have told you with a red "No
>> entry" sign in it's report that you were about to upload duplicate nodes.

I suspect the algorithm used to determine what counts as a duplicate is
simply overly sensitive.

In the datasets I'm working with, I'm simplifying and validating data
before submission.

> The thing is, unfortunately, the validator gives a lot of false positives
> (at least here in the US), and if you blindly merge all nodes you end up
> stitching together a bridge and the road/river/train track going
> underneath, to name a few.

+1

> from TIGER use nodes at the same position for a lot of different things.
> I have effectively been fixing some of those blind merges as part of
> other edits, and I do not see why I should be called a villain for this
> when I actually FIXED data.

Yes, I've been fixing some imported data too.

I am not sure how to react to being on the villain list, other than
knowing I'm in good company with Nakor and wonderchook.

My suggestions:

1) Please reword the list so it does not carry a judgmental label: "just the facts".

2) Explain the algorithm. Are you looking for duplicated nodes
literally "on top of one another", or something looser? (See the rough
sketch below this list.)

3) For those of us who still have duplicated nodes around, make it
easy to download the list and examine it. You're already compiling the
data; just make it available as an OSM file for us to look at in our
favorite OSM editor, please.
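
To illustrate what I mean in (2), here is a rough sketch of the two
interpretations. This is only my guess at what the check might look
like; the function names and the tolerance value are made up for
illustration, not taken from whatever the report actually runs.

    # Hypothetical sketch of two possible "duplicate node" checks.
    # I do not know which one the report actually applies.
    from collections import defaultdict

    def exact_duplicates(nodes):
        """Nodes literally on top of one another: identical lat/lon pairs."""
        seen = defaultdict(list)
        for node_id, lat, lon in nodes:
            seen[(lat, lon)].append(node_id)
        return [ids for ids in seen.values() if len(ids) > 1]

    def loose_duplicates(nodes, tolerance=1e-6):
        """A looser check: coordinates snapped to a small grid.

        With TIGER-derived data this would also flag a bridge and the
        road/river/track underneath it, which may explain the false
        positives people are seeing.
        """
        seen = defaultdict(list)
        for node_id, lat, lon in nodes:
            key = (round(lat / tolerance), round(lon / tolerance))
            seen[key].append(node_id)
        return [ids for ids in seen.values() if len(ids) > 1]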

- Serge

