On 21.06.2016 15:18, Martin Koppenhoefer wrote:

2016-06-21 14:40 GMT+02:00 Andy Mabbett <a...@pigsonthewing.org.uk>:

    On 20 Jun 2016 5:31 pm, "Martin Koppenhoefer"
    <dieterdre...@gmail.com> wrote:

    > I have just discovered another type of problem:

    > people adding full Wikipedia URLs into the website tag. In all
    cases there was already a wikipedia tag present.

    This is precisely the sort of thing a bot could clean up, daily or
    weekly, say.


Actually, it is not that simple. Because we have, for good reason, not just one but several methods of storing references to Wikipedia, such a bot would have to check whether the full URL in the website tag is already covered by the interlanguage links of the article referenced in the wikipedia tag. That is not impossible, but it is not completely trivial either. The bot should also check whether a previous version of the object had a different website value and restore it where that makes sense, or flag the object for human review.
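
To make the shape of that check concrete, here is a rough Python sketch. It uses the MediaWiki API's langlinks query, which does exist; the function names and the overall flow are my own illustration, not an actual bot.

# Rough sketch only: is a website=* value that is a full Wikipedia URL
# already covered by the object's wikipedia=* tag, directly or via the
# article's interlanguage links? Helper names are illustrative.
import re
import urllib.parse
import requests

WIKI_URL = re.compile(r"https?://([a-z\-]+)\.wikipedia\.org/wiki/(.+)")

def parse_wikipedia_url(url):
    """Return (lang, title) for a Wikipedia article URL, else None."""
    m = WIKI_URL.match(url)
    if not m:
        return None
    title = urllib.parse.unquote(m.group(2)).replace("_", " ")
    return m.group(1), title

def covered_by_wikipedia_tag(website_value, wikipedia_tag):
    """True if website=* points at the same article as wikipedia=*
    (e.g. "de:Brandenburger Tor"), directly or through an
    interlanguage link fetched from the MediaWiki API."""
    parsed = parse_wikipedia_url(website_value)
    if parsed is None:
        return False                      # not a Wikipedia URL at all
    url_lang, url_title = parsed
    tag_lang, _, tag_title = wikipedia_tag.partition(":")
    if (url_lang, url_title) == (tag_lang, tag_title):
        return True                       # trivially the same article
    # Ask the tagged article's wiki for its interlanguage links and
    # see whether the URL's article is among them.
    r = requests.get(
        f"https://{tag_lang}.wikipedia.org/w/api.php",
        params={"action": "query", "prop": "langlinks",
                "titles": tag_title, "lllimit": "500",
                "format": "json"},
        timeout=10)
    for page in r.json()["query"]["pages"].values():
        for ll in page.get("langlinks", []):
            if ll["lang"] == url_lang and ll["*"] == url_title:
                return True
    return False

The version-history check and the human-review flagging would sit on top of this, and are the less trivial part.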

Cheers,
Martin

I wrote a program, http://ausleuchtung.ch/geo_wiki/, which finds the locations of Wikipedia articles either by the coordinates in the articles themselves or by the OpenStreetMap tags (wikipedia, wikimedia_commons, wikidata), within a radius of ten kilometres around a clicked point.

It works for all language versions of Wikipedia; just change the Wikipedia language field from en to fr, de, it, ru, etc. A search by an OSM tag may take 2-3 seconds.
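
For anyone curious how such a radius search could be reproduced, here is a minimal Python sketch against the public Overpass API. The endpoint, the query shape and the example coordinates are my assumptions about how a search like this could work, not the tool's actual implementation.

# Minimal sketch: fetch every object carrying a wikipedia, wikidata or
# wikimedia_commons tag within 10 km of a clicked point, via the
# public Overpass API.
import requests

def wiki_tagged_objects(lat, lon, radius_m=10000):
    query = f"""
    [out:json][timeout:25];
    (
      nwr(around:{radius_m},{lat},{lon})["wikipedia"];
      nwr(around:{radius_m},{lat},{lon})["wikidata"];
      nwr(around:{radius_m},{lat},{lon})["wikimedia_commons"];
    );
    out tags center;
    """
    r = requests.post("https://overpass-api.de/api/interpreter",
                      data={"data": query}, timeout=30)
    return r.json()["elements"]

# Example: everything wiki-tagged within 10 km of the centre of Zurich.
for el in wiki_tagged_objects(47.3769, 8.5417):
    print(el["type"], el["id"], el["tags"].get("wikipedia", ""))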

So it is possible to visually check Wikipedia article locations and OSM tags in an area. With this tool I noticed and corrected quite a few Wikipedia articles with wrong geographical coordinates. Probably the people who possess encyclopedic knowledge and create articles are not always very good at cartography.

Best regards,

Oleksiy
