Re: [Wikidata] Wikidata Analyst, a tool to comprehensively analyze quality of Wikidata

2015-12-09 Thread Federico Leva (Nemo)
Useful and very pretty; I can't wait for the analysis by import source. 
I'll try to dig into the data to find interesting evidence/examples of 
data to use more.


Nemo

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2015-12-09 Thread André Costa
In case you haven't come across it before
http://kulturnav.org/1f368832-7649-4386-97b6-ae40cce8752b is the entry
point to the Swedish database of (primarily early) photographers curated by
the Nordic Museum in Stockholm.

It's not that well integrated into Wikidata yet but the plan is to fix that
during early 2016. That would also allow a variety of photographs on
Wikimedia Commons to be linked to these entries.

Cheers,
André

André Costa | GLAM developer, Wikimedia Sverige | andre.co...@wikimedia.se |
 +46 (0)733-964574

Stöd fri kunskap, bli medlem i Wikimedia Sverige.
Läs mer på blimedlem.wikimedia.se

On 9 December 2015 at 02:44, David Lowe  wrote:

> Thanks, Tom.
> I'll have to look at this specific case when I'm back at work tomorrow, as
> it does seem you found something in error.
> As for my process: with WD, I queried out the label, description & country
> of citizenship, dob & dod of everyone with occupation: photographer.
> After some cleaning, I can get the WD data formatted like my own (Name,
> Nationality, Dates). I can then do a simple match, where everything matches
> exactly. For the remainder, I then match names and dates- without
> Nationality, which is often very "soft" information. For those that pass a
> smell test (one is "English" the other is "British") I pass those along,
> too. For those with greater discrepancies, I look still closer. For those
> with still greater discrepancies, I manually, individually query my
> database for anyone with the same last name & same first initial to catch
> misspellings or different transliterations. I also occasionally put my
> entire database into open refine to catch instances where, for instance, a
> Chinese name has been given as FamilyName, GivenName in one source, and
> GivenName, FamilyName in another.
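David's two-pass matching (exact match first, then name + dates without nationality, with a check for swapped name order) might look roughly like this in code. This is only an illustrative sketch; the record fields and helper names are hypothetical, not part of his actual pipeline:

```python
def normalize(name):
    """Lowercase a name and split it into parts, ignoring commas,
    so 'Li, Wei' and 'Wei Li' produce the same parts."""
    return [part.lower() for part in name.replace(",", " ").split()]

def names_match(a, b):
    """True if two names share the same parts in any order
    (catches 'FamilyName, GivenName' vs 'GivenName FamilyName')."""
    return sorted(normalize(a)) == sorted(normalize(b))

def match(rec, candidates):
    """Pass 1: everything matches exactly.
    Pass 2: names and dates match, nationality ignored (it is 'soft');
    these hits would still need a manual smell test."""
    for c in candidates:
        if (c["name"], c["nationality"], c["dates"]) == \
           (rec["name"], rec["nationality"], rec["dates"]):
            return c
    for c in candidates:
        if names_match(c["name"], rec["name"]) and c["dates"] == rec["dates"]:
            return c
    return None
```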
> In short, this is scrupulously- and manually- checked data. I'm not savvy
> enough to let an algorithm make my mistakes for me! But let me know if this
> seems to be more than bad luck of the draw- finding the conflicting data
> you found.
> I have also to say, I may suppress the Niepce Museum collection, as it's
> from a really crappy list of photographers in their collection which I
> found many years ago, and can no longer find. I don't want to blame them
> for the discrepancy, but that might be the source. I don't know.
> As I start to query out places of birth & death from WD in the next days,
> I expect to find more discrepancies. (Just today, I found dozens of folks
> whom ULAN gendered one way, and WD another- but were undeniably the same
> photographer. )
> Thanks,
> David
>
>
> On Tuesday, December 8, 2015, Tom Morris  wrote:
>
>> Can you explain what "indexing" means in this context?  Is there some
>> type of matching process?  How are duplicates resolved, if at all? Was the
>> Wikidata info extracted from a dump or one of the APIs?
>>
>> When I looked at the first person I picked at random, Pierre Berdoy
>> (ID:269710), I see that both Wikidata and Wikipedia claim that he was born
>> in Biarritz while the NYPL database claims he was born in Nashua, NH.  So,
>> it would appear that there are either two different people with the same
>> name, born in different places, or the birth place is wrong.
>>
>>
>> http://mgiraldo.github.io/pic/?&biography.TermID=2028247&Location=269710|42.7575,-71.4644
>> https://www.wikidata.org/wiki/Q3383941
>>
>> Tom
>>
>>
>>
>>
>> On Tue, Dec 8, 2015 at 7:10 PM, David Lowe  wrote:
>>
>>> Hello all,
>>> The Photographers' Identities Catalog (PIC) is an ongoing project of
>>> visualizing photo history through the lives of photographers and photo
>>> studios. I have information on 115,000 photographers and studios as of
>>> tonight. It is still under construction, but as I've almost completed an
>>> initial indexing of the ~12,000 photographers in WikiData, I thought I'd
>>> share it with you. We (the New York Public Library) hope to launch it
>>> officially in mid to late January. This represents about 12 years worth of
>>> my work of researching in NYPL's photography collection, censuses and
>>> business directories, and scraping or indexing trusted websites, databases,
>>> and published biographical dictionaries pertaining to photo history.
>>> Again, please bear in mind that our programmer is still hard at work
>>> (and I continue to refine and add to the data*), but we welcome your
>>> feedback, questions, critiques, etc. To see the WikiData photographers,
>>> select WikiData from the Source dropdown. Have fun!
>>>
>>> *PIC*
>>> 
>>>
>>> Thanks,
>>> David
>>>
>>> *Tomorrow,  for instance, I'll start mining Wikidata for birth & death
>>> locations.
>>>

Re: [Wikidata] Wikidata Analyst, a tool to comprehensively analyze quality of Wikidata

2015-12-09 Thread Gerard Meijssen
Hoi,
What would be nice is to have an option to understand progress from one
dump to the next like you can with the Statistics by Magnus. Magnus also
has data on sources but this is more global.
Thanks,
 GerardM

On 8 December 2015 at 21:41, Markus Krötzsch 
wrote:

> Hi Amir,
>
> Very nice, thanks! I like the general approach of having a stand-alone
> tool for analysing the data, and maybe pointing you to issues. Like a
> dashboard for Wikidata editors.
>
> What backend technology are you using to produce these results? Is this
> live data or dumped data? One could also get those numbers from the SPARQL
> endpoint, but performance might be problematic (since you compute averages
> over all items; a custom approach would of course be much faster but then
> you have the data update problem).
>
> An obvious feature request would be to display entity ids as links to the
> appropriate page, and maybe with their labels (in a language of your
> choice).
>
> But overall very nice.
>
> Regards,
>
> Markus
>
>
> On 08.12.2015 18:48, Amir Ladsgroup wrote:
>
>> Hey,
>> There have been several discussions regarding the quality of information in
>> Wikidata. I wanted to work on Wikidata quality, but we didn't have any
>> good source of information to see where we are ahead and where we are
>> behind. So I thought the best thing I could do is make something that
>> shows people, in detail, how well sourced our data is. So here we
>> have *http://tools.wmflabs.org/wd-analyst/index.php*
>>
>> You can give just a property (let's say P31) and it gives you the four
>> most used values + an analysis of sources and overall quality (check this
>> out ), and then you can see that about ~33% of the statements are sourced
>> and that 29.1% of those sources are based on Wikipedia.
>> You can give a property and multiple values you want. Let's say you want
>> to compare P27:Q183 (Country of citizenship: Germany) and P27:Q30 (US)
>> Check this out
>> . And
>> you can see US biographies are more abundant (300K versus 200K) but German
>> biographies are more descriptive (3.8 descriptions per item versus 3.2).
>>
>> One important note: for P31:Q5 (a trivial statement), 46% of the
>> statements are not sourced at all and 49% are based on Wikipedia, *but* get
>> these statistics for the population property (P1082
>> ). It's not a
>> trivial statement and we need to be careful about it. It turns out
>> there is slightly more than one reference per statement and only 4% of
>> them are based on Wikipedia. So we can relax and enjoy this
>> highly-sourced data.
>>
>> Requests:
>>
>>   * Please tell me whether you want this tool at all
>>   * Please suggest more ways to analyze and catch unsourced materials
>>
>> Future plan (if you agree to keep using this tool):
>>
>>   * Support more datatypes (e.g. date of birth based on year, coordinates)
>>   * Sitelink-based and reference-based analysis (to check how much of
>> articles of, let's say, Chinese Wikipedia are unsourced)
>>
>>   * Free-style analysis: there is a database behind this tool that can be
>> used for many more applications. You can get the most unsourced
>> statements of P31 and then go fix them. I'm trying to
>> build a playground for this kind of task.
>>
>> I hope you like this and rock on!
>> 
>> Best
>>
>>
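Numbers like Amir's (share of unsourced statements, share of Wikipedia-based sources) can also be reproduced offline from a Wikidata JSON dump. A rough sketch, assuming the dump's one-entity-per-line layout and treating a reference as "Wikipedia-based" when it uses P143 ("imported from Wikimedia project"); the tool itself may well compute this differently:

```python
import json

WIKIPEDIA_IMPORT = "P143"  # property "imported from Wikimedia project"

def reference_stats(dump_path):
    """Scan a Wikidata JSON dump (a JSON array with one entity per line)
    and count statements that are unsourced, sourced only via Wikipedia
    imports, or sourced by at least one non-wiki reference."""
    stats = {"unsourced": 0, "wiki_only": 0, "external": 0}
    with open(dump_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip().rstrip(",")  # entity lines end with ','
            if line in ("[", "]", ""):
                continue
            entity = json.loads(line)
            for claims in entity.get("claims", {}).values():
                for claim in claims:
                    refs = claim.get("references", [])
                    if not refs:
                        stats["unsourced"] += 1
                    elif all(WIKIPEDIA_IMPORT in r.get("snaks", {})
                             for r in refs):
                        stats["wiki_only"] += 1
                    else:
                        stats["external"] += 1
    return stats
```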


Re: [Wikidata] [Wikimedia-l] Quality issues

2015-12-09 Thread Markus Krötzsch

On 08.12.2015 00:02, Andreas Kolbe wrote:

Hi Markus,

...



Apologies for the late reply.

While you indicated that you had crossposted this reply to Wikimedia-l,
it didn't turn up in my inbox. I only saw it today, after Atlasowa
pointed it out on the Signpost op-ed's talk page.[1]


Yes, we have too many communication channels. Let me only reply briefly 
now, to the first point:



 > This prompted me to reply. I wanted to write an email that merely
says: > "Really? Where did you get this from?" (Google using Wikidata
content)

Multiple sources, including what appears to be your own research group's
writing:[2]


What this page suggested was that Freebase being shut down means 
that Google will use Wikidata as a source. Note that the short intro 
text on the page did not say anything else about the subject, so I am 
surprised that this sufficed to convince you about the truth of that 
claim (it seems that other things I write with more support don't have 
this effect). Anyway, I am really sorry to hear that this 
quickly-written intro on the web has misled you. When I wrote this after 
Google had made their Freebase announcement last year, I really believed 
that this was the obvious implication. However, I was jumping to 
conclusions there without having first-hand evidence. I guess many 
people did the same. I fixed the statement now.


To be clear: I am not saying that Google is not using Wikidata. I just 
don't know. However, if you make a little effort, there is a lot of 
evidence that Google is not using Wikidata as a source, even when it 
could. For example, population numbers are off, even in cases where they 
refer to the same source and time, and Google also shows many statements 
and sources that are not in Wikidata at all (and not even in Primary 
Sources).


I still don't see any problem if Google would be using Wikidata, but 
that's another discussion.


You mention "multiple sources".
{{Which}}?

Markus





Re: [Wikidata] [Wikimedia-l] Quality issues

2015-12-09 Thread Markus Krötzsch
P.S. Meanwhile, your efforts in other channels are already leading some 
people to vandalise Wikidata just to make a point [1].


Markus

[1] 
http://forums.theregister.co.uk/forum/1/2015/12/08/wikidata_special_report/



On 09.12.2015 11:32, Markus Krötzsch wrote:

On 08.12.2015 00:02, Andreas Kolbe wrote:

Hi Markus,

...



Apologies for the late reply.

While you indicated that you had crossposted this reply to Wikimedia-l,
it didn't turn up in my inbox. I only saw it today, after Atlasowa
pointed it out on the Signpost op-ed's talk page.[1]


Yes, we have too many communication channels. Let me only reply briefly
now, to the first point:


 > This prompted me to reply. I wanted to write an email that merely
says: > "Really? Where did you get this from?" (Google using Wikidata
content)

Multiple sources, including what appears to be your own research group's
writing:[2]


What this page suggested was that Freebase being shut down means
that Google will use Wikidata as a source. Note that the short intro
text on the page did not say anything else about the subject, so I am
surprised that this sufficed to convince you about the truth of that
claim (it seems that other things I write with more support don't have
this effect). Anyway, I am really sorry to hear that this
quickly-written intro on the web has misled you. When I wrote this after
Google had made their Freebase announcement last year, I really believed
that this was the obvious implication. However, I was jumping to
conclusions there without having first-hand evidence. I guess many
people did the same. I fixed the statement now.

To be clear: I am not saying that Google is not using Wikidata. I just
don't know. However, if you make a little effort, there is a lot of
evidence that Google is not using Wikidata as a source, even when it
could. For example, population numbers are off, even in cases where they
refer to the same source and time, and Google also shows many statements
and sources that are not in Wikidata at all (and not even in Primary
Sources).

I still don't see any problem if Google would be using Wikidata, but
that's another discussion.

You mention "multiple sources".
{{Which}}?

Markus







Re: [Wikidata] Does a REST services exist that converts Wikipedia url to Wikidata Q id, and the converse?

2015-12-09 Thread Addshore
Hi there!

You actually need the normalize parameter (I have no idea where you got a
redirect parameter from):

https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Multicellular&normalize=&props=

Addshore
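Addshore's suggestion, as a small sketch that just builds the request URL. The function name is made up; only the API parameters come from the mail above:

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def title_to_item_url(title, site="enwiki"):
    """Build a wbgetentities call; the (empty) `normalize` parameter asks
    the API to normalize the given title against the site before the
    lookup, as in the 'Multicellular' example above."""
    params = {
        "action": "wbgetentities",
        "sites": site,
        "titles": title,
        "normalize": "",  # presence of the key is what enables it
        "props": "",      # we only need the entity id
        "format": "json",
    }
    return API + "?" + urlencode(params)
```

The item's Q-id then appears as the key of the `entities` object in the JSON response.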

On Tue, 8 Dec 2015 at 18:07  wrote:

> Regarding the following query:
>
> https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Multicellular&redirects=yes&props=
>
> The desired effect is for it to be redirected to "Multicellular
> organism", but the redirects doesn't seem to be working, even when
> putting it explicitly in the URL.  Is there a way to cause a redirect
> with wbgetentities when used with the www.wikidata.org host?
>
> Please advise,
> James Weaver
>
> On Fri, Nov 20, 2015, at 04:01 PM, Daniel Kinzler wrote:
> > Am 20.11.2015 um 18:22 schrieb Legoktm:
> > > Kind of.
> > >
> > > <
> https://en.wikipedia.org/w/api.php?action=query&titles=Douglas%20Adams&prop=pageprops&formatversion=2
> >
> > > maps a page title on the English Wikipedia to its Wikidata Q id.
> >
> > You can also look this up directly on wikidata:
> > <
> https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Douglas_Adams&props=
> >
> >
> > > <
> https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q42&props=sitelinks
> >
> > > will do the reverse.
> >
> > A bit more terse, and with full URLs in the output:
> > <
> https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q42&props=sitelinks|sitelinks/urls&sitefilter=enwiki
> >
> >
> > You can also let yourself be redirected directly:
> > 
> >
> > Or the the reverse redirect:
> > 
> >
> > HTH
> >
> >
> > --
> > Daniel Kinzler
> > Senior Software Developer
> >
> > Wikimedia Deutschland
> > Gesellschaft zur Förderung Freien Wissens e.V.
> >


Re: [Wikidata] Does a REST services exist that converts Wikipedia url to Wikidata Q id, and the converse?

2015-12-09 Thread Daniel Kinzler
Am 08.12.2015 um 18:06 schrieb ja...@j1w.xyz:
> Regarding the following query:
> https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Multicellular&redirects=yes&props=
> 
> The desired effect is for it to be redirected to "Multicellular
> organism", but the redirects doesn't seem to be working, even when
> putting it explicitly in the URL.  Is there a way to cause a redirect
> with wbgetentities when used with the www.wikidata.org host?

There is no way to get a redirect from the API.
You can get redirected via Special:GoToLinkedPage or Special:ItemByTitle, as
described in my mail.

Try this: 


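The two special pages Daniel mentions can be addressed with plain URL templates; a tiny sketch (the helper names are made up):

```python
def goto_linked_page_url(site, item_id):
    """Special:GoToLinkedPage redirects to the article that the item
    links to on the given wiki, e.g. enwiki + Q42 -> Douglas Adams."""
    return f"https://www.wikidata.org/wiki/Special:GoToLinkedPage/{site}/{item_id}"

def item_by_title_url(site, title):
    """Special:ItemByTitle redirects the other way: from a site/title
    pair to the Wikidata item page."""
    return ("https://www.wikidata.org/wiki/Special:ItemByTitle/"
            f"{site}/{title.replace(' ', '_')}")
```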

-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.



Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2015-12-09 Thread David Lowe
Thanks, André! I don't know that I've found that before. Great to get
country (or region) specific lists like this.
D

On Wednesday, December 9, 2015, André Costa 
wrote:

> In case you haven't come across it before
> http://kulturnav.org/1f368832-7649-4386-97b6-ae40cce8752b is the entry
> point to the Swedish database of (primarily early) photographers curated by
> the Nordic Museum in Stockholm.
>
> It's not that well integrated into Wikidata yet but the plan is to fix
> that during early 2016. That would also allow a variety of photographs on
> Wikimedia Commons to be linked to these entries.
>
> Cheers,
> André
>
> André Costa | GLAM developer, Wikimedia Sverige | andre.co...@wikimedia.se
>  |
> +46 (0)733-964574
>
> Stöd fri kunskap, bli medlem i Wikimedia Sverige.
> Läs mer på blimedlem.wikimedia.se
>
> On 9 December 2015 at 02:44, David Lowe  > wrote:
>
>> Thanks, Tom.
>> I'll have to look at this specific case when I'm back at work tomorrow,
>> as it does seem you found something in error.
>> As for my process: with WD, I queried out the label, description &
>> country of citizenship, dob & dod of everyone with occupation:
>> photographer. After some cleaning, I can get the WD data formatted like my
>> own (Name, Nationality, Dates). I can then do a simple match, where
>> everything matches exactly. For the remainder, I then match names and
>> dates- without Nationality, which is often very "soft" information. For
>> those that pass a smell test (one is "English" the other is "British") I
>> pass those along, too. For those with greater discrepancies, I look still
>> closer. For those with still greater discrepancies, I manually,
>> individually query my database for anyone with the same last name & same
>> first initial to catch misspellings or different transliterations. I also
>> occasionally put my entire database into open refine to catch instances
>> where, for instance, a Chinese name has been given as FamilyName, GivenName
>> in one source, and GivenName, FamilyName in another.
>> In short, this is scrupulously- and manually- checked data. I'm not savvy
>> enough to let an algorithm make my mistakes for me! But let me know if this
>> seems to be more than bad luck of the draw- finding the conflicting data
>> you found.
>> I have also to say, I may suppress the Niepce Museum collection, as it's
>> from a really crappy list of photographers in their collection which I
>> found many years ago, and can no longer find. I don't want to blame them
>> for the discrepancy, but that might be the source. I don't know.
>> As I start to query out places of birth & death from WD in the next days,
>> I expect to find more discrepancies. (Just today, I found dozens of folks
>> whom ULAN gendered one way, and WD another- but were undeniably the same
>> photographer. )
>> Thanks,
>> David
>>
>>
>> On Tuesday, December 8, 2015, Tom Morris > > wrote:
>>
>>> Can you explain what "indexing" means in this context?  Is there some
>>> type of matching process?  How are duplicates resolved, if at all? Was the
>>> Wikidata info extracted from a dump or one of the APIs?
>>>
>>> When I looked at the first person I picked at random, Pierre Berdoy
>>> (ID:269710), I see that both Wikidata and Wikipedia claim that he was born
>>> in Biarritz while the NYPL database claims he was born in Nashua, NH.  So,
>>> it would appear that there are either two different people with the same
>>> name, born in different places, or the birth place is wrong.
>>>
>>>
>>> http://mgiraldo.github.io/pic/?&biography.TermID=2028247&Location=269710|42.7575,-71.4644
>>> https://www.wikidata.org/wiki/Q3383941
>>>
>>> Tom
>>>
>>>
>>>
>>>
>>> On Tue, Dec 8, 2015 at 7:10 PM, David Lowe  wrote:
>>>
 Hello all,
 The Photographers' Identities Catalog (PIC) is an ongoing project of
 visualizing photo history through the lives of photographers and photo
 studios. I have information on 115,000 photographers and studios as of
 tonight. It is still under construction, but as I've almost completed an
 initial indexing of the ~12,000 photographers in WikiData, I thought I'd
 share it with you. We (the New York Public Library) hope to launch it
 officially in mid to late January. This represents about 12 years worth of
 my work of researching in NYPL's photography collection, censuses and
 business directories, and scraping or indexing trusted websites, databases,
 and published biographical dictionaries pertaining to photo history.
 Again, please bear in mind that our programmer is still hard at work
 (and I continue to refine and add to the data*), but we welcome your
 feedback, questions, critiques, etc. To see the WikiData photographers,
 select WikiData from the Source dropdown. Have fun!

 *PIC*
 

Re: [Wikidata] provenance tracking for high volume edit sources (was Data model explanation and protection)

2015-12-09 Thread Lydia Pintscher
On Tue, Nov 10, 2015 at 10:04 PM, Finn Årup Nielsen  wrote:
> Fine. I have added a ticket https://phabricator.wikimedia.org/T118322
> "Merging wizard shouldn't allow dissimilar items to be merged". Perhaps a
> developer can help solve the issue.

This is scheduled to go live tonight. From then on two items that link
to each other in statements should no longer be mergeable by default.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
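The merge guard Lydia describes (two items that link to each other in statements are no longer mergeable by default) could be approximated offline like this. A sketch against the Wikidata JSON entity format, not the actual gadget code:

```python
def mutually_linked(item_a, item_b):
    """True if either item (Wikidata JSON entity format) has a statement
    whose value is the other item; such pairs should not be merged
    blindly, since the link usually marks them as distinct things."""
    def links_to(entity, target_id):
        for claims in entity.get("claims", {}).values():
            for claim in claims:
                value = claim.get("mainsnak", {}).get("datavalue", {})
                if value.get("type") == "wikibase-entityid" and \
                        value.get("value", {}).get("id") == target_id:
                    return True
        return False
    return links_to(item_a, item_b["id"]) or links_to(item_b, item_a["id"])
```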



Re: [Wikidata] Wikidata Analyst, a tool to comprehensively analyze quality of Wikidata

2015-12-09 Thread André Costa
Nice tool!

To understand the statistics better: if a claim has two sources, one
Wikipedia and one other, how does that show up in the statistics?

The reason I'm wondering is because I would normally care if a claim is
sourced or not (but not by how many sources) and whether it is sourced by
only Wikipedias or anything else.

E.g.
1) an item with 10 claims, each of them sourced, is "better" than one with 10
claims where a single claim has 10 sources.
2) a statement with a wiki source + another source is "better" than one with
just a wiki source, and just as "good" as one without the wiki source.

Also, does "wiki" ref/source mean Wikipedia only, or any Wikimedia project?
While (last I checked) the other projects accounted for only 70,000 refs
compared to the 21 million from Wikipedia, they might be significant for
certain domains and are just as "bad".
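André's rules 1 and 2 amount to a per-claim classification that ignores reference counts. A sketch, assuming (as in the statistics discussed above) that references via P143 ("imported from Wikimedia project") count as "wiki" sources; the real tool's definition may differ:

```python
WIKI_SOURCE_PROPS = {"P143"}  # "imported from Wikimedia project"

def claim_source_class(claim):
    """Classify one claim (Wikidata JSON format): what matters is whether
    it is sourced at all, and whether any reference is non-wiki --
    not how many references it has."""
    refs = claim.get("references", [])
    if not refs:
        return "unsourced"
    # A reference is "wiki" if it uses any wiki-import property.
    if any(not (set(r.get("snaks", {})) & WIKI_SOURCE_PROPS) for r in refs):
        return "externally sourced"
    return "wiki-only"
```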

Cheers,
André
On 9 Dec 2015 10:37, "Gerard Meijssen"  wrote:

> Hoi,
> What would be nice is to have an option to understand progress from one
> dump to the next like you can with the Statistics by Magnus. Magnus also
> has data on sources but this is more global.
> Thanks,
>  GerardM
>
> On 8 December 2015 at 21:41, Markus Krötzsch <
> mar...@semantic-mediawiki.org> wrote:
>
>> Hi Amir,
>>
>> Very nice, thanks! I like the general approach of having a stand-alone
>> tool for analysing the data, and maybe pointing you to issues. Like a
>> dashboard for Wikidata editors.
>>
>> What backend technology are you using to produce these results? Is this
>> live data or dumped data? One could also get those numbers from the SPARQL
>> endpoint, but performance might be problematic (since you compute averages
>> over all items; a custom approach would of course be much faster but then
>> you have the data update problem).
>>
>> An obvious feature request would be to display entity ids as links to the
>> appropriate page, and maybe with their labels (in a language of your
>> choice).
>>
>> But overall very nice.
>>
>> Regards,
>>
>> Markus
>>
>>
>> On 08.12.2015 18:48, Amir Ladsgroup wrote:
>>
>>> Hey,
>>> There have been several discussions regarding the quality of information in
>>> Wikidata. I wanted to work on Wikidata quality, but we didn't have any
>>> good source of information to see where we are ahead and where we are
>>> behind. So I thought the best thing I could do is make something that
>>> shows people, in detail, how well sourced our data is. So here we
>>> have *http://tools.wmflabs.org/wd-analyst/index.php*
>>>
>>> You can give just a property (let's say P31) and it gives you the four
>>> most used values + an analysis of sources and overall quality (check this
>>> out ), and then you can see that about ~33% of the statements are sourced
>>> and that 29.1% of those sources are based on Wikipedia.
>>> You can give a property and multiple values you want. Let's say you want
>>> to compare P27:Q183 (Country of citizenship: Germany) and P27:Q30 (US)
>>> Check this out
>>> . And
>>> you can see US biographies are more abundant (300K versus 200K) but German
>>> biographies are more descriptive (3.8 descriptions per item versus 3.2).
>>>
>>> One important note: for P31:Q5 (a trivial statement), 46% of the
>>> statements are not sourced at all and 49% are based on Wikipedia, *but* get
>>> these statistics for the population property (P1082
>>> ). It's not a
>>> trivial statement and we need to be careful about it. It turns out
>>> there is slightly more than one reference per statement and only 4% of
>>> them are based on Wikipedia. So we can relax and enjoy this
>>> highly-sourced data.
>>>
>>> Requests:
>>>
>>>   * Please tell me whether you want this tool at all
>>>   * Please suggest more ways to analyze and catch unsourced materials
>>>
>>> Future plan (if you agree to keep using this tool):
>>>
>>>   * Support more datatypes (e.g. date of birth based on year,
>>> coordinates)
>>>   * Sitelink-based and reference-based analysis (to check how much of
>>> articles of, let's say, Chinese Wikipedia are unsourced)
>>>
>>>   * Free-style analysis: there is a database behind this tool that can be
>>> used for many more applications. You can get the most unsourced
>>> statements of P31 and then go fix them. I'm trying to
>>> build a playground for this kind of task.
>>>
>>> I hope you like this and rock on!
>>> 
>>> Best
>>>
>>>

Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2015-12-09 Thread André Costa
Happy to be of use. There is also one for:
* Swedish photo studios [1]
* Norwegian photographers[2]
* Norwegian photo studios [3]
I'm less familiar with these though and don't have a timeline for wikidata
integration.

Cheers,
André

[1] http://kulturnav.org/deb494a0-5457-4e5f-ae9b-e1826e0de681
[2] http://kulturnav.org/508197af-6e36-4e4f-927c-79f8f63654b2
[3] http://kulturnav.org/7d2a01d1-724c-4ad2-a18c-e799880a0241
--
André Costa
GLAM developer
Wikimedia Sverige
On 9 Dec 2015 15:07, "David Lowe"  wrote:

> Thanks, André! I don't know that I've found that before. Great to get
> country (or region) specific lists like this.
> D
>
> On Wednesday, December 9, 2015, André Costa 
> wrote:
>
>> In case you haven't come across it before
>> http://kulturnav.org/1f368832-7649-4386-97b6-ae40cce8752b is the entry
>> point to the Swedish database of (primarily early) photographers curated by
>> the Nordic Museum in Stockholm.
>>
>> It's not that well integrated into Wikidata yet but the plan is to fix
>> that during early 2016. That would also allow a variety of photographs on
>> Wikimedia Commons to be linked to these entries.
>>
>> Cheers,
>> André
>>
>> André Costa | GLAM developer, Wikimedia Sverige |
>> andre.co...@wikimedia.se | +46 (0)733-964574
>>
>> Stöd fri kunskap, bli medlem i Wikimedia Sverige.
>> Läs mer på blimedlem.wikimedia.se
>>
>> On 9 December 2015 at 02:44, David Lowe  wrote:
>>
>>> Thanks, Tom.
>>> I'll have to look at this specific case when I'm back at work tomorrow,
>>> as it does seem you found something in error.
>>> As for my process: with WD, I queried out the label, description &
>>> country of citizenship, dob & dod of everyone with occupation:
>>> photographer. After some cleaning, I can get the WD data formatted like my
>>> own (Name, Nationality, Dates). I can then do a simple match, where
>>> everything matches exactly. For the remainder, I then match names and
>>> dates- without Nationality, which is often very "soft" information. For
>>> those that pass a smell test (one is "English" the other is "British") I
>>> pass those along, too. For those with greater discrepancies, I look still
>>> closer. For those with still greater discrepancies, I manually,
>>> individually query my database for anyone with the same last name & same
>>> first initial to catch misspellings or different transliterations. I also
>>> occasionally put my entire database into open refine to catch instances
>>> where, for instance, a Chinese name has been given as FamilyName, GivenName
>>> in one source, and GivenName, FamilyName in another.
>>> In short, this is scrupulously- and manually- checked data. I'm not
>>> savvy enough to let an algorithm make my mistakes for me! But let me know
>>> if this seems to be more than bad luck of the draw- finding the conflicting
>>> data you found.
>>> I have also to say, I may suppress the Niepce Museum collection, as it's
>>> from a really crappy list of photographers in their collection which I
>>> found many years ago, and can no longer find. I don't want to blame them
>>> for the discrepancy, but that might be the source. I don't know.
>>> As I start to query out places of birth & death from WD in the next
>>> days, I expect to find more discrepancies. (Just today, I found dozens of
>>> folks whom ULAN gendered one way, and WD another- but were undeniably the
>>> same photographer. )
>>> Thanks,
>>> David
>>>
>>>
>>> On Tuesday, December 8, 2015, Tom Morris  wrote:
>>>
 Can you explain what "indexing" means in this context?  Is there some
 type of matching process?  How are duplicates resolved, if at all? Was the
 Wikidata info extracted from a dump or one of the APIs?

 When I looked at the first person I picked at random, Pierre Berdoy
 (ID:269710), I see that both Wikidata and Wikipedia claim that he was born
 in Biarritz while the NYPL database claims he was born in Nashua, NH.  So,
 it would appear that there are either two different people with the same
 name, born in different places, or the birth place is wrong.


 http://mgiraldo.github.io/pic/?&biography.TermID=2028247&Location=269710|42.7575,-71.4644
 https://www.wikidata.org/wiki/Q3383941

 Tom




 On Tue, Dec 8, 2015 at 7:10 PM, David Lowe  wrote:

> Hello all,
> The Photographers' Identities Catalog (PIC) is an ongoing project of
> visualizing photo history through the lives of photographers and photo
> studios. I have information on 115,000 photographers and studios as of
> tonight. It is still under construction, but as I've almost completed an
> initial indexing of the ~12,000 photographers in WikiData, I thought I'd
> share it with you. We (the New York Public Library) hope to launch it
officially in mid to late January. This represents about 12 years' worth of
> my work of researching in NYPL's photography collection, censuses and
> business directories, and scraping 

Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2015-12-09 Thread John Erling Blad
I think the Norwegian lists are a subset of Preus Photo Museum's list. It is
now maintained partly by Nasjonalbiblioteket (the Norwegian one, not the
Swedish one) and Norsk Lokalhistorisk Institutt. For example: Anders Beer
Wilse in nowiki,[1] at Lokalhistoriewiki,[2] and at Nasjonalbiblioteket.[3]

Kulturnav is a kind of maintained ontology, where most of the work is done
by local museums. The software for the site itself was funded (in part) by a
grant from Norsk Kulturråd.

We should connect as much as possible of our resources to resources at
Kulturnav, and not just copy data. That said, we don't have a very good
model for how to materialize data from external sites and make it available
for our client sites, so our option is more or less just to copy. It is
better to maintain data at one location.

[1] https://no.wikipedia.org/wiki/Anders_Beer_Wilse
[2] https://lokalhistoriewiki.no/index.php/Anders_Beer_Wilse
[3] http://www.nb.no/nmff/fotograf.php?fotograf_id=3050

On Wed, Dec 9, 2015 at 9:51 PM, André Costa 
wrote:

> Happy to be of use. There is also one for:
> * Swedish photo studios [1]
> * Norwegian photographers[2]
> * Norwegian photo studios [3]
> I'm less familiar with these though and don't have a timeline for wikidata
> integration.
>
> Cheers,
> André
>
> [1] http://kulturnav.org/deb494a0-5457-4e5f-ae9b-e1826e0de681
> [2] http://kulturnav.org/508197af-6e36-4e4f-927c-79f8f63654b2
> [3] http://kulturnav.org/7d2a01d1-724c-4ad2-a18c-e799880a0241
> --
> André Costa
> GLAM developer
> Wikimedia Sverige
> On 9 Dec 2015 15:07, "David Lowe"  wrote:
>
>> Thanks, André! I don't know that I've found that before. Great to get
>> country (or region) specific lists like this.
>> D

Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2015-12-09 Thread John Erling Blad
Forgot to mention; Anders Beer Wilse in Kulturnav
http://kulturnav.org/2b94216b-f2fc-46a3-b2ce-eeb93aa19185

On Wed, Dec 9, 2015 at 11:19 PM, John Erling Blad  wrote:

> I think the Norwegian lists are a subset of Preus Photo Museum's list. It
> is now maintained partly by Nasjonalbiblioteket (the Norwegian one, not the
> Swedish one) and Norsk Lokalhistorisk Institutt. For example: Anders Beer
> Wilse in nowiki,[1] at Lokalhistoriewiki,[2] and at Nasjonalbiblioteket.[3]
>
> Kulturnav is a kind of maintained ontology, where most of the work is done
> by local museums. The software for the site itself was funded (in part) by a
> grant from Norsk Kulturråd.
>
> We should connect as much as possible of our resources to resources at
> Kulturnav, and not just copy data. That said, we don't have a very good
> model for how to materialize data from external sites and make it available
> for our client sites, so our option is more or less just to copy. It is
> better to maintain data at one location.
>
> [1] https://no.wikipedia.org/wiki/Anders_Beer_Wilse
> [2] https://lokalhistoriewiki.no/index.php/Anders_Beer_Wilse
> [3] http://www.nb.no/nmff/fotograf.php?fotograf_id=3050

Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2015-12-09 Thread David Lowe
Yep, thanks! Wilse is there in duplicate (here's
the correct one). The other will be gone in an hour or so when I update.
I look forward to looking at these lists, thanks! It will probably be next
week before I finish ingesting birth & death locations from WD.

d

On Wed, Dec 9, 2015 at 5:27 PM, John Erling Blad  wrote:

> Forgot to mention; Anders Beer Wilse in Kulturnav
> http://kulturnav.org/2b94216b-f2fc-46a3-b2ce-eeb93aa19185

Re: [Wikidata] [Wikimedia-l] Quality issues

2015-12-09 Thread John Erling Blad
Andreas Kolbe has one point: a reference to a Wikipedia article should
point to the correct article, and should preferably point to the revision
introducing the value. It should be pretty easy to do this for most of the
statements...
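
[Editorial note: MediaWiki serves a fixed revision of any page via the
standard index.php?oldid=N permalink, so a revision-anchored reference of
the kind suggested above is just a URL. The helper below is a hypothetical
sketch, not existing Wikidata tooling.]

```python
from urllib.parse import urlencode

def revision_permalink(title, rev_id, site="en.wikipedia.org"):
    """Build a permanent link to one specific article revision.

    index.php?oldid=N always renders that revision, so the reference
    keeps pointing at the text that supported the value even after
    the live article changes.
    """
    query = urlencode({"title": title.replace(" ", "_"), "oldid": rev_id})
    return f"https://{site}/w/index.php?{query}"
```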

On Wed, Dec 9, 2015 at 11:35 AM, Markus Krötzsch <
mar...@semantic-mediawiki.org> wrote:

> P.S. Meanwhile, your efforts in other channels are already leading some
> people to vandalise Wikidata just to make a point [1].
>
> Markus
>
> [1]
> http://forums.theregister.co.uk/forum/1/2015/12/08/wikidata_special_report/
>
>
>
> On 09.12.2015 11:32, Markus Krötzsch wrote:
>
>> On 08.12.2015 00:02, Andreas Kolbe wrote:
>>
>>> Hi Markus,
>>>
>> ...
>>
>>>
>>>
>>> Apologies for the late reply.
>>>
>>> While you indicated that you had crossposted this reply to Wikimedia-l,
>>> it didn't turn up in my inbox. I only saw it today, after Atlasowa
>>> pointed it out on the Signpost op-ed's talk page.[1]
>>>
>>
>> Yes, we have too many communication channels. Let me only reply briefly
>> now, to the first point:
>>
>>  > This prompted me to reply. I wanted to write an email that merely
>>> says: > "Really? Where did you get this from?" (Google using Wikidata
>>> content)
>>>
>>> Multiple sources, including what appears to be your own research group's
>>> writing:[2]
>>>
>>
>> What this page suggested was that Freebase being shut down means
>> that Google will use Wikidata as a source. Note that the short intro
>> text on the page did not say anything else about the subject, so I am
>> surprised that this sufficed to convince you about the truth of that
>> claim (it seems that other things I write with more support don't have
>> this effect). Anyway, I am really sorry to hear that this
>> quickly-written intro on the web has misled you. When I wrote this after
>> Google had made their Freebase announcement last year, I really believed
>> that this was the obvious implication. However, I was jumping to
>> conclusions there without having first-hand evidence. I guess many
>> people did the same. I fixed the statement now.
>>
>> To be clear: I am not saying that Google is not using Wikidata. I just
>> don't know. However, if you make a little effort, there is a lot of
>> evidence that Google is not using Wikidata as a source, even when it
>> could. For example, population numbers are off, even in cases where they
>> refer to the same source and time, and Google also shows many statements
>> and sources that are not in Wikidata at all (and not even in Primary
>> Sources).
>>
>> I still don't see any problem if Google would be using Wikidata, but
>> that's another discussion.
>>
>> You mention "multiple sources".
>> {{Which}}?
>>
>> Markus
>>
>>
>>
>


Re: [Wikidata] [Wikimedia-l] Quality issues

2015-12-09 Thread Gerard Meijssen
Hoi,
If anything, that would be the only point. It is a very sad piece of FUD. It
is not that easy..
Thanks,
 GerardM

http://ultimategerardm.blogspot.nl/2015/12/wikipedia-signpost-yeah-right.html

On 9 December 2015 at 23:51, John Erling Blad  wrote:

> Andreas Kolbe has one point: a reference to a Wikipedia article should
> point to the correct article, and should preferably point to the revision
> introducing the value. It should be pretty easy to do this for most of the
> statements...