Re: [Wikidata] weekly summary #194

2016-02-02 Thread james
Does anyone know if it would be possible with the RDF_GAS_API to specify
two Wikidata items (e.g. Kevin Bacon and Kathy Bates) and a property
(e.g. cast member) and have it calculate the shortest path between them?
That functionality for future versions of ConceptMap is what pushed me
to add Neo4j to the technologies selected for it, but perhaps I was
too hasty.
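A sketch of how a query like this might look against the Wikidata SPARQL endpoint, using Blazegraph's GAS service. The gas: vocabulary follows the RDF_GAS_API wiki page; whether the SSSP analytics class honors gas:target should be verified there, and the item IDs in the example call are illustrative, not verified.

```python
# Sketch only: compose a Blazegraph GAS shortest-path SPARQL query.
# Vocabulary per the RDF_GAS_API wiki page; verify before relying on it.

def gas_shortest_path_query(start_qid: str, target_qid: str, prop_pid: str) -> str:
    """Return SPARQL that runs Blazegraph's SSSP from start_qid toward
    target_qid, following only prop_pid links (in either direction)."""
    return f"""\
PREFIX gas: <http://www.bigdata.com/rdf/gas#>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?item ?depth WHERE {{
  SERVICE gas:service {{
    gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.SSSP" ;
                gas:in wd:{start_qid} ;
                gas:target wd:{target_qid} ;
                gas:out ?item ;
                gas:out1 ?depth ;
                gas:linkType wdt:{prop_pid} ;
                gas:traversalDirection "Undirected" .
  }}
}}
ORDER BY ?depth"""

# Kevin Bacon -> Kathy Bates over "cast member" (P161); the QIDs here
# are illustrative and not double-checked.
query = gas_shortest_path_query("Q3454165", "Q229190", "P161")
```

Since cast member links films to actors, traversing the property undirected hops actor → film → actor, which is exactly the "degrees of separation" walk the question describes.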

Regards,
James Weaver

On Tue, Feb 2, 2016, at 12:45 PM, Stas Malyshev wrote:
> Hi!
> 
> > Indeed, this is very nifty. I also note that this uses some special
> > features of our SPARQL endpoint that I did not know about (the "gas
> > service"). It seems that this is a proprietary extension of BlazeGraph,
> > which comes in very handy here.
> 
> Yes, it's described here:
> https://wiki.blazegraph.com/wiki/index.php/RDF_GAS_API and it's a
> service implementing basic graph algorithms such as BFS, shortest path,
> PageRank, etc. I personally didn't use it too much but it may be very
> useful for tasks which are naturally expressed as graph traversals.
> 
> -- 
> Stas Malyshev
> smalys...@wikimedia.org
> 

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] weekly summary #194

2016-02-02 Thread james
+1 on mek's sentiment.  Outstanding work, AngryLoki!

Regards, James Weaver


On Mon, Feb 1, 2016, at 08:07 PM, Michael Karpeles wrote:
> Well, https://angryloki.github.io/wikidata-graph-builder will change
> my life, this is amazing. Thank you AngryLoki and all the hundreds of
> layers of contributors that led to a tool like this. Also Lydia et
> al, thanks for the hard work in keeping these updates going.
>
> best wishes,
>
> - mek
>
> On Mon, Feb 1, 2016 at 9:44 AM, Lydia Pintscher
>  wrote:
>> Hey everyone :)
>>
>> Here's your summary of what's been happening around Wikidata over the
>> past week.
>>
>> Events[1]/Press/Blogs[2]


>>  * Replicator: Wikidata import tool[3]
>>  * The Facebook of German Playwrights[4]
>>  * Language usage on Wikidata[5]
>>  * Past: FOSDEM (slides of talk by Lucie[6])
>>
>> Other Noteworthy Stuff


>>  * Please help us classify a bunch of edits to improve anti-vandalism
>>tools on Wikidata[7]
>>  * Over 18,000 people made at least one edit over the last month!
>>  * some visualizations:
>>    * Family tree of King Halo, race horse[8]
>>    * Doctoral students of Gauss[9]
>>    * Tributaries of the Danube[10]
>>    * Children of Kronos[11]
>>    * Influenced by Leibniz[12]
>>    * Graphs[13]
>>    * Zika virus papers[14]
>>  * KasparBot is now removing all PersonData template usages from
>>    English Wikipedia. These templates added machine-readable
>>    information to articles.
>>  * Wikiversity will get the first phase of Wikidata support (language
>>links) on February 23rd.
>>  * Upcoming deployments of new datatypes, In Other Projects Sidebar,
>>Article Placeholder and more[15]
>>  * WD-FIST[16] now supports SPARQL queries
>>
>> Did you know?


>>  * Newest properties[17]: National Historic Sites of Canada ID[18]
>>  * Query example: horses[19], French sculptors by year of birth[20]
>>  * Newest external tools: Template:Complex constraint[21]
>>  * Newest database reports: Help:Wikimedia language
>>codes/lists/all[22]
>>  * New feature/gadget requests: badge for templates using
>>    Wikidata[23]
>>
>> Development


>>  * und, mis, mul and zxx will be supported language codes for
>>monolingual text. More will come later.
>>  * Working on adjusting the layout of the ArticlePlaceholder
>>generated pages
>>  * Final touches on making search work on mobile
>>  * Finishing identifier datatype and section
>>
>> You can see all open tickets related to Wikidata here[24].


>> Monthly Tasks


>>  * Hack on one of these[25].
>>  * Help develop the next summary here![26]
>>  * Contribute to a Showcase item[27]
>>  * Help translate[28] or proofread pages in your own language!
>>  * Add labels, in your own language(s), for the new properties listed
>>    above.
>>
>> Anything to add? Please share! :)


>>
>> Cheers Lydia
>>
>> --
>> Lydia Pintscher - http://about.me/lydia.pintscher Product Manager for
>> Wikidata
>>
>> Wikimedia Deutschland e.V. Tempelhofer Ufer 23-24 10963 Berlin
>> www.wikimedia.de
>>
>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien
>> Wissens e. V.
>>
>> Registered in the register of associations of the Amtsgericht
>> Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable
>> by the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
>>



Links:

   1. https://www.wikidata.org/wiki/Wikidata:Events
   2. https://www.wikidata.org/wiki/Wikidata:Press_coverage
   3. https://www.entropywins.wtf/blog/2016/01/25/replicator-a-cli-tool-for-wikidata/
   4. https://dlina.github.io/The-Facebook-of-German-Playwrights/
   5. http://addshore.com/2016/02/language-usage-on-wikidata/
   6. http://www.slideshare.net/frimelle/increasing-access-to-free-and-open-knowledge-for-speakers-of-underserved-languages-on-wikipedia
   7. https://lists.wikimedia.org/pipermail/wikidata/2016-January/008052.html
   8. https://tools.wmflabs.org/family/ancestors.php?q=Q11297913
   9. https://angryloki.github.io/wikidata-graph-builder/?property=P184&item=Q6722&direction=reverse
  10. https://angryloki.github.io/wikidata-graph-builder/?property=P974&item=Q1653&lang=de&iterations=7
  11. https://angryloki.github.io/wikidata-graph-builder/?property=P40&item=Q44204&lang=de&iterations=7
  12. https://angryloki.github.io/wikidata-graph-builder/?property=P737&item=Q9047&lang=de&direction=both&iterations=15
  13. https://angryloki.github.io/wikidata-graph-builder/?property=P279&item=Q141488&direction=reverse
  14. https://tools.wmflabs.org/wikidata-timeline/#/timeline?query=CLAIM%5B921:202864%5D%20OR%20CLAIM%5B921:8071861%5D
  15. https://lists.wikimedia.org/pipermail/wikidata/2016-February/008068.html
  16. https://tools.wmflabs.org/fist/wdfist/index.html
  

Re: [Wikidata] upcoming deployments/features

2016-02-02 Thread John Mark Vandenberg
On Wed, Feb 3, 2016 at 9:31 AM, Stas Malyshev  wrote:
> Hi!
>
>> Can you try again please? And in an in-cognito window? I just tried it
>> and it works for me: https://test.wikidata.org/wiki/Q649 We've had some
>> issues with local store though.
>
> Weird, it does work for me incognito but not when logged in.
>
>> The datatype changes but the value type stays string. So depending on
>> what they use they might need to be adapted.
>
> RDF export seems to be fine, except that we need to update OWL and docs
> for new types, I'll check pywikibot a bit later.

We've already done the analysis for pywikibot.  It will fail badly: the
API cache needs to be manually cleared. However, there are some
improvements we can make so that the transition is smooth.
https://phabricator.wikimedia.org/T123882

-- 
John Vandenberg



Re: [Wikidata] upcoming deployments/features

2016-02-02 Thread Stas Malyshev
Hi!

> Can you try again please? And in an in-cognito window? I just tried it
> and it works for me: https://test.wikidata.org/wiki/Q649 We've had some
> issues with local store though.

Weird, it does work for me incognito but not when logged in.

> The datatype changes but the value type stays string. So depending on
> what they use they might need to be adapted.

RDF export seems to be fine, except that we need to update OWL and docs
for new types, I'll check pywikibot a bit later.

-- 
Stas Malyshev
smalys...@wikimedia.org



Re: [Wikidata] weekly summary #194

2016-02-02 Thread Stas Malyshev
Hi!

> Indeed, this is very nifty. I also note that this uses some special
> features of our SPARQL endpoint that I did not know about (the "gas
> service"). It seems that this is a proprietary extension of BlazeGraph,
> which comes in very handy here.

Yes, it's described here:
https://wiki.blazegraph.com/wiki/index.php/RDF_GAS_API and it's a
service implementing basic graph algorithms such as BFS, shortest path,
PageRank, etc. I personally didn't use it too much but it may be very
useful for tasks which are naturally expressed as graph traversals.

-- 
Stas Malyshev
smalys...@wikimedia.org



Re: [Wikidata] Wikidata - short biographies

2016-02-02 Thread Hampton Snowball
Okay, thanks!

On Tue, Feb 2, 2016 at 8:55 AM, Edgard Marx 
wrote:

> Hey,
>
> I recommend not posting questions about third-party systems or
> software that are not related to Wikidata or Wikimedia here.
> For RDFSlice, there is an issue tracker (
> https://bitbucket.org/emarx/rdfslice/issues),
> where you can open an issue and someone will answer you.
>
> I also advise you to post your command line or the error, so the developers
> can better understand it and quickly fix it (if there is a problem).
>
> best regards,
> Edgard
>
> On Tue, Feb 2, 2016 at 7:18 AM, Hampton Snowball <
> hamptonsnowb...@gmail.com> wrote:
>
>> I was able to semi-successfully use RDFSlice with the dump using the
>> Windows command prompt.  However, maybe because it's a 5 GB dump file,
>> I am getting Java errors line after line as it goes through the file
>> (java.lang.StringIndexOutOfBoundsException: String index out of range - 1;
>> sometimes the last number changes).
>>
>> I thought it might be a memory issue, but increasing the memory with the
>> -Xmx2G flag (or 3G, 4G) hasn't helped.  Any tips would be
>> appreciated.
>>
>> Thanks
>>
>> On Mon, Feb 1, 2016 at 7:28 PM, Hampton Snowball <
>> hamptonsnowb...@gmail.com> wrote:
>>
>>> Of course I meant sorry if this is a dumb question :)
>>>
>>>
>>>
>>> On Mon, Feb 1, 2016 at 7:13 PM, Hampton Snowball <
>>> hamptonsnowb...@gmail.com> wrote:
>>>
 Sorry if this is a dump question (I'm not a developer).  To run the
 command on the rdfslice program in mentions (" java -jar rdfslice.jar
 -source | -patterns  -out  -order
  -debug ), can this be done with windows
 command prompt? or do I need some special developer version of 
 java/console?

 Thanks for the tool.

 On Sun, Jan 31, 2016 at 3:53 PM, Edgard Marx <
 m...@informatik.uni-leipzig.de> wrote:

> Hey,
> you can simply use RDFSlice (
> https://bitbucket.org/emarx/rdfslice/overview) directly on the dump
> file (https://dumps.wikimedia.org/wikidatawiki/entities/20160125/)
>
> best,
> Edgard
>
> On Sun, Jan 31, 2016 at 7:43 PM, Hampton Snowball <
> hamptonsnowb...@gmail.com> wrote:
>
>> Hello,
>>
>> I am interested in a subset of Wikidata and I am trying to find the
>> best way to get it without getting a larger dataset than necessary.
>>
>> Is there a way to just get the "bios" that appear on the Wikidata
>> pages below the name of the person/organization, as well as the link
>> to the English Wikipedia page, or to all Wikipedia pages?
>>
>> For example from: https://www.wikidata.org/wiki/Q1652291
>>
>> "Turkish female given name"
>> https://en.wikipedia.org/wiki/H%C3%BClya
>> and optionally https://de.wikipedia.org/wiki/H%C3%BClya
>>
>> I know there is SPARQL, for which this list previously helped me construct
>> a query, but some requests seem to time out when looking at a large
>> amount of data, so I am not sure this would work.
>>
>> The dumps I know of are the full dataset, but I am not sure if there are
>> any other subset dumps available or a better way of grabbing this data.
>>
>> Thanks in advance,
>> HS


Re: [Wikidata] Wikidata - short biographies

2016-02-02 Thread Edgard Marx
Hey,

I recommend not posting questions about third-party systems or
software that are not related to Wikidata or Wikimedia here.
For RDFSlice, there is an issue tracker (
https://bitbucket.org/emarx/rdfslice/issues),
where you can open an issue and someone will answer you.

I also advise you to post your command line or the error, so the developers
can better understand it and quickly fix it (if there is a problem).

best regards,
Edgard

On Tue, Feb 2, 2016 at 7:18 AM, Hampton Snowball 
wrote:

> I was able to semi-successfully use RDFSlice with the dump using the
> Windows command prompt.  However, maybe because it's a 5 GB dump file,
> I am getting Java errors line after line as it goes through the file
> (java.lang.StringIndexOutOfBoundsException: String index out of range - 1;
> sometimes the last number changes).
>
> I thought it might be a memory issue, but increasing the memory with the
> -Xmx2G flag (or 3G, 4G) hasn't helped.  Any tips would be
> appreciated.
>
> Thanks
>
> On Mon, Feb 1, 2016 at 7:28 PM, Hampton Snowball <
> hamptonsnowb...@gmail.com> wrote:
>
>> Of course I meant sorry if this is a dumb question :)
>>
>>
>>
>> On Mon, Feb 1, 2016 at 7:13 PM, Hampton Snowball <
>> hamptonsnowb...@gmail.com> wrote:
>>
>>> Sorry if this is a dump question (I'm not a developer).  To run the
>>> command on the rdfslice program in mentions (" java -jar rdfslice.jar
>>> -source | -patterns  -out  -order
>>>  -debug ), can this be done with windows command
>>> prompt? or do I need some special developer version of java/console?
>>>
>>> Thanks for the tool.
>>>
>>> On Sun, Jan 31, 2016 at 3:53 PM, Edgard Marx <
>>> m...@informatik.uni-leipzig.de> wrote:
>>>
 Hey,
 you can simply use RDFSlice (
 https://bitbucket.org/emarx/rdfslice/overview) directly on the dump
 file (https://dumps.wikimedia.org/wikidatawiki/entities/20160125/)

 best,
 Edgard

 On Sun, Jan 31, 2016 at 7:43 PM, Hampton Snowball <
 hamptonsnowb...@gmail.com> wrote:

> Hello,
>
> I am interested in a subset of Wikidata and I am trying to find the
> best way to get it without getting a larger dataset than necessary.
>
> Is there a way to just get the "bios" that appear on the Wikidata
> pages below the name of the person/organization, as well as the link
> to the English Wikipedia page, or to all Wikipedia pages?
>
> For example from: https://www.wikidata.org/wiki/Q1652291
>
> "Turkish female given name"
> https://en.wikipedia.org/wiki/H%C3%BClya
> and optionally https://de.wikipedia.org/wiki/H%C3%BClya
>
> I know there is SPARQL, for which this list previously helped me construct
> a query, but some requests seem to time out when looking at a large
> amount of data, so I am not sure this would work.
>
> The dumps I know of are the full dataset, but I am not sure if there are
> any other subset dumps available or a better way of grabbing this data.
>
> Thanks in advance,
> HS


Re: [Wikidata] upcoming deployments/features

2016-02-02 Thread Moritz Schubotz
The string is interpreted by the Math extension in the same way as it
interprets the text between the <math> tags.
There is an API to extract identifiers and the packages required to render
the input with regular LaTeX here:
http://api.formulasearchengine.com/v1/?doc
or also
https://en.wikipedia.org/api/rest_v1/?doc#!/Math/post_media_math_check_type
(the Wikipedia endpoint was opened to the public just moments ago).
In the future, we are planning to provide additional semantics from there.
If you have additional questions, please contact me directly, since I'm not
a member of the list.
Moritz
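For anyone wanting to script against the check endpoint mentioned above, a sketch that merely constructs the request without sending it. The path and the form-encoded "q" parameter are assumptions read off the Swagger doc linked above; verify them against that doc before relying on this.

```python
from urllib import parse, request

def build_math_check_request(tex: str, check_type: str = "tex") -> request.Request:
    """Build (but do not send) a POST to the REST math-check endpoint.
    The path and the "q" field name are assumptions from the Swagger doc."""
    url = f"https://en.wikipedia.org/api/rest_v1/media/math/check/{check_type}"
    body = parse.urlencode({"q": tex}).encode("utf-8")
    return request.Request(url, data=body, method="POST")

req = build_math_check_request(r"\frac{a}{b}")
```

Sending it with `urllib.request.urlopen(req)` should then return the checked/normalized TeX, per the doc referenced in the message above.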

On Tue, Feb 2, 2016 at 8:53 AM, Lydia Pintscher  wrote:

> On Mon, Feb 1, 2016 at 8:44 PM Markus Krötzsch <
> mar...@semantic-mediawiki.org> wrote:
>
>> On 01.02.2016 17:14, Lydia Pintscher wrote:
>> > Hey folks :)
>> >
>> > I just sat down with Katie to plan the next important feature
>> > deployments that are coming up this month. Here is the plan:
>> > * new datatype for mathematical expressions: We'll get it live on
>> > test.wikidata.org tomorrow and then bring it
>> > to wikidata.org on the 9th
>>
>> Documentation? What will downstream users like us need to do to support
>> this? How is this mapped to JSON? How is this mapped to RDF?
>>
>
> It is a string representing markup for the Math extension. You can already
> test it here: http://wikidata.beta.wmflabs.org/wiki/Q117940. See also
> https://en.wikipedia.org/wiki/Help:Displaying_a_formula. Maybe Moritz
> wants to say a bit more, as his students created the datatype.
>
> Cheers
> Lydia
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Registered in the register of associations of the Amtsgericht
> Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by
> the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
>



-- 
Moritz Schubotz
TU Berlin, Fakultät IV
DIMA - Sekr. EN7
Raum EN742
Einsteinufer 17
D-10587 Berlin
Germany

Tel.: +49 30 314 22784
Mobil.: +49 1578 047 1397
Fax:  +49 30 314 21601
E-Mail: schub...@tu-berlin.de
Skype: Schubi87
ICQ: 200302764
Msn: mor...@schubotz.de


Re: [Wikidata] weekly summary #194

2016-02-02 Thread Markus Krötzsch

On 02.02.2016 02:07, Michael Karpeles wrote:

> Well, https://angryloki.github.io/wikidata-graph-builder will change my
> life, this is amazing. Thank you AngryLoki and all the hundreds of
> layers of contributors that led to a tool like this. Also Lydia et al,
> thanks for the hard work in keeping these updates going.


Indeed, this is very nifty. I also note that this uses some special 
features of our SPARQL endpoint that I did not know about (the "gas 
service"). It seems that this is a proprietary extension of BlazeGraph, 
which comes in very handy here.


Best

Markus




Re: [Wikidata] Wikidata - short biographies

2016-02-02 Thread Hampton Snowball
I was able to semi-successfully use RDFSlice with the dump using the
Windows command prompt.  However, maybe because it's a 5 GB dump file,
I am getting Java errors line after line as it goes through the file
(java.lang.StringIndexOutOfBoundsException: String index out of range - 1;
sometimes the last number changes).

I thought it might be a memory issue, but increasing the memory with the
-Xmx2G flag (or 3G, 4G) hasn't helped.  Any tips would be
appreciated.

Thanks
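Two things worth checking here. First, JVM options such as -Xmx4G must come before -jar on the command line; placed after the jar they are passed to RDFSlice as program arguments and have no effect. Second, as an alternative to slicing the RDF dump at all, the JSON entity dump can be streamed and filtered with stock Python. A sketch, assuming the documented dump layout of one entity object per line inside a top-level JSON array:

```python
import gzip
import json

def iter_bios(path):
    """Stream a wikidata-*-all.json.gz entity dump and yield, per entity,
    (entity id, English description, enwiki sitelink title)."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip().rstrip(",")     # entities are comma-separated
            if not line or line in ("[", "]"):  # skip the enclosing array
                continue
            entity = json.loads(line)
            desc = entity.get("descriptions", {}).get("en", {}).get("value")
            title = entity.get("sitelinks", {}).get("enwiki", {}).get("title")
            yield entity["id"], desc, title
```

Because it never holds more than one entity in memory, this sidesteps the JVM heap ceiling entirely; descriptions and sitelinks for other languages can be pulled the same way by changing the "en"/"enwiki" keys.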

On Mon, Feb 1, 2016 at 7:28 PM, Hampton Snowball 
wrote:

> Of course I meant sorry if this is a dumb question :)
>
>
>
> On Mon, Feb 1, 2016 at 7:13 PM, Hampton Snowball <
> hamptonsnowb...@gmail.com> wrote:
>
>> Sorry if this is a dump question (I'm not a developer).  To run the
>> command on the rdfslice program in mentions (" java -jar rdfslice.jar
>> -source | -patterns  -out  -order
>>  -debug ), can this be done with windows command
>> prompt? or do I need some special developer version of java/console?
>>
>> Thanks for the tool.
>>
>> On Sun, Jan 31, 2016 at 3:53 PM, Edgard Marx <
>> m...@informatik.uni-leipzig.de> wrote:
>>
>>> Hey,
>>> you can simply use RDFSlice (
>>> https://bitbucket.org/emarx/rdfslice/overview) directly on the dump
>>> file (https://dumps.wikimedia.org/wikidatawiki/entities/20160125/)
>>>
>>> best,
>>> Edgard
>>>
>>> On Sun, Jan 31, 2016 at 7:43 PM, Hampton Snowball <
>>> hamptonsnowb...@gmail.com> wrote:
>>>
 Hello,

 I am interested in a subset of Wikidata and I am trying to find the
 best way to get it without getting a larger dataset than necessary.

 Is there a way to just get the "bios" that appear on the Wikidata pages
 below the name of the person/organization, as well as the link to the
 English Wikipedia page, or to all Wikipedia pages?

 For example from: https://www.wikidata.org/wiki/Q1652291

 "Turkish female given name"
 https://en.wikipedia.org/wiki/H%C3%BClya
 and optionally https://de.wikipedia.org/wiki/H%C3%BClya

 I know there is SPARQL, for which this list previously helped me construct a
 query, but some requests seem to time out when looking at a large
 amount of data, so I am not sure this would work.

 The dumps I know of are the full dataset, but I am not sure if there are any
 other subset dumps available or a better way of grabbing this data.

 Thanks in advance,
 HS

