[Wikidata-l] making otherProjectsLinksByDefault non-beta

2015-03-22 Thread Amir E. Aharoni
Hi,

The "Other projects sidebar" beta feature is already on by default in some
projects, such as the Italian Wikipedia and the French Wikipedia. Search
for otherProjectsLinksByDefault in
http://noc.wikimedia.org/conf/highlight.php?file=InitialiseSettings.php

Is there any reason not to turn it on by default in all (or most)
projects?

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] OpenStreetMap + Wikidata

2015-03-10 Thread Amir E. Aharoni
Hi,

[ Aude and Christian Consonni, this should especially interest you. ]

I was throwing around ideas with a friend about how OpenStreetMap could be
integrated with Wikidata.

The thing I care about most in any software is internationalization.
Having a map in which the labels of towns, streets, and everything else
are translated into all languages sounds like a super-wonderful thing.

Wikidata allows labeling everything, translating everything, and attaching
properties to everything, so it sounds like it could be a good match.

But then the question of "what IS everything" came up. Wikidata was created
mostly with Wikipedia in mind, so Wikipedia's notability policies
influenced Wikidata. Roughly, Wikidata has items for everything about
which there is, or can be, a Wikipedia article, for things that are
useful, and for things that "fulfill some structural need".

Towns obviously have, or can have, a Wikipedia article about them, but
probably not every street or shop does. Do streets and shops fulfill a
structural need, or is that way too much?

If it's way too much, how can this be bridged, or federated, or whatever
the current popular word is? I don't even know exactly how OSM stores
labels and translations now, but it sounds like another instance of
Wikibase, if not Wikidata itself, could be used for it.

I don't have much to add, but I'd love to hear ideas from people who do
(again, Aude and Christian Consonni, I'm looking at you :) ).

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] mapping template parameters using Wikidata?

2015-03-04 Thread Amir E. Aharoni
> Maybe we should store these internationalised templates here on wikidata?

That's precisely what my opening post is about :)

I need help from people who understand Wikidata (and possibly DBpedia)
better than I do to figure out the details of getting it done.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-03-04 12:21 GMT+02:00 Joe Filceolaire :

> If you want to list the properties called by a template then you need a
> property which links to other properties - ie it has a property datatype.
> Property datatype is not available yet but is coming soon.
>
> You can then use the labels for this property in various languages to
> label the corresponding template parameters
>
> It seems to me that it would be better to store this info in the template
> itself - use lua to specify the property for each parameter and also to
> specify that the parameter label should be the label of the corresponding
> property in the language of that wiki.
>
> Maybe we should store these internationalised templates here on wikidata?
> On 4 Mar 2015 09:35, "Dimitris Kontokostas"  wrote:
>
>>
>>
>> On Wed, Mar 4, 2015 at 11:07 AM, Stas Malyshev 
>> wrote:
>>
>>> Hi!
>>>
>>> > architect}} at the top. How would ContentTranslation, a MediaWiki
>>> > extension installed on the Wikimedia cluster, know that the "name"
>>> > parameter is "naam" in Dutch?
>>>
>>> "Name" would be a bit tricky since I'm not sure if we have property
>>> called "name" but for something like date of birth wouldn't it be useful
>>> to link it in the template to
>>> https://www.wikidata.org/wiki/Property:P569 somehow? Is there such
>>> possibility?
>>>
>>
>> In DBpedia we have our own properties and the mappings should use these
>> instead.
>> Some exceptions exist for very popular vocabularies such as foaf:name but
>> I am not sure if we should allow direct mappings to a wikidata property if
>> an equivalent DBpedia property exists.
>> In this case it's
>> http://mappings.dbpedia.org/index.php/OntologyProperty:BirthDate
>> We already have some mappings in place but more are needed for complete
>> coverage
>>
>> http://mappings.dbpedia.org/index.php?title=Special%3ASearch&search=wikidata&go=Go
>>
>>
>>> With identifying properties, however - such as name - I'm not sure if
>>> this could be used.
>>>
>>
>> I agree that general properties such as name are difficult to interpret
>>
>>
>>>
>>> > Even if it is possible to query it, is it good to be dependent on an
>>> > external website for this? Maybe it makes sense to import the data from
>>> > dbpedia to Wikidata? It's absolutely not a rhetorical question - maybe
>>> > it is OK to use dbpedia.
>>>
>>> Well, in dbpedia it says name is foaf:name, but this could only be
>>> appropriate for humans (and maybe only in specific contexts), for other
>>> entities "name" may have completely different semantics. In Wikidata,
>>> however, properties are generic, so I wonder if it would be possible to
>>> keep context. dbPedia obviously does have context, but I'm not sure where it
>>> would be in Wikidata.
>>>
>>
>> We could keep the context in DBpedia and, with proper inter-linking, do
>> many interesting things.
>>
>> As we discussed yesterday, we could use DBpedia Live and check for
>> updated/stalled/missing values.
>> For example, if the previous values were the same in DBpedia/Wikipedia &
>> Wikidata and e.g. Wikipedia changes a value we could trigger an update
>> alert, or if a new value such as deathDate is introduced that does not
>> exist in Wikidata.
>> DBpedia would use dbo:deathDate, but using the link to P570 we could
>> allow an agent to do the check
>>
>>
>>>
>>> --
>>> Stas Malyshev
>>> smalys...@wikimedia.org
>>>
>>> ___
>>> Wikidata-l mailing list
>>> Wikidata-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>>
>>
>>
>>
>> --
>> Kontokostas Dimitris
>>
>> ___
>> Wikidata-l mailing list
>> Wikidata-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>
>>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>


Re: [Wikidata-l] mapping template parameters using Wikidata?

2015-03-04 Thread Amir E. Aharoni
> "Name" would be a bit tricky since I'm not sure if we have property
> called "name"

At this stage it's not actually important for me for the purposes of
ContentTranslation to map it to a Wikidata property. Any mapping between
parameter names in different languages would be enough.

One possibility is to store this mapping as a property at the Wikidata item
page of the template. It would be a complex property, but it's not
impossible.
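A sketch of what such a complex property might carry, expressed as plain Python data. Nothing like this exists on Wikidata today; every field name and the helper function below are invented purely for illustration.

```python
# Hypothetical data shape for a "complex property" holding per-language
# parameter names on the template's Wikidata item. All names are made up.
template_item = {
    "labels": {"en": "Infobox architect"},
    "parameter_map": [
        {"en": "name", "nl": "naam"},
        {"en": "birth_date", "nl": "geboortedatum"},
    ],
}

def translate_parameter(item, param, src_lang, dst_lang):
    """Find a template parameter's name in another language, or None."""
    for row in item["parameter_map"]:
        if row.get(src_lang) == param:
            return row.get(dst_lang)
    return None

print(translate_parameter(template_item, "name", "en", "nl"))  # naam
```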


Re: [Wikidata-l] mapping template parameters using Wikidata?

2015-03-04 Thread Amir E. Aharoni
> Using all these and interlanguage links I think we can create a (decent)
> service that can work. I can suggest a DBpedia gsoc project for this if
> some people are willing to mentor a student [2].

I know very little about dbpedia, but if it helps, I'm willing to support
it from the Wikipedia / ContentTranslation side.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-03-04 10:08 GMT+02:00 Dimitris Kontokostas :

> Hi all,
>
> What you can get from DBpedia is
> 1) template structure (all properties defined in a template)
> I am not sure why this was not included in the 2014 release but you can
> see an example in 3.9 [1]
> Our parser cannot handle very complex templates but it is a good start.
> I'll make sure these are included in the next release but it is also easy
> to create a service that extracts them on request
>
> 2) mappings wiki
> We are in the process of exporting our mappings in RDF using the [R2]RML
> vocabulary. We have code that does that for simple mappings but it's not
> ready to get merged yet.
> Hopefully we'll have this soon, and it will be quite easy to query and join.
> Even without that, we could get a partial functionality by translating &
> matching properties from #1
>
> 3) mappings wiki (ontology)
> links from ontology classes/properties to wikidata, at the moment they are
> stored in our wiki but could be stored in Wikidata instead as Daniel
> suggested.
>
> Using all these and interlanguage links I think we can create a (decent)
> service that can work. I can suggest a DBpedia gsoc project for this if
> some people are willing to mentor a student [2].
>
> What we would need from the Wikidata/DBpedia community is
> 1) more ontology links from DBpedia to Wikidata
> 2) contributions in the infobox mappings to cover more infoboxes for
> better coverage
>
> Best,
> Dimitris
>
>
> [1] http://downloads.dbpedia.org/3.9/en/template_parameters_en.ttl.bz2
> [2] dbpedia.org/gsoc2015/ideas
>
> On Tue, Mar 3, 2015 at 10:40 PM, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>> Thanks, that's a step forward. Now the question is how to bring this all
>> together.
>>
>> The context that interests me the most is translating an article in
>> ContentTranslation. Let's go with an architect.[1] I am translating an
>> article about an architect from English to Dutch, and it has {{Infobox
>> architect}} at the top. How would ContentTranslation, a MediaWiki extension
>> installed on the Wikimedia cluster, know that the "name" parameter is
>> "naam" in Dutch?
>>
>> Currently, in theory, it would:
>> 1. Find that there's a corresponding infobox in Dutch using the
>> interlanguage link:
>> https://nl.wikipedia.org/wiki/Sjabloon:Infobox_architect
>> 2. Go to dbpedia and find the English template:
>> http://mappings.dbpedia.org/index.php/Mapping_en:Infobox_architect
>> 3. Find that name is foaf:name
>> 4. Go to dbpedia and find the Dutch template:
>> http://mappings.dbpedia.org/index.php/Mapping_nl:Infobox_architect
>> 5. Find that foaf:name is naam
>>
>> ... And then repeat steps 1 to 5 for each parameter.
>>
>> Is this something that is possible to query now? (I'm not even talking
>> about performance.)
>>
>> Even if it is possible to query it, is it good to be dependent on an
>> external website for this? Maybe it makes sense to import the data from
>> dbpedia to Wikidata? It's absolutely not a rhetorical question - maybe it
>> is OK to use dbpedia.
>>
>> [1] {{Infobox cricketer}} exists in the Dutch Wikipedia, but doesn't
>> appear in the Dutch mappings in dbpedia.
>>
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>> 2015-03-03 20:39 GMT+02:00 Daniel Kinzler :
>>
>>> Am 03.03.2015 um 18:48 schrieb Amir E. Aharoni:
>>> > Trying again... It's a really important topic for me.
>>> >
>>> > How do I go about proposing storing information about templates
>>> parameters
>>> > mapping to the community? I kinda understand how Wikidata works, and
>>> it sounds
>>> > like something that could be implemented using the current properties,
>>> but
>>> > thoughts about moving this forward would be very welcome.
>>>
>>> Hi Amir!
>>>
>>> We had a call today with the dbPedia folks, about exactly this topic!

Re: [Wikidata-l] mapping template parameters using Wikidata?

2015-03-03 Thread Amir E. Aharoni
Thanks, that's a step forward. Now the question is how to bring this all
together.

The context that interests me the most is translating an article in
ContentTranslation. Let's go with an architect.[1] I am translating an
article about an architect from English to Dutch, and it has {{Infobox
architect}} at the top. How would ContentTranslation, a MediaWiki extension
installed on the Wikimedia cluster, know that the "name" parameter is
"naam" in Dutch?

Currently, in theory, it would:
1. Find that there's a corresponding infobox in Dutch using the
interlanguage link: https://nl.wikipedia.org/wiki/Sjabloon:Infobox_architect
2. Go to dbpedia and find the English template:
http://mappings.dbpedia.org/index.php/Mapping_en:Infobox_architect
3. Find that name is foaf:name
4. Go to dbpedia and find the Dutch template:
http://mappings.dbpedia.org/index.php/Mapping_nl:Infobox_architect
5. Find that foaf:name is naam

... And then repeat steps 1 to 5 for each parameter.
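Once both mapping tables are in hand, steps 2-5 above reduce to inverting the target-language table and joining on the shared ontology property. A toy, offline sketch; the scraping of the two mappings pages is left out, and the example tables and function name are assumptions:

```python
# Toy sketch of steps 2-5: join the English and Dutch mapping tables on
# the shared DBpedia ontology property. The tables below are hand-copied
# illustrations of what Mapping_en:Infobox_architect and
# Mapping_nl:Infobox_architect would yield.

def join_mappings(src, dst):
    """src, dst: {template parameter -> ontology property} for the
    source- and target-language templates. Returns
    {source parameter -> target parameter} wherever the property matches."""
    inverse = {prop: param for param, prop in dst.items()}
    return {param: inverse[prop]
            for param, prop in src.items()
            if prop in inverse}

en = {"name": "foaf:name", "birth_date": "birthDate"}
nl = {"naam": "foaf:name", "geboortedatum": "birthDate"}

print(join_mappings(en, nl))
# {'name': 'naam', 'birth_date': 'geboortedatum'}
```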

Is this something that is possible to query now? (I'm not even talking
about performance.)

Even if it is possible to query it, is it good to be dependent on an
external website for this? Maybe it makes sense to import the data from
dbpedia to Wikidata? It's absolutely not a rhetorical question - maybe it
is OK to use dbpedia.

[1] {{Infobox cricketer}} exists in the Dutch Wikipedia, but doesn't appear
in the Dutch mappings in dbpedia.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-03-03 20:39 GMT+02:00 Daniel Kinzler :

> Am 03.03.2015 um 18:48 schrieb Amir E. Aharoni:
> > Trying again... It's a really important topic for me.
> >
> > How do I go about proposing storing information about templates
> parameters
> > mapping to the community? I kinda understand how Wikidata works, and it
> sounds
> > like something that could be implemented using the current properties,
> but
> > thoughts about moving this forward would be very welcome.
>
> Hi Amir!
>
> We had a call today with the dbPedia folks, about exactly this topic!
>
> The dbPedia mapping wiki[1] has this information, at least to some extent.
> Let's say you are looking at {{Cricketer Infobox}} on en. You can look up
> the DBPedia mappings for the template parameters on their mapping page[2].
> There you can see that the "country" parameter maps to the "country"
> property in the dbpedia ontology[3], which in turn uses
> owl:equivalentProperty to cross-link P17[4].
>
> I assume this info is also available in machine readable form somewhere,
> but I
> don't know where offhand.
>
> Today we discussed that this mapping should also be available in the
> opposite
> direction: on Wikidata, you can use P1628 ("equivalent property") to
> cross-reference the dbPedia ontology. I just added this info to the country
> property.
>
> let me know if this helps :)
> -- daniel
>
> [1] http://mappings.dbpedia.org/index.php/
> [2] http://mappings.dbpedia.org/index.php/Mapping_en:Cricketer_Infobox
> [3] http://mappings.dbpedia.org/index.php/OntologyProperty:Country
> [4] https://www.wikidata.org/wiki/Property:P17
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>


Re: [Wikidata-l] mapping template parameters using Wikidata?

2015-03-03 Thread Amir E. Aharoni
Trying again... It's a really important topic for me.

How do I go about proposing storing information about templates parameters
mapping to the community? I kinda understand how Wikidata works, and it
sounds like something that could be implemented using the current
properties, but thoughts about moving this forward would be very welcome.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2014-09-24 14:18 GMT+03:00 Amir E. Aharoni :

> Hi,
>
> TL;DR: Did anybody consider using Wikidata items of Wikipedia templates to
> store multilingual template parameters mapping?
>
> Full explanation:
> As in many other projects in the Wikimedia world, templates are one of the
> biggest challenges in developing the ContentTranslation extension.
>
> Translating a template between languages is tedious - many templates are
> language-specific, many others have a corresponding template, but
> incompatible parameters, and even if the parameters are compatible, there
> is usually no comfortable mapping. Some work in that direction was done in
> DBpedia, but AFAIK it's far from complete.
>
> In ContentTranslation we have a simplistic mechanism for mapping between
> template parameters in pairs of languages, with proof of concept for three
> templates. We can enhance it with more templates, but the question is how
> much can it scale.
>
> Some templates shouldn't need such mapping at all - they should pull their
> data from Wikidata. This is gradually being done for infoboxes in some
> languages, and it's great.
>
> But not all templates can be easily mapped to Wikidata data. For example -
> reference templates, various IPA and language templates, quotation
> formatting, and so on. For these, parameter mapping could be useful, but
> doing this for a single language pair doesn't seem robust and reminds me of
> the old ways in which interlanguage links were stored.
>
> So, did anybody consider using Wikidata items of templates to store
> multilingual template parameters mapping?
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
>


[Wikidata-l] interlanguage links not updated on the mobile site

2015-02-18 Thread Amir E. Aharoni
[ Crossposting to mobile and wikidata lists. Sorry about the inconvenience.
You may want to use "Reply to all". ]

Hi,

The articles about the musician Eviatar Banai in the Hebrew and English
Wikipedias have existed for years.

On 2015-02-04 I created one in Catalan. Today I created one in Russian.

If I look at the Hebrew Wikipedia, I see interlanguage links to Catalan,
English and Russian:
https://he.wikipedia.org/wiki/%D7%90%D7%91%D7%99%D7%AA%D7%A8_%D7%91%D7%A0%D7%90%D7%99

*Now here's the really fun part:*
If I look at the mobile Hebrew Wikipedia, I only see an interlanguage link
to English.
https://he.m.wikipedia.org/wiki/%D7%90%D7%91%D7%99%D7%AA%D7%A8_%D7%91%D7%A0%D7%90%D7%99#/languages

I guess that it's a caching issue, but:

1. A link from Hebrew to Catalan doesn't appear after *two weeks*.
2. It's quite surprising that links for mobile and for desktop are cached
separately.

#2 may be OK if it helps with performance or something, but #1 seems
exaggerated to me. Does it really have to take two weeks or is it a bug?
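If it is indeed a stale cache, one workaround is an explicit purge through the MediaWiki API; action=purge and forcelinkupdate are standard API parameters, though whether the separately cached mobile language list honours them is a guess. A sketch that only builds the request, without sending it:

```python
# Build (but do not send) a MediaWiki API purge request for the page in
# question. Whether this refreshes the mobile cache is an assumption.
from urllib.parse import urlencode

def purge_request(host, title):
    """Return the API endpoint and POST body for purging one page."""
    body = urlencode({
        "action": "purge",
        "titles": title,
        "forcelinkupdate": 1,
        "format": "json",
    })
    return f"https://{host}/w/api.php", body

url, body = purge_request("he.wikipedia.org", "אביתר בנאי")
print(url)  # https://he.wikipedia.org/w/api.php
```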

Thanks :)

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] annotating red links

2015-02-12 Thread Amir E. Aharoni
The question is not so much where to point it, but how to put it into the
wiki syntax of the page.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-02-12 13:05 GMT+02:00 Gerard Meijssen :

> Hoi,
> The obvious is painful. When you need a placeholder... Why not use
> Reasonator? It is just a call to the Wikidata item that is associated with
> the page.
> Thanks,
>   Gerard
>
> On 12 February 2015 at 11:18, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>> 
>> The advantage of a template is that it doesn't touch core and doesn't
>> create new wiki syntax.
>>
>> Maybe this template could be a Lua module built into the Wikibase Client
>> extension, so it wouldn't have to be lamely synchronized across hundreds of
>> projects?
>> 
>>
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>> 2015-02-12 12:12 GMT+02:00 Lydia Pintscher 
>> :
>>
>>> I am also interested in solving this for the article placeholder feature
>>> where we show data from Wikidata when no local article exists.
>>> We can't really just put the link to the non existent article into the
>>> Wikidata item because the article might be created and then cover a
>>> completely unrelated topic. We already have this problem with red links on
>>> Wikipedia but it would be even worse on Wikidata.
>>> I think the way to go is to have the Wikidata identifier used in the
>>> link on the article. Question is how to do that nicely. I am happy to see
>>> the template experiment. Are people generally ok with the way it works?
>>>
>>> Cheers
>>> Lydia
>>>
>>> ___
>>> Wikidata-l mailing list
>>> Wikidata-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>>
>>>
>>
>> ___
>> Wikidata-l mailing list
>> Wikidata-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>
>>
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>


Re: [Wikidata-l] annotating red links

2015-02-12 Thread Amir E. Aharoni
> The other is to extend the link syntax similar to image syntax, for example
> with [[Article Name|Alternate Text|wd=Q1234]]. This should be minimally
> disruptive to the editors.

Yes - this would be more or less perfect, but it would require changes in
core MediaWiki. If nothing else works, then it's possible, but seems harder
to get through in practice.
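For concreteness: the [[Article Name|Alternate Text|wd=Q1234]] form quoted above is a proposal, not something MediaWiki parses today. A minimal sketch of how a preprocessor might extract the annotation, assuming exactly that three-part shape:

```python
import re

# Extract the proposed red-link annotation [[Title|Text|wd=Qnnn]].
# This syntax does not exist in MediaWiki; the pattern is illustrative.
LINK_RE = re.compile(r"\[\[([^|\]]+)\|([^|\]]+)\|wd=(Q\d+)\]\]")

def parse_annotated_link(wikitext):
    """Return {title, text, item} for the first annotated link, or None."""
    m = LINK_RE.search(wikitext)
    if m is None:
        return None
    title, text, qid = m.groups()
    return {"title": title, "text": text, "item": qid}

print(parse_annotated_link("See [[Douglas Adams|Adams|wd=Q42]]."))
# {'title': 'Douglas Adams', 'text': 'Adams', 'item': 'Q42'}
```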


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-02-12 12:20 GMT+02:00 Smolenski Nikola :

> Citiranje "Amir E. Aharoni" :
> > TL;DR: How can a red link be annotated in a semantic way with a foreign
> > article title or a Wikidata Q item number?
> >
> > Imagine: I'm writing a Wikipedia article in Russian. There's a red link
> in
> > it. I don't have time to write the target article for that link now, but
> > I'm sure that it should exist. In fact, that article does exist in the
> > English Wikipedia.
> >
> > I want the link to be red (for the usual wiki reasons), but until the
> > Russian article is written, I want to give the software a hint about
> which
> > topic it is supposed to be about. Telling it the English article name
> would
> > be one way to do it. Giving it the Wikidata Q item number would be an
> even
> > better way to do it.
> >
> > Unfortunately, MediaWiki does not currently have true syntax to do
> either.
> > (Correct me if I'm wrong.)
> >
> > Some Wikipedias may have templates that do something like this (e.g.
> > Russian: https://ru.wikipedia.org/wiki/Template:En ). But there's
> nothing
> > that is uniform to all projects.
> >
> > *Why* is it useful to give the software this hint in the first place?
> Most
> > simplistically, it's useful to the reader - in case that reader knows
> > English, she can at least read something.
> >
> > But there's something bigger. When the ContentTranslation extension
> > translates links, it automatically adapts links that can be found. What
> to
> > do about those that can't be auto-adapted? It frequently happens when
> > Wikipedians translate articles that many links in the created articles
> turn
> > out to be red. We'd love to get ContentTranslation to help the
> translators
> > make those articles by writing relevant articles with as few clicks as
> > possible, and that is only possible by annotating the red links with the
> > topics to which they belong.
> >
> > So, any ideas?
> > What do other Wikipedias do for such annotation?
> > Is it imaginable to add wiki syntax for such a thing?
> > Can anybody think of a hack that reuses the current [[link]] syntax to
> add
> > such annotation?
>
> One possibility would be to allow creation of links to nonexisting
> articles on Wikidata, perhaps by using a new "nonexisting article" badge.
> Of course, this could lead to various problems on its own.
>
> The other is to extend the link syntax similar to image syntax, for example
> with [[Article Name|Alternate Text|wd=Q1234]]. This should be minimally
> disruptive to the editors.
>
> Either one of these solutions would be useful for automated article
> creation, and other purposes, for example finding multiple unwritten
> articles with the same name about different topics, or finding erroneous
> links in Wikidata.
>
>
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>


Re: [Wikidata-l] annotating red links

2015-02-12 Thread Amir E. Aharoni

The advantage of a template is that it doesn't touch core and doesn't
create new wiki syntax.

Maybe this template could be a Lua module built into the Wikibase Client
extension, so it wouldn't have to be lamely synchronized across hundreds of
projects?



--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-02-12 12:12 GMT+02:00 Lydia Pintscher :

> I am also interested in solving this for the article placeholder feature
> where we show data from Wikidata when no local article exists.
> We can't really just put the link to the non existent article into the
> Wikidata item because the article might be created and then cover a
> completely unrelated topic. We already have this problem with red links on
> Wikipedia but it would be even worse on Wikidata.
> I think the way to go is to have the Wikidata identifier used in the link
> on the article. Question is how to do that nicely. I am happy to see the
> template experiment. Are people generally ok with the way it works?
>
> Cheers
> Lydia
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>


Re: [Wikidata-l] annotating red links

2015-02-11 Thread Amir E. Aharoni
Yeah, looking into labels is certainly something that I considered, but
that is by definition only a guess and not as bulletproof as Q numbers.

We considered doing stuff like:
* [[not-yet-written article about Douglas Adams|Douglas Adams]]
* [[not-yet-written article about Douglas Adams#Q42|Douglas Adams]]

... and this would kinda work, but would leave a lot of mess for the
community editors to clean up. The template way, suggested by Gerard, is
similar and seems slightly less messy to me. But only slightly.

(If anybody cares, the relevant task in ContentTranslation is
https://phabricator.wikimedia.org/T88580 .)


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-02-12 7:51 GMT+02:00 Maarten Dammers :

> Hi Amir,
>
> Amir E. Aharoni schreef op 11-2-2015 om 13:12:
>
>> If I may dream for a moment, this should be something that can be used in
>> all Wikipedias, and without copying this template everywhere, but built
>> into the site's software :)
>>
> Exactly, the template based approach doesn't scale at all. You have to
> somehow make it automatic. One thing I thought about is adding suggested
> sitelinks to Wikidata. The software would encounter a red link and would
> look in Wikidata if it can find an item with a suggested sitelink of the
> same title. Huge software overhaul so I don't see that happening.
>
> Another approach that is probably already possible right now:
> * Take an article with a red link
> * Look at the links in the article in other languages.
> * If you find a link that points to another article which has the same
> label as the red link in the same language, link to it
>
> I wonder how many good results that would give.
>
> Maarten
>
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
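Maarten's label-matching heuristic above can be sketched offline. The hard-coded label table stands in for what a real version would fetch from the Wikidata API (e.g. wbgetentities), and the function name is an assumption:

```python
# Offline sketch of the heuristic: for a red link, look at the Wikidata
# items linked from the other-language version of the article and pick
# the one whose label in the red link's language equals the red title.
# LABELS is toy data standing in for real Wikidata labels.

LABELS = {  # item -> {language code: label}
    "Q42": {"en": "Douglas Adams", "ru": "Адамс, Дуглас"},
    "Q3107329": {"en": "The Hitchhiker's Guide to the Galaxy"},
}

def resolve_red_link(red_title, red_lang, candidate_items):
    """candidate_items: items of the links found in the foreign article."""
    for item in candidate_items:
        if LABELS.get(item, {}).get(red_lang) == red_title:
            return item
    return None

print(resolve_red_link("Адамс, Дуглас", "ru", ["Q3107329", "Q42"]))  # Q42
```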


Re: [Wikidata-l] annotating red links

2015-02-11 Thread Amir E. Aharoni
Yes, Gerard and Jane - this looks like what I'm talking about.

If I may dream for a moment, this should be something that can be used in
all Wikipedias, and without copying this template everywhere, but built
into the site's software :)


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-02-11 22:47 GMT+02:00 Gerard Meijssen :

> Hoi,
> Have a look at this article ...
> https://en.wikipedia.org/wiki/Herman_Skolnik_Award
> Thanks to Magnus for a blog post I am still to write.
> Thanks,
>  GerardM
>
> On 11 February 2015 at 20:26, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>> Hi,
>>
>> TL;DR: How can a red link be annotated in a semantic way with a foreign
>> article title or a Wikidata Q item number?
>>
>> Imagine: I'm writing a Wikipedia article in Russian. There's a red link
>> in it. I don't have time to write the target article for that link now, but
>> I'm sure that it should exist. In fact, that article does exist in the
>> English Wikipedia.
>>
>> I want the link to be red (for the usual wiki reasons), but until the
>> Russian article is written, I want to give the software a hint about which
>> topic it is supposed to be about. Telling it the English article name would
>> be one way to do it. Giving it the Wikidata Q item number would be an even
>> better way to do it.
>>
>> Unfortunately, MediaWiki does not currently have true syntax to do
>> either. (Correct me if I'm wrong.)
>>
>> Some Wikipedias may have templates that do something like this (e.g.
>> Russian: https://ru.wikipedia.org/wiki/Template:En ). But there's
>> nothing that is uniform to all projects.
>>
>> *Why* is it useful to give the software this hint in the first place?
>> Most simplistically, it's useful to the reader - in case that reader knows
>> English, she can at least read something.
>>
>> But there's something bigger. When the ContentTranslation extension
>> translates links, it automatically adapts links that can be found. What to
>> do about those that can't be auto-adapted? It frequently happens when
>> Wikipedians translate articles that many links in the created articles turn
>> out to be red. We'd love to get ContentTranslation to help the translators
>> make those articles by writing relevant articles with as few clicks as
>> possible, and that is only possible by annotating the red links with the
>> topics to which they belong.
>>
>> So, any ideas?
>> What do other Wikipedias do for such annotation?
>> Is it imaginable to add wiki syntax for such a thing?
>> Can anybody think of a hack that reuses the current [[link]] syntax to
>> add such annotation?
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>> ___
>> Wikidata-l mailing list
>> Wikidata-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>
>>
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] annotating red links

2015-02-11 Thread Amir E. Aharoni
2015-02-11 22:14 GMT+02:00 Ricordisamoa :
> Adding non-existing pages to Wikidata items?
> Using a syntax like [Q42[notexistingpagetitle]]?

Is this a suggestion for possible syntax or something that actually works
somewhere?

But yeah, something like this - something that includes the title of a page
that doesn't exist in this wiki, but may come to exist some day, and the Q
number.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] annotating red links

2015-02-11 Thread Amir E. Aharoni
Hi,

TL;DR: How can a red link be annotated in a semantic way with a foreign
article title or a Wikidata Q item number?

Imagine: I'm writing a Wikipedia article in Russian. There's a red link in
it. I don't have time to write the target article for that link now, but
I'm sure that it should exist. In fact, that article does exist in the
English Wikipedia.

I want the link to be red (for the usual wiki reasons), but until the
Russian article is written, I want to give the software a hint about which
topic it is supposed to be about. Telling it the English article name would
be one way to do it. Giving it the Wikidata Q item number would be an even
better way to do it.

Unfortunately, MediaWiki does not currently have true syntax to do either.
(Correct me if I'm wrong.)

Some Wikipedias may have templates that do something like this (e.g.
Russian: https://ru.wikipedia.org/wiki/Template:En ). But there's nothing
that is uniform to all projects.

*Why* is it useful to give the software this hint in the first place? Most
simplistically, it's useful to the reader - in case that reader knows
English, she can at least read something.

But there's something bigger. When the ContentTranslation extension
translates links, it automatically adapts links that can be found. What to
do about those that can't be auto-adapted? It frequently happens when
Wikipedians translate articles that many links in the created articles turn
out to be red. We'd love to get ContentTranslation to help the translators
make those articles by writing relevant articles with as few clicks as
possible, and that is only possible by annotating the red links with the
topics to which they belong.

So, any ideas?
What do other Wikipedias do for such annotation?
Is it imaginable to add wiki syntax for such a thing?
Can anybody think of a hack that reuses the current [[link]] syntax to add
such annotation?
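
For what it's worth, here is a rough sketch (against the real wbgetentities
response shape) of the lookup such an annotation would need. The
[[wikidata=Q…|…]] syntax in the output is purely hypothetical — it's the kind
of syntax being asked about, not something MediaWiki actually supports:

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def label_request_url(qid, lang):
    """Build a wbgetentities request for one item's label in one language."""
    return API + "?" + urlencode({
        "action": "wbgetentities",
        "ids": qid,
        "props": "labels",
        "languages": lang,
        "format": "json",
    })

def extract_label(response, qid, lang):
    """Pull the label string out of a wbgetentities JSON response, or None."""
    labels = response.get("entities", {}).get(qid, {}).get("labels", {})
    entry = labels.get(lang)
    return entry["value"] if entry else None

def red_link_wikitext(qid, lang, response):
    """Render an annotated red link using a HYPOTHETICAL syntax."""
    label = extract_label(response, qid, lang)
    if label is None:
        return None
    return "[[wikidata=%s|%s]]" % (qid, label)

# Trimmed sample of a real wbgetentities response (values illustrative).
sample = {"entities": {"Q42": {"labels": {
    "ru": {"language": "ru", "value": "Дуглас Адамс"}}}}}
print(red_link_wikitext("Q42", "ru", sample))  # [[wikidata=Q42|Дуглас Адамс]]
```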

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] descriptions in mobile app

2015-02-09 Thread Amir E. Aharoni
Manual descriptions are not an entire waste of time.

Magnus writes in his post <http://magnusmanske.de/wordpress/?p=265>:
> And some people have seen my Reasonator
<http://tools.wmflabs.org/reasonator/?q=Q1339> tool, where (for some item
types, and some languages) rather long descriptions can be generated.

It's not necessarily good that they are long. For the mobile app it's better if
they are short.

But the "some item types, and some languages" part is the real problem.
Only some. It's quite possible that in the future Reasonator will cover all
languages and all data types and will also be tweaked to provide appropriate
length, maybe even different lengths according to context. Reasonator's
natural-language sentence creation works for a very small number of
languages. If it were as easy to translate as MediaWiki UI messages, I
wouldn't object to its wider use, but AFAIK this is not the case now.

And it's not that good for English either. Reasonator is not smart enough
at the moment to describe people with several qualifications. The current
Reasonator-generated description of Peter Garrett
<https://tools.wmflabs.org/reasonator/?find=peter+garrett> is vastly
inferior to the manually-written description. Compare:
1. "Australian singer and politician, Minister for School Education, Early
Childhood and Youth, Minister for Sustainability, Environment, Water,
Population and Communities (Australia), and Member of the Australian House
of Representatives (*1953) ♂"
2. "Australian politician and Midnight Oil lead singer".
Basic human intuition tells me that for most Wikipedia readers, who simply
want to know "Who is Peter Garrett?", #2 is far more useful. #1 has
oversize descriptions of all his political roles, and *doesn't* have the
name of the rock band that made him popular. This is just one example out of
hundreds of thousands that could be brought up. For what it's worth, #2 is
also easier to translate manually.

It's important to emphasize at this point that I have the utmost respect for
Magnus's brilliant work. It's just not ready to completely replace the
manual descriptions.

A practical solution for now is to have a system for manual translation of
descriptions, which shows the Reasonator descriptions as a translation aid,
similarly to how the Translate extension shows translation memory
suggestions. Also, a way to manually tweak descriptions can take Reasonator
further, for example a way to tell it that for the Peter Garrett item
there's no need to include a long list of all his roles in the Australian
government.

Oh, and even if you can some day stop manually translating
descriptions, you cannot stop manually translating labels. At
most, some can be copied from Wikipedia, but even then many of them need
post-import fixing.

So all of this brings me back to https://phabricator.wikimedia.org/T64695 .
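
To make the fallback point above concrete, here's a minimal sketch of the
client-side logic an app could use against the entity JSON's descriptions map.
The simple fallback chain is an illustrative simplification of MediaWiki's real
language-fallback rules, and the sample entity is trimmed:

```python
def describe(entity, preferred, fallbacks=("en",)):
    """Return (description, language), trying the preferred language
    first, then each fallback; (None, None) if nothing is available."""
    descriptions = entity.get("descriptions", {})
    for lang in (preferred,) + tuple(fallbacks):
        if lang in descriptions:
            return descriptions[lang]["value"], lang
    return None, None

# Trimmed entity JSON; the description is the manual one quoted above.
peter = {"descriptions": {"en": {
    "language": "en",
    "value": "Australian politician and Midnight Oil lead singer"}}}
print(describe(peter, "he"))  # no Hebrew description: falls back to English
```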


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-02-09 12:58 GMT+02:00 Daniel Kinzler :

> @Gerard, @Magnus: please help me out here.
>
> I agree that automatic descriptions are very useful. I also think that in
> *some*
> cases, manual descriptions are more useful, and maybe even needed.
>
> I definitely think that 3rd party consumers of wikidata should not have to
> think
> about whether descriptions have been written manually or were created
> automatically. This should be completely transparent.
>
> So, if you want to help with making automated description a reality,
> please make
> suggestions that take into account the above points, and also consider the
> mechanisms for language fallback.
>
> The only thing that I can think of right away is simply inserting automated
> descriptions by bot. This isn't ideal, but I can't think of a better
> solution
> that wouldn't be hugely complicated (and would thus not be implemented any
> time
> soon). Maybe you have ideas?
>
> -- daniel
>
>
> Am 09.02.2015 um 11:41 schrieb Magnus Manske:
> > Manual descriptions are, in the vast majority of cases, a waste of
> volunteer
> > time. Alternative:
> > http://magnusmanske.de/wordpress/?p=265
> >
> > On Sun Feb 08 2015 at 17:37:42 Gerard Meijssen <
> gerard.meijs...@gmail.com
> > <mailto:gerard.meijs...@gmail.com>> wrote:
> >
> > Hoi,
> > How does that help ? The point is exactly that there is no point to
> > descriptions. Why iterate on a dog it will still be a mutt.
> > Thanks,
> > GerardM
> >
> > On 8 February 2015 at 14:07, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il
> > <mailto:amir.ahar...@mail.huji.ac.il>>

Re: [Wikidata-l] descriptions in mobile app

2015-02-08 Thread Amir E. Aharoni
I'd rather see it not as something terribly disappointing, but as an
opportunity to find a way to fill item descriptions more efficiently.

Basically, to find some cycles to resolve
https://phabricator.wikimedia.org/T64695
On 8 Feb 2015 at 10:33, "Gerard Meijssen"  wrote:

> Hoi,
> I understand that item descriptions are going to be used in a mobile app.
> In my opinion that is seriously disappointing because it is not realistic
> to expect enough coverage in any language. Particularly in the small
> languages it will not be really useful.
>
> My question is: we have had automated descriptions for a long time. What
> is it that makes them not used?
>
> Thanks,
>  GerardM
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Impossible to add interwiki links

2015-01-28 Thread Amir E. Aharoni
This is probably not-entirely-useful, because it's just a complaint and not
a fix, so apologies for that, but if I may...

My dream solution for some of those issues with integration of Wikipedia
and Wikidata is letting people edit Wikidata without leaving Wikipedia (or
Wikivoyage, or whatever).

Adding interlanguage links to an article that doesn't have any links
currently already works well for over a year: The editor sees a dialog box
that does everything needed without leaving Wikipedia, in the user's own
language and without ever leaving the page. This is awesome, and it should
work the same way for adding links to a page that already has some.

And it should be like that for editing statements, too. In the same dream
setup that I am thinking of, statements are edited right in the infobox,
and the values are stored in wikidata.org. Clever local gadgets and
templates on the Russian Wikipedia are frequently brought up when
discussing this topic. I wish I could see something like this in all
languages as part of the Wikibase/Wikidata product and not as a local hack,
as great as it is.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2015-01-28 16:40 GMT-08:00 Romaine Wiki :

> Hi all,
>
> Again, for already so many times, many users have complained on nl-wiki
> that it is for them impossible now to add interwikilinks on Wikidata. They
> are sick of all the changes all the time, especially if they experience the
> new design as impossible to use, as it now is.
>
> They are lost with the current design and can't add new interwikilinks.
> That this pops-up so many times is a serious problem that needs a solution.
>
> Wikidata is not meant for techno users only, but that is how many regular
> users experience Wikidata. If regular users find themselves impossible to
> add/update pages, the software needs a big change to re-enable them to work
> with Wikidata again.
>
>
> Ow, I am just the messenger... [1]
>
> Romaine
>
>
>
> [1] https://www.wikidata.org/wiki/Q2515525
>
>
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] coordinates: Wikidata vs Wikipedias

2014-12-15 Thread Amir E. Aharoni
Well, actually, there were some issues back then as well. Notice that I
opened that bug in January 2012. It was fixed, and I reopened it for the
new app.



--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2014-12-15 19:50 GMT+02:00 Andy Mabbett :

> It seemed to work fine with the previous version of "nearby" , in v1 of
> the app.
>
> On 15 December 2014 at 16:46, Amir E. Aharoni
>  wrote:
> > Hi,
> >
> > There's this bug:
> > https://phabricator.wikimedia.org/T35704
> >
> > Basically, the "Nearby" function in the Wikipedia Android app can only
> work
> > if the coordinates template in the Wikipedia in the relevant language
> uses
> > the magic word from the GeoData extension.
> >
> > And I wonder: Is this really needed? Updating templates in almost 300
> > languages doesn't scale well, and Wikidata already supports coordinates.
> It
> > also makes more general sense to me to query a structured database like
> > Wikidata instead of poking around with templates and magic words as it is
> > done with GeoData.
> >
> > But that's me, and I might be missing something.
> >
> > Is Wikidata actually ready for this technically?
> > Are coordinates filled for all the relevant items, or is it still better
> > supported in Wikipedias?
> >
> > Thanks.
> >
> > --
> > Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> > http://aharoni.wordpress.com
> > ‪“We're living in pieces,
> > I want to live in peace.” – T. Moore‬
> >
> > ___
> > Wikidata-l mailing list
> > Wikidata-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikidata-l
> >
>
>
>
> --
> Andy Mabbett
> @pigsonthewing
> http://pigsonthewing.org.uk
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] coordinates: Wikidata vs Wikipedias

2014-12-15 Thread Amir E. Aharoni
Hi,

There's this bug:
https://phabricator.wikimedia.org/T35704

Basically, the "Nearby" function in the Wikipedia Android app can only work
if the coordinates template in the Wikipedia in the relevant language uses
the magic word from the GeoData extension.

And I wonder: Is this really needed? Updating templates in almost 300
languages doesn't scale well, and Wikidata already supports coordinates. It
also makes more general sense to me to query a structured database like
Wikidata instead of poking around with templates and magic words as it is
done with GeoData.
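
For comparison, pulling coordinates from Wikidata takes very little client
code; this sketch parses the P625 ("coordinate location") claim from the
standard entity JSON (sample trimmed, numbers illustrative):

```python
def coordinates(entity):
    """Extract (lat, lon) from an item's P625 claim in entity JSON."""
    for claim in entity.get("claims", {}).get("P625", []):
        value = claim.get("mainsnak", {}).get("datavalue", {}).get("value")
        if value:
            return value["latitude"], value["longitude"]
    return None

berlin = {"claims": {"P625": [{"mainsnak": {"datavalue": {"value": {
    "latitude": 52.516666, "longitude": 13.383333}}}}]}}
print(coordinates(berlin))  # (52.516666, 13.383333)
```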

But that's me, and I might be missing something.

Is Wikidata actually ready for this technically?
Are coordinates filled for all the relevant items, or is it still better
supported in Wikipedias?

Thanks.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Wikidata descriptions in the beta iOS Wikipedia app

2014-11-17 Thread Amir E. Aharoni
Screenshot:
https://twitter.com/aharoni/status/534292798430650369

The technical questions should be directed to the mobile apps team - Monte,
Brion, Dan et al.

My impression is that it simply pulls the description for the item from
Wikidata, if a description is available and there's not fallback, but I
might be wrong.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2014-11-17 10:52 GMT+02:00 Federico Leva (Nemo) :

> Amir E. Aharoni, 15/11/2014 20:56:
>
>> I haven't seen this mentioned in the context of Wikidata yet, so here:
>> The latest beta version of the Wikipedia app for iOS (iPhone, iPad,
>> iPod) shows descriptions from Wikidata as summaries in the search results.
>>
>
> Interesting; screenshots appreciated. How does it cooperate with
> TextExtracts? Does it also do crosswiki searches, like wdsearch? Users and
> wikis interested in such functionality should install it:
> https://en.wikipedia.org/wiki/MediaWiki_talk:Wdsearch.js
>
> Nemo
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] Wikidata descriptions in the beta iOS Wikipedia app

2014-11-15 Thread Amir E. Aharoni
Hi,

I haven't seen this mentioned in the context of Wikidata yet, so here:
The latest beta version of the Wikipedia app for iOS (iPhone, iPad, iPod)
shows descriptions from Wikidata as summaries in the search results.

If you have an iOS device and want to see it in action, see the
instructions here:
https://www.mediawiki.org/wiki/Wikimedia_Apps#Stay_on_the_Cutting_Edge

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] Missing Wikipedia links tool - thought

2014-10-30 Thread Amir E. Aharoni
ZOMFG, the tool that Denny introduced yesterday as a birthday gift is
unbelievably useful and fun.

Here are a few thoughts I had about it:

I went over all the pages for the Hebrew-English pair. There were only 36,
and that is suspiciously low. Were all the articles in these languages
tested by this tool or only a subset?

Even though almost all of the tool's suggestions were correct, it would be
problematic to fix these automatically. There were several types of article
pairs:
* Unrelated because one of the suggested pages was a disambiguation page
and the other was not. Sometimes there was a link to the correct related
page from the disambig page. If anybody makes a new version, this certainly
should be corrected.
* Related, but with explicit interlanguage links in the articles' source
code. This required old-style interwiki conflict resolution. There was a
surprisingly high number of these. I managed to resolve all the conflicts
manually, but it did take a few minutes for each case. Examples from
en.wikipedia: [[Bombe]], [[Bomba (cryptography)]], [[Diary of a Wimpy
Kid]], [[PFLAG]].
* Related, with a Wikidata item for each page, but without conflicts, so
easily mergeable. This can be done by a bot once it is identified for sure.
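
A hedged sketch of how a bot could tell the "easily mergeable" case from the
conflict case, given two items' sitelink maps in the standard entity-JSON
shape (the sample items and titles below are illustrative):

```python
def sitelink_conflicts(item_a, item_b):
    """Return wikis where both items have a sitelink but the titles
    differ - these need old-style manual conflict resolution."""
    links_a = {s["site"]: s["title"] for s in item_a["sitelinks"].values()}
    links_b = {s["site"]: s["title"] for s in item_b["sitelinks"].values()}
    return sorted(site for site in links_a.keys() & links_b.keys()
                  if links_a[site] != links_b[site])

def bot_mergeable(item_a, item_b):
    """Conflict-free pairs could in principle be merged automatically."""
    return not sitelink_conflicts(item_a, item_b)

he_item = {"sitelinks": {"hewiki": {"site": "hewiki", "title": "פצצה"}}}
en_item = {"sitelinks": {"enwiki": {"site": "enwiki", "title": "Bombe"}}}
print(bot_mergeable(he_item, en_item))  # no shared wiki, so no conflict
```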

Adding links to a page without any language links shows a box to write a
language and a target title, and that's it. Adding a link to a new language
to a page which already has some interlanguage links opens the whole item
page in Wikidata (a whole other website!) and requires scrolling, editing
the links, and in many cases - merging the items manually. The result is
actually the same, so it would be very nice if the second case wouldn't be
so complicated.

That's it for now - I hope somebody finds it useful :)

I finished with Hebrew, and I'm going on to Russian, which has over a
thousand article pairs. IT'S INSANELY FUN.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2014-10-29 19:56 GMT+02:00 Denny Vrandečić :

> Folks,
>
> as you know, many Googlers are huge fans of Wikipedia. So here’s a little
> gift for Wikidata’s second birthday.
>
> Some of my smart colleagues at Google have run a few heuristics and
> algorithms in order to discover Wikipedia articles in different languages
> about the same topic which are missing language links between the articles.
> The results contain more than 35,000 missing links with a high confidence
> according to these algorithms. We estimate a precision of about 92+% (i.e.
> we assume that less than 8% of those are wrong, based on our evaluation).
> The dataset covers 60 Wikipedia language editions.
>
> Here are the missing links, available for download from the WMF labs
> servers:
>
> https://tools.wmflabs.org/yichengtry/merge_candidate.20141028.csv
>
> The data is published under CC-0.
>
> What can you do with the data? Since it is CC-0, you can do anything you
> want, obviously, but here are a few suggestions:
>
> There’s a small tool on WMF labs that you can use to verify the links (it
> displays the articles side by side from a language pair you select, and
> then you can confirm or contradict the merge):
>
> https://tools.wmflabs.org/yichengtry
>
> The tool does not do the change in Wikidata itself, though (we thought it
> would be too invasive if we did that). Instead, the results of the human
> evaluation are saved on WMF labs. You are welcome to take the tool and
> extend it with the possibility to upload the change directly on Wikidata,
> if you so wish, or, once the data is verified, to upload the results.
>
> Also, Magnus Manske is already busy uploading the data to the Wikidata
> game, so you can very soon also play the merge game on the data directly.
> He is also creating the missing items on Wikidata. Thanks Magnus for a very
> pleasant cooperation!
>
> I want to call out to my colleagues at Google who created the dataset -
> Jiang Bian and Si Li - and to Yicheng Huang, the intern who developed the
> tool on labs.
>
> I hope that this small data release can help a little with further
> improving the quality of Wikidata and Wikipedia! Thank you all, you are
> awesome!
>
> Cheers,
> Denny
>
>
>
> On Wed Oct 29 2014 at 10:52:05 AM Lydia Pintscher <
> lydia.pintsc...@wikimedia.de> wrote:
>
> Hey folks :)
>
> Today Wikidata is turning two. It amazes me what we've achieved in
> just 2 years. We've built an incredible project that is set out to
> change the world. Thank you everyone who has been a part of this so
> far.
> We've put together some notes and opinions. And there are presents as
> well! Check them out and leave your birthday wishes:
> https://www.wikidata.org/wiki/Wikidata:Second_Birthday
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesells

Re: [Wikidata-l] packaging and deployment time

2014-09-27 Thread Amir E. Aharoni
Cool, thanks for the pointer!


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2014-09-27 14:47 GMT+03:00 Katie Filbert :

> On Sat, Sep 27, 2014 at 1:38 PM, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>> Hi,
>>
>> The following little change by myself was merged by Aude on September 9:
>> https://gerrit.wikimedia.org/r/#/c/159070/
>>
>> As far as I can see, it is not deployed to Wikipedia yet.
>>
>
> We deployed new code to test.wikidata on September 11 but had to revert
> due to caching issues there and then skipped deployment to wikidata.  Hence
> your patch, unfortunately, is not deployed yet.
>
> We will deploy on wikidata again on Tuesday since we had no issues this
> past Thursday on test.wikidata with our new code.
>
>
>> It's not really urgent, but it made me curious: What is the deployment
>> schedule for Wikidata extensions?
>>
>
> We normally deploy every two weeks, except perhaps around holidays or
> Wikimania, we skip deployment.
>
> Our schedule is here: https://www.mediawiki.org/wiki/Wikidata_deployment
>
> Cheers,
>
> Katie
>
>
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>> ___
>> Wikidata-l mailing list
>> Wikidata-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>
>>
>
>
> --
> Katie Filbert
> Wikidata Developer
>
> Wikimedia Germany e.V. | Tempelhofer Ufer 23-24, 10963 Berlin
> Phone (030) 219 158 26-0
>
> http://wikimedia.de
>
> Wikimedia Germany - Society for the Promotion of free knowledge eV Entered
> in the register of Amtsgericht Berlin-Charlottenburg under the number 23
> 855 as recognized as charitable by the Inland Revenue for corporations I
> Berlin, tax number 27/681/51985.
>
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] packaging and deployment time

2014-09-27 Thread Amir E. Aharoni
Hi,

The following little change by myself was merged by Aude on September 9:
https://gerrit.wikimedia.org/r/#/c/159070/

As far as I can see, it is not deployed to Wikipedia yet.

It's not really urgent, but it made me curious: What is the deployment
schedule for Wikidata extensions?

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] extensions not on translatewiki.net

2014-09-27 Thread Amir E. Aharoni
Hi,

Several Wikidata-related extensions are not translatable on
translatewiki.net.

The ones I could find are:
* Wikibase DataModel
* Wikibase DataModel JavaScript
* Wikidata build
* WikimediaBadges
* The various DataValues extensions

All extensions need at least a translatable description for
Special:Version, and some of the above have actual messages to translate.

If I understand correctly, the reason for this is that it's more
comfortable for their developers to have the code review in GitHub, and
translatewiki's MediaWiki extensions L10n export scripts work only with
Gerrit. The question is, is it feasible to sync GitHub and Gerrit, so that
these extensions would be easily translatable?

If I understand correctly, something like this is already done for some
MediaWiki extensions, among them SemanticResultFormats and Maps.

Thanks for any assistance.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] mapping template parameters using Wikidata?

2014-09-24 Thread Amir E. Aharoni
Hi,

TL;DR: Did anybody consider using Wikidata items of Wikipedia templates to
store multilingual template parameters mapping?

Full explanation:
As in many other projects in the Wikimedia world, templates are one of the
biggest challenges in developing the ContentTranslation extension.

Translating a template between languages is tedious - many templates are
language-specific, many others have a corresponding template, but
incompatible parameters, and even if the parameters are compatible, there
is usually no comfortable mapping. Some work in that direction was done in
DBpedia, but AFAIK it's far from complete.

In ContentTranslation we have a simplistic mechanism for mapping between
template parameters in pairs of languages, with proof of concept for three
templates. We can enhance it with more templates, but the question is how
much can it scale.

Some templates shouldn't need such mapping at all - they should pull their
data from Wikidata. This is gradually being done for infoboxes in some
languages, and it's great.

But not all templates can be easily mapped to Wikidata data. For example -
reference templates, various IPA and language templates, quotation
formatting, and so on. For these, parameter mapping could be useful, but
doing this for a single language pair doesn't seem robust and reminds me of
the old ways in which interlanguage links were stored.

So, did anybody consider using Wikidata items of templates to store
multilingual template parameters mapping?

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Wikidata / Wikipedia integration : redlinks and items

2014-09-15 Thread Amir E. Aharoni
2014-09-15 16:16 GMT+03:00 Lydia Pintscher :
> As for simply allowing sitelinks to non-existing articles in Wikidata:
> I fear we can't easily do that. If someone adds the link to a specific
> item and then another person comes and creates an article under the
> same name but for a different topic we have an issue.
>
> Is anyone interested in thinking this through together and writing up
> a plan? Once we have that we can figure out if there is someone to
> help with implementation.

As I wrote above, using the label in a clever way should be enough.
Creating a sitelink without a target article is not desirable. A translated
label should be good enough as the basis for the name of the future article.

I hope to integrate this as smoothly as possible into the
ContentTranslation workflow Some Time Soon.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Wikidata / Wikipedia integration : redlinks and items

2014-09-08 Thread Amir E. Aharoni
One other thing that I thought about it is to use it in ContentTranslation
(a.k.a CX).[1]

In ContentTranslation we have a link adaptation feature - if an article is
available in the target language, it's automatically inserted as a link to
the translation. In the current code, if the article doesn't exist in the
target, nothing is done - it remains plain text.

It could be more useful to insert a red link if Wikidata has a label in the
target language.

Furthermore, if there is no label in the target, the translation interface
could ask the translator to supply a label, and then that label could be
inserted into the translation as a red link AND committed as a label to
Wikidata.

[1] https://www.mediawiki.org/wiki/Content_translation
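
The decision logic described above could be sketched like this (an
illustration of the idea, not ContentTranslation's actual code; the sample
item and its Hebrew label are made up):

```python
def adapt_link(item, target_lang, source_title):
    """Three-way decision: blue link if the target wiki has the article,
    red link if Wikidata has a label in the target language, otherwise
    ask the translator (and later commit their answer as a label)."""
    target_wiki = target_lang + "wiki"
    sitelinks = item.get("sitelinks", {})
    labels = item.get("labels", {})
    if target_wiki in sitelinks:
        return "blue", sitelinks[target_wiki]["title"]
    if target_lang in labels:
        return "red", labels[target_lang]["value"]
    return "ask", source_title

item = {
    "sitelinks": {"enwiki": {"site": "enwiki", "title": "Bombe"}},
    "labels": {"he": {"language": "he", "value": "במבה (הצפנה)"}},  # illustrative
}
print(adapt_link(item, "he", "Bombe"))  # red link using the Hebrew label
print(adapt_link(item, "fr", "Bombe"))  # no label: ask the translator
```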


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

2014-09-06 13:51 GMT+03:00 Thomas Douillard :

> Hi all, I'm wondering about one way Wikidata could be useful to
> Wikipedias: red-link subject identification.
>
> Wikidata is good to identify subjects. Redlinks are used in Wikipedias to
> identify subjects with currently no article.
>
> I post here because I think there is something to integrate this further,
> but I don't know exactly what.
>
> A quick review about the current mechanisms we have to link items and/or
> articles and subjects together :
> * Wikidata interwikis. This works well. Links an item to articles and
> articles titles
> * articles redirect. This also works well, now we have a mechanism to link
> article redirects with items, which is cool.
> * There is currently templates like https://www.wikidata.org/wiki/Q6519884
> or https://www.wikidata.org/wiki/Q15977575 Interesting mechanisms actually
> * Wikidata items aliases : links a set of lexemes to an item
> * Special:ItemByTitle and Special:GoToLinkedPage which are great, I don't
> know how much they are used in practice though
>
> A little bit different but close
> * items redirects
>
> This seems to cover a lot of the user use cases. Yet there are a lot of
> red links in Wikipedia with actually no links to a Wikidata item.
>
>
> My feeling is that what is actually lacking in this picture is that the
> templates are a bit hackish and that a deeper integration of item numbers
> with redlinks would allow to go further and encourage users to make the
> links at an earlier stage. What about a Wikisyntax to put an item number
> into a Wikilink or a visual editor integration to suggest an entity every
> time a user wants to enter a red link?
>
> This seems like low-hanging fruit for Wikidata development and could make
> Wikidata more real to Wikipedia communities. Especially compared to doing
> this at the community level where this would require a big maintenance
> effort and community knowledge about the templates to make the link
> between the red label and the corresponding item concrete, especially if
> the visual editor makes the information available to everyone.
>
> One other solution could be to allow associating items with not-yet-existing
> articles to "reserve" them; that is, to allow redlinks in the Wikidata
> interwiki list?
>
>
>
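For illustration, the "suggest an entity for a redlink" idea could be prototyped on top of the real wbsearchentities API module. This is only a sketch: the request parameters match the real module, but the ranking heuristic and helper names are invented here, and no live call is made.

```python
def build_search_params(redlink_title, language):
    """Parameters for the real API module action=wbsearchentities,
    which searches Wikidata entities by label and alias."""
    return {
        "action": "wbsearchentities",
        "search": redlink_title,
        "language": language,
        "format": "json",
    }

def suggest_items(redlink_title, search_results):
    """Rank candidate item IDs for a redlink: exact label matches
    first, then the remaining hits in API order.
    (This ranking heuristic is invented for the sketch.)"""
    exact, rest = [], []
    for result in search_results:
        if result.get("label", "").lower() == redlink_title.lower():
            exact.append(result["id"])
        else:
            rest.append(result["id"])
    return exact + rest

# Canned response in the wbsearchentities shape; no network access.
sample = [{"id": "Q42", "label": "Douglas Adams"},
          {"id": "Q5", "label": "human"}]
print(suggest_items("douglas adams", sample))  # → ['Q42', 'Q5']
```

An editor-side integration would call the API with build_search_params() and feed the hits to suggest_items() to populate the suggestion dropdown.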
> ___
> Wikidata-l mailing list
> Wikidata-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] new features and changes

2014-08-16 Thread Amir E. Aharoni
I'd like to join Luca - there are a lot of wonderful updates here.

Ceterum censeo Vicidatam esse utenda :)


[Wikidata-l] labels of renamed articles

2014-08-14 Thread Amir E. Aharoni
Hi,

I just changed the Russian label of the Chelsea Manning item from Bradley
to Chelsea. The Russian Wikipedia article was already moved to Chelsea, but
nobody noticed that the label probably needs a change, too.

How can such things be handled better?

One thing I can think of is that after the Wikipedia article in language X
is moved, the label in language X is shown as "possibly needs update",
similarly to FUZZY in the Translate extension. There could also be a button
that says "confirm current label", for when there is no reason to change
it. Finally, there could be a page that lists all such possibly-outdated
labels.
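To make the FUZZY-style flag concrete, here is a sketch of the check such a feature might run after a page move. This is not an existing Wikibase feature; the helper and its heuristic (flag the label only when it still matches the old title and no longer matches the new one) are invented for illustration.

```python
def label_needs_review(old_title, new_title, label):
    """Flag a label as 'possibly needs update' after the linked
    article was moved from old_title to new_title."""
    def norm(s):
        return s.strip().lower()
    # Stale if the label still tracks the old title but not the new one.
    return norm(label) == norm(old_title) and norm(label) != norm(new_title)

# The Russian article moved from the Bradley title to the Chelsea title;
# a label still reading the old form would be flagged.
print(label_needs_review("Bradley Manning", "Chelsea Manning",
                         "Bradley Manning"))   # → True
print(label_needs_review("Bradley Manning", "Chelsea Manning",
                         "Chelsea Manning"))   # → False
```

A real implementation would also have to handle labels that legitimately differ from the article title, which is exactly why a "confirm current label" button would be needed.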

Of course, there may be better ideas, and maybe some of them are already
implemented and I just didn't notice it.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] integration of VisualEditor and Wikidata

2014-08-13 Thread Amir E. Aharoni
2014-08-13 22:52 GMT+03:00 James Forrester :

> On 13 August 2014 20:27, Bene*  wrote:
>
>> Afaik the infoboxes won't have any parameters once they use Wikidata and
>> be fully constructed using Lua. If I'm not correct I have to apologize but
>> that's the latest thing I know.
>>
>
> ​If the infobox really has no options and just builds itself entirely
> automatically, why not just move it out of the wikitext/etc. content
> entirely and display it always? This makes pages much easier to edit in
> wikitext (no KiB of {{…}} at the top, no confusing stuff at all, nothing to
> break) and simplifies a lot of things…
>

http://lists.wikimedia.org/pipermail/design/2014-August/001918.html \o/

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] {{Universal infocard}} in ruwiki, and cross-wiki coordination

2014-08-12 Thread Amir E. Aharoni
Oh, you mean the actual infoboxes! I thought that you meant using Wikidata
with infoboxes. Not using an infobox at all actually concerns me less,
because I care most strongly about translating articles, and when an infobox
doesn't exist, it doesn't get in the way of the translator :)

That said, hundreds of thousands of articles in a lot of languages do have
infoboxes: wars, cities, languages, athletes, members of parliament,
animals, music albums and so on, and something must be done about them.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2014-08-12 23:37 GMT+03:00 David Cuenca :

> In classical music biographies:
>
> https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Classical_music/Guidelines#Biographical_infoboxes
> Which was brought to the arbitration comitee:
> https://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests/Case/Infoboxes
>
> A general essay: https://en.wikipedia.org/wiki/Wikipedia:Disinfoboxes
>
> In German Wikipedia:
> https://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Infoboxen_in_Personenartikeln
>
> A rough translation of the current situation:
>
> "Again and again, person infoboxes are created in the German Wikipedia.
> They are also regularly deleted on the grounds that "DE:WP wants no person
> infoboxes". Since such a statement cannot be clearly derived from the many
> discussions on the topic, and there is even evidence of a contrary opinion
> in the community, the community is hereby asked whether infoboxes are
> wanted in person articles or not.
>
> Currently in the German Wikipedia  there are infoboxes for athletes and a
> few other groups of people."
>
> That RFC was closed due to a lack of votes to start it. I don't know if
> the arrival of Wikidata has changed the perspective during the last 2 years.
>
> Cheers,
> Micru
>
>
>
> On Tue, Aug 12, 2014 at 9:51 PM, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>> 2014-08-12 22:48 GMT+03:00 Andy Mabbett :
>>
>>> On 12 August 2014 14:00, Amir E. Aharoni 
>>> wrote:
>>>
>>> > What does it entail?
>>>
>>> In part, resolving the vehement opposition to infoboxes in parts of
>>> the English Wikipedia, and the decision not to use them for
>>> biographies on the German Wikipedia.
>>>
>>
>> Thanks, this is a useful answer.
>>
>> Where is it written in German and English?
>>
>> I'd like to see their reasoning.
>>
>> --
>> Amir
>>
>>
>>
>
>
> --
> Etiamsi omnes, ego non
>
>
>


Re: [Wikidata-l] {{Universal infocard}} in ruwiki, and cross-wiki coordination

2014-08-12 Thread Amir E. Aharoni
2014-08-12 22:48 GMT+03:00 Andy Mabbett :

> On 12 August 2014 14:00, Amir E. Aharoni 
> wrote:
>
> > What does it entail?
>
> In part, resolving the vehement opposition to infoboxes in parts of
> the English Wikipedia, and the decision not to use them for
> biographies on the German Wikipedia.
>

Thanks, this is a useful answer.

Where is it written in German and English?

I'd like to see their reasoning.

--
Amir


[Wikidata-l] {{Universal infocard}} in ruwiki, and cross-wiki coordination

2014-08-12 Thread Amir E. Aharoni
Thanks to David's comment earlier today about editing, I found this page
in the Russian Wikipedia:
https://ru.wikipedia.org/wiki/Module:Universal_infocard

It's a Lua module that shows a person infobox, pulling the data from
Wikidata and without giving any parameters in the wiki source code.

This is essentially the fulfillment of Wikidata's promise to make editing
articles with infoboxes easier, and it's wonderful. There are more things
to do, but I already want to thank everyone involved.

But now the question of cross-wiki synchronization arises. Wikidata.org is
cross-wiki by definition. Templates such as {{Universal infocard}} should
be cross-wiki as well.

Why? To make translation easier. The article about the Slovak poet Bohuslav
Tablic is available in Russian, but not in English. I'd love to translate
it to English, but I'll have to use {{Infobox writer}} and fill it manually
with data. This is doable, but it would be far more efficient to pull the
data from Wikidata. This will become even more acute when ContentTranslation
arrives, and that should happen Some Time Soon. Millions of such
articles could be translated, and using Wikidata well will save the
translators millions of minutes.

I am not saying that all projects should have the same templates. For
example, I am not concerned with the visual design of the templates - this
is up to the communities and the designers. But the way in which the data
is used should be synchronized.

What does it entail?
Synchronizing the code of the Lua modules? Can these, maybe, be made into a
Lua library that is maintained in MediaWiki source, rather than as on-wiki
modules?
Major cross-wiki collaboration in functional specification of data to be
used in infoboxes?

What else?
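Whatever form a shared library takes (an on-wiki Lua module or code maintained in MediaWiki itself), the data-access part it would synchronize is small. Here is a Python sketch of that extraction logic, assuming the entity JSON shape returned by action=wbgetentities; P1082 ("population") is a real property, but the helper name and the sample figures are mine.

```python
def claim_values(entity, property_id):
    """Extract the mainsnak values for one property from an entity
    dict shaped like the output of action=wbgetentities."""
    values = []
    for claim in entity.get("claims", {}).get(property_id, []):
        snak = claim.get("mainsnak", {})
        if snak.get("snaktype") == "value":
            values.append(snak["datavalue"]["value"])
    return values

# Canned fragment in the wbgetentities shape (P1082 = population).
entity = {
    "claims": {
        "P1082": [
            {"mainsnak": {"snaktype": "value",
                          "datavalue": {"value": {"amount": "+417389"}}}}
        ]
    }
}
print(claim_values(entity, "P1082"))  # → [{'amount': '+417389'}]
print(claim_values(entity, "P6"))     # → []
```

If every wiki's infobox module reduced to calls like this plus local formatting, the functional part could be synchronized while the visual design stayed per-community.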


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] integration of VisualEditor and Wikidata

2014-08-12 Thread Amir E. Aharoni
2014-08-12 13:33 GMT+03:00 Andy Mabbett :
>
> On 12 August 2014 08:53, Amir E. Aharoni 
wrote:
>
> > My Dream scenario is that the VE understands that the
> > data is pulled from Wikidata and shows a dialog that
> > is similar to the current template parameters. I see the
> > old mayor's name in that dialog, I write the new mayor's
> > name, and the new value is stored in Wikidata.
>
> A better scenario is that the editing wizard asks you for the date
> when the old mayor left office, and why (end of term, death,
> impeachment...) and likewise the date the new mayor took up office, if
> different. Then prompts you for your source(s).

Of course. I just have to admit that it's more complex to implement. (Maybe
it can become a workflow in the future version of the Flow extension, and
maybe I'm imagining too much.)


[Wikidata-l] integration of VisualEditor and Wikidata

2014-08-12 Thread Amir E. Aharoni
Hi,

There are plenty of ways in which Wikidata and VisualEditor could be
integrated. Let me ask about the following one:

Given: Let's imagine that the venerable {{Infobox settlement}} template is
fully adapted for pulling the data from Wikidata. All the relevant data
about the city of Bratislava is entered into its item page on wikidata.org,
and in the source of the article [[Bratislava]] you only need to say
{{Infobox settlement}} without any parameters.

The people of Bratislava elect a new mayor, and I want to write it in the
article. I come to the article [[Bratislava]] and press edit. I click the
infobox.

What happens?

As far as I know, no work has been done in this area; PLEASE CORRECT ME if
I'm wrong.

My Super-Dream scenario is to be able to edit the mayor's name right there
in the infobox, but I realize that it might be complicated.

My Dream scenario is that the VE understands that the data is pulled from
Wikidata and shows a dialog that is similar to the current template
parameters. I see the old mayor's name in that dialog, I write the new
mayor's name, and the new value is stored in Wikidata. Of course, it must
be taken into account that the name is likely not just a string, but a
label of the Wikidata item.
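In API terms, the dialog's save step might boil down to one wbcreateclaim request. This sketch only builds the request parameters; wbcreateclaim and P6 ("head of government") are real, but the helper name is mine, and token handling and the actual POST are omitted. Q1780 is assumed here to be the Bratislava item.

```python
import json

def build_mayor_update(item_id, new_mayor_item, edit_token):
    """Parameters for action=wbcreateclaim, setting P6
    (head of government) to a new item value."""
    return {
        "action": "wbcreateclaim",
        "entity": item_id,
        "property": "P6",
        "snaktype": "value",
        # Item-type values are passed as a small JSON object.
        "value": json.dumps({
            "entity-type": "item",
            "numeric-id": int(new_mayor_item.lstrip("Q")),
        }),
        "token": edit_token,
        "format": "json",
    }

params = build_mayor_update("Q1780", "Q12345", "dummy-token")
print(params["property"])  # → P6
```

The hard part the dialog would hide is the last step mentioned above: resolving the typed name to an item ID (the Q12345 here) rather than storing a plain string.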

My acceptable-but-suboptimal scenario is taking the user to
https://www.wikidata.org/wiki/Q6850543 . This is probably a useful workflow
for the tech-savvy editors, but it's suboptimal for a casual editor. I'll
go as far as saying that for a casual editor it may be (relatively) more
comfortable to edit parameters in a MediaWiki template ("|mayor =[[Milan
Ftáčnik]]") than to go to https://www.wikidata.org/wiki/Q6850543 and find
the value.

Does anybody have any more ideas about it? Am I late to the party and this
has already been discussed and designed and I missed it? Please enlighten
me :)

Thanks!

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] [Translators-l] Wikidata weekly newsletter translation

2014-06-22 Thread Amir E. Aharoni
> The only concern is that Wikidata is a single wiki and
> not all users on a given wiki will be concerned (except for adding
> interwiki links, but technically all is done on this subject).
> The rest is about data that will be used by tricky templates
> and whose readers would be more technical users than usual.

It's a single wiki used by a lot of other wikis. Quite like Commons, and
news about tech updates on Commons do make it to Tech News.

Wikidata is severely underused by templates, even though templates are
supposed to be Wikidata's biggest killer application, so more PR about it
really won't hurt. Examples of successful use of Wikidata in templates in
some projects, even without tricky technical details, can go a long way to
furthering Wikidata adoption.


Re: [Wikidata-l] multilingual terminology properties in Wikidata

2014-05-09 Thread Amir E. Aharoni
The question is not about sophisticated linguistic data. It's about a
simple link to a source that supports a term.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2014-05-09 16:23 GMT+02:00 Gerard Meijssen :

> Hoi,
> The integration of lexical content is not planned for some time yet. This
> is very much an issue that is lexical / lexicographic in nature.
> Thanks,
>   GerardM
>
>
> On 9 May 2014 12:16, Amir E. Aharoni  wrote:
>
>> Hi,
>>
>> I am at the Multilingual Web Workshop in Madrid. I had a discussion
>> here with a person who specializes in multilingual terminology translation
>> about how Wikipedia and its sister sites can be more useful and reliable
>> for people who search for translations of terms from different professional
>> fields - medicine, communications, law, etc.
>>
>> For example, if you go to the Wikipedia article [[Aorta]], how can you
>> know that this term is actually recognized as the English term by any
>> professional medical associations? And if you go to
>> https://www.wikidata.org/wiki/Q101004 , how can you know the same things
>> about each of the translations of this term? For example, how do you know
>> that "Srdcovnica" is recognized as a Slovak term by any medical association
>> or linguistic committee?
>>
>> By itself, the interlanguage link to Slovak is not reliable. A translator
>> to Slovak can, of course, go to a website of a relevant linguistic
>> committee and check the term there. But can it be more direct and
>> machine-readable?
>>
>> A property could probably be created, which would hold an id of a term in
>> such a terminology database, but would it be appropriate to include it in
>> an item page, given that such information is language-specific? It seems
>> reasonable to me, but I wanted to make sure that everybody finds it
>> acceptable.
>>
>> And if there are such properties already, I'd love an example :)
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>>
>>
>
>
>


[Wikidata-l] multilingual terminology properties in Wikidata

2014-05-09 Thread Amir E. Aharoni
Hi,

I am at the Multilingual Web Workshop in Madrid. I had a discussion here
with a person who specializes in multilingual terminology translation about
how Wikipedia and its sister sites can be more useful and reliable for
people who search for translations of terms from different professional
fields - medicine, communications, law, etc.

For example, if you go to the Wikipedia article [[Aorta]], how can you know
that this term is actually recognized as the English term by any
professional medical associations? And if you go to
https://www.wikidata.org/wiki/Q101004 , how can you know the same things
about each of the translations of this term? For example, how do you know
that "Srdcovnica" is recognized as a Slovak term by any medical association
or linguistic committee?

By itself, the interlanguage link to Slovak is not reliable. A translator
to Slovak can, of course, go to a website of a relevant linguistic
committee and check the term there. But can it be more direct and
machine-readable?

A property could probably be created, which would hold an id of a term in
such a terminology database, but would it be appropriate to include it in
an item page, given that such information is language-specific? It seems
reasonable to me, but I wanted to make sure that everybody finds it
acceptable.

And if there are such properties already, I'd love an example :)
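Since no such property existed at the time, here is only a sketch of how a language-specific terminology claim could be read; "P9999" and the "SK-MED-0042" identifier are made-up placeholders for a hypothetical terminology-database-ID property, and the claim is assumed to carry a monolingual-text value.

```python
def term_ids_for_language(entity, prop, language):
    """Collect terminology-database IDs asserted for one language,
    assuming each claim carries a monolingual-text value
    ({"language": ..., "text": ...}), as in wbgetentities output."""
    ids = []
    for claim in entity.get("claims", {}).get(prop, []):
        value = claim["mainsnak"]["datavalue"]["value"]
        if value.get("language") == language:
            ids.append(value["text"])
    return ids

# Hypothetical data: a Slovak medical-terminology ID on the aorta item.
entity = {"claims": {"P9999": [
    {"mainsnak": {"datavalue": {"value": {"language": "sk",
                                          "text": "SK-MED-0042"}}}},
]}}
print(term_ids_for_language(entity, "P9999", "sk"))  # → ['SK-MED-0042']
```

A translator's tool could then verify a label against the external database machine-readably instead of by manual lookup.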

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] weekly summary #94

2014-01-27 Thread Amir E. Aharoni
I want an RSS feed, and the current RSS feed is pretty awful, because it
shows a diff of wiki syntax.

These updates look a lot like blog posts, so they should be blog posts.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2014-01-27 Lydia Pintscher 

> On Mon, Jan 27, 2014 at 6:35 PM, John Lewis 
> wrote:
> > I do forget this quite a bit and Lydia unfortunately has to always do it
> for
> > me so if you could Bene, it'll be appreciated. Though it depends on
> Lydia's
> > thoughts on the matter.
>
> Automatically posting it here? It'd seem a bit impersonal. But I won't
> object. Automatic blog: unless there is more demand for it, I'd say
> no. If someone wants an RSS feed they can for example use this one:
>
> https://meta.wikimedia.org/w/index.php?title=Wikidata/Newsletter/Archive&feed=atom&action=history
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>
>


Re: [Wikidata-l] weekly summary #94

2014-01-25 Thread Amir E. Aharoni
Posting it to a blog would save you the trouble of manually announcing it on
the mailing list, and it would be easier to check in general. Regularly
checking a blog is also easier than checking a wiki page.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2014/1/25 John Lewis 

> The point of posting here is purely to inform people who don't check
> MetaWiki or aren't subscribed to the talk page notifications.
>
> John
>
>
> On Saturday, 25 January 2014, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>> Because that's precisely what blogs were invented for. Mailing lists are
>> for discussions and announcements. Updates like this one are more
>> comfortable to read in an RSS reader.
>>
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>>
>> 2014/1/25 Sven Manguard 
>>
>>> Why? I don't see a benefit to that.
>>>
>>> Sven
>>> On Jan 25, 2014 10:38 AM, "Amir E. Aharoni" <
>>> amir.ahar...@mail.huji.ac.il> wrote:
>>>
>>>> Hi Lydia,
>>>>
>>>> These updates are a lot like a blog. Can it be a real blog? WordPress
>>>> should be fairly easy to set up :)
>>>>
>>>>
>>>> --
>>>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>>>> http://aharoni.wordpress.com
>>>> ‪“We're living in pieces,
>>>> I want to live in peace.” – T. Moore‬
>>>>
>>>>
>>>> 2014/1/25 Lydia Pintscher 
>>>>
>>>>> Hey folks :)
>>>>>
>>>>> Here's what's been going on around Wikidata this week:
>>>>> https://meta.wikimedia.org/wiki/Wikidata/Status_updates/2014_01_24
>>>>>
>>>>>
>>>>> Cheers
>>>>> Lydia
>>>>>
>>>>> --
>>>>> Lydia Pintscher - http://about.me/lydia.pintscher
>>>>> Product Manager for Wikidata
>>>>>
>>>>> Wikimedia Deutschland e.V.
>>>>> Tempelhofer Ufer 23-24
>>>>> 10963 Berlin
>>>>> www.wikimedia.de
>>>>>
>>>>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>>>>>
>>>>> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
>>>>> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
>>>>> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>
> --
>
>
>
>


Re: [Wikidata-l] weekly summary #94

2014-01-25 Thread Amir E. Aharoni
I was not talking about anything like The Signpost. The Signpost is not a
blog, but a big collection of articles presented as a news site.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2014/1/25 John Lewis 

> While I do want to expand the updates and have spoken to Lydia about how
> we can, a full blown thing similar to the Signpost is not really on the
> board. The occasional longer piece of writing regarding the community we
> discussed but have not really tasked not further.
>
> John
>
>
> On Saturday, 25 January 2014, Sven Manguard 
> wrote:
>
>> Why? I don't see a benefit to that.
>>
>> Sven
>> On Jan 25, 2014 10:38 AM, "Amir E. Aharoni" 
>> wrote:
>>
>>> Hi Lydia,
>>>
>>> These updates are a lot like a blog. Can it be a real blog? WordPress
>>> should be fairly easy to set up :)
>>>
>>>
>>> --
>>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>>> http://aharoni.wordpress.com
>>> ‪“We're living in pieces,
>>> I want to live in peace.” – T. Moore‬
>>>
>>>
>>> 2014/1/25 Lydia Pintscher 
>>>
>>>> Hey folks :)
>>>>
>>>> Here's what's been going on around Wikidata this week:
>>>> https://meta.wikimedia.org/wiki/Wikidata/Status_updates/2014_01_24
>>>>
>>>>
>>>> Cheers
>>>> Lydia
>>>>
>>>> --
>>>> Lydia Pintscher - http://about.me/lydia.pintscher
>>>> Product Manager for Wikidata
>>>>
>>>> Wikimedia Deutschland e.V.
>>>> Tempelhofer Ufer 23-24
>>>> 10963 Berlin
>>>> www.wikimedia.de
>>>>
>>>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>>>>
>>>> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
>>>> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
>>>> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>>>>
>>>>
>>>
>>>
>>>
>>>
>
> --
>
>
>
>


Re: [Wikidata-l] weekly summary #94

2014-01-25 Thread Amir E. Aharoni
Because that's precisely what blogs were invented for. Mailing lists are
for discussions and announcements. Updates like this one are more
comfortable to read in an RSS reader.


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2014/1/25 Sven Manguard 

> Why? I don't see a benefit to that.
>
> Sven
> On Jan 25, 2014 10:38 AM, "Amir E. Aharoni" 
> wrote:
>
>> Hi Lydia,
>>
>> These updates are a lot like a blog. Can it be a real blog? WordPress
>> should be fairly easy to set up :)
>>
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>>
>> 2014/1/25 Lydia Pintscher 
>>
>>> Hey folks :)
>>>
>>> Here's what's been going on around Wikidata this week:
>>> https://meta.wikimedia.org/wiki/Wikidata/Status_updates/2014_01_24
>>>
>>>
>>> Cheers
>>> Lydia
>>>
>>> --
>>> Lydia Pintscher - http://about.me/lydia.pintscher
>>> Product Manager for Wikidata
>>>
>>> Wikimedia Deutschland e.V.
>>> Tempelhofer Ufer 23-24
>>> 10963 Berlin
>>> www.wikimedia.de
>>>
>>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>>>
>>> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
>>> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
>>> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>>>
>>>
>>
>>
>>
>>
>
>


Re: [Wikidata-l] weekly summary #94

2014-01-25 Thread Amir E. Aharoni
Hi Lydia,

These updates are a lot like a blog. Can it be a real blog? WordPress
should be fairly easy to set up :)


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2014/1/25 Lydia Pintscher 

> Hey folks :)
>
> Here's what's been going on around Wikidata this week:
> https://meta.wikimedia.org/wiki/Wikidata/Status_updates/2014_01_24
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>
>


Re: [Wikidata-l] A short hello

2013-07-16 Thread Amir E. Aharoni
2013/7/16 Saskia Warzecha :
> Hi,
>
> I'm Saskia and I wanted to introduce myself. I started yesterday as an
> intern at Wikidata in Berlin.
>
> I am currently finishing my studies in Computational Linguistics (B.Sc.) at
> the University of Potsdam and will commence a M.Sc. in Vienna, Austria, this
> fall.
>
> My task at Wikidata is to analyze the proposals for Wiktionary in Wikidata
> [0],[1], to compare it with similar work (OmegaWiki, WordNet, etc.), and to
> help in finding the best solution for Wiktionary.

Oh yes! Welcome and good luck.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] phase 1 live on the English Wikipedia

2013-02-14 Thread Amir E. Aharoni
I prefer to make the API call rather than checking against a list of languages.

Several reasons:
1. It's more robust in general.
2. Wikibase extension, as well as pywikipedia, can be used on other wikis, too.
3. There are dark corners in Wikimedia wikis: non-standard codes,
redirects, locked wikis, non-language wikis (Commons, Outreach, etc.).
It's possible that these are out of the bots' scope, but checking the
API will still make it more robust.
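A sketch of that per-wiki check, assuming the response shapes seen at the time: a wiki without the Wikibase client rejects meta=wikibase with an error or an "unrecognized meta" warning, while a wiki with it returns a query.wikibase block. The function only interprets an already-parsed response, so the live HTTP call is left out.

```python
def has_wikibase_client(api_response):
    """Decide from a parsed api.php?action=query&meta=wikibase
    response whether the wiki runs the Wikibase client.
    (Response shapes are assumptions, not an exhaustive spec.)"""
    if "error" in api_response:
        return False
    # Older servers answer with a warning instead of an error.
    if "wikibase" in str(api_response.get("warnings", {}).get("query", {})):
        return False
    return "wikibase" in api_response.get("query", {})

print(has_wikibase_client({"query": {"wikibase": {"repo": {}}}}))  # → True
print(has_wikibase_client({"error": {"code": "unknown_meta"}}))    # → False
```

As suggested in the thread above, a bot could cache this answer per wiki and re-issue the query every half hour or so.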

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2013/2/14 Amir Ladsgroup :
> That's a very interesting idea. When I check this:
> http://fa.wikipedia.org/w/api.php?action=query&meta=wikibase
> we can make a call for a bot to check every wiki. For now, I think there
> are very few wikis updated on Wikidata, so I don't think it's a very good
> idea right now, but for the future it's the only option, so I'll work on
> it, but not now.
>
> Best
>
>
> On Thu, Feb 14, 2013 at 9:44 AM, Yuri Astrakhan 
> wrote:
>>
>> We really ought to change it to dynamic:
>>
>> https://en.wikipedia.org/w/api.php?action=query&meta=wikibase
>> (thanks duh)
>>
>> If there is no error, wikibase is present, disable. This way no update to
>> the blacklist is needed. The query should be re-issued every 30 min to make
>> sure it hasn't changed.
>>
>> The sad part is that the interwiki bot needs to be significantly
>> reworked, so there is no big incentive to make changes like this to it. It
>> should be aware of Wikidata, remove local interwiki links only when they
>> are present in Wikidata, etc. Lots of heuristic changes.
>>
>> On Thu, Feb 14, 2013 at 12:34 AM, Amir Ladsgroup 
>> wrote:
>>>
>>> I did this:
>>> https://www.mediawiki.org/wiki/Special:Code/pywikipedia/11073
>>> so updated bots are not a concern anymore :)
>>> BTW:congrats.
>>>
>>>
>>> On Thu, Feb 14, 2013 at 1:11 AM, Katie Chan  wrote:

 On 13/02/2013 21:31, Denny Vrandečić wrote:
>
> You have examples of that? Did not happen to my edits (so far).


 Just once so far -
 


 --
 Experience is a good school but the fees are high.
 - Heinrich Heine

>>>
>>>
>>>
>>>
>>> --
>>> Amir
>>>
>>>
>>>
>>
>>
>>
>
>
>
> --
> Amir
>
>
>



Re: [Wikidata-l] getting some stats for the Hungarian Wikipedia

2013-01-29 Thread Amir E. Aharoni
2013/1/29 Samat :
> On Tue, Jan 29, 2013 at 7:54 PM, Lydia Pintscher
>  wrote:
>>
>> On Tue, Jan 29, 2013 at 7:51 PM, Samat  wrote:
>> > I agree with you.
>> > I am also waiting for "somebody", who can change pywiki compatible with
>> > wikidata. I have no time and knowledge for it, but I have a bot (at
>> > least on
>> > huwiki, not on wikidata) and I have access to the Hungarian Toolserver,
>> > so I
>> > could tun this bot for cleaning the wikicode on huwiki and update the
>> > interwiki links on wikidata. But we need a/the "Somebody" first :)
>>
>> Have you looked at the link I posted? What exactly is missing for you
>> to do what you want to do?
>>
>>
>> Cheers
>> Lydia
>
>
> Yes, I have.
> I mean that interwiki.py should do at least the following:
> * delete interwikis from every article where there is no conflict;
> * add these interwikis to the relevant page on Wikidata (create this page if
> it doesn't exist yet, change the page if it already exists).
> As I know, the Hungarian editors are doing these tasks manually now.
> If there is (are) conflict(s) between interwiki links, it can be the next
> step.

Well, actually, I wouldn't think that it is immediately urgent. I
completely understand that this should be done some time soon -
probably in a couple of weeks from now. But it may be a good idea not
to use a bot to immediately remove the links from all the
(non-conflicting) articles until the post-deployment dust settles.

And until the Big Links Remove, if the bots don't re-add the removed
links by force, that should be enough.
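The two-step bot task described above (move non-conflicting interlanguage links to the repo, then strip them from the wikitext) can be sketched like this. It is a minimal illustration, not pywikipedia code; the actual repo update is left to a hypothetical client and only the wikitext handling is shown:

```python
import re

# Matches plain interlanguage links such as [[de:Berlin]].
INTERWIKI_RE = re.compile(r"\[\[([a-z-]{2,12}):([^\]|]+)\]\]")

def extract_interwikis(wikitext):
    """Return {lang: title} for every plain interlanguage link."""
    return {lang: title.strip() for lang, title in INTERWIKI_RE.findall(wikitext)}

def migrate(wikitext, repo_sitelinks):
    """Return (new_wikitext, links_to_add_to_repo).

    repo_sitelinks is the {lang: title} mapping already stored on the
    repo item. On any conflict the article is left untouched, matching
    the "no conflicts" condition above.
    """
    local = extract_interwikis(wikitext)
    conflicts = {lang for lang in local
                 if lang in repo_sitelinks and repo_sitelinks[lang] != local[lang]}
    if conflicts:
        return wikitext, {}  # leave conflicting articles to human editors
    to_add = {lang: t for lang, t in local.items() if lang not in repo_sitelinks}
    return INTERWIKI_RE.sub("", wikitext).rstrip() + "\n", to_add
```

Conflicting links (the same language pointing at different titles locally and in the repo) fall through to the "next step" mentioned above rather than being resolved automatically.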

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



[Wikidata-l] Interwiki bots - practical questions

2013-01-29 Thread Amir E. Aharoni
Spin off from the "Phase 1" thread.

2013/1/29 Magnus Manske :
> Why not just block the bots on wikis that use wikidata?

This looks like the right thing to me, but I don't want to be too rude
to the bot operators and I do want the bots to keep doing useful
things.

Imagine the scenario:
* Wikidata Client is deployed to the Hebrew Wikipedia.
* I remove interlanguage links from the Hebrew Wikipedia article
[[ASCII]], an item for which is available in the Wikidata Repo (
https://www.wikidata.org/wiki/Q8815 ).
** The article is supposed to show the links brought from Wikidata now.
* After some time User:LovelyBot adds the links back.
* I block User:LovelyBot.

Now what do I say to User:Lovely?

A: Stop changing interlanguage links on the Hebrew Wikipedia. We have
Wikidata now.
B: Update your pywikipedia bot configuration (or version). We have
Wikidata now, and your bot must not touch articles that get the
interlanguage links from the Wikidata repo.

I prefer option B, but can pywikipediabot indeed identify that the
links in the article are coming from Wikidata? And are there interwiki
bots that are not using the pywikipediabot infrastructure?
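One way a bot could implement option B is to ask the repo whether an item is already linked to the page before touching it. The wbgetentities parameters below follow the Wikibase API; the surrounding bot policy is my own sketch, not existing pywikipediabot behaviour:

```python
import json
import urllib.parse
import urllib.request

API = "https://www.wikidata.org/w/api.php"

def repo_has_item(entities):
    """entities: the "entities" mapping of a wbgetentities response.

    Pages with no item come back as a pseudo-entity carrying a
    "missing" key, so their presence means the repo does NOT know the page.
    """
    return bool(entities) and not any("missing" in e for e in entities.values())

def page_is_wikidata_managed(site, title):
    """Query the live repo (network access required)."""
    params = urllib.parse.urlencode({
        "action": "wbgetentities",
        "sites": site,      # e.g. "hewiki"
        "titles": title,    # e.g. "ASCII"
        "props": "sitelinks",
        "format": "json",
    })
    with urllib.request.urlopen(f"{API}?{params}") as resp:
        return repo_has_item(json.load(resp).get("entities", {}))
```

A well-behaved interwiki bot would simply skip any page for which `page_is_wikidata_managed` returns True.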

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] Wikidata-l Digest, Vol 14, Issue 9

2013-01-10 Thread Amir E. Aharoni
2013/1/10 Nicholas Michael Bashour :
> Is there a way to make the names of languages appear in the language of the
> wiki on which they are displayed? For example, the language links now are in
> whatever that language is called in that specific language, but in the
> future, would it be possible, say, on the English Wikipedia to have all
> language links say the name of the language in English, and on the Hungarian
> page they would all be in Hungarian, etc?

There's a gadget that does it in the Portuguese Wikipedia. You could
check how many people actually use it; I suspect not many do.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] deployed new code

2012-12-10 Thread Amir E. Aharoni
Yay, all my right-to-left fixes are live :)

Thank you!

2012/12/10 Lydia Pintscher :
> Heya :)
>
> Just a quick note that we deployed new code to wikidata.org. All
> changes can be found at
> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki%2Fextensions%2FWikibase.git;a=shortlog;h=refs%2Fheads%2Fmw1.21-wmf6
> We're still working on getting test2.wikipedia.org to properly work
> with Wikidata.
> Please let me know of any problems you see that might be related to the 
> update.
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Community Communications for Wikidata
>
> Wikimedia Deutschland e.V.
> Obentrautstr. 72
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>



[Wikidata-l] Wikidata:Glossary questions

2012-12-02 Thread Amir E. Aharoni
Hi,

There are several unanswered questions at
https://www.wikidata.org/wiki/Wikidata_talk:Glossary . Can any developer
please answer them or just edit the Glossary to make the answers obvious?

Thanks!

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


[Wikidata-l] test client system that works with the live wikidata.org

2012-12-02 Thread Amir E. Aharoni
Hi,

Is there a test client system that works with the live wikidata.org? I'd
love to test it as early as possible.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


[Wikidata-l] deployment of fixes to RTL bugs

2012-12-02 Thread Amir E. Aharoni
Hello,

There are a few RTL bugs that were already fixed in the Wikibase code a
while ago, but don't seem to be deployed yet. In particular, these two are
quite disruptive:
* https://bugzilla.wikimedia.org/show_bug.cgi?id=41005
* https://bugzilla.wikimedia.org/show_bug.cgi?id=40247

What is the deployment schedule for wikidata.org?

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


Re: [Wikidata-l] weekly summary #30

2012-11-04 Thread Amir E. Aharoni
This would probably be a good place to note that Wikidata is the big
testing ground for the Universal Language Selector extension and for
the new generation of WebFonts and IME (Narayam) extensions (more
precisely, the first after translatewiki.net). There are still some
issues, and Santhosh was working very hard in the last few days on
resolving them.

We thank the Wikidata team for giving this a chance and for all their
help, and we thank all the early testers for the patience and the bug
reports!

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2012/11/2 Lydia Pintscher :
> Heya folks :)
>
> What a week! Here's your summary of what happened over the last week
> around Wikidata.
> You can find the wiki version at
> http://meta.wikimedia.org/wiki/Wikidata/Status_updates/2012_11_02
>
>
> = Development =
> * Launched wikidata.org \o/
> * Updated http://meta.wikimedia.org/wiki/Wikidata/Notes/Change_propagation
> (We need feedback on the data flow from Wikidata to the Wikipedias
> http://lists.wikimedia.org/pipermail/wikitech-l/2012-November/064196.html)
> * Updated the demo system: http://wikidata-test.wikimedia.de
> * Tagged a 0.1 release
> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/Wikibase.git;a=tags
> * Added QUnit tests for DataValues
> * Worked on Api.js JS refactoring
> * Fix content handler and other related core bugs
> * Generalized Autocomment
> * Changed name of wbsetitem to wbeditentity
> * First implementation of wbsearchentities
> * More Puppet scripts and Vagrant exploration
> * All of the API should now handle prefixed IDs
> * Implemented templating system
> * Browser code and server code both starts to use templates
> * Jeroen and Daniel were added as primary authors of MediaWiki core
> for their work on the Content Handler
>
>
> See http://meta.wikimedia.org/wiki/Wikidata/Development/Current_sprint
> for what we’re working on next.
>
> You can follow our commits at
> https://gerrit.wikimedia.org/r/#/q/(status:open+project:mediawiki/extensions/Wikibase)+OR+(status:merged+project:mediawiki/extensions/Wikibase),n,z
> and view the subset awaiting review at
> https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/Wikibase,n,z
>
> You can see all open bugs related to Wikidata at
> https://bugzilla.wikimedia.org/buglist.cgi?emailcc1=1&list_id=151540&resolution=---&emailtype1=exact&emailassigned_to1=1&query_format=advanced&email1=wikidata-bugs%40lists.wikimedia.org
>
> = Discussions/Press =
> * “wikidata.org is live” got quite some responses in the press. Some
> examples: 
> http://www.spiegel.de/netzwelt/web/wikidata-oeffnet-als-zentrale-datenbank-fuer-wikipedia-a-864649.html
> and 
> http://www.heise.de/newsticker/meldung/Wikidata-Daten-Fundus-fuer-Wikipedia-eroeffnet-1740780.html
>
> = Events =
> see http://meta.wikimedia.org/wiki/Wikidata/Events
>
> * SMWCon
> * WMF metrics and activities meeting
> * upcoming: office hours on IRC
> * upcoming: Wikimedia Conferentie
> * upcoming: ISWC
> * upcoming: talk at Bergman Center
> * upcoming: Wikidata intro and Q&A in Vienna
> * We’re looking for partners for a mass collaboration assembly at 29C3
>
> = Other Noteworthy Stuff =
> * First 1000 items: http://pastebin.com/Gahpgekp (html:
> http://pastebin.com/5L6N2gZq)
> * http://www.wikidata.org/wiki/Category:Task_force
> * http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions
> * http://www.wikidata.org/wiki/Wikidata:Project_chat#SlurpInterwiki_script
> * Translate extension enabled on wikidata.org for easier translation
> of help and similar pages
>
> = Open Tasks for You =
> * Give feedback on
> http://meta.wikimedia.org/wiki/Wikidata/Notes/Change_propagation
> * Hack on one of
> https://bugzilla.wikimedia.org/buglist.cgi?keywords=need-volunteer%2C%20&keywords_type=allwords&emailcc1=1&list_id=151541&resolution=---&emailtype1=exact&emailassigned_to1=1&query_format=advanced&email1=wikidata-bugs%40lists.wikimedia.org
>
> Anything to add? Please share! :)
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Community Communications for Wikidata
>
>



Re: [Wikidata-l] wikidata.org is live (with some caveats)

2012-10-30 Thread Amir E. Aharoni
2012/10/30 emijrp :
> Cool, nice work.
>
> SUL is not enabled?

It is, we just discussed it on IRC :)

Log out, then log in again to some other existing project (like
https://ca.wikisource.org ) and then to https://www.wikidata.org , and
it should work.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] update from the Hebrew Wikipedia

2012-10-12 Thread Amir E. Aharoni
2012/10/12 Lydia Pintscher :
> Ok will add it to
> http://meta.wikimedia.org/wiki/Wikidata/Deployment_Questions as soon
> as I can. Anyone want to link that page from the FAQ?

I linked it.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] update from the Hebrew Wikipedia

2012-10-12 Thread Amir E. Aharoni
2012/10/12 Denny Vrandečić :
> 2012/10/12 Lydia Pintscher :
>> On Fri, Oct 12, 2012 at 4:45 PM, Amir E. Aharoni
>>  wrote:
>>> 5. Somebody complained that it's too easy to remove a link from a repo
>>> - clicking the "remove" link is enough. I mentioned it in a bug
>>> report:
>>> https://bugzilla.wikimedia.org/show_bug.cgi?id=40200
>>
>> Thanks for filing. Not sure what the plan there is.
>
> It is very high on our priority list. We expect to tackle this in this month.

Thank you!

>>> 6. And this is probably the biggest issue: The workflow for adding an
>>> interlanguage link is cumbersome and in some cases the interface
>>> elements are undiscoverable.
>>
>> Do you have specifics about which elements are hard to discover?
>
> I think what Amir means is that for articles that do not have a
> language link yet, it is basically impossible to add those. (I.e. the
> workflow is, go to Wikidata, create a new item, then link that back to
> your article. Not cool.)

Yep, this one.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



[Wikidata-l] update from the Hebrew Wikipedia

2012-10-12 Thread Amir E. Aharoni
Hi,

Lydia mentioned in her summary a major discussion about Wikidata in
the Hebrew Wikipedia. The discussion was in Hebrew of course, so I'll
bring a little summary of it.

Eleven people supported the installation of Wikidata. Nobody objected \o/

Despite the wide support, some issues and questions were raised:

1. How is the coordination with interwiki links bot operators progressing?
Will the bots be smart enough not to do anything to articles that are
already listed in the repository and have the correct links displayed?
Will the bots be smart enough to update the repo in the transition
period, when some Wikipedias have Wikidata and some don't?
Will the bots be smart enough not to do anything with articles that
have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?

2. What are the numbers after the Q in the titles in the repo site? -
I replied that they are just sequential identifiers without any
additional meaning. Maybe it can be added to the FAQ.

3. Several people complained about instability in the links editing
pages in the repo: They saw messages about network problems when they
tried to edit links. I experienced this a couple of times, too. I also
saw a complete crash with a "memory full" error once.

4. Somebody noticed that the testing sites don't support unified
accounts (CentralAuth). The production system will, right?

5. Somebody complained that it's too easy to remove a link from a repo
- clicking the "remove" link is enough. I mentioned it in a bug
report:
https://bugzilla.wikimedia.org/show_bug.cgi?id=40200

6. And this is probably the biggest issue: The workflow for adding an
interlanguage link is cumbersome and in some cases the interface
elements are undiscoverable.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] too easy to remove links

2012-10-05 Thread Amir E. Aharoni
... In case it wasn't clear, I was referring to removing a link to an
article in a foreign Wikipedia from an item page in the repository.


2012/10/5 Amir E. Aharoni :
> Hi,
>
> An issue brought up in the discussion about Wikidata in the Hebrew
> Wikipedia: To remove a link, you just click the "remove" link... and
> that's it. Looks too easy. No asking for confirmation or anything. Is
> it a good idea?
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬



[Wikidata-l] too easy to remove links

2012-10-05 Thread Amir E. Aharoni
Hi,

An issue brought up in the discussion about Wikidata in the Hebrew
Wikipedia: To remove a link, you just click the "remove" link... and
that's it. Looks too easy. No asking for confirmation or anything. Is
it a good idea?

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



[Wikidata-l] software version on the demo sites

2012-10-05 Thread Amir E. Aharoni
Hi,

I wanted to make sure that the English and the Hebrew demo sites have
the same version. I can see the precise Git version of MediaWiki in
Special:Version, as well as the versions of most extensions. But the
versions of the Wikibase* extensions only appear as "Version 0.2
alpha" without the Git checkout number. Would it be possible to show
the number there, to let everybody know the precise version?

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



[Wikidata-l] comparison articles

2012-09-29 Thread Amir E. Aharoni
Hi,

Geekiness warning: this email mentions software in general and free
software licenses in particular. It mentions them because it's a
useful example, but the ideas can be applied to many other domains.

Automatic generation of list articles is frequently named as one of
the main use cases of Wikidata. There is, however, a particular type
of list articles for which Wikidata may be even more useful:
Comparison articles. You can find many examples of these at
Category:Software comparisons in the English Wikipedia.

It can be used for comparing cars, athletes, political candidates and
many other things. I frequently use such articles to help myself
choose software that I will use. Here's what I do today to search for
software, and what I would love to see supported better using
Wikidata:

* I always prefer to try Free Software first. Today I usually go to
such an article, sort the table by the license column, and then start
comparing features that are important to me.
* Very often the different groups of features appear in different
tables, so I have to scroll up and down a lot.
* Very often I quickly narrow down my search to a small number of
products, but there's no easy way to remove the products that I
decided not to use from the view, so again I have to scroll up and
down a lot.

Doing the above using a query would be very useful. I would be able to
quickly eliminate rows (products) and columns (features) that don't
interest me.
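As a sketch of the kind of query I have in mind (the data below is made up; the point is the shape of the operation, not the facts), eliminating rows and columns amounts to:

```python
# Made-up comparison data; each dict is one row of a comparison table.
software = [
    {"name": "A", "license": "GPL",         "spellcheck": True},
    {"name": "B", "license": "Proprietary", "spellcheck": True},
    {"name": "C", "license": "MIT",         "spellcheck": False},
]

FREE_LICENSES = {"GPL", "MIT"}

def compare(rows, keep_features, row_filter):
    """Keep only interesting products (rows) and features (columns)."""
    return [{k: row[k] for k in keep_features}
            for row in rows if row_filter(row)]
```

For example, `compare(software, ["name", "spellcheck"], lambda r: r["license"] in FREE_LICENSES)` narrows the table to free products and the one feature column I still care about.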

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] watching Wikidata changes that affect my wiki

2012-08-14 Thread Amir E. Aharoni
2012/8/14 Denny Vrandečić :
> In general I am a strong believer of "let's start with the simple
> thing", which is to let editors add transliterations (that is why we
> have a label field for every entity in every language).
>
> I may see a use case for a transliteration-bot that does some of the
> transliterations (semi?)automatically, but I actually would think that
> this is probably something that should be left to the community.
>
> There might be some simple cases for language fallbacks (including
> transliterations) but we have not touched that development item yet.
> We have to see how this works out.
>
> But in short, I am wary of automatic systems and rather would count on
> the knowledge of the editors.
>
> I hope that makes sense,

This makes perfect sense and I agree.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] watching Wikidata changes that affect my wiki

2012-08-14 Thread Amir E. Aharoni
2012/8/14 Nikola Smolenski :
> On 14/08/12 08:57, Amir E. Aharoni wrote:
>>
>> 2012/8/14 Nikola Smolenski:
>>>
>>> I believe it should be possible to alleviate this problem to an extent by
>>> introducing automatic transcription between languages and specifying what
>>> language the mayor's "default" name is in. If automatic transcription
>>> gets
>>> it wrong, it could still be overriden when someone enters the name in
>>> another language.
>>
>>
>> It is guaranteed to be profoundly broken. The above-mentioned Hebrew
>> names will be transliterated as<'mrm mcn'>  (the apostrophes are part
>
>
> Would it? How many Hebrew names are there that are spelled "עמרם"? If the
> transliteration software knows it's a human name it can transliterate it as
> "Amram".

What you say is kinda true, but in practice it's much more
complicated. I worked for a few years in a company that makes software
that does this and I was the lead developer. There are two software
packages that do it for Hebrew, they are proprietary and very
expensive. It's not that making a Free package is impossible, but you
need a team for every language that has such problems, you need
several full-time people to maintain the words, and what's worse is
that most words have six or so possible pronunciations. Sure,
crowdsourcing in Wikidata may change that, but it's too early to talk
about this.

AFAIK the situation is even worse in Arabic, which is a much bigger
language than Hebrew.

What I'm getting at is, again, that some limited transliteration
assistance may be OK, but it must not be automatically
propagated. Naïve people may think that that's how the name is
actually written, and in such matters most people are very naïve.

--
Amir



Re: [Wikidata-l] watching Wikidata changes that affect my wiki

2012-08-13 Thread Amir E. Aharoni
2012/8/14 Nikola Smolenski :
> I believe it should be possible to alleviate this problem to an extent by
> introducing automatic transcription between languages and specifying what
> language the mayor's "default" name is in. If automatic transcription gets
> it wrong, it could still be overriden when someone enters the name in
> another language.

It is guaranteed to be profoundly broken. The above-mentioned Hebrew
names will be transliterated as <'mrm mcn'> (the apostrophes are part
of the transliteration!) and . The same problem applies to
Arabic, Punjabi and many other languages. Without manual maintenance
it will perpetuate horrendously wrong transliteration.

Some very limited auto-transliteration is OK, but just as a
suggestion. I was actually going to write an email about that. But it
must not be automatic all the way and propagate to all wikis.
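To illustrate why naive automatic transliteration breaks (the mapping below is a partial, consonant-only scheme of my own, for illustration; real academic conventions vary): Hebrew writes no vowels, so a letter-by-letter mapping cannot recover "Amram" from עמרם.

```python
# Partial, illustrative Hebrew consonant mapping; not a complete scheme.
NAIVE = {"ע": "'", "מ": "m", "ם": "m", "ר": "r", "צ": "c", "נ": "n"}

def naive_translit(hebrew):
    """Letter-by-letter transliteration with no vowel or name knowledge."""
    return "".join(NAIVE.get(ch, ch) for ch in hebrew)

# עמרם מצנע (Amram Mitzna) comes out as the unreadable 'mrm mcn'.
```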

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] thoughts about a plan for enabling interlanguage linking

2012-08-13 Thread Amir E. Aharoni
Yes.
On Aug 14, 2012 at 00:14, "Samat"  wrote:

> On Mon, Aug 13, 2012 at 5:48 PM, Snaevar  wrote:
>
>>  ...
>>
>> Bináris and Andre Engels have volunteered to make pywikipedia compatible
>> with wikidata. Bináris said on the 24th of July that he is busy, in a
>> response from Lydia to translate the Wikibase extensions into Hungarian.
>> Since no progress has been made on that on translatewiki, I am going to
>> assume that he still is busy.
>>
>
> Do you think of these [1] messages?
>
> I think, we can find volunteers for translation into Hungarian.
>
> [1]:
> https://translatewiki.net/w/i.php?title=Special%3ATranslate&taction=translate&group=ext-wikibase-repo&language=hu&limit=100&task=view
>
> Samat
>
>
>


[Wikidata-l] watching Wikidata changes that affect my wiki

2012-08-13 Thread Amir E. Aharoni
Hallo,

Preamble 1: This email probably falls under this FAQ question:
Q: How will Wikidata change the way articles are edited?
A: That’s part of what we have to figure out during the development,
together with the community.

Preamble 2: It's possible that there's an answer to this issue
already, but I couldn't find it.

A popular example of using Wikidata is that it makes maintaining
articles about cities easier: When a mayor of a city changes, it must
only be updated once.

The problem is that the mayor's name can be written differently in
other languages. I didn't actually try running it myself, but as far
as I understand, Wikidata supports translating names. But what happens
when the mayor changes? It is likely that the name will be updated in
the language spoken in that city. At that point articles in Wikipedia
in other languages will probably show the name in the language of the
city, which may be unreadable.

Let's take Haifa for an example. Its previous mayor was:
he: עמרם מצנע
en: Amram Mitzna
ru: Амрам Мицна
hr: Amram Micna
etc.

Now it changes to:
he: יונה יהב

And then suddenly all the articles about Haifa in all the languages
will show the mayor's name as "יונה יהב", which most people won't be
able to read. Maybe the Wikidata community will develop some kind of a
policy that will discourage adding names in local scripts without any
translation to a more common script. Maybe at some point software
should even show a warning if somebody tries to do it.

The scenario can be even simpler: Somebody will vandalize Wikidata and
change the mayor's name to some nonsense.

The most practical way to solve this is to show a change to any piece
of data that affects a Wikipedia article in that article's watchlist,
as if it were a change in the article itself. Is this possible? If
not, is it planned?

It's a problem with Commons, too: An image that is used in an article
can change in Commons and it won't appear in the watchlist. But I
expect that it will happen a lot more often with Wikidata items and
that the changes would be a lot more subtle and hard to notice: It's
easy to notice that an image changed, but it's harder to notice a
change in a number or a name of a mayor.

Another question is: What is the fallback mechanism if a name was not
translated? The usual MediaWiki fallback rules can be reused, but
there's a twist, because in Wikidata the usual fallback language may
be unavailable. So in this case it will probably be:

my language -> my fallback language -> English -> the language in
which it is written
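That fallback chain can be sketched as follows. The label store is a plain dict here; the real lookup and MediaWiki's fallback tables are outside the scope of the illustration:

```python
# A sketch of the fallback order described above:
# user language -> its fallback -> English -> the original language.
def resolve_label(labels, user_lang, fallback_lang, original_lang):
    """labels: {language code: label}. Return (label, language) or (None, None)."""
    for lang in (user_lang, fallback_lang, "en", original_lang):
        if lang in labels:
            return labels[lang], lang
    return None, None
```

So with only a Hebrew label present, a Russian reader would still get the Hebrew original as a last resort.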

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikidata-l] i18n, l10n and m17n in Wikidata - lang and dir

2012-08-13 Thread Amir E. Aharoni
I'm not sure about the exact number either, but in general it is true.
There is a bug about this already reported in Wikidata:
https://bugzilla.wikimedia.org/show_bug.cgi?id=36635

I can't think of any good way to fix it except fixing bug 28970 in
core MediaWiki: to allow setting the language of a page and of a
page's title separately. Currently the language of a page and of the
title is always assumed to be the same as the content language of the
wiki. The only solutions to this problem are to use {{displaytitle}}
or , but these solutions are very incomplete.
(The Translate extension has a better solution, but it only works for
translatable pages.)

This is also needed for many other things that are not related to
Wikidata, such as correct display of titles in category pages. It is
also discussed in the Visual Editor i18n requirements (disclaimer: I
wrote them):
https://www.mediawiki.org/wiki/VisualEditor/Internationalization_requirements

I wanted to start an RFC about this issue on wikitech-l for a while
now, so maybe now is a good time to finally do it.
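For the per-element labeling the quoted mail below argues for, this is roughly what the repo could emit for each label (an illustrative sketch, not the actual Wikibase templating code; the RTL set is deliberately partial):

```python
# Partial list of right-to-left language codes, for the sketch only;
# real code would use MediaWiki's language metadata.
RTL_LANGS = {"ar", "fa", "he", "ur", "yi"}

def label_html(lang, text):
    """Wrap a label so browsers know its language and direction."""
    direction = "rtl" if lang in RTL_LANGS else "ltr"
    return f'<span lang="{lang}" dir="{direction}">{text}</span>'
```

With markup like this, spell checkers, fonts, and screen readers can treat each label in its own language even when the page language is English.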

--
Amir

2012/8/13 Denny Vrandečić :
> Hi Amir,
>
> thanks for the bug report! We will go and implement it along the lines
> you suggest it.
>
> I have one question still, maybe you can help me:
>
> About 10% of the titles in the Hebrew Wikipedia are in latin alphabet
> (very rough estimate, may be completely off, based of a glance on
> Special:Allpages). So an article like
> http://he.wikipedia.org/wiki/Yesterday, where the title would be LTR,
> would be declared as RTL. Is there a way to avoid that?
>
> I guess the answer is no, but I wanted to ask.
>
> Cheers,
> Denny
>
>
>
>
> 2012/8/11 Amir E. Aharoni :
>> Hallo,
>>
>> It's my first email on this list, so in case you don't know me: I am
>> Amir, I'm from Israel, I'm a wikipedian since 2004, I write mostly in
>> Hebrew and English, I care strongly about language issues in software
>> in general and about right-to-left support in particular, and I work
>> in the WMF's localization team.
>>
>> Now, about the subject: you probably know that i18n is
>> "internationalization" and "l10n" is "localization". "m17n" is a less
>> common term, which means "multilingualization" - making software able
>> to work in many languages at once. This email is about one of the
>> easiest and the most important ways to make Wikidata support many
>> languages on one page everywhere.
>>
>> I've been testing the Wikidata demo for a few days now, with the aim
>> of getting it deployed in the Hebrew Wikipedia very soon. The first
>> thing that I noticed is that even though everybody understands that
>> Wikidata is supposed to be massively multilingual, little or no use is
>> made of the lang and dir attributes in the HTML that Wikidata
>> generates. The most immediate example is
>> http://wikidata-test-repo.wikimedia.de/wiki/Data:Q2?uselang=en
>>
>> It basically lists the word "Helium" in many languages, but as far as
>> the browser is concerned, almost all of it is written in English,
>> because the root html element says lang="en". The only exceptions
>> are the interlanguage links in the sidebar, where the lang attributes
>> are used properly, but that's a regular MediaWiki feature.
>>
>> It is very much needed to explicitly specify the lang attribute and
>> also the dir attribute (direction: "ltr" or "rtl") on every element,
>> the content language of which is known to be different from the
>> content language of the enclosing element. Many developers may think
>> that this attribute doesn't do anything, but actually it does a lot:
>> * correct text-to-speech and speech-to-text handling
>> * correct font rendering (relevant for Serbian [1], for some languages
>> of India etc.)
>> * selecting the correct spell checking dictionary
>> * selecting the right language for machine translation
>> * adjusting the line-height
>> * selecting the web font (in MediaWiki's WebFonts extension)
>> * etc.
>>
>> So please, use it whenever you can.
>>
>> Always use the dir attribute in these circumstances, too. It must be
>> specified explicitly even though "ltr" is the default, because if the
>> user interface is right-to-left, it will propagate to elements in
>> other languages, too, so you would get right-to-left English. (I consider
>> this a bug in the HTML standard... but it's a topic for a different
>> email).
>>
>> In the case of the page that I mentioned above, it should be quite
>> trivial to fix, because MediaWiki's Language class provides very easy
>> functions for this.

Re: [Wikidata-l] thoughts about a plan for enabling interlanguage linking

2012-08-12 Thread Amir E. Aharoni
2012/8/13 Snaevar:
> > For example:
> > * The interwiki bots will definitely have to be modified for the
> > Wikidata age. Did anybody start a conversation with the operators of
> > these bots?
>
> MerlIwBot is the only bot that is compatible with Wikidata.

Is there any reason not to make the others compatible, too? IIRC, they
all use the same upstream pywikipedia code. Or am I naïve?

> Lydia has started a conversation with the operators of the bots that have
> volunteered to migrate interwiki links to Wikidata.

Is it public?

> > * How will bots identify pages that use Wikidata?
>
> Is this http://meta.wikimedia.org/wiki/User:MerlIwBot/WikiData what you are
> looking for?

Well, probably. Probing Wikidata and seeing whether the page appears
there sounds reasonable - if everybody uses it, of course. What happens
if a bot doesn't behave well and does add a plain old interlanguage
link to a page that is already connected to Wikidata? Which link will
be shown to the reader - the old, the new, or both? And what to do
with the bot account - block it to make the operator notice that he
should upgrade?
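The probing idea above can be sketched roughly as follows. The wbgetentities API module with its sites/titles parameters is what Wikibase eventually shipped; the endpoint URL and the response shapes in this sketch are assumptions for illustration, not something fixed at the time of this thread.

```python
import json

# Bot-side check: ask the repository whether a given wiki page is
# already connected to an item, along the lines discussed above.
WIKIDATA_API = "https://www.wikidata.org/w/api.php"  # assumed endpoint

def build_query(site, title):
    """Return query parameters for a wbgetentities lookup."""
    return {
        "action": "wbgetentities",
        "sites": site,        # e.g. "enwiki"
        "titles": title,      # e.g. "Helium"
        "props": "sitelinks",
        "format": "json",
    }

def is_connected(response_text):
    """True if the response contains a real item for the page.

    Unconnected pages come back under a negative pseudo-id with a
    "missing" marker, so we look for at least one Q-id key.
    """
    entities = json.loads(response_text).get("entities", {})
    return any(key.startswith("Q") for key in entities)

# Canned responses illustrating the two cases a bot would see:
connected = '{"entities": {"Q560": {"id": "Q560", "sitelinks": {}}}}'
missing = '{"entities": {"-1": {"site": "enwiki", "missing": ""}}}'
```

A well-behaved bot would skip adding plain interlanguage links to any page for which `is_connected` returns true.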

--
Amir

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] thoughts about a plan for enabling interlanguage linking

2012-08-12 Thread Amir E. Aharoni
Hi,

I'm not sure that it was discussed already. If it was discussed,
please point me there. I read the Technical proposal and the
translatable pages on Meta and couldn't find it there.

Is there any concrete plan to start the migration of the current
interlanguage links to Wikidata storage?

For example:
* The interwiki bots will definitely have to be modified for the
Wikidata age. Did anybody start a conversation with the operators of
these bots?
* How will interwiki conflicts be handled? Judging by earlier
discussions, I suppose that Wikidata will just let pages with
conflicts work as they always did, but this is an important issue that
should be answered in the FAQ.[1]
* How will bots identify pages that use Wikidata?

Maybe the answer to all of the above questions is: "The Wikidata
developers intentionally want to leave these things to the editors
community". If it is, then this is fine. It is even hinted in the
Technical proposal. But it should be written more explicitly, so that
the communities will actually start to work on that. Again, if
somebody already started it, please point me there.

Thank you!

[1] https://meta.wikimedia.org/wiki/Wikidata/FAQ

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



[Wikidata-l] i18n, l10n and m17n in Wikidata - lang and dir

2012-08-11 Thread Amir E. Aharoni
Hallo,

It's my first email on this list, so in case you don't know me: I am
Amir, I'm from Israel, I'm a wikipedian since 2004, I write mostly in
Hebrew and English, I care strongly about language issues in software
in general and about right-to-left support in particular, and I work
in the WMF's localization team.

Now, about the subject: you probably know that i18n is
"internationalization" and "l10n" is "localization". "m17n" is a less
common term, which means "multilingualization" - making software able
to work in many languages at once. This email is about one of the
easiest and the most important ways to make Wikidata support many
languages on one page everywhere.

I've been testing the Wikidata demo for a few days now, with the aim
of getting it deployed in the Hebrew Wikipedia very soon. The first
thing that I noticed is that even though everybody understands that
Wikidata is supposed to be massively multilingual, little or no use is
made of the lang and dir attributes in the HTML that Wikidata
generates. The most immediate example is
http://wikidata-test-repo.wikimedia.de/wiki/Data:Q2?uselang=en

It basically lists the word "Helium" in many languages, but as far as
the browser is concerned, almost all of it is written in English,
because the root <html> element says lang="en". The only exceptions
are the interlanguage links in the sidebar, where the lang attributes
are used properly, but that's a regular MediaWiki feature.

It is very much needed to explicitly specify the lang attribute and
also the dir attribute (direction: "ltr" or "rtl") on every element,
the content language of which is known to be different from the
content language of the enclosing element. Many developers may think
that this attribute doesn't do anything, but actually it does a lot:
* correct text-to-speech and speech-to-text handling
* correct font rendering (relevant for Serbian [1], for some languages
of India etc.)
* selecting the correct spell checking dictionary
* selecting the right language for machine translation
* adjusting the line-height
* selecting the web font (in MediaWiki's WebFonts extension)
* etc.

So please, use it whenever you can.
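A minimal sketch of the markup being asked for: every label carries its own explicit lang and dir instead of inheriting the page-wide lang="en". The wrap_label helper and the RTL_LANGS set are hypothetical, for illustration only; in MediaWiki the real work is done by the PHP Language class, which offers analogous convenience methods.

```python
# Illustrative subset of right-to-left language codes (hypothetical;
# MediaWiki derives directionality from its language data, not a
# hard-coded set like this).
RTL_LANGS = {"he", "ar", "fa", "ur", "yi", "dv"}

def wrap_label(text, lang):
    """Wrap a label in a span with explicit lang and dir attributes."""
    direction = "rtl" if lang in RTL_LANGS else "ltr"
    return '<span lang="%s" dir="%s">%s</span>' % (lang, direction, text)

# The Helium item's labels, as they ought to be emitted:
labels = {"en": "Helium", "he": "הליום", "ru": "Гелий"}
html = "\n".join(wrap_label(text, lang) for lang, text in sorted(labels.items()))
```

With markup like this, the browser can pick the right fonts, spell checker, and text direction per label, rather than treating the whole list as English.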

Always use the dir attribute in these circumstances, too. It must be
specified explicitly even though "ltr" is the default, because if the
user interface is right-to-left, it will propagate to elements in
other languages, too, so you would get right-to-left English. (I consider
this a bug in the HTML standard... but it's a topic for a different
email).

In the case of the page that I mentioned above, it should be quite
trivial to fix, because MediaWiki's Language class provides very easy
functions for this. I also opened bug 39257 [2] about it. I am
repeating it here on the mailing list, just to say to the developers
to do it everywhere. If you are a developer and you run into any
problems with using these attributes, please contact me in any way that
is convenient to you.

Thank you!

[1] See https://sr.wikipedia.org/wiki/User:Amire80
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=39257

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
