[Wikidata-tech] Re: [BREAKING CHANGE] Pagename/filename normalization on saving an edit

2021-09-13 Thread Lucas Werkmeister
Hello,

This is an update to our previous announcement about data value
normalization when saving edits.

We’d like to announce an additional significant change in this area.

Together with the previously announced normalization of Commons media
values, we also implemented Unicode normalization of string values
(regardless of data type, i.e. for properties of data type string, external
identifier, URL, etc.): they are now always saved in Unicode Normalization
Form C (NFC, aka Normalization Form Canonical Composition). Note that, just
as for Commons media normalization, this only applies to new edits, and
existing data in Wikidata could still be in non-normalized form.
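
For example, you can normalize values yourself before sending an edit; a
minimal sketch in Python, using only the standard library:

import unicodedata

# "ö" written as "o" + combining diaeresis (a decomposed form)
value = 'Angstr' + 'o\u0308' + 'm.jpg'
print(unicodedata.is_normalized('NFC', value))  # False (Python 3.8+)

normalized = unicodedata.normalize('NFC', value)
print(normalized == 'Angstr\u00f6m.jpg')        # True: composed U+00F6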

Cheers,
Lucas Werkmeister

On Mon, 23 Aug 2021 at 13:48, Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:

> Hello,
>
> As you may know, Wikibase currently does not normalize pagenames/filenames
> on save (e.g. underscores in the input for properties of datatype Commons
> media are allowed). At the same time, Wikidata’s quality constraints
> extension
> <https://www.mediawiki.org/wiki/Extension:WikibaseQualityConstraints>
> triggers a constraint violation after saving if underscores are used. This
> is by design, in line with long-established
> <https://www.wikidata.org/wiki/Template:Constraint:Commons_link>
> community practices. As a result, this inconsistency leaves users with
> unnecessary manual work.
>
> We will update Wikibase so that when a new edit is saved via UI or API,
> and a pagename/filename is added or changed in that edit, then this
> pagename/filename will be normalized on save ("My file_name.jpg" -> "My
> file name.jpg").
>
> More generally, the breaking change is that a user of the Wikibase API may
> send one data value when saving an edit, and get back a slightly different
> (normalized) data value after the edit was made: it is no longer the case
> that data values are either saved unmodified or totally rejected (e.g. if a
> file doesn’t exist on Commons). Since this guarantee is being removed with
> this breaking change announcement, we may introduce further normalizations
> in the future and only announce them as significant changes, not breaking
> changes.
>
> The change is currently available on test.wikidata.org and
> test-commons.wikimedia.org. It will be deployed on Wikidata on or shortly
> after September 6th. If you have any questions or feedback, please feel
> free to let us know in this ticket
> <https://phabricator.wikimedia.org/T251480>.
>
> Cheers,
> Lucas Werkmeister
>
> --
> Lucas Werkmeister (he/er)
> Full Stack Developer
>
> Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
> Phone: +49 (0)30 219 158 26-0
> https://wikimedia.de
>
> Imagine a world in which every single human being can freely share in the
> sum of all knowledge. Help us to achieve our vision!
> https://spenden.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
> der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
> Körperschaften I Berlin, Steuernummer 27/029/42207.
>


-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list -- wikidata-tech@lists.wikimedia.org
To unsubscribe send an email to wikidata-tech-le...@lists.wikimedia.org


[Wikidata-tech] [BREAKING CHANGE] Pagename/filename normalization on saving an edit

2021-08-23 Thread Lucas Werkmeister
Hello,

As you may know, Wikibase currently does not normalize pagenames/filenames
on save (e.g. underscores in the input for properties of datatype Commons
media are allowed). At the same time, Wikidata’s quality constraints
extension
<https://www.mediawiki.org/wiki/Extension:WikibaseQualityConstraints>
triggers a constraint violation after saving if underscores are used. This
is by design, in line with long-established
<https://www.wikidata.org/wiki/Template:Constraint:Commons_link> community
practices. As a result, this inconsistency leaves users with unnecessary
manual work.

We will update Wikibase so that when a new edit is saved via UI or API, and
a pagename/filename is added or changed in that edit, then this
pagename/filename will be normalized on save ("My file_name.jpg" -> "My
file name.jpg").

More generally, the breaking change is that a user of the Wikibase API may
send one data value when saving an edit, and get back a slightly different
(normalized) data value after the edit was made: it is no longer the case
that data values are either saved unmodified or totally rejected (e.g. if a
file doesn’t exist on Commons). Since this guarantee is being removed with
this breaking change announcement, we may introduce further normalizations
in the future and only announce them as significant changes, not breaking
changes.
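
The safe pattern for API users is therefore to read the value back from
the edit response instead of assuming the submitted value was stored
verbatim. A sketch in Python (the inline dict stands in for the parsed
JSON of a real wbcreateclaim-style edit response):

sent_value = 'My file_name.jpg'

# Stand-in for the parsed JSON of a real edit response:
response = {'claim': {'mainsnak': {'datavalue': {'value': 'My file name.jpg'}}}}

stored_value = response['claim']['mainsnak']['datavalue']['value']
if stored_value != sent_value:
    print('Value was normalized on save:', stored_value)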

The change is currently available on test.wikidata.org and
test-commons.wikimedia.org. It will be deployed on Wikidata on or shortly
after September 6th. If you have any questions or feedback, please feel
free to let us know in this ticket
<https://phabricator.wikimedia.org/T251480>.

Cheers,
Lucas Werkmeister

-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list -- wikidata-tech@lists.wikimedia.org
To unsubscribe send an email to wikidata-tech-le...@lists.wikimedia.org


[Wikidata-tech] BREAKING / SIGNIFICANT CHANGE: wbeditentity response to use standard JSON serialization

2021-02-08 Thread Lucas Werkmeister
Hi folks!

This is an announcement for a change to the response of the wbeditentity
API module, which is a breaking change for MediaInfo entities (Structured
Data on Commons), a significant change for Lexeme entities (lexicographical
data), a minor change for Property entities, and a no-op for Item entities.

The wbeditentity API module, which can be used to edit any part of a
Wikibase Entity, has long included a part of the edited data in its
response. However, this response data was incomplete: it included e.g.
Labels, Statements, Sitelinks, but not the datatype of a Property or the
Lemmas of a Lexeme. Additionally, Statements were always returned under the
key "claims", even though MediaInfo entities generally use the key
"statements". On or around February 10, we will deploy a code change that
will make wbeditentity return the entity data using the standard
serialization format of an entity type, the same that is used by
wbgetentities and Special:EntityData. This means that the response will now
contain all the parts of an Entity, and also that, for MediaInfo entities,
the Statements will now be returned under "statements". (These Statements
will also be missing the "datatype", just like the MediaInfo data from
wbgetentities and Special:EntityData – see T246809
<https://phabricator.wikimedia.org/T246809>.)

To avoid breaking MediaInfo API users immediately, we are temporarily
adding Statements under the key "claims" as well – that is, the change on
February 10 is only significant, not yet breaking, and MediaInfo API users
can use either the "claims" or the "statements". On or around March 3, we
will remove this compatibility code, and MediaInfo API users will have to
use "statements" if they want to look at the Statements of the returned
entity data.
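
Until then, MediaInfo API users can stay compatible with both behaviors by
preferring "statements" and falling back to "claims"; a minimal sketch in
Python, against the parsed JSON response:

def get_statements(entity):
    # MediaInfo entities use "statements" after the change; older
    # responses (and other entity types) use "claims".
    return entity.get('statements', entity.get('claims', {}))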

It’s also worth mentioning here what the wbeditentity response data
represents. The API returns the Entity data as edited by your API request,
and the revision ID of the page that the change was saved under. This is
*not* necessarily the Entity data of that revision ID: if Wikibase patched
any edit conflicts between the base revision ID that your request specified
(baserevid parameter) and the actual latest revision ID, then those changes
are not included in the response. For example, if you load an Item with
revision ID *X* and labels “a” and “b” in different languages, and make a
wbeditentity request with baserevid=*X* to change the label “a” to “A”, but
in the meantime someone else had already changed the label “b” to “B” and
saved this as revision *Y*, then the API response for your request will
have a last revision ID ("lastrevid") of *Z*, and this revision *Z* will
have labels “A” and “B” if you get it from Special:EntityData, but the API
response for your request will have labels “A” and “b” (the result of
applying your edit to the base revision). If you need the actual latest
Entity data after your edit, make a separate request to Special:EntityData
or wbgetentities. (This is nothing new, and unaffected by the change being
announced here, but we thought it was still worth mentioning.)
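
Such a follow-up request could look like this; a sketch in Python, using
the third-party requests library:

import requests

response = requests.get('https://www.wikidata.org/w/api.php', params={
    'action': 'wbgetentities',
    'ids': 'Q16503',
    'format': 'json',
}).json()

entity = response['entities']['Q16503']  # data of the actual latest revision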

If you have any issue or question, feel free to leave a comment at T271105
<https://phabricator.wikimedia.org/T271105>.

Cheers,
Lucas
-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] [Significant change] New datatype fields in Lexeme JSON output

2020-08-17 Thread Lucas Werkmeister
Hi all!

This is an announcement for a significant (but not breaking) change to the
JSON output of WikibaseLexeme entities, specifically of Senses and Forms,
when obtained via Special:EntityData
<https://www.wikidata.org/wiki/Special:EntityData> or the Wikibase API
endpoints like Special:ApiHelp/wbgetentities
<https://www.wikidata.org/wiki/Special:ApiHelp/wbgetentities>.

The Snak output in the Wikibase JSON serialization
<https://doc.wikimedia.org/Wikibase/master/php/md_docs_topics_json.html#json_snaks>
usually contains a datatype field for each Snak. Previously, these fields
were missing for statements within Senses and Forms of a Lexeme – see
the message
on Project Chat
<https://www.wikidata.org/wiki/Special:PermanentLink/1258923595#[Significant_change]_New_datatype_fields_in_Lexeme_JSON_output>
for an example (I didn’t want to bloat this email too much by including it
here 🙂). Starting on 26 August 2020 (barring unexpected deployment issues; 25
August 2020 on Test Wikidata), these datatype fields will be present there
as well.
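
To see where the new fields appear, you can fetch a Lexeme and walk its
Senses; a sketch in Python, using the third-party requests library
(Statements under Senses and Forms sit in a "claims" key, as elsewhere in
the Wikibase JSON):

import requests

url = 'https://www.wikidata.org/wiki/Special:EntityData/L1.json'
lexeme = requests.get(url).json()['entities']['L1']

for sense in lexeme.get('senses', []):
    for prop, statements in sense.get('claims', {}).items():
        for statement in statements:
            # "datatype" was previously missing here; now it is present.
            print(sense['id'], prop, statement['mainsnak'].get('datatype'))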

If you have any issue or question, feel free to leave a comment at T249206
<https://phabricator.wikimedia.org/T249206>.

Cheers,
Lucas
-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Problem with SPARQL query: items which use a specific image (P18)

2020-07-24 Thread Lucas Werkmeister
This is very strange. Commons Media values are not supposed to be strings
in RDF – the documentation
<https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Commons_media>
confirms that they’re supposed to be Special:FilePath URIs, and for most
images, this is the case:

SELECT (STR(?image) AS ?str) (ISIRI(?image) AS ?isIri) WHERE {
  wd:Q33231166 wdt:P18 ?image.
}
?str:   http://commons.wikimedia.org/wiki/Special:FilePath/GuentherZ%202010-02-27%200207%20Wien10%20Scheunenstrasse2%20Bildstock.jpg
?isIri: true

(The query service UI then rewrites that URL into a different one when
showing the result, sending you to the file description page instead of
directly to the file.)

To search for an image by its name, you can construct that URL in the query:

PREFIX commons: <http://commons.wikimedia.org/wiki/Special:FilePath/>
SELECT * WHERE {
  hint:Query hint:optimizer "None".
  BIND("1936-ChirkovAntonNikolaevich.jpg" AS ?fileName)
  BIND(IRI(CONCAT(STR(commons:), ENCODE_FOR_URI(?fileName))) AS ?image)
  ?item wdt:P18 ?image.
}

I have no idea why your simple query, searching for the file name as a
string, actually works sometimes. I assume this must be a bug – I’ve
created T258782 <https://phabricator.wikimedia.org/T258782> to track it.
(Your first example, Q16718402 / 1936-ChirkovAntonNikolaevich.jpg, is now a
URL – Lydia edited the item to see if that resolved the issue, and after
the query service updater re-imported the item, the issue was gone.)
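
If you’d rather build that URL outside of SPARQL, the same construction in
Python looks like this (a sketch; ENCODE_FOR_URI corresponds to
percent-encoding everything except RFC 3986 unreserved characters):

from urllib.parse import quote

file_name = 'GuentherZ 2010-02-27 0207 Wien10 Scheunenstrasse2 Bildstock.jpg'
prefix = 'http://commons.wikimedia.org/wiki/Special:FilePath/'
print(prefix + quote(file_name, safe=''))  # spaces become %20, as above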

Hope this helps,
Lucas


On Mon, 6 Jul 2020 at 18:34, Stephan Bösch-Plepelits <
sk...@xover.mud.at> wrote:

> Hi!
>
> I have a problem with a SPARQL query. Hopefully you can help me.
>
> I'd like to get wikidata articles which use a specific image in the P18
> property. I'm using this simple query:
>   SELECT ?item WHERE { ?item wdt:P18 "item.jpg". }
>
> This works for some items, but not all.
>
> Example 1:
>   SELECT ?item
>   WHERE { ?item wdt:P18 "1936-ChirkovAntonNikolaevich.jpg". }
>   -> returns "wd:Q16718402" (https://www.wikidata.org/wiki/Q16718402)
>
> Example 2:
>   SELECT ?item
>   WHERE { ?item wdt:P18 "GuentherZ 2010-02-27 0207 Wien10 Scheunenstrasse2
> Bildstock.jpg". }
>   -> returns nothing, although it is used by
> https://www.wikidata.org/wiki/Q33231166
>
> This might be related to the following fact:
>
> Example 1A:
>   SELECT ?item ?image
>   WHERE {
> ?item wdt:P18 ?image.
> ?item wdt:P214 "307433452".
>   }
>   -> the field "image" shows a string "1936-ChirkovAntonNikolaevich.jpg"
>
> Example 2A:
>   SELECT ?item ?foo
>   WHERE {
> ?item wdt:P18 ?image.
> ?item wdt:P2951 "86578".
>   }
>   -> the field "image" shows a link to
>
> https://commons.wikimedia.org/wiki/File:GuentherZ%202010-02-27%200207%20Wien10%20Scheunenstrasse2%20Bildstock.jpg
>   with the text "commons:GuentherZ 2010-02-27 0207 Wien10 Scheunenstrasse2
> Bildstock.jpg"
>
> (I also tried prefixing the query image name by 'commons:', like "?item
> wdt:P18 "commons:image.jpg")
>
> Please tell me if you have an idea how to solve this problem!
>
> greetings,
> Stephan
> --
> Be uncomfortable, be sand, not oil, in the gears of the world! - Günther Eich
> ,--.
> | Stephan Bösch-Plepelits  ❤ code ❤ urbanism ❤ free software ❤ cycling |
> | Projects:|
> | > OpenStreetMap: openstreetbrowser.org > openstreetmap.at|
> | > Urbanism: Radlobby Wien > Platz für Wien   |
> | Contact: |
> | > Mail: sk...@xover.mud.at > Blog: plepe.at > Code: github.com/plepe |
> | > Twitter: twitter.com/plepe > Jabber: sk...@jabber.at   |
> | > Mastodon: @pl...@en.osm.town   |
> `--'
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>


-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] [Config change] WikibaseClient example config will no longer define an example repo

2020-07-20 Thread Lucas Werkmeister
Hi everyone!


This is an announcement for a change to the example configuration of the
WikibaseClient extension, which only affects Client wikis which are not
also Repository wikis. If you don’t operate a non-Repo Client wiki, or if
you don’t use the example settings, you can ignore this message.
(Wikibase-Docker users, in particular, can ignore it.) The change will
become effective on the master branch soon, and then in the future 1.36
release (but not yet in the upcoming 1.35 release).

Currently, the WikibaseClient example settings (client/ExampleSettings.php
or client/config/WikibaseClient.example.php) define an example Repo for the
Client if the WikibaseRepository extension is not also being loaded. As we
are moving towards using extension registration for Wikibase (T88258
<https://phabricator.wikimedia.org/T88258>), this will no longer be
possible (extensions loaded using wfLoadExtension() are processed later, so
the example settings file can’t detect if WikibaseRepo is loaded or not).
Therefore, we will be removing this code from the Client example settings,
and if your wiki is not also a Repo wiki, you will have to configure the
Repo manually. (For wikis that are both Repo and Client, the default Client
settings are sufficient, and nothing needs to be done.) This means that if
your configuration (LocalSettings.php) currently contains a line like

require_once "$IP/extensions/Wikibase/client/ExampleSettings.php";

or

require_once
"$IP/extensions/Wikibase/client/config/WikibaseClient.example.php";

but not a line like

require_once "$IP/extensions/Wikibase/repo/Wikibase.php";

then you should copy the example settings
<https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/7613b87c6197ec5ec71a7ab9c751b2102f6538a6/client/config/WikibaseClient.example.php#L31-L65>
into your configuration and adjust them as needed (at least the repoUrl,
http://repo.example.org, is certainly wrong). If you already configure some
of these client settings, then you can skip copying the corresponding
example settings lines.

We will be making this change on the Wikibase master branch very soon, so
if you use the latest MediaWiki and Wikibase from Git, then you will
probably have to update your configuration ahead of the next update too.
However, the change will not be in the upcoming MediaWiki 1.35 release (or
the REL1_35 Wikibase branch), so if you are sticking to release versions,
you don’t need to change anything yet. That said, you could still update
your configuration in preparation for this future change – the only
potential problem with copying this block is the repeated define(), which
may raise a PHP Notice, so you might want to wrap those two lines
<https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/7613b87c6197ec5ec71a7ab9c751b2102f6538a6/client/config/WikibaseClient.example.php#L52-L53>
in an if ( !defined( 'WB_REPO_NS_ITEM' ) ) block when copying them (unless
you stop loading the example settings altogether, which is also an option).

I also want to stress that, while the eventual goal of this work is to load
WikibaseRepo and WikibaseClient using extension registration (i. e., JSON
files, not PHP entry points), we do not yet recommend that third-party
installs do this. When Wikibase is fully ready for extension registration,
we will send another announcement, and the PHP entry points will probably
still be kept for compatibility for a good while longer even after they’re
no longer required.

If you have any issue or question, feel free to leave a comment at T257449
<https://phabricator.wikimedia.org/T257449>. For more information, see also
T256238 <https://phabricator.wikimedia.org/T256238>.

Cheers,

Lucas

-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] BREAKING CHANGE: removing special pageterms behavior on repo wikis, use entityterms instead

2020-07-15 Thread Lucas Werkmeister
This is an announcement for a breaking change to the pageterms submodule of
the query API module, which only affects Wikibase repository wikis. If you
do not use that API module, or only use it on client wikis (e. g.
Wikipedias) and not on repository wikis (Wikidata, Wikimedia Commons), you
can ignore this message.

For years, the pageterms
<https://www.wikidata.org/wiki/Special:ApiHelp/query%2Bpageterms> API
module has served a double role: on client wikis, it returned the “terms”
(Label, Description, Aliases) of the Wikidata Item linked to the given
page(s), whereas on repo wikis, it would return the terms of the Item (or
other Entity) on that page itself. For example, querying for the Label of
Wikipedia:Village pump on English Wikipedia
<https://en.wikipedia.org/w/api.php?action=query&prop=pageterms&titles=Wikipedia%3AVillage%20pump&wbptterms=label>
would return “Project:Village pump” (the Label of Q16503
<https://www.wikidata.org/wiki/Q16503>), but querying for the Label of
Wikidata:Project chat on Wikidata
<https://www.wikidata.org/w/api.php?action=query&prop=pageterms&titles=Wikidata:Project%20chat&wbptterms=label>
would not return anything, even though that page is linked to the same Item
– you would have to query for the Label of Q16503
<https://www.wikidata.org/w/api.php?action=query&prop=pageterms&titles=Q16503&wbptterms=label>
instead. This behavior is inconsistent and also mixes repo and client
concerns in a way that makes the Wikibase code harder to maintain.

To resolve this, we introduced a new entityterms
<https://www.wikidata.org/wiki/Special:ApiHelp/query%2Bentityterms> API
module (a submodule of the query module, just like the pageterms module)
which has the same behavior as the pageterms module currently has for Item
(or other Entity) pages, and which is only available on repo wikis. If you
want to get the terms of Q16503, you can now use
action=query&prop=entityterms&titles=Q16503
<https://www.wikidata.org/w/api.php?action=query&prop=entityterms&titles=Q16503>
instead of action=query&prop=pageterms&titles=Q16503
<https://www.wikidata.org/w/api.php?action=query&prop=pageterms&titles=Q16503>.
(You can also use wbgetentities
<https://www.wikidata.org/wiki/Special:ApiHelp/wbgetentities>, which gives
you much more control over the returned data; pageterms/entityterms may be
faster and can also be combined with other submodules of the query module.)
On or shortly after 5 August 2020, we will remove the special repo behavior
of the pageterms module, and it will then behave just like it always has on
client wikis, and return the terms of the Item linked to a page, not the
terms of the Item (or other Entity) on a page. (Because the new API module
is already available on Wikidata, and you can start using it immediately,
we are not making this pageterms behavior change available on Test Wikidata
significantly before that date.)
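
A minimal entityterms request, for illustration (a sketch in Python, using
the third-party requests library):

import requests

response = requests.get('https://www.wikidata.org/w/api.php', params={
    'action': 'query',
    'prop': 'entityterms',
    'titles': 'Q16503',
    'format': 'json',
}).json()

for page in response['query']['pages'].values():
    print(page.get('entityterms', {}))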

If you have any issue or question, feel free to leave a comment at T257658
<https://phabricator.wikimedia.org/T257658>. For more information, see also
T115117 <https://phabricator.wikimedia.org/T115117>, T255882
<https://phabricator.wikimedia.org/T255882> and T256255
<https://phabricator.wikimedia.org/T256255>.

Cheers,
Lucas
-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] [Wikitech-l] BREAKING CHANGE: schema update, xml dumps

2019-11-28 Thread Lucas Werkmeister
Though I should also note that using the Wikidata XML dumps is not
recommended in general (see Wikidata:Database download#XML dumps
<https://www.wikidata.org/wiki/Wikidata:Database_download#XML_dumps>), and
this change also mainly affects non-main slots, which we don’t yet have on
Wikidata (Quarry <https://quarry.wmflabs.org/query/40356>). If you use the
dumps to analyze non-entity content (e. g. discussion pages), you may
notice a new <content> element within a <revision> element, or a new sha1
attribute on the <revision>’s main <text> element; otherwise, this should be
backwards compatible.

On Wed, 27 Nov 2019 at 17:14, Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:

> Forwarding as this will also be relevant for people who consume Wikidata
> XML dumps (but not entity dumps), and especially for people who are
> interested in working with Structured Data on Commons from dumps.
>
> -- Forwarded message -
> From: Ariel Glenn WMF 
> Date: Wed, 27 Nov 2019 at 14:39
> Subject: [Wikitech-l] BREAKING CHANGE: schema update, xml dumps
> To: Wikipedia Xmldatadumps-l ,
> Wikimedia developers 
>
>
> We plan to move to the new schema for xml dumps for the February 1, 2020
> run. Update your scripts and apps accordingly!
>
> The new schema contains an entry for each 'slot' of content. This means
> that, for example, the commonswiki dump will contain MediaInfo information
> as well as the usual wikitext. See
>
> https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/master/docs/export-0.11.xsd
> for the schema and
>
> https://www.mediawiki.org/wiki/Requests_for_comment/Schema_update_for_multiple_content_objects_per_revision_(MCR)_in_XML_dumps
> for further explanation and example outputs.
>
> Phabricator task for the update: https://phabricator.wikimedia.org/T238972
>
> PLEASE FORWARD to other lists as you deem appropriate. Thanks!
>
> Ariel Glenn
> ___
> Wikitech-l mailing list
> wikitec...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> --
> Lucas Werkmeister (he/er)
> Full Stack Developer
>
> Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
> Phone: +49 (0)30 219 158 26-0
> https://wikimedia.de
>
> Imagine a world in which every single human being can freely share in the
> sum of all knowledge. Help us to achieve our vision!
> https://spenden.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
> der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
> Körperschaften I Berlin, Steuernummer 27/029/42207.
>


-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] Fwd: [Wikitech-l] BREAKING CHANGE: schema update, xml dumps

2019-11-27 Thread Lucas Werkmeister
Forwarding as this will also be relevant for people who consume Wikidata
XML dumps (but not entity dumps), and especially for people who are
interested in working with Structured Data on Commons from dumps.

-- Forwarded message -
From: Ariel Glenn WMF 
Date: Wed, 27 Nov 2019 at 14:39
Subject: [Wikitech-l] BREAKING CHANGE: schema update, xml dumps
To: Wikipedia Xmldatadumps-l ,
Wikimedia developers 


We plan to move to the new schema for xml dumps for the February 1, 2020
run. Update your scripts and apps accordingly!

The new schema contains an entry for each 'slot' of content. This means
that, for example, the commonswiki dump will contain MediaInfo information
as well as the usual wikitext. See
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/master/docs/export-0.11.xsd
for the schema and
https://www.mediawiki.org/wiki/Requests_for_comment/Schema_update_for_multiple_content_objects_per_revision_(MCR)_in_XML_dumps
for further explanation and example outputs.

Phabricator task for the update: https://phabricator.wikimedia.org/T238972

PLEASE FORWARD to other lists as you deem appropriate. Thanks!

Ariel Glenn
___
Wikitech-l mailing list
wikitec...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


-- 
Lucas Werkmeister (he/er)
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Changed JS API?

2019-11-18 Thread Lucas Werkmeister
On 18.11.19 01:05, Michael Schönitzer wrote:
> Hi,
>
> On Sat, 16 Nov 2019 at 15:30, Lucas Werkmeister
> mailto:m...@lucaswerkmeister.de>> wrote:
>
> – but that’s not the error I get anyways (I get “Error loading citoid
> config”, because it’s trying to load a JSON file as JavaScript, I
> think).
>
>
> You're using the original version by Aude, which stopped working
> months ago due to security-related changes to the mimetypes returned
> by the servers.

No, I am not. This was using your fork, the one mentioned in the
original email, and I still got this error. (I also reproduced it in a
private window, just in case I had something weird in my common.js.)

Cheers,
Lucas

> There are at least three forks of the script that fixed this – one of
> them mine. But they now broke again, and at least I am tired of patching
> gadgets all the time that break due to the lack of stable APIs. I
> also consider the functionality of this addon something that should be
> built into Wikibase directly, otherwise the situation with the lack of
> (proper) sources in Wikidata will never improve.
>
> Cheers,
>  M
>  
>
> Cheers,
> Lucas
>
> [1]:
> 
> https://www.wikidata.org/w/index.php?title=Wikidata:Stable_Interface_Policy&oldid=1037703687#Unstable_Interfaces
>
> On 16.11.19 01:49, Max Kristen wrote:
> > Hi Tech-Mailinglist,
> >
> > I'm using this gadget  here:
> >
> >
> 
> https://www.wikidata.org/wiki/Wikidata:Tools/Enhance_user_interface#CiteTool
> >
> > in its current  incarnation here:
> >
> > https://www.wikidata.org/wiki/User:MichaelSchoenitzer/CiteTool.js
> >
> > It doesn't seem to work since the last few weeks, maybe because
> it calls
> > on PropertyValueSnak, which seems to have moved since the last
> JS API
> > version.
> >
> >
> https://doc.wikimedia.org/Wikibase/master/js/#!/api/PropertyValueSnak
> >
> > Is the API not ready to be stable yet, and how can this be fixed?
> >
> > Greetings
> >
> > Max
> >
> >
> >
> >
> >
> > ___
> > Wikidata-tech mailing list
> > Wikidata-tech@lists.wikimedia.org
> <mailto:Wikidata-tech@lists.wikimedia.org>
> > https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
> >
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> <mailto:Wikidata-tech@lists.wikimedia.org>
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Changed JS API?

2019-11-16 Thread Lucas Werkmeister
The Wikibase JavaScript code has never been a stable interface, as the
Stable Interface Policy [1] explicitly points out. As for
PropertyValueSnak, it should now be loaded via ResourceLoader –

mw.loader.using( 'wikibase.datamodel' )
    .then( function ( require ) {
        var datamodel = require( 'wikibase.datamodel' ),
            PropertyValueSnak = datamodel.PropertyValueSnak,
            EntityId = datamodel.EntityId;
        // ...
    } );

– but that’s not the error I get anyways (I get “Error loading citoid
config”, because it’s trying to load a JSON file as JavaScript, I think).

Cheers,
Lucas

[1]:
https://www.wikidata.org/w/index.php?title=Wikidata:Stable_Interface_Policy&oldid=1037703687#Unstable_Interfaces

On 16.11.19 01:49, Max Kristen wrote:
> Hi Tech-Mailinglist,
> 
> I'm using this gadget here:
> 
> https://www.wikidata.org/wiki/Wikidata:Tools/Enhance_user_interface#CiteTool
> 
> in its current incarnation here:
> 
> https://www.wikidata.org/wiki/User:MichaelSchoenitzer/CiteTool.js
> 
> It doesn't seem to work since the last few weeks, maybe because it calls
> on PropertyValueSnak, which seems to have moved since the last JS API
> version.
> 
> https://doc.wikimedia.org/Wikibase/master/js/#!/api/PropertyValueSnak
> 
> Is the API not ready to be stable yet, and how can this be fixed?
> 
> Greetings
> 
> Max
> 
> 
> 
> 
> 
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
> 

___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Question on Wikidata/Wikibase reconciliation service for OpenRefine

2019-09-19 Thread Lucas Werkmeister
The API version was missing srnamespace=120 (only searches main
namespace by default, not Item namespace) and srwhat=text (no idea what
the default here is tbh). This version works:
https://data.biblissima.fr/w/api.php?action=query&list=search&srsearch=Jakob%20von%20Viraggio&srnamespace=120&srwhat=text

Cheers,
Lucas

On 19.09.19 17:44, Régis Robineau wrote:
> Got it! Thanks. The developer of Openrefine-Wikibase is telling me
> that, inexplicably in my own instance, there is a discrepancy between
> what is returned in the Mediawiki search UI:
> https://data.biblissima.fr/w/index.php?search=Jakob+von+Viraggio&title=Sp%C3%A9cial%3ARecherche&profile=default&fulltext=1
>  (1
> result)
> and what is returned by the API:
> https://data.biblissima.fr/w/api.php?action=query&list=search&srsearch=Jakob%20von%20Viraggio
> (0 result)
> What could be the reason for that? In theory both result sets should
> match, do they? This problem should not be related to CirrusSearch,
> but I may be wrong... Any idea?
>
> Cheers,
> Régis
>
>
>
>
> On Thu, 19 Sep 2019 at 12:15, Lucas Werkmeister
> mailto:m...@lucaswerkmeister.de>> wrote:
>
> Special:Search is the general MediaWiki search, only partially
> related to Wikibase, that’s why I specified you should test the
> other one :)
>
> Installing CirrusSearch is probably your best bet, if it’s
> possible for you, yeah.
>
> Cheers,
> Lucas
>
> On 19.09.19 10:52, Régis Robineau wrote:
>> Thank you for getting me on the right track. You're right, we're
>> not using CirrusSearch for the moment, and the suggestion box (at
>> the top right corner) does not perform cross-languages search in
>> our instance: the autocomplete mechanism is only aware of the
>> current active language. But on the other hand the main Wikibase
>> search (via Special:Search) does seem to search all available
>> languages... 
>> I will ask the openrefine-wikibase developer if he thinks there
>> is a way to solve this in his application. But anyway I guess the
>> best way to benefit from a proper cross-languages search, both
>> within Wikibase and for the reconciliation service, would be to
>> use CirrusSearch as well. What do you think?
>>
>> Cheers,
>> Régis
>>
>> On Thu, 19 Sep 2019 at 00:08, Lucas Werkmeister
>> mailto:m...@lucaswerkmeister.de>>
>> wrote:
>>
>> It looks like the openrefine-wikibase reconciliation service
>> uses the wbsearchentities API to find items. As far as I’m
>> aware, the default SQL-based Wikibase search also searches
>> other languages, but still, I think the most likely reason
>> you’re getting different results is that Wikidata uses
>> WikibaseCirrusSearch
>> <https://www.mediawiki.org/wiki/Extension:WikibaseCirrusSearch>,
>> and I assume your wiki doesn’t. If you use entity search on
>> your wiki directly (i. e. not via Special:Search, but in
>> suggestion boxes), does cross-language search work as it
>> should or does it have the same problem?
>>
>> Cheers,
>> Lucas
>>
>> On 18.09.19 22:55, Régis Robineau wrote:
>>> Hi all,
>>>
>>> I'd need help on the Wikidata/Wikibase reconciliation
>>> service for OpenRefine.
>>>
>>> Context: I have my own Wikibase and WDQS instances in
>>> production, and I want to set up a reconciliation service on
>>> top of it, so that users can perform matchings from their
>>> local OpenRefine. I'm using the same tool as Wikidata, i.e.
>>> https://github.com/wetneb/openrefine-wikibase. The web
>>> service works fine, I can reconcile strings in OpenRefine
>>> against the data stored in my Wikibase instance... 
>>>
>>> Issue: But there is a noteworthy difference compared to how
>>> the Wikidata reconciliation service works: 
>>> - with Wikidata, i.e. by using the web service URL with the
>>> "en" language prefix
>>> (https://tools.wmflabs.org/openrefine-wikidata/en/api), I am
>>> able to find matches among labels in any other language of a
>>> Wikidata item. For instance, if I send a request for "Jacopo
>>> de Fazio", which is an alias in French and Italian
>>> for Q313460 <https

Re: [Wikidata-tech] Question on Wikidata/Wikibase reconciliation service for OpenRefine

2019-09-19 Thread Lucas Werkmeister
Special:Search is the general MediaWiki search, only partially related
to Wikibase, that’s why I specified you should test the other one :)

Installing CirrusSearch is probably your best bet, if it’s possible for
you, yeah.

Cheers,
Lucas

On 19.09.19 10:52, Régis Robineau wrote:
> Thank you for getting me on the right track. You're right, we're not
> using CirrusSearch for the moment, and the suggestion box (at the top
> right corner) does not perform cross-languages search in our instance:
> the autocomplete mechanism is only aware of the current active
> language. But on the other hand the main Wikibase search (via
> Special:Search) does seem to search all available languages... 
> I will ask the openrefine-wikibase developer if he thinks there is a
> way to solve this in his application. But anyway I guess the best way
> to benefit from a proper cross-languages search, both within Wikibase
> and for the reconciliation service, would be to use CirrusSearch as
> well. What do you think?
>
> Cheers,
> Régis
>
> On Thu, 19 Sep 2019 at 00:08, Lucas Werkmeister
> mailto:m...@lucaswerkmeister.de>> wrote:
>
> It looks like the openrefine-wikibase reconciliation service uses
> the wbsearchentities API to find items. As far as I’m aware, the
> default SQL-based Wikibase search also searches other languages,
> but still, I think the most likely reason you’re getting different
> results is that Wikidata uses WikibaseCirrusSearch
> <https://www.mediawiki.org/wiki/Extension:WikibaseCirrusSearch>,
> and I assume your wiki doesn’t. If you use entity search on your
> wiki directly (i. e. not via Special:Search, but in suggestion
> boxes), does cross-language search work as it should or does it
> have the same problem?
>
> Cheers,
> Lucas
>
> On 18.09.19 22:55, Régis Robineau wrote:
>> Hi all,
>>
>> I'd need help on the Wikidata/Wikibase reconciliation service for
>> OpenRefine.
>>
>> Context: I have my own Wikibase and WDQS instances in production,
>> and I want to set up a reconciliation service on top of it, so
>> that users can perform matchings from their local OpenRefine. I'm
>> using the same tool as Wikidata, i.e.
>> https://github.com/wetneb/openrefine-wikibase. The web service
>> works fine, I can reconcile strings in OpenRefine against the
>> data stored in my Wikibase instance... 
>>
>> Issue: But there is a noteworthy difference compared to how the
>> Wikidata reconciliation service works: 
>> - with Wikidata, i.e. by using the web service URL with the "en"
>> language prefix
>> (https://tools.wmflabs.org/openrefine-wikidata/en/api), I am able
>> to find matches among labels in any other language of a Wikidata
>> item. For instance, if I send a request for "Jacopo de Fazio",
>> which is an alias in French and Italian for Q313460
>> <https://www.wikidata.org/wiki/Q313460>, OpenRefine will
>> match Q313460 as expected, even if I'm using the "en" language
>> code in the web service url. 
>> - Whereas in my own instance, i.e. by using my own
>> "openrefine-wikibase" reconciliation service, it can only perform
>> matching of labels/aliases in the same language: e.g. if I use
>> "https://my-service.org/openrefine-wikidata/en/api";, the web
>> service only searches for labels in English in my Wikibase. This
>> means that I am forced to launch the reconciliation process in
>> OpenRefine for every single language, one by one.
>>
>> I do not know how the Wikidata reconciliation service is able to
>> take into account all the labels/aliases in all the languages of
>> a given Wikidata item. The data is modeled in the same way in
>> Wikidata and in my Wikibase, and I do not see any difference
>> between the two in the way the RDF data is structured into the
>> respective triplestores...
>>
>> How can I enable the same behaviour as in the Wikidata
>> reconciliation service? (i.e. to look for labels/aliases in every
>> languages in one API call)
>> This would heavily improve the reconciliation process in
>> OpenRefine for my users.
>>
>> Thanks a lot for your help!
>>
>> Régis
>>
>>
>>
>> ___
>> Wikidata-tech mailing list
>> Wikidata-tech@lists.wikimedia.org 
>> <mailto:Wikidata-tech@lists.wikimedia.org>
>> https://lists.w

Re: [Wikidata-tech] Question on Wikidata/Wikibase reconciliation service for OpenRefine

2019-09-18 Thread Lucas Werkmeister
It looks like the openrefine-wikibase reconciliation service uses the
wbsearchentities API to find items. As far as I’m aware, the default
SQL-based Wikibase search also searches other languages, but still, I
think the most likely reason you’re getting different results is that
Wikidata uses WikibaseCirrusSearch
, and I
assume your wiki doesn’t. If you use entity search on your wiki directly
(i. e. not via Special:Search, but in suggestion boxes), does
cross-language search work as it should or does it have the same problem?

Cheers,
Lucas

On 18.09.19 22:55, Régis Robineau wrote:
> Hi all,
>
> I'd need help on the Wikidata/Wikibase reconciliation service for
> OpenRefine.
>
> Context: I have my own Wikibase and WDQS instances in production, and
> I want to set up a reconciliation service on top of it, so that users
> can perform matchings from their local OpenRefine. I'm using the same
> tool as Wikidata, i.e. https://github.com/wetneb/openrefine-wikibase.
> The web service works fine, I can reconcile strings in OpenRefine
> against the data stored in my Wikibase instance... 
>
> Issue: But there is a noteworthy difference compared to how the
> Wikidata reconciliation service works: 
> - with Wikidata, i.e. by using the web service URL with the "en"
> language prefix
> (https://tools.wmflabs.org/openrefine-wikidata/en/api), I am able to
> find matches among labels in any other language of a Wikidata item.
> For instance, if I send a request for "Jacopo de Fazio", which is an
> alias in French and Italian for Q313460
> , OpenRefine will match Q313460
> as expected, even if I'm using the "en" language code in the web
> service url. 
> - Whereas in my own instance, i.e. by using my own
> "openrefine-wikibase" reconciliation service, it can only perform
> matching of labels/aliases in the same language: e.g. if I use
> "https://my-service.org/openrefine-wikidata/en/api";, the web service
> only searches for labels in English in my Wikibase. This means that I
> am forced to launch the reconciliation process in OpenRefine for every
> single language, one by one.
>
> I do not know how the Wikidata reconciliation service is able to take
> into account all the labels/aliases in all the languages of a given
> Wikidata item. The data is modeled in the same way in Wikidata and in
> my Wikibase, and I do not see any difference between the two in the
> way the RDF data is structured into the respective triplestores...
>
> How can I enable the same behaviour as in the Wikidata reconciliation
> service? (i.e. to look for labels/aliases in every languages in one
> API call)
> This would heavily improve the reconciliation process in OpenRefine
> for my users.
>
> Thanks a lot for your help!
>
> Régis
>
>
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] Fwd: [Wikitech-l] Possible change in schedule of generation of wikidata entity dumps

2019-03-14 Thread Lucas Werkmeister
Forwarding in case anyone on this list doesn’t follow wikitech-l – if you
use the Wikidata entity dumps (JSON/Turtle/N-Triples), you might be
interested in T216160 <https://phabricator.wikimedia.org/T216160>.

Cheers,
Lucas

-- Forwarded message -
From: Ariel Glenn WMF 
Date: Thu, 14 Mar 2019 at 10:53
Subject: [Wikitech-l] Possible change in schedule of generation of wikidata
entity dumps
To: Wikimedia developers 


If you use these dumps regularly, please read and weigh in here:
https://phabricator.wikimedia.org/T216160

Thanks in advance,

Ariel Glenn
Wikimedia Foundation
ar...@wikimedia.org
___
Wikitech-l mailing list
wikitec...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


-- 
Lucas Werkmeister
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us to achieve our vision!
https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] New API module to format multiple entity IDs

2018-12-04 Thread Lucas Werkmeister
Users of the Wikibase API who need to format many entity IDs (e. g. in
QuickStatements, Wikidata Graph Builder, Wikidata Recent Changes, or
Wikidata Reconciliation) can now use a new API module for this:
wbformatentities. It combines advantages of wbgetentities and wbformatvalue:
as in wbformatvalue, you can use Wikibase’ own logic for formatting
entities (so you don’t have to worry about downloading labels, applying
language fallbacks, dealing with other entity types like lexemes, etc.),
but as in wbgetentities, you can process large numbers of entities at once,
instead of making one API call per entity.

The module is currently kept very simple: you specify a list of entity IDs
with the ids parameter, and the API returns a list of HTML snippets
corresponding to those IDs. (Support for other output formats may be added
later; let us know if it would be useful to you.) The language can be
controlled via the global uselang parameter. Normal users can format up to
50 entities at once, bots up to 500.
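
A minimal call, for illustration (a sketch in Python, using the
third-party requests library):

import requests

response = requests.get('https://www.wikidata.org/w/api.php', params={
    'action': 'wbformatentities',
    'ids': 'Q42|P31',
    'uselang': 'en',
    'format': 'json',
}).json()

# Maps each requested entity ID to an HTML snippet linking to the entity.
print(response['wbformatentities'])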

Please let us know if you have any comments, either by responding here or
over on Phabricator at T207484 <https://phabricator.wikimedia.org/T207484>.

-- Lucas


-- 
Lucas Werkmeister
Full Stack Developer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the
sum of all knowledge. That’s our commitment.

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] wikidata query service CONSTRUCT results

2018-07-20 Thread Lucas Werkmeister
If you send an Accept header for Turtle, the server returns Turtle:

$ curl -H 'Accept: text/turtle' --data-urlencode query='CONSTRUCT {
?item a <http://example.com/Human>. } WHERE { ?item wdt:P31 wd:Q5. }
LIMIT 10' https://query.wikidata.org/sparql
@prefix wd: <http://www.wikidata.org/entity/> .
# ...
wd:Q260 a <http://example.com/Human> .
# ...

Unfortunately, specifying an analogous N-Triples header doesn’t work.
I’m not sure why – perhaps BlazeGraph just doesn’t support it directly.
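
As a workaround, you can request Turtle and convert it locally; a sketch in
Python, using the third-party requests and rdflib libraries:

import requests
from rdflib import Graph

turtle = requests.get(
    'https://query.wikidata.org/sparql',
    params={'query': 'CONSTRUCT { ?item wdt:P31 wd:Q5 } '
                     'WHERE { ?item wdt:P31 wd:Q5 } LIMIT 10'},
    headers={'Accept': 'text/turtle'},
).text

graph = Graph()
graph.parse(data=turtle, format='turtle')
print(graph.serialize(format='nt'))  # N-Triples, ready for concatenation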

Cheers,
Lucas


On 20.07.2018 04:22, Peter F. Patel-Schneider wrote:
> Hi:
>
> How can I control the form of CONSTRUCT results from the wikidata query
> service?
>
> On the web interface I can get the results in various formats (but not
> something that can be read into SPARQL, I don't think).   Using curl all I get
> is XML/RDF.
>
> What I really want is n-triples so that I can concatenate several results, but
> Turtle would be better than RDF/XML.
>
> peter
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Query data type using WDQS

2018-06-07 Thread Lucas Werkmeister
Well, for that you can also use Special:ListProperties/url
<https://www.wikidata.org/wiki/Special:ListProperties/url> :)


On 08.06.2018 00:16, Thad Guidry wrote:
> Thanks Lucas !
>
> With that I was able to find and tweak another example ( since you
> showed me it's a wikibase parameter itself ! thanks so much !)
>
> This gives me my list of all the Properties that are of "wikibase:propertyType
> wikibase:Url"
>
> https://query.wikidata.org/#%23Properties%20of%20type%20URL%0A%23%20Make%20a%20list%20of%20properties%20of%20the%20type%20Url%0ASELECT%20%3Fproperty%20%3FpropertyLabel%20%3FpropertyDescription%20WHERE%20%7B%0A%20%20%3Fproperty%20wikibase%3ApropertyType%20wikibase%3AUrl%20.%0A%09SERVICE%20wikibase%3Alabel%20%7B%0A%09%09bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22%20.%0A%09%7D%20%20%20%20%20%20%20%20%20%20%0A%7D%20ORDER%20BY%20%3FpropertyLabel
>
>
> On Thu, Jun 7, 2018 at 4:28 PM Lucas Werkmeister
> mailto:m...@lucaswerkmeister.de>> wrote:
>
> The property type is available under the wikibase:propertyType
> predicate:
> 
> https://query.wikidata.org/#%23Subproperties%20of%20URL%20%28P2699%29%0ASELECT%20DISTINCT%20%3FsubProperties%20%3FsubPropertiesLabel%20%3FpropertyType%20WHERE%20%7B%0A%20%20%3FsubProperties%20wdt%3AP1647%2a%20wd%3AP2699%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20wikibase%3ApropertyType%20%3FpropertyType.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%7D
>
> Cheers, Lucas
>
>
> On 07.06.2018 17:01, Thad Guidry wrote:
>> 1. Ok, after 1 hour, I have given up trying to find some good
>> documentation on Data Type querying with Sparql and asking for
>> assistance from experts.
>>
>> Many properties have a Data Type locked in with URL as shown on
>> this Wiki maintenance page:
>> https://www.wikidata.org/wiki/Category:Properties_with_url-datatype
>>
>> However, I don't quite understand how to get at the Data Type
>> itself in Sparql ?
>> 
>> https://query.wikidata.org/#%23Subproperties%20of%20URL%20%28P2699%29%0ASELECT%20DISTINCT%20%3FsubProperties%20%3FsubPropertiesLabel%20WHERE%20%7B%0A%20%20%3FsubProperties%20wdt%3AP1647%2a%20wd%3AP2699.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%7D
>>
>> 2. Should equivalent property
>> <https://www.wikidata.org/wiki/Property:P1628> and equivalent
>> class <https://www.wikidata.org/wiki/Property:P1709> both be a
>> subproperty of URL (I have applied this assumption to both, but
>> wondering if I might have missed some other way of expressing
>> that or if not needed since the Data Type is set, but having
>> difficulty querying that in 1.)
>>
>> I'm looking to query subjects having a statement where the
>> property is a Data Type = URL and filtered by contains("world") 
>> (don't ask why, hahaha)
>>
>> Thanks in advance for direction and help,
>> -Thad
>>
>>
>> ___
>> Wikidata-tech mailing list
>> Wikidata-tech@lists.wikimedia.org
>> <mailto:Wikidata-tech@lists.wikimedia.org>
>> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> <mailto:Wikidata-tech@lists.wikimedia.org>
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>
>
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech



___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Query data type using WDQS

2018-06-07 Thread Lucas Werkmeister
The property type is available under the wikibase:propertyType
predicate:
https://query.wikidata.org/#%23Subproperties%20of%20URL%20%28P2699%29%0ASELECT%20DISTINCT%20%3FsubProperties%20%3FsubPropertiesLabel%20%3FpropertyType%20WHERE%20%7B%0A%20%20%3FsubProperties%20wdt%3AP1647%2a%20wd%3AP2699%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20wikibase%3ApropertyType%20%3FpropertyType.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%7D

Cheers, Lucas

On 07.06.2018 17:01, Thad Guidry wrote:
> 1. Ok, after 1 hour, I have given up trying to find good
> documentation on data type querying with SPARQL, and am asking the
> experts for assistance.
>
> Many properties have a Data Type locked in with URL as shown on this
> Wiki maintenance page:
> https://www.wikidata.org/wiki/Category:Properties_with_url-datatype
>
> However, I don't quite understand how to get at the Data Type itself
> in Sparql ?
> https://query.wikidata.org/#%23Subproperties%20of%20URL%20%28P2699%29%0ASELECT%20DISTINCT%20%3FsubProperties%20%3FsubPropertiesLabel%20WHERE%20%7B%0A%20%20%3FsubProperties%20wdt%3AP1647%2a%20wd%3AP2699.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%7D
>
> 2. Should equivalent property
> <https://www.wikidata.org/wiki/Property:P1628> and equivalent class
> <https://www.wikidata.org/wiki/Property:P1709> both be subproperties
> of URL? I have applied this assumption to both, but I wonder whether I
> missed some other way of expressing that, or whether it is unnecessary
> since the data type is already set; I'm having difficulty querying that in 1.
>
> I'm looking to query for subjects having a statement whose property
> has data type URL, filtered by contains("world") (don't ask why,
> hahaha).
>
> Thanks in advance for direction and help,
> -Thad
>
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech

___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Fastest way (API or whatever) to verify a QID

2018-05-15 Thread Lucas Werkmeister
If you have the luxury of access to the replica servers (which is the
case for the reconciliation service, right?), then I doubt any API is
going to beat a raw SQL query:

$ sql wikidata 'SELECT page_title FROM page WHERE page_namespace = 0 AND
page_title IN ("Q12345", "Q123456", "Q1234567")'
+------------+
| page_title |
+------------+
| Q12345     |
| Q123456    |
+------------+

This shows that Q1234567 doesn’t exist. (Someone with more SQL knowledge
could probably turn that into a “yes/no” kind of reply, too.)
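
For example, something like the following might do it (the derived table of
QIDs is illustrative; only the standard page table is assumed):

$ sql wikidata 'SELECT qids.qid, page_id IS NOT NULL AS still_exists
FROM (SELECT "Q12345" AS qid UNION ALL SELECT "Q123456"
      UNION ALL SELECT "Q1234567") AS qids
LEFT JOIN page ON page_namespace = 0 AND page_title = qids.qid'
+----------+--------------+
| qid      | still_exists |
+----------+--------------+
| Q12345   |            1 |
| Q123456  |            1 |
| Q1234567 |            0 |
+----------+--------------+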

Cheers,
Lucas

On 15.05.2018 17:49, Thad Guidry wrote:
> Which service is the fastest at returning whether a QID is valid?
> 
> OpenRefine is trying to improve its reconciliation for folks who already
> have lists of QIDs and just want to ensure they are still valid and exist.
> https://github.com/OpenRefine/OpenRefine/issues/1596
> 
> -Thad
> 
> 
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
> 

___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] BREAKING CHANGE: wbcheckconstraints status parameter

2018-01-29 Thread Lucas Werkmeister
Hi all!

This is an announcement for a breaking change to the default value of a
parameter of the WikibaseQualityConstraints constraint checking API, to go
live on 26 February 2018. It potentially affects clients that use the
*wbcheckconstraints* API action. (We are not aware of any such clients
apart from the *checkConstraints* gadget, which is not affected.)

Recently, we added a status parameter to the *wbcheckconstraints* API
action, with the intention that API users can declare ahead of time which
results they’re actually interested in, so that other results don’t need to
be sent to them: specifically, for most items the vast majority of results
indicate compliance with a constraint, which we expect most users aren’t
interested in.

*On 26 February 2018, we will change the default value of the status
parameter to violation|warning|bad-parameters.* We assume that most users
of the API will only be interested in results that actually indicate
problems, and this should significantly reduce the size of API responses.
Users who wish to receive all results, regardless of status, should specify
status=* in their API requests.
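
For illustration, a minimal sketch of a client that keeps the current
behaviour (a gadget context with mw.Api is assumed; the item ID is just an
example):

// Explicitly request all constraint check results, regardless of status;
// omitting the parameter only returns problematic results after the change.
new mw.Api().get( {
    action: 'wbcheckconstraints',
    id: 'Q42',
    status: '*'
} ).then( function ( response ) {
    console.log( response.wbcheckconstraints );
} );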

Our motivation for this change is that we want to enable caching of
constraint check results, but don’t want to bloat the cache with tons of
compliance and not-in-scope results that we don’t even show in the gadget.
With the status parameter, we can store only problematic results in the
cache, while still guaranteeing that the response we send is complete,
since the request indicated that it only needs these results anyway. This
also means that when we enable caching (see phabricator:T184812
<https://phabricator.wikimedia.org/T184812>), only requests with
status=violation|warning|bad-parameters will benefit from it.

Please let us know if you have any questions.

-- Lucas

Relevant tickets:

   - phabricator:T183927 <https://phabricator.wikimedia.org/T183927>
   - phabricator:T184812 <https://phabricator.wikimedia.org/T184812>
   - phabricator:T184937 <https://phabricator.wikimedia.org/T184937>

-- 
Lucas Werkmeister
Software Developer (Intern)

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world, in which every single human being can freely share in the
sum of all knowledge. That‘s our commitment.

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


Re: [Wikidata-tech] Order of claims on entity page

2017-11-29 Thread Lucas Werkmeister
As far as I can tell, this is not possible via the API directly. However,
you can get the order of properties from MediaWiki:Wikibase-SortedProperties
<https://www.wikidata.org/wiki/MediaWiki:Wikibase-SortedProperties>, and
sort the response you get accordingly. If you’re using Lua, there are also
the mw.wikibase.getPropertyOrder()
<https://www.mediawiki.org/wiki/Extension:Wikibase_Client/Lua#mw.wikibase.getPropertyOrder>
and mw.wikibase.orderProperties( propertyIds )
<https://www.mediawiki.org/wiki/Extension:Wikibase_Client/Lua#mw.wikibase.orderProperties>
functions.
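
For the non-Lua case, a minimal sketch (a gadget context with mw.Api is
assumed, and the exact list format of the page is treated loosely: any
property ID found on it counts):

// Sort the property IDs of a "claims" map (as returned by wbgetentities)
// by the order defined on MediaWiki:Wikibase-SortedProperties.
function sortClaimProperties( claims ) {
    return new mw.Api().get( {
        action: 'query',
        titles: 'MediaWiki:Wikibase-SortedProperties',
        prop: 'revisions',
        rvprop: 'content',
        formatversion: 2
    } ).then( function ( response ) {
        var wikitext = response.query.pages[ 0 ].revisions[ 0 ].content,
            order = {};
        ( wikitext.match( /P\d+/g ) || [] ).forEach( function ( pid, i ) {
            order[ pid ] = i;
        } );
        // Properties missing from the page sort last.
        return Object.keys( claims ).sort( function ( a, b ) {
            var ia = a in order ? order[ a ] : Infinity,
                ib = b in order ? order[ b ] : Infinity;
            return ia - ib;
        } );
    } );
}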

2017-11-28 19:22 GMT+02:00 Vladimir Ryabtsev :

> I find the order of claims on the website useful, but when requesting entity
> data through the API (action=wbgetentities) it gets lost.
> How are the claims ordered on an entity web page, and how can I restore that
> order in the API response?
>
> --
> Vlad
>
>
> ___
> Wikidata-tech mailing list
> Wikidata-tech@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>
>


-- 
Lucas Werkmeister
Software Developer (Intern)

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world, in which every single human being can freely share in the
sum of all knowledge. That‘s our commitment.

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] BREAKING CHANGE: wbcheckconstraints detail output

2017-11-20 Thread Lucas Werkmeister
Hi all!

This is an announcement for a breaking change to the output of the
WikibaseQualityConstraints constraint checking API, to go live on 18
December 2017. It potentially affects clients that use the
*wbcheckconstraints* API action. (We are not aware of any such clients
apart from the *checkConstraints* gadget, which is not affected.)

Currently, the description of a constraint in the API response includes the
detail and detailHTML fields, which contain the constraint parameters. The
gadget has never used these fields, since the error messages have for some
time now contained all the information needed to understand the constraint
violation (that is, the constraint parameters are part of the message where
necessary). Additionally, since the move from the {{Constraint
<https://www.wikidata.org/wiki/Template:Constraint>}} templates to
constraint statements on properties (using property constraint (P2302)
<https://www.wikidata.org/wiki/Property:P2302>), parsing constraint
parameters is no longer the complex task it once was, and consumers
interested in the constraint parameters can inspect the constraint
statements using the standard Wikibase APIs or the Wikidata Query Service.

Since these two fields can account for up to 40% of the *wbcheckconstraints*
API response size, and we want to start caching those responses soon, *we
will remove the detail and detailHTML fields on 18 December 2017.* This is
already in effect on the Wikidata constraints test system
<https://wikidata-constraints.wmflabs.org/>; you can test your tools or
other code there.
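
If your code still reads these fields, a defensive sketch (the shape of the
individual check result, with the constraint description under a
"constraint" key, is an assumption here):

// Returns the constraint parameters while the API still sends them;
// undefined once the fields are removed on 18 December 2017.
function constraintDetail( result ) {
    return result.constraint && result.constraint.detail;
}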

Please let us know if you have any comments or objections. -- Lucas

Relevant tickets:

   - phab:T180614 <https://phabricator.wikimedia.org/T180614>

Relevant patches:

   - gerrit:391864 <https://gerrit.wikimedia.org/r/391864>

-- 
Lucas Werkmeister
Software Developer (Intern)

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world, in which every single human being can freely share in the
sum of all knowledge. That‘s our commitment.

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] Gadget / userscript editor feedback wanted

2017-11-03 Thread Lucas Werkmeister
Hi everyone! I’m considering making a change to the HTML output for
statements, and I’d like to gather some feedback from people who work on
gadgets and user scripts :)

The problem is that any gadget that appends to the value of a statement
(e. g. *checkConstraints*) changes the HTML in a way that Wikibase’s own
JavaScript doesn’t expect, and sometimes the appended elements bleed into
Wikibase’s own elements, causing e. g. phab:T167869
<https://phabricator.wikimedia.org/T167869> and phab:T169866
<https://phabricator.wikimedia.org/T169866>. (To clarify: the tasks mention
*checkConstraints* specifically, but at least the first task also affects
other gadgets.)

My proposed solution is to change the layout of the HTML slightly, from the
current

<div class="wikibase-snakview-value">
  ...
</div>

into the following:

<div class="wikibase-snakview-value-container">
  <div class="wikibase-snakview-indicators">...</div>
  <span class="wikibase-snakview-value">
    ...
  </span>
</div>

There is a new element for “indicators” on a snak, and the value itself is
now a span inside a new div wrapper. The indicators are hidden while the
statement is edited, and cleared on save. Gadgets can subscribe to the
wikibase.statement.saved hook to populate the indicators again after a
statement has been saved, using the new value. A simple example gadget
using this technique is at User:Lucas Werkmeister (WMDE)/colorIndicator.js
<https://www.wikidata.org/wiki/User:Lucas_Werkmeister_(WMDE)/colorIndicator.js>.
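
For instance, a minimal sketch of such a gadget (the hook name is the one
proposed above; the indicators class name and the absence of hook arguments
are assumptions here, see colorIndicator.js for a real example):

mw.hook( 'wikibase.statement.saved' ).add( function () {
    // The indicators were cleared on save; re-add this gadget's indicator
    // to every snak's indicators element.
    $( '.wikibase-snakview-indicators' ).append(
        $( '<span>' ).addClass( 'my-gadget-indicator' ).text( '★' )
    );
} );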

Here’s a list of some gadgets and user scripts I’m aware of that could use
this “indicators” area:

   - checkConstraints
   <https://www.wikidata.org/wiki/MediaWiki:Gadget-checkConstraints.js>
   - EasyQuery <https://www.wikidata.org/wiki/MediaWiki:Gadget-EasyQuery.js>
   - User:Aude/mapview.js
   <https://www.wikidata.org/wiki/User:Aude/mapview.js>
   - User:Lucas Werkmeister (WMDE)/colorIndicator.js
   <https://www.wikidata.org/wiki/User:Lucas_Werkmeister_(WMDE)/colorIndicator.js>
   (mentioned above)

Existing gadgets that use something like $( '.wikibase-snakview-value'
).append( … ) will continue to work, though they could be changed to append
to the indicators instead. However, gadgets that select something like
div.wikibase-snakview-value will break, since the element is no longer a div.

Do you see any problems with this new HTML layout, or do you want to
suggest any improvements? (You can reply to this email, or on the Project
chat
<https://www.wikidata.org/wiki/Wikidata:Project_chat#Gadget_.2F_userscript_editor_feedback_wanted>,
or comment on phab:T95403 <https://phabricator.wikimedia.org/T95403>.)

Cheers,
Lucas

-- 
Lucas Werkmeister
Software Developer (Intern)

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world, in which every single human being can freely share in the
sum of all knowledge. That‘s our commitment.

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] BREAKING CHANGE: wbcheckconstraints API output format

2017-09-14 Thread Lucas Werkmeister
Hi all!

This is an announcement for a breaking change to the output format of the
WikibaseQualityConstraints constraint checking API, to go live on 10
October 2017. It affects all clients that use the *wbcheckconstraints* API
action. (We are not aware of any such clients apart from the
*checkConstraints* gadget, which has been adapted.)

We are soon going to check constraints not just on the main snak of a
statement, but also on qualifiers and references (T168532
<https://phabricator.wikimedia.org/T168532>). However, the current API
output format of the *wbcheckconstraints* API action cannot accommodate any
other constraint check results. To resolve this issue, we are introducing a
new, more flexible output format for the API, which can contain constraint
check results on all kinds of snaks and also leaves room for future
expansion (e. g. for T168626 <https://phabricator.wikimedia.org/T168626>).
The new format is based on the Wikibase JSON format, and documented (along
with the old format) on mw:Wikibase/API#wbcheckconstraints
<https://www.mediawiki.org/wiki/Wikibase/API#wbcheckconstraints>.

If you use the *wbcheckconstraints* API action in your tools, the safest
option is to make them support both output formats for the transitional
period. It’s easy to determine which format the API returned, because the
new format contains the fixed key "claims" on the second level, which never
occurs in the old format. You can see an example of this for the
*checkConstraints* gadget in change I99379a96cd
<https://gerrit.wikimedia.org/r/#/c/373323/>, specifically the new
extractResultsForStatement function.
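
For illustration, a sketch of that check (assuming the response has already
been unwrapped to the value of its "wbcheckconstraints" key, a map from
entity IDs to per-entity results):

// True if the response uses the new output format: the fixed key "claims"
// appears on the second level, which never occurs in the old format.
function isNewFormat( wbcheckconstraints ) {
    return Object.keys( wbcheckconstraints ).some( function ( entityId ) {
        return 'claims' in wbcheckconstraints[ entityId ];
    } );
}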

The new API output format is already enabled on the Wikidata constraints
test system <https://wikidata-constraints.wmflabs.org/>. You can test your
tools or other code there.

Please let us know if you have any comments or objections.

-- Lucas

Relevant tickets:

   - T168532 <https://phabricator.wikimedia.org/T168532>
   - T174544 <https://phabricator.wikimedia.org/T174544>

Relevant patches:

   - https://gerrit.wikimedia.org/r/#/c/369420
   - https://gerrit.wikimedia.org/r/#/c/373323/

-- 
Lucas Werkmeister
Software Developer (Intern)

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world, in which every single human being can freely share in the
sum of all knowledge. That‘s our commitment.

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


[Wikidata-tech] Significant change: Snak hashes in API and HTML output formats

2017-09-11 Thread Lucas Werkmeister
Hi all!

This is an announcement for a significant change to the Wikibase entity
format, which went live at the beginning of September. It potentially affects
clients that process snaks
<https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:Glossary#Snak>.

Internally, Wikibase assigns a *hash* to each snak (which is just a hash
function (Q183427) <https://www.wikidata.org/wiki/Q183427> applied to an
internal representation of the snak). Those hashes were previously emitted
for snaks
that appeared in qualifiers, but not for the main snak or reference snaks
of a statement. With the change, the hashes are emitted for all snaks,
regardless of where they appear. This means that a snak can now look like
this:

{
    "snaktype": "value",
    "property": "P370",
    "hash": "682fdb448ef68669a1b728a5076836da9ac3ffae",
    "datavalue": {
        "value": "some text",
        "type": "string"
    },
    "datatype": "string"
}

The hashes are also added to the HTML output, as an additional class
similar to the statement ID class on statements:

<div class="wikibase-statementview wikibase-statement-...">
  ...
  <div class="wikibase-snakview wikibase-snakview-682fdb448ef68669a1b728a5076836da9ac3ffae">
    ...
  </div>
</div>

The ultimate goal of this is to make any snak addressable in the DOM, which
is necessary for checking constraints on qualifiers and references (T168532
<https://phabricator.wikimedia.org/T168532>).
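
For instance, with the new class a snak could be selected directly in a
gadget (the class name pattern follows the example above and is otherwise
an assumption):

var $snak = $( '.wikibase-snakview-682fdb448ef68669a1b728a5076836da9ac3ffae' );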

It should be noted that unlike statement IDs, snak hashes are not
identifiers. They are not stable, and may change at any time when the
internal format changes.

Please let us know if you have any comments or objections.

-- Lucas

Relevant tickets:

   - T171607 <https://phabricator.wikimedia.org/T171607>
   - T171725 <https://phabricator.wikimedia.org/T171725>

Relevant patches:

   - https://github.com/wmde/WikibaseDataModelSerialization/pull/233
   - https://gerrit.wikimedia.org/r/#/c/374835/

-- 
Lucas Werkmeister
Software Developer (Intern)

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de

Imagine a world, in which every single human being can freely share in the
sum of all knowledge. That‘s our commitment.

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech