nsions/Kartographer>, I
> believe.
>
> I’m not aware of any extensions that add new datatypes and are
> specifically intended to be used as examples or building blocks for your
> own extensions.
>
> Cheers,
> Lucas
> On 30.05.20 17:52, Yuri Astrakhan wrote:
>
> Hi,
Hi, I would like to implement a new property type for my project. Are there
any examples of extensions that add new property types to Wikibase?
I already implemented most of what I need by changing Wikibase code, but I
doubt a property to store multiline code snippets will be accepted into
Wikibase at
There have been a number of discussions about translations. At the moment,
the whole situation is very similar to the original interwiki (sitelink)
issue -- a lexeme in each language has to point to corresponding lexemes in
all other languages. This issue was actually what started Wikidata in the
fi
2019 at 8:38 PM Kingsley Idehen
wrote:
> On 5/31/19 11:28 AM, Yuri Astrakhan wrote:
>
> I actually already implemented support in SPARQL for that, but it needs a
> bit more work to get it properly merged with the Blazegraph code. I had it
> working for a while as part of Sophox
I actually already implemented support in SPARQL for that, but it needs a
bit more work to get it properly merged with the Blazegraph code. I had it
working for a while as part of Sophox (OSM Sparql).
* docs: https://wiki.openstreetmap.org/wiki/Sophox#External_Data_Sources
* code:
https://github
Sounds interesting, is there a github repo?
On Fri, May 3, 2019 at 8:19 PM Amirouche Boubekki <
amirouche.boube...@gmail.com> wrote:
> GerardM's post triggered my interest in posting to the mailing list. As you
> might know, I am working on a functional quadstore, that is, a quadstore that
> keeps around old
Daniel,
> P and Q indicate the *type* of the entity ("P" = "Property", "Q" = "Item"
> for arcane reasons; "L" = Lexeme, "F" = Form, "S" = Sense, "M" = MediaInfo).
> As you can tell, we'd quickly run out of letters and cause confusion if this
> became configurable.
>
I don't think this woul
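The letter-to-type scheme quoted above can be sketched as a simple lookup. This is only an illustration; the mapping comes from the quoted text, while the function and dict names are mine:

```python
# Map a Wikibase entity ID to its entity type by prefix letter,
# per the scheme quoted above ("P" = Property, "Q" = Item, etc.).
ENTITY_TYPES = {
    "P": "property",
    "Q": "item",
    "L": "lexeme",
    "F": "form",      # appears as a suffix, e.g. L1-F2
    "S": "sense",     # appears as a suffix, e.g. L1-S3
    "M": "mediainfo",
}

def entity_type(entity_id: str) -> str:
    """Return the entity type for an ID like 'Q42' or 'L1-F2'."""
    # Forms and senses are addressed as suffixes of a lexeme ID.
    last_part = entity_id.split("-")[-1]
    return ENTITY_TYPES[last_part[0]]

print(entity_type("Q42"))    # item
print(entity_type("L1-F2"))  # form
```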
On Thu, Nov 29, 2018 at 1:03 PM Daniel Kinzler
wrote:
> This doesn't fix the hard-coded prefix in the RDF output generated by
> Wikibase.
>
> See my previous email - my patch fixes that too. Here's an example query
http://tinyurl.com/yav76uof in Sophox -- it calls out to Wikidata to get a
list o
Olaf, Andra, Lydia,
On Thu, Nov 29, 2018 at 4:01 AM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
> Are we talking about https://phabricator.wikimedia.org/T194180? I'm
> happy to push that into one of the next sprints if so.
>
> I think my patch from yesterday fixes this issue on the server
On Thu, Nov 29, 2018 at 12:51 AM Federico Leva (Nemo)
wrote:
> Yuri Astrakhan, 29/11/18 04:14:
> > The "Q" prefix has a strong identity in itself. Anyone will instantly
> > say - yes, it's a Wikidata identifier
>
> But that's because most people only
Daniel, it is not so clear cut. Most users will not be exposed to a
"zoo". Case in point - OpenStreetMap. In OSM, the entire user base of
tens of thousands of people know the meaning of Q123. The "Q" prefix has a
strong identity in itself. Anyone will instantly say - yes, it's a
Wikidata iden
dlessly confusing -- but currently I think
> this is not possible.
>
>-- James
>
> On 28/11/2018 16:32, Yuri Astrakhan wrote:
> > I would add another very important aspect - query prefixes - to build some
> > cohesion within the Wikibase community.
> >
>
I would add another very important aspect - query prefixes - to build some
cohesion within the Wikibase community.
Currently, WDQS hardcodes prefixes like "wd:" and "wdt:" to be based on the
"conceptUri" parameter, which means that any Wikibase installation that
has its own data would still use well-
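A minimal sketch of how per-instance prefixes could be derived from a `conceptUri` base instead of being hard-coded. The URI layout here mirrors Wikidata's (`entity/`, `prop/direct/`); the function name and the assumption that `conceptUri` ends in `entity/` are mine, not the actual WDQS code:

```python
# Build SPARQL PREFIX declarations from a Wikibase "conceptUri" base,
# mirroring the wd:/wdt:/p: layout that Wikidata uses. Illustrative sketch.
def build_prefixes(concept_uri: str) -> str:
    # Assumption: concept_uri ends with ".../entity/".
    base = concept_uri.rsplit("entity/", 1)[0]
    prefixes = {
        "wd": concept_uri,             # entities
        "wdt": base + "prop/direct/",  # truthy direct claims
        "p": base + "prop/",           # full statements
    }
    return "\n".join(
        f"PREFIX {name}: <{uri}>" for name, uri in prefixes.items()
    )

print(build_prefixes("http://my.wikibase.example/entity/"))
```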
There is a difference between name "translation" and "transliteration".
Place translation should always take precedence (e.g. Köln vs. Cologne).
This mostly applies to cities/countries, but not street-level naming.
Transliterations are trickier. Should we simply transliterate everything
into
Amir, importing data from Wikidata to OSM has been discussed a number of
times. There are a number of active OSM community members who strongly
oppose it because they feel Wikidata is not sufficiently safe from the
legal perspective. E.g. Wikipedia allows users to look up things in Google
Map
Seems like they simply store it as wiki markup -
https://ballotpedia.org/wiki/index.php?title=Marco+Rubio&action=edit,
unless they generate it from some other internal database.
On Mon, Mar 12, 2018 at 8:10 PM, Stas Malyshev
wrote:
> Hi!
>
> > Something I wish was available is the voting record,
Something I wish was available is the voting record, at least at a
country/state level. Knowing the politician's time in office is a great
start, but how that person voted is what really makes democracy work.
On Sun, Mar 11, 2018 at 5:16 AM, Gerard Meijssen
wrote:
> Hoi,
> For the majority of U
Thanks Stas. How does this affect non-WMF clones of Wikidata QS?
On Tue, Mar 6, 2018 at 2:51 PM, Stas Malyshev
wrote:
> Hi!
>
> This morning we have switched the polling mechanism for Wikidata Query
> Service from using Recent Changes API to using Kafka events
> (https://wikitech.wikimedia.org/w
P.S. Is there a list of values we want to introduce with the well-known
numbers?
E.g. peace - L1
On Wed, Mar 7, 2018 at 12:04 PM, Yuri Astrakhan
wrote:
> Awesome news, congratulations!
>
> See live demo at https://wikidata-lexeme.wmflabs.org/
>
> On Wed, Mar 7, 2018 a
Awesome news, congratulations!
See live demo at https://wikidata-lexeme.wmflabs.org/
On Wed, Mar 7, 2018 at 11:49 AM, Léa Lacroix
wrote:
> Hello all,
>
> First version of Lexicographical Data will be released in April. You can
> read the detailed announcement here: https://www.wikidata.org/wik
> qualifier of a traditional property?
>
> I find the feature very promising, but for now it is still in its
> infancy. I don't see how I could use it for edits like this one:
>
> https://www.wikidata.org/w/index.php?title=Q37461404&diff=578074181&oldid=578071885
>
> when you say "wikidata is not well suited for lists data", you refer
> to wikibase or WDQS here?
>
Wikibase, per Daniel K.
>
> the data:Bea.gov/GDP by state.tab above is certainly a good
> representation for efficient delivery (via json) and display of data.
> but inefficient for further data
There is a better alternative to storing lists -
https://www.mediawiki.org/wiki/Help:Tabular_Data -- it allows you to store
a CSV-like table of data on Commons, with localized columns, and access it
from all other wikis via Lua scripts.
A good example of it -- "per state GDP" page -- se
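For reference, a `.tab` page stores JSON with a `schema.fields` list and a `data` array of rows (per the Help page linked above). A minimal sketch of turning that into row dicts; the sample payload below is invented for illustration:

```python
import json

# A made-up payload in the Commons tabular-data (.tab) shape:
# a schema listing fields, and data as an array of rows.
sample = json.loads("""
{
  "schema": {"fields": [
    {"name": "state", "type": "string"},
    {"name": "gdp", "type": "number"}
  ]},
  "data": [["Alaska", 55000], ["Ohio", 650000]]
}
""")

def rows(tab: dict) -> list[dict]:
    """Zip each data row with the schema's field names."""
    names = [f["name"] for f in tab["schema"]["fields"]]
    return [dict(zip(names, row)) for row in tab["data"]]

for row in rows(sample):
    print(row)
```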
I would like to propose that we add each sitelink's popularity to WDQS.
This would allow queries that order results by wiki article popularity. For
example, this query lists Wikidata items without French labels but with
French articles, ordered by how often they get viewed on frwiki.
http://tin
For consistency between all possible clients, we seem to have only two
options: either part of the query, or the X-Analytics header. The
user-agent header is not really an option because it is not available for
all types of clients, and we want to have just one way for everyone.
Headers other th
I would highly recommend using the X-Analytics header for this, and
establishing "well-known" key name(s). X-Analytics gets parsed into
key-value pairs (object field) by our varnish/hadoop infrastructure,
whereas the user agent is basically a semi-free form text string. Also,
user agent cannot be set
I guess I qualify for #2 several times:
* The maplink & mapframe tags support access to the geoshapes service, which
in turn can make requests to WDQS. For example, see
https://en.wikipedia.org/wiki/User:Yurik/maplink (click on "governor's
link")
* The wiki graph tag supports the same geoshapes service, as well as
direct
You might also use page views for the fame estimates. E.g. US election
candidate pageviews:
https://meta.wikimedia.org/wiki/User:Yurik/US_Politics_Real_Time
On Wed, Aug 17, 2016, 11:42 Felipe Hoffa wrote:
> I've been playing with Wikipedia (to extract list of links), Wikidata (to
> enrich), Wiki
Erika, would building a better Wikidata UI help alleviate your concern?
For example, it used to be that to add a link to the same article in
another language, one had to edit raw wiki markup and add a weird language
link. Now with Wikidata it is far more intuitive, with an edit button
right next
Jane, now we are really going into the field of Elasticsearch's relevancy
calculation. When searching, things like popularity (pageviews), incoming
links, number of different language wiki articles, article size, article
quality (good/selected), and many other aspects could be used to better the
r
Is there a way we could have more than just the number of language links?
E.g. the number of incoming links from other Wikipedia pages?
On Aug 2, 2016 10:41 PM, "Markus Kroetzsch"
wrote:
> On 02.08.2016 20:59, Daniel Kinzler wrote:
>
>> Am 02.08.2016 um 20:19 schrieb Markus Kroetzsch:
>>
>>> Oh, there
Any person in Wikidata is "famous" - otherwise they wouldn't be notable and
therefore wouldn't be there))
If you prefer the stricter notability requirement (as used by Wikipedia),
search only for those that have a Wikipedia page
On Aug 2, 2016 1:44 PM, "Ghislain ATEMEZING"
wrote:
> Ahoy,
> I am c
https://commons.wikimedia.org/w/api.php?action=query&titles=File:Python-Foot.png&prop=imageinfo&iiprop=url&iiurlwidth=100
On Wed, Apr 6, 2016 at 1:29 AM, wrote:
> From the following image URL returned from a SPARQL query, what would be
> the best way to generate a thumbnail 100 pixels wide?
> h
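The request above can also be built programmatically. A sketch using the same imageinfo parameters as the URL above, plus `format=json` for a machine-readable response (the function name is mine):

```python
from urllib.parse import urlencode

# Build a Commons API request that returns a thumbnail URL of the
# requested width for a file, matching the imageinfo query shown above.
def thumbnail_query_url(filename: str, width: int = 100) -> str:
    params = {
        "action": "query",
        "titles": f"File:{filename}",
        "prop": "imageinfo",
        "iiprop": "url",
        "iiurlwidth": width,   # server-side thumbnail width in pixels
        "format": "json",
    }
    return "https://commons.wikimedia.org/w/api.php?" + urlencode(params)

print(thumbnail_query_url("Python-Foot.png"))
```

The response's `imageinfo[0].thumburl` field then holds the 100px thumbnail URL.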
>> On 14.02.2016 15:11, Jane Darnell wrote:
>>
>>> Wow Hay, this is super useful
>>>
>>> On Sun, Feb 14, 2016 at 8:50 AM, Hay (Husky) >> <mailto:hus...@gmail.com>> wrote:
>>>
>>> Awesome, thanks! :)
>>>
>
Well done! Absolutely love it! I'm already using it to build SPARQL
queries for the Wikidata visualizations [1].
[1]: http://en.wikipedia.beta.wmflabs.org/wiki/Sparql
On Sun, Feb 14, 2016 at 2:44 PM, Hay (Husky) wrote:
> Hey everyone,
> it seems we're getting new properties every day. Currentl