Hi Jean-Baptiste,
by default, suitable catalogs are re-scraped once every three months. I
will see if I can change it to one month in this case.
I have also manually started auto-scraping from this page:
https://mix-n-match.toolforge.org/#/jobs/1679
so the catalog should be updated soon.
Hi all,
I think toolhub is great!
I wrote a blog post about integrating some tools of mine that are missing
from the hub right now:
http://magnusmanske.de/wordpress/?p=658
Cheers,
Magnus
On Fri, Oct 15, 2021 at 10:12 PM Bryan Davis wrote:
> Leila Zia wrote:
> > Hi Bodhisattwa,
> >
> > See
Or, better yet, why can't the
> tool be taught not to overwrite human edits (particularly reverts)? To be
> fair, the original confusion is understandable, because the two
> entries/gentlemen are very confusable.
>
> Tom
>
> On Wed, May 6, 2020 at 4:28 AM Magnus Manske
> Thank you!
>
> Best,
> Tuomas / National Library of Finland
> ----------
> *From:* Wikidata, on behalf of
> Magnus Manske
> *Sent:* Wednesday, 6 May 2020 11:00
> *To:* Discussion list for the Wikidata project <
Hi,
I am the author of Mix'n'match, so I hope I can answer your questions.
Match mode:
By default, "match mode" only shows unmatched entries, example:
https://tools.wmflabs.org/mix-n-match/#/random/473
You can force pre-matched entries, but currently they won't show the
automatic predictions:
Hi,
I can't speak to the wikibase capabilities directly, but QS via API will
always take a bit of time.
One could adapt my Rust core of QuickStatements [1], which also comes with
an (experimental but quite advanced) QS syntax parser, generate the JSON
for each item, and manually insert it into
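The JSON-generation step could be sketched like this (a minimal, hypothetical illustration of the standard Wikibase item serialization; `build_item_json` is not part of QuickStatements or its Rust core):

```python
import json

def build_item_json(label_en, description_en):
    # Minimal Wikibase-style item structure: labels and descriptions are
    # keyed by language code; claims would be added per statement.
    return {
        "type": "item",
        "labels": {"en": {"language": "en", "value": label_en}},
        "descriptions": {"en": {"language": "en", "value": description_en}},
        "claims": {},
    }

item = build_item_json("Douglas Adams", "English writer")
payload = json.dumps(item)  # this string is what an edit call would carry
```

Generating the JSON locally and inserting it in bulk avoids one API round-trip per statement, which is where most of the QS-via-API time goes.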
Hi Olaf,
I don't think there is a way with the current code.
I am working on a Rust parser (instead of PHP) that could be extended more
easily.
Cheers,
Magnus
On Fri, Mar 20, 2020 at 2:03 PM Olaf Simons
wrote:
> A questions for those who have created Wikibase installations
>
> When creating
Well done!
On Tue, Feb 11, 2020 at 11:07 AM Max Klemm wrote:
> Hello all,
>
> While cleaning (reviewing and rewriting) the code of Wikidata and
> Wikimedia Commons backend in October 2019, the Wikidata team at WMDE
> together with WMF worked on reducing the loading time of pages. We managed
>
Hi,
if you need some "Wikibase item diff" function, have a look at the Rust
crate I am co-authoring:
https://gitlab.com/tobias47n9e/wikibase_rs
It comes with diff code:
https://gitlab.com/tobias47n9e/wikibase_rs/-/blob/master/src/entity_diff.rs
Should not be too hard to build eg a simple diff
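A toy version of such a diff, in Python rather than Rust (an illustrative sketch, not the algorithm used in entity_diff.rs):

```python
def diff_labels(old_item, new_item):
    # Compare the "labels" maps of two Wikibase-style item dicts and
    # report added, removed, and changed labels per language code.
    old = {k: v["value"] for k, v in old_item.get("labels", {}).items()}
    new = {k: v["value"] for k, v in new_item.get("labels", {}).items()}
    added = {k: new[k] for k in new.keys() - old.keys()}
    removed = {k: old[k] for k in old.keys() - new.keys()}
    changed = {k: (old[k], new[k])
               for k in old.keys() & new.keys() if old[k] != new[k]}
    return {"added": added, "removed": removed, "changed": changed}

a = {"labels": {"en": {"language": "en", "value": "cat"}}}
b = {"labels": {"en": {"language": "en", "value": "Cat"},
                "de": {"language": "de", "value": "Katze"}}}
```

The same pattern extends to descriptions, aliases, and (with more care about snak equality) claims.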
Thank you!
On Fri, Jul 5, 2019 at 10:40 AM Alaa Sarhan
wrote:
> Hello all,
>
> This is an update regarding the progress and dates of migration of wb_terms
> table replacement solution in Wikidata production environment.
>
> We have successfully put Wikidata production in the stage where
Thank you!
On Fri, Jul 5, 2019 at 10:40 AM Alaa Sarhan
wrote:
> Hello all,
>
> This is an update regarding the progress and dates of migration of wb_terms
> table replacement solution in Wikidata production environment.
>
> We have successfully put Wikidata production in the stage where
On Tue, Mar 12, 2019 at 6:36 PM Nicolas VIGNERON
wrote:
> Hi,
>
> I'm not sure where the error comes from.
> It doesn't come from the source, Trismegistos doesn't say that this person
> is born in 1999 (1999 is a publication date here, at least in the
> interface).
> I'm not even sure it comes
Update: dawiki category "Personer" seems to have some category-tree cycles
at greater depths. Here are articles from that category (one layer deep)
with no P31 in the item:
http://petscan.wmflabs.org/?psid=8128462
On Tue, Mar 5, 2019 at 1:00 PM Magnus Manske
wrote:
> If you ma
If you make the gender optional, you also get the items without gender:
http://tinyurl.com/yygze9da
"People on Danish Wikipedia but not on Wikidata" is either:
* a subset of "Danish Wikipedia articles not on Wikidata". You can get all
of these (currently, 91) via my tool:
Nice!
On Tue, Dec 4, 2018 at 11:43 AM Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:
> Users of the Wikibase API who need to format many entity IDs (e. g. in
> QuickStatements, Wikidata Graph Builder, Wikidata Recent Changes, or
> Wikidata Reconciliation) can now use a new API module
Hi, and thanks for working on this!
My subjective view:
* We don't need P2860/P1433 indexed, at least not at the moment
* I would really like dates (mainly, born/died), especially if they work
for "greater units", that is, I search for a year and get an item back,
even though the statement is
involved, and doesn't add most metadata, but it helps to avoid duplicate
items.
On Fri, May 25, 2018 at 10:33 AM Andreas Dittrich <dittri...@gmail.com>
wrote:
> @Andra Waagmeester: Thank you for your hint to the wikicite-community: I
> got some useful answers there.
>
You could make a catalog of all her works in Mix'n'match [1] (import at
[2]), to avoid creating duplicate items.
FWIW, this is what we have so far for her:
http://tinyurl.com/yc8tujs5
[1] https://tools.wmflabs.org/mix-n-match/
[2] https://tools.wmflabs.org/mix-n-match/import.php
On Thu, May
Hi Stas,
while you are at it, some things would be very useful to be search-able
(maybe some are already by now):
* "primary" (not references/qualifiers) years, for birth/death/flourit etc.
* "primary" string/monolingual values (title, taxon name, etc.)
* "primary" IDs, e.g. VIAF (might cause
Library**Specialist II, Photography
>> Collection*
>>
>> *Photographers' Identities Catalog <http://pic.nypl.org>*
>>
>> On Mon, Oct 16, 2017 at 5:26 PM, Magnus Manske <
>> magnusman...@googlemail.com> wrote:
>>
>>> Hi David,
>>>
&
Hi David,
the upload page at
https://tools.wmflabs.org/mix-n-match/import.php
won't take your matches, but they can be imported from Wikidata with a
click.
If the upload is too big for the page, mail me the file and I'll do it the
old-fashioned way ;-)
Cheers,
Magnus
On Mon, Oct 16, 2017 at
Oh and Leyla, I'm in Sulston building on Genome campus :-)
On Mon, Oct 9, 2017 at 2:21 PM Magnus Manske <magnusman...@googlemail.com>
wrote:
> Hi Leyla,
>
> you don't need permission just for linking to Wikidata!
>
> And the query looks fine, I just ran it without
Is anyone working on an "auto-resolve" bot? If you have VIAF (but nothing
else), you can resolve other identifiers via the VIAF site; similarly, if
you have only GND, you could try to reverse-lookup VIAF.
I think a list of items that have zero external identifiers, ordered by
"importance"
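The resolving step could look roughly like this, operating on an already-fetched VIAF cluster-links response (the key names and the mapping to Wikidata properties below are illustrative assumptions, not a documented schema):

```python
def resolve_from_viaf(justlinks):
    # Map a VIAF cluster-links-style dict to Wikidata-friendly identifiers.
    # Only auto-resolve unambiguous cases: exactly one ID per source.
    wanted = {"DNB": "GND ID", "LC": "LoC authority ID", "BNF": "BnF ID"}
    resolved = {}
    for source, prop in wanted.items():
        ids = justlinks.get(source) or []
        if len(ids) == 1:
            resolved[prop] = ids[0]
    return resolved

# Sample response shape (illustrative data, not a real lookup)
sample = {"DNB": ["118505602"], "LC": ["n79021164"], "BNF": []}
```

A bot built on this would still want the one-candidate rule on the Wikidata side too, to avoid propagating a wrong VIAF match into three more identifiers.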
are
> linked to from Wikidata entities using the same YSO ID property, but
> these are not included in the "YSO Places" catalog in Mix'n'match
> because they are not places. I suspect at least some of those "24
> connections on Wikidata, but not here" may be lik
...@me.com
>
> On 20 Jul 2017, at 15:16, Magnus Manske <magnusman...@googlemail.com>
> wrote:
>
> Hi,
>
> is it known why exactly this is happening? Wikidata editing too fast?
> en.wp updating too slow?
>
> I could deactivate both ve
Hi,
is it known why exactly this is happening? Wikidata editing too fast? en.wp
updating too slow?
I could deactivate both versions of quickstatements, but not for specific
"items with enwp sitelinks", only all-or-nothing; I hope it's not extreme
enough for that yet?
On Thu, Jul 20, 2017 at
Fantastic news!
Now if you could set up a SPARQL instance for those two...
(the reward for doing good work is more work!)
On Thu, Jul 6, 2017 at 2:10 PM Léa Lacroix wrote:
> Hello all,
>
> As you may know, WMF, WMDE and volunteers are working together on the
>
Reload once, should be fixed now.
On Tue, Jun 27, 2017 at 12:41 PM Osma Suominen <osma.suomi...@helsinki.fi>
wrote:
> Hi Magnus!
>
> Thanks for confirming. It's no big deal if you know about it, but is
> very confusing when you see it happening for the first time.
>
>
That is an issue with the rendering system I have encountered before. I
thought I had it fixed. The Wikidata entries are fine, it just shows "old"
information.
Until I fix it, reload the page and it will show correctly.
On Tue, Jun 27, 2017 at 8:17 AM Osma Suominen
On Mon, Jun 19, 2017 at 12:16 PM Osma Suominen
wrote:
>
> I couldn't see the "not on Wikidata" button that was mentioned in the
> manual in any of the modes. Has it been removed? It would be useful to
> be able to mark that something is not (yet) in Wikidata, though I
I fiddled with it a bit, now 35% automatched.
Will try some more, but there are some sanity constraints on the matching.
If it finds more than one match for the name, it does not set any match,
because random matches on the same name were annoying in the past. There is
also a type constraint,
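As a sketch, the sanity rule described above might look like this (my reading of the behaviour, not the actual Mix'n'match code):

```python
def auto_match(entry_name, entry_type, candidates):
    # Match only when exactly one candidate shares both the name and the
    # expected type; multiple same-name hits produce no match at all,
    # since random same-name matches proved annoying in the past.
    hits = [c for c in candidates
            if c["label"] == entry_name and c.get("type") == entry_type]
    return hits[0]["qid"] if len(hits) == 1 else None

candidates = [
    {"qid": "Q1", "label": "John Smith", "type": "human"},
    {"qid": "Q2", "label": "John Smith", "type": "human"},
    {"qid": "Q3", "label": "Jane Doe",   "type": "human"},
]
```

So an ambiguous name simply stays unmatched for a human to resolve, rather than being guessed.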
a lot!
>
> I'm currently preparing a CSV dump of YSO places. Most of the entries
> have coordinates. I will send it to you soon for inclusion as a catalog
> in Mix'n'match.
>
> -Osma
>
> > Magnus Manske wrote on 16.06.2017 at 00:00:
> > Just to update everyone
On Tue, Jun 13, 2017 at 6:25 PM Neubert, Joachim wrote:
> Hi Magnus, Osma,
>
>
>
> I suppose the scenario Osma pointed out is quite common for knowledge
> organization systems and in particular thesauri: Matching could take
> advantage of multilingual labels and also of
l Finnish
> coordinates are in EPSG:3067. Before or in MixnMatch.
>
> Susanna
>
> 2017-06-07 14:03 GMT+03:00 Osma Suominen <osma.suomi...@helsinki.fi>:
>
>> On 07.06.2017 at 13:10, Magnus Manske wrote:
>>
>>> Does that imply coordinates in Mix'n'match? Because th
Can do, but this can get quite complicated. Example:
https://www.wikidata.org/wiki/Q2917717#P180
There have to be "target search" (e.g. "bull"), zero to many qualifiers
(some qualifier properties may be used several times in a single statement,
like "applies to part"), some of these should be
>>
>> Cheers,
>>
>> Alex
>>
>> On Tue, Jun 6, 2017 at 10:22 AM, Susanna Ånäs <susanna.a...@gmail.com>
>> wrote:
>>
>>> Would anyone be interested in creating a map interface for matching
>>> places in Mix'n'Match?
>>>
>&
On Tue, Jun 6, 2017 at 2:44 PM Osma Suominen <osma.suomi...@helsinki.fi>
wrote:
> Hi Magnus!
>
> Thanks for your quick response. Comments inline.
>
> Magnus Manske wrote on 06.06.2017 at 15:57:
> > * If you want to "seed" Mix'n'match with third-party/indir
Hi Osma,
just a few remarks:
* If you want to "seed" Mix'n'match with third-party/indirect IDs already
in Wikidata, best to not create the catalog yourself, but mail me the data
instead
* If you want "YSO places" in Wikidata, we will need a new property for
that, unless the P2347 formatter URL
The original code predated SPARQL, so I have to change it anyway. The
example I gave is small enough for SPARQL, but others will not be.
On Thu, Jun 1, 2017 at 4:11 PM Daniel Kinzler <daniel.kinz...@wikimedia.de>
wrote:
> On 01.06.2017 at 16:59, Magnus Manske wrote:
> > As an
So I'll be busy finding uses of this table, and changing them, for the next
week or two...
Note that since I now have to use substring comparisons in queries (instead
of integer comparisons), for example with wb_terms, SQL queries will run
slower as a result.
On Thu, Jun 1, 2017 at 2:08 PM Lydia
Just say "wd:Q12345" (the author) instead of "?author" ?
The backlinks thing works, but is tedious. You'll need to load the items
via action=wbgetentities to check if that link actually means "author", or
some other property.
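The per-item check could be sketched like this, given the JSON that wbgetentities returns for one entity (simplified; qualifiers and non-value snaks are ignored):

```python
def links_via_property(entity, prop, target_qid):
    # Return True if `entity` (one item's JSON) has a statement with
    # property `prop` (e.g. P50 "author") pointing at `target_qid`.
    for claim in entity.get("claims", {}).get(prop, []):
        value = claim["mainsnak"].get("datavalue", {}).get("value", {})
        if value.get("id") == target_qid:
            return True
    return False

# Minimal entity shape with a single P50 statement (illustrative data)
entity = {"claims": {"P50": [
    {"mainsnak": {"snaktype": "value",
                  "datavalue": {"value": {"id": "Q12345"}}}}]}}
```

Running this over each backlinked item filters out pages that link to the author for some other reason.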
On Wed, Apr 12, 2017 at 4:52 PM wrote:
>
> To get
I suppose Gerard means this:
https://en.wikipedia.org/wiki/MediaWiki:Wdsearch.js
Last time I checked, it was enabled by default for everyone on it.wp.
On Fri, Mar 31, 2017 at 3:58 PM Thad Guidry wrote:
> Hi Gerard,
>
> Can you point me to a URL that describes that
I literally had that problem yesterday myself. You do need "matching"
versions of MediaWiki and Wikibase. Stable/master does NOT work, but
master/master will work fine (but is a bit more fiddly to set up).
On Fri, Mar 24, 2017 at 2:27 PM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
>
; On Fri, 24 Mar 2017 at 11:03 Magnus Manske <magnusman...@googlemail.com>
> wrote:
>
> I am trying to install the (git HEAD) wikibase on a fresh MediaWiki stable
> (1.28.0) on Tool Labs.
>
> MediaWiki is up, but using "Item:" namespace and some "Special:" p
I am trying to install the (git HEAD) wikibase on a fresh MediaWiki stable
(1.28.0) on Tool Labs.
MediaWiki is up, but using "Item:" namespace and some "Special:" pages die
with a 500 error. Error log says:
2017-03-24 10:58:01: (mod_fastcgi.c.2702) FastCGI-stderr: PHP Catchable
fatal error:
My trusty WiDaR OAuth-based tool is throwing errors since a few days:
"The authorization headers in your request are not valid: Invalid signature"
I did see the breaking change at
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2017-February/000125.html
but I am not using lgpassword
Relevant: https://arxiv.org/pdf/1702.06235.pdf
On Wed, Feb 22, 2017 at 6:57 AM Gerard Meijssen
wrote:
> Hoi,
> You know, typically you are right. In the last few days I added members of
> the chamber of deputies of Haiti. I used names from the English Wikipedia
> but
On Fri, Feb 17, 2017 at 8:57 PM Kingsley Idehen <kide...@openlinksw.com>
wrote:
> On 2/16/17 5:52 PM, Magnus Manske wrote:
> > I have extended the resolver to include squid and reasonator as targets:
> >
> >
> https://tools.wmflabs.org/wikidata-todo/resolve
Done.
On Fri, Feb 17, 2017 at 8:40 AM Markus Kroetzsch <
markus.kroetz...@tu-dresden.de> wrote:
> Thanks, Magnus. Could you also change the SQID target URL to the stable
> version (remove "dev/" from URL)?
>
> Best,
>
> Markus
>
> On 16.02.2017 23:52,
I have extended the resolver to include squid and reasonator as targets:
https://tools.wmflabs.org/wikidata-todo/resolver.php?quick=VIAF:12307054=sqid
https://tools.wmflabs.org/wikidata-todo/resolver.php?quick=VIAF:12307054=reasonator
On Thu, Feb 16, 2017 at 10:32 PM Markus Kroetzsch <
e statements don't already exist). With the appropriate reference to
> the dataset, ideally.
>
> I realise this is a lot to ask - maybe I should just write a bot.
>
> Alina, sorry to hijack your thread. I hope my questions were general
> enough to be interesting for other read
with the matching. Please
contact me if you need help!
On Thu, Jan 26, 2017 at 3:58 PM Magnus Manske <magnusman...@googlemail.com>
wrote:
> Alina, I just found your bug report, which you filed under the wrong issue
> tracker. The git repo (source code, issue tracker etc.) are h
stion for people who are using the Wikidata reconciliation
> > service: https://tools.wmflabs.org/wikidata-reconcile/ It was working
> > perfectly in my Open Refine in november 2016, but since december is
> > stopped working. I already have contacted Magnus Manske, but he hasn’t
> &g
FWIW, WiFi/DSL access from Germany...
"Brad Pitt": 128 results in 11.2s
"Antarctic rivers": Silently fails with an internal server error, time
keeps running:
Yes, it's a new account, 10 edits. Awaiting bot flag. I'll just do another
edit, thanks!
On Fri, Dec 16, 2016 at 8:27 AM Thiemo Mättig
wrote:
> Hi Magnus!
>
> This filter is for new users with less than 11 edits:
>
I am trying to use the Wikidata API with a bot to remove a claim:
https://www.wikidata.org/w/api.php?action=help&modules=wbremoveclaims
But all I get is an abusefilter warning:
https://www.wikidata.org/wiki/MediaWiki:Abusefilter-warning-68
Is there a way for me to fix or ignore this, and get on with the
t to classify the data, but if our
> interest goes in the same direction, we could pool brains and resources to
> find a classification or at least some nice ggplot vizs :-)
>
> Jan
>
> 2016-10-20 14:42 GMT+02:00 Magnus Manske <magnusman...@googlemail.com>:
>
> Jan, my List
scary/friendly is it compared to SPARQL? How hard is
> finding the needed instructions compared to SPARQL etc.)
>
> Kind Regards,
> Jan
>
>
>
>
>
> 2016-10-20 13:55 GMT+02:00 Magnus Manske <magnusman...@googlemail.com>:
>
> Hi Jan,
>
> as the author of
labs.org/ppp-sparql/#Where%20is%20Paris%3F (it's very
> alpha-ish).
>
> Thomas
>
> > On 20 Oct 2016 at 13:55, Magnus Manske <magnusman...@googlemail.com>
> wrote:
> >
> > Hi Jan,
> >
> > as the author of wdq and its query builder, I recommend agai
Hi Jan,
as the author of wdq and its query builder, I recommend against using it as
a model.
The wdq query builder does work, to some degree, because the instruction
set of wdq is very limited, and predictable. Since "proper" wikidata lists
would be powered by SPARQL, which is much more
“publication date” of
> the source at a later stage by re-running the tool when it has been
> improved? Or is it better to leave out the source altogether for the time
> being?
>
>
>
> Cheers,
>
> Beat
>
>
>
>
>
>
>
> *From:* Wikidata [mailto:wikida
The tool is missing several functions, and requires a general
overhaul/rewrite. Haven't gotten around to it.
On Tue, Oct 18, 2016 at 12:52 PM Estermann Beat
wrote:
> Hi,
>
>
>
> I’ve recently tried in vain to add a “publication date” qualifier to a
> reference, using the
You could try this (example:"Cambridge"):
https://quarry.wmflabs.org/query/13025
Not sure if your terms will work though; "Aerial photograph" does not
exist, for example. You can replace
term_type='label'
with
term_type IN ('label','alias')
to get more hits.
On Mon, Oct 10, 2016 at 7:14 AM
Using custom HTTP headers would, of course, complicate calls for the tool
authors (i.e., myself). $.ajax instead of $.get and all that. I would be
less inclined to change to that.
On Mon, Oct 3, 2016 at 10:42 AM Guillaume Lederrey
wrote:
> On Mon, Oct 3, 2016 at 12:40
On Wed, Sep 14, 2016 at 10:07 AM Luca Martinelli
wrote:
> On 5 Sep 2016 at 11:43, "Jan Dittrich"
> wrote:
> > - What are current workflows with Listeria? I think I remember that
> people in our interviews (thanks to everyone who
One "typical" approach for a data set this type and size is Mix'n'match:
https://tools.wmflabs.org/mix-n-match/
If you get a list of IDs and names, let me know.
On Tue, Sep 6, 2016 at 7:17 PM Brill Lyle wrote:
> Hi Wikidatans,
>
> After going past my 500th edit on
FWIW, I also noticed the SERVICE label being much slower than using
rdfs:label. Not sure if that's a recent development, but I switched to
avoid timeouts.
On Wed, Sep 7, 2016 at 7:36 AM Markus Kroetzsch <
markus.kroetz...@tu-dresden.de> wrote:
> On 07.09.2016 03:05, Stas Malyshev wrote:
> > Hi!
Never mind, worked immediately after sending this mail ;-)
On Mon, Aug 22, 2016 at 8:37 PM Magnus Manske <magnusman...@googlemail.com>
wrote:
> I am trying to create new items by supplying a large-ish JSON structure,
> but I keep getting "The serialization is invalid"
YES!
On Thu, Jul 28, 2016 at 5:02 PM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
> Hey everyone :)
>
> I just posted exciting news about structured data support for Commons
> at
> https://commons.wikimedia.org/wiki/Commons_talk:Structured_data#It.27s_alive.21
> *SPOILER* There is a
On Jul 13, 2016 3:08 PM, "Magnus Manske" <magnusman...@googlemail.com>
> wrote:
>
>> Hi all,
>>
>> I have tried to replicate the issue in Firefox, Chrome, and Safari. It
>> appears to work for me; however, API calls occasionally took a significant
&
Hi all,
I have tried to replicate the issue in Firefox, Chrome, and Safari. It
appears to work for me; however, API calls occasionally took a significant
amount of time. This was the case for a few minutes, and then "fixed"
itself. So some of the described effect might be due to "Wikidata
...@wikimedia.de>
wrote:
> On Thu, Jul 7, 2016 at 8:06 PM, Magnus Manske
> <magnusman...@googlemail.com> wrote:
> > OK, this thread seems appropriate, so I just fixed up one of my scripts,
> it
> > lets you
> > *drag'n'drop references between statements
>
://www.wikidata.org/wiki/User:Magnus_Manske/dragref.js
Maybe it helps, a little.
On Thu, Jul 7, 2016 at 6:16 PM Magnus Manske <magnusman...@googlemail.com>
wrote:
> The Visual editor has a whole UI team behind it, who've been working on it
> for years. Yes, citations are only
gt;
>
> *Erika Herzog*
> Wikipedia *User:BrillLyle <https://en.wikipedia.org/wiki/User:BrillLyle>*
>
> On Thu, Jul 7, 2016 at 10:12 AM, Magnus Manske <
> magnusman...@googlemail.com> wrote:
>
>> While the proposal of all statements requiring citation is obviousl
While the proposal of all statements requiring citation is obviously
overshooting, I believe we all agree that more/better citations improve
Wikidata.
One component here would be a social one, namely that it first becomes good
practice, then the default, to cite statements.
For that, improved
-- Forwarded message -
From: Lydia Pintscher
* Gujarati (
https://gu.wikipedia.org/wiki/%E0%AA%B5%E0%AA%BF%E0%AA%B6%E0%AB%87%E0%AA%B7:AboutTopic/Q13520818
)
Honoured to be a test item, even if I have never heard about that language
before... :-)
For alternative interfaces that include editing capabilities, see also my
demo here:
https://tools.wmflabs.org/reasonator/widee/#q=Q42
(this one is optimized for load speed and readability, but it's been around
for a while, so no guarantees...)
On Tue, May 3, 2016 at 1:27 PM Markus Kroetzsch <
Let me be the first to say:
OH, YEAH!
On Thu, Apr 28, 2016 at 11:15 AM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
> Hey folks :)
>
> We've been doing a lot of groundwork over the past months in order to
> support new entity types. We need this for Wikimedia Commons support among
Yay for geosearch in production!!!
On Mon, Apr 25, 2016 at 6:27 PM Guillaume Lederrey
wrote:
> Hello!
>
> To enable Geosearch [1] on WDQS, we need to do a full dataload to
> re-index all data with Geosearch extension. We will use this
> opportunity to also do a full
If you have the PubChem CID, you can try http://tinyurl.com/goefnxy
(example:CO2)
On Mon, Apr 25, 2016 at 4:03 PM Herman Bergwerf
wrote:
> Hi all! I'm building a web application where users can search for
> protein/compound/etc. names and view their 3D structure using
That looks familiar :-)
On Tue, Apr 19, 2016 at 8:46 PM Markus Kroetzsch <
markus.kroetz...@tu-dresden.de> wrote:
> Hi all,
>
> As promised a while ago, we have reworked our "Wikidata Classes and
> Properties" browser. I am happy to introduce the first beta version of a
> new app, called SQID:
>
Removed from the instructions. They were removed before adding statements
anyway.
On Wed, Apr 6, 2016 at 5:49 PM Thiemo Mättig
wrote:
> Hi,
>
> the "definition" at Magnus'
> http://tools.wmflabs.org/wikidata-todo/quick_statements.php is outdated.
> Magnus, can you
Or directly:
https://commons.wikimedia.org/wiki/Special:Redirect/file/Python-Foot.png?width=100px
On Tue, Apr 5, 2016 at 11:45 PM Yuri Astrakhan
wrote:
>
> https://commons.wikimedia.org/w/api.php?action=query&titles=File:Python-Foot.png&prop=imageinfo&iiprop=url&iiurlwidth=100
>
> On Wed, Apr 6,
Great, thanks!
On Sun, Apr 3, 2016 at 11:41 AM Stas Malyshev
wrote:
> Hi!
>
> > Great to see this!
> >
> > But is it down? I get "Data last updated: [connecting]", and queries run
> > forever...
>
> Thanks, should be fine now.
>
> --
> Stas Malyshev
>
Great to see this!
But is it down? I get "Data last updated: [connecting]", and queries run
forever...
On Sun, Apr 3, 2016 at 7:31 AM Stas Malyshev
wrote:
> Hi all!
>
> I would like to present to you a preview of an upcoming Wikidata Query
> Service feature, namely
Thank you!
On Thu, Mar 31, 2016 at 2:22 PM Daniel Kinzler <daniel.kinz...@wikimedia.de>
wrote:
> We actually have nice documentation for this :)
>
>
> https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/docs/usagetracking.wiki
>
> On 29.03.2016 at 01:
As I said in my last mail, I know how to do it in the Wikidata DB replica.
I was just wondering how to do it in the Wikipedia one, if that's possible.
(and no, I don't want the federated join either)
On Wed, Mar 30, 2016 at 5:56 PM Thiemo Mättig
wrote:
> Hi Magnus,
Hi Thiemo,
what I am looking for is, in a Wikipedia database replica on Labs, for a
given page, the Wikidata item.
Yes, I know I can get that via API, or via the Wikidata DB replica.
Cheers,
Magnus
On Tue, Mar 29, 2016 at 11:10 PM Thiemo Mättig
wrote:
> Hi
To venture a guess, "eu_aspect='S'" seems to be the page-to-item identity,
with the exception of redirected items, which still appear to be associated
with the page (e.g. page_id 8983308 on dewiki).
On Tue, Mar 29, 2016 at 12:10 AM Magnus Manske <magnusman...@googlemail.com>
wr
So, is there an equivalent for WDQ "AROUND[]" now? :-)
On Thu, Feb 25, 2016 at 12:11 AM Stas Malyshev
wrote:
> Hi!
>
> The upgrade is now done and query.wikidata.org is running on Blazegraph
> 2.0. If you notice something wrong please tell me or file an issue in
>
Depends on your definition of "real". Wikidata Special Page stats count all
non-empty (as in "have statements", AFAIK) items, my stats (wikidata-todo)
count all items. Pick your truth.
On Tue, Feb 23, 2016 at 5:16 PM Tom Morris wrote:
> What is the canonical, authoritative
Better geo support will be added with the next blazegraph update, AFAIK.
On Fri, Feb 19, 2016 at 7:44 PM Jorge Hernandez wrote:
> Greeting Wiki Tinkers!
>
> I have been researching into Wikidata SPRQL, and am very much enjoying
> the Wikidata Query Service API
On Wed, Feb 17, 2016 at 7:16 AM Stas Malyshev
wrote:
>
> Well, again the problem is that one use case that I think absolutely
> needs caching - namely, exporting data to graphs, maps, etc. deployed on
> wiki pages - is also the one not implemented yet because we don't
I agree, we should look at some actual traffic to see how many queries
/could/ be cached in a 2/5/10/60 min window. Maybe remove the example
queries from those numbers, to separate the "production" and testing usage.
Also, look at query runtime; if only "cheap" queries would be cached, there
is no
want to do,
on the cheap ;-)
> I'm open for discussion with the community about this.
>
> What do you think?
>
> Cheers,
>
> Marco
>
> On 1/21/16 13:00, wikidata-requ...@lists.wikimedia.org wrote:
> > Date: Wed, 20 Jan 2016 16:44:46 +
> > Fr
Hi Marco,
I run this tool. Quick answers:
1. Yes. If you have a Labs account, you can see everything in database
s51434__mixnmatch_p . You can also get most of the data via the API
(undocumented; ask me for specifics, check out the requests of the
interface in the browser, or try the source code
/SourcererBot
On Wed, Jan 20, 2016 at 4:42 PM Magnus Manske <magnusman...@googlemail.com>
wrote:
> Hi Marco,
>
> I run this tool. Quick answers:
>
> 1. Yes. If you have a Labs account, you can see everything in database
> s51434__mixnmatch_p . You can also get most of the data via
On Sun, Jan 10, 2016 at 9:00 PM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
> (For the technically inclined: The datatype will change for these
> properties but the value type will stay string.)
>
So, what exactly will the datatype change to?
"Nearby" items without image:
http://tools.wmflabs.org/wikishootme/
Also creates links to wdfist and a location-based Flickr free image search.
On Wed, 23 Dec 2015, 19:18 Gerard Meijssen
wrote:
> Hoi,
> If I recall well, Magnus did a tool for articles without images
Since no one mentioned it, there is a tool to do the matching to WD much
more efficiently:
https://tools.wmflabs.org/mix-n-match/
On Wed, 9 Dec 2015, 01:10 David Lowe wrote:
> Hello all,
> The Photographers' Identities Catalog (PIC) is an ongoing project of
> visualizing