Trying (on my user subpage!) Wikidata-based lists with {{#property}}
instead of fixed text:
https://nl.wikipedia.org/wiki/Gebruiker:Magnus_Manske/test1
Works (date columns only in this example), but could use improvement.
Are there more details on {{#property}} somewhere?
On Tue, May 19, 2015
the data?
Thanks,
GerardM
On 19 May 2015 at 12:54, Magnus Manske magnusman...@googlemail.com
wrote:
Trying (on my user subpage!) Wikidata-based lists with {{#property}}
instead of fixed text:
https://nl.wikipedia.org/wiki/Gebruiker:Magnus_Manske/test1
Works (date columns only
PM Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Tue, May 19, 2015 at 12:54 PM, Magnus Manske
magnusman...@googlemail.com wrote:
Trying (on my user subpage!) Wikidata-based lists with {{#property}}
instead
of fixed text:
Thanks ;-)
https://nl.wikipedia.org/wiki
!
On Tue, May 19, 2015 at 1:22 PM, Magnus Manske
magnusman...@googlemail.com wrote:
I didn't copy the templates over; try this (I just ran it, though):
https://tools.wmflabs.org/listeria/index.php?action=update&lang=nl&page=Gebruiker:Magnus_Manske/test1
On Tue, May 19, 2015 at 12:08 PM Gerard Meijssen
On Fri, May 8, 2015 at 10:16 AM Bene* benestar.wikime...@gmail.com wrote:
Hi
I do not think a separate Wikibase instance would be needed to provide
the data for Wiktionary. I think this can and should be done on
Wikidata. But as said by Milos and pointed out by Gerard, lexical
knowledge
Forgive me, but at the 2014 WikiCon in Cologne, I saw a talk that would see
Wiktionary converted to a separate Wikibase installation, collapsing all
the Wiktionary languages into items. THAT could reasonably be linked to
Wikidata, or just cross-referenced via properties.
Trying to wedge the
Well, getting a list of violations per country would not be hard, given
the dates. There are, for example, 2,300 UK citizens who died in 1706 or
earlier:
Huh, just when I sent this mail, I realized that there is a database with
nation dates; it's called Wikidata...
So I present:
https://tools.wmflabs.org/wikidata-todo/wrong_nationality.html
Have fun!
On Mon, Apr 13, 2015 at 3:43 PM Magnus Manske magnusman...@googlemail.com
wrote:
Well, getting
Quick hack: On your user common.js page, add:
importScript( 'User:Magnus Manske/ext-props.js' );
This will move all statements for external IDs (to be exact, all
properties with a URL formatter property) to the sidebar.
The statements in the main body are just hidden; there is a toggle link in
On Fri, Mar 20, 2015 at 12:38 AM Amir Ladsgroup ladsgr...@gmail.com wrote:
One mistake I just found via the report: https://www.wikidata.org/wiki/Q2963097
The article in the French Wikipedia is about a French type of cheese, but it is
connected to an article in the Russian Wikipedia about a French playwright.
Cool! And nice use of my API :-)
On Thu, Mar 19, 2015 at 2:45 PM Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
Hey folks :)
Pasleim has created a nice new patrolling tool to help with vandalism and
spam fighting. Here's their announcement text:
Hey. To fight against vandalism, I
On Wed, Mar 11, 2015 at 8:52 PM Peter F. Patel-Schneider
pfpschnei...@gmail.com wrote:
I can understand that these large groups might have a "not right now"
status, but that seems to be different from "never". What am I
misunderstanding?
I'd say it's never within the current scope of the
Nice! One could even display the results in the spreadsheet, using this:
http://blog.fastfedora.com/projects/import-json
(haven't tried, though)
Note that there is a proper Wikidata graph database being developed by
WMF. Once that reaches production, WDQ will likely become a wrapper around
that
Congratulations for this bold step towards the Singularity :-)
As for tasks, basically everything us mere humans do in the Wikidata game:
https://tools.wmflabs.org/wikidata-game/
Some may require text parsing. Not sure how to get that working; haven't
spent much time with (artificial) neural
? My Python script
depends
on it.
In IRC I've been told that Yuvi Panda or Magnus Manske may help.
Thanks,
Dmitriy
Magnus tweeted they're investigating:
https://twitter.com/MagnusManske/status/572736927528566784
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Manual descriptions are, in the vast majority of cases, a waste of
volunteer time. Alternative:
http://magnusmanske.de/wordpress/?p=265
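The linked post argues for generating descriptions automatically from an item's statements instead of writing them by hand. A minimal sketch of the idea (the simplified item dict and the phrasing rules are invented here for illustration; they are not the actual autodesc algorithm):

```python
def auto_describe(item):
    """Build a short description from an item's core claims.

    `item` is a simplified dict of already-resolved claim labels;
    the phrasing rules below are illustrative only.
    """
    parts = []
    if "country" in item:
        parts.append(item["country"])      # e.g. nationality adjective
    if "occupation" in item:
        parts.append(item["occupation"])
    desc = " ".join(parts) if parts else item.get("instance_of", "item")
    dates = ""
    if "born" in item or "died" in item:
        dates = " ({}-{})".format(item.get("born", "?"), item.get("died", "?"))
    return desc + dates

print(auto_describe({"country": "Dutch", "occupation": "painter",
                     "born": "1606", "died": "1669"}))
# → Dutch painter (1606-1669)
```

Because the description is derived from claims, it stays correct in every language for which the claim targets have labels, which is the point of the proposal.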
On Sun Feb 08 2015 at 17:37:42 Gerard Meijssen gerard.meijs...@gmail.com
wrote:
Hoi,
How does that help? The point is exactly that there is no point to
, if there is demand.
On Mon Feb 09 2015 at 11:05:26 Markus Kroetzsch
markus.kroetz...@tu-dresden.de wrote:
On 09.02.2015 11:41, Magnus Manske wrote:
Manual descriptions are, in the vast majority of cases, a waste of
volunteer time. Alternative:
http://magnusmanske.de/wordpress/?p=265
I am slightly
description in the interface.
This would, however, require engineering beyond what I can offer as a
volunteer. It could also profit from the involvement of someone versed in
linguistics.
Cheers,
Magnus
-- daniel
Am 09.02.2015 um 11:41 schrieb Magnus Manske:
Manual descriptions
...@wikimedia.de
wrote:
Am 09.02.2015 um 12:08 schrieb Magnus Manske:
Considering that hardcoded descriptions (written manually, or generated
automatically) for all items in all ~290 languages would likely make up
most of
the data dump file, this seems somewhat impractical :-)
It's entirely
On Mon Feb 09 2015 at 11:27:06 Daniel Kinzler daniel.kinz...@wikimedia.de
wrote:
Am 09.02.2015 um 12:17 schrieb Magnus Manske:
My autodesc API serves both at the moment, so the consumer can decide
which one
they want to use. Automatic descriptions can miss the point sometimes
On Mon Feb 09 2015 at 13:00:35 Daniel Kinzler daniel.kinz...@wikimedia.de
wrote:
Since wb_terms has one row per term, and a field for the term type, it
would be
simple enough to inject auto-descriptions. The only issue is that
wb_terms is
already pretty huge, and adding automatic
Oh, a real-life example for short automatic descriptions (same code as the
API) vs. manual ones: searching for "Peter" on Wikidata, with the autodesc
gadget:
https://twitter.com/MagnusManske/status/564782161845551104
On Mon Feb 09 2015 at 13:09:27 Magnus Manske magnusman...@googlemail.com
wrote
. This isn't ideal, but I can't think of a better
solution
that wouldn't be hugely complicated (and would thus not be implemented
any time
soon). Maybe you have ideas?
-- daniel
Am 09.02.2015 um 11:41 schrieb Magnus Manske:
Manual descriptions are, in the vast majority of cases, a waste
Yay!
On Wed Jan 21 2015 at 08:53:18 Lydia Pintscher lydia.pintsc...@wikimedia.de
wrote:
On Mon, Jan 19, 2015 at 6:12 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Mon, Jan 19, 2015 at 4:46 PM, Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
Hi (esp. WMF people),
The
Hi James,
Reasonator uses the official wikidata API (your first example) to get the
related information. All statements with a value type item create a
normal Wikipedia-like link in the database, so you can get incoming
links to see which items refer to your start item. Example for Cambridge:
Yes, I will need those as well soon!
On Mon Jan 19 2015 at 15:46:55 Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
Hi (esp. WMF people),
The JSON dumps used to be at
http://dumps.wikimedia.org/other/wikidata/
Now this directory is empty. Any hints at what is going on?
Cheers,
Little late to the party, but I feel I need to say: great work Lucie!
On Fri, Dec 19, 2014 at 2:52 PM, Lucie Kaffee lucie.kaf...@wikimedia.de
wrote:
2014-12-19 14:25 GMT+01:00 Gregor Hagedorn gregor.haged...@mfn-berlin.de
:
may be wrong) a mashup-error occurring when uncritically
Hooray! :-)
On Wed, Dec 3, 2014 at 9:18 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
Hey folks :)
We just enabled statements on properties. You can for example use this to:
* describe mappings to other projects' vocabularies
* indicate constraints for the usage of this
Hi,
I am running the Wikidata query tool (WDQ) at http://wdq.wmflabs.org/
WDQ can run many advanced queries, but I am using my bespoke query language.
I could try to write a wrapper around it, but have not had much (aka
none) experience with SPARQL. Are there some common use case examples
(even
Indeed. Full API documentation:
http://wdq.wmflabs.org/api_documentation.html
On Sun, Oct 26, 2014 at 4:52 PM, Andrew Gray andrew.g...@dunelm.org.uk
wrote:
CLAIM seems to expect that the target is an item number, rather than a
text string. For string properties, I think the correct query is:
On Thu, Sep 25, 2014 at 10:02 PM, Markus Bärlocher
markus.baerloc...@lau-net.de wrote:
Buongiorno Luca,
is it possible to ask for an English translation
With G's help:
Dear WD-specialists,
OpenSeaMap would like to improve the lighthouse data in WD
test:
Some people have multiple languages in their browser settings; there should
be a way to try them all before showing an error page/Reasonator/Google
Translate.
How about https://www.wikidata.org/wiki/Special:GoToLinkedPage/auto/Q732383
?
Magnus
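The fallback idea above could be sketched like this (the sitelink map and the language list are hypothetical; real code would read them from the item's entity JSON and the Accept-Language header):

```python
def pick_sitelink(sitelinks, accept_languages):
    """Return the first sitelink matching the user's language
    preferences, or None if no preferred wiki has an article.

    `sitelinks` maps wiki codes like 'nlwiki' to article titles;
    `accept_languages` is the browser's ordered language list.
    """
    for lang in accept_languages:
        wiki = lang.split("-")[0] + "wiki"  # 'de-CH' -> 'dewiki'
        if wiki in sitelinks:
            return wiki, sitelinks[wiki]
    return None

links = {"nlwiki": "Kasteel Doorwerth", "dewiki": "Schloss Doorwerth"}
print(pick_sitelink(links, ["fr", "de-CH", "en"]))
# → ('dewiki', 'Schloss Doorwerth')
```

Only when every browser language misses would the error page (or Reasonator, or a translation) need to be shown.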
On Thu, Aug 28, 2014 at 3:33 PM, Andy Mabbett
Well, if I had known people's desperate need to keep their interest in the
Franco-Prussian War from the NSA... ;-)
https works now for that tool.
Cheers,
Magnus
On Fri, Aug 22, 2014 at 6:05 AM, Derric Atzrott
datzr...@alizeepathology.com wrote:
This looks really cool! I haven't really been
Thanks for this.
You might want to filter it, though: for example
https://www.wikidata.org/wiki/Q85256 states born in 1606 (no month or day
given), but your report at
https://www.wikidata.org/wiki/User:Ladsgroup/Birth_date_report2/26 gives it
as 1606-01-01, which then conflicts with the date
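The underlying issue is that Wikidata time values carry an explicit precision field (9 = year, 10 = month, 11 = day), so a report can avoid inventing a January 1st. A small sketch of precision-aware formatting (the value dict follows the Wikibase time shape; the example value is illustrative):

```python
def format_wikidata_time(value):
    """Render a Wikidata time value respecting its precision.

    `value` follows the Wikibase data model's time shape: a
    '+YYYY-MM-DDT...' timestamp plus a numeric precision
    (9=year, 10=month, 11=day).
    """
    ts = value["time"].lstrip("+")
    year, month, day = ts[:10].split("-")
    precision = value["precision"]
    if precision <= 9:
        return year            # only the year is asserted
    if precision == 10:
        return f"{year}-{month}"
    return f"{year}-{month}-{day}"

# A birth date where only the year is known: the stored timestamp
# says 01-01, but precision 9 tells us not to print month/day.
print(format_wikidata_time({"time": "+1606-01-01T00:00:00Z", "precision": 9}))
# → 1606
```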
On Wed, Aug 20, 2014 at 4:51 PM, Paul Houle ontolo...@gmail.com wrote:
I'd be particularly wary of inferring anything from the EXIF data,
especially the time.
We could (should!) store the date/time anyway, and slap a source:EXIF
(or the like) qualifier on it.
If there is a manual time
If I may chime in: Most, if not all, of the (overly specific) categories on
Commons can be expressed by statements. So, storing the data/time from EXIF
or otherwise would allow for a "midsummer morning" query. Adding EXIF
camera model to the file data item would allow querying for cellphones (it
We don't have shapefiles yet, but we do have a lot of property types, such
as geographic coordinates (as in, one per item, ideally...), external
identifiers (e.g. VIAF), dates, etc.
A (reasonably) simple way to mass-add statements to Wikidata is this tool:
on a data set with 10M
entries. That actually works, and generates the correct output, but by that
time the connection has gone bye-bye.
I'll try to fix it tonight. Should be possible to speed up the property
lookup considerably.
Cheers,
Magnus
On Tue, Jul 15, 2014 at 10:26 PM, Magnus Manske
On Fri, Jul 4, 2014 at 1:40 PM, Scott MacLeod
worlduniversityandsch...@gmail.com wrote:
Jane, Lydia and WikiDatans,
These are great and helpful developments, which seem to be quite far along
now.
Jane and WikiDatans, can you point to similar helpful examples that would
distinguish how
Hi Rohan,
if by category you mean item, e.g.
https://www.wikidata.org/wiki/Q537
the easiest way to get the corresponding JSON is:
https://wikidata.org/entity/Q537.json
If you want a JSON object for all items with property P537, try:
http://wdq.wmflabs.org/api?q=claim[537]
See here for many
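The two endpoints above can be used from a script roughly like this (the URLs are as given in the mail; the sample entity fragment is hand-made for illustration, following the standard entity JSON shape):

```python
import json

def entity_url(qid):
    """URL of the per-item entity JSON document."""
    return "https://wikidata.org/entity/{}.json".format(qid)

def wdq_claim_url(pid):
    """WDQ query URL for all items having property P<pid>."""
    return "http://wdq.wmflabs.org/api?q=claim[{}]".format(pid)

# Parsing an entity document: labels sit under
# entities -> <QID> -> labels -> <lang> -> value.
sample = json.loads(
    '{"entities": {"Q537": {"labels": '
    '{"en": {"language": "en", "value": "example"}}}}}'
)
label = sample["entities"]["Q537"]["labels"]["en"]["value"]

print(entity_url("Q537"))   # → https://wikidata.org/entity/Q537.json
print(wdq_claim_url(537))   # → http://wdq.wmflabs.org/api?q=claim[537]
print(label)                # → example
```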
On Mon, Jun 16, 2014 at 6:13 PM, Derric Atzrott
datzr...@alizeepathology.com wrote:
@Magus, I've submitted a pull request that fixes that problem I was
complaining
about. It's not an ideal fix, but it's good enough to satisfy me.
Thanks, merged and live now!
Cheers,
Magnus
FYI, I maintain Reasonator, as a click on Other/About Reasonator would
have revealed...
On Fri, Jun 13, 2014 at 3:57 PM, Derric Atzrott
datzr...@alizeepathology.com wrote:
It does crop up in many places... What you see is not the Reasonator per
se; it is the script used to generate a text.
That's the one!
On Wed, Apr 2, 2014 at 11:33 AM, Innovimax SARL innovi...@gmail.com wrote:
CORS is indeed pretty well supported now
http://caniuse.com/cors
Mohamed
On Wed, Apr 2, 2014 at 12:29 PM, Thomas Steiner to...@google.com wrote:
It's called JsonP. We could support it, but it
Yes, if it's enabled on Wikidata.
On Wed, Apr 2, 2014 at 1:26 PM, Daniel Kinzler
daniel.kinz...@wikimedia.de wrote:
Am 02.04.2014 14:12, schrieb Magnus Manske:
That's the one!
CORS should work fine with the cacheable URLs.
-- daniel
--
Daniel Kinzler
Senior Software Developer
Could one of the front-ends (squid?) perform a simple batch service, by
just concatenating the /entity/ JSON for requested items? That could
effectively run on the cache and still deliver batches.
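A hypothetical sketch of such a batch endpoint: take the per-item /entity/ JSON documents (as a cache front-end would already hold them) and merge their "entities" maps into one reply. The cache contents below are made up; only the merging logic is the point:

```python
import json

def batch_entities(cached_docs):
    """Merge per-item /entity/ JSON documents into one batch reply.

    `cached_docs` maps item IDs to the JSON strings the cache layer
    would already hold for the single-item entity URLs.
    """
    merged = {"entities": {}}
    for qid, doc in cached_docs.items():
        merged["entities"].update(json.loads(doc)["entities"])
    return json.dumps(merged)

cache = {
    "Q42": '{"entities": {"Q42": {"id": "Q42"}}}',
    "Q64": '{"entities": {"Q64": {"id": "Q64"}}}',
}
print(batch_entities(cache))
```

Since each single-item document is independently cacheable, the batch layer never has to touch the application servers for warm items, which is what makes the idea attractive.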
On Wed, Apr 2, 2014 at 4:44 PM, Paul Houle ontolo...@gmail.com wrote:
I've been thinking about
Apparently, P107 still lurks in 1.4M items that don't have instance of:
http://tools.wmflabs.org/wikidata-todo/autolist.html?q=CLAIM%5B107%5D%20AND%20NOCLAIM%5B31%5D%20AND%20NOCLAIM%5B360%5D
(a few days out-of-date because Labs doesn't get daily Wikidata diff dumps
anymore, but still...)
On
Wow, that IS a big difference!
Looking at the network load for Q1339, there is still a four-second period
where JavaScript does ... something it doesn't need to do.
But, the site has regained its usefulness! Congrats, honestly!
Cheers,
Magnus
On Tue, Feb 25, 2014 at 8:07 PM, David Cuenca
Yes!!!
Already supporting quantities in Reasonator (search for index):
http://tools.wmflabs.org/reasonator/test/?q=Q213
On Thu, Jan 30, 2014 at 11:40 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
Hey folks :)
We have just deployed the initial version of quantities. This means
(that should be http://tools.wmflabs.org/reasonator/?q=Q213 for the
production site ;-)
On Fri, Jan 31, 2014 at 1:08 PM, Magnus Manske
magnusman...@googlemail.com wrote:
Yes!!!
Already supporting quantities in Reasonator (search for index):
http://tools.wmflabs.org/reasonator/test/?q=Q213
Now if you could fix that setting labels via OAuth bug... :-)
On Fri, Jan 31, 2014 at 12:10 AM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
And of course in all the excitement about quantities I forgot the
other important stuff:
* We added the in other languages box for everyone
I like it! Can we use the Wikidata color scheme, according to the new,
improved trademark policy (R) (TM)?
Cheers,
Magnus
On Tue, Dec 3, 2013 at 10:51 AM, Cristian Consonni
kikkocrist...@gmail.com wrote:
Hi all,
I want to propose a logo for reasonator (to substitute the R now in use);
OK, now live with the logo! :-)
On Tue, Dec 3, 2013 at 11:25 AM, Cristian Consonni
kikkocrist...@gmail.com wrote:
2013/12/3 Magnus Manske magnusman...@googlemail.com:
I like it! Can we use the Wikidata color scheme, according to the new,
improved trademark policy (R) (TM)?
((... usual big
toolspam
The Terminator [1] can show you the most linked-to (~important) items with
no label (term, hence the name) in major languages.
/toolspam
[1] http://tools.wmflabs.org/wikidata-terminator/index.php
On Fri, Oct 18, 2013 at 9:27 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On
make the error?
rupert
On Sat, Oct 19, 2013 at 2:08 PM, Magnus Manske
magnusman...@googlemail.com wrote:
toolspam
The Terminator [1] can show you the most linked-to (~important) items
with
no label (term, hence the name) in major languages.
/toolspam
[1] http://tools.wmflabs.org
OK, quickly, while Labs is up :-)
http://tools.wmflabs.org/wikidata-todo/stats.php
On Thu, Oct 10, 2013 at 10:28 AM, Magnus Manske magnusman...@googlemail.com
wrote:
I have generated some stats, will present once Tools Labs has regained the
ability to run PHP :-(
On Mon, Oct 7, 2013 at 9
I can offer:
189,742 of 210,616 places (90%) in the U.S. with coordinates
http://208.80.153.172/wdq/?q=tree[30][150][17,131]_AND_claim[625]
55,504 of 92,510 (60%) in Russia:
http://208.80.153.172/wdq/?q=tree[159][150][17,131]_AND_claim[625]
You'll have to do the rest on your own ;-)
(for
This:
http://208.80.153.172/api?q=claim[279]&props=279
will give you all the items that have the subclass of property, and the
respective item they are a subclass of. Enough to make a subclass tree for
all of Wikidata.
You'll have to get the labels and page counts yourself ;-)
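Given those (item, superclass) pairs, assembling the tree itself is straightforward; a sketch with made-up example pairs (roughly dog/cat → pet → animal):

```python
from collections import defaultdict

def build_subclass_tree(pairs):
    """Turn (item, superclass) pairs into a parent -> children map."""
    children = defaultdict(list)
    for item, superclass in pairs:
        children[superclass].append(item)
    return dict(children)

# pairs as the props=279 query would yield them; values illustrative only
pairs = [("Q144", "Q39201"), ("Q39201", "Q729"), ("Q146", "Q39201")]
tree = build_subclass_tree(pairs)
print(tree["Q39201"])
# → ['Q144', 'Q146']
```

Walking the map from any root then gives the full subtree; labels and page counts would be a second lookup, as the mail says.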
On Tue, Sep 24,
.
[1] http://www.wikidata.org/wiki/Q1400551#sitelinks-wikipedia
On Sat, Sep 7, 2013 at 7:12 AM, Luca Martinelli
martinellil...@gmail.com wrote:
2013/9/7 Magnus Manske magnusman...@googlemail.com:
I believe that, for items that have basic claims/statements, short
descriptions can
.
S
On Sep 7, 2013 1:44 PM, Magnus Manske magnusman...@googlemail.com
wrote:
All valid points, Sven. I would just like to say that
* this is not intended as a replacement or auto-fill for descriptions; it
is to be shown if the manual description is blank (at least, that was my
angle
I am not sure if this has been discussed on-site, if so, apologies.
Many items on Wikidata do not have a short description. That's not really
an issue if you look at a single item, but for item lists (e.g. search
results) or predictive search drop-down boxes, descriptions can be quite
helpful.
I
I could offer an interface:
https://toolserver.org/~magnus/thetalkpage/
On Wed, Aug 7, 2013 at 10:00 AM, Mingli Yuan mingli.y...@gmail.com wrote:
Thanks, Markus,
About the background:
One is related with my current work and I can not say it too much. But
another story, I can say it
AFAIK the hold-up is because the numbers data type should also be able to
carry an uncertainty, and a unit. Not really necessary for number of
floors, but for plenty of other applications.
On Mon, Apr 22, 2013 at 11:21 AM, Michael Hale hale.michael...@live.com wrote:
Well, there is a
That sounds good in principle, but people might get upset (why did we put
this into Wikipedia then?)
A compromise could be to import from (in this example) both Wikipedia and
IMDb, add both as a reference to the same claim if they agree, and
separately if not. We can then deal with the
There will soon be a mechanism where Wikipedia can display data from
Wikidata directly, as it currently does with the language links. No need to
bot-edit Wikipedia.
On Fri, Mar 1, 2013 at 7:22 AM, Bináris wikipo...@gmail.com wrote:
Lego,
nice work, but I am not sure whether we speak about
about functionality and API that you're basing the
infrastructure on aren't officially locked in yet.
Sven
On Fri, Mar 1, 2013 at 5:12 AM, Bináris wikipo...@gmail.com wrote:
2013/3/1 Magnus Manske magnusman...@googlemail.com
There will soon be a mechanism where Wikipedia can display data
Congratulations! Much more usable now!
S... StringType? ;-)
On Mon, Feb 18, 2013 at 8:41 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
Hey :)
We've just rolled out an update to the codebase on wikidata.org. The
most important new features and bugfixes for you are probably:
Thanks, and sorry that yours wasn't accepted. It sounds far more useful
than my little toy there!
Cheers,
Magnus
On Sat, Feb 16, 2013 at 1:04 AM, jmccl...@hypergrove.com wrote:
Unfortunately my SOLRSearch proposal was deemed
Why not just block the bots on wikis that use wikidata?
On Tue, Jan 29, 2013 at 6:51 AM, Bináris wikipo...@gmail.com wrote:
2013/1/28 Amir Ladsgroup ladsgr...@gmail.com
What is the exact time of the next deployment (it and he)?
If you want to catch it, join #wikimedia-wikidata on IRC. It was
So are the same bots doing different things? I seem to remember there was
one giant toolserver pybot instance doing only interwiki.
On Tue, Jan 29, 2013 at 9:17 AM, Nikola Smolenski smole...@eunet.rs wrote:
On 29/01/13 10:02, Magnus Manske wrote:
Why not just block the bots on wikis that use