Congratulations to everyone at Wikimedia Germany! Well deserved. I am
happy that my female mayor query made it into the description ;-)
What's the story behind the picture with the painting on the award page?
Markus
On 11.05.2015 14:01, Lydia Pintscher wrote:
Hey folks :)
I'm really proud to let y
Spam. The salutation "Dear Colleague" on a message sent to this mailing
list gives it away, I know, but for less obvious cases, a good general
guideline is
to avoid "research" conferences that ask you to pay for each paper you
publish there. Legit research events decouple paper selection from
financial asp
On 08.05.2015 11:30, Thomas Douillard wrote:
I don't get this: is this really a technical issue or just an interface
one? It can be pretty clear to users that the semantic entity pages are
very different from lexical entities in the same instance just by
tweaking the UI. Or with separate instanc
Hi,
On 08.05.2015 09:40, Stas Malyshev wrote:
Hi!
Other technical solutions can be found for keeping content apart when
needed (e.g., separate dumps by entity types).
It's not only dumps, it's also searches, APIs, special pages, etc. Of
course, everything can be solved with enough time and c
On 08.05.2015 08:50, Lydia Pintscher wrote:
On Fri, May 8, 2015 at 7:15 AM, Stas Malyshev wrote:
I am worried that having two different data sets within the same
instance would be a problem for tools working with the data, and for
humans too. And frankly, I don't see too much benefit - virtuall
che Bretonne et Celtique
Unité mixte de service (UMS) 3554
20 rue Duquesne
CS 93837
29238 Brest cedex 3
tel : +33 (0)2 98 01 68 95
fax : +33 (0)2 98 01 63 93
On 29.04.2015 21:44, Markus Krötzsch wrote:
On 29.04.2015 20:56, Luca Martinelli wrote:
Dear all,
I need to know about the possibilit
On 29.04.2015 20:56, Luca Martinelli wrote:
Dear all,
I need to know about the possibility of making queries on a Wikibase
instance. I think it is possible to make queries on data on a
particular instance only with external tools at the moment, right?
Yes, this is correct. The SPARQL query sup
ght just put the problem in another place though :), although
this might not change the correctness of statements like :)
Pluto can still be an old style definition planet, and maybe
is a subclass of
Thinking about it, classes are naturally a good way to deal with
definitions.
2015-04
Hi,
General case first: Many statements depend on time and have an end date
(e.g., population numbers). The general approach there is to (1) have a
qualifier that clarifies the restricted temporal validity and (2) make
the current statement "preferred". So your idea with the ranks was a
good
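In query terms, the pattern looks roughly like this (a sketch in the
wikibase:/p:/ps:/pq: vocabulary of the current query service, prefixes as
predefined there, not of any particular endpoint; P1082 = population,
P585 = point in time):

  SELECT ?item ?population ?when WHERE {
    ?item p:P1082 ?st .                        # population statement node
    ?st ps:P1082 ?population ;
        wikibase:rank wikibase:PreferredRank . # only the "current" value
    OPTIONAL { ?st pq:P585 ?when . }           # temporal-validity qualifier
  }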
On 26.04.2015 22:28, Gerard Meijssen wrote:
Hoi,
It is a matter of perspective. From my perspective a value exists or
not. Depending on that, I may or may not want to process it. When you state
novalue there is a value of novalue, and that is not the same as there not
being a value in the first place.
Ah, I se
On 26.04.2015 22:16, Gerard Meijssen wrote:
Hoi,
I regularly query for, for instance, claim[31], i.e. any instance of
whatever... I would also query for the existence of a date of death in a
similar way. For me, a claim with a "whatever it is that says that there
is no value" would be a positive result
Quick reply to Denny and Gerard:
@Denny: I think it makes sense to treat qualifiers under a closed-world
semantics. That is: what is not there can safely be assumed to be false.
In this I agree with Gerard. OTOH, I don't think it hurts very much to
add them anyway.
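For example, under the closed-world reading a head-of-government statement
with no end-time qualifier can be taken as current (a sketch, WDQS-style
vocabulary assumed; P6 = head of government, P582 = end time):

  SELECT ?city ?mayor WHERE {
    ?city p:P6 ?st .
    ?st ps:P6 ?mayor .
    FILTER NOT EXISTS { ?st pq:P582 ?end . }  # no end date -> assume current
  }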
@Gerard: Please note that
On 23.04.2015 20:09, Jeremy Baron wrote:
Hi,
On Mon, Apr 20, 2015 at 8:29 PM, Nicola Vitucci
wrote:
Markus, this is really cool! Can I reuse it as an example on WikiSPARQL? :-)
What's the difference between http://milenio.dcc.uchile.cl/sparql and
WikiSPARQL?
Just a different codebase/engine
On 23.04.2015 12:25, Thomas Douillard wrote:
This is a question of point of view and of how to resolve conflicting
declarations, one much larger than this. There could be disputes over who
is really the father of something; this would be the same.
such a statement in Wikidata means:
* This source says that
On 21.04.2015 17:31, Alan Said wrote:
Hi Markus et al.
Thank you for the answer. I have a few follow-up questions as I'm not
quite grasping the toolkit.
Alternative 1:
So, if I'd like to do 1) I need a dump file; I've downloaded a *-current
dump
(http://dumps.wikimedia.org/wikidatawiki/20150330/
On 22.04.2015 22:10, Stas Malyshev wrote:
Hi!
...
While letters like ч and щ can indeed
generate some long combinations which are not very visually appealing,
Tell me about it! -- M. Kroetzsch
Hi Thomas,
On 22.04.2015 20:06, Thomas Douillard wrote:
Hi, there are items about the Wikibase data model in Wikidata (created by me,
but not only)
If I understand correctly, they could be cited in the semantic web as
https://www.wikidata.org/entity/Q19798647
"No value" is exactly that: not a valu
On 21.04.2015 15:33, Andy Mabbett wrote:
On 20 April 2015 at 21:18, Markus Krötzsch
wrote:
I recently had the occasion of actually phrasing this in SPARQL, so that an
answer can now, finally, be given. The query to run at
http://milenio.dcc.uchile.cl/sparql
is as follows (with some
On 21.04.2015 13:49, Maxime Lathuilière wrote:
indeed!
I tried to import this statement from the French Wikipedia, but I guess
this can't be taken into account before the next dump/update(?)
Yes. We don't have live imports right now.
Markus
https://www.wikidata.org/wiki/User:Zorglub27
On 21.04.2015 12:03, Markus Krötzsch wrote:
On 21.04.2015 11:27, Daniel Kinzler wrote:
On 21.04.2015 00:50, Markus Krötzsch wrote:
On 20.04.2015 23:47, Daniel Kinzler wrote:
Something seems to be wrong with the order, though. Munich (pop >
On 21.04.2015 11:27, Daniel Kinzler wrote:
On 21.04.2015 00:50, Markus Krötzsch wrote:
On 20.04.2015 23:47, Daniel Kinzler wrote:
Something seems to be wrong with the order, though. Munich (pop > 1m in all
statements) is listed way after Chemnitz (pop < 300k in all statements). An
,
so the query will have to change accordingly in the future.
Markus
On Mon, Apr 20, 2015 at 3:50 PM, Markus Krötzsch
<mar...@semantic-mediawiki.org>
wrote:
On 20.04.2015 23:47, Daniel Kinzler wrote:
Something seems to be wrong with the order, though. Munic
in alphanumeric order, because they are
decimal strings? They should be xsd:decimal...
They are.
Markus
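(If such values ever did arrive as plain strings, an explicit cast would
restore numeric ordering -- a generic sketch, with a hypothetical
placeholder for the population property:

  SELECT ?city ?population WHERE {
    ?city ?populationProperty ?population .
  } ORDER BY DESC(xsd:decimal(?population))  # cast string -> number
)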
On 20.04.2015 22:18, Markus Krötzsch wrote:
Hi all,
For many years, Denny and I have been giving talks about why we need to improve
the data management in Wikipedia. To explain and motiva
On 20.04.2015 22:51, Stas Malyshev wrote:
Hi!
is as follows (with some explaining comments inline):
This is very nice, thanks! Will use this as a test case for the query
engine (btw yes it works on my test machine just fine :).
more than one match per city then, even with DISTINCT). Picking
On 20.04.2015 22:29, Nicola Vitucci wrote:
...
I hope this is inspiring to some of you. One could also look for the
world's youngest or oldest current mayors with similar queries, for
example.
Markus, this is really cool! Can I reuse it as an example on WikiSPARQL? :-)
Yes, of course.
Mark
?country rdfs:label ?label .
FILTER ( LANG(?label) = "en" )
}
} GROUP BY ?country ?label ORDER BY DESC(?count)
There seems to be a great imbalance here, which could indicate some
bias/incompleteness of our data -- or, possibly, of the world.
Cheers,
Markus
On Mon, Apr 20, 2015 a
Hi all,
For many years, Denny and I have been giving talks about why we need to
improve the data management in Wikipedia. To explain and motivate this,
we have often asked the simple question: "What are the world's largest
cities with a female mayor?" The information to answer this is clearly
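In the vocabulary of today's Wikidata Query Service, the question can be
phrased roughly as follows (a sketch, not the exact query from the talks;
Q515 = city, P6 = head of government, P21 = sex or gender, Q6581072 =
female, P1082 = population):

  SELECT ?city ?mayor ?population WHERE {
    ?city wdt:P31/wdt:P279* wd:Q515 .    # instance of (a subclass of) city
    ?city p:P6 ?st . ?st ps:P6 ?mayor .  # head of government
    ?mayor wdt:P21 wd:Q6581072 .         # sex or gender: female
    ?city wdt:P1082 ?population .
  } ORDER BY DESC(?population) LIMIT 10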
Hi Matthew,
You can use our experimental SPARQL endpoint
http://milenio.dcc.uchile.cl/sparql. It has direct relations for all
statements that have no qualifiers, and two-step relations for all
statements (with or without qualifiers), which are a bit more complex
but give you more power over w
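Illustrated with P6 (head of government), in assumed WDQS-style notation
rather than this endpoint's exact URIs, the two patterns are:

  ?city wdt:P6 ?mayor .                # one-step: simple statements only
  ?city p:P6 ?st . ?st ps:P6 ?mayor .  # two-step: qualifiers reachable via ?st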
Hi Alan,
The SitelinksExample shows how to get the basic language-links data. In
Wikidata, sites are encoded by IDs such as "enwiki" or "frwikivoyage".
To find out what they mean in terms of URLs, you need to get the
interlanguage information first. The example shows you how to do this.
The
testing, but if your main interest is in the UI and not the
backend, this might be a nice cooperation.
Cheers,
Markus
On 09.04.2015 22:16, Markus Krötzsch wrote:
On 09.04.2015 01:11, Nicola Vitucci wrote:
...
Indeed. I made this temporary change on WikiSPARQL, so that links like
in Jean
On 09.04.2015 01:11, Nicola Vitucci wrote:
...
Indeed. I made this temporary change on WikiSPARQL, so that links like
in Jean-Baptiste's examples may work "properly". If you try this:
http://wikisparql.org/sparql?query=DESCRIBE+%3Chttp%3A//www.wikidata.org/entity/Q18335803%3E
and then click on
On 08.04.2015 17:24, Nicola Vitucci wrote:
On 08.04.2015 16:36, Markus Krötzsch wrote:
On 08.04.2015 15:07, Nicola Vitucci wrote:
Hi Markus,
would you recommend to add some sort of "patch" until the new dumps are
out, either in the data (by adding some triples to a temporary
n and will happen first,
e.g., ranks in RDF).
Cheers
Markus
Cheers,
Nicola
(wikisparql.org)
On 08.04.2015 14:49, Markus Krötzsch wrote:
Hi Jean-Baptiste,
Your observation is correct. This is because a single Wikidata statement
(with one Wikidata property) does not translate into a sin
On 08.04.2015 16:02, Jean-Baptiste Pressac wrote:
Thank you, I will have a look at your publication to get a better
understanding of the mechanism of "RDFisation".
Are you also going to solve the problem with the links to the Wikidata
ontology? For instance, on this page
http://wikisparql.org/sp
Hi Jean-Baptiste,
Your observation is correct. This is because a single Wikidata statement
(with one Wikidata property) does not translate into a single triple
(with one RDF property) in RDF. Rather, several RDF triples are used;
they need to use more than one property, and these properties ha
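As a sketch (statement URI hypothetical, notation in the style of the
later query service): a single population statement about Berlin (Q64)
becomes something like

  wd:Q64 p:P1082 wds:Q64-abc123 .                     # item -> statement node
  wds:Q64-abc123 ps:P1082 "3500000"^^xsd:decimal .    # statement -> main value
  wds:Q64-abc123 pq:P585 "2015-01-01"^^xsd:dateTime . # qualifier: point in time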
ate a clean
and effective model for the community to build upon.
-Ben
On Mon, Apr 6, 2015 at 1:03 PM, Markus Krötzsch
<mar...@semantic-mediawiki.org>
wrote:
On 06.04.2015 22:02, Markus Krötzsch wrote:
Dear Sebastian,
Using OWL is surely a nice idea
On 06.04.2015 22:02, Markus Krötzsch wrote:
Dear Sebastian,
Using OWL is surely a nice idea when the semantics is appropriate (i.e.,
where you want Open-World entailment, not constraints) and here the
Possibly misleading typo: I meant "where", not "here" ;-) -- Markus
Dear Sebastian,
Using OWL is surely a nice idea when the semantics is appropriate (i.e.,
where you want Open-World entailment, not constraints) and here the
expressiveness is enough. This is much more difficult, however, than one
might at first think it is. For a simple example, the common Wik
Hi Erik, hi all,
Aren't those properties already distinguished by the classification
statements we now have on property pages? For example:
https://www.wikidata.org/wiki/Property:P214
Defines the VIAF id to be a "unique identifier" (yes, this is somewhat
questionable modelling, since a prope
Brilliant, we should set up a page with a list of SPARQL endpoints for
Wikidata! For production usage, it is great to have a variety to choose from.
==WARNING==
The RDF format is currently in flux. The purpose of the Chilean endpoint
http://milenio.dcc.uchile.cl/sparql is to gather feedback that
is
would then be somewhere on-wiki).
Markus
On Wed, Mar 11, 2015 at 2:09 PM Markus Krötzsch
<mar...@semantic-mediawiki.org>
wrote:
Hi Andrew,
This is a great idea! It would help data consumers to know what to
expect and community members to know what to put in
Hi Serge,
The short answer to this is that the purpose of aliases in Wikidata is
to help searching for items, and nothing more. Aliases may include
nicknames that are in no way official, and abbreviations that are not
valid if used in another context. Therefore, they seem to be a poor
source
Hi Andrew,
This is a great idea! It would help data consumers to know what to
expect and community members to know what to put in (or where help with
imports would be appreciated). Moreover, the discussion about this list
would be a great way to structure our work in general (have documented
On 11.03.2015 05:40, Tom Morris wrote:
On Tue, Mar 10, 2015 at 6:41 PM, Markus Krötzsch
<mar...@semantic-mediawiki.org>
wrote:
For example, you can see that Portugal has a lot of lighthouses
while Spain has almost none -- maybe we need to look at our data
there ;-)
P
On 10.03.2015 17:31, Thad Guidry wrote:
I helped with the Lighthouses schema in Freebase.
For your personal enjoyment:
https://tools.wmflabs.org/wikidata-exports/miga/?lighthouses
(use Google Chrome or any other non-IE, non-FF browser to view).
Just a quick hack, many of the data are incompl
On 10.03.2015 17:09, Daniel Kinzler wrote:
On 10.03.2015 16:55, Markus Krötzsch wrote:
Hi Serge,
The short answer to this is that the purpose of aliases in Wikidata is to help
searching for items, and nothing more. Aliases may include nicknames that are in
no way official, and
Awesome work :-). I love your use of Google Docs as a UI prototyping
tool. We could really use a few more special-purpose querying tools.
Markus
On 09.03.2015 22:03, Navino Evans wrote:
Hi all,
We've been using WDQ queries a lot recently to update timelines in the
Histropedia directory and, w
Thanks Maarten for the info.
On 08.03.2015 02:03, Jeroen De Dauw wrote:
Hey,
> And to answer your second question: "Maximum number of values is 50
(500 for bots)" (from
https://www.wikidata.org/w/api.php?action=help&modules=wbgetentities)
That seems a bit much to me. Considering an entity ca
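For reference, batched requests separate the ids with "|", e.g. (made-up
ids, within the documented limit):

  https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q1|Q42|Q64&format=json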
Hi Amir,
In spite of all due enthusiasm, please evaluate your results (with
humans!) before making automated edits. In fact, I would contradict
Magnus here and say that such an approach would best be suited to
provide meaningful (pre-filtered) *input* to people who play a Wikidata
game, rathe
On 07.03.2015 18:21, Magnus Manske wrote:
Congratulations for this bold step towards the Singularity :-)
Lol. The word "neural" in the name of the algorithm is infinitely more
attractive and inspiring than something abstract like "Support Vector
Machine", isn't it? -- although we know that bo
On 07.03.2015 16:39, Federico Leva (Nemo) wrote:
https://www.mediawiki.org/wiki/API:Etiquette is probably the
best reference document here...
Thanks, this answers my first question (no worries about request rate as
long as requests are serialized).
Markus
___
Hi,
Quick question about the wbgetentities API: are there any general rules
that clients should obey when making requests?
* Maximal hit rate?
* How many entities can you actually get in one request? Is this
documented anywhere? Is it possible for a tool to find this number or is
it just har
On 27.02.2015 17:47, Lydia Pintscher wrote:
On Thu, Feb 26, 2015 at 2:52 PM, Markus Kroetzsch
wrote:
Hi,
It's that time of the year again when I am sending a reminder that we still
have broken JSON in the dump files ;-). As usual, the problem is that empty
maps {} are serialized wrongly as emp
Hi Paul,
Re RDF*/SPARQL*: could you send a link? Someone has really made an
effort to find the least googleable terminology here ;-)
Re relying on standards: I think this argument is missing the point. If
you look at what developers in Wikidata are concerned with, it is +90%
interface and in
On 12.02.2015 07:17, Gerard Meijssen wrote:
Hoi,
It is pointless to include automated descriptions when they are then
saved in a fixed form. The point of automated descriptions is exactly
that they change as new statements are made. This is one reason why they
are superior to manual descriptions.
On 20.01.2015 23:27, Hydriz Scholz wrote:
Hi all,
All the Wikidata JSON dumps are available and archived on Archive.org.
See this search query [1] for a full list of them. For Labs users, the
latest 10 dumps are available at /data/scratch/wikidata.
Ah, interesting. I did not know that the JSON
On 20.01.2015 19:20, Jeroen De Dauw wrote:
Hey,
I seemed to recall this being reported earlier, it being discussed, and
a fix being created.
Yes. And in spite of your analysis, the problem seems to have almost
disappeared after that. It used to be all over the dataset; now it is
just in one
Dear Wikidata JSON export team,
There seems to be a syntax error in the 20150112 JSON file that (I
think) has already been there in the previous dump. So I guess it makes
sense to report it.
In line 9374899, around column 2648 of the 20150112 JSON dump, we find
"snaks":[]
Of course, {} woul
Also, as this seems to be taking longer than expected, I have now also
re-published the Jan 12 and Jan 5 JSON dumps on labs for your
convenience:
http://tools.wmflabs.org/wikidata-exports/tmp/
Users of Wikidata Toolkit can manually download the file
20150112.json.gz to a subdirectory
./du
Hi (esp. WMF people),
The JSON dumps used to be at
http://dumps.wikimedia.org/other/wikidata/
Now this directory is empty. Any hints at what is going on?
Cheers,
Markus
On 18.01.2015 15:32, Egon Willighagen wrote:
OK, thanks!
BTW, I could confirm the NPE was solved by adding that json-MMDD/ subdir...
Great.
Another question: is it possible to cancel the process of parsing a
data dump file programmatically? I saw the timeout, but integrating it
in a GUI wher
The issue was fixed in master now. I also added some more INFO-type
messages that will report about the dump files found online and locally.
Cheers,
Markus
On 18.01.2015 14:26, Markus Krötzsch wrote:
On 18.01.2015 10:58, Egon Willighagen wrote:
On Sat, Jan 17, 2015 at 11:04 PM, Markus
On 18.01.2015 10:58, Egon Willighagen wrote:
On Sat, Jan 17, 2015 at 11:04 PM, Markus Krötzsch
wrote:
It is easy to fix this (though I will not fix it tonight, but tomorrow) by
just adjusting the HTML strings we parse for.
Sure! I have subscribed to the bug report.
As an intermediate
On 17.01.2015 23:04, Markus Krötzsch wrote:
...
Question to the MW folks: Is there any machine-readable API to get the
list of available dump files?
I mean: "WMF folks", of course -- Markus
y system.
Egon
On 17 Jan 2015 22:50, "Markus Krötzsch" <mar...@semantic-mediawiki.org> wrote:
On 17.01.2015 22:43, Egon Willighagen wrote:
This last test from the cmd line is already with master from
GitHub...
Thanks, we will investigate. I cre
On 17.01.2015 22:43, Egon Willighagen wrote:
This last test from the cmd line is already with master from GitHub...
Thanks, we will investigate. I created a bug report at
https://github.com/Wikidata/Wikidata-Toolkit/issues/114
Markus
Egon
On 17 Jan 2015 22:40, "Markus Krö
Hi Egon,
WDTK 0.3.0 is rather old and we are about to prepare a new release
(there are other issues with 0.3.0: the JSON format has changed since
its release and it won't read the files anyway). Could you try if the
problem occurs with the current development code at github?
Cheers,
Markus
at draw
conclusions based on Wikidata. The semantics of properties might be
broad, but should not be ambiguous.
Kind regards,
Aleksander Smywinski-Pohl
[1]
http://www.itl.nist.gov/iad/mig/tests/ace/2008/doc/ace08-evalplan.v1.2d.pdf
---- On Thu, 08 Jan 2015 22:29:33 +0100, *Markus
Krötzsch* wrote:
, apply the same default for invalid values,
and change all of its coordinate data to floating point numbers
(currently using long as fixed precision decimal numbers).
Cheers,
Markus
On 11.01.2015 02:15, Markus Krötzsch wrote:
Hi,
Does anybody know the current documentation of the precision
history of wiki". This table
does indeed show contributor numbers below 14k for all months up to Nov
2014. Overall, the two counts seem to agree though :-)
Markus
Andrew.
On 11 January 2015 at 22:35, Lydia Pintscher
wrote:
On Sun, Jan 11, 2015 at 11:31 PM, Markus Krötzsch
wrote:
Hey Lydia,
* We just crossed 14000 active users (over the last month). Thank you all!
Could you clarify?
http://stats.wikimedia.org/wikispecial/EN/TablesWikipediaWIKIDATA.htm
shows around 5k active users (=users with >5 edits) each month, but more
than 18K users with an edit in November 2
On 11.01.2015 14:53, Maarten Dammers wrote:
Hi Markus,
Markus Krötzsch schreef op 11-1-2015 om 2:15:
Hi,
Does anybody know the current documentation of the precision of the
globe coordinate datatype? This precision was introduced after the
original datamodel discussions.
No clue, I do know
Hi,
Does anybody know the current documentation of the precision of the
globe coordinate datatype? This precision was introduced after the
original datamodel discussions.
I used to believe that it was a rough, informal indication of a
precision based on an (easy-to-process but necessarily ra
On 09.01.2015 19:43, Joe Filceolaire wrote:
There is a proposal to create a 'subproperty of' property but it is on
hold until we can have a property as a datatype
Yes, that's also important. But I was talking about a property that
would be used to establish a "subpropertyOf" relation between a
I am not the one to decide this, but from my POV:
(1) It would be nice to have a single, most general "root element".
Using "Entity" as this root element could be sensible, with a suitable
definition of "Entity" as anything we can refer to (i.e., any item).
(2) Every entity should be an instanc
On 09.01.2015 17:25, Thad Guidry wrote:
https://www.wikidata.org/wiki/Property:P279 aka "the superclass" ...
seems to have an equivalent property that refers to
http://www.w3.org/2000/01/rdf-schema#subClassOf ???
Basically yes, this was the informal design intention when the community
discus
On 08.01.2015 18:38, Denny Vrandečić wrote:
Yes, CC-BY is great.
Good. I have officially released the article text under this license now:
https://korrekt.org/page/Wikidata:_A_Free_Collaborative_Knowledgebase
Cheers,
Markus
On Thu Jan 08 2015 at 7:01:12 AM Markus Krötzsch
<mar
On 09.01.2015 00:53, Lydia Pintscher wrote:
...
I like the property-centric approach of wikidata, but is there a notion of
subproperties for contextual refinement? I only found this:
https://www.wikidata.org/wiki/Property:P1647
Yes that is all there is. For a usage example see
https://www.wi
On 08.01.2015 22:52, Peter F. Patel-Schneider wrote:
What then is P17 supposed to be used for?
Could I, for example, use P17 on the address of the Swiss embassy in
Germany and have Switzerland as the value?
"associated" is generally too weak a word to use in describing properties.
We have to
On 08.01.2015 21:29, Thad Guidry wrote:
Hi Markus!
Yes, you and I are on the same page.
I do indeed get this impression ;-)
Yes, I know about the
Property-first view of Wikidata. No quibbles. But there is still an
issue with Assumptions for "Country" P17 being used for an instance of
Band.
Dear Thad,
The second part of your email has good points in it, too. As you say,
one must allow for adjustments in the intended meaning of a property in
real life, and adjusting too much could be dangerous. The method you
suggest (creating a new property and deprecating the old one, rather
th
On 08.01.2015 20:37, Thad Guidry wrote:
...
Right, Freebase would not stick a Property called "Country" right on an
instance of a Music Band. We would put Country under the Musical Group
type, and give it a better definition like "The nation or territory that
this item originated from". Freeb
On 08.01.2015 15:10, ja...@j1w.xyz wrote:
Prior to viewing Markus Krötzsch's Wikidata page, I was unaware of the
"Wikidata: A Free Collaborative Knowledgebase" article [1] written by
Denny Vrandečić and Markus Krötzsch. This is a very helpful article
that in my opinion should be
t
on" should point to "Wikidata" or to "Wikimedia" or something else. But
besides this minor point this seems to be a nice way to have COI
declarations in the data (would also be interesting to know which living
people have official Wikimedia accounts).
Cheers,
Marku
P.S. I also should declare a COI on this discussion: I am Q18618630. --
Markus
On 07.01.2015 15:25, Markus Krötzsch wrote:
Back to Denny's original question:
Does anybody see a specific danger of abuse if living people get to edit
their own data right now? Entering wrong claims deliber
Back to Denny's original question:
Does anybody see a specific danger of abuse if living people get to edit
their own data right now? Entering wrong claims deliberately would maybe
not be the biggest issue here (since it is already in conflict with
other general policies -- we do not want wron
On 31.12.2014 16:18, Thomas Douillard wrote:
Not sure either it's writable, as punning implies treating the
class/individual as different things ...
TL;DR: This subtlety is important for powerful ontology modelling
languages such as OWL, but we don't need to worry about this in
Wikidata. Even i
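For illustration (the blank node is hypothetical): punning lets the same
IRI act as an individual and as a class at once, e.g. in Turtle:

  wd:Q146 rdf:type wd:Q16521 .  # "house cat" as an individual: it is a taxon
  _:felix rdf:type wd:Q146 .    # "house cat" as a class: some cat instantiates it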
On 04.11.2014 18:18, Cristian Consonni wrote:
Hi Markus,
2014-11-01 0:29 GMT+01:00 Markus Krötzsch :
Nice. We are running the RDF generation on a shared cloud environment and I
am not sure we can really use a lot of RAM there. Do you have any guess how
much RAM you needed to get this done?
I
On 31.10.2014 14:51, Cristian Consonni wrote:
2014-10-30 22:40 GMT+01:00 Cristian Consonni :
Ok, now I have managed to add the Wikidata statements dump too.
And I have added a wikidata.hdt combined dump of all of the above.
Nice. We are running the RDF generation on a shared cloud environme
Hi Cristian,
Awesome :-) Small note: I just got a "Bad Gateway" when trying
http://data.wikidataldf.com/ but it now seems to work.
It also seems that some of your post answers the question from my
previous email. That sounds as if it is pretty hard to create HDT
exports (not much surprise t
On 30.10.2014 11:49, Cristian Consonni wrote:
2014-10-29 22:59 GMT+01:00 Lydia Pintscher :
Help with this would be awesome and totally welcome. The tracking bug
is at https://bugzilla.wikimedia.org/show_bug.cgi?id=48143
Speaking of totally awesome (aehm :D):
* see: http://wikidataldf.com
* see
but I haven't been on the list since the beginning - but I am
curious if there is any formal collaboration
(in-place|proposed|possible) between dbpedia and wikidata?
Phil
This message optimized for indexing by NSA PRISM
On Wed, Oct 29, 2014 at 2:34 PM, Markus Krötzsch
wrote:
Martynas,
Denny
Martynas,
Denny is right. You could set up a Virtuoso endpoint based on our RDF
exports. This would be quite nice to have. That's one important reason
why we created the exports, and I really hope we will soon see this
happening. We are dealing here with a very large project, and the
decision
Hi Cristian,
As Daniel said, the live export is currently somewhat limited. However,
we provide RDF dumps that contain all the data:
http://tools.wmflabs.org/wikidata-exports/rdf/
This shows how the final live exports should also look (more or less),
and it could be a blueprint for somebody
Dear all:
Those of you active in research may be interested in submitting to a
recently announced special issue of the Journal of Web Semantics that
explicitly refers to Wikidata in its call:
"JWS Special Issue on Knowledge Graphs"
http://www.websemanticsjournal.org/index.php/ps/announcement/
Dear all,
I am happy to announce the third release of Wikidata Toolkit [1], the
Java library for programming with Wikidata and Wikibase. The main new
features are:
* Full support for the (now) standard JSON format used by Wikidata
* Huge performance improvements (decompressing and parsing the
Hi,
I fully agree with Thomas and the other replies given here. Let me give
some other views on these topics (partly overlapping with what was said
before). It's important to understand these things to get the subclass
of/instance of thing right -- and it would be extremely useful if we
could
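The core distinction, in a Turtle-style sketch (IDs are the usual ones:
Q64 = Berlin, Q515 = city, Q486972 = human settlement):

  wd:Q64  wdt:P31  wd:Q515 .     # instance of: individual-to-class
  wd:Q515 wdt:P279 wd:Q486972 .  # subclass of: class-to-class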
On 13.09.2014 21:25, Jeremy Baron wrote:
On Sat, Sep 13, 2014 at 7:23 PM, Denny Vrandečić wrote:
I am not a lawyer, but if I remember correctly, copyright covers expression,
not content. Since the Wikidata data model and its representation in JSON are
rather unique, an ISBN number in a Wikidata
gree that
this has other issues). Also, I don't know if you can have language
links to multiple pages on the same other Wikipedia using manual links
(which seems to be what Edward was trying to do by connecting the same enwiki
article to multiple items).
Cheers,
Markus
On 09.09.2014 13:50, Daniel
On 09.09.2014 14:23, Emw wrote:
...
Example: https://en.wikipedia.org/wiki/Samoan_Clipper
See https://www.wikidata.org/wiki/Q7409943 for an initial pass at
modelling that.
Thanks, that's a huge improvement over the previous state (=me looking
at the article and giving up on adding anyth