e cases, and try our best to
help.
I hope that the page now is in a state Dr Lemaire is content with.
I want to extend her an apology, although I am not sure she will read this
message.
Best regards,
Denny Vrandečić
On Fri, Jul 28, 2023 at 1:41 PM Vi to wrote:
> This mailing list
I think it would be great to have a mobile app for Wikidata.
On Thu, Feb 3, 2022 at 10:59 AM geislemx
wrote:
> Hey all,
>
> I hope this mail finds you well in these trying times.
> Over the last month I invested some time and put a little project
> together for personal purpose. Long story short
Hi all,
We are currently looking for input on the question: what licensing
structure we should apply to Abstract Wikipedia and Wikifunctions.
After some initial discussion, the following two questions in particular
remain open and would benefit from your input or vote:
1) Should Abstract Content
Hi Thad,
Thanks for asking the questions, and thanks Tobi for the pointers. Man,
what a lengthy post it was.
I understand that the post answered most of your questions. I think that it
is entirely possible to layer a prototype semantics over Wikidata, just as
the DL semantics have been layered
The on-wiki version of this is here:
https://www.wikidata.org/wiki/Wikidata:Lexicographical_data/Focus_languages
Hello all,
The Wikidata team at Wikimedia Deutschland will be working on improvements
to the lexicographic data part of Wikidata during this year. The Abstract
Wikipedia team at the
A short blogpost by Lydia and me on the diff blog:
https://diff.wikimedia.org/2020/10/06/wikidata-reaches-q1/
Congratulations to the community, congratulations to the project!
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
We have released lexical masks as ShEx files before, schemata for
lexicographic forms that can be used to validate whether the data is
complete.
We saw that it was quite challenging to turn these ShEx files into forms
for entering the data, such as Lucas Werkmeister’s Lexeme Forms. So we
adapted
There's also this paper, which might be interesting for what you are doing:
https://www.wikidata.org/wiki/Q73506477
On Fri, Jun 19, 2020 at 9:42 AM Denny Vrandečić wrote:
> The dump with the (almost) complete edit history can be found here:
>
> https://dumps.wikimedia.org/wikidatawiki
The dump with the (almost) complete edit history can be found here:
https://dumps.wikimedia.org/wikidatawiki/20200501/
Search for "edit history" on that page. There are two versions, in two
different compression formats.
It is rather big.
On Fri, Jun 19, 2020 at 7:52 AM Elisavet Koutsiana <
Did you see this?
https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/
On Wed, Jun 10, 2020, 12:51 Leandro Tabares Martín <
leandro.tabaresmar...@uhasselt.be> wrote:
> Dear all,
>
> I'm loading the whole wikidata dataset into Blazegraph using a High
> Performance
Did you see this?
https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/
On Sun, Jun 7, 2020 at 3:13 AM Leandro Tabares Martín <
leandro.tabaresmar...@uhasselt.be> wrote:
> Dear all,
>
> I am a researcher from Hasselt University performing research on Query
>
Welcome to Wikidata! Thanks for taking on such an important task for
outreach!
On Tue, May 19, 2020 at 3:02 AM Dan Shick wrote:
> Hi all!
>
> I’m Dan Shick ( https://w.wiki/RDs ), the new technical writer at
> Wikimedia Deutschland. My goals are to discover, improve, unify and
> round out
Hello all,
after talking about it a few times here, the official proposal for creating
the multilingual Wikipedia proposal is now on Meta.
https://meta.wikimedia.org/wiki/Wikilambda
The idea is to create abstract, language-independent content in Wikidata,
and then translate it into natural
Kingsley,
thanks for suggesting that feature. Since you already have that feature,
could you let us know how often the UI option for these output formats is
used? That could help with prioritising.
My uninformed hunch would be that there isn't much demand for selecting the
format via the UI,
CONSTRUCT would be best, but I am not sure that there's any system that
allows you to do that.
What I would do is get the truthy dump in ntriples, and filter out all
lines with the respective properties. The Wikidata Toolkit allows you to do
that and more.
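The filtering step described above can be sketched in a few lines of Python. This is a minimal sketch under assumptions: the property IDs in `WANTED`, the function names, and the sample input are mine for illustration; the `http://www.wikidata.org/prop/direct/` prefix is the one used for truthy statements in the Wikidata RDF dumps.

```python
# Sketch: filter lines of a (decompressed) truthy N-Triples dump, keeping
# only triples whose predicate is one of the wanted properties.
# WANTED and the function names are illustrative, not from the original mail.

WANTED = {"P31", "P279"}  # hypothetical properties of interest


def keep_line(line: str) -> bool:
    """Return True if the triple's predicate is a wanted truthy property."""
    parts = line.split(" ", 2)  # N-Triples: subject, predicate, rest
    if len(parts) < 3:
        return False
    predicate = parts[1]
    # Truthy statements use the http://www.wikidata.org/prop/direct/ prefix.
    return any(
        predicate == f"<http://www.wikidata.org/prop/direct/{p}>" for p in WANTED
    )


def filter_dump(lines):
    """Keep only the lines whose predicate is in WANTED."""
    return [line for line in lines if keep_line(line)]
```

In practice one would stream the (very large) dump line by line rather than hold it in memory; the Wikidata Toolkit mentioned above offers a higher-level way to do the same.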
If you're interested in all new created schemas, you can actually follow a
feed for that.
https://www.wikidata.org/wiki/Special:RecentChanges?hidebots=1=1=1=1=1=640=50=14__likelybad_color=c4__verylikelybad_color=c5=2
There's a link to the atom feed on the left hand toolbar.
Cheers,
Denny
On
Mohammed,
welcome! I am very happy to see you join in this important role.
Thank you,
Denny
On Tue, Apr 21, 2020 at 9:11 AM Léa Lacroix
wrote:
> Welcome onboard Mohammed!
> I'm glad that you're here and to have your support in order to address the
> requests from the community and to
ge a bias. We still need a way to find people
> willing to tackle this, or at least giving them enough motivation to
> solve this in our current working framework.
>
> On the bright side, I can see so many applications of your project
> that I can't wait for it to happen. :)
>
I sent a long email to Wikimedia-l and also made the same post to Meta. I
published a new paper recently with a proposal for a multilingual Wikipedia
and more, and, unsurprisingly, Wikidata plays a central role in that
proposal. I am trying to keep the discussion from becoming too fragmented, so I
It is allowed (and in fact encouraged) to embed Wikidata query results in
your site. That was one of the original use cases.
Also, I wouldn't worry tremendously about the load on the Query server from
that, as this is particularly well cacheable and cached. To the best of my
knowledge, embedded
Joachim did was to set up a small
> document on GitHub and refer to that in the statements. Should I do
> something similar here?
>
> -Osma
>
> Denny Vrandečić kirjoitti 12.3.2020 klo 20.35:
> > When we were uploading the links to Freebase, we also added references
> >
When we were uploading the links to Freebase, we also added references for
these. And since you've gone through all this work (thank you for that!)
verifying the links, I think it would be fair to add a respective reference.
On Tue, Mar 10, 2020 at 9:27 AM Osma Suominen
wrote:
> Hi,
>
> I'm
That's awesome, thanks!
On Tue, Feb 11, 2020 at 3:46 AM Léa Lacroix
wrote:
> Hey,
>
> On Tue, 11 Feb 2020 at 12:30, Nicolas VIGNERON
> wrote:
>
>> Great !
>>
>> Two small perfectionist questions:
>> - is it finished or can we maybe go further?
>>
> It is finished for now, but we should continue
Oh, wow, I just tried that out too, and indeed, it used to be possible to
link to the L-number very quickly if I remember correctly, but now this is
not the case anymore.
Weirdly enough, the SPARQL endpoint got updated with the new Lexeme very
quickly. So I think these two things are not related.
This should probably help:
https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/
On Tue, Nov 19, 2019 at 8:27 AM Peter F. Patel-Schneider <
pfpschnei...@gmail.com> wrote:
> Is this the recommended way to set up a local copy of Wikidata? (If not,
> what
> is the
> On Fri, Nov 15, 2019 at 12:49 AM Denny Vrandečić
> wrote:
>
>> Just wondering, is there a way to let volunteers look into the issue? (I
>> guess no, because it would potentially give access to the query stream, but
>> maybe the answer is more optimistic)
>>
>
Just wondering, is there a way to let volunteers look into the issue? (I
guess no, because it would potentially give access to the query stream, but
maybe the answer is more optimistic)
On Thu, Nov 14, 2019 at 2:39 PM Thad Guidry wrote:
> In the enterprise, most folks use either Java Mission
Hi all,
as promised, now that I am back from my trip, here's my draft of the
comparison of Wikidata, DBpedia, and Freebase.
It is a draft, it is obviously potentially biased given my background,
etc., but I hope that we can work on it together to get it into a good
shape.
Markus, amusingly I
Thanks everyone for this warm welcome (back)!
On Fri, Sep 20, 2019, 10:38 Denny Vrandečić wrote:
> Off to my Todo list :)
>
> On Thu, Sep 19, 2019 at 10:46 AM Andy Mabbett
> wrote:
>
>> On Thu, 19 Sep 2019 at 17:56, Denny Vrandečić
>> wrote:
>>
>>
Sep 20, 2019 at 1:32 PM Denny Vrandečić
> wrote:
>
>> Yes, you're touching exactly on the problems I had during the evaluation
>> - I couldn't even figure out what DBpedia is. Thanks, your help will be
>> very much appreciated.
>>
>> OK, I will send a link the week
Off to my Todo list :)
On Thu, Sep 19, 2019 at 10:46 AM Andy Mabbett
wrote:
> On Thu, 19 Sep 2019 at 17:56, Denny Vrandečić
> wrote:
>
> > I am moving to a new role in Google Research, akin to a Wikimedian in
> > Residence
>
> That's marvelous; congratulations
), not the Open
Source office. But I plan to publish more code going forward, so I will
continue to work with them.
On Thu, Sep 19, 2019 at 10:35 AM Federico Leva (Nemo)
wrote:
> Denny Vrandečić, 19/09/19 19:56:
> > I had used my 20% time to support such teams. The requests became more
> > fr
Knowledge Graph Talk thing. I was a bit
> grumpy, because I thought I wasted a lot of time on the Talk page that
> could have been invested in making the article better (WP:BE_BOLD style),
> but now I think, it might have been my own mistake. So apologies for
> lashing out there.
>
Sebastian,
"I don't want to facilitate conspiracy theories, but ..."
"[I am] interested in what is the truth behind the truth"
I am sorry, I truly am, but this *is* the language I know from conspiracy
theorists. And given that, I cannot imagine that there is anything I can
say that could
I think if we wanted to do this with a bot, we should go through the usual
bot approval process, and discuss this on wiki?
But in general, as said, adding "unknown value" for people who we are very,
very sure are dead sounds like a good idea (
https://www.wikidata.org/wiki/Q28 )
On Thu, Sep 19,
"unknown value" was made for exactly that use case - a person that has
died, but we don't know when.
I would just add that on the "date of death" property.
On Sat, Sep 7, 2019 at 2:25 AM Thomas Douillard
wrote:
> We have already a qualifier for this kind of stuffs, I think : P887
>
Hello all,
Over the last few years, more and more research teams all around the world
have started to use Wikidata. Wikidata is becoming a fundamental resource
[1]. That is also true for research at Google. One advantage of using
Wikidata as a research resource is that it is available to
links. When we do we could override the delivery
> of that article with a choice to the "abstract" for comparison.
> Thanks,
>GerardM
>
>
>
> [1]
> https://tools.wmflabs.org/wikidata-todo/cloudy_concept.php?q=Q434706=en
>
> On Sat, 6 Jul 2019 at 23:53, Den
That is really cool! Thanks and congratulations! I will certainly play with
it.
Is it in some way synced or is it a static snapshot?
On Tue, Aug 13, 2019 at 4:10 PM Kingsley Idehen
wrote:
> Hi Everyone,
>
> A little FYI.
>
> We have loaded Wikidata into a Virtuoso instance accessible via
ich have all the predicate defined.
>
> On Mon, 12 Aug 2019 at 22:23, Denny Vrandečić wrote:
>
>> Maybe you mean this file:
>>
>>
>> https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+/master/docs/ontology.owl
>>
>>
>>
Maybe you mean this file:
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+/master/docs/ontology.owl
Cheers,
Denny
On Sun, Aug 11, 2019 at 7:20 AM Sebastian Hellmann <
hellm...@informatik.uni-leipzig.de> wrote:
> Hi Ali, all,
>
> we have this dataset:
>
Thank you for the message, Lea, this seems like a good step.
On Thu, Aug 8, 2019 at 8:19 AM Gerard Meijssen
wrote:
> Hoi,
> Easy, my user interface is English in all of them.
> Thanks,
> GerardM
>
> On Thu, 8 Aug 2019 at 16:39, Imre Samu wrote:
>
>> *> Suggestion* display the Q number in
tool’s templates API
> <https://www.wikidata.org/wiki/Wikidata:Wikidata_Lexeme_Forms#Templates_API>
> rather than the wiki page itself: transcribing the templates into
> structured form takes some time, there’s no need for someone else to do it
> again :)
>
> Cheers,
>
Hey,
is anyone working on or has worked on generating EntitySchemas from the
Wikidata Lexeme Forms data that Lucas is collecting?
It seems that most of the necessary data should be there already for these.
E.g. generating
https://www.wikidata.org/wiki/EntitySchema:E34
from
Worked perfectly well! Thanks!
On Wed, Jul 17, 2019 at 1:31 PM Denny Vrandečić
wrote:
> Thank you Antonin, that's reassuring, I'll try that!
>
> Cheers,
> Denny
>
> On Wed, Jul 17, 2019 at 12:48 PM Antonin Delpeuch (lists) <
> li...@antonin.delpeuch.eu> wrot
ss the entire JSON serialization of the claim, including
> the reference. If you are trying to create a new statement, you will
> need to generate a fresh statement id client-side and include it in the
> JSON serialization.
>
> Cheers,
> Antonin
>
> On 7/17/19 7:49 PM, De
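The client-side step Antonin describes above, generating a fresh statement id and including it in the JSON serialization for wbsetclaim, can be sketched as follows. This is a sketch under assumptions: the statement-id form "entity-id$UUID" and the datavalue shape match the Wikibase serialization, but the function name and the minimal payload are mine for illustration, and a real statement for wbsetclaim may carry more fields (references, qualifiers).

```python
import json
import uuid


def new_statement_json(entity_id: str, prop: str, value_entity: str) -> str:
    """Build the JSON serialization of a fresh statement for wbsetclaim.

    A fresh statement id is generated client-side in the usual
    "<entity-id>$<UUID>" form; the claim structure below is a minimal
    illustrative payload, not a complete Wikibase serialization.
    """
    statement_id = f"{entity_id}${uuid.uuid4()}"
    claim = {
        "id": statement_id,
        "type": "statement",
        "rank": "normal",
        "mainsnak": {
            "snaktype": "value",
            "property": prop,
            "datavalue": {
                "value": {
                    "entity-type": "item",
                    "numeric-id": int(value_entity[1:]),
                    "id": value_entity,
                },
                "type": "wikibase-entityid",
            },
        },
    }
    return json.dumps(claim)
```

The resulting string would then be passed as the `claim` parameter to `action=wbsetclaim` together with an edit token; references can be included under a `references` key in the same serialization.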
Hi,
since it is possible in the UI, and the API documentation for wbsetclaim
also states that it "Creates or updates an entire Statement or Claim", I am
wondering how I can create a claim with a reference (wbcreateclaim does not
allow adding a reference).
So, my questions are:
1. Does the API
Hi all!
I really try not to spam the chat too much with pointers to my work on the
Abstract Wikipedia, but this one is probably also interesting for Wikidata
contributors. It is the draft for a chapter submitted to Koerner and
Reagle's Wikipedia@20 book, and talks about knowledge diversity under
AllPages=202
> e.g. http://mappings.dbpedia.org/index.php/OntologyProperty:BirthDate
>
> You could directly use the DBpedia-lemon lexicalisation for Wikidata.
>
> The mappings can be downloaded with
>
> git clone https://github.com/dbpedia/extraction-framework ; cd core ;
> ../run download-mappin
<
hellm...@informatik.uni-leipzig.de> wrote:
> Hi Denny,
>
> On 18.05.2018 02:54, Denny Vrandečić wrote:
>
> Rob Speer wrote:
> > The result of this, by the way, is that commercial entities sell modified
> > versions of Wikidata with impunity. It undermines
Rob Speer wrote:
> The result of this, by the way, is that commercial entities sell modified
> versions of Wikidata with impunity. It undermines the terms of other
> resources such as DBPedia, which also contains facts extracted from
> Wikipedia and respects its Share-Alike terms. Why would anyone
Thanks!
On Mon, May 7, 2018 at 1:36 PM Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:
> Folks, I’m already in contact with John, there’s no need to contact him
> again :)
>
> Cheers, Lucas
>
> Am Mo., 7. Mai 2018 um 19:32 Uhr schrieb Denny Vrandečić
there.
>
> All the best,
>
> Sebastian
>
> On 04.05.2018 18:33, Maarten Dammers wrote:
>
> It almost feels like someone doesn’t want Wikidata in there? Maybe that
> website is maintained by DBpedia fans? Just thinking out loud here because
> DBpedia is very popular in the
I'm pretty sure that Wikidata is doing better than 90% of the current
bubbles in the diagram.
If they wanted to have Wikidata in the diagram, it would have been there
before it became too small to read. :)
On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
pfpschnei...@gmail.com> wrote:
>
That's great! I was having the same thought, but was thinking on top of the
SPARQL interface - but if it works on top of the API even better.
Thanks for that, this might be quite interesting. I really hope that it can
be integrated into Wikidata proper.
On Mon, Jan 1, 2018 at 9:12 AM Thomas
Scott,
The NC license clause is problematic in a number of jurisdictions. For
example, at least in Germany, as I remember from my law classes, it also
would definitively include not-for-profits, NGOs, and even say bloggers,
with or without ads on their sites. One must always be careful in the
And thanks for the use cases. This helps a lot with thinking about this.
On Thu, Aug 31, 2017, 16:31 Denny Vrandečić <vrande...@gmail.com> wrote:
> The reason why we save the actual value with more digits than the
> precision (and why we keep the precision as an explicit
The reason why we save the actual value with more digits than the precision
(and why we keep the precision as an explicit value at all) is because the
value could be entered and displayed either as decimal digits or in minutes
and seconds. So internally one would save 20' as 0.3, but the
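The round-trip issue described above can be illustrated with a small conversion (the function name is mine; the point is that 20 arcminutes is a repeating decimal, so the stored value needs more digits than the nominal precision in order to display as 20' again):

```python
def dms_to_decimal(degrees: int, minutes: int = 0, seconds: float = 0.0) -> float:
    """Convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600


# 20 arcminutes is 20/60 = 0.3333... degrees: a value that cannot be stored
# exactly at the nominal precision without losing the original 20' reading.
print(round(dms_to_decimal(0, 20), 6))  # 0.333333
```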
> > https://www.wikidata.org/wiki/Property:P2581. This property has been
> very
> > little used: http://tinyurl.com/y8npwsm5
> >
> > There might be a Wikimedia-Wordnet indirect link through BabelNet
> >
> > /Finn
> >
> >
> > On 08/15/2017 0
we have a BabelNet Wikidata property,
> https://www.wikidata.org/wiki/Property:P2581. This property has been
> very little used: http://tinyurl.com/y8npwsm5
>
> There might be a Wikimedia-Wordnet indirect link through BabelNet
>
> /Finn
>
>
> On 08/15/2017 07:22 PM, Denny
That's a great question, I have no idea what the answer will turn out to be.
Is there any current link between Wiktionary and WordNet? Or WordNet and
Wikipedia?
On Tue, Aug 15, 2017 at 10:14 AM wrote:
>
>
> I have proposed a Wordnet synset property here:
>
Chris,
thanks. That's cute but ultimately disappointing - I would have preferred
for it to take me to "other people with this death place", which would be
more interesting.
Ah well, thanks for answering,
Denny
On Mon, Jul 10, 2017 at 7:48 AM Chris Koerner wrote:
> Denny,
>
>
Aren't both ... uhm ... "use cases" supported by dbpedia proper anyway?
On Thu, May 4, 2017 at 3:40 AM Kingsley Idehen
wrote:
> On 5/3/17 3:37 PM, Nicholas Humfrey wrote:
> >
> > On 26/04/2017, 15:41, "Wikidata on behalf of Kingsley Idehen"
> >
Daniel, I agree, but isn't that what Multilingual Text requires? A language
code?
I.e. how does the current model plan to solve that?
I assume most of it is hidden behind mini-wizards like "Create a new
lexeme", which actually make sure the multitext language and the language
property are
So assume we enter a new Lexeme in Examplarian (which has a Q-Item), but
Examplarian has no language code for whatever reason. What language code
would they enter in the MultilingualTextValue?
On Mon, Apr 10, 2017 at 8:42 AM Daniel Kinzler
wrote:
> Tobias' comment
(and
> per
> http://scott-macleod.blogspot.com/2017/04/falco-peregrinus-smartphone-that-could.html
> ).
>
> Scott
>
>
>
> On Fri, Apr 7, 2017 at 10:13 AM, Daniel Kinzler <
> daniel.kinz...@wikimedia.de> wrote:
>
> Am 07.04.2017 um 01:34 schrieb De
On Thu, Apr 6, 2017, 16:16 Stas Malyshev wrote:
> Hi!
>
> > - use Q-Items instead of UserLanguageCodes for Multilingual texts (which
> > would be quite a migration)
>
> I foresee that might be a bit of a problem for external tools consuming
> this data - how they would
I agree with Peter here. Daniel's statement of "Anything that is a subclass
of X, and at the same an instance of Y, where Y is not "class", is
problematic." is simply too strong. The classical example is Harry the
eagle, and eagle being a species.
The following paper has a much more measured and
http://i0.kym-cdn.com/entries/icons/original/000/001/899/mission_accomplished.jpg
On Sat, Dec 31, 2016, 02:58 Lydia Pintscher
wrote:
> Folks,
>
> We're now officially mainstream ;-)
>
>
Hi all,
thanks for the two matrices and the input here. I am tending to again let
Daniel convince me about using multiple representations for the lemma and
the forms. Mostly because that's what's closest to Lemon, and I trust the
research and expertise within Lemon. Thank you Philipp for chiming
*not
On Tue, Nov 15, 2016, 10:06 Denny Vrandečić <vrande...@gmail.com> wrote:
> Do you make sure but to request the place of death or date of death on
> living people? I.e. can we filter certain properties?
>
> On Tue, Nov 15, 2016, 07:20 Simon Razniewski <sraz
Do you make sure but to request the place of death or date of death on
living people? I.e. can we filter certain properties?
On Tue, Nov 15, 2016, 07:20 Simon Razniewski wrote:
> On November 15, 2016, csara...@uni-koblenz.de wrote
>
> It would be very interesting to see the
Hi all,
I set up a vagrant environment to do some hacking on Wikibase, but the
wikidata role seems to create a client only? Is there a role for the server?
Cheers,
Denny
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
Do you have already a few ideas? Just for brainstorming.
On Sat, Nov 5, 2016, 05:26 Nazanin Kamali
wrote:
> Dear all,
>
> My name is Nazanin and I am a Wikipedia editor. I studied statistics in my
> undergraduate program and recently I have begun my master
Hi,
I am not questioning or criticizing, just curious - why was it decided to
implement lemmas as terms? I guess it is for code reuse purposes, but just
wanted to ask.
Cheers,
Denny
Is someone trying to train a generative language model on this mailing list?
On Thu, Oct 27, 2016 at 3:59 PM Ushen Kowlessar Bhin Bhinbaha Dur <
ushe...@gmail.com> wrote:
> Hi
>
> Add boolean axo prompt to built from word as key base word example wiki
> will be wik the denyfen works out the code
to
> hop on a call and learn more about your role in the project, and to share
> more about what we're working on.
>
> Will you have any availability this week?
>
> Warmly,
> Daniel
>
> On Friday, October 14, 2016, Denny Vrandečić <vrande...@gmail.com> wrote:
+1 (917) 975 1410
> @dbudell
>
> On Thu, Oct 13, 2016 at 11:50 PM, Denny Vrandečić <vrande...@gmail.com>
> wrote:
>
> Hi Daniel,
>
> good luck with the proposal! Did you take a look at Wikidata's proposal to
> support Wiktionary?
>
> https://www.wikidata.org
Hi Daniel,
good luck with the proposal! Did you take a look at Wikidata's proposal to
support Wiktionary?
https://www.wikidata.org/wiki/Wikidata:Wiktionary
Cheers,
Denny
On Thu, Oct 13, 2016 at 8:19 PM Daniel Bogre Udell
wrote:
> Hello, Wikidata community!
>
> My
As far as I can tell on https://www.wikidata.org/wiki/Q156578 VW indeed has
three different stock symbols.
On Thu, Oct 13, 2016 at 1:34 PM Hampton Snowball
wrote:
> Hi Stas - Thank you so much for your response! It seems the difference
> between 1 and 2 is that for
I don't know if Jakob Voss is on this list, but he had a recent paper on
using Wikidata (not Wikibase) for NKOS terminology.
On Wed, Oct 12, 2016 at 1:32 PM Gregor Hagedorn <
gregor.haged...@mfn-berlin.de> wrote:
> Here some old pointers to our TDWG - biodiversity - ViBRANT work from 2013:
>
ear Denny,
>
> Thanks for the pointers, some of them are surprisingly new to me. However,
> I was wondering if there are articles that especially compared Wikibase and
> Semantic MediaWiki. Any idea?
>
> Many thanks.
>
> Claudia
>
> > On 12 Oct 2016, at 22:05, D
Properties don't have an identity besides their ID - so you could in
test.wd just rename the subclass of property and reclaim that name.
On Tue, Oct 11, 2016 at 1:19 PM Loic Dachary wrote:
> Hi,
>
> I'd like to run integration tests using test.wikidata.org[1] and the
>
Thank you!
On Mon, Oct 10, 2016 at 1:10 PM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
> On Mon, Oct 10, 2016 at 7:43 PM, Denny Vrandečić <vrande...@gmail.com>
> wrote:
> > Thanks!
> >
> > How do I read the WDQS report that is linked? I only
Thanks!
How do I read the WDQS report that is linked? I only see the link to the
rmd source.
https://github.com/wikimedia-research/Discovery-WDQS-Adhoc-Usage/blob/master/Report.Rmd
On Mon, Oct 10, 2016 at 9:13 AM Léa Lacroix
wrote:
>
>
> *Hello all,Here's your quick
Wikidata allows setting a coordinate system - it is called a globe or
coordinate system - on every coordinate. This would be the natural place to
specify whether it is WGS84 or GDA94 or another system. Most of them are
Q2, which, as per data model, is indeed WGS84.
DBpedia has the category data in RDF. The last release of DBpedia also
included Wikidata - it should be possible to query the combined dataset
there.
On Thu, Oct 6, 2016, 19:47 Thad Guidry wrote:
> Cool. Yeap, you got the idea now.
>
> OK, We'll stay tuned for a future
Markus, do you have access to the corresponding HTTP request logs? The
fields there might be helpful (although I might be overtly optimistic about
it)
On Fri, Sep 30, 2016 at 11:38 AM Yuri Astrakhan
wrote:
> I guess I qualify for #2 several times:
> * The & support
Markus, it really depends on what you mean with "a list of all Wikimedia
languages". That is why you get different numbers.
Usually, you will have a use case for this list, and depending on that use
case you should select the languages you really care about. Besides looking
good in some marketing
Uhm, sorry. I meant: is this what you look for?
It's based on the query by Jérémie a few mails back.
http://tinyurl.com/h2zky8p
On Thu, Sep 29, 2016 at 1:13 PM Denny Vrandečić <vrande...@gmail.com> wrote:
> Is that what yu look for?
>
> On Thu, Sep 29, 2016 at 10:56 AM M
ve the same distribution shape, but are overall
>> slightly shorter.
>>
>> Best,
>> Sebastian
>>
>> On Sun, Sep 18, 2016 at 9:19 PM, Denny Vrandečić <vrande...@gmail.com>
>> wrote:
>> > Can you figure out what a good limit would be for these two use ca
And just to point out - even though there are no plans to accommodate the
superstructures in the data model directly, it should be noted that the
current data model already is flexible to have it, i.e. if the community so
wishes they can create Lexemes which represent the "root" of a word like
for, and maybe yours is already
sufficiently close to optimal.
On Fri, Sep 16, 2016 at 11:00 AM Daniel Kinzler <daniel.kinz...@wikimedia.de>
wrote:
> Am 16.09.2016 um 19:41 schrieb Denny Vrandečić:
> > Yes, there should be some connection between items and lexemes, but I am
> st
(in particular because I expect that character limit to have to change for
Wiktionary in Wikidata)
On Fri, Sep 16, 2016 at 10:38 AM Denny Vrandečić <vrande...@gmail.com>
wrote:
> Markus' description of the decision for the limit corresponds with mine. I
> also think that this
Markus' description of the decision for the limit corresponds with mine. I
also think that this decision can be revisited. I would still advise
caution, due to technical issues, but I am sure that the development team
will make a well-informed decision on this. It would be sad if valid
\o/
On Tue, Sep 13, 2016 at 6:18 AM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
> Hey everyone :)
>
> Wiktionary is our third-largest sister project, both in term of active
> editors and readers. It is a unique resource, with the goal to provide
> a dictionary for every language, in
Yes, Violeta, that would have saved me quite some time
Thanks, Thad! I would, but Pasleim on the Wiki-Discussion found a source
that makes it clear that they are indeed one and the same person:
https://www.wikidata.org/wiki/Talk:Q937954
http://www.tv.com/people/ed-gilbert/biography/
I think
The primary sources tool can add references to existing claims. For some
reason, it started re-adding the claim yesterday instead.
Did anything change on the Wikidata-side in the last few days that could be
causing this change in behavior? Or should we look harder on the Primary
sources side?
(and to make it clear, it is unclear whether this is an error due to
DBpedia or due to the companies extraction framework, I was not diving into
the data)
On Wed, Mar 2, 2016 at 1:59 PM Denny Vrandečić <vrande...@gmail.com> wrote:
> Depends how good the DBpedia data really is - a