e cases, and try our best to
help.
I hope that the page now is in a state Dr Lemaire is content with.
I want to extend her an apology, although I am not sure she will read this
message.
Best regards,
Denny Vrandečić
On Fri, Jul 28, 2023 at 1:41 PM Vi to wrote:
> This mailing list
I think it would be great to have a mobile app for Wikidata.
On Thu, Feb 3, 2022 at 10:59 AM geislemx
wrote:
> Hey all,
>
> I hope this mail finds you well in these trying times.
> Over the last month I invested some time and put a little project
> together for personal purpose. Long story short
,
Denny
___
Wikidata mailing list -- wikidata@lists.wikimedia.org
To unsubscribe send an email to wikidata-le...@lists.wikimedia.org
really
should write that book!
Cheers,
Denny
On Sat, Jul 10, 2021 at 3:00 PM Thad Guidry wrote:
> *Tobi* - That blog post 3 is very helpful. It shows that Denny and I
> think alike and agree on everything. :-) His dislike for strong
> classification.
> Which is part of my bas
. Participating communities will
hopefully find that this project leads to long-term growth in Wikipedia and
Wiktionary in and about their language.
Lydia and Denny would like to choose the same focus languages for both of
the teams, as it is beneficial for both projects to have this aligned.
We
A short blogpost by Lydia and me on the diff blog:
https://diff.wikimedia.org/2020/10/06/wikidata-reaches-q1/
Congratulations to the community, congratulations to the project!
We have released lexical masks as ShEx files before, schemata for
lexicographic forms that can be used to validate whether the data is
complete.
We saw that it was quite challenging to turn these ShEx files into forms
for entering the data, such as Lucas Werkmeister’s Lexeme Forms. So we
adapted
There's also this paper, which might be interesting for what you are doing:
https://www.wikidata.org/wiki/Q73506477
On Fri, Jun 19, 2020 at 9:42 AM Denny Vrandečić wrote:
> The dump with the (almost) complete edit history can be found here:
>
> https://dumps.wikimedia.org/wikidatawiki
The dump with the (almost) complete edit history can be found here:
https://dumps.wikimedia.org/wikidatawiki/20200501/
Search for "edit history" on that page. There are two versions, in two
different compression formats.
It is rather big.
On Fri, Jun 19, 2020 at 7:52 AM Elisavet Koutsiana <
Did you see this?
https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/
On Wed, Jun 10, 2020, 12:51 Leandro Tabares Martín <
leandro.tabaresmar...@uhasselt.be> wrote:
> Dear all,
>
> I'm loading the whole wikidata dataset into Blazegraph using a High
> Performance
Did you see this?
https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/
On Sun, Jun 7, 2020 at 3:13 AM Leandro Tabares Martín <
leandro.tabaresmar...@uhasselt.be> wrote:
> Dear all,
>
> I am a researcher from Hasselt University performing research on Query
>
Welcome to Wikidata! Thanks for taking on such an important task for
outreach!
On Tue, May 19, 2020 at 3:02 AM Dan Shick wrote:
> Hi all!
>
> I’m Dan Shick ( https://w.wiki/RDs ), the new technical writer at
> Wikimedia Deutschland. My goals are to discover, improve, unify and
> round out
of your voices and support signatures, so
that I can go to the Board and tell them look at this :) So please sign
here:
https://meta.wikimedia.org/wiki/Talk:Wikilambda
Thank you,
Denny
, and that is more relevant to have automated calls be
able to do that format selection, which the endpoint provides.
Thanks,
Denny
On Fri, May 1, 2020 at 9:59 AM Kingsley Idehen
wrote:
> On 5/1/20 11:53 AM, Isaac Johnson wrote:
>
> If the challenge is downloading large files, you can
CONSTRUCT would be best, but I am not sure that there's any system that
allows you to do that.
What I would do is get the truthy dump in ntriples, and filter out all
lines with the respective properties. The Wikidata Toolkit allows you to do
that and more.
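A minimal sketch of that dump-filtering approach, assuming the truthy dump uses the wdt: (http://www.wikidata.org/prop/direct/) predicate namespace; the sample triples and the property list are purely illustrative:

```python
# Hedged sketch: keep only truthy N-Triples lines whose predicate is one of
# the given Wikidata properties. Sample data below is illustrative only.

def filter_by_properties(lines, property_ids):
    """Yield triples whose predicate is a wdt: property from property_ids."""
    wanted = {
        f"<http://www.wikidata.org/prop/direct/{pid}>" for pid in property_ids
    }
    for line in lines:
        parts = line.split(" ", 2)  # subject, predicate, rest of the triple
        if len(parts) == 3 and parts[1] in wanted:
            yield line

sample = [
    "<http://www.wikidata.org/entity/Q42> <http://www.wikidata.org/prop/direct/P31> <http://www.wikidata.org/entity/Q5> .",
    "<http://www.wikidata.org/entity/Q42> <http://www.wikidata.org/prop/direct/P19> <http://www.wikidata.org/entity/Q350> .",
]
kept = list(filter_by_properties(sample, ["P31"]))
```

In practice one would stream the (compressed) dump line by line rather than hold it in memory.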
If you're interested in all new created schemas, you can actually follow a
feed for that.
https://www.wikidata.org/wiki/Special:RecentChanges?hidebots=1=1=1=1=1=640=50=14__likelybad_color=c4__verylikelybad_color=c5=2
There's a link to the atom feed on the left hand toolbar.
Cheers,
Denny
Mohammed,
welcome! I am very happy to see you join in this important role.
Thank you,
Denny
On Tue, Apr 21, 2020 at 9:11 AM Léa Lacroix
wrote:
> Welcome onboard Mohammed!
> I'm glad that you're here and to have your support in order to address the
> requests from the
to see which biases are intentional in a language edition, and
which ones are not, in the hope that we can then tackle certain biases in
some language editions in a more targeted way.
Thank you for your kind words, and I am also very excited to get this thing
moving! :)
Stay safe,
Denny
On Fri
hope it will happen on Meta or on Wikimedia-l, but I also wanted to give a
ping here.
Stay safe!
Denny
[1] https://lists.wikimedia.org/pipermail/wikimedia-l/2020-April/094621.html
[2]
https://meta.wikimedia.org/wiki/Wikimedia_Forum#Proposal_towards_a_multilingual_Wikipedia_and_a_new_Wikipedia_project
It is allowed (and in fact encouraged) to embed Wikidata query results in
your site. That was one of the original use cases.
Also, I wouldn't worry tremendously about the load on the Query server from
that, as this is particularly well cacheable and cached. To the best of my
knowledge, embedded
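For context, results can be embedded via the query service's embed view; here is a sketch of building such a URL (the embed.html#<urlencoded-query> fragment format is my assumption, based on the "Embed" link in the WDQS UI, and the query itself is illustrative):

```python
from urllib.parse import quote

def wdqs_embed_url(sparql: str) -> str:
    # The fragment carries the URL-encoded SPARQL query.
    return "https://query.wikidata.org/embed.html#" + quote(sparql)

url = wdqs_embed_url("SELECT ?item WHERE { ?item wdt:P31 wd:Q146 } LIMIT 10")
```

The resulting URL can then be used as the `src` of an iframe on the embedding site.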
Yes, that sounds good to me.
Either create an item for that (preferred) or link to the URL directly.
On Fri, Mar 13, 2020, 04:06 Osma Suominen wrote:
> Thanks Denny.
>
> Do you have a practical suggestion how to do this? There's no obvious
> source URL to refer to currently. What
When we were uploading the links to Freebase, we also added references for
these. And since you've gone through all this work (thank you for that!)
verifying the links, I think it would be fair to add a respective reference.
On Tue, Mar 10, 2020 at 9:27 AM Osma Suominen
wrote:
> Hi,
>
> I'm
That's awesome, thanks!
On Tue, Feb 11, 2020 at 3:46 AM Léa Lacroix
wrote:
> Hey,
>
> On Tue, 11 Feb 2020 at 12:30, Nicolas VIGNERON
> wrote:
>
>> Great !
>>
>> Two small perfectionist questions:
>> - is it finished or can we maybe go further?
>>
> It is finished for now, but we should continue
Oh, wow, I just tried that out too, and indeed, it used to be possible to
link to the L-number very quickly if I remember correctly, but now this is
not the case anymore.
Weirdly enough, the SPARQL endpoint got updated with the new Lexeme very
quickly. So I think these two things are not related.
This should probably help:
https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/
On Tue, Nov 19, 2019 at 8:27 AM Peter F. Patel-Schneider <
pfpschnei...@gmail.com> wrote:
> Is this the recommended way to set up a local copy of Wikidata? (If not,
> what
> is the
:
> On Fri, Nov 15, 2019 at 12:49 AM Denny Vrandečić
> wrote:
>
>> Just wondering, is there a way to let volunteers look into the issue? (I
>> guess no because it would potentially give access to the query stream, but
>> maybe the answer is more optimistic)
>>
>
Just wondering, is there a way to let volunteers look into the issue? (I
guess no because it would potentially give access to the query stream, but
maybe the answer is more optimistic)
On Thu, Nov 14, 2019 at 2:39 PM Thad Guidry wrote:
> In the enterprise, most folks use either Java Mission
Denny added a comment.
In Chrome. But it does not happen on the first gloss (since Remove is grayed
out on the first gloss), only on the second and other additional glosses.
The part with the "Remove" is kinda OK. I would prefer it would jump to
Remove only if I left the g
Denny created this task.
Denny added projects: Wikidata, Lexicographical data.
TASK DESCRIPTION
When creating a Sense in a Lexeme, I choose the language, tab to the field,
enter the gloss, tab to Remove (which should preferably tab to Add instead),
tab again to Add, press return, it adds
is the core project to weave value-adding workflows on
top of Wikidata or other datasets from the linked open data cloud together.
But that's just a proposal.
Cheers,
Denny
On Sat, Sep 28, 2019 at 12:28 AM wrote:
> Hi Gerard,
>
> I was not trying to judge here. I was just saying
Denny added a comment.
Oh, wow, it also happens for Lexeme:thanks
I am super curious what is happening here!
TASK DETAIL
https://phabricator.wikimedia.org/T233763
EMAIL PREFERENCES
https://phabricator.wikimedia.org/settings/panel/emailpreferences/
To: Denny
Cc: Aklapper, Denny
Denny created this task.
Denny added a project: Wikidata.
Restricted Application added a subscriber: Aklapper.
TASK DESCRIPTION
When I search for Lexeme:Danke from the Search box in Wikidata (upper right
corner) I get the message
[XYqWIwpAADoAAJk2AOEAAABL] 2019-09-24 22:18:11: Fatal
Thanks everyone for this warm welcome (back)!
On Fri, Sep 20, 2019, 10:38 Denny Vrandečić wrote:
> Off to my Todo list :)
>
> On Thu, Sep 19, 2019 at 10:46 AM Andy Mabbett
> wrote:
>
>> On Thu, 19 Sep 2019 at 17:56, Denny Vrandečić
>> wrote:
>>
>>
Sep 20, 2019 at 1:32 PM Denny Vrandečić
> wrote:
>
>> Yes, you're touching exactly on the problems I had during the evaluation
>> - I couldn't even figure out what DBpedia is. Thanks, your help will be
>> very much appreciated.
>>
>> OK, I will send a link the week
Off to my Todo list :)
On Thu, Sep 19, 2019 at 10:46 AM Andy Mabbett
wrote:
> On Thu, 19 Sep 2019 at 17:56, Denny Vrandečić
> wrote:
>
> > I am moving to a new role in Google Research, akin to a Wikimedian in
> > Residence
>
> That's marvelous; congratulatio
), not the Open
Source office. But I plan to publish more code going forward, so I will
continue to work with them.
On Thu, Sep 19, 2019 at 10:35 AM Federico Leva (Nemo)
wrote:
> Denny Vrandečić, 19/09/19 19:56:
> > I had used my 20% time to support such teams. The requests became more
> > fr
Knowledge Graph Talk thing. I was a bit
> grumpy, because I thought I wasted a lot of time on the Talk page that
> could have been invested in making the article better (WP:BE_BOLD style),
> but now I think, it might have been my own mistake. So apologies for
> lashing out there.
>
>
istakes, and stuff I missed in the evaluation. But
you know what would help?
You.
My suggestion is that I publish my current draft, and then you and I work
together on it, publicly, in the open, until we reach a state we both
consider correct enough for publication.
What do you think?
Cheer
I think if we wanted to do this with a bot, we should go through the usual
bot approval process, and discuss this on wiki?
But in general, as said, adding unknown value to people who are very, very
likely to be dead sounds like a good idea (
https://www.wikidata.org/wiki/Q28 )
On Thu, Sep 19,
"unknown value" was made for exactly that use case - a person that has
died, but we don't know when.
I would just add that on the "date of death" property.
On Sat, Sep 7, 2019 at 2:25 AM Thomas Douillard
wrote:
> We have already a qualifier for this kind of stuffs, I think : P887
>
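In the Wikibase JSON model, the "unknown value" discussed above is a main snak with snaktype "somevalue" and no datavalue at all. A small sketch (P570, date of death, as in the thread; the helper name is mine):

```python
# Build an "unknown value" statement: snaktype "somevalue", no datavalue.
def unknown_value_statement(property_id: str) -> dict:
    return {
        "type": "statement",
        "rank": "normal",
        "mainsnak": {"snaktype": "somevalue", "property": property_id},
    }

claim = unknown_value_statement("P570")  # P570 = date of death
```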
looking forward to see them get out of the gates!
I am looking forward to hearing your ideas and suggestions, and to continue
contributing to the Wikimedia goals.
Cheers,
Denny
P.S.: Which also means, incidentally, that my 20% time is opening for new
shenanigans [3].
[1] https
Denny added a comment.
I tried Lucas's suggestion, and it fixes the cause of the bug. When there
are two actions, it still works fine (but the relative positioning of the
action icons changes slightly - they squeeze together).
Functionality wise, his suggested fix works.
Here's
Denny created this task.
Denny added projects: Wikidata Mobile, Wikidata-Termbox, Wikidata, Mobile.
Restricted Application added a subscriber: Aklapper.
TASK DESCRIPTION
When using Chrome on my Pixel 3a, since the new Termbox has launched (which,
by the way, is awesome), it is hard
Hi Gerard,
thank you for your comments. I agree with them - the generated text
shouldn't be stored in the local Wikipedias, but merely cached. I have
updated the text accordingly to make it explicit.
Thanks!
Cheers,
Denny
On Sat, Jul 6, 2019 at 11:09 PM Gerard Meijssen
wrote:
> Hoi,
>
That is really cool! Thanks and congratulations! I will certainly play with
it.
Is it in some way synced or is it a static snapshot?
On Tue, Aug 13, 2019 at 4:10 PM Kingsley Idehen
wrote:
> Hi Everyone,
>
> A little FYI.
>
> We have loaded Wikidata into a Virtuoso instance accessible via
ich have all the predicate defined.
>
> On Mon, 12 Aug 2019 at 22:23, Denny Vrandečić wrote:
>
>> Maybe you mean this file:
>>
>>
>> https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+/master/docs/ontology.owl
>>
>>
>>
Maybe you mean this file:
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+/master/docs/ontology.owl
Cheers,
Denny
On Sun, Aug 11, 2019 at 7:20 AM Sebastian Hellmann <
hellm...@informatik.uni-leipzig.de> wrote:
> Hi Ali, all,
>
> we have this d
Thank you for the message, Lea, this seems like a good step.
On Thu, Aug 8, 2019 at 8:19 AM Gerard Meijssen
wrote:
> Hoi,
> Easy, my user interface is English in all of them.
> Thanks,
> GerardN
>
> On Thu, 8 Aug 2019 at 16:39, Imre Samu wrote:
>
>> *> Suggestion* display the Q number in
tool’s templates API
> <https://www.wikidata.org/wiki/Wikidata:Wikidata_Lexeme_Forms#Templates_API>
> rather than the wiki page itself: transcribing the templates into
> structured form takes some time, there’s no need for someone else to do it
> again :)
>
> Cheers,
>
://www.wikidata.org/wiki/Wikidata:Wikidata_Lexeme_Forms/German
(If Danish and German were the same language, which they are not,
obviously, but this is to exemplify the idea).
If not, does anyone want to work / cooperate on that?
Cheers,
Denny
Worked perfectly well! Thanks!
On Wed, Jul 17, 2019 at 1:31 PM Denny Vrandečić
wrote:
> Thank you Antonin, that's reassuring, I'll try that!
>
> Cheers,
> Denny
>
> On Wed, Jul 17, 2019 at 12:48 PM Antonin Delpeuch (lists) <
> li...@antonin.delpeuch.eu> wrot
Thank you Antonin, that's reassuring, I'll try that!
Cheers,
Denny
On Wed, Jul 17, 2019 at 12:48 PM Antonin Delpeuch (lists) <
li...@antonin.delpeuch.eu> wrote:
> Hi Denny,
>
> You are correct that wbsetclaim can create a claim with a reference. You
> just need to pa
es the API allow the creation of a claim with a reference in a single
step? And if yes, how?
2. Does pyWikibot allow it? And if yes, how?
Cheers,
Denny
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailma
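As a sketch of the single-step approach discussed in this thread: wbsetclaim accepts a full claim JSON, so the reference can be included in the same call. The helper below only builds that JSON (the serialization shape is as I understand the Wikibase data model; the function name and example values are mine):

```python
# Build a claim JSON with an item-valued main snak and one reference,
# suitable (as I understand it) for passing to wbsetclaim in one step.
def claim_with_reference(prop, value_qid, ref_prop, ref_qid):
    def item_snak(p, qid):
        return {
            "snaktype": "value",
            "property": p,
            "datatype": "wikibase-item",
            "datavalue": {
                "type": "wikibase-entityid",
                "value": {"entity-type": "item", "numeric-id": int(qid[1:])},
            },
        }

    return {
        "type": "statement",
        "rank": "normal",
        "mainsnak": item_snak(prop, value_qid),
        "references": [{"snaks": {ref_prop: [item_snak(ref_prop, ref_qid)]}}],
    }

claim = claim_with_reference("P31", "Q5", "P248", "Q36578")  # P248 = stated in
```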
pubpub.org/pub/vyf7ksah
Cheers,
Denny
AllPages=202
> e.g. http://mappings.dbpedia.org/index.php/OntologyProperty:BirthDate
>
> You could directly use the DBpedia-lemon lexicalisation for Wikidata.
>
> The mappings can be downloaded with
>
> git clone https://github.com/dbpedia/extraction-framework ; cd core ;
> ../run download-mappin
Denny added a comment.
That looks neat.
TASK DETAIL
https://phabricator.wikimedia.org/T201000
Denny added a comment.
"what are the benefit for the Wikimedia community of using exclusively CC-0 for its single Wikibase instance usable in the rest of its environment?"
This question is, I think, less suitable for a lawyer. I think this is a very interesting question, but I'd rather
or images. But for data
anything else is just weird and will bite us in the long run more than we
might ever benefit.
So, just to say it again: if the Gutachten (legal opinion) you mentioned could be made
available, that would be very very awesome!
Thank you, Denny
On Thu, May 17, 2018, 23:06 Sebastian Hellmann
Denny added a comment.
@Mateusz_Konieczny I like R-OSM-1 too. I would go now for these two questions.
I'd really like to have @Psychoslave chime in, as he was the one opening this bug and certainly being the most vocal on this topic, as far as I have seen, so I will leave this open for a few days
Denny added a comment.
@Rspeer
My previous suggestion to @Psychoslave was
P) "Can you comment on the practice of extracting data from Wikipedia articles, which are published under CC-BY-SA, and storing the results in Wikidata, where they are published under CC-0?"
I guess th
Denny added a comment.
@MisterSynergy yes, I agree, it would seriously weaken Wikidata. Nevertheless it is good to resolve legal uncertainties as far as reasonable.
Regarding Gnom1 - well, he did write the previous, official answer by Wikilegal, which is why I consider that a great offer. But I
Denny added a comment.
@Micru I agree with @Cirdan that this would be a rather worrying way to deal with the situation. Also, as @Nemo_bis points out, it really couldn't be just the communities doing so. In my understanding, it would need an update to the CC license itself, which would need
linked-data.html
> >
> > On 11 May 2018 at 23:10, Rob Speer <r...@luminoso.com> wrote:
> >
> > > Wow, thanks for the heads up. When I was getting upset about projects
> > that
> > > change the license on Wikimedia content and commercialize it, I
Denny added a comment.
@Rspeer If I link an article from the German Wikipedia to the English Wikipedia by adding an interwiki link on the German Wikipedia, and then an interwiki bot makes this link be reciprocal by adding the interwiki link on the English Wikipedia, there is no attribution to me
Denny added a comment.
I was reading the article you linked to - https://de.wikipedia.org/wiki/Sch%C3%B6pfungsh%C3%B6he#Sch%C3%B6pfungsh%C3%B6he_seit_2013 - and nothing there lets me believe that the list of interwiki links would have sufficient "Schöpfungshöhe".
Denny added a comment.
@Rspeer
Copyright has to be about some concrete expression.
Are you claiming that the interwiki links that used to be in Wikipedia articles until five years ago should have had copyright protection? Their concrete expression was [[en:London]] [[fr:Londres]] [[hr:London
Denny added a comment.
@Nemo_bis : good point. I wouldn't know what a good example is, though, maybe someone else can come up with something.
TASK DETAIL
https://phabricator.wikimedia.org/T193728
Denny added a comment.
@Rspeer regarding the ontology: the ontology of Wikidata is genuinely unique and not copied from any Wikipedia project, or any other project. It has been created on Wikidata.
Regarding the translations: we are talking about the labels of things in different languages
Denny added a comment.
@Nemo_bis thanks, I agree with your point a lot.
But regarding your question - just because there is a database which happens to be reproducible should not trigger any rights issues.
To give an example: it is easy to imagine a company that sells the list of all countries
Denny added a comment.
@Psychoslave sorry to disagree on the questions, but are we in any disagreement on these three questions?
We should not allow the (significant) import of data from databases which are licensed under a license incompatible with CC-0.
We should enforce that.
We should
Denny added a comment.
@Gnom1 - yes, anything that you can contribute would be awesome.
Unfortunately, this request here is all over the place, ranging from the question whether it is legally permissible to have a statement reference Wikipedia to the way inline images are displayed, so it might
Denny added a comment.
Re Psychoslave:
Having a statement in Wikidata with a reference, where the referenced work is not published under CC-0, is entirely fine in my understanding.
As a comparison: Wikipedia has plenty of references, where the referenced work is not published under CC-BY-SA
Denny added a comment.
Re Pintoch:
No, I was seriously not aware that we are uploading datasets
I think it is fair to say that this is not exactly an isolated case (but I am surprised that you seem (to pretend) not to know? Maybe for legal reasons?)
No, no legal reasons, I really didn't know
Thanks!
On Mon, May 7, 2018 at 1:36 PM Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:
> Folks, I’m already in contact with John, there’s no need to contact him
> again :)
>
> Cheers, Lucas
>
> Am Mo., 7. Mai 2018 um 19:32 Uhr schrieb Denny Vrandečić
adding
John to this thread - although I know he already knows about this request -
and I am asking officially to enter Wikidata into the LOD diagram.
Let's keep it all open, and see where it goes from here.
Cheers,
Denny
On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann <
hellm...@informatik.uni-le
Denny added a comment.
The property is about adding the RNSR ID. That's fine.
On the data import hub page you linked, I don't see a mention of the license. Nor on your talk page. I find this upload problematic.
TASK DETAIL
https://phabricator.wikimedia.org/T193728
Denny added a comment.
I don't know the Open License. Given what I understand using automatic translation, the license requires attribution.
So if the RNSR is a database in the sense of the Database Right directive, and if the RNSR is licensed under the Open License, and if the Open License
I'm pretty sure that Wikidata is doing better than 90% of the current
bubbles in the diagram.
If they wanted to have Wikidata in the diagram it would have been there
before it was too small to read it. :)
On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
pfpschnei...@gmail.com> wrote:
>
Denny added a comment.
@Psychoslave, I am not sure I entirely follow.
You said "there are contributors of Wikidata that do make massive imports of external data banks, regardless of the corresponding terms of use."
Then you refer to an email on a closed list that I cannot acces
Denny added a comment.
You state that "there are contributors of Wikidata that do make massive imports of external data banks, regardless of the corresponding terms of use."
Can you point to such imports?
In my opinion, we should treat such license issues on Wikidata as seriou
That's great! I was having the same thought, but was thinking on top of the
SPARQL interface - but if it works on top of the API even better.
Thanks for that, this might be quite interesting. I really hope that it can
be integrated into Wikidata proper.
On Mon, Jan 1, 2018 at 9:12 AM Thomas
in the
choice of a license in order to avoid unintended consequences.
Just food for thought
Denny
On Thu, Nov 30, 2017, 20:51 John Erling Blad <jeb...@gmail.com> wrote:
> My reference was to in-place discussions at WMDE, not the open meetings
> with Markus. Each week we had an open demo
And thanks for the use cases. This helps a lot with thinking about this.
On Thu, Aug 31, 2017, 16:31 Denny Vrandečić <vrande...@gmail.com> wrote:
> The reason why we save the actual value with more digits than the
> precision (and why we keep the precision as an explicit
The reason why we save the actual value with more digits than the precision
(and why we keep the precision as an explicit value at all) is because the
value could be entered and displayed either as decimal digits or in minutes
and seconds. So internally one would save 20' as 0.3, but the
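The decimal-vs-minutes round trip described above can be illustrated like this (a simplified sketch, not the actual Wikibase internals, which store the value and its precision as separate explicit fields):

```python
# Convert between decimal degrees and degrees/minutes. Illustrative only.
def dms_to_decimal(degrees, minutes=0.0, seconds=0.0):
    return degrees + minutes / 60.0 + seconds / 3600.0

def decimal_to_dm(value):
    d = int(value)
    m = (value - d) * 60.0
    return d, m
```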
> > https://www.wikidata.org/wiki/Property:P2581. This property has been
> very
> > little used: http://tinyurl.com/y8npwsm5
> >
> > There might be a Wikimedia-Wordnet indirect link through BabelNet
> >
> > /Finn
> >
> >
> > On 08/15/2017 0
we have a BabelNet Wikidata property,
> https://www.wikidata.org/wiki/Property:P2581. This property has been
> very little used: http://tinyurl.com/y8npwsm5
>
> There might be a Wikimedia-Wordnet indirect link through BabelNet
>
> /Finn
>
>
> On 08/15/2017 07:22 PM, Denny
That's a great question, I have no idea what the answer will turn out to be.
Is there any current link between Wiktionary and WordNet? Or WordNet and
Wikipedia?
On Tue, Aug 15, 2017 at 10:14 AM wrote:
>
>
> I have proposed a Wordnet synset property here:
>
Chris,
thanks. That's cute but ultimately disappointing - I would have preferred
for it to take me to "other people with this death place", which would be
more interesting.
Ah well, thanks for answering,
Denny
On Mon, Jul 10, 2017 at 7:48 AM Chris Koerner <nob...@gmail.com>
Aren't both ... uhm ... "use cases" supported by dbpedia proper anyway?
On Thu, May 4, 2017 at 3:40 AM Kingsley Idehen
wrote:
> On 5/3/17 3:37 PM, Nicholas Humfrey wrote:
> >
> > On 26/04/2017, 15:41, "Wikidata on behalf of Kingsley Idehen"
> >
Daniel, I agree, but isn't that what Multilingual Text requires? A language
code?
I.e. how does the current model plan to solve that?
I assume most of it is hidden behind mini-wizards like "Create a new
lexeme", which actually make sure the multitext language and the language
property are
So assume we enter a new Lexeme in Examplarian (which has a Q-Item), but
Examplarian has no language code for whatever reason. What language code
would they enter in the MultilingualTextValue?
On Mon, Apr 10, 2017 at 8:42 AM Daniel Kinzler
wrote:
> Tobias' comment
Scott,
I assume you realized that the article by Norvig you cited was rather
intentionally published on April 1st.
Cheers,
Denny
On Fri, Apr 7, 2017 at 11:04 AM Scott MacLeod <
worlduniversityandsch...@gmail.com> wrote:
> I tried to see how the ISO codes and IANA language subtag
On Thu, Apr 6, 2017, 16:16 Stas Malyshev wrote:
> Hi!
>
> > - use Q-Items instead of UserLanguageCodes for Multilingual texts (which
> > would be quite a migration)
>
> I foresee that might be a bit of a problem for external tools consuming
> this data - how they would
I agree with Peter here. Daniel's statement of "Anything that is a subclass
of X, and at the same an instance of Y, where Y is not "class", is
problematic." is simply too strong. The classical example is Harry the
eagle, and eagle being a species.
The following paper has a much more measured and
http://i0.kym-cdn.com/entries/icons/original/000/001/899/mission_accomplished.jpg
On Sat, Dec 31, 2016, 02:58 Lydia Pintscher
wrote:
> Folks,
>
> We're now officially mainstream ;-)
>
>
and a better
scientifically supported data model outweigh this.
I hope that makes sense,
Giving thanks,
Denny
On Tue, Nov 22, 2016 at 3:28 AM Daniel Kinzler <daniel.kinz...@wikimedia.de>
wrote:
> Am 22.11.2016 um 10:19 schrieb David Cuenca Tudela:
> >> There are many many w
*not
On Tue, Nov 15, 2016, 10:06 Denny Vrandečić <vrande...@gmail.com> wrote:
> Do you make sure but to request the place of death or date of death on
> living people? I.e. can we filter certain properties?
>
> On Tue, Nov 15, 2016, 07:20 Simon Razniewski <sraz
Do you make sure but to request the place of death or date of death on
living people? I.e. can we filter certain properties?
On Tue, Nov 15, 2016, 07:20 Simon Razniewski wrote:
> On November 15, 2016, csara...@uni-koblenz.de wrote
>
> It would be very interesting to see the
Hi all,
I set up a vagrant environment to do some hacking on Wikibase, but the
wikidata role seems to create a client only? Is there a role for the server?
Cheers,
Denny