Re: [Wikidata-l] [Wikisource-l] next sister project: Wikisource

2013-11-05 Thread Andrea Zanni

 For the author pages it is quite straight-forward. For the bibliographic
 metadata the easiest would be to connect wikidata items with the Book:
 page generated by the (planned) Book Manager extension. The Book: page is
 supposed to provide the book structure and act as a metadata hub for both
 books with scans (those with Index: page) and books without scans (there
 was no solution for those yet)
  Project page: https://meta.wikimedia.org/wiki/Book_management
 Bugzilla page: https://bugzilla.wikimedia.org/show_bug.cgi?id=15071
 Example:
 http://tools.wmflabs.org/bookmanagerv2/mediawiki/index.php?title=Book:The_Interpretation_of_Dreams&action=edit


 Problem: the extension is not finished yet and neither Molly nor Raylton
 have time to keep working on it. Some bugs are still open and the fields in
 the template would need to be mapped to Wikidata properties. All this is not
 relevant for phase 1 (if it is done only for books), but it will
 become relevant for phase 2.

Excellent news! :)

 Is there anyone that could volunteer as an OPW mentor to help a potential
 student to finish this project?


The page looks nice. It's a pity it's incomplete.

I can volunteer for the metadata/librarian part (the metadata structure
should be Dublin Core-compliant, and as in DC every field should be optional
and repeatable). Moreover, we would desperately need to import and export
metadata. Tpt made the Index: page OAI-PMH compliant; is this one too? Of
course, these fields should be integrated with WD properties :-)

We still have this draft mapping:
https://docs.google.com/spreadsheet/ccc?key=0AlPNcNlN2oqvdFQyR2F5YmhrMWpXaUFkWndQWUZyemc#gid=0
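
To make that concrete, here is a rough sketch in Python (purely illustrative:
the Dublin Core elements are the standard ones, P50/P123/P577/P407 are today's
Wikidata properties for author, publisher, publication date and language, and
the sample values just echo the Interpretation of Dreams example above) of how
an all-optional, repeatable DC record could be mapped onto statements:

    # Illustrative only: every DC field is optional and repeatable, so the
    # record holds lists; the mapping table pairs DC elements with the
    # corresponding Wikidata properties.
    DC_TO_WIKIDATA = {
        "dc:creator":   "P50",   # author
        "dc:publisher": "P123",  # publisher
        "dc:date":      "P577",  # publication date
        "dc:language":  "P407",  # language of work
    }

    record = {  # e.g. the Interpretation of Dreams edition linked above
        "dc:creator":   ["Sigmund Freud"],
        "dc:publisher": ["Franz Deuticke"],
        "dc:date":      ["1900"],
        "dc:language":  ["German"],
    }

    def to_statements(record):
        """Yield (property, value) pairs for the corresponding Wikidata item."""
        for field, values in record.items():
            prop = DC_TO_WIKIDATA.get(field)
            if prop:                      # unmapped fields are simply skipped
                for value in values:
                    yield prop, value

An OAI-PMH export would essentially walk the same table in the other direction.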

Aubrey


 On Sat, Nov 2, 2013 at 4:30 PM, Lydia Pintscher 
 lydia.pintsc...@wikimedia.de wrote:

 Hey everyone,

 The next sister project to get language links via Wikidata is
 Wikisource. We're currently planning this for January 13.
 The coordination is happening at
 https://www.wikidata.org/wiki/Wikidata:Wikisource  On this page we're
 also looking for ambassadors to help spread the messages to the
 different language editions of Wikisource. Please help if you can.


 Cheers
 Lydia

 --
 Lydia Pintscher - http://about.me/lydia.pintscher
 Product Manager for Wikidata

 Wikimedia Deutschland e.V.
 Obentrautstr. 72
 10963 Berlin
 www.wikimedia.de

 Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

 Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
 unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
 Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l




 --
 Etiamsi omnes, ego non

 ___
 Wikisource-l mailing list
 wikisourc...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikisource-l


___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] [Wikisource-l] next sister project: Wikisource

2013-11-05 Thread Andrea Zanni
About this:

On Tue, Nov 5, 2013 at 2:13 PM, David Cuenca dacu...@gmail.com wrote:

 Connecting new uploaded books with Wikidata: again this is very related to
 the above. As a first preparatory step, one GSoC project this year worked on
 using templates (like commons:Template:Book) directly with the
 UploadWizard. It generates the form according to a template, which in turn
 could create both a Wikidata item and a Wikisource page when the uploaded
 file is a book. However, this has been stalled due to this RFC on Commons:

 https://commons.wikimedia.org/wiki/Commons:Requests_for_comment/How_Commons_should_deal_with_TemplateData


How does this concern us?

Sorry, but I don't really understand this TemplateData issue.
Uploading books directly from Wikisource (entering all the important
metadata, which would then go to Commons, Wikisource and Wikidata) is a crucial
*feature* that we absolutely need.
What, specifically, is the problem here?

Thanks!

Aubrey
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] hi wikidata team~

2013-11-05 Thread Kisoong Jang
hi wikidata team!

I'm Kisoong Jang from Korea.
These days I'm interested in Wikidata.

I'm trying to set up a Wikidata clone on my localhost at the moment.

I've installed MediaWiki, the Wikibase client & the Wikibase repository successfully,
and I downloaded the dump files from
http://dumps.wikimedia.org/wikidatawiki/latest/

There are many files in http://dumps.wikimedia.org/wikidatawiki/latest/
(sql, xml, bz2, and so on).

I tried to import the sql into MySQL and the xml using importDump.php,
but it doesn't work.

Questions:
1) How do I import the xml and sql files?
2) Do all the files need to be imported? (e.g.
wikidatawiki-latest-abstract.xml (http://dumps.wikimedia.org/wikidatawiki/latest/wikidatawiki-latest-abstract.xml),
wikidatawiki-latest-all-titles-in-ns0.gz (http://dumps.wikimedia.org/wikidatawiki/latest/wikidatawiki-latest-all-titles-in-ns0.gz))
3) Is there a required import order?
4) What do I have to do after the import?

If you know about this, please let me know!
Have a good day!!

thanks.

Kisoong Jang
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] next sister project: Wikisource

2013-11-05 Thread Daniel Kinzler
On 05.11.2013 01:12, David Cuenca wrote:
 Yes, that would be it: one work-item (acting as hub), x edition items
 connected to the work-item, each edition-item connected to its corresponding
 Wikisource page with a sitelink and, on Wikisource, an auto-generated nav bar
 that lists all sitelinks from all edition-items on the left (equivalent to the
 current interwiki link list). If there is more than one edition per language,
 the author citation (P835) or author (P50) value can be shown next to the
 language name.
 For connecting works with editions we already have edition (P747) and
 edition of (P629).

OK, I think I understand now: the issue is that Wikisource wants language links
on edition pages not from that edition's data item (since an edition can
generally only exist in one language anyway), but wants to use the items of
different editions of the same work in different languages to generate language
links.

As long as there is only one edition per language, this would work: Lua could
generate the corresponding language links (we may have to tweak the Lua binding
a bit, but that should not be a problem).

However, MediaWiki only supports one link per target site in the sidebar.

Maybe an on-page navigation box could be used instead of proper language 
links?

With the help of JavaScript, the contents of that nav box could then be moved
into the sidebar. That's a bit hackish, but would work ok, I think.
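
For what it's worth, the data flow itself is simple once arbitrary item access
exists. Here is a rough Python sketch, against the current web API, of what the
Lua module (or the JavaScript step) would effectively have to compute, assuming
P747 ("edition") links the work to its editions as in David's mail; everything
else is illustrative:

    # Rough sketch: from a work item, collect one Wikisource page per
    # language via its edition items.
    import requests

    API = "https://www.wikidata.org/w/api.php"

    def get_entity(qid):
        r = requests.get(API, params={
            "action": "wbgetentities", "ids": qid,
            "props": "claims|sitelinks", "format": "json",
        })
        return r.json()["entities"][qid]

    def edition_sitelinks(work_qid):
        """Map e.g. 'enwikisource' -> page title, one per language."""
        links = {}
        for statement in get_entity(work_qid).get("claims", {}).get("P747", []):
            snak = statement["mainsnak"]
            if snak.get("snaktype") != "value":
                continue
            edition_qid = "Q%d" % snak["datavalue"]["value"]["numeric-id"]
            for site, link in get_entity(edition_qid).get("sitelinks", {}).items():
                if site.endswith("wikisource"):
                    # the sidebar can only show one link per site,
                    # hence "first edition wins" here
                    links.setdefault(site, link["title"])
        return links

The on-page nav box would just render that mapping; the JavaScript hack then
only has to move the rendered links into the sidebar.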

 On Wikisource I don't think it is necessary to always have a work page; this
 is only needed when there is more than one edition for any given language.

I think it would be a good idea to always have that, for consistency and
structural integrity.

 The most important part is to automate the creation of a work-item on Wikidata
 whenever it is needed to link one edition to another (same or different
 languages) and, of course, show the generated nav bar on all edition pages.

That should be done by a bot.

 Wikipedia(s) will be connected to the work-items as usual. Template:Infobox
 book needs some work to be able to show work- and edition-item data. I have
 started a proposal for this task as a possible Code-In, but maybe the second
 part needs arbitrary item access.
 https://www.mediawiki.org/wiki/Google_Code-In#Lua_templates

nice!

-- daniel


___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


ESWC 2014 Second Call for Workshops

2013-11-05 Thread speroni

** apologies for cross-posting **

 Second Call for Workshops 
http://2014.eswc-conferences.org/important-dates/call-workshops

The organizers of the 11th ESWC 2014 cordially invite you to submit a workshop 
proposal. ESWC is a major venue for discussing the latest scientific results 
and innovations in the field of semantic technologies on the Web and Linked 
Data, attracting a high number of high quality submissions and participants 
from academia and industry alike.

Co-located workshops at ESWC conferences are distinguished meeting points for 
discussing ongoing work and latest ideas related to semantic technologies and 
the Semantic Web. Of particular interest are workshop proposals with an 
interdisciplinary standpoint, proposals focusing on a specific technology of 
general interest, or gathering a sub-community. We encourage the submission of 
workshop proposals on:
 
* Fundamental problems of the Semantic Web / Linked Data, such as ontology 
mining, heterogeneity, scalability and distribution, uncertainty, etc.,
* Applications of Semantic Web technologies in specific domains,
* Important enabling technologies and their adaptation to the needs of the 
Semantic Web,
* Aspects of Semantic Web research that have been neglected so far, and
* Techniques from other research fields that are of relevance for Semantic Web 
research (e.g., machine learning, NLP, data mining).



# General Information and Criteria 

Each proposal will be reviewed by the members of the workshop programme 
committee, and ranked based on the overall quality of the proposal and the 
workshop's fit to the conference as detailed below. Their recommendation will 
determine the final decision on the acceptance/rejection of each proposal, 
which is to be taken by the workshop and tutorial chairs as well as by the 
local and the general chair of ESWC 2014.
 
The criteria for judging the quality of workshop proposals are as follows:
 
* Co-located workshops cover topics falling in the general scope of the ESWC 
conference.
* Workshops are intended to be genuine interactive events and not 
mini-conferences.
* We welcome workshops with creative structures and organizations that attract 
various types of contributions and ensure rich interactions.
* Workshops should have a clear focus on a specific technology, problem or 
application.
* There is potentially a significant community interested in the workshop's 
topic.
* Workshop duration can be half a day or a full day.
* We strongly advise having more than one organizer and no more than four, 
preferably from different institutions, bringing different perspectives to the 
workshop topic.
 
In case overlapping workshops are proposed, the workshop chair may contact the 
organisers to discuss the possibility of merging workshops. Please note that 
the duration of a workshop might need to be adjusted based on the overall 
number of submissions received. Further, workshops that receive fewer than 5 
submissions or have fewer than 10 people registered at the early registration 
deadline might be canceled.
 
The organizers of accepted workshops will be responsible for their own 
reviewing process, publicity (e.g., website, timelines and call for papers), 
and proceedings production. They will be required to closely cooperate with the 
Workshop Chair and the ESWC 2014 local organizers to finalize all 
organizational details. Workshop attendees must pay the ESWC 2014 workshop 
registration fee, as well as the conference registration fee.

Organizers of workshops and tutorials will get a free registration for 
workshops and tutorials at the pre-conference days, i.e. they will only have to 
pay the main conference fee.



# Important Dates 

Workshop proposals due: Nov 22, 2013 - 23:59 Hawaii Time 
Notification of acceptance: Dec 6, 2013 - 23:59 Hawaii Time
Workshop Web site due: Dec 16, 2013 - 23:59 Hawaii Time 
Workshop camera-ready proceedings due: Apr 25, 2014 - 23:59 Hawaii Time
Workshop days: May 25 and May 26, 2014



# Suggested Timeline for Workshops

Submission deadline: March 6, 2014 
Notifications: April 1, 2014
Camera ready version: April 15, 2014



# Submission Guidelines 

Workshop proposals have to be submitted via Easychair. Each proposal must 
consist of a single PDF document written in English, not longer than 3 pages, 
which contains the following information:
 
1. The title and brief technical description of the workshop, specifying its 
goals and motivation.
2. A brief discussion of why the topic is of particular interest at this time.
3. A brief description of why and to whom the workshop is of interest, the 
workshop audience, as well as the expected number of participants.
4. A brief description (draft outline) of the proposed workshop format, 
discussing the mix of events and activities such as paper presentations, 
invited talks, panels, hacking sessions, or general discussion, and an 
approximate timeline.
5. A list of (potential) members of the program committee 

Re: [Wikidata-l] Questions about statement qualifiers

2013-11-05 Thread Denny Vrandečić
Hello Antoine,

just to add to what was already said:

a qualifier in Wikidata is not a statement about a statement. In RDF
semantics, the pattern we follow is not to reify the triple and then make
triples with the reified triple as the subject, as per
http://www.w3.org/TR/rdf-mt/#ReifAndCont , but rather the pattern of n-ary
relations, per http://www.w3.org/TR/swbp-n-aryRelations/ . The use cases there
visualize very nicely how Wikidata maps to RDF:
http://www.w3.org/TR/swbp-n-aryRelations/#useCase1

This is also what Wikidata's mapping to RDF document explains and
motivates: https://meta.wikimedia.org/wiki/Wikidata/Development/RDF
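
To show the difference in code: a minimal rdflib sketch of the n-ary pattern,
with a made-up namespace (this is the shape of the pattern, not the actual
Wikidata RDF export):

    # The statement becomes a node of its own; the qualifier hangs off that
    # node rather than off a reified triple or off the object (Abe) itself.
    from rdflib import Graph, Namespace, BNode

    EX = Namespace("http://example.org/")   # illustrative namespace
    g = Graph()

    japan, abe, pm_of_japan = EX.Q17, EX.Q132345, EX.Q274948
    statement = BNode()                     # the n-ary "value node"

    g.add((japan, EX.P6_head_of_government, statement))
    g.add((statement, EX.value, abe))
    g.add((statement, EX.P39_office_held, pm_of_japan))   # the qualifier

    print(g.serialize(format="turtle"))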

I hope this helps,

Denny



On Oct 31, 2013 3:40 AM, Antoine Zimmermann antoine.zimmerm...@emse.fr
wrote:

 Hello,


 I have a few questions about how statement qualifiers should be used.


 First, my understanding of qualifiers is that they define statements about
 statements. So, if I have the statement:

 Q17(Japan)  P6(head of government)  Q132345(Shinzō Abe)

 with the qualifier:

  P39(office held)  Q274948(Prime Minister of Japan)

 it means that the statement holds an office, right?
 It seems to me that this is incorrect and that this qualifier should in
 fact be a statement about Shinzō Abe. Can you confirm this?



 Second, concerning temporal qualifiers: what does it mean that the start
 or end is "no value"? I can imagine two interpretations:

  1. the statement is true forever (a person is a dead person from the
 moment of their death till the end of the universe)
  2. (for end date) the statement is still true, we cannot predict when
 it's going to end.

 For me, case number 2 should be marked as "unknown value" rather
 than "no value". But again, what does "unknown value" mean in comparison
 to having no indicated value?



 Third, what if a statement is temporarily true (say, X held office from T1
 to T2), then becomes false, and becomes true again (say, X held the same
 office from T3 to T4 with T3 > T2)? The situation exists for Q35171 (Grover
 Cleveland) who has the following statement:

 Q35171  P39(position held)  Q11696(President of the United States of
 America)

 with qualifiers, and a second occurrence of the same statement with
 different qualifiers. The Wikidata user interface makes it clear that there
 are two occurrences of the statement with different qualifiers, but how
 does the Wikidata data model allow me to distinguish between these two
 occurrences?

 How do I know that:

  P580(start date)  March 4 1885

 only applies to the first occurrence of the statement, while:

  P580(start date)  March 4 1893

 only applies to the second occurrence of the statement?
 I could have a heuristic that says if two start dates are given, then
 assume that they are the starting points of two disjoint intervals. But
 can I always guarantee this?
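
For what it's worth, in the JSON the API returns, the two occurrences are two
separate statement objects, each with its own id and its own qualifier set, so
no heuristic over start dates is needed. A hedged sketch (field names as in
today's API):

    # Each of Cleveland's two terms is its own statement, carrying its own
    # qualifiers; the statement id is what ties a start date to "its" claim.
    import requests

    resp = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetclaims", "entity": "Q35171",
        "property": "P39", "format": "json",
    }).json()

    for claim in resp["claims"]["P39"]:
        qualifiers = claim.get("qualifiers", {})
        starts = [q["datavalue"]["value"]["time"]
                  for q in qualifiers.get("P580", [])
                  if q.get("snaktype") == "value"]
        print(claim["id"], starts)   # two distinct ids, each with one start date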


 Best,
 AZ

 --
 Antoine Zimmermann
 ISCOD / LSTI - Institut Henri Fayol
 École Nationale Supérieure des Mines de Saint-Étienne
 158 cours Fauriel
 42023 Saint-Étienne Cedex 2
 France
 Tél:+33(0)4 77 42 66 03
 Fax:+33(0)4 77 42 66 66
 http://zimmer.aprilfoolsreview.com/

 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Questions about statement qualifiers

2013-11-05 Thread Antoine Isaac

Hi Antoine, all,

I was also a bit puzzled by this. If you want more discussion, there is stuff
on Gerard's blog [1,2].
After some patient explanations of this kind on this list, I think I understood
what qualifiers are about.

Still, I disagree with a part of what Markus said. Trying to understand claims as
statements about statements, as Antoine did, is not being concerned only about
the informal meaning. It is a rather deep data modeling issue.

In Wikidata a qualifier can be about the object of a claim or about the claim itself (Markus' meta level),
and there is no means to distinguish one from the other in the form of the data. In fact, such a data
structure for qualifiers is much more dependent on an informal reading than one might think: it fits really
well how humans would enter and read the data, but less well what a machine would need to exploit it (as the
construct is intrinsically ambiguous on whether it is an amendment to the truth conditions of a claim, or an
n-ary relation).
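
A toy illustration of that ambiguity, as plain Python dicts rather than the
real data model, reusing the Japan example from earlier in the thread:

    # One stored structure, two possible readings of the qualifier.
    claim = {
        "subject":    "Q17",       # Japan
        "property":   "P6",        # head of government
        "object":     "Q132345",   # Shinzō Abe
        "qualifiers": {"P39": "Q274948"},   # office held: Prime Minister of Japan
    }

    # Reading 1 (n-ary relation): the qualifier narrows the claim itself,
    # i.e. "Abe is Japan's head of government in the role of Prime Minister".
    # Reading 2 (statement about the object): the qualifier describes Abe,
    # i.e. "Abe holds the office of Prime Minister", which really belongs on
    # the item Q132345.
    # Nothing in the stored structure says which reading applies; only the
    # informal documentation of the qualifier property does.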

I'm not saying it's bad per se. The discussion made me understand better why it 
was designed so, and I can understand the advantages. Still I am not so sure 
it's really a winner in terms of interoperability with other systems.

Best,

Antoine

[1] http://ultimategerardm.blogspot.nl/2013/10/more-heady-stuff-about-wikidata-and.html
[2] http://ultimategerardm.blogspot.nl/2013/10/abdallah-ii-six-times-sultan-of-morocco.html



Re: [Wikidata-l] classes and qualifiers

2013-11-05 Thread Jane Darnell
Hi TomT0m,
Be careful what you wish for! If this were possible, then if someone changed 
the dates, this could mess up other things. We already have a big job 
untangling mismatched interwiki links, and this would make such mismatches 
possible to the nth degree.
Jane

Sent from my iPad

On Nov 4, 2013, at 6:14 PM, Thomas Douillard thomas.douill...@gmail.com wrote:

  
 Hey, I got an ontology question.
 
 Classes are, in Semantic Web frameworks and their foundations like Description
 Logics, if I'm not wrong, something like a logic predicate that intensionally
 or extensionally defines the properties of their instances.
 
 They are usually not qualified, but in Wikidata, as of now, they are
 properties like the others, which can also be qualified.
 
 So the question is: could we use qualifiers on classes to add predicates to
 the class definition? For example, if
 George Bush is an instance of United States President [from 1980 to
 1984] (random years), would this mean that the instantiation adds some
 predicates to the other predicates we have on the President of the United
 States?
 
 Just a random thought; I realise I just qualified the instantiation, not
 the class itself.
 
 --TomT0m 
 
 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] [Wikisource-l] next sister project: Wikisource

2013-11-05 Thread billinghurst
On Tue, 5 Nov 2013 17:27:02 +0100, Andrea Zanni zanni.andre...@gmail.com
wrote:

From my point of view, an upload form should be focused on Wikidata more
than on Commons; anything else is back-to-front.

If we are talking about a published work, the fact that it is published is its
own notability and transcends whether it is at Wikisource, Commons, or
Wikipedia; that it is published makes it Wikidata-able (to coin a word). We
can easily support this statement, as copyright alone will prevent some works
from appearing at Wikisource or Commons, and similarly some published works
may not be individually notable for Wikipedia but may be so for other
reference purposes, thinking here of things that have a DOI.

*Then* comes the issue of which site wishes to utilise the data. So having
Wikidata as the primary entry point to enter book data, and then calling it
from other places as required, seems the logical place to start for any new
work at any of the sites.
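
A rough sketch of that flow (the wbeditentity module is the real API; the
session/token handling and everything the form would do afterwards are only
hinted at):

    # Illustrative only: treat Wikidata as the entry point for a new book.
    import json

    API = "https://www.wikidata.org/w/api.php"

    def create_edition_item(title, language, session, edit_token):
        """Create a bare item for a newly registered edition and return its Q-id.

        `session` is expected to be a logged-in requests.Session.
        """
        data = {"labels": {language: {"language": language, "value": title}}}
        r = session.post(API, data={
            "action": "wbeditentity", "new": "item",
            "data": json.dumps(data), "token": edit_token, "format": "json",
        })
        return r.json()["entity"]["id"]

    # An upload form would call this first, add the bibliographic statements
    # (author, publisher, date, ...) to the new item, and only then create the
    # Commons file page and the Wikisource Index:/Book: page, each of which
    # records the Q-id instead of carrying its own copy of the metadata.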

Regards Billinghurst

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] classes and qualifiers

2013-11-05 Thread Gerard Meijssen
Hoi,
The good news is that the interwiki tangles are a big improvement over what
we had. Now there is clarity when an interwiki is wrong. There is one place
to solve it, and when it is solved, it is solved for all linked Wikipedias.

The problem I have with all the ontology issues is that they are relevant
in the context of interconnectivity between systems. This becomes
increasingly irrelevant as we identify the records in other systems. It is
also irrelevant because we do not use it: getting data from external
sources is largely frowned upon.

As it is, we are building the data in Wikidata ourselves, and comparing
things is an afterthought. Many statements in Wikidata are problematic, but
they have the saving grace that with more statements it becomes clearer
what works and what does not. As long as people talk in terms of ontologies
and do not translate them into Reasonator-like applications, it is in my
opinion a waste of breath.
Thanks,
   GerardM

http://ultimategerardm.blogspot.nl/2013/10/abdullah-king-of-saoudi-arabia.html
http://tools.wmflabs.org/reasonator/?q=Q57298


On 6 November 2013 08:24, Jane Darnell jane...@gmail.com wrote:


___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l