Hi
> If you look at Wikidata property discussion pages, you can find important
metadata about a given property
Ok, I've found appropriate Wikidata API calls to get a list of all Properties
and to retrieve data (including metadata from discussion pages).
There are 967 Properties now on Wikidata. I'
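The property listing mentioned above could be done, for example, through the MediaWiki `allpages` API restricted to the Property namespace (120) on wikidata.org. This is only a minimal sketch of that idea; the network call itself is omitted, and `extract_property_ids` just parses a response shaped like the API's JSON.

```python
# Hypothetical sketch: page through Wikidata properties with
# action=query&list=allpages&apnamespace=120 against
# https://www.wikidata.org/w/api.php (namespace 120 = Property:).

def build_allpages_params(continue_from=None):
    """Query parameters for one page of the Property listing."""
    params = {
        "action": "query",
        "list": "allpages",
        "apnamespace": 120,  # Property: namespace on wikidata.org
        "aplimit": 500,
        "format": "json",
    }
    if continue_from:
        params["apcontinue"] = continue_from
    return params

def extract_property_ids(response_json):
    """Pull P-ids like 'P31' out of one API response."""
    pages = response_json.get("query", {}).get("allpages", [])
    # Titles look like "Property:P31"; keep only the id part.
    return [p["title"].split(":", 1)[1] for p in pages]

sample = {"query": {"allpages": [{"title": "Property:P31"},
                                 {"title": "Property:P279"}]}}
print(extract_property_ids(sample))  # ['P31', 'P279']
```

Per-property data (labels, descriptions, datatypes) could then be fetched with `action=wbgetentities` for the collected ids.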
Hi Sergey,
On 3/11/14, 12:26 AM, Sergey Skovorodkin wrote:
> Chapter 4 in [1] says that there are two main approaches to ontology
> matching:
> - element-level (string-, language-, constraint-, informal
> resource-, and formal resource-based techniques);
> - structure-level (graph-, taxonomy-, model-, and instance-based
> techniques).
Chapter 4 in [1] says that there are two main approaches to ontology
matching:
- element-level (string-, language-, constraint-, informal resource-,
and formal resource-based techniques);
- structure-level (graph-, taxonomy-, model-, and instance-based
techniques).
Is there any reason why
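To make the element-level family above concrete, here is a minimal sketch of a string-based matcher: it scores pairs of class labels by a normalized similarity ratio and keeps the best match above a threshold. The labels and the threshold are made up for illustration, not taken from the actual mapping data.

```python
# Element-level, string-based matching sketch (illustrative only).
from difflib import SequenceMatcher

def label_similarity(a, b):
    """Similarity ratio in [0, 1]; 1.0 means identical after lowercasing."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def best_matches(source_labels, target_labels, threshold=0.8):
    """For each source label, keep the best-scoring target above threshold."""
    matches = {}
    for s in source_labels:
        scored = [(label_similarity(s, t), t) for t in target_labels]
        score, t = max(scored)
        if score >= threshold:
            matches[s] = (t, round(score, 2))
    return matches

# Toy label sets standing in for Wikidata / DBpedia class labels:
wikidata = ["human", "film", "taxon"]
dbpedia = ["Human", "Film", "Species"]
print(best_matches(wikidata, dbpedia))
```

A structure-level technique would go beyond this by also comparing how the classes sit in their respective taxonomies, not just their labels (note how "taxon"/"Species" is missed here precisely because string similarity alone cannot see the semantic link).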
Hi, Marco!
I’ve found some papers and a book, and as far as I can see, here we have an
“ontology matching/alignment” problem.
Could you give me some advice on where to start?
On Fri, Mar 7, 2014 at 7:46 PM, Marco Fossati wrote:
> Hi Sergey,
>
>
> On 3/7/14, 4:04 PM, Sergey Skovorodkin wrote:
Hi Sergey,
On 3/7/14, 4:04 PM, Sergey Skovorodkin wrote:
> Hi!
>
> Maybe I'll ask some irrelevant or strange questions, but I really want
> to understand what's going on :)
Don't worry, this is the right place to ask any kind of question ;-)
>
> > I collected them last year; there were 544 properties and 4184 classes to map
Hi!
Maybe I'll ask some irrelevant or strange questions, but I really want to
understand what's going on :)
> I collected them last year; there were 544 properties and 4184 classes to map
Does it mean that we have to expand our ontology (as we have only 650
classes)? Or should we match Wikidata
Good point Hady, I'll add it to the idea description, thanks!
On 3/7/14, 1:26 PM, Hady elsahar wrote:
> Hi all,
>
> Thanks for your reply, Dimitris.
>
> Marco, we already extracted all Wikidata facts in triples [1], so by
> querying all unique values for the Wikidata property P31 [2], we can
>
Hi all,
Thanks for your reply, Dimitris.
Marco, we already extracted all Wikidata facts in triples [1], so by
querying all unique values for the Wikidata property P31 [2], we can get
all the Wikidata classes in use (not sure if there are classes that don't
have any instances).
I think we c
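The P31 idea above is simple to sketch: given the extracted triples, the classes "in use" are just the distinct objects of P31 (instance of) statements. The triples below are toy data (Q888888 is a placeholder id), not the real extraction output.

```python
# Sketch: collect the distinct objects of P31 from (subject, property, object)
# triples — these are the Wikidata classes that actually have instances.

def classes_in_use(triples):
    """Distinct objects of P31 statements."""
    return {o for (s, p, o) in triples if p == "P31"}

triples = [
    ("Q42", "P31", "Q5"),       # Douglas Adams -> instance of -> human
    ("Q42", "P106", "Q36180"),  # occupation: writer (not P31, so ignored)
    ("Q888888", "P31", "Q5"),   # placeholder item, also a human
    ("Q999999", "P31", "Q888888"),  # placeholder item and class
]
print(sorted(classes_in_use(triples)))  # ['Q5', 'Q888888']
```

As noted in the message, this only finds classes that have at least one instance; classes never used as a P31 value would be invisible to this query.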
Hi Hady,
Thanks for pointing out the manual Wikidata-to-DBpedia mapping you did
last year.
How did you detect those 4k Wikidata entities that should be classes and
not individuals? Your work can be a great starting point, since there
seems to be no such distinction in Wikidata.
Cheers!
On 3/7/
Hello Hady,
You did a good job last year and mapped a big part of the (most frequent)
properties manually.
However, mapping the long tail can be a much more tedious job and we also
need to map new properties / classes as they are defined in wikidata.
In the end we want this to be a (semi-)automati
Dear all,
I have a question regarding the idea
*4.6. Automated Wikidata mappings to DBpedia ontology:*
These are the links for manually mapping Wikidata properties [1] and
classes [2] to DBpedia.
- I collected them last year; there were 544 properties and 4184 classes to
map, of course ~4600 map