[Wikidata] Fwd: [Wikimedia-l] Captioning Wikidata items?

2018-09-26 Thread Pine W
I don't have enough knowledge about neural nets to evaluate the email
below, but I'm forwarding it in case it's of interest to others on two
relevant lists.

Pine
( https://meta.wikimedia.org/wiki/User:Pine )


-- Forwarded message -
From: John Erling Blad 
Date: Wed, Sep 26, 2018 at 6:23 PM
Subject: [Wikimedia-l] Captioning Wikidata items?
To: Wikimedia Mailing List 


Just a weird idea.

It is very interesting how neural nets can caption images. It is done by
building a state model of the image, which is fed into a kind of neural net
(an RNN), and that net (a black box) transforms the state model into
running text. In some cases the neural net is steered; this is called
attention, and it creates relationships between parts of the image.

Swap out the image with an item, and a virtually identical setup can
generate captions for items. The caption for an item is what is called the
description in Wikidata; it is also the first sentence of the lead in
Wikipedia articles. It is possible to steer the attention, that is, to tell
the network which items should be used, so that the later sentences will be
meaningful.
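
As a rough illustration only, here is a minimal encoder-decoder sketch in
PyTorch: a fixed-size "state" vector stands in for the encoded item, and a
GRU decoder unrolls it into a short description, token by token. The class,
toy vocabulary and dimensions are invented for the example; it is not the
setup described here, and a real system would add attention over the item's
statements and be trained on existing descriptions.

    import torch
    import torch.nn as nn

    # Toy vocabulary; a real model would use the vocabulary of existing descriptions.
    VOCAB = ["<bos>", "<eos>", "norwegian", "software", "developer", "village", "in", "france"]
    STOI = {w: i for i, w in enumerate(VOCAB)}

    class ItemCaptioner(nn.Module):
        def __init__(self, state_dim=64, hidden_dim=64, vocab_size=len(VOCAB)):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden_dim)  # token embeddings
            self.init_h = nn.Linear(state_dim, hidden_dim)     # item state -> initial hidden state
            self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)       # hidden state -> next-token scores

        def generate(self, item_state, max_len=8):
            """Greedy decoding: start from <bos>, emit tokens until <eos> or max_len."""
            h = torch.tanh(self.init_h(item_state)).unsqueeze(0)  # (1, batch, hidden)
            token = torch.tensor([[STOI["<bos>"]]])
            words = []
            for _ in range(max_len):
                emb = self.embed(token)                 # (batch, 1, hidden)
                out, h = self.gru(emb, h)
                token = self.out(out[:, -1]).argmax(dim=-1, keepdim=True)
                if token.item() == STOI["<eos>"]:
                    break
                words.append(VOCAB[token.item()])
            return " ".join(words)

    # The item state would come from an item encoder such as rdf2vec; here it is
    # random and the model is untrained, so the output is meaningless.
    model = ItemCaptioner()
    print(model.generate(torch.randn(1, 64)))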

What that means is that we could create meaningful stub entries for the
article placeholder, that is, the "AboutTopic" special page. We can't
automate this for very small projects, but somewhere between small and
mid-sized languages it will start to make sense.

To make this work we need some very special knowledge, which we probably
don't have, like how to turn an item into a state model by using the highly
specialized rdf2vec algorithm (hello Copenhagen) and how to verify the
stateful language model (hello Helsinki and Tromsø).

I wonder if the only real problems are what the community wants, and what
the acceptable error limit is.

John Erling Blad
/jeblad


Re: [Wikidata] Semantic annotation of red links on Wikipedia

2018-09-26 Thread Andy Mabbett
On 24 September 2018 at 18:48, Maarten Dammers  wrote:

> Wouldn't it be nice to be able to make a connection between the red link on
> Wikipedia and the Wikidata item?

This facility already exists:

https://en.wikipedia.org/wiki/Template:Interlanguage_link#Link_to_Reasonator_and_Wikidata

-- 
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk



Re: [Wikidata] Looking for "data quality check" bots

2018-09-26 Thread Ettore RIZZA
Hi,

Wikidata is obviously linked to a bunch of unusable external IDs, but also
to some very structured data. For the moment I'm interested in the state of
the art - even if it's based on crude scraping, why not?

I see, for example, this request for permission for a bot able to retrieve
information from the BnF (French national library) database. It was refused
because of copyright issues, but simply checking the information without
extracting anything would be allowed, wouldn't it?

On Wed, 26 Sep 2018 at 20:48, Paul Houle  wrote:

> "Poorly structured" HTML is not all that bad in 2018, thanks to HTML 5
> (which builds the "rendering decisions made about broken HTML from
> Netscape 3" into the standard, so that in common languages you can get
> the same DOM tree as the browser).
>
> If you try to use an official or unofficial API to fetch data from some
> service in 2018, you will have to add some dependencies, and you just
> might open a can of whoop-ass that makes you reinstall Anaconda, or
> maybe you will learn something you'll never be able to unlearn about how
> XML processing changed between two minor versions of the JDK.
>
> On the other hand, I have often dusted off the old HTML-based parser I
> made for Flickr and found I could get it to work for other media
> collections, blogs, etc. just by changing the "semantic model" embodied
> in the application, which could be as simple as a function or object
> that knows something about the structure of the URLs of some documents.
>
> I cannot understand why so many standards that have gone nowhere have
> been pushed to integrate RDF and HTML, while nobody has promoted the
> clean solution of "add a CSS media type for RDF" that marks up the
> semantics of HTML the way JSON-LD does.
>
> Often, though, if you look at it that way, matching patterns against
> CSS selectors gets you most of the way there these days.
>
> I've had cases where I haven't had to change the rule sets much at all,
> and none of them have been more than 50 lines of code; all much less.
>
>
>
> -- Original Message --
> From: "Federico Leva (Nemo)" 
> To: "Discussion list for the Wikidata project"
> ; "Ettore RIZZA" 
> Sent: 9/26/2018 1:00:53 PM
> Subject: Re: [Wikidata] Looking for "data quality check" bots
>
> >Ettore RIZZA, 26/09/2018 15:31:
> >>I'm looking for Wikidata bots that perform accuracy audits. For
> >>example, comparing the birth dates of persons with the same date
> >>indicated in databases linked to the item by an external-id.
> >
> >This is mostly a screenscraping job, because most external databases
> >are only accessible in unstructured or poorly structured HTML form.
> >
> >Federico
> >


[Wikidata] Official Website (P856) value constraint

2018-09-26 Thread Victor Villas Bôas Chaves
Hi there!

Do you people think that https?://(\S+\.)+\S+(/\S*)? can be improved? I
think it can.

For instance, why require the protocol spec? For most use cases (just
clicking it), it's not useful. Arguably, it's not part of the website.

Also, the regex is lax. That's not a big issue, but a stricter one could
prevent duplicate URLs from being added by bots and data imports. Aside from
the http/https discussion, constraints like these:

- warn against URLs ending in /
- warn against uppercase

would help data imports from external databases, and batch edits with bots
in general, avoid adding duplicate values.
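
A rough sketch in Python of how those checks could be expressed on top of
the regex quoted above (the function name and the sample URLs are made up
for the example):

    import re

    # The current P856 format constraint, as quoted above.
    CURRENT = re.compile(r"https?://(\S+\.)+\S+(/\S*)?")

    def constraint_warnings(url):
        """Return a list of warnings for a candidate official-website value."""
        notes = []
        if not CURRENT.fullmatch(url):
            notes.append("does not match the current format constraint")
        if url.endswith("/"):
            notes.append("ends in '/' (likely a duplicate of the slash-less form)")
        if url != url.lower():
            notes.append("contains uppercase characters")
        return notes

    for candidate in ["https://example.org", "https://Example.org/", "example.org"]:
        print(candidate, constraint_warnings(candidate))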

Regards,
Victor


Re: [Wikidata] Looking for "data quality check" bots

2018-09-26 Thread Paul Houle
"Poorly structured" HTML is not all that bad in 2018, thanks to HTML 5
(which builds the "rendering decisions made about broken HTML from
Netscape 3" into the standard, so that in common languages you can get
the same DOM tree as the browser).

If you try to use an official or unofficial API to fetch data from some
service in 2018, you will have to add some dependencies, and you just
might open a can of whoop-ass that makes you reinstall Anaconda, or
maybe you will learn something you'll never be able to unlearn about how
XML processing changed between two minor versions of the JDK.

On the other hand, I have often dusted off the old HTML-based parser I
made for Flickr and found I could get it to work for other media
collections, blogs, etc. just by changing the "semantic model" embodied
in the application, which could be as simple as a function or object
that knows something about the structure of the URLs of some documents.

I cannot understand why so many standards that have gone nowhere have
been pushed to integrate RDF and HTML, while nobody has promoted the
clean solution of "add a CSS media type for RDF" that marks up the
semantics of HTML the way JSON-LD does.

Often, though, if you look at it that way, matching patterns against
CSS selectors gets you most of the way there these days.

I've had cases where I haven't had to change the rule sets much at all,
and none of them have been more than 50 lines of code; all much less.
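
As a rough illustration of that kind of small rule set (not the actual
Flickr parser mentioned above), here is what CSS-selector-driven scraping
can look like in Python with BeautifulSoup and html5lib; the URL and the
selectors are invented placeholders:

    import requests
    from bs4 import BeautifulSoup

    # The "semantic model": a handful of CSS selectors naming where each field lives.
    RULES = {
        "title":  "h1.page-title",
        "author": "span.byline a",
        "date":   "time.published",
    }

    def scrape(url):
        html = requests.get(url, timeout=30).text
        soup = BeautifulSoup(html, "html5lib")   # browser-like parse tree per HTML5
        record = {}
        for field, selector in RULES.items():
            node = soup.select_one(selector)     # match the CSS pattern
            record[field] = node.get_text(strip=True) if node else None
        return record

    # print(scrape("https://example.org/some-article"))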




-- Original Message --
From: "Federico Leva (Nemo)" 
To: "Discussion list for the Wikidata project" 
; "Ettore RIZZA" 

Sent: 9/26/2018 1:00:53 PM
Subject: Re: [Wikidata] Looking for "data quality check" bots


Ettore RIZZA, 26/09/2018 15:31:
I'm looking for Wikidata bots that perform accuracy audits. For 
example, comparing the birth dates of persons with the same date 
indicated in databases linked to the item by an external-id.


This is mostly a screenscraping job, because most external databases 
are only accessible in unstructured or poorly structured HTML form.


Federico



Re: [Wikidata] Looking for "data quality check" bots

2018-09-26 Thread Federico Leva (Nemo)

Ettore RIZZA, 26/09/2018 15:31:
I'm looking for Wikidata bots that perform accuracy audits. For example, 
comparing the birth dates of persons with the same date indicated in 
databases linked to the item by an external-id.


This is mostly a screenscraping job, because most external databases are 
only accessible in unstructured or poorly structured HTML form.


Federico



Re: [Wikidata] Mapping back to Schema.org needs "broader external class"

2018-09-26 Thread Thomas Pellissier Tanon
> I would certainly support this proposal or can even propose it. Would it
> also be an idea to do the narrow equivalent, at the same time?  Any
> objection to naming them broad and narrow match, to reflect the mapping
> relations in SKOS?

I object to this. "Broad match" and "narrow match" are used to compare
concepts, and "superclass" and "subclass" to compare classes. It could make
sense to say that C1 is a subclass of C2 if all instances of C1 are also
instances of C2, even if the concept C1 is not related to the concept C2.

I believe that not having a specific property for schema.org is actually
more convenient. Having one would mean using qualifiers to encode the
different possible relations (subClassOf, equivalentClass, superClassOf,
subPropertyOf, equivalentProperty, superPropertyOf, narrowMatch, exactMatch
and broadMatch) and would require people querying the data to always take
care of them.
Restricting to only schema.org is fast and easy in SPARQL with
FILTER(STRSTARTS(STR(?url), "http://schema.org"))
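
For example, a small Python sketch of that filter run against the Wikidata
Query Service, assuming the mappings live on P1709 ("equivalent class");
the property choice and the User-Agent string are assumptions made for the
example:

    import requests

    QUERY = """
    SELECT ?item ?class WHERE {
      ?item wdt:P1709 ?class .                             # equivalent class
      FILTER(STRSTARTS(STR(?class), "http://schema.org"))  # keep only schema.org
    }
    LIMIT 10
    """

    response = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "schema-org-mapping-example/0.1"},
        timeout=60,
    )
    for row in response.json()["results"]["bindings"]:
        print(row["item"]["value"], "->", row["class"]["value"])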

Cheers,

Thomas

On Wed, 26 Sep 2018 at 11:45, James Heald  wrote:

> On 26/09/2018 10:16, Andra Waagmeester wrote:
> > On Wed, Sep 26, 2018 at 9:47 AM James Heald  wrote:
> >
> >>
> >> Far better to have a dedicated external-id property for schema.org,
> >> which would avoid this; and if there are important concepts there that
> >> we don't have an item for on Wikidata, then create those items.
> >>
> >>
> > Creating a dedicated property for schema.org, would also imply the need
> for
> > creating designated properties for other context providers such as OBO,
> > SIO, etc. I see that having to filter on matching uri providers in a
> single
> > property can be complicated, but would that be more complicated than
> having
> > to consider all possible schema/context providers through distinct
> > properties?
> >
>
> In SPARQL the latter is very easy.   Just make a VALUES list of all the
> properties you are interested in,
>
> VALUES ?prop_wdt {wdt:P, wdt:P, wdt:P, wdt:P}
>
> then look for
>
> ?item ?prop_wdt ?ext_id
>
>
> Alternatively, if there is something characteristic about a whole set of
> properties that you want to use, then add that information to the
> wikidata item for the property.  You will then be easily able to select
> all the properties with that characteristic, eg:
>
> ?prop wdt:P1234 wd:Q5678901
> ?prop wikibase:directClaim ?prop_wdt
>
>
> This gives you the fine control to retrieve just the URIs of the
> services you want, rather than only being able to retrieve everything
> all lumped together.
>
>
> Using distinct external-ID properties also makes it much easier to see
> what properties are currently in play, for project tracking pages like
> this one:
>
> https://www.wikidata.org/wiki/Wikidata:WikiProject_BHL/Statistics:Titles#Titles_--_IDs
>
>-- James.
>
>
>
>


[Wikidata] Looking for "data quality check" bots

2018-09-26 Thread Ettore RIZZA
Dear all,

I'm looking for Wikidata bots that perform accuracy audits. For example,
comparing the birth dates of persons with the dates indicated in databases
linked to the item by an external-id.

I do not even know if they exist. Bots are often poorly documented, so I
appeal to the community for some examples.
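
As a sketch of what such an audit might look like (not an existing bot):
compare the date of birth (P569) of a few items against values supplied by
an external source. The external values below are a hard-coded stand-in for
data fetched via an external-id link, and Q5 is included only to show a
missing/mismatching case.

    import requests

    # Stand-in for values coming from an external database linked by an external-id.
    EXTERNAL_DOB = {
        "Q42": "1952-03-11",   # Douglas Adams
        "Q5": "1900-01-01",    # "human" has no P569, so this will report a mismatch
    }

    def wikidata_dob(qid):
        """Return the first P569 value of an item as YYYY-MM-DD, or None."""
        data = requests.get(
            f"https://www.wikidata.org/wiki/Special:EntityData/{qid}.json",
            headers={"User-Agent": "dob-audit-example/0.1"},
            timeout=30,
        ).json()
        claims = data["entities"][qid]["claims"].get("P569", [])
        if not claims:
            return None
        time_value = claims[0]["mainsnak"]["datavalue"]["value"]["time"]  # "+1952-03-11T00:00:00Z"
        return time_value[1:11]

    for qid, external in EXTERNAL_DOB.items():
        local = wikidata_dob(qid)
        status = "match" if local == external else "MISMATCH"
        print(qid, "wikidata:", local, "external:", external, "->", status)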

Many thanks.

Ettore Rizza


Re: [Wikidata] Mapping back to Schema.org needs "broader external class"

2018-09-26 Thread James Heald

On 26/09/2018 10:16, Andra Waagmeester wrote:

On Wed, Sep 26, 2018 at 9:47 AM James Heald  wrote:



Far better to have a dedicated external-id property for schema.org,
which would avoid this; and if there are important concepts there that
we don't have an item for on Wikidata, then create those items.



Creating a dedicated property for schema.org, would also imply the need for
creating designated properties for other context providers such as OBO,
SIO, etc. I see that having to filter on matching uri providers in a single
property can be complicated, but would that be more complicated than having
to consider all possible schema/context providers through distinct
properties?



In SPARQL the latter is very easy.   Just make a VALUES list of all the 
properties you are interested in,


   VALUES ?prop_wdt {wdt:P, wdt:P, wdt:P, wdt:P}

then look for

   ?item ?prop_wdt ?ext_id


Alternatively, if there is something characteristic about a whole set of 
properties that you want to use, then add that information to the 
Wikidata item for the property.  You will then easily be able to select
all the properties with that characteristic, eg:


   ?prop wdt:P1234 wd:Q5678901
   ?prop wikibase:directClaim ?prop_wdt


This gives you the fine control to retrieve just the URIs of the 
services you want, rather than only being able to retrieve everything 
all lumped together.
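
Put together as one runnable query (a sketch only; the P1234/Q5678901 pair
mirrors the placeholders above and stands for whatever characteristic
statement the chosen properties share):

    import requests

    QUERY = """
    SELECT ?item ?ext_id WHERE {
      ?prop wdt:P1234 wd:Q5678901 .           # placeholder: properties with the chosen characteristic
      ?prop wikibase:directClaim ?prop_wdt .  # map each property entity to its wdt: predicate
      ?item ?prop_wdt ?ext_id .               # items and their external IDs via those properties
    }
    LIMIT 10
    """

    response = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "external-id-selection-example/0.1"},
        timeout=60,
    )
    print(response.json()["results"]["bindings"])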



Using distinct external-ID properties also makes it much easier to see 
what properties are currently in play, for project tracking pages like 
this one:

https://www.wikidata.org/wiki/Wikidata:WikiProject_BHL/Statistics:Titles#Titles_--_IDs

  -- James.






Re: [Wikidata] Mapping back to Schema.org needs "broader external class"

2018-09-26 Thread Andra Waagmeester
On Wed, Sep 26, 2018 at 9:47 AM James Heald  wrote:

>
> Far better to have a dedicated external-id property for schema.org,
> which would avoid this; and if there are important concepts there that
> we don't have an item for on Wikidata, then create those items.
>
>
Creating a dedicated property for schema.org would also imply the need to
create designated properties for other context providers such as OBO,
SIO, etc. I see that having to filter on matching URI providers in a single
property can be complicated, but would that be more complicated than having
to consider all possible schema/context providers through distinct
properties?


Andra


Re: [Wikidata] Mapping back to Schema.org needs "broader external class"

2018-09-26 Thread Ettore RIZZA
By the way: the item "demand" (https://www.wikidata.org/wiki/Q4402708) in
Wikidata clearly refers to the economic concept in its broadest and most
abstract sense, while https://schema.org/Demand is defined as "the public
(...) announcement by an organization or person to seek certain types of
goods or services."

So I would not say they are equivalent classes, but rather something like
<https://www.wikidata.org/wiki/Q4402708>  <https://schema.org/Demand>.

As Andra Waagmeester indicates, couldn't this be modeled with an external
ID?

On Wed, 26 Sep 2018 at 10:18, Ettore RIZZA  wrote:

> Hi,
>
> aggregate demand -- broader external class --> https://schema.org/Demand
>> place of devotion -- broader external class --> https://schema.org/Place
>> festival -- broader external class --> https://schema.org/Event
>
>
>  According to the "creator" of the property narrower external class
> (P3950)
> ,
> " the reverse (...) is less required because more general classes are
> more likely to be included in Wikidata anyway.  "
>
> These examples seem to prove him right, since "demand
> " exists in Wikidata and is
> already linked to "http://schema.org/Demand" via the equivalent class
> property. Same thing for https://schema.org/Event, already mapped with
> event , or for
> https://schema.org/Place , which could be associated with location
>  (not sure).
>
> To be clear, I support the creation of "broader external class" because it
> can be used with some external vocabularies; I point this out just to make
> sure that all existing mapping possibilities are used. :)
>
> Cheers,
>
> Ettore Rizza
>
>
> On Wed, 26 Sep 2018 at 03:53, Thad Guidry  wrote:
>
>> Sure, Dan
>>
>> aggregate demand  -- broader
>> external class --> https://schema.org/Demand
>>
>> place of devotion  --
>> broader external class --> https://schema.org/Place
>>
>> festival  -- broader external
>> class --> https://schema.org/Event
>>
>> Usually we can discover these relationships quite easily with "What links
>> here" on the GUI and applicable SPARQL queries, but then would like to
>> apply the Wikidata->Schema.org mappings when we discover those
>> relationships can be made.  I suck at PHP, so I couldn't build or
>> contribute to a native application for Wikidata to host that application to
>> auto discover some of these mappings, but would be happy to assist someone
>> who could code in PHP to build such application...here's looking at you,
>> Magnus ?  :-)
>>
>> -Thad
>> +ThadGuidry 
>>
>>
>> On Tue, Sep 25, 2018 at 7:07 PM Dan Brickley  wrote:
>>
>>> On Tue, 25 Sep 2018 at 16:35, Thad Guidry  wrote:
>>>
 Hi Team !
 +Dan Brickley  +Lydia Pintscher
 

 Schema.org mapping is progressing on every new Weekly Summary "Newest
 properties" listing.
 That's great !  And thanks to Léa and team for providing the new
 properties listing !

 What's not great, is many times, we cannot apply a "broader external
 class" to map to a Schema.org Type.  This is because "broader concept"
 https://www.wikidata.org/wiki/Property:P4900 is constrained to
 "qualifiers only and not for use on statements".

 We are able to use the existing "narrower external class"
  , for example like here
 on this topic, https://www.wikidata.org/wiki/Q7406919 , but there is
 no "broader external class" property in Wikidata yet from what we see.

 It would be *awesome* if someone could advocate for that new property
 to help map Wikidata to external vocabularies that have broader concepts
 quite often, such as Schema.org.

>>>
>>> Could you give 2-3 specific examples, to help motivate the request, for
>>> folk who're not tracking this work?
>>>
>>> Dan
>>>
>>> -Thad
 +ThadGuidry 



Re: [Wikidata] Mapping back to Schema.org needs "broader external class"

2018-09-26 Thread Ettore RIZZA
Hi,

aggregate demand -- broader external class --> https://schema.org/Demand
> place of devotion -- broader external class --> https://schema.org/Place
> festival -- broader external class --> https://schema.org/Event


According to the "creator" of the property narrower external class (P3950),
"the reverse (...) is less required because more general classes are more
likely to be included in Wikidata anyway."

These examples seem to prove him right, since "demand" exists in Wikidata
and is already linked to "http://schema.org/Demand" via the equivalent
class property. Same thing for https://schema.org/Event, already mapped
with event, or for https://schema.org/Place, which could be associated with
location (not sure).

To be clear, I support the creation of "broader external class" because it
can be used with some external vocabularies; I point this out just to make
sure that all existing mapping possibilities are used. :)

Cheers,

Ettore Rizza


On Wed, 26 Sep 2018 at 03:53, Thad Guidry  wrote:

> Sure, Dan
>
> aggregate demand  -- broader
> external class --> https://schema.org/Demand
>
> place of devotion  --
> broader external class --> https://schema.org/Place
>
> festival  -- broader external
> class --> https://schema.org/Event
>
> Usually we can discover these relationships quite easily with "What links
> here" on the GUI and applicable SPARQL queries, but then would like to
> apply the Wikidata->Schema.org mappings when we discover those
> relationships can be made.  I suck at PHP, so I couldn't build or
> contribute to a native application for Wikidata to host that application to
> auto discover some of these mappings, but would be happy to assist someone
> who could code in PHP to build such application...here's looking at you,
> Magnus ?  :-)
>
> -Thad
> +ThadGuidry 
>
>
> On Tue, Sep 25, 2018 at 7:07 PM Dan Brickley  wrote:
>
>> On Tue, 25 Sep 2018 at 16:35, Thad Guidry  wrote:
>>
>>> Hi Team !
>>> +Dan Brickley  +Lydia Pintscher
>>> 
>>>
>>> Schema.org mapping is progressing on every new Weekly Summary "Newest
>>> properties" listing.
>>> That's great !  And thanks to Léa and team for providing the new
>>> properties listing !
>>>
>>> What's not great, is many times, we cannot apply a "broader external
>>> class" to map to a Schema.org Type.  This is because "broader concept"
>>> https://www.wikidata.org/wiki/Property:P4900 is constrained to
>>> "qualifiers only and not for use on statements".
>>>
>>> We are able to use the existing "narrower external class"
>>>  , for example like here
>>> on this topic, https://www.wikidata.org/wiki/Q7406919 , but there is no
>>> "broader external class" property in Wikidata yet from what we see.
>>>
>>> It would be *awesome* if someone could advocate for that new property
>>> to help map Wikidata to external vocabularies that have broader concepts
>>> quite often, such as Schema.org.
>>>
>>
>> Could you give 2-3 specific examples, to help motivate the request, for
>> folk who're not tracking this work?
>>
>> Dan
>>
>> -Thad
>>> +ThadGuidry 
>>>


Re: [Wikidata] Mapping back to Schema.org needs "broader external class"

2018-09-26 Thread James Heald

On 26/09/2018 08:46, James Heald wrote:

This model is not good.

If you dump *all* such matches from whatever source into a single 
property, then you force people to use string-comparison filters if it 
is a particular source (eg schema.org) that they are interested in.


That may not be such a problem if you're only interested in incoming 
matches (eg matches *from* schema.org), but if you're going to want 
outgoing matches (matches *to* schema.org), it's tiresome and horribly 
inefficient.


Far better to have a dedicated external-id property for schema.org, 
which would avoid this; and if there are important concepts there that 
we don't have an item for on Wikidata, then create those items.


   -- James.


And just to add:

A dedicated external-id property then also means you can use the qualifier
P4900 "broader concept" as intended, as a way to represent and query within
Wikidata the structure of the external hierarchy.


  -- James.






On 26/09/2018 06:57, Andra Waagmeester wrote:

I would certainly support this proposal or can even propose it. Would it
also be an idea to do the narrow equivalent, at the same time?  Any
objection to naming them broad and narrow match, to reflect the mapping
relations in SKOS?

On Wed, Sep 26, 2018 at 3:54 AM Thad Guidry  wrote:


Sure, Dan

aggregate demand  -- broader
external class --> https://schema.org/Demand

place of devotion  --
broader external class --> https://schema.org/Place

festival  -- broader external
class --> https://schema.org/Event

Usually we can discover these relationships quite easily with "What links
here" on the GUI and applicable SPARQL queries, but then would like to
apply the Wikidata->Schema.org mappings when we discover those
relationships can be made.  I suck at PHP, so I couldn't build or
contribute to a native application for Wikidata to host that application to
auto discover some of these mappings, but would be happy to assist someone
who could code in PHP to build such application...here's looking at you,
Magnus ?  :-)

-Thad
+ThadGuidry 


On Tue, Sep 25, 2018 at 7:07 PM Dan Brickley  wrote:


On Tue, 25 Sep 2018 at 16:35, Thad Guidry  wrote:


Hi Team !
+Dan Brickley  +Lydia Pintscher


Schema.org mapping is progressing on every new Weekly Summary "Newest
properties" listing.
That's great !  And thanks to Léa and team for providing the new
properties listing !

What's not great, is many times, we cannot apply a "broader external
class" to map to a Schema.org Type.  This is because "broader concept"
https://www.wikidata.org/wiki/Property:P4900 is constrained to
"qualifiers only and not for use on statements".

We are able to use the existing "narrower external class", for example like
here on this topic, https://www.wikidata.org/wiki/Q7406919 , but there is no
"broader external class" property in Wikidata yet from what we see.

It would be *awesome* if someone could advocate for that new property
to help map Wikidata to external vocabularies that have broader concepts
quite often, such as Schema.org.



Could you give 2-3 specific examples, to help motivate the request, for
folk who're not tracking this work?

Dan

-Thad

+ThadGuidry 



Re: [Wikidata] Mapping back to Schema.org needs "broader external class"

2018-09-26 Thread James Heald

This model is not good.

If you dump *all* such matches from whatever source into a single 
property, then you force people to use string-comparison filters if it 
is a particular source (eg schema.org) that they are interested in.


That may not be such a problem if you're only interested in incoming 
matches (eg matches *from* schema.org), but if you're going to want 
outgoing matches (matches *to* schema.org), it's tiresome and horribly 
inefficient.


Far better to have a dedicated external-id property for schema.org, 
which would avoid this; and if there are important concepts there that 
we don't have an item for on Wikidata, then create those items.


  -- James.


On 26/09/2018 06:57, Andra Waagmeester wrote:

I would certainly support this proposal or can even propose it. Would it
also be an idea to do the narrow equivalent, at the same time?  Any
objection to naming them broad and narrow match, to reflect the mapping
relations in SKOS?

On Wed, Sep 26, 2018 at 3:54 AM Thad Guidry  wrote:


Sure, Dan

aggregate demand  -- broader
external class --> https://schema.org/Demand

place of devotion  --
broader external class --> https://schema.org/Place

festival  -- broader external
class --> https://schema.org/Event

Usually we can discover these relationships quite easily with "What links
here" on the GUI and applicable SPARQL queries, but then would like to
apply the Wikidata->Schema.org mappings when we discover those
relationships can be made.  I suck at PHP, so I couldn't build or
contribute to a native application for Wikidata to host that application to
auto discover some of these mappings, but would be happy to assist someone
who could code in PHP to build such application...here's looking at you,
Magnus ?  :-)

-Thad
+ThadGuidry 


On Tue, Sep 25, 2018 at 7:07 PM Dan Brickley  wrote:


On Tue, 25 Sep 2018 at 16:35, Thad Guidry  wrote:


Hi Team !
+Dan Brickley  +Lydia Pintscher


Schema.org mapping is progressing on every new Weekly Summary "Newest
properties" listing.
That's great !  And thanks to Léa and team for providing the new
properties listing !

What's not great, is many times, we cannot apply a "broader external
class" to map to a Schema.org Type.  This is because "broader concept"
https://www.wikidata.org/wiki/Property:P4900 is constrained to
"qualifiers only and not for use on statements".

We are able to use the existing "narrower external class"
 , for example like here
on this topic, https://www.wikidata.org/wiki/Q7406919 , but there is no
"broader external class" property in Wikidata yet from what we see.

It would be *awesome* if someone could advocate for that new property
to help map Wikidata to external vocabularies that have broader concepts
quite often, such as Schema.org.



Could you give 2-3 specific examples, to help motivate the request, for
folk who're not tracking this work?

Dan

-Thad

+ThadGuidry 
