Hoi,
With the recent introduction of federation for DBpedia, it is possible to run queries that combine the language-specific DBpedias and Wikidata. I have blogged about how we can make use of this [1].
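To give an idea of what such a federated query looks like: the sketch below builds a query against the Dutch DBpedia endpoint that pulls the matching Wikidata date of death (P570) through a SERVICE clause. This is a minimal sketch using only the Python standard library; the owl:sameAs linking pattern to Wikidata entities and the availability of SERVICE federation on this particular endpoint are assumptions and may differ in practice.

```python
# Sketch: a federated SPARQL query asking the Dutch DBpedia for people
# with a Wikipedia-harvested death date, plus the corresponding Wikidata
# date of death (P570) fetched via SERVICE federation.
# Assumption: DBpedia resources link to Wikidata items via owl:sameAs.
import json
import urllib.parse
import urllib.request

DBPEDIA_NL = "http://nl.dbpedia.org/sparql"  # Dutch DBpedia SPARQL endpoint

FEDERATED_QUERY = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

SELECT ?person ?dbpediaDeath ?wikidataDeath WHERE {
  ?person dbo:deathDate ?dbpediaDeath ;
          owl:sameAs ?item .
  FILTER (STRSTARTS(STR(?item), "http://www.wikidata.org/entity/"))
  SERVICE <https://query.wikidata.org/sparql> {
    ?item wdt:P570 ?wikidataDeath .
  }
}
LIMIT 10
"""

def run_query(endpoint: str, query: str) -> dict:
    """Send a SPARQL query and return the JSON result bindings."""
    url = endpoint + "?" + urllib.parse.urlencode(
        {"query": query, "format": "application/sparql-results+json"})
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Example usage (requires network access):
# results = run_query(DBPEDIA_NL, FEDERATED_QUERY)
```

The same pattern should work against other language DBpedias by swapping the endpoint URL.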
It makes it much easier to compare Wikidata and DBpedia, and when we take this seriously and apply some effort, we can build a tool like the one by Pasleim [2] for Wikipedias that do not have a category for people who died in a given year.
Thanks,
GerardM

[1] http://ultimategerardm.blogspot.nl/2017/04/wikidata-user-story-dbpedia-death-and.html
[2] http://tools.wmflabs.org/pltools/recentdeaths/

On 1 April 2017 at 11:34, Gerard Meijssen <gerard.meijs...@gmail.com> wrote:
> Hoi,
> I was asked by one of the DBpedia people to write a project plan. I gave it a try [1].
>
> The idea is to first compare DBpedia with Wikidata where a comparison is possible. Where it is not possible (because of differences in their classes, for instance), it is not what we focus on at first.
>
> Please comment on the talk page, and when there are things missing in the plan, please help improve it.
> Thanks,
> GerardM
>
> [1] https://www.wikidata.org/wiki/User:GerardM/DBpedia_for_Quality
>
> On 1 April 2017 at 10:44, Reem Al-Kashif <reemalkas...@gmail.com> wrote:
>> Hi
>>
>> I don't have an idea about how to develop this, but it seems like an interesting project!
>>
>> Best,
>> Reem
>>
>> On 30 Mar 2017 10:17, "Gerard Meijssen" <gerard.meijs...@gmail.com> wrote:
>>> Hoi,
>>> Much of the content of DBpedia and Wikidata has the same origin: harvesting data from a Wikipedia. There is a lot of discussion going on about quality, and one point that I make is that comparing "sources" and concentrating on the differences, particularly where statements differ, is where it is easiest to make a difference in quality.
>>>
>>> So given that DBpedia harvests both Wikipedia and Wikidata, can it provide us with a view of where a Wikipedia statement and a Wikidata statement differ? To make it useful, it is important to subset this data. I will not start with 500,000 differences, but I will begin when they are about a subset that I care about.
>>> When I care about entries for alumni of a university, I will consider curating the information in question, particularly when I know the language of the Wikipedia.
>>>
>>> Another thing that would promote the use of a tool like this is when, regularly (say once a month), the numbers are stored and trends are published.
>>>
>>> How difficult is it to come up with something like this? I know this tool would be based on DBpedia, but there are several reasons why that is good. First, it gives added relevance to DBpedia (without detracting from Wikidata), and secondly, as DBpedia updates on RSS changes for several Wikipedias, the effect of these changes is quickly noticed when a new set of data is requested.
>>>
>>> Please let us know what the issues are and what it takes to move forward with this. Does this make sense?
>>> Thanks,
>>> GerardM
>>>
>>> http://ultimategerardm.blogspot.nl/2017/03/quality-dbpedia-and-kappa-alpha-psi.html
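The "view where statements differ" asked for above could, in principle, be expressed as a variant of the same federated pattern: keep only the rows where the Wikipedia-harvested value and the Wikidata value disagree. A sketch, with the same assumptions as before about endpoints and owl:sameAs linking; the date comparison (truncating the Wikidata dateTime to its date portion) is illustrative and a real tool would need more careful value normalisation:

```python
# Sketch: a difference view as a federated SPARQL query. Only rows where
# the DBpedia (Wikipedia-harvested) death date and the Wikidata death
# date disagree survive the final FILTER.
# Assumptions: owl:sameAs linking to Wikidata entities; Wikidata dates
# arrive as xsd:dateTime, so we compare on the first ten characters.
DIFF_QUERY = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

SELECT ?person ?dbpediaDeath ?wikidataDeath WHERE {
  ?person dbo:deathDate ?dbpediaDeath ;
          owl:sameAs ?item .
  FILTER (STRSTARTS(STR(?item), "http://www.wikidata.org/entity/"))
  SERVICE <https://query.wikidata.org/sparql> {
    ?item wdt:P570 ?wikidataDeath .
  }
  # Keep only the disagreements (compare on the date portion).
  FILTER (STR(?dbpediaDeath) != SUBSTR(STR(?wikidataDeath), 1, 10))
}
LIMIT 100
"""
```

Subsetting (alumni of one university, one language edition) would be a matter of adding a triple pattern to the WHERE clause, and wrapping the query in a SELECT (COUNT(*) AS ?n) form would give the monthly number to store for trend publication.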
_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata