Hi Pablo,

Wikipedia edits are only one feed into the updated data in DBpedia Live;
changes in the data may also come from updates in the mappings (
mappings.dbpedia.org) or changes in the code (bug fixes in the extractors
or the addition of new extractors).
In addition, keeping a listener on Wikipedia edits will not work correctly,
since DBpedia Live might delay the extraction of a particular article for
various reasons.
The only way to get the current state of the Live KG is to create a feeder
for the live changesets:
http://live.dbpedia.org/changesets/

So, the project idea is to fetch these changesets and update a TPFS. We
already have a feeder that updates a triple store, from which you can reuse
some code:
https://github.com/dbpedia/dbpedia-live-mirror
This project is tailored to DBpedia Live, but your solution could possibly
be applied to any TPFS/change feed.
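For a feel of what such a feeder involves, here is a minimal sketch, not the
dbpedia-live-mirror code: it assumes the changesets are published as gzipped
N-Triples pairs (NNNNNN.added.nt.gz / NNNNNN.removed.nt.gz) under
year/month/day/hour directories, and it uses a plain in-memory set in place
of a real triple store. All helper names are made up for illustration.

```python
import gzip
import urllib.request

BASE = "http://live.dbpedia.org/changesets"


def changeset_urls(year, month, day, hour, seq):
    """Build the added/removed URL pair for one changeset.
    Assumed layout: BASE/YYYY/MM/DD/HH/NNNNNN.{added,removed}.nt.gz"""
    prefix = "%s/%04d/%02d/%02d/%02d/%06d" % (BASE, year, month, day, hour, seq)
    return prefix + ".added.nt.gz", prefix + ".removed.nt.gz"


def fetch_lines(url):
    """Download one gzipped N-Triples file and return its lines."""
    with urllib.request.urlopen(url) as resp:
        return gzip.decompress(resp.read()).decode("utf-8").splitlines()


def apply_changeset(store, added_lines, removed_lines):
    """Apply one changeset to a toy in-memory store (a set of
    N-Triples lines): process deletions first, then additions."""
    for line in removed_lines:
        store.discard(line.strip())
    for line in added_lines:
        line = line.strip()
        if line and not line.startswith("#"):
            store.add(line)
    return store
```

A real feeder would also persist the last applied changeset number so it can
resume after a restart, and would translate the added/removed lines into
updates against the TPFS backend instead of a Python set.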

Cheers,
Dimitris

On Fri, Mar 20, 2015 at 3:59 AM, Pablo Estrada <polecito...@gmail.com>
wrote:

> Hello Ruben!
> Thanks for your support. It took me a couple of days to prepare an answer,
> 'cause I was busy with school. I have fumbled around with the resources,
> and I'm going to try to summarize what we have and what we need for the
> project.
>
> - We want to query a Knowledge Graph using SPARQL. At this point, there
> are several ways of doing this: SPARQL endpoints, Triple Pattern Fragment
> Servers (TPFS), and data files.
> - A TPFS allows clients to run their SPARQL queries by providing just
> the data. The server reads that data from a binary file. The
> configuration of the server, the binary file, etc. are described in
> https://github.com/LinkedDataFragments/Server.js
> - Unfortunately, if someone modifies Wikipedia, the data in a TPFS becomes
> obsolete, and can only be updated by regenerating the binary file and
> restarting the server.
>
> We need...
> - Either to change the code in our current TPFS (in
> https://github.com/LinkedDataFragments/Server.js) to support 'live'
> updates of the data,
> - OR to build a new server implementation that supports live updates of
> the data
>
> Does that make sense?
>
> If that's correct, I have one more question:
> We want to update the data from modified infoboxes in Wikipedia. We would
> need a sort of 'listener' that watches for changes to Wikipedia; whenever
> a change is made, it looks at it, figures out whether it affects some part
> of the knowledge graph, and if it does, it updates the knowledge graph. Is
> this correct?
>
> Okay, that's it for now. I'm just getting started. Thanks Ruben again for
> your time. I'll keep in touch with this.
>
> Best,
> Pablo
>
> On Tue, Mar 17, 2015 at 6:10 AM Ruben Verborgh <ruben.verbo...@ugent.be>
> wrote:
>
>> Hi Pablo,
>>
>> > I am interested in project 5.14, relating to scalable querying of the
>> DBpedia data stream.
>>
>>
>> Great to hear! I can tell you it's a very cool project :-)
>>
>> > Any advice, or extra references would be appreciated.
>>
>> I definitely recommend trying out the server (
>> http://fragments.dbpedia.org/2014/en)
>> and client (http://client.linkeddatafragments.org/).
>> Our main publication on this topic should give many insights:
>> http://linkeddatafragments.org/publications/iswc2014.pdf.
>>
>> Here are some things you can try to warm up:
>> – Use the interface from the command line (for instance, using curl):
>> http://fragments.dbpedia.org/2014/en.
>> – Retrieve responses in various content types through content
>> negotiation. The server currently supports HTML, JSON(-LD), Turtle, TriG,
>> N-Triples, N-Quads.
>> – Parse one or more responses and try to understand their differences.
>> – Set up a local server using a dataset of your choice. (Many datasets
>> can be found here: http://lodlaundromat.org/wardrobe/.)
>> – Try to set up a local server with DBpedia 2014 or DBpedia live.
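To make the first two warm-up items above concrete, here is a hedged sketch
using only the Python standard library (an alternative to curl, not the
project's own tooling). The subject/predicate/object query-string parameters
are an assumption about the server's interface; check the hypermedia
controls in an actual fragment response for the exact names.

```python
import urllib.parse
import urllib.request

FRAGMENTS = "http://fragments.dbpedia.org/2014/en"


def fragment_request(base=FRAGMENTS, accept="text/turtle", **pattern):
    """Build a request for one triple pattern fragment, choosing the
    response format via the Accept header (content negotiation).
    `pattern` may hold subject/predicate/object keys; the parameter
    names are assumed, so verify them against the server's controls."""
    query = {k: v for k, v in pattern.items() if v is not None}
    url = base + ("?" + urllib.parse.urlencode(query) if query else "")
    return urllib.request.Request(url, headers={"Accept": accept})


def fetch_fragment(req):
    """Execute the request and return the response body as text."""
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

For example, requesting all triples with a given predicate as N-Triples:
req = fragment_request(accept="application/n-triples",
predicate="http://xmlns.com/foaf/0.1/name"), then fetch_fragment(req).
Comparing the same fragment in Turtle, JSON-LD, and N-Triples is a quick way
to cover the content-negotiation and parsing exercises in one go.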
>>
>> If you have any questions, just mail us!
>>
>> Best,
>>
>> Ruben
>
>
>
> ------------------------------------------------------------------------------
> Dive into the World of Parallel Programming The Go Parallel Website,
> sponsored by Intel and developed in partnership with Slashdot Media, is
> your hub for all things parallel software development, from weekly thought
> leadership blogs to news, videos, case studies, tutorials and more. Take a
> look and join the conversation now. http://goparallel.sourceforge.net/
> _______________________________________________
> Dbpedia-gsoc mailing list
> Dbpedia-gsoc@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/dbpedia-gsoc
>
>


-- 
Kontokostas Dimitris
