Hi Hady,
The dump script is definitely on the todo list, but it hasn't been implemented
yet.
For now you can get RDF for specific entities via the Special:EntityData page
using http://www.wikidata.org/wiki/Special:EntityData/Q{id}.nt or .rdf
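As a quick sketch, the URL pattern above can be built like this (the helper name is our own, not part of any API; Q1 is just an example id):

```python
# Sketch only: build the Special:EntityData URL for a given item id and
# serialization suffix ("nt" for N-Triples, "rdf" for RDF/XML).
def entity_data_url(item_id, fmt="nt"):
    return "http://www.wikidata.org/wiki/Special:EntityData/%s.%s" % (item_id, fmt)

# entity_data_url("Q1")        -> ".../Special:EntityData/Q1.nt"
# entity_data_url("Q1", "rdf") -> ".../Special:EntityData/Q1.rdf"
```

The resulting URL can then be fetched with curl or urllib to get the serialized entity data.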
Cheers,
Anja
On Jun 8, 2013, at 2:17, Hady elsahar <[email protected]> wrote:
> Hello all,
>
> I wonder if there's any news about the RDF dumps; should we send an email
> to Wikidata Tech about this?
> In either case, I don't mind implementing issue #38 as a warm-up task until
> we decide.
>
>
> Regarding the URIs, I don't mind using either one. However, from my
> experience consuming DBpedia when I was new to the Semantic Web, URIs with
> different namespaces were a bit confusing. For instance, I never knew the
> difference between <http://dbpedia.org/property/x> and
> <http://dbpedia.org/ontology/x>; we queried DBpedia many times to find out
> what kinds of properties were there and hardcoded all the types in our
> project, and after many trials we found that the first namespace has larger
> coverage.
>
> We should document this somewhere on the wiki, and also keep things as
> simple as possible: whenever there's an opportunity to merge things and
> unify namespaces for users, we should do so, in my view, and provide other
> entry points for specific sources. In the end, most consumers care about
> larger coverage and accurate data more than anything else.
>
>
> Thanks
> Regards
>
>
> On Fri, May 31, 2013 at 4:46 PM, Dimitris Kontokostas <[email protected]>
> wrote:
> Hi all,
>
> The whole purpose of the Wikidata integration was to use only the data we
> need and adjust it to the DBpedia ontology.
> http://www.wikidata.org/wiki/Special:EntityData/q1.nt has many triples we
> don't need and is missing some triples we could add. To do that, we can
> either build new tools or use the existing DBpedia software stack to extract
> what we want, the way we want.
>
> Maybe in the long run RDF is the way to go, but for the purposes of GSoC I'd
> vote for the DBpedia framework. Having external dependencies (Wikidata
> deployments, changes) could jeopardize the whole project. Hady has to
> officially start coding in ~20 days and needs a clear plan now. I say we
> can't depend on Wikidata at this point, and implementing issue #38 can only
> do the framework good.
>
> Regarding the URI scheme, I don't mind which one we use, but if we change
> the data, maybe we should change the namespace too.
>
> Anja should also give her opinion on this; her dual role here is a plus :)
>
> Cheers,
> Dimitris
>
>
> -------------------------------------------------
> Hady El-Sahar
> Research Assistant
> Center of Informatics Sciences | Nile University
>
> email : [email protected]
> Phone : +2-01220887311
> http://hadyelsahar.me/
>
>
>
_______________________________________________
Dbpedia-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/dbpedia-developers