I am uploading the index file temporarily to:

http://fungen.wur.nl/~jasperk/WikiData/

Jasper


> On 3 Nov 2017, at 10:05, Ettore RIZZA <ettoreri...@gmail.com> wrote:
> 
> Thank you for this feedback, Laura. 
> 
> Is the hdt index you got available somewhere on the cloud?
> 
> Cheers
> 
> 
> 2017-11-03 9:56 GMT+01:00 Osma Suominen <osma.suomi...@helsinki.fi>:
> Hi Laura,
> 
> Thank you for sharing your experience! I think your example really shows the 
> power - and limitations - of HDT technology for querying very large RDF data 
> sets. While I don't currently have any use case for a local, queryable 
> Wikidata dump, I can easily see that it could be very useful for running, 
> e.g., resource-intensive analytic queries. Having access to a recent 
> hdt+index dump of Wikidata would make it very easy to start doing that. So I second 
> your plea.
> 
> -Osma
> 
> 
> Laura Morales wrote on 03.11.2017 at 09:48:
> Hello list,
> 
> a very kind person from this list generated the .hdt.index file for me, 
> using the 1-year-old Wikidata HDT file available at the rdfhdt website. So I 
> was finally able to set up a working local endpoint using HDT+Fuseki. Setup 
> was easy, and launch time (for Fuseki) was quick (a few seconds); the only 
> change I made was to replace -Xmx1024m with -Xmx4g in the Fuseki startup script 
> (btw I'm not very proficient in Java, so I hope this is the correct way). 
> I've run some queries too. Simple select or traversal queries seem fast to 
> me (I haven't measured them, but the response is almost immediate), while other 
> queries such as "select distinct ?class where { [] a ?class }" take several 
> seconds or a few minutes to complete, which kinda tells me the HDT indexes 
> don't work well on all queries. But otherwise, for simple queries it works 
> perfectly! At least I'm able to query the dataset!
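
As a side note, the heap change described above can be sketched as follows. The variable name (JVM_ARGS) and its default value are assumptions based on typical Fuseki startup scripts; check your own fuseki-server script for the exact line.

```shell
# Hypothetical sketch: bumping the JVM heap for Fuseki from 1 GB to 4 GB.
# The line below simulates editing the startup script's default heap setting;
# in practice you would change the corresponding line in fuseki-server itself.
echo 'JVM_ARGS=${JVM_ARGS:--Xmx1024m}' | sed 's/-Xmx1024m/-Xmx4g/'
# prints: JVM_ARGS=${JVM_ARGS:--Xmx4g}
```

Exporting JVM_ARGS in the environment before launching is an alternative to editing the script, since the `${JVM_ARGS:-...}` form only supplies a default.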
> In conclusion, I think this more or less counts as positive feedback for 
> using HDT on a "commodity computer", which means it can be very useful for 
> people like me who want to use the dataset locally but can't set up a 
> full-blown server. If others want to try as well, they can offer more 
> (hopefully positive) feedback.
> For all of this, I wholeheartedly plead with any Wikidata dev to please consider 
> scheduling an HDT dump (.hdt + .hdt.index) along with the other regular dumps 
> that are created weekly.
> 
> Thank you!!
> 
> _______________________________________________
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
> 
> 
> 
> -- 
> Osma Suominen
> D.Sc. (Tech), Information Systems Specialist
> National Library of Finland
> P.O. Box 26 (Kaikukatu 4)
> 00014 HELSINGIN YLIOPISTO
> Tel. +358 50 3199529
> osma.suomi...@helsinki.fi
> http://www.nationallibrary.fi
> 