> 100 GB "with an optimized code" could be enough to produce an HDT like that.

The current software definitely cannot handle Wikidata with 100 GB of RAM; that 
was tried before and it failed.
I'm glad to see that new code will be released to handle large files. After 
skimming that paper, it looks like they split the RDF source into multiple 
files, convert each piece to HDT, and then "cat" the pieces into a single HDT 
file (roughly as sketched below). 100 GB is still a pretty large footprint, but 
I'm so glad that they're working on this. A 128 GB server is *way* more 
affordable than one with 512 GB or 1 TB!
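
Just so I'm picturing it right, here's a rough sketch of that split/convert/merge
workflow in Python. This is my own guess at the pipeline, not the authors' code:
I'm assuming the hdt-java command-line wrappers rdf2hdt.sh and hdtCat.sh are on
PATH (the exact tool names and arguments may differ), and that the source is
N-Triples so it can be split safely on line boundaries.

#!/usr/bin/env python3
"""Sketch of a split -> convert -> cat pipeline for building a big HDT.

Assumptions (mine, not the paper's): the hdt-java wrappers rdf2hdt.sh
and hdtCat.sh are on PATH, and the source is N-Triples, so it can be
split on line boundaries.
"""
import subprocess
from pathlib import Path

CHUNK_LINES = 50_000_000  # tune so one chunk's conversion fits in RAM

def split_ntriples(src: Path, workdir: Path) -> list:
    """Stream the source file into line-aligned chunks."""
    chunks, out = [], None
    with src.open() as f:
        for i, line in enumerate(f):
            if i % CHUNK_LINES == 0:
                if out:
                    out.close()
                path = workdir / f"chunk-{len(chunks):04d}.nt"
                chunks.append(path)
                out = path.open("w")
            out.write(line)
    if out:
        out.close()
    return chunks

def rdf2hdt(nt: Path) -> Path:
    """Convert one N-Triples chunk to HDT."""
    hdt = nt.with_suffix(".hdt")
    subprocess.run(["rdf2hdt.sh", str(nt), str(hdt)], check=True)
    return hdt

def hdt_cat(a: Path, b: Path, out: Path) -> Path:
    """Merge two HDT files into one."""
    subprocess.run(["hdtCat.sh", str(a), str(b), str(out)], check=True)
    return out

def build(src: Path, workdir: Path, final: Path) -> None:
    """Convert each chunk, then fold the pieces into a single HDT."""
    workdir.mkdir(exist_ok=True)
    parts = [rdf2hdt(c) for c in split_ntriples(src, workdir)]
    merged = parts[0]
    for i, part in enumerate(parts[1:], start=1):
        merged = hdt_cat(merged, part, workdir / f"merged-{i:04d}.hdt")
    merged.rename(final)

if __name__ == "__main__":
    build(Path("wikidata.nt"), Path("work"), Path("wikidata.hdt"))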

I can't wait to try the new code myself.

