Hi Sergio,
We have written the attached program to try and test the issue you report:
import com.hp.hpl.jena.query.*;
import com.hp.hpl.jena.rdf.model.RDFNode;
import com.hp.hpl.jena.graph.Triple;
import com.hp.hpl.jena.graph.Node;
import com.hp.hpl.jena.graph.Graph;
import com.hp.hpl.jena.rdf.model.*;
imp
Hi Hugh,
Thanks for the link to that script. It seems like it should save me a lot of
hassle. However, at the moment I'm running into another small problem. I've
followed the instructions in the README.txt to the letter, and have
executed the following command to initialise the installation
Hi Alex,
If a local DBpedia instance is what you are trying to set up, we have
the following installation script used for loading the DBpedia 3.2
datasets:
http://s3.amazonaws.com/dbpedia-data/dbpedia_load.tar.gz
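For reference, loading a single dataset file by hand comes down to a call
of this form in isql (the path and graph IRI below are placeholders, not
necessarily what the script itself uses):

SQL> -- parse a local N-Triples dump into the target graph, multi-threaded
SQL> DB.DBA.TTLP_MT_LOCAL_FILE ('/data/dbpedia/articles_label_en.nt', '', 'http://dbpedia.org');
SQL> -- checkpoint so the newly loaded triples are flushed to the database file
SQL> checkpoint;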
The loading of the DBpedia datasets can take many hours depending on
Hi Hugh, Egon,
I've tried using DB.DBA.TTLP_MT_LOCAL_FILE to load a large .n3 file (1GB+),
and it seems to be working (or at least doing something), having escaped
the \ in the path.
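For reference, the call was along these lines (the path and graph IRI here
are placeholders rather than my actual ones):

SQL> -- note the doubled backslashes: \ has to be escaped inside the SQL string literal
SQL> DB.DBA.TTLP_MT_LOCAL_FILE ('C:\\dbpedia\\infobox_en.n3', '', 'http://dbpedia.org');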
However, it is taking an absurdly long time to
actually finish (I had to terminate the task), whereas uploading via
Hi Hugh,
Indeed, I was attempting to upload the .nt files using WebDAV. Perhaps this
is a restriction of WebDAV? I'm pretty sure it's not the client (curl), as
I've already looked into that.
Thanks for your suggestion regarding the TTLP functions. I tried out the one
suggested at that link, bu
Hi Alex,
Are you uploading these 4GB files via WebDAV? You might want to try
the Virtuoso TTLP family of functions for uploading such large
datasets as detailed at:
http://docs.openlinksw.com/virtuoso/fn_ttlp_mt_local_file.html
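For example (the file name here is illustrative, and the file's directory
must be listed in the DirsAllowed setting of your virtuoso.ini for the
server to be able to read it):

SQL> -- parse the file contents into the target graph using multiple threads
SQL> DB.DBA.TTLP_MT (file_to_string_output ('dbpedia_3.2_infobox.nt'), '', 'http://dbpedia.org');
SQL> -- flush the newly loaded triples to disk
SQL> checkpoint;

For multi-gigabyte files, the TTLP_MT_LOCAL_FILE variant documented at the
link above is probably the better fit, as it reads the file from the server
filesystem directly rather than through a string session.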
Best Regards
Hugh Williams
Professional Services
OpenLink Software
Hello,
I've recently been experimenting with using Virtuoso OSE as an RDF
store/SPARQL server, and seem to have things mostly set up now. I have
uploaded various N3 files to rdf_sink via WebDAV (the DBpedia datasets, to
be specific), and all of the files succeeded except for the two largest
ones.
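In case it is useful for diagnosis, a triple count over the target graph
shows how much of each upload actually made it in (the graph IRI below is a
placeholder for the one I'm loading into), e.g. from isql:

SQL> -- COUNT(*) is Virtuoso's aggregate extension to SPARQL
SQL> SPARQL SELECT COUNT(*) FROM <http://dbpedia.org> WHERE { ?s ?p ?o };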