Hi Alex,

If a local DBpedia instance is what you are trying to set up, we have the following installation script used for loading the DBpedia 3.2 datasets:

        http://s3.amazonaws.com/dbpedia-data/dbpedia_load.tar.gz

Loading the DBpedia datasets can take many hours depending on the speed of the machine, as you have seen yourself with the ttlp_* functions our installer script uses.
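For reference, the heart of the script is simply repeated calls to those functions from isql, along the following lines (a minimal sketch only; the file path and graph IRI are illustrative placeholders, not the script's actual values):

        -- Load one Turtle/N-Triples dump into the DBpedia graph, then
        -- checkpoint so the loaded data survives a server restart.
        -- Path and graph IRI are placeholders; adjust to your installation.
        SQL> DB.DBA.TTLP_MT_LOCAL_FILE ('/data/dbpedia/infobox_en.nt', '', 'http://dbpedia.org');
        SQL> checkpoint;

One such call is made per dataset file.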

We also provide a DBpedia Virtuoso Amazon EC2 AMI to enable users to instantiate a running DBpedia instance in the cloud in less than an hour, as detailed at:

        http://virtuoso.openlinksw.com/dataspace/dav/wiki/Main/VirtEC2AMIDBpediaInstall

Best Regards
Hugh Williams
Professional Services
OpenLink Software
Web: http://www.openlinksw.com
Support: http://support.openlinksw.com
Forums: http://boards.openlinksw.com/support



On 5 Sep 2009, at 13:04, Alex wrote:

Hi Hugh, Egon,

I've tried using DB.DBA.TTLP_MT_LOCAL_FILE to load a large .n3 file
(1GB+), and it seems to be working (or at least doing something),
having escaped the \ in the path. However, it is taking an absurdly
long time to actually finish (I had to terminate the task), whereas
uploading via WebDAV to rdf_sink was very quick (< 30 seconds for a
1GB file).
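For reference, the call I'm running looks roughly like this (the path
shown is illustrative, not my actual one):

        SQL> DB.DBA.TTLP_MT_LOCAL_FILE ('C:\\data\\dbpedia\\dump.n3', '', 'http://dbpedia.org');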

Firstly, are the two methods equivalent? What I want to do is simply
make the RDF triple store (for DBpedia) available via a SPARQL endpoint.
There must be a recommended way of doing this - is this method it?

Thanks,
Alex
