[ https://issues.apache.org/jira/browse/STANBOL-433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13171501#comment-13171501 ]
Stephen Bayliss edited comment on STANBOL-433 at 12/17/11 9:56 AM:
-------------------------------------------------------------------
This was loading an ontology from a file of around 200MB; the file was in fact
an RDF SKOS thesaurus. We would actually like to be able to load files larger
than this (the vocabulary is currently split into three files - combining these
would give a single file of around 500MB).
The ontology was being added via the Space.addOntology(OntologyInputSource)
method.
We tried loading this in a VM of around 3GB.
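For reference, here is a minimal sketch of the kind of loading code involved.
It assumes the OWL API is used to parse the file before it is handed to the
space; only the Space.addOntology(OntologyInputSource) call is taken from this
report, while the file name and the input-source wrapper shown in the comment
are illustrative assumptions.
{code:java}
// Minimal sketch of the failing load, assuming the OWL API is used to parse
// the RDF file. Only Space.addOntology(OntologyInputSource) comes from the
// report; the file name and the wrapper class are assumptions.
import java.io.File;

import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyManager;

public class LoadLargeThesaurus {
    public static void main(String[] args) throws Exception {
        // Parsing a ~200MB RDF/SKOS file materialises every axiom in memory,
        // which is where the heap pressure starts.
        OWLOntologyManager mgr = OWLManager.createOWLOntologyManager();
        OWLOntology thesaurus =
                mgr.loadOntologyFromOntologyDocument(new File("skos-thesaurus.rdf"));

        // The parsed ontology would then be wrapped in an OntologyInputSource
        // and handed to the custom space, e.g. (hypothetical wrapper class):
        // customSpace.addOntology(new RootOntologySource(thesaurus));
    }
}
{code}
Note that the in-memory OWL API representation of an ontology is typically
several times larger than the serialized RDF file, so a 200MB file parsed this
way can plausibly exhaust a heap of around 3GB.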
> Loading large ontology using Java API gives out-of-memory error
> ---------------------------------------------------------------
>
> Key: STANBOL-433
> URL: https://issues.apache.org/jira/browse/STANBOL-433
> Project: Stanbol
> Issue Type: Bug
> Components: Ontology Manager
> Reporter: Stephen Bayliss
> Priority: Minor
>
> Loading a large ontology - in our case an RDF file on the order of hundreds
> of megabytes - leads to an out-of-memory error.
> The ontology is being loaded into a custom space, using an
> OntologyInputSource.