On 2021-01-13 2:53 am, Jack Hodges wrote:
> I was looking in the archives to see if there is a limit to the size
> of files using Semantic XML and found only 1 thread from way back
> (2009). I have to import large datasets into a knowledge graph
> (offline for now) and, like the original poster, I have no problem
> with very small files but even moderately-sized files are producing
> errors.
Are these out-of-memory errors, or something else?
> I am using TBCME version 6.3 on a MacBook running macOS Catalina
> 10.15.7 with 32 GB of memory. I have increased the memory allocation
> (in the Contents/Eclipse/TopBraid Composer.ini file) to 12 GB, with
> no improvement.
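For anyone comparing settings: TopBraid Composer is an Eclipse-based application, so the heap ceiling comes from the -Xmx line in that .ini file. A typical edit looks like the following (the surrounding lines are illustrative, not copied from an actual Composer.ini):

```
-vmargs
-Xms512m
-Xmx12g
```

Everything after -vmargs is passed to the JVM, and each option must sit on its own line; changing -Xmx elsewhere in the file has no effect.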
Semantic XML is quite chatty and produces many triples for each XML
element, so some overhead is to be expected. We didn't build in any
size limit, though (none that I can think of). If you get stuck, there
are all kinds of third-party approaches that would let you pre-process
the XML, for example using XSLT to produce RDF/XML, or any API in an
imperative language. Not every problem can be solved by TopBraid alone.
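To make the pre-processing suggestion concrete, here is a minimal sketch in Python using only the standard library. Everything in it (the base URI, the hasChild/value predicate names, the per-element numbering) is an illustrative assumption, not a TopBraid or Semantic XML API; it only shows the general shape of flattening XML into a compact set of N-Triples before import:

```python
# Hypothetical pre-processing sketch: walk an XML tree and emit a small,
# controlled set of N-Triples instead of importing the raw XML through
# Semantic XML. All URIs and predicate names below are invented examples.
import xml.etree.ElementTree as ET

BASE = "http://example.org/data/"  # assumed base namespace

def xml_to_ntriples(xml_text):
    """Emit one node URI per element, plus triples for nesting,
    text content, and attributes. (No N-Triples escaping is done
    here; real data would need proper quoting of special characters.)"""
    root = ET.fromstring(xml_text)
    triples = []
    counter = [0]  # gives each element a stable numeric suffix

    def visit(elem, parent_uri):
        counter[0] += 1
        uri = f"<{BASE}{elem.tag}/{counter[0]}>"
        if parent_uri:
            triples.append(f"{parent_uri} <{BASE}hasChild> {uri} .")
        if elem.text and elem.text.strip():
            triples.append(f'{uri} <{BASE}value> "{elem.text.strip()}" .')
        for name, val in elem.attrib.items():
            triples.append(f'{uri} <{BASE}{name}> "{val}" .')
        for child in elem:
            visit(child, uri)

    visit(root, None)
    return triples

if __name__ == "__main__":
    sample = '<catalog><book id="b1"><title>RDF Basics</title></book></catalog>'
    for t in xml_to_ntriples(sample):
        print(t)
```

Because a script like this streams its output, it can be run on files far larger than what a tool would load into memory at once; the resulting N-Triples file can then be imported directly.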
Holger