Hello,

I just wanted to know whether it is also possible to load (and
manipulate) large files, which can be hundreds of megabytes or even
multiple gigabytes. I tried to load a 240 MB file into Sedna via the
PHP API, and the import took 463 seconds (on 64-bit Windows). I noticed
that the se_sm.exe process was busy the whole time, but it only used
about 14 MB of memory.

How can the import be accelerated? I couldn't find anything about this
topic in the documentation.

I also noticed that there is only one function for loading XML
documents, sedna_load, and it requires the whole document to be passed
as a string. That means the entire 240 MB file first has to be read
into a string (i.e. into memory) before it can be handed to the
function. Or is there a better way to do this?
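For illustration, here is a minimal sketch of the loading pattern described
above and why it is memory-hungry: file_get_contents materializes the whole
file as one PHP string before sedna_load ever sees it. The small generated
file, the document name 'mydoc', and the exact sedna_load call are
assumptions for the sketch (I don't have the driver's signature in front of
me), so the Sedna call is guarded and commented:

```php
<?php
// Stand-in for a large XML file (a real case would be the 240 MB document).
$path = tempnam(sys_get_temp_dir(), 'xml');
file_put_contents($path, str_repeat('<row/>', 100000));

$before = memory_get_usage(true);

// The whole document becomes one in-memory PHP string first --
// this is the step that scales with file size.
$xml = file_get_contents($path);

$after = memory_get_usage(true);
printf("buffered %d bytes, memory grew by ~%d bytes\n",
       strlen($xml), $after - $before);

// Only now can the string be handed to Sedna (call is assumed/guarded):
if (function_exists('sedna_load')) {
    // sedna_load($conn, $xml, 'mydoc');  // $conn: an open Sedna connection
}

unlink($path);
```

So peak PHP memory grows by at least the size of the document being loaded,
independent of what the server process itself uses.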

Best regards,
Mario

_______________________________________________
Sedna-discussion mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/sedna-discussion
