Hi Shashidhar,

I’m wondering how large the original file was; probably not 32 GB. I’m also 
wondering how it got inserted without trouble in the first place. It is almost 
as if the memory settings were tuned down afterwards.

I’d decrease the in-memory list size to a more reasonable value, and take a 
look at the in-memory tree size as well. It is also recommended to keep the 
journal size at least as large as the list size plus the tree size.
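As a quick illustration of that rule of thumb (a minimal sketch; the values and the function name are purely hypothetical, and the real settings live in your database and forest configuration):

```python
# A minimal sketch of the journal-size rule of thumb above.
# All values are in MB and purely illustrative; the real numbers come
# from the database (in-memory list/tree size) and forest (journal size)
# configuration.
def journal_size_warnings(list_mb, tree_mb, journal_mb):
    """Return warnings when the journal is smaller than list + tree."""
    minimum = list_mb + tree_mb
    if journal_mb < minimum:
        return [f"journal {journal_mb} MB < list + tree = {minimum} MB"]
    return []

# Hypothetical example: a 512 MB list and a 128 MB tree need a journal
# of at least 640 MB.
print(journal_size_warnings(512, 128, 512))
```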

You could of course delete the file; that should just work. I can’t judge 
whether the file contains valuable information, though.

I’d also suggest splitting that file into smaller parts. It is best to do that 
at ingest time, but if that is not an option, fragmentation might help here. A 
word of warning, though: fragmentation also influences how queries behave and 
has some other side effects, so we typically recommend against it.

Kind regards,
Geert

From: Shashidhar Rao 
<[email protected]>
Date: Monday, May 11, 2015 at 6:52 PM
To: Geert Josten <[email protected]>
Subject: XDMP-FRAGTOOLARGE

Hi Geert,

I am getting the error below. I tried posting to the list but have not gotten 
any replies.

Can you suggest anything to resolve this error?


There is currently an XDMP-FORESTERR: Error in reindex of forest PROD_DB_1: 
XDMP-REINDEX: Error reindexing fn:doc("/home/data/TD078999.XML"):
XDMP-FRAGTOOLARGE: Fragment of /home/data/TD078999.XML too large for in-memory 
storage:
: In-memory list storage full; list: table=100%, wordsused=50%, wordsfree=25%,
overhead=25%; tree: table=0%, wordsused=6%, wordsfree=94%, overhead=0% 
exception.

Any suggestion on how to resolve this error?

My in-memory list size is 32699 MB.

How can I increase this value or can I delete this file?

Please help

Thanks
_______________________________________________
General mailing list
[email protected]
Manage your subscription at: 
http://developer.marklogic.com/mailman/listinfo/general
