Hi gkwelding,
I have checked explicitly on my box, and the values for MAX_OPEN_FILES and
MAX_MAP_COUNT have both been set to 65535.
Hi Clinton/gkwelding,
Any suggestions to overcome this issue? I tried increasing the heap size and
I'm still getting the same error.
Below are the call logs for the issue:
error: array index out of bounds
[2014-03-05 21:03:25,212][WARN ][monitor.jvm] [Node225] [gc][young][5079][15545] duration
My next guess, and it really is a guess now, is that you might be running
out of file descriptors. As
per
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/setup-service.html
make sure MAX_OPEN_FILES and MAX_MAP_COUNT aren't too low. This may also
help,
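For the packaged (service) install that page describes, those two settings live in the service defaults file; a minimal sketch, assuming the Debian/Ubuntu layout (the path and values below are illustrative, not taken from this thread):

```
# /etc/default/elasticsearch (illustrative values)
# Maximum number of open files for the elasticsearch user
MAX_OPEN_FILES=65535
# Maximum number of memory map areas (used by mmapfs-backed indices)
MAX_MAP_COUNT=262144
```

The service wrapper applies these before starting the JVM, so changes take effect on the next restart of the elasticsearch service.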
MAX_OPEN_FILES and MAX_MAP_COUNT are not being set explicitly, so they only
have the default value, i.e. 65535.
And it's still crashing with the same error :(
Have you been able to explicitly check what the max file descriptors limit
is set to on your box?
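For reference, the limit that actually applies to a process can be read from inside that process; a minimal Python sketch (run it as the same user that runs Elasticsearch; the 65535 threshold mirrors the value discussed above):

```python
import resource

# Query the open-files limit for the current process.
# The soft limit is what actually applies; the hard limit is the
# ceiling an unprivileged process may raise the soft limit to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft open-files limit: {soft}")
print(f"hard open-files limit: {hard}")

# Warn if the soft limit looks low for an Elasticsearch node.
if soft < 65535:
    print("warning: soft limit below 65535 -- consider raising ulimit -n")
```

This only reflects the shell/service environment it runs in, so the useful check is from the same context that launches the ES process.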
Other than that I'll have to step out of this conversation and hand over to
somebody who actually knows what they're talking about...
On Friday, March 7, 2014 12:48:25 PM UTC, Prashy wrote:
Hi Clinton/gkwelding,
I tried sending the documents one by one for indexing as well.
So suppose my document is 5 MB: it gets indexed without any exception. But on
the other hand, some documents that are only a few KB or about 1 MB in size
give the out-of-memory exception.
Though mapping are
The message is pretty obvious. Your node is running out of heap memory...
Increase it.
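In the 1.x line that advice maps to the heap environment variables read by the startup script; a sketch, assuming a tarball install (the 2g value is illustrative):

```
# Either set min/max separately...
export ES_MIN_MEM=2g
export ES_MAX_MEM=2g
# ...or, equivalently, set both at once:
export ES_HEAP_SIZE=2g
./bin/elasticsearch
```

Setting min and max to the same value avoids heap-resize pauses at runtime.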
On Tuesday, March 4, 2014 1:36:51 PM UTC, Prashy wrote:
Hi ES users,
I am getting the following exception while indexing a huge amount of data
(say ~5GB) to the ES node.
Exception:
1) array index out
I tried increasing the heap value to 2GB as well, via ES_MAX_MEM: 2g, but it
gave the same error.
--
View this message in context:
http://elasticsearch-users.115913.n3.nabble.com/Error-array-index-out-of-bounds-java-lang-OutOfMemoryError-Java-heap-space-tp4050914p4050916.html
Sent from the Elasticsearch Users mailing list archive at Nabble.com.
Adding a bit more to my rather short answer. Both exceptions essentially
mean the same thing, so I would follow the basic heap allocation advice:
allocate 50% of your system RAM to ES, as catastrophic things happen when ES
runs out of RAM, and leave the other 50% to the system. So if you have a server
You also don't give us much information on how you're trying to index this
3gb of information. Are you using the bulk API? Are you refreshing after
every index action? etc...
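On the bulk side, one common source of heap pressure is sending one enormous bulk request instead of bounded batches. A minimal pure-Python sketch of size-bounded batching (the 5 MB cap and the pre-serialized document shape are illustrative assumptions, not details from this thread):

```python
def chunk_by_size(docs, max_bytes=5 * 1024 * 1024):
    """Yield lists of documents whose combined size stays under max_bytes.

    Each doc is assumed to be an already-serialized str/bytes payload;
    a real bulk indexer would pair each with its action metadata line.
    """
    batch, batch_size = [], 0
    for doc in docs:
        size = len(doc)
        # Start a new batch once adding this doc would exceed the cap.
        if batch and batch_size + size > max_bytes:
            yield batch
            batch, batch_size = [], 0
        batch.append(doc)
        batch_size += size
    if batch:
        yield batch

# Example: three ~2 MB docs with a 5 MB cap -> batches of 2 docs, then 1.
docs = ["x" * (2 * 1024 * 1024) for _ in range(3)]
print([len(b) for b in chunk_by_size(docs)])  # -> [2, 1]
```

Each batch would then be sent as its own bulk request, keeping the per-request memory footprint on the node bounded.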
On Tuesday, March 4, 2014 1:40:58 PM UTC, Prashy wrote:
I tried increasing the heap value by 2GB as well by
Just wanted to note that when I was using 1GB as the heap size I was getting
the error, and when I increased it to 2GB (heap; the system has 4GB) I got
the error at the same point.
So if I increased the memory from 1 GB to 2 GB, it should at least process
one more record compared to
Hi gkwelding,
I am using the bulk API for indexing the data, and the refresh parameter is
not set.
So what could be the cause of that exception?
Let me know if you require any other input.