Yes, Kibana will load whatever you ask for, *but* ES has to maintain
index metadata for every index in memory.
Those two combined are pushing things too far for your heap.
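As a rough illustration of that metadata overhead (assuming a node reachable at localhost:9200), the `_cat/indices` API available in the 1.x line lists every index the cluster is tracking; each open index adds cluster-state and metadata that live on the heap:

```shell
# List every index with its on-disk size and doc count.
# The more indices listed here, the more metadata the heap carries.
curl -s 'localhost:9200/_cat/indices?v&h=index,store.size,docs.count'
```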
Regards,
Mark Walkom
Infrastructure Engineer
Campaign Monitor
email: ma...@campaignmonitor.com
web: www.campaignmonitor.com
>
> How much RAM you need depends on how long you want to keep your data
> around for. So, given you have ~200GB now on 4GB of RAM, you can probably
> extrapolate that out based on your needs.
Isn't my problem more with the 9GB *daily* index than with the ~200GB
total (20 days × 9GB) of indexes?
Correct
It's standard practice to use 50% of system memory for the heap.
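A minimal sketch of that 50% rule, assuming the 1.x-era packaging where the heap was sized via the `ES_HEAP_SIZE` environment variable (the exact variable name is era-specific):

```shell
# On a 16GB machine, give Elasticsearch half the RAM and leave the
# rest for the OS filesystem cache, which Lucene depends on heavily.
export ES_HEAP_SIZE=8g
./bin/elasticsearch
```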
How much RAM you need depends on how long you want to keep your data around
for. So, given you have ~200GB now on 4GB of RAM, you can probably
extrapolate that out based on your needs.
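The extrapolation above is simple arithmetic; a sketch using the figures from this thread (~9GB of new index data per day, kept for 20 days):

```python
# Back-of-envelope retention math using the numbers in this thread.
daily_gb = 9          # new index data written per day
retention_days = 20   # how long indices are kept before deletion

total_gb = daily_gb * retention_days  # data the node must hold and track
print(total_gb)  # 180 (GB), consistent with the ~200GB mentioned above
```

Scaling `retention_days` up or down gives a first estimate of how the total data (and hence the memory pressure) grows with your retention policy.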
Regards,
Mark Walkom
Infrastructure Engineer
Campaign Monitor
>
> add more memory
I am doing 15 million docs, which total ~9GB. The average doc size is
~2KB.
1. How much memory would you suggest for my use case?
2. Also, is it prudent for me to dedicate half of the OS memory to
Elasticsearch?
On Monday, 12 May 2014 14:03:19 UTC+5:30, Mark Walkom wrote:
You need to reduce your data size, add more memory or add another node.
Basically, you've reached the limits of that node.
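One way to "reduce your data size" on that era of Elasticsearch is to delete indices that have aged out of the retention window; a hedged sketch, where the daily index name is a hypothetical example of the common logstash-style convention:

```shell
# Drop an index older than the 20-day window (name is illustrative).
# Deleting a whole daily index is cheap compared to deleting docs.
curl -XDELETE 'localhost:9200/logstash-2014.04.20'
```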
Regards,
Mark Walkom
Infrastructure Engineer
Campaign Monitor
email: ma...@campaignmonitor.com
web: www.campaignmonitor.com
On 12 May 2014 16:38, Abhishek Tiwari wrote: