Matthew -

That did it!

Actually, I tried with both settings, and also with just the ring_size change.

Setting ring_size to 8 got rid of the crashing. I'll have to do a bit more 
reading on this setting, I suppose. I have a much more memory-constrained 
virtual machine on my local desktop running with just the default install 
settings, and it doesn't crash.
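
(My rough understanding so far, which may well be off: on a single-node install 
every partition's vnode lives on that one node, and each vnode gets its own 
leveldb instance for KV plus additional ones for AAE, so dropping ring_size 
from the default 64 to 8 cuts the number of leveldb instances competing for 
memory by roughly a factor of eight.)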

Thanks!

Damion

On Jan 19, 2017, at 7:57 AM, Matthew Von-Maszewski 
<matth...@basho.com> wrote:

Damion,

Add the following settings within riak.conf:

leveldb.limited_developer_mem = on
ring_size = 8

Erase all data / vnodes and start over.
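
Something along these lines should do it on a default Linux package install 
(the path is an assumption; adjust it if you have changed platform_data_dir, 
and note this wipes all Riak data on the node):

  riak stop
  rm -rf /var/lib/riak/*
  riak start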

Matthew


On Jan 19, 2017, at 8:51 AM, Junk, Damion A 
<jun...@purdue.edu> wrote:

Hi Magnus -

I've tried a wide range of values for leveldb.maximum_memory_percent, from 5 
to 70. I also tried the leveldb.maximum_memory setting in bytes, ranging from 
500MB to 4GB. I get the same results in the crash/console log no matter which 
settings I use, but the log messages seem to indicate an issue with yokozuna 
rather than with leveldb itself, from what I can tell.

I set the max heap (-Xmx) to 2G for Solr as well.

From the log messages, it looks like it's not actually the KV leveldb system 
that's crashing, but the yokozuna system. I'm not sure how to control or set 
memory here:

{badmatch,{error,{db_open,"IO error: lock /var/lib/riak/yz_anti_entropy/639406966332270026714112114313373821099470487552/LOCK: Cannot allocate memory"}

This is a development node, running as a single (non-clustered) Riak node. It 
has 14GB of memory, and at the time of trying changes with Riak, 9GB were free.


To Recap:

There are no keys/values in the database at all.
The only default settings I changed were:

storage_backend = leveldb
search = on

and when that didn't work, I started changing:

search.solr.jvm_options = -d64 -Xms1g -Xmx2g -XX:+UseStringCache 
-XX:+UseCompressedOops
leveldb.maximum_memory_percent = 5 .. 70

and then when nothing seemed to change:

leveldb.maximum_memory =  1000000 ... 4000000000
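
(In case it helps anyone else: assuming this is a Riak 2.x install, running 
riak chkconfig should at least confirm that riak.conf parses cleanly and that 
the setting names are recognized.)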


Thanks for any assistance!


Damion


On Jan 19, 2017, at 3:33 AM, Magnus Kessler 
<mkess...@basho.com> wrote:

Hi Damion,

Let me first state that AAE always uses leveldb, regardless of the storage 
backend chosen for Riak KV data. Could you please state how much physical 
memory your Riak nodes have, and what you have configured for 
"leveldb.maximum_memory_percent" in "riak.conf"? Have you changed the settings 
for "search.solr.jvm_options", in particular the memory allocated to Solr?

As a general rule, leveldb should have at least 350MB of memory available per 
partition, and performance has been shown to increase with up to 2GB (2.5 GB 
when also using Search and AAE) per partition. Please check that you have 
enough memory available in your system.
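
As a rough illustration (assuming the default ring_size of 64 and a single 
node holding every partition): 64 partitions x 350MB is roughly 22GB just to 
meet the minimum, and 64 x 2.5GB is around 160GB for the recommended figure 
with Search and AAE. A development machine with 14GB of memory will therefore 
need either a much smaller ring or a deliberately reduced leveldb memory 
budget.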

Kind Regards,

Magnus

--
Magnus Kessler
Client Services Engineer
Basho Technologies Limited

Registered Office - 8 Lincoln’s Inn Fields London WC2A 3BP Reg 07970431

_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com

