Re: How to update custom solr schema?

2014-03-22 Thread Buri Arslon
Never mind. Got some hints from here: https://github.com/basho/yokozuna/issues/130

On Sat, Mar 22, 2014 at 10:15 PM, Buri Arslon wrote:
> Hey guys!
>
> I modified my custom schema and tried these steps to update it in Riak:
>
> 1. Empty buckets
> 2. run "create_search_schema"
> 3. run "crea

How to update custom solr schema?

2014-03-22 Thread Buri Arslon
Hey guys!

I modified my custom schema and tried these steps to update it in Riak:

1. Empty buckets
2. run "create_search_schema"
3. run "create_search_index"
4. stop and start riak

But it didn't work. Solr isn't picking up the new schema. What am I missing here?

thanks,
-- Buriwoy
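For reference, steps 2 and 3 map onto the following Erlang PB client calls. This is only a sketch: the node address, the schema file, and the "my_schema"/"my_index" names are placeholders, not taken from this thread.

    %% Connect to a local node on the default PB port (assumed).
    {ok, Pid} = riakc_pb_socket:start_link("127.0.0.1", 8087),

    %% Step 2: upload (or re-upload) the custom schema XML under a name.
    {ok, SchemaXML} = file:read_file("my_schema.xml"),
    ok = riakc_pb_socket:create_search_schema(Pid, <<"my_schema">>, SchemaXML),

    %% Step 3: create an index that uses that schema.
    ok = riakc_pb_socket:create_search_index(Pid, <<"my_index">>, <<"my_schema">>, []),

    %% Sanity check: the index should now be visible.
    {ok, _IndexProps} = riakc_pb_socket:get_search_index(Pid, <<"my_index">>).

As the follow-up above suggests, these calls alone did not make Solr pick up the modified schema; the linked yokozuna issue has more detail.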

Re: Cleaning up bucket after basho_bench run

2014-03-22 Thread István
Makes sense, thank you!

On Sat, Mar 22, 2014 at 4:57 PM, Matthew Von-Maszewski wrote:
> There is no manual mechanism (or secret Erlang command) to trigger leveldb
> compactions. The only activity that triggers compactions in 1.4 is incoming
> writes. Write more to compact more to free up disk

Re: Cleaning up bucket after basho_bench run

2014-03-22 Thread Matthew Von-Maszewski
There is no manual mechanism (or secret Erlang command) to trigger leveldb compactions. The only activity that triggers compactions in 1.4 is incoming writes. Write more to compact more to free up disk space. Not logical, but the truth.

Matthew

On Mar 22, 2014, at 7:48 PM, István wrote:

Re: Cleaning up bucket after basho_bench run

2014-03-22 Thread István
Matthew,

Thanks for the details about LevelDB. Is there a way to trigger compaction from Erlang, or any other way to get rid of tombstones faster with 1.4? If there is no such thing, I guess waiting is my only option. Thanks to everybody helping with this issue.

Regards,
Istvan

On Sat, Mar 22, 2

Re: Riak 2.0 Yokozuna performance

2014-03-22 Thread Ryan Zezeski
On Wed, Mar 19, 2014 at 4:15 AM, Andrey Anpilogov wrote:
> Hi,
>
> I've been playing with Riak 2.0 and found some strange performance drop
> with new Search system.
> Run two Riak nodes on E3 machines with 32GB RAM and SSD drives.
> 1) Use leveldb as backend.
> 2) Enabled Search engine
> 3) Join
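For anyone reproducing the setup in points 1) and 2), the corresponding riak.conf settings are sketched below; the option names follow the standard Riak 2.0 configuration, and the values are assumptions matching the description above, not copied from the thread.

    ## 1) leveldb as the storage backend
    storage_backend = leveldb

    ## 2) enable the new Search system (Yokozuna / Solr)
    search = on

    ## Solr runs in its own JVM and listens on a separate port (default 8093)
    search.solr.port = 8093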

Re: Search Index Not Found

2014-03-22 Thread Buri Arslon
Thanks, Ryan! Yes, it was because of an error in my solr schema. After correcting the error, it is working.

On Sat, Mar 22, 2014 at 2:06 PM, Ryan Zezeski wrote:
>
> On Sat, Mar 22, 2014 at 2:57 PM, Buri Arslon wrote:
>
>> Another weird thing I noticed is that after I restart riak,
>> get_search_inde

Re: Search Index Not Found

2014-03-22 Thread Ryan Zezeski
On Sat, Mar 22, 2014 at 2:57 PM, Buri Arslon wrote:
> Another weird thing I noticed is that after I restart riak,
> get_search_index returns {ok, Index}, but after a few seconds, it's going
> back to {error, "notfound"}
>

Do you see any errors related to that index in the solr.log file?

Re: Search Index Not Found

2014-03-22 Thread Buri Arslon
Another weird thing I noticed is that after I restart riak, get_search_index returns {ok, Index}, but after a few seconds, it's going back to {error, "notfound"}.

On Sat, Mar 22, 2014 at 12:39 PM, Buri Arslon wrote:
> Hi guys!
>
> I added a custom schema and search index via the Erlang client.
> ri

Search Index Not Found

2014-03-22 Thread Buri Arslon
Hi guys!

I added a custom schema and search index via the Erlang client. riakc_pb_socket:list_search_indexes is giving this result:

{ok,[[{index,<<"business">>},
      {schema,<<"business">>},
      {n_val,3}]]}

But "get_search_index" is returning an error:

{error,<<"notfound">>}

What am I doing
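For context, the two calls being compared look roughly like this; a sketch assuming a local node on the default PB port, with only the <<"business">> index name taken from the output above.

    %% Connect (node address assumed).
    {ok, Pid} = riakc_pb_socket:start_link("127.0.0.1", 8087),

    %% Lists all indexes known to the cluster; this is the call that
    %% returned the "business" index above.
    {ok, Indexes} = riakc_pb_socket:list_search_indexes(Pid),

    %% Looks up one index by name; returns {error, <<"notfound">>} when the
    %% index cannot be found, which is the error reported in this thread.
    case riakc_pb_socket:get_search_index(Pid, <<"business">>) of
        {ok, IndexProps}        -> IndexProps;
        {error, <<"notfound">>} -> not_found
    end.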

Re: Riak 2.0 Solr schema directory

2014-03-22 Thread Buri Arslon
Thanks!

On Sat, Mar 22, 2014 at 10:25 AM, Eric Redmond wrote:
> Yes
> On Mar 21, 2014 10:24 PM, "Buri Arslon" wrote:
>
>> is riakc_pb_socket:create_search_schema/3/4 erlang way of adding custom
>> schemas to Riak?
>>
>>
>> On Fri, Mar 21, 2014 at 11:05 PM, Buri Arslon wrote:
>>
>>> Thanks, Lu

Re: Riak 2.0 Solr schema directory

2014-03-22 Thread Eric Redmond
Yes

On Mar 21, 2014 10:24 PM, "Buri Arslon" wrote:
> is riakc_pb_socket:create_search_schema/3/4 erlang way of adding custom
> schemas to Riak?
>
>
> On Fri, Mar 21, 2014 at 11:05 PM, Buri Arslon wrote:
>
>> Thanks, Luke!
>>
>>
>> On Fri, Mar 21, 2014 at 6:04 PM, Luke Bakken wrote:
>>
>>> Hi Bu

Re: Cleaning up bucket after basho_bench run

2014-03-22 Thread Matthew Von-Maszewski
Leveldb, as written by Google, does not actively clean up delete "tombstones" or prior data records with the same key. The old data and tombstones stay on disk until they happen to participate in compaction at the highest "level". The cleanup can therefore happen days, weeks, or even months la
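For context, the kind of cleanup being discussed looks roughly like this with the Erlang PB client; the node address and the <<"test">> bucket name are placeholders for whatever the basho_bench config actually wrote to.

    %% Connect (node address assumed) and delete everything in the bucket.
    {ok, Pid} = riakc_pb_socket:start_link("127.0.0.1", 8087),
    {ok, Keys} = riakc_pb_socket:list_keys(Pid, <<"test">>),   %% expensive on large buckets
    lists:foreach(fun(Key) ->
                      ok = riakc_pb_socket:delete(Pid, <<"test">>, Key)
                  end, Keys).

    %% Each delete only writes a leveldb tombstone. As described above, the
    %% disk space comes back once those tombstones reach a compaction at the
    %% highest level, which in 1.4 only happens as a side effect of more writes.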