duplicate keys in Riak secondary index

2014-08-22 Thread Chaim Peck
I am looking for some clues as to why there might be duplicate keys in a Riak 
Secondary Index. I am using version 1.4.0.

Thanks,
Chaim


Re: duplicate keys in Riak secondary index

2014-08-22 Thread Alex De la rosa
Might be siblings?

Thanks,
Alex




Re: duplicate keys in Riak secondary index

2014-08-22 Thread Kelly McLaughlin
Have you changed the n_val property of the bucket in question? Lowering 
the n_val can cause secondary index queries to return duplicate results.
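
For what it's worth, a minimal sketch of checking (and restoring) a bucket's 
n_val with the Python client; the connection settings, bucket name, and 
default n_val of 3 below are assumptions:

import riak

# Assumed local node; adjust host/port/protocol for your cluster
client = riak.RiakClient(protocol='pbc', host='127.0.0.1', pb_port=8087)
bucket = client.bucket('my_bucket')  # hypothetical bucket name

# Inspect the current replication factor
print(bucket.get_property('n_val'))

# Put it back to the default of 3 if it was lowered
bucket.set_property('n_val', 3)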


Kelly






Re: duplicate keys in Riak secondary index

2014-08-22 Thread Chaim Peck
Could you please explain how that might be?

Just to give some more information: at this point, I am simply trying to purge 
the bucket and start fresh.

I am using the Python client, basically like this:

# Stream every key in the bucket via the special $bucket index,
# then delete each one.
for keys in streaming_bucket.stream_index('$bucket', bucket_name):
    for key in keys:
        delete_bucket.delete(key)

I have been running this over and over, but the objects still persist in the 
bucket even hours after running it.

Last night I set delete_mode to immediate, so these are probably not 
tombstones…
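
For reference, that setting went in the riak_kv section of app.config, roughly 
like this (only the relevant line shown):

{riak_kv, [
    %% reap tombstones as soon as the delete is acknowledged
    {delete_mode, immediate}
]}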

I noticed that if I set the bucket's n_val to 1, I get an HTTP 400 error for 
each delete operation, whereas if I leave it at the default, the deletes report 
success. In either case, the keys do not seem to be removed from the index.
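
For what it's worth, my guess is the 400s come from the default delete quorums 
exceeding an n_val of 1; a sketch (untested, the quorum values are assumptions) 
of passing explicit values on the delete:

# With n_val=1 the default w/dw quorums can exceed N, which Riak rejects
# with a 400; explicit values avoid that (1 here is an assumption).
delete_bucket.delete(key, w=1, dw=1)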

So, in addition to there being duplicates, it seems that I cannot delete items.

---

I should note that last night I tried deleting keys from a specific index (not 
the general $bucket index), and that appeared to work.
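
In case it helps, that looked roughly like this (the index name and value here 
are placeholders, not the real ones):

# 'status_bin' / 'active' are placeholders; substitute the real index and term
for keys in streaming_bucket.stream_index('status_bin', 'active'):
    for key in keys:
        delete_bucket.delete(key)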

If anybody has tips on how to effectively purge the bucket and start over, that 
would be greatly appreciated. (I cannot delete the data directly in the file 
system because the same node holds other buckets that cannot be deleted.)

Thanks,
Jeff



