My last job was in telecom, so I think "standard" is an antonym for "innovation."

Choice can be confusing, but performance is often more valuable than interoperability here. If you're in a Java-only shop, Java's native hash is going to be faster than anything else: Strings are immutable, so the hash is computed once and memoized. If you have to interoperate, you can start making choices.
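A quick sketch of what that memoization buys you (illustrative only, not from any client's code): `String.hashCode()` computes the hash on the first call and returns the cached value afterwards, which is only safe because the String can never change.

```java
// Sketch: Java caches String.hashCode() internally. Because Strings are
// immutable, the cached hash can never go stale, so repeated lookups on
// the same key string pay the hashing cost only once.
public class HashDemo {
    public static void main(String[] args) {
        String key = "user:1234"; // hypothetical cache key

        int first = key.hashCode();  // computed and cached on first call
        int second = key.hashCode(); // returns the cached value

        System.out.println(first == second); // always the same value
    }
}
```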

It's way too early to be talking about standardization, anyway. There haven't been enough studies on the performance and appropriateness of different hash algorithms. When people are interested in things, the last thing you want to do is standardize them. They'll become standardized on their own once people are no longer interested.


(Just tossing out an idea, I've already discredited myself as useless here!)

Would something along the lines of a "minimal recommended feature set" be acceptable?

I can imagine not having at least a common default being insanity-inducing. IMHO it's probably okay to request that clients at least implement CRC32 over such and such details, as long as they default to it. Then you're still free to implement something with higher performance characteristics.
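As a sketch of what such a common default might look like (names and the exact input to the checksum are made up here for illustration — the post deliberately leaves the details open), a client could map a key to a server slot with CRC32 modulo the server count:

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

// Hypothetical sketch of a "common default": CRC32 of the key, modulo the
// number of servers. This is an assumption for illustration, not a spec.
public class Crc32Default {
    static int serverFor(String key, int serverCount) {
        CRC32 crc = new CRC32();
        crc.update(key.getBytes(StandardCharsets.UTF_8));
        // getValue() is the unsigned 32-bit checksum as a long, so the
        // modulo result is always non-negative.
        return (int) (crc.getValue() % serverCount);
    }

    public static void main(String[] args) {
        System.out.println(serverFor("user:1234", 3));
    }
}
```

A faster client could swap in its own hash, but every client defaulting to the same scheme means two implementations agree on key placement out of the box.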

On your last bit ... uhhhh, no strong opinion. I think the idea here is to do "something" which makes people feel comfortable enough to actually *use* consistent hashing in some form. Presently larger sites get bitten by the "I just added another memcached instance and my shit got all slow!" problem more and more often. Let's help spare sysadmins from having to do memcached maintenance at 4am? :)
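The "added an instance and it got slow" effect is easy to demonstrate. Below is an illustrative sketch (using `String.hashCode()` rather than any real client's hash, with made-up server names): with plain modulo placement, adding one server to ten remaps roughly 10/11 of all keys (a near-total cache wipe), while a consistent-hash ring only remaps about 1/11.

```java
import java.util.*;

// Illustrative comparison: how many keys change servers when a server is
// added, under plain modulo placement vs. a consistent-hash ring.
public class RemapDemo {
    // Plain modulo placement: key -> server index.
    static int modServer(String key, int n) {
        return Math.floorMod(key.hashCode(), n);
    }

    // Build a ring: each server gets 100 virtual points on the circle.
    static NavigableMap<Integer, String> buildRing(int serverCount) {
        TreeMap<Integer, String> ring = new TreeMap<>();
        for (int s = 0; s < serverCount; s++)
            for (int v = 0; v < 100; v++)
                ring.put(("server" + s + "#" + v).hashCode(), "server" + s);
        return ring;
    }

    // Walk clockwise to the first server point at or after the key's hash,
    // wrapping around to the start of the ring if necessary.
    static String ringServer(NavigableMap<Integer, String> ring, String key) {
        Map.Entry<Integer, String> e = ring.ceilingEntry(key.hashCode());
        return (e != null ? e : ring.firstEntry()).getValue();
    }

    public static void main(String[] args) {
        int total = 10000, modMoved = 0, ringMoved = 0;
        NavigableMap<Integer, String> before = buildRing(10);
        NavigableMap<Integer, String> after = buildRing(11);
        for (int i = 0; i < total; i++) {
            String key = "key" + i;
            if (modServer(key, 10) != modServer(key, 11)) modMoved++;
            if (!ringServer(before, key).equals(ringServer(after, key))) ringMoved++;
        }
        // Modulo remaps roughly 10/11 of keys; the ring only ~1/11.
        System.out.println("modulo moved: " + modMoved + "/" + total);
        System.out.println("ring moved:   " + ringMoved + "/" + total);
    }
}
```

The keys that do move on the ring are exactly the ones the new server should take over; everything else stays warm in its original cache.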

-Dormando
