Hi,
I'm seeing some rather strange behavior from memcache: I appear to be
getting different data back for the same key. When I put load on our
application, even simple memcache queries start returning inconsistent
data, and running the same request from multiple threads gives
different results.
I've made a very simple example that runs fine on 1-200 threads, but
if I put load on the app (with some heavier requests) just before I
run my test, I see different values coming back from memcache for the
same keys.

import uuid

from google.appengine.api import memcache


def get_new_memcache_value(key, old_value):
    old_val = memcache.get(key)
    new_val = uuid.uuid4().get_hex()
    reply = 'good'
    if old_val and old_value != "":
        if old_val != old_value:
            # The stored value no longer matches what this caller last wrote.
            reply = 'fail'
            new_val = old_value
        else:
            if not memcache.set(key, new_val):
                reply = 'set_fail'
    else:
        reply = 'new'
        if not memcache.set(key, new_val):
            reply = 'set_fail'
    return (new_val, reply)

and from a server posting requests:

def request_loop(id):
    key = "test:key_%d" % id
    val, reply = get_new_memcache_value(key, "")
    for i in range(20):
        val, reply = get_new_memcache_value(key, val)
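
One thing worth noting about the pattern above: the get followed by set is
not atomic, so two concurrent requests can interleave and one write can be
silently lost even if memcache itself is perfectly consistent. A minimal
self-contained sketch of that interleaving, using a plain Python dict as a
stand-in for memcache (no App Engine APIs, and the two "requests" written
out by hand instead of real threads):

```python
import uuid

# Stand-in for memcache: a plain dict shared by the simulated requests.
cache = {}

def mc_get(key):
    return cache.get(key)

def mc_set(key, value):
    cache[key] = value
    return True

# Two simulated requests, A and B, each doing get -> compute -> set,
# interleaved so that both read the same old value before either writes.
key = 'test:key_1'
mc_set(key, 'v0')

a_old = mc_get(key)        # A reads 'v0'
b_old = mc_get(key)        # B also reads 'v0', before A has written
a_new = uuid.uuid4().hex   # A computes its new value
b_new = uuid.uuid4().hex   # B computes its new value
mc_set(key, a_new)         # A writes its value
mc_set(key, b_new)         # B writes, silently overwriting A's update

# From A's point of view the value has "changed under it": the next
# mc_get returns something A never wrote, so A's compare step fails.
print(mc_get(key) == a_new)
```

Depending on SDK version, the memcache Client class also exposes gets()
and cas() for an atomic compare-and-set, which avoids this race without
the manual old/new comparison.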

Does memcache run locally on a cluster of servers, so that if an
application is spread over multiple clusters, memcache will not
propagate data to the other clusters?

I hope someone can clarify this, since I can't find any post regarding
this issue.

Is there some way to get the application instance ID, so I can
investigate this further?

Thanks
Kim
