Oh, I see what you're getting at: you're asking me if B will still be in the
cache if A is still in the cache. That depends on whether or not the keys
hash to the same Memcache instance.

FYI - in general, we don't make any guarantees about this behavior, so relying
on it could become problematic down the line if it changes.
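
For illustration, here's a minimal sketch (the helper name and key list are
just placeholders) of checking for eviction explicitly rather than assuming
that "b" is cached whenever "a" is:

  from google.appengine.api import memcache

  def read_counters(keys):
      # Fetch all counter keys in one round trip. Any key missing from the
      # result may have been evicted (or never incremented), so don't silently
      # treat its absence as zero.
      values = memcache.get_multi(keys)
      missing = [k for k in keys if k not in values]
      return values, missing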


Ikai Lan
Developer Programs Engineer, Google App Engine
Blog: http://googleappengine.blogspot.com
Twitter: http://twitter.com/app_engine
Reddit: http://www.reddit.com/r/appengine



On Fri, Jul 22, 2011 at 1:52 PM, Ikai Lan (Google) <ika...@google.com> wrote:

> Memcache works like an LRU cache, but I don't see why "a" would force out
> "b" unless you ran out of space.
>
> Also, App Engine's Memcache has two LRU structures: an app-specific LRU and a
> global LRU for that Memcache instance.
>
> Ikai Lan
> Developer Programs Engineer, Google App Engine
> Blog: http://googleappengine.blogspot.com
> Twitter: http://twitter.com/app_engine
> Reddit: http://www.reddit.com/r/appengine
>
>
>
> On Fri, Jul 22, 2011 at 1:05 PM, Andrin von Rechenberg <andri...@gmail.com>
> wrote:
>
>> Hey there
>>
>> I'm building something like "Google Analytics" for App Engine, but in real
>> time (including qps, hourly & daily graphs, backend counters, monitors with
>> alerts, etc.).
>> The cool thing is that it only uses memcache to increment counters/stats, so
>> it's really quick to use in prod code. Every minute I gather all counters and
>> write them to datastore.
>> It seems to work perfectly for my app (~250 qps, with about 1000 different
>> counters and about 1000 counter increments per second).
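>>
>> Roughly, the write-out side looks like this (just a simplified sketch; the
>> model and counter names below are made up):
>>
>>   from google.appengine.api import memcache
>>   from google.appengine.ext import db
>>
>>   COUNTER_NAMES = ["requests", "errors"]  # illustrative counter names
>>
>>   class CounterSnapshot(db.Model):  # hypothetical datastore model
>>       name = db.StringProperty()
>>       value = db.IntegerProperty()
>>
>>   def flush_counters():
>>       # Runs once a minute (e.g. from cron): read every counter in one
>>       # round trip and persist whatever is still in memcache.
>>       values = memcache.get_multi(COUNTER_NAMES)
>>       for name, value in values.items():
>>           CounterSnapshot(name=name, value=value).put()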
>> I can also measure how correct my data is (i.e. whether anything got evicted
>> from memcache, but so far that has never happened), but it's all based on one
>> assumption:
>>
>> If I call:
>>
>> memcache.incr("a", intitail_value=0)
>> ...
>> memcache.incr("b", initial_value=0)
>> ....
>> memcache.incr("b", initial_value=0)
>> ....
>>
>> *if "a" is still in the memcache "b" will also be in the memcache and
>> wont have been flushed, correct?*
>> *
>> *
>> Or in other words: if two items in the memcache are the same size, does the
>> memcache work like either an LRU or a FIFO cache?
>>
>> Any response is greatly appreciated...
>>
>> -Andrin
>>
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com.
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en.
