You *can* use cache.ram, but each process will have its own cache. That 
shouldn't affect expiration at all, though, as with cache.ram, the 
expiration is determined at *retrieval* time, not at the time of the 
initial write, and cache keys are not automatically purged after some 
period. Of course, in one request, you might put something in the cache of 
one process, and the very next request might hit a different process and 
therefore not have access to that cached item (so it will then be written 
to the cache of the second process).
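To illustrate what "expiration at retrieval time" means, here is a rough sketch of the semantics in plain Python (this is my own simplified illustration, not web2py's actual implementation — the real cache.ram also handles locking and cache statistics). The key point is that time_expire is compared against the stored timestamp on each lookup, rather than a timer purging the entry:

```python
import time

class RamCacheSketch:
    """Simplified illustration of cache.ram's retrieval-time expiration.
    Each process gets its own instance, hence its own storage dict."""

    def __init__(self):
        self.storage = {}  # key -> (stored_at, value)

    def __call__(self, key, f, time_expire=300):
        now = time.time()
        item = self.storage.get(key)
        if item is not None and now - item[0] <= time_expire:
            # Entry is still fresh *for this retrieval*; nothing ever
            # deletes it in the background.
            return item[1]
        # Missing or too old for the time_expire passed in this call:
        # recompute and store with a fresh timestamp.
        value = f()
        self.storage[key] = (now, value)
        return value

cache_ram = RamCacheSketch()
# First call computes and stores; later calls within time_expire reuse it.
result = cache_ram('expensive_key', lambda: 42, time_expire=60)
```

Note that because expiration is evaluated per call, two callers can pass different time_expire values for the same key and get different freshness behavior — the write itself carries no expiry.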

Anthony

On Thursday, September 13, 2018 at 11:21:35 AM UTC-4, 
tim.nyb...@conted.ox.ac.uk wrote:
>
> Is it possible for cache.ram to work if running on nginx/uwsgi, 
> particularly with multiple processes?
>
> I don't really care if each of my processes (4) winds up with its own 
> cache in memory, but they seem to expire very quickly.  
>
> Moving to cache.disk or Redis may be the way to go, but if simple ram 
> caching can work, I'd sooner go that route.
>

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to web2py+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.