On Thu, 15 Oct 2020 22:31:25 -0600
Carl Meyer <c...@oddbird.net> wrote:
> On Thu, Oct 15, 2020 at 3:33 PM David Mertz <me...@gnosis.cx> wrote:
> > So the use case needs to be:
> >
> > * Function operates on large objects
> > * Function operates on large, immutable objects
> > * Function never takes literal or computed arguments (i.e. not 
> > `fun(big1+big2)`)
> > * Large immutable objects are deleted selectively (OK, this is plausible)
> > * The performance hit of clearing the entire cache is not acceptable
> 
> One real-world use case I've seen personally that meets all these
> criteria is per-request memoization of expensive work in a server
> application (using an `lru_cache`-like decorator on functions taking
> the request as an argument.) The request object plus things hanging
> off it can be quite large, it's often effectively immutable (at least
> inasmuch as the functions so decorated care), it's desirable for all
> cached data for a request to expire as soon as handling for the
> request is finished, but you wouldn't want to fully clear the caches
> (e.g. the application may handle multiple requests concurrently, so
> other requests may still be in-flight.)
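
In code, the pattern Carl describes looks roughly like this (a sketch
only; the names are hypothetical, and `lru_cache` also needs the request
object to be hashable):

  from functools import lru_cache

  @lru_cache(maxsize=None)
  def expensive_work(request):
      # Placeholder body; the real function would derive something
      # costly from the (effectively immutable) request.
      return request.method, request.path

  # Once a request is finished, the only built-in option is
  # expensive_work.cache_clear(), which also throws away the entries
  # of requests that are still in flight.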

A common recipe for that is simply to set a custom attribute on the
request:

  request._myapp_cached_data = ...

It's probably much faster than going through a dict-based cache, as well.
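
For example (a rough sketch, not from any particular codebase;
`compute_summary()` stands in for the real expensive work):

  _MISSING = object()

  def summary_for(request):
      # Stash the result directly on the request object; it is freed
      # together with the request, so no explicit invalidation or
      # cache clearing is needed.
      cached = getattr(request, "_myapp_summary", _MISSING)
      if cached is _MISSING:
          cached = compute_summary(request)  # hypothetical expensive helper
          request._myapp_summary = cached
      return cached

The cached value simply goes away when the request object does, and
concurrent requests never share cache state.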

Regards

Antoine.
