On Thu, Oct 15, 2020 at 3:33 PM David Mertz <me...@gnosis.cx> wrote:
> So the use case needs to be:
>
> * Function operates on large objects
> * Function operates on large, immutable objects
> * Function never takes literal or computed arguments (i.e. not 
> `fun(big1+big2)`)
> * Large immutable objects are deleted selectively (OK, this is plausible)
> * The performance hit of clearing entire cache is not suitable

One real-world use case I've seen personally that meets all these
criteria is per-request memoization of expensive work in a server
application (using an `lru_cache`-like decorator on functions taking
the request as an argument). The request object plus things hanging
off it can be quite large; it's often effectively immutable (at least
inasmuch as the decorated functions care); and it's desirable for all
cached data for a request to expire as soon as handling of the
request is finished. But you wouldn't want to fully clear the caches:
the application may handle multiple requests concurrently, so other
requests may still be in-flight.
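For concreteness, here's a minimal sketch of what such a decorator might look like (not the actual code from that application; the decorator name and structure are my own illustration). It keys a per-request cache off the request object in a `weakref.WeakKeyDictionary`, so each request's cached entries disappear automatically when the request object is garbage-collected, without touching the caches of other in-flight requests:

```python
import functools
import weakref


def per_request_cache(func):
    """Memoize func per request (hypothetical sketch).

    Entries are stored in a dict keyed weakly by the request object,
    so a request's cached results expire when the request itself is
    collected -- no global cache_clear() needed.
    """
    # request -> {args: result}
    caches = weakref.WeakKeyDictionary()

    @functools.wraps(func)
    def wrapper(request, *args):
        cache = caches.setdefault(request, {})
        if args not in cache:
            cache[args] = func(request, *args)
        return cache[args]

    return wrapper
```

This assumes the request object is weakly referenceable and hashable by identity, and that the remaining arguments are hashable, which held in the case I saw.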

> Given that it's not really hard to write your own caching decorator, I don't 
> feel like this needs to change the API of lru_cache.

But I don't disagree with this conclusion. There are often other
specific needs that would mandate a custom decorator anyway.

Carl
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/Z2P3TMNWVB6X2WSX3OBNL34T3WSCIV7P/
Code of Conduct: http://python.org/psf/codeofconduct/
