On Thu, Oct 15, 2020 at 2:16 PM Ram Rachum <r...@rachum.com> wrote:

> Your use case is valid, there are times in which I'd like a strong
> reference to be kept. But there are also lots of times I want it to not be
> kept. Then I would offer this feature as a flag `weakref=True` with a
> default of `False`. What do you think?
>

I understand your use case.  But it feels uncommon and niche to me,
especially since `f.cache_clear()` is already available if you want it.
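
For concreteness, the existing escape hatch works roughly like this (a
minimal sketch; `compute` is a made-up stand-in for an expensive function):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def compute(key):
    # Stand-in for an expensive computation.
    return key * 2

compute(10)
assert compute.cache_info().currsize == 1  # the cache now holds a reference

# cache_clear() drops every strong reference the cache holds, in one shot.
compute.cache_clear()
assert compute.cache_info().currsize == 0
```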

Most of the time, if I have large objects, they are mutable objects.  And
if I am passing mutable objects into a function, it feels unlikely I want
to cache the result of the computation (which, after all, might depend on
the data inside the mutable object).  If the cache holds an extra
reference to a 4-tuple of scalars or a dataclass instance, that seems
unlikely to make a measurable difference.

So the use case needs to be:

* Function operates on large objects
* Function operates on large, immutable objects
* Function never takes literal or computed arguments (e.g. it is never
called as `fun(big1+big2)`)
* Large immutable objects are deleted selectively (OK, this is plausible)
* The performance hit of clearing entire cache is not suitable

Given that it's not really hard to write your own caching decorator, I
don't feel this needs to change the API of `lru_cache`.
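
For what it's worth, such a decorator is maybe a dozen lines.  A sketch
(not the stdlib `lru_cache`; single-argument only, and `weak_cache` is a
name I'm making up) that keys entries on weak references, so the cache
never keeps its arguments alive:

```python
import weakref
from functools import wraps

def weak_cache(func):
    """Cache a single-argument function without keeping its argument alive.

    Hypothetical sketch: entries are keyed on weak references to the
    argument and evicted automatically when the argument is collected.
    """
    cache = {}

    @wraps(func)
    def wrapper(obj):
        try:
            key = weakref.ref(obj)
        except TypeError:
            # Not weak-referenceable (ints, tuples, ...): just compute.
            return func(obj)
        if key in cache:          # weakrefs hash/compare via their referent
            return cache[key]
        result = func(obj)

        def evict(dead_ref, _cache=cache):
            # Called by the weakref machinery when the argument is collected.
            _cache.pop(dead_ref, None)

        cache[weakref.ref(obj, evict)] = result
        return result

    wrapper._cache = cache        # exposed only so the sketch is inspectable
    return wrapper
```

With this, deleting the large argument drops the cached entry too, with no
need to clear the whole cache.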

-- 
The dead increasingly dominate and strangle both the living and the
not-yet born.  Vampiric capital and undead corporate persons abuse
the lives and control the thoughts of homo faber. Ideas, once born,
become abortifacients against new conceptions.
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/OC6N4IUX7ZDIDCOW5SGUVWWB7ZB44LR5/
Code of Conduct: http://python.org/psf/codeofconduct/
