> I propose a method:
> ...
> returns a dictionary {arg: value} representing the cache. 
> It wouldn't be the cache itself, just a shallow copy 
> of the cache data

I recommend against going down this path.

It exposes (and potentially locks in) implementation details such as
how we distinguish positional arguments, keyword arguments, and
type information (something that has changed more than once).

Also, a shallow copy still leaves plenty of room for meddling
with the contents of the keys and values, potentially breaking
the integrity of the cache.

Another concern is that we've worked hard to remove potential
deadlocks from the lru_cache.  Holding a lock while copying the
whole cache complicates those efforts and risks breaking them as
users exploit the new feature in unpredictable ways.

FWIW, OrderedDict provides methods that make it easy to roll
your own variants of the lru_cache().  It would be better to do
that than to complexify the base implementation in ways that I
think we would regret.
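For example, a minimal LRU-cache decorator can be built on OrderedDict's move_to_end() and popitem() methods, giving you full access to the cache contents without touching functools. This is an illustrative sketch (the name simple_lru and the exposed .cache attribute are made up here, not stdlib):

```python
from collections import OrderedDict
from functools import wraps

def simple_lru(maxsize=128):
    """A roll-your-own LRU cache built on OrderedDict (illustrative only)."""
    def decorator(func):
        cache = OrderedDict()

        @wraps(func)
        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            if key in cache:
                cache.move_to_end(key)     # mark as most recently used
                return cache[key]
            result = func(*args, **kwargs)
            cache[key] = result
            if len(cache) > maxsize:
                cache.popitem(last=False)  # evict the least recently used
            return result

        wrapper.cache = cache  # expose the mapping, since it's all yours
        return wrapper
    return decorator
```

Because you own the OrderedDict, you can inspect, copy, or prune it however you like, without imposing those semantics on everyone using the stdlib version.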

Raymond
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/OKQO6GFE4JTEAJR4S454KMMKN6C6CNUZ/
Code of Conduct: http://python.org/psf/codeofconduct/
