New submission from Raymond Hettinger <[email protected]>:
Recent discussions on python-ideas showed that people have a hard time
finding the infinity-cache option for lru_cache(). Also, in the context of
straight caching without limits, the name *lru_cache()* makes the tool seem
complex and heavy when in fact it is simple, lightweight, and fast (doing no
more than a simple dictionary lookup).
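For reference, the spelling that users currently have to find is the
maxsize=None option of lru_cache(); a minimal sketch of that existing usage
(the fib() function here is just an illustration, not part of the proposal):

    from functools import lru_cache

    @lru_cache(maxsize=None)      # the hard-to-discover "unbounded" spelling
    def fib(n):
        'Unbounded memoization with the current API.'
        return n if n < 2 else fib(n - 1) + fib(n - 2)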
We could easily solve both problems with a helper function:
def cache(func):
    'Simple unbounded cache.  Sometimes called "memoize".'
    return lru_cache(maxsize=None, typed=False)(func)
It would be used like this:
@cache
def configure_server():
    ...
    return server_instance
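Because the helper just forwards to lru_cache(), the wrapped function would
keep the usual cache_info() and cache_clear() introspection. A small
self-contained sketch of what that could look like (fib() is again only an
illustration):

    from functools import lru_cache

    def cache(func):
        'Simple unbounded cache.  Sometimes called "memoize".'
        return lru_cache(maxsize=None, typed=False)(func)

    @cache
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    fib(10)
    print(fib.cache_info())   # hits/misses reported just like lru_cache()
    fib.cache_clear()         # cache management also carries over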
There was some discussion about a completely new decorator with different
semantics (holding a lock across a call to an arbitrary user function and being
limited to zero-argument functions). In all the examples that were presented,
this @cache decorator would suffice. None of the examples presented actually
needed locking behavior.
----------
components: Library (Lib)
messages: 368469
nosy: rhettinger
priority: normal
severity: normal
status: open
title: Make lru_cache(maxsize=None) more discoverable
type: enhancement
versions: Python 3.9
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue40571>
_______________________________________