Hey Raymond,
Thanks for your input here! Given that, a new function wouldn’t be worth adding 
purely for performance reasons, but I think there is still an issue around 
semantics and locking.

Should we encourage/document `lru_cache` as the way to do `call_once`? If so, 
then I guess it’s suitable, but people have brought up that it might be hard to 
discover and that it doesn’t actually guarantee the function is only called 
once: `lru_cache` doesn’t hold a lock while calling the wrapped function, so 
two threads that miss the cache at the same time can both run it.
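
For example, something along these lines shows the window (a toy repro; the 
sleep and the counter are only there to make the race easy to hit, and the 
names are just illustrative):

```
import threading
import time
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def initialize():
    global calls
    calls += 1
    time.sleep(0.1)  # simulate a slow, one-time setup
    return object()

threads = [threading.Thread(target=initialize) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# lru_cache doesn't hold a lock while calling the wrapped function, so
# several threads can miss the cache together; calls is usually > 1 here.
print(calls)
```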

The reason I bring this up is that I’ve seen several ad-hoc `call_once` 
implementations recently, and writing one correctly (especially a thread-safe 
one) is surprisingly tricky for someone who’s not that experienced with Python; 
the sketch below is roughly what it takes.
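
Here’s roughly what those hand-rolled versions look like (a 
double-checked-locking sketch, illustrative only rather than a proposed 
implementation):

```
import functools
import threading

def call_once(func):
    lock = threading.Lock()
    sentinel = object()  # distinguishes "not called yet" from a None result
    result = sentinel

    @functools.wraps(func)
    def inner():
        nonlocal result
        if result is sentinel:          # fast path once the value is cached
            with lock:
                if result is sentinel:  # re-check under the lock
                    result = func()
        return result

    return inner
```

Getting the sentinel, the `nonlocal` and the re-check under the lock all right 
is exactly the part people seem to trip over.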

So I think there’s room to improve the discoverability of `lru_cache` as an 
“almost” `call_once` alternative, or room for a dedicated decorator that might 
reuse bits of the `lru_cache` implementation.

Tom

> On 28 Apr 2020, at 20:51, Raymond Hettinger <raymond.hettin...@gmail.com> 
> wrote:
> 
> 
>> t...@tomforb.es wrote:
>> 
>> I would like to suggest adding a simple “once” method to functools. As the 
>> name suggests, this would be a decorator that would call the decorated 
>> function, cache the result and return it with subsequent calls.
> 
> It seems like you would get just about everything you want with one line:
> 
>    call_once = lru_cache(maxsize=None)
> 
> which would be used like this:
> 
>   @call_once
>   def welcome():
>       len('hello')
> 
>> Using lru_cache like this works but it’s not as efficient as it could be - 
>> in every case you’re adding lru_cache overhead despite not requiring it.
> 
> 
> You're likely imagining more overhead than there actually is.  Used as shown 
> above, the lru_cache() is astonishingly small and efficient.  Access time is 
> slightly cheaper than writing d[()]  where d={(): some_constant}. The 
> infinite_lru_cache_wrapper() just makes a single dict lookup and returns the 
> value.¹ The lru_cache_make_key() function just increments the empty args 
> tuple and returns it.²   And because it is a C object, calling it will be 
> faster than for a Python function that just returns a constant, "lambda: 
> some_constant()".  This is very, very fast.
> 
> 
> Raymond
> 
> 
> ¹ 
> https://github.com/python/cpython/blob/master/Modules/_functoolsmodule.c#L870
> ² 
> https://github.com/python/cpython/blob/master/Modules/_functoolsmodule.c#L809
> 
>> 
>> Hello,
>> After a great discussion in python-ideas[1][2] it was suggested that I 
>> cross-post this proposal to python-dev to gather more comments from those 
>> who don't follow python-ideas.
>> 
>> The proposal is to add a "call_once" decorator to the functools module that, 
>> as the name suggests, calls a wrapped function once, caching the result and 
>> returning it with subsequent invocations. The rationale behind this proposal 
>> is that:
>> 1. Developers are using "lru_cache" to achieve this right now, which is less 
>> efficient than it could be
>> 2. Special casing "lru_cache" to account for zero arity methods isn't 
>> trivial and we shouldn't endorse lru_cache as a way of achieving "call_once" 
>> semantics 
>> 3. Implementing a thread-safe (or even non-thread safe) "call_once" method 
>> is non-trivial
>> 4. It complements the lru_cache and cached_property methods currently 
>> present in functools.
>> 
>> The specifics of the method would be:
>> 1. The wrapped method is guaranteed to only be called once when called for 
>> the first time by concurrent threads
>> 2. Only functions with no arguments can be wrapped, otherwise an exception 
>> is thrown
>> 3. There is a C implementation to keep speed parity with lru_cache
>> 
>> I've included a naive implementation below (that doesn't meet any of the 
>> specifics listed above) to illustrate the general idea of the proposal:
>> 
>> ```
>> def call_once(func):
>>     sentinel = object()  # in case the wrapped method returns None
>>     obj = sentinel
>>     @functools.wraps(func)
>>     def inner():
>>         nonlocal obj
>>         if obj is sentinel:
>>             obj = func()
>>         return obj
>>     return inner
>> ```
>> 
>> I'd welcome any feedback on this proposal, and if the response is favourable 
>> I'd love to attempt to implement it.
>> 
>> 1. 
>> https://mail.python.org/archives/list/python-id...@python.org/thread/5OR3LJO7LOL6SC4OOGKFIVNNH4KADBPG/#5OR3LJO7LOL6SC4OOGKFIVNNH4KADBPG
>> 2. 
>> https://discuss.python.org/t/reduce-the-overhead-of-functools-lru-cache-for-functions-with-no-parameters/3956
