On Thu, Apr 30, 2020 at 3:12 PM Raymond Hettinger
<raymond.hettin...@gmail.com> wrote:
> Thanks for the concrete example.  AFAICT, it doesn't require (and probably 
> shouldn't have) a lock to be held for the duration of the call.  Would it be 
> fair to say that 100% of your needs would be met if we just added this to the 
> functools module?
>
>       call_once = lru_cache(maxsize=None)
>
> That's discoverable, already works, has no risk of deadlock, would work with 
> multiple argument functions, has instrumentation, and has the ability to 
> clear or reset.

Yep, I think that's fair. We've never, AFAIK, had a problem with
`lru_cache` races, and even if we did, in most cases we'd be fine with
the function being called twice.
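
For anyone following along, here's a minimal sketch of what that
suggestion looks like in practice (the function name is just an
illustration, not anything real):

    from functools import lru_cache

    call_once = lru_cache(maxsize=None)

    @call_once
    def load_config():
        print("loading...")        # runs only on the first call
        return {"debug": False}

    load_config()                  # prints "loading..." and caches the result
    load_config()                  # cached; the body doesn't run again
    load_config.cache_info()       # instrumentation comes for free
    load_config.cache_clear()      # ...and so does resetting

The caveat is exactly the one above: `lru_cache` doesn't hold a lock
while the wrapped function runs, so two threads racing on the first
call can each execute it once.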

I can _imagine_ a case where the call loads some massive dataset
directly into memory and we really couldn't afford to load it twice
under any circumstances, but even if we have a case like that, we
don't do enough threading for it ever to have been an actual problem
that I'm aware of.
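
If we ever did hit that, I'd expect it to look something like this
purely hypothetical sketch (`strict_call_once` isn't anything that
exists; it's only here to show why the lock would have to be held for
the duration of the call):

    import threading
    from functools import wraps

    def strict_call_once(func):
        # Hypothetical: hold the lock across the call itself so a second
        # thread blocks instead of starting a duplicate load.
        lock = threading.Lock()
        done = False
        result = None

        @wraps(func)
        def wrapper():
            nonlocal done, result
            with lock:
                if not done:
                    result = func()    # e.g. the massive dataset load
                    done = True
            return result

        return wrapper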

> I'm still looking for an example that actually requires a lock to be held for 
> a long duration.

Don't think I can provide a real-world one from my own experience! Thanks,

Carl