Serhiy Storchaka <storchaka <at> gmail.com> writes:
> 2. Most code that uses PyModule_AddObject() doesn't work as intended.
> Since the behavior of PyModule_AddObject() contradicts the documentation
> and is counterintuitive, we can't blame authors for this.
>
> I don't say this is a high-impacting bug, and I even agree that there is
> no need to fix the second part in maintained releases. But this is a bug
> unless you propose a different definition of a bug.
Why do you think that module authors don't know that? For _decimal, I was
aware of the strange behavior. Yes, a single reference can "leak" on
failure.

The problem is that we don't seem to have any common ground here. Do you
accept the following?

1) PyModule_AddObject() can only fail if malloc() fails.

   a) Normally (for small allocations) this is such a serious problem
      that the whole application fails anyway.

   b) Say that you're lucky and the application continues.

      i) The import fails. In some cases ImportError is caught and a
         fallback is imported (example: _pydecimal). In that case you
         leak an entire DSO and something small like a single context
         object. What is the practical difference between the two?

      ii) The import fails and there's no fallback. Usually the
          application stops; otherwise it is the DSO plus a small
          leak again.

      iii) The import is retried (I have never seen this):

               while 1:
                   try:
                       import leftpad
                   except (ImportError, MemoryError):
                       continue
                   break

           You could have a legitimate leak here, but see a).

Module initializations are intricate and boring. I suspect that if we
promote wide changes across PyPI packages, we'll see more additional
segfaults than theoretically plugged memory leaks.


Stefan Krah

_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
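For reference, the ImportError fallback described in case (i) above is the pattern CPython's decimal module uses: prefer the C accelerator _decimal, and fall back to the pure-Python _pydecimal if the C version cannot be imported. A minimal sketch (the specific names imported here are chosen for illustration):

```python
# Accelerator-with-fallback import, as in CPython's decimal module:
# if importing the C extension fails (missing DSO, failed module init),
# the pure-Python implementation is used instead. Any leak from a failed
# _decimal init is then dwarfed by the DSO itself staying mapped.
try:
    from _decimal import Decimal
except ImportError:
    from _pydecimal import Decimal

# Either implementation provides the same API.
print(Decimal("1.10") + Decimal("2.20"))  # prints 3.30
```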