On 8 August 2018 at 15:22, Ronald Oussoren via Python-ideas
<python-ideas@python.org> wrote:
>
>> On 08/08/18 07:14, Ken Hilton wrote:
>>> Now, let's take a look at the following scenario:
>>>
>>>     def read_multiple(*filenames):
>>>         for filename in filenames:
>>>             with open(filename) as f:
>>>                 yield f.read()
>>>
>>> Can you spot the problem? The "with open(filename)" statement is supposed
>>> to ensure that the file object is disposed of properly. However, the
>>> "yield f.read()" statement suspends execution within the with block, so
>>> if this happened:
>>>
>>>     for contents in read_multiple('chunk1', 'chunk2', 'chunk3'):
>>>         if contents == 'hello':
>>>             break
>>>
>>> and the contents of "chunk2" were "hello", then the loop would exit and
>>> "chunk2" would never be closed! Yielding inside a with block, therefore,
>>> doesn't make sense and can only lead to obscure bugs.
>
> It is also possible to fix the particular issue by using another with
> statement, that is use:
>
>     with contextlib.closing(read_multiple(…)) as chunks:
>         for contents in chunks:
>             …
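To make Ronald's suggestion concrete, here is a self-contained sketch (the temporary files and their contents are my own, chosen to mirror Ken's example). `contextlib.closing()` guarantees that `chunks.close()` runs when the block exits, which throws GeneratorExit into the suspended generator frame and so triggers the inner with block's cleanup deterministically, even when the caller breaks out early:

```python
import contextlib
import os
import tempfile

def read_multiple(*filenames):
    for filename in filenames:
        with open(filename) as f:
            yield f.read()

with tempfile.TemporaryDirectory() as tmp:
    # Set up sample files matching the scenario in the quoted message.
    paths = []
    for name, text in [("chunk1", "hi"), ("chunk2", "hello"), ("chunk3", "bye")]:
        path = os.path.join(tmp, name)
        with open(path, "w") as f:
            f.write(text)
        paths.append(path)

    # closing() calls chunks.close() on exit from the with block,
    # finalizing the generator even though the for loop breaks early.
    with contextlib.closing(read_multiple(*paths)) as chunks:
        for contents in chunks:
            if contents == "hello":
                break  # generator is still suspended inside its with block

    # The generator has now been closed: resuming it raises StopIteration.
    try:
        next(chunks)
    except StopIteration:
        print("generator closed")
```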
That's a very good point, Ronald. Having seen this come up a few times, I
can't think of any case that wouldn't be solved by this. If a language
change is wanted for this, then perhaps it should be to do as file objects
do and add an __exit__ method (which calls close()) to generator objects,
so that you can omit the closing and just do:

    with read_multiple(…) as chunks:
        for contents in chunks:
            ...

--
Oscar
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/
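The proposed behaviour can be emulated today with a small wrapper; this is only a sketch of the idea, and the wrapper name `with_closing` is my own invention, not an existing API. It gives a generator the same enter/exit protocol that file objects have, with __exit__ delegating to close():

```python
import inspect

class with_closing:
    """Hypothetical wrapper emulating the proposal: make a generator
    usable directly in a with statement by giving it an __exit__ that
    calls close(), as file objects do."""

    def __init__(self, gen):
        self._gen = gen

    def __enter__(self):
        return self._gen

    def __exit__(self, *exc_info):
        self._gen.close()  # throws GeneratorExit into the suspended frame
        return False       # do not suppress exceptions

def numbers():
    yield from (1, 2, 3)

with with_closing(numbers()) as g:
    first = next(g)  # take one value, leaving the generator suspended

# The generator was finalized on exit even though it wasn't exhausted.
print(inspect.getgeneratorstate(g))  # GEN_CLOSED
```

Under the proposal, the wrapper would be unnecessary and `with numbers() as g:` would work directly.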