Guido van Rossum <gu...@python.org> added the comment:

Serhiy, what do you mean by "otherwise we could run out of file descriptors"?
I looked a bit at the code and there are different kinds of algorithms involved 
for different forms of patterns, and the code also takes vastly different paths 
for recursive matches.

I found one bit of code that looked like it *could* be improved, with some 
effort: _glob1(). This constructs a list of all files in one directory and then 
filters them. It looks like this could be a problem if there are e.g. 100_000
files in one directory. To fix, we could implement fnmatch.ifilter() which 
would be like fnmatch.filter() but uses `yield name` instead of 
`result.append(name)`; then _glob1() could be rewritten as follows (untested):

def _glob1(dirname, pattern, dironly):
    # _iterdir() and _ishidden() are glob's existing module-private helpers.
    names = _iterdir(dirname, dironly)
    if not _ishidden(pattern):
        # A non-hidden pattern should not match hidden (dot) names.
        yield from fnmatch.ifilter((x for x in names if not _ishidden(x)), pattern)
    else:
        yield from fnmatch.ifilter(names, pattern)
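
For concreteness, a minimal sketch of what fnmatch.ifilter() could look like
(hypothetical, since fnmatch has no such function today; to stay self-contained
it goes through the public fnmatch.translate() rather than the module's private
compiled-pattern cache that filter() uses):

import os
import posixpath
import re
import fnmatch

def ifilter(names, pat):
    """Like fnmatch.filter(), but yield matching names lazily."""
    # Hypothetical sketch -- it uses the public fnmatch.translate() rather
    # than fnmatch's private _compile_pattern() cache, so the pattern is
    # recompiled on each call.
    pat = os.path.normcase(pat)
    match = re.compile(fnmatch.translate(pat)).match
    if os.path is posixpath:
        # normcase() is a no-op on POSIX; skip it inside the loop.
        for name in names:
            if match(name):
                yield name
    else:
        for name in names:
            if match(os.path.normcase(name)):
                yield name

The normcase() split mirrors the existing fnmatch.filter(), which hoists the
POSIX no-op out of the loop.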

Thoughts? Would this increase the number of open file descriptors in some edge 
case?


_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue22167>
_______________________________________