Message 370489
Serhiy, what do you mean by "otherwise we could run out of file descriptors"? I looked a bit at the code: there are different kinds of algorithms involved for different forms of patterns, and the code also takes vastly different paths for recursive matches.
I found one bit of code that looked like it *could* be improved, with some effort: _glob1(). This constructs a list of all files in one directory and then filters them; that could be a problem if there are e.g. 100_000 files in one directory. To fix it, we could implement fnmatch.ifilter(), which would be like fnmatch.filter() but use `yield name` instead of `result.append(name)`; then _glob1() could be rewritten as follows (untested):
    def _glob1(dirname, pattern, dironly):
        names = _iterdir(dirname, dironly)
        if not _ishidden(pattern):
            # Pattern doesn't start with a dot, so skip hidden names.
            yield from fnmatch.ifilter((x for x in names if not _ishidden(x)), pattern)
        else:
            yield from fnmatch.ifilter(names, pattern)
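And here's a minimal sketch of fnmatch.ifilter() itself (also untested; it uses only public fnmatch APIs, whereas a real patch would presumably reuse fnmatch's internal compiled-pattern cache rather than recompiling on every call):

    import fnmatch
    import os
    import re

    def ifilter(names, pattern):
        # Lazy counterpart of fnmatch.filter(): yield matching names one
        # at a time instead of accumulating them all in a list.
        match = re.compile(fnmatch.translate(os.path.normcase(pattern))).match
        for name in names:
            if match(os.path.normcase(name)) is not None:
                yield name

With that in place, _glob1() never holds all the names in memory at once; each name flows through the generator pipeline and is discarded as soon as it has been matched or rejected.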
Thoughts? Would this increase the number of open file descriptors in some edge case?
Date                | User       | Action | Args
2020-05-31 16:47:49 | gvanrossum | set    | recipients: + gvanrossum, roysmith, steven.daprano, r.david.murray, docs@python, Gumnos, serhiy.storchaka, Roger Erens
2020-05-31 16:47:49 | gvanrossum | set    | messageid: <1590943669.27.0.65077907206.issue22167@roundup.psfhosted.org>
2020-05-31 16:47:49 | gvanrossum | link   | issue22167 messages
2020-05-31 16:47:48 | gvanrossum | create |