New submission from Eric Frederich <[email protected]>:
Calling pathlib.Path.glob("**/*") on a directory containing a symlink which
resolves to a very long filename causes OSError.
This is completely avoidable since symlinks are not followed anyway.
In pathlib.py, the _RecursiveWildcardSelector has a method _iterate_directories
which calls entry.is_dir() before excluding entries based on entry.is_symlink().
It is the entry.is_dir() call that fails.
If the check for entry.is_symlink() happened first, this error would be
avoided.
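
A hedged sketch of the suggested reordering (the structure of
_iterate_directories is paraphrased here as a standalone illustration, not
copied from pathlib.py or offered as a patch):

import os

def iterate_directories(parent_path):
    # Illustration only: yield parent_path and every subdirectory below it,
    # checking entry.is_symlink() before entry.is_dir() so that a symlink
    # whose target name exceeds NAME_MAX is never stat()-ed.
    yield parent_path
    try:
        with os.scandir(parent_path) as it:
            for entry in it:
                # is_symlink() only needs lstat()/d_type, so it works even
                # when following the link would raise ENAMETOOLONG.
                if entry.is_symlink():
                    continue
                if entry.is_dir():
                    yield from iterate_directories(entry.path)
    except PermissionError:
        return
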
It's worth noting that on Linux "ls -l bad_link" works fine.
Also "find /some/path/containing/bad/link" works fine.
You do, however, get an error when running "ls bad_link".
I believe Python's glob() should behave like "find" on Linux and not fail.
Since it explicitly ignores symlinks anyway, it has no business calling
is_dir() on a symlink.
I have attached a file which reproduces this problem. It's meant to be run
inside an empty directory.
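
For reference, here is a minimal sketch of the kind of script that triggers
the error (the actual uhoh.py attachment may differ):

import os
import pathlib

# Create a symlink whose target is a single name longer than NAME_MAX
# (typically 255 bytes on Linux). Creating the link succeeds; only
# resolving it fails with ENAMETOOLONG.
os.symlink("x" * 1000, "bad_link")

# Without the suggested reordering, this raises OSError
# ("File name too long") because entry.is_dir() stat()s through the symlink.
print(list(pathlib.Path(".").glob("**/*")))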
----------
files: uhoh.py
messages: 388927
nosy: eric.frederich
priority: normal
severity: normal
status: open
title: pathlib.Path.glob causes OSError encountering symlinks to long filenames
Added file: https://bugs.python.org/file49884/uhoh.py
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue43529>
_______________________________________