On 07/18/2017 05:37 PM, SF Markus Elfring wrote:
> Are there other programs (besides the tool “find”) which support
> such a data processing style?
Sorry, you lost me. What are we talking about?
This discussion mentions constructs that glob *.txt and pass the
basenames as parameters to another script ... while worrying about
the performance of stripping the directory part off each item.
When expanding such a glob and passing the result to an external
command, you can easily hit the maximum command-line length, here:
$ xargs -r --show-limits </dev/null 2>&1 | grep '^Maximum .* actually use:'
Maximum length of command we could actually use: 2089058
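The same limit can also be queried directly; the xargs figure above is
somewhat lower because xargs subtracts the size of the current
environment (a minimal sketch, assuming POSIX getconf):

```shell
# ARG_MAX is the raw kernel limit on the exec() argument area;
# the usable command line is this minus the environment's size.
argmax=$(getconf ARG_MAX)
printf 'ARG_MAX = %s bytes\n' "$argmax"
```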
Furthermore, many file systems do not scale very well with several
hundred thousand files inside a single directory.
So how many files do you have: thousands, millions?
And if the files are further processed by an external script, why should
a simple `basename` or ${name##*/} hurt the performance?
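For illustration, both forms yield the same result; the parameter
expansion merely avoids one fork per file (a minimal sketch for any
POSIX shell; the path in 'name' is just a made-up example):

```shell
name=/var/tmp/data/report.txt

with_cmd=$(basename "$name")   # runs a command (external or builtin)
with_exp=${name##*/}           # pure parameter expansion, no fork

printf '%s\n%s\n' "$with_cmd" "$with_exp"
```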
AFAICS your best bet is still based on find(1).
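A minimal sketch of that approach, assuming POSIX find(1) and sh (the
scratch directory and file names exist only for the demo): find streams
the matches to -exec in batches, so the argument-length limit never
applies to the whole file list at once, and the directory part is
stripped inside the child shell with ${f##*/}.

```shell
# Demo setup: a scratch directory with two matching files.
dir=$(mktemp -d)
touch "$dir/a.txt" "$dir/b.txt"

# -exec ... {} + passes the names in batches below the kernel limit;
# the loop in the child shell strips the directory part per file.
out=$(find "$dir" -name '*.txt' -exec sh -c '
        for f; do printf "%s\n" "${f##*/}"; done
      ' sh {} + | sort)
printf '%s\n' "$out"

rm -rf "$dir"
```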
Have a nice day,
Berny