Hello.

Rich wrote:
I have come across what seems to be a bug with the ls command.  It only
happens in directories with more than a few thousand entries.

For example, the directory I am testing has roughly 7,000 files.  We
tried using:

ls *_*

This produces a failure, [Too Many Arguments].

There are limits on how long a command line can be. The shell expands *_* into the matching filenames before ls ever runs, and with several thousand matches the resulting argument list is long enough to hit those limits.
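
If you're curious, something like this should show the limit on your system (getconf ARG_MAX reports the maximum size of the argument list plus environment passed to exec, on Linux and most POSIX systems; the exact figure varies):

  getconf ARG_MAX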


These all work fine:
ls
ls -altr
ls -1

I tried other commands like find, tail, and grep. All work fine.
        find . -name "*_*" -exec ls -l {} \;
        tail *_*
        grep "DATA ERROR" *_*

BTW, the *_* pattern matches ~4,300 of the 7,000 files.

Not sure if others have seen this type of failure.

Yes, I've seen it quite a few times.

You may find this command quicker than "find -exec":

  find . -name "*_*" -print0 | xargs --null ls -l

This also copes with spaces (and even newlines) in the filenames, because it uses NUL as the separator instead of whitespace.
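
xargs collects the filenames into batches that fit under the argument-list limit and runs ls once per batch, so no single invocation gets too many arguments. The same trick should work for the grep case from your message, roughly:

  find . -name "*_*" -print0 | xargs --null grep "DATA ERROR"

(GNU xargs also accepts -0 as a short form of --null. If you need the filename printed on every match, GNU grep's -H option forces it even when a batch happens to contain only one file.)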

Hope that helps, bye, Rich =]

--
Richard Dawe [ http://homepages.nildram.co.uk/~phekda/richdawe/ ]

"You can't evaluate a man by logic alone."
  -- McCoy, "I, Mudd", Star Trek



