On Tue, 22 May 2001, Terry Lambert wrote:

> I don't understand the inability to perform the trivial
> design engineering necessary to keep from needing to put
> 60,000 files in one directory.
>
> However, we can take it as a given that people who need
> to do this are incapable of doing computer science.
>
> I would suggest three things:
>
> 1)    If write caching is off on the Linux disks, turn
>       it off on the FreeBSD disks.
>
> 2)    "  " -- and then turn it on on both.
>
> 3)    Modify the test to delete the files based on a
>       directory traversal, instead of promiscuous
>       knowledge of the file names, which is cheating
>       to make the lookups appear faster.
>
> (the rationale behind this last is that people who can't
> design around needing 60,000 files in a single directory
> are probably going to be unable to correctly remember
> the names of the files they created, since if they could,
> then they could remember things like ./a/a/aardvark or
> ./a/b/abominable).
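
[The ./a/a/aardvark, ./a/b/abominable layout Terry describes is a
two-character prefix fan-out. A minimal sketch of the idea in Python —
the helper name and the "_" padding for short names are mine, not
anything from the thread:]

```python
import os

def fanout_path(root, name):
    """Map a file name to a two-level prefix directory, e.g.
    'aardvark' -> root/a/a/aardvark, so that no single directory
    has to hold tens of thousands of entries.
    Short names are padded with '_' (an arbitrary choice here)."""
    a = name[0] if len(name) > 0 else "_"
    b = name[1] if len(name) > 1 else "_"
    return os.path.join(root, a, b, name)
```
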

The problem comes along when you are using a third party
application that keeps a bazillion files in a directory,
which was the problem that spawned this entire thread.

Why is knowing the file names cheating?  It is almost certain
that the application will know the names of its own files
(and won't be scanning the entire directory every time it
needs to find one).  I doubt a human is ever going to want
to work in a directory with 60,000 files lying about, but an
application might easily be written to work in just such
conditions.
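[For reference, the traversal-based delete Terry proposes in (3) — remove
whatever the directory listing reports, rather than reconstructing the
names the creator used — can be sketched like this; the function name is
mine:]

```python
import os

def delete_by_traversal(path):
    """Delete every regular file found by listing `path`,
    with no prior knowledge of the file names."""
    count = 0
    for name in os.listdir(path):
        full = os.path.join(path, name)
        if os.path.isfile(full):
            os.unlink(full)
            count += 1
    return count
```

The point of contention is the listing itself: with 60,000 entries the
directory scan is the expensive part, which name-based deletion avoids.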
