Goswin von Brederlow <[EMAIL PROTECTED]> wrote:
> Andreas Metzler <[EMAIL PROTECTED]> writes:
>> Goswin von Brederlow <[EMAIL PROTECTED]> wrote:
>>> Steve Greenland <[EMAIL PROTECTED]> writes:
[...]  
>>>>     for ct in * ; do
>>>>         chown $ct:crontab $ct
>>>>     done
 
>>> This also won't work with too many users.
 
>> No, it will.
 
>> [EMAIL PROTECTED]:/tmp/big> rm *
>> bash: /bin/rm: Argument list too long
>> [EMAIL PROTECTED]:/tmp/big> for ct in *; do rm $ct ; done \
>>   && echo success
>> success

> for doesn't fork, so the really small command-line limit isn't a
> problem. But are you sure the shell does not read in a full list for
> "*" and then work through that?

I've no idea ...
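
One rough way to find out on Linux (a sketch only, assuming /proc is
available and a directory with very many entries, like the /tmp/big
above):

    cd /tmp/big
    grep VmRSS /proc/$$/status    # shell's resident memory before the loop
    for f in * ; do : ; done      # if "*" is expanded up front, memory jumps here
    grep VmRSS /proc/$$/status    # memory afterwards (the allocator may not hand it back)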

> Ok, with more users than you can hold
> in RAM you have other problems.
[...]

... yes. A list of a million users would not take more than about 20MB
of RAM (roughly 20 bytes per name), but both handling that many users
and having a directory with a million entries would probably be
unbearably slow.

The RAM limit is not of any interest at all, because looping a million
times takes ages. (Just compare "for i in * ; do touch $i ; done"
with "find -maxdepth 1 -print0 | xargs -0r touch" on a directory with
5000 files; the former is about 100 times slower.)
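
For reference, a rough way to reproduce that comparison (a sketch only;
the exact factor will vary by system, and /tmp/speedtest is just a
made-up scratch directory):

    mkdir /tmp/speedtest && cd /tmp/speedtest
    for i in $(seq 1 5000) ; do : > "file$i" ; done       # create 5000 empty files
    time ( for i in * ; do touch $i ; done )              # one fork+exec of touch per file
    time ( find -maxdepth 1 -print0 | xargs -0r touch )   # only a handful of touch invocations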
            cu andreas
PS: Please respect Mail-Followup-To if set _and_ only send Cc's if
explicitly requested. - TIA.

