According to James Reynolds <[EMAIL PROTECTED]>:

> I can't seem to get Perl to open more than 254 files.  Is this a 
> known limitation, or is it a bug?  The only related information I 
> can find is that Solaris had a similar problem (with a limit of 
> 256), which has since been fixed.
> 
> To test, I wrote a script that generates 254 lines of "open 
> (FILE... die...)".  I copy the output of that script, paste it 
> into a new script, and execute the new script.  The output of the 
> new script is at the bottom of this message.
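
For reference, here is a loop-based sketch of that kind of test (an 
assumed equivalent of the generated script, not the original one): it 
keeps opening numbered files under /tmp until open() fails, then 
reports how many it managed.

    #!/usr/bin/perl -w
    # Probe the per-process limit: open files until open() fails.
    my @handles;
    my $count = 0;
    while (open(my $fh, '>', "/tmp/probe.$count")) {
        push @handles, $fh;  # keep the handle alive so the file stays open
        $count++;
    }
    # stdin, stdout and stderr already consume descriptors, so the
    # count lands slightly below the actual limit (254-ish for 256).
    print "open() failed after $count files: $!\n";

(Remember to remove the /tmp/probe.* files afterwards.)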

In bash, you can use "ulimit -n" to raise the maximum number of open 
files (the limit is per process and is inherited by the shell's 
children): 

    $ ulimit -a
    core file size        (blocks, -c) 0
    data seg size         (kbytes, -d) 6144
    file size             (blocks, -f) unlimited
    max locked memory     (kbytes, -l) unlimited
    max memory size       (kbytes, -m) unlimited
    open files                    (-n) 256
    pipe size          (512 bytes, -p) 1
    stack size            (kbytes, -s) 512
    cpu time             (seconds, -t) unlimited
    max user processes            (-u) 100
    virtual memory        (kbytes, -v) 6656
    $ ulimit -n 512
    $ echo $?
    0
    $ ulimit -a
     ... 
    open files                    (-n) 512
     ...

The tcsh equivalent is "limit" (specifically "limit descriptors"), 
but it seems that the maximum number of open files can't be raised 
with this command. 
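
You can also raise the limit from within Perl itself, assuming the 
CPAN module BSD::Resource is installed (it is not part of the core 
distribution). A minimal sketch:

    use BSD::Resource;

    # Read the current soft/hard limits on open file descriptors,
    # then raise the soft limit up to the hard limit.
    my ($soft, $hard) = getrlimit(RLIMIT_NOFILE);
    setrlimit(RLIMIT_NOFILE, $hard, $hard)
        or die "setrlimit failed: $!\n";
    print "open files: soft limit raised from $soft to $hard\n";

As with ulimit, an unprivileged process can only raise its soft 
limit up to the hard limit.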

More generally, you can use the FileCache module, which ships with 
the core distribution of Perl: 

    NAME
       FileCache - keep more files open than the system permits

    SYNOPSIS
            use FileCache;

            cacheout $path;
            print $path @data;
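
For instance, here is a small sketch that writes to more files than 
the limit allows (the paths are made up for the example):

    use FileCache;

    # Write one line to each of 300 files -- more than the 254-handle
    # limit seen above.  When the process runs out of descriptors,
    # FileCache closes cached handles as needed and transparently
    # reopens them (in append mode) on the next cacheout() for that
    # path.
    foreach my $i (1 .. 300) {
        my $path = "/tmp/fc_demo.$i";   # example paths
        cacheout $path;
        print $path "line for file $i\n";
    }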


Regards
-- 
Maddingue
