On 5/9/05, Vamsi_Doddapaneni <[EMAIL PROTECTED]> wrote:
> I am facing a new problem.
>
> Here is the code part:
> foreach $name(`ls $opt_i/*.xml`){
> chomp;
> push @f, $name;
> print "pushing elements=$name\n";
> }
> [EMAIL PROTECTED];
>
> Now in the directory $opt_i, if there are some 10, 20 or even 100 XMLs
> it works well. But now I have some 305-odd XMLs and the code is EXITING
> WITH
>
> sh: /usr/bin/ls: 0403-027 The parameter list is too long.
>
> At the Unix prompt, 'ls *.xml' works (listing those 305 XMLs).
>
> Could anybody help me out?
Ever seen an Open Source program autoconfigure itself and test how big
the argument list can be? There's a reason for doing so, and you're
probably running into it. POSIX recognizes that some systems limit the
number of bytes that can be passed in the argument list, and the
smallest value permitted for that upper bound is 4 KB (_POSIX_ARG_MAX
is 4096 bytes) - and the count includes the values in the environment.
Many systems have very large limits - others do not, and yours appears
to be one of the latter.
What is the value of $opt_i? If it is
/some/very/long/name/with/many/parts, then the 'ls' command line is
going to be roughly 300 times (35 + average-length-of-file.xml) bytes,
which could easily reach 15 KB. Which shell do you use interactively?
What happens when you type 'ls /some/very/...'? If your interactive
shell is not /bin/sh, could it be that Perl's backticks use /bin/sh
and run into a limit your shell does not have? Do you have a very
large environment?
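You can check both of those from within Perl itself; a small sketch
(the exact numbers are system-dependent, so treat the output as
illustrative):

```perl
use strict;
use warnings;
use POSIX qw(sysconf _SC_ARG_MAX);

# Ask the OS for its actual limit on the combined size of the argument
# list plus the environment passed to a new process (ARG_MAX).
my $arg_max = sysconf(_SC_ARG_MAX);
print "ARG_MAX = $arg_max bytes\n";

# The environment counts against that same limit; estimate its size
# (each "NAME=value" entry plus a separator and terminator).
my $env_bytes = 0;
$env_bytes += length($_) + length($ENV{$_} // '') + 2 for keys %ENV;
print "environment uses roughly $env_bytes bytes\n";
```

If ARG_MAX on your box is small and the environment is fat, the room
left for the 305 expanded filenames shrinks accordingly.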
Can you rewrite your Perl to do the reading, filtering and sorting of
the directory contents itself, instead of invoking 'ls'? (Perl can do
it with opendir/readdir or glob; someone has almost certainly done it
before and the result is probably on CPAN; is it worth your while?
That depends on how tractable the 'ls' problem turns out to be.)
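The directory scan sketched above never spawns a shell, so ARG_MAX is
irrelevant no matter how many files there are. A minimal version (it
builds a scratch directory so it runs standalone; in your script the
directory would simply be $opt_i):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Scratch directory with a few .xml files, standing in for $opt_i.
my $opt_i = tempdir(CLEANUP => 1);
for my $n (1 .. 3) {
    open my $fh, '>', "$opt_i/file$n.xml" or die "create: $!";
    close $fh;
}

# glob() expands the wildcard inside Perl itself -- no /bin/sh, so the
# ARG_MAX limit on external command lines never comes into play.
my @f = sort glob("$opt_i/*.xml");
print "pushing elements=$_\n" for @f;

# The same with opendir/readdir and an explicit .xml filter:
opendir my $dh, $opt_i or die "Cannot open $opt_i: $!";
my @g = sort map { "$opt_i/$_" } grep { /\.xml$/ } readdir $dh;
closedir $dh;
```

Either form scales to thousands of files, since the names never pass
through a command line.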
--
Jonathan Leffler <[EMAIL PROTECTED]> #include <disclaimer.h>
Guardian of DBD::Informix - v2005.01 - http://dbi.perl.org
"I don't suffer from insanity - I enjoy every minute of it."