$opt_i expands to some path, then Perl calls the shell to run the command "ls some_path/*.xml". The shell then does what shells do: it expands the *.xml pattern into a list of files to construct a command line of the form "ls some_path/file1 some_path/file2 some_path/file3 ...", which it then executes using the exec(2) operating system function. exec(2) has an upper limit on the total length of the argument list it can handle, known as ARG_MAX. On my HP system ARG_MAX is 2048000 bytes. So if you have a great many files, and/or <some_path> is a very long path, and/or your ARG_MAX is small, you run into the problem you saw.
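As a quick illustration of the limit and of one of the workarounds discussed below (the demo directory here is a throwaway stand-in, not the poster's $opt_i):

```shell
# Query the kernel's argument-list limit, in bytes (POSIX getconf).
getconf ARG_MAX

# A common workaround: let find(1) enumerate the files and have xargs
# split them into command lines that each stay under ARG_MAX, instead
# of handing the shell one giant wildcard expansion.
demo=$(mktemp -d)
touch "$demo/a.xml" "$demo/b.xml"
find "$demo" -name '*.xml' | xargs ls
rm -rf "$demo"
```

Note that a bare `find dir -name '*.xml'` never builds a long command line at all; xargs only matters when you need to feed the results to another command such as ls.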
A google for ARG_MAX found this nice explanation: http://www.in-ulm.de/~mascheck/various/argmax/

The key lesson: do not use shell wild-cards in production code unless 1) by design you have a known upper limit on the number of files you can have, and 2) you've tested your code at that upper limit. Ways around this include doing the "ls" function all in Perl by reading the directory yourself, or using alternative shell command pipelines that do not suffer this limit (for example, use "find").

Conrad Kimball
Associate Technical Fellow
IT - CNO, Shared Services Group
[EMAIL PROTECTED]
P.O. Box 24346, MS 7M-HC
Seattle, WA 98124-0346
Bellevue 33-12 bldg, office 32C1
Phone: (425) 865-6410
Pager: (206) 797-3112
Cell: (425) 591-7802

-----Original Message-----
From: Madani, Srikanth, VF-DE [mailto:[EMAIL PROTECTED]]
Sent: Monday, May 09, 2005 5:53 AM
To: Vamsi_Doddapaneni; [email protected]; CAMPBELL, BRIAN D (BRIAN)
Subject: RE: /usr/bin/ls: 0403-027 The parameter list is too long

Vamsi_Doddapaneni wrote on Monday, 9 May 2005 12:16:

> Here is the code part:
>
> foreach $name (`ls $opt_i/*.xml`) {
>     chomp;
>     push @f, $name;
>     print "pushing elements=$name\n";
> }
> [EMAIL PROTECTED];
>
> Now in the directory $opt_i if there are some 10, 20 or even 100 it's
> working well. But now I have some 305 odd xmls and the code is EXITING
> WITH
>
> sh: /usr/bin/ls: 0403-027 The parameter list is too long.

It's strange that your ls fails to handle 300+ files - is $opt_i being expanded correctly? And I don't ever recall seeing an ls display an error code upon failure. In any case, I would suggest using the Perl built-in to read the directory listing, instead of using the system ls command.
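As an aside, Perl's built-in glob (File::Glob, the default since Perl 5.6) expands wildcards inside the interpreter without spawning a shell, so it avoids ARG_MAX as well. A minimal sketch, using a throwaway demo directory rather than the poster's $opt_i:

```shell
# Perl's glob() expands the pattern internally; no external "ls" command
# line is ever built, so ARG_MAX never comes into play.
# The directory below is a made-up demo stand-in for $opt_i.
demo=$(mktemp -d)
touch "$demo/one.xml" "$demo/two.xml" "$demo/notes.txt"
perl -le 'my @f = glob("$ARGV[0]/*.xml"); print scalar @f' "$demo"   # prints 2
rm -rf "$demo"
```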
Here's sample, albeit untested code:

    my $sample_dir = $opt_i;
    chdir $sample_dir
        or die "\nFatal error : Cannot cd to $sample_dir\n";
    opendir SAMPLEDIR, $sample_dir
        or die "\nFatal error : Cannot read directory $sample_dir.\n";
    my @ELEMENTS = grep /\.xml$/, readdir SAMPLEDIR;
    closedir SAMPLEDIR;

(Note the "or" rather than "||": with "||", the opendir line would parse as opendir(SAMPLEDIR, $sample_dir || die ...), so the die would never fire. Directory handles are closed with closedir, not close.)

The list @ELEMENTS will contain all *.xml files in $opt_i, if any.

BTW, this is slightly OT for the DBI-users list.

Cheers,
Srikanth Madani

Life is short. Be swift to love! Make haste to be kind!
- Henri Frederic Amiel
