ID:               31515
 User updated by:  akorthaus at web dot de
 Reported By:      akorthaus at web dot de
 Status:           Open
 Bug Type:         Performance problem
 Operating System: Linux 2.4.28 (Gentoo)
 PHP Version:      5.0.3
 New Comment:

I tried php5-STABLE-200501140930 with the same result.

The size of the directory listing ("files"):

Number of files:
ls -1 files | wc -l
10000

Number of bytes:
ls -1 files | wc -c
330000


Previous Comments:
------------------------------------------------------------------------

[2005-01-13 03:43:06] akorthaus at web dot de

The same result with php5-STABLE-200501130130.

------------------------------------------------------------------------

[2005-01-13 03:09:46] akorthaus at web dot de

I tried php5-STABLE-200501122330:

./configure \
  --prefix=/home/akorthaus/bin/php5-STABLE-200501122330 \
  --disable-all \
  --with-pcre-regex \
  --enable-memory-limit

With the following results:

scandir (foreach:500, files:527)
mem: 2M
time: 10.242558956146s

my_scandir (foreach:500, files:527)
mem: 0M
time: 2.3772580623627s

scandir (foreach:1, files:10000)
mem: 40M
time: 0.40674495697021s

my_scandir (foreach:1, files:10000)
mem: 1M
time: 0.17293095588684s

scandir (foreach:100, files:10000)
mem: 40M
time: 41.659919977188s

my_scandir (foreach:100, files:10000)
mem: 1M 
time: 20.631703853607s
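
(For reference: the benchmark script itself is not quoted in this digest. Below is a minimal sketch of how such a comparison could be set up; the directory name "files", the cycle count and the exact output format are assumptions, not necessarily the reporter's actual code.)

<?php
// Time $cycles calls of a directory-listing function over the same
// directory and report the script's memory usage afterwards.
function benchmark($func, $dir, $cycles)
{
    $start = microtime(true);
    foreach (range(1, $cycles) as $unused) {
        $files = $func($dir);
    }
    $time = microtime(true) - $start;

    echo "$func (foreach:$cycles, files:" . count($files) . ")\n";
    echo "mem: " . number_format(memory_get_usage() / 1048576) . "M\n";
    echo "time: " . $time . "s\n\n";
}

benchmark('scandir', 'files', 100);
// benchmark('my_scandir', 'files', 100);  // same harness with the userland
//                                         // readdir()-based variant sketched below
?>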

------------------------------------------------------------------------

[2005-01-13 02:10:35] [EMAIL PROTECTED]

Please try using this CVS snapshot:

  http://snaps.php.net/php5-STABLE-latest.tar.gz
 
For Windows:
 
  http://snaps.php.net/win32/php5.0-win32-latest.zip

That's amazing. Try 5.0.4-dev.

------------------------------------------------------------------------

[2005-01-12 23:59:15] akorthaus at web dot de

With a small directory I get:

my_scandir:
count: 71
1.63159513474

scandir:
count: 71
3.12091302872

With 100,000 files it takes too long, and scandir() runs into the
memory_limit (which is set to 500 MB!).

scandir() seems to need much more memory!
I added the following line to the scripts:
echo "mem: ".number_format(memory_get_usage()/1048576) . "M\n";

so I get:

my_scandir:
mem: 10M
count: 100002
1.38867807388

scandir:
mem: 397M
count: 100002
1.75003910065


If I put the following into the scandir version:

foreach(range(1, 2) as $unused)

I get:

Fatal error: Allowed memory size of 524288000 bytes exhausted (tried to
allocate 4096 bytes) in /home/akorthaus/test/scan.php on line 5


If I put the following into the my_scandir version:

foreach(range(1, 10) as $unused)

I get:

mem: 10M
count: 100002
16.7273759842

which is the same memory usage as with only one cycle.
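
(The my_scandir() function referenced throughout this report is not quoted in the digest either. The usual way to build such a userland replacement is with opendir()/readdir(), roughly as sketched below; treat it as an assumed reconstruction, not the reporter's exact code.)

<?php
// Userland replacement for scandir(): collect the directory entries with
// opendir()/readdir() and sort them, as the built-in does by default.
function my_scandir($dir)
{
    $dh = opendir($dir);
    if ($dh === false) {
        return false;
    }
    $files = array();
    while (($entry = readdir($dh)) !== false) {
        $files[] = $entry;
    }
    closedir($dh);
    sort($files);
    return $files;
}

echo count(my_scandir('.')) . " entries\n";
?>

The sort() call is only there so the output order matches scandir()'s default ascending sort.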

------------------------------------------------------------------------

[2005-01-12 21:51:42] [EMAIL PROTECTED]

count: 2034
251.505897045

count: 2034
155.706785917

Only difference:
foreach(range(1, 5000) as $unused)
    $files = scandir('C:\WINDOWS\System32');

So, this does not reproduce on Win32. Do a foreach like the one above and
spread the measurement over quite a few calls, because with just a single
function call the timings fluctuated back and forth for me.

------------------------------------------------------------------------

The remainder of the comments for this report are too long. To view
the rest of the comments, please view the bug report online at
    http://bugs.php.net/31515

-- 
Edit this bug report at http://bugs.php.net/?id=31515&edit=1
