ID:               31515
 User updated by:  akorthaus at web dot de
 Reported By:      akorthaus at web dot de
-Status:           Feedback
+Status:           Open
 Bug Type:         Performance problem
 Operating System: Linux 2.4.28 (Gentoo)
 PHP Version:      5.0.3
 New Comment:

I tried php5-STABLE-200501122330:

./configure \
  --prefix=/home/akorthaus/bin/php5-STABLE-200501122330 \
  --disable-all \
  --with-pcre-regex \
  --enable-memory-limit

With the following results:

scandir (foreach:500, files:527)
mem: 2M
time: 10.242558956146s

my_scandir (foreach:500, files:527)
mem: 0M
time: 2.3772580623627s

scandir (foreach:1, files:10000)
mem: 40M
time: 0.40674495697021s

my_scandir (foreach:1, files:10000)
mem: 1M
time: 0.17293095588684s

scandir (foreach:100, files:10000)
mem: 40M
time: 41.659919977188s

my_scandir (foreach:100, files:10000)
mem: 1M 
time: 20.631703853607s
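
(A minimal sketch of the kind of harness behind these numbers; the loop
count, directory, and output format are reconstructed from the labels
above, not the exact script used:)

<?php
// Sketch: call scandir() $loops times on a test directory, then report
// memory and wall-clock time in the "foreach:N, files:M" format above.
$loops = 500;
$dir   = '/tmp';   // assumed test directory holding the 527 files
$t1 = microtime(TRUE);
foreach (range(1, $loops) as $unused) {
    $files = scandir($dir);
}
$t2 = microtime(TRUE);
echo "scandir (foreach:$loops, files:" . count($files) . ")\n";
echo "mem: " . number_format(memory_get_usage() / 1048576) . "M\n";
echo "time: " . ($t2 - $t1) . "s\n";
?>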


Previous Comments:
------------------------------------------------------------------------

[2005-01-13 02:10:35] [EMAIL PROTECTED]

Please try using this CVS snapshot:

  http://snaps.php.net/php5-STABLE-latest.tar.gz
 
For Windows:
 
  http://snaps.php.net/win32/php5.0-win32-latest.zip

That's amazing. Try 5.0.4-dev.

------------------------------------------------------------------------

[2005-01-12 23:59:15] akorthaus at web dot de

With a small directory I get:

my_scandir:
count: 71
1.63159513474

scandir:
count: 71
3.12091302872

With 100,000 files it takes too long, and scandir() runs into the
memory_limit (which is set to 500 MB!)

scandir() seems to need much more memory!
I added the following line to the scripts:
echo "mem: ".number_format(memory_get_usage()/1048576) . "M\n";

so I get:

my_scandir:
mem: 10M
count: 100002
1.38867807388

scandir:
mem: 397M
count: 100002
1.75003910065


If I put the following into the scandir version:

foreach(range(1, 2) as $unused)

I get:

Fatal error: Allowed memory size of 524288000 bytes exhausted (tried to
allocate 4096 bytes) in /home/akorthaus/test/scan.php on line 5


If I put the following into the my_scandir version:

foreach(range(1, 10) as $unused)

I get:

mem: 10M
count: 100002
16.7273759842

which is the same as with only one cycle.
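
A self-contained sketch of this check (assuming the same 100,000-file
test directory; the per-cycle echo is added here for illustration):

<?php
// Sketch: if scandir() released its memory between calls, the usage
// reported after each cycle should stay flat instead of growing.
foreach (range(1, 2) as $cycle) {
    $files = scandir('/tmp');
    echo "cycle $cycle mem: "
       . number_format(memory_get_usage() / 1048576) . "M\n";
}
?>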

------------------------------------------------------------------------

[2005-01-12 21:51:42] [EMAIL PROTECTED]

count: 2034
251.505897045

count: 2034
155.706785917

Only difference:
foreach(range(1, 5000) as $unused)
    $files = scandir('C:\WINDOWS\System32');

So, not reproducible on Win32. Do a foreach like I have done and spread
the cost over quite a few calls, because timing a single function call
gave results that bounced back and forth for me.
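
A sketch of that approach, averaging the per-call cost over many
iterations (loop count and path taken from the snippet above):

<?php
// Sketch: a single scandir() call is too short to time reliably, so
// spread the cost over many calls and report the average.
$iterations = 5000;
$t1 = microtime(TRUE);
foreach (range(1, $iterations) as $unused) {
    $files = scandir('C:\WINDOWS\System32');
}
$t2 = microtime(TRUE);
echo "count: " . count($files) . "\n";
echo "avg per call: " . (($t2 - $t1) / $iterations) . "s\n";
?>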

------------------------------------------------------------------------

[2005-01-12 13:48:34] akorthaus at web dot de

Description:
------------
I do not understand why the new scandir() function is slower than my
own PHP function that does the same thing (I used "Example 2. PHP 4
alternatives to scandir()" from the manual).

I tried this with 50 to 100,000 files, but the result is always the
same.

my_scandir() is about 50-100% faster. If I don't sort, it is about
400% faster.

Reproduce code:
---------------
<?php
function my_scandir($dir) {
    $files = array();   // initialize so an empty directory returns an array
    $dh = opendir($dir);
    while (false !== ($filename = readdir($dh))) {
        $files[] = $filename;
    }
    closedir($dh);      // release the directory handle
    sort($files);
    return $files;
}
$t1 = microtime(TRUE);
$files = my_scandir('/tmp');
$t2 = microtime(TRUE);
echo "count: ".count($files)."\n";
echo $t2-$t1;
echo "\n";
?>

<?php
$t1 = microtime(TRUE);
$files = scandir('/tmp');
$t2 = microtime(TRUE);
echo "count: ".count($files)."\n";
echo $t2-$t1;
echo "\n";
?>

Expected result:
----------------
I expect the C function to be faster.

Actual result:
--------------
The PHP function is about 50-100% faster.
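
For comparison, the unsorted variant behind the 400% figure would be
my_scandir() minus the sort() call, roughly:

<?php
// Sketch: identical to my_scandir() above, but skipping the sort()
// step that accounts for most of its remaining cost.
function my_scandir_nosort($dir) {
    $files = array();
    $dh = opendir($dir);
    while (false !== ($filename = readdir($dh))) {
        $files[] = $filename;
    }
    closedir($dh);
    return $files;
}
?>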


------------------------------------------------------------------------


-- 
Edit this bug report at http://bugs.php.net/?id=31515&edit=1
