Edit report at http://bugs.php.net/bug.php?id=52312&edit=1
ID: 52312
Updated by: ras...@php.net
Reported by: v dot damore at gmail dot com
Summary: PHP lstat problem
Status: Bogus
Type: Bug
Package: Safe Mode/open_basedir
Operating System: Linux
PHP Version: 5.2.13

New Comment:

And as I said, we have made this more efficient in PHP 5.3 because we now cache the partial paths separately. You should see a performance improvement when you move to 5.3.

Previous Comments:
------------------------------------------------------------------------
[2010-07-12 15:32:33] paj...@php.net

Again, it does this once and only once per path. When it does, it checks each element of the path (and caches each of them too).
------------------------------------------------------------------------
[2010-07-12 14:59:36] v dot damore at gmail dot com

I should also note that, looking at our production servers, in some cases the PHP engine tries up to 8 times before reading the file. Can you explain why the PHP engine behaves this way? Is there any way to remove or change this behavior in the PHP engine?
------------------------------------------------------------------------
[2010-07-12 14:43:43] v dot damore at gmail dot com

We have already tuned the cache to the following values:

realpath_cache_size=1024k
realpath_cache_ttl=7200

Can we increase the cache to:

realpath_cache_size=40960k
realpath_cache_ttl=72000

Do you know whether memory_limit is affected by increasing realpath_cache_size? Our memory limit is currently set to:

memory_limit = 96M

But the biggest problem we have at the moment is when search-engine spiders come to crawl the whole platform. In that case, every existing page gets crawled. Can you suggest a workaround?
------------------------------------------------------------------------
[2010-07-12 14:23:59] paj...@php.net

That's why the realpath_cache_size and realpath_cache_ttl settings exist: they allow you to fine-tune this cache to fit your needs. On a shared host you will certainly want to increase the default value.
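The cache tuning discussed above can be observed at runtime. A minimal sketch, assuming PHP >= 5.3.2 (the version in which realpath_cache_size() and realpath_cache_get() were added); the output values shown by this script will of course vary per server:

```php
<?php
// Show the configured realpath cache limits from php.ini
// (the directives discussed in this report).
echo "realpath_cache_size setting: ", ini_get('realpath_cache_size'), "\n";
echo "realpath_cache_ttl setting:  ", ini_get('realpath_cache_ttl'), "\n";

// Bytes of the realpath cache actually in use right now.
echo "cache memory in use: ", realpath_cache_size(), " bytes\n";

// Each resolved path component is cached as its own entry, which is why
// resolving one deep path can create several entries.
foreach (realpath_cache_get() as $path => $entry) {
    printf("%s => %s (expires %d)\n",
           $path, $entry['realpath'], $entry['expires']);
}
```

Comparing "cache memory in use" against the configured realpath_cache_size under real spider traffic is one way to tell whether the cache is large enough before raising the limit.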
------------------------------------------------------------------------
[2010-07-12 14:16:59] v dot damore at gmail dot com

Thanks for your explanation. I followed your suggestions and there is a performance improvement in the submitted case. However, your suggestions are not applicable in all cases. Please consider this problem from my point of view: a production environment at a large web hosting provider. In this real-world case there are many thousands of users who can freely write their own PHP code, and there are hundreds of thousands of pages that cannot be cached. Please also consider what happens when Google's spiders come to crawl all of those pages.
------------------------------------------------------------------------

The remainder of the comments for this report are too long. To view the rest of the comments, please view the bug report online at http://bugs.php.net/bug.php?id=52312

--
Edit this bug report at http://bugs.php.net/bug.php?id=52312&edit=1