[PHP] performance: large includes vs small includes with lots of reads
I'm trying to tweak my server a bit and wonder if it is better to have a large include file, say 20-40 kb with 15 user-defined functions, of which maybe 3 or 4 are used on any given page, or to have each function in its own file and include (disk access/read) them only as needed. Is it faster/more efficient to read one large file or several smaller ones? It's a RS Linux server with a gig of memory.

I'm also looking at PHP Accelerator and such. Do those store only in memory what the script needs, or does it include all that you require or include in, as well?

Thanks,

Hans

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
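The "one file per function" layout being asked about can be sketched as a standalone script. All file names and function bodies below are invented for illustration; the library files are written to a temp directory so the sketch runs on its own:

```php
<?php
// Sketch of the "one file per function" layout the question describes:
// each helper lives in its own file and a page includes only what it
// needs. File names and function bodies are hypothetical examples.
$libdir = sys_get_temp_dir() . '/perfunc_demo';
@mkdir($libdir);

// Simulate the per-function library on disk.
file_put_contents("$libdir/format_date.php",
    "<?php function format_date(\$ts) { return date('Y-m-d', \$ts); }");
file_put_contents("$libdir/slugify.php",
    "<?php function slugify(\$s) { return strtolower(str_replace(' ', '-', \$s)); }");

// This "page" needs only these two helpers, so only their files are
// read from disk; the other functions in the library are never touched.
require_once "$libdir/format_date.php";
require_once "$libdir/slugify.php";

echo slugify('Hello World'), "\n"; // prints "hello-world"
```

The trade-off in the question is exactly this: fewer bytes parsed per page, but more individual file opens.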
Re: [PHP] performance: large includes vs small includes with lots of reads
On Thu, 7 Oct 2004 11:41:10 -0500 (CDT), Hans H. Anderson [EMAIL PROTECTED] wrote:
> I'm trying to tweak my server a bit and wonder if it is better to have a
> large include file, say 20-40 kb with 15 user-defined functions, of which
> maybe 3 or 4 are used on any given page, or to have each function in its
> own file and include (disk access/read) them only as needed. Is it
> faster/more efficient to read one large file or several smaller ones?
> It's a RS Linux server with a gig of memory.
>
> I'm also looking at PHP Accelerator and such. Do those store only in
> memory what the script needs, or does it include all that you require or
> include in, as well?

With sufficient RAM, the OS will likely cache a lot of the file reads anyway, so with files that small it shouldn't make a whole lot of difference. But like I always say, the best way to know is to try it both ways and benchmark it.

-- 
Greg Donald
Zend Certified Engineer
http://gdconsultants.com/
http://destiney.com/
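A minimal benchmark along the lines Greg suggests could look like this. It generates both layouts in a temp directory (15 trivial functions, names invented) and times one big include against the four small includes a typical page would need; absolute numbers will vary with the machine and OS cache state, so run it several times:

```php
<?php
// Benchmark sketch: one 15-function include vs. four single-function
// includes. All file and function names here are hypothetical.
$dir = sys_get_temp_dir() . '/inc_bench';
@mkdir($dir);

// One big file containing 15 functions.
$big = '<?php ';
for ($i = 0; $i < 15; $i++) {
    $big .= "function big_fn$i(\$x) { return \$x + $i; } ";
}
file_put_contents("$dir/big.php", $big);

// Fifteen small files, one function each.
for ($i = 0; $i < 15; $i++) {
    file_put_contents("$dir/small$i.php",
        "<?php function small_fn$i(\$x) { return \$x + $i; }");
}

// Time including the big file once.
$t = microtime(true);
include "$dir/big.php";
$bigTime = microtime(true) - $t;

// Time including only the four small files a typical page would need.
$t = microtime(true);
for ($i = 0; $i < 4; $i++) {
    include "$dir/small$i.php";
}
$smallTime = microtime(true) - $t;

printf("big include: %.6fs, four small includes: %.6fs\n",
       $bigTime, $smallTime);
```

For a real comparison you would wrap each measurement in a loop (or hit the page repeatedly with a tool such as `ab`) so the OS cache is warm for both cases.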
Re: [PHP] performance: large includes vs small includes with lots of reads
* Thus wrote Hans H. Anderson:
> I'm trying to tweak my server a bit and wonder if it is better to have a
> large include file, say 20-40 kb with 15 user-defined functions, of which
> maybe 3 or 4

This is all a 'depends' situation. Of those 15 functions, how many are actually executed across all the scripts that depend on them? I tend to group functions into files where they're related, and thus we get into a discussion of OOP. Functions that are used a majority of the time are always included, like authentication functions. Then there are the 'middleware' functions, which depend on what type of results you expect to have. And last are the very specific functions, depending on the 'type'.

> are used on any given page, or to have each function in its own file and
> include (disk access/read) them only as needed.

This bottleneck is usually taken care of at the OS level, by how it caches frequently accessed files.

> Is it faster/more efficient to read one large file or several smaller
> ones? It's a RS Linux server with a gig of memory. I'm also looking at
> PHP Accelerator and such. Do those store only in memory what the script
> needs, or does it include all that you require or include in, as well?

A PHP accelerator will basically cache the actual opcode that PHP uses to execute a script, including the files that are included within the script. An accelerator is always a good option to optimize performance. Generally they will eliminate the overhead of PHP having to parse each file on every request, and will probably eliminate most of the overhead even if you include one big file. Of course, even with the accelerator, I would suggest designing your code to be as optimal without it.

Curt
-- 
The above comments may offend you. flame at will.
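The tiered grouping Curt describes (always-included essentials, then groups pulled in per page) can be sketched like this. Every name here is hypothetical, and the "library" files are written to a temp directory so the sketch runs standalone:

```php
<?php
// Sketch of tiered includes: functions used on nearly every page are
// always included, while task-specific groups are loaded only by the
// pages that need them. All names are invented for illustration.
$lib = sys_get_temp_dir() . '/tier_demo';
@mkdir($lib);
file_put_contents("$lib/auth.php",
    "<?php function is_logged_in() { return isset(\$_SESSION['user']); }");
file_put_contents("$lib/reports.php",
    "<?php function report_title(\$id) { return \"Report #\$id\"; }");

require_once "$lib/auth.php";        // always: every page needs auth

$page = 'report';                    // normally derived from the request
if ($page === 'report') {
    require_once "$lib/reports.php"; // only report pages pay this cost
}

echo report_title(7), "\n"; // prints "Report #7"
```

With an accelerator in front, both tiers end up cached as opcode anyway, which is why Curt's last point holds: group for maintainability first, and let the cache handle the parse cost.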