After reading about how file-based PHP sessions are not the greatest for performance, it has me thinking: does this mean a PHP script that includes a lot of files is bad as well?
Another option is to "lazily" include files, i.e. require_once them right before they are used (for example, just before instantiating a class), so that a whole load of unused includes never happens.
I've used a Zend Framework feature for this (registering an autoloader), but I don't know whether it is generic to PHP, or whether it isn't but exists in other frameworks...
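For reference, a minimal sketch of the lazy-include idea (the path and class name are hypothetical):

    <?php
    // Lazy include: pay the parse cost only when the class is actually used.
    function makeReportGenerator()
    {
        require_once '/var/www/app/lib/ReportGenerator.php'; // hypothetical path
        return new ReportGenerator();
    }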
Like Chacha102 wrote, it isn't really that bad.
But it also depends on your actual script files.
In practice you should profile your code, Xdebug is great for this.
To clarify: profile and compare. Avoid lots of small script files if you can, but still keep your source code organized (a single script with thousands of lines is not comfortable to edit). A profiler will give you some numbers to find a good balance.
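For example, the Xdebug profiler can be switched on in php.ini (these are the Xdebug 2 setting names; Xdebug 3 replaced them with xdebug.mode=profile):

    ; php.ini -- Xdebug 2 profiler settings
    xdebug.profiler_enable = 1
    xdebug.profiler_output_dir = /tmp/profiles

The cachegrind files it writes can then be inspected in a tool like KCachegrind or Webgrind, where include and parse time shows up per file.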
It isn't really that bad. This is suggested by the fact that no one has tried to create a compiler that turns an entire PHP application into a single script.
If you actually used PHP's microtime function to measure how much time an include takes, the result would be a tiny fraction of a second.
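You can check this yourself with a quick sketch like the following (the included file name is hypothetical):

    <?php
    $start = microtime(true);                  // current Unix timestamp as a float
    require '/var/www/app/lib/SomeClass.php';  // hypothetical file to measure
    printf("include took %.6f seconds\n", microtime(true) - $start);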
Personally, I've always had more problems with the data the pages process rather than the size and amount of source file includes. However, I write heavily parameterised code in abstraction layers (think of this as the opposite of cut-n-paste programming) so a lot of different work is done by the same code merely by changing half-a-dozen simple parameters.
You should use spl_autoload_register() and OOP. This way, no matter how small your project currently is or how big it will evolve over time (and it would be dumb to exclude this possibility), PHP will only include what it needs, no more, no less.
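A minimal autoloader sketch along these lines (the lib/ directory and the underscore-to-directory naming convention are assumptions; the closure syntax needs PHP 5.3+):

    <?php
    // Maps e.g. Foo_Bar to lib/Foo/Bar.php, relative to this file.
    spl_autoload_register(function ($class) {
        $file = __DIR__ . '/lib/' . str_replace('_', '/', $class) . '.php';
        if (is_file($file)) {
            require $file; // loaded only on first use of the class
        }
    });

    $repo = new User_Repository(); // triggers the autoloader; nothing else is parsed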
That's the perfect future-oriented balance between runtime RAM usage, the maintainability of the code and the effects of hard disk latency time, I'd say, provided you're modularizing your code properly, of course (and XDebug helps here).
Having said that, this also implies that including files you never use is wasteful.
Inclusion of files, no matter which way (spl_autoload_register() or otherwise), should be done with absolute paths: with relative paths, PHP has to search through every directory in the php.ini directive include_path to locate your files.
And a small extra note on why "include 'foo.php'" works like "include './foo.php'" (the "normal" way of including files): it's because the directory "." is part of include_path by default.
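For example (the file name is hypothetical; __DIR__ needs PHP 5.3+, older versions can use dirname(__FILE__)):

    <?php
    // Absolute path: resolved directly, no include_path search involved.
    require_once __DIR__ . '/lib/Database.php';

    // Relative path: PHP walks every include_path entry (including ".") to find it.
    require_once 'lib/Database.php';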
I think everyone who fiddles with PHP reaches the point where their libraries become large and worries about performance come up.
My experience is that yes, if you always load all your libraries, then this is going to eat up precious memory (of which each process is allocated only a fixed number of megabytes). I have had source code files weighing 300-400 kb (with comments) that ate 2-3 MB per script instance. Seeing that a script gets only 16-32 MB on many shared hosts, that is a lot. Also, the processing of such huge files often comes in at up to half a second per request, which is way too much.
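One way to see this effect yourself, assuming a hypothetical big library file:

    <?php
    $before = memory_get_usage();
    require '/var/www/app/lib/BigLibrary.php'; // hypothetical 300-400 kb source file
    printf("library costs %.2f MB of RAM\n", (memory_get_usage() - $before) / 1048576);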
So, splitting up is definitely necessary, and easy to do with autoloading and friends. Check out my answer to this question for a few suggestions on how to split up your code wisely. There is also a link there to a question about how to organize a large PHP project, which yielded great results. I'm in the process of figuring out the perfect structure myself, and I'm not done yet. :)