A hypothetical question for you all to chew on...
I recently answered another question on SO where a PHP script was segfaulting, and it reminded me of something I have been wondering about for a while.
If you use XDebug, there is a maximum function nesting depth, controlled by the xdebug.max_nesting_level ini setting:
$foo = function() use (&$foo) {
    $foo();
};
$foo();
Produces the following error:
Fatal error: Maximum function nesting level of '100' reached, aborting!
This IMHO is a far better outcome than a segfault, since it kills only the current script, not the whole process.
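If legitimate code genuinely needs deeper recursion, the limit is configurable. A minimal sketch (xdebug.max_nesting_level is the real setting; the value 500 is an arbitrary choice for the example):

// In php.ini:  xdebug.max_nesting_level = 500
// Or at runtime, if your configuration allows changing it:
ini_set('xdebug.max_nesting_level', 500);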
There is this thread from the internals list a few years ago (2006). The poster's comments are:
So far nobody has proposed a solution for the endless loop problem that would satisfy these conditions:
1. No false positives (i.e. good code always works)
2. No slowdown for execution
3. Works with any stack size
Thus, this problem remains unsolved.
Now, #1 is quite literally impossible to solve due to the halting problem. #2 is trivial if you keep a counter of stack depth (since you're just checking the incremented stack level on stack push).
Finally, #3 is a much harder problem to solve. Some operating systems allocate stack space non-contiguously, and there is no portable way to determine the stack size or current usage, so it's not possible to implement this with 100% accuracy (for a specific platform it may be possible, or even easy, but not in general).
Instead, PHP should take the hint from XDebug and other languages (Python, etc.) and provide a configurable nesting limit (Python's is set to 1000 by default)....
Either that, or trap memory allocation errors on the stack to detect the impending segfault and convert it into a RecursionLimitException, so that you might be able to recover....
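As a userland illustration of both ideas — the depth counter from #2 plus a catchable RecursionLimitException — something like the following works today. Both guarded() and RecursionLimitException are invented for this sketch, not a real API, and the limit of 100 is arbitrary (finally requires PHP 5.5+):

<?php
// Sketch only: a manual depth counter, the technique #2 calls trivial.
class RecursionLimitException extends RuntimeException {}

function guarded(callable $fn, $limit = 100) {
    static $depth = 0;
    if (++$depth > $limit) {
        $depth--; // undo this frame's increment before bailing out
        throw new RecursionLimitException("Nesting level of $limit reached");
    }
    try {
        return $fn();
    } finally {
        $depth--; // unwinds correctly even while the exception propagates
    }
}

$foo = function () use (&$foo) {
    guarded($foo);
};

try {
    guarded($foo);
} catch (RecursionLimitException $e) {
    // The script survives, unlike a segfault.
    echo $e->getMessage(), "\n";
}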
I could be totally wrong about this since my testing was fairly brief, but it seems that PHP will only segfault if it runs out of memory (and presumably tries to access an invalid address). If the memory limit is set low enough, you get an out-of-memory error beforehand; otherwise the code segfaults and is handled by the OS.
I can't say whether this is a bug or not, but the script probably shouldn't be allowed to get out of control like this.
See the script below. Behavior is practically identical regardless of options. Without a memory limit, it also slows my computer down severely before it's killed.
<?php
$opts = getopt('ilrv');

$type = null;
// -i: iterative, -r: recursive
if (isset($opts['i'])) {
    $type = 'i';
} else if (isset($opts['r'])) {
    $type = 'r';
}

// -i and -r are mutually exclusive; fall through to the usage message
if (isset($opts['i']) && isset($opts['r'])) {
    $type = null;
}

// -l: impose a memory limit so PHP can die cleanly instead of segfaulting
if (isset($opts['l'])) {
    ini_set('memory_limit', '64M');
}

// -v: print memory usage on each call/iteration
define('VERBOSE', isset($opts['v']));

function print_memory_usage() {
    if (VERBOSE) {
        echo memory_get_usage() . "\n";
    }
}

switch ($type) {
    case 'r':
        // Infinite recursion: each call adds a stack frame until
        // PHP runs out of stack or memory.
        function segf() {
            print_memory_usage();
            segf();
        }
        segf();
        break;
    case 'i':
        // Infinite loop: the array grows until the memory limit
        // (or the OS) stops it; $x >= 0 never becomes false.
        $a = array();
        for ($x = 0; $x >= 0; $x++) {
            print_memory_usage();
            $a[] = $x;
        }
        break;
    default:
        die("Usage: " . __FILE__ . " <-i or -r> [-l] [-v]\n");
}
?>
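For reference, invocations look something like this (assuming the script is saved as segf.php; the filename is mine):

php segf.php -r       # recurse until the stack blows up (segfault without -l)
php segf.php -r -l    # same, but the 64M memory_limit kills it cleanly first
php segf.php -i -v    # grow an array instead, printing memory usage as it goes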
I know nothing about the PHP implementation, but it's not uncommon for a language runtime to leave pages unallocated at the "top" of the stack so that a segfault occurs if the stack overflows. Usually this is handled inside the runtime: either the stack is extended or a more graceful error is reported. But there could be implementations (and situations in other implementations) where the segfault is simply allowed to propagate.