I have made a function that finds all the URLs within an HTML file and repeats the same process for the HTML content linked from each discovered URL. The function is recursive.
You could try to reduce the nesting depth by implementing parallel workers (as in cluster computing) instead of piling up recursive function calls.
For example: define a limited number of slots (e.g. 100) and keep track of which workers occupy them. Whenever a slot becomes free, hand the next waiting URL to a worker in that slot.
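Here is a minimal sketch of that slot model, assuming a hypothetical find_urls_in_html() helper that extracts links from a page's markup; the "slots" are a fixed number of concurrent cURL handles, and the loop stays flat, so no nesting limit is ever reached:

function crawl($startUrl, $maxSlots = 10)
{
    $pending = array($startUrl);          // URLs waiting for a free slot
    $seen    = array($startUrl => true);  // avoid fetching the same URL twice
    $multi   = curl_multi_init();
    $active  = 0;                         // number of occupied slots

    while ($pending || $active > 0) {
        // Fill free slots from the waiting list.
        while ($pending && $active < $maxSlots) {
            $ch = curl_init(array_shift($pending));
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_multi_add_handle($multi, $ch);
            $active++;
        }

        // Let the workers run, then wait until at least one has activity.
        do {
            $status = curl_multi_exec($multi, $running);
        } while ($status === CURLM_CALL_MULTI_PERFORM);
        curl_multi_select($multi);

        // A worker finished: free its slot and queue the URLs it found.
        while ($info = curl_multi_info_read($multi)) {
            $ch   = $info['handle'];
            $html = curl_multi_getcontent($ch);
            curl_multi_remove_handle($multi, $ch);
            curl_close($ch);
            $active--;

            foreach (find_urls_in_html($html) as $url) {
                if (!isset($seen[$url])) {
                    $seen[$url] = true;
                    $pending[]  = $url;
                }
            }
        }
    }

    curl_multi_close($multi);
}

The $seen map is not part of the slot idea itself, but without it a crawler that follows every link will loop forever on pages that link back to each other.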
In your case it is almost certainly the crawler that is exceeding Xdebug's nesting limit while it traces error and debug info.
But in other cases, errors in your own PHP code or in core files such as CodeIgniter libraries can produce the same message, and then even increasing the Xdebug nesting-level setting will not make it vanish.
So, look into your code carefully :) .
Here was the issue in my case.
I had a service class set up as a library in CodeIgniter, with a function inside it like this:
class PaymentService {

    private $CI;

    public function __construct() {
        $this->CI =& get_instance();
    }

    public function process() {
        // lots of CI referencing here...
    }
}
My controller was as follows:
$this->load->library('PaymentService');
$this->process_(); // this call was my mistake; the correct version is shown below
The function call on the last line was wrong because of a typo; it should have been:
$this->Payment_service->process(); // call process() on the loaded library object
Because of that typo I kept getting the nesting-level exceeded message. I even disabled Xdebug, but nothing helped. Anyway, please check your class names and make sure your functions are being called correctly.
Another solution is to add
xdebug.max_nesting_level = 200
to your php.ini.
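If you cannot edit php.ini, the same setting can usually be raised from the crawler script itself at runtime (assuming the Xdebug extension is loaded):

// Raise Xdebug's nesting limit for this script only; pick a value that fits your crawl depth.
ini_set('xdebug.max_nesting_level', 200);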
You could convert your recursive code into iterative code that simulates the recursion. This means that you push the current state (URL, document, position in the document, etc.) onto a stack array when you reach a link, and pop it off again when that link has been fully processed.
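A rough sketch of that idea, using an explicit stack in place of the call stack; fetch_page() and extract_links() are hypothetical helpers standing in for whatever the original recursive function does:

function crawl_iteratively($startUrl)
{
    $stack   = array($startUrl);
    $visited = array();

    while ($stack) {
        $url = array_pop($stack);      // take the most recently discovered link
        if (isset($visited[$url])) {
            continue;                  // already processed
        }
        $visited[$url] = true;

        $html = fetch_page($url);      // e.g. file_get_contents($url)
        foreach (extract_links($html) as $link) {
            $stack[] = $link;          // "recurse" by pushing instead of calling
        }
    }

    return array_keys($visited);
}

Because the state lives in $stack rather than on the call stack, the crawl depth is no longer limited by xdebug.max_nesting_level.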
Rather than going for recursive function calls, work with a queue model to flatten the structure.
$queue = array('http://example.com/first/url');
while (count($queue)) {
    $url = array_shift($queue);
    $queue = array_merge($queue, find_urls($url));
}

function find_urls($url)
{
    $urls = array();
    // Some logic filling the variable
    return $urls;
}
There are different ways to handle it. You can carry extra information with each queue entry if you need insight into where a URL was found or the path traversed to reach it. There are also distributed queues that can work off a similar model.
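For instance, here is a variation on the queue above in which every entry also records its origin and depth; the depth cap and the $seen set are additions of this sketch, not part of the original answer, and find_urls() is the same placeholder as above:

$queue = array(array('url' => 'http://example.com/first/url', 'depth' => 0, 'parent' => null));
$seen  = array('http://example.com/first/url' => true);

while (count($queue)) {
    $entry = array_shift($queue);

    if ($entry['depth'] >= 5) {
        continue;                                // stop following links this deep
    }

    foreach (find_urls($entry['url']) as $found) {
        if (!isset($seen[$found])) {
            $seen[$found] = true;
            $queue[] = array(
                'url'    => $found,
                'depth'  => $entry['depth'] + 1, // origin information travels with the entry
                'parent' => $entry['url'],
            );
        }
    }
}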
Try looking in /etc/php5/conf.d/ to see if there is a file called xdebug.ini.
max_nesting_level is 100 by default.
If it is not set in that file, add:
xdebug.max_nesting_level=300
to the end of the file so it looks like this:
xdebug.remote_enable=on
xdebug.remote_handler=dbgp
xdebug.remote_host=localhost
xdebug.remote_port=9000
xdebug.profiler_enable=0
xdebug.profiler_enable_trigger=1
xdebug.profiler_output_dir=/home/drupalpro/websites/logs/profiler
xdebug.max_nesting_level=300
You can then run @Andrey's test before and after making this change to see whether it worked:
php -r 'function foo() { static $x = 1; echo "foo ", $x++, "\n"; foo(); } foo();'