I have a Perl CGI application that sometimes times out, causing it to be killed by Apache and a 504 Gateway Time-out error to be sent to the browser. I am trying to profile this application using NYTProf, but I cannot read the profile data:
$ nytprofhtml -f www/cgi-local/nytprof.out
Reading www/cgi-local/nytprof.out
Profile data incomplete, inflate error -5 ((null)) at end of input file, perhaps the process didn't exit cleanly or the file has been truncated (refer to TROUBLESHOOTING in the documentation)
I am using the sigexit=1 NYTProf option. Here's a minimal CGI script that reproduces the problem:
#!/usr/bin/perl -d:NYTProf
sleep 1 while 1;
Setting sigexit=1 tells NYTProf to catch the following signals:
INT HUP PIPE BUS SEGV
However, when your CGI script times out, Apache sends SIGTERM. You need to catch SIGTERM:
sigexit=term
To catch SIGTERM in addition to the default signals, use:
sigexit=int,hup,pipe,bus,segv,term
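If you would rather not rely on sigexit alone, a minimal sketch of the same idea is to install your own SIGTERM handler and call NYTProf's run-time control function DB::finish_profile() so nytprof.out is written out completely before the process dies (the infinite loop below just stands in for the real CGI work):

#!/usr/bin/perl -d:NYTProf
use strict;
use warnings;

# Complete and close the profile when Apache delivers SIGTERM,
# then exit, so nytprofhtml can read nytprof.out.
$SIG{TERM} = sub {
    DB::finish_profile();
    exit 1;
};

sleep 1 while 1;   # stand-in for the long-running CGI work

Note that NYTProf options such as sigexit are read from the NYTPROF environment variable (e.g. NYTPROF=sigexit=int,hup,pipe,bus,segv,term), which for a CGI script is usually set in the Apache configuration.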
CGI.pm has a debug mode, which you can use to run your program from the command line, and pass your CGI parameters as key/value pairs.
It has another feature that you can use to save your params to a file, and then read that file back in later.
What I've done is add code to save the params to a file, then run my program via a browser. This also lets me verify that the browser is sending the correct data.
Then I change the code to read the params from the file, and run it as often as I need until I have everything else debugged.
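A minimal sketch of that save/restore pattern, using CGI.pm's save() method and its filehandle constructor (the file name /tmp/params.save and the parameter name are just illustrations):

use strict;
use warnings;
use CGI;

my $q = CGI->new;

# First run (via the browser): capture the real request params to a file.
open my $out, '>', '/tmp/params.save' or die "save: $!";
$q->save($out);
close $out;

# Later runs (from the command line): rebuild the query from that file
# and debug without needing the browser at all.
open my $in, '<', '/tmp/params.save' or die "restore: $!";
my $saved = CGI->new($in);
close $in;

print $saved->param('some_field');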
Once you've got the program running to your satisfaction from the command line, you can run it under NYTProf to figure out what is taking all the time.
Source: https://stackoverflow.com/questions/20739231/profiling-a-perl-cgi-script-that-times-out