Question
I recently got into learning CGI and set up an Ubuntu server in VirtualBox. The first program I wrote was in Python, using Vim over SSH. Then I installed Eclipse on my Windows 7 workstation and wrote the equivalent program in Perl; just a simple hello-world deal.
I tried running it and got a 500, while the Python script in the same directory (/usr/lib/cgi-bin) showed up fine. Frustrated, I checked and triple-checked the permissions and that the file began with #!/usr/bin/perl. I also checked that AddHandler was set for .pl. Everything was fine, so on a whim I decided to write the exact same code on the server using Vim, as I had done with the Python file.
Lo and behold, it worked. I compared the two files, thinking I'd gone mad, and they are exactly the same. So what's the deal? Why is a file made in Eclipse on Windows 7 different from a file made on the Ubuntu server with Vim? Do they have different binary headers or something? This can really affect my development environment.
#!/usr/bin/perl
print "Content-type: text/html\n\n";
print "Testing.";
Apache error log:
[Tue Aug 07 12:32:02 2012] [error] [client 192.168.1.8] (2)No such file or directory: exec of '/usr/lib/cgi-bin/test.pl' failed
[Tue Aug 07 12:32:02 2012] [error] [client 192.168.1.8] Premature end of script headers: test.pl
[Tue Aug 07 12:32:02 2012] [error] [client 192.168.1.8] File does not exist: /var/www/favicon.ico
This is the error I keep getting.
Answer 1:
I think you have some spurious \r characters on the first line of your Perl script when you write it in Windows.
For example, I created the following file on Windows:
#!/usr/bin/perl
code goes here
When viewed with hexdump it shows:
00000000 23 21 2f 75 73 72 2f 62 69 6e 2f 70 65 72 6c 0d |#!/usr/bin/perl.|
00000010 0a 0d 0a 63 6f 64 65 20 67 6f 65 73 20 68 65 72 |...code goes her|
00000020 65 0d 0a |e..|
00000023
Notice the 0d bytes: those are the \r characters. If I try to run this using ./test.pl I get:
zsh: ./test.pl: bad interpreter: /usr/bin/perl^M: no such file or directory
Whereas if I write the same code in Vim on a UNIX machine I get:
00000000 23 21 2f 75 73 72 2f 62 69 6e 2f 70 65 72 6c 0a |#!/usr/bin/perl.|
00000010 0a 63 6f 64 65 20 67 6f 65 73 20 68 65 72 65 0a |.code goes here.|
00000020
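A quicker check than reading hex dumps, assuming the GNU file and cat utilities that ship with Ubuntu (test.pl is just the example filename):

file test.pl    # a Windows-saved file is reported as "... with CRLF line terminators"
cat -A test.pl  # GNU cat shows each CRLF line ending as ^M$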
You can fix this in one of several ways:
- You can probably configure your editor to save files with "UNIX line endings" or similar.
- You can run dos2unix or a similar tool on the file after saving it.
- You can use sed, e.g. sed -i -e 's/\r//g' test.pl, or similar (a Perl equivalent is sketched after this list).
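Since Perl is obviously available here, a one-liner does the same job; a minimal sketch, assuming the script is named test.pl:

perl -pi.bak -e 's/\r//g' test.pl   # strips all carriage returns in place; test.pl.bak keeps a backup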
Your Apache error log should confirm this (if it doesn't, crank up the logging a bit on your development server). In fact, the "(2)No such file or directory: exec of '/usr/lib/cgi-bin/test.pl' failed" line you posted is exactly this symptom: the kernel is looking for an interpreter literally named /usr/bin/perl\r, which doesn't exist.
Answer 2:
Sure, it can.
- One environment might have a module installed that the other might not.
- Perl might be installed in different locations in the two environments.
- The environments might have different versions of Perl.
- The environments might have different operating systems.
- The permissions might be set up incorrectly in one of the environments.
- Etc.
But instead of speculating wildly like this, why don't you check the error log to see what error you actually got?
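On Ubuntu's default Apache setup, for instance, you can watch the error log live while you reload the page (the path below is the distribution default; adjust it if your configuration differs):

tail -f /var/log/apache2/error.log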
Answer 3:
No, they are just text files. Of course, it's possible to write unportable programs, trivially by using system() or other similar services that depend on the environment.
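For instance, here is a minimal sketch of that kind of environment dependence (uptime is just an arbitrary external program chosen for illustration):

#!/usr/bin/perl
# Runs fine wherever an external "uptime" program exists on the PATH,
# but dies on a system that lacks it: same script, different environments.
system("uptime") == 0 or die "uptime failed: $?";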
Source: https://stackoverflow.com/questions/11850192/why-does-my-perl-cgi-program-return-a-server-error