问题
I'm using the SFTP functions of PHPSecLib to download files from an FTP server.
The line
$sftp->get($fname);
works if the file is up to 200MB, but if it's 300MB, the browser responds with "Firefox can't find the file at [download.php]". That is, it says it can't find the php file I use for downloading the remote file.
At first I thought this was due to the memory_limit
setting in php.ini, but it doesn't matter if it's set to 128M or 350M; 200MB files still work, and 300MB files fail. And it fails after ten seconds, so max_execution_time
or max_input_time
don't seem to be the culprits. What could be wrong?
回答1:
First, I strongly recommend that you put in your code set_time_limit(0);
at the very very top of your php file (even before any includes) since you're dealing with operations you don't know how much time they will take.
I'd say this is a case of the webserver/browser timing out on not sending/receiving any data over a "long" period of time. To fix this we have to alter a bit the SFTP.php file, namely the Net_SFTP class, then go to the get method (line 1482 if you have phpseclib 0.3.1) and add some stuff inside the only "while" control structure you have there (I'll paste the entire function below), to add the following code:
if (strtolower(PHP_SAPI) != 'cli') { // run this if request is handled by a webserver (like your case)
$my_iter++;
if($my_iter > 1024){
$my_iter = 0; // reset the counter
echo "transferring ... " . date("G:i:s") . "<br />"; // send something to the buffer
}
// flush the buffers and prevent the timeout by actually outputting something to the browser
ob_end_flush();
ob_flush();
flush();
ob_start();
usleep(100); // just in case, try removing this delay
}
Which basically outputs something from time to time (1024 iterations of this while) and flush the buffers to actually output something to the browser. Feel free to adjust the values. This is code (SFTP class) not meant to be ran from a webserver because of these problems. I mean, you CAN, but you'll run into some issues, like this one.
Also, if you try to send() you'll have to do a similar modification on the corresponding method, but hopefully this will cover your problem (at least this fixed my timeouts problems here on my local dev box).
Now, the full method modification goes below, as promised ;-)
function get($remote_file, $local_file = false)
{
if (!($this->bitmap & NET_SSH2_MASK_LOGIN)) {
return false;
}
$remote_file = $this->_realpath($remote_file);
if ($remote_file === false) {
return false;
}
$packet = pack('Na*N2', strlen($remote_file), $remote_file, NET_SFTP_OPEN_READ, 0);
if (!$this->_send_sftp_packet(NET_SFTP_OPEN, $packet)) {
return false;
}
$response = $this->_get_sftp_packet();
switch ($this->packet_type) {
case NET_SFTP_HANDLE:
$handle = substr($response, 4);
break;
case NET_SFTP_STATUS: // presumably SSH_FX_NO_SUCH_FILE or SSH_FX_PERMISSION_DENIED
$this->_logError($response);
return false;
default:
user_error('Expected SSH_FXP_HANDLE or SSH_FXP_STATUS', E_USER_NOTICE);
return false;
}
if ($local_file !== false) {
$fp = fopen($local_file, 'wb');
if (!$fp) {
return false;
}
} else {
$content = '';
}
$read = 0;
while (true) {
if (strtolower(PHP_SAPI) != 'cli') { // run this if request is handled by a webserver (like your case)
$my_iter++;
if($my_iter > 1024){
$my_iter = 0; // reset the counter
echo "transferring ... " . date("G:i:s") . "<br />"; // send something to the buffer
}
// flush the buffers and prevent the timeout by actually outputting something to the browser
ob_end_flush();
ob_flush();
flush();
ob_start();
usleep(100); // just in case, try removing this delay
}
$packet = pack('Na*N3', strlen($handle), $handle, 0, $read, 1 << 20);
if (!$this->_send_sftp_packet(NET_SFTP_READ, $packet)) {
if ($local_file !== false) {
fclose($fp);
}
return false;
}
$response = $this->_get_sftp_packet();
switch ($this->packet_type) {
case NET_SFTP_DATA:
$temp = substr($response, 4);
$read+= strlen($temp);
if ($local_file === false) {
$content.= $temp;
} else {
fputs($fp, $temp);
}
break;
case NET_SFTP_STATUS:
$this->_logError($response);
break 2;
default:
user_error('Expected SSH_FXP_DATA or SSH_FXP_STATUS', E_USER_NOTICE);
if ($local_file !== false) {
fclose($fp);
}
return false;
}
}
if ($local_file !== false) {
fclose($fp);
}
if (!$this->_send_sftp_packet(NET_SFTP_CLOSE, pack('Na*', strlen($handle), $handle))) {
return false;
}
$response = $this->_get_sftp_packet();
if ($this->packet_type != NET_SFTP_STATUS) {
user_error('Expected SSH_FXP_STATUS', E_USER_NOTICE);
return false;
}
$this->_logError($response);
// check the status from the NET_SFTP_STATUS case in the above switch after the file has been closed
if ($status != NET_SFTP_STATUS_OK) {
return false;
}
if (isset($content)) {
return $content;
}
return true;
}
回答2:
Something else you could do...
<?php
include('Net/SFTP.php');
$sftp = new Net_SFTP('www.domain.tld');
$sftp->login('username', 'password');
$start = 0;
while (true) {
$response = $sftp->get('1mb', false, $start, 1024);
$start+= 1024;
if (empty($response)) {
break;
}
echo $response;
}
ie. download the file in multiple chunks.
来源:https://stackoverflow.com/questions/15570922/server-fails-when-downloading-large-files-with-php