I'm building a REST-like service in PHP that should accept a large JSON POST body as its main data (I send and read the data much like discussed here: http://forums.laravel.io/view
According to the PHP manual, the role of max_input_vars in php.ini is:

"How many input variables may be accepted (limit is applied to $_GET, $_POST and $_COOKIE superglobal separately). Use of this directive mitigates the possibility of denial of service attacks which use hash collisions. If there are more input variables than specified by this directive, an E_WARNING is issued, and further input variables are truncated from the request. This limit applies only to each nesting level of a multi-dimensional input array."

You just have to set max_input_vars to a greater number in your php.ini.
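For example, raising the limit is a one-line change in php.ini, followed by a restart of your web server or PHP-FPM (the value 10000 below is only an illustration):

```ini
; php.ini
; Raise the per-superglobal input variable limit (default is 1000).
max_input_vars = 10000
```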
I think most of the time it is not necessary to increase the max_input_vars size; it is better to optimize your code. I faced this problem when getting all results from one AJAX request and sending those results on to another AJAX request. What I did was stringify the array built from the DB results:

JSON.stringify(totalResults);

In JavaScript, JSON.stringify converts an array into a string, so after converting I sent that string to the second request and decoded it back into an array using json_decode in PHP:

<?php $totalResults = json_decode($_POST['totalResults']); ?>

That way I got the original array back. I hope this can help someone, so I have shared it. Thank you.
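A minimal sketch of the JavaScript side of this round trip (the data and variable names are illustrative):

```javascript
// Illustrative data standing in for the results of the first AJAX request.
const totalResults = [{ id: 1, name: "a" }, { id: 2, name: "b" }];

// Serialize the whole array into a single string, so PHP receives
// one input variable instead of one per element.
const payload = JSON.stringify(totalResults);

// payload would then be sent as a single POST field, e.g.:
// fetch(url, { method: "POST", body: new URLSearchParams({ totalResults: payload }) });

// Parsing the string recovers the original structure, just as
// json_decode() does on the PHP side.
const restored = JSON.parse(payload);
console.log(restored.length); // 2
```

Because everything travels as one string field, max_input_vars counts it as a single input variable regardless of how many elements the array holds.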
I found out that the right way to handle JSON data directly in PHP (via file_get_contents('php://input')) is to make sure the request sets the right content type, i.e. Content-Type: application/json in the HTTP request header.

In my case I'm requesting pages from PHP using curl with this code:
function curl_post($url, array $post = NULL, array $options = array()) {
    $defaults = array(
        CURLOPT_POST => 1,
        CURLOPT_HEADER => 0,
        CURLOPT_URL => $url,
        CURLOPT_FRESH_CONNECT => 1,
        CURLOPT_RETURNTRANSFER => 1,
        CURLOPT_FORBID_REUSE => 1,
        CURLOPT_TIMEOUT => 600
    );
    // The key must be the CURLOPT_POSTFIELDS constant,
    // not the string 'CURLOPT_POSTFIELDS'.
    if (!is_null($post)) {
        $defaults[CURLOPT_POSTFIELDS] = http_build_query($post);
    }
    $ch = curl_init();
    curl_setopt_array($ch, ($options + $defaults));
    if (($result = curl_exec($ch)) === false) {
        throw new Exception(curl_error($ch) . "\n $url");
    }
    if (curl_getinfo($ch, CURLINFO_HTTP_CODE) != 200) {
        throw new Exception("Curl error: " .
            curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n" . $result . "\n");
    }
    curl_close($ch);
    return $result;
}
$curl_result = curl_post(URL, NULL,
array(CURLOPT_HTTPHEADER => array('Content-Type: application/json'),
CURLOPT_POSTFIELDS => json_encode($out))
);
Do note the CURLOPT_HTTPHEADER => array('Content-Type: application/json')
part.
On the receiving side I'm using the following code:
$rawData = file_get_contents('php://input');
$postedJson = json_decode($rawData, true);
if (json_last_error() != JSON_ERROR_NONE) {
    error_log('Last JSON error: ' . json_last_error() . ' ' .
        json_last_error_msg() . PHP_EOL . PHP_EOL, 0);
}
Do not change the max_input_vars variable. Since I changed the request to set the right headers, my issue with max_input_vars went away. Apparently PHP does not parse the request body into POST variables when this Content-Type is set, so the limit is never reached.
Something is wrong; you do not need 1000 variables. Recode your program to accept one array variable with 1000 keys. This error is there to warn you that you are not doing things the recommended way. Do not disable it, extend it, or hide it in any way.
<rant>I can accept that PHP has such a limit in place; it does make sense. What I cannot accept (and is one of the many reasons that make it very difficult for me to take PHP seriously as a programming language) is that processing then just continues with the truncated data, potentially overwriting good data with incomplete data. Yes, the data should be validated additionally before persisting it. But this behavior is just begging for troubles.</rant>
That said, I implemented the following to prevent this from happening again:
$limit = (int)ini_get('max_input_vars');
if (count($_GET) >= $limit) {
throw new Exception('$_GET is likely to be truncated by max_input_vars (' . $limit . '), refusing to continue');
}
if (count($_POST) >= $limit) {
throw new Exception('$_POST is likely to be truncated by max_input_vars (' . $limit . '), refusing to continue');
}
if (count($_COOKIE) >= $limit) {
throw new Exception('$_COOKIE is likely to be truncated by max_input_vars (' . $limit . '), refusing to continue');
}
Note that truncation doesn't necessarily happen at the limit. My limit was set to the default 1000, but $_POST
still ended up having 1001 elements.
Really, changing max_input_vars using an .htaccess file does work, but you need to restart the Apache service. Follow the complete process:

1. Open httpd.conf at C:\xampp\apache\conf\httpd.conf.
2. In httpd.conf, find the AllowOverride directive for the xampp/htdocs directory.
3. If there is a # before AllowOverride, delete the # so the directive takes effect.
4. In your .htaccess file, insert php_value max_input_vars 10000.
5. Restart the Apache service.
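Assuming a stock XAMPP layout, the relevant fragments might look like this (the value 10000 is only an example, and php_value only works when PHP runs as an Apache module, not under PHP-FPM):

```apacheconf
# In C:\xampp\apache\conf\httpd.conf, inside the <Directory> block
# for htdocs, make sure .htaccess overrides are allowed:
#     AllowOverride All

# Then in xampp\htdocs\.htaccess:
php_value max_input_vars 10000
```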