In PHP, which is quicker: using include('somefile.php')
or querying a MySQL database with a simple SELECT
query to get the same information?
If this is something you're going to be fetching on a regular basis it might be worthwhile to prefetch the data (from disk or the database, doesn't matter) and have your script pull it from a RAM cache like memcached.
I recently had this issue. I had some data in MySQL that I was querying on every page request. For my data set, it was faster to write a fixed-record-length file than to use MySQL.
There were a few different factors that made a file faster than MySQL in my case.
Bottom line was that I benchmarked both approaches and compared the results. For my workload, the file system was faster. I suspect that will change if my data set ever grows, so I'm keeping an eye on performance and I'm ready to change how it works in the future.
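The fixed-record-length trick above can be sketched as follows. Because every record is the same size, record $i starts at byte offset $i * $recordLength, so you can fseek() straight to it without scanning or parsing the file. The file name, record size, and contents here are illustrative assumptions:

```php
<?php
// Assumed layout: every record is exactly 32 bytes, padded with spaces.
$recordLength = 32;
$path = sys_get_temp_dir() . '/records.dat';

// Build a demo file of fixed-length records.
$names = ['alice', 'bob', 'carol'];
$fh = fopen($path, 'wb');
foreach ($names as $name) {
    fwrite($fh, str_pad($name, $recordLength));
}
fclose($fh);

// Fetch record #1 ("bob") by seeking directly to its byte offset.
$fh = fopen($path, 'rb');
fseek($fh, 1 * $recordLength);
$record = rtrim(fread($fh, $recordLength));
fclose($fh);

echo $record, "\n"; // prints "bob"
```

The seek-by-offset read is O(1) regardless of file size, which is a big part of why a flat file can beat a database round-trip for simple lookups.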
Including a file should almost always be quicker. If your database is on another machine (e.g. in shared hosting) or in a multi-server setup the lookup will have to make an extra hop.
However, in practice the difference is probably not going to matter. If the list is dynamic then storing it in MySQL will make your life easier. Static lists (e.g. countries or states) can be stored in a PHP include. If the list is quite short (a few hundred entries) and often used, you could load it straight into JavaScript and do away with AJAX.
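A static list in a PHP include is usually just a file that returns an array. For a self-contained demo the include file is created on the fly here; in practice you'd commit something like countries.php to your project (the file name and entries are assumptions):

```php
<?php
// countries.php would normally just contain:
//   <?php return ['DE' => 'Germany', 'FR' => 'France', 'US' => 'United States'];
// Here we write it out first so the example runs standalone.
$path = sys_get_temp_dir() . '/countries.php';
file_put_contents(
    $path,
    "<?php return ['DE' => 'Germany', 'FR' => 'France', 'US' => 'United States'];"
);

// include returns the array; with OPcache enabled the compiled file
// is served from memory on repeat requests.
$countries = include $path;

echo $countries['FR'], "\n"; // prints "France"
```

If the list is short enough, the same array can be json_encode()'d straight into the page for JavaScript to use, avoiding the AJAX round-trip entirely.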
If you are going the MySQL route and are worried about speed then use caching.
$query = $_GET['query'];
$key = 'query' . $query;
// Serve from the APC cache when possible; fall back to MySQL on a miss.
if (!$results = apc_fetch($key))
{
    $statement = $db->prepare("SELECT name FROM list WHERE name LIKE :query");
    $statement->bindValue(':query', "$query%");
    $statement->execute();
    $results = $statement->fetchAll();
    apc_store($key, $results, 300); // cache for 5 minutes
}
echo json_encode($results);
Reading in raw data to a script from a file will generally be faster than from a database.
However it sounds like you are wanting to query that data in order to find a match to return to the javascript. You may find in that case that MySQL will be faster for the actual querying/searching of the data (especially if correctly indexed etc.) as this is something a database is good at.
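For the prefix search in the caching example above, "correctly indexed" means an ordinary B-tree index on the searched column; MySQL can use it for LIKE patterns that don't start with a wildcard. The table and column names here are assumptions carried over from that example:

```sql
-- Lets MySQL satisfy prefix searches without a full table scan.
ALTER TABLE list ADD INDEX idx_name (name);

-- Can use the index (prefix match):
SELECT name FROM list WHERE name LIKE 'ab%';

-- Cannot use the index (leading wildcard forces a scan):
SELECT name FROM list WHERE name LIKE '%ab%';
```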
Reading in a big file is also less scalable as you will be using lots of server memory while the script executes.
It depends. If your file is stored locally on your server and the database is installed on another machine, then including the file will be faster.
Buuuuut, because it depends on your system, that might not hold. I suggest writing a PHP test script and running it 100 times from the command line, then repeating the test over HTTP (using cURL).
Example:
use_include.php
<?php
$start = microtime(true);
include('somefile.php');
echo microtime(true) - $start;
?>
use_myphp.php
<?php
$start = microtime(true);
// put your MySQL statements to retrieve the data here
echo microtime(true) - $start;
?>
I don't know exactly, but in my opinion MySQL, even if it can be slower, should be used if the content is dynamic. For large static content, though, I'm pretty sure include is faster.