I've been doing some profiling on different methods of accessing large(ish) arrays of data in PHP. The use case is pretty simple: some of our tools output data into PHP files
For one of my projects, where a database was not an option, I faced the same problem of loading big PHP files containing arrays into memory (by big I mean a series of 3 MB files), and I was looking for ways to maximize performance. I found a very easy one: caching these files on disk as JSON on first use. That cut load time to a third and peak memory consumption by about 30%. Loading a local JSON file with json_decode() is much, much faster than including a big PHP file containing an array, and it has the added advantage of being a format that most languages can work with directly. Hope that helps.
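As a rough sketch of that approach (the file names are just placeholders, and it assumes the generated PHP file ends with a return of the array), the first request includes the PHP file and writes the JSON cache, and every later request decodes the cached copy instead:

$phpFile  = __DIR__ . '/data.php';   // placeholder: generated file that returns a large array
$jsonFile = __DIR__ . '/data.json';  // placeholder: JSON cache written on first use

if (is_file($jsonFile)) {
    // Later runs: decoding the cached JSON is faster and peaks lower in memory.
    $data = json_decode(file_get_contents($jsonFile), true);
} else {
    // First run: include the original PHP array and cache it as JSON on disk.
    $data = include $phpFile;
    file_put_contents($jsonFile, json_encode($data));
}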
Not sure if this is exactly what you're looking for, but it should help with the speed and memory issues. You can use the SPL fixed-size array (SplFixedArray):
$startMemory = memory_get_usage();

// SplFixedArray has a fixed size and integer keys only, which makes it
// much cheaper per element than a regular PHP array.
$array = new SplFixedArray(100000);
for ($i = 0; $i < 100000; ++$i) {
    $array[$i] = $i;
}

echo memory_get_usage() - $startMemory, ' bytes';
Read more on how big PHP arrays really are here: http://nikic.github.com/2011/12/12/How-big-are-PHP-arrays-really-Hint-BIG.html
Also, have you thought about storing the data in a cache or in memory? For example, you could use SQLite with an in-memory database on the first execution and then access the data from there:
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// .. Use PDO as normal
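As a rough continuation of that idea (the table and column names are made up), the first execution could load the big array into an in-memory table, and the rest of the script could pull out only what it needs via SQL. Keep in mind that a :memory: database lives only as long as the PDO connection, so it has to be rebuilt per process:

// Hypothetical schema for a flat id => value array.
$pdo->exec('CREATE TABLE items (id INTEGER PRIMARY KEY, value TEXT)');

$insert = $pdo->prepare('INSERT INTO items (id, value) VALUES (?, ?)');
foreach ($bigArray as $id => $value) {   // $bigArray: the data from your generated PHP file
    $insert->execute([$id, $value]);
}

// Query individual rows instead of keeping the whole array in PHP memory.
$stmt = $pdo->prepare('SELECT value FROM items WHERE id = ?');
$stmt->execute([42]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);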