I'm trying to read a file line by line. The problem is that the file is too big (over 500,000 lines) and I hit the memory limit. How can I read the file without running out of memory?
You could use a generator to keep the memory usage down, since it reads one line at a time instead of loading the whole file. This is adapted from an example written by a user on the PHP documentation page:
function getLines($file)
{
    $f = fopen($file, 'r');
    if ($f === false) {
        throw new RuntimeException("Could not open $file");
    }
    try {
        // fgets() returns false at EOF; compare explicitly so a line
        // containing only "0" doesn't end the loop early.
        while (($line = fgets($f)) !== false) {
            yield $line;
        }
    } finally {
        fclose($f);
    }
}
foreach (getLines("file.txt") as $n => $line) {
    // $n is the zero-based line index; insert the line into the DB
    // (see the sketch below) or do whatever you want with it.
}
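If the goal is inserting each line into a database, reusing a single prepared statement inside the loop avoids re-parsing the SQL for every row. This is only a minimal sketch using PDO; the DSN, credentials, and the lines table with its line_text column are placeholders you would replace with your own:

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO lines (line_text) VALUES (?)');

$pdo->beginTransaction();
foreach (getLines("file.txt") as $line) {
    // strip the trailing newline that fgets() keeps on each line
    $stmt->execute([rtrim($line, "\r\n")]);
}
$pdo->commit();

For half a million rows you may want to commit in batches of a few thousand rather than holding everything in one large transaction.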
A generator allows you to write code that uses foreach to iterate over a set of data without needing to build an array in memory, which may cause you to exceed a memory limit, or require a considerable amount of processing time to generate. Instead, you can write a generator function, which is the same as a normal function, except that instead of returning once, a generator can yield as many times as it needs to in order to provide the values to be iterated over.
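To make that contrast concrete, here is a version of the xrange() example from the PHP manual: range(1, 1000000) would allocate an array of a million integers before the loop even starts, while the generator hands out one value at a time:

function xrange($start, $limit, $step = 1)
{
    for ($i = $start; $i <= $limit; $i += $step) {
        yield $i;
    }
}

// Uses almost no memory, unlike range(1, 1000000),
// which would build the full array up front.
foreach (xrange(1, 1000000) as $number) {
    // only one value exists in memory at a time
}

Reading a file with fgets() inside a generator works the same way: only the current line is held in memory, no matter how large the file is.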