How to read a big file in PHP without hitting the memory limit

青春惊慌失措 2021-01-14 05:04

I'm trying to read a file line by line. The problem is that the file is too big (over 500,000 lines) and I hit the memory limit. How can I read the file without exceeding it?

4 Answers
  •  鱼传尺愫
    2021-01-14 05:38

    You could use a generator to handle the memory usage. This is just an example written by a user on the documentation page:

    function getLines($file)
    {
        $f = fopen($file, 'r');
        if ($f === false) {
            throw new RuntimeException("Unable to open $file");
        }

        try {
            // fgets() returns false at EOF; compare explicitly so a line
            // containing only "0" does not end the loop early.
            while (($line = fgets($f)) !== false) {
                yield $line;
            }
        } finally {
            fclose($f);
        }
    }
    
    foreach (getLines("file.txt") as $n => $line) {
        // insert the line into db or do whatever you want with it.
    }
    

    A generator allows you to write code that uses foreach to iterate over a set of data without needing to build an array in memory, which may cause you to exceed a memory limit, or require a considerable amount of processing time to generate. Instead, you can write a generator function, which is the same as a normal function, except that instead of returning once, a generator can yield as many times as it needs to in order to provide the values to be iterated over.
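    To see the effect, here is a minimal, self-contained sketch (file size and names are made up for illustration): it writes a temporary file of 100,000 lines, streams it through the same `getLines()` generator, and reports the peak memory used, which stays far below the size of the file.

    ```php
    <?php
    // Generator from the answer above, with an explicit EOF check.
    function getLines($file)
    {
        $f = fopen($file, 'r');
        if ($f === false) {
            throw new RuntimeException("Unable to open $file");
        }
        try {
            while (($line = fgets($f)) !== false) {
                yield $line;
            }
        } finally {
            fclose($f);
        }
    }

    // Build a ~10 MB temp file: 100,000 lines of 100 bytes each.
    $tmp = tempnam(sys_get_temp_dir(), 'lines');
    $fh = fopen($tmp, 'w');
    for ($i = 0; $i < 100000; $i++) {
        fwrite($fh, str_repeat('x', 99) . "\n");
    }
    fclose($fh);

    // Stream the file: only one line is held in memory at a time.
    $count = 0;
    foreach (getLines($tmp) as $line) {
        $count++;
    }
    unlink($tmp);

    echo "lines read: $count\n";
    echo "peak memory (MB): " . round(memory_get_peak_usage(true) / 1048576, 1) . "\n";
    ```

    By contrast, `file($tmp)` would load all 100,000 lines into an array at once, which is exactly what pushes a 500,000-line file over the memory limit.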
