Batching PHP's fgetcsv


Question


I have a fairly large CSV file (at least for the web) that I don't have control over. It has about 100k rows and will only grow larger.

I'm using the Drupal Feeds module to create nodes based on this data, and its parser batches the parsing into groups of 50 lines. However, that parser doesn't handle quotation marks properly and fails to parse about 60% of the CSV file. fgetcsv works but doesn't batch things, as far as I can tell.

While trying to read the entire file with fgetcsv, PHP eventually runs out of memory. Therefore I would like to be able to break things up into smaller chunks. Is this possible?


Answer 1:


fgetcsv() works by reading one line at a time from a given file pointer. If PHP is running out of memory, perhaps you are trying to parse the whole file at once, putting it all into a giant array. The solution would be to process it line by line without storing it in a big array.

To answer the batching question more directly: read n lines from the file, then use ftell() to find the position in the file where you stopped. Record that offset, and you can return to it later by calling fseek() before fgetcsv().
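
A minimal sketch of that approach (parseBatch() and processRow() are hypothetical names; the batch size of 50 matches the Feeds batching mentioned in the question): each call resumes at the saved offset, processes up to one batch of rows, and returns the new offset, or false at end of file.

function parseBatch($file, $offset, $batchSize = 50) {
    $f = fopen($file, 'r');
    if (!$f) {
        return false; // could not open the file
    }
    fseek($f, $offset);                  // jump back to where the last batch ended
    for ($i = 0; $i < $batchSize; $i++) {
        $row = fgetcsv($f);
        if ($row === false) {            // reached end of file
            fclose($f);
            return false;
        }
        processRow($row);                // hypothetical per-row handler
    }
    $offset = ftell($f);                 // remember where this batch ended
    fclose($f);
    return $offset;                      // feed this back in on the next call
}

// Usage: start at offset 0 and keep calling until the file is exhausted.
// $offset = 0;
// while (($offset = parseBatch('data.csv', $offset)) !== false) { /* next batch */ }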




Answer 2:


Well, create a function to parse a bunch of lines:

function parseLines(array $lines) {
    foreach ($lines as $line) {
        //insert line into new node
    }
}

Then, just batch it up:

$numberOfLinesToBatch = 50;
$f = fopen($file, 'r');
if (!$f) die('implement better error checking');

$buffer = array();
while ($row = fgetcsv($f)) {
    $buffer[] = $row;
    if (count($buffer) >= $numberOfLinesToBatch) {
        parseLines($buffer);
        $buffer = array();
    }
}
if (!empty($buffer)) {
    parseLines($buffer);
}

fclose($f);

It streams the data in, and you can tune how many rows it buffers by tweaking the variable...




Answer 3:


I suspect the problem is that you're storing too much information in memory, rather than how you're reading the CSV file off disk. (i.e., fgetcsv only reads a line at a time, so if a single line's worth of data is causing you to run out of memory, you're in trouble.)

As such, you simply need to use an approach where you:

  1. Read 'x' lines into an array.
  2. Process this information
  3. Clear any temporary variables/arrays.
  4. Repeat until feof() returns true.

Alternatively, you could execute the CSV processing via the command line version of PHP and use a custom php.ini that has a much larger memory limit.
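
For example (the script name import.php and the 512M limit are placeholder values), the PHP CLI lets you either point at a custom php.ini or override the setting for a single run:

php -c /path/to/custom/php.ini import.php
php -d memory_limit=512M import.php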



Source: https://stackoverflow.com/questions/4586653/batching-phps-fgetcsv
