PHP - how to read big remote files efficiently and use a buffer in a loop

礼貌的吻别 2021-01-21 04:53

I would like to understand how to use a buffer when reading a file.

Assume we have a big file with a list of emails, one per line (the delimiter is a classic \n).

3 Answers
  • 2021-01-21 05:22

    As already suggested in my close votes on your question (hence CW):

    You can use SplFileObject, which implements Iterator, to iterate over a file line by line and save memory. See my answers to

    • Least memory intensive way to read a file in PHP and
    • How to save memory when reading a file in Php?

    for examples.
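
    A minimal sketch of that approach (the filename and the flags are illustrative, not from the linked answers):

    $file = new SplFileObject('emails.txt', 'r');
    // drop the trailing \n and skip blank lines while iterating
    $file->setFlags(SplFileObject::DROP_NEW_LINE | SplFileObject::SKIP_EMPTY);
    foreach ($file as $email) {
        // only the current line is held in memory
        // ... do your stuff with $email ...
    }

    Because SplFileObject is an Iterator, the foreach pulls one line at a time instead of loading the whole file.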

  • 2021-01-21 05:33

    Open the file with fopen() and read it incrementally. Probably one line at a time with fgets().

    file_get_contents reads the whole file into memory, which is undesirable if the file is larger than a few megabytes.

    Depending on how long this takes, you may need to worry about the PHP execution time limit, or the browser timing out if it doesn't receive any output for 2 minutes.

    Things you might try:

    1. set_time_limit(0) to avoid running up against the PHP time limit
    2. Make sure to output some data every 30 seconds or so, so that the browser doesn't time out; make sure to flush(), and possibly ob_flush(), so your output is actually sent over the network (this is a kludge; see the sketch after this list)
    3. Start a separate process (e.g. via exec()) to run this in the background. Honestly, anything that takes more than a second or two is best run in the background.
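
    A minimal sketch combining points 1 and 2 (the filename and the 30-second interval are placeholders):

    set_time_limit(0);                 // 1. disable the PHP time limit
    $fp = fopen('emails.txt', 'r');
    $lastFlush = time();
    while (($line = fgets($fp)) !== false) {
        // ... do your stuff with $line ...
        if (time() - $lastFlush >= 30) {
            echo ' ';                  // 2. keep-alive output for the browser
            if (ob_get_level() > 0) {
                ob_flush();            // flush PHP's own output buffer first
            }
            flush();                   // push the output over the network
            $lastFlush = time();
        }
    }
    fclose($fp);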
  • 2021-01-21 05:36

    Don't use file_get_contents for large files. It pulls the entire file into memory all at once. You have to read it in pieces instead, e.g. one line at a time:

    $fp = fopen('file.txt', 'r');
    // fgets() returns one line per call, or false at end of file
    while (($buffer = fgets($fp)) !== false) {
        // ... do your stuff with $buffer ...
    }
    fclose($fp);
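
    If your buffer should be a fixed number of bytes rather than one line, a chunked variant with fread() works the same way (the 8192-byte size is just an example). Note that fread() ignores line boundaries, so an email address may be split across two chunks; fgets() is simpler when the data is line-delimited.

    $fp = fopen('file.txt', 'r');
    while (!feof($fp)) {
        // read up to 8 KiB into the buffer per iteration
        $chunk = fread($fp, 8192);
        // ... do your stuff with $chunk ...
    }
    fclose($fp);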
    