Read a file one line at a time in node.js?

front-end · unresolved · 29 answers · 1076 views
深忆病人 asked 2020-11-22 04:33

I am trying to read a large file one line at a time. I found a question on Quora that dealt with the subject, but I'm missing some connections to make the whole thing fit together.

29 Answers
  • 2020-11-22 04:50

    For such a simple operation there shouldn't be any dependency on third-party modules. Go easy.

    var fs = require('fs'),
        readline = require('readline');
    
    var rd = readline.createInterface({
        input: fs.createReadStream('/path/to/file'),
        terminal: false
    });
    
    rd.on('line', function(line) {
        console.log(line);
    });
    
  • 2020-11-22 04:53

    Update in 2019

    An excellent example is already posted in the official Node.js documentation for the readline module.

    This requires Node.js 11.4 or newer installed on your machine.

    const fs = require('fs');
    const readline = require('readline');
    
    async function processLineByLine() {
      const fileStream = fs.createReadStream('input.txt');
    
      const rl = readline.createInterface({
        input: fileStream,
        crlfDelay: Infinity
      });
      // Note: we use the crlfDelay option to recognize all instances of CR LF
      // ('\r\n') in input.txt as a single line break.
    
      for await (const line of rl) {
        // Each line in input.txt will be successively available here as `line`.
        console.log(`Line from file: ${line}`);
      }
    }
    
    processLineByLine();
    
  • 2020-11-22 04:54

    Another solution is to run the logic via the sequential executor nsynjs. It reads the file line-by-line using the node readline module, and it doesn't use promises or recursion, so it won't fail on large files. Here is what the code looks like:

    var nsynjs = require('nsynjs');
    var textFile = require('./wrappers/nodeReadline').textFile; // this file is part of nsynjs
    
    function process(textFile) {
    
        var fh = new textFile();
        fh.open('path/to/file');
        var s;
        while (typeof(s = fh.readLine(nsynjsCtx).data) != 'undefined')
            console.log(s);
        fh.close();
    }
    
    var ctx = nsynjs.run(process,{},textFile,function () {
        console.log('done');
    });
    

    The code above is based on this example: https://github.com/amaksr/nsynjs/blob/master/examples/node-readline/index.js

  • 2020-11-22 04:55

    Generator based line reader: https://github.com/neurosnap/gen-readlines

    var fs = require('fs');
    var readlines = require('gen-readlines');
    
    fs.open('./file.txt', 'r', function(err, fd) {
      if (err) throw err;
      fs.fstat(fd, function(err, stats) {
        if (err) throw err;

        for (var line of readlines(fd, stats.size)) {
          console.log(line.toString());
        }

        // don't leak the file descriptor
        fs.close(fd, function(err) {
          if (err) throw err;
        });
      });
    });
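
The module above is essentially a generator that scans a buffer for newline offsets and yields one slice per line. A hand-rolled sketch of the same idea (names are my own, not the module's API):

```javascript
// Yield one Buffer slice per line, stripping '\n' and any preceding '\r'.
function* bufferLines(buf) {
  var start = 0;
  while (start < buf.length) {
    var end = buf.indexOf(0x0a, start);  // 0x0a = '\n'
    if (end === -1) end = buf.length;    // final line without a newline
    var lineEnd = end;
    if (lineEnd > start && buf[lineEnd - 1] === 0x0d) lineEnd--; // 0x0d = '\r'
    yield buf.slice(start, lineEnd);
    start = end + 1;
  }
}

for (var line of bufferLines(Buffer.from('one\r\ntwo\nthree'))) {
  console.log(line.toString());
}
```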
    
  • 2020-11-22 04:55

    I wrapped the whole logic of daily line processing in an npm module: line-kit https://www.npmjs.com/package/line-kit

    // example
    var count = 0
    require('line-kit')(require('fs').createReadStream('/etc/issue'),
                        (line) => { count++; },
                        () => {console.log(`seen ${count} lines`)})

  • 2020-11-22 04:56

    Since posting my original answer, I found that split is a very easy-to-use node module for reading lines from a file; it also accepts optional parameters.

    var fs = require('fs');
    var split = require('split');
    fs.createReadStream(file)
        .pipe(split())
        .on('data', function (line) {
          // each chunk is now a separate line!
        });
    

    I haven't tested it on very large files. Let us know if you do.
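
Under the hood, split (and similar modules) just buffer the trailing partial line between 'data' chunks until the next chunk completes it. A rough sketch of that mechanism (the names are my own, not the module's API):

```javascript
// Minimal line splitter: keep the incomplete last line of each chunk
// buffered until the next chunk (or end of input) finishes it.
function makeLineSplitter(onLine) {
  var buffered = '';
  return {
    write: function (chunk) {
      buffered += chunk;
      var parts = buffered.split(/\r?\n/);
      buffered = parts.pop(); // last piece may be an unfinished line
      for (var i = 0; i < parts.length; i++) onLine(parts[i]);
    },
    end: function () {
      if (buffered.length > 0) onLine(buffered); // flush final unterminated line
      buffered = '';
    }
  };
}

// Feed it arbitrary chunk boundaries; lines come out whole.
var lines = [];
var splitter = makeLineSplitter(function (line) { lines.push(line); });
splitter.write('foo\nba');
splitter.write('r\nbaz');
splitter.end();
console.log(lines);
```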
