Parsing a CSV file using NodeJS

执笔经年 2020-11-27 12:15

With Node.js I want to parse a .csv file of 10000 records and do some operation on each row. I tried using http://www.adaltas.com/projects/node-csv, but I couldn't get it to pause at each row.

16 Answers
  • 2020-11-27 12:46

    Try the line-by-line npm package:

    npm install line-by-line --save
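
    A minimal sketch of how line-by-line could be used here, assuming a placeholder ./data.csv path; the reader exposes pause() and resume(), which covers the "pause at each row" part of the question:

    const LineByLineReader = require('line-by-line');

    // './data.csv' is a placeholder path for this example
    const lr = new LineByLineReader('./data.csv');

    lr.on('line', function (line) {
        lr.pause();                      // stop emitting lines while this row is handled
        const fields = line.split(',');  // naive split; fine when fields contain no quoted commas
        console.log(fields);
        lr.resume();                     // ask for the next line
    });

    lr.on('end', function () {
        console.log('All lines read');
    });

    lr.on('error', function (err) {
        console.error(err);
    });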
    
  • 2020-11-27 12:47

    It seems you need a stream-based solution. Such libraries already exist, so before reinventing the wheel, try fast-csv, which also includes validation support: https://www.npmjs.org/package/fast-csv
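
    As a rough sketch, assuming the current fast-csv parseFile API; the data.csv path and the email check are placeholders for illustration:

    const csv = require('fast-csv');

    csv.parseFile('data.csv', { headers: true })          // 'data.csv' is a placeholder path
        .validate(function (row) {
            return row.email !== '';                       // example rule: reject rows with an empty email field
        })
        .on('data', function (row) {
            console.log('valid row:', row);
        })
        .on('data-invalid', function (row, rowNumber) {
            console.log('invalid row #' + rowNumber, row);
        })
        .on('error', function (err) {
            console.error(err);
        })
        .on('end', function (rowCount) {
            console.log('parsed ' + rowCount + ' rows');
        });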

  • 2020-11-27 12:47

    This is my solution for fetching a CSV file from an external URL and parsing it:

    const parse = require('csv-parse/lib/sync');
    const axios = require('axios');

    const readCSV = (module.exports.readCSV = async (path) => {
        try {
            // In Node, request the body as plain text so csv-parse can consume it directly
            const res = await axios({ url: path, method: 'GET', responseType: 'text' });
            const records = parse(res.data, {
                columns: true,          // use the first row as column names
                skip_empty_lines: true
            });
            return records;
        } catch (e) {
            console.error(e);
        }
    });

    readCSV('https://urltofilecsv');
    
  • 2020-11-27 12:49

    In order to pause the streaming in fast-csv you can do the following:

    const csv = require('fast-csv');

    let csvstream = csv.fromPath(filePath, { headers: true })
        .on("data", function (row) {
            csvstream.pause();
            // do some heavy work
            // when done resume the stream
            csvstream.resume();
        })
        .on("end", function () {
            console.log("We are done!");
        })
        .on("error", function (error) {
            console.log(error);
        });
    