I am trying to implement a module in Node.js (I just started working with Node.js) that has the requirement below.
I had a similar requirement to process a CSV file and I tried to implement your solution: it works, but only as long as I used it with console.log. When I tried to store the 'record' variable in an array called 'results', I just got an empty array [], and only after this empty array was printed did the console.log output show the parsed CSV data.
So it seems to be a matter of synchronization: processing the CSV file takes a while, and the rest of the code does not wait for it. So I compacted your code, turned it into a Promise and awaited it; after the promise resolved, my array was ready to be used.
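To illustrate the timing issue, here is a minimal sketch (reusing the fs / csv-parse setup from the full code further down); the array is read before the stream has emitted any 'data' events, so it is still empty:

// Sketch of the problem, not the final code
const rows: string[][] = [];
fs.createReadStream(filePath)
.pipe(csv({ from_line: 2, trim: true }))
.on('data', row => rows.push(row)); // runs later, once per 'data' event
console.log(rows); // [] - the parser has not produced anything yet

Wrapping the parsing in a Promise and awaiting it (as in the full code below) avoids this. For reference, this is the sample CSV file I parsed: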
title, type, value, category
Loan, income, 1500, Others
Website Hosting, outcome, 50, Others
Ice cream, outcome, 3, Food
Note: There are some differences from your case: I'm receiving one single file from the route '/import'. I'm using the Insomnia Designer app to send a multipart form body with one file named importFile.
Note: I imported the same libraries that you used, and I also used the concept of middlewares.
Note: In this case I was expecting just one file, so I used multer({ dest: './upload' }).single('importFile'). You could also use .any() to accept more than one file (see the sketch after these notes).
Note: I'm using TypeScript, so for plain JS it is just a matter of removing the type annotations after the variable declarations; for instance:
const results: object[] = [];
becomes:
const results = [];
Note: I left option 1 - working only with arrays - and option 2 - using objects.
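For completeness, here is a minimal sketch of the .any() variant mentioned above (the '/import-many' route name is just an illustration, and app is assumed to be your Express application as created in the code below); with .any(), multer puts the uploads on request.files as an array instead of request.file:

// Sketch only: accepting any number of files with .any()
const upload = multer({ dest: './upload' });
app.post(
'/import-many', // hypothetical route name
upload.any(), // files land on request.files, not request.file
async (request: Request, response: Response) => {
const paths = (request.files as Express.Multer.File[]).map(file => file.path);
return response.json({ received: paths.length, paths });
},
);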
Let's go to the code:
import express, { Request, Response } from 'express';
import csv from 'csv-parse'; // csv-parse v4 default export; in v5+ it is: import { parse } from 'csv-parse'
import multer from 'multer';
import fs from 'fs';
// used in option 2, needed because of TypeScript
interface CSVTransactionDTO {
title: string;
value: number;
type: 'income' | 'outcome';
category: string;
}
const app = express();

app.post(
'/import', // route name
multer({ dest: './upload' }).single('importFile'), // middleware that receives and stores one file (csv)
async (request: Request, response: Response) => { // last middleware: the CSV parsing itself, as an arrow function
const filePath = request.file.path;
let rowCounter = 0;
const results: string[][] = []; // option 1 - each entry is one parsed row
const newTransactions: CSVTransactionDTO[] = [];// option 2
function parseCSVPromise(): Promise<void> {
return new Promise((resolve, reject) => {
const ConfigCSV = {
// delimiter:';',//other delimiters different from default = ','
from_line: 2, // data starts here
trim: true, // ignore white spaces immediately around the delimiter (comma)
};
fs.createReadStream(filePath)
.pipe(csv(ConfigCSV))
.on('data', /* async */ row => {
rowCounter += 1;// counter of how many rows were processed
// console.log(row); // just a test
results.push(row); // Option 1 - the simplest way: push the complete row
const [title, type, value, category] = row; // Option 2 - process it as an object
newTransactions.push({ title, type, value, category }); // Option 2 - note: value is still a string here; convert it (e.g. Number(value)) if you need a number
})
.on('error', error => {
reject(error); // reject the promise; throwing here would not be caught by the route handler
})
.on('end', () => {
resolve();// ends the promise when CSV Parse send 'end' flag
});
});
}
await parseCSVPromise(); // now using the created promise - wait until the CSV parsing finishes
console.log('option1', results);// option1
console.log('option2', newTransactions); // option2
return response.json({ rowCounter, results }); // for testing only - interrupting the route execution here
// continue processing the results and send them to the database...
// await fs.promises.unlink(filePath); // optionally, delete the parsed/processed file
},
);
Option 1 response:
[
[ 'Loan', 'income', '1500', 'Others' ],
[ 'Website Hosting', 'outcome', '50', 'Others' ],
[ 'Ice cream', 'outcome', '3', 'Food' ]
]
Option 2 response:
[
{ title: 'Loan', type: 'income', value: '1500', category: 'Others' },
{ title: 'Website Hosting', type: 'outcome', value: '50', category: 'Others' },
{ title: 'Ice cream', type: 'outcome', value: '3', category: 'Food' }
]
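One caveat visible in the output above: csv-parse hands every field over as a string, so value is '1500' rather than 1500 even though the DTO declares it as a number. Before sending the transactions to the database you probably want a small conversion step; here is a minimal sketch (the toTransaction helper is just an illustration, not part of the original code):

// Sketch: turn a raw parsed row into a properly typed transaction
function toTransaction(row: string[]): CSVTransactionDTO {
const [title, type, value, category] = row;
return {
title,
category,
type: type as 'income' | 'outcome', // assumes the CSV only ever contains these two values
value: Number(value), // '1500' -> 1500
};
}

const typedTransactions = results.map(toTransaction); // using the option 1 array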