Question
Reading a CSV file line-by-line (i.e. without loading the whole file into memory) in Python is simple:
import csv

for row in csv.reader(open("file.csv")):
    print(row[0])
Doing the same in Node.js typically involves a library such as node-csv, together with streams and callbacks.
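(For reference, the conventional callback/event style with Node's built-in readline module looks roughly like this; the comma split below is a naive placeholder for a real CSV parser such as node-csv:)

const fs = require('fs');
const readline = require('readline');

// Rows arrive through 'line' events on a stream-backed interface,
// so the file is never held in memory all at once.
const rl = readline.createInterface({ input: fs.createReadStream('file.csv') });

rl.on('line', line => {
  const row = line.split(',');  // naive split; quoted fields need a proper CSV parser
  console.log(row[0]);
});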
Is it possible to use new ES6/ES7 features like iterators, generators, promises and async functions to iterate over lines of a CSV file in a way that looks more like the Python code?
Ideally I'd like to be able to write something like this:
for (const row of csvOpen('file.csv')) {
  console.log(row[0]);
}
(again, without loading the whole file into memory at once.)
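(For contrast, a synchronous sketch like the one below does give a plain for...of loop, but only because fs.readFileSync buffers the entire file up front, which is exactly what I want to avoid; the comma split is again a naive stand-in for real CSV parsing.)

const fs = require('fs');

// Reads the whole file into memory first, then iterates,
// which defeats the purpose for large files.
for (const line of fs.readFileSync('file.csv', 'utf8').split('\n')) {
  const row = line.split(',');
  console.log(row[0]);
}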
Answer 1:
I'm not familiar with node-csv, but it sounds like an iterator using a generator should do it. Just wrap that around any asynchronous callback API:
const failed = e => console.log(e.name + ": " + e.message);

// Stand-in for an asynchronous, callback-based reader (e.g. node-csv).
let dummyReader = {
  testFile: ["row1", "row2", "row3"],
  read: function (cb) {
    // Deliver the next "row" asynchronously through the callback.
    return Promise.resolve().then(() => cb(this.testFile.shift())).catch(failed);
  },
  end: function () {
    return !this.testFile.length;
  }
};

let csvOpen = url => {
  let iter = {};
  iter[Symbol.iterator] = function* () {
    // Each step of the iteration yields a promise for the next row.
    while (!dummyReader.end()) {
      yield new Promise(resolve => dummyReader.read(resolve));
    }
  };
  return iter;
};

async function test() {
  // The line you wanted:
  for (let row of csvOpen('file.csv')) {
    console.log(await row);
  }
}

test(); // row1, row2, row3
Note that row here is a promise, but close enough.
Paste it into Babel to run it.
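(As a follow-up sketch that goes beyond the ES6/ES7 features discussed here: on newer Node versions, async generators and for await...of let you drop the explicit promise handling entirely. This assumes Node's readline interface is async-iterable and again uses a naive comma split instead of a real CSV parser.)

const fs = require('fs');
const readline = require('readline');

// Async generator: yields one parsed row per line, streaming the file rather than buffering it.
async function* csvOpen(path) {
  const rl = readline.createInterface({
    input: fs.createReadStream(path),
    crlfDelay: Infinity,     // treat \r\n as a single line break
  });
  for await (const line of rl) {
    yield line.split(',');   // naive split; quoted fields need a proper CSV parser
  }
}

async function main() {
  for await (const row of csvOpen('file.csv')) {
    console.log(row[0]);
  }
}

main();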
Source: https://stackoverflow.com/questions/38313873/read-a-csv-file-line-by-line-using-node-js-and-es6-es7-features