Before I dive into my question, I wanted to point out that I am doing this partially to get familiar with Node and Mongo. I realize there are probably better ways to accomplish this.
This isn't an answer to your exact situation of importing from a .csv file, but rather some notes on doing bulk inserts.
-> First of all, there are no special 'bulk' insert operations; it's all a forEach in the end.
-> If you read a big file asynchronously, the reads will be a lot faster than the writes, so you should reconsider your approach. First figure out how much your setup can handle (or just use trial and error).
---> After that, change the way you read from the file: you don't need to read every line asynchronously. Learn to wait; use forEach or forEachSeries from Async.js to bring your reads down to MongoDB's write level, and you are good to go (see the sketch after this list).
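A minimal sketch of that idea, assuming a simple `name,age` CSV, the official `mongodb` Node driver (callback-style API, v3.x) and the `async` library; the file, database and collection names are placeholders:

```js
const fs = require('fs');
const async = require('async');
const { MongoClient } = require('mongodb');

MongoClient.connect('mongodb://localhost:27017', (err, client) => {
  if (err) throw err;
  const collection = client.db('test').collection('people');

  // Read the whole file once instead of firing off an async read per line.
  const lines = fs.readFileSync('data.csv', 'utf8').trim().split('\n');

  // eachSeries waits for each insert to finish before starting the next one,
  // so the reads never outrun MongoDB's write throughput.
  async.eachSeries(lines, (line, done) => {
    const [name, age] = line.split(',');
    collection.insertOne({ name, age: Number(age) }, done);
  }, (err) => {
    if (err) console.error('import failed:', err);
    client.close();
  });
});
```

If one insert at a time is too slow for you, the same pattern works with chunks of lines instead of single lines; the point is simply that each batch waits for the previous one to finish.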
I would try the command-line CSV import option from MongoDB; it should do what you are after without having to write any code.
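A minimal mongoimport invocation might look like this, assuming a CSV with a header row and a local mongod; the database, collection and file names are placeholders:

```
mongoimport --db test --collection people --type csv --headerline --file people.csv
```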