Importing a very large record set into MongoDB using nodejs

旧巷少年郎 2021-01-02 12:21

Before I dive into my question, I wanted to point out that I am doing this partly to get familiar with Node and Mongo. I realize there are probably better ways to accomplish this.

2 answers
  • 2021-01-02 12:49

    Not an answer to your exact situation of importing from a .csv file, but some notes on doing bulk inserts:

    -> First of all, there are no special 'bulk' insert operations; it is all a forEach in the end.

    -> If you read a big file asynchronously, the reads will be a lot faster than the writes, so you should consider changing your approach: first figure out how much your setup can handle (or just find out by trial and error).

    ---> After that, change the way you read the file: you don't need to read every line asynchronously. Learn to wait; use forEach / forEachSeries from Async.js to bring your read rate down to MongoDB's write level, and you are good to go (see the sketch below).
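
    A minimal sketch of that pacing idea, using Node's built-in readline and the official mongodb driver rather than Async.js: each insertMany is awaited before more lines are read, which gives the same serialization forEachSeries would. The file name, connection string, database, collection, and batch size are all made-up placeholders:

        const fs = require('fs');
        const readline = require('readline');
        const { MongoClient } = require('mongodb');

        async function importCsv(path) {
          // Placeholder connection string and names -- adjust for your setup.
          const client = await MongoClient.connect('mongodb://localhost:27017');
          const col = client.db('test').collection('records');
          const rl = readline.createInterface({ input: fs.createReadStream(path) });

          let header = null;
          let batch = [];
          for await (const line of rl) {      // the stream pauses while we await
            const fields = line.split(',');   // naive split; no quoted fields
            if (!header) { header = fields; continue; }
            batch.push(Object.fromEntries(header.map((h, i) => [h, fields[i]])));
            if (batch.length >= 1000) {       // flush fixed-size batches so the
              await col.insertMany(batch);    //   reads never outrun the writes
              batch = [];
            }
          }
          if (batch.length) await col.insertMany(batch);
          await client.close();
        }

        importCsv('records.csv').catch(console.error);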

  • 2021-01-02 13:07

    I would try the command-line CSV import tool from MongoDB; it should do what you are after without you having to write any code. A sample invocation follows.
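
    That tool is mongoimport. A typical invocation, assuming a local mongod, a CSV file whose first line holds the field names, and made-up database/collection names, might look like:

        mongoimport --db test --collection records --type csv --headerline --file records.csv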
