Question
I have to migrate 5 million records from PostgreSQL to MongoDB.
I tried using mongify for this, but since it runs on Ruby and I am not at all acquainted with Ruby, I couldn't resolve the errors it raised.
So I tried writing code myself in Node.js that would first convert the PostgreSQL data into JSON and then insert that JSON into MongoDB. But this failed: it consumed a lot of RAM, and no more than 13,000 records could be migrated.
Then I thought of writing the code in Java because of its garbage collector. It works fine in terms of RAM utilization, but the speed is very slow (around 10,000 records/hour). At this rate it would take me days to migrate my data.
So, is there a more efficient and faster way of doing this? Would a Python program be faster than the Java program? Or is there another ready-made tool for the job?
My system configuration: Windows 7 (64-bit), 4 GB RAM, Intel i3 processor.
Answer 1:
Seems like I am late to the party, but this might come in handy to somebody, someday!
The following Python-based migration framework should come in handy:
https://github.com/datawrangl3r/pg2mongo
As for your performance concern: the framework migrates each JSON object dynamically, streaming the data rather than loading it all at once, so there shouldn't be any memory issues when you use it.
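If you'd rather not adopt a framework, the general streaming idea that such tools rely on, namely reading rows out of PostgreSQL with a server-side cursor and writing them to MongoDB in batches, fits in a few lines of Python with psycopg2 and pymongo. Here is a minimal sketch, assuming a hypothetical `records` table with local default connections; the table name, column list, and connection strings are placeholders you'd adjust to your own schema:

```python
import psycopg2
from pymongo import MongoClient

BATCH_SIZE = 1000
COLUMNS = ("id", "name", "created_at")  # hypothetical schema, adjust to yours

pg = psycopg2.connect("dbname=source_db user=postgres host=localhost")
mongo = MongoClient("mongodb://localhost:27017")
target = mongo["target_db"]["records"]

# A named (server-side) cursor streams rows in chunks instead of
# pulling all 5 million into RAM at once.
cur = pg.cursor(name="migration_cursor")
cur.itersize = BATCH_SIZE
cur.execute("SELECT id, name, created_at FROM records")

batch = []
for row in cur:
    batch.append(dict(zip(COLUMNS, row)))
    if len(batch) >= BATCH_SIZE:
        target.insert_many(batch)  # one round trip per 1,000 documents
        batch = []
if batch:
    target.insert_many(batch)  # flush the final partial batch

cur.close()
pg.close()
mongo.close()
```

The batched `insert_many` calls also speak to the ~10,000 records/hour figure from the question: making one network round trip per thousand documents instead of one per record usually improves throughput dramatically.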
Hope it helps!!
Source: https://stackoverflow.com/questions/41997999/migrate-data-from-postgresql-to-mongodb