Question
I want to insert real-time logging data into Neo4j 2.2.1 through Spring Data Neo4j 4.0.0. The logging data is very large and may reach hundreds of thousands of records. What is the best way to implement this kind of functionality? Is it safe to just call the .save(Iterable) method after creating all the node entity objects? Is there something like a batch insertion mechanism in Spring Data Neo4j 4.0.0? Thanks in advance!
Answer 1:
As SDN4 can work with existing databases directly, you can use neo4j-import for initial imports.
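For reference, a minimal neo4j-import invocation might look like the following. The tool ships in the bin/ directory of Neo4j 2.2 and only works against a new, empty store; the file name, CSV header, and LogEntry label here are illustrative, not from the question:

```
# nodes.csv -- illustrative header and one row:
#   logId:ID,message,timestamp:long,:LABEL
#   1,server started,1432400000000,LogEntry

bin/neo4j-import --into data/graph.db --nodes nodes.csv
```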
Since Neo4j 2.2 we can also sustain highly concurrent write loads of parameterized Cypher, so I think you should be able to just multi-thread adding data to Neo4j using SDN4: create, say, 1,000 to 10,000 objects per batch and send them off, as in the sketch below.
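A minimal sketch of that batching idea, assuming a hypothetical LogEntry entity and repository (the names and pool size are illustrative, not from the question):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.neo4j.ogm.annotation.GraphId;
import org.neo4j.ogm.annotation.NodeEntity;
import org.springframework.data.neo4j.repository.GraphRepository;

@NodeEntity
class LogEntry {
    @GraphId private Long id;
    String message;
    long timestamp;
}

interface LogEntryRepository extends GraphRepository<LogEntry> {
}

class BatchedLogWriter {

    private static final int BATCH_SIZE = 1000; // the answer suggests 1k-10k per batch

    private final LogEntryRepository repository;

    BatchedLogWriter(LogEntryRepository repository) {
        this.repository = repository;
    }

    /** Splits the entries into batches and saves each batch from its own thread. */
    void saveAll(List<LogEntry> entries) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < entries.size(); i += BATCH_SIZE) {
            final List<LogEntry> batch = new ArrayList<LogEntry>(
                    entries.subList(i, Math.min(i + BATCH_SIZE, entries.size())));
            pool.submit(new Runnable() {
                public void run() {
                    // save(Iterable) persists the whole batch in one call,
                    // keeping per-transaction state bounded during large imports.
                    repository.save(batch);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}
```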
Otherwise, you can just send parameterized Cypher directly and concurrently to Neo4j.
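As a sketch of that approach, the following posts a parameterized UNWIND statement to Neo4j 2.2's transactional Cypher HTTP endpoint using only the JDK. The host, credentials, LogEntry label, and sample rows are assumptions for illustration:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class CypherHttpImport {

    public static void main(String[] args) throws Exception {
        // Neo4j 2.2 transactional Cypher endpoint; host and credentials are placeholders.
        URL url = new URL("http://localhost:7474/db/data/transaction/commit");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setRequestProperty("Authorization", "Basic "
                + Base64.getEncoder().encodeToString(
                        "neo4j:password".getBytes(StandardCharsets.UTF_8)));
        conn.setDoOutput(true);

        // {rows} uses Neo4j 2.x parameter syntax; UNWIND creates one node per map
        // in the list, so a single request can carry a whole batch of log records.
        String payload = "{\"statements\": [{"
                + "\"statement\": \"UNWIND {rows} AS row CREATE (n:LogEntry) SET n = row\","
                + "\"parameters\": {\"rows\": ["
                + "{\"message\": \"server started\", \"timestamp\": 1432400000000},"
                + "{\"message\": \"server stopped\", \"timestamp\": 1432400060000}]}"
                + "}]}";

        OutputStream out = conn.getOutputStream();
        try {
            out.write(payload.getBytes(StandardCharsets.UTF_8));
        } finally {
            out.close();
        }
        System.out.println("HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}
```

Sending one UNWIND statement per batch, from several threads, combines both suggestions in the answer: parameterized Cypher plus concurrent writers.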
Source: https://stackoverflow.com/questions/30494097/neo4j-spring-data-neo4j-4-0-0-importing-large-datasets