What is the best way to index data from Oracle/relational tables into Elasticsearch?

一生所求 2021-02-04 15:49

What are the options for indexing a large amount of data from an Oracle DB into an Elasticsearch cluster? The requirement is a one-time load of 300 million records into multiple indexes.

2 Answers
  • 2021-02-04 15:53

    We use ES as a reporting DB, and when new records are written to the SQL database we take the following steps to get them into ES:

    1. Write the primary key into a queue (we use RabbitMQ).
    2. A consumer picks up the primary key (when it has time), queries the relational DB for the data it needs, and then writes that data into ES.

    This process works great because it handles both new data and old data. For the old data, just write a quick script that pushes the 300M primary keys into RabbitMQ and you're done!
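    The pipeline above can be sketched as follows. This is a minimal simulation, not the answerer's actual code: a stdlib `queue.Queue` stands in for RabbitMQ, and plain dicts stand in for the Oracle table and the Elasticsearch index. All names and the table shape are illustrative assumptions.

    ```python
    import queue

    # Stand-in for the relational table: primary key -> row.
    oracle_rows = {
        1: {"name": "alice", "status": "active"},
        2: {"name": "bob", "status": "inactive"},
    }

    # Stand-in for the Elasticsearch index: document id -> document.
    es_index = {}

    # Step 1: the producer writes only the primary keys onto the queue.
    pk_queue = queue.Queue()
    for pk in oracle_rows:
        pk_queue.put(pk)

    # Step 2: the consumer takes each key, queries the DB for the full row,
    # and indexes the result. With a real broker this loop would be a
    # RabbitMQ consumer callback, and the last line an ES index/bulk call.
    while not pk_queue.empty():
        pk = pk_queue.get()
        row = oracle_rows[pk]      # SELECT ... WHERE id = :pk
        es_index[pk] = dict(row)   # PUT /my-index/_doc/{pk}

    print(es_index[1]["name"])  # -> alice
    ```

    The backfill for old data is then just the producer half run once over all existing keys; the consumer side does not change.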

  • 2021-02-04 16:00

    There are many integration options. I've listed a few below to give you some ideas; the right solution will really depend on your specific resources and requirements, though.

    1. Oracle GoldenGate can read the Oracle DB transaction logs and feed changes to ES in near real time.
    2. An ETL tool, for example Oracle Data Integrator, could run on a schedule, pull data from your DB, transform it, and send it to ES.
    3. Create triggers in the Oracle DB so that data updates are written to ES via a stored procedure. Alternatively, use the trigger to write flags to a "changes" table that an external process (e.g. a Java application) monitors and uses to extract data from the Oracle DB.
    4. Have the application that writes to the Oracle DB also feed ES. Ideally, your application and the Oracle DB should be loosely coupled; do you have an integration platform that could feed the messages to both ES and Oracle?
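    For the one-time 300M-record load, the scheduled-pull idea in option 2 usually comes down to paging through the table in primary-key order and shipping each page to ES as one bulk request. Here is a hedged sketch of that pattern; lists and dicts stand in for Oracle and Elasticsearch, and the batch size, table shape, and helper names are illustrative assumptions.

    ```python
    BATCH_SIZE = 2

    # Stand-in for the Oracle table, ordered by primary key.
    source_rows = [
        {"id": 1, "name": "alice"},
        {"id": 2, "name": "bob"},
        {"id": 3, "name": "carol"},
    ]

    # Stand-in for the Elasticsearch index: document id -> document.
    es_index = {}

    def fetch_batch(last_id, limit):
        # Keyset pagination; in SQL roughly:
        #   SELECT ... WHERE id > :last_id
        #   ORDER BY id FETCH FIRST :limit ROWS ONLY
        return [r for r in source_rows if r["id"] > last_id][:limit]

    def bulk_index(rows):
        # Stand-in for one call to the Elasticsearch _bulk endpoint.
        for r in rows:
            es_index[r["id"]] = r

    last_id = 0
    while True:
        batch = fetch_batch(last_id, BATCH_SIZE)
        if not batch:
            break
        bulk_index(batch)
        last_id = batch[-1]["id"]  # resume point; safe to restart from here

    print(len(es_index))  # -> 3
    ```

    Keyset pagination (resuming from the last seen key) is preferable to OFFSET-style paging at this scale, and tracking `last_id` makes the job restartable if it fails partway through.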