Large dataset processing using Mule ESB from a database: how to update the processed records based on a certain batch size?

Submitted by 为君一笑 on 2019-12-25 14:39:30

Question


I have a large set of records to be processed, e.g. 100,000 records. My use case has four steps (a sketch of the flow follows the list):

  1. pick the records from a database table using a JDBC inbound endpoint
  2. convert each record to XML format
  3. post the message to a queue
  4. update the same record with a status flag marking it as processed, so that it is not picked up again
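
A minimal sketch of how steps 1–3 might be wired in a Mule 3.x flow, assuming the JDBC and JMS transports of that era; the connector reference, query key, queue name, and flow name are all hypothetical (the query itself, and step 4, are covered in the answer below):

    <flow name="processRecords">
        <!-- Step 1: poll rows via the JDBC inbound endpoint;
             "selectNew" is a hypothetical query key defined on the connector -->
        <jdbc:inbound-endpoint queryKey="selectNew" connector-ref="jdbcConnector"/>
        <!-- Step 2: serialize each row (a java.util.Map) to XML -->
        <mulexml:object-to-xml-transformer/>
        <!-- Step 3: publish the XML message to a JMS queue -->
        <jms:outbound-endpoint queue="records.queue"/>
    </flow>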

I don't want to pick all the records from the table in one shot, and I don't want to update the table one record at a time. Is there a way to pick records in batches, and is there an option for a bulk/batch update at the batch level?

Or is there a better way to approach this use case? Any suggestions are highly appreciated.


Answer 1:


I would write the SQL SELECT query so it returns only N records at a time (e.g. with LIMIT 100, or your database's equivalent), with a WHERE clause that excludes already-processed records based on the status flag.
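
For example, a sketch of how this could look on the Mule 3 JDBC connector, combining a paged SELECT with the transport's acknowledgment query (keyed as `<queryKey>.ack`, which, as I recall, runs once per row after it is read); the table and column names are hypothetical, and the LIMIT syntax is MySQL-style:

    <jdbc:connector name="jdbcConnector" dataSource-ref="dataSource"
                    pollingFrequency="60000">
        <!-- Fetch at most 100 unprocessed rows per poll -->
        <jdbc:query key="selectNew"
                    value="SELECT id, data FROM records WHERE status = 'NEW' LIMIT 100"/>
        <!-- Acknowledgment query: flags each row as processed after it is read -->
        <jdbc:query key="selectNew.ack"
                    value="UPDATE records SET status = 'PROCESSED' WHERE id = #[map-payload:id]"/>
    </jdbc:connector>

Note that the ack query fires once per row rather than once per batch; if you need a true bulk update, you could instead collect the IDs within the flow and issue a single UPDATE ... WHERE id IN (...) statement from a component at the end of it.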



Source: https://stackoverflow.com/questions/14714032/large-dataset-processing-using-mule-esb-from-database-how-to-update-the-process
