Spring Data Flow w/ 2 sources feeding one processor/sink

Submitted by [亡魂溺海] on 2019-12-13 07:34:51

Question


I'm looking for some advice on setting up a Spring Data Flow stream for a specific use case.

My use case:

I have two RDBMSs and need to compare the results of queries run against each. The queries should run roughly simultaneously. Based on the result of the comparison, I should be able to send an email through a custom email sink app which I have created.

I envision the stream diagram to look something like this (sorry for the Paint diagram):

[diagram: two jdbc sources feeding one comparison processor, which feeds the email sink; image not available]

The problem is that SDF does not, to my knowledge, allow a stream to be composed with 2 sources. It seems to me that something like this ought to be possible without pushing the limits of the framework too far. I'm looking for answers that provide a good approach to this scenario while working within the SDF framework.

I am using Kafka as the message broker, and the Data Flow server uses MySQL to persist stream information.

I have considered creating a custom source app which polls both datasources and sends the messages on its output channel. This would eliminate my requirement for two sources, but it looks like it would require a significant amount of customization of the jdbc source application.
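Independent of which Spring Cloud Data Flow topology is chosen, the core of the custom-source idea is running the two queries roughly simultaneously and comparing their results. A minimal stdlib-only sketch of that logic (the suppliers stand in for the two JDBC queries; all names here are illustrative, not part of any SCDF API):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.Supplier;

public class QueryComparison {

    // Runs both suppliers concurrently (roughly simultaneously) and reports
    // whether their results differ. In the real app each supplier would
    // execute a query against one of the two RDBMSs.
    static boolean resultsDiffer(Supplier<List<String>> left,
                                 Supplier<List<String>> right) {
        CompletableFuture<List<String>> l = CompletableFuture.supplyAsync(left);
        CompletableFuture<List<String>> r = CompletableFuture.supplyAsync(right);
        return !l.join().equals(r.join());
    }

    public static void main(String[] args) {
        boolean differ = resultsDiffer(
                () -> List.of("row1", "row2"),
                () -> List.of("row1", "row3"));
        // A mismatch is what would trigger a message to the email sink.
        System.out.println(differ ? "mismatch: notify email sink" : "match");
    }
}
```

In a Spring Cloud Stream source, a method like `resultsDiffer` would be invoked from the poller and its outcome published to the output channel.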

Thanks in advance.


Answer 1:


I have not really tried this, but you should be able to use named destinations to achieve that. Take a look here: http://docs.spring.io/spring-cloud-dataflow/docs/current-SNAPSHOT/reference/htmlsingle/#spring-cloud-dataflow-stream-advanced

stream create --name jdbc1 --definition "jdbc > :dbSource"
stream create --name jdbc2 --definition "jdbc > :dbSource"
stream create --name processor --definition ":dbSource > aggregator | sink"
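Putting the named-destination idea together with the asker's custom email sink, the fan-in might be declared like this in the Data Flow shell. The app names `db-comparator` and `email-sink` are hypothetical placeholders for apps you would register yourself; only the named destination `:dbSource` carries the fan-in:

```
# Both jdbc sources publish to the same named destination:
stream create --name jdbc1 --definition "jdbc > :dbSource"
stream create --name jdbc2 --definition "jdbc > :dbSource"

# One downstream stream consumes the shared destination; "db-comparator"
# and "email-sink" are illustrative names for the comparison processor
# and the custom email sink:
stream create --name compare-and-notify --definition ":dbSource > db-comparator | email-sink"

stream deploy --name jdbc1
stream deploy --name jdbc2
stream deploy --name compare-and-notify
```

Note that the downstream processor would still need some way (e.g. a header or payload field) to tell which source a given message came from before it can pair up results for comparison.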



Source: https://stackoverflow.com/questions/43623205/spring-data-flow-w-2-sources-feeding-one-processor-sink
