apache-kafka-streams

How to test Kafka Streams applications with Spring Kafka?

Submitted by 喜欢而已 on 2020-12-07 05:16:51
Question: I am writing a streaming application with Kafka Streams, Spring-Kafka and Spring Boot. I cannot find any information on how to properly test stream processing done with the Kafka Streams DSL while using Spring-Kafka. The documentation mentions EmbeddedKafkaBroker, but there seems to be no information on how to handle testing of, for example, state stores. To give a simple example of what I would like to test, I have the following bean registered (where Item is Avro-generated): @Bean public KTable
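For a topology like this, the kafka-streams-test-utils TopologyTestDriver can exercise the DSL and its state stores without Spring or a broker at all. Below is a minimal sketch of that approach; since the question's actual KTable bean (and its Avro Item type) is truncated above, the topic name, store name, and String serdes here are assumptions, and the topology is built directly rather than taken from the Spring context.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

import java.util.Properties;

public class ItemTableTopologyTest {

    public static void main(String[] args) {
        // Topology under test: a KTable materialized into a named store.
        StreamsBuilder builder = new StreamsBuilder();
        builder.table("items",                                   // assumed input topic
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("items-store"));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "items-test-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted

        Topology topology = builder.build();
        try (TopologyTestDriver driver = new TopologyTestDriver(topology, props)) {
            TestInputTopic<String, String> input = driver.createInputTopic(
                    "items", new StringSerializer(), new StringSerializer());

            input.pipeInput("item-1", "first value");
            input.pipeInput("item-1", "second value");

            // The materialized state store is readable directly, so the KTable
            // contents can be asserted on without any broker.
            KeyValueStore<String, String> store = driver.getKeyValueStore("items-store");
            System.out.println(store.get("item-1")); // prints "second value"
        }
    }
}
```

Because the driver reads the state store synchronously, assertions on KTable contents need no EmbeddedKafkaBroker and no polling for asynchronous results; the embedded broker is mainly useful for end-to-end tests of the Spring wiring itself.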

Stream join example with Apache Kafka?

Submitted by 被刻印的时光 ゝ on 2020-12-04 05:14:06
Question: I was looking for an example using Kafka Streams on how to do this sort of thing, i.e. join a customers table with an addresses table and sink the data to ES:

Customers
+------+------------+-----------+-----------------------+
| id   | first_name | last_name | email                 |
+------+------------+-----------+-----------------------+
| 1001 | Sally      | Thomas    | sally.thomas@acme.com |
| 1002 | George     | Bailey    | gbailey@foobar.com    |
| 1003 | Edward     | Davidson  | ed@walker.com         |
| 1004 | Anne       | Kim
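One common shape for this is a KTable-KTable join whose result feeds a topic consumed by a Kafka Connect Elasticsearch sink connector. The sketch below makes several assumptions: both topics are keyed by customer id (i.e. addresses have been re-keyed upstream), values are plain Strings rather than the Avro/Connect records implied by the table above, and all topic names are made up for illustration.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class CustomerAddressJoin {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Both topics are assumed to be keyed by customer id.
        KTable<String, String> customers = builder.table(
                "customers", Consumed.with(Serdes.String(), Serdes.String()));
        KTable<String, String> addresses = builder.table(
                "addresses-by-customer", Consumed.with(Serdes.String(), Serdes.String()));

        // Join the two tables; an update on either side re-emits the merged row.
        KTable<String, String> enriched = customers.join(
                addresses,
                (customer, address) -> customer + " | " + address);

        // A Kafka Connect Elasticsearch sink connector would read from this topic.
        enriched.toStream().to("customers-with-addresses",
                Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "customer-address-join");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        new KafkaStreams(builder.build(), props).start();
    }
}
```

If a customer can have several addresses keyed by address id with a customer_id field, the re-keying step would instead be a foreign-key join or an aggregation of addresses per customer before the join; the inner join above only covers the one-address-per-customer case.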

Can I rely on an in-memory Java collection in Kafka Streams for buffering events by fine-tuning punctuate and commit interval?

Submitted by 故事扮演 on 2020-11-29 11:12:35
Question: I have a custom processor which buffers events in a simple java.util.List in process() - this buffer is not a state store. Every 30 seconds of WALL_CLOCK_TIME, punctuate() sorts this list and flushes it to the sink. Assume a single-partition source and sink only. The EOS processing guarantee is required. I know that at any given time either process() or punctuate() gets executed. I am concerned about this buffer not being backed by a changelog topic. Ideally I believe this should have been a state
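Below is a minimal sketch of the changelog-backed alternative the question is pointing at: buffering into a persistent KeyValueStore instead of a java.util.List, so that pending events are restored from the changelog after a crash or rebalance and the store writes fall under the EOS transaction. The store name, topic names, String types, and the 30-second flush interval are assumptions for illustration; it uses the classic Processor API that was current in 2020.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.processor.Processor;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.Stores;

import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

public class BufferingProcessor implements Processor<String, String> {

    private ProcessorContext context;
    private KeyValueStore<String, String> buffer;

    @Override
    @SuppressWarnings("unchecked")
    public void init(ProcessorContext context) {
        this.context = context;
        this.buffer = (KeyValueStore<String, String>) context.getStateStore("event-buffer");

        // Flush the buffered events downstream every 30 seconds of wall-clock time.
        context.schedule(Duration.ofSeconds(30), PunctuationType.WALL_CLOCK_TIME, timestamp -> {
            List<KeyValue<String, String>> pending = new ArrayList<>();
            try (KeyValueIterator<String, String> it = buffer.all()) {
                while (it.hasNext()) {
                    pending.add(it.next());
                }
            }
            for (KeyValue<String, String> entry : pending) {
                context.forward(entry.key, entry.value); // goes to the sink node
                buffer.delete(entry.key);
            }
        });
    }

    @Override
    public void process(String key, String value) {
        // Buffer in the changelog-backed store rather than a plain List, so a
        // crash or rebalance restores pending events instead of dropping them.
        buffer.put(key, value);
    }

    @Override
    public void close() { }

    public static void main(String[] args) {
        Topology topology = new Topology();
        topology.addSource("source", new StringDeserializer(), new StringDeserializer(), "events-in");
        topology.addProcessor("buffering-processor", BufferingProcessor::new, "source");
        topology.addStateStore(
                Stores.keyValueStoreBuilder(Stores.persistentKeyValueStore("event-buffer"),
                        Serdes.String(), Serdes.String()),
                "buffering-processor");
        topology.addSink("sink", "events-out",
                new StringSerializer(), new StringSerializer(), "buffering-processor");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "buffering-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE);

        new KafkaStreams(topology, props).start();
    }
}
```

The plain-List variant in the question works only as long as the commit interval and punctuate interval are tuned so the buffer is always drained before a commit, and even then a crash between commits loses whatever sat in the List; the store-backed version trades some RocksDB and changelog overhead for not depending on that timing.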
