spring-kafka-test

Embedded Kafka tests randomly failing

馋奶兔 submitted on 2021-02-11 14:06:45
Question: I implemented a set of integration tests using EmbeddedKafka to test one of our Kafka Streams applications, which runs on the spring-kafka framework. The streams application reads a message from a Kafka topic, stores it in an internal state store, applies some transformation, and sends it to another microservice on a request topic. When the response comes back on the response topic, it retrieves the original message from the state store and, depending on some business logic, forwards…
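
For reference, a minimal sketch of how such a test can be wired with spring-kafka-test is shown below; the topic names request-topic and response-topic, the test-group consumer group, and the plain String payloads are invented for illustration and are not the poster's actual code. Reading the output with KafkaTestUtils.getSingleRecord, which polls until a record arrives or a timeout expires, avoids one common source of flakiness: asserting before the stream has produced its result.

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.kafka.test.utils.KafkaTestUtils;

@SpringBootTest(properties = {
        // point both the regular clients and the streams app at the embedded broker
        "spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}",
        "spring.kafka.streams.bootstrap-servers=${spring.embedded.kafka.brokers}"
})
@EmbeddedKafka(partitions = 1, topics = {"request-topic", "response-topic"})
class StreamsIntegrationTest {

    @Autowired
    private EmbeddedKafkaBroker embeddedKafka;

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Test
    void forwardsTransformedMessage() {
        // publish an input record to the topic the stream listens on
        kafkaTemplate.send("request-topic", "key-1", "payload");

        // create a throwaway consumer against the embedded broker and subscribe to the output topic
        Consumer<String, String> consumer = new DefaultKafkaConsumerFactory<>(
                KafkaTestUtils.consumerProps("test-group", "true", embeddedKafka),
                new StringDeserializer(), new StringDeserializer()).createConsumer();
        embeddedKafka.consumeFromAnEmbeddedTopic(consumer, "response-topic");

        // getSingleRecord polls until a record arrives or the timeout expires
        ConsumerRecord<String, String> record =
                KafkaTestUtils.getSingleRecord(consumer, "response-topic");
        // assert on record.value() here
        consumer.close();
    }
}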

spring kafka properties not auto loaded when writing customConsumerFactory and customKafkaListenerContainerFactory

梦想的初衷 submitted on 2021-01-27 17:45:37
Question: I want to load my spring-kafka properties from application.properties, and they must be loaded via Spring auto-configuration. My problem is: Caused by: java.lang.IllegalStateException: No Acknowledgment available as an argument, the listener container must have a MANUAL AckMode to populate the Acknowledgment. However, I have already set spring.kafka.listener.ack-mode=manual-immediate in the properties file; because it is my custom fooKafkaListenerContainerFactory, it's…
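
A sketch of one common way to wire this is shown below (not necessarily the accepted answer): build the custom factory from Boot's own KafkaProperties so the values bound from application.properties are reused, and set the ack mode on the factory explicitly, since spring.kafka.listener.ack-mode is only applied to Boot's auto-configured factory. The bean names fooConsumerFactory and fooKafkaListenerContainerFactory mirror the question; the String key/value types are assumptions.

import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;

@Configuration
public class FooKafkaConfig {

    // Boot binds spring.kafka.* from application.properties into KafkaProperties;
    // reusing it here keeps the custom factory in sync with the properties file.
    @Bean
    public ConsumerFactory<String, String> fooConsumerFactory(KafkaProperties kafkaProperties) {
        return new DefaultKafkaConsumerFactory<>(kafkaProperties.buildConsumerProperties());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> fooKafkaListenerContainerFactory(
            ConsumerFactory<String, String> fooConsumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(fooConsumerFactory);
        // spring.kafka.listener.ack-mode only configures the auto-configured factory,
        // so a hand-built factory has to set the ack mode itself
        factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
        return factory;
    }
}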

How to test Kafka Streams applications with Spring Kafka?

喜欢而已 submitted on 2020-12-07 05:16:51
Question: I am writing a streaming application with Kafka Streams, Spring-Kafka, and Spring Boot. I cannot find any information on how to properly test stream processing done with the Kafka Streams DSL while using Spring-Kafka. The documentation mentions EmbeddedKafkaBroker, but there seems to be no information on how to handle testing of, for example, state stores. To give a simple example of what I would like to test, I have the following bean registered (where Item is Avro-generated): @Bean public KTable…
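
One way to test DSL logic and state stores without any broker at all is the TopologyTestDriver from kafka-streams-test-utils; a minimal sketch follows. The topic name items, the store name item-store, and the use of plain String values instead of the Avro-generated Item are assumptions made to keep the example self-contained.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class ItemTableTest {

    @Test
    void materializesLatestValuePerKey() {
        // build the same kind of topology the @Bean method would produce
        StreamsBuilder builder = new StreamsBuilder();
        builder.table("items",
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("item-store"));
        Topology topology = builder.build();

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "test-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted by the driver

        try (TopologyTestDriver driver = new TopologyTestDriver(topology, props)) {
            TestInputTopic<String, String> input = driver.createInputTopic(
                    "items", new StringSerializer(), new StringSerializer());
            input.pipeInput("item-1", "v1");
            input.pipeInput("item-1", "v2");

            // the state store can be queried directly, no embedded broker required
            KeyValueStore<String, String> store = driver.getKeyValueStore("item-store");
            assertEquals("v2", store.get("item-1"));
        }
    }
}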

KStream to KTable Inner Join producing different number of records every time processed with same data

谁说我不能喝 submitted on 2020-07-23 06:07:21
Question: I want to do a KStream-to-KTable join, using the KTable only as a lookup table. The steps below show the sequence in which the code is executed: construct the KTable, re-key the KTable, construct the KStream, re-key the KStream, join the KStream with the KTable. Let's say there are 8000 records in the KStream and 14 records in the KTable, and assume that for every key in the KStream there is a matching record in the KTable, so the expected output would be 8000 records. Every time I run the join for the first time, or when I start the application, the expected output is 8000…
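
For readers who want to reproduce the setup, a rough sketch of the pipeline described above is given below; the topic names, the String types, and the key-extraction logic are placeholders, and the sketch only restates the described steps rather than explaining the varying record counts.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Joined;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;

public class LookupJoinTopology {

    public static void build(StreamsBuilder builder) {
        // steps 1-2: construct the lookup KTable and re-key it; a KTable cannot be re-keyed
        // directly, so it is read as a stream, re-keyed, and turned back into a table
        KTable<String, String> lookup = builder
                .stream("lookup-topic", Consumed.with(Serdes.String(), Serdes.String()))
                .selectKey((oldKey, value) -> extractJoinKey(value))
                .toTable(Materialized.with(Serdes.String(), Serdes.String()));

        // steps 3-4: construct the fact KStream and re-key it to the same key space,
        // so both sides are co-partitioned for the join
        KStream<String, String> facts = builder
                .stream("fact-topic", Consumed.with(Serdes.String(), Serdes.String()))
                .selectKey((oldKey, value) -> extractJoinKey(value));

        // step 5: inner join - each stream record is enriched with the current table value for its key
        facts.join(lookup, (fact, ref) -> fact + "|" + ref,
                        Joined.with(Serdes.String(), Serdes.String(), Serdes.String()))
             .to("joined-topic", Produced.with(Serdes.String(), Serdes.String()));
    }

    private static String extractJoinKey(String value) {
        return value.split(",")[0]; // placeholder key extraction
    }
}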

spring Kafka integration testing with embedded Kafka

心不动则不痛 submitted on 2019-12-23 04:06:15
Question: I have a Spring Boot application with a consumer that consumes from a topic in one cluster and produces to another topic in a different cluster. Now I'm trying to write an integration test case using Spring embedded Kafka, but I'm running into an issue: KafkaTemplate could not be registered. A bean with that name has already been defined in class path resource. Consumer class: @Service public class KafkaConsumerService { @Autowired private KafkaProducerService kafkaProducerService; @KafkaListener(topics = "${kafka…
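
A sketch of one common way around this clash is shown below; it is not the poster's code, and the topic names source-topic/target-topic and the String payloads are assumptions. Since Spring Boot 2.1, bean definition overriding is disabled by default, so a test that redefines KafkaTemplate either has to use a different bean name or re-enable overriding, as done here, so that a template pointing at the embedded broker can replace the application's one.

import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.kafka.test.utils.KafkaTestUtils;

@SpringBootTest(properties = "spring.main.allow-bean-definition-overriding=true")
@EmbeddedKafka(partitions = 1, topics = {"source-topic", "target-topic"})
class KafkaConsumerServiceIT {

    @TestConfiguration
    static class EmbeddedKafkaTemplateConfig {

        @Bean
        @Primary
        KafkaTemplate<String, String> kafkaTemplate(EmbeddedKafkaBroker broker) {
            // point the template at the embedded broker instead of the real clusters
            return new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(
                    KafkaTestUtils.producerProps(broker),
                    new StringSerializer(), new StringSerializer()));
        }
    }

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    // tests that send to "source-topic" and assert on "target-topic" would go here
}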

What is the proper way of doing @DirtiesContext when using @EmbeddedKafka

扶醉桌前 submitted on 2019-12-11 19:08:58
Question: We have a "little" problem in our project: "Connection to node 0 could not be established. Broker may not be available." Tests run for a very long time, and this message is logged at least once every second. But I found out how to get rid of it; read on. If there is something incorrect in the configurations/annotations, please let me know. Versions first: <springframework.boot.version>2.1.8.RELEASE</springframework.boot.version>, which automatically brings in <spring-kafka.version>2.2.8.RELEASE…
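
Since the poster's own configuration is cut off here, a minimal shape of such a test class might look like the sketch below; the property values and the topic name some-topic are assumptions, not the configuration from the question. Pointing spring.kafka.bootstrap-servers at the embedded broker is what usually stops the context from repeatedly trying to reach a real broker on the default port, and @DirtiesContext tells Spring to discard the cached context (and its embedded broker) after the test class.

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.test.annotation.DirtiesContext;

@SpringBootTest(properties = {
        // route all Boot-configured Kafka clients to the embedded broker
        "spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}",
        "spring.kafka.consumer.auto-offset-reset=earliest"
})
@EmbeddedKafka(partitions = 1, topics = "some-topic")
@DirtiesContext // by default the context is marked dirty after the whole class
class SomeEmbeddedKafkaTest {

    @Test
    void contextLoads() {
        // real assertions against "some-topic" would go here
    }
}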