spring-kafka

Handle multiple responses with ReplyingKafkaTemplate

Submitted by こ雲淡風輕ζ on 2021-01-29 12:29:07

Question: I'm trying to implement a request-reply pattern in which I publish a message to a topic listened to by several consumer groups. That means they will all receive the message, and each will submit a response on the reply topic. The problem is that because they all respond to the same message, only the first reply received on the reply topic is returned to the caller; the others are discarded. Given that I know how many responses I should be getting on the reply topic (call that number n), how can I…
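For exactly this case, spring-kafka 2.3+ provides AggregatingReplyingKafkaTemplate, whose constructor takes a release strategy: a BiPredicate over the list of replies collected so far and a timeout flag. The sketch below shows only that release logic with plain types standing in for ConsumerRecord, so it can run standalone; wiring it into an actual template requires a ProducerFactory and a reply container.

```java
import java.util.List;
import java.util.function.BiPredicate;

public class ReleaseStrategyDemo {
    // Release when the expected number of replies has arrived, or when the
    // aggregation window times out (return whatever was collected so far).
    static <T> BiPredicate<List<T>, Boolean> releaseAfter(int expected) {
        return (replies, isTimeout) -> isTimeout || replies.size() >= expected;
    }

    public static void main(String[] args) {
        BiPredicate<List<String>, Boolean> strategy = releaseAfter(3);
        System.out.println(strategy.test(List.of("a", "b"), false));      // false
        System.out.println(strategy.test(List.of("a", "b", "c"), false)); // true
    }
}
```

With the real template, the same lambda (typed over List<ConsumerRecord<K, R>>) is passed as the releaseStrategy constructor argument, and sendAndReceive returns a future that completes with all n replies at once.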

Aug 2019 - Kafka Consumer Lag programmatically

Submitted by 扶醉桌前 on 2021-01-29 12:20:03

Question: Is there any way we can programmatically find the lag of a Kafka consumer? I don't want to install external Kafka Manager tools and check a dashboard. We can list all the consumer groups and check the lag for each group. Currently we do have a command to check the lag, but it requires the relative path where Kafka resides. Spring-Kafka, kafka-python, Kafka AdminClient, or JMX: is there any way we can code this and find out the lag? We were careless and didn't monitor the process, the…
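The usual programmatic recipe is: fetch the group's committed offsets with AdminClient.listConsumerGroupOffsets, fetch the log-end offsets (e.g. KafkaConsumer.endOffsets or AdminClient.listOffsets), and subtract per partition. The arithmetic is sketched below with plain "topic-partition" strings standing in for org.apache.kafka.common.TopicPartition, so it runs without a broker.

```java
import java.util.HashMap;
import java.util.Map;

public class ConsumerLag {
    // Lag per partition = log-end offset minus last committed offset,
    // floored at zero. Keys are "topic-partition" strings standing in
    // for TopicPartition objects from the real AdminClient/consumer calls.
    static Map<String, Long> lag(Map<String, Long> endOffsets, Map<String, Long> committed) {
        Map<String, Long> result = new HashMap<>();
        endOffsets.forEach((tp, end) ->
                result.put(tp, Math.max(0L, end - committed.getOrDefault(tp, 0L))));
        return result;
    }

    public static void main(String[] args) {
        Map<String, Long> end = Map.of("orders-0", 120L, "orders-1", 80L);
        Map<String, Long> committed = Map.of("orders-0", 100L, "orders-1", 80L);
        System.out.println(lag(end, committed)); // orders-0 -> 20, orders-1 -> 0
    }
}
```

Summing the per-partition values gives the total group lag that dashboards such as Kafka Manager display.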

Why doesn't an async producer wait for linger.ms or batch.size to be reached when I set custom values and autoFlush is false?

Submitted by 橙三吉。 on 2021-01-29 10:22:57

Question: I am using spring-kafka 2.2.8 and writing a simple async producer with the following settings: linger.ms: 300000, batch.size: 33554431, max.block.ms: 60000. I'm creating a KafkaTemplate with autoFlush set to false by calling the constructor public KafkaTemplate(ProducerFactory<K, V> producerFactory, boolean autoFlush). I have a simple test producing 10 messages over 10 seconds using the above async producer, and then I stopped my producer with Ctrl+C. Then, surprisingly, I got all…
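A likely explanation for the observed behavior: linger.ms and batch.size only delay sends while the producer stays open. KafkaProducer.close(), which Spring's DefaultKafkaProducerFactory invokes during context shutdown (triggered here by Ctrl+C via the JVM shutdown hook), flushes any buffered batches before closing, so the messages go out regardless of the batching settings. The fragment below just collects the question's settings into a Properties object for clarity.

```java
import java.util.Properties;

public class AsyncProducerProps {
    // The batching settings from the question. Note: these only delay sends
    // while the producer is open; close() flushes buffered batches anyway.
    static Properties build() {
        Properties p = new Properties();
        p.setProperty("linger.ms", "300000");    // wait up to 5 minutes per batch
        p.setProperty("batch.size", "33554431"); // ~32 MB batch buffer
        p.setProperty("max.block.ms", "60000");  // max time send() may block
        return p;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("linger.ms")); // 300000
    }
}
```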

Kafka Consumer: ClassCastException in Java

Submitted by 让人想犯罪 __ on 2021-01-29 08:54:44

Question: My Kafka consumer throws an exception when trying to process messages in a batch (i.e. a list of messages). The error message is: java.lang.ClassCastException: class kafka.psmessage.PMessage cannot be cast to class org.apache.kafka.clients.consumer.ConsumerRecord (kafka.psmessage.PMessage and org.apache.kafka.clients.consumer.ConsumerRecord are in unnamed module of loader 'app'); nested exception is java.lang.ClassCastException: class kafka.psmessage.PMessage cannot be cast to class org.apache…
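This error pattern typically means the batch listener's parameter is declared as List<ConsumerRecord<...>> while the configured converter already delivers the deserialized payloads, so the list actually contains PMessage elements; declaring the parameter as List<PMessage> (matching what the deserializer produces) is the usual fix. Because of type erasure, the mismatch only surfaces when an element is accessed, as this standalone demo shows:

```java
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        // Mirrors the listener problem: the runtime list holds one element
        // type, but the declared generic type claims another. The unchecked
        // cast compiles; the failure surfaces only when an element is used.
        List<Object> actual = List.of("a deserialized payload");
        @SuppressWarnings("unchecked")
        List<Integer> declared = (List<Integer>) (List<?>) actual; // no error yet
        try {
            Integer first = declared.get(0); // ClassCastException happens here
            System.out.println(first);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException at element access");
        }
    }
}
```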

TransactionId prefix for producer-only and read-process-write - ProducerFencedException

Submitted by 感情迁移 on 2021-01-29 08:24:11

Question: Background: we have been getting ProducerFencedException in our producer-only transactions, and we want to introduce uniqueness into our prefix to prevent this issue. In this discussion, Gary mentions that in the read-process-write case, the prefix must be the same across all instances and after each restart: How to choose Kafka transaction id for several applications, hosted in Kubernetes? While digging into this issue, I came to the realisation that we are sharing the same prefixId for both…
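A minimal sketch of the distinction the question is circling, with helper names of our own invention: for producer-only transactions, each instance can safely use a unique prefix (e.g. with a random component), because no consumer-group fencing is involved; for read-process-write, the prefix must be stable across restarts so a restarted instance fences its own zombie producers (pre-Kafka-2.5 exactly-once semantics). The point is that the two workloads should not share one transactionIdPrefix.

```java
import java.util.UUID;

public class TxIdPrefixes {
    // Producer-only transactions: a unique prefix per application instance
    // prevents concurrent instances from fencing each other's producers.
    static String producerOnlyPrefix(String app) {
        return app + ".producer-only." + UUID.randomUUID() + ".";
    }

    // Read-process-write: the prefix must be stable across restarts so a
    // restarted instance fences its own zombie producers.
    static String readProcessWritePrefix(String app) {
        return app + ".rpw.";
    }

    public static void main(String[] args) {
        System.out.println(producerOnlyPrefix("billing")); // random suffix each run
        System.out.println(readProcessWritePrefix("billing")); // billing.rpw.
    }
}
```

Each prefix would then be set on its own DefaultKafkaProducerFactory via setTransactionIdPrefix.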

Why is the Kafka producer very slow on the first message?

Submitted by 六月ゝ 毕业季﹏ on 2021-01-29 07:22:06

Question: I am using a Kafka producer to send prices to a topic. When I send the first message, it prints the producer config and then sends the message, which is why the first message takes more time to send. After the first message, it takes only about half a millisecond to send a message. My question is: can we do something so that the configuration part is skipped, or can we do it before sending the first message? I am using Spring Kafka in my project. I read other questions too, but they were not really helpful. Application.yml server: port:…
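The first send is slow because DefaultKafkaProducerFactory creates the KafkaProducer lazily, and the new producer must then fetch cluster metadata and open connections. A common workaround (hedged, not the only one) is to warm the producer up at startup, e.g. by obtaining a producer from the factory and calling partitionsFor(topic) once, which forces the metadata fetch early. The general eager-initialization pattern looks like this in plain Java:

```java
import java.util.function.Supplier;

public class EagerInit {
    // Generic warm-up pattern: pay the one-time initialization cost at
    // startup instead of on the first request. With spring-kafka, the
    // analogous move is to touch the producer (e.g. partitionsFor) in an
    // ApplicationRunner before real traffic arrives.
    static <T> Supplier<T> warmedUp(Supplier<T> expensiveInit) {
        T value = expensiveInit.get(); // runs now, during startup
        return () -> value;            // later calls are cheap
    }

    public static void main(String[] args) {
        Supplier<String> client = warmedUp(() -> "initialized");
        System.out.println(client.get()); // initialized
    }
}
```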

Spring Cloud Kafka: Can't serialize data for output stream when two processors are active

Submitted by ☆樱花仙子☆ on 2021-01-28 05:13:21

Question: I have a working setup for Spring Cloud Kafka Streams using the functional programming style. There are two use cases, which are configured via application.properties. Both of them work individually, but as soon as I activate both at the same time, I get a serialization error for the output stream of the second use case: Exception in thread "ActivitiesAppId-05296224-5ea1-412a-aee4-1165870b5c75-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Error encountered sending record to…
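One common cause when two functional processors are active is that the second processor's output binding falls back to the binder's default serde rather than the type-specific one that worked when it ran alone. A hedged sketch of one fix, declaring the value serde explicitly per binding (the function and binding names below are hypothetical and must match your own definitions):

```properties
# Hypothetical function names; adjust to your own definitions.
spring.cloud.function.definition=processActivities;processOrders
# Give the second processor's output binding an explicit value serde instead
# of relying on the default serde (JsonSerde shown as one possibility):
spring.cloud.stream.kafka.streams.bindings.processOrders-out-0.producer.valueSerde=org.springframework.kafka.support.serializer.JsonSerde
```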

Adding custom header using Spring Kafka

Submitted by 人走茶凉 on 2021-01-27 14:00:06

Question: I am planning to use the Spring Kafka client to consume and produce messages against a Kafka setup in a Spring Boot application. I see support for custom headers in Kafka 0.11, as detailed here. While it is available for native Kafka producers and consumers, I don't see support for adding/reading custom headers in Spring Kafka. I am trying to implement a DLQ for messages, based on a retry count that I was hoping to store in the message header without having to parse the payload.

Answer 1: Well, Spring…
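Spring Kafka has supported headers since 2.0, e.g. by adding headers to the ProducerRecord you pass to KafkaTemplate, or via MessageBuilder.setHeader on the Spring Messaging side. Header values are byte[] on the wire, so a retry counter needs explicit encoding and decoding; the sketch below shows that round trip with a plain map standing in for the record's headers, and the header name is our own convention, not a spring-kafka constant.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class RetryHeader {
    // Our own DLQ convention: a retry counter carried as a UTF-8 string
    // in a custom header, so the payload never has to be parsed.
    static final String RETRY_HEADER = "x-retry-count";

    static byte[] encode(int count) {
        return Integer.toString(count).getBytes(StandardCharsets.UTF_8);
    }

    static int decode(byte[] value) {
        return value == null ? 0 : Integer.parseInt(new String(value, StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        Map<String, byte[]> headers = new HashMap<>(); // stand-in for record headers
        // A missing header decodes to 0, so the first failure writes 1:
        headers.put(RETRY_HEADER, encode(decode(headers.get(RETRY_HEADER)) + 1));
        System.out.println(decode(headers.get(RETRY_HEADER))); // 1
    }
}
```

In the real listener, the same decode would read the header from the ConsumerRecord and route to the DLQ once the count exceeds the limit.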

How can I map incoming headers as String instead of byte[] in my Spring Cloud Stream project?

Submitted by 北慕城南 on 2021-01-27 06:17:06

Question: I have a simple Spring Cloud Stream project using Spring Integration DSL flows and the Kafka binder. Everything works great, but message header values coming from Kafka arrive as byte[]. This means that my SI @Header parameters need to be of type byte[]. That works, but it would be nice to have them as Strings (all the inbound headers I care about are String values). I've configured the Kafka clients to use StringSerializer/StringDeserializer. I assume I also need to somehow tell Spring…
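The serializer setting only affects keys and values; header mapping is handled separately. spring-kafka's DefaultKafkaHeaderMapper can decode selected inbound header values to UTF-8 Strings, and recent Kafka binder versions let you supply such a mapper as a bean. This is a hedged configuration sketch, not a verified setup; the header name is hypothetical:

```java
// Sketch: a header mapper that decodes chosen inbound headers as UTF-8
// Strings instead of byte[]. Whether the binder picks this bean up depends
// on the binder version (see the binder's headerMapperBeanName property).
@Bean
public KafkaHeaderMapper kafkaBinderHeaderMapper() {
    DefaultKafkaHeaderMapper mapper = new DefaultKafkaHeaderMapper();
    // true = charset-decode this raw header to a String on inbound
    // ("my-correlation-id" is a hypothetical header name).
    mapper.setRawMappedHeaders(Map.of("my-correlation-id", true));
    return mapper;
}
```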

Transaction synchronization in Spring Boot with a database + Kafka example

Submitted by 女生的网名这么多〃 on 2021-01-21 09:17:18

Question: I want to write a new application with Spring Boot using MySQL + Mongo as the databases and Spring Kafka for messaging. I tried many POCs for synchronizing the transaction between Kafka and the DB, but I failed under certain conditions, and I also searched many repositories and blogs to find at least one example. I still haven't found one. If anyone can give at least one example or configuration, it would be a nice reference for everyone in the future.

Answer 1: Here you go... @SpringBootApplication…
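The full answer above is truncated, so here is a minimal hedged sketch of one common arrangement (not necessarily the answer's exact code): the DB transaction manager drives @Transactional, and a transactional KafkaTemplate synchronizes its local Kafka transaction with the DB commit. This is best-effort one-phase commit, not XA: a crash between the DB commit and the Kafka commit can still lose the send. OrderRepository and Order are hypothetical types.

```java
// Minimal sketch, assuming a JPA transaction manager is primary and the
// producer factory has a transactionIdPrefix set, which makes the
// KafkaTemplate transactional. OrderRepository and Order are hypothetical.
@Service
public class OrderService {

    private final OrderRepository repository;
    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderService(OrderRepository repository, KafkaTemplate<String, String> kafkaTemplate) {
        this.repository = repository;
        this.kafkaTemplate = kafkaTemplate;
    }

    @Transactional // DB transaction; the Kafka send joins a local Kafka
                   // transaction committed after the DB commit (best-effort)
    public void placeOrder(Order order) {
        repository.save(order);
        kafkaTemplate.send("orders", order.getId(), "ORDER_CREATED");
    }
}
```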