spring-kafka

How to recover from exceptions thrown by producer.send() in Spring Cloud Stream

Posted by 99封情书 on 2021-02-10 07:48:36

Question: We experienced the following scenario: we have a Kafka cluster of 3 nodes, and each topic is created with 3 partitions. A message is sent through MessageChannel.send(), producing a record for, say, partition 1. The broker acting as leader for that partition then fails. By default, MessageChannel.send() returns true and does not throw any exception, even if the KafkaProducer eventually fails to deliver the message. About 30 seconds after this call, we observe the
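A common way to surface these asynchronous send failures is to enable the binding's producer error channel (and optionally synchronous sends) in the Kafka binder, then subscribe to it. A minimal sketch, not the asker's code: the property keys are the Kafka binder's, but the binding name `output` and the handler are illustrative assumptions.

```java
// application.yml (binding name "output" is an assumption):
//   spring.cloud.stream.bindings.output.producer.errorChannelEnabled: true
//   spring.cloud.stream.kafka.bindings.output.producer.sync: true  # optional: block until the broker acks

// With the error channel enabled, failed sends arrive as ErrorMessages whose
// payload wraps the failure cause (for the Kafka binder, including the failed record).
@ServiceActivator(inputChannel = "output.errors")
public void handleSendFailure(ErrorMessage errorMessage) {
    Throwable cause = errorMessage.getPayload();
    // recover here: retry, park the failed record, update state, alert, etc.
}
```

Note that `sync: true` trades throughput for an immediate exception from MessageChannel.send() when the broker does not acknowledge.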

No pending reply: ConsumerRecord

Posted by 有些话、适合烂在心里 on 2021-02-09 10:57:54

Question: I am trying to use ReplyingKafkaTemplate, and intermittently I keep seeing the message below.

    No pending reply: ConsumerRecord(topic = request-reply-topic, partition = 8, offset = 1, CreateTime = 1544653843269, serialized key size = -1, serialized value size = 1609, headers = RecordHeaders(headers = [RecordHeader(key = kafka_correlationId, value = [-14, 65, 21, -118, 70, -94, 72, 87, -113, -91, 92, 72, -124, -110, -64, -94])], isReadOnly = false), key = null, with correlationId: [
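"No pending reply" is logged when a reply record arrives after the requesting future has already timed out and been removed, or when an instance receives a reply it did not request (a shared reply topic). A minimal sketch, assuming the usual bean wiring; `setDefaultReplyTimeout(Duration)` is the newer API (older versions used `setReplyTimeout(long)`), and the 30-second value is an illustrative choice.

```java
// Sketch: raise the reply timeout so slow responders don't orphan their replies.
@Bean
public ReplyingKafkaTemplate<String, String, String> replyingTemplate(
        ProducerFactory<String, String> pf,
        ConcurrentMessageListenerContainer<String, String> repliesContainer) {
    ReplyingKafkaTemplate<String, String, String> template =
            new ReplyingKafkaTemplate<>(pf, repliesContainer);
    template.setDefaultReplyTimeout(Duration.ofSeconds(30)); // default is 5 seconds
    return template;
}
```

If several application instances share one reply topic, each instance sees replies meant for the others; `template.setSharedReplyTopic(true)` downgrades this log from an error to debug, and dedicated reply partitions per instance avoid the situation entirely.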

DeadLetterPublishingRecoverer - Dead-letter publication fails with InvalidTopicException when the target topic name ends with _ERR

Posted by 不打扰是莪最后的温柔 on 2021-02-08 11:56:22

Question: I identified an error when I changed the DeadLetterPublishingRecoverer destinationResolver. When I use:

    private static final BiFunction<ConsumerRecord<?, ?>, Exception, TopicPartition> DESTINATION_RESOLVER =
        (cr, e) -> new TopicPartition(cr.topic() + ".ERR", cr.partition());

it works perfectly. However, if I use _ERR instead of .ERR, an error occurs:

    2020-08-05 12:53:10,277 [kafka-producer-network-thread | producer-kafka-tx-group1.ABC_TEST_XPTO.0] WARN o.apache.kafka.clients.NetworkClient
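For reference, a resolver of this shape wires into the recoverer like this. A minimal sketch: the `.ERR` suffix is from the question, and the bean wiring around it is illustrative.

```java
// Sketch: wiring a custom destination resolver into DeadLetterPublishingRecoverer.
// The resolver keeps the original partition so dead letters preserve ordering.
private static final BiFunction<ConsumerRecord<?, ?>, Exception, TopicPartition>
        DESTINATION_RESOLVER =
        (record, ex) -> new TopicPartition(record.topic() + ".ERR", record.partition());

@Bean
public DeadLetterPublishingRecoverer dlqRecoverer(KafkaTemplate<Object, Object> template) {
    return new DeadLetterPublishingRecoverer(template, DESTINATION_RESOLVER);
}
```

Kafka topic names may contain letters, digits, '.', '_' and '-', up to 249 characters, so "_ERR" itself is a legal suffix; an InvalidTopicException therefore usually points at the full resolved name or at broker-side constraints rather than the suffix alone.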

Spring Kafka ChainedKafkaTransactionManager doesn't synchronize with JPA Spring-data transaction

Posted by 狂风中的少年 on 2021-02-08 07:36:40

Question: I have read a ton of Gary Russell's answers and posts, but didn't find an actual solution for the common use case of synchronizing the sequence below:

    receive from topic A => save to DB via Spring Data => send to topic B

As I understand it, there is no guarantee of fully atomic processing in this case, and I need to deal with message deduplication on the client side; but the main issue is that ChainedKafkaTransactionManager doesn't synchronize with JpaTransactionManager (see @KafkaListener
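A sketch of the usual wiring for this pattern, with all bean names illustrative: chaining the Kafka and JPA transaction managers gives best-effort one-phase commit, not true XA, so duplicates remain possible and client-side deduplication is still required, exactly as the question says. (ChainedKafkaTransactionManager was later deprecated in favor of running the send inside the listener's transaction.)

```java
// Sketch: commit the JPA write and the Kafka offsets/sends together (best effort).
@Bean
public ChainedKafkaTransactionManager<Object, Object> chainedTm(
        KafkaTransactionManager<Object, Object> kafkaTm,
        JpaTransactionManager jpaTm) {
    return new ChainedKafkaTransactionManager<>(kafkaTm, jpaTm);
}

@Bean
public ConcurrentKafkaListenerContainerFactory<Object, Object> listenerFactory(
        ConsumerFactory<Object, Object> cf,
        ChainedKafkaTransactionManager<Object, Object> chainedTm) {
    ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(cf);
    // the @KafkaListener then runs inside the chained transaction
    factory.getContainerProperties().setTransactionManager(chainedTm);
    return factory;
}
```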

How to verify a Spring Kafka producer has successfully sent a message?

Posted by 懵懂的女人 on 2021-02-08 07:29:32

Question: I am testing Spring Kafka (2.2.5.RELEASE). When the producer sends a message with KafkaTemplate, I would like to know whether that message was sent successfully or not, and based on that update the DB record for that message id. What is the best practice for handling this scenario? Here is the sample code which checks success or failure:

    ListenableFuture<SendResult<String, String>> future = kafkaTemplate.send("test_topic", key, userMsg);
    SendResult<String, String> result = null;
    try { result
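The truncated snippet blocks on the future; the usual non-blocking alternative is to register a callback and do the DB update in the success branch. A sketch completing the idea; `updateDbRecord` is a hypothetical placeholder for the asker's persistence call.

```java
// Sketch: async success/failure handling for a KafkaTemplate send.
ListenableFuture<SendResult<String, String>> future =
        kafkaTemplate.send("test_topic", key, userMsg);

future.addCallback(new ListenableFutureCallback<SendResult<String, String>>() {
    @Override
    public void onSuccess(SendResult<String, String> result) {
        // the broker acknowledged the record; record metadata is available
        updateDbRecord(key, "SENT", result.getRecordMetadata().offset());
    }

    @Override
    public void onFailure(Throwable ex) {
        // send failed after retries; mark the row for reprocessing
        updateDbRecord(key, "FAILED", -1);
    }
});
```

Blocking with `future.get(timeout, unit)` also works when the caller must know the outcome before returning, at the cost of throughput.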

How to capture the exception and message key when using ErrorHandlingDeserializer2 to handle exceptions during deserialization

Posted by 自作多情 on 2021-02-08 06:37:02

Question: I'm using Spring Boot 2.1.7.RELEASE and spring-kafka 2.2.8.RELEASE. I'm using the @KafkaListener annotation to create a consumer with all default consumer settings, and the configuration below as specified in the spring-kafka documentation:

    // other props
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
    props.put(ErrorHandlingDeserializer
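A sketch of the full pattern, assuming String keys and JSON values (the delegates are illustrative): ErrorHandlingDeserializer2 wraps the real deserializers, and a container-level error handler can then recover both the exception and the raw key/value bytes.

```java
// Sketch: delegate configuration for ErrorHandlingDeserializer2.
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
props.put(ErrorHandlingDeserializer2.KEY_DESERIALIZER_CLASS, StringDeserializer.class);
props.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);

// Container-level handler: the cause chain contains a DeserializationException,
// whose getData() returns the raw bytes of the key or value that failed.
@Bean
public ErrorHandler deserializationErrorHandler() {
    return (exception, record) -> {
        Throwable cause = exception.getCause();
        if (cause instanceof DeserializationException) {
            byte[] raw = ((DeserializationException) cause).getData();
            // log raw bytes together with record.topic()/partition()/offset() here
        }
    };
}
```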

Design Kafka consumers and producers for scalability

Posted by Deadly on 2021-02-07 10:50:30

Question: I want to design a solution for sending different kinds of e-mails through several providers. The general overview: I have several upstream providers (Sendgrid, Zoho, Mailgun, etc.) that will be used to send the e-mails. For example: an e-mail for registering a new user, an e-mail for removing a user, an e-mail for a space-quota limit (around 6 types of e-mails in total). Every type of e-mail should be generated by a producer, converted into a serialized Java object, and sent to the appropriate Kafka consumer
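One common layout for a design like this, sketched under assumptions not in the question (topic name, class and field names are all illustrative): a single topic keyed by e-mail type, so records of the same type land on the same partition and throughput scales by adding consumer instances to one group.

```java
// Sketch: a typed e-mail event, published keyed by its type.
public class EmailEvent implements Serializable {
    private final String type;       // e.g. "REGISTER_USER", "REMOVE_USER", "QUOTA_LIMIT"
    private final String recipient;
    private final String payload;

    public EmailEvent(String type, String recipient, String payload) {
        this.type = type;
        this.recipient = recipient;
        this.payload = payload;
    }

    public String getType() { return type; }
}

@Service
public class EmailEventProducer {
    private final KafkaTemplate<String, EmailEvent> kafkaTemplate;

    public EmailEventProducer(KafkaTemplate<String, EmailEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(EmailEvent event) {
        // keying by type gives per-type ordering; same type -> same partition
        kafkaTemplate.send("email-events", event.getType(), event);
    }
}
```

The alternative, one topic per e-mail type, isolates backlogs per type at the cost of more topics to manage; both are valid depending on ordering and retention needs.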

Spring Kafka listening to a topic regex

Posted by 跟風遠走 on 2021-02-07 07:24:10

Question: I am trying to listen to a newly created topic with the code below, but it is not working. Can you please tell me if the code is correct?

    public class KafkaMessageListener {
        private static final Logger LOGGER = LoggerFactory.getLogger(KafkaMessageListener.class);
        private final ProcessEventModel eventModel;

        @KafkaListener(topicPattern = "betsyncDataTopic*")
        public void receive(ConsumerRecord<String, String> consumerRecord) {
            LOGGER.info("received payload at '{}'", consumerRecord.timestamp());
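Two things commonly bite here. First, `topicPattern` takes a Java regular expression, so `betsyncDataTopic*` means "betsyncDataTopi" followed by zero or more `c` characters, not a prefix wildcard; `betsyncDataTopic.*` is the usual fix. Second, topics created after subscription are only picked up at the next metadata refresh (`metadata.max.age.ms`, 5 minutes by default). A quick standalone check of the regex point:

```java
import java.util.regex.Pattern;

public class TopicPatternCheck {
    public static void main(String[] args) {
        Pattern wrong = Pattern.compile("betsyncDataTopic*");   // 'c' zero or more times
        Pattern right = Pattern.compile("betsyncDataTopic.*");  // any suffix

        System.out.println(wrong.matcher("betsyncDataTopic-2021").matches()); // false
        System.out.println(right.matcher("betsyncDataTopic-2021").matches()); // true
    }
}
```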

Kafka producer threads: huge amount of threads even when no message is sent

Posted by 拟墨画扇 on 2021-02-05 12:04:49

Question: I profiled my Kafka producer Spring Boot application and found many "kafka-producer-network-thread"s running (47 in total), which never stop running, even when no data is being sent. My application looks a bit like this:

    var kafkaSender = KafkaSender(kafkaTemplate, applicationProperties)
    kafkaSender.sendToKafka(json, rs.getString("KEY"))

with the KafkaSender:

    @Service
    class KafkaSender(val kafkaTemplate: KafkaTemplate<String, String>, val applicationProperties:
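Each `kafka-producer-network-thread` belongs to one KafkaProducer instance, so dozens of idle threads usually mean dozens of producers were created (for example a new template or producer factory per send, or many cached transactional producers) and never closed. A sketch of the single-producer shape, in Java rather than the question's Kotlin; the topic name is a placeholder.

```java
// Sketch: one Spring-managed KafkaTemplate shares one underlying producer,
// which keeps the network-thread count at one for non-transactional use.
@Service
public class KafkaSender {
    private final KafkaTemplate<String, String> kafkaTemplate; // injected, shared

    public KafkaSender(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendToKafka(String json, String key) {
        kafkaTemplate.send("some-topic", key, json); // topic name is illustrative
    }
}
```

Constructing `KafkaSender` by hand on every call is harmless only if the injected template (and its producer factory) is the same shared instance; creating a new producer factory per call would leak one producer, and one network thread, per send.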