spring-kafka

How to pause and resume @KafkaListener using spring-kafka

倾然丶 夕夏残阳落幕 submitted on 2020-06-23 11:45:30
Question: I have implemented the Kafka consumer, and now I have a scenario:
1. Read data from the Kafka stream (2.2.5.RELEASE, via Spring Boot) and load it into database table1.
2. Copy the data from table1 to table2.
3. Clear table1.
To do the above I need to pause/resume the Kafka consumer around an already-written Quartz scheduling job, which copies data from table1 to table2. During this activity I want my Kafka listener to pause, and once the copy is done it should resume. My implementation:
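The question's own implementation is cut off above. As a minimal sketch of one way to do this, assuming spring-kafka 2.2.x (where MessageListenerContainer exposes pause() and resume()): look the container up by its listener id in the KafkaListenerEndpointRegistry from the scheduled job. The listener id "myListener", the cron expression, and the copy logic are hypothetical placeholders.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.listener.MessageListenerContainer;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class TableCopyJob {

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    // @Scheduled stands in for the existing Quartz trigger.
    @Scheduled(cron = "0 0 * * * *")
    public void copyTable1ToTable2() {
        // "myListener" must match the id on the @KafkaListener annotation.
        MessageListenerContainer container = registry.getListenerContainer("myListener");
        container.pause(); // takes effect at the next poll; in-flight records still complete
        try {
            // copy table1 -> table2, then clear table1 (omitted)
        } finally {
            container.resume();
        }
    }
}

Note that pause() is not instantaneous: a record already handed to the listener finishes processing before the container goes quiet, so the copy job should tolerate one trailing insert into table1.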

Kafka fails to keep track of last-committed offset

旧巷老猫 submitted on 2020-06-17 09:42:07
Question: Is there any known issue with the Kafka broker in managing offsets? The problem we are facing is that when we try to restart the kafka-consumer (i.e., an app restart), sometimes all the offsets are reset to 0. We are completely clueless as to why the consumers are not able to start from the last committed offset. We are now facing this issue in prod, where the whole queue of events is replayed again. Versions: spring-boot 2.2.6.RELEASE, spring-kafka 2.3.7.RELEASE, kafka-client 2.3.1, apache-kafka kafka_2.12
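One frequent cause of this symptom is a missing or expired committed offset combined with auto.offset.reset=earliest: with no committed offset to resume from, the group replays the topic from the start. A hedged sketch of the consumer settings worth pinning down; the bootstrap server and group id are placeholders:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;

public class ConsumerOffsetProps {

    public static Map<String, Object> consumerProps() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        // A stable group.id is essential: a new or changed group id has no
        // committed offsets, so auto.offset.reset decides where it starts.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-stable-group");
        // "earliest" replays the whole topic when no committed offset is found,
        // which matches the reported symptom; "latest" would skip instead.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        // Let the spring-kafka listener container commit offsets rather than
        // the client's auto-commit timer.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        return props;
    }
}

Broker-side, offsets.retention.minutes controls how long committed offsets of an idle group are retained; if the group stays down longer than that, the offsets are deleted and auto.offset.reset applies on the next start.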

KafkaListenerEndpointRegistry.start() throws NullPointerException

怎甘沉沦 submitted on 2020-06-17 09:41:06
Question: I have a requirement where I want to start the Kafka consumer manually. Code: class Dummy implements ConsumerSeekAware { @Autowired KafkaListenerEndpointRegistry registry; CountDownLatch latch; @Autowired ConcurrentKafkaListenerContainerFactory factory; onIdleEvent(){ latch.countDown() } @KafkaListener(id="myContainer", topics="mytopic", autoStartup="false") public void listen() {} @Scheduled(cron=" some time ") void do_some_consumption(){ latch = new CountDownLatch(1); this.registry
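The registry-based start itself is straightforward; a common cause of the NullPointerException here is that the lookup runs in an object Spring did not create (so @Autowired never populated the registry field) or before the context finished initializing, leaving the container lookup null. A hedged sketch with a null guard; the listener id "myContainer" matches the question, the rest is illustrative:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.listener.MessageListenerContainer;
import org.springframework.stereotype.Component;

@Component // must be a Spring-managed bean, or @Autowired does nothing
public class ManualStarter {

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    public void startListener() {
        MessageListenerContainer container = registry.getListenerContainer("myContainer");
        if (container == null) {
            // Null until the context is fully initialized, or if the id
            // does not match the @KafkaListener id.
            throw new IllegalStateException("Container 'myContainer' not registered yet");
        }
        if (!container.isRunning()) {
            container.start();
        }
    }
}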

Consuming again messages from kafka log compaction topic

北慕城南 submitted on 2020-06-08 12:44:10
Question: I have a Spring application with a Kafka consumer using a @KafkaListener annotation. The topic being consumed is log-compacted, and we might have the scenario where we must consume the topic messages again. What's the best way to achieve this programmatically? We don't control the Kafka topic configuration. Answer 1: @KafkaListener(...) public void listen(String in, @Header(KafkaHeaders.CONSUMER) Consumer<?, ?> consumer) { System.out.println(in); if (this.resetNeeded) { consumer.seekToBeginning
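The answer's snippet is cut off above; a hedged reconstruction of the idea it demonstrates follows. Injecting the Consumer via the KafkaHeaders.CONSUMER header is safe here because the listener runs on the consumer's own polling thread. The resetNeeded flag and topic name are placeholders:

import org.apache.kafka.clients.consumer.Consumer;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.stereotype.Component;

@Component
public class CompactedTopicListener {

    private volatile boolean resetNeeded = false; // set to true when a replay is requested

    @KafkaListener(id = "compacted", topics = "my-compacted-topic")
    public void listen(String in, @Header(KafkaHeaders.CONSUMER) Consumer<?, ?> consumer) {
        System.out.println(in);
        if (this.resetNeeded) {
            // Rewind every currently assigned partition to its beginning;
            // on a compacted topic this replays the retained latest-per-key records.
            consumer.seekToBeginning(consumer.assignment());
            this.resetNeeded = false;
        }
    }
}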

How can I effectively bind my @KafkaListener to ConcurrentKafkaListenerContainerFactory?

别等时光非礼了梦想. submitted on 2020-06-01 06:23:26
Question: I hit a scenario which appears strange to me. Basically, I have defined two @KafkaListener methods in one class: @KafkaListener(id = "listener1", idIsGroup = false, topics = "data1", containerFactory = "kafkaListenerContainerFactory") public void receive(){} @KafkaListener(id = "listener2", idIsGroup = false, topics = "data2", containerFactory = "kafkaListenerContainerFactory2") public void receive(){} Their id, topics, and containerFactory are different, and each of them relies on a different
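For reference, a hedged sketch of what the two-factory setup usually looks like: each ConcurrentKafkaListenerContainerFactory is declared as its own bean, and the containerFactory attribute of @KafkaListener names the bean to use. The bean names match the question; the consumer-factory wiring and concurrency value are illustrative:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

@Configuration
public class ListenerFactoryConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        return factory;
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory2(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        factory.setConcurrency(2); // e.g. a different concurrency per factory
        return factory;
    }
}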

How to mock result from KafkaTemplate

杀马特。学长 韩版系。学妹 submitted on 2020-05-28 04:33:05
Question: I have a method for sending a Kafka message like this: @Async public void sendMessage(String topicName, Message message) { ListenableFuture<SendResult<String, Message>> future = kafkaTemplate.send(topicName, message); future.addCallback(new ListenableFutureCallback<>() { @Override public void onSuccess(SendResult<String, Message> result) { //do nothing } @Override public void onFailure(Throwable ex) { log.error("something wrong happened!"); } }); } And now I am writing unit tests for it. I
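One hedged way to unit-test this: stub KafkaTemplate.send(...) with Mockito to return a pre-completed ListenableFuture, so each callback branch fires synchronously inside the test. MessageSender and Message are hypothetical stand-ins for the question's (unnamed) class and payload type:

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.util.concurrent.SettableListenableFuture;

class MessageSenderTest {

    @Test
    void successPathInvokesOnSuccess() {
        KafkaTemplate<String, Message> kafkaTemplate = mock(KafkaTemplate.class);
        SettableListenableFuture<SendResult<String, Message>> future = new SettableListenableFuture<>();
        future.set(mock(SendResult.class)); // completed future -> onSuccess fires on addCallback
        when(kafkaTemplate.send(anyString(), any(Message.class))).thenReturn(future);

        new MessageSender(kafkaTemplate).sendMessage("topic", new Message());
        // assert side effects of onSuccess here
    }

    @Test
    void failurePathInvokesOnFailure() {
        KafkaTemplate<String, Message> kafkaTemplate = mock(KafkaTemplate.class);
        SettableListenableFuture<SendResult<String, Message>> future = new SettableListenableFuture<>();
        future.setException(new RuntimeException("boom")); // failed future -> onFailure fires
        when(kafkaTemplate.send(anyString(), any(Message.class))).thenReturn(future);

        new MessageSender(kafkaTemplate).sendMessage("topic", new Message());
        // assert the error was logged/handled
    }
}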

Spring-Kafka vs. Spring-Cloud-Stream (Kafka)

大兔子大兔子 submitted on 2020-05-25 03:26:25
Question: Using Kafka as a messaging system in a microservice architecture, what are the benefits of using spring-kafka vs. spring-cloud-stream + spring-cloud-starter-stream-kafka? The Spring Cloud Stream framework supports more messaging systems and therefore has a more modular design. But what about the functionality? Is there a gap between the functionality of spring-kafka and spring-cloud-stream + spring-cloud-starter-stream-kafka? Which API is better designed? Looking forward to read about your

Exponential backoff with message order guarantee using spring-kafka

让人想犯罪 __ submitted on 2020-05-24 20:37:51
Question: I'm trying to implement a Spring Boot-based Kafka consumer that has some very strong message delivery guarantees, even in the case of an error:
- messages from a partition must be processed in order;
- if message processing fails, consumption of the particular partition should be suspended;
- the processing should be retried with a backoff, until it succeeds.
Our current implementation fulfills these requirements: @Bean public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer
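The question's factory bean is cut off above. For comparison, a hedged sketch of the same guarantees on spring-kafka 2.3+, where SeekToCurrentErrorHandler accepts a BackOff: on failure it re-seeks the unprocessed records from the last poll, so nothing is skipped and per-partition order is preserved, while the consumer thread waits out the backoff between attempts. The intervals and factory wiring are illustrative:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.SeekToCurrentErrorHandler;
import org.springframework.util.backoff.ExponentialBackOff;

@Configuration
public class RetryConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> retryingFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        ExponentialBackOff backOff = new ExponentialBackOff(1000L, 2.0); // 1s, 2s, 4s, ...
        backOff.setMaxElapsedTime(60_000L); // example cap; omit to retry indefinitely
        factory.setErrorHandler(new SeekToCurrentErrorHandler(backOff));
        return factory;
    }
}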

How to use “li-apache-kafka-clients” in spring boot app to send large message (above 1MB) from Kafka producer?

馋奶兔 submitted on 2020-05-17 08:07:43
Question: How to use li-apache-kafka-clients in a Spring Boot app to send large messages (above 1 MB) from a Kafka producer to a Kafka consumer? Below is the GitHub link of li-apache-kafka-clients: https://github.com/linkedin/li-apache-kafka-clients I have imported the .jar file of li-apache-kafka-clients and put the below configuration for the producer: props.put("large.message.enabled", "true"); props.put("max.message.segment.bytes", 1000 * 1024); props.put("segment.serializer", DefaultSegmentSerializer.class
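A hedged completion of the producer setup, based on the properties quoted above and the project's README; the class names and package paths (LiKafkaProducerImpl, DefaultSegmentSerializer) should be verified against the version you import, and the broker address and topic are placeholders:

import java.util.Properties;
import com.linkedin.kafka.clients.largemessage.DefaultSegmentSerializer;
import com.linkedin.kafka.clients.producer.LiKafkaProducer;
import com.linkedin.kafka.clients.producer.LiKafkaProducerImpl;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LargeMessageProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Large-message support: split each payload into segments below the broker limit.
        props.put("large.message.enabled", "true");
        props.put("max.message.segment.bytes", 1000 * 1024);
        props.put("segment.serializer", DefaultSegmentSerializer.class.getName());

        LiKafkaProducer<String, String> producer = new LiKafkaProducerImpl<>(props);
        producer.send(new ProducerRecord<>("big-topic", "key", "very large payload..."));
        producer.close();
    }
}

Note that the consuming side must also use the library's LiKafkaConsumer, since the segments are reassembled client-side; a plain KafkaConsumer would see the individual segments instead of the original message.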