Kafka messages getting lost when consumer goes down


Question


Hello, I am writing a Kafka consumer/producer using Spring Cloud Stream. Inside my consumer I save my data to a database; if the database goes down, I exit the application manually. After restarting the application, if the database is still down, the application stops again. Now, if I restart the application a third time, the messages received in the interval between the two failures are lost: the Kafka consumer picks up the latest message and also skips the message on which I exited the code.

Inbound and outbound channel binder interface

import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.Output;
import org.springframework.messaging.SubscribableChannel;

public interface EventChannel {

    String inputEvent = "inputChannel";
    String outputEvent = "outputChannel";

    // Inbound channel: events consumed from the topic
    @Input(inputEvent)
    SubscribableChannel consumeEvent();

    // Outbound channel: events published to the topic
    @Output(outputEvent)
    SubscribableChannel produceEvent();
}

Service Class -

1) Producer Service

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;
import org.springframework.util.MimeTypeUtils;

@Service
@EnableBinding(EventChannel.class)
public class EventProducerService {

    private final EventChannel eventChannel;

    @Autowired
    public EventProducerService(EventChannel eventChannel) {
        this.eventChannel = eventChannel;
    }

    public void postEvent(EventDTO event) {
        MessageChannel messageChannel = eventChannel.produceEvent();
        messageChannel.send(MessageBuilder
                .withPayload(event)
                .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
                .setHeader("partitionKey", event.getId().toString())
                .build());
    }
}

2) Consumer Service

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Component;

@Component
@EnableBinding(EventChannel.class)
public class EventConsumerService {

    private final ApplicationContext applicationContext;
    private final EventChannel eventChannel;
    private final EventRepository eventRepository; // repository used in the original snippet

    @Autowired
    public EventConsumerService(ApplicationContext applicationContext,
                                EventChannel eventChannel,
                                EventRepository eventRepository) {
        this.applicationContext = applicationContext;
        this.eventChannel = eventChannel;
        this.eventRepository = eventRepository;
    }

    @StreamListener(EventChannel.inputEvent)
    public void saveUpdateCassandra(EventDTO event) {
        Event newEvent = new Event(event);
        try {
            eventRepository.save(newEvent);
        } catch (Exception e) {
            e.printStackTrace();
            // Shut the application down when the database is unavailable
            SpringApplication.exit(applicationContext, () -> 0);
        }
    }
}

Application properties file

#Spring Cloud Streams Configuration
##Broker
spring.cloud.stream.kafka.binder.brokers=localhost:9092
##EventIngestion 
spring.cloud.stream.bindings.outputChannel.destination=Event
spring.cloud.stream.bindings.outputChannel.producer.partitionKeyExpression=headers.partitionKey
spring.cloud.stream.bindings.inputChannel.destination=Event
spring.cloud.stream.bindings.inputChannel.group=event-consumer-1
spring.cloud.stream.kafka.bindings.inputChannel.consumer.startOffset=earliest

Both applications run independently, so if my database goes down the consumer stops, and on consecutive failures messages are lost.


Answer 1:


First, I am not sure what your expectation is for SpringApplication.exit(applicationContext, () -> 0);, but you are essentially bringing down the entire application and everything that may be running in it. Second, your message loss is due to the fact that the Kafka binder has absolutely no knowledge that an exception occurred and that it has to put the message back on the topic. In fact, from the binder's perspective, and because of your code, every message is always processed successfully. So. . .

Please remove the try/catch from your StreamListener method and let the exception propagate, thus letting the binder know that there was an error.
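For illustration, here is a minimal sketch of the listener with the try/catch removed, so a failed database write propagates to the binder (the eventRepository field is assumed to be injected as in the question's snippet):

@StreamListener(EventChannel.inputEvent)
public void saveUpdateCassandra(EventDTO event) {
    // No try/catch: if the save fails, the exception reaches the binder,
    // which can retry the message instead of treating it as processed.
    eventRepository.save(new Event(event));
}

With the exception propagating, the binder's retry behaviour applies (the maxAttempts consumer binding property, e.g. spring.cloud.stream.bindings.inputChannel.consumer.maxAttempts, defaults to 3), and a dead-letter topic can be enabled with spring.cloud.stream.kafka.bindings.inputChannel.consumer.enableDlq if you need to keep messages that still fail after the retries are exhausted.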



Source: https://stackoverflow.com/questions/54753093/kafka-messages-getting-lost-when-consumer-goes-down
