Spring Kafka properties not auto-loaded when writing a custom ConsumerFactory and custom KafkaListenerContainerFactory


Question


I want my spring-kafka properties to be loaded from application.properties using Spring Boot auto-configuration. My problem is:

Caused by: java.lang.IllegalStateException: No Acknowledgment available as an argument, the listener container must have a MANUAL AckMode to populate the Acknowledgment

I have already set spring.kafka.listener.ack-mode=manual-immediate in my properties file, but because I use my own custom fooKafkaListenerContainerFactory, that setting is not picked up. What I want is for it to be applied from application.properties without my having to set it manually on the factory. @Gary Russell, your help is appreciated.
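
For reference, spring.kafka.listener.ack-mode=manual-immediate corresponds to the following container setting (illustration only; Boot's auto-configuration applies it to its own kafkaListenerContainerFactory via the configurer, but nothing applies it to the custom factory shown below):

// "factory" stands for any ConcurrentKafkaListenerContainerFactory; AckMode is spring-kafka's ContainerProperties.AckMode
factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);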

My code looks like below

package com.foo;

import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.autoconfigure.kafka.ConcurrentKafkaListenerContainerFactoryConfigurer;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import com.foo.FooKafkaDTO;

@Configuration
public class KafkaConsumerConfig {

    @Autowired
    private KafkaProperties kafkaProperties;

    @Bean
    @ConditionalOnMissingBean(ConsumerFactory.class)
    public ConsumerFactory<?, ?> kafkaConsumerFactory() {

        return new DefaultKafkaConsumerFactory<>(kafkaProperties.buildConsumerProperties());
    }

    @Bean
    @ConditionalOnMissingBean(name = "kafkaListenerContainerFactory")
    public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory) {

        ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<Object, Object>();
        configurer.configure(factory, kafkaConsumerFactory);
        return factory;
    }

    @Bean
    public ConsumerFactory<String, FooKafkaDTO> fooConsumerFactory() {

        return new DefaultKafkaConsumerFactory<>(
                kafkaProperties.buildConsumerProperties(), new StringDeserializer(), new JsonDeserializer<>(FooKafkaDTO.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, FooKafkaDTO> fooKafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<String, FooKafkaDTO> fooConsumerFactory) {

        ConcurrentKafkaListenerContainerFactory<String, FooKafkaDTO> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(fooConsumerFactory());
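        // Note: configurer is injected but never used here, so the spring.kafka.listener.*
        // settings (including ack-mode=manual-immediate) are never applied to this factory.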
        return factory;
    }
}


Here are my properties 

spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.listener.ack-mode=manual-immediate
spring.kafka.consumer.group-id=group_id
spring.kafka.consumer.auto-offset-reset=latest
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.consumer.key-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer


Here is my listener

@Service
public class Consumer {

    private static final Log LOG = LogFactory.getLog(Consumer.class);

    @KafkaListener(
            topicPartitions = {@TopicPartition(topic = "outbox.foo",
                    partitionOffsets = @PartitionOffset(partition = "0", initialOffset = "0"))},
            groupId = "group_id",
            containerFactory = "fooKafkaListenerContainerFactory")
    public void consume(@Payload FooKafkaDTO fooKafkaDTO, Acknowledgment acknowledgment,
            @Headers MessageHeaders headers) {

        LOG.info("offset:::" + Long.valueOf(headers.get(KafkaHeaders.OFFSET).toString()));
        LOG.info(String.format("$$ -> Consumed Message -> %s", fooKafkaDTO));
        acknowledgment.acknowledge();

    }
}

Answer 1:


After going through the official spring-kafka documentation, I found an approach that replaces the whole boilerplate. I have simplified my KafkaConsumerConfig class; it now looks like this:

package com.foo;

import java.util.Map;

import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import com.foo.FooKafkaDTO;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public DefaultKafkaConsumerFactory fooDTOConsumerFactory(KafkaProperties properties) {

        Map<String, Object> props = properties.buildConsumerProperties();
        return new DefaultKafkaConsumerFactory(props,
                new JsonDeserializer<>(String.class)
                        .forKeys()
                        .ignoreTypeHeaders(),
                new JsonDeserializer<>(FooKafkaDTO.class)
                        .ignoreTypeHeaders());

    }
}
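
Since the simplified config no longer defines fooKafkaListenerContainerFactory, the listener relies on Boot's auto-configured kafkaListenerContainerFactory, which does apply spring.kafka.listener.ack-mode. A minimal sketch of how the listener from the question could then look (topic, group id and DTO reused from the question; this is an illustration, not code from the original answer):

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Service;

import com.foo.FooKafkaDTO;

@Service
public class Consumer {

    private static final Log LOG = LogFactory.getLog(Consumer.class);

    // No containerFactory attribute: Boot's auto-configured factory is used,
    // so ack-mode=manual-immediate from application.properties takes effect.
    @KafkaListener(topics = "outbox.foo", groupId = "group_id")
    public void consume(FooKafkaDTO fooKafkaDTO, Acknowledgment acknowledgment) {
        LOG.info(String.format("Consumed message -> %s", fooKafkaDTO));
        acknowledgment.acknowledge();
    }
}

Alternatively, if a dedicated factory is still needed, passing it through Boot's ConcurrentKafkaListenerContainerFactoryConfigurer (as the kafkaListenerContainerFactory bean in the question already does) is what carries the spring.kafka.listener.* settings across.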


Source: https://stackoverflow.com/questions/60841391/spring-kafka-properties-not-auto-loaded-when-writing-customconsumerfactory-and-c
