Spring Kafka Producer not sending to Kafka 1.0.0 (Magic v1 does not support record headers)

Backend · Open · 4 answers · 2187 views

青春惊慌失措 2021-02-14 04:08

I am using this docker-compose setup for setting up Kafka locally: https://github.com/wurstmeister/kafka-docker/

docker-compose up works fine, creating topics…

4 Answers
  •  长情又很酷
    2021-02-14 04:24

    I had a similar issue. Spring Kafka's JsonSerializer (and JsonSerde) adds type-information headers to every record by default, and message format v1 (pre-0.11 brokers, or topics configured with an older log.message.format.version) does not support record headers, which is exactly what the error complains about. To prevent the issue, disable the type-info headers.

    If you are fine with the default JSON serialization, use the following (the key point is ADD_TYPE_INFO_HEADERS):

    Map<String, Object> props = new HashMap<>(defaultSettings);
    // Key point: suppress the __TypeId__ header that JsonSerializer adds by default
    props.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    ProducerFactory<String, Object> producerFactory = new DefaultKafkaProducerFactory<>(props);
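
    If it helps, here is a minimal usage sketch of that factory; MyEvent is a hypothetical payload class and "my-topic" a placeholder topic name:

    // Records sent through this template carry no __TypeId__ header,
    // so a topic still on message format v1 accepts them.
    KafkaTemplate<String, Object> template = new KafkaTemplate<>(producerFactory);
    template.send("my-topic", new MyEvent("example"));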
    

    But if you need a custom JsonSerializer with a specific ObjectMapper (say, one with PropertyNamingStrategy.SNAKE_CASE), you have to disable the type-info headers on the JsonSerializer instance itself, because Spring Kafka ignores DefaultKafkaProducerFactory's ADD_TYPE_INFO_HEADERS property when you pass a serializer instance in (a questionable design decision, in my opinion):

    JsonSerializer<Object> valueSerializer = new JsonSerializer<>(customObjectMapper);
    // Must be disabled on the serializer instance itself
    valueSerializer.setAddTypeInfo(false);
    ProducerFactory<String, Object> producerFactory = new DefaultKafkaProducerFactory<>(props, Serdes.String().serializer(), valueSerializer);
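
    Note that once type-info headers are disabled, a consuming JsonDeserializer can no longer discover the target class from the headers, so it needs a default type. A minimal sketch, assuming a hypothetical MyEvent payload class (constant names as in recent Spring Kafka versions):

    // Consumer side: there is no __TypeId__ header to read, so declare the target type
    Map<String, Object> consumerProps = new HashMap<>();
    consumerProps.put(JsonDeserializer.VALUE_DEFAULT_TYPE, MyEvent.class);
    consumerProps.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, false);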
    
    
    

    Or, if you use JsonSerde (e.g. with Kafka Streams):

    Map<String, Object> jsonSerdeProperties = new HashMap<>();
    jsonSerdeProperties.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
    JsonSerde<MyEvent> jsonSerde = new JsonSerde<>(serdeClass); // serdeClass = Class<MyEvent>, your payload type
    jsonSerde.configure(jsonSerdeProperties, false);            // false = configure as a value serde, not a key serde
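
    For example, the configured serde can then be plugged into a Kafka Streams topology (topic names and the MyEvent type are placeholders):

    // Read and write JSON values without the __TypeId__ header
    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, MyEvent> stream =
            builder.stream("input-topic", Consumed.with(Serdes.String(), jsonSerde));
    stream.to("output-topic", Produced.with(Serdes.String(), jsonSerde));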
    
