spring-kafka

How to use Spring-Kafka to read AVRO message with Confluent Schema registry?

浪尽此生 · submitted on 2020-12-29 13:14:30

Question: How to use Spring-Kafka to read an AVRO message with the Confluent Schema Registry? Is there any sample? I can't find one in the official reference documentation.

Answer 1: The code below can read messages from the customer-avro topic. Here's the AVRO schema I have defined for the value:

```json
{
  "type": "record",
  "namespace": "com.example",
  "name": "Customer",
  "version": "1",
  "fields": [
    { "name": "first_name", "type": "string", "doc": "First Name of Customer" },
    { "name": "last_name", "type": "string", "doc": "Last Name of Customer" }
  ]
}
```
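A minimal sketch of a consumer configuration for this schema, assuming the Confluent Avro deserializer (io.confluent:kafka-avro-serializer) is on the classpath, that Customer is the class generated from the schema above, and that the broker/registry addresses, group id, and topic wiring below are placeholders:

```java
import java.util.HashMap;
import java.util.Map;

import com.example.Customer; // Avro-generated from the schema above
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@EnableKafka
@Configuration
public class CustomerAvroConsumerConfig {

    @Bean
    public ConsumerFactory<String, Customer> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "customer-group");           // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // The Confluent deserializer looks up the writer schema in the registry by ID.
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put("schema.registry.url", "http://localhost:8081");             // placeholder
        // Deserialize into the generated Customer class rather than a GenericRecord.
        props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Customer> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Customer> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

    @KafkaListener(topics = "customer-avro")
    public void listen(Customer customer) {
        System.out.println(customer.getFirstName() + " " + customer.getLastName());
    }
}
```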

KafkaException: class is not an instance of org.apache.kafka.common.serialization.Deserializer

那年仲夏 · submitted on 2020-12-15 07:17:30

Question: I want to implement a Kafka producer which sends and receives Java serialized objects. I tried this producer:

```java
@Configuration
public class KafkaProducerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, SaleRequestFactory> saleRequestFactoryProducerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY…
```
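The exception in the title typically means the class configured under a (de)serializer property does not implement org.apache.kafka.common.serialization.Deserializer, for example a Serializer accidentally wired into the consumer config. A hedged sketch of a matching pair based on plain Java serialization (SaleRequestFactory comes from the question and would need to implement java.io.Serializable; the class names here are otherwise illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

// Producer side: set via ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG.
class SaleRequestFactorySerializer implements Serializer<SaleRequestFactory> {
    @Override
    public byte[] serialize(String topic, SaleRequestFactory data) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(data);
            oos.flush();
            return bos.toByteArray();
        } catch (IOException e) {
            throw new SerializationException("Failed to serialize for " + topic, e);
        }
    }
}

// Consumer side: set via ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG.
// Wiring this class (not the Serializer above) into the consumer is what
// avoids the "class is not an instance of ...Deserializer" error.
class SaleRequestFactoryDeserializer implements Deserializer<SaleRequestFactory> {
    @Override
    public SaleRequestFactory deserialize(String topic, byte[] data) {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return (SaleRequestFactory) ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new SerializationException("Failed to deserialize from " + topic, e);
        }
    }
}
```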

How to test Kafka Streams applications with Spring Kafka?

喜欢而已 · submitted on 2020-12-07 05:16:51

Question: I am writing a streaming application with Kafka Streams, Spring-Kafka and Spring Boot. I cannot find any information on how to properly test stream processing done with the Kafka Streams DSL while using Spring-Kafka. The documentation mentions EmbeddedKafkaBroker, but there seems to be no information on how to handle testing of, for example, state stores. To give a simple example of what I would like to test, I have the following bean registered (where Item is Avro-generated): @Bean public KTable…
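One widely used approach that sidesteps the embedded broker entirely is TopologyTestDriver from kafka-streams-test-utils, which runs the topology in-process and exposes its state stores directly. A rough sketch, with String serdes standing in for the Avro ones and the "items" topic and "item-store" store names invented for illustration:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

public class ItemTopologyTest {

    public void materializesIntoStateStore() {
        // Build the same topology the Spring bean would produce:
        // a KTable materialized into a named state store.
        StreamsBuilder builder = new StreamsBuilder();
        builder.table("items", Materialized
                .<String, String, KeyValueStore<Bytes, byte[]>>as("item-store")
                .withKeySerde(Serdes.String())
                .withValueSerde(Serdes.String()));
        Topology topology = builder.build();

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted

        try (TopologyTestDriver driver = new TopologyTestDriver(topology, props)) {
            TestInputTopic<String, String> input = driver.createInputTopic(
                    "items", new StringSerializer(), new StringSerializer());
            input.pipeInput("key1", "value1");

            // The state store is queryable in-process, no broker needed.
            KeyValueStore<String, String> store = driver.getKeyValueStore("item-store");
            assert "value1".equals(store.get("key1"));
        }
    }
}
```

If the topology is built by a Spring bean method taking a StreamsBuilder, that same method can be invoked directly in the test so production and test exercise one topology.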

Spring Kafka: JsonDeserializer doesn't pick up TRUSTED_PACKAGE config

生来就可爱ヽ(ⅴ<●) · submitted on 2020-12-06 15:48:49

Question: I just want to check whether this is known behavior or whether I'm doing something wrong. I am configuring a producer and consumer with custom type mapping using JsonDeserializer. The consumer fails with:

org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition ticket-1 at offset 1. If needed, please seek past the record to continue consumption.
Caused by: java.lang.IllegalArgumentException: The class 'createTicket' is not in the trusted packages: [java.util, java.lang]. If…
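A likely explanation (an assumption, since the configuration code is cut off above): the TRUSTED_PACKAGES entry in the consumer property map is only applied when Kafka itself instantiates the deserializer and calls configure() on it; a JsonDeserializer instance passed directly to DefaultKafkaConsumerFactory never sees those properties. Two sketches that do take effect (com.example.tickets is a placeholder package):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

public class TrustedPackagesConfig {

    // Option 1: configure the deserializer instance explicitly before
    // handing it to the factory; map properties are ignored for instances.
    DefaultKafkaConsumerFactory<String, Object> viaInstance(Map<String, Object> props) {
        JsonDeserializer<Object> valueDeserializer = new JsonDeserializer<>();
        valueDeserializer.addTrustedPackages("com.example.tickets"); // placeholder
        return new DefaultKafkaConsumerFactory<>(
                props, new StringDeserializer(), valueDeserializer);
    }

    // Option 2: give Kafka only the class name, so it instantiates the
    // deserializer itself and calls configure() with these properties.
    DefaultKafkaConsumerFactory<String, Object> viaProperties(String bootstrapServers) {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "com.example.tickets"); // placeholder
        return new DefaultKafkaConsumerFactory<>(props);
    }
}
```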

KafkaConsumer.commitAsync() behavior with a lower offset than previous

痞子三分冷 · submitted on 2020-12-06 12:56:36

Question: How will Kafka deal with a call to KafkaConsumer.commitAsync(Map<TopicPartition, OffsetAndMetadata> offsets, OffsetCommitCallback callback) when the offset value given for a topic is lower than the one in a previous invocation?

Answer 1: It will simply set the committed offset of the partition to the value you specified, so after the next restart or rebalance the group resumes consuming from that committed offset. The javadoc of commitAsync() says: "The committed offset should be the next message your application will consume, i.e. lastProcessedMessageOffset + 1."
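A small sketch of such a "rewind" commit (topic, partition, and offset values are made up); note that per the javadoc the committed offset is the next record the group will consume:

```java
import java.util.Collections;
import java.util.Map;

import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class CommitLowerOffset {

    static void rewindCommit(KafkaConsumer<String, String> consumer) {
        TopicPartition tp = new TopicPartition("orders", 0); // hypothetical topic/partition

        // Suppose offset 100 was committed earlier; committing 50 simply overwrites it.
        Map<TopicPartition, OffsetAndMetadata> offsets =
                Collections.singletonMap(tp, new OffsetAndMetadata(50L));

        consumer.commitAsync(offsets, (committed, exception) -> {
            if (exception != null) {
                exception.printStackTrace();
            }
            // After a restart or rebalance this group resumes at offset 50
            // (the committed offset is the NEXT record to consume), so
            // records 50..99 are effectively re-read.
        });
    }
}
```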

How do I configure Spring Kafka Listener for a specific topic using the factory?

与世无争的帅哥 · submitted on 2020-12-02 00:20:37

Question: I want to be able to read the topics in through properties without specifying anything on the @KafkaListener annotation, and I am not using Spring Boot. I tried having the topics read straight from the properties object via a "topics" key, which gives an error:

IllegalStateException: topics, topicPattern, or topicPartitions must be provided

```java
// some class
@KafkaListener
public void listener(List<String> messages) {
    System.out.print(messages);
}

// some other class
@Bean
public ConsumerFactory<String,…
```
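Two approaches commonly suggested for this, sketched under the assumption that a kafka.topics property exists (the property name and bean wiring are illustrative): resolve the topics with a placeholder on the annotation, or skip @KafkaListener entirely and build the container from the factory.

```java
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import org.springframework.kafka.listener.MessageListener;

public class TopicsFromPropertiesExamples {

    // Option 1: keep @KafkaListener but resolve the topics from a property;
    // comma-separated values are split into multiple topic names.
    @KafkaListener(topics = "${kafka.topics}")
    public void listener(String message) {
        System.out.println(message);
    }

    // Option 2: no annotation at all; create the container from the factory
    // (requires a group.id in the consumer factory config or on the
    // container properties).
    @Bean
    public ConcurrentMessageListenerContainer<String, String> container(
            ConcurrentKafkaListenerContainerFactory<String, String> factory,
            @Value("${kafka.topics}") String[] topics) {
        ConcurrentMessageListenerContainer<String, String> container =
                factory.createContainer(topics);
        container.getContainerProperties().setMessageListener(
                (MessageListener<String, String>) record ->
                        System.out.println(record.value()));
        return container;
    }
}
```

Returning the container as a bean lets Spring manage its lifecycle, so it starts with the application context just as an annotated listener would.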