spring-kafka

Failed to convert from JSON; Unexpected character ('f' (code 102)): Expected space separating root-level values at (String) "5f19a7e99933db43cb23e83d"

Submitted by 和自甴很熟 on 2020-07-30 07:39:28
Question:

@KafkaListener(id = ProductTopicConstants.GET_PRODUCT, topics = ProductTopicConstants.GET_PRODUCT)
@SendTo
public Product GetProduct(String id) {
    return _productRepository.findByid(id);
}

Kafka configuration:

@Configuration
public class KafkaConfiguration {

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
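A raw hex id like "5f19a7e99933db43cb23e83d" is not valid JSON (it is an unquoted token), so a JSON-based value deserializer fails on it with exactly this Jackson error. Since the listener takes a plain String id, one option is to deserialize the request value as a plain string; alternatively, the requesting side can send the id as a quoted JSON string. A minimal consumer-factory sketch for the first option, with an assumed broker address and group id:

@Bean
public ConsumerFactory<String, String> consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "product-service");         // illustrative group id
    // Treat both key and value as plain strings, so the raw id is never parsed as JSON.
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    return new DefaultKafkaConsumerFactory<>(props);
}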

KStream to KTable Inner Join producing different number of records every time processed with same data

Submitted by 谁说我不能喝 on 2020-07-23 06:07:21
Question: I want to do a KStream-to-KTable join, using the KTable as just a lookup table. The steps below show the sequence in which the code is executed:

1. Construct the KTable
2. Re-key the KTable
3. Construct the KStream
4. Re-key the KStream
5. Join the KStream with the KTable

Let's say there are 8000 records in the KStream and 14 records in the KTable, and assume that for each key in the KStream there is a record in the KTable, so the expected output would be 8000 records. Every time I do the join for the first time, or when I start the application, the expected output is 8000
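For reference, a minimal sketch of the topology described above, assuming String default serdes, assumed topic names, and Kafka Streams 2.5+ for toTable(). Note that a KStream-KTable join is time-dependent: each stream record is joined against whatever the table contains at the moment the record is processed, so stream records processed before the table side has been fully loaded find no match and are silently dropped by an inner join, which is a common cause of varying output counts on startup.

StreamsBuilder builder = new StreamsBuilder();

// Lookup table, re-keyed by the join attribute.
KTable<String, String> rekeyedTable = builder.<String, String>table("lookup-topic")
        .toStream()
        .selectKey((k, v) -> v.split(",")[0]) // illustrative: derive the join key from the value
        .toTable();

// Event stream, re-keyed the same way (this triggers a repartition).
KStream<String, String> rekeyedStream = builder.<String, String>stream("events-topic")
        .selectKey((k, v) -> v.split(",")[0]);

// Inner join: stream records with no matching table entry are dropped.
rekeyedStream.join(rekeyedTable, (event, lookup) -> event + "|" + lookup)
        .to("joined-topic");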

How to pause a specific kafka consumer thread when concurrency is set to more than 1?

Submitted by 纵然是瞬间 on 2020-06-29 06:44:31
Question: I am using spring-kafka 2.2.8 with concurrency set to 2, as shown below, and I am trying to understand how to pause a consumer thread/instance when a particular condition is met.

@KafkaListener(id = "myConsumerId", topics = "myTopic", concurrency = "2")
public void listen(String in) {
    System.out.println(in);
}

Now I have two questions. Would my consumer spawn two different poll threads to poll the records? And given that I am setting an id on the consumer as shown above, how can I pause a specific consumer thread
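Two options, sketched below under the assumption of spring-kafka 2.2.x. Pausing through the KafkaListenerEndpointRegistry pauses the whole container (both concurrent consumers); to pause only the consumer that received a particular record, the listener can take the Consumer as a parameter and pause its own assignment. The registry lookup by id and the Consumer parameter are standard spring-kafka features; shouldPause is a hypothetical condition check.

@Autowired
private KafkaListenerEndpointRegistry registry;

// Option 1: pause/resume the whole container, i.e. all concurrent consumers.
public void pauseContainer() {
    registry.getListenerContainer("myConsumerId").pause();
}

// Option 2: pause only the consumer instance that polled this record.
@KafkaListener(id = "myConsumerId", topics = "myTopic", concurrency = "2")
public void listen(String in, Consumer<?, ?> consumer) {
    if (shouldPause(in)) {                     // hypothetical condition
        consumer.pause(consumer.assignment()); // stays paused until resumed or rebalanced
    }
}

On the first question: yes, concurrency = 2 creates two child KafkaMessageListenerContainer instances, each with its own consumer and its own polling thread.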

How to find no more messages in kafka topic/partition & reading only after writing to topic is done

Submitted by 江枫思渺然 on 2020-06-29 05:38:28
Question: I'm using Spring Boot 1.5.4.RELEASE and Spring for Apache Kafka 1.3.8.RELEASE. Two general questions: Is there a way for a consumer to find out that there are no more messages in a topic/partition? And how can a consumer start consuming messages from a topic only after the producer has finished writing?

Answer 1: Spring Boot 1.5 is end of life and no longer supported; the current version is 2.2.5. The latest 1.3.x version of Spring for Apache Kafka is 1.3.10. It will only be supported through the end of this year.
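On the first question: Kafka itself has no "end of messages" marker, but spring-kafka can publish an application event when the consumer has been idle for a configured interval, which is the usual approximation for "no more messages right now". A minimal sketch, assuming factory is your listener container factory bean; the interval value is illustrative:

// Publish a ListenerContainerIdleEvent if no records arrive for 30 seconds.
factory.getContainerProperties().setIdleEventInterval(30_000L);

@EventListener
public void onIdle(ListenerContainerIdleEvent event) {
    // No records were received within the interval; treat the topic as drained for now.
    System.out.println("Consumer idle: " + event.getListenerId());
}

The second question can be handled the same way: keep the listener container stopped (autoStartup = false) and start it through the registry once the producer signals completion, e.g. with a marker record or an external flag.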

ReplyingKafkaTemplate / KafkaTemplate not sending / receiving key

Submitted by 邮差的信 on 2020-06-28 14:24:27
Question: I have a template that looks like this:

@Autowired
private ReplyingKafkaTemplate<ItemId, MessageBDto, MessageBDto> xxx2ReplyingKafkaTemplate;

My send wrapper method looks like this:

public RequestReplyFuture<ItemId, MessageBDto, MessageBDto> sendAndReceiveMessageB(MessageBDto message) {
    ProducerRecord<ItemId, MessageBDto> producerRecord =
            new ProducerRecord<>(KafkaTopicConfig.xxx2_TOPIC, new ItemId(message.getCount()), message);
    producerRecord.headers().add(new RecordHeader(KafkaHeaders.REPLY
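Some things worth checking here, offered as a hedged sketch rather than a confirmed diagnosis: the producer factory behind the template needs a key serializer capable of writing ItemId (a default StringSerializer would not), the reply container needs a matching key deserializer, and when the server side replies via @SendTo, the reply record's key is not copied from the request automatically. A configuration sketch following the standard ReplyingKafkaTemplate wiring, with an assumed reply topic and group id:

@Bean
public ReplyingKafkaTemplate<ItemId, MessageBDto, MessageBDto> replyingTemplate(
        ProducerFactory<ItemId, MessageBDto> pf, // must be configured with a key serializer for ItemId
        ConcurrentMessageListenerContainer<ItemId, MessageBDto> repliesContainer) {
    return new ReplyingKafkaTemplate<>(pf, repliesContainer);
}

@Bean
public ConcurrentMessageListenerContainer<ItemId, MessageBDto> repliesContainer(
        ConcurrentKafkaListenerContainerFactory<ItemId, MessageBDto> factory) {
    // The factory's consumer config needs a key deserializer for ItemId (e.g. a JSON one).
    ConcurrentMessageListenerContainer<ItemId, MessageBDto> container =
            factory.createContainer("xxx2-replies");                     // assumed reply topic
    container.getContainerProperties().setGroupId("xxx2-replies-group"); // illustrative
    container.setAutoStartup(false);
    return container;
}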

Invalid UTF-8 middle byte 0x72

Submitted by 杀马特。学长 韩版系。学妹 on 2020-06-28 07:36:47
Question: I am using JsonSerializer and JsonDeserializer in spring-kafka to set the value serializer while producing a message. The message has one field (orgName) with a special character in it (a German umlaut). How do I handle this special character? I know JsonDeserializer uses Jackson, and Jackson supports UTF-8. The JsonDeserializer throws this error because of it:

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Invalid UTF-8 middle byte 0x72 at [Source: [B@403d4534; line: 1, column:
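This exception usually means the bytes on the topic are not UTF-8 at all: for example, the value was produced as a String in a platform charset such as ISO-8859-1, where 'ü' is the single byte 0xFC, so a following 'r' (0x72) looks like an invalid UTF-8 continuation byte to JsonDeserializer. The umlaut itself is not the problem; mismatched (de)serializers are. A sketch of matched JSON configuration on both sides, which keeps the payload UTF-8 end to end (the trusted package name is an assumption):

// Producer config map: JsonSerializer always writes UTF-8 JSON, umlauts included.
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

// Consumer config map: JsonDeserializer reads the same UTF-8 bytes back.
consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
consumerProps.put(JsonDeserializer.TRUSTED_PACKAGES, "com.example.model"); // assumed package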

Multiple KafkaConsumer on multiple Kafka Cluster in Spring Boot

Submitted by 耗尽温柔 on 2020-06-28 06:27:08
Question: I want to create homogeneous Kafka consumers on different clusters from a Spring Boot application using spring-kafka, i.e., create a Kafka consumer object for an already-defined class that listens to multiple, dynamically defined clusters. For example: say a Spring Boot application S contains the template for the Kafka consumer, and there are three Kafka clusters, cluster1, cluster2, and cluster3. Application S acts as an aggregator of the data produced by each cluster. Here the solution
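A common approach, sketched under the assumption that the cluster addresses are known from configuration: define one consumer factory and one listener container factory per cluster, differing only in bootstrap.servers, and point each @KafkaListener at the matching factory. Broker addresses and names below are illustrative.

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> cluster1Factory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "cluster1:9092"); // assumed address
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "aggregator-s");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(new DefaultKafkaConsumerFactory<>(props));
    return factory;
}

// Repeat (or parameterize) for cluster2 and cluster3, then select per listener:
@KafkaListener(topics = "data-topic", containerFactory = "cluster1Factory")
public void fromCluster1(String in) {
    // aggregate records from cluster1 here
}

For clusters that only become known at runtime, the same pieces can be created programmatically and registered with the KafkaListenerEndpointRegistry instead of being declared as static beans.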

How to pause and resume @KafkaListener using spring-kafka

Submitted by 北城余情 on 2020-06-23 11:45:46
Question: I have implemented the Kafka consumer, and now I have the following scenario:

1. Read data from the Kafka stream (2.2.5.RELEASE) via Spring Boot
2. Load it into database table1
3. Copy the data from table1 to table2
4. Clear table1

To do the above, I need to pause/resume the Kafka consumer around an already-written Quartz scheduling job, which copies the data from table1 to table2. During this activity I want my Kafka listener to pause, and once the copy is done, it should resume. My implementation:
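A minimal sketch of the pause/resume part, using the listener id and the KafkaListenerEndpointRegistry, which is the standard spring-kafka mechanism for this; the listener id and the Quartz job body are illustrative:

@Autowired
private KafkaListenerEndpointRegistry registry;

// Quartz job body (illustrative): pause the listener, copy, then resume.
public void copyTable1ToTable2() {
    MessageListenerContainer container = registry.getListenerContainer("myListenerId"); // assumed id
    container.pause(); // takes effect after the current poll cycle completes
    try {
        // copy table1 -> table2, then clear table1
    } finally {
        container.resume();
    }
}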