spring-cloud-stream

Spring Cloud Kafka: Can't serialize data for output stream when two processors are active

☆樱花仙子☆ submitted on 2021-01-28 05:13:21
Question: I have a working setup for Spring Cloud Kafka Streams using the functional programming style. There are two use cases, both configured via application.properties. Each works on its own, but as soon as I activate both at the same time, I get a serialization error on the output stream of the second use case: Exception in thread "ActivitiesAppId-05296224-5ea1-412a-aee4-1165870b5c75-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Error encountered sending record to …
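The usual suspects here are the function declaration and the serde of the second output binding. Below is a minimal sketch of two coexisting processors; the function names, types, and logic are assumptions rather than the asker's code, while the property names mentioned afterwards are real:

```java
import java.util.function.Function;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TwoProcessorsConfig {

    // First use case (hypothetical): pass values through upper-cased.
    @Bean
    public Function<KStream<String, String>, KStream<String, String>> processA() {
        return input -> input.mapValues(String::toUpperCase);
    }

    // Second use case (hypothetical): a different output value type, which
    // needs its own serde on the output binding instead of the default.
    @Bean
    public Function<KStream<String, String>, KStream<String, Long>> processB() {
        return input -> input.mapValues(v -> (long) v.length());
    }
}
```

With both active, application.properties would declare spring.cloud.function.definition=processA;processB and pin the second output's serde explicitly, e.g. spring.cloud.stream.kafka.streams.bindings.processB-out-0.producer.valueSerde=org.apache.kafka.common.serialization.Serdes$LongSerde, so the binder does not fall back to a serde that only fits the first use case.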

Spring Cloud Stream + Quartz

左心房为你撑大大i submitted on 2021-01-27 13:23:31
Question: I am planning to use Spring Cloud Stream for my project. I see that there is a built-in Trigger source application starter. What I want to do is use the Quartz job scheduler as the source app, to allow dynamic job schedules from the application. Is there a good sample that achieves this? I found this: spring integration + cron + quartz in cluster?. That solution talks about getting a reference to the inbound channel adapter, but I am using annotations to define the inbound channel adapter. How do I get …
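One way to sketch this with current Spring Cloud Stream is to sidestep the inbound channel adapter entirely and let each Quartz job publish through StreamBridge. This is a sketch under assumptions, not the Trigger starter's implementation; the job class and binding name are made up:

```java
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.function.StreamBridge;

// Hypothetical Quartz job that emits one message per firing. Autowiring
// inside a Job requires a SpringBeanJobFactory on the scheduler.
public class TriggerJob implements Job {

    @Autowired
    private StreamBridge streamBridge;

    @Override
    public void execute(JobExecutionContext context) {
        // "trigger-out-0" is an assumed binding name mapped to the destination.
        streamBridge.send("trigger-out-0", "fired at " + context.getFireTime());
    }
}
```

Because the schedule lives in Quartz triggers rather than in the binding, jobs can be added, paused, or rescheduled at runtime through the Scheduler API, which is what makes the schedules dynamic.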

How can I map incoming headers as String instead of byte[] in my Spring Cloud Stream project?

北慕城南 submitted on 2021-01-27 06:17:06
Question: I have a simple Spring Cloud Stream project using Spring Integration DSL flows and the Kafka binder. Everything works great, but message header values coming from Kafka arrive as byte[]. This means that my SI @Header parameters need to be of type byte[]. That works, but it would be nice to have them as Strings (all the inbound headers I care about are String values). I have configured the Kafka clients to use StringSerializer/StringDeserializer. I assume I also need to somehow tell Spring …
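The serializer settings only cover record keys and values; headers go through the binder's KafkaHeaderMapper. A sketch of a customized mapper (the header name myHeader is an assumption; kafkaBinderHeaderMapper is the bean name the Kafka binder looks up by default):

```java
import java.util.Map;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.DefaultKafkaHeaderMapper;

@Configuration
public class HeaderMapperConfig {

    @Bean
    public DefaultKafkaHeaderMapper kafkaBinderHeaderMapper() {
        DefaultKafkaHeaderMapper mapper = new DefaultKafkaHeaderMapper();
        // Treat the named header as raw; the Boolean true tells the mapper
        // to charset-decode the inbound byte[] into a String.
        mapper.setRawMappedHeaders(Map.of("myHeader", true));
        return mapper;
    }
}
```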

Replay Kafka topic from the last successful message

百般思念 submitted on 2021-01-25 06:39:40
Question: With the standard configuration of a channel in Spring Cloud Stream, a message is retried 3 times and then skipped; if processing of the following message completes successfully, the offset is committed. That means that under transient exceptions messages can be lost. Can this behavior be changed, so that the channel gets stuck on the failing message until the transient condition is repaired? I have tried configuring the retry template, but you have to specify a number of retries, which looks like a …
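One direction, sketched on the assumption that blocking the consumer on a bad record is acceptable: supply a RetryTemplate that never gives up, so the offset is never committed past the failing message:

```java
import org.springframework.cloud.stream.annotation.StreamRetryTemplate;
import org.springframework.context.annotation.Configuration;
import org.springframework.retry.backoff.FixedBackOffPolicy;
import org.springframework.retry.policy.AlwaysRetryPolicy;
import org.springframework.retry.support.RetryTemplate;

@Configuration
public class InfiniteRetryConfig {

    // Replaces the binding's default 3-attempt template. AlwaysRetryPolicy
    // retries forever, so consumption blocks on the failing message.
    @StreamRetryTemplate
    public RetryTemplate infiniteRetryTemplate() {
        RetryTemplate template = new RetryTemplate();
        template.setRetryPolicy(new AlwaysRetryPolicy());
        FixedBackOffPolicy backOff = new FixedBackOffPolicy();
        backOff.setBackOffPeriod(5_000L); // pause between attempts (assumed value)
        template.setBackOffPolicy(backOff);
        return template;
    }
}
```

The trade-off is that a non-transient (poison) message blocks its partition indefinitely, so this needs lag monitoring or an escape hatch alongside it.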

Spring Cloud Stream Kafka transactions on the producer side

|▌冷眼眸甩不掉的悲伤 submitted on 2021-01-07 03:18:47
Question: We have a Spring Cloud Stream app using Kafka. The requirement is that, on the producer side, a list of messages needs to be put on a topic in a transaction. There is no consumer for those messages in the same app. When I initiated the transaction using the spring.cloud.stream.kafka.binder.transaction.transaction-id-prefix property, I got errors saying that there is no subscriber for the dispatcher and that the total number of partitions obtained from the topic is less than the transactions configured. The app is …
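The binder's transaction-id prefix is geared toward consumer-initiated (read-process-write) transactions, which is why a producer-only app trips over it. A sketch that drops down to spring-kafka for the batch, assuming a transactionally configured KafkaTemplate and an illustrative topic name:

```java
import java.util.List;
import org.springframework.kafka.core.KafkaTemplate;

public class TransactionalBatchSender {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public TransactionalBatchSender(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // executeInTransaction opens a local Kafka transaction around the
    // callback, so the whole list commits or aborts together.
    public void sendAll(List<String> messages) {
        kafkaTemplate.executeInTransaction(ops -> {
            messages.forEach(msg -> ops.send("my-topic", msg)); // assumed topic
            return null;
        });
    }
}
```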

Testing Spring Cloud Stream with the Kafka Streams binder: using TopologyTestDriver I get the error "The class is not in the trusted packages"

泪湿孤枕 submitted on 2021-01-05 07:38:28
Question: I have this simple stream processor (not a consumer/producer) using the Kafka Streams binder:

```kotlin
@Bean
fun processFoo(): Function<KStream<FooName, FooAddress>, KStream<FooName, FooAddressPlus>> {
    return Function { input ->
        input.map { key, value ->
            println("\nPAYLOAD KEY: ${key.name}\n")
            println("\nPAYLOAD value: ${value.address}\n")
            val output = FooAddressPlus()
            output.address = value.address
            output.name = value.name
            output.plus = "${value.name}-${value.address}"
            KeyValue(key, output)
        }
    }
}
```

I'm trying …
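The "not in the trusted packages" error comes from the JsonDeserializer inside the serdes handed to the TopologyTestDriver, not from the processor itself. A sketch of building test serdes that trust the payload packages; using "*" is an assumption that is reasonable in tests but too broad for production:

```java
import java.util.Map;
import org.springframework.kafka.support.serializer.JsonSerde;

public class TestSerdes {

    // Builds a serde whose JsonDeserializer trusts every package ("*"),
    // so deserialization in the test driver does not reject the payload types.
    public static <T> JsonSerde<T> trusting(Class<T> type, boolean isKey) {
        JsonSerde<T> serde = new JsonSerde<>(type);
        serde.configure(Map.of("spring.json.trusted.packages", "*"), isKey);
        return serde;
    }
}
```

The driver's test topics would then be created with, e.g., trusting(FooName.class, true) for keys and trusting(FooAddressPlus.class, false) for values.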

Spring Reactive Stream - Unexpected Shutdown

巧了我就是萌 submitted on 2021-01-02 03:10:17
Question: We are using Spring Cloud Stream's reactive support with RabbitMQ. The reactive stream appears to acknowledge the message as soon as it pulls it off the queue, so any unhandled exception that happens during message processing needs to be handled in the application (which is different from a non-reactive stream, where an unhandled exception can be thrown and the message is rejected, sending it to a dead-letter queue). How are we supposed to deal with a sudden shutdown in an …
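Because the ack happens on receipt, failure handling has to live inside the reactive pipeline. A sketch with hypothetical handle and sendToParkingLot steps and a String payload (all names here are assumptions):

```java
import java.time.Duration;
import java.util.function.Function;
import org.springframework.context.annotation.Bean;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.util.retry.Retry;

public class ReactiveErrorHandling {

    // Hypothetical processing step standing in for real business logic.
    private Mono<String> handle(String msg) {
        return Mono.just(msg.toUpperCase());
    }

    // Hypothetical re-publish to a parking-lot destination.
    private Mono<Void> sendToParkingLot(String msg) {
        return Mono.empty();
    }

    @Bean
    public Function<Flux<String>, Flux<String>> process() {
        return flux -> flux.concatMap(msg ->
                handle(msg)
                        // Retry transient failures with backoff first.
                        .retryWhen(Retry.backoff(3, Duration.ofSeconds(2)))
                        // On final failure, park the message and drop it.
                        .onErrorResume(e -> sendToParkingLot(msg).then(Mono.empty())));
    }
}
```

For the shutdown side of the question, anything pulled off the queue but not yet fully processed is already acked, so work that must survive a crash has to be persisted or parked before the pipeline treats it as done.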

Get a message from Kafka, send it to RSocket, and receive it in a React client

我与影子孤独终老i submitted on 2020-12-31 01:38:01
Question: I am trying to send data from Kafka, via Spring Cloud Stream, to RSocket, and then to display the data in React. Here is my configuration:

```java
@Configuration
public class RsocketConsumerConfiguration {
    @Bean
    public Sinks.Many<Data> sender() {
        return Sinks.many().multicast().directBestEffort();
    }
}

@Controller
public class ServerController {

    @Autowired
    private Sinks.Many<Data> integer;

    @MessageMapping("integer")
    public Flux<Data> integer() {
        return integer.asFlux();
    }

    @EnableBinding(IClientProcessor.class …
```
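The missing piece is whatever feeds consumed Kafka records into the sink. A sketch in the functional style, rather than finishing the @EnableBinding interface the snippet starts (an API later Spring Cloud Stream versions deprecate); the Data type and sender bean come from the snippet above, the binding name is an assumption:

```java
import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.core.publisher.Sinks;

@Configuration
public class KafkaToRsocketBridge {

    // Each record consumed from Kafka is pushed into the sink that the
    // @MessageMapping("integer") endpoint exposes to RSocket clients as a Flux.
    @Bean
    public Consumer<Data> listen(Sinks.Many<Data> sender) {
        return data -> sender.tryEmitNext(data);
    }
}
```

The listen-in-0 binding would be mapped to the Kafka topic in application.properties, and a React client could then subscribe to the "integer" route (e.g. with rsocket-js) to render the stream.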
