spring-cloud-stream

spring cloud stream - can a spring cloud stream application exit explicitly?

 ̄綄美尐妖づ submitted on 2020-01-06 07:08:35
Question: I am not sure if my understanding is correct. We launch a Spring Cloud Stream application and subscribe to a topic. The application keeps running and monitoring the topic for new messages unless we send a kill signal to quit. I was wondering if we could make a Spring Cloud Stream application quit explicitly, for example after waiting 5 minutes with no new messages coming in, or after processing 1000 records. Answer 1: You don't need to stop the application for this as it will bring down the entire JVM (which you can always do if you
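
One way to make such an application exit on its own (not taken from the answer above, which is cut off here) is to track a record counter and a last-message timestamp and shut the JVM down from a scheduled check. A minimal sketch, assuming the classic annotation-based programming model; the MAX_RECORDS and idle-timeout values are illustrative assumptions:

    import java.util.concurrent.atomic.AtomicInteger;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.messaging.Sink;
    import org.springframework.context.ConfigurableApplicationContext;
    import org.springframework.scheduling.annotation.EnableScheduling;
    import org.springframework.scheduling.annotation.Scheduled;

    @EnableScheduling
    @EnableBinding(Sink.class)
    @SpringBootApplication
    public class IdleShutdownApplication {

        private static final int MAX_RECORDS = 1000;                // assumed threshold
        private static final long IDLE_TIMEOUT_MS = 5 * 60 * 1000L; // assumed 5 minutes

        private final AtomicInteger processed = new AtomicInteger();
        private volatile long lastMessageAt = System.currentTimeMillis();

        @Autowired
        private ConfigurableApplicationContext context;

        public static void main(String[] args) {
            SpringApplication.run(IdleShutdownApplication.class, args);
        }

        @StreamListener(Sink.INPUT)
        public void handle(String payload) {
            lastMessageAt = System.currentTimeMillis();
            if (processed.incrementAndGet() >= MAX_RECORDS) {
                shutdown();
            }
        }

        // Runs every 30 seconds and exits once the idle timeout has expired.
        @Scheduled(fixedDelay = 30_000)
        public void checkIdle() {
            if (System.currentTimeMillis() - lastMessageAt > IDLE_TIMEOUT_MS) {
                shutdown();
            }
        }

        private void shutdown() {
            // Exit on a separate thread so the listener/scheduler thread is not
            // blocked while the context (and its bindings) is being closed.
            new Thread(() -> System.exit(SpringApplication.exit(context, () -> 0))).start();
        }
    }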

Handling dead letter queue with delay

大憨熊 submitted on 2020-01-06 04:38:07
Question: I want to do the following: when a message fails and lands in my dead letter queue, I want to wait 5 minutes and republish the same message to my queue. Today, using Spring Cloud Stream and RabbitMQ, I wrote the following code based on this documentation: @Component public class HandlerDlq { private static final Logger LOGGER = LoggerFactory.getLogger(HandlerDlq.class); private static final String X_RETRIES_HEADER = "x-retries"; private static final String X_DELAY_HEADER = "x-delay"; private
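
The code above is cut off; the general shape of this pattern (close to the "republish to DLQ with delay" examples in the RabbitMQ binder documentation) is a @RabbitListener on the DLQ that bumps a retry counter and republishes through a delayed exchange. A sketch only: the queue and exchange names and the retry limit are assumptions, and the delay requires the RabbitMQ delayed-message-exchange plugin on the target exchange:

    import org.springframework.amqp.core.Message;
    import org.springframework.amqp.rabbit.annotation.RabbitListener;
    import org.springframework.amqp.rabbit.core.RabbitTemplate;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Component;

    @Component
    public class HandlerDlq {

        private static final String X_RETRIES_HEADER = "x-retries";
        private static final int DELAY_MS = 5 * 60 * 1000; // 5 minutes
        private static final int MAX_RETRIES = 3;           // assumed limit

        @Autowired
        private RabbitTemplate rabbitTemplate;

        // Queue name is an assumption; Spring Cloud Stream's default DLQ name is
        // <destination>.<group>.dlq.
        @RabbitListener(queues = "orders.myGroup.dlq")
        public void rePublish(Message failedMessage) {
            Integer retries = (Integer) failedMessage.getMessageProperties()
                    .getHeaders().get(X_RETRIES_HEADER);
            retries = retries == null ? 0 : retries;
            if (retries >= MAX_RETRIES) {
                return; // give up: log it, park it, etc.
            }
            failedMessage.getMessageProperties().getHeaders().put(X_RETRIES_HEADER, retries + 1);
            // Sent as the x-delay header; only honored by an exchange declared
            // with the delayed-message-exchange plugin.
            failedMessage.getMessageProperties().setDelay(DELAY_MS);
            rabbitTemplate.send("orders.delayed.exchange",
                    failedMessage.getMessageProperties().getReceivedRoutingKey(),
                    failedMessage);
        }
    }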

Update KTable based on partial data attributes

依然范特西╮ submitted on 2020-01-05 17:57:51
Question: I am trying to update a KTable with partial data of an object. E.g. the User object is {"id":1, "name":"Joe", "age":28}. The object is being streamed into a topic and grouped by key into a KTable. Now the user object is partially updated as follows: {"id":1, "age":33} and streamed into the table. But the updated table looks as follows: {"id":1, "name":null, "age":28}. The expected output is {"id":1, "name":"Joe", "age":33}. How can I use Kafka Streams and Spring Cloud Stream to achieve the expected
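
One way to get merge semantics instead of last-value-wins is to build the table with an aggregate that copies only the non-null fields of each incoming partial update onto the stored value. A minimal sketch with a hypothetical User POJO; serdes for the String key and the User value are assumed to be configured (e.g. a JSON serde):

    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;

    public class UserTableBuilder {

        // Hypothetical value type; in the question the values are JSON documents.
        public static class User {
            public Integer id;
            public String name;
            public Integer age;
        }

        // Each partial update only overwrites the fields it actually carries, so
        // {"id":1,"age":33} keeps the previously stored name.
        public KTable<String, User> buildTable(KStream<String, User> users) {
            return users
                    .groupByKey()
                    .aggregate(User::new, (key, update, current) -> {
                        if (update.id != null) {
                            current.id = update.id;
                        }
                        if (update.name != null) {
                            current.name = update.name;
                        }
                        if (update.age != null) {
                            current.age = update.age;
                        }
                        return current;
                    });
        }
    }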

Spring cloud stream Special Chars in Message received from kinesis

杀马特。学长 韩版系。学妹 submitted on 2020-01-04 09:25:00
Question: When I consume a message from the Kinesis stream, I get some junk chars along with headers, etc. @StreamListener(Processor.INPUT) public void receive(String message) { System.out.println("Message received: " + message); throw new RuntimeException("Exception thrown"); } @StreamListener("errorChannel") public void transform(ErrorMessage errorMessage) throws UnsupportedEncodingException { // original payload System.out.println("Error Original Message Payload" + new String((byte[]) errorMessage.getOriginalMessage
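
The error handler above is cut off, but two things are worth checking when the payload looks like garbage: decode the original byte[] payload with an explicit charset rather than the platform default, and print the headers; if the bytes still start with a length-prefixed blob, the producer may be embedding headers in the payload (the binder's embeddedHeaders mode), which is a configuration rather than a code issue. A small sketch of the decoding part:

    import java.nio.charset.StandardCharsets;

    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.messaging.Message;
    import org.springframework.messaging.support.ErrorMessage;
    import org.springframework.stereotype.Component;

    @Component
    public class ErrorChannelLogger {

        // Decodes the failed payload explicitly as UTF-8 and dumps the headers so
        // the "junk" bytes can be compared with what the producer actually sent.
        @StreamListener("errorChannel")
        public void handleError(ErrorMessage errorMessage) {
            Message<?> original = errorMessage.getOriginalMessage();
            if (original != null && original.getPayload() instanceof byte[]) {
                String payload = new String((byte[]) original.getPayload(), StandardCharsets.UTF_8);
                System.out.println("Original payload: " + payload);
                System.out.println("Original headers: " + original.getHeaders());
            }
        }
    }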

Create Stream with one source, two parallel processors and one sink in Spring Cloud Data Flow

别说谁变了你拦得住时间么 submitted on 2020-01-04 02:33:35
Question: I am trying to create a stream in Spring Cloud Data Flow with one source, i.e. order-source, whose Order message will be published to the RabbitMQ topic/queue; two parallel processors, i.e. product-processor and shipment-processor, both of which subscribe to the RabbitMQ topic/queue, get the Order message, process it individually, update the Order, and publish the Order message to the RabbitMQ topic/queue; and one sink, i.e.

Health for Kafka Binder is always UNKNOWN

妖精的绣舞 submitted on 2019-12-31 03:51:17
Question: When I try to activate the health indicator for the Kafka binder as explained in the Spring Cloud Stream Reference Documentation, the health endpoint returns: "binders":{"status":"UNKNOWN","kafka":{"status":"UNKNOWN"}}. My configuration contains, as documented: management.health.binders.enabled=true. I already debugged BindersHealthIndicatorAutoConfiguration and noticed that no HealthIndicator is registered in the binderContext. Do I have to register a custom HealthIndicator as a bean, or what steps
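
If it does turn out that a custom indicator is needed, registering a plain HealthIndicator bean is enough for it to show up under /health. A minimal sketch that pings the cluster with the Kafka AdminClient; the bootstrap address is an assumption (in a real application it would come from the binder properties), and creating an AdminClient per call is only acceptable for a sketch:

    import java.util.Properties;
    import java.util.concurrent.TimeUnit;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.springframework.boot.actuate.health.Health;
    import org.springframework.boot.actuate.health.HealthIndicator;
    import org.springframework.stereotype.Component;

    @Component
    public class KafkaBrokerHealthIndicator implements HealthIndicator {

        private static final String BOOTSTRAP_SERVERS = "localhost:9092"; // assumed

        @Override
        public Health health() {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVERS);
            try (AdminClient admin = AdminClient.create(props)) {
                // Report DOWN if the cluster metadata cannot be fetched within 5 seconds.
                int nodes = admin.describeCluster().nodes().get(5, TimeUnit.SECONDS).size();
                return Health.up().withDetail("nodes", nodes).build();
            }
            catch (Exception ex) {
                return Health.down(ex).build();
            }
        }
    }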

Failed to start bean 'inputBindingLifecycle' when using spring-boot:1.5.1 and spring-cloud-stream

一曲冷凌霜 submitted on 2019-12-31 03:27:08
Question: I get the below-mentioned error when using spring-boot:1.5.1 but not when using spring-boot:1.4.4. Has anyone encountered this? package org.test; import lombok.Data; import lombok.ToString; import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.cloud.stream.annotation.EnableBinding; import org.springframework.cloud.stream.annotation.StreamListener; import org.springframework.cloud.stream.messaging.Sink;

Spring cloud stream - send message after application initialization

半腔热情 submitted on 2019-12-31 00:59:05
Question: I'm trying to send a simple message to RabbitMQ using Spring Cloud Stream. Basically the code looks like this: @EnableBinding(Source.class) @SpringBootApplication public class SourceApplication { public static void main(String[] args) { SpringApplication.run(SourceApplication.class, args); } @Autowired Source source; @PostConstruct public void init() { source.send(MessageBuilder.withPayload("payload").build()); } } Then I get this error message: org.springframework.messaging
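
A common cause of this kind of failure is that @PostConstruct runs before the output binding has been started, so nothing is subscribed to the channel yet; sending once the application is fully started avoids that. A sketch of that change (not necessarily the accepted answer to the question above):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.boot.context.event.ApplicationReadyEvent;
    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.messaging.Source;
    import org.springframework.context.event.EventListener;
    import org.springframework.messaging.support.MessageBuilder;

    @EnableBinding(Source.class)
    @SpringBootApplication
    public class SourceApplication {

        @Autowired
        private Source source;

        public static void main(String[] args) {
            SpringApplication.run(SourceApplication.class, args);
        }

        // Fires after the context is refreshed and the bindings have started,
        // unlike @PostConstruct, which runs before the binding lifecycle.
        @EventListener(ApplicationReadyEvent.class)
        public void sendAfterStartup() {
            source.output().send(MessageBuilder.withPayload("payload").build());
        }
    }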

Tombstone messages not removing record from KTable state store?

会有一股神秘感。 submitted on 2019-12-30 10:33:49
Question: I am creating a KTable by processing data from a KStream, but when I send a tombstone message with a key and a null payload, it does not remove the record from the KTable. Sample: public KStream<String, GenericRecord> processRecord(@Input(Channel.TEST) KStream<GenericRecord, GenericRecord> testStream, KTable<String, GenericRecord> table = testStream .map((genericRecord, genericRecord2) -> KeyValue.pair(genericRecord.get("field1") + "", genericRecord2)) .groupByKey() .reduce((genericRecord, v1) -> v1,
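
One thing that can explain this: KGroupedStream aggregations such as reduce typically skip records whose value is null, so the tombstone never reaches the reducer. A workaround sketch (the intermediate topic name is an assumption, and default serdes for the String key and Avro value are assumed to be configured) is to write the re-keyed stream to a topic and read that topic back as a KTable, because a KTable sourced directly from a topic treats a null value as a delete:

    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;

    public class TombstoneAwareTable {

        public KTable<String, GenericRecord> buildTable(StreamsBuilder builder,
                KStream<GenericRecord, GenericRecord> testStream) {
            // Re-key on field1 and write to an intermediate topic; a null value
            // passes through unchanged and becomes a tombstone on that topic.
            testStream
                    .map((key, value) -> KeyValue.pair(key.get("field1") + "", value))
                    .to("rekeyed-topic");
            // Reading the topic as a table makes Kafka Streams delete the record
            // from the state store when it sees the null value.
            return builder.table("rekeyed-topic");
        }
    }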