spring-cloud-stream

Spring AWS Kinesis Binder acquiring and releasing lock issues in DynamoDB while consuming messages

Submitted by 蓝咒 on 2021-02-10 16:18:49
Question: Sometimes, when we stop the application abruptly, an exception occurs saying that unlocking failed. After that, the same group never gets messages again, while other groups keep receiving them. I am using the AWS Kinesis binder snapshot version. This is the error when the application is stopped: 2018-07-19 22:21:21.371 ERROR 60981 --- [s-shard-locks-1] a.i.k.KinesisMessageDrivenChannelAdapter : Error during unlocking: DynamoDbLock [lockKey=aaaa:myStream:shardId-000000000000,lockedAt=2018-07-19@22
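
A likely cause is that the abrupt shutdown leaves a stale lock item behind in DynamoDB, so the group can never re-acquire the shard. Below is a minimal one-off cleanup sketch using the AWS SDK v1; the table name and key attribute are assumptions (verify them against the table the binder actually created), and the lock key value is taken from the error above.

    import java.util.Collections;

    import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
    import com.amazonaws.services.dynamodbv2.model.AttributeValue;

    public class StaleLockCleanup {
        public static void main(String[] args) {
            AmazonDynamoDB dynamoDb = AmazonDynamoDBClientBuilder.defaultClient();
            // "SpringIntegrationLockRegistry" and "lockKey" are assumptions --
            // check the table name and key attribute your lock registry uses.
            dynamoDb.deleteItem("SpringIntegrationLockRegistry",
                    Collections.singletonMap("lockKey",
                            new AttributeValue("aaaa:myStream:shardId-000000000000")));
        }
    }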

Spring Cloud Stream topic per message for different consumers

Submitted by 為{幸葍}努か on 2021-02-10 12:49:13
Question: The topology I am looking for is [diagram]. So far I have not seen a way to define the topic per message in Cloud Stream. I understand that consumers will be bound to a specific topic, but how does the producer set the topic per message before sending it to the exchange? source.output().send(MessageBuilder.withPayload(myMessage).build()); does not provide any way to set the topic for the exchange to route to the proper consumer. Or maybe I don't understand something correctly? UPDATE I would
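
One common approach in that era of Spring Cloud Stream (a sketch, not necessarily the accepted answer) is dynamic destination resolution: instead of sending through the statically bound output, resolve the destination by name at send time with BinderAwareChannelResolver. The topic argument here is whatever per-message destination you compute.

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.cloud.stream.binding.BinderAwareChannelResolver;
    import org.springframework.messaging.support.MessageBuilder;
    import org.springframework.stereotype.Component;

    @Component
    public class DynamicProducer {

        @Autowired
        private BinderAwareChannelResolver resolver;

        // Resolve (and create, if necessary) the destination per message
        // instead of relying on the single bound output channel.
        public void send(Object myMessage, String topic) {
            resolver.resolveDestination(topic)
                    .send(MessageBuilder.withPayload(myMessage).build());
        }
    }

For RabbitMQ specifically, another option is to keep one exchange and configure a producer routing-key expression so the broker routes each message, but that is a binder-level setting rather than per-message code.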

Spring Cloud Data Flow Grafana Prometheus not showing stream data

Submitted by 江枫思渺然 on 2021-02-08 15:12:38
Question: I launch Spring Cloud Data Flow with docker-compose, based on this guide: https://dataflow.spring.io/docs/installation/local/docker/ I created 3 apps: Source, Processor, and Sink. I ran export STREAM_APPS_URI=https://dataflow.spring.io/Einstein-BUILD-SNAPSHOT-stream-applications-kafka-maven When I run docker-compose -f ./docker-compose.yml -f ./docker-compose-prometheus.yml up, all my containers start up as specified in docker-compose.yml and docker-compose-prometheus.yml. I proceed to

Issues with binder using Spring-cloud-stream-kafka-stream

Submitted by 有些话、适合烂在心里 on 2021-02-08 09:44:33
Question: I am trying to read from Kafka using Spring Cloud Stream Kafka Streams, then aggregate the events in a one-minute time window and write them to a different topic. Then I need to read the aggregated events from that topic and write them to a different topic, while binding that topic to a topic in another Kafka cluster. But I am getting the binder exception below. org.springframework.context.ApplicationContextException: Failed to start bean 'outputBindingLifecycle'; nested exception is java.lang
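
For reference, here is a minimal sketch of the first leg (the one-minute windowed aggregation) using the annotation-based Kafka Streams binder; the input/output channel names follow the binder's KafkaStreamsProcessor convention and everything else is illustrative.

    import java.time.Duration;

    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.TimeWindows;
    import org.apache.kafka.streams.kstream.Windowed;
    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.binder.kafka.streams.annotations.KafkaStreamsProcessor;
    import org.springframework.messaging.handler.annotation.SendTo;

    @EnableBinding(KafkaStreamsProcessor.class)
    public class AggregatorApp {

        // Count events per key in one-minute windows and stream the
        // windowed counts to the bound output topic.
        @StreamListener("input")
        @SendTo("output")
        public KStream<Windowed<String>, Long> process(KStream<String, String> events) {
            return events
                    .groupByKey()
                    .windowedBy(TimeWindows.of(Duration.ofMinutes(1)))
                    .count()
                    .toStream();
        }
    }

For the second leg (writing to another cluster), each binding usually has to name its binder explicitly (spring.cloud.stream.bindings.<channel>.binder=...) with the clusters declared under spring.cloud.stream.binders.*; a "Failed to start bean 'outputBindingLifecycle'" error often points at a binding that cannot resolve which binder to use.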

Problem synchronizing 'bucket' to local directory with Spring Cloud DataFlow Streams

Submitted by 不打扰是莪最后的温柔 on 2021-01-29 09:28:41
Question: I'm following this case study, which is similar to my scenario: I want to receive thousands of files in an S3 bucket and launch the batch task that will consume them. But I'm getting: Problem occurred while synchronizing 'bucket' to local directory; nested exception is org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied;
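
The 403 suggests that the credentials the app resolves cannot list or read the bucket. A quick self-contained check (AWS SDK v1; region, credentials, and bucket name are placeholders) exercises the same calls the synchronizer needs:

    import com.amazonaws.auth.AWSStaticCredentialsProvider;
    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class S3AccessCheck {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                    .withRegion("us-east-1") // placeholder: use the bucket's region
                    .withCredentials(new AWSStaticCredentialsProvider(
                            new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY"))) // placeholders
                    .build();
            // The S3 source needs s3:ListBucket and s3:GetObject on the bucket;
            // a 403 on this call reproduces the failure outside the stream.
            s3.listObjectsV2("bucket").getObjectSummaries()
                    .forEach(o -> System.out.println(o.getKey()));
        }
    }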

Exception thrown while starting consumer (cannot assign the same group name for different channels in a microservice)

Submitted by 老子叫甜甜 on 2021-01-28 07:52:22
Question: I am a newbie with Spring Cloud Stream and RabbitMQ. Recently I got an exception after starting one of my microservices. It states that an object could not be registered because it is already registered; I think this is because of the group name assigned to each channel. Please check the exception: 2018-05-28 10:01:38.420 ERROR 10244 --- [ask-scheduler-2] o.s.cloud.stream.binding.BindingService : Failed to create consumer binding; retrying in 30 seconds org.springframework.cloud.stream.binder
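
If the shared group name is indeed the culprit (as the title suggests), the usual arrangement is one consumer group per binding rather than one name reused across channels. A hedged sketch with illustrative channel, destination, and group names:

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.Input;
    import org.springframework.messaging.SubscribableChannel;

    interface TwoInputs {

        @Input("ordersIn")
        SubscribableChannel ordersIn();

        @Input("paymentsIn")
        SubscribableChannel paymentsIn();
    }

    @EnableBinding(TwoInputs.class)
    class StreamConfig {
        // application.properties -- a distinct group per binding:
        // spring.cloud.stream.bindings.ordersIn.destination=orders
        // spring.cloud.stream.bindings.ordersIn.group=order-service
        // spring.cloud.stream.bindings.paymentsIn.destination=payments
        // spring.cloud.stream.bindings.paymentsIn.group=payment-service
    }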