Question
I'm following this case study, which is similar to my own scenario: I want to receive thousands of files in an S3 bucket and launch a batch task that consumes them.
But I'm getting:
Problem occurred while synchronizing 'bucket' to local directory; nested exception is org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied;
I already consume this bucket using the spring-cloud-starter-aws dependency in some other apps.
I know the message is pretty clear, but do I need specific bucket permissions to sync like this with Spring Cloud Data Flow?
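My understanding is that the sync first lists the bucket and then downloads each object, so I'd expect to need at least s3:ListBucket and s3:GetObject. A rough sketch of the IAM policy I have in mind (mybucket is a placeholder for the real bucket name; ListBucket applies to the bucket ARN, GetObject to the objects under it):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowListBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::mybucket"
    },
    {
      "Sid": "AllowReadObjects",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}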
My current stream config is:
s3
--spring.cloud.function.definition=s3Supplier,taskLaunchRequestFunction
--file.consumer.mode=ref
--s3.common.path-style-access=true
--s3.supplier.remote-dir=mybucket
--s3.supplier.local-dir=/scdf/infile
--cloud.aws.credentials.accessKey=****
--cloud.aws.credentials.secretKey=****
--cloud.aws.region.static=****
--cloud.aws.stack.auto=false
--task.launch.request.taskName=bill-composed-task
|
task-launcher-dataflow
--spring.cloud.dataflow.client.server-uri=http://localhost:9393
Thanks in advance
Source: https://stackoverflow.com/questions/65646098/problem-synchronizing-bucket-to-local-directory-with-spring-cloud-dataflow-str