File can be uploaded to S3 locally but can't within a container (Unable to locate credential)

Submitted by 纵饮孤独 on 2020-04-18 01:08:32

Question


I have a Python script to upload a file to S3; the code is the same as in this question.

I have a bash script that passes the AWS credentials. The file I want to upload is generated by a model running on Fargate (in a container), so I tried to run the Python script inside the container to upload the file to S3. I built the image, but when I run docker run containername it gives me this error:

INFO:root:Uploading to S3 from test.csv to bucket_name test.csv

  File "/usr/local/lib/python3.6/dist-packages/botocore/auth.py", line 357, in add_auth
    raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials

Can someone give me a hint? How can I fix it? Thanks in advance.
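For reference, the upload script in the linked question boils down to something like the following minimal sketch. The function and parameter names here are my own, and the injectable client exists only so the logic can be exercised without real AWS credentials; by default, boto3 resolves credentials from environment variables, ~/.aws/credentials, or (inside Fargate) the task role.

```python
import logging

logging.basicConfig(level=logging.INFO)

def upload_to_s3(file_path, bucket, key, s3_client=None):
    """Upload a local file to an S3 bucket.

    If no client is passed in, boto3 builds one using its normal
    credential chain (env vars, ~/.aws/credentials, task role).
    """
    if s3_client is None:
        import boto3  # deferred import: a stub client needs no boto3 install
        s3_client = boto3.client("s3")
    logging.info("Uploading to S3 from %s to %s %s", file_path, bucket, key)
    s3_client.upload_file(file_path, bucket, key)
```

When boto3's credential chain finds nothing, `upload_file` raises the `NoCredentialsError` shown in the traceback above, which is why the same script works locally (where ~/.aws/credentials exists) but not in a bare container.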


Answer 1:


To pass AWS credentials into the container, you either need to mount ~/.aws/credentials in your container:

docker run -v ~/.aws/credentials:/root/.aws/credentials:ro containername

Or pass your credentials as environment variables:

docker run -e AWS_ACCESS_KEY_ID=$(aws configure get aws_access_key_id --profile profilename) -e AWS_SECRET_ACCESS_KEY=$(aws configure get aws_secret_access_key --profile profilename) containername

For the Fargate side, you need to create an IAM role with a policy that grants access to that bucket, and assign it as the Task Role.

This is different from the Task Execution Role, which is used to pull the Docker image. The Task Role is used at run time, and that's where you need to add the policy for S3 access; boto3 then picks up its credentials automatically, with no env vars or mounts needed.
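The Task Role policy could look something like this sketch. The bucket name comes from the question's log line; the exact set of actions you need depends on what the script does (upload only needs s3:PutObject):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::bucket_name/*"
    }
  ]
}
```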



Source: https://stackoverflow.com/questions/61250393/file-can-be-uploaded-to-s3-locally-but-cant-within-a-container-unable-to-locat
