AccessDenied for ListObjects for S3 bucket when permissions are s3:*

佛祖请我去吃肉 2021-01-29 22:02

I am getting:

An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied

when I try to get a folder from my S3 bucket.

13 Answers
  • 2021-01-29 22:09

    I faced the same issue. I just added the credentials config:

    aws_access_key_id = your_aws_access_key_id
    aws_secret_access_key = your_aws_secret_access_key
    

    into "~/.aws/credentials" + restart terminal for default profile.

    If you use multiple profiles, the --profile argument needs to be added:

    aws s3 sync ./localDir s3://bucketName --profile=${PROFILE_NAME}
    

    where PROFILE_NAME is defined in your .bash_profile (or .bashrc):

    export PROFILE_NAME="yourProfileName"
    

    More info about how to configure credentials and multiple profiles can be found in the AWS CLI configuration documentation.
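
    For reference, a minimal sketch of what "~/.aws/credentials" might look like with the default profile plus a named profile (the profile name and key values are placeholders):

    [default]
    aws_access_key_id = your_aws_access_key_id
    aws_secret_access_key = your_aws_secret_access_key

    [yourProfileName]
    aws_access_key_id = another_access_key_id
    aws_secret_access_key = another_secret_access_key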

  • 2021-01-29 22:11

    To allow permissions on the S3 bucket, go to the Permissions tab of the bucket and, in the bucket policy, change the action to the following, which will allow all actions to be performed:

    "Action":"*"
    
  • 2021-01-29 22:12

    I was unable to access S3 because:

    • first I configured access keys on the instance (at the time, that made it impossible to attach a role after launch)
    • forgot about it for a few months
    • attached a role to the instance
    • tried to access the bucket. The configured keys had higher priority than the role, and access was denied because that user wasn't granted the necessary S3 permissions.

    Solution: remove the credentials file (rm ~/.aws/credentials); the AWS CLI then falls back to the instance role.
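
    A quick way to check which identity the CLI is actually resolving to is "aws sts get-caller-identity" (a sketch; it prints the account ID and the caller's ARN):

    # If this shows an IAM user ARN rather than the instance role,
    # the static keys in ~/.aws/credentials are still taking precedence.
    aws sts get-caller-identity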

  • 2021-01-29 22:13

    You have given permission to perform actions on the objects inside the S3 bucket, but you have not given permission to perform any actions on the bucket itself.

    Slightly modifying your policy would look like this:

    {
      "Version": "version_id",
      "Statement": [
        {
            "Sid": "some_id",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::bucketname",
                "arn:aws:s3:::bucketname/*"
            ]
        }
      ] 
    }
    

    However, that probably gives more permission than is needed. Following the AWS IAM best practice of Granting Least Privilege would look something like this:

    {
      "Version": "2012-10-17",
      "Statement": [
          {
              "Effect": "Allow",
              "Action": [
                  "s3:ListBucket"
              ],
              "Resource": [
                  "arn:aws:s3:::bucketname"
              ]
          },
          {
              "Effect": "Allow",
              "Action": [
                  "s3:GetObject"
              ],
              "Resource": [
                  "arn:aws:s3:::bucketname/*"
              ]
          }
      ]
    }
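
    With that least-privilege policy attached, roughly the following CLI behaviour would be expected (a sketch; the bucket name is a placeholder):

    # Allowed: listing the bucket and downloading objects
    aws s3 ls s3://bucketname/
    aws s3 cp s3://bucketname/some-key.txt .

    # Not allowed: uploads/deletes would additionally need s3:PutObject / s3:DeleteObject
    # aws s3 cp ./local-file.txt s3://bucketname/local-file.txt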
    
  • 2021-01-29 22:15

    If you want to copy all of the S3 bucket objects using the command "aws s3 cp s3://bucket-name/data/all-data/ . --recursive" as you mentioned, here is a safe and minimal policy to do that:

    {
      "Version": "2012-10-17",
      "Statement": [
          {
              "Effect": "Allow",
              "Action": [
                  "s3:ListBucket"
              ],
              "Resource": [
                  "arn:aws:s3:::bucket-name"
              ],
              "Condition": {
                  "StringLike": {
                      "s3:prefix": "data/all-data/*"
                  }
              }
          },
          {
              "Effect": "Allow",
              "Action": [
                  "s3:GetObject"
              ],
              "Resource": [
                  "arn:aws:s3:::bucket-name/data/all-data/*"
              ]
          }
      ]
    }
    

    The first statement in this policy allows listing objects inside a specific bucket subdirectory. The resource needs to be the ARN of the S3 bucket, and to limit listing to only a subdirectory of that bucket you can edit the "s3:prefix" value.

    The second statement in this policy allows getting objects inside the bucket at a specific subdirectory. This means you will be able to copy anything under the "s3://bucket-name/data/all-data/" path. Be aware that this doesn't allow you to copy from parent paths such as "s3://bucket-name/data/".
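
    For illustration, under that policy roughly this behaviour would be expected (a sketch using the example bucket name and prefix):

    # Allowed: listing and copying within the permitted prefix
    aws s3 ls s3://bucket-name/data/all-data/
    aws s3 cp s3://bucket-name/data/all-data/ . --recursive

    # Denied: the prefix condition does not cover parent paths
    # aws s3 ls s3://bucket-name/data/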

    This solution is specific to limiting access for AWS CLI commands; if you need to limit S3 access through the AWS console or the API, more policies will be needed. I suggest taking a look here: https://aws.amazon.com/blogs/security/writing-iam-policies-grant-access-to-user-specific-folders-in-an-amazon-s3-bucket/.

    A similar issue, which led me to the solution I am giving here, can be found at https://github.com/aws/aws-cli/issues/2408.

    Hope this helps!

  • 2021-01-29 22:18

    You have to specify the Resource for the bucket via "arn:aws:s3:::bucketname" or "arn:aws:s3:::bucketname*". The latter is preferred since it allows manipulations of the bucket's objects too. Notice there is no slash!

    Listing objects is an operation on the bucket; therefore, the action "s3:ListBucket" is required. Adding an object to the bucket is an operation on an object; therefore, the action "s3:PutObject" is needed. Of course, you may want to add other actions as you require.

    {
      "Version": "version_id",
      "Statement": [
        {
            "Sid": "some_id",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::bucketname*"
            ]
        }
      ]
    }
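
    With that policy, roughly the following would be expected (a sketch; the bucket name is a placeholder):

    # Allowed: listing the bucket and uploading objects
    aws s3 ls s3://bucketname/
    aws s3 cp ./local-file.txt s3://bucketname/local-file.txt

    # Downloading would additionally need "s3:GetObject"
    # aws s3 cp s3://bucketname/local-file.txt .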
    