I'm trying to get Django to upload static files to S3, but instead I'm getting a 403 Forbidden error, and I'm not sure why.
Full Stacktrace:
Here is a refinement with minimal permissions. In all cases, as discussed elsewhere, s3:ListAllMyBuckets is required on all buckets.
In its default configuration, django-storages uploads files to S3 with public-read permissions - see the django-storages Amazon S3 backend documentation.
Trial and error revealed that in this default configuration only two permissions are required: s3:PutObject to upload the file in the first place, and s3:PutObjectAcl to set the permissions on that object to public. No additional actions are needed, because from that point forward the object is publicly readable anyway.
IAM User Policy - public-read (default):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": "arn:aws:s3:::bucketname/*"
        }
    ]
}
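For reference, a minimal django-storages configuration that pairs with this default public-read policy might look like the following sketch - the bucket name and credential values are placeholders, not values from the question:

```python
# settings.py - minimal django-storages S3 configuration (sketch;
# "bucketname" and the credential values are placeholders)
AWS_ACCESS_KEY_ID = "<your access key>"
AWS_SECRET_ACCESS_KEY = "<your secret key>"
AWS_STORAGE_BUCKET_NAME = "bucketname"

# Route collectstatic output through the S3 backend
STATICFILES_STORAGE = "storages.backends.s3boto.S3BotoStorage"
```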
It is not always desirable to have objects publicly readable. This is controlled by the AWS_DEFAULT_ACL property in the settings file.
Django settings.py:
...
AWS_DEFAULT_ACL = "private"
...
With that setting, s3:PutObjectAcl is no longer required and the minimal permissions are as follows:
IAM User Policy - private:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::bucketname/*"
        }
    ]
}
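Note that with private objects there is no public read any more, so django-storages serves them through signed URLs instead - that is what the AWS_QUERYSTRING_AUTH setting (on by default) controls. A settings sketch, assuming the same configuration as above:

```python
# settings.py - private objects must be read through signed URLs
AWS_DEFAULT_ACL = "private"

# django-storages appends a signed query string to generated URLs
# (True is already the default); AWS_QUERYSTRING_EXPIRE controls
# how long each signed URL stays valid
AWS_QUERYSTRING_AUTH = True
AWS_QUERYSTRING_EXPIRE = 3600  # seconds
```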
It is also possible that the wrong credentials are being used. To verify:
import boto
s3 = boto.connect_s3('<your access key>', '<your secret key>')
bucket = s3.get_bucket('<your bucket>') # does this work?
s3 = boto.connect_s3()
s3.aws_access_key_id # is the same key being used by default?
If not, take a look at ~/.boto, ~/.aws/config and ~/.aws/credentials.
Maybe you actually don't have access to the bucket you're trying to look up/get/create.
Remember: bucket names have to be unique across the entire S3 ecosystem, so if you try to access (look up/get/create) a bucket with a common name like 'test', it almost certainly already belongs to someone else and you will have no access to it.
I would recommend that you try to test your AWS credentials separately to verify whether the credentials do actually have permission to read and write data to the S3 bucket. The following should work:
>>> import boto
>>> s3 = boto.connect_s3('<access_key>', '<secret_key>')
>>> bucket = s3.lookup('donebox-static')
>>> key = bucket.new_key('testkey')
>>> key.set_contents_from_string('This is a test')
>>> key.exists()
>>> key.delete()
You should try the same test with the other bucket ('donebox-media'). If this works, the permissions are correct and the problem lies in the Django storages code or configuration. If this fails with a 403, the credentials themselves lack the necessary S3 permissions.
I hope that helps. Please report back your findings.
I'm using Amazon IAM for the particular key ID and access key and just bumped into the same 403 Forbidden... Turns out you need to give permissions that target both the bucket root and its subobjects:
{
    "Statement": [
        {
            "Principal": {
                "AWS": "*"
            },
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::bucket-name/*",
                "arn:aws:s3:::bucket-name"
            ]
        }
    ]
}
In case this helps anyone, I had to add the following configuration entry for collectstatic to work and not return 403:
AWS_DEFAULT_ACL = ''
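On newer django-storages releases, my understanding is that None (rather than an empty string) is the way to express "don't send an ACL at all", which also removes the need for s3:PutObjectAcl in the IAM policy - a sketch, not verified against every version:

```python
# settings.py - on newer django-storages releases, None means the
# backend sends no ACL header with uploads, so the IAM policy does
# not need s3:PutObjectAcl
AWS_DEFAULT_ACL = None
```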