I don't know how (or even where) to grant read and write permission to the user from AWS so users can post pictures on sample_app in the production environment. This is final ta
Here is a tutorial I wrote to pick up where Michael Hartl leaves off at the end of the Ruby on Rails Tutorial (3rd Ed.), Chapter 11. It should answer your question and more.
Getting the railstutorial.org Sample App to work between Heroku and AWS was a huge pain in the ass. But I did it. If you found this tutorial, that means you're probably encountering an error you can't get past. That's fine. I had a few of them.
2020 Note: It's possible that everything here referencing a Region for S3 is no longer needed, or perhaps was never needed. When I originally got this all to work properly it was after adding the region information. However, S3 Buckets all share a global namespace. So if anybody is still reading this, try it all without the region stuff first, and leave a comment about whether it works or not. Anyway, back to the tutorial...
The first thing you need to do is go back over the code that Hartl provided. Make sure you typed it (or copy/pasted it) in exactly as shown. Out of all the code in this section, there is only one small addition you might need to make: the "region" environment variable. This is needed if you created a bucket that is not in the default US region. More on this later. Here is the code for /config/initializers/carrier_wave.rb:
if Rails.env.production?
  CarrierWave.configure do |config|
    config.fog_credentials = {
      # Configuration for Amazon S3
      :provider              => 'AWS',
      :aws_access_key_id     => ENV['S3_ACCESS_KEY'],
      :aws_secret_access_key => ENV['S3_SECRET_KEY'],
      :region                => ENV['S3_REGION']
    }
    config.fog_directory = ENV['S3_BUCKET']
  end
end
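While you're double-checking Hartl's code, it's also worth confirming that your uploader switches to fog storage in production. If you followed the book, app/uploaders/picture_uploader.rb should already contain something like this (a sketch from memory; your generated file may differ slightly, so don't replace it wholesale):

class PictureUploader < CarrierWave::Uploader::Base
  # Store uploads on S3 (via fog) in production, on the local
  # filesystem in development and test.
  if Rails.env.production?
    storage :fog
  else
    storage :file
  end

  # Directory where uploaded files are stored.
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end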
That line, :region => ENV['S3_REGION'], is a problem for a lot of people. More on that later.
You should be using that block of code exactly as shown. Do NOT put your actual keys in there. We'll send them to Heroku separately.
If you had to add that line of code, don't forget to commit it to git and push it to Heroku.
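If you need a refresher, that looks roughly like this (the commit message is just an example, and this assumes you deploy with git push heroku master as in the book):

$ git commit -am "Add S3 region to CarrierWave initializer"
$ git push heroku master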
Now let's move on to your AWS account and security. In the IAM console, create a user for the app (this is the "fog" user referred to later), save its Access Key and Secret Key, and attach a policy like this one to it:
Policy Name: AllowFullAccessToMySampleAppBucket20160126
Description: Allows remote write/delete access to the S3 bucket named my-sample-app-bucket-20160126.
Policy Document:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::my-sample-app-bucket-20160126",
        "arn:aws:s3:::my-sample-app-bucket-20160126/*"
      ]
    }
  ]
}
That's it for AWS configuration. I didn't need to make a policy to allow "fog" to list the contents of the bucket, even though most tutorials I tried said that was necessary. I think it's only necessary when you want a user that can log in through the dashboard.
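If you want to sanity-check the policy before touching Heroku, one optional way (this assumes you have the AWS CLI installed and set up a profile named "fog" with the new user's keys; none of that is required for the tutorial itself) is to copy a file into the bucket directly:

$ aws configure --profile fog
# enter the fog user's Access Key, Secret Key, and your region when prompted
$ echo "hello" > test.txt
$ aws s3 cp test.txt s3://my-sample-app-bucket-20160126/test.txt --profile fog
$ aws s3 rm s3://my-sample-app-bucket-20160126/test.txt --profile fog

Note that copying and removing a specific key doesn't require list permission, which is consistent with the policy above.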
Now for the Heroku configuration. This stuff gets entered at your command prompt, just like 'heroku run rake db:migrate' and such. This is where you enter the actual Access Key and Secret Key you got from the "fog" user you created earlier.
$ heroku config:set S3_ACCESS_KEY=THERANDOMKEYYOUGOT
$ heroku config:set S3_SECRET_KEY=an0tHeRstRing0frAnDomjUnK
$ heroku config:set S3_REGION=us-west-2
$ heroku config:set S3_BUCKET=my-sample-app-bucket-20160126
Look again at the S3_REGION line. Remember when you looked at the Properties of your S3 bucket? This is where you enter the code associated with your region. If your bucket is not in Oregon, you will have to change us-west-2 to your actual region code. This link worked when this tutorial was written:
http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
If that doesn't work, Google "AWS S3 region codes".
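Once all four variables are set, you can double-check what Heroku actually received:

$ heroku config
# lists every config var for the app; make sure the four S3_* values look right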
After doing all this and double-checking for mistakes in the code, I got Heroku to work with AWS for storage of pictures!