I'm able to create an S3 bucket using CloudFormation, but I would like to create a folder inside that S3 bucket.
We cannot (at least as of now) create a subfolder inside an S3 bucket.
You can try using the following command:
aws s3 mb s3://yavdhesh-bucket/inside-folder
And then try to list all the folders inside the bucket using the command:
aws s3 ls s3://yavdhesh-bucket
You will observe that the subfolder was not created.
There is only one way to create a subfolder: by creating or copying a file into a non-existent subfolder or subdirectory (with respect to the bucket). For example:
aws s3 cp demo.txt s3://yavdhesh-bucket/inside-folder/
Now if you list the files present inside your subfolder, it should work:
aws s3 ls s3://yavdhesh-bucket/inside-folder/
It should list all the files present in this subfolder.
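If you prefer to do the same from code, here is a minimal boto3 sketch (assuming configured AWS credentials; the bucket and file names are taken from the example above) that creates the kind of zero-byte marker object the S3 console creates for a "folder", then uploads a file under that prefix:

import boto3

s3 = boto3.client('s3')

# A zero-byte object whose key ends in "/" is what the S3 console
# creates when you click "Create folder"; S3 itself has no directories.
s3.put_object(Bucket='yavdhesh-bucket', Key='inside-folder/')

# Uploading a file under the prefix also makes the "folder" appear,
# exactly like the `aws s3 cp` command above.
s3.upload_file('demo.txt', 'yavdhesh-bucket', 'inside-folder/demo.txt')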
Hope it helps.
This is not possible using an AWS CloudFormation template.
It should be mentioned that folders do not actually exist in Amazon S3. Instead, the path of an object is prepended to the name (Key) of the object.
So, a file bar.txt stored in a folder named foo is actually stored with a Key of: foo/bar.txt
You can also copy files to a folder that doesn't exist, and the folder will appear to be created automatically (it isn't really, since the folder itself never exists as an object). The Management Console simply provides the appearance of such a folder, and the object's path suggests that it is stored inside it.
Bottom line: There is no need to pre-create a folder. Just use it as if it were already there.
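To make the flat key model concrete, here is a short boto3 sketch (the bucket name is a placeholder) showing that uploading into a never-created "folder" just works, and how a folder-style listing is derived:

import boto3

s3 = boto3.client('s3')

# No folder "foo" exists and none is created; the object is simply
# stored under the key "foo/bar.txt".
s3.put_object(Bucket='mybucket', Key='foo/bar.txt', Body=b'hello')

# The console's folder view comes from listing with a delimiter:
# "foo/" is reported under CommonPrefixes, not as a real object.
resp = s3.list_objects_v2(Bucket='mybucket', Delimiter='/')
print([p['Prefix'] for p in resp.get('CommonPrefixes', [])])  # ['foo/']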
AWS doesn't provide an official CloudFormation resource to create objects within an S3 bucket. However, you can create a Lambda-backed Custom Resource to perform this function using the AWS SDK, and in fact the gilt/cloudformation-helpers GitHub repository provides an off-the-shelf custom resource that does just this.
As with any Custom Resource, the setup is a bit verbose, since you first need to deploy the Lambda function and IAM permissions, then reference it as a custom resource in your stack template.
First, add the Lambda::Function and associated IAM::Role resources to your stack template:
"S3PutObjectFunctionRole": {
"Type": "AWS::IAM::Role",
"Properties": {
"AssumeRolePolicyDocument": {
"Version" : "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": [ "lambda.amazonaws.com" ]
},
"Action": [ "sts:AssumeRole" ]
}
]
},
"ManagedPolicyArns": [
{ "Ref": "RoleBasePolicy" }
],
"Policies": [
{
"PolicyName": "S3Writer",
"PolicyDocument": {
"Version" : "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:DeleteObject",
"s3:ListBucket",
"s3:PutObject"
],
"Resource": "*"
}
]
}
}
]
}
},
"S3PutObjectFunction": {
"Type": "AWS::Lambda::Function",
"Properties": {
"Code": {
"S3Bucket": "com.gilt.public.backoffice",
"S3Key": "lambda_functions/cloudformation-helpers.zip"
},
"Description": "Used to put objects into S3.",
"Handler": "aws/s3.putObject",
"Role": {"Fn::GetAtt" : [ "S3PutObjectFunctionRole", "Arn" ] },
"Runtime": "nodejs",
"Timeout": 30
},
"DependsOn": [
"S3PutObjectFunctionRole"
]
},
Then you can use the Lambda function as a Custom Resource to create your S3 object:
"MyFolder": {
"Type": "Custom::S3PutObject",
"Properties": {
"ServiceToken": { "Fn::GetAtt" : ["S3PutObjectFunction", "Arn"] },
"Bucket": "mybucket",
"Key": "myfolder/"
}
},
You can also use the same Custom Resource to write a string-based S3 object by adding a Body parameter in addition to Bucket and Key (see the docs).
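For example, a sketch of such a resource (the logical name MyFile, the key, and the body string are illustrative):

"MyFile": {
  "Type": "Custom::S3PutObject",
  "Properties": {
    "ServiceToken": { "Fn::GetAtt": [ "S3PutObjectFunction", "Arn" ] },
    "Bucket": "mybucket",
    "Key": "myfolder/myfile.txt",
    "Body": "Hello, world!"
  }
},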
I ended up with a small Python script. It has to be run manually, but it does the sync automatically. It's for lazy people who don't want to create a Lambda-backed Custom Resource.
import json
import subprocess

STACK_NAME = '<name of your CloudFormation stack>'
S3_RESOURCE = '<name of your s3 resource, as in CloudFormation template file>'
LOCAL_DIR = '<path of your local dir>'

# Resolve the bucket's physical name from its logical ID in the stack.
res = subprocess.run(
    ['aws', 'cloudformation', 'describe-stack-resource',
     '--stack-name', STACK_NAME, '--logical-resource-id', S3_RESOURCE],
    capture_output=True,
    check=True,
)
out = res.stdout.decode('utf-8')
resource_details = json.loads(out)
resource_id = resource_details['StackResourceDetail']['PhysicalResourceId']

# Sync the local directory into the bucket.
subprocess.run(
    ['aws', 's3', 'sync', LOCAL_DIR, f's3://{resource_id}/', '--acl', 'public-read'],
    check=True,
)
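Note that the script shells out to the AWS CLI, so it assumes the CLI is installed and configured with credentials that can describe the stack and write to the bucket.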