boto

Upload Images to Amazon S3 using Django

风流意气都作罢 submitted 2019-12-30 03:28:06

Question: I'm currently resizing images on the fly when a user uploads a picture. The original picture is stored on Amazon S3 in a bucket called djangobucket. This bucket contains thousands of folders, each named after a user. I don't have to worry about bucket or folder creation, since all of that is handled on the client side. Here is a diagram: djangobucket → bob → picture1.jpg, picture2.jpg, picture3.jpg, picture4.jpg. As you can see, Bob has many
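A minimal boto2 sketch of the upload step, assuming the bucket already exists and using a key prefix as the per-user "folder". The bucket, user, and file names are illustrative, not taken from the asker's code:

```python
def user_image_key(username, filename):
    """S3 has no real folders; a 'username/filename' key prefix acts as one."""
    return "%s/%s" % (username, filename)

def upload_image(bucket, username, filename, data, content_type="image/jpeg"):
    """Store resized image bytes under the user's prefix in an existing bucket."""
    from boto.s3.key import Key  # legacy boto2, imported lazily
    k = Key(bucket)
    k.key = user_image_key(username, filename)
    k.set_metadata("Content-Type", content_type)
    k.set_contents_from_string(data)
    return k.key

# Usage (hypothetical credentials and bucket):
#   import boto
#   conn = boto.connect_s3(AWS_KEY, AWS_SECRET)
#   bucket = conn.get_bucket("djangobucket")
#   upload_image(bucket, "bob", "picture5.jpg", resized_bytes)
```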

Retrieve all items from DynamoDB using query?

五迷三道 submitted 2019-12-29 07:24:12

Question: I am trying to retrieve all items in a DynamoDB table using a query. Below is my code: import boto.dynamodb2 from boto.dynamodb2.table import Table from time import sleep c = boto.dynamodb2.connect_to_region(aws_access_key_id="XXX", aws_secret_access_key="XXX", region_name="us-west-2") tab = Table("rip.irc", connection=c) x = tab.query() for i in x: print i sleep(1) However, I receive the following error: ValidationException: ValidationException: 400 Bad Request {'message': 'Conditions can be of
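The ValidationException arises because a DynamoDB Query requires a key condition; reading every item needs a Scan instead. A sketch against the table from the question, with boto2 imported lazily so only the pure helper runs without AWS:

```python
def fetch_all_items(table):
    """Return every item in the table; boto2's Table.scan() pages internally."""
    return list(table.scan())

def connect_table(table_name="rip.irc", region="us-west-2"):
    """Hypothetical helper: open a DynamoDB table with legacy boto2."""
    import boto.dynamodb2
    from boto.dynamodb2.table import Table
    conn = boto.dynamodb2.connect_to_region(region)
    return Table(table_name, connection=conn)
```

`fetch_all_items` only depends on the object exposing a `scan()` method, so it works unchanged against the real Table.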

boto issue with IAM role

有些话、适合烂在心里 submitted 2019-12-28 12:03:53

Question: I'm trying to use AWS's recently announced "IAM roles for EC2" feature, which lets security credentials be delivered automatically to EC2 instances (see http://aws.amazon.com/about-aws/whats-new/2012/06/11/Announcing-IAM-Roles-for-EC2-instances/). I've set up an instance with an IAM role as described. I can also get (seemingly) proper access keys/credentials with curl. However, boto fails on even a simple call like "get_all_buckets", even though I've turned on ALL S3 permissions for the role.
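A common cause is an old boto: instance-profile credentials were, to my recollection, only picked up automatically from around boto 2.5.1 (treat that cutoff as an assumption). With a recent boto, connecting with no explicit keys lets the library read the temporary role credentials from instance metadata:

```python
def supports_instance_roles(boto_version):
    """Rough check; boto gained IAM-role credential support around 2.5.1 (assumed)."""
    parts = tuple(int(p) for p in boto_version.split(".")[:3])
    return parts >= (2, 5, 1)

def connect_s3_via_role():
    """Pass no keys so boto fetches role credentials from the metadata service."""
    import boto  # legacy boto2
    if not supports_instance_roles(boto.__version__):
        raise RuntimeError("this boto predates IAM role support; upgrade it")
    return boto.connect_s3()
```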

Get volume information associated with Instance

馋奶兔 submitted 2019-12-25 07:39:40

Question: I'm trying to retrieve all the volumes associated with an instance. if volume.attachment_state() == 'attached': volumesinstance = ec2_connection.get_all_instances() ids = [z for k in volumesinstance for z in k.instances] for s in ids: try: tags = s.tags instance_name = tags["Name"] print (instance_name) except Exception as e: print e However, it's not working as intended. Answer 1: You can add filters to the get_all_instances method like this: filter = {'block-device-mapping.volume-id': volume.id}
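The answer's filter can be sketched as a small helper; the filter name comes from the answer itself, and only the dict construction is exercised without AWS:

```python
def volume_filter(volume_id):
    """EC2 filter matching instances that have this volume attached."""
    return {"block-device-mapping.volume-id": volume_id}

def instances_for_volume(ec2_connection, volume):
    """Return the instance objects a volume is attached to (legacy boto2)."""
    reservations = ec2_connection.get_all_instances(
        filters=volume_filter(volume.id))
    # get_all_instances returns reservations, each wrapping its instances
    return [i for r in reservations for i in r.instances]
```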

AWS S3: Enable encryption through API/Script

拈花ヽ惹草 submitted 2019-12-25 06:39:15

Question: We have images stored in AWS S3 for our production services. Is there any API that will enable encryption on these existing resources without downloading and re-uploading them? I see the boto module in Python allows cloning a key with additional parameters, e.g. encryption, but this creates a new key. As these keys are stored in a separate database, we want to retain the existing keys but just enable encryption. Answer 1: Here's some code that will convert all files in a bucket to use
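In boto2 a key can be copied onto itself with `encrypt_key=True`, which rewrites the object server-side encrypted without changing its name, so the key names stored in the external database stay valid. A sketch, verified below against a stand-in bucket:

```python
def encrypt_bucket_in_place(bucket):
    """Re-copy every key onto itself with SSE enabled; key names are unchanged."""
    converted = []
    for key in bucket.list():
        key.copy(bucket.name, key.name, encrypt_key=True, preserve_acl=True)
        converted.append(key.name)
    return converted
```

Note that each in-place copy is billed as a PUT/COPY request, which matters for buckets with many objects.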

iter() returned non-iterator of type 'Key' : boto amazon s3

China☆狼群 submitted 2019-12-25 01:16:51

Question: I am new to boto. I was trying out the tutorial at this link: http://boto.s3.amazonaws.com/s3_tut.html However, every time I try to retrieve an item an error occurs. My code is as follows: conn=boto.connect_s3(KEY,PRIVATEKEY) bucket= conn.create_bucket(bucketname) from boto.s3.Key import Key k= Key(bucket,'key') k.get_contents_to_file(filename) I get the following error: k.get_contents_to_file('test') Traceback (most recent call last): File "<pyshell#13>", line 1, in <module> k.get_contents_to_file
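That traceback is characteristic of running an old boto2 under Python 3: the iterator defines Python 2's `next()` but not `__next__`, so upgrading boto (or matching the Python version the tutorial assumes) fixes it. Two smaller issues in the snippet: the module path is lowercase (`boto.s3.key`), and reading from an existing bucket should use `get_bucket`, not `create_bucket`. The renaming pattern behind the iterator fix, shown on a toy class:

```python
class Py3IterShim:
    """Illustrative pattern: alias Python 2's next() to Python 3's __next__."""
    def __next__(self):
        return self.next()

class CountToTwo(Py3IterShim):
    """Toy Python 2 style iterator made Python 3 compatible by the shim."""
    def __init__(self):
        self.i = 0
    def __iter__(self):
        return self
    def next(self):  # Python 2 spelling; __next__ above delegates here
        self.i += 1
        if self.i > 2:
            raise StopIteration
        return self.i
```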

Dynamodb Update Item Expression with python boto3

心已入冬 submitted 2019-12-24 17:28:25

Question: I have a string field, "title". I am trying to update it with the update expression persontable.update_item(Key={'person_id':person_id}, UpdateExpression="SET title = UPDATED") and I get: An error occurred (ValidationException) when calling the UpdateItem operation: The provided expression refers to an attribute that does not exist in the item. I can see the attribute "title" for that person in the AWS console. What gives? Answer 1: Don't plug the value into the expression directly. Rather, use
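The bare word `UPDATED` in the expression is parsed as an attribute name, hence the "attribute that does not exist" error; the value belongs in a `:placeholder` bound through `ExpressionAttributeValues`. A sketch of the parameter-building step (only the dict construction runs without AWS):

```python
def set_title_params(new_title):
    """Build update_item kwargs that bind the value via a placeholder."""
    return {
        "UpdateExpression": "SET title = :t",
        "ExpressionAttributeValues": {":t": new_title},
    }

# Usage (boto3, hypothetical table and key):
#   persontable.update_item(Key={"person_id": person_id},
#                           **set_title_params("UPDATED"))
```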

increase EC2 EBS volume after cloning - resize2fs not working

时光总嘲笑我的痴心妄想 submitted 2019-12-24 15:42:59

Question: This is a very similar problem to "EC2 Can't resize volume after increasing size". However, I cannot resolve this manually using fdisk because I'm trying to get the whole process to run automatically. I'm using a Python boto (2.39) script (snippet) that takes a snapshot, registers a new AMI with the same block device mapping, and then creates an instance from it. It's all working well: the new instance is created with a larger volume size and loads fine. The only problem I see
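When the enlarged volume still shows the old size, it is usually the partition, not the filesystem, that has not grown; scripting `growpart` (from cloud-guest-utils) before `resize2fs` avoids the manual fdisk step. A sketch shelling out via subprocess; the device name and ext filesystem are assumptions, and NVMe instance types name partitions differently:

```python
import subprocess

def partition_path(device, number):
    """'/dev/xvda', 1 -> '/dev/xvda1' (NVMe devices would need a 'p1' suffix)."""
    return "%s%d" % (device, number)

def grow_root_filesystem(device="/dev/xvda", number=1):
    """Grow the partition table entry first, then the ext filesystem, online."""
    subprocess.check_call(["growpart", device, str(number)])
    subprocess.check_call(["resize2fs", partition_path(device, number)])
```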

Boto2 file upload gives ConnectionResetError

坚强是说给别人听的谎言 submitted 2019-12-24 06:34:50

Question: I am trying to upload files to an S3 bucket using code from this question: https://stackoverflow.com/a/15087468/291372. I am using boto2 (boto3 has too many dependencies). I tried many methods, but none of them works for me. CORS was checked for the bucket and set to allow origin from "*". Here is my code: # -*- coding: utf-8 -*- import boto import boto.s3 import sys from boto.s3.key import Key AWS_ACCESS_KEY_ID = 'XXXXXXXXXXXXX' AWS_SECRET_ACCESS_KEY = 'YYYYYYYYYYYYYYYYYYYYYy' S3_BUCKET =
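Connection resets on boto2 uploads are often caused by talking to the wrong regional endpoint or by one very large PUT; connecting to the bucket's own region (via `boto.s3.connect_to_region`) and uploading in parts usually helps. A sketch with illustrative region and chunk size; only the chunk arithmetic runs without AWS:

```python
import math

def part_count(total_size, part_size):
    """Number of multipart chunks needed (always at least one)."""
    return max(1, int(math.ceil(total_size / float(part_size))))

def multipart_upload(bucket, key_name, filename, part_size=50 * 1024 * 1024):
    """Upload a local file to S3 in parts with legacy boto2."""
    import os
    size = os.stat(filename).st_size
    mp = bucket.initiate_multipart_upload(key_name)
    with open(filename, "rb") as fp:
        for i in range(part_count(size, part_size)):
            # part numbers are 1-based; the last part may be short
            mp.upload_part_from_file(fp, part_num=i + 1,
                                     size=min(part_size, size - i * part_size))
    mp.complete_upload()
```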