boto

How to get the aws s3 object key using django-storages and boto3

Submitted by 假装没事ソ on 2019-12-24 03:41:34
Question: I am using django-storages and boto3 for media and static files on AWS S3. I need to get the object key of an S3 object so that I can generate a URL for it.

    client = boto3.client('s3')
    bucket_name = 'django-bucket'
    key = ???
    u = client.generate_presigned_url(
        'get_object',
        Params={
            'Bucket': bucket_name,
            'Key': key,
            'ResponseContentType': 'image/jpeg',
            'ResponseContentDisposition': 'attachment; filename="your-filename.jpeg"',
        },
        ExpiresIn=1000,
    )

These are in my settings: STATICFILES
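With django-storages, a FileField's .name attribute holds the storage-relative path, which (plus any configured location prefix such as AWS_LOCATION) is the S3 object key. A minimal sketch, assuming a hypothetical Photo model with an image field stored via the S3 backend inside a configured Django project:

    import boto3

    # Hypothetical model: Photo.image is a FileField backed by
    # django-storages' S3 storage; adjust names to your project.
    photo = Photo.objects.first()
    key = photo.image.name  # storage-relative path == S3 object key

    client = boto3.client('s3')
    url = client.generate_presigned_url(
        'get_object',
        Params={'Bucket': 'django-bucket', 'Key': key},
        ExpiresIn=1000,
    )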

How to Upload/download to S3 without changing Last Modified date?

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-24 01:09:08
Question: I want to upload and download files to S3 using boto3 without changing their "LastModified" date, so I can keep tabs on the age of the contents. Whenever I upload or download a file, it takes on the date of that operation, and I lose the date the contents were actually modified. I'm looking at the timestamp of the files using fileObj.get('LastModified'), where fileObj is taken from a paginator result. I'm using the following command to upload:

    s3Client.upload_fileobj(data, bucket_name, destpath)
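S3's LastModified is set by the service on every write and cannot be set by the client, so one common workaround (a sketch, not from the question; bucket, key, and metadata values are placeholders) is to carry the original mtime in user-defined object metadata:

    import boto3

    s3Client = boto3.client('s3')

    # Upload: stash the source file's mtime in user metadata.
    with open('report.csv', 'rb') as data:
        s3Client.upload_fileobj(
            data, 'my-bucket', 'reports/report.csv',
            ExtraArgs={'Metadata': {'src-mtime': '2019-12-01T09:30:00Z'}},
        )

    # Later: read the stored mtime back instead of relying on LastModified.
    head = s3Client.head_object(Bucket='my-bucket', Key='reports/report.csv')
    print(head['Metadata'].get('src-mtime'))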

Boto CloudSearch on GAE: TypeError: request() got an unexpected keyword argument 'config'

Submitted by 社会主义新天地 on 2019-12-24 00:54:14
Question: I'm using Boto 2.8 on GAE to search and index docs in AWS CloudSearch. When I try to index a document I get the following error:

    TypeError: request() got an unexpected keyword argument 'config'

Searching the web suggests there is a version-compatibility issue with the requests library. The issue seems to come from lines 189-199 of boto/cloudsearch/document.py:

    request_config = {
        'pool_connections': 20,
        'keep_alive': True,
        'max_retries': 5,
        'pool_maxsize': 50
    }
    r = requests.post(url, data=sdf,
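The 'config' keyword was removed from the requests API in requests 1.0, while boto 2.8's cloudsearch module still passes it. Upgrading boto, or pinning requests below 1.0, may resolve it; failing that, a workaround sketch (my assumption, not a confirmed fix) is to strip the argument before it reaches requests:

    import requests

    _orig_post = requests.post

    def _post_without_config(*args, **kwargs):
        # requests >= 1.0 dropped the 'config' kwarg that boto 2.8's
        # cloudsearch module still passes; silently discard it.
        kwargs.pop('config', None)
        return _orig_post(*args, **kwargs)

    requests.post = _post_without_config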

boto s3 Bucket versus get_bucket

Submitted by 梦想与她 on 2019-12-24 00:10:07
Question: I am trying to access a key inside a bucket for which I don't have permissions, though I do have them for the key itself. In order to do get_key('this/is/my_key'), I need the bucket object:

    conn = boto.connect_s3(key, secret_key)
    my_bucket = conn.get_bucket('a_bucket')

yields S3ResponseError: S3ResponseError: 403 Forbidden. On the other hand, the following works:

    my_bucket = boto.s3.bucket.Bucket(conn, 'a_bucket')
    my_bucket.get_key('this/is/my_key')

Question: What is the difference between creating the
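The difference comes down to validation: get_bucket() issues a request against the bucket itself, which needs bucket-level permission, while constructing Bucket() directly makes no request at all. boto exposes this as a flag, so a sketch of the usual fix (reusing the question's credentials):

    import boto

    conn = boto.connect_s3(key, secret_key)
    # validate=False skips the bucket-level request that triggers the 403,
    # matching the behavior of constructing Bucket() by hand.
    my_bucket = conn.get_bucket('a_bucket', validate=False)
    my_bucket.get_key('this/is/my_key')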

boto .get_all_keypairs() method and the .save() of its results

Submitted by 巧了我就是萌 on 2019-12-23 22:30:09
Question: I have access to a number of EC2 instances, some of which have been running for years. We have a special repository of the private keys to all of these; thus I can, for most of our instances, get into them as root (or the 'ubuntu' user in some cases) to administer them. While playing with boto I noticed the EC2 .get_keypair() and .get_all_keypairs() methods and was wondering whether they could be used to recover any SSH keys that have slipped through the cracks of our procedures and been lost.
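For what it's worth, EC2 stores only the public half of a key pair, so these calls cannot recover a lost private key; the private key material is only available in the response of the create_key_pair() call that generated it. A small sketch of what get_all_keypairs() actually returns:

    import boto.ec2

    conn = boto.ec2.connect_to_region('us-east-1')
    for kp in conn.get_all_keypairs():
        # Only the name and fingerprint come back; there is no private
        # key material to .save() for pre-existing pairs.
        print(kp.name, kp.fingerprint)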

DB Security Groups can only be associated with VPC DB Instances using API versions

Submitted by 匆匆过客 on 2019-12-23 22:17:18
Question: I have the code below to create an RDS instance in AWS:

    import boto.rds

    REGION = "us-east-1"
    INSTANCE_TYPE = "db.t1.micro"
    ID = "MySQL-db-instance-database-test2"
    USERNAME = "root"
    PASSWORD = "pass"
    DB_PORT = 3306
    DB_SIZE = 5
    DB_ENGINE = "MySQL5.1"
    DB_NAME = "databasetest2"
    SECGROUP_HANDLE = "default"

    print "Connecting to RDS"
    conn = boto.rds.connect_to_region(REGION)
    print "Creating a RDS Instance"
    instance = conn.create_dbinstance(ID, DB_SIZE, INSTANCE_TYPE, USERNAME, PASSWORD, port=DB_PORT, engine
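This error usually means the account is VPC-only (default VPC), where classic DB security groups are not allowed and a VPC security group must be used instead. Recent boto 2 releases accept a vpc_security_groups argument on create_dbinstance; whether it takes plain ids or membership objects varies by version, so treat the following as a sketch with a hypothetical sg-xxxxxxxx id:

    # Sketch: pass a VPC security group id instead of the classic
    # 'default' DB security group.
    instance = conn.create_dbinstance(
        ID, DB_SIZE, INSTANCE_TYPE, USERNAME, PASSWORD,
        port=DB_PORT, engine=DB_ENGINE, db_name=DB_NAME,
        vpc_security_groups=['sg-xxxxxxxx'],
    )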

How to read binary file on S3 using boto?

Submitted by 老子叫甜甜 on 2019-12-23 17:21:38
Question: I have a series of Python scripts / Excel files in an S3 folder (private section). I can access them through an HTTP URL if they are public. I am wondering how I can access them in binary form in order to execute them:

    FileURL = 'URL of the File hosted in S3 Private folder'
    exec(FileURL)
    run(FileURL)

Answer 1: I'm not totally sure I understood your question, but here is one answer based on how I interpreted it. As long as you know your bucket name and object/key name, you can do the following with boto3
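To flesh that answer out, a minimal sketch (bucket and key names are placeholders) that pulls a private object's raw bytes with boto3 and executes them as Python source:

    import boto3

    s3 = boto3.client('s3')
    # Placeholders: substitute your bucket and object key.
    obj = s3.get_object(Bucket='my-bucket', Key='scripts/helper.py')
    source = obj['Body'].read()   # raw bytes, works on private objects
    exec(source.decode('utf-8'))  # assumes the object is Python source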

Compare launch time of EC2 instance and current time in python

Submitted by ℡╲_俬逩灬. on 2019-12-23 13:58:11
Question: I extract launch_time from an EC2 instance; it comes back as a Unicode string like this: 2014-12-22T08:46:10.000Z. I use the dateutil parser to convert it to a datetime with launch_time = parser.parse(instance.launch_time), which gives me launch_time like this: 2014-12-22 08:46:10+00:00. I want to compare this launch time with the current time to see how long the instance has been running. I get the current time with current_time = datetime.datetime.now() and it looks like this: 2014-12-22 11:46:10
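The comparison fails (or is off by the UTC offset) because parser.parse() returns a timezone-aware datetime while datetime.now() returns a naive one; a sketch of the aware-vs-aware fix:

    import datetime
    from dateutil import parser
    from dateutil.tz import tzutc

    launch_time = parser.parse('2014-12-22T08:46:10.000Z')  # aware, UTC
    current_time = datetime.datetime.now(tzutc())           # aware, UTC
    uptime = current_time - launch_time
    print(uptime.total_seconds())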

Boto file upload to S3 failing on Windows [errno: 10054]

Submitted by 夙愿已清 on 2019-12-23 13:24:26
Question: I'm trying to upload a file to S3 using boto on a Windows 7 machine, but I keep getting an error: [Errno 10054] An existing connection was forcibly closed by the remote host. My code to interact with S3 looks like this:

    from boto.s3.connection import S3Connection
    from boto.s3.key import Key

    conn = S3Connection(Access_Key_ID, Secret_Key)
    bucket = conn.lookup(bucket_name)
    k = Key(bucket)
    k.key = 'akeynameformyfile'
    k.set_contents_from_filename(source_path_of_file_to_upload)

The upload works fine
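Errno 10054 alone doesn't pinpoint the cause, so a first diagnostic step (a sketch, not the accepted fix) is to turn on boto's wire-level logging and retry the upload to see where the connection drops:

    import boto
    from boto.s3.connection import S3Connection

    # Log boto's HTTP traffic so the point of disconnection is visible.
    boto.set_stream_logger('boto')
    conn = S3Connection(Access_Key_ID, Secret_Key, debug=2)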