boto

How to change AWS EC2 instance type?

余生颓废 submitted on 2020-01-03 02:28:10
Question: I wanted to change the AWS EC2 instance type (e.g. from micro to large, or vice versa) using Boto3. What factors need to be considered while changing the instance type of EC2 instances? Here is my code:

    def get_ec2_boto3_connection(region, arn):
        sess = Boto3Connecton.get_boto3_session(arn)
        ec2_conn = sess.client(service_name='ec2', region_name=region)
        return ec2_conn

    def change_instance_type(arn, region):
        ec2_conn = get_ec2_boto3_connection(region, arn)
        ec2_conn.modify_instance_attribute
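The key constraint is that an instance's type can only be modified while it is stopped. A minimal sketch of the stop → modify → start sequence, assuming a boto3 EC2 client is passed in (the function name, instance id, and target type below are illustrative, not from the question):

```python
def resize_instance(ec2, instance_id, new_type):
    """Stop an instance, change its type, and start it again.

    `ec2` is a boto3 EC2 client; `instance_id` and `new_type`
    (e.g. 't3.large') are hypothetical example values.
    """
    # The instance type can only be changed while the instance is stopped.
    ec2.stop_instances(InstanceIds=[instance_id])
    ec2.get_waiter('instance_stopped').wait(InstanceIds=[instance_id])

    # The attribute value is wrapped in a {'Value': ...} dict.
    ec2.modify_instance_attribute(
        InstanceId=instance_id,
        InstanceType={'Value': new_type},
    )

    ec2.start_instances(InstanceIds=[instance_id])
```

Other things to watch for: stopping loses instance-store data and (without an Elastic IP) the public IP, and the target type must be compatible with the AMI's virtualization type.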

Trouble setting Cache-Control header for Amazon S3 key using boto

China☆狼群 submitted on 2020-01-02 06:40:42
Question: My Django project uses django_compressor to store JavaScript and CSS files in an S3 bucket via boto, through the django-storages package. The django-storages-related config includes:

    if 'AWS_STORAGE_BUCKET_NAME' in os.environ:
        AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']

    AWS_HEADERS = {
        'Cache-Control': 'max-age=100000',
        'x-amz-acl': 'public-read',
    }
    AWS_QUERYSTRING_AUTH = False

    # This causes images to be stored in Amazon S3
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto
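One thing worth knowing here: headers configured in AWS_HEADERS only apply to objects uploaded after the setting is in place; keys already in the bucket keep whatever headers they were stored with. A sketch of fixing an existing key, assuming boto3 (the bucket/key names are hypothetical), by copying the object over itself with replaced metadata:

```python
def cache_headers(max_age):
    """Build the header values for a long-lived public object."""
    return {'CacheControl': 'max-age=%d' % max_age, 'ACL': 'public-read'}

def refresh_cache_control(s3, bucket, key, max_age=100000):
    """Rewrite an existing S3 key in place so a new Cache-Control sticks.

    `s3` is a boto3 S3 client. MetadataDirective='REPLACE' is required,
    otherwise the copy silently keeps the source object's headers.
    """
    params = cache_headers(max_age)
    s3.copy_object(
        Bucket=bucket,
        Key=key,
        CopySource={'Bucket': bucket, 'Key': key},
        CacheControl=params['CacheControl'],
        ACL=params['ACL'],
        MetadataDirective='REPLACE',  # replace, don't inherit, metadata
    )
```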

How to calculate DynamoDB item size? Getting ValidationError at the 400 KB limit with boto

こ雲淡風輕ζ submitted on 2020-01-01 19:35:11
Question:

    ValidationException: ValidationException: 400 Bad Request
    {u'message': u'Item size has exceeded the maximum allowed size',
     u'__type': u'com.amazon.coral.validate#ValidationException'}

The item object I have has a size of 92004 bytes:

    >>> iii
    <boto.dynamodb2.items.Item object at 0x7f7922c97190>
    >>> iiip = iii.prepare_full()  # now in DynamoDB wire format, e.g. "Item":{"time":{"N":"300"}, "user":{"S":"self"}}
    >>> len(json.dumps(iiip))
    92004
    >>>

The size I get, 92004, is less than 400 KB. Why do I see
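A point of confusion worth flagging: DynamoDB's 400 KB limit is computed over UTF-8 attribute names plus the raw attribute values, not over a JSON serialization, so `len(json.dumps(...))` can disagree with the service's accounting in either direction (JSON adds quotes, braces, and type tags; binary values, by contrast, count at their decoded size, larger than their base64 text). A rough estimator over wire-format items, as a sketch (the number encoding below is an approximation, since the exact internal format is not public):

```python
def estimate_attr_size(value):
    """Roughly estimate the stored size of one wire-format value.

    `value` is a typed dict such as {"S": "self"} or {"N": "300"}.
    """
    (tag, val), = value.items()
    if tag == 'S':
        return len(val.encode('utf-8'))
    if tag == 'N':
        # Numbers are stored compactly: roughly one byte per two
        # significant digits, plus one byte of overhead (approximation).
        digits = len(val.lstrip('-').replace('.', ''))
        return (digits + 1) // 2 + 1
    if tag == 'L':
        return 3 + sum(estimate_attr_size(v) for v in val)
    if tag == 'M':
        return 3 + sum(len(k.encode('utf-8')) + estimate_attr_size(v)
                       for k, v in val.items())
    raise ValueError('unhandled type tag: %r' % tag)

def estimate_item_size(item):
    """Sum attribute-name bytes plus value bytes for a wire-format item."""
    return sum(len(name.encode('utf-8')) + estimate_attr_size(value)
               for name, value in item.items())
```

Note also that items projected into a local secondary index are counted again toward the index's own size accounting, which can trip writes that look under the limit.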

Issues trying to SSH into a fresh EC2 instance with Paramiko

泪湿孤枕 submitted on 2020-01-01 08:42:30
Question: I'm working on a script that spins up a fresh EC2 instance with boto and uses the Paramiko SSH client to execute remote commands on the instance. For whatever reason, the Paramiko client is unable to connect; I get the error:

    Traceback (most recent call last):
      File "scripts/sconfigure.py", line 29, in <module>
        ssh.connect(instance.ip_address, username='ubuntu', key_filename=os.path.expanduser('~/.ssh/test'))
      File "build/bdist.macosx-10.3-fat/egg/paramiko/client.py", line 291, in connect
      File
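The usual cause: a fresh instance reports state "running" before sshd is actually accepting connections, so the first connect attempt lands on a closed port. The standard fix is to retry the connection with a delay. A generic retry helper, as a sketch (the commented Paramiko call is only illustrative):

```python
import time

def retry(fn, attempts=10, delay=5, exceptions=(Exception,)):
    """Call `fn` until it succeeds or `attempts` are exhausted.

    Re-raises the last exception if every attempt fails.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except exceptions:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)

# Illustrative use (names assumed from the question, not executed here):
#   retry(lambda: ssh.connect(instance.ip_address, username='ubuntu',
#                             key_filename=os.path.expanduser('~/.ssh/test')),
#         attempts=12, delay=10)
```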

Django as S3 proxy

烂漫一生 submitted on 2020-01-01 06:01:19
Question: I extended a ModelAdmin with a custom field "Download file", which is a link to a URL in my Django project, like http://www.myproject.com/downloads/1. There, I want to serve a file which is stored in an S3 bucket. The files in the bucket are not publicly readable, and the user may not have direct access to them. Now I want to:

- avoid the file being loaded into server memory (these are multi-GB files)
- avoid temp files on the server

The ideal solution would be to let Django act as a

Files served unbearably slowly from Amazon S3

痴心易碎 submitted on 2020-01-01 05:48:09
Question: I have a Django app on Heroku which serves the static files from an Amazon S3 bucket. I use the boto library and followed the guide on the website. What can I do to speed up the file transfers? Some of the code:

    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_ACCESS_KEY_ID = 'xxxx'
    AWS_SECRET_ACCESS_KEY = 'xxxx'
    AWS_STORAGE_BUCKET_NAME = 'boxitwebservicebucket'
    STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    STATIC_URL = 'http://' + AWS_STORAGE_BUCKET_NAME + '.s3
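The usual levers are far-future Cache-Control headers, so browsers and proxies stop re-fetching unchanged assets, disabling signed query strings (which defeat shared caches), and putting a CloudFront distribution in front of the bucket so files are served from an edge location. A settings sketch for the s3boto backend shown above (the max-age value and CloudFront domain are example assumptions):

```python
# settings.py (sketch; values are examples, not real endpoints)
AWS_HEADERS = {
    'Cache-Control': 'max-age=31536000, public',  # one year for static assets
}
AWS_QUERYSTRING_AUTH = False  # signed query strings vary the URL per request

# Serve through a CDN edge instead of the raw bucket endpoint:
STATIC_URL = 'http://d1234example.cloudfront.net/'
```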

Create an MTurk HIT from an existing template

匆匆过客 submitted on 2020-01-01 05:18:06
Question: I'm using the Python AWS package boto v2.7 to interact with the MTurk API to create and manage HITs, etc. I'm getting stuck when trying to create a HIT using an existing template. Amazon's documentation on the topic is here: http://docs.aws.amazon.com/AWSMechTurk/2012-03-25/AWSMturkAPI/ApiReference_CreateHITOperation.html

My code is:

    from boto.mturk.connection import MTurkConnection

    mtc = MTurkConnection(aws_access_key_id=ACCESS_ID,
                          aws_secret_access_key=SECRET_KEY,
                          host=HOST)
    mtc.create_hit
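In the modern boto3 MTurk client (not the boto 2.7 API the question uses), creating a HIT from a saved template is done by passing the template's HITLayoutId plus its placeholder values to create_hit. A sketch under that assumption; the layout id, field names, and the wrapper function are hypothetical:

```python
def layout_params(fields):
    """Convert a plain dict into the HITLayoutParameters list shape.

    Sorted for deterministic output; values are stringified as the API
    expects strings.
    """
    return [{'Name': k, 'Value': str(v)} for k, v in sorted(fields.items())]

def create_hit_from_layout(layout_id, fields, **hit_kwargs):
    """Hypothetical wrapper around boto3's MTurk create_hit."""
    import boto3  # deferred: illustration only
    mturk = boto3.client('mturk')
    return mturk.create_hit(
        HITLayoutId=layout_id,                 # id shown on the template page
        HITLayoutParameters=layout_params(fields),
        **hit_kwargs,  # Title, Reward, LifetimeInSeconds, etc. as usual
    )
```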

How can we fetch IAM users, their groups and policies?

拟墨画扇 submitted on 2019-12-31 22:49:13
Question: I need to fetch all the AWS users, their corresponding groups and policies, and then whether MFA is activated for them or not. Can anyone tell me how it can be done via the AWS CLI or boto? I have a script that fetches only all the users in AWS:

    import boto3
    from boto3 import *
    import argparse

    access_key = ''
    secret_key = ''

    def get_iam_uses_list():
        client = boto3.client('iam', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
        my_list=list()
        iam_all_users = client.list_users(MaxItems
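Each piece the question asks for has its own paginated IAM call: list_users for the users, then per user list_groups_for_user, list_attached_user_policies, and list_mfa_devices (an empty device list means MFA is off). A sketch with the client passed in, using a paginator so large accounts are not cut off at MaxItems (the report structure is an illustrative choice):

```python
def describe_users(iam):
    """Collect groups, attached policies, and MFA status for every IAM user.

    `iam` is a boto3 IAM client.
    """
    report = []
    for page in iam.get_paginator('list_users').paginate():
        for user in page['Users']:
            name = user['UserName']
            groups = [g['GroupName'] for g in
                      iam.list_groups_for_user(UserName=name)['Groups']]
            policies = [p['PolicyName'] for p in
                        iam.list_attached_user_policies(
                            UserName=name)['AttachedPolicies']]
            # No registered MFA devices means MFA is not activated.
            mfa = bool(iam.list_mfa_devices(UserName=name)['MFADevices'])
            report.append({'user': name, 'groups': groups,
                           'policies': policies, 'mfa_enabled': mfa})
    return report
```

Note that list_attached_user_policies only covers managed policies attached directly to the user; inline policies (list_user_policies) and policies inherited through group membership would need separate calls if a complete picture is required.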