boto3

How to prevent my app from hanging when parallelising paramiko.SFTPClient.get requests?

最后都变了 - Submitted on 2021-01-01 09:08:49

Question: I am trying to parallelise retrieval of files from a server via SFTP and upload them to AWS. I am using Python multi-threading; the upload part works fine, but I noticed that the get operation from paramiko.SFTPClient leaves the program hanging at the end. All of the files are in fact retrieved and uploaded, but the program doesn't exit. I tried many things from similar posts but nothing worked. My pseudo-code is the following; any help would be welcome: def create_sftp_connection(host, port,

“SSL: CERTIFICATE_VERIFY_FAILED” Error when publish MQTT, AWS IoT

情到浓时终转凉″ - Submitted on 2020-12-31 05:38:31

Question: I am getting the following error: [ERROR] SSLError: SSL validation failed for https://data.iot.ap-northeast-2.amazonaws.com/topics/app%2Ftest%2Fresponse?qos=1 [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1124) Traceback (most recent call last): File "/var/task/app.py", line 197, in lambda_handler mqttcli.test('test', '11111', {}, 1, 200) File "/opt/python/lib/python3.8/site-packages/connectors/MQTTClient.py", line 40, in test

Download a folder from S3 using Boto3

我怕爱的太早我们不能终老 - Submitted on 2020-12-24 15:23:04

Question: Using the Boto3 Python SDK, I was able to download files using the method bucket.download_file(). Is there a way to download an entire folder? Answer 1: Quick and dirty, but it works: import boto3 import os def downloadDirectoryFroms3(bucketName, remoteDirectoryName): s3_resource = boto3.resource('s3') bucket = s3_resource.Bucket(bucketName) for obj in bucket.objects.filter(Prefix = remoteDirectoryName): if not os.path.exists(os.path.dirname(obj.key)): os.makedirs(os.path.dirname(obj.key)) bucket

Example of update_item in dynamodb boto3

百般思念 - Submitted on 2020-12-24 05:01:45

Question: Following the documentation, I'm trying to create an update statement that will update one attribute in a DynamoDB table, or add it if it does not exist. I'm trying this: response = table.update_item( Key={'ReleaseNumber': '1.0.179'}, UpdateExpression='SET', ConditionExpression='Attr(\'ReleaseNumber\').eq(\'1.0.179\')', ExpressionAttributeNames={'attr1': 'val1'}, ExpressionAttributeValues={'val1': 'false'} ) The error I'm getting is: botocore.exceptions.ClientError: An error occurred
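The snippet above has three expression problems: `SET` needs `path = :value` pairs, attribute-name placeholders must start with `#` and value placeholders with `:`, and a string `ConditionExpression` takes expression syntax, not the `Attr(...)` helper (that helper belongs to `boto3.dynamodb.conditions`). A sketch of corrected arguments, built as a dict so the shape is easy to inspect (attribute names are illustrative):

```python
def build_update_kwargs(release_number, attr_name, attr_value):
    """Corrected update_item arguments for the snippet above."""
    return {
        "Key": {"ReleaseNumber": release_number},
        "UpdateExpression": "SET #a = :v",
        "ConditionExpression": "ReleaseNumber = :rel",
        "ExpressionAttributeNames": {"#a": attr_name},
        "ExpressionAttributeValues": {":v": attr_value, ":rel": release_number},
    }

# usage (table is a boto3 Table resource):
# table.update_item(**build_update_kwargs("1.0.179", "attr1", "false"))
```

Note that `SET` alone already has upsert semantics on an existing item: it overwrites the attribute if present and creates it if not.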

DynamoDB with boto3 - limit acts as page size

笑着哭i - Submitted on 2020-12-15 06:19:32

Question: According to the boto3 docs, the Limit argument in query allows you to limit the number of evaluated objects in your DynamoDB table/GSI. However, LastEvaluatedKey isn't returned when the desired limit is reached, so a client that would like to limit the number of fetched results will fail to do so. Consider the following code: while True: query_result = self._dynamodb_client.query(**query_kwargs) for dynamodb_formatted_item in query_result["Items"]: yield self._convert_dict_from
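Because DynamoDB's `Limit` caps items per request (per page), not per query overall, a total cap has to be enforced client-side while paging with `ExclusiveStartKey`. A minimal sketch of that pattern (the function name and loop shape are illustrative, not the question's code):

```python
def query_with_client_limit(client, query_kwargs, max_items):
    """Yield at most max_items results, paging with ExclusiveStartKey.
    DynamoDB's Limit caps items *per page*; the total is enforced here."""
    yielded = 0
    kwargs = dict(query_kwargs)  # copy so the caller's dict isn't mutated
    while True:
        page = client.query(**kwargs)
        for item in page["Items"]:
            yield item
            yielded += 1
            if yielded >= max_items:
                return  # total cap reached; stop paging
        last_key = page.get("LastEvaluatedKey")
        if not last_key:
            return  # no more pages
        kwargs["ExclusiveStartKey"] = last_key
```

Setting `Limit=max_items` on the request as well avoids over-fetching on the first page; the client-side counter then handles the case where a page boundary does not line up with the cap.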

finding s3 bucket's level 1 prefix sizes while including versions using boto3 and python

馋奶兔 - Submitted on 2020-12-15 05:10:53

Question: I'm an AWS/Python newbie trying to reconcile the total bucket size shown on the Metrics tab in the UI with sizes calculated one folder at a time in a given bucket. I tried to fetch it by setting up an inventory configuration, but it doesn't show what I'm looking for. I have an S3 bucket named my_bucket with versioning enabled. It has 100 objects and 26 subfolders (with 100,000+ objects in each subfolder and at least two versions of each object). WHAT I AM TRYING TO DO: Calculate and display total