aws-sdk-js

Unmarshalling DynamoDB stream data to JSON format

青春壹個敷衍的年華 submitted on 2019-12-09 02:19:29
I have to translate the DDB stream message into a normal JSON type. For this I am using:

unmarshalleddata = aws.DynamoDB.Converter.unmarshall(result.NewImage);

where result.NewImage is:

{
  carrier: { S: 'SPRING' },
  partnerTransactionId: { S: 'a87ce47a46d7416586e0ece39f706d48' },
  shipmentId: { S: 'SPRING2200419561404932' },
  compressedShipmentPayload: { B: 'H4sIAAAAAAAAAO1b+ZPiRrL+Vzr4tb3TkrgaR7yIFWdDI3FJHHqxMSEkIQQ6aB1cjvnfX2ZVAoJur+1963hHeMaepirry8r6MisrRZd+KVhRHDu+mXpR2LULPxeWZkkwq2ZVFEulklWumo65sqt2tVhbrZzqqlL4qWBlfprFTiOyHQA44d86dejdmXEaOjEqqdSqr
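For reference, a minimal sketch of how this conversion usually looks inside a stream-triggered Lambda, assuming the aws-sdk v2 package and the record shape above (the handler name and the gunzip step are illustrative; the 'H4sI' prefix suggests the payload is gzip):

```javascript
// Sketch: convert a DynamoDB Streams NewImage into a plain JS object (aws-sdk v2).
const AWS = require('aws-sdk');
const zlib = require('zlib');

exports.handler = async (event) => {
  for (const record of event.Records) {
    // NewImage is present for INSERT/MODIFY events and arrives in DynamoDB
    // attribute-value format ({ S: ... }, { B: ... }, ...).
    if (!record.dynamodb.NewImage) continue;
    const plain = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.NewImage);

    // In Lambda stream events the B (binary) attribute is delivered base64-encoded;
    // assuming the payload is gzip, it can then be inflated:
    const buf = Buffer.from(plain.compressedShipmentPayload, 'base64');
    const payload = zlib.gunzipSync(buf).toString('utf8');
    console.log(plain.shipmentId, payload);
  }
};
```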

Amazon Connect's getCurrentMetricData is not a function

倖福魔咒の submitted on 2019-12-08 20:24:27
Recently I have been working on a project with AWS Lambda. I created a Lambda function as follows:

var AWS = require('aws-sdk');
exports.handler = (event, context, callback) => {
    // TODO implement
    var connect = new AWS.Connect({apiVersion: '2017-08-08'});
    var params = {
        InstanceId: '' /* required */
    };
    connect.getCurrentMetricData(params, function(err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else {
            const response = {
                statusCode: 200,
                body: JSON.stringify(data)
            };
            callback(null, data);
        } // successful response
    });
    // const response = {
    //     statusCode: 200,
    //     body: JSON
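This error typically means the aws-sdk copy preinstalled in the Lambda runtime predates the Connect real-time metrics API, so the constructed client simply has no getCurrentMetricData method. A hedged sketch of the usual fix, assuming a recent aws-sdk is bundled into the deployment package (the instance ID and queue ARN are placeholders, and GetCurrentMetricData also requires Filters and CurrentMetrics):

```javascript
// Sketch: call getCurrentMetricData with its required parameters (aws-sdk v2).
// Bundle a recent aws-sdk with the function; the runtime's built-in copy may be too old.
const AWS = require('aws-sdk');
const connect = new AWS.Connect({ apiVersion: '2017-08-08' });

exports.handler = (event, context, callback) => {
  const params = {
    InstanceId: 'your-instance-id',   // placeholder
    Filters: {
      Queues: ['your-queue-arn'],     // placeholder
      Channels: ['VOICE']
    },
    CurrentMetrics: [{ Name: 'AGENTS_ONLINE', Unit: 'COUNT' }]
  };
  connect.getCurrentMetricData(params, (err, data) => {
    if (err) return callback(err);
    callback(null, { statusCode: 200, body: JSON.stringify(data) });
  });
};
```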

Abort/stop Amazon AWS S3 upload, AWS SDK JavaScript

不羁岁月 submitted on 2019-12-06 04:31:11
Question: I am using the AWS SDK for JavaScript to upload a file to Amazon S3. Code:

AWS.config.update({
    accessKeyId: 'access-key',
    secretAccessKey: 'secret-key'
});
AWS.config.region = 'region';
var bucket = new AWS.S3({params: {Bucket: 'bucket-name'}});
//var fileChooser = document.getElementById('file');
var files = event.target.files;
$.each(files, function(i, file) {
    //console.log(file.name);
    if (file) {
        var params = {Key: file.name, ContentType: file.type, Body: file};
        bucket.upload(params).on(
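upload() returns an AWS.S3.ManagedUpload object, and calling its abort() method cancels the in-flight transfer. A minimal sketch, assuming the same bucket client as above:

```javascript
// Sketch: keep the handle returned by upload() so the transfer can be aborted (aws-sdk v2).
var upload = bucket.upload({Key: file.name, ContentType: file.type, Body: file});

upload.send(function(err, data) {
  // An aborted upload surfaces here with err.code === 'RequestAbortedError'.
  if (err) console.log('Upload failed or was aborted:', err.code);
  else console.log('Uploaded to', data.Location);
});

// Later, e.g. from a cancel-button handler:
upload.abort();
```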

Return value from callback function in AWS JavaScript SDK

試著忘記壹切 submitted on 2019-12-02 09:51:46
I'm using the AWS JavaScript SDK and following the tutorial on how to send an SQS message. I'm basically following the AWS tutorial, which has an example of sendMessage as follows:

sqs.sendMessage(params, function(err, data) {
    if (err) {
        console.log("Error", err);
    } else {
        console.log("Success", data.MessageId);
    }
});

So the sendMessage function uses a callback to report whether the operation was successful or not. Instead of printing to the console I want to return a value, but every value I set is only visible within the callback function, even global variables like window
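The assignment is invisible because sendMessage is asynchronous: the surrounding function has already returned by the time the callback runs. A minimal sketch of the usual workaround in aws-sdk v2, turning the call into a promise so the value can be awaited (the queue URL is a placeholder):

```javascript
// Sketch: use the SDK's .promise() so the MessageId becomes a real return value (aws-sdk v2).
const AWS = require('aws-sdk');
const sqs = new AWS.SQS({ apiVersion: '2012-11-05' });

async function send(messageBody) {
  const params = {
    QueueUrl: 'https://sqs.us-east-1.amazonaws.com/123456789012/my-queue', // placeholder
    MessageBody: messageBody
  };
  const data = await sqs.sendMessage(params).promise();
  return data.MessageId; // returned to the caller, not trapped in a callback
}

send('hello').then(id => console.log('Success', id));
```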

What should be done when the provisioned throughput is exceeded?

☆樱花仙子☆ submitted on 2019-11-27 16:22:22
I'm using the AWS SDK for JavaScript (Node.js) to read data from a DynamoDB table. The auto scaling feature does a great job most of the time, and the consumed Read Capacity Units (RCU) are really low for most of the day. However, there is a scheduled job that runs around midnight and consumes about 10x the provisioned RCU, and since auto scaling takes some time to adjust the capacity, there are a lot of throttled read requests. Furthermore, I suspect my requests are not being completed (though I can't find any exceptions in my error log). In order to handle this situation, I've
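For context, the SDK already retries ProvisionedThroughputExceededException with exponential backoff, so one lever is simply to enlarge that retry budget while auto scaling catches up. A sketch with illustrative numbers (not a tuned configuration; table name and key are placeholders):

```javascript
// Sketch: give DynamoDB calls a larger retry budget with exponential backoff (aws-sdk v2).
const AWS = require('aws-sdk');

const docClient = new AWS.DynamoDB.DocumentClient({
  maxRetries: 10,                   // DynamoDB's default; raise it to ride out the spike
  retryDelayOptions: { base: 200 }  // initial backoff in ms, grows exponentially per attempt
});

docClient.get({ TableName: 'my-table', Key: { id: '42' } }, (err, data) => {
  if (err) console.log('still throttled after all retries:', err.code);
  else console.log(data.Item);
});
```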

Node.js AWS SDK S3 Generate Presigned URL

こ雲淡風輕ζ submitted on 2019-11-27 09:09:05
Question: I am using the Node.js AWS SDK to generate a presigned S3 URL. The docs give an example of generating a presigned URL. Here is my exact code (with sensitive info omitted):

const AWS = require('aws-sdk')
const s3 = new AWS.S3()
AWS.config.update({accessKeyId: 'id-omitted', secretAccessKey: 'key-omitted'})

// Tried with and without this. Since s3 is not region-specific, I don't
// think it should be necessary.
// AWS.config.update({region: 'us-west-2'})

const myBucket = 'bucket-name'
const myKey
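One detail worth flagging in the snippet above: AWS.config.update() only affects clients constructed afterwards, so the credentials never reach the already-created s3 object (and, contrary to the comment, buckets are region-specific). A minimal sketch of the usual order, with placeholder bucket, key, and region:

```javascript
// Sketch: configure first, construct the client second, then presign (aws-sdk v2).
const AWS = require('aws-sdk');
AWS.config.update({ region: 'us-west-2' }); // the bucket's region; placeholder

const s3 = new AWS.S3();

const url = s3.getSignedUrl('getObject', {
  Bucket: 'bucket-name',   // placeholder
  Key: 'path/to/object',   // placeholder
  Expires: 60 * 5          // link validity in seconds
});
console.log(url);
```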
