aws-java-sdk

Lambda: Is any batch-processing scheduler available?

Submitted by 醉酒当歌 on 2019-12-24 07:20:52
Question: Fetch 2,000 items from DynamoDB and process them batch by batch (batch size = 100), building one POST request from each batch of 100 items. Is there any way I can achieve this through configuration in AWS? PS: I've configured a cron schedule to run my Lambda function, and I'm using Java. I've built a multi-threaded application that does this synchronously, but that drastically increases my computation time.

Answer 1: I have the same problem and am thinking of solving it in the following way. Please …
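Since the answer is cut off above, here is a minimal sketch of one way to do the batching, assuming a hypothetical table name and a hypothetical postBatch(...) helper: rather than fetching all 2,000 items up front, let DynamoDB page the scan 100 items at a time and fire one POST per page.

    import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
    import com.amazonaws.services.dynamodbv2.model.AttributeValue;
    import com.amazonaws.services.dynamodbv2.model.ScanRequest;
    import com.amazonaws.services.dynamodbv2.model.ScanResult;
    import java.util.List;
    import java.util.Map;

    public class BatchedScan {
        public static void run() {
            AmazonDynamoDB ddb = AmazonDynamoDBClientBuilder.defaultClient();
            Map<String, AttributeValue> lastKey = null;
            do {
                ScanRequest request = new ScanRequest()
                        .withTableName("my-table")      // placeholder
                        .withLimit(100)                 // one batch per page
                        .withExclusiveStartKey(lastKey);
                ScanResult page = ddb.scan(request);
                postBatch(page.getItems());             // one POST per batch of <= 100 items
                lastKey = page.getLastEvaluatedKey();   // null once the scan is exhausted
            } while (lastKey != null);
        }

        private static void postBatch(List<Map<String, AttributeValue>> batch) {
            // hypothetical: build and send the POST request described in the question
        }
    }

This keeps memory flat and drops the thread pool entirely; the trade-off is that batches are processed sequentially.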

Can't access S3 Pre-Signed URL due to authorization [duplicate]

Submitted by 久未见 on 2019-12-17 21:29:54
Question: This question already has answers here: "The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256" (13 answers). Closed 3 years ago. Using Java 8 and aws-java-sdk 1.10.43, I'm trying to get a pre-signed URL to an S3 file. I do get back a link, but browsing to it leads to this error: "The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256". To emphasize, I wish to generate a URL that can be sent via email and opened in a browser, not …
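This error typically means the bucket's region only accepts Signature Version 4, while the older client signed the URL with SigV2. A minimal sketch of forcing SigV4 when pre-signing with a 1.10.x-era client, using placeholder credentials, bucket, key, and region:

    import com.amazonaws.ClientConfiguration;
    import com.amazonaws.HttpMethod;
    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.regions.Region;
    import com.amazonaws.regions.Regions;
    import com.amazonaws.services.s3.AmazonS3Client;
    import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
    import java.net.URL;
    import java.util.Date;

    public class PresignV4 {
        public static URL presign(String accessKey, String secretKey) {
            ClientConfiguration config = new ClientConfiguration();
            config.setSignerOverride("AWSS3V4SignerType"); // force Signature Version 4

            AmazonS3Client s3 =
                    new AmazonS3Client(new BasicAWSCredentials(accessKey, secretKey), config);
            s3.setRegion(Region.getRegion(Regions.EU_CENTRAL_1)); // placeholder: the bucket's region

            GeneratePresignedUrlRequest request =
                    new GeneratePresignedUrlRequest("my-bucket", "path/to/file") // placeholders
                            .withMethod(HttpMethod.GET)
                            .withExpiration(new Date(System.currentTimeMillis() + 3600_000L));
            return s3.generatePresignedUrl(request);
        }
    }

The resulting URL is self-contained, so it can be mailed around and opened in a browser until it expires.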

How can I preprocess input data before making predictions in SageMaker?

Submitted by 穿精又带淫゛_ on 2019-12-12 11:50:24
Question: I am calling a SageMaker endpoint using the Java SageMaker SDK. The data that I am sending needs a little cleaning before the model can use it for prediction. How can I do that in SageMaker? I have a pre-processing function in the Jupyter notebook instance which cleans the training data before that data is passed to train the model. Now I want to know whether I can use that function while calling the endpoint, or whether it is already being applied. I can show my code if anyone wants. EDIT 1: Basically …
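One point worth stating plainly: an endpoint does not automatically apply a function defined in your notebook, so the cleaning either has to happen on the client before the request, or be built into the serving container (for example as an inference pipeline). A minimal client-side sketch, with a hypothetical preprocess(...) standing in for the notebook function and a placeholder endpoint name:

    import com.amazonaws.services.sagemakerruntime.AmazonSageMakerRuntime;
    import com.amazonaws.services.sagemakerruntime.AmazonSageMakerRuntimeClientBuilder;
    import com.amazonaws.services.sagemakerruntime.model.InvokeEndpointRequest;
    import com.amazonaws.services.sagemakerruntime.model.InvokeEndpointResult;
    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;

    public class PredictWithPreprocessing {
        public static String predict(String rawCsvRow) {
            String cleaned = preprocess(rawCsvRow); // hypothetical: mirrors the notebook's cleaning step

            AmazonSageMakerRuntime runtime = AmazonSageMakerRuntimeClientBuilder.defaultClient();
            InvokeEndpointRequest request = new InvokeEndpointRequest()
                    .withEndpointName("my-endpoint") // placeholder
                    .withContentType("text/csv")
                    .withBody(ByteBuffer.wrap(cleaned.getBytes(StandardCharsets.UTF_8)));
            InvokeEndpointResult result = runtime.invokeEndpoint(request);
            return StandardCharsets.UTF_8.decode(result.getBody()).toString();
        }

        private static String preprocess(String row) {
            return row.trim(); // placeholder for the real cleaning logic
        }
    }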

AWS Rekognition Image getBytes returning null

Submitted by 我的未来我决定 on 2019-12-11 14:55:02
Question: I am trying to convert an AWS Rekognition Image to a Java BufferedImage. In order to do this I need the byte array from the AWS Image. However, when I call the getBytes method it returns null instead of returning a ByteBuffer. My code is as below:

    // Load a Rekognition Image object from S3
    Image inputImage = new Image()
            .withS3Object(new com.amazonaws.services.rekognition.model.S3Object()
                    .withName(key).withBucket(bucket));
    DetectFacesRequest request = new DetectFacesRequest().withImage…
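Why this happens: the Rekognition model classes are plain data holders, so getBytes returns only a buffer you set yourself via withBytes; building the Image with withS3Object tells the service where to read, but never downloads anything into the object. To get a BufferedImage you have to fetch the file from S3 separately. A minimal sketch reusing the bucket and key from the question:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import java.awt.image.BufferedImage;
    import java.io.IOException;
    import java.io.InputStream;
    import javax.imageio.ImageIO;

    public class S3ImageLoader {
        public static BufferedImage load(String bucket, String key) throws IOException {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            try (InputStream in = s3.getObject(bucket, key).getObjectContent()) {
                // Decode the same object Rekognition analyzes server-side.
                return ImageIO.read(in);
            }
        }
    }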

How to load a property file from the classpath in AWS Lambda (Java)

Submitted by 那年仲夏 on 2019-12-09 16:21:53
Question: I have written an AWS Lambda function in which I want to read database connection details from a property file that is on my classpath, but I am not able to load that file. Here is my code:

    InputStream input = DBConfiguartion.class.getResourceAsStream("appsettings");
    Reader r = new InputStreamReader(input, "UTF-8");
    Properties prop = new Properties();
    prop.load(r);

If I run this code from a normal Java console application it works, but whenever I run it as an AWS Lambda function the InputStream comes back null.

Spark crashes while reading a JSON file when linked with aws-java-sdk

Submitted by 孤者浪人 on 2019-12-06 22:17:41
Question: Let config.json be a small JSON file:

    { "toto": 1 }

I wrote a simple program that reads the JSON file with sc.textFile (because the file can be on S3, local, or HDFS, textFile is convenient):

    import org.apache.spark.{SparkContext, SparkConf}

    object testAwsSdk {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("test-aws-sdk").setMaster("local[*]")
        val sc = new SparkContext(sparkConf)
        val json = sc.textFile("config.json")
        println(json.collect().mkString("\n"))
      }
    }
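The entry is cut off before the stack trace, but linking aws-java-sdk alongside Spark is a classic recipe for two incompatible Jackson versions on the classpath (the SDK pulls in a newer jackson-databind than the one Spark was built against). As an assumption-laden diagnostic, in Java like the rest of this page, this prints which Jackson jar actually won the classpath race:

    // If the location printed is the jackson-databind pulled in by aws-java-sdk
    // rather than the one matching Spark, pinning jackson-databind (and
    // jackson-module-scala) to Spark's version in the build usually clears the crash.
    public class JacksonProbe {
        public static void main(String[] args) {
            System.out.println(com.fasterxml.jackson.databind.ObjectMapper.class
                    .getProtectionDomain().getCodeSource().getLocation());
        }
    }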

How do I use requestShutdown and shutdown for a graceful shutdown with the KCL Java library for AWS Kinesis?

Submitted by 孤街浪徒 on 2019-12-06 08:59:46
Question: I am trying to use the new feature of the KCL Java library for AWS Kinesis to do a graceful shutdown, by registering a shutdown hook that stops all the record processors and then the worker gracefully. The new library provides a new interface that record processors need to implement, but how does it get invoked? I tried invoking worker.requestShutdown() first and then worker.shutdown(), and it works, but is that the intended way to use it? What is the point of using both, and what is the benefit?
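A minimal sketch of the registration half, assuming KCL 1.7.1+, where Worker.requestShutdown() returns a Future and record processors can implement the IShutdownNotificationAware callback. That is how the new interface "gets invoked": the worker calls shutdownRequested(checkpointer) on each processor before taking its lease away.

    import java.util.concurrent.Future;
    import java.util.concurrent.TimeUnit;
    import com.amazonaws.services.kinesis.clientlibrary.lib.worker.Worker;

    public class GracefulShutdownHook {
        public static void register(final Worker worker) {
            Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
                @Override
                public void run() {
                    try {
                        // Ask every record processor to checkpoint and release its
                        // lease, then wait (bounded) for the worker to drain.
                        Future<Void> done = worker.requestShutdown();
                        done.get(30, TimeUnit.SECONDS); // 30s is an arbitrary choice
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }));
        }
    }

On this reading, requestShutdown() is the graceful request and shutdown() the hard stop; calling shutdown() after the future completes simply stops an already-drained worker, which is presumably why using both "works".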

Upload ZipOutputStream to S3 without temporarily saving the (large) zip file to disk, using the AWS S3 Java SDK

Submitted by 给你一囗甜甜゛ on 2019-12-04 20:09:37
Question: I have a requirement to download photos (not in the same directory) from S3, ZIP them, and upload the archive back to S3 using the AWS S3 Java SDK. The zip file can run into GBs. Currently I am using AWS Lambda, which limits temporary storage to 500 MB, so I don't want to save the ZIP file to disk; instead I want to stream the ZIP file (created dynamically from the photos downloaded from S3) directly to S3. I need to do this with the AWS S3 Java SDK.

Answer 1: The basic idea is to use streaming operations …

Upload ZipOutputStream to S3 without temporarily saving the (large) zip file to disk, using the AWS S3 Java SDK

Submitted by 馋奶兔 on 2019-12-04 04:24:40
I have a requirement to download photos (not in the same directory) from S3, ZIP them, and upload the archive back to S3 using the AWS S3 Java SDK. The zip file can run into GBs. Currently I am using AWS Lambda, which limits temporary storage to 500 MB, so I don't want to save the ZIP file to disk; instead I want to stream the ZIP file (created dynamically from the photos downloaded from S3) directly to S3. I need to do this with the AWS S3 Java SDK.

The basic idea is to use streaming operations. This way you won't wait until the ZIP is generated on a filesystem, but will start uploading as soon as the ZIP …
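Continuing that idea as a concrete sketch: pipe a ZipOutputStream into an upload so bytes flow to S3 while the archive is still being written. The bucket names, keys, and photoKeys list below are placeholders, and TransferManager may buffer multipart chunks in memory when no content length is known, so size the Lambda's memory accordingly (or drop down to the low-level multipart API for very large archives).

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.ObjectMetadata;
    import com.amazonaws.services.s3.transfer.TransferManager;
    import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
    import java.io.InputStream;
    import java.io.PipedInputStream;
    import java.io.PipedOutputStream;
    import java.util.List;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    public class StreamingZipToS3 {
        public static void zipAndUpload(final List<String> photoKeys) throws Exception {
            final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            TransferManager tm = TransferManagerBuilder.standard().withS3Client(s3).build();

            PipedInputStream zipReadSide = new PipedInputStream();
            final PipedOutputStream zipWriteSide = new PipedOutputStream(zipReadSide);

            // Writer thread: download each photo and feed it into the ZIP stream.
            Thread zipper = new Thread(() -> {
                try (ZipOutputStream zos = new ZipOutputStream(zipWriteSide)) {
                    byte[] buf = new byte[8192];
                    for (String key : photoKeys) {
                        zos.putNextEntry(new ZipEntry(key));
                        try (InputStream in = s3.getObject("source-bucket", key).getObjectContent()) {
                            int n;
                            while ((n = in.read(buf)) != -1) {
                                zos.write(buf, 0, n);
                            }
                        }
                        zos.closeEntry();
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            zipper.start();

            // Reader side: TransferManager consumes the pipe and uploads it in parts,
            // so the upload starts as soon as the first ZIP bytes are written.
            ObjectMetadata meta = new ObjectMetadata();
            tm.upload("target-bucket", "photos.zip", zipReadSide, meta).waitForCompletion();
            zipper.join();
            tm.shutdownNow(false);
        }
    }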

How to load a property file from the classpath in AWS Lambda (Java)

Submitted by 妖精的绣舞 on 2019-12-04 03:33:07
I have written an AWS Lambda function in which I want to read database connection details from a property file that is on my classpath, but I am not able to load that file. Here is my code:

    InputStream input = DBConfiguartion.class.getResourceAsStream("appsettings");
    Reader r = new InputStreamReader(input, "UTF-8");
    Properties prop = new Properties();
    prop.load(r);

If I run this code from a normal Java console application it works, but whenever I run it as an AWS Lambda function the InputStream comes back null.

You are only one character off. Here's a working example that I have to do …
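The one character is most likely a leading slash: Class.getResourceAsStream resolves a name without "/" relative to the class's own package, while a file at the classpath root must be addressed as "/appsettings". A sketch of the corrected lookup (DBConfiguartion and the resource name are from the question):

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.io.Reader;
    import java.util.Properties;

    public class DBConfiguartionLoader {
        public static Properties load() throws IOException {
            // Leading '/' resolves from the classpath root instead of the
            // package containing DBConfiguartion.
            InputStream input = DBConfiguartion.class.getResourceAsStream("/appsettings");
            Reader r = new InputStreamReader(input, "UTF-8");
            Properties prop = new Properties();
            prop.load(r);
            return prop;
        }
    }

The other common spelling is Thread.currentThread().getContextClassLoader().getResourceAsStream("appsettings"), which resolves from the classpath root without the slash.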