amazon-kinesis

Copy DynamoDB table data cross-account in real time

ⅰ亾dé卋堺 submitted on 2020-01-25 01:57:16
Question: What is the easiest approach (easiest implying low service-maintenance overhead; serverless preferred if possible) to copy data from a DDB table in one account to another, preferably in a serverless manner (so no scheduled jobs using Data Pipeline)? I was exploring the possibility of using DynamoDB Streams; however, this old answer mentions that it is not possible, and I could not find current documentation confirming or disproving it. Is that still the case? Another option I…
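One serverless pattern worth weighing here: a Lambda in the source account, triggered by the table's DynamoDB stream, that assumes a role in the destination account and replays each change. The sketch below illustrates that shape only; the table name and role ARN are assumptions, not values from the question.

```python
# Hypothetical names for illustration; substitute your own.
DEST_TABLE = "replica-table"
DEST_ROLE_ARN = "arn:aws:iam::222222222222:role/ddb-replica-writer"

def extract_action(record):
    """Map a DynamoDB Streams record to the write it implies on the replica.

    INSERT/MODIFY carry a NewImage to put; REMOVE carries only the Keys to delete.
    """
    ddb = record["dynamodb"]
    if record["eventName"] in ("INSERT", "MODIFY"):
        return ("put", ddb["NewImage"])
    return ("delete", ddb["Keys"])

def dest_client():
    import boto3  # imported lazily so the module loads without the SDK installed
    # Assume a role in the destination account; the Lambda's execution role in
    # the source account must be allowed to sts:AssumeRole on it.
    creds = boto3.client("sts").assume_role(
        RoleArn=DEST_ROLE_ARN, RoleSessionName="ddb-replication"
    )["Credentials"]
    return boto3.client(
        "dynamodb",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

def handler(event, context):
    """Lambda entry point, triggered by the source table's stream."""
    client = dest_client()
    for record in event["Records"]:
        action, payload = extract_action(record)
        if action == "put":
            client.put_item(TableName=DEST_TABLE, Item=payload)
        else:
            client.delete_item(TableName=DEST_TABLE, Key=payload)
```

The stream trigger itself must be wired up in the source account; only the writes cross the account boundary, via the assumed role.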

How to decide total number of partition keys in AWS kinesis stream?

安稳与你 submitted on 2020-01-19 12:35:06
Question: In a producer–consumer web application, what should the thought process be for creating a partition key for a Kinesis stream shard? Suppose I have a Kinesis stream with 16 shards; how many partition keys should I create? Does it really depend on the number of shards? Answer 1: The partition (or hash) key space ranges from 0 up to 340282366920938463463374607431768211455 (2^128 − 1), roughly 34028 × 10^34; I will omit the 10^34 factor for ease. If you have 30 shards, uniformly divided, each should cover about 1134 × 10^34 hash keys.
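Kinesis MD5-hashes each partition key into that 128-bit space, and each shard owns a contiguous slice of it. A small sketch of that mapping, under the assumption that the shards split the hash range evenly (true for a freshly created stream with no resharding):

```python
import hashlib

TOTAL_HASH_SPACE = 2 ** 128  # partition keys hash into [0, 2^128 - 1]

def shard_for_key(partition_key: str, num_shards: int) -> int:
    """Return which of `num_shards` evenly-split shards covers this key's MD5 hash.

    Kinesis itself MD5-hashes the partition key; the even split here is an
    assumption that holds only when no shards have been split or merged.
    """
    h = int.from_bytes(hashlib.md5(partition_key.encode("utf-8")).digest(), "big")
    return h * num_shards // TOTAL_HASH_SPACE
```

The practical answer follows from this: you do not need a fixed count of partition keys tied to the shard count; you need many more distinct keys than shards (e.g. a device or user id) so the hash spreads records evenly across all shards.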

How to change video source in Amazon kinesis_video_gstreamer_sample_app.cpp?

和自甴很熟 submitted on 2020-01-15 09:43:13
Question: I am running kinesis_video_gstreamer_sample_app.cpp on macOS and it streams to AWS Kinesis from the FaceTime (iSight) camera. How can I switch the video source to a USB webcam? Thanks :) Answer 1: OK, I finally figured it out; modify kinesis_video_gstreamer_sample_app.cpp as follows. Change: if (data.encoder) { data.source = gst_element_factory_make("autovideosrc", "source"); to: if (data.encoder) { data.source = gst_element_factory_make("avfvideosrc", "source"); g_object_set(G_OBJECT(data…

AWS Lambda seems to exit before completion

ぃ、小莉子 submitted on 2020-01-15 07:41:14
Question: I have a very simple Lambda function (Node.js) which puts the received event into a Kinesis stream. Here is the source code: 'use strict'; const AWS = require('aws-sdk'); const kinesis = new AWS.Kinesis({apiVersion: '2013-12-02'}); exports.handler = async (event, context, callback) => { let body = JSON.parse(event.body); let receptionDate = new Date().toISOString(); let partitionKey = "pKey-" + Math.floor(Math.random() * 10); // Response format needed for API Gateway const formatResponse = (status,…
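With an async Node.js handler, the classic cause of this symptom is returning before awaiting kinesis.putRecord(...).promise(), so the invocation is frozen mid-call. For contrast, here is a hedged Python equivalent of the same handler; boto3's calls block until acknowledged, so the early-exit pitfall cannot occur. The stream name is an assumption.

```python
import json
import random

STREAM_NAME = "my-stream"  # assumption: substitute your stream name

def format_response(status: int, body: dict) -> dict:
    """Shape that API Gateway's proxy integration expects back from Lambda."""
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

def handler(event, context):
    import boto3  # lazy import so the module parses without the SDK
    kinesis = boto3.client("kinesis")
    body = json.loads(event["body"])
    # put_record blocks until Kinesis acknowledges, so the handler cannot
    # return early the way an un-awaited Node.js promise allows.
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(body).encode("utf-8"),
        PartitionKey="pKey-" + str(random.randrange(10)),
    )
    return format_response(200, {"ok": True})
```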

Map requests into an AWS service without Lambdas, using AWS service proxy integration on API Gateway

倖福魔咒の submitted on 2020-01-14 06:19:59
Question: So I've got a scenario where I want to expose an endpoint and map the incoming requests directly into a Kinesis stream. I was able to do that manually in the AWS console, but is there a way to change the integration to an AWS service using Serverless or a Serverless plugin? I tried to find a way to deploy an endpoint that communicates directly with an AWS service, without Lambdas, and could not find one. Answer 1: It's been a while, but recently I noticed that there's a plugin now that helps set up this…

Write to a specific folder in S3 bucket using AWS Kinesis Firehose

安稳与你 submitted on 2020-01-13 09:05:52
Question: I would like to be able to route data sent to Kinesis Firehose based on the content inside the data. For example, if I sent this JSON data: { "name": "John", "id": 345 }, I would like to filter the data based on id and send it to a subfolder of my S3 bucket, like s3://myS3Bucket/345_2018_03_05. Is this at all possible with Kinesis Firehose or AWS Lambda? The only way I can think of right now is to resort to creating a Kinesis stream for every single one of my possible IDs and pointing them to the…
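At the time of this question, Firehose could not choose an S3 prefix per record, so one common workaround is to skip Firehose for this path: attach a Lambda consumer to the Kinesis stream and have it write each record to S3 under an id-based key itself. A minimal sketch; the bucket name comes from the question, while the helper and key layout are assumptions.

```python
import json
from datetime import date

BUCKET = "myS3Bucket"  # from the question; adjust to your bucket

def key_prefix(item: dict, day: date) -> str:
    """Build the '<id>_<YYYY>_<MM>_<DD>' subfolder named in the question."""
    return "%s_%s" % (item["id"], day.strftime("%Y_%m_%d"))

def handler(event, context):
    import base64, uuid, boto3  # lazy imports: only needed inside Lambda
    s3 = boto3.client("s3")
    for record in event["Records"]:
        # Kinesis delivers record payloads base64-encoded in the Lambda event.
        item = json.loads(base64.b64decode(record["kinesis"]["data"]))
        key = "%s/%s.json" % (key_prefix(item, date.today()), uuid.uuid4())
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(item).encode("utf-8"))
```

This trades Firehose's buffering and retry behavior for full control over the object key, so it suits low-to-moderate volumes best.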

How to put data from server to Kinesis Stream

落花浮王杯 submitted on 2020-01-12 23:51:07
Question: I am new to Kinesis. Reading the documentation, I found that I can create a Kinesis stream to receive data from a producer, and then use the KCL to read that data from the stream for further processing. I understand how to write the KCL application by implementing IRecordProcessor. However, the very first stage, how to put data onto the Kinesis stream, is still not clear to me. Is there an AWS API that I need to implement to achieve this? Scenario: I have a server which is continuously getting data…
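The producer side needs no KCL-style framework: the plain AWS SDK's PutRecord/PutRecords calls are the usual answer. A hedged sketch of a server-side sender, assuming boto3 and batching events before each call; the stream name and the partition-key field are placeholders.

```python
import json

def batches(records, size=500):
    """PutRecords accepts at most 500 records per request, so chunk first."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def send(events, stream_name="my-stream"):  # stream name is an assumption
    import boto3  # lazy import so the batching helper is testable without the SDK
    client = boto3.client("kinesis")
    for batch in batches(events):
        client.put_records(
            StreamName=stream_name,
            Records=[
                {
                    "Data": json.dumps(e).encode("utf-8"),
                    "PartitionKey": str(e.get("source", "default")),
                }
                for e in batch
            ],
        )
```

For sustained high throughput, the Kinesis Producer Library (KPL) adds aggregation and retries on top of the same API; for a simple server feed, PutRecords as above is usually enough.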

Amazon AWS Kinesis Video Boto GetMedia/PutMedia

亡梦爱人 submitted on 2020-01-06 05:38:25
Question: Does anybody know of a complete sample of how to send video to a Kinesis video stream using the boto3 SDK? This question initially asked about both GetMedia and PutMedia. Now I have this sample code for the GetMedia part: client = boto3.client('kinesisvideo') response = client.get_data_endpoint( StreamName='my-test-stream', APIName='GET_MEDIA' ) print(response) endpoint = response.get('DataEndpoint', None) print("endpoint %s" % endpoint) if endpoint is not None: client2 = boto3…
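The truncated snippet stops right where the data-plane client comes in: GetMedia is served from a separate 'kinesis-video-media' client bound to the endpoint that get_data_endpoint returned. A hedged completion of that GetMedia path (the stream name follows the question; the read size is an arbitrary choice):

```python
VALID_START_SELECTORS = {
    "NOW", "EARLIEST", "FRAGMENT_NUMBER",
    "PRODUCER_TIMESTAMP", "SERVER_TIMESTAMP", "CONTINUATION_TOKEN",
}

def start_selector(kind="NOW", **extra):
    """Build the StartSelector dict get_media expects, validating the type."""
    if kind not in VALID_START_SELECTORS:
        raise ValueError("unknown StartSelectorType: %s" % kind)
    return {"StartSelectorType": kind, **extra}

def read_media(stream_name="my-test-stream", max_bytes=1 << 20):
    """Resolve the GET_MEDIA endpoint, then pull MKV-wrapped fragments."""
    import boto3  # lazy import so start_selector stays testable offline
    control = boto3.client("kinesisvideo")
    endpoint = control.get_data_endpoint(
        StreamName=stream_name, APIName="GET_MEDIA"
    )["DataEndpoint"]
    # GetMedia lives on a separate data-plane client bound to that endpoint.
    media = boto3.client("kinesis-video-media", endpoint_url=endpoint)
    resp = media.get_media(
        StreamName=stream_name, StartSelector=start_selector("NOW")
    )
    return resp["Payload"].read(max_bytes)
```

The PutMedia half of the question is a different story: it is a long-lived streaming upload, typically handled by the Kinesis Video Streams producer SDKs rather than a simple boto3 call.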

Analyze a tumbling window with a lag in AWS Kinesis Analytics SQL

好久不见. submitted on 2020-01-05 07:55:08
Question: I've got a use case that seems like it should be supported by Kinesis Analytics SQL, but I can't figure it out. Here is my scenario: I have an input stream of data where each event has an event_time field and a device_id field, and I want to aggregate the data by event_time and device_id. Here event_time is provided as a field in the source data; it is not the ROWTIME at which the row was added to the Kinesis Analytics application, nor the approximate arrival time. The processes that send data to…