Question
AWS Firehose was released today. I'm playing around with it and trying to figure out how to put data into the stream using the AWS CLI. I have a simple JSON payload and a corresponding Redshift table with columns that map to the JSON attributes. I've tried various combinations, but I can't seem to pass in the JSON payload via the CLI.
What I've tried:
aws firehose put-record --delivery-stream-name test-delivery-stream --record '{ "attribute": 1 }'
aws firehose put-record --delivery-stream-name test-delivery-stream --record { "attribute": 1 }
aws firehose put-record --delivery-stream-name test-delivery-stream --record Data='{ "attribute": 1 }'
aws firehose put-record --delivery-stream-name test-delivery-stream --record Data={ "attribute": 1 }
aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json '{ "attribute": 1 }'
aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json { "attribute": 1 }
I've looked at the CLI help, which hasn't helped. This article was published today, but it looks like the command they use is already outdated, as the argument "--firehose-name" has been replaced by "--delivery-stream-name".
Answer 1:
Escape the double-quotes around keys and values inside the blob:
aws firehose put-record --delivery-stream-name test-delivery-stream --record '{"Data":"{\"attribute\":1}"}'
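One caveat worth noting: this thread predates AWS CLI v2, which by default expects blob parameters such as Data to be base64-encoded and will reject raw JSON with an "Invalid base64" error. A hedged variant for v2, using the --cli-binary-format global option to restore the v1 behavior:
aws firehose put-record --delivery-stream-name test-delivery-stream --cli-binary-format raw-in-base64-out --record '{"Data":"{\"attribute\":1}"}'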
Answer 2:
I had issues with my credentials and region, but this syntax at least got me past the parsing errors:
aws firehose put-record --cli-input-json '{"DeliveryStreamName":"testdata","Record":{"Data":"test data"}}'
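Note that Firehose does not insert any delimiter between records, so if the destination expects newline-separated JSON it is common to append \n inside the Data blob yourself. A hedged variant of the command above (stream name assumed):
aws firehose put-record --cli-input-json '{"DeliveryStreamName":"testdata","Record":{"Data":"{\"attribute\":1}\n"}}'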
Answer 3:
This should work. Escape all the quotes, and replace stream_name with your stream name.
aws firehose put-record --cli-input-json "{\"DeliveryStreamName\":\"stream_name\",\"Record\":{\"Data\":\"test data\"}}"
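If the nested escaping gets unwieldy, the CLI can also read the request JSON from a file via the file:// prefix (a standard CLI feature; the file name here is just an example):
cat > put-record.json <<'EOF'
{"DeliveryStreamName":"stream_name","Record":{"Data":"test data"}}
EOF
aws firehose put-record --cli-input-json file://put-record.json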
Answer 4:
This is what I tried, and it worked.
Below are examples of sending JSON records with a single column and with multiple columns.
Single Value in the Data:
Example: Sending a single column that is an integer.
aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute\":1}"'
Multiple column values in the data:
Example: Sending integer and string values via put-record
aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute_0\":1,\"attribute_1\":\"Sample String Value\"}"'
Example: Sending integer, string, and float values via put-record (note the float is passed as a quoted string here)
aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute_0\":1,\"attribute_1\":\"Sample String Value\",\"attribute_2\":\"14.9\"}"'
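To send several records in one call there is also put-record-batch. A minimal sketch, assuming the same stream and following the same escaping pattern as above, with the CLI's space-separated shorthand list syntax:
aws firehose put-record-batch --delivery-stream-name test-delivery-stream --records 'Data="{\"attribute\":1}"' 'Data="{\"attribute\":2}"'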
Acknowledgement of Success:
When the record is sent successfully, Kinesis acknowledges it with a RecordId similar to the one below.
{
    "RecordId": "fFKN2aJfUh6O8FsvlrfkowDZCpu0sx+37JWKJBRmN++iKTYbm/yMKE4dQHdubMR4i+0lDP/NF3c+4y1pvY9gOBkqIn6cfp+1DrB9YG4a0jXmopvhjrXrqYpwo+s8I41kRDKTL013c65vRh5kse238PC7jQ2iOWIqf21wq4dPU9R5qUbicH76soa+bZLvyhGVPudNNu2zRyZwCCV0zP/goah54d/HN9trz"
}
This indicates that the put-record command has succeeded.
Streamed records on S3:
This is how the records look in S3 after Kinesis has delivered them.
{"attribute":1}
{"attribute_0":1,"attribute_1":"Sample String Value"}
{"attribute_0":1,"attribute_1":"Sample String Value","attribute_2":"14.9"}
Note: In S3, the records end up in one or more files depending on the rate at which we issue the put-record command and on the delivery stream's buffering settings.
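To verify delivery, you can list the objects and stream one back out (the bucket name here is hypothetical; by default Firehose writes objects under a YYYY/MM/DD/HH prefix):
aws s3 ls s3://my-firehose-bucket/ --recursive
# then print any object from the listing to stdout, e.g.:
aws s3 cp s3://my-firehose-bucket/<key-from-listing> -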
Please do try it and comment if this works.
Thanks & Regards, Srivignesh KN
Answer 5:
A couple of things:
- Did you create the delivery stream?
- Reading the docs, it seems you should pass --record 'Data=blob', or --cli-input-json with the full request JSON (as in the answers above)
- Try using --generate-cli-skeleton with put-record to see an example request; a sketch of that workflow follows below
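A hedged sketch of that check-and-skeleton workflow (the file name skeleton.json is just an example):
aws firehose describe-delivery-stream --delivery-stream-name test-delivery-stream
aws firehose put-record --generate-cli-skeleton > skeleton.json
# fill in DeliveryStreamName and Record.Data in skeleton.json, then:
aws firehose put-record --cli-input-json file://skeleton.json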
Source: https://stackoverflow.com/questions/33006303/cli-to-put-data-into-aws-firehose