Question
Is it possible to export data from a DynamoDB table in some format?
The concrete use case is that I want to export data from my production DynamoDB database and import that data into my local DynamoDB instance, so my application can work with a local copy of the data instead of the production data.
I am running a local instance of DynamoDB.
Answer 1:
There is a tool named DynamoDBtoCSV that can be used to export all the data to a CSV file. For the other way around, however, you will have to build your own tool. My suggestion is to add this functionality to the tool and contribute it to the Git repository.
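For that missing import direction, here is a minimal bash sketch, assuming the CSV has a header row, simple comma-separated values (no quoted commas) and only string attributes; the file name, table name and endpoint are placeholders:
IFS=, read -r -a columns <<< "$(head -n 1 export.csv)"
tail -n +2 export.csv | while IFS=, read -r -a values; do
  # Build a DynamoDB JSON item like {"col": {"S": "value"}, ...}
  item="{"
  for i in "${!columns[@]}"; do
    item+="\"${columns[$i]}\": {\"S\": \"${values[$i]}\"},"
  done
  item="${item%,}}"
  aws dynamodb put-item --table-name my-local-table --item "$item" \
    --endpoint-url http://localhost:8000
done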
Another way is to use AWS Data Pipeline for this task (you will save the costs of reading the data from outside the AWS infrastructure). The approach is similar:
- Build the pipeline for output
- Download the file.
- Parse it with a custom reader.
Answer 2:
Export it from the DynamoDB interface to S3.
Then convert it to JSON using sed:
sed -e 's/$/}/' -e $'s/\x02/,"/g' -e $'s/\x03/":/g' -e 's/^/{"/' <exported_table> > <exported_table>.json
Source
Answer 3:
This will export all items as JSON documents:
aws dynamodb scan --table-name TABLE_NAME > export.json
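If you only want the items themselves, without the Count/ScannedCount metadata that scan returns, a small variation (assuming jq is installed):
aws dynamodb scan --table-name TABLE_NAME | jq '.Items' > export.json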
Answer 4:
Here is a way to export some data (oftentimes we just want to get a sample of our prod data locally) from a table using the aws cli and jq.
Let's assume we have a prod table called, unsurprisingly, my-prod-table
and a local table called my-local-table.
To export the data run the following:
aws dynamodb scan --table-name my-prod-table \
| jq '{"my-local-table": [.Items[] | {PutRequest: {Item: .}}]}' > data.json
Basically what happens is that we scan our prod table, transform the output of the scan into the request format that batch-write-item expects, and dump the result into a file.
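The resulting data.json then looks roughly like this (the attribute names and values here are made up for illustration):
{
  "my-local-table": [
    { "PutRequest": { "Item": { "id": { "S": "1" }, "name": { "S": "foo" } } } },
    { "PutRequest": { "Item": { "id": { "S": "2" }, "name": { "S": "bar" } } } }
  ]
}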
To import the data in your local table run:
aws dynamodb batch-write-item \
--request-items file://data.json \
--endpoint-url http://localhost:8000
Note: there are some restrictions with the batch-write-item request - the BatchWriteItem operation can contain up to 25 individual PutItem and DeleteItem requests and can write up to 16 MB of data (the maximum size of an individual item is 400 KB).
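So if the scan returns more than 25 items, the file cannot be written in a single call. Here is a minimal bash/jq sketch for splitting the import into batches of 25, assuming the raw scan output is saved to scan.json and using the same placeholder table names and endpoint as above:
aws dynamodb scan --table-name my-prod-table > scan.json
# Emit one item per line and split into files of 25 lines each (batch_aa, batch_ab, ...)
jq -c '.Items[]' scan.json | split -l 25 - batch_
for f in batch_*; do
  jq -s '{"my-local-table": [.[] | {PutRequest: {Item: .}}]}' "$f" > request.json
  aws dynamodb batch-write-item \
      --request-items file://request.json \
      --endpoint-url http://localhost:8000
done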
Answer 5:
Try my simple Node.js script dynamo-archive. It exports and imports in JSON format.
Answer 6:
I found that the best current tool for simple imports/exports (including round-tripping through DynamoDB Local) is this Python script:
https://github.com/bchew/dynamodump
This script supports schema export/import as well as data import/export. It also uses the batch APIs for efficient operations.
I have used it successfully to take data from a DynamoDB table to DynamoDB Local for development purposes, and it worked pretty well for my needs.
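For reference, an invocation might look roughly like this; the flags below are assumptions based on the project's README and may have changed, so check python dynamodump.py --help:
python dynamodump.py -m backup -r eu-west-1 -s my-prod-table
python dynamodump.py -m restore -r local -s my-prod-table --host localhost --port 8000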
Answer 7:
For those of you who would rather do this using Java, there is DynamodbToCSV4j.
JSONObject config = new JSONObject();
config.put("accessKeyId","REPLACE");
config.put("secretAccessKey","REPLACE");
config.put("region","eu-west-1");
config.put("tableName","testtable");
d2csv d = new d2csv(config);
Answer 8:
I have created a utility class to help developers with exports. It can be used if you don't want to use the Data Pipeline feature of AWS. The link to the GitHub repo is here.
Answer 9:
DynamoDB now provides a way to export and import data to/from S3: http://aws.amazon.com/about-aws/whats-new/2014/03/06/announcing-dynamodb-cross-region-export-import/
Answer 10:
If you need to, you can convert DynamoDB data into JSON with this: https://2json.net/dynamo
Answer 11:
In a similar use case, I have used DynamoDB Streams to trigger AWS Lambda, which basically wrote to my DW instance. You could probably write your Lambda function to write each of the table changes to a table in your non-production account. This way your dev table would remain quite close to prod as well.
Answer 12:
In the DynamoDB web console, select your table, then Actions -> Export/Import.
Source: https://stackoverflow.com/questions/18896329/export-data-from-dynamodb