Migration from DynamoDB to Spanner/BigTable

Submitted by 左心房为你撑大大i on 2020-01-14 10:48:26

Question


I have a use case where I need to migrate 70 TB of data from DynamoDB to BigTable and Spanner. Tables with a single index will go to BigTable; the rest will go to Spanner.

I can easily handle the historical load by exporting the data along the path S3 --> GCS --> Spanner/BigTable. The challenging part is handling the incremental streaming writes that are happening on DynamoDB at the same time. There are 300 tables in DynamoDB.

What is the best way to handle this? Has anyone done this before?


Answer 1:


One approach would be to use AWS Lambda functions to capture the DynamoDB changes (via DynamoDB Streams), publish those changes to GCP Pub/Sub, and then have a Dataflow streaming pipeline process the incoming Pub/Sub messages, writing each record to Spanner or BigTable depending on the source table.
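A minimal sketch of what the Lambda side could look like, assuming Python, the google-cloud-pubsub client bundled in the deployment package, and GCP credentials available to the function (e.g. a service account key or workload identity federation). The environment variable names and the `table` message attribute are placeholders, not part of the documented solution:

```python
# Hypothetical AWS Lambda handler triggered by a DynamoDB Stream.
# Assumes GCP credentials are available to the function and that the
# GCP_PROJECT / PUBSUB_TOPIC environment variables are set.
import json
import os

from google.cloud import pubsub_v1  # google-cloud-pubsub must be packaged with the Lambda

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(os.environ["GCP_PROJECT"], os.environ["PUBSUB_TOPIC"])

def handler(event, context):
    """Forward each DynamoDB Stream record to Pub/Sub."""
    for record in event["Records"]:
        change = record["dynamodb"]
        message = {
            "event_name": record["eventName"],        # INSERT / MODIFY / REMOVE
            "keys": change.get("Keys", {}),
            "new_image": change.get("NewImage", {}),  # absent for REMOVE events
        }
        # Carry the source table name as a message attribute so the Dataflow
        # pipeline can later route the record to Spanner or BigTable.
        # Stream ARN format: arn:aws:dynamodb:region:acct:table/<name>/stream/<ts>
        table_name = record["eventSourceARN"].split("/")[1]
        future = publisher.publish(
            topic_path,
            data=json.dumps(message).encode("utf-8"),
            table=table_name,
        )
        future.result()  # block so the Lambda does not exit before the publish completes
```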

The basic DynamoDB->Spanner solution is documented here: https://cloud.google.com/solutions/migrating-dynamodb-to-cloud-spanner

That solution could be adapted to route different tables to different destinations.
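A hedged sketch of how that routing could look in the Beam Python SDK, assuming the Lambda above tags each message with a `table` attribute. The subscription path, the table list, and the downstream parsing/write steps are placeholders rather than a complete pipeline:

```python
# Rough sketch of the routing step in a Dataflow streaming pipeline:
# split Pub/Sub messages by the "table" attribute set by the Lambda.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical: the tables that have a single index and therefore go to BigTable.
SINGLE_INDEX_TABLES = {"events", "clicks"}

def route(msg, num_partitions):
    """Return partition 0 for BigTable-bound records, 1 for Spanner-bound records."""
    return 0 if msg.attributes["table"] in SINGLE_INDEX_TABLES else 1

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    to_bigtable, to_spanner = (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/ddb-changes",
            with_attributes=True,
        )
        | "Route" >> beam.Partition(route, 2)
    )
    # Downstream (not shown): parse msg.data, which holds the JSON produced by the
    # Lambda, then apply the appropriate write transform per branch, e.g. the
    # BigTable connector for one branch and the Spanner connector for the other.
    _ = to_bigtable | "ParseBT" >> beam.Map(lambda m: json.loads(m.data))
    _ = to_spanner | "ParseSP" >> beam.Map(lambda m: json.loads(m.data))
```

The split could just as well be done with tagged outputs from a ParDo; Partition is used here only because the routing rule (single index or not) is a simple two-way decision per message.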



Source: https://stackoverflow.com/questions/58709367/migration-from-dynamodb-to-spanner-bigtable
