Question:
Given that BigQuery is not meant as a platform to denormalize data, can I denormalize the data in Google Cloud SQL prior to importing it into BigQuery?
I have the following tables: Table1 (500M rows), Table2 (2M rows), and Table3 (800K rows).
I can't denormalize in our existing relational database for various reasons, so I'd like to do a SQL dump of the database, load it into Google Cloud SQL, and then use SQL join scripts to create one large flat table to be imported into BigQuery. A sketch of such a join script follows.
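For illustration only, here is a minimal sketch of the kind of flattening script described above, run against the Cloud SQL (MySQL) instance via pymysql. The connection details, table aliases, join keys, and column names are hypothetical placeholders, not from the original question:

```python
import pymysql

# Hypothetical connection details for the Cloud SQL (MySQL) instance.
conn = pymysql.connect(
    host="CLOUD_SQL_INSTANCE_IP",
    user="dbuser",
    password="dbpassword",
    database="mydb",
)

# Hypothetical join keys and columns; the real names would come from the
# existing schema (Table1 ~500M rows, Table2 ~2M rows, Table3 ~800K rows).
flatten_sql = """
CREATE TABLE flat_table AS
SELECT t1.*, t2.col_a, t3.col_b
FROM Table1 AS t1
JOIN Table2 AS t2 ON t1.table2_id = t2.id
JOIN Table3 AS t3 ON t1.table3_id = t3.id
"""

with conn.cursor() as cur:
    cur.execute(flatten_sql)  # builds the denormalized (flat) table in Cloud SQL
conn.commit()
conn.close()
```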
Thanks.
Answer 1:
That should work. You should be able to dump the generated flat table to CSV and import it into BigQuery. Currently, however, there is no direct Cloud SQL-to-BigQuery loading mechanism.
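As a minimal sketch of the load step, assuming the flat table has already been exported to CSV and uploaded to a Cloud Storage bucket (bucket, file, project, and dataset names below are hypothetical), one way to run the import is with the google-cloud-bigquery Python client:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # let BigQuery infer the schema from the CSV
)

# Hypothetical GCS object and destination table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/flat_table.csv",
    "my_dataset.flat_table",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
```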
Source: https://stackoverflow.com/questions/15302811/can-i-denormalize-data-in-google-cloud-sql-in-prep-for-bigquery