Why is BigQuery so slow on non-large data sizes?

轻奢々 2021-02-01 20:19

We have found BigQuery to work great on data sets larger than 100M rows, where the 'initialization time' doesn't really come into effect (or is negligible compared to the rest of the query). Why is it so slow on smaller data sets?

1 Answer

独厮守ぢ 2021-02-01 21:02

    The latency is time spent on metadata and job initiation; the actual query execution time is very small. We have work in progress that will address this, but some of the changes are complicated and will take a while.
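
    You can see this split in a job's own timing statistics: the gap between creation and the start of execution is the start-up/queueing overhead, while the window between start and end is the actual execution. Below is a minimal sketch, assuming the google-cloud-bigquery Python client with default credentials; the table name `my_project.my_dataset.small_table` is hypothetical.

        # Minimal sketch: split a query's wall-clock time into start-up overhead
        # vs. actual execution, using the job's timing metadata.
        # The table name below is a placeholder; substitute your own.
        from google.cloud import bigquery

        client = bigquery.Client()

        job = client.query("SELECT COUNT(*) FROM `my_project.my_dataset.small_table`")
        job.result()  # block until the job finishes

        startup = (job.started - job.created).total_seconds()  # time before execution begins
        running = (job.ended - job.started).total_seconds()    # actual execution time
        print(f"start-up/queueing: {startup:.2f}s, execution: {running:.2f}s")

    On a small table the start-up component will typically dominate the total elapsed time, which is the effect the question describes.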

    You can imagine that in its infancy, BigQuery could have central systems for managing jobs, metadata, etc. in a manner that performed very well for all N₀ entities using the service. Once you get to N₁ entities, however, it may be necessary to rearchitect some things so that they add as little latency as possible. For notification about new features (which is also where we would announce API improvements related to start-up latency), keep an eye on our release notes, which you can also subscribe to as an RSS feed.
