Is it possible to split a large file (more than 8 GB) using Snowflake?

心已入冬 submitted on 2020-01-06 05:39:30

Question


I have a file larger than 8 GB that I want to load into Snowflake. While going through the Snowflake documentation, I found the best practices, which recommend keeping file sizes between 10 MB and 100 MB for the best load performance.

https://docs.snowflake.net/manuals/user-guide/data-load-considerations-prepare.html

Is it possible to split the file in Snowflake itself? I would upload the 8 GB file to Azure Blob Storage, then use Snowflake to split the file into multiple smaller files and load them into a table.


Answer 1:


No, it's not possible to split a file using Snowflake before loading it.
Snowflake can only split data into multiple files when unloading a table to cloud storage.

But I guess there are possibilities within Azure:
Azure Batch Job
How to split large file into smaller files
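Since Snowflake cannot do the split itself, a common workaround is to split the file before (or instead of) uploading it, e.g. on a local machine or inside an Azure Batch job. Below is a minimal Python sketch of that idea; `split_csv` and the chunk size are hypothetical choices, not part of any Snowflake or Azure tooling. It repeats the header in every chunk so each part can be loaded independently (with `SKIP_HEADER = 1` in the Snowflake file format):

```python
import os

def split_csv(path, out_dir, max_bytes=100 * 1024 * 1024):
    """Split a large CSV into chunks of roughly max_bytes each.

    The header row is repeated in every chunk so each output file
    is a self-contained CSV that COPY INTO can load on its own.
    Returns the list of chunk file paths.
    """
    os.makedirs(out_dir, exist_ok=True)
    parts = []
    with open(path, "r", encoding="utf-8") as src:
        header = src.readline()
        part_no, out, written = 0, None, 0
        for line in src:
            # Start a new chunk when the current one exceeds the limit.
            if out is None or written >= max_bytes:
                if out is not None:
                    out.close()
                part_no += 1
                part_path = os.path.join(out_dir, f"part_{part_no:04d}.csv")
                out = open(part_path, "w", encoding="utf-8")
                out.write(header)
                written = 0
                parts.append(part_path)
            out.write(line)
            written += len(line.encode("utf-8"))
        if out is not None:
            out.close()
    return parts
```

With the default 100 MB limit, an 8 GB file would yield roughly 80 chunks, which sits comfortably in the size range the documentation recommends.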




Answer 2:


I would add that although this isn't currently possible in Snowflake, you are more than welcome to submit a feature request here: https://community.snowflake.com/s/ideas



Source: https://stackoverflow.com/questions/58947484/is-that-possible-to-split-a-lager-file-more-than-8-gb-using-snowflake
