Question
I have a file larger than 8 GB that I want to load into Snowflake. Going through the Snowflake documentation, I found the best practices, which say to keep file sizes between 10 MB and 100 MB for the best load performance.
https://docs.snowflake.net/manuals/user-guide/data-load-considerations-prepare.html
Is it possible to split the file in Snowflake itself? That is, can I upload the 8 GB file to Azure Blob storage, then use Snowflake to split it into multiple files and load them into a table?
Answer 1:
No, it's not possible to split a file using Snowflake before loading it.
Snowflake can only split data into multiple files when unloading a table to cloud storage.
But I guess there are possibilities within Azure:
Azure Batch Job: How to split a large file into smaller files
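As an alternative to an Azure Batch job, the split can also be done locally before uploading to Blob storage. Below is a minimal sketch (not from the original answers) that splits a large delimited file into chunks of roughly 100 MB, breaking only at line boundaries so each chunk remains a valid load file; the function name and chunk size are illustrative choices, not a Snowflake API.

```python
import os

def split_file(path, out_dir, chunk_bytes=100 * 1024 * 1024):
    """Split a large text file into parts of roughly chunk_bytes each,
    cutting only at newline boundaries so no record is torn in half.
    Returns the list of part file paths, in order."""
    os.makedirs(out_dir, exist_ok=True)
    parts = []
    out = None
    written = 0
    with open(path, "rb") as src:
        for line in src:
            # Start a new part file on the first line, or once the
            # current part has reached the target size.
            if out is None or written >= chunk_bytes:
                if out is not None:
                    out.close()
                name = os.path.join(out_dir, f"part_{len(parts) + 1:04d}.csv")
                parts.append(name)
                out = open(name, "wb")
                written = 0
            out.write(line)
            written += len(line)
    if out is not None:
        out.close()
    return parts
```

Each resulting part can then be uploaded to the Azure Blob container and loaded with a single COPY INTO, letting Snowflake parallelize across the smaller files. Note that if the source file has a header row, only the first part will contain it, so the loader needs to account for that.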
Answer 2:
I would add that although it's not currently possible in Snowflake, you are more than welcome to submit a feature request here: https://community.snowflake.com/s/ideas
Source: https://stackoverflow.com/questions/58947484/is-that-possible-to-split-a-lager-file-more-than-8-gb-using-snowflake