0
votes

I am curious to understand the process Snowflake uses to store data in micro-partitions. As far as I know, each Snowflake micro-partition is 50-500 MB in size (of uncompressed data).

Suppose I have a 1 GB file and I want to load it into Snowflake. Can someone explain the internal process/steps Snowflake performs to store the data in micro-partitions?


2 Answers

4
votes

Snowflake's micro-partition file format is proprietary, so you're not going to get much more information than is already in the documentation (short of someone breaching their employment contract with Snowflake).

0
votes

To optimize the number of parallel operations for a load, Snowflake recommends files roughly 10 MB to 100 MB in size, compressed. Splitting large files into a greater number of smaller files distributes the load among the servers in an active warehouse and increases performance.

https://docs.snowflake.net/manuals/user-guide/data-load-considerations-prepare.html#general-file-sizing-recommendations
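To make the sizing recommendation concrete, here is a minimal sketch of pre-splitting a large file into roughly 100 MB compressed chunks before staging it (e.g. with `PUT` and `COPY INTO`). The function name, file naming pattern, and chunk target are illustrative assumptions, not anything Snowflake mandates; the point is simply that many smaller compressed files let the warehouse load them in parallel.

```python
# Sketch: split a large delimited file into ~100 MB gzip chunks for staging.
# Names and the 100 MB target are illustrative, per the sizing guideline above.
import gzip


def split_for_load(src_path, out_prefix, chunk_bytes=100 * 1024 * 1024):
    """Split src_path into gzip-compressed chunks on line boundaries.

    Returns the list of chunk file paths, e.g. out_prefix_0000.csv.gz, ...
    """
    paths = []
    part, written, out = 0, 0, None
    with open(src_path, "rb") as src:
        for line in src:
            # Start a new chunk when the current one reaches the target size.
            if out is None or written >= chunk_bytes:
                if out is not None:
                    out.close()
                path = f"{out_prefix}_{part:04d}.csv.gz"
                out = gzip.open(path, "wb")
                paths.append(path)
                part += 1
                written = 0
            out.write(line)
            written += len(line)
    if out is not None:
        out.close()
    return paths
```

After splitting, each chunk would be uploaded to a stage and loaded with a single `COPY INTO`, which can then fan the files out across the warehouse's servers.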