0 votes

I am currently trying to upload a large, uncompressed CSV file into an internal Snowflake stage. The file is 500 GB. I ran the PUT command, but it doesn't look like much is happening: there is no status update, and it just seems to hang.

Any ideas what's going on here? Will this eventually time out, or will it complete? Does anyone have an estimated time?

I am tempted to kill it somehow. In the meantime, I am splitting the 500 GB file into about 1,000 smaller files that I plan to compress and upload in parallel (after reading more on best practices).


2 Answers

1 vote

Unless you've specified AUTO_COMPRESS = FALSE, step 1 of the PUT is compressing the file, which can take a long time on 500 GB.

Using PARALLEL = <n> will automatically split a large file into smaller chunks and upload them in parallel; you don't have to split the source file yourself (though you can if you want to).
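
For example, a PUT along these lines; the local path and stage name are hypothetical, and the parameter values are just illustrative:

    -- Local path and stage name are hypothetical; adjust for your environment.
    -- AUTO_COMPRESS = TRUE is the default: the client gzips the file before upload.
    -- PARALLEL sets the number of upload threads (1-99, default 4); large files
    -- are split into chunks and uploaded concurrently.
    PUT file:///data/large_file.csv @my_internal_stage
      AUTO_COMPRESS = TRUE
      PARALLEL = 16;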

1 vote

Per Snowflake's recommendation, split the file into multiple smaller files, then stage them in a Snowflake internal stage. (By default, Snowflake compresses the files during PUT.)

Then run the COPY command with a multi-cluster warehouse, and you will see Snowflake's performance.
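
A minimal sketch of the load step, assuming a hypothetical target table and stage (the names and file format options are illustrative assumptions):

    -- Table name, stage name, and format options are hypothetical; adjust as needed.
    COPY INTO my_table
      FROM @my_internal_stage
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

With many small staged files, a single COPY statement can load them in parallel across the warehouse's resources, which is why splitting the source file helps.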