My company is in the process of migrating from a local database to a data warehouse, as the load is becoming too much for our SQL Server instance. We looked at the cloud solutions available and decided on Snowflake. We need to process fairly large compressed JSON files, sometimes up to 300 MB each. I've read through the documentation and created the stage, the file format, and a table defined as (json_data VARIANT). I then loaded my first JSON file using the SnowSQL CLI, and that worked too; the test file is 3.7 KB. But when trying to COPY INTO mytable, I got this error:
Error parsing JSON: document is too large, max size 16777216 bytes
How can I avoid this error without having to split the files before they are uploaded to the stage? The data is sent by an app, so once we go live we'd need to load new files every hour.
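For reference, the setup I'm using looks roughly like this (the table, stage, and file format names are placeholders, and I'm assuming gzip compression here):

```sql
-- File format for compressed JSON (COMPRESSION is an assumption; AUTO also works)
CREATE OR REPLACE FILE FORMAT my_json_format
  TYPE = 'JSON'
  COMPRESSION = 'GZIP';

-- Internal stage using that file format
CREATE OR REPLACE STAGE my_stage
  FILE_FORMAT = my_json_format;

-- Single VARIANT column, as described above
CREATE OR REPLACE TABLE mytable (json_data VARIANT);

-- Uploaded via SnowSQL, e.g.:
--   PUT file:///path/to/data.json.gz @my_stage;

-- This is the statement that fails on the large files
COPY INTO mytable
  FROM @my_stage
  FILE_FORMAT = (FORMAT_NAME = my_json_format);
```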