0 votes

I am trying to load an external JSON file from Azure Blob Storage into Snowflake. I created the table LOCATION_DETAILS with all columns as VARIANT. When I try to load into the table, I get the error below:

[Screenshot: COPY into table from external JSON file, error details]

Can anyone help me with this?
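Roughly, the table looks like this (column names simplified for the example):

create or replace table LOCATION_DETAILS (
  location variant,  -- column names here are illustrative only
  details  variant
);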


2 Answers

1 vote

You need to create a file format that specifies the type of file and any other options, like below:

create or replace file format myjsonformat
  type = 'JSON'
  strip_outer_array = true;

Then load the file using that file format and it will work.
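For example, assuming a stage pointing at your blob container and a target table with a single VARIANT column (LOCATION_DETAILS from the question), the COPY would look roughly like this; the stage and file names are placeholders:

copy into LOCATION_DETAILS
  from @my_azure_stage/location_details.json  -- stage and file names are placeholders
  file_format = (format_name = 'myjsonformat');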

0 votes

When I use external data with Snowflake, I like to create stages linked to the blob storage (in this case). It is easy, and you can work with the data just as transparently as if it were local.

Create the stage linked to the blob storage like this:

CREATE OR REPLACE STAGE "<DATABASE>"."<SCHEMA>"."<STAGE_NAME>"
  URL='azure://demostorage178.blob.core.windows.net/democontainer'
  CREDENTIALS=(AZURE_SAS_TOKEN='***********************************************')
  FILE_FORMAT = (TYPE = JSON);

After that, you can list what is in the blob storage from Snowflake like this:

list @"<DATABASE>"."<SCHEMA>"."<STAGE_NAME>";

Or like this:

use database "<DATABASE>";
use schema "<SCHEMA>";
SELECT * FROM @"<STAGE_NAME>"/sales.json;
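You can also pull individual JSON attributes straight out of the staged file before loading it, for example (the field names here are only assumptions about what sales.json contains):

SELECT $1:customer::string,  -- field names are assumptions about the JSON structure
       $1:amount::number
FROM @"<STAGE_NAME>"/sales.json;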

If you need to create the table, use this:

create or replace table "<DATABASE>"."<SCHEMA>"."<TABLE>" (src VARIANT);

And you can COPY your data like this (for a single file):

copy into "<DATABASE>"."<SCHEMA>"."<TABLE>" from @"<STAGE_NAME>"/sales.json;

Finally, use this to load any new data that arrives in your stage. Note: you don't need to delete previously loaded files; Snowflake will ignore them and load only the new ones.

copy into "<DATABASE>"."<SCHEMA>"."<TABLE>" from @"<STAGE_NAME>";
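Once the data is loaded, you can query attributes out of the VARIANT column directly, for example (again, the field names are just assumptions about the JSON structure):

select src:customer::string as customer,  -- field names are assumptions
       src:amount::number   as amount
from "<DATABASE>"."<SCHEMA>"."<TABLE>";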