I'm attempting to create an automated Snowpipe that loads from S3.

I created an external stage, pointing to my s3 bucket, and configured it with Option 2 (AWS IAM role assumption) from https://docs.snowflake.net/manuals/user-guide/data-load-s3-config.html#configuring-secure-access-to-amazon-s3.
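
For context, the stage DDL was shaped roughly like this (the bucket path and role ARN below are placeholders, not my actual values):

    -- Stage using Option 2: Snowflake assumes an AWS IAM role (placeholder values)
    CREATE OR REPLACE STAGE my_s3_stage
      URL = 's3://my-bucket/data/'
      CREDENTIALS = (AWS_ROLE = 'arn:aws:iam::123456789012:role/my_snowflake_role');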

A simple "Copy Into..." statement successfully loaded from this stage into the intended table.
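
The working load was essentially the following (table, stage, and file format are placeholders):

    -- Manual bulk load that succeeded (placeholder names)
    COPY INTO my_table
      FROM @my_s3_stage
      FILE_FORMAT = (TYPE = 'CSV');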

I then attempted to create a Snowpipe to automate the COPY INTO using the above external stage, but received the following error:

SQL execution error: Error assuming AWS_ROLE. Please verify the role and externalId are configured correctly in your AWS policy.
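
For reference, the pipe DDL was along these lines (again, placeholder names rather than my exact statement):

    -- Pipe wrapping the same COPY INTO; AUTO_INGEST enables loading from S3 event notifications
    CREATE OR REPLACE PIPE my_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO my_table
        FROM @my_s3_stage
        FILE_FORMAT = (TYPE = 'CSV');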

Note that I'm using the Snowflake Python SDK to post the DDL commands; I'm not sure if that matters.

Ideas?

Double-check that the role you're using to execute the Snowpipe has OWNERSHIP on the pipe, USAGE on the stage, INSERT on the table, and USAGE on the storage integration if applicable. It sounds like the link between the stage and your AWS role is broken, so also check the trust relationship of the AWS role. I recommend using a wildcard for whichever Snowflake role created the stage or integration, e.g. MYACCOUNT_SFCRole=2_*; then any time that specific Snowflake role creates a stage or storage integration referencing that AWS role, access will be allowed. See Option 2, Step 4, Bullet 5 in the link you posted, and the trust policy sketch below. – Chris
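
To illustrate Chris's suggestion, the trust policy on the AWS role might look roughly like this. This is only a sketch: the Principal account ARN and the external ID prefix are placeholders, and the real values come from your stage or storage integration properties in Snowflake (e.g. via DESC STAGE):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
          "Action": "sts:AssumeRole",
          "Condition": {
            "StringLike": { "sts:ExternalId": "MYACCOUNT_SFCRole=2_*" }
          }
        }
      ]
    }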

1 Answer


Check whether the S3 bucket is encrypted with a KMS key; if so, edit the IAM role's policy to grant kms:Decrypt on that key's ARN.

Example:

"Effect": "Allow",
"Action":["kms:Decrypt" ],
"Resource": [ "XXXXX" ]