I have written an AWS Glue job in which I am trying to read Snowflake tables as Spark DataFrames and also write a Spark DataFrame into a Snowflake table. The job fails with "Insufficient privileges to operate on schema" in both scenarios.
However, when I run an INSERT statement directly in the Snowflake CLI, the data is inserted successfully, so my user does have the INSERT privilege.
Why, then, does my job fail when I try to insert data from a DataFrame or read a Snowflake table as a DataFrame?
Below is the code I use to write data into the Snowflake table.
sfOptions = {
    "sfURL" : "xt30972.snowflakecomputing.com",
    "sfAccount" : "*****",
    "sfUser" : "*****",
    "sfPassword" : "*****",
    "sfDatabase" : "*****",
    "sfSchema" : "******"
}

df = spark.read.format("csv") \
    .option("header", "false") \
    .option("delimiter", ",") \
    .load(s3_file_name)  # s3_file_name holds the S3 path of the CSV file

df.write.format("net.snowflake.spark.snowflake") \
    .options(**sfOptions) \
    .option("dbtable", table_name) \
    .mode("append") \
    .save()
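One thing I noticed in the connector documentation is that the options dict can also carry "sfWarehouse" and "sfRole", and I am not sure whether leaving them out matters here, since the connector runs its own load queries under whatever role and warehouse are in effect. Below is a sketch of the fuller options dict I could pass instead; all values are placeholders, and the extra key names are taken from the connector's documented option set, not from my working code:

```python
# Hypothetical fuller options dict for the Snowflake Spark connector.
# All values are placeholders; only the key names are meaningful.
sfOptions = {
    "sfURL": "xt30972.snowflakecomputing.com",
    "sfAccount": "<account>",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",  # warehouse the connector's queries run on
    "sfRole": "<role>",            # role that actually holds the schema privileges
}

# Sanity-check that nothing the connector commonly needs is missing.
required = {"sfURL", "sfUser", "sfPassword", "sfDatabase", "sfSchema", "sfWarehouse"}
missing = required - sfOptions.keys()
print(sorted(missing))  # → []
```

If the role that holds INSERT is not my user's default role, setting "sfRole" explicitly might make the difference, since the CLI session and the Glue job could be running under different roles.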