I am trying to load data from Blob Storage into Azure SQL Data Warehouse using Azure Databricks (Scala).
spark.conf.set("spark.sql.parquet.writeLegacyFormat", "true")

df.write.format("com.databricks.spark.sqldw")
  .option("url", sqlDwUrlSmall)
  .option("dbtable", "Person")
  .option("forward_spark_azure_storage_credentials", "true")
  .option("tempDir", tempDir)
  .mode("overwrite")
  .save()
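For context, the connector forwards whatever Blob Storage account key is present in the Spark session configuration to the warehouse, so a setup along these lines is assumed to run before the write. The account name, container name, secret scope, and key name below are placeholders, not values from my actual environment:

```scala
// Hypothetical placeholders; substitute your own storage account, container, and key.
val storageAccount = "mystorageaccount"
val containerName  = "mycontainer"
val storageKey     = dbutils.secrets.get("my-scope", "storage-key") // assumed secret scope

// Make the account key visible to this Spark session; the SQL DW connector
// forwards it to the warehouse when forward_spark_azure_storage_credentials is true.
spark.conf.set(
  s"fs.azure.account.key.$storageAccount.blob.core.windows.net",
  storageKey)

// Staging location used by the connector for PolyBase loads.
val tempDir = s"wasbs://$containerName@$storageAccount.blob.core.windows.net/tempDirs"
```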
I am getting this error:
Underlying SQLException(s): - com.microsoft.sqlserver.jdbc.SQLServerException: External file access failed due to internal error: 'Error occurred while accessing HDFS: Java exception raised on call to HdfsBridge_IsDirExist. Java exception message: HdfsBridge::isDirExist - Unexpected error encountered checking whether directory exists or not: StorageException: This request is not authorized to perform this operation.' [ErrorCode = 105019] [SQLState = S0001]