
I am trying to load data into SQL Data Warehouse from Blob Storage using Azure Databricks (Scala).

// Write Parquet in the legacy format expected by PolyBase
spark.conf.set("spark.sql.parquet.writeLegacyFormat", "true")

df.write
  .format("com.databricks.spark.sqldw")
  .option("url", sqlDwUrlSmall)
  .option("dbtable", "Person")
  .option("forward_spark_azure_storage_credentials", "True")
  .option("tempdir", tempDir)
  .mode("overwrite")
  .save()

I am getting this error:

Underlying SQLException(s): - com.microsoft.sqlserver.jdbc.SQLServerException: External file access failed due to internal error: 'Error occurred while accessing HDFS: Java exception raised on call to HdfsBridge_IsDirExist. Java exception message: HdfsBridge::isDirExist - Unexpected error encountered checking whether directory exists or not: StorageException: This request is not authorized to perform this operation.' [ErrorCode = 105019] [SQLState = S0001]

I am using an access key to get the data from blob storage. – Prasad

1 Answer


To successfully load data from Blob Storage into SQL Data Warehouse using Azure Databricks (Scala):

  • Make sure you are passing the correct path.

  • Make sure to pass "tempDir" in the format shown below.

tempDir = "wasbs://" + blobContainer + "@" + blobStorage +"/tempDirs"
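Putting the pieces together, here is a minimal sketch of building the tempDir URI. The storage account and container names are placeholders, and the spark.conf line (shown as a comment, since it only runs inside a Databricks/Spark session) registers the access key so that both Spark and, via forward_spark_azure_storage_credentials, SQL Data Warehouse can reach the temp directory — the missing key registration is a common cause of the "not authorized" StorageException above:

```scala
// Placeholder values -- substitute your own storage account and container.
val blobStorage = "mystorageaccount.blob.core.windows.net"
val blobContainer = "mycontainer"
val blobAccessKey = "<your-storage-account-access-key>"

// Temp directory in the wasbs:// format the connector expects:
// wasbs://<container>@<account>.blob.core.windows.net/<path>
val tempDir = "wasbs://" + blobContainer + "@" + blobStorage + "/tempDirs"

// Inside a Databricks notebook you would also register the key so the
// storage account is reachable (spark is the active SparkSession there):
// spark.conf.set("fs.azure.account.key." + blobStorage, blobAccessKey)

println(tempDir)
```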

Reference: Load data into Azure SQL Data Warehouse

Hope this helps.