I am trying to create a mount point to ADLS Gen2 in Databricks using a Key Vault-backed secret scope, but I am getting an error. I have Contributor access myself, and I have tried giving the service principal both the Storage Blob Data Contributor and Contributor roles, yet I am still unable to create the mount point.

Any help would be appreciated.

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "abcdefgh",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="myscope", key="mykey"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/tenantid/oauth2/token",
    "fs.azure.createRemoteFileSystemDuringInitialization": "true"}

dbutils.fs.mount(
    source="abfss://cont1@storageaccount.dfs.core.windows.net/",
    mount_point="/mnt/cont1",
    extra_configs=configs)

The error I am getting is:

An error occurred while calling o280.mount. : HEAD https://storageaccount.dfs.core.windows.net/cont1?resource=filesystem&timeout=90 StatusCode=403 StatusDescription=This request is not authorized to perform this operation.
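For completeness, here is a quick sanity check I run before retrying the mount (a minimal sketch using the placeholder names "myscope", "mykey" and "/mnt/cont1" from the snippet above) to confirm that the Key Vault-backed secret scope resolves and to see what is already mounted:

# Confirm the Key Vault-backed scope and key are visible to the cluster
print([s.name for s in dbutils.secrets.listScopes()])    # "myscope" should appear here
print([k.key for k in dbutils.secrets.list("myscope")])  # "mykey" should appear here

# Check whether /mnt/cont1 is already mounted
print([m.mountPoint for m in dbutils.fs.mounts()])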


1 Answer


When performing the steps in the “Assign the application to a role” section, make sure that the service principal has the Storage Blob Data Contributor role assigned to it on the storage account.

Repro: I gave the service principal only the Owner role and ran dbutils.fs.ls("/mnt/azure/"); it returned the same error message as above.


Solution: Assign the Storage Blob Data Contributor role to the service principal.
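If you prefer to assign the role from code instead of the portal, a hedged sketch using the azure-identity and azure-mgmt-authorization packages could look like the following (the subscription ID, resource group, and service principal object ID are placeholders; ba92f5b4-2d11-453d-a403-e96b0029c9fe is the built-in role definition ID for Storage Blob Data Contributor):

import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"            # placeholder
resource_group = "<resource-group>"              # placeholder
spn_object_id = "<service-principal-object-id>"  # placeholder (object ID, not application ID)

# Scope the assignment to the storage account from the question
scope = (f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
         "/providers/Microsoft.Storage/storageAccounts/storageaccount")

# Built-in role definition: Storage Blob Data Contributor
role_definition_id = (f"{scope}/providers/Microsoft.Authorization/roleDefinitions/"
                      "ba92f5b4-2d11-453d-a403-e96b0029c9fe")

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # each role assignment needs a new GUID name
    RoleAssignmentCreateParameters(role_definition_id=role_definition_id,
                                   principal_id=spn_object_id),
)

Note that a new role assignment can take a few minutes to propagate before the 403 disappears.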


Finally, after assigning the Storage Blob Data Contributor role to the service principal, the listing succeeds without any error message.
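With the role in place, a minimal remount-and-verify sketch (reusing the placeholder names and the configs dictionary from the question) would be:

# Assumes `configs` is defined exactly as in the question above
if any(m.mountPoint == "/mnt/cont1" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/cont1")  # drop any stale mount created before the role was assigned

dbutils.fs.mount(
    source="abfss://cont1@storageaccount.dfs.core.windows.net/",
    mount_point="/mnt/cont1",
    extra_configs=configs)

display(dbutils.fs.ls("/mnt/cont1"))  # should now list the container without a 403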


For more details, refer to “Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark”.