
I am trying to read files in ADLS Gen 2 Storage from Databricks Notebook using Python.

However, the storage container has its public access level set to "Private".

I have Storage Account Contributor and Storage Blob Data Contributor access.

How can Databricks be allowed to read from and write to the ADLS storage account?

Please edit your question to show what you've done in your notebook, especially the approach you've taken to connect to ADLS. - David Makogon

1 Answer


According to the information you provided, the Storage Account Contributor role has been assigned to your account, so you have permission to retrieve the storage account access key. You can use the access key to authenticate, and then read from and write to ADLS Gen 2 storage. For more details, please refer to here

For example:

# Configure Spark to authenticate to the storage account with its access key,
# retrieved from a Databricks secret scope rather than hard-coded in the notebook.
spark.conf.set(
  "fs.azure.account.key.<storage-account-name>.dfs.core.windows.net",
  dbutils.secrets.get(scope="<scope-name>", key="<storage-account-access-key-name>"))

# Verify access by listing a directory in the container.
dbutils.fs.ls("abfss://<file-system-name>@<storage-account-name>.dfs.core.windows.net/<directory-name>")
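Once the access key is set in the Spark configuration, you can read and write data through the `abfss://` URI directly. A minimal sketch, assuming the same placeholder container and account names as above, plus hypothetical `input/` and `output/` paths (this runs only inside a Databricks notebook, where `spark` is predefined):

```python
# Placeholder base path; substitute your container and storage account names.
base = "abfss://<file-system-name>@<storage-account-name>.dfs.core.windows.net"

# Read a CSV file from the container (path is a hypothetical example).
df = spark.read.csv(f"{base}/input/data.csv", header=True, inferSchema=True)

# Write the results back to the container as Parquet.
df.write.mode("overwrite").parquet(f"{base}/output/data.parquet")
```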