I am using Databricks to access my ADLS Gen2 container.
dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {"<conf-key>": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})
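For reference, here is a minimal sketch of how I understand the placeholders fit together for a wasbs:// mount. The account and container names below are made up for illustration, and the conf key shown is the standard account-key setting used by the wasb driver:

```python
# Hypothetical values for illustration only -- substitute your own.
storage_account = "mystorageacct"
container = "mycontainer"

# For a wasbs:// mount, <conf-key> is typically the storage-account
# access-key configuration for that account:
conf_key = f"fs.azure.account.key.{storage_account}.blob.core.windows.net"

# And <source> points at the container inside that account:
source = f"wasbs://{container}@{storage_account}.blob.core.windows.net"

print(conf_key)
print(source)
```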
I am using the code above, substituting my own values of course. When I run the following script, I get this error:
df = spark.read.text("/mnt/<mount-name>/...")
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container <container name> in account <storage account name>.blob.core.windows.net not found, and we can't create it using anoynomous credentials, and no credentials found for them in the configuration.
I registered Databricks in my App registrations and assigned it the Storage Blob Data Contributor role on my ADLS storage account.
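For context, my understanding is that a service-principal role assignment like that is normally paired with the abfss:// (ADLS Gen2) driver and OAuth configs, rather than a wasbs:// account-key mount. A sketch of the config dict I believe the Gen2 driver expects, with all values left as placeholders:

```python
# Placeholder values -- these are not real identifiers.
client_id = "<application-id>"
tenant_id = "<tenant-id>"

# OAuth settings the abfss (ADLS Gen2) driver expects for a service principal.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": client_id,
    "fs.azure.account.oauth2.client.secret": "<secret-from-scope>",
    "fs.azure.account.oauth2.client.endpoint":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}
print(len(configs))
```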
I'm not sure why my credentials aren't letting me read the text files in my ADLS account.
Any help is appreciated!