I wonder whether my Databricks code is addressing the correct location, and whether the Contributor role is enough to access the storage.
- I have an Azure Data Lake Storage Gen2 account with a container named staging. (The URL in the Azure portal is https://datalaketest123.blob.core.windows.net/staging)
- I have mounted the ADLS Gen2 account in Azure Databricks.
- I have configured credential passthrough and assume my AD users get access to the storage. (Contributor rights)
- I have the variable: source = 'abfss://' + in_fileSystemName + '@' + storageAccountName + '.dfs.core.windows.net/'
- I then tried to list the file system with: dbutils.fs.ls(source)
I get this error:
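For clarity, this is a minimal sketch of how the path above gets built. The container and account names are filled in from my setup purely for illustration; on the real cluster the values come in as notebook parameters:

```python
# Illustrative values matching my setup (assumptions for this sketch)
in_fileSystemName = "staging"           # container (file system) name
storageAccountName = "datalaketest123"  # storage account name

# ADLS Gen2 access goes through the abfss scheme against the .dfs endpoint,
# even though the portal shows the .blob endpoint URL.
source = "abfss://" + in_fileSystemName + "@" + storageAccountName + ".dfs.core.windows.net/"
print(source)

# On the cluster the listing itself is then:
#   dbutils.fs.ls(source)
```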
GET https://datalaketest123.dfs.core.windows.net/staging?
resource=filesystem&maxResults=500&timeout=90&recursive=false
---------------------------------------------------------------------------
ExecutionError Traceback (most recent call last)
<command-1012822525241408> in <module>
27 # COMMAND ----------
28 source = 'abfss://' + in_fileSystemName + '@' + storageAccountName + '.dfs.core.windows.net/'
---> 29 dbutils.fs.ls(source)
30
31 # COMMAND ----------
/local_disk0/tmp/1235891082005-0/dbutils.py in f_with_exception_handling(*args, **kwargs)
312 exc.__context__ = None
313 exc.__cause__ = None
--> 314 raise exc
315 return f_with_exception_handling
316
ExecutionError: An error occurred while calling z:com.databricks.backend.daemon.dbutils.FSUtils.ls.
: GET https://datalaketest123.dfs.core.windows.net/staging?
resource=filesystem&maxResults=500&timeout=90&recursive=false
StatusCode=403
StatusDescription=This request is not authorized to perform this operation using this permission.
ErrorCode=AuthorizationPermissionMismatch
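For context, this is how I checked which role assignments my AD user actually has on the storage account (a sketch assuming the Azure CLI is installed and logged in; the user principal name, subscription id, and resource group below are placeholders for my real values):

```shell
# List the role names assigned to my user at the storage-account scope.
# <sub-id> and <rg> are placeholders for my subscription and resource group.
az role assignment list \
  --assignee "myuser@example.com" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/datalaketest123" \
  --query "[].roleDefinitionName" \
  --output tsv
```

In my case this returns only "Contributor", which is why I am asking whether that role alone is sufficient for data-plane access through passthrough.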