I am trying to understand why my ACL permissions are not working properly in Databricks.
Scenario: I have two users, one with full permissions on the filesystem and the other with no permissions at all.
I tried mounting the Gen2 filesystem in Databricks using two different methods. First, using a service principal (OAuth):
configs = {
  "fs.azure.account.auth.type": "OAuth",
  "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
  "fs.azure.account.oauth2.client.id": clientid,
  "fs.azure.account.oauth2.client.secret": credential,
  "fs.azure.account.oauth2.client.endpoint": refresh_url
}
dbutils.fs.mount(
  source = "abfss://[email protected]/",
  mount_point = "/mnt/xyz",
  extra_configs = configs)
and second, using credential passthrough:
configs = {
"fs.azure.account.auth.type": "CustomAccessToken",
"fs.azure.account.custom.token.provider.class": spark.conf.get("spark.databricks.passthrough.adls.gen2.tokenProviderClassName")
}
dbutils.fs.mount(
source = "abfss://[email protected]/",
mount_point = "/mnt/xyz",
extra_configs = configs)
Both methods mount the filesystem, but when I run:
dbutils.fs.ls("/mnt/xyz")
it displays all the files and folders, even for the user who has no permissions on the Data Lake.
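For reference, this is how I check the existing mounts before remounting (a sketch that assumes a Databricks notebook, where `dbutils` is available; it will not run outside Databricks). Since mounts persist across clusters and sessions, I wanted to make sure `/mnt/xyz` was not still backed by the earlier service-principal mount when testing passthrough:

```python
# Databricks notebook snippet (assumes `dbutils` is available).
# List current mounts to confirm which source /mnt/xyz points at,
# since mounts are workspace-wide and persist across clusters.
for mount in dbutils.fs.mounts():
    print(mount.mountPoint, "->", mount.source)

# Unmount before remounting with different configs; dbutils.fs.mount
# does not replace an existing mount point in place.
if any(m.mountPoint == "/mnt/xyz" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/xyz")
```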
I'd be glad if someone could explain what's wrong.
Thanks