
I'm using Azure Databricks to work with data from Azure storage accounts. I'm mounting them directly in the Databricks File System as described here: Mount storage account in Databricks File System. So the data is accessible under a path like: /mnt/storage_account/container/path_to_file
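For reference, a mount of this kind is typically created once with `dbutils.fs.mount` (a sketch that only runs inside a Databricks notebook, where `dbutils` is provided by the runtime; the account, container, secret-scope, and tenant names below are placeholders, not values from the question):

```python
# Sketch: mount an ADLS Gen2 container into DBFS with a service principal.
# All names below are placeholders; dbutils exists only in a Databricks notebook.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id":
        dbutils.secrets.get("my-scope", "client-id"),
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get("my-scope", "client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://container@storage_account.dfs.core.windows.net/",
    mount_point="/mnt/storage_account/container",
    extra_configs=configs,
)
```

Note that the mount itself is workspace-wide: every cluster user sees the same /mnt path, which is why the mount alone cannot separate the two audiences described below.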

I have two storage accounts mounted. The first is a standard storage account that is used as a source for tables; users should not be able to access files there. The second is an ADLS storage account where users have access policies configured, and with ADLS Passthrough they can read and write to the containers dedicated to them.

The only thing I found for limiting access to DBFS is the ANY FILE object. But once I run GRANT SELECT ON ANY FILE TO <user>@<domain-name>, the user is able to read the whole file system, including sensitive data. With DENY SELECT ON ANY FILE, the user cannot read or write from any storage account, including the ADLS one, so ADLS Passthrough doesn't work.
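For context, the table-ACL statements in question look like this (Databricks SQL; the user name is a placeholder). The ANY FILE object covers every DBFS path, both mounts included, which is exactly the all-or-nothing behaviour described:

```sql
-- All-or-nothing: ANY FILE covers every DBFS path, so both mounts are affected.
GRANT SELECT ON ANY FILE TO `user@domain-name`;  -- user can read everything, including storage_account_1
DENY SELECT ON ANY FILE TO `user@domain-name`;   -- user can read nothing, the passthrough mount included
```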

Is there any way to limit access to /mnt/storage_account_1/container/... while still having access to /mnt/storage_account_2/container...?


1 Answer


You may try to set up access control on storage account 1 using one of the authorization options described here: https://docs.microsoft.com/en-us/azure/storage/common/storage-auth?toc=/azure/storage/blobs/toc.json
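For example, instead of a broad grant inside Databricks, Azure RBAC can scope a data-plane role to a single storage account (or container), so passthrough users keep access to account 2 while having no role on account 1. A sketch with the Azure CLI; the subscription ID, resource group, account, and user names are all placeholders:

```shell
# Sketch: grant the user read access on storage account 2 only.
# No role is assigned on storage_account_1, so passthrough requests to it fail.
az role assignment create \
  --assignee "user@domain-name" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/storage_account_2"
```

With ADLS Passthrough the user's own Azure AD identity is presented to storage, so role assignments like this (plus any POSIX ACLs on the ADLS containers) are what actually decide per-account access, independently of the DBFS mount.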