I have Databricks pointing to an Azure storage account, but the region was incorrect. Now I want to point it to a different storage account instead. I mounted it with the code below:

dbutils.fs.mount(
    source = "wasbs://" + mountname + "@" + storageAccount + ".blob.core.windows.net",
    mount_point = root + mountname,
    extra_configs = {"fs.azure.account.key." + storageAccount + ".blob.core.windows.net": dbutils.secrets.get(scope = "", key = "")})

This executes without errors, but when I run %fs ls dbfs:/mnt/ to list the directories, it still shows the directories of the old storage account.

How can I achieve this, if it is possible?

1 Answer

All you need to do is unmount the existing mount point and remount it against the correct (new) storage account.

OR

Create a new mount point that references the new storage account.

Unmount a mount point:

dbutils.fs.unmount("/mnt/<mountname>")
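Before unmounting, you can confirm which source a mount point currently resolves to by inspecting dbutils.fs.mounts(). Below is a minimal sketch of that lookup; it is written over plain (mountPoint, source) pairs so it also runs outside a Databricks cluster, and the sample data is hypothetical:

```python
def find_mount_source(mounts, mount_point):
    """Return the source URL a mount point maps to, or None if not mounted.

    `mounts` is an iterable of (mountPoint, source) pairs; on Databricks
    you would build it as:
        [(m.mountPoint, m.source) for m in dbutils.fs.mounts()]
    """
    for mp, source in mounts:
        if mp == mount_point:
            return source
    return None

# Stand-in data for illustration:
mounts = [("/mnt/data", "wasbs://data@oldaccount.blob.core.windows.net")]
find_mount_source(mounts, "/mnt/data")
# -> "wasbs://data@oldaccount.blob.core.windows.net"
```

If the returned source still names the old storage account, that explains why %fs ls shows the old directories: the mount mapping is unchanged until you unmount and remount.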


To mount a Blob Storage container or a folder inside a container, use the following command:

dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net/<directory-name>",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {"<conf-key>": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})
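For reference, the source URL and the account-key configuration name used in the mount call above follow a fixed pattern, so they can be assembled from the container and account names. A small sketch of that string construction (the names below are placeholders, not real accounts):

```python
def blob_mount_config(container, account, directory=""):
    """Build the wasbs:// source URL and the account-key config name
    that dbutils.fs.mount expects for an Azure Blob Storage container."""
    source = f"wasbs://{container}@{account}.blob.core.windows.net"
    if directory:
        # Optionally mount a folder inside the container.
        source += f"/{directory.strip('/')}"
    conf_key = f"fs.azure.account.key.{account}.blob.core.windows.net"
    return source, conf_key

src, key = blob_mount_config("data", "mystorageacct")
# src == "wasbs://data@mystorageacct.blob.core.windows.net"
# key == "fs.azure.account.key.mystorageacct.blob.core.windows.net"
```

On a cluster, `key` is the `<conf-key>` string passed in `extra_configs`, with its value fetched from a secret scope rather than hard-coded.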
