I am trying to convert SAS files to CSV in Azure Databricks. The SAS files are stored in Azure Blob Storage. I can successfully mount the Azure Blob container in Databricks, but when I read from the mount it shows no files, even though the files are present in Blob Storage. Has anyone done this before?
Thanks!
dbutils.fs.mount(
    source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
    mount_point = "/mnt/<mount-name>",
    extra_configs = {"<conf-key>": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")}
)
This throws no errors. – Harsh Goswami
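For reference, the <conf-key> I pass is the account-key form from the standard Azure Blob mount docs, and I can also check whether the mount registered at all with dbutils.fs.mounts(). This is just a sketch with the same placeholders as above:

# Config key used in extra_configs, assuming account-key auth (placeholders as above):
# "fs.azure.account.key.<storage-account-name>.blob.core.windows.net"

# Confirm the mount registered and points at the expected container
for m in dbutils.fs.mounts():
    if m.mountPoint == "/mnt/<mount-name>":
        print(m.mountPoint, "->", m.source)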
%fs ls
This lists the mounted storage location. I check that with display(dbutils.fs.ls("dbfs:/mnt/"))
and then with my mount name: display(dbutils.fs.ls("dbfs:/mnt/harsh"))
which gives an error of java.io.FileNotFoundException: / is not found
– Harsh Goswami
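For context, once the mount actually lists files, the conversion step itself is what I plan to run next; a minimal sketch, assuming pandas is available on the cluster and using hypothetical file names:

import pandas as pd

# Hypothetical paths; /dbfs/... exposes the DBFS mount as a local path on the driver
sas_path = "/dbfs/mnt/harsh/sample.sas7bdat"
csv_path = "/dbfs/mnt/harsh/sample.csv"

df = pd.read_sas(sas_path)   # pandas reads .sas7bdat / .xpt files natively
df.to_csv(csv_path, index=False)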