I have a Databricks cluster running fine, and with the following code I can mount my Data Lake Storage Gen2 account as well. I am mounting everything on /mnt/data1:
val configs = Map("fs.azure.account.auth.type" -> "OAuth",
"fs.azure.account.oauth.provider.type" -> "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
"fs.azure.account.oauth2.client.id" -> appID,
"fs.azure.account.oauth2.client.secret" -> password,
"fs.azure.account.oauth2.client.endpoint" -> ("https://login.microsoftonline.com/" + tenantID + "/oauth2/token"),
"fs.azure.createRemoteFileSystemDuringInitialization"-> "true")
dbutils.fs.mount(
source = "abfss://" + fileSystemName + "@" + storageAccountName + ".dfs.core.windows.net/",
mountPoint = "/mnt/data1",
extraConfigs = configs)
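For reference, a quick way to double-check the mount right after this step (a minimal sketch using the standard dbutils helpers and the mount point from above):

// List all configured mount points; /mnt/data1 should appear here.
display(dbutils.fs.mounts())
// List the contents of the new mount point.
display(dbutils.fs.ls("/mnt/data1"))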
Up to this point everything is fine and working. But when I try to access a file from the mount location with the following command:
val df = spark.read.csv("/mnt/data1/creodemocontainer/movies.csv")
I get the following error:
java.io.FileNotFoundException: dbfs:/mnt/data1/creodemocontainer2/movies.csv
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.$anonfun$getFileStatus$2(DatabricksFileSystemV2.scala:775)
I can connect to and load those files in Power BI without any issue, though. I haven't found any clue in the last 2 days, so any help will be really appreciated.
Thanks in advance.
Have you run dbutils.fs.ls to check if the file exists? - Jim Xu
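As Jim Xu suggests, a minimal existence check might look like this (a sketch; the mount point is the one from the question):

// Walk the mount and print what is actually there,
// to compare against the path passed to spark.read.csv.
dbutils.fs.ls("/mnt/data1").foreach(f => println(f.path))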