After running databricks-connect configure, when I run databricks-connect test I get "The system cannot find the path specified." and then nothing happens: no error, no output at all. Please help me resolve this. Since there is no error message, I am also hard pressed to know what to google.
2 Answers
Update: I resolved this by matching the Java versions. The Databricks Runtime on the cluster is 6.5, and the documentation says it runs Java 1.8.0_252, so I installed a version as close to that as I could find and it is working now (both the JDK and the JRE).
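In case it helps anyone hitting the same thing, this is roughly how I checked which Java my R session was picking up before reconnecting (the JDK path below is just an example, adjust it to your install):

# Check which Java the R session / sparklyr will use.
Sys.getenv("JAVA_HOME")      # should point at a JDK 8 install
system("java -version")      # should report something like 1.8.0_2xx

# If several Java versions are installed, point JAVA_HOME at the JDK 8 one
# before connecting ("C:/Program Files/Java/jdk1.8.0_252" is an example path).
Sys.setenv(JAVA_HOME = "C:/Program Files/Java/jdk1.8.0_252")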
There is still a caveat though. For tables that live in the data lake I am still unable to make it work with

sparklyr::spark_read_parquet(sc = sc, path = "/.../parquet_table", header = TRUE, memory = FALSE)

It does work for the tables that belong to the "default" database in Databricks. Not sure if this is just my setup, but I am tired of all the tweaking I have been doing for the past week lol. Please comment if anyone has been able to get this working!