
Every time I attempt to execute any Scala code on Databricks Community Edition, I get the following error message:

java.lang.Exception: An error occurred while initializing the REPL. Please check whether there are conflicting Scala libraries or JARs attached to the cluster, such as Scala 2.11 libraries attached to Scala 2.10 cluster (or vice-versa).

Can someone let me know how to resolve this?

OK, I changed the Databricks Runtime version on the cluster to 7.0 (includes Apache Spark 3.0.0, Scala 2.12). When I now run any Scala code, I get the error: scala.reflect.internal.FatalError: Error accessing /databricks/jars/adal4j-1.6.0.jar. I installed adal4j-1.6.0.jar in the location /databricks/jars, but I'm still getting the same error. Any thoughts? - Carltonp
Do you see this library in the cluster properties? Also, it's recommended to use MSAL4J instead of ADAL4J. - Alex Ott

1 Answer


You need to add the library correctly, for example via Maven coordinates (usually preferable, as this pulls in all transitive dependencies as well), or from DBFS or S3, so that the runtime can distribute the library to all nodes in the cluster. Copying a JAR into /databricks/jars by hand does not do this. For example, here I'm adding MSAL instead of ADAL:

[Screenshot: adding the MSAL library via Maven coordinates in the cluster's Libraries tab]
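If you prefer to script the install rather than use the Libraries UI, the legacy Databricks CLI can attach the same Maven coordinates to a cluster. This is a sketch: the cluster ID is a placeholder, and you should check Maven Central for the current MSAL4J version rather than taking the one below as given.

```shell
# Placeholder cluster ID - replace with your cluster's actual ID.
# Installs MSAL4J from Maven Central and distributes it to all nodes.
databricks libraries install \
  --cluster-id 0123-456789-abcde123 \
  --maven-coordinates com.microsoft.azure:msal4j:1.13.8
```

Either way, the library should afterwards appear on the cluster's Libraries tab with status "Installed".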

and after it's installed, I can access it without any issue (I used Runtime 6.5):

[Screenshot: importing and using the MSAL library from a Scala notebook cell]
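A quick way to confirm the library actually reached the driver (as the comment above about checking the cluster properties suggests) is to try loading one of its classes from a notebook cell. This is a sketch; `com.microsoft.aad.msal4j.ConfidentialClientApplication` is a real MSAL4J class, but swap in a class from whatever library you installed.

```scala
import scala.util.Try

// Returns true if the driver's classloader can see the given class,
// i.e. the attached library was distributed to this node.
def libraryVisible(className: String): Boolean =
  Try(Class.forName(className)).isSuccess

val msalClass = "com.microsoft.aad.msal4j.ConfidentialClientApplication"
if (libraryVisible(msalClass))
  println(s"$msalClass is on the classpath")
else
  println(s"$msalClass not found - check the cluster's Libraries tab")
```

If this prints "not found", the JAR was not attached as a cluster library, which would also explain the original `Error accessing /databricks/jars/adal4j-1.6.0.jar` failure.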