2 votes

I installed Scala and Apache Spark using Homebrew, which installed Scala 2.12.4 and Apache Spark 2.2.0. However, if you check spark-shell --version, it reports a different Scala version.

[Screenshot: version mismatch between scala and spark-shell]

How do I set spark-shell to use the installed Scala version? Is there a way to set the Scala version used by Apache Spark during the Homebrew installation?
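
For reference, you can confirm which Scala version spark-shell is actually running from inside the REPL itself; a minimal sketch (scala.util.Properties is part of the Scala standard library):

    // Paste into a running spark-shell session.
    // Prints the Scala version the REPL was compiled with -- the version Spark
    // was built against, not the Scala that Homebrew installed separately.
    scala.util.Properties.versionString

    // The Spark version itself, for comparison.
    org.apache.spark.SPARK_VERSION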


1 Answer

3 votes

TL;DR You cannot.

Two problems:

  • Spark (and this is not really specific to Spark) will use the Scala version it was compiled with; whatever Scala compiler happens to be installed on the machine is irrelevant.
  • Spark doesn't support Scala 2.12 yet, so recompiling is not an option (see the sbt sketch below).
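
To illustrate the first point: the Scala binary version is baked into every Spark artifact name, so any build has to target a Scala line that Spark is actually published for. A minimal sbt sketch, assuming the Spark 2.2.0 / Scala 2.11 pairing described above:

    // build.sbt -- Spark artifacts are cross-published per Scala binary version.
    scalaVersion := "2.11.8"  // Spark 2.2.0 is compiled against the Scala 2.11 line

    // %% appends the Scala binary suffix, so this resolves to spark-core_2.11.
    // There is no spark-core_2.12 artifact for Spark 2.2.0, which is why a
    // Scala 2.12 toolchain cannot be paired with it.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"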