
The original Spark distribution supports several cluster managers, such as YARN, Mesos, Spark Standalone, and Kubernetes. I can't find what is under the hood in Databricks Spark: which cluster manager does it use, and is it possible to change it?


What's Databricks Spark architecture?

Thanks.


1 Answer


You can't check the cluster manager in Databricks, and you don't really need to, because that part is managed for you. You can think of it as a kind of standalone cluster, but there are differences. The general Databricks architecture is shown here.

You can change the cluster configuration by different means, such as init scripts and Spark configuration parameters. See the documentation for more details.
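As an illustration, Spark configuration parameters and init scripts can be supplied in the cluster specification when creating a cluster through the Databricks Clusters API. This is a hedged sketch: the field names follow the Clusters API, but the cluster name, Spark setting, runtime version, node type, and init-script path are all hypothetical placeholders you would replace with values valid for your workspace:

```json
{
  "cluster_name": "example-cluster",
  "spark_version": "13.3.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "num_workers": 2,
  "spark_conf": {
    "spark.sql.shuffle.partitions": "64"
  },
  "init_scripts": [
    { "workspace": { "destination": "/Users/someone@example.com/init.sh" } }
  ]
}
```

Note that only this kind of configuration is exposed; the underlying cluster manager itself is not something you select or swap out.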