
The docs say "Every Databricks deployment has a central Hive metastore...", in addition to supporting an external metastore for existing Hive installations.

I have an Azure Databricks workspace with an underlying Spark cluster, and data files stored on DBFS and Blob Storage. Do I need an HDInsight cluster with an external metastore to be able to create and use Hive tables? Or can I use the central metastore mentioned above to create Hive tables on data stored on DBFS or Blob Storage?


1 Answer


@Gadam Nope, you do not. Azure Databricks provisions its own Hive metastore, but if you are already running one with HDInsight, Databricks can be configured to use that one instead (an external metastore).
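
As a minimal sketch of the built-in path: in a Databricks notebook you can register a Hive table over files already on DBFS, and the definition lands in the workspace's central metastore automatically. The table name and DBFS path below are hypothetical placeholders.

```python
# Runs in a Databricks notebook, where `spark` (a SparkSession) is predefined.
# `sales_raw` and the dbfs:/ path are hypothetical placeholders.

# Register a Hive table over Parquet files already sitting on DBFS;
# the table definition is stored in the workspace's built-in metastore.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales_raw
    USING PARQUET
    LOCATION 'dbfs:/mnt/mydata/sales/'
""")

# The table is now queryable from any cluster in the workspace.
spark.sql("SELECT COUNT(*) FROM sales_raw").show()
```

An external metastore only comes into play if you want Databricks and HDInsight to share table definitions; in that case you point the cluster at the existing metastore database via Spark configs such as spark.sql.hive.metastore.version and the javax.jdo.option.* JDBC connection settings covered in the Databricks external metastore docs.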