The docs say "Every Databricks deployment has a central Hive metastore..." and also mention support for an external metastore for existing Hive installations.
I have an Azure Databricks workspace with an underlying Spark cluster, and data files stored on DBFS and in Blob Storage. Do I need an HDInsight cluster with an external metastore to be able to create and use Hive tables? Or can I use the central metastore mentioned above to create Hive tables over data stored on DBFS or Blob Storage?
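For concreteness, this is roughly what I'd like to be able to do from a notebook, assuming the central metastore just works (the paths, table names, and the storage account/container are hypothetical placeholders; `spark` is the session Databricks provides in notebooks):

```python
# Register a managed Hive table from a file on DBFS
# (hypothetical path and table name)
df = spark.read.csv("dbfs:/mnt/mydata/sales.csv", header=True, inferSchema=True)
df.write.saveAsTable("sales")

# Or define an external table directly over data in Blob Storage
# (hypothetical account/container/path)
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales_ext (id INT, amount DOUBLE)
    USING CSV
    OPTIONS (header 'true')
    LOCATION 'wasbs://mycontainer@myaccount.blob.core.windows.net/sales/'
""")

# And then query it like any Hive table
spark.sql("SELECT COUNT(*) FROM sales_ext").show()
```

Would the above work against the built-in central metastore, or does something like this require an external one?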