
I have Apache Superset installed via Docker on my local machine. I have a separate production 20-node Spark cluster with Hive as the metastore. I want Superset to be able to connect to Hive and run queries via Spark SQL. To connect to Hive, I tried the following:

Add Database --> SQLAlchemy URI *

hive://hive@<hostname>:10000/default

but it gives an error when I test the connection. I believe I have to do some tunneling, but I am not sure how. I also have the Hive Thrift server running.
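
For reference, this is a minimal connectivity check I can run from inside the Superset container (a sketch, assuming pyhive[hive] and thrift are installed there; <hostname> is a placeholder for my HiveServer2 / Spark Thrift Server host):

from pyhive import hive

# Placeholder host/credentials; adjust to match the cluster.
conn = hive.Connection(
    host="<hostname>",
    port=10000,
    username="hive",
    database="default",
    auth="NONE",          # match your cluster: "NONE", "KERBEROS", or "LDAP"
)
cursor = conn.cursor()
cursor.execute("SHOW DATABASES")
print(cursor.fetchall())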

Please let me know how to proceed.


1 Answer


What is the error you are receiving? Although the docs do not mention this, the best way to provide the connection URL is in the following format:

hive://<url>/default?auth=NONE        (when there is no security)
hive://<url>/default?auth=KERBEROS
hive://<url>/default?auth=LDAP
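
If it helps, you can sanity-check one of these URIs outside Superset with a small SQLAlchemy script (a sketch, assuming sqlalchemy and pyhive[hive] are installed; <hostname> is a placeholder for your HiveServer2 host):

from sqlalchemy import create_engine, text

# Same URI format as above; pick the auth value that matches your cluster.
engine = create_engine("hive://hive@<hostname>:10000/default?auth=NONE")

with engine.connect() as conn:
    # A trivial query to confirm the dialect, host, and auth mode all work.
    for row in conn.execute(text("SHOW DATABASES")):
        print(row)

If this script connects but Superset still fails, the problem is usually Docker networking (the container cannot resolve or reach the host) rather than the URI itself.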