0 votes

I need to create an abstraction on top of an existing Delta Lake table in Databricks. Is it possible to create something like a SQL Server view based on a Delta Lake table in Spark?

2

Have you tried loading the delta table (that'd give you a DataFrame) and registering it as a view with Dataset.createOrReplaceTempView? – Jacek Laskowski

2 Answers

4 votes

A SQL view can now be created on a Delta Lake table in several ways.

  • Through Spark SQL:

CREATE OR REPLACE VIEW sqlView AS SELECT col1, .., coln FROM delta_table;

  • A Hive external table can be created over the Delta table's path. Add the connector jars to the Hive environment, set the following properties, and create the external table (Hive 2.x is supported):

ADD JAR /path/to/delta-core-shaded-assembly_2.11-0.1.0.jar;
ADD JAR /path/to/hive-delta_2.11-0.1.0.jar;
SET hive.input.format=io.delta.hive.HiveInputFormat;
SET hive.tez.input.format=io.delta.hive.HiveInputFormat;
CREATE EXTERNAL TABLE deltaTable(col1 INT, col2 STRING)
STORED BY 'io.delta.hive.DeltaStorageHandler'
LOCATION '/delta/table/path';

For more details: https://github.com/delta-io/connectors
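
The Spark SQL approach above assumes the Delta table is already registered in the metastore. If it is not, Spark SQL can also query Delta data directly by path, so a view can be defined over the path itself. A minimal sketch (the path is a placeholder):

CREATE OR REPLACE VIEW sqlView AS
SELECT * FROM delta.`/delta/table/path`;

The delta.`<path>` syntax reads the Delta table at that location without requiring a prior CREATE TABLE.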

0 votes

A view can be created on a Delta Lake table just as in relational databases, using the DDL statement below:

CREATE OR REPLACE VIEW SampleDB.Sample_View
AS
SELECT
    ColA,
    ColB
FROM SampleDB.Sample_Table

Create View Documentation
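
If the view is only needed for the current session rather than persisted to the metastore, Spark SQL also supports temporary views. A sketch using the same placeholder names as above:

CREATE OR REPLACE TEMPORARY VIEW Sample_Temp_View
AS
SELECT
    ColA,
    ColB
FROM SampleDB.Sample_Table

A temporary view is not written to any database and disappears when the session ends, which can be preferable for ad-hoc abstractions.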