1 vote

How to check the Delta Lake version in a Databricks notebook?

(from Slack)


2 Answers

1 vote

How about checking with Databricks's dbutils?

println(dbutils.notebook.getContext.tags("sparkVersion"))
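
Note that this prints the Databricks Runtime version string, not the Delta Lake version itself; you still map that runtime to its bundled Delta Lake version using the runtime release notes. A minimal sketch of reading it into a variable, assuming a Scala notebook where dbutils is available as usual:

// Read the runtime version tag from the notebook context
val runtimeVersion = dbutils.notebook.getContext.tags("sparkVersion")

// Prints a runtime version string such as "11.3.x-scala2.12"
// (exact format depends on the runtime)
println(runtimeVersion)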

0 votes

You’d have to get the runtime version and then match it up with the Delta Lake version included in that runtime.

spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")

and then check the Databricks Runtime release notes for the Delta Lake version it bundles.
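
A minimal sketch of that two-step lookup in a Scala notebook; the map entries below are illustrative placeholders that you would fill in by hand from the release notes for your runtimes:

// Step 1: read the Databricks Runtime version from the cluster usage tags
val runtimeVersion = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")

// Step 2: hypothetical hand-maintained map from runtime version to the
// Delta Lake version bundled with it, copied from the release notes
val deltaByRuntime = Map(
  "11.3.x-scala2.12" -> "2.1.0" // illustrative entry; confirm in the release notes
)

val deltaVersion = deltaByRuntime.getOrElse(runtimeVersion, s"unknown -- check release notes for $runtimeVersion")
println(s"Runtime $runtimeVersion ships Delta Lake $deltaVersion")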