How do you check the Delta Lake version in a Databricks notebook?
(from Slack)
You’d have to get the Databricks runtime version and then match it up with the Delta Lake version included in that runtime:
spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
and then check the release notes for that runtime to see which Delta Lake version it bundles.
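
A minimal sketch of doing this in a notebook cell, assuming a standard Databricks cluster where the spark session is predefined; the DATABRICKS_RUNTIME_VERSION environment variable is an assumption, not something confirmed by the original note:

import os

# Runtime version from the cluster usage tags, as in the snippet above.
runtime_version = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
print(f"Databricks runtime: {runtime_version}")

# Assumption: the runtime may also expose its version as an environment variable.
print(f"DATABRICKS_RUNTIME_VERSION: {os.environ.get('DATABRICKS_RUNTIME_VERSION')}")

# Look up runtime_version in the Databricks runtime release notes to find the
# bundled Delta Lake version; it is not reported directly by Spark itself.

The printed value (e.g. something like "13.3.x-scala2.12") is what you match against the release notes page for that runtime.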