Need an elegant way to roll back a Delta Lake table to a previous version.
My current approach is listed below:
import io.delta.tables._

val deltaTable = DeltaTable.forPath(spark, testFolder)

// Read the table as of version 0 and overwrite the current contents with it.
spark.read.format("delta")
  .option("versionAsOf", 0)
  .load(testFolder)
  .write
  .format("delta")
  .mode("overwrite")
  .save(testFolder)
This is ugly, though, because the whole data set needs to be rewritten. It seems that a metadata-only update should be sufficient, with no data I/O necessary. Does anyone know a better approach for this?
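For what it's worth, I've seen mention that newer Delta Lake releases expose a restore API on DeltaTable, but I'm not sure whether it is metadata-only or whether it is available on my version. A minimal sketch, assuming that API exists:

import io.delta.tables._

// Sketch, assuming a Delta Lake release that ships DeltaTable.restoreToVersion;
// I have not verified that this avoids rewriting the underlying data files.
val deltaTable = DeltaTable.forPath(spark, testFolder)
deltaTable.restoreToVersion(0)

Is something like this the intended way, or is there another trick to make the rollback a pure metadata operation?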