I have saved a DataFrame to my Delta Lake with the following command:
df2.write.format("delta").mode("overwrite").partitionBy("updated_date").save("/delta/userdata/")
I can also load and view the Delta table at /delta/userdata:
dfres=spark.read.format("delta").load("/delta/userdata")
But here I have a doubt: when I move several Parquet files from blob storage into the Delta Lake and create DataFrames from them, how would someone else know which files I have moved, and how can they work with those Delta tables? Is there any command in Databricks to list all the tables stored in the Delta Lake?
Run SHOW TABLES and see if Databricks somehow tracks Delta tables. They're not tracked in a metastore in the OSS version (Delta Lake 0.5.0), but I have seen some code that implies it could work with Databricks. – Jacek Laskowski
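A minimal sketch of what the comment suggests, assuming a Databricks workspace where an existing Delta path can be registered in the metastore (the table name userdata is an illustrative assumption, not from the original post):

# Register the existing Delta path as a table in the metastore...
spark.sql("CREATE TABLE IF NOT EXISTS userdata USING DELTA LOCATION '/delta/userdata/'")
# ...after which it appears when listing tables.
spark.sql("SHOW TABLES").show()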