
We have a number of Databricks Delta tables created on ADLS Gen1, and external tables built on top of each of them in one Databricks workspace.

Similarly, I am trying to create the same sort of external tables over the same Delta files, but in a different workspace.
I have read-only access via a service principal on ADLS Gen1, so I can read the Delta files through Spark DataFrames, as shown below:

read_data_df = spark.read.format("delta").load('dbfs:/mnt/data/<foldername>')

I can even create Hive external tables, but I see the following error when reading data from the same table:


Error in SQL statement: AnalysisException: Incompatible format detected.

A transaction log for Databricks Delta was found at `dbfs:/mnt/data/<foldername>/_delta_log`,
but you are trying to read from `dbfs:/mnt/data/<foldername>` using format("hive"). You must use
'format("delta")' when reading and writing to a delta table.

To disable this check, SET spark.databricks.delta.formatCheck.enabled=false
To learn more about Delta, see https://docs.microsoft.com/azure/databricks/delta/index
;
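For context, the DDL that produces the warning above would look roughly like this. This is a sketch with a placeholder table name and column list; the mount path is from the question. Without a `USING DELTA` clause, the table is registered in the default Hive format, which is what triggers the format check:

```sql
-- Hypothetical DDL: external table without USING DELTA, so Spark
-- treats the location as format("hive") and rejects the _delta_log.
CREATE TABLE sample_table (id INT, name STRING)
LOCATION 'dbfs:/mnt/data/<foldername>';
```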

If I instead create the external table `USING DELTA`, I see a different access error:

Caused by: org.apache.hadoop.security.AccessControlException: 
OPEN failed with error 0x83090aa2 (Forbidden. ACL verification failed. 
Either the resource does not exist or the user is not authorized to perform the requested operation.). 
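The `USING DELTA` variant that produces this ACL error would be along these lines (the table name is a placeholder of mine; with Delta, the schema is inferred from the transaction log, so no column list is needed):

```sql
-- With USING DELTA, Spark must read the _delta_log files under the
-- table directory, so ACLs are checked on those files as well.
CREATE TABLE sample_table
USING DELTA
LOCATION 'dbfs:/mnt/data/<foldername>';
```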

Does this mean I need full access, rather than just read-only, on the underlying file system?

Thanks


1 Answer


Resolved after upgrading the Databricks Runtime environment to version DBR 7.3.