
Hi, I want to upload Parquet files to an ADLS Gen2 blob. I am using the code below to create a blob and upload a Parquet file to it.

import os
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="Connection String", container_name="parquet", blob_name=outdir
)
df.to_parquet("logs.parquet", compression="GZIP")  # df is a DataFrame
with open("./logs.parquet", "rb") as data:
    blob.upload_blob(data)
os.remove("logs.parquet")

I do not face any error and the files are also written to the blob. But I don't think I am doing it right, as ADX/Kusto queries cannot understand the file and no data is visible there.
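One quick local sanity check before suspecting the external table (this helper is my own suggestion, not part of the original code): a valid Parquet file starts and ends with the 4-byte magic `PAR1`, so you can verify the bytes you are about to upload:

```python
PARQUET_MAGIC = b"PAR1"

def looks_like_parquet(data: bytes) -> bool:
    """A valid Parquet file begins and ends with the 4-byte magic PAR1."""
    return (
        len(data) > 8
        and data[:4] == PARQUET_MAGIC
        and data[-4:] == PARQUET_MAGIC
    )

# Usage against the file written by df.to_parquet:
#   with open("logs.parquet", "rb") as f:
#       assert looks_like_parquet(f.read())
```

If this check passes, the file itself is structurally fine and the problem is more likely on the ADX side.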

Below are the steps I performed in Azure Data Explorer to fetch records from the uploaded Parquet file in ADLS Gen2.

Created External Table:

.create external table LogDataParquet(AppId_s:string,UserId_g:string,Email_s:string,RoleName_s:string,Operation_s:string,EntityId_s:string,EntityType_s:string,EntityName_s:string,TargetTitle_s:string,TimeGenerated:datetime) 
kind=blob
dataformat=parquet
( 
   h@'https://streamoutalds2.blob.core.windows.net/stream-api-raw-testing;secret'
)
with 
(
   folder = "ExternalTables"   
)

External Table Column Mapping:

.create external table LogDataParquet parquet mapping "LogDataMapparquet" '[{ "column" : "AppId_s", "path" : "$.AppId_s"},{ "column" : "UserId_g", "path" : "$"},{ "column" : "Email_s", "path" : "$.Email_s"},{ "column" : "RoleName_s", "path" : "$.RoleName_s"},{ "column" : "Operation_s", "path" : "$.Operation_s"},{ "column" : "EntityId_s", "path" : "$.EntityId_s"}]'
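As a side note, every column in the mapping above is expected to read from a JSON path named after itself, except `UserId_g`, whose path is `$`. A small stdlib check (a hypothetical helper of my own, not part of the question) can flag such entries:

```python
import json

# The mapping JSON from the .create mapping command (abbreviated here)
mapping_json = (
    '[{"column": "AppId_s", "path": "$.AppId_s"},'
    ' {"column": "UserId_g", "path": "$"},'
    ' {"column": "Email_s", "path": "$.Email_s"}]'
)

def mismatched_paths(mapping: str) -> list:
    """Return columns whose path does not follow the $.<column> convention."""
    return [
        entry["column"]
        for entry in json.loads(mapping)
        if entry["path"] != f"$.{entry['column']}"
    ]

print(mismatched_paths(mapping_json))  # ['UserId_g']
```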

The external table gives no records:

external_table('LogDataParquet')

No records

external_table('LogDataParquet') | count 

1 record - count 0

I have used a similar scenario with Stream Analytics, where I receive incoming streams and save them in Parquet format to ADLS. The external table in ADX fetches records well in that case. I feel I am making a mistake in the way the Parquet files are written to the blob - (with open("./logs.parquet", "rb") as data: )

Can you please provide more details on the error you see with the Kusto query? – Avnera
I do not see any error with Kusto, but it does not fetch any records, just empty rows. – ashwini prakash
If you see empty rows (and the count of empty rows matches the count of input rows in the Parquet file), this may suggest that your ingestion mapping is incorrect. The question does not specify how the data lands in ADX after being uploaded into a blob. Are you using Event Grid or the ADX Python library for ingesting the data? Please clarify. – Alexander Sloutsky
Empty rows, and the counts are 0 as well. I am using the ADX Python library for ingesting data (from azure.storage.blob import ContainerClient, BlobClient). I have edited the question with more details. – ashwini prakash

1 Answer


According to the logs, the external table was defined as follows:

.create external table LogDataParquet(AppId_s:string,UserId_g:string,Email_s:string,RoleName_s:string,Operation_s:string,EntityId_s:string,EntityType_s:string,EntityName_s:string,TargetTitle_s:string,TimeGenerated:datetime) 
kind=blob
partition by 
   AppId_s,
   bin(TimeGenerated,1d)
dataformat=parquet
( 
   '******'
)
with 
(
   folder = "ExternalTables"   
)

The PARTITION BY clause tells ADX that the expected folder layout is:

<AppId_s>/<TimeGenerated, formatted as 'yyyy/MM/dd'>

For example:

https://streamoutalds2.blob.core.windows.net/stream-api-raw-testing;secret/SuperApp/2020/01/31
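A minimal sketch of how such a partition path could be built on the writing side (assuming, as the layout above suggests, that the folder names come straight from the row values; `partition_blob_path` is a hypothetical helper):

```python
from datetime import datetime

def partition_blob_path(app_id: str, time_generated: datetime, file_name: str) -> str:
    """Build the <AppId_s>/yyyy/MM/dd/<file> layout the PARTITION BY clause expects."""
    return f"{app_id}/{time_generated:%Y/%m/%d}/{file_name}"

print(partition_blob_path("SuperApp", datetime(2020, 1, 31), "logs.parquet"))
# SuperApp/2020/01/31/logs.parquet
```

Passing the resulting path as `blob_name` when creating the `BlobClient` would place each file where the external table's partitioning expects to find it.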

You can find more info on how ADX locates files on external storage during a query, in this section: https://docs.microsoft.com/en-us/azure/data-explorer/kusto/management/external-tables-azurestorage-azuredatalake#artifact-filtering-logic

To fix the external table definition in accordance with the folder layout, use the .alter command:

.alter external table LogDataParquet(AppId_s:string,UserId_g:string,Email_s:string,RoleName_s:string,Operation_s:string,EntityId_s:string,EntityType_s:string,EntityName_s:string,TargetTitle_s:string,TimeGenerated:datetime) 
kind=blob
dataformat=parquet
( 
  h@'https://streamoutalds2.blob.core.windows.net/stream-api-raw-testing;secret'
)
with 
(
   folder = "ExternalTables"   
)

By the way, if a mapping is naive (i.e. the mapped column names match the data source column names), it is not needed for the Parquet format.