
In the code below, I fetch the schema for the table to be created from the Confluent Schema Registry:

// Fetch the latest Avro schema for the topic from the Schema Registry
private val avroSchema = schemaRegistryClient.getLatestSchemaMetadata("topicName").getSchema

// Convert the Avro schema to a Spark SQL StructType
private val sparkSchema = SchemaConverters.toSqlType(new Schema.Parser().parse(avroSchema))
  .dataType.asInstanceOf[StructType]

Now I'm trying to define a Delta Lake table whose structure is based on this schema, but I'm not sure how to do it. Any help is appreciated.


1 Answer


In Scala you can use the following:

To define a schema:

val customSchema = StructType(Array(
  StructField("col1", StringType, true),
  StructField("col2", StringType, true),
  StructField("col3", StringType, true)
))

To read data into a DataFrame using that schema:

val df = spark.read.format("csv")
  .option("delimiter", "\t") // use the delimiter that matches your file
  .schema(customSchema)
  .load("path")

When writing the table out to a particular location, specify `.format("delta")` to store it as a Delta table.
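For example, a minimal sketch of the write step, assuming a running `SparkSession` named `spark`, the Delta Lake dependency on the classpath, and placeholder output paths:

```scala
import org.apache.spark.sql.Row

// sparkSchema is the StructType derived from the registry schema in the question.
// To define an (initially empty) Delta table with that structure, create an
// empty DataFrame and write it out in Delta format:
val emptyDf = spark.createDataFrame(spark.sparkContext.emptyRDD[Row], sparkSchema)

emptyDf.write
  .format("delta")
  .mode("overwrite")
  .save("/tmp/delta/topicName") // placeholder path

// Any populated DataFrame (e.g. the CSV read above) can be written the same way:
df.write.format("delta").mode("append").save("/tmp/delta/topicName")
```

Once the data is saved in Delta format at that path, it can be read back with `spark.read.format("delta").load(...)` or registered as a table.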