I am planning to use Azure Data Factory to create backups of Azure Table storage. The entities in my Azure Table can change their schema. Is there a way an Azure Data Factory pipeline could handle this without manual intervention every time the schema changes?
E.g., let the first entry be:
<entry>
<content type="application/xml">
<m:properties>
<d:PartitionKey>P1</d:PartitionKey>
<d:RowKey>R1</d:RowKey>
<d:Timestamp m:type="Edm.DateTime">2017-05-22T20:37:34.8743000Z</d:Timestamp>
<d:IsDefault m:type="Edm.Boolean">False</d:IsDefault>
</m:properties>
</content>
</entry>
while another entry could be:
<entry>
<content type="application/xml">
<m:properties>
<d:PartitionKey>P2</d:PartitionKey>
<d:RowKey>R2</d:RowKey>
<d:Timestamp m:type="Edm.DateTime">2017-05-22T20:37:34.8743000Z</d:Timestamp>
<d:IsDefault m:type="Edm.Boolean">False</d:IsDefault>
<d:IsTest m:type="Edm.Boolean">False</d:IsTest> <!-- new property -->
</m:properties>
</content>
</entry>
I don't want to change my dataset every time an entity changes.
According to the docs (https://docs.microsoft.com/en-us/azure/data-factory/data-factory-faq):

"If the structure and jsonPathDefinition are not defined in the Data Factory dataset, the Copy Activity detects the schema from the first object and flattens the whole object."
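For reference, this is roughly what my dataset looks like with `structure` left out, so the Copy Activity falls back to detecting the schema from the first entity (names like `MyTable` and `AzureTableLinkedService` are placeholders):

```json
{
  "name": "AzureTableInput",
  "properties": {
    "type": "AzureTable",
    "linkedServiceName": "AzureTableLinkedService",
    "typeProperties": {
      "tableName": "MyTable"
    },
    "availability": {
      "frequency": "Day",
      "interval": 1
    }
  }
}
```

With this dataset, an entity whose extra property (e.g. `IsTest`) does not appear on the first entity is silently dropped from the copy.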
Is there a workaround to this problem?
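The workaround I am considering, in case it helps frame the question: serialize each entity with its own set of properties (e.g. to newline-delimited JSON) instead of forcing every row through one fixed column list. A minimal sketch of that idea, assuming entities arrive as plain dicts (the `azure-data-tables` usage in the comment is an assumption, not tested here):

```python
import json

def entities_to_ndjson(entities):
    """Serialize Table entities (dict-like rows) to newline-delimited JSON.

    Each entity keeps exactly the properties it has, so a row with an
    extra column (e.g. IsTest) is preserved even when the first row
    lacks it -- no fixed dataset schema required.
    """
    # default=str handles non-JSON types such as datetime Timestamps.
    return "\n".join(json.dumps(e, sort_keys=True, default=str) for e in entities)

# Hypothetical usage with the azure-data-tables SDK (names assumed):
#   from azure.data.tables import TableServiceClient
#   svc = TableServiceClient.from_connection_string(conn_str)
#   rows = svc.get_table_client("MyTable").list_entities()
#   backup = entities_to_ndjson(rows)

rows = [
    {"PartitionKey": "P1", "RowKey": "R1", "IsDefault": False},
    {"PartitionKey": "P2", "RowKey": "R2", "IsDefault": False, "IsTest": False},
]
print(entities_to_ndjson(rows))
```

This keeps the backup schema-agnostic, at the cost of moving the copy logic outside the Data Factory dataset definition.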