2
votes

I am trying to set up a simple Data Factory pipeline with the intention of copying Azure Table storage to Cosmos DB. Azure Table storage has a system-managed field, Timestamp. When the flow runs and Cosmos DB is populated with the data, Timestamp is always "Timestamp": "1970-01-01T00:00:00Z". It seems that it is not getting into Cosmos DB correctly.

How to reproduce: create an Azure table and add a few entries. Create a Cosmos DB instance and a new collection. Create a Data Factory flow. Note how Timestamp looks when exported.

I tried changing the Timestamp data type from DateTime to DateTimeOffset, as well as exporting it as a string. I also tried specifying the date format. The result is the same.

I suspect that Timestamp is a reserved word in Cosmos DB and that somehow the correct value fails to be inserted.


1 Answer

2
votes

I reproduced your issue on my side. I tried adding a property time and setting its value to the same as Timestamp. It was imported into Cosmos DB correctly.


So, I think it's related to the field name, not the data type or format. However, when I specified the target column field name as timestamp instead of Timestamp, it didn't work either.

Based on this doc, _ts is auto-generated by Cosmos DB as a number representing the seconds elapsed since January 1, 1970. It records the last-updated timestamp of the resource, so it has the same meaning as Timestamp in Azure Table storage. It can be converted with the UnixDateTimeConverter class. So you can trace the modification date via the _ts field.
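As a quick illustration (not part of the original answer), here is a minimal Python sketch of converting a `_ts` value back into a readable UTC datetime; the `ts_to_utc` helper name is my own:

```python
from datetime import datetime, timezone

def ts_to_utc(ts: int) -> datetime:
    """Convert a Cosmos DB _ts value (seconds since 1970-01-01 UTC)
    into a timezone-aware datetime."""
    return datetime.fromtimestamp(ts, tz=timezone.utc)

# 1577836800 seconds after the epoch is 2020-01-01T00:00:00+00:00
print(ts_to_utc(1577836800).isoformat())
```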

If you do want to keep Timestamp, you could add a property with the same value as Timestamp and then import it into Cosmos DB.

Hope it helps you.