
Azure Data Factory Copy Activity

Source: CSV file. Sink: Cosmos DB. Operation: upsert.

The copy activity fails with error code 2200, apparently due to some issue with the id field. It was working fine until a few weeks ago.

My CSV file has a number column that I am using as the id for the Cosmos documents, so I can update existing ones.

Error details

{
    "errorCode": "2200",
    "message": "ErrorCode=UserErrorDocumentDBWriteError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Documents failed to import due to invalid documents which violate some of Cosmos DB constraints: 1) Document size shouldn't exceeds 2MB; 2) Document's 'id' property must be string if any, and must not include the following charaters: '/', '\\', '?', '#'; 3) Document's 'ttl' property must not be non-digital type if any.,Source=Microsoft.DataTransfer.DocumentDbManagement,'",
    "failureType": "UserError",
    "target": "Copy_ToCosmosDB",
    "details": []
}
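
To illustrate constraint 2, which my number id column seems to hit, here is a minimal sketch using the azure-cosmos Python SDK; the account URL, key, and database/container names are placeholders, not values from my pipeline:

# Minimal sketch of constraint 2: Cosmos DB requires 'id' to be a string.
# Account URL, key, and database/container names below are placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("<database>").get_container_client("<container>")

# A numeric id, as read from a CSV number column, is rejected by the service:
# container.upsert_item({"id": 2, "no": 2})

# Casting the id to a string satisfies the constraint.
container.upsert_item({"id": "2", "no": 2})

Presumably the same conversion has to happen in the copy activity, i.e. the id column must be mapped as a string rather than a number.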

1 Answer


When you upsert items into Cosmos DB, don't change the partition key value, because the partition key of a document in Cosmos DB can't be changed. For more detail, refer to this documentation.

For example, my container's partition key is /name, and it holds an item like this:

{
    "id": "2",
    "no": 2,
    "name": "Monica",
    "createTime": "2020-06-22T00:00:00.000Z"
}

My CSV file is something like this (it updates the name Monica to Monican):

id,no,name,createTime
2,2,Monican,2020-06-22T00:00:00.000Z

When I run the pipeline, I get the same error as you.
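
If it helps, here is a minimal sketch of the same situation with the azure-cosmos Python SDK; the account URL, key, and database/container names are placeholders, and the container's partition key is assumed to be /name as above:

# Minimal sketch with the azure-cosmos Python SDK. Account URL, key, and
# database/container names are placeholders; the container's partition key
# is assumed to be /name, matching the example above.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("<database>").get_container_client("<container>")

# The existing document lives under partition key value "Monica".
container.upsert_item({"id": "2", "no": 2, "name": "Monica",
                       "createTime": "2020-06-22T00:00:00.000Z"})

# Upserting with a changed /name value does not update that document:
# (partition key value, id) identifies an item, so this addresses a
# different logical item. A document's partition key itself can never
# be changed in place.
container.upsert_item({"id": "2", "no": 2, "name": "Monican",
                       "createTime": "2020-06-22T00:00:00.000Z"})

# The only way to "move" a document to a new partition key value is to
# delete the old one and recreate it under the new value.
container.delete_item(item="2", partition_key="Monica")

The practical takeaway for the pipeline: keep the partition key column in the CSV identical to what is already stored, and change only the other columns.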