I would like to insert small documents (fewer than 10 JSON fields) into Azure Cosmos DB from an Azure Function using the Cosmos .NET SDK v3, but each insert is taking 1-2 seconds.
The insertion code is:
await container.CreateItemAsync(
    objectToAdd,
    new PartitionKey(objectToAdd.PartitionKey),
    new ItemRequestOptions { EnableContentResponseOnWrite = false });
Looking at the diagnostics, the majority of the time is spent in the ItemStream step (see the log at the end of this post).
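For reference, the diagnostics come from logging the diagnostics string on the response, roughly as follows (a minimal sketch rather than my exact code; it assumes the same container and objectToAdd as above and an ILogger named log injected into the function):

// Sketch: capture the response from the CreateItemAsync call and log its diagnostics.
var response = await container.CreateItemAsync(
    objectToAdd,
    new PartitionKey(objectToAdd.PartitionKey),
    new ItemRequestOptions { EnableContentResponseOnWrite = false });

// CosmosDiagnostics is still populated when EnableContentResponseOnWrite is false.
log.LogInformation(response.Diagnostics.ToString());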
The Azure Function and Cosmos DB are both deployed to the same region (North Europe). Cosmos DB allows access from the Azure Function via a service-endpoint-enabled VNet.
I have read through https://blog.tdwright.co.uk/2019/06/29/aggressively-tuning-cosmos-db-the-long-way-round/ and https://docs.microsoft.com/en-us/azure/cosmos-db/performance-tips-dotnet-sdk-v3-sql and have tried the following:
- Using both Direct and Gateway connection modes (no significant difference was found).
- Disabling the content response on write operations (this reduces the write time to around 250-500 ms).
- Ensuring that there is a singleton CosmosClient (a rough sketch of the client setup is included after these lists).
I haven't tried:
- increasing the RUs on the database, since I'm only testing a small number of 7.6 RU inserts.
- using the bulk library, as documents arrive individually.
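For completeness, the client and container are created once and reused, roughly along these lines (a minimal sketch rather than my exact code; the CosmosClientHolder class name is just illustrative, and the endpoint, key, and ids are placeholders):

using Microsoft.Azure.Cosmos;

// Sketch of the singleton CosmosClient/Container setup; account details are placeholders.
public static class CosmosClientHolder
{
    private static readonly CosmosClient Client = new CosmosClient(
        "https://<account>.documents.azure.com:443/", // placeholder endpoint
        "<auth-key>",                                 // placeholder key
        new CosmosClientOptions
        {
            ConnectionMode = ConnectionMode.Direct // Gateway mode was also tried
        });

    public static readonly Container Container =
        Client.GetContainer("<database-id>", "<container-id>");
}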
This seems like a long response time for something which should be entirely within a single data center.
My questions:
- Does the above information indicate a network issue?
- If so, does Cosmos provide a way to trace a connection within Azure a bit like TRACERT does?
- Does Cosmos log the IP address that it thinks the connection is originating from?
- Or is further optimisation possible at the code level?
{
  "Summary": {
    "StartUtc": "2020-10-22T21:21:03.3520979Z",
    "ElapsedTime": "00:00:01.0030109",
    "UserAgent": "cosmos-netstandard-sdk/3.6.0|3.4.2|38128|X86|Microsoft Windows 10.0.14393 |.NET Core 4.6.29215.02|"
  },
  "Context": [
    {
      "Id": "ItemStream",
      "ElapsedTime": "00:00:01.0030109"
    },
    {
      "Id": "ItemSerialize",
      "ElapsedTime": "00:00:00.0000212"
    },
    {
      "Id": "ExtractPkValue",
      "ElapsedTime": "00:00:00.0000402"
    },
    {
      "Id": "BatchAsyncContainerExecutor.Limiter",
      "ElapsedTime": "00:00:00.0000056"
    },
    {
      "Id": "RequestInvokerHandler",
      "ElapsedTime": "00:00:00.0071549"
    },
    {
      "Id": "Microsoft.Azure.Cosmos.Handlers.RetryHandler",
      "ElapsedTime": "00:00:00.0071272"
    },
    {
      "Id": "Microsoft.Azure.Cosmos.Handlers.RouterHandler",
      "ElapsedTime": "00:00:00.0070979"
    },
    {
      "Id": "TransportHandler",
      "ElapsedTime": "00:00:00.0070954"
    },
    {
      "Id": "PointOperationStatistics",
      "ActivityId": "...",
      "StatusCode": 200,
      "SubStatusCode": 0,
      "RequestCharge": 7.62,
      "RequestUri": "...",
      "RequestSessionToken": null,
      "ResponseSessionToken": "..",
      "ClientRequestStats": {
        "RequestStartTimeUtc": "2020-10-22T21:21:04.3479626Z",
        "RequestEndTimeUtc": "2020-10-22T21:21:04.3548940Z",
        "RequestLatency": "00:00:00.0069314",
        "IsCpuOverloaded": false,
        "NumberRegionsAttempted": 1,
        "ResponseStatisticsList": [
          {
            "ResponseTime": "2020-10-22T21:21:04.354894Z",
            "ResourceType": 2,
            "OperationType": 40,
            "StoreResult": "StorePhysicalAddress: rntbd://cdb-ms-prod-northeurope1-fd8.documents.azure.com:14173/apps/.../, LSN: 304, GlobalCommittedLsn: 303, PartitionKeyRangeId: 0, IsValid: True, StatusCode: 200, SubStatusCode: 0, RequestCharge: 7.62, ItemLSN: -1, SessionToken: ..., UsingLocalLSN: False, TransportException: null"
          }
        ],
        "AddressResolutionStatistics": [],
        "SupplementalResponseStatistics": [],
        "FailedReplicas": [],
        "RegionsContacted": [
          "<redacted>"
        ],
        "ContactedReplicas": [
          "rntbd://cdb-ms-prod-northeurope1-fd8.documents.azure.com:14173/apps/<redacted>/",
          "rntbd://cdb-ms-prod-northeurope1-fd8.documents.azure.com:14387/apps/<redacted>/",
          "rntbd://cdb-ms-prod-northeurope1-fd8.documents.azure.com:14215/apps/<redacted>/",
          "rntbd://cdb-ms-prod-northeurope1-fd8.documents.azure.com:14064/apps/<redacted>/"
        ]
      }
    },
    {
      "Id": "BatchAsyncContainerExecutor.ToResponse",
      "ElapsedTime": "00:00:00.0000295"
    }
  ]
}