
We have built a web API with .NET Core 3.1, Dapper and Azure SQL, hosted on an Azure web app. Using loader.io we ran some performance tests, with disappointing results (~10 RPS on an S1 instance). Each API call triggers a couple of SQL queries via Dapper to retrieve data from an Azure SQL database (100 DTU).

The Application Insights profiler has collected traces, which show that a lot of time is spent in the database: [screenshot: AI performance tip]

Following the hot path: [screenshot: hot path]

The database itself doesn't appear stressed (DTU-wise or on other metrics): [screenshot: database metrics]

The web app shows some CPU peaks, and the thread count looks OK: [screenshot: web app metrics]

The repositories are registered using standard .NET Core IServiceCollection DI and Scrutor:

// Injection of all SqlServer repositories.
services.Scan(scan => scan
  .FromAssemblyOf<SomeRepository>()
  .AddClasses(classes => classes.Where(type => type.Name.EndsWith("Repository")))
  .AsImplementedInterfaces()
  .WithTransientLifetime());

Dapper is used in the SQL Server repositories to retrieve the information:

public async Task<IReadOnlyCollection<DocumentRevisionOverview>> GetDocumentRevisionsAsync()
{
  Logger.LogInformation("Starting {Method} on {Class}.", nameof(GetDocumentRevisionsAsync), ClassName);

  var sql = "select statement";

  // databaseParameters.ConnectionString (type is string) is injected via the constructor.
  using (var connection = new SqlConnection(databaseParameters.ConnectionString))
  {
    var documentRevisions = await connection.QueryAsync<DocumentRevisionOverview>(sql);

    Logger.LogInformation("Finished {Method} on {Class}.", nameof(GetDocumentRevisionsAsync), ClassName);

    return documentRevisions.ToList();
  }
}
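
To separate connection-acquisition time from query time under load, here is a minimal diagnostic sketch; the explicit OpenAsync call, the stopwatch and the extra log message are illustrative additions, not part of our actual code:

// Diagnostic variant: time the connection open (pool acquisition) separately
// from the query itself. A large open time under load points at pool
// exhaustion rather than at SQL.
public async Task<IReadOnlyCollection<DocumentRevisionOverview>> GetDocumentRevisionsTimedAsync()
{
  var sql = "select statement";
  var stopwatch = System.Diagnostics.Stopwatch.StartNew();

  using (var connection = new SqlConnection(databaseParameters.ConnectionString))
  {
    // With a healthy pool this is nearly free; Dapper would otherwise open
    // the connection implicitly inside QueryAsync.
    await connection.OpenAsync();
    var openMs = stopwatch.ElapsedMilliseconds;

    var documentRevisions = await connection.QueryAsync<DocumentRevisionOverview>(sql);

    Logger.LogInformation("Open took {OpenMs} ms, open + query took {TotalMs} ms.",
      openMs, stopwatch.ElapsedMilliseconds);

    return documentRevisions.ToList();
  }
}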

We don't have varchars in our database tables, so the implicit varchar-to-nvarchar conversion issue doesn't apply.

  • All resources are located in the same resource group and location.
  • The tables contain only a couple of hundred records.
  • The JSON that is returned from the web API is 88 KB.
  • Dapper relies on ADO.NET connection pooling, which is keyed on the connection string, and ours doesn't vary (a pool-sizing sketch follows this list).
  • The connection string format is "Server={server};Initial Catalog={database};Persist Security Info=False;User ID={user};Password={password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
  • Individual calls (from Postman, for example) are quick, ~150 ms: [screenshot: Postman timing]
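
If pool exhaustion were the cause, the pool could be sized explicitly via the standard ADO.NET connection string keywords Min Pool Size and Max Pool Size (Max Pool Size defaults to 100); the values below are only illustrative:

// Illustrative only: explicit pool sizing appended to the existing connection string.
var connectionString =
  "Server={server};Initial Catalog={database};Persist Security Info=False;" +
  "User ID={user};Password={password};MultipleActiveResultSets=False;" +
  "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" +
  "Min Pool Size=10;Max Pool Size=200;";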

What can I do to further investigate the cause of the waiting time?

PS: I was surprised to see a 5x improvement when deploying to a Linux web app (on a P1V2, twice the size of an S1).

Without giving out sensitive information, what does your connection string look like? – JohanP
@JohanP: It was a custom one, but I changed it to the default Azure one (see post). No change in performance afterwards. – JeroenW
I would first of all check that the query is performing as expected. If you run it manually via SSMS or Data Studio, is performance good? (I'd guess so, since the table only has a handful of records, but just in case.) When under the (supposed) stress, try something like sp_whoisactive to see what's going on (a sketch follows these comments). – mauridb
There are no recommendations for the query, neither from reviewing the execution plan nor from Azure SQL. I have connected Azure SQL Analytics, hoping to gain some insights from that. – JeroenW
I've just tried running something very similar myself, and on a B1 web app I can reach 300 requests per second (using Locust.io). Give it a try here: github.com/Azure-Samples/azure-sql-db-dotnet-rest-api – mauridb
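
Following mauridb's suggestion to look at what the server is doing under load, here is a minimal sketch that polls sys.dm_exec_requests (a standard DMV that is also available in Azure SQL) via Dapper while loader.io is running; the ActiveRequest class is a hypothetical shape for this example:

// Hypothetical diagnostic: list the requests currently executing, with their
// wait types, while the load test is applying pressure.
public class ActiveRequest
{
  public short SessionId { get; set; }
  public string Status { get; set; }
  public string Command { get; set; }
  public string WaitType { get; set; }
  public int WaitTimeMs { get; set; }
  public int TotalElapsedTimeMs { get; set; }
}

public async Task<IReadOnlyCollection<ActiveRequest>> GetActiveRequestsAsync()
{
  var sql = @"select session_id as SessionId,
                     status as Status,
                     command as Command,
                     wait_type as WaitType,
                     wait_time as WaitTimeMs,
                     total_elapsed_time as TotalElapsedTimeMs
              from sys.dm_exec_requests
              where session_id <> @@SPID;";

  using (var connection = new SqlConnection(databaseParameters.ConnectionString))
  {
    return (await connection.QueryAsync<ActiveRequest>(sql)).ToList();
  }
}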

1 Answer


Without knowing your situation it's hard to say for sure, but I have seen Azure SQL perform considerably slower than SQL Server because the Azure SQL instance is not close enough to the executing server, even within the same data center.

At my last position we moved to Azure SQL; then, because our application was calculation-heavy, we moved back to SQL Server on the virtual machine itself and used SQL Data Sync to copy the data to Azure SQL for backup.

This won't solve your problem directly, but from what I have found, Azure SQL is best suited for back-end jobs.
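
To check whether round-trip latency to Azure SQL dominates in your setup, a minimal probe along these lines can help; the method name and iteration count are illustrative:

// Hypothetical latency probe: round-trips a trivial query over an already
// opened, pooled connection, so the result approximates network + protocol
// overhead rather than query cost.
public async Task<double> MeasureAverageRoundTripMsAsync(string connectionString, int iterations = 100)
{
  using (var connection = new SqlConnection(connectionString))
  {
    await connection.OpenAsync();
    await connection.ExecuteScalarAsync<int>("select 1;"); // warm-up

    var stopwatch = System.Diagnostics.Stopwatch.StartNew();
    for (var i = 0; i < iterations; i++)
    {
      await connection.ExecuteScalarAsync<int>("select 1;");
    }

    return stopwatch.Elapsed.TotalMilliseconds / iterations;
  }
}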