
I know there's a ton of related questions, but none of them seem to quite solve my issue. My ASP.NET Core Web App runs much slower on my Azure Basic (B1) Web App than on my localhost (I previously had the Standard tier, with no noticeable improvement). In my app, there is one critical section, basically a Monte Carlo simulation, which is what I'm timing. This section is all number crunching, with no database calls or anything like that. One core of the CPU gets maxed out (no parallelism), and RAM usage is manageable. I've run this several times with very consistent results, and my average times are as follows:

  • Desktop LocalHost (debug): 2.318 sec
  • Desktop LocalHost (release): 1.278 sec
  • Laptop LocalHost (debug): 5.579 sec
  • Laptop LocalHost (release): 2.490 sec
  • Azure WebApp (release): 6.663 sec (less consistent; never faster, but often slower)
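For context, the timing follows roughly this pattern (a minimal sketch; the `EstimatePi` loop here is just a hypothetical stand-in for my actual simulation, and I use `Stopwatch` around the critical section):

```csharp
using System;
using System.Diagnostics;

class Benchmark
{
    static void Main()
    {
        // Warm up the JIT so the first timed run isn't penalized.
        EstimatePi(1_000);

        var sw = Stopwatch.StartNew();
        double pi = EstimatePi(50_000_000);   // single-threaded, CPU-bound
        sw.Stop();

        Console.WriteLine($"pi ~ {pi:F5}, elapsed: {sw.Elapsed.TotalSeconds:F3} sec");
    }

    // Stand-in Monte Carlo workload: estimate pi by sampling the unit square.
    static double EstimatePi(int samples)
    {
        var rng = new Random(12345);   // fixed seed for repeatable runs
        int inside = 0;
        for (int i = 0; i < samples; i++)
        {
            double x = rng.NextDouble();
            double y = rng.NextDouble();
            if (x * x + y * y <= 1.0) inside++;
        }
        return 4.0 * inside / samples;
    }
}
```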

I've attached some screenshots below of my setup, Azure settings, and my Azure live data.

My desktop processor is an i7-7700K @ 4.20 GHz, which I know was pretty high end only a few years ago. However, my laptop processor is an i7-3537U @ 2.00 GHz, and it has to be about 7 years old now. I haven't been able to find any recent specs on the Azure processors. Am I missing anything here, or are the Azure processors just that slow?

Visual Studio Settings

Azure Settings

Azure Analytics


1 Answer


Both B1 and S1 have 100 total Azure Compute Units (ACU), so moving from Basic to Standard alone won't change raw CPU performance. Try running your test on higher-capacity SKUs such as B2/S2 (200 ACU), B3/S3 (400 ACU), P1V2 (200 ACU), and P2V2 (420 ACU), and compare your findings with what you see on a B1/S1.
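For a quick comparison run, you can scale the existing App Service plan up and back down from the Azure CLI (a sketch; the plan and resource group names are placeholders for your own):

```shell
# Scale the plan up to B2 (200 ACU) for a benchmark run.
# MyPlan / MyResourceGroup are placeholders for your own names.
az appservice plan update --name MyPlan --resource-group MyResourceGroup --sku B2

# ...run your timing test against the app, then scale back down to control cost.
az appservice plan update --name MyPlan --resource-group MyResourceGroup --sku B1
```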