
Based on the following code (taken from https://medium.com/@jackwild/getting-cpu-usage-in-net-core-7ef825831b8b), I determine the CPU usage of a console app written in C# (.NET 5.0):

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class Program
{
    static async Task Main(string[] args)
    {
        Console.WriteLine("Hello World!");

        Process p = Process.GetCurrentProcess();

        while (true)
        {
            await Task.Delay(2000);
            var cpuUsage = await GetCpuUsageForProcess(p);

            Console.WriteLine("CPU: " + cpuUsage + "%");
        }
    }

    // Samples the process CPU time over a one-second window and returns the
    // average usage as a percentage of all logical cores.
    static async Task<double> GetCpuUsageForProcess(Process proc)
    {
        var startTime = DateTime.UtcNow;
        var startCpuUsage = proc.TotalProcessorTime;
        await Task.Delay(1000);

        var endTime = DateTime.UtcNow;
        var endCpuUsage = proc.TotalProcessorTime;
        var cpuUsedMs = (endCpuUsage - startCpuUsage).TotalMilliseconds;
        var totalMsPassed = (endTime - startTime).TotalMilliseconds;
        var cpuUsageTotal = cpuUsedMs / (Environment.ProcessorCount * totalMsPassed);
        return cpuUsageTotal * 100;
    }
}

When running the application on Windows (either through the debugger or after publishing it), it shows the expected average usage of 0% when idling. However, when I run the exact same published code on my Linux VM, the idle usage is somewhere between 0% and about 0.25%.

Example output from Windows:

CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%

Output on the Linux VM (Ubuntu 18.04):

CPU: 0.12440051392340312%
CPU: 0.12427189099068557%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0.24839400853779886%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0.12470035751093696%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0.24555489168426392%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0%
CPU: 0.12460451772120468%
CPU: 0%
CPU: 0.12426320617164646%

This seems unexpected, because the code is the same and idling shouldn't use any CPU time. In this example the occasional 0.1% doesn't look like a big issue, but our main program uses 0.2% to 2% when just idling (while it sits at 0% on Windows), and its logic also seems to run a bit slower on Linux. This makes me wonder whether running .NET Core applications is less efficient on Linux, or whether I am missing something, e.g. whether the CPU usage is simply calculated differently (or more accurately) on Linux when reading the Process.TotalProcessorTime property.
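One thing I considered as a sanity check (a minimal sketch of my own, not from the article above; the class name CpuDeltaProbe is made up) is logging the raw CPU-time deltas instead of percentages. If the non-zero readings on Linux always come out as multiples of roughly the same value, that would hint at a coarser accounting resolution rather than real extra work:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class CpuDeltaProbe
{
    static async Task Main()
    {
        var proc = Process.GetCurrentProcess();
        var lastCpu = proc.TotalProcessorTime;

        for (int i = 0; i < 20; i++)
        {
            await Task.Delay(1000);
            proc.Refresh();                     // discard any cached property values
            var cpu = proc.TotalProcessorTime;

            // Print the raw CPU time consumed since the previous sample.
            Console.WriteLine($"CPU time used in the last second: {(cpu - lastCpu).TotalMilliseconds} ms");
            lastCpu = cpu;
        }
    }
}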

Is there some way to figure out what part of the runtime environment is causing this CPU usage? I've tried perf (following https://docs.microsoft.com/en-us/dotnet/core/diagnostics/debug-highcpu?tabs=linux#trace-generation), but the report doesn't really help me; I couldn't work out from it what is using the CPU.
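As a rough first step before digging further into perf, a small probe like the one below (again just a sketch I put together; the class name CpuSplitProbe is arbitrary) could split the idle CPU time into user-mode and kernel-mode portions via Process.UserProcessorTime and Process.PrivilegedProcessorTime. Mostly user-mode time would point at work inside the process itself, while mostly kernel-mode time would point at system calls made on its behalf:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class CpuSplitProbe
{
    static async Task Main()
    {
        var proc = Process.GetCurrentProcess();
        var lastUser = proc.UserProcessorTime;
        var lastKernel = proc.PrivilegedProcessorTime;

        for (int i = 0; i < 20; i++)
        {
            await Task.Delay(2000);
            proc.Refresh();                     // discard any cached property values

            var user = proc.UserProcessorTime;
            var kernel = proc.PrivilegedProcessorTime;

            // Report how much of the CPU time since the last sample was spent
            // in user mode versus kernel (privileged) mode.
            Console.WriteLine(
                $"user: {(user - lastUser).TotalMilliseconds} ms, " +
                $"kernel: {(kernel - lastKernel).TotalMilliseconds} ms");

            lastUser = user;
            lastKernel = kernel;
        }
    }
}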


1 Answer


I am not sure how to pinpoint what is causing the CPU usage, but according to https://github.com/dotnet/runtime/issues/2130 .NET Core has known performance differences between Linux and IIS-based Windows servers. The two platforms also use different hosting models: IIS with in-process hosting on Windows versus Kestrel with out-of-process hosting, and out-of-process hosting carries a noticeable performance penalty as well.