I've looked at a couple of seemingly related posts, but neither offers a solution (or at least not one that helps my situation):
CPU usage goes upto 75% while stream a 300 MB file using WCF service
High CPU load using WCF streaming
So I'm hoping someone out there can help.
I have put together a WCF service using .NET 4.5 to facilitate uploading and downloading large files (hundreds of MB up to several GB).
I am using the "Streamed" TransferMode on a BasicHttpBinding with no security.
Everything works fine; however, I have noticed a huge disparity in the service's CPU utilization when it is hosted in IIS versus self-hosted in a console application.
In the console application, utilization stays below 20%; under IIS it climbs above 80%. That's with a single download in progress, not concurrent requests.
The service is configured the same way in both scenarios and both are running on the same box.
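For reference, the console host is nothing special; it's essentially the stock ServiceHost pattern shown below (a sketch; FileTransferService stands in for my actual service class), reading the same config file:

using System;
using System.ServiceModel;

class Program
{
    static void Main()
    {
        // ServiceHost picks up the endpoint and binding from the
        // <system.serviceModel> section of app.config, so the console host
        // and IIS run with identical service configuration.
        using (var host = new ServiceHost(typeof(FileTransferService)))
        {
            host.Open();
            Console.WriteLine("Service is running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}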
My binding is configured like so:
<binding name="UnsecuredStreamBinding"
         receiveTimeout="00:30:00"
         sendTimeout="00:30:00"
         transferMode="Streamed"
         maxReceivedMessageSize="53687091200"
         maxBufferSize="65536" /> <!-- maxReceivedMessageSize = 50 GB -->
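For anyone who prefers it in code, the programmatic equivalent of that binding would be something like this (a sketch, not verbatim from my project):

using System;
using System.ServiceModel;

var binding = new BasicHttpBinding(BasicHttpSecurityMode.None) // no security
{
    TransferMode = TransferMode.Streamed,
    ReceiveTimeout = TimeSpan.FromMinutes(30),
    SendTimeout = TimeSpan.FromMinutes(30),
    MaxReceivedMessageSize = 53687091200, // 50 GB cap on incoming messages
    MaxBufferSize = 65536                 // 64 KB buffer for streamed reads
};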
The service code is also very simple:
...
// Open the file; WCF streams it to the client as the response body.
var fileStream = File.OpenRead(filename);
var size = fileStream.Length;

var response = new DownloadResponse
{
    FileStream = fileStream,
    Size = size,
};
return response;
Where DownloadResponse is defined as a MessageContract.
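For completeness, DownloadResponse looks roughly like this (trimmed to the relevant members; WCF's streaming rules require the Stream to be the only MessageBodyMember, so Size travels as a header):

using System.IO;
using System.ServiceModel;

[MessageContract]
public class DownloadResponse
{
    // Scalar data has to go in headers when the body is streamed.
    [MessageHeader]
    public long Size { get; set; }

    // The sole body member: WCF streams this directly to the client.
    [MessageBodyMember]
    public Stream FileStream { get; set; }
}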
I've done very little to alter the default IIS environment (IIS 8.5 on Windows Server 2012 R2). Logging and diagnostics are turned off, so they aren't adding any overhead.
This is my first foray into Web Service development, so I know there is a lot I don't know and I'm hoping that I'm missing something fairly simple that would account for the huge difference in CPU utilization that I'm seeing.