(For my answer I will assume that you're building a framework-dependent application and not a self-contained application.)
You cannot set 2.1.3 directly as the <TargetFramework> in the .csproj, because the target framework moniker only specifies the major and minor version, never the patch version. Only the entries listed here are allowed: https://docs.microsoft.com/en-us/dotnet/standard/frameworks. So in your case the .csproj needs to contain <TargetFramework>netcoreapp2.1</TargetFramework>.
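A minimal .csproj for such a framework-dependent app could look like this (the OutputType and SDK attribute are shown for context and assume a console application):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- Only major.minor is valid here; a patch version like 2.1.3 is rejected. -->
    <TargetFramework>netcoreapp2.1</TargetFramework>
  </PropertyGroup>
</Project>
```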
But as Damir pointed out in the comments, you can force the version if you also add <RuntimeFrameworkVersion>2.1.3</RuntimeFrameworkVersion> to the <PropertyGroup>.
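With that override added, the <PropertyGroup> would look like this:

```xml
<PropertyGroup>
  <TargetFramework>netcoreapp2.1</TargetFramework>
  <!-- Pins the implicit Microsoft.NETCore.App framework reference
       to exactly 2.1.3 instead of the default 2.1.0. -->
  <RuntimeFrameworkVersion>2.1.3</RuntimeFrameworkVersion>
</PropertyGroup>
```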
You can check the effect of this change by opening the "Manage NuGet Packages" dialog and looking at the version of the Microsoft.NETCore.App metapackage that is implicitly being used. Without the <RuntimeFrameworkVersion> it's 2.1.0; with it, it should be 2.1.3.
Aside from that, other factors can also implicitly determine the version that's being used. For example, if you want to build a self-contained application you have to specify a <RuntimeIdentifier>, which will also force the framework version to 2.1.3, because publishing a self-contained application automatically picks up the highest patch runtime installed on your machine.
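For illustration, a self-contained setup could look like the following (win-x64 is just a placeholder RID here; substitute the runtime identifier that matches your target platform):

```xml
<PropertyGroup>
  <TargetFramework>netcoreapp2.1</TargetFramework>
  <!-- With a RID set, "dotnet publish" (on the 2.x SDK) produces a
       self-contained build that bundles the highest installed 2.1.x
       patch runtime, regardless of RuntimeFrameworkVersion. -->
  <RuntimeIdentifier>win-x64</RuntimeIdentifier>
</PropertyGroup>
```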
That said, I wouldn't recommend setting a <RuntimeFrameworkVersion> manually in most cases.
Starting with .NET Core 2.1 all framework-dependent applications (that are built using .NET Core 2.0 or later) will automatically roll forward to the latest minor version that is installed if the original version of the runtime isn't present on the system.
So if you only have .NET Core 2.1.3 installed on the target system, your framework-dependent .NET Core 2.1.0 application will automatically use that runtime and framework.
You can read a bit more about the .NET Core version selection mechanism here.