I am trying to calculate the maximum portfolio standard deviation using Excel Solver (GRG Nonlinear)

w is a 20-dimensional vector of asset weights and C is the symmetric 20x20 variance-covariance matrix, so this is an optimization problem that maximizes the portfolio variance w^T C w.
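For concreteness, with the weights required to be non-negative and to sum to 1 (which is what my example solutions below assume), the problem is

$$\max_{w}\; w^\top C\, w \qquad \text{subject to} \qquad \sum_{i=1}^{20} w_i = 1, \quad w_i \ge 0,$$

and the portfolio standard deviation is the square root of this objective.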
However, when I run Excel Solver with GRG Nonlinear, it is not giving me the answer I want.
For example, let's say the standard deviation for each asset is
5.11% 7.18% 3.83% 5.24% 3.26% 8.10% 1.62% 4.59% 4.95% 4.15% 2.62% 2.10% 4.58% 4.14% 2.01% 2.97% 1.80% 1.78% 3.07% 3.24%
The solution to this optimization should be 100% in the 6th asset and 0% in all the other assets, since the 6th asset has the highest volatility (8.10%), so the portfolio that puts all the capital in the 6th asset maximizes the portfolio volatility.
However, Excel Solver gives me the solution 100% invested in the 1st asset and 0% in all the other assets, which gives a portfolio volatility of only 5.11%.
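To check whether this is a starting-point issue rather than something wrong in my sheet, here is a minimal Python sketch of the same problem. It uses scipy's SLSQP as a stand-in for GRG Nonlinear, and it assumes zero correlations (a diagonal C built from the standard deviations listed above), since only those are given here; the starting points are just for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Standard deviations from above (assets 1..20), as decimals.
sigma = np.array([5.11, 7.18, 3.83, 5.24, 3.26, 8.10, 1.62, 4.59, 4.95, 4.15,
                  2.62, 2.10, 4.58, 4.14, 2.01, 2.97, 1.80, 1.78, 3.07, 3.24]) / 100

# Illustration only: zero correlations, so C is diagonal.
C = np.diag(sigma ** 2)
n = len(sigma)

def neg_variance(w):
    # Negative portfolio variance, because minimize() minimizes.
    return -w @ C @ w

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]  # fully invested
bounds = [(0.0, 1.0)] * n                                       # no short sales

# Local solver started 100% in asset 1 (similar to my sheet's starting values).
w0 = np.zeros(n)
w0[0] = 1.0
res_asset1 = minimize(neg_variance, w0, method="SLSQP",
                      bounds=bounds, constraints=constraints)

# Same solver started from equal weights.
res_equal = minimize(neg_variance, np.full(n, 1.0 / n), method="SLSQP",
                     bounds=bounds, constraints=constraints)

print("started at asset 1  -> portfolio vol:", np.sqrt(-res_asset1.fun))
print("started at 1/n each -> portfolio vol:", np.sqrt(-res_equal.fun))

# Direct check of the single-asset portfolios: with these constraints the
# variance of a 100%-in-asset-i portfolio is just C[i, i].
best = int(np.argmax(np.diag(C)))
print("best single asset:", best + 1, "with vol:", sigma[best])
```

The last three lines check the single-asset portfolios directly: with the budget and no-short constraints, putting 100% in asset i gives a variance of exactly C[i, i], so asset 6 (8.10%) should come out on top regardless of which starting point the solver happens to be given.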
I want to know why this happens and how I can fix it. I would very much appreciate any help.
Sincerely,
C = s*s^T where s is the vector of standard deviations? - xidgel