
I am trying to calculate the maximum portfolio standard deviation using Excel Solver (GRG Nonlinear). The attached screenshot shows the optimization problem: maximize the portfolio variance w'Cw (equivalently the portfolio standard deviation sqrt(w'Cw)), with the weights summing to 100% and each weight non-negative.

Here w is a 20-dimensional vector of asset weights and C is the symmetric 20x20 variance-covariance matrix, so this is an optimization problem that maximizes the portfolio variance.
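(Not part of the Excel setup, just for reference: a minimal sketch in Python/numpy of the objective being maximized, with illustrative names of my own choosing.)

```python
import numpy as np

def portfolio_variance(w, C):
    """Portfolio variance w'Cw for a weight vector w and covariance matrix C."""
    w = np.asarray(w, dtype=float)
    return float(w @ C @ w)

def portfolio_std(w, C):
    """Portfolio standard deviation sqrt(w'Cw)."""
    return np.sqrt(portfolio_variance(w, C))

# Tiny 2-asset example: 5% and 8% vols, correlation 0.3
C = np.array([[0.05**2,        0.3*0.05*0.08],
              [0.3*0.05*0.08,  0.08**2      ]])
print(portfolio_std([0.5, 0.5], C))   # ~0.0532, i.e. about 5.32%
```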

However, when I run Excel Solver with GRG Nonlinear, it is not giving me the answer I want.

For example, say the standard deviations of the 20 assets are 5.11%, 7.18%, 3.83%, 5.24%, 3.26%, 8.10%, 1.62%, 4.59%, 4.95%, 4.15%, 2.62%, 2.10%, 4.58%, 4.14%, 2.01%, 2.97%, 1.80%, 1.78%, 3.07% and 3.24%. The solution to this optimization should be 100% in the 6th asset and 0% in all the other assets, since the 6th asset has the highest volatility (8.10%), and the portfolio that puts all the capital in that asset maximizes the portfolio volatility.
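Since only standard deviations are given here, a quick check of that claim in Python, assuming (my assumption) that the assets are uncorrelated so the covariance matrix is diagonal:

```python
import numpy as np

# Standard deviations from the question, in decimal form
stds = np.array([5.11, 7.18, 3.83, 5.24, 3.26, 8.10, 1.62, 4.59, 4.95, 4.15,
                 2.62, 2.10, 4.58, 4.14, 2.01, 2.97, 1.80, 1.78, 3.07, 3.24]) / 100

# Assumption: no correlation information given, so use a diagonal covariance matrix
C = np.diag(stds ** 2)

# w'Cw is convex (C is positive semi-definite), so its maximum over the simplex
# {w >= 0, sum(w) = 1} is attained at a vertex, i.e. a single-asset portfolio.
# Enumerating the 20 single-asset portfolios is therefore enough:
vertex_stds = np.sqrt(np.diag(C))      # volatility of each 100%-in-one-asset portfolio
best = int(np.argmax(vertex_stds))

print(f"Best single-asset portfolio: asset {best + 1} "
      f"with volatility {vertex_stds[best]:.2%}")   # -> asset 6, 8.10%
```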

However, Excel Solver gives me the solution 100% invested in the 1st asset and 0% in all the other assets, which gives a portfolio volatility of only 5.11%.

I want to know why this happens and how I can fix it. I would very much appreciate any help.

Sincerely,

Are you defining C=s*s^T where s is the vector of standard deviations? - xidgel
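(For what the comment is getting at: if C is built as s*s^T rather than from a full correlation matrix, every pairwise correlation is implicitly 1 and the variance collapses to (w's)^2. A small check, using a hypothetical 3-asset slice of the data:)

```python
import numpy as np

s = np.array([0.0511, 0.0718, 0.0810])   # three of the question's std devs
C_rank1 = np.outer(s, s)                 # C = s s^T: every correlation equals 1

w = np.array([0.2, 0.3, 0.5])
lhs = w @ C_rank1 @ w
rhs = (w @ s) ** 2
print(np.isclose(lhs, rhs))              # True: w'Cw collapses to (w's)^2
```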

1 Answer


Minimizing w'Cw should work just fine; maximizing w'Cw is a very different animal. That objective makes the problem non-convex, and you need a global solver for it. GRG is a general-purpose, local NLP solver, so on non-convex problems it finds a local optimum rather than the global solution. With uncorrelated assets, for example, 100% in the first asset is itself a local optimum: moving a small amount of weight from that corner to any other asset initially lowers the portfolio variance, so GRG has no reason to leave it.

You could try multi-starts to find better solutions, or use the Evolutionary solver, but there is still no guarantee of finding the global optimum. For proven global solutions there are deterministic global solvers available outside Excel (Antigone, Baron, Couenne).
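As a rough sketch of the multi-start idea outside Excel (my own illustration, using scipy's SLSQP as the local solver and a diagonal covariance matrix built from the question's standard deviations):

```python
import numpy as np
from scipy.optimize import minimize

def max_variance_multistart(C, n_starts=50, seed=0):
    """Multi-start local search for max w'Cw s.t. sum(w) = 1, w >= 0.

    Mirrors the multi-start idea above, but with scipy's SLSQP instead of
    Excel's GRG Nonlinear. Still a heuristic, not a proof of global optimality.
    """
    n = C.shape[0]
    rng = np.random.default_rng(seed)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]
    bounds = [(0.0, 1.0)] * n

    best_w, best_var = None, -np.inf
    for _ in range(n_starts):
        w0 = rng.dirichlet(np.ones(n))              # random feasible start
        res = minimize(lambda w: -(w @ C @ w), w0,  # maximize by minimizing the negative
                       bounds=bounds, constraints=cons, method="SLSQP")
        if res.success and -res.fun > best_var:
            best_var, best_w = -res.fun, res.x
    return best_w, np.sqrt(best_var)

# Example with the question's data, assuming uncorrelated assets:
stds = np.array([5.11, 7.18, 3.83, 5.24, 3.26, 8.10, 1.62, 4.59, 4.95, 4.15,
                 2.62, 2.10, 4.58, 4.14, 2.01, 2.97, 1.80, 1.78, 3.07, 3.24]) / 100
w, vol = max_variance_multistart(np.diag(stds ** 2))
print(np.argmax(w) + 1, f"{vol:.2%}")   # typically asset 6, 8.10%
```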