I have a correlation matrix for N random variables, each of which is uniformly distributed on [0,1]. I am trying to simulate these random variables; how can I do that? Note that N > 2. I was trying to use the Cholesky decomposition, and below are my steps:
- compute the Cholesky factor of the correlation matrix, i.e. the N×N lower triangular matrix L such that L*L^T equals the correlation matrix
- independently sample 10000 times from each of the N uniformly distributed random variables, giving an N×10000 matrix S
- multiply the two: L*S. This gives me correlated samples, but their range is no longer within [0,1] (see the sketch below).
How can I solve the problem?
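To be concrete, here is a minimal sketch of what I am doing, assuming NumPy; the 3×3 matrix is just a placeholder for my actual correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder correlation matrix (N = 3 here; mine is larger).
corr = np.array([[1.0, 0.5, 0.3],
                 [0.5, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])

# Lower triangular Cholesky factor: L @ L.T reproduces corr.
L = np.linalg.cholesky(corr)

# N independent U[0,1] variables, sampled 10000 times each.
S = rng.uniform(0.0, 1.0, size=(3, 10000))

# Correlated samples -- but some values land outside [0,1].
X = L @ S
print(X.min(), X.max())
```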
I know that if I only have 2 random variables I can do something like rho*x1 + sqrt(1-rho^2)*y1 to get my correlated sample y. But when more than two variables are correlated, I am not sure what to do.
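For reference, here is the two-variable version as a sketch (again assuming NumPy); note that it runs into the same range problem when applied directly to U[0,1] samples:

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.7

# Two independent U[0,1] samples.
x1 = rng.uniform(0.0, 1.0, 10000)
y1 = rng.uniform(0.0, 1.0, 10000)

# Induce correlation rho between x1 and y.
y = rho * x1 + np.sqrt(1 - rho**2) * y1

print(np.corrcoef(x1, y)[0, 1])  # close to rho
print(y.min(), y.max())          # y can exceed 1, same issue as above
```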