0 votes

I am using element-wise multiplication in MATLAB, and the biggest matrices involved are 120x50. In the code below, weight_N_120 is a cell array whose ith cell contains 10000 further cells, and each of those 10000 cells holds a matrix of dimension 120 by i. The same goes for ind_ExRet_N.

    for i = 2:50
        for j = 1:10000
            weight_ExRet_NS{i,1}{j,1} = weight_N_120{i,1}{j,1} .* ind_ExRet_N{i,1}{j,1};
        end
        i  % echo the loop counter to track progress
    end

When I run this, I get an error:

Error using .*
Out of memory. Type HELP MEMORY for your options.

Error in PCA (line 26)
    weight_ExRet_NS{i,1}{j,1} = weight_N_120{i,1}{j,1}.*ind_ExRet_N{i,1}{j,1};

Error in run (line 64)
evalin('caller', [script ';']);

I realised it stopped running when i = 30, so by that point there is no longer enough memory even for an element-wise multiplication of two 120-by-30 matrices. How can I resolve this problem?

The problem might be the amount of data you are producing. How much RAM do you have? Besides, storing matrices in cell arrays is the best thing to do if you want to waste memory. Since all the elements of your cell are doubles, it makes no sense to store them in a cell. Try a structure. – gire
Yep, it may be that you are just generating too much data. Instead of creating a new variable weight_ExRet_NS, overwrite weight_N_120 (see the sketch after these comments). – ASantosRibeiro
@gire I am running this on a grid with 40 GB of memory. I am new to MATLAB; how can I edit it to use a structure rather than a cell? Thanks. – TrueTears
@ASantosRibeiro OK. Thanks, I will try that now and see how it goes :) – TrueTears
Check how to define a structure here. – gire

1 Answer

1 vote

I tried to estimate the expected memory consumption of (the matrices in) weight_ExRet_NS. Taking every matrix to be 120x50 (type double), the total comes to

    120 * 50 * 50 * 10000 * 8 / 1024 / 1024 ≈ 22888 MB

and that still slightly underestimates the real footprint, because the cell arrays add per-cell overhead on top.

Bear in mind that you also have two more cell arrays of similar size, so the three variables together need on the order of 67 GB, well over your 40 GB. I think you have finally hit the limit: picture a bucket that fills up and pours over. What you need to do is decide whether you really need all this data, and at this precision. If you do, you should chunk it up and work on each chunk independently. If it is possible to erase (or overwrite) some of the data dynamically, you should do that as well.