General programming question, but there might be specific considerations for Matlab.
I will be importing a very large data file. Is it better practice/faster/more efficient to import the whole file into memory and then divide it into submatrices, or to import every n columns into a new matrix?
My guess is that it would be faster to load it all into memory and then deal with it, but that's just an uneducated guess.
data = csvread('filename')
and then dividing the data matrix into several matrices, say:

matrix_1_2 = data(:,1:2)

and so on. Is that better than scanning for the first two columns only, saving them, then scanning for the next pair of columns, and so on? – msmf14

(I could also skip the intermediate matrices and compute directly, e.g.

resultingMatrix = data(:,1:n) .* data(:,n+1:2*n)

but that would make the code less legible for others) – msmf14