Whereas it may make sense to preallocate a traditional dense matrix in R (in the same way that it is much more efficient to preallocate a regular (atomic) vector rather than growing it one element at a time), I'm pretty sure it will not pay to preallocate sparse matrices in R, in most situations.
Why?
For dense matrices, you allocate and then assign "piece by piece", e.g.,

    m[i,j] <- value
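For instance, a minimal sketch of that idiom (the dimensions and fill rule are made up for illustration):

    n <- 1000
    m <- matrix(0, nrow = n, ncol = n)  # allocate the full dense storage once
    for (k in seq_len(n))
        m[k, k] <- k                    # cheap: writes into already-allocated memory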
For sparse matrices, however, that is very different: if you do something like

    S[i,j] <- value

the internal code has to check whether [i,j] is an existing (typically non-zero) entry or not. If it is, it can change the value; otherwise, one way or another, the triplet (i, j, value) needs to be stored, and that means extending the current structure, etc. If you do this piece by piece, it is inefficient, largely irrespective of whether you had done some preallocation or not.
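Here is that anti-pattern spelled out (a hedged sketch; the dimensions and fill pattern are made up):

    library(Matrix)
    n <- 1000
    S <- Matrix(0, nrow = n, ncol = n, sparse = TRUE)  # a "preallocated" sparse matrix
    for (k in seq_len(n))
        S[k, 1] <- k  # each assignment may have to grow and restructure the internal slots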
If, on the other hand, you already know in advance all the [i,j] combinations which will contain non-zeros, you could "pre-allocate", but in this case, just store the vectors i and j of length nnzero, say. Then use your underlying "algorithm" to also construct a vector x of the same length which contains all the corresponding values, i.e., entries.
Now, indeed, as @Pafnucy suggested, use spMatrix() or sparseMatrix(), two slightly different versions of the same functionality: constructing a sparse matrix, given its contents.
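For example (a small sketch; the indices and values are made up, but the constructor calls are the actual Matrix API):

    library(Matrix)
    ## suppose your algorithm produced these nnzero = 3 triplets:
    i <- c(1, 3, 5)       # row indices of the non-zero entries
    j <- c(2, 4, 6)       # column indices
    x <- c(1.5, -2, 10)   # the corresponding values
    S  <- sparseMatrix(i = i, j = j, x = x, dims = c(6, 6))  # dgCMatrix (compressed column form)
    S2 <- spMatrix(nrow = 6, ncol = 6, i = i, j = j, x = x)  # dgTMatrix (triplet form)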
I am happy to help further, as I am the maintainer of the Matrix package.