I'm not sure that changing libraries will help. You're going to need doubles (8 bytes each). I don't know what the dimension of the covariance matrix is in your case, but switching libraries won't change the underlying calculations much.
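A quick back-of-envelope check makes the point: a dense n x n matrix of doubles needs roughly n * n * 8 bytes no matter which library allocates it. The dimension below (20,000) is a made-up example, not taken from your question:

```shell
# Rough heap needed for a dense n x n covariance matrix of doubles.
# n=20000 is a hypothetical dimension for illustration.
n=20000
echo "$(( n * n * 8 )) bytes"   # before any JVM object/array overhead
```

For n = 20,000 that's already about 3.2 GB for a single copy of the matrix, and most eigenvalue routines need working storage on top of that.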
What is the -Xmx setting when you run? What about the perm gen size? Perhaps you can increase them.
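If you haven't set them explicitly, something like the following is worth trying. The jar name is a placeholder, and the sizes are hypothetical starting points, not recommendations tuned to your workload (MaxPermSize applies to pre-Java-8 JVMs):

```shell
# Hypothetical example: request a 4 GB heap up front and a larger perm gen.
# Replace your-app.jar and the sizes with values that fit your machine.
java -Xmx4g -Xms4g -XX:MaxPermSize=256m -jar your-app.jar
```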
Does the algorithm halt immediately, or does it run for a while first? If it's the latter, you can attach to the process using VisualVM 1.3.3 (download and install all the plugins). It'll let you see what's happening on the heap, in the threads, etc. That could help you ferret out the root cause.
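If you'd rather not run a GUI, the JDK's own command-line tools give you a similar view of the heap; `<pid>` below is the process ID you get from `jps`:

```shell
# Find the PID of the running JVM, then watch heap/GC behavior from the shell.
jps -l                    # list Java processes with their main class or jar
jstat -gcutil <pid> 1000  # per-generation heap utilization, sampled every second
jmap -heap <pid>          # one-shot summary of heap configuration and usage
```

If the old generation climbs steadily to 100% before the failure, you're genuinely out of heap rather than hitting some other limit.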
A Google search for "Java eigenvalue of large matrices" turned up this library from Google. If you scroll down in the comments, I wonder if a block Lanczos eigenvalue analysis might help. It might be enough if you can get a subset of the eigenvalues.
These SVM implementations claim to be useful for large datasets:
http://www.support-vector-machines.org/SVM_soft.html
I don't think you can ask for more than 2GB of heap from a 32-bit JVM:
http://www.theserverside.com/discussions/thread.tss?thread_id=26347
According to Oracle, you'll need a 64-bit JVM running on a 64-bit OS:
http://www.oracle.com/technetwork/java/hotspotfaq-138619.html#gc_heap_32bit
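You can check which kind of JVM you have from the version banner; a 64-bit build says so explicitly, and only then will heap requests above 2 GB be accepted:

```shell
# A 64-bit HotSpot JVM includes "64-Bit" in its version banner.
java -version 2>&1 | grep -i '64-bit'

# On a 64-bit JVM this starts normally; a 32-bit JVM rejects the heap size.
java -Xmx6g -version
```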