I have noticed in one of my Java agents that when storing a lot of Document objects in a HashMap I get an OutOfMemoryError. What I'm trying to achieve is this:
I want to populate my Domino database with information from an Oracle database. The logic is: get all rows from Oracle and all documents from the Domino database. Everything new in Oracle is saved to Domino, and everything in Domino that is no longer in Oracle is removed. A sort of replication, if you will…
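The diff logic I described can be sketched with plain key sets, independent of Domino or Oracle (the class and method names here are hypothetical, just for illustration):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class SyncDiff {
    // UNIDs present in Oracle but missing from Domino -> documents to create/save.
    static Set<String> toCreate(Map<String, ?> oracleHash, Map<String, ?> dominoHash) {
        Set<String> create = new HashSet<>(oracleHash.keySet());
        create.removeAll(dominoHash.keySet());
        return create;
    }

    // UNIDs present in Domino but missing from Oracle -> documents to remove.
    static Set<String> toDelete(Map<String, ?> oracleHash, Map<String, ?> dominoHash) {
        Set<String> delete = new HashSet<>(dominoHash.keySet());
        delete.removeAll(oracleHash.keySet());
        return delete;
    }

    public static void main(String[] args) {
        Map<String, String> oracleHash = new HashMap<>();
        Map<String, String> dominoHash = new HashMap<>();
        oracleHash.put("A", "row");
        oracleHash.put("B", "row");
        dominoHash.put("B", "doc");
        dominoHash.put("C", "doc");
        System.out.println(toCreate(oracleHash, dominoHash)); // [A]
        System.out.println(toDelete(oracleHash, dominoHash)); // [C]
    }
}
```

Note that this only needs the UNIDs on each side, not the full Document objects.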
I start with looping all documents in the Domino database and store them in a HashMap with UNID as key and Document object as value:
dominoHash.put(doc.getUniversalId(), doc);
Then I loop the result set and create in-memory Document objects:
Document doc = db.createDocument();
doc.replaceItemValue("MyField", rset.getString("MYCOLUMN"));
oracleHash.put(rset.getString("UNID"), doc);
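One variant of this step would cache only the plain field values per UNID instead of holding a backend Document object for every Oracle row, creating the actual Document only when it is about to be saved. A minimal sketch, assuming the rows have already been read out of the ResultSet into plain maps (the class and method names are hypothetical):

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class OracleCache {
    // Cache one small map of field values per UNID, rather than one Document
    // object per row; "MYCOLUMN" -> "MyField" mirrors the mapping in the post.
    static Map<String, Map<String, String>> cacheRows(Iterable<Map<String, String>> rows) {
        Map<String, Map<String, String>> oracleHash = new HashMap<>();
        for (Map<String, String> row : rows) {
            Map<String, String> fields = new LinkedHashMap<>();
            fields.put("MyField", row.get("MYCOLUMN"));
            oracleHash.put(row.get("UNID"), fields); // keyed by UNID, as above
        }
        return oracleHash;
    }

    public static void main(String[] args) {
        Map<String, String> row = new HashMap<>();
        row.put("UNID", "ABC123");
        row.put("MYCOLUMN", "hello");
        Map<String, Map<String, String>> cache = cacheRows(java.util.List.of(row));
        System.out.println(cache.get("ABC123").get("MyField")); // hello
    }
}
```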
In my TEST environment I have 20,000+ Domino documents and the same number in Oracle; in this case they are identical. The OutOfMemoryError always occurs when storing into oracleHash, even when I have excluded all Domino document logic from the code to "save memory"…
I thought the objects I'm storing in the two HashMaps were the same, but somehow they are not. I can successfully store my Domino Document objects in a HashMap, but not my Oracle Document objects.
Even stranger, the documents stored from Domino have more fields than the Oracle view has columns.
What could be the cause of this?
In this specific case I use the org.openntf.domino API and haven't tried the native API (yet).