I believe my method is leaking memory: in the profiler, the "Surviving generations" count keeps increasing.
In production I get "OOM heap space" errors after a while, and I now think this method is the culprit.
As background, the method's goal is to retrieve the documents already present in an index. The resulting list is then used to decide whether each document can remain in the index or should be removed (e.g. because the corresponding document has been deleted from disk):
public final List<MyDocument> getListOfMyDocumentsAlreadyIndexed() throws SolrServerException, HttpSolrClient.RemoteSolrException, IOException {
    final SolrQuery query = new SolrQuery("*:*");
    query.addField("id");
    query.setRows(Integer.MAX_VALUE); // we want ALL documents in the index, not only the first ones

    final SolrDocumentList results = this.getSolrClient().query(query).getResults();

    final List<MyDocument> listOfMyDocumentsAlreadyIndexed = results.parallelStream() // tried plain stream() too, same behaviour
            .map(doc -> {
                final MyDocument tmpDoc = new MyDocument();
                tmpDoc.setId((String) doc.getFirstValue("id"));
                // Usually some boolean fields are also set here;
                // removed for the test and for this question
                return tmpDoc;
            })
            .collect(Collectors.toList());
    return listOfMyDocumentsAlreadyIndexed;
}
The test for this method makes the following call in a for loop 300 times (this simulates my indexing loops, since my program indexes one index after the other):
List<MyDocument> listOfExistingDocsInIndex = index.getListOfMyDocumentsAlreadyIndexed();
I tried to nullify the list after use (in the test the list is not actually used; this was just to see whether it had any effect), without any noticeable change:

listOfExistingDocsInIndex = null;
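To check whether the stream/collect pattern itself (rather than Solr) retains memory across iterations, I also tried a Solr-free stand-in that repeats the same map-and-collect shape in a 300-iteration loop. `Doc` and `fetch` here are hypothetical substitutes for `MyDocument` and the real query; only the structure mirrors my method:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class StreamCollectSketch {

    // Hypothetical stand-in for MyDocument: just holds an id.
    static class Doc {
        final String id;
        Doc(String id) { this.id = id; }
    }

    // Mimics the parallelStream().map(...).collect(...) shape of
    // getListOfMyDocumentsAlreadyIndexed(), without any Solr call.
    static List<Doc> fetch(int n) {
        return IntStream.range(0, n)
                .parallel()
                .mapToObj(i -> new Doc("doc-" + i))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Doc> docs = null;
        for (int i = 0; i < 300; i++) {
            // The list from the previous iteration becomes unreachable
            // here, so it should be eligible for garbage collection.
            docs = fetch(10_000);
        }
        System.out.println(docs.size()); // prints 10000
    }
}
```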
This is the call tree I get from the NetBeans profiler (I've just started using the profiler):
What can I change or improve to avoid this memory leak (it is actually a memory leak, isn't it?)?
Any help appreciated :-)