I am using C# to interact with the Domino COM API, with Lotus Notes 8.5.2, Visual Studio 2008, and Windows 7 SP1.
I am trying to prevent this error from Lotus:
LSXBE: ************************************
LSXBE: ****** Out of Backend Memory *******
LSXBE: ************************************
The following code copies each NotesDocument in one NSF to another NSF. After getting UNIDs from a NotesView with a selection formula of SELECT @All, the code copies one document at a time using:
NotesDocument ndoc = nd.GetDocumentByUNID(nve.UniversalID);
ndoc.CopyToDatabase(nd2);
Given a source NSF of 10 GB, the amount of memory (private bytes, I believe) used by my application grows steadily to about 450 MB. ANTS Memory Profiler indicates that almost all of that memory is allocated as unmanaged memory, which would be the COM objects.
- Could I reduce memory consumption by disposing of Notes objects? I have not seen methods to deallocate a NotesSession, NotesDatabase, NotesDocument, etc. Is there a way to deallocate the memory?
I have tried a version of the code that calls GC.Collect() and GC.WaitForPendingFinalizers() after every 5000 documents; that eliminated the "Out of Backend Memory" errors on a machine with 16 GB of RAM. Even when copying finishes, and I set the object containing the code below to null and force garbage collection, memory utilization remains around 450 MB.
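The periodic collection is roughly the following, inside the copy loop shown below (the 5000-document interval is simply the value I used):
//Inside the copy loop below; j is the loop counter.
if (j % 5000 == 0)
{
    GC.Collect();
    GC.WaitForPendingFinalizers();
}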
I also tried putting the copy code in its own thread and then garbage collecting after the thread completes. That did not help.
- If there are no disposal methods for Notes objects, how else might I reduce memory utilization?
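For reference, the kind of explicit release I have in mind is sketched below. Marshal.ReleaseComObject (from System.Runtime.InteropServices) is the standard way to release a COM wrapper from .NET, but I have not confirmed that it is safe or effective with the Domino interop objects:
//Sketch only: the copy-loop body with an explicit release of the
//per-document COM wrapper (requires: using System.Runtime.InteropServices;).
NotesDocument ndoc = nd.GetDocumentByUNID(nve.UniversalID);
ndoc.CopyToDatabase(nd2);
Marshal.ReleaseComObject(ndoc);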
C# code:
//Establish session
NotesSession ns = new Domino.NotesSessionClass();
ns.Initialize("");
//Open source NSF
NotesDatabase nd = ns.GetDatabase("", "test.nsf", false);
//Open destination NSF.
//Assume that all design elements of nd2 are identical to those of nd
NotesDatabase nd2 = ns.GetDatabase("", "test2.nsf", false);
//Create a view that returns all documents, using the existing $All view as a design template.
NotesView nView2 = nd.GetView("$All");
nd.CreateView("All-DR", "SELECT @All", nView2, false);
NotesView nView = nd.GetView("All-DR");
//Loop through the entries in the new view.
NotesViewEntryCollection nvec = nView.AllEntries;
int intEntryCount = nvec.Count;
NotesViewEntry nve = null;
for (int j = 1; j <= intEntryCount; j++)
{
    if (j == 1)
    {
        nve = nvec.GetFirstEntry();
    }
    else
    {
        nve = nvec.GetNextEntry(nve);
    }
    //Copy the document to the second database.
    NotesDocument ndoc = nd.GetDocumentByUNID(nve.UniversalID);
    ndoc.CopyToDatabase(nd2);
}
//End loop.
//All documents are copied.
I have these ideas, none of which is appealing:
Call garbage collection after every call to CopyToDatabase. I don't expect this to work, given that I believe I am dealing with a memory leak in the COM layer, and I also expect it to slow my application down.
Try the C++ API. I have no idea whether the issue exists there.
This is a really clumsy approach: build a manager app that (a) gets a list of UNIDs and writes them to a text file, (b) starts the copy app, (c) the copy app copies a subset of the documents listed in the text file, (d) the copy app terminates, freeing its memory, (e) the manager app starts a new copy app process, and (f) steps c-e repeat as needed. A rough sketch of the manager side is below.
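A minimal sketch of the manager side, assuming a hypothetical CopyApp.exe that accepts a UNID file, a start index, and a count (the file name and batch size are placeholders):
//Hypothetical manager: launch a fresh copy process per batch of UNIDs so each
//process's unmanaged memory is released when that process exits.
int totalUnids = System.IO.File.ReadAllLines("unids.txt").Length;
int batchSize = 5000;
for (int start = 0; start < totalUnids; start += batchSize)
{
    var psi = new System.Diagnostics.ProcessStartInfo("CopyApp.exe",
        string.Format("unids.txt {0} {1}", start, batchSize));
    using (var p = System.Diagnostics.Process.Start(psi))
    {
        p.WaitForExit();
    }
}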