I have a C++ application which embeds Lua and periodically (~10 times per second) runs scripts that can create new userdata objects, each of which allocates a non-trivial amount of memory (~1 MiB) on the C++ side. The memory is correctly freed when these objects are garbage-collected, but the problem is that they don't seem to be collected until it's too late, and the process starts consuming inordinate amounts of memory (many GiB).

I suspect this happens because the Lua GC considers these objects not worth collecting: they appear tiny on the Lua side, and it has no idea how much memory they actually consume. If that's correct, is there any way to tell it about the real cost of keeping these objects around?
For people familiar with C#, I'm basically looking for the equivalent of GC.AddMemoryPressure() for Lua.
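One workaround sketch, assuming the external memory can't be made visible to Lua's allocator: drive the collector manually from the C++ side instead of waiting for its size-based heuristics. `lua_gc` is the standard C API call; `run_pending_scripts` below is a hypothetical stand-in for whatever dispatch code already runs the scripts each tick.

```cpp
#include <lua.hpp>

void run_pending_scripts(lua_State *L);  // hypothetical: your existing dispatch code

// Called ~10 times per second.
void tick(lua_State *L) {
    run_pending_scripts(L);

    // Force a full collection cycle so userdata that became garbage during
    // this tick is finalized now (freeing its C++ memory), instead of
    // whenever the GC's Lua-side size heuristics get around to it.
    lua_gc(L, LUA_GCCOLLECT, 0);
}
```

A gentler variant is to make the incremental collector more aggressive once at startup, e.g. `lua_gc(L, LUA_GCSETPAUSE, 100)` and `lua_gc(L, LUA_GCSETSTEPMUL, 400)`, rather than paying for a full collection on every tick.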
If you allocate your objects with `lua_newuserdata`, then Lua knows about their size. If you're only wrapping userdata around C++ pointers pointing to memory allocated in C++, then Lua does not know and can't be told. – lhf
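A minimal sketch of what lhf describes, assuming the ~1 MiB payload can live inside the userdata block itself rather than behind a separately allocated pointer (`BigObject` is a hypothetical stand-in for the real type):

```cpp
#include <lua.hpp>
#include <new>

struct BigObject {
    char payload[1 << 20];  // ~1 MiB, hypothetical stand-in for the real data
};

// __gc metamethod: run the C++ destructor before Lua frees the block.
static int bigobject_gc(lua_State *L) {
    static_cast<BigObject *>(lua_touserdata(L, 1))->~BigObject();
    return 0;
}

static int bigobject_new(lua_State *L) {
    // Lua's allocator provides the full 1 MiB, so the GC accounts for
    // the object's true cost when deciding when to collect.
    void *mem = lua_newuserdata(L, sizeof(BigObject));
    new (mem) BigObject();  // placement-construct the C++ object in place

    if (luaL_newmetatable(L, "BigObject")) {
        lua_pushcfunction(L, bigobject_gc);
        lua_setfield(L, -2, "__gc");
    }
    lua_setmetatable(L, -2);
    return 1;
}
```

With the memory inside the userdata, every allocation raises the GC debt that drives collection cycles, so the collector should keep up on its own.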