Interestingly enough, I had never had an OpenGL context lost (where all buffer resources are wiped) until now. I am currently using OpenGL 4.2 via SDL 1.2 and GLEW on Windows 7 64-bit. My application is windowed and cannot switch to fullscreen while running (fullscreen can only be chosen at startup).
On my dev machine the context never seems to be lost on resize, but on other machines my application can lose the OpenGL context (it seems to be rare). Due to memory constraints (a lot of memory is used by other parts of the application), I do not back up my GL buffer contents (VBOs, FBOs, textures, etc.) in system memory. Oddly, this hasn't been a problem for me in the past, because the context never got wiped.
It's hard to discern from googling under what circumstances an OpenGL context will be lost (where all GPU memory buffers are wiped), other than perhaps toggling between fullscreen and windowed mode.
Back in my DirectX days, a lost device could happen for many reasons; I would be notified when it happened and reload my buffers from system-memory backups. I was under the assumption (perhaps wrongly) that OpenGL, or a managing library like SDL, would handle this buffer reload for me. Is that in any way even partially true?
One of the issues is that losing the context on a resize is pretty darn inconvenient: I am using a lot of GPU memory, and having to reload everything could pause the app for a while (well, longer than I would like).
Is this device dependent or driver dependent? Is it some combination of device, driver, and SDL version? How can a context loss like this be detected so that I can react to it?
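The closest thing I've found to a standardized way of asking the GL whether a reset happened is the GL_ARB_robustness extension (glGetGraphicsResetStatusARB), though as I understand it the context must be created with reset notification enabled, which SDL 1.2 cannot request. Here is a minimal sketch of the check; the function pointer is passed in (in a real app it would be GLEW's glGetGraphicsResetStatusARB, after confirming GLEW_ARB_robustness) so the logic stands on its own:

```c
/* Reset-status check sketch based on the ARB_robustness spec.
 * The query function is injected rather than linked so this
 * compiles without GLEW or a live GL context. */

typedef unsigned int GLenum;

/* Enum values from the ARB_robustness spec. */
#define GL_NO_ERROR                   0
#define GL_GUILTY_CONTEXT_RESET_ARB   0x8253
#define GL_INNOCENT_CONTEXT_RESET_ARB 0x8254
#define GL_UNKNOWN_CONTEXT_RESET_ARB  0x8255

typedef GLenum (*GetResetStatusFn)(void);

/* Returns nonzero if the driver reports that a reset wiped GPU state.
 * With the extension absent (null pointer) there is no way to ask,
 * so we assume the context is fine. */
int context_was_reset(GetResetStatusFn get_status)
{
    if (!get_status)
        return 0;
    return get_status() != GL_NO_ERROR;
}
```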
Is it standard practice to keep system-memory copies of all GL buffer contents so that they can be reloaded on context loss? Or is context loss rare enough that this isn't standard practice?
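In case it helps frame the question, the scheme I'm considering (the same one I used in DX) looks roughly like this: keep a malloc'd system-memory copy of each buffer at upload time, then replay the uploads after a loss. The upload callback stands in for glBufferData here so the bookkeeping can be shown without a live context; the names are mine, not from any library:

```c
/* Sketch of a CPU-side backup registry for buffer objects, assuming
 * you choose to mirror GPU data in system memory.  The upload_fn
 * callback stands in for glBufferData (target, size, data). */
#include <stdlib.h>
#include <string.h>

typedef void (*upload_fn)(unsigned target, size_t size, const void *data);

typedef struct {
    unsigned target;   /* e.g. GL_ARRAY_BUFFER */
    size_t   size;     /* byte size of the buffer */
    void    *data;     /* system-memory copy of the contents */
} BackedBuffer;

/* Record a copy of the data at upload time; returns 0 on OOM. */
static int buffer_backup_init(BackedBuffer *b, unsigned target,
                              size_t size, const void *data)
{
    b->data = malloc(size);
    if (!b->data)
        return 0;
    memcpy(b->data, data, size);
    b->target = target;
    b->size   = size;
    return 1;
}

/* After a context loss, call this for every backed-up buffer to
 * re-upload its contents (passing glBufferData as the callback). */
static void buffer_backup_restore(const BackedBuffer *b, upload_fn upload)
{
    upload(b->target, b->size, b->data);
}
```

The obvious downside is exactly my memory constraint: every buffer exists twice, once in VRAM and once in system RAM.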