On iOS it is easy to set up OpenGL ES 2.0 render-to-texture, and then use those textures in post-processing passes or as inputs to subsequent rendering passes. That seems like a fairly common approach across OpenGL implementations. All good.
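For concreteness, here is roughly the render-to-texture setup I mean (a minimal sketch: the 512x512 size is arbitrary, and error handling is omitted):

    #include <OpenGLES/ES2/gl.h>

    /* Create a texture to render into. */
    GLuint fbo, tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* Attach the texture as the color target of a framebuffer object. */
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        /* handle incomplete framebuffer */
    }

    /* Draw the first pass here, then bind `tex` as an input texture
       for the next pass. */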
According to Apple's OpenGL ES Programming Guide for iOS (see pages 28 and 29), you can also create and draw to multiple offscreen framebuffer objects. They suggest that you would do this in order to perform offscreen image processing. However, I can't find any description of how you would access those buffers for image processing, or for any other purpose, after rendering to them.
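As far as I can tell from the guide, the offscreen case looks something like this, with a renderbuffer rather than a texture as the color attachment (again just a sketch; GL_RGBA8_OES comes from the OES_rgb8_rgba8 extension, so check it is available, and GL_RGBA4 is the core ES 2.0 alternative):

    #include <OpenGLES/ES2/gl.h>
    #include <OpenGLES/ES2/glext.h>  /* for GL_RGBA8_OES */

    /* Offscreen framebuffer backed by a renderbuffer instead of a texture. */
    GLuint offscreenFbo, colorRenderbuffer;
    glGenRenderbuffers(1, &colorRenderbuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8_OES, 512, 512);

    glGenFramebuffers(1, &offscreenFbo);
    glBindFramebuffer(GL_FRAMEBUFFER, offscreenFbo);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, colorRenderbuffer);

    /* Render here -- but then how do I get at these pixels afterwards? */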
Can these offscreen buffers be used with non-OpenGL frameworks for image processing? Can they be read back by the CPU?
Does anyone have any pointers or examples?