
I am working on an AR app. I have a SceneKit scene rendered into an SCNView (using OpenGL ES as the rendering API). Beneath this is another view showing a live camera preview. I would like to create a movie containing both the video and the 3D scene.

I am using code based on Apple’s RosyWriter sample code to process the video frame pixel buffers with OpenGL shaders. I don’t think I quite understand the concepts well enough, because I’m not sure how to overlay the SceneKit rendering on the video frames. Can I get a pixel buffer from the SCNView/SCNSceneRenderer, or do I need to use an SCNRenderer to re-render the scene to a texture in the offscreen buffer that the OpenGL capture pipeline uses to process video frames?


1 Answer


In iOS 11, there's now a technology for easily doing not just SceneKit overlays on video, but a whole augmented reality experience: ARKit. (Also, if you're rolling your own AR tracking tech but want to display it using SceneKit, in iOS 11 you can set an AVCaptureDevice as the contents of any SceneKit material property, including a scene's background.)
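For example, a minimal sketch of that second approach might look like the following (scnView stands in for your existing SCNView; as I understand it, SceneKit creates and manages the capture session for you):

// iOS 11+ sketch: use the live camera as the SceneKit scene's background
// (requires AVFoundation; scnView is your existing SCNView)
AVCaptureDevice *videoDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
scnView.scene.background.contents = videoDevice;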

For older iOS versions, the below advice still applies...


If you already have content that you're rendering using OpenGL (or Metal, for that matter), and you want to add SceneKit content on top of it, use SCNRenderer. Set a scene on it just as you would with an SCNView, do all your existing OpenGL rendering the way you usually do, and then call renderAtTime:.
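In sketch form, the whole pattern is only a few lines; myEAGLContext and myScene below are placeholders for your own GL context and scene:

// Share your existing GL context with an SCNRenderer,
// do your own drawing, then let SceneKit draw on top of it.
SCNRenderer *renderer = [SCNRenderer rendererWithContext:myEAGLContext options:nil];
renderer.scene = myScene;
// ... your existing OpenGL drawing into the currently bound framebuffer ...
[renderer renderAtTime:CACurrentMediaTime()];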

Just for a proof of concept, here's a really quick hack at adding SceneKit rendering on top of RosyWriter. All the code below goes into RosyWriterOpenGLRenderer.m.

In the declaration of instance variables:

@interface RosyWriterOpenGLRenderer ()
{
    //... existing ivars, plus:
    SCNRenderer *_scnRenderer;
}

In init, after setting up _oglContext:

// Create an SCNRenderer that shares the capture pipeline's OpenGL ES context
_scnRenderer = [SCNRenderer rendererWithContext:_oglContext options:nil];

// Placeholder content: a default-size box spinning forever
SCNScene *scene = [SCNScene scene];
SCNNode *node = [SCNNode nodeWithGeometry:[SCNBox boxWithWidth:1 height:1 length:1 chamferRadius:0]];
[node runAction:[SCNAction repeatActionForever:[SCNAction rotateByX:1 y:1 z:1 duration:1]]];
[scene.rootNode addChildNode:node];
_scnRenderer.scene = scene;

In copyRenderedPixelBuffer, between the existing call to glFlush() and the bail label:

glFlush();

// SceneKit draws into whatever framebuffer is currently bound; here that's the
// offscreen framebuffer backed by the destination pixel buffer's texture.
[_scnRenderer renderAtTime:CFAbsoluteTimeGetCurrent()];

bail:

That's enough to get a big spinning white cube both in the live video view and in the recorded movie output. (It appears as unshaded white only because of the default material and lighting setup; for a real use case, set up lights, materials, cameras, whatever geometry you want, and so on, as in the sketch below.)
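For example, here's a minimal, purely illustrative sketch of fleshing that out, continuing in init with the scene and node variables from above (the node names, positions, and color are my own placeholders, not part of RosyWriter):

// Illustrative only: a point of view and a light for the placeholder scene
SCNNode *cameraNode = [SCNNode node];
cameraNode.camera = [SCNCamera camera];
cameraNode.position = SCNVector3Make(0, 0, 5);
[scene.rootNode addChildNode:cameraNode];

SCNNode *lightNode = [SCNNode node];
lightNode.light = [SCNLight light];
lightNode.light.type = SCNLightTypeOmni;
lightNode.position = SCNVector3Make(0, 3, 3);
[scene.rootNode addChildNode:lightNode];

// And a material so the lighting actually shows up on the box
node.geometry.firstMaterial.diffuse.contents = [UIColor redColor];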

no, my desk isn't really that pink.