In iOS 11, there's now a technology for easily doing not just SceneKit overlays on video, but a whole augmented reality experience: ARKit. (Also, if you're rolling your own AR tracking tech but want to display it using SceneKit, in iOS 11 you can set an `AVCaptureDevice` as the contents of any SceneKit material property, including a scene's background.)

For older iOS versions, the advice below still applies...
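To illustrate that iOS 11 feature, here's a minimal sketch (untested, and assuming iOS 11 or later) of assigning a capture device directly as a scene's background contents; SceneKit manages the capture session for you when you do this:

```objc
#import <AVFoundation/AVFoundation.h>
#import <SceneKit/SceneKit.h>

// Grab the default video camera and use its live feed as the scene background.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
SCNScene *scene = [SCNScene scene];
scene.background.contents = camera; // iOS 11+: SceneKit renders the live video behind your 3D content
```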
If you already have content that you're rendering using OpenGL (or Metal, for that matter), and you want to add SceneKit content to it, use `SCNRenderer`. Set a scene on it just like you would with `SCNView`, do all your existing OpenGL rendering the way you usually do, and then call `renderAtTime:`.

Just for a proof of concept, here's a really quick hack at adding SceneKit rendering on top of RosyWriter. All the code below goes into `RosyWriterOpenGLRenderer.m`.
In the declaration of instance variables:

```objc
@interface RosyWriterOpenGLRenderer ()
{
    //... existing ivars, plus:
    SCNRenderer *_scnRenderer;
}
```
In `init`, after setting up `_oglContext`:

```objc
// Create a SceneKit renderer that draws into the existing GL context.
_scnRenderer = [SCNRenderer rendererWithContext:_oglContext options:nil];

// A throwaway scene: one cube, spinning forever.
SCNScene *scene = [SCNScene scene];
SCNNode *node = [SCNNode nodeWithGeometry:[SCNBox geometry]];
[node runAction:[SCNAction repeatActionForever:[SCNAction rotateByX:1 y:1 z:1 duration:1]]];
[scene.rootNode addChildNode:node];
_scnRenderer.scene = scene;
```
In `copyRenderedPixelBuffer`, between the existing call to `glFlush()` and the `bail:` label:

```objc
glFlush();
[_scnRenderer renderAtTime:CFAbsoluteTimeGetCurrent()];
bail:
```
That's enough to get a big spinning white cube both in the live video view and in the recorded movie output. (It's unshaded only because of the default material and lighting setup. For a real use case, set up lights, materials, cameras, and whatever geometry you want.)
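For the "real use case" part, here's a hedged sketch (untested) of the kind of lights, camera, and material setup you'd add to the `init` snippet above, reusing its `scene` and `node` variables:

```objc
// Hypothetical example: give the scene an explicit camera, a light, and a material.
SCNNode *cameraNode = [SCNNode node];
cameraNode.camera = [SCNCamera camera];
cameraNode.position = SCNVector3Make(0, 0, 10); // back away so the cube is in view
[scene.rootNode addChildNode:cameraNode];

SCNNode *lightNode = [SCNNode node];
lightNode.light = [SCNLight light];
lightNode.light.type = SCNLightTypeOmni;        // a point light so the cube is shaded
lightNode.position = SCNVector3Make(0, 5, 5);
[scene.rootNode addChildNode:lightNode];

// Color the cube instead of leaving the default flat white material.
node.geometry.firstMaterial.diffuse.contents = [UIColor redColor];
```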
![no, my desk isn't really that pink.](https://i.stack.imgur.com/TjzEn.png)