
I put some objects in AR space using ARKit and SceneKit. That works well. Now I'd like to add an additional camera (SCNCamera) that is placed elsewhere in the scene, attached to and positioned by a common SCNNode. It is oriented to show me the current scene from another (fixed) perspective.

Now I'd like to show this additional SCNCamera's feed on, for example, an SCNPlane (as the first material's diffuse contents) - like a TV screen. Of course I am aware that it will only display the SceneKit content within that camera's view, and not the rest of the ARKit image (which only the main camera can provide). A simple colored background would then be fine.

I have seen tutorials that describe how to play a video file on a virtual display in AR space, but I need a real-time camera feed from my own current scene.

I defined these objects:

let camera = SCNCamera()
let cameraNode = SCNNode()

Then in viewDidLoad I do this:

camera.usesOrthographicProjection = true
camera.orthographicScale = 9
camera.zNear = 0
camera.zFar = 100
cameraNode.camera = camera
sceneView.scene.rootNode.addChildNode(cameraNode)

Then I call my setup function to place the virtual display next to all my AR content, and to position the cameraNode as well (pointing in the direction where the objects are located in the scene):

cameraNode.position = SCNVector3(initialStartPosition.x, initialStartPosition.y + 0.5, initialStartPosition.z)

let cameraPlane = SCNNode(geometry: SCNPlane(width: 0.5, height: 0.3))
cameraPlane.geometry?.firstMaterial?.diffuse.contents = cameraNode.camera
cameraPlane.position = SCNVector3(initialStartPosition.x - 1.0, initialStartPosition.y + 0.5, initialStartPosition.z)

sceneView.scene.rootNode.addChildNode(cameraPlane)

Everything compiles and loads... The display shows up at the given position, but it stays entirely gray. Nothing is displayed at all from the SCNCamera I put in the scene. Everything else in the AR scene works well, I just don't get any feed from that camera.

Gray plane instead of camera feed

Does anyone have an approach to get this scenario working?

To visualize this better, I've added some more screenshots.

The following shows the image through the SCNCamera according to ARGeo's input. But it takes up the whole screen, instead of displaying its contents on an SCNPlane, as I need.

View from the SCNCamera

The next screenshot shows the current AR view as produced by my posted code. As you can see, the gray display plane remains gray - it shows nothing.

Current AR View

The last screenshot is a photomontage showing the expected result, as I'd like to get it.

Expected or Desired AR View

How could this be realized? Am I missing something fundamental here?


2 Answers

4 votes

After some research and sleep, I came to the following working solution (including some inexplicable obstacles):

Currently, the additional SCNCamera feed is not linked to an SCNMaterial on an SCNPlane, as was the initial idea; instead I use an additional SCNView (for the moment).

In the definitions I add another view, like so:

let overlayView   = SCNView() // (also tested with ARSCNView(), no difference)
let camera        = SCNCamera()
let cameraNode    = SCNNode()

Then, in viewDidLoad, I set up the stuff like so:

camera.automaticallyAdjustsZRange = true
camera.usesOrthographicProjection = false
cameraNode.camera                 = camera
cameraNode.camera?.focalLength    = 50
sceneView.scene.rootNode.addChildNode(cameraNode) // add the node to the default scene

overlayView.scene                    = scene // the same scene as sceneView
overlayView.allowsCameraControl      = false
overlayView.isUserInteractionEnabled = false
overlayView.pointOfView              = cameraNode // this links the new SCNView to the created SCNCamera
self.view.addSubview(overlayView)    // don't forget to add as subview

// Size and place the view on the bottom
overlayView.frame  = CGRect(x: 0, y: 0, width: self.view.bounds.width * 0.8, height: self.view.bounds.height * 0.25)
overlayView.center = CGPoint(x: self.view.bounds.width * 0.5, y: self.view.bounds.height - 175)

Then, in some other function, I place the node containing the SCNCamera at my desired position and angle:

// (exemplary)
cameraNode.position = initialStartPosition + SCNVector3(x: -0.5, y: 0.5, z: -(Float(shiftCurrentDistance * 2.0 - 2.0)))          
cameraNode.eulerAngles = SCNVector3(-15.0.degreesToRadians, -15.0.degreesToRadians, 0.0)
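The `degreesToRadians` property used above is not part of the Swift standard library, so it is presumably a small extension in the project. A minimal sketch of such a helper (an assumption, since the original post doesn't show it) could look like this:

```swift
import Foundation

// Hypothetical helper assumed by the snippet above:
// converts a floating-point degree value to radians.
extension FloatingPoint {
    var degreesToRadians: Self { self * .pi / 180 }
}
```

Note that `-15.0.degreesToRadians` parses as `-(15.0.degreesToRadians)`, which happens to give the same numeric result as `(-15.0).degreesToRadians` here.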

The result is a kind of window (the new SCNView) at the bottom of the screen, displaying the same SceneKit content as the main sceneView, viewed through the perspective of the SCNCamera and its node's position - and it does so very nicely.

Main AR view plus additional view from other perspective

In a common iOS/Swift/ARKit project, this construct produces some side effects that one may run into.

1) Mainly, the new SCNView shows the SceneKit content from the desired perspective, but the background is always the actual physical camera feed. I could not figure out how to make the background a static color while still displaying all the SceneKit content. Changing the new scene's background property also affects the whole main scene, which is actually NOT desired.

2) It might sound confusing, but as soon as the following code gets included (which is essential to make it work):

overlayView.scene = scene

the animation speed of both scenes DOUBLES! (Why?)

I corrected this by adding/changing the following property, which restores the animation speed to almost normal (default) behaviour:

// add or change this in the scene setup
scene.physicsWorld.speed = 0.5

3) If there are actions like SCNAction.playAudio in the project, none of those effects will play any longer - unless I do this:

overlayView.scene = nil

Of course, the additional SCNView then stops working, but everything else goes back to normal.
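Given side effects 2) and 3), one way to keep things manageable (a sketch using the answer's own statements, with an assumed helper name) is to bundle the attach/detach logic and the physics-speed workaround in one place, so they always change together:

```swift
import SceneKit

// Hypothetical helper: attach or detach the overlay view from the shared scene.
// The physics-speed workaround described above is applied only while attached.
func setOverlay(_ overlayView: SCNView, attachedTo scene: SCNScene, enabled: Bool) {
    if enabled {
        overlayView.scene = scene
        scene.physicsWorld.speed = 0.5   // compensates the doubled animation speed
    } else {
        overlayView.scene = nil          // restores SCNAction.playAudio effects
        scene.physicsWorld.speed = 1.0   // default speed when only one view renders
    }
}
```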

0 votes

Use this code (as a starting point) to find out how to setup a virtual camera.

Just create a default ARKit project in Xcode and copy-paste my code:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene

        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(0, 0, 1)
        cameraNode.camera?.focalLength = 70
        cameraNode.camera?.categoryBitMask = 1
        scene.rootNode.addChildNode(cameraNode)

        sceneView.pointOfView = cameraNode
        sceneView.allowsCameraControl = true
        sceneView.backgroundColor = UIColor.darkGray

        let plane = SCNNode(geometry: SCNPlane(width: 0.8, height: 0.45))
        plane.position = SCNVector3(0, 0, -1.5)

        // ASSIGN A VIDEO STREAM FROM SCENEKIT-RECORDER TO YOUR MATERIAL
        plane.geometry?.materials.first?.diffuse.contents = capturedVideoFromSceneKitRecorder
        scene.rootNode.addChildNode(plane)
    }
    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}

UPDATED:

Here's a SceneKit Recorder app that you can tailor to your needs (you don't need to write the video to disk; just use a CVPixelBuffer stream and assign it as a texture for the diffuse material).
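As a rough sketch of that last step (the class and the per-frame callback are assumptions, not from the recorder app): each time a CVPixelBuffer arrives, convert it to a CGImage via Core Image and assign it to the plane's material.

```swift
import SceneKit
import CoreImage
import CoreVideo

// Hypothetical glue code: pushes each frame delivered by a recorder
// (as a CVPixelBuffer) onto a plane's diffuse material.
final class PlaneVideoFeeder {
    private let material: SCNMaterial
    private let context = CIContext()   // reused across frames; creation is expensive

    init(material: SCNMaterial) {
        self.material = material
    }

    // Call this from the recorder's per-frame callback.
    func push(_ pixelBuffer: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
        // Update material contents on the main thread for SceneKit.
        DispatchQueue.main.async {
            self.material.diffuse.contents = cgImage
        }
    }
}
```

For better performance one would likely skip the CGImage round-trip and hand SceneKit a Metal texture instead, but the above keeps the sketch short.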


Hope this helps.