
UPDATE: My premise that "continuous image tracking" is not possible out of the box with RealityKit ARViews was incorrect. All I needed to do was correctly create the AnchorEntity for the continuously tracked reference image.

The anchor entity needs to be created using the init(anchor: ARAnchor) initializer. (The init(world: SIMD3<Float>) initializer is correct for anchors stuck to the real world, but not ones that should track the reference image.)
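As a minimal sketch of the difference (assuming `imageAnchor` is the `ARImageAnchor` delivered to the session delegate, and `arView` is the `ARView`; the helper name is hypothetical):

```swift
import ARKit
import RealityKit

func addMarkers(for imageAnchor: ARImageAnchor, in arView: ARView) {
    // Anchored to a fixed world transform: the entity stays at the position
    // where the image was detected, even if the reference image later moves.
    let worldAnchor = AnchorEntity(world: imageAnchor.transform)

    // Anchored to the ARImageAnchor itself: RealityKit keeps the entity's
    // transform in sync with the tracked image, so attached content
    // appears stuck to the image.
    let trackedAnchor = AnchorEntity(anchor: imageAnchor)

    arView.scene.addAnchor(worldAnchor)
    arView.scene.addAnchor(trackedAnchor)
}
```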

Using ARKit and RealityKit with an ARWorldTrackingConfiguration, I am trying to do "continuous image tracking" (where the reference image is tracked each frame, and virtual objects can be anchored to it, appearing to be attached to and move with the reference image). Because reference images are only recognized once in world tracking (as opposed to ARImageTrackingConfiguration, where reference images are continuously tracked as long as they are in frame), this is not possible out of the box.

To get the same results in a world tracking configuration, I am anchoring virtual objects to the reference image in the session(_:didAdd:) delegate method, and using the session(_:didUpdate:) delegate method as an opportunity to remove the ARImageAnchor after each time it is identified. This causes the reference image to be re-recognized over and over, allowing virtual objects to be anchored to the image and appear to track it frame-to-frame.

In the example below, I place two ball markers to track the position of the reference image. The first marker is placed only once, at the location where the reference image is initially detected. The second marker is repositioned each time the reference image is re-detected, so it appears to follow the image.

This works. Virtual content tracks the reference image in the ARWorldTrackingConfiguration the same way it would in an image tracking config. But while the "animation" in ARImageTrackingConfiguration is very smooth, the animation in world tracking is much less smooth, more jumpy, as if it was running at 10 or 15 frames per second. (Actual FPS as reported by .showStatistics stays near 60 FPS in both configurations.)

I assume the difference in smoothness comes from the time ARKit needs to repeatedly re-recognize the reference image and remove its anchor on each didAdd/didUpdate cycle.

I would like to know if there is a better technique to get "continuous image tracking" in an ARWorldTrackingConfiguration, and/or if there is any way I can improve the code in the delegate methods to achieve this effect.

import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!
    
    // originalImageAnchor is used to visualize the first-detected location of reference image
    // currentImageAnchor should be continuously updated to match current position of ref image
    var originalImageAnchor: AnchorEntity!
    var currentImageAnchor: AnchorEntity!
    
    let ballRadius: Float = 0.02

    override func viewDidLoad() {
        super.viewDidLoad()
        
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
             bundle: nil) else { fatalError("Missing expected asset catalog resources.") }
        
        arView.session.delegate = self
        arView.automaticallyConfigureSession = false
        arView.debugOptions = [.showStatistics]
        arView.renderOptions = [.disableCameraGrain, .disableHDR, .disableMotionBlur,
            .disableDepthOfField, .disableFaceOcclusions, .disablePersonOcclusion,
            .disableGroundingShadows, .disableAREnvironmentLighting]

        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 1  // there is one ref image named "coaster_rb"

        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        // Reference image detected. This will happen multiple times because
        // we delete ARImageAnchor in session(_:didUpdate:)
        if let imageName = imageAnchor.name, imageName == "coaster_rb" {

            // If originalImageAnchor is nil, create an anchor and
            // add a marker at initial position of reference image.
            if originalImageAnchor == nil {
                originalImageAnchor = AnchorEntity(world: imageAnchor.transform)
                let originalImageMarker = generateBallMarker(radius: ballRadius, color: .systemPink)
                originalImageMarker.position.y = ballRadius + (ballRadius * 2)
                originalImageAnchor.addChild(originalImageMarker)
                arView.scene.addAnchor(originalImageAnchor)
            }
            
            // If currentImageAnchor is nil, add an anchor and marker at the reference image position.
            // If currentImageAnchor already exists, adjust its position to match the reference image.
            if currentImageAnchor == nil {
                currentImageAnchor = AnchorEntity(world: imageAnchor.transform)
                let currentImageMarker = generateBallMarker(radius: ballRadius, color: .systemTeal)
                currentImageMarker.position.y = ballRadius
                currentImageAnchor.addChild(currentImageMarker)
                arView.scene.addAnchor(currentImageAnchor)
            } else {
                currentImageAnchor.setTransformMatrix(imageAnchor.transform, relativeTo: nil)
            }
        }
    }
    
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        // Delete reference image anchor to allow for ongoing tracking as it moves
        if let imageName = imageAnchor.name, imageName == "coaster_rb" {
            arView.session.remove(anchor: anchors[0])
        }
    }
    
    func generateBallMarker(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius),
            materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}

1 Answer


Continuous image tracking does work out of the box with RealityKit ARViews in world tracking configurations. A mistake in my original code led me to think otherwise.

Incorrect anchor entity initialization (for what I was trying to accomplish):

currentImageAnchor = AnchorEntity(world: imageAnchor.transform)

Since I wanted the content to track the ARImageAnchor assigned to the matched reference image, I should have written it like this:

currentImageAnchor = AnchorEntity(anchor: imageAnchor)

The corrected example below places one virtual marker that is fixed to the reference image's initial position, and another that smoothly tracks the reference image in a world tracking configuration:

import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!
    
    let ballRadius: Float = 0.02

    override func viewDidLoad() {
        super.viewDidLoad()
        
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        
        arView.session.delegate = self
        arView.automaticallyConfigureSession = false
        arView.debugOptions = [.showStatistics]
        arView.renderOptions = [.disableCameraGrain, .disableHDR,
            .disableMotionBlur, .disableDepthOfField,
            .disableFaceOcclusions, .disablePersonOcclusion,
            .disableGroundingShadows, .disableAREnvironmentLighting]

        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 1

        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        if let imageName = imageAnchor.name, imageName == "target_image" {
            
            // AnchorEntity(world: imageAnchor.transform) anchors virtual
            // content to a fixed position in the real world. Content anchored
            // like this will remain in place even if the reference image moves.
            let originalImageAnchor = AnchorEntity(world: imageAnchor.transform)
            let originalImageMarker = makeBall(radius: ballRadius, color: .systemPink)
            originalImageMarker.position.y = ballRadius + (ballRadius * 2)
            originalImageAnchor.addChild(originalImageMarker)
            arView.scene.addAnchor(originalImageAnchor)

            // AnchorEntity(anchor: imageAnchor) results in anchoring
            // virtual content to the ARImageAnchor that is attached to the
            // reference image.  Content anchored like this will appear
            // stuck to the reference image.
            let currentImageAnchor = AnchorEntity(anchor: imageAnchor)
            let currentImageMarker = makeBall(radius: ballRadius, color: .systemTeal)
            currentImageMarker.position.y = ballRadius
            currentImageAnchor.addChild(currentImageMarker)
            arView.scene.addAnchor(currentImageAnchor)
        }
    }
    
    func makeBall(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius),
            materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}