2
votes

I'm using ARKit with SceneKit and would like to let my 3D objects physically interact with the reconstructed scene created by devices with LiDAR sensors (config.sceneReconstruction = .mesh). For example, having a virtual ball bounce off of the geometry of the reconstructed scene.

In RealityKit, this seems to be possible using sceneUnderstanding:

arView.environment.sceneUnderstanding.options.insert(.physics)

How can I achieve the same thing when using SceneKit?

1 Answer

2
votes

As far as I know, there is no built-in support for this in SceneKit. However, you can fairly easily put together a custom solution using the ARMeshAnchor instances created by ARKit.

First, configure ARKit to enable scene reconstruction:

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
} else {
    // Handle device that doesn't support scene reconstruction
}

// and enable physics visualization for debugging
sceneView.debugOptions = [.showPhysicsShapes]

Then, in your ARSCNViewDelegate, implement renderer(_:nodeFor:) to create a SceneKit node for each newly added ARMeshAnchor:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let meshAnchor = anchor as? ARMeshAnchor else {
        return nil
    }

    let geometry = createGeometryFromAnchor(meshAnchor: meshAnchor)

    // Optionally hide the node from rendering as well
    geometry.firstMaterial?.colorBufferWriteMask = []

    let node = SCNNode(geometry: geometry)

    // Attach a static physics body so other bodies collide with the mesh.
    // You must use concavePolyhedron here, since the mesh is not convex!
    node.physicsBody = SCNPhysicsBody(type: .static, shape: SCNPhysicsShape(geometry: geometry, options: [.type: SCNPhysicsShape.ShapeType.concavePolyhedron]))

    return node
}

// Taken from https://developer.apple.com/forums/thread/130599
func createGeometryFromAnchor(meshAnchor: ARMeshAnchor) -> SCNGeometry {
    let meshGeometry = meshAnchor.geometry
    let vertices = meshGeometry.vertices
    let normals = meshGeometry.normals
    let faces = meshGeometry.faces
    
    // Use the MTLBuffers that ARKit gives us directly as geometry sources
    let vertexSource = SCNGeometrySource(buffer: vertices.buffer, vertexFormat: vertices.format, semantic: .vertex, vertexCount: vertices.count, dataOffset: vertices.offset, dataStride: vertices.stride)
    
    let normalsSource = SCNGeometrySource(buffer: normals.buffer, vertexFormat: normals.format, semantic: .normal, vertexCount: normals.count, dataOffset: normals.offset, dataStride: normals.stride)
    
    // Copy the face index bytes, as the geometry element needs its own copy
    let faceData = Data(bytes: faces.buffer.contents(), count: faces.buffer.length)
    
    // Create the geometry element from the face indices
    let geometryElement = SCNGeometryElement(data: faceData, primitiveType: toSCNGeometryPrimitiveType(faces.primitiveType), primitiveCount: faces.count, bytesPerIndex: faces.bytesPerIndex)
    
    return SCNGeometry(sources: [vertexSource, normalsSource], elements: [geometryElement])
}

// Maps ARKit's primitive type to the corresponding SceneKit type
func toSCNGeometryPrimitiveType(_ type: ARGeometryPrimitiveType) -> SCNGeometryPrimitiveType {
    switch type {
    case .line: return .line
    case .triangle: return .triangles
    @unknown default: fatalError("Unknown ARGeometryPrimitiveType")
    }
}

Finally, update the node's geometry and physics shape whenever the reconstructed mesh changes, in the ARSCNViewDelegate's renderer(_:didUpdate:for:) function:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let meshAnchor = anchor as? ARMeshAnchor else {
        return
    }
    
    let geometry = createGeometryFromAnchor(meshAnchor: meshAnchor)
    geometry.firstMaterial?.colorBufferWriteMask = []

    node.geometry = geometry

    node.physicsBody?.physicsShape = SCNPhysicsShape(geometry: geometry, options: [.type: SCNPhysicsShape.ShapeType.concavePolyhedron])
}

Any physics objects you create in SceneKit should now be able to interact with the reconstructed scene.
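For example, to get the bouncing ball from the question, you can drop a dynamic body into the scene and let it collide with the static mesh nodes created above. A minimal sketch (the `makeBallNode` helper and its parameter defaults are my own, not part of ARKit or SceneKit):

```swift
import SceneKit

// Create a small dynamic ball that falls under gravity and bounces
// off the static reconstructed-mesh nodes created above.
func makeBallNode(at position: SCNVector3, radius: CGFloat = 0.05) -> SCNNode {
    let sphere = SCNSphere(radius: radius)
    let node = SCNNode(geometry: sphere)
    node.position = position

    // .dynamic means the physics simulation moves this body
    node.physicsBody = SCNPhysicsBody(type: .dynamic,
                                      shape: SCNPhysicsShape(geometry: sphere, options: nil))
    node.physicsBody?.restitution = 0.8 // make it bouncy
    return node
}
```

You could then drop a ball half a meter in front of the world origin with something like `sceneView.scene.rootNode.addChildNode(makeBallNode(at: SCNVector3(0, 0.5, -0.5)))`; it should fall and bounce off the reconstructed geometry.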
