3
votes

Before iOS 13 (i.e. on iOS 11 and 12), an SCNTechnique applied to an ARSCNView with a multipass setup that first draws the entire scene into a texture (via DRAW_SCENE) would also affect the ARKit scene background (the camera feed). On iOS 13 this no longer seems to be the case: fragment shaders applied to the resulting texture do not alter the camera feed, only the SceneKit content (e.g. nodes). With a custom scene background such as an image everything behaves as expected, and the same is true when working with a plain SCNView.

Setting clearColor to sceneBackground in the colorStates does not help, and neither does disabling environment texturing or people occlusion.
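For reference, a minimal sketch of the kind of two-pass technique described above, written as an SCNTechnique dictionary rather than a plist. The pass and shader names (`pass_scene`, `pass_filter`, `filter_vertex`, `filter_fragment`) are assumptions for illustration; the `colorStates` entry is the `clearColor: sceneBackground` setting mentioned above.

```swift
import SceneKit

// First pass renders the whole scene into a color texture (DRAW_SCENE);
// second pass runs a fragment shader over that texture (DRAW_QUAD).
let techniqueDict: [String: Any] = [
    "passes": [
        "pass_scene": [
            "draw": "DRAW_SCENE",
            "outputs": ["color": "color_scene"],
            "colorStates": [
                "clear": true,
                "clearColor": "sceneBackground"  // the setting discussed above
            ]
        ],
        "pass_filter": [
            "draw": "DRAW_QUAD",
            "metalVertexShader": "filter_vertex",     // assumed shader names
            "metalFragmentShader": "filter_fragment",
            "inputs": ["colorSampler": "color_scene"],
            "outputs": ["color": "COLOR"]
        ]
    ],
    "sequence": ["pass_scene", "pass_filter"],
    "targets": [
        "color_scene": ["type": "color"]
    ]
]
let technique = SCNTechnique(dictionary: techniqueDict)
```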

Does anyone know a workaround for this? I would like to filter the entire rendered ARKit scene to change the mood of my AR experience. Attached is a screenshot of my technique plist.

Thank you!

[screenshot: SCNTechnique plist]


1 Answer

3
votes

In iOS 13, ARKit introduced support for camera grain. When using ARSCNView this feature is enabled automatically, and its implementation can conflict with a custom SCNTechnique. Setting rendersCameraGrain to false on the view should solve the issue.
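A minimal sketch of the fix, assuming `sceneView` is your ARSCNView and `techniqueDict` is your existing technique dictionary (both names are placeholders):

```swift
import ARKit
import SceneKit

// Disable ARKit's automatic camera-grain compositing (iOS 13+),
// which can interfere with a custom SCNTechnique on the view.
if #available(iOS 13.0, *) {
    sceneView.rendersCameraGrain = false
}

// Apply your multipass technique as before.
sceneView.technique = SCNTechnique(dictionary: techniqueDict)
```

`rendersCameraGrain` defaults to `true` on devices that support it, so the technique plist itself does not need to change.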