Displays the depth values received by the front-facing TrueDepth camera. The depth values are mapped through a heat-map gradient and multiplied with the camera's color image. The resulting image is then used as the background of the augmented reality SceneKit scene.
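The heat-map step can be sketched as a plain function that maps a normalized depth value to an RGB color. This is an illustrative sketch, not the project's actual gradient: the function name and the red-to-green-to-blue color stops are assumptions.

```swift
import Foundation

// Hypothetical heat-map mapping: normalized depth (0 = near, 1 = far)
// blended along a red → green → blue gradient. Illustrative only.
func heatMapColor(forNormalizedDepth depth: Double) -> (r: Double, g: Double, b: Double) {
    let t = min(max(depth, 0), 1)  // clamp to [0, 1]
    if t < 0.5 {
        // Near half: blend from red (near) toward green.
        let k = t / 0.5
        return (r: 1 - k, g: k, b: 0)
    } else {
        // Far half: blend from green toward blue (far).
        let k = (t - 0.5) / 0.5
        return (r: 0, g: 1 - k, b: k)
    }
}
```

In the app this mapping would be applied per pixel to the depth buffer before multiplying with the color image.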
This example builds upon Apple's official Creating Face-Based AR Experiences demo and is free to use. The original demo code is available here.
The app adopts ARSessionDelegate and receives each frame through the session(_ session: ARSession, didUpdate frame: ARFrame) method. frame.displayTransform(for:viewportSize:) supplies a matrix that is used to properly align the projection, and the raw depth buffer is wrapped in a Core Image image with CIImage(cvImageBuffer: depthBuffer).
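The delegate flow might look roughly like this. It is a sketch under assumptions: the view-controller and outlet names are illustrative, the orientation is hard-coded to portrait, and `heatMap(from:)` stands in for the gradient-and-multiply step, which is not shown.

```swift
import ARKit
import CoreImage

class FaceDepthViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var sceneView: ARSCNView!  // assumed outlet name

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Depth arrives less often than color; skip frames without it.
        guard let depthData = frame.capturedDepthData else { return }

        // Wrap the raw depth buffer in a CIImage for further processing.
        let depthImage = CIImage(cvImageBuffer: depthData.depthDataMap)

        // Align the image with the current viewport size and orientation.
        let viewportSize = sceneView.bounds.size
        let transform = frame.displayTransform(for: .portrait, viewportSize: viewportSize)
        let aligned = depthImage.transformed(by: transform)

        // Use the processed image as the SceneKit scene background.
        sceneView.scene.background.contents = heatMap(from: aligned)
    }

    // Hypothetical placeholder: the real implementation would apply the
    // heat-map gradient and multiply with frame.capturedImage.
    func heatMap(from image: CIImage) -> CIImage {
        return image
    }
}
```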
Depth frames are delivered less frequently than color frames, so the camera feedback is not as smooth as a normal photo preview. I haven't found a way to adjust the camera configuration to increase the rate at which depth images are delivered.