The tablet's lidar scanner works with an ultra-wide camera, motion sensors, studio-quality mics and apps to provide 'cutting-edge depth-sensing capabilities' that were 'never before possible on any mobile device', Apple has said.
The custom-designed lidar scanner uses direct time-of-flight, measuring how long emitted light takes to reflect back from surfaces up to 5m away, both indoors and out.
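The direct time-of-flight principle can be illustrated with a few lines of arithmetic. This is a toy sketch of the underlying physics only, not Apple's implementation; the timing value used below is my own illustrative figure.

```python
# Toy illustration of direct time-of-flight ranging (not Apple's
# implementation): distance = speed of light * round-trip time / 2.
C = 299_792_458.0  # speed of light in metres per second


def distance_from_round_trip(seconds: float) -> float:
    """Distance to a surface given the photon round-trip time."""
    return C * seconds / 2.0


# A ~5 m range implies a round trip of roughly 33 nanoseconds,
# which is the kind of timescale a direct-ToF sensor must resolve.
print(round(distance_from_round_trip(33.36e-9), 2))  # -> 5.0
```

The takeaway is how short the intervals are: ranging at room scale means timing photons over tens of nanoseconds.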
New depth frameworks in Apple's iPadOS operating system combine depth points measured by the lidar with data from both cameras and the motion sensors, which is then enhanced by computer vision algorithms on the A12Z Bionic chip for a more detailed understanding of a scene. The tight integration of these elements enables a 'whole new class of AR experiences', the company says.
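One simple way to picture that kind of sensor fusion is combining a sparse but accurate lidar depth map with a dense but noisier camera-derived estimate. The sketch below is my own minimal illustration of that idea, not Apple's algorithm: trust the lidar wherever it has a sample, and fall back to the camera estimate elsewhere.

```python
# Toy depth-fusion sketch (illustrative only, not Apple's method):
# lidar gives sparse, accurate samples; the camera gives a dense but
# noisier estimate. Prefer lidar where available, camera otherwise.
def fuse_depth(lidar: dict, camera: dict) -> dict:
    """lidar: sparse {pixel: depth}; camera: dense {pixel: depth}."""
    return {px: lidar.get(px, camera[px]) for px in camera}


lidar = {(0, 0): 1.2, (2, 2): 3.4}                    # metres, sparse
camera = {(0, 0): 1.35, (1, 1): 2.0, (2, 2): 3.1}     # metres, dense
print(fuse_depth(lidar, camera))
# -> {(0, 0): 1.2, (1, 1): 2.0, (2, 2): 3.4}
```

Real pipelines go much further (interpolation, confidence weighting, motion compensation), but the division of labour is the same: sparse accurate ranging anchors a dense visual estimate.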
Every existing app built with ARKit – Apple's AR developer framework – will get instant AR placement, improved motion capture and improved people occlusion. Using the latest update to ARKit with a new Scene Geometry API, developers can use the new lidar scanner to 'unleash scenarios never before possible'.
One of the applications the lidar scanner targets is gaming. For example, the video game Hot Lava is launching a new AR mode later this year that will transform living rooms into a lava-filled obstacle course. The technology allows the game to interact with the real world, blurring the lines between the physical and virtual worlds.