ARCore is Google’s SDK for building augmented reality apps, and unlike Google’s failed Project Tango experiment, it doesn’t require specialized hardware to operate.
The ARCore software development kit allows phones to place virtual objects in mixed reality modes using processors, motion sensors, light sensors, and a camera.
All your phone needs is a single RGB camera and an IMU that provides accurate gyroscope and accelerometer readings along with comprehensive calibration data.
With these primary sensors and a single camera, ARCore’s Depth API can create depth maps that enable features like occlusion, more realistic physics, path mapping, and surface interaction.
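To make the occlusion idea concrete, here is a minimal sketch in Python of how a depth map lets a renderer hide a virtual object behind real geometry. The function name and the toy values are illustrative, not part of the ARCore API: the real SDK exposes the depth image on the Android side, but the per-pixel comparison works the same way.

```python
import numpy as np

def occlusion_mask(scene_depth_mm, virtual_depth_mm):
    """Return a boolean mask: True where the virtual object is visible.

    scene_depth_mm   -- per-pixel distance to the real scene (millimeters)
    virtual_depth_mm -- per-pixel distance to the rendered virtual object
    The object is occluded wherever the real scene is closer to the camera.
    """
    return virtual_depth_mm < scene_depth_mm

# Toy 2x2 depth map: a real wall at 1500 mm, except one pixel at 800 mm.
scene = np.array([[1500, 1500],
                  [ 800, 1500]], dtype=np.uint16)

# A virtual object placed 1000 mm from the camera at every pixel.
virtual = np.full((2, 2), 1000, dtype=np.uint16)

mask = occlusion_mask(scene, virtual)
# The object is hidden only at the pixel where the scene (800 mm) is nearer.
```

The same comparison, done per fragment in a shader with the Depth API’s depth texture, is what produces convincing occlusion in AR scenes.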
The experience is even more immersive if you can bring another camera into the mix, and it looks like that’s exactly what Google plans to do with the latest version of its augmented reality SDK.
The changelog for ARCore SDK 1.23 mentions dual-camera stereo depth across supported devices.
Interestingly, the ARCore 1.23 release notes on GitHub don’t mention dual-camera stereo depth support, but the release notes on the Google Developers page do.
The release notes on the Google Developers page point to Google’s list of ARCore-supported devices, which was recently updated to clarify that dual-camera support will be rolling out to the Pixel 4 and Pixel 4 XL in the coming weeks.
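What dual-camera stereo depth means in practice is triangulation: for a rectified camera pair, a point’s depth follows from how far it shifts between the two images. A minimal sketch of that standard relation, Z = f·B/d, is below; the focal length and baseline figures are made-up illustrative numbers, not actual Pixel 4 specifications.

```python
def stereo_depth_mm(focal_px, baseline_mm, disparity_px):
    """Depth of a point from a rectified stereo pair.

    focal_px     -- focal length in pixels
    baseline_mm  -- distance between the two camera centers (millimeters)
    disparity_px -- pixel shift of the same point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Illustrative numbers only: a 1000 px focal length and a 10 mm baseline.
depth = stereo_depth_mm(focal_px=1000, baseline_mm=10, disparity_px=5)
# depth == 2000.0 mm, i.e. the point is about 2 m from the camera
```

The inverse relationship between disparity and depth is also why calibration matters so much: small errors in the assumed baseline or focal length translate directly into depth errors, which is consistent with the per-device rollout described above.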
Google’s Pixel 4 and Pixel 4 XL, released in 2019, are the only Pixel phones with a secondary telephoto camera; the Pixel 4a 5G and Pixel 5 feature an ultra-wide secondary camera instead.
Given that adding support for a second camera likely requires a significant amount of calibration work, some current devices may never get dual-camera stereo depth maps.
However, many phones have ToF sensors that improve the quality of the depth-mapping experience by reducing scan time and improving detection.