It’s been a few months since Apple’s 2020 iPad Pro became available, but relatively few AR apps have taken advantage of the device’s new lidar depth sensor so far. That rear sensor could be a sneak preview of the tech that makes its way into the next iPhones, which means that early iPad Pro apps could offer a window into what territory Apple ventures into next. With its developer conference approaching next week, Apple will likely place great emphasis on new AR tools. Occipital’s new house-scanning app, Canvas, may be the clearest indicator of what’s to come in the next wave of depth-enhanced apps.
Occipital is a Boulder, Colorado-based company that’s made depth-sensing hardware for years, similar to the type of tech that used to appear on some phones. Its sensors could measure depth in a room and, combined with image capture, build a 3D-mapped mesh of a space. In the years since, companies such as 6D.ai have figured out ways to measure and map space without any extra depth-sensing hardware at all. Occipital has since moved to a software-based approach for its scanning tools.
Occipital’s latest Canvas app uses regular iPhone and iPad cameras to mesh and map rooms quickly. But the company is also adding in support for Apple’s iPad Pro lidar sensor because of its extra depth-sensing range. The company sees the extra depth data adding to the tools it already has.
It’s likely that few developers have targeted the iPad Pro’s distinct lidar depth sensor because the 2020 iPad Pro models are a relatively small subset of all of Apple’s AR-ready devices. As that depth-sensing tech expands across the line to phones and other iPads, it’s likely that what Occipital is pursuing will happen with other companies, too.
“It’s a reasonably good assumption that you’re going to start seeing this on higher end iPhones as well,” Occipital’s Product Manager Alex Schiff says, referring to Apple’s next iPhones later this year, which are reported to have depth-sensing rear camera capabilities much like the iPad Pro.
You can look at a Canvas scan below, embedded in this story. Room scans can sometimes look rough, but the data could get upgraded over time, and scans can be converted into CAD models for professional designers to use.
The iPad Pro’s lidar isn’t a camera: it’s a separate camera-free sensor, and the depth maps it collects are more like arrays of points in space, forming a mesh or 3D map. Occipital’s app can also use camera data to stretch photos into a simulated 3D mesh, but the processes are separate. Future AR cameras, however, could blend both functions more. And Occipital sees the depth data as being useful in training cameras to compute where depth might be, too.
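To make the idea of "arrays of points in space" concrete, here is a minimal, hypothetical sketch of how a grid of depth readings can be turned into 3D points using a simple pinhole-camera model. The function name and parameters are illustrative assumptions, not Occipital's or Apple's actual API; real lidar pipelines on the iPad Pro hand apps a reconstructed mesh directly.

```python
def unproject_depth(depth_map, fx, fy, cx, cy):
    """Turn a 2D grid of depth readings (meters) into 3D points.

    Illustrative pinhole-camera sketch: fx, fy are focal lengths in
    pixels and (cx, cy) is the principal point. All names here are
    hypothetical, not a real sensor SDK.
    """
    points = []
    for v, row in enumerate(depth_map):
        for u, z in enumerate(row):
            if z is None:  # no lidar return at this pixel
                continue
            # Back-project the pixel (u, v) at depth z into camera space
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# A toy 2x2 "depth map" reading 1 meter everywhere
cloud = unproject_depth([[1.0, 1.0], [1.0, 1.0]], fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```

A point cloud like `cloud` is the raw material; connecting neighboring points into triangles is what produces the kind of mesh the article describes.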
“Instead of just recognizing basic features such as a floor or wall, you’re able to see the whole geometry of the scene,” Occipital’s VP of Product Anton Yakubenko says. “It also accounts for semantic recognition, joining the geometry and image information. It’s not perfect, but it could enable a new generation of applications which are not only about geometrical information, but also about semantic information. And that’s where we’re moving.” Occipital converts home scans into CAD files, but “the next step is to know that this piece of a CAD model is a window or a door or a baseboard.”
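The idea of joining geometry with semantic labels can be sketched with a toy rule that classifies a planar patch of a room mesh from its orientation and height. This is purely an illustration of the concept under assumed names; Occipital's actual pipeline would fuse camera imagery with the geometry using learned models, not hand-written rules like this.

```python
def label_patch(normal, height_m):
    """Toy semantic label for a planar patch of a scanned room mesh.

    Hypothetical sketch: normal is a unit (x, y, z) vector for the
    patch, with y pointing up; height_m is its height above the floor.
    """
    nx, ny, nz = normal
    if abs(ny) > 0.9:  # roughly horizontal surface
        return "floor" if height_m < 0.1 else "ceiling_or_tabletop"
    return "wall"  # roughly vertical surface

labels = [
    label_patch((0.0, 1.0, 0.0), 0.0),  # horizontal patch at floor level
    label_patch((1.0, 0.0, 0.0), 1.2),  # vertical patch partway up
]
```

Attaching labels like these to pieces of a CAD export is the kind of "window or door or baseboard" tagging Yakubenko describes as the next step.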
Schiff sees the scans becoming more valuable as algorithms keep improving: “You can capture a home and have that data forever to revisit, and that data is going to become better or more interesting as you reprocess it with different kinds of algorithms that become available over time. It’s the revisitability of the data that is the thing that makes 3D mapping of space different than AR.”
As someone who’s still staying at home and avoiding home repair visits, I can see the appeal of scanning my home’s rooms to get analysis and expert opinions remotely. Apple is likely to announce more major changes to its AR tools at its developer conference next week, but doing more with the lidar sensor’s capabilities, and how they relate to camera data, could be the next step for iPhones.