Find out what lidar can do on the iPhone 12 with this 3D scanning app




The Canvas app scans homes in 3D using the iPhone 12 Pro's lidar. Expect a lot more apps like this.

Occipital

The iPhone 12 Pro's depth-scanning lidar sensor looks set to open up a lot of possibilities for 3D scanning apps on phones. A new one designed for home scanning, called Canvas, uses lidar for extra precision and detail. But the app will also work on non-Pro iPhones, going back as far as the iPhone 8.

Canvas's approach indicates how lidar might work in iPhone 12 Pro apps: it can add extra precision and detail to processes that are already possible by other means on phones and tablets without lidar.

Read more: The iPhone 12’s lidar technology does more than improve photos. Check out this awesome party trick

Canvas, made by Boulder-based company Occipital, originally launched earlier this year for the iPad Pro to take advantage of its lidar scanning. When I saw a demo of its capabilities at the time, I took it as a sign of how Apple's depth-sensing technology could be applied to home improvement and home measurement apps. The updated app takes clearer, sharper scans.

Since lidar-equipped iPhones made their debut, a handful of optimized apps have emerged that offer 3D scanning of objects, larger-scale spatial scanning via photography (called photogrammetry) and augmented reality that can blend mesh maps of spaces with virtual objects. But the sample scan from Occipital's Canvas app on the iPhone 12 Pro, embedded below, looks sharper than the 3D scanning apps I've played with so far.

Apple's iOS 14 gives developers more raw access to the iPhone's lidar data, according to Occipital product vice presidents Alex Schiff and Anton Yakubenko. That has allowed Occipital to build its own algorithms to make the best use of Apple's lidar depth map. It could also let Occipital apply that depth-mapping data to future improvements of its app for phones without lidar.

Scanning 3D space without a dedicated lidar or time-of-flight depth sensor is possible, and companies like 6d.ai (acquired by Niantic) have already done it. But Schiff and Yakubenko say lidar still offers a faster and more precise upgrade over that approach. The iPhone 12 version of Canvas captures more detailed scans than the first version on the iPad Pro earlier this year, mainly because of iOS 14's deeper access to lidar data, according to Occipital. The latest lidar-enabled version is accurate to within 1%, while a non-lidar scan is accurate to within 5% (making the iPhone 12 Pro literally a pro upgrade for those who could use the extra accuracy).
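To put those percentages in concrete terms, here is a quick back-of-the-envelope sketch. The 1% and 5% figures come from Occipital; the 5-meter wall is an arbitrary example of my own, not from the company.

```python
def worst_case_error_cm(length_m: float, accuracy_pct: float) -> float:
    """Worst-case measurement error, in centimeters, for a scanned length."""
    return length_m * 100 * accuracy_pct / 100

wall_m = 5.0  # hypothetical 5-meter wall
print(f"lidar scan (1% accuracy):     about {worst_case_error_cm(wall_m, 1.0):.0f} cm off")
print(f"non-lidar scan (5% accuracy): about {worst_case_error_cm(wall_m, 5.0):.0f} cm off")
```

For a room-sized measurement, that is the difference between an error of a few centimeters and one of a quarter meter, which matters if the scan feeds a CAD model for renovation work.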

Yakubenko says that, based on Occipital's earlier measurements, Apple's iPad Pro lidar hardware captures 574 depth points per frame in a scan, but in iOS 14 developers can get depth maps of up to 256×192 points. Apple generates that extra detail using AI and camera data.

Canvas room scans can be converted into workable CAD models, a process that takes around 48 hours, but Occipital is also working on converting scans more instantly, and on adding semantic data (such as recognizing doors, windows and other room details) using AI.

As more and more 3D scans and 3D data begin to live on iPhones and iPads, common formats for sharing and editing those files will also make sense. While iOS 14 uses the USDZ file format for 3D files, Occipital has its own format for its more detailed scans, and can output .rvt, .ifc, .dwg, .skp and .plan files when converting to CAD models. At some point, 3D scans could become as standardized as PDFs. We're not quite there yet, but we may need to get there soon.
