The iPhone 12 Pro has a lot of cool features. We’ve all heard about 5G connectivity, even if it isn’t available everywhere yet, and Apple fans will be familiar with MagSafe as well. But what is a LiDAR scanner?
Short for “Light Detection and Ranging”, this sensor bounces infrared light off surfaces to create 3D maps. Like so many futuristic technologies, it was developed by the US military in the 1960s. Now Apple is touting it as the next big thing for mobile by including it in the iPhone 12 Pro and the iPhone 12 Pro Max; previously it appeared only in some iPad Pro models.
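The ranging principle behind both ToF sensors and LiDAR fits in a few lines: time how long a light pulse takes to bounce back, then halve the distance light travels in that interval. A minimal sketch (the pulse time below is an illustrative made-up value):

```python
# Speed of light in meters per second.
C = 299_792_458

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a surface, given the round-trip time of a light pulse."""
    return C * t_seconds / 2

# A pulse returning after roughly 13.34 nanoseconds came from a surface
# about two meters away.
print(round(distance_from_round_trip(13.34e-9), 2))  # → 2.0
```

A real scanner fires thousands of such pulses across a scene; the difference Apple emphasizes is how many points it measures at once, not the underlying math.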
This ignores the fact that Samsung, LG, and Huawei are already using it. Granted, time-of-flight (ToF) sensors only shine a single beam, and Apple’s LiDAR scanner uses multiple pulses to create a more detailed image. But Google has been experimenting with this exact technology for over a decade.
LiDAR has been helping the tech giant’s self-driving cars navigate since 2009, while its 2014 Project Tango was a first attempt to bring augmented reality (AR) to phones. That effort led to the launch of two Android phones with depth-sensing systems: the Lenovo Phab 2 Pro and the Asus ZenFone AR.
Google eventually ditched Tango in favor of ARCore’s computer-vision approach, which could do the same job without specialized hardware.
But if Apple generates as much hype around LiDAR as it hopes, Android makers won’t want to be left behind. They may pressure Google to build better Android support for the sensor, and perhaps Google should go even further. Here are a few ways Android manufacturers could use this technology.
1. AR out of this world
Apple says the iPhone 12 Pro will support powerful new augmented reality experiences thanks to the LiDAR scanner. Apps can build a more detailed map of a room in less time, and so can integrate virtual objects more convincingly, for example by hiding them behind real ones.
Tellingly, Google added the same functionality to its ARCore platform in June with just a software update. It also has years more experience in this area than Apple, which has only recently begun investing heavily in AR.
So if Android were to embrace LiDAR, it could combine the best of both worlds, feeding more precise depth data into already mature algorithms. That could lead to more immersive mobile AR and VR games, as well as all kinds of new apps.
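The occlusion trick described above, hiding virtual objects behind real ones, reduces to a per-pixel depth comparison: draw the virtual object only where it is closer to the camera than the real surface the sensor measured. A toy sketch (the function and object names are hypothetical, purely for illustration):

```python
def composite_pixel(real_depth_m, virtual_depth_m, real_color, virtual_color):
    """Show the virtual object only where it sits in front of the real scene."""
    return virtual_color if virtual_depth_m < real_depth_m else real_color

# A virtual ball rendered 1.5 m away, partly behind a real chair at 1.2 m.
print(composite_pixel(1.2, 1.5, "chair", "ball"))  # the real chair occludes it
print(composite_pixel(3.0, 1.5, "wall", "ball"))   # the ball appears in front
```

Frameworks like ARCore and ARKit run this comparison for every pixel each frame; a denser, faster depth map straight from a LiDAR sensor is what makes the result look convincing.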
2. Take more eye-catching selfies
Apple’s other big selling point for the LiDAR scanner is that it helps the iPhone autofocus up to six times faster in low light. Depth detection also improves Night mode portraits for selfies after dark.
Maybe it could do the same for Android. As with AR, Google leans on machine learning to improve its images: Night Sight on Android 11 works amazingly well in near-total darkness. But these algorithmically enhanced selfies can still come out noisy in low light.
And its bokeh simulator can do weird things when there is more than one face in the frame. It’s no wonder, then, that many other Android manufacturers have added ToF sensors to improve their shots. Full LiDAR could go even further.
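Depth data is exactly what makes portrait segmentation robust with multiple faces: instead of guessing the subject from the image alone, the camera can treat every pixel beyond some cutoff distance as background to blur, no matter how many people share the frame. A toy one-dimensional sketch (function name and cutoff are illustrative assumptions):

```python
def mask_background(depths_m, cutoff_m):
    """True where a pixel is farther than the cutoff and should be blurred."""
    return [d > cutoff_m for d in depths_m]

# One row of pixels: two faces at ~0.5 m and ~0.7 m, a wall at 3 m behind them.
row = [0.5, 0.5, 0.7, 3.0, 3.0]
print(mask_background(row, 1.0))  # → [False, False, False, True, True]
```

Both subjects stay sharp because the mask depends only on measured distance, which is precisely where a denser LiDAR depth map would beat a single-beam ToF reading.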
3. Map the world inside and out
The original use of LiDAR was mapping: Apollo 15 astronauts used it to study the surface of the Moon in 1971. Just as Android users filled in Street View’s gaps by uploading photo spheres, their depth scans could improve the 3D terrain in Google Earth.
Better yet, they could make Google Earth VR (yes, it really does exist) more immersive. Project Tango, though, focused on mapping interior spaces, which opens up intriguing possibilities: Google Maps could let you admire the majesty of the Sistine Chapel in 3D as if you were really there, give you step-by-step directions to find your airport gate faster, or show whether your stadium seats are behind a pillar before you buy tickets.
4. Turbo-charge contactless controls
The Pixel 4’s Motion Sense was an innovative idea. Using built-in radar, you could control the phone, from muting calls to skipping songs, with just a wave of your hand. It could also detect your presence to unlock the device faster than Apple’s Face ID. Sadly, the technology never caught on and was ditched for the Pixel 5.
But just as LiDAR helps self-driving cars detect obstacles with high accuracy, it could also help phones detect hand gestures. And since LiDAR sensors are cheaper than miniature radars, Motion Sense could be deployed on more Android devices beyond Google’s flagship. With a larger user base, app developers might be more inclined to embrace it and invent new contactless controls.
5. Smarter home security
Amazon and Google have been locked in a war to dominate the smart home for years. It goes beyond smart speakers like Echo and Home. Amazon also owns Ring and Google has Nest. Just like with photography, LiDAR could dramatically improve night vision for Nest security cameras.
Nest could also use the sensor technology to launch a standalone rival to Ring’s new home security drone; Tango was already helping quadcopters fly autonomously back in 2014. That in turn could open the door to a whole slew of mainstream Android robots.
6. Bring back Google Glass
Google promised the world a head-mounted display in 2012. But despite partnerships with brands like Ray-Ban and Oakley, technical hurdles and privacy concerns have kept a consumer-friendly Glass model from materializing.
With new competition from Facebook’s Project Aria and Apple’s rumored smart specs, Glass may well come back. After all, Google Lens’s visual search and Assistant’s voice commands have essentially matured all of the functionality Glass was meant to have. LiDAR could give it a competitive edge by improving the performance of its AR display.