Google Adds Persistent Cloud Anchor Content & Extends Augmented Faces to iOS « Mobile AR News :: Next Reality




With Android 10 out in the wild (at least for mobile devices that get quick updates) and the public release of iOS 13 arriving September 19, Google published an update to ARCore on Thursday that adds some fantastic new capabilities to the platform.

For its Cloud Anchors API for multi-user augmented reality experiences, Google has added support for persistent content, allowing users to "save" AR content in real-world locations for others to discover and interact with.

"Imagine working together on a home redesign project over the course of a year, leaving AR notes for your friends around an amusement park, or hiding AR objects at specific places around the world to be discovered by others," said Christina Tong, product manager for augmented reality at Google, in a blog post.

Image via Sybo TV/YouTube

One of the first applications to offer the new persistent Cloud Anchors capability is Mark AR, an app from developers Sybo and iDreamSky that gives users the freedom to create and share AR content with other people in public spaces.

"Reliably anchoring AR content for every use case, regardless of surface, distance, and time, pushes the boundaries of computation and computer vision because the real world is diverse and always changing," Tong said. "By enabling a 'save button' for AR, we're taking an important step toward bridging the digital and physical worlds to expand the ways AR can be useful in our day-to-day lives."

Alas, support for persistent content is limited at launch, but Google is accepting applications from developers interested in testing the capability.

In addition, the ARCore team has made some improvements to Cloud Anchors that enhance the multi-user setup process through more robust anchor creation and visual processing.

"Now, when creating an anchor, more angles across larger areas of a scene can be captured, creating a more robust 3D feature map," said Tong. "Once the map is created, the visual data used to create it is deleted and only anchor IDs are shared with other devices to be resolved. Moreover, multiple anchors in a scene can now be resolved simultaneously, reducing the time needed to launch a shared AR experience."
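The workflow Tong describes — host an anchor, discard the visual data, share only the anchor ID, and resolve it on other devices — maps onto ARCore's Cloud Anchors API roughly as follows. This is a minimal sketch, assuming an Android app with an active ARCore `Session`; the method names come from the ARCore SDK as it shipped in 2019, while the `shareWithPeers` callback stands in for app-specific networking and is hypothetical.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Anchor.CloudAnchorState
import com.google.ar.core.Session

// Host: upload visual feature data for a local anchor. Once hosting
// succeeds, only the cloud anchor ID needs to leave the device.
fun hostAnchor(session: Session, localAnchor: Anchor): Anchor =
    session.hostCloudAnchor(localAnchor)

// Poll each frame until hosting finishes, then share the ID.
// shareWithPeers is a hypothetical app-level networking hook.
fun checkHosting(hosted: Anchor, shareWithPeers: (String) -> Unit) {
    when (hosted.cloudAnchorState) {
        CloudAnchorState.SUCCESS -> shareWithPeers(hosted.cloudAnchorId)
        CloudAnchorState.TASK_IN_PROGRESS -> { /* still uploading */ }
        else -> { /* ERROR_* states: handle failure */ }
    }
}

// Resolve: a second device turns shared IDs back into anchors placed
// at the same real-world spots. Per the update, multiple anchors in a
// scene can now be resolved simultaneously rather than one at a time.
fun resolveAnchors(session: Session, ids: List<String>): List<Anchor> =
    ids.map { session.resolveCloudAnchor(it) }
```

Note that the long-lived "persistent" anchors the article describes were gated behind the developer application process at launch; the standard hosting call sketched here covered only short-lived anchors at the time.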

Applications that already leverage Cloud Anchors include Google's Just a Line, NASA Jet Propulsion Laboratory's Spacecraft AR, and Childish Gambino's Pharos AR. Users who may have been underwhelmed by the multi-user experiences in those apps can now revisit them to see whether the new API efficiencies give the apps better mileage.

In addition, Google has kept a promise it made at I/O by officially extending its Augmented Faces API to iOS, matching the cross-platform reach of Cloud Anchors. Augmented Faces is ARCore's answer to the ARKit capability that gives developers the ability to create Snapchat-like selfie effects.

"Earlier this year, we announced our Augmented Faces API, which offers a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces, all without a depth sensor on their smartphone," said Tong. "With the addition of iOS support today, developers can now create effects for more than a billion users."
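On Android, the 468-point mesh Tong mentions is exposed through the `AugmentedFace` trackable. The sketch below, assuming an existing front-camera ARCore `Session`, shows roughly how an app enables face tracking and reads the mesh each frame; the iOS SDK offers an equivalent interface.

```kotlin
import com.google.ar.core.AugmentedFace
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Enable face mesh tracking (requires a front-facing camera session).
fun enableFaces(session: Session) {
    val config = Config(session)
    config.augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
    session.configure(config)
}

// Each frame, read the tracked faces: meshVertices packs the
// 468-point mesh as x/y/z floats (468 * 3 values), and region poses
// anchor effects to landmarks such as the nose tip.
fun readFaces(session: Session) {
    for (face in session.getAllTrackables(AugmentedFace::class.java)) {
        if (face.trackingState != TrackingState.TRACKING) continue
        val vertices = face.meshVertices  // FloatBuffer of 468 * 3 floats
        val noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP)
        // ...attach or render a selfie effect using vertices and noseTip
    }
}
```

Because no depth sensor is involved, the mesh is inferred from the RGB camera alone, which is what lets the same API run on such a wide range of Android and iOS hardware.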

Google has also added a feature that will be familiar to developers who work in Snapchat's Lens Studio software: face effect templates that simplify the creative workflow.

With the latest updates, ARCore is catching up to ARKit in the area of persistent content. It still lags behind ARKit 3, particularly with respect to People Occlusion, Motion Capture, and simultaneous dual-camera support, but by offering cross-platform capabilities, Google is helping to break down the barriers between mobile operating systems.
