Google Lens works in real time on the Pixel 3


During the Made by Google event held today in New York, Google's machine-learning image analysis tool, Google Lens, reached a new milestone with an expected but welcome update: the Google Camera app on Pixel 3 phones now includes real-time Lens analysis that suggests actions based on what the camera sees.

Google's Lens suggestions feature works by overlaying recognition dots on actionable items in the live camera stream; tapping a dot displays the possible actions. For example, a business card might contain an email address that Lens recognizes as actionable, generating a pop-up to open Gmail.

"When you point your camera at information you want to remember or don't want to type," said Mario Queiroz, vice president of Pixel at Google, "such as a URL or a QR code on a flyer, or an email address on a business card – Google Lens suggests what to do next, like creating a new contact."

In May, Google launched a real-time analysis mode for Lens, but that feature debuted as a component of the standalone Lens application, with the intention of later adding the analysis to the camera apps of select devices. Before the Pixels, the LG G7 and some Sony phones were slated to receive this functionality.

Related Lens additions included the ability to automatically match text in images against Google's data (items on a menu could surface ingredient information or display images) and to find items on Google similar to what the camera is currently looking at. These machine-learning and AI-assisted features could put Google Camera well ahead of alternatives such as Apple's Camera app, which focuses almost exclusively on capturing photos and videos.
