Google Images Is Getting Some Extra Muscle From Lens A.I. Tech

Google Images is getting a big upgrade based on artificial intelligence.

Google said Thursday that it has started rolling out a new feature for smartphone users that lets them search for specific objects within photos, such as a flower bouquet in a wedding portrait. All users have to do is trace a finger around the bouquet to find similar-looking bouquets, which may be linked to outside sources like floral sites and interior design blogs.

It’s part of a broader initiative by Google to develop more ways for people to search for information beyond typing words into the company’s core search engine, explained Google Images engineering head Cathy Edwards.

The revamped Google Images is now powered by the company’s Lens technology, which it unveiled in 2017 and has embedded in products like its Google Photos mobile app that people use to store their digital photos. Lens, derived from the company’s various deep learning projects that teach computers to recognize objects in photos, can tell people what breed of dog their beloved pooch may be by scanning their pet photos.

Now Google has added Lens technology to Google Images to make it easier for people to search for specific objects within images, Edwards explained.

Users will now see a small lens-shaped icon below the photos they choose to view in Google Images. Pressing the icon causes small dots to appear over objects in the photo that the technology recognizes, and touching one of those dots pulls up similar photos of that object from around the web.

People can also use their fingers to trace over areas within photos that contain objects they want the technology to recognize and scour the web for. If users search for Oscar dresses, for instance, they can circle an actress’s shoes with a finger to retrieve images of similar-looking footwear.

The goal is to give Google Images users something to do besides merely stare at the photos that happen to show up in a search query. It’s also a way to keep Google Images competitive amid the rise of social photo services like Facebook’s Instagram and Pinterest, both of which have created ways for people to do more while perusing their services, such as visiting retail sites that sell sneakers shown in certain photos.

Some users consult Google Images for ideas about do-it-yourself home projects, while others want to see photos of car engines to help with their auto repairs, Edwards said.

“We have an enormous number of queries for people who want to fix their cars and they want to identify car parts,” she said.

The new feature can identify basic objects in photos for now, but over time it should be able to recognize more specific things like particular types of flowers and noteworthy landmarks, she said. The feature, however, won’t work with controversial content like pornography, she explained.

Still, adding more features to Google Images opens the company up to potential problems with its AI technologies. In 2015, for instance, Google Photos was heavily criticized for labeling photos of African-Americans as gorillas, underscoring both the difficulty artificial intelligence has in understanding what’s in a photo and the risk of inadvertent bias in the data used to train the technology.

Edwards said the company has been testing the feature for bias and is using artificial intelligence to help mitigate any problems. The feature is rolling out first in the U.S., and later overseas, to work out any potential hiccups.
