Google adds new machine learning tools and brings its AI software to call centers




Google made a series of artificial intelligence announcements this week at its Cloud Next conference, which begins today in San Francisco, and many of them focus on democratizing machine learning tools. As of today, Google's AutoML Vision tool is available as a public beta, following an alpha period that began in January with the launch of the company's Cloud AutoML initiative, Google said in its keynote.

Cloud AutoML is essentially a way to let non-experts – those without expertise in machine learning or even coding – train their own machine learning models, using the tools available in Google's cloud computing offering. The first of these tools was AutoML Vision, which lets you create a machine learning model for recognizing images and objects. Google makes these tools accessible to people outside the fields of software engineering and AI by using a simple graphical interface and familiar UI patterns like drag and drop.

Now that AutoML Vision has entered public beta, it is available to a much larger pool of organizations, from companies to researchers, that might find this type of AI useful but lack the resources or know-how to develop their own training models. In many cases, companies can simply use Google's AI software through an existing API, such as the Cloud Vision API that Google provides to third parties. But the company designed its Cloud AutoML tools to serve businesses – mostly outside the technology sector – that have specific needs requiring models trained on custom data.
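For context, calling the off-the-shelf Cloud Vision API is already a short exercise. A minimal sketch using Google's Python client library (assuming the `google-cloud-vision` package is installed, credentials are configured, and the image URI below is a placeholder) might look like this:

```python
# Minimal sketch: generic label detection with the off-the-shelf Cloud Vision API.
# Assumes `pip install google-cloud-vision` and that GOOGLE_APPLICATION_CREDENTIALS
# points to a valid service account key; "gs://my-bucket/shirt.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="gs://my-bucket/shirt.jpg"))

# Ask the pretrained model for generic labels and print them with their confidence.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score, 2))
```

Cloud AutoML is aimed at the cases where generic labels like these are not enough and a model needs to be trained on a company's own data.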

One example Google gave when Cloud AutoML first launched was Urban Outfitters, which was building a model to identify patterns and other similarities between products that only a human would notice, so it could offer customers more fine-grained search and filtering options online. (Think of the difference between a "deep V" shirt and a standard "V-neck" one.) The Cloud Vision API, focused as it is on broad image and object recognition, wouldn't quite cut it, but Urban Outfitters could likely build a model of its own using Google's tools.


Image: Google

Two new Cloud AutoML domains are also being announced today: one for natural language and one for translation. Google's ability to analyze written and spoken words with software forms the backbone of its Google Assistant product, and the strength of its AI-trained translation algorithms has made Google Translate a resounding success across many languages.

Of course, you won't be able to develop models and software as sophisticated as Google's without the expertise, resources, and large data sets the company has. But with these new domains, Google is making it easier to train basic custom models.

Already, Google says the publishing giant Hearst is using AutoML Natural Language to help tag and organize content across its many magazines and the numerous national and international editions of those publications. Google is also providing AutoML Translation to the Japanese publisher Nikkei Group, which publishes and translates articles daily across a number of languages. "AI is empowering, and we want to democratize that power for everyone and every business – from retail to agriculture, education to health care," said Fei-Fei Li, chief scientist of Google Cloud AI. "AI is no longer a niche in the tech world – it's the differentiator for businesses in every industry. And we are committed to delivering the tools that will revolutionize them."

In addition to its new Cloud AutoML domains, Google is also developing an AI customer service agent to act as the first point of contact a caller interacts with over the phone. Google calls the product Contact Center AI, and it is tied to the company's existing Dialogflow suite, which provides tools for businesses to develop conversational agents.
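To give a rough sense of the building blocks involved, a Dialogflow agent is queried by sending a user's text to a session and reading back the matched intent's response. A minimal sketch using the `google-cloud-dialogflow` Python client (the project ID, session ID, and utterance below are placeholders, and an agent is assumed to already exist) might look like this:

```python
# Minimal sketch: send one user utterance to an existing Dialogflow agent
# and print the agent's reply. Assumes `pip install google-cloud-dialogflow`
# and configured credentials; "my-gcp-project" and "session-123" are placeholders.
from google.cloud import dialogflow


def detect_intent(project_id: str, session_id: str, text: str, language_code: str = "en") -> str:
    client = dialogflow.SessionsClient()
    session = client.session_path(project_id, session_id)

    # Wrap the caller's text in a query and let the agent match an intent.
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code=language_code)
    )
    response = client.detect_intent(request={"session": session, "query_input": query_input})
    return response.query_result.fulfillment_text


if __name__ == "__main__":
    print(detect_intent("my-gcp-project", "session-123", "Where is my order?"))
```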

Although the company does not mention it by name, it is clear that Contact Center AI is informed by the foundational work Google is doing on Duplex. That's the project unveiled at Google I/O earlier this year that gives people their own AI assistant to hold conversations and take on mundane tasks while passing as a human on the phone. It landed Google in hot water when it emerged that this could happen without the consent of the service worker on the other end of the line. (Google is actively testing Duplex this summer, but only in very limited use cases like holiday hours and reservations.)

With Contact Center AI, Google moves into territory where callers are more familiar with the concept of interacting with a bot and do so of their own free will by proactively contacting customer service. Given that context, it seems more than likely that this technology will shape how call centers operate in the future. Contact Center AI first puts a caller in touch with an AI agent, which tries to solve the problem much as a standard automated customer service bot would, but with far more sophisticated natural language understanding and capabilities. If the caller needs or prefers to talk to a human, the AI switches to a support role and helps a human representative solve the problem by surfacing information and solutions relevant to the conversation in real time.

Li says the company is working with its existing Contact Center AI partners to "engage with us around the responsible use of Cloud AI." She is of course talking about consent and disclosure, especially when someone is talking to an AI, and about how not to imbue the software with unconscious bias, especially around race and gender. "We want to make sure we use technology in a way that employees and users will find fair, empowering and trustworthy," Li wrote.
