Google Cloud deploys new tools to make AI more accessible

A growing number of organizations want to take advantage of AI and machine learning, but it can be difficult to find the right tools and the right staff to do it. Chief among the challenges is a shortage of data scientists capable of building machine learning models.

"There is currently a huge talent shortage for AI," ZDNet Rajen Sheth, director of product management for Cloud AI at Google, told ZDNet. "It's a very new field, and few people know how to use it now."

To meet this challenge, Google Cloud is deploying a set of new products to maximize the impact of a data scientist's work. The first product, Kubeflow Pipelines, facilitates the coordination between data scientists and the other team members needed to bring a machine learning model into production. The second product, AI Hub, serves as a marketplace where anyone in a company can access machine learning components such as datasets or models.

"The vast majority of models created by data scientists never reach production," Sheth said. "You take this very valuable asset from a computer scientist and waste it."

With Kubeflow Pipelines and AI Hub, the work of data scientists will be more accessible, and therefore more valuable, to the rest of the organization.

Kubeflow Pipelines is a new component of Kubeflow, a popular open-source project that facilitates the deployment of machine learning workflows on Kubernetes. Kubeflow Pipelines provides a workspace for composing, deploying and managing reusable end-to-end machine learning workflows.

Sheth explained that it lets anyone plug the different components of a workflow in and out. For example, a data scientist can build a model, another person in the company can feed in the right data, and a developer can wire in an API. With this level of collaboration, teams can move from experimentation to production more quickly.
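As a rough illustration, here is a minimal sketch of what such a pipeline definition looks like with the open-source Kubeflow Pipelines SDK (the v1-style ContainerOp API); the container images, paths, and step names are hypothetical placeholders, not anything Google announced.

```python
# A minimal sketch of a Kubeflow Pipelines definition, assuming the
# open-source kfp SDK; images, paths, and names are placeholders.
import kfp
from kfp import dsl

@dsl.pipeline(
    name='training-pipeline',
    description='Ingest data, train a model, then deploy it.'
)
def training_pipeline(data_path: str = 'gs://my-bucket/data.csv'):
    # Data engineer's step: each ContainerOp wraps one team member's
    # component, so any step can be swapped out independently.
    ingest = dsl.ContainerOp(
        name='ingest-data',
        image='gcr.io/my-project/ingest:latest',
        arguments=['--input', data_path],
        file_outputs={'dataset': '/out/dataset.csv'},
    )

    # Data scientist's step: trains on the ingested dataset.
    train = dsl.ContainerOp(
        name='train-model',
        image='gcr.io/my-project/train:latest',
        arguments=['--data', ingest.outputs['dataset']],
        file_outputs={'model': '/out/model'},
    )

    # Engineer's step: pushes the trained model to a serving endpoint.
    dsl.ContainerOp(
        name='deploy-model',
        image='gcr.io/my-project/deploy:latest',
        arguments=['--model', train.outputs['model']],
    )

if __name__ == '__main__':
    # Compile to an archive that can be uploaded to a Kubeflow cluster.
    kfp.compiler.Compiler().compile(training_pipeline,
                                    'training_pipeline.tar.gz')
```

The dependencies between steps are inferred from the outputs each step hands to the next, which is what allows one component to be replaced without rewriting the rest of the workflow.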

"We realized that artificial intelligence is a team sport," Sheth said. "It's more than the data specialist who builds the model. It may be the data engineer who introduces the data, it is an engineer who puts the model into production, it can be the developer who puts it in the application. are all uncoordinated right now. "

At the same time, AI Hub aims to facilitate the discovery, sharing and reuse of existing tools and work by members of an organization. These could include pipelines, Jupyter notebooks, TensorFlow modules and more. Data scientists and other users can add their own components to the marketplace so that other members of their organization can use them.
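As a hedged example of the kind of reuse this enables, the sketch below pulls a publicly shared TensorFlow Hub text-embedding module into a new Keras model; the module URL is a public tfhub.dev example, not a component confirmed to ship on AI Hub.

```python
# A minimal sketch of reusing a shared TensorFlow Hub module, the
# kind of prebuilt component a hub like this is meant to surface.
import tensorflow as tf
import tensorflow_hub as hub

# Load a pretrained sentence-embedding module published by Google.
embed = hub.KerasLayer(
    'https://tfhub.dev/google/nnlm-en-dim50/2',
    input_shape=[], dtype=tf.string, trainable=False,
)

# Reuse the shared module as the first layer of a new classifier,
# so the team starts from pretrained weights instead of from scratch.
model = tf.keras.Sequential([
    embed,
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# Each input sentence maps to a 50-dimensional embedding vector.
print(embed(tf.constant(['machine learning', 'team sport'])).shape)
```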

In addition to serving as a private and secure hub for businesses, the marketplace will be stocked with machine learning resources developed by Google Cloud AI, Google Research and other Google teams. In the future, it will also provide access to content from Kaggle, Google's online community of more than 2 million data scientists.

Ultimately, Google plans to allow users to purchase third-party tools through the marketplace. Google is currently working with its cloud customers to connect them with appropriate AI partners. For example, SparkCognition could be an ideal partner for an aerospace company using Google Cloud that is looking for predictive maintenance tools. In the future, a company like SparkCognition could offer tools via AI Hub.
