AI Weekly: Microsoft, machine learning framework interoperability, and ONNX




This week, Facebook's artificial intelligence team introduced PyTorch 1.1 and Ax for managing model experiments. Microsoft also made waves with the launch of a blockchain service, Unreal Engine support for HoloLens 2 for enterprise, and new Azure Machine Learning and Azure Cognitive Services announcements.

Amid all this news, some important stories may have gone unnoticed: Microsoft made FPGA chips generally available for machine learning model training and inference, and the Open Neural Network Exchange (ONNX) runtime now supports Nvidia TensorRT and Intel nGraph for high-speed inference on Nvidia and Intel hardware.

This comes after Microsoft joined the MLflow project and open-sourced its high-performance ONNX Runtime inference engine.

Facebook and Microsoft created the ONNX open source project in 2017; it now includes virtually every major global AI company, including AWS, AMD, Baidu, Intel, IBM, Nvidia, and Qualcomm.

Ahead of the news, on Thursday Scott Guthrie, executive vice president of Microsoft's cloud and AI group, spoke with reporters in San Francisco on a range of topics, including Microsoft's approach to open source projects and its artificial intelligence strategy. More news is expected Monday, when Microsoft kicks off its annual Build developer conference in Seattle.

"Ultimately, I think what's fascinating about hardware isn't the hardware work we do ourselves, it's what it brings to the foreground," he said.

Guthrie said he likes ONNX because it gives machine learning practitioners the freedom to use the best framework and the best computing hardware for a given task. FPGA chips have been used for years to accelerate tasks like data encryption and compression for Azure.

"Even today, with the ONNX workloads for AI, the important thing is that you can now build custom models or use our models, again using TensorFlow, PyTorch, Keras, whatever framework you want, and then know that you can hardware-accelerate it, whether that's on the latest Nvidia GPU, the new AMD GPU, an Intel FPGA, someone else's FPGA, or new silicon that we might release in the future. To me, that's more compelling than 'we have a better instruction set at the hardware level,' and in general that's what I find resonates best with customers."

Guthrie spoke at length about open source contributions, saying that overall Microsoft contributes more than Amazon or Google, part of the company's evolution over the past 10 years toward building tools for DevOps, databases, Kubernetes, and AI.

In the 2018 Octoverse report released last fall, GitHub, which Microsoft acquired last year, revealed that Microsoft, Google, Red Hat, and the University of California, Berkeley had the most employees contributing to open source projects.

"We've gone from not being a fan of open source to becoming a big supporter," he said. "I think you see a Microsoft that's very open, as a consumer but also as a contributor, and I think that's unique. If you look, for example, at AWS's contributions to open source, there's not much there. There's a lot of consumption, but not much contribution back, and I think even if you compared Google with the number of contributions we've made on Azure, people are often pleasantly surprised when they add it up."

PyTorch and TensorFlow are among today's most popular frameworks, but the "it" frameworks come and go, Guthrie said. The interoperability ONNX brings across different frameworks, runtimes, compilers, and other tools helps create a machine learning ecosystem.

Much of the modern machine learning industry relies on advances in computing power and open source projects. It is this architecture that will enable breakthroughs in artificial intelligence, and if the tech giants compete over who gives back more, so much the better.

For AI coverage, send news tips to Khari Johnson and Kyle Wiggers, and be sure to subscribe to the AI Weekly newsletter and bookmark our AI channel.

Thank you for reading,

Khari Johnson

AI Editor
