Intel takes on Google and Amazon with two new AI-focused chips




Intel has unveiled two new processors in its Nervana Neural Network Processor (NNP) line, built to accelerate training and inference for artificial intelligence (AI) models.

Code-named Spring Crest and Spring Hill, the chips were presented for the first time on Tuesday at the Hot Chips conference, held every August in Palo Alto, Calif.

Intel's Nervana NNP series is named after Nervana Systems, the company Intel acquired in 2016. The chips were designed at its Haifa site in Israel, and they are built to train AI models and to extract valuable insights from data.

"In an AI world, we will need to adapt hardware solutions to a combination of processors tailored to specific use cases," said Naveen Rao, vice president of Intel for the group of artificial intelligence products. "That means looking at the specific needs of applications and reducing latency by delivering the best results as close to the data as possible."

The Nervana Neural Network Processor for Training (Intel Nervana NNP-T) is designed to handle data for a variety of deep learning models within a power budget, while delivering high performance and improved memory efficiency.

In July, Chinese technology giant Baidu was enlisted as a development partner for the NNP-T, to ensure development stayed in line with the latest customer demands for training hardware.

The other processor, the Nervana Neural Network Processor for Inference (Intel Nervana NNP-I), targets the inference side of AI: applying already-trained models to new data. Using a purpose-built AI inference compute engine, the NNP-I delivers better performance at lower power consumption.
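The training/inference split that the two chips map to can be illustrated with a framework-free sketch. This is plain Python for illustration only, not Intel's API: training (the NNP-T workload) iteratively adjusts model weights against data, while inference (the NNP-I workload) is a single cheap forward pass with the weights frozen.

```python
# Illustrative only: fit y = w*x + b, then run inference with fixed weights.
# Training is the expensive, iterative phase; inference is one forward pass.

def train(xs, ys, lr=0.05, epochs=500):
    """Training: gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def infer(w, b, x):
    """Inference: apply the frozen weights to a new input."""
    return w * x + b

# Toy data sampled from y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
w, b = train(xs, ys)
print(infer(w, b, 4.0))  # close to 9.0
```

The asymmetry in cost between the loop in `train` and the single expression in `infer` is why vendors ship separate silicon for each phase.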

According to a Reuters report, Facebook is already using the new processors.

This development follows Intel's earlier AI performance accelerators, such as the Myriad X vision processing unit, which integrates a neural compute engine for deep neural network inference.

That said, the chip maker is far from the only company offering machine learning processors for AI workloads. Google's Tensor Processing Unit (TPU), Amazon's AWS Inferentia and NVIDIA's NVDLA are among the solutions commonly used by companies as the demand for complex computation continues to grow.

But unlike the TPU, which was designed specifically for Google's TensorFlow machine learning library, the NNP-T offers direct integration with popular deep learning frameworks such as Baidu's PaddlePaddle, Facebook's PyTorch and TensorFlow.

Intel said its AI platform would help "manage the deluge of data generated and give businesses the ability to use their data efficiently, processing it where it is collected when that makes sense, and making smarter use of their upstream resources."

