Amazon launches a machine learning chip, taking on Nvidia and Intel




REUTERS: Amazon.com on Wednesday launched a microchip designed for machine learning, entering a market that both Intel Corp and Nvidia Corp are counting on to boost their profits in the coming years.

Amazon is one of the largest buyers of chips from Intel and Nvidia, whose semiconductors power Amazon's flagship cloud computing unit, Amazon Web Services. But Amazon has started designing its own chips.

Amazon's chip, called "Inferentia" and announced on Wednesday, will handle what researchers call inference: the work of putting a trained artificial intelligence algorithm to use, for example by taking incoming audio and translating it into text-based requests.
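To make the distinction concrete, the sketch below shows inference in its simplest form: applying weights that were produced by an earlier training run to a new input. It is a hypothetical, illustrative example, not Amazon's or any vendor's actual software, and all names and numbers in it are invented.

```python
# Minimal, hypothetical sketch of inference: using an already-trained model
# on new input, as opposed to training, which adjusts the model's weights.
import numpy as np

# Pretend these values were produced earlier by a training run.
trained_weights = np.array([[0.2, -0.5], [0.7, 0.1], [-0.3, 0.9]])
trained_bias = np.array([0.05, -0.02])

def infer(features: np.ndarray) -> int:
    """Run the trained model forward on one input and return a class label."""
    scores = features @ trained_weights + trained_bias
    return int(np.argmax(scores))

# A new, unseen input (in a real system this might be an encoded audio snippet).
incoming = np.array([0.4, 1.2, -0.7])
print(infer(incoming))  # the model's prediction for this input
```

Chips aimed at inference are optimized for running this forward pass cheaply and at scale, rather than for the heavier number-crunching of training.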

The Amazon chip does not pose a direct threat to Intel's and Nvidia's businesses because Amazon will not sell the chips themselves. Instead, starting next year, it will sell services to its cloud customers that run on top of the chips. But if Amazon comes to rely on its own chips, that could deprive Nvidia and Intel of an important customer.

Intel processors currently dominate the market for machine learning inference, which Morningstar analysts expect to reach US$11.8 billion by 2021. In September, Nvidia launched its own inference chip to compete with Intel.

In addition to its machine learning chip, Amazon on Monday announced a processor chip for its cloud unit called Graviton. That chip is based on technology from Arm Holdings, which is controlled by SoftBank Group Corp. Arm-based chips currently power mobile phones, but several companies are trying to adapt them for data centers. Using Arm chips in data centers potentially poses a major challenge to Intel's dominance of that market.

Amazon is not alone among cloud computing providers in designing its own chips. In 2016, Google's cloud unit, owned by Alphabet Inc., unveiled an artificial intelligence chip designed to take on Nvidia's chips.

Custom chips can be expensive to design and produce, and analysts have pointed out that such investments are leading to increased research and capital expenditures for large technology companies.

Google Cloud executives have said that customer demand for Google's custom chip, the Tensor Processing Unit (TPU), is strong. But the chips can be expensive to use and require software to be customized for them.

Google Cloud charges US$8 per hour for access to its TPU chips and up to US$2.48 per hour in the United States for access to Nvidia chips, according to Google's website.
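As a rough illustration of how those hourly rates add up over a sustained workload, the snippet below reuses the prices quoted above; the 720-hour figure (roughly one month of continuous use) is an arbitrary example, not anything from Google's pricing pages.

```python
# Illustrative cost comparison using the hourly rates quoted in the article.
TPU_RATE_USD_PER_HOUR = 8.00      # Google Cloud TPU access
NVIDIA_RATE_USD_PER_HOUR = 2.48   # Nvidia chip access (US)

hours = 720  # arbitrary example: about one month of continuous use

print(f"TPU:    ${TPU_RATE_USD_PER_HOUR * hours:,.2f}")
print(f"Nvidia: ${NVIDIA_RATE_USD_PER_HOUR * hours:,.2f}")
```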

(Reporting by Stephen Nellis and Jeffrey Dastin; additional reporting by Paresh Dave; editing by Peter Henderson)
