Apple's new iPhones move the smartphone camera battlefield to AI




Tim Cook, CEO of Apple Inc., talks about the new iPhone 11.

David Paul Morris | Bloomberg | Getty Images

When Apple introduced its triple-camera iPhone this week, marketing chief Phil Schiller explained the camera's ability to create the perfect photo by combining it with eight separate shots captured before the main shot, a feat of "computational photography mad science."

"When you press the shutter button, you need a long exposure, and then, in one second, the neural engine analyzes the merged combination of long and short images, selecting the best one among these , by selecting all pixels and moving to pixel by pixel, 24 million pixels to optimize for detail and low noise, "said Schiller, describing a feature called" Deep Fusion "that will be available later this fall.

This was the kind of technical digression that, in the past, might have been reserved for chief designer Jony Ive's narration of the precision aluminum milling process used to produce the iPhone's clean lines. But in this case it was Schiller, the company's most enthusiastic photographer, who lavished praise on custom silicon and artificial intelligence software.

The technology industry's smartphone camera battlefield has shifted to the inside of the phone, where sophisticated artificial intelligence software and special chips play a major role in how a phone's pictures look.

"Cameras and displays are selling phones," said Julie Ask, vice president and principal forrester analyst.

Apple has added a third lens to the iPhone 11 Pro model, matching the three-camera configuration that competitors such as Samsung Electronics and Huawei have already built into their flagship models.

But Apple is also playing catch-up inside the phone with some features, such as "night mode," a setting designed to improve the appearance of poorly lit photos. Apple will add the mode to its new phones when they come out on Sept. 20, but Huawei's phones and Alphabet's Google Pixel have had similar features since last year.

By improving how photos look, Apple is trying to gain an advantage through the custom chip that powers its phones. At the launch of the iPhone 11 Pro, executives spent more time talking about its processor, dubbed the A13 Bionic, than about the specifications of the newly added lens.

A special part of that chip, called the "neural engine" and reserved for artificial intelligence tasks, is meant to help the iPhone take better, sharper images in difficult lighting conditions.

Samsung and Huawei also design custom chips for their phones, and even Google has custom "Visual Core" silicon that handles photography tasks on its Pixel phones.

Ryan Reith, vice president of IDC's mobile device tracker program, said this has created an expensive game in which only phone makers with the resources to build custom chips and software can afford to invest in camera systems that differentiate their devices.

Even very cheap handsets now have two or three cameras on the back, but chips and software play a key role in the quality of the images.

"Owning the stack today in smartphones and chipsets is more important than ever, because the outside of the phone is made up of commodities," said Reith.

The custom chips and software that power the new camera systems take years to develop. But in Apple's case, the research and development may pay off later in products such as augmented reality glasses, which many industry experts believe are under development.

"Everything is built for the biggest story down the line – augmented reality, starting with phones and possibly other products," said Reith.
