Apple's new iPhones shift the smartphone camera battlefield to AI



(Reuters) – When Apple Inc (AAPL.O) introduced its triple-camera iPhones this week, marketing chief Phil Schiller explained the camera's ability to create a perfect photo by fusing it with eight separate shots captured before the main image, a feat of what he called the "mad science" of computational photography.

PHOTO: CEO Tim Cook presents the new iPhone 11 Pro at an Apple event at the company's headquarters in Cupertino, California, September 10, 2019. REUTERS/Stephen Lam

"When you press the shutter button, you need a long exposure, then, in one second, the neural engine analyzes the merged combination of long and short images, selecting the best one among them, selecting all the pixels, and passing at pixel by pixel. 24 million pixels to optimize for detail and low noise, "said Schiller, describing a feature called" Deep Fusion "that will be available later this fall.

It was the kind of technical digression that, in the past, might have been reserved for design chief Jony Ive's narration of the precision aluminum milling process behind the iPhone's clean lines. But in this case Schiller, the company's most enthusiastic photographer, lavished his praise on custom silicon and artificial intelligence software.

The tech industry's smartphone camera battlefield has moved inside the phone, where sophisticated artificial intelligence software and special chips play a major role in how a phone's photos look.

"Cameras and displays sell phones," said Julie Ask, vice president and principal forrester analyst.

Apple has added a third lens to the iPhone 11 Pro, matching the triple-camera configuration already found on flagship models from competitors such as Samsung Electronics Co Ltd (005930.KS) and Huawei Technologies Co Ltd [HWT.UL].

But Apple is also playing catch-up on some features inside the phone, such as "night mode," a setting designed to improve the appearance of poorly lit photos. Apple will add the mode to its new phones when they ship on Sept. 20, but Huawei and Alphabet Inc's (GOOGL.O) Google Pixel phones have had similar features since last year.
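
Night mode features generally rely on a related trick: capturing several short exposures and combining them so random sensor noise cancels out before the image is brightened. The snippet below is a toy, assumption-based illustration of that principle, not any vendor's pipeline; the frame count, gain, and gamma values are arbitrary.

    import numpy as np

    def night_mode(frames, gain=2.5):
        # Averaging N frames cuts random sensor noise by roughly sqrt(N).
        stacked = np.mean(np.stack(frames), axis=0)
        # Apply gain plus a gamma curve to lift shadows without harsh clipping.
        return np.clip(gain * stacked, 0.0, 1.0) ** 0.8

    frames = [np.random.rand(48, 48) * 0.2 for _ in range(6)]  # a dim scene
    brightened = night_mode(frames)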

By improving how photos look, Apple is trying to gain an edge through the custom chip that powers its phones. At the launch of the iPhone 11 Pro, executives spent more time talking about its processor, dubbed the A13 Bionic, than about the specifications of the newly added lens.

A dedicated portion of that chip called the "neural engine," reserved for artificial intelligence tasks, is intended to help the iPhone take sharper images in difficult lighting conditions.

Samsung and Huawei also design custom chips for their phones, and even Google uses custom "Visual Core" silicon to speed up photography tasks on its Pixel phones.

Ryan Reith, vice president of IDC's mobile device tracking program, said this has created an expensive game in which only phone makers with the resources to develop custom chips and software can afford the camera systems that differentiate their devices.

Even very cheap handsets now have two or three cameras on the back of the phone, but chips and software play a key role in the quality of the images.

"Owning the stack today in smartphones and chipsets is more important than ever because the outside of the phone is made up of commodities," said Reith.

The custom chips and software that power the new camera systems take years to develop. But in Apple's case, the research and development work may prove useful later in products such as augmented reality glasses, which many industry experts believe Apple is developing.

"Everything is built for the biggest story down the line – augmented reality, starting in phones and ultimately other products," said Reith.

Reporting by Stephen Nellis in San Francisco; Editing by Lisa Shumaker

Our Standards: The Thomson Reuters Trust Principles.
