Apple Executives Detail Approach to iPhone 12 Camera Design in New Interview




In addition to iPhone 12 mini and iPhone 12 Pro Max orders opening this week and the devices arriving to their first customers on November 13, Apple’s vice president of camera software engineering Jon McCormack and iPhone product line manager Francesca Sweet have shared a behind-the-scenes look at the company’s philosophy for iPhone camera design, its goals for everyone from casual users to professionals, and more.

McCormack and Sweet spoke with PetaPixel for a new interview that delves into Apple’s thinking behind the design of its iPhone cameras. Unsurprisingly, they described an approach that treats software and hardware holistically:

Both made it clear that the company sees camera development holistically: it’s not just about the sensor and lenses, but everything from Apple’s A14 Bionic chip to image signal processing to the software behind its computational photography.

As for the primary focus, Apple wants capturing photos with an iPhone camera to be so seamless that users aren’t distracted from what’s happening around them.

“As photographers we tend to think a lot about things like ISO, subject movement, and so on,” McCormack said. “And Apple wants to take that out so people can stay in the moment, take a great photo, and get back to what they’re doing.”

McCormack also points out that this applies even to “serious photographers”:

He explained that while more serious photographers may want to take a photo and then go through an editing process to make it their own, Apple is doing what it can to compress that process into the single action of capturing a frame, all with the goal of removing distractions that could take a person out of the moment.

“We reproduce as much as we can of what the photographer would do in post,” McCormack continued. “There are two sides to taking a photo: the exposure and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post, and we do it automatically for you. The goal is to create more true-to-life photographs, to reproduce what it was like to actually be there.”

Apple uses machine learning to process different elements of a photo individually.

“The background, the foreground, the eyes, the lips, the hair, the skin, the clothes, the sky. We deal with all of these elements independently, as you would in Lightroom with a lot of local tweaks,” he explained. “We adjust everything from exposure to contrast and saturation, and combine it all together.”
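Apple doesn’t detail the pipeline behind these per-element edits, but the general pattern McCormack describes (adjust each region separately, then recombine) is easy to sketch with public APIs. Below is a minimal, hypothetical Core Image example in Swift: it assumes you already have a grayscale matte isolating one region, applies illustrative exposure and saturation tweaks to that region only, and blends the result back over the untouched photo. The filter choices and values are assumptions for illustration, not Apple’s actual processing.

```swift
import CoreImage

/// Apply a local adjustment to one masked region and composite it back.
/// `image` is the full photo; `mask` is a grayscale matte (white = region to edit).
/// Illustrative sketch only; the values and filters are not Apple's pipeline.
func adjustRegion(of image: CIImage, mask: CIImage) -> CIImage {
    // Brighten and saturate a copy of the whole image (illustrative values).
    let exposed = image.applyingFilter("CIExposureAdjust",
                                       parameters: [kCIInputEVKey: 0.7])
    let graded = exposed.applyingFilter("CIColorControls",
                                        parameters: [kCIInputSaturationKey: 1.2,
                                                     kCIInputContrastKey: 1.05])

    // Composite the adjusted version over the original, limited to the masked
    // region: the "adjust each element, then combine" idea in miniature.
    return graded.applyingFilter("CIBlendWithMask",
                                 parameters: [kCIInputBackgroundImageKey: image,
                                              kCIInputMaskImageKey: mask])
}
```

In a real app the matte might come from a segmentation model or a portrait effects matte; here it is simply passed in.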

Francesca Sweet commented on the Night mode improvements that the cameras in the iPhone 12 lineup bring.

“The new Wide camera and improved image fusion algorithms reduce noise and improve detail,” she said. “With the Pro Max, we can extend this even further, because the larger sensor allows us to capture more light in less time, which makes for better freezing of motion at night.”

And regarding the new ProRAW option, McCormack said the idea came from Apple asking whether it could deliver the benefits of RAW shooting while still retaining the benefits of computational photography.
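The interview doesn’t go into developer details, but ProRAW is exposed to third-party apps through the standard AVCapturePhotoOutput API (starting with iOS 14.3 on the iPhone 12 Pro and Pro Max). A rough sketch of opting in, assuming an already-configured capture session and a delegate to receive the photo, looks like this:

```swift
import AVFoundation

// Minimal sketch: opt into Apple ProRAW on a configured AVCapturePhotoOutput
// and request a capture. Assumes `photoOutput` is attached to a running
// session and `delegate` implements AVCapturePhotoCaptureDelegate.
@available(iOS 14.3, *)
func captureProRAW(with photoOutput: AVCapturePhotoOutput,
                   delegate: AVCapturePhotoCaptureDelegate) {
    guard photoOutput.isAppleProRAWSupported else { return }  // Pro / Pro Max only
    photoOutput.isAppleProRAWEnabled = true

    // Choose a ProRAW pixel format from those the output advertises.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) }) else { return }

    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

The capture is delivered as a DNG that keeps editing latitude while still reflecting the computationally processed image, which matches the goal McCormack describes.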

The full interview is an interesting read; check it out over at PetaPixel.

