While some observers have seen few obvious changes in the iPhone XS camera system compared with the iPhone X, our iPhone XS review confirmed that Apple's latest phones offer significant improvements in image sharpness, dynamic range, and low-light performance, thanks to lens, sensor, and processing changes. Even so, early customers continued to ask questions about the new behavior, and separate reports have addressed Apple's changes to both photos and video.
Regarding photos, third-party developer Sebastiaan de With addressed the "Beautygate" theory (the claim that Apple deliberately applies an image filter to make selfies look smoother) with the technical equivalent of a no. According to de With, whose popular Halide app provides granular control over the cameras of every iPhone, the new XS camera system does render skin tones more softly and hide blemishes, but no skin-smoothing filter is being applied.
Instead, users are seeing the product of Apple's new Smart HDR feature, which combines multiple exposures to tame both bright and dark extremes. Because the brightest and darkest elements of the image are normalized, skin appears smoother simply because the light striking it is rendered less harshly.
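To make the idea concrete, here is a minimal sketch of multi-exposure fusion in the spirit of what Smart HDR does. Apple's actual pipeline is proprietary; the Gaussian "well-exposedness" weighting below is a generic exposure-fusion heuristic, used only to illustrate why normalized highlights and shadows look softer.

```python
# A minimal sketch of multi-exposure fusion (illustrative, not Apple's pipeline).
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend bracketed exposures, favoring well-exposed pixels in each frame.

    frames: list of float arrays in [0, 1] with identical shapes.
    Pixels near mid-gray get the highest weight, so blown highlights and
    crushed shadows are filled from better-exposed frames.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Usage: three synthetic exposures of the same scene.
rng = np.random.default_rng(0)
scene = rng.random((4, 4))
under, normal, over = scene * 0.4, scene, np.clip(scene * 1.8, 0.0, 1.0)
print(fuse_exposures([under, normal, over]).round(2))
```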
More aggressive noise reduction is also applied, because the iPhone XS favors faster shutter speeds at higher ISO levels, allowing it to capture more frames in the same amount of time, but with more noise per frame. The differences are particularly evident on the lower-resolution front camera, de With notes, but Apple could easily tweak the iPhone's software to rebalance the camera's results. Apple could also add a toggle in iOS 12 letting users choose between its earlier HDR processing and the newer approach.
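Part of why multi-frame capture helps is that averaging several noisy frames suppresses random sensor noise, as the short illustration below shows with synthetic noise. It demonstrates the principle only; it is not a description of Apple's noise-reduction algorithm.

```python
# Averaging N aligned frames shrinks independent noise by roughly sqrt(N).
# Synthetic Gaussian noise stands in for real sensor noise here.
import numpy as np

def average_frames(frames):
    """Average a stack of aligned frames with shape (N, H, W)."""
    return frames.mean(axis=0)

rng = np.random.default_rng(1)
clean = np.full((64, 64), 0.5)
noisy_stack = clean + rng.normal(scale=0.1, size=(8, 64, 64))  # 8 noisy captures

print(f"single frame noise: {noisy_stack[0].std():.3f}")
print(f"8-frame average:    {average_frames(noisy_stack).std():.3f}")  # ~0.1 / sqrt(8)
```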
On the video side, filmmaker Richard Lackey took the iPhone XS Max to Barcelona, Spain, for a low-light video test and tried to work out how Apple's computational imaging produced what he called "voodoo" that seemed to "go beyond the capabilities of the optics and sensor alone." Lackey found that the device's raw low-light video output could look better overall than footage that had been color graded in post-production, and he said the video was both sharper and richer in color information in the dark areas of the image than on any previous iPhone he had used.
In his view, the answer lies in AI-assisted, real-time image processing: dynamic scene analysis, possibly with separate localized adjustments to individual parts of the image, and dynamic tone mapping for luminance and color. In addition, although some noise-reduction technique is certainly being used, Lackey says that areas which would normally be crushed show unexpected detail and texture, a result he cannot fully explain.
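The "localized adjustments" Lackey describes can be pictured as per-region tone mapping. The sketch below splits a luminance image into tiles and lifts the darker ones more strongly; the tile size and gamma rule are arbitrary assumptions chosen only to illustrate the idea, not Apple's actual processing.

```python
# A rough sketch of per-region tone mapping: darker tiles get a stronger
# shadow lift via a lower gamma.
import numpy as np

def local_tone_map(lum, tile=32):
    """Per-tile gamma adjustment for a luminance image with values in [0, 1]."""
    out = lum.copy()
    h, w = lum.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            block = lum[y:y + tile, x:x + tile]
            gamma = 0.5 + 0.5 * block.mean()  # 0.5 for dark tiles, up to 1.0 for bright
            out[y:y + tile, x:x + tile] = block ** gamma
    return out

rng = np.random.default_rng(2)
night_scene = rng.random((128, 128)) ** 3  # values skewed toward darkness
print(f"mean before: {night_scene.mean():.2f}, after: {local_tone_map(night_scene).mean():.2f}")
```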
Although Lackey does not know exactly how Apple gets its results, the company's public statements make clear that the core of the pipeline is gathering more detail than before in real time (multiple frames and exposures within a fraction of a second) and then using advanced processing to assemble those captures into a sharper image. The specific technique appears to involve sophisticated weighted averaging of individual detail areas across the frames, which produces smoothing but also more detail.
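One speculative way to read "weighted averaging of detail areas" is a merge in which each frame's contribution at a pixel depends on how sharp that frame is locally. The sketch below uses gradient magnitude as the sharpness proxy; both the proxy and the weighting scheme are assumptions, not Apple's published method.

```python
# A speculative detail-weighted merge: sharp regions dominate the average,
# while noise is still smoothed across frames.
import numpy as np

def detail_weighted_merge(frames, eps=1e-3):
    """Merge aligned frames of shape (N, H, W), favoring locally sharp regions."""
    gy, gx = np.gradient(frames, axis=(1, 2))       # per-frame spatial gradients
    detail = np.sqrt(gx ** 2 + gy ** 2) + eps       # sharpness proxy per pixel
    weights = detail / detail.sum(axis=0, keepdims=True)
    return (weights * frames).sum(axis=0)

rng = np.random.default_rng(3)
captures = rng.random((6, 32, 32))                  # six noisy captures of one scene
print(detail_weighted_merge(captures).shape)
```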
Obviously, the iPhone XS's larger, faster wide-angle sensor also has something to do with the improvements. But the impact of computational photography and the XS's new A12 Bionic chip is just as significant, and it suggests that smartphones will likely continue to gain ground on standalone cameras even though they cannot compete on raw optics or sensor size.