If you've taken a selfie on the iPhone XS, you may feel it looks a bit different from the selfies you've taken in the past, especially those from previous iPhones.
We already know that the cameras of the iPhone XS differ from those of the X.
But some Reddit and YouTube users claimed that the iPhone XS camera applies a "beauty mode" effect to faces, smoothing out imperfections, much like the retouching filters offered by apps such as Snapchat, Instagram, and FaceTune.
Here's why selfies look different on the XS, but not for the reason you might think.
Cameras don't see the way our eyes do
If you photograph a high-contrast scene, it is difficult for a camera sensor to capture detail in both the highlights and the shadows. Think of a picture taken indoors, looking out a window with bright light coming from outside. Most cameras end up exposing either for the interior (which blows out the window completely) or for the outside light (which leaves the indoor scene dark and underexposed).
One solution is high dynamic range (HDR) imaging. HDR combines multiple exposures – usually an underexposed, an overexposed, and a properly metered photo – into one. This makes it possible to capture a larger dynamic range, preserving detail in both shadows and highlights. Take this picture from the iPhone XS with HDR enabled (left) and disabled (right), and notice the additional detail kept in the window.
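Exposure blending of this kind can be sketched in a few lines. The weighting below is a toy, Mertens-style "well-exposedness" weight (pixels near mid-gray dominate), not Apple's actual algorithm; the small arrays stand in for bracketed grayscale frames with values between 0 and 1.

```python
import numpy as np

def exposure_fusion(exposures, sigma=0.2):
    """Blend bracketed exposures (values in 0..1) using a per-pixel
    weight that favors well-exposed pixels (those near mid-gray)."""
    stack = np.stack(exposures)                    # shape (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)           # weighted merge

# Three simulated exposures of the same tiny scene: under, metered, over.
under = np.array([[0.02, 0.10], [0.45, 0.60]])
mid   = np.array([[0.10, 0.30], [0.75, 0.95]])
over  = np.array([[0.30, 0.60], [0.98, 1.00]])

fused = exposure_fusion([under, mid, over])
```

Because each output pixel is a convex combination of the three inputs, dark pixels are lifted by the overexposed frame while blown highlights are pulled back by the underexposed one.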
Without HDR, phone cameras can have a hard time exposing for both shadows and highlights, so you often end up with blown-out highlights or muddy shadows.
Why HDR is different
Many photographers have used HDR techniques to achieve what can look like "hyper-real" photos. With heavy-handed processing, the most extreme examples can appear oversaturated, like illustrations or airbrushed versions of reality.
Apple's version of HDR on the iPhone XS, the XS Max, and the upcoming iPhone XR is called Smart HDR. It is enabled by default for photos taken with both the front and rear cameras. (If you want to turn it off, go to Settings > Camera.)
At Apple's iPhone XS launch event, Phil Schiller used the example of a photo of a moving subject to explain how Smart HDR works. The A12 Bionic chip first captures four frames in a buffer, then takes additional "interframes" at different exposures to preserve highlight detail. It also takes a long exposure to capture shadow detail. All of these frames are then analyzed and the best parts are merged into a single photo.
With Smart HDR enabled, the XS generates a merged image. But even with Smart HDR off, the XS already uses computational photography to merge exposures, perform local tone mapping (a technique that adjusts brightness region by region to achieve an HDR-like effect), and recover highlight detail in ordinary photos.
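The idea behind local tone mapping can be illustrated with a toy sketch. This is a generic Reinhard-style compression adapted to each pixel's neighborhood, not the iPhone's actual pipeline: a pixel is compressed relative to the average brightness around it, so a bright region surrounded by other bright pixels is pulled down harder.

```python
import numpy as np

def local_tone_map(lum, k=3):
    """Toy local tone mapping: compress each luminance value relative
    to the mean of its k-by-k neighborhood (edge-padded)."""
    pad = k // 2
    padded = np.pad(lum, pad, mode="edge")
    h, w = lum.shape
    local = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            local[i, j] = padded[i:i + k, j:j + k].mean()
    # Reinhard-style compression against the local average.
    return lum / (lum + local + 1e-6)

# A scene with a dark half and a much brighter half (arbitrary units).
scene = np.array([[0.05, 0.05, 4.0, 4.0],
                  [0.05, 0.05, 4.0, 4.0]])
mapped = local_tone_map(scene)
```

After mapping, all values fit in a displayable 0..1 range and the huge brightness ratio between the two halves is drastically compressed, which is exactly why merged images can read as "flatter" or softer.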
It is also worth noting that merging multiple exposures is not unique to Apple. The Google Pixel and Samsung Galaxy phones do similar things in their own HDR modes.
So what about the smoothing effect?
Two things. First, an HDR image can look "airbrushed," especially compared to a photo taken on a phone without HDR enabled. Take, for example, the portrait below, shot on the rear cameras of the iPhone XS (left) and the iPhone X (right). The XS image may look softer to your eyes because the highlights have been toned down by the exposure merging, reducing the contrast.
Second, to create an HDR image you need at least three frames captured nearly simultaneously. Unless you hold the phone incredibly steady or ask your subject to freeze their expression (try that with kids), you will probably introduce some motion blur. To work around this, the camera shoots at a very fast shutter speed.
But to get a good exposure at a shutter speed of a few hundredths of a second, especially in low light, the camera must raise the ISO (light sensitivity). This introduces a lot of noise, which shows up as speckles or grain in your photos, and it is only made worse by a small sensor like the front camera's.
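The trade-off follows from simple exposure arithmetic. The sketch below assumes a simplified model in which exposure depends only on shutter time and ISO (reasonable for phones, whose aperture is fixed): cutting the shutter time by some factor requires raising the ISO by the same factor to keep the same brightness.

```python
def equivalent_iso(base_iso, base_shutter_s, new_shutter_s):
    """Exposure is proportional to shutter time x ISO (aperture fixed),
    so a shorter shutter needs a proportionally higher ISO to keep the
    same image brightness. A simplified model for illustration."""
    return base_iso * (base_shutter_s / new_shutter_s)

# Going from 1/30 s to 1/240 s is 8x less light per frame,
# so the camera needs 8x the ISO, with 8x the sensitivity to noise.
iso = equivalent_iso(200, 1 / 30, 1 / 240)
```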
Cameras often apply noise reduction to get rid of this grain, but the trade-off is that photos can look overly smooth. Below is an example of a photo taken with a digital SLR in low light at ISO 3200, with a lot of noise (left). On the right is the same photo with heavy noise reduction applied in Lightroom. As you can see, the image on the right looks much smoother but loses some detail. This is an extreme example, but it gives you an idea of what noise reduction can do.
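The trade-off is easy to reproduce. The toy sketch below adds Gaussian "grain" to a sharp one-dimensional edge and applies a box filter, the crudest form of noise reduction; real cameras use far more sophisticated, edge-aware filters, but the smoothing effect is the same in kind: the grain in flat regions shrinks while fine transitions get blurred.

```python
import numpy as np

rng = np.random.default_rng(0)
edge = np.concatenate([np.zeros(50), np.ones(50)])  # a sharp 1-D "edge"
noisy = edge + rng.normal(0.0, 0.2, edge.size)      # high-ISO-style grain

kernel = np.ones(5) / 5                             # simple box filter
denoised = np.convolve(noisy, kernel, mode="same")  # crude noise reduction

# Grain in the flat region shrinks (averaging 5 samples cuts the
# standard deviation by roughly sqrt(5))...
flat_noise_before = noisy[10:40].std()
flat_noise_after = denoised[10:40].std()
# ...but the previously one-pixel edge transition is now spread
# across the width of the kernel: detail is lost along with noise.
```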
Here's an important caveat with the iPhone XS: if you take selfies or photos of faces in good lighting, the camera doesn't seem to apply much noise reduction, at least in my tests. In low light, the noise reduction appears more aggressive, resulting in a smoothing effect.
And it's not just faces. If you look at low-light pictures of other subjects, especially from the front camera, you may notice the same effect.
Sebastiaan de With, creator of the popular third-party camera app Halide, explains the changes to the XS camera in a blog post. A key takeaway from his deep dive:
"The iPhone XS merges the exposures and reduces the brightness of the bright areas, as well as the darkness of the shadows.The detail remains, but we can perceive it as less distinct because it has lost the local contrast."
What about shooting raw?
Since iOS 10, iPhones have been able to capture raw images from the rear camera. Raw files are captured directly from the image sensor without processing, which means no HDR effect, no noise reduction, and an untouched image.
However, de With found that if you shoot raw on the XS, the sensor noise is higher than on the X, which is why the noise reduction is more aggressive. Third-party apps will need to be optimized specifically for the new camera, or users will need to shoot in manual mode and deliberately underexpose.
Where to go from here?
One way this effect could potentially change is through a software update that offers different levels of Smart HDR, or that reduces the intensity of the noise-reduction algorithm across all photos.
Disabling Smart HDR makes more of a difference for photos taken with the rear camera than with the front camera. And as mentioned, the XS camera takes pictures differently from previous iPhones, thanks to computational photography and exposure merging. So the photos already look different, even without Smart HDR.
But much of this discussion comes down to how we see photos, especially of ourselves. Many people who showed me selfies preferred the XS because their photos were more even and a little softer. Others preferred the X because it seemed to hold more detail to their eyes, even though the picture had more noise. As always, your personal preferences may sway you one way or the other. But neither image is wrong: they are simply different.