The future is blurred: how smartphones shoot a new generation of bokeh photos



Call it portrait mode, bokeh mode or background blur: almost every major-brand smartphone now has a camera feature that blurs the background of your images. Portraits and nature shots come out looking very artistic, and even pictures of everyday objects can seem larger than life.

But how does it work?

These modes emulate the effect you can get with a DSLR or a mirrorless camera. In those cameras, the two main factors determining the strength of the blur are the size of the sensor and the size of the aperture, the opening in the lens through which light passes.

The latest phones, such as the Huawei Mate 20 Pro and the Pixel 3 XL, feature incredible cameras that can produce great pictures in almost any condition.

Even these have far smaller sensors than dedicated cameras, which limits their ability to blur the backgrounds of images optically. That's why they need a little help.

The two-lens effect

Most phones with a blur mode use two rear cameras to create the effect. The idea is simple.

The two cameras perceive depth much as our eyes do. The lenses are slightly offset, giving each a different view of the world, a different perspective. The closer an object is, the greater the disparity in its position and appearance between the two "eyes".

That difference is then analyzed by the phone's processor, just as our brain processes the data from our eyes, to create a depth map of the scene.

You won't see it while shooting, but you can think of it as a rudimentary 3D model, or as a map where distance is represented by a series of contour rings. An algorithm then goes to work, blurring the parts of the scene farthest from your subject.
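
As a rough illustration of that two-step process, here is a minimal sketch using OpenCV's stereo block matcher. The file names, parameter values and the crude one-threshold compositing step are illustrative assumptions, not what any particular phone actually runs.

```python
# A minimal sketch of the dual-camera pipeline: compute a disparity-based
# depth map from two offset views, then blur the "far" regions.
import cv2
import numpy as np

left = cv2.imread("left_lens.png", cv2.IMREAD_GRAYSCALE)    # view from lens A
right = cv2.imread("right_lens.png", cv2.IMREAD_GRAYSCALE)  # view from lens B

# Block matching compares small patches between the two views; the horizontal
# shift (disparity) of each patch is larger for nearer objects.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Normalise to 0..1: high values = close to the camera, low values = far away.
depth_map = cv2.normalize(disparity, None, 0.0, 1.0, cv2.NORM_MINMAX)

# Blur everything, then keep the sharp original where the depth map says
# "near" - the crudest possible version of the portrait-mode compositing step.
color = cv2.imread("left_lens.png")
blurred = cv2.GaussianBlur(color, (31, 31), 0)
mask = (depth_map > 0.5).astype(np.float32)[..., None]
portrait = (color * mask + blurred * (1.0 - mask)).astype(np.uint8)
cv2.imwrite("portrait.png", portrait)
```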

These algorithms have improved considerably since dual-lens cameras with blur modes began appearing on phones in 2014, with the HTC One M8 among the pioneers.

You'll now see a convincing progressive blur: objects slightly behind or slightly in front of your subject are only gently softened, while those farther away get the full "bokeh" treatment, taking on a nice smooth blur. This look is known as shallow depth of field.
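
One common way to build such a graduated effect, sketched below on the assumption that a 0-to-1 depth map like the one above is available, is to blend several pre-blurred copies of the photo with per-pixel weights. The level count and kernel sizes are illustrative.

```python
# A sketch of "progressive" blur: pick a blur strength per pixel from the
# depth map (0 = far, 1 = near), so objects just behind the subject are only
# slightly soft while the far background melts away.
import cv2
import numpy as np

def progressive_blur(image, depth_map, max_kernel=31, levels=4):
    """Blend several pre-blurred copies of the image according to depth."""
    h, w = depth_map.shape
    result = np.zeros_like(image, dtype=np.float32)
    weights = np.zeros((h, w), dtype=np.float32)
    for i in range(levels):
        k = 1 + 2 * int(i * max_kernel / (2 * (levels - 1)))  # odd kernel size
        layer = cv2.GaussianBlur(image, (k, k), 0).astype(np.float32)
        # Triangular weight: this layer dominates where the pixel's "blur
        # demand" (1 - depth) is near i / (levels - 1).
        demand = 1.0 - depth_map
        w_i = np.clip(1.0 - np.abs(demand * (levels - 1) - i), 0.0, 1.0)
        result += layer * w_i[..., None]
        weights += w_i
    return (result / weights[..., None]).astype(np.uint8)
```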

When used correctly, the term bokeh refers to the quality of the blur a camera lens creates. All sorts of adjectives get attached to it – you'll hear about beautiful bokeh, creamy bokeh and so on – and happily, the best phones are starting to get close to real camera bokeh.

Some phones, including the Huawei Mate 20 Pro and the Samsung Galaxy Note 9, also let you choose the level of blur. This is equivalent to setting the aperture of a camera lens, and the latest iPhone range even lets you do it after the snap is taken.
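
A toy version of what such a slider might do is sketched below: map the chosen simulated f-number to a blur kernel size, with stronger blur for wider simulated apertures. The constants and the function name are assumptions for illustration.

```python
# Wider simulated aperture (smaller f-number) -> stronger background blur.
def blur_kernel_for_f_number(f_number: float, max_kernel: int = 41) -> int:
    """Return an odd Gaussian kernel size for the chosen simulated aperture."""
    k = int(max_kernel * 1.5 / f_number)   # e.g. f/1.5 -> full strength
    k = max(1, min(k, max_kernel))
    return k if k % 2 == 1 else k + 1      # Gaussian kernels must be odd

print(blur_kernel_for_f_number(1.5))    # 41 - maximum background melt
print(blur_kernel_for_f_number(16.0))   # 3  - nearly everything in focus
```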

To date, only a few phones, including the Galaxy Note 9 and 2009's Nokia N86, have cameras with a truly variable aperture, where the opening that lets light through to the sensor can physically be widened or narrowed.

However, the idea on these phones is to narrow the aperture so the camera can better handle extremely bright conditions by letting in less light, rather than to widen it to create more blur.

Even phones with wide apertures, such as the f/1.5 Samsung Galaxy S9, simply don't have the lens chops to capture natural-looking shallow depth of field.

If you want to talk to a friend and sound like you know your stuff, make sure you use the term "crop factor".

This refers to the size of the sensor compared to a standard 35mm film frame, and it's a strong indicator both of a camera's native ability to handle low-light conditions without software help, and of how strong a blur effect you'll see for a given f-stop rating, again without software.
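
To make that concrete, here is a short worked calculation. The sensor dimensions are an assumption for a typical 1/2.55-inch phone sensor; real figures vary by model.

```python
# Crop factor = full-frame (36 x 24 mm) diagonal / this sensor's diagonal.
import math

def crop_factor(sensor_w_mm: float, sensor_h_mm: float) -> float:
    full_frame_diag = math.hypot(36.0, 24.0)          # ~43.3 mm
    return full_frame_diag / math.hypot(sensor_w_mm, sensor_h_mm)

# A typical 1/2.55" phone sensor is roughly 5.6 x 4.2 mm:
cf = crop_factor(5.6, 4.2)
print(f"crop factor ~ {cf:.1f}")                      # ~6.2

# The depth of field an f/1.8 phone lens delivers is roughly what an
# f/(1.8 * crop factor) lens would give on full frame - around f/11 here,
# which is why the native blur is so weak without software help.
print(f"full-frame equivalent aperture ~ f/{1.8 * cf:.1f}")
```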

Nowadays every smartphone's camera software is heavily optimized, but you can tell how good or bad a phone's background blur really is by asking it to handle small points of light.

Done well, these aren't just blurred – they turn into artistic-looking balls of light. Get a few of these into your bokeh pictures, as in the photo above, and you're onto a winner.

This kind of light handling shows that phones like the iPhone XS don't just mimic a lens, but the elements inside a lens.

A camera lens isn't a single piece of glass but a whole series of elements that direct light from the wide aperture onto the much smaller sensor. The arrangement and quality of those elements affect the character of the out-of-focus parts of an image.
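
A sketch of that idea: optically, each out-of-focus point of light spreads into the shape of the aperture, which software can imitate by boosting the highlights and convolving them with a disc-shaped kernel. The thresholds, boost factor and file names below are illustrative assumptions.

```python
# Why point lights become "bokeh balls": convolve boosted highlights with a
# circular kernel standing in for a round lens aperture.
import cv2
import numpy as np

def disc_kernel(radius: int) -> np.ndarray:
    """A circular averaging kernel the shape of an ideal round aperture."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    kernel = (x * x + y * y <= radius * radius).astype(np.float32)
    return kernel / kernel.sum()

img = cv2.imread("night_scene.png").astype(np.float32) / 255.0

# Boost bright pixels before blurring so highlights survive the averaging
# and come out as crisp-edged discs instead of dim smudges.
boosted = np.where(img > 0.9, img * 4.0, img)
bokeh = cv2.filter2D(boosted, -1, disc_kernel(15))
cv2.imwrite("bokeh.png", np.clip(bokeh * 255.0, 0, 255).astype(np.uint8))
```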

As you can see, software bokeh is more than just an Instagram-style filter.

Outline tracing

All phones with a background blur mode tend to struggle with scenes where the subject's outline is very complicated. Most phones capture depth at a lower resolution than the main camera, which means the depth map they create is somewhat rudimentary.

And even with the best depth systems, you'll often see a slightly coarse outline where the sharp subject meets the blurred background. Since the effect isn't optical, background blurring is still, to some extent, an educated guess, which is why you may sometimes see strange "cut-out" edges.
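
The sketch below illustrates both the problem and one common mitigation: naively upsampling a low-resolution depth map gives blocky outlines, while an edge-aware filter guided by the full-resolution photo can snap the depth edges back to the real ones. It assumes the guided filter from the opencv-contrib package (cv2.ximgproc); file names and parameters are illustrative.

```python
import cv2

photo = cv2.imread("portrait_full_res.png")
low_res_depth = cv2.imread("depth_small.png", cv2.IMREAD_GRAYSCALE)

h, w = photo.shape[:2]
# Naive upsampling keeps the coarse blocks of the low-resolution depth map,
# producing the jagged halo you sometimes see around hair and fine outlines.
blocky = cv2.resize(low_res_depth, (w, h), interpolation=cv2.INTER_NEAREST)

# Edge-aware refinement: use the photo itself as a guide so the depth-map
# edges follow the real subject outline instead of the depth sensor's grid.
refined = cv2.ximgproc.guidedFilter(photo, blocky, 16, 100)
```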

Other methods

There are other background blurring methods that don't rely on a two-camera setup, and these put even more emphasis on clever software tricks. Google's is the best implementation of a single-camera blur mode.

It isn't limited to recognizing objects and outlines: the rear cameras of the Pixel 2 and Pixel 3 use their dual-pixel autofocus to determine which areas of an image are part of the subject and which are farther away. This is based on the principle of phase detection.

Each pixel of the phone's camera sensor consists of two photodiodes, the elements that detect light. This lets each pixel separate the light received from the left and right sides of the lens.

When a pixel can indicate whether the light it captures is in focus, the camera can determine whether that part of the image belongs to the subject, and from there how far it is from the focal point.
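
In effect, the sensor produces two sub-images with a tiny baseline, and the local shift between them indicates defocus. A minimal sketch of that matching step follows; the tiling, window size and search range are illustrative assumptions about how such a system might work, not Google's actual algorithm.

```python
# Dual-pixel sketch: find, per tile, the horizontal shift that best aligns
# the "left" and "right" sub-images. ~0 shift means in focus; a larger
# absolute shift means more defocus, and hence greater distance from the
# focal plane.
import numpy as np

def dual_pixel_disparity(left: np.ndarray, right: np.ndarray,
                         window: int = 8, max_shift: int = 3) -> np.ndarray:
    """Per-tile shift (in pixels) between the two grayscale sub-images."""
    h, w = left.shape
    disp = np.zeros((h // window, w // window), dtype=np.float32)
    for ty in range(h // window):
        for tx in range(w // window):
            y, x = ty * window, tx * window
            patch = left[y:y + window, x:x + window].astype(np.float32)
            best, best_err = 0, np.inf
            for s in range(-max_shift, max_shift + 1):
                x0 = x + s
                if x0 < 0 or x0 + window > w:
                    continue
                cand = right[y:y + window, x0:x0 + window].astype(np.float32)
                err = np.sum((patch - cand) ** 2)
                if err < best_err:
                    best, best_err = s, err
            disp[ty, tx] = best
    return disp
```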

Pixel smartphones offer impressive capabilities from a single sensor

Google also brings background blur to the front camera of the Pixel 3, which doesn't have dual-pixel autofocus.

Here the effect is pure software, driven by a neural network trained to recognize people and pets.

This is the purest version of the "educated guess" mentioned above, and how well it works, like the dual-pixel version used for portraits, shows just how smart Google's software is.
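
Google's network isn't public, but a sketch of the same software-only approach using an off-the-shelf person segmenter (torchvision's DeepLab v3 here, with the Pascal VOC "person" class) looks roughly like this; the model choice, file name and blur strength are assumptions.

```python
# Software-only portrait blur: segment the person, blur everything else.
import cv2
import numpy as np
import torch
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

model = deeplabv3_resnet50(pretrained=True).eval()
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = cv2.cvtColor(cv2.imread("selfie.png"), cv2.COLOR_BGR2RGB)
with torch.no_grad():
    out = model(preprocess(image).unsqueeze(0))["out"][0]
mask = (out.argmax(0) == 15).numpy().astype(np.float32)  # class 15 = person

blurred = cv2.GaussianBlur(image, (31, 31), 0)
m = mask[..., None]
portrait = (image * m + blurred * (1.0 - m)).astype(np.uint8)
```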

There is also another method, which has happily fallen out of favor. Some older single-camera phones with a blur mode scan the focus range of the lens, capturing multiple exposures and analyzing which parts of the image lose focus as the focal point moves away from the camera. It's much slower than a good dual-camera setup, and the results often aren't as good.
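
For completeness, here is a sketch of that focus-sweep idea: score each frame of the sweep for per-pixel sharpness, then take, for each pixel, the focus position where it was sharpest as a rough depth index. The sharpness measure and window size are illustrative.

```python
# Depth from a focus sweep: the frame in which a pixel is sharpest tells you
# roughly how far away that part of the scene is.
import cv2
import numpy as np

def depth_from_focus(frames):
    """frames: list of grayscale images taken at increasing focus distance."""
    sharpness = []
    for f in frames:
        lap = cv2.Laplacian(f.astype(np.float32), cv2.CV_32F)
        # Local energy of the Laplacian = how "in focus" each region is.
        sharpness.append(cv2.GaussianBlur(lap * lap, (15, 15), 0))
    # Index of the sharpest frame per pixel doubles as a coarse depth map.
    return np.argmax(np.stack(sharpness), axis=0)
```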

  • Brought to you in association with Nokia and Android One, helping you to get more out of your smartphone. You can learn more about the new Nokia 7.1 here, and you'll find more useful tips to get the most out of your phone right here.