How to get the most out of iPhone 12 Pro and Pro Max cameras




Photo: Caitlin McGarry / Gizmodo

If you're paying hundreds of dollars extra for the Pro or Pro Max version of the iPhone 12, you want to make sure you get what you pay for – and taking advantage of the extra photo-taking capabilities is a key part of that. Here are the features you get and how to get the most out of them.

The iPhone 12 Pro and the iPhone 12 Pro Max have three rear cameras, compared to the two on the iPhone 12 and the iPhone 12 mini. The additional lens is a telephoto, which means you get 2x optical zoom on the Pro and 2.5x on the Pro Max (for 4x and 5x total optical range, respectively).

It should be noted that the Pro Max has a slightly better camera setup than the Pro beyond the larger optical zoom range: its 12MP main sensor is physically larger, so it can capture more light for improved low-light photography, and the sensor itself shifts position to counter camera shake rather than having it corrected in software afterwards.

There is also a LiDAR scanner on board, a technology we have explained in more detail here. It's basically an upgrade to your iPhone's depth-sensing skills, so augmented reality apps are more accurate and the camera's autofocus can lock onto a point in space faster (especially in low light).

Most of the time, the benefits of the iPhone 12 Pro and iPhone 12 Pro Max cameras are applied automatically in the background: there are no settings to toggle or features to switch on. However, it always helps to know what the main benefits are and when you'll see the biggest improvements.

Using the zoom

The zoom on the Pro Max goes up to 2.5x.
Screenshot: iOS

You'll notice an additional zoom option when you load up the Camera app on the iPhone 12 Pro and Pro Max, compared to the other iPhone 12 variants: alongside the 0.5x and 1x options, you'll see a 2x option on the iPhone 12 Pro and a 2.5x option on the iPhone 12 Pro Max.

The benefits of the extra optical zoom are obvious: even when you can't get physically close to what you're photographing, whether it's the stage at a concert or a bird in your garden, the iPhone camera can help. The built-in sensor-shift stabilization on the iPhone 12 Pro Max should also come in handy at higher zoom levels.

It's worth noting that the Pro Max's longer telephoto lens collects less light. If you use the 2.5x zoom in low light, the native Apple Camera app will switch to cropping a photo taken with the primary lens, keeping noise levels down while maintaining the zoom effect. If you want to keep using the 2.5x telephoto lens anyway, you'll need a third-party app that gives you full control (like Halide, whose developers have explained this camera switching in detail).
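If you're curious what "full control" looks like under the hood, here's a rough sketch – our own illustration, not Halide's actual code – of how a third-party app can pin the capture pipeline to the telephoto module using Apple's AVFoundation framework, so iOS can't quietly swap in a crop from the main camera. The function name and the trimmed error handling are ours.

import AVFoundation

// Sketch only: lock the capture session to the rear telephoto module so iOS
// can't silently substitute a crop from the wide camera. Error handling trimmed.
func makeTelephotoSession() -> AVCaptureSession? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTelephotoCamera], // telephoto only, no virtual multi-cam
        mediaType: .video,
        position: .back
    )
    guard let telephoto = discovery.devices.first,
          let input = try? AVCaptureDeviceInput(device: telephoto) else {
        return nil // no telephoto module (e.g. iPhone 12 / 12 mini)
    }

    let session = AVCaptureSession()
    session.beginConfiguration()
    session.sessionPreset = .photo
    if session.canAddInput(input) { session.addInput(input) }

    let photoOutput = AVCapturePhotoOutput()
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    session.commitConfiguration()

    return session
}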

The usual rules for zoomed-in shooting apply: keep the camera as stable as possible, using a tripod or other support if you can. The improved shutter speeds and anti-shake technology of the iPhone 12 Pro and iPhone 12 Pro Max should help you here, but you can still take steps to make sure you get a great shot every time.

Shooting at night

Night mode sets the exposure time automatically, but you can override it.
Screenshot: iOS

Every iPhone 12 model is very capable of taking photos in low light, but with its larger sensor, the iPhone 12 Pro Max will give you the best results. There's no dedicated Night mode button in the iOS Camera app; instead, the mode should activate automatically when you're shooting in low-light conditions.

You'll see a yellow icon appear on the shutter screen when Night mode is enabled, and Apple's software will choose an exposure time based on the scene you're looking at (the time is displayed next to the Night mode icon itself). The longer the exposure time, the more light gets into the photo, but the longer you have to keep the camera steady (again, the Pro Max's shake-correcting sensor can make a difference here).

If you want to let in as much light as possible and you're confident in your ability to keep the phone still – or you have a tripod – tap the yellow Night mode icon and you can adjust the exposure time manually, up to a maximum of three seconds. In some cases it can be worth overriding the Camera app's automatic choice to capture more detail.
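For the developers among you: Night mode itself isn't something third-party apps can switch on programmatically, but AVFoundation's manual exposure controls are the closest equivalent. The sketch below is our own illustration (the helper name is made up), and note that the Camera app's multi-second Night mode shots are actually built by combining several shorter frames rather than one long exposure.

import AVFoundation

// Sketch only: set a long manual exposure on an already-configured camera.
// The requested duration is clamped to whatever the active format supports.
func setLongExposure(on device: AVCaptureDevice, seconds: Double) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    let requested = CMTime(seconds: seconds, preferredTimescale: 1_000_000)
    let duration = min(requested, device.activeFormat.maxExposureDuration)
    let iso = device.activeFormat.minISO // keep ISO low to limit noise

    device.setExposureModeCustom(duration: duration, iso: iso, completionHandler: nil)
}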

Switch to Portrait mode using the selector next to the shutter button and you can get some really well-judged background blur effects, thanks to the advanced depth sensing that LiDAR provides: it's better able to identify the edge between subject and background using laser scanning, as Ben Lovejoy has demonstrated at 9to5Mac.
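Under the hood, that blur is built on a depth map captured alongside the photo. As a rough illustration – not the Camera app's own pipeline – this is how an app can request depth data through AVFoundation; the photo output and delegate are assumed to be set up elsewhere.

import AVFoundation

// Sketch only: ask for the depth map that portrait-style blur is built on.
// `photoOutput` is assumed to be attached to a running session using a
// rear camera that supports depth delivery.
func captureWithDepth(using photoOutput: AVCapturePhotoOutput,
                      delegate: AVCapturePhotoCaptureDelegate) {
    guard photoOutput.isDepthDataDeliverySupported else { return }
    photoOutput.isDepthDataDeliveryEnabled = true

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = true

    // The delegate receives an AVDepthData object alongside the image,
    // which an editing app can use to separate subject from background.
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}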

Putting LiDAR to work

Apple's Measure app shows the power of LiDAR.
Image: Apple

We've already mentioned how LiDAR can help with night photography and portraits, and its superior depth detection is one of the main ways the Pro cameras improve your photos without you having to do anything – it's faster and more accurate than the time-of-flight (ToF) sensors included on many other handsets.

You'll see it at work in enhanced portraits and faster focusing in low light. However, the main benefit of adding LiDAR to a phone camera isn't improving the images you can capture, but adding extra speed and precision to augmented reality applications. At the moment there aren't a huge number of AR apps you can install to test this out, but over time we'll see more and more appear.

The Measure app ships with your iPhone and offers a quick and easy way to test AR, although the differences in speed and accuracy may not be immediately obvious. Point the Measure app at a person – making sure their full body is in the camera's viewfinder – and after a second or two you should see the app give you an estimate of their height, thanks to the integrated LiDAR scanner.
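If you're wondering where that height estimate comes from, this is roughly how an app opts in to the LiDAR-backed depth data via Apple's ARKit framework. It's a sketch that assumes a standard ARSCNView setup, not Apple's actual Measure code.

import ARKit

// Sketch only: opt in to the LiDAR-backed depth data and scene mesh that
// experiences like Measure's height estimate build on.
func startLiDARSession(on view: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // sceneDepth is only offered on LiDAR-equipped devices.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    // LiDAR also enables a live mesh of the surroundings.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    view.session.run(configuration)
    // Each ARFrame then exposes a per-pixel depth map via frame.sceneDepth?.depthMap.
}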

Apps such as Scandy Pro 3D Scanner, RoomScan LiDAR, and Snapchat will also give you an idea of how LiDAR scanning works on the iPhone 12 Pro and Pro Max – the developers at Snapchat have promised an AR filter that specifically takes advantage of LiDAR on Apple's more expensive 2020 iPhones, although at the time of writing we were still waiting for it to roll out.

Using ProRAW Mode

ProRAW adds a new format to the Camera app.
Screenshot: iOS

The arrival of Apple's ProRAW mode is a big update for iPhone 12 Pro and Pro Max users, offering more control for serious photographers. RAW-format photos have been available on phones for years now, minimizing the processing and effects added by software and giving users access to the "raw" data captured by the image sensor – in the case of ProRAW, though, Apple is trying to give us the best of both worlds.

That means some of the iPhone's processing magic, like Deep Fusion and Smart HDR, is applied to the image, while photographers still get as much flexibility as possible when it comes to adjusting white balance, tone, color, and so on in an image editor. "ProRAW provides you with all of the standard RAW information, as well as data from the Apple image pipeline," Apple explains.

At the time of writing, ProRAW has just arrived in the iOS 14.3 beta, although it may have fully rolled out by the time you read this. To turn it on, head to Camera and then Formats in the iOS Settings app: once that's done, you'll see a RAW button in the corner of the screen when taking photos, which you can tap to toggle ProRAW on and off.
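On the developer side, iOS 14.3 also exposes ProRAW through AVFoundation's AVCapturePhotoOutput. Here's a minimal sketch of that API – the function name is ours, and the photo output and delegate are assumed to be configured elsewhere.

import AVFoundation

// Sketch only: enable Apple ProRAW capture via the iOS 14.3 API.
// `photoOutput` and `delegate` are assumed to be configured elsewhere.
func captureProRAW(with photoOutput: AVCapturePhotoOutput,
                   delegate: AVCapturePhotoCaptureDelegate) {
    // Only supported on iPhone 12 Pro / Pro Max running iOS 14.3 or later.
    guard photoOutput.isAppleProRAWSupported else { return }
    photoOutput.isAppleProRAWEnabled = true

    // Pick a pixel format flagged as ProRAW rather than plain Bayer RAW.
    guard let proRAWFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) }) else { return }

    let settings = AVCapturePhotoSettings(rawPixelFormatType: proRAWFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}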

The feature has only just appeared, so we're waiting to see exactly what difference it makes to photos and workflows. You can already capture images on an iPhone in a more conventional RAW format, you just need a third-party app to do it: VSCO, Darkroom, and Snapseed are among those with the necessary functionality.
