The 7 most blatant fibs Apple told about the iPhone XS camera today – TechCrunch




Apple always drops a few whoppers at its events, and today's iPhone XS announcement was no exception. Nowhere were they more obvious than during the introduction of the device's "new" camera features. Nobody doubts that iPhones take great pictures, so why fudge it? I guess they can't help themselves.

To be clear, I have no doubt the company has made real improvements to what was already a very good camera. But whatever those improvements are, they were overshadowed today by breathless hype that was often debatable and sometimes simply untrue. Now, to fill out this article I had to get a little pedantic, but honestly, some of these are pretty glaring.

"The most popular camera in the world"

Sure, there are a lot of iPhones out there. But defining the iPhone as a single camera running continuously for ten years, which is what Apple seems to be doing, is a disingenuous way to count. By that standard Samsung would almost certainly be ahead, since it would get to count all its Galaxy phones over the same decade, and those have definitely outsold iPhones in that time. Going further, if you decided that a standard camera stack or a standard Sony or Samsung sensor counted as a "camera," Android phones would probably outnumber iPhones ten to one.

Is the iPhone one of the most popular cameras in the world? Sure. Is it the most popular camera in the world? You'd have to slice it pretty finely and say that in this or that year, this or that model outsold any other single model. The truth is that this metric is very squishy, and plenty of others could lay claim to it depending on how you choose or interpret the numbers. As usual, Apple didn't show its work here, so we might as well coin a term and call it an educated bluff.

"New remarkable dual camera system"

As Phil would explain later, much of what's new comes from improvements to the sensor and the image processor. But since he said the system was new while standing in front of an exploded view of the camera hardware, we might as well take him to be referring to that too.

It's not clear how the hardware differs from the iPhone X's. If you look at the specs, they're nearly identical:

If I told you these were different cameras, would you believe me? Same f-numbers, no reason to think the image stabilization is different or better, and so on. It would not be unreasonable to guess that these are, optically speaking, the same cameras as before. Again, not that there was anything wrong with them – they're fabulous optics. But showing components that are actually the same and saying they're different is misleading.

Given Apple's style, if there had been any real changes to the lenses or the OIS, they would have said something. It's not trivial to improve those things, and they would take credit if they had.

The sensor, of course, is extremely important, and it is improved: the 1.4-micrometer pixel pitch on the main wide-angle camera is larger than the 1.22-micrometer pitch on the X. Since the megapixel counts are similar, we can probably conclude that the "larger" sensor is a consequence of this different pixel pitch, not of any real change in form factor. It's certainly bigger, but the larger pixel pitch, which helps with sensitivity, is what's actually improved; the increased dimensions are just a consequence of that.
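As a back-of-the-envelope check (my numbers, not Apple's – I'm assuming a 12-megapixel sensor laid out at roughly 4032 × 3024), the active area scales directly with pixel pitch:

```python
# Rough sensor-size estimate from pixel pitch alone (assumed 12 MP, 4032 x 3024 layout).
# These figures are illustrative, not Apple's published dimensions.
width_px, height_px = 4032, 3024

for name, pitch_um in [("iPhone X", 1.22), ("iPhone XS", 1.4)]:
    width_mm = width_px * pitch_um / 1000    # pixel pitch in micrometers -> mm
    height_mm = height_px * pitch_um / 1000
    print(f"{name}: ~{width_mm:.2f} x {height_mm:.2f} mm active area")

# Same pixel count, bigger pixels -> bigger sensor: roughly (1.4 / 1.22)^2, or about 1.3x the area.
```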

We'll look at the image processor claims below.

"Sensor 2 times faster … for better image quality"

It's not very clear what he means by this, or what "takes advantage of all this technology" refers to. Is it the readout rate? Is it the processor that's faster, since that's what would presumably produce better image quality (more horsepower to compute colors, better encoding and so on)? "Fast" can also refer to light gathering – is that what's faster?

I don't think it's accidental that this is just sort of tossed out there and left unspecified. Apple likes big, simple numbers and doesn't want to play the spec game the way everyone else does. But in my opinion this goes beyond simplification into obfuscation. At least this one can be cleared up by testing, whether by Apple or by third parties.

"What is quite new, is to connect the ISP with this neural engine to use them together."

Now, this was a bit of sleight of hand on Phil's part. What's presumably new is that Apple has better integrated the image processing pathway between the traditional image processor, which handles things like autofocus and color, and the "neural engine," which handles things like face detection.

It may be new for Apple, but this kind of thing has been standard in many cameras for years. Both phones and interchangeable-lens systems such as DSLRs use face and eye detection, some of it relying on neural-network-style models, to guide autofocus or exposure. That (and the problems that come with it) goes back years and years. I remember point-and-shoots that advertised it, though unfortunately they often failed to detect people with darker skin or anyone who was frowning.

It has doubtless improved a lot (Apple's depth-detection hardware probably helps a great deal), but the idea of tying a face-tracking system, whatever fancy name you give it, into the image capture process is old hat. What's in the XS may be the best version of it, but it's probably not "quite new" even for Apple, let alone for the rest of photography.
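To show how pedestrian the underlying idea is, here's a minimal sketch of face-guided metering using OpenCV's stock Haar cascade – obviously nothing like Apple's ISP-plus-neural-engine pipeline, just the decade-old concept: find a face, then bias exposure toward it. The function names and the target brightness are my own illustrative choices.

```python
import cv2
import numpy as np

# Minimal sketch of face-guided exposure metering (not Apple's pipeline):
# detect a face, then meter brightness on the face region instead of the whole frame.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def metering_region_mean(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return gray.mean()                    # fall back to average metering
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # largest face wins
    return gray[y:y + h, x:x + w].mean()      # meter on the face only

def exposure_bias(frame_bgr, target=118.0):
    """Positive value -> metered region is too dark, push exposure up (in EV stops)."""
    mean = max(metering_region_mean(frame_bgr), 1.0)
    return float(np.log2(target / mean))
```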

"We have a whole new feature that we call intelligent HDR."

Apple's brand new feature has been on Google's Pixel phones for a while now. Plenty of cameras now keep a frame buffer going, essentially capturing images in the background while the app is open, then using the latest one when you hit the shutter button. And Google, among others, had the idea that these unseen frames could be used as raw material for an HDR shot.

Apple's method is probably different, and it may even be better, but fundamentally it's the same thing. Again, "brand new" to iPhone users, but well known among Android flagships.
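For the curious, the generic recipe – a sketch of the buffered-capture-plus-fusion idea, not Apple's Smart HDR or Google's HDR+ – looks roughly like this; the callback names and buffer size are hypothetical:

```python
import cv2
from collections import deque

# Generic sketch of buffered multi-frame capture + fusion (not any vendor's real code):
# keep recent frames in a ring buffer, then merge the latest few when the shutter fires.
BUFFER_SIZE = 8
frame_buffer = deque(maxlen=BUFFER_SIZE)
merger = cv2.createMergeMertens()  # exposure fusion of several frames into one image

def on_new_preview_frame(frame_bgr):
    """Called continuously while the camera app is open."""
    frame_buffer.append(frame_bgr)

def on_shutter_pressed(num_frames=4):
    """Fuse the most recent frames already sitting in the buffer."""
    frames = list(frame_buffer)[-num_frames:]
    fused = merger.process(frames)               # float image roughly in [0, 1]
    return (fused * 255).clip(0, 255).astype("uint8")
```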

"That's what you're not supposed to do, it's good, take a picture in the sun, because you're going to blow the light.

I'm not saying you should shoot directly into the sun, but it's not uncommon to include the sun in your shot. Off in a corner like that it can produce nice flares, for instance. And it won't blow out the exposure these days, because nearly every camera's auto-exposure algorithm is center-weighted, or shifts intelligently – to find faces, for example.
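(If you're wondering what "center-weighted" means in practice, a crude sketch, with a made-up Gaussian weighting rather than any camera's real curve, would be something like this – brightness near the middle of the frame counts for far more than a bright sun tucked into a corner.)

```python
import numpy as np

def center_weighted_mean(gray, sigma_frac=0.25):
    """Crude center-weighted luminance: a Gaussian weight centered on the frame.

    sigma_frac is an arbitrary illustrative choice, not any camera's actual metering curve.
    """
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    sigma = sigma_frac * min(h, w)
    weights = np.exp(-((xs - w / 2) ** 2 + (ys - h / 2) ** 2) / (2 * sigma ** 2))
    return float((gray * weights).sum() / weights.sum())

# A sun in the corner barely moves this metered value, so the exposure isn't crushed.
```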

When the sun is in your shot, your problem isn't blown highlights but a lack of dynamic range, caused by the large difference between the exposure needed for the sunny background and the shaded foreground. This is, of course, as Phil says, one of the best applications of HDR – a well-bracketed exposure can ensure you keep shadow detail while also retaining the highlights.

Curiously, in the image he chose to show here, the shadow detail is mostly lost – we just see a bunch of noise in there. You don't need HDR to capture those water droplets, either; that's really about shutter speed. It's still a nice photo, by the way – I just don't think it's a good illustration of what Phil is talking about.

"You can adjust the depth of field … this has not been possible in photography of any type of camera."

This simply isn't true. You can do it on the Galaxy S9, and it's rolling out in Google Photos as well. Lytro was doing something like it years ago, if we're including "any type of camera." Will the iPhone's version be better? Probably – it looks great to me. Has it "not been possible"? Not even close. I feel a little bad that nobody told Phil. He went out there without the facts.
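The basic trick all of these share – synthesizing blur from a depth map after capture – is simple enough to sketch. This is a toy version with a hypothetical normalized depth map, nothing like any vendor's actual implementation:

```python
import cv2
import numpy as np

def refocus(image_bgr, depth, focus_depth, aperture=1.0):
    """Toy post-capture "refocus": blur pixels more the further they are from focus_depth.

    depth is a per-pixel depth map normalized to [0, 1]; aperture just scales how
    aggressive the synthetic blur is. Illustrative only.
    """
    blurred = cv2.GaussianBlur(image_bgr, (0, 0), sigmaX=8 * aperture)
    # Per-pixel blend weight: 0 at the focus plane, 1 far from it.
    alpha = np.clip(np.abs(depth - focus_depth) * 2, 0, 1)[..., None]
    out = image_bgr * (1 - alpha) + blurred * alpha
    return out.astype("uint8")

# "Adjusting depth of field" after the fact is then just calling refocus()
# again with a different focus_depth or aperture.
```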

Well, those are the big ones. There were plenty of other, shall we say, embellishments at the event, but that's par for the course at any big company's launch. I just felt these particular ones couldn't go unanswered. I have nothing against the iPhone camera – I use one myself. But boy, are they going wild with these claims. Somebody has to say it, since clearly nobody inside Apple will.

Check out the rest of our Apple event coverage here:

More iPhone Event 2018 coverage
