(Intro SFX) – Okay, what exactly is happening
with the iPhone's camera? Like we've done years of
blind smartphone camera tests in bracket format and the iPhone, supposedly one of the premium cameras in the entire smartphone industry, consistently loses in the first round. Then we do a scientific
version with 20 million+ votes and it finishes in the middle of the pack. "And yet, Marques, you named
it the fourth-time running best overall smartphone
camera system in 2022 and gave it a trophy. What's up with that?" A concerning number of people have started to notice that the iPhone's camera feels like it's taken
a few steps back lately, and I agree with them. I think we should take
a closer look at this. (relaxed music) So first of all, cameras
have come a really long way to the point where smartphone cameras aren't just cameras anymore. See, back in the day, a
camera was basically a sensor that would travel around covered all the time, and when you wanted to take a photo, you would expose that sensitive bit to the environment around it; it would collect the light and then close back up.
Then the photo would be a representation of how much light hit
each part of the sensor. The better the sensor, the more light information it gathered and the better the image you could get. Super simple. These days though, it's turned into a whole
computational event. Your smartphone sensor is
sampling the environment, not once, but often several times in rapid succession at different speeds. It's taking that light information, merging exposures together. It's doing tone mapping, noise
reduction, HDR processing and putting it all together into what it thinks will
be the best looking image. This, of course, is a very different
definition of a picture. So now it's not just about
having the best sensor that gathers the most light information, it's at the point where software makes a much bigger difference to the way the image looks
at the end of the day than anything else.
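If you're curious what that actually looks like, here's a rough back-of-the-napkin sketch in Python of the two core moves, merging exposures and tone mapping. The weights and the simple Reinhard-style curve are illustrative guesses, not any phone maker's real pipeline.

```python
# Toy sketch of computational photography's two core moves: merging
# several quick exposures, then tone mapping the result for display.
# All numbers here are illustrative, not a real camera pipeline.
import numpy as np

def merge_exposures(frames, exposure_times):
    """Average frames after normalizing by shutter time, building one
    high-dynamic-range estimate of the scene's light."""
    hdr = np.zeros(frames[0].shape, dtype=np.float64)
    for frame, t in zip(frames, exposure_times):
        hdr += frame.astype(np.float64) / t  # scale each frame to a common radiance
    return hdr / len(frames)

def tone_map(hdr):
    """Squeeze the merged HDR values into a displayable 0-255 image
    with a simple Reinhard-style x / (1 + x) curve."""
    x = hdr / hdr.max()
    y = x / (1.0 + x)
    return (y / y.max() * 255).astype(np.uint8)

# Three rapid-fire exposures at different shutter speeds -> one photo.
frames = [np.random.randint(0, 256, (4, 4), dtype=np.uint8) for _ in range(3)]
photo = tone_map(merge_exposures(frames, [1/30, 1/60, 1/120]))
```

Real pipelines also align frames, reject motion, and run noise reduction on top of this, but that's the basic shape of it.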
Like next time you watch a smartphone reveal event, for example, keep an eye on all the new
additions that get made and just how many of
them are pure software. So Google basically struck
gold when they first started using the IMX363 sensor way back in the day with
the Pixel 3's camera because they got their software
tuning with it just right and it was an instant smash hit. So they kept using that great camera combo in every Pixel since then. The 3, the 3a, the 4, the 4a, the 5, the 5a, and even the Pixel 6a.
So year after year of new phones, same sensor, same software tuning combo because it just worked. If it ain't broke, don't fix it. So when you saw the Pixel 6a win December's scientific
blind smartphone camera test, what you saw was a four-year-old sensor and software tuning combo
that is still so good that in a postage-stamp-sized comparison of compressed side-by-side images where you can't really judge sharpness or depth of field too much, basically just appreciating the basics, this combo absolutely nailed the basics better than anyone else. Now, when the Pixel 6
came along, stay with me, Google finally updated their
design and their branding and they finally changed to a new sensor with this new camera system. So they go from the
tried-and-true 12 megapixel to this massive new 50 megapixel sensor and it kind of threw a wrench into things. – So it looks to me that the
Pixel is oversharpening. I think the one on the
left looks too crunchy. – The camera on the
Pixel 6 does have a habit of making things just look HDR-y.
I dunno if there's really
a technical term for that. – [Dan] And if you look at all the photos, it's clear the Pixel is
still doing Pixel things. – I think Google's still running all of their camera algorithms at 11, like when they don't need to anymore. – Right now, new phones
with much bigger sensors are still processing like
their smaller older ones. – The basic principle is: they were doing all this
processing with the old sensors as if they were not getting a lot of light and then suddenly they had
this massive new sensor which is getting way
more light information but they were still running
all of this processing. They would still do high-sensitivity stuff and then they'd do noise reduction because if you have high sensitivity, you need noise reduction. But then since you're
doing noise reduction, you need to do sharpening on top of that to make up for it and just overall you're
doing way too much. And so the photos are
literally overprocessed. So this fancy new phone would come out with a new camera system, but you could argue, legitimately, that the older Pixel still
took better looking photos.
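To see why that cascade over-processes, here's a little Python sketch of the chain those reviewers are describing: gain, then denoise, then sharpen. The gains, sigmas, and strengths are made-up tuning values, not anything pulled from an actual camera.

```python
# Sketch of the over-processing cascade: tuning built for a small,
# light-starved sensor applied to a big sensor that no longer needs it.
# Gain/sigma/strength values are invented for illustration.
import numpy as np
from scipy.ndimage import gaussian_filter

def small_sensor_tuning(raw):
    boosted = raw * 8.0                             # heavy gain (amplifies noise)
    denoised = gaussian_filter(boosted, sigma=2.0)  # heavy noise reduction (smears detail)
    blur = gaussian_filter(denoised, sigma=1.0)
    return denoised + 1.5 * (denoised - blur)       # aggressive unsharp mask ("crunchy")

# A bigger sensor already delivers a clean, bright image...
big_sensor_raw = np.random.rand(8, 8) * 4.0
# ...so running the old chain on it over-amplifies, over-smooths,
# and over-sharpens all at once: the "overprocessed" look.
overprocessed = small_sensor_tuning(big_sensor_raw)
```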
So Google had to go back to the drawing board and work really hard, making adjustments and updates to the software to dial in this new sensor. It took a while, but
now with the Pixel 7 out a full year later with the
same huge 50 megapixel sensor, they're back on track. And hey, would you look at that: the Pixel 7 finished right behind the Pixel 6a in the blind camera test. So when I see iPhone 14 Pro photos looking a little inconsistent and a little overprocessed right now, I actually see a lot of the same stuff that Google just went
through with the Pixel.
Because the iPhone story is
kind of along the same lines, they used a small 12 megapixel sensor for years and years and years. Then the 13 Pro sensor got a little bigger but this year, the iPhone 14 Pro is the first time they're bumping up to this dramatically
larger 48 megapixel sensor. And so guess what? Some iPhone photos this year are looking a little too processed and it's nothing extreme, but it's real and they will have to work on this. I suspect that by the time we get to iPhone 15 Pro, you know, a year later, they'll have some new software
stuff they're working on.
And I bet there's one new
word they use on stage. You know, we finally have Deep Fusion and pixel-binning and all this stuff, I bet there's one new word they use to explain some software
improvement with the camera. But anyway, I think this
will continue improving with software updates over time and they'll continue to get it dialed and I think it'll be fine.
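(Quick aside, since pixel binning just came up: the idea fits in a couple of lines. A 48 megapixel sensor can average each 2x2 group of neighboring pixels into one, outputting a brighter, cleaner 12 megapixel photo. Here's a toy version, with a tiny array standing in for the real readout.)

```python
# Toy 2x2 pixel binning: four neighboring photosites average into one
# output pixel with roughly 4x the collected light.
import numpy as np

def bin_2x2(sensor):
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

raw = np.random.rand(8, 6)     # tiny stand-in for a 48 MP readout
binned = bin_2x2(raw)          # a quarter of the pixels, brighter signal each
```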
But that's only half my theory. This does not explain why all the previous 12 megapixel iPhones also lost in the first round in all those other bracket-style tests. And this is a separate issue that I'm actually a
little more curious about because as you might recall, all of our testing photos
have been photos of me. Now, this was on purpose, right? Like we specifically designed the tests to have as many potential factors to judge a photo as possible.
Like if it was just a
picture of this figurine in front of a white wall, the winner would probably just
be whichever one's brighter, maybe whichever one has a
better gold color, basically. But then if we take the figurine with some falloff in the background, now we're judging both
color and background blur. Maybe you add a sky to the background, now you're also testing
dynamic range and HDR. So yeah, with our latest
photo, it's a lot. It's two different skin tones. It's two different colored shirts. It's some textures for sharpness, the sky back there for dynamic range, short-range falloff on the left, long-range falloff on the right. I mean, with all these factors, whichever one people pick as a winner ideally is closer to
the best overall photo. I also wanted the
pictures to be of a human just because I feel like most of the important
pictures people take, the ones they care about most, are of other humans. But as it turns out, using my own face as a subject for these revealed a lot about how
different smartphones handle taking a picture of a human face.
Because as I've already mentioned, these smartphone cameras
are so much software now that the photo that you get
when you hit that shutter button isn't so much reality as much as it is this
computer's best interpretation of what it thinks you
want reality to look like. And each company makes its own choices and its own optimizations to change how its pictures look. They used to actually be a
little more transparent about it. There are phones that
would literally identify when you're taking a landscape photo and they'd pump up any greens
they can find in the grass or they'd identify any
picture with a sky in it and pump up the blues
to make it look nicer. I did a whole video on
smartphone cameras versus reality that I'll link below the Like button if you wanna check it out.
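(If you want the flavor of that in code, here's a hypothetical couple of lines: crude "grass" and "sky" detectors that quietly pump their channels. The thresholds and boost factors are mine, not from any shipping camera app.)

```python
# Hypothetical scene-aware "enhancement": detect grass/sky, boost them.
import numpy as np

def enhance(image):
    img = image.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    if ((g > r) & (g > b)).mean() > 0.3:         # mostly green? call it grass
        img[..., 1] = np.clip(g * 1.15, 0, 255)  # pump the greens
    if ((b > r) & (b > g)).mean() > 0.3:         # mostly blue? call it sky
        img[..., 2] = np.clip(b * 1.15, 0, 255)  # pump the blues
    return img.astype(np.uint8)

photo = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
nicer_than_reality = enhance(photo)
```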
But the point is, when you
snap that photo on your phone, you're not necessarily
getting back a capture of what was really in front of you. They're really bending it in many ways. The iPhone's thing is, when you take a photo, it likes to identify faces and evenly light them. It tries every time. And so this feels like a
pretty innocent thing, right? Like if you ask people, "What do you think should look good in a photo?" and you say, "Oh, I'll evenly light all the faces in it," that sounds fine, right? And a lot of the time it looks fine. But it's a subtle thing. In a photo where the light is clearly coming from one side, you can see from the Pixel's camera that there's a shadow on the right side of the face.
With the iPhone though, it's almost like someone walked up and added a little bounce fill, (chuckles) just a really nice little
subtle bounce fill. But sometimes it looks a little off. Like look, this is the
low-light photo test we did from our blind camera test. On the left is the Pixel 7 again, which looks like all the other top dogs. And on the right is the iPhone 14 Pro that finished in the middle of the pack. It might be hard at first
to see why it looks so weird but look at how they
completely removed the shadow from half of my face. I am clearly being lit from a source that's to the side of me, and that's part of reality.
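(Here's a guess at what that looks like mechanically, as a little sketch: find the face, then lift its darker pixels toward the face's average brightness. That's my illustration of the behavior, not Apple's actual algorithm.)

```python
# Sketch of "even face lighting": lift shadowed face pixels toward the
# face's mean brightness. A guess at the behavior, not Apple's code.
import numpy as np

def fill_face_shadows(luma, face_mask, strength=0.6):
    out = luma.astype(np.float64)
    target = out[face_mask].mean()                 # "evenly lit" brightness
    dark = face_mask & (out < target)              # the shadowed side
    out[dark] += strength * (target - out[dark])   # digital bounce fill
    return np.clip(out, 0, 255).astype(np.uint8)

luma = np.random.randint(0, 256, (6, 6), dtype=np.uint8)
mask = np.zeros((6, 6), dtype=bool)
mask[1:5, 1:5] = True                              # pretend face-detector output
evenly_lit = fill_face_shadows(luma, mask)         # light direction? gone.
```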
But in the iPhone's
reality, you cannot tell, at least from my face, where
the light is coming from. Every once in a while you
get weird stuff like this. And it all comes back to the fact that it's software making choices. And the other half of that is skin tones. So you've heard me say
for a few years in a row that I mostly prefer photos
coming from the Pixel's camera, and we've done lots of tests where I'm the sample photo and you can tell it looks really good. Turns out Google's done this
thing over the past few years with the Pixel camera called Real Tone. It doesn't get that much attention, but it turns out to be making
a pretty big difference here. Historically, a real issue for
film cameras back in the day was that they were calibrated
for lighter skin tones and people with darker skin tones would typically be
underexposed in those pictures. Now fast forward to today: smartphone cameras are all software, so they can all make adjustments to account for a whole variety of skin tones, of course.
But they still all do it to varying degrees. Like you might have noticed a lot of phones sold in China will just brighten up
faces across the board because that's what people prefer in photos in that region very often. Google goes the extra mile to train their camera
software on data sets that have a large variety of skin tones to try to represent them
correctly across the board. And that's what it's calling Real Tone. And Apple's cameras, from what I've observed, simply like to evenly light faces across the board and don't necessarily account for the different white balances and exposures necessary to accurately represent different types of skin tones, when I think they totally could.
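(To make the underexposure problem concrete, here's a tiny sketch of face-weighted auto-exposure: meter on the detected face instead of the whole frame, so a darker skin tone isn't dragged down by a bright background. This is my own illustration of the idea, not Google's Real Tone implementation, which is about training data and tuning across the whole pipeline.)

```python
# Toy face-weighted auto-exposure: meter on the face, not the frame.
# My illustration of the underexposure fix, not Real Tone itself.
import numpy as np

def exposure_gain(luma, face_mask, target=128.0):
    face_mean = luma[face_mask].mean()
    return target / max(face_mean, 1.0)   # gain that lands the face at mid-gray

frame = np.full((6, 6), 200.0)            # bright background
frame[2:4, 2:4] = 60.0                    # darker-skinned face region
mask = np.zeros((6, 6), dtype=bool)
mask[2:4, 2:4] = True

naive_gain = 128.0 / frame.mean()         # global metering: face stays dark
fair_gain = exposure_gain(frame, mask)    # face metering: face exposed correctly
```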
So basically, it turns out this is a big part of what we were observing: the Pixel, and a lot of the phones that do accurately represent my skin tone, finished higher in this blind voting thing that we did because they happen to do that really well, and that's a thing that people really considered when they voted on them. I haven't said this a lot, but I think this is one
of the earliest reasons that I actually really liked RED cameras: you know, obviously 8K is great, the color science is great, but the way they represent and render my skin tone accurately, compared to a lot of, you know, the Sonys and the ARRIs and the Canons that I've tried, that's actually one of the things that really drew me to these cameras.
So all this software stuff is why photo comparisons between modern smartphones are so hard. Like there are a lot of channels that do a really good job with the side-by-side photo tests, you know, but even as you're trying to pick one over the other, you've probably noticed this: you might like the way one of them renders landscape photos, but prefer the way a different one renders photos with your own skin tone, and the way yet another renders photos of your pet, for example. So I'm sure Apple will defend
everything they're doing now with their current cameras
as they typically do. But I'm also sure of something else, and I'm gonna keep an eye on it: they're for sure working on tuning these new cameras, dialing them in, and eventually getting it better with the iPhone 15 and 15 Pro.
So back to the original question from the beginning of the video, we can't leave that unanswered, which is, "All right, Marques, you like the Pixel photos, the Pixel 6a won the blind scientific camera test, but you still gave the trophy for best overall camera system to the iPhone, the very 14 Pro that we've been talking about this whole video. Why?" And if you listened carefully, you already got it: that scientific test that we did tested one specific thing. It tested the small, postage-stamp-sized, you know, general exposure and colors thing with a bunch of different factors. But sharpness and detail, with all the compression that we did, weren't tested.
Also, speed of autofocus and reliability of autofocus weren't tested. The open-close time of the camera app, how fast and reliably you can get a shot, wasn't tested. And video wasn't tested either. So the microphone quality, video quality, speed and reliability of autofocus there, file formats, sharpness, HDR,
all that stuff, wasn't tested. Maybe someday we will test
all that, but until then, the lesson learned is the pretty pictures that come from the Pixel or
whatever phone's in your pocket are partly photons, but
primarily processing. (relaxed music) Thanks for watching. Catch you guys in the next one. Peace. (record crackling).