Transcript
00:00Okay, what exactly is happening with the iPhone's camera?
00:06Like we've done years of blind smartphone camera tests
00:09in bracket format and the iPhone supposedly
00:11one of the premium cameras in the entire smartphone industry
00:14consistently loses in the first round.
00:16Then we do a scientific version with 20 million plus votes
00:19and it finishes in the middle of the pack.
00:21And yet Marques, you named it the fourth time running
00:25best overall smartphone camera system in 2022
00:28and gave it a trophy.
00:30What's up with that?
00:31A concerning number of people have started to notice
00:33that the iPhone's camera feels like it's taken
00:35a few steps back lately and I agree with them.
00:39I think we should take a closer look at this.
00:43So first of all, cameras have come a really long way
00:46to the point where smartphone cameras
00:48aren't just cameras anymore.
00:51See, back in the day, a camera was basically a sensor
00:54that would travel around covered all the time.
00:56And when you wanted to take a photo,
00:57you would expose that sensitive bit
00:59to the environment around it
01:00and it would collect the light, and then close again.
01:02Then the photo would be a representation
01:04of how much light hit each part of the sensor.
01:07The better the sensor, the better an image you could get,
01:09the more light information, super simple.
01:11These days though,
01:12it's turned into a whole computational event.
01:14Your smartphone sensor is sampling the environment,
01:17not once, but often several times in rapid succession
01:20at different speeds.
01:21It's taking that light information,
01:23merging exposures together, it's doing tone mapping,
01:28noise reduction, HDR processing,
01:30and putting it all together into what it thinks
01:33will be the best looking image.
01:35This of course is a very different definition of a picture.
01:39So now it's not just about having the best sensor
01:41that gathers the most light information,
01:44it's at the point where software
01:46makes a much bigger difference
01:47to the way the image looks at the end of the day
01:50than anything else.
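What that "computational event" boils down to can be sketched in a few lines. This is a toy version, assuming a classic weighted exposure merge plus a simple gamma tone map; it is not any phone's actual pipeline:

```python
import numpy as np

def merge_and_tone_map(brackets, gamma=2.2):
    """Merge an exposure bracket into one frame, then tone map it.

    brackets: list of (image, exposure_time) pairs, where each image is a
    float array of linear sensor values clipped to [0, 1].
    """
    num = np.zeros_like(brackets[0][0], dtype=np.float64)
    den = np.zeros_like(num)
    for img, t in brackets:
        # Hat-shaped weight: mid-tones count most, clipped or near-black
        # pixels count least (as in classic HDR merging).
        w = 1.0 - np.abs(img - 0.5) * 2.0
        num += w * (img / t)          # normalize by exposure time
        den += w
    radiance = num / np.maximum(den, 1e-8)   # estimated scene radiance

    # Simple global tone map: normalize, then apply a gamma curve so the
    # merged high-dynamic-range values fit a displayable 0-1 range.
    radiance = radiance / radiance.max()
    return radiance ** (1.0 / gamma)

# Example: two exposures (1/4x and 1x) of the same bright scene.
scene = np.linspace(0.0, 2.0, 5)           # "true" radiance, clips at 1.0
short = np.clip(scene * 0.25, 0.0, 1.0)    # short exposure keeps highlights
long_ = np.clip(scene * 1.0, 0.0, 1.0)     # long exposure keeps shadows
result = merge_and_tone_map([(short, 0.25), (long_, 1.0)])
```

The short frame recovers the highlights the long frame blew out, and the merge blends the two by confidence; real pipelines then layer tone mapping, noise reduction, and sharpening on top of this.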
01:51Like next time you watch a smartphone reveal event,
01:53for example, keep an eye on all the new additions
01:56that get made and just how many of them are pure software.
01:59So Google basically struck gold
02:00when they first started using the IMX363 sensor
02:04way back in the day with the Pixel 3's camera,
02:06because they got their software tuning with it just right
02:09and it was an instant smash hit.
02:11So they kept using that great camera combo
02:14in every Pixel since then.
02:15The 3, the 3A, the 4, the 4A, the 5, the 5A,
02:21and even the Pixel 6A.
02:22So year after year of new phones,
02:24same sensor, same software tuning combo
02:27because it just worked.
02:28If it ain't broke, don't fix it.
02:30So when you saw the Pixel 6A
02:32win December's scientific blind smartphone camera test,
02:36what you saw was a four year old sensor
02:39and software tuning combo that is still so good
02:42that in a postage stamp sized comparison
02:44of compressed side-by-side images
02:46where you can't really judge sharpness
02:47or depth of field too much,
02:49basically just appreciating the basics,
02:51this combo absolutely nailed the basics
02:54better than anyone else.
02:54Now, when the Pixel 6 came along, stay with me,
02:57Google finally updated their design and their branding
03:00and they finally changed to a new sensor
03:03with this new camera system.
03:05So they go from the tried and true 12 megapixel
03:07to this massive new 50 megapixel sensor
03:11and it kind of threw a wrench into things.
03:14So it looks to me like the Pixel is over-sharpening.
03:18I think the one on the left looks too crunchy.
03:20The camera on the Pixel 6 does have a habit
03:22of making things just look HDR-y.
03:24I don't know if there's really a technical term for that.
03:26If you look at all the photos,
03:27it's clear the Pixel is still doing Pixel things.
03:30I think Google's still running
03:33all of their camera algorithms at 11,
03:36like when they don't need to anymore.
03:37Right now, new phones with much bigger sensors
03:40are still processing like their smaller, older ones.
03:43The basic principle is they were doing all this processing
03:46with the old sensors
03:47as if they were not getting a lot of light
03:49and then suddenly they had this massive new sensor,
03:51which is getting way more light information.
03:53But they were still running all of this processing.
03:56They would still do high sensitivity stuff
03:58and then they do noise reduction
04:00because if you have high sensitivity,
04:01you need noise reduction.
04:02But then since you're doing noise reduction,
04:04you need to do sharpening on top of that to make up for it.
04:06And just overall, you're doing way too much.
04:09And so the photos are literally over-processed.
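That cascade, denoise hard and then sharpen hard to win the detail back, can be illustrated with a toy one-dimensional pipeline. The blur sizes and sharpen amount below are made-up illustrative values, not any phone's real tuning:

```python
import numpy as np

def box_blur(x, k=3):
    """Simple moving-average blur (a stand-in for noise reduction)."""
    kernel = np.ones(k) / k
    return np.convolve(x, kernel, mode="same")

def legacy_pipeline(x, sharpen_amount=1.5):
    """Denoise aggressively, then sharpen aggressively to compensate.

    Tuned as if for a small, noisy sensor; the amounts are illustrative.
    """
    denoised = box_blur(x, k=3)
    # Unsharp mask: add back the difference between the signal and a
    # blurred copy, scaled by sharpen_amount.
    return denoised + sharpen_amount * (denoised - box_blur(denoised, k=3))

# A clean edge, as a big modern sensor might capture it with little noise.
clean_edge = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
processed = legacy_pipeline(clean_edge)
# The aggressive sharpening overshoots past the original 0-1 range,
# producing halos on either side of the edge.
print(processed.min() < 0.0 or processed.max() > 1.0)  # → True
```

Run on a clean edge, the tuned-for-noise pipeline undershoots on the dark side and overshoots on the bright side, which is exactly the crunchy, over-processed look being described.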
04:12So this fancy new phone had come out
04:14with a new camera system,
04:15but you could argue legitimately
04:17that the older Pixel still took better looking photos.
04:20So Google had to work really hard at the drawing board
04:22and make some adjustments and some updates to the software
04:25to dial in this new sensor.
04:27It took a while,
04:28but now with the Pixel 7 out a full year later
04:31with the same huge 50 megapixel sensor,
04:33they're back on track.
04:34And hey, would you look at that?
04:35Pixel 7 right behind the Pixel 6a in the blind camera test.
04:39So when I see iPhone 14 Pro photos
04:41looking a little inconsistent
04:42and a little over-processed right now,
04:45I actually see a lot of the same stuff
04:47that Google just went through with the Pixel.
04:49Because the iPhone story is kind of along the same lines.
04:52They used a small 12 megapixel sensor
04:54for years and years and years.
04:56Then the 13 Pro sensor got a little bigger,
04:58but this year the iPhone 14 Pro
05:00is the first time they're bumping up
05:01to this dramatically larger 48 megapixel sensor.
05:05And so guess what?
05:06Some iPhone photos this year
05:08are looking a little too processed
05:11and it's nothing extreme, but it's real
05:13and they will have to work on this.
05:14I suspect that by the time we get to iPhone 15 Pro,
05:20you know, a year later,
05:21they'll have some new software stuff they're working on.
05:22And I bet there's one new word they use on stage.
05:26You know, we finally have Deep Fusion
05:28and pixel binning and all this stuff.
05:30I bet there's one new word they use
05:32to explain some software improvement with the camera.
05:35But anyway, I think this will continue improving
05:36with software updates over time
05:38and they'll continue to get it dialed
05:40and I think it'll be fine.
05:42But that's only half my theory.
05:45This does not explain
05:47why all the previous 12 megapixel iPhones
05:50also all lost in the first round
05:53and all those other bracket style tests.
05:54And this is a separate issue
05:55that I'm actually a little more curious about
05:58because as you might recall,
06:00all of our testing photos have been photos of me.
06:05Now this was on purpose, right?
06:06Like we specifically designed the tests
06:08to have as many potential factors
06:10to judge a photo as possible.
06:11Like if it was just a picture of this figurine
06:13in front of a white wall,
06:14the winner would probably just be
06:16whichever one's brighter,
06:17maybe whichever one has a better gold color basically.
06:20But then if we take the figurine
06:21with some fall off in the background,
06:23now we're judging both color and background blur.
06:27Maybe you add a sky to the background,
06:28now you're also testing dynamic range and HDR.
06:31So yeah, with our latest photo, it's a lot.
06:33It's two different skin tones,
06:35it's two different colored shirts,
06:37it's some textures for sharpness,
06:39the sky back there for dynamic range,
06:41short range fall off on the left,
06:42long range fall off on the right.
06:44I mean, with all these factors,
06:45whichever one people pick as a winner,
06:47ideally is closer to the best overall photo.
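One common way to turn millions of A-versus-B picks like these into a single ranking is an Elo-style rating. Whether the actual test used exactly this method is an assumption here; this is a minimal sketch:

```python
from collections import defaultdict

def elo_ratings(votes, k=32.0, base=1000.0):
    """Turn pairwise 'photo A beat photo B' votes into ratings.

    votes: iterable of (winner, loser) phone names. Standard Elo update:
    an upset win moves the ratings more than an expected win.
    """
    ratings = defaultdict(lambda: base)
    for winner, loser in votes:
        # Expected score of the winner given the current rating gap.
        expected = 1.0 / (1.0 + 10 ** ((ratings[loser] - ratings[winner]) / 400))
        ratings[winner] += k * (1.0 - expected)
        ratings[loser] -= k * (1.0 - expected)
    return dict(ratings)

# Hypothetical votes, not real data: one phone wins 3 of 4 head-to-heads.
votes = [("Pixel 6A", "iPhone 14 Pro")] * 3 + [("iPhone 14 Pro", "Pixel 6A")]
r = elo_ratings(votes)
print(r["Pixel 6A"] > r["iPhone 14 Pro"])  # → True
```

The appeal of a rating like this is that no single pair of photos decides anything; the ranking emerges from the whole pool of head-to-head votes.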
06:50I also wanted the pictures to be of a human
06:52just because I feel like most of the important pictures
06:54that people take most often that they care about
06:56are of other humans.
06:58But as it turns out,
06:59using my own face as a subject for these
07:02revealed a lot about how different smartphones handle
07:05taking a picture of a human face.
07:09Because as I've already mentioned,
07:10these smartphone cameras are so much software now
07:15that the photo that you get
07:16when you hit that shutter button,
07:18isn't so much reality as much as it is
07:22this computer's best interpretation
07:25of what it thinks you want reality to look like.
07:29And each company makes different choices
07:32and different optimizations
07:36to change how their pictures end up looking.
07:38They used to actually be a little more transparent about it.
07:40There are phones that would literally identify
07:42when you're taking a landscape photo
07:44and they'd pump up any greens they can find of the grass
07:47or they'd identify any picture with a sky in it
07:49and pump up the blues to make it look nicer.
07:51I did a whole video on smartphone cameras versus reality
07:54that I'll link below the like button
07:55if you wanna check it out.
07:56But the point is when you snap that photo on your phone,
08:00you're not necessarily getting back a capture
08:02of what was really in front of you.
08:04They're really bending it in many ways.
08:07The iPhone's thing is when you take a photo,
08:12it likes to identify faces and evenly light them.
08:18It tries every time.
08:20And so this feels like a pretty innocent thing, right?
08:22Like if you ask people normally,
08:23what do you think should look good in a photo?
08:25And you say, I'll evenly light all the faces in it.
08:28That sounds fine, right?
08:30And a lot of time it looks fine, but it's a subtle thing.
08:33Like in a photo where you can see the light is coming
08:35from one side clearly,
08:36where you can see from the Pixel's camera,
08:38there's a shadow on the right side of the face.
08:42With the iPhone though,
08:42it's almost like someone walked up
08:43and added a little bounce fill,
08:45just a really nice little subtle bounce fill.
08:48But sometimes it looks a little off.
08:50Like, look, this is the low light photo test we did
08:52from our blind camera test.
08:54On the left is the Pixel 7 again,
08:56which looks like all the other top dogs.
08:57And on the right is the iPhone 14 Pro
09:00that finished in the middle of the pack.
09:01It might be hard at first to see why it looks so weird,
09:04but look at how they completely removed the shadow
09:08from half of my face.
09:09I am clearly being lit from a source
09:12that's to the side of me, and that's part of reality.
09:16But in the iPhone's reality, you cannot tell,
09:19at least from my face where the light is coming from.
09:21Every once in a while you get weird stuff like this.
09:23And it all comes back to the fact
09:24that it's software making choices.
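A toy version of that "software bounce fill" might look like this: lift only the pixels in a detected face region that sit below the region's mean brightness. This is purely illustrative, not Apple's actual algorithm; the face box and strength parameter are made up:

```python
import numpy as np

def fill_face_shadows(luma, face_box, strength=0.6):
    """Pull a face region's dark pixels toward the region's mean brightness.

    luma: 2-D array of brightness values. face_box: (top, bottom, left,
    right) of a hypothetical detected face. strength=1.0 flattens the
    face's lighting completely; 0.0 leaves it untouched.
    """
    out = luma.astype(np.float64).copy()
    t, b, l, r = face_box
    face = out[t:b, l:r]
    mean = face.mean()
    # Lift only the pixels darker than the mean (the shadow side);
    # leave the lit side of the face alone.
    shadow = face < mean
    face[shadow] += strength * (mean - face[shadow])
    out[t:b, l:r] = face
    return out

# A face lit from one side: left half in shadow (0.2), right half lit (0.8).
luma = np.full((4, 4), 0.8)
luma[:, :2] = 0.2
filled = fill_face_shadows(luma, (0, 4, 0, 4), strength=0.6)
```

After the fill, the shadow side rises from 0.2 toward the face's mean while the lit side stays put, which is why the direction of the light gets harder to read.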
09:28And the other half of that is skin tones.
09:31So you've heard me say for a few years in a row
09:33that I mostly prefer photos coming from the Pixel's camera.
09:36And we've done lots of tests
09:37where I use myself as the sample photo
09:40and you can tell it looks really good.
09:41It turns out Google's done this thing
09:42over the past few years with the Pixel camera
09:43called Real Tone.
09:45It doesn't get that much attention,
09:47but it turns out to be making a pretty big difference here.
09:50Historically, a real issue for film cameras back in the day
09:53was that they were calibrated for lighter skin tones
09:56and people with darker skin tones
09:58would typically be underexposed in those pictures.
10:01So now fast forward today, cameras are all software.
10:05Smartphone cameras are software.
10:06So they can all make adjustments
10:07to account for a variety of skin tones, of course.
10:11But they all still do it to varying degrees.
10:14Like you might've noticed,
10:15a lot of phones sold in China
10:17will just brighten up faces across the board
10:20because that's what people prefer in photos
10:23in that region very often.
10:24Google goes the extra mile
10:26to train their camera software on data sets
10:28that have a large variety of skin tones
10:30to try to represent them correctly across the board.
10:33And that's what they're calling Real Tone.
10:35And Apple's cameras, from what I've observed,
10:38simply just evenly light faces across the board
10:41and don't necessarily account
10:43for the different white balances and exposures
10:46necessary to accurately represent
10:48different types of skin tones,
10:50when I think they totally could.
10:51So basically it turns out this is a big part
10:54of what we were observing in the Pixels,
10:57and a lot of the phones that do accurately represent
10:59my skin tone finished higher
11:02in this blind voting thing that we did
11:04because they happen to do that really well,
11:06and that's the thing people really considered
11:07when they voted on them.
11:08I haven't said this a lot,
11:09but I think this is one of the earliest reasons
11:11that I actually really liked RED cameras
11:13was obviously 8K is great, color science is great,
11:17but the way it represents and renders my skin tone
11:20accurately over a lot of the Sonys and the ARRIs
11:24and Canons that I've tried,
11:26that's actually one of the things
11:27that really drew me to these cameras.
11:29So all this software stuff is why photo comparisons
11:32between modern smartphones are so hard.
11:35Like there are a lot of channels that do a really good job
11:36with the side-by-side photo test, you know.
11:39But even as you're trying to like pick one over the other,
11:41you've probably noticed this.
11:43You might like the way one of them
11:44renders landscape photos over the other,
11:46but prefer the way a different one renders photos
11:49with your own skin tone,
11:50and the way yet another one renders photos
11:53of your pet, for example.
11:54So I'm sure Apple will defend everything they're doing now
11:57with their current cameras as they typically do.
12:00But I'm gonna keep an eye on what I'm also sure of,
12:02which is that they're working on
12:04tuning these new cameras, dialing them in,
12:06and eventually getting it better
12:08with the iPhone 15 and 15 Pro.
12:11So back to the original question
12:13from the beginning of the video,
12:14can't leave that unanswered, which is,
12:16all right, the Pixel 6a,
12:18you like the Pixel photos, Marques,
12:20it won the blind scientific camera test,
12:23but you still gave the trophy
12:24for best overall camera system to the iPhone.
12:27The very 14 Pro that we've been talking about
12:30this whole video, why?
12:31And if you listen carefully, you already got it,
12:34which is that scientific test that we did
12:37tested one specific thing.
12:39It tested the small postage stamp sized,
12:42you know, exposure and colors general thing
12:43with a bunch of different factors,
12:45but sharpness and detail with all the compression
12:49that we did wasn't tested.
12:51Also speed of autofocus,
12:53reliability of autofocus wasn't tested.
12:56The open close time of the camera app,
12:59how fast and reliable you can get a shot wasn't tested.
13:03And also video wasn't tested.
13:05So the microphone quality, video quality,
13:07speed and reliability of autofocus there,
13:10file formats, sharpness, HDR,
13:12all that stuff wasn't tested.
13:14Maybe someday we will test all that,
13:16but until then the lesson learned is
13:19the pretty pictures that come from the Pixel
13:22or whatever phones in your pocket are partly photons,
13:27but primarily processing.
13:31Thanks for watching.
13:33Catch you guys in the next one.
13:34Peace.