Hany Farid, an academic who has spent the past 25 years developing techniques to detect photo manipulation, breaks down the Kate Middleton photo and living in the age of AI.

Transcript
00:00 Princess Kate is right that many amateur photographers do bring their photos into
00:04 Photoshop and edit them. And I think probably what she did is one of these best-take
00:09 montages, where she took a series of photos and she wanted to get the smiling face just right
00:14 for all the kids and herself, and did a little montaging, and that just
00:19 invariably leaves a few artifacts. And so I was pretty confident at that point that this was
00:25 fairly benign. I understood why the AP pulled it because they have very particular standards.
00:30 But I don't think it was nefarious. I don't think it's a sign that she's dead. I don't think aliens
00:35 have taken over her body and all the crazy nonsense that the internet is coming up with.
00:40 I think this is a classic story of photo editing that was a little sloppy. You know, at the end
00:45 of the day, this is a pretty simple story. But what's interesting about it is even though this
00:50 is a story about Photoshop and fairly benign photo editing, there is the shadow of AI.
00:56 Like if we weren't living in the generative AI age, I don't think you'd be
01:01 hearing much of what we are hearing. Yeah, I do think that the reason why these conspiracies can
01:07 take hold is because we do live at a time where somebody could have fully fabricated that image,
01:12 or that it was a body double and they replaced her face there. And so this is still an AI story
01:18 to some degree. But I think it really is quite innocuous. We are seeing two things. One is people
01:26 are creating fake images, fake audio, fake video of Trump, of Biden, of Harris, and trying to use
01:31 that to damage them. But they're also doing something else, which is they're using the
01:36 specter of deep fakes to try to dismiss reality. Right? Because if you live in this world where
01:41 anything could be fake, well, the video of Trump stumbling on his words and calling his wife
01:46 Mercedes, he can just say it's fake. So what's also very dangerous is that you can deny
01:53 reality when things can be fake. And I think you're seeing both of those around the election.
01:58 And of course, we're seeing the awful content around non-consensual sexual imagery. We saw it
02:02 with Taylor Swift, but high school girls every day are now being victims of this. Journalists,
02:08 human rights activists, politicians, people who attract unwanted attention are being victims of
02:14 being inserted into explicit material. People are using it to create child sexual abuse material,
02:18 people are using it to commit small- to large-scale frauds, getting phone calls from
02:23 what seems like your loved ones or talking to your CEO, but it's not, it's actually somebody
02:27 who's faking them. So you're seeing really significant weaponization at the individual
02:33 level at the societal level and at the democracy level. And I think it's probably a sign of things
02:38 to come. Individuals have to stop pretending that they can forensically analyze content and be
02:43 reliable at it. You can't. This is a really hard job. I do this for a living. It's hard. So what
02:49 you have to realize as a consumer is that you are vulnerable. And you are not going to be able to
02:54 forensically analyze your way out of this. I think we just need to be more thoughtful about
02:59 how we consume information and understand that the journalists have a serious job. And by the way,
03:04 when they get it wrong, there are consequences. And you can't say that about the knuckleheads
03:07 on Twitter. There's no consequence for them. The first approach is what we call the reactive approach.
03:12 I wait for the reporter from the Telegraph or BBC to contact me. They send me the photograph. I run
03:16 a battery of tests, try to figure out what's going on. I let them know. They set the record
03:20 straight. Meanwhile, you know, an entire news cycle has gone by. Right. So it's a relatively slow
03:27 process. We measure that in hours or days. But at the speed of the Internet, you know, that doesn't
03:33 really work. So the proactive technique is in the camera itself. So Princess Kate's
03:41 camera, when she took that photo, would have said, Ah, this photo was taken on this date and time at
03:46 this location. This is a credential that authenticates that. And then as she put that
03:52 into Photoshop, Photoshop would say, Okay, this is what I did to the photo, bing, bing, bing, bing, bing.
03:56 And then when that goes to the AP, the AP can look at that content credential, a manifest, we call it,
04:02 and say, ah, okay, this is what happened. This is consistent with our photo editing standards,
04:08 or it is not; we will accept it, or we will not. And none of this would have happened, right,
04:13 if that existed. And the nice thing about the proactive techniques is you're there at the point
04:17 of creation. Yeah. And as long as those credentials are there, you get to make decisions. Now the
04:23 drawback is, we need those content credentials in, what, 10 billion devices around the world.
04:29 And we need the infrastructure. And so that's going to take a little bit of time. And I think
04:32 at the end of the day, it's going to be a combination of those two solutions. Right?
04:36 The credentials will work when people, like I think in this case, aren't trying to fool
04:41 anybody. If that Canon camera and that version of Photoshop that Kate had were compliant when
04:45 she shot that photo, then it would have worked perfectly. A dedicated bad actor,
04:50 yeah, they'll find a way around it. And then the reactive techniques that we talked about,
04:54 that's where I come in. So a combination of those two is sort of what you need.
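
To make the two approaches described above a bit more concrete: the reactive path relies on forensic tests run after the fact. Below is a minimal sketch of one well-known test of that kind, error level analysis, which re-saves a JPEG and looks for regions that recompress differently from the rest of the image. It is only an illustration of the sort of check a "battery of tests" might include, not the specific tools Farid uses, and the file names are hypothetical.

```python
# A minimal sketch of one reactive forensic test: error level analysis (ELA).
# Regions pasted in or re-edited after a photo's last save often recompress
# differently, so they can stand out in the difference image.
import io

from PIL import Image, ImageChops


def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
    """Re-save the image as JPEG and return the amplified per-pixel difference."""
    original = Image.open(path).convert("RGB")

    # Recompress at a fixed quality and reload the result.
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer)

    # Per-pixel absolute difference between the original and the re-save.
    diff = ImageChops.difference(original, resaved)

    # Scale the difference so subtle variations become visible.
    extrema = diff.getextrema()
    max_channel = max(channel_max for _, channel_max in extrema) or 1
    return diff.point(lambda value: min(255, value * 255 // max_channel))


if __name__ == "__main__":
    ela = error_level_analysis("family_photo.jpg")  # hypothetical input file
    ela.save("family_photo_ela.png")                # inspect for inconsistent regions
```

On a composited image, the pasted or cloned regions typically show a noticeably different error level than their surroundings, which is one of the artifact patterns a montage like the one described above can leave behind.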
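
The proactive path is the content-credentials workflow Farid walks through: the camera attaches a signed record at the point of creation, each editing tool appends what it did, and the publisher verifies that history before running the image. Real systems such as C2PA use public-key certificates and embed the manifest in the file itself; the sketch below is only a conceptual illustration of that flow, with an HMAC over JSON standing in for a real signature and all names invented for the example.

```python
# A conceptual sketch of a content-credential manifest: capture record,
# appended edit records, and publisher-side verification. Not the C2PA API.
import hashlib
import hmac
import json

SIGNING_KEY = b"device-or-tool-secret"  # stand-in for a real signing certificate


def _sign(record: dict) -> str:
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()


def capture_manifest(image_bytes: bytes, when: str, where: str) -> dict:
    """What the camera would attach at the point of creation."""
    record = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "captured_at": when,
        "captured_where": where,
        "edits": [],
    }
    return {"record": record, "signature": _sign(record)}


def append_edit(manifest: dict, edited_bytes: bytes, description: str) -> dict:
    """What an editing tool would add: the edit and the resulting image hash."""
    record = dict(manifest["record"])
    record["edits"] = record["edits"] + [{
        "description": description,
        "result_sha256": hashlib.sha256(edited_bytes).hexdigest(),
    }]
    return {"record": record, "signature": _sign(record)}


def verify(manifest: dict, image_bytes: bytes, allowed_edits: set[str]) -> bool:
    """What a publisher would check before accepting the photo."""
    record = manifest["record"]
    if not hmac.compare_digest(manifest["signature"], _sign(record)):
        return False  # the manifest itself was tampered with
    edits = record["edits"]
    expected = edits[-1]["result_sha256"] if edits else record["image_sha256"]
    if hashlib.sha256(image_bytes).hexdigest() != expected:
        return False  # the image does not match the recorded history
    return all(edit["description"] in allowed_edits for edit in edits)
```

A wire service could then compare the listed edits against its own editing standards and accept or reject the file automatically, which is the judgment the AP had to make by eye in this case.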
