Interview with CICC Deputy Executive Director Asec. Mary Rose Magsaysay on the investigation of the deepfake video and audio of PBBM
Category: News

Transcript
00:00 An update on the investigation of the deepfake video and audio of President Marcos Jr.
00:05 Let's find out together with Assistant Secretary Mary Rose Magsaysay,
00:10 Deputy Executive Director of Cybercrime Investigation and Coordinating Center or CICC.
00:17 Asec Magsaysay, good afternoon and welcome to Bagong Pilipinas.
00:22 Good afternoon to all of you. It's my second time here.
00:25 Thank you for coming again.
00:27 It's always an interesting conversation with Asec Magsaysay.
00:30 You've gotten older.
00:31 [Laughs]
00:33 So let's go deep into the discussion.
00:36 Let's talk about President Marcos Jr.'s deepfake audio.
00:40 Do we have an update on the investigation on this?
00:43 Do we have a lead on who spread it and who is responsible for this deepfake audio?
00:51 Well, definitely, when it comes to the investigation of digital evidence, if there is a lead,
01:01 it is CICC who handles it. That's why DICT also coordinates with us.
01:06 So the resources of the two agencies, of course the mother agency DICT and CICC,
01:11 are now working towards validating where this came from, who made it, and what platforms were used.
01:18 So there's a lot of validation that goes through it.
01:21 So even through the weekend, we were heavy on investigating it.
01:25 And it's nice to be able to tell the people that as government, we are in the lead.
01:31 And at the end of the day, we are the ones conducting the investigation.
01:34 So I can tell you right now, there is really good development on who is behind it,
01:42 and obviously, it's a human being. You don't run after...
01:45 [Laughs]
01:46 He's not AI.
01:47 So you don't run after a whole country. You have to really find out who made it, what platform was used.
01:56 So it's not China?
01:59 It's not a country.
02:02 It's not a country.
02:03 It's not a country, definitely.
02:06 And I don't think...
02:08 Can we confirm if it's not China or not?
02:12 Because my next question has something to do with the Balikatan exercises.
02:17 Do you think that this fake content of President Marcos Jr. is related to the Balikatan exercises 2024?
02:26 Okay, from the get-go, no. Definitely not.
02:30 Because when it comes to that, it's an exercise we've been doing for so many years, right?
02:39 As for the creation of this one, it's just a coincidence that they're here; it's not because of the Balikatan.
02:48 It's not connected to that?
02:49 We're still trying to validate it, but it's probably not.
02:54 You know, we're part of the NSC, so our minds work that way.
02:58 Because this could have been released before the Balikatan or a month before the Balikatan.
03:04 But if you were the one planning this, you'd think about the best time to release the deepfake video.
03:12 That would be when the American troops are here or the French troops are here.
03:17 There are so many military resources in our country.
03:23 So maybe your actions could be looked at.
03:26 Why just the timing?
03:28 Well, coincidence.
03:29 Anybody who would do something like that, I see it as a David-and-Goliath setup.
03:34 Well, for me, as a cyber technologist, it's really always, you know, everybody in the cyber world, the bad actors,
03:44 they like to hit governments because it makes them famous.
03:47 That's what I see there.
03:48 So for them it would also be a good time to release it, so that whatever it is, it can get some limelight.
03:58 Yes, so that it can also be seen by the would-be hackers.
04:03 Yes, the KSP, those hungry for attention.
04:05 KSP.
04:06 Because we're used to regular forensics, right?
04:11 Yes.
04:12 Like CSI. We also have that here in the Philippines.
04:15 So there's also digital forensics, right?
04:18 Yes.
04:19 Can you explain to our fellow countrymen, so they know that if they do this, they'll be caught.
04:24 They'll be traced.
04:28 Because, of course, in the production of materials like this, only a few platforms are used.
04:34 And usually, when you go into a platform, of course, you will have to put in your credentials and so on and so forth.
04:40 And we as government do more than just digital forensics:
04:45 we also investigate what is captured by the peripherals, like the telephones or computers.
04:52 So there are many, many things we put into consideration when we assess where these crimes really come from.
05:02 So digital evidence from the phone is different from what is taken from the computer,
05:08 from what comes from the telco, from what comes from a platform.
05:12 So all of this, we have about 10 labs already in the agency.
05:17 That's just one agency.
05:21 You have labs.
05:22 Yes.
05:23 In CICC, that's just 10.
05:25 We already are into R&D, research and development.
05:29 If you remember, during the good times, the Philippines was rising as a tiger.
05:37 We were very heavy on research and development.
05:39 And that lasted only until our beloved President, Apo Lakay Marcos, stepped down.
05:45 And now, CICC is again into research and development.
05:50 I already brought a sample of our seven focus areas, and one of them is adversarial AI,
06:00 along with the other things we are focusing on.
06:04 We are already doing that.
06:06 In other words, we are creating already the solutions ourselves.
06:09 For every peso we spend, and I remind you, the Philippine government doesn't have an endless amount of resources,
06:18 what we try to do, the best way to use it, is 20% for the input,
06:23 and definitely 80% will go to the people:
06:26 how we can help them and teach them to get the best out of technology.
06:32 So that's a secure way of having a good ecosystem, digital ecosystem.
06:37 So that's what we are doing now.
06:39 So that's why I joined, so that people can see that technology is tech for good:
06:44 we will use it for education, though not for using ChatGPT in the classroom.
06:50 That's why I'm happy with our new law.
06:53 And then also, we are using it to solve the crimes, but not only to solve the crimes,
06:58 but also to alleviate the plight of the people.
07:01 We are archipelagic, so we need to use technology to save money.
07:06 We won't need to go out, to dress up, to get a massage, to eat outside.
07:10 So especially now that it's so hot, we are encouraging people to do that.
07:14 But not here in PTV, it's cold here.
07:16 It's cold here, it's so cold.
07:18 In other words, what you're saying is, we will find out who this deepfake maker is.
07:25 Just wait.
07:26 And I will also tell the people: just wait here at the government station,
07:31 because we at CICC and DICT know that
07:34 government communication is always the best source of correct news, not fake news.
07:42 You will be caught soon, whoever you are, you will be caught.
07:45 You might be caught.
07:46 DICT and CICC.
07:47 You might be caught.
07:48 With PNP.
07:49 With the Philippine National Police.
07:51 Alright, so at least we can all sleep better now.
07:54 We know, or we're close.
07:56 We're close.
07:57 In your talk, Asec, we also asked for an update
08:01 on online sexual abuse or exploitation of children.
08:06 This news is really worrying.
08:09 Especially for those of us who are parents, right?
08:12 Yes.
08:13 Where are we now?
08:15 Okay, so we know that laws have been passed, with their IRRs.
08:20 And CICC is already doing its part,
08:24 because of course the public policy pronouncement of the President is:
08:27 all the agencies that can help push this, do it.
08:32 What is good here is that we've already been doing it for the whole past year,
08:36 and as I've shown with AQUITY, we've already gone to 12 regions, okay?
08:43 And we've reached out to train the people on how not to become victims of sexual abuse.
08:51 And we have 100 bilaterals with different countries.
08:54 We are calling them evergreen bilaterals because we're not closing them yet,
08:59 only because there is still no cyber treaty in the world; one is still being created.
09:04 Likewise, we in the Philippines are fairly solid in Asia,
09:07 because we're implementing the Budapest Convention,
09:12 so we have some guidelines and laws on what we're going to do here.
09:16 We're following that.
09:18 In Asia, it's only us in the Philippines, besides Japan.
09:20 So what that means is that our law is strong when it comes to that.
09:24 But the secret is not the law or the IRR, but what you do to bring it closer to the people,
09:31 and that is AQUITY.
09:33 We've gone to the GIDAs, the Geographically Isolated and Disadvantaged Areas,
09:38 11,000 of the 42,046 barangays.
09:41 We've reached out to over 800.
09:44 That's Secretary Ivan who led that.
09:46 The connectivity part comes from DICT because it's expensive.
09:50 And then with that connectivity, what we do in those areas is hold courses.
09:56 You can keep this presentation running so they can see all the places we've been to.
10:01 That's quite a lot.
10:03 This is Project AQUITY.
10:04 Yes. Connect, Capacitate, and Confluence.
10:08 So that's continuous.
10:09 Now, for the endpoints where we're putting connectivity,
10:13 the AFP is the one monitoring that, because they're in charge of critical infrastructure.
10:18 And then the police force, about 200,000 of them,
10:21 and about the same number in the AFP, I think even more,
10:25 we're helping under the NICA and the NSHAC so that our movement is really coordinated,
10:31 so that the people at the endpoints are the first, the number one,
10:36 the ones we send help to immediately.
10:38 So our resources are directed towards them.
10:41 In this particular administration, 90% of the resources go directly to the people, not just to tools.
10:47 I understand what you're doing, Asec, and it's critical that we create resilience
10:53 in the community to fight sexual abuse or exploitation.
10:57 In their dialect, right?
10:58 Yes. But my question, Asec, is more about the platforms.
11:01 Because if the government has a responsibility, the community has a responsibility,
11:05 and the police have a responsibility,
11:07 but the reality is that these social media platforms are very rich.
11:11 [Laughs]
11:13 They should regulate this.
11:16 Aren't they supposed to follow our laws as well?
11:19 Yes.
11:20 How cooperative are they?
11:21 Because when I was with the DILG, they were not that cooperative.
11:24 No, they're not.
11:25 Their income would be reduced.
11:26 Exactly.
11:27 So how is that, Asec?
11:29 Okay.
11:30 First of all, because they have a lot of money, right?
11:33 They can market. It's really great.
11:36 So those groups on sexual abuse and exploitation are on Facebook, on Meta.
11:41 Yes. Actually, Meta is working closely with us.
11:44 And by God's grace, all of the sites that we took down in Candla,
11:50 when we talk about it, it's all motu proprio.
11:52 Right away, at the same time that we call Candla and email,
11:56 they immediately remove all of that.
11:58 But they can't do it on their own.
11:59 Some of them have a flag.
12:00 Yes.
12:01 Do they need to be flagged?
12:03 It's their responsibility to monitor their own site.
12:06 Mandate.
12:07 Definitely, the secretary is doing that.
12:09 And there is ongoing conversation with that.
12:13 But we're not waiting for that.
12:15 Of course, it will take time.
12:16 It's a long process.
12:17 That's why we go out to the community to tell them,
12:20 "We'll find another way."
12:22 So you have to remember that connectivity is a political commodity.
12:27 And to brief you, a political commodity is anything whose movement affects the country.
12:32 For example, food.
12:34 But we can't move that, because market forces move food.
12:37 When it comes to power, we're not like the Arab states, who can dictate that prices be lowered.
12:44 But here, in connectivity, the government can pour in resources and help
12:50 to at least ease people's lives, because we are so reliant on communication,
12:54 given we're fragmented in terms of geography.
12:57 But I think you would agree, Asec, that these social media platforms can use the latest technology, like AI,
13:04 to determine if content involves sexual abuse or exploitation.
13:08 They can already scan your cell phone and find your pictures.
13:12 That's true.
13:14 They could do more.
13:15 They should do more.
13:16 That's nice you mentioned that.
13:17 Because, okay, the platforms have digital forensics on videos.
13:22 Do you know that when a person is a bit dark-skinned, the skin won't be detected?
13:27 Because the basis for being able to take down a site is the amount of skin being shown.
13:32 Just so you know.
13:33 Yes.
13:34 The AI is taught that if there's a lot of skin shown, for example 40%,
13:39 Yes.
13:40 They take it down.
13:41 Yes.
13:42 But if you're dark skinned.
13:43 If you're dark-skinned, it does not apply to you at this point.
13:45 But there will be technology eventually that will help us do that.
13:50 So, we have to help each other.
13:52 We've told them already.
13:54 We want them to do it.
13:55 They want to do it too, because otherwise dark-skinned victims are left unprotected.
13:59 But there are difficulties.
14:02 But I know the Secretary is doing something about it.
14:05 Is that why you said that dark skin is not detected by the AI?
14:11 Why are Filipinas in the top five of those?
14:14 Okay.
14:15 So, we are in the top five for this.
14:17 I know that other agencies are monitoring it, especially because of the law.
14:20 But they say that life is hard,
14:24 and it's easier to earn money if they do this.
14:27 But that's why we went down to the ground, because they didn't really understand the impact.
14:33 That's why with AQUITY, we're doubling down, tripling down on it.
14:39 Because we also get inputs from the communities.
14:42 We can tell the right agency: this is what's lacking.
14:45 There's not enough food, there's no water, there are not enough doctors.
14:48 So, it works both ways.
14:50 Because the platform being used for AQUITY was made by CICC, not by a vendor.
14:56 There's no middleman here.
14:58 It's only us in the government.
15:00 Okay.
15:01 So, what are the developments in our cybercrime investigations?
15:06 And how can we make the solving of cybercrime-related cases faster?
15:12 Okay. When it comes to cybercrime-related cases, what you need to know is that everything is hybrid.
15:17 When I say that all cybercrimes are hybrid, it means it's never just fintech crime or phishing alone.
15:25 Actually, it's a series of things.
15:27 They phish you first, then they social-engineer you, they play with you, and then they're already inside.
15:33 So that's why it is always hybrid.
15:35 People say, "We're in a mess."
15:36 Precisely.
15:37 That's what cybercriminals want.
15:39 That's why you should be diligent and make sure to read it first.
15:43 Think about it.
15:45 Please read the platforms that you're using.
15:48 Was it worth it for you to put anything there?
15:53 Anything you put in the net is forever.
15:57 So, just look for the relationship of forever.
16:00 Yes. Not the internet.
16:02 Not the internet.
16:03 Okay. Alright.
16:04 A message or just a reminder to our fellow countrymen?
16:08 So, fellow countrymen, I'm asking you to reach out to us.
16:16 We have a call center that was created by CICC.
16:20 The platform was built internally, not by a vendor.
16:23 The ones answering are from the government, not a BPO.
16:26 Don't be afraid to call us.
16:28 Call us at 1-326.
16:32 That's our number.
16:33 1-326.
16:35 Yes.
16:36 Thank you very much for your time, Assistant Secretary Mary Rose Magsaysay of CICC.
16:42 Thank you.