In the House tri-committee (tri-comm) hearing on Tuesday, Feb. 4, cybersecurity analyst Marco Reyes of the Department of Information and Communications Technology (DICT)'s Cybercrime Investigation and Coordinating Center (CICC) said that the agency now has a "tool" that can help law enforcement agencies detect deepfake videos.


READ: https://mb.com.ph/2025/2/4/dict-tells-house-panel-we-have-tool-that-can-detect-deepfake-videos


Subscribe to the Manila Bulletin Online channel! - https://www.youtube.com/TheManilaBulletin

Visit our website at http://mb.com.ph
Facebook: https://www.facebook.com/manilabulletin
Twitter: https://www.twitter.com/manila_bulletin
Instagram: https://instagram.com/manilabulletin
Tiktok: https://www.tiktok.com/@manilabulletin

#ManilaBulletinOnline
#ManilaBulletin
#LatestNews
Transcript
00:00I'm Marco, an agent of the CICC, and I'll be presenting one of the tools that can detect
00:23deepfakes.
00:25So with this tool, if you run it on your computer, anything that you view or watch
00:37on the computer, say for example a YouTube video, a Facebook video, a video from X, the
00:45tool will be able to check it for deepfakes.
00:48So I will be quickly doing a demo of this tool.
00:58So one of our examples here is a deepfake video of President Zelensky.
01:07So this is publicly available on YouTube.
01:11So this tool, this app of ours, is one you can install on your computer.
01:18It acts somewhat like an antivirus: when you activate it, it practically runs as
01:24a background process.
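
What Reyes describes here, a resident process that watches whatever is rendered on screen and flags suspect frames, could look roughly like the minimal Python sketch below. This is an illustration only: the screen-capture library (mss) is real, but deepfake_score() is a hypothetical placeholder, since the licensed Deep Fake Agent model is not public.

import time

import numpy as np
from mss import mss  # real cross-platform screen-capture library


def deepfake_score(frame: np.ndarray) -> float:
    """Hypothetical stand-in for the licensed detection model (assumption)."""
    return 0.0  # a real model would return a probability in [0, 1]


def monitor(interval_s: float = 2.0, threshold: float = 0.8) -> None:
    """Run in the background: grab the screen periodically, warn on suspect frames."""
    with mss() as screen:
        while True:
            shot = screen.grab(screen.monitors[1])   # primary monitor
            frame = np.asarray(shot)[:, :, :3]       # BGRA -> BGR, drop alpha
            if deepfake_score(frame) >= threshold:
                # stand-in for the red lower-right notification seen in the demo
                print("WARNING: possible deepfake on screen")
            time.sleep(interval_s)


if __name__ == "__main__":
    monitor()
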
01:27So if I run it now, as you can see here in the lower right, and play
01:38the video, it will take some time to analyze the video.
01:50And you can see on the lower right that you have a notification there.
01:57Zoom in.
01:58Yeah, on the lower right, please.
02:01The color red at the bottom, that's what indicates whether it's fake or not.
02:11On the lower right, is that an application that you have to put on your computer or on
02:19your cell phone?
02:21For now, you have to download it on your computer, Mr. Chair.
02:25So what app are you going to download?
02:27It's a tool called Deep Fake Agent.
02:30That is...
02:31Okay, sir.
02:32Mr. Chair, this is a tool that we have recently acquired, licensed software which we intend
02:44to disseminate to the different watchdogs, to the civil society members who have a large
02:55following, so that they can easily report.
03:01How about the members of this committee, because we don't have the capacity to find
03:06out.
03:07Well, if you go out on the streets, sometimes you get confused yourself.
03:11What more for others?
03:14So will we also be given that kind of thing?
03:17Well, Mr. Chair, the more, the merrier, because we will have community-based fact-checkers.
03:24It's no longer controlled.
03:26These are independent fact-checkers who will only report to us.
03:30How will you do that?
03:32By registration, by...
03:34We have a policy right now that's being developed together with the other members of the Council
03:40of CICC as a guideline for reporting.
03:44You see, we have very limited manpower and very limited...
03:48If you have limited manpower, you cannot patrol the whole cyberspace.
03:52Okay.
03:53So, can we continue so that we can see how it works?
03:56Okay, CICC?
03:57What's your name?
03:58Marco, Mr. Chair.
03:59Marco, please.
04:00Okay.
04:01So, the process has finished running.
04:02So, basically, the tool will take snapshots of the deepfake video.
04:03So, as you can see, if I maximize it here, those are the samples, and it practically
04:20traces the face where it detects the anomaly.
04:26So, it's practically automated now.
04:30So, for anything you view on your screen while this tool is running, it will
04:38notify you immediately if there is a deepfake.
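
The snapshot-and-trace step Reyes describes, sampling frames, locating faces, and scoring each face crop, could be sketched as follows. OpenCV's bundled Haar cascade face detector is real; score_face() is a hypothetical stand-in for the licensed per-face anomaly model, and the red-marking/notification logic is reduced to a boolean flag.

import cv2  # OpenCV; pip install opencv-python


def score_face(face_bgr) -> float:
    """Hypothetical stand-in for the licensed per-face anomaly model (assumption)."""
    return 0.0


def scan_video(path: str, every_n: int = 30, threshold: float = 0.8) -> bool:
    """Snapshot every Nth frame, locate faces, flag any face scoring above threshold."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture(path)
    flagged = False
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n == 0:  # take a periodic snapshot
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
                if score_face(frame[y:y + h, x:x + w]) >= threshold:
                    flagged = True  # the demo traces such faces in red
        index += 1
    cap.release()
    return flagged

Used on a local copy of a clip (hypothetical filename): scan_video("zelensky_clip.mp4") would return True if any sampled face crossed the threshold.
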
04:42So, if that tool is with you, for example, on your laptop or cell phone, before you watch
04:47any content or video, you turn on the tool first.
04:56Correct, Mr. Chair.
04:57And then, the color red that we saw earlier will be displayed.
05:02Yes, Mr. Chair.
05:03All right.
05:04Okay.
05:05So, if you have any questions, because this is a good issue, we will be able to capacitate
05:11our citizens to distinguish real from fake, so that the government or anyone cannot just defend
05:18themselves by saying, hey, that's a deepfake, right?
05:21All right.
05:22Do you have more samples?
05:24We have one more example, Mr. Chair.
05:27Go ahead.
05:28This is Tom Cruise, right?
05:30Cruise.
05:31Yes, Tom Cruise.
05:32Go ahead.
05:33So, what I will do again is to activate the tool and then play the video.
05:52So, it will take some time for the detection to run.
06:01All right.
[06:02–08:54: the tool runs its analysis on the second video; no further substantive dialogue in the recording.]
