4/1/2025
During a House Energy Committee hearing last week on examining harms online, Rep. Diana Harshbarger (R-TN) questioned Yiota Souras, the chief legal officer at the National Center for Missing and Exploited Children, about online security practices to protect children.



Transcript
00:00I thank the gentlelady. The gentlelady yields back. Now I recognize Ms. Harshbarger
00:05from the great state of Tennessee for her five minutes of questioning.
00:10Thank you, Mr. Chairman, and I want to thank the witnesses for being here today. I want to
00:15thank you for speaking up for our kids. You know, if the examples against
00:20innocent children that you provided in your testimony do not give us members a
00:26righteous anger about this issue, then we better, we just better check our very
00:30soul. And as a parent, and I know there's parents in the audience who have gone
00:36through this, and I am so sorry for you because it could happen to anybody. Don't
00:41ever think that you are exempt from this happening. Young people, you listen. If you
00:47read the testimony of what happened to these children, you probably wouldn't
00:51let them online or you probably wouldn't get online. You know, as a parent, you try
00:55to make sure your child's safe and kept away from harm, but it's very clear as
01:01technology continues to accelerate just how difficult it is to protect your kids
01:08from these predators. And if you, listen, if they're sick, depraved, possessed, and
01:15dangerous criminals are what they are. And if you really, really want to know
01:19what I think they should do with these once law enforcement catches one, then
01:23you see me after this hearing is over, and I'll be happy to tell you what I
01:28think. I want to start with Ms. Souras. Your very powerful testimony gave
01:34horrifying accounts of children being preyed upon by adult sex offenders. With
01:40the data already available, how can platforms better identify users who are
01:44posing as children? Thank you for the question. In every other industry, there
01:50is a concept of know your customer. It's what provides safety and compliance, and
01:55that simply does not exist online. It does not exist for social media
01:58companies. So, you know, that is certainly first and foremost. They need to
02:03know who is getting online, how they are registering, if they are representing
02:06themselves truthfully, whether they're an adult, whether they're a teenager. We've
02:10seen in some of these sextortion cases that individuals are creating profiles,
02:14you know, grown men, adult men are creating profiles indicating that they
02:17are a 15- or 16-year-old girl. They're allowed to post photos and create a whole
02:21persona online that to a child, especially a child who's very trusting,
02:26who's only been online for a few years, seems very authentic. So, you know, I
02:31would definitely say they need to know their customer. There needs to be
02:34transparency reporting around how social media platforms are accomplishing that
02:39task, and there also needs to be age assurance. I think all of us have touched
02:43on that issue a little bit, that there's still a lot to be determined around how
02:47best to effectuate that, but we need to create spaces for children online that
02:52are safe. You know, we allow children to get online, have private conversations
02:5724 hours a day with whoever is online, exchange content, videos, and images. We
03:03would never allow that in real life. So, we need to, we need to understand the
03:08environment and we need to work together to accomplish that. Do you think this
03:12information could prove an asset for parents monitoring their child's online
03:17activity to ensure they're interacting with their peers and not some sick,
03:21perverted monster who deserves to be locked up until Judgment Day? Absolutely.
03:28I think all of us, you know, agree that parents play a role here, but tech
03:32companies need to meet them, I would say, more than halfway. They need to provide
03:35tools that are comprehensible, that are easy, that are accurate, and they need to
03:39provide transparency so parents actually understand how the algorithms work and
03:43how the apps work. You know, it's sad to say, though, that, you know, I interacted
03:49with middle schoolers and taught Sunday school for 24 years, and I used to say it
03:52was a fatherless society, but it's a parentless society in a lot of ways, and
03:57some of those children don't always have a parent who will protect them. I
04:01understand that the National Center for Missing and Exploited Children and law
04:06enforcement are interested in prioritizing quality of reports over
04:10quantity, and that one thing that's helped with that is bundling of reports.
04:15Could you tell us a little bit why that's so important, and how could we
04:18encourage more companies to do that? Absolutely. So the process of receiving
04:23CyberTipline reports at NCMEC, really, if I can just give a brief
04:28explanation, we operate the online reporting mechanism for reports
04:33regarding child sexual exploitation. So we are flooded with tens of millions of
04:37reports from social media companies, and then we provide those reports to law
04:41enforcement around the world, but, of course, domestically, federal, and state,
04:44local law enforcement. There has been, really, I would say an issue in recent
04:50years, as companies are increasing the volume of reports, that reports that are
04:54really related regarding the same image, regarding the same incident, regarding
04:58the same time frame, have been reported one by one as separate reports. That
05:04creates a tremendous burden for law enforcement. So we were very gratified
05:08this year that Meta, which is one of the larger reporters, it is certainly one of
05:13the larger social media companies, agreed to start bundling incidents that are
05:17related. It reduces the work burden and the time that law
05:22enforcement has to consume on those reports. Okay, very good, and I think I'm
05:26over my time, so I yield back.
