Sen. Ted Cruz (R-TX) and other senators promote the "Take It Down Act," a bill to combat non-consensual intimate imagery (NCII).
Transcript
00:00Thank you so much for being here today, and I hope you have a great rest of your day.
00:30So, welcome everyone. As many of you all know, we have seen a staggering increase in exploitative
00:48sexual material online, largely due to predators taking advantage of new technologies like
00:56Generative Artificial Intelligence, or AI. Now, almost instantaneously, perpetrators
01:04can use an app on their phone to create fake, explicit images depicting real people, commonly
01:13referred to as deepfakes. Disturbingly, this increasingly affects and targets
01:22minors, and particularly young girls. Up to 95% of all internet deepfakes depict sexual
01:33imagery, with the vast majority of it targeting women and girls. It feels like almost every
01:40week now we're seeing a new report of a middle school or high school girl falling victim
01:48to this despicable activity. Young girls are waking up to a text message from their friends,
01:55saying there's an explicit photo of them circulating on social media or an explicit video, and
02:02yet the photos aren't real. They were created using AI, typically taking the face of a real
02:10person and using AI technology to graft it seamlessly and imperceptibly onto a different,
02:18sexually explicit image. Imagine being 14 years old and showing up at school
02:27where your classmates believe they have seen an explicit photo of you that is a fake and a fraud.
02:33The photos may not be real, but the pain and the humiliation are. For many young teens it changes
02:42their lives forever. Even worse, these photos can prove very difficult to remove from the
02:49internet. They can spread quickly from one phone to another to another to another, causing many
02:55women and girls to have to live with being victimized over and over again. Today we are honored to be
03:03joined by Elliston Berry from my home state of Texas. Elliston is brave enough to be here today
03:10sharing her personal story. She's joined by her mother Anna. Also joining us is Dorota Mani from
03:22New Jersey, whose daughter Francesca has also been a victim of sexualized deepfake images. Dorota and her
03:32daughter have now made it their mission to spread awareness about this issue and to help prevent
03:38other young girls from having to go through what Francesca and Elliston went through. To all three of you, thank you for being here today.
03:49Elliston, thank you for being here. This is a scary thing for an adult to face, but it's particularly scary for a teenager. Thank you for your courage.
04:03I'm also joined by my colleague Senator Lummis, who is one of the twelve bipartisan sponsors of the new legislation to help protect victims like
04:13Elliston and Francesca. Our bill, the Take It Down Act, will protect and empower victims of non-consensual intimate imagery, also known as NCII, by
04:29criminalizing its publication, making it unlawful for a person to knowingly put such content on social media or other platforms. This includes both
04:40real and computer-generated sexual images that depict real, identifiable people. There are a few bills in the Senate right now that are working to tackle this
04:52issue, and I want to be clear. Our legislation is designed to be complementary to those efforts. I'm supportive of the other efforts, and this is designed to fill
05:03gaps that the other bills do not address. As the ranking member of the Senate Commerce Committee, my legislation is the first of its kind to require websites, including
05:16social media sites, to have in place a process to remove such content within 48 hours of a request from a victim. The sites must also remove any copies of those images.
05:35You know, it can be maddening dealing with big tech and trying to get these images, these fake images of your child taken down. And big tech over and over again has demonstrated an arrogance, an
05:49imperiousness, a lack of accountability. We're all aware that this happened to Taylor Swift. And if you happen to be a global pop star, you can get the images pulled down. When Taylor Swift
06:07rightly said these images are disgusting, pull them down, big tech responded. But if you're not fortunate enough to be a global celebrity, if you're just a teenager living in Texas or New Jersey or anywhere else, big tech ignores your pleas for help.
06:29When I met with Anna last week, she mentioned the images of Elliston were still on Snapchat, and that her repeated efforts to call Snapchat to talk to someone were met by bureaucratic resistance: talk to somebody else, talk to somebody else, send an email, a brick wall that got nowhere.
06:50Last week in my office, I turned to my team, I said get on the phone with Snapchat right now. If need be, put me on the phone with their CEO. I want those images taken down. They were taken down within 24 hours.
07:05Now, if you don't happen to be in a situation where a sitting member of Congress intervenes on your behalf, you know what you get right now? You get a closed door and a stone wall. That is not fair and that is not right. It should not take an elected member of Congress intervening to have these despicable lies pulled down from online.
07:29I'm proud to be joined in introducing this legislation by my colleagues, Senators Klobuchar, Lummis, Blumenthal, Capito, Rosen, Budd, Butler, Manchin, Young, Hickenlooper, Cassidy, and Heinrich.
07:52Twelve co-sponsors, bipartisan, across the board, across the ideological spectrum, we all agree this is a common sense step to criminalize the publication of non-consensual intimate images and to provide the statutory right if this garbage is put up about you, if it's put up against you, if it's put up against your child,
08:17you have a right within 48 hours to force big tech to take it down.
08:23We're also grateful for the over three dozen groups, including victims' advocates, law enforcement, medical professionals, and others, who have come out in support of our bill.
08:33We are humbled by the overwhelming support this bill has received just upon introduction.
08:39Here with us today are representatives from three of those organizations.
08:45Yiota Souras, Chief Legal Officer of the National Center for Missing and Exploited Children, NCMEC.
08:52Dawn Hawkins, the CEO of the National Center on Sexual Exploitation.
08:59And Stefan Turkheimer, the Vice President of Public Policy at the Rape, Abuse, and Incest National Network.
09:08We are grateful for your support and your hard work in this field dealing with victims of horrific crimes.
09:15And as experts on sexual exploitation, we look forward to hearing from you today on the importance of passing this legislation.
09:25Most importantly, I'm proud that victims feel that the Take It Down Act will make a positive impact.
09:35Elliston, Anna, Dorota, I know that after all your families have been through, you have made it your mission to ensure that no other young girl has to go through this nightmare.
09:49You are protecting teenage girls across this country.
09:56You're protecting moms and dads from going through the agony of not being able to protect your child.
10:03And at a time when there's lots of partisan division in Washington, it is encouraging to see that we can come together and find bipartisan agreement.
10:14And with that, I want to introduce Anna McAdams initially.
10:20Thank you very much for allowing us to be here.
10:25On October 2nd, 2023, we awoke to news that devastated our lives forever.
10:34The previous day, a fellow classmate decided a fate for my daughter and her nine friends that they never would have imagined.
10:44He decided to take beautiful, innocent pictures off of social media platforms and use an AI app called DNGG to impose new body parts onto them.
10:53On Snapchat, he made multiple accounts.
10:56He friend requested people and blasted them with the fake nudes of the girls.
11:00It's important to draw attention to the fact that he didn't take someone else's nude images and put their faces on them.
11:07Instead, he took their actual bodies using AI to make the images look even more real.
11:13These images are child pornography.
11:16These girls were 14 years old.
11:19Their innocence was taken that day.
11:22In just 24 hours, everyone in the social group had seen these pictures.
11:26The girls' phones were ablaze with warnings.
11:29That week would become a nightmare, as the girls' classmate had taken the innocent pictures and sent them out to the entire Aledo High School.
11:36Some of their friends would ask him why he was doing it, and his response was,
11:41I want to go out with a bang. I want to ruin the girls.
11:44In today's world, those words are some of the scariest.
11:47We live with school shootings as regular occurrences in our country.
11:52You could only imagine the fear that those words caused the girls.
11:56What if he came to school with an intent to hurt them, or to hurt himself? The girls didn't want to go to school.
12:01For the few days that followed, the girls stayed home, but on that Friday they went back to school, and the school accidentally went into lockdown.
12:10It seemed to be, to the girls, someone was after them.
12:13We're thankful that it was nothing more than a mistake, and during the lockdown, the boy decided to go into his accounts and continue to terrorize.
12:21He would send out other pictures.
12:23This allowed the school's tech team to catch him.
12:25It is so shocking to be violated in this way.
12:28First, having nudes sent out that you didn't consent to, and second, fearing for your life, that someone might take it.
12:34What he did was malicious.
12:36He chose his victims, and he reveled in the torturing of them.
12:39If we do not have stricter laws and punishments for this type of crime, it will only increase.
12:45We are thankful that all the girls have family, friends, and counselors who support them.
12:50They will come through it. If the girls had not had the support,
12:54it could have turned into the worst circumstance imaginable, like a mental breakdown or even suicide.
13:01The perpetrator was charged with sale, distribution, or display of harmful material to a minor.
13:06But instead of having our day in court, Parker County deemed him not a harm and gave him only probation.
13:12When he turns 18, his record will be expunged.
13:14He will walk away unscathed.
13:16However, our girls will forever live in fear that when they apply for a job or college, these pictures might resurface.
13:22He chose a fate for our daughters that we couldn't have imagined.
13:26There needs to be consequences for what he did.
13:32This is why the Take It Down Act is so crucial.
13:34This bill would hold even minors accountable, with jail time, for this type of crime,
13:38and it would require Snapchat and other social media apps to take the images down within 48 hours.
13:44As of last week, as you heard from Senator Cruz, we had not gotten anywhere with Snapchat.
13:50It took eight and a half months for us to get those images off there.
13:53We don't know how many times they were shared and passed along.
13:57Like he said, if we had been Taylor Swift, it would have come down immediately.
14:02This bill will give us a voice that we did not have before.
14:05So thank you.
14:07Thank you, Anna.
14:09Thank you, Anna.
14:11Now I'd like to bring forward Elliston Berry to tell her story.
14:15My name is Elliston, and you just heard what happened to me last fall.
14:22As a 14-year-old who had just entered high school, this experience threw off my freshman year
14:26and created a fearful environment for me and my friends.
14:30I've always looked forward to going to high school as it is such a huge part of life.
14:34By being involved in volleyball and hanging with friends, I was able to enjoy school.
14:38But once these photos were released, I dreaded school and was anxious to even step foot on campus.
14:43Although the student left school, which allowed me to feel more confident about attending classes,
14:48there will always be a fear that these photos will resurface.
14:51I wouldn't wish this experience upon anyone and hope to create something
14:54that will prevent any student from having to go through this.
15:01Thank you, Elliston.
15:03Anna and Elliston are both Texans.
15:05I now want to bring forward Dorota Mani, who is from New Jersey,
15:09whose daughter experienced almost exactly the same thing.
15:13And it is unfair and it's wrong.
15:15Dorota?
15:16Thank you.
15:19Elliston, congratulations.
15:21Thank you for sharing.
15:22You empowered a lot of girls in this country.
15:28Francesca, unfortunately, couldn't be here.
15:30She's in her last exam.
15:32On October 20th, an incident at Westfield High School marked a critical turning point
15:38for my daughter Francesca, many of her female sophomore classmates, and their families.
15:44The girls were subjected to the distribution of inappropriate AI nude images, called deepfakes,
15:51by fellow male students.
15:53Our initial reaction of shock and disbelief evolved into anger and now determination,
15:59which fuels our commitment to advocate for a safer environment for women and children in this country.
16:05Together, Francesca and I are tirelessly working to promote the enactment of stronger AI laws,
16:12comprehensive state and federal guidelines, and nationwide effective school AI policies.
16:20The incident at Westfield High School should never have happened.
16:24And sadly, it is not unique.
16:27Without collective efforts to educate, prevent, and legislate, the unacceptable misuse will persist.
16:35And let's not forget that 96% or even more of all AI deepfake victims are women and children.
16:43As of October 20th, only four states had specific laws targeting the misuse of AI deepfake imagery.
16:51Since then, thanks to collaborative efforts with lawmakers like Congressman Kean,
16:55Senator Cruz, Congresswoman Sherrill, and Congressman Morelle,
16:59three fundamental federal laws have been introduced.
17:02Recently, we have witnessed Congress consider the amendment of Section 230,
17:07and now 14 states have enacted laws to curb the misuse of non-consensual deepfakes,
17:12with an additional 10 states limiting such imagery in political campaigns.
17:17This rapid and bipartisan response underscores a strong governmental commitment to much-needed reforms.
17:23The Take It Down Act, introduced today by Senator Cruz and Senator Klobuchar, is particularly significant.
17:30Among many important features of this bill, it requires digital platforms to remove reported AI deepfake images within 48 hours.
17:42A critical step in empowering women and children to regain control over the obvious abuse of their image without their consent.
17:52I am very grateful to see senators on both sides of the aisle stepping up and addressing this issue.
17:58And I know that once this bill is signed, it will make a difference for girls like Francesca and Ellie.
18:04So I am advocating to shed light on this important legislation and to seek support in shifting the narrative toward empowering women and children,
18:14which we can't do without proper legislation.
18:18It's time to stop ignoring this issue or dismissing it as someone else's problem.
18:24As you can see, this can happen to any of us, by anyone.
18:31We must face it together and move on.
18:34Thank you.
18:38Thank you, Dorota.
18:39Thank you for speaking up for Francesca.
18:41And tell her good luck in her final today.
18:46I now want to recognize Senator Lummis.
18:51Thank you, Senator Cruz.
18:53And thanks to all of you who have had to deal with this issue firsthand.
18:58I remember what it was like to be in junior high and high school and just being uncomfortable in my own skin.
19:08I think that's very normal for girls that age.
19:11And I would ask every person in this room, every woman in this room, to think back to what it's like to be in junior high and high school,
19:22and that insecurity, that discomfort that we all have in our own skin.
19:27And then ask you to consider what these girls have been through.
19:33This is a horrific experience for these girls and their families.
19:41It can leave lasting scars.
19:45We must address this.
19:49This cannot be allowed to stand.
19:54I support the Cruz-Klobuchar bill wholeheartedly.
19:59There are other bills, as Senator Cruz pointed out.
20:05And as Senator Cruz pointed out, he supports these other efforts as well.
20:13Now, I did put a hold on one of those bills so we can work out changes to the definitions that make it as tight and targeted as the Cruz-Klobuchar bill.
20:26And I will continue to work with those teams in order to make whatever bill advances as clean, as targeted, and as timely as possible.
20:39Because this issue cannot wait.
20:42This issue must be addressed now.
20:46Thank you, Senator Cruz, and thank you to the victims for bringing this issue forward.
20:53It takes courage, and you have that undaunted courage that we all admire.
21:00Thank you.
21:02Thank you, Cynthia, and I would echo that.
21:06Elliston, your standing here today is going to help protect another teenage girl in Texas or New Jersey or somewhere else, so she hopefully won't have to go through the nightmare you did.
21:18So thank you for doing that.
21:20I now want to recognize Yiota Souras, the chief legal officer of the National Center for Missing and Exploited Children, NCMEC.
21:29Thank you, Senator Cruz.
21:31The National Center for Missing and Exploited Children, or NCMEC, is proud to support the Take It Down Act.
21:38In the last year alone, NCMEC has received thousands of reports to our CyberTipline relating to deepfake content depicting nude and sexualized imagery of children.
21:51And we believe this is just the beginning of a flood of deepfake imagery exploiting children that will be circulated online.
21:59At NCMEC, we're seeing deepfake and nudify technology being used to generate new sexually exploitative content depicting children,
22:08to manipulate existing sexually explicit images to create new content,
22:14and to alter innocent images of children from social media to depict them in nude or sexually exploitative images.
22:22The Take It Down Act will ensure criminal penalties exist for circulating or threatening to circulate deepfake, nude, or exploitative images depicting children.
22:35This legislation is crucially needed to close gaps in existing laws relating specifically to deepfake, nude, or sexualized content that may not meet the definition of child pornography.
22:48The notice and removal provisions of the bill especially provide an essential remedy for victims to seek redress specifically from online platforms that are hosting intimate images in which adults and children are depicted.
23:03NCMEC is deeply appreciative of the leadership of Senators Klobuchar, Cruz, Lummis, and the many other co-sponsors in the Senate in introducing the Take It Down Act,
23:15and we applaud their dedication to ensuring the safety of children online.
23:19Thank you very much.
23:25I would now like to recognize Dawn Hawkins, who is the CEO of the National Center on Sexual Exploitation.
23:33Dawn.
23:34The Take It Down Act is not just a legislative proposal, it's a lifeline.
23:42One of the most critical components of this bill is the requirement that tech companies listen to survivors.
23:48Our law center represents many survivors of image-based sexual abuse, including deepfake pornography.
23:55In one of our cases, a woman was sex trafficked and her trafficker is in prison.
24:01We sent a takedown request to Google to remove the video and images in Google Images and Videos, and Google responded saying they didn't see signs of coercion.
24:13And this is the reality that so many survivors are facing.
24:17They have to beg and beg and beg, as you heard, for this content to be taken down.
24:22We need tech giants to take responsibility.
24:26App stores hosted by Google and Apple have dozens of AI nudifying apps rated for ages 4 and up.
24:34It took a large public shaming campaign for them to start to remove them.
24:39Or on Microsoft GitHub, you can download tons of different code to generate realistic, forged pornographic images in just minutes.
24:48We have platforms like Pornhub, Reddit, and X, where they let anyone upload sexually explicit content of anyone at any time,
24:57and then they not only drive people to that content through ads, but they also enable countless downloads and reshares.
25:04This is the unacceptable reality that we face today, and it will remain so unless Congress acts now.
25:10It's outrageous that tech companies can instantly remove copyrighted material.
25:14If you upload a clip from Disney or a pop song, it's gone in a heartbeat, risking an immediate ban from the platform.
25:21Yet if you upload a video of a rape, or hide a camera in a locker room, or create a deepfake pornographic image, you can do so without scrutiny.
25:29The lack of laws, of really any laws right now, allows both the platforms and the initial uploaders of this abuse to completely evade accountability.
25:39And we all could be victims.
25:42It's critical that we act now, and we put people over profits.
25:46Thank you.
25:50Thank you, Dawn.
25:51And last, I would like to recognize Stefan Turkheimer, who is the Vice President of Public Policy at the Rape, Abuse, and Incest National Network.
26:03I'm Stefan Turkheimer. I'm the VP of Public Policy for RAINN.
26:07I want to thank Senator Cruz and Senator Klobuchar for bringing this bill forward.
26:10It is incredibly important to the people we represent.
26:13We run the National Sexual Assault Hotline.
26:15 4.5 million people have called that hotline to seek counseling and support after a sexual assault.
26:22And make no mistake that image-based sexual abuse is sexual abuse.
26:28I want to tell a story that's a little different than the ones you've heard, which is a case that went to trial last week and went to the jury on Friday.
26:35It's a story about a Navy captain who dated a woman for a short period of time, but during that time, he was able to get some intimate images from her.
26:43When they broke up, he formed a Facebook account in her name and friended 1,200 of her friends, including all her new coworkers.
26:50And then he slowly sent out those photos over and over, including deepfakes.
26:56There was a federal case about this.
26:58It went on for three years.
27:01And Facebook was asked 400 times to take down the images and delete the account.
27:06And every single time, they said that the fake account looked more real to them than the real account.
27:13That's the reality of what people are dealing with.
27:16When Facebook comes up here and says they're working on the problem, note that all their economic incentives are not to do anything at all.
27:23And that changes when this bill passes.
27:25I want to thank Senator Cruz, Senator Klobuchar for making a world of difference to the survivors.
27:30Every person who calls the hotline is in pain.
27:32And this is a way of ending that pain.
27:34There's nothing, nothing more powerful, not tech companies, not lobbies, not anything more powerful than an idea whose time has come.
27:41Thank you.
27:44Thank you very much.
27:45With that, I'm happy to answer questions.
27:47What have the responses been from tech companies like Snapchat and TikTok?
27:54And also, what ramifications are there if these companies don't end up taking down the images within 48 hours?
28:01Well, at this point, we're introducing the bill today, so the tech companies are going to have to make a decision how to respond to it.
28:07I hope they do the right thing and recognize this is the right solution.
28:13Frankly, it's what they should be doing anyway.
28:15And part of the reason this bill is necessary is that they repeatedly refuse. You just heard from Stefan: Facebook was asked 400 times to take these images down.
28:24We have heard over and over from people, as Anna described, repeatedly telling Snapchat to take this down.
28:32And they just give a stiff arm.
28:35They behave as if they are not accountable.
28:38And the model, the model is actually one that Dawn described.
28:43It's the copyright model.
28:45So if you put up the Lion King, that can get taken down incredibly quickly.
28:52If you put up a video of a George Strait song and he's got a copyright on it, you can pull that down immediately.
28:58And yet, if you put up non-consensual intimate images, the big tech companies act as if they have no idea how to do this.
29:09This is designed to make it mandatory to pull it down.
29:13We used the copyright law as the model for this bill to mandate that they pull it down.
29:24And once passed, and I believe we're going to get this legislation passed, we will see prompt and immediate compliance just like we do on the copyright side.
29:34Are there consequences for the social media companies if they don't comply with the 48 hours? What's the consequence?
29:41There are consequences. Brad, what are the specific consequences?
29:44FTC.
29:45So the FTC has the authority to enforce and bring an enforcement action.
29:50So it subjects them to the enforcement through the FTC.
29:53And the person posting it also has criminal liability of up to two or three years in jail, three years if it's a minor, two years if it's an adult.
30:02Can you explain what is new in this bill as compared to the other bills?
30:06Well, there are two things in particular that are new about this bill that are not encompassed in the other bills that have been introduced.
30:12One is that this bill addresses AI and deep fakes.
30:17The other bills that have been introduced address intimate images that are real images that are placed non-consensually online.
30:26And that is an abuse of privacy.
30:28And this bill covers that.
30:30So if it is a real image that is involuntarily published, this bill penalizes that as well.
30:36But this bill addresses where the challenge is going.
30:41And you heard from Yiota how NCMEC is getting thousands and thousands of reports.
30:49And what's frightening is her assessment, and I think that's all of our assessment, that this is just the beginning.
30:55That we are potentially on the front end of a deluge, because it is too easy for someone to do this to anyone they might target.
31:06And what can be frustrating in many instances is that you don't even know who's targeting you.
31:10It can be done entirely anonymously.
31:13And so this bill, number one, fills the gap of deepfake AI technology, which the other bills do not address.
31:21And then number two, the title of the bill, and it may be the most important piece of it, is the mandate that Big Tech take it down.
31:29None of the other bills have that requirement.
31:31And as long as those images are out there, the victim continues to be harmed over and over and over again until you pull it down so that they can't be shared anymore.
31:41Yeah?
31:42Is there any concern about taking the images down not preserving evidence for going after the person who posted it?
31:49Well, no, it doesn't require that Big Tech destroy the images, simply that they not be published to the world.
31:56So there's still full ability, and indeed there are exceptions within the bill for preserving an image.
32:03For example, to let someone know, hey, this image is up there of you.
32:07There are exceptions where the image is not being exploited.
32:10If someone has an image, you certainly want to know and want someone to let you know.
32:14There's also exceptions for reporting to law enforcement.
32:17So if a mom has an image and wants to give it to the police or the district attorney to say, somebody posted this.
32:24Please go prosecute it.
32:26That is not criminalized because that's a good faith effort to stop the violation.
32:34Yes?
32:35Senator, are you thinking about how to prevent users from actually creating these deep fakes?
32:41Because taking it down seems like the damage has already been done and it's out there.
32:44But how do you think about ways to legislate around preventing it from actually happening?
32:49So the way this bill focuses on prevention is through the criminal penalties.
32:53The criminal penalties of up to two years if it's an image of an adult, up to three years if it's an image of a child.
33:00That, I believe, will be a very significant deterrent.
33:05And will it be perfect?
33:08Will there still be people who commit offenses?
33:11In all likelihood, unfortunately.
33:13But I think the power of real criminal penalties is that it's going to ratchet up the cost for people doing it.
33:21And I think the question you're raising is an important one.
33:25And look, when it comes to big tech platforms, for example, you heard from the other witnesses here
33:33how the platforms put these apps up and make it easy to create these images.
33:38I think there's a lot to be done urging platforms not to make these apps widely available
33:44when their only use is to create this kind of abuse.
33:48And I don't know if anyone wants to chime in on any of these questions.
33:52Alright, let's do one last question.
34:10Look, I think this bill we're going to get passed.
34:13We have 12 co-sponsors right at the outset.
34:16It's bipartisan and it's across the spectrum.
34:19I don't think there's an ideological divide on this.
34:21I don't think there's a partisan divide on this.
34:24Standing here today, I don't know if the big tech companies will lobby against this or not.
34:29I hope they don't.
34:30I hope they recognize this is an eminently common sense step
34:34that frankly is what they should have done a long time ago.
34:37And the marginal additional profit they get from circulating this garbage,
34:44I hope they're willing to give up.
34:46So it would be disappointing if the tech companies come in against this.
34:50This bill is also designed to go through the Commerce Committee.
34:54I'm the ranking member on commerce.
34:56I very much hope we have a markup in the Commerce Committee soon.
34:59I'm going to be pressing the chairman of the committee, Maria Cantwell, to schedule a markup.
35:04This ought to be a bill that would pass.
35:07My suspicion is when we have a markup, I expect that it will pass unanimously in committee.
35:12I don't know that.
35:13We don't have those votes locked down.
35:15But I'm not aware of any opposition from members on either side of the dais.
35:20And then the question is, can we get it moved on the floor?
35:24My hope is to get it passed, and I'd like to get it passed this year.
35:28That's what I'm going to be pressing to do.
35:31It will depend on if there are any senators who object.
35:35As I stand here today, I don't know of any senators who object, and I hope they don't.
35:40We tried to spend real time drafting this in a way that is narrowly targeted to the harm.
35:46It's not overbroad.
35:49It doesn't encompass conduct outside of what we're seeking to prevent.
35:54And look, this bill is really designed to provide help, needed help, to moms, to kids,
36:01to the victims of these egregious invasions of privacy and these egregious violations.
36:09And I just want to say I want to thank everyone for joining us today,
36:15and I want to thank, in particular, Dorota and Anna, and especially Elliston.
36:20I have teenage daughters, and I know that standing in front of an array of TV cameras
36:26is not an easy thing to do.
36:28You did a fantastic job, everyone.
36:30And so thank you for your courage, and thank you for everyone for being here.
36:34We're going to get this done.
36:36There are a lot of times when the Senate doesn't work very well,
36:40but I believe on this bill the Senate will do exactly the job we're supposed to do,
36:45and we'll come together and get it passed and answer the needs of our constituents.
36:49And I will point out as just a closing observation that one of the first steps that put this issue on my plate
36:59was a call that my office got from Anna.
37:02They're Texans.
37:04And she called our office and told us what had happened to Elliston.
37:09And I've been focused for a long time on protecting kids online from all sorts of predators
37:15and all sorts of harmful material that is directed at children.
37:19And I've got to tell you, every parent I know who has teenage kids or who has adolescent kids
37:25is terrified about the threats that come online.
37:28Our kids, it's really hard to be a kid today.
37:32But this started really with that phone call that was teed up in our office,
37:36and we said, this is horrible.
37:38There ought to be a way to fix it.
37:40And that is what prompted this legislation and prompted the coalition that's come together.
37:46And thank you, each of you, for your groups.
37:49The work you do fighting for kids, fighting for victims of horrific abuse,
37:54that's incredibly important, and I appreciate it.
37:57All right, thank you, everybody.
38:02Thank you.