BREAKING NEWS: Ted Cruz Holds Texas-Based Senate Hearing To Demand Big Tech Action On Revenge Porn

Sen. Ted Cruz (R-TX) holds a Senate Commerce Committee field hearing in Dallas, Texas, to demand action from big tech companies to combat revenge porn.

Transcript
00:00:30Good afternoon, everyone. Welcome. Welcome to a field hearing of the Senate Committee on Commerce, Science, and Transportation.
00:00:37This is a field hearing on the TAKE IT DOWN Act: ending Big Tech's complicity in revenge pornography. I
00:00:45Want to start by thanking the University of North Texas at Dallas for providing the venue for today's hearing
00:00:52Including the UNTD staff that have worked so hard behind the scenes to put this event together. I
00:00:58Owe another huge thanks to the UNTD Dallas campus police and the state troopers
00:01:05who are providing security this afternoon. Thank you. And of course, thank you to our witnesses for testifying today and
00:01:12to our audience for taking the time to be here in person if
00:01:17you are a victim of revenge pornography or AI-generated explicit imagery in the Internet age,
00:01:25your life changes
00:01:27Permanently
00:01:28you most likely are a woman or a young girl and
00:01:34you most likely have been targeted by someone, you know, and
00:01:38If you do not happen to be a celebrity
00:01:41You most likely are still struggling
00:01:44To have the images of you removed from the Internet
00:01:50The scourge of so-called revenge pornography is not new
00:01:54It has been with us sadly for decades
00:01:58however, new generative AI tools
00:02:01Have made creating realistic
00:02:04yet fake
00:02:06explicit images and videos of real people
00:02:10easier than ever
00:02:12Due to advances in technology now anyone can become a victim
00:02:19And it's increasingly affecting a particularly vulnerable population
00:02:25Teenage girls
00:02:27High schools across the country are seeing an explosion of AI generated
00:02:34sexually exploitative images of female students
00:02:40Thousands upon thousands upon thousands of them
00:02:44These images are often created and spread by male classmates
00:02:51We've seen almost identical cases
00:02:56In New Jersey
00:02:58Where a male classmate created and shared fake
00:03:02sexually explicit images of
00:03:05their underage female classmates
00:03:08We've seen it in Illinois
00:03:10Where boys altered dozens of their female classmates' prom photos
00:03:15to create fake nude images
00:03:19We've seen it in Washington
00:03:21Where a boy at a dance used AI to virtually strip his female classmates' dresses off
00:03:28We've seen it in
00:03:30California
00:03:31Where three separate schools around Los Angeles, in Beverly Hills,
00:03:36Calabasas and Laguna Beach
00:03:38Have had incidents of students using AI to create inappropriate
00:03:44sexually explicit images of their classmates
00:03:48We've seen it in Florida
00:03:50where middle school boys created sexually explicit images of their 12 year old and 13 year old classmates and
00:03:58We've seen it here in Texas
00:04:00where a boy targeted a group of female friends and
00:04:04shared realistic
00:04:06but fake
00:04:08sexually explicit images
00:04:10in each case
00:04:12this cruel filth
00:04:15Was then sent to classmates
00:04:18further exploiting and
00:04:20embarrassing the victims I
00:04:23suspect that there are many many more instances that are never reported to law enforcement and
00:04:29That don't make the nightly news
00:04:32State lawmakers across the country have been taking action
00:04:36including here in Texas
00:04:38But the online nature of these would-be crimes
00:04:42Demands a federal solution to provide peace of mind to all victims
00:04:47That's why, earlier this month, I was proud to join with Senator Amy Klobuchar, a
00:04:54Democrat from Minnesota,
00:04:55to lead a bipartisan group of 14 senators, 7 Republicans and 7 Democrats,
00:05:02to introduce the TAKE IT DOWN Act
00:05:06Also joining me last week in introducing this legislation
00:05:10were Senators Shelley Moore Capito,
00:05:13Richard Blumenthal, Jacky Rosen, Cynthia Lummis, Ted Budd,
00:05:19Laphonza Butler, Todd Young, Joe Manchin,
00:05:24Bill Cassidy,
00:05:25John Hickenlooper, Martin Heinrich, and John Barrasso
00:05:30since then
00:05:32Senators John Thune, Roger Wicker, and Marco Rubio have all joined with the effort as well
00:05:38And there is a companion bill underway in the United States House
00:05:42being led by Representative María Salazar
00:05:46Our bill has garnered support from over 40 organizations across the political spectrum,
00:05:53including both left-leaning and right-leaning advocacy groups
00:06:00It's garnered support from unions from law enforcement and from industry as
00:06:07the support for this bill plainly indicates
00:06:10This is not and should not be a partisan issue and it is imperative that Congress act quickly
00:06:18to protect the victims
00:06:20The TAKE IT DOWN Act will make it a federal crime with real jail time
00:06:25for publishing online
00:06:28Non-consensual sexually explicit images of another person
00:06:32with heightened penalties if the victim is a child
00:06:37this applies even if the image is AI generated and
00:06:42It also applies in cases where the original image was created
00:06:47consensually
00:06:48But the depicted individual did not consent to its publication
00:06:53It also makes it a crime to threaten to publish these images online
00:06:59thereby covering the associated problem of
00:07:03Sextortion which we're seeing more and more targeting young people
00:07:08In addition, the TAKE IT DOWN Act is the first of its kind to incorporate a notice-and-takedown
00:07:16requirement for social media and other websites that allow users to post these images.
00:07:22This means that TikTok and Twitter and Snapchat and Instagram will have processes in place
00:07:30to receive and
00:07:32to immediately act upon complaints from victims
00:07:37Notice and takedown is a critical remedy for victims who want the images and all the copies of them removed
00:07:45So they don't spread further
00:07:48Especially in cases where the victim may not even know
00:07:51Who initially posted the images
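The notice-and-takedown duty described here is, mechanically, an intake-and-deadline workflow: receive a victim's complaint, remove every known copy, and act within a fixed window. Below is a minimal Python sketch of that workflow; the class and function names are illustrative assumptions, and the 48-hour window comes from witness testimony later in this hearing, not from statutory text quoted here.

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta
    from typing import Callable

    # Assumed removal window, per the 48-hour figure cited later in the hearing.
    TAKEDOWN_WINDOW = timedelta(hours=48)

    @dataclass
    class TakedownRequest:
        reporter_id: str       # the victim or an authorized representative
        image_ids: list[str]   # every known copy on the platform
        received_at: datetime = field(default_factory=datetime.utcnow)

        @property
        def deadline(self) -> datetime:
            # The platform must have acted by this time.
            return self.received_at + TAKEDOWN_WINDOW

    def process_takedown(request: TakedownRequest,
                         remove_image: Callable[[str], None]) -> None:
        """Remove the reported image and all known copies of it."""
        for image_id in request.image_ids:
            remove_image(image_id)  # platform-specific removal hook
        # A real system would also fingerprint the removed images to block
        # re-uploads, which is what removing "all the copies" requires in practice.

The key design point in this sketch is that the victim's complaint itself, not a court order, is what starts the clock.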
00:07:55As the Supreme Court stated in a 2014 case concerning restitution for the possession of child pornography,
00:08:02quote,
00:08:04"Every viewing of child pornography is a repetition of the victim's abuse."
00:08:11This is no less the case for victims of non-consensual so-called revenge pornography and
00:08:18For victims of realistic but fake
00:08:23computer-generated deep fake
00:08:26sexually explicit images
00:08:29While many social media companies purport to have a zero-tolerance policy for posting
00:08:35non-consensual sexual content, in
00:08:38practice it can be very difficult for victims who file reports to have the images removed.
00:08:45The TAKE IT DOWN Act ensures that social media
00:08:49prioritizes reports from victims, and if they don't comply, it empowers the Federal Trade Commission to pursue
00:08:57enforcement actions.
00:08:58The TAKE IT DOWN Act has been referred to the Senate Commerce Committee,
00:09:02where I sit as the ranking member. It is one of my top priorities that this bill
00:09:09be placed on the next committee markup so that it can be passed out of committee and that it can receive Senate floor
00:09:16consideration as soon as possible I
00:09:21now want to introduce the witnesses who are here today for this hearing.
00:09:24We are honored in
00:09:27particular to have testifying two 15-year-old victims of
00:09:31AI-generated sexually exploitative images.
00:09:36We have Miss Francesca Mani of Westfield, New Jersey, and
00:09:41we have Miss Elliston Berry of Aledo, Texas.
00:09:45almost unbelievably
00:09:47both of them
00:09:49Were victims of this heinous activity in the very same month
00:09:56October of 2023 while living nearly 2,000 miles apart
00:10:01Today they are together in person for the first time
00:10:05To ring the national alarm bell to tell their stories on the growing spread of deepfake
00:10:12Sexually exploitative images of minors nearly all of which are targeted at young girls
00:10:20Being a victim of this horrendous practice is deeply traumatic
00:10:26Choosing to speak out about it on a national level
00:10:31on a national stage
00:10:32with TV cameras
00:10:36Requires another level of courage and verve
00:10:39Thank you to both of you for being here. I know it's not easy. Thank you for telling your story
00:10:44Thank you for having the courage to tell your story
00:10:48for other young girls and other young boys
00:10:52Who might be victims of this going forward?
00:10:56Both Francesca and Elliston are accompanied by their equally tenacious mothers
00:11:02Elliston's mother, Mrs. Anna McAdams, will also be testifying today. I
00:11:07Am also pleased to have here today another Texas resident
00:11:12Miss Hollie Toups
00:11:14who was the victim of
00:11:17non-consensual
00:11:18intimate image publication over a decade ago.
00:11:23Let's be clear
00:11:25Holly has been fighting for over 10 years
00:11:28To get her images taken down from the web
00:11:32This is inexcusable
00:11:34It is well past time that Holly and other victims of so-called revenge pornography see justice
00:11:43Finally I'm honored to be joined by two other tireless advocates
00:11:48Miss Andrea Powell
00:11:51Who has over 18 years of experience
00:11:55advocating for survivors of trafficking
00:11:58sexual exploitation and
00:12:00sexual abuse and
00:12:02Also, Mr. Stefan Turkheimer
00:12:05from the Rape, Abuse & Incest National Network, or RAINN,
00:12:10Which is also a supporter of the take it down act
00:12:14Thank you all for taking the time to be here today
00:12:17And I'd now like to recognize miss Powell to give her opening statement
00:12:23Dear Senator Cruz, thank you for having me here today. Again, my name is Andrea Powell.
00:12:27I am an expert and advocate on sexual exploitation, human trafficking, and image-based sexual violence. For the past 20 years,
00:12:34I've worked alongside more than 2,000 survivors to create safe homes,
00:12:39support law enforcement and medical interventions, and support both child and adult victims, and advocate for critical policies such as today's
00:12:46that will give survivors access to protection, justice, and healing.
00:12:49I am the author of Believe Me, an advocate's account of finding justice for victims of sex trafficking, and a founding partner to the first
00:12:57survivor-created AI facial recognition solution to technological deepfake abuse, Alecto AI.
00:13:03I am here today to speak on the most pervasively rising form of sexual violence harming women and girls,
00:13:09which is known as deepfake abuse. As a survivor of sexual assault as a teenage girl from Central Texas,
00:13:15I never thought my silence in the face of my own abuse was a privilege. I never told anyone; that was my choice.
00:13:22Today I'm standing up for survivors of what is known as image-based sexual violence,
00:13:26which by design humiliates victims by showing the entire world their abuse, often before they even see it.
00:13:34However, many have very few of the resources to ensure their abuse images are removed and never again uploaded.
00:13:40Meanwhile their abusers continue to share and degrade them with little to no recourse.
00:13:46The term deepfake abuse is in fact a term coined by a Reddit user in
00:13:492017, who went on to create what is arguably the largest sexually explicit deepfake website,
00:13:54MrDeepFakes, where women and girls are exploited; it is one of 9,000 such websites hosted by Google.
00:14:00There is nothing fake about the pain that deepfake abuse causes. In
00:14:052022, nurse practitioner and candidate for the Virginia House of Delegates
00:14:09Susanna Gibson was harmed in this way when non-consensual real videos of her with her husband were
00:14:14distributed and then reported on by the Washington Post,
00:14:17creating a road map to her abuse. Rather than being treated as a victim of a sex crime,
00:14:22Susanna was exploited by the media, the public, and those who shared her content.
00:14:26Shortly thereafter, deepfake abuse content of Susanna was also online. She lost her job, her candidacy,
00:14:33experienced terrifying threats to her and her family at their home, and almost took her life. In order to get her content down,
00:14:40Susanna, like many, could not rely upon law enforcement, because there were no laws
00:14:44to protect her. Like most victims of deepfake abuse, Susanna had to create her own digital rape kit,
00:14:51painfully locating each abuse image
00:14:53contacting each platform and website, hoping they would take mercy on her and take the content down. Most did not.
00:15:00Susanna now runs a nonprofit advocating for survivors like herself
00:15:04Breeze Liu considered ending her life when her former partner created both
00:15:08non-synthetic and deepfake abuse images and videos of her. Her content was on over 800 URLs across the world,
00:15:15and currently over a hundred and forty of these URLs are hosted by Microsoft Azure.
00:15:20Having just graduated from the University of California, Berkeley, and a venture capitalist,
00:15:24Breeze was shocked when law enforcement asked her if she was just a prostitute.
00:15:29The truth of the matter is anyone can be a victim
00:15:32whether or not they've exchanged the content willingly, whether or not it was created via deepfake abuse.
00:15:37But no one should be a victim
00:15:39Deepfake abuse is growing at astounding rates. Estimates indicate deepfake abuse has grown
00:15:453,000% since 2019, with over 300 apps, often free, with a simple search online.
00:15:50I believe this directly contributes to situations like teen boys finding and using online abuse apps to nudify their female classmates and teachers.
00:15:59It's a digital virtual gun that is loaded and aimed at the faces and lives of women and girls
00:16:04Such as those testifying before you today
00:16:07So you have the app stores that monetize the apps, the software developers that create it, the platforms that host it, and the websites that
00:16:13explicitly create a form of organized sexual violence that is coordinated and monetized. Some creators make over
00:16:20$20,000 a year; they hire assistants. In fact, deepfake abuse has become a dark culture that celebrates the abuse and sexual
00:16:27exploitation of women and girls. A simple look at the forums of websites like Mr.
00:16:31Deepfakes shows there's a culture emerging that validates the creators, even young boys, and rewards the violence.
00:16:37While tech platforms like Meta and Bumble are loudly partnering with efforts like StopNCII to remove images, the abuse still remains.
00:16:44I earlier mentioned a young woman named Breeze Liu.
00:16:47Breeze went on to found the very tech solution that she needed in her darkest hour.
00:16:52Alecto AI uses AI facial recognition software to partner with platforms and give individuals their consent back.
00:16:59This gives the control and safety back to individuals, and it saves them time and resources on the platforms.
00:17:05Alecto AI demonstrates through survivor leadership once again that we actually can solve this problem.
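The testimony does not detail Alecto AI's matching pipeline beyond "AI facial recognition." As a general illustration of how image-matching takedown tools work, here is a minimal sketch of the perceptual-hashing approach used by services like the StopNCII effort mentioned above, written in Python with the open-source Pillow and ImageHash libraries; the function names and the distance threshold are assumptions for illustration only.

    # Sketch of hash-based matching, the general technique behind StopNCII.
    # This is NOT Alecto AI's implementation, which per the testimony uses
    # facial recognition rather than whole-image hashing.
    from PIL import Image   # pip install Pillow
    import imagehash        # pip install ImageHash

    def fingerprint(path: str) -> imagehash.ImageHash:
        """Compute a perceptual hash that survives resizing and re-encoding."""
        return imagehash.phash(Image.open(path))

    def matches_reported_image(candidate_path: str,
                               reported_hashes: list,
                               max_distance: int = 8) -> bool:
        """Flag an upload whose hash is close to any victim-reported hash.

        max_distance is an assumed Hamming-distance threshold; real deployments
        tune it to trade false positives against missed copies.
        """
        candidate = fingerprint(candidate_path)
        return any(candidate - known <= max_distance for known in reported_hashes)

A notable property of this hash-based design is that a victim only ever shares hashes with platforms, never the images themselves.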
00:17:11Survivors should have laws that match the crime
00:17:14Perpetrators should be held accountable and I stand before you today
00:17:18With the hope that we can make that possible because we can stop deepfake abuse. I know we can
00:17:23Thank you, Miss Powell for your testimony and thank you for your advocacy speaking up on behalf of victims everywhere
00:17:29Mrs. McAdams
00:17:32Thank you, Senator Cruz.
00:17:34On October the 2nd we woke up to the news that devastated our lives forever
00:17:39The previous day a fellow classmate decided a fate for my daughter and her eight friends that they would never have imagined for themselves
00:17:46He decided to take beautiful innocent pictures off of their social media platforms
00:17:50and impose nude body parts onto them using an AI app from Google called d and gg.
00:17:56On Snapchat, he made multiple accounts.
00:17:59He friend requested people and blasted out the fake nudes of the girls.
00:18:02It is important to draw attention here to the fact that he didn't just take someone else's nudes and put their faces on them
00:18:08Instead he took their actual bodies and used AI
00:18:11To make the images look even more real
00:18:14These images are child pornography
00:18:16These girls were 14 years old
00:18:19Their innocence was taken that day
00:18:22In just 24 hours everyone in their social group had seen these pictures and they were being sent out to the entire high school
00:18:29That week became a nightmare
00:18:31Some of their friends asked him why he was doing it
00:18:34And his response was that he wanted to go out with a bang and he wanted to ruin the girls
00:18:38In today's world those words are some of the scariest
00:18:46We live with school shootings on a regular basis in this country. You could only imagine the fear those words caused the girls
00:18:53Would he come to school with his intent to hurt them? Would he kill himself hurt himself?
00:18:58The girls didn't want to go to school at all that week
00:19:01On friday of that week the girls did go to school and the school went into lockdown
00:19:05It seemed to the girls someone was after them
00:19:08We are thankful that it was nothing more than a false alarm
00:19:11But during this lockdown this boy decided to go onto his account and continue to terrorize
00:19:17That allowed the school's tech team to catch him
00:19:20it is
00:19:21So shocking to be violated in this way first having nude sent out
00:19:25That you didn't consent to and second then having the fear that your life might be in danger
00:19:31What he did was malicious. He chose his victims and he reveled in their torture
00:19:36Though he was caught, he was still not allowed,
00:19:39we were still not allowed, to know his identity. The sheriff's office and the school said that they had to protect him as a minor.
00:19:45Our girls felt jaded by this. They were the victims not him
00:19:50We were able to have the school do a Title IX investigation, and once that was completed we were able to know his identity.
00:19:57The school sent him to in-school suspension but could not tell us when he was coming back
00:20:02So the rest of the fall semester was spent in fear that he would come back to school without a warning
00:20:07My husband and I went to the school board
00:20:09Our plea was for them to expel him indefinitely
00:20:12The school board argued that there was nothing in the student code of conduct to cover these AI offenses.
00:20:18The perpetrator was charged with sale, distribution, or display of harmful
00:20:22material to a minor. It was a Class A misdemeanor.
00:20:26Instead of having our day in court, Parker County determined that he was not a harm to anybody and just gave him probation.
00:20:33When he turns 18, his record will be expunged. He will walk away unscathed
00:20:37However, our girls will forever live in fear that when they apply for a job or go to college these pictures might resurface
00:20:45He chose a fate for our daughters that we cannot change
00:20:48There needs to be consequences for what he did.
00:20:51If we do not have stricter laws and punishments for this type of crime, it will only increase.
00:20:55We are thankful that our girls have family, friends, counselors who support them.
00:20:59They will come through this.
00:21:01If the girls didn't have this support, it could have turned into something worse, something unimaginable, like a mental breakdown or even suicide.
00:21:09This is why the TAKE IT DOWN Act is so crucial.
00:21:11This bill will hold even minors accountable, with jail time, for this crime,
00:21:16and it would require Snapchat and other social media platforms to take those images down within 48 hours.
00:21:21As of two weeks ago, Snapchat had not responded to the warrant issued by our sheriff's department,
00:21:26Nor to any of the requests that we made as parents online
00:21:29When I met with Senator Cruz's office two weeks ago,
00:21:32they were able to get a hold of Snapchat and get the accounts and images taken down. It took eight and a half months.
00:21:38If we had been Taylor Swift, it would have come down immediately.
00:21:42This bill will give us a voice that we did not have before
00:21:45Thank you
00:21:48Thank you, Mrs. McAdams, and thank you for your ferocious defense of your daughter.
00:21:54Miss Berry
00:21:57Thank you, Senator Cruz.
00:21:58I was 14 years old when a fellow classmate created AI nudes from just an innocent photo on Instagram.
00:22:05I was 14 years old when I was violated all over social media
00:22:09And I was just 14 years old when I feared my future was ruined
00:22:13October 2nd 2023. I had woken up to multiple messages from a friend
00:22:19Notifying me that photos were circulating social media
00:22:22Not just any photos: pictures from a past Instagram post, with a nude body created by an AI app imposed upon mine.
00:22:29Fear shock disgust and disbelief were some of the many emotions that filled my head in that moment
00:22:35I was left speechless as I tried to wrap my head around the fact that this was occurring
00:22:40Having to admit to your parents was shameful as I still felt responsible and began to blame myself
00:22:46As I attended school, I was fearful of the reactions and opinions people had. We live in a society that is built on social media,
00:22:53so I convinced,
00:22:54I had been convinced at least, the whole school had seen these images, and to this day
00:22:59The number of people that have these images or had seen them is still a mystery
00:23:04As it took eight and a half months to get these images off of Snapchat, that doesn't wipe the photos off of people's devices.
00:23:12Every day, I will live in fear that these photos will resurface or someone could easily recreate them.
00:23:17As these pictures were being passed around, I still attended school and was expected to act like all was well.
00:23:23I felt unprotected walking through school.
00:23:26Safety was never a concern of mine until these photos surfaced. The school was little to no help; as I said,
00:23:31it was out of their control.
00:23:34So
00:23:36The spread of AI nudes is unpunishable
00:23:38Because it isn't considered child pornography
00:23:41But as a victim of AI deep fakes it has created a tremendous amount of pain and this is why I come here and share
00:23:46my testimony
00:23:48My goal is to prevent any other student from undergoing this issue.
00:23:52Cases of students, female and male, are appearing every single day,
00:23:56and the people behind this malicious act do not have any consequences.
00:23:59My intent is to give victims a voice they never had and hopefully turn this horrible situation into something good. Thank you.
00:24:08Thank you very much, Miss Berry. Miss Mani
00:24:12First off a huge. Thank you to everyone who has taken time out of their insanely busy schedules to be here
00:24:19It means a lot to me to be able to share my story and ask you to do
00:24:23the right thing by putting laws in place to protect women and children, teenagers like me.
00:24:31On October 20th,
00:24:32I found out that a few of my classmates created AI
00:24:36nude images of me and other girls in my school without our knowledge and our consent. Now,
00:24:43What is consent?
00:24:45Consent is clearly and willingly agreeing to something without any pressure
00:24:50Well, we did not agree to anything
00:24:53Despite not seeing the images ourselves and hearing assurances that they have been deleted
00:24:59What happened on october 20th to me?
00:25:02And the other girls is unacceptable
00:25:04No child teenager or woman should ever experience what we have experienced
00:25:10I initially felt shocked then powerless and angered by the absence of laws and school policies
00:25:17to protect us. Now, I am determined to push for reforms.
00:25:22The obvious lack of laws speaks volumes: we girls are on our own. And considering that
00:25:2996% of deepfake AI victims are women and children,
00:25:34We're also seriously vulnerable and we need your help
00:25:38This meeting proves that the politicians here today,
00:25:41regardless of being Democrats or Republicans,
00:25:44genuinely care. They're fathers, mothers, uncles, brothers, and many more.
00:25:50They understand how urgent it is to fix this mess
00:25:55Coming together to tackle the misuse of AI is super important
00:25:59So huge thank you to everyone here and even those not here
00:26:03But backing the newly introduced AI bill or thinking about supporting it
00:26:08I'm 15 now; what happened to me at 14 could happen to anyone.
00:26:13That's why it's so important to have laws in place. I'm not here by choice
00:26:19You know discussing legislation
00:26:21isn't exactly my ideal way to have fun on a Wednesday afternoon in June.
00:26:27But I know my voice matters. Without Senator Cruz's bill,
00:26:31we'll continue to have teens making AI deepfake images of girls.
00:26:37But AI is not just a toy for bored teens. It's also a tool for predators
00:26:44And that's really not okay
00:26:46To wrap this up this issue is pretty black and white
00:26:49Creating AI deepfake nude images without consent should be illegal and it's not okay
00:26:55And it's the adults' job to protect us with laws, and our job to learn how to protect ourselves by protecting our image.
00:27:04In conclusion while we've been focusing on the challenges and potential misuse of AI
00:27:10it's important to recognize that AI is not all bad.
00:27:14It's a fascinating and rapidly advancing technology
00:27:17That has been significantly benefiting our society in the health care and science arena
00:27:22It's crucial. However, that we approach AI with a balanced perspective
00:27:27We need to educate ourselves about its capabilities and limitations
00:27:32Ensuring that we evolve and grow alongside this technology
00:27:37If we get the right laws in place and learn more about AI in schools
00:27:41We can totally tap into the benefits of AI without getting burned
00:27:45It's all about
00:27:46Finding the sweet spot where innovation meets responsibility where progress doesn't come at the cost of safety and ethics
00:27:53So there you have it my Wednesday plea for change
00:27:57I hope I didn't take too much of your time and that together we will make the digital world a safer place
00:28:03Not just for me, but for everyone out there. Thank you
00:28:07Thank you very much, Miss Mani, and let me say thank you also to your mom, who's here with you today and
00:28:12who was with us in DC two weeks ago as well. Thank you for sharing your story.
00:28:18Miss Toups
00:28:21Thank you, Senator Cruz.
00:28:23Over a decade ago. I suddenly found myself in a disturbing and unfamiliar universe
00:28:28Intimate images that were of a personal nature had been posted on a website and shared with the world
00:28:33As I entered this alternate universe. I wasn't sure what to do
00:28:37I was overcome with an eerie feeling knowing that there were people in real time looking at my body without my consent
00:28:44But it was not just pictures. There were horrible comments threats and personal information posted about me
00:28:50I immediately felt unsafe in my own home and uncomfortable in my own skin
00:28:55Everyone in my hometown was talking about the website
00:28:58When I was in public, I was approached by strangers who had no boundaries
00:29:01They had after all seen me topless
00:29:04They felt like they knew me and they assumed I wanted the attention
00:29:08As soon as I was able to get myself together, I emailed the website explaining my pictures were posted without my consent
00:29:15and requested that they be removed. This was, after all, an egregious invasion of privacy, right?
00:29:21I was caught on a website where I was not allowed to share my personal information.
00:29:26Right?
00:29:27I was confident they would agree and that would be the end of it
00:29:30However, they said they would be happy to remove my photos for a fee
00:29:35Shocked, I responded that I would not be paying to remove photos they had posted without my consent. I would, however, hire a lawyer.
00:29:43I wasn't sure, however, that I could afford a lawyer, so I first sought assistance from law enforcement.
00:29:48However, there was little they could do at that time. Sharing intimate photos without consent on the internet was something new,
00:29:55and the laws had not caught up.
00:29:57I reached out to a lawyer who agreed to send a letter to the website
00:30:01demanding the removal of my photos, but nothing legally could be done, he felt, as they were protected by Section 230.
00:30:10It looked as if they were free to do as they pleased. This was not acceptable to me
00:30:16How was it that they were protected and I was not?
00:30:20By this time the photos had been shared and saved on other websites compounding the trauma
00:30:26Someone suggested I contacted a private investigator to see what they could do
00:30:31He agreed to help me and within a few days. My pictures were removed from the original website
00:30:36As you can imagine I breathed for the first time in months
00:30:40He warned me, however, that it might not be over, and I found myself checking the website
00:30:45To be sure that my pictures were not back up
00:30:48Wake up, pour coffee, check website, repeat.
00:30:52Just when I began to feel like it was over. They were up again this time on the front page
00:30:58The perpetrators bragged about their ability to do freely as they pleased because they were protected by Section 230.
00:31:06From there my photos continued to be shared from website to website
00:31:10It was like whack-a-mole trying to get them down as they continued to pop back up.
00:31:15The comments and the harassment increased. With the help of the investigator, I met with another lawyer who agreed to help me.
00:31:23I had gathered names of other local women posted on the original website. He agreed to assist them, too.
00:31:29We filed a lawsuit
00:31:31Naming the host site as well
00:31:33Unfortunately, at that point, when I chose to stand up for myself and fight for my privacy, the harassment got worse.
00:31:41The original site was eventually shut down by the courts and the site owner identified
00:31:47After some time of wallowing in depression, I became angry. This could not stand. I reached out to my state
00:31:54representative, which led me to connect with then-Texas Senator Garcia and Texas State Representative González.
00:32:01I had the honor of contributing to Texas passing the Relationship Privacy Act, aimed at safeguarding Texans from this unwarranted harm.
00:32:09While they were able to determine who started the website
00:32:12I was never able to confirm who posted my photos and how they got possession of them
00:32:18For months after I checked the internet every day to make sure my photos were not there
00:32:23Every day it controlled me
00:32:25I often think of the messages and the threats that I received
00:32:29And of others who have gone through the same thing. It's really hard to put into five minutes what that year was like
00:32:36These actions can inflict long-term
00:32:39psychological, personal, and social repercussions for the victims. I have gone through a lot of therapy to get past it.
00:32:48I'm still to this day
00:32:49Amazed that with the click of a button your body your privacy and your personal information can be exposed for the world to see
00:32:57We have come a long way since my photos were posted without my consent
00:33:01and while I'm grateful to see the progress that has been made
00:33:04to protect others, and that there are protections being put in place, there is still more to do.
00:33:09I think we can all agree on the importance of privacy
00:33:12I think we can all agree that things of an intimate nature should be kept private and not disclosed to the world by individuals who seek harm
00:33:20Additionally that bad actors should not be permitted to create images with someone's likeness
00:33:26Thank you for your time and your attention to this
00:33:29Thank you, Miss Toups, for telling your very difficult and harrowing story. Mr. Turkheimer
00:33:39Thank you
00:33:39Thank you, Senator Cruz. My name is Stefan Turkheimer, and I'm Vice President of Public Policy for RAINN.
00:33:44RAINN is the nation's largest anti-sexual violence organization.
00:33:48We created and operate the National Sexual Assault Hotline in partnership with more than 1,000 local sexual assault service providers across the country.
00:33:55Over 4.5 million survivors and their families have been helped by the hotline.
00:33:59We also operate about 30 hotlines for partners, including the DoD Safe Helpline for the Department of Defense.
00:34:04RAINN also carries out programs to prevent sexual violence, educate the public, and ensure the perpetrators are brought to justice,
00:34:09And help organizations and companies improve their approach to prevention and response
00:34:13Thank you for the opportunity to testify today on the critical issue of the distribution of non-consensual intimate images
00:34:18Often referred to as deep fakes revenge porn or non-consensual pornography
00:34:25My testimony will outline why it's imperative for the United States
00:34:28to outlaw the creation and distribution of these images and provide survivors with a means to remove them from the internet and stop the cycle of harm.
00:34:35as you've heard from
00:34:37Everyone up here today
00:34:39Victims of non-consensual intimate image distribution often endure significant emotional and psychological distress.
00:34:45This includes feelings of shame, guilt, anxiety, and depression; in severe cases, distress can lead to self-harm or even suicide.
00:34:52The emotional toll on victims underscores the need to address these problems and for this bill to become law.
00:35:00Distribution of intimate images without consent can irreparably damage an individual's reputation, affecting both personal and professional lives. Victims may face stigma,
00:35:09discrimination, and ostracism from their communities, workplaces, and social circles. The social consequences can be devastating and long-lasting.
00:35:18Sharing intimate images without consent
00:35:20fundamentally undermines an individual's autonomy over their own body and personal information. Consent is a cornerstone,
00:35:26as Miss Mani was saying, of all interactions, especially those involving intimate or private matters.
00:35:32outlawing non-consensual intimate image
00:35:34Distribution reinforces the importance of consent and personal autonomy and that's why it's so important that this actually becomes a crime
00:35:42Because criminalizing the distribution of non-consensual intimate images serves as a deterrent against malicious behavior
00:35:47It sends a clear message that such actions are unacceptable and punishable by law
00:35:54I want to underscore the lack of Big Tech support.
00:35:58Intimate images have been around forever; deepfakes are new.
00:36:01It's the easy creation and distribution of these images that has created the real problem.
00:36:05We are standing on a precipice of proliferation of these images
00:36:08We know how harmful they can be but right now there is nothing requiring the tech companies to fix the problems that they facilitate
00:36:16Last week at the press conference
00:36:17I told the story of a federal prosecution for identity theft
00:36:20A woman had been in a brief relationship with a navy captain and shared images of herself in the context of that relationship
00:36:27After the relationship ended he created a facebook profile of her
00:36:31Friended all of her friends and new co-workers members of her softball team people at her gym
00:36:36And shared those photos and more with them
00:36:39There was enough evidence for a federal prosecutor to bring the case and for the jury to convict this person of identity theft
00:36:45But Facebook, which was asked 400 times to take down the photos, would not,
00:36:51Because they believed the fake profile was more real than the actual person
00:36:56Big Tech, they simply aren't going to fix the problem themselves.
00:37:00Big Tech needs this bill. Survivors need this bill.
00:37:03We need this bill. Having legal recourse provides victims with a means to seek justice and hold perpetrators accountable.
00:37:09It also validates the experiences of victims, acknowledging the wrong done to them and offering a pathway to closure and recovery.
00:37:16I said earlier that RAINN runs the National Sexual Assault Hotline, where so many have found help.
00:37:21I reached out to them this week to gather some stories about people that have been dealing with NCII, and I was struck by how
00:37:27often it overlapped with contact offenses. This is not just a crime of the internet.
00:37:31It often flows from and leads to direct physical, in-person abuse.
00:37:35In many instances, technology-facilitated abuse overlaps.
00:37:40This occurred at different points throughout the continuum or timeline of abuse.
00:37:43I'm just going to give a couple examples from people that have brought this up.
00:37:47Technology is a tool to facilitate contact offenses: the survivor was groomed by a much older man online.
00:37:52They said they were in an online relationship at 13, 14 with a predator and didn't know he was older.
00:37:57Technology used during abuse: the foster dad filmed himself hurting the visitor and also took photos of them.
00:38:04Then,
00:38:05the visitor states that their mother has threatened to share explicit photos of them
00:38:08if they tell anyone that she has been allowing men to have sex with the visitor for money.
00:38:13There's someone that was
00:38:15sexually assaulted three years ago, had photos taken of it, and now is being threatened by those photos being spread online.
00:38:22This is a significant and pervasive problem, and it's only going to get worse without help.
00:38:25So thank you, Senator Cruz, for advocating for survivors and giving them a way to take back control of their own bodies.
00:38:31Thank you for your time
00:38:34Well, let me say thank you to each of the witnesses for being here,
00:38:41for the advocates who spend your time speaking out against this abuse. Thank you for doing so. Thank you for your courage.
00:38:49And for those of you who have shared your own stories of being victimized
00:38:54That takes extraordinary courage
00:38:57It would be easy to say I I don't want to talk about what happened to me
00:39:02And you've each made the decision that that you're not going to do that. You're going to take the harder path
00:39:07Of describing the pain and the hurt and what happened
00:39:12And in doing that you're making a difference. You're making a difference for young people. You're making a difference for women
00:39:18You are making a real step to protect others and hopefully to prevent them
00:39:24From going through what you've gone through
00:39:28I have to say as the father of two teenage daughters. I'm horrified by what has happened to you
00:39:33and I hope
00:39:36That you're telling your stories
00:39:39Will help build momentum to drive congress to act
00:39:45It's well past time to talk about this we need to act we need to take up this legislation
00:39:49We need to vote on this legislation
00:39:51We need to pass it into law and I believe there needs to be real and serious consequences criminal consequences
00:40:00For doing to you what has been done to you
00:40:05I want to ask each of you briefly a question
00:40:09How important do you think it is?
00:40:11To hold big tech and websites accountable for taking the content down
00:40:19So
00:40:20I think it's
00:40:22immeasurably important, because I've met with
00:40:25Now over 50 survivors of image-based sexual violence both adults and minors
00:40:30And a mix thereof and we identified three core areas of desperate need and the number one resoundingly is having their images removed because
00:40:39when we think about
00:40:40Traditional offline sexual assault, which is also horrific. There's a post-traumatic stress factor, right?
00:40:46But you can't be post if you're always worried about the future of your violence being exposed online
00:40:53As I've mentioned, there are technological advances, Meta with StopNCII,
00:40:57but we need to do more, and we also need to recognize that there are ever-emerging smaller actors and websites, and they too should be held
00:41:04accountable
00:41:05So I think
00:41:07without that, and I've been told directly by tech companies, without legislation
00:41:11they will do nothing. And it's not that they're heartless; it's that they're trying to avoid the Communications Decency Act, Section 230,
00:41:19Which my colleague here had mentioned earlier, so it's really about
00:41:22Bringing technology and law together to support and listen to survivors
00:41:27Mrs. McAdams
00:41:32I think like what she said it is very very important that we hold them accountable
00:41:36um
00:41:37Like I had mentioned
00:41:38This boy is going to go on with life and never going to be phased by what he did
00:41:43But the girls will live in that forever
00:41:45And I like what you said about being re-traumatized every time you think that those images might be out there
00:41:51So, um, we need, as just common people that live every day,
00:41:55We need to be able to get a hold of these tech companies
00:41:58And get them to take our images down and not have to know someone
00:42:02In order to protect our children
00:42:05Miss Berry
00:42:07Um, I think it's extremely important, because as a victim of AI deepfakes, it took a toll on my mental health.
00:42:16And AI crimes are just increasing, and they're getting more accessible and more normalized, and so far
00:42:22There's not much that us as normal people can do
00:42:25So holding tech companies accountable
00:42:28For allowing these images to still be up is really important
00:42:34Miss Mani
00:42:35um, I think
00:42:37Yes, I think it's important that they're held accountable
00:42:42however, if they have joined us
00:42:46Um realizing it's the right thing to do
00:42:48but if they don't join us, they um
00:42:54They uh
00:42:56It's shown that they should be held accountable because all those photos and pictures
00:43:01Um will live on their websites forever and I just don't think that's okay
00:43:07And just to be clear, the owners of these websites could take down these photos in maybe less than a minute?
00:43:15Yeah
00:43:16Miss Toups
00:43:18Yeah, I agree. I think that the them being held accountable and also
00:43:22Being cooperative would have stopped my photos from being shared so many times
00:43:27And I think if they're cooperative and they have the tools to do that
00:43:32Initially, then it prevents a lot of what we've all gone through here today
00:43:35Well, and I will say you're telling the story of waking up each morning and checking each day
00:43:40Afraid that they'd be posted again. I think that was truly
00:43:43haunting and powerful, and it underscores
00:43:46The repeated victimization that this produces
00:43:49Mr. Turkheimer
00:43:52I think it's imperative that tech plays a role here. Um, and if they won't do it voluntarily
00:43:59They should be required to do so
00:44:01This is a problem
00:44:03that is not just facilitated by them but created by them,
00:44:07and
00:44:08The proliferation of these images is not possible without them
00:44:12And so they have to take a role in pulling this back. There is not a situation where someone's creating a deepfake
00:44:20on their own,
00:44:21without the deepfake websites, and then sending it in the mail. That's not what's happening here.
00:44:26This is all facilitated by them, and they have to be
00:44:30either part of the solution or
00:44:33required to fix it
00:44:36And let me ask again each of the witnesses very briefly.
00:44:40Uh,
00:44:41do you believe that Congress passing the TAKE IT DOWN Act, making it a crime to post these images and creating a federal
00:44:49right to have these images removed, do you believe that legislation would make a real difference in fighting this threat?
00:44:55Absolutely, when you think of the crime of image-based sexual violence in particular deep fake abuse
00:45:00You have sort of like a grid here. You have the victim you have the initial abuser
00:45:04You have the platforms that facilitate the distribution of that abuse
00:45:09And then you have law enforcement law enforcement can't serve what's not in the kitchen
00:45:13If there's not legislation on the books on a state and federal level
00:45:17Then they don't have the capacity to engage oftentimes
00:45:20because they'd like to and they can't. This legislation creates a through line and honestly sends an important message to tech platforms,
00:45:27And I don't care if they're big or small to be honest with you
00:45:30that we are taking this seriously. Up until now,
00:45:33they've been in charge of what gets taken down or not, and when good stakeholders get involved, good things happen.
00:45:39Otherwise, it doesn't. Those 9,000 websites that explicitly are designed for deepfake abuse
00:45:44absolutely do so with the joy of knowing they're causing harm, and they have to be held accountable,
00:45:49and so do the platforms that host them. And it's a great point that this applies to Big Tech
00:45:53but little tech and anyone else as well. Yeah, Mrs. McAdams
00:45:58Yes, I think it is imperative to pass this bill. What we
00:46:02experienced with our school district was that they didn't know what to do, so they did nothing. Same thing with our local sheriff:
00:46:08they didn't have any laws, anything in place, so they just did nothing.
00:46:13So this TAKE IT DOWN Act is going to be the basis, the starting point, for us now to be able to go into the schools
00:46:19and work with our student code of conduct and change that. So we're going to change it at every level,
00:46:24and I think this is the beginning part of that.
00:46:27Ms. Berry
00:46:29As I mentioned before, um, these AI nudes are not considered child pornography,
00:46:34so it doesn't count for anything. So allowing this bill to be passed allows victims to have a sense of relief and safety, because
00:46:42these photos will be off of social media within 48 hours.
00:46:46But as it took eight months, the spread of my photos could have gone
00:46:50to many, many people, I still have no idea. But within 48 hours, I can be assured
00:46:56that it won't spread as much as it did, as well as the person behind this act will get punishment.
00:47:03Yeah, Ms. Mani
00:47:05um, I have one word and the word is yes, I think it's
00:47:10super
00:47:11important that this bill exists because it
00:47:15just proves and
00:47:18It just helps protect us and women and children
00:47:22Ms. Toups
00:47:24Yes, I agree that this is vital. I know that there have been
00:47:28Variations of laws passed across states to you know, try to combat this
00:47:32But I think this really encompasses like all victims
00:47:36And it's catching up with the times because you know when the laws were beginning to get passed
00:47:41we didn't have the ai problem, um, and I think the
00:47:44Take down provision is really vital because the sooner you can get them, you know
00:47:49Ideally, they don't go up right the sooner you can get them down the less likely they are to spread
00:47:54Um, and then also the mention of the threat of the posting, um, you know, that is just as bad
00:47:59I'm living under the threat that someone is going to post it
00:48:02Um, it is just as harmful to victims as the posting and so I think I think it's a great bill
00:48:08And I hope that it goes through. Right. Mr. Turkheimer
00:48:13Yes, I think it's absolutely important. I think that
00:48:18Big Tech, little tech, middle-sized tech,
00:48:21None of them are really prioritizing the protection of victims and
00:48:26Having a criminal recognition of this crime is a recognition of the harm that it caused
00:48:31It's a recognition of the victims that are produced
00:48:34and
00:48:35to get
00:48:37Tech to take these things seriously in order to take them down in order to recognize them in order to remove them
00:48:43It requires that sort of criminal element
00:48:46of the charge. And so yes, absolutely.
00:48:51Great
00:48:52So now I'm going to ask questions of each of you individually, going a little bit further into the testimony that you
00:48:58presented, and I'll start, Miss Powell, with you.
00:49:01Um, thank you for sharing your story, and thank you for advocating for victims.
00:49:08Can you talk about how image-based sexual abuse has changed over the years?
00:49:15Absolutely, so
00:49:17Many of you here today are probably familiar with the avalanche of deepfake abuse
00:49:21that was created of Taylor Swift, Billie Eilish, and Ariana Grande.
00:49:25What many don't know is that much of the bodies that were used, along with their faces, in this non-consensual sexual violence act
00:49:32were the bodies of teenage and adult young women in what is infamously known as the Girls Do Porn case, in which girls were
00:49:40lured to San Diego
00:49:42and then ultimately sexually assaulted and filmed during those sexual assaults. While those perpetrators have been ultimately
00:49:50held accountable, sort of,
00:49:53what that shows is that what was non-synthetic abuse,
00:49:57but obviously quite horrific, is now being repurposed for new abuse content of other individuals.
00:50:04So what has changed is there's these overlapping intersections of different types of image-based sexual violence
00:50:10And what's also, I think, quite important to point out is that while these nudify apps and other sources are easily found online,
00:50:17As I demonstrated in my testimony
00:50:20It means that young people are being exposed to this violence in ever
00:50:25Epidemic ways and it's shifting the narrative to think that things like this
00:50:29Are okay
00:50:31And I think as we're talking about big tech and we're talking about law enforcement
00:50:34We're talking about perpetrators when you think about the bystanders and what kind of bystander do you want to be?
00:50:38And I think image-based sexual violence is challenging that
00:50:42I think that's very powerful
00:50:44Right now, in your experience, are websites making it a priority to take these images down?
00:50:51Some websites are, particularly those who are compelled to do so or have received excessive heat to do so, like the
00:50:59Pornhub website. But we need to make sure that, again,
00:51:02it doesn't matter your size or what's going on; you need to have a simple, clear
00:51:07understanding that this is about the survivor and their consent, and it's not up for debate
00:51:12Whether or not you want to take it down, but right now it's very much at the whim of those sites. Yeah
00:51:17And I know you also have deep expertise in working with survivors of human trafficking
00:51:23Can you share a bit about how image-based sexual abuse manifests in trafficking cases?
00:51:29Absolutely, and this goes for both minors and adults. Traffickers,
00:51:33just like everyone here, use the internet,
00:51:35and they quickly learned in the early 2000s that they could take photos of their victims and use those photos,
00:51:42one, to coerce them into not leaving for fear of those photos being distributed; two, to
00:51:46use those photos to market to individuals who wanted to buy them,
00:51:50Either without their consent or if they're a minor and then ultimately and I know this personally there was a young woman
00:51:56who was
00:51:58mentally delayed; she was young, in her early 20s, and she
00:52:02was in my office one day while I was looking through a website called Backpage.com looking for victims and working with law enforcement.
00:52:08And I saw her photos. Well, I knew that that wasn't her, because she was down the hall in my office with a coloring book.
00:52:14So I contacted the website. They basically laughed in my face. I contacted law enforcement and they said what's the big deal?
00:52:19You got her out
00:52:20anyway. And what I responded with was, but the reason they're using her photos is because she looks very young.
00:52:26And two months later a 16 year old pregnant girl walked into my office and she was the girl the trafficker was selling
00:52:32So they were manipulating her images long after she was out of her trafficking situation
00:52:37So when we think about it, it's coercive control.
00:52:40If you think something horrible is going to happen to you, it's more likely you'll stay in a bad situation particularly if you're a child
00:52:47For fear of not wanting more bad things to happen
00:52:49And so that's how I see the intersection and where I think we need to have a coalescence
00:52:54of
00:52:55advocates and leaders in this conversation
00:52:58Thank you, Miss Powell
00:52:59Mrs. McAdams
00:53:01Thank you again for sharing Elliston's experience. Thank you for
00:53:06everything you're doing to make sure that this never happens to anyone else's daughter.
00:53:11You said that you first learned these images were being distributed
00:53:15on
00:53:16Snapchat
00:53:17Can you talk more about your experiences?
00:53:20Reporting the images of your daughter to snapchat and what kind of response you got
00:53:25Um, yes, so we started with the school, of course, letting them know, and then went to Snapchat.
00:53:31Once we had figured out who the young man was, we pressed charges with the sheriff's department. So we did a warrant,
00:53:38and then I continued to go on to Snapchat and try to get somebody to respond.
00:53:43So you can't talk to anybody. You really can't email anybody
00:53:46You just send in a form and you kind of explain what's happening
00:53:50And then you never hear back from them and I was told by the sheriff's department that don't expect to hear back
00:53:55You're probably not going to hear anything back, even though we did a warrant
00:53:58So we really felt like our hands were tied. We weren't going anywhere. We were getting no response
00:54:05Well, and I will say and you told this story some when you and I sat down
00:54:11In my office along with Elliston three weeks ago and right before we introduced this legislation
00:54:17And we sat and you told your story to me of what had happened
00:54:22I asked in the course of the conversation. I said are these images still up?
00:54:26And you said yes, they still won't take it down. I've been trying for eight and a half months and they won't take it down
00:54:32And I turned to my team that was in the office and I said, I want you to get on the phone with Snapchat
00:54:37this afternoon
00:54:38and if need be
00:54:40put me on the phone with the CEO of Snapchat, and we are going to get these pictures taken down right now.
00:54:46Within an hour they were taken down
00:54:49Now frankly, it is ridiculous. It should not take a sitting senator getting on the phone
00:54:55To pull those pictures down
00:54:57You should have the right
00:55:00if
00:55:01Elliston were a global
00:55:03music star, if she were Taylor Swift, these would be pulled down, and they should be pulled down of Taylor Swift.
00:55:09But they also should be pulled down
00:55:11for every teenager in Texas and every teenager and every victim in America. You should have the same right.
00:55:22And what's infuriating also is that it demonstrates Snapchat knows exactly how to pull it down.
00:55:28It wasn't difficult. They didn't have any confusion about how to do it
00:55:32It just when you complained on behalf of your daughter
00:55:35It was not a priority for them
00:55:39That needs to change
00:55:43You also talked about the school's response to this issue. How did the school
00:55:49Communicate to you and to the other parents about what happened
00:55:54Um, they really didn't. That was the hard part of it; we really had to be
00:56:00I had to kind of be that mama bear, constantly calling, constantly asking for meetings
00:56:05You know, my husband and I went to the school board
00:56:07I mean, I couldn't get a response back from them
00:56:09It took weeks and weeks for them to actually get back in touch with me, and we've still been working all semester
00:56:15I've been trying to get with them to just write
00:56:17Something in the student code of conduct so we're kind of ahead of this if it happens again
00:56:21So I think that their approach to it was, we don't know what to do, so we're going to hope you're going to go away
00:56:27And that approach can't be
00:56:30How they attack it
00:56:32Elliston and the girls were the third case that I know of; I'm sure there are more at the school
00:56:37But all of those were just kind of swept under the rug, because they didn't know what to do with it
00:56:41So that's one of my pushes. And I will say that when
00:56:46We got that word within 24 hours from your office that they had been taken down, I just cried
00:56:50I mean, finally we got somewhere and somebody heard us. But it shouldn't be that way; it shouldn't have to go on that long
00:56:57Like you said, anybody should be able to
00:57:00As a parent
00:57:01Protect my child, but also just a student should be able to say, hey, I want my pictures off of there
00:57:06Well, and so you've worked with the school district now in helping draft
00:57:11Policies, or what is the status? No, I can't get them to respond to me
00:57:14So we've had a couple of meetings set up, and then for some reason they've fallen apart, and now we're, you know
00:57:19We're into the summer. I still have not heard back from them
00:57:22So I had to really push to even have a follow-up with the school board
00:57:26And then when I did, their response was more, we need to protect him
00:57:31We have a right to him
00:57:33And that was very maddening, because he did what he did. Our daughters were the victims
00:57:39So really, through this whole process, I felt like I didn't have a voice
00:57:43Well, that is
00:57:45Infuriating. One of the things I hope, when the TAKE IT DOWN Act is enacted into law, is making this conduct
00:57:52A federal crime and a felony
00:57:54I hope one of the things that will prompt is school districts across Texas and across the country adopting clear policies
00:58:01Against criminal conduct. That should be one of the next steps
00:58:07In response to this legislation
00:58:09Can you also just describe
00:58:13You're obviously a mom who loves your daughter deeply. What was it like as a mom to go through this experience?
00:58:22What was it like for you personally, as a mom?
00:58:25Um, it really was devastating, to be honest. As a parent, you always think you're going to be able to protect your child, and
00:58:32We had no way to navigate through this. So as she's crying, as she's going through every day
00:58:37She's a very social person
00:58:39Loves everybody, never meets a stranger, and I just saw her kind of retreat into kind of a shell
00:58:44And kind of become somebody that she really isn't
00:58:47Since he's not been back this semester, I've seen her kind of come out a little, but
00:58:51She's always going to live kind of in the fear that something could appear
00:58:55And then for me as a parent, I still don't have any way to protect her. Anybody could do this
00:59:02So this is one step
00:59:05Closer to, as a parent, being able to protect my child
00:59:08Well, and even though the images are down from Snapchat right now
00:59:11You don't know if someone has copies saved on their phone
00:59:14You don't know what other copies might appear sometime in the future
00:59:19Thank you
00:59:20Ms. Berry
00:59:22Let me say, Elliston, we've spent a little bit of time together
00:59:26You've got a lot of courage. And by the way, Elliston, both you and Francesca, y'all are doing an awesome job
00:59:33You're both really articulate and brave, and you're telling your story powerfully. So thank you, both of you in particular
00:59:47Elliston, tell us
00:59:49How did you feel when you first found out what had been done to you
00:59:53I was terrified. It was something I had never even thought could happen
00:59:58And seeing that these images were on my phone, I was shocked
01:00:03I was scared
01:00:05And I didn't even know what to do. It was terrifying knowing that
01:00:11These photos were going around without my consent, without my knowledge, and
01:00:15This was happening to my friends, and I felt hopeless
01:00:19Now, you had a friend group that you went through this with together
01:00:22How did the other girls who were targeted
01:00:25How did they deal with this?
01:00:28I mean, all of us were
01:00:3014, 15, and we had just entered high school, and seeing that these images were going around
01:00:35It scared all of us. We're freshmen in high school
01:00:38And it just felt like our whole
01:00:42Four years of high school was all
01:00:44Coming down in that moment, because we'd always heard that your freshman year is most important
01:00:48It keeps you on track, and two months into the school year, it's all falling apart
01:00:52And thankfully I'm so blessed to have that support around me. I
01:00:58Felt supported and I didn't feel alone
01:01:01And I think having a group was very beneficial
01:01:04But many victims don't have that, and they don't have anybody to run to
01:01:10Or to comfort them and help them, and I think it's really important
01:01:13To let everyone know that this is happening, and you aren't alone if this is happening
01:01:19Well, look, I hope your experience of speaking out and fighting to change the law
01:01:26I hope that helps you
01:01:28Gain ownership of what happened, and gain control. I
01:01:32Think that will be part of moving on, and I hope that it is. And so thank you for doing that. Thank you
01:01:40Francesca, thank you also for being here. Thank you for
01:01:44Your courage
01:01:47Can you tell us also
01:01:51When you first found out what had happened
01:01:54How did it feel? What went through your mind when you first found out what had happened
01:01:59Well, you know, at first I was shocked, and
01:02:03I did cry a little bit, but in the end I was just really mad, because
01:02:15My school didn't have any updated
01:02:18AI school policies, and I think that's
01:02:21Very important, because if I had had
01:02:24That AI school policy, I wouldn't be here sitting and stating the obvious, because
01:02:31AI school policies protect us
01:02:35In our schools, and I think every school in the U.S. should update their AI school policies
01:02:41And am I right that you now know the boys who created these images? Yes
01:02:46And did they face any consequences? Well, one of the boys did
01:02:52From what I'm aware of, he only
01:02:56Faced
01:02:57One consequence, and that was being suspended for one day, and the other boys didn't get any punishment
01:03:04And you're still going to school with the boys who did that? Yes, they are attending my classes
01:03:10Which is completely unfair
01:03:12And I just wanted to say also that, you know, my principal is a woman
01:03:18She's a mother, and she should be sitting here right next to me fighting for laws
01:03:23Protecting her students against what has happened
01:03:28And just the whole school administration didn't handle this correctly
01:03:36And Elliston, you also found out the boy who created these images, that took some time, is that right? Yes, sir
01:03:42And he was ultimately transferred to another school, so you're not going to school with him
01:03:47Yes, he was suspended for around a semester, and we were told he
01:03:52Had the ability to come back to school, but his parents had taken him out
01:03:57And this, you said, was a boy you didn't know terribly well. He was in class at the time
01:04:01But it wasn't someone you knew very well, is that right? No, sir
01:04:04I had classes with him and he was just a peer of mine, but we never had an actual
01:04:10Sort of friendship or anything like that
01:04:13Francesca, how were the images shared with your classmates? They were shared through Snapchat
01:04:20And what was the experience you and your mom had dealing with Snapchat?
01:04:24You know, our lawyers tried to take them down, and
01:04:29It took a while, and they were never taken down, except your team helped take them down
01:04:45Um, I don't know if I can answer that question
01:04:52Ms. Toups
01:04:54So your images, your experience, is a little bit different from Elliston's and Francesca's
01:05:03And you've been dealing with this for over a decade. That's a long time to be dealing with this
01:05:09Can you speak about how the awareness around
01:05:14Posting non-consensual sexually explicit images, how that awareness has changed over time
01:05:20Yeah, we've come a long way
01:05:23You know, when it happened to me
01:05:25I was so blown away that this was something that was going on, that people would do
01:05:31And I remember
01:05:33You know, speaking with the police
01:05:36Like, it was a shock to them
01:05:41They didn't know what to do, because again
01:05:42There was nothing for them to reach for and rely on as far as a law
01:05:47When I talked to my state representative at the time, I remember
01:05:51Him, you know, leaning forward as I was talking and then just leaning back in utter shock. So it took a lot
01:06:00Of me
01:06:02Having to educate people. Well, first I had to figure out what was going on myself
01:06:06And then
01:06:08Understanding the issue, and then helping others understand, and it was really difficult
01:06:13Because it was still kind of new, and the people that it had happened to
01:06:17Understandably didn't want to talk about it, because it was traumatizing, and then
01:06:22You know, when you did speak out publicly, there was a lot of backlash and a lot of harassment
01:06:26That came with it. So it was hard to kind of get the message out there
01:06:31Of what was going on and get people to understand it
01:06:34And at the time it initially occurred, did people understand how serious it was, or has that
01:06:41Understanding grown in the past decade? I think once they
01:06:47Wrapped their head around
01:06:49What I was explaining
01:06:52Because it was so foreign to them at the time
01:06:54I think once they wrapped their head around that, they did understand the seriousness. The thing that
01:06:59I think was
01:07:01The underlying thread for everyone was privacy. Like
01:07:06That's such an invasion of privacy
01:07:09How is this happening in the first place, but also how come no one is responding
01:07:14And stopping it
01:07:16And are the images still online today
01:07:19As we sit here today, I'm not sure. At some point I had to stop looking
01:07:25You know, there was a point when they were so vast across, you know, so many places
01:07:32And the investigator that was helping me worked really hard to just
01:07:36Constantly go after them. But as we sit here today
01:07:38I don't know. I stopped looking, because I just no longer wanted to be controlled by that. But I wouldn't doubt it
01:07:46And can you describe a little more your experience trying to get websites to pull them down
01:07:51And how they responded? Yeah, so as I stated, my initial response was to tell the website
01:07:58Because I just figured, you know, it was user generated, they didn't know any better
01:08:03And that didn't work out
01:08:04And so my next step was law enforcement, and they didn't really have any way
01:08:12To stop that, and then the first lawyer sent the
01:08:17Letter, which was ignored, and then I went to the investigator, and he did
01:08:22Whatever he did to kind of get them temporarily down, and then they came back up, and they were ultimately only ever removed from the
01:08:28Original site by the courts, when I ended up hiring an attorney and filing a lawsuit
01:08:34And were you ever able to figure out who had posted the images, where they came from?
01:08:40I was not. We were able to figure out who started the website and
01:08:47Who was, you know, operating it
01:08:49But I was never able to figure out
01:08:52You know, I had a lot of thoughts of how, because some of the photos that were posted had never been sent to anyone
01:08:57So, you know, obviously
01:08:59I went into thinking, you know, was my computer hacked, like, all the things
01:09:04So that was, you know, another
01:09:07Scary thing, that some of them had never been shared with anyone. But no, I still don't know
01:09:15Thank you. Mr. Turkheimer
01:09:19Thank you for traveling to Texas today. I'm sorry it's so hot. This is
01:09:25Summertime in Texas. We can't help that. But
01:09:29Come in November and it'll be beautiful, and you can get shorts and you can go play golf
01:09:33There is a trade-off we get
01:09:38Your organization RAINN plays a very important role in protecting victims of sexual assault and abuse by running the National Sexual Assault Hotline
01:09:47It's my understanding that RAINN has received millions of calls from victims of sexual violence over the years
01:09:55How often does RAINN receive calls and reports about image-based sexual abuse?
01:10:01It's a really good question
01:10:04And because of this field hearing, I asked people on the hotline how many in the past few shifts
01:10:10They had been receiving, so just sort of in the last week or less
01:10:14And they just did a quick Excel document for me, and there are 20
01:10:20Sorry
01:10:23There are 36 in the last few shifts
01:10:26And that's in about a week, is that the period, or what? Not exactly sure what the exact period is
01:10:31But yeah, it's not very long. It's a few days. It's not months. No, it's not months
01:10:36And they talk about where that's happening to them, whether it's on TikTok or Facebook or Snapchat
01:10:43There's a lot of Reddit on here
01:10:46And what often happens in these situations is some of these are non-consensual image sharing
01:10:51Some of these are deepfakes, and some of these are deepfakes that are
01:10:55Intended to get the
01:10:57Visitor to the hotline who's complaining about this to create self-created CSAM, child sexual abuse material
01:11:04There are two parts of the hotline. There is a phone number and there's also an online hotline
01:11:08The online hotline is primarily used by people that are under the age of 18
01:11:14And all of these reports are coming from the online hotline
01:11:18So do you have any data or sense of the frequency with which
01:11:24These non-consensual images are targeting
01:11:27Minors as compared to adults
01:11:29I don't have any really good data on it. I just sort of have the anecdotal data
01:11:34Everything that comes off the hotline is anonymized, so I don't have the ages that are associated with these
01:11:39But the fact that they're coming from the online hotline does suggest that they're trending
01:11:43Younger, and certainly that people that are under 18 are experiencing it
01:11:49And how often does RAINN see image-based sexual abuse
01:11:53In connection to other physical crimes, like sexual assaults? Yeah, it's a really good question
01:11:59And it comes up relatively frequently. A lot of times people are calling the hotline
01:12:04Disclosing things that happened
01:12:06Their first disclosure is to the hotline, and the person they're talking to is hearing the whole story, and the whole story sometimes
01:12:14Involves
01:12:15A sexual assault, and then pictures of that sexual assault, and then sometimes it's the other way around, where
01:12:21Someone has been groomed and sextorted and
01:12:25Ends up being
01:12:26Sexually assaulted after image-based abuse. So
01:12:30There are
01:12:32Sort of countless ways in which image-based abuse, which is sexual abuse, ends up being an in-person
01:12:39Contact offense
01:12:42So you mentioned the sextortion component of it
01:12:45Can you elaborate a little bit on any information about how frequent this problem is?
01:12:50Yeah, so a lot of what happens for
01:12:55Children online, in sort of grooming situations or in sextortion situations
01:13:00Is
01:13:02The person that's seeking to get the survivor to create
01:13:09CSAM, self-created CSAM, does so by pretending to be a child themselves
01:13:15And they will
01:13:18Send CSAM, purportedly of themselves, and try to get the child to create
01:13:22Their own. And oftentimes
01:13:25That's difficult to do for these people
01:13:27And the reason why is because a lot of CSAM is marked through PhotoDNA, and so these sites will see it and remove
01:13:33It, and so there's this market to create new CSAM that evades these filters
01:13:40Which is why the AI creation of CSAM is so important to these people that are looking to trade CSAM
01:13:47The second part of it is, they use the fake CSAM to get kids to create real CSAM
01:13:52And so you get that sort of relationship that is sort of exploding these numbers, because
01:13:59As people have attempted to remove CSAM from the internet, it's created this
01:14:04Market for new CSAM to be made, and so that's sort of the
01:14:10Kind of the economic
01:14:12Demand
01:14:13That has created the supply
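For readers unfamiliar with the hash-matching Mr. Turkheimer references: platforms detect re-uploads of known abusive images by comparing compact image fingerprints rather than raw files. PhotoDNA itself is proprietary, so the sketch below substitutes an open-source perceptual hash (the Python imagehash library); the known-hash list, threshold, and file names are hypothetical placeholders, not any platform's actual pipeline.

```python
# Sketch of perceptual-hash matching for flagging re-uploads of known images.
# PhotoDNA is proprietary; this stand-in uses the open-source imagehash library.
# The KNOWN_HASHES entries and MATCH_THRESHOLD are hypothetical placeholders.
from PIL import Image
import imagehash

# Hypothetical database of hashes of previously confirmed abusive images.
KNOWN_HASHES = [
    imagehash.hex_to_hash("ffd7918181c9ffff"),  # placeholder 64-bit pHash
]

# Perceptual hashes survive small edits (resizing, recompression), so matching
# uses Hamming distance rather than exact equality.
MATCH_THRESHOLD = 8  # bits of difference tolerated; tuning is platform-specific

def is_known_image(path: str) -> bool:
    """Return True if the image at `path` is a near-duplicate of a known image."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance in bits.
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(is_known_image("upload.jpg"))  # hypothetical uploaded file
```

The gap he describes is visible in the sketch: only near-duplicates of images already in the database get caught, which is why newly generated material slips past this kind of filter.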
01:14:15Well, and we've heard testimony in the Senate about teenagers targeted with sextortion
01:14:22Where they
01:14:24May have been tricked into taking an explicit photo, or several explicit photos, and then
01:14:29The person to whom they sent it threatens to make it public, make it public to the world
01:14:35And in some instances those teenagers take their own lives and commit suicide
01:14:39Have you
01:14:41Seen that pattern?
01:14:46That is something that we have seen, and it's something that's happening more and more frequently
01:14:50The people that are creating the CSAM, or excuse me
01:14:53The people that are soliciting the CSAM and inducing and coercing teens into creating it
01:14:59Don't care what happens to these kids at all. All they're trying to do is empty their bank accounts
01:15:04And they'll keep doing that until there's nothing left
01:15:07And a lot of these kids don't realize
01:15:11That there are ways back from this
01:15:15And so end up taking their own lives
01:15:17Which is why it's so important for them to have some sort of agency, to have some method
01:15:21To
01:15:22Combat this, which is why I think this bill is so important
01:15:28In your experience, have you seen
01:15:31How the non-consensual posting of these images
01:15:34Affects victims? What is the impact on the victims?
01:15:41You know, we see people call the hotline because they're feeling depressed; they're calling the hotline because they're feeling anxious
01:15:48They're feeling targeted. They don't want to leave their homes. They're having their life taken away from them
01:15:55And
01:15:58That is the common response when you feel that you've been violated, when you feel like
01:16:04Your body is not your own
01:16:06And
01:16:10It's so important
01:16:12For those people to have their agency restored, for those survivors to have their agency restored, to have something that they can do
01:16:18And right now there isn't a tremendous amount that they can do
01:16:22It's very difficult to take back that control, which is why it's so important that there needs to be some method
01:16:28That allows them to do so
01:16:30Thank you. Ms. Powell, I noticed you nodding emphatically during much of what Mr.
01:16:35Turkheimer was saying, and so I want to give you a chance
01:16:38To elaborate. I'm going to give everyone a chance, if there's any kind of final closing observation you want to make
01:16:44We're going to wrap up shortly, but I'll give everyone a chance, if there's something
01:16:48You don't want to leave without having said, you'll have that opportunity now. But Ms. Powell, we'll start with you
01:16:53Yes, I wanted to close with
01:16:56The story of a young man I worked closely with. When Matthew Herrick was an emerging actor in New York, in his early 20s
01:17:03He began a dating relationship, quickly realized the person he was dating was very controlling, and ended it. That individual made over
01:17:11A thousand fake rape-fantasy profiles of him on an app called Grindr
01:17:16Matthew tried to get the content down as over a thousand men
01:17:201,100 men, came to his house and his work attempting to assault him. The more he ran
01:17:25The more he screamed, the more they thought it was part of the game, because of the way the ads were written
01:17:30Wow. He was targeted so badly he had to move countries
01:17:33He joined forces with, I believe, the most indomitable attorney in this field, Carrie Goldberg
01:17:38Together they went all the way to the Supreme Court trying to get Matthew's content off Grindr
01:17:45To this day, it's still there
01:17:47And he is an internationally renowned advocate, still living in fear
01:17:54And he's one of the bravest people I know
01:17:56And I'll close with this. There was another young woman
01:18:00Who was desperate as a young actress, had $50 in her bank account, and was told by a photographer, take these images
01:18:06And I'll pay you, but no one will ever know it's you. They even signed a document
01:18:10Within two years her images were all over the world and became some of the most infamous nude images in Playboy magazine
01:18:17That was in the 1950s, and that young woman's name is Marilyn Monroe
01:18:21We've been fighting this for that long, and it's time we actually do something
01:18:26Wow
01:18:28That's horrific and powerful. Mrs. McAdams
01:18:34Yes, I'd like to say something about
01:18:37Since all this started, and we've been getting all this traction with it
01:18:40I've had at least three or four moms reach out to me with their daughters having the same thing happen to them
01:18:46Even to the point that
01:18:49Like he was talking about, somebody shared her images, then groomed her, and then he showed up and raped her
01:18:56So it's like, whoa, I can't believe the stories that are coming out
01:19:01And so I just think that this is a good starting point to maybe stopping this abuse against our children
01:19:08Ms. Berry
01:19:10I just wanted to say that no student
01:19:13Or anybody, for that matter, should fear for their safety
01:19:16Which is why this bill is so important, because it ensures the protection of all victims
01:19:21And allows schools
01:19:22To take issues like this seriously
01:19:25Thank you. Ms. Mani
01:19:28I just wanted to say that
01:19:30Schools around the U.S. should update their school
01:19:34AI policies so students know how to protect themselves
01:19:39And I just think we should have more laws being put in place to help protect these women and children
01:19:46Ms. Toups
01:19:48I just want to say thank you to yourself and to the other senators who have signed on to this
01:19:53I understand that, you know
01:19:55The world changes and moves pretty swiftly, and we have to continue to revamp our laws to keep up
01:20:01But I didn't expect to still
01:20:04See this happening, you know, a decade after
01:20:09We tried to combat it the first time
01:20:11So I'm hoping that, you know, this federal law will cover more victims and deter more
01:20:18And give, like he was saying, you know, give people
01:20:21The ability to be able to be themselves again
01:20:25And to, you know, take back control of their lives
01:20:29Mr. Turkheimer
01:20:32Yeah, I'm very grateful for your advocacy in this area, and I think that
01:20:37Survivors past, present, and future will be grateful for it as well
01:20:41There
01:20:42Are not great
01:20:44Methods for taking back control
01:20:46And there absolutely need to be
01:20:51When you have a situation where
01:20:54You know, you try to upload an SNL clip that is recorded on Saturday night
01:20:59And you try to upload it on Sunday morning, and it immediately gets taken down
01:21:04While
01:21:05Images that they know are child sexual abuse, or images they know are sexual abuse
01:21:11Just get replicated over and over and over again
01:21:14That is a situation where
01:21:16The platforms are prioritizing the rights of
01:21:21NBCUniversal
01:21:23Over people like this
01:21:25And it's going to take an act of Congress to change that
01:21:30Well, thank you. I want to thank all of our witnesses today for being here
01:21:34And especially
01:21:35Elliston and Francesca, as teenagers telling your story
01:21:40You demonstrated a lot of bravery today, and I think a lot of people heard you
01:21:44And that's true of all the witnesses
01:21:49It's my hope that your stories remind other girls and other boys who are going through this that you're not alone
01:21:56And that
01:21:59Americans across this country are standing with you to end this horrific abuse
01:22:09As we've said throughout this hearing, image-based sexual abuse is not new
01:22:14You heard from Ms. Toups that she's been fighting for justice in her case
01:22:19For over a decade, and your resilience is inspiring
01:22:22Every witness today knows this is an exhausting and emotionally draining fight
01:22:28I'm confident that the bipartisan support we have for tackling this issue
01:22:34Will cause Congress to act, and act decisively, to address non-consensual sexually exploitative images online
01:22:43Perpetrators will be held accountable
01:22:45And victims will have an additional tool: enforceable notice and takedown
01:22:49There are multiple bills pending in Congress to address this problem. Many of them are good and positive bills that I support
01:22:55What the TAKE IT DOWN Act does
01:22:57That is unique is, number one, make it a crime
01:23:00A federal crime. And it addresses in particular
01:23:04Not just
01:23:07Actual images that are posted non-consensually, but also deepfake images that are created using AI; it addresses both of them
01:23:14But it also contains the notice and takedown provisions, and as Mr. Turkheimer was noting
01:23:20There's a model we used to draw from
01:23:23Which is the existing laws governing copyright, and that came from the Digital Millennium Copyright Act
01:23:30That places a federal obligation on tech platforms, that
01:23:34If you post a clip of SNL, if you post a clip from The Lion King
01:23:43If you post a George Strait song online, you will see it pulled down, and pulled down very, very quickly
01:23:50Because Congress has put a federal obligation
01:23:54On these platforms to pull down copyrighted material
01:23:58This bill takes that exact same mechanism. They know how to do it
01:24:03And it applies it here to non-consensual intimate images: you have the same obligation
01:24:08If the individual depicted in the image has not consented, they have a right to have it removed
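To make the mechanism concrete, here is a minimal sketch of a DMCA-style notice-and-takedown workflow of the kind the senator describes being applied to non-consensual intimate images. The data fields, the 48-hour window, and the remove_content/block_reuploads hooks are illustrative assumptions, not the bill's statutory text.

```python
# Minimal sketch of a notice-and-takedown workflow of the kind the DMCA uses
# for copyrighted clips, applied here to non-consensual intimate imagery.
# All field names, the 48-hour window, and the platform hooks are illustrative
# assumptions, not statutory text from the TAKE IT DOWN Act.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class TakedownRequest:
    content_url: str
    requester_id: str  # the individual depicted in the image
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def deadline(self) -> datetime:
        # Placeholder removal window; the point is a fixed, enforceable clock.
        return self.received_at + timedelta(hours=48)

def remove_content(url: str) -> None:
    """Hypothetical platform hook that deletes the content at `url`."""

def block_reuploads(url: str) -> None:
    """Hypothetical hook that hash-blocks future uploads of the same image."""

def process_request(request: TakedownRequest, consent_on_file: bool) -> str:
    # Under notice-and-takedown, the platform's only question is whether the
    # depicted person consented; there is no open-ended adjudication.
    if consent_on_file:
        return "no action: depicted individual consented to distribution"
    remove_content(request.content_url)
    block_reuploads(request.content_url)
    return f"removed before deadline {request.deadline().isoformat()}"

if __name__ == "__main__":
    req = TakedownRequest("https://example.com/image123", "requester-001")
    print(process_request(req, consent_on_file=False))
```

The design point mirrors the senator's analogy: platforms already run exactly this loop for copyrighted clips, so the bill reuses a process they have already operationalized rather than inventing a new one.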
01:24:17I'm hopeful that this field hearing will help build momentum
01:24:21We have a broad bipartisan coalition behind this bill, and I'm hopeful we see in the coming weeks
01:24:26The Commerce Committee mark it up, and that we move to the floor and pass it into law
01:24:31And you don't have to take it from me. I think our witnesses today
01:24:35Have provided powerful testimony on why Congress needs to act, and act now. I want to thank
01:24:41Each of the witnesses
01:24:43For your courage, your clarity, your power
01:24:47The state of Texas and our country owe each of you a debt of gratitude
01:24:50Thank you for fighting to make a difference for women, for girls, for victims throughout the country. And with that
01:24:57This hearing is adjourned
