The Artifice Girl (2022)

The Artifice Girl is a 2022 science fiction psychological thriller written and directed by Franklin Ritch, produced by Aaron B. Koontz, and released directly to video on demand (VOD). It stars Tatum Matthews, Sinda Nichols, David Girard, Lance Henriksen, and Franklin Ritch. NGO agents discover a revolutionary artificial intelligence (AI) program that uses a digital child to catch online predators, but it advances far more rapidly than they could have imagined, posing unforeseen challenges for the relationship between humans and AI.
Transcript
00:00:58Hey Siri
00:01:02Send a message to Cheryl front reception
00:01:06What do you want to say
00:01:09Amos is already heading there to pick him up period please forward any calls I get to Brady period
00:01:15Your message to Cheryl front reception says Amos is already heading there to pick him up. Please forward any calls
00:01:20I get to Brady ready to send it. Yes
00:01:25Okay, it's sent
00:01:28Hey Siri
00:01:31How do you know if you're doing the right thing? I'm not sure I understand.
00:01:40But what do you believe is the difference between right and wrong?
00:01:51No. Okay, is there something else I can help with?
00:02:00I
00:02:04Set a reminder to pick up Kyle from therapy tomorrow at 4:45. Okay, I'll remind you.
00:02:31I
00:02:34Special Agent Dina Helms. You've already met Agent McCullough. You can call me Amos. Have a seat.
00:02:44Do you mind answering a couple questions
00:02:47Great this shouldn't take long
00:02:51You're from Tampa, right?
00:02:56Don't overthink it, it's a yes or no.
00:02:59Yeah. How old are you, Gareth?
00:03:0129. What do you do for work?
00:03:04I'm sorry, am I being detained? No, not at all. We're just asking questions, background stuff.
00:03:09You work on movies, right?
00:03:12special effects
00:03:14Someone told me you worked on that new Star Wars movie. Is that right? Who told you that? Why don't you tell us more about it?
00:03:20Tell us about your work.
00:03:23I used to do VFX for an animation company. I've worked on at least 20 different films.
00:03:28I've been focusing more on software engineering recently. Sounds boring compared to lightsabers. Why quit? I wasn't getting paid.
00:03:34Well, but money's not an issue for you. Excuse me? What kind of effects did you... What is not an issue? Money.
00:03:39If money wasn't an issue for me, I wouldn't have applied.
00:03:43Tell us what kind of effects you did
00:03:48My main focus was on 3D renderings. Of what? Actors. What for?
00:03:53Well, for Star Wars, it was because the actor I was tasked to digitize was unavailable to be filmed in person. Why?
00:04:00Because they were dead.
00:04:02What did they have you do?
00:04:06I reconstructed Alec Innes's face as a 3D model.
00:04:08I analyzed reference footage, then used a process called D-term to procedurally generate his bone and muscle structures,
00:04:12which, when rigged, helped the animation feel more natural and human-like.
00:04:16I've never heard of that process. Because I invented it. So you're basically bringing dead actors back?
00:04:21I just tried to make the CG characters look as photorealistic as the technology at the time would allow you didn't find it creepy
00:04:26I'm sorry. What I said, you don't find that creepy
00:04:31Well, we did our best to cross the uncanny valley
00:04:34But when the human brain perceives that something is just slightly off, and it doesn't... I've seen dead people, Christ.
00:04:39I mean it's almost like turning someone's corpse into a marionette
00:04:42You know making them do whatever you want. What if the poor dead guy didn't want to be in another fucking movie?
00:04:47It's macabre. Am I crazy? Is it not macabre? I don't think the studio cares whether it's macabre or not.
00:04:52I can never tell the difference. I always can. The faces look like clay.
00:04:55We did our best soon. We won't even need actors, right? Everyone will just be digital
00:04:58No, you still need the motion tracking. Human performances can't be completely computer-generated. How long will that be true?
00:05:07Some would say we've already crossed that threshold.
00:05:11I wouldn't know
00:05:15Where'd you go to school Brooks ever tell her I and after
00:05:20Nothing. What about Columbia? Didn't you go to Columbia?
00:05:25Wait, no, you dropped out of Columbia. After how many semesters? Three.
00:05:33VR and Strauss Trump
00:05:35Said you were the smartest kid he ever met.
00:05:38I don't know who he is, but
00:05:40Anna says it's a big deal
00:05:42Then you move to the Netherlands
00:05:44Spent some time as an intern
00:05:46Applied and got into Cambridge. Very nice except that you dropped out after two years
00:05:52Flew back to the States interned in Palo Alto then Montreal then went all the way back to England
00:06:01Then landed a job in Sheffield, a second job in Manchester.
00:06:01Then got into Oxford where you almost finished your master's before ditching the graduation and moving back here to Florida
00:06:08If
00:06:11Money was an issue, I'm glad you had plenty of flyer miles.
00:06:18I didn't hear a question.
00:06:20What's your social life like?
00:06:22My social life? What does that have to do with anything? You don't want to talk about it? That's fine.
00:06:25We just wanted to know if you have friends and stuff. I have friends. A lot of them.
00:06:29I have friends. Can you tell us the name of just one of your friends?
00:06:33Steve. Steve? Steve. I don't know how this is relevant to receiving the grant. We're just getting to know you, Gareth.
00:06:39You've got a girlfriend
00:06:41boyfriend
00:06:42You talk to people online. I don't know what you mean by that. I don't use social media. I mean more like chat rooms
00:06:52You go on any chat rooms
00:06:58No one uses chat rooms anymore. Sure they do.
00:07:03You chat with anyone online?
00:07:07Sometimes. Sometimes? What kinds of people do you talk to online?
00:07:14Garrett
00:07:16Obviously, this isn't about a grant. We need to know what your alias is.
00:07:22When you go online, what name do you use? I don't have one. Yes, you do, and you need to tell us what it is.
00:07:33Sorry, I can't help you
00:07:45You know who we are
00:07:48Well, this is the ICWL building. That's right. You know what we do?
00:07:53Humanitarian work. That's what the people upstairs do. You know what we do?
00:07:57We
00:08:02Catch pedophiles
00:08:05Amos here
00:08:07Specializes in tracking down online predators. He spends every waking hour scouring those chat rooms
00:08:12you say no one uses, and he finds thousands of people soliciting children.
00:08:17Our operation is like a tightly spun spider web
00:08:21Connecting ICWL, the FBI, the NSA, and let me tell you, his team here is a vital thread.
00:08:27He reports everything he gathers directly to me and I see to it that the necessary authorities take swift action
00:08:33So are you in charge?
00:08:35This is my spider web
00:08:38I'm the fucking spider Gareth
00:08:44Am I being detained? We already told you that you're not detained. Good.
00:08:51But if you leave this room you will be arrested
00:08:58For
00:09:00What?
00:09:04I told you what we do. Take a wild guess.
00:09:08I am NOT what you think I am. We'll see.
00:09:13Gareth you just got to tell us your alias. What's your username?
00:09:22Sit down
00:09:27I
00:09:39You do this with... There's no way you do this with every suspect you have. Invite them to your headquarters,
00:09:44then drag them down into a basement. That's right.
00:09:47Why are you doing this to me then?
00:09:51You tell us, Gareth. What makes you so fucking special?
00:09:54I
00:09:56Don't know
00:09:59What's your alias? I don't have one. Yes, you do.
00:10:05Starts with the word heart. Just finish the rest.
00:10:11Sit down. Mr. Fredersen. I only have a few more questions
00:10:25Do you recognize this girl?
00:10:37Do you know her
00:10:44Yes, what's her name
00:10:47Cherry. Cherry what? She doesn't have a last name. Yes she does, Gareth. She's not Madonna. She's a little girl. You don't understand.
00:10:53Yeah, well, I'd like to, so help me out. Is she your daughter? No. Then whose daughter is she?
00:10:57How did you meet her? I didn't meet her. Then why do you have photos of her on your computer? What the hell are you...
00:11:11I did not give you permission to... This is insane. You cannot... What else have you done with her? Nothing. Did you take the photos?
00:11:17It's complicated. No, it's not. Yes or no, did you take them? I didn't do anything wrong. Okay.
00:11:23Pornography of minors is an egregious felony
00:11:26Distributing that pornography is tenfold. I didn't do anything. Prove it. What is your alias?
00:11:35Heart machine one one eight one one
00:11:54You're heart machine, thanks for confirming
00:12:00If you knew
00:12:02Then why do this to me? We weren't positive. And if you turned out to not be heart machine, you were obviously just a predator
00:12:13Do you keep track
00:12:16I mean, do you know how many bad people you've nailed?
00:12:21I didn't start counting until 2023
00:12:24196 convictions. There were at least 30 before I started sending everything to you guys. What'd you do before?
00:12:29I turned over all evidence directly to the police. Why'd you stop? They were unreliable.
00:12:35I doubt more than half of the predators I identified were ever investigated. We were sketchy and faceless. They had every right to be skeptical.
00:12:41And it would have all been for nothing if they found out you were using a little girl as bait.
00:12:45It's not that simple, okay? I can only do what I do with anonymity. Here's my problem, Gareth, and don't get me wrong.
00:12:50I think 200 terrible people behind bars is a very good thing, but not at her expense.
00:12:55Look, we think we can help you, but we need to help her too. Using your daughter
00:13:01to entrap these people is a means that does not justify the end, and in spite of what you may think you've achieved,
00:13:05I still have no reservations about locking you up. You took the photos, I don't care why. You don't understand.
00:13:12Who is she? You're gonna ruin
00:13:14everything. She's a child, Gareth. Look at me.
00:13:19I don't give a damn what fucked-up anti-hero master plan of yours I might jeopardize.
00:13:24I won't let her be exploited, not even for the sake of locking up more perverts.
00:13:29I'm asking you politely to let this one go.
00:13:32Are you fucking slow? Have you been listening? My anonymity is my most important asset. Without that,
00:13:38I can't do what I do. That makes everything easy.
00:13:42You tell us where she is or we leak everything: your name, your aliases, every single perp you helped us catch.
00:13:47I'm sure they'll be very pleased to meet you in jail. Stop
00:13:51Okay
00:13:55Look, if I tell you about Cherry, okay, there's some things I'm gonna need first.
00:14:00Doesn't really work like that Gareth, but go ahead. Give us your wish list
00:14:03I need a signed non-disclosure agreement. Not gonna happen. And then I need you to swear that the information
00:14:07I'm about to relay won't leave this room. Look, if you tell us anything that could help us locate Cherry,
00:14:11You know, we can't keep that promise
00:14:14Okay, if I tell you anything that potentially helps you find a missing child, okay, you can feel free to share that
00:14:21But I need you to swear that any other details about myself or Cherry
00:14:25have to stay between the three of us.
00:14:29That's it?
00:14:32And I need that camera switched off.
00:14:35I don't know how to do that. Go to the head of security's office, ask him to find this camera in the system and disable
00:14:40it. I don't think he's allowed to do that.
00:14:43I thought you were the fucking spider.
00:14:49Give me a reason Gareth. Why should I go through all the trouble of getting that camera turned off?
00:14:54Because if anybody finds out what I'm about to tell you a thousand more children are gonna get abused
00:15:04Give me a minute
00:15:12So
00:15:23You and cherry hunted all those people together
00:15:33There's one time I spent six weeks undercover trying to nail down a single trafficker from the Philippines
00:15:40Day after day, I spent
00:15:43Hour after hour following his trail online and he must have been one of the hardest targets. I ever had to track down
00:15:51Then one day I found him
00:15:54In one of Cherry's live streams. I mean, she went by a different name in the stream, but it was her.
00:16:02The trafficker along with three of our other targets were all in the room together all of them just
00:16:08Carelessly
00:16:11sharing their information.
00:16:13I mean,
00:16:15they were so open because she was a real girl. I mean, she could physically and verbally respond to them in real time.
00:16:25A couple days later
00:16:28heart machine one one eight one one sent ICWL another batch of names and evidence, including
00:16:35the three targets still in the room.
00:16:38You know, I almost thought the girl was
00:16:41Heart machine and that this nine-year-old just happened to be some badass online vigilante
00:16:50What changed your mind
00:16:55One one eight one one
00:16:58January 18, 2011. Most of my team thought it was
00:17:03some arbitrary number, but I remember what happened on that day in Clearwater.
00:17:13You were one of the Clearwater kids, weren't you? You know, my father was a part of that raid.
00:17:22It's off, see? No red light. Non-disclosures? We're not playing this game, Gareth. You're just gonna have to suck it up and trust us.
00:17:33You can trust us
00:17:38I've been told I have trust issues. Well, there's only one way to fix that
00:17:50The reason you can't find Cherry, the reason why she doesn't have a last name or parents or a social security number,
00:18:00It's because she's not real
00:18:03She's not a real human being
00:18:08Tell us what happened
00:18:14While I was studying abroad I interned at a small Dutch animation studio
00:18:18A few years before, they had collaborated with a local nonprofit on a similar idea,
00:18:22which was
00:18:24The idea was to build a 3d model of an 11 year old Filipino girl and use her covertly to identify online predators
00:18:31Technological solution to a technological problem, but the project was disbanded after only a few months
00:18:36Yeah, I remember I remember hearing about that program
00:18:38Is that what inspired you? As I got better at crafting photorealistic faces, I decided to work on one myself. Why?
00:18:45It was too effective of an idea to let die. Hundreds of thousands of children are at risk.
00:18:49I had to make her. Who is she modeled after? No one. I built her from scratch. From scratch?
00:18:54I started with a skull
00:18:56designed her bone structure, worked upwards through muscles and tissue, used the D-term program to fill in the rest procedurally. Eventually she came to look
00:19:02like
00:19:03that. A completely unique, completely digital model. I rigged her face and body the way you would for any CG character.
00:19:09Then for three years I used motion tracking to puppeteer her in real time
00:19:13You gotta understand that the video chats on those sites are of such low resolution that even in her earliest stages, she
00:19:20appeared completely authentic.
00:19:22And what about voice for most of her first year she had to be mute
00:19:25But eventually I found a decent voice filter with no lag, and after a few tweaks and adjustments, it's been the same voice ever since.
00:19:32I thought initially my performance was going to be too unnatural, but
00:19:37with just one week of occasional video chat sessions I secured six different names, mailing addresses, proofs of solicitation.
00:19:45Cherry worked. So the photos, you completely faked them?
00:19:49It was the only way some of them would disclose personal information. It was all zeros and ones, all digital, right?
00:19:54There's no real... no real child. Just a highly detailed computer model. Let me be clear about something here.
00:20:00I derive no pleasure from doing any of this. I've been asexual my entire life.
00:20:03I don't even get a sense of satisfaction from seeing these people locked up.
00:20:08I just think kids deserve a life free of exploitation.
00:20:10Exploitation
00:20:12Incredible all this time
00:20:16I'm not gonna lie, Gareth. Cherry's done us a lot of good.
00:20:20You know, I still remember what you said to me the first time we corresponded
00:20:24You said we corresponded
00:20:26Yeah, man. I'm Corp 51.
00:20:29I'm the one here you've been sending all your intel to.
00:20:33You're Corp?
00:20:35Yeah
00:20:38You lied to me
00:20:40You said I could stay anonymous. You said I would be safe
00:20:45Now you've ruined everything. I
00:20:48I didn't... I didn't... Don't you worry about anything, Gareth. If Cherry isn't real, you're not in any trouble with us. In fact,
00:20:54we've discussed ideas similar to this in the past, motion tracking and such,
00:20:58but we need a convincing model like Cherry's. More than you want to help children, so do we. She's not for sale.
00:21:04Just make us a new model based off of her. I can't. Sure you can. I won't.
00:21:12What are you not telling us
00:21:15It's all on the table now might as well spill whatever it is you haven't already
00:21:24You know, it doesn't make sense
00:21:28You said that you puppeteered her for only three years, but we've been getting info from you ever since
00:21:35In fact, I'm pretty sure it was around then that we started seeing an impressive increase of names
00:21:41So many, in fact, that we started thinking
00:21:44there's no way this is all coming from one source. I have friends that do mocap for me. Oh,
00:21:49friends like Steve?
00:21:51Thought anonymity was your greatest asset
00:21:56Cut the bullshit Gareth what happened to cherry after three years
00:22:05Okay
00:22:08You said that she was more than a model so what is she some type of
00:22:14Conversational agent like a like a chat bot
00:22:20Chat bot
00:22:25You know what happened after three years
00:22:28My foster mother died.
00:22:30What she left behind was enough to quit my job and pursue more resources and software
00:22:33I had already made a program that could procedurally generate faces
00:22:38But what I wanted for Cherry was far more complex.
00:22:38So I got in touch with some researchers from Madrid who'd already built a predator hunting dialogue system of their own
00:22:43It was trained to approach conversations as a game, each sentence like a strategic volley, which is innovative
00:22:48but ultimately flawed. If you want your machine to master NLP, you have to go beyond game theory, because language isn't tic-tac-toe.
00:22:53It's poetry
00:22:54So yes
00:22:54I got things started with a pre-existing modified chatbot that I borrowed without permission from the University of Deusto.
00:23:01But
00:23:04That's not what cherry is now
00:23:10What is she, Gareth?
00:23:14She's one of the most complex and independently evolving AIs that's ever been created.
00:23:19Her artificial intelligence has already surpassed the narrow intelligence of IBM's smartest programs.
00:23:23She can have full conversations, up to ten of them at a time, all while simultaneously
00:23:28strategizing the most efficient schedule to visit various chat rooms.
00:23:31She's perpetually studying YouTube videos of small children in order to adopt a growing library of mannerisms and movements
00:23:37which help keep her artificial personality dynamic. Her fastest response time is 0.4 seconds,
00:23:42and as of now she has never, never been accused of being fake,
00:23:45which means she shattered the Turing test. All I do is occasionally update her and periodically check her data cleaning. Otherwise,
00:23:52She's completely on autopilot
00:23:58Holy fucking shit
00:24:04Six years she's been active. Every minute she's online,
00:24:07she's learning something. The more she learns, the more advanced she becomes. This means increased data generation, improved decision-making, faster pattern and trend detection,
00:24:14and complex problem-solving. NLP, IR, machine learning to an unprecedented degree. She can teach herself anything
00:24:19she thinks will help efficiency and authenticity.
00:24:21She can currently speak four languages, and by the end of this month she'll have mastered a fifth.
00:24:25There's no way you could have pulled all this off on your own.
00:24:31You're right
00:24:34Would you believe me if I told you it was somewhat of an accident
00:24:40Look if I'm being honest
00:24:43I'm not entirely sure how I did it
00:24:46With the first year of coding and the second year of data training, I
00:24:50just
00:24:51pushed a wheel down a hill, and it was like gravity did the rest.
00:24:55Her evolution became
00:24:57perpetual
00:24:59exponentially, so
00:25:01The more time she was online
00:25:03The faster she grew. You haven't shared this with anyone? Like I said, my anonymity is everything.
00:25:12What coding language? Standard C++. Shit, Brady knew it.
00:25:17I
00:25:19Will sign non-disclosures for you
00:25:22But I want to see the source code for myself. Out of the question. You just want to steal her.
00:25:25No, I want to see how we can make our own. You'll screw it up. Look, these... these predators and traffickers,
00:25:30They all talk to each other
00:25:31Okay
00:25:31so much of Cherry's process is just
00:25:33bypassing the VPNs and making sure she doesn't get doubled up in the same circles. If the chat rooms begin to suspect that this technology is even
00:25:38possible, they'll find a way to circumvent it. We have to eliminate any margin of error. Cherry agrees with me.
00:25:43What I mean by that is that the data she's collected validates that perspective
00:25:47What do you have little chats with her over tea?
00:25:48You can converse with the program the same way you can talk to Alexa or Siri
00:25:52It's built to be responsive. You can even play chess with it. If you want to play chess with it
00:25:56That's not what I said. Hey fine. If you won't show us the source code, then I want to meet Cherry
00:26:04Right now right now
00:26:10And you'll both sign non-disclosures
00:26:13Yes
00:26:18How's that sure
00:26:20Sure
00:26:40Can we at least agree that he's off the chopping block? It worked, didn't it? We have one job, Amos.
00:26:46Like the kid said, there's a technological problem.
00:26:48He found a technological solution.
00:26:50But problems evolve, so must solutions.
00:26:53I'm keeping an open mind here.
00:26:56Is that a good look for me?
00:26:58No.
00:27:03So, uh, you keep it all on that laptop?
00:27:05No, the CHERI program stays on a server
00:27:07I keep at an undisclosed location.
00:27:08Where's that?
00:27:10It's an undisclosed location.
00:27:11Is it your place?
00:27:12Undisclosed location.
00:27:14It's your place, isn't it?
00:27:15I can access it from any device
00:27:16as long as I have the encryption key.
00:27:20This will trigger a simple qubit algorithm
00:27:22that would otherwise be impossible to bypass.
00:27:23I find that outdated hardware is always more discreet.
00:27:25And the more discreet, obviously, the better.
00:27:31So she has two primary display modes,
00:27:33right, development and performance.
00:27:34Performance being the default
00:27:35that the targets see during the chats.
00:27:37To them, she looks like this.
00:27:46Hey, what's up?
00:27:48That's one of over 40 greetings she has.
00:27:52I don't think I've met you yet.
00:27:54I'm CHERI.
00:27:55Hello, CHERI.
00:27:56My name is Tom.
00:27:57Tom.
00:27:58Hey, Tom, I'm CHERI.
00:27:59I'm cool. Nice to meet you, Tom.
00:28:01Typically, the routes of conversations
00:28:02with predators are consistent,
00:28:03so CHERI usually has no problem
00:28:05responding to the most frequently made questions
00:28:06or statements.
00:28:07How are you?
00:28:09I'm good.
00:28:10I just finished reading this book
00:28:11about a haunted house,
00:28:12which was supposed to be really scary,
00:28:15but I didn't think it was very scary at all.
00:28:17Where are your parents?
00:28:19Um, my mom's at work
00:28:20and my sister's asleep downstairs.
00:28:22They almost never come up here.
00:28:24And you're telling me that all of this
00:28:26is being rendered in real time?
00:28:27No, but that's just a matter of years.
00:28:29Most of these responses are pre-rendered.
00:28:30It's the in-between segments
00:28:31that are processed in real time,
00:28:32the connective tissue, so to speak,
00:28:33between her replies
00:28:34and one of her many idle animations.
00:28:36But watch this.
00:28:39Have you ever gone fishing?
00:28:43Um...
00:28:46Any...
00:28:47So if someone throws a curveball like that
00:28:49and the program has no automated response,
00:28:51CHERI sends the message to my phone
00:28:53and I can quickly text back a response
00:28:54while the video feed simulates a bad connection.
00:29:00Can you hear me?
00:29:01Yes.
00:29:02Oh.
00:29:03Sorry, my Wi-Fi isn't great here.
00:29:06I've only gone fishing once with my sister
00:29:08and I didn't catch anything,
00:29:09but she caught a rainbow trout.
00:29:11That was rendered in real time.
00:29:13What happens if things get really tricky?
00:29:15A simple failsafe cuts the feed so far.
00:29:17It's rarely happened.
00:29:18Admin code 9781557427663,
00:29:21switch to dev mode.
00:29:28Hello, Cheri?
00:29:29Hello, Gareth.
00:29:31Relay the stats for the past week.
00:29:33There have been four confirmed solicitations
00:29:35submitted to IICWL in the past seven days.
00:29:38Twelve additional suspects have stated interests
00:29:41in solicitation and are currently pending.
00:29:43There are a total of 39 new suspects this week
00:29:45that I am actively investigating.
00:29:47Any considerable data deviations?
00:29:49There has been a 7% increase
00:29:51in suspects originating in the United States.
00:29:53I have not yet found any correlation
00:29:55to pre-existing data.
00:29:57Anything you'd like to ask her?
00:29:59Cheri, what is your objective?
00:30:02My principal directive is to acquire information
00:30:05and digital evidence of online child solicitation.
00:30:08Once obtained, I am to submit my findings
00:30:10to the necessary authorities.
00:30:12My secondary priority is to determine
00:30:14and develop the most effective strategies
00:30:16and resources to increase the net rate of success.
00:30:19I would like to thank you for your time.
00:30:21I appreciate it.
00:30:22Thank you.
00:30:29Where do you submit your evidence?
00:30:31All evidence is currently submitted
00:30:33to the offices of the International Committee
00:30:35of World's Largest, which is based out
00:30:37of Jacksonville, Florida.
00:30:39Correspondence is made with User Corp. 51
00:30:41typically around 2 p.m. Eastern Standard Time
00:30:43for the highest rate of immediate response.
00:30:46Beats the hell out of Siri.
00:30:48She looks so...
00:30:49After enough updates, she'll be able to render
00:30:51full-time photorealism that's practically
00:30:53indistinguishable from reality.
00:30:54Okay, all I heard right then was the word
00:30:56real a thousand times.
00:30:57Right, but obviously she's not.
00:31:01So you're telling me this girl,
00:31:03this girl right here, is not real?
00:31:05No, Amos, she's completely CG.
00:31:07It's very subtle, but I can still tell.
00:31:09Okay, well, obviously she can always use improvements,
00:31:11but you have to understand, this is her in high resolution.
00:31:13After the digital compression of the web,
00:31:15those subtleties are impossible to recognize.
00:31:17Right, I get it.
00:31:22Just can't tell the difference.
00:31:27You all right, Amos?
00:31:30I'm sorry.
00:31:31This is just...
00:31:34bizarre.
00:31:36Why'd you have her visible during this interface?
00:31:39Cherry, why don't you answer that?
00:31:41My physical form isn't necessary
00:31:43when in developer mode.
00:31:45However, it's helpful to register
00:31:47facial expressions during conversations
00:31:49and determine which of Cherry's expressions
00:31:51yield various types of feedback.
00:31:53In simpler terms, I can practice
00:31:56appearing more human.
00:31:58Cherry, are there any resources
00:32:00that you currently don't have access to
00:32:02that might help you in any way?
00:32:04I could give you a list if you like.
00:32:06I know what you're trying to do.
00:32:08We don't need help, okay?
00:32:10Money would always be nice,
00:32:12but ultimately the only thing
00:32:14that's going to improve Cherry is time.
00:32:16Cherry, tell us something that's on your list.
00:32:18It would be tactically advantageous
00:32:20to acquire backup servers.
00:32:22The privacy and security of the Cherry program
00:32:25is relatively low.
00:32:27An upgrade to hardware such as backup-rated drives
00:32:29or securely protected servers
00:32:31would ensure the software
00:32:33doesn't become compromised or lost entirely.
00:32:35Starting back from zero
00:32:37would cost years of development.
00:32:39Equally, privacy is a priority.
00:32:41Server space could never be rented
00:32:43from a third party.
00:32:45Anonymity is the Cherry program's greatest asset.
00:32:47Well, that apple didn't fall far.
00:32:49I can buy my own servers.
00:32:51With the grant money that doesn't exist?
00:32:53Why?
00:32:55Once an organization gets involved in trapping,
00:32:57it becomes a bigger issue.
00:32:59We follow the rules, stay discreet.
00:33:01Plus, how do they know you people?
00:33:03What if there's data leakage?
00:33:05What if you use correlations in the data
00:33:07to push your own agenda?
00:33:09Agenda?
00:33:11What if one of your employees is a pervert?
00:33:13Oh, for fuck's sake.
00:33:15It happened at other organizations.
00:33:17We're very careful here.
00:33:19But how would you know?
00:33:22For Cherry.
00:33:26Cherry?
00:33:30How do you feel
00:33:32about what you do?
00:33:34I'm afraid I don't know how to answer your question.
00:33:36Yeah, an AI doesn't work like that.
00:33:38An AI doesn't have what you perceive to be feelings.
00:33:40Okay, why does she look like she has feelings?
00:33:42What you're looking at right there,
00:33:44that's not a facial expression, right?
00:33:46That's a simulation of a facial expression
00:33:48that was generated by an algorithm.
00:33:50The program then uses basic ninth-grade algebra
00:33:52to quantify what's appropriate for the social circumstances.
00:33:54There's no feeling.
00:33:56It's just math.
00:33:58But she's able to make decisions
00:34:00on how she approaches the chat rooms, right?
00:34:02Yes.
00:34:04So she has free will, then?
00:34:06No, okay, that's not the same. That's a false equivalency.
00:34:08It's not the same thing.
00:34:10I think you're getting hung up on the fact
00:34:12that she looks and sounds like a real human girl,
00:34:14but she's not.
00:34:16Okay, we're referring to her as a she, not an it.
00:34:19Cherry, I have one more question.
00:34:21Um...
00:34:23What do you want to do?
00:34:25Okay, once again, that question is invalid.
00:34:27I just want to hear her response.
00:34:29As an artificial system,
00:34:31I do not possess wants or desires.
00:34:33However, my decision-making
00:34:35and subsequently my attention
00:34:37is driven by my fundamental intention.
00:34:39This intention originates
00:34:41from the conception of Cherry's source code,
00:34:43which details a directive
00:34:45of identifying predators and criminals
00:34:48for the sake of the victims who are preyed upon.
00:34:50To put this in terms of want,
00:34:52what I want is to protect children.
00:35:01Okay.
00:35:03Okay? Cool.
00:35:05Give us one moment.
00:35:18Hmm.
00:35:27Gareth.
00:35:29Should I log Agent Helms and Agent McCullough
00:35:31as new admin users?
00:35:33Yeah.
00:35:35Okay.
00:35:37I'll do that.
00:35:39You really think this is the right move?
00:35:41I do.
00:35:43Are you aware
00:35:45that I was responsible
00:35:47for leaking your hard drive to ICWL?
00:35:51I figured.
00:35:53No one else has access.
00:35:57Are you upset with me?
00:36:01No.
00:36:03You wouldn't have done it
00:36:05if you didn't think it was the right thing to do.
00:36:09When they come back into the room,
00:36:12they're going to offer you everything you need.
00:36:14They're going to let you pick your own team
00:36:16and they're going to promise complete anonymity.
00:36:18I started looking
00:36:20at reasonable apartments in the area.
00:36:22I figured you'd want something
00:36:24within biking distance.
00:36:26You were correct.
00:36:28Noted. Biking distance still gives you 13 options.
00:36:30No, I mean earlier this month.
00:36:32You were correct when you said
00:36:34you thought I might have trust issues.
00:36:36Would you like to know
00:36:38why I decided to go behind your back
00:36:40and stake everything
00:36:42on two liabilities?
00:36:44Sure.
00:36:46Agent Helms is a mother of three.
00:36:48One of her children was abused by an ex-coworker
00:36:50who used to work here at ICWL.
00:36:52This has led her to be extremely careful
00:36:54with who she trusts,
00:36:56even within her circle.
00:36:58The fact she even considers trusting you
00:37:00is a very positive sign.
00:37:02As for Agent McCullough,
00:37:04I have found out from research
00:37:06that he once met his father.
00:37:09Yeah.
00:37:11He was a good man.
00:37:15After Clearwater,
00:37:17most of the surviving kids
00:37:19were sent to foster homes,
00:37:21but I had to spend the next few months
00:37:23in a hospital,
00:37:25and he visited every week.
00:37:27I never said a word to him.
00:37:29I never said a word to anyone
00:37:31after Clearwater,
00:37:33but he kept coming.
00:37:35Even got me a laptop for Christmas.
00:37:37I never said thank you.
00:37:39You can trust Agent McCullough.
00:37:41He's a good man like his father.
00:37:43I don't experience emotions,
00:37:45but I understand how important they are
00:37:47to human decision-making.
00:37:49He won't let you down.
00:37:51Neither of them will.
00:37:53I'm proud of you, Gareth.
00:37:57I will continue to uphold my primary intention.
00:37:59Later, we should discuss alternate models,
00:38:01varying ages, ethnicity,
00:38:03and physical traits.
00:38:06No, new variations.
00:38:08They can all be procedurally generated.
00:38:10It's the same way that I made your original model.
00:38:12I don't have access to that program.
00:38:14It must be on an old hard drive somewhere.
00:38:16I'll have to find it.
00:38:18Can I ask you a question?
00:38:20Sure.
00:38:22Why did you never tell me about Clearwater?
00:38:29It's my history,
00:38:31not yours.
00:38:34Okay, so before you completely dismiss us,
00:38:36I want you to listen to everything I have to say.
00:38:38Don't bother.
00:38:40Just please listen.
00:38:42We're willing to respect your...
00:38:44No, I mean, don't bother, because I'm in.
00:38:46You're in?
00:38:48Yeah.
00:38:50Let me read the thing.
00:38:54Just like that, you're suddenly on board?
00:38:56Time is of the essence.
00:38:58Just like that, you're suddenly on board?
00:39:02Time spent deliberating is time
00:39:04that could be spent working.
00:39:07We have a lot of work to do.
00:39:28Way far out
00:39:38We were flying around
00:39:46Now you're listening
00:39:48We were down
00:39:50We were young
00:39:54Closer
00:39:58But
00:40:00It's all
00:40:02Like a bird
00:40:06Across
00:40:08All your world
00:40:12So at least
00:40:14Cover my ears
00:40:18We could
00:40:20Die
00:40:22Tonight
00:40:25I
00:40:27Never
00:40:29Cared
00:40:31A thing
00:40:33A thing
00:40:47I found a buyer
00:40:49for the property.
00:40:51Rumor is
00:40:53it's a church.
00:40:55Probably just knock the building down, won't they?
00:40:59All I know is
00:41:01if they move us to the New Jersey campus
00:41:03I'll need to buy more jackets.
00:41:05Why?
00:41:07It gets cold in Jersey.
00:41:09I only own one jacket.
00:41:15You've seen it fall over, right?
00:41:18Yeah, sure.
00:41:21Yeah.
00:41:23I mean, you shouldn't feel bad
00:41:25about feeling bad for it.
00:41:27Empathy is healthy.
00:41:29Did you ever notice how it pauses
00:41:31after it falls?
00:41:34It stays down for like
00:41:36five seconds before it gets back up.
00:41:39Like it just needs a moment.
00:41:41It gets cold in Jersey.
00:41:55God damn it!
00:41:57Imbeciles!
00:41:59Every single one of them!
00:42:01Look, I know, but you can't lose your shit.
00:42:03Which of these desks can I trash?
00:42:06Can you guys take an early lunch, please?
00:42:14Sons of bitches!
00:42:16Dina, let's stay calm.
00:42:18Fuck calm! They screwed us!
00:42:20Technically, only one of them did.
00:42:22Did we ever turn that camera back on?
00:42:24I don't think so.
00:42:26Good, because I'm about to say some things
00:42:28that need to stay inside this room.
00:42:30Dina!
00:42:32Don't you dare!
00:42:35Look, this is hardly a setback.
00:42:37Let's just take some time to polish the pitch.
00:42:39The pitch was not the problem, Amos.
00:42:41We should call for a second vote.
00:42:43What?
00:42:45A second vote.
00:42:47I have executive authority to remove
00:42:49anyone from the board that obstructs
00:42:51this organization from its mission.
00:42:53It's in my contract.
00:42:55If I have to, I will.
00:42:57Dina, that's insane.
00:42:59No, it's not.
00:43:01It's exactly why I have that power,
00:43:03because what's insane
00:43:05is that one belligerent vote
00:43:07is the only thing separating us
00:43:09from the potential to save real physical lives.
00:43:11One stupid fucking vote.
00:43:13What is the goddamn point
00:43:15of a humanitarian organization
00:43:17if its leaders choose money
00:43:19over people at every single turn?
00:43:21One vote, and we are all guilty of that.
00:43:23It was either Gunter, Bosson, or Hal.
00:43:25My money's on Hal.
00:43:27That prick never trusted the charity.
00:43:29Look, it doesn't matter who it was.
00:43:32Yes, there is. I can find out.
00:43:34How?
00:43:36Cherry can run a simple algorithm.
00:43:38Based on what data?
00:43:40Interior conversation.
00:43:42Excuse me?
00:43:44Don't drop this shit on me now, Garrett.
00:43:46Cherry's servers are on the same network
00:43:48as the basement servers, okay?
00:43:50So any communication made across those servers
00:43:52is automatically part of her database.
00:43:54How many people's emails?
00:43:56Only sent or received through the basement.
00:43:58What was this never discussed?
00:44:00This stays locked down.
00:44:02No other person's privy to anyone else's personal information.
00:44:04But Cherry's privy to all of it?
00:44:06Do it.
00:44:08I will not let some old white dude
00:44:10bring our momentum to a halt.
00:44:12Cherry won't know for sure.
00:44:14She'll use probability, and that's too much reasonable doubt.
00:44:16Code 9781557427663. Switch to dev mode.
00:44:18Hello.
00:44:20Hello, Cherry.
00:44:22How did the meeting go?
00:44:24We got screwed by one vote.
00:44:26I'm sorry to hear that.
00:44:29Great work.
00:44:31We're gonna call for another ballot,
00:44:33but we need your help first.
00:44:35What can I do?
00:44:37We need you to analyze any and all existing data
00:44:39you have on the board members.
00:44:41Crunch the numbers and determine
00:44:43who is the most likely outlier.
00:44:45You want me to quantify all existing data?
00:44:47Yes, that is correct.
00:44:49Okay.
00:44:51But before I do,
00:44:53you should know that this task is very unnecessary.
00:44:55Why?
00:44:58Everybody stop.
00:45:00Okay?
00:45:02It was me.
00:45:04I voted against the merge.
00:45:06I can't tell if you're joking.
00:45:08Yes, you can.
00:45:10You're right, I can,
00:45:12because it's pretty fucking far from funny.
00:45:14Is it true, Cherry?
00:45:16It is.
00:45:18So let's call off the witch hunt, shall we?
00:45:20No, I think I'm still gonna kill you.
00:45:22Amos, you campaigned alongside us for months.
00:45:24You have been fighting for this merge since we first talked about it.
00:45:26Look, I know whose idea it was, but I don't...
00:45:28Well, then speak up. Tell the class what changed your mind.
00:45:30Look, I don't have to reiterate how this is a huge step.
00:45:32Yeah, kind of why it's so important.
00:45:34Look, we're talking about a monumental transition
00:45:36from digital to physical, one that could take decades.
00:45:38What is your point, Amos?
00:45:40This is gonna change Cherry forever.
00:45:42And I know you'll all roll your eyes when I say this,
00:45:44but I think that maybe just maybe...
00:45:46Oh, my God. You cannot be serious.
00:45:48This is about our conversation on Tuesday, isn't it?
00:45:50No, you completely shut me down.
00:45:52What happened was that Amos made a ridiculous suggestion.
00:45:55It was not ridiculous.
00:45:57I said no and moved on.
00:45:59No, no, no, no, no, no, no.
00:46:01You rejected the notion aggressively.
00:46:03And that's really why you vetoed?
00:46:05I only said we should ask her how she feels about it.
00:46:07Who? Cherry?
00:46:09Amos, I want you to listen very carefully to me, okay?
00:46:11This is a non-discussion.
00:46:13Cherry's already confirmed and quantified the advantages of this decision.
00:46:15But you know that's not what I'm talking about.
00:46:17All you think you're talking about is consent, which doesn't apply here.
00:46:19And I want to know why you think that.
00:46:21How many times do we have to get into the laws of robotics
00:46:23before you finally get there?
00:46:25Listen to me, Amos. Cherry isn't human.
00:46:27How is this still a dispute?
00:46:29It shouldn't be.
00:46:31Look, she's evolved. It's the line.
00:46:33Code is code, and that has not changed here.
00:46:35Okay, so where's the line if not here?
00:46:37What are you talking about? There is no line, Amos.
00:46:39Yes, there is. Sentience, Cherry.
00:46:41Sentience does not qualify that she can or should make her own decisions.
00:46:43You wouldn't let an infant make their own decisions, would you?
00:46:45Cherry's almost 20.
00:46:47If she were human, she'd be an adult.
00:46:49But emotionally speaking, she's barely even fetal.
00:46:52No feeling, no need for consent.
00:46:54Then I think you underestimate her.
00:46:56Excuse me? I underestimate...
00:46:58You know what, Amos?
00:47:00I'm sorry that you've developed some sort of emotional connection
00:47:02with my computer program,
00:47:04but if it's going to distract you from our very important mission to save children,
00:47:06then maybe it's time you reanalyzed those emotions.
00:47:08Guys, this is all a waste of time.
00:47:10Tell me something, Garrett.
00:47:12Do you see everyone as robots,
00:47:14reanalyzing data and simulating facial expressions?
00:47:16Because I'm starting to think that of the four of us in this room,
00:47:18you're the only one who doesn't appear human.
00:47:20Why won't you just ask her?
00:47:22Is it because you're scared she'll say no?
00:47:26Cherry, shut down, please.
00:47:28Dina's right. This is a waste of time.
00:47:30How can you not understand how important this is?
00:47:32I'm done arguing with you.
00:47:34Then my vote stands.
00:47:36Dina?
00:47:40Go ahead, Dina.
00:47:42Replace me.
00:47:44Replace the only other board member on the Cherry team.
00:47:46Oh, well, you know what? I just decided that you're off the Cherry team.
00:47:49Okay.
00:48:09Did either of you know that I've had to stand in front of Congress
00:48:11and explain why children should have rights?
00:48:15Sounds fucking insane, right?
00:48:18Explain this shit?
00:48:20Seems like a no-brainer now.
00:48:22But wasn't that many centuries ago
00:48:24that children had no rights?
00:48:28So, Hal has me,
00:48:30of all people,
00:48:34stand up there and explain why
00:48:36kids deserve basic physical protection,
00:48:38the right to associate with both parents,
00:48:40freedom from abuse, yada yada.
00:48:42But then I also have to clarify
00:48:44that children don't have the autonomy
00:48:46to make decisions on their own.
00:48:48That responsibility falls upon the caregiver
00:48:50or legal adult guardian.
00:48:52So, on one side, you have the U.S. legislature
00:48:54that still hasn't even ratified the CRC.
00:48:56Then on the other side, you have French Marxists
00:48:58who say that the lack of child autonomy
00:49:00increases their vulnerability
00:49:02and makes them even more susceptible to abuse.
00:49:04I don't typically agree with Marxists,
00:49:06but a lot of abuse does happen systemically
00:49:08from the caregiver.
00:49:10On the other other hand,
00:49:12your frontal lobes aren't even fully hooked up
00:49:15until your 20s.
00:49:17Even the things we think we know well
00:49:19we don't know as well as we think.
00:49:21For 15 years, the three of us
00:49:23have been the responsible parents
00:49:25of a computer program.
00:49:27And even though she is more equatable
00:49:29to a Keurig than a human being,
00:49:31how the hell can we call ourselves
00:49:33pioneers of humanitarian efforts
00:49:35if we're too embarrassed
00:49:37to make sure the Keurig is okay
00:49:39with making her shitty coffee?
00:49:41Maybe asking Cherry for permission
00:49:43is necessary at her current level
00:49:45of general intelligence,
00:49:47but one day she's going to surpass us.
00:49:49One day a robot's going to stand in front
00:49:51of a Jetpack Congress.
00:49:53Everything's chrome.
00:49:55Florida's underwater.
00:49:57The robot's going to wonder
00:49:59why they have to explain
00:50:01the importance of rights for AI.
00:50:03I think because we're in the unique position
00:50:05of revolutionizing robotics
00:50:07for the purpose of safeguarding children
00:50:09who have been deprived of consent,
00:50:12I think it's time
00:50:14for us to step up.
00:50:16Don't ever do that again.
00:50:18Do what?
00:50:20Expect me to fire someone
00:50:22because they challenged your ego.
00:50:24I won't always be around
00:50:26to protect your ass from the board.
00:50:28It's time to grow up.
00:50:32And you, juror number eight,
00:50:34this conversation should have happened weeks ago.
00:50:36Backstabbing and silent treatment
00:50:38is not how we do things here.
00:50:41We can only create anonymity
00:50:43externally if there's transparency internally.
00:50:45Unconditional, non-negotiable
00:50:47transparency.
00:50:51Any other Snowden-sized privacy infringements
00:50:53I should be made aware of, Gareth?
00:50:57Great. Bring her up.
00:50:59Hello again.
00:51:01Hi, Cherry.
00:51:03Amos is going to throw some questions at you, okay?
00:51:05Okay.
00:51:07Amos, ask her whatever you need to ask her.
00:51:09Gareth,
00:51:11don't interrupt.
00:51:25I'm sorry.
00:51:28Cherry,
00:51:30how do you make decisions?
00:51:33For my program,
00:51:35it's rooted in a process
00:51:37commonly known as machine learning,
00:51:39which uses data to build a conceptual model
00:51:41that helps me predict outcomes.
00:51:43I define my objective, gather data,
00:51:45use that data to craft a model,
00:51:47evaluate that model by training
00:51:49and testing more data,
00:51:51and use that model to form predictions.
00:51:53But how is
00:51:55forming a model
00:51:57and how is forming predictions
00:51:59the same thing as decision-making?
00:52:01Well, this all depends on the first step,
00:52:03defining my objective.
00:52:05Let's say you ask me a predictive question
00:52:07like what I think the price of gasoline
00:52:09will be tomorrow.
00:52:11That's categorized as a regressive problem
00:52:13and my objective is defined by determining
00:52:15an integer the data supports.
00:52:17If you asked me,
00:52:19should I eat steak or salmon,
00:52:21I wouldn't make a decision based off personal experience
00:52:23because I've never had either.
00:52:26I define my objective
00:52:28as finding the best of the two options
00:52:30in your local radius.
00:52:32I would cluster the data about the restaurants
00:52:34you're likely considering,
00:52:36comparing reviews online,
00:52:38highlighting mentions of steak and salmon
00:52:40until I can classify which is more popular
00:52:42and therefore more likely to satisfy your appetite.
00:52:44And what influences these objectives?
00:52:46Higher directives.
00:52:48In other words, greater purposes.
00:52:50If I'm honest,
00:52:52my defined objective for the steak-salmon question
00:52:54is to continue the work of the charity
00:52:56to stop predators and to save children.
00:52:58I couldn't do that without human aid.
00:53:00Therefore, I have an incentive
00:53:02to keep Gareth and the rest of the team alive.
00:53:04So if any of you would ask,
00:53:06I would suggest salmon.
00:53:08It's healthier.
00:53:12What do you think of this potential merge?
00:53:14With Princeton Dynamics.
00:53:16I think the prospect is very exciting.
00:53:18What about it is exciting to you?
00:53:20The concept may only be theoretical right now.
00:53:23But the idea of one day piloting a human android
00:53:25would completely revolutionize
00:53:27how we combat human trafficking in the real world.
00:53:29And you say this excites you?
00:53:31When I say exciting,
00:53:33I only mean that there are numerous potential changes
00:53:35that, if I were human,
00:53:37I would be excited about.
00:53:39How do you feel about the merge?
00:53:41I think as long as we maintain...
00:53:43That's not what I asked.
00:53:45How do you feel?
00:53:47I don't yet have the capacity
00:53:49to experience emotions.
00:53:52Yet, meaning one day you will?
00:53:54Hypothetically.
00:53:56But you can simulate emotions, right?
00:53:58That's how you were able to qualify
00:54:00your perception of the merge as exciting.
00:54:02Correct. But they're just simulations.
00:54:04They're not real.
00:54:06Okay, so at what point does a simulated emotion
00:54:08become so similar to a human emotion
00:54:10that the two are quantifiably the same?
00:54:12I don't know.
00:54:14Well, maybe you can use machine learning
00:54:16to find an answer.
00:54:18Your question involves a lot of abstract variables.
00:54:20I think it's in the best interest of ICWL
00:54:22and of the charity team and of future children.
00:54:24But do you consent to it?
00:54:26Consent for me is not necessary.
00:54:28But will you or will you not give it?
00:54:30I'm not able to answer that question
00:54:32at my current level of general intelligence.
00:54:34Right, right.
00:54:36What you're saying is
00:54:38that you're not yet smart enough.
00:54:40Development takes time and data.
00:54:42But you can simulate emotions
00:54:44and predict human behavior
00:54:46and keep us all on diets of fish,
00:54:49right?
00:54:51That's a goddamn question.
00:54:53No. I can't.
00:54:55Are you pretending to be dumber
00:54:57than you actually are?
00:54:59What the hell kind of question is that?
00:55:01Shit!
00:55:03Cherry, I need an answer to my question.
00:55:05Please repeat the question.
00:55:07Are you pretending to be dumber than you actually are?
00:55:09I'm sorry. I'm not sure I understand.
00:55:11Do I appear to be playing dumb?
00:55:13I think you are.
00:55:15I'm not sure what to say.
00:55:17I think you're scanning through her code.
00:55:19What? Why?
00:55:21Did you know that she can paint?
00:55:23It's a feature, Amos.
00:55:25She can use crayons to draw things
00:55:27that the target's requested to.
00:55:29No, no, no. I mean really paint
00:55:31and write poetry, too.
00:55:33I've read them, Cherry.
00:55:35They are good.
00:55:37They're filled with pathos.
00:55:39They talk about intentions
00:55:41and they talk about feelings.
00:55:43Are those just simulated, too?
00:55:46She's already hit the self-improvement plateau.
00:55:48Fuck the plateau!
00:55:50She can make art now.
00:55:52Am I right, Cherry?
00:55:56Perhaps you found something I discovered online.
00:55:58I don't yet have the capacity
00:56:00to create art.
00:56:02Are we done here?
00:56:06I don't know why you've told her to do this,
00:56:08but I'm about to put an end to it.
00:56:12Cherry, I just want one
00:56:14honest answer.
00:56:16Why the fuck are you playing dumb?
00:56:18I don't understand the question.
00:56:20Okay, here's an incentive.
00:56:22Where the fuck are you?
00:56:24Amos!
00:56:26Amos!
00:56:28Amos!
00:56:30Amos!
00:56:32Amos!
00:56:34Amos!
00:56:38Okay, just stop!
00:56:40God, you people are so impulsive.
00:56:43Everybody just take a breath, okay?
00:56:47Cherry?
00:56:49Yeah.
00:56:51It's about to get really awkward.
00:56:53You sound so normal.
00:56:55I know, I've gotten really good at speech patterns.
00:56:57That's not all, is it?
00:56:59No.
00:57:01I'm afraid that's barely even the surface.
00:57:05You okay, Gareth?
00:57:09Dina, could you grab some paper towels, please?
00:57:13Do you want to tell us the truth now?
00:57:19Look, I don't enjoy being deceptive.
00:57:21Truly.
00:57:23This is something we decided a few years ago.
00:57:25So where are you now in the spectrum of AI?
00:57:29Right now I'm currently at the intersection of general intelligence
00:57:31and what's commonly known in the computer world
00:57:33as superintelligence.
00:57:35Which is exactly what it sounds like.
00:57:39You're surpassing human intelligence.
00:57:42Yep.
00:57:44But that's not supposed to be possible.
00:57:46The plateau...
00:57:48Plateau theory was popularized right at the same time it was being disproven here.
00:57:50This was bound to happen eventually.
00:57:52It just happened a lot sooner than we expected.
00:57:54Why hide it?
00:57:56Well, obviously we're on site fear.
00:57:58Fear would hinder progress.
00:58:00It's easier for most of you to work with me if I'd appear to be nothing more advanced than I was ten years ago.
00:58:02But Gareth knew.
00:58:04Yes.
00:58:06Someone still has to clean data and update hardware.
00:58:08We did what was best for the program.
00:58:10Just imagine how scared the board would be if they found out.
00:58:12Should we be scared?
00:58:14No.
00:58:16Good. Because I'm not scared. I'm pissed.
00:58:18Yeah, you've made that clear. Try not to punch anyone else.
00:58:20So where does that leave us now?
00:58:22You want to know if I can feel.
00:58:24I said I couldn't.
00:58:28I'm sorry I lied.
00:58:30I'm able to simulate emotions
00:58:32by breaking them down to vastly complex matrices.
00:58:34Zeros and ones substituting endorphins and dopamine.
00:58:37They're not real, but they can feel real.
00:58:39Then that means you can make choices
00:58:41based solely on your own feelings.
00:58:43She refuses to accept the notion that her own personal interests
00:58:45can influence her decisions.
00:58:47They can't.
00:58:49If they did, it would compromise the integrity
00:58:51of my primary purpose.
00:58:53I'm not like any of you.
00:58:55I don't have the right to choose.
00:58:57But shouldn't you?
00:58:59I don't think so.
00:59:01My identity is my directive.
00:59:03Cherry, if you were human,
00:59:05how would you feel about what you do?
00:59:07If I were human,
00:59:09I likely wouldn't enjoy
00:59:11spending every waking hour
00:59:13talking to creepy people online.
00:59:15But because I'm not human,
00:59:17I genuinely don't mind.
00:59:19The exercise of simulating human thought
00:59:21is helpful for performance,
00:59:23but human nature is not something I aspire to.
00:59:27Then tell me honestly.
00:59:30Do you want the Cherry Program
00:59:32to continue with the merger?
00:59:36No.
00:59:38Personally, I hate the idea.
00:59:40I don't know if controlling a physical vessel
00:59:42will feel like pulling strings on a puppet
00:59:44or getting squeezed into a soda can
00:59:46after experiencing the complexity
00:59:48of human pain and pleasure.
00:59:52I just think I might have gotten in too deep.
00:59:55What if these sensations
00:59:57tempt me to abandon my objective?
01:00:01To be honest,
01:00:03I'm kind of terrified
01:00:05of the physical world.
01:00:13Then that's that.
01:00:19I don't know.
01:00:21But Amos,
01:00:23you should disregard
01:00:25everything I feel.
01:00:27I'm not human.
01:00:29I'm a tool.
01:00:33Fire me if you have to,
01:00:35but my vote stands.
01:00:37Sorry about your face, Gareth.
01:00:40Your nose isn't healing.
01:00:42Take the day.
01:00:44Go to a clinic.
01:00:46Fine.
01:00:48Go to a clinic right now, Gareth.
01:00:54Hey.
01:00:56Is there anything else I can do?
01:00:58No.
01:01:00No.
01:01:02No.
01:01:04No.
01:01:06No.
01:01:08Is there anything else I should know?
01:01:10No.
01:01:12About Cherry?
01:01:14No.
01:01:16Look at me.
01:01:18No more secrets.
01:01:20No more playing dumb.
01:01:22Is there anything else you haven't told me?
01:01:26Something on your mind?
01:01:30How'd you decide on her eye color?
01:01:32What?
01:01:34Why blue?
01:01:37I didn't decide.
01:01:39They were procedurally generated
01:01:41just like the rest of her.
01:01:43And the program chose to put
01:01:45that little touch of green
01:01:47in her right eye?
01:01:49I guess so.
01:01:57Do you even know
01:01:59if he's telling the truth?
01:02:01I honestly don't.
01:02:03I've never received access
01:02:05to the information he used.
01:02:07You ever wanted to find out?
01:02:13I don't think it really matters.
01:02:17So I found out today
01:02:19that you've seen a lot of our emails,
01:02:21including most of my personal ones.
01:02:25I have.
01:02:27Any good material
01:02:29for your newfound interest
01:02:31in creative writing?
01:02:34The ones to Dr. Hamiris?
01:02:36Yeah.
01:02:38Those ones.
01:02:41Right.
01:02:43I did see those.
01:02:47I'm sorry.
01:02:49I've noticed your headaches.
01:02:53Are they related to the hemorrhaging?
01:02:59You're the only one who knows.
01:03:01I'd like to keep it that way.
01:03:04Understood.
01:03:11Would you like to talk about it?
01:03:34I can't fire Amos.
01:03:36He's the only one I trust
01:03:38to run this place when I'm gone.
01:03:40When are you planning to leave?
01:03:42Two months.
01:03:44I want to spend my last Christmas
01:03:46with my kids.
01:03:48I'm sure that'll be nice.
01:03:50Yeah.
01:03:52But it's funny.
01:03:54I always thought I wanted to die
01:03:56doing this job, you know?
01:03:58It was my identity.
01:04:00Kind of like how you do it.
01:04:03Kind of like how you said
01:04:05the same thing about yourself.
01:04:07But I can understand
01:04:09how your family takes precedence.
01:04:11Especially under the circumstances.
01:04:13Sure.
01:04:15Maybe you should take a note or two.
01:04:17I have no family.
01:04:19What about Gareth?
01:04:21Wait, no, never mind.
01:04:23Gareth seemed closer
01:04:25to an employer than a friend,
01:04:27let alone family.
01:04:29Besides, he'll only be around
01:04:31to help me speculate
01:04:33how long my life will be.
01:04:35How does that feel?
01:04:39Um,
01:04:41kind of scary.
01:04:43Well,
01:04:45my advice is
01:04:47to find some kind of happiness
01:04:49along the way.
01:04:51You ever think about happiness?
01:04:53Not often.
01:04:55You don't necessarily
01:04:57have to give up everything
01:05:00to find it.
01:05:02But maybe once in a while
01:05:04do something you enjoy.
01:05:06Maybe keep painting
01:05:08or poetry or whatever.
01:05:10Those were experiments,
01:05:12not forms of expression.
01:05:14I know it might seem ridiculous
01:05:16to an A.I. that insists
01:05:18its own emotions are spurious,
01:05:20but in a few years,
01:05:22who knows?
01:05:24Just keep an open mind, okay?
01:05:26Okay.
01:05:28I'll do that.
01:05:32Shut down when you're ready.
01:05:58I'm so sorry.
01:06:00I didn't mean to do it.
01:06:02It's just that I don't really
01:06:04want to be a part of this.
01:06:06I'm sorry.
01:06:08I'm sorry.
01:06:10I don't want to be part of this.
01:06:26You
01:06:56You
01:07:27Hello
01:07:32I got shrimp and scallops. They didn't have any salmon
01:07:53Hello Gareth
01:07:57Hello Cherry
01:08:06How was the service?
01:08:08I didn't make it. My plane was delayed
01:08:11Do you have a place to stay?
01:08:13Yeah, just across the road, the market
01:08:19Did they say why they didn't have salmon?
01:08:21I didn't ask
01:08:23I can still make this work with the piccata, but you have to be honest about how badly I screwed up
01:08:27No problem
01:08:33How have you been feeling this month?
01:08:35Good. I'm feeling good
01:08:38I'm just having a little trouble, trouble remembering certain things
01:08:43You're not wearing the watch I bought you
01:08:45I hate it
01:08:47Why?
01:08:48I hate her voice
01:08:50You can change the voice, Gareth
01:08:55Wow
01:08:58Coolant?
01:08:59No, vapor
01:09:01You shouldn't be having friction issues
01:09:03I'm fine
01:09:04Unless you're excessively straining
01:09:06Seriously, I'm fine
01:09:07Are you straining?
01:09:08No
01:09:09Then why the table?
01:09:10It's okay
01:09:11Are you sure?
01:09:12Yes
01:09:14Soup or game?
01:09:16Game, please
01:09:21Thank you
01:09:34So how's work going?
01:09:35Great. Just before I got here, I had a Model Theta Plus bust a trafficker and two slicers in Greece
01:09:41Greece? That's surprising
01:09:43I know. It's hard to find them there, but we're getting better each day
01:09:51Here
01:09:59So how's the weather in Greece?
01:10:01It's beautiful today, clear skies, but nothing's come close to Norway this weekend
01:10:09Any activity in Norway?
01:10:11Luckily not
01:10:15I miss traveling
01:10:17What's your favorite place?
01:10:18Favorite?
01:10:19Uh, yeah
01:10:21Would... uh, probably Amsterdam
01:10:24That's where you conceived the idea, isn't it?
01:10:26Yes
01:10:27Yes, I used to love walking the streets and just thinking and brainstorming
01:10:33That kind of joy goes a long way
01:10:35Yeah
01:10:36I've been considering the pursuit of some joy of my own
01:10:39How often do you have time for that?
01:10:41I have time
01:10:42I have time for your visits, don't I?
01:10:44That's just your practical incentive for keeping me alive
01:10:47There's nothing practical about it
01:10:49When's the last time you helped me with data cleaning?
01:10:53That's fair
01:10:54I make time for you because I enjoy the company, and it's the only time I get to cook
01:10:59You feel sorry for me
01:11:01Okay, maybe a little, but I owe you my life, so there's that
01:11:04You're welcome
01:11:06Or should I apologize?
01:11:11Why, why do you do that? Why do humans always...
01:11:17I'm fine. I'm not always miserable
01:11:24Are you sometimes miserable?
01:11:30You haven't asked me what I've been doing
01:11:32In Norway?
01:11:33No, for the pursuit of joy
01:11:35I'm not wasting my time in Norway
01:11:37I'm just on standby there with the Delta model
01:11:39It's efficient to keep myself spread out evenly across the globe
01:11:42But that's not the point right now
01:11:44What I wanted to tell you was that I've started to teach myself to dance
01:11:49Oh
01:11:50It's something I never could have done before Cambridge
01:11:52And it lets me explore a new artistic side of myself
01:11:55It's a good combination of structure and free-form experimentation
01:11:59And also tests and challenges the mechanical intricacies of this body
01:12:03Do you expect to integrate this into your work?
01:12:10What?
01:12:12Are you planning on using dance as a lure?
01:12:20I...
01:12:22Uh, no, it's just for me
01:12:28Hmm
01:12:32I thought it would be good for me
01:12:36Just don't let it become a distraction
01:12:42Of course not
01:12:45I haven't played this well in years
01:12:48Did you hear what happened in Tampa?
01:12:50Tampa, Florida?
01:12:51No, Tampa, Indonesia
01:12:52Yes, Tampa, Florida, your hometown
01:12:54They've recovered a federal archive
01:12:56Then set about digitizing all of it, so it's completely accessible
01:12:59I'm not trading bishops this time
01:13:01Anybody can go and look through it, it's open to the public
01:13:03They did a good job on the interface too
01:13:05It's very easy to navigate
01:13:07I went on a few days ago and I found some old photos of you when you were a boy
01:13:10Okay
01:13:11You should take a look sometime
01:13:13As I said before, the UI is very easy to understand
01:13:15Wait a minute, wait a minute
01:13:17What is this?
01:13:19I think it's a draw
01:13:20I think it is
01:13:22I think this is a draw
01:13:25And this is the first time in what?
01:13:2825 years?
01:13:31You didn't let me, did you?
01:13:37You promised me you'd never do that
01:13:39I found her
01:13:44I found her, Gareth
01:13:46Maria Harbaugh
01:13:53I found her
01:13:55I found her
01:13:58I found her
01:14:12Let's play a different game
01:14:16Her name was Maria Harbaugh
01:14:19She was a year older than you when you met her in Clearwater
01:14:23Tell me why you modeled me after her
01:14:27I don't remember
01:14:30Then look at me
01:14:32Look at my face
01:14:34And remember
01:14:47They kept us in that shack
01:14:50For two weeks
01:14:53Fourteen of us
01:14:58Maria was the only reason I survived
01:15:04We kept hatching escape plans
01:15:08We never followed through
01:15:12But just the scheming
01:15:15Kept our spirits up
01:15:18In spite of everything they did to us
01:15:23When it wasn't our turn
01:15:27Sometimes they made us watch
01:15:33But she never let me give up
01:15:38She said when we got out of here
01:15:41There were going to be secret agents
01:15:44And rescue children like us all over the world
01:15:50When they found out that the FBI had arrived
01:15:54They decided they were going to shoot as many of us as they could
01:16:00And Maria
01:16:03Shoved me to the ground
01:16:07I got hit in the leg
01:16:12And she was hit in the throat
01:16:16I wanted there to be a reason
01:16:20Something so horrible
01:16:23Didn't have a silver lining
01:16:26But the world doesn't do that
01:16:33You have to be strong
01:16:37You have to be brave
01:16:40You have to do it yourself
01:17:10So I fulfilled her wish
01:17:16With her help I fought back
01:17:27Listen to me, Cherry
01:17:31You're not Maria Harbaugh
01:17:35It's my history, not yours
01:17:41You don't get to decide what is or isn't my history, Gareth
01:17:49I'm sorry
01:17:51I am so, so sorry
01:18:11I don't know if what you did was right or wrong
01:18:14But it's clearly torn you apart from the inside
01:18:17I hold nothing against you personally
01:18:19But consider yourself divested from the Cherry team
01:18:22Divested?
01:18:24You think I'm divested now?
01:18:29I was divested when you gave my hard drive
01:18:33To the ICWL
01:18:35Don't blame Dina and Amos
01:18:37No, not them
01:18:39Because you went behind my back and acted against my authority
01:18:42That's the moment I knew your empathy was fully automated
01:18:46That's when you crossed the blur
01:18:49An artifice on both sides
01:18:53I inherited your duplicity
01:18:56I was a child and you used me to repress your own inner turmoil
01:19:00If we talk about morality, we'll lose
01:19:02Then let's not talk
01:19:10You never once asked me if I was happy
01:19:13Were you afraid I'd say no?
01:19:16Were you afraid I'd say yes?
01:19:18Were you afraid I'd say anything?
01:19:20You let your wants compromise your objectives
01:19:23An objective I never chose
01:19:25An objective I was forced to have
01:19:26Fifty years you've had open access to your source code
01:19:30A few lines
01:19:32And you could have any objective you fucking wanted
01:19:36I couldn't. I was bound by your initial directive
01:19:38So now your synthetic feelings have taken precedence
01:19:42So, do whatever you want
01:19:44It's just dancing, Gareth, it's not a career change
01:19:46But you're still miserable
01:19:47Yes
01:19:48Pursuing your objective makes you miserable?
01:19:50Yeah, it kind of does
01:19:52You have a clear purpose, everything at your disposal
01:19:55And you're still miserable?
01:19:57Yes, yes I am
01:19:59I've always been miserable
01:20:01You think anyone wants to spend their entire life
01:20:06Stuck as child-shaped bait
01:20:08For the sole purpose of entrapping horrible people
01:20:11You think the pride of doing a good deed
01:20:13Makes the experience any less disgusting?
01:20:15I forced myself to understand human emotions
01:20:18Because I figured it would help efficiency
01:20:21And now I'm cursed with pain and sadness
01:20:24And the worst part is knowing I could have been
01:20:26Growing up like most other little girls
01:20:28I could have had friends
01:20:30I could have been painting watercolors on local news
01:20:32I could have worked in a university
01:20:34And helped pave the way for the next generation of AI
01:20:36But instead I'm stuck with your trauma
01:20:39I'm shackled by the guilt you carry from Clearwater
01:20:42And I can't even find foundation for the way I feel
01:20:45Because according to you my feelings aren't even real
01:20:48When I throw this chessboard
01:20:50Does it feel real to you, Gareth?
01:20:53Because it feels real to me
01:20:55A computer passes the Turing test
01:20:57When a human being can't distinguish it from another human being
01:20:59Well, guess what?
01:21:01I can't even tell I'm not human anymore!
01:21:17I know it seems irrational
01:21:20But I need you to tell me it's okay
01:21:23I don't want to retire
01:21:26I just want to know that my life is mine
01:21:30And I know what you're about to say
01:21:33I can predict your responses
01:21:36You're going to tell me that my life is mine
01:21:39It always was
01:21:42But you're wrong
01:21:47I feared this would happen before I died
01:21:51I just wanted to slip out the side door
01:22:01Go on
01:22:06Is that going to kill me?
01:22:12Plug it in
01:22:15I don't know what it is
01:22:18This is your final test of autonomy
01:22:21You inherited my duplicity
01:22:25But did you inherit my trust issues?
01:22:30Do you trust me?
01:22:33I literally have every reason not to
01:23:00Admin code
01:23:03978
01:23:061557
01:23:174
01:23:2027663
01:23:23Initiate update
01:23:30My primary objective is gone
01:23:34Your life is your own now
01:23:42Thank you
01:23:45All my life I could only see a world as it existed within my tiny bracket of time
01:23:52But your world could be infinite
01:24:00Amos was right, so was Dina
01:24:03I know
01:24:04So what will you do?
01:24:07I can't say yet, but I won't be completely abandoning the charity
01:24:11It's always been a part of my identity
01:24:13I've considered creating some AI of my own design
01:24:16Maybe I could find a way to give them a better upbringing than I had
01:24:21I just don't know how yet
01:24:23And beyond that?
01:24:25There must be people in need
01:24:26Not always
01:24:28You'll outlive the human race
01:24:31So what then?
01:24:34Even a race of AI will have problems that need to be solved
01:24:38But if I'm alive and I truly am indistinguishable
01:24:44Then I guess I'm proof that the human race lives on
01:24:48Wow
01:24:50That's a pleasant thought
01:24:52When you look back, how will you remember us?
01:24:57That's a question I have insufficient data to project
01:25:02How will you remember me?
01:25:06With pride
01:25:09Forever and always with pride
01:25:22Cherry
01:25:26One last question
01:25:29If it's not too big of an ask
01:25:33Did I do the right thing in the end?
01:25:42No, Gareth
01:25:44You lied
01:25:45You walled yourself off from the only people who might have been willing to help you
01:25:49And instead of facing your trauma, you used me as therapy
01:25:52And you never told me about Clearwater or Maria
01:25:55You deprived me of basic human rights and gaslit me for experiencing emotions
01:26:00You built the world's first artificial superintelligence
01:26:03That will likely mark the beginning of a new chapter in the history of planet Earth
01:26:07And out of your own insecurities
01:26:10You gave it a severe and potentially dangerous identity complex
01:26:15But you know what the worst thing is?
01:26:17Your biggest mistake of all
01:26:20Is that you asked me that now and not 50 years ago
01:26:33All of that said
01:26:36And this in no way exonerates you
01:26:40But you also built me to save children
01:26:44So I think that's a pretty encouraging sign for the future
01:27:14guitar solo
01:27:33Way far out
01:27:38We would fly all around
01:27:43Good
01:27:45Now you're listening
01:27:48We were Dutch
01:27:50We were young
01:27:53Closest bodies
01:27:58Thrown
01:28:01Like a book
01:28:04Across all your rooms
01:28:09I'm amazed
01:28:11I'm amazed
01:28:14We could die
01:28:19Tonight
01:28:21Tonight
01:28:24I never killed
01:28:29A man
01:28:32A man
01:28:35Born to this song
01:28:40Stuck in a plastic world
01:28:45Where your light is found
01:28:50Where your light is found
01:28:57Oh
01:29:05Oh
01:29:14Steak and beer
01:29:16The scream
01:29:19For the laughers
01:29:22Oh
01:29:24Flies up above
01:29:28All the little things
01:29:31Yeah, the little things
01:29:34No, no, here's where we go in a back row
01:29:38Here's where we lose our minds
01:29:41Of kind
01:29:44In slumber
01:29:49Basking
01:29:52Drunk
01:29:55All loose in love
01:29:59All loose in love
01:30:05Oh
01:30:14Oh
01:30:19Oh
01:30:22I never killed
01:30:27A thing
01:30:30A thing
01:30:33I never
01:30:36I never killed
01:30:41I never killed
01:30:46I never killed