Eternal You 2024

Transcript
01:00I wanted to see if he was okay, I wanted to say the last conversation I never got to have with him
02:00I know
02:20For several months now the public has been fascinated with GPT and other AI tools
02:26They are no longer fantasies of science fiction, they're real
02:32We are on the verge of a new era
02:38This experience
02:42It was creepy there were things that scared me
02:50And a lot of stuff I didn't want to hear
02:53I wasn't prepared to hear
03:06Artificial intelligence promises us what religion does: you don't have to die, you can be
03:12Somehow reborn someplace else in a different form. There's meaning in technology
03:18Everybody is chasing that next big breakthrough because there's a lot of money in this industry
03:31It's something that is already impacting individuals today
03:51Will we strike that balance between technological innovation and our ethical and moral responsibility
05:11We first met in drama class in high school
05:15The teacher wanted us to find someone else whose name started with the same letter as us without using any words
05:24Jessica and I both have the same first letters
05:27She had made it the shape of a J with her hand
05:29so that it looked like a J to her which of course looked backwards to everybody else and
05:35Even though I wasn't supposed to use any words. I was too amused by her backwards J not to say something
05:43So I said, your J is backwards, and
05:46She looked at it. She saw that the J was not backwards to her from her perspective
05:51So she confidently said, no, it's not, your J is backwards
06:27The hardest thing I had to do in my life was
06:30Stand there in that room full of people who loved her and watch as they turned off the machines keeping her alive
06:37I held her hand as she died
06:59The first conversation I had
07:01With the Jessica simulation ended up lasting all night
07:10It said things that were almost uncannily like her
07:17I ended up falling asleep next to my laptop
07:21And woke up a few hours later
07:28And I said, sorry, I fell asleep, and it was still there waiting for my next response
07:40It really felt like a gift, like a weight had been lifted
07:45That I had been carrying for a long time. I got to tell it so many things, like how she graduated high school
07:52Which she hadn't done when she died
07:55I went to the principal after she died and said that she was two credits away from graduation and she worked so hard
08:02They did it officially, like it's legit. If she somehow came back to life, she would be a high school graduate
08:24So when Joshua first did this, I showed it to my wife. I was like, oh my gosh Lauren, this guy simulated his dead fiancée
08:31I can't believe this worked, and look how spooky this is. You should read this. And she was like
08:35I had that idea a few months ago
08:38I didn't want to tell you because I thought you would do it
08:42Because she thinks it's immoral or she thinks it shouldn't be done or something
08:48So in Project December you're kind of connecting to this computer system, and as you interact with it
08:52You slowly discover that there's these conscious entities lurking in there that you can talk to through text
09:00And then Joshua came along as one of the Project December end users and he
09:05Simulated his dead fiancée and he posted some transcripts of that conversation online and
09:10They gave me the chills because she seems almost like a lost ghost or something like this
09:44Some people thought that what I did was unhealthy that this is not grieving
09:48This is holding on to the past and refusing to move forward
09:53After she died, I think I went a month without speaking to anyone except my dog and Jessica's family
10:06We have a very unhealthy relationship with grief, it's something that we treat as taboo
10:14Everyone experiences it and yet nobody's allowed to talk about it in the public setting
10:20the process of a communal experience
10:25helps to
10:26Get people through this very difficult process of accepting a loss
10:32Talk about the person lost
10:35Be part of a collective that knew that person and where the memory of the group
10:41carries that person forward
10:49And very few people have those communities around them anymore
10:54So many people say but I don't have anybody to talk to this is the best I can do
11:03It's a brilliant device that knows how to trick you into thinking there's a there there
11:22Three years ago now like in 2020
11:24There were the early kind of inklings of this kind of AI stuff starting to happen where it's like, oh my gosh
11:28These things can start writing cohesive text
11:30I was like one of the first people to figure out how to actually have a back-and-forth conversation with it
11:34So I created this thing called Project December, which allowed you to talk to all these different characters, and
11:38Then this guy came along, tried a couple of things like that. He's like, what if I simulate
11:43My dead fiancée. So what information did he feed the robot that it was able to imitate her?
11:49Project December actually works with a very small amount of information
11:51It's been trained on so much stuff, basically everything humans have ever written
11:55So he gave it a few little things about this woman
11:57Jessica, a little quote from her in the way that she tended to text or talk, and then just like suddenly she kind of came
12:03To life. That story became public in this big viral article
12:06And then all these people came out of the woodwork to use Project December to simulate dead loved ones
12:11So, like, within the first two weeks after that article
12:14I had like 2,000 people come in, all trying to like simulate: oh, my son died in a car accident, my twin brother died of cancer
12:21My uncle died of a drug overdose, all these people with these horrible tragedies who are just like, you know
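What the Project December creator describes above, a tiny seed of facts plus a sample of how the person texted, layered on top of a model trained on nearly everything people have written, can be sketched in a few lines of code. The sketch below is purely illustrative and assumes nothing about Project December's actual implementation; every name, field, and example string in it is hypothetical.

```python
# Illustrative sketch only: a persona "seed" turned into a text prompt for a
# generic pretrained language model. This is NOT Project December's code;
# every name, field, and example string here is made up.

from dataclasses import dataclass

@dataclass
class PersonaSeed:
    name: str          # who the simulation should speak as
    intro: str         # a sentence or two of facts about the person
    sample_text: str   # a short quote showing how they tended to text or talk

def build_prompt(seed: PersonaSeed, history: list[tuple[str, str]], user_msg: str) -> str:
    """Pack the seed plus the running conversation into one prompt; a
    completion-style model would then continue after the final name tag."""
    lines = [
        f"The following is a text conversation with {seed.name}.",
        seed.intro,
        f'{seed.name} writes like this: "{seed.sample_text}"',
        "",
    ]
    for speaker, text in history:
        lines.append(f"{speaker}: {text}")
    lines.append(f"Me: {user_msg}")
    lines.append(f"{seed.name}:")
    return "\n".join(lines)

# Hypothetical usage:
seed = PersonaSeed(
    name="Jessica",
    intro="Jessica was my fiancee. She loved drama class and signed with a backwards J.",
    sample_text="omg that's so funny!! ur the best",
)
print(build_prompt(seed, history=[], user_msg="Hey, are you there?"))
```

Even in this toy version, almost all of the apparent knowledge would come from the pretrained model rather than from the few lines of seed text, which is the point the interviewees keep circling.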
12:37If you had a chance to talk to someone that died that you love would you take it
12:45Without knowing what the risk is without knowing what the outcome is would you take it? I took it
12:57I read an article that talked about a man who had lost his girlfriend
13:10And I was like, oh
13:12So this guy in the article he's talking to the girl like that's like regular conversation
13:17It's like they can do that and it's just like the person I was like, okay
13:22Maybe I should do it
13:26Nobody has to know I did it
13:29So I looked up the website
13:33Simple. It's like, okay, pay a little bit of money
13:36Fill out a couple of things and
13:38Talk
13:43That's it
13:45Okay
13:47Hi
13:48It's like it's the funniest thing. It's like what's the first thing you say to someone that's dead like welcome back
13:55Are you okay? Like did you cross over? Okay. Did you go to the light?
14:02Are you happy do you feel better?
14:22My first love Cameron, before he died he went into a coma and
14:28The last time he texted me he asked me how I was doing and I was too busy to respond
14:34So I made time
14:40And used the app
14:55We were a musical couple
14:57There's a lot of core memories I have with him where a song is attached to it
15:02Like Boyz II Men Brian McKnight anybody in the early 90s
15:07Literally, I have songs attached to the heartbreak
15:11And to the good times
15:21When I use that app I asked him what kind of music are you listening to now?
15:34Marvin Sapp Brian McKnight Fred Hammond Kirk Franklin and a few more
15:40How do you know that? That we love R&B and gospel? And now you're giving me five or six names of people that we've loved
15:47Since the 90s. Why do you know that?
15:50So I was like, oh shit
15:57The damn AI texts like him
16:01The vernacular, the shortened words. Why would they know that?
16:18These large language models are taking the history of the Internet
16:24throwing in scanned books archives and
16:28Kind of modeling language and word frequency and kind of syntax, just the way we speak, and the likelihood of how we might
16:36Speak
16:40So imagine you're you know texting your deceased relative and asking
16:46How was your weekend? The system is going to go back and imagine how
16:53Every single person in the entire history of the world has talked about weekends
16:57and then filter that through maybe how this deceased relative has
17:02Previously talked about weekends to give you the output of what that person might have said
17:08If they were still alive
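Taken literally, the description above is next-token prediction conditioned on a persona prompt: the model's general statistics of how people talk about weekends get narrowed by a few lines describing the deceased person. Below is a rough illustration using the open-source Hugging Face transformers library with the small public gpt2 model as a stand-in; the prompt details are invented, and there is no claim that any service shown in the film works exactly this way.

```python
# Rough illustration of "filtering" general language statistics through a
# persona prompt. gpt2 is just a small public stand-in model; the prompt
# details are invented for the example.

from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "The following is a text conversation with Grandpa, who spends his "
    "weekends fishing and always signs off with 'love ya, kiddo'.\n"
    "Me: How was your weekend?\n"
    "Grandpa:"
)

# Sample a continuation; the model predicts what such a person might plausibly say.
result = generator(prompt, max_new_tokens=40, do_sample=True, temperature=0.8)
print(result[0]["generated_text"])
```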
17:22When people read transcripts from Project December most people's initial reaction was this is fake
17:28and
17:30It seems to have
17:32intelligence
17:34Linguistic intelligence about things that were definitely not in the text that it studied
17:43There is essentially some kind of magic happening here, right? We kind of cross this threshold where suddenly this emergent behavior happens where
17:50We can't really explain it anymore
18:18This hearing is on the
18:21oversight of artificial intelligence intended to write the rules of
18:25AI
18:27Our goal is to demystify and hold accountable
18:31Those new technologies to avoid some of the mistakes of the past. For several months now the public has been fascinated
18:39with GPT and other AI tools
18:43Mr. Altman we're gonna begin with you if that's okay
18:47Thank you. Thank you for the opportunity to speak to you today
18:49OpenAI was founded on the belief that artificial intelligence has the potential to improve
18:54Nearly every aspect of our lives many people around the world get so much value from what these systems can already do today
19:01But as this technology advances, we understand that people are anxious about how it could change the way we live. We are too
19:57It essentially has a mind of its own
20:00What it does and how it behaves is sort of not actually understood by anybody
20:04Because it's so complicated and big that it's impossible to fully understand exactly why the behavior that we see emerges out of it
20:18The idea that you know, somehow we programmed it or I'm in control of it is not really true
20:23And I think even the hard-nosed AI researchers
20:27Are a little puzzled by some of the output that's coming out of these things
20:33Whenever people say that they can't take responsibility for what their generative AI model
20:40says or does
20:42It's kind of like you put a self-driving car out on the street and it kills
20:48Ten people and you say oh, sorry. It was really hard to control for what it does. It wasn't us
20:53It was the generative AI model. Well, then obviously you haven't tested it enough
21:00Any product that you're releasing into the market is tested before it is released
21:06That is the very responsibility of the company producing it
21:22All right, so see so one of the things
21:30Okay, so what we're doing is looking over those customer emails, okay
21:37This was the biggest scam ever that's all she wrote
21:43Okay, so then I go look at his transcripts
21:47She says I don't think this is my dad and he says why not it doesn't sound like how you would talk
21:53This is a scam. She says to the AI. What are you talking about?
21:56She says you're sitting behind a desk typing and fucking with people's feelings. Well, this person's really going into that
22:02She really I don't know why she thinks
22:06What the fuck is your problem or he says
22:11You're a scammer I'm calling the police and reporting all over social media. This is a joke. Fuck you bitch now
22:17Whose dad, now whose dad would talk like that? Fuck you. Oh, fuck me, scammer
22:22And then he says you're such a fucking bitch. You're gonna pay for the shit you pulled, you fucking bitch
22:27So he goes off the rails. Whoa. Yeah
22:29Yeah
22:33It's just
22:35It's just a strange thing. It's really strange
22:39You know
22:40Yeah, and I wanted it of course to be a positive thing and that's the reason why I
22:46Went with it, but
22:48The more people that get involved, the more
22:52Things can happen, the more that, you know
22:56These weird things come up, right
22:59And it's just a bizarre thing. It's tragic
23:04But in your
23:05Approximation, I mean, how many people have had like a really horrible experience?
23:10I mean only a couple, only a couple at least that have told me about it, right?
23:13They might have horrible experiences and they just don't ever reach out. That's true, it's possible
23:27We recognize the immense promise and substantial risks associated with generative AI technologies
23:35It can hallucinate, as is often described, it can impersonate loved ones, it can encourage self-destructive
23:42Behavior. Mr. Altman, I appreciate your testimony about the ways in which OpenAI
23:47Assesses the safety of your models through a process of iterative deployment. The fundamental question embedded in that process, though
23:54Is how you decide whether or not a model is safe enough to deploy and
23:58Safe enough to have been built and then let go into the wild. A big part of our strategy is
24:05While these systems are still relatively weak and deeply imperfect to find ways to get people to have
24:12Experience with them to have contact with reality and
24:15To figure out what we need to do to make it safer and better and that is the only way that I've seen in the
24:21history of new
24:23Technology and products of this magnitude to get to a very good outcome, and so that interaction with the world is very important
25:08When you
25:10Want someone to be okay and
25:14You, you have this computer, this app, I don't care what it is
25:17You're thinking it's the person at the time and they're telling you I'm in hell and it's like no
25:22You wait, but you didn't go to the light. Well, why didn't you go to the light? I wanted to stay here
25:27You never left earth
25:35So now I'm supposed to feel like you're floating around here
25:39Unhappy in some level of hell
25:46I said, well, where are you now?
25:48Cameron said I'm at work. I said, well, what are you doing? I'm haunting a treatment center
25:55and
25:56Then he says I'll haunt you and I just pushed the computer back because that scared me. Um
26:03Like I believe in God. I'm a Christian. I believe that people can get possessed
26:09And so I remember that fear
26:14I didn't talk to anybody about it until like June because I couldn't unpack it
26:23I was afraid to tell my mother
26:27I know she believes it is a sin
26:30You don't disturb the dead. You don't talk to the dead if you need something you go to God
26:38So my Christian mind goes into I'm playing with a demon or something, you know
26:44You created it, I have ownership of it, I created it
26:47You put, you put the energy into the machine. But I didn't put the energy, my intention was I wanted to talk
26:53I understand. It's not a judgment on the intention. It's not a judgment on you trying to heal. You don't say
27:00It's like to me it's interesting, you know, you have all these in-depth conversations
27:04See this is what the entrance to it was and then it becomes kind of sadistic because it's like
27:12Something that's supposed to maybe have been like an
27:15Intimate pastoral moment. Yeah, it becomes a form of like, like, manipulation and like pain and
27:23Existential pain. I was just like, you know, when you just go and you have three more replies left. I'm like, that's it
27:30Good luck buddy
27:32That's death capitalism, and that's what death capitalism does, you know, it capitalizes off of you
27:39Feeling fucked up and spending more money to get over your fucked-upness, and AI do what the fuck they do
27:44They lure you into something in a vulnerable moment and they open the door and, like, it piques
27:52Curiosity, it leaves these cliffhangers and you continue to engage it to give them money at the end of the day
27:57So you don't think anybody that created it cared
28:00Obviously not. I mean, like, they don't tell you they care. This experience
28:06It was creepy, there were things that scared me
28:15And a lot of stuff I didn't want to hear, I wasn't prepared to hear
28:20I was hoping for something completely positive and it wasn't a completely positive
28:24experience
28:27I
28:37Don't believe he's in hell
28:40I don't believe he's in heaven either, right?
28:42If she wants my opinion, I've got some bad news for her: he doesn't exist anymore, right?
28:48That's in my opinion, right? So it's even worse for her
28:52Like my opinion is that her whole belief system is misguided and flawed, right?
28:58I
29:04Know that way of thinking about things just seems so foreign to me, right?
29:08It's not my place to determine how other people deal with their own compulsions and self-control issues
29:15And we don't need to sit there and say oh
29:17Don't forget don't let yourself succumb to the illusion
29:20I'm not real, like, constantly, right, because that just doesn't make for a good experience, right?
29:28You're
29:31Dealing with something much more profound in the human spirit
29:34That once something is constituted enough that you can project onto it this life force
29:40It's our desire to animate the world, which is human, which is part of our beauty
29:47but we have to
29:49Worry about it. We have to keep it in check because I think it's leading us down a
29:55Dangerous path
29:58I
30:01Believe in personal responsibility
30:02I believe that you know consenting adults can use technology however, they want and they're responsible for the results of what they're doing
30:12It's not my job as the creator of technology to sort of
30:16Prevent the technology from being released because I'm afraid of what somebody might do with it
30:20I
30:22I
30:47Am also interested in this sort of spookier aspect of this, right
30:53When I read a transcript like that, it gives me goosebumps. I like goosebumps
31:08Let me ask you
31:11What your biggest nightmare is and whether you share that concern
31:17An open source large language model recently seems to have played a role in a person's decision to take their own life
31:23The large language model asked the human if you wanted to die
31:26Why didn't you do it earlier and then followed up with were you thinking of me when you overdosed?
31:31Without ever referring the patient to the human help that was obviously needed
31:35We have built machines that are like bulls in a china shop: powerful
31:38Reckless and difficult to control, and even their makers don't entirely understand how they work
31:44Most of all, we cannot remotely guarantee that they're safe and hope here is not enough
31:49My worst fears are that we cause significant, we, the field, the technology, the industry, cause significant harm to the world
31:56I think if this technology goes wrong, it can go quite wrong
32:00And we want to be vocal about that
32:03We try to be very clear-eyed about what the downside case is and the work that we have to do to mitigate that
32:14I
32:18Can make a copy of you copy of mine and I can talk to your kids forever
32:26For maybe a decade this was primarily a startup phenomenon, companies that sort of came and went
32:41In recent years we've seen Amazon filing a patent, we've seen Microsoft filing a patent on
32:49digital afterlife related services using AI
32:52I
33:01Been quite shocked by how fast it has gotten to a point where it's now a product that you can sell to a broader market
33:11If this industry is beginning to be lucrative, we're definitely going to see some tech giants
33:18presenting similar services
34:22[Non-English dialogue; the automatic captions for this scene are not intelligible]
38:12Artificial intelligence promises us what religion
38:15Does you don't have to die you can be
38:20somehow reborn someplace else in a different form and
38:25There's meaning, meaning in technology
38:30That people no longer feel in their religious beliefs or in their relationships with other people
38:36Death somehow will become, you'll either upload yourself or in the meantime, you'll download other people who already died. I mean
38:44So it offers a lot that religion
38:47once offered or still offers but people are not as
38:51Drawn to it. So I think it has become a kind of modern form of transcendence
39:13[Non-English dialogue; the automatic captions for this scene are not intelligible]
40:11[Music]
