A group of investors led by Elon Musk is offering about $97.4 billion to buy the nonprofit behind OpenAI, escalating a dispute with the artificial intelligence company that Musk helped found a decade ago.
Transcript
00:00 The company is not for sale, neither is the mission.
00:02 We're happy to buy Twitter if you want to talk about that.
00:04 Sam, Sam.
00:06 We're happy to buy Twitter.
00:07 Sam, how can we trust your models are safe?
00:09 Not much talk about safety here at the summit.
00:11 How can you reassure people your models are safe?
00:13 I mean, I think you have to look at what we've done.
00:15 This summit, obviously, is not as focused on safety as the previous ones have been.
00:19 But it's an integral part of everything we do in our work.
00:21 He called you a con man.
00:24 Did you see it?
00:26 It cost a lot of people a lot of money.
00:28 This is not a serious person.
00:29 Can you afford your own company now, though?
00:30 It does raise that question.
00:31 Can you afford to buy OpenAI off the non-profit now?
00:34 The board will decide what to do there.
00:36 The non-profit will continue as a very, very strong thing.
00:39 The mission is really important.
00:40 And we're totally focused on making sure we preserve that.
00:43 And how worried are you about China?
00:45 Are we worried about China?
00:47 We'd like to work with China, I think.
00:49 Do you think your government would let you do that?
00:52 Do I know that? For sure, no.
00:53 Should we try as hard as we can? Absolutely, yes.
00:55 I think that's really important.
00:56 And is that important for safety and for trust?
00:58 How much is OpenAI worth?
01:00 If it's not $100 billion, how much is it?
01:02 It's not for sale.
01:03 What do you think about the criticism from Mr. Musk?
01:05 He said that OpenAI is not open.
01:08 I mean, I think OpenAI has done the most
01:10 to push the benefits of AI into the world of anyone.
01:12 We have open-sourced a lot of things.
01:13 I think we should probably open-source somewhat more.
01:15 We'll probably do that.
01:17 Grok is not super open-sourced either.
01:18 But, you know, that's up to him.
01:20 How are you going to convince people here, though,
01:22 to trust you on safety?
01:23 I think that's a really important thing.
01:24 There's a lot of concern here.
01:25 There's not enough about safety in the declaration.
01:27 Can you do more?
01:28 How are you going to persuade people?
01:29 That actually has not been the main thing
01:30 we've been hearing about.
01:31 The concern is like, you know,
01:32 can you make AI cheaper and can you have more of it
01:34 and can we get it to be better and more advanced?
01:36 Can you?
01:37 Sorry, carry on with that.
01:38 Well, yes, I think we can also do that.
01:39 But again, safety is integral to what we do.
01:41 As we get into this world of agents,
01:43 we've got to make these systems really safe
01:45 or people just won't use them.
01:46 It's the same thing,
01:48 and we'll push super hard on doing that.
01:49 Does Europe have to regulate more?
01:51 No.
01:52 I mean, the European people can decide
01:53 whatever they want to do,
01:54 but they don't have to regulate more.
01:55 The European people have to make decisions
01:57 about how to balance the potential of AI,
02:00 the potential risks of AI,
02:02 and the people will decide.
02:03 We'll follow the rules.
02:04 And, you know, there's trade-offs either way.