Presenter: Ami Badani, Chief Marketing Officer, Arm
Transcript
00:00 (upbeat music)
00:02 - Hi there.
00:05 I think we're all here because we believe
00:08 that AI will have a profound impact,
00:10 probably the most profound impact
00:12 that will be in our lifetimes.
00:14 It'll be bigger than the internet,
00:16 bigger than the smartphone revolution,
00:19 bigger than decades of innovations prior,
00:22 like the invention of the light bulb.
00:24 But AI is only at its tipping point.
00:28 In fact, you've seen use cases today,
00:30 like sports, financial services, entertainment,
00:34 and we're only getting started.
00:37 If you think about how AI will become mainstream
00:40 over the next decade, you think about ChatGPT.
00:44 It's at 200 million users in just over a year.
00:49 That's the fastest growing application
00:51 that we have ever seen.
00:53 Secondly, you think about self-driving cars.
00:56 One in 10 cars by 2030 will be self-driving cars,
01:01 and that will all be because of artificial intelligence.
01:05 Third, 75% of office
01:08 productivity applications
01:11 will incorporate generative AI
01:14 in some way, shape, or form.
01:16 So as you can see, AI will become mainstream.
01:20 But we're on a collision course with reality,
01:23 and it's not sustainable.
01:26 There are two things that we need
01:27 to really advance artificial intelligence,
01:30 and that's compute and power.
01:33 Let's talk about compute.
01:35 AI models are more data hungry
01:39 than they've ever been before.
01:42 They're increasing in complexity,
01:44 they're increasing in size,
01:46 and they're increasing in scope.
01:48 And what that drives is the need for more compute chips.
01:52 In fact, if you recall, OpenAI had their latest Sora model.
01:57 That's their latest text to video model.
02:01 It takes 100,000 AI chips,
02:04 working at full compute capacity and full power consumption,
02:09 in order to train Sora.
02:11 That's a huge amount of compute chips.
02:15 And that's just one large language model of many.
02:19 Then you think about smaller language models
02:22 that are gonna grow in scale,
02:25 and be quantized and applied to things like smartphones,
02:29 your thermostat in your home, and other home appliances,
02:33 your PC laptops, and inference in the cloud.
02:38 And then you think about large language models
02:41 and small language models,
02:44 and that's really driving the demand for compute chips.
02:47 But it's not just about the demand for compute chips,
02:50 it's really about power-efficient compute chips.
02:54 And that's really gonna help us carry
02:56 the advancements of artificial intelligence.
02:59 Power efficiency is in Arm's DNA.
03:02 In fact, 99% of smartphones
03:06 use a power-efficient Arm chip.
03:09 And it's the same design philosophy
03:12 that we're applying to the advancements
03:14 of artificial intelligence.
03:17 In fact, if you think about AI, it comes with a cost.
03:22 And that cost is unfortunately power.
03:27 In fact, large language models
03:29 have an insatiable demand for power.
03:32 So without addressing power,
03:35 we won't have advancements in artificial intelligence.
03:39 And if you think about the future
03:42 of artificial intelligence,
03:44 it's all gonna be about power efficient chip designs.
03:48 Now, if you look at this graph,
03:50 what it really shows is data center power consumption.
03:54 Most AI training and large language models
03:58 are all done in the data center today.
04:00 And that's 2% of the global electricity consumption,
04:05 which is an enormous amount.
04:07 But as I spoke about earlier,
04:09 AI will become mainstream.
04:12 And when AI becomes mainstream,
04:15 the biggest tax that we'll see is on our power grid.
04:19 And we won't be able to continue the advancements of AI
04:23 without addressing power.
04:25 In fact, by 2030, artificial intelligence
04:29 will place triple the load on the grid that it does today.
04:34 To put this into perspective,
04:35 25% of the US power grid
04:39 will be consumed by artificial intelligence.
04:42 So we really need to address this power problem head on.
04:46 And if you look at ChatGPT,
04:48 it requires 15 times more energy
04:52 than a traditional web search.
04:55 So that's a huge amount of energy.
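The 15-times figure above can be put into rough numbers. A minimal back-of-envelope sketch, assuming a traditional web search consumes about 0.3 Wh (a commonly cited estimate, not a figure stated in this talk) and a hypothetical query volume:

```python
# Back-of-envelope energy comparison using the talk's 15x claim.
# The 0.3 Wh per web search baseline and the 1M queries/day volume
# are illustrative assumptions, not numbers from this presentation.

WEB_SEARCH_WH = 0.3          # assumed energy per traditional web search (Wh)
CHATGPT_MULTIPLIER = 15      # from the talk: 15x the energy of a web search

chatgpt_wh = WEB_SEARCH_WH * CHATGPT_MULTIPLIER

queries_per_day = 1_000_000  # hypothetical daily query volume
extra_kwh_per_day = (chatgpt_wh - WEB_SEARCH_WH) * queries_per_day / 1000

print(f"Energy per ChatGPT query: {chatgpt_wh:.1f} Wh")
print(f"Extra energy for 1M queries/day vs. web search: {extra_kwh_per_day:,.0f} kWh")
```

Under these assumptions, each query costs about 4.5 Wh, and shifting a million daily searches to a chatbot adds a few megawatt-hours per day, which is the kind of grid-scale impact the talk is pointing at.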
04:57 And that's what Arm is really focused on,
04:59 is driving power efficient compute chips.
05:02 In fact, if you look at customers like Google,
05:05 Microsoft, Amazon, Oracle,
05:08 they are all designing power efficient chips
05:11 because they know they have to solve the power problem first
05:16 in order to drive
05:18 the advancements of artificial intelligence.
05:21 So for companies to have a competitive advantage,
05:24 they really need to think about power efficiency
05:27 and solve the power problem
05:29 in order to advance artificial intelligence.
05:31 [END]
