Professor and computer scientist David J. Malan joins WIRED to answer your computer and programming questions from Twitter. How do search engines compile information so quickly? Which operating system is best? How do microchips work?
00:00 Hello, world.
00:00 My name is Professor David J. Malan.
00:02 I teach computer science at Harvard, and I'm here today
00:04 to answer your questions from Twitter.
00:06 This is Computer Science Support.
00:08 [MUSIC PLAYING]
00:13 First up, from Tad Proletarian.
00:14 How do search engines work so fast?
00:16 Well, the short answer really is distributed computing,
00:20 which is to say that Google and Bing and other such search engines,
00:23 they don't just have one server, and they don't even
00:26 have just one really big server.
00:27 Rather, they have hundreds, thousands, probably hundreds of thousands
00:31 or more servers nowadays around the world.
00:33 And so when you and I go to Google or Bing
00:36 and maybe type in a word to search for, like cats,
00:39 it's quite possible that when you hit Enter,
00:41 and that keyword, like cats, is sent over the internet to Google or to Bing,
00:46 it's actually spread out ultimately across multiple servers, some of which
00:49 are grabbing the first 10 results, some of which
00:51 are grabbing the next 10 results, the next 10 results,
00:53 so that you see just one collection of results.
00:56 But a lot of those ideas, a lot of those search results,
00:58 came from different places.
01:00 And this eliminates what could potentially be a bottleneck of sorts
01:03 if all of the information you needed had to come from one specific server that
01:07 might very well be busy when you have that question.
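As a rough illustration of that fan-out idea, here is a toy Python sketch with made-up shard data and document names (nothing here is Google's or Bing's actual code): each "server" searches only its own slice of the index in parallel, and the partial results are merged into the one collection you see.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical index shards: each "server" holds only part of the results.
SHARDS = [
    ["cat adoption guide", "cat breeds"],
    ["cat memes", "cat care tips"],
    ["cat videos", "cat food reviews"],
]

def query_shard(shard, keyword):
    # Each server scans only its own slice of the index.
    return [doc for doc in shard if keyword in doc]

def search(keyword):
    # Fan the query out to all shards in parallel, then merge the partial results.
    with ThreadPoolExecutor() as pool:
        partials = pool.map(query_shard, SHARDS, [keyword] * len(SHARDS))
    return [doc for partial in partials for doc in partial]

print(search("cat"))  # all six documents, gathered from three "servers"
```

No single shard had to scan the whole index, which is exactly the bottleneck the distributed approach avoids.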
01:10 Nick asks, will computer programming jobs be taken over by AI
01:14 within the next 5 to 10 years?
01:15 This is such a frequently asked question nowadays,
01:17 and I don't think the answer will be yes.
01:20 And I think we've seen evidence of this already,
01:22 in that early on when people were creating websites,
01:24 they were literally writing out code in a language called HTML by hand.
01:28 But then, of course, software came along, tools like Dreamweaver,
01:30 that you could download on your own computer, that would generate
01:33 some of that same code for you.
01:34 More recently, though, now you can just sign up
01:36 for websites like Squarespace and Wix and others, whereby click, click, click,
01:39 and the website is generated for you.
01:41 So I dare say, certainly in some domains,
01:44 that AI is really just an evolution of that trend.
01:46 And it hasn't put humans out of business as much as it has made you
01:50 and me much more productive.
01:51 AI, I think, and the ability soon to be able to program with natural language
01:55 is just going to enhance what you and I can already do logically,
01:59 but much more mechanically.
02:00 And I think, too, it's worth considering that there's just so many bugs
02:03 or mistakes in software in the world, and there's so many features
02:06 that humans wish existed in products present and future,
02:10 that our to-do list, so to speak, is way longer than we'll ever
02:13 have time to finish in our lifetimes.
02:15 And so I think the prospect of having an artificial intelligence
02:18 boost our productivity and work alongside us, so to speak,
02:22 as we try to solve problems is just going
02:24 to mean that you and I and the world together can solve so many more
02:27 problems and move forward together at an even faster rate.
02:31 All right, next up, Sophia, who asks, how do microchips even work?
02:34 It's just a green piece of metal.
02:36 Well, here, for instance, we have a whole bunch
02:38 of microchips on what's called a logic board,
02:41 also known as a motherboard.
02:42 There's a lot of ports that you might be familiar with, for instance.
02:45 Like here are some ports for audio, here are some ports for networking,
02:48 here are some ports for USB and some other devices as well.
02:51 And those ports, meanwhile, are connected
02:53 to lots of different chips on this board that
02:56 know how to interpret the signals from those ports.
02:58 And perhaps the biggest chip on this motherboard
03:01 tends to be this thing here called the CPU, or the central processing unit,
03:04 which is really the brains of the computer.
03:06 And there's something you can't necessarily quite see, because most of this
03:08 is actually paint and not traces. But if I flip this around,
03:11 you'll actually see, in the right light and at the right angle,
03:15 a whole bunch of traces running up, down, left, and right
03:17 on this logic board, connecting all of these various microchips.
03:21 And by trace, I mean a tiny little wire that's
03:23 been etched into the top or the bottom of this circuit board
03:26 that connects two parts thereon.
03:28 Now, what might these microchips be doing?
03:29 Well, again, they might be simply interpreting signals
03:32 that are coming in from these ports.
03:33 Two, they might be performing mathematical operations,
03:36 doing something with those signals in order to convert input into output.
03:40 Or they might just be storing information ultimately.
03:42 In fact, there's all different types of memory on a logic board like this,
03:45 be it RAM or ROM or the like.
03:47 And so some of those chips might very well
03:49 be storing information for as long as the computer's plugged in,
03:52 or in some cases, depending on the device, even when the power goes off.
03:56 All right, next, a question from NKHE.
03:58 So if anyone can learn coding, what do computer scientists
04:01 do for four years in university?
04:03 Typically, in an undergraduate program in computer science or computer
04:07 engineering or a similar field, someone spends much more time
04:10 learning about the field itself than about programming specifically.
04:14 So as such, you might study not only a bit of programming,
04:16 but also mathematics, certain fundamentals
04:18 that transcend the particular classes you might have taken
04:21 in middle school or high school, but that can be used
04:23 to solve grander real world problems.
04:25 You might learn something about networks,
04:27 how you can send information from point A to point B.
04:30 You might learn about graphics, how you can display things on the screen
04:33 or even create interactive animations or the like.
04:35 You might learn how to leverage certain ideas from mathematics
04:38 and other fields to implement your very own artificial intelligence nowadays,
04:42 whereby you use probability and statistics and information
04:46 more generally to try to predict what an intelligent individual,
04:50 or in this case, computer, might say in response to a question.
04:53 So computer science itself is a very broad field,
04:56 and programming is really just a tool that you tend to learn along the way.
05:00 From Maya Shelby, how do zeros and ones turn into the internet?
05:04 Well, I think the simplest answer there is
05:06 that the internet is built upon layers and layers and layers of ideas.
05:10 And if we start at the lowest of those levels, zeros and ones,
05:13 you have something called binary, where zeros and ones
05:15 can be used to represent any other numbers as well.
05:18 And if we use more and more zeros and ones, more and more binary digits
05:21 or bits, so to speak, we can count up higher and higher and higher.
05:24 And then if you and I agree that, all right, well,
05:26 let's not just use these patterns of zeros and ones to represent numbers.
05:30 What if we reserve some of these patterns
05:32 to represent letters of the English alphabet?
05:35 And then maybe you and I can decide to reserve certain patterns of zeros
05:38 and ones to represent colors like red and green and blue
05:41 and combinations thereof.
05:42 Well, once we have the ability to represent colors,
05:44 we could then represent whole pictures.
05:47 Because what's a picture on your phone or a computer screen?
05:49 Well, it's really just a grid of dots, each of which has its own color.
05:53 So this is all to say that even if we start at this lowest
05:56 level of just zeros and ones, so long as you and I and all of the devices
06:00 we use agree to follow some standard like this,
06:02 we can build these layers and layers of abstraction,
06:05 so to speak, on top of one another until finally you and I come up
06:09 with a pattern of zeros and ones that represents
06:11 send this piece of information from me over there.
06:14 And thus, we have something like the internet.
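To make those layers concrete, here is a small Python sketch showing how one pattern of bits can be read as a number, as a letter (under the ASCII standard), or as part of a color (under the common 24-bit RGB convention), depending only on which standard we've agreed to follow:

```python
# One pattern of bits, several interpretations, depending on the agreed standard.
bits = "01001000"

number = int(bits, 2)   # read as a plain binary number
letter = chr(number)    # read as a character, per the ASCII standard
print(number, letter)   # 72 H

# Reserve 8 bits each for red, green, and blue, and a pattern names a color.
red, green, blue = 255, 0, 0
pixel = (red << 16) | (green << 8) | blue
print(f"{pixel:024b}")  # 24 bits representing pure red
```

A picture, in turn, is just a grid of those 24-bit color values, one per dot.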
06:17 Majinbu asks, can someone that knows computer science
06:20 explain to me why computers use binary coding and not trinary,
06:23 when trinary is supposed to be faster?
06:25 So it's not necessarily the case that a trinary system, which
06:28 would use three symbols, for instance, 0, 1, and 2,
06:31 would actually be faster than binary.
06:34 Because binary, using just 0 and 1, tends to be simpler to implement
06:38 and also more robust to potential errors.
06:40 And if you're familiar with voltage levels, like in a battery,
06:43 it's very easy for a computer to distinguish something
06:45 like 0 volts from 3 volts.
06:47 But it gets a little harder if we try to draw the line somewhere in between.
06:51 Because there's just a higher probability
06:53 that a computer might mistake a voltage level like 1.5 in the middle
06:57 as maybe being a little closer to off than on or to on than off.
07:02 Here, too, is where even though there might be mathematical efficiencies
07:05 and real world efficiencies to using trinary, otherwise known as ternary,
07:09 like a 0, a 1, and a 2 digit, instead of just zeros and ones,
07:12 it turns out because our world runs on electricity nowadays
07:15 and there's so much momentum behind binary,
07:17 that it just tends to be a net positive.
07:20 Rachel P95 asks, why is every Windows solution, have you tried restarting?
07:25 And why does that always work?
07:26 So that's a very heavy handed solution to what are typically just bugs
07:30 or mistakes in software, for instance, Windows in this case.
07:33 Restarting a computer just starts everything from scratch.
07:36 So all of the computer's short term memory is lost
07:38 and everything starts in pristine condition.
07:40 Which is to say that it starts in exactly the way
07:42 that the programmers at Microsoft intended,
07:44 without potentially the distractions of the computer being
07:47 in some weird state or condition that the programmers just didn't anticipate.
07:51 Maybe you clicked on some buttons in a weird order,
07:53 maybe you opened a strange file, but maybe you
07:55 got the computer into a state that just wasn't programmed for properly.
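Here is a contrived Python sketch of that kind of unanticipated state (the "service" and its bug are invented for illustration): a long-running program quietly accumulates data nobody tested for, and restarting it simply rebuilds everything in the exact condition the programmers intended.

```python
# A contrived service whose hidden state grows in ways nobody tested.
class Service:
    def __init__(self):
        # A restart runs this again: short-term memory lost, pristine condition.
        self.cache = {}

    def handle(self, request):
        # Bug: entries are never evicted, so a long-running process slowly
        # drifts into a state the programmers didn't anticipate.
        if request not in self.cache:
            self.cache[request] = request.upper()
        return self.cache[request]

service = Service()
for i in range(1000):
    service.handle(f"request-{i}")
print(len(service.cache))  # 1000 entries of accumulated state

service = Service()        # "Have you tried restarting?"
print(len(service.cache))  # 0: everything starts from scratch
```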
07:59 Jason Whitmer now asks, what's the best operating system?
08:01 Well, this is one of these questions in computing
08:03 we would call a religious question, since it evokes a religious debate as
08:07 to which might be best.
08:08 Of course, among the most popular operating systems out there
08:11 are Windows and Mac OS.
08:12 But there's also one you might not have heard of,
08:14 which is called Linux, and which is actually
08:16 omnipresent in the enterprise world.
08:18 So many of today's servers actually run Linux.
08:21 So many of today's desktops or laptops, though, run Windows or Mac OS.
08:25 Now, that's not to say you couldn't run all of those operating systems
08:27 in different contexts.
08:28 And some of us do actually run Linux on our own computers.
08:31 So a lot of it really boils down to personal preference.
08:34 I wouldn't even say that there's one best operating system,
08:37 but there tend to be correlations between the operating systems people use
08:41 and the applications they have in mind.
08:42 So Windows, for instance, is so popular in the world
08:45 of PCs and desktops and laptops.
08:47 Mac OS is popular too, particularly in academia and certain countries,
08:50 but not necessarily on the same scale.
08:52 Linux, by contrast, is, again, very much used heavily in the server side
08:56 industry, but so is Windows as well.
08:58 So a lot of the choice for operating systems
09:00 sometimes comes from just what's most appropriate, what's most popular,
09:04 what's best supported.
09:05 But some of it comes, too, from just personal preference of the engineer,
09:08 maybe the first engineer that you hire to make one of those decisions.
09:11 So it's more about what's best for you and not so much best in general.
09:15 Next, Julio Magnifico asks, why aren't computers getting cheaper?
09:18 Well, computers, or at least computer parts inside of computers,
09:22 do tend to get cheaper.
09:24 The catch is that your expectations and my expectations just keep rising.
09:28 We want our phones, our laptops, our desktops
09:30 to do more and more in the way of the software that they run,
09:33 the games that we use, and just how quickly they perform for us.
09:36 So even though some of those parts are getting less expensive,
09:39 you and I want them to do more and more and be faster and larger in quantity.
09:43 And so as a result, I dare say that the price
09:46 isn't going down as far as you might hope.
09:48 That said, nowadays, you can get for the same amount of money
09:51 from yesteryear much, much more in the way of computing power.
09:55 So arguably, it's working to our benefit in some cases.
09:58 Next up, from Darrow Nabila, can someone explain cloud computing
10:02 to me like a five-year-old?
10:03 Cloud computing is essentially you using someone else's servers,
10:07 paying to rent them, for instance, or timeshare them.
10:10 So this isn't really a new idea or a new technology.
10:13 Rather, it's a better branding of a technique
10:15 that's been used for years, not just in the computer world,
10:17 but in the real world as well, whereby someone like Google or Microsoft
10:20 or Apple or others nowadays might be able to afford lots and lots
10:24 and lots of servers, and then make those servers available in part to me,
10:27 to you, and many other customers as well.
10:30 Hey, I'm Marcus.
10:31 Hey, Marcus.
10:32 Well, Marcus asks, how does computer memory work?
10:35 Think of computer memory as really being driven by a whole bunch of switches
10:38 that can be turned either on or off.
10:40 So for instance, if I take this here light switch, which is currently off,
10:44 I could simply say that this switch here is
10:46 representing the number 0 in binary.
10:48 But if I turn the switch on, well, now I
10:50 can say that I'm representing the number 1.
10:53 Now, of course, I can only count from 0 to 1
10:56 with a single light switch.
10:57 But what if I bring over a second light switch like this one here?
11:01 If we start at 0 in this way, I can turn on this switch first and claim that it's 1.
11:07 Let me now be more creative and turn this one off and this one on,
11:10 and now claim this is how a computer's memory could represent the number 2.
11:14 And now if I turn this switch back on, giving me a fourth pattern,
11:18 this is how I might represent the number 3.
11:20 Now, of course, if we add more and more of these switches, more and more
11:23 of these light bulbs, we can count even higher than 3.
11:26 And indeed, that's what a computer's memory is ultimately doing.
11:29 It's using lots and lots of little tiny switches,
11:32 otherwise known as transistors, to turn the flow of electricity on and off.
11:36 And then it's got other types of hardware,
11:37 called, for instance, capacitors, that have a capacity
11:41 to hold on to some of that electricity, just like the light bulb there being on.
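The light-switch demonstration can be written out as a short Python sketch, with each switch playing the role of one bit:

```python
# Each switch is one bit; two switches give four distinct patterns,
# and we simply agree which number each pattern represents.
for first, second in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    value = first * 2 + second
    print(f"switches {first}{second} -> {value}")

# Every extra switch doubles the patterns: eight switches already reach 255.
print(2 ** 8 - 1)  # 255
```

That doubling with each added switch is why memory sizes come in powers of two.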
11:45 All right, next, Donnie asks, how do you explain web 3 to people?
11:49 So web 3, like web 2, and retrospectively web 1,
11:53 are really just buzzwords that describe sort
11:55 of phases of the internet or the World Wide Web as you and I know it.
11:58 For instance, back in the day when there was just the World Wide Web,
12:02 now perhaps referred to as web version 1, information was largely static.
12:06 If you were to create a website on the internet, you would type up your code,
12:10 you would type up your content, you would put it on a server somewhere,
12:13 and someone could read that information.
12:15 But it was you, the web developer, or you, the owner of the website,
12:19 that was creating that content for other people to actually read and consume.
12:23 In web 2, the world became much more dynamic in recent years,
12:27 whereby now websites tend to have databases.
12:30 And they have more sophistication, so that a lot of the content in websites
12:34 today is actually coming from me and from you.
12:36 So if you think of any social media site,
12:38 it's not the owners of those sites that are creating most of the content,
12:41 it's you and me as the users of those same websites.
12:44 But in web 2, everything is nonetheless very centralized,
12:47 whether you're Twitter or Facebook, now Meta, or other companies.
12:51 All of that data, even in the world of social media, that's coming from me
12:54 and you, is actually being stored centrally on those company servers.
12:58 So web 3.0, or web 3, so to speak, is really
13:01 about transitioning away, potentially, from that very centralized model
13:06 to one that's more distributed, where the data that you and I are creating,
13:09 and the data you and I are consuming,
13:11 is actually distributed over multiple servers,
13:14 over a technique called blockchain, for instance, in some cases,
13:18 whereby there's not necessarily one owner of that data,
13:21 but really collective ownership, and therefore verification
13:24 that the data maybe indeed came from me and you.
13:28 Next, a question from Gomo Tigers.
13:30 Can someone explain to me the difference between firmware and software?
13:33 Hardware is physical, software is code, WTF is firmware.
13:37 Firmware is really a synonym for a type of software.
13:40 So firmware is just software, but it tends
13:42 to be software that comes built into your hardware.
13:46 And you can think of, in the simplest scenario,
13:48 that firmware is software that is just completely integrated into the hardware
13:52 and itself cannot be changed or even upgraded.
13:55 But that's a bit of an oversimplification,
13:57 because even firmware, typically, when it comes in a computer,
14:00 when it comes in a phone or some other device, can very often be updated.
14:03 Why?
14:04 Because the firmware is the software that's really closest to the hardware,
14:07 and in that sense, it might very well be the most important.
14:10 And if anything goes wrong with the firmware,
14:12 you might not even be able to turn that device on,
14:13 whether it's a phone, a computer, or even your refrigerator nowadays.
14:17 All right, that's all the questions for today.
14:19 We hope you learned a little something along the way.
14:21 We'll see you next time.
14:23 (whooshing)