This surely won't make me seem like a crank.
Further watching:
@HGModernism on addiction to scrolling and the Skinner box mechanism:
https://www.youtube.com/watch?v=bNOol5OTasw
@acollierastro on the AI hype cycle and how professionals understand there's nothing new here
https://www.youtube.com/watch?v=rFGcqWbwvyc
Technology Connextras (the second channel where I put stuff sometimes)
https://www.youtube.com/@TechnologyConnextras
Technology Connections on Bluesky:
https://bsky.app/profile/techconnectify.bsky.social
Technology Connections on Mastodon:
https://mas.to/@TechConnectify
Have you ever noticed that I've never done that whole influencer thing? That's all thanks to people like you! Viewer support through Patreon keeps this channel independent and possible. It's how I can express my true thoughts on what YouTube's business daddy (and Silicon Valley at large) are up to. If you'd like to join the amazing folks who fund my work, check out the link below. And thank you!
https://www.patreon.com/technologyconnections
Category: Technology

Transcript:
00:00I want to show you something.
00:01It's this radio.
00:03It's pretty cool looking, right?
00:05I use it pretty often.
00:07Now, here's a fun little challenge for you.
00:10Tell me the model number, what year this radio was made,
00:13what vacuum tubes are inside of it,
00:15and find its schematic diagram for me so I can order new capacitors for it.
00:20Go on.
00:22If that seems like a difficult task, it's very much not.
00:26So long as you can see this image,
00:28you have everything you need on your screen right now
00:32to get all that information and much more in less than a minute.
00:36That is, if you know where and how to look.
00:40Now, I'm going to show you where and how to look in just a moment.
00:43But first, I should note that this is no doubt going to be
00:47the most crotchety, old-man-yells-at-cloud video I've ever released,
00:52and I won't hide from that.
00:54Another thing I won't hide from is the fact that this is largely an opinion piece.
00:59If that doesn't sound like your jam,
01:01please feel free to watch any of the other videos on this here website.
01:06But if you'll indulge me,
01:08that's actually a large part of what this video is about.
01:12It's about the modern internet,
01:14and how I think it's caused a lot of folks to stop looking for things.
01:20I'll explain what I mean later,
01:21but first, let's figure out what that radio is.
01:25So, all we've got is a picture,
01:27but I promise you don't need any fancy image recognition tools to find this radio.
01:33All it takes is noticing a few of its details.
01:36The radio is branded Silvertone,
01:39and the tuning dial has two sets of numbers,
01:42one labeled Standard Broadcast,
01:44and the other labeled Ffff-something Modulation.
01:48The first word is partially obscured by the dial's pointer,
01:51it's likely Frequency.
01:54You probably know that Frequency Modulation is what FM is short for,
01:58and this is clearly an old radio.
02:01So it's some kind of old FM radio made by Silvertone.
02:05How can we identify which one?
02:08Well, let's see if we can pick it out from a Google image search.
02:11A lot of people would call this a vintage radio,
02:14so we'll look for Vintage Silvertone FM Radio.
02:18There are lots of pictures of cool old radios here,
02:21but look at that.
02:22It's the very same one.
02:24And with a click on it, we'll see the image description,
02:27which tells us this is a Silvertone Model 18.
02:31Wonderful.
02:32Now that we have its model number, we can get more details.
02:35I wanted the radio's schematic diagram,
02:37so let's do a web search for Silvertone Model 18 Schematic.
02:42The first link is to a website called radiomuseum.org.
02:46That looks pretty promising.
02:48Click through and there's another picture of the same radio,
02:51so we have confirmation that this is indeed a Silvertone Model 18.
02:57And right on this page we find it was produced between 1950 and 1954,
03:02plus we have a list of its 8 vacuum tubes.
03:04And of course, there is the schematic available for download,
03:08though it can also be found on other websites.
03:11Keeping this information available online is a serious undertaking,
03:15and I applaud the people who dedicate themselves to maintaining active databases like this,
03:20as well as those doing the work to back it all up.
03:23Especially right now.
03:25The point of that little exercise was, well, exercise.
03:29What we just did is nothing short of a human superpower.
03:34From just an image of an old radio,
03:36you can find out a lot of information using simple search tools and your own observations.
03:42And if you're just a little curious,
03:44you can keep going.
03:46Did you notice that the manufacturer was Sears Roebuck & Company?
03:50Yeah, that Sears.
03:52The one that built the tower I see every day.
03:54Weather permitting.
03:56They made a lot of stuff in-house back in the day,
03:58and Silvertone was their brand of electronics, and also musical instruments.
04:03This brown, mid-century beauty was one of their radio receivers.
04:07And if you don't know anything about antique radios and vacuum tubes
04:11and why these old things usually need new capacitors,
04:14you can also find all that out.
04:16A search for replacing capacitors antique radios brings you to this lovely website.
04:21A much more useful resource than whatever AI nonsense Google is synthesizing
04:26because that's apparently what search engines are supposed to be now for some reason.
04:30Oh, there's the old man yelling at clouds.
04:33Told ya.
04:34I am sure a lot of you knew how useful the information in that image was
04:39and how you could use it to find the radio.
04:42And to those of you who did, this video probably seems incredibly unremarkable so far.
04:48But I believe quite strongly now that this is a skill which as time marches on,
04:53people are forgetting they have and thus don't think to use even when it could help them.
05:00I want to reiterate the language I just used there.
05:03I'm not saying people don't know how to do this.
05:06Anyone can do this, and hopefully you learned how in school.
05:10It's pretty basic research.
05:12What I am saying is that it appears as though an increasing number of people
05:17seem to operate in the world without realizing these are things they can do themselves.
05:23And that really concerns me.
05:26Now, I've spent enough time on forums to know about
05:29Let Me Google That For You,
05:30a snarky response to people who ask easy-to-answer-with-a-web-search questions.
05:34I was even on the receiving end of that once.
05:37But that's not quite what I want to talk about.
05:41I want to talk about how we decide what we want to see, watch, and do on the internet.
05:48Because, well, I'm not sure we realize just how infrequently
05:53we are actually deciding for ourselves these days.
05:57That's right, this is a video about the problems of
06:00recommendation algorithms on social media and the internet at large.
06:04But I'm not going to be focusing much on what exactly those algorithms do.
06:08I think we all know by now.
06:11Instead, I'm going to be focusing on something which feels new and troubling.
06:16I'm starting to see evidence that an increasing number of folks actually prefer
06:22to let a computer program decide what they will see when they log on,
06:27even when they know they have alternatives.
06:30Since this feels like a new phenomenon, I felt it needed a name.
06:34I've chosen to call it Algorithmic Complacency.
06:38Now, I recognize that I am stepping into a discussion that many, many people have had,
06:43and so I don't want to claim that this is an original thought or even an original term.
06:49But as I've worked to define and articulate it,
06:52I've come to believe that it's a serious problem that needs our immediate attention.
06:58Succumbing to Algorithmic Complacency
07:00means you're surrendering your own agency in ways you may not realize.
07:06And as the internet and real life continue to blend together, that can end very badly.
07:13Since I believe there's significant gravity to this topic,
07:17I want to present a convincing argument to you.
07:20And that requires that we first look backward in time.
07:24Think for a moment about what your experience on the internet is like these days and,
07:29if you're old enough, how it differs from a couple of decades ago.
07:33The internet used to only exist through a web browser on a desktop computer.
07:39Maybe a laptop if you're fancy.
07:41And your computer had to pretend to be a telephone and shriek at other computers
07:45through a phone line just to transmit and receive data at blazing slow speeds.
07:51It was a dark time.
07:53Back then, Google was just a happy little search engine which helped you find websites.
07:59And when you found a cool website which you liked,
08:02you'd use your web browser to bookmark that website.
08:06That would make sure you could get back to it later without having to search for it again.
08:10Like writing a note to yourself.
08:12In other words, the internet was still very manual
08:16and you were in charge of navigating it and curating your own experience with it.
08:21That also meant the internet didn't do anything until you asked it to do something.
08:28You might set your browser's homepage to your local newspaper's website
08:32so you could get a bit of a news update each time you logged on.
08:35But other than that, information didn't just come to you.
08:40Finding news, products, and information on the internet was entirely up to you.
08:45You had the world's information at your fingertips,
08:48but you needed to use your fingertips and your brain to get to it.
08:52And it was also up to you to gauge the trustworthiness and reliability of the information you found.
08:59If you're over the age of 30, you probably remember what this was like.
09:04Over time, though, things have steadily become a lot more automated.
09:09And also, a lot more in-your-face.
09:13Many people these days experience the internet primarily through their smartphones and mobile apps
09:19in a neatly packaged ecosystem created and curated by giant tech corporations.
09:25Even though those apps are often just bespoke web browsers that only take you to a very specific
09:30website and keep your traffic inside a walled garden while also collecting lots of data about
09:34where you go and what you do, they still represent a radical shift in how we use and experience the
09:40internet as a resource.
09:43Platforms became the new name of the game.
09:45We're not surfing the web and looking for cool and useful things anymore.
09:50We're hanging out on platforms.
09:53Like, well, this one.
09:56We're so used to this now that we imagine apps less as a piece of software which enables
10:02connection to people and information, and more of a place where we spend time.
10:08Internet historians remind us that this is not our first rodeo.
10:12We escaped the walled garden that was AOL, after all.
10:15But I think most would agree that today's internet is distinctly different and intense.
10:22It has become so integral to our lives that it's shaping how we view the world
10:27and how we operate within it.
10:30And most troublingly, we are largely no longer in control of what we see.
10:37Recommendation algorithms end up putting content in front of our eyes using methods
10:42almost nobody really understands but which probably have something to do with maximizing revenues.
10:46And, well, I think it's breaking our brains.
10:52When you have that finely-tuned, algorithmically-tailored firehose of information just
10:56coming at you like that, you might feel like you're having a good time and learning some
11:01interesting things, but you're not necessarily directing your own experience, are you?
11:08Is your train of thought really your own when the next swipe might derail it?
11:14Now, I am by no means the first person to ask that question.
11:18And, full disclosure, I make my living finding information, packaging it into videos,
11:24which I hope are entertaining and insightful, and putting them online for people to hopefully
11:28stumble across when the YouTube algorithm recommends it to them.
11:32So, I recognize the awkwardness of this particular person
11:35talking about this particular thing on this particular platform.
11:40But here's what I think might be new, or at least under-discussed.
11:44I am seeing mounting evidence that an increasing number of people are so used to algorithmically
11:50generated feeds that they no longer care to have a self-directed experience that they are in control
11:58of. The more time I spend interacting with folks online, the more it feels like large swaths of
12:05people have forgotten to exercise their own agency. That is what I mean by algorithmic complacency.
12:15More and more people don't seem to know or care how to view the world without a computer algorithm
12:21guiding what they see. That's a pretty bold claim I just made, and that's going to require evidence.
12:27I will say up front that my evidence is not scientific, so set your expectations accordingly.
12:34I do have data for one particular thing, but I want to talk about that second.
12:39First, I want to talk about my experiences on new forms of social media,
12:43and how that has informed this argument. I have long given up on
12:49traditional social media, but I've been indulging in some of the alternatives like Mastodon and
12:55lately, BlueSky. I'm not trying to sell you on using them, to be clear. But both of those
13:01platforms are far more manual than anything the likes of Meta or Twitter might have spun up.
13:08I think this is great and quite refreshing. I follow accounts that I'm interested in,
13:13and I never have to worry about whether an algorithm won't show me their posts.
13:18I still discover new accounts all the time, but through stuff the ones I'm following have shared.
13:25That's a much more human experience than letting an algorithm decide to put stuff in front of my
13:30eyeballs because it will keep me engaged and increase user time on platform. It's also a much
13:36more social experience. Yet to a lot of new users that are migrating to these platforms,
13:43the need to curate your own experience is very frustrating. Several of them have told me this
13:49directly and even said they'd prefer not to have to put any work into social media.
13:55Now, I must admit, that sentiment alone concerns me. I'm one of those weirdos who think the most
14:01rewarding things in life take effort, at least outside November, so I would not expect to just
14:07walk into a new place and have a good time without doing a little old-fashioned exploring.
14:13However, I can sympathize with it. For one, I have to recognize that I'm a popular YouTuber,
14:19and I can just show up somewhere, say, hey guys, and expect a lot of people to start following me
14:25very quickly. And that helped re-establish connections I formed in other places.
14:31That's a privilege I have, and it skews my perspective a lot, which it is only fair that I acknowledge.
14:36My experience online is not normal.
14:41But observing BlueSky grow from an extremely niche platform to a still niche but quickly
14:47growing one has presented a very interesting case study which I feel compelled to share.
14:53BlueSky, for those who don't know, allows for the creation of custom feeds.
14:58That's one of its central features, which is really cool. But baked into it as a default
15:02when you create an account are two feeds with different purposes—following and discover.
15:09The following feed is a reverse chronological feed of the posts from accounts you follow,
15:15exactly how Twitter used to work. But the discover feed is algorithmic,
15:21like those For You pages everyone keeps going on about.
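The difference between those two kinds of feed can be sketched in a few lines of Python. This is a toy illustration, not Bluesky's actual implementation: the sample posts and the engagement formula (likes plus double-weighted reposts) are made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float  # Unix time
    likes: int
    reposts: int

# Hypothetical sample data for illustration.
posts = [
    Post("alice", 1700000300, likes=2,   reposts=0),
    Post("bob",   1700000100, likes=900, reposts=250),
    Post("carol", 1700000200, likes=15,  reposts=3),
]

def following_feed(posts):
    """Reverse chronological: newest first, nothing hidden, nothing boosted."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def discover_feed(posts):
    """Engagement-ranked: an oversimplified stand-in for a 'For You' algorithm."""
    return sorted(posts, key=lambda p: p.likes + 2 * p.reposts, reverse=True)

print([p.author for p in following_feed(posts)])  # ['alice', 'carol', 'bob']
print([p.author for p in discover_feed(posts)])   # ['bob', 'carol', 'alice']
```

The point of the sketch: the first ordering is fully determined by choices you made (who you follow, when they posted), while the second is determined by a scoring formula someone else chose.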
15:24On BlueSky, it's a pretty basic algorithm, but its job is to pick posts from accounts
15:30across the platform and essentially promote them so people can find new accounts to follow.
15:35It's by no means a bad idea to have that feature.
15:40But as someone with a pretty large following on BlueSky now,
15:44I can tell the instant a post of mine ends up on the discover feed
15:49because the replies get real weird real fast.
15:53What was previously a post with a nice discussion going on underneath
15:57between myself and various people who know who I am and what I mean when I say words
16:02becomes littered with strange, out-of-context, often antagonistic replies as if the only
16:08possible response to seeing a post of any kind online is to loudly perform a challenge against it.
16:16Now, some of these are bots, a known problem on BlueSky.
16:20But through forming a habit where I check profiles before replying to see if they show
16:24signs of being a bot, it's clear that a lot of them are not.
16:29They're people who just start talking with absolutely no idea who they're talking to
16:34and with no desire to figure out the context of the discussion before they write their reply.
16:40This may feel like I'm veering off track a bit or just complaining about people,
16:45which is fun, but this is less about calling out the behavior of individuals
16:49and more about recognizing the incentives which promote that behavior.
16:55Algorithmic feeds on social media are unfortunately quite good at fostering
17:00something known as context collapse. To understand this, imagine you're dining
17:05in a restaurant and you're close enough to a table of people to hear snippets of their conversation.
17:12You don't know who any of the people at that table are, but if you manage to overhear them
17:17talk about something you're really interested in, you might feel tempted to join their conversation.
17:23But in the context of a restaurant setting, that's considered very rude, so it rarely ever happens.
17:30On social media, though, the same kinds of quasi-private conversations between parties
17:35who know each other are happening all the time. But since the platform is just one big space
17:42and it might decide to put that conversation in front of random people, that social boundary of
17:48etiquette which is normally respected is just not there. And lots of conflicts happen as a result.
17:56A really common one you might accidentally step into on social media happens when you stumble
18:01across a conversation among friends making sarcastic jokes with each other. But since
18:07you don't know who those people are, you don't have the context you need to recognize they're joking.
18:13And so if you reply with a serious critique, well, that's a social misfire which some will
18:19react poorly to. And that's a pretty mild form of context collapse. It can be much,
18:25much worse when people want to discuss things like politics. And unless we realize recommendation
18:32algorithms are what's fostering these reactionary conflicts, they're going to continue so long as
18:38we use platforms in the ways that we do. It's for all these reasons that I believe algorithmic
18:44complacency is creating a crisis of both curiosity and human connection. I would even go so far as
18:51to say it's fostering lots of other disturbing things, too. Ever notice how a lot of folks these
18:57days need to have a simple good or bad, black or white, best or worst understanding of a topic or
19:04issue? It seems to me like algorithms which promote content through a simple lens of positive
19:10or negative engagement would reinforce those binaries and contribute to polarization.
19:16And as people learn about new products through the slot machine of social media feeds,
19:21they can develop a learned helplessness where they will wait to be sold on a solution for
19:26their problems rather than be introspective and explore what their problems actually are
19:33and how they might be able to come up with their own solutions which don't cost any money.
19:38Introspection will reveal a lot of the problems you think you have are being put in your head
19:44by influencers. You weren't unhappy until they told you you should be.
19:50And, well, I can think of lots of other stuff which has disturbed me for quite a while but
19:55is now past the point I can ignore as a quirky consequence of connecting large
20:00numbers of humans together. Social media algorithms don't nurture human connection,
20:06they exploit it. And we are so used to this reality now that I'm not sure many of us care
20:12to get off this train. But I think we should. So now, let's talk about YouTube. I'm sure a
20:20good number of you have thought this is pretty rich coming from a guy who makes his living on
20:23a platform which does all these things. Well, there's a funny thing about YouTube.
20:30Its recommendation algorithm is entirely optional. You know that, right? You... you do know that,
20:39right? There's this feature on this website that has been here pretty much since it became a thing
20:45and it's called the Subscriptions Feed. It's not hiding. It's at the bottom of the mobile app and
20:51on the sidebar of the desktop site. In fact, it's got its own URL and if you're feeling old school,
20:57you can bookmark it. This is a completely manually curated feed which has nothing in it but the
21:04videos (and Shorts, for better and worse) from the creators you have chosen to subscribe to.
21:10That means it's entirely in your control. This feed has been in plain sight the whole time and
21:18here's why I'm using the term algorithmic complacency. Nobody cares to use it. And that
21:25I can back up with hard data. YouTube gives us a shocking amount of information regarding audience
21:32metrics and in 2024, less than three percent of this channel's views came from the Subscriptions
21:41Feed. Almost nobody is using this feature and yet it's the most reliable way to keep
21:47track of the things you have explicitly decided you want to watch through hitting the subscribe
21:52button. The fact that it's easy to get to, has existed in plain sight almost since this platform
21:58was born, yet almost nobody is using it anymore, is puzzling. I want to stress my use of the word
22:06puzzling. It's not my intent to be judgmental here and I'm sorry if it has sounded like that.
22:12Different people use websites differently and the main home feed on YouTube, which functions like a
22:18For You page, usually does a good job of surfacing new videos from creators you like.
22:24Subscribing to a channel is a signal to the recommendation algorithm to boost that channel's
22:29videos in your home feed. And it's not like that algorithm has no value. There are some creators
22:36I'm not even subscribed to, yet I see most of their new videos since the algorithm has figured
22:41out I keep watching them so I must want to see the new ones and it puts them in my home feed.
22:47Through that feed, I've also stumbled across countless new channels and have been reminded
22:52of videos I watched years ago and will happily watch again. Plus, of course, I have found so
22:59much cool stuff through the recommended videos that appear alongside whatever I happen to be watching.
23:05I'm glad YouTube does that.
23:08Wait.
23:09Nuance? On the internet? That's illegal!
23:13In the interest of time, I've removed a big section on the flaws with the subs feed and how
23:18I think YouTube should address them. Mainly, please let us remove shorts if we aren't interested in
23:23them and please let us organize the subs feed a little bit so creators who upload a lot don't end
23:28up cluttering it. Maybe condense their daily activity into a single tile which we can then
23:34click on and expand. Just a thought. But the reason I wanted to talk about it is that it feels
23:42indicative of a growing disinclination to grab the reins of the internet and be the person steering
23:49your own experience. That's really what bothers me. There has to be a healthy balance between cool
23:55stuff you stumble upon and stuff you're actually interested in and matters to you.
24:01Algorithmic complacency, if not noticed and acted upon, means you're allowing other people who are
24:08not you to decide what matters to you. I should not have to spell out why that's dangerous,
24:15so I won't. But I will spell out that it's very easy for people who wish to weaponize this reality
24:21to craft a narrative which is not overtly obvious and so might slip past your defenses.
24:28If you can reduce the presence of algorithmically curated feeds in your life, you'll be less
24:32susceptible to that. And if you build up your own network of people and resources you trust,
24:38you'll know when you're being bullshitted. And in case you haven't noticed,
24:42people are trying to bullshit us all the time now. And algorithmic feeds make this worse.
24:49They don't just exist on YouTube and social media, they exist in popular news aggregator apps.
24:55And that's meant a lot of really stupid articles keep floating around,
24:59because all that matters to many institutions which make news these days is clicks and ad money.
25:06Look at this stupid thing which ended up in the Google News app.
25:09The end of Walmart and Target in the US. A new retailer appears and is much cheaper.
25:15I know what an Aldi looks like. That's a picture of the inside of an Aldi.
25:21And, uh, if you've somehow not heard of Aldi or stepped inside one,
25:25that doesn't mean Aldi is a new retailer which just appeared.
25:30This article is a waste of time for the vast majority of people who might be tricked into
25:35clicking on it. You may have noticed that the publisher of that article was LDREO24,
25:41which, as far as I can tell, might actually be a fake news source.
25:46How did it get in Google News? That's a great question for Google. But even mainstream
25:52publications have gone wildly off the rails as they chase metrics rather than the truth.
25:58The New York Times of 2025 is publishing opinion pieces where bloviating morons go through the fun
26:05little exercise of what the political ramifications of turning Canada into the 51st state
26:11would be for Democrats. Folks, since they can't say this in plain language for some reason,
26:17I guess it's up to me. Canada is a sovereign nation. It's a foreign country which has the
26:26right to self-determination. The United States cannot simply turn Canada into a US state.
26:32Canadians have clearly indicated they do not want that, which means for us to force the issue
26:38would be to declare war with Canada and invade. No sane person should want that,
26:46and it's a shameful embarrassment that the New York Times would even entertain this as
26:51a possibility and legitimize the awful, inevitably bloody idea. How on earth did we get here?
26:59I can tell you how I think we got here. Big news publications have become just as dependent
27:03on algorithms to find their readers as their readers are to find the news. Which means
27:09they're more concerned with being enticing than being honest. Which is a damn shame.
27:16Okay, reel it in. Reel it in. I'm about to wrap up this video, but before I do,
27:23I want to explore one more related but different angle. I struggled with whether I wanted to call
27:30this phenomenon algorithmic complacency or automation complacency. The reason I struggled
27:36is that there isn't a clear distinction between those two things. Algorithms are a kind of
27:42automation, so you could say everything I've been talking about is the result of automatically
27:48curated feeds, and the video wouldn't change. But automation in itself is not necessarily bad.
27:55Lots of menial labor tasks have been replaced by automation, and this has largely been a great
28:01thing. The actually important word in this discussion is curated. It's one thing to
28:08automate, say, an elevator, or an inventory system, or a telephone switchboard. But it's a
28:16very different thing to automate what information people see. There are situations where that sort
28:24of automation is necessary. There's an ever-increasing amount of information being stored online,
28:29so when you're looking for something specific amongst that sea of information, keywords by
28:35themselves aren't enough. The demonstration we did at the beginning relied on Google's search
28:41algorithm determining context from the keywords we gave it so it could sort everything it found
28:46by relevance to that inferred context. And it's usually really good at that, as we saw.
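A toy example shows why bare keyword overlap isn't enough and context has to be inferred. This is not how Google's ranking works; it's a minimal sketch with made-up documents, scoring each one by how many query words it contains, which leaves a homonym like "jaguar" unresolvable.

```python
# Hypothetical two-document "index" for illustration.
docs = {
    "car":    "the jaguar is a british luxury car with a powerful engine",
    "animal": "the jaguar is a large cat native to the americas",
}

def keyword_score(query: str, text: str) -> int:
    """Naive relevance: count query words that appear in the text."""
    return len(set(query.lower().split()) & set(text.split()))

query = "jaguar top speed"
scores = {name: keyword_score(query, text) for name, text in docs.items()}
print(scores)  # both documents tie on the single shared word 'jaguar'
```

Both documents score identically, so a ranker must infer which sense of "jaguar" the searcher means from context beyond the keywords themselves.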
28:54But it's not always. I'm sure by now you've had the really frustrating experience of Google
29:00latching on to the wrong context and producing a lot of irrelevant results. And it can be
29:06extremely tedious to refocus the algorithm on the correct context. This happens a lot when
29:12search queries include a word with many homonyms. But we are rapidly moving away from a paradigm in
29:19which search queries present a list of sources for us to look at, cite, and verify, and are now being
29:26pressured into a new reality where large language models synthesize responses to queries which are
29:31statistically likely to produce a useful output but which do not provide us with sources of
29:37verifiable information or at least obfuscate them to the point that many people are not going to
29:43check them. I don't think enough of us have put much thought into what that means. It means we're
29:51careening towards a future where people just trust computers to do their thinking for them.
29:58And the thing is, we already know that's often a bad idea.
30:03Take, for example, how we navigate the actual physical world. If you drive a car, I am certain
30:10that by now you've used some kind of GPS navigation app to figure out how to get places. I do too.
30:17Don't think I'm about to say, oh we should go back to paper maps. Gross. But when you use a
30:24mapping app to navigate somewhere, how often are you prioritizing the fastest route? You probably
30:30have that set up as the default, don't you? I do. But do you ever question whether that is actually
30:38the most logical way to get somewhere? It often isn't because arrival time is only one of many,
30:46many variables which might be important to you. If I ask Google Maps to take me from my home to
30:52my office, it is going to suggest a route which first requires going the wrong way to then hop on
30:59a tollway on which I have to pay a roughly one dollar toll and have to overshoot my destination
31:04to hop on a second expressway and backtrack. That suggested route requires an extra four miles of
31:11travel and an extra one dollar each way, all to save me exactly one minute over taking much more
31:20direct state routes and surface streets. If I mindlessly did what Google suggested, over a year
31:27I'd put an extra two thousand miles on my car and spend an extra five hundred dollars on tolls
31:33just to save not even one full work day of time. Google is suggesting a terrible route just because
31:41it's one minute faster. The only reason I know it's a terrible route is because I live here
31:47and I know what it's suggesting is asinine. But when I don't know where I'm going,
31:53I trust it to make the best decision. But we don't necessarily agree on what is best.
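The tollway arithmetic above is simple to check. As a back-of-envelope sketch, assuming a typical commute pattern of roughly 250 working days a year with two trips per day (a figure the video only implies), it works out like this:

```python
# Back-of-envelope annual cost of always taking the "one minute faster"
# tollway route. Per-trip figures are from the video; the 250 working
# days/year, two-trips-per-day pattern is an assumption.
TRIPS_PER_YEAR = 250 * 2            # two trips per working day

EXTRA_MILES_PER_TRIP = 4            # detour vs. the direct state routes
TOLL_PER_TRIP = 1.00                # dollars each way
MINUTES_SAVED_PER_TRIP = 1

extra_miles = EXTRA_MILES_PER_TRIP * TRIPS_PER_YEAR         # 2,000 miles/year
extra_tolls = TOLL_PER_TRIP * TRIPS_PER_YEAR                # $500/year
hours_saved = MINUTES_SAVED_PER_TRIP * TRIPS_PER_YEAR / 60  # ~8.3 hours/year

print(f"{extra_miles} extra miles, ${extra_tolls:.0f} in tolls, "
      f"{hours_saved:.1f} hours saved")
```

Under those assumptions the "fast" route costs about 2,000 extra miles and $500 in tolls per year in exchange for roughly eight hours saved, which is the tradeoff the video describes.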
31:59That's the problem. Side note, the other bad thing about taking the tollway is that when I do,
32:06I don't get to see my neighbors and what they're up to. I actually really enjoy driving through town,
32:13seeing new businesses as they pop up, homes getting built and remodeled, admiring Christmas
32:18lights over the holidays, and just seeing people, which reminds me that other people exist, that they
32:25live real lives, and that they're connected to me because they're my neighbors. I'm not the most
32:32social person, but even I don't like the isolated feeling I get from a commute on a tollway, where there's
32:38nothing to look at but other cars and sound-isolation walls. I feel way more
32:44connected to my community when I can actually, you know, see it and the people who define it.
32:51That's a preference of mine, I grant, but it's also a priority of mine. I am being mindful
32:58now to put human connections above technological connections. See, what I do here is I make
33:06connections between technologies. That way you can learn how they fit together and how best to
33:11use them, and maybe you can use one concept in conjunction with another to make a third
33:16concept. That's what I'm doing here, trying to empower you to make your life better. Technologies
33:23which make human connection harder or even just more random are, uh, bad. That, I do think,
33:31is black and white true. Any piece of technology which gets in between humans who wish to help each
33:37other is frustrating at best and exploitative at worst. We ought to know by now that the real reason
33:44those systems get put in place is so that we need fewer humans in helpful roles. And I think there
33:52is absolutely no question that this is going to get worse until we all start looking inward and
33:59begin questioning how we operate in this world and why. Silicon Valley seems hell-bent on creating
34:07machines which can do our thinking for us. Why should any of us want that? I certainly don't
34:15want that. I don't learn anything unless I do the mental work to create a complete framework of
34:21understanding in my mind. I don't talk about things I don't understand because that's the
34:27fastest way you can make a fool of yourself. And it can be dangerous when you have a platform like
34:33I do. I will never trust a computer program to be able to understand anything in the way a human can,
34:41nor will I trust it to find information for me. If I have to vet everything it's finding,
34:48then I end up doing the same work I would have done myself. And if I don't vet what it's finding,
34:54then what I'm really doing is saying I don't want to be responsible for what I do.
34:59It frightens me that even though we've all seen the consequences of what a social media
35:03recommendation algorithm can do to shape our viewpoints, we are somehow falling for
35:08the temptation of machines which can offload our thought processes—the thing which makes us human.
35:17If that's not the purest form of lazy anti-intellectualism, I don't know what is.
35:23On that cheery note, let me make sure you know that I share the same frustration with the AI
35:29hype cycle as the AI researchers who are actually doing work to create tools to solve real problems,
35:36like early detection of cancers from images or blood screens.
35:40That is valuable research which will undoubtedly save lives. It's beyond frustrating that the only
35:46kind of AI that is in the public consciousness at the moment is the one that does some very
35:51impressive tricks but which, as any honest person will tell you, needs intense supervision because it
35:57will hallucinate and produce bad outputs. It seems blindingly obvious to me that the stakes are way
36:04too high to hand over our decision-making to a computer which cannot be held responsible for
36:09the decisions it makes. And that's, disturbingly, what I think is the real reason for wanting to
36:15push this future. But you've put up with me long enough. Thank you for watching,
36:21and I hope I don't sound too far off my rocker. I made this channel not just to share cool stuff
36:28I find, but to show all the amazing ways we solved our problems in the past. There are invaluable
36:36lessons there which we forget at our peril. I don't talk much about computer technology because
36:42it doesn't really interest me, and it's becoming less and less interesting as time goes on.
36:48Outside of hardcore enthusiasts or people doing research on vast amounts of data,
36:53let's be honest. Computing is a solved problem. I can use my desktop computer from 2017
37:01to make these videos for you, and it would not get in my way at all, and the $600 nearly
37:06base-spec M2 Mac Mini I bought to dingle around on can do it just as well. Video encoding is getting
37:14more efficient, and we need less bandwidth and storage to send videos like this across the world,
37:19and that's despite internet connections getting faster and faster. So, I think it's pretty clear
37:26that that reality is why Silicon Valley is doing the stuff it's doing these days.
37:32It has to justify itself as a center of innovation in a world where it's running out of runway.
37:38And the best answer it's got is, eugh, computers which pretend to think.
37:44Forgive me, but I want to think for myself, and I think you should too.