In the 1970s, at the dawn of personal computing, people like Steve Jobs and the scientists at Xerox PARC talked about computers as "bicycles for our minds". Sure, someone was going to make big money selling these hardware units, but the intention was at heart quite pure; computers would give our minds wheels to go farther than ever before. Our capabilities would be augmented by technology, and we would become smarter and more capable. That ethos has not really stuck, and today we find ourselves in a Pavlovian relationship with push notifications, incapacitated by the multi-directional pull on our attention spans.
We've made it through every new technological wave—newspapers, radio, TV, laptops, cell phones—without the social decay that was widely prophesied, but there's something different about smartphones loaded with apps living in the palm of our hand, says tech ethicist Tristan Harris. It would be a mistake not to recognize how, this time, it really is different. Companies today are not more evil than they were in the 1970s; what's changed is the environment they operate in: the attention economy, where the currency is your eyeballs on their product, for as long as possible—precious exposure that can be sold to advertisers. Unlike the neutral technology we once used and could walk away from, today's technology uses us. Behind every app—Facebook, Twitter, Snapchat—are 1,000 software designers working every day to update the product and find new psychological levers to keep you hooked. The most powerful development has been the 'like', public feedback that externalized our self-worth onto a scorecard (this has reached new heights with Snapchat's streaks, which research by Emily Weinstein at Harvard has shown put extreme stress on kids and adolescents). "These products start to look and feel more like media that's about maximizing consumption and less like bicycles for our minds," says Harris. Is it too late to do something about the attention economy? To find out more about Tristan Harris, head to http://tristanharris.com.
Read more at BigThink.com: http://bigthink.com/videos/tristan-harris-social-medias-dark-side-how-connectivity-uprooted-our-self-worth
Follow Big Think here:
YouTube: http://goo.gl/CPTsV5
Facebook: https://www.facebook.com/BigThinkdotcom
Twitter: https://twitter.com/bigthink
Well, there's a really common misconception that technology is neutral and it's up to us to just choose how to use it.
And so we're sitting there and we're scrolling and we find ourselves in this kind of wormhole and then we say, “Oh man, like, I should really have more self-control.” And that's partially true, but what we forget when we talk about it that way is that there's a thousand engineers on the other side of the screen whose job it was to get my finger to do that the next time. And there's this whole playbook of techniques that they use to get us to keep using the software more.
Was design always this manipulative? It wasn't always this way. In fact, back in the 1970s and the early '80s at Xerox PARC, when Steve Jobs first went over and saw the graphical user interface, the way people talked about computers and what computers were supposed to be was a “bicycle for our minds”: you take a human being, who has a certain set of capacities and capabilities, and then you give them a bicycle and they can go all these new distances—they're empowered to go to brand-new places, to do new things, to have new capacities.
And that's always been the philosophy of people who make technology: how do we create bicycles for our minds that empower us to do, feel, and access more?
Now, when the first iPhone was introduced, that was also the philosophy of the technology: how do we empower people to do something more? And in those days it wasn't manipulative, because there was no competition for attention. Photoshop wasn't trying to maximize how much attention it took from you—it didn't measure its success that way.
And the Internet overall, in the very beginning, wasn't designed to maximize attention; it was just putting things out there, creating these message boards.
It wasn't designed with this whole persuasive psychology that emerged later. What happened is that the attention economy and this race for attention got more and more competitive, and the more competitive it got to hold people's attention on, let's say, a news website, the more they needed to add these design principles, these more manipulative design tactics, as ways of holding onto your attention.
And so YouTube goes from being a more neutral, honest tool of just, “Here's a video,” to, “Oh, do you want to see these other videos? And do you want to auto-play the next video? And here's some notifications…”