The UK is "walking down a path towards a surveillance state", a technology expert warns in the latest episode of The Data Capital podcast. Stephanie Hare voices her concerns about the different ways that technology and data are affecting our lives in a wide-ranging interview in We Begin As Data.
Hare, keynote speaker at The Scotsman's annual data conference on September 27th, also warns of the addictive power of tech platforms like TikTok and their potentially damaging long-term effect on young minds.
Transcript
00:00 [MUSIC PLAYING]
00:02 Hello.
00:03 I'm David Lee, and welcome to the latest episode
00:06 in the Data Capital podcast series, brought to you
00:10 by the Scotsman and the Data-Driven Innovation
00:12 Initiative, part of the Edinburgh and Southeast
00:15 Scotland City Region Deal.
00:17 This episode is called "We Begin as Data,"
00:21 and features Stephanie Hare, a fascinating thinker and writer
00:25 on technology issues.
00:27 Stephanie wrote "Technology is Not Neutral,
00:30 A Short Guide to Technology Ethics,"
00:32 which includes a chapter called "We Begin as Data,"
00:35 an intriguing statement which we discuss in this podcast.
00:39 We also talk about many other issues,
00:42 including the state's use of facial recognition technology
00:45 and the addictive power of tech and social media,
00:49 and how to beat that addiction.
00:52 But I kicked off by asking Stephanie Hare what
00:54 she meant by that statement, "We Begin as Data."
00:59 I think about data a lot.
01:02 In my training as an historian, I
01:04 would think about data as what I would go and find,
01:06 for instance, in the historical archives.
01:08 So if I wanted to answer a question, which
01:11 I have done, about French history,
01:13 I might have to go to different French archives,
01:15 and I'm pulling out effectively what are data sets.
01:17 It doesn't just have to be a data set in a computer.
01:20 It used to be in boxes, and for historians often still is.
01:24 So I was thinking about how what's
01:27 in an archive versus what is not in an archive.
01:31 It's very frustrating when you're an historian
01:33 because you're going to go and look for something,
01:35 and you can't find what you need.
01:37 But what you also will find is extra stuff that is there
01:40 that you weren't expecting.
01:41 And there are different kinds of data sets.
01:42 They're found.
01:43 They're created.
01:45 And then there's always the anti-data set,
01:48 which is the shadow data set.
01:50 It's the data set that you wish existed but doesn't,
01:53 or that you can't find yet.
01:56 And again, for the historian, it's always the fear
01:58 that you finish writing your book,
02:00 and then somebody finds some amazing data--
02:04 journals, papers, whatever-- in somebody's attic
02:07 that completely refutes what you just spent years working on.
02:10 So I think of data in that sense from my training.
02:15 And it made me think about how other people
02:16 from other disciplines think about data.
02:18 So physicists might be thinking about data
02:21 in terms of just pure raw elements,
02:24 and what's happening, and how those elements and particles
02:26 and quarks, et cetera-- subatomic particles-- are behaving.
02:30 Chemists will think about data in terms
02:32 of the behavior of the periodic table
02:34 and what's happening there, and what
02:36 you can do in different combinations, or even proteins,
02:38 or again, synthetic chemicals to create
02:42 that data for themselves.
02:44 Biologists are thinking about data in terms of life--
02:47 flora, fauna, what's disappeared,
02:50 what used to be on Earth and has left traces, what is appearing,
02:55 and how it's all interacting.
02:56 So I had all of these different ways
03:00 of thinking about data, from the social sciences
03:02 to the hard sciences, swimming around in my head
03:06 literally while I was swimming one night.
03:08 And I started thinking that actually, I'm
03:11 overcomplicating this, because all life begins with data.
03:16 It begins with DNA, and it begins with our genetic code,
03:22 which you can use to identify anything and link and connect
03:26 anything.
03:28 And therefore, that's a really powerful way
03:32 of thinking about everything we're doing,
03:34 because if I'm looking at studying how human beings are
03:36 existing in the world and their interactions with technology,
03:39 which has been my interest for the past several years,
03:43 if I can turn anything into code,
03:45 if I can make anything a data set and analyze it and recognize
03:49 its limitations and its potentials,
03:51 and if it's not there, how would I create it,
03:53 and all of that stuff, that just became--
03:56 it just felt very philosophical to me.
03:57 It felt like an absolute insight that
04:00 stopped me in my tracks-- this data is everywhere.
04:04 And I had never really thought of it that way before.
04:06 I hadn't joined it that way before, I guess.
04:08 So that was something I wanted to share,
04:11 because my goal was to try to bring as many people as
04:13 possible around the table for the discussion.
04:15 And I get so frustrated sometimes
04:17 when you hear that people in the arts and humanities
04:20 can feel really shut out of conversations about technology.
04:24 You have this sense sometimes that the only way
04:26 we will think about tech is to look at the STEM subjects,
04:31 or only people with hard science degrees,
04:33 or only people with computer science degrees.
04:35 But I just fundamentally reject that.
04:37 I think technology and tools are for everyone.
04:40 Human beings make them.
04:42 It's part of our inheritance as a species.
04:45 So I wanted to take that massive, massive, huge approach
04:49 and then boil it down to just that very beginning phrase,
04:53 we begin as data, which also is a historical thing.
04:55 It's not just that we elementally begin as data
04:58 and then grow out from that.
04:59 It's also like when I was at CERN last year,
05:04 I was very kindly invited to come and give
05:06 a talk for a couple of days.
05:07 And then I got to walk around with a physicist
05:09 for a few hours who showed me one of the experiments
05:11 that they were working on.
05:13 She was in her 60s, towards the end of her career,
05:16 and I'm probably in the middle of mine.
05:19 And I asked her about this huge machine they were working on.
05:22 I got to go underground in a hard hat, pretty tough,
05:25 steel-tipped shoes, the lot, all the safety gear.
05:29 And I said, what on earth is this thing for?
05:31 Because it was probably the size of a football pitch,
05:33 I mean, massive.
05:34 Your head can't even wrap itself around what you're looking at.
05:37 And she said, we're looking at it
05:39 for the origins of the universe.
05:42 And I just spontaneously laughed and said, oh,
05:44 so you're like a historian, because that's what I do.
05:47 I'm also looking for the causes of things,
05:50 the beginning of it all.
05:53 Whatever situation I'm looking at,
05:56 as a historian I will be like, what was the beginning of it--
05:58 what was the trigger?
05:59 What was the cause?
06:00 Oh, there's a context.
06:01 OK, let's study the context.
06:02 She's doing the same thing with her physics colleagues.
06:06 What started the universe?
06:07 What was the context in which the universe came into being?
06:10 I just loved that.
06:11 We had a really good chat about it,
06:13 very interdisciplinary chat.
06:15 There's a huge amount to digest and think about there.
06:18 But let's come back to when Stephanie was swimming
06:20 and her watery insight, we begin as data.
06:24 We are DNA.
06:25 We are code.
06:27 How did that develop into Stephanie's thinking
06:29 as she wrote Technology is Not Neutral,
06:32 particularly how different data markers on our body
06:35 are being used to identify us?
06:37 So I'm completely obsessed with the human body
06:40 as a feat of engineering
06:42 as well as a thing of beauty.
06:45 What it can do is just incredible.
06:47 So this idea that this code tells you--
06:50 it's a set of instructions that kicks off a set of processes
06:56 to build--
06:58 for humans, from the sperm and the egg--
06:59 to build a human within nine months.
07:03 And the woman with her body knows how to birth that human.
07:06 And it's insane.
07:07 It's just crazy instinctively what we come pre-coded with,
07:10 if you will.
07:11 And animals do this too.
07:12 And plants are doing their own thing.
07:15 I love that there's this code that is rendered into us
07:19 physically that is effectively a set of instructions,
07:23 but also aptitudes, like families that
07:26 tend to be very good at music.
07:27 Now, you could always do that thing of, is it nature
07:29 or nurture?
07:31 I won't pronounce either way.
07:33 I don't want to swim outside my lane, so to speak.
07:35 But I do think a lot of families will be like, yeah,
07:37 this runs in our family.
07:39 And it might not just be educational.
07:40 It might be kids who are really good at sports,
07:44 or kids who are really good at music, or art, or sculpting.
07:46 Now, were they doing that because they grew up
07:48 seeing mom and dad or somebody in their family doing it?
07:50 Or is there just sheer raw talent, like TBD?
07:53 We don't know.
07:54 But I love the idea that these things are in us, potentially,
07:59 biologically, and that also your environment can change stuff.
08:03 So you get this thing called epigenetics,
08:05 where you might-- for instance, I probably
08:07 had a marker in me that meant I would have a tendency
08:09 to get asthma.
08:10 But I didn't have it till I moved
08:12 to London, which is a filthy city where
08:14 the air is disgusting.
08:15 And after breathing it for a certain period of time,
08:18 I got adult-onset asthma, which I might not
08:21 have had if I had lived in a cleaner town, for instance.
08:26 So that's a pretty good example--
08:27 and loads of people will, I'm sure,
08:29 be able to relate to that--
08:30 of your environment can affect your health.
08:32 It can change you.
08:34 But also in terms of other qualities.
08:36 So that's interesting.
08:38 But then you can start getting into the whole thing
08:40 of your identity.
08:41 So how do you know who you're related to?
08:45 Or if you're a certain blood type,
08:46 or if you need a bone marrow transplant or a donor.
08:50 All of that has to be checked and coded.
08:53 It's not random.
08:53 We don't just randomly take these samples
08:56 and hope it works out.
08:57 We check it now, which is fascinating and fabulous.
09:00 And that same analogy can even be turned into policing.
09:03 So everybody knows about DNA and fingerprints in policing.
09:07 But it's actually fascinating how
09:10 you can identify a human being based
09:13 on their face, on their voice.
09:16 So it's not just a photograph.
09:18 You have a voice print, which can then be faked.
09:22 So you get deep fakes for your face,
09:24 deep fakes for your voice.
09:25 This question of how you would identify someone using
09:28 their body data in a world now of deep fakes,
09:31 like great new problems that we're going to have to solve.
09:34 But even how they walk.
09:36 There's even some behavioral indicators
09:37 of how people type or how they use their mouse pad.
09:41 Ways that you can tell that you are you,
09:45 or that you are part of a network, which
09:47 you might use for crime busting or counterterrorism
09:51 surveillance.
09:52 But you might also use just for advertising things
09:55 or trying to work out consumer trends with people.
09:59 These things can be nefarious or they can be benign.
10:02 It's the whole thing of like it's not neutral.
10:03 It's something.
10:04 I don't know what it is, but it's something.
10:07 So the idea that all of this comes from our physical selves,
10:14 that we are physical, as much as we move into a digital world.
10:16 And I know there are some people who are living very, very
10:19 digital lives.
10:20 And we had the hype of the metaverse.
10:22 And we've had discussions of augmented reality,
10:25 virtual reality, mixed reality.
10:27 We just mentioned deep fakes of like, can you
10:29 trust even what you think you are seeing and hearing now?
10:32 What is reality?
10:33 A lot of people will link that back to our physical lives.
10:37 And the way that you can do that is encoded now,
10:41 which I just find fascinating.
10:43 And how did that lead you into writing the chapter in the book,
10:47 particularly around biometrics and how
10:50 they were being used by, particularly by law enforcement
10:53 agencies and your concerns?
10:55 This is heading towards, I guess,
10:57 what you would term the nefarious,
10:59 to use the word that you've just spoken.
11:02 In the United Kingdom, there is a specific context
11:04 around surveillance that is linked in that country
11:09 to the conflict with Irish Republicans
11:13 and the independence movement in Northern Ireland
11:17 and also the British presence there.
11:20 As a result, the UK has a tolerance
11:25 for CCTV and surveillance that is
11:27 really remarkable for people who are not from here
11:31 and move here and are like, my God.
11:34 And I say that having moved to the UK in 1998,
11:38 the year of the Good Friday Agreement.
11:41 So I've been very lucky to benefit
11:43 from living in a time of incredible peace;
11:45 so many friends and colleagues and family
11:47 told me about what it was like here before,
11:49 when it was a very real threat.
11:51 So I think there's that whole thing of,
11:53 we're really happy to have surveillance, nothing to hide,
11:56 nothing to fear, if it helps keep people safe.
12:00 And I get that.
12:00 And obviously, the United States and many other countries
12:03 around the world had, I won't say similar reactions,
12:05 but certainly adjacent reactions after 9/11,
12:08 where we saw surveillance really step up as well because we
12:12 had to deal with Islamist terrorist movements
12:15 and their actions.
12:17 So I get that for a lot of people,
12:20 having this kind of stuff, and they'll
12:22 have it on their Amazon Ring doorbell,
12:24 or they're putting surveillance cameras in pubs
12:26 and they're putting them in bookshops and the like,
12:29 they think it helps fight shoplifting,
12:32 keeps people safe, et cetera.
12:34 It's a hop, skip, and a jump from that kind of surveillance
12:39 to then being able to identify people using facial recognition
12:42 technology.
12:44 Technically, it's a bit of a leap, but not impossible.
12:48 It's more of a cost thing to put that kind of technology
12:51 onto your existing infrastructure
12:53 or simply swap it out.
12:55 And the next thing you know, you are living in a country,
12:58 as we are, that has pubs that are keeping private watch lists
13:02 that they're sharing with the police that are completely
13:04 outside of democratic control.
13:07 We know this.
13:08 Gordon's Wine Bar in London pioneered it, but it's all over.
13:11 It's in shops now.
13:14 It's anywhere that you want it to be.
13:16 Media reports have suggested this surveillance technology
13:19 goes right to the heart of government,
13:22 with the Home Office recently criticised
13:25 for interfering with the work of the Information Commissioner's
13:28 Office, and shockingly, being described as a known risk
13:32 with regard to facial recognition technology.
13:35 They really, really want to have facial recognition technology
13:39 wherever they would like in this country.
13:41 And they're really exploiting a weird situation
13:46 in the British Parliament, which is that the Science and
13:48 Technology Committee in 2019, before the pandemic,
13:51 said we would really like the police to stop.
13:54 Just like, stop.
13:55 We'd like everybody to stop, in fact.
13:56 Just down tools until we can get a proper legislative framework
13:59 in place, which I think is pretty clear.
14:03 However, it's just a committee, the Science and Technology
14:06 Committee, and Parliament has done nothing since then
14:12 to legislate.
14:12 It's a big ask to get people to get off their couch
14:16 and care about this, I know.
14:18 But we are so walking down the path towards a surveillance
14:21 state, and it's like people just, I don't know,
14:24 it's like learned helplessness or something.
14:26 They just don't care.
14:27 And you've also got serious concerns,
14:29 not just about the surveillance state per se,
14:31 but actually the kind of technology that's being used,
14:34 and whether it actually works.
14:36 That's a pretty massive question.
14:38 Yeah, and the thing is, we know it doesn't work.
14:42 It's not like, oh, I wonder if it works.
14:43 Let's commission some studies and report back and have
14:46 a committee in two years to give us some information.
14:49 We know this.
14:50 And we know this because we're not the only country using it.
14:53 And one of our dear friends, my country of origin,
14:56 the United States, has been publishing many, many studies.
14:59 And journalists are doing some really great work over there
15:03 showing that the people it most likely doesn't work on
15:07 are people with darker skin.
15:08 And we've had several Black men in the United States
15:11 wrongfully arrested by the police.
15:13 We all watched what's happened in the United States
15:15 for years now, and particularly after George Floyd.
15:18 I think people are very, very aware of that.
15:21 So to use a tool that doesn't work on a population that
15:26 already has very delicate, sensitive relations
15:30 with the police, not just in the United States,
15:32 but also here in the United Kingdom--
15:35 the London Metropolitan Police is in special measures
15:37 right now for racism and sexism and misogyny.
15:40 It really is not the time, in my humble opinion,
15:44 for that force to be using tools that people
15:47 can't have confidence in.
15:50 And it's not good enough.
15:51 In my opinion, it's not good enough
15:53 that it works on white men very well.
15:57 It's probably going to work OK on white women.
16:01 But it's not going to work as well on people of color.
16:03 It can really mess up with aging--
16:06 so older people, younger people, trans people.
16:10 If you live in a democracy, you need
16:12 to be using tools that work on the whole population.
16:15 And if they don't, I think you park it.
16:19 You just park it.
16:20 We don't have to ban it, but we could have a moratorium.
16:23 And we could actually have parliaments
16:26 pass a law, a proper framework.
16:28 We need primary legislation.
16:30 We would need expert testimony of scientists and the like
16:34 coming in and explaining this.
16:35 And we'd want to hear from police and the security
16:37 services about their concerns.
16:40 And we would actually get a real framework around this.
16:42 Because this is what I don't understand,
16:44 is if everybody in our law enforcement audience
16:46 is really happy to use it, it actually
16:48 would benefit them to have that framework as well.
16:52 Because what's going to happen is
16:53 they're just going to get sued.
16:55 And we saw that with the South Wales police.
16:57 They did get sued, and they lost.
16:59 And it was a landmark decision, which
17:02 hasn't seemed to dampen that force's desire to continue
17:04 using the tech, by the way.
17:06 They're kind of like, OK, we'll take the feedback
17:08 and see if we can fix it and keep using it.
17:10 But the point is, that made headlines around the world.
17:12 And it's not good for British policing
17:15 to be in the news for that kind of a reason.
17:17 People do read the press and listen to the radio.
17:20 They're aware of what's going on.
17:22 So I feel like it would also give our friends in law
17:25 enforcement and the security services
17:27 greater confidence as well.
17:28 Yeah, sorry, Stephanie.
17:29 You said earlier on that you thought
17:31 the UK had this greater tolerance to surveillance
17:34 as a result of its modern history.
17:38 How do you think that then manifested itself
17:40 in what we were asked to do during the pandemic in terms
17:45 of using government apps and so on?
17:48 Do you think that history affected the way
17:52 that the government approached that issue and the way
17:54 that people responded?
17:56 That's interesting, isn't it?
17:57 I don't know if I can generalize it,
17:59 because there were so many different reactions.
18:01 I mean, first of all, it just became a question
18:03 of technological privilege.
18:06 So if you're able to use an app, that
18:08 means you have a smartphone.
18:10 So what kind of phone do you have?
18:12 It has to be a phone that is compliant with that software.
18:15 That was not the case for lots of people.
18:17 So it was a self-selecting group of people
18:19 who not only had the phones, but then
18:23 were happy to give it a try.
18:26 So is that going to be a younger demographic?
18:28 But again, I'm speaking very broadly here.
18:30 It may have been some very trendy older people who
18:32 wanted to give it a go as well.
18:33 In fact, I know there were.
18:34 But you might have seen it generally skew
18:36 towards a younger audience, perhaps
18:38 a more affluent audience who could
18:39 afford iPhones or Androids.
18:42 Then there's the attitude, the cultural attitude of like,
18:46 are you somebody who is really happy doing
18:49 lots of stuff via an app?
18:52 So you might already have been using the NHS app
18:54 to book appointments and keep track of medication
18:57 because perhaps--
18:58 again, I'm making a generalization here.
19:00 For example, you might have been a younger person who likes
19:02 to do everything on your phone.
19:04 If you're somebody who doesn't want your medical data
19:06 anywhere near an app, even a government app,
19:10 especially a government app, just doesn't want to go there,
19:13 you might have felt a bit nervous about using
19:17 this kind of an app.
19:17 There was also the question of, would the app somehow
19:20 be tracking people?
19:22 And the UK one did not.
19:25 I was really happy to see the communication that
19:27 was constantly coming out from the app designers
19:30 within government to show people how it was designed.
19:33 You really saw all of the tech ethicists
19:36 and other scientists in this country step up and critique
19:39 it.
19:39 So everybody really raised their concerns,
19:42 and those concerns were reflected in the design.
19:44 So that's a happy story and outcome for the UK,
19:46 less so in a country like Singapore,
19:49 where they were like, no, no, no.
19:50 Everyone has to use this app.
19:53 We would obviously never use this in a policing context.
19:56 It's just for health.
19:57 And of course, they were lying, and they did use it
19:59 in a policing context, and it got out,
20:01 because these things will always get out.
20:03 So that destroys trust, not just in Singapore,
20:05 but obviously for everybody else who was watching that
20:08 and going, we told you that was possible.
20:10 You said we were crazy.
20:11 Exhibit A. So I think that's more the thing,
20:16 is people are very astute and aware, I think,
20:20 about power.
20:21 So if they were doing it voluntarily,
20:23 the British approach, I thought that was quite interesting.
20:26 I think if they had tried to make it mandatory,
20:28 it would have been a disaster.
20:31 When you came to writing Technology Is Not Neutral,
20:34 Stephanie, and you're looking at areas
20:36 like facial recognition data.
20:38 You're looking at how we behaved during the pandemic
20:43 in terms of using apps and data capture and so on.
20:47 Where were you at the end of your research compared
20:49 to the beginning?
20:50 Were you more concerned about how governments and states
20:56 were using our data than you were before you started?
21:01 I was more in a hammer is always in search
21:05 of a nail frame of mind.
21:07 So I talked a lot with doctors during the pandemic.
21:11 I was part of a research group that
21:12 was working with a local authority here in London, which
21:15 was fascinating.
21:17 I have some friends who are doctors,
21:18 and I was constantly asking them,
21:20 is this what you would want?
21:21 And they were like, no, what we want
21:23 is for people to stay at home.
21:24 What we want is PPE.
21:26 And then what we want is a vaccine.
21:28 And then what we want is for people
21:30 to get vaccinated and boosted.
21:32 So it was fascinating because if you only hung out
21:35 with other technologists who were interested in the app
21:38 as a solution, for them, the app was the be all and end all.
21:44 We have to do it.
21:45 And for most of the medical professionals and scientists
21:49 that I was talking to--
21:50 and I'm not pretending I was talking to all of them.
21:52 It's such a busy time.
21:53 But I did talk to a bunch of them.
21:55 They were all just like, no.
21:58 Which was, I thought, a really nice exercise in humility
22:01 because if you're standing in your little bubble talking
22:04 to the other people who are in the bubble with you,
22:07 you're going to see the world through that bubble.
22:09 And what you need to do is pop it and go and talk
22:13 to the people who are actually dealing with the problem
22:15 and ask them what they want.
22:16 And that wasn't what they wanted.
22:18 It isn't what they said they needed.
22:19 Then it becomes a question of, was this a good return
22:22 on investment in terms of time and money?
22:24 Which isn't to say we shouldn't have done it, by the way,
22:26 because I think it's easy now, in retrospect, in 2023,
22:31 for us to forget how terrifying it was
22:33 before we had the vaccine.
22:35 I understand that people were like, Jesus Christ,
22:38 we have to throw everything we can at this.
22:41 And we might as well try it.
22:42 And if it doesn't work, fine.
22:44 I am all about that.
22:45 Like, I get it.
22:46 You have to try stuff and stuff doesn't work all the time.
22:48 One idea out of 20 might work.
22:51 And you have to, unfortunately, try them all.
22:53 In a crisis, certainly.
22:54 So I think it just gets filed under lessons learned.
22:58 And my book has a chapter in it that I hope
23:01 will humbly help a bit with those lessons learned.
23:06 God forbid, if this ever happens again.
23:08 It was also really helpful being American, by the way,
23:10 because I talked to my family and friends who
23:12 were in different parts of the country.
23:14 My friends in Texas were like, there will be no app here.
23:19 Nobody's doing any of this.
23:20 Like, the British approach was so British, right?
23:23 And the American approach was different.
23:25 And you could have a lot of discussions about that
23:27 and say, yeah, but look at America.
23:29 It lost so many more people, right?
23:30 Like, so many more people died.
23:32 Was that because of an app or not?
23:34 Probably not, to be honest.
23:35 It's the whole package, right?
23:36 These things are very, very complex.
23:38 I'm just mentioning it to say it was very helpful for me,
23:43 having family and friends in the US, family and friends
23:46 in France, and then in the UK.
23:48 Because it meant that even if I didn't want to pop my bubble,
23:51 they would pop it for me and be like, why are you
23:53 guys so obsessed with this?
23:56 We're not doing this.
23:57 We don't care.
23:58 But they were doing other stuff.
24:00 And that's what's amazing about the pandemic
24:02 as a phenomenon from a research perspective,
24:04 is it allowed us to study so many different approaches.
24:08 And I just hope to God that we've
24:10 learned the absolute best lessons that we
24:12 can for the next time.
24:13 I'm sure someone is collating them somewhere.
24:15 You're listening to the Data Capital Podcast, brought to you
24:19 by the Scotsman and the Data-Driven Innovation
24:21 Initiative.
24:23 The podcast looks at how data is reaching
24:25 into all areas of our lives, the benefits and challenges,
24:29 the opportunities and threats.
24:32 I'm talking to Stephanie Hare, one
24:34 of the most insightful modern commentators on data
24:38 and the author of Technology Is Not Neutral.
24:41 I next ask Stephanie what she thought people should
24:44 be more concerned about, the way the state uses our data,
24:48 the threat from the big tech corporations, or maybe both.
24:53 OK, I would say that I don't think
24:55 it's very productive in life to go around being worried,
24:57 unless you're going to take actions.
24:59 So I would say more.
25:01 I think we all know now that surveillance capitalism is
25:05 a thing and that any company that is operating online
25:07 wants to take your data and use it for stuff.
25:11 And ditto governments for influencing elections, noted.
25:16 Ergo, what are you doing to minimize your data footprint?
25:20 I feel like the time to worry about this was a long time ago.
25:22 We all know now.
25:23 So it's more like, what are you doing?
25:26 Don't put pictures of your kids online.
25:28 Just don't.
25:30 I don't know how much clearer I can make that.
25:32 Just don't put pictures of minors online.
25:37 Be really careful about all of the stuff
25:39 that you're putting online without going around
25:41 with a tinfoil hat being paranoid,
25:42 because we have to transact.
25:44 But that doesn't mean that you can't be using really good
25:47 cyber hygiene practices.
25:48 And just be aware.
25:51 Be aware that when you're in there, you're being observed.
25:54 When you are online, you are being observed as much
25:56 as if you were in a room that's got sensors and cameras
25:58 all over you.
26:00 How would you comport yourself?
26:01 If you could see it, that might change how you act.
26:05 So it's not a case of worrying anymore.
26:08 I think now it's just a case of deciding your approach
26:11 and executing.
26:13 And how did you amend your own approach
26:16 towards your own online behavior as a result of years
26:21 of research and a greater, deeper understanding of what
26:26 was actually going on as a result of that research?
26:28 Well, first, I've tried to read a lot more.
26:32 There's some just incredible-- we're in a real golden age
26:35 in fields of scholarship on these topics,
26:38 particularly in artificial intelligence
26:40 and sort of what's coming, the future of this,
26:44 but also in terms of data ethics, technology ethics, AI
26:47 ethics.
26:47 So I've been trying to read books more and be online less.
26:53 I used to be online all the time for work, for my job.
26:55 So I had to be.
26:56 I was a political risk analyst for five years,
26:58 and it was a great way to get information
27:00 that would take 24 hours to get into the papers.
27:03 I could get it very, very fast online.
27:06 I don't know if that's as useful now.
27:08 I think social media has really changed.
27:11 So I would say I'm not on social media very much,
27:15 and I'm not on it as often, anywhere near as often.
27:18 I'm reading far more books and trying to go to talks
27:22 and listen to people talking.
27:25 And then the big one, I would say,
27:26 was actually paying attention to what
27:28 it had done to my ability to concentrate and to read.
27:32 Because I have to read for my work,
27:34 but also I like to read for my life,
27:36 I was really horrified to notice how much time--
27:40 I started tracking how much time I was spending online.
27:43 And then I could kind of see how few books
27:46 I was reading in comparison.
27:47 And it doesn't take an Einstein to go, hmm,
27:50 if you got off Twitter, you'd have more time for reading,
27:53 which is then where you have to confront the fact
27:55 that you might be socially addicted to these sites
27:58 because they're designed to make you very addicted to them.
28:01 Or it's how you interact with your friends.
28:03 So if you're on Instagram or Facebook or whatever,
28:06 no judgment, I've been there, it can be really, really hard
28:09 to rewire those patterns with your friends.
28:13 But once I did that for myself, at least,
28:15 and it took a long time, I have liberated a huge amount of time
28:18 that I have then greedily filled with reading.
28:23 That's what I've done.
28:24 I can't claim, by the way, that this is some path to happiness
28:27 or wealth or weight loss.
28:28 I'm just telling you it made me a lot happier.
28:32 And what I love about reading books is it's not being tracked.
28:35 No one knows.
28:36 It feels deeply radical in 2023 to live an untracked life.
28:41 Perhaps there's another book in there,
28:43 the radical read, perhaps.
28:45 But what would Stephanie Hare say
28:47 to those who are not as immersed as she is in trying
28:50 to make sense of technology in 2023?
28:53 The tech world is so all-pervading in our lives
28:57 that many of us perhaps don't even
28:59 think about thinking about it.
29:01 But if they do, where to start considering
29:05 your own online behavior?
29:07 I mean, I don't know enough about what everyone else is
29:10 doing because that's just so hard, isn't it?
29:12 I can tell you anecdotally, I know a lot of people
29:15 have gotten either off social media or really restricted it
29:19 or being much stricter with their children.
29:21 But a lot of friends who have children
29:23 tell me that their kids went from being big readers
29:26 to suddenly not reading as soon as they got on TikTok.
29:29 And I don't want to single TikTok out
29:31 because I personally think it's all of them.
29:33 But I think TikTok has just got a very good algorithm that
29:36 just keeps you so addicted.
29:39 And again, The Guardian just posted a horrible study
29:41 saying that over half of UK children
29:44 don't read now for fun, for leisure.
29:48 And we just-- we don't know, I think,
29:50 what we're doing to young minds at all with this.
29:52 That really horrifies me because reading is a skill,
29:54 but it's also such a great way to learn about the world
29:58 and empower yourself.
30:00 The idea that we would be dumbing our kids down
30:02 kind of freaks me out, to be honest.
30:04 And I've noticed that with friends who have kids,
30:06 that they either seem to feel, oh, god, there's
30:08 just nothing we can do, or they have
30:10 to clamp down on the phone.
30:13 And the phone itself as a thing that's not allowed at dinner,
30:17 so you're actually having conversations.
30:19 Otherwise, children don't know how to socialize.
30:23 I think that's real, and we need to pay attention to that.
30:25 I think we have to--
30:26 all of us maybe kind of understand
30:27 a phone is not just a phone.
30:29 It's so powerful.
30:32 And it's easy to underestimate that
30:34 because it's this cute little device
30:36 that you can just have in your hand all the time.
30:38 And it's packed with useful stuff.
30:40 But it's also really powerful.
30:43 So I feel like that's a big one.
30:45 You could just start there and be like, A, how often am I
30:48 or my loved ones on our phones?
30:50 Does it interfere with our interactions at dinner?
30:54 Are people not going to bed on time?
30:56 Or are they sleep deprived because they won't stop
30:58 using the phone?
30:59 Is anybody seeming addicted?
31:01 Are we reading or not as an individual or as a family?
31:05 Not to say that you should be doing these things.
31:07 I'm just saying to be mindful and to ask yourself
31:10 the question can be, I think, a very powerful exercise.
31:15 But what is it about TikTok and that immediacy,
31:18 that short, sharp, often humorous video content
31:21 that is so addictive?
31:22 If you're a parent, for example, is
31:24 it difficult to have an open discussion with your kids
31:27 about this?
31:28 I mean, I'm not a parent.
31:30 So I don't want to judge anybody who's a parent if they are like,
31:34 I sometimes let my kid go on this stuff
31:36 or they might be doing it a bit too much.
31:37 And yeah, they're probably not reading as much as they used to.
31:40 What I would say is this.
31:42 It would be very useful to educate yourself
31:44 about how China governs and regulates
31:46 the use of TikTok by children in China compared to elsewhere.
31:51 They get different content.
31:53 They have restrictions on how long you can be on it
31:56 because they know it's addictive.
31:58 Like, everybody who's making this stuff
32:00 knows it's addictive.
32:01 It's not like, oh my god, we had no idea
32:03 we were messing up the kids.
32:04 They know.
32:06 They know and they're like laughing all the way
32:07 to the bank about it.
32:10 That's worrying.
32:12 I find it super worrying.
32:13 So I think I would probably be really old school parenting
32:16 and be like, you're not on it.
32:17 You're just not on it.
32:18 You're not on TikTok.
32:19 And then the kids would hate me.
32:21 So it's really easy for me to say that because I don't
32:23 have to be the hated parent.
32:25 But I'm super worried about this not reading thing,
32:28 the not having concentration, not being
32:31 able to handle being bored.
32:34 These are life skills.
32:36 You have to be able to cope with boredom
32:39 because it's from boredom that your brain then solves
32:41 boredom by getting creative.
32:44 So you have to go through that and learn
32:45 how to deal with that.
32:47 And if you're just constantly going to the slot machine that
32:50 is TikTok or Snapchat or whatever,
32:51 but again, I think TikTok is just head and shoulders
32:55 above the rest in terms of that level of addiction,
32:59 your brain is a thing that you train.
33:03 You can train it to do almost anything.
33:05 So what are you training it to do if all you're doing
33:09 is being on these sites?
33:11 And for young brains in particular,
33:12 they are so plastic, so malleable.
33:16 That's the time when they should be getting
33:18 the best mental diet, just like you don't
33:21 want to feed your kids crap.
33:23 You want to feed them really healthy, the best healthy food
33:27 that you can afford.
33:28 Ditto, you want them to be doing the best things for their brain
33:33 to train it.
33:33 The human brain is still growing, I think,
33:35 up to the age of 25 now.
33:37 But you can still train it even then.
33:39 Even for myself, getting myself off of these sites
33:42 as much as possible, I have been rewiring my brain,
33:47 not just with reading and the concentration that
33:49 comes from that, but studying languages
33:51 and getting back to feeling like I'm using my brain again.
33:54 Because otherwise, you're not really using it.
33:56 You're just passively consuming.
33:58 And most of what you're consuming--
33:59 I know people learn a lot on TikTok and the likes.
34:02 I don't want to totally slam it.
34:03 These things can be very useful.
34:05 But I think you also don't want to lie to yourself.
34:10 TikTok is just-- it is not a library.
34:13 It's just not teaching you critical thinking
34:14 in the same way.
34:16 So is it part of a healthy, balanced mental diet?
34:18 I mean, maybe.
34:19 If you can restrict yourself to only using it
34:22 for a really limited period of time
34:24 and you can fight that addiction, great.
34:27 But know that they are doing everything
34:29 they can to keep you hooked.
34:33 And if you're on TikTok, what are you not doing?
34:36 What are you not doing with the rest of your life?
34:39 I think particularly for kids, there's a piece there.
34:41 And then I'm sure every adult will also recognize it.
34:43 And I get it.
34:44 When you come home from work, you're tired.
34:45 You just want to chill out and do some chilling out
34:49 for 10, 15 minutes.
34:50 By all means, do it.
34:51 But if it becomes hours, and if it's making you feel bad,
34:56 if you're feeling those things, then maybe intervene
35:00 with yourself.
35:02 Consider getting off of it.
35:03 It's really hard.
35:04 I know.
35:05 It's like quitting smoking probably for some people.
35:07 And I have a lot of sympathy with that.
35:09 I found it hard too.
35:10 But once you do, you're kind of like, why was--
35:12 somebody said something to me that really freaked me out.
35:14 They said, if you were to spend all your life on these sites,
35:17 which a lot of people have been doing
35:18 because they've been in existence now
35:20 since the mid-2000s, if you're spending hours on this stuff
35:23 every day, what memories are you making
35:26 to remember at the end of your life?
35:29 You're not really living your life.
35:31 I guess you're at work or at school,
35:32 and then you're on these sites.
35:35 It just-- it was the most depressing and terrifying
35:37 image.
35:38 So I was like, delete, delete, delete.
35:39 They're all gone now.
35:40 And what about the much vaunted role
35:42 of technology and social media in election campaigns?
35:46 We've got some big elections coming up
35:47 and we need to be very, very aware that what you see and hear
35:53 can be really manipulated now in a very different way
35:56 than before, and that we as a society
35:58 are going to have to figure out what to do with that.
36:02 And I know every media organization
36:04 in a liberal democracy right now is wrestling with that.
36:06 It's not like we have the answer, by the way,
36:08 to any of this.
36:09 I think that's a really healthy thing for people
36:11 to talk about with kids and among their friends, et cetera.
36:14 Because I feel like with anything with technology,
36:17 it's kind of like, if this is in my life,
36:19 it should be making my life better.
36:21 And if it isn't making my life better, I want it out.
36:25 That's how I feel now about these things.
36:28 And I think, do I need targeted advertising at me?
36:32 Personally, I do not.
36:33 I know other people who really like it.
36:35 Good for them.
36:36 So I just want the option to make it
36:38 where the default should be you can't targeted advertise
36:42 at me or kids, which is coming into effect now
36:45 in the European Union, but not in the UK, alas.
36:48 So just thinking about it, just not being so passive,
36:52 which I'm sure we all were for a really long time.
36:55 I know I'm sure everybody else that's
36:57 going to be listening to this is very
36:58 sensitive to these issues and aware.
37:00 But it can sometimes still be nice to be
37:02 jolted by a conversation like this or with someone else
37:05 to go, actually, it's true.
37:07 Why do I have listening devices in my house from Jeff Bezos?
37:14 Why do I need to be videoing my cat and dog in the house?
37:18 I have become the panopticon.
37:20 Why?
37:20 And some people will say, that's just really handy for me
37:23 to monitor my pet.
37:24 But it's weird if you go to stay at a friend's house
37:28 and they've got cameras everywhere.
37:29 And you have to be like, do you mind turning that off?
37:32 It's just strange.
37:33 So I just feel like we're in a thing where
37:35 I think becoming aware and having conversations
37:39 with each other about, do we need this?
37:41 What is the actual problem that this is solving?
37:43 Or how is this making life better?
37:46 And if it isn't, can we maybe just not?
37:49 That would be a really good start.
37:51 It seems fair to say Technology Is Not Neutral is actually
37:55 a bit of an understatement.
37:56 So where does Stephanie Hare go from here?
37:59 Where can this researcher, historian, commentator, writer
38:04 who's described herself as a human Swiss army
38:06 knife with many different functions best
38:09 deploy her wide talents in this huge conversation?
38:13 Well, first of all, I usually will work with anybody
38:16 who needs help.
38:17 So people get in touch with me for stuff all the time.
38:20 And if I can help them, I do.
38:21 So I feel like that's a good use of my time.
38:23 Anybody that's in tech distress that needs tech ethics.
38:27 But in terms of setting my own research agenda,
38:30 things that are really on my mind and that I feel
38:35 just are not getting the attention they deserve
38:37 from the tech community,
38:38 are climate change and biodiversity loss.
38:43 It was actually very interesting to me when we had back
38:47 in the spring of 2023, we had a number
38:49 of very notorious and slightly famous statements
38:53 from top AI researchers and people who were in startups
38:56 and who had created OpenAI, and Demis Hassabis
38:59 at DeepMind, lots of very, very interesting people
39:02 as well as academics, Stuart Russell,
39:05 who wrote the book on AI, all coming out
39:07 and signing these things saying that they felt
39:10 that AI was going to become like an existential threat
39:13 to humanity on the level with a pandemic or nuclear war.
39:16 And I was like, they haven't mentioned climate change.
39:21 The crisis that we're actually in,
39:26 they haven't mentioned that.
39:27 I'm really interested in that absence, like why?
39:31 And for all of the use cases that everybody's looking for,
39:35 it just feels like they're all about chat bots.
39:37 We're gonna use it, all these businesses are rushing
39:41 to integrate LLMs, large language models,
39:43 into their businesses,
39:44 even though the National Cyber Security Centre
39:46 just put out two blog posts saying basically don't do that.
39:49 These techs are still in beta.
39:51 They can be tampered with, you can't always identify it
39:55 and there are no fail-safe mitigations, right?
39:57 So you're like, okay, why is everybody rushing to this?
40:00 Answer, money, I guess is the answer,
40:03 but it just feels like the big exciting innovation
40:05 around this is like largely being boiled down to chat bots
40:10 and very energy-intensive searches on things like
40:16 ChatGPT, which I'm not sure always needs to be the case
40:20 when you could just use Google
40:21 and it's like gonna use less water and less power.
40:25 Meanwhile, I'm not hearing anything that's like,
40:29 this is how we're gonna use AI to make supply chains greener
40:34 or energy efficiency within cities,
40:37 which is where the majority of human beings will live
40:39 on this planet by the end of the century.
40:41 And indeed, most are moving there now.
40:44 How we can like use artificial intelligence for healthcare
40:49 in an ethical and responsible way.
40:52 We have so many problems that I feel are not being addressed
40:57 and answered by the people who are getting
41:02 the most attention and money in the AI world.
41:06 And I find that really weird
41:07 and I don't know what to do about it.
41:09 I don't know if like biodiversity loss
41:12 isn't sexy enough for them
41:13 or it isn't profitable enough for them
41:15 or they just don't know enough scientists to talk to.
41:18 I'm just not sure, but that's the thing
41:22 that's gonna take us out.
41:24 And that's the thing that I'd like to see us focus
41:27 as a species on.
41:29 Where is our big, all hands on deck, big picture thinking
41:34 on climate and biodiversity loss?
41:37 So that is where I would like to find my way
41:40 towards working on that.
41:41 I'm not claiming that I have the answer for that,
41:43 but that's in terms of my own time.
41:45 That's where I'd like to spend, I think,
41:49 the rest of my career.
41:50 - And do you still feel when you're in the tech world,
41:53 Stephanie, we talked a bit about this before,
41:55 do you still feel like you're seen
41:58 as a little bit of an interloper?
41:59 You're somebody who's come to technology later in life.
42:02 Do you still feel that the tech community
42:05 and the big decisions are being made by those guys?
42:10 And I use that generally, but it usually is guys
42:13 who have got their computing degree or their physics degree
42:18 from a top American or European university.
42:21 - I think that when you look at who's been invited
42:26 to the White House or to testify to the US Congress
42:29 on artificial intelligence,
42:30 it has overwhelmingly been people who fit the description
42:33 that you just gave.
42:34 I don't know why that is.
42:38 I don't know why an interdisciplinary approach
42:41 or an arts and humanities
42:45 and social sciences approach isn't understood
42:49 by policymakers to be valuable.
42:52 Some of the most exciting scholarship that is coming out
42:55 on artificial intelligence is precisely from people
42:58 with those backgrounds of social sciences, art, design,
43:03 people who can work across disciplines.
43:07 So the idea of having one sole function
43:10 or one sole focus has many advantages.
43:14 Of course it does, but at the same time
43:17 it can be really limiting.
43:18 And in the world that we're living in,
43:20 you really need people who can work together
43:22 across different problem sets and with other kinds
43:24 of experts to solve some of these bigger problems.
43:27 So I don't get it.
43:29 And I'd love to see a move towards something
43:33 that is much broader in our approach.
43:37 I think we'll get better decisions out of it.
43:40 There's an entire group of people
43:41 that you're not talking to,
43:42 much less the rest of society, which is not involved at all.
43:46 That said, I also don't think we need to wait
43:48 to be invited to the party.
43:49 I think people can totally empower themselves
43:53 to be like, cool, if that conversation is not happening
43:55 the way that I'd like to see it,
43:56 then I will throw that party
43:58 or I will throw that conference
44:00 or I will write that book or have this podcast or whatever.
44:03 So there's, I don't like to sort of wallow
44:05 in the despair of it all.
44:06 I think you just have to go, okay, this is clearly,
44:09 they're not getting the memo that we're gonna need
44:12 a lot more than, yes, your typical Silicon Valley
44:16 venture capitalists and people from Stanford
44:19 who have created many, many wonderful things
44:22 but through other lenses could be seen as part of the problem.
44:26 We need to have something
44:27 that's going to be much more inclusive.
44:29 If what you see is not existing, then you have to build it.
44:34 - Given her thoughts on state surveillance,
44:36 the addictive power of social media and much more,
44:39 it might be easy for Stephanie Hare
44:41 to be dragged into a pit of tech despair.
44:44 So how does she keep going?
44:46 How does she remain optimistic to fight the good fight?
44:50 - I mean, I allow myself to have the negative emotions
44:53 when I see something that horrifies me or pisses me off
44:57 or just scares me or I think, oh my God,
44:59 I don't repress that.
45:01 I feel it and I'll talk to people about it
45:03 and pound it out running or whatever.
45:05 But at the same time, I do my best thinking
45:10 when I'm in a more relaxed and creative space.
45:14 So I try to, if I can, move through my negative feelings
45:19 about whatever I'm learning as quickly as I can
45:24 and as honestly as I can, as completely as I can
45:27 so that it's done and I can get back to the space
45:29 that I need to be in, which is like, okay,
45:31 this is our new reality, like we're here
45:34 and I need to get us to there.
45:36 So how am I gonna do that?
45:38 What do I need to do?
45:39 And what's the best vehicle to do that?
45:41 Like, is it a writing project?
45:42 Do I need to like go on radio and television about it?
45:46 Am I actually even the best person to work on this
45:47 or do I find something and then give it to someone else
45:50 and go, look, I'm working on this and I'm seeing this,
45:52 it's freaking me out.
45:54 You're actually better placed to do it.
45:56 Or you and your crew are better placed to do it.
45:58 Can you guys take a look at it and keep me in the loop?
46:01 I just, I try not to take responsibility for everything.
46:05 I can't solve everything.
46:07 I can barely decide what I'm having for lunch most days.
46:10 Figuring out the bigger problems can be a trick,
46:14 but I do think allowing yourself to feel frustrated
46:17 or sad or angry when you read about stuff.
46:19 I mean, if you're reading about
46:20 climate change and biodiversity loss all the time,
46:22 that's, I can tell you, not a cheerful topic,
46:25 but I want to face it so that I can then grapple with it
46:29 so that I can then start coming up with ways
46:32 to be a positive contributor towards making it better.
46:35 One day at a time, one conversation at a time.
46:39 And who knows, you know,
46:41 other people might be feeling the same way.
46:44 And we may all decide to push and make things better.
46:48 So I'm not throwing in the towel until the very end.
46:50 - That's the fantastic Stephanie Hare
46:54 ending this episode of Data Capital,
46:56 brought to you by the Scotsman
46:58 and the Data Driven Innovation Initiative,
47:00 part of the Edinburgh and Southeast Scotland
47:02 city region deal.
47:04 Thanks so much to Stephanie for her thoughtful insights
47:07 on such a wide range of data and tech topics.
47:11 Technology Is Not Neutral is definitely worth a read.
47:15 To listen to all the episodes in this series,
47:18 simply search for Data Capital
47:20 on your favorite podcast platform.
47:22 Data Capital is presented by me, David Lee,
47:26 and produced by Andrew Mulligan.
47:28 (upbeat music)