Kara Swisher stopped by to talk about her latest book and where journalism stands amid the rise of artificial intelligence.
Transcript
00:00 I want to get right in and start with your book, Burn Book.
00:04 A lot of it you talk about a lot of the predictions you made
00:07 in tech that have been right over and over again.
00:10 Which of those are you most proud of that was most shocking to the industry
00:14 that you were right about?
00:15 A lot of them I don't remember because we had a lot of scoops and things like that.
00:18 But I think that the one thing is the very base fact that
00:21 everything that can be digitized would be digitized.
00:23 I think it's kind of an obvious thing, but 25 years ago it was not, right?
00:27 The idea that every industry would be changed by what was happening in tech.
00:32 But I think more recently when I wrote that column in the New York Times,
00:36 sort of predicting the insurrection and how it was, not sort of, I was.
00:39 I talked about the lies Donald Trump would tell.
00:42 It would shoot up and down the ecosystem, the digital ecosystem of the right.
00:45 And then he'd ask them to do something to stop the election.
00:48 And when I did it, I got a lot of pushback from social media sites
00:52 because I eventually called them handmaidens to sedition.
00:56 And I think I felt good about being right about that,
00:58 though I felt I wish I was wrong, I guess.
01:01 Any predictions that you were wrong about and that surprised you?
01:05 You know, at first when eBay, when they visited me
01:08 and I went to look at what they were doing, I thought it was a stupid idea.
01:11 I just was like, I don't get this. I don't get it at all.
01:13 And I think I was wrong about that.
01:16 Now, look, eBay is not doing as well, but the idea was a great one,
01:20 the idea of marketplaces.
01:22 And for some reason in that case, I couldn't wrap my head around it.
01:25 So I was really wrong about that, though I certainly quickly came around.
01:29 I was not a supporter of Amazon, but I thought they would do a lot better
01:33 than most of the press thought at the time when they called them Amazon.bomb.
01:36 I thought Jeff Bezos was a great entrepreneur and directionally it was correct.
01:42 Others, you know, Webvan, so many others failed.
01:45 But I did believe in commerce after that,
01:48 but marketplace was something I missed, I think.
01:50 Valuations, I didn't think they'd hold.
01:52 When Facebook got a $15 billion valuation after the Microsoft investment,
01:56 and I forget what year that was, maybe 2008, 2009, I don't remember.
02:00 I was gobsmacked and I thought it was stupid.
02:04 And I said so, and I was wrong.
02:05 Because what they did is they grew into it.
02:09 They sort of faked it till they made it kind of thing.
02:11 And I just didn't think that the market would sustain these crazy valuations,
02:16 but I should have known because they did it before.
02:18 And a lot of that, it seems like narrative economics is, you know,
02:21 since I know Shiller wrote a book about it,
02:23 but how much do you think storytelling is driving the value of tech assets right now?
02:28 I think it's two things.
02:29 I think, you know, look, there's lots of good stories for businesses,
02:32 and they don't seem to get the kind of lift that others do.
02:34 There's a mania around some of it, you know,
02:36 like you saw with the GameStop thing or anything else that
02:39 is divorced from value, obviously.
02:42 I think tech gets a pass where they divorce price from value with some tech,
02:48 not all of it, because some of it's really big.
02:50 But it gets divorced from value.
02:52 And so I, sometimes the story is a good one,
02:56 and it sells it more than it should.
02:58 Other times, it is a significant business.
03:02 I think a lot of these, you know, in a bad case,
03:05 the currency, the cryptocurrency stocks got hit more
03:09 because the story was not quite filling in right.
03:12 In the case of AI, we'll see.
03:13 The valuations are very high in the private markets.
03:16 We'll see if they have actual businesses
03:19 and whether that matters.
03:20 - And you did predict that the internet was going to ground media
03:24 as we knew it into dust.
03:25 - Yes, I did.
03:26 That was a good one.
03:28 - That was a good one.
03:29 And we work in the media space, so you were really right about that.
03:33 - It's still doing it.
03:35 Let me just say it's not over, but go ahead.
03:37 - Well, no, yeah, that's what I was going to say.
03:39 We need to shift again.
03:40 What are we still doing wrong?
03:42 And then we could get to what's happening next with AI.
03:44 But what are we still doing wrong?
03:46 - I think it's too late.
03:47 Media doesn't have the technological chops it needs to beat these companies.
03:51 At the time they were coming in, a lot of media companies,
03:55 first of all, they thought they'd go away
03:56 because these valuations wouldn't be sustained.
03:58 Well, they were.
03:59 So they had the money to do it, right?
04:01 They have the money and they have the means to do it.
04:03 And anyone who has the money and the means tends to win in many markets.
04:07 And one of the things that was interesting to me is I kept saying,
04:10 I kept referencing the Twilight Zone episode,
04:13 the very famous one, To Serve Man.
04:15 And I kept telling these media companies, it's a cookbook.
04:18 It's not here to help you.
04:19 It's here to hurt you.
04:20 It's here to eat you.
04:22 And I think it's the same thing with AI.
04:23 They've got to use every means at their disposal, not cooperate with them.
04:27 Actually, they're your competitors.
04:30 They are media.
04:31 Tech companies are media now.
04:33 And so, and they have the money and the means and they want to eat you.
04:36 So what can you do?
04:38 Like with AI, you have the ability to sue on copyright.
04:42 I think you should sue and negotiate and get as much value out of it as you can.
04:45 But handing over your value, well, they don't have a choice right now
04:49 because their tech skills are not as high as these tech companies'.
04:52 So again, the big tech companies are owning AI and media is not.
04:56 And that's a big deal.
04:58 - And you called Google the Borg, sucking up all the information.
05:01 And it does feel like maybe AI is going to do the same thing.
05:04 - It's search plus.
05:06 I mean, that's what some internet people are calling it.
05:09 It's search plus.
05:10 It's calculator plus.
05:11 It's more and amplified and better and stronger.
05:15 So it's the same thing.
05:18 It's the same story.
05:19 And how many times can you be Charlie Brown with Lucy and the football?
05:23 I'm not sure.
05:23 Many, lots of times.
05:24 That went on for a while.
05:26 - And you do use so many sci-fi references.
05:29 You talk about Star Trek a lot.
05:30 Are there any specific lessons from Star Trek or sci-fi in general that are your favorite?
05:35 - Well, you know, I think I use the Star Trek metaphor
05:38 because that's how I'd like it to be,
05:39 where everybody gets along and everybody benefits from each other.
05:42 And it's a united Benetton of people that are always trying to do the best.
05:46 I think it's more a Star Wars story: the Death Star is always here.
05:50 And so is the dark side, you know.
05:54 And so I use those metaphors because I think they work well, right?
05:56 They work well with this idea: is it Terminator for AI,
05:59 or is it something more benign?
06:01 I think the answer is Terminator is not where we're going.
06:04 But it's something more benign that may be even more dangerous, right?
06:07 It may be even slower.
06:10 I think media, because they love to tell stories, do so in a dramatic fashion.
06:14 And it's been my experience that companies die cell by cell
06:18 rather than instantly of some tragic accident.
06:21 This has been going on for 25 years, really, in earnest.
06:27 And media still doesn't understand what's happening to it in many, many ways.
06:32 And they're smaller and smaller and with less and less means.
06:36 And they're relying on the fact that these companies,
06:39 that they're here to help us and they're not here to help us.
06:41 They're not.
06:42 So you have to do other things, like be smaller, be more nimble,
06:45 have your costs in line with your revenue.
06:48 But you have to get used to the idea that they are ascending.
06:50 For everyone, not just for media,
06:52 is there a company or a person in tech right now that you think is really dangerous?
06:57 No, I think dangerous is the wrong way to...
06:59 I call Mark Zuckerberg dangerous only because he's inept at the task,
07:03 the responsibilities that were put upon his shoulders
07:05 or that he put on his own shoulders.
07:07 No, it's not dangerous.
07:09 Well, OK, there are dangerous people going to use AI
07:14 just the way they've used the internet over the past decade.
07:17 You know, some are very explicit, like Donald Trump,
07:19 and some are very implicit, like nation states that want to create discord.
07:24 So I'd look more to the malevolent players than anyone else.
07:27 That said, what's really dangerous is that our Congress can't pass laws
07:32 around antitrust or privacy or guidelines around AI.
07:36 We have to do that because this is the only industry in history
07:40 that doesn't have significant regulation giving it guardrails.
07:45 They just don't.
07:45 They can whine all they want about other regulations,
07:48 but every industry, the car industry, the pharmaceutical industry,
07:51 the insurance industry, however imperfect,
07:54 has regulations governing it specifically.
07:56 This one does not.
07:57 Is there a person or a company in tech that isn't a household name yet
08:01 that you think should be?
08:02 No, I think they're pretty obvious, like the Sam Altmans of the world.
08:06 You know, I met him when he was 19 and had a failed company called Loopt,
08:09 which I didn't think was going to make it, and it didn't.
08:11 But, you know, you sort of see some people.
08:14 I'm really interested in those who are making investments
08:18 in health care and climate change tech.
08:20 I think they're kind of really interesting,
08:22 and they have more of a mission-driven kind of thing going on.
08:25 And I kind of like...
08:26 Also, there's lots of profits, FYI, so that's good, too.
08:30 I just did something interesting.
08:31 I don't know if he's going to be the one that does it, of course.
08:33 Reed Jobs, who's the son of Steve Jobs,
08:35 has an investment company around health care.
08:38 I'm super interested in people like him that are doing that.
08:42 He's trying to make cancer non-lethal for obvious reasons,
08:45 personal reasons and investment reasons,
08:47 and they're doing it through a combination of philanthropy and investment.
08:51 There's all these really interesting things happening in the health space
08:54 and the climate space that I think are interesting.
08:58 Even these weight loss drugs, I think, are going to change a lot.
09:02 They're going to sort of kill the diabetic industrial complex
09:05 and good riddance to that.
09:07 And so you just have to sort of look.
09:09 Psychedelics, I think, are interesting.
09:11 A lot more in health care and physical things like climate change tech.
09:15 In your book, especially, but just also through your career,
09:18 you've been consistently brave and confident.
09:20 Were you always this way?
09:22 I don't know.
09:24 It's so unusual that people ask about that.
09:26 I'm not sure.
09:27 I'm just the way I am.
09:28 I was born this way, I guess, as I, you know, in a lot of things, I was born this way.
09:32 I do think, I do find it really interesting that people find a confident woman unusual
09:38 because it's not.
09:38 The women are very confident.
09:40 It's inside all of us.
09:41 It's just, it gets bleached out of us in a lot of ways for lots of reasons.
09:45 In my case, it didn't.
09:46 I just, I have, I'm more interested, you know,
09:49 I know it sounds stupid, but I like myself.
09:50 I don't know what to tell you.
09:52 And I don't want to say that when I'm good.
09:54 I'm good at things, I tell you,
09:56 and I don't find that anything but the truth.
09:59 And when I'm bad at things, I don't really dwell on them.
10:01 So, or I don't do them.
10:03 And so is that confidence or just, I know myself really well.
10:06 And the reason that I wanted to ask you that was really because I wanted to ask
10:09 if it annoys you that people ask you that,
10:11 because we don't ask confident, successful men how they got that way.
10:15 Yes, we do not.
10:16 That's right.
10:16 That's great.
10:17 I don't mind being asked it.
10:18 I do mind being asked by men when they say, you have uncommon confidence.
10:22 What are you expecting?
10:23 Docile?
10:24 Is that where we start, docile or, you know,
10:27 obsequious, or thank you, sir, for letting me in the room?
10:31 I don't, I don't live that way.
10:32 So, you know, I do find that they're surprised by it and maybe, you know,
10:37 there's fault all around in that regard.
10:39 And, but I honestly, whatever.
10:42 Okay, sure.
10:43 It only tells me they're scared of me.
10:44 So whatever.
10:45 Good.
10:46 That helps me too.
10:47 You've accomplished so much.
10:48 What is next on your list that you would like to achieve?
10:51 Well, this book is doing really well.
10:53 It's a bestseller, so that's good.
10:54 That's really nice.
10:55 And I thought it would be; there hadn't been a tech book
10:58 that's been really a bestseller in a while, except for that Elon book, which is a different
11:02 creature than mine, I would say.
11:04 But that's a different situation.
11:08 I think I'm really interested in video.
11:11 I love this idea that cable and television is over because to me, a worldwide global
11:16 information network should be able to make money and do really well.
11:19 And so when everyone's going like, oh no, it's over.
11:22 I'm like, oh, is it?
11:24 I don't know.
11:24 So I'm, I've just, I'm on a CNN show right now.
11:26 I'm there, I'm learning.
11:28 But video, not just video and short things, long, just video.
11:33 What, where is video going and how do you monetize it?
11:36 I'm always interested in business problems that you can take advantage of when people
11:40 are looking the other way.
11:41 Like podcasting.
11:42 I did that early and it turned out pretty good.
11:45 And you broke a lot of the rules of journalism all along the way.
11:48 A lot of things that were ingrained, I think, probably in your journalism education.
11:52 No, I broke the rules of journalism.
11:54 Let me just say, I broke the rules of the journalism business, not the journalism.
11:58 We have been accurate and substantive the whole time.
12:01 At the very heart of journalism is storytelling that is accurate.
12:04 That is, hard, accurate reporting, accurate writing, and stuff like that at the heart.
12:09 What we've done is done other things to it, added reported analysis.
12:13 We've had a point of view based on journalism, right?
12:17 And so one of the things I like when Christiane Amanpour, and it was about something else,
12:22 but she was saying we're truthful, not neutral.
12:24 I love that.
12:25 That's sort of encapsulated what I thought.
12:26 And so in the business sense, I changed that, that you could be a brand, you could do things
12:31 without the mothership.
12:33 I think I did change that.
12:34 The idea that you could tell interesting, compelling stories, that you could build a
12:39 fan base.
12:39 I think that was something journalists were not doing because they were sort of, speaking
12:44 of the Borg, you're part of the Borg there.
12:45 And so I think we definitely did. You talk about all these companies on Substack
12:50 or whatever, all these new media companies.
12:53 We were there at the start.
12:54 And I'm glad that everyone's trying new, innovative things.
12:57 It's exciting that you can, I think one thing I did is you can go and make your own way.
13:02 And I think that was hard for journalists.
13:04 And it's been proven over and over again in the years since.
13:07 And I couldn't, I could not be happier.