AI and politics, politics and AI. That's the story of 2025. On this episode, The Verge's Kylie Robison joins the show to talk about ChatGPT's big new features, Operator and Deep Research, both of which promise to make the chatbot more useful and more autonomous. Accessing either one costs $200 a month — is it worth it? After that, The Verge's Liz Lopatto catches us up on the latest from Elon Musk and DOGE, including why Musk is doing this thing, this way. Liz also makes the case that this isn't going to slow down anytime soon. Finally, Nilay Patel helps us answer a question from the Vergecast Hotline, and tells us how he felt about the Super Bowl's 4K stream.

Category: 🤖 Tech
Transcript
00:00:00Welcome to the Vergecast, the flagship podcast of Chain of Thought Reasoning. I'm your friend
00:00:04David Pierce. It is the Monday morning after the Super Bowl, and I'm tired. It was a truly
00:00:09terrible football game. If you didn't watch, you really didn't miss anything. Congrats to all the
00:00:15Eagles fans out there, I suppose. But it was actually a pretty interesting tech evening.
00:00:19There were AI commercials everywhere, including some that were like, really reminiscent of crypto
00:00:25commercials from a few years ago. I have a lot of feelings about that. A surprising number of people
00:00:29in my life are super into Gemini now, thanks to that Google commercial. I'll say this for Google,
00:00:35company has a lot of problems. Very good at making commercials. Just very, very, very good
00:00:40at making commercials. Also, Kendrick Lamar did the Halftime Show and spawned a bunch of memes
00:00:45that I suspect you're going to see forever for the rest of your life. Tubi streamed the Super
00:00:50Bowl for the first time and held up surprisingly well. Just a lot of interesting technology going
00:00:54on. And then a really crappy football game. Super Bowl 59, everybody. Anyway, that is mostly not
00:01:00what we're here to talk about. We're going to talk a little bit about the Super Bowl on the
00:01:03Vergecast hotline at the very end. But before we get to that, we're going to talk about AI,
00:01:07and we're going to talk about Elon Musk. We're going to talk about all of what's going on with
00:01:11ChatGPT, and the Operator feature, and the Deep Research feature, and the Tasks feature.
00:01:16And there's just something happening in the most expensive version of ChatGPT that I find really
00:01:21fascinating. So we're going to dig into that. Then we're going to talk about what is happening
00:01:26with Elon Musk and the Department of Government Efficiency. Things have slowed down, I suppose,
00:01:32relatively speaking, the last few days, but there's still a lot going on. There's still
00:01:36a lot to talk about. And we're going to try to make some sense of what's really happening here.
00:01:40Then we're going to talk about the Super Bowl, and then we'll get out of here. All of that is
00:01:43coming up in just a second. But first, I need to go watch the Z-Suite on Tubi, because this is just
00:01:50what happened to me. I watched Tubi, and now I'm a Tubi guy. I don't know. This is my life now.
00:01:55This is the Vergecast. We'll be right back. All right, we're back. Let's start by talking
00:02:00about AI. So over the last few weeks, ChatGPT has gotten a bunch of new features. OpenAI has
00:02:06been shipping stuff pretty fast. And there are three in particular that I'm really interested
00:02:11in. The first is called Operator, which is basically a way to let ChatGPT use the internet
00:02:17on your behalf. It'll actually click around and do stuff. You say, book me a reservation for
00:02:23dinner for tomorrow night. And the idea is it will actually go do that thing for you.
00:02:27This is the agentic AI we've been talking about for a long time. Super, super interesting. There's
00:02:32also this thing called Deep Research, which is one of OpenAI's most advanced models,
00:02:37really put to the task of doing longer-term, thoughtful work. It takes time. It gives you
00:02:44a message that's like, come back later. I'll let you know when I'm finished. But the idea is it can
00:02:49go do much more thorough, much more accurate, much less hallucination-y research for you on
00:02:55almost anything. And I've been hearing from people over the last couple of weeks who are really,
00:03:00really excited about it. The third is ChatGPT Tasks, which is just a reminder system for
00:03:05ChatGPT. And no one is excited about this but me. But here we are. Kylie Robison has been testing all
00:03:10of this stuff with me in the $200 a month ChatGPT Pro subscription. We both have some thoughts.
00:03:17Let's dig into this. Kylie, hello.
KYLIE ROBISON Hello.
DAVID PIERCE Welcome back. It's been a minute. I feel like every week you should be on the show,
but then every week I'm like, we can't talk anymore about AI. And so we're back. It's just
time to do this. KYLIE ROBISON Yeah, listeners need a break. It's fair.
DAVID PIERCE And in part, you and I have spent $200 each on the ChatGPT Pro account. And so we are here
for work so that we can expense this. KYLIE ROBISON Yes, exactly.
DAVID PIERCE This is very important.
KYLIE ROBISON Yes.
DAVID PIERCE So we've talked a bunch about, I would say, the not great products coming out of OpenAI
00:03:53over the last couple of years. They've shipped a bunch of really interesting technology
00:03:57and a bunch of not very good things to do with said interesting technology. But it feels like
00:04:03just in the last, what, two weeks, there's been this really interesting run of actually shipping
00:04:10cool, impressive product. Where did this come from? What is happening here?
KYLIE ROBISON Well, competition, and being scared,
and we need to double our valuation in four months. Did you see that? They want to be valued
at $300 billion. So... DAVID PIERCE Yeah, this is all the SoftBank money, right?
KYLIE ROBISON Yes, exactly. $300 billion, doubling the valuation four months after they last
raised money. So yeah, that's where it's coming from. Like VCs, you know,
Masa's over there, like, what, is he going to get my groceries for me or what?
DAVID PIERCE And it won't, spoiler alert, but we'll come back to that. The thing I keep thinking about is
00:04:44what Sam Altman, the CEO, said right after DeepSeek came out. And we've talked a bunch about
00:04:47DeepSeek on the show, but he seemed caught off guard by DeepSeek in a way I found really
00:04:54fascinating. In the same way that like everybody else working on AI was caught off guard by ChatGPT
00:04:58two years ago, like there was some sort of like rhyming thing that happened there that I thought
00:05:01was really interesting. But then he was basically like, we're going to move up a bunch of launches.
00:05:06We have much better stuff, we're going to show it to you much faster. And my immediate reaction was
00:05:11like, well, that's obviously a lie. But maybe that was real. And maybe this is just the company that
00:05:18thought it had a big lead, suddenly with a fire lit under it, and it's going to start to move
00:05:23much faster. Like, is that really what's happening here? KYLIE ROBISON I think that's the classic
00:05:26OpenAI move is if someone launches anything that takes the spotlight off of them for one second,
00:05:32they will figure out something to just drop immediately. Some blog posts, some research
00:05:37prototype. It's very cutthroat in that way. But Sam Altman said in a Reddit AMA last week or earlier
00:05:44this week that they're going to have less of a lead. And I actually appreciated him just saying
00:05:49it. It's his first shot at humbleness this entire stretch of time. So yeah, I think they're very
00:05:55aware of the competition. And even the people that helped co-found OpenAI and ran it are now
00:06:03building competitors that are quickly like taking up mindshare in Silicon Valley. So yeah, they're
00:06:09noticing. Okay. Yeah, it's really interesting. And so there are three products that I want to
00:06:13talk about. Two, I think you also want to talk about and one that I will barely make you talk
00:06:17about because it's ChatGPT Tasks. And I am America's only person who is interested in
00:06:21ChatGPT Tasks. But let's start with deep research. The first thing I want to know,
00:06:26we've both used it and I want to talk about what your experience has been. But
00:06:30you're out talking to people all the time. And I would say my loose impression of the
00:06:34reaction to deep research has been like really, really positive. People are really impressed with
00:06:41what this new thing is able to do. What are you hearing from people who are
00:06:45using and trying this kind of stuff right now? Yeah. Did you see Gary Marcus's reaction yet?
00:06:49No. People are saying it's the Gary Marcus benchmark because he starts his tweet with
00:06:54deep research is actually good. And then people are cropping out the blah, blah, blah, blah, blah.
00:07:00But he actually like compliments it. And for the listener, Gary Marcus
00:07:03is just a hardcore skeptic and has been right a lot, to his credit. He's been wrong sometimes,
00:07:08but he's also been right a lot. He's one of those people I take very seriously, but also
00:07:14he's like one of those people. Every time you read him, you have to read somebody who is like
00:07:18pie in the sky, AI will save us. And it lands you in like a nice, happy middle, I think.
00:07:22Right. That's my day to day. Yeah, deep research. People seem really excited about it. I have to
00:07:28temper what I see and what I hear from people quite a bit because I don't know how you feel,
00:07:32but in San Francisco and just in the AI communities, people are like, this is AGI,
00:07:38this is the best thing that's ever happened to us, et cetera, et cetera. But I've seen really
00:07:42positive reactions and I've seen really cool use cases for it. And I jumped pretty quickly
00:07:47to asking my coworkers what I should ask it because it's $200. And for some reason,
00:07:51I just don't have enough questions in the world that I think would be good enough for this system.
00:07:56She's not a curious enough person, Kylie.
00:07:57Apparently not. Someone said, what was it like, you know,
00:08:01a hundred bears versus one bear was a question that someone asked on the internet. I was like,
00:08:04I should get more creative here. But my first question was about tariffs.
00:08:08That's so funny. That was my first question too.
00:08:11Really?
00:08:11This is what's going on in the world right now. I'm like, what do I barely understand
00:08:15that maybe the internet can teach me about? I was like, oh, it's tariffs.
00:08:17Right. We yearn for more tariff information because it is so weird. So that's a question
00:08:21I get asked a lot is how are these tariffs going to impact the AI industry? And I'm like,
00:08:26so much is up in the air. I don't know. So I asked a pretty detailed question because I figured a
00:08:30more detailed question would get me a better result. And it crashed like three times.
00:08:35Oh boy.
00:08:36And it never gave me my answer. So I pinged OpenAI and I was like, this is
00:08:40dog shit and they fixed it and everything's been good since.
00:08:43So you're the reason it's good, basically.
00:08:46Right. My one question of like, I'm going to write about how this is dog shit,
00:08:49so can we figure this out?
00:08:51So what's your understanding of what's going on here technically? Because there's been this big
00:08:56thing with reasoning and chain of thought and this idea that these things should be allowed
00:09:00to take their time and show their work and that will make them better. Is that all that deep
00:09:03research is? Or is there something else going on here that it's doing that's different?
00:09:09I think that most frontier labs have realized that inference time compute and, you know,
00:09:14the model thinking longer and giving it more time to do its thing is sort of the next frontier.
00:09:20It's the next, you know, parallel that we need, you know, more data,
00:09:25more compute and inference time compute is going to be really important. And that's why we're
00:09:30seeing like these reasoning models crop up. And sort of my argument, I try to take the position
of my angriest Verge commenter. And I think like, you know, well, okay, we're just getting products
00:09:41that are slower. No one wants a product that is slower and perhaps still inaccurate. I think in
00:09:46the press release, they said, you know, deep research is still a prototype and it might get
00:09:51like up to half of the things wrong. It gave some data point about how it still gets a lot
00:09:55of things wrong. So it's hard to make that argument for something that costs $200, still
00:09:59gets tons of things wrong, and is slower. But what I will give this whole product idea and this whole
00:10:07industry is that it is so nascent and I still think it's a cool thing to try out. And, you know,
00:10:12as long as you're not doing it with high stakes tasks, you know, it's not your lawyer, your doctor
00:10:16or anything like that. I think it's a cool tool. Yeah. I think one of the things I've spent a lot
00:10:22of time thinking about recently is what the stakes are for things I should do with something like AI.
00:10:28Let me just give you an example of a thing I did with deep research that actually that was
00:10:30really fascinating. So do you know what JetFlix is? No. So it was basically this illegal streaming
00:10:37service that they built some really interesting technology that basically went and combed like
00:10:41every pirate site and torrenting system and just pulled in all of the content they could possibly
00:10:48find illegally. And for $9.99 a month, you could sign up and stream it. So it was like Netflix,
00:10:54but crimes, was like, literally the whole pitch. It ran super successfully for a really long time.
00:10:59And then I think last June, a bunch of people who were running it were convicted of running it
00:11:06and just got sentenced like this past week. I find this all fascinating. I just told you literally
00:11:12every single fact I know about JetFlix. And I was like, I feel like there's a story here. There's
00:11:15something I want to do. And so I just went to deep research and I was basically like,
00:11:20where do I even start? Tell me in relatively short order sort of the story of JetFlix. Did
00:11:26they have any interesting technology? Who were the people? What are the legal fights going on?
00:11:30It did a shockingly good job. It's a really interesting thing because I haven't thoroughly
00:11:37fact checked it enough to know what it got wrong. And I'm sure it missed stuff and got stuff wrong.
00:11:40And it's not perfect. But in terms of like, I went from knowing two facts about a thing to having a
00:11:47pretty good, at least like textbook primer of a thing that I can play with. I was really honestly
00:11:53impressed. And it's relatively well cited. There are a ton of links. And I came to the end of that
00:11:59and I was like, oh, this is actually useful. I'm a person who likes to read Wikipedia pages.
00:12:05And I don't know, are Wikipedia pages famously always correct? No. And we kind of understand
00:12:10how that's supposed to work. And I got to this point where I was like, is deep research just
00:12:13something I should think about the way that I think about Wikipedia, which is a useful starting
00:12:16point to understand something. But again, don't cite it in your paper or your teacher will fail
00:12:21you. You know what I mean? You totally waded into some fun AI drama that's happening post deep
00:12:25research is that a researcher at OpenAI tweeted, I feel bad, but he tweeted, basically Wikipedia's
00:12:32dead and good riddance because we have deep research. And then people are like, fuck you.
00:12:39He got so much blowback. He ended up deleting the tweet, but it was, that's some drama happening
00:12:43right now. Is this the end of Wikipedia? But people were like, you guys train on Wikipedia.
00:12:47This is an awful thing. I was going to say without Wikipedia, this doesn't work. So I
00:12:51think there's that part of the whole thing. Exactly. That's really interesting. But yeah,
00:12:55I sort of felt twinges of that. Like I wouldn't have tweeted that because I've tweeted some things
00:13:00before and I know what happens, but I felt it a little bit. It was very interesting.
00:13:05What else have you been doing with deep research? What have you figured out to throw at it?
00:13:09You know, the tariffs question was interesting. And here's an error. I'm wondering if you noticed
00:13:13this or if you looked at your query close enough to notice this, but all of my colleagues had
00:13:19myriad questions, really, really varied. I think Sean asked me about the smallest phones.
00:13:25Jess asked me about the best horror movies, stuff like that. And I gave everyone their
00:13:30query. And then my editor, Adi, who wrote about this, asked about Section 230. Is that the law?
00:13:35She wrote a great piece about it, by the way. We'll put it in the show notes. She then
00:13:38called a scholar to go through what she got back, and it was very good.
00:13:42So dope. So that was cool to give everyone their queries, but all of them tended to have, oh, and
00:13:48Jake asked about Apple earnings. And we both agreed, you know, as I was watching it pull 10Ks,
00:13:53I was like, oh, thank God, I never have to do that again. So like that's actually useful. But
00:13:58all of these queries had the same issue, which is they missed a whole year. They missed 2024.
00:14:03They stop at 2023. Oh, interesting.
00:14:05And it's funny because in that same Reddit AMA, Sam Altman was asked about, you know,
00:14:10the knowledge cutoff and he's like, oh, I don't think about the knowledge cutoff at all anymore
00:14:13because we have web search. But clearly like there's some sort of training bug there that
00:14:18misses a whole year. So that's really annoying because when I'm asking for deep research on
00:14:24something, I want the most up-to-date information. So hopefully they figured that out.
00:14:28Yeah. It turns out like a lot of stuff happened between 2023 and today.
00:14:31Believe it or not.
00:14:32Yeah. Some things went down.
00:14:34Exactly. So I kept running into that error, but another thing, not to totally shit on it,
00:14:39but like during my tariff query, I thought this is kind of annoying to read. Like I just don't
00:14:45know how you would change the interface for such a thing, but it was just, it was a block of text
00:14:50where I was already snoring by the end, but I guess that's what I get for asking about tariffs.
00:14:55But yeah, like I saw a really cool use case from a researcher at OpenAI who posted like,
00:14:59you know, he picks a topic and like every day before bed, he reads the deep research on that
00:15:05topic. And I thought that's kind of cool.
00:15:08Yeah. I don't know if that's fascinating or terrifying or a little bit of both. Probably
00:15:12a little bit of both, but it makes me think of, like, I've used NotebookLM from Google a lot
00:15:17for similar sorts of things. I just find its process of outputting stuff really fascinating.
00:15:22And like, OpenAI needs more NotebookLM things, where like that fake podcast that NotebookLM
00:15:29makes is very compelling. Like a lot of people really like it. And it's really good at making
00:15:34flashcards and it's really good at making study guides. And I think I kept being struck through
00:15:38this whole thing by how OpenAI is still stuck on the idea that you can do all of this in a chat
00:15:44interface. And it's just wrong about that. Like the idea that what I actually want is what looks
00:15:51like a text message, but it's 9,000 words long is incorrect. And I think there's still some like UI
00:15:58and human factor interface stuff here that OpenAI has not done.
00:16:03Totally.
00:16:03Which brings us to Operator, which is the other thing I want to talk about,
00:16:06which is I think if you're OpenAI, this is the thing you hope works, right? Like this is the
00:16:12thing, right? Am I crazy? Like the Operator is the one that when it works is going to be the
00:16:18one that everybody thinks is like going to change the way we do everything forever.
00:16:22Yeah. Well, everyone's building the exact same thing. Anthropic has, I think, computer use,
00:16:25Google is doing Project Mariner. And it's funny when I used Operator for the first time,
00:16:33it's weird because I'm like, this is so slow. I could make this reservation in two seconds.
00:16:38Why am I using this? But when I watch Project Mariner at the Google campus,
00:16:41get groceries for me, I was actually pretty compelled by that. And I don't know what the
00:16:46difference is. Maybe because this is in my hands and not useful, whereas that was a demo. But
00:16:50yeah, I don't know.
00:16:52So what have you done with Operator? So Operator's thing, I should just explain
00:16:54how it works because I was sort of surprised at how it works. It's much more basic than I
00:16:58realized. You open Operator, which is, again, I think only available to
00:17:02ChatGPT Pro customers who have $200 a month to spend on this. Thank you, Vox Media.
00:17:08And you type in your query, and it literally just opens Chrome inside of ChatGPT and starts
00:17:16doing web things for you. It browses the web kind of like a person in a way that I found sort of
00:17:22unnerving. But it just opens up a Chrome tab and, at least in my experience, every single time
00:17:28Bings something.
00:17:29Yes, I was going to say Bing.
00:17:30Which is the first thing about it that I don't trust. Don't like that at all.
00:17:35But then it goes and tries to, with clicks and typing and normal web stuff,
00:17:41accomplish all of your goals for you. So what have you tried with Operator?
00:17:44Did you try to break it yet? Because that was the first thing when I noticed I could click around
00:17:47in their little siloed Chrome. I was like, how do I break this immediately?
00:17:53Let the computer do its work.
00:17:54I know. Now I feel bad. But seriously, I'm like, I want to crash this thing. I didn't succeed.
00:17:59I asked my friends pretty immediately, where should we go to dinner? And my friend recommended
00:18:04a place. I put it into Operator, and I watched it click around, and it was so painfully slow.
00:18:09And then it gets to the checkout, and it was $250 a person. I was like,
00:18:13my friend fucking sucks. I'm not doing that.
00:18:15Goodness, yeah. I don't like very many people that much.
00:18:17Yeah. So I'm going to Japan in a few weeks. I asked my mom what I should do,
00:18:23like tell it to book our train tickets. I asked it to do that.
00:18:28I didn't find the use cases so compelling for how slow it is, because I think I'm just
00:18:33naturally sort of neurotic and impatient. So watching it do something for me when I could
00:18:37just do it faster myself made me feel a little crazy, even though I could see if this is the
00:18:43start of this technology. I can only imagine how much better it gets over time. But yeah,
00:18:48pretty quickly, I got frustrated with it. And then I told David, do you want to take this article?
00:18:51Because I'm annoyed with this technology.
00:18:53And then, you want to know the first thing I did? I told it to go to the Amtrak website
00:18:58to try and book train tickets, because the Amtrak website is the single worst
00:19:03website that I interact with on a day-to-day basis. And so I was like, I wonder if
00:19:07ChatGPT can figure this out. It is horribly designed. It's broken all the time.
00:19:12And I had the most vindicating experience. So I was like, tell me the price of a round-trip
00:19:17train ticket from Washington, D.C. to New York tomorrow. I just want to know what it costs.
00:19:20And it asked me follow-up questions, which it always does. And this took me
00:19:24a minute to get used to, both in deep research and in operator. You plug it in, and you're like,
00:19:29OK, I'm going to go away and wait for it to come back with information. But almost always,
00:19:32it has follow-up questions, which I get. And it's trying to sort of refine your prompt. And that
00:19:37all makes sense. But it's also kind of annoying. I'm like, just shut up. Just go do it. I asked
00:19:41you a simple question. But anyway, so it follows up. And then it goes. It opens a Chrome tab.
00:19:45It goes to Amtrak.com. It pretty successfully plugs in the days I want to go. And then it
00:19:51asks me for confirmation that this is what I want to search for. And I'm like, again,
00:19:54if I have to sit here and babysit this, what is the point of this? It's actually less efficient
00:20:01because it's slower, and I just have to sit here and watch in case you have questions for me.
00:20:05Didn't like that. But then it found a train ticket to go to New York. I'm in D.C. It found a train
00:20:13ticket to go to New York and then tried to go and find the return ticket. And the Amtrak website
00:20:18crashed and sent it all the way back to the beginning. And I was very impressed because it
00:20:21figured out what had happened and started the process over. And actually just ran the process
00:20:25again, got through the thing, and it shows you the steps as you're going. So it was like,
00:20:32it said something like, trying Amtrak.com again. And it goes through the same thing,
00:20:36and it crashed in the same spot. And it was like, I'm just imagining this little robot just
00:20:40getting increasingly frustrated. And so then it goes back to Bing, searches for train tickets,
00:20:45finds a whole different third-party booking website that had, as best as I can tell,
00:20:49completely wrong information about Amtrak tickets, does the thing, and then presents
00:20:55it to me as like, here are the prices of the Amtrak ticket. Do you want to book it?
00:20:58And I'm on a website I've never heard of. And I was like, no. But thank you. But it was really
00:21:04fascinating. I was like, on the one hand, this is actually pretty resilient, impressive technology.
00:21:09It understands what it's doing enough to compensate for things that go wrong,
00:21:14which is pretty cool. What we've seen with a lot of these models is they just,
00:21:18once they're off track, they just keep getting further and further and further off track. And
00:21:21with all of these thinking models, one thing they're able to do is be like, oh,
00:21:25I'm off track, and go back and start over. But then it got to the end, and it was like,
00:21:29we accomplished nothing. It took 10 minutes, and I had to sit here and babysit it,
00:21:33and we accomplished nothing. The one good experience I had,
00:21:36again, thinking about stakes, right? I've been rewatching Schitt's Creek for the 9,000th time,
00:21:43because it's the greatest show in the history of television. And I was like, find me some fun
00:21:48Schitt's Creek merch. So I went, had it go, it went to Bing, and then I think landed on Etsy.
00:21:53And then, as far as I can tell, just combed the page for things with a high number of five-star
00:21:59reviews and just opened three tabs with pretty cool, highly reviewed Schitt's Creek merch.
00:22:06I was like, that's a victory, right? You just did a thing that I would have done
00:22:10roughly the same way, but I didn't have to do it. Love it. I'll take it. And then I was like,
00:22:14okay, do you want the mug or the keychain or the greeting cards? I was like, I'll take the
00:22:20greeting cards. I was like, okay, you can get one for this price or two for this price. I was like,
00:22:23okay, I'll take two. And it says, great, it's all in your cart. Here's the link.
00:22:26Sends me the link. I click on it. Guess what happens?
00:22:29They're not there.
00:22:30My cart's empty because it wasn't in my cart. It was in this virtual computer's Chrome cart.
00:22:36And this is the thing: every time I've used Operator,
00:22:39there's this fundamental disconnect where it actually can't do the job
00:22:42because it's not on my computer. And I don't know if I want it to be or not,
00:22:46that's a separate question, but like, it actually doesn't have access to all of the stuff that I
00:22:50need, which is like my shipping address and my credit card number and all of this stuff.
00:22:55And like, it can't even send me a cart because it's on a virtual computer somewhere and not on
00:23:02my computer.
00:23:02Well, you could sign in. I feel like that would have changed the outcome if you had
00:23:05signed into Etsy or wherever it was.
00:23:08Yes. If I had gone to Etsy on the virtual computer and signed in, but that's not what it had me do.
00:23:12It just sent me a link in the ChatGPT message. It was just like, here's your cart. And I clicked
00:23:18it and it said, your cart is empty.
00:23:19Perfect.
00:23:19It's just like, it's doing something cool, but there is so much left to do before this stuff
00:23:25actually works on your behalf. And I was just like, I can see it, but we're not there yet.
00:23:30Totally.
00:23:30And I'm not even sure we're all that close.
00:23:32And why do you think that OpenAI released this on that $200 tier and Project Mariner is not
00:23:39getting released in the same way? Because they're essentially the exact same thing.
00:23:43I don't understand why Google's holding theirs back.
00:23:46Okay. I have a theory about this. It's based on nothing, but I also think I'm right. And I'm
00:23:50curious what you think. So I think if you're OpenAI, you have a bunch of people who pay $200
00:23:57a month for this thing. That is by definition going to be a group of people who really believe
00:24:00in AI, right? That is so perfectly self-selected for people who want to use the bleeding edge stuff.
00:24:07There's kind of no risk for OpenAI, right? These are people predisposed to thinking it's cool
00:24:11and who want to try it and test it and poke at the edges of it, but aren't.
00:24:15Obviously, this is not a mainstream thing. It's $200 a month.
00:24:19Google, on the other hand, I think is increasingly only interested in mainstream things. Google
00:24:24seems totally uninterested in rolling out AI experiments because every time it does it backfires
00:24:31and it's a mess and it tells you to put glue on your pizza. And so I think what Google has learned
00:24:36is there's only value in this stuff if we can put it in front of a billion people, and we have to
00:24:43get that more right than we have. Whereas OpenAI is sick. We have this group of people who just
00:24:47loves AI to pieces, and we can just throw every experiment we have at them, which I think in a
00:24:52certain way was kind of genius of OpenAI to create this tier of people. They seem to be losing money
00:24:57on hand over fist, but it's a good idea to put that group of people in a bucket together and be
00:25:03like, we're just all going to experiment together over here. That's my theory.
00:25:06Totally. I agree. I think also I felt Google doesn't have to worry about this because they
00:25:11have buckets of money, but it feels deeply abusive to the GPUs whenever I use deep research. I'm like,
00:25:18oh my God, I have burned so much trying to get this one query I don't really care about,
00:25:23and I don't really have a choice on refining it after it starts. So yeah, I really do see the
00:25:29potential of both deep research and operator. Both are fundamentally frustrating to use,
00:25:36and I don't have a ton of use cases that would make them a part of my everyday life,
00:25:40but I do use AI tools quite a bit. What I want is for it to be
00:25:46accurate and fast, and neither of these tools is there at all.
00:25:50No. They're neither of those things.
00:25:52No.
00:25:52So what are you using it for right now, day to day? Because I think we did a thing on the show
00:25:57last week where we just had a bunch of people call in and tell us what they use AI for, and
00:26:01overwhelmingly it was pretty low-stakes stuff, but also just conversational. People just want
00:26:08something to hang out with and talk to and get ideas from, and that both is and isn't a tool in
00:26:15the sense that we're talking about. But I'm curious from your perspective, what have you
00:26:19found that's actually working for you in day-to-day life?
00:26:21What I use it for is I use Claude. I'm obsessed with Claude. I think it has a really good
00:26:27personality. It's very conversational. It understands nuance, and what really changed
00:26:33the tides for me is when I was working late, it was like 7 or 8 p.m. here, and I had this 5,000-word
00:26:38story, and I had just finished it, and I'm like, okay, before I hand this over, let me plug it into
00:26:43here and you tell me what you think because I had been cracked out on this. I don't know if the flow
00:26:48is good and I feel bad delivering it if it's just complete shit, so tell me if it's complete shit.
00:26:54And it focused on TKs, which are just empty spaces where I'm going to put in more information later,
00:27:00and I was like, no, don't focus on that. Focus on the actual substance and how I organized it,
00:27:04and then it was like, okay, and then it switched into this casual mode, and one of the notes that
00:27:08really stood out to me was you say this one thing, and then immediately it's like you apologize and
00:27:12undercut yourself, so don't do that, and I was like, man, that's something Kevin would tell me,
00:27:16something an editor would tell me. That's actually really interesting and compelling, and
00:27:20I can't believe it said that or caught that, and yeah, so I use Claude for that, or sometimes I'm
00:27:26like, what am I trying to say here? It's all a variation of what made me use ChatGPT for the
00:27:31very first time, which is I had tweeted, like, average day for me, and it was me Googling how to
00:27:36say a cuss word politely in writing, and someone was like, this is exactly what I use ChatGPT for,
00:27:42and I had been staunchly against it. I was like, no, I don't want to use that, stupid.
00:27:47I really was not about it, and then I used it for synonyms because I'm always like,
00:27:51what am I trying to say here? And that was my first use case, and now it's just
00:27:56more advanced versions of that, like, okay, I'm going to type a bunch of stuff. Do you understand
00:28:00what I'm saying? Where could I be more concise? And yeah, I think it's really helpful in those,
00:28:06and I said this for our Vergecast, like, end-of-year stuff was, you know, I use
00:28:12SearchGPT quite a bit, a little less often now because it just isn't reliable, but I find it
00:28:19helpful for getting links, whereas Google has just failed me in that regard.
00:28:23Yeah, Google, not great at links, which you would think would be a thing Google knew how to do.
00:28:29Yeah, there's another thing we're going to have to do here at some point about how Google just
00:28:34incredibly fumbled the bag on all of this stuff because what everybody is just looking for is,
00:28:38like, what if Google was good? And it isn't. But at any rate, I think there's a thing in that
00:28:45interaction that I find really interesting, which is, like, AI is fine as long as it's not the last
00:28:51step in the process. Like, this is kind of what I'm coming to is, like, I think the danger of
00:28:56something like deep research is that it seems very confident, it looks very well-sourced,
00:29:00it took its time, and so you're like, okay, this thing is authoritative.
00:29:03Yep.
00:29:03It's going to be wrong about a lot of things. It's going to miss important
00:29:06nuance. It's going to miss whole years of time that are very important. But, like,
00:29:13if you take it on the understanding that, okay, I'm going to take this information and do something
00:29:18with it, I think all this stuff can be really useful. And what you're talking about is, like,
00:29:23there is necessarily, like, a you between everything that ChatGPT is doing there or
00:29:28Claude is doing there and it going to your editor, right? If you were just, like, using
00:29:33Gemini to find and replace your grammar and then just shipping it to your editor, like,
00:29:38I would have so much more of a problem with that.
00:29:40Really? I don't use Gemini at all. I should, but, I mean.
00:29:44Gemini's fine. Like, my hottest take on all of this is that they're all the same,
00:29:48and I'm going to get a lot of emails about that, but I basically believe that to my bones.
00:29:52No, I endorse.
00:29:53But, and I think, again, it goes back to, like, what the people were telling us
00:29:57last week is, like, using it for, like, advice and input and information and kind of as a starting
00:30:02point, fine. I think I'm actually increasingly good with that. But it's the thing where you're
00:30:06just like, make this for me and ship it off. Or, like, even in the operator world, it's like,
00:30:10just go buy me groceries. I'm like, I don't trust you, and I have a thousand questions,
00:30:14and it's actually less work for me to go get my groceries than to ask you the thousand questions.
00:30:19And there's a tipping point for that that I don't think we're anywhere close to where, like,
00:30:24I want the AI to be the last step of the process. And until then, I don't think we should let it be
00:30:29the last step of the process anywhere, for any reason.
00:30:32You even saying that filled me with dread,
00:30:34imagining it being the last step of the process at all.
00:30:36Like, truly, if you were just like, ChatGPT, order me groceries.
00:30:41Yeah.
00:30:41Like, and they just appeared at your door. Well, I'm not even confident they'd appear at your door.
00:30:46But if you were just like, ChatGPT, order me groceries for delivery tomorrow morning,
00:30:52what percent chance do you think there is that that would go well?
00:30:57In, like, meaning that you would be, like, psyched about everything that it delivered
00:31:01you, and it came at the right time, and it charged you what it should, and it came from
00:31:03the right place. Like, it's zero, right? Like, I can't imagine a world in which that goes well.
00:31:08Right now, yes. Like, negative.
00:31:09Yeah.
00:31:10No. No, I mean, I would like to see, in the next 10 years, I think AI leaders who are building
00:31:16this would like you to believe it's in the next six months. But I think maybe in the next 10
00:31:20years, 15 years, we might see AI that is reliable. But that's, like, really, like,
00:31:27just a complete stab in the dark. Because right now, no, it cannot do that whatsoever. And that's
00:31:31the one thing I want it to do, is I'm about to board a flight coming home. Can you restock my
00:31:35fridge? And blah, blah, blah. No, it cannot do that.
00:31:37The dream.
00:32:38The dream. The dream. And I feel the same way about deep research, because everyone I gave
00:32:44their query back to has to fact-check it. And it takes twice, three times as long
00:31:48just to make sure it's accurate, and so much of it isn't. So, you know, it's not in this place
00:31:54where they claim, you know, superintelligence is coming soon, and it'll replace, you know,
00:31:59economically valuable positions. It'll be whole companies. They even say this year you're going
00:32:05to see agents in the workforce proper. And from what I see, what I wrote when I was starting the
00:32:11draft, I was like, this reminds me of the dumbest, most eager intern I could have. And I really
00:32:17appreciate how happy it is to do the job, but I don't think I'm converting it to a full-time
00:32:21position.
00:32:22Okay. A, you and I have both been that intern at various times in our lives.
00:32:26Oh, yes.
00:32:26I'm sure. So, be kind to those interns.
00:32:29Exactly. Exactly. But I'm like, get it together.
00:32:32I was not to be trusted when I was that intern.
00:32:33A hundred percent.
00:32:34Yeah. Yeah. The more I use AI, the more I am trying to figure out how to balance that,
00:32:40and I go back to the Jetflicks thing, and I'm like, what that gave me is a bunch of
00:32:43names of people to go research. It gave me a bunch of, like, technological terms that I should go
00:32:49look up for how this worked, and it linked me to a bunch of court cases. And if that is the value
00:32:54that I derive from it, terrific. If I read that thing and think, oh, now I know the Jetflicks story,
00:32:59I have gone wrong. And I think, again, we're like, it's the same thing with everything on
00:33:03the internet. We have this, like, incredible media literacy problem where we just, it's so
00:33:08hard to know what is real and what isn't, and we just don't usually have time to go fact check
00:33:13everything we see on the internet. And so I think, like, deep research is good enough that I
00:33:18am actually optimistic that it's going to get better, but we're still in this weird
00:33:22liminal space of, like, it seems trustworthy, but it isn't. But it's so much work to figure
00:33:28out in what way it's not that I don't quite know what to do with it. And one thing I liked about
00:33:32Operator was I could just watch what happened. Like, I just, I asked it to find me an Airbnb
00:33:37and then sat there and watched while it looked for an Airbnb, and I know exactly where it went
00:33:40wrong. And there's something really helpful about that. Like, I know the problem, and then I could
00:33:45just pick it up from there and go. Yeah. I was looking for use cases for, like, how people were
00:33:49using it, and I watched this whole YouTube video of a guy using Operator to make changes to his
00:33:53website, which was really interesting. I don't know if you've seen this. No. And it was successfully
00:33:58executing, like, I need you to change, like, the thickness of these lines or something. And
00:34:03before it would execute, though, it recognized that this is, like, a high-stakes task. So it
00:34:07had, like, this huge pop-up, like, are you sure you want to do something? Because he was, like,
00:34:11editing live on this website. And it was really compelling because it worked for him, but
00:34:16obviously it was really slow. So those are the use cases I could see. I was just talking to somebody
00:34:21about, like, why I use Claude versus ChatGPT, which is really good, or supposedly very good, at
00:34:27coding and mathematics. And someone was like, because that's what the people building it
00:34:31care about. They don't care about writing. They care about mathematics and coding. So that's
00:34:36why it's good in those domains. So I have faith that it will be useful for those kinds of people,
00:34:42but for right now, not so useful for me. Yeah, that's fair. We got to learn to code, Kylie.
00:34:48Oh, God.
00:34:51I didn't make you talk about tasks. Thank God.
00:34:53I would like you to appreciate that I didn't make you talk about tasks.
00:34:55Yes, thank you.
00:34:56But just some other time we're going to talk about tasks.
00:34:58Merciful. Thank you.
00:34:59Kylie, thank you as always. All right, we got to take a break and then we're going to come back
00:35:04and talk about Elon Musk. We'll be right back.
00:35:12All right, we're back. So we've talked a bunch about the Department of Government Efficiency
00:35:17and what Elon Musk and his crew of engineers have been doing inside of the U.S. federal government.
00:35:24But I thought this was a good moment to sort of pull back a little bit. This has been happening
00:35:28for a little while. Things have slowed down a little bit. There have been some legal challenges.
00:35:31It felt like a good moment to try to figure out what's really going on here. Who are these people?
00:35:36What do they want? What is actually happening? Is this all actually going to work? And there's no
00:35:42one on Earth I would rather talk that stuff through with than Liz Lopatto, who has been
00:35:46covering Elon Musk in various ways, shapes and forms for many, many years. So Liz is going to
00:35:52help me understand what in the world is actually happening here. Liz, welcome back.
00:35:57Thank you. It's good to be here.
00:35:58I feel like you're like the unofficial insanity correspondent of the Vergecast,
00:36:02which I didn't mean to happen. But whenever it's like,
00:36:04is everything falling apart? We have to call Liz.
00:36:08The chaos correspondent, if you will.
00:36:10Yeah, I like this for us. I think what I want to try to do
00:36:14is back up slightly from like the minute to minute chaos of what's going on and try to
00:36:20just make sense of the whole sort of Doge experiment, which I think is also what you've
00:36:25spent a lot of time trying to do over the last couple of weeks. So I guess like at the risk of
00:36:31asking like an outrageously broad question to start us here, what is your sense of what
00:36:36Elon Musk and Doge are actually in reality trying to do right now?
00:36:41So I think they're essentially politicizing what was previously neutral infrastructure.
00:36:48So if you think about a lot of what they're doing, it's taking over stuff that was just
00:36:54sort of there to help the government function. So like the General Services Administration,
00:37:00for instance, which is kind of like the IT for every part of the bureaucracy,
00:37:06it's like the thing that lets everything else function. That's where they started.
00:37:10And the sort of metaphor for this is you might remember after Russia invaded Ukraine,
00:37:19they were kicked out of SWIFT, which is how banks message each other. And that was a moment where
00:37:23a previously neutral infrastructure had suddenly become weaponized. And so we're
00:37:28watching this happen now within the United States government.
00:37:33How strategic do you think this group is being inside of that? Because I think on the one hand,
00:37:40there is a way to look at this that is just absolute abject chaos. And I think one of the
00:37:45things we learned from the first Trump administration was that sometimes things
00:37:49that look like chaos just actually are chaos. And we spent a long time sort of assigning strategy
00:37:55to chaos. And it's like, no, it's actually sometimes it's just chaos. But it also feels
00:38:00like something is happening on purpose here, like you're describing. I heard someone ascribe it to
00:38:06basically like somebody tweets the name of a government agency at Elon Musk, and he's like,
00:38:09let's go kill it. And I don't think that's right, but I also don't think that's wrong.
00:38:15So how on purpose does all of this feel to you right now?
00:38:18Well, it feels varying degrees of on purpose. Let's say broadly, I think that there is a
00:38:24purpose here. And if you step back, you sort of see it, which is he's after the money.
00:38:30That's why he's like so incensed that a judge has ordered him away from the Treasury. Like,
00:38:36that's where the power is, is the U.S. Treasury. That's where he wants to be. That's the thing he
00:38:41wants to control more than anything else. And if you think about his ambitions for X,
00:38:47that makes sense, right? Like he wants it to be the everything app, the payment app.
00:38:51And so, you know, when he went after the Consumer Financial Protection Bureau and tried to illegally
00:38:59close it, part of that is like in line with trying to make X a major payment app, right?
00:39:07Because that's a regulator that he doesn't have to deal with anymore.
00:39:11The other thing that I would keep in mind here is his AI interest. And what I think is going on
00:39:20is there is essentially an attempt to replace bureaucracy with AI. And I think that's not
00:39:29going to work out great for a wide variety of reasons. But if you've seen like the list of
00:39:33banned words that they're trying to remove from the government, one of them is privilege.
00:39:38Oh, interesting.
00:39:40Right. Because as soon as you start thinking about privilege, you think about things like
00:39:45IT, like who has privileged access. Or for instance, like if you're in the NSA,
00:39:50you're thinking about things like privilege escalation, which is when an attacker like
00:39:55starts moving up the chain. Like there are ways that these words get used that have nothing to
00:39:59do with the so-called woke ideology that they are theoretically seeking to uproot. But because
00:40:05there's such a blunt tool being used looking for these specific keywords, there's a bunch of stuff
00:40:12that potentially gets knocked over. So there seems to be sort of a mix of like total chaos,
00:40:21in part because of how they're choosing to do this. And what I think is a little bit strategic,
00:40:26which is like, how do you get power? You control the payments.
00:40:31Right. And I think it seems like part of this is also just something that Elon Musk is like
00:40:37uniquely suited to do in this really weird way. Like one thing I've been thinking about a lot,
00:40:42and I'm curious for your thoughts on this, is if you just replaced Elon Musk with some other
00:40:49incredibly self-interested billionaire, put Jeff Bezos in here, right? Somebody else who is
00:40:54tied up in a lot of government contracts and has an awful lot to gain from having more control over
00:40:59the systems of government. It feels like it would be really different. And so there is something to
00:41:04the Elon Musk-ness of it all, outside of just like, my dude would like Grok to be used by
00:41:10the government for billions of dollars, right? You've been following Elon Musk for a long time.
00:41:17Put this man and this moment together for me. I feel like this is the thing that breaks my
00:41:21brain the most, is like, how did it happen that this particular guy is doing this,
00:41:26and that it is going this way so much like in his own specific image?
00:41:31I mean, he bought the government, right? Like that's what happened.
00:41:34For not even that much money.
00:41:36Not even that much money. Like he financed, you know, an incredible part of Trump's campaign.
00:41:41And like, look at Donald Trump. That man does not enjoy being president. He doesn't want to
00:41:47do the hard shit. He wants to go to his rallies and have people cheer for him and give a press
00:41:53conference. He doesn't want to think. He's not good at thinking.
00:41:57He went to the Super Bowl. That's like, he was the first sitting president to go to the
00:42:00Super Bowl, which I thought was very odd. And then I was like, well, wait, if I'm Donald Trump,
00:42:03and I got elected so that I don't go to jail, I'd go to the Super Bowl. Like,
00:42:07hell yeah, I'd go to all the things.
00:42:09I mean, he's taken over the Kennedy Center for the Arts. Like those are his priorities.
00:42:13He's entertainment, right? Like that's all he cares about. So like, that leaves a real power
00:42:18vacuum, which we've seen other people exploit. We saw other people exploit in the first Trump
00:42:24administration, and arguably Stephen Miller is still exploiting. But that's what Musk is doing.
00:42:30He is arguably the actual power center rather than the person who got elected president.
00:42:35And one of the things to keep in mind about Elon Musk is that he is like
00:42:42chaos at every company he's at. He is total chaos. And at the companies that he's been
00:42:47at for a long time, like SpaceX, which he founded, and Tesla, which he took over,
00:42:54there's padding around him to keep him from doing damage. They essentially keep him penned in.
00:43:00There are people whose job description includes managing his chaos, right?
00:43:05That's right. Yeah, like they absorb whatever nonsense he's up to. And we saw what happened
00:43:10at Twitter when that doesn't exist, which is, you know, he just starts ripping out wires.
00:43:16He does a bunch of stuff like moving servers randomly over Christmas break, remember?
00:43:23Or like, remember when retweets broke for some reason? There's no plan. He's just like,
00:43:33well, let's do something and see what happens. And that is the plan. And so, you know,
00:43:40when people talk about comparing what's going on in the government to what happened at Twitter,
00:43:46that's right. I mean, like, I have seen reporting of government employees reading books
00:43:53about what happened at Twitter so they know what to expect.
00:43:56Oh, wow.
00:43:56Okay? And like, they're not wrong. Like, he's gonna barge in as loudly as possible,
00:44:01and he's going to just see what happens. And like, if you think about somebody like Jeff Bezos,
00:44:07this is a man who is strategic and who is thoughtful and who you can say no to. Like,
00:44:16you can be in a meeting with him and say no to him. Jeff Bezos has not surrounded himself with
00:44:21yes-men to the extent that Elon Musk has. And everything in Elon Musk's life is within Elon
00:44:28Musk's control as far as he is concerned, including now the U.S. government.
00:44:33Right. A thing you've said over and over, over the years, that I think has
00:44:37continued to be more true every time you've said it is that history has taught Elon Musk
00:44:42that there are no consequences and that he can essentially do whatever he wants. And like,
00:44:47again, history suggests he's not wrong. If you were Elon Musk, wouldn't you also think
00:44:53there are no consequences? Yeah. I mean, like, remember the time he
00:44:57smoked weed on the Joe Rogan show and then, like, NASA gave him another contract?
00:45:01Right. Like, you know, like, there are arguably no consequences for Elon Musk. And so why would
00:45:10he behave in the way that any of the rest of us behave? Because he has learned that
00:45:13laws don't apply to him. And the only place I can think of in recent memory where that isn't true
00:45:20is the Twitter takeover where the Delaware Chancery Court made him buy the thing he
00:45:24promised he was going to buy. Right. The only judge left in America, it turns out.
00:45:29To that end, actually, what is your sense at this moment? We're talking on on Monday afternoon,
00:45:36which I should say because God only knows how much will change between now and Tuesday morning
00:45:40when this publishes. What is your sense of how well this is working? I think if I asked you a
00:45:45week ago, it seemed like this this sort of running roughshod over the federal government in the U.S.
00:45:51was working very well. And there was this big question of, like, who is going to fight back?
00:45:55Where are the Democrats? Why is no one doing something about the fact that this seems
00:46:00clearly illegal and must be stopped? That seems to have shifted a little, but I can't quite get my
00:46:07hands around how much I feel like it has shifted. Where do you think we are right now?
00:46:12I don't think we're in a very different place. I mean, I'm just going to be real with you.
00:46:16We've had judges making orders, but this is Elon Musk we're talking about. He does, at times,
00:46:23ignore court orders. Like you may remember, you know, when he pretended he was going to take
00:46:27Tesla private. Oh, right. The funding secured tweet. That's right. Yeah. And he made a settlement
00:46:33with the SEC where he was supposed to have a Twitter sitter, like somebody who watched his
00:46:37tweets about Tesla. And he just never did that. And we went all the way through the court system
00:46:43and they were all like, yeah, you need to do this. And there is still no Twitter sitter. So
00:46:47he has a history of ignoring legal orders. And I don't know who's enforcing any of these orders.
00:46:55I don't know who is interested in enforcing these orders. I don't know who's checking to
00:46:59see that he's following them. So there is a sense in which, you know, there is a real question right
00:47:05now of, like, do the courts matter? Because I feel like we're about to find out and we might
00:47:11not like the answer we're going to get. So there's that. But the other thing that I have
00:47:17noticed is that Steve Bannon, who notably hates Musk, has been daring Musk to go do his... I hate
00:47:27saying Doge because that, like, nice dog didn't do anything wrong. You know? It's really true.
00:47:35I'm sorry for that, Shiba Inu. Like, I really am. Like, this is not the dog's fault. But he's been
00:47:41daring Doge to go in and do what they're doing with the military, which is another moment where
00:47:47it's like that's where the rubber hits the road. Like, you know, we have some pretty severe national
00:47:53security risks already with the Treasury stuff. But I suspect when you go into the place where
00:47:59everybody's armed and you try to do stuff that's a national security risk, that runs a little
00:48:04differently. So I really think that we are in a very frightening place. And part of the reason
00:48:10I think we're in a very frightening place is because our lawmakers are not doing anything.
00:48:16You know, they're writing strongly worded letters and they're allowing themselves to be turned away
00:48:20from government buildings. You know, those are the Democrats. And then the Republicans, the ones
00:48:24who are not vociferously approving of this, they're Milford men. They're allowing their power
00:48:31to be stripped from them as members of Congress because they are scared of this guy, which is
00:48:36embarrassing. It is embarrassing. I cannot imagine right now being like, yeah, all right. Like,
00:48:44I'm cool with just being, you know, being useless as a lawmaker and having my power stripped from
00:48:51me and like the things that I vote for may or may not go through. Like, have some dignity, my God.
00:48:57Yeah. Yeah. It's been very funny watching people like go back to the founding fathers and talk
00:49:03about the, you know, the separation of powers and the extent to which they were like, they thought
00:49:08people would, you know, jealously guard their own power. And now everybody's just like, ah,
00:49:12who cares? Let Elon do whatever he wants. It's so embarrassing. It's so embarrassing.
00:49:17Yeah. So let's talk about the money of it all for a minute, because I think
00:49:21I've been thinking a lot about the reaction to this. And I think one of the strange things
00:49:27about the whole Doge chaos of the last week or so has been that it's both very real and
00:49:34very kind of theoretical. Like we're talking a lot about access to systems and what's read-only
00:49:38versus right. And everybody's talking about it as like an insider threat and a, and a hack.
00:49:44These are the kinds of things that you say to like people in the world and no one ever knows
00:49:47what to make of it. Right. And there's this sense of like, okay, they have access to some data.
00:49:52This is like the data privacy argument that we have. Right. Like, I, I think it's very hard for
00:49:56people to understand the risks of someone unauthorized having access to things like
00:50:03Treasury payment systems. Help me make this real for people and myself. Like,
00:50:09what is the actual issue and risk at hand here with these people having access to these systems?
00:50:14I can do this pretty easily actually, because it's tax season. It's tax season.
00:50:20And it's so funny you say that. I just filed my taxes literally yesterday as we're recording this.
00:50:24And I had a moment clicking the button of being like, is this a stupid idea? Like should I,
00:50:28but then I'm like, all that information's in the system anyway. Like, I don't know,
00:50:31maybe it's fine. Hopefully I get a refund. I don't know.
00:50:34So let's think, let's think about tax system, like the tax season for a minute, right? Like
00:50:40everybody's filing taxes. I filed mine, you filed yours. Those of you who are listening,
00:50:44if you haven't filed yet, like file your taxes, you know, it's time.
00:50:48Kudos for us being ahead of the game though. So let's just briefly pat ourselves on the back here.
00:50:52Yeah. So let's imagine the, you know, 12-year-olds that he has hired
00:50:59are in the COBOL, which is one of the programming languages. A language they don't know well,
00:51:05they don't have a lot of experience with because they're not old enough to have experience with it.
00:51:10And they create a bug. That is a bug that potentially affects the tax refunds of every
00:51:16American. Right. That's a good one. Okay. Again, this is not the grim
00:51:22scenario. This is the he-broke-retweets scenario, except it's your tax payments.
00:51:28I saw somebody who tweeted the other day, they were just like, no IRS,
00:51:32I don't have $200,000 in student debt. That's a bug in the system.
00:51:36That's right. Prove it. And I was like, all right, that's right.
00:51:40So, you know, that's, that's the minor end of things is like, there's an accidental screw up
00:51:45because this is an understaffed office filled with people who aren't expert in the programming
00:51:50language that most of our government systems are in. So they screw something up and then it
00:51:55affects every single last one of us. Right. I have to say, I find that very
00:51:59compelling, but it doesn't feel like that's the way it's being talked about, right? The like,
00:52:03what if they just break it is, is really scary and really complicated and really interesting,
00:52:09but it does feel like the way this stuff gets talked about in broader strokes is like,
00:52:14Elon Musk is going to have your social security number, which feels is like a different thing
00:52:18that I think is harder to wrap your head around. And maybe we're just talking about it the wrong
00:52:22way. Well, again, there's a range of risks, and I have just gone with, like,
00:52:30the most benign one, which is, fuck up your tax returns. Yeah. Okay.
00:52:36But if you think about, you know, for instance, that they've had these unsecured servers that
00:52:41they've been doing stuff with, it doesn't just mean that like Elon Musk has your social security
00:52:46number. It might also mean that everybody else in the world has your social security number.
00:52:50So that's cool. On top of that fun, one of the things to keep in mind about Elon Musk. And one
00:52:56of the things that I think is driving a lot of this anxiety is that he is a deeply vindictive man
00:53:03and he likes to pursue his enemies. And so if you were a person who has ever said anything negative
00:53:10about Elon Musk for any reason the idea that he might potentially have access to all of your
00:53:19secure financial data and like be able to mess with you personally is not beyond the pale. And
00:53:24like you think about the ways that he has, for instance, targeted specific government employees
00:53:30or specific reporters with his Twitter account, where he has, you know, essentially like put them
00:53:36on blast with the understanding that his audience is going to then dox and harass them. And you
00:53:42imagine him now having a bunch of sensitive data about all of these people who are speaking out
00:53:49about him. That is a much more precarious scenario. That's the sort of thing that can
00:53:55have a real chilling effect at minimum and at maximum can be quite dangerous.
00:54:01Yeah. Can you explain the Peter Thiel of it all to me? Speaking of deeply vindictive men,
00:54:08I keep seeing his name connected with this and I'm like, I sort of assumed he'd be involved
00:54:13here somehow. But how does Peter Thiel figure into this entire equation right now?
00:54:17Oh, that's sort of a question mark to me, but he is associated, I think. I mean,
00:54:21he and Elon obviously have a history. They're sort of frenemies. And some of Elon's hires
00:54:29have Thiel ties. JD Vance, who has been busily kissing ass, has Thiel ties. So there is this
00:54:36kind of network that Thiel is associated with that Musk is obviously drawing from.
00:54:43Thiel did sit out the election financially. So, you know, who knows what's going on there?
And unlike Musk, Thiel doesn't generally barge around loudly. However, a number of companies
tied to Peter Thiel, for instance, Palantir, stand to benefit from this sort of new order, right?
00:55:06Where there's a more AI-focused government and where defense contracting suddenly has to run
00:55:16through a number of Thiel allies. And you can also imagine a world in which a number of government
00:55:23functions get privatized. If you read Project 2025, which it seems pretty obvious is what's
00:55:29being executed here, the entire idea is to wreck the government as we know it,
00:55:36and then privatize stuff so people will just be happy that things function again.
00:55:41Right. Yeah, I mean, I think I keep vacillating between wanting to boil most of this down to
00:55:49essentially government contracts and privatizing the government because there's a huge amount of
00:55:53money in it for the people who get it and they can hand it to themselves, and thinking that can't
00:55:57possibly be the whole answer, and that it has to be more complicated than that. And I think, like,
00:56:03leaving aside the fact that some of the people involved in this, I think, do genuinely believe
00:56:07the things that they're doing for political and moral and whatever value-based reasons,
00:56:13is most of it defense contracts? If you boil it all the way down, is it kind of that simple in
00:56:18a lot of places here? I think that is a pretty big motivator. Yeah, because, like, think about
00:56:24how expensive AI is, and how terrible it has been, and how widely made fun of it is. Like,
I personally have switched away from using Google Search. Like, I pay for Kagi now,
00:56:36which, by the way, you know, for those of you who are listening who are like,
00:56:40gee, I wish Google functioned like it did in 2012, boy, do I have a service for you.
I highly recommend it. But, yeah, I mean, AI has made a lot
of things worse. And, like, you can see it, you interact with it, you know it doesn't work.
And so, thinking about ways to essentially bail themselves out: the funder of last resort
at this point is the government. So I think that is a huge part of it, for sure. There's also,
00:57:07obviously, ideological motivations, like the idea of privatizing everything is like
00:57:12a conservative wet dream. And then on top of it, there are a bunch of, like, weird conspiracy
00:57:17theories that, you know, people are varying degrees of bought into about what the government
00:57:22does and doesn't do. Like, every once in a while, I see some, like, weird stuff from, like,
00:57:27Musk, where it's like, can you imagine that the government has your social security number? And
it's like, okay, but, like, do you know who assigned me my social security number?
00:57:35Right.
00:57:36Like, do you know what it's for? Like?
This is what I mean. It's hard to know which of this stuff is
very scary, which is a little scary, and which is actually just, like, how things already were.
00:57:52But it does feel like everything is sliding in the direction of very scary because it is just
00:57:56a bunch of people you've never heard of doing things no one elected them to do. And that,
00:58:01on its face, is a problem. Like, even absent the rest of it, that on its face is part of the problem.
00:58:06Right. And the chaos is a problem, too. Because, like, if you think about it,
00:58:10one of the reasons why America is a dominant global power has to do with the fact that our
00:58:17treasury bonds are considered to be the safest investment on Earth. You know, they're used
00:58:24all over the place as, like, guaranteed money. And on Sunday, which, as we're talking right now,
00:58:31was yesterday, Donald Trump came out and was like, oh, well, Elon Musk has discovered that
00:58:38some of those treasury bonds aren't real. And it's like, you know you can crash the entire
00:58:44global financial system by saying that, right? And the reaction in the market has been muted
because, you know, you can't tell whether grandpa's sundowning again or whether he actually
00:58:56means it. But, you know, these are two men who have a history of not paying their debts,
00:59:02so that's very exciting. You know, it's the uncertainty in particular that I think is a
00:59:07problem beyond everything else. Because one of the reasons why, you know, for instance,
00:59:12people incorporate in Delaware is because there is a certain amount of certainty about how things
00:59:16turn out. You know how this is going to go. There's, like, a history and, like, people
00:59:21generally follow the history and, like, there is a, you know, an amount of de-risking that goes on
00:59:27because of that. So separately from whatever, you know, sort of other terrifying things may be
00:59:32happening, inserting this kind of uncertainty into a previously, like, almost taken for granted
00:59:41baseline of certainty is in and of itself really destroying something important.
00:59:47Totally. And so just to end on a true bummer of a note, I have been sort of assuming that
00:59:53this would eventually in some way kind of, I don't know, if not peter out, then at least,
00:59:58like, slow down. That the legal fights would slow things down. You kind of make the case
01:00:03it's the opposite. That if anything, we're due for this to accelerate. Can you just tell me why,
01:00:08and then I'm going to go feel sad about myself while we take a break?
01:00:11Well, here's the thing. I think this is an all or nothing gamble for Musk. At this point,
01:00:17he's broken so many laws that he is looking at a number of serious legal fights, if not actual jail
time, if he is stopped. So this is, you know, very much a big gamble. And if he wins,
01:00:31he's the king of the United States. And if he loses, his life becomes incredibly, incredibly,
01:00:38incredibly painful, much more so than any kind of pain he's ever experienced before.
01:00:43So I don't think he's going to slow down because he can't slow down. Because if he slows down,
01:00:50then he increases the likelihood that he is in fact stopped. And that is one of the reasons I
01:00:57think we're in a really scary moment is because this is such an existential gamble for him
01:01:02personally, that he's going to do whatever it takes to remain in charge.
01:01:08Fair enough. All right. Well, I desperately hope you're wrong, but I fear that you're right.
01:01:12We're going to have to do this again. This is, as you said, not going away. But until then,
01:01:17thank you as always, Liz.
01:01:18Yeah, my pleasure.
01:01:20All right, we got to take one more break,
01:01:22and then we are going to come back and do a question from the Vergecast hotline. We'll be
01:01:25right back. All right, we're back. Let's get to the hotline. As always, the number is 866-VERGE11.
You can also email vergecast at theverge.com. We love all of your questions. The hotline has been
01:01:43so, so fun the last few weeks, and I am extremely grateful to everybody who's reached out. Thank
01:01:48you again for all of your questions, all the time. We try to do at least one on the show every
01:01:52week, but frankly, we're getting so many good ones that we're going to have to figure out how
01:01:56to start doing more. This week, we have a question about the Super Bowl. Here it is.
01:02:02Hey, it's Peter in Brooklyn, and I'm watching the Super Bowl on, yep,
01:02:05Tubi. And it's so clear that the on-screen graphics are in 720p, not even the 1080p
01:02:13that's being upscaled to 4K, but the same 720p that's on broadcast. The edges are
01:02:21like jagged and gross, and it just bugs the crazy out of me. Thanks. Bye.
Nilay Patel is here, America's number one upscaling correspondent.
01:02:34Was that a question or a comment, sir?
01:02:36I think sometimes you just need to have feelings at the Vergecast hotline.
01:02:39I have no problem with this. If you just want to have feelings about the state of technology,
call us at 866-VERGE11. We'd love to hear from you. So you posted something on Bluesky during the Super Bowl
01:02:50in which you were annoyed at the on-screen graphics, which is why I brought you here,
01:02:54because I just want to hear how you feel as a man with a lot of thoughts about upscaling
01:02:58and fake Dolby Atmos. How'd the Super Bowl go? Well, the right team won.
01:03:04I didn't say who it was in the last episode, but the right team won.
01:03:08Yeah, that's fair.
01:03:08So that was good. I don't think those graphics were in 720p, because just the way that Fox's
01:03:14pipeline works, they produce the show in 1080p HDR, and then they go down to 720p for broadcast,
and they went up to 4K for Tubi. And so from what I understand, the whole production was in 1080p
01:03:30from tip to tail. The graphics did look bad, though. They did. I don't think that's a technical
01:03:36system problem. I think that's a design problem. There were drop shadows. It was very 90s.
01:03:44It was very 90s, but I do think I noticed it looking fuzzy also. The KC
01:03:50for Kansas City looked like it had been vector stretched and didn't look right.
01:03:55So I think they used a custom typeface based on the Fox logo for those letters, which is weird.
01:04:01And then maybe this is only because I have a gigantic TV, because I thought it was blurry,
01:04:08too. But it turns out they were using a border around the letters.
01:04:13It was a slightly different color that made it look fuzzy. So I think at a small size,
01:04:19you were seeing this weird gradient fuzz because there was a slightly different color border
01:04:26around the letters. Interesting.
01:04:28So it's a border. So on Kansas City, it was white letters with a very thin yellow border.
01:04:34And on Philadelphia, it's white letters with a very thin light green border.
01:04:39And that's not high contrast enough against the colors of the blocks.
01:04:45So it totally looks fuzzy. I completely agree. And then if you look above or any other stuff,
01:04:50the clock doesn't have the border, so it looks sharp. The score itself
01:04:53didn't have the border, so it looks sharp. I think a lot of people saw that.
And then bizarrely, the stats, where they were like, Jalen Hurts, four-for-five, 51 yards.
They have a drop shadow, which makes them pop even more.
Yeah, it was terribly designed. Yeah.
So I agree. This is just a mess of design. I saw people who responded to my tweet
saying that it looked like it was made in Paint Shop Pro. The point they were making is this is
better for phones and people are watching this on phones now. I don't think people watch the
Super Bowl on phones. And I think prioritizing that audience is weird. Right. Because most people
watch the Super Bowl around televisions, or with people, or in bars. Yeah. Like famously, it's
01:05:43the thing everyone watches on televisions. Yeah. It's like the Super Bowl's thing. Yeah.
01:05:47And so you can make an argument even in that case that maybe what you want to do
01:05:51is, you know, if you're watching in a bar on a TV, you want to make it very obvious what's going on.
01:05:58But I actually don't think this served that goal either. It made everything bigger. But in terms
01:06:03of information design did not make it easier to understand what was going on in a way because it
01:06:10looks so crazy. Yeah. Yeah. My favorite tweet was from Tom Fornelli, who's a sportscaster for CBS
01:06:17and stuff. And he said, after giving Tom Brady three hundred and seventy five million dollars,
01:06:20Fox only had twenty five bucks left to spend on a scoreboard design, which feels about right.
01:06:24That's good. Yeah. I tried to workshop a joke like that and I couldn't get there.
So I appreciate that someone nailed it. Yeah, I will say I'm looking at a screenshot
01:06:34of the bug and you're right that it is both sharp and blurry at the same time somehow,
01:06:38which suggests that it is correctly rendered bad design. How did you watch the Super Bowl?
01:06:44What was your what was your setup? I watched on Tubi. I only paid for YouTube TV 4K for one month
01:06:52when they had the Packers in the playoffs and then I canceled it,
01:06:56which I felt was appropriate. Like I paid extra money and then they lost the other playoff game.
So I canceled it. Saved money. Yeah. I was like, I'm done here. And then I was like, crap,
01:07:03I didn't cancel it. Like I didn't wait the extra two days to get to the Super Bowl.
So I watched on Tubi in 4K. It looked great. Fox, I think, did a good job. Yeah. As good as
they could have done. I read somewhere that the highest bit rate was 14.4 megabits,
which is not great. It's not the 80 megabits of Sony Pictures Core on my Bravia television.
01:07:24But it was, I think, good for live. It didn't block up. I always look at the confetti at the
01:07:29end of the Super Bowl to see how bad it is, like where they've made the tradeoff. And it wasn't
bad. It wasn't great. It wasn't bad. There have been broadcasts where the confetti looked like
a 1990s video game. We've seen that in the past. And that's literally
just compression and bit rate. If you don't have a high enough bit rate stream, you can't move
enough data. So you're going to compress everything down, and the confetti is moving too fast.
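To put that compression point in rough numbers, here is a back-of-the-envelope sketch of the average bit budget per pixel, using the 14.4 Mbps figure cited for Tubi's stream and the 80 Mbps cited for Sony Pictures Core; the 4K-at-60fps frame geometry is an assumption for illustration, not something stated in the conversation.

```python
# Back-of-the-envelope: average encoded bits available per pixel per frame.
# The 14.4 and 80 Mbps figures are the ones mentioned in the conversation;
# 4K at 60 fps is an assumed frame geometry for illustration.

def bits_per_pixel(mbps: float, width: int, height: int, fps: int) -> float:
    """Average bits the encoder can spend on each pixel of each frame."""
    return (mbps * 1_000_000) / (width * height * fps)

W, H, FPS = 3840, 2160, 60  # assumed 4K60 stream

tubi_stream = bits_per_pixel(14.4, W, H, FPS)
sony_core = bits_per_pixel(80.0, W, H, FPS)

print(f"14.4 Mbps stream: {tubi_stream:.3f} bits/pixel/frame")
print(f"80 Mbps stream:   {sony_core:.3f} bits/pixel/frame")
```

At well under a tenth of a bit per pixel, the encoder has to lean heavily on predicting each frame from the previous one; thousands of independently moving confetti pieces defeat that prediction, so the bit budget runs out and the picture degrades into visible blocks.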
So the compression can't keep up with all that motion. So you just get blocks; every block on the screen
gets pixelated. I can see it. And people are like, well, you only get
confetti once a year. I'm like, yeah, at the end of the Super Bowl. It is pretty
important confetti. And so that's the tradeoff. But I think 14.4
was enough, at least on my stream, the way I was watching it on Tubi, to handle it. And
the fake HDR was OK. And I think the upscaling was not good. Like I watched the halftime show
again on YouTube in 1080p HDR, which is the highest YouTube has it at, or at least last night
when I was watching it, that was the highest it had it at. And it looks substantially the same.
Yeah. Like my TV was doing as good a job upscaling the 1080p HDR to its 4K display as
Fox's native 4K upscaling. That's not great, you know; presumably Fox has more budget than Sony
did when they built the upscaler in my TV. Yeah. So I think it was just fine. Yeah, it held up,
which is not nothing for something like Tubi, which I'm sure has never had the volume of people
watching one specific thing as it did on Sunday. I read, in Sportico, which is just like a business of
sports publication, an interview with the Tubi folks, and they were really concerned about
signups, like people signing in to Tubi, downloading the app and getting all this stuff. And they were
01:09:24paying attention to how their competitors and Fox had done it in the past to get people to download
the Fox Sports app. And they realized there was a ton of friction around, you know, having an
account, entering a credit card number. And the Tubi executive was like, yeah, we can't
even take a credit card number. Like, we're just not that thing. Yeah. Like
01:09:43we're just like open the app and please start watching ads. So they were very confident that
people would get the app and, like, immediately start watching. And I think that kind of
zero-friction TV experience is very much what Tubi is betting on. Yeah. And it felt like
old-school TV. It was like two clicks and I was watching the thing. It was awesome. Like, it
made me feel like I was watching the pre-roll ad before the thing starts that you get on a lot of
these services now. And I was just like, oh no, this is just a Super Bowl ad. It
just started playing the television show. It was great. Did you try any of the Dolby Atmos
01:10:18shenanigans? Oh, I guess you don't have Comcast. You couldn't try the shenanigans.
01:10:21I don't think Tubi's audio is very good.
You know, when you watch a football game in, like, true 5.1, you get the crowd behind you and all
the stuff. None of that. It was a very flat sort of stereo mix. My TV, you know, my receiver was
like doing its own virtual upmixing. But even the halftime show was kind of quiet and muffled.
It was. One of the things I was most struck by from the halftime show was that
I couldn't tell if it was not well mixed on their end or if it was just coming out of
my speakers incorrectly. As you know, the speakers on my television are not famously fabulous.
01:10:58Your alarm clock that you play music through.
01:11:01Exactly. But that seemed to be an experience a lot of people had that the sound
01:11:05on the streaming side of things was not super impressive.
Yeah, I think they prioritized delivering 4K HDR. Yeah, quote unquote 4K HDR, to a lot
01:11:14of people for free, and people will be impressed by that because it's free and they can see it.
01:11:18And most people don't have the audio setups. Right. Or they have an inexpensive soundbar,
01:11:23so I think they just didn't prioritize that. I would like to see them prioritize that next time.
01:11:27Yeah, agreed. Yeah, I feel like this was a victory. Like all things considered,
01:11:32we've had some truly messy football streaming experiences over the years. This felt like a
01:11:36good one. It was a good one. I think overall, everyone has learned a bunch of lessons.
01:11:42But the part where it's still just not possible to watch a true 4K football game
01:11:50remains devastating in the United States of America.
01:11:53You know what I'm forever struck by is when they started using those Sony cameras in the
01:11:58end zone that get actual, honest to God, bokeh on the players. And all of a sudden,
01:12:03everybody was like, they're streaming this in 8K. And it's like, no, they just have
01:12:08apertures now. They're just doing bokeh.
01:12:12The first time they were doing it, they were using YouTuber cameras. They were using just A7s.
01:12:16Now I think they're using fancier cameras.
01:12:17But the gap between what we're getting and things that will look substantially better
01:12:23is actually pretty small. And it's just little things like,
01:12:26what if the background was blurry? That all of a sudden makes everything feel better.
01:12:30So it's like, there's actually so much room to make this stuff better without
01:12:33fundamentally rewriting the architecture of the internet.
01:12:36There's that. There's also the reason they were able to bring the A7s to the broadcast.
01:12:42We wrote about this a bunch at the time. A bunch of other business of sports websites
01:12:48wrote about it at the time, because what they needed to do to make that work was one,
01:12:54they needed to try it in a low-cost way. These broadcast operations are big, integrated,
01:13:00third-party operations. Fox hires a production company to show up with trucks and just do this.
01:13:06So there's not a lot of forward investment. It was pretty low cost to throw an A7 out there on
01:13:12a gimbal in the scheme of things. But then they weren't prioritizing just delivering to broadcast.
01:13:20And so if you have an A7 and a wireless thing, you might have to
rethink your entire broadcast system for that. And then because you're sending out over the air,
01:13:32in some cases, CBS still sends out 1080i. That's weird. They have to rethink all of that and work
01:13:40with all the graphics packages. And they had just gotten to the place where enough digital video
had taken over the production that they were like, screw it, send somebody out there
01:13:50with a wireless A7. Let's see what happens. It was possible. And then the A7s were good enough.
01:13:56And now that's all changed because people liked it so much, they fully invested in it.
But it's a really interesting thing: the distribution had to flip over so that enough of the gear they
01:14:06were using was already prepared for it, as opposed to being prepared to send out broadcast signals.
01:14:12And that's a great Verge story. We could write that story all day long. At the time, we kind of
01:14:17did because it was so much fun. But now all of that production pipeline is more geared towards
01:14:23internet delivery because that's where the industry is than television delivery. So they're
01:14:28able to do more YouTuber-y things, I guess you could call them. Including completely cheaping out
01:14:34on the graphics. Yeah. All right. Well, next year's Super Bowl is going to be on Peacock,
01:14:40so we have a whole year of weirdness to be expected there. It's going to be great.
Hopefully they can stream one football game in 4K. I just beg you: produce and
natively distribute the thing in 4K. I beg everybody. We can get this done in America.
01:14:55Executive order a 4K football game, man. Do it.
This is our job. We have 12 months, Nilay, you and me, to get this done.
01:15:01All the money he saved from just randomly ordering the treasury to not do pennies anymore.
01:15:054K football.
Love it. All right. Nilay, thank you.
01:15:09Yeah.
01:15:12All right. That is it for The Vergecast today. Thank you to everybody who was on the show,
01:15:15and thank you, as always, for listening. There's lots more on everything we talked about at
theverge.com. All of our stuff on OpenAI and Operator and deep research. I'll link it in the
01:15:23show notes, but there is a ton of news ongoing there. There's even more news, maybe unfortunately,
01:15:29ongoing with all of this Elon Musk and Doge stuff, so keep it locked to The Verge. We are doing our
01:15:34very best to keep it all up to date and all on our homepage, so keep it locked.
01:15:39And if you have thoughts, questions, feelings, or other government agencies you're curious about,
01:15:44you can always email us at vergecast at theverge.com or call the hotline 866-VERGE11.
01:15:49We really honestly love hearing from you. It is the best.
01:15:52This show is produced by Will Poore, Eric Gomez, and Brandon Kiefer.
01:15:55The Vergecast is a Verge production and part of the Vox Media Podcast Network.
Nilay and I will be back on Friday to talk about all the AI news, because it just keeps coming,
01:16:03all the government news, because it just keeps coming. I think there's some interesting gadgety
01:16:07news coming this week. Lots to do, lots to talk about. We will see you then. Rock and roll.
