
E36: New FTC Chair, breaking up big tech, government silent spying, Jon Stewart, wildfires & more


Chapters

0:00 Besties intro
3:37 Lina Khan appointed to the Chair of the Federal Trade Commission
7:02 Will Lina break up big tech? Which one will be first?
18:02 Potential repercussions to consumers
27:48 Sacks’ antitrust experience at PayPal vs. eBay, Visa & MasterCard
29:50 Google’s multi-trillion dollar data trove
35:40 The U.S. government's capability to silently take data while “gagging” Big Tech
53:32 COVID’s psychic shadow, Friedberg’s office landlord is still requiring masks
1:00:17 Jon Stewart’s lab leak bit on Stephen Colbert’s show
1:10:04 California’s wildfire risk increasing with climate change
1:20:57 Bestie summer plans


00:00:00.000 | What's going on, Sax LP meeting?
00:00:01.720 | Is it an LP meeting or are you going to lunch, Peter Thiel?
00:00:06.480 | A little layer poop?
00:00:07.840 | What's going on?
00:00:08.860 | It's 9 a.m.
00:00:09.760 | You must be, there must be a call going on here.
00:00:12.240 | Tutto bene, Saxy-poo.
00:00:13.100 | Tutto bene.
00:00:13.580 | Esa dich, Sax.
00:00:15.120 | Every week that Chamath is in Italy, another button gets undone.
00:00:18.540 | This is definitely...
00:00:20.000 | Let your winners ride.
00:00:24.240 | Rainman David Sax.
00:00:27.140 | I'm going all in.
00:00:30.000 | And it's said...
00:00:30.860 | We open sourced it to the fans and they've just gone crazy with it.
00:00:33.900 | Love you, besties.
00:00:34.680 | Queen of quinoa.
00:00:36.260 | I'm going all in.
00:00:38.020 | Hey, everybody.
00:00:38.960 | Hey, everybody.
00:00:39.400 | Welcome to another episode of the All In Podcast, episode 36.
00:00:44.240 | Back with us today on the program, the Queen of Quinoa, science spectacular.
00:00:51.220 | Friedberg is with us again after leading off last episode
00:00:55.580 | with a great...
00:00:56.580 | Friedberg science monologue.
00:00:59.480 | The crowd went crazy for it.
00:01:00.980 | How does it feel coming off that epic performance in episode 35?
00:01:04.040 | Tell us, what were you thinking going into the game?
00:01:06.000 | And yeah.
00:01:07.400 | Well, I was thinking I would talk about the Alzheimer's drug approval at Biogen.
00:01:11.800 | Got it.
00:01:11.900 | And then I felt like I did it when we were done.
00:01:13.560 | Great.
00:01:14.920 | Got it.
00:01:15.440 | It's just, it's like literally interviewing Kawhi Leonard after like a 50-point game.
00:01:18.680 | Okay.
00:01:19.340 | And with us, Rainman David Sax with layers for players.
00:01:23.960 | He's been styled and groomed.
00:01:26.280 | And he's in some random hotel room.
00:01:28.780 | How are you doing, Rainman?
00:01:29.820 | Good.
00:01:30.640 | Good.
00:01:30.960 | I'm not in a hotel room.
00:01:32.540 | I'm...
00:01:32.780 | Oh, your home just happens to look like a five-star resort.
00:01:36.220 | Got it.
00:01:36.480 | Forgot that.
00:01:36.980 | And give us an idea coming into today's game with the layers.
00:01:43.160 | You obviously are here to dominate and get your monologues up.
00:01:47.560 | Gotta be hard for you to look at the stat line and see yourself trailing
00:01:51.080 | in monologues behind the dictator.
00:01:53.000 | I'm of course referring to All In statistics.
00:01:55.340 | Yeah.
00:01:55.900 | Where some maniac is breaking down how many minutes we each talk per episode.
00:02:00.080 | Jason, I'm really happy with my performance.
00:02:02.360 | For me, it's about quality, not quantity.
00:02:04.520 | I like to stick and jab.
00:02:06.880 | Okay.
00:02:07.720 | Got it.
00:02:08.020 | Got it.
00:02:08.360 | What are we talking about right now?
00:02:09.620 | What, what, what is the dictator leading in monologues?
00:02:13.540 | What the hell are you talking about right now?
00:02:16.220 | Okay.
00:02:16.240 | There is a Twitter account...
00:02:17.960 | You know how the All In stans have a ton of skills? Like, there is an audience for
00:02:25.580 | this podcast.
00:02:25.680 | There are a ton of people listening to this podcast who have more skills than, you
00:02:32.560 | know... it's like the 5% of the most skilled people in the world listen to this
00:02:35.760 | podcast.
00:02:36.000 | So in addition to doing the merch, in addition to doing, who's the guy, Henry, who does all
00:02:40.560 | those incredible videos with animations.
00:02:42.720 | In addition to those.
00:02:43.800 | Henry Belcaster.
00:02:44.800 | Yeah.
00:02:44.900 | Henry Belcaster.
00:02:45.900 | Henry Belcaster crushing it.
00:02:46.900 | Those things are great.
00:02:47.900 | Those are amazing.
00:02:48.900 | Incredible.
00:02:49.900 | And then, uh, of course you have young Spielberg who led the charge dropping incredible tracks.
00:02:53.560 | Incredible.
00:02:54.560 | Incredible.
00:02:55.560 | that is analyzing... somebody, you know, put in the show notes
00:03:00.040 | a link to it, but they do some type of AI analysis of the audio
00:03:06.300 | files, and they tell us who had the most monologues, and then
00:03:09.720 | the running time and then historic running time. So
00:03:12.780 | they're actually looking at it, trying to figure out, you know,
00:03:15.080 | who is speaking the most, and they thought Friedberg was gonna
00:03:17.800 | run away with the episode. But he kind of disappeared in the
00:03:19.940 | second half of the game. And Chamath, obviously, camped in the
00:03:23.180 | corner and took his 27%. And they have a live pie chart of how
00:03:26.480 | much we each talk. I always have a very strong first
00:03:29.000 | and third quarter. Yes, absolutely. And then he gets
00:03:32.780 | frustrated when he passes the ball and somebody misses a shot.
00:03:35.360 | It's kind of like LeBron in the early days.
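[Editor's note: for readers curious how a fan account could compute those stats, once an audio file has been run through any speaker-diarization tool, per-speaker talk time is simple aggregation. A minimal sketch in Python; the speaker labels and segment times below are hypothetical, not the account's actual pipeline.]

```python
from collections import defaultdict

def talk_time_shares(segments):
    """Aggregate diarized (speaker, start_sec, end_sec) segments into
    per-speaker totals and percentage shares of total talk time."""
    totals = defaultdict(float)
    for speaker, start, end in segments:
        totals[speaker] += end - start
    grand_total = sum(totals.values())
    return {s: (t, 100.0 * t / grand_total) for s, t in totals.items()}

# Hypothetical segments, for illustration only.
segments = [
    ("Chamath", 0.0, 95.0), ("Jason", 95.0, 160.0),
    ("Sacks", 160.0, 240.0), ("Friedberg", 240.0, 300.0),
    ("Chamath", 300.0, 352.0),
]
for speaker, (secs, pct) in talk_time_shares(segments).items():
    print(f"{speaker}: {secs:.0f}s ({pct:.1f}%)")
```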
00:03:38.060 | So kicking off today, Lina Khan has been confirmed to the FTC with
00:03:41.780 | bipartisan support. Interesting. And this is obviously going to
00:03:45.020 | be a challenge for big tech. On Tuesday, the Senate voted 69 to
00:03:49.360 | 28 to confirm Lina Khan, who is a very well established
00:03:53.180 | critic of big tech. And this is obviously really unique because
00:04:02.300 | she's 32 years old. And she's leading the FTC, which is
00:04:05.660 | unbelievable. I did a little research on her and watched some
00:04:05.660 | videos. She's basically written two amazing papers. And the
00:04:11.480 | first paper came out in 2017, "Amazon's Antitrust Paradox"; the
00:04:14.480 | second one came out in June, and was about the separation of
00:04:18.320 | platforms and commerce. And when you hear her speak, she is
00:04:21.320 | incredibly
00:04:22.400 | credible and knowledgeable. It is as if one of the four of us
00:04:26.660 | were discussing this, she could come into this podcast and speak
00:04:29.120 | credibly about Amazon's businesses, as opposed to the
00:04:32.060 | charades we saw at different hearings where the senators and
00:04:36.620 | Congress people just absolutely had no idea what they're talking
00:04:39.740 | about. Some of the items I picked up from a talk she gave
00:04:43.280 | in Aspen is that she formed a lot of these opinions by
00:04:49.040 | talking to venture capitalists, who are not the people who are
00:04:52.080 | interested in the business.
00:04:52.340 | And she's also a very good example of a lot of the
00:04:55.160 | people who were concerned about Amazon's dominance and other
00:04:57.980 | companies in the competitive space. And she is looking at
00:05:02.060 | consumer welfare, one of the lenses of antitrust, which
00:05:05.360 | I'm sure David Sacks will have some thoughts on as our resident
00:05:08.660 | attorney here. And the framing of those in terms of harm of the
00:05:12.260 | consumer, she believes there's other harm that happens. And she
00:05:16.760 | thinks one remedy is to kill Amazon Basics, because the
00:05:20.240 | marketplace shouldn't own the goods as well. She's concerned
00:05:22.220 | about the concentration, because that creates fragility.
00:05:25.160 | And that is another type of consumer harm, while she freely
00:05:28.520 | admits that prices have gone down, services are free. And this
00:05:31.400 | is a consumer benefit. So she wants to rethink the entire
00:05:34.460 | concept. And she is savvy. She brought up Facebook buying
00:05:39.020 | Onavo, the reportedly spyware VPN, to give them a little advantage
00:05:44.000 | as to what was being used on phones, and maybe give them a
00:05:46.940 | little product roadmap information. She also brought up
00:05:51.500 | Amazon's VC arm, which is a great place to get information
00:05:54.920 | about the market,
00:06:05.840 | using data to invest in buying companies. Why wouldn't they? That
00:06:08.720 | makes total sense. That's great signal for them. She seems to
00:06:12.380 | want Amazon Web Services spun out, which I think would just
00:06:14.900 | double the value of it, or maybe add 50% of the value of it. And
00:06:20.000 | she gave very pragmatic examples, like maybe
00:06:21.980 | you turn on your Android phone, you would have to install maps,
00:06:25.220 | or maybe you would pick from the different maps that are out
00:06:28.100 | there, different programs, and there would be integration
00:06:30.740 | in them. And people could swap out, you know, MapQuest or
00:06:34.760 | Apple Maps in their Google searches. So a lot of actually
00:06:37.520 | very interesting, pragmatic approaches. And she doesn't
00:06:40.040 | think these need to be decade long lawsuits. She thinks this
00:06:42.660 | is going to be a negotiation, and that people will kind of
00:06:46.140 | work together on it. But this is all with the backdrop of
00:06:49.580 | partisan politics. And you know, one group of people,
00:06:51.860 | looking at this through the lens of wealth, and inequality, and
00:06:55.280 | another group looking at it through censorship. Sacks, since
00:06:58.220 | you are our counsel here. What are your thoughts on this
00:07:00.560 | appointment?
00:07:00.980 | Yeah, I mean, the interesting thing is that, you know, Lina
00:07:06.020 | Khan is the Bernie-approved candidate, she is liked by the
00:07:10.340 | progressive left, but at the same time, she got 21
00:07:13.460 | Republicans to support her. And so this nomination, you know,
00:07:18.620 | sailed through confirmation. I think what she's saying,
00:07:21.740 | I think there's a very good argument to it, that
00:07:25.160 | and I've said similar things in the past, which is, you know,
00:07:28.040 | what she's basically saying, especially in the case of
00:07:30.320 | Amazon is, look, you've got this company, Amazon that controls
00:07:33.200 | essential infrastructure, AWS, the whole distribution supply
00:07:36.980 | chain going all the way from the port to warehouses to
00:07:41.060 | logistics and distribution. That is going to be owned by a scaled
00:07:45.140 | monopoly player, and with massive
00:07:51.620 | economies of scale, it's pretty clear they're going to dominate
00:07:54.440 | that. And what they're doing is systematically going category by
00:07:58.100 | category, and using the monopoly profits they make by
00:08:02.540 | owning the sort of core infrastructure, and subsidizing
00:08:06.080 | their entry into each of these new categories with Amazon
00:08:09.920 | Basics and others, and she calls that, you know, predatory
00:08:12.920 | pricing. And she's afraid that Amazon is going to end up
00:08:16.520 | dominating every category that you could build on
00:08:19.760 | top of this core infrastructure.
00:08:21.500 | I think it's actually a pretty valid concern. I think you see
00:08:24.380 | something analogous happening with Apple and Google and the
00:08:27.200 | app stores. We had a congressional hearing pretty
00:08:29.600 | recently, in which you had Spotify and other apps complaining about
00:08:33.200 | what Apple was doing to them saying they are making our
00:08:36.020 | service non viable with the 30% rate that they're charging. You
00:08:40.400 | remember Bill Gurley had a great post about this saying just
00:08:42.920 | because you can charge a 30% rate doesn't mean you should.
00:08:45.920 | Right now we're seeing this blowback from this massive 30%
00:08:49.640 | rate. And you had Spotify
00:08:51.380 | saying, look, Apple is doing this to basically make us
00:08:54.800 | infeasible relative to Apple Music. So I think there is a
00:08:58.640 | legit point here, which is that if you own the monopoly platform,
00:09:02.780 | the sort of essential infrastructure, you cannot use
00:09:06.140 | it to basically take over every application that can be built
00:09:11.660 | on top of that platform. That I think is a very appropriate use
00:09:16.040 | of antitrust law. So I think that's the good here.
00:09:19.700 | Now, I think that there
00:09:21.260 | are some concerns, or some potential downsides. And, you
00:09:27.260 | know, the downside that I see is that we
00:09:30.500 | used to judge antitrust law in terms of consumer welfare. And
00:09:35.000 | so there was a limiting principle to the actions of
00:09:38.420 | government, which is you would just look at prices and the
00:09:40.400 | effect on prices. Here, you know, the sort of movement that
00:09:44.420 | Lina Khan represents, the so-called hipster antitrust
00:09:47.600 | movement, they're concerned about power, and they want to
00:09:50.660 | restructure
00:09:51.140 | markets to avoid sort of concentrations of power. I don't
00:09:58.400 | see the limiting principle there. And so I think what the...
00:10:00.740 | Would market share be a limiting principle?
00:10:00.740 | Well, it would be a limiting principle in terms of who you
00:10:03.440 | could take action on, but it wouldn't be a limiting principle
00:10:05.480 | in terms of how you would restructure the market. And I
00:10:08.480 | think what we're in for over the next few years is potentially a
00:10:12.260 | hyper politicization of big tech markets. I think these 21
00:10:17.180 | Republicans might soon feel like the dog who caught the bumper.
00:10:21.020 | In the sense that yes, they're finally going to have the
00:10:23.480 | regulation of big tech they've been calling for. But they might
00:10:26.780 | not like all of the results, because what could
00:10:29.540 | happen is a very intrusive meddling by government in the
00:10:34.580 | markets of technology. And it could go well beyond sort of
00:10:38.300 | this gatekeeper principle that we've been
00:10:42.320 | talking about that I think would be a valid reason to regulate.
00:10:45.680 | Chamath, I think she has to be careful in focusing on Amazon. So if you
00:10:50.900 | break down antitrust law, there are really three big buckets
00:10:54.140 | where the attack vectors are. And I'm not going to claim to be
00:10:58.100 | an expert, but I think they're relatively easy to understand.
00:11:01.940 | So you have the first principal body of law, which is called the
00:11:04.880 | Sherman Act. That's the thing that everybody's looked at. And
00:11:07.040 | that's, you know, sort of where most current antitrust enforcement
00:11:15.800 | action has failed on tech companies because it largely
00:11:18.920 | looks at the predatory nature of pricing. And it's
00:11:20.780 | really about the nature of pricing power that certain
00:11:23.720 | companies have. And you have to remember, this thing was written
00:11:25.880 | in the 1800s. And so, you know, what did people do when they
00:11:28.700 | control things, they just they drove prices up. Tech does the
00:11:31.940 | exact opposite, right? They constantly drive prices down.
00:11:34.940 | And what's counterintuitive is it turns out that in the olden
00:11:39.080 | days, driving prices up drove out competition. Today, driving
00:11:44.060 | prices down drives out competition. Yes, right. So you
00:11:47.420 | know, you make Gmail infinite storage, nobody else can
00:11:50.660 | do that. You make, you know, photos completely subsidized, you
00:11:55.820 | make certain music products effectively free. And you
00:11:59.720 | subsidize that, you know, you create enormous amounts of
00:12:02.600 | content, blah, blah, blah. So you have the Sherman Act, then
00:12:05.780 | somewhere along the way, we realized, okay, we need to add
00:12:08.060 | something, we created this thing called the Clayton Act that was
00:12:10.220 | around M&A. Right, we added to that, a lot of folks that are
00:12:14.300 | listening probably have heard of Hart-Scott-Rodino, HSR, we've
00:12:17.420 | all gone through it, right on M&A events, we have to file these
00:12:20.540 | HSR clearances, when you make big investments, for example, you
00:12:23.600 | know, I just made a climate change thing, we had to file
00:12:26.480 | HSR. And then there's this FTC, which is the Federal Trade
00:12:30.980 | Commission Act, that is where she can get, you know, if to use
00:12:34.940 | a poker term, you know, a little frisky, why? Because the FTC a
00:12:40.100 | has these two specific things, which says you can have an
00:12:43.280 | unfair method of competition, or an unfair or deceptive act or
00:12:48.080 | practice.
00:12:50.420 | [transcription garbled]
00:13:51.300 | We've had 15 or 20 years of Google, Facebook, less so Apple, by the way, using their edge to decrease price.
00:13:59.460 | For the first time, in the last quarter, both of these two companies,
00:14:03.500 | and they were the only two of big tech, announced an increase in pricing. They saw
00:14:09.300 | a diminishing of CPM inventory. They had to figure out ways to grow inventory as users
00:14:14.420 | started to stagnate. What they really said is, we're ramping up CPMs. CPMs, I think,
00:14:19.120 | were up 28, 30% in a quarter. Yeah. There's a lot of competition right now for ad space.
00:14:24.220 | If you put these two ideas together, which is step one is you surreptitiously basically take
00:14:28.920 | all the costs out of the system, and then step two, raise price over time,
00:14:31.720 | there's probably something there.
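[Editor's note: for a rough sense of the arithmetic Chamath is describing, ad revenue is impressions times CPM (the price per thousand impressions), so holding inventory flat while CPMs rise roughly 28% lifts revenue by roughly 28%. A toy calculation with entirely made-up numbers, not Google or Facebook figures.]

```python
def ad_revenue(impressions, cpm_dollars):
    # CPM is the price an advertiser pays per 1,000 impressions.
    return impressions / 1000 * cpm_dollars

impressions = 10_000_000_000               # hypothetical quarterly inventory
q1 = ad_revenue(impressions, 5.00)         # hypothetical $5.00 CPM
q2 = ad_revenue(impressions, 5.00 * 1.28)  # same inventory, CPM up 28%
print(f"Q1: ${q1:,.0f}  Q2: ${q2:,.0f}  growth: {q2 / q1 - 1:.0%}")
```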
00:14:33.620 | Friedberg, when we look at her age and her obvious, deep, deep knowledge, do you see that as
00:14:42.060 | an overall plus? Obviously, David framed her as the Bernie-approved candidate,
00:14:48.440 | but then-
00:14:49.100 | Then conceded that 20 Republicans are backing her. What do you think about the massive credibility
00:14:55.140 | she has, Friedberg, in terms of she actually understands this deeply, clearly?
00:15:00.140 | I'm sure she's not dumb, if that's what you're asking. I'm not sure-
00:15:07.220 | Well, I mean, it's a 32-year-old. I mean, have we seen an appointment like that before? I mean,
00:15:11.440 | I don't think so.
00:15:12.320 | Yeah, that's good for her.
00:15:13.620 | Yeah.
00:15:14.240 | So, I just feel like there's a-
00:15:19.080 | A bit of a cycle underway where we have this kind of anti-wealth-accumulation
00:15:28.000 | sentiment as an undercurrent right now. You know, obviously, Bernie and Elizabeth Warren
00:15:33.980 | and others are key vocal proponents of change that's needed to keep this kind of wealth
00:15:42.020 | disparity from continuing to grow. And one of the solutions is to reduce the monopolistic capacity
00:15:49.060 | of certain business models, specifically in technology. The downside that I don't think
00:15:56.480 | is realized, and that inevitably comes with this action under this new kind of business model
00:16:03.580 | of the technology age, or the digital age, is the damage to consumers. And so, you know,
00:16:12.120 | as Chamath and David pointed out, like, historically, antitrust has been about protecting the consumer.
00:16:17.480 | And the irony is that, you know,
00:16:19.040 | the more monopoly or the more monopolistic or the more market share Amazon gains,
00:16:24.000 | the cheaper things get for consumers. And it's unfair to small businesses and to business owners
00:16:31.720 | and to competitors, but consumers do fundamentally benefit. And so the logical argument she made in
00:16:37.660 | her paper that was widely distributed a few years ago, was around this notion that in this new world,
00:16:44.800 | it's not about consumer harm. And we need to look past the impact to consumers and look more at kind
00:16:50.760 | of the, you know, the fact that this company, maybe prevents innovation and prevents competition.
00:16:55.520 | But ultimately, if the consumer is harmed, in the resolution of that concern, we're not going to
00:17:02.760 | wake up to it for a while. And then consumers one day are going to wake up and they're going to be
00:17:06.500 | like, wait a second, why am I paying five bucks for Gmail? And you know, why am I paying an extra
00:17:11.000 | $10 for shipping to get my Amazon products brought to me every day?
00:17:14.720 | And, you know, all the things that I think we've taken for granted, in the digital age, with the
00:17:21.640 | advent of these, you know, call it monopolistic kind of business models, where they accumulate
00:17:25.680 | market share, and they can squeeze pricing and keep people out and the bigger they get, the cheaper
00:17:29.000 | they get, and therefore it's harder to compete. Consumers have benefited tremendously, I think all
00:17:34.040 | of us would be hard pressed to say, I would love to pay 10 bucks a month for Gmail, I'd love to pay
00:17:37.700 | for Facebook. And at the end of the day, these models, I'd love to pay more for shipping with
00:17:41.900 | Amazon. And so, you know,
00:17:44.640 | it becomes a value question, right? What do you value more? Do you value the opportunity for
00:17:48.900 | competition and innovation in the business world? Or do you value as a consumer, better pricing?
00:17:53.760 | And I don't think that we're really having that debate. And I think that that debate will
00:17:57.420 | inevitably kind of arise over the next couple of years, if...
00:17:59.860 | Friedberg, how much of this kind of played out...
00:18:02.160 | And I think to be clear, Friedberg, what you're saying is this is driven by the extraordinary
00:18:07.260 | wealth of Jeff Bezos, Zuckerberg, etc.
00:18:09.540 | It's easy to pinpoint that problem, and then not evaluate the repercussions to
00:18:14.560 | the consumers, if you try and change how business operates in a free market system. And these
00:18:20.320 | businesses are successful because they have customers that like them. And
00:18:26.500 | they drive pricing down in a competitive way, and they prevent people from coming in and competing,
00:18:30.640 | not by entering into contracts and anticompetitive behavior, all this sort of stuff. They're
00:18:35.620 | doing it because they're scaling and offering lower prices. I mean, this, like Peter Thiel and
00:18:39.820 | Marc Andreessen have separately argued for this in really intelligent ways, probably in a far more articulate way.
00:18:44.480 | than I can. And they did this early on, which is, you know, we want to find businesses that can become monopolies. Because if you can reduce your pricing and improve your pricing power with scale, it's going to be harder and harder for someone to compete. And therefore, the capital theory is: rush a bunch of capital into these businesses, help them scale very quickly. I mean, this is obviously the basis of Uber and others: get really big, really fast, create the moat, drop the pricing, and then no one can compete with you on pricing, consumers benefit, and you've created the big business and you've locked everyone out.
00:19:13.240 | Okay, so let me
00:19:14.400 | go around the horn here and frame this for everybody. Let's assume that big tech does get broken up. This is an exercise: we assume it gets broken up, and YouTube and Android are spun out, Instagram and WhatsApp are spun out, AWS is spun out, and, you know, third-party app stores are allowed on Apple's iOS platform for the first time. I want to know if this is good, bad or neutral for the following two parties. So these breakups occur: one, is it good, bad or
00:19:44.320 | neutral for consumers? And then, two, is it good, bad or neutral for startups? Sacks?
00:19:49.240 | I generally would lean towards saying yes, I mean, a lot depends on
00:19:54.640 | Good, bad or neutral for each party: startups, and for consumers.
00:19:58.660 | I think it could ultimately be good for both. But it really
00:20:03.120 | depends on how it's done. And I think there is a big risk here
00:20:07.940 | that this just degenerates into sort of hyper politicization, you
00:20:13.240 | get intensive
00:20:14.220 | amounts of lobbying by big tech in Washington. What happens is, you know, you have a good cop, bad cop, where Lina Khan just becomes the bad cop, she's there to kind of keep big tech in line, threatens to break them up. And then the good cop is, you know, Biden and the administration, and then they become the protection and the extortion racket; they raise, you know, ungodly amounts of money. And really, it'll be a bonanza for all elected officials, because now big tech's gonna have to increase its donations even more.
00:20:44.120 | That's the cynical take. So we could end up with something much worse than what we have now. But I think the words you're going to hear a lot, okay, are common carrier. Because what she seems to be saying is, look, if you're a tech monopoly that controls core infrastructure, we need to regulate you like a common carrier: you cannot summarily deny service to your competitors who are downstream applications built on top of your platform. Conservatives can get behind that, because that is the argument they've been making
00:21:14.020 | about Facebook cutting off free speech: you are a speech utility. You should be regulated
00:21:19.780 | as a common carrier, you cannot cut off people. summarily, you cannot discriminate against people
00:21:25.540 | who should be allowed to have free speech on your platform. And so I think there is, I think the left
00:21:30.760 | and the right here can cut a deal where they regulate these guys, these big tech companies
00:21:34.720 | as common carriers. I think that is what we're headed towards. So a bakery can deny service,
00:21:39.700 | as we talked about in a previous episode, to a gay couple who wants a cake, because it's a tiny little
00:21:43.720 | company and there's other choices. But when we're talking about Facebook and Twitter, there are not
00:21:47.740 | other choices. And once you're removed, like Trump has been from the public square, there is no
00:21:51.700 | recourse, you are essentially zeroed out. Chamath, is it good for startups, bad for startups, neutral,
00:21:56.860 | and the same thing for consumers, if, you know, one chunk of every company got cleaved off?
00:22:01.720 | It's unanimously good for startups in any scenario in which they get involved. And I
00:22:09.820 | think in most cases in which the government gets involved, it's,
00:22:13.420 | it's good for consumers as well. And why, in both cases?
00:22:17.440 | So for startups, it's just because I think right now we have a massive human capital sucking sound
00:22:24.640 | that big tech creates in the ecosystem, which is that there is an entire generation of people
00:22:32.980 | that are basically unfortunately frittering away their most productive years,
00:22:39.160 | getting paid what seems to them like a lot of money.
00:22:42.880 | Yeah.
00:22:43.120 | But it is effectively just, you know, payola from
00:22:51.640 | big tech to not go to a competitor or to a startup. So to explain that clearly, for example, like if you're a machine learning person, right,
00:22:57.640 | those machine learning people, you know, can get paid 750 to a million dollars a year to stay at
00:23:06.160 | Google. And instead, they won't go to a startup because they take sort of the bird in the hand,
00:23:10.180 | right? You multiply that by 100, right? Yeah.
00:23:10.780 | You multiply that by 100 or 150,000 very talented, you know, technical people. And that's actually
00:23:18.340 | what you're seeing every day. Now, those numbers are actually much higher, you know, if you're,
00:23:22.720 | if you're a specific AI person, you can get paid 5, 10 million dollars a year. My point is,
00:23:27.760 | they could have started a startup. And they could have, frankly...
00:23:31.840 | look, let's be honest, they go to Google, Facebook and whatever. And I don't think anybody
00:23:36.280 | sees the real value of what they're doing in those places except getting paid. Now, they're making a
00:23:40.480 | rational economic decision for themselves. And so nobody should blame them for that.
00:23:44.200 | But if startups had more access to those people, or if you know, those engineers finally said,
00:23:52.180 | you know what, enough's enough, I'm actually going to go and try something new. That's net
00:23:55.900 | additive to the ecosystem. It's net additive to startups, right? That's, that's, that's for them.
00:24:00.160 | And then for consumers, I think the reason why it's positive is that it'll start to show you
00:24:05.320 | in which cases you have been giving away something that
00:24:10.180 | you didn't realize was either valuable, or you didn't realize you were giving away
00:24:14.380 | in return for all of these product subsidies that you were getting. And I think that's the
00:24:20.440 | next big thing that's happening. You can see it in the enormous amount of investment Apple,
00:24:25.420 | for example, is making in both advertising the push to privacy as well as implementing the push
00:24:31.600 | to privacy. You know, this last WWDC, you know, they really threw the gauntlet down, you know,
00:24:37.120 | they were really trying to blow up
00:24:39.880 | the advertising business models of Google and Facebook. And as consumers become more aware of
00:24:45.760 | that, they're probably willing to pay more. So a simple example is, you know, there are a lot of
00:24:50.620 | people now who will pay higher prices for food, if they know it to be organic, right, there are
00:24:56.140 | people who will pay higher prices for electricity or for an electric car because of its impact or
00:25:00.820 | the lack thereof in the climate. So it's not to say that people always want cheaper, faster, better.
00:25:05.020 | Right. I mean, sometimes people will buy an iPhone because it's obviously protecting their privacy,
00:25:10.180 | and they know it's not an ad based model. And in fact, Apple is now making that part of their
00:25:15.040 | process. So, Freeberg, I asked the other gentleman, if they thought some large unit being chopped off
00:25:22.540 | of every company, YouTube, AWS, Instagram, you pick it would be a net positive for startups,
00:25:28.720 | or negative or neutral, and the same thing for consumers. What do you think?
00:25:32.380 | Which gentleman did you ask? You mean,
00:25:34.720 | I was specifically referring to the ones who are wearing layers.
00:25:38.680 | Hello, sir.
00:25:39.520 | Yes. I'm using the term lightly.
00:25:43.420 | So if you guys go back a few years ago, you'll remember there were these,
00:25:46.780 | I think there were congressional hearings. And Jeremy Stoppelman from Yelp was pretty vocal
00:25:51.400 | about how Google was redirecting search engine traffic to their own kind of reviews. And they
00:26:00.160 | were pulling Yelp content off the site. But then they said to Yelp,
00:26:04.420 | if you don't want us to pull your content, you can turn the web crawler toggle off and we won't crawl
00:26:09.160 | your site, but your site is publicly available, we can crawl it and we show snippets on our homepage.
00:26:13.720 | But then their argument was, well, you're using our content to drive your own reviews. And they
00:26:17.800 | made this whole kind of case that Google's kind of monopoly in search was harming their ability
00:26:23.020 | to do business. The counter argument was, well, if you guys have a great service, consumers will go
00:26:28.600 | to your app directly or your website directly to get reviews, they won't go to Google. And so it
00:26:33.100 | created a little bit of this kind of...
00:26:35.980 | I think there was some follow up. And this is all very much related, because ultimately,
00:26:40.780 | if he was able to get Google to stop providing a review service, his business would do better.
00:26:47.620 | Because Google would effectively redirect search traffic to his site,
00:26:51.640 | as opposed to their own internal site. So it is inevitably the case that in house apps or
00:26:57.700 | in house services that compete with third party services when you're a platform business are,
00:27:03.580 | you know, if they're removed, it's certainly going to benefit the competitive landscape,
00:27:07.780 | which is typically startups. You know, imagine if Apple didn't have Apple Maps pre installed
00:27:12.520 | on the iPhone, everyone would download and use Google Maps, right? I mean,
00:27:16.300 | or MapQuest, or whatever. And so, you know, or whatever startup
00:27:21.340 | came along like Waze and said, Hey, we've got a better map. But because they have this ability
00:27:26.080 | to kind of put that Apple Maps in front of you as a consumer, and it's a default on your phone,
00:27:29.860 | you're more likely to just click on it and start using it and you're done. It certainly opens up
00:27:33.460 | this window. But I think the question is, what's ultimately best for the consumer? If you believe
00:27:39.160 | that consumers will choose what's best for themselves, you're starting to kind of manipulate
00:27:44.260 | the market a bit. And Sacks, I don't know, I think you've got a different point of view on this.
00:27:47.740 | But yeah, yeah. Well, I'm a free markets type
00:27:51.040 | of guy. But my experience at PayPal really changed my thinking on this because PayPal
00:27:56.440 | was a startup that launched effectively as an involuntary app on top of the eBay market. At
00:28:04.060 | that time, eBay had a monopoly on the auction market. And that was the key sort of beachhead
00:28:09.340 | market for online payments. So we launched on top of eBay, they were constantly trying to dislodge
00:28:15.700 | us and remove us from their platform. And really, the only thing keeping them from just switching us
00:28:20.740 | off was an antitrust threat. We actually spun up, you could call it a lobbying operation,
00:28:26.920 | where we would send information to the FTC and the DOJ and say, "Listen, you've got this
00:28:31.540 | auction monopoly here that's taking anti-competitive actions against us, this little
00:28:37.120 | startup." And so we were able to rattle the saber and sort of brush them back from the plate from
00:28:44.680 | taking a much more dramatic action against us. And frankly, we did something kind of similar
00:28:50.440 | with Visa MasterCard because PayPal was essentially an application on top of Visa MasterCard as well.
00:28:55.780 | We offered merchants the ability to accept Visa MasterCard, but also PayPal payments,
00:29:00.700 | which were gradually eating into and supplanting the credit card payments.
00:29:04.720 | And so Visa MasterCard had a very dim view of PayPal and they were constantly making noise about
00:29:13.000 | switching us off. And I do think that without the threat of antitrust hanging over these big
00:29:19.240 | monopolies or
00:29:20.140 | duopolies, it would have been very hard for us as a startup to get the access to these networks
00:29:27.040 | that we needed. And so it really kind of changed my thinking about it because if you let these giant
00:29:34.660 | monopolies run wild, run amok, they will absolutely stifle innovation. They will become gatekeepers.
00:29:43.000 | And so you have to have the threat of antitrust action hanging over their heads or you will stifle
00:29:49.840 | innovation.
00:29:50.200 | Absolutely. I mean, if you just look at the interest in Google Flights over time,
00:29:54.340 | I'm looking at a chart right now, we'll put it into the notes. Google Flights,
00:29:57.700 | you know, I know some of us don't fly commercial anymore, but you know,
00:30:01.660 | for somebody who's looking for flights on a regular basis, watching Google intercept
00:30:07.540 | flight information, put up Google Flights, and it's an awesome product. And just Expedia and
00:30:14.380 | Booking.com have collapsed.
00:30:15.700 | So Jason, that was a company called ITA Software based out of Boston.
00:30:19.540 | And ITA was acquired by Google. ITA was the search engine behind flight search for most companies. It
00:30:26.380 | was like 70 PhDs. They were all statistics guys. And they basically built this logistical model
00:30:31.360 | that identified flights and pricing and all this sort of stuff.
00:30:34.060 | Oh, wow. So that should never have been allowed.
00:30:36.760 | Well, they created a white label search capability that they then provided,
00:30:41.260 | and they were making plenty of money providing this as a white label search capability to
00:30:45.040 | Expedia and Kayak and all the online travel agencies.
00:30:49.240 | And Google wanted to be in that business because travel search was obviously such a big vertical.
00:30:53.980 | And rather than just buy a travel search site, they bought the engine that powers travel search
00:30:59.380 | for most of the other sites.
00:31:00.280 | So gangster.
00:31:00.940 | And then they also revealed the results in their own search result homepage,
00:31:05.260 | which effectively cut off the OTAs and the OTAs are big spenders on Google ads.
00:31:10.180 | So basically Google, this is how nefarious it is. If I'm hearing what you're saying,
00:31:14.200 | Friedberg correctly, they watched all this money being made by
00:31:18.940 | those OTAs. They watched where they got their data from, then they bought their data source,
00:31:23.920 | and then they decided, you know what, we won't take your cost per click money.
00:31:27.580 | We'll just take your entire business.
00:31:29.260 | I don't know. So let me just say it another way. What's best for consumers?
00:31:33.220 | So does a consumer because what happens a lot in
00:31:35.860 | the dictatorships, I guess, don't want to make money in online advertising.
00:31:39.640 | There are a lot of these ad arbitrage businesses is one way to think about it,
00:31:43.660 | where, you know, a service provider will pay for ads on Google to get traffic. The
00:31:48.640 | ads will come to their site, and then they will either make money on ads or,
00:31:52.720 | you know, kind of sell that consumer services. Right. And so that's effectively what the OTAs
00:31:57.280 | were: they became intermediaries on online search, and intermediaries that were
00:32:02.560 | arbitraging Google's ad costs versus what they could get paid for the consumer. And so Google
00:32:06.700 | looked at this, and they're like, wait a second, we're only capturing half the pie.
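[Editor's note: as a concrete toy model of the arbitrage being described, an OTA pays Google per click, converts some fraction of that traffic into bookings, and keeps the spread; Google "capturing the whole pie" means collecting what the traveler pays without that margin in between. All numbers below are invented for illustration.]

```python
def ota_arbitrage(clicks, cpc, conversion_rate, commission_per_booking):
    """Toy OTA economics: buy search clicks, convert a fraction to
    bookings, earn a commission per booking; profit is the spread."""
    ad_spend = clicks * cpc
    bookings = clicks * conversion_rate
    revenue = bookings * commission_per_booking
    return revenue - ad_spend

# Invented numbers: 100k clicks at $1.50 CPC, 3% convert, $60 commission.
print(ota_arbitrage(100_000, 1.50, 0.03, 60.0))  # 30000.0 profit on the spread
```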
00:32:09.880 | And consumers don't want to have to click through three websites to buy a flight or
00:32:14.140 | buy a hotel. And by the way, if they did, they would keep doing it. So why don't we just give them
00:32:18.100 | Right.
00:32:18.340 | the paid result right up front, and then consumers will be happier, the less time they have to spend
00:32:22.420 | clicking through sites and looking at other shitty ads, the happier they'll be. And the product just
00:32:26.500 | works incredibly well. Consumers love it. So make consumers' lives less arduous,
00:32:30.400 | while building a power base that then could make their lives miserable.
00:32:34.480 | What I think Lina Khan is saying, though, is you can't just look at the short term interests of
00:32:39.340 | consumers, you got to look at their long term interests. What's in the long term interest of
00:32:42.700 | consumers is to have competition. In the short term, these giant monopolies can engage in
00:32:48.040 | predatory pricing to lower the cost for consumers. And so just looking at the price
00:32:53.500 | on a short term basis isn't enough. And they can trick people into giving
00:32:57.580 | them something else that they don't know to be valuable. So in the case of these,
00:33:01.120 | you know, a lot of these companies, what are they doing? They're tricking them to get enormous
00:33:05.980 | amounts of user information, personal information, user generated content, and they get nothing for
00:33:13.180 | it. And then on the back of that, if you're able to build a trillion... look at the value
00:33:17.740 | that YouTube has generated economic value, and then try to figure out how much of that value is
00:33:23.920 | really shared with the creator community inside of YouTube, I'm guessing it's less than 50 basis
00:33:29.200 | points. 55% of revenue. Yeah, but you're saying downstream with all that data, Google is making
00:33:36.460 | a massive amount of money. I just wanted to if you impute the value of all of the PII that Google
00:33:41.380 | basically... explain PII... personally identifiable information: all the cookies that they drop, all
00:33:47.440 | the information and you equate it to an economic enterprise value, not necessarily yearly revenue,
00:33:52.660 | like a discounted cash flow over 20 years, you would be in the trillions and trillions of dollars.
00:33:58.000 | And then if you discounted the same 20 years of revenue share that they give to their content
00:34:02.800 | producers, it will be in the hundreds of billions of dollars at best. And so you're talking about
00:34:08.500 | an enormous trade off where Google basically has built, you know, a multi trillion dollar asset,
00:34:17.140 | and has leaked away less than 10 or 15% of the value. But that's an example where they are giving
00:34:23.140 | people something that they think is valuable. But in return, they're able to build something much,
00:34:27.820 | much more valuable.
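[Editor's note: Chamath's comparison is effectively two discounted-cash-flow sums over the same horizon: the present value of 20 years of data-derived cash flows versus 20 years of creator revenue share. A minimal sketch; the cash-flow figures and discount rate are illustrative assumptions, not actual Google numbers.]

```python
def present_value(annual_cash_flow, years=20, discount_rate=0.08):
    """Sum of equal annual cash flows discounted back to today."""
    return sum(annual_cash_flow / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# Illustrative assumptions only.
data_value_per_year = 150e9    # hypothetical annual cash flow from user data
creator_share_per_year = 10e9  # hypothetical annual revenue share to creators

pv_data = present_value(data_value_per_year)
pv_creators = present_value(creator_share_per_year)
print(f"PV of data asset: ${pv_data / 1e12:.1f}T")
print(f"PV of creator payouts: ${pv_creators / 1e9:.0f}B "
      f"({pv_creators / pv_data:.0%} of the asset)")
```

On those assumed inputs the asset is worth on the order of trillions while the payouts discount to under $100 billion, which is the shape of the trade-off being claimed.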
00:34:34.060 | I just want to address Sacks' point, which is that the regulators are now going to start to think about the long term interest of the consumer over the short term interest of the
00:34:37.960 | consumer, as effectively giving the regulatory throttle to elected officials. And this means,
00:34:46.840 | this means that you're now giving another throttle, right, another control joystick
00:34:52.000 | to folks that may not necessarily come from business, that may not necessarily have the
00:34:59.020 | appropriate background, and that may have their own kind of political incentives and motivations
00:35:03.400 | to make decisions about what is right and what is wrong for consumers over the long term. And
00:35:07.540 | ultimately, those are going to be value judgments, right? There's no determinism here, there's no
00:35:11.620 | right or wrong, they're going to be decisions based on the kind of opinion and nuance of
00:35:16.540 | some elected people. And so it is a very dangerous and kind of slippery slope to end up in this world
00:35:21.880 | where the judgment of some regulator about what's best for consumers long term versus the cold hard
00:35:27.580 | facts, oh, prices went up prices didn't, you know, but really saying, well, this could affect you in
00:35:32.380 | the future in this way, starts to become kind of a really, you know, scary and slippery slope. If we
00:35:38.320 | kind of embrace this new regulatory order.
00:35:40.600 | All right, moving on. Big news this week: Apple had a gag order, it has been revealed.
00:35:46.240 | This is unbelievable.
00:35:47.920 | It's pretty crazy. And we only have partial information here. But the Justice Department
00:35:53.500 | subpoenaed Apple in February of 2018, about an account that belonged to Donald McGahn,
00:35:59.020 | who obviously was Trump's White House counsel at the time, and obviously was part
00:36:05.140 | of the campaign. He is very famously known for being interviewed by Mueller. And at that,
00:36:11.380 | this is the time period, by the way, we're talking about here, February of 2018, when
00:36:15.940 | Mueller was investigating Manafort, who, of course, was super corrupt and went to jail and
00:36:20.500 | then was suddenly pardoned. Because he was also involved in the campaign in 2016. It's possible
00:36:25.660 | that this related to Mueller, it's unknown at this time. Many other folks were also caught up in this
00:36:31.420 | dragnet. Rod Rosenstein was a second one, and it's unclear if the FBI agents were investigating
00:36:36.580 | whether McGahn was the leaker or not. Trump had previously ordered McGahn, the previous June, to have
00:36:45.640 | the Justice Department remove Mueller, which McGahn refused, threatening to resign. And McGahn later
00:36:51.160 | revealed that he had in fact leaked his resignation threat to the Washington Post.
00:36:55.720 | According to the Times, the disclosure that agents had collected data on the sitting White House
00:37:00.640 | counsel, which they kept secret for years, is extraordinary. Go ahead, Sacks.
00:37:05.320 | Well, I just don't think you got all the facts out here. I think you're missing some of the key
00:37:08.920 | facts. So the Justice Department under Trump starts this investigation into leaks of classified
00:37:14.560 | information.
00:37:15.340 | They're on a mole hunt, effectively. And the DOJ subpoenas records from
00:37:23.980 | Apple, and it goes very broad, and they end up subpoenaing the records, not just of McGahn,
00:37:28.720 | who's the White House counsel, which is very bizarre and curious, that they'd be investigating
00:37:32.440 | their own White House counsel. But they also...
00:37:35.380 | Well, it was a leaking ship.
00:37:37.300 | Yes, but they also subpoenaed records of Adam Schiff and Swalwell and members of the House
00:37:43.540 | Intelligence Committee. Yeah.
00:37:45.040 | And so you have now an accusation which is being breathlessly reported on CNN and MSNBC that here you
00:37:53.140 | had the Trump administration investigating its political enemies and using the subpoena power
00:37:58.360 | of the DOJ with Apple's compliance to now spy on their political enemies.
00:38:04.480 | Those are some big jumps.
00:38:07.480 | That's the setup.
00:38:07.900 | Yeah. And those are some big jumps, because according to Preet Bharara and some other folks
00:38:13.960 | who are in the industry, who have done these actual subpoenas, they could have been subpoenaing, you know,
00:38:19.540 | one of Manafort's, you know, corrupt, you know, partners in crime. And then those people, he could
00:38:25.540 | have been talking to many people in the Trump administration and then subsequently family
00:38:29.020 | members and others. So he might have not been the target. He could have been caught up in the
00:38:33.100 | metadata of other people. Yeah. So this might not be Trump saying, get me his iPhone records. It
00:38:39.280 | could be there's some dirty person. They know they're dirty. And that person had reached
00:38:43.660 | out to other people and they might have even done one more hop from it.
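[Editor's note: the "hop" mechanic Jason is describing is breadth-first expansion over a contact-metadata graph: start from one target, add everyone they contacted, and optionally expand one more hop. A toy sketch just to show how quickly a dragnet widens; the contact graph here is invented.]

```python
from collections import deque

def expand_dragnet(contacts, seed, hops):
    """Return everyone reachable from `seed` within `hops` hops
    of a contact-metadata graph (breadth-first expansion)."""
    seen, queue = {seed}, deque([(seed, 0)])
    while queue:
        person, depth = queue.popleft()
        if depth == hops:
            continue
        for other in contacts.get(person, ()):
            if other not in seen:
                seen.add(other)
                queue.append((other, depth + 1))
    return seen

# Invented contact graph for illustration.
contacts = {
    "target": ["aide_1", "aide_2", "lawyer"],
    "aide_1": ["staffer_a", "staffer_b"],
    "lawyer": ["counsel_office", "family_member"],
}
print(expand_dragnet(contacts, "target", hops=2))  # 8 people from one seed
```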
00:38:46.840 | I mean, okay, that's one version. Yeah. And then, you know, the other, the other version,
00:38:54.520 | which is important is you subpoena your own lawyer by going to Apple, getting basically
00:39:02.140 | God knows what data associated with this man's account. And then, you know, institute a gag
00:39:08.560 | order on that company so that they can neither tell the person until now when the gag order expired,
00:39:13.360 | nor tell anybody else, nor have any recourse to the extent that they think that this is
00:39:17.440 | illegitimate. That to me smells really fishy. And so, you know, like there are other mechanisms
00:39:24.040 | that we know of, like physical requests and other things that these big companies have
00:39:27.640 | to deal with all the time. This, at least the way that it's written and how it's been reported is
00:39:32.680 | something beyond the pale. And so I think you have to deal with this question of, like,
00:39:36.820 | what the hell was going on over there? Yeah. It does seem like they were going, uh,
00:39:43.060 | I mean, you know, kindly, maybe mole hunting; more nefariously, witch hunting,
00:39:48.580 | um, but they were trying to pin it on people and they may have used this blanket sort of
00:39:55.660 | plausible deniability of the Russia, you know, imbroglio, but really what these guys were doing was
00:40:02.500 | they were investigating anybody that they thought was a threat. And that is a really scary thing to
00:40:07.900 | have in a democracy. And then the fact that these big tech companies basically just turned it over
00:40:12.760 | to the government, didn't have any recourse to protect the user or to inform the public.
00:40:16.540 | Forget Trump for a second. I think we don't necessarily want that to be the
00:40:22.120 | precedent that holds going forward. Yeah. And the interesting thing here is that
00:40:25.720 | Sacks, Jeff Sessions, Rosenstein and Barr all say they're unaware of this. So what would be
00:40:32.860 | the charitable reason they were unaware of it or what would be the nefarious reason,
00:40:37.900 | or is that important at all? Because that's really strange. Well, they would go after the White House
00:40:42.460 | counsel and Adam Schiff, and those top three people would have no idea? Are they lying?
00:40:48.520 | I mean, what's next? You know, are we gonna basically go to a point where, like,
00:40:52.420 | you know, every single post that one makes on Facebook is
00:40:58.000 | basically surveilled? Um, if you make an anonymous post on Twitter, will you be tracked down?
00:41:04.060 | I remember like as much as everybody thinks there's anonymity on the internet,
00:41:06.820 | there really isn't. And you should just completely assume that you are trackable,
00:41:12.160 | are being tracked, have been tracked. Everything is in the wide open. It's just a matter of whether
00:41:16.000 | it's disclosed to you or not, or whether it's brought back to you or not. So, yeah. So look,
00:41:20.080 | I mean, I agree with Chamath that this stinks and it's an invasion of people's civil
00:41:25.240 | liberties, but I would not make it too partisan because the Obama administration was engaging in
00:41:31.720 | similar, uh, activity back in 2013. And I don't think people realize this.
00:41:36.700 | there's an old saying in Washington that the real scandal is what's legal.
00:41:41.860 | The fact of the matter is that what the Trump administration did was certainly suspicious and
00:41:46.000 | it might have been politically motivated. We don't know, but it was legal. The DOJ convened
00:41:51.040 | a federal grand jury, got these, uh, subpoenas, presented them to Apple and got this
00:41:56.560 | information. And in a similar way, back in 2013, the Obama administration did something similar.
00:42:02.140 | It's quite extraordinary. They subpoenaed the records of the AP.
00:42:07.120 | For two months, they got the records of reporters in
00:42:11.560 | five branches of the AP and all their mobile records. And they were on a mole hunt to try and
00:42:17.080 | find leakers of classified information. So the Trump administration basically did exactly what
00:42:22.600 | the Obama administration did. The only new wrinkle is that Obama only went after reporters; they
00:42:27.880 | actually subpoenaed records of members of Congress. You're missing
00:42:33.160 | one huge difference. Trump was under investigation, uh, for espionage and treason at the time. So it is
00:42:41.260 | completely different. I don't think it's that different, in the sense that Trump
00:42:46.660 | used powers that were pioneered by the Obama administration. They just
00:42:50.980 | took them one little step further. And in addition to that, Sax,
00:42:54.880 | when Obama did it, all the top brass at the Department of Justice were aware of this. And in
00:43:00.220 | this case, you have three people who are running the Department of Justice all claiming they don't
00:43:04.180 | know. No, in 2013, there's a New York Times article on this. I'm gonna post it in the show notes,
00:43:09.460 | but it said that, first of all, the AP was not informed about the subpoenas until a number
00:43:17.440 | of months later. So it was a secret seizure of records, same thing here with the gag order.
00:43:23.080 | And so you have people being investigated. They don't even know they're being
00:43:27.280 | investigated. They can't even get a lawyer spun up to oppose the invasion of their rights.
00:43:32.380 | I agree with you, but the attorney general knew about that.
00:43:35.860 | Maybe the attorney general did, but the White House claims that it didn't know.
00:43:39.160 | So in any event, I mean, look, my view on this is that we shouldn't try to make
00:43:44.380 | this too partisan. What we have here is an opportunity to hopefully get some bipartisan
00:43:48.880 | legislation to fix the issue. And I think the fix should be this, that when you investigate somebody,
00:43:57.040 | when you subpoena records from a big tech company, you have to notify them.
00:44:01.120 | You should not be able to do that secretly because the fact of the matter is that Apple
00:44:04.780 | and these other big tech companies don't have an incentive to oppose the subpoena.
00:44:08.860 | They're not your lawyer. And actually, Brad Smith, the president of Microsoft, had a great
00:44:13.960 | op-ed in the Washington Post, which we should put in the show notes, where he said
00:44:19.180 | these secret gag orders must stop. He said that the old way of the government subpoenaing records
00:44:25.600 | is that you would have essentially offline records. You'd have a file cabinet, and the
00:44:30.280 | government would come with a search warrant and present the search warrant to you. And then you
00:44:34.120 | could get a lawyer to oppose it. Well, they don't do that anymore, because your records aren't in a
00:44:38.560 | file cabinet somewhere; they're in the cloud. And so now they don't even go to the person who's being
00:44:43.000 | investigated. They just go to a big tech company, seize the records, and then put a gag order on top
00:44:47.800 | of it. So you don't even know you're being investigated. That's the part of it that
00:44:50.740 | stinks. And by the way, it's even more
00:44:52.120 | pernicious than that, Sax, because to combine this with the previous story, what incentive
00:44:59.380 | does Apple have to say to an administration that could break them up? We're not going to cooperate.
00:45:04.480 | Of course. Zero incentive. They are not your agent in this. And here's the thing. Those are
00:45:11.240 | your records. They're in the cloud, but they're your records. In every other privacy context,
00:45:15.280 | we say those records belong to you, not to big tech. So why is Apple moving everything to your
00:45:21.160 | phone? Right. But just to finish this point, why should the government be able to do an end run
00:45:25.120 | around you, the target of the investigation, go to big tech, get your records from--
00:45:30.400 | Because they're not your records. Well, first of all, they're not your records. These companies
00:45:34.180 | tricked all of us by giving it to us for free so that we
00:45:38.320 | gave them all of our content. They are not just the custodian. They are the trustee of our content.
00:45:45.520 | And it's a huge distinction in what they're allowed to do. And Jason brings up an incredible
00:45:50.920 | point, which is that, of course, they're now incentivized to have a back door and live under
00:45:56.980 | a gag order, because their defense in a back room is: you know, when in the
00:46:03.880 | light, somebody says we should break you up, in the dark, they can say, guys, come on, we got a
00:46:09.400 | back door. You just come in, gag order us, and we'll give you what you want. You want a
00:46:14.080 | honeypot. Right.
00:46:14.800 | You don't want this thing all over the internet. And can you imagine how credible, David,
00:46:19.000 | that is to your point? Because that is a body of concentrating power that I think is very scary.
00:46:25.600 | In fairness to Apple, Friedberg, they have locked down the phone and they've
00:46:30.040 | moved all of this information from the cloud, or they're starting this process and
00:46:33.580 | saying, we're going to keep some amounts of this data encrypted on your phone. And of course,
00:46:37.480 | with the San Bernardino shooting, a known terrorist shooting,
00:46:42.640 | they refused to give a back door. Well, that's a crazy standard. It's like, you know what, okay,
00:46:47.260 | there was a San Bernardino shooter. And they were like, nope, sorry, that's a bridge too far. But,
00:46:52.180 | you know, Don McGahn and basically like, you know, political espionage. They're like, here you go.
00:46:57.700 | Yeah. I don't know. I don't know how, how do you make these decisions?
00:47:00.520 | Let me ask you guys a question. Go ahead, Friedberg.
00:47:03.280 | Go ahead, Friedberg. Could you see yourself thriving in a world
00:47:10.060 | where all of your information was completely publicly available, but also all of everyone
00:47:17.380 | else's information was completely publicly available? Yes.
00:47:20.680 | Oh, everybody has all their nudes on the web is what you're saying?
00:47:24.700 | Everybody. There's a book by Stephen Baxter called The Light of
00:47:29.800 | Other Days. It's one of my favorite sci-fi books. I sent it out to all of my
00:47:32.980 | investors this last year. We do like a book thing every year, and I reread it recently,
00:47:37.900 | but the whole point of the book is that there's like a wormhole technology that they discover and
00:47:42.580 | they can figure out how to like look in and you can boot up your computer and look in anywhere
00:47:46.780 | and see anything and hear anything you want. And so all of a sudden society has to transform
00:47:52.600 | under this kind of new regime of hyper-transparency where all information
00:47:57.160 | about everything is completely available. But I think the fear and the concern that we innately
00:48:02.560 | have with respect to loss of privacy is that there's a centralized or controlling power that
00:48:08.200 | has that information. But what if there was a world that you evolved to where all of
00:48:12.940 | that information is generally available quite broadly? And I'm not advocating for this,
00:48:16.780 | by the way, I'm just pointing out that like the sensitivity we have is about our information
00:48:20.620 | being concentrated in the hands of either a government or a business. Um, and I think you
00:48:26.740 | have to kind of accept the fact that more information is being generated about each of us
00:48:31.360 | every day than was being generated by us a few weeks ago or months ago or years ago.
00:48:35.920 | Basically everybody's Truman and everybody's in the Truman Show is what you're saying.
00:48:40.600 | Well, in a geometrically growing way, information, which we're calling PII or whatever, is being
00:48:45.760 | generated about us. And I think the genie's out of the bottle, meaning the cost of sensors,
00:48:51.520 | the access to digital, the digital age and what it brings to us from a benefit perspective, is
00:48:56.560 | creating information about us and a footprint about us that I don't think we ever contemplated.
00:49:00.940 | But as that happens, the question is, where does that information go? Can you put that genie back in
00:49:06.520 | the bottle? And I think there's a big philosophical point, which is, if you try and
00:49:10.420 | put the genie back in the bottle, you're really just trying to fight information wants to be free.
00:49:14.380 | Information wants to grow. What's the name of the book you were talking about there?
00:49:17.800 | The Light of Other Days, by Stephen Baxter; Arthur C. Clarke helped write it. But the book
00:49:22.780 | is most interesting about the philosophical implications of a world where all information
00:49:27.640 | is completely freely available. Transparency. And anyone... yeah,
00:49:30.520 | completely transparent. And so, do we see ourselves... Because I think there's two paths. One
00:49:34.960 | is you fight this every which way, which is: I want my PII locked
00:49:38.920 | up, I don't want anyone having access to it, yada, yada, yada. You'll either see a diminishment of
00:49:42.760 | services, or you will see... And the other one is you do a selfie on Twitter where you take your
00:49:46.600 | shirt off. Or you'll see this concentration of power, where we all kind of freak out, where the
00:49:50.680 | government or some business has all of our information. The other path is a path where society
00:49:56.140 | starts to recognize that this information is out there. There is, you know, whatever.
00:50:00.100 | It's not just about PII here. This is about due process. This is about our Fifth Amendment right
00:50:05.680 | to due process. You have the government secretly investigating people. They could never do this
00:50:10.780 | if they had to present you with a search warrant. They are doing an end run around that process by
00:50:15.580 | going to big tech. Just to put some numbers on this: big tech is getting something like 400
00:50:20.380 | subpoenas a week for people's records. They only oppose 4% of them. Why? They have no incentive to
00:50:27.160 | oppose them. Do you know how many of those are...
00:50:29.980 | Is this a secret or not?
00:50:30.760 | We don't know how many of them have a gag order. They are required to tell the target what happened,
00:50:36.940 | but not if there's a gag order attached to it. We don't know how many have a gag order. You should
00:50:40.960 | have the right to send your own lawyer to oppose the request, not to depend on big tech for it.
00:50:46.540 | If you want to see an amazing movie: The Lives of Others, which is about the state security service
00:50:52.840 | in East Berlin, Germany, also known as the Stasi, and its impact. Literally, in your apartment
00:50:59.860 | building, there are three people spying on the other ten people, and they're the postman and, you
00:51:05.320 | know, the housewife and the teacher, and they're all tapped and secretly recording each other.
00:51:10.000 | It leads to chaos and bad feelings, and obviously when the wall came down,
00:51:16.540 | all of this came out and it was really dark and crazy.
00:51:19.300 | Yeah. I mean, look, let me connect this to the censorship issue actually, because in my view,
00:51:23.980 | they're both very similar civil liberties issues, which is in the case of the censorship issue, you have the government
00:51:29.740 | doing an end run around the First Amendment by demanding that big tech companies engage in
00:51:35.500 | censorship that the government itself could not do. You have something very similar taking place
00:51:39.580 | here with these records. The government is demanding secrecy about its seizure of records.
00:51:44.860 | They're imposing that on big tech. They're making big tech do its dirty work for them.
00:51:48.820 | They could never do that directly if they had to go to the target of the investigation
00:51:54.100 | and subpoena the records that way. So what you have here is a case where we not only need to be
00:51:59.620 | protected against the power of big tech, we need to be protected against the power of government usurping the
00:52:06.220 | powers of big tech to engage in, you know, behavior they couldn't otherwise engage in.
00:52:11.380 | And let's be honest, big tech and the government are overlapping and in cahoots, or they're in
00:52:19.540 | some really crazy dance. The money is flowing freely from lobbyists and...
00:52:27.340 | It's a very, very complicated relationship.
00:52:32.200 | All right. Seven-day average for COVID deaths is now at 332.
00:52:38.200 | Finding cases of people who have had COVID is now becoming, like, almost shocking.
00:52:45.580 | I don't know if you guys saw, but the point guard Chris Paul, who is having an incredible winning season in the NBA...
00:52:52.240 | He basically got COVID.
00:52:53.680 | They said he was vaccinated, so it could be a mild case, but he's been pulled out indefinitely, and he's
00:52:59.380 | about to play in the Western Conference finals. So it's pretty crazy. And Friedberg, obviously
00:53:04.300 | California's opened up after 15 months. We were the first to shut down, the last to open up, and we were
00:53:10.780 | hit the least, I think, of any state, or amongst the least, certainly the least of any large
00:53:15.940 | state. And you're being asked to still wear a mask at your office.
00:53:20.320 | I'm also being asked to take off my shoes when I get on an airplane.
00:53:25.420 | Yeah.
00:53:26.620 | 20 years later. And I don't think I'll...
00:53:29.260 | I don't think the data exists anymore.
00:53:30.340 | Yeah, maybe some parts of it. Explain what's happening to you in the Presidio,
00:53:36.580 | which is a lovely state park, uh, here in California.
00:53:40.780 | My office in the Presidio... California, San Francisco County, and the federal government
00:53:46.480 | have all removed mask mandates, but our landlord has determined, in their judgment, that
00:53:53.020 | everyone should still wear a mask to go to work. And so to go into my rented office and work, I have to wear a mask.
00:53:58.660 | Yeah.
00:53:59.140 | And I think it's an issue for a lot of people.
00:54:04.000 | I've been at probably a couple of restaurants this week, and, you know,
00:54:08.620 | you go to some restaurants and everyone's just chilling; the employees are not wearing masks.
00:54:12.160 | There's other restaurants where they're being told they have to keep wearing masks by their
00:54:15.460 | manager or their boss. And so this brings up this big question, which is, we've now got
00:54:22.120 | the kind of psychic shadow of COVID, and it's gonna cast a very long
00:54:27.520 | shadow, I think.
00:54:28.060 | You predicted it. You predicted it.
00:54:29.020 | And so people that are in power wanna continue to kind of impress upon,
00:54:35.440 | you know, whatever employees or tenants or what have you they might have,
00:54:39.700 | in whatever they deem their judgment to be, which is obviously, in many cases,
00:54:44.320 | an underinformed, non-scientific, and non-mandated judgment about
00:54:51.100 | effectively what people should have to wear. So if the threat or the risk has been removed,
00:54:55.300 | and all of the health officials and all of the government agencies are saying the threat
00:54:58.900 | has been removed, you no longer need to wear masks, but your boss or your manager or
00:55:04.000 | your landlord tells you you have to wear a mask to conduct your business or to go to work.
00:55:08.680 | You know, it's gonna bring up this whole series of challenges and questions, I foresee,
00:55:13.360 | for the next couple of months at least, and maybe for several years, about what's fair and
00:55:17.620 | what's right. And there will always be the safety argument to be made on the other side.
00:55:20.740 | So it's very hard to argue against that: oh, well, the inconvenience is just a mask,
00:55:24.160 | it's not a big deal. But for a number of people to now kind of be told, you know,
00:55:28.780 | what to do and what to wear... It'll take a year to sort all these things out,
00:55:32.740 | cuz they'll all get... not prosecuted, but litigated, and they're gonna go to court.
00:55:37.480 | They will get litigated, for sure. There will be lawsuits on this.
00:55:40.240 | And what's gonna happen is that you're gonna basically have, again, Jason, back to that example
00:55:45.280 | of the bakery in Colorado: private institutions will be allowed some level of
00:55:51.460 | independence in establishing, you know, certain employee guidelines and so on.
00:55:56.380 | Exactly. Yeah. And you'll have to conform
00:55:58.660 | to those, and it is what it is. I mean, I just...
00:56:02.920 | It was very strange in Austin, in terms of these COVID dead-enders who just will not let it go.
00:56:10.120 | I'm in Austin, where nobody is wearing a mask. And then, like, I went into Lululemon, and
00:56:17.140 | two people charged me with masks in hand, and they were like, you have to wear a mask. And I
00:56:23.680 | was like, do I? And they're like, yes, it's our policy. I was like, fine, I'll put it on. I don't
00:56:27.520 | care. You know,
00:56:28.540 | no big deal. This thing has really fried a bunch of people's brains. I mean,
00:56:32.680 | it's crazy. I mean, it's basically like you've taken an entire group of folks and
00:56:37.480 | kidnapped them, essentially. It's Stockholm syndrome. It's incredible. No,
00:56:42.400 | everyone's been held hostage, you know, in a prison for the last year. And so you've kind
00:56:48.400 | of accepted that this is the new reality: I gotta wear a mask, I gotta wear gloves.
00:56:53.680 | and you know, it's the similar sort of shift in reality that I think was needed going into this,
00:56:57.700 | where people didn't believe what it was and now it's hard for them to believe what it's become.
00:57:02.560 | Well, we flagged this.
00:57:04.960 | That's just human nature. Yeah.
00:57:05.920 | Yeah. We flagged on this pod a few months ago the threat of zeroism.
00:57:10.000 | Yeah.
00:57:10.540 | Which is that we wouldn't let, you know, all the special rules and restrictions lift until there
00:57:17.500 | were zero cases of COVID. And we all know that's never gonna happen. COVID will always be around
00:57:21.700 | in the background. And just to add a layer to what's happening here in California:
00:57:25.360 | on June 15th, we lifted the restrictions,
00:57:27.580 | but Governor Newsom has not given up his emergency powers, and he says he will
00:57:33.160 | keep them until COVID's been extinguished. So he's now embraced zeroism on behalf of
00:57:38.860 | this sort of authoritarianism.
00:57:41.260 | Yeah.
00:57:41.800 | And you know, so we've got this, like, Golden State Caesar. And what's interesting is I
00:57:47.920 | don't think this is just because he's a tyrant, although he's certainly been heavy-handed. I
00:57:52.780 | think it's more about corruption than ideology,
00:57:57.460 | because federal funds, emergency funds from the federal government keep flowing to the state.
00:58:03.640 | As long as we have a state of emergency. And so the longer he keeps this thing going,
00:58:08.500 | the more money he gets from the federal government that he can then use in this
00:58:12.220 | recall year to pay people off. And so we've already seen he's been buying every vote he can,
00:58:17.140 | right? He gave 600 bucks to everyone making under 75,000. He's forgiving all the traffic
00:58:22.360 | fines and parking tickets. He's doing this lottery ticket thing for getting the vaccine.
00:58:27.340 | And so he just wants to keep the gravy train from Washington to
00:58:32.260 | California going, even though we have a surplus.
00:58:35.320 | It's what Governor Chamath would have done.
00:58:38.500 | I mean, it reminds me of 9/11, where people were just like,
00:58:43.540 | hey, we can keep this gravy train.
00:58:44.920 | No, I mean, 9/11 is the perfect kind of psychic scenario, you know,
00:58:50.920 | replaying itself with COVID. There are behavioral changes that have lasted forever. There are
00:58:55.960 | regulatory changes. This,
00:58:57.220 | you know, Department of Homeland Security. I mean, you go through the amount of money
00:58:59.740 | that gets spent by the TSA every year, and the risk and the benefit are
00:59:04.420 | completely unquantified, right? Like, the amount of money that flows into these programs,
00:59:09.160 | because you can make the subjective statement: there is a threat, there is risk,
00:59:14.320 | therefore spend infinite amounts of money, right? It's because you never kind of put
00:59:18.700 | pen to paper and say: what is the risk? What is the probability? What is the severity of loss?
00:59:23.020 | And therefore, let's make a value judgment about how much we should spend to protect against that
00:59:27.100 | downside. And we're now doing the same thing with COVID. We're not having a conversation about how
00:59:31.960 | many cases there are, what the risk is. Should we really still be spending billions of dollars of
00:59:36.280 | state funding to continue to protect a state where 70% of people are vaccinated,
00:59:40.540 | and we have a massive surplus, and we're still giving people money
00:59:43.900 | who may or may not need it, and we're doing it indiscriminately?
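[Editor's aside: here is a minimal sketch, in Python, of the expected-loss arithmetic Friedberg is gesturing at: quantify probability and severity, then cap spending at the expected loss. The function name and every number below are hypothetical illustrations, not figures from the episode.]

```python
# A minimal sketch of expected-loss budgeting, as described above:
# quantify the probability and severity of the bad outcome, then cap
# prevention spending at the expected loss.
# All numbers are hypothetical placeholders, not data from the episode.

def max_justified_spend(p_event: float, severity_usd: float) -> float:
    """Expected loss = probability of the bad outcome times its cost.
    Spending more than this to prevent it destroys value on average."""
    return p_event * severity_usd

# Hypothetical: a 0.1% annual chance of a $50B-loss event.
budget_cap = max_justified_spend(p_event=0.001, severity_usd=50e9)
print(f"Spend at most ~${budget_cap:,.0f}/year on prevention")  # ~$50,000,000
```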
00:59:48.040 | Speaking of discussions and hard topics and being able to have them: YouTube,
00:59:52.360 | which kicked off a ton of people on the platform for talking about things that were not approved
00:59:56.980 | by the WHO, has taken professor Bret Weinstein's podcast down because he had a very reasonable
01:00:04.540 | discussion about ivermectin and its efficacy or lack of efficacy. This is a doctor, a PhD, talking
01:00:11.260 | to an MD. And the video was removed. Apple did not remove the episode. It's really scary.
01:00:17.560 | These people should not be the gatekeepers of the truth. They have no idea what the truth is. Let's
01:00:22.960 | talk about the Jon Stewart appearance on Stephen... Well, that's what I was about to do.
01:00:26.860 | Yeah. He killed on Stephen Colbert, but the things he was saying about the lab leak would
01:00:32.680 | not have been allowed on YouTube. If it was three months ago, you would have been removed for it,
01:00:37.840 | even as a comedian. The performance was amazing. He basically says,
01:00:41.320 | you know, the Wuhan COVID lab is where the... you know... no,
01:00:47.740 | the disease is named after the lab. So where do you think it came from? It was like a pangolin,
01:00:54.700 | you know, mated with a bat. I mean,
01:00:56.740 | and he goes on this whole diatribe. It's incredibly funny. Yes. But then at the end of it...
01:01:01.900 | well, I had two takeaways. I don't know if you guys felt this. At first, I was like,
01:01:05.560 | ah, Jon Stewart's a little unhinged here. Like, I mean, there was a part of it that was funny. And
01:01:10.480 | then there was a part of it, which is like, wow, Jon Stewart's been trapped indoors a little too
01:01:13.840 | long. Yeah. Yeah. So I, I thought that as well, to be honest. But then the second thing, which I saw
01:01:18.940 | on Twitter was all these people reminding, uh, anybody who saw the tweet that this exact content
01:01:25.540 | would have not been allowed on big tech platforms were it said three or six months ago. And I was
01:01:33.400 | like, wow, this is, this is really nuts. Meaning it takes a left-leaning, smart, funny comedian to
01:01:41.140 | say something that if the, if the right, if the right would have said it would have just been
01:01:45.760 | instantly banished. And that's like, that's kind of crazy. Yeah. The great quote was, I think we
01:01:51.220 | owe a great debt of gratitude to science. Science has in many ways helped ease the suffering of this
01:01:55.000 | pandemic, uh, which was more than likely caused by science.
01:01:58.120 | Yeah. Well, it was a funny line where he said something like, if there was an outbreak of
01:02:03.520 | chocolatey goodness in Hershey, Pennsylvania, it wouldn't be
01:02:07.180 | because, you know, whatever, the pangolin kissed a bat; it's because there's a fucking chocolate
01:02:10.960 | factory there. I don't know, maybe a steam shovel mated with a cocoa bean chocolate factory.
01:02:18.160 | Maybe that's it.
01:02:19.000 | Oh, is that what he said? He's so funny. So I agree with you. It's a
01:02:26.920 | great example of censorship run amok at these big tech companies. But the other thing I saw that
01:02:31.960 | was really interesting was watching Stephen Colbert lose control of his audience. And you know, Stewart
01:02:38.500 | killed on that show, but you could see Stephen Colbert was, I think, visibly nervous.
01:02:42.460 | Very uncomfortable. Yeah. Very uncomfortable. He did not know what was coming, and
01:02:47.260 | when Jon Stewart kept pushing this, he kept
01:02:51.280 | trying to qualify: well, so what you're saying is, now that Fauci has said this might be a
01:02:55.840 | possibility, you're saying it might be a possibility. And Jon Stewart was having none of
01:02:59.260 | it. He ran right over that, said, no, the name is the same. It's obvious. Come on. And yeah.
01:03:05.380 | Well, Colbert kept challenging him. I don't know if you saw this part where he said,
01:03:08.440 | Hey, listen, is it possible that they have a lab in Wuhan to study the coronavirus disease?
01:03:12.880 | Because one, there are a lot of novel coronavirus diseases because there's a big bat population.
01:03:16.900 | And then Stewart is like, no, I'm not standing for that. He goes, I totally understand. It's
01:03:21.580 | the local specialty and it's the only place to find bats. You won't find bats anywhere else. Oh,
01:03:24.940 | wait, Austin, Texas has thousands of them fly out of a cave every night at dusk and he wouldn't let
01:03:29.920 | it go. So it's just great watching. It was a reminder, frankly, of how funny both Jon
01:03:35.980 | Stewart and Stephen Colbert were about 15 years ago. And frankly, I don't think Stephen Colbert
01:03:40.420 | is funny anymore, because he's gotta keep his job. He's carrying water. He's also, too...
01:03:46.540 | And yes, he's become very polemical. And what Stewart reminded us is that comedy is
01:03:53.260 | funny when it's making fun of the people who are pretentious and basically who aren't telling the
01:03:59.800 | truth. And Stephen Colbert has become so polemical that he's lost sight of the comedy, and Jon
01:04:06.520 | Stewart brought it back. And I hope you know what they said.
01:04:08.800 | By the way, Colbert had this element of satire, which even Stewart...
01:04:13.660 | 'cause Stewart was in-your-face funny, whereas
01:04:16.180 | Colbert was, like, subtle and dry, and you had to think about it. It was layered.
01:04:19.720 | Oh, for sure.
01:04:20.680 | And he's totally lost it. Totally, totally lost it.
01:04:23.680 | Well, if you know,
01:04:24.400 | And then I thought Stewart came out swinging hard. I do think, though...
01:04:28.300 | Sax, you have to agree. Did it seem to you, though, like,
01:04:32.440 | he just needed more human-to-human interaction?
01:04:34.360 | Absolutely. He was a caged tiger, man. He was a caged tiger. They let him out.
01:04:39.160 | He was like Jake Hale going to off.
01:04:41.200 | Absolutely.
01:04:41.800 | But it was the funniest thing Jon Stewart's done in many
01:04:45.820 | years. And the reason is because he connected with the fact that here is this obvious thing
01:04:52.180 | that we're not allowed to say, and that is what comics should be doing.
01:04:56.380 | Yes. Put a light on it. I mean, if comedy is tragedy plus time, I think that this is a great
01:05:03.040 | moment for us to reflect on. Like, I think we're going to go back to normal pretty quick. If you
01:05:09.340 | remember after 9/11, there was this idea that comedy was over forever. You were not going to
01:05:13.960 | be able to make fun of things. And that was the first thing that we were going to lose,
01:05:15.460 | and this was the end of satire. People were saying, this is, you know, a bridge too far,
01:05:20.860 | et cetera. And I think we're back. We're back, and that's it. You know, we can joke about the
01:05:26.440 | coronavirus. We can talk about it. We don't need to censor people for having an opinion. We're all
01:05:32.080 | adults here. You know, the idea that, you know, we have to take down people's tweets because they
01:05:37.900 | have some crazy theory or put a label on them. Like we went a little crazy during the pandemic
01:05:44.500 | and tried to stifle discussions for what reason exactly? Like when we look back on this,
01:05:49.900 | it's going to look really strange that we demanded that we put labels on people questioning or having
01:05:57.280 | a debate, including doctors. Doctors were not allowed to debate to the public on YouTube or
01:06:04.600 | Twitter about... what was the drug that Trump kept promoting? Hydroxychloroquine.
01:06:11.320 | Hydroxychloroquine, right. Remember that whole chloroquine thing? I think this
01:06:14.140 | ivermectin or whatever it is, it's just triggering people, because it feels like that
01:06:18.580 | last drug, which is a drug that may or may not work to slow down the progression of COVID. But
01:06:23.260 | anyway, this is all over. If you haven't gotten your goddamn vaccine, please get it.
01:06:26.680 | Stop denying science. Stop denying science.
01:06:29.440 | Climate change, climate change is not real.
01:06:31.600 | And remember... oh my God, YouTube just canceled our account. Chamath,
01:06:36.640 | what are you doing? You can't say that!
01:06:39.220 | Science... we talk about science as if science is
01:06:43.780 | a definitive answer to a question. It's a process.
01:06:46.720 | It's a process by which you come to answers. You test them. And look,
01:06:50.920 | hydroxychloroquine may have been completely wrong, but let the debate happen. The answers came out
01:06:55.900 | anyway. I'll tell you, a fundamental premise of science is to challenge assumptions. And so,
01:07:00.700 | when you challenge an existing hypothesis, or kind of an existing thing that we hold to
01:07:07.000 | be true, you are engaging in science. And the rigorous debate around what works and what doesn't
01:07:12.820 | work
01:07:13.420 | was notably absent over the past year because everything became about the political truth.
01:07:18.040 | You're either true or you're false based on your political orientation. And we reduced
01:07:22.180 | everything down to kind of this one-dimensional.
01:07:23.560 | Identity politics.
01:07:24.280 | This one-dimensional framework, which we have a tendency to, by the way,
01:07:26.920 | Party politics.
01:07:27.160 | Let me just point this out to you guys. I was gonna mention this a few weeks ago,
01:07:30.040 | but think about every conversation you have, how common it is now to immediately
01:07:36.220 | think about what the person on the other side that you're talking to just said,
01:07:39.580 | and then trying to put them on a blue or red spectrum.
01:07:43.060 | It's how we've all kind of been reprogrammed over the past decade or so,
01:07:47.620 | where it used to be about the topic itself and the objective truth-finding, or the
01:07:52.960 | specifics of what we're talking about. And now it's become about how you immediately try
01:07:57.760 | and resolve them to being conservative or not, red or blue, Trump or not, purple.
01:08:03.880 | And so every conversation you kind of try and orient around that simple,
01:08:08.200 | ridiculous one-dimensional framework. And it's a complete loss of the discovery
01:08:12.700 | of objective truth in all matters in life, and all matters that affect all of us.
01:08:17.800 | And it's really quite stark and sad.
01:08:21.400 | This is why we need a new political party.
01:08:24.640 | The reason party.
01:08:25.540 | I think it's less about that. I think it's more about everyone just reorienting themselves.
01:08:28.600 | When you have a conversation, just notice yourself doing it and then recognize that
01:08:32.560 | maybe that's not the way to make a decision about the conversation or about having an opinion or a
01:08:37.060 | point of view, but have an opinion or a point of view about the topic itself, not about the
01:08:41.800 | orientation
01:08:42.340 | of the topic on a single-dimensional spectrum.
01:08:44.980 | And then layer identity politics into that. So not only your politics, but your gender,
01:08:51.460 | your race, your sexual preference, the color of your skin. And now how is anybody supposed
01:08:56.200 | to have a reasonable argument when I have to process, like, oh, he's from, you know, Sri Lanka,
01:09:01.960 | but he went through Canada, and he worked for... I mean, it's just... well, it's so reductive that no
01:09:05.980 | one gets to have an identity anymore, right? Because we
01:09:10.780 | are all complex.
01:09:11.980 | And all issues are complex and they are all nuanced. And when you reduce everything down
01:09:16.180 | to kind of this one dimensional framework, you lose any ability to have depth, to have nuance,
01:09:20.800 | to have...
01:09:21.400 | Said another way: the issues are complex enough. We don't have to put identity politics or
01:09:25.840 | political, you know, leanings on top of it.
01:09:28.960 | All right. So we had the worst fire season in California ever last year. Obviously,
01:09:33.940 | as Chamath said, global warming is a conspiracy by the Chinese, as per your
01:09:40.840 | guy, Trump,
01:09:41.620 | Sax. And there is climate change in Switzerland. There is a center called the
01:09:48.040 | Center for Climate Change. There is a reason that there's climate change in Switzerland.
01:09:51.700 | It's coming from that lab. Ah, the center did it. Look at the sign,
01:09:56.080 | look at the sign. It says climate change. They're getting paid to propagate this conspiracy theory.
01:10:03.640 | Yeah. All right. So it's going to be the worst...
01:10:05.920 | So we are at risk more than ever, right? So we're entering June. As of June
01:10:11.260 | 1st, the California snowpack is down to 0% of normal. That's never happened before. So it's the
01:10:17.620 | lowest it's ever been. There is absolutely, like, no snowpack in the entire Sierra, in the
01:10:22.540 | entire state. 40% of the state is in a state of extreme drought right now. We've had 16,000 acres
01:10:28.240 | burn as of a few weeks ago, up from 3,600 during the same time period, the same day of the year,
01:10:32.800 | last year. And so the tinder is there. Now remember, last year was the highest
01:10:38.860 | California
01:10:40.900 | has ever seen: we burned 4 million acres last year. California has about 33
01:10:46.420 | million acres of forest land, representing about a third of our total land
01:10:50.560 | size in the state. You know, 60% of that land is federal, 40% is private. And so the
01:10:57.580 | big kind of variable drivers this year are going to be, you know, wind
01:11:02.440 | and heat, and we're already seeing a few heat waves, but it's the wind that kind of kicks
01:11:06.160 | these things off. But the tinder is there, right? So, like, the state is dry.
01:11:10.540 | The snowpack is gone. We're on severe water restrictions in a lot of
01:11:15.820 | counties throughout the state. It's worth, I think, talking about the carbon effect. You know,
01:11:20.260 | last year, based on the forest that burnt in California, we released about one and a half
01:11:28.180 | times as much carbon into the atmosphere from our forest fires as we did from cars burning
01:11:33.700 | fossil fuels in the state. And so here's some statistics for you guys, which I think are
01:11:39.880 | just worth highlighting. There's
01:11:41.740 | about 2 billion metric tons of carbon stored in California forest land, which is about 60 tons per
01:11:47.080 | acre. So there's about 9 million new tons of carbon sequestered in California by our
01:11:57.160 | forest land per year. When there's a fire, we release about 10 tons per acre, so about one sixth
01:12:03.880 | of the carbon in that forest land; the rest of the carbon doesn't burn up. So remember, when
01:12:09.400 | there's a forest fire, typically the outside of the tree burns; the whole thing doesn't burn to
01:12:12.940 | ash. And so, if you look at the longitudinal kind of effect of it,
01:12:18.820 | burning forests can actually preserve the carbon sequestration activity versus, you know,
01:12:25.600 | just removing forest or removing trees. And so there is, to some extent, you know,
01:12:30.460 | an effort that has been shut down several times, which is to do these kind of controlled burns
01:12:34.540 | through the state, but it's met with such resistance, given that it's so controversial;
01:12:39.040 | no one wants to have smoke in their neighborhood. It shouldn't be controversial.
01:12:41.980 | The problem is you can't present simple data and have people
01:12:45.520 | have a logical conversation about it. And the cost per acre to clear forest
01:12:50.620 | land in California ranges depending on the complexity of the land, but it's somewhere between
01:12:54.520 | 50 and a thousand dollars. So call it a couple hundred dollars per acre. So you can very quickly
01:12:58.540 | kind of do the math on a carbon credit basis, Chamath. It's about 40 bucks per ton
01:13:04.120 | for a carbon credit today. So you can kind of preserve about
01:13:08.680 | $400 per acre by not putting that carbon into the atmosphere. And if you can actually manage
01:13:14.200 | forest land clearance and forest land preservation from fire at a cost of $400 per acre
01:13:19.780 | or less, and there was an active carbon credit market, you should be able to cover the cost of
01:13:24.100 | managing that forest land. But we're at an incredibly high risk this year. It
01:13:28.360 | doesn't mean that we're necessarily gonna have a fire, because weather is the key driver. The
01:13:32.080 | weather is highly variable. Wind. We need wind. We need wind and we need a heat wave with wind.
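[Editor's aside: here is a minimal sketch of Friedberg's back-of-the-envelope break-even. The per-acre and per-ton figures are the ones quoted above; the variable names are ours, and the one-for-one assumption that a cleared acre is an acre saved from burning is a deliberate simplification, not a policy model.]

```python
# Friedberg's back-of-the-envelope carbon math, as a minimal sketch.
# The three constants are the figures quoted in the episode; the
# clear-an-acre-saves-an-acre assumption is an illustrative simplification.

CARBON_RELEASED_PER_BURNED_ACRE = 10   # tons of carbon released per acre in a fire
CARBON_CREDIT_PRICE = 40               # dollars per ton, rough credit price quoted
CLEARING_COST_PER_ACRE = 200           # dollars; quoted range is $50-$1,000

# Value of the emissions avoided if managed clearance keeps an acre from burning.
avoided_emissions_value = CARBON_RELEASED_PER_BURNED_ACRE * CARBON_CREDIT_PRICE  # $400/acre

# If clearing costs less than the credit value of the avoided emissions,
# an active carbon credit market could cover the cost of forest management.
surplus = avoided_emissions_value - CLEARING_COST_PER_ACRE
print(f"Avoided-emissions value: ${avoided_emissions_value}/acre")
print(f"Surplus after clearing costs: ${surplus}/acre")  # $200/acre at these numbers
```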
01:13:35.860 | And then there will be fires. And then, what do they do?
01:13:38.320 | When the wind kicks up right now, the electric company turns off power
01:13:42.400 | in California because they don't wanna be blamed when a power line goes down and starts a fire.
01:13:48.160 | And so we have these regular moments where we just lose power.
01:13:52.240 | Yeah. This is not just a California problem. I know everyone wants to beat up on California,
01:13:55.600 | but, like, the whole western US. Go look at Google Maps. You'll see how much green
01:13:58.660 | stuff there is on Google Maps. It's green up and down the western half of the US.
01:14:02.440 | Friedberg, it was Trump, right, that said raking up the forests, to put it in
01:14:07.960 | layman's terms or simple terms, is an actual thing that helps?
01:14:12.340 | 60% of forest land in California is federal land, and it was the federal
01:14:18.220 | government's responsibility to manage that cost down, to manage that risk down. What
01:14:22.900 | is the incentive? What is the motivation? You know, what are the key drivers? Those are all...
01:14:26.680 | Obviously it does work to clear it, though.
01:14:28.480 | Theoretically, when you reduce the amount of tinder, you will reduce the risk of a burn,
01:14:32.800 | right? But the cost of doing so, as we mentioned, is probably a couple hundred dollars per acre.
01:14:37.840 | And so who's gonna... let's say you wanna do that on 5 million acres, you know.
01:14:41.860 | Wouldn't this create a bunch of jobs? Oh, wait, we're paying people to stay home.
01:14:44.440 | Yeah.
01:14:45.700 | Like, it would create a ton of jobs. I mean, I hate to be like that guy, but, like,
01:14:50.380 | could we create a bunch of $35-an-hour jobs for people?
01:14:53.800 | I've heard scuttlebutt that Newsom is so worried about fire season that they're
01:14:58.420 | gonna try and accelerate the recall election so it happens before. There was, you know...
01:15:02.140 | The conventional wisdom... He would do that too. He's so smart.
01:15:07.720 | No, if Chamath did it, it would be strategic.
01:15:09.400 | Well, the conventional wisdom was that you'd want to wait as long as
01:15:14.020 | possible to do the recall, because the longer you wait, the longer you get the rebound of the
01:15:17.800 | economy from COVID, right? Sure.
01:15:19.420 | But now they're talking about accelerating it to beat fire season because it's looking really bad.
01:15:23.740 | And Friedberg's right... I need an escape plan.
01:15:26.020 | ...that we needed much more aggressive forest management. It's not just climate change.
01:15:29.980 | It's also forest management. We don't do it in California anymore. And so I think we're,
01:15:34.540 | we are in for a really hellish fire season. That's my prediction.
01:15:37.600 | We're gonna have a terrible fire season.
01:15:40.480 | There's going to be brownouts, probably, throughout a lot of the Western states.
01:15:46.480 | What played out in Texas and affected folks a few months ago, I think some version of
01:15:53.980 | that will happen in many places in the US. And it's all roughly avoidable.
01:16:00.940 | And the critical principal actor here is the progressive left. They need
01:16:07.480 | to reconcile their disdain for climate change with their disdain for the things that need to happen
01:16:19.360 | to prevent it. Because right now, these two things for them are just, like, cataclysmically not
01:16:25.960 | possible to agree on: for example, as Friedberg says, a controlled burn program
01:16:31.840 | as a mechanism of sort of fighting climate change, or, you know, investing
01:16:37.360 | more in the greenification of the economy so that we can actually eliminate the use of a
01:16:44.020 | lot of these non-sustainable energy sources. All these things basically just come down to a group
01:16:49.180 | of individuals deciding that they can both have an opinion on something as important as climate
01:16:55.540 | change, and then also be willing to go and act. Right now they won't, and until they do, it's just
01:17:01.240 | going to spill over everywhere. It's going to be a very bad fire season. And the only reason I know
01:17:05.080 | that is that every year before this has been the same: every single year has gotten warmer. You don't
01:17:10.420 | need to be a genius here. Yes. By the way, let me just correct a statistic I said,
01:17:15.100 | because the statistic I gave was a few weeks ago. But as of today, we were actually at the average,
01:17:20.380 | the historical average in terms of number of acres that are burnt in California. As we have seen
01:17:24.820 | historically. I will also say that, you know, close to one sixth of California's forest land
01:17:32.200 | burnt last year. So, given the number of acres that burnt
01:17:34.960 | last year, there is a tremendous amount of tinder that has been removed from the risk equation. And
01:17:39.760 | we typically burn about a million acres a year; I think we burnt a little
01:17:43.660 | over 4 million last year. So, you know, as you look at the cumulative kind of reduction of burnable
01:17:49.720 | acres, the good thing that's going on is we're actually at a lower-risk scenario
01:17:55.300 | going into this year in terms of total amount of tinder. The risk of the tinder catching is higher,
01:17:59.980 | because it's drier. But when you add this all up, there's certainly a high probability
01:18:04.840 | of a fierce season. But there's no scenario here where we end up with zero;
01:18:08.980 | that's not gonna happen. NASA publishes temperature studies; they do measurements
01:18:16.000 | of how much warming there is in the earth. Last year, we set yet another record, it was the seventh
01:18:23.260 | year in a row where it was warmer than all the previous years. It's just going
01:18:29.380 | the same direction. I mean, and so if we're all of a sudden supposed to bet that a trend that
01:18:34.720 | has effectively been reliable for the last decade is going to turn, I'm not sure that's a bet
01:18:40.420 | you'd want to make, or that the wind is not going to blow. It's a stupid bet. There's no reason to
01:18:45.880 | make that bet. I mean, this is like betting on a one outer. We need the left to take control
01:18:51.460 | of this issue and solve it. Get ready for Martian skies over California. I mean,
01:18:58.420 | literally, I'm thinking about an escape plan from California. And I'm putting a generator in this month.
01:19:04.600 | I bought six new air filters, you know, like, beautiful. That's not good enough.
01:19:09.940 | Well, my house is totally sealed, and I have the air purifiers in. I have a built-in air
01:19:14.920 | purifier, and I have six portable ones, one in every room. Are you coming back in August?
01:19:20.680 | In September... at the end of August. But by the way, let me tell you where the rubber
01:19:26.380 | really meets the road. Again, I'm speaking to the progressive left. They care apparently so much
01:19:33.100 | about minorities.
01:19:34.480 | I just want to make sure you guys understand that, you know, air quality disproportionately
01:19:39.640 | affects minorities. Why? Because... not me anymore, but, you know, minorities are the ones
01:19:46.120 | that typically live near industrial output, near transportation throughways and thoroughfares.
01:19:52.360 | Yeah, it is. It is statistically proven that Black, brown, and other minority people are the
01:19:59.380 | worst to suffer from respiratory diseases and airborne illnesses. And these are things
01:20:04.360 | that are happening today. So again, I want to go back to the same group of individuals
01:20:09.880 | who apparently believe in climate change, but don't believe in nuclear. They don't believe in
01:20:14.020 | controlled burns. They believe in equality, but they don't want to do what's necessary to regulate
01:20:18.040 | emissions. What are we doing, guys? At some point, do the job. Do your fucking
01:20:24.040 | job. What you're saying is correct, Chamath. But I think it's a sad statement about the
01:20:28.120 | progressive left that the only way to reach them through an argument is to argue for that disparate
01:20:32.680 | impact on a minority. The reality is, it's all Americans. Yes, exactly. It's bad. Red pill me,
01:20:39.040 | give me those red pills. Come on, Sax, hand them out. But Chamath understands that
01:20:45.340 | audience; he is making the argument they're going to respond to. But the argument that they and
01:20:49.900 | everyone should be responding to is that it's bad for everybody: the planet, all humans. Exactly. What
01:20:57.340 | are you guys gonna do for fire season? Actually, I'm thinking about renting a house, like...
01:21:00.760 | I rented a house in Chicago, on Lake Michigan,
01:21:02.560 | last year, and I went there, and it was a great escape for a month to get away from fire season.
01:21:05.800 | I'm very scared to be in California during all of this, to be completely
01:21:09.640 | honest with you. I just don't want to be there. Yeah, I'm out. I'm gonna try to figure out
01:21:13.960 | something. Back in late August, and hopefully everything has calmed down by then, although
01:21:17.720 | it won't, because it gets very, very hot at the end of August. September is the heart of it.
01:21:22.280 | It's typically the heart of it. Jake, what do you think, you're gonna go to Miami or Austin or
01:21:26.440 | something? You know, I went back to Austin for a wedding, and
01:21:32.440 | I met the governor, and...
01:21:35.200 | Don't care. You went to a wedding and you met Greg Abbott.
01:21:38.920 | Oh, beep! You gotta beep that out. You went to a wedding with the governor. Yes. And
01:21:44.980 | going to Austin in 2021 is like when I would come to San Francisco and go to the Battery
01:21:52.780 | in 2013, and Sax would say, "Why don't you live here? There's so much going
01:21:59.320 | on in San Francisco. Come to San Francisco." And I did. And I
01:22:02.320 | got the last five years of the peak, but Austin, very appealing to me. And then I've been looking
01:22:07.480 | at beach houses in Miami and I'm 50% of the way there, folks. Oh, my God. I mean, the fact that
01:22:17.800 | you can now buy a beach house. I mean, God bless America. Oh, my God.
01:22:25.960 | I had a 71 three-year average in high school. I had 1150 on my SATs.
01:22:32.200 | And I'm going to buy a beach house. Congrats, J. Cal. And I forgot that I
01:22:34.960 | convinced you to move up to San Francisco. Yet another way in which I have-
01:22:38.800 | Contributed to the monster. Steered your career towards success.
01:22:41.860 | Absolutely. Okay. I'm going to use call-in every day. Call-in syndicates underway.
01:22:47.080 | Everything's going to be- You wouldn't even be a VC if it
01:22:49.780 | wasn't for me. You'd still be a media figure. You did. That's right. I'd be doing conference
01:22:52.360 | producing. You and Naval really pushed me towards it. And then special thank you to you and Chamath,
01:22:57.460 | Bill Lee for anchoring and Dave Goldberg- We love you, J. Cal. We love you.
01:23:02.080 | You know what? I tweeted the other day, at the end of the day, our lives are a collection when
01:23:08.560 | we look back on them of memories with our friends. And I include family and friends.
01:23:12.760 | And this podcast, not to get all gushy and whatever, has been a delight over the really hard
01:23:20.320 | pandemic that's now ending. And I'm really happy that we get to spend this time every week together.
01:23:25.120 | Every week, I get a little bit of excitement, like I used to get when you'd host poker, Sax,
01:23:31.960 | or Chamath. Those days when we'd have a poker game, Sky Dayton would tell me, and I'd get a little
01:23:39.040 | tingly feeling, like, "Oh my God, I'm going to see my friends tonight and play poker and laugh."
01:23:43.360 | And we got that amazing note from the woman who said she was really having a hard time during the
01:23:48.820 | pandemic and that the podcast, All In podcast, really helped her. And-
01:23:52.120 | Shout out to Sam. Thanks for that note.
01:23:53.800 | Yeah, Sam. That really made our week. So shout out to Sam.
01:23:56.620 | That was amazing.
01:23:58.000 | And that's a long way of saying, "I love you, Sachs. I love you, Sachs."
01:24:01.840 | Jake, you are the Stephen Colbert to my Jon Stewart.
01:24:06.100 | I think it's the opposite. I think it's the opposite.
01:24:09.340 | I have to come on your show and red pill you and make sure that you're saying the
01:24:15.220 | truth and not getting too wrapped up in your Trump derangement syndrome or whatever.
01:24:18.580 | At the end of the day, we are, I think, all of us working through complex issues. To
01:24:26.800 | Friedberg, I really loved your contribution today about how complex these issues are and layering
01:24:31.720 | a little more complexity onto them of our identities, our wealth, our histories,
01:24:36.820 | immigrants, whatever, politics. These issues are so hard and in some ways also so easy with
01:24:43.660 | technology and world-class execution that the world needs to have more reasonable conversations.
01:24:49.420 | And I think that what we've demonstrated here is that four friends can have reasonable discussions
01:24:55.120 | and laugh about life and enjoy life. And that should be for everybody listening.
01:24:59.020 | That's what Sam said in her note to us, which was very heartwarming.
01:25:01.600 | So thank you. Yeah, that was great.
01:25:03.280 | Yeah. I mean, love you guys.
01:25:04.780 | Love you, Sax.
01:25:05.800 | Back at you guys.
01:25:07.060 | Back at you guys.
01:25:08.140 | Jesus, fucking Aspergers, guys.
01:25:11.440 | It's brutal.
01:25:12.100 | I must download new directions to escape forest fires.
01:25:16.540 | Love program active.
01:25:18.820 | Love L-O-V-E querying dictionary, a feeling of affection for another entity or human.
01:25:25.480 | Like I like playing video games till 2:00 AM and my dog, can I say
01:25:31.480 | it to a very similar to coding or problem solving using my computer to my 17 B subroutine overheating
01:25:43.000 | must play chess with Peter Thiel and stop saying I love you to Jake.
01:25:46.660 | Stop saying it.
01:25:49.480 | Stop saying it.
01:25:50.620 | I need to sell it.
01:25:51.760 | He's overloading.
01:25:53.620 | My shirt was so expensive.
01:25:56.140 | But yet so thin.
01:25:57.340 | I put on yet another shirt.
01:25:59.380 | I had to pump it out for the pod.
01:26:01.360 | I'm lucky because I've gained 15 pounds.
01:26:06.700 | I need to make up for it with a $1,200 shirt.
01:26:12.400 | How do I look with four collars? You say no, it's more two... but also two chins.
01:26:15.820 | So two collars for my double chin.
01:26:15.820 | Two shirts are better than one.
01:26:17.800 | Two shirts are better than one.
01:26:19.240 | What?
01:26:19.600 | Oh, it's twice as good.
01:26:20.860 | Sax is adding shirts.
01:26:28.840 | Chamath is removing them.
01:26:30.280 | I'll see you all next time
01:26:33.240 | on the All In Podcast.
01:26:33.240 | Love you guys.
01:26:38.240 | We open source it to the fans and they've just gone crazy with it.
01:26:47.200 | I love you.
01:26:51.160 | I love you.
01:27:01.120 | We should all just get a room and just have one big huge orgy because they're all just useless.
01:27:10.000 | It's like this like sexual tension, but they just need to release it now.
01:27:17.080 | What?
01:27:18.080 | You're the B. What?
01:27:25.080 | You're B. B?
01:27:26.120 | What?
01:27:27.120 | We need to get.
01:27:28.120 | Merch is our.
01:27:29.120 | I'm going all in.
01:27:30.120 | Thank you.