
#AIS: Tim Urban on political discourse + Keith Rabois on early-stage investing in 2022


Chapters

0:00 Tim Urban gives a talk on political discourse in America and explains high-rung vs. low-rung thinking
26:20 The Besties and Keith Rabois join Tim Urban on stage for a roundtable discussion on cancel culture
51:23 Keith Rabois talks about taking a pause on new investments in 2022 and gives his take on other major VC players


00:00:00.000 | Next up is my good friend Tim Urban from Wait But Why. I asked him to do this as a favor.
00:00:04.160 | He gets a huge speaking fee. I said we have no budget.
00:00:07.880 | He said Jake, I think it's $7,500 a ticket. I said I have no budget. I stole it all.
00:00:12.680 | And he has the number one talk in the history of TED on YouTube. My pal, Tim Urban.
00:00:20.480 | He said something yesterday to Nate Silver about poker, and he was like,
00:00:44.420 | oh, I'm going to take away your speaking fee. And I was like, the fuck? Speaking fee?
00:00:47.500 | Yeah.
00:00:50.440 | All right. So the title of my talk is Tim Talks About Politics and Other Things That
00:00:55.400 | Are Probably a Bad Idea to Talk About in Front of All These People.
00:00:57.700 | And I want to start with why am I even writing about politics? I don't like politics. I like
00:01:04.240 | writing about the science and tech and the future and procrastination and things that interest me.
00:01:09.000 | But as I'm thinking about the future and all this awesome stuff that we could have,
00:01:14.380 | I started to have a bad feeling. I would think of society kind of like a giant organism.
00:01:18.820 | And I always grew up assuming that society was like a big grown-up. But when I looked around,
00:01:25.840 | it looked more like a poopy-pants six-year-old who dropped its ice cream.
00:01:30.880 | And I feel like this is what a lot of people are kind of getting at in these talks. We're talking
00:01:36.640 | about kind of all this crazy polarization and mobs and all that. And to me, I just look out and I see
00:01:42.760 | this. I see kind of a reverting, and people are acting like they're in middle school and we can't
00:01:47.380 | communicate. What's going on? So I started putting my mind to this. Now, what was the problem?
00:01:52.840 | And the problem is very complicated and I'm not going to try to get into the whole thing today,
00:01:58.840 | but I think that what we can do is have a better framework to talk about the problem.
00:02:03.460 | I think that we are very constrained to this one-dimensional axis. It's like a straitjacket in
00:02:11.020 | our conversations. You hear people say, the problem is the far left and the far right. We need to be in
00:02:16.360 | the center. We need to be more moderate. But what is that? And the center is just a policy position.
00:02:20.020 | Right? The far left and far right aren't inherently bad. The far left is just kind of radical and
00:02:25.540 | questioning everything and they're experimental. And the far right is just questioning: maybe we
00:02:29.500 | messed up. Maybe we should go back to the way things were. I mean, there's nothing inherently
00:02:32.740 | better or worse about any part of this spectrum. But we're using these words to try to get at
00:02:37.900 | something else. We'd say centrist, moderate. We don't really mean in the middle of the spectrum.
00:02:41.920 | I think we're talking about a different axis. I call it the ladder.
00:02:45.160 | So I think bringing our political discussions into two dimensions can be hugely helpful. Now,
00:02:50.200 | sometimes you'll see, like, the political compass, politics in 2D, but those are still all about
00:02:54.520 | what you think. They're all different ways to look at what you think about politics. The ladder is a
00:02:59.380 | how you think axis. So there's some nuance to it. It's a spectrum. But for our purposes, let's just
00:03:07.240 | focus on the two kind of core ideas here. There's high rung political thinking, high rung politics,
00:03:12.760 | and low rung politics.
00:03:14.620 | So the high rungs, you can kind of divide into high rung progressivism and high rung conservatism,
00:03:22.120 | which I kind of think is like two arguing giants. They're like, you know, the collective efforts of
00:03:27.940 | high rung progressivism and conservatism are kind of like lawyers in a courtroom. They're heated.
00:03:36.340 | They don't like each other a lot of the time. They have very different ideas of how things should go.
00:03:39.640 | But it's kind of like the two lawyers in a courtroom: there's a wink that goes on,
00:03:44.020 | where they understand ultimately they're on the same team. They're two sides of a kind of
00:03:48.580 | truth-discovery machine. And I think this is the same thing. They don't like each other, but they're
00:03:52.360 | actually ultimately on the same team trying to figure out the roadmap. How do we move forward?
00:03:55.420 | And the conversations in high rung politics are complex. They're nuanced. You know, there's
00:04:01.360 | different realms. There's what is, right? There's science and history arguing about what is. That's
00:04:05.020 | hard to figure out. There's what should be, right? That's philosophy and ethics. Then there's,
00:04:10.240 | you know, even if they agree on those two things, how do we get there, right? What are the right things?
00:04:13.780 | What are the right policies, strategies, experimentation, testing? So there's a lot of
00:04:18.520 | nuance. There's a lot of complexity. And one of the core defining features is how you
00:04:23.800 | form beliefs, right? You go from "I don't know," through some kind of process, to "I know." High-rung
00:04:28.000 | politics is all about truth. They're geared towards truth. They start at "I don't know." There's
00:04:33.820 | kind of an inherent humility to this process. So I think of humility a little bit like trying to
00:04:42.880 | stay on a tightrope. It's not easy, right? You have the Dunning-Kruger thing:
00:04:50.740 | your confidence shoots up when you first learn something, and then
00:04:54.100 | it goes down after you realize you don't know as much as you thought you did. And then sometimes you
00:04:59.020 | can go too low. And when you go too low, you're in kind of the insecure zone, right? You
00:05:02.740 | actually know more than you think you know, but you're in some
00:05:10.720 | kind of imposter syndrome.
00:05:11.740 | Above the line, we're in the arrogant zone, very common in politics, obviously:
00:05:16.600 | you think you know more than you really do. You could even
00:05:21.460 | measure it: how much you're full of shit is the amount above the
00:05:25.960 | line you are. And in high-rung politics, look, no one is great at staying on the tightrope. It's
00:05:32.680 | very hard, but the culture of high-rung politics is helpful because it actually
00:05:40.840 | humbles you, because people will disagree with you. And it's cool in a high-rung
00:05:45.580 | political culture to be humble. If you say, "I don't know," or, "yeah, I
00:05:49.900 | haven't thought about that issue," that makes you seem smart in high-rung politics, right? So it's
00:05:53.560 | encouraged, and whatever the culture finds cool, we're going to do more of.
00:05:57.340 | A core thing about high-rung politics: we don't identify with our ideas. I think of
00:06:05.140 | ideas, when you're in this zone, like a machine that you built. It's like a hypothesis,
00:06:10.300 | right? You put the boxing gloves on, you let your friends kick it, go to town;
00:06:14.200 | you throw it out there and people try to argue with it. The besties are big on
00:06:17.920 | this, right? They relish the opportunity to just tell the other person they're
00:06:21.400 | wrong, or here's why you're biased, or here's why you're being hypocritical.
00:06:24.580 | And this is what high rung politics is about. No one takes it personally. You're just kicking my
00:06:28.600 | machine. And I'm saying, I bet my machine can stand up to it. And they're saying, I bet it can't.
00:06:31.660 | And if it does stand up, man, I just got more confident, because I just realized this thing is
00:06:35.440 | pretty strong. If they break it, it doesn't feel good, but I just got a little smarter.
00:06:39.400 | I just got a little bit less dumb, because I learned something I was wrong about. So they're
00:06:43.360 | kicking it, and watching them box together is dialectic.
00:06:47.620 | Sometimes you play devil's advocate; you take the bat to your own idea. This is
00:06:51.040 | kind of how you move up that humility tightrope to a more knowledgeable place.
00:06:55.480 | Principles-wise, one of the things that defines high-rung politics is consistency. Again,
00:07:02.200 | there's left, right, center, so the principles will totally vary,
00:07:06.100 | but there's consistency either way. Classic example,
00:07:09.340 | Elon was talking about this yesterday: it doesn't count as valuing free speech
00:07:13.480 | to fight only for the free speech of people you agree with.
00:07:15.880 | Every single person in history has had that principle. That's the yellow zone. It's very
00:07:19.780 | easy to stand up for your principles when it's also supporting your team. The challenge comes
00:07:24.100 | when it's not: when it's people you don't like saying things you don't like, for example,
00:07:27.520 | or when it's your team trying to shut down the free speech of others, and you know
00:07:30.520 | it's wrong, even though you do hate that speech. That's when you have to choose green
00:07:34.240 | zone or orange zone. High-rung politics is great about staying in the green zone. You will see them
00:07:39.280 | go against their own team all the time if it doesn't jibe with
00:07:44.500 | their principles. If you take a big step back: this thing, again, it gets heated.
00:07:48.820 | People mistake high-rung politics for meaning we should all be
00:07:52.120 | kind of withdrawn and dispassionate. But I think
00:07:56.380 | it can actually be very passionate, very emotional, very heated. People care deeply
00:08:00.760 | in the high rungs. They can still form coalitions and do marches and stuff like that.
00:08:04.660 | It's just that they care about truth. They're consistent with their principles. They don't
00:08:09.220 | identify with their ideas. They like to argue and ultimately it's a positive sum game with a
00:08:14.140 | positive effect on the country. This is what drives the country forward, right? In science and academia,
00:08:18.100 | this is what drives knowledge forward. What drives innovation forward is
00:08:23.620 | people being able to disagree. Now you get to the other thing, which is low-rung politics.
00:08:29.740 | I have a name for it. I call it Political Disney World.
00:08:33.520 | And I call it that because it's a land of rainbows and unicorns and a bunch of people who
00:08:39.160 | will not change their minds under any circumstances. It's a land of good guys and bad guys. The good
00:08:45.100 | guys are angels, perfectly righteous. The bad guys are awful in every possible way. And the
00:08:50.020 | good guys have good ideas and the bad guys have bad ideas. And there's a checklist. In high-rung
00:08:55.780 | politics, if someone tells me their position on guns, I have no idea what their position is on
00:08:59.800 | climate change or abortion or immigration. In low-rung politics, you hear one position
00:09:05.740 | from someone, boom, you can just look at their demeanor and know every single position
00:09:09.100 | they've got on every single issue. Again, no one thinks
00:09:16.120 | they're in low-rung politics. People there will think, yeah, of course I value truth,
00:09:20.320 | but they don't. They're actually starting at "I know." They start at the checklist item and then
00:09:24.880 | say, well, I have to prove this is correct. So they
00:09:28.720 | won't read the article, or if they read an article that disagrees with them, they hit
00:09:32.440 | a brick wall in their head: this can't be true,
00:09:35.620 | this person is biased, this is ad hominem, whatever.
00:09:39.040 | And when they read an article that agrees with them, when they hear an opinion they like,
00:09:41.440 | all that skepticism disappears and suddenly it must be true. Yes, of course.
00:09:45.400 | So I talked about high-rung politics: the ideas are like machines, right?
00:09:51.700 | You don't get sensitive about it. You kick the machine, right? In low-rung
00:09:55.360 | politics, an idea is like a baby, a very cute baby who you love so much. People's ideas are
00:10:00.580 | sacred in low-rung politics. And this is why: you can kick a machine, and
00:10:06.280 | that's no big deal. If you kick a baby, you're an asshole. And so,
00:10:08.980 | on the high rungs, people can disagree. You have two axes here, decency and agreement,
00:10:16.300 | and they're totally different, right? You can have people that disagree with you that are awesome and
00:10:20.320 | vice versa. You can have people that agree with you and they're assholes, but in low-rung politics,
00:10:23.200 | it's very simple. People who agree with you, they're good people. People who don't, they're
00:10:26.320 | assholes. So what it comes down to is: you have a high-rung
00:10:29.680 | discussion and it kind of looks like this; they're examining things. A low-rung discussion is like,
00:10:33.160 | "Fucking shit. That's a cute baby. God, it's such a good baby. How awful are people who don't like
00:10:37.180 | their baby?" So awful, right? This is very common. If you listen to a low-rung political discussion,
00:10:42.520 | this is essentially what's happening. They're sitting around and they're talking about how
00:10:47.020 | right they are and how awful the people and dangerous the people are who disagree with
00:10:52.300 | them. And that's just, they'll just talk about that forever and ever and ever.
00:10:55.180 | Principles, same idea here. You actually stick with the left circle. The
00:10:59.560 | giveaway for low-rung politics is that when it's not convenient yellow-circle
00:11:06.400 | territory,
00:11:07.120 | they will almost always jump over to the orange circle. Again,
00:11:11.200 | free speech is a perfect litmus test: as soon as it's the free speech of
00:11:14.980 | people you don't like, all those principles disappear. How about
00:11:19.480 | COVID marches? People were completely worked up about anti-lockdown marches
00:11:26.080 | in right-wing states: this is a public health crisis, right? As soon as it's marches
00:11:30.820 | for racial justice, all good, all good. That's orange material. How about all
00:11:35.380 | the people who are super against a president's
00:11:37.060 | immigration policies and surveillance policies and foreign policy and
00:11:43.660 | debt issues, and then, as soon as it's their own president in
00:11:48.280 | office, all those same policies stay and they're fine with them. The classic example:
00:11:52.540 | the debt was the worst thing in the world during Obama's presidency. Then Trump comes into office
00:11:56.500 | and starts adding to it with these tax packages, and suddenly it's no problem.
00:12:00.880 | So there's endless examples here. If high-rung politics is kind of this positive-sum
00:12:07.000 | game, low-rung politics I see much more like two screaming giants.
00:12:13.720 | If the high-rung emergent property is intelligence and progress,
00:12:18.700 | the low-rung emergent property is just strength, you know, fighting for power.
00:12:23.140 | It's a battle of good versus evil. And the big goal is not
00:12:27.580 | trying to create a more perfect union. Again, they think that's the goal,
00:12:32.140 | but the big goal really is beating the bad guys. It's a zero-sum game that ultimately has a negative
00:12:36.940 | effect. So I know I just threw a lot at you because I wanted to kind of cover the different
00:12:40.840 | bases of this to give a feel for what I'm talking about here. This is the framework that I think is
00:12:45.160 | very useful. I've been living with it now for a few years. I've been having conversations with
00:12:48.940 | it and I find that it clarifies a lot and it helps with a lot of things. Like for example,
00:12:53.560 | if you just think in terms of the horizontal axis, as I said, you mistakenly think
00:12:58.720 | the far left and right must be the problem. But it's not; it's the low rungs that are the problem.
00:13:02.320 | That's actually what people are trying to say. When people say "moderate" or "centrist," that's not what
00:13:06.880 | they're actually trying to say. They're trying to say high rung, which can span the whole
00:13:10.780 | horizontal axis. There's more than one tug-of-war going on. We think, if you just have one axis,
00:13:16.180 | well, it's left versus right. And that is a tug-of-war, in both the high and low rungs;
00:13:19.900 | they are fighting for what they want. But there's a tug-of-war going on from north
00:13:24.880 | to south as well. I think a lot of people in here are probably
00:13:29.320 | thinking, I'm that upper-left guy. That's my guess. And if that's true, and it might be true,
00:13:36.820 | you have a tug-of-war going on against that upper-right guy. You also have a tug-of-war going
00:13:41.860 | on against that lower-left guy. This is the thing I think is important to realize:
00:13:46.240 | the people who are on your team, you know, they also hate Trump or
00:13:50.560 | whatever, they might actually be the biggest impediment to what you care about
00:13:55.180 | politically. They undermine the progress of what you care about.
00:13:58.180 | It also can enhance collaboration, because if you're
00:14:02.680 | one of those upper giants, the other upper giant is a lot more on
00:14:06.760 | your ultimate team, if you take a big step back, than the lower giant that wears the same color.
00:14:10.720 | So once you start to think this way, I think it helps to kind of loosen some
00:14:15.640 | of the tribalism and give some nuance to our discussions and give some nuance to what we're
00:14:19.360 | trying to do. Now, the story I wanted to talk about here is that this is okay. This is normal,
00:14:24.820 | by the way, this is not a problem. Every democracy in the world will have this. The
00:14:29.740 | founders knew this would be here. The goal was not to suppress low-rungness; it was to contain it.
00:14:36.700 | Actually, in the economy, to harness it for progress, but in politics, to contain it
00:14:42.220 | so it can't totally take over. They contained it by taking away the physical cudgel:
00:14:46.300 | you can't just conquer and become a dictator, like so many low-rung giants in other countries
00:14:51.820 | have done. There are laws here. And, most importantly, there's kind of a high-rung
00:14:56.260 | immune system, which is just vigorous defense against low-
00:15:02.440 | rung infringement. The low rungs will try to shut down the conversations in the high
00:15:06.640 | rungs; the high rungs resist. They say, no:
00:15:12.940 | you can't enforce your echo chamber upon us. You're allowed to have your
00:15:16.060 | echo chamber, that's fine; you can't enforce it. So this is how it's supposed to be. Now,
00:15:19.600 | part of the reason we're all here, in talk after talk, saying man, politics is awful
00:15:24.520 | and things are bad, and there's a poopy-pants six-year-old with the ice cream falling, is because
00:15:29.800 | I think we've had some big changes to the environment. This is the kind of simple human
00:15:35.500 | equation I think about:
00:15:36.580 | human nature is constant, the environment is what changes, and that produces
00:15:41.500 | different behavior, right? The people who were really hardened during war,
00:15:44.980 | they're not different biologically than us. They just were put in a very different environment and
00:15:49.300 | it created different kinds of people. So our environment has changed a lot. And I think
00:15:52.960 | it's causing a lot of problems. I think it's causing a low-rung flare-up.
00:15:57.520 | So here's one way to think about it: in the sixties, you've got intra-party factions,
00:16:03.400 | right? You have a lot of progressive Republicans and conservative
00:16:06.520 | Democrats, and these factions within the parties hate each other, right? That
00:16:11.080 | was a source of tribalism: some people were just so focused on the other factions in their own party.
00:16:15.280 | Then there's the national parties, like we talk about a lot today,
00:16:19.900 | Republicans and Democrats nationally; that was a source of tribalism. And then there was the
00:16:23.200 | USSR, and also before that Hitler; there were all these scary foreign
00:16:27.580 | enemies that created this kind of macro tribalism on the national level.
00:16:31.840 | So you have patriotism, which is one kind of tribalism,
00:16:36.460 | but it also unifies down below, and the intra-party factions might actually cause the national parties
00:16:41.560 | to collaborate sometimes. So it's not that people were less tribal; it's that tribalism was
00:16:45.820 | distributed. What's happened now is the intra-party factions have disappeared, because the
00:16:50.200 | conservative Democrats have all gone to the Republicans and the progressive Republicans
00:16:53.800 | have all gone to the Democrats, for lots of reasons we can get into in some other talk. That's waned. There's
00:16:59.080 | still a little of it; you still have the Bernie and Hillary camps not liking
00:17:01.900 | each other, but it's much less of a thing. Likewise, yes,
00:17:06.400 | you still have Russia, but mostly that's not the focus. In fact, the focus is so not
00:17:13.420 | there that when there's a foreign thing now, we usually just use it as political fodder for our
00:17:17.440 | national debate: the Russians are on their side; no, they're on their side,
00:17:20.560 | right? And there's no patriotism that unites anymore. What you have is one big old political
00:17:26.080 | divide, and all the tribalism from all those things is concentrated into one place, which is
00:17:30.400 | unhealthy. That's not great. I don't think that's good. So this is one environmental change.
00:17:35.740 | No one's fault.
00:17:36.340 | It's just what happened. Then you also have a lot of things with the electoral
00:17:40.720 | map: between gerrymandering and geographic sorting, you have purple counties
00:17:45.640 | turning mostly red and blue now, which means primaries are effectively electing the farthest-
00:17:49.900 | right and farthest-left people, as opposed to people who could win a general election. There's
00:17:53.380 | a lot of other kind of little environmental changes, but one huge one that we talk about
00:17:57.580 | is the media. I like to place media outlets on a media matrix: accuracy
00:18:02.260 | on the Y axis and objectivity on the X axis. Where you want to be is the
00:18:06.280 | middle, right? And actually, for a long time, there was an incentive magnet to be there
00:18:10.660 | for ABC, CBS, NBC, right? They didn't want to seem like they were inaccurate, and they
00:18:15.040 | had to cater to the whole country, which kept them somewhat close to that. That was the incentive
00:18:18.820 | magnet. Today you have cable TV, and then eventually talk radio, and
00:18:24.820 | then the internet and all these websites. You have tribal media, which is a totally different
00:18:29.020 | set of incentives: you cater to one side only, and the more bias, the more clicks.
00:18:36.220 | And accuracy is just not a concern to the audiences they end up having. And then you
00:18:40.000 | have this feedback loop, like was discussed yesterday, where once you've catered to that,
00:18:45.160 | now you have to keep it going, right? You've now lost the neutral audience.
00:18:49.000 | And so now we have a lot of Americans super addicted to a really trashy reality show:
00:18:54.640 | The Real Politicians of Washington.
00:18:57.640 | It took me a long time to make this, by the way.
00:19:06.160 | I think McConnell's my favorite anyway. Um, so then you've got, of course, the big bomb drops
00:19:15.700 | in our environment. You've got social media. This is a real graph showing people retweet things they
00:19:21.400 | agree with to people they agree with almost entirely, right? It's these algorithmic bubbles.
00:19:25.780 | It's insane, you know? And if you're one of the people who says, oh, I actually follow all kinds of
00:19:29.200 | different people, you're very rare. And again, it didn't used to be
00:19:33.220 | this way. Jon Ronson talks about how Twitter used to be this
00:19:36.100 | radical de-shaming place. You'd go on and be like, oh, I do this embarrassing thing,
00:19:39.580 | and people would be like, me too. It was so nice and fuzzy at the very beginning. And then
00:19:43.240 | it turned into, wait a second, this bad guy is harassing women at work, and now
00:19:47.200 | this woman has power for the first time. She can talk about it on social media. We can create
00:19:51.040 | a whole coalition against it. And he gets fired, and it's exhilarating, and this
00:19:56.020 | is good, right? This is speaking truth to power. Problem is now people are exhilarated and they're
00:20:01.120 | saying who's next, right? And you, you have this new source of power, which again can be used for good.
00:20:06.040 | But it's gotten picked up by a lot of the low rung tribes who have started to use this cudgel.
00:20:12.460 | Well, not just started; it's been a while now. Creating mobs to actually enforce low-rung
00:20:17.500 | politics. And what happens is you end up with the high-rung world very scared, kind of caught off guard.
00:20:22.660 | The normal defenses, the normal immune system is not doing its job. And so what happens when
00:20:29.200 | the high-rung world gets scared, it can set off a domino effect. Imagine
00:20:33.880 | this is the high-rung world. These are
00:20:35.980 | brains; this is what a bunch of high-rung people in a community think. They all think different
00:20:40.480 | things based on the color, right? Now, if we draw a circle around them, the circle's color is what they're
00:20:44.440 | saying. So here is a perfect high-rung community, right? Everyone's
00:20:49.180 | thinking is diverse, and they're saying what they're thinking, and it connects together
00:20:52.720 | into this super brain and it's awesome. Right? But now maybe the social media cudgel, maybe
00:20:57.400 | something else starts to be a little bit scary. And, and this one group starts to say the only
00:21:02.800 | opinion that's okay is the orange opinion. Anyone who
00:21:05.920 | has anything other than the orange opinion is an awful person. The high-rung immune system is
00:21:11.080 | supposed to kick in and push back. If it doesn't, everyone starts getting scared,
00:21:16.540 | and then cowardice starts to spread. And before you know it, everyone's just saying the orange opinion
00:21:20.860 | out loud, even if they don't agree with it. No one wants to outwardly say what they think anymore.
00:21:25.000 | And the problem is you can't actually see what's going on in the brains. You only know what people
00:21:29.440 | are thinking based on what they're saying. So all people see is this. If you're this guy who
00:21:35.320 | actually has a different opinion and is actually surrounded by diverse thinking, you don't know that;
00:21:39.460 | you assume it must look like this. Everyone starts to feel like, I'm the only one who thinks
00:21:44.260 | this. I'm the only one who doesn't like this movement or this politician or whatever. And
00:21:49.780 | the group intelligence that's so awesome about high rung politics,
00:21:54.160 | it disappears. So I think what we're seeing is: why are things so bad?
00:22:01.360 | I don't think it's because we moved to the far right and far left. I think it's because
00:22:05.260 | you have a low-rung flare-up generated by changes in the environment, and the high rungs
00:22:11.380 | have been caught off guard by really rapid environmental changes and they've just disappeared.
00:22:15.400 | They've shrunk away. And the low rungs are running, you know, buck wild. You can see this on
00:22:20.500 | the right, I think, mostly in Washington. You see the debt ceiling being used as
00:22:25.480 | a weapon in a way that should never happen. You see McConnell and the Senate not putting
00:22:30.820 | through a Supreme Court nominee because it's the last year of a presidency, which was totally
00:22:35.020 | unprecedented; that's not the rule. And then four years later they go and
00:22:37.720 | put their own nominee through. This is low rung. Of course, Trump with the election:
00:22:41.680 | I mean, Reagan's big thing was that the peaceful transition of power is what
00:22:46.300 | makes us special, and Trump of course is the exact opposite. On the left,
00:22:49.960 | I think we see it less in Washington and much more in culture. I think wokeness
00:22:54.340 | is two things. It's a far-left ideology, and it is far left: it's postmodern and
00:23:00.280 | it's Marxist, and that's fine. You can have those things in the high rungs. The thing that
00:23:04.000 | makes
00:23:04.780 | wokeness low rung is the way they treat others. You can go and
00:23:10.240 | have your own echo chamber with the woke mantra.
00:23:14.620 | What a low-rung person in a liberal country is supposed to say is: I don't like these ideas,
00:23:18.940 | and so I won't listen to them. What you're not supposed to be able to say is: I don't
00:23:22.960 | like these ideas, so no one is allowed to listen to them, right? A disinvitation on campus,
00:23:27.160 | which has become very common, isn't saying, I won't go to that talk,
00:23:31.780 | which is a low-rung thing to say; it's much worse. It's saying,
00:23:34.540 | no one on this campus is allowed to hear that talk. And we've seen that play out. We see
00:23:38.860 | James Bennet, the editor of the New York Times op-ed section, getting fired because he published
00:23:43.120 | an op-ed by Tom Cotton that 62% of the country agreed with, but it didn't jibe with woke
00:23:48.640 | orthodoxy. You see Denise Young Smith at Apple, a Black woman who was head of diversity,
00:23:54.460 | who said: to me, diversity is more complicated than just
00:23:58.720 | something like race; when I look at 12 blue-eyed, blonde-haired guys, I see
00:24:04.300 | diversity, I see different people, diverse in different ways. She was fired for saying that.
00:24:07.840 | You can go on and on. Medical journals that have never retracted
00:24:13.660 | papers before are retracting peer-reviewed papers because they get a rise on Twitter from the woke
00:24:20.020 | mob. So I think we're seeing this in different ways, but to me, it's all one big story, which is
00:24:24.100 | that we're having a low-rung flare-up and these low-rung giants are out of hand. They're doing
00:24:28.180 | things they're not supposed to be able to do. And they're doing that because the immune
00:24:31.120 | system's failing. And that's why we all look like this. Now, the good news is,
00:24:34.240 | I do think this can change. I don't think most people are like this. I think most people are.
00:24:39.100 | And by the way, if you think this is, oh, another binary divide, we are all high rung and low rung
00:24:43.900 | at different times. And that's one of the big differences here. Um, I think that if we want
00:24:47.620 | to get out of this and get back to here, we need two things. We need awareness,
00:24:53.860 | which is the first thing. We need to be aware of this axis,
00:24:59.080 | and to think about not just where am I being bullied intellectually, what's really the
00:25:04.180 | low-rung thing and what's not, but also where are we being low rung? Because we all can do this. There
00:25:08.440 | is a huge part of our brain that wants to go and identify with our ideas and
00:25:12.820 | be hypocritical. So where am I doing it? Where are the people around me doing it?
00:25:17.440 | And maybe realizing: okay, maybe the people on the high rungs, when I am there,
00:25:21.400 | who disagree with me horizontally, maybe those are my friends a lot more than the low-rung
00:25:25.600 | people who are voting for the same candidate. And finally, awareness without saying anything
00:25:31.540 | out loud is useless, right? Awareness has to be
00:25:34.120 | coupled with courage. People have to start speaking out, and actually, the
00:25:40.000 | high-rung immune system is built of courage. It's built of people actually standing up.
00:25:45.700 | You've seen this with some companies declaring: we are not a political place.
00:25:50.500 | That's courage in the face of a cudgel that's trying to get them to be political.
00:25:54.400 | And so I think if we can have a little bit more awareness and a little
00:25:58.120 | bit more courage, this low-rung flare-up can be, I think,
00:26:04.060 | controlled, and I think we can end up in a better place. Thank you.
00:26:12.400 | Amazing. Truly epic. What an amazing talk to follow. The talk we had earlier,
00:26:27.580 | I don't know if you got to witness it, the Palmer Luckey talk.
00:26:30.400 | I was trying to think of how to trash you because it was so popular to do.
00:26:34.000 | So you were going to go low rung. Yeah, I was. But in fact,
00:26:39.820 | I think Palmer and I had some low rung moments where, you know, he was doing the anti-Hillary
00:26:44.260 | stuff. I was dunking on him for it. And then we saw an example of maybe adult high rung behavior
00:26:49.180 | of like, Hey, let's sit here and talk about the differences. I want to put out there just talking
00:26:54.400 | about the woke movement for a second. One of the major challenges I had in this event was certain
00:27:01.180 | people attending the event
00:27:03.940 | made some people in that group unwilling to come to the event. No offense, Keith.
00:27:10.660 | In other words, Keith, Sacks, you know, and then even Glenn Greenwald
00:27:18.400 | and Matt Taibbi were triggers for certain people to not come speak. They were going to kick the baby.
00:27:25.720 | They were going to kick the baby. And then on the right, I think there's some pleasure
00:27:33.880 | in knowing you're triggering the libs, and it's exacerbated this. It's hard for me as a conference
00:27:41.020 | producer or a podcast producer to get the two sides to sit and just have a reasonable discussion at
00:27:46.720 | times. How do we break that logjam of the right just loving to troll and trigger the
00:27:51.940 | libs, and the libs being like, I'm not even participating in the discussion with this
00:27:56.860 | group of people, that group of people, you know, the Sackses, the Keiths, whatever.
00:28:01.420 | I think... oh, Keith, you just came on stage.
00:28:03.820 | Yeah. Oh, by the way, please welcome Keith Rabois.
00:28:06.160 | He triggers a lot of libs,
00:28:11.800 | but let's start there. And then Keith, I'd love to hear you respond to this dynamic,
00:28:18.160 | which I know you are fully aware of.
00:28:19.540 | Yeah. So I think we can get some clear definitions here.
00:28:23.140 | Not wanting to go to something: a high-runger says, oh, they disagree with
00:28:30.400 | me? Great, let me go. That's what they really want to hear, because I want to
00:28:33.400 | learn something. Right?
00:28:33.760 | The low-runger says: those evil, awful people, I'm not going to go, right? They storm away. Fine.
00:28:39.580 | You're in a liberal country, live and let live. These are both okay.
00:28:42.400 | What's not okay is the low-rungers pressuring you to kick off those speakers, because otherwise
00:28:49.780 | they're going to start a movement, a petition, a boycott of your show that's going to
00:28:55.420 | end up hurting you in some way, smearing you on social
00:28:59.500 | media, pressuring this to not happen at all. That's saying no one's allowed to
00:29:03.700 | go to that conference. That's what's not okay. It's interesting you bring this up. I shared
00:29:08.560 | with you that back channel. It was beautiful. There was a back channel about how beautiful
00:29:14.800 | the moment was with the high-rung discussion we just had. There was also a dark moment before
00:29:19.000 | the event, where a group of people who did not agree were doing what you're saying. The woke
00:29:25.360 | mob was saying, we need to get other people on the left... Oh, hello, David.
00:29:33.640 | It's time. They were supposed to tell me when round boy got here. So there was literally, to
00:29:38.500 | your point, an intolerance level of: not only are we not going to come to the All-In Summit because
00:29:44.140 | Sacks or this person or that person are there, we're going to start telling other people to not
00:29:49.660 | go and not participate. It literally happened, and I had to stop it. But look, this
00:29:54.940 | conference did happen. Those people did come; ideas were spread. So this is a victory for high rung.
00:29:59.740 | This is, yeah.
00:30:03.580 | So then to you, Keith: tell us, why is it so pleasurable to trigger the libs, David...
00:30:10.840 | Keith? No, in all seriousness, you love to debate. You take all comers, no problem. You
00:30:16.240 | want to get in the arena. What you're seeing now... how does this...
00:30:21.640 | Can I actually just interject on that? Sure. So, speaking for myself, I don't get any
00:30:28.240 | pleasure in triggering libs, and that's not my objective. And I don't think that's necessarily
00:30:33.520 | what I'm really doing. It's that because we are willing to debate and we're not afraid to have the
00:30:37.900 | conversation, you're now redefining that as triggering other people. We're not. We're
00:30:42.700 | just willing to have a conversation. Yeah. I think it's really easy to tell
00:30:51.400 | who are the people who have good points to make and have intellectual confidence, because
00:30:57.160 | they're the ones willing to show up and have conversations. And I think it's the biggest cop
00:31:01.120 | out for anybody to say, well, I can't be there
00:31:03.460 | because I see this name and this name on your agenda. How lame is that?
00:31:07.480 | Well, and to be honest, you know, on a lot of the positions,
00:31:12.100 | um, I think you and Palmer probably disagree on the approach to Ukraine. He's probably very pro
00:31:18.580 | supporting that. And you might be a little more dovish.
00:31:21.760 | Yeah, so two points. First of all, I took on this fool's errand like 10 years ago
00:31:28.180 | of correcting everything wrong on the internet, which is an insane idea.
00:31:33.400 | And I still haven't quite gotten myself out of that. But the reason why I did it was I felt
00:31:38.380 | like, wow, someone who doesn't know any better might read something that's wrong
00:31:43.060 | and they might believe it. So at least if I start correcting it, they'll see that there are
00:31:47.380 | multiple perspectives, and then they'll have to dig in as opposed to just taking it for granted.
00:31:50.860 | The second thing is, yeah, I have no desire to trigger the libs, but I do feel like I have a
00:31:55.420 | platform, and I don't want to die without having used whatever influence I have to proselytize
00:32:00.400 | for ideas I believe in. So if I have 300,000 followers,
00:32:03.340 | I feel I would be neglecting the benefits of my life if I'm not proselytizing for
00:32:09.460 | the five, six, seven, eight, nine things I care about. And so I don't want to wake up one
00:32:13.540 | day and say, I wish I had done X, Y, or Z, and it could have maybe changed the world.
00:32:16.660 | Can I ask Tim a question? His name's Tim, right?
00:32:21.880 | Hey, nice to meet you, David. How are you? We actually haven't met before.
00:32:26.740 | Do you think that over time, content has gotten
00:32:33.280 | shorter and soundbites have become kind of the primary form of content? It used to be that we'd
00:32:38.680 | sit down and read books and we'd read newspapers and we'd watch these long-form news-hour
00:32:42.580 | conversations. And then things got shorter, they got faster, they got quicker. And
00:32:48.760 | as a result, we ended up kind of debasing ourselves and ending up at this point where
00:32:52.120 | everything has to be reduced to that primal, instinctual reaction moment. And it gets even
00:32:58.720 | more significantly fueled by the feedback loops associated with social media, so the things that
00:33:03.220 | you see more of are the things that really do trigger that primal,
00:33:08.380 | emotional sense. Is that a big driver, do you think, societally?
00:33:14.440 | Have we become more tribal over the last century? Yeah. I mean, I think environmental
00:33:20.740 | changes will produce behavioral changes, and it can sometimes be a
00:33:27.400 | feedback loop where you have shorter, more emotional, kind of triggering content.
00:33:33.160 | It's almost like pheromones: evolutionarily, it wins. Yeah. Well,
00:33:38.080 | and on Twitter, actually, there's a phenomenon where
00:33:42.040 | virality dumbs down information, because nuanced information doesn't hit as hard. Totally.
00:33:48.760 | It's kind of like evolution,
00:33:53.980 | where the tweet that ends up super viral has survived
00:33:58.240 | a hundred other competing tweets to get there. Totally. And the ones that are rising to the top...
00:34:01.720 | There is a mechanism right now that is kind of forming a magnet down in Political
00:34:09.280 | Disney World that is pulling us down, and one of the questions I have for Elon is how can
00:34:14.860 | that somehow be...
00:34:16.800 | One idea that a friend and I were kicking around: Wikipedia managed to
00:34:25.740 | somehow stay somewhat nuanced and neutral, in a way.
00:34:30.600 | There could be some kind of giant pool of 10,000 moderators who actually rank things by,
00:34:38.080 | maybe, high rung and low rung, and the algorithm doesn't necessarily suppress the low-rung
00:34:42.840 | stuff, it just doesn't push it.
00:34:44.220 | Which right now the algorithm is...
00:34:45.840 | You're talking about like moderation, editorialization almost.
00:34:48.840 | Yeah, at least to give it like a credit rating on maybe a high-low scale.
00:34:53.220 | I kind of view this as like a muting effect.
00:34:56.380 | It's like an institutionalization of these social networks.
00:34:59.480 | Everyone talks about them being free to run as a network without kind of a central
00:35:04.440 | system of control, but sometimes that central system of control has an important role to play in
00:35:08.300 | moderation, muting, editorialization that kind of avoids some of the adverse consequences.
00:35:14.820 | It's definitely optimizing downwards right now.
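(A minimal sketch of the ranking idea floated above, assuming a hypothetical feed where a large moderator pool rates posts on a high-rung/low-rung scale and the ranker damps low-rung reach without ever hiding it. The Post fields, ranking_score, and the 25% reach floor are all invented for illustration; this is not how any real platform works.)

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Post:
    text: str
    engagement: float  # organic score, e.g. interactions per hour
    rung_ratings: list[float] = field(default_factory=list)  # moderator votes: 0.0 = low rung, 1.0 = high rung

def ranking_score(post: Post) -> float:
    """Demote without deleting: low-rung posts stay visible, they just aren't pushed."""
    if not post.rung_ratings:
        return post.engagement           # unrated posts keep their organic reach
    rung = mean(post.rung_ratings)       # a big moderator pool averages out individual bias
    boost = 0.25 + 0.75 * rung           # low rung keeps 25% of its reach; never zeroed out
    return post.engagement * boost

posts = [
    Post("nuanced policy thread", engagement=40.0, rung_ratings=[0.9, 0.8]),
    Post("rage-bait dunk", engagement=90.0, rung_ratings=[0.1, 0.2]),
]
feed = sorted(posts, key=ranking_score, reverse=True)  # the thread now outranks the dunk
```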
00:35:17.160 | What do you think, Keith?
00:35:18.160 | Yeah, I sort of disagree.
00:35:19.160 | Should Elon buy Twitter and then...
00:35:22.040 | I sort of disagree.
00:35:23.280 | I mean I grew up in the 70s and 80s and soundbite was the term of art for like 30 second commercials.
00:35:28.480 | Yeah, right.
00:35:29.480 | I think the 30-second commercial was the term of art for politics.
00:35:31.360 | I don't know of any evidence that suggests that tweets today in politics are worse than the
00:35:35.000 | 30 second commercials I grew up with.
00:35:37.140 | And if you think about polarization, I also used to watch European politics in the 70s
00:35:41.720 | and 80s.
00:35:42.720 | The most extreme ends of politics you'll ever see.
00:35:44.560 | We don't have any of those extremes in the United States still today.
00:35:47.520 | So I don't think there's...
00:35:48.520 | I think a lot of people make arguments without evidence that things have changed.
00:35:52.200 | And I actually start with first principles like, wait, where's the evidence?
00:35:56.480 | People talk about misinformation.
00:35:57.920 | There's no evidence that the American voter in
00:35:59.440 | 2016 had less information or less accurate information than in 1888 or 1894 or 1910.
00:36:05.620 | In fact, the opposite is true by most serious studies.
00:36:08.680 | So this is all kind of made up in my mind.
00:36:11.060 | And yes, Elon should buy Twitter to save the world, but it's not going to be a good financial
00:36:14.260 | investment.
00:36:16.720 | How does it save the world, do you think?
00:36:18.760 | Well, we need a free speech platform where people can debate ideas and the left wing
00:36:23.260 | of Twitter, the employee base has completely suppressed ideas.
00:36:26.800 | For example, my husband, I happen to know this,
00:36:28.900 | wrote an article in Foreign Policy magazine, like the most prestigious publication
00:36:33.260 | in the entire planet for foreign policy debate about the CCP. Twitter refused for years to allow
00:36:39.900 | him to advertise that article published in Foreign Policy magazine. So there's clearly
00:36:43.800 | someone at Twitter suppressing content that's critical of the CCP. And we tried appealing to
00:36:48.160 | everybody, and they wouldn't change this. So there's either Chinese spies there or a left-wing
00:36:51.980 | culture that suppresses debate. This is Foreign Policy magazine; you can't get any more prestigious
00:36:57.240 | than that. It's absurd. Let alone the fact that I have 300,000 followers and do not have a blue
00:37:01.720 | check. I must have the largest following of anybody who doesn't have a blue check. And it's all
00:37:05.880 | because I have views that are unacceptable. That seems pretty ridiculous considering many
00:37:11.780 | other VCs who are meaningfully less credentialed have one. Of course. It's obvious. And
00:37:18.260 | insiders at Twitter have sent me screenshots of various things. There's no doubt
00:37:22.320 | that it's a left-wing monoculture that's suppressing ideas, and someone needs to fix that. Either Twitter
00:37:26.460 | fixes it, or the government needs to fix it. And I think that's the way it's
00:37:27.220 | going to be. There's going to be a lot of subpoenas flying over to Twitter because there are absolutely
00:37:35.100 | foreign governments influencing some of those decisions at Twitter. Well, I mean, it was in fact
00:37:40.760 | proven that there were Saudis inside of Twitter, Saudi national operatives. Yes, the best tweet
00:37:46.980 | retort ever was by Elon. Yeah. I wish I could be that good. What was the tweet? Well, you know,
00:37:53.720 | the Saudi prince was complaining. I mean, he said,
00:37:56.620 | please explain freedom of speech and how that works in your country. All right. Yeah. I mean,
00:38:00.920 | can you explain cancel culture in your framework? Yeah. So I like to use a couple terms here.
00:38:14.700 | There's social bullying, which is: if you disagree with me, you can't be my friend.
00:38:22.600 | And again, that's okay, right?
00:38:26.580 | I don't think you're an awesome person if you act like that, but you're allowed to.
00:38:29.320 | Then there's what I would call idea supremacy, which is,
00:38:35.460 | like I've been saying: no one is allowed to say this thing, whether you're my friend or not.
00:38:43.140 | And, you know, if you want to run something on your own property, you can make all the speech
00:38:48.500 | rules. But cancel culture is specifically going into places that are supposed to be high rung.
00:38:54.240 | You know what it says on top of Harvard College?
00:38:56.540 | Veritas, right? Veritas. That is them putting their stake down in
00:39:02.780 | the ground and saying: we are a high-rung place. They're not using those words, but that is
00:39:06.780 | what they're saying. We are a place that cares about truth, that cares about diversity of ideas,
00:39:10.360 | right? That cares about openness and inquiry and curiosity and all of this.
00:39:13.380 | And so cancel culture goes into places like that. Google, you know, started off with
00:39:19.680 | their all-hands meetings; it was all about every idea is good,
00:39:22.500 | criticize the leadership, right. So these things were specifically high
00:39:26.200 | rung, right; they were founded on these things. Cancel culture goes into those places and says:
00:39:31.140 | the rules of our preferred echo chamber now apply to everyone here. And it's a power thing.
00:39:36.400 | A lot of groups want to do that, right? I'm sure the pro-
00:39:40.080 | lifers would like to go onto campuses and say no one can have a pro-choice position. They don't
00:39:43.120 | have the power. Cancel culture is a product of a group that's not supposed to have the power to do
00:39:48.960 | that having the power to do that. And I think that comes from the fear of social media. It comes
00:39:53.620 | from this hypercharged tribalism in
00:39:56.180 | the environment we live in right now. And a lot of things.
00:39:58.080 | So let me give you a solution. So one of the solutions to many things in life is moving to
00:40:02.100 | Miami, and I'm serious about this. Ladies and gentlemen, Mayor Francis Suarez.
00:40:08.880 | One of the most stark things when we moved to Miami 17 months ago was that in Miami,
00:40:13.820 | it's incredibly refreshing, because everybody has a different position. There's literally
00:40:17.640 | no environment socially, politically, culturally, business-wise, where you won't run into people
00:40:22.800 | who voted for Biden or for Trump. You cannot go to a dinner of eight people
00:40:26.160 | and have them all have the same views. You cannot work in a company where people all voted the same way
00:40:30.480 | or all have the same views. And if you try to caricature people, you're going to be wrong all the time.
00:40:34.240 | Even I catch myself, like assuming this person of this demographic is going to be liberal and they're
00:40:38.000 | not. And so here people learn to both be polite, sort of like when you were growing up
00:40:43.280 | you were taught you don't debate religion at dinner, people are polite,
00:40:46.880 | but also they have to engage. And it's incredibly refreshing because people learn to partake in
00:40:51.600 | arguments and it would be impossible to live in Miami successfully unless you do this every day.
00:40:56.140 | And so I think this is a model for America, like many things in Miami.
00:40:59.500 | But Keith, over time, doesn't that transform? Isn't there a concentration of ideas, of
00:41:04.860 | memetics, that ultimately kind of rules the roost, and eventually
00:41:10.060 | you end up with two poles, two camps? I mean,
00:41:14.860 | isn't this how all societies start the great debate, the great conversation? This is a microcosm
00:41:19.180 | of what just happens with human behavior over time.
00:41:20.940 | Maybe. But if you understand ideas...
00:41:22.540 | One of the benefits for me was I grew up in, like, the most woke environments ever. I spent years
00:41:26.120 | at Stanford and then Harvard, like pretty woke places. And all my professors in political science
00:41:30.900 | were super liberal, but I was conservative the whole time. And every one of my essays,
00:41:35.540 | if you read my final exams, they're all conservative because I had to learn to
00:41:38.660 | master all the liberal arguments and find the weaknesses and the data points and be able to
00:41:42.540 | marshal evidence. And that's a healthy thing. So when you encounter people who have different views,
00:41:46.980 | like, for example, there are controversial laws in Florida, "don't say gay," quote unquote,
00:41:52.180 | changing abortion policy here. People here will talk about these,
00:41:56.100 | but they're going to talk about them politely and debate them. And that's good for everybody.
00:41:59.760 | I bet you, for example, if you read the media or you read Twitter,
00:42:03.320 | you think this abortion law change in Florida is radical. It's actually more
00:42:06.680 | permissive than any European country, but nobody knows that. France actually
00:42:10.560 | only allows abortion up to 14 weeks. Germany is like 16. We're at 20 here. So we're more
00:42:15.560 | liberal than Europe, but nobody talks about that on Twitter that way. But if you lived in Florida,
00:42:19.360 | you would actually know that.
00:42:20.160 | By the way, the campuses you just described, they're not like that anymore.
00:42:24.360 | The amount of testimonials from students
00:42:26.080 | saying, if I disagree with the professor on my exam, I will get a bad grade, or even worse.
00:42:31.260 | Again, this is what happens when there's encroachment by a low-rung giant and there's no pushback:
00:42:35.320 | it will keep going. So they've gone to some crazy places. Here's an example. At Berkeley
00:42:39.700 | right now, and UCLA, and about 20 other schools, if you want to apply to be a chemistry professor,
00:42:49.920 | the first thing you have to do is fill out a diversity statement.
00:42:53.180 | That sounds nice, a diversity statement, but actually you have to
00:42:56.060 | prove that you have a proven track record of social justice activism of the woke variety. Not
00:43:00.680 | MLK-style social justice, a very specific kind. And if you are not a proven
00:43:06.180 | activist with the right politics, and that's more than a political litmus test, you have to
00:43:09.740 | actually be an activist, your application won't even be seen by the chemistry department.
00:43:14.220 | So there are stories like that where you're just like, oh my God. But that's
00:43:17.880 | what happens when the immune system is failing: these things will continue.
00:43:21.600 | So what is the antidote to that, for those of us that can't move to Miami?
00:43:26.040 | Well, everybody can. We welcome new business.
00:43:28.620 | Those of us that haven't yet.
00:43:29.580 | The antidote is leadership, because in each one of these stories,
00:43:35.580 | James Bennet getting fired from the New York Times, read the story
00:43:38.700 | in detail, McNeil is another example from the New York Times with a whole long story, but
00:43:43.620 | in each story, the leadership matters, because
00:43:48.960 | most people are not insane like this. This is again with the orange circles:
00:43:52.500 | almost everyone actually thinks these firings are insane.
00:43:56.020 | And that's what's scary: they're happening anyway. So in each of these stories, you see a
00:43:59.440 | moment when the leadership first says, well, even though I
00:44:02.920 | hate his views too, we value a diversity of viewpoints. And then
00:44:09.100 | there's a huge pushback, and there's a moment of truth. Are they going to stand up for the Veritas,
00:44:13.600 | for the core values? Or are they going to cede the
00:44:18.580 | culture to the mob? And what cancel culture is, is these moments of truth, the leadership
00:44:23.860 | choosing cowardice.
00:44:26.000 | And the actual cudgel of social media doesn't actually hit the person.
00:44:30.680 | It's the leader going and actually firing them.
00:44:34.240 | The leader ends up being the one who does it.
00:44:36.080 | Standing up to the mob, as opposed to letting the mob rule you.
00:44:39.080 | Which is the hard thing in a lot of these companies.
00:44:41.480 | Well, it's very hard to do.
00:44:43.040 | We see it at all these companies in Silicon Valley.
00:44:44.720 | Well, we see it when we do the podcast. We had a moment when we were discussing the
00:44:50.420 | don't say gay slash parents' choice bill. Look at the
00:44:55.980 | framing of that. It's completely hilarious that we framed it as those two things:
00:45:00.900 | either you don't want parents to be able to parent their kids, or you hate gay people.
00:45:05.400 | Really, is that what we're talking about here? We looked at it, and a couple of
00:45:10.440 | besties were having a conversation, I won't say who, and we were trying to get educated on it.
00:45:14.340 | And I'm like, should people be able to talk about their gay parents in first grade, second grade,
00:45:20.580 | third grade? Of course. You're a parent, you're gay. I'm assuming you don't want a rule that kids can't talk about their gay parents.
00:45:25.960 | You're a parent. Nobody's supposed to be able to tell you that you can't be talked about at school. And then it was, gender assignment and what gender you choose. And now we're sitting there going, I don't actually know enough about this. Should you introduce that you can be one of 40 genders at six years old, or 12 years old? When should sex education start? I actually don't know. We learned at 15. Should it be 12? I don't know. And we were like, is this a discussion we can have on the podcast without actually consulting some people who know more than us before discussing it?
00:45:55.940 | And I've written three or four tweets about the trans swimmer. And I have feelings on it. But I'm like, should I actually tweet that I find it profoundly unfair that this person gets to win every single women's meet? And I kind of feel bad for the women for whom the best they can now do is second place. Am I going to get canceled for that? Because that was my initial response to it. And I don't actually know my position, because I don't know that other person's story.
00:46:25.920 | Who's a trans woman, and maybe she does deserve to be in that race. I don't know if anybody has an answer for that. So I'm curious, from the besties themselves: what are your thoughts on us tackling some of those things without getting canceled or facing the blowback?
00:46:55.900 | Yeah, absolutely.
00:51:55.700 | that we're going to see 2000 all over again. And so privately, internally, I've been arguing
00:52:00.660 | that this is exactly what's going to happen. And so my behavior should
00:52:04.720 | reflect my views. I believe in some consistency and harmonization. So if I believe tech stocks
00:52:10.060 | and tech companies aren't worth that much, I can't be investing until they reset. And so I don't want
00:52:14.180 | to spend money and invest in companies that aren't going to make me money. My job is to ultimately
00:52:17.520 | return billions of dollars to my LPs. And if I can't do that, I shouldn't be giving anybody any
00:52:21.880 | money. So when do you change your mind? Well, there are founders who are ahead of the curve.
00:52:25.820 | There always are, founders who understand
00:52:29.240 | where the world's going better than I do. They actually teach me about where the world's going more
00:52:32.260 | typically. And if they have appropriate expectations, I'm happy to invest. So the last three or four
00:52:37.080 | investments I did make actually were all, interestingly enough, about $1.5 million
00:52:41.280 | investments, where the founder walked in and said, you know, I don't need a lot of money. I can
00:52:45.240 | accomplish a lot. I can achieve inflection moments for a very small amount of capital. That was the
00:52:49.660 | easiest thing ever to say yes to, at $1.5 million.
00:52:52.460 | I don't need to think about the macro world. I don't need to think about where the NASDAQ's going.
00:52:56.580 | And so the last three or four investments were all incredibly disciplined founders that I made
00:53:01.240 | late last year, arguably into January. Now, we have doubled down, just to be clear about our
00:53:06.540 | conversation. We have doubled down in portfolio companies, where we've led new rounds. But as far
00:53:11.220 | as a new investment from scratch, I haven't made any new ones this year. So when you double down
00:53:15.640 | in a moment like this, how do you set valuation, especially if the last valuation was maybe
00:53:20.360 | set at what felt like the top tick?
00:53:21.840 | If I think the founder has sort of digested where the world is, then we can have a
00:53:28.140 | dialogue about valuation. Otherwise, I actually encourage them to go shop it. I'm saying,
00:53:31.740 | we will give you money.
00:53:32.660 | But will you price it at the same mark, or at a discount now?
00:53:35.100 | Well, if they have a fair market valuation from top-tier firms, we'll try to be
00:53:39.260 | in that zone. But they'll often go to the market, and people will either pass, pass,
00:53:43.300 | pass, pass, pass, or they will give them a dose of reality, and then we'll match that.
00:53:48.020 | But we've done that a few times, where we've encouraged founders. Typically, we wouldn't do
00:53:51.540 | this.
00:53:51.820 | Because my partner, Brian Singerman, loves to power money into companies that are working.
00:53:55.700 | We've been a high-conviction fund for about a decade. So typically, if we like a
00:53:59.900 | company, we'll lead the next round, and lead the next round. We've done this with Ramp, for example.
00:54:03.260 | We've led, like, three or four rounds. But now, with a valuation reset going on, it's been easier
00:54:08.180 | sometimes with founders I really like to say, why don't you go talk to five other people?
00:54:11.760 | It's better hygiene.
00:54:11.920 | Well, it's just, go talk to five other people, and I'll match what they do if they're
00:54:15.900 | really top-tier people. But I want you to get fair market feedback, not
00:54:20.240 | just have to rely upon my judgment.
00:54:21.680 | Are we at the point in the cycle where the down rounds, the warrants, the liquidation
00:54:29.180 | preferences have happened, or are starting to be discussed?
00:54:33.320 | Definitely seeing a lot of liquidation preferences again.
00:54:35.300 | Explain what that is, and why it's important.
00:54:37.700 | Yeah, so, liquidation preference basically means that the investor is going to get their
00:54:43.840 | money back first, regardless of what happens in the world.
00:54:48.480 | Nobody who's a common shareholder, which
00:54:51.540 | basically means founder or employee, is going to get any money until the investor gets all
00:54:55.960 | their money back times some multiple. And that multiple is based on time and/or just
00:55:00.860 | a hurdle. It's very scary, but it can be arbitraged by success. Founders sometimes can arbitrage
00:55:06.480 | it well, meaning they have asymmetric information about the future of the company. If they really
00:55:10.440 | believe they can hit escape velocity in a short period of time, it can be a decent gamble.
00:55:15.140 | I've seen someone like Jack Dorsey at Square do this, a very sophisticated CEO, and he knew
00:55:20.460 | what he was doing and knew why he was doing it.
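To make the mechanics concrete, here is a minimal Python sketch of the payout logic described above, assuming a single investor with a simple non-participating preference. The function, its parameters, and the dollar figures are hypothetical illustrations, not actual deal terms.

```python
def exit_payouts(exit_value, invested, pref_multiple, investor_ownership):
    """Split exit proceeds under a simple non-participating liquidation preference.

    The investor takes the better of (a) their preference, i.e. invested capital
    times the agreed multiple, or (b) converting and taking their pro-rata share.
    Common holders (founders and employees) split whatever remains.
    All numbers used here are hypothetical.
    """
    preference = invested * pref_multiple
    pro_rata = exit_value * investor_ownership
    investor_take = min(exit_value, max(preference, pro_rata))
    common_take = exit_value - investor_take
    return investor_take, common_take

# Hypothetical: $50M invested at a 2x preference for 20% of the company.
# At a $120M exit, the 2x preference ($100M) beats pro rata ($24M),
# leaving only $20M for common -- the "playing with fire" scenario.
print(exit_payouts(120e6, 50e6, 2.0, 0.20))   # (100000000.0, 20000000.0)

# "Arbitraged by success": at a $1B exit, pro rata ($200M) beats the
# preference, and common keeps $800M.
print(exit_payouts(1e9, 50e6, 2.0, 0.20))     # (200000000.0, 800000000.0)
```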
00:55:21.400 | And it's worked out pretty well, actually. But you're playing with a lot of fire, so
00:55:25.940 | it's not for everybody. And you should get a lot of feedback and advice beforehand. The flat
00:55:30.020 | rounds are definitely happening. Flat is the new up round, that kind of philosophy, even in
00:55:34.500 | some of our better companies.
00:55:35.500 | Are those senior liquidation preferences or pari passu preferences?
00:55:37.700 | It depends. Depends on the round. They're all over the map, actually.
00:55:42.300 | So, the market hasn't shifted to the point that all new money coming in is senior to
00:55:45.260 | all other money?
00:55:46.260 | Depends how much leverage and what quality investors you have on your cap table.
00:55:48.760 | Say, for example, someone tries to put a senior liquidation preference
00:55:51.260 | on top of our capital: I'm going
00:55:52.620 | to yell at them a lot. And if they ever want a new investment from our fund, they
00:55:56.740 | may not want to do that.
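A short sketch of the senior-versus-pari-passu distinction just discussed, under simplified assumptions (preference claims only, no participation or conversion to common); the tranche names and amounts are hypothetical.

```python
def waterfall(exit_value, prefs, pari_passu=False):
    """Distribute exit proceeds across liquidation-preference tranches.

    prefs: list of (name, claim) pairs, most senior first.
    Senior stacking pays each tranche in full before the next sees a dollar;
    pari passu shares any shortfall pro rata across all tranches.
    Hypothetical sketch, not real deal terms.
    """
    if pari_passu:
        total = sum(claim for _, claim in prefs)
        scale = min(1.0, exit_value / total)
        return {name: claim * scale for name, claim in prefs}
    payouts, remaining = {}, exit_value
    for name, claim in prefs:
        payouts[name] = min(remaining, claim)
        remaining -= payouts[name]
    return payouts

# Hypothetical: $60M exit against a $50M Series B pref and a $30M Series A pref.
print(waterfall(60e6, [("Series B", 50e6), ("Series A", 30e6)]))
# {'Series B': 50000000.0, 'Series A': 10000000.0} -- new money made whole first
print(waterfall(60e6, [("Series B", 50e6), ("Series A", 30e6)], pari_passu=True))
# {'Series B': 37500000.0, 'Series A': 22500000.0} -- shortfall shared pro rata
```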
00:55:57.740 | Do you think that we're a couple of turns away from the discount rounds?
00:56:03.640 | Well, some companies are going to have to try. The problem is, for example, we don't
00:56:07.040 | like to do those rounds. There's so much brain damage in the politics of that with founders,
00:56:11.020 | with prior investors.
00:56:12.020 | Just actually walk us through that. Tell us about that brain damage.
00:56:14.660 | So typically, you'd think there's an efficient market for pricing, right? Like, I need this
00:56:18.320 | much capital, and the market's going to find the price of that capital.
00:56:21.120 | In private capital, that's not really true.
00:56:22.940 | So if someone comes to me and says, my last round was done
00:56:26.900 | at $300 million, nine months ago, and today it'd probably get priced at, let's say, $120
00:56:32.920 | million,
00:56:33.920 | I'm more likely to say no than to give them an offer at $120, because I know their prior
00:56:38.980 | investors and their prior employees are going to be mad and furious at me. And I don't
00:56:44.080 | want a lot of founders and people annoyed at me. And so that brain damage isn't worth it.
00:56:51.980 | So I'm more likely, and our fund is more likely, to say no than try to find whether $80, $100,
00:56:55.980 | $120, $140 is the appropriate price, which is very bad for the company in some ways,
00:57:01.100 | because they might need the capital.
00:57:02.600 | And you starve them of money.
00:57:03.780 | Yeah. They may be able to find somebody else. But we typically, at Founders Fund, really
00:57:07.720 | don't like to do those rounds. The only way we would consider it is pretty much if everybody
00:57:12.020 | on the cap table called us up, the founder, the CEO, the board members, the prior investors, and said,
00:57:16.580 | "We really want you to do this. And we're all collectively holding hands and want you
00:57:20.240 | to do this."
00:57:20.840 | Then we'd seriously consider it.
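The arithmetic behind that brain damage, sketched with the transcript's own numbers: a prior investor who bought in at a $300M post watches their paper value fall by 60% in a round priced around $120M. The function and the plain pro-rata dilution (ignoring any anti-dilution protections) are illustrative assumptions.

```python
def down_round_impact(prior_post, prior_stake, new_pre, new_money):
    """Show a prior investor's paper loss in a down round (simplified).

    Assumes plain pro-rata dilution with no anti-dilution adjustment.
    prior_stake is the fraction owned after the prior round.
    """
    old_value = prior_stake * prior_post          # value at the old mark
    new_post = new_pre + new_money
    diluted_stake = prior_stake * new_pre / new_post
    new_value = diluted_stake * new_post          # equals prior_stake * new_pre
    return old_value, new_value

# Hypothetical: 10% bought at a $300M post; a $20M round at a $120M pre
# takes that stake's paper value from $30M to $12M.
print(down_round_impact(300e6, 0.10, 120e6, 20e6))  # (30000000.0, 12000000.0)
```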
00:57:22.480 | At the end of Q1, do you guys sit around and reset valuations and marks before
00:57:27.380 | you tell your LPs what these companies are worth? Meaning, do you generate
00:57:31.800 | your own sense of valuations?
00:57:33.600 | Yeah. We do mark down.
00:57:35.280 | Proactively mark down?
00:57:36.080 | We do proactively mark down.
00:57:38.500 | What's your methodology for that?
00:57:41.460 | Peter's views?
00:57:42.460 | I mean, I think--
00:57:43.460 | Peter's sense.
00:57:44.460 | We'd be open to doing that if we felt like we had an objective methodology for
00:57:50.700 | doing it.
00:57:51.700 | It's very tricky. Later-stage ones are a little bit easier, because you
00:57:55.200 | can apply multiples. There are public comps, and you just adjust to that. The earlier-
00:58:00.420 | stage stuff is very difficult to do objectively. And you're probably not
00:58:04.640 | as sensitive to it in terms of how it moves the needle. But the growth stuff, we try to
00:58:08.760 | use public comps and be realistic.
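A hedged sketch of the comps-based approach to growth-stage marks he alludes to: apply a current public-market multiple to the company's revenue, with a haircut for illiquidity. The function name, multiple, discount, and ARR figure are hypothetical, not Founders Fund's actual methodology.

```python
from statistics import median

def comps_mark(arr, public_ev_to_arr_multiples, illiquidity_discount=0.25):
    """Estimate a growth-stage mark from public comparables.

    Applies the median EV/ARR multiple observed among public comps to the
    company's ARR, then haircuts the result for illiquidity. Illustrative only.
    """
    return arr * median(public_ev_to_arr_multiples) * (1 - illiquidity_discount)

# Hypothetical: $40M ARR, with public comps compressed to 6-10x EV/ARR.
# Median 8x implies $320M gross, $240M after a 25% haircut -- a markdown
# if the position had been held at, say, $500M.
print(f"${comps_mark(40e6, [6, 8, 10]) / 1e6:.0f}M")  # $240M
```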
00:58:11.940 | What do you think about, and we'll just throw out some firms, if you had to guess, the next
00:58:18.420 | 18 months for some of these folks? SoftBank?
00:58:20.560 | I mean, my views on SoftBank have been obvious since I did a New York Times, of all things,
00:58:26.060 | interview in 2016. You should reread the transcript. But I was like, that strategy just does not
00:58:31.020 | work. Powering money into companies and hoping that money is the key asset and the key ingredient
00:58:35.820 | for success has been false in the history of technology for 50 years. And so they lost
00:58:41.220 | $27 billion again. The brand's subprime. They used to do well in Latin America, but they
00:58:45.600 | got rid of the person who actually knew what he was doing. So it's just a catastrophic
00:58:49.420 | mess.
00:58:50.420 | And I think that's the real problem.