E13: SPACsgiving Special! Vaccine news, innovation vs regulation, fixing higher ed, challenge trials
Chapters
0:00 Besties congratulate Friedberg & Chamath for taking Metromile public, Chamath explains a PIPE, Sacks & Jason express their discontent for being left out of the first bestie SPAC
10:59 More positive vaccine news, NYT article on Operation Warp Speed: did the Trump administration nail it?
21:50 How will the COVID experience impact the response to the next pandemic? Morality of challenge trials, hypocrisy of regulatory capture around gambling, drug use, pharma, etc.
35:25 Why innovation has occurred so rapidly on the Internet: Permissionless innovation & lack of regulators, regulation vs. innovation
47:07 Thoughts on ISAs & how they could disrupt overpriced higher education, Dave Chappelle's contract with Comedy Central
59:58 Trump accepts defeat (sort of), Biden's cabinet selections so far
1:06:29 What the besties are thankful for
1:14:50 Peace in the Middle East being achieved by declining reliance on oil, based on resume alone - would Trump have won if not for his antics?
1:21:01 Code 13!
Besties are back, and it's a Bestie SPACsgiving. 00:00:10.800 |
His second company, David Freeberg, announces today, 00:00:15.380 |
hours before the taping of this special Thanksgiving pod, 00:00:18.900 |
that he is taking Metromile public through a SPAC, 00:00:47.600 |
Well, this isn't a self-promoting podcast, is it? 00:00:52.320 |
No, but I think it's just, you know, it's your second company. 00:00:58.280 |
I was the CEO of Climate. 00:01:00.240 |
And, you know, Climate was offering insurance at the time. 00:01:05.080 |
We learned a lot about the insurance markets and figured like, 00:01:07.600 |
hey, you know, telematics or connecting cars to the internet is going to be a big deal. 00:01:11.520 |
And we're going to be able to completely change the auto insurance industry. 00:01:16.220 |
I was the chairman from the founding in 2011. 00:01:20.500 |
And, you know, been chairman and I've been an active investor in the business and every round since then. 00:01:27.880 |
The business has built some, you know, really compelling value proposition for customers. 00:01:33.960 |
And, you know, it's got really good unit economics and it's, you know, needed its last round of capital to get profitable. 00:01:40.440 |
And it turns out, you know, as we were thinking about that this summer, that a SPAC was a really good path for the business, given the inflection point it's at. 00:01:47.840 |
And the basic premise of the business is instead of paying for insurance by month or time period, the innovation here is you pay per mile. 00:02:00.020 |
Insurance today is like, you know, you fill out a form and you get a price for insurance. 00:02:03.940 |
You pay that rate for six months of coverage. 00:02:05.900 |
But, you know, depending on when you're driving and how much you're driving, you should be paying a different price. 00:02:11.100 |
So we, we kind of changed the model to a rate per mile. 00:02:14.760 |
And so if you don't drive, you save. You know, the average customer doesn't drive a lot with Metromile. 00:02:19.640 |
They save 47% over what they were paying with, like, Geico or Progressive. 00:02:27.240 |
And increasingly, we're actually doing it directly by connecting to cars direct through Ford and a couple other big automotive OEMs now have this ability to send the data directly out of the car because they're all internet connected now. 00:02:39.060 |
So, so that allows us to just basically, you know, see how many miles you're driving and the rate per mile is what we bill you each month times the number of miles you drove on your, on your car. 00:02:50.200 |
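The billing mechanics Friedberg describes (a per-mile rate times miles driven, billed monthly) can be sketched in a few lines. All numbers here are hypothetical, for illustration only, not Metromile's actual rates:

```python
def monthly_bill(base_rate: float, per_mile_rate: float, miles_driven: float) -> float:
    """Pay-per-mile insurance: a small fixed base plus a per-mile rate times miles driven."""
    return base_rate + per_mile_rate * miles_driven

# Hypothetical low-mileage driver: 250 miles in a month.
bill = monthly_bill(base_rate=29.00, per_mile_rate=0.06, miles_driven=250)
flat_premium = 120.00  # hypothetical traditional flat monthly premium for comparison
print(f"per-mile bill: ${bill:.2f}")                               # per-mile bill: $44.00
print(f"savings vs. flat premium: {1 - bill / flat_premium:.0%}")  # savings vs. flat premium: 63%
```

The point of the model is visible in the numbers: the fewer miles driven, the lower the bill, which is exactly the selection effect the conversation goes on to describe, where low-mileage drivers switch and the incumbents' pool gets riskier.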
I think that was a controversial concept for a while. 00:03:00.120 |
But frankly, 70% of the price difference you get in auto insurance is from the number of miles you drive. 00:03:06.200 |
And only 30% is really in this variance around behavior. 00:03:08.720 |
You know, most people are generally pretty good drivers. 00:03:12.240 |
So the, the real variance in terms of, you know, your risk to the insurance company is how many miles you drive. 00:03:20.560 |
Now what's interesting is, like, in a world of autonomous cars, where you're, like, turning on the car 00:03:25.160 |
to be autonomous or fully self-driving at some point, you know, it's on and off. 00:03:28.400 |
You should be getting a different rate for those miles. 00:03:31.280 |
So if the car is, if your Tesla is on autopilot on the freeway, that should be safer than you on the freeway. 00:03:35.660 |
You shouldn't be paying as much for insurance. 00:03:38.060 |
You can kind of think about how this moves into a world where everything is dynamically priced and dynamically billed. 00:03:46.920 |
So, you know, if you're a good driver and you're driving well, or if you're using autonomous features, you shouldn't pay as much. 00:03:51.980 |
And ultimately that translates, uh, you know, to. 00:03:54.960 |
You know, into a truly kind of more dynamic service. 00:03:57.600 |
And that's really where the world has to go because those low mileage drivers or those good drivers or those drivers using autonomous features should be paying considerably less. 00:04:05.580 |
So they'll start using our service and that'll force the other guys to raise their rates. 00:04:10.280 |
And it creates this huge, you know, kind of market moat. 00:04:14.300 |
And we're in the very early days. 00:04:17.400 |
So we're, you know, just getting going. 00:04:19.580 |
Chamath, you chose to do a PIPE, um, with the SPAC. 00:04:24.660 |
Explain what a PIPE is for people who don't know, and what you loved about Metromile. 00:04:31.380 |
A PIPE is a private investment in public equity. 00:04:35.460 |
And basically what that means, um, is that, uh, you're making the round bigger, right? 00:04:41.880 |
So it's kind of like, Sequoia does your Series A and invests 10 million. 00:04:45.780 |
I would come in and put another 10 million, and now your Series A is 20 million. 00:04:49.420 |
Um, same terms as, um, Sequoia's round. 00:04:59.760 |
Well, what it does is it allows somebody to price the deal. 00:05:03.080 |
So in this case, David's SPAC sponsor priced the deal, did the diligence and decided to 00:05:09.580 |
And then they came to me and said, Hey, um, you know, do you want to come and join this round? 00:05:18.160 |
Um, a lot of things that David said basically are true. 00:05:21.480 |
The, the thing that I will say is like, you know, everybody talks about, um, the value 00:05:29.220 |
Um, the most obvious thing that you can do is if you learn on top of a huge subset of 00:05:35.700 |
data, especially out in the real world, like driving data or any other kind of information 00:05:42.780 |
And this is why sort of these next generation insurance companies to me are so interesting, 00:05:47.980 |
because it's probably where you're going to see, um, machine learning, um, be used at 00:05:52.900 |
just massive, massive scale, because you're just going to reprice risk and make it as 00:05:59.880 |
I, um, I really liked the, the product, the metrics are, uh, really amazing. 00:06:12.060 |
Chamath and I have never worked on anything together, so it's awesome. 00:06:14.220 |
We're doing our first project here on the all in podcast. 00:06:19.360 |
So it sounds like a brilliant idea and I'm just pissed. 00:06:42.280 |
Narcissistic besties have been, have been running around touting every single unicorn. 00:06:49.780 |
I miss my allocation in bird or Uber or whatever. 00:06:55.660 |
Everybody gets a slice, even a little tasty poo. 00:07:02.020 |
Just saying a little kick back, a little share. 00:07:04.060 |
Listen, Jason, why don't you start an AngelList syndicate for every All-In Podcast 00:07:18.520 |
And then we'll aggregate all these, uh, subscribers and I'll take a 20% carry. 00:07:24.820 |
Can I just say, no, we'll do, we'll do zero carry and we'll allow all the listeners to 00:07:28.300 |
participate in our deals, which I think would be pretty cool. 00:07:30.640 |
Um, listen, I've, uh, I love how every time you loop my business into this, 00:07:34.600 |
whether it's podcasting or syndicates, you take out the money and the profit. 00:07:52.120 |
Um, by the way, uh, on this topic, uh, Saxipoo had this incredible tweet this 00:07:57.520 |
week, which was basically like, wow, this is like my 95th, uh, unicorn that went, 00:08:04.820 |
I mean, literally every single unicorn that filed this week to go public. 00:08:08.020 |
Sax was an angel investor, which is incredible. 00:08:10.180 |
Sacks is a goodie as an investor, but the best follow-up tweet was Zach Weinberg of Flatiron 00:08:17.080 |
Health, whose tweet was: I would like to congratulate myself for making no bets and taking no risk and 00:08:30.580 |
No, I thought, I thought Zach's tweet was the best. 00:08:34.480 |
Like all these investments that you've made, like don't these guys call you and 00:08:39.820 |
I mean, you know, this has gotta be like a huge amount of effort, right? 00:08:49.840 |
Well, you know, if you're not on the board, your obligation is basically to respond. 00:08:55.720 |
When somebody asks you for something, I know as an angel investor, it is a 00:08:59.080 |
little different when you're actually like leading around and you're a board member. 00:09:02.860 |
Um, I was making, I was making these investments back in 2012, 2013. 00:09:11.380 |
These were 50 K, a hundred K, 250 K checks, right? 00:09:14.980 |
So the responsibility is proportional to the check size. 00:09:17.380 |
Some were a little bigger than that, actually. 00:09:19.960 |
But, um, I, um, I mean, I've written, I've written checks, you know, seven figure checks 00:09:26.860 |
Well, Sacks, according to your personal balance sheet that we got from your accountants this 00:09:37.500 |
You did the series B of house, the entire series B dollar. 00:09:51.640 |
I did co-lead, um, the Series B or C of Addepar. 00:09:56.920 |
Um, that was one that I did as an individual. 00:10:02.560 |
I have no idea what that company is, but that sounds awesome. 00:10:07.780 |
It's coming up on, you know, a hundred million. 00:10:18.620 |
Well, I mean, I think now that we've all self promoted. 00:10:22.840 |
Well, I'm just happy to hear that that Jason was cut out of your deal as much as me. 00:10:26.720 |
When I read about it on Twitter, I'm like, I better not be the only bestie cut out. 00:10:40.420 |
It is getting awkward now because people who watch this podcast, 00:10:48.820 |
Like literally they're like, well, you guys go grocery shopping together. 00:10:59.340 |
Um, I think the first thing we should talk about is just this amazing 00:11:06.000 |
moment in time when, because of science, you know, and this podcast started 00:11:11.280 |
during the pandemic, we have had, as predicted by Freeberg, who gets to take 00:11:16.600 |
He said by the end of the year, we'd have vaccines and they would have, because of this 00:11:21.120 |
mRNA, if I remember correctly, 90, 95% efficacy. 00:11:26.120 |
And sure enough, the week after Trump, uh, wins, I'm sorry, loses, sorry, sacks. 00:11:32.420 |
Um, the week after Trump lost, I know you didn't vote for him. 00:11:39.060 |
Pfizer, then the next week, Moderna, and then the next week, 00:11:45.520 |
Oxford, and I understand Johnson and Johnson is about to announce something. 00:11:49.540 |
And all of these have 90 to 95% efficacy and that there are going to be 40, 50, 00:11:55.080 |
and 60 million doses in December, January, and February, just from the first two. 00:12:00.420 |
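As an aside on what those 90 to 95% efficacy numbers mean: efficacy is computed from how cases split between the trial's vaccine and placebo arms. A minimal sketch of the arithmetic, with made-up case counts (not the actual Pfizer or Moderna trial data):

```python
def efficacy(cases_vaccine: int, cases_placebo: int, n_vaccine: int, n_placebo: int) -> float:
    """Vaccine efficacy = 1 - (attack rate in vaccine arm) / (attack rate in placebo arm)."""
    return 1 - (cases_vaccine / n_vaccine) / (cases_placebo / n_placebo)

# Illustrative: equal-size arms, 8 cases among the vaccinated vs. 160 among placebo.
print(f"{efficacy(8, 160, 20_000, 20_000):.0%}")  # 95%
```

So "95% efficacy" means the vaccinated arm saw roughly one-twentieth the attack rate of the placebo arm.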
So Friedberg, if you were to put a number on when herd immunity hits, because 00:12:05.620 |
probably 20 or 30% of people have had it, 20, 30% of people have, um, some natural 00:12:15.460 |
immunity. We ought to be doing this from, you know, a Warriors game next year. 00:12:18.700 |
When, when are we going to be able to do this in person? 00:12:22.900 |
I don't know if it was Fauci or someone closer to the operation who shared that 00:12:26.900 |
they do think they can get 70% of Americans immunized by May. 00:12:32.340 |
So, you know, if you'll remember a few podcasts ago, I think I tried to explain 00:12:37.020 |
a big part of the budget that went into this operation warp speed was to parallelize 00:12:41.580 |
production of these vaccines while they were being tested. 00:12:44.980 |
We've been scaling up the production and the manufacturing of these, and if they 00:12:47.740 |
weren't going to work, we were just going to trash them, right? 00:12:53.160 |
So we've got a ton of doses that have been produced. 00:12:58.360 |
And that's, you know, supposed to be kind of underway, with the plan being that on 00:13:02.280 |
December 11th or 12th, when they, uh, give the emergency use authorization, these doses 00:13:11.040 |
Like we just shoved the chips in and said, we're going to make these vaccines, even if 00:13:15.420 |
It's more like a spray and pray, um, angel investment portfolio. 00:13:20.000 |
You know, we bought like four different things or five different things. 00:13:23.780 |
We made bets in all of them and hope that one of them pays off. 00:13:26.440 |
And it turns out they're all going to pay off. 00:13:27.960 |
So, you know, all or a chunk of them are going to pay off and we get to have them ready. 00:13:32.460 |
You know, in time to kind of make a difference here. 00:13:34.860 |
Now, all that being said, if you look at the case numbers in the U S right now, we could 00:13:40.140 |
be as high as 30% of the American people have already been infected. 00:13:44.740 |
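That 30% figure comes from scaling confirmed cases up by an assumed undercount multiple, since many infections were never captured by testing. A back-of-the-envelope version with illustrative inputs (the case count and multiple are assumptions for illustration, not figures from the episode):

```python
def estimated_infected_share(confirmed_cases: float, undercount_multiple: float, population: float) -> float:
    """Scale confirmed cases by an assumed true-infections-per-confirmed-case multiple."""
    return confirmed_cases * undercount_multiple / population

# Illustrative: ~12.5M confirmed US cases (late Nov 2020), assumed 8x undercount, ~330M population.
share = estimated_infected_share(confirmed_cases=12.5e6, undercount_multiple=8, population=330e6)
print(f"{share:.0%}")  # 30%
```

The conclusion is only as good as the assumed multiple, which is why it's phrased as "could be as high as."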
So, um, and then if you look at the number of people that have been infected since then, 00:13:47.620 |
and you apply the similar sort of multiple that you would assume based on tests, you 00:13:52.180 |
know, there's an estimate that we could already be at up to 30% of the U.S. population. 00:13:57.640 |
I don't know about you guys, but I got some conference invites this last week for conferences for next 00:14:41.040 |
year that have been canceled this year and were being put on hold. Yeah. So people are starting 00:14:45.860 |
to lean out. For what time frame? Third quarter or fourth quarter? Summer, July. Yeah. So people 00:14:51.460 |
are now assuming that and they're booking hotel spaces based on it. That is extraordinary. Sax, 00:14:57.680 |
you shared this New York Times story. Why don't you summarize it for the audience? 00:15:01.540 |
And I'd love to get your thoughts in addition to this as to what this recovery might look like 00:15:08.640 |
if in fact we have more vaccine than we need and even a reasonable number of Americans take it and 00:15:15.840 |
don't believe that it's a conspiracy theory by Bill Gates to control and the Illuminati 00:15:20.620 |
and all that stuff. Yeah. I mean, this New York Times story is pretty remarkable. It's called 00:15:25.960 |
Politics, Science, and the Remarkable Race for a Coronavirus Vaccine. This came out, 00:15:31.140 |
in the New York Times. Actually, sorry, it published November 21st, 00:15:36.420 |
updated November 24th. And it's pretty remarkable. It describes the effort by Operation Warp Speed, 00:15:43.700 |
by the administration, by Pfizer, by Moderna. It kind of gives you the behind the scenes, 00:15:48.040 |
play by play. And reading the article, you have to come away thinking that the Trump 00:15:54.920 |
administration did a pretty good job with this whole Warp Speed project. I don't know if the 00:16:00.800 |
New York Times realizes that it's making the Trump administration look so good, or maybe 00:16:05.580 |
they don't care anymore because he's lost the election, but the article does make 00:16:10.660 |
the administration look very good. I mean, competent. Very competent. First of all, 00:16:16.080 |
they shoveled money to the right people. They, you know, offered Pfizer money; Pfizer didn't 00:16:21.080 |
want it or need it, but Moderna did, so Moderna got a few billion dollars. 00:16:24.780 |
they parallel processed a bunch of different attempts here so that if one company failed 00:16:31.440 |
the others might succeed there was a sort of a Reaganite cutting of bureaucratic red tape 00:16:36.800 |
wherever they could there were examples in the story of the drug company needing something some 00:16:41.840 |
supplies or what have you and they would call the administration and they would you know make it 00:16:46.800 |
happen and then finally the administration didn't do anything to to kind of mess with the science 00:16:54.760 |
they describe how this the this new experimental mRNA technique that they used to generate the 00:17:02.720 |
vaccine they had the code for that within two days they actually had the vaccine sort of printed 00:17:08.820 |
if you will within weeks and really what took all this time were the human trials 00:17:14.200 |
what you know the three-stage human trials which the administration did not do anything to speed up 00:17:18.960 |
and probably the irony of ironies, the supreme irony, is there's a bit in the story. It actually begins with 00:17:26.740 |
this guy Slaoui, who's the head of the Trump administration's effort to produce the vaccine. 00:17:32.820 |
He actually slowed down the Moderna human trials by about three weeks because they weren't 00:17:38.660 |
including enough minorities in the, you know, in the trial. 00:17:43.480 |
And that cost him three weeks. If it weren't for those three weeks, 00:17:47.320 |
Moderna's vaccine would have happened before Pfizer's, and it would have come out 00:17:52.840 |
about a week before the election. So you gotta wonder, like, Trump has got 00:17:56.440 |
to be pulling out his hair about, um, about this twist of fate. David, 00:18:01.960 |
how does it attribute credit for Warp Speed? And I'm going someplace with this, so I'm 00:18:07.560 |
just asking you the question. Well, the article's not trying to 00:18:10.840 |
attribute credit. They're just kind of describing the behind the scenes of 00:18:14.840 |
how Pfizer and Moderna came up with their vaccines. And, um, 00:18:18.520 |
you know, Pfizer and Moderna did the work of creating the vaccine, but in describing 00:18:23.800 |
the ways that Warp Speed contributed, they did things that were only helpful 00:18:27.720 |
and nothing that was harmful. And so in that sense it made the Trump 00:18:31.400 |
administration look quite good. Yeah, I think the point is that 00:18:36.280 |
the Warp Speed folks, which is probably the least well-known working group 00:18:40.600 |
working on coronavirus to the rest of the public, because it was a lot of 00:18:44.040 |
wonky insiders, it was kind of almost proving the exact opposite of what 00:18:48.280 |
Trump typically does, which is, you know, some idiotic nepotistic leaning where 00:18:52.600 |
it's his daughter or it's his son-in-law running around, you know, 00:18:56.820 |
effectively doing something where they become sort of front and 00:19:02.680 |
center for taking credit. In this case it was just a bunch of policy wonks. You 00:19:06.840 |
never heard about the project. We only know about Warp Speed because a 00:19:09.620 |
handful of us have talked about it. And it turns out to actually have been good because they knew 00:19:14.600 |
what they were doing and they knew enough to not try to seek the credit and just got out of the 00:19:18.240 |
way. I mean, it proves almost the antithesis of how the Trump campaign, you know, managed their 00:19:22.860 |
time in the White House. Well, I mean, you have a point where I was kind of reading this article 00:19:27.820 |
wondering, you know, where was Jared Kushner? Because if Kushner knew about this, there's no 00:19:32.960 |
way he slows down the Moderna vaccine by three weeks. I mean, that might have made the difference 00:19:39.020 |
right there and whether Trump wins or not. Can you imagine the effect on the election if the 00:19:43.260 |
vaccine had come out one week before November 3rd? I think Trump would have won decidedly if 00:19:48.880 |
he had a vaccine or two out with 90%. But I think he had no credibility, even though Moderna and 00:19:56.160 |
Pfizer were saying late November, and he said the vaccines are around the corner. His whole tenure 00:20:01.940 |
was based on so much lying. He was the boy who cried wolf. He was the president who cried wolf. 00:20:08.420 |
He was the president who told the truth. And he was telling the truth about the vaccines. It was 00:20:14.660 |
like, oh, my Lord, he told the truth in the final three months. 00:20:18.620 |
Can I give you the opposite, David, of that, which is that if Moderna basically says, 00:20:23.420 |
hey, guys, we have a vaccine that works for white people, and a disease that's, you know, 00:20:28.160 |
disproportionately killing blacks and, you know, Hispanics and brown people, Native Americans, and 00:20:35.760 |
all of a sudden, people are like, wait, what the fuck? 00:20:37.820 |
Yeah, I mean, I think it's ironic that an administration that was constantly accused 00:20:58.820 |
of white supremacy, probably lost the election, because they slowed down this vaccine trial group 00:21:07.220 |
No, I know. But I think what we're saying is it wasn't the actual administration. 00:21:10.460 |
It was somebody that actually knows what they were doing. 00:21:14.120 |
We call that redemption in the movie when the person actually loses because they did 00:21:18.980 |
the right thing. It's kind of redemption. Freeberg, when you look at these mRNA vaccines, you were 00:21:24.080 |
educating us about them last night, and also for a long time. Tell the audience just one more time how they work 00:21:36.620 |
and what the potential for them is in the future because I think a lot of us now are starting to 00:21:42.060 |
see the light at the end of the tunnel this is gonna be over we're gonna be at conferences or 00:21:46.740 |
going traveling to Europe or whatever it is next summer but we are all gonna be scarred for life 00:21:52.320 |
thinking you know when is the next coronavirus just like for a decade we we were on pins and 00:21:57.460 |
needles when is the next 9/11 so this is gonna be scar tissue for a generation or two of people 00:22:03.800 |
when the next COVID comes, how quickly will Warp Speed 2.0 go? Yeah, it's the right 00:22:11.400 |
question because our approach to doing vaccines may have just changed permanently so you know 00:22:18.500 |
every cell has your DNA, and DNA basically codes proteins: every three letters of DNA 00:22:26.040 |
codes for a specific amino acid, and there's 20 amino acids. The way that DNA turns into proteins 00:22:32.840 |
is through RNA. 00:22:33.780 |
RNA is kind of like a mirror copy of your DNA; it floats into these things called 00:22:38.460 |
ribosomes in your cells, and out come proteins. It's like a printer, right? And so those 00:23:43.260 |
ribosomes make your proteins using those amino acid sequences coded by the RNA so the way that 00:22:48.960 |
the vaccines work historically is you'll get a dead virus which is basically the protein of a 00:22:55.260 |
virus. Your immune system then learns to kill that, or to remove that protein (oh man, Jesus, my dog) 00:23:03.760 |
from your body, and that's how you develop this 00:23:08.860 |
memory, your immune memory, to a specific protein. And so they put this dead virus 00:23:14.560 |
in your body and hopefully your immune system learns a good response to it mRNA basically puts 00:23:19.480 |
the RNA in your body that codes for that protein your cells then make that protein and because it's 00:23:26.320 |
making a lot more of the protein in a more consistent you know way theoretically the idea is 00:23:31.300 |
your body develops 00:23:33.740 |
a much more robust immune memory and immune response without it overloading your system, where 00:23:36.800 |
all the immune cells try and wipe that protein out right away and so on the challenge is you're 00:23:41.420 |
putting RNA in your body and we've always been worried about we don't know what the side effects 00:23:44.960 |
of mixing RNA in your body would be is it going to change your DNA is it going to change your genetic 00:23:48.680 |
makeup is it going to cause other deleterious side effects so this technology this capability 00:23:53.120 |
this knowledge has been around forever, I mean, not forever, but for, you know, a long time, and the idea 00:23:57.140 |
has always been we could use RNA in this way, but, you know, no one wanted to. And we've used it in animals 00:24:01.760 |
and we've used it in plants and we've seen the capability 00:24:03.720 |
of using RNA to do different things like this but this is a big leap and so you know we kind of leapt 00:24:10.800 |
forward here getting to the point that we felt comfortable with you know RNA as a as a treatment 00:24:15.240 |
like this, and it's working. So in theory, in the future, as Sacks points out, you could take any 00:24:19.740 |
virus or any bacteria you could read its DNA you could do that in an hour then you could take chunks 00:24:24.480 |
of that DNA and code RNA for it so your body makes those proteins that theoretically produce an immune 00:24:28.800 |
response so that that's the that's the science of like how do you how do you create a new vaccine I 00:24:33.700 |
think that's the amazing part is the way it's described you know in this article is that's 00:24:38.860 |
almost like laser printing or 3D printing a vaccine it's kind of like the equivalent of 00:24:43.780 |
that you just take the genome of the virus and you know boom you've got the vaccine and then 00:24:49.900 |
all the other delay is about human trials and testing of it. But imagine if the 00:24:56.320 |
next coronavirus is 10 times as deadly, you know, something that spreads as contagiously as 00:25:03.520 |
smallpox and is, you know, as deadly as Ebola or something like that. We could have a vaccine the 00:25:09.160 |
next day. The very challenge, David, would be, is we'd have it 00:25:14.560 |
the next day like we did here, but we would be going through this three-phase trial. And so I 00:25:19.540 |
want to take a moment here and talk to Chamath about something which is challenge trials 00:25:23.920 |
um in the UK they will start doing challenge trials for those people who don't know what 00:25:28.000 |
a challenge trial is, essentially they expose you to something dangerous, i.e., a virus like 00:25:33.500 |
COVID. They give you the, uh, vaccine and then they give you, uh, the virus, as opposed to 00:25:41.120 |
how we do a three-phase trial, which is you give the vaccine to 30,000 people and a placebo to 00:25:47.720 |
30,000, and then you come back three months later and see how many people got infected, and it takes time 00:25:51.500 |
and money whereas challenge trials only take risk on the individuals who are part of it Chamath 00:25:58.160 |
hundreds of people in the science community signed a letter, and in the UK they're going to be doing these 00:26:03.480 |
challenge trials in January. The United States is not doing these; I'm certain China is. 00:26:07.740 |
um do you think it's a moment in time where we need to think about the ethics and morality of 00:26:15.120 |
challenge trials specifically and then if so how do you execute them with that how do you 00:26:20.880 |
execute a challenge trial without it being unfair um or too dangerous for people obviously you're 00:26:26.220 |
not going to just go into a prison and say, hey, anybody want to get 10 years off their sentence 00:26:29.520 |
for doing the challenge trial? That seems morally bankrupt. But we let people 00:26:33.460 |
climb mountains without ropes, so... Right. Well, I think this speaks to a whole bunch of other issues that 00:26:39.760 |
we've talked about on the pod before. Another example of this was Section 230: 00:26:44.380 |
when we talked about it, we had a body of law that was created in a moment in time that essentially 00:26:51.100 |
was about framing and understanding a specific pathway and a way to use technologies that today 00:26:56.560 |
look archaic, and we have to rewrite the laws just to account for where 00:27:03.440 |
we are. So if you double-click on trials as an example: if you have a solution for a rare 00:27:08.780 |
disease, you can go down a specific pathway with the FDA and get breakthrough and fast-track approval. 00:27:13.580 |
But if you, for example, have a novel immunotherapy cancer drug, you cannot; 00:27:19.700 |
you have to do a multi-phase trial, a typical three-phase trial, where you have to solve for 00:27:24.140 |
very typical things like fatigue, et cetera. All these things slow progress down. Forty or 50 years ago, when we were 00:27:33.420 |
flying blind, when we didn't have things like CRISPR, didn't have a 00:27:39.780 |
real understanding of the genome, didn't have delivery mechanisms like CAR-T, you would say, okay, 00:27:44.820 |
yeah, we should be really, really careful. But I would say that the more you know, 00:27:51.120 |
the more you can ease up on the rules, because you can actually empower people with a lot of 00:27:57.000 |
information, and it shouldn't take a disaster scenario for us to be iterative and experimental. 00:28:03.400 |
So I think challenge trials are really important; I think the concept of them makes a lot of sense. I 00:28:09.340 |
think governments should employ incentives to figure out who is eligible and why, but if you're 00:28:16.420 |
a healthy adult, male or female, and you want to participate in a trial for whatever set of reasons, 00:28:22.060 |
you should be allowed to do so, and companies that want to run those trials should be allowed 00:28:27.700 |
to run them. Similarly, if you want to find a complementary pathway through the regulatory 00:28:33.380 |
agencies to get drugs to the starting line, you should be able to do that too. And I think what 00:28:38.240 |
we have to do is multi-path these compounds going forward, because I think that's how you 00:28:44.040 |
accelerate all these technologies' ability to actually solve these diseases. 00:28:47.360 |
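The three-phase trial described earlier (30,000 people on vaccine, 30,000 on placebo, then counting infections) boils down to a simple efficacy calculation. A minimal sketch; the case counts below are hypothetical illustrations, not figures from any actual trial:

```python
# Vaccine efficacy from a placebo-controlled trial:
# efficacy = 1 - (attack rate in vaccine arm) / (attack rate in placebo arm)
def vaccine_efficacy(vax_cases: int, vax_n: int,
                     placebo_cases: int, placebo_n: int) -> float:
    attack_vax = vax_cases / vax_n              # infection rate among vaccinated
    attack_placebo = placebo_cases / placebo_n  # infection rate among placebo
    return 1 - attack_vax / attack_placebo

# Hypothetical arms: 30,000 people each, 10 cases vaccinated vs 100 on placebo.
print(f"{vaccine_efficacy(10, 30_000, 100, 30_000):.0%}")  # prints "90%"
```

The slow part is not the arithmetic: it is waiting months for enough infections to accumulate in both arms, which is exactly the delay a challenge trial removes.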
Freeberg, why is it so controversial? Think of 00:28:53.720 |
people who want to climb mountains without ropes, or a rocket ship 00:28:59.000 |
company. There are experimental pilots; we have astronauts; they take unbelievable 00:29:03.360 |
risk. We send thousands of troops into harm's way for many different reasons, and many of them 00:29:11.340 |
sadly die. They volunteer for those activities; those people are volunteering and compensated. 00:29:17.100 |
But when we look at science and we look at a challenge trial, scientists say it's morally 00:29:23.760 |
reprehensible to compensate somebody for taking risk, when that's exactly what we do in the 00:29:29.100 |
army, with police officers, and with astronauts. Help us understand 00:29:33.340 |
how scientists think so differently than, say, war. I don't know if it's scientists as much as it is 00:29:39.760 |
the regulatory framework. Some people would call it a nanny state, and there are 00:29:47.680 |
things where the nanny state assumes individuals don't have the capacity to understand the 00:29:53.920 |
extent of the potential loss or the nature of the risk. This is true for angel investing, right? 00:29:59.320 |
You have to be a qualified investor to invest in a private company without appropriate disclosures, 00:30:03.320 |
and it's true in a lot of other contexts. So, on giving people the authority to make 00:30:09.980 |
decisions like this: I have no point of view that I'm kind of making here, but 00:30:15.440 |
it seems like the government assumes, or the elected officials assume, or 00:30:21.020 |
the populace assumes, that there are things people aren't really equipped to make decisions 00:30:24.740 |
on, because they can't understand the risk, because they're not qualified, and they get 00:30:29.540 |
excluded from those activities. But Sacks, how should we 00:30:33.300 |
frame this? Yeah, well, I think assumption of risk is 00:30:39.960 |
a really good principle, and it is a way for people to engage in potentially harmful behavior. 00:30:48.360 |
Just because a behavior is potentially harmful doesn't mean you don't get to do it. 00:30:51.660 |
In the United States you're allowed to do things that are manifestly harmful to yourself, 00:30:56.820 |
like smoking. Even worse, if you're in Oregon you can now do all kinds of hard 00:31:03.280 |
drugs; you can assume that risk for yourself. In Portland, Oregon, you can now take 00:31:09.160 |
heroin openly in the street with no consequence, but you can't participate in a trial that could 00:31:14.320 |
basically cure a cancer. That's insane to me. I'll tell you the flip 00:31:21.280 |
side of it: you're allowed in Nevada to play roulette. It's 00:31:27.460 |
got a negative expected value, statistically, factually, for individuals, and we assume, 00:31:32.200 |
generally speaking, that the person playing roulette recognizes that every dollar they're spending at 00:31:36.920 |
the roulette table is likely going to be taken away from them. Some 00:31:40.820 |
percentage of it is going to be taken away; there's roughly a five percent edge for the house on 00:31:44.480 |
that game. And so we make the argument in some cases that people don't have the capacity 00:31:50.000 |
to understand risk, but in other cases it's okay for them to take risk they may not fully understand. 00:31:54.920 |
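The house-edge figure mentioned here can be checked directly. A minimal sketch; the exact 5.26% number assumes American double-zero roulette, which is my assumption rather than something stated in the episode:

```python
# Expected value of a $1 even-money bet (e.g. on red) in American roulette:
# 38 pockets total -- 18 red, 18 black, 2 green (0 and 00).
def even_money_ev(bet: float = 1.0) -> float:
    p_win = 18 / 38     # even-money bets pay 1:1
    p_lose = 20 / 38
    return p_win * bet - p_lose * bet

house_edge = -even_money_ev(1.0)  # fraction the house keeps on average
print(f"house edge: {house_edge:.2%}")  # prints "house edge: 5.26%"
```

European single-zero wheels halve the edge to about 2.7%, but either way the expected value is negative, which is the speakers' point.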
And I think there is this notion of what some people call regulatory capture that probably 00:31:59.000 |
encompasses both of these, which is that there is some degree of profiteering that has created 00:32:03.240 |
some set of laws that kind of manifestly capture that system in a certain way. So there are profitable 00:32:09.780 |
casino enterprises that say, let's get people to spend their money in a risky way that they don't 00:32:13.860 |
understand, and that becomes the law, and then people in Nevada are allowed to do that. There 00:32:18.060 |
are also pharmaceutical companies that will say, we need to have huge regulatory burdens, so once we 00:32:22.620 |
make that big investment and we get patent approval and we can lock in that drug, we can 00:32:26.280 |
charge a lot of money for it. So I would argue, to some extent, that the regulatory capture associated 00:32:31.440 |
with the profiteering that happens on the back end of the system drives both of these outcomes. 00:32:51.400 |
And the rationale is, well, you're the most protected, right? 00:32:54.460 |
Well, the drug infrastructure here has created, to your point, because of regulation, the entire 00:32:58.800 |
CRO industry, which is a multi-billion-dollar industry that 00:33:04.120 |
is essentially a retardant of R&D velocity, right? Its entire job is to slow 00:33:12.160 |
things down and create these double-blinded studies. And by the way, so many of these studies are not 00:33:18.040 |
even double-blinded; they're not even actually scientifically rigorous; they get basically 00:33:21.380 |
blown apart after the fact. So what are they really in the business of doing? They're 00:33:25.940 |
exploiting a business model that was created by laws, laws that were written by 00:33:29.960 |
regulators, regulators that were in the hands of lobbyists. None of those folks truly understood, 00:33:34.560 |
at the time but especially today, what's really possible. So it does not make sense 00:33:41.180 |
in the United States of America, writ large, simply put, that you can buy alcohol and drink 00:33:46.760 |
yourself into the ground, buy cigarettes and smoke yourself to death, jump out 00:33:51.360 |
of a plane, gamble away your money where you're negative EV, or openly do illicit 00:33:57.740 |
drugs, but you can't participate in a thoughtful trial backed by scientific research. 00:34:02.940 |
In fairness, Chamath, you can get away with smoking fentanyl in San Francisco and Oregon, 00:34:10.020 |
but it is illegal to use that plastic straw. So be careful, folks, the laws are very clear here. 00:34:15.860 |
That plastic straw is going to get you in a lot of trouble. I don't know what the fine is, but 00:34:23.840 |
it wouldn't be so crazy if it were true. To your point, you're more likely to get arrested 00:34:28.560 |
for having a plastic bag than for having fentanyl in San Francisco. 00:34:32.800 |
Absolutely. They'll arrest you for the bag that the fentanyl is in, but Chesa 00:34:37.380 |
Boudin will not arrest you for the fentanyl in the plastic bag. Sacks, we've totally lost the script, 00:34:43.160 |
but Chamath sort of bridged this with the 230 discussion of common carrier and 00:34:51.320 |
said, hey, we had great intent with this law, but nobody saw social networks becoming 00:34:57.960 |
this dominant, addictive, et cetera. So we need to be more nimble as a government in changing these 00:35:03.000 |
regulations, whether it's straws, fentanyl, challenge trials, or 230. You wrote a blog 00:35:07.960 |
post about it. After we had our 230 discussion, a lot of people started talking about it. 00:35:12.040 |
Have you come to some conclusion as to an exit ramp for 230, or a way to maintain it without 00:35:18.520 |
throwing the baby out with the bathwater? Yeah, yeah. 00:35:21.300 |
Well, okay, let me just make one concluding thought on this last topic. 00:35:24.980 |
Sure, of course. So the reason why innovation has happened so fast on the internet is because of, 00:35:30.500 |
you know, one word: permissionless, right? Permissionless innovation. Nobody who has an idea 00:35:30.500 |
for a startup needs to go get permission from someone in the government. 00:35:34.900 |
That's really what makes a difference. Mark Zuckerberg as a sophomore in college can just build his project. 00:35:39.380 |
Larry and Sergey as PhD students can just build their project, ship it, start getting users 00:35:51.280 |
on it. And they don't have to get the permission of a regulator whose incentive, by the way, is just 00:35:57.520 |
typically not to get fired by approving something that might rebound on them in some, you know, 00:36:02.880 |
in some bad way. And to keep their job. Yeah. Keep their job. I mean, imagine if, you know, 00:36:06.880 |
imagine, to take a random example, when Elon launched Starman. Remember when he 00:36:12.640 |
launched the Tesla into space, and there was like an astronaut in there, and it was this 00:36:18.080 |
kind of really cool moment? I assume he just 00:36:21.260 |
did it, that he didn't get permission from anybody to do it. But could a moment like that 00:36:27.180 |
have really happened if he did have to get permission? No way. It'd be like making its way 00:36:31.100 |
up through the chain. No one would know what to think of it. No one would know whether they could 00:36:35.820 |
be the one to approve it. And then, what if something goes wrong? What if the Tesla comes 00:36:40.380 |
back down to earth and turns into a meteor or whatever? Those are the scenarios that would 00:36:44.700 |
be running through their heads. Nobody would have allowed it. Right. And so when you just let 00:36:48.700 |
entrepreneurs do things, good things happen. 00:36:51.260 |
Right. And that's why we've had so much progress on the internet. And in so many of these other 00:36:56.340 |
areas, we've had much less progress because, you know, what a system like that selects for is your 00:37:01.460 |
ability to go lobby regulators as opposed to just building your project and shipping it. 00:37:07.160 |
I think that one of the things that maybe happens is that we were all expecting, 00:37:12.840 |
or maybe some of us were, definitely me, some version of a New Deal, some grand bargain. 00:37:17.740 |
And I wonder whether the New Deal, and our sort of 00:37:21.220 |
our version of FDR over the next, I don't know, 10 years is the person that actually says, 00:37:28.080 |
guys, we're going to have a wholesale rewrite of the regulatory infrastructure to account for 00:37:32.700 |
technology. Just period. We're going to start someplace reasonable and small, 00:37:36.560 |
and we're going to make common sense reforms, just observing the times as they exist today. 00:37:42.120 |
Right. And we're going to go and systematically try to make these industries 00:37:46.340 |
a little bit more resilient, a little bit more entrepreneurial, you know, a little bit 00:37:51.200 |
less corrupted by regulatory capture and lobbyists and laws that just don't make sense 100 years later. 00:37:58.140 |
I mean, it'd be great if we could do that. The reason we can't is because how do you reform 00:38:03.100 |
the law without the lobbyists getting their fingers in it? Right. And the problem is there's 00:38:08.440 |
no lobbyist for the company that doesn't exist yet, right? For the founder, for the entrepreneur 00:38:14.500 |
who's got an idea in their head, but they haven't built their company yet. There's nobody representing them. 00:38:21.180 |
Right. And what happens is you get new regulations in Washington. Some agency gets created. 00:38:26.800 |
They reach out and touch an industry. Now, every player that's affected has to create their own 00:38:32.000 |
lobbying organization. Not true. And David, what do you think about this? What if the advocate for 00:38:39.400 |
the entrepreneur, or the uncreated company and uncreated product, is really the same as just 00:38:44.380 |
individual civil liberties and rights? How are they really that different? Because really what 00:38:50.860 |
it's doing is 00:38:51.520 |
saying the entrenched organizational infrastructure that runs my life gets deconstructed, and then 00:38:58.580 |
power gets pushed down to the individual. That's tantamount to the same thing, I think. 00:39:02.320 |
Well, we have seen some pushback. There is some silver lining here. Look at 00:39:10.100 |
the citizens of California right now; obviously people have been fleeing, and we've been talking 00:39:14.620 |
about California and the one-party system here causing so many problems. But we did have Prop 22 00:39:21.140 |
pass, and 900,000 people have now signed a petition to recall Governor Newsom. And this seems to be 00:39:30.120 |
some pushback against this sort of nanny state where people can't make decisions. 00:39:35.340 |
And then additionally, today, the SEC announced that they will allow gig economy companies to give 00:39:45.840 |
stock to workers as part of their compensation. Up to $1 billion. 00:39:51.120 |
Up to 15% of their compensation, in fact. Hester, how do you pronounce her last name? It's not Pierce. 00:40:02.140 |
Peirce. Peirce. It's spelled Peirce. But Hester Peirce is worth following on Twitter, 00:40:07.640 |
H-E-S-T-E-R P-E-I-R-C-E. Hester Peirce, I had her on my podcast, This Week in Startups, and 00:40:13.340 |
they're really getting aggressive in changing the accreditation law. So anybody's going to be able 00:40:19.120 |
to be an accredited investor; you just become sophisticated 00:40:21.100 |
through a testing mechanism. And now they want to let gig workers get equity compensation. 00:40:26.260 |
But by the way, sorry, here's a perfect example of a regulatory body and infrastructure that 00:40:32.120 |
actually has changed with the times: the SEC. In fact, I think maybe we talked about this 00:40:38.320 |
a little bit last time. I actually think one of the best Trump appointees has been Jay Clayton, 00:40:43.320 |
and I actually think Jay Clayton has done an incredibly good job. And a lot of what he's done 00:40:51.080 |
I think he's done a really good job. 00:41:21.060 |
educated and they're going to have to find a way of putting their money to work in assets that have 00:41:26.500 |
a better return, which literally was illegal up until very recently. And so you were systematically 00:41:33.380 |
letting the rich stay rich. And you were blocking the poor from 00:41:37.220 |
having an opportunity to advance. What if Apple Store employees could get their paycheck and say, 00:41:42.500 |
"I want to take my paycheck, 70% cash, 30% restricted stock units at this discount?" 00:41:50.260 |
We could let Apple Store employees make a million dollars after five or 10 years of working there, 00:41:56.100 |
10, 20 years from now, and have generational wealth change. David Sacks, you had a comment? 00:41:59.940 |
Well, I was going to ask, Chamath, do you think this type of deregulation is going to happen 00:42:04.260 |
in the Biden administration? Because what you're discussing is a very classical 00:42:10.100 |
Republican or Reaganite idea, right? I agree. And I think what's 00:42:14.660 |
going to happen is I think you're going to continue to see deregulation in areas that 00:42:19.460 |
are benign and non-controversial. So I think the SEC, the FCC, I think all of these places you'll 00:42:29.300 |
see movement. I think you'll probably actually see a lot more choice and deregulation effectively in 00:42:37.540 |
Medicare and CMMI. I think those places will move first. And then I think it'll be up to some leader, 00:42:48.020 |
some politician, to decide whether or not they want 00:42:48.660 |
to then really rally the troops on some higher-order bits. 00:42:55.000 |
would be sort of around the broader health care infrastructure, 00:43:18.900 |
will continue to dominate a lot of the federal and state policy. 00:44:01.540 |
And they basically halted all gene editing programs 00:44:06.140 |
You know how many people died during that time? 00:44:18.940 |
you know, basically doesn't take into account 00:44:26.560 |
So, yeah, I think that one's probably pretty important. 00:44:31.180 |
If you want to hear some interesting thoughts, 00:44:44.820 |
And he highlights a couple of really good examples 00:44:54.620 |
or go work in a salon and make money cutting hair, 00:45:06.960 |
who's poor and needs to become a hairdresser. 00:45:30.420 |
helps propel disinformation in a social network. 00:46:15.260 |
How fucking useful is a college philosophy degree 00:46:18.060 |
to someone that's trying to get an entry-level job 00:46:21.680 |
Like, it's a bit absurd that the structure is set up 00:46:29.860 |
which is this alternative to regulatory capture, 00:46:34.120 |
but it's like, call it embedded system, you know, capture. 00:46:39.420 |
that you have to get through that costs a lot, 00:46:41.080 |
and the point of entry has gotten too high for most. 00:46:45.520 |
and it's going to cause more damage than good. 00:46:47.060 |
Well, and it's also been completely disconnected 00:46:49.480 |
from the reality of your employment potential. 00:46:55.660 |
that provides no skill that is valuable in the world. 00:47:01.580 |
for needing a bachelor's degree for everyone? 00:50:15.700 |
That is what the contract that Chappelle signed 00:50:24.900 |
If you wanted to create the Dave Chappelle show again, 00:50:31.960 |
In perpetuity in all mediums and platforms forever. 00:50:34.480 |
This is a perfect tie back to the earlier statement about, 00:50:37.640 |
you know, do you have the appropriate kind of understanding 00:50:42.680 |
and what kind of benefit that you're getting. 00:50:44.680 |
I think Kanye has been doing this whole thing about how 00:50:46.680 |
he doesn't own the masters to his music originally, right? 00:50:54.680 |
and all the future revenue rights to his music, 00:50:56.680 |
he was getting paid a ton of money in his mind at that time. 00:50:58.680 |
He said, oh my gosh, I'm getting whatever it was. 00:51:05.680 |
It is absolutely worth me giving over the masters of this music 00:51:08.680 |
and the future royalty rights to this music for 12 songs 00:51:12.560 |
For me to get $5 million today and I can live an amazing life. 00:51:19.560 |
Was he of the appropriate sound mind and understanding 00:51:22.560 |
to decide in that moment to take that $5 million 00:51:24.560 |
and give up all future royalties to his music? 00:51:26.560 |
Well, Chappelle's argument was the people who are in that ecosystem 00:51:32.560 |
are all friends, they're all working together, 00:51:36.560 |
and they collude to do these kinds of upside-limiting deals. 00:51:42.440 |
And I think what makes Silicon Valley so special, 00:51:49.440 |
and what we want the social contract of Silicon Valley to be, is 00:51:57.440 |
that you create multi-generational wealth 00:52:02.440 |
Look, if I'm an angel investor or want to be an angel investor 00:52:04.440 |
and some schmo tricks me into investing in his shitty company, 00:52:10.440 |
and I didn't know how to determine whether that company was shitty. 00:52:12.320 |
I don't know if it's shitty or not because I've never done investing before. 00:52:15.320 |
Is it for the government to regulate that circumstance and say, 00:52:18.320 |
"You need to be a quote-unquote qualified investor 00:52:20.320 |
to be able to discern a good company from a bad company, or bullshit from not"? 00:52:25.320 |
I mean, so we're making the argument, I think, 00:52:28.320 |
for the regulatory framework for preventing people 00:52:33.320 |
And I think like earlier, the case was made like people should be able to make 00:52:39.320 |
Why should the government kind of step in and make a decision for me? 00:52:42.200 |
Look, I'm not advocating for government intervention. 00:52:46.200 |
My only point is that there are morally reprehensible things 00:52:50.200 |
that are happening today using contracts and that, you know, 00:52:53.200 |
something like an income sharing agreement actually rights the ship in so many ways. 00:53:01.200 |
Where like, "Oh, you're sick? You have cancer? 00:53:04.200 |
Pay me a million dollars. I'll give you this drug." 00:53:06.200 |
So, Hollywood's a little bit of its own beast where there's all these gatekeepers 00:53:12.080 |
always trying to get control over, you know, the creative product. 00:53:15.080 |
And so, yeah, like, you know, signing contracts with those sharks 00:53:27.080 |
I think the income sharing agreements are much more in that vein. 00:53:30.080 |
And I think they're a great idea for like any trade 00:53:33.080 |
that demonstrably increases your earning power, right? 00:53:36.080 |
So, the beauty of Lambda School is they pay for you to get an education 00:53:43.960 |
It then increases your salary and they get a piece of that. 00:53:47.960 |
I think where this goes, if these ISAs are successful, 00:53:50.960 |
is that every trade that's valuable will get its own ISA-type school. 00:53:56.960 |
And then that gets peeled out of a university education. 00:54:02.960 |
It's basically, you know, all the stuff that doesn't add value. 00:54:06.960 |
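The ISA mechanics Sacks describes (the school fronts tuition, then takes a piece of the increased salary) can be sketched numerically. The income share, floor, cap, and salaries below are hypothetical, loosely modeled on published ISA terms, not figures from the episode:

```python
# Income sharing agreement sketch: the school fronts tuition, then takes a
# fixed share of the graduate's income above a floor, capped at a max payback.
def isa_payback(salary: float, share: float = 0.17, floor: float = 50_000,
                cap: float = 30_000, months: int = 24) -> float:
    """Total repaid over `months` months of employment at `salary`."""
    if salary < floor:
        return 0.0                        # below the income floor, nothing owed
    monthly = salary / 12 * share
    return min(monthly * months, cap)     # never more than the cap

print(round(isa_payback(80_000)))  # 17% of $80k for 24 months -> 27200
print(isa_payback(40_000))         # under the floor -> 0.0
```

The incentive alignment is visible in the floor and cap: the school earns nothing unless the graduate's salary actually rises, which is the opposite of the tuition-up-front model being criticized here.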
They're basically like these monasteries again, right? 00:54:16.840 |
Because I think like, you know, we have so celebrated this mythical bachelor's degree. 00:54:26.840 |
And that's, frankly, something that should be as respected or more. 00:54:30.840 |
But in the American culture, you don't do that right now. 00:54:34.840 |
If you go, for example, to different countries around the world, 00:54:37.840 |
Europe's a good example of this, actually, because it's the closest 00:54:41.720 |
analog in terms of quality of living to America. 00:54:43.720 |
There's a real celebration of people who choose vocational tradecraft. 00:54:48.720 |
Because there's an entire educational infrastructure that you can onboard yourself into. 00:55:00.720 |
And in America, you know, you covet this piece of paper. 00:55:03.720 |
The piece of paper is really just a scam that allows administrators to basically pay themselves millions of dollars 00:55:10.720 |
and run an asset management business with education as the Trojan horse. 00:55:22.720 |
The most amazing thing about these companies: I've been digging into them because I'm looking for more to invest in. 00:55:27.720 |
These are the trade schools that base themselves on ISAs. 00:55:33.720 |
And there are ones doing welding and plumbing and all kinds of different things. 00:55:37.720 |
And there are ISA platforms that are providing ISAs. 00:55:40.600 |
And there are companies that are doing, you know, the same thing.