Marc Andreessen: How Risk Taking, Innovation & Artificial Intelligence Transform Human Experience
Chapters
0:00 Marc Andreessen
3:02 Sponsors: LMNT & Eight Sleep
6:05 Personality Traits of an Innovator
12:49 Disagreeableness, Social Resistance; Loneliness & Group Think
18:48 Testing for Innovators, Silicon Valley
23:18 Unpredictability, Pre-Planning, Pivot
28:53 Intrinsic vs Extrinsic Motivation, Social Comparison
32:52 Sponsor: AG1
33:49 Innovators & Personal Relationships
39:24 Risk Taking, Innovators, “Martyrs to Civilizational Progress”
46:16 Cancel Culture, Public vs. Elite
53:08 Elites & Institutions, Trust
57:38 Sponsor: InsideTracker
58:44 Social Media, Shifts in Public vs. Elite
65:45 Reform & Institutions, Universities vs. Business
80:56 Alternative University; Great Awakenings; Survivorship Bias
87:25 History of Computers, Neural Network, Artificial Intelligence (AI)
95:50 Apple vs. Google, Input Data Set, ChatGPT
102:08 Deep Fakes, Registries, Public-Key Cryptography; Quantum Internet
106:46 AI Positive Benefits, Medicine, Man & Machine Partnership
112:18 AI as Best-Self Coach; AI Modalities
119:19 Gene Editing, Precautionary Principle, Nuclear Power
125:38 Project Independence, Nuclear Power, Environmentalism
132:40 Concerns about AI
138:00 Future of AI, Government Policy, Europe, US & China
143:47 China Businesses, Politics; Gene Editing
148:38 Marketing, Moral Panic & New Technology; Politics, Podcasts & AI
159:03 Innovator Development, Courage, Support
166:36 Small Groups vs. Large Organization, Agility; “Wild Ducks”
174:50 Zero-Cost Support, YouTube Feedback, Spotify & Apple Reviews, Sponsors, Momentous, Neural Network Newsletter, Social Media
00:00:02.280 |
where we discuss science and science-based tools 00:00:10.240 |
and I'm a professor of neurobiology and ophthalmology 00:00:24.440 |
which was one of the first widely used web browsers. 00:00:30.040 |
which was one of the earliest widely used web browsers. 00:00:43.400 |
is one of the most successful innovators and investors ever. 00:00:47.400 |
I was extremely excited to record this episode with Marc 00:00:50.920 |
First of all, he himself is an incredible innovator. 00:00:58.640 |
And third, Marc has shown over and over again 00:01:12.880 |
as well as what sorts of environmental conditions 00:01:15.320 |
make for exceptional innovation and creativity 00:01:21.240 |
not just in terms of risk-taking in one's profession, 00:01:26.040 |
but how some people who are risk-takers and innovators 00:01:30.640 |
also seem to take a lot of risks in their personal life 00:01:36.220 |
Then we discuss some of the most transformative technologies 00:01:40.440 |
such as novel approaches to developing clean energy, 00:01:49.180 |
as to why AI is likely to greatly improve human experience. 00:02:00.960 |
all of us are very likely to have AI assistance. 00:02:20.080 |
this can be a tremendously positive addition to our life. 00:02:23.440 |
In doing so, Marc provides a stark counterargument 00:02:30.860 |
So if you're hearing about and/or concerned about 00:02:36.060 |
today you are going to hear about the many different ways 00:02:41.140 |
are likely to enhance our human experience at every level. 00:02:44.660 |
What you'll soon find is that while today's discussion 00:02:46.680 |
does center around technology and technology development, 00:02:57.480 |
I'm certain that you'll find today's discussion 00:03:02.080 |
into what will soon be the future that we all live in. 00:03:05.320 |
Before we begin, I'd like to emphasize that this podcast 00:03:07.920 |
is separate from my teaching and research roles at Stanford. 00:03:12.500 |
to bring zero cost to consumer information about science 00:03:14.960 |
and science-related tools to the general public. 00:03:18.600 |
I'd like to thank the sponsors of today's podcast. 00:03:24.520 |
that has everything you need and nothing you don't. 00:03:26.860 |
That means plenty of the electrolytes, sodium, magnesium, 00:03:29.480 |
and potassium in the correct ratios, but no sugar. 00:03:32.980 |
The electrolytes and hydration are absolutely key 00:03:35.820 |
for mental health, physical health, and performance. 00:03:40.140 |
can impair our ability to think, our energy levels, 00:03:44.600 |
LMNT makes it very easy to achieve proper hydration, 00:03:47.600 |
and it does so by including the three electrolytes 00:03:52.700 |
I drink LMNT first thing in the morning when I wake up. 00:03:54.860 |
I usually mix it with about 16 to 32 ounces of water. 00:03:58.040 |
If I'm exercising, I'll drink one while I'm exercising, 00:04:01.240 |
and I tend to drink one after exercising as well. 00:04:06.800 |
of ingesting sodium, because obviously we don't want 00:04:11.080 |
However, for people that have normal blood pressure, 00:04:15.120 |
very clean diets, that is consuming not so many 00:04:23.460 |
magnesium, and potassium, and we can suffer as a consequence. 00:04:31.340 |
If you'd like to try LMNT, you can go to DrinkLMNT, 00:04:36.660 |
to claim a free LMNT sample pack with your purchase. 00:04:42.880 |
Today's episode is also brought to us by Eight Sleep. 00:04:47.140 |
with cooling, heating, and sleep tracking capacity. 00:04:49.840 |
I've spoken many times before in this podcast 00:04:58.860 |
When we're sleeping well, everything goes far better. 00:05:01.180 |
And when we are not sleeping well or enough, 00:05:03.420 |
everything gets far worse at the level of mental health, 00:05:07.420 |
And one of the key things to getting a great night's sleep 00:05:09.420 |
and waking up feeling refreshed is that you have to control 00:05:11.960 |
the temperature of your sleeping environment. 00:05:13.460 |
And that's because in order to fall and stay deeply asleep, 00:05:19.860 |
And in order to wake up feeling refreshed and energized, 00:05:22.660 |
you want your core body temperature to increase 00:05:26.460 |
With Eight Sleep, it's very easy to induce that drop 00:05:29.880 |
in core body temperature by cooling your mattress 00:05:35.380 |
I started sleeping on an Eight Sleep mattress cover 00:05:37.300 |
a few years ago, and it has completely transformed 00:05:44.140 |
because I don't have my Eight Sleep mattress cover 00:05:50.780 |
and you'll save up to $150 off their Pod 3 Cover. 00:05:54.340 |
Eight Sleep currently ships in the USA, Canada, UK, 00:06:02.260 |
And now for my discussion with Marc Andreessen. 00:06:07.820 |
- Delighted to have you here and have so many questions 00:06:10.040 |
for you about innovation, AI, your view of the landscape 00:06:17.340 |
I want to start off by talking about innovation 00:06:25.660 |
or the psychology of the innovator or innovators, 00:06:29.340 |
things like their propensity for engaging in conflict 00:06:33.140 |
or not, their propensity for having a dream or a vision, 00:06:37.060 |
and in particular, their innovation as it relates 00:07:12.420 |
So to start off, is there a common trait of innovators 00:07:17.420 |
that you think is absolutely essential as a seed 00:07:22.700 |
to creating things that are really impactful? 00:07:31.140 |
And so it was a great moment of delight in my life 00:07:33.940 |
when I learned about the big five personality traits. 00:07:37.540 |
to actually describe the answer to this question 00:07:45.460 |
people who actually do really creative breakthrough work, 00:07:47.180 |
I think you're talking about a couple things. 00:07:48.380 |
So one is very high in what's called trait openness, 00:07:53.620 |
which is basically just like flat out open to new ideas. 00:07:59.420 |
is trait openness means you're not just open to new ideas 00:08:01.380 |
in one category, you're open to many different kinds 00:08:05.420 |
that a lot of innovators also are very creative people 00:08:09.660 |
even outside of their specific creative domain. 00:08:13.860 |
But of course, just being open is not sufficient. 00:08:19.100 |
and talking to people and never actually create something. 00:08:26.800 |
You need somebody who's really willing to apply themselves. 00:08:29.540 |
And in our world, typically over a period of many years, 00:08:38.620 |
that end up getting told about these people are, 00:08:40.380 |
it's just like there's this kid, and he just had this idea, 00:08:48.380 |
it's years and years and years of applied effort. 00:08:53.740 |
basically, willingness to defer gratification 00:08:55.740 |
and really apply themselves to a specific thing 00:08:59.340 |
And of course, this is why there aren't very many 00:09:02.680 |
who are high in openness and high in conscientiousness 00:09:04.860 |
'cause to a certain extent, they're opposed traits, right? 00:09:07.820 |
And so you need somebody who has both of those. 00:09:10.240 |
Third is you need somebody high in disagreeableness, 00:09:15.100 |
So you need somebody who's just like basically ornery, right, 00:09:21.280 |
by people who will be like, oh, well, you know, 00:09:22.820 |
'cause the reaction most people have to ideas is, 00:09:35.140 |
is they tend to be disagreeable about everything, right? 00:09:37.940 |
So they tend to be these very kind of like kind of classic, 00:09:42.620 |
And then there's just a table stakes component, 00:09:44.420 |
which is they just also need to be high IQ, right? 00:09:47.980 |
because it's just it's hard to innovate in any category 00:09:50.340 |
if you can't synthesize large amounts of information quickly. 00:09:53.300 |
And so those are four like basically like high spikes, 00:10:01.740 |
You could probably also say they probably at some point 00:10:08.180 |
they probably can't handle the stress, right? 00:10:11.720 |
And then of course, if you're into like this sort of science 00:10:16.180 |
these are all people who are on like the far outlying 00:10:23.840 |
the sort of hardest topic of all around this whole concept 00:10:26.940 |
which is just there are very few of these people. 00:10:29.540 |
- Do you think they're born with these traits? 00:10:31.860 |
- Yeah, well, so they're born with the traits 00:10:39.820 |
just 'cause they have those personality traits 00:10:41.460 |
doesn't mean they're gonna deliver great creativity, 00:10:53.900 |
You know, it's very easy for them to get like high paying jobs 00:11:01.860 |
And, you know, there's a lot of people at, you know, 00:11:03.880 |
big institutions that we, you know, you and I know well 00:11:08.900 |
and they get a lot of respect and they go for 20 years 00:11:11.040 |
and it's great and they never create anything new, right? 00:11:20.360 |
The world needs, you know, the world needs that also, right? 00:11:22.380 |
The innovators can't run everything 'cause everything, 00:11:24.540 |
you know, the rate of change would be too high. 00:11:26.280 |
Society, I think, probably wouldn't be able to handle it. 00:11:27.920 |
So you need some people who are on the other side 00:11:33.080 |
But there is this decision that people have to make, 00:11:35.280 |
which is, okay, if I have the sort of latent capability 00:11:40.800 |
And do I wanna go through the stress and the pain 00:11:43.160 |
and the trauma, right, and the anxiety, right, 00:11:48.480 |
Once in a while, you run into somebody who's just like, 00:11:50.460 |
can't do it any other way, like they just have to. 00:12:00.400 |
but in part because he's talked about this in interviews 00:12:06.080 |
Like, the idea's come, I have to pursue them, right? 00:12:10.080 |
It's why he's like running five companies at the same time 00:12:16.680 |
You know, look, there's a lot of other people 00:12:23.880 |
to put them in a position where they did something else. 00:12:26.560 |
You know, obviously there are people who try to be creative 00:12:31.860 |
of determinism through traits, but also choices in life. 00:12:35.960 |
And then also of course, the situation in which they're born, 00:12:38.340 |
the context within which they grow up, the culture, right? 00:12:41.480 |
What their parents expect of them and so forth. 00:12:46.360 |
You have to thread all these needles kind of at the same time. 00:12:51.120 |
that meet these criteria who are disagreeable, 00:12:57.400 |
You know, they can, for those just listening, 00:13:13.040 |
as it relates to the requirements of the system. 00:13:19.120 |
and following the rules until they get to a place 00:13:21.560 |
where being disagreeable feels less threatening 00:13:27.760 |
- Yeah, I mean, look, the really highly competent people 00:13:36.400 |
that sort of happened around the movie "The Godfather." 00:13:38.060 |
And then there was this character Meyer Lansky, you know, 00:13:39.800 |
who's like ran basically the mafia, you know, 00:13:44.080 |
well, Meyer Lansky had only like applied himself 00:13:55.040 |
They can, you know, they can do, they can work. 00:14:00.560 |
You know, they don't need to take the easy out. 00:14:09.600 |
this is probably the thing that gets missed the most, 00:14:15.180 |
like once it works, like the stories get retconned, 00:14:21.920 |
to where it's like it was inevitable all along. 00:14:23.640 |
You know, everybody always knew that this was a good idea. 00:14:25.680 |
You know, the person has won all these awards. 00:14:33.200 |
or if you actually get a couple drinks into them 00:14:45.380 |
No, I'm not gonna come work for your company. 00:14:50.440 |
And so they get just like tremendous social resistance. 00:14:57.840 |
the way that more agreeable people need to have, right? 00:15:00.520 |
And this is why agreeableness is a problem for innovation. 00:15:03.940 |
you're gonna listen to the people around you. 00:15:05.520 |
They're gonna tell you that new ideas are stupid, right? 00:15:13.480 |
they need to be able to deal with social discomfort 00:15:17.600 |
or at some point they're gonna get shaken out 00:15:20.660 |
- Do you think that people that meet these criteria 00:15:26.860 |
Or is it important that they form this deep sense of self, 00:15:31.160 |
like the ability to cry oneself to sleep at night 00:15:36.440 |
worrying that things aren't going to work out 00:15:42.280 |
So Sean Parker has the best line by the way on this. 00:15:45.080 |
He says, you know, being an entrepreneur, being a creator 00:15:47.400 |
is like, you know, getting punched in the face, 00:15:53.140 |
And I love that line 'cause it makes everybody 00:15:56.420 |
But it gives you a sense of like how basically painful 00:16:02.520 |
they're like, oh yeah, that's exactly what it's like. 00:16:04.600 |
So there is this, there is a big individual component to it. 00:16:10.360 |
And especially, you know, very hard I think to do this 00:16:13.120 |
if nobody around you is trying to do anything 00:16:16.020 |
And if you're getting just universally negative responses, 00:16:18.200 |
like, you know, very few people I think have, 00:16:26.800 |
There's a huge advantage to clustering, right? 00:16:28.580 |
And so you have, and you know, throughout history, 00:16:32.060 |
You had, you know, clustering of the great artists 00:16:33.640 |
and sculptors in Renaissance Florence, you know, 00:16:35.680 |
you have the clustering of the philosophers of Greece, 00:16:37.780 |
you have the clustering of tech people in Silicon Valley, 00:16:39.960 |
you have the clustering of creative, you know, arts, 00:16:44.120 |
And so forth and so on, you know, for, you know, 00:16:56.100 |
they're going to be much better off being around a lot of people 00:17:00.180 |
than they are in a place where nobody else is doing it. 00:17:02.900 |
Having said that, the clustering has, it can have downsides. 00:17:09.820 |
even among people who are individually very disagreeable. 00:17:16.700 |
they do have fads and trends just like every place else, 00:17:20.020 |
And so they get wrapped up in their own social dynamics. 00:17:23.840 |
And the good news is the social dynamic in those places 00:17:28.180 |
And so it's usually like, you know, it's, I don't know, 00:17:30.660 |
it's like a herd of iconoclasts looking for the next big thing, 00:17:33.860 |
So iconoclasts looking for the next big thing, that's good. 00:17:38.580 |
So even when you're in one of these environments, 00:17:40.120 |
you have to be careful that you're not getting sucked 00:17:43.020 |
When you say groupthink, do you mean excessive friction 00:17:45.380 |
due to pressure testing each other's ideas to the point 00:17:50.880 |
where people start to form a consensus or the self-belief 00:17:54.540 |
that, gosh, we are so strong because we are so different? 00:18:09.900 |
We just end up zeroing in on the same ideas, right? 00:18:14.060 |
It's like there are years where all of a sudden there's 00:18:18.260 |
It's like, why are there all these volcano movies? 00:18:21.100 |
There was just something in the gestalt, right? 00:18:25.940 |
There are moments in time where you'll have these-- 00:18:29.700 |
What's the difference between a fad and a trend, right? 00:18:33.820 |
And so Silicon Valley is subject to fads, both fads and trends, 00:18:39.100 |
In other words, you take smart, disagreeable people, 00:18:41.140 |
you cluster them together, they will act like a herd, right? 00:19:00.980 |
Here, I'm making the presumption that you have these traits. 00:19:08.100 |
Have you observed people trying to feign these traits? 00:19:17.780 |
that they're pretending to be the young Steve Jobs 00:19:22.380 |
or that they're pretending to be the young Henry Ford? 00:19:28.500 |
that qualify as authentic, legitimate innovators. 00:19:33.740 |
who have tried to disguise themselves as true innovators, 00:19:40.380 |
And I realize here that we don't want you to give these away 00:19:51.940 |
So there are people who definitely try to come in 00:19:53.580 |
and basically present as being something that they're not. 00:19:56.820 |
They will have listened to this interview, right? 00:19:58.260 |
They study everything, and they construct a facade, 00:20:01.540 |
and they come in and present as something they're not. 00:20:15.500 |
and people who should be doing it basically give up 00:20:17.220 |
'cause they just think that whatever the industry's over, 00:20:19.460 |
the trend is over, whatever, it's all hopeless. 00:20:22.420 |
So nobody ever shows up at a stock market low 00:20:35.100 |
they're fundamentally oriented for social status. 00:20:41.380 |
and there are always other places to go get social status. 00:21:00.700 |
By 2001, the joke was that there were these terms, 00:21:11.420 |
which is two different kinds of business models 00:21:21.860 |
the people oriented to status who showed up to be in tech 00:21:28.740 |
"or go back to McKinsey where I can be high status," 00:21:41.380 |
with a lot of kind of, let's say, public persona 00:21:48.340 |
and you can actually say exactly how we test for this, 00:21:50.260 |
which because the test exactly addresses the issue 00:21:55.820 |
and it's actually the same way homicide detectives 00:22:01.140 |
if you're innocent or whether you've killed somebody, 00:22:04.540 |
Which is you ask increasingly detailed questions, right? 00:22:27.380 |
and things just fuzz into just kind of obvious bullshit. 00:22:34.980 |
of what they're doing that they've kind of engineered. 00:22:37.260 |
But as they get into the details, it just fuzzes out. 00:22:39.980 |
Whereas the true people that you want to back 00:22:50.660 |
that they know so much more about it than you ever will. 00:22:56.380 |
Which is also what the homicide cops say, right? 00:22:58.500 |
Which you actually want the emotional response of like, 00:23:01.780 |
"I can't believe that you're asking me questions 00:23:06.220 |
And they kind of figure out what you're doing 00:23:21.580 |
have actually taken the time to systematically think 00:23:24.300 |
through the if-ands of all the possible implications 00:23:31.140 |
of how things need to turn out or will turn out? 00:23:40.400 |
because the world will sort of bend around it? 00:23:43.640 |
do you think that they place their vision in context 00:23:47.900 |
and they have that tunnel vision of that thing 00:23:54.460 |
I mean, that's how I first came to know your name. 00:23:58.340 |
When you were conceiving Netscape, did you think, 00:24:01.900 |
okay, there's this search engine and this browser 00:24:05.620 |
and it's going to be this thing that looks this way 00:24:10.760 |
Did you think that and also think about, you know, 00:24:15.280 |
that there was going to be a gallery of other search engines 00:24:18.140 |
and it would fit into that landscape of other search engines 00:24:20.540 |
or were you just projecting your vision of this thing 00:24:28.660 |
and then we can talk about the specific example. 00:24:30.220 |
So the general answer is with entrepreneurship, 00:24:32.780 |
creativity, innovation is what economists call 00:24:37.020 |
And so in both parts, it's important, decision-making, 00:24:40.020 |
because you have to decide what to do, what not to do 00:24:46.900 |
the world is a complex adaptive system with feedback loops 00:24:49.300 |
and like, it's really, I mean, it's extreme, you know. 00:24:55.380 |
he wrote about this field called psychohistory 00:24:57.340 |
which is the idea that there's like a supercomputer 00:24:59.060 |
that can predict the future of like human affairs, right? 00:25:10.460 |
military commanders call this the fog of war, right? 00:25:14.860 |
where the number of variables are just off the charts. 00:25:20.260 |
making all these decisions in different directions. 00:25:23.820 |
which is these people are colliding with each other, 00:25:26.660 |
And so, I mean, look, the most straightforward kind of way 00:25:29.940 |
to think about this is it's just, it's amazing. 00:25:32.260 |
Like anybody who believes in economic central planning, 00:25:33.940 |
it always blows my mind 'cause it's just like, 00:25:37.540 |
Like try just opening a restaurant on the corner down here 00:25:40.660 |
and like 50/50 odds the restaurant's gonna work. 00:25:43.260 |
And like all you have to do to run a restaurant 00:25:44.940 |
is like have a thing and serve food and like, 00:25:48.460 |
And so, and restaurant people who run restaurants 00:25:52.060 |
about these things very hard and they all wanna succeed. 00:25:56.700 |
or to start an artistic movement or to fight a war, 00:26:05.660 |
where there's just like incredible levels of complexity, 00:26:12.540 |
And so what we look for is basically the sort of drop, 00:26:16.460 |
the really good innovators, they've got a drive 00:26:18.700 |
to basically be able to cope with that and deal with that. 00:26:22.540 |
So one is they try to pre-plan as much as they possibly can. 00:26:29.020 |
And so the idea maze basically is I've got this general idea 00:26:37.260 |
okay, if I do it this way, that way, this third way, 00:26:39.300 |
here's what will happen, then I have to do that, 00:26:43.140 |
Here's the technical challenge I'm gonna hit. 00:26:44.900 |
And they've got in their heads as best anybody could, 00:26:50.060 |
of possible futures as they could possibly have. 00:26:53.340 |
when you ask them increasingly detailed questions, 00:26:54.860 |
that's what you're trying to kind of get them 00:27:06.700 |
now they're in it, now they're in the fog of war, right? 00:27:10.340 |
And now that idea maze is maybe not helpful practically, 00:27:13.120 |
but now they're gonna be basically constructing 00:27:20.180 |
'cause they're gonna change, if their thing starts to work, 00:27:31.740 |
the great ones course correct every single day. 00:27:38.100 |
The great ones tend to think in terms of hypotheses, right? 00:27:40.460 |
It's a little bit like a scientific sort of mentality, 00:27:42.580 |
which is they tend to think, okay, I'm gonna try this. 00:27:46.500 |
I'm gonna announce that I'm doing this for sure. 00:27:52.900 |
I'm gonna put a stake in there, that's my plan. 00:27:55.740 |
And even though I sound like I have complete certainty, 00:28:03.420 |
well, actually, we're not going left, we're going right. 00:28:05.740 |
And they have to run that loop thousands of times, right? 00:28:10.260 |
And this led to the creation of this great term pivot, 00:28:31.180 |
the businesses that end up working really well 00:28:35.100 |
But that's part of the process of a really smart founder 00:28:38.380 |
basically working their way through reality, right? 00:28:42.860 |
- The way you're describing this has parallels 00:28:57.740 |
I could imagine a great risk to early success. 00:29:01.560 |
So for instance, somebody develops a product, 00:29:10.720 |
to use the less profane version of it, right? 00:29:15.300 |
In other words, the, and I think of everything these days, 00:29:18.700 |
or most everything in terms of reward schedules 00:29:22.800 |
'cause that is the universal currency of reward. 00:29:25.340 |
And so when you talk about the Sean Parker quote 00:29:28.100 |
of learning to enjoy the taste of one's own blood, 00:29:32.700 |
than learning to enjoy the taste of success, right? 00:29:43.140 |
In other words, building up of those five traits 00:29:50.200 |
So on the outside, we just see the product, the end product, 00:29:52.880 |
the iPhone, the MacBook, the Netscape, et cetera. 00:29:55.820 |
But I have to presume, and I'm not a psychologist, 00:30:00.840 |
and I've studied the dopamine system enough to know that 00:30:08.060 |
sounds to be a reinforcement of those five traits, 00:30:10.740 |
rather than, oh, it's going to be this particular product, 00:30:16.940 |
That all seems like the peripheral to what's really going on, 00:30:21.940 |
that great innovators are really in the process 00:30:34.460 |
so this is like extrinsic versus intrinsic motivation. 00:30:36.700 |
So the Steve Jobs kind of Zen version of this, right, 00:30:46.140 |
of these big public markers, like the stock price 00:30:47.940 |
or the IPO or the product launch or whatever. 00:30:49.740 |
He's like, no, it's actually the process itself 00:30:54.240 |
And to your point, if you have that mentality, 00:30:59.700 |
And so that's the kind of intrinsic motivation 00:31:05.900 |
It's like, can I get better at doing this, right? 00:31:08.820 |
And can I prove to myself that I can get better? 00:31:13.140 |
and this is one of the reasons why Silicon Valley 00:31:21.140 |
So a phenomenon that we've observed over time 00:31:35.140 |
and as long as they beat that level of success, 00:31:40.740 |
but then in contrast, you're in Silicon Valley, 00:31:42.340 |
and you look around, and it's just like Facebook, 00:31:44.140 |
and Cisco, and Oracle, and Hewlett-Packard, and-- 00:31:47.880 |
- Yeah, and you're just looking at these giants, 00:31:53.100 |
Mark Zuckerberg's still going to work every day 00:31:54.740 |
and trying to do, and so these people are like, 00:32:03.960 |
and how much bigger their accomplishments are, 00:32:07.540 |
in that environment have much greater aspirations, right? 00:32:11.220 |
'Cause they just, again, maybe at that point, 00:32:14.340 |
maybe there's an extrinsic component to that, 00:32:16.960 |
but or maybe it helps calibrate that internal system 00:32:21.120 |
no, the opportunity here is not to build a local, 00:32:23.220 |
you know, what you may call local maximum form of success, 00:32:25.100 |
but let's build to a global maximum form of success, 00:32:27.660 |
which is something as big as we possibly can. 00:32:30.500 |
Ultimately, the great ones are probably driven 00:32:32.380 |
more internally than externally when it comes down to it, 00:32:39.140 |
who very easily can punch out and move to Fiji 00:32:41.700 |
and just call it, and they're still working 16-hour days, 00:32:45.100 |
right, and so obviously something explains that 00:32:47.420 |
that has nothing to do with external rewards, 00:32:51.060 |
- As many of you know, I've been taking AG1 daily since 2012, 00:32:56.260 |
so I'm delighted that they're sponsoring the podcast. 00:33:05.720 |
of vitamins and minerals through whole food sources 00:33:08.020 |
that include vegetables and fruits every day, 00:33:10.260 |
but oftentimes I simply can't get enough servings, 00:33:12.940 |
but with AG1, I'm sure to get enough vitamins and minerals 00:33:17.660 |
and it also contains adaptogens to help buffer stress. 00:33:20.840 |
Simply put, I always feel better when I take AG1. 00:33:23.640 |
I have more focus and energy, and I sleep better, 00:33:30.260 |
if you could take just one supplement, what would it be? 00:33:34.660 |
If you'd like to try AG1, go to drinkag1.com/huberman 00:33:48.880 |
- I've heard you talk a lot about the inner landscape, 00:34:03.860 |
that are ideal for certain types of pursuits. 00:34:05.860 |
I think there was an article written by Paul Graham 00:34:10.460 |
that you overhear in a city will tell you everything 00:34:12.480 |
you need to know about whether or not you belong there 00:34:19.460 |
and now we should probably add Austin to the mix 00:34:29.100 |
of this intrinsic versus extrinsic motivators 00:34:33.140 |
in terms of something that's a bit more cryptic, 00:34:38.740 |
You know, if I think about the catalog of innovators 00:34:41.720 |
in Silicon Valley, some of them, like Steve Jobs, 00:34:48.920 |
I don't know, I wasn't their couples therapist, 00:34:55.460 |
that for all the world seemed like a happy marriage. 00:35:05.840 |
You know, I don't think I'm disclosing anything 00:35:13.480 |
But the reason I'm asking this is you can imagine 00:35:15.960 |
that for the innovator, the person with these traits 00:35:19.700 |
who's trying to build up this thing, whatever it is, 00:35:23.700 |
that having someone, or several people in some cases, 00:35:32.340 |
when the rest of the world may not believe in you yet 00:35:36.760 |
And we have examples from cults that embody this. 00:35:42.540 |
We have examples from tech innovation and science. 00:36:01.380 |
or cheat on them or pair up with some other innovator, 00:36:05.020 |
which we've seen several times, recently and in the past, 00:36:10.220 |
But what are your thoughts on the role of personal 00:36:18.720 |
and their feeling that they can really bring that idea 00:36:29.220 |
So first is we talked about the personality traits 00:36:33.980 |
- Doesn't foster a good romantic relationships. 00:36:35.660 |
- Highly disagreeable people can be difficult 00:36:38.260 |
- I may have heard of that once or twice before. 00:36:42.340 |
And maybe you just need to find the right person 00:36:46.860 |
it's always this question about relationships, right? 00:36:48.420 |
Which is, do you want to have the same personality, 00:36:57.980 |
There are relationships where you'll have somebody 00:37:00.100 |
who's paired with somebody who's highly agreeable. 00:37:03.140 |
'Cause one person just gets to be on their soapbox 00:37:04.700 |
all the time and the other person's just like, okay. 00:37:10.340 |
You know, you put two disagreeable people together, 00:37:13.660 |
and they have great conversations all the time 00:37:15.160 |
and maybe they come to hate each other, right? 00:37:20.520 |
you're fishing out of the disagreeable end of the pond. 00:37:23.780 |
I don't mean, you know, these are normal distributions. 00:37:25.900 |
I don't mean like 60% disagreeable or 80% disagreeable. 00:37:28.780 |
The people we're talking about are 99.99% disagreeable, 00:37:36.780 |
they have the other personality traits, right? 00:37:40.340 |
As a consequence, they tend to work really hard. 00:37:41.920 |
They tend to not have a lot of time for, you know, 00:37:47.740 |
And so again, that kind of thing can fray in a relationship. 00:37:50.420 |
So there's a fair amount in there that's loaded. 00:37:53.800 |
Like somebody's gonna partner with one of these people 00:38:00.740 |
Or you need a true partnership with two of these, 00:38:06.180 |
And then look, I think a big part of it is, you know, 00:38:10.140 |
and, you know, either in their own minds or, you know, 00:38:16.120 |
And they start to be able to, it's like, well, okay, 00:38:18.100 |
you know, now we're rich and successful and famous 00:38:22.740 |
I view this now in the realm of personal choice, right? 00:38:25.340 |
You get in this thing where people start to think 00:38:27.900 |
And so they start to behave in, you know, very bad ways. 00:38:30.900 |
And then they blow up their personal worlds as a consequence 00:38:33.660 |
and maybe they regret it later and maybe they don't, right? 00:38:40.460 |
And then I don't know, like, yeah, some people just need, 00:38:44.980 |
some people just need more emotional support than others. 00:38:54.260 |
on having this kind of firm foundation to rely upon. 00:39:02.900 |
Like professionally, they just keep doing what they're doing. 00:39:04.940 |
And maybe we could talk here about like, you know, 00:39:07.700 |
whatever is the personality trait for risk taking, right? 00:39:25.780 |
because I think risk taking and sensation seeking 00:39:29.540 |
is something that fascinates me for my own reasons 00:39:46.900 |
of how we're painting this picture of the innovator 00:39:54.500 |
are innovations that make the world far better 00:39:59.560 |
And by the way, everything we're talking about also 00:40:00.940 |
is not just in tech or science or in business. 00:40:03.160 |
It's also, everything we're also talking about 00:40:07.840 |
is you have people with all these same kinds of traits. 00:40:10.960 |
and his regular turnover of lovers and partners. 00:40:26.260 |
- Right, or that was his story for behaving in a pattern 00:40:29.420 |
that was very awful for the people around him 00:40:41.500 |
I keep a list of things that will get me kicked 00:40:42.500 |
out of a dinner party at topics at any given point in time. 00:40:50.580 |
I'll recall so that I can get out of these things. 00:40:53.180 |
But so here's the thing that can get me kicked 00:40:55.540 |
out of a dinner party, especially these days. 00:40:58.480 |
So think of the kind of person where it's like very clear 00:41:04.220 |
somebody is super high output in whatever domain they're in. 00:41:06.180 |
They've done things that have like fundamentally 00:41:08.580 |
They've brought new, whether it's businesses or technologies 00:41:10.780 |
or works of art, entire schools of creative expression, 00:41:20.080 |
And they do that either through like a massive 00:41:23.460 |
They do that through a massive personal breakdown. 00:41:25.920 |
They do that through some sort of public expression 00:41:38.020 |
there's this moral arc that people kind of want to apply, 00:41:40.100 |
which is like the Icarus flying too close to the sun. 00:41:43.620 |
And he had it coming and he needed to keep his ego 00:42:01.640 |
I call them martyrs to civilizational progress, right? 00:42:05.160 |
So, and we'll work backwards, civilizational progress. 00:42:07.600 |
So look, the only way civilization gets moved forward 00:42:10.520 |
is when people like this do something new, right? 00:42:12.900 |
'Cause civilization as a whole does not do new things, right? 00:42:16.280 |
Groups of people do not do new things, right? 00:42:26.680 |
is because one of these people stands up and says, 00:42:30.440 |
than what everybody else has ever done before. 00:42:48.440 |
Like when they go down in flames, like they have, 00:43:00.940 |
He was the kind of person who was going to do great things 00:43:03.780 |
and also was going to take on a level of risk 00:43:06.300 |
and take on a level of sort of extreme behavior 00:43:16.260 |
The reason you have the Picassos and the Beethovens 00:43:18.680 |
and all these people is because they're willing 00:43:27.340 |
that they will set themselves up to be able to fail. 00:43:29.500 |
Psychologic, you know, a psychologist would probably, 00:43:31.380 |
a psychiatrist would probably say, you know, maybe, 00:43:39.360 |
But you see this, they deliberately move themselves 00:43:46.580 |
they deliberately move back towards it, right? 00:43:48.700 |
You know, they come right back and they want the risk. 00:43:55.180 |
to civilizational progress, like this is how progress happens. 00:44:00.100 |
the natural inclination is to judge them morally. 00:44:02.580 |
I tend to think we should basically say, look, 00:44:04.820 |
and I don't even know if this means like giving them 00:44:07.780 |
But it's like, look, like this is how civilization progresses. 00:44:12.900 |
that there's a self-sacrificial aspect to this 00:44:23.180 |
who were able to compartmentalize their risk-taking 00:44:27.920 |
to such a degree that they had what seemed to be 00:44:37.700 |
So some people are very highly controlled like that. 00:44:42.980 |
and I don't really want to set myself an example 00:44:44.780 |
on a lot of this, but I will tell you like as an example, 00:44:46.700 |
like I will never use debt in business, number one. 00:44:50.420 |
Number two, like I have the most placid personal life 00:44:53.060 |
Number three, I'm the last person in the world 00:44:57.020 |
I mean, I'm not even gonna go in the sauna or the ice bath. 00:45:12.500 |
I have no interest in any of this stuff, right? 00:45:13.780 |
And so like there are, and I know people like this, right, 00:45:17.060 |
It's just like, yeah, they're completely segmented. 00:45:21.020 |
They're completely buttoned down on the personal side. 00:45:22.820 |
They're completely buttoned down financially. 00:45:24.940 |
They're scrupulous with following every rule and law 00:45:30.860 |
And then I know many others who are just like, 00:45:32.660 |
their life is on fire all the time in every possible way. 00:45:36.780 |
And whenever it looks like the fire is turning into embers, 00:45:38.820 |
they figure out a way to like relight the fire, right? 00:45:41.420 |
And they just really want to live on the edge. 00:45:54.020 |
I think Bach was, as an example, one of the kind of best 00:45:56.860 |
musicians of all time, had just a completely sedate 00:45:59.700 |
personal life, never had any aberrant behavior 00:46:07.380 |
And so if Bach could be Bach and yet not burn his way 00:46:10.500 |
through 300 mistresses or whatever, maybe you can, too. 00:46:15.700 |
So in thinking about these two different categories 00:46:18.100 |
of innovators, those that take on tremendous risk in all 00:46:20.500 |
domains of their life and those that take on tremendous risk 00:46:27.180 |
But I have to wonder if in this modern age of the public being 00:46:41.500 |
like by just simply frightening or eliminating 00:46:47.140 |
because they don't have the confidence or the means 00:46:58.380 |
So do you think the public is less tolerant than they 00:47:08.420 |
I think the large institution systems are not 00:47:22.500 |
enough noise in the mob, I think institutions bow out. 00:47:27.860 |
they essentially say, OK, let the cancellation proceed. 00:47:31.540 |
And then maybe they're the gavel that comes down, 00:47:34.340 |
but they're not the lever that got the thing going. 00:47:36.980 |
And so I'm not just thinking about universities. 00:47:40.620 |
I'm thinking about the big movie houses that cancel a film that 00:47:43.980 |
a given actor might be in because they had something 00:47:46.180 |
in their personal life that's still getting worked out. 00:47:48.560 |
I'm thinking about people who are in a legal process that's 00:47:55.700 |
My question is are we really talking about the public? 00:47:59.060 |
I agree with your question, and I'm going to come back to it. 00:48:01.380 |
I'm going to examine one part of your question, which 00:48:04.180 |
is is this really the public we're talking about? 00:48:07.740 |
who is the current front runner for the Republican nomination 00:48:13.380 |
The public, at least on one side of the political aisle, 00:48:19.340 |
Number two, look, there's a certain musician who flew too 00:48:23.900 |
close to the sun, blew himself to smithereens. 00:48:25.860 |
He's still hitting all-time highs on music streams 00:48:35.660 |
more open to these things than it actually maybe ever has been. 00:48:43.340 |
but it's a differentiation between the public 00:48:46.660 |
So my view is everything that you just described 00:48:48.940 |
is an elite phenomenon, and actually the public 00:48:53.540 |
And so what's actually happening is the division-- 00:48:58.300 |
The public is more forgiving of what previously 00:49:01.860 |
might have been considered kind of extreme behavior. 00:49:05.700 |
As F. Scott Fitzgerald said, there are no second acts 00:49:09.500 |
It turns out there are second acts, third acts, fourth acts. 00:49:11.340 |
Apparently, you can have an unlimited number of acts. 00:49:14.220 |
Yeah, I mean, I think of somebody like Mike Tyson. 00:49:21.180 |
that's amazing and great and also terrible about America. 00:49:24.900 |
If we took Mike Tyson to dinner tonight at any restaurant 00:49:27.580 |
anywhere in the United States, what would happen? 00:49:31.660 |
He would be-- the outpouring of enthusiasm and passion 00:49:45.820 |
know that the public is just like 100%, like absolutely. 00:49:52.860 |
And then you see it when he shows up in movies, right? 00:49:55.940 |
I mean, the big breakthrough I figured this out 00:49:57.340 |
with respect to him because I don't really follow sports. 00:49:59.100 |
But when he showed up in that, it was that first Hangover 00:50:10.060 |
I always say that Mike Tyson is the only person 00:50:11.980 |
I'm aware of that can wear a shirt with his own name on it. 00:50:18.180 |
In fact, it just kind of makes you like him more. 00:50:29.820 |
maybe as a consequence of all that he's been through. 00:50:40.380 |
I think you're probably going to get a different reaction. 00:50:43.860 |
I mean, David Simon, the guy who wrote The Wire, 00:50:54.540 |
that people adore people who are connected to everybody 00:51:02.780 |
I feel like everybody loves Mike from above his status, 00:51:14.540 |
Yeah, and then, look, the other side of this is the elites. 00:51:17.060 |
And you kind of alluded to this, or the institutions. 00:51:20.060 |
who were like, at least, nominally in charge, 00:51:35.860 |
Who can get who fired, boycotted, blacklisted, 00:51:37.900 |
ostracized, like, when push comes to shove, prosecuted, jailed, 00:51:46.300 |
And of course, you'll notice that that is heavily asymmetric 00:51:50.340 |
It's very clear which side can get the other side fired 00:51:54.260 |
And so, yeah, so look, I think we live in a period of time 00:52:01.460 |
extreme groupthink, extreme sanctimony, extreme moral, 00:52:05.620 |
I would say, dudgeon, this weird sort of modern puritanism, 00:52:13.100 |
of punishment and terror against their perceived enemies. 00:52:17.140 |
But I wanted to go through that, because I actually 00:52:21.180 |
And I think what's happening to the elites is very different 00:52:22.580 |
than what's happening in the population at large. 00:52:24.940 |
And then, of course, I think there's a feedback loop 00:52:27.020 |
in there, which is I think the population at large 00:52:31.300 |
I think the elites are aware that the population is not 00:52:37.140 |
That causes the elites to harden their own positions. 00:52:42.260 |
And so they're in sort of an oppositional negative feedback 00:52:55.300 |
ostracized, banned, hit pieces in the press, like whatever. 00:53:01.060 |
they have to really band together and really mount 00:53:03.740 |
a serious challenge, which mostly doesn't happen, 00:53:06.100 |
but might be starting to happen in some cases. 00:53:19.020 |
everything you're giving each and every person 00:53:21.620 |
their own little reality TV show, their own voice, 00:53:27.940 |
in the number of cancellations and firings related 00:53:30.580 |
to immoral behavior based on things that were either done 00:53:46.260 |
somewhat interchangeable, but elites and institutions. 00:53:51.460 |
And so it's sort of a self-reinforcing thing. 00:53:55.260 |
Anyway, institutions of all kinds, institutions, 00:53:57.220 |
everything from the government, bureaucracies, companies, 00:53:59.580 |
nonprofits, foundations, NGOs, tech companies, 00:54:02.720 |
on and on and on, people who are in charge of big complexes 00:54:07.340 |
and that carry a lot of basically power and influence 00:54:17.740 |
that would cause somebody to have a high opinion of them 00:54:19.740 |
But they're in charge of this gigantic multi-billion dollar 00:54:22.140 |
complex and have all this power of the results. 00:54:24.180 |
So that's just to define terms, elites and institutions. 00:54:29.540 |
Gallup has been doing polls on the question of trust 00:54:41.980 |
across all the categories of big institutions, 00:54:44.340 |
basically everyone I just talked about, a bunch of others, 00:54:46.580 |
big business, small business, banks, newspapers, 00:54:52.300 |
So they've got like 30 categories or something. 00:54:54.340 |
And basically what you see is almost all the categories 00:54:56.620 |
basically started in the early '70s at like 60% or 70% trust. 00:55:00.220 |
And now they've-- basically almost across the board, 00:55:03.100 |
they've just had a complete basically linear slide down 00:55:11.860 |
Congress and journalists bottom out at like 10%. 00:55:19.460 |
And then it's like a lot of other big institutions 00:55:23.780 |
Actually, big business actually scores fairly high. 00:55:29.060 |
But basically everything else has really caved in. 00:55:31.220 |
And so this is sort of my fundamental challenge 00:55:36.900 |
hear the simple form of this, which is social media caused 00:55:43.020 |
Collapse in faith in institutions and elites, 00:55:47.940 |
Everybody's like, oh, social media caused that. 00:55:49.180 |
I was like, well, no, social media is new in the last-- 00:55:53.140 |
social media is effectively new practically speaking since 2010, 00:55:58.180 |
And so if the trend started in the early 1970s 00:56:04.060 |
And Martin Gurri wrote I think the best book on this called 00:56:09.920 |
And he does say that social media had a lot to do with 00:56:14.820 |
But he says if you go back, you look further, 00:56:19.220 |
One was just a general change in the media environment. 00:56:21.480 |
And in particular, the 1970s is when you started-- 00:56:24.460 |
especially in the 1980s is when you started to get specifically 00:56:35.180 |
you had paperback books, which was another one of these, 00:56:37.780 |
So you had a fracturing in the media landscape 00:56:42.320 |
And then, of course, the internet blew it wide open. 00:56:45.160 |
Having said that, if the elites in the institutions 00:56:47.240 |
were fantastic, you would know it more than ever. 00:56:51.240 |
And so the other thing that he says and I agree with 00:56:55.280 |
into thinking the elites in institutions are bad. 00:56:59.560 |
And therefore, the mystery of the Gallup poll 00:57:04.880 |
which is arguably in a lot of cases where they should be. 00:57:14.320 |
So he basically says you can't replace elites with nothing. 00:57:22.720 |
You're going to be left with a completely, basically, 00:57:25.120 |
atomized, out-of-control society that has no ability to marshal 00:57:29.800 |
It's just going to be a dog-eat-dog awful world. 00:57:32.980 |
I have a very different view on that, which we can talk about. 00:57:38.400 |
I'd like to take a quick break and acknowledge our sponsor, 00:57:42.140 |
InsideTracker is a personalized nutrition platform 00:57:50.420 |
I'm a big believer in getting regular blood work done 00:57:52.740 |
for the simple reason that many of the factors 00:57:55.060 |
that impact your immediate and long-term health 00:57:57.300 |
can only be analyzed from a quality blood test. 00:57:59.880 |
However, with a lot of blood tests out there, 00:58:04.940 |
but you don't know what to do with that information. 00:58:06.940 |
With InsideTracker, they have a personalized platform 00:58:09.260 |
that makes it very easy to understand your data, 00:58:15.920 |
and behavioral supplement, nutrition, and other protocols 00:58:19.100 |
to adjust those numbers to bring them into the ranges 00:58:21.900 |
that are ideal for your immediate and long-term health. 00:58:24.160 |
InsideTracker's ultimate plan now includes measures 00:58:28.660 |
which are key indicators of cardiovascular health 00:58:39.900 |
Again, that's insidetracker.com/huberman to get 20% off. 00:58:44.500 |
The quick question I was going to ask before we go there is, 00:58:48.060 |
I think that one reason that I and many other people 00:58:53.640 |
caused the demise of our faith in institutions is, 00:58:58.640 |
well, first of all, I wasn't aware of this lack 00:59:02.680 |
of correlation between the decline in faith in institutions 00:59:08.220 |
But secondarily, that we've seen some movements 00:59:11.560 |
that have essentially rooted themselves in tweets, 00:59:24.420 |
In fact, I can't name one person who initiated 00:59:33.260 |
adding on to some person that was essentially anonymous. 00:59:36.400 |
we have the bottom, to use neuroscience language, 00:59:40.680 |
Oh, someone sees something in their daily life 00:59:45.940 |
and they tweet about it or they comment about it 00:59:50.160 |
And then enough people dogpile on the accused 00:59:54.560 |
that it picks up force and then the elites feel compelled, 01:00:08.780 |
whereas normally someone would just be standing 01:00:13.460 |
And you've got like the Erin Brockovich model 01:00:20.660 |
who's got this idea in mind about how big institution 01:00:24.020 |
is doing wrong or somebody is doing wrong in the world 01:00:27.380 |
and then can leverage big institution, excuse me. 01:00:30.220 |
But the way that you describe it is that the elites 01:00:42.120 |
if for instance, no one tweeted or commented a me too, 01:00:47.980 |
or no one tweeted or commented about some ill behavior 01:00:53.420 |
of some, I don't know, university faculty member 01:00:57.760 |
would the elite have come down on them anyway? 01:01:27.600 |
There's this whole universe of basically these funded groups 01:01:29.960 |
that basically do quote, unquote, misinformation 01:01:32.960 |
and they're constantly mounting these kinds of attacks. 01:01:43.000 |
No, and almost always when you trace these things back, 01:01:50.840 |
These are entrepreneurs in sort of a weird way. 01:01:55.520 |
They're basically, they're paid; it's their job, their mission, their calling. 01:02:04.240 |
I mean, there's a very large funding complex for this 01:02:15.320 |
have been on the receiving end of for the last decade. 01:02:17.920 |
It's basically a political media activism complex 01:02:34.120 |
every politician who goes out and gives stump speeches, 01:02:36.080 |
you'll see there's always somebody in the crowd 01:02:37.240 |
with a camcorder or now with a phone recording them 01:02:43.400 |
and record every single thing the politician says 01:02:45.600 |
so that when Mitt Romney says whatever the 47% thing, 01:03:03.600 |
they're protecting and then they know how to use the tools 01:03:07.640 |
and so they know how to try to gin up the outrage 01:03:10.180 |
and then by the way, sometimes it works in social cascades. 01:03:20.260 |
They're constantly trying to light this fire. 01:03:29.600 |
and then the thing takes on a life of its own. 01:03:33.600 |
at the social media firms who are responsible 01:03:38.960 |
and you'll notice one large social media company 01:03:46.560 |
and all of a sudden a different kind of boycott movement 01:03:53.840 |
and so for sure there's intermediation happening. 01:03:57.000 |
Like look, the stuff that's happening in the world today 01:04:00.440 |
'cause social media is the defining media of our time 01:04:15.620 |
and when it appears to be a grassroots thing, 01:04:20.360 |
which is possible 'cause there's a fairly large number 01:04:22.960 |
of people who are signed up for that particular crusade 01:04:29.200 |
The question is okay, at what point does the population 01:04:32.960 |
and maybe there are movements at certain points in time 01:04:39.440 |
and then there's another question of like well, 01:04:43.540 |
are they gonna be the same movements as the elites want 01:04:46.280 |
and are the elites, how are the elites gonna react 01:04:48.080 |
when the population actually like fully expresses itself? 01:04:55.000 |
they tend to push the population to more extreme views 01:05:06.040 |
- By the way, Taibbi, so Mike Shellenberger, Matt Taibbi, 01:05:08.720 |
a bunch of these guys have done a lot of work. 01:05:18.740 |
- I've seen more and more Shellenberger showing up, right? 01:05:22.920 |
And he's just like, on this stuff, he and Taibbi, 01:05:27.280 |
It's just like, it's very clear how the money flows, 01:05:36.800 |
- The government should not be funding programs 01:05:38.640 |
that take away people's constitutional rights 01:05:40.340 |
and yet somehow that is what's been happening. 01:05:48.660 |
about why the decline in confidence in institutions 01:05:59.120 |
and burning down of the forest that will lead to new life? 01:06:17.440 |
And then, yeah, the other question is like, okay, 01:06:22.480 |
like can, it's this wonderful word reform, right? 01:06:26.120 |
And everybody always wants to reform everything 01:06:30.940 |
And so people are trying to reform, you know, 01:06:35.480 |
we're building fewer houses than ever before. 01:06:36.920 |
So somehow reform movements seem to lead to more, 01:06:40.960 |
But anyway, yeah, so if you have an existing institution, 01:06:46.320 |
Like there's a lot of, there are professors at Stanford 01:06:56.360 |
- Well, I mean, there are many things about Stanford 01:07:00.920 |
It's certainly got its issues like any other place. 01:07:18.120 |
You know, you look to the left, you look to the right 01:07:19.800 |
or anywhere above or below you and you have excellence, 01:07:23.680 |
right, I mean, I've got a Nobel Prize winner below me 01:07:27.440 |
and his scientific offspring is likely to win. 01:07:36.300 |
So there's that and that's great and that persists. 01:07:47.600 |
And then of course there are the things that, you know, 01:07:49.680 |
many people are aware of are there are public accusations 01:07:53.280 |
about people in positions of great leadership 01:07:56.680 |
and the whole thing becomes kind of overwhelming 01:07:59.540 |
and a little bit opaque when you're just trying 01:08:03.440 |
And so I think one of the reasons for this lack of reform 01:08:06.040 |
that you're referring to is because there's no position 01:08:20.180 |
And so, you know, we don't have a dedicated role of reformer, 01:08:26.320 |
there's just a lot of fat on this and we need to trim it 01:08:32.920 |
And that's, I think, in part because universities 01:08:45.320 |
- Well, so the point we can debate the university 01:08:57.000 |
And then at the very least, the people who run institutions 01:08:59.040 |
ought to really think hard about what that means. 01:09:00.760 |
- But people still strive to go to these places. 01:09:03.840 |
And I still hear from people who, like for instance, 01:09:11.240 |
that their son or daughter is going to Stanford 01:09:13.380 |
or is going to UCLA or is going to Urbana-Champaign. 01:09:18.100 |
that's always the most shocking contradiction is like, 01:09:29.280 |
or that they started their own business most of the time, 01:09:35.600 |
- So do you think the median voter in the United States 01:09:40.180 |
- Do you think the median voter in the United States 01:09:46.620 |
- In this day and age, the competition is so fierce 01:09:51.180 |
- Yeah, so first of all, again, we're dealing here, 01:09:59.380 |
Most people have no connectivity to them whatsoever. 01:10:09.500 |
but like the reality of it is they have a very, 01:10:19.240 |
that there's collapsing faith in the institutions, 01:10:20.880 |
if you believe that it is merited, at least in some ways, 01:10:23.540 |
if you believe that reform is effectively impossible, 01:10:25.400 |
then you are faced, and we could debate each of those, 01:10:28.920 |
but the population at large seems to believe a lot of that. 01:10:39.480 |
or do you actually need to basically clear the field 01:10:44.040 |
The universities are a great case study of this 01:10:48.040 |
and the way student loans work is to be able to be a, 01:10:51.200 |
to be an actual competitive university and compete, 01:10:53.840 |
you need to have access to federal student lending, 01:10:55.520 |
'cause if you don't, everybody has to pay out of pocket, 01:10:58.820 |
other than a certain class of either extremely rich 01:11:03.180 |
So you need access to federal student loan facility. 01:11:04.760 |
To get access to federal student loan facility, 01:11:17.380 |
like they decide who the new universities are. 01:11:19.000 |
Guess how many new universities get accredited, right, 01:11:27.800 |
and as long as they have the government wired 01:11:29.680 |
the way that they do, and as long as they control 01:11:32.200 |
who gets access to federal student loan funding, 01:11:34.180 |
like of course there's not gonna be any competition, right? 01:11:40.500 |
And so if you actually wanted to create a new system 01:11:46.000 |
or hundreds of ways, it could obviously be better 01:11:49.400 |
It probably can't be done as long as existing institutions 01:11:55.000 |
which is like, yeah, look, if we're gonna tear down the old, 01:11:59.680 |
before we get to the new, but we're never gonna get 01:12:04.080 |
you're talking about the author of "Revolt of the Public." 01:12:06.520 |
What Martin Gurri says is like, look, he said basically, 01:12:11.600 |
The elites deserve contempt, but the only thing worse 01:12:19.080 |
And he basically says on the other side of the destruction 01:12:23.720 |
of the elites and the institutions is nihilism. 01:12:27.320 |
And then by the way, there is a nihilistic streak. 01:12:28.960 |
I mean, there's a nihilistic streak in the culture 01:12:31.540 |
There are people who basically would just say, 01:12:32.940 |
yeah, just tear the whole system down, right? 01:12:34.980 |
And without any particular plan for what follows. 01:12:40.460 |
in that you wanna be careful that you actually have a plan 01:12:42.280 |
on the other side that you think is actually achievable. 01:12:47.140 |
if you're not willing to actually tear down the old, 01:12:52.360 |
is this is what happens every day in business, right? 01:12:59.100 |
The way that you know is that the old companies, 01:13:01.460 |
when they're no longer like the best at what they do, 01:13:03.560 |
they get torn down and then they ultimately die 01:13:10.640 |
And we know, what's so interesting is we know in capitalism 01:13:13.800 |
and a market economy, we know that that's the sign of health. 01:13:20.080 |
And in fact, we get actually judged by antitrust authorities 01:13:24.520 |
It's like the best defense against antitrust charges is, 01:13:28.160 |
and they're doing like a really good job of it. 01:13:31.160 |
And in fact, in business, we are specifically, 01:13:35.380 |
in the same industry to get together and plot and conspire 01:13:37.440 |
and plan and have things like these accreditation bureaus. 01:13:39.760 |
Like we would get, if I created the equivalent 01:13:41.920 |
in my companies of the kind of accreditation bureau 01:13:46.640 |
An antitrust violation, Sherman Act, straight to prison. 01:13:50.440 |
So in the business world, we know that you want 01:14:02.440 |
We want basically stagnation and log rolling, right? 01:14:05.960 |
And basically institutional incestuous entanglements 01:14:10.000 |
and conflicts of interest as far as the eye can see. 01:14:14.820 |
- So let's play it out as a bit of a thought experiment. 01:14:27.480 |
where unless somebody has egregious behavior, 01:14:31.420 |
violent behavior, truly sexually inappropriate behavior 01:14:35.060 |
against somebody, where they're committing a crime, right? 01:14:38.560 |
They're allowed to be a student or a faculty member 01:14:45.820 |
allowed student loans for this one particular university. 01:14:48.520 |
Or let's say that there was an independent source of funding 01:14:53.140 |
They didn't need to be part of this elite accredited group, 01:15:01.120 |
Not necessarily violent, but certainly coercive 01:15:06.160 |
Let's say that then there were 20 or 30 of those 01:15:19.960 |
Remember the Sherlock Holmes, the dog that didn't bark? 01:15:28.020 |
One is like nobody wants that, which I don't believe. 01:15:31.220 |
And then the other is like the system is wired 01:15:39.900 |
- Or the people that band together have enough money 01:15:50.260 |
when thinking about the size of a university. 01:15:51.860 |
And most of them hopefully will graduate in four years 01:15:57.380 |
And do you think that the great future innovators 01:16:05.340 |
more than they currently do toward the traditional model? 01:16:08.220 |
I mean, what I'm trying to get back to here is 01:16:09.660 |
how do you think that the current model thwarts innovation 01:16:12.900 |
as well as maybe some ways that it still supports innovation 01:16:17.460 |
certainly cancellation and the risk of cancellation 01:16:25.840 |
of the category of risk takers that take risk 01:16:27.840 |
in every domain that really like to fly close to the sun 01:16:33.280 |
- Or are doing research that is just not politically. 01:16:39.120 |
- Right, that we can't even talk about on this podcast 01:16:46.080 |
- But it gives up the whole game right there. 01:16:53.020 |
because I'm afraid to put it into electronic form 01:16:55.480 |
of all the things that I'm afraid to talk about publicly 01:17:00.340 |
where all three died young and I figure if nothing else, 01:17:03.160 |
I'll die and then I'll make it into the world 01:17:08.340 |
and if not, I know a certainty I'm going to die at some point 01:17:11.360 |
and then we'll see where all those issues stand. 01:17:14.880 |
- Is that list getting longer over time or shorter? 01:17:23.000 |
that I would love to explore on this podcast with experts 01:17:26.960 |
and that I can't explore just even if I had a panel of them 01:17:31.960 |
because of the way that things get soundbited 01:17:39.940 |
and so fortunately, there are an immense number 01:17:44.840 |
that I'm excited to have but it is a little disturbing. 01:17:52.320 |
Lysenkoism, so famous in the history of the Soviet Union, 01:17:59.040 |
but I'm not calling to mind what the context is. 01:18:00.880 |
- Well, he was the guy who did communist genetics. 01:18:04.100 |
The field of genetics, the Soviets did not approve 01:18:10.480 |
and total equality and genetics did not support that 01:18:13.440 |
and so if you were doing traditional genetics, 01:18:15.720 |
you were at the very least fired if not killed 01:18:27.200 |
in the agriculture system of the Soviet Union 01:18:29.220 |
and it's the origin of one of the big reasons 01:18:32.400 |
which was they ultimately couldn't feed themselves. 01:18:39.160 |
And so they not only created it, they taught it, 01:18:57.600 |
So I tend to like to play on lush open fields. 01:19:04.300 |
This goes to the rot and I'll come back to your question 01:19:06.020 |
but like this goes to the rot in the existing system 01:19:07.800 |
which is we've, by the way, I'm no different. 01:19:10.200 |
Like I'm not trying not to light myself on fire either 01:19:14.320 |
and by system I mean the institutions and the elites. 01:19:19.360 |
I mean that list is like obviously expanding over time 01:19:22.280 |
and like that's a real like historically speaking 01:19:33.000 |
- It's sort of the boomers plus the millennials. 01:19:39.940 |
Gen X is weird 'cause we kind of slipped in the middle. 01:19:42.160 |
We were kind of the, I don't know how to describe it. 01:19:47.440 |
kind of sandwiched between the boomers and the millennials. 01:19:49.720 |
Gen Z is a very, I think, open question right now 01:19:53.320 |
I could imagine them being actually like much more intense 01:19:58.280 |
I could also imagine them reacting to the millennials 01:20:10.940 |
and so where the jocks and the hippies and the punks 01:20:13.960 |
and the, we're all divided and they were all segmented 01:20:21.800 |
And I think that had a lot to do with, like you said, 01:20:24.400 |
the sort of apolitical aspect of our generation. 01:20:27.880 |
- We just knew, Gen X just knew the boomers were nuts. 01:20:36.600 |
was Family Ties with the character Alex P. Keaton. 01:20:39.120 |
And he was just like this guy who's just like yeah, 01:20:44.980 |
Like there was something iconic about that character 01:20:47.240 |
in our culture and people like me were like yeah, 01:20:51.920 |
And then it's just like man, that came whipping back around 01:20:56.180 |
So just to touch real quick on the university thing. 01:20:59.200 |
and I'm actually gonna do a thing this afternoon 01:21:00.680 |
with the University of Austin which is one of these. 01:21:03.560 |
And so there are people trying to do new universities. 01:21:06.340 |
You know, like I say, it's certainly possible. 01:21:10.740 |
I think it'd be great if there were a lot more of them. 01:21:17.000 |
'cause I don't know whose idea it was originally. 01:21:22.200 |
It's called University of Austin or they call it, 01:21:27.320 |
And so it's a lot of very smart people associated with it. 01:21:37.280 |
I would just tell you like the wall of opposition 01:21:41.860 |
And part of it is economic which is can they ever get access 01:21:45.360 |
And I hope that they can, but it seems nearly inconceivable 01:21:51.400 |
And then the other is just like they're gonna, 01:21:54.440 |
I mean, anybody who publicly associates with them 01:21:58.600 |
who is in traditional academia immediately gets lit on fire. 01:22:02.720 |
So they're up against a wall of social ostracism. 01:22:07.940 |
They're up against a wall of people just like doing 01:22:10.800 |
the thing, pouncing anytime anybody says anything, 01:22:13.300 |
they're gonna try to like burn the place down. 01:22:14.720 |
- This reminds me of like Jerry Springer episodes 01:22:18.360 |
and Geraldo Rivera episodes where it's like if a teen 01:22:23.200 |
listened to like Danzig or Marilyn Manson type music 01:22:28.200 |
or Metallica that they were considered a devil worshiper. 01:22:35.240 |
People listen to music with all sorts of lyrics 01:22:40.800 |
But there were people legitimately sent to prison, 01:22:46.340 |
These kids out in West Memphis that looked different, 01:22:51.240 |
convicted of crimes that it eventually was made clear they clearly didn't commit, 01:23:06.500 |
it's a little bit niche, but I mean, these were real lives 01:23:09.900 |
and there was an active witch hunt for people 01:23:15.600 |
And yet now we're sort of in this inverted world 01:23:21.560 |
that we can express ourselves however we want. 01:23:24.460 |
you can't get a bunch of people together to take classes 01:23:26.820 |
where they learn biology and sociology and econ in Texas. 01:23:34.540 |
Well, so the simple explanation is this is Puritanism, right? 01:23:40.440 |
that just like works itself out through the system 01:23:47.840 |
There'll be these periods in American history 01:23:52.920 |
where you'll have this basically, this frenzy 01:23:57.040 |
In the old days, it would have been tent revivals 01:23:59.080 |
and people speaking in tongues and all this stuff. 01:24:02.880 |
it's of the form that we're living through right now. 01:24:05.880 |
And so, yeah, it's just basically these waves 01:24:09.840 |
And you know, remember like religion in our time, 01:24:12.320 |
religious impulses in our time don't get expressed, 01:24:14.200 |
you know, 'cause we live in more advanced times, right? 01:24:28.600 |
As long as the church is secular, there's no problem, right? 01:24:34.680 |
and we're in the middle of another religious frenzy. 01:24:47.760 |
- So that's how I feel too, because you know, I'll take-- 01:24:51.560 |
- Take any number of things that we've talked about 01:24:54.100 |
and you know, actually it's so crazy, you know, 01:24:59.840 |
or it's so crazy the way things have gone with social media 01:25:01.880 |
or it's so crazy, fill in the blank and people will say, 01:25:08.360 |
Like it's the stock market or something, you know, 01:25:11.320 |
after every crash, there'll be an eventual boom 01:25:17.780 |
- Most stock markets, we have of course survivorship, 01:25:19.280 |
but it's all survivorship, everything is survivorship. 01:25:23.080 |
So if you look globally, most stock markets over time 01:25:27.800 |
The American stock market has always recovered. 01:25:30.520 |
- Right, I was referring to the American stock market. 01:25:33.380 |
But the reason everybody refers to the American stock market 01:25:37.160 |
The other 200 or whatever that crash and burn 01:25:43.600 |
Like, I don't think it's coming back anytime soon. 01:25:45.360 |
- Yeah, my father's Argentine and immigrated to the US 01:25:48.400 |
in the 1960s, so he would definitely agree with that. 01:25:53.080 |
like when their stocks crash, they don't come back. 01:25:57.080 |
like the Soviet Union never recovered from like Lysenkoism, 01:25:58.960 |
it never came back, it led to the end of the country. 01:26:01.560 |
You know, literally the things that took down 01:26:02.980 |
the Soviet Union were oil and wheat and the wheat thing, 01:26:05.480 |
you can trace the crisis back to like Lysenkoism. 01:26:08.320 |
And so, yeah, no, look, pendulum always swings back, 01:26:11.720 |
it's true only in the cases where the pendulum swings back. 01:26:16.560 |
all the other circumstances where that doesn't happen. 01:26:18.440 |
One of the things people, you see this in business also, 01:26:21.320 |
people have a really hard time confronting really bad news. 01:26:27.920 |
I think every doctor who's listening right now 01:26:30.200 |
But like, there are situations, you see it in business, 01:26:33.400 |
there are situations that, you see Star Trek, 01:26:36.000 |
remember Star Trek, the Kobayashi Maru simulator, right? 01:26:38.800 |
So the big lesson to become a Star Trek captain 01:26:41.480 |
was this simulation called the Kobayashi Maru, and the point was, 01:26:43.260 |
there's no way to, it's a no-win scenario, right? 01:26:47.660 |
was the only person to ever win the scenario, 01:26:49.200 |
and the way that he did it was he went in ahead of time 01:26:54.560 |
And then there was a debate whether to fire him 01:26:55.960 |
or make him a captain, so they made him a captain. 01:27:01.480 |
you do get the Kobayashi Maru on a regular basis. 01:27:07.040 |
And as a leader, you can't ever cop to that, right? 01:27:10.480 |
and you have to look for every possible choice you can. 01:27:12.480 |
But every once in a while, you do run into a situation 01:27:16.400 |
And at least I've found people just cannot cope with that. 01:27:21.000 |
they basically just exclude it from their memory 01:27:26.720 |
'cause I want to make sure that we talk about 01:27:42.560 |
by creating some clever segue, but I'm not going to. 01:27:45.220 |
Except I'm going to ask, is there a possibility 01:27:48.940 |
that AI is going to remedy some of what we're talking about? 01:27:53.120 |
Let's make sure that we earmark that for discussion 01:27:59.040 |
of this podcast might not be as familiar with AI 01:28:03.580 |
We've all heard about artificial intelligence. 01:28:05.320 |
People hear about machine learning, et cetera. 01:28:07.440 |
But it'd be great if you could define for us what AI is. 01:28:17.380 |
I'm going to wake up and I'm going to be strapped to the bed 01:28:19.860 |
and my organs are going to be pulled out of me. 01:28:24.400 |
They're going to kill all my children, and it's dystopia for most. 01:28:29.400 |
Clearly, that's not the way it's going to go. 01:28:34.560 |
If you believe that machines can augment human intelligence 01:28:41.180 |
So tell us what AI is and where you think it can take us, 01:28:55.540 |
People like Alan Turing and John von Neumann and these people. 01:29:01.520 |
because they knew they wanted to build computers, 01:29:04.960 |
And there had been calculating machines before that 01:29:08.760 |
that you basically programmed with punch cards. 01:29:13.380 |
sort of increasingly complex calculating machines. 01:29:16.760 |
but they knew they were going to be able to build, 01:29:24.680 |
which is should the fundamental architecture of the computer 01:29:27.240 |
be based on either A, like calculating machines, 01:29:30.480 |
like cash registers and looms and other things like that, 01:29:34.200 |
or should it be based on a model of the human brain? 01:29:40.760 |
And this was this concept of so-called neural networks. 01:29:50.600 |
So they didn't have our level of neuroscience, 01:29:56.680 |
and information processing in the brain even back then. 01:30:00.080 |
And a lot of people at the time basically said, 01:30:02.600 |
you know what, we should basically have the computer 01:30:04.360 |
from the start be modeled after the human brain, 01:30:09.240 |
that would be the best possible general purpose computer. 01:30:16.000 |
It turns out that didn't happen in our world. 01:30:22.920 |
It went basically in the model of the calculating machine 01:30:36.760 |
up to and including the iPhone in our pocket, 01:30:46.260 |
they're basically like mathematical savants at best, right? 01:30:50.160 |
So they're like, they're really good at like, 01:30:59.540 |
One of the things you learn early when you're a programmer 01:31:05.620 |
'cause it will do exactly what you tell it to do. 01:31:08.100 |
And bugs in computer programs are always a mistake 01:31:19.700 |
- Yeah, yeah, and it's the programmer's fault. 01:31:23.980 |
They'll be like, yeah, if there's a problem, it's my fault. 01:31:32.100 |
develop an independent understanding of anything. 01:31:33.960 |
It's literally just doing what I tell it to do step by step. 01:31:39.320 |
this very kind of hyper literal kind of model computers. 01:31:43.400 |
these are what are called von Neumann machines 01:31:44.880 |
named after the mathematician, John von Neumann. 01:31:48.240 |
And they've been very successful and very important 01:31:52.140 |
But there was always this other idea out there, 01:31:53.940 |
which is, okay, how about a completely different approach, 01:31:56.220 |
which is based much more on how the human brain operates, 01:32:05.100 |
It basically says, okay, what if you could have a computer, 01:32:08.480 |
what if you could have it actually be conceptual, right? 01:32:11.500 |
And creative and able to synthesize information, right? 01:32:24.680 |
And so, and the applications for this, of course, 01:32:42.440 |
how to recognize objects and images at high speeds, 01:32:45.040 |
the same way, basically the same way the human brain does. 01:32:46.880 |
And so those are so-called neural networks running inside. 01:32:49.540 |
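To make that contrast concrete, between explicit step-by-step instructions and a system that learns from examples, here is a minimal, self-contained sketch of a tiny neural network in Python. The four-pixel "images," the labels, the layer sizes, and the learning rate are all invented toy values for illustration only; nothing here comes from any real vision or self-driving system discussed in the episode.

```python
# Minimal sketch: a two-layer neural network that *learns* a mapping from
# examples, instead of being told explicit step-by-step rules (toy data only).
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 4-pixel inputs; label is 1 when the left half is brighter than the right.
X = rng.random((200, 4))
y = (X[:, :2].sum(axis=1) > X[:, 2:].sum(axis=1)).astype(float).reshape(-1, 1)

# Two layers of randomly initialized weights -- no hand-written rules anywhere.
W1 = rng.normal(0.0, 0.5, (4, 8))
W2 = rng.normal(0.0, 0.5, (8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass: hidden activations, then a probability per example.
    h = sigmoid(X @ W1)
    p = sigmoid(h @ W2)

    # Backward pass: push the squared error back through both layers.
    delta2 = (p - y) * p * (1 - p)
    grad_W2 = h.T @ delta2 / len(X)
    delta1 = (delta2 @ W2.T) * h * (1 - h)
    grad_W1 = X.T @ delta1 / len(X)

    # Gradient-descent update: the network adjusts itself from the examples.
    W2 -= 1.0 * grad_W2
    W1 -= 1.0 * grad_W1

print("training accuracy after learning from examples:", ((p > 0.5) == y).mean())
```

The point of the sketch is only the shape of the approach: the rule is never written down, it emerges from the data, which is the property Marc is contrasting with the hyper-literal von Neumann style of programming.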
- So essentially let the machine operate based on priors. 01:33:02.400 |
it now introduces to its catalog of boulders. 01:33:07.160 |
or let's make it even starker for a self-driving car. 01:33:11.280 |
Is it a small child or a plastic shopping bag 01:33:18.800 |
you definitely want to go straight through it 01:33:21.760 |
you might, you know, you're gonna make a fast, 01:33:26.240 |
Like you don't want to swerve to avoid a shopping bag 01:33:29.760 |
If it's a small child, for sure you want to swerve, right? 01:33:31.960 |
And so, but it's very, but like in that moment, 01:33:34.700 |
and you know, small children come in different 01:33:40.640 |
- Yeah, they might look like they're tumbling. 01:33:43.040 |
they might be wearing a Halloween mask, right? 01:33:45.760 |
They might not have a recognizable human face, right? 01:33:48.400 |
Or it might be a kid with, you know, one leg, right? 01:33:59.080 |
of a von Neumann machine to basically real life 01:34:01.560 |
and expect the computer to be in any way understanding 01:34:06.840 |
And this is why there's always been such a stark divide 01:34:08.620 |
between what the machine can do and what the human can do. 01:34:11.720 |
And so basically what's happened is in the last decade, 01:34:18.760 |
It started to work actually first interestingly in vision, 01:34:22.560 |
which is why the self-driving car is starting to work. 01:34:26.120 |
- I mean, when I started off in visual neuroscience, 01:34:28.000 |
which is really my original home in neuroscience, 01:34:34.120 |
could do face recognition better than a human 01:34:42.420 |
based on the understanding of the face recognition cells 01:34:45.760 |
Now you would be smartest to put all your money 01:34:51.760 |
You know, you want to find faces in airports, 01:34:57.480 |
machines can do it far better than most all people. 01:35:05.600 |
Now, 10 years ago, what I just said was the exact reverse. 01:35:13.680 |
and then voice, right, being able to understand voice. 01:35:16.300 |
Like if you use, just as a user, if you use Google Docs, 01:35:22.580 |
If you use voice transcription in Google Docs, 01:35:25.400 |
You just speak into it and it just like types 01:35:31.760 |
and they'll say, "I need to pick up a few thongs." 01:35:37.500 |
with whatever the voice recognition is that Google's using. 01:35:43.560 |
- That was not the topic I was avoiding discussing. 01:35:48.520 |
So look, there's a reason actually why Google's so good 01:35:52.480 |
and Apple is not right now at that kind of thing, 01:35:55.720 |
it's actually an ideological thing of all things. 01:35:58.120 |
Apple does not permit pooling of data for any purpose, 01:36:06.480 |
And Apple's just like staked their brand on privacy 01:36:13.800 |
that has to happen like locally on your phone, 01:36:16.060 |
whereas Google's AI can happen in the cloud, right? 01:36:19.100 |
Now, by the way, some people think that that's bad 01:36:25.160 |
which is you have this separation between the people 01:36:26.960 |
who are embracing the new way of training AIs 01:36:29.600 |
and the people who basically, for whatever reason, are not. 01:36:32.580 |
- Excuse me, you say that some people think it's bad 01:36:34.920 |
because of privacy issues or they think it's bad 01:36:36.640 |
because of the reduced functionality of that AI. 01:36:41.760 |
so there's three reasons AIs have started to work. 01:36:48.960 |
So specifically the reason why objects and images are now, 01:36:53.040 |
the reason machines are now better than humans 01:36:54.520 |
at recognizing objects and images or recognizing faces 01:36:59.520 |
are trained across all photos on the internet of people, 01:37:02.720 |
billions and billions and billions of photos, right? 01:37:04.620 |
A limited number of photos of people on the internet. 01:37:08.960 |
10 or 20 years ago, they'd be trained on, you know, 01:37:12.940 |
- So the input data is simply much more vast. 01:37:16.360 |
And this is the reason, to get to the conclusion on this, 01:37:18.080 |
this is the reason why ChatGPT works so well, 01:37:19.920 |
is ChatGPT, one of the reasons ChatGPT works so well 01:37:22.620 |
is it's trained on the entire internet of text. 01:37:25.140 |
And the entire internet of text was not something 01:37:30.800 |
which is new in the last, you know, basically decade. 01:37:35.140 |
I could see how having a much larger input data set 01:37:38.060 |
would be beneficial if the goal is to recognize 01:37:40.080 |
Marc Andreessen's face because you are looking 01:37:42.520 |
for signal to noise against everything else, right? 01:37:52.640 |
construct a paragraph about Marc Andreessen's prediction 01:37:56.360 |
of the future of human beings over the next 10 years 01:37:59.640 |
and the likely to be most successful industries, 01:38:08.540 |
how does it know what is authentically Marc Andreessen's text? 01:38:14.340 |
you have a, you've got a standard to work from, 01:38:23.960 |
you have to make sure that what you're starting with 01:38:28.520 |
So, which makes sense if it's coming from video, 01:38:46.080 |
So I would say there's a before and after thing here. 01:38:48.920 |
there's like a before-ChatGPT and after-ChatGPT question, right? 01:38:51.840 |
'Cause the existence of GPT itself changes the answer. 01:38:57.100 |
the version you're using today is trained on data 01:39:03.860 |
almost all text on the internet was written by a human being 01:39:12.440 |
is 'cause it was published in a magazine under my name 01:39:14.280 |
or it's a podcast transcript and it's under my name. 01:39:16.920 |
And generally speaking, if you just did a search on like, 01:39:19.240 |
what are things Marc Andreessen has written and said, 01:39:23.800 |
And there, look, somebody might have written a fake, 01:39:25.960 |
you know, parody article or something like that, 01:39:27.840 |
but like not that many people were spending that much time 01:39:30.260 |
writing like fake articles about like things that I said. 01:39:34.900 |
And so generally speaking, you can kind of get your arms 01:39:36.900 |
around the idea that there's a corpus of material 01:39:38.660 |
associated with me or by the way, same thing with you. 01:39:41.940 |
and your other academic papers and talks that you've given. 01:39:47.060 |
They take all that data collectively, they put it in there. 01:39:49.380 |
And that's why this works as well as it does. 01:39:50.980 |
And that's why if you ask ChatGPT to speak or write like me 01:39:56.660 |
it will actually generally do a really good job 01:39:58.620 |
'cause it has all of our prior text in its training data. 01:40:02.000 |
That said, from here on out, this gets harder. 01:40:06.660 |
is because now we have AI that can create text. 01:40:09.500 |
We have AI that can create text at industrial scale. 01:40:21.680 |
I was talking to a friend who's got like a 14 year old kid 01:40:23.460 |
in a class and there's like these recurring scandals. 01:40:25.720 |
It's like every kid in the class is using ChatGPT 01:40:39.940 |
but it's like only right like 60% of the time. 01:40:42.540 |
And so there was this case where the student wrote an essay 01:40:45.180 |
where their parents sat and watched them write the essay 01:40:53.240 |
but the teacher is like, well, you're all using the tool. 01:41:02.280 |
but what it really is is it's a cheating mechanism 01:41:03.980 |
to basically just shuffle the words around enough 01:41:17.620 |
I don't think it's possible to do the watermarking. 01:41:21.220 |
which is you can just read the output for yourself. 01:41:25.720 |
How are you actually gonna tell the difference 01:41:27.800 |
between that and something that a real person wrote? 01:41:31.060 |
you can also ask ChatGPT to write in different styles, right? 01:41:37.960 |
in the style of a non-native English speaker, right? 01:41:41.620 |
you can tell it to write in the style of an English speaker, 01:41:50.140 |
I think there's a lot of people who are gonna want 01:41:58.220 |
And by the way, I actually think this is good. 01:42:11.140 |
So then there's the problem of like deliberate, 01:42:29.860 |
and I'm gonna have him say all these bad things. 01:42:32.900 |
- I mean, Joe Rogan and I were deep faked in a video. 01:42:37.820 |
I won't, so I won't talk about what it was about, 01:42:43.620 |
looked like a conversation that we were having, 01:42:51.200 |
is there's gonna need to be basically registries 01:42:59.100 |
into a registry under your unique cryptographic key, right? 01:43:07.940 |
for public figures, it needs to be done for politicians, 01:43:18.180 |
I get asked, is this your account about some, 01:43:20.820 |
a direct message that somebody got on Instagram? 01:43:22.900 |
And I always tell them, look, I only have the one account, 01:43:28.200 |
although now with the advent of pay-to-play verification, 01:43:32.000 |
makes it a little less potent as a security blanket 01:43:35.060 |
for knowing if it's not this account, then it's not me. 01:43:38.900 |
But in any case, these accounts pop up all the time, 01:43:42.560 |
And I'm relatively low on the scale, not low, 01:43:47.560 |
but relatively low on the scale to say like a Beyonce 01:43:54.200 |
So is there a system in mind where people could go in 01:43:58.840 |
and verify text, click yes or no, this is me, this is not me. 01:44:02.440 |
And even there, there's the opportunity for people to fudge 01:44:08.620 |
no, that's not me, I didn't actually say that or create that. 01:44:12.520 |
So technologically, it's actually pretty straightforward. 01:44:16.360 |
is with public key, it's called public key cryptography, 01:44:18.280 |
which is the basis for how cryptographic information 01:44:24.640 |
you would pick whatever is your most trusted channel. 01:44:27.000 |
Let's say it's your YouTube channel as an example, 01:44:31.280 |
'cause you've been doing it for 10 years or whatever, 01:44:36.800 |
you would just publish your public cryptographic key 01:44:42.680 |
to see whether any piece of content is actually you, 01:44:44.700 |
they go to a registry in the cloud somewhere, 01:44:47.680 |
and they basically submit, they basically say, 01:44:52.040 |
to see whether somebody with your public key, 01:44:54.560 |
you had actually certified that this was something 01:44:58.340 |
Now, who runs that registry is an interesting question. 01:45:10.920 |
of like a credit bureau or something like that. 01:45:13.920 |
The problem with that is that company now becomes 01:45:15.800 |
hacking target number one of every bad person on earth, 01:45:18.840 |
'cause you can, if anybody breaks into that company, 01:45:26.080 |
also their employees can monkey with it, right? 01:45:29.640 |
The third way to do it is with a blockchain, right? 01:45:31.500 |
And so this, with the crypto blockchain technology, 01:45:35.080 |
basically a distributed database in the cloud 01:45:45.860 |
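For readers who want to see the sign-and-verify flow Marc is describing, here is a minimal sketch using the widely available `cryptography` Python package with Ed25519 keys. The in-memory `registry` dictionary is a stand-in of my own for whichever service, company, or blockchain would actually host the certifications; it is an illustration of the idea, not a description of any existing product.

```python
# Minimal sketch of content certification with public-key cryptography.
# A creator signs content with a private key; anyone can check the signature
# against the creator's published public key. The "registry" here is just an
# in-memory dict standing in for whatever service (or blockchain) would host it.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Creator side: generate a keypair and publish the public key on a trusted channel.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

registry = {}  # hypothetical registry: maps content -> (public_key, signature)

def certify(content: bytes) -> None:
    """Creator signs the content and records the certification."""
    registry[content] = (public_key, private_key.sign(content))

def is_authentic(content: bytes) -> bool:
    """Anyone can check: does a signature from the claimed key match this content?"""
    entry = registry.get(content)
    if entry is None:
        return False
    key, signature = entry
    try:
        key.verify(signature, content)
        return True
    except InvalidSignature:
        return False

certify(b"This podcast clip is really me speaking.")
print(is_authentic(b"This podcast clip is really me speaking."))   # True
print(is_authentic(b"A deep-faked quote I never actually said."))  # False
```

The open design question Marc raises, who operates the registry and how it resists tampering, sits entirely outside this snippet; the cryptography itself is the straightforward part.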
I know most of our listeners are probably not familiar 01:45:49.720 |
it's a way to secure communications on the internet. 01:45:54.440 |
It's sophisticated, and we'll probably do a whole episode 01:46:00.060 |
but that would be better, and if so, please offer it up. 01:46:12.440 |
We don't yet have working quantum computers in practice, 01:46:14.680 |
so it's not currently something you could do, 01:46:19.980 |
at defining quantum internet in one sentence. 01:46:21.600 |
It's a way in which if anyone were to try and peer in 01:46:27.560 |
because of the way that quantum internet changes 01:46:32.440 |
the way that the communication is happening so fast 01:46:37.120 |
essentially changing the translation or the language so fast 01:46:42.760 |
- Yeah, conceivably, yeah, not yet, but yeah, someday. 01:46:48.140 |
most people who hear about AI are afraid of AI. 01:46:50.560 |
Well, I think most people who aren't informed. 01:46:54.600 |
- This goes back to our elites versus masses thing. 01:46:59.480 |
and this is from a really wonderful tweet thread 01:47:11.560 |
and that everyone really should take the time to read it. 01:47:13.720 |
It probably takes about 20 minutes to read it carefully 01:47:16.600 |
and to think about each piece and I highly recommend it. 01:47:28.440 |
which is AI will make it easier for bad people 01:47:37.080 |
there is a general freakout happening around AI. 01:47:38.980 |
I think it's primarily, it's one of these, again, 01:47:41.520 |
I don't think the man in the street knows or cares 01:47:45.900 |
and it probably just sounds like science fiction. 01:47:52.000 |
I think that elite-driven freakout has many aspects to it 01:47:54.900 |
that I think are incorrect, which is not surprising. 01:47:57.980 |
I would think that given that I think the elites 01:48:00.640 |
but I think they're very wrong about a number of things. 01:48:04.340 |
look, this is a very powerful new technology, right? 01:48:10.340 |
So like, what if machines could think, right? 01:48:14.580 |
and what if you could have them think for you? 01:48:16.200 |
There's obviously a lot of good that could come from that, 01:48:20.360 |
look, criminals could use them to plan better crimes. 01:48:28.340 |
that bad people can use to do bad things, for sure. 01:48:31.460 |
- I can think of some ways that AI could be leveraged 01:48:33.780 |
to do fantastic things, like in the realm of medicine, 01:48:38.780 |
an AI pathologist perhaps can scan 10,000 slides 01:48:45.420 |
of histology and find the one micro-tumor or cellular aberration 01:48:56.340 |
or well-rested human pathologists, as great as they come, 01:49:01.920 |
And perhaps the best solution is for both of them to do it. 01:49:05.280 |
And then for the human to verify what the AI has found 01:49:10.900 |
I mean, I can come up with thousands of examples 01:49:15.120 |
- I'll give you another one by the way, medicine. 01:49:20.180 |
The other is like the machines are going to be 01:49:22.900 |
They're going to be much better at dealing with the patient. 01:49:25.900 |
And we already know there's already been a study. 01:49:31.140 |
where there was a study team that scraped thousands 01:49:34.180 |
of medical questions off of an internet forum. 01:49:35.860 |
And then they had real doctors answer the questions. 01:49:38.200 |
And then they had basically GPT-4 answer the questions. 01:49:44.460 |
So there were no patients experimented on here. 01:49:46.340 |
This was a test contained within the medical world. 01:50:02.580 |
on most of the factual questions analytically already. 01:50:05.880 |
And it's not even a specifically trained medical AI. 01:50:13.300 |
And so, and you know, I don't think, yeah, I don't, 01:50:26.700 |
but from the surgeons, like if you talk to surgeons 01:50:32.500 |
need to have an emotional remove from their patients 01:50:39.020 |
and tell somebody that they're gonna die, right? 01:50:40.700 |
Or that they have, so they're never gonna recover. 01:50:42.340 |
They're never gonna walk again or whatever it is. 01:50:43.780 |
And so there's sort of something inherent in that job 01:50:48.180 |
from the patient, right, to be able to do the job. 01:50:55.180 |
Like the machine can be as sympathetic as you want it to be 01:51:00.420 |
It's happy to talk to you at four in the morning. 01:51:03.300 |
And by the way, it's not just sympathizing with you 01:51:07.740 |
You know, it's just making up words to lie to you 01:51:11.580 |
in terms of helping you through all the things 01:51:13.220 |
that you can actually do to improve your situation, right? 01:51:15.580 |
And so, you know, boy, like if you'd be, you know, 01:51:21.540 |
Can you keep a patient on track with a nutritional program? 01:51:23.780 |
Can you keep a patient off of drugs or alcohol, right? 01:51:30.780 |
that's infinitely patient, infinitely wise, right? 01:51:42.020 |
Cognitive behavioral therapy is an obvious fit here. 01:51:47.220 |
But you can already use ChatGPT as a CBT therapist 01:52:04.220 |
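As a rough sketch of what using ChatGPT as a CBT-style coach can look like in practice, here is an example that assumes the official `openai` Python client (v1.x) with an API key in the environment. The model name and the system prompt are illustrative choices of mine, not a vetted clinical tool and not anything specified in the conversation.

```python
# Minimal sketch: steering a general-purpose chat model toward a CBT-style
# coaching conversation with a system prompt. Assumes the openai v1.x client
# and an API key in the OPENAI_API_KEY environment variable; the model name
# below is illustrative, not a recommendation.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a patient, supportive coach using cognitive behavioral therapy "
    "techniques. Help the user notice the thought, examine the evidence for "
    "and against it, and suggest one small, concrete next step. You are not "
    "a clinician and should encourage professional help for serious issues."
)

def cbt_reply(user_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(cbt_reply("It's 4 a.m. and I'm convinced tomorrow's talk will be a disaster."))
```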
And so the doctor is going to pair with the AI 01:52:08.740 |
But the patient is also going to pair with the AI. 01:52:10.900 |
And I think this partnership that's gonna emerge 01:52:20.880 |
on excellent mentors from a very young age and still now 01:52:30.820 |
And rarely were they available at four in the morning, 01:52:38.640 |
And they have their own stuff like anybody else, 01:52:50.200 |
that hopefully would learn to identify our best self 01:52:57.780 |
I don't mean that in any kind of pop psychology way. 01:53:06.260 |
I tend to make at two o'clock in the afternoon 01:53:10.340 |
or maybe just less REM sleep the night before. 01:53:13.340 |
It might encourage me to take a little more time 01:53:18.580 |
through a device that no one else would detect 01:53:26.100 |
It's never going to be upset that you didn't listen to it. 01:53:31.720 |
I think this is the way people are going to live. 01:53:37.120 |
is they're going to have exactly that, a friend, therapist, 01:53:39.680 |
companion, mentor, coach, teacher, assistant, 01:53:43.260 |
and that, or by the way, maybe multiple of those. 01:53:56.140 |
when there are difficult decisions to be made in your life, 01:54:05.120 |
and you're going to always be able to talk to it 01:54:08.320 |
and always be able to help it make, you know, 01:54:09.840 |
and like, it's going to be a symbiotic relationship. 01:54:14.000 |
I think it's going to be a much better way to live. 01:54:15.120 |
I think people are going to get a lot out of it. 01:54:18.680 |
So I can imagine my phone has this engine in it, 01:54:24.360 |
and I'm listening in headphones as I walk into work 01:54:27.600 |
and it's giving me some, not just encouragement, 01:54:33.100 |
things that I might ask Marc Andreessen today 01:54:37.680 |
I could also imagine it having a more human form. 01:54:45.140 |
so that it's not going to enter our conversation 01:54:55.680 |
are we going to allow these AI coaches to approach us with? 01:55:03.160 |
'Cause I'm hearing a lot about the software piece. 01:55:07.100 |
- Yeah, so this is where Silicon Valley is going to kick in. 01:55:15.120 |
So obviously, there's big companies that are working. 01:55:18.040 |
The big companies have talked about a variety of these, 01:55:19.760 |
including heads-up displays, AR/VR kinds of things. 01:55:25.560 |
The voice thing is, voice is a real possibility. 01:55:30.040 |
There's a new startup that just unveiled a new thing 01:55:35.240 |
So you'll have a pendant you wear on a necklace, 01:55:37.520 |
and it actually literally will project images on your hand 01:55:40.900 |
or on the table or on the wall in front of you. 01:55:44.920 |
Yeah, there are people working on so-called haptic or touch-based 01:55:48.820 |
There are people working on actually picking up 01:55:50.920 |
nerve signals out of your arm to be able to-- 01:55:56.440 |
there's some science for being able to do basically 01:56:01.360 |
So maybe you could pick it up that way, bone conduction. 01:56:08.640 |
So that's one question is the physical form of it. 01:56:10.720 |
And then the other question is the software version of it, 01:56:13.100 |
which is like, OK, what's the level of abstraction 01:56:18.320 |
Right now, it's like a question-answer paradigm 01:56:21.360 |
Ask a question, get an answer, ask a question, get an answer. 01:56:26.400 |
You want it to build up more knowledge of who you are, 01:56:28.320 |
and you don't want to have to explain yourself 01:56:30.960 |
And then you want to be able to tell it things like, well, 01:56:32.360 |
remind me of this or that, or be sure and tell me when X happens. 01:56:46.720 |
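Here is a toy sketch of the "build up knowledge of who you are" idea: a small Python object that keeps durable facts and scheduled reminders between question-and-answer turns, so you never have to re-explain yourself. The class, fields, and method names are hypothetical, invented purely to illustrate the kind of persistent context such an assistant would carry.

```python
# Toy sketch of persistent assistant memory: durable facts plus scheduled
# reminders that survive between question-answer turns. Purely illustrative
# data structures; no real assistant product is implied.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class PersonalAssistantMemory:
    facts: dict[str, str] = field(default_factory=dict)            # stable things it knows about you
    reminders: list[tuple[datetime, str]] = field(default_factory=list)

    def remember(self, key: str, value: str) -> None:
        """'Remind me of this or that' -- store a durable fact about the user."""
        self.facts[key] = value

    def remind_me_at(self, when: datetime, message: str) -> None:
        """'Be sure and tell me when X happens' -- schedule a reminder."""
        self.reminders.append((when, message))

    def due_reminders(self, now: datetime) -> list[str]:
        return [msg for when, msg in self.reminders if when <= now]

    def context_for_prompt(self) -> str:
        """The text that would be prepended to each model call so it 'knows' you."""
        return "\n".join(f"{k}: {v}" for k, v in self.facts.items())

memory = PersonalAssistantMemory()
memory.remember("sleep", "Tends to make poorer decisions after a short night of REM sleep.")
memory.remind_me_at(datetime(2030, 1, 1, 4, 0), "Prep questions for today's interview.")
print(memory.context_for_prompt())
```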
unless I make a concerted effort to do otherwise, 01:56:58.040 |
learn my style of fragmented internal dialogue. 01:57:02.080 |
And maybe I have an earpiece, and I'm walking in, 01:57:07.280 |
But it's some advice, et cetera, encouragement, discouragement. 01:57:12.100 |
But at some point, those sounds that I hear in an earphone 01:57:22.500 |
of musical perception and language perception. 01:57:24.720 |
Hearing something in your head is very different. 01:57:30.640 |
where if it has inline wiring to actually control 01:57:40.960 |
through the earpiece, a little ultrasound wave now 01:57:43.060 |
can stimulate prefrontal cortex in a non-invasive way. 01:57:46.040 |
That's being used clinically and experimentally. 01:57:52.160 |
to be a little bit more context aware, right? 01:57:58.320 |
for those listening that are trying to figure out 01:58:05.020 |
that are appropriate for that situation and not others. 01:58:07.440 |
And this would go along with agreeableness perhaps, 01:58:15.720 |
There's nothing diabolical about that context is important, 01:58:24.400 |
in prefrontal cortex a little bit in a certain way 01:58:27.000 |
that allows you to be more situationally aware 01:58:32.720 |
but I can't necessarily short-circuit that influence 01:58:37.160 |
because at some point the AI is actually then controlling 01:58:41.660 |
my brain activity and my decision-making and my speech. 01:58:44.340 |
I think that's what people fear is that once we cross 01:58:47.040 |
that precipice, that we are giving up control 01:58:49.760 |
to the artificial versions of our human intelligence. 01:58:54.400 |
We collectively and we as individuals, I think, 01:58:57.680 |
And this is the big thing that I believe about AI. 01:58:59.560 |
There's just a much more, I would say, practical view 01:59:01.280 |
of the world than a lot of the panic that you hear. 01:59:06.640 |
are like the things that people can do in some circumstances. 01:59:08.640 |
But these are machines, we build the machines, 01:59:12.120 |
When we want the machines turned on, they're turned on. 01:59:14.840 |
And so, yeah, so I think that's absolutely the kind of thing 01:59:17.320 |
that the individual person should always be in charge of. 01:59:19.680 |
- I mean, everyone was, and I have to imagine 01:59:22.180 |
some people are still afraid of CRISPR, of gene editing, 01:59:25.120 |
but gene editing stands to revolutionize our treatment 01:59:29.480 |
You know, inserting and deleting particular genes 01:59:32.020 |
in adulthood, right, not having to recombine in the womb 01:59:35.200 |
a new organism is an immensely powerful tool. 01:59:38.480 |
And yet the Chinese scientist who did CRISPR on humans, 01:59:42.480 |
this has been done, actually did his postdoc at Stanford 01:59:50.680 |
I believe it was the HIV, one of the HIV receptors. 02:00:10.000 |
We actually don't know what happened to them, 02:00:18.600 |
We know this, it's not legal in the US and other countries, 02:00:30.520 |
that we back away from them and end up 10, 20 years 02:00:38.640 |
- Yeah, so there's always, and the details matter, 02:00:45.960 |
I think, in terms of counterfactuals and opportunity cost. 02:01:03.680 |
You are quite possibly, if you're Genghis Khan, 02:01:05.880 |
you're determining the future of humanity, right, 02:01:08.240 |
by those, like, yeah, nature, I mean, look, mutations. 02:01:13.240 |
So, this is the old question of, like, basically, 02:01:20.720 |
and then therefore artificial things are bad, 02:01:24.140 |
A lot of people have ethical views like that. 02:01:37.880 |
Nature wants to, like, do, you know, whether, you know, 02:01:42.840 |
nature religion was the original religion, right? 02:01:44.680 |
Like, that was the original thing people worshiped. 02:01:46.560 |
And the reason was because nature was the thing 02:01:49.800 |
before you had scientific and technological methods 02:01:53.640 |
So, the idea of not doing these things, to me, 02:01:57.200 |
is just saying, oh, we're just gonna turn over 02:02:02.840 |
in a particularly good direction or a bad, you know, 02:02:07.480 |
And then the related thing that comes from that 02:02:14.560 |
on things like CRISPR, which basically is this, 02:02:31.680 |
Before that, people didn't think in those terms. 02:02:33.720 |
People just invented things and rolled them out, 02:02:38.560 |
by people inventing things and rolling them out. 02:02:41.760 |
The German Greens came up with the precautionary principle 02:02:51.400 |
It was to shut down attempts to do civilian nuclear power, 02:02:56.500 |
you're like, wow, that was a big mistake, right? 02:03:02.080 |
are not gonna melt down and cause all kinds of problems, 02:03:07.860 |
You can't rule out things that might happen in the future. 02:03:10.540 |
And so that philosophy was used to stop nuclear power, 02:03:14.420 |
by the way, not just in Europe, but also in the US 02:03:20.940 |
this is the worst thing that happened in the last 50 years 02:03:24.160 |
We actually have the silver bullet answer to unlimited energy 02:03:32.100 |
we're actually shutting down the plants that we have now 02:03:34.860 |
in California, and we just shut down the big plant. 02:03:39.100 |
Germany's in the middle of an energy war with Russia 02:03:41.760 |
that we are informed as existential for the future of Europe. 02:03:44.220 |
- But unless the risk of nuclear power plant meltdown 02:03:47.820 |
has increased, and I have to imagine it's gone the other way, 02:03:51.420 |
what is the rationale behind shutting down these plants 02:04:06.060 |
- So what happened was, so nuclear technology arrived 02:04:12.060 |
The first thing they did was in the middle of World War II. 02:04:13.900 |
The first thing they did was the atomic bomb. 02:04:15.240 |
They dropped it on Japan, and then there were all 02:04:16.980 |
the debates that followed around nuclear weapons 02:04:18.620 |
and disarmament, and there's a whole conversation 02:04:22.180 |
'cause there's different views you could have on that. 02:04:25.860 |
where they started to roll out civilian nuclear power, 02:04:29.300 |
There was like Three Mile Island melted down, 02:04:31.100 |
and then Chernobyl melted down in the Soviet Union, 02:04:34.020 |
and then even recently, Fukushima melted down. 02:04:37.980 |
And so I think it was a combination of it's a weapon, 02:04:42.620 |
There's a lot of scientists sometimes say the ick factor, 02:04:49.260 |
By the way, it becomes like a mythical fictional thing, 02:04:53.340 |
and so you have all these movies of horrible supervillains 02:04:55.860 |
powered by nuclear energy and all this stuff. 02:04:59.700 |
is the nuclear power plant and the three-eyed fish 02:05:10.540 |
And that is the dystopia where people are unaware 02:05:17.540 |
- And who owns the nuclear power plant, right? 02:05:26.380 |
for the demise of a particular nuclear power plant. 02:05:33.020 |
where if you're just thinking rationally, scientifically, 02:05:35.660 |
you're like, okay, we want to get rid of carbon, 02:05:38.500 |
So, okay, fun fact, Richard Nixon did two things 02:05:47.280 |
which was to create 1,000 new state-of-the-art 02:05:49.220 |
nuclear plants, civilian nuclear plants in the US by 1980 02:06:07.520 |
which then prevented that from happening, right? 02:06:16.020 |
- You know, he got distracted by him [laughs] 02:06:21.880 |
- I think Ellsberg just died recently, right? 02:06:31.260 |
He didn't have time to fully figure this out. 02:06:33.340 |
I don't know whether he would have figured it out or not. 02:06:46.300 |
Russia's invasion of Ukraine by paying them for oil, right? 02:06:50.740 |
'cause they won't cut over to nuclear, right? 02:06:54.020 |
so then here's the other kicker of what happens, right? 02:07:00.220 |
And so what they do is they do solar and wind. 02:07:08.540 |
And so then what happens is they fire up the coal plants. 02:07:20.820 |
That is the consequence of the precautionary principle. 02:07:23.180 |
Like that's the consequence of that mentality. 02:07:25.620 |
And so it's a failure of a principle on its own merits 02:07:33.260 |
And this is the hot topic on AI right now in Washington, 02:07:36.540 |
which is like, oh my God, these people have to prove 02:07:47.820 |
I mean, there is something about the naming of things. 02:07:55.220 |
and things like that, these are bad words in biology, 02:07:57.780 |
but we had a guest on this podcast, Oded Rechavi, 02:07:59.800 |
who's over in Israel, who's shown inherited traits. 02:08:05.200 |
then it has all sorts of negative implications, 02:08:07.260 |
but his discoveries have important implications 02:08:14.200 |
I mean, there's all sorts of positives that await us 02:08:16.460 |
if we are able to reframe our thinking around something 02:08:27.200 |
This fundamental truth that, at least to my knowledge, 02:08:29.640 |
no one is revising in any significant way anytime soon. 02:08:35.340 |
Instead of nuclear, it's called sustainable, right? 02:08:39.660 |
I mean, it's amazing how marketing can shift our perspective 02:08:46.020 |
I'm sure you can come up with better examples than I can, 02:09:02.600 |
which by the way, she's not, she's dead set against it. 02:09:11.340 |
in environmentalism for 50 years is that nuclear is evil, 02:09:15.700 |
certain environmentalists who disagree with this, 02:09:17.220 |
and so Stewart Brand is the one that's been the most public, 02:09:19.220 |
and he has impeccable credentials in the space, 02:09:25.580 |
and he wrote a recent book that goes through in detail. 02:09:27.660 |
He's like, yes, obviously the correct environmental thing 02:09:31.580 |
and we should be implementing project independence. 02:09:34.020 |
We should be building 1,000, specifically we should, 02:09:36.380 |
he didn't say this, but this is what I would say. 02:09:42.620 |
And they should build us 1,000 nuclear power plants, right? 02:09:48.700 |
- And that would make us independent of our reliance on oil. 02:10:06.300 |
We're not, nothing, no offshore rigs, no nothing. 02:10:10.340 |
you build state-of-the-art plants, engineered properly. 02:10:14.220 |
you just entomb the waste, right, in concrete. 02:10:19.580 |
It's this very small footprint kind of thing. 02:10:25.220 |
to me it's like scientifically, technologically, 02:10:26.780 |
this is just like the most obvious thing in the world. 02:10:29.100 |
It's a massive tell on the part of the people 02:10:37.760 |
because it's the more sustainable form of power. 02:11:04.320 |
I gave Andrew a book on the way in here with this, 02:11:07.120 |
The title of it is "When Reason Goes on Holiday." 02:11:15.900 |
like the positions just simply don't reconcile. 02:11:20.580 |
So to be clear, I predict none of this will happen. 02:11:36.720 |
'cause they're not reliable, so you need something. 02:11:40.160 |
it's gonna be either like oil, natural gas, or coal. 02:11:44.840 |
because you don't think that the sociopolitical, 02:11:49.800 |
elitist trends that are driving against nuclear 02:12:02.680 |
And look, they've been saying this for 50 years, 02:12:05.360 |
off of a bad position they've had for 50 years, 02:12:10.120 |
- One thing that's good about this and other podcasts 02:12:18.560 |
So there are actually, on the point of young kids, 02:12:22.120 |
who are basically not taking no for an answer, 02:12:25.280 |
and particularly there's people trying to develop 02:12:26.600 |
new very small form factor nuclear power plants 02:12:32.720 |
So look, maybe they show up with a better mousetrap 02:12:36.360 |
and people take a second look, but we'll see. 02:12:44.360 |
we should go all in on AI with the constraints 02:12:49.000 |
that we discover we need in order to rein in safety 02:12:52.520 |
and things of that sort, not unlike social media, 02:12:56.440 |
- Not unlike what we should have done with nuclear power. 02:12:58.640 |
- And in terms of the near infinite number of ways 02:13:07.280 |
how do you think we should cope with that psychologically? 02:13:10.400 |
You know, because I can imagine a lot of people 02:13:15.840 |
but there are just too many what ifs that are terrible, 02:13:18.800 |
right, you know, what if the machines take over? 02:13:20.760 |
What if, you know, the silly example I gave earlier, 02:13:23.400 |
but, you know, what if one day I get logged into my, 02:13:26.040 |
you know, hard-earned bank account and it's all gone? 02:13:28.600 |
You know, the AI version of myself like ran off 02:13:31.560 |
with someone else and with all my money, right, right? 02:13:39.420 |
after it learned all the stuff that I taught it. 02:13:42.320 |
It took off with somebody else, stranded, you know, 02:13:49.360 |
- You could really make this scenario horrible 02:13:52.000 |
- Yeah, well, we can throw in a benevolent example as well 02:13:55.760 |
to counter it, but it's just kind of fun to think about 02:14:01.920 |
- So first I say, we got to separate the real problems 02:14:03.960 |
from the fake problems, and so there's a lot, 02:14:05.560 |
a lot of science fiction scenarios I think are just not real 02:14:07.840 |
and the ones that you just cited as an example, 02:14:09.560 |
like it's not, that's not what's gonna happen 02:14:11.000 |
and I can explain why that's not what's gonna happen. 02:14:16.800 |
There's a set of fears that aren't, I think, technologically grounded, that aren't rational. 02:14:19.120 |
It's the AI's gonna like wake up and decide to kill us all. 02:14:21.360 |
It's gonna like, yeah, it's gonna develop the kind of agency 02:14:25.320 |
money and our spouse and everything else, our kids. 02:14:30.340 |
And then there's also all these concerns, you know, 02:14:32.200 |
destruction of society concerns, and this is, you know, 02:14:36.060 |
like all that stuff, which I don't think is a real, 02:14:40.160 |
And then there's, people have a bunch of economic concerns 02:14:42.600 |
around, you know, what's gonna take all the jobs, 02:14:45.200 |
and all those kinds of things, we could talk about that. 02:14:48.440 |
I don't think that's actually the thing that happens. 02:14:57.540 |
And there's a whole set of things to be done inside there. 02:15:01.020 |
The big one is we should use AI to build defenses 02:15:09.020 |
to build pathogens, right, design pathogens in labs, 02:15:11.200 |
which, you know, bad people, bad scientists can do today, 02:15:15.400 |
Well, obviously we should have the equivalent 02:15:17.160 |
of an Operation Warp Speed operating, you know, 02:15:20.980 |
But then we should use AI to build much better biodefenses, 02:15:24.080 |
right, and we should be using AI today to design, 02:15:27.680 |
against every possible form of pathogen, right? 02:15:32.860 |
you can use AI to build better defense tools, right? 02:15:35.040 |
And so you should have a whole new kind of security suite 02:15:37.380 |
wrapped around you, wrapped around your data, 02:15:43.620 |
Disinformation, hate speech, deep fakes, all that stuff, 02:15:46.000 |
you should have an AI filter when you use the internet, 02:15:48.220 |
where, you know, you shouldn't have to figure out 02:15:50.700 |
whether it's really me or whether it's a made up thing, 02:15:52.980 |
you should have an AI assistant that's doing that for you. 02:15:55.100 |
- Oh yeah, I mean, these little banners and cloaks 02:16:01.240 |
You know, if you're me, you always click, right? 02:16:07.560 |
I don't always look at this image as gruesome type thing. 02:16:17.440 |
- Well, and you should have an AI assistant with you 02:16:20.360 |
and you should be able to tell that AI assistant 02:16:22.720 |
So yes, I want the full experience to show me everything. 02:16:27.840 |
and I don't want to hear from these other people 02:16:30.640 |
By the way, it's gonna be my eight-year-old is using this. 02:16:32.400 |
I don't want anything that's gonna cause a problem, 02:16:35.280 |
And AI-based filters like that that you program and control 02:16:40.040 |
and be much more honest and straightforward and clear 02:16:43.640 |
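A toy sketch of that user-programmable filter idea: the user sets the policy (show me everything versus an eight-year-old's account), and a classifier score decides what gets through. The `classify_content` function below is a hypothetical stand-in for a real model call, and the thresholds are arbitrary illustrative numbers, not anything from an existing product.

```python
# Toy sketch of a user-controlled AI filter: the user sets the policy, and a
# classifier score decides what gets through. classify_content() is a
# hypothetical stand-in for a real moderation-model call.
from dataclasses import dataclass

def classify_content(text: str) -> float:
    """Hypothetical model call returning a 0-1 'graphic/harassing' score."""
    rough_words = {"gore", "slur", "threat"}
    return min(1.0, 0.4 * sum(word in text.lower() for word in rough_words))

@dataclass
class FilterPolicy:
    owner: str
    threshold: float  # content scoring above this gets held back

FULL_EXPERIENCE = FilterPolicy(owner="adult who opted in", threshold=1.0)  # show everything
KID_ACCOUNT = FilterPolicy(owner="eight-year-old", threshold=0.2)

def allow(text: str, policy: FilterPolicy) -> bool:
    return classify_content(text) <= policy.threshold

post = "breaking news with graphic gore footage"
print(allow(post, FULL_EXPERIENCE))  # True  -- this user asked for everything
print(allow(post, KID_ACCOUNT))      # False -- held back on the child's account
```

The design point is the one Marc is making: the policy lives with the user, not buried in an opaque platform-wide rule.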
So anyway, so basically what I want people to do 02:16:48.920 |
okay, we can use it to build a countermeasure. 02:16:50.960 |
And the great thing about the countermeasures 02:16:59.400 |
We ought to have better vaccines anyway, right? 02:17:02.760 |
where there's cyber hacking and cyber terrorism. 02:17:06.560 |
And we have the ability now to build much better 02:17:08.720 |
AI-powered tools to deal with all those things. 02:17:14.760 |
You know, getting decent healthcare in this country 02:17:24.440 |
that you have to go through to get a referral 02:17:30.920 |
I mean, it makes one partially or frankly ill 02:17:34.200 |
just to go through the process of having to do all that. 02:17:38.480 |
And granted, I don't have the highest degree of patience, 02:17:41.600 |
but I'm pretty patient and it drives me insane 02:17:47.400 |
But so I can think of a lot of benevolent uses of AI 02:17:52.040 |
and I'm grateful that you're bringing this up in here 02:17:54.800 |
and that you've tweeted about it in that thread. 02:17:59.680 |
I have to imagine that in your role as investor nowadays 02:18:03.080 |
that you're also thinking about AI quite often 02:18:08.080 |
And so does that mean that there are a lot of young people 02:18:11.640 |
who are really bullish on AI and are going for it? 02:18:19.160 |
- Unlike CRISPR, which is sort of in this liminal place 02:18:28.120 |
the governing bodies are going to allow gene editing. 02:18:39.320 |
There's a fight happening in Washington right now 02:18:41.320 |
over exactly what should be legal or not legal. 02:18:50.640 |
or specifically limit it to a small number of big companies, 02:18:57.140 |
- By the way, the EU also is like super negative. 02:19:06.020 |
- Yeah, they just like flat out don't want it. 02:19:07.720 |
- But that's like saying you're going to outlaw the internet. 02:19:10.940 |
- And frankly, they're not a big fan of the internet either. 02:19:14.600 |
the EU has a very, especially the EU bureaucrats, 02:19:19.800 |
have a very negative view on a lot of modernity. 02:19:26.200 |
what calls to mind things that I've heard people 02:19:28.760 |
which is there's so many lazy undisciplined people out there 02:19:32.700 |
that nowadays it's easier and easier to become exceptional. 02:19:36.620 |
It almost sounds like there's so many countries 02:19:38.460 |
that are just backing off of particular technologies 02:19:41.660 |
because it just sounds bad from the PR perspective 02:19:44.840 |
that it's creating great low-hanging fruit opportunities 02:19:50.000 |
for people to barge forward and countries to barge forward 02:19:55.200 |
you have to have a country that wants to do that 02:19:57.720 |
and those exist and there are countries like that. 02:20:03.300 |
from stronger countries that don't want them to do it. 02:20:09.740 |
over like whatever it is, 27 or whatever member countries. 02:20:21.200 |
we have a lot of control over a lot of the world. 02:20:24.360 |
But it sounds like we sit somewhere sort of in between. 02:20:26.920 |
Like right now, people are developing AI technologies 02:20:45.920 |
the other part of this is China's got a whole different 02:20:49.480 |
And so they're of course going to allow it for sure, 02:20:59.980 |
And then of course, they are going to spread their technology 02:21:12.320 |
on concepts like freedom and individual choices, 02:21:21.300 |
There are a lot of, so I'm having a lot of schizophrenic, 02:21:23.260 |
I'm having specifically a lot of schizophrenic conversations 02:21:25.340 |
with people in DC right now where if I talk to them 02:21:27.820 |
and China doesn't come up, they just like hate tech, 02:21:30.380 |
they hate American tech companies, they hate AI, 02:21:32.660 |
they hate social media, they hate this, they hate that, 02:21:35.580 |
and they just want to like punish and like ban 02:21:37.820 |
and like they're just like very, very negative. 02:21:39.860 |
But then if we have a conversation a half hour later 02:21:44.620 |
Now we need a partnership between the US government 02:22:04.340 |
it all sounds very cold, very dystopian to a lot of people. 02:22:12.540 |
- And then say you raise the issue of China 02:22:14.740 |
and then it sounds like this big dark cloud emerging 02:22:17.500 |
and then all of a sudden we need to galvanize 02:22:20.700 |
and develop these technologies to counter their effort. 02:22:23.820 |
So is it fear of them or is it competitiveness or both? 02:22:32.780 |
there's an old Bedouin saying: me against my brother, 02:22:37.660 |
me and my brother against my cousin, me and my brother and my cousin against the world, right? 02:22:48.740 |
there's a big fight between specifically tech 02:22:56.040 |
It's fundamentally a fight for status and for power, 02:23:05.180 |
to show up and change things 'cause change is bad, right? 02:23:09.620 |
It threatens the respect that people have for you 02:23:13.540 |
And so I think it's primarily a status fight, 02:23:17.500 |
But the China thing is just like a straight up geopolitical 02:23:23.080 |
And look, 20 years ago, the prevailing view in Washington 02:23:27.780 |
And we're gonna be trading partners with China. 02:23:29.300 |
And yes, they're a totalitarian dictatorship, 02:23:35.820 |
it's become more and more clear that that's just not true. 02:23:39.340 |
And now there's a lot of people in both political parties 02:23:54.420 |
like how much you trust them or don't trust them? 02:23:56.300 |
I can go on record myself by saying that early on 02:24:01.600 |
we were told as Stanford faculty that we should not 02:24:04.500 |
and could not have TikTok accounts, nor WeChat accounts. 02:24:10.660 |
there are a lot of really bright Chinese tech entrepreneurs 02:24:13.020 |
and engineers who are trying to do good things. 02:24:16.820 |
So I think that many of the people mean very well, 02:24:22.140 |
and the system is very clear and unambiguous. 02:24:25.500 |
And the system is everything in China is owned by the party. 02:24:31.260 |
So the Chinese Communist Party owns everything 02:24:33.900 |
By the way, it's actually illegal to this day, 02:24:39.060 |
There's all these like basically legal machinations 02:24:55.380 |
So they own everything, they control everything. 02:24:59.420 |
but people in China can invest in American companies 02:25:02.900 |
- Well, they can subject to US government constraints. 02:25:13.400 |
But if you can get through that approval process, 02:25:17.100 |
Whereas the same is not true with respect to China. 02:25:23.500 |
And so if you're the CEO of a Chinese company, 02:25:27.300 |
If you're the CEO of ByteDance or CEO of Tencent, 02:25:30.000 |
your relationship with the Chinese Communist Party 02:25:33.940 |
And what's required is you are a unit of the party 02:25:36.740 |
and you and your company do what the party says. 02:25:45.020 |
to optimize to a certain social result, you say yes. 02:25:47.680 |
So it's whatever Xi Jinping and his party cadres decide, 02:26:01.760 |
And at any given time, he can come down the hall, 02:26:03.880 |
he can grab you out of your staff meeting or board meeting, 02:26:06.760 |
and he can make you sit for hours and study Marxism and 02:26:09.380 |
Xi Jinping Thought and quiz you on it and test you on it, 02:26:13.900 |
Right, so it's like a straight political control thing. 02:26:17.260 |
And then by the way, if you get crosswise with them, like, 02:26:20.980 |
- So when we see tech founders getting called up to Congress 02:26:26.540 |
for, you know, what looks like interrogation, 02:26:35.700 |
They just have this view of top-down state power 02:26:40.300 |
for lots of historical and moral reasons that they've defined 02:26:44.280 |
how they want to propagate that vision outside the country. 02:26:46.700 |
And they have these programs like Belt and Road, right, 02:26:55.580 |
Like, I will say that they don't lie about it, right? 02:27:07.840 |
- And is there a goal that, you know, in 200 years, 02:27:14.780 |
- Yeah, or 20 years, 30 years, or two years, three years, 02:27:18.820 |
- I mean, look, yeah, I mean, 02:27:21.580 |
and I don't know, everybody's a little bit like this, 02:27:22.800 |
I guess, but yeah, they want to win. 02:27:25.940 |
- Well, the CRISPR in humans example that I gave earlier 02:27:30.760 |
I'm a neuroscientist and they could have edited any genes, 02:27:37.200 |
in the attempt to create super memory babies, 02:27:41.020 |
which presumably would grow into super memory adults, 02:27:43.660 |
and whether or not they succeeded in that isn't clear. 02:27:48.140 |
Those babies are alive and presumably by now walking, 02:27:53.780 |
whether or not they have super memories isn't clear, 02:27:55.560 |
but China is clearly unafraid to augment biology 02:28:11.840 |
But at some point I'm assuming people are going to augment 02:28:18.460 |
but often will select mates based on the traits 02:28:23.400 |
So this happens far more frequently than could be deemed 02:28:26.760 |
bad, because either that is bad or people are bad, since people are 02:28:30.500 |
selecting mates that have physical and psychological 02:28:33.360 |
and cognitive traits that you would like your offspring to have. 02:28:36.720 |
CRISPR is a more targeted approach of course. 02:28:40.960 |
and examples like it is that I feel like so much of the way 02:28:44.380 |
that governments and the public react to technologies 02:28:49.200 |
is to just take that first glimpse and it just feels scary. 02:28:53.280 |
You think about the old Apple ad, the 1984 ad. 02:29:01.760 |
and robots taking over and everyone like automatons. 02:29:06.420 |
where it's all about creativity, love and peace. 02:29:08.500 |
And it had the pseudo psychedelic California thing 02:29:11.480 |
Again, great marketing seems to convert people's thinking 02:29:17.340 |
about technology such that what was once viewed 02:29:26.920 |
So why are people so afraid of new technologies? 02:29:30.120 |
- So this is the thing I've tried to understand 02:29:32.020 |
for a long time because the history is so clear 02:29:34.920 |
and the history basically is every new technology 02:29:39.440 |
And so it's basically this like historical freak out 02:29:42.960 |
of some kind that causes people to basically predict 02:29:45.660 |
And you go back in time and actually this historical 02:29:48.400 |
sort of effect, it happens even in things now 02:29:50.260 |
where you just go back and it's ludicrous, right? 02:29:51.800 |
And so you mentioned earlier the satanic panic of the 80s 02:29:55.000 |
and the concern around like heavy metal music. 02:29:57.480 |
Right, before that there was like a freak out 02:29:59.060 |
around comic books in the 50s, there was a freak out 02:30:01.640 |
around jazz music in the 20s and 30s, it's devil music. 02:30:05.920 |
There was a freak out, the arrival of bicycles 02:30:07.780 |
caused a moral panic in the like 1860s, 1870s. 02:30:10.840 |
- Bicycles, yeah, so there was this thing at the time. 02:30:16.160 |
personal transportation thing that basically let kids 02:30:18.280 |
travel between towns quickly without any overhead. 02:30:22.480 |
You don't have to take care of a horse, you just jump on a bike and go. 02:30:25.880 |
And so there was a historical panic specifically 02:30:28.200 |
around at the time young women who for the first time 02:30:30.600 |
were able to venture outside the confines of the town 02:30:33.600 |
to maybe go have a boyfriend in another town. 02:30:36.160 |
And so the magazines at the time ran all these stories 02:30:38.580 |
on this phenomenon, medical phenomenon called bicycle face. 02:30:41.920 |
And the idea of bicycle face was that the exertion caused 02:30:46.480 |
your face to grimace, and then if you rode the bicycle 02:30:48.660 |
for too long your face would lock into place. 02:30:55.080 |
and therefore of course unable to then get married. 02:30:57.580 |
Cars, there was a moral panic around red flag laws. 02:31:02.760 |
The automobile freaked people out, 02:31:04.580 |
so there were all these laws created in response. 02:31:07.280 |
In the early days of the automobile, in a lot of places, 02:31:09.760 |
you would take a ride in an automobile. 02:31:13.160 |
And automobiles, they broke down all the time. 02:31:14.520 |
So it would be, only rich people had automobiles. 02:31:16.520 |
It would be you and your mechanic in the car, right, 02:31:20.560 |
And then you had to hire another guy to walk 200 yards ahead of the car carrying a red flag, 02:31:20.560 |
and so you could only drive as fast as he could walk 02:31:35.160 |
they had the most draconian version which was, 02:31:37.860 |
they were very worried about the car scaring the horses. 02:31:40.680 |
And so there was a law that said if you saw a horse coming, 02:31:44.080 |
you needed to stop the car, you had to disassemble the car, 02:31:44.080 |
hide the pieces behind the nearest hay bale, wait for the horse to go by. 02:31:48.360 |
And then you could put your car back together. 02:31:54.480 |
So anyways, another example is electric lighting. 02:31:54.480 |
There was a panic around whether this was gonna completely 02:31:59.040 |
ruin the romance of the dark 02:32:01.680 |
and it was gonna cause a whole new kind of terrible 02:32:04.360 |
civilization where everything is always brightly lit. 02:32:08.440 |
And so it's like, okay, what on earth is happening 02:32:15.900 |
The book is called Men, Machines, and Modern Times, 02:32:17.760 |
and it's written by this MIT professor 60 years ago. 02:32:28.160 |
There's a three-stage societal response to new technologies. 02:32:31.480 |
He said stage one is basically just denial, just ignore. 02:32:35.320 |
Like we just don't pay any attention to this. 02:32:37.720 |
We just, there's just a blackout on the whole topic. 02:32:46.640 |
Stage two is the rational counterargument, with all the different reasons why this can't possibly work. 02:32:48.640 |
It can't possibly ever get cheaper, this, that, 02:32:58.240 |
So he says stage three is the name-calling, right? 02:33:10.240 |
This is evil, this is terrible, this is awful. 02:33:23.560 |
is because basically fundamentally what these things are 02:33:28.160 |
It's a war over status and therefore a war over power 02:33:36.960 |
what is the societal impact of a new technology? 02:33:43.520 |
So the people who are specialists in that technology 02:33:45.840 |
become high status and the people who are specialists 02:33:48.280 |
in the previous way of doing things become low status 02:33:53.000 |
Generally, if you're the kind of person who is high status 02:34:00.600 |
under the old way of doing things, you're not the kind of person that's going to enthusiastically try to replant yourself onto the new thing. 02:34:07.760 |
about social media, like why are they so freaked out 02:34:11.080 |
that the whole nature of modern politics has changed. 02:34:13.120 |
The entire battery of techniques that you use 02:34:14.880 |
to get elected before social media are now obsolete. 02:34:17.680 |
Obviously the best new politicians of the future 02:34:19.780 |
are going to be 100% creations of social media. 02:34:23.440 |
- And we're seeing this now as we head towards 02:34:26.560 |
that podcasts clearly are going to be featured very heavily 02:34:30.360 |
in that next election because long form content 02:34:38.340 |
Like he's had like Bernie, he's had like Tulsi, 02:34:41.080 |
- RFK most recently and that's created a lot of controversy. 02:34:49.980 |
I'm sure he'd love to. - You'd have to ask him. 02:35:04.200 |
to have true long form discourse that would allow people 02:35:08.080 |
to really understand people's positions on things, 02:35:15.200 |
You know, some other podcaster undoubtedly would, right? 02:35:17.880 |
And so, my point exactly, I totally agree with you, 02:35:22.560 |
if you're, let's say a legacy politician, right, 02:35:25.440 |
you have the option of embracing the new technology. 02:35:32.520 |
Like they won't do it and why won't they do it? 02:35:34.720 |
Well, okay, first of all, they want to ignore it, right? 02:35:36.600 |
They want to pretend that things aren't changing. 02:35:41.080 |
why the existing campaign system works the way that it does 02:35:43.120 |
and this and that and the existing media networks 02:35:51.200 |
The way you succeeded was by coming up through that system. 02:35:55.400 |
And then we've now proceeded to the name calling phase, 02:35:59.900 |
Now it's evil for somebody to show up in, you know, 02:36:09.080 |
It's like, it's a classic example of this pattern. 02:36:17.960 |
This is one of those things where you can learn about it 02:36:20.480 |
and still not change anything; the entire world could learn about this 02:36:22.400 |
and still nothing changes because at the end of the day, 02:36:28.960 |
- I have a lot of thoughts about the podcast component. 02:36:33.200 |
I'll just say this because I wanna get back to 02:36:39.520 |
but on a long form podcast, there's no safe zone. 02:36:49.180 |
and certainly Joe is the best of the very best, 02:36:52.580 |
if not the most skilled podcaster in the entire universe 02:36:56.920 |
at continuing to press people on specific topics 02:37:00.180 |
when they're trying to bob and weave and wriggle out, 02:37:03.360 |
he'll just keep either drilling or alter the question 02:37:07.360 |
somewhat in a way that forces them to finally come up with an answer. 02:37:12.040 |
And I think that probably puts certain people's cortisol levels through the roof. 02:37:20.320 |
- Well, I think there's another deeper question also, 02:37:23.000 |
which is how many people actually have something to say? 02:37:28.840 |
- Like how many people can actually talk in a way 02:37:35.480 |
And like a lot of historical politics was to be able 02:37:43.040 |
the thoughts are, like even if they have deep thoughts, 02:37:47.800 |
- Well, it's going to be an interesting next, 02:37:54.000 |
- The panic and the name calling have already started, 02:37:56.240 |
- Yeah, I was going to say this list of three things, 02:37:57.980 |
denial, you know, the counterargument and name calling, 02:38:02.200 |
it seems like with AI, it's already just jumped to the name-calling stage. 02:38:17.460 |
they almost always have a 30 or 40 year history 02:38:19.200 |
where people tried and failed to get them to work 02:38:21.640 |
AI has an 80 year prehistory, so it has a very long one. 02:38:24.640 |
And then it just, it all of a sudden started to work 02:38:38.660 |
And so I actually think that's exactly what's happening. 02:38:40.560 |
I think it's actually speed running this progression 02:38:42.240 |
just because if you use Midjourney or you use GPT 02:38:46.800 |
you're just like, wow, like obviously this thing 02:38:53.200 |
that I can use it and then therefore immediately 02:38:55.760 |
you're like, oh my God, this is going to transform 02:39:02.660 |
In the face of all this, there are innovators out there. 02:39:12.620 |
or maybe they are just some young or older person 02:39:16.620 |
who has these five traits in abundance or doesn't, 02:39:21.080 |
but knows somebody who does and is partnering with them 02:39:28.980 |
at identifying these people, I think in part, 02:39:44.480 |
the world will often reconfigure itself around you 02:39:47.060 |
much more quickly and easily than you would think. 02:39:55.320 |
One is that you have a very clear understanding 02:39:58.560 |
of the inner workings of these great innovators. 02:40:06.500 |
but that also you have an intense understanding 02:40:12.840 |
for the last hour or so is that it is a really intense 02:40:17.780 |
You've got countries and organizations and the elites 02:40:20.420 |
and journalists that are trying to, not necessarily trying, 02:40:28.540 |
I mean, that's sort of the picture that I'm getting. 02:40:30.220 |
So it's like we're trying to innovate inside of a vise 02:40:35.460 |
And yet this quote argues that it is the person, 02:40:46.860 |
but my view of the world is the way the world's gonna bend. 02:40:55.960 |
I'm actually gonna uncurl the vise the other direction. 02:40:58.740 |
And so I'm at once picking up a sort of pessimistic 02:41:10.980 |
And if you would, could you tell us about that 02:41:13.820 |
from the perspective of someone listening who is thinking, 02:41:18.860 |
and I know it's a really good one 'cause I just know. 02:41:22.780 |
I might not have the confidence of extrinsic reward yet, 02:41:38.160 |
one of the ways to square it is I think you as the innovator 02:41:40.260 |
need to be signed up to fight the fight, right? 02:41:42.640 |
So, and again, this is where like the fictional portrayals fall short, 02:41:52.620 |
and they may get made to be like cute and fun. 02:41:54.800 |
And it's like, yeah, no, like if you talk to anybody 02:41:56.660 |
who actually did any of these things, like, no, 02:41:58.520 |
it was just like these things are always just like 02:42:00.420 |
brutal exercises and just like sheer willpower 02:42:08.640 |
And this kind of goes to the conscientiousness thing 02:42:12.080 |
We also, my partner Ben uses the term courage a lot, right? 02:42:15.160 |
Which is some combination of like just stubbornness, 02:42:17.520 |
but coupled with like a willingness to take pain 02:42:22.560 |
And, you know, have people think very bad things of you 02:42:24.440 |
for a long time until it turns out, you know, 02:42:26.280 |
you hopefully prove yourself, prove yourself correct. 02:42:30.000 |
Like it's a contact sport, like these aren't easy roads, 02:42:40.760 |
is at the end of the day, the truth actually matters. 02:42:47.160 |
The classic Victor Hugo quote is there's nothing more powerful 02:42:49.480 |
in the world than an idea whose time has come, right? 02:42:51.560 |
Like if it's real, right, and this is just pure substance, 02:42:58.960 |
like if it's a legitimately good scientific discovery, 02:43:03.500 |
if it's a new invention, if it's a new work of art, 02:43:09.800 |
at the end of the day, you have that on your side. 02:43:17.160 |
It's not like they're showing up with some other thing 02:43:19.980 |
and they're like, my thing is better than your thing. 02:43:23.420 |
The main problem is like, I have a thing, I'm convinced, 02:43:26.300 |
everybody else is telling me it's stupid, wrong, 02:43:30.020 |
But at the end of the day, I still have the thing, right? 02:43:35.120 |
the truth really matters, the substance really matters 02:43:39.240 |
It's really hard historically to find an example 02:43:46.120 |
And you know, nuclear is maybe an example of that, 02:43:49.360 |
but even still, you know, with nuclear, 02:43:51.000 |
there are still nuclear plants like running today. 02:43:54.960 |
You know, I would say the same thing about scientific discoveries. 02:43:59.000 |
I don't know of any scientific discovery that was made 02:44:01.000 |
and then people just abandoned it. Like, I know there are areas of science 02:44:04.340 |
that are not politically correct to talk about today, 02:44:11.520 |
I mean, even the geneticists in the Soviet Union 02:44:21.560 |
that one gets in any field establishes some core truths 02:44:24.040 |
upon which even the crazy ideas have to rest. 02:44:29.960 |
I would say that even the technologies that did not pan out 02:44:42.200 |
So the example I'll give is that most people are aware 02:44:49.600 |
You know, analyzing what's in a single drop of blood 02:45:00.600 |
as opposed to having a phlebotomist come to your house 02:45:02.560 |
or you have to go in and you get tapped with, you know, 02:45:19.980 |
But the idea of getting a wide array of markers 02:45:28.160 |
The biggest challenge that company is going to confront 02:45:30.320 |
is the idea that it's just the next Theranos. 02:45:32.160 |
But if they've got the thing and they're not fudging it 02:45:38.280 |
I think everything will work out a la Victor Hugo. 02:45:58.900 |
They're bringing the silence or counterargument, right, 02:46:04.840 |
- Well, this is why I think people who need to be loved 02:46:12.160 |
And maybe that's also why having people close to you 02:46:15.000 |
that do love you and allowing that to be sufficient 02:46:18.880 |
This gets back to the idea of partnership and family 02:46:21.400 |
around innovators, because if you feel filled up 02:46:25.140 |
by those people local to you, you know, in your home, 02:46:32.740 |
because you're good and you can forge forward. 02:46:47.700 |
where, you know, a small group of individuals 02:46:53.100 |
outdoes what a giant like Facebook might be doing 02:46:56.800 |
or what any other large company might be doing. 02:47:00.940 |
There are a lot of theories as to why that would happen, 02:47:11.280 |
- So the conventional explanation is, I think, correct. 02:47:18.920 |
which is that big organizations have a hard time actually executing anything because of all the overhead. 02:47:27.160 |
The number of people who have to be consulted, 02:47:28.860 |
who have to agree on things gets to be staggering. 02:47:31.140 |
The amount of time it takes to schedule the meeting 02:47:36.140 |
and they have some issue they're dealing with. 02:47:37.500 |
And it takes like a month to schedule the pre-meeting 02:47:42.340 |
which is then going to result in a post-meeting, 02:47:44.060 |
which will then result in a board presentation, 02:47:46.120 |
which will then result in a planning offsite, right? 02:47:50.000 |
but what you're describing is giving me hives. 02:47:56.960 |
I mean, look, you'd have these organizations, 02:47:59.360 |
like you're more of a nation state than a company. 02:48:05.420 |
you know, it's the Bedouin thing I was saying 02:48:08.980 |
your internal enemies are like way more dangerous to you 02:48:16.060 |
the big competition is for the next promotion, right? 02:48:24.380 |
The competitor on the outside is like an abstraction. 02:48:28.780 |
I got to beat that guy inside my own company, right? 02:48:31.460 |
And so the internal warfare is at least as intense 02:48:35.500 |
And so, yeah, so it's just, I mean, this is just all the, 02:48:38.020 |
you know, it's the iron law of all these big bureaucracies 02:48:41.140 |
So if a big bureaucracy ever does anything productive, 02:48:49.980 |
for like big, large organizations that actually do things. 02:48:52.100 |
Like that's great because it's like, it's so rare. 02:48:56.420 |
So anyway, so that's the conventional explanation. 02:49:04.420 |
They don't have global coverage and all these kinds of, 02:49:06.620 |
you know, they don't have the resources and so forth, 02:49:12.140 |
They can have, you know, if there's an issue today, 02:49:22.060 |
but I think there's another deeper thing underneath that 02:49:25.780 |
that takes us back full circle to where we started, 02:49:27.580 |
which is just the sheer number of people in the world 02:49:33.320 |
And so you're not gonna have a hundred of them in a company 02:49:42.780 |
- And some of them are flying too close to the sun. 02:49:44.660 |
- Some of them are blowing themselves up, right? 02:49:49.420 |
my first actual job job was at IBM when it was, 02:49:55.140 |
And so when I was there, it was 440,000 employees, 02:50:05.340 |
of like a two or three million person organization. 02:50:11.140 |
You know, we were next door to another building 02:50:12.700 |
that had another 6,000 people in another division. 02:50:16.280 |
and never meet anybody who didn't work for IBM. 02:50:20.420 |
or just introducing themselves to each other. 02:50:22.160 |
Like, it was just mind boggling, the level of complexity. 02:50:41.860 |
And so they had a system, and it worked really well 02:50:44.300 |
for like 50 years, they had a system which was, 02:50:46.740 |
most of the employees in the company were expected 02:50:50.800 |
So they dressed the same, they acted the same, 02:50:53.520 |
You know, they were trained very specifically. 02:50:56.040 |
But they had this category of people they called wild ducks. 02:50:58.960 |
And this was an idea that the founder, Thomas Watson, 02:51:02.880 |
And the wild ducks were, they often had the formal title 02:51:16.040 |
They got to go off and work on something new. 02:51:19.040 |
They got to pull people off of other projects 02:51:29.420 |
And they showed-- the one in Austin at the time 02:51:32.020 |
was this guy Andy Heller, and he would show up in jeans 02:51:35.840 |
And you know, went amongst an ocean of men in blue suits, 02:51:38.840 |
white shirts, red ties, and put his cowboy boots up on the table, 02:51:44.760 |
and it was not fine for you to do that, right? 02:51:52.140 |
within our company that gets to play by different rules. 02:51:57.460 |
Their job is to invent the next breakthrough product. 02:51:59.620 |
But we, IBM management, know that the 6,000 person division is not going to invent it. 02:52:04.980 |
We know it's going to be a crazy Andy Heller and his cowboy boots. 02:52:15.900 |
And I think that's basically the model that works. 02:52:20.920 |
Which is like, how do you have a large, bureaucratic, regimented 02:52:24.000 |
organization, whether it's academia or government 02:52:26.460 |
or business or anything, that has all these rule followers who 02:52:30.280 |
are jealous of their status and don't want things to change, 02:52:33.080 |
but then still have that spark of creativity? 02:52:49.920 |
are in the business of finding and funding the wild ducks. 02:52:54.160 |
And actually, this is actually going to close the loop. 02:52:56.400 |
This is actually, I think, the simplest explanation 02:52:58.120 |
for why IBM ultimately caved in and then HP sort of in the '80s 02:53:02.140 |
IBM and HP kind of were these incredible monolithic companies. 02:53:07.760 |
And then they kind of both caved in in the '80s and '90s. 02:53:10.140 |
And I actually think it was the emergence of venture capital. 02:53:12.760 |
It was the emergence of a parallel funding system 02:53:16.900 |
where their superstar technical people could actually leave and start their own companies. 02:53:20.680 |
And again, it goes back to the university discussion 02:53:23.040 |
It's like, this is what doesn't exist at the university level. 02:53:25.600 |
This certainly does not exist at the government level. 02:53:29.820 |
exist until there's this thing that we call podcasts. 02:53:42.320 |
But the one thing you know, right, and you know this. 02:53:44.360 |
The one thing you know is the people on the other side 02:53:47.320 |
Yeah, they're going to-- well, I think they're past denial. 02:54:01.920 |
But as with every time I talk to you, I learn oh so very much. 02:54:07.360 |
So I'm so grateful for you taking the time out 02:54:09.320 |
of your schedule to talk about all of these topics in depth 02:54:14.280 |
I'd be remiss if I didn't say that it is clear to me 02:54:16.480 |
now that you are hyper-realistic about the landscape. 02:54:21.800 |
But you are also intensely optimistic about the existence 02:54:25.160 |
of wild ducks and those around them that support them 02:54:28.360 |
that are necessary for the implementation of their ideas 02:54:32.120 |
And that also you have a real rebel inside you. 02:54:38.640 |
And it's oh so needed in these times and every time. 02:54:45.640 |
here at the podcast and especially the listeners, 02:54:50.400 |
Thank you for joining me for today's discussion 02:54:53.880 |
If you're learning from and/or enjoying this podcast, 02:54:58.080 |
That's a terrific zero-cost way to support us. 02:55:08.200 |
If you have questions for me or comments about the podcast 02:55:10.520 |
or guests that you'd like me to consider hosting 02:55:12.600 |
on the Huberman Lab Podcast, please put those in the comments section on YouTube. 02:55:20.080 |
Please also check out the sponsors mentioned at the beginning and throughout today's episode. 02:55:24.900 |
Not on today's podcast, but on many previous episodes 02:55:27.600 |
of the Huberman Lab Podcast, we discuss supplements. 02:55:30.160 |
While supplements aren't necessary for everybody, 02:55:32.280 |
many people derive tremendous benefit from them 02:55:34.440 |
for things like improving sleep, hormone support, and focus. 02:55:40.800 |
If you'd like to access the supplements discussed 02:55:51.800 |
Again, that's Live Momentous, spelled O-U-S.com/huberman. 02:55:59.000 |
our neural network newsletter is a completely 02:56:01.080 |
zero-cost monthly newsletter that includes summaries 02:56:09.660 |
tools to improve sleep, tools to improve neuroplasticity. 02:56:13.180 |
We talk about deliberate cold exposure, fitness, 02:56:18.960 |
And to sign up, you simply go to HubermanLab.com, 02:56:23.480 |
scroll down to newsletter, and provide your email. 02:56:28.040 |
If you're not already following me on social media, 02:56:32.360 |
So that's Instagram, Twitter, Threads, LinkedIn, 02:56:37.920 |
I talk about science and science-related tools, 02:56:42.600 |
but much of which is distinct from the content 02:56:46.000 |
Again, it's Huberman Lab on all social media platforms.