
Marc Andreessen: How Risk Taking, Innovation & Artificial Intelligence Transform Human Experience


Chapters

0:00 Marc Andreessen
3:02 Sponsors: LMNT & Eight Sleep
6:05 Personality Traits of an Innovator
12:49 Disagreeableness, Social Resistance; Loneliness & Group Think
18:48 Testing for Innovators, Silicon Valley
23:18 Unpredictability, Pre-Planning, Pivot
28:53 Intrinsic vs Extrinsic Motivation, Social Comparison
32:52 Sponsor: AG1
33:49 Innovators & Personal Relationships
39:24 Risk Taking, Innovators, “Martyrs to Civilizational Progress”
46:16 Cancel Culture, Public vs. Elite
53:08 Elites & Institutions, Trust
57:38 Sponsor: InsideTracker
58:44 Social Media, Shifts in Public vs. Elite
65:45 Reform & Institutions, Universities vs. Business
80:56 Alternative University; Great Awakenings; Survivorship Bias
87:25 History of Computers, Neural Network, Artificial Intelligence (AI)
95:50 Apple vs. Google, Input Data Set, ChatGPT
102:08 Deep Fakes, Registries, Public-Key Cryptography; Quantum Internet
106:46 AI Positive Benefits, Medicine, Man & Machine Partnership
112:18 AI as Best-Self Coach; AI Modalities
119:19 Gene Editing, Precautionary Principle, Nuclear Power
125:38 Project Independence, Nuclear Power, Environmentalism
132:40 Concerns about AI
138:00 Future of AI, Government Policy, Europe, US & China
143:47 China Businesses, Politics; Gene Editing
148:38 Marketing, Moral Panic & New Technology; Politics, Podcasts & AI
159:03 Innovator Development, Courage, Support
166:36 Small Groups vs. Large Organization, Agility; “Wild Ducks”
174:50 Zero-Cost Support, YouTube Feedback, Spotify & Apple Reviews, Sponsors, Momentous, Neural Network Newsletter, Social Media


00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.280 | where we discuss science and science-based tools
00:00:04.880 | for everyday life.
00:00:05.900 | I'm Andrew Huberman,
00:00:10.240 | and I'm a professor of neurobiology and ophthalmology
00:00:13.320 | at Stanford School of Medicine.
00:00:15.340 | Today, my guest is Marc Andreessen.
00:00:17.720 | Marc Andreessen is a software engineer
00:00:19.760 | and an investor in technology companies.
00:00:22.240 | He co-founded and developed Mosaic,
00:00:24.440 | which was one of the first widely used web browsers.
00:00:27.160 | He also co-founded and developed Netscape,
00:00:30.040 | which was one of the earliest widely used web browsers.
00:00:33.340 | And he co-founded and is a general partner
00:00:35.520 | at Andreessen Horowitz,
00:00:37.300 | one of the most successful Silicon Valley
00:00:39.320 | venture capital firms.
00:00:40.900 | All of that is to say that Marc Andreessen
00:00:43.400 | is one of the most successful innovators and investors ever.
00:00:47.400 | I was extremely excited to record this episode with Marc
00:00:49.640 | for several reasons.
00:00:50.920 | First of all, he himself is an incredible innovator.
00:00:53.960 | Second of all, he has an uncanny ability
00:00:56.300 | to spot the innovators of the future.
00:00:58.640 | And third, Marc has shown over and over again
00:01:00.860 | the ability to understand how technologies
00:01:03.280 | not yet even developed are going to impact
00:01:05.820 | the way that humans interact at large.
00:01:08.240 | Our conversation starts off by discussing
00:01:10.500 | what makes for an exceptional innovator,
00:01:12.880 | as well as what sorts of environmental conditions
00:01:15.320 | make for exceptional innovation and creativity
00:01:17.760 | more generally.
00:01:18.820 | In that context, we talk about risk-taking,
00:01:21.240 | not just in terms of risk-taking in one's profession,
00:01:24.020 | but about how some people, not all,
00:01:26.040 | but how some people who are risk-takers and innovators
00:01:29.020 | in the context of their work,
00:01:30.640 | also seem to take a lot of risks in their personal life
00:01:33.780 | and some of the consequences that can bring.
00:01:36.220 | Then we discuss some of the most transformative technologies
00:01:39.060 | that are now emerging,
00:01:40.440 | such as novel approaches to developing clean energy,
00:01:43.540 | as well as AI or artificial intelligence.
00:01:46.600 | With respect to AI, Marc shares his views
00:01:49.180 | as to why AI is likely to greatly improve human experience.
00:01:53.460 | And we discussed the multiple roles
00:01:54.780 | that AI is very likely to have
00:01:56.560 | in all of our lives in the near future.
00:01:58.800 | Marc explains how not too long from now,
00:02:00.960 | all of us are very likely to have AI assistants.
00:02:04.400 | For instance, assistants that give us
00:02:06.180 | highly informed health advice,
00:02:08.180 | highly informed psychological advice.
00:02:10.240 | Indeed, it is very likely that all of us
00:02:12.580 | will soon have AI assistants that govern
00:02:14.720 | most if not all of our daily decisions.
00:02:17.360 | And Marc explains how if done correctly,
00:02:20.080 | this can be a tremendously positive addition to our life.
00:02:23.440 | In doing so, Marc provides a stark counterargument
00:02:26.660 | for those that argue that AI
00:02:28.540 | is going to diminish human experience.
00:02:30.860 | So if you're hearing about and/or concerned about
00:02:33.240 | the ways that AI is likely to destroy us,
00:02:36.060 | today you are going to hear about the many different ways
00:02:38.820 | that AI technologies now in development
00:02:41.140 | are likely to enhance our human experience at every level.
00:02:44.660 | What you'll soon find is that while today's discussion
00:02:46.680 | does center around technology and technology development,
00:02:49.880 | it is really a discussion about human beings
00:02:52.240 | and human psychology.
00:02:53.680 | So whether you have an interest
00:02:54.840 | in technology development and/or AI,
00:02:57.480 | I'm certain that you'll find today's discussion
00:02:59.560 | to be an important and highly lucid view
00:03:02.080 | into what will soon be the future that we all live in.
00:03:05.320 | Before we begin, I'd like to emphasize that this podcast
00:03:07.920 | is separate from my teaching and research roles at Stanford.
00:03:10.440 | It is, however, part of my desire and effort
00:03:12.500 | to bring zero-cost-to-consumer information about science
00:03:14.960 | and science-related tools to the general public.
00:03:17.520 | In keeping with that theme,
00:03:18.600 | I'd like to thank the sponsors of today's podcast.
00:03:21.360 | Our first sponsor is Element.
00:03:23.120 | Element is an electrolyte drink
00:03:24.520 | that has everything you need and nothing you don't.
00:03:26.860 | That means plenty of the electrolytes, sodium, magnesium,
00:03:29.480 | and potassium in the correct ratios, but no sugar.
00:03:32.980 | The electrolytes and hydration are absolutely key
00:03:35.820 | for mental health, physical health, and performance.
00:03:38.440 | Even a slight degree of dehydration
00:03:40.140 | can impair our ability to think, our energy levels,
00:03:42.960 | and our physical performance.
00:03:44.600 | Element makes it very easy to achieve proper hydration,
00:03:47.600 | and it does so by including the three electrolytes
00:03:50.000 | in the exact ratios they need to be present.
00:03:52.700 | I drink Element first thing in the morning when I wake up.
00:03:54.860 | I usually mix it with about 16 to 32 ounces of water.
00:03:58.040 | If I'm exercising, I'll drink one while I'm exercising,
00:04:01.240 | and I tend to drink one after exercising as well.
00:04:04.200 | Now, many people are scared off by the idea
00:04:06.800 | of ingesting sodium, because obviously we don't want
00:04:09.520 | to consume sodium in excess.
00:04:11.080 | However, for people that have normal blood pressure,
00:04:13.280 | and especially for people that are consuming
00:04:15.120 | very clean diets, that is consuming not so many
00:04:18.580 | processed foods or highly processed foods,
00:04:21.100 | oftentimes we are not getting enough sodium,
00:04:23.460 | magnesium, and potassium, and we can suffer as a consequence.
00:04:26.420 | And with Element, simply by mixing in water,
00:04:28.380 | it tastes delicious.
00:04:29.220 | It's very easy to get that proper hydration.
00:04:31.340 | If you'd like to try Element, you can go to Drink Element,
00:04:34.260 | that's lmnt.com/huberman,
00:04:36.660 | to claim a free Element sample pack with your purchase.
00:04:39.000 | Again, that's drinklmnt.com/huberman.
00:04:42.880 | Today's episode is also brought to us by Eight Sleep.
00:04:45.500 | Eight Sleep makes smart mattress covers
00:04:47.140 | with cooling, heating, and sleep tracking capacity.
00:04:49.840 | I've spoken many times before in this podcast
00:04:52.000 | about the fact that sleep,
00:04:53.500 | that is getting a great night's sleep,
00:04:55.440 | is the foundation of all mental health,
00:04:57.660 | physical health, and performance.
00:04:58.860 | When we're sleeping well, everything goes far better.
00:05:01.180 | And when we are not sleeping well or enough,
00:05:03.420 | everything gets far worse at the level of mental health,
00:05:05.860 | physical health, and performance.
00:05:07.420 | And one of the key things to getting a great night's sleep
00:05:09.420 | and waking up feeling refreshed is that you have to control
00:05:11.960 | the temperature of your sleeping environment.
00:05:13.460 | And that's because in order to fall and stay deeply asleep,
00:05:16.540 | you need your core body temperature to drop
00:05:18.340 | by about one to three degrees.
00:05:19.860 | And in order to wake up feeling refreshed and energized,
00:05:22.660 | you want your core body temperature to increase
00:05:24.660 | by about one to three degrees.
00:05:26.460 | With Eight Sleep, it's very easy to induce that drop
00:05:29.880 | in core body temperature by cooling your mattress
00:05:32.340 | early and throughout the night
00:05:33.500 | and warming your mattress toward morning.
00:05:35.380 | I started sleeping on an Eight Sleep mattress cover
00:05:37.300 | a few years ago, and it has completely transformed
00:05:39.900 | the quality of the sleep that I get.
00:05:41.540 | So much so that I actually loathe traveling
00:05:44.140 | because I don't have my Eight Sleep mattress cover
00:05:46.060 | when I travel.
00:05:46.880 | If you'd like to try Eight Sleep,
00:05:48.140 | you can go to eightsleep.com/huberman,
00:05:50.780 | and you'll save up to $150 off their Pod 3 Cover.
00:05:54.340 | Eight Sleep currently ships in the USA, Canada, UK,
00:05:56.820 | select countries in the EU and Australia.
00:05:59.160 | Again, that's eightsleep.com/huberman.
00:06:02.260 | And now for my discussion with Marc Andreessen.
00:06:05.400 | Marc, welcome.
00:06:06.420 | - Hey, thank you.
00:06:07.820 | - Delighted to have you here and have so many questions
00:06:10.040 | for you about innovation, AI, your view of the landscape
00:06:14.220 | of tech and humanity in general.
00:06:17.340 | I want to start off by talking about innovation
00:06:20.660 | from three different perspectives.
00:06:23.300 | There's the inner game, so to speak,
00:06:25.660 | or the psychology of the innovator or innovators,
00:06:29.340 | things like their propensity for engaging in conflict
00:06:33.140 | or not, their propensity for having a dream or a vision,
00:06:37.060 | and in particular, their innovation as it relates
00:06:40.500 | to some psychological trait or expression.
00:06:45.140 | So we'll get to that in a moment.
00:06:46.640 | The second component that I'm curious about
00:06:48.820 | is the outer landscape around innovators,
00:06:51.220 | who they place themselves with,
00:06:53.900 | the sorts of choices that they make,
00:06:55.800 | and also the sorts of personal relationships
00:06:58.020 | that they might have or not have.
00:06:59.660 | And then the last component is this notion
00:07:02.860 | of the larger landscape that they happen
00:07:04.460 | to find themselves in.
00:07:05.400 | What time in history?
00:07:06.560 | What's the geography?
00:07:09.020 | Bay Area, New York, Dubai, et cetera.
00:07:12.420 | So to start off, is there a common trait of innovators
00:07:17.420 | that you think is absolutely essential as a seed
00:07:22.700 | to creating things that are really impactful?
00:07:26.180 | - Yeah, so I'm not a psychologist,
00:07:28.040 | but I've picked up some of the concepts,
00:07:30.060 | some of the terms.
00:07:31.140 | And so it was a great moment of delight in my life
00:07:33.940 | when I learned about the big five personality traits.
00:07:36.020 | 'Cause I was like, aha, there's a way
00:07:37.540 | to actually describe the answer to this question
00:07:39.540 | in at least reasonably scientific terms.
00:07:41.740 | And so I think what you're looking for
00:07:43.540 | when you're talking about real innovators,
00:07:45.460 | people who actually do really creative breakthrough work,
00:07:47.180 | I think you're talking about a couple things.
00:07:48.380 | So one is very high in what's called trait openness,
00:07:51.740 | which is one of the big five,
00:07:53.620 | which is basically just like flat out open to new ideas.
00:07:57.120 | And of course, the nature of trait openness
00:07:59.420 | is trait openness means you're not just open to new ideas
00:08:01.380 | in one category, you're open to many different kinds
00:08:03.160 | of new ideas.
00:08:04.000 | And so we might talk about the fact
00:08:05.420 | that a lot of innovators also are very creative people
00:08:07.820 | in other aspects of their lives,
00:08:09.660 | even outside of their specific creative domain.
00:08:13.020 | So that's important.
00:08:13.860 | But of course, just being open is not sufficient.
00:08:15.620 | 'Cause if you're just open,
00:08:16.460 | you could just be curious and explore,
00:08:18.060 | and spend your entire life reading
00:08:19.100 | and talking to people and never actually create something.
00:08:21.700 | So you also need a couple other things.
00:08:23.760 | You need a high level of conscientiousness,
00:08:25.620 | which is another one of the big five.
00:08:26.800 | You need somebody who's really willing to apply themselves.
00:08:29.540 | And in our world, typically over a period of many years,
00:08:32.140 | to be able to accomplish something great.
00:08:34.180 | They typically work very hard.
00:08:36.580 | That often gets obscured 'cause the stories
00:08:38.620 | that end up getting told about these people are,
00:08:40.380 | it's just like there's this kid, and he just had this idea,
00:08:42.220 | and it was like a stroke of genius,
00:08:43.400 | and it was like a moment in time,
00:08:44.620 | and it's just like, oh, he was so lucky.
00:08:46.460 | And it's like, no, for most of these people,
00:08:48.380 | it's years and years and years of applied effort.
00:08:51.200 | And so you need somebody with an extreme,
00:08:53.740 | basically, willingness to defer gratification
00:08:55.740 | and really apply themselves to a specific thing
00:08:58.200 | for a long time.
00:08:59.340 | And of course, this is why there aren't very many
00:09:01.460 | of these people, is there aren't many people
00:09:02.680 | who are high in openness and high in conscientiousness
00:09:04.860 | 'cause to a certain extent, they're opposed, right, traits.
00:09:07.820 | And so you need somebody who has both of those.
00:09:10.240 | Third is you need somebody high in disagreeableness,
00:09:12.940 | which is the third of the big five.
00:09:15.100 | So you need somebody who's just like basically ornery, right,
00:09:18.360 | because if they're not ornery,
00:09:19.700 | then they'll be talked out of their ideas
00:09:21.280 | by people who will be like, oh, well, you know,
00:09:22.820 | 'cause the reaction most people have to ideas is,
00:09:24.660 | oh, that's dumb.
00:09:25.780 | And so somebody who's too agreeable
00:09:27.640 | will be easily dissuaded to not pursue,
00:09:29.840 | you know, not pulling the thread anymore.
00:09:31.880 | So you need somebody highly disagreeable.
00:09:33.140 | Again, the nature of disagreeableness
00:09:35.140 | is they tend to be disagreeable about everything, right?
00:09:37.940 | So they tend to be these very kind of like kind of classic,
00:09:40.420 | you know, kind of renegade characters.
00:09:42.620 | And then there's just a table stakes component,
00:09:44.420 | which is they just also need to be high IQ, right?
00:09:46.380 | They just need to be really smart
00:09:47.980 | because it's just it's hard to innovate in any category
00:09:50.340 | if you can't synthesize large amounts of information quickly.
00:09:53.300 | And so those are four like basically like high spikes,
00:09:57.180 | you know, very rare traits
00:09:58.520 | that basically have to come together.
00:10:01.740 | You could probably also say they probably at some point
00:10:04.020 | need to be relatively low in neuroticism,
00:10:06.140 | which is another of the big five
00:10:07.260 | 'cause if they're too neurotic,
00:10:08.180 | they probably can't handle the stress, right?
00:10:09.900 | So it's kind of this dial in there.
00:10:11.720 | And then of course, if you're into like this sort of science
00:10:14.940 | of the big five, basically, you know,
00:10:16.180 | these are all people who are on like the far outlying
00:10:19.060 | kind of point on the normal distribution
00:10:20.820 | across all these traits.
00:10:22.140 | And then that just gets you to, I think,
00:10:23.840 | the sort of hardest topic of all around this whole concept
00:10:26.940 | which is just there are very few of these people.
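
[An illustrative aside on the arithmetic behind "there are very few of these people": if being a far outlier on each trait meant, say, landing in the top 10% of the population, and the traits were statistically independent, the fraction of people clearing all four bars shrinks multiplicatively. Both the 10% cutoff and the independence assumption are hypothetical simplifications, not figures from the conversation; a minimal Python sketch:]

```python
# Illustrative sketch only: the 10% cutoffs are hypothetical, and independence
# between traits is assumed for simplicity (real traits are correlated).
trait_cutoffs = {
    "openness": 0.10,          # top 10% in trait openness
    "conscientiousness": 0.10, # top 10% in conscientiousness
    "disagreeableness": 0.10,  # top 10% most disagreeable
    "iq": 0.10,                # top 10% in general intelligence
}

joint_fraction = 1.0
for trait, tail in trait_cutoffs.items():
    joint_fraction *= tail  # multiply tail probabilities under independence

print(f"Joint fraction: {joint_fraction:.4%}")        # 0.0100%
print(f"Roughly 1 in {round(1 / joint_fraction):,}")  # roughly 1 in 10,000
```

[Since openness and conscientiousness are, as Marc notes, somewhat opposed, the true fraction at the extremes would likely be even smaller than this independence-based estimate.]
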
00:10:29.540 | - Do you think they're born with these traits?
00:10:31.860 | - Yeah, well, so they're born with the traits
00:10:34.000 | and then of course the traits are not,
00:10:35.420 | you know, genetics are not destiny.
00:10:36.500 | And so the traits are not deterministic
00:10:38.980 | in the sense of that, you know,
00:10:39.820 | just 'cause they have those personality traits
00:10:41.460 | doesn't mean they're gonna deliver great creativity,
00:10:43.880 | but like they need to have those properties
00:10:46.660 | because otherwise they're just not either
00:10:47.700 | gonna be able to do the work
00:10:48.680 | or they're not gonna enjoy it, right?
00:10:50.460 | I mean, look, a lot of these people
00:10:51.880 | are highly capable, competent people.
00:10:53.900 | You know, it's very easy for them to get like high paying jobs
00:10:57.100 | in traditional institutions and, you know,
00:10:58.780 | get lots of, you know, traditional awards
00:11:00.420 | and, you know, end up with big paychecks.
00:11:01.860 | And, you know, there's a lot of people at, you know,
00:11:03.880 | big institutions that we, you know, you and I know well
00:11:06.300 | and I deal with many of these
00:11:07.820 | where people get paid a lot of money
00:11:08.900 | and they get a lot of respect and they go for 20 years
00:11:11.040 | and it's great and they never create anything new, right?
00:11:13.660 | And so there's a lot of--
00:11:14.500 | - Administrators.
00:11:15.340 | - Yeah, well, a lot of them, yeah,
00:11:16.500 | a lot of them end up in administrative jobs.
00:11:18.620 | And that's fine, that's good.
00:11:20.360 | The world needs, you know, the world needs that also, right?
00:11:22.380 | The innovators can't run everything 'cause everything,
00:11:24.540 | you know, the rate of change would be too high.
00:11:26.280 | Society, I think, probably wouldn't be able to handle it.
00:11:27.920 | So you need some people who are on the other side
00:11:30.120 | who are gonna kind of keep the lights on
00:11:31.360 | and keep things running.
00:11:33.080 | But there is this decision that people have to make,
00:11:35.280 | which is, okay, if I have the sort of latent capability
00:11:37.920 | to do this, is this actually what I want
00:11:39.720 | to spend my life doing?
00:11:40.800 | And do I wanna go through the stress and the pain
00:11:43.160 | and the trauma, right, and the anxiety, right,
00:11:45.760 | and the risk of failure, right?
00:11:47.360 | And so do I really want to?
00:11:48.480 | Once in a while, you run into somebody who's just like,
00:11:50.460 | can't do it any other way, like they just have to.
00:11:53.400 | - Who's an example of that?
00:11:54.240 | - I mean, Elon's the, you know,
00:11:55.840 | the paramount example of our time.
00:11:57.480 | And I bring him up in part
00:11:59.040 | because he's such an obvious example,
00:12:00.400 | but in part because he's talked about this in interviews
00:12:03.740 | where he basically says like, he's like,
00:12:05.240 | I can't turn it off.
00:12:06.080 | Like, the idea's come, I have to pursue them, right?
00:12:10.080 | It's why he's like running five companies at the same time
00:12:12.000 | and like working on a sixth, right?
00:12:14.240 | It's just like, he can't turn it off.
00:12:16.680 | You know, look, there's a lot of other people
00:12:17.960 | who probably have the capability to do it,
00:12:20.120 | who ended up talking themselves into
00:12:22.120 | or, you know, whatever events conspired
00:12:23.880 | to put them in a position where they did something else.
00:12:26.560 | You know, obviously there are people who try to be creative
00:12:28.200 | who just don't have the capability.
00:12:29.960 | And so there's some Venn diagram there
00:12:31.860 | of determinism through traits, but also choices in life.
00:12:35.960 | And then also of course, the situation in which they're born,
00:12:38.340 | the context within which they grow up, the culture, right?
00:12:41.480 | What their parents expect of them and so forth.
00:12:43.760 | And so you have to, you know,
00:12:44.720 | you kind of get all the way through this.
00:12:46.360 | You have to thread all these needles kind of at the same time.
00:12:49.560 | - Do you think there are folks out there
00:12:51.120 | that meet these criteria who are disagreeable,
00:12:54.640 | but that can feign agreeableness?
00:12:57.400 | You know, they can, for those just listening,
00:13:00.240 | Mark just raised his right hand.
00:13:01.840 | In other words, that can sort of,
00:13:04.920 | phrase that comes to mind,
00:13:07.000 | maybe because I can relate to a little bit,
00:13:08.600 | they sneak up through the system,
00:13:11.240 | meaning they behave ethically
00:13:13.040 | as it relates to the requirements of the system.
00:13:14.760 | They're not breaking laws or breaking rules.
00:13:16.580 | In fact, quite the opposite.
00:13:17.640 | They're paying attention to the rules
00:13:19.120 | and following the rules until they get to a place
00:13:21.560 | where being disagreeable feels less threatening
00:13:26.020 | to their overall sense of security.
00:13:27.760 | - Yeah, I mean, look, the really highly competent people
00:13:30.600 | don't have to break laws, right?
00:13:31.720 | Like it's, there was this myth, you know,
00:13:36.400 | that sort of happened around the movie "The Godfather."
00:13:38.060 | And then there was this character Meyer Lansky, you know,
00:13:39.800 | who's like ran basically the mafia, you know,
00:13:41.280 | 50, 60, 70 years ago.
00:13:42.500 | And there was this great line of like,
00:13:44.080 | well, if Meyer Lansky had only like applied himself
00:13:45.720 | to running General Motors,
00:13:46.560 | he would have been the best CEO of all time.
00:13:48.400 | It's like, no, not really, right?
00:13:49.620 | Like the people who are like great
00:13:51.200 | at running the big companies,
00:13:52.040 | they don't have to be mob bosses.
00:13:54.000 | They don't have to like break laws.
00:13:55.040 | They can, you know, they can do, they can work.
00:13:57.280 | They're smart and sophisticated enough
00:13:58.840 | to be able to work inside the system.
00:14:00.560 | You know, they don't need to take the easy out.
00:14:01.980 | So I don't think there's any implication
00:14:03.560 | that they have to, you know,
00:14:04.400 | that they have to break laws.
00:14:05.600 | That said, they have to break norms, right?
00:14:07.460 | And specifically the thing,
00:14:09.600 | this is probably the thing that gets missed the most,
00:14:11.200 | 'cause the process of innovating,
00:14:14.040 | the process of creating something new,
00:14:15.180 | like once it works, like the stories get retconned,
00:14:18.600 | as they say in comic books.
00:14:20.560 | So the stories get adapted
00:14:21.920 | to where it's like it was inevitable all along.
00:14:23.640 | You know, everybody always knew that this was a good idea.
00:14:25.680 | You know, the person has won all these awards.
00:14:27.640 | Society embraced them.
00:14:29.040 | And invariably, if you were with them
00:14:31.480 | when they were actually doing the work,
00:14:33.200 | or if you actually get a couple drinks into them
00:14:34.960 | and talk about it, it'd be like,
00:14:35.980 | no, that's not how it happened at all.
00:14:37.600 | They faced a wall of skepticism,
00:14:39.500 | just like a wall of basically social,
00:14:41.720 | you know, essentially denial.
00:14:42.780 | No, this is not gonna work.
00:14:43.960 | No, I'm not gonna join your lab.
00:14:45.380 | No, I'm not gonna come work for your company.
00:14:47.200 | No, I'm not gonna buy your product, right?
00:14:48.940 | No, I'm not gonna meet with you.
00:14:50.440 | And so they get just like tremendous social resistance.
00:14:54.200 | So they're not getting positive feedback
00:14:56.180 | from their social network
00:14:57.840 | the way that more agreeable people need to have, right?
00:15:00.520 | And this is why agreeableness is a problem for innovation.
00:15:03.100 | If you're agreeable,
00:15:03.940 | you're gonna listen to the people around you.
00:15:05.520 | They're gonna tell you that new ideas are stupid, right?
00:15:08.120 | End of story.
00:15:08.960 | You're not gonna proceed.
00:15:10.440 | And so I would put it more on
00:15:11.680 | like they need to be able to deal with,
00:15:13.480 | they need to be able to deal with social discomfort
00:15:15.840 | to the level of ostracism,
00:15:17.600 | or at some point they're gonna get shaken out
00:15:19.360 | and they're just gonna quit.
00:15:20.660 | - Do you think that people that meet these criteria
00:15:22.940 | do best by banding with others
00:15:24.780 | that meet these criteria early?
00:15:26.860 | Or is it important that they form this deep sense of self,
00:15:31.160 | like the ability to cry oneself to sleep at night
00:15:34.500 | or, you know, lie in the fetal position
00:15:36.440 | worrying that things aren't going to work out
00:15:38.280 | and then still get up the next morning
00:15:39.960 | and get right back out there?
00:15:41.440 | - Right.
00:15:42.280 | So Sean Parker has the best line by the way on this.
00:15:45.080 | He says, you know, being an entrepreneur, being a creator
00:15:47.400 | is like, you know, getting punched in the face,
00:15:49.040 | like over and over again.
00:15:50.000 | He said, eventually you start to like
00:15:51.120 | the taste of your own blood.
00:15:53.140 | And I love that line 'cause it makes everybody
00:15:54.740 | like massively uncomfortable, right?
00:15:56.420 | But it gives you a sense of like how basically painful
00:15:58.720 | the process is.
00:15:59.560 | If you talk to any entrepreneur, you know,
00:16:01.480 | who's been through it about that,
00:16:02.520 | they're like, oh yeah, that's exactly what it's like.
00:16:04.600 | So there is this, there is a big individual component to it.
00:16:07.960 | But look, it can be very lonely, right?
00:16:10.360 | And especially, you know, very hard I think to do this
00:16:13.120 | if nobody around you is trying to do anything
00:16:15.020 | even remotely similar, right?
00:16:16.020 | And if you're getting just universally negative responses,
00:16:18.200 | like, you know, very few people I think have,
00:16:20.160 | very few people have an ego strength
00:16:21.380 | to be able to survive that for years.
00:16:23.480 | So I do think there's a huge advantage.
00:16:25.160 | And this is why you do see clusters.
00:16:26.800 | There's a huge advantage to clustering, right?
00:16:28.580 | And so you have, and you know, throughout history,
00:16:30.800 | you've had this clustering effect, right?
00:16:32.060 | You had, you know, clustering of the great artists
00:16:33.640 | and sculptors in Renaissance Florence, you know,
00:16:35.680 | you have the clustering of the philosophers of Greece,
00:16:37.780 | you have the clustering of tech people in Silicon Valley,
00:16:39.960 | you have the clustering of creative, you know, arts,
00:16:41.720 | movie TV people in Los Angeles, right?
00:16:44.120 | And so forth and so on, you know, for, you know,
00:16:46.400 | there's always a scene, right?
00:16:47.460 | There's always like a nexus and a place
00:16:49.740 | where people come together, you know,
00:16:51.660 | for these kinds of things.
00:16:53.100 | So generally speaking, like, if somebody
00:16:54.700 | wants to work in tech, innovate in tech,
00:16:56.100 | they're going to be much better off being around a lot of people
00:16:58.280 | who are trying to do that kind of thing
00:17:00.180 | than they are in a place where nobody else is doing it.
00:17:02.900 | Having said that, the clustering has, it can have downsides.
00:17:05.600 | It can have side effects.
00:17:06.760 | And you put any group of people together
00:17:08.420 | and you do start to get groupthink,
00:17:09.820 | even among people who are individually very disagreeable.
00:17:12.980 | And so these same clusters where you
00:17:15.040 | get these very idiosyncratic people,
00:17:16.700 | they do have fads and trends just like every place else,
00:17:19.540 | right?
00:17:20.020 | And so they get wrapped up in their own social dynamics.
00:17:23.840 | And the good news is the social dynamic in those places
00:17:26.180 | is usually very forward looking, right?
00:17:28.180 | And so it's usually like, you know, it's, I don't know,
00:17:30.660 | it's like a herd of iconoclasts looking for the next big thing,
00:17:33.380 | right?
00:17:33.860 | So iconoclasts looking for the next big thing, that's good.
00:17:36.280 | The herd part, right?
00:17:37.500 | That's what you've got to be careful of.
00:17:38.580 | So even when you're in one of these environments,
00:17:40.120 | you have to be careful that you're not getting sucked
00:17:41.620 | into the groupthink too much.
00:17:43.020 | When you say groupthink, do you mean excessive friction
00:17:45.380 | due to pressure testing each other's ideas to the point
00:17:47.660 | where things just don't move forward?
00:17:49.260 | Or are you talking about groupthink
00:17:50.880 | where people start to form a consensus or the self-belief
00:17:54.540 | that, gosh, we are so strong because we are so different?
00:17:59.600 | Can we better define groupthink?
00:18:01.060 | It's actually less either one of--
00:18:02.860 | those things both happen.
00:18:04.020 | Those are good.
00:18:05.060 | Those are good.
00:18:05.980 | The part of groupthink I'm talking about
00:18:07.780 | is just like we all basically zero in.
00:18:09.900 | We just end up zeroing in on the same ideas, right?
00:18:12.300 | In Hollywood, there's this classic thing.
00:18:14.060 | It's like there are years where all of a sudden there's
00:18:16.620 | a lot of volcano movies.
00:18:18.260 | It's like, why are there all these volcano movies?
00:18:20.300 | And it's just like, I don't know.
00:18:21.100 | There was just something in the gestalt, right?
00:18:22.660 | There was just something in the air.
00:18:23.940 | You know, look, Silicon Valley has this.
00:18:25.940 | There are moments in time where you'll have these--
00:18:28.380 | it's like the old thing.
00:18:29.700 | What's the difference between a fad and a trend, right?
00:18:31.940 | A fad is the trend that doesn't last, right?
00:18:33.820 | And so Silicon Valley is subject to fads, both fads and trends,
00:18:37.500 | just like anyplace else.
00:18:39.100 | In other words, you take smart, disagreeable people,
00:18:41.140 | you cluster them together, they will act like a herd, right?
00:18:43.820 | They will end up thinking the same things,
00:18:45.580 | unless they try very hard not to.
00:18:48.140 | You've talked about these personality traits
00:18:50.220 | of great innovators before.
00:18:53.220 | And we're talking about them now.
00:18:55.220 | You invest in innovators.
00:18:56.820 | You try and identify them.
00:18:58.140 | And you are one.
00:18:59.100 | So you can recognize these traits.
00:19:00.980 | Here, I'm making the presumption that you have these traits.
00:19:03.180 | Indeed, you do.
00:19:04.820 | We'll just get that out of the way.
00:19:08.100 | Have you observed people trying to feign these traits?
00:19:12.540 | And are there any specific questions
00:19:14.540 | or behaviors that are a giveaway
00:19:17.780 | that they're pretending to be the young Steve Jobs
00:19:22.380 | or that they're pretending to be the young Henry Ford?
00:19:25.780 | Pick your list of other names
00:19:28.500 | that qualify as authentic, legitimate innovators.
00:19:32.380 | We won't name names of people
00:19:33.740 | who have tried to disguise themselves as true innovators,
00:19:36.060 | but what are some of the litmus tests?
00:19:40.380 | And I realize here that we don't want you to give these away
00:19:43.260 | to the point where they lose their potency.
00:19:45.420 | But if you could share a few of those.
00:19:47.100 | - Yeah, no, that's good.
00:19:47.940 | We're actually a pretty open book on this.
00:19:49.380 | So yeah, so first of all, yes.
00:19:51.940 | So there are people who definitely try to come in
00:19:53.580 | and basically present as being something that they're not.
00:19:55.420 | And they've read all the books.
00:19:56.820 | They will have listened to this interview, right?
00:19:58.260 | They study everything, and they construct a facade,
00:20:01.540 | and they come in and present as something they're not.
00:20:03.500 | I would say the amount of that varies
00:20:05.260 | exactly correlated to the NASDAQ, right?
00:20:09.220 | And so when stock prices are super low,
00:20:12.500 | you actually get the opposite.
00:20:13.340 | When stock prices are super low,
00:20:14.420 | people get too demoralized,
00:20:15.500 | and people who should be doing it basically give up
00:20:17.220 | 'cause they just think that whatever the industry's over,
00:20:19.460 | the trend is over, whatever, it's all hopeless.
00:20:21.140 | And so you get this flushing thing.
00:20:22.420 | So nobody ever shows up at a stock market low
00:20:24.940 | and says, "I'm the new next big thing,"
00:20:27.260 | and doesn't really want to do it
00:20:30.780 | because there are higher status.
00:20:32.780 | The kinds of people who do the thing
00:20:34.260 | that you're talking about,
00:20:35.100 | they're fundamentally oriented for social status.
00:20:36.820 | They're trying to get the social status
00:20:39.380 | without actually the substance,
00:20:41.380 | and there are always other places to go get social status.
00:20:44.020 | So after 2000, the joke was,
00:20:46.900 | so when I got to Silicon Valley in '93, '94,
00:20:50.740 | the valley was dead.
00:20:51.620 | We can talk about that.
00:20:52.660 | By '98, it was roaring,
00:20:54.020 | and you had a lot of these people showing up
00:20:55.820 | who were, you basically had a lot of people
00:20:57.900 | showing up with some kind of stories.
00:20:59.860 | 2000 market crash.
00:21:00.700 | By 2001, the joke was that there were these terms,
00:21:04.580 | B2C and B2B,
00:21:06.220 | and in 1998, B2C meant business to consumer,
00:21:09.820 | and B2B meant business to business,
00:21:11.420 | which is two different kinds of business models
00:21:12.940 | for internet companies.
00:21:13.780 | By 2001, B2B meant back to banking,
00:21:17.780 | B2C meant back to consulting,
00:21:19.220 | which is the high status people,
00:21:21.860 | the people oriented to status who showed up to be in tech
00:21:24.500 | were like, "Yeah, screw it.
00:21:25.660 | "This is over.
00:21:26.500 | "Stick a fork in it.
00:21:27.340 | "I'm gonna go back to Goldman Sachs
00:21:28.740 | "or go back to McKinsey where I can be high status,"
00:21:31.940 | and so you get this flushing kind of effect
00:21:34.540 | that happens in a downturn.
00:21:36.620 | That said, in a big upswing,
00:21:38.900 | yeah, you get a lot of people showing up
00:21:41.380 | with a lot of kind of, let's say, public persona
00:21:44.740 | without the substance to back it up.
00:21:46.940 | So the way we stress-test that,
00:21:48.340 | and you can actually say exactly how we test for this,
00:21:50.260 | because the test exactly addresses the issue
00:21:53.860 | in a way that is impossible to fake,
00:21:55.820 | and it's actually the same way homicide detectives
00:21:58.820 | try to find out if you've actually,
00:22:01.140 | if you're innocent or whether you've killed somebody,
00:22:02.300 | it's the same tactic.
00:22:04.540 | Which is you ask increasingly detailed questions, right?
00:22:08.500 | And so the way a homicide cop does this
00:22:11.060 | is what were you doing last night?
00:22:12.500 | Oh, I was at a movie.
00:22:13.740 | Well, which movie?
00:22:14.660 | Oh, which theater?
00:22:17.660 | Okay, which seat did you sit in?
00:22:19.300 | Okay, what was the end of the movie, right?
00:22:22.980 | And you ask increasingly detailed questions
00:22:24.740 | and people have trouble,
00:22:25.580 | at some point, people have trouble making up
00:22:27.380 | and things just fuzz into just kind of obvious bullshit.
00:22:29.580 | And basically, fake founders
00:22:31.100 | basically have the same problem.
00:22:32.580 | They're able to relay a conceptual theory
00:22:34.980 | of what they're doing that they've kind of engineered.
00:22:37.260 | But as they get into the details, it just fuzzes out.
00:22:39.980 | Whereas the true people that you want to back
00:22:43.020 | that can do it, basically what you find is
00:22:44.700 | they've spent five or 10 or 20 years
00:22:46.460 | obsessing on the details of whatever it is
00:22:48.340 | they're about to do.
00:22:49.180 | And they're so deep in the details
00:22:50.660 | that they know so much more about it than you ever will.
00:22:52.980 | And in fact, the best possible reaction
00:22:54.620 | is when they get mad, right?
00:22:56.380 | Which is also what the homicide cops say, right?
00:22:58.500 | Which you actually want the emotional response of like,
00:23:01.780 | "I can't believe that you're asking me questions
00:23:03.600 | this detailed and specific and picky."
00:23:06.220 | And they kind of figure out what you're doing
00:23:07.860 | and then they get upset.
00:23:09.180 | Like that's good.
00:23:10.620 | That's perfect, right?
00:23:11.620 | But then they have to prove it themselves
00:23:14.340 | in the sense of like they have to be able
00:23:15.660 | to answer the questions in great detail.
00:23:18.420 | - Do you think that people that are able
00:23:19.600 | to answer those questions in great detail
00:23:21.580 | have actually taken the time to systematically think
00:23:24.300 | through the if-ands of all the possible implications
00:23:28.740 | of what they're going to do
00:23:29.580 | and they have a specific vision in mind
00:23:31.140 | of how things need to turn out or will turn out?
00:23:34.740 | Or do you think that they have a vision
00:23:37.660 | and it's a no matter what it will work out
00:23:40.400 | because the world will sort of bend around it?
00:23:42.780 | I mean, in other words,
00:23:43.640 | do you think that they place their vision in context
00:23:46.540 | or they simply have a vision
00:23:47.900 | and they have that tunnel vision of that thing
00:23:50.580 | and that's going to be it.
00:23:51.740 | Let's use you for an example with Netscape.
00:23:54.460 | I mean, that's how I first came to know your name.
00:23:58.340 | When you were conceiving Netscape, did you think,
00:24:01.900 | okay, there's this search engine and this browser
00:24:05.620 | and it's going to be this thing that looks this way
00:24:08.460 | and works this way and feels this way.
00:24:10.760 | Did you think that and also think about, you know,
00:24:15.280 | that there was going to be a gallery of other search engines
00:24:18.140 | and it would fit into that landscape of other search engines
00:24:20.540 | or were you just projecting your vision of this thing
00:24:23.100 | as this unique and special brainchild?
00:24:27.420 | So I'm going to give the general answer
00:24:28.660 | and then we can talk about the specific example.
00:24:30.220 | So the general answer is with entrepreneurship,
00:24:32.780 | creativity, innovation is what economists call
00:24:34.860 | decision-making under uncertainty, right?
00:24:37.020 | And so in both parts, it's important, decision-making,
00:24:38.860 | like you're going to make a ton of decisions
00:24:40.020 | because you have to decide what to do, what not to do
00:24:41.620 | and then uncertainty, which is like,
00:24:43.580 | the world's a complicated place, right?
00:24:45.500 | And in mathematical terms,
00:24:46.900 | the world is a complex adaptive system with feedback loops
00:24:49.300 | and like, it's really, I mean, it's extreme, you know.
00:24:53.380 | Isaac Asimov wrote, you know, in his novels,
00:24:55.380 | he wrote about this field called psychohistory
00:24:57.340 | which is the idea that there's like a supercomputer
00:24:59.060 | that can predict the future of like human affairs, right?
00:25:01.100 | And it's like, we don't have that.
00:25:03.260 | - Not yet.
00:25:04.100 | - Not yet, not yet.
00:25:04.940 | - We'll get to that later.
00:25:06.620 | - We certainly don't have that yet.
00:25:08.660 | And so you're just dealing, you know,
00:25:10.460 | military commanders call this the fog of war, right?
00:25:13.220 | You're just dealing with a situation
00:25:14.860 | where the number of variables are just off the charts.
00:25:17.260 | It's all these other people, right?
00:25:19.020 | Who are inherently unpredictable,
00:25:20.260 | making all these decisions in different directions.
00:25:22.140 | And then the whole system is combinatorial,
00:25:23.820 | which is these people are colliding with each other,
00:25:25.260 | influencing their decisions.
00:25:26.660 | And so, I mean, look, the most straightforward kind of way
00:25:29.940 | to think about this is it's just, it's amazing.
00:25:32.260 | Like anybody who believes in economic central planning,
00:25:33.940 | it always blows my mind 'cause it's just like,
00:25:35.340 | it's just like try opening a restaurant.
00:25:37.540 | Like try just opening a restaurant on the corner down here
00:25:40.660 | and like 50/50 odds the restaurant's gonna work.
00:25:43.260 | And like all you have to do to run a restaurant
00:25:44.940 | is like have a thing and serve food and like,
00:25:47.020 | and it's like most restaurants fail, right?
00:25:48.460 | And so, and restaurant people who run restaurants
00:25:49.900 | are like pretty smart.
00:25:50.740 | Like they're, you know, they usually think
00:25:52.060 | about these things very hard and they all wanna succeed.
00:25:54.220 | And it's hard to do that.
00:25:55.420 | And so to start a tech company
00:25:56.700 | or to start an artistic movement or to fight a war,
00:26:00.300 | like you're just going into this,
00:26:01.820 | like basically about conceptual battleground
00:26:03.740 | or military terms, real battleground
00:26:05.660 | where there's just like incredible levels of complexity,
00:26:08.300 | branching future paths.
00:26:10.100 | And so there's nothing, it's, you know,
00:26:11.700 | there's nothing predictable.
00:26:12.540 | And so what we look for is basically the sort of drive,
00:26:16.460 | the really good innovators, they've got a drive
00:26:18.700 | to basically be able to cope with that and deal with that.
00:26:20.820 | And they basically do that in two steps.
00:26:22.540 | So one is they try to pre-plan as much as they possibly can.
00:26:25.340 | And we call that the process of navigating,
00:26:27.580 | we call the idea maze, right?
00:26:29.020 | And so the idea maze basically is I've got this general idea
00:26:31.700 | and it might be the internet's gonna work
00:26:32.980 | or search or whatever.
00:26:34.220 | And then it's like, okay, in their head,
00:26:35.940 | they have thought through of like,
00:26:37.260 | okay, if I do it this way, that way, this third way,
00:26:39.300 | here's what will happen, then I have to do that,
00:26:40.900 | then I have to do this,
00:26:41.740 | then I have to bring in somebody to do that.
00:26:43.140 | Here's the technical challenge I'm gonna hit.
00:26:44.900 | And they've got in their heads as best anybody could,
00:26:47.900 | they've got as complete as sort of a map
00:26:50.060 | of possible futures as they could possibly have.
00:26:51.940 | And this is where I say,
00:26:53.340 | when you ask them increasingly detailed questions,
00:26:54.860 | that's what you're trying to kind of get them
00:26:56.260 | to kind of chart out is, okay,
00:26:57.940 | how far ahead have you thought
00:26:59.620 | and how much are you anticipating
00:27:01.220 | all of the different twists and turns
00:27:02.500 | that this is gonna take?
00:27:03.900 | Okay, so then they start on day one.
00:27:05.380 | And then of course, what happens is,
00:27:06.700 | now they're in it, now they're in the fog of war, right?
00:27:09.300 | They're in future uncertainty.
00:27:10.340 | And now that idea maze is maybe not helpful practically,
00:27:13.120 | but now they're gonna be basically constructing
00:27:14.780 | on the fly day by day as they learn
00:27:16.420 | and discover new things.
00:27:17.820 | And as the world changes around them.
00:27:19.060 | And of course, it's a feedback loop
00:27:20.180 | 'cause they're gonna change, if their thing starts to work,
00:27:22.020 | it's gonna change the world.
00:27:22.940 | And then the fact the world is changing
00:27:24.220 | is gonna cause their plan to change as well.
00:27:27.460 | And so, yeah, the great ones basically,
00:27:29.980 | they course correct,
00:27:31.740 | the great ones course correct every single day.
00:27:33.860 | They take stock of what they've learned.
00:27:36.340 | They modify the plan.
00:27:38.100 | The great ones tend to think in terms of hypotheses, right?
00:27:40.460 | It's a little bit like a scientific sort of mentality,
00:27:42.580 | which is they tend to think, okay, I'm gonna try this.
00:27:45.660 | I'm gonna go into the world,
00:27:46.500 | I'm gonna announce that I'm doing this for sure.
00:27:48.780 | I'm gonna say like, this is my plan.
00:27:50.660 | And I'm gonna tell all my employees that.
00:27:51.820 | I'm gonna tell all my investors that.
00:27:52.900 | I'm gonna put a stake in there, that's my plan.
00:27:54.060 | And then I'm gonna try it, right?
00:27:55.740 | And even though I sound like I have complete certainty,
00:27:57.700 | I know that I need to test
00:27:58.900 | to find out whether it's gonna work.
00:28:00.380 | And if it's not, then I have to go back
00:28:02.060 | to all those same people and I have to say,
00:28:03.420 | well, actually, we're not going left, we're going right.
00:28:05.740 | And they have to run that loop thousands of times, right?
00:28:09.060 | And they had to get through the other side.
00:28:10.260 | And this led to the creation of this great term pivot,
00:28:13.220 | which has been very helpful in our industry
00:28:14.660 | 'cause the word when I was young,
00:28:16.540 | the word we used was fuck up and pivot
00:28:18.900 | sounds like so much better.
00:28:20.500 | It sounds like so much more professional.
00:28:22.260 | But yeah, you like make mistakes.
00:28:24.540 | It's just too complicated to understand.
00:28:26.460 | You course correct, you adjust, you evolve.
00:28:29.100 | Often these things, at least in business,
00:28:31.180 | the businesses that end up working really well
00:28:32.780 | tend to be different than the original plan.
00:28:35.100 | But that's part of the process of a really smart founder
00:28:38.380 | basically working their way through reality, right?
00:28:40.900 | As they're executing their plan.
00:28:42.860 | - The way you're describing this has parallels
00:28:44.820 | to a lot of models in biology
00:28:46.420 | and the practice of science, random walks,
00:28:49.100 | but that aren't truly random,
00:28:50.460 | pseudo random walks in biology, et cetera.
00:28:52.780 | But one thing that is becoming clear
00:28:55.460 | from the way you're describing this is that
00:28:57.740 | I could imagine a great risk to early success.
00:29:01.560 | So for instance, somebody develops a product,
00:29:03.460 | people are excited by it.
00:29:05.380 | They start to implement that product,
00:29:06.640 | but then the landscape changes
00:29:08.420 | and they don't learn how to pivot
00:29:10.720 | to use the less profane version of it, right?
00:29:13.860 | They don't learn how to do that.
00:29:15.300 | In other words, the, and I think of everything these days,
00:29:18.700 | or most everything in terms of reward schedules
00:29:21.040 | and dopamine reward schedules,
00:29:22.800 | 'cause that is the universal currency of reward.
00:29:25.340 | And so when you talk about the Sean Parker quote
00:29:28.100 | of learning to enjoy the taste of one's own blood,
00:29:31.300 | that is very different
00:29:32.700 | than learning to enjoy the taste of success, right?
00:29:35.620 | It's about internalizing success
00:29:38.180 | as a process of being self-determined
00:29:40.740 | and less agreeable, et cetera.
00:29:43.140 | In other words, building up of those five traits
00:29:46.360 | becomes the source of dopamine, perhaps,
00:29:48.780 | in a way that's highly adaptive.
00:29:50.200 | So on the outside, we just see the product, the end product,
00:29:52.880 | the iPhone, the MacBook, the Netscape, et cetera.
00:29:55.820 | But I have to presume, and I'm not a psychologist,
00:29:59.420 | but I have done neurophysiology
00:30:00.840 | and I've studied the dopamine system enough to know that
00:30:04.500 | what's being rewarded in the context
00:30:06.660 | of what you're describing
00:30:08.060 | sounds to be a reinforcement of those five traits,
00:30:10.740 | rather than, oh, it's going to be this particular product,
00:30:14.040 | or the company's going to look this way,
00:30:15.300 | or the logo is going to be this or that.
00:30:16.940 | That all seems like the peripheral to what's really going on,
00:30:21.940 | that great innovators are really in the process
00:30:24.620 | of establishing neural circuitry
00:30:26.980 | that is all about reinforcing the me
00:30:30.100 | and the process of being me.
00:30:32.220 | - Yeah, yeah, so this goes to, yeah,
00:30:34.460 | so this is like extrinsic versus intrinsic motivation.
00:30:36.700 | So the Steve Jobs kind of Zen version of this, right,
00:30:39.100 | or the sort of hippie version of this was,
00:30:40.580 | the journey is the reward, right?
00:30:42.380 | He always told his employees that.
00:30:44.460 | It's like, look, everybody thinks in terms
00:30:46.140 | of these big public markers, like the stock price
00:30:47.940 | or the IPO or the product launch or whatever.
00:30:49.740 | He's like, no, it's actually the process itself
00:30:52.700 | is the point, right?
00:30:54.240 | And to your point, if you have that mentality,
00:30:56.580 | then that's an intrinsic motivation,
00:30:58.160 | not an extrinsic motivation.
00:30:59.700 | And so that's the kind of intrinsic motivation
00:31:01.140 | that can keep you going for a long time.
00:31:03.620 | Another way to think about it
00:31:04.480 | is competing against yourself, right?
00:31:05.900 | It's like, can I get better at doing this, right?
00:31:08.820 | And can I prove to myself that I can get better?
00:31:11.220 | There's also a big social component to this,
00:31:13.140 | and this is one of the reasons why Silicon Valley
00:31:15.060 | punches so far above its weight as a place.
00:31:17.580 | There's a psychological component,
00:31:18.660 | which also goes to the comparison set.
00:31:21.140 | So a phenomenon that we've observed over time
00:31:23.020 | is the leading tech company in any city
00:31:26.620 | will aspire to be as large as the previous
00:31:28.900 | leading tech company in that city,
00:31:31.240 | but often not larger, right?
00:31:32.900 | 'Cause they have a model of success,
00:31:35.140 | and as long as they beat that level of success,
00:31:36.860 | they've kind of checked the box,
00:31:38.200 | like they've made it, and then they,
00:31:40.740 | but then in contrast, you're in Silicon Valley,
00:31:42.340 | and you look around, and it's just like Facebook,
00:31:44.140 | and Cisco, and Oracle, and Hewlett-Packard, and--
00:31:47.040 | - Gladiators.
00:31:47.880 | - Yeah, and you're just looking at these giants,
00:31:51.380 | and many of them are still,
00:31:53.100 | Mark Zuckerberg's still going to work every day
00:31:54.740 | and trying to do, and so these people are like,
00:31:58.280 | the role models are like alive, right?
00:32:00.260 | And they're like right there, right?
00:32:01.880 | And it's so clear how much better they are
00:32:03.960 | and how much bigger their accomplishments are,
00:32:05.460 | and so what we find is young founders
00:32:07.540 | in that environment have much greater aspirations, right?
00:32:11.220 | 'Cause they just, again, maybe at that point,
00:32:13.280 | maybe it's the social status,
00:32:14.340 | maybe there's an extrinsic component to that,
00:32:16.960 | but or maybe it helps calibrate that internal system
00:32:19.500 | to basically say actually, you know,
00:32:21.120 | no, the opportunity here is not to build a local,
00:32:23.220 | you know, what you may call local maximum form of success,
00:32:25.100 | but let's build to a global maximum form of success,
00:32:27.660 | which is something as big as we possibly can.
00:32:30.500 | Ultimately, the great ones are probably driven
00:32:32.380 | more internally than externally when it comes down to it,
00:32:34.700 | and that is where you get this phenomenon
00:32:36.260 | where you get people who are, you know,
00:32:37.460 | extremely successful and extremely wealthy
00:32:39.140 | who very easily can punch out and move to Fiji
00:32:41.700 | and just call it, and they're still working 16-hour days,
00:32:45.100 | right, and so obviously something explains that
00:32:47.420 | that has nothing to do with external rewards,
00:32:49.060 | and I think it's an internal thing.
00:32:51.060 | - As many of you know, I've been taking AG1 daily since 2012,
00:32:56.260 | so I'm delighted that they're sponsoring the podcast.
00:32:58.720 | AG1 is a vitamin mineral probiotic drink
00:33:01.020 | that's designed to meet
00:33:01.860 | all of your foundational nutrition needs.
00:33:04.100 | Now, of course, I try to get enough servings
00:33:05.720 | of vitamins and minerals through whole food sources
00:33:08.020 | that include vegetables and fruits every day,
00:33:10.260 | but oftentimes I simply can't get enough servings,
00:33:12.940 | but with AG1, I'm sure to get enough vitamins and minerals
00:33:15.900 | and the probiotics that I need,
00:33:17.660 | and it also contains adaptogens to help buffer stress.
00:33:20.840 | Simply put, I always feel better when I take AG1.
00:33:23.640 | I have more focus and energy, and I sleep better,
00:33:26.180 | and it also happens to taste great.
00:33:28.420 | For all these reasons, whenever I'm asked,
00:33:30.260 | if you could take just one supplement, what would it be?
00:33:32.980 | I answer, AG1.
00:33:34.660 | If you'd like to try AG1, go to drinkag1.com/huberman
00:33:39.200 | to claim a special offer.
00:33:40.820 | They'll give you five free travel packs
00:33:42.580 | plus a year's supply of vitamin D3K2.
00:33:45.240 | Again, that's drinkag1.com/huberman.
00:33:48.880 | - I've heard you talk a lot about the inner landscape,
00:33:53.700 | the inner psychology of these folks,
00:33:55.580 | and I appreciate that we're going
00:33:57.000 | even deeper into that today,
00:33:58.600 | and we will talk about the landscape around,
00:34:00.880 | whether or not Silicon Valley or New York,
00:34:02.520 | whether or not there are specific cities
00:34:03.860 | that are ideal for certain types of pursuits.
00:34:05.860 | I think there was an article written by Paul Graham
00:34:07.500 | some years ago about how the conversations
00:34:10.460 | that you overhear in a city will tell you everything
00:34:12.480 | you need to know about whether or not you belong there
00:34:15.100 | in terms of your professional pursuits.
00:34:18.180 | Some of that's changed over time,
00:34:19.460 | and now we should probably add Austin to the mix
00:34:22.260 | 'cause it was written some time ago.
00:34:24.760 | In any event, I want to return to that,
00:34:27.120 | but I want to focus on an aspect
00:34:29.100 | of this intrinsic versus extrinsic motivators
00:34:33.140 | in terms of something that's a bit more cryptic,
00:34:35.680 | which is one's personal relationships.
00:34:38.740 | You know, if I think about the catalog of innovators
00:34:41.720 | in Silicon Valley, some of them, like Steve Jobs,
00:34:44.220 | had complicated personal lives,
00:34:45.860 | romantic personal lives early on,
00:34:47.540 | and then it sounds like he worked it out.
00:34:48.920 | I don't know, I wasn't their couples therapist,
00:34:51.820 | but when he died, he was in a marriage
00:34:55.460 | that for all the world seemed like a happy marriage.
00:34:58.060 | You also have examples of innovators
00:35:01.340 | who have had many partners, many children
00:35:03.400 | with other partners.
00:35:04.240 | Elon comes to mind.
00:35:05.840 | You know, I don't think I'm disclosing anything
00:35:07.840 | that isn't already obvious.
00:35:09.180 | Those could have been happy relationships
00:35:11.980 | and just had many of them.
00:35:13.480 | But the reason I'm asking this is you can imagine
00:35:15.960 | that for the innovator, the person with these traits
00:35:19.700 | who's trying to build up this thing, whatever it is,
00:35:23.700 | that having someone, or several people in some cases,
00:35:29.420 | who just truly believe in you,
00:35:32.340 | when the rest of the world may not believe in you yet
00:35:34.420 | or at all, could be immensely powerful.
00:35:36.760 | And we have examples from cults that embody this.
00:35:41.220 | We have examples from politics.
00:35:42.540 | We have examples from tech innovation and science.
00:35:46.260 | And I've always been fascinated by this
00:35:47.700 | because I feel like it's the more cryptic
00:35:49.760 | and yet very potent form of allowing someone
00:35:53.300 | to build themselves up.
00:35:54.960 | It's a combination of inner psychology
00:35:57.580 | and extrinsic motivation because obviously
00:35:59.500 | if that person were to die or leave them
00:36:01.380 | or cheat on them or pair up with some other innovator,
00:36:05.020 | which we've seen several times, recently and in the past,
00:36:08.680 | it can be devastating to that person.
00:36:10.220 | But what are your thoughts on the role of personal
00:36:13.600 | and in particular romantic relationship
00:36:15.340 | as it relates to people having an idea
00:36:18.720 | and their feeling that they can really bring that idea
00:36:21.460 | to fruition in the world?
00:36:22.980 | - So it's a real mixed bag.
00:36:24.500 | You have lots of examples in all directions.
00:36:26.580 | And I think it's something like following.
00:36:29.220 | So first is we talked about the personality traits
00:36:31.360 | of these people.
00:36:32.200 | They tend to be highly disagreeable.
00:36:33.980 | - Doesn't foster good romantic relationships.
00:36:35.660 | - Highly disagreeable people can be difficult
00:36:37.420 | to be in relationships with.
00:36:38.260 | - I may have heard of that once or twice before.
00:36:39.580 | A friend may have given me that example.
00:36:41.500 | - Yeah, right?
00:36:42.340 | And maybe you just need to find the right person
00:36:43.700 | who like complements that and is willing to,
00:36:45.580 | there's a lot of relationships where like,
00:36:46.860 | it's always this question about relationships, right?
00:36:48.420 | Which is, do you want to have the same personality,
00:36:50.540 | growth profile, the same behavioral traits
00:36:52.100 | basically as your partner?
00:36:53.020 | Or do you actually want to have,
00:36:54.660 | is it an opposites thing?
00:36:56.140 | And, you know, I'm sure you've seen this.
00:36:57.980 | There are relationships where you'll have somebody
00:36:59.260 | who's highly disagreeable,
00:37:00.100 | who's paired with somebody who's highly agreeable.
00:37:01.500 | And it actually works out great.
00:37:03.140 | 'Cause one person just gets to be on their soapbox
00:37:04.700 | all the time and the other person's just like, okay.
00:37:06.980 | Right, and it's fine, right?
00:37:08.860 | It's fine, it's good.
00:37:10.340 | You know, you put two disagreeable people together,
00:37:12.260 | you know, maybe sparks fly
00:37:13.660 | and they have great conversations all the time
00:37:15.160 | and maybe they come to hate each other, right?
00:37:16.580 | And so, so anyway, so these people,
00:37:19.100 | if you're gonna be with one of these people,
00:37:20.520 | you're fishing out of the disagreeable end of the pond.
00:37:22.640 | And again, when I say disagreeable,
00:37:23.780 | I don't mean, you know, these are normal distributions.
00:37:25.900 | I don't mean like 60% disagreeable or 80% disagreeable.
00:37:28.780 | The people we're talking about are 99.99% disagreeable,
00:37:31.740 | right, so these are ornery people.
00:37:34.020 | So part of it's that.
00:37:35.940 | And then, of course,
00:37:36.780 | they have the other personality traits, right?
00:37:38.020 | They're, you know, super conscientious,
00:37:39.460 | they're super driven.
00:37:40.340 | As a consequence, they tend to work really hard.
00:37:41.920 | They tend to not have a lot of time for, you know,
00:37:43.560 | family vacations or other things.
00:37:45.340 | You know, and they don't enjoy them
00:37:46.780 | if they're forced to go on them.
00:37:47.740 | And so again, that kind of thing can fray in a relationship.
00:37:50.420 | So there's a fair amount in there that's loaded.
00:37:53.800 | Like somebody's gonna partner with one of these people
00:37:55.740 | and needs to be signed up for the ride.
00:37:57.340 | And that's a hard thing, you know,
00:37:59.140 | that's a hard thing to do.
00:38:00.740 | Or you need a true partnership with two of these,
00:38:02.460 | which is also hard to do.
00:38:04.260 | So I think that's part of it.
00:38:06.180 | And then look, I think a big part of it is, you know,
00:38:07.780 | people achieve a certain level of success
00:38:10.140 | and, you know, either in their own minds or, you know,
00:38:12.980 | publicly, and then they start to be able
00:38:14.940 | to get away with things, right?
00:38:16.120 | And they start to be able to, it's like, well, okay,
00:38:18.100 | you know, now we're rich and successful and famous
00:38:20.100 | and now I deserve, you know,
00:38:21.540 | and this is where you get into,
00:38:22.740 | I view this now in the realm of personal choice, right?
00:38:25.340 | You get in this thing where people start to think
00:38:26.980 | that they deserve things.
00:38:27.900 | And so they start to behave in, you know, very bad ways.
00:38:30.900 | And then they blow up their personal worlds as a consequence
00:38:33.660 | and maybe they regret it later and maybe they don't, right?
00:38:36.300 | It's always a question.
00:38:38.760 | So yeah, so I think there's that.
00:38:40.460 | And then I don't know, like, yeah, some people just need,
00:38:44.140 | maybe the other part of it is,
00:38:44.980 | some people just need more emotional support than others.
00:38:46.940 | And I don't know that that's a big,
00:38:48.320 | I don't know that that tilts either way.
00:38:50.000 | Like I know some of these people
00:38:51.340 | who have like great loving relationships
00:38:53.020 | and seem to draw very much
00:38:54.260 | on having this kind of firm foundation to rely upon.
00:38:57.140 | And then I know other people
00:38:58.020 | who are just like their personal lives
00:38:59.140 | are just a continuous train wreck
00:39:00.340 | and it doesn't seem to matter.
00:39:02.900 | Like professionally, they just keep doing what they're doing.
00:39:04.940 | And maybe we could talk here about like, you know,
00:39:07.700 | whatever is the personality trait for risk taking, right?
00:39:10.820 | Some people are so incredibly risk prone
00:39:13.260 | that they need to take risk
00:39:14.140 | in all aspects of their lives at all times.
00:39:15.980 | And if part of their life gets stable,
00:39:18.060 | they find a way to blow it up.
00:39:20.220 | And that's some of these people
00:39:22.260 | you could describe in those terms also.
00:39:24.120 | - Yeah, let's talk about that
00:39:25.780 | because I think risk taking and sensation seeking
00:39:29.540 | is something that fascinates me for my own reasons
00:39:32.720 | and in my observations of others.
00:39:34.580 | Does it dovetail with these five traits
00:39:39.220 | in a way that can really serve innovation
00:39:42.060 | in ways that can benefit everybody?
00:39:43.460 | The reason I say to benefit everybody
00:39:44.820 | is because there is a view
00:39:46.900 | of how we're painting this picture of the innovator
00:39:50.220 | as this like really cruel person.
00:39:53.180 | But oftentimes what we're talking about
00:39:54.500 | are innovations that make the world far better
00:39:56.600 | for billions of people.
00:39:58.080 | - Yeah, that's right.
00:39:59.560 | And by the way, everything we're talking about also
00:40:00.940 | is not just in tech or science or in business.
00:40:03.160 | It's also, everything we're also talking about
00:40:04.240 | is true for the arts, right?
00:40:05.640 | So the history of like artistic expression
00:40:07.840 | is you have people with all these same kinds of traits.
00:40:09.880 | - Well, I was thinking about Picasso
00:40:10.960 | and his regular turnover of lovers and partners.
00:40:13.820 | And he was very open about the fact
00:40:15.200 | that it was one of the sources
00:40:16.520 | of his productivity/creativity.
00:40:20.240 | He wasn't shy about that.
00:40:21.640 | I suppose if he were alive today,
00:40:24.020 | it might be a little bit different.
00:40:24.860 | He might be judged a little differently.
00:40:26.260 | - Right, or that was his story for behaving in a pattern
00:40:29.420 | that was very awful for the people around him
00:40:31.600 | and he didn't care.
00:40:32.440 | - Right, maybe they left him.
00:40:33.620 | - Yeah, like who knows, right?
00:40:34.840 | So it puts and takes to all this.
00:40:37.620 | But no, okay, so I have a theory.
00:40:39.620 | So here's a theory.
00:40:40.660 | This is one of these.
00:40:41.500 | I keep a list of topics that will get me kicked
00:40:42.500 | out of a dinner party at any given point in time.
00:40:45.900 | - Do you read it before you go in?
00:40:47.020 | - Yeah, I just, yeah.
00:40:48.020 | (laughing)
00:40:50.580 | I'll recall so that I can get out of these things.
00:40:53.180 | But so here's the thing that can get me kicked
00:40:55.540 | out of a dinner party, especially these days.
00:40:58.480 | So think of the kind of person where it's like very clear
00:41:02.140 | that they're like super high, to your point,
00:41:04.220 | somebody is super high output in whatever domain they're in.
00:41:06.180 | They've done things that have like fundamentally
00:41:07.620 | like changed the world.
00:41:08.580 | They've brought new, whether it's businesses or technologies
00:41:10.780 | or works of art, entire schools of creative expression,
00:41:15.260 | in some cases, to the world.
00:41:16.560 | And then at a certain point,
00:41:18.160 | they blow themselves to smithereens, right?
00:41:20.080 | And they do that either through like a massive
00:41:22.460 | like financial scandal.
00:41:23.460 | They do that through a massive personal breakdown.
00:41:25.920 | They do that through some sort of public expression
00:41:28.460 | that causes them a huge amount of problems.
00:41:30.140 | They say the wrong thing, maybe not once,
00:41:31.880 | but several hundred times,
00:41:34.100 | and blow themselves to smithereens.
00:41:35.820 | And there's this kind of arc,
00:41:38.020 | there's this moral arc that people kind of want to apply,
00:41:40.100 | which is like the Icarus flying too close to the sun.
00:41:43.620 | And he had it coming and he needed to keep his ego
00:41:45.980 | under control.
00:41:46.820 | And you get kind of this judgment
00:41:49.620 | that applies.
00:41:50.760 | So I have a different theory on this.
00:41:52.760 | So the term I use to describe these people,
00:41:54.840 | and a lot of, and by the way,
00:41:55.960 | a lot of other people who don't actually
00:41:56.960 | blow themselves up but get close to it,
00:41:59.340 | which is a whole other set of people,
00:42:01.640 | I call them martyrs to civilizational progress, right?
00:42:05.160 | So, let's work backwards: civilizational progress.
00:42:07.600 | So look, the only way civilization gets moved forward
00:42:10.520 | is when people like this do something new, right?
00:42:12.900 | 'Cause civilization as a whole does not do new things, right?
00:42:16.280 | Groups of people do not do new things, right?
00:42:18.320 | These things don't happen automatically.
00:42:20.320 | Like by default, nothing changes.
00:42:22.960 | The only way civilizational change
00:42:24.720 | on any of these axes ever happens
00:42:26.680 | is because one of these people stands up and says,
00:42:29.200 | no, I'm going to do something different
00:42:30.440 | than what everybody else has ever done before.
00:42:32.000 | So this is progress.
00:42:34.560 | Like this is actually how it happens.
00:42:36.280 | Sometimes they get lionized or awarded.
00:42:38.180 | Sometimes they get crucified.
00:42:39.880 | Sometimes the crucification is literal.
00:42:42.120 | Sometimes it's just symbolic,
00:42:43.560 | but like they are those kinds of people.
00:42:46.980 | And then, and then, the martyrs.
00:42:48.440 | Like when they go down in flames, like they have,
00:42:51.360 | and again, this is where it really screws with
00:42:52.720 | people's moral judgments,
00:42:53.620 | 'cause everybody wants to have this sort of
00:42:55.880 | super clear story of like, okay,
00:42:57.140 | he did a bad thing and he was punished.
00:42:59.540 | And I'm like, no, no, no, no, no, no.
00:43:00.940 | He was the kind of person who was going to do great things
00:43:03.780 | and also was going to take on a level of risk
00:43:06.300 | and take on a level of sort of extreme behavior
00:43:08.620 | such that he was going to expose himself
00:43:10.520 | to flying too close to the sun,
00:43:12.220 | wings melt and crash to ground.
00:43:13.320 | But it's a package deal, right?
00:43:16.260 | The reason you have the Picassos and the Beethovens
00:43:18.680 | and all these people is because they're willing
00:43:20.360 | to take these extreme level of risks.
00:43:21.980 | They are that creative and original,
00:43:24.100 | not just in their art or their business,
00:43:25.820 | but in everything else that they do,
00:43:27.340 | that they will set themselves up to be able to fail.
00:43:29.500 | Psychologically, you know, a psychologist would probably,
00:43:31.380 | a psychiatrist would probably say, you know, maybe,
00:43:33.020 | you know, to what extent do they actually
00:43:34.120 | like have a death wish?
00:43:34.960 | Do they actually, you know, at some point,
00:43:36.260 | do they want to punish themselves?
00:43:37.300 | Do they want to fail?
00:43:38.120 | That I don't know.
00:43:39.360 | But you see this, they deliberately move themselves
00:43:42.060 | too close to the sun.
00:43:42.900 | And you can see it when it's happening,
00:43:44.580 | is like if they get too far from the sun,
00:43:46.580 | they deliberately move back towards it, right?
00:43:48.700 | You know, they come right back and they want the risk.
00:43:52.660 | And so anyway, like I, yeah, so martyrs
00:43:55.180 | to civilizational progress, like this is how progress happens.
00:43:58.460 | When these people crash and burn,
00:44:00.100 | the natural inclination is to judge them morally.
00:44:02.580 | I tend to think we should basically say, look,
00:44:04.820 | and I don't even know if this means like giving them
00:44:06.460 | a moral pass or whatever.
00:44:07.780 | But it's like, look, like this is how civilization progresses.
00:44:11.440 | And we need to at least understand
00:44:12.900 | that there's a self-sacrificial aspect to this
00:44:15.340 | that may be tragic and often is tragic,
00:44:17.780 | but it is quite literally self-sacrificial.
00:44:20.500 | - Are there any examples of great innovators
00:44:23.180 | who were able to compartmentalize their risk-taking
00:44:27.920 | to such a degree that they had what seemed to be
00:44:31.960 | a morally impeccable life in every domain
00:44:34.900 | except in their business pursuits?
00:44:36.860 | - Yeah, that's right.
00:44:37.700 | So some people are very highly controlled like that.
00:44:40.660 | Some people are able to like very narrowly,
00:44:44.780 | and I don't really want to set myself up as an example
00:44:44.780 | on a lot of this, but I will tell you like as an example,
00:44:46.700 | like I will never use debt in business, number one.
00:44:50.420 | Number two, like I have the most placid personal life
00:44:52.220 | you can imagine.
00:44:53.060 | Number three, I'm the last person in the world
00:44:54.860 | who's ever gonna do an extreme sport.
00:44:57.020 | I mean, I'm not even gonna go in the sauna or the ice bath.
00:44:59.100 | Like I'm not doing any of this.
00:45:01.140 | Like I don't end up doing anything.
00:45:02.020 | I'm not powder skiing.
00:45:03.140 | - No obligation to do the ice bath.
00:45:04.860 | - I'm not on the Titan.
00:45:05.700 | I'm not going down to see the Titanic.
00:45:07.380 | - Thank goodness you weren't there.
00:45:08.260 | - I'm not doing any of this.
00:45:09.100 | I'm not doing any of this stuff.
00:45:09.980 | I have no interest.
00:45:10.820 | I don't play golf.
00:45:11.660 | I don't ski.
00:45:12.500 | I have no interest in any of this stuff, right?
00:45:13.780 | And so like there are, and I know people like this, right,
00:45:16.140 | who are very high achievers.
00:45:17.060 | It's just like, yeah, they're completely segmented.
00:45:19.460 | They're extreme risk takers in business.
00:45:21.020 | They're completely buttoned down on the personal side.
00:45:22.820 | They're completely buttoned down financially.
00:45:24.940 | They're scrupulous with following every rule and law
00:45:27.980 | you can possibly imagine.
00:45:29.300 | But they're still fantastic innovators.
00:45:30.860 | And then I know many others who are just like,
00:45:32.660 | their life is on fire all the time in every possible way.
00:45:36.780 | And whenever it looks like the fire is turning into embers,
00:45:38.820 | they figure out a way to like relight the fire, right?
00:45:41.420 | And they just really want to live on the edge.
00:45:45.300 | And so I think that's maybe--
00:45:46.860 | I think that's an independent variable.
00:45:48.660 | And again, I would apply the same thing.
00:45:49.820 | I think the same thing applies to the arts.
00:45:52.660 | Classical music is an example.
00:45:54.020 | I think Bach was, as an example, one of the kind of best
00:45:56.860 | musicians of all time, had just a completely sedate
00:45:59.700 | personal life, never had any aberrant behavior
00:46:02.260 | at all in his personal life.
00:46:03.580 | Family man, tons of kids, apparently pillar
00:46:05.940 | of the community, right?
00:46:07.380 | And so if Bach could be Bach and yet not burn his way
00:46:10.500 | through 300 mistresses or whatever, maybe you can, too.
00:46:15.700 | So in thinking about these two different categories
00:46:18.100 | of innovators, those that take on tremendous risk in all
00:46:20.500 | domains of their life and those that take on tremendous risk
00:46:23.140 | in a very compartmentalized way, I
00:46:25.340 | don't know what the percentages are.
00:46:27.180 | But I have to wonder if in this modern age of the public being
far less forgiving, what I'm referring to
00:46:33.660 | is cancel culture.
00:46:36.220 | Do you think that we are limiting
00:46:38.700 | the number of innovations in total,
00:46:41.500 | like by just simply frightening or eliminating
00:46:45.020 | an enormous category of innovators
00:46:47.140 | because they don't have the confidence or the means
00:46:49.780 | or the strategies in place to regulate?
00:46:53.780 | So they're just either bowing out
00:46:55.500 | or they're getting crossed off.
00:46:56.820 | They're getting canceled one by one.
00:46:58.380 | So do you think the public is less tolerant than they
00:47:00.700 | used to be or more tolerant?
00:47:03.860 | Well, the systems that--
00:47:07.020 | I'm not going to be careful here.
00:47:08.420 | I think the large institution systems are not
00:47:14.100 | tolerant of what the public tells them
00:47:16.780 | they shouldn't be tolerant of.
00:47:20.260 | And so if there's enough noise, there's
00:47:22.500 | enough noise in the mob, I think institutions bow out.
00:47:25.820 | And here I'm referring not just to--
00:47:27.860 | they essentially say, OK, let the cancellation proceed.
00:47:31.540 | And then maybe they're the gavel that comes down,
00:47:34.340 | but they're not the lever that got the thing going.
00:47:36.980 | And so I'm not just thinking about universities.
00:47:38.980 | I'm also thinking about advertisers.
00:47:40.620 | I'm thinking about the big movie houses that cancel a film that
00:47:43.980 | a given actor might be in because they had something
00:47:46.180 | in their personal life that's still getting worked out.
00:47:48.560 | I'm thinking about people who are in a legal process that's
00:47:51.700 | not yet resolved, but the public has decided
00:47:53.900 | they're a bad person, et cetera.
00:47:55.700 | My question is are we really talking about the public?
00:47:59.060 | I agree with your question, and I'm going to come back to it.
00:48:01.380 | I'm going to examine one part of your question, which
00:48:04.180 | is is this really the public we're talking about?
00:48:06.220 | And I would just say exhibit A is
00:48:07.740 | who is the current front runner for the Republican nomination
00:48:11.460 | today?
00:48:13.380 | The public, at least on one side of the political aisle,
00:48:16.700 | seems very on board.
00:48:19.340 | Number two, look, there's a certain musician who flew too
00:48:23.900 | close to the sun, blew himself to smithereens.
00:48:25.860 | He's still hitting all-time highs on music streams
00:48:29.780 | every month.
00:48:30.820 | The public seems fine.
00:48:32.820 | I think the public might--
00:48:34.020 | I would argue the public is actually
00:48:35.660 | more open to these things than it actually maybe ever has been.
00:48:38.660 | And we could talk about why that's the case.
00:48:40.460 | I think it's a differentiation, and this
00:48:42.140 | is what your question was aiming at,
00:48:43.340 | but it's a differentiation between the public
00:48:45.220 | and the elites.
00:48:46.660 | So my view is everything that you just described
00:48:48.940 | is an elite phenomenon, and actually the public
00:48:50.980 | is very much not on board with it.
00:48:52.860 | Interesting.
00:48:53.540 | And so what's actually happening is the division--
00:48:55.740 | what's happened is the public and the elites
00:48:57.540 | have gapped out.
00:48:58.300 | The public is more forgiving of what previously
might have been considered kind of an extreme behavior.
F. Scott Fitzgerald said there are no second acts
in American lives.
It turns out that's completely wrong.
00:49:09.500 | It turns out there are second acts, third acts, fourth acts.
Apparently, you can have an unlimited number of acts.
00:49:12.780 | The public is actually up for it.
00:49:14.220 | Yeah, I mean, I think of somebody like Mike Tyson.
00:49:16.940 | I feel like his life exemplifies everything
00:49:21.180 | that's amazing and great and also terrible about America.
00:49:24.900 | If we took Mike Tyson to dinner tonight at any restaurant
00:49:27.580 | anywhere in the United States, what would happen?
00:49:29.460 | He would be loved.
00:49:30.140 | Oh, he would be like--
00:49:31.140 | Adored.
00:49:31.660 | He would be-- the outpouring of enthusiasm and passion
00:49:34.820 | and love would be incredible.
00:49:37.540 | It would be unbelievable.
00:49:38.940 | This is a great example.
00:49:40.620 | And again, I'm not even going to draw--
00:49:42.200 | I'm not even going to say I agree with that
00:49:43.100 | or disagree with that.
00:49:44.020 | I'm just like, we all intuitively
00:49:45.820 | know that the public is just like 100%, like absolutely.
00:49:48.820 | He's a legend.
00:49:49.500 | He's a living legend.
00:49:50.420 | People love Mike.
00:49:51.220 | He's like a cultural touchstone.
00:49:52.420 | Absolutely.
00:49:52.860 | And then you see it when he shows up in movies, right?
00:49:54.460 | He shows-- I don't remember the--
I mean, the big breakthrough where I figured this out
00:49:57.340 | with respect to him because I don't really follow sports.
00:49:59.100 | But when he showed up in that, it was that first Hangover
00:50:01.500 | movie, and he shows up.
00:50:02.740 | And I was in a theater.
And the audience just goes bananas, crazy.
00:50:07.460 | They're so excited to see him.
00:50:08.820 | Yeah, he evokes delight.
00:50:10.060 | I always say that Mike Tyson is the only person
00:50:11.980 | I'm aware of that can wear a shirt with his own name on it.
00:50:15.380 | And it somehow doesn't seem wrong.
00:50:18.180 | In fact, it just kind of makes you like him more.
00:50:21.780 | His ego feels very contoured in a way
00:50:24.660 | that he knows who he is and who he was.
00:50:27.300 | And yet, there's a humbleness woven in,
00:50:29.820 | maybe as a consequence of all that he's been through.
00:50:32.500 | I don't know.
00:50:33.900 | But yeah, people love Mike.
The public, clearly.
00:50:37.380 | Exactly.
00:50:37.900 | Now, if he shows up to lecture at Harvard,
00:50:40.380 | I think you're probably going to get a different reaction.
00:50:42.020 | I don't know.
00:50:43.060 | I don't know.
00:50:43.860 | I mean, David Simon, the guy who wrote The Wire,
00:50:46.580 | gave a talk at Harvard.
00:50:48.860 | And it sounded to me, based on his report
00:50:51.300 | of that, which is very interesting, in fact,
00:50:54.540 | that people adore people who are connected to everybody
00:51:01.980 | in that way.
00:51:02.780 | I feel like everybody loves Mike from above his status,
00:51:06.300 | the sides, below his status.
00:51:08.900 | He occupies this halo of love and adoration.
00:51:12.980 | OK, all right.
00:51:14.540 | Yeah, and then, look, the other side of this is the elites.
00:51:17.060 | And you kind of alluded to this, or the institutions.
00:51:18.820 | So basically, it's like the people
00:51:20.060 | who were like, at least, nominally in charge,
or feel like they should be in charge.
00:51:22.860 | Yeah, I want to make sure we define elite.
00:51:24.620 | So you're not necessarily talking
00:51:25.500 | about people who are wealthy.
00:51:26.420 | You're talking about people who have
00:51:27.980 | authority within institutions.
00:51:29.380 | So the ultimate definition of an elite
00:51:30.980 | is who can get who fired, right?
00:51:34.540 | Like, that's the ultimate test.
00:51:35.860 | Who can get who fired, boycotted, blacklisted,
ostracized, prosecuted, jailed,
00:51:41.700 | like when push comes to shove, right?
00:51:43.660 | I think that's always the question,
00:51:45.060 | who can destroy whose career.
00:51:46.300 | And of course, you'll notice that that is heavily asymmetric
00:51:49.060 | when these fights play out.
00:51:50.340 | It's very clear which side can get the other side fired
00:51:52.500 | and which side can't.
00:51:54.260 | And so, yeah, so look, I think we live in a period of time
00:51:56.340 | where the elites have gotten to be extreme
00:51:58.140 | in a number of dimensions.
00:51:59.340 | And I think it's characterized by, for sure,
00:52:01.460 | extreme groupthink, extreme sanctimony, extreme moral,
00:52:05.620 | I would say, dudgeon, this weird sort of modern puritanism,
00:52:10.980 | and then an extreme sort of morality
00:52:13.100 | of punishment and terror against their perceived enemies.
00:52:17.140 | But I wanted to go through that, because I actually
00:52:19.660 | think that's a very different phenomenon.
00:52:21.180 | And I think what's happening to the elites is very different
00:52:22.580 | than what's happening in the population at large.
00:52:24.940 | And then, of course, I think there's a feedback loop
00:52:27.020 | in there, which is I think the population at large
00:52:29.020 | is not on board with that program, right?
00:52:31.300 | I think the elites are aware that the population is not
00:52:33.540 | on board with that program.
00:52:34.340 | I think they judge the population negatively
00:52:36.340 | as a consequence.
00:52:37.140 | That causes the elites to harden their own positions.
00:52:39.340 | That causes them to be even more alienating
00:52:41.260 | to the population.
00:52:42.260 | And so they're in sort of an oppositional negative feedback
00:52:45.100 | loop.
00:52:46.500 | And, yeah, it's going to be--
00:52:48.300 | but again, it's the sort of question, OK,
00:52:49.500 | who can get who fired?
00:52:50.500 | And so elites are really good at getting
00:52:53.940 | like normal people fired--
00:52:55.300 | ostracized, banned, hit pieces in the press, like whatever.
00:52:59.460 | For normal people to get elites fired,
00:53:01.060 | they have to really band together and really mount
00:53:03.740 | a serious challenge, which mostly doesn't happen,
00:53:06.100 | but might be starting to happen in some cases.
00:53:08.300 | Do you think this power of the elites
over the public stemmed from social media sort of going
00:53:16.460 | against its original purpose?
00:53:18.020 | I mean, when you think social media,
the whole idea was giving each and every person
00:53:21.620 | their own little reality TV show, their own voice,
00:53:24.380 | and yet we've seen a dramatic uptick
00:53:27.940 | in the number of cancellations and firings related
00:53:30.580 | to immoral behavior based on things that were either done
00:53:35.020 | or amplified on social media.
00:53:37.180 | It's almost as if the public is holding
00:53:38.980 | the wrong end of the knife.
00:53:41.180 | Yeah, so the way I describe it--
00:53:43.860 | so I use these two terms, and they're
00:53:46.260 | somewhat interchangeable, but elites and institutions.
00:53:48.340 | And they're somewhat interchangeable,
00:53:49.540 | because who runs the institutions?
00:53:50.660 | The elites, right?
00:53:51.460 | And so it's sort of a self-reinforcing thing.
00:53:55.260 | Anyway, institutions of all kinds, institutions,
00:53:57.220 | everything from the government, bureaucracies, companies,
00:53:59.580 | nonprofits, foundations, NGOs, tech companies,
00:54:02.720 | on and on and on, people who are in charge of big complexes
00:54:07.340 | and that carry a lot of basically power and influence
00:54:09.960 | and capability and money as a consequence
00:54:11.900 | of their positional authority.
00:54:13.780 | So the head of a giant foundation
00:54:15.860 | may never have done anything in their life
00:54:17.740 | that would cause somebody to have a high opinion of them
00:54:19.140 | as a person.
00:54:19.740 | But they're in charge of this gigantic multi-billion dollar
00:54:22.140 | complex and have all this power of the results.
00:54:24.180 | So that's just to define terms, elites and institutions.
00:54:28.340 | So it's actually interesting.
00:54:29.540 | Gallup has been doing polls on the question of trust
00:54:34.300 | in institutions, which is sort of therefore
00:54:36.180 | a proxy for trust in elites, basically
00:54:38.520 | since the early 1970s.
00:54:40.300 | And what you find-- and they do this
00:54:41.980 | across all the categories of big institutions,
00:54:44.340 | basically everyone I just talked about, a bunch of others,
00:54:46.580 | big business, small business, banks, newspapers,
00:54:49.080 | broadcast television, the military, police.
00:54:52.300 | So they've got like 30 categories or something.
00:54:54.340 | And basically what you see is almost all the categories
00:54:56.620 | basically started in the early '70s at like 60% or 70% trust.
00:55:00.220 | And now they've-- basically almost across the board,
00:55:03.100 | they've just had a complete basically linear slide down
00:55:05.900 | for 50 years, basically my whole life.
00:55:09.180 | And they're now bottoming out.
00:55:11.860 | Congress and journalists bottom out at like 10%.
00:55:16.140 | Like the two groups everybody hates
00:55:17.580 | are like Congress and journalists.
00:55:19.460 | And then it's like a lot of other big institutions
00:55:21.780 | are like in their 20s, 30s, 40s.
00:55:23.780 | Actually, big business actually scores fairly high.
00:55:25.980 | Tech actually scores quite high.
00:55:27.640 | The military scores quite high.
00:55:29.060 | But basically everything else has really caved in.
00:55:31.220 | And so this is sort of my fundamental challenge
00:55:33.740 | to everybody who basically says--
00:55:35.220 | and you didn't do this, but you'll
00:55:36.900 | hear the simple form of this, which is social media caused
00:55:40.060 | the current trouble.
And let's call this an example:
collapse in faith in institutions and elites,
00:55:45.700 | let's call that part of the current trouble.
00:55:47.940 | Everybody's like, oh, social media caused that.
00:55:49.180 | I was like, well, no, social media is new in the last--
00:55:53.140 | social media is effectively new practically speaking since 2010,
00:55:55.940 | 2012 is when it really took off.
00:55:58.180 | And so if the trend started in the early 1970s
00:56:00.740 | and has been continuous, then we're
00:56:02.160 | dealing with something broader.
And Martin Gurri wrote I think the best book on this, called
00:56:07.340 | The Revolt of the Public, where he
00:56:08.760 | goes through this in detail.
00:56:09.920 | And he does say that social media had a lot to do with
00:56:13.300 | what's happened in the last decade.
00:56:14.820 | But he says if you go back, you look further,
00:56:16.700 | it was basically two things coinciding.
00:56:19.220 | One was just a general change in the media environment.
00:56:21.480 | And in particular, the 1970s is when you started--
00:56:24.460 | especially in the 1980s is when you started to get specifically
00:56:27.180 | talk radio, which was a new outlet.
00:56:29.820 | And then you also got cable television.
00:56:32.400 | And then you also, by the way, it's
00:56:33.940 | actually interesting in the '50s, '60s,
00:56:35.180 | you had paperback books, which was another one of these,
00:56:36.940 | which was an outlet.
00:56:37.780 | So you had a fracturing in the media landscape
00:56:40.340 | that started in the '50s through the '80s.
00:56:42.320 | And then, of course, the internet blew it wide open.
00:56:45.160 | Having said that, if the elites in the institutions
were fantastic, you would know it more than ever previously.
Information is more accessible.
00:56:51.240 | And so the other thing that he says and I agree with
00:56:53.400 | is the public is not being tricked
00:56:55.280 | into thinking the elites in institutions are bad.
00:56:57.640 | They're learning that they're bad.
00:56:59.560 | And therefore, the mystery of the Gallup poll
00:57:02.000 | is why those numbers aren't all just zero,
00:57:04.880 | which is arguably in a lot of cases where they should be.
00:57:08.080 | I think one reason that--
00:57:09.200 | Oh, and by the way, he thinks this is bad.
00:57:10.900 | So he and I have a different view.
Here's where he and I disagree.
00:57:13.440 | He thinks this is bad.
00:57:14.320 | So he basically says you can't replace elites with nothing.
00:57:18.120 | You can't replace institutions with nothing
00:57:20.500 | because what you're just left with
00:57:21.840 | is just going to be wreckage.
00:57:22.720 | You're going to be left with a completely, basically,
00:57:25.120 | atomized, out-of-control society that has no ability to marshal
00:57:28.240 | any sort of activity in any direction.
00:57:29.800 | It's just going to be a dog-eat-dog awful world.
00:57:32.980 | I have a very different view on that, which we can talk about.
00:57:35.680 | Yeah, I'd love to hear your views on that.
00:57:37.400 | Yeah.
00:57:38.400 | I'd like to take a quick break and acknowledge our sponsor,
00:57:40.740 | InsideTracker.
00:57:42.140 | InsideTracker is a personalized nutrition platform
00:57:44.500 | that analyzes data from your blood and DNA
00:57:46.980 | to help you better understand your body
00:57:48.540 | and help you meet your health goals.
00:57:50.420 | I'm a big believer in getting regular blood work done
00:57:52.740 | for the simple reason that many of the factors
00:57:55.060 | that impact your immediate and long-term health
00:57:57.300 | can only be analyzed from a quality blood test.
00:57:59.880 | However, with a lot of blood tests out there,
00:58:01.900 | you get information back about blood lipids,
00:58:03.800 | about hormones, and so on,
00:58:04.940 | but you don't know what to do with that information.
00:58:06.940 | With InsideTracker, they have a personalized platform
00:58:09.260 | that makes it very easy to understand your data,
00:58:12.020 | that is to understand what those lipids,
00:58:13.940 | what those hormone levels, et cetera, mean,
and behavioral, supplement, nutrition, and other protocols
00:58:19.100 | to adjust those numbers to bring them into the ranges
00:58:21.900 | that are ideal for your immediate and long-term health.
00:58:24.160 | InsideTracker's ultimate plan now includes measures
00:58:26.460 | of both ApoB and of insulin,
00:58:28.660 | which are key indicators of cardiovascular health
00:58:31.320 | and energy regulation.
00:58:32.900 | If you'd like to try InsideTracker,
00:58:34.340 | you can visit insidetracker.com/huberman
00:58:37.140 | to get 20% off any of InsideTracker's plans.
00:58:39.900 | Again, that's insidetracker.com/huberman to get 20% off.
00:58:44.500 | The quick question I was going to ask before we go there is,
00:58:48.060 | I think that one reason that I and many other people
00:58:51.260 | sort of reflexively assume that social media
00:58:53.640 | caused the demise of our faith in institutions is,
00:58:58.640 | well, first of all, I wasn't aware of this lack
00:59:02.680 | of correlation between the decline in faith in institutions
00:59:06.180 | and the rise of social media.
00:59:08.220 | But secondarily, that we've seen some movements
00:59:11.560 | that have essentially rooted themselves in tweets,
00:59:16.560 | in comments, in posts that get amplified.
00:59:20.780 | And those tweets and comments and posts
00:59:22.860 | come from everyday people.
00:59:24.420 | In fact, I can't name one person who initiated
00:59:28.340 | a given cancellation or movement
00:59:30.620 | because it was the sort of dogpiling or mob
00:59:33.260 | adding on to some person that was essentially anonymous.
00:59:35.360 | So I think that for many of us,
we have, to use neuroscience language,
sort of a bottom-up perspective.
00:59:40.680 | Oh, someone sees something in their daily life
00:59:44.180 | or experiences something in their daily life
00:59:45.940 | and they tweet about it or they comment about it
00:59:48.700 | or they post about it.
00:59:50.160 | And then enough people dogpile on the accused
00:59:54.560 | that it picks up force and then the elites feel compelled,
00:59:59.560 | obligated to cancel somebody.
01:00:02.340 | That tends to be the narrative.
01:00:04.680 | And so I think the logical conclusion is,
01:00:06.460 | oh, social media allows for this to happen,
01:00:08.780 | whereas normally someone would just be standing
01:00:10.260 | on the corner shouting or calling lawyers
01:00:12.440 | that don't have faith in them.
And you've got, like, the Erin Brockovich model
that turns into a movie,
01:00:18.960 | but that's a rare case of this lone woman
who's got this idea in mind about how a big institution
is doing wrong or somebody is doing wrong in the world
and then can leverage a big institution, excuse me.
01:00:30.220 | But the way that you describe it is that the elites
01:00:32.820 | are leading this shift.
01:00:36.420 | So what is the role of the public in it?
01:00:39.680 | I mean, just to give it a concrete example,
01:00:42.120 | if for instance, no one tweeted or commented a me too,
01:00:47.980 | or no one tweeted or commented about some ill behavior
01:00:53.420 | of some, I don't know, university faculty member
01:00:55.960 | or a business person,
01:00:57.760 | would the elite have come down on them anyway?
01:01:01.440 | - Oh yeah, so it was happening.
01:01:02.580 | So based on what I've seen over the years,
01:01:05.900 | there is so much astroturfing right now.
01:01:09.660 | There are entire categories of people
01:01:12.060 | who are paid to do this.
01:01:13.980 | Some of them we call journalists,
01:01:15.380 | some of them we call activists,
01:01:16.420 | some of them we call NGO, non-profit,
01:01:18.820 | some of them we call university professors,
01:01:20.540 | some of them we call grad students,
01:01:21.660 | like whatever, they're paid to do this.
01:01:24.220 | I don't know if you've ever looked
01:01:25.040 | into the misinformation industrial complex.
01:01:27.600 | There's this whole universe of basically these funded groups
01:01:29.960 | that basically do quote, unquote, misinformation
01:01:32.960 | and they're constantly mounting these kinds of attacks.
01:01:36.080 | They're constantly trying to gin up
01:01:37.480 | this kind of basically panic
01:01:38.680 | to cause somebody to get fired.
01:01:40.080 | - So it's not a grassroots.
01:01:41.580 | - No, it's the opposite of grassroots.
01:01:43.000 | No, and almost always when you trace these things back,
01:01:45.200 | it was a journalist, it was an activist,
01:01:47.300 | it was a public figure of some kind.
01:01:50.840 | These are entrepreneurs in sort of a weird way.
They're basically, they're paid, it's their job, their mission, their calling.
01:01:59.840 | It's all wrapped up together.
01:02:00.880 | They're true believers,
01:02:01.720 | but they're also getting paid to do it.
01:02:03.040 | And there's a giant funding,
01:02:04.240 | I mean, there's a very large funding complex for this
01:02:06.240 | coming from certain high profile people
01:02:09.320 | who put huge amounts of money into this.
01:02:10.520 | - Is this well known?
01:02:11.520 | - Yes, well, so I mean, it is in my world.
01:02:13.360 | So this is what the social media companies
01:02:15.320 | have been on the receiving end of for the last decade.
01:02:17.920 | It's basically a political media activism complex
01:02:22.520 | with very deep pockets behind it
01:02:23.880 | and you've got people who basically,
01:02:25.200 | literally people who sit all day
01:02:26.820 | and watch the TV network on the other side
01:02:28.800 | or watch the Twitter feeds on the other side
01:02:29.980 | and they wait, they basically wait.
01:02:31.680 | It's like every politician,
01:02:33.120 | this has been the case for a long time now,
01:02:34.120 | every politician who goes out and gives stump speeches,
01:02:36.080 | you'll see there's always somebody in the crowd
01:02:37.240 | with a camcorder or now with a phone recording them
01:02:39.560 | and that's somebody from the other campaign
01:02:41.840 | who's paid somebody to just be there
01:02:43.400 | and record every single thing the politician says
01:02:45.600 | so that when Mitt Romney says whatever the 47% thing,
01:02:48.600 | they've got it on tape and then they clip it
01:02:50.440 | and they try to make it viral.
01:02:51.720 | So this stuff is, and again, look,
01:02:54.560 | these people believe what they're doing.
01:02:55.760 | I'm not saying it's even dishonest.
01:02:57.080 | These people believe what they're doing.
01:02:58.040 | They think they're fighting a holy war.
01:02:59.160 | They think they're protecting democracy.
01:03:00.440 | They think they're protecting civilization.
01:03:01.880 | They think they're protecting whatever it is
01:03:03.600 | they're protecting and then they know how to use the tools
01:03:07.640 | and so they know how to try to gin up the outrage
01:03:10.180 | and then by the way, sometimes it works in social cascades.
01:03:12.960 | Sometimes it works, sometimes it doesn't.
01:03:14.860 | Sometimes they cascade, sometimes they don't
01:03:16.720 | but if you follow these people on Twitter,
01:03:18.360 | this is what they do every day.
01:03:20.260 | They're constantly trying to light this fire.
01:03:22.440 | - Wow, I assume that it was really bottom up
01:03:25.360 | but it sounds like it's sort of a mid-level
01:03:27.560 | and then it captures the elites
01:03:29.600 | and then the thing takes on a life of its own.
01:03:31.600 | - By the way, it also intersects
01:03:32.560 | with the trust and safety groups
01:03:33.600 | at the social media firms who are responsible
01:03:35.880 | for figuring out who gets promoted
01:03:37.160 | and who gets banned right across this
01:03:38.960 | and you'll notice one large social media company
01:03:40.960 | has recently changed hands
01:03:42.080 | and has implemented a different kind
01:03:43.600 | of set of trust and safety
01:03:46.560 | and all of a sudden a different kind of boycott movement
01:03:48.380 | has all of a sudden started to work
01:03:49.640 | that wasn't working before that
01:03:50.880 | and another kind of boycott movement
01:03:52.380 | is not working as well anymore
01:03:53.840 | and so for sure there's intermediation happening.
01:03:57.000 | Like look, the stuff that's happening in the world today
01:03:58.960 | is being intermediated through social media
01:04:00.440 | 'cause social media is the defining media of our time
01:04:03.320 | but there are people who know how to do this
01:04:05.240 | and do this for a living.
So no, I very much view this,
the cancellation wave,
this whole thing,
as an elite phenomenon,
01:04:15.620 | and when it appears to be a grassroots thing,
01:04:18.000 | it's either grassroots among the elites
01:04:20.360 | which is possible 'cause there's a fairly large number
01:04:22.960 | of people who are signed up for that particular crusade
01:04:26.320 | but there's also a lot of astroturfing
01:04:27.680 | that's taking place inside that.
01:04:29.200 | The question is okay, at what point does the population
01:04:31.160 | at large get pulled into this
01:04:32.960 | and maybe there are movements at certain points in time
01:04:35.720 | where they do get pulled in
01:04:36.600 | and then maybe later they get disillusioned
01:04:38.120 | and so then there's some question there
01:04:39.440 | and then there's another question of like well,
01:04:40.960 | if the population at large is gonna decide
01:04:42.720 | what these movements are,
01:04:43.540 | are they gonna be the same movements as the elites want
01:04:46.280 | and are the elites, how are the elites gonna react
01:04:48.080 | when the population actually like fully expresses itself?
01:04:50.840 | Right, and so there's, and like I said,
01:04:52.200 | there's a feedback loop between these
01:04:53.680 | where the more extreme the elites get,
01:04:55.000 | they tend to push the population to more extreme views
01:04:57.000 | on the other side and vice versa
01:04:58.320 | so it ping pong's back and forth
01:04:59.960 | and so yeah, this is our world.
01:05:02.040 | - Yeah, this explains a lot.
01:05:04.200 | I wanna make sure that I--
- By the way, Taibbi, so Mike Shellenberger, Matt Taibbi,
01:05:08.720 | a bunch of these guys have done a lot of work.
01:05:11.600 | If you just look into what's called
01:05:12.960 | the misinformation industrial complex,
01:05:14.960 | you'll find a network of money and power
01:05:16.580 | that is really quite amazing.
01:05:18.740 | - I've seen more and more Shellenberger showing up, right?
And he's just like, on this stuff, he and Taibbi,
01:05:25.520 | they're literally just like tracking money.
01:05:27.280 | It's just like, it's very clear how the money flows,
01:05:29.600 | including like a remarkable amount of money
01:05:31.240 | out of the government, which is of course,
01:05:33.440 | like in theory, very concerning.
01:05:35.000 | - Very interesting.
01:05:36.800 | - The government should not be funding programs
01:05:38.640 | that take away people's constitutional rights
01:05:40.340 | and yet somehow that is what's been happening.
01:05:44.160 | - Very interesting.
01:05:45.000 | - Yes.
01:05:46.120 | - I wanna make sure that I hear your ideas
01:05:48.660 | about why the decline in confidence in institutions
is not necessarily problematic.
01:05:56.180 | Is this going to be a total destruction
01:05:59.120 | and burning down of the forest that will lead to new life?
01:06:01.600 | Is that your view?
01:06:02.880 | - Yeah, well, so this is the thing.
01:06:04.220 | And look, there's a question if you're,
01:06:05.540 | there's a couple of questions in here,
01:06:06.560 | which is like, how bad is it really?
01:06:08.680 | Like, how bad are they, right?
01:06:11.240 | And I think they're pretty bad.
01:06:13.520 | A lot of them are pretty actually bad.
01:06:15.320 | And so that's one big question.
01:06:17.440 | And then, yeah, the other question is like, okay,
01:06:19.560 | if an institution has gone bad
01:06:21.200 | or a group of elites have gone bad,
01:06:22.480 | like can, it's this wonderful word reform, right?
01:06:24.840 | Can they be reformed?
01:06:26.120 | And everybody always wants to reform everything
01:06:27.600 | and yet somehow like nothing ever,
01:06:29.280 | quite ever gets reformed, right?
01:06:30.940 | And so people are trying to reform, you know,
01:06:32.480 | housing policy in the Bay Area for decades
01:06:34.000 | and you know, we're not building,
01:06:35.480 | we're building fewer houses than ever before.
01:06:36.920 | So somehow reform movements seem to lead to more,
01:06:39.080 | just more bad stuff.
01:06:40.960 | But anyway, yeah, so if you have an existing institution,
01:06:42.940 | can it be reformed?
01:06:43.780 | Can it be fixed from the inside?
01:06:45.280 | Like what's happened in universities?
01:06:46.320 | Like there's a lot of, there are professors at Stanford
01:06:48.000 | as an example who very much think
01:06:49.240 | that they can fix Stanford.
01:06:51.400 | Like, I don't know what you think.
01:06:52.760 | It doesn't seem like it's going
01:06:53.860 | in productive directions right now.
01:06:56.360 | - Well, I mean, there are many things about Stanford
01:06:57.880 | that function extremely well.
01:06:59.540 | I mean, it's a big institution.
01:07:00.920 | It's certainly got its issues like any other place.
01:07:03.680 | They're also my employer.
01:07:04.720 | Mark's giving me some interesting looks.
01:07:06.280 | He wants me to get a little more vocal here.
01:07:08.440 | - No, no, no, you don't need to.
01:07:09.400 | - Oh no, I mean, I think that, yeah, I mean,
01:07:12.000 | one of the things about being a researcher
01:07:13.480 | at a big institution like Stanford is,
01:07:15.200 | well, first of all, it meets the criteria
01:07:17.000 | that you described before.
01:07:18.120 | You know, you look to the left, you look to the right
01:07:19.800 | or anywhere above or below you and you have excellence,
01:07:23.680 | right, I mean, I've got a Nobel Prize winner below me
01:07:25.760 | whose daddy also won a Nobel Prize
01:07:27.440 | and his scientific offspring is likely to win.
01:07:30.280 | I mean, it inspires you to do bigger things
01:07:33.120 | than one ordinarily would, no matter what.
01:07:36.300 | So there's that and that's great and that persists.
01:07:39.540 | There's all the bureaucratic red tape
01:07:42.040 | about trying to get things done
01:07:43.360 | and how to implement decisions is very hard
01:07:46.400 | and there are a lot of reasons for that.
01:07:47.600 | And then of course there are the things that, you know,
many people are aware of, where there are public accusations
01:07:53.280 | about people in positions of great leadership
01:07:55.500 | and that's getting played out
01:07:56.680 | and the whole thing becomes kind of overwhelming
01:07:59.540 | and a little bit opaque when you're just trying
01:08:01.120 | to like run your lab or live your life.
01:08:03.440 | And so I think one of the reasons for this lack of reform
01:08:06.040 | that you're referring to is because there's no position
01:08:10.080 | of reformer, right?
01:08:11.880 | So deans are dealing with a lot of issues,
01:08:14.880 | provosts are dealing with a lot of issues,
01:08:16.320 | presidents are dealing with a lot of issues
01:08:17.920 | and then some in some cases.
01:08:20.180 | And so, you know, we don't have a dedicated role of reformer,
01:08:24.400 | someone to go in and say, listen,
01:08:26.320 | there's just a lot of fat on this and we need to trim it
01:08:28.960 | or we need to create this or do that.
01:08:31.040 | There just isn't a system to do that.
01:08:32.920 | And that's, I think, in part because universities
01:08:35.680 | are built on old systems and, you know,
01:08:38.080 | it's like the New York subway, it's amazing,
01:08:41.240 | it still works as well as it does
01:08:42.800 | and yet it's got a ton of problems also.
- Well, so we can debate the university
01:08:48.520 | specifically, but the point is like, look,
01:08:50.080 | if you do think institutions are going bad
01:08:51.760 | and then you have to make it,
01:08:52.600 | number one, you have to figure out
01:08:53.520 | if you think institutions are going bad.
01:08:54.840 | The population largely does think that.
01:08:57.000 | And then at the very least, the people who run institutions
01:08:59.040 | ought to really think hard about what that means.
01:09:00.760 | - But people still strive to go to these places.
01:09:03.840 | And I still hear from people who, like for instance,
01:09:06.320 | did not go to college or talking about
01:09:08.080 | how a university degree is useless.
01:09:10.020 | They'll tell you how proud they are
01:09:11.240 | that their son or daughter is going to Stanford
01:09:13.380 | or is going to UCLA or is going to Urbana-Champaign.
01:09:16.300 | I mean, it's almost like, to me,
01:09:18.100 | that's always the most shocking contradiction is like,
01:09:21.660 | yeah, like these institutions don't matter,
01:09:23.300 | but then when people want to hold up a card
01:09:24.980 | that says why their kid is great,
01:09:27.320 | it's not about how many pushups they can do
01:09:29.280 | or that they started their own business most of the time,
01:09:31.720 | it's they're going to this university.
01:09:33.740 | And I think, well, what's going on here?
01:09:35.600 | - So do you think the median voter in the United States
01:09:37.260 | can have their kid go to Stanford?
01:09:38.500 | - No. - No.
01:09:39.340 | - No, and no--
01:09:40.180 | - Do you think the median voter in the United States
01:09:41.920 | could have their kid admitted to Stanford
01:09:43.080 | even with perfect SAT?
01:09:44.260 | - No. - No.
01:09:46.620 | - In this day and age, the competition is so fierce
01:09:50.180 | that it requires more.
01:09:51.180 | - Yeah, so first of all, again, we're dealing here,
01:09:54.100 | yes, we're dealing with a small number
01:09:55.820 | of very elite institutions.
01:09:57.220 | People may admire them or not.
01:09:59.380 | Most people have no connectivity to them whatsoever.
01:10:01.900 | In the statistics, in the polling,
01:10:04.560 | universities are not doing well.
01:10:06.220 | You know, the population at large,
01:10:07.620 | yeah, they may have fantasies
01:10:08.580 | about their kid going to Stanford,
01:10:09.500 | but like the reality of it is they have a very,
01:10:11.620 | say, collapsing view of these institutions.
01:10:14.280 | So anyways, this actually goes straight
01:10:16.240 | to the question of alternatives then, right,
01:10:17.600 | which is like, okay, if you believe
01:10:19.240 | that there's collapsing faith in the institutions,
01:10:20.880 | if you believe that it is merited, at least in some ways,
01:10:23.540 | if you believe that reform is effectively impossible,
01:10:25.400 | then you are faced, and we could debate each of those,
01:10:28.920 | but the population at large seems to believe a lot of that.
01:10:32.440 | Then there's a question of like, okay,
01:10:34.240 | can it be replaced, and if so,
01:10:36.300 | are you better off replacing these things
01:10:37.920 | basically while the old things still exist,
01:10:39.480 | or do you actually need to basically clear the field
01:10:41.820 | to be able to have a new thing exist?
01:10:44.040 | The universities are a great case study of this
01:10:45.560 | because of how student loans work, right,
01:10:48.040 | and the way student loans work is,
01:10:51.200 | to be an actual competitive university and compete,
01:10:53.840 | you need to have access to federal student lending,
01:10:55.520 | 'cause if you don't, everybody has to pay out of pocket,
01:10:57.480 | and it's completely out of reach for anybody
01:10:58.820 | other than a certain class of either extremely rich
01:11:01.400 | or foreign students.
01:11:03.180 | So you need access to federal student loan facility.
01:11:04.760 | To get access to federal student loan facility,
01:11:06.440 | you need to be an accredited university.
01:11:09.320 | Guess who runs the Accreditation Council?
01:11:11.920 | - I don't know.
01:11:12.760 | - The existing universities, right?
01:11:14.320 | So it's a self-laundering machine,
01:11:17.380 | like they decide who the new universities are.
01:11:19.000 | Guess how many new universities get accredited, right,
01:11:21.040 | each year to be able to, okay.
01:11:23.160 | - Zero. - Zero, right?
01:11:24.680 | And so as long as that system is in place,
01:11:27.800 | and as long as they have the government wired
01:11:29.680 | the way that they do, and as long as they control
01:11:32.200 | who gets access to federal student loan funding,
01:11:34.180 | like of course there's not gonna be any competition, right?
01:11:36.640 | Of course there can't be a new institution
01:11:38.080 | that's gonna be able to get to scale.
01:11:39.600 | Like it's not possible.
01:11:40.500 | And so if you actually wanted to create a new system
01:11:43.680 | that was better in the, I would argue dozens
01:11:46.000 | or hundreds of ways, it could obviously be better
01:11:47.720 | if you were starting today.
01:11:49.400 | It probably can't be done as long as existing institutions
01:11:51.800 | are actually intact.
01:11:53.100 | And this is my counterargument to Martin,
01:11:55.000 | which is like, yeah, look, if we're gonna tear down the old,
01:11:58.480 | there may be a period of disruption
01:11:59.680 | before we get to the new, but we're never gonna get
01:12:01.420 | to the new if we don't tear down the old.
01:12:03.020 | - When you say counter to Martin,
01:12:04.080 | you're talking about the author of "Revolt of the Public."
01:12:05.680 | - Yeah, Martin Gurri, yeah.
01:12:06.520 | What Martin Gurri says is like, look, he said basically,
01:12:09.040 | what Martin says is as follows.
01:12:11.600 | The elites deserve contempt, but the only thing worse
01:12:15.200 | than these elites that deserve contempt
01:12:17.360 | would be no elites at all, right?
01:12:19.080 | And he basically says on the other side of the destruction
01:12:23.720 | of the elites and the institutions is nihilism.
01:12:26.240 | You're basically left with nothing.
01:12:27.320 | And then by the way, there is a nihilistic streak.
01:12:28.960 | I mean, there's a nihilistic streak in the culture
01:12:30.720 | and the politics today.
01:12:31.540 | There are people who basically would just say,
01:12:32.940 | yeah, just tear the whole system down, right?
01:12:34.980 | And without any particular plan for what follows.
01:12:37.720 | And so I think he makes a good point
01:12:40.460 | in that you wanna be careful that you actually have a plan
01:12:42.280 | on the other side that you think is actually achievable.
01:12:44.700 | But again, the counter argument to that is
01:12:47.140 | if you're not willing to actually tear down the old,
01:12:49.160 | you're not gonna get to the new.
01:12:50.640 | Now, what's interesting, of course,
01:12:52.360 | is this is what happens every day in business, right?
01:12:54.680 | So like the entire way, like how do you know
01:12:57.300 | that the capitalist system works?
01:12:59.100 | The way that you know is that the old companies,
01:13:01.460 | when they're no longer like the best at what they do,
01:13:03.560 | they get torn down and then they ultimately die
01:13:06.060 | and they get replaced by better companies.
01:13:07.560 | - Yeah, I haven't seen a Sears in a while.
01:13:09.200 | - Exactly, right?
01:13:10.640 | And we know, what's so interesting is we know in capitalism
01:13:13.800 | and a market economy, we know that that's the sign of health.
01:13:16.800 | Right, that's the sign of how the system
01:13:18.640 | is working properly, right?
01:13:20.080 | And in fact, we get actually judged by antitrust authorities
01:13:22.580 | in the government on that basis, right?
01:13:24.520 | It's like the best defense against antitrust charges is,
01:13:27.020 | no, people are like coming to kill us
01:13:28.160 | and they're doing like a really good job of it.
01:13:29.360 | Like that's how we know we're doing our job.
01:13:31.160 | And in fact, in business, we are specifically,
01:13:33.240 | it is specifically illegal for companies
01:13:35.380 | in the same industry to get together and plot and conspire
01:13:37.440 | and plan and have things like these accreditation bureaus.
01:13:39.760 | Like we would get, if I created the equivalent
01:13:41.920 | in my companies of the kind of accreditation bureau
01:13:44.200 | that the universities have,
01:13:45.040 | I'd get straight to federal prison.
01:13:46.640 | Antitrust violation, Sherman Act, straight to prison.
01:13:48.440 | People have been sent to prison for that.
01:13:50.440 | So in the business world, we know that you want
01:13:53.080 | everything subject to market competition.
01:13:54.660 | We know that you want creative destruction.
01:13:56.520 | We know that you want replacement of the old
01:13:58.240 | with superior new.
01:13:59.920 | It's just once we get outside of business,
01:14:01.240 | we're like, oh, we don't want any of that.
01:14:02.440 | We want basically stagnation and log rolling, right?
01:14:05.960 | And basically institutional incestuous entanglements
01:14:10.000 | and conflicts of interest as far as the eye can see.
01:14:12.800 | And then we're surprised by the results.
01:14:14.820 | - So let's play it out as a bit of a thought experiment.
01:14:17.640 | So let's say that one small banding together
01:14:22.640 | of people who want to start a new university
01:14:24.880 | where there's free exchange of open ideas,
01:14:27.480 | where unless somebody has egregious behavior,
01:14:31.420 | violent behavior, truly sexually inappropriate behavior
01:14:35.060 | against somebody, they're committing a crime, right?
01:14:37.560 | They're allowed to be there.
01:14:38.560 | They're allowed to be a student or a faculty member
01:14:41.700 | or administrator.
01:14:42.940 | And let's just say this accreditation bureau
01:14:45.820 | allowed student loans for this one particular university.
01:14:48.520 | Or let's say that there was an independent source of funding
01:14:50.460 | for that university such that students
01:14:52.180 | could just apply there.
01:14:53.140 | They didn't need to be part of this elite accredited group,
01:14:57.200 | which, it sounds very mafia-like, frankly.
01:15:01.120 | Not necessarily violent, but certainly coercive
01:15:03.140 | in the way that it walls people out.
01:15:06.160 | Let's say that then there were 20 or 30 of those
01:15:11.480 | or 40 of those.
01:15:12.780 | Do you think that over time that model
01:15:15.260 | would overtake the existing model?
01:15:17.820 | - Is it interesting that those don't exist?
01:15:19.960 | Remember the Sherlock Holmes, the dog that didn't bark?
01:15:24.800 | - It is interesting those don't exist.
01:15:26.800 | - So there's two possibilities.
01:15:28.020 | One is like nobody wants that, which I don't believe.
01:15:31.220 | And then the other is like the system is wired
01:15:32.740 | in a way that will just simply not allow it.
01:15:34.580 | Right, and you did a hypothetical
01:15:35.840 | in which the system would allow it.
01:15:37.660 | My response to that is no,
01:15:38.500 | of course the system won't allow that.
01:15:39.900 | - Or the people that band together have enough money
01:15:42.140 | or get enough resources to say,
01:15:43.920 | look, we can afford to give loans
01:15:46.660 | to 10,000 students per year.
01:15:48.880 | 10,000 isn't a trivial number
01:15:50.260 | when thinking about the size of a university.
01:15:51.860 | And most of them hopefully will graduate in four years
01:15:56.000 | and there'll be a turnover.
01:15:57.380 | And do you think that the great future innovators
01:16:01.880 | would tend to orient toward that model
01:16:05.340 | more than they currently do toward the traditional model?
01:16:08.220 | I mean, what I'm trying to get back to here is
01:16:09.660 | how do you think that the current model thwarts innovation
01:16:12.900 | as well as maybe some ways that it still supports innovation
01:16:17.460 | certainly cancellation and the risk of cancellation
01:16:20.760 | from the way that we framed it earlier
01:16:22.420 | is going to discourage risk takers
01:16:25.840 | of the category of risk takers that take risk
01:16:27.840 | in every domain that really like to fly close to the sun
01:16:31.840 | and sometimes into the sun.
01:16:33.280 | - Or are doing research that is just not politically.
01:16:35.980 | - Right, yeah, looking into issues that--
01:16:38.280 | - Palatable.
01:16:39.120 | - Right, that we can't even talk about on this podcast
01:16:42.440 | probably without causing a distraction
01:16:44.560 | of what we're actually trying to talk about.
01:16:46.080 | - But it gives up the whole game right there.
01:16:46.920 | - Right, exactly.
01:16:47.760 | So I keep a file and it's a written file
01:16:53.020 | because I'm afraid to put it into electronic form
01:16:55.480 | of all the things that I'm afraid to talk about publicly
01:16:58.220 | because I come from a lineage of advisors
01:17:00.340 | where all three died young and I figure if nothing else,
01:17:03.160 | I'll die and then it'll make it into the world
01:17:05.000 | in, I'll say, five, 10 years, 20 years,
01:17:08.340 | and if not, I know with certainty I'm going to die at some point
01:17:11.360 | and then we'll see where all those issues stand.
01:17:13.720 | In any event--
01:17:14.880 | - Is that list getting longer over time or shorter?
01:17:16.520 | - Oh, it's definitely getting longer.
01:17:17.920 | - Isn't that interesting?
01:17:18.760 | - Yeah, it's getting much longer.
01:17:20.360 | I mean, there are just so many issues
01:17:23.000 | that I would love to explore on this podcast with experts
01:17:26.960 | and that I can't explore just even if I had a panel of them
01:17:31.960 | because of the way that things get soundbited
01:17:35.240 | and segmented out and taken out of context,
01:17:37.680 | it's like the whole conversation is lost
01:17:39.940 | and so fortunately, there are an immense number
01:17:41.960 | of equally interesting conversations
01:17:44.840 | that I'm excited to have but it is a little disturbing.
01:17:49.040 | - Do you remember Lysenkoism?
01:17:50.480 | - No.
01:17:52.320 | - Lysenkoism, so famous in the history of the Soviet Union,
01:17:54.600 | this is a famous thing.
01:17:55.440 | So there was a geneticist named Lysenko.
01:17:58.040 | - Yeah, that's why it sounds familiar
01:17:59.040 | but I'm not calling to mind what the context is.
01:18:00.880 | - Well, he was the guy who did communist genetics.
01:18:04.100 | The field of genetics, the Soviets did not approve
01:18:07.120 | of the field of genetics because of course,
01:18:08.920 | they believed in the creation of the new man
01:18:10.480 | and total equality and genetics did not support that
01:18:13.440 | and so if you were doing traditional genetics,
01:18:15.720 | you were at the very least fired if not killed
01:18:19.320 | and so this guy Lysenko stood up and said,
01:18:20.940 | "Oh, I've got Marxist genetics.
01:18:22.520 | "I've got like a whole new field of genetics
01:18:23.960 | "that basically is politically compliant."
01:18:25.960 | And then they actually implemented that
01:18:27.200 | in the agriculture system of the Soviet Union
01:18:29.220 | and it's the origin of one of the big reasons
01:18:31.120 | that the Soviet Union actually fell
01:18:32.400 | which was they ultimately couldn't feed themselves.
01:18:34.680 | - So create a new notion of biology
01:18:36.800 | as it relates to genetics.
01:18:37.920 | - Politically create biology, right?
01:18:39.160 | And so they not only created it, they taught it,
01:18:41.880 | they mandated it, they required it
01:18:43.280 | and then they implemented it in agriculture.
01:18:45.560 | - Interesting.
01:18:46.400 | - So yeah, so I never understood,
01:18:48.280 | there was a bunch of things in history
01:18:49.440 | I never understood until the last decade
01:18:50.840 | and that's one of them.
01:18:51.880 | - Well, I censor myself at the level
01:18:53.380 | of deleting certain things
01:18:54.880 | but I don't contort what I do talk about.
01:18:57.600 | So I tend to like to play on lush open fields.
01:19:01.980 | Just makes my life a lot easier.
01:19:03.460 | - This goes to the rot.
01:19:04.300 | This goes to the rot and I'll come back to your question
01:19:06.020 | but like this goes to the rot in the existing system
01:19:07.800 | which is we've, by the way, I'm no different.
01:19:09.360 | I'm just like you.
01:19:10.200 | Like I'm trying not to light myself on fire either
01:19:12.260 | but like the rot in the existing system
01:19:14.320 | and by system I mean the institutions and the elites.
01:19:16.320 | The rot is the set of things
01:19:18.200 | that are no longer allowed.
01:19:19.360 | I mean that list is like obviously expanding over time
01:19:22.280 | and like that's a real like historically speaking
01:19:26.120 | that doesn't end in good places.
01:19:27.860 | - Is this group of a particular generation
01:19:30.080 | that we can look forward to the time
01:19:31.480 | when they eventually die off?
01:19:33.000 | - It's sort of the boomers plus the millennials.
01:19:34.980 | So good news, bad news.
01:19:37.840 | I mean Gen X is weird, right?
01:19:39.120 | I'm Gen X.
01:19:39.940 | Gen X is weird 'cause we kind of slipped in the middle.
01:19:42.160 | We were kind of the, I don't know how to describe it.
01:19:45.860 | We were the kind of nonpolitical generation
01:19:47.440 | kind of sandwiched between the boomers and the millennials.
01:19:49.720 | Gen Z is a very, I think, open question right now
01:19:52.400 | which way they go.
01:19:53.320 | I could imagine them being actually like much more intense
01:19:56.400 | than the millennials on all these issues.
01:19:58.280 | I could also imagine them reacting to the millennials
01:20:00.040 | and being far more open-minded.
01:20:01.960 | - We don't know which way it's gonna go yet.
01:20:03.040 | - Where it's gonna go.
01:20:03.880 | It might be different groups of them.
01:20:05.200 | - I mean I'm Gen X also.
01:20:06.300 | I'm 47, you're--
01:20:07.440 | - 50, 50, yeah.
01:20:08.280 | - Right, so more or less same.
01:20:09.320 | So grew up with some John Hughes films
01:20:10.940 | and so there were the jocks and the hippies and the punks
01:20:13.960 | and the, we were all divided and they were all segmented
01:20:16.720 | but then it all sort of mishmashed together
01:20:19.840 | a few years later.
01:20:21.800 | And I think that had a lot to do with, like you said,
01:20:24.400 | the sort of apolitical aspect of our generation.
01:20:27.880 | - We just knew, Gen X just knew the boomers were nuts.
01:20:30.280 | Like all the, I mean this is the canonical,
01:20:34.080 | one of the great sitcoms of the era
01:20:36.600 | was Family Ties with the character Alex P. Keaton.
01:20:39.120 | And he was just like this guy who's just like yeah,
01:20:41.060 | my boomer hippie parents are crazy.
01:20:42.560 | Like I'm just gonna like go into business
01:20:43.720 | and like actually do something productive.
01:20:44.980 | Like there was something iconic about that character
01:20:47.240 | in our culture and people like me were like yeah,
01:20:49.060 | obviously go into business.
01:20:50.460 | You know, not go into like political activism.
01:20:51.920 | And then it's just like man, that came whipping back around
01:20:54.640 | with the next generation.
01:20:56.180 | So just to touch real quick on the university thing.
01:20:57.780 | So look, there are people trying to do,
01:20:59.200 | and I'm actually gonna do a thing this afternoon
01:21:00.680 | with the University of Austin which is one of these.
01:21:03.560 | And so there are people trying to do new universities.
01:21:06.340 | You know, like I say, it's certainly possible.
01:21:08.240 | I hope they succeed.
01:21:09.080 | I'm pulling for them.
01:21:09.900 | I think it'd be great.
01:21:10.740 | I think it'd be great if there were a lot more of them.
01:21:12.220 | - Who founded this university?
01:21:13.620 | - This is a whole group of people.
01:21:14.960 | I don't want to freelance on that
01:21:17.000 | 'cause I don't know originally who the idea was.
01:21:18.600 | - University of Austin, not UT Austin.
01:21:20.660 | - Yeah, so this is not UT Austin.
01:21:22.200 | It's called University of Austin or they call it,
01:21:24.040 | I think it's UAT, what is it, UATX.
01:21:27.320 | And so it's a lot of very sharp people associated with it.
01:21:31.400 | And they're gonna try to very much exactly
01:21:35.280 | like what you described.
01:21:36.120 | They're gonna try to do a new one.
01:21:37.280 | I would just tell you like the wall of opposition
01:21:39.560 | that they're up against is profound, right?
01:21:41.860 | And part of it is economic which is can they ever get access
01:21:44.400 | to federal student lending?
01:21:45.360 | And I hope that they can, but it seems nearly inconceivable
01:21:49.040 | the way the system's rigged today.
01:21:51.400 | And then the other is just like they're gonna,
01:21:53.480 | they already have come under attack.
01:21:54.440 | I mean, anybody who publicly associates with them
01:21:58.600 | who is in traditional academia immediately gets lit on fire.
01:22:01.120 | And there's like cancellation campaigns.
01:22:02.720 | So they're up against a wall of social ostracism.
01:22:05.600 | They're up against a wall of press attacks.
01:22:07.940 | They're up against a wall of people just like doing
01:22:10.800 | the thing, pouncing on any anytime anybody says anything,
01:22:13.300 | they're gonna try to like burn the place down.
01:22:14.720 | - This reminds me of like Jerry Springer episodes
01:22:18.360 | and Geraldo Rivera episodes where it's like if a teen
01:22:23.200 | listened to like Danzig or Marilyn Manson type music
01:22:28.200 | or Metallica that they were considered a devil worshiper.
01:22:32.360 | Now we just laugh, right?
01:22:33.800 | We're like, that's crazy, right?
01:22:35.240 | People listen to music with all sorts of lyrics
01:22:37.240 | and ideas and looks and that's crazy.
01:22:40.800 | But there were people legitimately sent to prison,
01:22:44.800 | I think with the West Memphis Three, right?
01:22:46.340 | These kids out in West Memphis that looked different,
01:22:49.540 | acted different, were accused of murders
01:22:51.240 | that eventually was made clear they clearly didn't commit,
01:22:55.460 | but they were in prison
01:22:56.300 | because of the music they listened to.
01:22:58.020 | I mean, this sounds very similar to that.
01:22:59.980 | And I remember seeing bumper stickers:
01:23:01.500 | Free the West Memphis Three.
01:23:02.540 | And I thought this was some crazy thing.
01:23:04.620 | And you look into it and this isn't,
01:23:06.500 | it's a little bit niche, but I mean, these were real lives
01:23:09.900 | and there was an active witch hunt for people
01:23:13.500 | that looked different and acted different.
01:23:15.600 | And yet now we're sort of in this inverted world
01:23:18.500 | where on the one hand, we're all told
01:23:21.560 | that we can express ourselves however we want.
01:23:23.620 | But on the other hand,
01:23:24.460 | you can't get a bunch of people together to take classes
01:23:26.820 | where they learn biology and sociology and econ in Texas.
01:23:31.820 | Wild.
01:23:33.720 | - Yes.
01:23:34.540 | Well, so the simple explanation is this is Puritanism, right?
01:23:37.600 | So this is the original American Puritanism
01:23:40.440 | that just like works itself out through the system
01:23:42.560 | in different ways at different times.
01:23:44.000 | You know, there's this phenomenon,
01:23:44.840 | there's a religious phenomenon in America
01:23:46.240 | called the Great Awakenings.
01:23:47.840 | There'll be these periods in American history
01:23:50.020 | where there's basically religiosity fades
01:23:51.600 | and then there'll be this snapback effect
01:23:52.920 | where you'll have this basically, this frenzy
01:23:55.440 | basically of religion.
01:23:57.040 | In the old days, it would have been tent revivals
01:23:59.080 | and people speaking in tongues and all this stuff.
01:24:01.800 | And then in the modern world,
01:24:02.880 | it's of the form that we're living through right now.
01:24:05.880 | And so, yeah, it's just basically these waves
01:24:07.320 | of sort of American religious.
01:24:09.840 | And you know, remember like religion in our time,
01:24:12.320 | religious impulses in our time don't get expressed,
01:24:14.200 | you know, 'cause we live in more advanced times, right?
01:24:16.400 | We live in scientifically informed times.
01:24:17.760 | And so religious impulses in our time
01:24:19.320 | don't show up as overtly religious, right?
01:24:21.920 | They show up in a secularized form, right?
01:24:24.280 | Which of course conveniently is therefore
01:24:26.040 | not subject to the First Amendment
01:24:27.160 | separation of church and state, right?
01:24:28.600 | As long as the church is secular, there's no problem, right?
01:24:31.520 | And so, but we're acting out these kind
01:24:33.360 | of religious scripts over and over again
01:24:34.680 | and we're in the middle of another religious frenzy.
01:24:37.440 | - There's a phrase that I hear a lot
01:24:40.120 | and I don't necessarily believe it,
01:24:42.680 | but I want your thoughts on it,
01:24:43.680 | which is the pendulum always swings back.
01:24:46.520 | - Yeah, not quite.
01:24:47.760 | - So that's how I feel too, because you know, I'll take--
01:24:50.720 | - Boy, that would be great.
01:24:51.560 | - Take any number of things that we've talked about
01:24:54.100 | and you know, actually it's so crazy, you know,
01:24:57.440 | the way things have gone with institutions
01:24:59.840 | or it's so crazy the way things have gone with social media
01:25:01.880 | or it's so crazy, fill in the blank and people will say,
01:25:05.860 | well, you know, pendulum always swings back.
01:25:08.360 | Like it's the stock market or something, you know,
01:25:11.320 | after every crash, there'll be an eventual boom
01:25:14.040 | and vice versa.
01:25:15.520 | - By the way, that's not true either, right?
01:25:16.940 | - Right.
01:25:17.780 | - Most stock markets, we have of course survivorship,
01:25:19.280 | but it's all survivorship, everything is survivorship.
01:25:20.800 | It's all, everything you just said
01:25:21.720 | is obviously survivorship bias, right?
01:25:23.080 | So if you look globally, most stock markets over time
01:25:26.200 | crash and burn and never recover.
01:25:27.800 | The American stock market has always recovered.
01:25:30.520 | - Right, I was referring to the American stock market.
01:25:32.040 | - Yeah, but globally.
01:25:33.380 | But the reason everybody refers to the American stock market
01:25:35.200 | is 'cause it's the one that doesn't do that.
01:25:37.160 | The other 200 or whatever crash and burn
01:25:39.840 | and never recover.
01:25:40.840 | Like, let's go check in on the, you know,
01:25:42.360 | an Argentina stock market right now.
01:25:43.600 | Like, I don't think it's coming back anytime soon.
01:25:45.360 | - Yeah, my father's Argentine and immigrated to the US
01:25:48.400 | in the 1960s, so he would definitely agree with that.
01:25:51.200 | - Yeah, like it doesn't come, you know,
01:25:53.080 | like when their stocks crash, they don't come back.
01:25:55.560 | So, and then, you know, like Lysenkoism,
01:25:57.080 | like the Soviet Union never recovered from Lysenkoism,
01:25:58.960 | it never came back, it led to the end of the country.
01:26:01.560 | You know, literally the things that took down
01:26:02.980 | the Soviet Union were oil and wheat and the wheat thing,
01:26:05.480 | you can trace the crisis back to Lysenkoism.
01:26:08.320 | And so, yeah, no, look, pendulum always swings back,
01:26:11.720 | it's true only in the cases where the pendulum swings back.
01:26:14.720 | Everybody just conveniently forgets
01:26:16.560 | all the other circumstances where that doesn't happen.
01:26:18.440 | One of the things people, you see this in business also,
01:26:21.320 | people have a really hard time confronting really bad news.
01:26:25.440 | I don't know if you've noticed that.
01:26:27.920 | I think every doctor who's listening right now
01:26:29.360 | is like, yeah, no shit.
01:26:30.200 | But like, there are situations, you see it in business,
01:26:33.400 | there are situations that, you see Star Trek,
01:26:36.000 | remember Star Trek, the Kobayashi Maru simulator, right?
01:26:38.800 | So the big lesson to become a Star Trek captain
01:26:40.360 | is you had to go through the simulation
01:26:41.480 | called the Kobayashi Maru, and the point was,
01:26:43.260 | there's no way to, it's a no-win scenario, right?
01:26:45.500 | And then it turned out like Captain Kirk
01:26:47.660 | was the only person to ever win the scenario,
01:26:49.200 | and the way that he did it was he went in ahead of time
01:26:51.200 | and hacked the simulator, right?
01:26:53.080 | It was the only way to actually get through.
01:26:54.560 | And then there was a debate whether to fire him
01:26:55.960 | or make him a captain, so they made him a captain.
01:26:58.920 | And the problem is, in real life,
01:27:01.480 | you do get the Kobayashi Maru on a regular basis.
01:27:03.720 | There are actual no-win situations
01:27:05.240 | that you can't work your way out of.
01:27:07.040 | And as a leader, you can't ever cop to that, right?
01:27:09.120 | 'Cause you have to carry things forward,
01:27:10.480 | and you have to look for every possible choice you can.
01:27:12.480 | But every once in a while, you do run into a situation
01:27:14.440 | where it's really not recoverable.
01:27:16.400 | And at least I've found people just cannot cope with that.
01:27:18.960 | And so what happens is they basically then,
01:27:21.000 | they basically just exclude it from their memory
01:27:23.080 | that it ever happened.
01:27:25.400 | - I'm glad you brought up simulators
01:27:26.720 | 'cause I want to make sure that we talk about
01:27:29.160 | the new and emerging landscape of AI,
01:27:32.640 | artificial intelligence.
01:27:34.000 | And I could try and smooth our conversation
01:27:39.380 | of a moment ago with this one
01:27:42.560 | by creating some clever segue, but I'm not going to.
01:27:45.220 | Except I'm going to ask, is there a possibility
01:27:48.940 | that AI is going to remedy some of what we're talking about?
01:27:53.120 | Let's make sure that we earmark that for discussion
01:27:55.120 | a little bit later.
01:27:56.120 | But first off, because some of the listeners
01:27:59.040 | of this podcast might not be as familiar with AI
01:28:01.840 | as perhaps they should be.
01:28:03.580 | We've all heard about artificial intelligence.
01:28:05.320 | People hear about machine learning, et cetera.
01:28:07.440 | But it'd be great if you could define for us what AI is.
01:28:11.180 | People almost immediately hear AI and think,
01:28:15.680 | okay, robot's taking over.
01:28:17.380 | I'm going to wake up and I'm going to be strapped to the bed
01:28:19.860 | and my organs are going to be pulled out of me.
01:28:22.520 | Robots are going to be in my bank account.
01:28:24.400 | They're going to kill all my children, and it's dystopia for most.
01:28:29.400 | Clearly, that's not the way it's going to go.
01:28:34.560 | If you believe that machines can augment human intelligence
01:28:39.400 | and human intelligence is a good thing.
01:28:41.180 | So tell us what AI is and where you think it can take us,
01:28:46.180 | both good and bad.
01:28:49.240 | - Yeah, so there was a big debate
01:28:51.720 | when the computer was first invented,
01:28:53.080 | which was in the 1930s, 1940s.
01:28:55.540 | People like Alan Turing and John von Neumann and these people.
01:28:58.960 | And the big debate at the time was,
01:29:01.520 | because they knew they wanted to build computers,
01:29:02.960 | they had the basic idea.
01:29:04.960 | And there had been calculating machines before that
01:29:07.140 | and there had been these looms
01:29:08.760 | that you basically programmed with punch cards.
01:29:10.200 | And so there was a prehistory to computers
01:29:12.540 | that had to do with building
01:29:13.380 | sort of increasingly complex calculating machines.
01:29:15.680 | So they were kind of on a track,
01:29:16.760 | but they knew they were going to be able to build,
01:29:18.120 | they call it a general purpose computer
01:29:19.600 | that basically you could program
01:29:21.120 | on the way that you program computers today.
01:29:23.160 | But they had a big debate early on,
01:29:24.680 | which is should the fundamental architecture of the computer
01:29:27.240 | be based on either A, like calculating machines,
01:29:30.480 | like cash registers and looms and other things like that,
01:29:34.200 | or should it be based on a model of the human brain?
01:29:36.480 | And they actually had this idea of computers
01:29:38.920 | modeled on the human brain back then.
01:29:40.760 | And this was this concept of so-called neural networks.
01:29:43.620 | And it's actually fairly astonishing
01:29:45.120 | from a research standpoint.
01:29:46.600 | The original paper on neural networks
01:29:48.320 | actually was published in 1943, right?
01:29:50.600 | So they didn't have our level of neuroscience,
01:29:52.640 | but they actually knew about the neuron.
01:29:54.000 | And they actually had a theory
01:29:54.920 | of neurons interconnecting in synapses
01:29:56.680 | and information processing in the brain even back then.
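To make the 1943 idea concrete, here is a minimal sketch of a McCulloch-Pitts-style threshold neuron, the kind of unit that early neural-network paper described: it sums weighted inputs and fires if the sum clears a threshold. The weights, inputs, and threshold below are illustrative choices, not anything taken from the conversation.

```python
# A minimal sketch of a McCulloch-Pitts-style threshold neuron (1943 model).
# Weights, inputs, and the threshold are illustrative values only.

def threshold_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Example: a two-input neuron wired to behave like a logical AND gate.
and_weights = [1.0, 1.0]
print(threshold_neuron([1, 1], and_weights, threshold=2.0))  # 1 -> fires
print(threshold_neuron([1, 0], and_weights, threshold=2.0))  # 0 -> does not fire
```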
01:30:00.080 | And a lot of people at the time basically said,
01:30:02.600 | you know what, we should basically have the computer
01:30:04.360 | from the start be modeled after the human brain,
01:30:05.980 | 'cause if the computer could do everything
01:30:08.160 | that the human brain can do,
01:30:09.240 | that would be the best possible general purpose computer.
01:30:11.220 | And then you could have it do jobs
01:30:12.800 | and you could have it create art
01:30:13.960 | and you could have it do all kinds of things
01:30:15.000 | like humans can do.
01:30:16.000 | It turns out that didn't happen in our world.
01:30:20.260 | What happened instead was the industry
01:30:21.920 | went in the other direction.
01:30:22.920 | It went basically in the model of the calculating machine
01:30:25.040 | or the cash register.
01:30:25.880 | And I think practically speaking,
01:30:27.400 | that kind of had to be the case
01:30:28.760 | 'cause that was actually the technology
01:30:30.140 | that was practical at the time.
01:30:32.960 | But that's the path.
01:30:33.800 | And so what we all have experiences with,
01:30:36.760 | up to and including the iPhone in our pocket,
01:30:38.340 | is computers built on that,
01:30:39.840 | basically calculating machine model,
01:30:41.400 | not the human brain model.
01:30:42.880 | And so what that means is computers,
01:30:44.340 | as we have come to understand them,
01:30:46.260 | they're basically like mathematical savants at best, right?
01:30:50.160 | So they're like, they're really good at like,
01:30:52.400 | doing lots of mathematical calculations.
01:30:54.280 | They're really good at executing
01:30:55.460 | these extremely detailed computer programs.
01:30:57.720 | They're hyper literal.
01:30:59.540 | One of the things you learn early when you're a programmer
01:31:01.720 | is as the human programmer,
01:31:03.320 | you have to get every single instruction
01:31:04.600 | you give the computer correct,
01:31:05.620 | 'cause it will do exactly what you tell it to do.
01:31:08.100 | And bugs in computer programs are always a mistake
01:31:10.980 | on the part of the programmer.
01:31:12.060 | - Interesting.
01:31:12.900 | - You never blame the computer.
01:31:13.720 | You always blame the programmer
01:31:15.020 | 'cause that's the nature of the thing
01:31:16.780 | that you're dealing with.
01:31:17.620 | - One underscore off and the whole thing--
01:31:19.700 | - Yeah, yeah, and it's the programmer's fault.
01:31:22.300 | And if you talk to any programmer,
01:31:23.140 | they'll agree with this.
01:31:23.980 | They'll be like, yeah, if there's a problem, it's my fault.
01:31:25.480 | I did it.
01:31:26.320 | I can't blame the computer.
01:31:27.280 | The computer has no judgment.
01:31:28.500 | It has no ability to interpret, synthesize,
01:31:32.100 | develop an independent understanding of anything.
01:31:33.960 | It's literally just doing what I tell it to do step by step.
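As a concrete illustration of that hyper-literal behavior, here is a toy sketch: two nearly identical functions, where a single wrong character makes one of them silently compute the wrong answer. The function names and the specific slip are invented for the example, not taken from the conversation.

```python
# Illustrative only: the machine executes exactly what is written.
# A single-character slip ('=' instead of '+=') is the programmer's bug;
# the computer never interprets what was meant.

def average(values):
    total = 0
    for v in values:
        total += v                  # correct: accumulate the running total
    return total / len(values)

def average_buggy(values):
    total = 0
    for v in values:
        total = v                   # one character off: overwrites instead of accumulating
    return total / len(values)      # silently returns the wrong answer

print(average([2, 4, 6]))        # 4.0
print(average_buggy([2, 4, 6]))  # 2.0 -- the machine did exactly what it was told
```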
01:31:37.120 | So for 80 years, we've had this just,
01:31:39.320 | this very kind of hyper literal kind of model computers.
01:31:42.120 | These are called, technically,
01:31:43.400 | these are what are called von Neumann machines,
01:31:44.880 | named after the mathematician John von Neumann.
01:31:47.400 | They run in that way.
01:31:48.240 | And they've been very successful and very important
01:31:50.340 | and our world has been shaped by them.
01:31:52.140 | But there was always this other idea out there,
01:31:53.940 | which is, okay, how about a completely different approach,
01:31:56.220 | which is based much more on how the human brain operates,
01:31:59.220 | or at least our kind of best understanding
01:32:01.320 | of how the human brain operates, right?
01:32:02.820 | 'Cause those aren't the same thing.
01:32:05.100 | It basically says, okay, what if you could have a computer,
01:32:07.500 | instead of being hyper literal,
01:32:08.480 | what if you could have it actually be conceptual, right?
01:32:11.500 | And creative and able to synthesize information, right?
01:32:15.260 | And able to draw judgments and able to,
01:32:18.280 | behave in ways that are not deterministic,
01:32:21.120 | but are rather creative, right?
01:32:24.680 | And so, and the applications for this, of course,
01:32:27.560 | are endless.
01:32:28.400 | And so, for example, the self-driving car,
01:32:30.480 | the only way that you can make a car,
01:32:32.320 | you cannot program a computer with rules
01:32:34.400 | to make it a self-driving car.
01:32:35.400 | You have to do what Tesla and Waymo
01:32:36.880 | and these other companies have done now.
01:32:37.920 | You have to use AI, right?
01:32:39.200 | You have to use this other architecture.
01:32:40.960 | And you have to basically teach them
01:32:42.440 | how to recognize objects and images at high speeds,
01:32:45.040 | the same way, basically the same way the human brain does.
01:32:46.880 | And so those are so-called neural networks running inside.
01:32:49.540 | - So essentially let the machine operate based on priors.
01:32:53.060 | You know, we almost clipped a boulder
01:32:56.820 | going up this particular drive.
01:32:58.580 | And so therefore this shape that previously
01:33:00.760 | the machine didn't recognize as a boulder,
01:33:02.400 | it now introduces to its catalog of boulders.
01:33:05.320 | Is that sort of-
01:33:06.160 | - Yeah, a good example,
01:33:07.160 | or let's even make it even starker for self-driving car.
01:33:10.200 | There's something in the road.
01:33:11.280 | Is it a small child or a plastic shopping bag
01:33:13.640 | being blown by the wind?
01:33:15.880 | Very important difference.
01:33:17.840 | If it's a shopping bag,
01:33:18.800 | you definitely want to go straight through it
01:33:20.380 | because if you deviate off course,
01:33:21.760 | you might, you know, you're gonna make a fast,
01:33:24.000 | you know, it's the same challenge we have
01:33:25.400 | when we're driving.
01:33:26.240 | Like you don't want to swerve to avoid a shopping bag
01:33:27.820 | 'cause you might hit something
01:33:28.660 | that you didn't see on the side.
01:33:29.760 | If it's a small child, for sure you want to swerve, right?
01:33:31.960 | And so, but it's very, but like in that moment,
01:33:34.700 | and you know, small children come in different
01:33:36.200 | like shapes and descriptions
01:33:37.100 | and are wearing different kinds of clothes.
01:33:37.940 | - They might tumble onto the road
01:33:39.080 | the same way a bag would tumble.
01:33:40.640 | - Yeah, they might look like they're tumbling.
01:33:41.760 | And by the way, they might not be,
01:33:43.040 | they might be wearing a Halloween mask, right?
01:33:45.760 | They might not have a recognizable human face, right?
01:33:48.400 | Or it might be a kid with, you know, one leg, right?
01:33:51.180 | You definitely want to not hit those, right?
01:33:53.040 | Like, so you can't,
01:33:55.120 | this is what basically we figured out is,
01:33:56.720 | you can't apply the rules-based approach
01:33:59.080 | of a von Neumann machine to basically real life
01:34:01.560 | and expect the computer to be in any way understanding
01:34:03.900 | or resilient to change it
01:34:05.120 | to basically things happening in real life.
01:34:06.840 | And this is why there's always been such a stark divide
01:34:08.620 | between what the machine can do and what the human can do.
01:34:11.720 | And so basically what's happened is in the last decade,
01:34:14.320 | that second type of computer,
01:34:15.720 | the neural network based computer,
01:34:16.840 | has started to actually work.
01:34:18.760 | It started to work actually first interestingly in vision,
01:34:21.360 | recognizing objects and images,
01:34:22.560 | which is why the self-driving car is starting to work.
01:34:24.440 | - Face recognition.
01:34:25.280 | - Face recognition.
01:34:26.120 | - I mean, when I started off in visual neuroscience,
01:34:28.000 | which is really my original home in neuroscience,
01:34:31.320 | the idea that a computer or a camera
01:34:34.120 | could do face recognition better than a human
01:34:35.880 | was like a very low probability event
01:34:40.460 | based on the technology we had at the time,
01:34:42.420 | based on the understanding of the face recognition cells
01:34:44.660 | and the fusiform gyrus.
01:34:45.760 | Now you would be smartest to put all your money
01:34:50.760 | on the machine.
01:34:51.760 | You know, you want to find faces in airports,
01:34:53.480 | even with masks on and, you know,
01:34:55.140 | at profile versus straight on,
01:34:57.480 | machines can do it far better than most all people.
01:35:00.580 | I mean, they're the super recognizers,
01:35:02.420 | but even they can't match the best machines.
01:35:05.600 | Now, 10 years ago, what I just said was the exact reverse.
01:35:08.920 | - Right, that's right.
01:35:09.760 | - All right, so faces, handwriting, right,
01:35:13.680 | and then voice, right, being able to understand voice.
01:35:16.300 | Like if you use, just as a user, if you use Google Docs,
01:35:18.640 | it has a built-in voice transcription.
01:35:20.040 | They have sort of the best industry-leading
01:35:21.280 | kind of voice transcription.
01:35:22.580 | If you use voice transcription in Google Docs,
01:35:24.160 | it's breathtakingly good.
01:35:25.400 | You just speak into it and it just like types
01:35:27.340 | what you're saying.
01:35:28.280 | - Well, that's good 'cause in my phone,
01:35:29.420 | every once in a while, I'll say,
01:35:30.320 | "I need to go pick up a few things,"
01:35:31.760 | and it'll say, "I need to pick up a few thongs."
01:35:34.100 | And so Apple needs to get on board
01:35:37.500 | with whatever the voice recognition is that Google's using.
01:35:39.840 | - Maybe it knows you better than you think.
01:35:41.500 | (laughing)
01:35:43.560 | - That was not the topic I was avoiding discussing.
01:35:46.040 | - No, so that's on the list, right?
01:35:47.160 | That's on your list.
01:35:48.520 | So look, there's a reason actually why Google's so good
01:35:52.480 | and Apple is not right now at that kind of thing,
01:35:54.160 | and it actually goes to actually the,
01:35:55.720 | it's actually an ideological thing of all things.
01:35:58.120 | Apple does not permit pooling of data for any purpose,
01:36:03.400 | including training AI, whereas Google does.
01:36:06.480 | And Apple's just like stake their brand on privacy
01:36:08.800 | and among that is sort of a pledge
01:36:10.260 | that they don't like pool your data.
01:36:11.740 | And so all of Apple's AI is like AI
01:36:13.800 | that has to happen like locally on your phone,
01:36:16.060 | whereas Google's AI can happen in the cloud, right?
01:36:18.200 | It can happen across pool data.
01:36:19.100 | Now, by the way, some people think that that's bad
01:36:20.760 | 'cause they think pooling data is bad.
01:36:22.360 | But that's an example of the shift
01:36:24.080 | that's happening in the industry right now,
01:36:25.160 | which is you have this separation between the people
01:36:26.960 | who are embracing the new way of training AIs
01:36:29.600 | and the people who basically, for whatever reason, are not.
01:36:32.580 | - Excuse me, you say that some people think it's bad
01:36:34.920 | because of privacy issues or they think it's bad
01:36:36.640 | because of the reduced functionality of that AI.
01:36:40.160 | - Oh, no, so you're definitely gonna get,
01:36:41.760 | so there's three reasons AIs have started to work.
01:36:45.400 | One of them is just simply larger data sets,
01:36:47.760 | larger amounts of data.
01:36:48.960 | So specifically the reason why objects and images are now,
01:36:53.040 | the reason machines are now better than humans
01:36:54.520 | at recognizing objects and images or recognizing faces
01:36:57.000 | is because modern facial recognition AIs
01:36:59.520 | are trained across all photos on the internet of people,
01:37:02.720 | billions and billions and billions of photos, right?
01:37:04.620 | An unlimited number of photos of people on the internet.
01:37:06.880 | Attempts to train facial recognition systems
01:37:08.960 | 10 or 20 years ago, they'd be trained on, you know,
01:37:11.100 | thousands or tens of thousands of photos.
01:37:12.940 | - So the input data is simply much more vast.
01:37:15.520 | - Much larger.
01:37:16.360 | And this is the reason, to get to the conclusion on this,
01:37:18.080 | this is the reason why ChatGPT works so well,
01:37:19.920 | is ChatGPT, one of the reasons ChatGPT works so well
01:37:22.620 | is it's trained on the entire internet of text.
01:37:25.140 | And the entire internet of text was not something
01:37:27.040 | that was available for you to train an AI on
01:37:28.960 | until it came to actually exist itself,
01:37:30.800 | which is new in the last, you know, basically decade.
01:37:33.040 | So in the case of face recognition,
01:37:35.140 | I could see how having a much larger input data set
01:37:38.060 | would be beneficial if the goal is to recognize
01:37:40.080 | Marc Andreessen's face because you are looking
01:37:42.520 | for signal to noise against everything else, right?
01:37:45.360 | But in the case of ChatGPT,
01:37:47.320 | when you're pooling all text on the internet
01:37:49.700 | and you ask ChatGPT to say,
01:37:52.640 | construct a paragraph about Marc Andreessen's prediction
01:37:56.360 | of the future of human beings over the next 10 years
01:37:59.640 | and the likely to be most successful industries,
01:38:04.560 | give ChatGPT that.
01:38:06.060 | If it's pooling across all text,
01:38:08.540 | how does it know what is authentically Marc Andreessen's text?
01:38:12.840 | Because in the case of face recognition,
01:38:14.340 | you have a, you've got a standard to work from,
01:38:18.120 | a verified image versus everything else.
01:38:21.340 | In the case of text,
01:38:23.960 | you have to make sure that what you're starting with
01:38:26.080 | is verified text from your mouth.
01:38:28.520 | So, which makes sense if it's coming from video,
01:38:30.960 | but then if that video is deep faked,
01:38:33.820 | all of a sudden what's true,
01:38:36.260 | your valid Marc Andreessen, is in question,
01:38:41.120 | and then everything ChatGPT is producing
01:38:43.720 | is then in question as well.
01:38:45.240 | - Right.
01:38:46.080 | So I would say there's a before and after thing here.
01:38:48.000 | There's like a before,
01:38:48.920 | there's like a before ChatGPT and after GPT question, right?
01:38:51.840 | 'Cause the existence of GPT itself changes the answer.
01:38:55.040 | So before ChatGPT, so the reason,
01:38:57.100 | the version you're using today is trained on data
01:38:58.960 | up till September, 2021.
01:39:00.700 | That's the cutoff of the training set.
01:39:02.120 | Up till September, 2021,
01:39:03.860 | almost all text on the internet was written by a human being
01:39:06.720 | and then most of that was written
01:39:08.320 | by people under their own names.
01:39:09.520 | Some of it wasn't, but a lot of it was.
01:39:11.300 | And why do you know this for me
01:39:12.440 | is 'cause it was published in a magazine under my name
01:39:14.280 | or it's a podcast transcript and it's under my name.
01:39:16.920 | And generally speaking, if you just did a search on like,
01:39:19.240 | what are things Marc Andreessen has written and said,
01:39:21.000 | 90 plus percent of that would be correct.
01:39:23.800 | And there, look, somebody might have written a fake,
01:39:25.960 | you know, parody article or something like that,
01:39:27.840 | but like not that many people were spending that much time
01:39:30.260 | writing like fake articles about like things that I said.
01:39:32.360 | - So many people can pretend to be human.
01:39:34.060 | - Exactly, right.
01:39:34.900 | And so generally speaking, you can kind of get your arms
01:39:36.900 | around the idea that there's a corpus of material
01:39:38.660 | associated with me or by the way, same thing with you.
01:39:40.280 | There's a corpus of YouTube transcripts
01:39:41.940 | and other your academic papers and talks that you've given.
01:39:44.320 | You can kind of get your hands around that.
01:39:45.500 | And that's how these systems are trained.
01:39:47.060 | They take all that data collectively, they put it in there.
01:39:49.380 | And that's why this works as well as it does.
01:39:50.980 | And that's why if you ask ChatGPT to speak or write like me
01:39:54.200 | or like you or like somebody else,
01:39:56.660 | it will actually generally do a really good job
01:39:58.620 | 'cause it has all of our prior text in its training data.
01:40:02.000 | That said, from here on out, this gets harder.
01:40:05.300 | And of course, the reason this gets harder
01:40:06.660 | is because now we have AI that can create text.
01:40:09.500 | We have AI that can create text at industrial scale.
01:40:12.500 | - Is it watermarked as AI-generated text?
01:40:14.500 | - No, no, no, no.
01:40:15.340 | - How hard would it be to do that?
01:40:16.260 | - I think it's impossible.
01:40:17.740 | I think it's impossible.
01:40:18.620 | There are people who are trying to do that.
01:40:20.300 | This is a hot topic in the classroom.
01:40:21.680 | I was talking to a friend who's got like a 14 year old kid
01:40:23.460 | in a class and there's like these recurring scandals.
01:40:25.720 | It's like every kid in the class is using ChatGPT
01:40:27.980 | to like write their essays
01:40:28.980 | or to help them write their essays.
01:40:30.780 | And then the teacher is using one of,
01:40:34.320 | there's a tool that you can use
01:40:35.780 | that it purports to be able to tell you
01:40:38.340 | whether something was written by ChatGPT,
01:40:39.940 | but it's like only right like 60% of the time.
01:40:42.540 | And so there was this case where the student wrote an essay
01:40:45.180 | where their parents sat and watched them write the essay
01:40:48.020 | and then they submitted it
01:40:48.900 | and this tool got the conclusion incorrect.
01:40:50.720 | And then the student feels outraged
01:40:52.180 | 'cause he got unfairly accused of cheating,
01:40:53.240 | but the teacher is like, well, you're all using the tool.
01:40:55.240 | Then it turns out there's another tool
01:40:56.540 | that basically you feed in text
01:40:57.960 | and it actually is sort of,
01:40:59.700 | they call it a summarizer,
01:41:02.280 | but what it really is is it's a cheating mechanism
01:41:03.980 | to basically just shuffle the words around enough
01:41:06.840 | so that it sheds whatever characteristics
01:41:08.740 | were associated with AI.
01:41:09.960 | So there's like an arms race going on
01:41:11.860 | in educational settings right now
01:41:13.140 | around this exact question.
01:41:14.560 | I don't think it's possible to do,
01:41:16.460 | there are people working on the watermark.
01:41:17.620 | I don't think it's possible to do the watermarking.
01:41:18.980 | And I think it's just kind of obvious
01:41:20.100 | why it's not possible to do that,
01:41:21.220 | which is you can just read the output for yourself.
01:41:24.400 | It's really good.
01:41:25.720 | How are you actually gonna tell the difference
01:41:27.800 | between that and something that a real person wrote?
01:41:30.220 | And then by the way,
01:41:31.060 | you can also ask ChatGPT to write in different styles, right?
01:41:33.560 | So you can tell it like,
01:41:34.940 | write in the style of a 15 year old, right?
01:41:37.120 | You can tell it to write
01:41:37.960 | in the style of a non-native English speaker, right?
01:41:40.380 | Or if you're a non-native English speaker,
01:41:41.620 | you can tell it to write in the style of an English speaker,
01:41:44.140 | native English speaker, right?
01:41:45.060 | And so the tool itself will help you evade.
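As a rough sketch of the kind of style prompting being described, the snippet below asks a hosted chat model to write a request in a given voice. The endpoint and payload shape follow OpenAI's publicly documented chat-completions API, but the model name, prompts, and helper function are assumptions made for illustration, and the API key is a placeholder you would supply yourself.

```python
# Hedged sketch: ask a hosted chat model to write in a particular style.
# Endpoint and payload follow OpenAI's documented chat-completions API;
# model name and prompts are illustrative assumptions.
import os
import requests

def draft_in_style(request_text: str, style: str) -> str:
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",  # assumed model name
            "messages": [
                {"role": "system", "content": f"Write in the style of {style}."},
                {"role": "user", "content": request_text},
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

# e.g. draft_in_style("Summarize why neural networks started working recently.",
#                     "a 15 year old")
```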
01:41:47.860 | So I don't think that,
01:41:50.140 | I think there's a lot of people who are gonna want
01:41:51.660 | to distinguish, quote, real versus fake.
01:41:54.120 | I think those days are over.
01:41:56.100 | - Genie's out of the bottle.
01:41:56.940 | - Genie's completely out of the bottle.
01:41:58.220 | And by the way, I actually think this is good.
01:42:00.220 | This doesn't map to my worldview
01:42:02.380 | of how we use this technology anyway,
01:42:03.660 | which we can come back to.
01:42:04.980 | So there's that.
01:42:07.340 | So there's that.
01:42:08.340 | And then there's the problem, therefore,
01:42:09.680 | of like the so-called deep fake problem.
01:42:11.140 | So then there's the problem of like deliberate,
01:42:12.860 | basically, manipulation.
01:42:14.300 | And that's like one of your many enemies,
01:42:17.980 | on your increasingly long list of enemies,
01:42:20.820 | like mine, who basically is like, wow,
01:42:23.820 | I know how I'm gonna get him, right?
01:42:25.260 | I'm gonna use it to create something
01:42:28.500 | that looks like a Huberman transcript,
01:42:29.860 | and I'm gonna have him say all these bad things.
01:42:31.220 | - Or a video.
01:42:32.060 | - Or a video, or a video.
01:42:32.900 | - I mean, Joe Rogan and I were deep faked in a video.
01:42:36.200 | I don't want to flag people to it.
01:42:37.820 | I won't, so I won't talk about what it was about,
01:42:39.880 | but where it, for all the world,
01:42:43.620 | looked like a conversation that we were having,
01:42:45.580 | and we never had that specific conversation.
01:42:47.900 | - Yeah, that's right.
01:42:48.740 | So that's gonna happen for sure.
01:42:49.580 | And so what there's gonna need to be
01:42:51.200 | is there's gonna need to be basically registries
01:42:53.020 | where basically you, like in your case,
01:42:55.540 | you will submit your legitimate content
01:42:59.100 | into a registry under your unique cryptographic key, right?
01:43:02.480 | And then basically there will be a way
01:43:03.620 | to check against that registry
01:43:05.000 | to see whether that was the real thing.
01:43:06.160 | And I think this needs to be done for sure
01:43:07.940 | for public figures, it needs to be done for politicians,
01:43:09.860 | it needs to be done for music.
01:43:11.660 | - What about taking what's already out there
01:43:13.380 | and being able to authenticate it or not?
01:43:15.180 | - In the same way that many times per week,
01:43:18.180 | I get asked, is this your account about some,
01:43:20.820 | a direct message that somebody got on Instagram?
01:43:22.900 | And I always tell them, look, I only have the one account,
01:43:26.560 | this one verified account,
01:43:28.200 | although now with the advent of pay-to-play verification,
01:43:32.000 | makes it a little less potent as a security blanket
01:43:35.060 | for knowing if it's not this account, then it's not me.
01:43:38.900 | But in any case, these accounts pop up all the time,
01:43:41.600 | pretending to be me.
01:43:42.560 | And I'm relatively low on the scale, not low,
01:43:47.560 | but relatively low on the scale to say like a Beyonce
01:43:51.400 | or something like that,
01:43:52.280 | who has hundreds of millions of followers.
01:43:54.200 | So is there a system in mind where people could go in
01:43:58.840 | and verify text, click yes or no, this is me, this is not me.
01:44:02.440 | And even there, there's the opportunity for people to fudge
01:44:05.740 | to eliminate things about themselves
01:44:07.280 | that they don't want out there by saying,
01:44:08.620 | no, that's not me, I didn't actually say that or create that.
01:44:11.680 | - Yeah, no, that's right.
01:44:12.520 | So technologically, it's actually pretty straightforward.
01:44:14.760 | So the way to implement this technologically
01:44:16.360 | is with public key, it's called public key cryptography,
01:44:18.280 | which is the basis for how information
01:44:20.720 | is secured cryptographically in the world today.
01:44:22.200 | And so basically what you would do,
01:44:23.320 | the implementation form of this would be,
01:44:24.640 | you would pick whatever is your most trusted channel.
01:44:27.000 | Let's say it's your YouTube channel as an example,
01:44:28.760 | where just everybody just knows
01:44:30.120 | that it's you and your YouTube channel,
01:44:31.280 | 'cause you've been doing it for 10 years or whatever,
01:44:32.800 | and it's just obvious.
01:44:34.000 | And you would just publish,
01:44:34.880 | like in the about me page on YouTube,
01:44:36.800 | you would just publish your public cryptographic key
01:44:39.280 | that's unique to you, right?
01:44:40.920 | And then anytime anybody wants to check
01:44:42.680 | to see whether any piece of content is actually you,
01:44:44.700 | they go to a registry in the cloud somewhere,
01:44:47.680 | and they basically submit, they basically say,
01:44:49.440 | okay, is this him?
01:44:50.880 | And then they can basically check
01:44:52.040 | to see whether you, with your public key,
01:44:54.560 | had actually certified that this was something
01:44:56.800 | that you made.
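
For readers who want to see the shape of this, here is a minimal Python sketch of the certification scheme described above, using Ed25519 signatures from the widely used `cryptography` package. The registry here is just an in-memory dictionary, and the function names (`publish`, `is_authentic`) are illustrative placeholders, not any real service's API.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
import hashlib

# The creator generates a key pair once and posts the public key somewhere
# widely trusted (e.g. an "about" page), as described in the conversation.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Stand-in for the registry "in the cloud somewhere": content hash -> signature.
registry: dict[bytes, bytes] = {}

def publish(content: bytes) -> None:
    """Creator signs a hash of the content and submits it to the registry."""
    digest = hashlib.sha256(content).digest()
    registry[digest] = private_key.sign(digest)

def is_authentic(content: bytes) -> bool:
    """Anyone with the public key can check whether the creator certified this content."""
    digest = hashlib.sha256(content).digest()
    signature = registry.get(digest)
    if signature is None:
        return False  # never certified by the creator
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

publish(b"real episode audio bytes ...")
print(is_authentic(b"real episode audio bytes ..."))  # True
print(is_authentic(b"deep-faked audio bytes ..."))    # False
```
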
01:44:58.340 | Now, who runs that registry is an interesting question.
01:45:01.080 | If that registry is run by the government,
01:45:02.520 | we will call that the Ministry of Truth.
01:45:04.620 | I think that's probably a bad idea.
01:45:06.760 | If that registry is run by a company,
01:45:09.000 | we would call that basically the equivalent
01:45:10.920 | of like a credit bureau or something like that.
01:45:12.920 | Maybe that's how it happens.
01:45:13.920 | The problem with that is that company now becomes
01:45:15.800 | hacking target number one of every bad person on earth,
01:45:18.840 | 'cause you can, if anybody breaks into that company,
01:45:21.280 | they can fake all kinds of things.
01:45:22.920 | - Yeah, they own the truth.
01:45:24.120 | - Right, they own the truth, right.
01:45:25.080 | And by the way, insider threat,
01:45:26.080 | also their employees can monkey with it, right?
01:45:27.840 | So you have to really trust that company.
01:45:29.640 | The third way to do it is with a blockchain, right?
01:45:31.500 | And so this, with the crypto blockchain technology,
01:45:33.640 | you could have a distributed system,
01:45:35.080 | basically a distributed database in the cloud
01:45:37.480 | that is run through a blockchain,
01:45:38.880 | and then it implements this cryptography
01:45:41.200 | and the certification process.
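
As a rough illustration of the blockchain-style registry idea, here is a toy append-only, hash-chained log in Python. It only shows why tampering with an earlier entry is detectable; a real system would add the signatures from the sketch above, replication, and a consensus mechanism, and the class and field names here are made up for the example.

```python
import hashlib
import json
import time

class RegistryChain:
    """Toy append-only log: each entry commits to the hash of the previous one."""

    def __init__(self) -> None:
        self.entries = [{"prev": "0" * 64, "record": "genesis", "ts": 0}]

    def _hash(self, entry: dict) -> str:
        return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

    def append(self, record: dict) -> None:
        self.entries.append(
            {"prev": self._hash(self.entries[-1]), "record": record, "ts": time.time()}
        )

    def verify(self) -> bool:
        """Recompute every link; editing an old entry breaks all links after it."""
        return all(
            cur["prev"] == self._hash(prev)
            for prev, cur in zip(self.entries, self.entries[1:])
        )

chain = RegistryChain()
chain.append({"creator_pubkey": "ab12...", "content_sha256": "cd34...", "sig": "ef56..."})
chain.append({"creator_pubkey": "ab12...", "content_sha256": "9f87...", "sig": "0a1b..."})
print(chain.verify())  # True
chain.entries[1]["record"]["content_sha256"] = "tampered"
print(chain.verify())  # False: entry 2 no longer matches the hash of entry 1
```
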
01:45:42.600 | - What about quantum internet?
01:45:44.320 | Is that another way to encrypt these things?
01:45:45.860 | I know most of our listeners are probably not familiar
01:45:47.800 | with quantum internet, but put simply,
01:45:49.720 | it's a way to secure communications on the internet.
01:45:53.220 | Let's just leave it at that.
01:45:54.440 | It's sophisticated, and we'll probably do a whole episode
01:45:56.720 | about this at some point,
01:45:57.540 | but maybe you have a succinct way
01:45:58.800 | of describing quantum internet,
01:46:00.060 | that would be better, and if so, please offer it up.
01:46:04.520 | But is quantum internet going to be one way
01:46:06.800 | to secure these kinds of data and resources?
01:46:10.320 | - Maybe in the future, years in the future.
01:46:12.440 | We don't yet have working quantum computers in practice,
01:46:14.680 | so it's not currently something you could do,
01:46:16.560 | but maybe in a decade or two.
01:46:18.720 | - Tell me, I'm going to take a stab
01:46:19.980 | at defining quantum internet in one sentence.
01:46:21.600 | It's a way in which if anyone were to try and peer in
01:46:23.880 | on a conversation on the internet,
01:46:25.160 | it essentially would be futile
01:46:27.560 | because of the way that quantum internet changes
01:46:32.440 | the way that the communication is happening so fast
01:46:34.960 | and so many times in any one conversation,
01:46:37.120 | essentially changing the translation or the language so fast
01:46:40.080 | that there's just no way to keep up with it.
01:46:41.460 | Is that more or less accurate?
01:46:42.760 | - Yeah, conceivably, yeah, not yet, but yeah, someday.
01:46:45.840 | - So going back to AI,
01:46:48.140 | most people who hear about AI are afraid of AI.
01:46:50.560 | Well, I think most people who aren't informed.
01:46:54.600 | - This goes back to our elites versus masses thing.
01:46:57.120 | - Oh, interesting.
01:46:58.080 | Well, I heard you say that,
01:46:59.480 | and this is from a really wonderful tweet thread
01:47:04.660 | that we will link in the show note captions
01:47:06.600 | that you put out not long ago
01:47:08.980 | and that I've read now several times,
01:47:11.560 | and that everyone really should take the time to read it.
01:47:13.720 | It probably takes about 20 minutes to read it carefully
01:47:16.600 | and to think about each piece and I highly recommend it.
01:47:19.740 | But you said, and I'm quoting here,
01:47:23.540 | let's address the fifth,
01:47:26.880 | the one thing I actually agree with,
01:47:28.440 | which is AI will make it easier for bad people
01:47:31.160 | to do bad things.
01:47:32.600 | - Yeah, well, so first of all,
01:47:37.080 | there is a general freakout happening around AI.
01:47:38.980 | I think it's primarily, it's one of these, again,
01:47:40.460 | it's an elite-driven freakout.
01:47:41.520 | I don't think the man in the street knows, cares,
01:47:43.300 | or feels one way or the other.
01:47:44.300 | I think it's just not a relevant concept
01:47:45.900 | and it probably just sounds like science fiction.
01:47:47.840 | So I think there's an elite-driven freakout
01:47:50.580 | that's happening right now.
01:47:52.000 | I think that elite-driven freakout has many aspects to it
01:47:54.900 | that I think are incorrect, which is not surprising,
01:47:57.980 | I would think, given that I think the elites
01:47:59.180 | are incorrect about a lot of things.
01:48:00.640 | But I think they're very wrong about a number of things
01:48:02.140 | they're saying about AI. But that said,
01:48:04.340 | look, this is a very powerful new technology, right?
01:48:06.940 | This is like a new general purpose,
01:48:08.380 | like thinking technology, right?
01:48:10.340 | So like, what if machines could think, right?
01:48:12.420 | And what if you could use machines to think
01:48:14.580 | and what if you could have them think for you?
01:48:16.200 | There's obviously a lot of good that could come from that,
01:48:19.140 | but also people, you know,
01:48:20.360 | look, criminals could use them to plan better crimes.
01:48:24.000 | You know, terrorists could use them to plan
01:48:25.160 | better terror attacks and so forth.
01:48:26.420 | And so these are going to be tools
01:48:28.340 | that bad people can use to do bad things, for sure.
01:48:31.460 | - I can think of some ways that AI could be leveraged
01:48:33.780 | to do fantastic things, like in the realm of medicine,
01:48:38.780 | an AI pathologist perhaps can scan 10,000 slides
01:48:45.420 | of histology and find the one micro-tumor, the cellular aberration
01:48:50.620 | that would turn into a full-blown tumor.
01:48:53.020 | Whereas the even mildly fatigued
01:48:56.340 | or well-rested human pathologists, as great as they come,
01:49:00.500 | might miss that.
01:49:01.920 | And perhaps the best solution is for both of them to do it.
01:49:05.280 | And then for the human to verify what the AI has found
01:49:07.980 | and vice versa.
01:49:08.820 | - Right, that's right.
01:49:09.660 | - Right, and that's just one example.
01:49:10.900 | I mean, I can come up with thousands of examples
01:49:13.280 | where this would be wonderful.
01:49:15.120 | - I'll give you another one by the way, medicine.
01:49:17.780 | So you're talking about an analytic result,
01:49:19.300 | which is good and important.
01:49:20.180 | The other is like the machines are going to have
01:49:21.540 | much better bedside manner.
01:49:22.900 | They're going to be much better at dealing with the patient.
01:49:25.900 | And we already know there's already been a study.
01:49:27.540 | There's already been a study on that.
01:49:28.540 | So there was already a study done on this
01:49:31.140 | where there was a study team that scraped thousands
01:49:34.180 | of medical questions off of an internet forum.
01:49:35.860 | And then they had real doctors answer the questions.
01:49:38.200 | And then they had basically GPT-4 answer the questions.
01:49:40.820 | And then they had another panel of doctors
01:49:42.580 | score the responses, right?
01:49:44.460 | So there were no patients experimented on here.
01:49:46.340 | This was a test contained within the medical world.
01:49:49.020 | But then the panel of the judges,
01:49:51.960 | the panel of doctors who were the judges
01:49:53.620 | scored the answers on both factual accuracy
01:49:55.500 | and on bedside manner, on empathy.
01:49:58.500 | And the GPT-4 was equal or better
01:50:02.580 | on most of the factual questions analytically already.
01:50:05.880 | And it's not even a specifically trained medical AI.
01:50:08.940 | But it was overwhelmingly better on empathy.
01:50:12.100 | - Amazing. - Right?
01:50:13.300 | And so, and you know, I don't think, yeah, I don't,
01:50:15.940 | do you treat patients directly in your work?
01:50:18.180 | - No, I don't. - You don't, yeah.
01:50:19.020 | - I don't.
01:50:19.860 | We run clinical trials. - Right.
01:50:22.180 | - But I don't do any direct clinical.
01:50:24.340 | - So I have no direct experience with this,
01:50:26.700 | but from the surgeons, like if you talk to surgeons
01:50:29.500 | or you talk to people who train surgeons,
01:50:31.140 | what they'll tell you is like surgeons
01:50:32.500 | need to have an emotional remove from their patients
01:50:34.500 | in order to do a good job with the surgery.
01:50:35.980 | The side effect of that, and by the way,
01:50:37.580 | look, it's a hell of a job to have to go in
01:50:39.020 | and tell somebody that they're gonna die, right?
01:50:40.700 | Or that they have, so they're never gonna recover.
01:50:42.340 | They're never gonna walk again or whatever it is.
01:50:43.780 | And so there's sort of something inherent in that job
01:50:46.660 | where they need to keep an emotional reserve
01:50:48.180 | from the patient, right, to be able to do the job.
01:50:50.620 | And it's expected of them as professionals.
01:50:53.220 | The machine has no such limitation.
01:50:55.180 | Like the machine can be as sympathetic as you want it to be
01:50:57.980 | for as long as you want it to be.
01:50:59.260 | It can be infinitely sympathetic.
01:51:00.420 | It's happy to talk to you at four in the morning.
01:51:01.980 | It's happy to sympathize with you.
01:51:03.300 | And by the way, it's not just sympathizing with you
01:51:05.980 | in the way that, oh, it's just lying.
01:51:07.740 | You know, it's just making up words to lie to you
01:51:09.140 | to make you feel good.
01:51:10.340 | It can also sympathize with you
01:51:11.580 | in terms of helping you through all the things
01:51:13.220 | that you can actually do to improve your situation, right?
01:51:15.580 | And so, you know, boy, like if you'd be, you know,
01:51:18.740 | can you keep a patient actually on track
01:51:20.340 | with a physical therapy program?
01:51:21.540 | Can you keep a patient on track with a nutritional program?
01:51:23.780 | Can you keep a patient off of drugs or alcohol, right?
01:51:26.340 | And if they have a machine medical companion
01:51:28.660 | that's with them all the time
01:51:29.820 | that they're talking to all the time,
01:51:30.780 | that's infinitely patient, infinitely wise, right?
01:51:33.980 | Infinitely loving, right?
01:51:35.700 | And it's just gonna be there all the time.
01:51:37.300 | And it's gonna be encouraging.
01:51:38.300 | And it's gonna be saying, you know,
01:51:39.140 | you did such a great job yesterday.
01:51:40.180 | I know you can do this again today.
01:51:42.020 | Cognitive behavioral therapy is an obvious fit here.
01:51:44.500 | These things are gonna be great at CBT.
01:51:46.180 | And that's already starting.
01:51:47.220 | But you can already use ChatGPT as a CBT therapist
01:51:50.580 | if you want.
01:51:51.400 | It's actually quite good at it.
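
As a sketch of what that looks like in practice, here is a minimal example using the OpenAI Python SDK (v1.x) to prompt a general chat model with a CBT-style system prompt. The model name and prompt wording are placeholders, and this is an illustration only, not a clinical tool or a substitute for a licensed therapist.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are a supportive coach who uses cognitive behavioral therapy techniques. "
    "Help the user name the automatic thought, weigh the evidence for and against it, "
    "and propose one balanced alternative thought plus one small concrete action."
)

def cbt_reply(user_message: str) -> str:
    # Placeholder model name; swap in whichever chat model you actually use.
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(cbt_reply("I bombed a presentation today and feel like I'm terrible at my job."))
```
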
01:51:52.420 | And so there's a universe here that's,
01:51:55.100 | it goes to what you said.
01:51:55.940 | There's a universe here that's opening up,
01:51:57.300 | which is what I believe is it's partnership
01:51:59.520 | between man and machine, right?
01:52:01.180 | It's a symbiotic relationship,
01:52:02.700 | not an adversarial relationship.
01:52:04.220 | And so the doctor is going to pair with the AI
01:52:07.200 | to do all the things that you described.
01:52:08.740 | But the patient is also going to pair with the AI.
01:52:10.900 | And I think this partnership that's gonna emerge
01:52:15.060 | is gonna lead, among other things,
01:52:16.180 | to actually much better health outcomes.
01:52:18.180 | - I mean, I've relied for so much of my life
01:52:20.880 | on excellent mentors from a very young age and still now
01:52:25.500 | in order to make best decisions possible
01:52:29.380 | with the information I had.
01:52:30.820 | And rarely were they available at four in the morning,
01:52:34.480 | sometimes, but not on a frequent basis.
01:52:36.820 | And they fatigue like anybody else.
01:52:38.640 | And they have their own stuff like anybody else,
01:52:42.440 | baggage events in their life, et cetera.
01:52:44.540 | What you're describing is a sort of AI coach
01:52:48.440 | or therapist of sorts
01:52:50.200 | that hopefully would learn to identify our best self
01:52:53.900 | and encourage us to be our best self.
01:52:56.340 | And when I say best self,
01:52:57.780 | I don't mean that in any kind of pop psychology way.
01:53:00.180 | I mean, I could imagine AI very easily
01:53:02.140 | knowing how well I slept the night before
01:53:04.500 | and what types of good or bad decisions
01:53:06.260 | I tend to make at two o'clock in the afternoon
01:53:08.580 | when I've only had five hours of sleep,
01:53:10.340 | or maybe just less REM sleep the night before.
01:53:13.340 | It might encourage me to take a little more time
01:53:15.140 | to think about something.
01:53:16.600 | Might give me a little tap on the wrist
01:53:18.580 | through a device that no one else would detect
01:53:20.280 | to refrain from something.
01:53:23.480 | - Never going to judge you.
01:53:24.900 | It's never going to be resentful.
01:53:26.100 | It's never going to be upset that you didn't listen to it.
01:53:28.920 | It's never going to go on vacation.
01:53:30.880 | It's going to be there for you.
01:53:31.720 | I think this is the way people are going to live.
01:53:34.000 | It's going to start with kids
01:53:34.840 | and then over time it's going to be adults.
01:53:36.000 | And the way people are going to live
01:53:37.120 | is they're going to have a friend, therapist,
01:53:39.680 | companion, mentor, coach, teacher, assistant,
01:53:43.260 | and that, or by the way, maybe multiple of those.
01:53:46.560 | Maybe we're actually talking about six
01:53:47.760 | like different personas interacting,
01:53:48.940 | which is a whole other possibility.
01:53:50.300 | But they're going to have-
01:53:51.140 | - A committee.
01:53:51.980 | - A committee, yeah, yeah, yeah.
01:53:52.800 | - A committee.
01:53:53.640 | - Yeah, exactly.
01:53:54.480 | Actually different personas.
01:53:55.300 | And maybe by the way,
01:53:56.140 | when there are difficult decisions to be made in your life,
01:53:56.980 | maybe what you want to hear is the argument
01:53:58.640 | among the different personas.
01:54:00.720 | And so you're just going to grow up.
01:54:03.720 | You're just going to have this in your life
01:54:05.120 | and you're going to always be able to talk to it
01:54:07.020 | and always be able to learn from it
01:54:08.320 | and always be able to help it make, you know,
01:54:09.840 | and like, it's going to be a symbiotic relationship.
01:54:14.000 | I think it's going to be a much better way to live.
01:54:15.120 | I think people are going to get a lot out of it.
01:54:16.760 | - What modalities will it include?
01:54:18.680 | So I can imagine my phone has this engine in it,
01:54:22.960 | this AI companion,
01:54:24.360 | and I'm listening in headphones as I walk into work
01:54:27.600 | and it's giving me some, not just encouragement,
01:54:30.240 | some warning, some thoughts that,
01:54:33.100 | things that I might ask Marc Andreessen today
01:54:34.940 | that I might not have thought of and so on.
01:54:37.680 | I could also imagine it having a more human form.
01:54:40.720 | I could imagine it being a tactile,
01:54:43.000 | having some haptics or tapping to remind me
01:54:45.140 | so that it's not going to enter our conversation
01:54:47.220 | in a way that interferes or distracts you,
01:54:50.120 | but I would be aware, oh, right.
01:54:52.100 | You know, things of that sort.
01:54:53.440 | I mean, how many different modalities
01:54:55.680 | are we going to allow these AI coaches to approach us with?
01:54:59.820 | And is anyone actually thinking
01:55:01.320 | about the hardware piece right now?
01:55:03.160 | 'Cause I'm hearing a lot about the software piece.
01:55:05.220 | What does the hardware piece look like?
01:55:07.100 | - Yeah, so this is where Silicon Valley is going to kick in.
01:55:09.440 | So the entrepreneurial community
01:55:10.600 | is going to try all of those, right?
01:55:12.600 | By the way, the big companies and startups
01:55:14.280 | are going to try all those.
01:55:15.120 | So obviously, there's big companies that are working.
01:55:18.040 | The big companies have talked about a variety of these,
01:55:19.760 | including heads-up displays, AR/VR kinds of things.
01:55:23.920 | There's lots of people doing voice.
01:55:25.560 | The voice thing is, voice is a real possibility.
01:55:27.560 | It may just be an earpiece.
01:55:30.040 | There's a new startup that just unveiled a new thing
01:55:33.320 | where they actually project.
01:55:35.240 | So you'll have a pendant you wear on a necklace,
01:55:37.520 | and it actually literally will project images on your hand
01:55:40.900 | or on the table or on the wall in front of you.
01:55:42.520 | So maybe that's how it shows up.
01:55:44.920 | Yeah, there are people working on so-called haptic or touch-based
01:55:48.040 | kinds of things.
01:55:48.820 | There are people working on actually picking up
01:55:50.920 | nerve signals out of your arm to be able to--
01:55:56.440 | there's some science for being able to do basically
01:55:59.440 | like subvocalization.
01:56:01.360 | So maybe you could pick it up that way, bone conduction.
01:56:06.160 | So yeah, these are all going to be tried.
01:56:08.640 | So that's one question is the physical form of it.
01:56:10.720 | And then the other question is the software version of it,
01:56:13.100 | which is like, OK, what's the level of abstraction
01:56:15.140 | that you want to deal with these things in?
01:56:18.320 | Right now, it's like a question-answer paradigm
01:56:20.200 | in so-called chatbot.
01:56:21.360 | Ask a question, get an answer, ask a question, get an answer.
01:56:23.680 | Well, you want that to go for sure
01:56:25.080 | to more of a fluid conversation.
01:56:26.400 | You want it to build up more knowledge of who you are,
01:56:28.320 | and you don't want to have to explain yourself
01:56:29.600 | a second time and so forth.
01:56:30.960 | And then you want to be able to tell it things like, well,
01:56:32.360 | remind me this, that, or be sure and tell me when, x.
01:56:36.040 | But then maybe over time, more and more,
01:56:37.660 | you want it actually deciding when
01:56:39.360 | it's going to talk to you.
01:56:40.880 | And when it thinks it has something to say,
01:56:42.580 | it says it, and otherwise, it stays silent.
01:56:44.640 | And normally, at least in my head,
01:56:46.720 | unless I make a concerted effort to do otherwise,
01:56:50.160 | I don't think in complete sentences.
01:56:53.040 | So presumably, these machines could
01:56:58.040 | learn my style of fragmented internal dialogue.
01:57:02.080 | And maybe I have an earpiece, and I'm walking in,
01:57:04.720 | and I start hearing something.
01:57:07.280 | But it's some advice, et cetera, encouragement, discouragement.
01:57:12.100 | But at some point, those sounds that I hear in an earphone
01:57:17.100 | are very different than seeing something
01:57:19.080 | or hearing something in the room.
01:57:20.160 | We know this based on the neuroscience
01:57:22.500 | of musical perception and language perception.
01:57:24.720 | Hearing something in your head is very different.
01:57:27.720 | And I could imagine at some point
01:57:28.840 | that the AI will cross a precipice
01:57:30.640 | where if it has inline wiring to actually control
01:57:34.860 | neural activity in specific brain areas,
01:57:36.560 | and I don't mean very precisely,
01:57:37.980 | even just stimulating a little more
01:57:39.320 | prefrontal cortical activity, for instance,
01:57:40.960 | through the earpiece, a little ultrasound wave now
01:57:43.060 | can stimulate prefrontal cortex in a non-invasive way.
01:57:46.040 | That's being used clinically and experimentally.
01:57:48.440 | That the AI could decide that I need
01:57:52.160 | to be a little bit more context aware, right?
01:57:56.280 | This is something that is very beneficial
01:57:58.320 | for those listening that are trying to figure out
01:58:00.040 | how to navigate through life.
01:58:01.200 | It's like, know the context you're in
01:58:02.980 | and know the catalog of behaviors and words
01:58:05.020 | that are appropriate for that situation and not others.
01:58:07.440 | And this would go along with agreeableness perhaps,
01:58:11.700 | but strategic agreeableness, right?
01:58:13.760 | Context is important.
01:58:15.720 | There's nothing diabolical about that context is important,
01:58:17.800 | but I could imagine the AI recognizing,
01:58:19.900 | ah, we're entering a particular environment.
01:58:22.720 | I'm now actually going to ramp up activity
01:58:24.400 | in prefrontal cortex a little bit in a certain way
01:58:27.000 | that allows you to be more situationally aware
01:58:30.000 | of yourself and others, which is great
01:58:32.720 | unless I can't necessarily short circuit that influence
01:58:37.160 | because at some point the AI is actually then controlling
01:58:41.660 | my brain activity and my decision-making and my speech.
01:58:44.340 | I think that's what people fear is that once we cross
01:58:47.040 | that precipice, that we are giving up control
01:58:49.760 | to the artificial versions of our human intelligence.
01:58:52.760 | - And look, I think we have to decide.
01:58:54.400 | We collectively and we as individuals, I think,
01:58:56.160 | have to decide exactly how to do that.
01:58:57.680 | And this is the big thing that I believe about AI.
01:58:59.560 | There's just a much more, I would say, practical view
01:59:01.280 | of the world than a lot of the panic that you hear.
01:59:03.300 | It's just like, these are machines.
01:59:04.920 | They're able to do things that increasingly
01:59:06.640 | are like the things that people can do in some circumstances.
01:59:08.640 | But these are machines, we build the machines,
01:59:10.160 | we decide how to use the machines.
01:59:12.120 | When we want the machines turned on, they're turned on.
01:59:13.500 | We want them turned off, they're turned off.
01:59:14.840 | And so, yeah, so I think that's absolutely the kind of thing
01:59:17.320 | that the individual person should always be in charge of.
01:59:19.680 | - I mean, everyone was, and I have to imagine
01:59:22.180 | some people are still afraid of CRISPR, of gene editing,
01:59:25.120 | but gene editing stands to revolutionize our treatment
01:59:27.560 | of all sorts of diseases.
01:59:29.480 | You know, inserting and deleting particular genes
01:59:32.020 | in adulthood, right, not having to recombine in the womb
01:59:35.200 | a new organism is an immensely powerful tool.
01:59:38.480 | And yet the Chinese scientist who did CRISPR on humans,
01:59:42.480 | this has been done, actually did his postdoc at Stanford
01:59:45.880 | with Steve Quake, then went to China,
01:59:48.360 | did CRISPR on babies, mutated something,
01:59:50.680 | I believe it was the HIV, one of the HIV receptors.
01:59:53.820 | I'm told it was with the intention
01:59:55.760 | of augmenting human memory.
01:59:57.680 | It had very little to do, in fact,
01:59:59.160 | with limiting susceptibility to HIV per se
02:00:02.800 | and more to do with the way that that receptor
02:00:03.960 | is involved in human memory.
02:00:05.360 | The world demonized that person.
02:00:10.000 | We actually don't know what happened to them,
02:00:11.400 | whether or not they have a laboratory now
02:00:12.680 | or they're sitting in jail, it's unclear.
02:00:14.580 | But in China and elsewhere,
02:00:16.520 | people are doing CRISPR on humans.
02:00:18.600 | We know this, it's not legal in the US and other countries,
02:00:23.240 | but it's happening.
02:00:25.020 | Do you think it's a mistake for us
02:00:28.640 | to fear these technologies so much
02:00:30.520 | that we back away from them and end up 10, 20 years
02:00:33.080 | behind other countries that could use it
02:00:35.120 | for both benevolent or malevolent reasons?
02:00:38.640 | - Yeah, so there's always, and the details matter,
02:00:41.360 | so it's technology by technology,
02:00:42.960 | but I would say there's two things.
02:00:44.540 | You always have to think in these questions,
02:00:45.960 | I think, in terms of counterfactuals and opportunity cost.
02:00:48.640 | And so, CRISPR's an interesting one.
02:00:51.440 | CRISPR, you manipulate the human genome.
02:00:53.280 | Nature manipulates the human genome.
02:00:55.040 | Like, in all kinds of ways.
02:00:58.520 | - Yeah, when you pick a spouse
02:00:59.440 | and you have a child with that spouse,
02:01:00.680 | you're doing genetic recombination.
02:01:02.400 | - You are, really, yes.
02:01:03.680 | You are quite possibly, if you're Genghis Khan,
02:01:05.880 | you're determining the future of humanity, right,
02:01:08.240 | by those, like, yeah, nature, I mean, look, mutations.
02:01:13.240 | So, this is the old question of, like, basically,
02:01:17.080 | this is all state of nature, state of grace.
02:01:19.000 | Like, basically, is nature good,
02:01:20.720 | and then therefore artificial things are bad,
02:01:23.300 | which is kind of shocking.
02:01:24.140 | A lot of people have ethical views like that.
02:01:26.880 | I'm always at the view that nature's a bitch
02:01:28.920 | and wants us dead.
02:01:30.180 | Like, nature's out to get us, man.
02:01:32.320 | Like, nature wants to kill us, right?
02:01:34.240 | Like, nature wants to, like,
02:01:35.120 | evolve all kinds of, like, horrible viruses.
02:01:36.760 | Nature wants, you know, plagues.
02:01:37.880 | Nature wants to, like, do, you know, whether, you know,
02:01:39.960 | like, nature wants to do all kinds of stuff.
02:01:42.000 | I mean, look, the original,
02:01:42.840 | nature religion was the original religion, right?
02:01:44.680 | Like, that was the original thing people worshiped.
02:01:46.560 | And the reason was because nature was the thing
02:01:48.060 | that was out to get you, right,
02:01:49.800 | before you had scientific and technological methods
02:01:51.960 | to be able to deal with it.
02:01:53.640 | So, the idea of not doing these things, to me,
02:01:57.200 | is just saying, oh, we're just gonna turn over
02:01:58.760 | the future of everything to nature,
02:02:00.100 | and I don't think that that,
02:02:01.400 | there's no reason to believe that that leads
02:02:02.840 | in a particularly good direction or a bad, you know,
02:02:05.720 | that's not a value neutral decision.
02:02:07.480 | And then the related thing that comes from that
02:02:09.760 | is this, always this question around
02:02:11.080 | what's called the precautionary principle,
02:02:13.120 | which shows up in all these conversations
02:02:14.560 | on things like CRISPR, which basically is this,
02:02:17.440 | it's this principle that basically says
02:02:18.960 | the inventors of a new technology
02:02:20.240 | should be required to prove
02:02:21.460 | that it will not have negative effects
02:02:22.600 | before they roll it out.
02:02:23.800 | This, of course, is a very new idea.
02:02:26.920 | This is actually a new idea in the 1970s.
02:02:29.200 | It was actually invented by the German Greens
02:02:30.560 | in the 1970s.
02:02:31.680 | Before that, people didn't think in those terms.
02:02:33.720 | People just invented things and rolled them out,
02:02:36.640 | and we got all of modern civilization
02:02:38.560 | by people inventing things and rolling them out.
02:02:41.760 | The German Greens came up with the precautionary principle
02:02:43.600 | for one specific purpose.
02:02:44.760 | I'll bet you can guess what it is.
02:02:46.400 | It was to prevent--
02:02:49.480 | - Famine? - Nuclear power.
02:02:51.400 | It was to shut down attempts to do civilian nuclear power,
02:02:54.560 | and if you fast forward 50 years later,
02:02:56.500 | you're like, wow, that was a big mistake, right?
02:02:59.140 | So what they said at the time was
02:03:00.860 | you have to prove that nuclear reactors
02:03:02.080 | are not gonna melt down and cause all kinds of problems,
02:03:04.240 | and of course, as an engineer,
02:03:05.280 | can you prove that that will never happen?
02:03:07.020 | You can't.
02:03:07.860 | You can't rule out things that might happen in the future.
02:03:10.540 | And so that philosophy was used to stop nuclear power,
02:03:14.420 | by the way, not just in Europe, but also in the US
02:03:17.140 | and around much of the rest of the world.
02:03:18.580 | If you're somebody who's concerned
02:03:19.660 | about carbon emissions, of course,
02:03:20.940 | this is the worst thing that happened in the last 50 years
02:03:22.980 | in terms of energy.
02:03:24.160 | We actually have the silver bullet answer to unlimited energy
02:03:26.980 | with zero carbon emissions, nuclear power.
02:03:28.980 | We choose not to do it.
02:03:30.600 | Not only do we choose not to do it,
02:03:32.100 | we're actually shutting down the plants that we have now
02:03:34.860 | in California, and we just shut down the big plant.
02:03:37.480 | Germany just shut down their plants.
02:03:39.100 | Germany's in the middle of an energy war with Russia
02:03:41.760 | that we are informed is existential for the future of Europe.
02:03:44.220 | - But unless the risk of nuclear power plant meltdown
02:03:47.820 | has increased, and I have to imagine it's gone the other way,
02:03:51.420 | what is the rationale behind shutting down these plants
02:03:54.140 | and not expanding?
02:03:54.980 | - Because nuclear is bad, right?
02:03:56.180 | Nuclear is icky, nuclear is...
02:03:57.900 | Nuclear has been tagged.
02:03:58.980 | - It just sounds bad, nuclear.
02:04:00.840 | - Yeah, yeah, yeah.
02:04:01.740 | - Go nuclear.
02:04:02.580 | - Nuclear, well, so what happened?
02:04:03.420 | - We didn't shut down postal offices,
02:04:05.020 | and yet you hear the phrase, go postal.
02:04:06.060 | - So what happened was, so nuclear technology arrived
02:04:08.840 | on planet Earth as a weapon, right?
02:04:10.460 | So it arrived in the form of the...
02:04:12.060 | The first thing they did was in the middle of World War II.
02:04:13.900 | The first thing they did was the atomic bomb.
02:04:15.240 | They dropped it on Japan, and then there were all
02:04:16.980 | the debates that followed around nuclear weapons
02:04:18.620 | and disarmament, and there's a whole conversation
02:04:20.580 | to be had, by the way, about that,
02:04:22.180 | 'cause there's different views you could have on that.
02:04:24.460 | And then it was in the '50s and '60s
02:04:25.860 | where they started to roll out civilian nuclear power,
02:04:27.500 | and then there were accidents.
02:04:29.300 | There was like Three Mile Island melted down,
02:04:31.100 | and then Chernobyl melted down in the Soviet Union,
02:04:34.020 | and then even recently, Fukushima melted down.
02:04:36.260 | And so there have been meltdowns.
02:04:37.980 | And so I think it was a combination of it's a weapon,
02:04:40.980 | it is sort of icky.
02:04:42.620 | There's a lot of scientists sometimes say the ick factor,
02:04:45.420 | right, it's radioactive, it glows green.
02:04:49.260 | By the way, it becomes like a mythical fictional thing,
02:04:53.340 | and so you have all these movies of horrible supervillains
02:04:55.860 | powered by nuclear energy and all this stuff.
02:04:58.060 | - Well, the intro to "The Simpsons," right,
02:04:59.700 | is the nuclear power plant and the three-eyed fish
02:05:02.400 | and all the negative implications
02:05:05.620 | of this nuclear power plant run by,
02:05:07.940 | at least on "The Simpsons," idiots.
02:05:10.540 | And that is the dystopia where people are unaware
02:05:15.540 | of just how bad it is.
02:05:17.540 | - And who owns the nuclear power plant, right?
02:05:19.100 | This evil capitalist, right?
02:05:21.940 | So it's connected to capitalism, right?
02:05:24.900 | - So we're blaming Matt Groening
02:05:26.380 | for the demise of a particular nuclear power plant.
02:05:28.380 | - He certainly didn't help, right?
02:05:31.500 | But it's literally this amazing thing
02:05:33.020 | where if you're just thinking rationally, scientifically,
02:05:35.660 | you're like, okay, we want to get rid of carbon,
02:05:37.380 | this is the obvious way to do it.
02:05:38.500 | So, okay, fun fact, Richard Nixon did two things
02:05:43.180 | that really mattered on this.
02:05:44.000 | So one is he defined in 1971
02:05:45.540 | something called project independence,
02:05:47.280 | which was to create 1,000 new state-of-the-art
02:05:49.220 | nuclear plants, civilian nuclear plants in the US by 1980
02:05:52.140 | and to get the US completely off of oil
02:05:54.380 | and cut the entire US energy grid
02:05:56.020 | over to nuclear power electricity,
02:05:57.340 | cut over to electric cars, the whole thing,
02:05:58.940 | like detach from carbon.
02:06:00.340 | You'll notice that didn't happen.
02:06:03.460 | Why did that not happen?
02:06:04.300 | 'Cause he also created the EPA
02:06:05.780 | and the Nuclear Regulatory Commission,
02:06:07.520 | which then prevented that from happening, right?
02:06:09.120 | And the Nuclear Regulatory Commission
02:06:10.480 | did not authorize a new nuclear plant
02:06:11.980 | in the US for 40 years.
02:06:13.180 | - Why would he hamstring himself like that?
02:06:16.020 | - You know, he got distracted [laughs]
02:06:19.220 | by Watergate and Vietnam.
02:06:21.880 | - I think Ellsberg just died recently, right?
02:06:24.020 | The guy who released the Pentagon Papers.
02:06:25.420 | Yeah, so it's complicated.
02:06:27.540 | - But it's, yeah, yeah, yeah, exactly.
02:06:28.720 | It's this thing, yeah, he didn't,
02:06:30.100 | he left office shortly thereafter.
02:06:31.260 | He didn't have time to fully figure this out.
02:06:33.340 | I don't know whether he would have figured it out or not.
02:06:35.500 | Look, Ford could have figured it out.
02:06:36.620 | Carter could have figured it out.
02:06:37.580 | Reagan could have figured it out.
02:06:38.540 | Any of these guys could have figured it out.
02:06:39.780 | It's like the most obvious,
02:06:40.740 | knowing what we know today,
02:06:41.780 | it's the most obvious thing in the world.
02:06:43.600 | The Russia thing is the amazing thing.
02:06:44.860 | It's like Europe is literally funding
02:06:46.300 | Russia's invasion of Ukraine by paying them for oil, right?
02:06:49.420 | And they can't shut off the oil
02:06:50.740 | 'cause they won't cut over to nuclear, right?
02:06:52.540 | And then of course what happens, okay,
02:06:54.020 | so then here's the other kicker of what happens, right?
02:06:55.740 | Which is they won't do nuclear,
02:06:57.100 | but they want to do renewables, right?
02:06:59.260 | Sustainable energy.
02:07:00.220 | And so what they do is they do solar and wind.
02:07:02.900 | Solar and wind are not reliable
02:07:04.660 | 'cause it sometimes gets dark out
02:07:06.460 | and sometimes the wind doesn't blow.
02:07:08.540 | And so then what happens is they fire up the coal plants.
02:07:10.900 | Right, and so the actual consequence
02:07:13.020 | of the precautionary principle
02:07:14.380 | for the purpose it was invented
02:07:15.980 | is a massive spike in use of coal.
02:07:17.980 | - That's taking us back over 100 years.
02:07:19.820 | - Yes, correct.
02:07:20.820 | That is the consequence of the precautionary principle.
02:07:23.180 | Like that's the consequence of that mentality.
02:07:25.620 | And so it's a failure of a principle on its own merits
02:07:28.020 | for the thing it was designed for.
02:07:29.700 | And then there is a whole movement of people
02:07:31.880 | who want to apply it to every new thing.
02:07:33.260 | And this is the hot topic on AI right now in Washington,
02:07:36.540 | which is like, oh my God, these people have to prove
02:07:38.260 | that this can never get used for bad things.
02:07:39.860 | - Sorry, I'm hung up on this nuclear thing
02:07:41.660 | and I wonder, can it just be renamed?
02:07:46.260 | I mean, seriously.
02:07:47.820 | I mean, there is something about the naming of things.
02:07:50.400 | We know this in biology, right?
02:07:52.220 | I mean, you know, Lamarckian evolution
02:07:55.220 | and things like that, these are bad words in biology,
02:07:57.780 | but we had a guest on this podcast, Oded Rechavi,
02:07:59.800 | who's over in Israel, who's shown inherited traits.
02:08:03.200 | But if you talk about it as Lamarckian,
02:08:05.200 | then it has all sorts of negative implications,
02:08:07.260 | but his discoveries have important implications
02:08:10.480 | for everything from inherited trauma
02:08:12.580 | to treatment of disease.
02:08:14.200 | I mean, there's all sorts of positives that await us
02:08:16.460 | if we are able to reframe our thinking around something
02:08:19.260 | that yes, indeed could be used for evil,
02:08:21.260 | but that has enormous potential
02:08:24.100 | and that is an agreement with nature, right?
02:08:27.200 | This fundamental truth that, at least to my knowledge,
02:08:29.640 | no one is revising in any significant way anytime soon.
02:08:33.040 | So what if it were called something else?
02:08:35.340 | Instead of nuclear, it's called sustainable, right?
02:08:39.660 | I mean, it's amazing how marketing can shift our perspective
02:08:42.680 | of robots, for instance, or anyway,
02:08:46.020 | I'm sure you can come up with better examples than I can,
02:08:47.980 | but is there a good, solid PR firm
02:08:52.940 | working from the nuclear side?
02:08:55.120 | - Thunbergian, it's good, a Thunbergian.
02:08:58.620 | - Oh, Thunbergian. - Thunbergian,
02:09:00.440 | like if she was in favor of it,
02:09:02.600 | which by the way, she's not, she's dead set against it.
02:09:04.760 | - She said that. - 100%, yeah.
02:09:06.680 | - Based on- - Based on-
02:09:08.600 | - Thunbergian principles. - The prevailing ethic
02:09:11.340 | in environmentalism for 50 years is that nuclear is evil,
02:09:13.560 | like they won't consider it.
02:09:14.860 | There are, by the way,
02:09:15.700 | certain environmentalists who disagree with this,
02:09:17.220 | and so Stewart Brand is the one that's been the most public,
02:09:19.220 | and he has impeccable credentials in the space,
02:09:20.900 | and he wrote this- - Whole Earth Catalog?
02:09:22.260 | - Whole Earth Catalog guy, yeah,
02:09:23.360 | and he's written a whole bunch
02:09:24.400 | of really interesting books since,
02:09:25.580 | and he wrote a recent book that goes through in detail.
02:09:27.660 | He's like, yes, obviously the correct environmental thing
02:09:29.940 | to do is nuclear power,
02:09:31.580 | and we should be implementing project independence.
02:09:34.020 | We should be building 1,000, specifically we should,
02:09:36.380 | he didn't say this, but this is what I would say.
02:09:37.620 | We should hire Charles Koch.
02:09:38.980 | We should hire Koch Industries, right?
02:09:42.620 | And they should build us 1,000 nuclear power plants, right?
02:09:44.980 | And then we should give them
02:09:45.820 | the Presidential Medal of Freedom
02:09:47.220 | for saving the environment.
02:09:48.700 | - And that would put us independent of our reliance on oil.
02:09:51.020 | - Yeah, then we're done with oil.
02:09:52.540 | Just think about what happens.
02:09:53.380 | We're done with oil, zero emissions.
02:09:54.980 | We're done with the Middle East, we're done.
02:09:57.300 | We're done.
02:09:58.120 | We're not drilling.
02:09:58.960 | We're not drilling on American land anymore.
02:10:00.340 | We're not drilling on foreign lands.
02:10:01.780 | Like we have no military entanglements
02:10:03.340 | in places where we're drilling.
02:10:04.340 | We're not despoiling Alaska.
02:10:06.300 | We're not, nothing, no offshore rigs, no nothing.
02:10:08.340 | We're done.
02:10:09.500 | And you basically just,
02:10:10.340 | you build state-of-the-art plants, engineered properly.
02:10:12.060 | You have them just completely contained.
02:10:13.380 | When there's nuclear waste,
02:10:14.220 | you just entomb the waste, right, in concrete.
02:10:16.540 | And so it just sits there forever.
02:10:19.580 | It's this very small footprint kind of thing.
02:10:22.260 | And you're just done.
02:10:23.660 | And so this is like the most,
02:10:25.220 | to me it's like scientifically, technologically,
02:10:26.780 | this is just like the most obvious thing in the world.
02:10:29.100 | It's a massive tell on the part of the people
02:10:31.060 | who claim to be pro-environment
02:10:32.060 | that they're not in favor of this.
02:10:33.660 | - And if I were to say,
02:10:35.380 | tweet that I'm pro-nuclear power
02:10:37.760 | because it's the more sustainable form of power.
02:10:39.720 | If I hypothetically did that today,
02:10:42.720 | what would happen to me in the-
02:10:44.280 | - You'd be a crypto-fascist, you know.
02:10:45.680 | (laughing)
02:10:48.100 | Dirty, evil capitalist, you know,
02:10:49.920 | a monster, how dare you, right?
02:10:51.340 | - I'm unlikely to run that experiment.
02:10:52.920 | I was just curious.
02:10:53.760 | That was what we call a Gedanken experiment,
02:10:55.560 | an experiment in our own heads.
02:10:56.400 | - Andrew, you're a terrible human being.
02:10:58.280 | - Wow.
02:10:59.120 | - We were looking for evidence
02:10:59.940 | that you're a terrible human being,
02:11:00.780 | and now we know it, right?
02:11:02.160 | This is a great example of that.
02:11:04.320 | I gave Andrew a book on the way in here with this,
02:11:06.280 | my favorite new book.
02:11:07.120 | The title of it is "When Reason Goes on Holiday."
02:11:09.680 | And this is a great example of it,
02:11:11.040 | it's the people who simultaneously
02:11:14.000 | say they're environmentalists
02:11:14.840 | and say they're anti-nuclear power,
02:11:15.900 | like the positions just simply don't reconcile.
02:11:18.280 | But that doesn't bother them at all.
02:11:20.580 | So be clear, I predict none of this will happen.
02:11:23.080 | - Amazing.
02:11:24.860 | I need to learn more about nuclear power.
02:11:27.080 | - Long coal.
02:11:28.320 | - Long coal.
02:11:29.160 | - Long coal, invest in coal.
02:11:30.960 | - Because you think we're just gonna revert.
02:11:32.240 | - It's the energy source of the future.
02:11:34.040 | Well, 'cause it can't be solar and wind
02:11:36.720 | 'cause they're not reliable, so you need something.
02:11:39.320 | If it's not nuclear,
02:11:40.160 | it's gonna be either like oil, natural gas, or coal.
02:11:42.480 | - And you're unwilling to say bet on nuclear
02:11:44.840 | because you don't think that the sociopolitical,
02:11:49.800 | elitist trends that are driving against nuclear
02:11:51.800 | are likely to dissipate anytime soon?
02:11:53.520 | - Not a chance.
02:11:54.360 | I can't imagine.
02:11:55.720 | It would be great if they did,
02:11:56.680 | but the powers that be are very locked in
02:12:01.680 | on this as a position.
02:12:02.680 | And look, they've been saying this for 50 years,
02:12:04.120 | and so they'd have to reverse themselves
02:12:05.360 | off of a bad position they've had for 50 years,
02:12:07.160 | and people really don't like to do that.
02:12:10.120 | - One thing that's good about this and other podcasts
02:12:12.120 | is that young people listen,
02:12:13.420 | and they eventually will take over.
02:12:15.400 | - And by the way, I will say also,
02:12:16.680 | there are nuclear entrepreneurs.
02:12:18.560 | So there are actually, on the point of young kids,
02:12:20.820 | there are a bunch of young entrepreneurs
02:12:22.120 | who are basically not taking no for an answer,
02:12:24.080 | and they're trying to develop,
02:12:25.280 | and particularly there's people trying to develop
02:12:26.600 | new very small form factor nuclear power plants
02:12:30.040 | with a variety of possible use cases.
02:12:32.720 | So look, maybe they show up with a better mousetrap
02:12:36.360 | and people take a second look, but we'll see.
02:12:38.880 | - We'll just rename it.
02:12:40.160 | So my understanding is that you think
02:12:44.360 | we should go all in on AI with the constraints
02:12:49.000 | that we discover we need in order to rein in safety
02:12:52.520 | and things of that sort, not unlike social media,
02:12:54.880 | not unlike the internet.
02:12:56.440 | - Not unlike what we should have done with nuclear power.
02:12:58.640 | - And in terms of the near infinite number of ways
02:13:04.680 | that AI can be envisioned to harm us,
02:13:07.280 | how do you think we should cope with that psychologically?
02:13:10.400 | You know, because I can imagine a lot of people
02:13:12.380 | listening to this conversation are thinking,
02:13:13.640 | okay, that all sounds great,
02:13:15.840 | but there are just too many what ifs that are terrible,
02:13:18.800 | right, you know, what if the machines take over?
02:13:20.760 | What if, you know, the silly example I gave earlier,
02:13:23.400 | but, you know, what if one day I get logged into my,
02:13:26.040 | you know, hard-earned bank account and it's all gone?
02:13:28.600 | You know, the AI version of myself like ran off
02:13:31.560 | with someone else and with all my money, right, right?
02:13:36.200 | My AI coach abandoned me for somebody else
02:13:39.420 | after it learned all the stuff that I taught it.
02:13:42.320 | It took off with somebody else, stranded, you know,
02:13:46.060 | and it has my bank account numbers,
02:13:47.960 | like this kind of thing, right?
02:13:49.360 | - You could really make this scenario horrible
02:13:51.160 | if you kept going.
02:13:52.000 | - Yeah, well, we can throw in a benevolent example as well
02:13:55.760 | to counter it, but it's just kind of fun to think about
02:13:59.680 | where the human mind goes, right?
02:14:01.920 | - So first I say, we got to separate the real problems
02:14:03.960 | from the fake problems, and so there's a lot,
02:14:05.560 | a lot of science fiction scenarios I think are just not real
02:14:07.840 | and the ones that you just cited as an example,
02:14:09.560 | like it's not, that's not what's gonna happen
02:14:11.000 | and I can explain why that's not what's gonna happen.
02:14:12.360 | So you should, there's a set of fake ones.
02:14:14.480 | The fake ones are the ones that just aren't,
02:14:16.800 | I think, technologically grounded, that aren't rational.
02:14:19.120 | It's the AI's gonna like wake up and decide to kill us all.
02:14:21.360 | It's gonna like, yeah, it's gonna develop the kind of agency
02:14:23.740 | where it's gonna steal our money, you know,
02:14:25.320 | money and our spouse and everything else, our kids.
02:14:28.040 | Like that's just, that's not how it works.
02:14:30.340 | And then there's also all these concerns, you know,
02:14:32.200 | destruction of society concerns, and this is, you know,
02:14:34.540 | misinformation, hate speech, deep fakes,
02:14:36.060 | like all that stuff, which I don't think is a real,
02:14:39.040 | is actually a real problem.
02:14:40.160 | And then there's, people have a bunch of economic concerns
02:14:42.600 | around, you know, what's gonna take all the jobs,
02:14:45.200 | and all those kinds of things, we could talk about that.
02:14:47.000 | I don't think those are, those are,
02:14:48.440 | I don't think that's actually the thing that happens.
02:14:50.520 | But then there are two actual real concerns
02:14:52.580 | that I actually do very much agree with,
02:14:53.980 | and one of them is what you said,
02:14:55.540 | which is bad people doing bad things.
02:14:57.540 | And there's a whole set of things to be done inside there.
02:15:01.020 | The big one is we should use AI to build defenses
02:15:03.600 | against all the bad things, right?
02:15:05.540 | And so, for example, there's a concern,
02:15:07.400 | AI is gonna make it easier for bad people
02:15:09.020 | to build pathogens, right, design pathogens in labs,
02:15:11.200 | which, you know, bad people, bad scientists can do today,
02:15:13.200 | but this is gonna make it easier to do.
02:15:15.400 | Well, obviously we should have the equivalent
02:15:17.160 | of an Operation Warp Speed operating, you know,
02:15:19.480 | in perpetuity anyway, right?
02:15:20.980 | But then we should use AI to build much better biodefenses,
02:15:24.080 | right, and we should be using AI today to design,
02:15:26.040 | like for example, full spectrum vaccines
02:15:27.680 | against every possible form of pathogen, right?
02:15:29.960 | And so defensive mechanism, hacking,
02:15:32.860 | you can use AI to build better defense tools, right?
02:15:35.040 | And so you should have a whole new kind of security suite
02:15:37.380 | wrapped around you, wrapped around your data,
02:15:38.700 | wrapped around your money,
02:15:39.540 | where you're having AI repel attacks.
02:15:43.620 | Disinformation, hate speech, deep fakes, all that stuff,
02:15:46.000 | you should have an AI filter when you use the internet,
02:15:48.220 | where, you know, you shouldn't have to figure out
02:15:50.700 | whether it's really me or whether it's a made up thing,
02:15:52.980 | you should have an AI assistant that's doing that for you.
02:15:55.100 | - Oh yeah, I mean, these little banners and cloaks
02:15:57.540 | that you see on social media,
02:15:58.540 | like this has been deemed misinformation.
02:16:01.240 | You know, if you're me, you always click, right?
02:16:03.400 | 'Cause you're like, what's behind the scrim.
02:16:05.200 | And then, or there's the,
02:16:07.560 | this image is gruesome type thing. I don't always look at those.
02:16:11.320 | Sometimes I just pass on that.
02:16:13.180 | But if it's something that seems debatable,
02:16:16.480 | of course you look.
02:16:17.440 | - Well, and you should have an AI assistant with you
02:16:19.520 | when you're on the internet,
02:16:20.360 | and you should be able to tell that AI assistant
02:16:21.520 | what you want, right?
02:16:22.720 | So yes, I want the full experience to show me everything.
02:16:26.260 | I want it from a particular point of view,
02:16:27.840 | and I don't want to hear from these other people
02:16:29.240 | who I don't like.
02:16:30.640 | By the way, it's gonna be my eight-year-old is using this.
02:16:32.400 | I don't want anything that's gonna cause a problem,
02:16:34.320 | and I want everything filtered.
02:16:35.280 | And AI-based filters like that that you program and control
02:16:38.560 | are gonna work much better,
02:16:40.040 | and be much more honest and straightforward and clear
02:16:42.300 | and so forth than what we have today.
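
A minimal sketch of the user-programmed filter idea described above: the user states their own policy in plain language, and every item is checked against that policy before it is shown. The `llm_classify` function is a hypothetical stand-in for whatever model call you would actually wire in; none of these names come from a real product.

```python
from dataclasses import dataclass

@dataclass
class FilterPolicy:
    owner: str
    rules: str  # plain-language preferences, e.g. "hide likely deep fakes and gore, keep political debate"

def llm_classify(item_text: str, rules: str) -> bool:
    """Hypothetical stand-in: ask a model whether this item is allowed under the rules."""
    raise NotImplementedError("wire this up to the model of your choice")

def filter_feed(items: list[str], policy: FilterPolicy) -> list[str]:
    # Every item passes through the user's own policy before it is displayed.
    return [item for item in items if llm_classify(item, policy.rules)]

# Different people (or the same person for their kid) can run different policies.
adult_policy = FilterPolicy(owner="me", rules="show everything, but flag likely deep fakes")
kid_policy = FilterPolicy(owner="my eight-year-old", rules="hide anything violent, scary, or adult")
```
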
02:16:43.640 | So anyway, so basically what I want people to do
02:16:45.840 | is think every time you think of like a risk
02:16:47.680 | of how it can be used, just think of like,
02:16:48.920 | okay, we can use it to build a countermeasure.
02:16:50.960 | And the great thing about the countermeasures
02:16:52.440 | is they can not only offset AI risks,
02:16:54.200 | they can offset other risks, right?
02:16:55.720 | 'Cause we already live in a world
02:16:57.080 | where pathogens are a problem, right?
02:16:59.400 | We ought to have better vaccines anyway, right?
02:17:01.920 | We already live in a world
02:17:02.760 | where there's cyber hacking and cyber terrorism.
02:17:04.320 | We already live in a world
02:17:05.200 | where there is bad content on the internet.
02:17:06.560 | And we have the ability now to build much better
02:17:08.720 | AI-powered tools to deal with all those things.
02:17:11.080 | - I also love the idea of the AI physicians.
02:17:14.760 | You know, getting decent healthcare in this country
02:17:18.640 | is so difficult.
02:17:19.600 | Even for people who have means or insurance,
02:17:22.160 | I mean, the number of phone calls and waits
02:17:24.440 | that you have to go through to get a referral
02:17:26.800 | to see a specialist, I mean, it's absurd.
02:17:28.920 | Like, I mean, the process is absurd.
02:17:30.920 | I mean, it makes one partially or frankly ill
02:17:34.200 | just to go through the process of having to do all that.
02:17:36.560 | I don't know how anyone does it.
02:17:38.480 | And granted, I don't have the highest degree of patience,
02:17:41.600 | but I'm pretty patient and it drives me insane
02:17:45.360 | to even just get a remedial care.
02:17:47.400 | But so I can think of a lot of benevolent uses of AI
02:17:52.040 | and I'm grateful that you're bringing this up in here
02:17:54.800 | and that you've tweeted about it in that thread.
02:17:56.800 | Again, we'll refer people to that
02:17:58.440 | and that you're thinking about this.
02:17:59.680 | I have to imagine that in your role as investor nowadays
02:18:03.080 | that you're also thinking about AI quite often
02:18:06.120 | in terms of all these roles.
02:18:08.080 | And so does that mean that there are a lot of young people
02:18:11.640 | who are really bullish on AI and are going for it?
02:18:15.200 | - Yeah.
02:18:16.040 | - Okay, this is here to stay.
02:18:16.860 | - Yeah. - Okay.
02:18:17.700 | - Yeah, oh yeah, big time.
02:18:19.160 | - Unlike CRISPR, which is sort of in this liminal place
02:18:21.600 | where biotech companies aren't sure
02:18:22.880 | if they should invest or not in CRISPR
02:18:26.560 | because it's unclear whether or not
02:18:28.120 | the governing bodies are going to allow gene editing.
02:18:30.160 | - Right.
02:18:31.000 | - Just like it was unclear 15 years ago
02:18:32.600 | if they were going to allow gene therapy,
02:18:33.920 | but now we know they do allow gene therapy
02:18:35.960 | and immunotherapy.
02:18:36.800 | - Right, okay.
02:18:37.640 | So there is a fight.
02:18:38.480 | Now having said that, there is a fight.
02:18:39.320 | There's a fight happening in Washington right now
02:18:41.320 | over exactly what should be legal or not legal.
02:18:43.560 | And there's quite a bit of risk, I think,
02:18:45.080 | attached to that fight right now
02:18:46.200 | 'cause there are some people in there
02:18:47.260 | that are telling a very effective story
02:18:49.120 | to try to get people to either outlaw AI
02:18:50.640 | or specifically limit it to a small number of big companies,
02:18:53.760 | which I think is potentially disastrous.
02:18:57.140 | - By the way, the EU also is like super negative.
02:18:59.920 | The EU has turned super negative
02:19:01.040 | on basically all new technology.
02:19:02.120 | So they're moving to try to outlaw AI,
02:19:03.800 | which if they succeed-
02:19:05.180 | - Outlaw AI?
02:19:06.020 | - Yeah, they just like flat out don't want it.
02:19:07.720 | - But that's like saying you're going to outlaw the internet.
02:19:09.400 | I don't see how you can stop this train.
02:19:10.940 | - And frankly, they're not a big fan of the internet either.
02:19:12.620 | So like I think they regret,
02:19:14.600 | the EU has a very, especially the EU bureaucrats,
02:19:18.120 | the people who run the EU in Brussels
02:19:19.800 | have a very negative view on a lot of modernity.
02:19:24.240 | - But what I'm hearing here
02:19:26.200 | calls to mind things that I've heard people
02:19:27.740 | like David Goggins say,
02:19:28.760 | which is that there are so many lazy, undisciplined people out there
02:19:32.700 | that nowadays it's easier and easier to become exceptional.
02:19:35.300 | I've heard him say something to that extent.
02:19:36.620 | It almost sounds like so many countries
02:19:38.460 | are just backing off of particular technologies
02:19:41.660 | because it sounds bad from a PR perspective
02:19:44.840 | that it's creating great low-hanging fruit opportunities
02:19:50.000 | for people and countries to barge forward
02:19:52.480 | if they're willing to embrace this stuff.
02:19:54.140 | - It is, but number one,
02:19:55.200 | you have to have a country that wants to do that,
02:19:57.720 | and countries like that do exist.
02:20:00.240 | But then the other thing is, look,
02:20:01.080 | they need to be able to withstand the attack
02:20:03.300 | from stronger countries that don't want them to do it.
02:20:06.920 | So the EU has nominal control
02:20:09.740 | over, whatever it is, its 27 member countries.
02:20:12.820 | So even if, say,
02:20:14.360 | the Germans get all fired up about something,
02:20:15.900 | Brussels can still, in a lot of cases,
02:20:17.340 | just flat out control them
02:20:18.920 | and tell them not to do it.
02:20:19.760 | And then the US, I mean, look,
02:20:21.200 | we have a lot of control over a lot of the world.
02:20:24.360 | - But it sounds like we sit somewhere sort of in between.
02:20:26.920 | Like right now, people are developing AI technologies
02:20:31.320 | in US companies, right?
02:20:32.640 | So it is happening.
02:20:33.640 | - Yeah, today it's happening.
02:20:34.840 | But like I said, there's a set of people
02:20:36.100 | who are very focused in Washington right now
02:20:38.120 | about trying to either ban it outright
02:20:40.360 | or trying to, as I said,
02:20:41.840 | limit it to a small number of big companies.
02:20:44.560 | And then look,
02:20:45.920 | the other part of this is China's got a whole different
02:20:48.080 | kind of take on this, right, than we do.
02:20:49.480 | And so they're of course going to allow it for sure,
02:20:51.600 | but they're going to allow it in the ways
02:20:52.660 | that their system wants it to happen, right,
02:20:55.700 | which is much more for population control
02:20:58.140 | and to implement authoritarianism.
02:20:59.980 | And then of course, they are going to spread their technology
02:21:03.220 | and their vision of how society should run
02:21:05.060 | across the world, right?
02:21:06.480 | So we're back in a Cold War dynamic
02:21:08.380 | like we were with the Soviet Union
02:21:09.380 | where there are two different systems
02:21:10.500 | that have fundamentally different views
02:21:12.320 | on concepts like freedom and individual choices,
02:21:15.500 | freedom of speech and so on.
02:21:17.540 | And we know where the Chinese stand.
02:21:19.220 | We're still figuring out where we stand.
02:21:21.300 | So I'm having, specifically,
02:21:23.260 | a lot of schizophrenic conversations
02:21:25.340 | with people in DC right now where if I talk to them
02:21:27.820 | and China doesn't come up, they just like hate tech,
02:21:30.380 | they hate American tech companies, they hate AI,
02:21:32.660 | they hate social media, they hate this, they hate that,
02:21:34.460 | they hate crypto, they hate everything,
02:21:35.580 | and they just want to like punish and like ban
02:21:37.820 | and like they're just like very, very negative.
02:21:39.860 | But then if we have a conversation a half hour later
02:21:41.860 | when we talk about China,
02:21:43.020 | then the conversation is totally different.
02:21:44.620 | Now we need a partnership between the US government
02:21:46.420 | and American tech companies to defeat China.
02:21:48.820 | It's like the exact opposite discussion.
02:21:51.380 | - Is that fear or competitiveness?
02:21:53.180 | - On China specifically?
02:21:54.680 | - In terms of the US response in Washington,
02:21:57.660 | when you bring up these technologies,
02:21:59.980 | like I'll lump CRISPR in there,
02:22:02.300 | things like CRISPR, nuclear power, AI,
02:22:04.340 | it all sounds very cold, very dystopian to a lot of people.
02:22:07.960 | And yet there are all these benevolent uses
02:22:10.940 | as we've been talking about.
02:22:12.540 | And then you say you raise the issue of China
02:22:14.740 | and then it sounds like this big dark cloud emerging
02:22:17.500 | and then all of a sudden we need to galvanize
02:22:20.700 | and develop these technologies to counter their effort.
02:22:23.820 | So is it fear of them or is it competitiveness or both?
02:22:28.140 | - Well, so without them in the picture,
02:22:30.580 | you just have this basically,
02:22:32.780 | there's an old Bedouin saying: me against my brother,
02:22:35.860 | me and my brother against my cousin,
02:22:37.660 | me and my brother and my cousin against the world, right?
02:22:40.220 | So it's actually evolution in action,
02:22:43.340 | if you want to think about it.
02:22:44.180 | If there's no external threat,
02:22:45.940 | then the conflict turns inward.
02:22:47.500 | And then at that point,
02:22:48.740 | there's a big fight between specifically tech
02:22:51.360 | and then, I would just say, generally politics.
02:22:53.180 | And my interpretation of that fight
02:22:54.980 | is it's a fight for status.
02:22:56.040 | It's fundamentally a fight for status and for power,
02:22:58.380 | which is like, if you're in politics,
02:22:59.740 | you like the status quo of how power
02:23:02.020 | and status work in our society.
02:23:03.740 | You don't want these new technologies
02:23:05.180 | to show up and change things 'cause change is bad, right?
02:23:08.420 | Change threatens your position.
02:23:09.620 | It threatens the respect that people have for you
02:23:11.620 | and your control over things.
02:23:13.540 | And so I think it's primarily a status fight,
02:23:15.380 | which we could talk about.
02:23:17.500 | But the China thing is just like a straight up geopolitical
02:23:20.180 | us versus them.
02:23:21.500 | Like I said, it's like a Cold War scenario.
02:23:23.080 | And look, 20 years ago, the prevailing view in Washington
02:23:26.140 | was we need to be friends with China, right?
02:23:27.780 | And we're gonna be trading partners with China.
02:23:29.300 | And yes, they're a totalitarian dictatorship,
02:23:31.220 | but like if we trade with them over time,
02:23:32.660 | they'll become more democratic.
02:23:34.200 | In the last five to 10 years,
02:23:35.820 | it's become more and more clear that that's just not true.
02:23:39.340 | And now there's a lot of people in both political parties
02:23:42.120 | in DC who very much regret that
02:23:43.900 | and want to shift to much more
02:23:45.380 | of a sort of Cold War footing.
02:23:47.160 | - Are you willing to comment on TikTok
02:23:49.420 | and technologies that emerged from China
02:23:51.540 | that are in widespread use within the US,
02:23:54.420 | like how much you trust them or don't trust them?
02:23:56.300 | I can go on record myself by saying that early on
02:23:59.980 | when TikTok was released,
02:24:01.600 | we were told as Stanford faculty that we should not
02:24:04.500 | and could not have TikTok accounts, nor WeChat accounts.
02:24:07.920 | - So let's start with,
02:24:10.660 | there are a lot of really bright Chinese tech entrepreneurs
02:24:13.020 | and engineers who are trying to do good things.
02:24:14.940 | I'm totally positive about that.
02:24:16.820 | So I think that many of the people mean very well,
02:24:20.100 | but the Chinese have a specific system
02:24:22.140 | and the system is very clear and unambiguous.
02:24:25.500 | And the system is everything in China is owned by the party.
02:24:28.140 | It's not even owned by the state,
02:24:29.600 | it's owned by the party,
02:24:30.440 | it's owned by the Chinese Communist Party.
02:24:31.260 | So the Chinese Communist Party owns everything
02:24:32.660 | and they control everything.
02:24:33.900 | By the way, to this day,
02:24:35.520 | it's actually illegal for a foreign investor to buy equity
02:24:37.700 | in a Chinese company.
02:24:39.060 | There are all these legal machinations
02:24:41.600 | that people do to try to do something
02:24:42.920 | that's economically equivalent to that,
02:24:44.300 | but it's actually still illegal to do that.
02:24:45.920 | The Chinese Communist Party has no intention
02:24:49.120 | of letting foreigners own any of China,
02:24:51.120 | like zero intention of that.
02:24:52.620 | And they regularly move to make sure
02:24:54.200 | that that doesn't happen.
02:24:55.380 | So they own everything, they control everything.
02:24:58.580 | - Sorry to interrupt you,
02:24:59.420 | but people in China can invest in American companies
02:25:02.060 | essentially all the time.
02:25:02.900 | - Well, they can, subject to US government constraints.
02:25:05.360 | There is a US government system
02:25:07.460 | that attempts to mediate that called CFIUS.
02:25:10.300 | And there are more and more limitations
02:25:12.560 | being put on that.
02:25:13.400 | But if you can get through that approval process,
02:25:15.660 | then legally, you can do that.
02:25:17.100 | Whereas the same is not true with respect to China.
02:25:20.400 | And so they just have a system.
02:25:23.500 | And so if you're the CEO of a Chinese company,
02:25:25.920 | it's not optional.
02:25:27.300 | If you're the CEO of ByteDance or CEO of Tencent,
02:25:30.000 | your relationship with the Chinese Communist Party
02:25:32.380 | is not optional, it's required.
02:25:33.940 | And what's required is you are a unit of the party
02:25:36.740 | and you and your company do what the party says.
02:25:38.780 | And when the party says we get full access
02:25:40.580 | to all user data in America, you say yes.
02:25:43.180 | When the party says you change the algorithm
02:25:45.020 | to optimize to a certain social result, you say yes.
02:25:47.680 | So it's whatever Xi Jinping and his party cadres decide,
02:25:53.680 | and that's what gets implemented.
02:25:55.660 | If you're the CEO of a Chinese tech company,
02:25:57.220 | there is a political officer assigned to you
02:25:59.940 | who has an office down the hall.
02:26:01.760 | And at any given time, he can come down the hall,
02:26:03.880 | he can grab you out of your staff meeting or board meeting,
02:26:05.920 | and he can take you down the hall
02:26:06.760 | and he can make you sit for hours and study Marxism and
02:26:09.380 | Xi Jinping Thought, and quiz you on it and test you on it,
02:26:12.220 | and you'd better pass the test.
02:26:13.900 | Right, so it's like a straight political control thing.
02:26:17.260 | And then by the way, if you get crosswise with them, like,
02:26:20.140 | you know.
02:26:20.980 | - So when we see tech founders getting called up to Congress
02:26:26.540 | for, you know, what looks like interrogation,
02:26:29.980 | but it's probably pretty light interrogation
02:26:32.040 | compared to what happens in other countries.
02:26:33.580 | - Yeah, it's state power.
02:26:35.700 | They just have this view of top-down state power
02:26:37.940 | and they view it as their system
02:26:39.140 | and they view that it's necessary
02:26:40.300 | for lots of historical and moral reasons that they've defined
02:26:42.440 | and that's how they run.
02:26:43.280 | And then they've got a view that says
02:26:44.280 | how they want to propagate that vision outside the country.
02:26:46.700 | And they have these programs like Belt and Road, right,
02:26:49.220 | that basically are intended to propagate
02:26:50.740 | kind of their vision worldwide.
02:26:52.940 | And so they are who they are.
02:26:55.580 | Like, I will say that they don't lie about it, right?
02:26:57.520 | They're very straightforward.
02:26:59.060 | They give speeches, they write books,
02:27:00.260 | you can buy Xi Jinping's speeches,
02:27:01.500 | he goes through the whole thing.
02:27:02.340 | They have their tech 2025 plan
02:27:04.500 | from, you know, like 10 years ago,
02:27:05.740 | their whole AI agenda, it's all in there.
02:27:07.840 | - And is there a goal that, you know, in 200 years,
02:27:10.880 | 300 years that China is the superpower
02:27:12.940 | controlling everything?
02:27:14.780 | - Yeah, or 20 years, 30 years, or two years, three years,
02:27:17.160 | yeah.
02:27:18.000 | - Well, they got a shorter horizon.
02:27:18.820 | - I mean, look, yeah,
02:27:20.220 | and I don't know,
02:27:21.580 | everybody's a little bit like this, I guess,
02:27:22.800 | but yeah, they want to win.
02:27:25.940 | - Well, the CRISPR in humans example that I gave earlier
02:27:28.920 | was interesting to me because first of all,
02:27:30.760 | I'm a neuroscientist and they could have edited any genes,
02:27:35.340 | but they chose to edit the genes involved
02:27:37.200 | in the attempt to create super memory babies,
02:27:41.020 | which presumably would grow into super memory adults,
02:27:43.660 | and whether or not they succeeded in that isn't clear.
02:27:48.140 | Those babies are alive and presumably by now walking
02:27:52.380 | and talking, as far as I know.
02:27:53.780 | Whether or not they have super memories isn't clear,
02:27:55.560 | but China is clearly unafraid to augment biology
02:28:00.560 | in that way.
02:28:03.300 | And I believe that that's inevitable,
02:28:08.100 | that it's going to happen elsewhere,
02:28:09.960 | probably first for the treatment of disease.
02:28:11.840 | But at some point I'm assuming people are going to augment
02:28:14.580 | biology to make smarter kids.
02:28:16.280 | I mean, people not always,
02:28:18.460 | but often will select mates based on the traits
02:28:21.200 | they would like their children to inherit.
02:28:23.400 | So this happens far too frequently to be deemed
02:28:26.760 | bad, because either that or people themselves are bad, since people do
02:28:29.680 | this all the time:
02:28:30.500 | selecting mates that have the physical, psychological,
02:28:33.360 | and cognitive traits that they would like their offspring
02:28:35.840 | to have.
02:28:36.720 | CRISPR is a more targeted approach of course.
02:28:38.920 | The reason I'm kind of giving this example
02:28:40.960 | and examples like it is that I feel like so much of the way
02:28:44.380 | that governments and the public react to technologies
02:28:49.200 | is to just take that first glimpse and it just feels scary.
02:28:53.280 | Think about the old Apple 1984 ad.
02:28:58.020 | I mean, there was one very scary version
02:29:00.000 | of the personal computer, with computers
02:29:01.760 | and robots taking over and everyone like automatons.
02:29:04.480 | And then there was the Apple version
02:29:06.420 | where it's all about creativity, love and peace.
02:29:08.500 | And it had the pseudo psychedelic California thing
02:29:10.580 | going for it.
02:29:11.480 | Again, great marketing seems to convert people's thinking
02:29:17.340 | about technology such that what was once viewed
02:29:20.840 | as very scary and dangerous and dystopian
02:29:24.520 | comes to seem like an oasis of opportunity.
02:29:26.920 | So why are people so afraid of new technologies?
02:29:30.120 | - So this is the thing I've tried to understand
02:29:32.020 | for a long time because the history is so clear
02:29:34.920 | and the history basically is every new technology
02:29:37.520 | is greeted by what's called a moral panic.
02:29:39.440 | And so it's basically this like historical freak out
02:29:42.960 | of some kind that causes people to basically predict
02:29:44.820 | the end of the world.
02:29:45.660 | And you go back in time, and this historical
02:29:48.400 | effect shows up even for things
02:29:50.260 | where, when you look back now, it's ludicrous, right?
02:29:51.800 | And so you mentioned earlier the satanic panic of the 80s
02:29:55.000 | and the concern around like heavy metal music.
02:29:57.480 | Right, before that there was like a freak out
02:29:59.060 | around comic books in the 50s, there was a freak out
02:30:01.640 | around jazz music in the 20s and 30s, it's devil music.
02:30:05.920 | There was a freak out when the arrival of bicycles
02:30:07.780 | caused a moral panic in, like, the 1860s, 1870s.
02:30:10.000 | - Bicycles?
02:30:10.840 | - Bicycles, yeah, so there was this thing at the time.
02:30:12.640 | So bicycles were the first very easy to use
02:30:16.160 | personal transportation thing that basically let kids
02:30:18.280 | travel between towns quickly without any overhead.
02:30:22.480 | You don't have to take care of a horse, you just jump on a bike and go.
02:30:25.880 | And so there was a historical panic specifically
02:30:28.200 | around at the time young women who for the first time
02:30:30.600 | were able to venture outside the confines of the town
02:30:33.600 | to maybe go have a boyfriend in another town.
02:30:36.160 | And so the magazines at the time ran all these stories
02:30:38.580 | on this phenomenon, medical phenomenon called bicycle face.
02:30:41.920 | And the idea of bicycle face was that the exertion caused
02:30:44.400 | by pedaling a bicycle would cause your face
02:30:46.480 | to grimace, and then if you rode the bicycle
02:30:48.660 | for too long, your face would lock into place.
02:30:51.360 | - Right, and then you would be unattractive
02:30:55.080 | and therefore of course unable to then get married.
02:30:57.580 | Cars, there was a moral panic around cars, hence the red flag laws.
02:31:02.760 | There were all these laws that greeted the automobile.
02:31:04.580 | The automobile freaked people out, so there were all these laws.
02:31:07.280 | In the early days of the automobile, in a lot of places,
02:31:09.760 | you would take a ride in an automobile.
02:31:13.160 | And automobiles, they broke down all the time.
02:31:14.520 | And only rich people had automobiles.
02:31:16.520 | So it would be you and your mechanic in the car, right,
02:31:19.360 | for when it broke down.
02:31:20.560 | And then you had to hire another guy to walk 200 yards
02:31:24.160 | in front of the car with a red flag.
02:31:26.280 | And he had to wave the red flag,
02:31:28.000 | and so you could only drive as fast as he could walk
02:31:29.840 | 'cause the red flag was to warn people
02:31:31.320 | that the car was coming.
02:31:32.960 | And then in, I think it was Pennsylvania,
02:31:35.160 | they had the most draconian version which was,
02:31:37.860 | they were very worried about the car scaring the horses.
02:31:40.680 | And so there was a law that said if you saw a horse coming,
02:31:44.080 | you needed to stop the car, you had to disassemble the car.
02:31:46.760 | And you had to hide the pieces of the car
02:31:48.360 | behind the nearest hay bale, wait for the horse to go by.
02:31:51.740 | And then you could put your car back together.
02:31:54.480 | So anyways, another example is electric lighting.
02:31:56.840 | There was a panic around whether this was gonna completely
02:31:59.040 | ruin the romance of the dark
02:32:01.680 | and cause a whole new kind of terrible
02:32:04.360 | civilization where everything is always brightly lit.
02:32:06.600 | So there's just all these examples.
02:32:08.440 | And so it's like, okay, what on earth is happening
02:32:10.240 | that this is always what happens?
02:32:12.280 | And so I finally found this book
02:32:13.600 | that I think has a good model for it.
02:32:15.900 | The book is called Men, Machines, and Modern Times,
02:32:17.760 | and it was written by an MIT professor 60 years ago.
02:32:20.280 | So it predates the internet,
02:32:22.080 | but it uses a lot of historical examples.
02:32:24.000 | And what he says basically is he says
02:32:26.360 | there's actually a three-stage response.
02:32:28.160 | There's a three-stage societal response to new technologies.
02:32:30.520 | It's very predictable.
02:32:31.480 | He said stage one is basically just denial, just ignore.
02:32:35.320 | Like we just don't pay any attention to this.
02:32:36.840 | Nobody takes it seriously.
02:32:37.720 | We just, there's just a blackout on the whole topic.
02:32:40.760 | He says, that's stage one.
02:32:42.320 | Stage two is rational counterargument.
02:32:45.140 | So stage two is where you line up
02:32:46.640 | all the different reasons why this can't possibly work.
02:32:48.640 | It can't possibly ever get cheaper, this, that,
02:32:52.040 | not fast enough or whatever the thing is.
02:32:54.360 | And then he says stage three,
02:32:55.480 | he says is when the name calling begins.
02:32:58.240 | So he says stage three is like, right, right.
02:33:00.440 | So when they fail to ignore it
02:33:03.160 | and they fail to argue society out of it,
02:33:05.820 | they move to the name calling, right?
02:33:07.120 | And what's the name calling?
02:33:08.120 | The name calling is this is evil.
02:33:09.400 | This is moral panic.
02:33:10.240 | This is evil, this is terrible, this is awful.
02:33:11.840 | This is going to destroy everything.
02:33:13.080 | Like, don't you understand like all this,
02:33:15.280 | this is just like, this is horrifying.
02:33:17.120 | And you, you know, the person working on it
02:33:18.680 | are being reckless and evil and you know,
02:33:20.060 | all this stuff and you must be stopped.
02:33:22.120 | And he said the reason for that
02:33:23.560 | is because basically fundamentally what these things are
02:33:26.040 | is they're a war over status.
02:33:28.160 | It's a war over status and therefore a war over power
02:33:30.720 | and then of course ultimately money,
02:33:32.620 | but status, human status is the thing.
02:33:35.040 | And so, and 'cause what he says is
02:33:36.960 | what is the societal impact of a new technology?
02:33:39.440 | The societal impact of a new technology
02:33:41.360 | is it reorders status in the society.
02:33:43.520 | So the people who are specialists in that technology
02:33:45.840 | become high status and the people who are specialists
02:33:48.280 | in the previous way of doing things become low status
02:33:50.600 | and generally people don't adapt, right?
02:33:53.000 | Generally, if you're the kind of person who is high status
02:33:55.300 | because you're an evolved adaptation
02:33:57.720 | to an existing technology,
02:33:59.240 | you're probably not the kind of person
02:34:00.600 | that's going to enthusiastically try to replant yourself
02:34:03.040 | onto a new technology.
02:34:04.440 | And so this is like every politician
02:34:06.320 | who's just like in a complete state of panic
02:34:07.760 | about social media, like why are they so freaked out
02:34:09.360 | about social media is 'cause they all know
02:34:11.080 | that the whole nature of modern politics has changed.
02:34:13.120 | The entire battery of techniques that you use
02:34:14.880 | to get elected before social media are now obsolete.
02:34:17.680 | Obviously the best new politicians of the future
02:34:19.780 | are going to be 100% creations of social media.
02:34:21.920 | - And podcasts. - And podcasts.
02:34:23.440 | - And we're seeing this now as we head towards
02:34:25.340 | the next presidential election,
02:34:26.560 | that podcasts clearly are going to be featured very heavily
02:34:30.360 | in that next election because long form content
02:34:32.740 | is a whole different landscape.
02:34:34.840 | - So this is exactly, so this is so far,
02:34:37.080 | so Rogan, Rogan's had like what?
02:34:38.340 | Like he's had like Bernie, he's had like Tulsi,
02:34:40.240 | he's had like a whole series of these.
02:34:41.080 | - RFK most recently and that's created a lot of controversy.
02:34:44.440 | - But also my understanding is like,
02:34:46.360 | I'm sure he's invited everybody.
02:34:47.420 | I'm sure he'd love to have Biden on.
02:34:48.980 | I'm sure he'd love to have Trump on.
02:34:49.980 | I'm sure he'd love to. - You'd have to ask him.
02:34:51.160 | I mean, I think that, you know,
02:34:53.280 | every podcaster has their own ethos
02:34:55.540 | around who they invite on and why and how.
02:34:58.720 | So I certainly can't speak for him,
02:35:00.440 | but I have to imagine that any opportunity
02:35:04.200 | to have true long form discourse that would allow people
02:35:08.080 | to really understand people's positions on things,
02:35:11.000 | I have to imagine that he would be in favor
02:35:12.920 | of that sort of thing. - Or somebody else.
02:35:14.120 | Yeah, or somebody else would, right?
02:35:15.200 | You know, some other podcaster undoubtedly would, right?
02:35:17.880 | And so there's, my point, exactly, I totally agree with you,
02:35:21.000 | but my point is if you're a politician,
02:35:22.560 | if you're, let's say a legacy politician, right,
02:35:25.440 | you have the option of embracing the new technology.
02:35:27.520 | You can do it anytime you want, right?
02:35:29.820 | But you don't, they're not, they won't.
02:35:32.520 | Like they won't do it and why won't they do it?
02:35:34.720 | Well, okay, first of all, they want to ignore it, right?
02:35:36.600 | They want to pretend that things aren't changing.
02:35:38.280 | You know, second is they want like,
02:35:39.600 | they have rational counterarguments for like
02:35:41.080 | why the existing campaign system works the way that it does
02:35:43.120 | and this and that and the existing media networks
02:35:44.840 | and like here's how you like do things
02:35:46.200 | and here's how you give speeches
02:35:47.180 | and here's the clothes you wear and the tie
02:35:48.560 | and the thing in the pocket square
02:35:49.620 | and like you've got your whole system.
02:35:51.200 | How you succeeded was by coming up through that system.
02:35:53.000 | So you've got all your arguments
02:35:53.880 | as to why that won't work anymore.
02:35:55.400 | And then we've now proceeded to the name calling phase,
02:35:58.600 | which is now it's evil, right?
02:35:59.900 | Now it's evil for somebody to show up, you know,
02:36:02.400 | on a stream, God forbid, for three hours
02:36:04.960 | and actually say what they think, right?
02:36:06.640 | It's gonna destroy society, right?
02:36:07.880 | So it's exactly right.
02:36:09.080 | It's like, it's a classic example of this pattern.
02:36:11.840 | And anyway, so Morrison says in the book,
02:36:14.360 | basically this is the forever pattern.
02:36:16.160 | Like this will never change.
02:36:17.960 | This is one of those things where you can learn about it
02:36:20.480 | and still not, the entire world could learn about this
02:36:22.400 | and still nothing changes because at the end of the day,
02:36:24.920 | it's not the tech that's the question.
02:36:27.200 | It's the reordering of status.
02:36:28.960 | - I have a lot of thoughts about the podcast component.
02:36:33.200 | I'll just say this because I wanna get back to
02:36:36.160 | the topic of innovation of technology,
02:36:39.520 | but on a long form podcast, there's no safe zone.
02:36:44.280 | The person can get up and walk out,
02:36:47.100 | but the person interviewing them,
02:36:49.180 | and certainly Joe is the best of the very best,
02:36:52.580 | if not the most skilled podcaster in the entire universe
02:36:56.920 | at continuing to press people on specific topics
02:37:00.180 | when they're trying to bob and weave and wriggle out,
02:37:03.360 | he'll just keep either drilling down or altering the question
02:37:07.360 | somewhat in a way that forces them to finally come up
02:37:10.500 | with an answer of some sort.
02:37:12.040 | And I think that probably puts certain people's cortisol
02:37:16.160 | levels through the roof,
02:37:18.440 | such that they just would never go on there.
02:37:20.320 | - Well, I think there's another deeper question also,
02:37:21.780 | or another question along with that,
02:37:23.000 | which is how many people actually have something to say?
02:37:25.480 | - Real substance.
02:37:28.840 | - Like how many people can actually talk in a way
02:37:30.840 | that's actually interesting to anybody else
02:37:32.300 | for any length of time?
02:37:34.140 | Like how much substance is there really?
02:37:35.480 | And a lot of historical politics was to be able
02:37:38.160 | to manufacture a facade where, honestly,
02:37:40.160 | you can't tell how deep
02:37:43.040 | the thoughts are, like even if they have deep thoughts,
02:37:44.640 | it's kept away from you,
02:37:45.960 | they would certainly never cop to it.
02:37:47.800 | - Well, it's going to be an interesting next,
02:37:49.560 | what is it about, you know, 20 months or so?
02:37:52.260 | - Yeah, yeah, yeah.
02:37:53.100 | - Leading into the next election.
02:37:54.000 | - The panic and the name calling have already started,
02:37:55.400 | so yeah, it's going to be.
02:37:56.240 | - Yeah, I was going to say this list of three things,
02:37:57.980 | denial, you know, the counterargument and name calling,
02:38:02.200 | it seems like with AI, it's already just jumped
02:38:04.880 | to numbers two and three.
02:38:06.520 | - Yes, correct.
02:38:07.360 | - And we're already at two and three
02:38:08.240 | and it's kind of leaning three.
02:38:09.680 | - Yes.
02:38:10.520 | - Yeah.
02:38:11.340 | - Yeah, that's correct.
02:38:12.180 | AI is unusual just because it had,
02:38:14.720 | so new technologies that take off,
02:38:16.120 | they almost always have a prehistory,
02:38:17.460 | they almost always have a 30 or 40 year history
02:38:19.200 | where people tried and failed to get them to work
02:38:20.800 | before they took off.
02:38:21.640 | AI has an 80 year prehistory, so it has a very long one.
02:38:24.640 | And then it just, it all of a sudden started to work
02:38:27.520 | dramatically well, like seemingly overnight.
02:38:30.980 | And so it went from basically,
02:38:32.800 | as far as most people were concerned,
02:38:34.080 | it went from it doesn't work at all
02:38:35.340 | to it works incredibly well in one step.
02:38:37.120 | And that almost never happens.
02:38:38.660 | And so I actually think that's exactly what's happening.
02:38:40.560 | I think it's actually speed running this progression
02:38:42.240 | just because if you use mid journey or you use GPT
02:38:45.400 | or any of these things for five minutes,
02:38:46.800 | you're just like, wow, like obviously this thing
02:38:48.900 | is going to be like, obviously in my life,
02:38:50.720 | this is going to be the best thing ever.
02:38:51.800 | Like this is amazing, there's all these ways
02:38:53.200 | that I can use it and then therefore immediately
02:38:55.760 | you're like, oh my God, this is going to transform
02:38:57.360 | everything, therefore step three.
02:39:00.760 | - Right, straight to the name calling.
02:39:02.660 | In the face of all this, there are innovators out there.
02:39:07.780 | Maybe they are aware they are innovators,
02:39:09.780 | maybe they are already starting companies,
02:39:12.620 | or maybe they are just some young or older person
02:39:16.620 | who has these five traits in abundance or doesn't,
02:39:21.080 | but knows somebody who does and is partnering with them
02:39:23.740 | in some sort of idea.
02:39:24.920 | And you have an amazing track record
02:39:28.980 | at identifying these people, I think in part,
02:39:31.580 | because you have those same traits yourself.
02:39:34.220 | I've heard you say the following,
02:39:37.580 | the world is a very malleable place.
02:39:39.840 | If you know what you want and you go for it
02:39:42.040 | with maximum energy and drive and passion,
02:39:44.480 | the world will often reconfigure itself around you
02:39:47.060 | much more quickly and easily than you would think.
02:39:49.980 | That's a remarkable quote because it says
02:39:53.280 | at least two things to me.
02:39:55.320 | One is that you have a very clear understanding
02:39:58.560 | of the inner workings of these great innovators.
02:40:02.280 | We talked a little bit about that earlier,
02:40:04.700 | these five traits, et cetera,
02:40:06.500 | but that also you have an intense understanding
02:40:09.900 | of the world landscape.
02:40:11.380 | And the way that we've been talking about it
02:40:12.840 | for the last hour or so is that it is a really intense
02:40:16.000 | and kind of oppressive landscape.
02:40:17.780 | You've got countries and organizations and the elites
02:40:20.420 | and journalists that are trying to, not necessarily trying,
02:40:25.320 | but are suppressing the innovation process.
02:40:28.540 | I mean, that's sort of the picture that I'm getting.
02:40:30.220 | So it's like we're trying to innovate inside of a vise
02:40:33.580 | that's getting progressively tighter.
02:40:35.460 | And yet this quote argues that it is the person,
02:40:40.140 | the boy or girl, man or woman who says,
02:40:43.900 | well, you know what, that all might be true,
02:40:46.860 | but my view of the world is the way the world's gonna bend.
02:40:50.420 | Or I'm gonna create a dent in that vise
02:40:52.860 | that allows me to exist the way that I want.
02:40:55.120 | Or you know what?
02:40:55.960 | I'm actually gonna uncurl the vise the other direction.
02:40:58.740 | And so I'm at once picking up a sort of pessimistic
02:41:03.380 | glass half empty view of the world,
02:41:06.520 | as well as a glass half full view.
02:41:08.940 | And so tell me about that.
02:41:10.980 | And if you would, could you tell us about that
02:41:13.820 | from the perspective of someone listening who is thinking,
02:41:16.420 | you know, I've got an idea
02:41:18.860 | and I know it's a really good one 'cause I just know.
02:41:22.780 | I might not have the confidence of extrinsic reward yet,
02:41:25.840 | but I just know there's a seed of something.
02:41:28.880 | What does it take to foster that?
02:41:30.840 | And how do we foster real innovation
02:41:34.080 | in the landscape that we're talking about?
02:41:36.200 | - Yeah, so part is I think you just,
02:41:38.160 | one of the ways to square it is I think you as the innovator
02:41:40.260 | need to be signed up to fight the fight, right?
02:41:42.640 | So, and again, this is where like the fictional portrayals
02:41:44.920 | of startups, I think take people off course,
02:41:46.500 | or even scientists or whatever,
02:41:47.560 | 'cause when there's great success stories,
02:41:49.600 | they get kind of prettified after the fact
02:41:52.620 | and they may get made to be like cute and fun.
02:41:54.800 | And it's like, yeah, no, like if you talk to anybody
02:41:56.660 | who actually did any of these things, like, no,
02:41:58.520 | it was just like these things are always just like
02:42:00.420 | brutal exercises and just like sheer willpower
02:42:02.980 | and fighting, you know, fighting forces
02:42:04.520 | that are trying to get you.
02:42:05.360 | So part of it is you just,
02:42:07.600 | you have to be signed up for the fight.
02:42:08.640 | And this kind of goes to the conscientiousness thing
02:42:10.700 | we're talking about.
02:42:12.080 | We also, my partner Ben uses the term courage a lot, right?
02:42:15.160 | Which is some combination of like just stubbornness,
02:42:17.520 | but coupled with like a willingness to take pain
02:42:20.560 | and not stop.
02:42:22.560 | And, you know, have people think very bad things of you
02:42:24.440 | for a long time until it turns out, you know,
02:42:26.280 | you hopefully prove yourself, prove yourself correct.
02:42:28.640 | And so you have to be willing to do that.
02:42:30.000 | Like it's a contact sport, like these aren't easy roads,
02:42:33.880 | right, it's a contact sport.
02:42:35.000 | So you have to be signed up for the fight.
02:42:37.400 | The advantage that you have as an innovator
02:42:40.760 | is at the end of the day, the truth actually matters.
02:42:44.840 | And all the arguments in the world,
02:42:47.160 | classic Victor Hugo quote is there's nothing more powerful
02:42:49.480 | in the world than an idea whose time has come, right?
02:42:51.560 | Like if it's real, right, and this is just pure substance,
02:42:56.400 | if the thing is real, if the idea is real,
02:42:58.960 | like if it's a legitimately good scientific discovery,
02:43:02.200 | you know, about how nature works,
02:43:03.500 | if it's a new invention, if it's a new work of art,
02:43:06.720 | and if it's real, you know, then you do,
02:43:09.800 | at the end of the day, you have that on your side.
02:43:11.960 | And all of the people who are fighting you
02:43:14.000 | and arguing with you and telling you no,
02:43:15.520 | they don't have that on their side, right?
02:43:17.160 | It's not like they're showing up with some other thing
02:43:19.980 | and they're like, my thing is better than your thing.
02:43:21.640 | Like, that's not the main problem, right?
02:43:23.420 | The main problem is like, I have a thing, I'm convinced,
02:43:26.300 | everybody else is telling me it's stupid, wrong,
02:43:28.200 | it should be illegal, whatever the thing is.
02:43:30.020 | But at the end of the day, I still have the thing, right?
02:43:32.800 | And so at the end of the day, like, yeah,
02:43:35.120 | the truth really matters, the substance really matters
02:43:37.120 | if it's real, I'll give you an example.
02:43:39.240 | It's really hard historically to find an example
02:43:41.800 | of a new technology that came into the world
02:43:44.200 | that was then pulled back.
02:43:46.120 | And you know, nuclear is maybe an example of that,
02:43:49.360 | but even still, you know,
02:43:51.000 | there are still nuclear plants running today.
02:43:53.200 | You know, that still exists.
02:43:54.960 | You know, I would say the same thing about scientific discoveries,
02:43:57.400 | like, at least as far as I know,
02:43:59.000 | I don't know of any scientific discovery that was made
02:44:01.000 | and then pulled back. I know there are areas of science
02:44:04.340 | that are not politically correct to talk about today,
02:44:06.960 | but every scientist knows the truth, right?
02:44:09.600 | Like, the truth is still the truth.
02:44:11.520 | I mean, even the geneticists in the Soviet Union
02:44:13.200 | who were forced to buy into Lysenkoism
02:44:14.640 | knew the whole time that it was wrong.
02:44:15.880 | Like that I'm completely convinced of.
02:44:18.120 | - Yeah, they couldn't delude themselves,
02:44:19.520 | especially because the basic training
02:44:21.560 | that one gets in any field establishes some core truths
02:44:24.040 | upon which even the crazy ideas have to rest.
02:44:26.520 | And if they don't, as you pointed out,
02:44:29.120 | things fall to pieces.
02:44:29.960 | I would say that even the technologies that did not pan out at first,
02:44:33.880 | and in some cases were disastrous,
02:44:35.960 | but that were great ideas at the beginning,
02:44:40.960 | are starting to pan out.
02:44:42.200 | So the example I'll give is that most people are aware
02:44:44.480 | of the Elizabeth Holmes Theranos debacle,
02:44:48.560 | to put it lightly.
02:44:49.600 | You know, analyzing what's in a single drop of blood
02:44:53.060 | as a way to analyze hormones and diseases
02:44:55.200 | and antibodies, et cetera.
02:44:56.800 | I mean, that's a great idea.
02:44:58.840 | I mean, it's a terrific idea
02:45:00.600 | as opposed to having a phlebotomist come to your house
02:45:02.560 | or you have to go in and you get tapped, you know,
02:45:04.560 | and then they're pulling vials and the whole thing.
02:45:07.160 | There's now a company born out of Stanford
02:45:10.280 | that is doing exactly what she sought to do,
02:45:13.920 | except that at least the courts ruled
02:45:15.760 | that she fudged the thing
02:45:17.560 | and that's why she's in jail right now.
02:45:19.980 | But the idea of getting a wide array of markers
02:45:24.980 | from a single drop of blood
02:45:26.160 | is an absolutely spectacular idea.
02:45:28.160 | The biggest challenge that company is going to confront
02:45:30.320 | is the idea that it's just the next Theranos.
02:45:32.160 | But if they've got the thing and they're not fudging it
02:45:35.200 | as apparently Theranos was,
02:45:38.280 | I think everything will work out a la Victor Hugo.
02:45:42.920 | - Yeah, exactly. - Yeah.
02:45:44.240 | - 'Cause who wants to go back?
02:45:45.080 | Like, if they get it to work, if it's real,
02:45:48.980 | it's going to be, like, this is the thing.
02:45:50.920 | The opponents, the opponents,
02:45:53.120 | they're not bringing their own ideas.
02:45:54.460 | [laughs]
02:45:55.480 | Like, they're not bringing their,
02:45:56.600 | oh, my idea's better than yours.
02:45:57.640 | Like, that's not what's happening.
02:45:58.900 | They're bringing the silence or counterargument, right,
02:46:03.400 | or name calling, right?
02:46:04.840 | - Well, this is why I think people who need to be loved
02:46:08.420 | probably stand a reduced chance of success.
02:46:12.160 | And maybe that's also why having people close to you
02:46:15.000 | that do love you and allowing that to be sufficient
02:46:17.920 | can be very beneficial.
02:46:18.880 | This gets back to the idea of partnership and family
02:46:21.400 | around innovators, because if you feel filled up
02:46:25.140 | by those people local to you, you know, in your home,
02:46:28.680 | then you don't need people on the internet
02:46:30.720 | saying nice things about you or your ideas,
02:46:32.740 | because you're good and you can forge forward.
02:46:35.520 | Another question about innovation
02:46:38.260 | is the teams that you assemble around you.
02:46:40.260 | And you've talked before about
02:46:42.240 | the sort of small squadron model,
02:46:44.780 | sort of David and Goliath examples as well,
02:46:47.700 | where, you know, a small group of individuals
02:46:49.900 | can create a technology that, frankly,
02:46:53.100 | outdoes what a giant like Facebook might be doing
02:46:56.800 | or what any other large company might be doing.
02:47:00.940 | There are a lot of theories as to why that would happen,
02:47:03.200 | but I know you have some unique theories.
02:47:05.480 | Why do you think small groups
02:47:08.980 | can defeat large organizations?
02:47:11.280 | - So the conventional explanation is, I think, correct.
02:47:13.840 | And it's just that large organizations
02:47:16.300 | have a lot of advantages,
02:47:17.180 | but they just have a very hard time
02:47:18.920 | actually executing anything because of all the overhead.
02:47:23.440 | So large organizations have
02:47:25.360 | combinatorial communication overhead, right?
02:47:27.160 | The number of people who have to be consulted,
02:47:28.860 | who have to agree on things gets to be staggering.
02:47:31.140 | The amount of time it takes to schedule the meeting
02:47:33.260 | gets to be staggering.
02:47:34.700 | You know, you get these really big companies
02:47:36.140 | and they have some issue they're dealing with.
02:47:37.500 | And it takes like a month to schedule the pre-meeting
02:47:39.460 | to like plan for the meeting,
02:47:41.060 | which is going to happen two months later,
02:47:42.340 | which is then going to result in a post-meeting,
02:47:44.060 | which will then result in a board presentation,
02:47:46.120 | which will then result in a planning offsite, right?
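As a minimal, hypothetical sketch of the combinatorial communication overhead described above, modeling it simply as the number of pairwise channels between people, a few lines of Python make the scaling concrete; the function name and the sample headcounts are illustrative assumptions, not anything from the conversation itself.

```python
# Hypothetical illustration: one simple way to quantify "combinatorial
# communication overhead" is to count pairwise channels, which grow as
# n * (n - 1) / 2 with headcount n.

def pairwise_channels(team_size: int) -> int:
    """Return the number of distinct one-to-one channels among team_size people."""
    return team_size * (team_size - 1) // 2

if __name__ == "__main__":
    # A small squad vs. division- and company-scale headcounts like those
    # mentioned later in the conversation (6,000-person divisions, 440,000 employees).
    for n in (8, 100, 6000, 440000):
        print(f"{n:>7} people -> {pairwise_channels(n):,} pairwise channels")
```

Under that simple model, 8 people share 28 channels while 6,000 share roughly 18 million, which is one way to see why scheduling and agreement become so slow at scale.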
02:47:48.860 | - I thought academia was bad,
02:47:50.000 | but what you're describing is giving me hives.
02:47:51.780 | - Kafka was a documentary.
02:47:53.660 | Yeah, like this is, yeah.
02:47:55.200 | So it's just like these are,
02:47:56.960 | I mean, look, you'd have these organizations.
02:47:58.180 | At a hundred thousand people or more,
02:47:59.360 | like you're more of a nation state than a company.
02:48:03.220 | And you've got all these competing internal factions,
02:48:05.420 | you know, it's the Bedouin thing I was saying
02:48:06.620 | before, you've got all these internal rivalries,
02:48:07.580 | like at most big companies,
02:48:08.980 | your internal enemies are like way more dangerous to you
02:48:10.820 | than anybody on the outside.
02:48:12.780 | - Can you elaborate on that?
02:48:13.620 | - Oh yeah, yeah, yeah.
02:48:14.460 | At a big company,
02:48:16.060 | the big competition is for the next promotion, right?
02:48:18.500 | And the enemy for the next promotion
02:48:20.200 | is the next executive over in your company.
02:48:22.740 | Like that's your enemy.
02:48:24.380 | The competitor on the outside is like an abstraction.
02:48:27.100 | Like maybe they'll matter someday, whatever.
02:48:28.780 | I got to beat that guy inside my own company, right?
02:48:31.460 | And so the internal warfare is at least as intense
02:48:34.060 | as the external warfare.
02:48:35.500 | And so, yeah, so it's just, I mean, this is just all the,
02:48:38.020 | you know, it's the iron law of all these big bureaucracies
02:48:40.300 | and how they function.
02:48:41.140 | So if a big bureaucracy ever does anything productive,
02:48:43.480 | I think it's like a miracle.
02:48:44.400 | Like it's like a miracle to the point
02:48:45.640 | where there should be like a celebration.
02:48:48.160 | There should be parties,
02:48:49.000 | there should be like ticker tape parades
02:48:49.980 | for like big, large organizations that actually do things.
02:48:52.100 | Like that's great because it's like, it's so rare.
02:48:55.520 | It doesn't happen very often.
02:48:56.420 | So anyway, so that's the conventional explanation.
02:48:59.220 | Whereas look, small companies, small teams,
02:49:01.460 | you know, there's a lot that they can't do
02:49:02.740 | 'cause they can't, you know,
02:49:03.580 | they're not operating at scale.
02:49:04.420 | They don't have global coverage and all these kinds of,
02:49:06.620 | you know, they don't have the resources and so forth,
02:49:08.220 | but at least they can move quickly, right?
02:49:10.700 | They can organize fast.
02:49:12.140 | They can have, you know, if there's an issue today,
02:49:13.720 | they can have a meeting today.
02:49:14.700 | They can solve the issue today, right?
02:49:16.380 | And everybody they need to solve the issue
02:49:17.660 | is in the room today.
02:49:19.020 | And so they can just move a lot faster.
02:49:21.220 | I think that's part of it,
02:49:22.060 | but I think there's another deeper thing underneath that
02:49:24.160 | that people really don't like to talk about
02:49:25.780 | that takes us back full circle to where we started,
02:49:27.580 | which is just that the number of people in the world
02:49:29.980 | who are capable of doing new things
02:49:31.100 | is a very small set of people.
02:49:33.320 | And so you're not gonna have a hundred of them in a company
02:49:36.560 | or a thousand or 10,000.
02:49:37.860 | You're gonna have three, eight or 10, maybe.
02:49:42.780 | - And some of them are flying too close to the sun.
02:49:44.660 | - Some of them are blowing themselves up, right?
02:49:46.560 | Some of them are.
02:49:47.540 | So IBM, I actually first learned this
02:49:49.420 | at my first actual job job, which was at IBM
02:49:51.420 | when IBM was still on top of the world,
02:49:53.420 | right before it caved in in the early '90s.
02:49:55.140 | And so when I was there, it was 440,000 employees,
02:49:58.660 | and again, if you adjust,
02:50:00.540 | like today for that same size of business,
02:50:02.480 | inflation adjusted, market size adjusted,
02:50:04.140 | it's equivalent today
02:50:05.340 | to like a two or three million person organization.
02:50:06.940 | It was a nation state.
02:50:09.500 | There were 6,000 people in my division.
02:50:11.140 | You know, we were next door to another building
02:50:12.700 | that had another 6,000 people in another division.
02:50:14.460 | So you just, you could work there for years
02:50:16.280 | and never meet anybody who didn't work for IBM.
02:50:18.380 | The first half of every meeting was just IBMers
02:50:20.420 | introducing themselves to each other.
02:50:22.160 | Like, it was just mind boggling, the level of complexity.
02:50:25.240 | But they were so powerful that,
02:50:28.660 | four years before I got there, in 1985,
02:50:31.360 | they were 80% of the market capitalization
02:50:33.240 | of the entire tech industry.
02:50:34.840 | Right, so they were at a level of dominance
02:50:36.640 | that even, you know, Google or Apple today
02:50:38.300 | is not even close to, right, at the time.
02:50:40.640 | So that's how powerful they were.
02:50:41.860 | And so they had a system, and it worked really well
02:50:44.300 | for like 50 years, they had a system which was,
02:50:46.740 | most of the employees in the company were expected
02:50:48.560 | to basically rigidly follow rules.
02:50:50.800 | So they dressed the same, they acted the same,
02:50:52.240 | they did everything out of the playbook.
02:50:53.520 | You know, they were trained very specifically.
02:50:56.040 | But they had this category of people they called wild ducks.
02:50:58.960 | And this was an idea that the founder, Thomas Watson,
02:51:01.240 | came up with, wild ducks.
02:51:02.880 | And the wild ducks were, they often had the formal title
02:51:05.840 | of an IBM fellow, and they were the people
02:51:08.220 | who could make new things.
02:51:09.920 | And there were eight of them.
02:51:12.160 | And they got to break all the rules.
02:51:14.280 | And they got to invent new products.
02:51:16.040 | They got to go off and work on something new.
02:51:17.560 | They didn't have to report back.
02:51:19.040 | They got to pull people off of other projects
02:51:20.780 | to work with them.
02:51:22.600 | They got budget when they needed it.
02:51:24.260 | They reported directly to the CEO.
02:51:26.960 | They got whatever they needed.
02:51:26.960 | He supported them in doing it.
02:51:28.220 | And they were glassbreakers.
02:51:29.420 | And they showed-- the one in Austin at the time
02:51:32.020 | was this guy Andy Heller, and he would show up in jeans
02:51:34.200 | and cowboy boots.
02:51:35.840 | And you know, amongst an ocean of men in blue suits,
02:51:38.840 | white shirts, and red ties, he'd put his cowboy boots up
02:51:41.880 | on the table.
02:51:42.880 | And it was fine for Andy Heller to do that,
02:51:44.760 | and it was not fine for you to do that, right?
02:51:46.680 | And so they very specifically identified:
02:51:48.440 | We have almost like an aristocratic class
02:51:52.140 | within our company that gets to play by different rules.
02:51:55.060 | Now, the expectation is they deliver, right?
02:51:57.460 | Their job is to invent the next breakthrough product.
02:51:59.620 | But we, IBM management, know that the 6,000 person division
02:52:03.380 | is not going to invent the next product.
02:52:04.980 | We know it's going to be a crazy Andy Heller and his cowboy
02:52:07.460 | boots.
02:52:08.380 | And so I was always very impressed.
02:52:09.860 | And again, ultimately, IBM had its issues.
02:52:12.460 | But that model worked for 50 years, right?
02:52:14.980 | Worked incredibly well.
02:52:15.900 | And I think that's basically the model that works.
02:52:19.760 | But it's a paradox, right?
02:52:20.920 | Which is like, how do you have a large, bureaucratic, regimented
02:52:24.000 | organization, whether it's academia or government
02:52:26.460 | or business or anything, that has all these rule
02:52:28.880 | followers in it and all these people who
02:52:30.280 | are jealous of their status and don't want things to change,
02:52:33.080 | but then still have that spark of creativity?
02:52:37.620 | I would say mostly it's impossible.
02:52:39.800 | Mostly it just doesn't happen.
02:52:41.480 | Those people get driven out, right?
02:52:42.920 | And in tech, what happens is those people
02:52:44.640 | get driven out because we will fund them.
02:52:46.780 | These are the people we fund.
02:52:47.940 | I was going to say, I gather that you
02:52:49.920 | are in the business of finding and funding the wild ducks.
02:52:52.620 | The wild ducks, that's exactly right.
02:52:54.160 | And actually, this is actually going to close the loop.
02:52:56.400 | This is actually, I think, the simplest explanation
02:52:58.120 | for why IBM ultimately caved in and then HP sort of in the '80s
02:53:01.260 | also caved in--
02:53:02.140 | IBM and HP kind of were these incredible monolithic,
02:53:05.580 | incredible companies for 40 or 50 years.
02:53:07.760 | And then they kind of both caved in in the '80s and '90s.
02:53:10.140 | And I actually think it was the emergence of venture capital.
02:53:12.760 | It was the emergence of a parallel funding system
02:53:14.840 | where the wild ducks, or in HP's case,
02:53:16.900 | their superstar technical people could actually leave and start
02:53:19.440 | their own companies.
02:53:20.680 | And again, it goes back to the university discussion
02:53:22.480 | we were having.
02:53:23.040 | It's like, this is what doesn't exist at the university level.
02:53:25.600 | This certainly does not exist at the government level.
02:53:27.940 | And until recently in media, it didn't
02:53:29.820 | exist, until there was this thing that we call podcasts.
02:53:32.480 | Exactly, right, exactly, right.
02:53:34.240 | Which clearly have picked up some momentum.
02:53:36.880 | And I would hope that these other wild duck
02:53:39.440 | models will move quickly.
02:53:42.320 | But the one thing you know, right, and you know this.
02:53:44.360 | The one thing you know is the people on the other side
02:53:45.880 | are going to be mad as hell.
02:53:47.320 | Yeah, they're going to-- well, I think they're past denial.
02:53:51.560 | The counterarguments continue.
02:53:53.780 | The name calling is prolific.
02:53:55.200 | The name calling is fully underway, yes.
02:53:58.600 | Well, Mark, we've covered a lot of topics.
02:54:01.920 | But as with every time I talk to you, I learn oh so very much.
02:54:07.360 | So I'm so grateful for you taking the time out
02:54:09.320 | of your schedule to talk about all of these topics in depth
02:54:13.040 | with us.
02:54:14.280 | I'd be remiss if I didn't say that it is clear to me
02:54:16.480 | now that you are hyper-realistic about the landscape.
02:54:21.800 | But you are also intensely optimistic about the existence
02:54:25.160 | of wild ducks and those around them that support them
02:54:28.360 | that are necessary for the implementation of their ideas
02:54:30.680 | at some point.
02:54:32.120 | And that also you have a real rebel inside you.
02:54:35.360 | So that is oh so welcome on this podcast.
02:54:38.640 | And it's oh so needed in these times and every time.
02:54:42.440 | So on behalf of myself and the rest of us
02:54:45.640 | here at the podcast and especially the listeners,
02:54:47.720 | thank you so much.
02:54:49.080 | Thanks for having me.
02:54:50.400 | Thank you for joining me for today's discussion
02:54:52.400 | with Mark Andreessen.
02:54:53.880 | If you're learning from and/or enjoying this podcast,
02:54:56.360 | please subscribe to our YouTube channel.
02:54:58.080 | That's a terrific zero-cost way to support us.
02:55:00.600 | In addition, please subscribe to the podcast
02:55:02.880 | on both Spotify and Apple.
02:55:04.600 | And on both Spotify and Apple, you
02:55:06.040 | can leave us up to a five-star review.
02:55:08.200 | If you have questions for me or comments about the podcast
02:55:10.520 | or guests that you'd like me to consider hosting
02:55:12.600 | on the Huberman Lab Podcast, please put those
02:55:14.920 | in the comment section on YouTube.
02:55:16.360 | I do read all the comments.
02:55:18.360 | Please also check out the sponsors mentioned
02:55:20.080 | at the beginning and throughout today's episode.
02:55:22.180 | That's the best way to support this podcast.
02:55:24.900 | Not on today's podcast, but on many previous episodes
02:55:27.600 | of the Huberman Lab Podcast, we discuss supplements.
02:55:30.160 | While supplements aren't necessary for everybody,
02:55:32.280 | many people derive tremendous benefit from them
02:55:34.440 | for things like improving sleep, hormone support, and focus.
02:55:37.880 | The Huberman Lab Podcast
02:55:39.160 | has partnered with Momentous Supplements.
02:55:40.800 | If you'd like to access the supplements discussed
02:55:42.540 | on the Huberman Lab Podcast,
02:55:44.160 | you can go to Live Momentous, spelled O-U-S.
02:55:46.680 | So it's livemomentous.com/huberman.
02:55:49.680 | You can also receive 20% off.
02:55:51.800 | Again, that's Live Momentous, spelled O-U-S.com/huberman.
02:55:56.020 | If you haven't already subscribed
02:55:57.240 | to our neural network newsletter,
02:55:59.000 | our neural network newsletter is a completely
02:56:01.080 | zero-cost monthly newsletter that includes summaries
02:56:03.980 | of podcast episodes, as well as protocols.
02:56:07.080 | That is short PDFs describing, for instance,
02:56:09.660 | tools to improve sleep, tools to improve neuroplasticity.
02:56:13.180 | We talk about deliberate cold exposure, fitness,
02:56:16.000 | various aspects of mental health.
02:56:17.280 | Again, all completely zero cost.
02:56:18.960 | And to sign up, you simply go to HubermanLab.com,
02:56:21.720 | go over to the menu in the corner,
02:56:23.480 | scroll down to newsletter, and provide your email.
02:56:25.640 | We do not share your email with anybody.
02:56:28.040 | If you're not already following me on social media,
02:56:29.960 | I am @hubermanlab on all platforms.
02:56:32.360 | So that's Instagram, Twitter, Threads, LinkedIn,
02:56:35.360 | and Facebook, and at all of those places,
02:56:37.920 | I talk about science and science-related tools,
02:56:39.920 | some of which overlaps with the content
02:56:41.480 | of the Huberman Lab Podcast,
02:56:42.600 | but much of which is distinct from the content
02:56:44.820 | of the Huberman Lab Podcast.
02:56:46.000 | Again, it's Huberman Lab on all social media platforms.
02:56:49.140 | Thank you once again for joining me
02:56:50.520 | for today's discussion with Marc Andreessen.
02:56:53.040 | And last, but certainly not least,
02:56:55.160 | thank you for your interest in science.
02:56:57.120 | [upbeat music]