
Neal Stephenson: Sci-Fi, Space, Aliens, AI, VR & the Future of Humanity | Lex Fridman Podcast #240


Chapters

0:00 Introduction
0:43 WWII and human nature
9:28 Search engine morality
14:06 Space exploration
31:07 Aliens and UFOs
39:30 SpaceX and Blue Origin
46:52 Social media
51:19 Climate change
63:09 Consequences of big ideas
67:50 Virtual reality
90:58 Artificial intelligence
105:57 Cryptocurrency
118:35 Writing, storytelling, and books
141:13 Martial arts
150:31 Final thoughts


00:00:00.000 | The following is a conversation with Neal Stephenson,
00:00:02.760 | a legendary science fiction writer exploring ideas
00:00:05.820 | in mathematics, science, cryptography, money,
00:00:08.340 | linguistics, philosophy, and virtual reality,
00:00:11.680 | from his early book, "Snow Crash,"
00:00:13.720 | to his new one called "Termination Shock."
00:00:17.080 | He doesn't just write novels.
00:00:18.800 | He worked at the space company, Blue Origin,
00:00:21.780 | for many years, including technically
00:00:24.880 | being Blue Origin's first employee.
00:00:27.040 | He also was the chief futurist at the virtual reality
00:00:30.160 | company, Magic Leap.
00:00:32.200 | This is the Lex Fridman Podcast.
00:00:35.140 | To support it, please check out our sponsors
00:00:37.200 | in the description.
00:00:38.560 | And now, here's my conversation with Neal Stephenson.
00:00:42.400 | You write both historical fiction,
00:00:45.760 | like World War II in "Cryptonomicon,"
00:00:49.200 | and science fiction, looking both into the past
00:00:52.320 | and the future.
00:00:53.160 | So let me ask, does history repeat itself?
00:00:56.600 | In which way does it repeat itself,
00:00:58.080 | and which way does it not?
00:00:59.760 | - I'm afraid it repeats itself a lot.
00:01:02.640 | So I think human nature kind of is what it is,
00:01:05.880 | and so we tend to see similar behavior patterns
00:01:10.160 | emerging again and again.
00:01:12.080 | And so it's kind of the exception,
00:01:16.720 | rather than the rule, when something new happens.
00:01:19.960 | - What role does technology play in the suppression
00:01:23.080 | or in revealing human nature?
00:01:25.980 | - Well, the standards of living, life expectancy,
00:01:29.720 | all that have gotten incredibly better within the last,
00:01:34.720 | particularly the last 100 years.
00:01:36.840 | I mean, just antibiotics, modern vaccines,
00:01:40.840 | electrification, the internet.
00:01:44.100 | These are all improvements in most people's
00:01:47.000 | standard of living and health and longevity
00:01:49.960 | that exceed anything that was seen before
00:01:54.800 | in human history.
00:01:56.320 | So people are living longer,
00:01:59.720 | they're generally healthier, and so on.
00:02:02.680 | But again, we still see a lot of the same behavior patterns,
00:02:07.640 | some of which are not very attractive.
00:02:10.320 | - So some of it has to do with the constraints on resources.
00:02:13.940 | Presumably with technology,
00:02:15.560 | you have less and less constraints on resources,
00:02:17.560 | so we get to maybe emphasize the better angels of our nature.
00:02:22.520 | And in so doing, does that not potentially
00:02:26.560 | fundamentally alter the experience
00:02:30.580 | that we have of life on Earth?
00:02:32.080 | - You know, until the last 10 or so years,
00:02:34.900 | I would have taken that view, I think.
00:02:37.680 | But people will find ways to be divisive and angry
00:02:42.680 | if it scratches a kind of psychological itch
00:02:50.480 | that they have got.
00:02:51.520 | And we used to look at the Weimar Republic,
00:02:55.640 | what happened in the economic collapse of Germany
00:02:58.240 | prior to the rise of Hitler, World War II,
00:03:03.240 | and kind of explain Hitler, at least partially,
00:03:09.760 | by just the misery that people were living in at that time.
00:03:14.760 | - The economic collapse.
00:03:18.400 | - Yeah, hyperinflation and unemployment
00:03:22.000 | and the decline in standard of living.
00:03:25.560 | And that sounds like a plausible explanation,
00:03:29.120 | but there are economic troubles now, for sure.
00:03:32.160 | We had the bank collapse in 2008,
00:03:35.040 | and there's stagnation in some people's standards of living,
00:03:39.760 | but it's hard to explain what we've seen in this country
00:03:42.640 | in the last few years just strictly on the basis
00:03:44.980 | of people are poor and angry and sad.
00:03:48.580 | I think they wanna be angry.
00:03:51.080 | - So without being political in a divisive kind of way,
00:03:57.080 | can we talk about the lessons you can draw
00:03:59.720 | from World War II?
00:04:01.760 | - Sure. - This singular event
00:04:03.340 | in human history, it seems like.
00:04:05.160 | - Yeah. - And yet, as you say,
00:04:07.500 | history rhymes at the very least.
00:04:09.960 | - Yeah.
00:04:10.800 | Being who I am, I tend to focus on the curious
00:04:14.600 | technological things that happened
00:04:16.680 | in conjunction with that war,
00:04:18.600 | which may not be where you wanna go, but--
00:04:22.600 | - Well, there's several things, and sorry to interrupt.
00:04:24.420 | So one in "Cryptonomicon" is more like
00:04:27.800 | the Alan Turing side of things, right?
00:04:29.760 | - Right.
00:04:30.600 | - And then there's the outside of technology.
00:04:35.000 | First of all, there's the tools of war,
00:04:36.520 | which is a kind of technology,
00:04:38.160 | but then there's just the human nature,
00:04:40.200 | the nature of good and evil.
00:04:41.800 | - Yeah, well, so one of the things that emerges
00:04:44.440 | from the war and from the extermination camps
00:04:49.040 | is that we're never allowed to have illusions anymore
00:04:52.840 | about human nature.
00:04:55.080 | So you have to learn that lesson
00:04:58.800 | to be an educated person, and you have to know
00:05:01.640 | that even in a supposedly enlightened, civilized society,
00:05:06.640 | people can become monsters quite easily.
00:05:10.480 | So that is for sure the big takeaway.
00:05:13.400 | - So do you agree with Solzhenitsyn about,
00:05:16.560 | what is it, the line between good and evil
00:05:21.200 | runs through the heart of every man?
00:05:23.760 | - Yeah.
00:05:24.600 | - That all of us are capable--
00:05:26.560 | - Great line, yeah.
00:05:27.400 | - Of evil?
00:05:28.760 | - I read a good chunk of the Gulag Archipelago
00:05:32.160 | when I was a teenager, 'cause my grandfather had it
00:05:36.160 | in his house, 'cause he was one of these Americans
00:05:39.480 | who was obsessed with the Soviet Union
00:05:42.000 | and the Soviet threat, and wanted people to be aware
00:05:47.000 | of some of what had happened.
00:05:49.880 | And so he had those books lying around,
00:05:53.200 | and I would read them.
00:05:55.720 | And it's a similar kind of parallel story
00:05:59.200 | to what happened in Germany during the war,
00:06:03.360 | this creation of this system of camps and oppression
00:06:07.520 | and lots of troubling behavior.
00:06:12.520 | - To me, it's a story of how fear and desperation
00:06:18.480 | combined with a charismatic leader can lead to evil.
00:06:23.840 | But it's also a story of bravery, of love,
00:06:28.120 | of brotherhood and sisterhood, and basically survival.
00:06:32.720 | You have like "Man's Search for Meaning,"
00:06:34.960 | which is the story of a man in a concentration camp,
00:06:39.960 | basically finding beauty in life
00:06:42.080 | even under most extreme conditions.
00:06:45.760 | So to me, World War II is not necessarily
00:06:48.400 | a bleak view of human nature.
00:06:53.400 | It's a little moment of evil that revealed
00:06:58.400 | a much bigger good in humanity.
00:07:02.560 | So I'm not so sure that it leads me
00:07:05.280 | to a pessimistic view of the world,
00:07:07.240 | the fact that somebody like Hitler could happen,
00:07:09.900 | the fact that a lot of people could follow Hitler
00:07:13.760 | and get excited and maybe even love the hate of the other
00:07:18.760 | for some moment of time.
00:07:20.880 | I think that's, all of us are capable of that,
00:07:25.000 | but I think all of us also have a capacity for good.
00:07:28.960 | And I think, I don't know what you think,
00:07:31.480 | but I think we have a greater desire for good than evil.
00:07:36.480 | And it seems like that's where technology
00:07:40.280 | is very useful as a guide, as a helping hand.
00:07:44.840 | - Okay, can you give me an example, maybe?
00:07:48.480 | - So I give you examples of futuristic technologies,
00:07:51.480 | and I can give you examples of current technologies.
00:07:53.940 | Current technologies, knowledge,
00:07:58.960 | in the form of very basic knowledge,
00:08:01.920 | which is like Wikipedia,
00:08:03.240 | and search the original dream of Google
00:08:08.240 | that I think is very much a success,
00:08:10.440 | which is making the world's information accessible
00:08:13.520 | at your fingertips.
00:08:15.000 | That kind of technology enables the natural,
00:08:20.000 | if this axiom, this assumption that people want to do good
00:08:25.800 | is true, then letting them discover
00:08:29.120 | all of the information out there,
00:08:30.760 | false information and true information, all of it,
00:08:33.840 | and let them explore, that's going to lead
00:08:36.960 | to a better world, to better people.
00:08:40.260 | Futuristic technologies is, I personally,
00:08:44.600 | I mentioned to you offline,
00:08:46.140 | sort of love artificial intelligence.
00:08:48.320 | And so AI, that's an assistant,
00:08:51.880 | that's a guide, like a mentor to you,
00:08:54.280 | that you can, in the way that Google searches,
00:08:57.760 | but smarter, where you can help send it out and say,
00:09:01.760 | this is the direction in which I want to grow.
00:09:04.400 | Not authoritarian lecturing down from the algorithm
00:09:08.880 | telling you, this is how you should grow,
00:09:12.360 | but almost the opposite, where you use it as an assistant,
00:09:17.360 | a servant in your journey towards knowledge.
00:09:22.520 | - Yeah. - That sounds like
00:09:24.000 | an easy thing, but it's actually,
00:09:25.280 | from an AI perspective, very difficult.
00:09:27.280 | - I mean, this is the theme of a book I wrote
00:09:30.720 | called "The Diamond Age," which talks about a book
00:09:33.480 | that essentially does that.
00:09:35.560 | And I've been sort of watching people try to come at
00:09:39.600 | the problem of building that thing
00:09:42.280 | from different directions for,
00:09:44.760 | ever since the book came out, basically.
00:09:47.400 | And so, I kind of have a, although I'm not a scientist,
00:09:52.400 | although I haven't worked on it myself,
00:09:53.880 | I do get a sense of the level of difficulty
00:09:57.280 | in realizing that goal.
00:10:00.560 | - So that book is in the '90s, so as Google is coming to be,
00:10:06.760 | is essentially, not Google, but the search engine,
00:10:11.080 | the initial search engine, which gave birth to Google,
00:10:14.160 | essentially, in contrast.
00:10:16.640 | - Right, yeah, yeah, that was still in the era of AltaVista
00:10:20.000 | and Ask Jeeves and multiple different search engines.
00:10:24.600 | And yeah, I'm pretty sure I had not heard of Google
00:10:27.440 | at that point, that would have been '95, '96.
00:10:30.280 | I think the book came out in '94.
00:10:33.240 | - And then, of course, the social networks followed,
00:10:35.360 | which is another form of guidance
00:10:39.320 | through the space of information.
00:10:41.120 | - Yeah, well, what happens is that these things come along
00:10:44.760 | and then people find ways to game them.
00:10:48.080 | And so I saw an interesting thread the other day
00:10:51.200 | pointing out that 20 years ago,
00:10:55.360 | if you had Googled Pythagorean theorem,
00:11:00.360 | chances are you would have been taken directly
00:11:04.280 | to a page explaining the Pythagorean theorem.
00:11:08.120 | If you do it now, you're probably gonna,
00:11:10.680 | the top hits are gonna be from somebody who's got an angle,
00:11:14.160 | who's got a scheme, right?
00:11:15.440 | They're trying to sell you math tutoring
00:11:18.200 | or they're working some kind of marketing plan on you.
00:11:23.200 | So the traditional engines become actually less useful
00:11:29.840 | over time for their original educational purpose.
00:11:35.040 | That doesn't mean that they can't,
00:11:36.840 | or shouldn't, be replaced by newer and better ones.
00:11:40.040 | - First of all, to defend the people with the angle, right?
00:11:45.540 | They're trying to find business models to fund oftentimes,
00:11:49.700 | which is funny you went with Pythagorean,
00:11:51.620 | like you went at math, those greedy bastards.
00:11:55.940 | But it's great.
00:11:58.420 | - How can we monetize the Pythagorean theorem?
00:12:01.220 | - Well, I mean, education, right?
00:12:04.580 | It's to figure out like,
00:12:06.780 | people who love math education, for example,
00:12:09.620 | love it purely, not purely, but very often love it
00:12:14.460 | for itself, for just teaching math.
00:12:17.780 | But then they start, when coming face to face with,
00:12:21.700 | for example, like the YouTube algorithm,
00:12:23.460 | they start to try to figure out,
00:12:24.780 | okay, how can I make money off of this?
00:12:27.340 | The primary goal is still that love of education,
00:12:32.340 | but they also want to make that love of education
00:12:36.260 | their full-time job.
00:12:37.980 | But I see that sort of that dance of humanity
00:12:41.860 | with the algorithms as it finds this kind of local pocket
00:12:46.780 | of optimality or sub-optimality, whatever.
00:12:50.220 | It gets stuck in it. - Pocket anyway.
00:12:52.100 | - It's a pocket of some sort.
00:12:54.260 | But I see that pocket is way better
00:12:55.740 | than what we had before in the '80s, right?
00:12:58.740 | In the '90s before the internet.
00:13:00.220 | But like, and now we're now, this is also human nature.
00:13:03.780 | We start writing very eloquent articles
00:13:06.540 | about how this pocket is clearly a pocket,
00:13:09.820 | it's not very good.
00:13:10.900 | And we can imagine much better lands far beyond.
00:13:14.580 | But the reality is it's better than before.
00:13:16.900 | And now we're waiting for--
00:13:18.820 | - We have to escape from a local minimum.
00:13:21.140 | - And you have to wait either for lone geniuses
00:13:23.900 | or for some kind of momentum of a group of geniuses
00:13:26.720 | that just say, enough is enough, I have an idea.
00:13:29.700 | This is how we get out.
00:13:31.300 | And it's too easy to be sort of, I think,
00:13:35.740 | partially because you can get a lot of clicks
00:13:37.420 | in your articles, being cynical about being in this pocket.
00:13:40.460 | And we are forever stuck in this pocket.
00:13:42.860 | And then coming up with this grandiose theory
00:13:46.140 | that humanity has finally, it's collapsing,
00:13:50.180 | stuck forever like a prison in this pocket.
00:13:53.040 | But reality, it's just clickbait articles and books
00:13:57.340 | until one curious ant comes up with the next pocket.
00:14:02.040 | - Yeah, tunnels through the barrier
00:14:03.460 | or gets enough energy to jump over the barrier.
00:14:06.620 | - And eventually we'll be, as you've talked about,
00:14:09.300 | I mean, we'll colonize the solar system
00:14:12.660 | and then we'll be stuck in the solar system.
00:14:16.060 | And then people will say, well, we're screwed
00:14:18.620 | 'cause when the sun energy runs out,
00:14:20.980 | there's no way to get to the next solar system.
00:14:23.940 | And so on, it goes on until we colonize
00:14:26.620 | the entirety of the observable universe.
00:14:28.580 | - Yeah, I think getting out of the solar system
00:14:31.700 | is gonna be a hard one.
00:14:33.380 | - So can you, you mentioned this,
00:14:34.780 | can you elaborate why you think,
00:14:37.900 | back to sort of a serious question,
00:14:40.620 | why do you think it's hard to get
00:14:42.260 | outside of our solar system?
00:14:43.860 | - It's just an energy, I mean, you can do it slowly
00:14:48.700 | whenever you want, but the idea of getting there
00:14:54.740 | in one lifetime, or a few lifetimes,
00:15:00.940 | requires huge amounts of energy to accelerate.
00:15:04.900 | And then as soon as you get halfway there,
00:15:07.900 | you need to expend an equal amount of energy to decelerate
00:15:11.860 | or you'll just go shooting by.
00:15:13.360 | And so that means carrying a lot of energy.
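A rough back-of-the-envelope sketch of the energy budget being described here (the ship mass and cruise speed are illustrative assumptions, not numbers from the conversation):

```python
# Relativistic kinetic energy of an interstellar ship, counted twice:
# once to accelerate, and again at the halfway point to decelerate.
C = 299_792_458.0  # speed of light, m/s

def kinetic_energy(mass_kg: float, beta: float) -> float:
    """Relativistic kinetic energy, E = (gamma - 1) * m * c^2."""
    gamma = 1.0 / (1.0 - beta**2) ** 0.5
    return (gamma - 1.0) * mass_kg * C**2

ship_mass = 1.0e6   # kg: an assumed 1,000-tonne generation ship
cruise_beta = 0.10  # an assumed cruise speed of 10% of c
total = 2 * kinetic_energy(ship_mass, cruise_beta)  # accelerate + decelerate
print(f"{total:.1e} J")  # ~9e20 J: more than a year of global primary energy use
```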
00:15:18.220 | And there's ideas like Yuri Milner,
00:15:22.220 | I think is still funding the idea to use laser propulsion
00:15:26.860 | to send something to another star system,
00:15:29.740 | a small object, but it'll have no way to slow down,
00:15:34.700 | as far as I know.
00:15:35.620 | - They never talk about that part,
00:15:36.860 | like how do we slow down?
00:15:38.020 | - Yeah.
00:15:38.860 | - So it's a quick flyby, you take a good picture, I guess.
00:15:43.140 | - Yeah, you better take some good pictures on your way by.
00:15:45.620 | So, and that's great if it happens, I'm not knocking it,
00:15:49.180 | but the amount of energy that's needed is just staggering
00:15:53.820 | and there's other issues like just how do you maintain
00:15:59.420 | an ecosystem for that long in isolation?
00:16:02.660 | How do you prevent people from going crazy?
00:16:05.060 | What happens if you hit something while traveling
00:16:07.780 | at a significant fraction of the speed of light?
00:16:10.540 | - What about sort of some combination
00:16:12.780 | of expanding human lifespan, but also just good old fashion,
00:16:17.780 | stable society on a spaceship?
00:16:21.020 | - Yeah, yeah, the generation ship, yeah, yeah.
00:16:24.900 | No, I think that's the only way.
00:16:26.620 | It would have to keep going for a long time.
00:16:29.380 | And they might get to where they're going
00:16:33.260 | and find a shitty solar system.
00:16:37.860 | Like we can try to do some advanced survey,
00:16:42.460 | but I mean, if you get there and all the planets
00:16:46.460 | in that solar system are just garbage planets,
00:16:48.980 | then it's kind of a big letdown
00:16:52.220 | for this like thousand year voyage
00:16:54.220 | that you've just been on, right?
00:16:57.500 | So I mean, we have a pretty narrow range of parameters
00:17:02.300 | that we need to stay between in order to survive
00:17:07.020 | in terms of the gravitational field that we can deal with.
00:17:11.540 | So that sets a bound on the size of the planet
00:17:16.660 | and what we need in the way of temperature
00:17:19.540 | and atmosphere and so on.
00:17:21.580 | So when you look at all those complications,
00:17:24.820 | then basically building sort of exactly the environment
00:17:29.820 | we want out of available materials in this solar system
00:17:34.940 | starts to look a hell of a lot better.
00:17:37.340 | It's hard to make an economic argument,
00:17:43.300 | let's say for making that journey.
00:17:46.980 | One of the things I like about "The Expanse"
00:17:49.500 | is the fact that the people who are trying
00:17:51.460 | to build the starship to go to the other solar system
00:17:54.940 | are doing it for religious reasons.
00:17:57.460 | I think that's the only reason that you would do it
00:18:00.660 | 'cause economically it just makes more sense
00:18:04.620 | to build rotating cylindrical space habitats
00:18:08.660 | and make them perfect.
00:18:10.500 | - Well, isn't everything done for religious reasons?
00:18:13.180 | Like why do we, exploration?
00:18:15.340 | - Yeah.
00:18:16.180 | - Like why do we go to the moon again
00:18:18.660 | and do the other things?
00:18:20.100 | What was it JFK said? It's because,
00:18:21.860 | not because they're easy, but because they're hard.
00:18:24.060 | Isn't that kind of a religious reason?
00:18:26.060 | - I knew a veteran of the Apollo program
00:18:28.820 | who once said that the Apollo moon landings
00:18:31.060 | were communism's greatest achievement.
00:18:33.140 | - Yeah, so the conflict between nations is a kind of--
00:18:40.180 | - Not exactly a religion, but it's what you're talking about.
00:18:43.780 | - Well, it's a struggle for meaning.
00:18:45.820 | I mean, and that meaning isn't found in some kind of,
00:18:49.860 | it's hard to find meaning in mathematics.
00:18:52.060 | It's found in some kind of,
00:18:53.780 | in music and religion, whatever, art.
00:18:56.260 | - I mean, some people do,
00:18:57.660 | but those are probably not enough of them to--
00:19:00.860 | - Well, people that find meaning in mathematics,
00:19:05.300 | they usually find meaning between the lines nevertheless,
00:19:08.420 | not in the actual, like proving,
00:19:13.420 | proving some kind of thing.
00:19:14.660 | - Fair enough, yeah.
00:19:15.660 | - So from a cost perspective,
00:19:19.140 | do you actually see a possible future
00:19:20.980 | where we're building these kind of generation ships
00:19:24.020 | and just, why not launch them one a year out,
00:19:29.020 | like wandering ants out into the galaxy?
00:19:34.620 | - I have nothing against it.
00:19:38.340 | It's just, like I said, it's got a,
00:19:40.420 | the motivation to do it has to come
00:19:44.180 | from some kind of spiritual
00:19:46.460 | or kind of non-tangible calculus.
00:19:51.140 | - So from a business model perspective,
00:19:53.020 | you don't think there's a business model there?
00:19:54.660 | - No, no way.
00:19:56.380 | - One of the many fascinating things
00:19:57.820 | you've done in your life,
00:19:59.420 | you were, at the very beginning,
00:20:01.420 | you were the person that convinced Jeff Bezos
00:20:04.020 | to start a spaceship company, a space company.
00:20:08.500 | You were there at Blue Origin
00:20:10.060 | for a few years in the beginning,
00:20:13.420 | working on alternate propulsion systems,
00:20:17.140 | and at least according to Wikipedia,
00:20:19.980 | alternate business models.
00:20:22.620 | - Yeah, I mean, to go back to the first thing you said,
00:20:25.900 | Jeff Bezos is not a guy who required a lot of convincing.
00:20:29.960 | He'd been thinking about it since he was five years old,
00:20:34.180 | and it was an inevitability.
00:20:35.940 | But the idea that kind of got hatched in 1999
00:20:41.620 | was to just do some advanced scouting work,
00:20:46.620 | explore the corners of the space of possibilities.
00:20:51.900 | And so that's what, that was Blue Operations LLC,
00:20:58.100 | which was the precursor to Blue Origin.
00:21:01.740 | And so it was a small staff of people
00:21:04.820 | that did that for a few years,
00:21:07.020 | and I think it was about 2003, 2004,
00:21:10.020 | that it swung decisively towards the direction
00:21:15.020 | it's been following ever since,
00:21:18.700 | which is using basically existing aerospace technologies
00:21:23.700 | and models to make chemical-fueled rockets
00:21:27.580 | for space tourism.
00:21:30.980 | I believe, and I continue to believe,
00:21:33.260 | that the fact that we use chemical rockets
00:21:35.840 | is just an accident of history.
00:21:38.660 | It comes out of World War II.
00:21:40.700 | So until World War II, rockets are being built
00:21:44.700 | on a small scale by people like Robert Goddard.
00:21:47.280 | But then Hitler desperately wants to bomb London,
00:21:53.640 | but he can't quite reach it,
00:21:55.440 | and the Luftwaffe has been kind of neutralized.
00:21:58.420 | So he decides he's gonna lob warheads into it with rockets,
00:22:03.420 | which is a terrible misallocation of resources.
00:22:08.540 | It's a terrible idea.
00:22:09.860 | But so it only could have happened
00:22:12.340 | in a dictatorship controlled by a lunatic.
00:22:16.380 | But that's the situation that existed,
00:22:20.220 | so they built these rockets.
00:22:21.700 | They, you know, that's the V-2.
00:22:23.780 | And then it's just a complete coincidence
00:22:27.980 | that that war ends with atomic bombs being developed
00:22:32.940 | in a completely separate superweapon program.
00:22:37.060 | And so suddenly the existence of the bombs
00:22:40.620 | creates a demand for rockets that didn't exist before.
00:22:45.400 | 'Cause if you've got atomic bombs,
00:22:48.240 | you need a way to deliver them.
00:22:50.340 | You can do it with bombers,
00:22:51.680 | but it's a lot better to just hurl them
00:22:56.060 | to the other side of the world on the top of a rocket.
00:22:59.300 | So suddenly rockets, which had gotten a boost
00:23:03.820 | because of Hitler's V-2 program,
00:23:05.720 | got a much bigger boost during the '50s and '60s.
00:23:10.620 | - And it is a complete, you're right,
00:23:12.700 | for some reason never thought of this,
00:23:14.620 | it is an accident of history
00:23:16.660 | that nuclear weapons are developed at a similar time.
00:23:20.340 | First of all, nuclear weapons didn't have to be developed
00:23:24.060 | at the same time as World War II.
00:23:26.140 | That's an accident in history.
00:23:27.860 | And the fact that, okay,
00:23:29.820 | so then Hitler started using rockets.
00:23:32.140 | That's an accident.
00:23:33.060 | Okay, that's fascinating.
00:23:35.220 | It's a fascinating set of coincidences.
00:23:38.780 | - Yeah, which is true of a lot of technologies, by the way.
00:23:42.180 | But by the time these rockets are kind of working,
00:23:45.540 | we've got hydrogen bombs that are so big
00:23:50.060 | and so devastating that nobody really wants to use them.
00:23:54.320 | But it turns out you can fit a capsule
00:23:56.540 | with a couple of people in it
00:23:58.680 | into the socket on the end of a missile
00:24:03.120 | that was made to hold a hydrogen bomb.
00:24:05.620 | So we start doing that instead as a proxy for having a war.
00:24:13.040 | And--
00:24:16.160 | - I'd love to be in a meeting
00:24:17.400 | where the first guy brought that up as an idea.
00:24:19.760 | - Yeah.
00:24:20.600 | (both laughing)
00:24:22.360 | - It's probably a Russian.
00:24:23.400 | Why don't we strap a person to the rocket?
00:24:25.840 | - Yeah, yeah.
00:24:27.120 | Well, it probably was 'cause they did it first, right?
00:24:30.080 | The Russians did it.
00:24:30.920 | - And they had perhaps less respect
00:24:32.420 | for sort of safety protocols.
00:24:34.240 | - Could be.
00:24:35.160 | - They're a little bit more willing
00:24:37.480 | to sacrifice the life of an astronaut
00:24:39.320 | or to risk the life of an astronaut.
00:24:41.400 | - Could be, yeah, yeah.
00:24:42.880 | This is basically the story of how
00:24:44.720 | through all of this competition
00:24:47.040 | and 'cause of these historical accidents,
00:24:49.320 | trillions of R&D dollars and rubles
00:24:52.840 | were put into development of chemical rocket technology,
00:24:57.720 | which is now advanced to an incredibly high degree.
00:25:02.000 | But there's other ways to make things go really fast,
00:25:04.860 | which is all that rockets do.
00:25:08.120 | That's all orbit is, it's just going really fast.
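To put numbers on "going really fast" (a sketch using textbook values, not figures from the conversation): low Earth orbit is about 7.8 km/s, and the Tsiolkovsky rocket equation shows why a chemical rocket that reaches it is almost entirely propellant.

```python
import math

def mass_ratio(delta_v: float, isp: float, g0: float = 9.81) -> float:
    """Tsiolkovsky rocket equation: m0 / mf = exp(dv / (Isp * g0))."""
    return math.exp(delta_v / (isp * g0))

dv = 9_400.0  # m/s: ~7.8 km/s orbital speed plus gravity and drag losses
isp = 350.0   # s: roughly kerosene/LOX-class vacuum specific impulse
r = mass_ratio(dv, isp)
print(r, 1 - 1 / r)  # ~15.5 and ~0.94: the vehicle is ~94% propellant at liftoff
```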
00:25:11.220 | And because so many nerds are obsessed with space,
00:25:16.320 | people have been thinking about alternate schemes
00:25:20.520 | for as long as they've been thinking about rockets.
00:25:23.220 | And so one of the first things that I learned
00:25:26.880 | kind of trying to explore new possibilities
00:25:31.240 | was that I could put all of my brain power to work
00:25:37.600 | and be creative as I could
00:25:40.360 | and invent some idea that I thought was new
00:25:44.840 | for making things go fast.
00:25:46.440 | And I would always find out that some guy in Russia
00:25:49.440 | or somewhere had thought the same idea up 50 years ago
00:25:53.480 | and figured out all the math.
00:25:56.140 | - Yeah.
00:25:57.520 | - And so at a certain point,
00:26:00.280 | you give up on trying to invent completely new ideas
00:26:03.880 | and just go poking around trying to find those guys.
00:26:07.860 | So there's a number of ideas that we looked at.
00:26:14.960 | Some are crazier, some are less crazy,
00:26:18.280 | but the direction that that company eventually took
00:26:21.920 | was chemical rockets.
00:26:23.680 | - Is there something you can comment on possible ideas?
00:26:26.560 | So first of all, I mean, you could use nuclear,
00:26:31.560 | so nuclear propulsion.
00:26:34.920 | - Yeah, so that's, I mean,
00:26:36.520 | you've probably heard of Project Orion,
00:26:38.560 | which was Freeman Dyson and some of his collaborators
00:26:46.000 | had a scheme to power a large space vehicle
00:26:50.260 | by detonating atomic bombs behind it.
00:26:53.140 | And so one of the other people
00:26:55.000 | who was working at Blue Operations during this time
00:26:58.060 | was George Dyson, the son of Freeman.
00:27:01.220 | And so we knew all about Project Orion
00:27:04.720 | and he found an old film that they'd shot
00:27:08.120 | on a beach in La Jolla of a prototype of this
00:27:11.200 | that was powered by like lumps of C4.
00:27:16.200 | So that was an idea, but for a private company,
00:27:19.120 | obtaining a large number of atomic bombs
00:27:21.720 | was probably out of scope.
00:27:23.280 | So there's more of a theoretical thing.
00:27:27.160 | There's a conceptually similar approach using lasers
00:27:32.160 | that Freeman worked on with Arthur Kantrowitz
00:27:39.320 | and some others where you take a pulse laser
00:27:41.720 | and you fire it at a vehicle
00:27:44.720 | that has a block of ice on the back.
00:27:47.320 | And the pulse hits the ice and flashes off a layer of steam
00:27:52.320 | that becomes plasma and plasma is opaque
00:27:57.060 | because it conducts.
00:27:58.280 | And so being opaque, it then absorbs all of the energy
00:28:02.920 | from the laser pulse and gets really hot
00:28:05.600 | and just pushes on the back of the block of ice.
00:28:09.960 | And then you wait a moment for that to dissipate
00:28:12.320 | and then you do it again.
00:28:14.000 | So it would just kind of vibrate its way.
00:28:18.000 | Like it sounds really violent,
00:28:20.600 | but Freeman said that if you were wearing
00:28:22.560 | like rubber-soled tennis shoes standing in this vehicle,
00:28:26.480 | you would just feel a mild vibration.
00:28:29.260 | So there your source of energy is on the ground
00:28:33.360 | and you're getting higher specific impulse
00:28:35.400 | than you could get by burning chemicals.
00:28:37.620 | Jordin Kare and others worked on another laser system,
00:28:43.640 | the late Dr. Jordin Kare,
00:28:45.480 | that just would heat up a heat exchanger
00:28:49.900 | by many converging solid state lasers from the ground.
00:28:54.900 | And Kevin Parkin works on a similar scheme
00:29:00.000 | that just uses microwaves to do that.
00:29:05.120 | We looked at tall towers.
00:29:08.120 | I spent a while looking kind of semi-seriously
00:29:10.840 | at giant bull whips.
00:29:12.760 | - What's a bull whip?
00:29:15.320 | - Just a whip.
00:29:16.360 | Just you have them here in Texas, right?
00:29:19.440 | - Yeah, I understand.
00:29:21.880 | But how does that have to do with propulsion?
00:29:23.900 | - If you think about it,
00:29:24.760 | a whip is an incredibly simple primitive object
00:29:28.660 | that can break the speed of sound.
00:29:31.800 | So it's unbelievable in a way that for thousands of years,
00:29:36.800 | people with no technology have been able
00:29:40.280 | to accelerate objects through the speed of sound
00:29:44.840 | just through an architectural trick.
00:29:47.280 | Just the physics of a moving bend of material in a medium
00:29:53.640 | can do this.
00:29:56.380 | So that's the thing I still think about
00:30:00.980 | from time to time.
00:30:02.200 | You can use the same physics to make freestanding loops
00:30:05.860 | of chain or other flexible materials
00:30:09.720 | that just kind of stand up under their own physics.
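A toy version of the standard energy argument for the whip crack, with assumed numbers rather than anything from the conversation: if the kinetic energy of the moving segment is roughly conserved as the wave runs into the thin, light end of the taper, the speed grows as one over the square root of the moving mass.

```python
def tip_speed(v_handle: float, m_handle: float, m_tip: float) -> float:
    """If (1/2) * m * v^2 is roughly conserved: v_tip = v_handle * sqrt(m_handle / m_tip)."""
    return v_handle * (m_handle / m_tip) ** 0.5

# Assumed numbers: a ~10 m/s arm motion driving a whip whose moving
# mass tapers by a factor of ~2,000 between handle and tip.
print(tip_speed(10.0, 0.2, 0.0001))  # ~447 m/s, well past the ~343 m/s speed of sound
```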
00:30:15.160 | - I mean, it's kind of awesome to imagine.
00:30:20.280 | So you imagine using the same kind of physics of a whip
00:30:23.840 | but have at the end of it a spaceship.
00:30:27.880 | - Yeah, that would detach at the moment of maximum velocity.
00:30:32.880 | - Why not?
00:30:36.780 | Why wouldn't that?
00:30:37.660 | - So part of my motivation in studying that
00:30:41.020 | was to ask that question.
00:30:42.900 | It was more almost a symbolic way of saying,
00:30:47.900 | look, there's all kinds of physics we haven't explored yet
00:30:54.620 | that it's no more crazy than the idea of chemical rockets.
00:30:59.620 | It's just that more money's gone into chemical rockets.
00:31:05.800 | - Can I ask you a question on propulsion
00:31:11.300 | that's a little bit more out there?
00:31:14.260 | So I don't know if you've seen quite a lot
00:31:19.100 | of recent articles and reports and so on about UFOs,
00:31:24.260 | like the Tic Tac aircraft.
00:31:26.700 | - I keep seeing a lot of chatter about it,
00:31:30.020 | but I haven't gone deep into it.
00:31:32.580 | - So the DOD released footage filmed by pilots
00:31:37.580 | and there's a lot of reports about objects
00:31:42.500 | that moved in ways they haven't seen before
00:31:45.260 | that seem to defy the laws of physics
00:31:47.900 | if we consider the aircraft that we have today.
00:31:52.460 | And so the reason I asked you that
00:31:54.900 | is because it kind of, to me, whatever the heck it is,
00:31:59.900 | it's inspiring for the possibilities
00:32:05.140 | of ideas for propulsion.
00:32:07.940 | If it's like secret projects from foreign nations
00:32:12.940 | or it's physical phenomena that we don't yet understand,
00:32:16.060 | like ball lightning, all those kinds of things,
00:32:18.300 | or if it is aliens or objects from an alien civilization,
00:32:23.300 | I most likely believe if it's an object
00:32:26.180 | from an alien civilization,
00:32:27.300 | it's gotta be like a really dumb drone
00:32:30.820 | that just got lost.
00:32:32.300 | It's definitely not the pinnacle of intelligence.
00:32:36.900 | It's like some teenagers--
00:32:39.700 | - Science fair experiment.
00:32:42.300 | - Yeah, it just flew for a few centuries out
00:32:45.140 | and just landed and then we humans
00:32:47.460 | are all like really excited about this wild thing.
00:32:51.940 | I mean, what do you think about those,
00:32:53.900 | first of all, like the millions of reports of UFOs, right?
00:32:57.300 | There's some psychology there that's deeply cultural,
00:33:00.700 | but also the possibility of aliens having visited Earth.
00:33:05.700 | - Yeah, I mean, I'd like to see some better pictures.
00:33:09.360 | For the reason I mentioned earlier,
00:33:10.940 | having to do with the difficulty
00:33:12.780 | of traveling between star systems,
00:33:15.780 | it's really hard for me to believe it's aliens.
00:33:18.280 | I just can't understand why you would go to all that trouble
00:33:24.140 | to transport something across light years
00:33:27.100 | and then do what these UFOs are allegedly doing.
00:33:32.100 | Like, how is that interesting?
00:33:33.980 | How does that justify the trip?
00:33:36.700 | - So if you travel across those kinds of distances,
00:33:41.700 | you'd make a bigger splash.
00:33:45.180 | First of all, I would expect that the arrival
00:33:48.700 | of these things would be something we'd notice.
00:33:51.300 | It's gotta decelerate into our solar system by,
00:33:56.300 | unless it got here really, really, really slowly.
00:33:59.060 | So I guess that's a possibility and just kind of snuck in.
00:34:04.060 | - So at the end, we would detect some kind of footprint
00:34:06.800 | in terms of energy.
00:34:07.940 | - You would think.
00:34:08.940 | So I actually think your idea
00:34:11.100 | of a science fair project gone bad,
00:34:15.180 | it makes more sense in that it would explain
00:34:19.020 | why if these things are alien technologies,
00:34:22.780 | they're just kind of hanging around our aircraft carriers
00:34:26.640 | for no particular reason,
00:34:28.100 | like not trying to communicate.
00:34:30.900 | - Can you imagine a scenario where aliens
00:34:36.740 | have visited Earth or are visiting Earth
00:34:39.140 | and we wouldn't notice it at all?
00:34:41.500 | - Oh, sure.
00:34:42.340 | If they've got technology to get here,
00:34:45.660 | they've probably got technology to conceal the fact.
00:34:49.180 | - Oh, they're trying to conceal themselves.
00:34:50.940 | I meant more like they're not trying to conceal themselves,
00:34:53.640 | but we're just, our cognitive capabilities are too limited
00:34:58.640 | and we are not thinking big enough.
00:35:00.820 | We're looking for little green men.
00:35:03.180 | We're looking for things that operate at a time scale
00:35:06.020 | that's human-like.
00:35:10.260 | - Yeah, no, I love thinking about ideas like that.
00:35:13.100 | That's great science fiction novel fodder,
00:35:16.620 | that the aliens are so different
00:35:19.620 | that we simply don't see them.
00:35:22.220 | - I mean, is there, in terms of language,
00:35:25.440 | do you think it would be difficult,
00:35:28.620 | not aliens visiting us, but traveling to other places
00:35:31.580 | to find a common language?
00:35:33.780 | You've written about the importance of language
00:35:37.260 | in intelligent civilizations.
00:35:40.160 | How difficult is the problem to bridge the gap
00:35:44.360 | between aliens and humans in terms of language
00:35:48.400 | so we're not lost in translation?
00:35:50.200 | - Yeah, I mean, there's different takes on that
00:35:52.320 | depending on how biologically similar they are to us.
00:35:56.560 | I mean, there's a school of thought that says basically,
00:35:59.680 | advanced life has to be carbon-based
00:36:05.640 | for just reasons of chemistry.
00:36:07.440 | So right away, if you impose that limitation,
00:36:10.720 | then you're kind of assuming something
00:36:14.240 | that's starting to be biologically similar to us.
00:36:17.520 | So if they're about as big as we are
00:36:20.040 | and they kind of move around in space,
00:36:25.040 | in a physical body the way we do,
00:36:26.960 | then there's probably a way
00:36:28.680 | to solve that communication problem.
00:36:31.760 | If they're beings of pure energy from Star Trek
00:36:36.920 | or something like that, then it's a different story.
00:36:40.760 | - Well, I love thinking about that kind of stuff too.
00:36:42.480 | I mean, consciousness itself may be alien.
00:36:47.480 | I mean, it could be, like you said, beings of pure energy.
00:36:52.200 | I think of life as just complex systems
00:36:58.920 | and the kind of forms those complex systems can take
00:37:02.440 | seems to be much larger
00:37:04.120 | than the particular biological systems
00:37:05.880 | we see here on Earth.
00:37:06.960 | I have to ask a Twitter question about aliens.
00:37:11.720 | You're ready, this is for Twitter.
00:37:13.360 | - I'm ready.
00:37:14.200 | - What would you expect from Twitter?
00:37:15.560 | Can humans have sex with aliens?
00:37:20.360 | Neal Stephenson.
00:37:20.360 | (laughing)
00:37:21.960 | You can pass.
00:37:23.000 | I asked the language question, can they communicate?
00:37:27.480 | - Yeah.
00:37:28.600 | - Can they fall in love before sex?
00:37:30.800 | That's how it works.
00:37:33.520 | So which question am I answering, the sex or the love?
00:37:37.000 | - I mean, it depends what is more fundamental
00:37:40.960 | to relations across intelligent species.
00:37:45.280 | - Yeah, I mean, sex can mean a lot of things.
00:37:49.760 | So I mean, if you're--
00:37:51.920 | - Reproduction, right?
00:37:52.960 | - In Star Trek, in classic Star Trek,
00:37:58.200 | you had to really suspend your disbelief
00:38:02.200 | to think that Spock was half Vulcan and half human, right?
00:38:07.200 | 'Cause that's just not gonna work DNA-wise.
00:38:12.800 | So if by sex you mean reproductive sex,
00:38:19.200 | then I would say no,
00:38:22.560 | unless you go to a panspermia kind of theory,
00:38:27.680 | which is that humans were seeded onto the planet
00:38:32.440 | as part of a galactic program of some sort.
00:38:37.440 | - And then we're just returning home
00:38:41.840 | and hanging out with our old relatives.
00:38:44.160 | - Distant cousins, yeah, yeah.
00:38:46.240 | But that doesn't seem plausible.
00:38:52.000 | We know that humans had sex with Neanderthals,
00:38:55.800 | with Denisovans, so you could think of them as aliens
00:39:00.800 | that came from our planet.
00:39:04.480 | So that's a kind of data point, I guess.
00:39:09.460 | But if you broaden your definition of sex
00:39:14.820 | to mean any kind of gratifying physical interaction,
00:39:19.820 | then sure.
00:39:23.320 | Dancing, and that's how we get to love.
00:39:26.760 | - Okay. - And love can take many forms.
00:39:28.320 | - Love can certainly take many forms.
00:39:30.320 | - I have to ask you, in terms of space,
00:39:34.080 | just looking at where Blue Origin is,
00:39:35.960 | looking at where SpaceX is today,
00:39:37.880 | and maybe looking out 10, 20 years out from now,
00:39:43.140 | are you impressed of what's happening?
00:39:44.960 | We just saw William Shatner go up to space.
00:39:47.680 | - Yeah, I was just watching his video this morning
00:39:50.520 | before I came here, yeah.
00:39:52.720 | - Are you impressed of where things stand today?
00:39:54.480 | - Yeah, I mean, SpaceX in particular has done things
00:39:59.480 | that are just unbelievable.
00:40:01.860 | And I don't think anyone was anticipating 20 years ago,
00:40:08.360 | let's say, when this all started,
00:40:11.840 | just the speed with which they'd be able
00:40:14.980 | to rack up these incredible achievements.
00:40:18.920 | If you've kind of even seen a little bit
00:40:22.280 | of how the sausage is made,
00:40:24.280 | and sort of the difficulty of doing any kind of space travel,
00:40:29.280 | what they've achieved is just, is unbelievable.
00:40:35.640 | - What about the, maybe a question about Elon Musk,
00:40:40.780 | even more than Jeff Bezos,
00:40:43.440 | he has a very kind of ambitious vision
00:40:47.160 | of this project that we're on as a species,
00:40:51.920 | of becoming a multi-planetary species,
00:40:54.760 | and becoming that quickly, as soon as possible,
00:40:58.800 | landing on Mars, colonizing Mars.
00:41:01.160 | What do you think of that project?
00:41:03.000 | There's two questions to ask.
00:41:04.040 | First, the question is,
00:41:05.280 | what do you think about the project of colonizing Mars?
00:41:08.920 | And second, what do you think about a human being
00:41:14.880 | who is so unapologetically ambitious
00:41:18.400 | at achieving the impossible,
00:41:21.120 | at what a lot of people would say is impossible?
00:41:23.440 | - I think that colonizing Mars is the kind of goal
00:41:27.760 | that's easily stated, it's catchy,
00:41:32.760 | it's the kind of thing that can inspire people
00:41:37.840 | to get involved in a way that some other programs might not.
00:41:42.720 | So I think it's well chosen in that way.
00:41:45.800 | I have technical questions about,
00:41:48.240 | you know, there's a problem of perchlorates
00:41:52.960 | on the surface of Mars that's gonna be big trouble,
00:41:56.580 | and there's radiation, so, and this is known, but--
00:42:02.240 | - What about business questions?
00:42:06.000 | Do you think, 'cause you mentioned sort of,
00:42:08.160 | going outside of the solar system
00:42:11.560 | would best be done for religious reasons.
00:42:14.140 | What about colonizing Mars?
00:42:17.640 | Can you spin it into a business proposition?
00:42:20.240 | - It's hard to think of a resource that's on Mars
00:42:24.840 | that could be brought back here cheaply enough
00:42:28.080 | to compete with stuff we could just dig out of the ground
00:42:33.080 | here or grow here.
00:42:35.560 | So I don't know if there is a business plan for that,
00:42:40.440 | or if it's just strictly, we're gonna go there
00:42:43.840 | and see what happens.
00:42:45.840 | - Maybe again, we need communism to kinda--
00:42:51.640 | - Yeah. - To get us going,
00:42:53.400 | to give us a reason, a little bit of the competition.
00:42:55.720 | - Well, there's plenty of people who are sufficiently excited
00:42:58.560 | by the colonized Mars vision that they're willing
00:43:02.640 | to just go all in on it,
00:43:05.440 | even if there's not a business plan behind it.
00:43:10.260 | So I think it's well chosen.
00:43:13.260 | It's just, I think it's probably the only approach to take.
00:43:18.260 | Again, a lot of the, when white people came to this continent
00:43:27.580 | and started colonizing it,
00:43:29.520 | there was not a lot of coherent planning.
00:43:35.020 | Like what plans they did have
00:43:37.300 | turned out to be terrible plans.
00:43:39.260 | Trying to come up with plans that extend decades
00:43:43.660 | into the future is a waste of time.
00:43:48.020 | - So do it for the kind of unexplainable love
00:43:52.300 | of the unknown, like the journey
00:43:56.500 | towards exploring the unknown.
00:43:59.500 | - Yeah. - And just kinda keep going.
00:44:01.740 | - Yeah.
00:44:02.900 | And well, you saw it with Shatner and his reaction
00:44:06.580 | to the flight yesterday.
00:44:08.900 | For him, that trip was more than worth it
00:44:17.100 | just for these intangible reasons.
00:44:19.860 | - What did he say?
00:44:20.780 | I haven't watched the video yet.
00:44:21.980 | - He was trying to express,
00:44:24.180 | talking a lot about the moment where suddenly you kind of
00:44:27.220 | rise above the thin blue blanket of the atmosphere
00:44:33.700 | and you're up into the blackness.
00:44:37.020 | And that had a huge impact on him.
00:44:40.940 | So he was kind of, I wouldn't say groping for words
00:44:44.260 | 'cause he was pretty eloquent,
00:44:45.660 | but he was trying to express his feelings about that
00:44:49.140 | in a way that is pretty gripping to watch.
00:44:54.020 | - So you've worked on this kind of stuff,
00:44:58.740 | we can go back to 10 years ago.
00:45:00.580 | You wrote an essay called "Innovation Starvation."
00:45:03.940 | You worked on this kind of idea since then.
00:45:06.840 | Kind of looking at maybe a little bit cynically
00:45:13.380 | about our age today and our unwillingness
00:45:17.260 | to take on big, risky projects.
00:45:19.860 | So in the face of that,
00:45:20.980 | what do you think of people like Elon Musk?
00:45:23.980 | 'Cause to me, people like that are inspiring
00:45:28.140 | and gives you hope in the face of a more kind of
00:45:31.940 | pessimistic perspective of our age.
00:45:36.220 | - Yeah, well, he's clearly willing to tackle
00:45:39.740 | big, ambitious projects without a lot of
00:45:44.660 | kind of soul searching or
00:45:47.780 | trying to make up his mind, right?
00:45:52.580 | It's just like- - Just go and do it.
00:45:55.620 | Let's dig tunnels under cities, go.
00:45:58.520 | - Step one, make a joke about it on Twitter,
00:46:02.460 | step two, actually do it.
00:46:03.780 | - Yeah, yeah, yeah.
00:46:06.260 | And I mean, things have slowed down quite,
00:46:09.460 | our ability to build things
00:46:13.380 | at pace is a lot less than it was.
00:46:18.300 | And there's reasons for that.
00:46:20.220 | We're more concerned with safety and environmental impacts
00:46:23.660 | than people were when they were building
00:46:26.620 | some of the great public works projects
00:46:29.100 | of the mid 20th century.
00:46:31.540 | But even, we're at the point now
00:46:34.060 | where even just maintaining the stuff that we've got
00:46:36.700 | is such a huge project
00:46:38.940 | that we need to put big resources into it
00:46:42.540 | and good minds into it,
00:46:45.060 | or else we're gonna be losing
00:46:47.780 | things that we take for granted.
00:46:52.300 | - Do you think that there's a lot to be done
00:46:54.220 | in the digital space?
00:46:55.860 | That's, we mentioned sort of Wikipedia and knowledge.
00:47:00.460 | Don't you think there could be a lot of flourishing
00:47:02.460 | in the space of innovation,
00:47:04.100 | in terms of innovation in the digital space?
00:47:07.300 | - Yeah, I mean, I'd like to see that.
00:47:08.900 | I think it's where a lot of the brainpower went
00:47:13.180 | during the last couple of generations
00:47:15.540 | because people who might previously
00:47:19.180 | have been building rockets
00:47:20.940 | or other kinds of sort of hard technologies
00:47:25.100 | ended up instead going into programming computer science,
00:47:29.300 | which is understandable and great.
00:47:32.180 | We've got structural problems right now
00:47:35.700 | in the way social media works that are pretty severe.
00:47:39.380 | And so I certainly hope that we're not,
00:47:44.220 | 10 years from now,
00:47:45.060 | that we're not exactly where we are today
00:47:47.660 | when it comes to that stuff.
00:47:49.700 | We need to move on.
00:47:51.580 | - The beautiful thing about problems
00:47:54.500 | is they show you how not to do things.
00:47:57.540 | And they give opportunity to new ideas to flourish
00:48:02.540 | and to beat out the ideas of the old,
00:48:06.420 | which is a dream for me to see new social media
00:48:11.420 | that beats out the ways of the old.
00:48:15.620 | So I tend to, you perhaps agree
00:48:19.620 | that it's not impossible to do social media well.
00:48:22.020 | - Oh, not at all.
00:48:22.860 | I mean, I listened to your interview with Jaron
00:48:25.460 | a couple of weeks ago, and I know Jaron,
00:48:28.220 | and we've talked about this.
00:48:30.940 | - He went hard on me.
00:48:32.660 | He basically said like, it's impossible.
00:48:35.700 | - It's very nice.
00:48:36.660 | Well, the last time I kind of paid attention
00:48:40.700 | to Jaron's thoughts on it,
00:48:41.980 | he was thinking in terms of that basically
00:48:45.940 | there should be micro payments such that if I,
00:48:50.260 | by clicking the like button on something,
00:48:53.260 | I'm essentially giving valuable intellectual property
00:48:58.260 | to Facebook or Twitter or whatever.
00:49:03.140 | It's not a very large amount of IP,
00:49:05.580 | but it's definitely a transfer of information
00:49:08.220 | that when they aggregate it is beneficial to them.
00:49:11.860 | So, and now I do remember that he,
00:49:15.380 | on his interview with you was talking about,
00:49:19.660 | what, data unions or, yeah.
00:49:22.580 | - Those are a lot of interesting ideas,
00:49:23.940 | but for me, the biggest disagreement
00:49:26.860 | was in the level of cynicism.
00:49:30.540 | He has a distrust and cynicism towards people
00:49:34.380 | in Silicon Valley being able to do these kinds of things.
00:49:37.900 | And I'm really,
00:49:41.200 | okay, when you have a large crowd of people
00:49:43.180 | that are doing things the wrong way,
00:49:45.620 | you should nevertheless maintain optimism
00:49:48.460 | because what's important is to find the one person
00:49:51.620 | in that room that's going to do things the right way.
00:49:53.540 | Cynicism is going to completely silence out the whole room.
00:49:57.660 | So he was saying, I've been here a long time.
00:50:00.780 | - Oh yeah.
00:50:01.620 | - I know, I understand like how these folks work.
00:50:05.940 | They think they're gods
00:50:09.200 | and they know the right way to do things
00:50:11.560 | and they will tell you how to do those things.
00:50:15.080 | And that kind of hubris is going to always lead you astray
00:50:19.200 | when you are the one who's engineering the algorithms.
00:50:22.600 | And there's a lot of deep truth to that
00:50:25.040 | because algorithms are powerful
00:50:27.280 | and many people when given power
00:50:31.320 | do not do the best of things.
00:50:32.880 | I mean, most, what is it?
00:50:35.200 | The old Lincoln line,
00:50:36.960 | if you want to test a man's character, give him power.
00:50:39.420 | - Yeah.
00:50:40.620 | - Yes, but that doesn't mean
00:50:42.460 | that some people are not able to handle the power
00:50:45.500 | and that some people are not able to come up
00:50:47.060 | with good ideas that create better social media.
00:50:51.220 | - Yeah, I didn't interpret Jaron's statements
00:50:54.100 | as being entirely cynical and hopeless.
00:50:57.560 | I mean, he's definitely raising issues of concern,
00:51:02.560 | but he wouldn't be out writing the books that he's written
00:51:06.660 | and talking about this stuff
00:51:07.720 | if he didn't think there was a way.
00:51:09.180 | - If he didn't think there was hope, yeah.
00:51:11.360 | And part of it, as you probably know with Jaron,
00:51:13.720 | he just loves a good argument.
00:51:15.380 | - Yeah.
00:51:16.220 | - He just loves to have a little bit of fun.
00:51:18.620 | Well, I have to ask you about,
00:51:21.980 | I mean, we talked about taking all big, bold, risky ideas.
00:51:26.980 | So in your new book, "Termination Shock,"
00:51:31.100 | it's set here in Texas.
00:51:32.900 | - Part of it is, yeah.
00:51:33.980 | - Yeah.
00:51:34.820 | - Most of it.
00:51:35.640 | - Yeah, it's a great place to set it.
00:51:37.440 | So in it, the main character, T.R. McCooligan,
00:51:41.140 | a Texas billionaire, oil man, and truck stop magnate,
00:51:44.780 | decides to solve climate change,
00:51:46.380 | to take on climate change by himself.
00:51:48.380 | So this is an interesting philosophical exploration
00:51:52.540 | of how to solve climate change from a perspective
00:51:54.660 | that's perhaps different than we've been thinking about.
00:51:57.780 | - I wouldn't use the word solve,
00:51:59.380 | but let's say ameliorate the temporary effects.
00:52:04.820 | But please.
00:52:05.660 | - Take on.
00:52:06.500 | - Yeah.
00:52:07.320 | - Take on the challenge.
00:52:08.160 | So it's very interesting,
00:52:09.620 | but as, so there's a gradual nature to this process.
00:52:14.060 | And I mean, just like in your book,
00:52:19.620 | the power of innovation is something
00:52:26.420 | that has saved us quite a few times in history.
00:52:30.760 | So what role does that play in this gradual process?
00:52:34.700 | - Right, so ultimately we don't solve the problem
00:52:38.620 | until we get the CO2 out of the atmosphere.
00:52:42.480 | But that is gonna take a while.
00:52:45.860 | We're still adding more.
00:52:48.980 | We haven't even started to reduce the amount.
00:52:53.140 | - So there's two possibilities in solving climate change.
00:52:56.900 | Reduce the amount that we're putting in the atmosphere,
00:53:00.420 | and two is removing what we got in the atmosphere.
00:53:03.660 | - We have to do both.
00:53:04.780 | - Right, and those are two different kind of efforts
00:53:09.380 | in terms of like what's involved.
00:53:10.940 | - 'Cause it stays up there.
00:53:12.500 | So I think just last week China announced
00:53:15.980 | that they're gonna try to level off
00:53:18.840 | their CO2 emissions in like 2030.
00:53:23.260 | So 2031, they'll only put as much CO2
00:53:27.300 | into the atmosphere as they did in 2030,
00:53:30.420 | which is still a lot of CO2.
00:53:33.260 | In 2060, they're saying we'll be net zero.
00:53:36.740 | So if everyone in the world does that,
00:53:38.640 | and the PPM of CO2 in the atmosphere by then
00:53:42.680 | is say 450 parts per million,
00:53:46.020 | it'll stay at 450 parts per million until we take it out.
00:53:50.680 | And taking it out is hard.
00:53:56.780 | It's a big--
00:53:56.780 | It took us a long time.
00:53:58.320 | We had to empty out huge coal mines and oil reservoirs
00:54:02.940 | and burn all that stuff.
00:54:04.220 | We had to chop down forests and dig up peat bogs
00:54:07.220 | in order to create all of that CO2.
00:54:11.300 | And so we have to reverse all of those processes somehow
00:54:16.300 | in order to remove the CO2 and get it back down,
00:54:21.220 | hopefully into the 200 and some parts per million range
00:54:24.820 | where it used to be.
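For a sense of scale, a back-of-the-envelope using the standard conversion of roughly 7.8 gigatonnes of atmospheric CO2 per ppm (the 450 ppm figure is the hypothetical from above):

```python
GT_CO2_PER_PPM = 7.8  # approx. gigatonnes of atmospheric CO2 per 1 ppm

def removal_needed_gt(ppm_now: float, ppm_target: float) -> float:
    """Mass of CO2 to remove to move between two concentrations."""
    return (ppm_now - ppm_target) * GT_CO2_PER_PPM

# From a hypothetical 450 ppm back toward the ~280 ppm preindustrial level:
print(removal_needed_gt(450, 280))  # ~1,300 Gt CO2, vs ~37 Gt emitted per year today
```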
00:54:26.340 | - So how about you get a single Texas billionaire
00:54:30.280 | to have a massive gun that blasts huge quantities of sulfur
00:54:32.960 | into the upper atmosphere?
00:54:35.120 | Like that's idea number one.
00:54:37.220 | - This is called solar geoengineering.
00:54:40.480 | And we know that it's a possibility on a technical level
00:54:45.200 | because volcanoes have been doing it forever.
00:54:48.100 | So many times in human history,
00:54:51.300 | we've seen a volcanic eruption
00:54:54.080 | that was followed by a global cooling trend
00:54:56.520 | that lasted for a couple of years.
00:54:58.840 | And one of these things happened,
00:55:00.860 | I think in the '60s or '70s in Indonesia.
00:55:03.800 | And the Australians sent a plane up into the stratosphere
00:55:08.800 | to take some samples of the plume.
00:55:11.520 | And when it came back down,
00:55:13.040 | the windscreen of the plane had sort of a deposit on it.
00:55:17.520 | So one of the Australian scientists licked it
00:55:22.440 | and reported that it was painfully acid.
00:55:26.160 | So that was our first kind of clue
00:55:28.120 | that what was being injected into the stratosphere
00:55:31.760 | was sulfur dioxide.
00:55:33.940 | And so we know, then Pinatubo came along in the '90s
00:55:41.360 | and did this experiment for us.
00:55:43.080 | So we know that sulfur in the stratosphere,
00:55:46.840 | it forms little spherical droplets of sulfuric acid
00:55:51.060 | after it combines with water,
00:55:52.600 | and those bounce back some of the sun's rays
00:55:56.560 | and reduce the amount of solar energy
00:56:00.120 | entering the troposphere, which is where we live.
00:56:03.540 | So we know that it works,
00:56:06.800 | and we also know that the stuff goes away
00:56:10.040 | after a couple of years.
00:56:12.200 | So it gradually washes out.
00:56:14.200 | And so it's not a permanent thing.
00:56:17.760 | The good news, bad news is,
00:56:19.380 | good news is it's not permanent.
00:56:22.580 | So if you don't like what's happening,
00:56:25.300 | you can just stop and wait a couple of years,
00:56:28.060 | and you'll get back to where you started.
00:56:30.660 | And the bad news, if you're in favor of this kind of thing,
00:56:33.960 | is that you have to keep doing it forever.
00:56:36.700 | So this guy is one of those,
00:56:41.980 | he's read these papers,
00:56:43.540 | he understands, T.R., the character in the book,
00:56:46.380 | he knows all this.
00:56:47.920 | And all people who are familiar with climate science
00:56:51.840 | kind of know this.
00:56:53.940 | It's a pretty well-established fact.
00:56:56.980 | And so he just decides he's gonna take action unilaterally
00:57:01.980 | and do this.
00:57:04.240 | And so there's different ways to get the sulfur up there,
00:57:09.240 | but because it's Texas,
00:57:10.600 | he builds the biggest gun in the world.
00:57:13.400 | It's just six barrels pointed straight up,
00:57:15.600 | and he begins firing shells loaded with sulfur
00:57:18.960 | into the stratosphere.
00:57:20.320 | And so the book is about not so much that
00:57:22.760 | as how people react to his doing that,
00:57:26.520 | what the political ramifications are around the world,
00:57:30.200 | 'cause this is an extremely controversial idea,
00:57:35.080 | and not everyone's on board with it.
00:57:37.480 | And even if you are willing to consider
00:57:42.040 | using a technological intervention,
00:57:44.320 | the fact is that it's gonna have different effects
00:57:47.840 | on different parts of the world.
00:57:49.300 | So some areas may suffer
00:57:54.080 | more negatives than positives,
00:57:56.600 | and they're not gonna be happy.
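A minimal sketch of the "washes out in a couple of years" dynamic described above, modeling stratospheric aerosol loading as exponential decay; the ~1.5-year e-folding time is an assumed, roughly Pinatubo-like value, not a figure from the conversation:

```python
import math

# ASSUMPTION: aerosol loading decays exponentially with an e-folding
# time of ~1.5 years once injections stop (a rough, Pinatubo-like value).
E_FOLDING_YEARS = 1.5

def remaining_fraction(years_after_stop: float) -> float:
    """Fraction of the stratospheric aerosol layer left after stopping."""
    return math.exp(-years_after_stop / E_FOLDING_YEARS)

for t in (0.5, 1, 2, 3, 5):
    print(f"{t:>4} yr after stopping: {remaining_fraction(t):5.1%} remains")
# After ~3 years only ~14% remains -- hence both the "good news"
# (it's reversible) and the "bad news" (you have to keep firing forever).
```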
00:57:58.900 | - So what do you think,
00:58:00.680 | so in his case, in TR's case,
00:58:03.540 | he can get around getting permission from governments.
00:58:10.840 | If we were to look at ourselves,
00:58:12.560 | outside of the story,
00:58:17.360 | facing climate change,
00:58:19.080 | where do you think the solution will come from?
00:58:20.620 | Governments working together,
00:58:22.400 | or from bold billionaire Texans?
00:58:27.400 | - I'm pretty sure that this kind of intervention
00:58:31.720 | is never gonna emerge from Western democracies.
00:58:36.720 | - This kind of, sorry, governments coordinating,
00:58:40.800 | which option, one or--
00:58:42.600 | - Solar geoengineering.
00:58:43.960 | - Solar geoengineering.
00:58:45.280 | From a government, from a...
00:58:47.200 | I wanna sort of draw the distinction:
00:58:50.400 | one is the idea, the technological idea
00:58:53.200 | you're talking about, but two is like,
00:58:56.160 | who comes up with the idea and agrees on it?
00:58:58.960 | Governments or individuals?
00:59:00.680 | - Yeah, if this were to happen,
00:59:02.400 | I think it would be either an individual,
00:59:05.120 | or more likely just some government somewhere
00:59:08.280 | that just decides it's in their interests
00:59:10.640 | to unilaterally do this.
00:59:13.920 | And that's not me advocating it,
00:59:17.520 | it's just, it's so,
00:59:21.640 | it would be comparatively so cheap and easy
00:59:24.960 | to implement a solar geoengineering scheme
00:59:28.880 | that someone is probably gonna do it
00:59:32.160 | once things get bad enough.
00:59:34.660 | But I don't think that governments will,
00:59:37.000 | or Western governments, just because
00:59:39.880 | they're not, well, we've seen what happened
00:59:43.320 | with vaccines, right?
00:59:45.400 | So, getting people to take vaccinations
00:59:50.400 | or wear masks has turned out to be incredibly hard,
00:59:55.960 | even though it might save those people's lives.
00:59:58.940 | - See, I blame, that's not a Western thing,
01:00:03.200 | that's, I blame a failure of leadership there,
01:00:06.440 | of leaders being, not coming off as authentic,
01:00:09.740 | not being inspiring, uniting, all those kinds of things.
01:00:13.200 | I think that's possible.
01:00:14.280 | I think it's just that we've gotten,
01:00:16.760 | the leaders we have right now-
01:00:18.240 | - Aren't the right people.
01:00:19.480 | - Aren't the right people, 'cause we've lived
01:00:21.160 | through kind of a long stretch
01:00:23.120 | of relatively comfortable times.
01:00:25.160 | And it feels like unfortunate,
01:00:27.880 | if you just look at history,
01:00:29.520 | that hard times make great leaders,
01:00:31.640 | and easy times make like bureaucrats
01:00:36.000 | that are egotistical and greedy
01:00:38.960 | and not very interesting and not very bold.
01:00:41.960 | - Yeah, no, I think that's fair.
01:00:43.560 | So, we may be entering one of those interesting times.
01:00:47.440 | - Of hardship.
01:00:48.280 | - In the Chinese curse sense, yeah.
01:00:50.360 | So, I could be wrong, but I mean,
01:00:54.960 | there have been some efforts to explore
01:00:58.560 | solar geoengineering.
01:01:00.540 | There was a plan to send up some balloons,
01:01:04.560 | high altitude balloons, to take some measurements
01:01:08.680 | in Scandinavia that got squashed by objections
01:01:13.080 | from people who lived up there,
01:01:14.900 | who were just opposed to the whole program on principle.
01:01:21.380 | So, we'll see a lot more of that,
01:01:24.600 | and it's gonna be a hard program to advocate for,
01:01:28.440 | just because I think people don't quite understand
01:01:32.400 | how much carbon dioxide is in the atmosphere
01:01:36.280 | and how far we are from even slowing down
01:01:40.780 | the rate that we're adding more,
01:01:43.880 | to say nothing of bringing that number down.
01:01:48.120 | We're a long way out from that.
01:01:50.520 | - Do you see, in terms of portfolio of solutions,
01:01:53.440 | us becoming a multi-planetary species as part of that,
01:01:56.760 | as this also being a motivator for investing some percent
01:02:03.400 | of GDP into becoming a multi-planetary species?
01:02:07.360 | And what percent should that be, do you think?
01:02:09.200 | - You know, in an indirect way, maybe.
01:02:11.520 | I mean, you know what people will say,
01:02:13.440 | which is the same argument that has been leveled
01:02:17.000 | against space exploration since the Apollo program,
01:02:20.440 | which is why don't we solve our problems here on Earth
01:02:23.760 | before we spend money going into space.
01:02:27.680 | So, I've never been a believer in that argument.
01:02:32.280 | I think there could be a sense in which the new perspective
01:02:37.280 | that could be obtained by thinking about,
01:02:43.680 | like if we're thinking about terraforming Mars,
01:02:47.640 | changing its atmosphere,
01:02:48.960 | making it more amenable to life and survival,
01:02:53.160 | you could see that maybe changing people's opinions
01:02:57.360 | about terraforming the Earth.
01:02:59.760 | - Yeah, there are some dangerous consequences
01:03:02.560 | to this particular idea of blasting sulfur,
01:03:07.560 | of geoengineering.
01:03:08.600 | What do you make of sort of big, bold ideas
01:03:15.280 | that are a double-edged sword?
01:03:18.720 | Are all ideas like this, all big ideas like this,
01:03:22.240 | they have the potential to have highly beneficial consequences
01:03:28.240 | and a potential to have highly destructive consequences?
01:03:33.240 | - I wouldn't say all.
01:03:35.360 | I think, you know, going back to what we were talking about
01:03:39.080 | earlier, you know, how technology developed
01:03:41.520 | in the '50s and '60s, there was a period of time there
01:03:44.960 | when people maybe had unrealistic ideas
01:03:48.720 | about new technology and weren't sufficiently attentive
01:03:52.560 | to the possible downsides.
01:03:55.380 | So we got, and there's a reason why.
01:04:00.380 | I mean, in the mid-20th century, we saw antibiotics,
01:04:05.640 | we saw the polio vaccine, we saw just simple things
01:04:13.480 | like refrigerators in the home.
01:04:15.700 | My grandmother, to her dying day,
01:04:20.240 | called the refrigerator the icebox
01:04:23.060 | because when she grew up, it was a box with ice in it.
01:04:26.800 | So you see all that change, and it's largely
01:04:30.080 | for the benefit of people.
01:04:31.360 | And so if somebody comes along and says,
01:04:34.400 | "Hey, we're gonna build nuclear reactors to make energy,"
01:04:38.460 | or "Here's a new chemical called DDT
01:04:42.640 | "that's gonna kill mosquitoes,"
01:04:46.120 | then it's easy to just buy into that
01:04:51.160 | and not be alert to the possible downsides.
01:04:55.340 | And of course, we know that the way
01:04:59.380 | that those early reactors were built
01:05:01.740 | and the way that the supply chain was built
01:05:04.980 | to create the fuel and deal with the waste
01:05:09.980 | was poorly thought out.
01:05:13.820 | And we're still dealing with the resulting problems
01:05:19.660 | at places like Hanford in the state of Washington.
01:05:23.260 | And we know that DDT, although it did kill a lot of insects,
01:05:28.260 | also had terrible effects on bird populations.
01:05:32.880 | So the kind of backlash that happened in the '70s
01:05:36.660 | that is still kind of going on is to sort of assume
01:05:41.360 | that everything is a double-edged sword
01:05:44.740 | and always to insist that
01:05:48.820 | we have to absolutely convince ourselves
01:05:51.060 | that the downside isn't gonna come back and bite us
01:05:56.060 | before we can adopt any new technology.
01:06:00.500 | And I think people are overly sensitized to that now.
01:06:05.500 | - Yeah, it's funny.
01:06:11.220 | Depending on the technology,
01:06:12.500 | people are a little bit too terrified
01:06:15.940 | of certain technologies,
01:06:17.420 | like artificial intelligence is one.
01:06:19.260 | My sense is that the things that they're afraid of
01:06:25.260 | aren't the things that are likely going to happen
01:06:28.540 | in terms of negative things.
01:06:30.260 | It's probably impossible to predict exactly
01:06:32.580 | the unintended negative consequences.
01:06:35.940 | But what's also interesting is for AI as an example,
01:06:39.860 | people don't think enough about the positive things.
01:06:43.420 | I mean, the same is true with social media.
01:06:45.460 | It's very popular now for some reason
01:06:48.380 | to talk about all the negative effects of social media.
01:06:51.060 | We've immediately forgotten
01:06:52.780 | how incredible it is to connect across the world.
01:06:58.700 | There's a deep loneliness within all of us.
01:07:02.020 | We long to connect and social media, at least in part,
01:07:05.660 | enables that even in its current state.
01:07:08.620 | And all the negative things we see with social media
01:07:12.500 | currently are also in part
01:07:14.380 | just revealing the basics of human nature.
01:07:16.700 | It didn't make us worse.
01:07:17.540 | It's just bringing it to the surface.
01:07:20.100 | And step one of solving a problem
01:07:21.540 | is bringing it to the surface.
01:07:23.140 | The fact that there's a division,
01:07:26.060 | the fact that they're more easily angered and upset,
01:07:30.380 | and all of that, the witch hunts,
01:07:32.740 | all those kinds of things, that's human nature.
01:07:35.140 | And it just reveals that, allowing us to now work on it;
01:07:38.860 | it's therapy.
01:07:40.780 | And so that's another example of a technology
01:07:43.380 | that's just, we're not considering
01:07:46.500 | the positive effects now and in the future enough of.
01:07:50.180 | I have to ask about,
01:07:52.500 | there's a million things I can ask about,
01:07:55.220 | but virtual reality, I gotta ask you.
01:07:57.740 | You've thought about virtual reality, mixed reality
01:08:02.300 | quite a bit.
01:08:05.500 | What are the interesting trajectories you see
01:08:08.580 | for the proliferation of virtual reality
01:08:11.540 | or mixed reality in the next few years?
01:08:13.900 | - Yeah, so I was at Magic Leap for, what, five years?
01:08:18.900 | - With the best title of all time.
01:08:21.260 | - Oh, thanks.
01:08:22.980 | Chief Futurist?
01:08:23.980 | - Yeah. - Yeah.
01:08:25.340 | And so I sort of had a little squad of people in Seattle
01:08:30.340 | doing what you might call content R&D.
01:08:33.660 | So we're trying to make content for AR,
01:08:36.260 | but because it's such a new medium,
01:08:40.860 | there's more of an engineering R&D project almost
01:08:45.580 | than a creative project.
01:08:48.500 | So it was fascinating to see everything that goes
01:08:53.500 | into making an AR system that runs.
01:08:58.540 | So AR, an AR device, if it's really gonna do AR,
01:09:05.980 | needs to be running SLAM in real time.
01:09:09.700 | And that alone is a big--
01:09:11.620 | - So for people who don't know, first of all,
01:09:13.860 | virtual reality is creating an almost fully artificial world
01:09:18.860 | and putting you inside it.
01:09:20.620 | Augmented reality, AR, is taking the real world
01:09:25.380 | and putting stuff on top of that real world.
01:09:29.660 | And when you say SLAM, that means in real time,
01:09:32.500 | the device needs to be able to sense accurately,
01:09:36.700 | detect everything about that world sufficiently
01:09:39.900 | to be able to reconstruct the 3D structure of it
01:09:44.900 | so you can put stuff on top of it.
01:09:47.060 | And doing that in real time, presumably not just real time,
01:09:51.300 | but in a way that creates a pleasant experience
01:09:54.660 | for the human perception system is, yeah,
01:09:58.540 | that's an engineering project.
01:10:01.620 | - Right, yeah, well said.
01:10:02.900 | And it's just one of the things that the system has to do.
01:10:07.460 | It's also tracking your eyes,
01:10:09.660 | so it knows what you're looking at,
01:10:11.700 | how far away what you're looking at is.
01:10:14.620 | It's performing all those functions,
01:10:21.180 | and it's gotta keep doing that without burning up the CPU
01:10:27.780 | or depleting the battery unreasonably fast.
01:10:32.500 | And that's just table stakes.
01:10:35.540 | It's just the basic functions of the operating system.
01:10:39.140 | And then any content that you wanna add
01:10:41.800 | has to sit on top of that.
01:10:43.860 | It's gotta be rendered by the optics
01:10:46.260 | at a sufficiently low latency that it looks real
01:10:51.180 | and you don't get sick.
01:10:52.380 | So it's an amazing thing.
01:10:53.820 | And Magic Leap shipped a device that can do that in 2019,
01:10:58.820 | and they're about to ship the ML2,
01:11:04.580 | but I don't know any more about that than anyone else
01:11:08.540 | 'cause I don't work there anymore.
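A small sketch of the latency arithmetic behind "sufficiently low latency": everything the headset does per frame has to fit inside one display refresh, and total motion-to-photon delay is commonly said to need to stay under roughly 20 ms. That threshold is a rule-of-thumb assumption, not a Magic Leap spec:

```python
# Per-frame time budget for an AR headset: SLAM, eye tracking, app logic,
# and rendering all have to land inside one display refresh, while total
# motion-to-photon latency stays low enough not to make the wearer sick.
# ASSUMPTION: ~20 ms motion-to-photon comfort threshold (common rule of thumb).

MOTION_TO_PHOTON_BUDGET_MS = 20.0

for refresh_hz in (60, 90, 120):
    frame_ms = 1000.0 / refresh_hz
    frames_of_slack = MOTION_TO_PHOTON_BUDGET_MS / frame_ms
    print(f"{refresh_hz:>3} Hz display: {frame_ms:5.2f} ms per frame, "
          f"~{frames_of_slack:.1f} frames of end-to-end slack")
```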
01:11:10.500 | - But does it still, to some degree,
01:11:14.340 | boil down to a killer app, a content question?
01:11:19.300 | Like you said, it's kind of a wide open space.
01:11:21.320 | Nobody knows exactly what's going to be
01:11:23.260 | the compelling thing.
01:11:25.180 | So doesn't a super compelling experience of some sort
01:11:29.900 | alleviate some of the need for engineering perfection?
01:11:34.860 | - Well, there's a base layer of engineering
01:11:38.580 | that you have to have no matter what,
01:11:41.620 | but you're certainly right that people,
01:11:44.140 | like in the early days of video games,
01:11:46.060 | put up with kind of low frame rate
01:11:48.860 | and what we would now call crappy graphics
01:11:52.180 | because they were having so much fun playing "Doom"
01:11:54.700 | or whatever.
01:11:56.060 | - Even "Tetris."
01:11:57.020 | - Yeah, yeah.
01:11:58.180 | So for sure that's true.
01:12:01.500 | And so, I was working on consumer-facing content.
01:12:06.500 | There was a great team in Wellington, New Zealand
01:12:14.180 | that made a game called "Dr. Grordbort's Invaders"
01:12:18.940 | that realized the potential of AR gaming
01:12:23.940 | in a way that I don't think anything else has
01:12:29.380 | before or since.
01:12:31.960 | And so that was definitely the strategy
01:12:36.420 | until April 2020,
01:12:41.700 | which is when the company decided to pivot
01:12:45.900 | to commercial industrial applications instead.
01:12:49.680 | So, and I haven't seen their financial projections,
01:12:56.180 | but I assume they had good reasons
01:13:00.860 | for making that strategic decision.
01:13:04.860 | It just means that it's no longer necessarily targeted
01:13:10.020 | at just end users who want to play a game or be entertained,
01:13:15.300 | but it's, you know.
01:13:16.780 | - That to me from a sort of a dreamer,
01:13:19.860 | futurist perspective is heartbreaking
01:13:22.100 | 'cause I don't know necessarily from in the VR space,
01:13:26.300 | but I see this kind of thing with robotics,
01:13:31.300 | where to me, the future of robotics is consumer-facing
01:13:36.460 | and a lot of great roboticists,
01:13:40.500 | Boston Dynamics and companies like that are focused on.
01:13:45.160 | - Sort of industrial applications.
01:13:47.180 | - Yeah.
01:13:48.140 | - Because for financial business reasons.
01:13:50.380 | - Yeah, no, I can see the parallels for sure.
01:13:53.660 | You know, we'll see.
01:13:54.660 | It was a fun project.
01:13:57.020 | You know, we worked on an app, for example,
01:14:02.020 | called Baby Goats,
01:14:04.060 | which just populated your room with baby goats.
01:14:07.700 | - That seemed like a killer app right there.
01:14:09.380 | - Well, we thought highly of the idea for sure.
01:14:12.980 | - Yes.
01:14:13.820 | - So, but because of the SLAM,
01:14:16.240 | the system knew, for example, here's a table,
01:14:22.420 | here's a little end table.
01:14:24.140 | We know the heights.
01:14:26.180 | We know how high our animated baby goat can jump.
01:14:29.680 | And so our engineers had to build a system
01:14:35.260 | for converting the SLAM primitives
01:14:37.660 | into game engine objects
01:14:41.540 | that the game, the AIs in the game could navigate around.
01:14:46.540 | So, and that ended up shipping as more of a dev kit
01:14:52.100 | or a sort of how-to, a sample app
01:14:55.820 | than as a finished consumer-facing app.
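A hypothetical sketch of the conversion he describes: turning SLAM plane primitives (detected tables, floors) into objects a game AI can navigate between. The names and the jump rule are invented for illustration; this is not Magic Leap's actual API:

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Surface:
    name: str
    height_m: float      # height of a SLAM-detected horizontal plane

MAX_GOAT_JUMP_M = 0.6    # how high the animated goat can jump (assumed)

def reachable(a: Surface, b: Surface) -> bool:
    """A goat can hop between two surfaces if the height gap is jumpable."""
    return abs(a.height_m - b.height_m) <= MAX_GOAT_JUMP_M

room = [Surface("floor", 0.0), Surface("end table", 0.5),
        Surface("dining table", 0.75), Surface("bookshelf top", 1.8)]

# Build the navigation edges the game AI would use.
for a, b in combinations(room, 2):
    if reachable(a, b):
        print(f"goat can jump between {a.name} and {b.name}")
```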
01:15:00.220 | - You mean the Baby Goat AI?
01:15:02.460 | Yeah. - Yeah.
01:15:04.060 | - That seems to me like a world
01:15:06.540 | I can entertain myself for hours,
01:15:09.420 | just every day coming home to see baby goats.
01:15:13.860 | - Yeah, I mean, it was an ambient kind of,
01:15:16.940 | it's not a thing that you would sit there
01:15:18.980 | and play like a video game.
01:15:21.140 | - Just life. - Yeah, yeah.
01:15:22.740 | - But now there's baby goats.
01:15:24.620 | I mean, what's the purpose of having dogs and cats
01:15:27.580 | in your life exactly?
01:15:28.900 | It's kind of ambient.
01:15:30.700 | They're not really helping you do anything,
01:15:32.700 | but it's enriching your life.
01:15:34.660 | - You can go and play fetch or something for a while
01:15:36.900 | if you want, but you don't have to.
01:15:38.820 | - Right. - Yeah.
01:15:40.260 | So, we worked on that and a bigger project
01:15:45.220 | that was more of a storytelling in a fictional universe.
01:15:50.220 | The hardware is worth a look.
01:15:52.260 | There's still a belief, I just saw it this morning
01:15:57.900 | looking at Twitter, that Magic Leap
01:16:02.220 | never shipped anything, but
01:16:02.220 | since 2019, you can go to their website
01:16:06.500 | and buy one of these devices anytime you
01:16:08.700 | wanna spend the money.
01:16:10.820 | - Yeah, and the new one's coming out, I think in 2022,
01:16:14.140 | so in a few months.
01:16:16.300 | What do you think, looking out 50 years from now,
01:16:21.540 | what wins, virtual reality, augmented reality,
01:16:26.540 | or physical reality?
01:16:28.860 | - What wins?
01:16:31.900 | - Meaning like what's, what do people,
01:16:36.420 | of, that have financial resources,
01:16:39.140 | enjoy spending most of their time in?
01:16:42.140 | - I've always been a fan of AR,
01:16:47.100 | and it's kind of an easy answer,
01:16:48.420 | because if you're wearing an AR device,
01:16:51.300 | you put a bag over your head, it becomes a VR device.
01:16:55.060 | You know, it just, if you block out the,
01:16:58.380 | what's really there, then all you're seeing is a VR.
01:17:03.620 | - But you are, with AR, constrained to kind of operate
01:17:08.620 | in something that's similar to physical reality.
01:17:12.500 | - Yeah. - With VR,
01:17:13.340 | you can go into fantastical worlds.
01:17:15.500 | - True, true.
01:17:16.700 | So there are still issues in those fantastical worlds
01:17:21.700 | with motion sickness, right?
01:17:26.820 | So if your body is experiencing accelerations
01:17:32.940 | in your inner ear that differs from what your eye
01:17:37.940 | thinks it's seeing, then you'll get sick,
01:17:41.220 | unless you're a very unusual person.
01:17:43.340 | So it doesn't mean you can't do it,
01:17:44.980 | it's just, it's a constraint that VR designers
01:17:49.860 | have to learn to work with.
01:17:52.180 | - So do you think it's possible that in the future,
01:17:55.620 | we're living mostly in a virtual reality world?
01:17:59.820 | Like, we become more and more detached
01:18:02.300 | from physical reality?
01:18:03.540 | - For entertainment, maybe, for certain applications,
01:18:08.780 | I'm personally more, I mean, we have to make a distinction
01:18:13.100 | between what I would personally find interesting
01:18:15.700 | and what might win in the market.
01:18:18.500 | So maybe some people, maybe lots of people,
01:18:22.280 | would like to spend a huge amount of time in VR.
01:18:26.740 | I'm personally more interested in enhancing
01:18:31.100 | the experience that I have of the physical world,
01:18:34.420 | 'cause the physical world's pretty cool, right?
01:18:36.940 | There's a lot to be said for moving around
01:18:41.380 | in the real world.
01:18:42.780 | - Can I ask you for you personally,
01:18:44.980 | to try to play devil's advocate, or to try to construct,
01:18:49.320 | to imagine a VR world where you, Neal Stephenson,
01:18:54.700 | would want to stay.
01:18:57.660 | Not because the physical world all of a sudden
01:19:02.420 | became really bad, for some reason,
01:19:04.380 | like you're trying to escape it,
01:19:05.900 | but like literally, it's just more enriching.
01:19:10.620 | In the same way, like there's a glimmer in your eye
01:19:13.900 | when you said you enjoy the physical world.
01:19:17.020 | Like, double up on that glimmer for the virtual reality.
01:19:21.260 | Can you imagine such a world?
01:19:23.260 | - Well, like, I'll give maybe an example that's a bridge,
01:19:26.740 | which is that I've been, I like making things.
01:19:29.780 | So I like working in a machine shop
01:19:33.180 | and making objects with 3D printers,
01:19:36.460 | or machines, or whatever.
01:19:37.700 | And so I've had to learn how to get good
01:19:40.140 | at using a CAD program.
01:19:43.580 | There's many to choose from.
01:19:46.940 | I use one called Fusion 360.
01:19:50.820 | And I can spend hours in that,
01:19:55.820 | trying to create, imagine and create
01:19:59.900 | the things I want to create.
01:20:01.420 | And it's not virtual reality, exactly,
01:20:05.360 | but that whole time,
01:20:06.820 | my whole field of view is occupied by this monitor
01:20:13.820 | that's showing me a window into a three-dimensional space.
01:20:18.080 | I'm rotating things around.
01:20:20.460 | I'm imagining things, I'm making things.
01:20:23.740 | And so that is pretty close to being in virtual reality.
01:20:28.740 | - Does that thing have to exist
01:20:32.780 | for you to experience true joy?
01:20:34.540 | Can you stay in Fusion 360 the whole time?
01:20:38.340 | Do you have to 3D print it and touch it?
01:20:41.620 | - Yeah, I mean, that's my game.
01:20:44.040 | That's what I'm up to.
01:20:45.700 | But it happens that if you're building
01:20:50.020 | a virtual environment, if you're making a game level
01:20:54.460 | or creating a virtual set for a film or TV production,
01:20:58.780 | the thing that you're designing in the program
01:21:00.860 | may never physically exist.
01:21:02.880 | And in fact, it's preferable that it doesn't
01:21:06.740 | because the whole point of that
01:21:09.180 | is to make imaginary things
01:21:13.340 | that you couldn't build otherwise.
01:21:17.900 | So I think lots of people spend a good chunk
01:21:20.220 | of their working hours in something
01:21:22.760 | that's pretty close to VR.
01:21:25.780 | It's just that currently the output device
01:21:27.540 | happens to be a rectangular object in front of them.
01:21:31.780 | You could replace that with a VR headset
01:21:35.660 | and they'd be doing the same stuff.
01:21:37.700 | - There's all kinds of interfaces.
01:21:40.180 | For example, I enjoy listening to podcasts or audio books.
01:21:42.940 | Let's say actually podcast
01:21:44.940 | 'cause there's a intimate human connection in a podcast.
01:21:48.900 | It's one way, but you get to learn
01:21:51.100 | about the person you're listening to.
01:21:53.180 | And that's a real connection.
01:21:54.420 | And that's just audio for a lot of people.
01:21:56.540 | That's just audio.
01:21:57.500 | - True.
01:21:58.340 | - And for me, that's just audio as a fan of people.
01:22:03.340 | And you kind of a little bit are friends with those people.
01:22:07.520 | - Yeah, they're in your life.
01:22:08.980 | You're listening to them, yeah.
01:22:10.700 | And I mean, they're as far away from real as it gets.
01:22:15.700 | There's not even a visual component.
01:22:20.900 | It's just audio, but they're as real.
01:22:23.660 | Like if I was on a desert island,
01:22:25.420 | my imagination, this thing works pretty good
01:22:30.820 | in terms of imagination.
01:22:33.100 | It creates a very beautiful world with just audio.
01:22:38.100 | So I mean-
01:22:40.500 | - Or even just reading books.
01:22:42.020 | - Exactly, reading books.
01:22:45.340 | Even more so with reading books
01:22:47.620 | 'cause there are certain mediums
01:22:49.660 | which stimulate the imagination more.
01:22:51.900 | When you present less, the imagination works more
01:22:57.580 | and that can create really enriching experiences.
01:23:00.060 | So I mean, to me, the question is,
01:23:03.860 | can you do some of the amazing things
01:23:07.580 | that make life amazing in virtual worlds?
01:23:11.180 | It seems to me the answer there is obviously yes.
01:23:14.460 | Even if I, like you, am attached to a lot of stuff
01:23:17.060 | in the physical world, I think I can very readily imagine
01:23:22.060 | coming up with some of the same magical experiences
01:23:26.580 | in the virtual world where you make friends
01:23:30.180 | and you can fall in love where the source of love
01:23:34.300 | in your life is to a much greater degree
01:23:39.300 | inside of a virtual world.
01:23:41.580 | And then love means fulfillment, that means happiness.
01:23:46.140 | That's the thing you look forward to.
01:23:47.540 | And not some kind of dopamine rush type of love,
01:23:50.300 | but like long lasting friendship.
01:23:53.500 | - Yeah, yeah, real deal.
01:23:54.700 | Yeah, yeah.
01:23:55.940 | It just depends on what is there
01:23:59.020 | in the way of applications, the content,
01:24:01.500 | and can it feed you those things?
01:24:04.180 | Can it give you, like in my example of using the CAD program,
01:24:08.260 | it gives me the ability to do something I enjoy,
01:24:12.600 | which is imagining things and making things
01:24:16.980 | in a particular way.
01:24:18.180 | - Can we psychoanalyze you for a second?
01:24:19.900 | - Sure.
01:24:21.300 | - What exactly do you enjoy?
01:24:23.520 | Is there some component of you building the thing
01:24:27.580 | where you get to at least a little bit share with others?
01:24:32.660 | Is there a human in the loop outside of you in that picture?
01:24:35.760 | - Will anyone ever see it?
01:24:38.900 | - Right.
01:24:41.180 | There's a source of your enjoyment,
01:24:42.780 | because I would argue that perhaps,
01:24:46.180 | turtles all the way down,
01:24:49.180 | when you get to the bottom turtle,
01:24:50.620 | it has to do with sharing with other humans.
01:24:53.900 | And if you can then put those humans inside the VR world,
01:25:01.340 | then you can, okay, for example,
01:25:03.060 | you could do it in the physical world, the 3D printing,
01:25:07.180 | but you share it in the virtual world,
01:25:10.060 | and that's where the source of happiness is.
01:25:12.260 | - I think, at least speaking for myself,
01:25:14.820 | I'm always thinking in terms of an audience,
01:25:17.740 | and at some level, I feel like I'm doing this for someone
01:25:21.940 | or communicating to someone,
01:25:24.540 | even if there's not a specific someone in mind.
01:25:27.900 | It could just be an abstract, theoretical someone.
01:25:31.560 | And it's like another app I spend a lot of time in
01:25:35.860 | is Mathematica, okay?
01:25:37.980 | - Yeah, incredible app.
01:25:39.580 | - Yeah, yeah.
01:25:40.700 | And when I do a Mathematica notebook,
01:25:42.700 | if I'm trying to figure something out,
01:25:43.940 | I spend a lot of time typing.
01:25:45.760 | My stuff is just huge blocks of text,
01:25:50.620 | just me thinking out loud,
01:25:52.020 | and then some graphs and calculations and stuff.
01:25:56.900 | Because to me, that act of explaining things and commenting
01:26:01.900 | helps me understand what I'm doing.
01:26:07.340 | - And there's kind of an audience,
01:26:09.900 | amorphous audience in mind.
01:26:12.180 | - Yeah, like, I mean, most of this stuff
01:26:14.300 | nobody will ever see, and yet,
01:26:16.700 | I'm creating it as if there were an audience
01:26:20.100 | that might read this stuff,
01:26:21.920 | because that's a necessary constraint
01:26:25.540 | that helps me do a better job.
01:26:29.660 | - What's the, this might be a tricky question to answer.
01:26:33.100 | What comes to mind as a particularly beautiful thing
01:26:37.120 | that you're proud of that you created inside Mathematica,
01:26:40.020 | visualization-wise, or something that just comes to memory,
01:26:44.260 | if it's possible to retrieve?
01:26:46.220 | - So, the thing I've spent the most amount of time on
01:26:52.820 | is I got obsessed a long time ago
01:26:56.940 | with trying to tile the globe with hexagons.
01:27:01.740 | - Yes.
01:27:02.700 | - And--
01:27:04.180 | - An actual globe?
01:27:05.780 | - Well, any spherical--
01:27:07.100 | - Any spherical, okay.
01:27:08.060 | - Object, yeah, but with an eye
01:27:09.780 | towards putting it on the Earth.
01:27:12.400 | And so, and have it be recursive.
01:27:15.320 | So, you can have hexagons within hexagons,
01:27:18.860 | which is hard, because, and probably a bad idea,
01:27:22.140 | because you can't tile a hexagon with smaller hexagons.
01:27:27.140 | They don't, they stick out.
01:27:30.620 | - Got it.
01:27:31.460 | So, they're, oh, they stick out.
01:27:33.700 | So, there's a, can you do some kind
01:27:35.500 | of fractal hexagon situation?
01:27:37.860 | - Yeah, yeah, so it's that,
01:27:40.540 | and people who know me are always,
01:27:43.920 | now make fun of me for this, so they'll send me,
01:27:48.180 | if they see a picture with hexagons in it,
01:27:51.740 | they'll send me a link to make fun of me.
01:27:55.580 | So, as some--
01:27:57.420 | - One of those people, Roger Penrose, or?
01:28:00.300 | - I think Roger's a little above my level.
01:28:04.060 | - Well, he's into hexagons as well, and tiling.
01:28:08.300 | - Yeah, yeah, so I did a lot of that,
01:28:11.740 | and I thought it was pretty cool,
01:28:14.300 | but there's some surprisingly intractable problems
01:28:18.720 | that keep coming up.
01:28:19.900 | Like, you've always gotta have some pentagons.
01:28:24.900 | Like, if you start with an icosahedron,
01:28:27.900 | which is equilateral triangles,
01:28:31.260 | which is a logical place to start,
01:28:33.080 | you can cover those with hexagons,
01:28:36.180 | but every vertex where the triangles come together
01:28:41.180 | is a pentagon, has to be a pentagon.
01:28:46.540 | - Oh, interesting, so it's all hexagons,
01:28:47.980 | and then there's a pentagon at the intersections.
01:28:49.900 | - Yeah, yeah.
01:28:50.740 | - Cool, how'd you figure that out?
01:28:52.180 | Is that a known fact?
01:28:53.420 | - Well, it's just if you look at it,
01:28:55.540 | just by inspection. - It's an obvious thing.
01:28:56.820 | Got it.
01:28:57.660 | - Yeah, so, and you can't make that go away.
01:29:01.180 | So, any system that you come up with to do this
01:29:04.660 | has gotta have these exceptions built into it for those 12.
01:29:09.660 | You could have quintillions of hexagons,
01:29:14.100 | but you've still gotta have 12 pentagons.
01:29:17.940 | Somewhere.
01:29:18.780 | So, I've blown a hell of a lot of time on that
01:29:25.340 | over the years.
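The "always 12 pentagons" fact is a standard consequence of Euler's polyhedron formula; a short derivation for a closed surface tiled by P pentagons and H hexagons with three faces meeting at each vertex:

```latex
% Count faces, edges, and vertices, then apply Euler's formula.
\[
F = P + H, \qquad
E = \frac{5P + 6H}{2}, \qquad
V = \frac{5P + 6H}{3}.
\]
\[
V - E + F = 2
\;\Longrightarrow\;
\frac{5P + 6H}{3} - \frac{5P + 6H}{2} + (P + H) = 2
\;\Longrightarrow\;
P = 12.
\]
```

The hexagon count H drops out of the equation entirely, which is why no amount of subdividing makes those 12 exceptional cells go away.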
01:29:26.940 | - By the way, a lot of those kind of problems
01:29:30.580 | are very difficult to prove something about.
01:29:32.740 | - Yeah, yeah, yeah.
01:29:35.900 | And I think Uber did it, 'cause someone,
01:29:39.260 | one of my friends who knows of my interest in this
01:29:45.100 | and who likes to give me a hard time,
01:29:48.780 | sent me a link, this was a couple years ago,
01:29:51.380 | to some code base that I think came out of Uber
01:29:56.380 | where they had done this.
01:29:58.180 | You break down the whole surface of the Earth
01:30:01.500 | into little hexagons.
01:30:04.380 | So, that was a real knife through the heart.
01:30:07.700 | But I'll probably come back to it someday.
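The Uber code base mentioned is presumably H3, Uber's open-source hierarchical hexagonal geospatial index, which carries exactly those 12 pentagons. A tiny usage sketch, assuming the h3-py bindings and their v3-style API:

```python
import h3  # assumes the h3-py bindings (pip install h3), v3-style API

# Index a latitude/longitude (downtown Austin) into the hexagon grid.
cell = h3.geo_to_h3(30.2672, -97.7431, 7)   # resolution-7 cell id
print(cell)

# Its ring of neighboring hexagons; the grid is hierarchical, so each
# cell also has coarser parents and finer children.
print(h3.k_ring(cell, 1))
print(h3.h3_to_parent(cell, 5))
```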
01:30:14.140 | - Is there something special about hexagons,
01:30:16.300 | or are you interested in all kinds of tiling?
01:30:18.740 | - Well, I'm interested in all kinds of tiling,
01:30:22.420 | but I know my limitations as a math guy.
01:30:27.420 | So, hexagons are about my speed.
01:30:31.760 | - Just a sufficient amount of complexity.
01:30:36.500 | - Yeah, yeah.
01:30:37.860 | So, but no, tiling is a really interesting problem,
01:30:41.800 | both two and three-dimensional.
01:30:44.140 | Tiling problems are fascinating,
01:30:46.780 | and they're one of those ancient puzzles
01:30:48.700 | that has attracted brainiacs for centuries.
01:30:53.700 | - Let me ask you a little bit about AI.
01:31:00.940 | What are some likely interesting trajectories
01:31:07.580 | for the proliferation of AI in society
01:31:10.540 | over the next couple of decades?
01:31:12.400 | Do you think about this kind of stuff?
01:31:14.140 | - I do not think about it a lot,
01:31:16.980 | 'cause it's a deep topic,
01:31:18.560 | and I don't consider myself super well-informed about it.
01:31:22.580 | And AI seems to be a term
01:31:25.100 | that is applied to a lot of different things.
01:31:28.740 | So, I've messed around just a tiny little bit
01:31:31.320 | with neural nets, with, what's it called,
01:31:35.420 | PCA, principal component analysis.
01:31:38.340 | So, I guess I tend to think in terms of
01:31:40.820 | sort of granular bottom-up ideas,
01:31:43.580 | rather than big picture, top-down.
01:31:49.620 | - Oh, got it, so like very specific algorithms,
01:31:51.660 | like how are they going to,
01:31:54.220 | what problem are they going to solve in society,
01:31:56.860 | such that it has a lot of big ripple effects?
01:31:59.860 | I mean, we could talk about
01:32:03.540 | particular successful AI systems
01:32:06.740 | and success defined in different ways of recent years.
01:32:09.820 | So, one is language models with GPT-3.
01:32:13.440 | Most importantly, they're self-supervised,
01:32:17.100 | meaning they don't require much supervision from humans,
01:32:20.980 | which means they can learn by just reading
01:32:23.700 | a huge amount of content created by humans.
01:32:26.540 | So, read the internet, and from that,
01:32:28.200 | be able to generate text
01:32:29.540 | and do all kinds of things like that.
01:32:31.740 | It's possible they have a big enough neural network
01:32:34.020 | that's going to be able to have conversations with humans
01:32:38.420 | based on just reading human language.
01:32:41.660 | That's an interesting idea.
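A toy sketch of what "self-supervised" means here: the training pairs are manufactured from raw text itself, with each prefix predicting the next token, so no human labeling is required. The whitespace tokenizer is a simplification; real models use learned subword tokens:

```python
# Manufacture (context -> next token) training pairs from raw text alone.
text = "the cat sat on the mat"
tokens = text.split()   # toy tokenizer; real models use subword tokens

pairs = [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]
for context, target in pairs:
    print(f"context={context!r} -> predict {target!r}")
```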
01:32:43.100 | To me, the very interesting idea
01:32:45.460 | that people don't think about it as AI,
01:32:48.740 | because they're kind of dumb currently,
01:32:50.860 | is actual embodied robots.
01:32:53.540 | So, robotics, like Boston Dynamics.
01:32:56.020 | I have legged robots downstairs and upstairs.
01:33:02.860 | Currently, Boston Dynamics robots and most legged robots,
01:33:07.580 | most robots, period, are pretty dumb.
01:33:09.740 | Most of the challenges have to do with the actual,
01:33:14.420 | first of all, the engineering of making the thing work,
01:33:17.260 | getting a sensor suite that allows you to do
01:33:19.620 | the same things as with Magic Leap,
01:33:21.140 | that base layer of like--
01:33:22.820 | - Where is it stuff?
01:33:23.980 | - Where am I?
01:33:25.460 | And what am I looking at?
01:33:29.460 | I don't need to deeply understand my surroundings
01:33:33.500 | at a level beyond
01:33:34.940 | what will hurt if I run into it.
01:33:40.500 | - Yeah, yeah.
01:33:41.340 | - Yeah.
01:33:42.180 | - But even that is hard.
01:33:43.500 | - That's hard, but the thing that I think people don't,
01:33:47.480 | in the robotics space explore enough,
01:33:50.580 | is the human-robot interaction part of the picture,
01:33:56.860 | which is how it makes humans feel,
01:33:59.700 | how robots make humans feel.
01:34:01.660 | And I think that's going to have a very significant impact
01:34:04.860 | in the near future in society,
01:34:08.580 | which is the more you integrate AI systems of whatever form
01:34:13.100 | into society where humans are in contact
01:34:17.580 | with them regularly.
01:34:19.500 | So that could be embodied robotics,
01:34:21.620 | or that could be social media algorithms.
01:34:23.740 | I think that has a very significant impact.
01:34:25.660 | And people often think like,
01:34:28.220 | AI needs to be super smart to have an impact.
01:34:31.260 | I think it needs to be super integrated with society
01:34:35.380 | to have an impact.
01:34:36.260 | And more and more of that's happening,
01:34:38.900 | even if they're dumb.
01:34:40.460 | - Yeah, yeah.
01:34:41.500 | No, I mean, a lot of my exposure to robots
01:34:46.060 | is that I'm associated with a combat robotics team.
01:34:53.460 | And I've been to a few BattleBots competitions.
01:34:55.940 | And that's not, like, in a lot of ways,
01:34:58.440 | that's pretty far from the kind of robotics
01:35:01.260 | you're talking about,
01:35:02.620 | because these robots are remote-controlled.
01:35:05.900 | They're not autonomous.
01:35:07.540 | And so they're pretty simple.
01:35:11.300 | But it's interesting to watch people's emotional reactions
01:35:16.300 | to different robots.
01:35:18.920 | So there was one that was in the last year's season,
01:35:21.940 | the 2020 season, called Rusty,
01:35:25.300 | that was just put together out of spare parts.
01:35:29.900 | And it looked kind of cute.
01:35:32.300 | And it became this huge crowd favorite,
01:35:34.840 | because you could see it was made of salad bowls
01:35:37.580 | and random pieces of hardware
01:35:39.980 | that this guy had scavenged from his farm.
01:35:43.220 | And so immediately, people kind of fell in love
01:35:47.140 | with this one particular robot,
01:35:49.420 | whereas other robots might be like the bad guy
01:35:53.020 | in, if you think of professional wrestling,
01:36:00.820 | the heel and the babyface.
01:36:00.820 | So people do, for reasons that are hard to understand,
01:36:04.100 | form these emotional reactions.
01:36:06.860 | - And we form narratives in the same way we do
01:36:08.700 | when we meet human beings.
01:36:09.900 | We tell stories about these objects,
01:36:11.940 | and they can be intelligent, and they can be biological,
01:36:14.420 | or they can be almost close to inanimate objects.
01:36:19.100 | - Yeah.
01:36:19.940 | - And that, to me, is kind of fascinating.
01:36:21.300 | And if robots choose to lean into that,
01:36:26.300 | it creates an interesting world.
01:36:29.700 | - If they start using feedback loops
01:36:32.200 | to make themselves cuter.
01:36:33.700 | - Not just cuter, but everything that humans do.
01:36:36.700 | Let's not speak harshly of robots.
01:36:40.620 | Humans do the same thing.
01:36:41.700 | - Oh, no, I wasn't meaning it in a...
01:36:44.340 | But you're right, humans, based on feedback,
01:36:47.800 | will change their appearance, their dress.
01:36:49.860 | - Yes, I do this on Instagram all the time.
01:36:51.380 | How do I look cuter?
01:36:52.220 | That's the fundamental question I ask myself.
01:36:54.540 | - Yeah, so why wouldn't a robot wanna...
01:36:57.580 | It's like, oh, wow, people really don't like
01:37:00.500 | the quad mount machine gun on top of my turret.
01:37:05.140 | Maybe I should get rid of that,
01:37:06.700 | and people would feel more at ease.
01:37:09.520 | - Or lean into it, be proud of it.
01:37:12.580 | Like, you won't take my gun, whatever the saying is.
01:37:17.100 | - Yeah.
01:37:17.940 | - From my dead, cold hands.
01:37:19.840 | I mean, their personality, adding personality
01:37:24.140 | such that you can start to heal,
01:37:25.940 | you can start to weave narratives.
01:37:27.860 | I think that's a fascinating place where
01:37:31.340 | there's this feedback loop, like you said,
01:37:37.180 | where AI, especially when it's embodied,
01:37:42.180 | puts a mirror to ourselves.
01:37:45.760 | Just like other humans are close friends,
01:37:49.360 | they kind of teach us about ourselves.
01:37:52.480 | We teach each other, and through that process, grow close.
01:37:57.080 | And to me, it's so fascinating to expand the space
01:38:02.080 | of deep, meaningful interactions beyond just humans.
01:38:07.100 | That's the opportunity I see with robots and with AI systems.
01:38:15.360 | And that's why I don't like,
01:38:17.960 | my biggest problem with social media algorithms
01:38:20.880 | is the lack of transparency.
01:38:22.680 | It's not the existence of the algorithms.
01:38:24.840 | It's, well, there's many things.
01:38:27.780 | One is the data.
01:38:28.620 | Data should be controlled by people themselves.
01:38:32.280 | So, but also the lack of transparency
01:38:34.640 | in how the algorithms work.
01:38:36.200 | - And change your perception of what's real.
01:38:38.760 | Yeah, in hidden ways, yeah.
01:38:41.560 | - In hidden ways.
01:38:43.000 | Like, you should be aware, just like when you take,
01:38:45.160 | I don't know, if you take psychedelics,
01:38:47.600 | you should be aware that you took the psychedelics.
01:38:50.080 | It shouldn't be a surprise.
01:38:51.880 | And second, you should, I mean, become a student
01:38:56.280 | and a scholar, and there should be research done.
01:38:59.520 | There should be open conversation
01:39:00.960 | about how your perception has changed.
01:39:03.240 | And then you are, become your own guide
01:39:07.660 | in this world of altered perception,
01:39:10.340 | because arguably, none of it is real.
01:39:13.280 | You get to choose the flavor of real.
01:39:16.100 | I mean, this is something you explore quite a bit.
01:39:22.600 | Do you yourself think that there is a bottom to it
01:39:27.600 | where there is reality?
01:39:31.040 | There's a base layer of reality that physics can explore
01:39:35.800 | and our human perceptions sort of layer stuff.
01:39:38.880 | Is there, let's go to Plato.
01:39:41.600 | Is there such a thing as truth?
01:39:43.760 | - I lean towards the Platonic view of things.
01:39:48.360 | So I believe that mathematical objects
01:39:51.000 | have a reality that is not all made up by human minds.
01:39:56.000 | And I don't know where that reality comes from.
01:40:03.200 | I can't explain it, but I do think
01:40:05.400 | that mathematical objects are discovered and not invented.
01:40:10.100 | (clears throat)
01:40:12.760 | I did a lot of, not a lot,
01:40:18.060 | but I did some reading of Husserl
01:40:21.220 | when I was writing "Anathem."
01:40:23.420 | And he's a 20th century phenomenologist,
01:40:30.460 | and he's writing in the, he's writing at the same time
01:40:34.500 | as scientists are starting to understand atoms
01:40:38.780 | and becoming aware that when we look at this table,
01:40:43.780 | it's really just a slab of almost entirely vacuum.
01:40:49.860 | And there's a very sparse arrangement
01:40:54.300 | of tiny, tiny little particles there,
01:40:58.060 | occupying that space that interact with each other
01:41:00.620 | in such a way that our brains perceive this object.
01:41:06.940 | So that's kind of the beginnings of phenomenology.
01:41:11.860 | And his stuff is pretty hard to read.
01:41:18.580 | You really have to take it in small bites
01:41:23.580 | and go a little bit at a time.
01:41:27.700 | But he's trying to come to grips with these,
01:41:30.300 | with these kinds of questions.
01:41:32.620 | - How did you come to grips with it?
01:41:35.540 | Like why does this table feel solid?
01:41:38.380 | - Well, I mean, we're an evolved system;
01:41:41.380 | we have biological advantages
01:41:43.500 | in knowing where solid objects are.
01:41:46.680 | So we've got this system in our head
01:41:48.620 | that integrates our perceptions
01:41:52.900 | into this coherent view of things.
01:41:55.780 | One of the take-homes that I like from Husserl
01:42:04.180 | is the idea of intersubjectivity
01:42:06.100 | and the idea that a fundamental requirement
01:42:10.460 | for us to stay sane is for us to share our perceptions
01:42:15.460 | and have them ratified by other,
01:42:18.540 | they don't even have to be people,
01:42:20.440 | but that, you know, a prisoner in solitary confinement
01:42:25.300 | might domesticate a mouse or even insects
01:42:29.300 | because they perceive the same things
01:42:33.340 | that the prisoner perceives.
01:42:35.120 | And so convince him that he's not just hallucinating.
01:42:40.980 | - Yeah, establish a consensus.
01:42:44.540 | - Yeah, yeah.
01:42:46.220 | - But see, that doesn't mean any of it is real.
01:42:49.260 | You just establish a consensus.
01:42:51.100 | It could be very, very distant from something that,
01:43:00.340 | something that's real in the engineering sense of real,
01:43:04.980 | like that you could build it using physics.
01:43:08.380 | - But I think that a valuable application
01:43:11.420 | for an AI robot would be just to do nothing except that.
01:43:16.420 | It just, so--
01:43:19.340 | - Consensus.
01:43:20.180 | - It just sits there.
01:43:21.380 | - Yeah.
01:43:22.220 | - And if you hear a door slam,
01:43:24.580 | you might turn to see what it is.
01:43:26.980 | If the robot at the same time turns to look at the door slam,
01:43:31.860 | it's ratifying your perception.
01:43:34.660 | - But isn't that the basis of love
01:43:36.580 | is when the door slams, you both look,
01:43:40.020 | but for deeper things, you both hear the same music
01:43:46.140 | and others don't.
01:43:49.020 | I mean, isn't that what, I mean--
01:43:51.500 | - Yeah.
01:43:52.340 | - That's, by love, I mean depth of human connection.
01:43:55.300 | - Yeah.
01:43:56.140 | - That's, or not--
01:43:58.020 | - You arrive at similar reactions
01:44:01.580 | without having to explicitly communicate it.
01:44:05.180 | - Yeah.
01:44:06.020 | - Yeah.
01:44:06.860 | - But we could start with a robot
01:44:08.940 | that listens explicitly for the slam doors.
01:44:12.500 | - Yeah, but no, I've--
01:44:14.380 | - Or scary sounds.
01:44:15.380 | - I can think of, so an example of this is,
01:44:18.580 | when I went to college,
01:44:23.500 | we'd be sitting at the cafeteria,
01:44:26.860 | a bunch of people eating our dinner together
01:44:29.900 | that we had just met, let's say,
01:44:32.620 | so a bunch of new people in your life,
01:44:37.300 | and someone might make a funny remark
01:44:41.060 | or a not so funny remark or something would happen,
01:44:45.620 | and you might then, at that moment,
01:44:47.980 | make eye contact with someone you didn't know
01:44:51.980 | at the other end of the table,
01:44:53.780 | and in that moment, you would realize
01:44:57.180 | this person is reacting, this person heard what I heard.
01:45:01.620 | They're reacting the way I reacted.
01:45:04.340 | - Yeah.
01:45:05.180 | - Nobody else appears to get the joke
01:45:07.260 | or to understand what just happened,
01:45:09.660 | but random stranger down there and I,
01:45:11.820 | we have this connection.
01:45:13.700 | - Yeah.
01:45:14.540 | - And then you build on that,
01:45:15.540 | so then the next time something happens,
01:45:18.640 | you automatically look at your new friend
01:45:21.640 | and they look back at you,
01:45:23.500 | and before you know it, you're hanging out together.
01:45:27.500 | - Yeah.
01:45:28.340 | - 'Cause you've already established
01:45:30.120 | without even talking to each other
01:45:32.420 | that you're on the same wavelength.
01:45:35.100 | - Yeah.
01:45:36.100 | It's seemingly so simple, but so powerful,
01:45:39.580 | that it's establishing that you're on the same wavelength.
01:45:41.780 | - Yeah.
01:45:42.620 | - At some level.
01:45:43.440 | - Yeah.
01:45:44.580 | - There's no reason why you and a toaster can't have that.
01:45:47.220 | (laughter)
01:45:48.100 | I'm just saying.
01:45:50.860 | - Does this smell burnt to you?
01:45:52.200 | (laughter)
01:45:54.120 | - Exactly.
01:45:54.960 | - I think it's burnt.
01:45:55.800 | - If a toaster could just say that to you.
01:45:57.960 | - Yeah, yeah.
01:45:59.700 | - Cryptonomicon, published in 1999,
01:46:03.440 | set in the late '90s,
01:46:05.140 | and involves hackers who build essentially cryptocurrency.
01:46:08.420 | Bitcoin white paper came out in 2008.
01:46:12.640 | So I have to kind of ask,
01:46:18.600 | from you looking at this layout
01:46:22.080 | of what's been happening in cryptocurrency,
01:46:25.160 | the evolution of this technology,
01:46:27.680 | how has it rolled out differently
01:46:31.640 | than you could have imagined in two ways?
01:46:34.040 | One, the technology itself,
01:46:36.040 | and two, the human side of things,
01:46:39.320 | the human stories of the hackers and the financial folks
01:46:44.320 | and the powerful and the powerless,
01:46:47.320 | the human side of things.
01:46:48.600 | - Yeah, well, "Cryptonomicon" is pre-Bitcoin,
01:46:52.840 | it's pre-Satoshi, it's pre-blockchain, as you point out.
01:46:56.560 | So at that point, I was kind of reacting
01:47:01.480 | to what I was seeing among people
01:47:04.780 | like the Bay Area cypherpunks in Berkeley.
01:47:07.680 | There was a branch here in Austin as well.
01:47:12.640 | And a lot of their thinking was,
01:47:16.960 | so based on the idea that you would have to have
01:47:20.480 | a physical region of the earth
01:47:23.880 | that was free of government interference.
01:47:28.440 | You couldn't achieve that freedom
01:47:31.240 | by purely mathematical means on the network.
01:47:34.520 | You actually had to have a room somewhere
01:47:37.840 | with servers in it
01:47:41.040 | that a government couldn't come and meddle with.
01:47:44.800 | And so a lot of ideation happened around that view of things
01:47:48.880 | that there were efforts to figure out jurisdictions
01:47:52.240 | where this might work.
01:47:53.200 | There was a lot of interest for a while in Anguilla,
01:47:56.840 | which is a Caribbean island
01:47:58.400 | that had some unusual jurisdictional properties.
01:48:03.000 | There was Sealand, which is a platform in the North Sea.
01:48:09.340 | And so there was a lot of effort
01:48:10.880 | that went into finding these physical locations
01:48:13.440 | that were deemed kind of safe.
01:48:17.200 | And that all goes away with blockchain.
01:48:22.200 | It's no longer necessary.
01:48:24.100 | And so that really changes the picture in a lot of ways
01:48:27.940 | because you no longer have,
01:48:31.920 | I mean, from a novelist point of view,
01:48:34.080 | the old system was a lot more fun to work with
01:48:37.620 | because it gives you a situation
01:48:39.400 | where hackers are wandering around
01:48:41.600 | in strange parts of the world,
01:48:44.000 | trying to set up server rooms.
01:48:46.200 | So that's a great storytelling thing.
01:48:50.160 | - There's still a little bit of that, right,
01:48:51.640 | in the modern world,
01:48:52.960 | but it's just there's several server rooms
01:48:55.540 | as opposed to one centralized one.
01:48:57.480 | - Yeah, yeah.
01:48:58.380 | And there's the, like the new wrinkle
01:49:00.740 | is the need to do a lot of computation
01:49:03.640 | and to keep your GPUs from melting down.
01:49:08.640 | So people building things in Iceland
01:49:11.300 | or in shipping containers on the bottom of the ocean
01:49:14.620 | or whatever.
01:49:17.980 | - But there's still governments evolved
01:49:19.500 | and there's still from a novelist perspective,
01:49:21.560 | interesting dynamics.
01:49:23.580 | What is big governments like China
01:49:26.020 | and more sort of renegade governments
01:49:28.700 | from all over the world,
01:49:30.100 | trying to contend with this idea
01:49:32.260 | of what to do in terms of control and power
01:49:36.160 | over these kinds of centers that do the mining
01:49:39.480 | of the cryptocurrency?
01:49:42.200 | - Yeah, so we're in a stage now
01:49:43.840 | that kind of goes beyond the initial,
01:49:46.400 | like the stuff I was describing
01:49:49.080 | in "Cryptonomicon" had a little bit of an air
01:49:53.840 | about it of the underpants gnomes
01:49:56.520 | in that we're going to build this system
01:49:59.880 | and then we'll make money somehow.
01:50:03.000 | But the intermediate step was left out.
01:50:06.260 | And that is,
01:50:09.520 | I think we're now sort of into that phase
01:50:15.500 | of the thing where Bitcoin,
01:50:18.620 | blockchain exists, people know how it works.
01:50:21.180 | Bitcoin and other cryptocurrencies exist,
01:50:24.820 | people are using them.
01:50:26.400 | And it's sort of like, okay, what now?
01:50:28.580 | Where does this all lead?
01:50:33.120 | - Do you have a sense of where it all leads?
01:50:34.940 | Like, is it possible that the set of technology
01:50:37.900 | kind of continues to have
01:50:41.220 | transformational effects on not just sort of finance,
01:50:48.180 | but who gets to have power in this world?
01:50:52.740 | So the decentralization of power.
01:50:55.380 | - Yeah, big questions, right?
01:50:56.820 | So I guess there's a little bit of the cynic in me
01:51:01.460 | thinking that as soon as it becomes important enough,
01:51:05.020 | the existing banks and people in power
01:51:07.700 | are gonna sort of control it.
01:51:10.120 | I guess an easy answer is that
01:51:12.060 | maybe it won't be a big change in the end.
01:51:15.000 | There's a utopian strain sometimes
01:51:19.220 | in the way people think about this
01:51:21.980 | that I'm not so sure about.
01:51:26.260 | - There is a technological aspect to Bitcoin
01:51:30.140 | and other cryptocurrencies that makes it a little easier
01:51:35.060 | to pull along the utopian thread.
01:51:38.620 | - Yeah.
01:51:39.460 | - Because it's harder for governments to control Bitcoin.
01:51:43.700 | - Yeah.
01:51:44.540 | - I mean, they have far fewer options.
01:51:46.640 | They can ban, they can make it illegal.
01:51:50.820 | It's more difficult.
01:51:51.980 | - Yeah.
01:51:52.800 | - And so technology here is on the side of the powerless,
01:51:54.940 | the voiceless, which is a very interesting idea.
01:51:58.660 | Of course, yes, it does have a utopian feel to it,
01:52:02.100 | but we have been making progress throughout human history.
01:52:05.580 | - Yeah.
01:52:06.420 | - Maybe this is what progress looks like.
01:52:07.380 | There will be the powerful and the greedy
01:52:10.740 | and the bureaucrats that take advantage of it,
01:52:13.200 | skim off the top kind of thing.
01:52:15.740 | But maybe this does give more power
01:52:18.500 | to people that haven't had power before in a good way,
01:52:21.500 | like distributing power
01:52:24.300 | and enabling sort of more,
01:52:26.740 | greater resistance to sort of dictatorships
01:52:30.980 | and authoritarian regimes, that kind of thing.
01:52:33.640 | And also enabling all kinds of technologies
01:52:38.940 | built on top of it.
01:52:40.160 | Ultimately, when you digitize money,
01:52:43.240 | money is a kind of speech,
01:52:46.460 | or it's a kind of like mechanism of how humans interact.
01:52:53.420 | And if you make that digital,
01:52:54.620 | more and more of the world moves to the digital space.
01:52:57.940 | And then you can have the,
01:53:01.340 | then you can finally fully live in that virtual reality
01:53:03.940 | with the toaster.
01:53:04.980 | And then.
01:53:05.900 | - Yeah, yeah.
01:53:07.060 | In a lot of ways, I think in that realm of technology,
01:53:10.080 | that the money per se is one of the less interesting things
01:53:13.300 | you can do with it.
01:53:14.140 | So I think, you know,
01:53:15.220 | cryptographically enforceable contracts
01:53:17.460 | and organizations built on those,
01:53:20.740 | that seems to me like it's got more potential for change
01:53:25.380 | just because we do already have money.
01:53:29.180 | And although it's an old system,
01:53:32.160 | it's been digitized to a large extent by,
01:52:36.540 | you know, the Stripes
01:53:38.060 | and the credit card companies of the world.
01:53:40.560 | - And I also love the idea of connecting
01:53:45.660 | data to smart contracts.
01:53:48.980 | Sort of making it more formal, it's like Mathematica,
01:53:53.660 | more structured, the integration of data,
01:53:56.940 | of weather data, of all kinds of data
01:54:01.420 | about the stuff in the world,
01:54:05.280 | so they can make contracts between people
01:54:08.020 | that are grounded in data.
01:54:10.060 | And that's actually getting closer to
01:54:12.060 | something like truth,
01:54:15.860 | because then you can make agreements
01:54:17.340 | based on actual data versus kind of perceptions of data.
01:54:21.540 | And if you can formalize,
01:54:24.420 | like distribute the power of who gets to tell the story,
01:54:28.300 | that's an interesting kind of resistance
01:54:32.300 | against the powerful in the space of narrative.
01:54:35.060 | - Yeah, David Brin has been saying for a while
01:54:37.220 | that the only way to settle arguments
01:54:40.980 | with, you know, across the political divide
01:54:44.300 | is to make bets.
01:54:46.780 | So people can say, you know, the election was stolen,
01:54:51.260 | or, you know, whatever controversial position
01:54:54.700 | they're taking, and they'll keep saying it
01:54:58.860 | until you wager real money on it.
01:55:03.760 | So maybe there's something there,
01:55:08.860 | if you could kind of turn that into a,
01:55:13.140 | put a user interface on that, say, you know.
01:55:16.780 | - Yeah, have a stake in your divisiveness,
01:55:20.940 | in your arguments.
01:55:22.460 | - Right, right, yeah.
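To make that betting idea concrete, here is a toy sketch in Python, with entirely hypothetical names and rules, of a wager that both sides fund up front and that settles automatically against an agreed data source; a real version would run as an on-chain smart contract with a trusted oracle reporting the outcome.

```python
from dataclasses import dataclass

# Toy illustration only: everything here (the Wager type, the parties,
# the settlement rule) is hypothetical, not anything from the conversation.
@dataclass
class Wager:
    claim: str     # the disputed statement, e.g. "turnout exceeded 60%"
    stake: float   # amount each side puts into escrow
    backer: str    # party asserting the claim
    skeptic: str   # party denying it

    def settle(self, oracle_says_true: bool) -> str:
        """Pay the whole pot to whichever side the data source supports."""
        winner = self.backer if oracle_says_true else self.skeptic
        print(f"{winner} receives {2 * self.stake:.2f}")
        return winner

# Both sides commit real money up front; the agreed data feed decides
# who keeps it, so repeating the claim is no longer free.
bet = Wager(claim="turnout exceeded 60%", stake=100.0,
            backer="alice", skeptic="bob")
bet.settle(oracle_says_true=False)  # -> bob receives 200.00
```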
01:55:24.860 | - Will Dogecoin take over the world?
01:55:27.620 | Twitter question.
01:55:28.460 | - You know, I don't follow the different coins that much.
01:55:32.260 | So I don't, I mean, I hear about Dogecoin,
01:55:34.660 | and I, you know, I've kind of followed the story of it.
01:55:37.780 | - So the interesting aspect of Dogecoin is it,
01:55:42.900 | so in contrast to like Bitcoin and Ethereum,
01:55:46.980 | which are these serious implementations of cryptocurrency
01:55:51.780 | that seek to solve some of the problems
01:55:53.300 | that we're talking about with smart contracts,
01:55:55.660 | and resist the banks and all those kinds of things,
01:56:00.540 | Dogecoin operates more in the space of memes and humor,
01:56:05.460 | while still doing some of the similar things.
01:56:08.660 | And it presents to the world sort of a question
01:56:11.500 | of whether memes, whether humor,
01:56:16.500 | whether narrative will go a long way in the future,
01:56:23.020 | like much farther than some kind of boring,
01:56:28.580 | old grounded technologies,
01:56:32.340 | whether we'll be playing in the space of fun.
01:56:35.340 | Like once we built a base of comfort and stability,
01:56:40.100 | and like a robust system where everyone has shelter,
01:56:42.620 | everyone has food, and the basic needs covered,
01:56:47.620 | are we going to then operate in the space of fun?
01:56:51.900 | That's what I think about Dogecoin,
01:56:55.060 | because it seems like fun spreads faster than anything else,
01:57:00.060 | fun of different kinds, and that can be bad fun,
01:57:04.420 | and that could be good fun.
01:57:06.300 | And so it's a battle of good fun versus bad fun.
01:57:09.060 | - It goes viral very, very quickly,
01:57:11.420 | when you post something that people find fun.
01:57:14.540 | - Yeah, and that's what Dogecoin represents.
01:57:17.140 | So there's like, so Bitcoin represents like financial,
01:57:20.480 | like serious financial instruments,
01:57:25.500 | and then Dogecoin represents fun.
01:57:27.740 | And it's interesting to watch the battle go on
01:57:31.140 | on the internet to see which wins.
01:57:33.780 | This is also like an open question to me
01:57:35.940 | of what is the internet?
01:57:38.620 | Because fun seems to prevail on the internet.
01:57:43.420 | And is that a fundamental property
01:57:45.740 | of the internet moving forward
01:57:47.260 | when you look 100 years out?
01:57:49.060 | Or is this a temporary thing that was true
01:57:52.180 | at the birth of the internet,
01:57:53.500 | and it's just true for a couple of decades
01:57:55.780 | until it fades away and the adults take over
01:57:58.580 | and become serious again?
01:58:00.020 | - Well, I think the adults took over initially,
01:58:02.860 | and then it was later on that people started using it
01:58:05.980 | for fun, frivolous things like memes.
01:58:08.800 | And I think that's pretty much unstoppable.
01:58:12.680 | - Yeah.
01:58:15.700 | - 'Cause even people who are very serious
01:58:17.700 | enjoy sending around a funny picture
01:58:23.340 | or something that amuses them.
01:58:26.340 | - Yeah, I personally think, we spoke about World War II,
01:58:30.460 | I think memes will save the world
01:58:32.100 | and prevent all future wars.
01:58:35.700 | You've been handwriting your work for the past 20 years
01:58:38.980 | since writing "The Baroque Cycle."
01:58:40.900 | What are the pros and cons of handwriting versus typing?
01:58:43.940 | - For me, I started it as an experiment
01:58:46.780 | when I started "The Baroque Cycle"
01:58:48.180 | because I had noticed that sometimes if I was stuck
01:58:52.180 | having a hard time getting started,
01:58:53.900 | if I just picked up a pen and started writing,
01:58:57.300 | it was easy to go.
01:58:59.940 | So I just decided to keep with that.
01:59:03.860 | If it got in my way, I didn't like it,
01:59:05.900 | I could always just go back to the word processor.
01:59:08.540 | It'd be fine.
01:59:09.380 | So, but I never, that never happened.
01:59:10.980 | So there's a certain security that comes from knowing
01:59:15.040 | that it's ink on paper and there's no operating system crash
01:59:19.840 | or software failure that can obliterate it.
01:59:22.840 | It's a slower output technique.
01:59:31.700 | And so a sentence or a paragraph spends a longer time
01:59:36.700 | in the buffer up here before it gets committed to paper,
01:59:41.820 | whereas I can type really fast.
01:59:43.740 | And so I can slam things out
01:59:46.180 | before I've really thought them through.
01:59:48.780 | So I think the first draft quality ends up being higher.
01:59:52.360 | And then editing, first draft of editing is just faster
01:59:58.180 | because instead of like trying to move the cursor around
02:00:02.140 | or whatever, or, you know, hitting the backspace key,
02:00:06.220 | I can just draw a line through a word or a sentence
02:00:09.360 | or just around a whole paragraph and exit out.
02:00:12.900 | And in doing so, I've very quickly created an edit,
02:00:17.240 | but I've also left behind a record
02:00:18.860 | of what the text was prior to the edit.
02:00:21.060 | - Of course, you know, all the digital versions
02:00:24.720 | have those quote unquote features,
02:00:26.940 | but their experience is different.
02:00:29.740 | - Yeah, yeah.
02:00:31.220 | - Is there a romance to just the physical,
02:00:35.060 | you know, the touch of the pen to the paper,
02:00:40.340 | doing what has been done for centuries?
02:00:43.180 | - I think there is.
02:00:44.020 | I think there's a, just the simplicity of it
02:00:47.780 | and not having any intermediary technology
02:00:51.060 | beyond the pen and the paper is just very simple and clean.
02:00:56.060 | And so I've got a bunch of fountain pens.
02:01:01.060 | I started buying fancy paper from Italy a few years ago
02:01:07.940 | because I thought I would be more conservative with it.
02:01:13.020 | You know, but it still doesn't,
02:01:15.020 | it's still a trivial expenditure.
02:01:19.060 | So it doesn't really alter my habits very much.
02:01:24.140 | - So all that said, you, once you do type stuff up,
02:01:28.820 | you use Emacs.
02:01:29.900 | - Yeah.
02:01:30.740 | - I use Emacs, obviously the superior editor.
02:01:33.460 | - Of course.
02:01:34.540 | - You, let me just ask the ridiculous futuristic question,
02:01:38.460 | 'cause Emacs has been around forever.
02:01:41.060 | Do you think in 100 years, we will still have Emacs and Vim?
02:01:46.060 | Or, like, pick a horizon, let's say 20, 50, 100 years.
02:01:53.820 | - Yeah, no, I mean, whenever you're doing anything in Linux,
02:01:57.800 | you're spending a lot of time editing little config files
02:02:02.260 | and scripts and stuff.
02:02:04.440 | And you need to be able to pop in and out
02:02:06.980 | of editing those things.
02:02:09.420 | And it needs to work,
02:02:11.100 | like even if the windowing GUI is dead
02:02:16.140 | and all you've got is like a command line,
02:02:19.520 | to get out of that problem,
02:02:21.180 | you might need to enter an editor and alter a file.
02:02:26.180 | So I think on that level,
02:02:29.020 | there'll always have to be sort of very simple,
02:02:33.620 | well, Emacs isn't very simple, but you know what I mean.
02:02:36.780 | There have to be basic editors that you can use
02:02:40.420 | from either the command line or a GUI,
02:02:42.780 | just for administering systems.
02:02:47.340 | Now, how widespread they'll be,
02:02:50.300 | there's a certain amount of,
02:02:54.380 | what's the story of the,
02:02:55.780 | there's the American folktale of the guy who,
02:03:01.860 | the hammer guy who drives the railroad spikes, John Henry,
02:03:06.020 | trying to keep up with the steam hammer.
02:03:08.980 | And eventually the steam hammer wins
02:03:11.780 | 'cause he can't drive the spikes fast enough.
02:03:14.020 | So there's a sense in which,
02:03:17.980 | Microsoft, like,
02:03:19.220 | who knows how much they've invested in VS Code
02:03:25.100 | and Visual Studio, or Apple with Xcode.
02:03:29.780 | So they've put huge amounts of money
02:03:32.620 | into enhancing their IDEs.
02:03:36.420 | And Emacs in theory can duplicate all of those features
02:03:41.340 | by, if you just have enough Linux hackers
02:03:46.140 | writing Emacs Lisp macros.
02:03:48.380 | But at some point it's gonna be hard
02:03:54.300 | to maintain that level of,
02:03:57.980 | to keep up feature for feature.
02:04:02.020 | - The interesting thing about Emacs
02:04:04.940 | just is it lasted a long time.
02:04:07.140 | - Yeah.
02:04:08.220 | - I think you've talked about,
02:04:09.760 | there's a certain like, there's certain fads.
02:04:15.620 | Certainly in the software engineering space.
02:04:19.720 | And it's interesting to think about technologies
02:04:23.420 | that sort of last for a very long time.
02:04:28.300 | And just kind of being in the, what is it?
02:04:32.180 | How do they get by?
02:04:33.340 | It's like the cockroaches of software
02:04:37.900 | or the bacteria of software or something.
02:04:40.500 | Like this base thing that
02:04:42.220 | everybody's just become reliant on.
02:04:45.060 | And they just outlast everything else
02:04:47.940 | and slowly, slowly adjust with the times.
02:04:50.660 | With a little bit of a delay,
02:04:52.460 | with a little bit of customization by individuals,
02:04:55.380 | kind of that.
02:04:56.620 | But they're always there in the shadows.
02:04:58.580 | - Yeah.
02:04:59.420 | - And they outlast everybody else.
02:05:01.260 | And I wonder if that might be the story
02:05:03.780 | for a lot of technologies,
02:05:05.020 | especially in the software space.
02:05:06.300 | - Yeah, shell scripts, all that stuff.
02:05:08.580 | You can't run the modern world
02:05:11.620 | without a bunch of shell scripts,
02:05:14.860 | booting up machines and running things.
02:05:18.780 | So that is gonna be a hard thing to replace.
02:05:23.780 | - And then TeX for typesetting that you use, you said.
02:05:29.100 | - For when I want to print it out,
02:05:31.020 | yeah, I just have some simple macros that I use.
02:05:34.820 | But then I have to, the publisher put their foot down
02:05:39.240 | and they want it in Word format now.
02:05:42.340 | So years ago, I wrote some macros to convert.
02:05:47.340 | And this time, what did I do?
02:05:52.220 | - Copy paste?
02:05:53.380 | - No, I use sort of regular expressions.
02:05:57.320 | So, to do italics in TeX,
02:06:00.000 | you put it in curly brackets and you do backslash it,
02:06:05.300 | and then you type what you want to type.
02:06:06.860 | And that's how you get italics in TeX.
02:06:09.720 | So you can create a regular expression
02:06:12.880 | that'll look for some text between curly brackets
02:06:16.520 | preceded by backslash it.
02:06:19.000 | And then instead convert that to italics.
02:06:24.000 | And Word will do that.
02:06:26.380 | Word, if you go deep enough into its search and replace UI.
02:06:32.840 | - You can do regular expressions.
02:06:36.660 | - It's just regexes.
02:06:36.660 | - Yeah.
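As an editorial aside, here is a minimal sketch in Python of the kind of conversion described above, assuming the italics macro takes the {\it ...} form; the HTML <i> tag is only a stand-in target, since the Word-side step actually happens inside Word's own wildcard search-and-replace.

```python
import re

# Match {\it some text}: a literal backslash-it inside curly brackets,
# capturing the text that should become italic.
TEX_ITALICS = re.compile(r"\{\\it\s+([^{}]*)\}")

def convert_italics(text: str) -> str:
    # Rewrite each curly-brace italics group as an <i>...</i> span.
    return TEX_ITALICS.sub(r"<i>\1</i>", text)

sample = r"He boarded the {\it Minerva} at dawn."
print(convert_italics(sample))  # -> He boarded the <i>Minerva</i> at dawn.
```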
02:06:37.500 | - It's funny that you did that.
02:06:38.800 | I mean, I'm sure there's tools that help you
02:06:40.760 | with that kind of thing,
02:06:41.600 | but the task is sufficiently simple
02:06:46.400 | to where you can do a much better job
02:06:48.600 | than anybody else's tool can.
02:06:51.080 | - Yeah, yeah.
02:06:52.040 | - So that's a fascinating process.
02:06:54.080 | - Works fine for me, yeah.
02:06:56.160 | And it keeps you from messing around with formatting.
02:06:59.040 | - Yeah.
02:06:59.880 | - Like, oh, what if I put this chapter heading
02:07:02.680 | in a sans serif font?
02:07:06.000 | You know, it's just classic wanking.
02:07:09.140 | And so those options are closed off in what I'm doing.
02:07:14.140 | - Is there advice you could say,
02:07:17.980 | what does it take to write a great story?
02:07:20.820 | - The power of good yarns, good narratives
02:07:23.660 | to pull people in is incredible.
02:07:28.060 | And I think my sort of amateur theory
02:07:31.460 | is that it's an evolutionary development
02:07:34.740 | that if you're a cave person sitting around a fire
02:07:39.740 | in the Rift Valley a million years ago,
02:07:46.180 | if you can tell the story of how you escaped
02:07:53.020 | from the hyenas or how Uncle Bob didn't escape
02:07:58.020 | from the hyenas, and if the people listening to you
02:08:02.580 | can take that in and they can build that scenario
02:08:06.740 | in their heads, like a kind of virtual reality
02:08:09.380 | and see what you're describing,
02:08:12.620 | then you've just conferred an incredibly important advantage
02:08:16.980 | on the people who've heard that story.
02:08:19.100 | - Yeah.
02:08:19.940 | - Right?
02:08:20.760 | And so they know a bunch of stuff now
02:08:22.880 | about how to stay alive that they could not have learned
02:08:26.720 | in any other way.
02:08:28.000 | I mean, animals who don't have speech,
02:08:32.200 | though, they might warn each other,
02:08:34.320 | they might make a sound that says, "Danger, danger."
02:08:37.660 | But as far as we know,
02:08:41.640 | they can't tell more complicated stories.
02:08:44.920 | - So it's a part of us.
02:08:47.380 | Yeah, the collective intelligence seems to be
02:08:50.980 | one of the key characteristics of Homo sapiens,
02:08:54.740 | the ability to share ideas and hold ideas together
02:08:58.220 | in our minds, and storytelling is the fundamental aspect
02:09:01.420 | of that, maybe even language itself is more fundamental.
02:09:04.780 | - Yeah.
02:09:06.220 | - 'Cause the language is required to do the storytelling.
02:09:09.900 | Or maybe they evolve together.
02:09:11.300 | - Maybe they co-evolve, yeah.
02:09:13.480 | So I think that you've gotta work with that,
02:09:15.980 | and I think sometimes it seems like in kind of
02:09:19.500 | literary circles that having a lot of plot
02:09:25.580 | is a little bit frowned upon as it's pulpy
02:09:28.620 | or it's exploitative, but for me,
02:09:32.260 | I don't have any compunctions whatsoever about that.
02:09:35.580 | I like stories that are grabby and fun and exciting to read,
02:09:40.580 | and once you've got one of those going,
02:09:43.360 | once you've got a good yarn going,
02:09:45.860 | that people will enjoy reading,
02:09:47.180 | then you're free to do whatever you want
02:09:49.540 | in the frame of that story.
02:09:52.300 | But if you don't have that, then you got nothing.
02:09:57.300 | - What about having, like,
02:09:58.980 | which you do, technological and scientific rigor,
02:10:02.460 | like, attention to accuracy as much as possible.
02:10:06.160 | How does that add to Bob telling the story,
02:10:10.100 | or telling the story about Bob around the campfire?
02:10:12.600 | - Well, the main thing that it does is
02:10:15.100 | present little details that you might not have
02:10:20.640 | come up with on your own.
02:10:22.980 | So if you're just sitting there freely imagining things,
02:10:27.460 | your brain probably isn't gonna serve up
02:10:32.460 | the wealth of details and the resulting complications
02:10:36.440 | and surprises that the real world
02:10:40.820 | is constantly presenting us with.
02:10:42.540 | And so in my case, if I'm trying to write a story
02:10:47.540 | about something that involves some technology like a rocket
02:10:52.820 | or orbital maneuvers or whatever,
02:10:55.060 | then delving into those details
02:10:58.100 | eventually is gonna turn up some weird, unexpected,
02:11:02.140 | you know, thing that gives me material to work with.
02:11:05.660 | But also subliminally readers who see that
02:11:09.740 | are gonna be drawn in more
02:11:13.300 | because they're going to find that,
02:11:17.380 | oh, I didn't see that coming, you know.
02:11:19.980 | You know, it's got some of the complexity
02:11:21.820 | and surprise value of the real world.
02:11:24.680 | - Yeah, it does something.
02:11:26.020 | Alex Garland, the director who wrote and directed Ex Machina.
02:11:31.640 | I think about AI movies
02:11:35.820 | and the more care you take in making it accurate,
02:11:38.940 | the more compelling the story becomes somehow.
02:11:43.560 | I'm not sure what that is.
02:11:46.160 | Maybe because it becomes more real
02:11:49.940 | to the people writing the story.
02:11:52.220 | Maybe it just makes you a better writer.
02:11:54.580 | - The key to any storytelling is getting the readers
02:11:58.100 | to suspend their disbelief.
02:12:00.780 | And there's all kinds of triggers and little tells
02:12:03.220 | that can break that.
02:12:04.860 | - Right.
02:12:06.240 | - And once it's broken, it's really hard to get it back.
02:12:09.040 | You know, a lot of times that's the end.
02:12:12.020 | Somebody will just close the book and not pick it up.
02:12:15.480 | - I gotta ask you, you've answered this question,
02:12:18.340 | but I gotta ask you the most impossible question
02:12:22.980 | for an author to answer.
02:12:24.100 | But which Neil Stevenson book should one read first?
02:12:29.100 | - So when people ask me that,
02:12:31.580 | I usually ask them what they like to read, right?
02:12:34.900 | 'Cause I mean, there's the best known one
02:12:38.900 | is probably Snow Crash, but that's a cyberpunk novel
02:12:43.100 | that's at the same time making fun of cyberpunk.
02:12:46.940 | So it's kind of got some layers to it
02:12:49.720 | that might not seem so funny
02:12:53.220 | if you don't have that, if you don't get the joke, right?
02:12:56.900 | So there's, I've written, as you point out,
02:13:01.220 | I've written historical novels.
02:13:03.020 | Some people like those, some people prefer those.
02:13:06.420 | So if that's what you like,
02:13:08.480 | then Cryptonomicon or The Baroque Cycle
02:13:11.300 | is where you would start.
02:13:12.980 | If you like sort of techno thrillers
02:13:14.740 | that are set in a modern day setting,
02:13:17.480 | but aren't science fiction-y per se,
02:13:20.660 | then Reamde is one of those.
02:13:24.860 | And Termination Shock is definitely one of those.
02:13:29.000 | So it just depends on what people like.
02:13:35.640 | - When people a long time ago recommended I read Snow Crash,
02:13:40.700 | they said, "It's Neil Stevenson lite."
02:13:47.780 | It's the, if you don't want to be overwhelmed by the depth,
02:13:52.060 | like the rigor of a book,
02:13:54.060 | like that's a good introduction to the man.
02:13:57.620 | So essentially you broke it down by topics,
02:14:00.620 | but if you wanted to read all of them,
02:14:04.980 | what's a good introduction to the man?
02:14:07.980 | Because obviously these worlds are very different.
02:14:10.340 | The philosophies are very different.
02:14:12.780 | What's a good introduction to the human?
02:14:17.460 | People ask the same thing of Dostoevsky.
02:14:19.460 | It's a hard one to answer.
02:14:23.060 | - Maybe Seveneves, 'cause it's got big themes.
02:14:26.920 | It's about heavy things happening to the human race,
02:14:32.400 | but hopefully the story is told through a cast of characters
02:14:37.680 | that people can relate to, and it moves along.
02:14:43.900 | So it does go kind of deep eventually on how rockets work
02:14:48.900 | and orbital mechanics and all that stuff,
02:14:51.220 | but people were able to get through it anyway,
02:14:55.300 | or some people just skip over that.
02:14:57.380 | It's fine.
02:14:58.220 | - As an author, let me ask you,
02:15:03.140 | what books had a big impact on your life that you've read?
02:15:07.300 | Is there any that jumped to mind
02:15:10.060 | that you learned from as a writer, as a philosopher,
02:15:13.940 | as a mathematician, as an engineer?
02:15:16.220 | - This is one of these questions where I always blank out,
02:15:19.440 | and then when I'm walking out the door,
02:15:22.380 | I'll remember 12 of them.
02:15:24.340 | - So this is a random selection
02:15:26.060 | that doesn't represent the top ones?
02:15:28.840 | - Well, I mentioned "Gulag Archipelago."
02:15:32.540 | That's kind of a hefty and dark, but-
02:15:35.740 | - And then it has a personal connection as well?
02:15:38.300 | - Yeah, just, yeah.
02:15:39.140 | - It's like where you found the book too.
02:15:41.300 | - Right.
02:15:42.140 | - The time in your life, where you found it,
02:15:44.580 | who recommended it, that's also part of the story.
02:15:46.780 | - Yeah, so there's definitely that.
02:15:48.580 | I circle back to "Moby Dick" a lot
02:15:52.060 | because we read it in a really great English class
02:15:58.460 | I had in high school,
02:15:59.460 | and I came in with an oppositional stance
02:16:02.660 | because I thought that the teacher
02:16:04.780 | was gonna try to talk me into having
02:16:07.460 | all kinds of highfalutin ideas about allegory,
02:16:11.100 | and what does this mean, and what's the symbolism?
02:16:14.180 | And it turned out that,
02:16:15.660 | it turned out to be a lot more interesting
02:16:20.280 | and satisfying than that.
02:16:21.620 | - What was the first powerful book you remember reading
02:16:25.860 | that convinced you that this form could have depth?
02:16:30.860 | Was it "Moby Dick," was it in high school?
02:16:35.780 | - I'm trying to remember,
02:16:36.980 | well, "Moby Dick" was definitely a big one.
02:16:39.180 | I mean, I used to read a lot of classics comics.
02:16:43.220 | When I was, I don't know if you've seen these,
02:16:46.620 | it's a whole series of comic books that,
02:16:50.860 | it was viral, you could,
02:16:54.180 | in the back of each comic book was an order form.
02:16:57.700 | You could check some boxes and fill out your address
02:17:01.340 | and mail it in, and more would show up.
02:17:04.420 | But it was like, they would do "The Count of Monte Cristo,"
02:17:06.660 | "Moby Dick," Robert Louis Stevenson,
02:17:11.660 | "Robinson Crusoe," all these sorts of classic books
02:17:15.500 | that they had put into comic book form.
02:17:20.100 | - That's amazing.
02:17:20.980 | - Yeah, reading "Moby Dick," if you're nine years old,
02:17:24.340 | is a tall order.
02:17:26.100 | There's some very complicated sentences in there,
02:17:28.860 | and a lot of digressions.
02:17:31.620 | But if you're just looking at the comic books,
02:17:33.420 | it's like, holy shit, look at that whale.
02:17:35.500 | - And ultimately the power of the story
02:17:41.180 | doesn't need the complicated words.
02:17:42.980 | It's all about the man and the whale.
02:17:47.780 | - Yeah, yeah, so you could get kind of a grounding
02:17:50.140 | in a lot of classic works of literature
02:17:52.380 | without actually reading them,
02:17:53.500 | which is great when you're nine years old.
02:17:57.460 | So I read a lot of that stuff, for sure.
02:18:01.500 | "The Annotated Sherlock Holmes."
02:18:03.460 | - You mentioned David Deutsch, too,
02:18:08.460 | as an inspiration for some of your work.
02:18:10.380 | I mean, you've obviously done, like,
02:18:12.060 | really a lot of research for the books you do.
02:18:15.300 | Roger Penrose.
02:18:16.660 | Do you remember a book that made you wanna become a writer
02:18:21.940 | or a moment that made you become a writer?
02:18:23.860 | - I think, like, the answer I usually give
02:18:29.340 | is that when I was in, like, fifth grade,
02:18:31.700 | one of my friends came to school one day.
02:18:36.660 | He was wearing leather shoes, like dress shoes.
02:18:40.140 | And I hated dress shoes 'cause mine never fit.
02:18:44.500 | And so they were uncomfortable.
02:18:45.980 | I couldn't run.
02:18:47.700 | They were cold.
02:18:49.100 | It was Iowa.
02:18:50.380 | So I kinda said, I remember very clearly thinking,
02:18:55.380 | okay, I don't like where this is going.
02:18:58.180 | Like, does this mean that next year
02:19:02.660 | all of the kids are gonna be wearing leather shoes?
02:19:07.220 | So I need to find a job where I don't have to do that.
02:19:11.360 | So that was, like, the first time I thought about
02:19:16.100 | trying to find such a job, you know, being a writer.
02:19:18.700 | And then I just read a lot of just classic
02:19:23.100 | science fiction short stories
02:19:24.740 | and started trying to write some of my own.
02:19:28.220 | And they were just classic young adult stories,
02:19:31.380 | like by Heinlein and the other classic names
02:19:36.380 | that you think of, but the Heinlein ones
02:19:39.180 | have stuck with me in a way that the others didn't.
02:19:42.020 | - What's the greatest science fiction book ever written?
02:19:45.080 | Just removing your work from consideration.
02:19:51.140 | - Greatest?
02:19:54.700 | - I'm loving torturing you right now.
02:19:56.100 | - Greatest ever non-Stevenson, do we include fantasy?
02:20:00.580 | Or does it have to be science fiction?
02:20:02.740 | - Oh, interesting, fantasy.
02:20:04.420 | I did not expect that twist.
02:20:10.260 | - Well, for in a weird way,
02:20:11.580 | they're lumped together in people's minds, right?
02:20:14.240 | - They are, but there's also a boundary somehow.
02:20:18.460 | I'm not sure what that is exactly.
02:20:20.140 | - Nobody is, it's a mystery.
02:20:23.220 | So, I mean, if we do include it,
02:20:24.580 | then it's easily the Lord of the Rings.
02:20:27.380 | But, I mean, greatness is a interesting quality
02:20:32.380 | to try to define.
02:20:34.240 | And for me, a lot of the fun and the joy of such books
02:20:41.660 | is not in what you'd call greatness, but just storytelling.
02:20:46.720 | So I was always a big fan of "Have Space Sue Will Travel,"
02:20:51.680 | which is a Heinlein young adult book.
02:20:53.860 | It's just a fun, good read.
02:20:56.360 | - So fun is a big component, greatness is overrated.
02:21:02.340 | - Well, I don't know if it's overrated,
02:21:04.060 | but it's just, it might be underdefined,
02:21:08.380 | let's put it that way.
02:21:10.820 | - "Have Space Sue Will Travel,"
02:21:11.980 | now I definitely have to read that one.
02:21:14.100 | You mentioned Iowa.
02:21:15.660 | I was there a couple of times.
02:21:18.140 | I got to spend quite a bit of time with Dan Gable,
02:21:21.820 | with Tom Brands, who are wrestlers.
02:21:23.820 | Is it now wrestling, martial arts, part of your life,
02:21:30.860 | any part of your formation of who you are as a human being?
02:21:35.380 | - I think so.
02:21:36.900 | It was a late thing for me, but growing up in Ames,
02:21:41.800 | Dan Gable was a few years older than me.
02:21:47.740 | And so sometimes we would go to the arena at the university
02:21:51.500 | and watch wrestling meets.
02:21:54.260 | And this was before his Olympic career.
02:21:58.920 | So everyone knew he was the star of that team,
02:22:02.540 | and that he was the best,
02:22:03.540 | but people didn't yet know
02:22:05.180 | that he was the greatest of all time.
02:22:07.140 | - Gee, you saw Gable, so that was part,
02:22:09.620 | it's funny, it feels like a small world,
02:22:13.060 | that you would be in the same space as Dan Gable.
02:22:16.380 | - Well, from 100 feet away, a little dot on the mat,
02:22:19.980 | trouncing his opponents, him and Chris Taylor.
02:22:23.500 | So the other star was this 400 pound plus guy
02:22:27.660 | named Chris Taylor, who also went to the Olympics.
02:22:32.060 | So yeah, people, he was an athletic hero.
02:22:37.060 | And wrestling is, there's certain states like Oklahoma,
02:22:43.380 | Pennsylvania, Iowa, where wrestling is the sport,
02:22:46.900 | because those are states of small towns.
02:22:49.060 | And so if you're a small town, if you're like Dan Gable,
02:22:52.020 | and you have to be on a football team
02:22:56.500 | with 20 other guys who are not Dan Gable,
02:22:59.860 | then no matter how good you are, your team might suck.
02:23:04.660 | But in a solo thing, you can go to the Olympics.
02:23:11.800 | So we did a lot of wrestling in our gym classes in school,
02:23:15.260 | and I didn't like it.
02:23:16.460 | And I think partly it's just that it was so competitive,
02:23:20.700 | and the people who cared about it,
02:23:23.100 | really cared about it a lot.
02:23:26.420 | And so it was pretty tough.
02:23:29.260 | I didn't think I had the right body type.
02:23:31.220 | But then when I was, after college,
02:23:33.920 | I was in Iowa City for a few years,
02:23:36.060 | when he was coaching the wrestling team there.
02:23:39.900 | And he won like nine championships out of 10 years
02:23:44.900 | during that time.
02:23:47.300 | So he was both the greatest individual wrestler of all time
02:23:50.620 | and the greatest team coach.
02:23:53.520 | So I've never met him, but we've,
02:23:57.940 | he's kind of been in my sphere of awareness
02:24:02.940 | since I was, kind of my whole life.
02:24:04.820 | And people would always tell stories about him.
02:24:08.600 | I think he got arrested once for some kind of,
02:24:11.080 | I don't know, minor offense in Ames.
02:24:15.740 | And so he just basically stayed up all night.
02:24:17.640 | He was in this cage in the jail.
02:24:20.940 | He just stayed up all night doing pull-ups.
02:24:22.800 | - Yeah, sounds about right.
02:24:24.360 | - Yeah.
02:24:25.200 | And so, yeah.
02:24:30.200 | - So has that been,
02:24:31.960 | I mean, Iowa is such an interesting place in the world.
02:24:35.000 | And wrestling is just part of that story.
02:24:38.100 | Is that somewhere in there?
02:24:41.900 | Does that resonate deeply with who you are?
02:24:44.780 | - It was a formative, yeah, thing for me
02:24:47.200 | growing up there, for sure.
02:24:48.820 | It's just a,
02:24:49.680 | or at least used to be a very orderly place,
02:24:55.040 | high social capital, very minimal class differences.
02:25:01.180 | So like you'd have some people who would drive a Cadillac
02:25:05.260 | instead of a Chevy, but that was it.
02:25:09.780 | Those were the rich people, right?
02:25:11.420 | And a college town is always a different environment.
02:25:16.500 | Like Austin has some of this.
02:25:21.400 | So it was a pretty kind of utopian,
02:25:25.940 | other than the weather and a few other things,
02:25:29.380 | environment to grow up in.
02:25:31.860 | The martial art I ended up doing is sword stuff,
02:25:34.900 | which is interesting
02:25:36.580 | because it uses a different feedback loop.
02:25:39.300 | So if you're grappling,
02:25:41.460 | everything is through sense of touch.
02:25:46.160 | And your sense of touch is very old and simple, right?
02:25:50.700 | Like earthworms don't even have eyes,
02:25:54.300 | but they can tell when they're being touched, right?
02:25:56.700 | So it's very fast.
02:26:00.780 | And with a standoff art like boxing
02:26:05.780 | or some kinds of sword fighting,
02:26:08.620 | you're not touching the other person most of the time.
02:26:12.300 | Your visual system is doing something way more,
02:26:16.700 | it's doing SLAM (simultaneous localization and mapping), trying to figure out
02:26:20.420 | what the other person is up to.
02:26:22.340 | And so that always felt more my speed.
02:26:28.140 | So in Olympic style fencing,
02:26:31.200 | it doesn't start really until you're crossing blades
02:26:36.700 | with the other person.
02:26:37.620 | And now you're back to wrestling.
02:26:39.420 | You're feeling what they're doing.
02:26:42.140 | And it's all about that.
02:26:43.380 | But some of the older sword arts
02:26:46.100 | don't engage the blade that way.
02:26:49.900 | You stand off at range and then you make cutting attacks.
02:26:57.140 | And so those are all processed visually.
02:27:01.860 | And I think I'm more of a slow thinker.
02:27:06.180 | So it works for me better.
02:27:08.740 | - I mean, it has the same,
02:27:11.260 | the artistry and the beauty of boxing,
02:27:13.100 | I suppose, just like you said.
02:27:14.900 | It's like there's no contact
02:27:16.900 | and it's all processed visually.
02:27:18.780 | And I'm sure there's a dance of its own.
02:27:20.780 | - Yeah.
02:27:22.180 | - That depends on the characteristic of a sword involved.
02:27:24.820 | - Yeah, there is a set of stances
02:27:27.220 | and basic reactions that you try to learn
02:27:30.520 | that are thought to be defensible and safe or safer.
02:27:35.520 | And so it tends to be a series of short engagements
02:27:40.260 | where you'll close in, you'll try out your idea,
02:27:45.260 | and it works or it doesn't.
02:27:47.220 | Then you back off again.
02:27:49.420 | - It's interesting to think about like human history,
02:27:53.240 | 'cause martial arts, okay, that's a thing.
02:27:57.340 | But in terms of sword fighting,
02:28:00.340 | just the full range of humans that existed
02:28:05.340 | who mastered sword fighting
02:28:07.940 | or sought to master sword fighting,
02:28:09.780 | just to imagine the thousands of people
02:28:12.300 | who the heights they have achieved,
02:28:14.820 | 'cause the stakes are so incredibly high to be good.
02:28:19.660 | - And it's the richest, most powerful people
02:28:22.580 | in those societies spending whatever it takes
02:28:26.400 | to get the best gear and the best training.
02:28:31.220 | 'Cause you're right, everything depends on it.
02:28:33.620 | - And it's still life and death.
02:28:35.020 | I mean, that's fascinating.
02:28:36.740 | That's fascinating.
02:28:40.420 | We perhaps have lost that forever with greater weapons.
02:28:45.420 | I mean, the artistry of sword fighting
02:28:48.780 | when it's life and death and you go into war,
02:28:52.300 | you have the Miyamoto Musashi's of the world, right?
02:28:54.900 | I don't know.
02:28:57.260 | There's a poetry to that,
02:28:59.540 | that there's a mastery to that
02:29:01.940 | that I don't know if we could achieve
02:29:03.020 | with any other kind of martial art.
02:29:05.520 | - Well, one of the good,
02:29:07.700 | you were talking earlier about the good effects
02:29:11.560 | of the internet, social media that we sometimes overlook.
02:29:15.500 | And one of those is that there were all these isolated
02:29:19.660 | people around the world who were interested in this
02:29:21.900 | who found each other and kind of created a network
02:29:26.020 | of people who help each other learn these things.
02:29:28.860 | So that doesn't mean that anyone is up to the level
02:29:32.180 | of that you're talking about yet, but it is happening.
02:29:37.180 | And so there's a large number of old treatises,
02:29:44.580 | old written documents that have been dug up from libraries
02:29:49.280 | and people have been going over these
02:29:51.220 | and translating them from old dialects of Italian
02:29:55.100 | and German to make sense of them
02:29:58.420 | and learning how to do these techniques
02:30:02.040 | with different weapons.
02:30:04.480 | Actually, there's a guy here in Austin named Da'Mon Stith
02:30:10.300 | who does African, historical African martial arts.
02:30:13.680 | Also martial arts of enslaved Africans
02:30:18.900 | who would learn machete fighting techniques
02:30:23.340 | in the Caribbean, South America.
02:30:26.140 | He's probably within a mile of us.
02:30:27.860 | He's an amazing guy. - That's awesome.
02:30:29.420 | I'm gonna look him up.
02:30:30.660 | Can I ask you for advice?
02:30:34.120 | Can you give advice for young people, high school,
02:30:37.820 | college, undergrads, thinking about their career,
02:30:42.820 | thinking about life, how to live a life
02:30:45.860 | they can be proud of.
02:30:48.260 | You think quite a bit about what is required
02:30:51.260 | to be innovative in this world.
02:30:53.380 | You think quite a bit about the future.
02:30:55.020 | So if somebody wanted to be a person
02:30:57.540 | that makes a big impact on the future,
02:31:00.300 | what advice would you give them?
02:31:01.900 | - I think a big part of it is finding the thing
02:31:06.460 | that you will do happily and I don't wanna say obsessively
02:31:11.460 | because that sounds like maybe it's pathological,
02:31:16.220 | but if you can find a thing that you'll sit down,
02:31:20.700 | you'll start doing it and hours later
02:31:23.420 | you kind of snap out of it, where did the time go?
02:31:26.420 | Then that's a really key discovery
02:31:32.740 | for anyone to make about themselves when they're young
02:31:37.180 | because if you don't have that,
02:31:38.780 | it's hard to figure out where you should put your energies.
02:31:45.580 | And so you might have the best intentions.
02:31:47.700 | You might say, I want world peace or whatever,
02:31:51.560 | but at the end of the day, what really matters
02:31:56.940 | is how do you spend your time?
02:31:59.700 | And are you spending it in a way that's productive?
02:32:02.940 | Because it doesn't matter how smart you are
02:32:09.620 | or well-intentioned you are unless you've figured that out.
02:32:13.500 | And so it's finding that thing in which you can sort of,
02:32:16.300 | you naturally lose yourself in.
02:32:18.700 | The thing is, at least for me,
02:32:22.560 | there's a lot of things like that,
02:32:25.340 | but I first have to overcome the initial hump
02:32:28.420 | of really sucking at that thing.
02:32:31.060 | Like the fun starts a little bit after the first hump
02:32:34.860 | of really sucking and then you could suck just regularly.
02:32:38.020 | - Yeah.
02:32:39.180 | - So oftentimes people can give up too early, I think.
02:32:43.140 | I mean, that's true with mathematics for me.
02:32:44.740 | It's for a lot of people,
02:32:47.500 | is if you just give it a chance to struggle.
02:32:50.060 | If you give yourself time to struggle,
02:32:51.980 | you'll find a way, you'll find the thing within that thing
02:32:56.540 | that you can lose track of time with.
02:32:59.000 | - Yeah, that's a key detail that,
02:33:01.760 | that's an important thing to add to what I said,
02:33:05.940 | which is that this might not happen
02:33:08.340 | the first time you do a thing.
02:33:10.740 | Maybe it will.
02:33:11.580 | But you might have to climb that learning curve.
02:33:16.100 | And if there's pressures in your life
02:33:19.140 | that are making you feel bad about that,
02:33:21.260 | then it might prevent you from getting where you need to be.
02:33:27.180 | So there's some complexity there
02:33:32.380 | that can make this kind of non-obvious.
02:33:36.800 | But that's why we need good teachers.
02:33:41.800 | Another beneficial thing of the internet is YouTube
02:33:48.920 | and being able to learn things,
02:33:51.360 | how to do things on YouTube.
02:33:52.940 | The dude who made the YouTube video
02:33:56.480 | doesn't care how many times you hit pause and rewind.
02:34:00.040 | They're never gonna roll their eyes
02:34:04.080 | and be impatient with you.
02:34:06.680 | - And sometimes spending a huge amount of time
02:34:13.080 | on one video or one book,
02:34:15.340 | like making that the thing you just spent
02:34:17.800 | a huge amount of time on rereading, rereading,
02:34:20.280 | or rewatching, rewatching,
02:34:22.520 | that somehow really solidifies your love for that thing.
02:34:27.520 | And like the depth of understanding you start to gain.
02:34:32.760 | And it's okay to stay with that.
02:34:34.440 | I used to think like there's all these books out there.
02:34:37.560 | So like I need to keep reading or keep reading.
02:34:40.720 | But then I realized, I think it was somewhere in college,
02:34:44.480 | where you could just spend your whole life
02:34:49.120 | with a single textbook.
02:34:50.800 | There's enough in that textbook to really, really stay.
02:34:55.000 | - Miesner, Thorne, and Wheeler, "Gravitation"
02:34:58.480 | is one of those.
02:34:59.320 | Or another one is "The Road to Reality" by Roger Penrose,
02:35:03.320 | which is just incredibly deep.
02:35:05.760 | And it starts with like two plus two equals four.
02:35:08.480 | And at the end, you're at the boundaries of physics.
02:35:12.920 | It's an amazing, amazing book.
02:35:16.480 | - Let me ask you the big ridiculous question.
02:35:20.200 | Since you've pondered some big ridiculous questions
02:35:23.640 | in your work, what's the meaning of this whole thing?
02:35:27.320 | What's the meaning of life?
02:35:28.800 | - Wow. - Human life.
02:35:30.080 | - Well, as far as I know, we're unique in the universe.
02:35:36.280 | There's no evidence that there's anything else
02:35:40.080 | in the universe that's as complicated
02:35:41.880 | as what's between our ears.
02:35:43.840 | Might be, you can't rule it out.
02:35:45.960 | So we appear to be pretty special.
02:35:51.360 | And so it's gotta have something to do with that.
02:35:55.560 | And one of the reasons I like David Deutsch,
02:35:59.440 | in particular his book, "The Beginning of Infinity,"
02:36:02.040 | is that he talks about the power of explanations
02:36:06.440 | and the fact that most civilizations are static,
02:36:11.440 | that they've got a set of dogmas
02:36:14.280 | that they arrive at somehow,
02:36:17.040 | and they just pass those on from one generation to the next
02:36:22.040 | and nothing changes.
02:36:24.360 | But that huge changes have happened
02:36:27.880 | when people sort of follow whatever you wanna call it,
02:36:32.000 | the scientific method or enlightenment.
02:36:34.880 | There's different ways of thinking about it,
02:36:38.160 | but basically explanatory,
02:36:40.520 | it's about the power of explanations
02:36:44.560 | and being able to figure out why things are the way they are.
02:36:48.200 | And that has created changes in our thinking
02:36:52.640 | and our way of life over the last few centuries
02:36:55.880 | that are explosive compared to anything that came before.
02:37:00.440 | And David sort of verges on classifying this
02:37:05.040 | as like a force of nature
02:37:07.160 | in its potential transformative power.
02:37:10.680 | If we keep going,
02:37:11.880 | we could, if we figure out how to colonize the universe,
02:37:18.960 | like you were talking about earlier,
02:37:21.800 | how to spread to other star systems,
02:37:23.920 | then it is effectively a force of nature.
02:37:29.440 | - This kind of drive to understand more and more and more,
02:37:32.440 | deeper and deeper and deeper,
02:37:34.560 | and to engineer stuff so that we can understand even more.
02:37:37.680 | - Yeah.
02:37:38.520 | - Yeah, it's the old,
02:37:41.400 | the universe created us to understand itself.
02:37:43.720 | Maybe that's the whole purpose.
02:37:46.480 | - Yeah.
02:37:49.560 | It is an interesting, peculiar side effect
02:37:52.800 | of the way we've been created
02:37:54.120 | is we seem to be conscious beings.
02:37:56.040 | We seem to have little egos.
02:37:57.360 | We seem to be born and die pretty quickly.
02:38:02.000 | There's a bunch of drama.
02:38:03.040 | We're all within ourselves, pretty unique.
02:38:06.680 | And we fall in love and start wars and there's hate
02:38:09.840 | and all the full interesting dynamic of it.
02:38:12.680 | So it's not just about the individual people.
02:38:15.840 | - Yeah.
02:38:16.680 | - Somehow like the concert that we played together.
02:38:18.920 | - Mm-hmm, yeah, yeah, so.
02:38:22.880 | - That's kind of interesting.
02:38:24.200 | There's a lot of peculiar aspects of that,
02:38:26.240 | that I wonder if they're fundamental
02:38:28.320 | just quirks of evolution,
02:38:29.840 | whether it's death, whether it's love,
02:38:33.400 | whether all those things.
02:38:35.400 | I wonder if they're, from an engineering perspective,
02:38:39.800 | when we're trying to create that intelligent toaster
02:38:41.920 | that listens for the slammed door
02:38:47.360 | and the smell of burning toast,
02:38:49.240 | whether that toaster should be afraid of death
02:38:54.680 | and should fall in love just like we do.
02:38:56.680 | Neil, you're a fascinating human being.
02:39:01.120 | You've impacted the lives of millions of people.
02:39:03.840 | - Well, thank you. - It is a huge honor
02:39:05.440 | that you would spend your valuable time with me today.
02:39:08.040 | Thank you so much.
02:39:08.880 | Thank you for coming down to beautiful, hot Texas.
02:39:13.480 | And thank you for talking today.
02:39:15.280 | - It was a pleasure.
02:39:16.120 | I'm glad I came and did it.
02:39:17.640 | - Thanks for listening to this conversation
02:39:20.440 | with Neil Stevenson.
02:39:21.640 | To support this podcast,
02:39:23.040 | please check out our sponsors in the description.
02:39:26.040 | And now, let me leave you with some words
02:39:28.120 | from Neil Stevenson himself in his novel "Snow Crash."
02:39:32.160 | "The world is full of things more powerful than us,
02:39:35.720 | but if you know how to catch a ride, you can go places."
02:39:39.240 | Thanks for listening and hope to see you next time.
02:39:42.720 | (upbeat music)
02:39:45.320 | (upbeat music)