
Walter Isaacson: Elon Musk, Steve Jobs, Einstein, Da Vinci & Ben Franklin | Lex Fridman Podcast #395


Chapters

0:00 Introduction
3:00 Difficult childhood
20:04 Jennifer Doudna
23:01 Einstein
28:20 Tesla
45:24 Elon Musk's humor
49:34 Steve Jobs' cruelty
52:58 Twitter
65:07 Firing
67:52 Hiring
76:55 Time management
84:39 Groups vs individuals
88:25 Mortality
91:57 How to write
112:56 Love & relationships
117:50 Advice for young people


00:00:00.000 | I hope with my books, I'm saying,
00:00:02.740 | this isn't a how-to guide,
00:00:06.240 | but this is somebody you can walk alongside.
00:00:08.960 | You can see Einstein growing up Jewish in Germany.
00:00:13.440 | You can see Jennifer Doudna growing up as an outsider,
00:00:18.120 | Leonardo da Vinci, or Elon Musk, you know,
00:00:22.040 | in really violent South Africa
00:00:24.640 | with a psychologically difficult father
00:00:27.680 | and getting off the train when he goes to an anti-apartheid
00:00:30.800 | concert with his brother,
00:00:32.320 | and there's a man with a knife sticking out of his head,
00:00:36.720 | and they step into the pool of blood,
00:00:38.480 | and it's sticky on their soles.
00:00:40.540 | This causes, you know, scars
00:00:44.580 | that last the rest of your life,
00:00:47.320 | and the question is not how do you avoid getting scarred,
00:00:51.400 | it's, you know, how do you deal with it?
00:00:56.240 | The following is a conversation with Walter Isaacson,
00:00:59.640 | one of the greatest biography writers ever,
00:01:02.380 | having written incredible books on Albert Einstein,
00:01:05.620 | Steve Jobs, Leonardo da Vinci, Jennifer Doudna,
00:01:08.960 | Benjamin Franklin, Henry Kissinger,
00:01:11.720 | and now a new one on Elon Musk.
00:01:15.240 | We talked for hours on and off the mic.
00:01:18.200 | I'm sure we'll talk many more times.
00:01:20.080 | Walter is a truly special writer, thinker,
00:01:22.960 | observer, and human being.
00:01:25.920 | I highly recommend people read his new book on Elon.
00:01:29.800 | I'm sure there will be short-term controversy,
00:01:32.640 | but in the long-term,
00:01:34.560 | I think it will inspire millions of young people,
00:01:37.260 | especially with difficult childhoods,
00:01:39.200 | with hardship in their surroundings or in their own minds,
00:01:42.920 | to take on the hardest problems in the world
00:01:45.360 | and to build solutions to those problems,
00:01:47.880 | no matter how impossible the odds.
00:01:50.800 | In this conversation, Walter and I cover all of his books
00:01:54.480 | and use personal stories from them
00:01:56.560 | to speak to the bigger principles
00:01:58.400 | of striving for greatness in science,
00:02:01.040 | in tech, engineering, art, politics, and life.
00:02:04.820 | There are many things in the new Elon book
00:02:08.180 | that I felt are best saved
00:02:09.720 | for when I speak to Elon directly again on this podcast,
00:02:13.700 | which will be soon enough.
00:02:16.080 | Perhaps it's also good to mention here
00:02:18.080 | that my friendships, like with Elon,
00:02:21.440 | nor any other influence like money, access, fame, power,
00:02:25.800 | will ever result in me sacrificing my integrity, ever.
00:02:29.800 | I do like to celebrate the good in people,
00:02:32.920 | to empathize and to understand,
00:02:35.720 | but I also like to call people out on their bullshit
00:02:39.520 | with respect and with compassion.
00:02:42.120 | If I fail, I fail due to a lack of skill,
00:02:44.960 | not a lack of integrity.
00:02:47.280 | I'll work hard to improve.
00:02:50.020 | This is the Lex Fridman Podcast.
00:02:52.060 | To support it, please check out our sponsors
00:02:54.140 | in the description.
00:02:55.380 | And now, dear friends, here's Walter Isaacson.
00:02:59.220 | What is the role of a difficult childhood
00:03:02.420 | in the lives of great men and women, great minds?
00:03:05.700 | Is that a requirement, is it a catalyst,
00:03:08.380 | or is it just a simple coincidence of fate?
00:03:11.060 | - Well, it's not a requirement.
00:03:12.860 | Some people with happy childhoods do quite well,
00:03:16.260 | but it certainly is true that a lot of really driven people
00:03:21.260 | are driven because they're harnessing
00:03:23.540 | the demons of their childhood.
00:03:25.860 | Even Barack Obama's sentence in his memoirs,
00:03:29.060 | which is, "I think every successful man
00:03:31.060 | "is either trying to live up to the expectations
00:03:33.480 | "of his father or live down the sins of his father."
00:03:37.040 | And for Elon, it's especially true,
00:03:39.020 | 'cause he had both a violent and difficult childhood
00:03:42.540 | and a very psychologically problematic father.
00:03:46.340 | He's got those demons dancing around in his head,
00:03:50.500 | and by harnessing them, it's part of the reason
00:03:55.140 | that he does riskier, more adventurous, wilder things
00:03:59.980 | than maybe I would ever do.
00:04:02.020 | - You've written that Elon talked about his father
00:04:06.060 | and that at times it felt like mental torture,
00:04:09.980 | the interaction with him during his childhood.
00:04:12.620 | Can you describe some of the things you've learned?
00:04:16.180 | - Yeah, well, Elon and Kimball would tell me
00:04:19.300 | that, for example, when Elon got bullied on the playground
00:04:24.300 | and one day was pushed down some concrete steps
00:04:27.920 | and had his face pummeled so badly
00:04:29.980 | that Kimball said, "I couldn't really recognize him."
00:04:32.580 | And he was in the hospital for almost a week.
00:04:34.700 | But when he came home, Elon had to stand
00:04:38.100 | in front of his father, and his father berated him
00:04:41.540 | for more than an hour and said he was stupid
00:04:44.860 | and took the side of the person who had beaten him.
00:04:48.380 | - That's probably one of the more traumatic events
00:04:50.420 | of Elon's life.
00:04:51.580 | - Yes, and there's also Veld School,
00:04:53.940 | which is a sort of paramilitary camp
00:04:56.860 | that young South African boys got sent to.
00:04:59.580 | And at one point, he was scrawny.
00:05:02.100 | He was very bad at picking up social cues
00:05:06.300 | and emotional cues, he talks about being Asperger's.
00:05:10.180 | And so he gets traumatized at a camp like that.
00:05:13.920 | But the second time he went, he'd gotten bigger.
00:05:16.660 | He had shot up to almost six feet
00:05:18.060 | and he'd learned a little bit of judo.
00:05:19.980 | And he realized that if he was getting beaten up,
00:05:22.500 | it might hurt him, but he would just punch the person
00:05:25.900 | in the nose as hard as possible.
00:05:28.060 | So that sense of always punching back
00:05:30.980 | has also been ingrained in Elon.
00:05:33.660 | I spend a lot of time talking to Errol Musk, his father.
00:05:36.940 | Elon doesn't talk to Errol Musk anymore, his father,
00:05:40.500 | nor does Kimball, it's been years.
00:05:42.940 | And Errol doesn't even have Elon's email.
00:05:47.300 | So a lot of times, Errol will be sending me emails.
00:05:50.580 | And Errol had one of those Jekyll and Hyde personalities.
00:05:54.740 | He was a great mind of engineering
00:05:59.100 | and especially material science,
00:06:01.540 | knew how to build a wilderness camp in South Africa
00:06:06.220 | using mica and how it would not conduct the heat.
00:06:09.720 | But he also would go into these dark periods
00:06:13.720 | in which he would just be psychologically abusive.
00:06:18.180 | And of course, Maye Musk says to me,
00:06:21.460 | his mother, who divorced Errol early on,
00:06:25.060 | said the danger for Elon is that he becomes his father.
00:06:29.700 | And every now and then, you've been with him so much, Lex,
00:06:32.540 | and you know him well, he'll even talk to you
00:06:35.920 | about the demons, about Diablo dancing in his head.
00:06:39.180 | I mean, he gets it, he's self-aware.
00:06:41.920 | But you've probably seen him at times
00:06:44.580 | where those demons take over
00:06:47.460 | and he goes really dark and really quiet.
00:06:50.380 | And Grimes says, you know, I can tell a minute or two
00:06:55.380 | in advance when demon mode's about to happen.
00:06:58.780 | And he'll go a bit dark.
00:07:00.540 | I was here in Austin, at a dinner with a group.
00:07:04.260 | And you could tell suddenly something had triggered him
00:07:08.020 | and he was gonna go dark.
00:07:09.380 | I've watched it in meetings where somebody will say,
00:07:12.420 | we can't make that part for less than $200
00:07:15.700 | or no, that's wrong.
00:07:17.940 | And he'll berate them.
00:07:20.180 | And then he snaps out of it, as you know that too,
00:07:22.940 | the huge snap out where suddenly he's showing you
00:07:25.620 | Monty Python's skit on his phone
00:07:27.580 | and he's joking about things.
00:07:29.540 | So I think coming out of the childhood,
00:07:32.080 | there were just many facets, maybe even many personalities,
00:07:35.620 | the engineering mode, the silly mode,
00:07:38.360 | the charismatic mode, the visionary mode,
00:07:40.860 | but also the demon and dark mode.
00:07:43.340 | - A quote you cited about Elon really stood out to me.
00:07:46.760 | I forget who it's from, but inside the man,
00:07:50.380 | he's still there as a child,
00:07:51.820 | the child standing in front of his dad.
00:07:53.900 | That was Talulah, his second wife.
00:07:57.140 | And she's great.
00:07:58.660 | She's an English actress.
00:08:00.620 | They've been married twice, actually.
00:08:02.620 | And Talulah said, that's just him from his childhood.
00:08:06.020 | He's a drama addict.
00:08:07.340 | Kimball says that as well.
00:08:09.620 | And I asked why.
00:08:12.380 | And he said, and Talulah said,
00:08:14.620 | for him, love and family are kind of associated
00:08:20.300 | with those psychological torments.
00:08:23.000 | And in many ways, he'll channel.
00:08:25.340 | I mean, Talulah would be with him in 2008
00:08:29.980 | when the company was going bankrupt,
00:08:31.420 | whatever it may have been, or later,
00:08:33.940 | and he would be so stressed, he would vomit.
00:08:37.940 | And then he would channel things that his father had said,
00:08:41.380 | use phrases his father had said to him.
00:08:44.300 | And so she told me, deep inside the man
00:08:47.500 | is this man-child still standing in front of his father.
00:08:51.860 | - To what degree is that true for many of us, do you think?
00:08:55.320 | - I think it's true, but in many different ways.
00:08:58.120 | I'll say something personal, which is,
00:09:01.480 | I was blessed, and perhaps it's a bit of a downside too,
00:09:06.000 | with the fact that I had the greatest father
00:09:07.400 | you could ever imagine, and mother.
00:09:09.360 | They were the kindest people you'd ever wanna meet.
00:09:11.640 | I grew up in a magical place in New Orleans.
00:09:14.040 | My dad was an engineer, an electrical engineer.
00:09:17.240 | And he was always kind.
00:09:21.780 | Perhaps I'm not quite as driven or as crazed.
00:09:26.340 | I don't have to prove things.
00:09:28.580 | So I get to write about Elon Musk.
00:09:30.580 | I get to write about Einstein or Steve Jobs
00:09:33.880 | or Leonardo da Vinci, who, as you know,
00:09:36.420 | was totally torn by demons and had different,
00:09:39.680 | difficult childhood situations,
00:09:41.640 | not even legitimized by his father.
00:09:44.960 | So sometimes those of us who are lucky enough
00:09:48.540 | to have really gentle, sweet childhood,
00:09:51.980 | we grow up with fewer demons,
00:09:54.780 | but we grow up with fewer drives,
00:09:57.080 | and we end up maybe being Boswell
00:10:00.780 | and not being Dr. Johnson.
00:10:02.740 | We end up being the observer, not being the doer.
00:10:07.540 | And so I always respect those who are in the arena.
00:10:11.620 | I don't, you know--
00:10:13.100 | - You don't see yourself as a man in the arena.
00:10:16.000 | - I've had a gentle, sweet career,
00:10:19.700 | and I've got to cover really interesting people.
00:10:23.740 | But I've never shot off a rocket
00:10:25.540 | that might someday get to Mars.
00:10:27.740 | I've never moved us into the era of electric vehicles.
00:10:30.560 | I've never stayed up all night on the factory floor.
00:10:34.540 | I don't have quite those,
00:10:38.500 | either the drives or the
00:10:45.060 | addiction to risk.
00:10:46.400 | I mean, Elon's addicted to risk.
00:10:48.920 | He's addicted to adventure.
00:10:51.200 | Me, if I see something that's risky,
00:10:53.960 | I spend some time calculating,
00:10:55.720 | okay, upside, downside here.
00:10:57.700 | But that's another reason that people like Elon Musk
00:11:03.680 | get stuff done, and people like me
00:11:06.920 | write about the Elon Musks.
00:11:09.400 | - One other aspect of this, given a difficult childhood,
00:11:13.160 | whether it's Elon or Da Vinci,
00:11:16.140 | I wonder if there's some wisdom,
00:11:20.240 | some advice almost that you can draw,
00:11:25.440 | that you can give to people with difficult childhoods.
00:11:29.240 | - I think all of us have demons,
00:11:31.060 | even those of us who grew up in a magical part
00:11:34.720 | of New Orleans with sweet parents.
00:11:36.720 | - Yes.
00:11:37.560 | - And we all have demons.
00:11:39.420 | And rule one in life is harness your demons.
00:11:44.420 | Know that you're ambitious or not ambitious,
00:11:47.660 | or you're lazy or whatever.
00:11:49.060 | Leonardo da Vinci knew he was a procrastinator.
00:11:52.680 | I think it's useful to know what's eating at you,
00:11:58.760 | know how to harness it.
00:12:01.940 | Also know what you're good at.
00:12:06.500 | I'll take Musk as another example.
00:12:10.040 | I'm a little bit more like Kimball Musk than Elon.
00:12:14.120 | I maybe got over-endowed with the empathy gene.
00:12:17.520 | And what does that mean?
00:12:19.560 | Well, it means that I was okay when I ran Time Magazine.
00:12:23.280 | It was a group of about 150 people on the editorial floors,
00:12:27.160 | and I knew them all, and we had a jolly time.
00:12:30.680 | When I went to CNN, I was not very good
00:12:34.220 | at being a manager or an executive of an organization.
00:12:38.780 | I cared a little bit too much
00:12:41.060 | that people didn't get annoyed at me or mad at me.
00:12:46.060 | And Elon said that about John McNeil, for example,
00:12:49.820 | who was president of Tesla.
00:12:51.700 | It's in the book.
00:12:52.540 | I talked to John McNeil a long time,
00:12:54.860 | and he says, "You know, Elon just would fire people.
00:12:59.540 | "He'd be really rough on people.
00:13:01.140 | "He didn't have the empathy for the people in front of him."
00:13:04.420 | And Elon says, "Yeah, that's right."
00:13:06.700 | And John McNeil couldn't fire people.
00:13:09.140 | He cared more about pleasing the people in front of him
00:13:12.680 | than pleasing the entire enterprise or getting things done.
00:13:16.700 | Being over-endowed with the desire to please people
00:13:19.780 | can make you less tough of a manager.
00:13:23.420 | And that doesn't mean there aren't great people
00:13:27.180 | who are over-endowed.
00:13:28.100 | Ben Franklin, over-endowed with the desire to please people.
00:13:32.420 | The worst criticism of him from John Adams and others
00:13:36.280 | was that he was insinuating, which kind of meant
00:13:40.260 | he was always trying to get people to like him.
00:13:44.000 | But that turned out to be a good thing.
00:13:46.700 | When they can't figure out the big state, little state issue
00:13:49.180 | at the Constitutional Convention,
00:13:50.940 | when they can't figure out the Treaty of Paris,
00:13:52.960 | whatever it is, he brings people together,
00:13:56.240 | and that is his superpower.
00:13:59.500 | So to get back to the lessons you asked,
00:14:02.020 | and the first was harness your demons,
00:14:05.720 | the second is to know your strengths and your superpower.
00:14:10.600 | My superpower is definitely not being a tough manager.
00:14:14.660 | After running CNN for a while, I said, okay,
00:14:17.540 | I think I've proven I don't really enjoy this
00:14:19.940 | or know how to do this well.
00:14:21.400 | Do I have other talents?
00:14:25.380 | Yeah, I think I have the talent to observe people
00:14:27.940 | really closely, to write about it in a straight,
00:14:32.460 | but I hope interesting narrative style.
00:14:34.680 | That's a power.
00:14:35.680 | It's totally different from running an organization.
00:14:38.020 | It took me three years of running CNN
00:14:41.280 | to realize I'm not cut out to be an executive
00:14:45.280 | in really high-intensity situations.
00:14:50.280 | Elon Musk is cut out to be an executive
00:14:54.960 | in highly intense situations, so much so
00:14:57.940 | that when things get less intense,
00:15:00.060 | when they actually are making enough cars
00:15:02.120 | and rockets are going up and landing,
00:15:06.280 | he thinks of something else so he can surge
00:15:08.600 | and have more intensity.
00:15:09.680 | He's addicted to intensity, and that's his superpower,
00:15:14.680 | which is a lot greater than the superpower
00:15:17.400 | of being a good observer.
00:15:18.800 | - But I think also to build on that,
00:15:22.040 | it's not just addiction to risk and drama.
00:15:27.000 | There's always a big mission above it.
00:15:30.880 | So I would say it's an empathy
00:15:34.600 | towards people in the big picture.
00:15:39.200 | - It's an empathy towards humanity--
00:15:41.400 | - Humanity.
00:15:42.240 | - More than the empathy towards the three or four humans
00:15:45.120 | who might be sitting in the conference room with you,
00:15:47.720 | and that's a big deal.
00:15:49.660 | You see that in a lot of people.
00:15:51.200 | You see it in Bill Gates or Larry Summers, Elon Musk.
00:15:58.300 | They always have empathy for these great goals of humanity,
00:16:03.300 | and at times they can be clueless about the emotions
00:16:08.100 | of the people in front of them, or callous sometimes.
00:16:12.220 | Musk, as you said, is driven by mission
00:16:15.500 | more than any person I've ever seen.
00:16:18.660 | And it's not only mission, it's like cosmic missions,
00:16:23.260 | meaning he's got three really big missions.
00:16:26.740 | One is to make humans a space-faring civilization,
00:16:31.740 | make us multi-planetary, or get us to Mars.
00:16:36.420 | Number two is to bring us into the era of sustainable energy,
00:16:40.620 | to bring us into the era of electric vehicles
00:16:43.060 | and solar roofs and battery packs.
00:16:46.940 | And third is to make sure that artificial intelligence
00:16:51.060 | is safe and is aligned with human values.
00:16:54.500 | And every now and then I'd talk to him,
00:16:56.500 | and we'd be talking about Starlink satellites or whatever,
00:17:00.540 | or he would be pushing the people in front of him
00:17:03.460 | at SpaceX and saying, "If you do this,
00:17:05.620 | "we'll never get to Mars in our lifetime."
00:17:07.700 | And then he would give the lecture how important it was
00:17:10.420 | for human consciousness to get to Mars in our lifetime.
00:17:13.900 | And I'm thinking, okay, this is the pep talk
00:17:16.260 | of somebody trying to inspire a team,
00:17:18.700 | or maybe it's a type of pontification you do on a podcast.
00:17:23.540 | But on like the 20th time I watched him,
00:17:26.220 | I realized, okay, I believe it.
00:17:28.300 | He actually is driven by this.
00:17:31.580 | - He is frustrated and angry that because
00:17:35.820 | of this particular minor engineering decision,
00:17:38.860 | the big mission is not going to be accomplished.
00:17:41.900 | It's not a pep talk, it's a literal frustration.
00:17:44.860 | - An impatience, a frustration,
00:17:47.700 | and it's also just probably
00:17:52.700 | the most deeply ingrained thing in him
00:17:55.700 | is his mission.
00:17:57.140 | He joked at one point to me about how much
00:18:00.460 | he loved reading comics as a kid,
00:18:03.420 | and he said, "All the people in the comic books,
00:18:05.900 | "they're trying to save the world,
00:18:07.500 | "but they're wearing their underpants on the outside,
00:18:09.800 | "and they look ridiculous."
00:18:10.980 | And then he paused and said,
00:18:12.460 | "But they are trying to save the world."
00:18:14.820 | And whether it's Starlink in Ukraine,
00:18:17.760 | or Starship going to Mars,
00:18:20.960 | or trying to get a global new Tesla,
00:18:25.300 | I think he's got this epic sense
00:18:28.620 | of the role he's gonna play
00:18:30.780 | in helping humanity on big things.
00:18:33.400 | And like the characters in the comic books,
00:18:39.180 | it's sometimes ridiculous,
00:18:41.340 | but it also is sometimes true.
00:18:43.940 | - When I was reading this part of the book,
00:18:46.100 | I was thinking of all the young people
00:18:49.980 | who are struggling in this way,
00:18:52.180 | and I think a lot of people are in different ways,
00:18:54.740 | whether they grow up without a father,
00:18:56.060 | whether they grow up with physical, emotional,
00:18:58.500 | mental abuse, or demons of any kind, as you talked about.
00:19:02.140 | And it's really painful to read,
00:19:04.420 | but also really damn inspiring
00:19:06.500 | that if you sort of walk side by side with those demons,
00:19:12.400 | if you don't let that pain break you,
00:19:18.180 | or somehow channel it, if you can put it this way,
00:19:20.840 | that you can achieve, you can do great things in this world.
00:19:23.660 | - Well, that's an epic view of why we write biography,
00:19:28.300 | which is more epic than I'd even thought of.
00:19:31.660 | So I say thank you,
00:19:33.440 | because in some ways what you're trying to do is say,
00:19:36.740 | okay, I mean, Leonardo, you talk about being a misfit,
00:19:41.460 | he's born illegitimate in the village of Vinci,
00:19:45.100 | and he's gay, and he's left-handed, and he's distracted,
00:19:49.020 | and his father won't legitimize him.
00:19:52.420 | And then he wanders off to the town of Florence,
00:19:57.420 | and he becomes the greatest artist and engineer
00:20:01.300 | of that part of the Renaissance.
00:20:03.820 | I hope this book inspires.
00:20:06.980 | Jennifer Doudna, the gene editing pioneer
00:20:10.500 | who helped discover CRISPR, the gene editing tool,
00:20:14.180 | which is in my book, The Code Breaker,
00:20:17.260 | she grew up feeling like a misfit, you know,
00:20:19.580 | in Hawaii, in a Polynesian village,
00:20:22.460 | being the only white person,
00:20:24.580 | and also trying to live up to a father who pushed her.
00:20:27.700 | So if people can read the books,
00:20:31.140 | and I should've said about Jennifer Doudna,
00:20:33.100 | my point was that she was told
00:20:35.220 | by her school guidance counselor,
00:20:36.780 | no, girls don't do science.
00:20:38.700 | You know, science is not for girls.
00:20:40.080 | You're not gonna do math or science.
00:20:42.220 | And so it pushes her to say,
00:20:44.340 | all right, I'm gonna do math and science.
00:20:47.260 | - Just to interrupt real quick,
00:20:48.860 | but Jennifer Doudna, you've written an amazing book
00:20:51.820 | about her, Nobel Prize winner, CRISPR developer,
00:20:55.380 | just incredible, one of the great scientists
00:20:57.140 | in the 21st century.
00:20:58.460 | - Right, and I'm talking about,
00:21:00.260 | when Jennifer Doudna was young,
00:21:02.320 | and she felt really, really out of place,
00:21:04.740 | like you and me and a lot of people
00:21:07.460 | when they're feeling that way, they read books.
00:21:09.660 | They go into, they curl up with a book.
00:21:11.980 | So her father drops a book on her bed
00:21:14.020 | called The Double Helix, the book by James Watson
00:21:16.740 | on the discovery of the structure of DNA
00:21:19.900 | by him and Rosalind Franklin and Francis Crick.
00:21:23.820 | And she realizes, oh my God,
00:21:27.620 | girls can become scientists.
00:21:29.240 | My school guidance counselor's wrong.
00:21:31.200 | So I think books,
00:21:37.020 | like she read this book, and even if it's a comic book,
00:21:40.180 | like Elon Musk read, books can sometimes inspire you.
00:21:45.700 | And every one of my books is about people
00:21:48.300 | who were totally innovative, who weren't just smart,
00:21:52.340 | 'cause none of us are gonna be able to match Einstein
00:21:55.460 | in mental processing power,
00:21:57.740 | but we can be as curious as he was,
00:22:01.060 | and creative, and think out of the box the way he did,
00:22:04.300 | or Steve Jobs put it, think different.
00:22:07.020 | And so I hope with my books, I'm saying,
00:22:10.360 | this isn't a how-to guide,
00:22:13.860 | but this is somebody you can walk alongside.
00:22:16.600 | You can see Einstein growing up Jewish in Germany.
00:22:21.060 | You can see Jennifer Doudna growing up as an outsider,
00:22:25.660 | or Leonardo da Vinci, or Elon Musk,
00:22:29.660 | in really violent South Africa
00:22:32.280 | with a psychologically difficult father,
00:22:35.320 | and getting off the train when he goes
00:22:37.420 | to an anti-apartheid concert with his brother,
00:22:39.940 | and there's a man with a knife sticking out of his head,
00:22:44.340 | and they step into the pool of blood,
00:22:46.100 | and it's sticky on their soles.
00:22:48.180 | This causes scars that last the rest of your life,
00:22:53.180 | and the question is not how do you avoid getting scarred,
00:22:59.020 | it's how do you deal with it.
00:23:01.500 | - Einstein, too, one of my,
00:23:06.140 | it's hard to pick my favorite of your biographies,
00:23:10.180 | but Einstein, I mean, you really paint a picture
00:23:13.700 | of another, I don't wanna call him a misfit,
00:23:16.460 | but a person who doesn't necessarily
00:23:19.300 | have a standard trajectory through life of success.
00:23:23.220 | - Absolutely.
00:23:25.500 | - And that's extremely inspiring.
00:23:28.320 | I don't know exactly what question to ask.
00:23:31.400 | There's a million.
00:23:32.240 | - Well, I'll talk about the misfit for a second,
00:23:33.980 | 'cause we talked about Leonardo being that way.
00:23:36.780 | Einstein's Jewish in Germany at a time
00:23:40.100 | when it starts getting difficult.
00:23:42.420 | He's slow in learning how to talk,
00:23:44.660 | and he's a visual thinker,
00:23:46.180 | so he's always daydreaming and imagining things.
00:23:49.240 | The first time he applies to the Zurich Polytech,
00:23:52.940 | 'cause he runs away from the German education system
00:23:56.620 | 'cause it's too much learning by rote,
00:23:59.340 | he gets rejected by the Zurich Polytech.
00:24:02.220 | Now, it's the second best school in Zurich,
00:24:04.020 | and they're rejecting Einstein.
00:24:05.180 | I tried to find, but couldn't,
00:24:06.620 | the name of the admissions counselor
00:24:08.900 | at the Zurich Polytech.
00:24:10.740 | Like, you rejected Einstein?
00:24:12.300 | And then he doesn't finish in the top half of his class,
00:24:16.020 | and once he does and he goes to graduate school,
00:24:20.740 | they don't accept his dissertation, so he can't get a job.
00:24:23.980 | He's not teaching it.
00:24:25.420 | He even tries about 14 different high schools,
00:24:28.660 | a gymnasium, to get a job, and they won't take him.
00:24:32.060 | So he's a third-class examiner
00:24:33.820 | in the Swiss Patent Office in 1905.
00:24:36.940 | Third class 'cause they've rejected his doctoral dissertation
00:24:41.460 | and so he can't be second class or first class
00:24:44.180 | 'cause he doesn't have a doctoral degree,
00:24:46.260 | and yet he's sitting there on the stool
00:24:47.860 | in the Patent Office in 1905
00:24:50.540 | and writes three papers that totally transform science.
00:24:55.540 | And if you're thinking about being misunderstood
00:25:00.440 | or unappreciated, in 1906, he's still a third-class patent examiner.
00:25:03.700 | In 1907, he still is.
00:25:05.260 | It takes until 1909 before people realize
00:25:08.800 | that this notion of the theory of relativity
00:25:11.200 | might be correct and it might upend
00:25:12.980 | all of Newtonian physics.
00:25:15.300 | - How is it possible for three of the greatest papers
00:25:18.980 | in the history of science to be written in one year
00:25:21.580 | by this one person?
00:25:22.500 | Is there some insights, wisdoms you draw?
00:25:26.780 | - Plus, he had a day job as a patent examiner,
00:25:29.340 | and there's really three papers,
00:25:30.520 | but there's also an addendum
00:25:32.440 | 'cause once you figure out quantum theory
00:25:34.600 | and then you figure out relativity
00:25:36.760 | and you're understanding Maxwell's equations
00:25:39.140 | and the speed of light, he does a little addendum.
00:25:43.300 | That's the most famous equation in all of physics,
00:25:46.060 | which is E equals MC squared.
00:25:49.000 | So it's a pretty good year.
00:25:51.640 | It partly starts because he's a visual thinker,
00:25:54.860 | and I think it was helpful that he was at the Patent Office
00:25:57.960 | rather than being the acolyte of some professor
00:26:02.180 | at the Academy where he was supposed to follow the rules.
00:26:05.740 | And so at the Patent Office,
00:26:06.960 | they're doing devices to synchronize clocks
00:26:09.460 | 'cause the Swiss have just gone on standard time zones,
00:26:12.380 | and Swiss people, as you know, tend to be rather Swiss.
00:26:15.620 | They care if it strikes the hour in Basel,
00:26:18.540 | it should do the same in Bern at the exact instant.
00:26:21.280 | So you have to send a light signal
00:26:22.800 | between two distant clocks, and he's visualizing.
00:26:26.200 | What's it look like to ride alongside a light beam?
00:26:29.940 | He says, well, if you catch up with it,
00:26:31.900 | if you go almost as fast, it'll look stationary,
00:26:33.900 | but Maxwell's equations don't allow for that.
00:26:36.780 | And he said, it was making my palms sweat
00:26:39.300 | that I was so worried.
00:26:40.900 | And so he finally figures out,
00:26:42.820 | because he's looking at these devices to synchronize clocks,
00:26:46.100 | that if you're traveling really, really fast,
00:26:48.980 | what looks synchronous to you or synchronized to you
00:26:52.580 | is different than for somebody traveling really fast
00:26:55.300 | in the other direction, and he makes a mental leap
00:26:58.320 | that time, that the speed of light's always constant,
00:27:02.200 | but time is relative depending on your state of motion.
00:27:05.200 | So it was that type of out-of-the-box thinking,
00:27:07.840 | those leaps, that made 1905 his miracle year.
00:27:12.700 | Likewise with Musk.
00:27:15.020 | I mean, after General Motors and Ford,
00:27:18.060 | everybody gives up on electric vehicles,
00:27:21.280 | to just say, I know how we're going to have a path
00:27:25.560 | to change the entire trajectory of the world
00:27:28.720 | into the era of electric vehicles.
00:27:31.360 | And then when he comes back from Russia,
00:27:32.880 | where he tried to buy a little rocket ship
00:27:35.240 | so he could send an experimental greenhouse to Mars,
00:27:39.000 | and they were poking fun of him,
00:27:41.640 | and actually spit on him at one point in a drunken lunch.
00:27:45.220 | This is very fortuitous, because on the ride back home,
00:27:49.040 | on the Delta Airlines flight,
00:27:52.660 | he's like doing the calculations of how much materials,
00:27:56.040 | how much metal, how much fuel,
00:27:58.440 | how much would it really cost?
00:28:00.520 | And so he's visualizing things that other people
00:28:05.520 | would just say is impossible.
00:28:08.400 | It's what Steve Jobs's friends called
00:28:11.360 | the reality distortion field, and it drove people crazy.
00:28:15.260 | It drove them mad, but it also drove them
00:28:17.560 | to do things they didn't think they would be able to do.
00:28:20.360 | - You said visual thinking.
00:28:21.880 | I wonder if you've seen parallels of the different styles
00:28:25.400 | and kinds of thinking that,
00:28:28.160 | that operate the minds of these people.
00:28:34.480 | So is there parallels you see between Elon, Steve Jobs,
00:28:39.480 | Einstein, Da Vinci, specifically in how they think?
00:28:44.480 | - I think they were all visual thinkers,
00:28:46.480 | perhaps coming from slight handicaps as children,
00:28:49.320 | meaning Leonardo was left-handed
00:28:52.360 | and a little bit dyslexic, I think.
00:28:54.160 | And certainly Einstein had echolalia,
00:28:58.320 | he would repeat things, he was slow in learning to talk.
00:29:02.320 | So I think visualizing helps a lot.
00:29:06.760 | And with Musk, I see it all the time
00:29:10.360 | when I'm walking the factory lines with him
00:29:12.960 | or in product development, where he'll look
00:29:16.960 | at, say, the heat shield under the Raptor engine
00:29:20.120 | of a Starship booster, and he'll say,
00:29:24.240 | "Why does it have to be this way?
00:29:26.200 | "Couldn't we trim it this way?
00:29:27.320 | "Or make it, or even get rid of this part of it?"
00:29:30.880 | And he can visualize the material science.
00:29:33.040 | There's small anecdotes in my book,
00:29:35.360 | but at one point he's on the Tesla line
00:29:38.920 | and they're trying to get 5,000 cars a week in 2018.
00:29:43.000 | It's a life or death situation.
00:29:45.280 | And he's looking at the machines
00:29:47.760 | that are bolting something to the chassis.
00:29:50.680 | And he insists that Drew Baglino, not Drew,
00:29:53.080 | that Lars Moravy, one of his great lieutenants,
00:29:55.800 | come and they have to summon him.
00:29:58.000 | And he says, "Why are there six bolts here?"
00:30:00.900 | And Lars and others explain, "Well, for the crash test
00:30:05.880 | "or anything else, the pressure would be in this way,
00:30:09.140 | "so you have to," and they went, "Blah, blah, blah, blah."
00:30:11.920 | And he said, "No, if you visualize it,
00:30:14.720 | "you'll see if there's a crash,
00:30:16.880 | "the force would go this way and that way,
00:30:20.900 | "and it could be done with four bolts."
00:30:23.920 | Now, that sounds risky, and they go test it
00:30:26.840 | and they engineer it, but it turns out to be right.
00:30:28.920 | I know that seems minor, but I could give you 500 of those
00:30:33.080 | where in any given day, he's visualizing the physics
00:30:38.080 | of an engineering or manufacturing problem.
00:30:42.640 | That sounds pretty mundane.
00:30:44.880 | But for me, if you say what makes him special,
00:30:48.600 | there's a mission-driven thing,
00:30:50.000 | I'll give you a lot of reasons,
00:30:52.500 | but one of the reasons is he cares not just about
00:30:56.840 | the design of the product, but visualizing the manufacturing
00:31:01.480 | of the product, the machine that makes the machine.
00:31:05.160 | And that's what we failed to do in America
00:31:07.280 | for the past 40 years, we outsourced so much manufacturing.
00:31:10.860 | I don't think you can be a good innovator
00:31:13.800 | if you don't know how to make the stuff you're designing.
00:31:16.960 | And that's why Musk puts his designer's desk
00:31:20.560 | right next to the assembly lines in the factories,
00:31:23.560 | so that they have to visualize what they drew
00:31:27.680 | as it becomes the physical object.
00:31:30.120 | - So understanding everything from the physics
00:31:32.160 | all the way up to the software, it's like end to end.
00:31:35.400 | - Well, having an end to end control is important,
00:31:39.800 | certainly with Steve Jobs.
00:31:41.240 | I'm looking at my iPhone here, it's a big deal.
00:31:44.640 | That hardware only works with Apple software,
00:31:47.800 | and for a while, the iTunes Store,
00:31:50.320 | and only what works, you know.
00:31:52.720 | So he has an end to end that makes it like
00:31:55.400 | a Zen garden in Kyoto, very carefully curated,
00:31:58.960 | but a thing of beauty.
00:32:00.120 | For Musk, when he first was at Tesla,
00:32:04.480 | and before he was the CEO,
00:32:06.440 | when he was just the executive chairman
00:32:08.160 | and basically the finance person funding it,
00:32:11.700 | they were outsourcing everything.
00:32:13.880 | They were making the batteries in Japan,
00:32:16.000 | and the battery pack would be at some barbecue shop
00:32:18.600 | in Thailand, and then sent to the Lotus factory in England
00:32:22.480 | to be put into a Lotus Elise chassis,
00:32:25.840 | and then, that was a nightmare.
00:32:28.800 | You did not have end to end control
00:32:31.640 | of the manufacturing process.
00:32:33.440 | So he goes to the other extreme.
00:32:35.240 | He gets a factory in Fremont from Toyota,
00:32:38.880 | and he wants to do everything in house.
00:32:41.400 | The software, in house.
00:32:43.360 | The painting, in house.
00:32:44.960 | He makes his own batteries,
00:32:51.640 | and I think that end to end control
00:32:54.020 | is part of his personality.
00:32:56.020 | I mean, there's a, but it also would allow Tesla
00:33:01.020 | to be innovative.
00:33:03.880 | - Yeah, I got to see and understand in detail
00:33:08.800 | one example of that, which is the development
00:33:12.860 | of the brain of the car in autopilot,
00:33:15.960 | going from Mobileye to in house building
00:33:19.560 | the autopilot system to basically getting rid
00:33:23.960 | of all sensors that are not rich in data
00:33:28.960 | to make it AI friendly, sort of saying
00:33:32.520 | that we can do it all with vision,
00:33:34.360 | and like you said, removing some of the bolts.
00:33:36.880 | So sometimes it's small things,
00:33:38.460 | but sometimes it's really big things,
00:33:40.160 | like getting rid of radar.
00:33:41.760 | - Well, vision only.
00:33:43.080 | Getting rid of radar is huge, and everybody's against it.
00:33:46.800 | Everybody, and they're still fighting it a bit.
00:33:48.920 | They're still trying to do in next generation
00:33:51.260 | some form of radar, but it gets back
00:33:54.300 | to the first principles.
00:33:55.840 | We're talking about visualizing.
00:33:57.380 | Well, he starts with the first principles,
00:34:01.220 | and the first principles of physics involve things
00:34:04.300 | like, well, humans drive with only visual input.
00:34:08.700 | They don't have radar.
00:34:09.700 | They don't have LIDAR.
00:34:10.620 | They don't have sonar, and so there is no reason
00:34:15.120 | in the laws of physics that make it
00:34:17.900 | so that vision only won't be successful
00:34:21.340 | in creating self-driving.
00:34:23.300 | Now, that becomes an article of faith to him,
00:34:26.660 | and he gets a lot of pushback, but now,
00:34:31.020 | and he's, by the way, not been that successful
00:34:33.320 | in meeting his deadlines of getting self-driving.
00:34:36.360 | He's way too optimistic, but it was that first principles
00:34:41.100 | of get rid of unnecessary things.
00:34:44.380 | Now, you would think LIDAR, why not use it?
00:34:46.880 | Like, why not use a crutch?
00:34:48.100 | It's like, yeah, we can do things vision only,
00:34:50.260 | but when I look at the stars at night,
00:34:52.220 | I use a telescope too.
00:34:54.380 | Well, you could use LIDAR, but you can't do millions
00:34:57.500 | of cars that way at scale.
00:34:59.320 | At a certain point, you have to make it
00:35:01.560 | not only a good product, but a product that goes to scale,
00:35:05.300 | and you can't make it based on maps like Google Maps
00:35:07.980 | 'cause it'll never be able to, you know,
00:35:10.260 | then drive from New Orleans to Slidell
00:35:13.340 | where I wanna go when it's too hot in New Orleans.
00:35:17.360 | Take, for example, full self-drive.
00:35:20.120 | He has been obsessed with what he calls the robo-taxi.
00:35:24.240 | We're gonna build the next generation car
00:35:26.120 | without a steering wheel, without pedals,
00:35:29.240 | because it's gonna be full self-drive.
00:35:31.080 | You just summon it.
00:35:32.240 | You won't need to drive it.
00:35:34.100 | Well, over and over again, all these people I've told you
00:35:36.920 | about, you know, Lars Moravy and Drew Baglino and others,
00:35:40.080 | they're saying, okay, fine, that sounds really good,
00:35:42.120 | but, you know, it ain't happened yet.
00:35:45.520 | We need to build a $25,000 mass-market global car
00:35:50.280 | that's just normal with a steering wheel.
00:35:52.640 | And yeah, he finally turned around a few months ago
00:35:57.280 | and said, let's do it.
00:35:59.040 | And then he starts focusing on
00:36:00.880 | how's the assembly line gonna work?
00:36:02.340 | How are we gonna do it?
00:36:03.320 | And make it the same platform for robo-taxi
00:36:06.000 | so you can have the same assembly line.
00:36:07.640 | Likewise, for full self-drive,
00:36:10.320 | they were doing it by coding
00:36:12.200 | hundreds of thousands of lines of code
00:36:15.400 | that would say things like, if you see a red light, stop.
00:36:18.040 | If there's a blinking light,
00:36:19.120 | if there are two yellow lines, do this.
00:36:21.600 | If there's a bike lane, do this.
00:36:23.020 | If there's a crosswalk, do that.
00:36:25.040 | Well, that's really hard to do.
00:36:27.320 | Now he's doing it through artificial intelligence
00:36:30.400 | and machine learning only.
00:36:32.240 | FSD 12 will be based on the billion or so frames
00:36:38.080 | of video coming in from Tesla drivers each week
00:36:40.760 | and saying, what happened when a human was in this situation?
00:36:43.440 | What did the human do?
00:36:44.640 | And let's only pick the best humans,
00:36:46.200 | the five-star drivers, the Uber drivers, as Elon says.
00:36:50.080 | And so that's him changing his mind
00:36:54.840 | and going to first principles, but saying,
00:36:57.960 | all right, I'm even gonna change full self-driving
00:37:00.880 | so that it's not rules-based.
00:37:02.480 | It becomes AI-based, just like ChatGPT
00:37:06.860 | doesn't try to answer your question,
00:37:08.760 | who are the five best popes or something,
00:37:11.520 | by rules; ChatGPT does it by having ingested billions
00:37:16.520 | of pieces of writing that people have done.
00:37:22.880 | This will be AI, but real world, done by ingesting video.
00:37:26.720 | - Sometimes it feels like he and others,
00:37:31.080 | they're building things in this world successfully,
00:37:33.400 | are basically confidently exploring a dark room
00:37:37.140 | with a very confident, ambitious vision
00:37:41.460 | of what that room actually looks like.
00:37:43.460 | They're just walking straight into the darkness.
00:37:48.980 | There's no painful toys or Legos on the ground.
00:37:52.100 | I'm just going to walk.
00:37:53.140 | I know exactly how far the wall is.
00:37:55.420 | And then very quickly willing to adjust
00:37:58.180 | as they run into, they step on the Lego
00:38:01.180 | and their body is filled with a lot of pain.
00:38:04.780 | What I mean by that is there's this kind of evolution
00:38:07.720 | that seems to happen where you discover really good ideas
00:38:11.840 | along the way that allow you to pivot.
00:38:14.360 | Like to me, since a few years ago,
00:38:19.360 | when you could see with Andrej Karpathy,
00:38:22.920 | the software 2.0 evolution of autopilot,
00:38:26.280 | it became obvious to me that this is not about the car.
00:38:30.480 | This is about Optimus, the robot.
00:38:32.760 | If we look back 100 years from now,
00:38:37.400 | the car will be remembered as a cool car,
00:38:40.840 | nice transportation, but the autopilot won't be
00:38:44.140 | the thing that controls the car.
00:38:45.820 | It will be the thing that allows embodied AI systems
00:38:50.420 | to understand the world, so broadly.
00:38:54.040 | And so that kind of approach,
00:38:55.820 | and you kind of stumble into it.
00:38:57.880 | Will Tesla be a car company?
00:39:01.360 | Will it be an AI company?
00:39:03.040 | Will it be a robotics company?
00:39:04.920 | Will it be a home robotics company?
00:39:06.800 | Will it be an energy company?
00:39:08.240 | And you kind of slowly discover this
00:39:10.840 | as you confidently push forward with a vision.
00:39:15.840 | So it's interesting to watch that kind of evolution,
00:39:20.080 | as long as it's backed by this confidence.
00:39:22.560 | - There are a couple of things that are required for that.
00:39:25.540 | One is being adventurous.
00:39:27.280 | One doesn't enter a dark room without a flashlight
00:39:29.800 | and a map, unless you're a risk taker,
00:39:31.920 | unless you're adventurous.
00:39:33.480 | The second is to have iterative brain cycles,
00:39:37.880 | where you can process information
00:39:40.200 | and do a feedback loop and make it work.
00:39:43.480 | The third, and this is what we failed to do a lot
00:39:47.160 | in the United States and perhaps around the world,
00:39:50.560 | is when you take risks, you have to realize
00:39:55.160 | you're gonna blow things up.
00:39:57.360 | You know, the first three rockets, the Falcon rockets
00:40:01.600 | that Musk does, they blow up.
00:40:04.440 | Even Starship, three and a half minutes,
00:40:06.480 | but then it blows up the first time.
00:40:08.720 | So I think Boeing and NASA and others
00:40:11.960 | have become unwilling to enter your dark room
00:40:15.900 | without knowing exactly where the exit is
00:40:19.000 | and the lighted path to the exit.
00:40:21.880 | And the people who created America,
00:40:24.360 | whenever they came over, you know,
00:40:26.640 | whether on the Mayflower or as refugees from the Nazis,
00:40:31.240 | they took a lot of risks to get here.
00:40:33.800 | And now I think we have more referees
00:40:38.800 | than we have risk takers.
00:40:41.100 | More lawyers and regulators and others saying,
00:40:46.080 | you can't do that, that's too risky,
00:40:48.060 | than people willing to innovate.
00:40:50.040 | And you need both.
00:40:51.820 | I think you're also right on 50, 100 years from now,
00:40:56.340 | what Musk will be most remembered for besides space travel
00:41:02.280 | is real world AI.
00:41:04.740 | Not just Optimus the Robot, but Optimus the Robot
00:41:09.080 | and the self-driving car, they're pretty much the same.
00:41:13.480 | They're using, you know, GPU clusters
00:41:18.400 | or dojo chips or whatever it may be
00:41:20.420 | to process real world data.
00:41:22.560 | We all got, and you did on your podcast,
00:41:24.920 | quite excited about large language models,
00:41:28.280 | you know, generative, predictive text AI.
00:41:32.960 | That's fine, especially if you wanna
00:41:35.120 | chit chat with your chat bot.
00:41:37.520 | But the holy grail is artificial general intelligence.
00:41:41.320 | And the tough part of that is real world AI.
00:41:45.360 | And that's where Optimus the Robot
00:41:47.560 | or full self-drive are, I think,
00:41:52.560 | far ahead of anybody else.
00:41:57.520 | - Well, I like how you said chit chat.
00:42:00.340 | I would say, for one of the greatest writers ever,
00:42:05.360 | it's funny that you spoke about language
00:42:08.560 | and the mastery of languages as merely chit chat.
00:42:11.840 | You know, people have fallen in love over some words.
00:42:14.720 | People have gone to wars over some words.
00:42:16.780 | I think words have a lot of power.
00:42:18.660 | It's actually an interesting question
00:42:20.680 | where the wisdom of the world,
00:42:22.400 | the wisdom of humanity is in the words
00:42:24.280 | or is it in visuals, is it in the physical?
00:42:27.800 | I don't really-- - It's in mathematics.
00:42:30.440 | - It might, maybe it all boils down to math
00:42:32.440 | and in the end, this kind of discussion
00:42:34.040 | about real world AI versus language is all the same, maybe.
00:42:39.040 | I've gotten a chance to hang out quite a bit
00:42:42.440 | in the metaverse with Mr. Mark Zuckerberg recently.
00:42:45.860 | And boy, is the realism in there.
00:42:50.520 | Like, the thing that's coming up in the future is incredible.
00:42:55.120 | I got scanned in Pittsburgh for 10 hours
00:43:00.120 | into the metaverse and there's like a virtual version of me
00:43:06.440 | and I got to hang out with that virtual version.
00:43:09.360 | - Do you like yourself?
00:43:10.820 | - Well, I never like myself.
00:43:13.960 | But it was easier to like that other guy.
00:43:18.220 | That was interesting.
00:43:19.060 | - Did he like you?
00:43:20.100 | - He didn't seem to care much.
00:43:22.800 | - That's the lack of the empathy.
00:43:25.340 | - But that was, it made me start to question
00:43:31.140 | even more than before, like,
00:43:33.300 | well, how important is this physical reality?
00:43:36.020 | Because I got to see myself and other people
00:43:41.100 | in that metaverse, like the details of the face,
00:43:44.440 | all the things that you think,
00:43:49.440 | maybe if you look at yourself in the mirror,
00:43:51.440 | are imperfections, all this kind of stuff.
00:43:53.920 | When I was looking at myself and at others,
00:43:55.960 | all those things were beautiful.
00:43:57.200 | And it was like, it was real and it was intense
00:44:02.200 | and it was scary because you're like,
00:44:05.320 | well, are you allowed to murder people in the metaverse?
00:44:08.400 | 'Cause like, are you allowed to,
00:44:10.680 | 'cause what are you allowed to do?
00:44:12.380 | Because you can replicate a lot of those things.
00:44:14.520 | And you start to question what are the fundamental things
00:44:18.580 | that make life worth living here as we know as humans.
00:44:21.500 | - Have you talked to Elon about his views
00:44:24.020 | of we're living in a simulation maybe
00:44:26.440 | and how you would figure out if that's true?
00:44:28.820 | - Yes, there's a constant lighthearted
00:44:31.100 | but also a serious sense that this is all a bit of a game.
00:44:36.340 | One of my theories on Elon, a minor theory,
00:44:40.040 | is that he read Hitchhiker's Guide to the Galaxy
00:44:42.520 | once too often.
00:44:43.680 | (laughs)
00:44:46.280 | And as you know, there's a scene in there that says
00:44:48.780 | that there's a theory about the universe
00:44:53.080 | that if anybody ever discovers the secrets
00:44:55.680 | of meanings of the universe,
00:44:57.320 | it will be replaced by an even more complex universe.
00:45:00.960 | And then the next line Douglas Adams writes is,
00:45:03.680 | there's another theory that this has already happened.
00:45:07.040 | So I'm gonna try to get my head around that,
00:45:09.200 | but I know that Elon Musk tries to.
00:45:11.720 | - Well, there's a humor to that.
00:45:13.880 | - There's an enormous humor to Hitchhiker's Guide.
00:45:16.800 | And I really think that helped Musk
00:45:18.280 | out of the darkest of his periods
00:45:20.600 | to have sort of the sense of fun
00:45:23.200 | of figuring out what life is all about.
00:45:25.040 | - I wonder if this is a small aside we could say,
00:45:27.040 | just having gotten to know Elon very well,
00:45:29.920 | like the silliness, the willingness to engage
00:45:34.720 | in the absurdity of it all and have fun.
00:45:37.980 | What is that?
00:45:39.760 | Is that just a quirk of personality
00:45:43.240 | or is that a fundamental aspect of a human
00:45:46.200 | who's running six plus companies?
00:45:48.120 | - Well, it's a relief valve,
00:45:49.880 | just like the video games Polytopia and Elden Ring
00:45:53.080 | are release valves for him.
00:45:54.840 | And he does have an explosive sense of humor.
00:45:59.720 | As you know.
00:46:00.800 | And the weird thing is when he makes the abrupt transition
00:46:05.200 | from dark demon mode and you're in a conference room
00:46:09.000 | and he has really become upset about something.
00:46:12.480 | And not only are there dark vibes,
00:46:14.360 | but there are dark words emanating
00:46:16.240 | and he's saying your resignation will be accepted
00:46:19.480 | if you die, you know, et cetera.
00:46:21.640 | And then something pops and he pulls out his phone
00:46:26.400 | and pulls up a Monty Python skit,
00:46:29.000 | you know, like the Ministry of Silly Walks
00:46:31.960 | or whichever John Cleese sketch it was.
00:46:33.840 | And he starts laughing again and things break.
00:46:36.840 | So it's almost as if he has different modes,
00:46:41.840 | the emulation of human mode, the engineering mode,
00:46:46.080 | the dark and demon mode,
00:46:48.600 | and certainly there is the silly and giddy mode.
00:46:51.320 | - Yeah, you've actually opened the Elon book
00:46:56.360 | with quotes from Elon and from Steve Jobs.
00:46:59.800 | So Elon's quote is, "To anyone I've offended,
00:47:02.040 | "I just wanna say," this is on SNL,
00:47:04.440 | "I just wanna say I reinvented electric cars
00:47:06.760 | "and I'm sending people to Mars on a rocket ship.
00:47:10.000 | "Did you also think I was going to be a chill, normal dude?"
00:47:13.080 | And then the quote from Steve Jobs, of course,
00:47:15.000 | is, "The people who are crazy enough to think
00:47:17.720 | "they can change the world are the ones who do."
00:47:21.280 | So what do you think is the role of,
00:47:24.920 | the old madness and genius,
00:47:28.480 | what do you think the role of crazy in this?
00:47:30.440 | - Well, first of all, let's both stipulate
00:47:33.560 | that Musk is crazy at times.
00:47:35.640 | I mean, and then let's figure out,
00:47:40.640 | and I try to do it through storytelling,
00:47:42.920 | not through highfalutin preaching,
00:47:46.000 | where that craziness works.
00:47:50.680 | Give me a story, tell me an anecdote,
00:47:52.200 | tell me where he's crazy.
00:47:54.200 | And the almost final example
00:47:59.200 | is him shooting off Starship for the first time.
00:48:03.120 | In between an aborted countdown and the shoot-off,
00:48:06.280 | he goes to Miami to an ad sales conference
00:48:08.520 | and meets Linda Yaccarino for the first time,
00:48:10.440 | makes her the CEO.
00:48:11.720 | I mean, there's a very impulsiveness to him.
00:48:13.600 | Then he flies back, they launch Starship,
00:48:17.160 | and you realize that there's a drive
00:48:21.840 | and there are demons and there's also craziness.
00:48:25.040 | And you sometimes wanna pull those out.
00:48:30.040 | You wanna take away his phone
00:48:31.920 | so he doesn't tweet at 3 a.m.
00:48:34.220 | You wanna say, quit being so crazy.
00:48:37.520 | But then you realize there's a wonderful line
00:48:40.600 | of Shakespeare in Measure for Measure at the very end.
00:48:43.760 | He says, "Even the best are molded out of faults."
00:48:48.480 | And so you take the faults of Musk, for example,
00:48:52.760 | which includes a craziness that can be endearing,
00:48:55.400 | but also craziness that's just like effing crazy,
00:48:59.540 | as well as this drive and demon mode.
00:49:04.700 | I don't know that you can take that strand out
00:49:08.400 | of the fabric and the fabric remains whole.
00:49:11.520 | - I wonder, sometimes it saddens me
00:49:16.680 | that we live in a society that doesn't celebrate
00:49:18.960 | even the darker aspects of crazy
00:49:21.880 | in acknowledging that it all comes in one package.
00:49:25.560 | It's the man in the arena versus the critic.
00:49:27.960 | - And the man in the arena versus the regulator,
00:49:30.200 | and to make it more prosaic.
00:49:32.260 | - Well, let me ask about not just the crazy,
00:49:37.400 | but the cruelty.
00:49:38.320 | So you've written, when reporting on Steve Jobs,
00:49:42.680 | Woz told you that the big question to ask was,
00:49:45.840 | did he have to be so mean, so rough and cruel,
00:49:50.200 | so drama addicted?
00:49:52.120 | What is this answer for Steve Jobs?
00:49:54.080 | Did he have to be so cruel?
00:49:56.600 | - For Jobs, I asked Woz at the end of my reporting,
00:50:00.120 | 'cause that's what he said at the beginning,
00:50:02.800 | we're doing the launch of, I think,
00:50:05.680 | the iPad 2, it may have been.
00:50:08.440 | Steve is emaciated because he's been sick.
00:50:13.080 | And so I say to Woz, what's the answer to your question?
00:50:15.360 | And he said, well, if I had been running Apple,
00:50:18.220 | I would have been nicer to everybody.
00:50:19.720 | Everybody got stock options.
00:50:20.880 | We'd have been like a family.
00:50:22.440 | And then, I don't know if you know Woz,
00:50:24.240 | he's like a teddy bear.
00:50:25.120 | He paused, he smiled, and he said,
00:50:26.880 | but if I had been running Apple,
00:50:29.000 | I don't think we would have done the Macintosh
00:50:31.280 | or the iPhone.
00:50:32.440 | So yeah, you have to sometimes be rough.
00:50:37.440 | And Jobs said the same thing that Musk said to me,
00:50:43.440 | which is he said, people like you
00:50:45.560 | love wearing velvet gloves.
00:50:47.560 | Now, I don't know that I've worn velvet gloves often,
00:50:50.720 | but you like people to like you,
00:50:52.360 | you like to sweet-talk things, you sugarcoat things.
00:50:55.680 | He says, I'm just a working class kid,
00:50:58.520 | and I don't have that luxury.
00:51:00.880 | If something sucks, I gotta tell people it sucks,
00:51:03.160 | or I got a team of B players.
00:51:05.120 | Well, Musk is that way as well,
00:51:06.600 | and it gets back to what I said earlier,
00:51:08.980 | which is, yeah, I probably would wear velvet gloves
00:51:13.460 | if I could find them at my haberdasher.
00:51:15.620 | And I do try to sugarcoat things,
00:51:19.800 | but when I was running CNN, it needed to be reshaped.
00:51:24.180 | It needed to be broken.
00:51:25.460 | It needed to have certain things blown up,
00:51:28.340 | and I didn't do it.
00:51:29.500 | So, bad on me, but it made me realize,
00:51:33.540 | okay, I'll just write about the people who can do it.
00:51:36.140 | - Well, that thing of saying,
00:51:38.340 | I think probably both of them,
00:51:39.500 | but Elon certainly, saying things like,
00:51:41.660 | that is the stupidest thing I've ever heard.
00:51:43.900 | - By the way, I've heard Jeff Bezos say that.
00:51:47.580 | I've heard Bill Gates say that.
00:51:49.180 | I've heard Steve Jobs say it.
00:51:50.940 | I've heard Steve Jobs say it about a smoothie.
00:51:53.500 | They were making it at Whole Foods or something.
00:51:55.700 | People, they use the word stupid really often,
00:51:59.260 | and you know who else used it?
00:52:00.780 | Errol Musk.
00:52:01.620 | He kept making Elon stand in front of him
00:52:04.160 | and saying, that's the stupidest thing.
00:52:05.660 | You're the stupidest person.
00:52:07.100 | You'll never amount to anything.
00:52:08.700 | I don't know.
00:52:12.540 | As John McNeil, the president of Tesla, said,
00:52:17.780 | do you have to be that way?
00:52:19.460 | Probably not.
00:52:20.740 | There are a lot of successful people who are much kinder,
00:52:25.760 | it's sometimes necessary
00:52:32.940 | to be much more brutal and honest,
00:52:36.380 | brutally honest, I would say,
00:52:38.460 | than people who win boss of the year trophies.
00:52:43.460 | - Well, as you said, this kind of idea
00:52:47.460 | did also send a signal.
00:52:49.140 | This idea of Steve Jobs of A players,
00:52:51.180 | it did send a signal to everybody.
00:52:52.960 | It was a kind of encouragement to the people
00:52:56.140 | that are all in.
00:52:57.460 | - Right, and that happened at Twitter.
00:52:58.740 | When we went to Twitter headquarters the day before
00:53:02.220 | the takeover, he was having Andrew and James,
00:53:06.700 | his two young cousins, and other people
00:53:09.180 | from the autopilot team, going over lines of code.
00:53:12.380 | Musk himself sat there with a laptop
00:53:14.340 | on the second floor of the building
00:53:16.300 | looking at the lines of code that had been written
00:53:18.800 | by Twitter engineers, and they decided
00:53:21.460 | they were gonna fire 85% of them
00:53:24.720 | because they had to be all in.
00:53:26.460 | This notion of psychological safety and mental health days off
00:53:29.700 | and working remotely, he said, either,
00:53:33.420 | and then it came up, actually, one of his,
00:53:37.460 | I think it was one of the cousins,
00:53:40.260 | or maybe Ross Nordeen, came up with the idea
00:53:42.900 | of let's not be so rough and just fire all these people.
00:53:46.700 | Let's ask 'em, do you really wanna be all in?
00:53:49.260 | 'Cause this is gonna be hardcore.
00:53:50.580 | It's gonna be intense.
00:53:52.180 | You get to choose, but by midnight tonight,
00:53:54.700 | we want you to check the box.
00:53:56.580 | I'm hardcore, all in, I'll be there in person,
00:53:59.140 | I'll work, you know, as much, or that's not for me.
00:54:02.100 | I've got a family, I've got work balance,
00:54:04.580 | and you got different type of people that way,
00:54:08.020 | and different stages of their life.
00:54:09.620 | I was a little bit more hardcore and all in
00:54:12.100 | when I was in my 20s than when I was, you know, in my 50s.
00:54:16.300 | - Yeah, and you write about this really nice idea,
00:54:19.100 | actually, that there's two camps,
00:54:21.780 | and you find out, I wonder how true this is.
00:54:25.020 | It rings true.
00:54:26.620 | You can just ask people, which camp are you in?
00:54:29.220 | Are you the kind of person that prides themselves,
00:54:31.980 | that enjoy staying up 'til 2 a.m. programming or whatever,
00:54:36.620 | or do you see the value of, quote-unquote,
00:54:39.420 | you know, work-life balance, all this kind of stuff?
00:54:43.020 | And it's interesting.
00:54:44.500 | I mean, like, you could, people probably divide themselves
00:54:49.100 | in different stages in life, and you can just ask them,
00:54:51.820 | and it makes sense for certain companies
00:54:54.140 | at certain stages of their development to be like,
00:54:57.020 | we only want hardcore people. - Or certain teams.
00:54:58.340 | It doesn't even have to be a whole company.
00:55:00.260 | And you're right.
00:55:01.580 | It goes back to what I was saying about rule,
00:55:03.740 | the first secret is sort of know thyself.
00:55:07.100 | Obviously, it comes from Plato,
00:55:08.580 | and everything comes from Plato and Socrates.
00:55:12.100 | But, and decide, in this stage of my life,
00:55:17.100 | am I, do I wanna be a hackathon all in all night
00:55:22.860 | and change the world, or do I want to bring wisdom
00:55:26.220 | and stability, but also have balance?
00:55:29.060 | I think it's good to have different companies
00:55:32.220 | with different styles.
00:55:33.740 | The problem was, Twitter was at almost one extreme,
00:55:37.300 | with yoga studios and mental health days off,
00:55:40.580 | and enshrining psychological safety as one of the mantras
00:55:45.580 | that people should never feel psychologically threatened.
00:55:48.340 | And he, I remember the bitter laugh he unleashed
00:55:52.100 | when he kept hearing that word.
00:55:53.620 | He said, no, I like the words hardcore.
00:55:56.180 | I like intensity.
00:55:57.620 | I like an intense sense of urgency
00:56:01.420 | as our operating principle.
00:56:03.700 | Well, yeah, there are people that way as well.
00:56:05.860 | So know who you are and know what type of team
00:56:08.020 | you wanna build.
00:56:09.260 | - Versus psychological safety
00:56:10.940 | and too many birds everywhere.
00:56:13.340 | - Oh, yeah.
00:56:14.180 | A lot of times, Musk did things, and I go, what the hell?
00:56:18.060 | And among them was changing the name, Twitter,
00:56:20.140 | and getting rid of the birds.
00:56:21.180 | I go, hey, hey, man, there's a lot invested in that brand.
00:56:24.500 | But when I watched him, he thought,
00:56:26.860 | okay, these sweet little chirpy birds tweeting away
00:56:29.900 | in the name Twitter, it's not hardcore.
00:56:34.300 | It's not intense.
00:56:35.620 | And so, for better and for worse,
00:56:38.620 | I think he's taking X into the hardcore realm
00:56:42.980 | with people who post hardcore things,
00:56:47.500 | with people with hardcore views.
00:56:50.140 | It's not a polite playpen
00:56:55.140 | for the blue-checked, anointed elite.
00:57:01.580 | And I thought, okay, this is gonna be bad.
00:57:05.620 | The whole thing's gonna fall apart.
00:57:06.780 | Well, it has had problems,
00:57:09.220 | but the hardcore intensity of it
00:57:11.660 | has also meant that there's new things happening there.
00:57:14.420 | So it's very Elon Musk to not like this sweetness of birds
00:57:20.140 | chirping and tweeting and saying,
00:57:22.300 | I want something more hardcore.
00:57:23.900 | - As you've written in referring
00:57:27.580 | to the previous Twitter CEO,
00:57:30.460 | Elon said Twitter needs a fire-breathing dragon.
00:57:35.460 | I think this is a good opportunity
00:57:36.900 | to maybe go through some of the memorable moments
00:57:40.620 | of the Twitter saga, as you've written about extensively
00:57:43.700 | in your book, from the early days of considering
00:57:47.620 | the acquisition to how it went through to the details
00:57:50.980 | of, like you mentioned, the engineering teams.
00:57:53.700 | - Well, at the beginning of 2022, he was riding high.
00:57:58.700 | But as we say, he's a drama addict.
00:58:00.860 | He doesn't like to coast.
00:58:02.860 | And Tesla sold a million vehicles.
00:58:05.660 | I think 33 boosters, Falcon 9s have been shot up
00:58:10.660 | and landed safely in the past few months.
00:58:16.140 | And he was the richest person on Earth
00:58:19.100 | and Times Person of the Year.
00:58:21.700 | And yet, he'd said, I still want to put all my chips
00:58:26.220 | back on the table.
00:58:27.300 | I want to keep taking risks.
00:58:28.620 | I don't want to savor things.
00:58:31.060 | He had sold all of his houses.
00:58:32.780 | So he starts secretly buying shares of Twitter.
00:58:37.300 | January, February, March.
00:58:38.820 | Becomes public at a certain point, he has to declare it.
00:58:43.520 | And we were here in Austin at Gigafactory on the mezzanine.
00:58:48.520 | And he was trying to figure out,
00:58:50.920 | well, where do I go from here?
00:58:53.080 | And at that time, it was early April,
00:58:55.940 | they were gonna offer him a board seat.
00:58:58.240 | He was gonna do a standstill agreement
00:58:59.960 | and stop at 10% or something.
00:59:02.320 | Now remember, we were standing around.
00:59:04.480 | It was Luke Nosek, whom you know well, Ken Howery,
00:59:08.280 | some of his friends on that mezzanine here.
00:59:11.160 | And all afternoon and then late into the evening at dinner
00:59:14.800 | is like, should we do this?
00:59:17.680 | And I didn't say anything, I'm just the observer.
00:59:20.220 | But everybody else is saying, excuse me,
00:59:21.840 | why do you want to own Twitter?
00:59:24.400 | And Griffin, his son, joined at dinner.
00:59:27.360 | And Maye, for some reason, was in town.
00:59:30.060 | And like everybody says, no, we don't use Twitter.
00:59:33.000 | Why would you do that?
00:59:34.480 | And Maye said, well, I use Twitter.
00:59:36.120 | And it's almost like, okay, the demographics are people
00:59:39.920 | my age or Maye's age.
00:59:41.540 | And so it looked like he wasn't gonna pursue it.
00:59:45.680 | They offered him a board seat.
00:59:47.360 | And then he went off to Hawaii,
00:59:53.120 | to Larry Ellison's house, which he sometimes uses.
00:59:58.120 | He was meeting a friend, Angela Bassett, an actress.
01:00:01.840 | And instead of enjoying three days of vacation,
01:00:06.040 | he just became supercharged
01:00:08.560 | and started firing off text messages,
01:00:10.840 | including the fire-breathing dragon one.
01:00:13.720 | I think he used that phrase a few times,
01:00:16.160 | that Parag wasn't the person
01:00:17.800 | who was going to take Twitter to a new level.
01:00:21.200 | And then by the time he gets to Vancouver,
01:00:23.400 | where Grimes meets him,
01:00:25.200 | they stay up all night playing Elden Ring.
01:00:27.800 | He was doing a TED Talk.
01:00:29.640 | And then at 5:30, he finishes playing Elden Ring
01:00:33.640 | and sends out that I've made an offer.
01:00:37.040 | Even when he comes back, people are trying to intervene
01:00:40.360 | and say, excuse me, why are you doing it?
01:00:42.460 | And so it was a rocky period between late April and October
01:00:48.680 | when the deal closed.
01:00:49.960 | And people ask me all the time,
01:00:51.480 | well, did he want to get out of the deal?
01:00:53.160 | I said, which Elon are you talking about at what time of day?
01:00:56.600 | 'Cause there'll be times in the morning
01:00:58.680 | when he'd say, oh, the Delaware court's
01:01:01.080 | gonna force me to do it, it's horrible.
01:01:03.360 | Talk to his lawyers, you can win this case,
01:01:05.380 | get me out of it.
01:01:07.200 | He met here in Austin with three or four investment bankers,
01:01:11.200 | Blair Efron at Centerview, Bob Steele at Perella Weinberg.
01:01:16.200 | And they offered him options.
01:01:17.400 | Do you want to get out?
01:01:18.240 | Do you want to stay in?
01:01:19.060 | Do you want to reduce the price?
01:01:21.280 | And I think he was mercurial.
01:01:23.560 | There were times he would text me or say to me,
01:01:26.820 | this is gonna be great.
01:01:28.280 | It's gonna be the accelerant to do x.com
01:01:31.960 | the way we thought about 20 years ago.
01:01:35.520 | And so it's not until they finally tell him
01:01:37.920 | at the beginning of October, right when Optimus the Robot
01:01:40.520 | is being unveiled in California, actually,
01:01:44.120 | that the lawyers say, you're not gonna probably win
01:01:47.840 | this case, better go through with the deal.
01:01:51.380 | And by then, he's not only made his peace with it,
01:01:55.040 | he's kind of happy with it at times.
01:01:57.760 | Eventually, the deal is gonna close on,
01:02:01.960 | I think, a Friday morning.
01:02:04.960 | I have it in the book.
01:02:06.560 | And we're there on Thursday, and he's wandering around
01:02:09.580 | looking at the Stay Woke t-shirts
01:02:11.440 | and psychological safety lingo they're all using.
01:02:16.440 | And he and his lawyers and bankers hatched a plan
01:02:20.360 | to do a flash close.
01:02:22.040 | And the reason for that was if they closed the deal
01:02:27.960 | after the markets had closed for the day,
01:02:31.760 | and he could send a letter to Parag and two others
01:02:35.680 | firing them, quote, for cause,
01:02:38.280 | and this will be something the courts
01:02:39.360 | will have to figure out,
01:02:40.560 | then he could save 200 million or so.
01:02:44.080 | And it was both the money, but for him a matter,
01:02:46.640 | I won't say of principle, but of, hey,
01:02:49.480 | they misled me about the numbers,
01:02:51.320 | I got forced into doing it, so I'm gonna try
01:02:56.320 | this jujitsu maneuver and be able to get some money
01:02:59.840 | out of them.
01:03:00.680 | And then when he takes over, it's kind of a wild scene.
01:03:03.680 | Him trying to decide in three different rounds
01:03:07.940 | how to get the staff down to 15% of what it was.
01:03:11.480 | Him deciding on Christmas Eve after he'd been at a meeting
01:03:15.960 | where they told him we can't get rid of that
01:03:18.120 | Sacramento server farm because it's needed for redundancy.
01:03:22.520 | He says, no, it's not.
01:03:24.200 | And he's flying here to Austin.
01:03:26.460 | And young James says, why don't we just do it ourselves?
01:03:29.280 | He turns the plane around, they land in Sacramento,
01:03:32.000 | and he pulls them out himself.
01:03:33.440 | So it was a manic period.
01:03:35.200 | - We should also say that underneath of that,
01:03:38.040 | there was a running desire to, or a consideration
01:03:42.680 | to perhaps start a new company to build
01:03:46.520 | a social media company from scratch.
01:03:48.720 | - Well, Kimball wanted to do that,
01:03:50.080 | and Kimball here at a wonderful restaurant
01:03:52.520 | in Austin at lunch is like, hey, why are you buying Twitter?
01:03:56.200 | Let's start one from scratch and do it on the blockchain.
01:03:59.840 | Now, it took him a while, and you can argue it
01:04:02.360 | one way or the other, to come to the conclusion
01:04:05.480 | that the blockchain was not fast enough
01:04:08.080 | and responsive enough to be able to handle
01:04:10.920 | a billion tweets in a day or so.
01:04:13.660 | He gets mad when they keep trying to get him
01:04:16.560 | to talk to Sam Bankman-Fried, who's trying to say,
01:04:19.640 | I'll invest, but we have to do it on the blockchain.
01:04:22.280 | Kimball is still in favor of starting a new one
01:04:26.280 | and doing it blockchain-based.
01:04:29.120 | In retrospect, I think starting a new media company
01:04:34.900 | would have been better.
01:04:35.740 | He wouldn't have had the baggage or the legacy
01:04:39.460 | that he's breaking now in breaking the way Twitter had been.
01:04:44.460 | But it's hard to have millions and millions,
01:04:48.000 | hundreds of millions of true users,
01:04:51.540 | not just trolls, and start from scratch,
01:04:54.660 | as others have found, as Mastodon and Bluesky and Threads.
01:04:59.660 | Threads even had a base, so it would have been hard.
01:05:03.900 | - Yeah, and to do that in the way he did
01:05:06.860 | requires another part that you write about
01:05:09.980 | with the three musketeers and the whole engineering,
01:05:12.780 | the firing and the bringing in the engineers
01:05:15.680 | to try to sort of go hardcore.
01:05:17.540 | So there's a lot of interesting questions to ask there,
01:05:20.020 | but the high level, can you just comment about
01:05:23.340 | that part of the saga, which is bringing in the engineers
01:05:28.340 | and seeing what can we do here?
01:05:31.820 | - Right, he brought in the engineers
01:05:34.180 | and figured that the amount of people doing Tesla,
01:05:38.420 | full self-driving, autopilot, and all the software there
01:05:42.300 | was about 1/10 of what was doing software for Twitter.
01:05:46.300 | He said, "This can't be the case."
01:05:48.580 | And he fired 85%, in three different rounds.
01:05:52.380 | The first was just firing people
01:05:55.340 | because they looked at the coding
01:05:57.340 | and they had a team of people from Tesla's autopilot team
01:06:01.500 | grading the codes of all that was written
01:06:04.940 | in the past year or so.
01:06:06.880 | Then he fired people who didn't seem
01:06:09.860 | to be totally all-in or loyal,
01:06:12.220 | and then another round of layoffs.
01:06:14.420 | So at each step of the way, almost everybody said,
01:06:19.420 | "That's enough, it's going to destroy things."
01:06:23.360 | From Alex Spiro, his lawyer, to Jared Birchall,
01:06:27.460 | it's like, "Whoa, whoa, whoa."
01:06:29.260 | And even Andrew and James, the young cousins
01:06:31.860 | who are tasked with making a list
01:06:34.040 | and figuring out who's good or bad,
01:06:35.560 | said, "We've done enough, we're going to be in real trouble."
01:06:39.380 | And they were partly right.
01:06:41.500 | I mean, there was degradation of the service some,
01:06:46.220 | but not as much as half the services I use half the time.
01:06:50.380 | I'd wake up each morning and hit the app,
01:06:55.740 | and okay, still there.
01:06:57.380 | - What do you think, was that too much?
01:06:59.220 | - I think that he has an algorithm
01:07:02.020 | that we mentioned earlier
01:07:03.100 | that begins with question every requirement.
01:07:05.620 | But step two is delete, delete, delete.
01:07:08.140 | Delete every part.
01:07:09.700 | And then a corollary to that is,
01:07:12.620 | if you don't end up adding back 20% of what you deleted,
01:07:17.020 | then you didn't delete enough in the first round
01:07:19.340 | 'cause you were too timid.
01:07:21.060 | Well, so you ask me, did he overdo it?
01:07:24.140 | He probably overdid it by 20%, which is his formula.
01:07:28.940 | And they're probably trying to hire people now
01:07:31.860 | to keep things going.
01:07:33.160 | - But it sends a strong signal
01:07:36.540 | to people that are hired back
01:07:38.060 | or the people that are still there,
01:07:39.260 | the A player idea. - Yeah,
01:07:40.500 | and what Steve Jobs and many other great leaders felt,
01:07:43.220 | and certainly Bezos, and certainly,
01:07:46.060 | in the early days of Microsoft, Bill Gates,
01:07:48.860 | it was hardcore, only A players.
01:07:52.180 | - So how much of Elon's success would you say,
01:07:54.660 | Elon's and Steve Jobs' success
01:07:57.020 | is the hiring and managing of great teams?
01:07:59.900 | - When I asked Steve Jobs at one point
01:08:01.620 | what was the best product you ever created,
01:08:04.620 | I thought he'd say maybe the Macintosh
01:08:07.820 | or maybe the iPhone.
01:08:10.620 | He said, "No, those products are hard.
01:08:12.780 | "The best thing I ever created was the team
01:08:15.820 | "that made those products."
01:08:17.940 | And that's the hard part is creating a team.
01:08:20.020 | And he did, from Johnny Ive to Tim Cook,
01:08:22.580 | and Eddie Q, and Phil Schiller.
01:08:26.040 | Elon has done a good job bringing in people.
01:08:31.140 | Gwynne Shotwell, obviously, Linda Yaccarino,
01:08:34.860 | she can navigate through the current crises.
01:08:39.460 | Certainly stellar people at SpaceX, like Mark Juncosa,
01:08:45.660 | and then at Tesla, like Drew Baglino, and Lars Moravy,
01:08:50.260 | and Tom Zhu, and many others.
01:08:52.780 | He's not as much of a team collaborator
01:08:58.820 | as say Benjamin Franklin, who, by the way,
01:09:01.540 | that's the best team ever created, which is the founders.
01:09:04.740 | And you had to have really smart people
01:09:06.380 | like Jefferson and Madison, and really passionate people
01:09:10.260 | like John Adams and his cousin Samuel,
01:09:12.420 | and really a guy of high rectitude like Washington.
01:09:15.900 | But you also needed a Ben Franklin
01:09:18.140 | who could bring everybody together
01:09:19.620 | and forge a team out of them
01:09:21.720 | and make them compromise with each other.
01:09:24.180 | Musk is a magnet for awesome talent.
01:09:28.500 | - Magnet, interesting.
01:09:29.900 | But there's the priorities of hiring
01:09:33.540 | based on excellence, trustworthiness, and drive.
01:09:37.500 | These are things you've described throughout the book.
01:09:40.060 | I mean, there's a pretty concrete and rigorous set of ideas
01:09:45.060 | based on which the hiring is done.
01:09:50.460 | - Oh yeah, and he has a very good, spidey, intuitive sense
01:09:55.460 | just looking at people, I mean, not looking at them,
01:10:00.620 | but studying them, who could be good.
01:10:03.060 | One of his ways of operating
01:10:08.060 | is what he calls a skip-level meeting.
01:10:12.300 | And let's take a very specific thing,
01:10:13.900 | like the Raptor engine, which is powering the Starship.
01:10:18.500 | And it wasn't going well.
01:10:19.540 | It looked like a spaghetti bush,
01:10:20.940 | and it was gonna be hard to manufacture.
01:10:23.100 | And he got rid of the people who were in charge of that team.
01:10:27.380 | And I remember that he spent a couple of months
01:10:31.820 | doing what he calls skip-level,
01:10:33.300 | which means instead of meeting with his direct reports
01:10:36.500 | on the Raptor team,
01:10:38.820 | he would meet with the people one level below them.
01:10:42.980 | And so he would skip a level and meet with them.
01:10:46.520 | And he said, and I just ask them what they're doing,
01:10:49.820 | and I drill them with questions.
01:10:51.940 | And he said, and this is how I figure out
01:10:53.860 | who's going to emerge.
01:10:55.420 | He said it was particularly difficult,
01:10:56.940 | I was sitting in those meetings,
01:10:58.300 | 'cause people were wearing masks.
01:10:59.780 | It was during the height of COVID.
01:11:01.580 | And he said it made it a little bit harder for him,
01:11:04.700 | because he has to get the input.
01:11:07.620 | But I watched as a young kid, dreadlocks,
01:11:12.120 | named Jacob McKenzie, he's in the book,
01:11:15.260 | is sitting there, and he's a bit like you,
01:11:17.820 | engineering mindset speaks in a bit of a monotone.
01:11:21.500 | Musk would ask a question, and he would give an answer.
01:11:24.180 | And the answer would be very straightforward,
01:11:26.820 | and he didn't get rattled.
01:11:28.340 | He was like this.
01:11:29.740 | And Musk said one day, called him up at 3 a.m.,
01:11:33.300 | well, I won't say 3 a.m., but after midnight,
01:11:35.860 | said, you still around?
01:11:36.980 | And Jake said, yeah, I'm still at work.
01:11:39.060 | And he said, okay, I'm gonna make you in charge
01:11:41.060 | of the team building Raptor.
01:11:43.840 | And that was like a big surprise.
01:11:47.380 | But Jacob McKenzie has now gotten a version of Raptor,
01:11:50.500 | and where they're building him at least one a week,
01:11:52.340 | and they're pretty awesome.
01:11:54.220 | And that's where his talent, Musk's talent,
01:11:59.980 | for finding the right person and promoting them,
01:12:04.820 | that's where it is.
01:12:05.660 | - And promoting it in a way where it's like,
01:12:08.820 | here's the ball, here, catch.
01:12:10.940 | - Yeah, yeah.
01:12:11.780 | - And you run with it.
01:12:13.620 | I have interacted with quite a few folks
01:12:16.980 | from even just the Model X, all throughout,
01:12:20.660 | where people on paper don't seem like
01:12:23.220 | they would be able to run the thing,
01:12:24.560 | and they run it extremely successfully.
01:12:26.300 | - And he does it wrong sometimes.
01:12:27.580 | He's had a horrible track record
01:12:29.340 | with the solar roof division.
01:12:32.340 | Wonderful guy named Brian Dow, I really liked him.
01:12:36.500 | And when they were doing the battery factory surge
01:12:39.140 | in Nevada, Musk got rid of two or three people,
01:12:42.540 | and there's Brian Dow, can do, can do, can do.
01:12:45.980 | Stays up all night, and he gets promoted and runs it.
01:12:48.300 | And so finally, Musk goes through two or three people
01:12:50.940 | running the solar roof division.
01:12:53.300 | Finally, calls up Brian Dow.
01:12:54.780 | I was sitting in Musk's house in Boca Chica,
01:12:57.140 | that little tiny two-bedroom he has,
01:13:00.300 | and he offers Brian Dow the job of running solar roof.
01:13:04.220 | And you know, Brian there, okay, can do, can do.
01:13:08.940 | And two or three times, Musk insisted
01:13:12.220 | that they install a solar roof
01:13:14.900 | in one of those houses in Boca Chica.
01:13:16.820 | This is this tiny village at the south end of Texas.
01:13:20.700 | And late at night, I mean, I'd have to climb up
01:13:23.480 | to the top of the roof on these ladders
01:13:25.780 | and stand on this peaked roof as Musk is there saying,
01:13:29.620 | "Why do we need four screws to put in this single leg?"
01:13:33.700 | And Brian was just sweating and doing everything.
01:13:37.780 | But then after a couple of months, it wasn't going well,
01:13:40.380 | and boom, Musk just fired him.
01:13:44.700 | So I always try to learn,
01:13:46.780 | what is it that makes those who stay thrive?
01:13:51.420 | - What's the lesson there, what do you think?
01:13:53.340 | - Well, I think it's self-knowledge,
01:13:55.140 | like an Andy Krebs or others.
01:13:57.820 | They say, "I am hardcore,
01:14:00.620 | "I really wanna get a rocket to Mars,
01:14:02.580 | "and that's more important than anything else."
01:14:04.940 | One of the people, I think it's Tim Zaman,
01:14:09.340 | I hope when he hears this, I'm getting him the right person,
01:14:13.020 | who took time and was working for Tesla Autopilot,
01:14:16.860 | and it was just so intense, he took some time off
01:14:18.980 | and then went to another company.
01:14:21.020 | He said, "I was burned out at Tesla,
01:14:23.780 | "but then I was bored at the next place."
01:14:25.980 | So I called, I think it was Ashok at Tesla,
01:14:28.900 | I said, "Can I come back?"
01:14:29.820 | He said, "Sure."
01:14:30.780 | He said, "I learned about myself,
01:14:32.660 | "I'd rather be burned out than bored."
01:14:35.220 | - That's a good line.
01:14:36.260 | Well, can you just linger on one of the three
01:14:40.860 | that seem interesting to you in terms of excellence,
01:14:43.420 | trustworthiness, and drive?
01:14:44.900 | Which one do you think is the most important
01:14:47.180 | and the hardest to get at?
01:14:48.420 | The trustworthiness is an interesting one.
01:14:50.540 | Like, are you ride or die kind of thing?
01:14:53.340 | - Yeah, I think that, especially when it came
01:14:56.020 | to taking over Twitter, he thought half the people there
01:14:58.260 | were disloyal, and he was wrong.
01:15:00.600 | About 2/3 were disloyal, not just half.
01:15:03.460 | And it was, how do we weed out those?
01:15:06.080 | And he did something and made the firing squad, I call it,
01:15:10.180 | or the Musketeers, I think is my nickname for them,
01:15:13.660 | which is the young cousins and two or three other people,
01:15:17.420 | he made them look at the Slack messages
01:15:19.500 | these people had posted, everybody at Twitter had posted.
01:15:22.700 | And they went through hundreds of Slack messages.
01:15:25.420 | So if anybody posted on the internal Slack,
01:15:30.060 | that jerk Elon Musk is gonna take over,
01:15:32.460 | and I'm afraid that he's a maniac or something,
01:15:35.940 | they would be on the list because they weren't all-in loyal.
01:15:41.500 | They did not look at private Slack messages,
01:15:45.580 | and I guess people who are posting
01:15:47.580 | on a corporate Slack board should be aware
01:15:50.940 | that your company can look at them.
01:15:53.540 | But that's more than I would have done
01:15:56.420 | or most people would have done.
01:15:58.580 | And so that was to figure out
01:15:59.900 | who's deeply committed and loyal.
01:16:01.780 | I think that was mainly the case at Twitter.
01:16:04.780 | He doesn't sit around at SpaceX saying, "Who's loyal to me?"
01:16:08.540 | At other places, it's excellence,
01:16:13.540 | but that's pretty well a given.
01:16:18.180 | Everybody is like a Mark Juncosa, just whip smart.
01:16:22.020 | It's are you hardcore and all-in,
01:16:23.820 | especially if you have to move to this bit of a town
01:16:27.660 | in the south tip of Texas called Boca Chica,
01:16:30.500 | you gotta be all-in.
01:16:32.900 | - Yeah, and that's the drive, the last piece.
01:16:36.740 | So you, in terms of collaborating,
01:16:38.900 | one of the great teams of all time, Ben Franklin,
01:16:41.340 | I like that, I thought it was the Beatles,
01:16:43.620 | but Ben Franklin is pretty good.
01:16:44.940 | - Oh, no, no, no.
01:16:46.140 | - I'm sorry.
01:16:46.980 | - Yeah. - Sorry to offend you.
01:16:48.140 | - Read the Constitution and listen to Abbey Road,
01:16:50.420 | they're both good,
01:16:52.020 | but they're in a different league.
01:16:53.140 | - Yeah, different league.
01:16:54.340 | Okay, so one of the many things that comes to mind
01:16:57.900 | with Ben Franklin is incredible time management.
01:17:00.780 | Is there something you could say about Ben Franklin
01:17:07.020 | and about Steve Jobs?
01:17:10.060 | I think interesting with Elon is that he, as you write,
01:17:13.620 | runs six companies, seven,
01:17:16.100 | I mean, it depends how you count,
01:17:17.660 | Starlink is its own thing, I don't know.
01:17:19.660 | What can you say about these people
01:17:23.340 | in terms of time management?
01:17:24.780 | - Well, Musk is in a league of his own
01:17:28.620 | in the way he does it.
01:17:30.300 | First of all, Steve Jobs had to run Pixar
01:17:33.860 | and Apple for a while,
01:17:35.800 | but Musk, every couple of hours,
01:17:39.540 | is switching his mindset from how to implant
01:17:43.380 | the Neuralink chip and what will the robot
01:17:45.700 | that implants it in the brain look like
01:17:47.900 | and how fast can we make it move,
01:17:50.340 | and then the heat shield on the Raptor,
01:17:53.660 | or switching to human imitation,
01:17:57.540 | machine learning, full self-drive.
01:17:59.700 | On the night that the Twitter board agreed to the deal,
01:18:04.700 | this is huge around the world,
01:18:07.500 | I'm sure you remember, like, Musk buys Twitter.
01:18:11.220 | It wasn't when the deal closed,
01:18:12.380 | it was when the Twitter accepted his offer.
01:18:16.300 | And I thought, okay, but then he went to Boca Chica,
01:18:20.620 | to South Texas, and spent time fixating on,
01:18:25.140 | if I remember correctly, a valve in the Raptor engine
01:18:30.540 | that had a methane leak issue,
01:18:33.460 | and what were the possible ways to fix it.
01:18:37.240 | And all the engineers in that room,
01:18:41.300 | I assume, are thinking about,
01:18:43.300 | this guy just bought Twitter, should we say something?
01:18:46.300 | And he's like, and then he goes with Kimball
01:18:49.500 | to a roadside joint in Brownsville
01:18:53.940 | and just sits in the front and listens to music
01:18:55.840 | with nobody noticing really him being there.
01:18:58.360 | One of the things that's one of his strengths
01:19:02.460 | and sort of weaknesses in a way,
01:19:04.940 | is in a given day, he'll focus serially, sequentially,
01:19:10.780 | on many different things.
01:19:14.100 | He will worry about uploading video onto X.com
01:19:19.100 | or the payment system, and then immediately switch over
01:19:25.060 | to some issue with the FAA giving a permit for Starship,
01:19:30.060 | or with how to deal with Starlink and the CIA.
01:19:35.140 | And when he's focused on any of these things,
01:19:39.340 | you cannot distract him.
01:19:41.340 | It's not like he's also thinking about
01:19:44.220 | "I'm dealing with Starlink, but I've gotta also worry
01:19:46.380 | "about the Tesla decision on the new $25,000 car."
01:19:51.380 | Now, he'll, in between these sessions,
01:19:55.060 | process information, then let off steam.
01:19:57.820 | And for better or worse, he lets off steam
01:20:00.260 | by either playing a friend in Polytopia,
01:20:04.460 | or fire off some tweets, which is often not a healthy thing.
01:20:09.460 | But it's a release for him.
01:20:12.960 | And he doesn't, I once said he was a great multitasker,
01:20:16.340 | and that was a mistake.
01:20:18.220 | People corrected me.
01:20:19.220 | He's a serial tasker, which means focuses intensely
01:20:23.740 | on a task for an hour, almost has a,
01:20:28.060 | what do they call it at restaurants
01:20:29.860 | where they give you a palate cleanser.
01:20:32.580 | - Yeah.
01:20:33.420 | - Does some palate cleanser with Polytopia,
01:20:35.900 | and then focuses on the next task.
01:20:37.980 | - I mean, is there some wisdom about time management
01:20:40.780 | that you can draw from that?
01:20:42.020 | - There's some things that these people do,
01:20:43.980 | and you say, "Okay, I can be that way.
01:20:46.120 | "I can be more curious.
01:20:47.160 | "I can question every rule and regulation."
01:20:49.620 | I just don't think anybody should try
01:20:54.740 | to emulate Musk's time management style,
01:20:58.760 | because it takes a certain set of teams
01:21:01.520 | who know how to deal with everything else
01:21:03.100 | other than the thing he's focusing on,
01:21:05.580 | and a certain mind that can shift,
01:21:10.580 | just like his moods can shift.
01:21:13.900 | You and I go through transitions,
01:21:15.660 | and also if I'm thinking about what I'm gonna say
01:21:18.200 | on this podcast, I'm also thinking about the email
01:21:20.940 | my daughter just sent about a house that she's looking,
01:21:23.380 | you know, and I'm multitasking.
01:21:26.580 | He doesn't actually do that.
01:21:27.740 | He single tasks sequentially with a focus
01:21:31.900 | that's hardcore.
01:21:33.700 | - I don't know.
01:21:34.540 | I think there's wisdom to draw from that.
01:21:36.820 | First of all, he makes me,
01:21:38.540 | Ben Franklin makes me feel that way,
01:21:40.900 | that there's a lot of hours in the day.
01:21:43.340 | There's a lot of minutes in the day.
01:21:45.180 | Like, there's no excuse not to get a lot done,
01:21:48.180 | and that requires just an extreme focus,
01:21:50.520 | an extreme focus and like an urgency.
01:21:54.340 | - I think the fierce urgency
01:21:58.260 | that drives him is important,
01:22:03.100 | and it's sometimes ginned up.
01:22:05.860 | Like I say, the fierce urgency of getting to Mars,
01:22:08.980 | and on a Friday night at the launch pad in Boca Chica
01:22:13.980 | at 10 p.m., there are only a few people working
01:22:16.620 | 'cause it's a Friday night.
01:22:17.740 | They're not supposed to launch for another eight months.
01:22:20.500 | And he orders a surge.
01:22:23.020 | He says, "I want 200 people here by tomorrow
01:22:25.100 | "working on this pad."
01:22:26.940 | We have to have a fierce sense of urgency
01:22:29.100 | or we will never get to Mars.
01:22:31.460 | - That sense of urgency is also a vibrancy
01:22:36.460 | that's like really taking on life fully.
01:22:42.500 | I mean, to me, this is a lesson is like,
01:22:44.600 | even the mundane can be full of this just richness,
01:22:49.540 | and you just have to really take it in intensely.
01:22:54.900 | So like the switching enables that kind of intensity
01:22:58.820 | 'cause most of us can't hold that intensity
01:23:00.420 | in any one task for a prolonged period of time.
01:23:03.020 | Maybe that's also a lesson.
01:23:04.540 | - Right, and I guess it goes back to also know who you are,
01:23:08.820 | meaning there are people who can focus intensely,
01:23:13.740 | and there are people who can see patterns across many things.
01:23:16.580 | Look, Leonardo da Vinci, he was not all that focused.
01:23:21.180 | He was easily distracted.
01:23:23.580 | - Procrastinated.
01:23:24.420 | - That's why he has more unfinished paintings
01:23:26.780 | than finished paintings in his canon.
01:23:28.980 | But his ability to see patterns across nature
01:23:35.380 | and to in some ways process, procrastinate, be distracted,
01:23:40.380 | that helped him some, but Musk is not that way.
01:23:46.460 | And every few months, there's a new surge.
01:23:50.340 | You don't know where it'll be, but you'll be on solar roofs,
01:23:53.500 | and all of a sudden, we'll have a surge,
01:23:54.940 | and there has to be 100 solar roofs built,
01:23:58.700 | or this has to be done by tomorrow,
01:24:00.300 | or make a starship dome by dawn and surge and do it.
01:24:05.300 | And there are people who are built that way.
01:24:08.620 | It is inspiring, but also let's appreciate
01:24:13.020 | that there are people who can be really good,
01:24:19.820 | but also can savor the success,
01:24:24.820 | savor the moment, savor the quiet sometimes.
01:24:28.500 | Musk's big failing is he can't savor the moment or success,
01:24:33.500 | and that's the flip side of hardcore intensity.
01:24:39.160 | - In Innovators, another book of yours that I love,
01:24:43.600 | you write about individuals and about groups.
01:24:46.420 | So one of the questions the book addresses
01:24:49.180 | is: is it individuals or is it groups
01:24:52.100 | that turn the tides of history?
01:24:53.660 | - When Henry Kissinger was on the shuttle missions
01:24:58.580 | for the Middle East Peace, this is the first book I ever
01:25:01.900 | wrote, he said, "When I was a professor at Harvard,
01:25:06.380 | "I thought that history was determined by great forces
01:25:09.840 | "and groups of people, but when I see it up close,
01:25:14.100 | "I see what a difference an individual can make."
01:25:17.700 | He's talking about Sadat and Golda Meir,
01:25:20.100 | probably talking about himself too,
01:25:22.020 | or at least in his mind, and we biographers
01:25:26.300 | have this dirty secret that we know.
01:25:28.420 | We distort history a bit by making the narrative
01:25:31.180 | too driven by an individual, but sometimes it is driven
01:25:35.220 | by an individual.
01:25:36.060 | Musk is a case like that, and sometimes,
01:25:38.420 | as I did with the Innovators, there's teams
01:25:42.080 | and people who build on each other,
01:25:43.700 | and Gordon Moore and Bob Noyce, then getting Andy Grove
01:25:47.500 | and doing the microchip, which then comes out
01:25:49.940 | and Wozniak and Jobs find it at some electronics store
01:25:54.940 | and they decide to build the Apple.
01:25:57.620 | And so, sometimes there are flows of forces
01:26:04.180 | and groups of people.
01:26:06.200 | I guess I err a little bit on the side of looking
01:26:09.740 | at what a Steve Jobs and Elon Musk
01:26:13.220 | and Albert Einstein can do.
01:26:17.020 | And I also try to figure out if they hadn't been around
01:26:19.260 | would the forces of history and the groups of people
01:26:21.600 | have done it without them.
01:26:23.260 | That's a good historical question,
01:26:25.100 | as somebody who loves history.
01:26:28.020 | And you think about Special Relativity,
01:26:30.980 | one of the 1905 papers.
01:26:32.760 | Even after he writes it, it's four years
01:26:36.860 | before people truly get what he's saying,
01:26:38.820 | which is it's not just how you observe time is relative,
01:26:42.220 | it's time itself is relative.
01:26:44.740 | And on the general theory, which he does a decade later,
01:26:48.340 | I'm not sure we would have gotten that yet.
01:26:50.580 | What about moving us into the era of an iPhone,
01:26:54.980 | in which it's so beautiful that you can't live
01:26:57.500 | without a thousand songs in your pocket,
01:27:00.060 | email and the internet in your pocket and a phone.
01:27:04.480 | There are a lot of brain dead people
01:27:08.300 | from Panasonic to Motorola who didn't get that
01:27:11.260 | and it may have been a while.
01:27:13.180 | I certainly think it's true
01:27:14.780 | of the era of electric vehicles.
01:27:16.820 | GM and Ford, all the great people there,
01:27:19.540 | they crushed the EV1, and I mean that literally.
01:27:22.300 | They ended up smashing them
01:27:24.020 | because they decided to discontinue it.
01:27:26.460 | Likewise, nobody was sending up rockets.
01:27:29.660 | Our space shuttle was about to be grounded 12 years ago.
01:27:33.800 | And so, Musk does things,
01:27:38.240 | and there'll be people who say and read the book,
01:27:41.420 | well, if they read the book, they'll see the full story.
01:27:43.620 | Well, they'll say it wasn't Musk who did Tesla,
01:27:45.500 | it was Martin Eberhard or Marc Tarpenning.
01:27:47.260 | No, no, no.
01:27:48.580 | There were people who had helped create
01:27:52.940 | the shells of companies and other things,
01:27:55.620 | and they all deserved to be called co-founders.
01:27:59.220 | But the guy who actually gets us
01:28:01.620 | to a million electric vehicles a year is Elon Musk,
01:28:05.220 | and without him, I don't think we,
01:28:09.260 | look, if anybody five years from now buys a car
01:28:14.260 | that's gasoline-powered, we'll think,
01:28:16.900 | that's quaint, you know, that's odd.
01:28:20.300 | I mean, suddenly, we've changed.
01:28:21.760 | We're not gonna do it.
01:28:23.500 | 90% of that is Elon Musk.
01:28:26.020 | We're all mortal.
01:28:27.780 | When and how do you think Elon will retire
01:28:30.100 | from the insanely productive schedule he's on now?
01:28:34.280 | I would think that he would hate to retire.
01:28:37.500 | I think that he can't live without the pressure,
01:28:42.220 | the drama, the all-in feeling.
01:28:46.140 | It's never been anything that seemed to have crossed his mind.
01:28:54.140 | He's never said, "Maybe I love Larry Ellison's house
01:28:57.660 | "on the beach in Hawaii.
01:28:58.980 | "Maybe I should spend time in doing."
01:29:01.700 | Instead, he says things like, "I learned early on
01:29:04.460 | "that vacations will kill you."
01:29:06.320 | (Lex laughs)
01:29:07.520 | He gets malaria when he goes on one vacation.
01:29:09.720 | I mean, he goes on vacation at one point,
01:29:11.800 | and they oust him from PayPal,
01:29:13.440 | and then he goes to Africa at one point, he gets malaria.
01:29:15.720 | He says, "I've learned, vacations kill you."
01:29:17.200 | Lesson learned.
01:29:18.360 | Well, it's interesting because the projects
01:29:19.840 | are hundred-plus year projects, many of these.
01:29:23.880 | One of the weird things is watching him think
01:29:28.880 | incredibly long-term.
01:29:31.640 | One of the meetings every week early on
01:29:35.000 | when I was watching him was Mars Colonizer.
01:29:40.000 | And we sat through a two-hour meeting
01:29:42.320 | about what would the governance structure be on Mars?
01:29:46.000 | What would people wear?
01:29:47.640 | How would the robots work?
01:29:50.120 | And would there be democracy, or should there be
01:29:55.120 | a different form of governance?
01:29:59.240 | I'm sitting there saying, "What are they doing?
01:30:01.500 | "What are they talking about?"
01:30:02.920 | They're trying to build rocket ships and everything else.
01:30:05.440 | They are worrying about the governance structure of Mars?
01:30:08.520 | And likewise, whenever he's in a tense moment,
01:30:13.520 | like there's a rocket that's about to be launched,
01:30:16.240 | he'll start asking people something in the way future,
01:30:19.160 | like the new lead engine or something.
01:30:23.480 | If we're gonna build that, do we have enough materials
01:30:26.040 | ready to order?
01:30:27.040 | Or, I don't know, he'll just ask questions
01:30:32.280 | like when he's building Robotaxi, the global car,
01:30:37.000 | the $25,000 inexpensive global car.
01:30:39.920 | That's not a total passion.
01:30:41.440 | He was talked into doing that.
01:30:43.680 | His passion is Robotaxi.
01:30:45.760 | But his passion is, how are we gonna make this factory
01:30:48.920 | to do a million cars a year?
01:30:51.280 | So even the Robotaxi is a longer-range vision.
01:30:55.900 | I mean, he's been touting it since 2016.
01:30:59.120 | But, you know, we're not.
01:31:01.280 | I don't know, Robotaxi, I mean, there's Waymo
01:31:04.200 | maybe doing a little experiment,
01:31:05.760 | but there's not cars being manufactured
01:31:09.040 | without steering wheels that are going to take over
01:31:11.680 | the highways.
01:31:13.160 | Yeah, so he's always looking way into the future,
01:31:15.960 | is my point.
01:43:17.120 | - I just hope that there's a lot of da Vincis
01:43:22.120 | and Steve Jobses and Einsteins and Elon Musks
01:31:26.640 | that carry the flame forward.
01:31:28.800 | - That's one of the reasons you write books
01:31:30.320 | about these people is so that if you're a young woman
01:31:34.440 | in a school where you're not being told to do science
01:43:37.200 | and you read The Code Breaker about Jennifer Doudna,
01:31:40.320 | you say, okay, I can be that.
01:31:42.640 | And when you say, oh, maybe I'll be a regulator,
01:31:46.920 | or you say, oh, no, maybe I'll be the person
01:31:49.360 | who pushes the boundaries, who pushes the lines,
01:31:53.000 | who pushes, as Steve Jobs said, the human race.
01:31:57.260 | - Let me ask you about your mind, your genius,
01:32:02.260 | your process.
01:32:04.460 | - I'll give you two out of three.
01:32:05.700 | - All right.
01:32:07.100 | Take me through your process of writing a biography.
01:32:09.420 | I mean, the full of it.
01:32:11.500 | And not just writing a biography,
01:32:13.240 | but understanding deeply which your books have done
01:32:18.240 | for the human story and the bigger ideas
01:32:23.780 | underlying the human story.
01:32:24.940 | So you've written biographies both of individuals,
01:32:28.180 | which are hardly individuals.
01:32:30.260 | It's a really big, complex picture.
01:32:32.520 | And biographies of ideas that involve individuals.
01:32:37.520 | - Well, step one for me is trying to figure out
01:32:42.260 | how the mind works.
01:32:43.700 | What causes Einstein to make that leap?
01:32:48.220 | Or Elon Musk to say stainless steel
01:32:50.700 | while he's looking at a carbon fiber rocket.
01:32:54.500 | Or how do you make the mental leap?
01:32:57.460 | Because I write about smart people,
01:32:59.420 | but smart people are a dime a dozen.
01:33:01.220 | They don't usually amount to much.
01:33:02.540 | You have to be creative, imaginative,
01:33:04.180 | to think different, as Jobs would say.
01:33:06.920 | And so what makes people creative?
01:33:08.900 | What makes them take imaginative leaps?
01:33:11.460 | That's the key question you gotta ask.
01:33:14.500 | You also ask the questions like you've asked earlier,
01:33:17.020 | which is what demons are jangling in their head
01:33:20.260 | and how do they harness them into drives?
01:33:22.860 | So you look at all that.
01:33:24.060 | And you try to observe really carefully the person.
01:33:29.060 | One of the more mundane things I do
01:33:33.420 | is a lot of writers try to give you a lot of their opinions
01:33:38.420 | and preach or whatever.
01:33:42.540 | As I said, this mentor said two types of people come out:
01:33:47.820 | preachers and storytellers.
01:33:49.660 | Be a storyteller.
01:33:52.780 | I try whenever I'm trying to convey a thought,
01:33:57.780 | there's six magic words that I almost should have
01:34:02.620 | written on a card, pinned above my desk,
01:34:06.340 | which is let me tell you a story.
01:34:09.220 | So if somebody says, how does Elon Musk
01:34:13.860 | figure out good talent as you did?
01:34:18.140 | I think, well, let me tell you the story.
01:34:19.620 | I'll tell you the story of Jake McKenzie.
01:34:22.220 | Or this is not something I invented.
01:34:24.940 | I mean, this is the way the good Lord does it in the Bible.
01:34:27.460 | I mean, it has the best opening lead sentence ever:
01:34:31.220 | "In the beginning," comma, and then it's stories.
01:34:34.820 | And secondly, to pick up on that lead sentence
01:34:37.620 | in the beginning, make it chronological.
01:34:40.980 | Everybody in the 40th year of their life
01:34:45.980 | has grown from the 39th year and the 38th year.
01:34:49.980 | And so you want to show how people evolve and grow.
01:34:54.780 | I had the greatest of all nonfiction narrative editors,
01:34:57.700 | Alice Mayhew at Simon & Schuster,
01:34:59.940 | who, among other things, created
01:35:02.220 | all the President's Men with Woodward and Bernstein.
01:35:04.940 | But she had a note she'd put in the margins of my books
01:35:08.340 | that was "ATIGT," and it meant all things in good time.
01:35:12.380 | Keep it chronological.
01:35:14.260 | If it's good enough for the Bible, it's good enough for you.
01:35:16.900 | - Interesting, to me, that's a small note.
01:35:19.580 | But to you, it's extremely important.
01:35:21.980 | - Because it's the framework for how you structure things,
01:35:24.500 | but also how you understand things,
01:35:26.340 | which is, if you keep it a chronological narrative,
01:35:31.340 | then you're showing how a person has grown
01:35:34.700 | from one experience you've talked about to the next one.
01:35:39.060 | And that moral growth, creative growth,
01:35:43.140 | risk-taking growth, wisdom,
01:35:46.740 | that's the essence of creativity.
01:35:50.020 | But you can't do it, there's a term, Bildungsroman,
01:35:55.020 | which is a book that carries a narrative
01:35:59.400 | and tells how people learn something.
01:36:01.660 | I'm a big believer in narrative.
01:36:03.540 | If you're an academic, sometimes, not today,
01:36:08.540 | but, like, 20 or 30 years ago,
01:36:11.160 | there were two things you thought were bad.
01:36:13.900 | One was having a great person theory of history
01:36:18.060 | in which you decided to do biography.
01:36:21.100 | I had a great professor when I was in college.
01:36:23.500 | Her name was Doris Kearns.
01:36:25.780 | She later married Dick Goodwin.
01:36:27.900 | And she, when she was going for tenure at the university,
01:36:32.580 | wrote a biography of Lyndon Johnson and the American Dream.
01:36:36.460 | And they denied her tenure
01:36:38.280 | because it was beneath the dignity of the academy
01:36:41.980 | to write history through one person.
01:36:44.780 | That's great.
01:36:46.960 | It opened up the field of biography to us non-academics,
01:36:51.960 | starting with David McCullough, Bob Caro,
01:37:01.020 | but maybe Jon Meacham and myself are in a new generation,
01:37:01.020 | and certainly there's a generation coming after us.
01:37:04.700 | But the second thing, besides telling it through people,
01:37:07.340 | was that the academy tended to disdain
01:37:12.340 | what they called imposing a narrative
01:37:16.980 | in which you made it storytelling.
01:37:19.140 | 'Cause that meant you were leaving things out
01:37:22.180 | and making it into a narrative.
01:37:24.580 | Well, that's how we form our views of the world.
01:37:29.580 | - Well, let me ask you this question.
01:37:32.180 | In terms of gathering an understanding,
01:37:36.900 | how much of it is one observing,
01:37:39.760 | and how much of it is interviews?
01:37:43.460 | - Yeah, and obviously depends on the subject.
01:37:47.420 | I mean, with Ben Franklin, it's all based on archives.
01:37:50.200 | And, of course, we have 40 volumes of letters he wrote.
01:37:54.300 | That was the good old days,
01:37:55.420 | when every day you'd write 20 letters.
01:37:58.060 | The Musk book is based much more on observation
01:38:01.640 | than almost any of my books,
01:38:03.740 | because he opened up in a way that was breathtaking to me.
01:38:08.280 | Even when he'd be sitting playing Polytopia
01:38:13.620 | or seething at other people,
01:38:16.780 | he'd have me just sitting there watching.
01:38:18.900 | I mean, I spent a lot of time with Jennifer Doudna
01:38:22.060 | at her side.
01:38:22.900 | I went to her lab and edited a human gene
01:38:25.380 | and with a pipette and a test tube.
01:38:28.940 | But I would say I spent 30 hours with her.
01:38:33.580 | I can't count, you know, 100 hours or more
01:38:36.620 | just observing Musk.
01:38:39.220 | And I'm not sure that any biographer,
01:38:44.220 | perhaps since Boswell took on Dr. Johnson,
01:38:48.100 | has ever had quite as much up close,
01:38:53.100 | meaning five feet away at all times, access.
01:38:58.540 | And because of that,
01:39:01.560 | I'll go back to what I said a moment ago,
01:39:03.380 | I try to get out of the way of the story.
01:39:06.060 | It's not about me, it's not about,
01:39:07.500 | I try to just say, okay, here's what happened.
01:39:11.420 | Here's this story.
01:39:12.260 | Here's what happened the night he came in to Twitter
01:39:14.860 | for the first time.
01:39:15.980 | And let you form your own judgment.
01:39:18.300 | - What about the interviews?
01:39:20.380 | You've had a lot of conversations.
01:39:23.660 | You give acknowledgement to the people
01:39:24.940 | you've done interviews with.
01:39:28.520 | Well, one, I have to ask, as an aspiring interviewer myself,
01:39:36.260 | - People love to talk.
01:39:37.440 | People just love, you know that.
01:39:39.200 | And I've had 140, maybe 150 people,
01:39:43.080 | they're all listed in the back.
01:39:45.020 | One of the little things that people won't notice,
01:39:47.880 | but I'll say it now, is all of them are on the record.
01:39:51.280 | Getting them to talk is easy.
01:39:53.020 | They all want to talk about Musk.
01:39:54.980 | But then at a certain point say,
01:39:56.720 | I don't put anonymous quotes in my book.
01:39:59.000 | I cite things.
01:40:01.640 | I say, if you're tough enough and you've gone through this,
01:40:06.000 | and a lot of times it takes two or three calls back.
01:40:08.960 | Somebody will tell me a story, say, oh no, no, no,
01:40:10.640 | I don't want to, but I think it's important
01:40:14.320 | to know where everything came from.
01:40:16.600 | And with Musk, it's, you know, I had that
01:40:19.560 | from the very beginning,
01:40:20.700 | because I was a Time Magazine reporter.
01:40:22.720 | I'd worked, reported for the Times-Picayune in New Orleans.
01:40:25.700 | My first day on the job, I had to go cover a murder.
01:40:29.120 | And I phoned in the story from a pay phone,
01:40:32.560 | and my editor, you know, the city editor said,
01:40:35.400 | well, did you talk to the family?
01:40:36.920 | I went, no, Billy, I mean, the family,
01:40:38.760 | you know, the daughter just got,
01:40:40.560 | he said, go knock on the door.
01:40:42.080 | I knocked on the door.
01:40:43.800 | An hour later, they were still talking.
01:40:45.560 | They were bringing out her yearbook.
01:40:47.800 | Lesson one I learned, people want to talk
01:40:50.880 | if you're willing to just listen.
01:40:53.400 | And whether it be Henry Kissinger,
01:40:55.820 | you just push the button and say Kissinger,
01:40:57.700 | and people tell you the stories,
01:40:59.420 | all the way through Elon Musk, everybody talked.
01:41:03.000 | Everybody in his family, everybody he fired,
01:41:05.280 | everybody, I mean, I think it's important
01:41:07.980 | to listen to people.
01:41:09.020 | And the other thing I learned as a reporter,
01:41:11.500 | back when I was covering politics in New Hampshire
01:41:13.780 | in the early campaigns, I learned from
01:41:17.380 | two or three great reporters, a guy named David Broder
01:41:19.820 | and Tim Russert, the late NBC guy.
01:41:22.740 | They do what was called door knocking.
01:41:24.440 | You just walk in a neighborhood, knock on a door,
01:41:26.680 | and ask people about the election.
01:41:29.080 | But they said, here's the secret.
01:41:31.560 | Don't ask any leading questions.
01:41:33.400 | Don't have any premise.
01:41:35.060 | Just say, hey, I'm trying to figure out this election.
01:41:38.040 | What's going on?
01:41:39.760 | What do you think?
01:41:41.040 | And then stay silent.
01:41:42.620 | With Musk, a third secret.
01:41:44.640 | You know this well.
01:41:45.600 | He'll go silent at times.
01:41:49.120 | Sometimes a minute, two minutes, four minutes.
01:41:52.280 | Don't try to fill the silences.
01:41:54.440 | If you're a listener, you gotta learn,
01:41:57.720 | okay, he's not saying anything for four minutes.
01:42:01.760 | I can outlast him.
01:42:03.040 | - It's tough.
01:42:03.880 | As humans, it's very tough.
01:42:05.280 | Respecting the silence is really, really difficult.
01:42:07.800 | Speaking of demons, when there's silence,
01:42:11.960 | all the demons show up in my head.
01:42:13.680 | - Oh, dear.
01:42:14.520 | - The fear, I think, is if I don't say anything,
01:42:17.880 | it's boring, and if I say something, it's gonna be stupid.
01:42:21.080 | And that's the basic engine that just keeps running.
01:42:23.520 | Not on the podcast, but also in human interaction.
01:42:28.000 | And so I think there's that nervous energy
01:42:30.160 | when interacting with people.
01:42:31.960 | - You can never go wrong by staying silent
01:42:35.920 | if there's nothing you have to say.
01:42:38.560 | Not something I've mastered,
01:42:40.280 | but I do, when I'm a reporter, try to master that,
01:42:44.360 | which is don't ask complex questions.
01:42:50.360 | Don't interject, and when somebody hasn't
01:42:55.160 | fully answered the question, don't say,
01:42:57.200 | well, let me, just stay silent,
01:43:00.520 | and then they'll keep talking.
01:43:02.240 | - Just give 'em a chance to keep talking,
01:43:04.200 | even if they've kind of finished, they're still--
01:43:06.480 | - Sometimes, if they haven't given you enough,
01:43:08.200 | instead of following up, I'll just nod and keep waiting.
01:43:13.200 | - You're making it sound simple.
01:43:14.600 | Is there a secret to getting people to open up more?
01:43:18.120 | - I'm somewhat lucky because I started off
01:43:21.520 | working for a daily newspaper,
01:43:25.000 | and people back then, they wanted to talk
01:43:28.080 | to the newspaper reporter.
01:43:30.080 | - But you also have a way about you.
01:43:31.800 | I feel like you have a cowboy in a saloon.
01:43:35.440 | You just kind of wanna talk.
01:43:36.840 | Like there's a draw to, I don't know what it is.
01:43:40.760 | Maybe it's, I don't know if it's developed
01:43:42.480 | or you're born with it, but there's a,
01:43:44.040 | it feels like I wanna tell you a story of some sort.
01:43:46.160 | - Good, tell me a story.
01:43:47.460 | (laughing)
01:43:49.340 | - A couple things.
01:43:51.180 | I did learn to be more quiet.
01:43:54.820 | I'm sure I know when I was younger,
01:43:58.020 | or even I'll see videos of me at, you know,
01:44:02.540 | news things where I'm always trying to interject a question.
01:44:07.700 | And so you learn to be quieter sometimes.
01:44:12.140 | I haven't mastered it, I haven't learned it enough.
01:44:16.340 | You learn to be naturally curious.
01:44:18.820 | Many reporters today, when they ask a question,
01:44:22.940 | are either trying to play gotcha
01:44:24.300 | or trying to get a news scoop or trying to,
01:44:27.360 | you know, gig something that can make a lead.
01:44:31.940 | And if you actually are curious
01:44:38.140 | and you really wanna know the answer to a question,
01:44:41.820 | then people can tell that you asked it
01:44:44.720 | because you want the answer,
01:44:46.140 | not because you're playing a game with them.
01:44:48.580 | - I'm sure some of them off the record,
01:44:52.180 | some of them on the record, you had maybe, you know,
01:44:56.320 | just some incredible conversations.
01:44:59.580 | I was gonna say some of the greatest conversations ever,
01:45:01.500 | but who knows?
01:45:02.440 | Some of the best conversations ever
01:45:03.860 | are probably somewhere in South America
01:45:06.020 | between two drunk people that we never get to hear.
01:45:08.840 | So I don't know.
01:45:10.120 | But is there a device you can give
01:45:14.720 | from what you've learned to somebody like me
01:45:17.120 | on how to have good conversation,
01:45:19.880 | especially when it's recorded?
01:45:21.840 | - Well, to be actually curious.
01:45:23.840 | I mean, every question you've asked me
01:45:25.560 | is 'cause I think you actually want to know the answer
01:45:27.960 | and you've done your homework.
01:45:30.440 | To be open and not to have an agenda.
01:45:34.240 | I mean, we all suffer from there being too many agendas
01:45:39.160 | in the world today.
01:45:40.200 | - Yeah, so that's just genuine curiosity.
01:45:43.020 | But there's something,
01:45:45.580 | when you talk about just one-on-one interaction,
01:45:47.980 | whether it's Elon or Steve Jobs,
01:45:50.820 | there's something beautiful about that person's mind.
01:45:57.140 | And it feels like it's possible to reveal that,
01:46:02.260 | to discover that together, efficiently.
01:46:08.680 | And that's kind of the goal of a conversation.
01:46:11.180 | - Well, I mean, look, you're amongst the top podcasters
01:46:16.180 | and interviewers in the world today.
01:46:19.660 | You have an earnestness to you.
01:46:21.840 | Ben Franklin is the person who taught me,
01:46:27.300 | I mean, by reading him, the most about on conversation.
01:46:31.500 | He wrote a wonderful essay on that.
01:46:33.700 | It includes on silence.
01:46:38.140 | But it includes trying to ask sincere questions
01:46:43.140 | rather than get a point across.
01:46:48.840 | I mean, it's somewhat Socratic,
01:46:51.920 | but whenever he wanted to start a fireman's corps
01:46:57.780 | in Philadelphia, he would go to his group
01:47:04.100 | that he called the Leather Apron Club
01:47:06.560 | and they would pose a question.
01:47:08.080 | Why don't we have it?
01:47:09.920 | What would it take?
01:47:11.180 | What would be good?
01:47:12.320 | And then the second part is to make sure that you listen.
01:47:15.740 | And if somebody has even just the germ of an idea,
01:47:19.920 | give them credit for it.
01:47:21.280 | Like, as Joe said, the real problem is this.
01:47:26.280 | And I do think that if I'm in situations,
01:47:35.180 | and I just mean even at dinner or something,
01:47:37.820 | I'm with somebody, I'm usually curious
01:47:41.660 | and the conversation will proceed with questions.
01:47:46.660 | And I guess it's also 'cause I'm pretty interested
01:47:51.320 | in what anybody's doing, whoever I happen to be with.
01:47:54.180 | And that's a talent you have,
01:47:58.940 | which is you're pretty genuine in your interests.
01:48:02.700 | There are people like Benjamin Franklin,
01:48:05.980 | like the, I'll say Charlie Rose,
01:48:08.460 | even though he's in disfavor,
01:48:10.340 | who are interested in a huge number of subjects.
01:48:12.780 | And I think that helps as well,
01:48:14.620 | to be interested in basketball and opera
01:48:17.860 | and physics and metaphysics.
01:48:21.140 | That was a Ben Franklin, that was a Leonardo trick,
01:48:23.860 | which is they wanted to know everything
01:48:26.380 | you could possibly know about every subject knowable.
01:48:29.700 | - But there's a different aspect of this,
01:48:32.420 | which is that I would love to hear how you've solved it,
01:48:37.380 | or if you faced it, that you're certainly disarming.
01:48:40.200 | See, I'm like peppering you with compliments here,
01:48:44.580 | trying to get you to open.
01:48:45.420 | - That's a very disarming method.
01:48:46.500 | - Yeah, I've recently talked to Benjamin Netanyahu,
01:48:49.660 | we'll talk again.
01:48:50.800 | We unfortunately, 'cause of scheduling and complexities,
01:48:53.220 | only had one hour, which is very difficult,
01:48:55.940 | very difficult with a charismatic politician.
01:48:57.980 | - He is the prime minister.
01:48:59.540 | - I understand this.
01:49:00.760 | But he's also a charismatic talker,
01:49:03.180 | which is very difficult to break through in one hour.
01:49:05.780 | But people have built up walls,
01:49:10.040 | whether it's because of demons
01:49:11.300 | or because of their politicians,
01:49:13.380 | and so they have agendas and narratives and so on,
01:49:15.900 | and so to break through those,
01:49:18.980 | I wonder if there's some advice,
01:49:20.460 | some wisdom you've learned of how to
01:49:22.820 | sort of wear down through water or whatever,
01:49:28.920 | whatever method, the walls that we've built up
01:49:31.920 | as individuals.
01:49:33.160 | - I mean, you call it disarming,
01:49:34.520 | which I don't know that I am,
01:49:36.580 | but disarming basically means
01:49:38.400 | you're taking down their shields also.
01:49:41.520 | And you know when people have a shield,
01:49:43.980 | and you try to give them comfort.
01:49:48.440 | I had zero of that problem with Elon Musk.
01:49:51.720 | I mean, it was like disarming to me,
01:49:54.400 | which is, I kept waiting to say,
01:49:56.640 | okay, he's not gonna,
01:49:57.940 | they've got a shell, he won't do that.
01:50:00.180 | But he was
01:50:01.860 | almost
01:50:05.440 | crazily open
01:50:09.420 | and did not seem to
01:50:13.100 | wanna be spinning or hiding or faking things.
01:50:18.400 | And I've been lucky.
01:50:23.020 | Doudna was that way.
01:50:24.940 | Steve Jobs was that way.
01:50:26.580 | But you have to put in time, too.
01:50:29.360 | In other words, you can't say,
01:50:31.920 | okay, there's a one hour interview,
01:50:33.800 | and I'm gonna break down every wall.
01:50:36.160 | It's like on your fifth visit.
01:50:38.440 | - Yes, well, actually, that's one of the things
01:50:40.160 | in my situation, you learn,
01:50:41.840 | fifth visit is very nice,
01:50:44.560 | but sometimes you don't get a fifth visit.
01:50:46.280 | Sometimes it's just the first date.
01:50:48.120 | And I think what it boils down to,
01:50:51.240 | and we said disarming,
01:50:52.540 | but there's something about this person
01:50:55.120 | that you trust.
01:50:58.600 | I think a lot of it just boils down to trust
01:51:01.080 | in some deep human way.
01:51:03.160 | And I think with many of the people I've spoken with,
01:51:07.640 | sometimes the trust happens after the interview,
01:51:10.540 | which is really sad because it's like, oh, man.
01:51:13.320 | - I've never been in your situation
01:51:15.240 | where I have a show,
01:51:17.040 | I usually have
01:51:17.960 | many cracks at the wheel. - Second date.
01:51:20.680 | - Yes, I'm not a first date person.
01:51:23.700 | - Yeah, yeah, yeah, well, you know.
01:51:25.740 | - But then I'm lucky.
01:51:26.580 | I mean, I say lucky, but I'm in print.
01:51:29.320 | Print is a couple thousand year old medium,
01:51:34.020 | but there are those of us who love it.
01:51:35.700 | - Well, the nature of the podcast medium
01:51:38.380 | is that I'm a one night stand kind of girl.
01:51:41.160 | Let me ask you about objectivity.
01:51:44.580 | You've followed Elon, you've followed Steve Jobs.
01:51:48.360 | I mean, I don't even know if you would say your friend,
01:51:50.760 | you have to be careful with words like that,
01:51:52.360 | but there's an intimacy.
01:51:53.920 | And how do you remain objective?
01:51:59.140 | Do you want to remain objective
01:52:02.040 | while telling a deeply human story?
01:52:03.920 | - Yeah, I mean, I want to be honest,
01:52:06.480 | which I think is akin to being objective.
01:52:09.740 | I try to keep in mind,
01:52:12.960 | who am I writing for?
01:52:16.680 | I'm not writing for Elon Musk.
01:52:18.620 | As I say, I haven't sent him the book.
01:52:21.220 | I don't know if he, I don't think he's read it yet.
01:52:24.160 | I've got one person I'm writing for,
01:52:28.500 | the open-minded reader.
01:52:31.380 | And if I can put in a story and say,
01:52:36.380 | well, that will piss off the subject
01:52:39.860 | or that will really make the subject happy,
01:52:42.620 | that's irrelevant.
01:52:44.260 | Or I try to make that a minor consideration.
01:52:47.420 | It's, will the reader have a better understanding
01:52:52.420 | because I've put this story in the book?
01:52:55.280 | - I'm a bit of a romantic.
01:52:58.840 | So to me, even your Einstein book
01:53:03.840 | had lessons on romance and relationships.
01:53:07.480 | - Oh, dear.
01:53:08.320 | (laughing)
01:53:09.360 | - So how important are romantic relationships
01:53:11.400 | to the success of great men, great women,
01:53:14.000 | great minds?
01:53:14.840 | - Well, sometimes people who affect the course of humanity
01:53:20.920 | have better relationships with humanity
01:53:23.320 | than they do with the humans sitting around them.
01:53:26.320 | Einstein had two interesting relationships with wives.
01:53:31.320 | Mileva, his first wife, was a sounding board
01:53:36.760 | and helped with the mathematics
01:53:38.120 | of the special relativity paper in particular.
01:53:42.920 | But he didn't treat her well.
01:53:44.680 | I mean, he made her sign a letter
01:53:48.640 | that she wouldn't interrupt him, she wouldn't, you know.
01:53:51.880 | And finally, when she wanted a divorce,
01:53:53.920 | he couldn't afford it 'cause he was still a patent clerk.
01:53:56.780 | And so he offered her a deal,
01:54:00.800 | which is, I think, totally amazing.
01:54:03.320 | He said, one of these days, one of those papers from 1905
01:54:07.040 | is gonna win the Nobel Prize.
01:54:09.720 | If we get a divorce, you know, I'll give you the money.
01:54:14.100 | That was a lot of money back then,
01:54:17.640 | like a million dollars now or something.
01:54:20.240 | And she's smart, she's a scientist.
01:54:22.200 | She consults with a few other scientists,
01:54:24.140 | and after a week or so, she takes the bet.
01:54:26.960 | It's not until, what, 1919 that he wins his Nobel Prize.
01:54:31.220 | And she gets all the money.
01:54:35.400 | She buys three apartment buildings in Zurich.
01:54:38.640 | With his second wife, Elsa,
01:54:41.240 | it was more a partnership of convenience.
01:54:44.800 | It was not a romantic love, but he knew,
01:54:49.000 | and that's sometimes what people need in life,
01:54:51.880 | is just a partner.
01:54:53.480 | I mean, somebody who's gonna handle the stuff
01:54:55.120 | you're not gonna handle.
01:54:56.820 | So I guess if you look at my books,
01:55:00.880 | they're not great inspiring guides
01:55:03.840 | to personal relationships.
01:55:06.520 | - Let me ask you about, actually,
01:55:07.960 | the process of writing itself.
01:55:10.240 | When you've observed, when you've listened,
01:55:12.920 | when you've collected all the information,
01:55:15.200 | what's, maybe even just the silly mundane question
01:55:18.520 | of what do you eat for breakfast before you start writing?
01:55:22.800 | When do you write?
01:55:23.640 | - First of all, breakfast is not my favorite meal.
01:55:26.520 | And those people who tell you that you have to start
01:55:29.260 | with a hearty breakfast, I look askance.
01:55:32.920 | - Yes.
01:55:34.120 | - And morning is not my favorite day part,
01:55:35.920 | so I write at night.
01:55:37.680 | And because I love narrative,
01:55:42.080 | it's easy to structure a book.
01:53:44.040 | Which is, I can make an outline that if I printed it out,
01:53:48.240 | or notes, would be 100 pages, but everything's in order.
01:53:53.240 | In other words, if there's a Burning Man
01:53:59.440 | and he's coming back from Grimes,
01:56:01.760 | and then there's a solar roof thing,
01:56:03.320 | and then there's something,
01:56:04.880 | I put it all in order day by day as an outline.
01:56:09.880 | And that disciplines me when I'm starting to write
01:56:14.360 | to follow the mantra from Alice Mayhew, my first editor,
01:56:18.720 | which is all things in good time.
01:56:20.400 | Don't get ahead of the story, don't have to flashback.
01:56:23.100 | And then after you get it,
01:56:26.740 | so that it's all chronological, you know,
01:56:29.080 | then you have to do some clustering.
01:56:30.920 | You know, you have to say, okay,
01:56:32.400 | we're going to do the decision to do Starship,
01:56:36.840 | or to build a factory in Texas, or to whatever.
01:56:40.920 | And then you sometimes have the organizational problem of,
01:56:45.040 | yeah, and that gets us all the way up to here.
01:56:48.080 | Do I keep that in that chapter,
01:56:50.320 | or do I wait until later when it's better chronologically?
01:56:55.320 | But those are easy.
01:56:57.060 | - Well, what about the actual process of telling the story?
01:57:02.900 | - Well, that's the mantra I mentioned earlier,
01:57:05.780 | which is whenever I get pause,
01:57:08.660 | or I don't know how to say something,
01:57:10.880 | I just say, let me tell you a story.
01:57:13.060 | And then I find the actual anecdote, the story,
01:57:18.060 | the tale that encompasses what I'm trying to convey.
01:57:23.140 | And then I don't say what I'm trying to convey.
01:57:25.140 | I don't have a transition sentence that says,
01:57:28.600 | you know, Elon sometimes changed his mind so often
01:57:31.460 | he couldn't remember whether he had changed his mind.
01:57:34.140 | You know, you don't need transition sentences.
01:57:37.740 | You just say, all right,
01:57:39.300 | here's the point I need to make next.
01:57:41.820 | And so you start with a sentence that says,
01:57:45.940 | you know, one day in January in the factory in Texas, comma.
01:57:49.860 | - Well, one of the things I'd love to ask you is
01:57:53.460 | for advice for young people.
01:57:58.340 | To me, first advice would be to read biographies.
01:58:02.380 | In the sense, because they help you understand
01:58:07.380 | all the different ways you can live a life well lived.
01:58:10.780 | But from having written biographies,
01:58:13.860 | having studied so many great men and women,
01:58:17.900 | what advice could you give to people
01:58:20.140 | of how to live this life?
01:58:22.980 | - Well, I keep going back to the classics
01:58:24.880 | and Plato and Aristotle and Socrates.
01:58:28.280 | And I guess it's Plato's maxim,
01:58:31.760 | but he may be quoting Socrates,
01:58:33.680 | that the unexamined life is not worth living.
01:58:36.140 | And it gets back to the know thyself.
01:58:39.600 | Which is, you don't have to figure out
01:58:43.320 | what is the big meaning of it all.
01:58:45.560 | But you have to figure out why you're doing what you're doing
01:58:50.640 | and that requires something that I did not have enough of
01:58:55.640 | when I was young, which is self-awareness
01:59:01.240 | and examining every motive, everything I do.
01:59:05.660 | - Where does the examination lead you?
01:59:10.040 | Is it to a shift in life trajectory?
01:59:17.880 | - I mean, it's not for me, sort of,
01:59:21.000 | all right, I've now decided, having been a journalist,
01:59:24.480 | I'll run a think tank or I'll run a network
01:59:27.520 | or I'll write a bio.
01:59:28.720 | It is actually something that's more useful
01:59:32.140 | on an hourly basis.
01:59:34.440 | Like, why am I about to say that to somebody
01:59:37.920 | or why am I going to do this particular act?
01:59:42.920 | What's my true motive here?
01:59:47.680 | And also, in the broader sense,
01:59:50.800 | to learn as I did after a couple years at CNN,
01:59:53.720 | my examination of my life is that I'm not great
02:00:00.080 | at running complex organizations.
02:00:03.880 | I'm not great as a manager, given the choice.
02:00:07.000 | I'd rather somebody else have to manage me
02:00:08.920 | than me have to manage people.
02:00:10.640 | But it took me a while to figure that out
02:00:14.000 | and I was probably too ambitious when I was young
02:00:16.760 | and at Time Magazine.
02:00:18.960 | That was when I was green and, oh well,
02:00:23.160 | that was when I was in my salad days
02:00:25.040 | and green in judgment.
02:00:27.160 | And it was like chasing the next level
02:00:31.520 | at Time Incorporated, whatever it might be.
02:00:35.040 | And then one day, I caught the brass ring
02:00:37.840 | and I became an editor and then the top editor.
02:00:41.240 | And after a while, I realized that wasn't really
02:00:46.480 | totally what I'm suited to be,
02:00:48.000 | especially when I got put in charge of CNN.
02:00:50.760 | I mean, all young people are almost by definition
02:00:54.400 | in their salad days and green in judgment.
02:00:57.560 | But you learn what's motivating you
02:01:02.560 | and then you learn to ask,
02:01:05.040 | but is that really what I want?
02:01:08.480 | Should I be careful of what I'm wishing for?
02:01:13.280 | - One of the big examinations you can do
02:01:16.640 | is the fact that you and everybody dies one day.
02:02:20.560 | How much do you, Walter Isaacson, think about death?
02:01:25.120 | Are you afraid of it?
02:01:26.200 | - No, and I don't think about it a lot,
02:01:28.000 | but I do think about Steve Jobs's
02:01:29.840 | Let Me Tell You a Story, which is the wonderful
02:01:32.320 | Steve Jobs story of, I think after he was diagnosed
02:01:36.840 | but before it was public.
02:01:38.360 | And he gave both a Stanford talk but other things
02:01:42.080 | in which he said the fact that we are going to die
02:01:46.600 | gives you focus and gives you meaning.
02:01:49.560 | If you're gonna live, and Elon Musk has said that to me,
02:01:52.800 | which is a lot of the tech bros out in the Silicon Valley
02:01:56.160 | that are looking for ways to live forever,
02:01:59.040 | I can think, Musk says, of nothing worse.
02:02:02.620 | We read the myth of Sisyphus and we know how bad it is
02:02:06.400 | to be condemned to eternal life.
02:02:10.200 | So there was, in ancient Greece,
02:02:12.780 | the person who walked behind the king and said,
02:02:18.040 | "Memento mori," remember you're gonna die.
02:02:21.400 | And it kept people from losing it a bit.
02:02:24.740 | - Do you think about legacy?
02:02:30.880 | - The lucky thing about being a biographer
02:02:34.880 | is that you kind of know what your legacy is.
02:02:37.440 | It's gonna be a shelf and it'll be of interesting people
02:02:40.680 | and you will have inspired a 17-year-old biology student
02:02:45.680 | somewhere to be the next great biochemist
02:02:51.320 | or somebody to start a company like Elon Musk.
02:02:56.020 | And what I think more about, I won't say giving back,
02:03:03.600 | that's such a trite thing.
02:03:05.560 | I moved back to New Orleans for a reason.
02:03:09.880 | First of all, the hurricane hit.
02:03:11.800 | And after Katrina, I was asked to be
02:03:13.680 | vice chair of the Recovery Authority.
02:03:17.000 | And I realized everything I've got going for me,
02:03:20.760 | it all comes from this beautiful gem of a troubled city.
02:03:25.760 | The wonderful high school I went to,
02:03:29.080 | the wonderful streets where I learned to ride a bike,
02:03:34.760 | and it's got challenges.
02:03:36.240 | I'm never gonna solve challenges at the grand global level,
02:03:41.240 | but I can go back home and say,
02:03:44.320 | part of my legacy is going to be,
02:03:46.720 | I tried to pay it back to my hometown,
02:03:50.760 | even by teaching at Tulane, which I don't do as a favor.
02:03:53.640 | I mean, I enjoy the hell out of it,
02:03:55.920 | but it's like, all right, I'm part of a community.
02:03:58.520 | And I think we lose that in America
02:04:00.880 | because people who are lonely are lonely
02:04:02.760 | 'cause they're not part of a community.
02:04:04.640 | But I've got all my high school kids,
02:04:06.240 | their friends, they're all still in New Orleans.
02:04:08.440 | I've got my family, but I also have Tulane,
02:04:11.400 | institutions in New Orleans that have been there forever.
02:04:14.840 | And if I can get involved
02:04:16.600 | in helping the school system in New Orleans,
02:04:18.680 | of helping the youth empowerment programs,
02:04:21.560 | of helping the innovation center at Tulane,
02:04:25.280 | I was even on the city planning commission,
02:04:27.240 | which worries about zoning ordinances
02:04:29.320 | for short-term rentals, you know, go figure.
02:04:33.000 | But it was like, no, immerse myself in my community
02:04:35.760 | 'cause my community was just so awesomely good
02:04:40.300 | at allowing me to become who I became
02:04:43.840 | and has trouble year by year, hurricane by hurricane,
02:04:48.840 | making sure that each new generation can be creative.
02:04:53.760 | And it's a city of creativity,
02:04:55.780 | from jazz to the food to the architecture.
02:04:58.440 | So when I think of, I won't say legacy,
02:05:01.820 | but what am I gonna do to pay it forward,
02:05:04.160 | which is a lower level way of saying legacy,
02:05:08.240 | I pay it forward by going back to the place where I began
02:05:11.980 | and trying to know it for the first time.
02:05:14.040 | That was a ripoff of a T.S. Eliot line.
02:05:19.100 | I don't want you to think I thought of that one.
02:05:21.500 | - Always cite your sources, I appreciate it.
02:05:24.800 | - T.S. Eliot, if you ever need to figure it out,
02:05:27.920 | the four quartets, there's that part at the end,
02:05:30.680 | which is, "We shall not cease from exploration."
02:05:33.480 | And the end of all of our exploring
02:05:35.360 | will be to return to the place where we started
02:05:37.760 | and know it for the first time,
02:05:39.960 | to the unknown but half-remembered gate.
02:05:42.480 | It's just beautiful.
02:05:43.800 | And that's been an inspiration of,
02:05:47.780 | what do you do in, I guess if it's a Shakespeare play,
02:05:54.200 | you'd call it Act V.
02:05:57.080 | Well, you go back to the place where you came
02:05:59.560 | and you don't sit there worrying about legacy,
02:06:04.000 | but you'll sit there saying,
02:06:05.520 | how do I make sure that somebody else can have
02:06:07.800 | a magical trajectory starting in New Orleans?
02:06:12.920 | - Well, to me, you're one of the greatest storytellers
02:06:16.800 | of all time, I've been a huge fan.
02:06:19.320 | - Definitely not true, but it's so sweet of you.
02:06:21.240 | See, you can be-- - Rudely interrupting.
02:06:23.520 | (laughing)
02:06:28.360 | - I think probably Ben Franklin,
02:06:31.840 | so for I don't know how many years, 15 years,
02:06:33.760 | Einstein, all the way through today,
02:06:36.720 | I've just been a huge fan of yours
02:06:38.340 | and you're one of the people that I thought
02:06:40.640 | surely would not lower themselves
02:06:44.120 | to appear and have a conversation with me.
02:06:47.600 | And it's just a giant gift to me.
02:06:49.800 | - Hey, I flew into Austin for this because I am a big fan,
02:06:54.200 | and especially a big fan because you take people seriously
02:06:57.800 | and you care.
02:06:58.960 | - Thank you, a thousand times thank you for respecting me
02:07:01.400 | and for inspiring just millions of people with your stories.
02:07:04.800 | Again, an incredible storyteller, incredible human,
02:07:07.720 | and thank you for talking today.
02:07:09.400 | - Thank you, Lex.
02:07:11.120 | - Thanks for listening to this conversation
02:07:12.520 | with Walter Isaacson.
02:07:13.840 | To support this podcast,
02:07:15.120 | please check out our sponsors in the description.
02:07:17.960 | And now let me leave you with one of my favorite quotes
02:07:21.080 | from Carl Jung, "People will do anything,
02:07:24.760 | no matter how absurd,
02:07:26.440 | in order to avoid facing their own souls.
02:07:29.780 | One does not become enlightened
02:07:31.280 | by imagining figures of light,
02:07:33.480 | but by making the darkness conscious."
02:07:37.220 | Thank you for listening and hope to see you next time.
02:07:40.460 | (upbeat music)