
Jeff Bezos: Amazon and Blue Origin | Lex Fridman Podcast #405


Chapters

0:00 Introduction
0:24 Texas ranch and childhood
4:02 Space exploration and rocket engineering
16:36 Physics
26:10 New Glenn rocket
68:59 Lunar program
78:55 Amazon
96:16 Principles
114:56 Productivity
125:34 Future of humanity


00:00:00.000 | "The following is a conversation with Jeff Bezos,
00:00:03.180 | "founder of Amazon and Blue Origin.
00:00:06.280 | "This is his first time doing a conversation
00:00:09.120 | "of this kind and of this length.
00:00:11.280 | "And as he told me,
00:00:12.620 | "it felt like we could have easily talked
00:00:14.160 | "for many more hours, and I'm sure we will.
00:00:17.700 | "This is the Lex Friedman Podcast,
00:00:19.760 | "and now, dear friends, here's Jeff Bezos."
00:00:23.720 | You spent a lot of your childhood with your grandfather
00:00:27.800 | on a ranch here in Texas.
00:00:30.240 | And I heard you had a lot of work to do around the ranch.
00:00:33.000 | So what's the coolest job you remember doing there?
00:00:35.320 | - Wow, coolest.
00:00:37.120 | - Most interesting, most memorable.
00:00:39.160 | - Most memorable.
00:00:40.000 | - Most impactful.
00:00:41.240 | - It's a real working ranch.
00:00:43.340 | And I spent all my summers on that ranch
00:00:46.640 | from age four to 16.
00:00:49.320 | And my grandfather was really taking me for those
00:00:51.680 | summers, and in the early summers,
00:00:54.360 | he was letting me pretend to help on the ranch,
00:00:57.440 | 'cause of course, a four-year-old is a burden,
00:00:59.240 | not a help in real life.
00:01:01.120 | He was really just watching me and taking care of me.
00:01:03.820 | And he was doing that because my mom was so young.
00:01:07.960 | She had me when she was 17,
00:01:09.880 | and so he was sort of giving her a break,
00:01:11.480 | and my grandmother and my grandfather
00:01:13.280 | would take me for these summers.
00:01:15.000 | But as I got a little older,
00:01:15.880 | I actually was helpful on the ranch, and I loved it.
00:01:18.800 | I was out there, like, my grandfather
00:01:21.360 | had a huge influence on me, huge factor in my life.
00:01:25.480 | I did all the jobs you would do on a ranch.
00:01:27.800 | I've fixed windmills and laid fences and pipelines,
00:01:32.800 | and done all the things that any rancher would do,
00:01:37.920 | vaccinated the animals, everything.
00:01:40.000 | But we had a, you know, my grandfather,
00:01:44.000 | after my grandmother died, I was about 12,
00:01:47.160 | and I kept coming to the ranch.
00:01:48.640 | Then it was just him and me, just the two of us.
00:01:51.160 | And he was completely addicted to the soap opera,
00:01:55.840 | The Days of Our Lives, and we would go back
00:01:58.960 | to the ranch house every day around 1 p.m. or so
00:02:02.000 | to watch Days of Our Lives.
00:02:04.320 | Like sands through the hourglass,
00:02:06.040 | so are the days of our lives.
00:02:07.720 | - Just the image of that, the two of you sitting there,
00:02:10.920 | watching a soap opera.
00:02:12.160 | - He had these big, crazy dogs.
00:02:14.320 | It was really a very formative experience for me.
00:02:16.520 | But the key thing about it, for me,
00:02:19.240 | the great gift I got from it,
00:02:21.800 | was that my grandfather was so resourceful.
00:02:24.840 | You know, he did everything himself.
00:02:26.920 | He made his own veterinary tools.
00:02:28.440 | He would make needles to suture the cattle up with.
00:02:31.240 | He would find a little piece of wire and heat it up
00:02:34.080 | and pound it thin and drill a hole in it and sharpen it.
00:02:37.160 | So, you know, you learn different things on a ranch
00:02:40.800 | than you would learn, you know, growing up in a city.
00:02:43.320 | - So self-reliance.
00:02:44.520 | - Yeah, like figuring out that you can solve problems
00:02:47.600 | with enough persistence and ingenuity.
00:02:51.320 | And my grandfather bought a D6 bulldozer,
00:02:55.040 | which is a big bulldozer,
00:02:56.680 | and he got it for like $5,000
00:02:58.480 | 'cause it was completely broken down.
00:03:00.200 | It was like a 1955 Caterpillar D6 bulldozer.
00:03:04.640 | New, it would've cost, I don't know, more than $100,000.
00:03:07.920 | And we spent an entire summer fixing,
00:03:10.880 | like repairing that bulldozer.
00:03:13.040 | We'd, you know, use mail order to buy big gears
00:03:16.880 | for the transmission and they'd show up
00:03:18.440 | and they'd be too heavy to move,
00:03:19.720 | so we'd have to build a crane.
00:03:21.160 | You know, just that kind of,
00:03:23.400 | kind of that problem-solving mentality.
00:03:26.540 | He had it so powerfully.
00:03:28.880 | You know, he did all of his own.
00:03:32.560 | He just, he didn't pick up the phone and call somebody.
00:03:35.400 | He would figure it out on his own.
00:03:37.320 | Doing his own veterinary work, you know.
00:03:39.360 | - But just the image of the two of you
00:03:41.000 | fixing a D6 bulldozer and then going in for a little break
00:03:44.880 | at 1 p.m. to watch soap opera.
00:03:47.440 | - Laying on the floor, that's how he watched TV.
00:03:50.440 | He was a really, really remarkable guy.
00:03:52.600 | - That's how I imagine Clint Eastwood also.
00:03:54.600 | (laughing)
00:03:55.760 | In all those westerns, when he's not doing what he's doing,
00:03:59.480 | he's just watching soap operas.
00:04:01.440 | All right, I read that you fell in love
00:04:04.040 | with the idea of space and space exploration
00:04:06.120 | when you were five,
00:04:07.760 | while watching Neil Armstrong walking on the moon.
00:04:10.880 | So let me ask you to look back
00:04:13.360 | at the historical context and impact of that.
00:04:16.160 | So the space race from 1957 to 1969
00:04:20.920 | between the Soviet Union and the U.S.
00:04:23.520 | was in many ways epic.
00:04:25.520 | It was a rapid sequence of dramatic events.
00:04:29.200 | First satellite to space, first human to space,
00:04:32.000 | first spacewalk, first uncrewed landing on the moon,
00:04:35.560 | then some failures, explosions,
00:04:38.360 | deaths on both sides actually,
00:04:40.400 | and then the first human walking on the moon.
00:04:43.760 | What are some of the more inspiring moments
00:04:45.800 | or insights you take away from that time,
00:04:47.560 | those few years, that just 12 years?
00:04:51.080 | - Well, I mean, there's so much inspiring there.
00:04:54.560 | One of the great things to take away from that,
00:04:56.920 | one of the great Von Braun quotes is,
00:04:59.340 | "I have come to use the word impossible with great caution."
00:05:04.340 | (laughing)
00:05:08.080 | That's kind of the big story of Apollo,
00:05:10.440 | is that going to the moon was literally
00:05:15.440 | an analogy that people used
00:05:17.200 | for something that's impossible.
00:05:19.280 | Oh yeah, you'll do that when men walk on the moon.
00:05:23.280 | And of course it finally happened.
00:05:26.120 | So I think it was pulled forward in time
00:05:29.240 | because of the space race.
00:05:31.560 | I think with the geopolitical implications
00:05:34.820 | and how much resource was put into it.
00:05:37.720 | At the peak, that program was spending
00:05:39.800 | 2 or 3% of GDP on the Apollo program.
00:05:44.040 | So much resource, I think it was pulled forward in time.
00:05:48.360 | We kind of did it ahead of when we,
00:05:51.000 | quote unquote, should have done it.
00:05:53.240 | And so in that way, it's also a technical marvel.
00:05:56.840 | I mean, it's truly incredible.
00:05:58.920 | It's the 20th century version
00:06:02.120 | of building the pyramids or something.
00:06:04.560 | It's an achievement that because it was pulled forward
00:06:08.800 | in time and because it did something
00:06:10.280 | that had previously been thought impossible,
00:06:12.520 | it rightly deserves its place
00:06:14.360 | as in the pantheon of great human achievements.
00:06:17.720 | - And of course you named the projects,
00:06:20.000 | the rockets that Blue Origin is working on
00:06:22.520 | after some of the folks involved.
00:06:24.120 | I don't understand why there isn't a New Gagarin.
00:06:26.600 | Is that--
00:06:27.440 | - There's an American bias in the naming.
00:06:29.000 | I apologize.
00:06:30.080 | - It's very strange.
00:06:31.200 | - Lex.
00:06:32.040 | (laughing)
00:06:32.860 | - Just asking for a friend.
00:06:33.700 | - I'm a big fan of Gagarin's though.
00:06:35.320 | And in fact, I think his first words in space
00:06:40.320 | I think are incredible.
00:06:43.280 | He purportedly said, "My God, it's blue."
00:06:47.640 | And that really drives home.
00:06:49.640 | No one had seen the Earth from space.
00:06:52.320 | No one knew that we were on this blue planet.
00:06:55.560 | No one knew what it looked like from out there.
00:06:58.040 | And Gagarin was the first person to see it.
00:07:01.040 | One of the things I think about is how dangerous
00:07:04.120 | those early days were for Gagarin, for Glenn,
00:07:07.540 | for everybody involved.
00:07:08.680 | Like how big of a risk they were all taking.
00:07:10.440 | - They were taking huge risks.
00:07:12.360 | I'm not sure what the Soviets thought
00:07:14.520 | about Gagarin's flight,
00:07:16.040 | but I think that the Americans thought
00:07:18.320 | that the Alan Shepard flight,
00:07:19.960 | the flight that New Shepard is named after,
00:07:22.440 | the first American in space,
00:07:23.800 | when he went on his suborbital flight.
00:07:26.340 | They thought he had about a 75% chance of success.
00:07:31.320 | So that's a pretty big risk, a 25% risk.
00:07:36.000 | - It's kind of interesting that Alan Shepard
00:07:37.760 | is not quite as famous as John Glenn.
00:07:39.920 | So for people who don't know,
00:07:40.880 | Alan Shepard is the first astronaut--
00:07:44.680 | - The first American in space.
00:07:46.400 | - American in suborbital flight.
00:07:48.360 | - Correct.
00:07:49.200 | - And then the first orbital flight is--
00:07:51.320 | - John Glenn is the first American to orbit the Earth.
00:07:54.800 | By the way, I have the most charming, sweet,
00:07:58.000 | incredible letter from John Glenn,
00:08:00.920 | which I have framed and hang on my office wall.
00:08:04.000 | - What does he say?
00:08:04.840 | - Where he tells me how grateful he is
00:08:07.840 | that we have named New Glenn after him.
00:08:10.640 | And he sent me that letter about a week before he died.
00:08:13.640 | And it's really an incredible,
00:08:15.840 | it's also a very funny letter he's writing.
00:08:18.440 | And he says, you know, this is a letter about New Glenn
00:08:22.720 | from the original Glenn.
00:08:24.120 | And he's just, he's got a great sense of humor
00:08:26.160 | and he's very happy about it and grateful.
00:08:29.240 | It's very sweet.
00:08:30.280 | - Does he say, "P.S., don't mess this up," or is that--
00:08:32.560 | (laughing)
00:08:34.240 | - No, he doesn't.
00:08:35.080 | - Make me look good.
00:08:35.920 | - He doesn't do that.
00:08:36.740 | - Okay, all right.
00:08:37.580 | - But wait, but John, wherever you are,
00:08:38.480 | we got you covered.
00:08:39.320 | - All right, good.
00:08:41.040 | So back to maybe the big picture of space.
00:08:44.360 | When you look up at the stars and think big,
00:08:47.920 | what do you hope is the future of humanity,
00:08:49.800 | hundreds, thousands of years from now out in space?
00:08:54.200 | - I would love to see a trillion humans
00:08:59.200 | living in the solar system.
00:09:01.600 | If we had a trillion humans, we would have,
00:09:04.380 | at any given time, 1,000 Mozarts and 1,000 Einsteins.
00:09:08.080 | That would, our solar system would be full of life
00:09:11.740 | and intelligence and energy.
00:09:13.760 | And we can easily support a civilization that large
00:09:17.520 | with all of the resources in the solar system.
00:09:21.180 | - So what do you think that looks like?
00:09:23.340 | Giant space stations?
00:09:24.560 | - Yeah, the only way to get to that vision
00:09:26.480 | is with giant space stations.
00:09:27.960 | You know, the planetary surfaces are just way too small.
00:09:32.040 | So you can, I mean, unless you turn them
00:09:33.880 | into giant space stations or something.
00:09:36.480 | But yeah, we will take materials from the moon
00:09:39.360 | and from near-Earth objects and from the asteroid belt
00:09:43.520 | and so on, and we'll build giant O'Neill-style colonies.
00:09:48.520 | And people will live in those.
00:09:50.900 | And they have a lot of advantages over planetary surfaces.
00:09:54.100 | You can spin them to get normal Earth gravity.
00:09:57.640 | You can put them where you want them.
00:09:59.320 | I think most people are gonna wanna live near Earth,
00:10:03.100 | not necessarily in Earth orbit, but in, you know,
00:10:07.220 | Earth vicinity orbits.
00:10:11.100 | And so they can move relatively quickly back and forth
00:10:16.100 | between their station and Earth.
00:10:19.180 | So I think a lot of people, especially in the early stages,
00:10:22.120 | are not gonna wanna give up Earth altogether.
00:10:24.780 | - They go to Earth for vacation.
00:10:26.640 | - Yeah, same way that, you know,
00:10:28.240 | you might go to Yellowstone National Park for vacation.
00:10:31.600 | People will, and people will get to choose
00:10:35.000 | whether they live on Earth or whether they live in space,
00:10:37.720 | but they'll be able to use much more energy
00:10:40.520 | and much more material resource in space
00:10:43.200 | than they would be able to use on Earth.
00:10:45.440 | - One of the interesting ideas you had
00:10:46.780 | is to move the heavy industry away from Earth.
00:10:49.720 | So people sometimes have this idea
00:10:51.400 | that somehow space exploration is in conflict
00:10:55.800 | with the celebration of the planet Earth,
00:10:58.560 | that we should focus on preserving Earth.
00:11:00.600 | And basically your idea is that space travel
00:11:04.200 | and space exploration is a way to preserve Earth.
00:11:06.760 | - Exactly.
00:11:07.880 | This planet, we've sent robotic probes to all the planets.
00:11:12.880 | We know that this is the good one.
00:11:14.880 | (laughing)
00:11:17.460 | Not to play favorites or anything, but--
00:11:19.180 | - Earth really is the good planet.
00:11:20.740 | It's amazing, the ecosystem we have here,
00:11:24.100 | all of the life and the lush plant life
00:11:27.740 | and the water resources, everything.
00:11:30.260 | This planet is really extraordinary.
00:11:33.220 | And of course, we evolved on this planet,
00:11:35.020 | so of course it's perfect for us.
00:11:37.300 | But it's also perfect for all the advanced life forms
00:11:39.820 | on this planet, all the animals and so on.
00:11:42.220 | And so this is a gem.
00:11:44.060 | We do need to take care of it.
00:11:46.020 | And as we enter the Anthropocene,
00:11:48.700 | as we humans have gotten so sophisticated
00:11:52.500 | and large and impactful, as we stride across this planet,
00:11:57.500 | as we continue, we want to use a lot of energy.
00:12:03.380 | We want to use a lot of energy per capita.
00:12:05.260 | We've gotten amazing things.
00:12:07.220 | We don't wanna go backwards.
00:12:09.980 | If you think about the good old days,
00:12:14.980 | they're mostly an illusion.
00:12:17.260 | Like in almost every way,
00:12:19.140 | life is better for almost everyone today
00:12:21.780 | than it was, say, 50 years ago or 100 years ago.
00:12:25.180 | We live better lives by and large than our grandparents did
00:12:29.420 | and then their grandparents did and so on.
00:12:31.660 | And you can see that in global illiteracy rates,
00:12:34.340 | global poverty rates, global infant mortality rates,
00:12:38.020 | like almost any metric you choose,
00:12:40.980 | we're better off than we used to be.
00:12:42.660 | We get antibiotics and all kinds of life-saving medical care
00:12:47.260 | and so on and so on.
00:12:49.060 | And there's one thing that is moving backwards
00:12:53.020 | and it's the natural world.
00:12:54.540 | So it is a fact that 500 years ago, pre-industrial age,
00:12:59.540 | the natural world was pristine.
00:13:01.400 | It was incredible.
00:13:03.620 | And we have traded some of that pristine beauty
00:13:08.620 | for all of these other gifts
00:13:10.700 | that we have as an advanced society.
00:13:13.660 | And we can have both,
00:13:16.100 | but to do that, we have to go to space.
00:13:18.860 | And all of this really,
00:13:20.540 | the most fundamental measure is energy usage per capita.
00:13:25.540 | And when you look at,
00:13:27.260 | you do wanna continue to use more and more energy.
00:13:30.300 | It is going to make your life better in so many ways.
00:13:33.540 | But that's not compatible ultimately
00:13:36.340 | with living on a finite planet.
00:13:37.980 | And so we have to go out into the solar system.
00:13:40.900 | And really you could argue about when you have to do that,
00:13:44.740 | but you can't credibly argue
00:13:47.420 | about whether you have to do that.
00:13:49.540 | - Eventually we have to do that.
00:13:51.220 | - Exactly.
00:13:52.300 | - So you don't often talk about it,
00:13:53.980 | but let me ask you on that topic about the Blue Ring
00:13:57.380 | and the Orbital Reef Space Infrastructure Projects.
00:14:01.340 | What's your vision for these?
00:14:03.380 | - So Blue Ring is a very interesting spacecraft
00:14:07.100 | that is designed to take up to 3,000 kilograms of payload
00:14:12.100 | up to geosynchronous orbit or in lunar vicinity.
00:14:17.180 | It has two different kinds of propulsion.
00:14:20.180 | It has chemical propulsion and it has electric propulsion.
00:14:24.460 | And so you can use Blue Ring in a couple of different ways.
00:14:29.060 | You can slowly move,
00:14:30.980 | let's say up to geosynchronous orbit
00:14:32.780 | using electric propulsion that might take 100 days
00:14:36.780 | or 150 days, depending on how much mass you're carrying.
00:14:40.260 | And then, and reserve your chemical propulsion
00:14:42.940 | so that you can change orbits quickly in geosynchronous orbit
00:14:46.740 | or you can use the chemical propulsion first
00:14:49.660 | to quickly get up to geosynchronous
00:14:51.340 | and then use your electrical propulsion
00:14:52.860 | to slowly change your geosynchronous orbit.
00:14:55.660 | Blue Ring has a couple of interesting features.
00:14:59.500 | It provides a lot of services to these payloads.
00:15:04.500 | So the payload, it could be one large payload
00:15:07.260 | or it can be a number of small payloads.
00:15:10.140 | And it provides thermal management,
00:15:12.100 | it provides electric power,
00:15:14.060 | it provides compute, provides communications.
00:15:18.940 | And so when you design a payload for Blue Ring,
00:15:23.020 | you don't have to figure out all of those things
00:15:27.220 | on your own.
00:15:28.060 | That kind of radiation-tolerant compute
00:15:30.860 | is a complicated thing to do.
00:15:32.820 | And so we have an unusually large amount
00:15:35.740 | of radiation-tolerant compute on board Blue Ring
00:15:39.100 | and your payload can just use that when it needs to.
00:15:42.620 | So it's sort of all these services,
00:15:46.660 | it's like a set of APIs.
00:15:49.580 | It's a little bit like Amazon Web Services,
00:15:51.620 | but for space payloads that need to move
00:15:54.820 | about an Earth vicinity or lunar vicinity.
00:15:57.780 | - AWSS.
00:15:59.820 | (laughing)
00:16:00.940 | Okay, so compute in space.
00:16:02.740 | So you get a giant chemical rocket
00:16:04.820 | to get a payload out to orbit
00:16:06.900 | and then you have these admins that show up,
00:16:09.900 | this Blue Ring thing that manages various things
00:16:13.220 | like compute.
00:16:14.060 | - Exactly.
00:16:14.900 | And it can also provide transportation
00:16:17.180 | and move you around to different orbits.
00:16:18.980 | - Including humans, do you think?
00:16:20.420 | - No, Blue Ring is not designed to move humans around.
00:16:24.260 | It's designed to move payloads around.
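
To make the "set of APIs" analogy above concrete, here is a purely hypothetical sketch of what a hosted-payload services interface of this kind might look like. Every name and number is invented for illustration; none of it is Blue Origin's actual design.

```python
# Hypothetical sketch only: a host vehicle exposing shared services
# (power, radiation-tolerant compute, comms) so each payload doesn't
# have to solve those problems itself. All names/numbers are invented.

class HostedPayloadServices:
    """Toy model of services a host spacecraft might offer a payload."""

    def __init__(self, power_budget_w: float = 1000.0):
        self.power_budget_w = power_budget_w  # assumed shared power budget

    def allocate_power(self, watts: float) -> bool:
        """Grant a power allocation if it fits the remaining budget."""
        if watts <= self.power_budget_w:
            self.power_budget_w -= watts
            return True
        return False

    def run_compute(self, task: str) -> str:
        """Stand-in for submitting a job to the host's rad-tolerant computer."""
        return f"result-of({task})"

    def downlink(self, data: bytes) -> None:
        """Stand-in for queuing payload data on the shared comm link."""
        print(f"queued {len(data)} bytes for downlink")

# A payload using the shared services instead of flying its own:
services = HostedPayloadServices()
assert services.allocate_power(250.0)            # draw 250 W from the host
print(services.run_compute("image-processing"))  # -> result-of(image-processing)
services.downlink(b"telemetry")
```
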
00:16:26.820 | So we're also building a lunar lander,
00:16:30.180 | which is, of course, designed to land humans
00:16:33.580 | on the surface of the moon.
00:16:34.660 | - I'm gonna ask you about that,
00:16:35.740 | but let me actually just step back to the old days.
00:16:40.500 | You were at Princeton with aspirations
00:16:43.580 | to be a theoretical physicist.
00:16:45.380 | - Yeah.
00:16:46.220 | - What attracted you to physics
00:16:49.300 | and why did you change your mind and not become,
00:16:52.900 | why are you not, Jeff Bezos,
00:16:54.740 | the famous theoretical physicist?
00:16:56.980 | - So I loved physics and I studied physics
00:16:59.820 | and computer science.
00:17:01.500 | And I was proceeding along the physics path;
00:17:05.100 | I was planning to major in physics.
00:17:06.380 | And I wanted to be a theoretical physicist.
00:17:08.820 | And the computer science was sort of something
00:17:11.180 | I was doing for fun.
00:17:12.540 | I really loved it.
00:17:13.640 | And I was very good at the programming
00:17:18.180 | and doing those things
00:17:19.020 | and I enjoyed all my computer science classes immensely.
00:17:22.980 | But I really was determined to be a theoretical physicist.
00:17:26.860 | It's why I went to Princeton in the first place.
00:17:29.060 | It was definitely.
00:17:30.420 | And then I realized I was gonna be
00:17:33.100 | a mediocre theoretical physicist.
00:17:35.340 | And there were a few people in my classes,
00:17:39.580 | like in quantum mechanics and so on,
00:17:41.220 | who they could effortlessly do things
00:17:44.860 | that were so difficult for me.
00:17:46.380 | And I realized, like, you know,
00:17:48.340 | there are a thousand ways to be smart.
00:17:50.660 | And, you know,
00:17:52.740 | theoretical physics is one of those fields
00:17:55.900 | where, you know, only the top few percent
00:17:59.780 | actually move the state of the art forward.
00:18:02.620 | It's one of those things where you have to be really,
00:18:05.540 | just your brain has to be wired in a certain way.
00:18:09.060 | And there was a guy named,
00:18:10.780 | one of these people who convinced me,
00:18:14.420 | he didn't mean to convince me,
00:18:15.740 | but just by observing him,
00:18:17.100 | he convinced me that I should not try
00:18:18.900 | to be a theoretical physicist.
00:18:20.740 | His name was Yasantha.
00:18:22.540 | And Yasantha was from Sri Lanka.
00:18:26.860 | And he was one of the most brilliant people I'd ever met.
00:18:30.580 | My friend Joe and I were working on a very difficult
00:18:34.860 | partial differential equations problem set one night.
00:18:37.780 | And there was one problem that we worked on for three hours.
00:18:41.780 | And we made no headway whatsoever.
00:18:45.460 | And we looked up at each other at the same time.
00:18:48.220 | And we said, "Yosanta."
00:18:49.580 | So we went to Yosanta's dorm room and he was there.
00:18:52.980 | He was almost always there.
00:18:54.860 | And we said, "Yasantha, we're having trouble
00:18:57.460 | "solving this partial differential equation.
00:19:00.580 | "Would you mind taking a look?"
00:19:02.540 | And he said, "Of course."
00:19:03.420 | By the way, he was the most humble, most kind person.
00:19:07.580 | And so he took our, he looked at our problem
00:19:10.380 | and he stared at it for just a few seconds,
00:19:12.660 | maybe 10 seconds, and he said, "Cosine."
00:19:15.740 | And I said, "What do you mean, Yasantha?
00:19:17.660 | "What do you mean, cosine?"
00:19:18.500 | He said, "That's the answer."
00:19:19.700 | And I said, "No, no, no, come on."
00:19:21.460 | And he said, "Let me show you."
00:19:22.420 | And he took out some paper.
00:19:23.940 | And he wrote down three pages of equations.
00:19:27.580 | Everything canceled out.
00:19:29.380 | And the answer was cosine.
00:19:30.820 | And I said, "Yasantha, did you do that in your head?"
00:19:35.580 | And he said, "Oh no, that would be impossible.
00:19:37.980 | "A few years ago, I solved a similar problem.
00:19:40.820 | "And I could map this problem onto that problem.
00:19:43.480 | "And then it was immediately obvious
00:19:45.020 | "that the answer was cosine."
00:19:47.260 | - I had a few, you know, you have an experience like that,
00:19:49.500 | you realize maybe being a theoretical physicist
00:19:52.580 | (laughing)
00:19:54.540 | isn't what the universe wants you to be.
00:19:59.340 | And so I switched to computer science
00:20:02.140 | and that worked out really well for me.
00:20:05.580 | I enjoy it, I still enjoy it today.
00:20:07.380 | - Yeah, there's a particular kind of intuition
00:20:09.220 | you need to be a great physicist, applied to physics.
00:20:12.660 | - I think the mathematical skill required today
00:20:16.320 | is so high, you have to be a world-class mathematician
00:20:20.820 | to be a successful theoretical physicist today.
00:20:24.260 | And it's not, you know,
00:20:26.300 | you probably need other skills too,
00:20:29.900 | intuition, lateral thinking, and so on.
00:20:32.620 | But without just top-notch math skills,
00:20:37.020 | you're unlikely to be successful.
00:20:39.260 | - And visualization skill, you have to be able to
00:20:41.980 | really kind of do these kind of thought experiments.
00:20:44.300 | And if you want truly great creativity,
00:20:46.840 | actually, Walter Isaacson writes about you,
00:20:49.000 | puts you on the same level as Einstein.
00:20:51.880 | - That's very kind.
00:20:55.700 | I have, I'm an inventor.
00:20:57.500 | If you want to boil down what I am,
00:21:00.800 | I'm really an inventor.
00:21:02.420 | And I look at things and I can come up with
00:21:05.860 | atypical solutions and, you know,
00:21:08.900 | and then I can create a hundred such atypical solutions
00:21:12.380 | for something, and 99 of them may not survive,
00:21:16.260 | you know, scrutiny.
00:21:18.540 | But one of those 100 is like, hmm, maybe there is,
00:21:21.940 | maybe that might work.
00:21:23.460 | And then you can keep going from there.
00:21:25.300 | So that kind of lateral thinking,
00:21:27.660 | that kind of inventiveness,
00:21:30.020 | in a high-dimensionality space
00:21:32.660 | where the search space is very large,
00:21:34.780 | that's where my inventive skills come in.
00:21:37.420 | That's the thing I'm, if I self-identify as an inventor
00:21:41.500 | more than anything else.
00:21:43.300 | - Yeah, and he describes it in all kinds of different ways,
00:21:45.660 | Walter Isaacson does,
00:21:46.780 | that creativity combined with childlike wonder
00:21:51.780 | that you've maintained still to this day,
00:21:55.780 | all of that combined together.
00:21:57.260 | Is there, like, if you were to study your own brain,
00:21:59.540 | introspect, how do you think?
00:22:02.100 | What's your thinking process like?
00:22:03.620 | We'll talk about the writing process
00:22:05.420 | of putting it down on paper,
00:22:07.460 | which is quite rigorous and famous at Amazon.
00:22:12.460 | But how do you, when you sit down,
00:22:15.020 | maybe alone, maybe with others,
00:22:16.700 | and thinking through this high-dimensional space
00:22:19.580 | and looking for creative solutions,
00:22:22.420 | creative paths forward,
00:22:24.060 | is there something you could say about that process?
00:22:26.260 | - It's such a good question.
00:22:27.920 | And I honestly don't know how it works.
00:22:30.980 | If I did, I would try to explain it.
00:22:33.140 | I know it involves lots of wandering.
00:22:35.900 | So I, you know, when I sit down to work on a problem,
00:22:40.220 | I know I don't know where I'm going.
00:22:43.620 | So to go in a straight line, to be efficient,
00:22:47.260 | efficiency and invention are sort of at odds,
00:22:50.580 | because invention, real invention,
00:22:53.180 | not incremental improvement.
00:22:54.580 | Incremental improvement is so important
00:22:56.460 | in every endeavor and everything you do.
00:22:58.660 | You have to work hard on also just making things
00:23:00.540 | a little bit better.
00:23:02.260 | But I'm talking about real invention,
00:23:04.620 | real lateral thinking, that requires wandering.
00:23:08.540 | And you have to give yourself permission to wander.
00:23:11.180 | I think a lot of people,
00:23:12.740 | they feel like wandering is inefficient.
00:23:20.220 | And, you know, like when I sit down at a meeting,
00:23:25.220 | I don't know how long the meeting is gonna take
00:23:28.580 | if we're trying to solve a problem.
00:23:30.340 | Because if I did, then I'd already,
00:23:34.180 | I'd know there's some kind of straight line
00:23:35.900 | that we're drawing to the solution.
00:23:37.980 | The reality is we may have to wander for a long time.
00:23:42.180 | And I do like group invention.
00:23:43.780 | I think there's really nothing more fun
00:23:46.820 | than sitting at a whiteboard with a group of smart people
00:23:51.380 | and spitballing and coming up with new ideas
00:23:55.540 | and objections to those ideas
00:23:57.340 | and then solutions to the objections
00:23:59.260 | and going back and forth.
00:24:01.060 | So, like, you know, sometimes you wake up with an idea
00:24:05.420 | in the middle of the night.
00:24:07.620 | And sometimes you sit down with a group of people
00:24:10.620 | and go back and forth.
00:24:11.780 | And both things are really pleasurable.
00:24:14.860 | - And when you wander, I think one key thing
00:24:17.220 | is to notice a good idea.
00:24:21.900 | And to, maybe to notice the kernel of a good idea.
00:24:25.900 | Maybe pull out that string.
00:24:27.940 | 'Cause I don't think good ideas come fully formed.
00:24:31.340 | - 100% right.
00:24:33.660 | In fact, when I come up with what I think is a good idea,
00:24:37.460 | and it survives kind of the first level of scrutiny,
00:24:40.860 | you know, that I do in my own head,
00:24:42.540 | and I'm ready to tell somebody else about the idea,
00:24:45.820 | I will often say, look, it is gonna be really easy
00:24:48.540 | for you to find objections to this idea.
00:24:51.940 | But work with me.
00:24:53.140 | - There's something there.
00:24:54.500 | - There's something there.
00:24:55.660 | And that is intuition.
00:24:57.700 | Because it's really easy to kill new ideas in the beginning.
00:25:02.700 | 'Cause they do have so many easy objections to them.
00:25:06.980 | So you need to kind of forewarn people and say,
00:25:10.180 | look, I know it's gonna take a lot of work
00:25:12.260 | to get this to a fully formed idea.
00:25:14.740 | Let's get started on that.
00:25:15.940 | It'll be fun.
00:25:17.260 | - So you've got that ability to say cosine
00:25:19.140 | in you somewhere, after all.
00:25:20.700 | (laughing)
00:25:22.020 | Maybe not on math, but--
00:25:23.140 | - In a different domain.
00:25:25.140 | There are a thousand ways to be smart, by the way.
00:25:27.940 | And that is a really, when I go around and I meet people,
00:25:32.940 | I'm always looking for the way that they're smart.
00:25:35.620 | And you find, that's one of the things
00:25:38.620 | that makes the world so interesting and fun,
00:25:42.700 | is that it's not like IQ is a single dimension.
00:25:47.700 | There are people who are smart in such unique ways.
00:25:52.900 | - Yeah, you just gave me a good response
00:25:54.420 | is when somebody calls me an idiot on the internet.
00:25:56.540 | (laughing)
00:25:57.380 | You know, there's a thousand ways to be smart, sir.
00:26:01.060 | - Well, they might tell you, yeah,
00:26:02.940 | but there are a million ways to be dumb.
00:26:04.780 | - Yeah, right.
00:26:05.620 | (laughing)
00:26:06.460 | I feel like that's a Mark Twain quote.
00:26:08.700 | Okay, all right.
00:26:10.500 | You gave me an amazing tour of Blue Origin Rocket Factory
00:26:13.500 | and Launch Complex in the historic Cape Canaveral.
00:26:16.320 | That's where New Glenn, the big rocket we talked about,
00:26:21.060 | is being built and will launch.
00:26:24.380 | Can you explain what the New Glenn rocket is
00:26:26.700 | and tell me some interesting technical aspects
00:26:28.860 | of how it works?
00:26:29.700 | - Sure.
00:26:30.540 | New Glenn is a very large, a heavy lift launch vehicle.
00:26:36.460 | It'll take about 45 metric tons to LEO,
00:26:41.340 | a very large class.
00:26:43.640 | It's about half the thrust, a little more than half
00:26:47.780 | the thrust of the Saturn V rocket.
00:26:50.940 | So it's about 3.9 million pounds of thrust on liftoff.
00:26:55.300 | The booster has seven BE4 engines.
00:27:00.220 | Each engine generates a little more
00:27:01.860 | than 550,000 pounds of thrust.
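
Those two figures are consistent with each other:

$$7 \times 550{,}000\ \text{lbf} \approx 3{,}850{,}000\ \text{lbf} \approx 3.9\ \text{million pounds of thrust at liftoff}$$
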
00:27:05.440 | The engines are fueled by liquid natural gas,
00:27:09.300 | liquefied natural gas, LNG, as the fuel
00:27:12.420 | and LOX as the oxidizer.
00:27:15.240 | The cycle is an ox-rich staged combustion cycle.
00:27:19.220 | It's a cycle that was really pioneered by the Russians.
00:27:21.740 | It's a very good cycle.
00:27:23.220 | And that engine is also going to power the first stage
00:27:29.220 | of the Vulcan rocket,
00:27:30.220 | which is the United Launch Alliance rocket.
00:27:32.560 | Then the second stage of New Glenn
00:27:36.660 | is powered by two BE3U engines,
00:27:39.260 | which is an upper stage variant
00:27:41.540 | of our New Shepard liquid hydrogen engine.
00:27:43.940 | So the BE3U has 160,000 pounds of thrust.
00:27:48.180 | So two of those, 320,000 pounds of thrust.
00:27:51.820 | And hydrogen is a very good propellant for upper stages
00:27:56.820 | because it has very high ISP.
00:27:59.340 | It's not a great propellant in my view for booster stages
00:28:03.820 | because the stages then get physically so large.
00:28:06.780 | Hydrogen has very high ISP,
00:28:10.000 | but liquid hydrogen is not dense at all.
00:28:14.720 | So to store liquid hydrogen,
00:28:17.180 | if you need to store many thousands of pounds
00:28:19.020 | of liquid hydrogen, your tanks,
00:28:20.340 | your liquid hydrogen tank gets very large.
00:28:23.020 | So you really, you get more benefit from the higher ISP,
00:28:27.380 | the specific impulse.
00:28:29.000 | You get more benefit from the higher specific impulse
00:28:31.620 | on the second stage.
00:28:32.960 | And that stage carries less propellant.
00:28:37.540 | So you don't get such geometrically gigantic tanks.
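
The trade being described falls straight out of the ideal rocket equation (standard physics, nothing New Glenn-specific):

$$\Delta v = I_{\mathrm{sp}}\, g_0 \ln\!\left(\frac{m_0}{m_f}\right), \qquad g_0 \approx 9.81\ \mathrm{m/s^2}$$

For a given mass ratio $m_0/m_f$, a stage's $\Delta v$ scales linearly with specific impulse. Hydrolox upper-stage engines typically reach a vacuum $I_{\mathrm{sp}}$ somewhere around 440-460 s, versus very roughly 330-380 s for denser propellant combinations (illustrative ranges, not Blue Origin figures). The penalty is volume: liquid hydrogen is only about 71 kg/m^3, versus roughly 430 kg/m^3 for LNG, which is why booster-sized hydrogen tanks get, as he says, geometrically gigantic.
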
00:28:40.980 | The Delta IV is an example of a vehicle
00:28:43.940 | that is all hydrogen.
00:28:45.140 | The booster stage is also hydrogen.
00:28:47.660 | And I think that it's a very effective vehicle,
00:28:50.140 | but it never was very cost-effective.
00:28:52.360 | So it's operationally very capable,
00:28:55.340 | but not very cost-effective.
00:28:56.740 | - So size is also costly.
00:28:58.440 | - Size is costly.
00:28:59.700 | So it's interesting.
00:29:01.500 | Rockets love to be big.
00:29:03.420 | Everything works better.
00:29:05.100 | - What do you mean by that?
00:29:06.300 | You've told me that before.
00:29:07.460 | It sounds epic, but what does it mean?
00:29:09.940 | - I mean, when you look at the physics of rocket engines,
00:29:15.820 | and also when you look at parasitic mass,
00:29:19.540 | it doesn't, if you have,
00:29:20.380 | let's say you have an avionics system.
00:29:22.180 | So you have a guidance and control system.
00:29:24.820 | That is gonna be about the same mass and size
00:29:29.340 | for a giant rocket as it is gonna be for a tiny rocket.
00:29:33.420 | And so that's just parasitic mass
00:29:36.540 | that is very consequential
00:29:38.780 | if you're building a very small rocket,
00:29:41.740 | but is trivial if you're building a very large rocket.
00:29:44.100 | So you have the parasitic mass thing.
00:29:45.420 | And then if you look at, for example,
00:29:48.020 | rocket engines have turbopumps.
00:29:50.380 | They have to pressurize the fuel and the oxidizer
00:29:53.740 | up to a very high pressure level
00:29:55.740 | in order to inject it into the thrust chamber
00:29:58.180 | where it burns.
00:29:59.780 | And those pumps, all rotating machines, in fact,
00:30:03.860 | get more efficient as they get larger.
00:30:06.620 | So really tiny turbopumps are very challenging
00:30:11.340 | to manufacture, and any kind of gaps
00:30:15.060 | between the housing, for example,
00:30:19.420 | and the rotating impeller that pressurizes the fuel,
00:30:22.900 | there has to be some gap there.
00:30:24.340 | You can't have those parts scraping against one another.
00:30:27.460 | And those gaps drive inefficiencies.
00:30:31.060 | And so if you have a very large turbopump,
00:30:39.780 | those gaps in percentage terms end up being very small.
00:30:39.780 | And so there's a bunch of things
00:30:41.860 | that you end up loving about having a large rocket
00:30:46.060 | and that you end up hating for a small rocket.
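
A toy model of the clearance effect he's describing; the numbers are invented purely to show the trend, since real pump leakage depends on much more than this ratio:

```python
# Toy scaling illustration (invented numbers): if the minimum practical
# tip clearance is roughly fixed by machining tolerances and thermal
# growth, its relative size (and the leakage it drives) shrinks as the
# impeller grows.

def clearance_fraction(impeller_radius_mm: float, clearance_mm: float = 0.2) -> float:
    """Fixed clearance as a fraction of impeller radius."""
    return clearance_mm / impeller_radius_mm

for radius in (20, 100, 400):  # small, medium, large impellers (mm)
    print(f"radius {radius:>3} mm -> clearance is "
          f"{clearance_fraction(radius):.2%} of radius")

# radius  20 mm -> clearance is 1.00% of radius
# radius 100 mm -> clearance is 0.20% of radius
# radius 400 mm -> clearance is 0.05% of radius
```
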
00:30:48.460 | But there's a giant exception to this rule
00:30:51.420 | and it is manufacturing.
00:30:53.260 | So manufacturing large structures is very, very challenging.
00:30:58.020 | It's a pain in the butt.
00:30:59.380 | And so it's just, if you're making a small rocket engine,
00:31:03.340 | you can move all the pieces by hand.
00:31:04.820 | You could assemble it on a table.
00:31:06.580 | One person can do it.
00:31:09.060 | You don't need cranes and heavy lift operations
00:31:12.260 | and tooling and so on and so on.
00:31:14.860 | When you start building big objects,
00:31:16.820 | civil infrastructure, just like the launch pad
00:31:21.900 | and all this, we went and visited,
00:31:24.060 | I took you to the launch pad
00:31:25.900 | and you can see it's so monumental.
00:31:27.740 | - Yeah, it is.
00:31:28.580 | - And so just these things become major undertakings,
00:31:32.940 | both from an engineering point of view,
00:31:35.060 | but also from a construction and cost point of view.
00:31:37.780 | - And even the foundation of the launch pad,
00:31:40.060 | I mean, this is Florida.
00:31:42.020 | Isn't it like swampland, like how deep you have to go?
00:31:44.900 | - At Cape Canaveral, in fact, at most ocean,
00:31:48.660 | most launch pads are on beaches somewhere on the ocean side
00:31:52.380 | 'cause you wanna launch over water for safety reasons.
00:31:55.800 | Yes, you have to drive pilings,
00:32:00.420 | dozens and dozens and dozens of pilings,
00:32:05.060 | 50, 100, 150 feet deep to get enough structural integrity
00:32:09.340 | for these very large, yes,
00:32:12.140 | these turn into major civil engineering projects.
00:32:15.380 | - I just have to say, everything about that factory
00:32:17.540 | is pretty badass, as you said, the tooling.
00:32:19.780 | The bigger it gets, the more epic it is.
00:32:22.500 | - It does make it epic, it's fun to look at,
00:32:24.860 | it's extraordinary.
00:32:25.980 | - It's humbling also 'cause humans are so small
00:32:28.780 | compared to it.
00:32:29.620 | - We are building these enormous machines
00:32:33.340 | that are harnessing enormous amounts of chemical power
00:32:38.340 | in very, very compact packages.
00:32:42.900 | It's truly extraordinary.
00:32:44.360 | - But then there's all the different components
00:32:47.020 | and the materials involved.
00:32:49.980 | Is there something interesting that you can describe
00:32:52.180 | about the materials that comprise the rocket
00:32:57.020 | so it has to be as light as possible, I guess,
00:32:59.820 | whilst withstanding the heat and the harsh conditions?
00:33:03.100 | - Yeah, I play a little kind of game sometimes
00:33:05.920 | with other rocket people that I run into
00:33:08.500 | where I say, what are the things
00:33:10.780 | that would amaze the 1960s engineers?
00:33:13.940 | Like, what's the change?
00:33:15.340 | 'Cause surprisingly, some of rocketry's greatest hits
00:33:18.620 | have not changed.
00:33:19.900 | They are still, they would recognize immediately
00:33:22.660 | a lot of what we do today,
00:33:24.140 | and it's exactly what they pioneered back in the '60s.
00:33:27.980 | But a few things have changed.
00:33:31.260 | The use of carbon composites is very different today.
00:33:36.260 | We can build very sophisticated,
00:33:39.820 | you saw our carbon tape laying machine
00:33:42.640 | that builds the giant fairings,
00:33:44.900 | and we can build these incredibly light,
00:33:48.540 | very stiff fairing structures
00:33:51.940 | out of carbon composite material
00:33:55.260 | that they could not have dreamed of.
00:33:57.500 | I mean, the efficiency, the structural efficiency
00:34:00.540 | of that material is so high
00:34:03.780 | compared to any metallic material
00:34:06.260 | you might use or anything else.
00:34:08.100 | So, that's one.
00:34:10.040 | Aluminum-lithium, and the ability
00:34:14.660 | to friction stir-weld aluminum-lithium.
00:34:17.420 | Do you remember the friction stir-welding
00:34:19.100 | that I showed you? - Yeah, yeah, it's incredible.
00:34:20.740 | - This is a remarkable technology.
00:34:23.220 | It was invented decades ago,
00:34:24.860 | but has become very practical
00:34:26.660 | over just the last couple of decades.
00:34:29.360 | And instead of using heat
00:34:31.860 | to weld two pieces of metal together,
00:34:34.760 | it literally stirs the two pieces.
00:34:37.300 | There's a pin that rotates at a certain rate,
00:34:40.740 | and you put that pin between the two plates of metal
00:34:44.060 | that you wanna weld together,
00:34:45.980 | and then you move it at a very precise speed.
00:34:50.780 | And instead of heating the material,
00:34:52.980 | it heats it a little bit because of friction,
00:34:54.540 | but not very much.
00:34:55.380 | You can literally, immediately after welding
00:34:57.540 | with stir-friction welding, you can touch the material
00:35:00.020 | and it's just barely warm.
00:35:01.400 | It literally stirs the molecules together.
00:35:04.780 | It's quite extraordinary.
00:35:05.940 | - Relatively low temperature,
00:35:07.100 | and I guess high temperature is what makes them,
00:35:09.740 | that makes it a weak point.
00:35:11.540 | - Exactly, so with traditional welding techniques,
00:35:15.980 | you may have, whatever the underlying
00:35:18.540 | strength characteristics of the material are,
00:35:21.340 | you end up with weak regions where you weld.
00:35:24.500 | And with friction stir-welding,
00:35:26.500 | the welds are just as strong as the bulk material.
00:35:29.700 | So it really allows you, 'cause when you're,
00:35:32.860 | let's say you're building a tank
00:35:34.100 | that you're gonna pressurize,
00:35:35.940 | a large liquid natural gas tank for our booster stage,
00:35:40.060 | for example, if you are welding that
00:35:43.140 | with traditional methods, you have to size those weld lands,
00:35:46.220 | the thickness of those pieces, with that knockdown
00:35:49.060 | for whatever damage you're doing with the weld,
00:35:51.060 | and that's gonna add a lot of weight to that tank.
00:35:53.820 | - I mean, even just looking at the fairings,
00:35:56.380 | the result of that, the complex shape that it takes,
00:36:00.540 | and what it's supposed to do is kind of incredible,
00:36:04.740 | 'cause people don't know it's on top of the rocket,
00:36:06.740 | it's gonna fall apart.
00:36:08.580 | That's its task, but it has to stay strong sometimes,
00:36:12.020 | and then disappear when it needs to.
00:36:14.900 | - That's right.
00:36:15.740 | - Which is a very difficult task.
00:36:17.260 | - Yes, when you need something that needs
00:36:19.620 | to have 100% integrity until it needs
00:36:23.180 | to have 0% integrity, it needs to stay attached
00:36:26.820 | until it's ready to go away, and then when it goes away,
00:36:28.980 | it has to go away completely.
00:36:30.900 | You use explosive charges for that,
00:36:33.660 | and so it's a very robust way of separating structure
00:36:38.500 | when you need to.
00:36:40.300 | - Exploding.
00:36:41.220 | - Yeah, little tiny bits of explosive material,
00:36:45.740 | and it'll sever the whole connection.
00:36:49.860 | So if you wanna go from 100% structural integrity
00:36:54.620 | to zero as fast as possible, it's explosive.
00:36:58.060 | - It's explosives.
00:36:59.740 | - The entirety of this thing is so badass.
00:37:02.060 | Okay, so we're back to the two stages.
00:37:04.180 | So the first stage is reusable.
00:37:06.260 | - Yeah, second stage is expendable.
00:37:08.820 | Second stage is liquid hydrogen, liquid oxygen,
00:37:11.260 | so we get to take advantage of the higher specific impulse.
00:37:16.140 | The first stage lands downrange on a landing platform
00:37:20.700 | in the ocean, comes back for maintenance,
00:37:24.140 | and gets ready to do the next mission.
00:37:26.200 | - I mean, there's a million questions,
00:37:28.260 | but also, is there a path towards reusability
00:37:31.220 | for the second stage?
00:37:32.660 | - There is, and we know how to do that.
00:37:35.660 | Right now, we're gonna work on manufacturing
00:37:39.140 | that second stage to make it as inexpensive as possible.
00:37:42.860 | Sort of two paths for a second stage.
00:37:45.580 | Make it reusable, or work really hard
00:37:50.060 | to make it inexpensive so you can afford to expend it.
00:37:53.620 | And that trade is actually not obvious which one is better.
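
A minimal sketch of why that trade isn't obvious, with entirely made-up costs: reuse only wins if the amortized build cost plus recovery and refurbishment undercut a cheap expendable stage.

```python
# Illustrative reuse-vs-expend trade with invented numbers ($M).
expendable_unit_cost = 10.0   # build a cheap, expendable upper stage (assumed)
reusable_build_cost  = 30.0   # build a heavier, reusable stage (assumed)
refurb_per_flight    = 2.0    # recovery + refurbishment per flight (assumed)
flights_per_stage    = 10     # assumed reuse count

reuse_cost_per_flight = reusable_build_cost / flights_per_stage + refurb_per_flight
print(f"expendable: ${expendable_unit_cost:.1f}M per flight")
print(f"reusable:   ${reuse_cost_per_flight:.1f}M per flight")
# With these assumptions reuse wins ($5.0M vs $10.0M), but flip the
# build-cost ratio or the reuse count and the answer flips too.
```
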
00:37:58.620 | - Even in terms of cost, like time, cost?
00:38:03.580 | - I'm talking about cost.
00:38:04.900 | Space flight, getting into orbit is a solved problem.
00:38:08.720 | We solved it back in the '50s and '60s.
00:38:11.580 | - You're making it sound easy.
00:38:12.620 | - The only thing, the only interesting problem
00:38:15.180 | is dramatically reducing the cost of access to orbit,
00:38:19.200 | which is, if you can do that,
00:38:21.940 | you open up a bunch of new endeavors
00:38:26.180 | that lots of startup companies, everybody else can do.
00:38:28.780 | So that's, we really, that's one of our missions
00:38:33.400 | is to be part of this industry and lower the cost to orbit
00:38:37.740 | so that there can be a kind of a renaissance,
00:38:42.400 | a golden age of people doing all kinds
00:38:45.420 | of interesting things in space.
00:38:47.280 | - I like how you said getting to orbit is a solved problem.
00:38:50.540 | It's just the only interesting thing
00:38:52.620 | is reducing the cost.
00:38:53.540 | You know you can describe every single problem
00:38:55.180 | facing human civilization that way.
00:38:57.140 | I mean, the physicists would say
00:38:58.700 | everything is a solved problem.
00:39:00.700 | We've solved everything.
00:39:01.780 | The rest is just, well, as Rutherford said,
00:39:04.660 | that it's just stamp collecting.
00:39:06.420 | It's just the details.
00:39:07.740 | Some of the greatest innovations and inventions
00:39:10.100 | and brilliance is in that cost reduction stage, right?
00:39:15.060 | And you've had a long career of cost reduction.
00:39:17.940 | - For sure, and when you,
00:39:20.780 | what does cost reduction really mean?
00:39:22.620 | It means inventing a better way.
00:39:24.460 | - Yeah, exactly.
00:39:25.300 | - Right, and when you invent a better way,
00:39:26.940 | you make the whole world richer.
00:39:28.900 | So whatever it was,
00:39:30.420 | I don't know how many thousands of years ago,
00:39:32.040 | somebody invented the plow.
00:39:34.220 | And when they invented the plow,
00:39:35.900 | they made the whole world richer
00:39:37.300 | because they made farming less expensive.
00:39:40.280 | And so it is a big deal to invent better ways.
00:39:45.280 | That's how the world gets richer.
00:39:48.380 | - So what are some of the biggest challenges
00:39:52.420 | on the manufacturing side, on the engineering side
00:39:54.800 | that you're facing in working to get
00:39:58.600 | to the first launch of New Glenn?
00:40:00.300 | - The first launch is one thing,
00:40:03.300 | and we'll do that in 2024, coming up in this coming year.
00:40:07.100 | The real thing that's the bigger challenge
00:40:09.940 | is making sure that our factory
00:40:12.780 | is efficiently manufacturing at rate.
00:40:17.780 | So rate production.
00:40:19.940 | So consider if you wanna launch New Glenn 24 times a year,
00:40:24.940 | you need to manufacture a upper stage,
00:40:30.060 | since they're expendable, every twice a month,
00:40:34.860 | you need to do it every two weeks.
00:40:36.340 | So you need to have all of your manufacturing facilities
00:40:40.980 | and processes and inspection techniques
00:40:43.460 | and acceptance tests and everything operating at rate.
00:40:47.760 | And rate manufacturing is at least as difficult
00:40:51.660 | as designing the vehicle in the first place.
00:40:55.340 | And the same thing.
00:40:56.180 | So every upper stage has two BE3U engines.
00:41:01.180 | So those engines, you know, you need,
00:41:05.340 | if you're gonna launch the vehicle twice a month,
00:41:08.620 | you need four engines a month.
00:41:09.860 | So you need an engine every week.
00:41:11.620 | So that engine needs
00:41:13.660 | to be produced at rate.
00:41:16.580 | And that's a, and there's all of the things
00:41:19.700 | that you need to do that, all the right machine tools,
00:41:22.100 | all the right fixtures, the right people,
00:41:25.220 | process, et cetera.
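
The cadence arithmetic in this exchange, spelled out with the same numbers used in the conversation:

```python
# Rate math from the conversation: 24 flights/year with an expendable
# upper stage carrying two BE3U engines.
launches_per_year = 24
engines_per_upper_stage = 2

stages_per_month = launches_per_year / 12                       # 2.0 -> one every ~2 weeks
engines_per_month = stages_per_month * engines_per_upper_stage  # 4.0
weeks_per_engine = 52 / (launches_per_year * engines_per_upper_stage)

print(f"{stages_per_month:.0f} upper stages/month, "
      f"{engines_per_month:.0f} engines/month, "
      f"one engine every {weeks_per_engine:.1f} weeks")
# -> 2 upper stages/month, 4 engines/month, one engine every 1.1 weeks
```
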
00:41:26.940 | So it's one thing to build a first article, right?
00:41:30.940 | So that's, you know, to launch New Glenn
00:41:33.740 | for the first time, you need to produce a first article.
00:41:37.400 | But that's not the hard part.
00:41:39.820 | The hard part is everything that's going on
00:41:42.500 | behind the scenes to build a factory
00:41:45.420 | that can produce New Glenns at rate.
00:41:47.660 | - So the first one is produced in a way
00:41:49.340 | that enables the production of the second,
00:41:51.860 | the third, and the fourth, and the fifth, and sixth.
00:41:53.140 | - You can think of the first article as kind of pushing,
00:41:57.460 | it pushes all of the rate manufacturing technology along.
00:42:02.460 | You know, in other words, it's kind of the, you know,
00:42:06.580 | it's the test article in a way
00:42:09.060 | that's testing out your manufacturing technologies.
00:42:13.180 | - The manufacturing is the big challenge.
00:42:15.740 | - Yes, I mean, I don't wanna make it sound
00:42:17.660 | like any of it is easy.
00:42:18.980 | I mean, the people who are designing the engines
00:42:21.740 | and all of this, all of it is hard, for sure.
00:42:25.180 | But the challenge right now is driving really hard
00:42:29.900 | to get to rate manufacturing,
00:42:32.860 | and to do that in an efficient way.
00:42:34.060 | Again, kind of back to our cost point.
00:42:36.740 | If you get to rate manufacturing in an inefficient way,
00:42:39.940 | you haven't really solved the cost problem,
00:42:41.820 | and maybe you haven't really moved
00:42:43.740 | the state of the art forward.
00:42:45.180 | All this has to be about moving the state of the art forward.
00:42:48.140 | There are easier businesses to do.
00:42:50.100 | I always tell people, look, if you are trying to make money,
00:42:53.220 | you know, like start a salty snack food company
00:42:56.140 | or something, you know.
00:42:58.100 | - You write that idea down.
00:42:59.340 | (laughing)
00:43:01.420 | - Like, make the Lex Fridman potato chips, you know.
00:43:04.060 | - Okay, don't say it, the people are gonna steal it.
00:43:06.740 | (laughing)
00:43:09.140 | But yeah, it's hard.
00:43:09.980 | - You see what I'm saying?
00:43:10.820 | It's like, there's nothing easy about this business,
00:43:13.740 | but it's its own reward.
00:43:17.100 | It's fascinating, it's worthwhile, it's meaningful,
00:43:22.060 | and so, you know, I don't wanna pick
00:43:24.820 | on salty snack food companies,
00:43:26.300 | but I think it's less meaningful.
00:43:28.260 | You know, at the end of the day,
00:43:29.420 | you're not gonna have accomplished something amazing.
00:43:33.300 | - Yeah, there's--
00:43:34.140 | - Even if you do make a lot of money out of it.
00:43:35.740 | - Yeah, there's something fundamentally different
00:43:37.300 | about the, quote unquote, business of space exploration.
00:43:41.340 | - Yeah, for sure.
00:43:42.460 | - It's a grand project of humanity.
00:43:44.100 | - Yes, it's one of humanity's grand challenges,
00:43:47.620 | and especially as you look at going to the moon
00:43:50.500 | and going to Mars and building giant O'Neill colonies
00:43:54.060 | and unlocking all the things, you know,
00:43:57.180 | I won't live long enough to see the fruits of this,
00:44:01.220 | but the fruits of this come from building a road to space,
00:44:06.220 | getting the infrastructure.
00:44:08.980 | I'll give you an analogy.
00:44:10.500 | When I started Amazon,
00:44:12.620 | I didn't have to develop a payment system.
00:44:15.580 | It already existed.
00:44:16.420 | It was called the credit card.
00:44:18.060 | I didn't have to develop a transportation system
00:44:21.340 | to deliver the packages.
00:44:22.620 | It already existed.
00:44:23.660 | It was called the postal service
00:44:25.420 | and Royal Mail and Deutsche Post and so on,
00:44:28.620 | so all this heavy-lifting infrastructure
00:44:32.260 | was already in place, and I could stand on its shoulders,
00:44:36.740 | and that's why, when you look at the internet,
00:44:40.820 | you know, and by the way,
00:44:41.660 | another giant piece of infrastructure
00:44:43.180 | that was around in the early,
00:44:44.140 | I'm taking you back to like 1994.
00:44:46.660 | People were using dial-up modems,
00:44:48.860 | and it was piggybacking
00:44:50.620 | on top of the long-distance phone network.
00:44:53.300 | That's how the internet,
00:44:54.620 | that's, you know, how people were accessing servers
00:44:57.500 | and so on, and again, if that hadn't existed,
00:45:02.100 | it would have been hundreds of billions of capex
00:45:04.700 | to put that out there.
00:45:06.380 | No startup company could have done that,
00:45:09.020 | and so the problem, you know, you see in,
00:45:13.020 | if you look at the dynamism in the internet space
00:45:16.540 | over the last 20 years, it's because, you know,
00:45:19.620 | you see like two kids in a dorm room
00:45:22.260 | could start an internet company
00:45:23.700 | that could be successful and do amazing things
00:45:26.380 | because they didn't have to build heavy infrastructure.
00:45:28.580 | It was already there, and that's what I wanna do.
00:45:32.740 | I'd take, you know, my Amazon winnings
00:45:36.660 | and use that to build heavy infrastructure
00:45:39.020 | so that the next generation, you know,
00:45:41.940 | my, the generation that's my children and their children,
00:45:45.180 | these, you know, those generations
00:45:47.700 | can then use that heavy infrastructure.
00:45:50.100 | Then there'll be space entrepreneurs
00:45:51.860 | who start in their dorm room.
00:45:53.300 | - Yeah.
00:45:54.140 | - Like that will be a marker of success
00:45:56.580 | when you can have a really valuable space company
00:45:59.980 | started in a dorm room.
00:46:01.740 | Then we know that we've built enough infrastructure
00:46:04.820 | so that ingenuity and imagination can really be unleashed.
00:46:08.980 | I find that very exciting.
00:46:10.520 | - They will, of course, as kids do,
00:46:13.300 | take all of this hard infrastructure building for granted.
00:46:16.420 | - Of course.
00:46:17.420 | - Which is the entrepreneurial spirit.
00:46:19.380 | - That's a, an inventor's greatest dream
00:46:23.500 | is that their inventions are so successful
00:46:26.580 | that they are one day taken for granted.
00:46:29.140 | You know, nobody thinks of Amazon as an invention anymore.
00:46:31.780 | Nobody thinks of customer reviews as an invention.
00:46:34.020 | We pioneered customer reviews,
00:46:35.300 | but now they're so commonplace.
00:46:37.260 | Same thing with one-click shopping and so on.
00:46:39.420 | But that's a compliment.
00:46:40.580 | That's how, you know, you invent something
00:46:43.180 | that's so used, so beneficially used by so many people
00:46:48.180 | that they take it for granted.
00:46:49.860 | - I don't know about nobody.
00:46:50.860 | Every time I use Amazon, I'm still amazed.
00:46:52.700 | How does this work?
00:46:53.820 | (laughing)
00:46:55.940 | - That proves you're a very curious explorer.
00:46:57.940 | - All right, all right, back to rockets.
00:46:59.940 | Timeline, you said 2024.
00:47:03.240 | As it stands now, are both the first test launch
00:47:08.140 | and the launch of Escapade Explorers
00:47:10.140 | to Mars still possible in 2024?
00:47:11.940 | - In 2024?
00:47:12.780 | - Yeah.
00:47:13.600 | - Yeah, I think so.
00:47:14.820 | For sure the first launch,
00:47:16.220 | and then we'll see if Escapade goes on that or not.
00:47:19.380 | I think that the first launch for sure,
00:47:20.900 | and I hope Escapade too.
00:47:22.340 | - Hope?
00:47:24.580 | - Well, I just don't know which mission
00:47:26.380 | it's actually gonna be slated on.
00:47:28.420 | So we also have other things
00:47:29.940 | that might go on that first mission.
00:47:31.580 | - Oh, I got it.
00:47:32.420 | But you're optimistic that the launches will still--
00:47:35.220 | - Oh, the first launch, I'm very optimistic
00:47:37.100 | that the first launch of New Glenn will be in 2024,
00:47:39.580 | and I'm just not 100% certain what payload
00:47:42.940 | will be on that first launch.
00:47:44.580 | - Are you nervous about it?
00:47:46.060 | - Are you kidding?
00:47:46.900 | I'm extremely nervous about it.
00:47:48.700 | (laughing)
00:47:50.940 | - Oh, man.
00:47:52.340 | - 100%.
00:47:54.340 | Every launch I go to for New Shepard,
00:47:58.260 | for other vehicles too,
00:47:59.900 | I'm always nervous for these launches.
00:48:01.980 | But yes, for sure.
00:48:03.060 | A first launch, to have no nervousness about that
00:48:05.820 | would be some sign of derangement, I think, so.
00:48:09.740 | - Well, I got to visit the launch pad.
00:48:11.060 | It's pretty, I mean, it's epic.
00:48:13.740 | - We have done a tremendous amount of ground testing,
00:48:18.700 | a tremendous amount of simulation.
00:48:21.820 | So, you know, a lot of the problems
00:48:25.580 | that we might find in flight have been resolved,
00:48:27.900 | but there are some problems you can only find in flight.
00:48:30.180 | So, you know, cross your fingers.
00:48:33.380 | I guarantee you, you'll have fun watching it
00:48:36.180 | no matter what happens.
00:48:37.020 | - 100%, when the thing is fully assembled, it comes up.
00:48:41.260 | - Yeah, the transporter erector.
00:48:44.420 | Just the transporter erector for a rocket of this scale
00:48:48.420 | is extraordinary.
00:48:49.460 | - That's an incredible machine.
00:48:50.460 | - The vehicle travels out horizontally
00:48:53.500 | and then kind of, you know, comes up.
00:48:56.940 | - Over a few hours?
00:48:58.260 | - Yeah, it's a beautiful thing to watch.
00:49:00.700 | - Speaking of which, if that makes you nervous,
00:49:04.180 | I don't know if you remember,
00:49:06.460 | but you were aboard New Shepard on its first crewed flight.
00:49:11.460 | How was that experience?
00:49:16.820 | Were you terrified then?
00:49:20.100 | - You know, strangely, I wasn't.
00:49:21.980 | You know, I--
00:49:22.820 | - You ride the rocket.
00:49:24.260 | Less nerve-wracking. - It's true.
00:49:25.540 | I've watched other people ride the rocket
00:49:27.420 | and I'm more nervous than when I was
00:49:29.260 | inside the rocket myself.
00:49:31.220 | It was a difficult conversation to have with my mother
00:49:35.540 | when I told her I was gonna go on the first one.
00:49:38.620 | And not only was I gonna go,
00:49:39.860 | but I was gonna bring my brother too.
00:49:41.500 | This is a tough conversation to have with a mom.
00:49:44.060 | And--
00:49:44.900 | - There's a long pause when you told her.
00:49:47.140 | - She's like, "Both of you?"
00:49:49.600 | And it was an incredible experience.
00:49:54.320 | And we were laughing inside the capsule
00:49:58.420 | and, you know, we're not nervous.
00:50:00.920 | The people on the ground were very nervous for us.
00:50:05.180 | It was actually one of the most emotionally powerful parts
00:50:11.920 | of the experience;
00:50:15.220 | it happened even before the flight, at 4:30 in the morning.
00:50:20.140 | My brother and I are getting ready to go to the launch site
00:50:22.540 | and Lauren is gonna take us there in her helicopter
00:50:25.140 | and we're getting ready to leave.
00:50:26.560 | And we go outside, outside the ranch house there
00:50:29.620 | in West Texas where the launch facility is.
00:50:32.620 | And all of our family, my kids and my brother's kids
00:50:37.220 | and our parents and close friends are assembled there
00:50:42.220 | and they're saying goodbye to us,
00:50:48.260 | but they're kind of saying,
00:50:50.440 | maybe they think they're saying goodbye to us forever.
00:50:54.080 | And we might not have felt that way,
00:50:56.040 | but it was obvious from their faces how nervous they were
00:50:58.760 | that they felt that way.
00:51:00.340 | And it was sort of powerful because it allowed us to see,
00:51:03.720 | it was almost like attending your own memorial service
00:51:05.760 | or something, like you could feel how loved you were
00:51:08.520 | in that moment and it was really amazing.
00:51:12.800 | - Yeah, and I mean, there's just an epic nature to it too.
00:51:17.440 | The ascent, the floating in zero gravity,
00:51:20.960 | I'll tell you something very interesting.
00:51:22.680 | Zero gravity feels very natural.
00:51:25.120 | I don't know if it's because it's like a return to the womb
00:51:30.120 | or what it is. - He just confirmed
00:51:31.440 | you're an alien, but that's--
00:51:32.260 | (laughing)
00:51:33.640 | I think that's what he just said.
00:51:36.400 | - It feels so natural to be in zero G.
00:51:38.480 | It was really interesting.
00:51:39.320 | And then when people talk about the overview effect
00:51:41.980 | and seeing Earth from space,
00:51:44.280 | I had that feeling very powerfully.
00:51:46.100 | I think everyone did.
00:51:47.920 | You see how fragile the Earth is.
00:51:51.600 | If you're not an environmentalist, it will make you one.
00:51:54.400 | The great Jim Lovell quote,
00:51:57.280 | he looked back at the Earth from space
00:51:59.800 | and he said he realized you don't go to heaven when you die,
00:52:03.240 | you go to heaven when you're born.
00:52:05.320 | And it's just, that's the feeling that people get
00:52:08.760 | when they're in space.
00:52:09.600 | You see all this blackness, all this nothingness,
00:52:12.020 | and there's one gem of life and it's Earth.
00:52:15.160 | - It is a gem.
00:52:16.160 | You've talked a lot about decision-making
00:52:20.940 | throughout your time with Amazon.
00:52:23.140 | What was that decision like to be the first
00:52:27.400 | to ride in your Shepard?
00:52:29.040 | Just before you talk to your mom.
00:52:31.200 | - Yeah.
00:52:32.040 | - What were the pros and cons?
00:52:34.560 | Actually, as one human being, as a leader of a company,
00:52:38.840 | on all fronts, what was that decision-making like?
00:52:43.220 | - I decided that, first of all,
00:52:45.440 | I knew the vehicle extremely well.
00:52:48.140 | I know the team who built it, I know the vehicle.
00:52:51.040 | I'm very comfortable with the escape system.
00:52:57.900 | We put as much effort into the escape system
00:53:02.060 | on that vehicle as we put into all the rest
00:53:04.620 | of the vehicle combined.
00:53:05.780 | It's one of the hardest pieces of engineering
00:53:08.140 | in the entire New Shepard architecture.
00:53:10.380 | - Can you actually describe what do you mean
00:53:11.880 | by escape system, what's involved?
00:53:13.480 | - We have a solid rocket motor in the base
00:53:16.900 | of the crew capsule so that if anything goes wrong
00:53:21.400 | on ascent, while the main rocket engine is firing,
00:53:25.680 | we can ignite this solid rocket motor
00:53:30.260 | in the base of the crew capsule and escape from the booster.
00:53:35.260 | It's a very challenging system to build, design,
00:53:39.140 | validate, test, all of these things.
00:53:42.240 | It is the reason that I am comfortable letting anyone
00:53:46.240 | go on New Shepard.
00:53:47.320 | So the booster is as safe and reliable as we can make it,
00:53:52.320 | but we're harnessing, whenever you're talking
00:53:56.920 | about rocket engines, I don't care what rocket engine
00:54:01.160 | you're talking about, you are harnessing such vast power
00:54:06.080 | in such a small, compact, geometric space.
00:54:10.060 | The power density is so enormous that it is impossible
00:54:15.060 | to ever be sure that nothing will go wrong.
00:54:17.920 | And so the only way to improve safety
00:54:22.960 | is to have an escape system.
00:54:24.580 | And historically, human-rated rockets
00:54:28.380 | have had escape systems.
00:54:29.420 | Only the space shuttle did not.
00:54:33.700 | But Apollo had one, all of the previous,
00:54:38.640 | Gemini, et cetera, they all had escape systems.
00:54:41.960 | And we have on New Shepard an unusual escape system.
00:54:46.040 | Most escape systems are towers.
00:54:48.440 | We have a pusher escape system.
00:54:50.040 | So the solid rocket motor is actually embedded
00:54:52.200 | in the base of the crew capsule and it pushes.
00:54:54.800 | And it's reusable in the sense that if we don't use it,
00:54:58.840 | so if we have a nominal mission, we land with it.
00:55:01.940 | The tower systems have to be ejected
00:55:04.440 | at a certain point in the mission.
00:55:06.240 | And so they get wasted even in a nominal mission.
00:55:09.200 | And so again, cost really matters on these things.
00:55:12.240 | So we figured out how to have the escape system
00:55:14.160 | be reusable: in the event that it's not used,
00:55:17.920 | you can reuse it and have it be a pusher system.
00:55:21.200 | It's a very sophisticated thing.
00:55:22.920 | So I knew these things.
00:55:23.880 | You asked me about my decision to go.
00:55:25.720 | And so I know the vehicle very well.
00:55:28.560 | I know the people who designed it.
00:55:31.320 | I have great trust in them
00:55:33.420 | and in the engineering that we did.
00:55:35.240 | And I thought to myself, look, if I am not ready to go,
00:55:40.260 | then I wouldn't want anyone to go.
00:55:43.460 | A tourism vehicle has to be designed, in my view,
00:55:47.260 | to be as safe as one can make it.
00:55:50.140 | You can't make it perfectly safe.
00:55:52.760 | It's impossible.
00:55:54.460 | But you just have to, people will do things.
00:55:57.980 | People take risks.
00:55:59.020 | They climb mountains, they skydive,
00:56:01.940 | they do deep underwater scuba diving, and so on.
00:56:05.500 | People are okay taking risk.
00:56:08.020 | You can't eliminate the risk.
00:56:10.160 | But it is something, because it's a tourism vehicle,
00:56:13.620 | you have to do your utmost to eliminate those risks.
00:56:16.740 | And I felt very good about the system.
00:56:18.980 | I think that's one of the reasons I was so calm inside.
00:56:23.060 | Maybe others weren't as calm.
00:56:24.460 | They didn't know as much about it as I did.
00:56:26.340 | - Who was in charge of engaging the escape system?
00:56:28.820 | - It's automated.
00:56:29.720 | - Okay.
00:56:30.620 | - The escape system is completely automated.
00:56:34.100 | Automated is better because it can react so much faster.
00:56:38.460 | - So yeah, for tourism rockets, safety is a huge,
00:56:41.700 | huge, huge priority; for space exploration also,
00:56:44.020 | but, you know, a delta less.
00:56:46.700 | - Yes, I mean, I think for, you know, if you're doing,
00:56:49.220 | you know, there are human activities
00:56:51.020 | where we tolerate more risk.
00:56:52.480 | If you're saving somebody's life, you know,
00:56:55.380 | if you are, you know, engaging in real exploration.
00:57:00.100 | These are things where, you know,
00:57:02.500 | I personally think we would accept more risk,
00:57:07.300 | in part because you have to.
00:57:08.700 | - Is there a part of you that's frustrated
00:57:12.780 | by the rate of progress in Blue Origin?
00:57:15.860 | - Blue Origin needs to be much faster.
00:57:18.140 | And it's one of the reasons that I left my role
00:57:21.580 | as the CEO of Amazon a couple of years ago.
00:57:25.580 | I needed, I wanted to come in,
00:57:27.020 | and Blue Origin needs me right now.
00:57:29.900 | And so I had always, when I was the CEO of Amazon,
00:57:33.300 | my point of view on this is,
00:57:35.140 | if I'm the CEO of a publicly traded company,
00:57:38.060 | it's gonna get my full attention.
00:57:39.780 | And I really, just how I think about things,
00:57:44.660 | it was very important to me.
00:57:45.940 | I felt I had an obligation to all the stakeholders
00:57:49.460 | at Amazon to do that.
00:57:52.260 | And so, having turned over the CEO role,
00:57:55.100 | I was still the executive chair there.
00:57:59.260 | And the reason, the primary reason I did that
00:58:03.260 | is so that I could spend time on Blue Origin,
00:58:05.940 | adding some, you know, energy, some sense of urgency.
00:58:09.540 | We need to move much faster, and we're going to.
00:58:12.300 | - What are the ways to speed it up?
00:58:15.060 | So you've talked about a lot of different ways,
00:58:20.060 | at Amazon, you know,
00:58:24.900 | of removing barriers for progress,
00:58:29.580 | sort of distributing, making everybody autonomous
00:58:31.700 | and self-reliant, all those kinds of things.
00:58:34.580 | Does that apply at Blue Origin?
00:58:37.380 | - It does apply, and I'm leading this directly.
00:58:41.500 | We are gonna become the world's most decisive company.
00:58:45.140 | Across any industry.
00:58:47.100 | And so, you know, at Amazon, you know,
00:58:51.620 | ever since the beginning, I said,
00:58:53.980 | we're gonna become the world's most
00:58:56.500 | customer-obsessed company.
00:58:58.380 | And no matter the industry, like people,
00:59:01.660 | one day people are going to come to Amazon
00:59:03.780 | from the healthcare industry and want to know,
00:59:05.500 | how did you guys, how are you so customer-obsessed?
00:59:08.300 | How do you actually, not just pay lip service to that,
00:59:10.260 | but actually do that?
00:59:12.980 | And from, you know, all different industries
00:59:15.460 | should come want to study us
00:59:16.860 | to see how we accomplish that.
00:59:18.900 | And the analogous thing at Blue Origin,
00:59:21.780 | it will help us move faster,
00:59:24.220 | is we're gonna become the world's most decisive company.
00:59:28.780 | We're gonna get really good at taking
00:59:30.860 | appropriate technology risk,
00:59:32.860 | making those decisions quickly.
00:59:34.580 | You know, being bold on those things.
00:59:37.340 | That's what, and having the right culture
00:59:39.380 | that supports that.
00:59:40.820 | You need people to be ambitious, technically ambitious.
00:59:44.620 | You know, if there are five ways to do something,
00:59:47.460 | we'll study them, but let's study them very quickly
00:59:49.660 | and make a decision.
00:59:50.980 | We can always change our mind.
00:59:52.500 | It doesn't, you know, changing your mind,
00:59:56.620 | I talk about one-way doors and two-way doors.
00:59:59.900 | Most decisions are two-way doors.
01:00:03.820 | - Can you explain that?
01:00:04.660 | 'Cause I love that metaphor.
01:00:06.780 | - If you make the wrong decision,
01:00:09.940 | if it's a two-way door decision,
01:00:11.620 | you walk out the door, you pick a door, you walk out,
01:00:14.820 | and you spend a little time there,
01:00:16.420 | it turns out to be the wrong decision,
01:00:17.900 | you can come back in and pick another door.
01:00:20.180 | Some decisions are so consequential and so important
01:00:24.660 | and so hard to reverse,
01:00:26.980 | that they really are one-way door decisions.
01:00:28.940 | You go in that door, you're not coming back.
01:00:32.500 | And those decisions have to be made very deliberately,
01:00:36.380 | very carefully.
01:00:38.740 | If you can think of yet another way
01:00:40.340 | to analyze the decision, you should slow down and do that.
01:00:43.220 | So, you know, when I was CEO of Amazon,
01:00:48.060 | I often found myself in the position
01:00:49.940 | of being the chief slowdown officer,
01:00:52.180 | because somebody would be bringing me
01:00:54.620 | a one-way door decision, and I was, okay,
01:00:57.300 | I can think of three more ways to analyze that,
01:00:59.380 | so let's go do that.
01:01:00.860 | Because we are not gonna be able to reverse this one easily.
01:01:03.980 | Maybe you can reverse it, but it's gonna be very costly
01:01:06.340 | and very time-consuming.
01:01:07.740 | We really have to get this one right from the beginning.
01:01:10.540 | And what happens, unfortunately, in companies,
01:01:15.540 | what can happen is that you have a one-size-fits-all
01:01:21.620 | decision-making process,
01:01:23.940 | where you end up using the heavyweight process
01:01:27.020 | on all decisions. - For everything, yeah.
01:01:29.500 | - Including the lightweight ones,
01:01:31.180 | the two-way door decisions.
01:01:33.180 | Two-way door decisions should mostly be made
01:01:35.260 | by single individuals or by very small teams
01:01:39.020 | deep in the organization.
01:01:41.220 | And one-way door decisions are the ones that,
01:01:44.380 | the irreversible ones, those are the ones
01:01:45.820 | that should be elevated up to, you know,
01:01:48.580 | the senior-most executives who should slow them down
01:01:52.180 | and make sure that the right thing is being done.
01:01:55.100 | - Yeah, I mean, part of the skill here
01:01:56.700 | is to know the difference between one-way and two-way.
01:01:59.380 | I think you mentioned-- - Yes!
01:02:00.740 | - I mean, I think you mentioned Amazon Prime,
01:02:03.820 | the decision to sort of create Amazon Prime
01:02:06.500 | as a one-way door.
01:02:08.420 | I mean, it's unclear if it is or not,
01:02:10.620 | but it probably is, and it's a really big risk to go there.
01:02:14.460 | - There are a bunch of decisions like that
01:02:16.380 | that are, you know, changing the decision
01:02:20.060 | is gonna be very, very complicated.
01:02:22.460 | Some of them are technical decisions, too,
01:02:24.260 | because some technical decisions
01:02:25.820 | are like quick-drying cement.
01:02:27.140 | You know, if you're gonna, once you make 'em,
01:02:29.420 | it gets really hard, I mean, you know,
01:02:31.060 | choosing which propellants to use in a vehicle,
01:02:33.980 | you know, selecting LNG for the booster stage
01:02:36.820 | and selecting hydrogen for the upper stage,
01:02:39.060 | that has turned out to be a very good decision,
01:02:43.420 | but if you change your mind, (laughs)
01:02:47.020 | that would be a very, that would be a very big setback.
01:02:50.700 | Do you see what I'm saying? - Yeah, yeah.
01:02:52.220 | - So that's the kind of decision
01:02:54.580 | you scrutinize very, very carefully.
01:02:57.460 | Other things just aren't like that.
01:02:58.740 | Most decisions are not that way.
01:03:01.420 | Most decisions should be made by single individuals,
01:03:04.340 | but they need, and done quickly,
01:03:06.740 | in the full understanding
01:03:08.220 | that you can always change your mind.
01:03:10.660 | - Yeah, one of the things I really liked,
01:03:13.020 | perhaps it's not just two-way door decisions,
01:03:14.980 | is the "disagree and commit" phrase.
01:03:18.980 | So somebody brings up an idea to you,
01:03:23.700 | if it's a two-way door,
01:03:25.620 | you state that you don't understand enough
01:03:28.460 | to agree, but you still back them.
01:03:32.220 | Hey, I'd love for you to explain that.
01:03:33.660 | - Yeah, disagree and commit is a really important principle
01:03:36.780 | that saves a lot of arguing.
01:03:38.500 | - Yeah.
01:03:39.340 | - So-- - I'm gonna use that
01:03:40.180 | in my personal life. (laughs)
01:03:41.620 | I disagree, but commit.
01:03:44.140 | - It's very common in any endeavor in life,
01:03:47.340 | in business, in anybody where you have teammates,
01:03:51.380 | you have a teammate and the two of you disagree.
01:03:53.860 | At some point, you have to make a decision.
01:03:57.420 | And in companies, we tend to organize hierarchically.
01:04:01.060 | So there's this, whoever's the more senior person
01:04:04.500 | ultimately gets to make the decision.
01:04:05.940 | So ultimately, the CEO gets to make that decision.
01:04:09.300 | And the CEO may not always make the decision
01:04:11.700 | that they agree with.
01:04:12.740 | So like, I would often,
01:04:15.460 | I would be the one who would disagree and commit.
01:04:17.860 | One of my direct reports would very much
01:04:21.220 | wanna do something in a particular way.
01:04:24.420 | I would think it was a bad idea.
01:04:26.260 | I would explain my point of view.
01:04:28.780 | They would say, "Jeff, I think you're wrong, and here's why."
01:04:33.100 | And we would go back and forth.
01:04:35.040 | And I would often say, "You know what?
01:04:38.020 | "I don't think you're right,
01:04:39.820 | "but I'm gonna gamble with you.
01:04:43.360 | "And you're closer to the ground truth than I am.
01:04:48.000 | "I've known you for 20 years.
01:04:49.920 | "You have great judgment.
01:04:51.400 | "I don't know that I'm right either.
01:04:53.900 | "Not really, not for sure."
01:04:55.340 | All these decisions are complicated.
01:04:57.700 | Let's do it your way.
01:04:58.940 | But at least then you've made a decision.
01:05:01.580 | And I'm agreeing to commit to that decision.
01:05:05.280 | So I'm not gonna be second-guessing it.
01:05:06.940 | I'm not gonna be sniping at it.
01:05:08.540 | I'm not gonna be saying, "I told you so."
01:05:10.580 | I'm gonna try actively to help make sure it works.
01:05:14.640 | That's a really important teammate behavior.
01:05:18.140 | There's so many ways that dispute resolution
01:05:21.900 | is a really interesting thing on teams.
01:05:25.180 | And there are so many ways
01:05:26.540 | when two people disagree about something,
01:05:28.540 | even though I'm assuming the case
01:05:30.380 | where everybody's well-intentioned,
01:05:32.160 | they just have a very different opinion
01:05:33.700 | about what the right decision is.
01:05:36.300 | And we have in our society and inside companies,
01:05:40.440 | we have a bunch of mechanisms that we use
01:05:43.580 | to resolve these kinds of disputes.
01:05:46.860 | A lot of them are, I think, really bad.
01:05:50.220 | So an example of a really bad way of coming to agreement
01:05:55.260 | is compromise.
01:05:57.240 | So compromise, we're in a room here and I could say,
01:06:01.220 | "Lex, how tall do you think this ceiling is?"
01:06:04.460 | And you'd be like, "I don't know, Jeff, maybe 12 feet tall."
01:06:07.260 | And I would say, "I think it's 11 feet tall."
01:06:11.340 | And then we'd say, "You know what?
01:06:13.420 | "Let's just call it 11 1/2 feet."
01:06:16.040 | (laughing)
01:06:17.020 | That's compromise.
01:06:18.500 | Instead of, the right thing to do is to get a tape measure
01:06:22.260 | or figure out some way of actually measuring.
01:06:24.940 | But think, getting that tape measure
01:06:27.100 | and figure out how to get it to the top of the ceiling
01:06:29.060 | and all these things, that requires energy.
01:06:31.420 | Compromise, the advantage of compromise
01:06:34.700 | as a resolution mechanism is that it's low energy.
01:06:38.140 | But it doesn't lead to truth.
01:06:41.300 | And so in things like the height of the ceiling
01:06:44.340 | where truth is a knowable thing,
01:06:46.620 | you shouldn't allow compromise to be used
01:06:48.840 | when you can know the truth.
01:06:51.620 | Another really bad resolution mechanism
01:06:53.980 | that happens all the time is just who's more stubborn.
01:06:57.540 | This is also, let's say two executives who disagree
01:07:03.460 | and they just have a war of attrition.
01:07:05.700 | And whichever one gets exhausted first
01:07:09.420 | capitulates to the other one.
01:07:11.060 | Again, you haven't arrived at truth.
01:07:13.140 | And this is very demoralizing.
01:07:15.900 | So this is where escalation,
01:07:18.620 | I try to ask people who are on my team,
01:07:23.340 | I say never get to a point where you are resolving something
01:07:27.460 | by who gets exhausted first.
01:07:30.300 | Escalate that.
01:07:32.620 | I'll help you make the decision.
01:07:34.980 | Because that's so de-energizing
01:07:37.620 | and such a terrible, lousy way to make a decision.
01:07:40.780 | - Do you want to get to the resolution
01:07:42.020 | as quickly as possible?
01:07:42.940 | Because that ultimately leads to a high velocity.
01:07:45.180 | - Yes, and you want to try to get
01:07:46.660 | as close to truth as possible.
01:07:48.820 | So you want, exhausting the other person
01:07:52.700 | is not truth-seeking, and compromise is not truth-seeking.
01:07:57.700 | So it doesn't mean, and there are a lot of cases
01:08:00.840 | where no one knows the real truth,
01:08:02.300 | and that's where disagree and commit can come in.
01:08:04.700 | But it's, escalation is better than war of attrition.
01:08:09.900 | Escalate to your boss and say hey,
01:08:13.700 | we can't agree on this, we like each other,
01:08:15.420 | we're respectful of each other,
01:08:16.620 | but we strongly disagree with each other.
01:08:18.860 | We need you to make a decision here so we can move forward.
01:08:22.740 | But decisiveness, moving forward quickly on decisions,
01:08:27.740 | as quickly as you responsibly can,
01:08:30.720 | is how you increase velocity.
01:08:32.380 | Most of what slows things down
01:08:34.580 | is taking too long to make decisions, at all levels.
01:08:39.100 | So it has to be part of the culture to get high velocity.
01:08:43.180 | Amazon has a million and a half people,
01:08:45.660 | and the company is still fast.
01:08:47.980 | We're still decisive, we're still quick.
01:08:50.460 | And that's because the culture supports that.
01:08:53.220 | - At every scale, in a distributed way,
01:08:55.740 | to try to maximize the velocity of decisions.
01:08:58.140 | - Exactly.
01:08:59.740 | - You mentioned the lunar program, let me ask you about that.
01:09:02.580 | - Yeah.
01:09:03.420 | - There's a lot going on there,
01:09:06.420 | and you haven't really talked about it much.
01:09:08.220 | So in addition to the Artemis program with NASA,
01:09:11.620 | Blue is doing its own lander program.
01:09:13.420 | Can you describe it?
01:09:15.180 | There's a sexy picture on Instagram with one of them.
01:09:19.140 | Is it the MK1, I guess?
01:09:20.580 | - Yeah, the Mark 1.
01:09:22.380 | The picture here is me with Bill Nelson,
01:09:24.580 | the NASA administrator.
01:09:26.380 | - Just to clarify, the lander is the sexy thing
01:09:28.660 | about the Instagram.
01:09:29.580 | (laughing)
01:09:31.300 | - I know it's not me.
01:09:32.140 | I know it was either the lander or Bill.
01:09:34.220 | - Okay.
01:09:35.060 | (laughing)
01:09:36.820 | - I love Bill, but yeah. - Thank you for clarifying.
01:09:38.260 | - Okay.
01:09:40.500 | Yes, the Mark 1 lander is designed to take
01:09:45.500 | 3,000 kilograms to the surface of the moon
01:09:48.780 | in an expendable cargo configuration.
01:09:50.820 | It's an expendable lander: it lands on the moon, stays there,
01:09:54.060 | takes 3,000 kilograms to the surface.
01:09:56.100 | It can be launched on a single New Glenn flight,
01:09:59.940 | which is very important.
01:10:01.180 | So it's a relatively simple architecture,
01:10:04.020 | just like the human landing system lander
01:10:07.300 | they call the Mark 2.
01:10:09.020 | Mark 1 is also fueled with liquid hydrogen,
01:10:13.180 | and for high-energy missions,
01:10:18.740 | like landing on the surface of the moon,
01:10:20.700 | the high specific impulse of hydrogen
01:10:22.780 | is a very big advantage.
01:10:24.660 | The disadvantage of hydrogen has always been
01:10:28.100 | that it's such a deep cryogen, it's not storable.
01:10:33.020 | So it's constantly boiling off and you're losing propellant
01:10:37.220 | because it's boiling off.
01:10:39.060 | And so what we're doing as part of our lunar program
01:10:43.460 | is developing solar-powered cryocoolers
01:10:46.620 | that can actually make hydrogen a storable propellant
01:10:50.940 | for deep space, and that's a real game changer.
01:10:53.980 | It's a game changer for any high energy mission,
01:10:56.220 | so to the moon, but to the outer planets,
01:10:58.380 | to Mars, everywhere.
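(To make the specific-impulse point concrete: the standard Tsiolkovsky rocket equation shows what a higher specific impulse buys; the figures below are rough textbook vacuum values, not Blue Origin numbers:

    \Delta v = I_{sp} \, g_0 \ln\!\left( \frac{m_0}{m_f} \right)

With hydrogen/oxygen at roughly I_sp ≈ 450 s versus roughly 360 s for a methane/oxygen stage, the same mass ratio m_0/m_f yields about 25% more delta-v, which is exactly the margin that matters on high-energy missions like lunar landings.)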
01:11:00.320 | - So the idea with both Mark 1 and Mark 2
01:11:03.420 | is that New Glenn can carry it from the surface of Earth
01:11:08.420 | to the surface of the moon.
01:11:12.660 | - Exactly.
01:11:13.860 | So the Mark 1 is expendable.
01:11:16.500 | The lunar lander we're developing for NASA,
01:11:20.300 | the Mark 2 lander, that's part of the Artemis program,
01:11:24.100 | they call it the sustaining lander program.
01:11:27.580 | So that lander is designed to be reusable.
01:11:31.740 | It can land on the surface of the moon
01:11:33.980 | in a single stage configuration and then take off.
01:11:37.220 | So the whole, if you look at the Apollo program,
01:11:41.580 | the lunar lander in Apollo was really two stages.
01:11:45.220 | It would land on the surface
01:11:46.940 | and then it would leave the descent stage
01:11:49.460 | on the surface of the moon and only the ascent stage
01:11:52.260 | would go back up into lunar orbit
01:11:53.900 | where it would rendezvous with the command module.
01:11:56.820 | Here what we're doing is we have a single stage
01:11:59.660 | lunar lander that carries down enough propellant
01:12:02.220 | so that it can bring the whole thing back up
01:12:04.420 | so that it can be reused over and over.
01:12:07.340 | And the point of doing that, of course,
01:12:08.740 | is to reduce cost so that you can make lunar missions
01:12:12.820 | more affordable over time,
01:12:14.620 | which is, that's one of NASA's big objectives
01:12:17.620 | because this time, the whole point of Artemis
01:12:21.540 | is go back to the moon, but this time to stay.
01:12:25.580 | So back in the Apollo program,
01:12:28.820 | we went to the moon six times and then ended the program
01:12:31.900 | and it really was too expensive to continue.
01:12:36.220 | - And so there's a few questions there,
01:12:38.100 | but one is how do you stay on the moon?
01:12:40.660 | What ideas do you have about--
01:12:42.440 | - Yeah.
01:12:44.900 | - Like a sustaining life where a few folks
01:12:49.180 | can stay there for prolonged periods of time?
01:12:51.780 | - Well, one of the things we're working on
01:12:55.180 | is using lunar resources like lunar regolith
01:13:00.180 | to manufacture commodities and even solar cells
01:13:05.020 | on the surface of the moon.
01:13:06.860 | We've already built a solar cell
01:13:09.300 | that is completely made from lunar regolith simulant
01:13:13.260 | and this solar cell is only about 7% power efficient,
01:13:18.260 | so it's very inefficient compared to
01:13:21.780 | the more advanced solar cells that we make here on Earth.
01:13:25.540 | But if you can figure out how to make
01:13:27.460 | a practical solar cell factory
01:13:29.980 | that you can land on the surface of the moon
01:13:33.900 | and then the raw material for those solar cells
01:13:37.540 | is simply lunar regolith, then you can just
01:13:40.460 | continue to churn out solar cells
01:13:44.500 | on the surface of the moon,
01:13:45.540 | have lots of power on the surface of the moon.
01:13:48.020 | That will make it easier for people to live on the moon.
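(A rough back-of-the-envelope check, using the textbook solar constant rather than any figure from the interview: with no atmosphere, sunlight at the Moon delivers about 1361 W per square meter, so even a 7%-efficient regolith-derived cell gives

    P \approx 0.07 \times 1361\ \mathrm{W/m^2} \approx 95\ \mathrm{W/m^2}

or roughly a kilowatt from every ten square meters of panel during the lunar day.)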
01:13:51.420 | Similarly, we're working on extracting
01:13:55.100 | oxygen from lunar regolith.
01:13:58.300 | So lunar regolith by weight has a lot of oxygen in it.
01:14:02.420 | It's bound very tightly as oxides with other elements
01:14:07.420 | and so you have to separate the oxygen,
01:14:10.700 | which is very energy intensive.
01:14:12.700 | So that also could work together with the solar cells.
01:14:17.420 | But if you can, and then ultimately,
01:14:19.820 | we may be able to find practical quantities of ice
01:14:25.740 | in the permanently shadowed craters
01:14:29.540 | on the poles of the moon.
01:14:30.900 | And we know there is water ice in those craters,
01:14:39.260 | and we know that we can break that down
01:14:42.340 | with electrolysis into hydrogen and oxygen.
01:14:46.180 | And then you'd not only have oxygen,
01:14:48.180 | but you'd also have a very good,
01:14:51.260 | high efficiency propellant fuel in hydrogen.
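(The electrolysis step is standard chemistry, simple but energy-hungry, which is why it pairs naturally with abundant solar power; the enthalpy figure below is a textbook value, not from the interview:

    2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2}, \qquad \Delta H \approx +286\ \mathrm{kJ\ per\ mol\ H_2O}

In practice, producing one kilogram of hydrogen this way takes on the order of 50 kWh of electricity.)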
01:14:56.260 | So there's a lot we can do to make the moon
01:15:00.260 | more sustainable over time.
01:15:02.260 | But the very first step, the kind of gate
01:15:05.140 | that all of that has to go through,
01:15:07.620 | is we need to be able to land cargo and humans
01:15:13.060 | on the surface of the moon at an acceptable cost.
01:15:16.500 | - To fast forward a little bit,
01:15:18.180 | is there any chance Jeff Bezos steps foot
01:15:21.740 | on the moon and on Mars, one or the other or both?
01:15:26.740 | - It's very unlikely.
01:15:29.820 | I think it's probably something that gets done
01:15:32.300 | by future generations by the time it gets to me.
01:15:34.660 | I think in my lifetime, that's probably gonna be
01:15:38.180 | done by professional astronauts.
01:15:40.580 | Sadly, I would love to sign up for that mission.
01:15:43.480 | So don't count me out yet, Lex.
01:15:47.100 | Give me a fighting shot here, maybe.
01:15:50.100 | But I think if we are placing reasonable bets
01:15:55.100 | on such a thing in my lifetime,
01:15:57.100 | that will continue to be done by professional astronauts.
01:15:59.420 | - Yeah, so these are risky, difficult missions.
01:16:02.300 | - And probably missions that require a lot of training.
01:16:05.180 | You are going there for a very specific purpose
01:16:08.200 | to do something.
01:16:09.580 | We're gonna be able to do a lot
01:16:10.580 | on the moon, too, with automation.
01:16:13.020 | So in terms of setting up these factories
01:16:15.580 | and doing all that, we're sophisticated enough now
01:16:18.720 | with automation that we probably don't need humans
01:16:20.960 | to tend those factories and machines.
01:16:23.680 | So there's a lot that's gonna be done in both modes.
01:16:28.700 | - So I have to ask the bigger picture question
01:16:31.620 | about the two companies pushing humanity forward
01:16:36.400 | out towards the stars, Blue Origin and SpaceX.
01:16:39.700 | Are you competitors, collaborators, which, to what degree?
01:16:44.060 | - Well, I would say, just like the internet is big
01:16:47.020 | and there are lots of winners at all scale levels,
01:16:49.020 | I mean, there are half a dozen giant companies
01:16:51.620 | that the internet has made,
01:16:53.460 | but they're a bunch of medium-sized companies
01:16:56.060 | and a bunch of small companies, all successful,
01:16:58.960 | all with profit streams,
01:17:00.340 | all driving great customer experiences.
01:17:02.700 | That's what we wanna see in space.
01:17:06.140 | That kind of dynamism, and space is big.
01:17:09.080 | There's room for a bunch of winners,
01:17:11.440 | and it's gonna happen at all scale levels.
01:17:13.480 | And so SpaceX is gonna be successful, for sure.
01:17:17.580 | I want Blue Origin to be successful,
01:17:20.160 | and I hope there are another five companies right behind us.
01:17:24.400 | - But I spoke to Elon a few times recently
01:17:28.720 | about you, about Blue Origin,
01:17:30.760 | and he was very positive about you as a person
01:17:33.400 | and very supportive of all the efforts
01:17:35.640 | you've been leading at Blue.
01:17:37.180 | What's your thoughts?
01:17:38.080 | You worked with a lot of leaders at Amazon, at Blue.
01:17:42.300 | What's your thoughts about Elon
01:17:43.560 | as a human being and a leader?
01:17:45.340 | - Well, I don't really know Elon very well.
01:17:50.040 | I know his public persona,
01:17:53.000 | but I also know you can't know anyone
01:17:55.760 | by their public persona.
01:17:57.920 | It's impossible.
01:17:59.080 | I mean, you may think you do, but I guarantee you don't.
01:18:02.080 | So I don't really know.
01:18:02.920 | You know Elon way better than I do, Lex,
01:18:04.720 | but in terms of his, judging by the results,
01:18:09.720 | he must be a very capable leader.
01:18:12.760 | There's no way you could have Tesla and SpaceX
01:18:18.400 | without being a capable leader.
01:18:19.840 | It's impossible.
01:18:20.900 | - Yeah, I hope you guys hang out sometimes,
01:18:25.800 | shake hands, and sort of have a kind of friendship
01:18:30.120 | that would inspire just the entirety of humanity,
01:18:32.480 | 'cause what you're doing is like one of the big,
01:18:35.920 | grand challenges ahead for humanity.
01:18:40.360 | - Well, I agree with you,
01:18:41.440 | and I think in a lot of these endeavors,
01:18:44.200 | we're very like-minded.
01:18:45.560 | - Yeah.
01:18:46.400 | - So I think, you know, I think,
01:18:48.460 | I'm not saying we're identical,
01:18:50.120 | but I think we're very like-minded,
01:18:51.880 | and so I, you know, I love that idea.
01:18:56.080 | - All right, going back to sexy pictures on your Instagram,
01:18:59.220 | there's a video of you from the early days of Amazon
01:19:02.940 | giving a tour of your, quote, sort of offices.
01:19:06.500 | I think your dad is holding the camera.
01:19:07.940 | - He is, yeah, I know, right, yes.
01:19:10.120 | This is the one with the giant orange extension cord, yeah.
01:19:12.860 | - And you're like explaining the genius
01:19:15.020 | of the extension cord, and how there's a desk,
01:19:18.940 | and a CRT monitor, and sort of that's where the,
01:19:21.860 | that's where all the magic happens.
01:19:23.260 | I forget what your dad said,
01:19:24.220 | but this is like the center of it all.
01:19:27.220 | So what was it like?
01:19:29.420 | What was going through your mind at that time?
01:19:31.140 | You left a good job in New York, and took this leap.
01:19:36.140 | Were you excited?
01:19:37.220 | Were you scared?
01:19:38.040 | - So excited, and scared, anxious, you know,
01:19:41.540 | thought the odds of success were low,
01:19:44.300 | told all of our early investors
01:19:46.340 | that I thought there was a 30% chance of success,
01:19:49.120 | by which I just mean getting your money back,
01:19:50.780 | not like, not what actually happened,
01:19:53.980 | because that's the truth.
01:19:55.180 | Every startup company is unlikely to work.
01:19:58.220 | It's helpful to be in reality about that,
01:20:03.020 | but that doesn't mean you can't be optimistic.
01:20:05.060 | So you kind of have to have this duality in your head,
01:20:07.700 | like on the one hand, you know what the baseline statistics
01:20:11.700 | say about startup companies, and the other hand,
01:20:14.700 | you have to ignore all of that,
01:20:16.600 | and just be 100% sure it's gonna work.
01:20:19.540 | And you're doing both things at the same time,
01:20:21.700 | you're holding that contradiction in your head.
01:20:24.400 | But it was so, so exciting.
01:20:26.720 | I love, you know, everything from 1994,
01:20:30.180 | when the company was founded, to 1995,
01:20:33.460 | when we opened our doors, all the way until today,
01:20:38.100 | I find Amazon so exciting.
01:20:40.500 | And that doesn't mean, it's like full of pain,
01:20:43.420 | full of problems, you know, it's like,
01:20:46.580 | there's so many things that need to be resolved,
01:20:49.020 | and worked, and made better, and et cetera.
01:20:52.220 | But on balance, it's so fun.
01:20:55.400 | It's such a privilege, it's been such a joy.
01:20:57.840 | I feel so grateful that I've been part of that journey.
01:21:01.960 | It's just been incredible.
01:21:04.720 | - So in some sense, you don't want a single day of comfort.
01:21:07.480 | You've written about this many times.
01:21:10.120 | We'll talk about your writing,
01:21:11.200 | which I would highly recommend people read
01:21:14.560 | in just the letters to shareholders.
01:21:16.400 | So you wrote explaining the idea of day one thinking.
01:21:21.600 | I think you first wrote about it
01:21:23.020 | in the '97 letter to shareholders.
01:21:25.840 | Then you also, in a way, wrote about it in,
01:21:28.300 | sad to say, your last letter to shareholders as CEO.
01:21:33.380 | And you said that day two is stasis,
01:21:37.420 | followed by irrelevance,
01:21:40.300 | followed by excruciating painful decline,
01:21:43.900 | followed by death.
01:21:45.900 | And that is why it's always day one.
01:21:47.700 | (laughing)
01:21:49.820 | Can you explain this day one thing?
01:21:51.180 | This is a really powerful way to describe
01:21:53.720 | the beginning and the journey of Amazon.
01:21:56.080 | - It's really a very simple and, I think,
01:22:00.880 | age-old idea about renewal and rebirth.
01:22:05.120 | And every day is day one.
01:22:08.500 | Every day, you're deciding what you're gonna do.
01:22:12.380 | And you are not trapped by what you were,
01:22:17.240 | or who you were, or any self-consistency.
01:22:20.720 | Self-consistency even can be a trap.
01:22:23.960 | And so day one thinking is kind of,
01:22:27.720 | we start fresh every day.
01:22:30.280 | And we get to make new decisions every day
01:22:33.800 | about invention, about customers,
01:22:37.000 | about how we're going to operate.
01:22:40.720 | Even as deeply as what our principles are,
01:22:44.500 | we can go back to that.
01:22:45.560 | It turns out we don't change those very often,
01:22:47.320 | but we change them occasionally.
01:22:49.320 | And when we work on programs at Amazon,
01:22:54.320 | we often make a list of tenets.
01:22:58.800 | And the tenets are kind of, they're not principles.
01:23:03.000 | They're a little more tactical than principles,
01:23:05.240 | but it's kind of the main ideas
01:23:08.600 | that we want this program to embody, whatever those are.
01:23:12.400 | And one of the things that we do is we put,
01:23:15.320 | these are the tenets for this program,
01:23:17.120 | and in parentheses we always put,
01:23:19.600 | unless you know a better way.
01:23:21.280 | And that idea, unless you know a better way,
01:23:27.260 | is so important because you never want
01:23:30.760 | to get trapped by dogma.
01:23:33.060 | You never want to get trapped by history.
01:23:35.400 | It doesn't mean you discard history or ignore it.
01:23:38.720 | There's so much value in what has worked in the past.
01:23:42.360 | But you can't be blindly following what you've done.
01:23:46.320 | And that's the heart of day one,
01:23:48.580 | is you're always starting fresh.
01:23:51.120 | - And to the question of how to fend off day two,
01:23:53.840 | you said, "Such a question can't have a simple answer,"
01:23:57.160 | as you're saying.
01:23:58.080 | "There will be many elements, multiple paths,
01:23:59.960 | "and many traps.
01:24:01.200 | "I don't know the whole answer, but I may know bits of it.
01:24:04.880 | "Here's a starter pack of essentials."
01:24:07.160 | Maybe others come to mind.
01:24:08.320 | For day one defense:
01:24:10.620 | Customer obsession, a skeptical view of proxies,
01:24:14.240 | the eager adoption of external trends
01:24:17.000 | and high-velocity decision-making.
01:24:19.300 | So we talked about high-velocity decision-making.
01:24:22.080 | That's more difficult than it sounds.
01:24:24.720 | So maybe you can pick one that stands out to you
01:24:27.680 | as you can comment on.
01:24:28.920 | Eager adoption of external trends,
01:24:31.960 | high-velocity decision-making, skeptical view of proxies.
01:24:34.680 | How do you fight off day two?
01:24:36.580 | - Well, you know, I'll talk about,
01:24:38.200 | because I think it's the one that is, maybe in some ways,
01:24:41.800 | the hardest to understand,
01:24:43.760 | is the skeptical view of proxies.
01:24:48.260 | One of the things that happens in business,
01:24:52.720 | probably anything that you're,
01:24:54.600 | where you're, you know, you have an ongoing program
01:24:57.600 | and something is underway for a number of years,
01:25:01.560 | is you develop certain things that you're managing to.
01:25:04.680 | Like, let's say, the typical case would be a metric.
01:25:08.000 | And that metric isn't the real underlying thing.
01:25:11.880 | And so, you know, maybe the metric is efficiency metric
01:25:17.880 | around customer contacts per unit sold or something.
01:25:22.160 | Like, if you sell a million units,
01:25:24.920 | how many customer contacts do you get?
01:25:26.720 | Or how many returns do you get?
01:25:28.320 | And so on and so on.
01:25:29.760 | And so what happens is a little bit of a kind of inertia
01:25:33.320 | sets in, where somebody a long time ago
01:25:38.200 | invented that metric.
01:25:41.320 | They decided we need to watch for, you know,
01:25:44.780 | customer returns per unit sold as an important metric.
01:25:49.780 | But they had a reason why they chose that metric,
01:25:53.420 | the person who invented that metric
01:25:55.280 | and decided it was worth watching.
01:25:57.680 | And then fast forward five years,
01:25:59.320 | that metric is the proxy.
01:26:01.600 | - The proxy for truth, I guess.
01:26:02.980 | - The proxy for truth.
01:26:03.920 | The proxy for, let's say in this case,
01:26:06.360 | it's a proxy for customer happiness.
01:26:08.400 | But that metric is not actually customer happiness.
01:26:12.780 | It's a proxy for customer happiness.
01:26:15.560 | The person who invented the metric
01:26:17.680 | understood that connection.
01:26:20.200 | Five years later, a kind of inertia can set in
01:26:25.200 | and you forget the truth behind why you were watching
01:26:30.360 | that metric in the first place.
01:26:32.040 | And the world shifts a little.
01:26:33.580 | And now that proxy isn't as valuable as it used to be,
01:26:38.060 | or it's missing something.
01:26:39.680 | And you have to be on alert for that.
01:26:42.460 | You have to know, okay, this is,
01:26:44.340 | I don't really care about this metric.
01:26:46.700 | I care about customer happiness.
01:26:49.500 | And this metric is worth putting energy into
01:26:54.500 | and following and improving and scrutinizing
01:26:58.320 | only in so much as it actually affects customer happiness.
01:27:03.220 | And so you've got to constantly be on guard.
01:27:05.020 | And it's very, very common.
01:27:06.820 | This is a nuanced problem.
01:27:09.140 | It's very common, especially in large companies,
01:27:12.620 | that they are managing to metrics
01:27:14.820 | that they don't really understand.
01:27:16.980 | They don't really know why they exist.
01:27:19.340 | And the world may have shifted out from under them a little.
01:27:22.220 | And the metrics are no longer as relevant as they were
01:27:25.420 | when somebody 10 years earlier invented the metric.
01:27:29.360 | - That is a nuance, but that's a big problem, right?
01:27:33.600 | There's something so compelling
01:27:36.120 | to have a nice metric to try to optimize.
01:27:38.600 | - Yes.
01:27:39.440 | And by the way, you do need metrics.
01:27:41.000 | - Yes, you do.
01:27:41.840 | - You can't ignore them.
01:27:43.560 | You want them.
01:27:44.640 | But you just have to be constantly on guard.
01:27:48.000 | This is a way to slip into day two thinking,
01:27:52.000 | would be to manage your business to metrics
01:27:54.760 | that you don't really understand,
01:27:56.860 | and you're not really sure why they were invented
01:27:58.660 | in the first place, and you're not sure
01:28:00.420 | they're still as relevant as they used to be.
01:28:02.940 | - What does it take to be the guy or gal
01:28:05.300 | who brings up the point that this proxy might be outdated?
01:28:09.940 | I guess, what does it take to have a culture
01:28:12.260 | that enables that in the meeting?
01:28:14.540 | 'Cause that's a very uncomfortable thing
01:28:16.080 | to bring up in a meeting.
01:28:17.860 | We all showed up here, it's a Friday.
01:28:19.940 | - You have just asked a million-dollar question,
01:28:24.180 | so this is, if I generalize what you're asking,
01:28:29.180 | you're talking in general about truth telling.
01:28:33.980 | - Yeah.
01:28:35.100 | - And we humans are not really truth-seeking animals.
01:28:40.100 | We are social animals.
01:28:42.820 | - Yeah, we are.
01:28:44.220 | - And take you back in time 10,000 years
01:28:47.420 | and you're in a small village.
01:28:48.940 | If you go along to get along, you can survive.
01:28:53.620 | You can procreate.
01:28:55.300 | If you're the village truth teller,
01:28:57.980 | you might get clubbed to death in the middle of the night.
01:29:01.060 | Truths are often, they don't want to be heard
01:29:05.220 | 'cause important truths can be uncomfortable,
01:29:09.740 | they can be awkward, they can be exhausting.
01:29:12.740 | - Impolite.
01:29:13.580 | - Yes, challenging.
01:29:16.100 | They can make people defensive
01:29:17.580 | even if that's not the intent.
01:29:19.660 | But any high-performing organization,
01:29:21.860 | whether it's a sports team, a business,
01:29:24.700 | a political organization, an activist group,
01:29:27.340 | I don't care what it is,
01:29:29.220 | any high-performing organization has to have mechanisms
01:29:34.220 | and a culture that supports truth telling.
01:29:38.300 | One of the things you have to do
01:29:39.620 | is you have to talk about that.
01:29:41.220 | You have to talk about the fact
01:29:42.620 | that it takes energy to do that.
01:29:45.440 | You have to talk to people.
01:29:46.580 | You have to remind people it's okay that it's uncomfortable.
01:29:50.940 | You have to literally tell people
01:29:53.140 | it's not what we're designed to do as humans.
01:29:56.340 | It's not really, it's kind of a side effect.
01:29:59.420 | You know, we can do that.
01:30:00.940 | But it's not how we survive.
01:30:02.540 | We mostly survive by being social animals
01:30:05.500 | and being cordial and cooperative,
01:30:08.100 | and that's really important.
01:30:10.140 | And so there's a, you know,
01:30:12.820 | science is all about truth telling.
01:30:14.620 | It's actually a very formal mechanism
01:30:18.780 | for trying to tell the truth.
01:30:22.020 | And even in science,
01:30:24.060 | you find that it's hard to tell the truth, right?
01:30:27.720 | Even, you know, you're supposed to have a hypothesis,
01:30:31.460 | test it, and find data, and reject the hypothesis,
01:30:34.180 | and so on.
01:30:35.420 | It's not easy.
01:30:36.660 | - But even in science, there's like the senior scientists
01:30:40.300 | and the junior scientists. - Correct.
01:30:41.640 | - And then there's a hierarchy of humans
01:30:43.900 | where the somehow seniority-- - It's still social.
01:30:46.180 | - Somehow seniority matters in the scientific process,
01:30:49.220 | which it should not. - And that's true
01:30:50.420 | inside companies, too.
01:30:51.700 | And so you wanna set up your culture
01:30:55.460 | so that the most junior person
01:30:57.740 | can overrule the most senior person, if they have data.
01:31:03.620 | And that really is about trying to, you know,
01:31:08.940 | there are little things you can do.
01:31:10.460 | So for example, in every meeting that I attend,
01:31:13.580 | I always speak last.
01:31:16.320 | And I know from experience
01:31:19.920 | that, you know, if I speak first,
01:31:24.740 | even very strong-willed,
01:31:28.280 | highly intelligent, high-judgment participants
01:31:33.600 | in that meeting will wonder,
01:31:36.360 | "Well, if Jeff thinks that,
01:31:38.880 | "I came in this meeting thinking one thing,
01:31:41.080 | "but maybe I'm not right."
01:31:44.520 | And so you can do little things like,
01:31:47.160 | if you're the most senior person in the room, go last,
01:31:52.080 | but everybody else go first.
01:31:53.440 | In fact, ideally,
01:31:55.120 | let's try to have the most junior person go first
01:31:57.680 | and the second, then try to go in order of seniority
01:32:01.080 | so that you can hear everyone's opinion
01:32:05.600 | in a kind of unfiltered way, because we really do.
01:32:08.880 | We actually literally change our opinions.
01:32:11.240 | If somebody who you really respect says something,
01:32:14.400 | it makes you change your mind a little.
01:32:17.360 | - So you're saying implicitly or explicitly give permission
01:32:22.360 | for people to have a strong opinion
01:32:25.760 | as long as it's backed by data?
01:32:27.640 | - Yes, and sometimes it can even,
01:32:29.440 | by the way, a lot of our most powerful truths
01:32:32.600 | turn out to be hunches.
01:32:34.440 | They turn out to be based on anecdotes.
01:32:36.840 | They're intuition-based.
01:32:38.560 | And sometimes you don't even have strong data,
01:32:41.640 | but you may know the person well enough
01:32:44.280 | to trust their judgment.
01:32:45.760 | You may feel yourself leaning in.
01:32:47.600 | It may resonate with a set of anecdotes you have.
01:32:50.960 | And then you may be able to say,
01:32:52.200 | "You know, something about that feels right.
01:32:56.640 | Let's go collect some data on that.
01:32:58.720 | Let's try to see if we can actually know whether it's right.
01:33:02.720 | But for now, let's not disregard it 'cause it feels right."
01:33:06.620 | You can also fight inherent bias.
01:33:08.440 | There's an optimism bias.
01:33:09.800 | Like if there are two interpretations of a new set of data
01:33:14.040 | and one of 'em is happy and one of 'em is unhappy,
01:33:17.360 | it's a little dangerous to jump to the conclusion
01:33:20.560 | that the happy interpretation is right.
01:33:23.280 | You may want to sort of compensate for that human bias
01:33:27.160 | of looking for, you know,
01:33:28.840 | trying to find the silver lining and say,
01:33:30.480 | "Look, that might be good,
01:33:32.920 | but I'm gonna go with it's bad for now until we're sure."
01:33:36.520 | - So speaking of happiness bias,
01:33:38.920 | data collection, and anecdotes,
01:33:41.440 | you have to, how's that for a transition?
01:33:43.500 | (laughing)
01:33:45.360 | You have to tell me the story of the call you made,
01:33:50.360 | the customer service call you made
01:33:52.560 | to demonstrate a point about wait times.
01:33:57.280 | - Yeah, this is very early in the history of Amazon.
01:34:00.880 | And we were going over a weekly business review
01:34:04.880 | and a set of documents.
01:34:05.760 | And I have a saying,
01:34:08.640 | which is when the data and the anecdotes disagree,
01:34:12.240 | the anecdotes are usually right.
01:34:14.520 | And it doesn't mean you just slavishly
01:34:17.600 | go follow the anecdotes then.
01:34:18.760 | It means you go examine the data.
01:34:20.760 | 'Cause it's usually not that the data is being miscollected.
01:34:25.760 | It's usually that you're not measuring the right thing.
01:34:30.100 | And so, you know,
01:34:31.080 | if you have a bunch of customers complaining about something
01:34:34.520 | and at the same time, you know,
01:34:36.920 | your metrics look like why they shouldn't be complaining,
01:34:40.300 | you should doubt the metrics.
01:34:43.760 | And an early example of this
01:34:46.760 | was we had metrics that showed that our customers
01:34:51.040 | were waiting, I think, less than, I don't know, 60 seconds.
01:34:54.280 | When they called a 1-800 number
01:34:55.920 | to get, you know, phone customer service,
01:34:59.080 | the wait time was supposed to be less than 60 seconds.
01:35:02.700 | And, but we had a lot of complaints
01:35:04.760 | that it was longer than that.
01:35:06.480 | And anecdotally, it seemed longer than that.
01:35:09.320 | Like, you know, I would call customer service myself.
01:35:12.160 | And so one day we're in a meeting,
01:35:13.640 | we're going through the WBR
01:35:15.600 | the Weekly Business Review,
01:35:17.680 | and we get to this metric in the deck.
01:35:20.360 | And the guy who leads customer service
01:35:22.960 | is defending the metric.
01:35:24.120 | And I said, okay, let's call.
01:35:27.080 | (laughing)
01:35:29.560 | Picked up the phone, and I dialed the 1-800 number
01:35:32.820 | and called customer service.
01:35:34.860 | And we just waited in silence for a few minutes.
01:35:39.260 | - What did it turn out to be?
01:35:40.100 | - Oh, it was really long, more than 10 minutes, I think.
01:35:42.740 | - Oh, wow.
01:35:43.580 | - I mean, it was many minutes.
01:35:45.300 | And so, you know, it dramatically made the point
01:35:47.740 | that something was wrong with the data collection.
01:35:50.340 | We weren't measuring the right thing.
01:35:51.940 | And that, you know, set off a whole chain of events
01:35:54.140 | where we started measuring it right.
01:35:55.960 | And that's an example, by the way, of truth-telling.
01:35:59.800 | Like, that's an uncomfortable thing to do.
01:36:03.400 | But you have to seek truth, even when it's uncomfortable.
01:36:08.400 | And you have to get people's attention,
01:36:10.800 | and they have to buy into it,
01:36:12.120 | and they have to get energized around really fixing things.
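A minimal sketch of the measurement gap described above, with hypothetical field names and made-up numbers. The dashboard metric below starts its clock when a call enters the agent queue; the customer's clock starts when they dial, so time spent navigating a phone tree never shows up in the metric:

```python
from dataclasses import dataclass

@dataclass
class Call:
    dialed_at: float         # when the customer dialed (seconds)
    entered_queue_at: float  # when the call cleared the phone tree and joined the queue
    answered_at: float       # when an agent picked up

# Hypothetical calls: most of the wait happens before the queue is even entered.
calls = [
    Call(dialed_at=0.0, entered_queue_at=540.0, answered_at=585.0),
    Call(dialed_at=0.0, entered_queue_at=580.0, answered_at=610.0),
]

# The dashboard metric: queue entry to answer. Comes out well under 60 seconds.
dashboard_wait = sum(c.answered_at - c.entered_queue_at for c in calls) / len(calls)

# What the customer experiences: dialing to answer. Comes out around 10 minutes.
customer_wait = sum(c.answered_at - c.dialed_at for c in calls) / len(calls)

print(f"dashboard says {dashboard_wait:.0f}s; customer waits {customer_wait:.0f}s")
```

Both numbers come from the same calls; only the choice of starting event differs, which is the sense in which the data isn't miscollected so much as it's measuring the wrong thing.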
01:36:16.180 | - So that speaks to the obsession
01:36:18.280 | with the customer experience.
01:36:19.720 | So one of the defining aspects of your approach to Amazon
01:36:23.440 | is just being obsessed with making customers happy.
01:36:27.040 | I think companies sometimes say that,
01:36:29.600 | but Amazon is really obsessed with that.
01:36:34.240 | I think there's something really profound to that,
01:36:37.080 | which is seeing the world through the eyes of the customer,
01:36:39.760 | like the customer experience, the human being
01:36:42.440 | that's using the product, that's enjoying the product,
01:36:46.520 | like the subtle little things
01:36:49.200 | that make up their experience.
01:36:52.260 | Like, how do you optimize those?
01:36:53.860 | - This is another really good and kind of deep question,
01:37:00.200 | because there are big things
01:37:04.680 | that are really important to manage,
01:37:07.120 | and then there are small things.
01:37:10.040 | Internally at Amazon, we call them paper cuts.
01:37:12.480 | So we're always working on the big things,
01:37:16.240 | and most of the energy
01:37:18.680 | goes into the big things, as it should.
01:37:20.320 | And you can identify the big things,
01:37:23.660 | and I would encourage anybody,
01:37:25.020 | if anybody listening to this is an entrepreneur,
01:37:29.100 | has a small business, whatever,
01:37:30.720 | think about the things that are not going to change
01:37:35.500 | over 10 years, and those are probably the big things.
01:37:38.540 | So like, I know in our retail business at Amazon,
01:37:43.140 | 10 years from now,
01:37:43.980 | customers are still gonna want low prices.
01:37:45.900 | I know they're still gonna want fast delivery,
01:37:47.820 | and I just know they're still gonna want big selection.
01:37:50.220 | So it's impossible to imagine a scenario
01:37:53.120 | where 10 years from now,
01:37:55.380 | a customer says, "I love Amazon,
01:37:56.900 | "I just wish the prices were a little higher."
01:37:58.940 | Or, "I love Amazon, I just wish
01:38:00.340 | "you delivered a little more slowly."
01:38:02.400 | So, when you identify the big things,
01:38:05.760 | you can tell they're worth putting energy into,
01:38:08.400 | because they're stable in time.
01:38:10.840 | Okay.
01:38:12.000 | But you're asking about something a little different,
01:38:14.320 | which is, in every customer experience,
01:38:16.840 | there are those big things, and by the way,
01:38:18.520 | it's astonishingly hard to focus even on just the big things.
01:38:21.840 | So, even though they're obvious,
01:38:24.640 | they're really hard to focus on.
01:38:27.080 | But in addition to that, there are all these little,
01:38:30.160 | tiny customer experience deficiencies,
01:38:33.920 | and we call those paper cuts.
01:38:35.680 | And we make long lists of them,
01:38:37.320 | and then we have dedicated teams that go fix paper cuts,
01:38:43.200 | because the teams working on the big issues
01:38:47.320 | never get to the paper cuts.
01:38:49.920 | They never work their way down the list to get to them;
01:38:53.480 | they're working on big things, as they should,
01:38:56.520 | and as you want them to.
01:38:57.840 | And so, you need special teams
01:39:00.920 | who are charged with fixing paper cuts.
01:39:04.720 | - Where would you put, on the paper cut spectrum,
01:39:08.140 | the Buy Now with 1-Click button,
01:39:10.400 | which is, I think, pretty genius.
01:39:12.280 | So, to me, like, okay.
01:39:15.040 | My interaction with things I love on the internet.
01:39:18.260 | There's things I do a lot.
01:39:19.880 | I may be representing a regular human.
01:39:23.080 | I would love for those things to be frictionless.
01:39:25.480 | For example, booking airline tickets.
01:39:27.540 | Just saying.
01:39:29.720 | But buying a thing with one click,
01:39:34.720 | making that experience frictionless,
01:39:38.840 | intuitive, all aspects of that.
01:39:41.000 | That just fundamentally makes my life better.
01:39:46.000 | Not just in terms of efficiency,
01:39:47.600 | in terms of some kind of--
01:39:48.960 | - Cognitive load.
01:39:49.960 | - Yeah, cognitive load and inner peace and happiness.
01:39:53.720 | First of all, buying stuff is a pleasant experience.
01:39:58.720 | Having enough money to buy a thing
01:40:01.160 | and then buying it is a pleasant experience.
01:40:03.520 | And having pain around that,
01:40:05.680 | it's somehow, you're ruining a beautiful experience.
01:40:10.040 | - And I guess all I'm saying,
01:40:12.320 | as a person who loves good ideas,
01:40:14.960 | is: is that a solution to a paper cut?
01:40:17.400 | - Yes.
01:40:18.480 | So, that particular thing is probably a solution
01:40:21.580 | to a number of paper cuts.
01:40:23.200 | So, if you go back and look at our order pipeline
01:40:25.660 | and how people shopped on Amazon
01:40:27.280 | before we invented one-click shopping,
01:40:30.380 | there was more friction.
01:40:32.640 | There was a whole series of paper cuts.
01:40:36.000 | And that invention eliminated a bunch of paper cuts.
01:40:40.520 | And I think you're absolutely right, by the way,
01:40:44.180 | that when you come up with something like one-click shopping,
01:40:48.540 | again, this is so ingrained in people now.
01:40:51.840 | I'm impressed that you even notice it.
01:40:53.480 | I mean, most people--
01:40:54.640 | - Every time I click the button.
01:40:56.020 | (laughing)
01:40:56.860 | I just-- - Most people never notice.
01:40:58.360 | - Surge of happiness.
01:40:59.680 | - There is, in the perfect invention,
01:41:03.840 | for the perfect moment, in the perfect context,
01:41:06.600 | real beauty.
01:41:07.880 | It is actual beauty, and it feels good.
01:41:13.360 | It's emotional.
01:41:14.600 | It's emotional for the inventor.
01:41:16.480 | It's emotional for the team that builds it.
01:41:18.520 | It's emotional for the customer.
01:41:20.240 | It's a big deal.
01:41:21.360 | And you can feel those things.
01:41:23.520 | - But to keep coming up with that idea,
01:41:25.680 | with those kinds of ideas,
01:41:26.920 | I guess is the day one thinking effort.
01:41:29.000 | - Yeah, and you need a big group of people
01:41:31.600 | who feel that kind of satisfaction
01:41:35.320 | with creating that kind of beauty.
01:41:37.100 | - There's a lot of books written about you.
01:41:41.640 | There's a book, "Invent and Wander,"
01:41:43.840 | where Walter Isaacson does an intro,
01:41:45.760 | and it's mostly collected writings of yours.
01:41:48.680 | I've read that.
01:41:49.640 | I also recommend people check out "The Founders" podcast
01:41:52.520 | that covers you a lot, and it does different analysis
01:41:56.760 | of different business advice you've given over the years.
01:41:59.960 | - I bring all that up because I saw it mentioned
01:42:03.420 | that you said that books are an antidote
01:42:07.640 | for short attention spans.
01:42:10.160 | And I forget how it was phrased,
01:42:11.840 | but that when you were thinking about the Kindle,
01:42:15.240 | you were thinking about how technology changes us.
01:42:20.240 | - We co-evolve with our tools.
01:42:26.520 | So we invent new tools, and then our tools change us.
01:42:30.840 | - Which is fascinating to think about.
01:42:31.680 | - It goes in a circle.
01:42:33.600 | - And there's some aspect, even just inside business,
01:42:37.600 | where you don't just make the customer happy,
01:42:40.000 | but you also have to think about
01:42:41.520 | where is this going to take humanity,
01:42:44.040 | if you zoom out a bit.
01:42:45.480 | - 100%, and you can feel your brain.
01:42:50.480 | Brains are plastic, and you can feel your brain
01:42:56.440 | getting reprogrammed.
01:42:57.360 | I remember the first time this happened to me
01:43:00.280 | was when Tetris first came on the scene.
01:43:04.680 | I'm sure you've had, anybody who's been a game player
01:43:07.100 | has this experience where you close your eyes
01:43:10.360 | to lay down to go to sleep,
01:43:12.000 | and you see all the little blocks moving,
01:43:16.080 | and you're kind of rotating them in your mind,
01:43:18.480 | and you can just tell as you walk around the world
01:43:21.080 | that you have rewired your brain to play Tetris.
01:43:25.880 | And, but that happens with everything.
01:43:29.960 | And so, I think
01:43:32.920 | we have yet to see the full repercussions of this,
01:43:37.000 | I fear, but I think one of the things
01:43:39.360 | that we've done online,
01:43:41.560 | largely because of social media,
01:43:44.220 | is we have trained our brains to be really good
01:43:47.580 | at processing super short-form content.
01:43:51.440 | And your podcast flies in the face of this.
01:43:55.720 | You do these long-format things.
01:43:58.360 | And reading books is a long-format thing.
01:44:02.920 | And if something is convenient,
01:44:07.920 | we do more of it.
01:44:09.540 | And so, when you make tools,
01:44:11.120 | we carry around in our pocket a phone.
01:44:17.360 | And one of the things that phone does, for the most part,
01:44:20.160 | is it is an attention-shortening device.
01:44:23.120 | Because most of the things we do on our phone
01:44:25.280 | shorten our attention spans.
01:44:27.640 | And I'm not even gonna say we know for sure that that's bad,
01:44:30.560 | but I do think it's happening.
01:44:31.760 | It's one of the ways we're co-evolving with that tool.
01:44:34.780 | But I think it's important to spend some of your time
01:44:38.520 | and some of your life doing long attention-span things.
01:44:41.560 | - Yeah, I think you've spoken about the value
01:44:44.440 | in your own life of focus,
01:44:45.980 | of singular focus on a thing for prolonged periods of time.
01:44:50.720 | And that's certainly what books do.
01:44:52.800 | And that's certainly what that piece of technology does.
01:44:55.040 | But I bring all that up to ask you
01:44:57.040 | about another piece of technology, AI,
01:45:00.920 | that has the potential to have various trajectories
01:45:05.920 | to have an impact on human civilization.
01:45:10.360 | How do you think AI will change this?
01:45:12.440 | - If you're talking about generative AI,
01:45:17.080 | large language models, things like ChatGPT,
01:45:19.600 | and its soon-to-come successors.
01:45:22.900 | And these are incredibly powerful technologies,
01:45:27.900 | soon to be even more powerful.
01:45:33.900 | To believe otherwise is to bury your head in the sand.
01:45:35.740 | It's interesting to me that large language models
01:45:45.980 | in their current form are not inventions,
01:45:49.500 | they're discoveries.
01:45:51.920 | You know, the telescope was an invention,
01:45:54.380 | but looking through it at Jupiter,
01:45:58.560 | knowing that it had moons was a discovery.
01:46:02.420 | My God, it has moons.
01:46:05.600 | And that's what Galileo did.
01:46:09.040 | And so, this is closer to the discovery end of that spectrum.
01:46:13.200 | You know, we know exactly what happens with a 787.
01:46:16.740 | It's an engineered object, we designed it,
01:46:20.240 | we know how it behaves, we don't want any surprises.
01:46:24.160 | Large language models are much more like discoveries.
01:46:28.640 | We're constantly getting surprised by their capabilities.
01:46:31.520 | They're not really engineered objects.
01:46:33.440 | Then, you know, you have this debate
01:46:39.040 | about whether they're gonna be good for humanity
01:46:41.240 | or bad for humanity.
01:46:42.720 | You know, even specialized AI
01:46:49.040 | can be very bad for humanity.
01:46:51.080 | I mean, you know,
01:46:52.920 | just regular machine learning models
01:46:55.760 | can make, you know, certain weapons of war
01:47:00.760 | that could be incredibly destructive and very powerful.
01:47:04.680 | And they're not general AIs,
01:47:05.920 | they could just be very smart weapons.
01:47:09.340 | And so, we have to think about all of those things.
01:47:17.800 | I'm very optimistic about this.
01:47:20.160 | So, even in the face of all this uncertainty,
01:47:23.380 | my own view is that these powerful tools
01:47:29.320 | are much more likely to help us and save us even
01:47:35.040 | than they are to, on balance, hurt us and destroy us.
01:47:39.160 | I think, you know, we humans have a lot of ways
01:47:41.720 | we can make ourselves go extinct.
01:47:45.360 | These things may help us not do that, you know.
01:47:48.000 | So, they may actually save us.
01:47:50.940 | So, the people who are, in my view,
01:47:54.160 | overly concerned, it's a valid debate.
01:47:57.140 | I think that they may be missing part of the equation,
01:48:02.200 | which is how helpful they could be
01:48:03.760 | in making sure we don't destroy ourselves.
01:48:05.960 | I don't know if you saw the movie "Oppenheimer,"
01:48:11.200 | but to me, first of all, I loved the movie,
01:48:14.620 | and I thought the best part of the movie
01:48:17.720 | is this bureaucrat, played by Robert Downey Jr.,
01:48:22.120 | who, you know, some of the people I've talked to
01:48:23.600 | think that's the most boring part of the movie.
01:48:25.840 | I thought it was the most fascinating,
01:48:28.080 | because what's going on here is you realize
01:48:32.280 | we have invented these awesome, destructive,
01:48:36.400 | powerful technologies called nuclear weapons,
01:48:40.520 | and they are managed, and, you know,
01:48:43.440 | we humans are not really capable
01:48:48.440 | of wielding those weapons.
01:48:51.580 | - Yeah.
01:48:52.420 | - And that's what he represented
01:48:54.600 | in that movie: here's this guy who is
01:48:58.940 | being so petty,
01:49:03.360 | he wrongly thinks
01:49:04.960 | that Oppenheimer said something bad to Einstein about him.
01:49:08.160 | They didn't talk about him at all,
01:49:09.640 | as you find out in the final scene of the movie,
01:49:12.200 | and yet he spent his career trying to be vengeful
01:49:16.560 | and petty, and that's the problem.
01:49:21.320 | We as a species are not really sophisticated enough
01:49:26.320 | and mature enough to handle these technologies.
01:49:32.400 | And by the way, before you get to general AI
01:49:36.080 | and the possibility of AI having agency,
01:49:38.920 | and there's a lot of things that would have to happen,
01:49:40.440 | but there's so much benefit that's gonna come
01:49:43.640 | from these technologies in the meantime,
01:49:46.240 | even before they're, you know, general AI,
01:49:49.320 | in terms of better medicines and better tools
01:49:52.800 | to develop more technologies and so on,
01:49:54.800 | so I think it's an incredible moment to be alive
01:49:59.800 | and to witness the transformations that are gonna happen.
01:50:02.860 | How quickly it'll happen, no one knows,
01:50:04.340 | but over the next 10 years and 20 years,
01:50:06.200 | I think we're gonna see really remarkable advances,
01:50:09.620 | and I personally am very excited about it.
01:50:12.600 | - First of all, really interesting to say
01:50:13.960 | that it's discoveries that it's true
01:50:16.920 | that we don't know the limits of what's possible
01:50:21.920 | with the current language models.
01:50:24.080 | - We don't.
01:50:24.920 | - And like, it could be a few tricks and hacks
01:50:27.800 | here and there that open doors
01:50:31.000 | to whole entire new possibilities.
01:50:33.320 | - We do know that humans are doing something different
01:50:36.920 | from these models, in part because,
01:50:41.260 | you know, we're so power-efficient.
01:50:44.560 | You know, the human brain does remarkable things,
01:50:48.080 | and it does it on about 20 watts of power,
01:50:51.940 | and, you know, the AI techniques we use today
01:50:55.740 | use many kilowatts of power to do equivalent tasks,
01:50:59.120 | so there's something interesting
01:51:00.360 | about the way the human brain does this,
01:51:02.560 | and also, we don't need as much data,
01:51:04.520 | so, you know, like, self-driving cars are,
01:51:08.040 | they have to drive billions and billions of miles
01:51:10.400 | to try to learn how to drive,
01:51:12.600 | and, you know, your average 16-year-old figures it out
01:51:16.180 | with many fewer miles, so there are still some tricks,
01:51:21.720 | I think, that we have yet to learn.
01:51:23.880 | I don't think we've learned the last trick.
01:51:25.640 | I don't think it's just a question of scaling things up,
01:51:30.120 | but what's interesting is that just scaling things up,
01:51:33.160 | and I put just in quotes
01:51:34.560 | because it's actually hard to scale things up,
01:51:36.160 | but just scaling things up
01:51:37.920 | also appears to pay huge dividends.
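A back-of-envelope sketch of the power-efficiency gap mentioned above. The 20-watt brain figure is the one cited in the conversation; the accelerator wattage and task duration are illustrative assumptions, not measurements:

```python
# Rough energy comparison for "an equivalent task" -- purely illustrative numbers.
BRAIN_WATTS = 20           # figure cited for the human brain
ACCELERATOR_WATTS = 5_000  # assumed: a small multi-GPU inference server
TASK_SECONDS = 60          # assumed: one minute of work either way

brain_joules = BRAIN_WATTS * TASK_SECONDS          # 1,200 J
machine_joules = ACCELERATOR_WATTS * TASK_SECONDS  # 300,000 J

print(f"machine uses {machine_joules / brain_joules:.0f}x the energy")  # 250x here
```

Whatever the exact figures, a gap of two or three orders of magnitude is what motivates the claim that the brain is doing something different, not just something smaller.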
01:51:40.320 | - Yeah, and there's some more nuanced aspects
01:51:42.960 | about human beings that it would be interesting
01:51:44.840 | to see if AI is able to accomplish, like
01:51:46.640 | being truly original and novel, you know,
01:51:50.040 | large language models being able
01:51:51.960 | to come up with some truly new ideas.
01:51:54.980 | That's one, and the other one is truth.
01:51:59.760 | It seems that large language models
01:52:01.640 | are very good at sounding like they're saying a true thing,
01:52:06.640 | but they don't require or often have a grounding
01:52:11.400 | in sort of a mathematical truth.
01:52:14.040 | It can just, like, basically be a very good bullshitter.
01:52:17.120 | So if there's not enough sort of data in the training data
01:52:24.320 | about a particular topic,
01:52:25.520 | it's just going to concoct accurate-sounding narratives,
01:52:30.320 | which is a very fascinating problem to try to solve.
01:52:34.120 | How do you get language models
01:52:37.080 | to infer what is true or not, to sort of introspect?
01:52:41.880 | - Yeah, they need to be taught to say,
01:52:43.520 | "I don't know," more often.
01:52:46.120 | And I know several humans who could be taught that as well.
01:52:50.040 | - Sure, and then the other stuff,
01:52:52.400 | because you're still a bit involved
01:52:54.200 | in the Amazon side with the AI things.
01:52:56.640 | The other open question is what kind of products
01:52:59.360 | are created from this?
01:53:01.120 | - Oh, so many.
01:53:02.240 | - Yeah.
01:53:03.080 | - I mean, you know, just for example,
01:53:05.160 | we have Alexa and Echo,
01:53:08.640 | and Alexa has, you know, an installed base
01:53:11.340 | of hundreds of millions of devices.
01:53:15.080 | And so, you know, there's Alexa everywhere,
01:53:17.800 | and guess what?
01:53:18.920 | Alexa is about to get a lot smarter.
01:53:20.840 | - Yeah.
01:53:21.680 | - And so that's really, you know,
01:53:24.120 | from a product point of view, that's super exciting.
01:53:27.080 | - There's so many opportunities there.
01:53:28.800 | - So many opportunities.
01:53:30.080 | Shopping assistant, you know, all that stuff is amazing.
01:53:33.680 | And AWS, you know, we're building Titan,
01:53:36.840 | which is our foundation model.
01:53:38.600 | We're also building Bedrock,
01:53:41.600 | which is for our corporate clients at AWS,
01:53:45.400 | our enterprise clients.
01:53:46.640 | They want to be able to use these powerful models
01:53:49.440 | with their own corporate data
01:53:51.640 | without accidentally contributing their corporate data
01:53:55.760 | to that model.
01:53:57.040 | - Yes.
01:53:57.880 | - So those are the tools we're building for them
01:53:59.080 | with Bedrock.
01:54:00.200 | So there's tremendous opportunity here.
01:54:02.800 | - Yeah, the security, the privacy,
01:54:04.000 | all those things are fascinating,
01:54:05.680 | of how to do that, 'cause so much value can be gained
01:54:08.000 | by training on private data,
01:54:09.800 | but you want to keep it secure.
01:54:11.080 | That's a fascinating technical problem.
01:54:13.520 | - This is a very challenging technical problem,
01:54:15.840 | and it's one that we're, you know,
01:54:17.280 | making progress on and dedicated to solving
01:54:19.400 | for our customers.
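For a concrete sense of the product shape being described, here is a minimal sketch of calling a model through Bedrock's runtime API with boto3. The region, model ID, and request body schema are assumptions for illustration; the isolation property being discussed (customer prompts not flowing back into the shared model) is a service-side guarantee, not something this calling code enforces:

```python
import json
import boto3

# Bedrock runtime client; assumes AWS credentials are configured and the
# service is available in the chosen region (region is an assumption here).
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical request against a Titan text model.
body = json.dumps({
    "inputText": "Summarize our internal Q3 returns data in two sentences.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.2},
})

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed model ID
    body=body,
)
print(json.loads(response["body"].read()))
```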
01:54:20.560 | - Do you think there will be a day
01:54:23.320 | when humans and robots, maybe Alexa,
01:54:25.480 | have a romantic relationship?
01:54:27.360 | That could be "Her."
01:54:29.120 | - Well, I mean, I think if you look at the--
01:54:31.280 | - I'm just brainstorming products here.
01:54:32.200 | - If you look at the spectrum of human variety
01:54:35.520 | and what people like, you know, sexual variety.
01:54:38.240 | - Yes.
01:54:39.080 | - You know, there are people who like everything.
01:54:40.800 | So the answer to your question has to be yes.
01:54:43.080 | - Okay.
01:54:43.920 | - I don't know how-- - I guess I'm asking when.
01:54:44.760 | - I don't know how widespread that will be.
01:54:46.800 | - All right.
01:54:47.960 | - But it will happen.
01:54:49.200 | - I was just asking when for a friend, but it's all right.
01:54:51.480 | (Jeff laughing)
01:54:52.720 | I'll just, moving on.
01:54:54.020 | Next question.
01:54:56.360 | What's a perfectly productive day
01:54:59.520 | in the life of Jeff Bezos?
01:55:01.440 | You're one of the most productive humans in the world.
01:55:03.880 | - Well, first of all, I get up in the morning
01:55:06.480 | and I putter.
01:55:07.520 | I like have a coffee.
01:55:09.800 | - Can you define putter?
01:55:10.960 | - Just like I slowly move around.
01:55:14.480 | I'm not as productive as you might think I am.
01:55:16.880 | I mean, 'cause I do believe in wandering
01:55:19.480 | and I sort of, you know, I read my phone for a while.
01:55:23.400 | I read newspapers for a while.
01:55:25.820 | I chat with Lauren and I drink my first coffee.
01:55:30.260 | So I kind of, I move pretty slowly
01:55:33.060 | in the first couple of hours.
01:55:34.000 | I get up early, just naturally.
01:55:36.960 | And then, you know, I exercise most days
01:55:41.960 | and most days it's not that hard for me.
01:55:44.240 | Some days it's really hard and I do it anyway.
01:55:46.480 | I don't want to, you know, and it's painful
01:55:48.920 | and I'm like, why am I here?
01:55:50.240 | And I don't want to do that.
01:55:51.960 | - I mean, why am I here at the gym?
01:55:53.280 | - Why am I here at the gym?
01:55:54.440 | Why don't I do something else?
01:55:55.780 | You know, it's not always easy.
01:55:59.680 | - What's your source of motivation in those moments?
01:56:02.360 | - I know that I'll feel better later if I do it.
01:56:05.840 | And so that's the real source of motivation.
01:56:08.900 | I can tell the days when I skip it.
01:56:11.320 | I'm not quite as alert.
01:56:13.000 | I don't feel as good.
01:56:14.840 | And then there's harder motivations.
01:56:17.240 | It's longer term.
01:56:18.160 | You want to be healthy as you age.
01:56:20.040 | You know, you want healthspan.
01:56:21.320 | You want, ideally, you know, you want to be healthy
01:56:24.680 | and moving around when you're 80 years old.
01:56:26.560 | You know, and so there's a lot of,
01:56:29.740 | but that kind of motivation is so far in the future,
01:56:33.120 | it can be very hard to act on in the moment.
01:56:35.760 | So I think about the fact that
01:56:37.080 | I'll feel better in about four hours if I do it now.
01:56:39.880 | I'll have more energy for the rest of my day
01:56:41.600 | and so on and so on.
01:56:42.720 | - What's your exercise routine, just to linger on that?
01:56:45.120 | What do you do, how much do you curl?
01:56:46.480 | I mean, what are we talking about here?
01:56:47.920 | (laughing)
01:56:49.800 | That's all I do at the gym, so I just.
01:56:51.760 | - My routine, you know, on a good day,
01:56:55.720 | I do about half an hour of cardio
01:56:58.520 | and I do about 45 minutes of weightlifting,
01:57:01.420 | resistance training of some kind, mostly weights.
01:57:04.520 | I have a trainer who, you know, I love,
01:57:08.160 | who pushes me, which is really helpful.
01:57:11.360 | You know, I'll be like, he'll say,
01:57:14.140 | Jeff, do you think you could,
01:57:17.000 | can we go up on that weight a little bit
01:57:18.720 | and I'll think about it and I'll be like,
01:57:21.280 | no, I don't think so.
01:57:23.520 | And he'll be, he'll look at me and say,
01:57:25.840 | yeah, I think you can.
01:57:27.280 | (laughing)
01:57:30.200 | And of course he's right.
01:57:31.400 | - Yeah, of course.
01:57:32.240 | - So it's helpful to have somebody push you a little bit.
01:57:34.840 | - But almost every day you do that.
01:57:36.800 | - I do, almost every day I do a little bit of cardio
01:57:40.600 | and a little bit of weightlifting and I'd rotate,
01:57:44.000 | I do a pulling day and a pushing day and a leg day.
01:57:46.400 | It's all pretty standard stuff.
01:57:48.180 | - So puttering, coffee, gym.
01:57:49.880 | - Puttering, coffee, gym and then work.
01:57:52.160 | - Work.
01:57:53.000 | What's work look like?
01:57:53.960 | What do the productive hours look like for you?
01:57:58.300 | - So, you know, a couple of years ago,
01:58:00.800 | I left as the CEO of Amazon
01:58:03.200 | and I have never worked harder in my life.
01:58:05.520 | (laughing)
01:58:08.480 | I am working so hard and I'm mostly enjoying it,
01:58:11.160 | but there are also some very painful days.
01:58:14.620 | Most of my time is spent on Blue Origin
01:58:18.800 | and I've been so deeply involved here now
01:58:21.380 | for the last couple of years, and in the big, I love it.
01:58:24.120 | In the small, there's all the frustrations
01:58:26.240 | that come along with everything.
01:58:27.600 | You know, we're trying to get to rate manufacturing
01:58:29.680 | as we talked about, that's super important.
01:58:31.800 | We'll get there.
01:58:32.680 | We just hired a new CEO, a guy I've known
01:58:34.440 | for close to 15 years now, a guy named Dave Limp
01:58:37.640 | who I love, he's amazing, you know.
01:58:40.680 | So we're super lucky to have Dave
01:58:43.720 | and, you know,
01:58:44.960 | you're gonna see us move faster there.
01:58:46.400 | So my day of work is, you know,
01:58:49.720 | reading documents, having meetings,
01:58:52.960 | sometimes in person, sometimes over Zoom,
01:58:55.240 | depends on where I am.
01:58:56.800 | It's all about, you know, the technology.
01:58:59.500 | It's about the organization.
01:59:00.980 | And, you know,
01:59:04.660 | I have architecture and technology meetings
01:59:08.680 | almost every day on various subsystems
01:59:12.000 | inside the vehicle, inside the engines.
01:59:14.800 | It's super fun for me.
01:59:16.200 | My favorite part of it is the technology.
01:59:18.600 | My least favorite part of it is, you know,
01:59:22.160 | building organizations and so on.
01:59:24.960 | That's important, but it's also my least favorite part.
01:59:27.600 | So, you know, that's why they call it work.
01:59:29.400 | You don't always get to do what you wanna do.
01:59:31.580 | - How do you achieve time where you can focus
01:59:34.100 | and truly think through problems?
01:59:36.160 | - I do little thinking retreats.
01:59:42.480 | So that's not the only way; I can do that all day long.
01:59:42.480 | I'm very good at focusing.
01:59:44.040 | I'm very good at, you know,
01:59:47.600 | I don't keep to a strict schedule.
01:59:49.960 | Like my meetings often go longer than I plan for them to
01:59:53.360 | because I believe in wandering.
01:59:55.960 | My perfect meeting starts with a crisp document.
01:59:59.360 | So the document should be written with such clarity
02:00:01.940 | that it's like angels singing from on high.
02:00:04.920 | I like a crisp document and a messy meeting.
02:00:08.140 | And so the meeting is about like asking questions
02:00:12.520 | that nobody knows the answer to
02:00:14.280 | and trying to like wander your way to a solution.
02:00:19.280 | And that is, when that happens just right,
02:00:26.960 | it makes all the other meetings worthwhile.
02:00:29.080 | It feels good.
02:00:29.920 | It has a kind of beauty to it.
02:00:31.760 | It has an aesthetic beauty to it.
02:00:34.140 | And you get real breakthroughs in meetings like that.
02:00:37.200 | - Can you actually describe the crisp document?
02:00:39.360 | Like this is one of the legendary aspects of Amazon,
02:00:42.700 | of the way you approach meetings.
02:00:44.840 | Is this the six page memo?
02:00:47.640 | Maybe first describe the process
02:00:49.480 | of running a meeting with memos.
02:00:51.440 | - Meetings at Amazon and Blue Origin are unusual.
02:00:55.420 | When new people come in,
02:00:58.880 | like a new executive joins,
02:01:00.720 | they're a little taken aback sometimes
02:01:02.720 | because a typical meeting will start
02:01:05.480 | with a six page narratively structured memo.
02:01:08.920 | And we do study hall.
02:01:11.600 | For 30 minutes, we sit there silently together
02:01:14.800 | in the meeting and read, take notes in the margins.
02:01:19.360 | And then we discuss.
02:01:22.280 | And the reason, by the way, we do study hall:
02:01:24.760 | you could say, I would like everybody
02:01:26.520 | to read these memos in advance.
02:01:28.920 | But the problem is people don't have time to do that.
02:01:32.400 | And they end up coming to the meeting
02:01:33.760 | having only skimmed the memo,
02:01:35.600 | or maybe not read it at all,
02:01:37.000 | and they're trying to catch up.
02:01:38.600 | And they're also bluffing like they were in college
02:01:40.520 | having pretended to do the reading.
02:01:42.120 | - Yeah, exactly.
02:01:43.900 | - It's better just to carve out the time for people.
02:01:46.680 | So now we're all on the same page.
02:01:48.280 | We've all read the memo.
02:01:50.140 | And now we can have a really elevated discussion.
02:01:53.080 | And this is so much better
02:01:54.460 | from having a slideshow presentation,
02:01:56.680 | you know, a PowerPoint presentation of some kind,
02:01:58.760 | where that has so many difficulties.
02:02:01.680 | But one of the problems is PowerPoint
02:02:03.920 | is really designed to persuade.
02:02:05.800 | It's kind of a sales tool.
02:02:07.720 | And internally, the last thing you wanna do is sell.
02:02:11.360 | You wanna, again, you're truth-seeking.
02:02:13.360 | You're trying to find truth.
02:02:15.180 | And the other problem with PowerPoint
02:02:17.200 | is it's easy for the author and hard for the audience.
02:02:21.840 | And a memo is the opposite.
02:02:24.020 | It's hard to write a six-page memo.
02:02:25.720 | A good six-page memo might take two weeks to write.
02:02:28.640 | You have to write it.
02:02:29.480 | You have to rewrite it.
02:02:30.320 | You have to edit it.
02:02:31.160 | You have to talk to people about it.
02:02:32.560 | They have to poke holes in it for you.
02:02:34.880 | You write it again.
02:02:36.280 | It might take two weeks.
02:02:37.560 | So for the author, it's really a very difficult job.
02:02:41.360 | But for the audience, it's much better.
02:02:45.120 | You can read it in half an hour.
02:02:46.480 | And, you know, there are little problems
02:02:48.400 | with PowerPoint presentations too.
02:02:50.040 | You know, senior executives interrupt with questions
02:02:52.400 | halfway through the presentation.
02:02:54.280 | That question's gonna be answered on the next slide,
02:02:56.320 | but you never got there.
02:02:57.840 | Whereas if you read the whole memo in advance,
02:02:59.760 | you know, I often write lots of questions
02:03:02.320 | that I have in the margins of these memos.
02:03:05.240 | And then I go cross them all out
02:03:06.720 | because by the time I get to the end of the memo,
02:03:08.640 | they've been answered.
02:03:09.680 | That way I save all that time.
02:03:11.840 | You also get, you know,
02:03:13.480 | if the person who's preparing the memo,
02:03:15.400 | we talked earlier about, you know, groupthink
02:03:20.240 | and, you know, the fact that I go last in meetings
02:03:22.760 | and that you don't want, you know,
02:03:24.400 | your ideas to kind of pollute the meeting prematurely.
02:03:27.840 | You know, the author of the memos
02:03:32.400 | has kind of got to be very vulnerable.
02:03:35.000 | They've got to put all their thoughts out there
02:03:37.560 | and they've got to go first.
02:03:39.280 | But that's great 'cause it makes them really good.
02:03:42.880 | And you get to see their real ideas
02:03:45.320 | and you're not trampling on them accidentally
02:03:47.440 | in a big, you know, PowerPoint presentation.
02:03:49.840 | - What's that feel like when you've authored a thing
02:03:52.000 | and then you're sitting there
02:03:53.200 | and everybody's reading your thing?
02:03:55.120 | You're like--
02:03:55.960 | - I think it's mostly terrifying.
02:03:57.640 | - Yeah.
02:03:58.900 | (laughing)
02:04:00.280 | Like maybe in a good way?
02:04:02.160 | - I think it's--
02:04:03.000 | - Like a purifying--
02:04:03.840 | - I think it's terrifying in a productive way.
02:04:07.200 | - Yeah.
02:04:08.800 | - But I think it's emotionally
02:04:10.400 | a very nerve-wracking experience.
02:04:13.120 | - Is there an art or science to the writing
02:04:16.280 | of the six-page memo?
02:04:18.000 | Or just writing in general to you?
02:04:20.240 | - I mean, it's really got to be a real memo.
02:04:23.360 | So it means, you know,
02:04:25.680 | paragraphs with topic sentences,
02:04:27.840 | verbs and nouns.
02:04:30.160 | That's the other problem with PowerPoint:
02:04:31.520 | they're often just bullet points.
02:04:33.680 | And you can hide a lot of sloppy thinking
02:04:36.800 | behind bullet points.
02:04:37.840 | When you have to write in complete sentences
02:04:39.820 | with narrative structure,
02:04:41.200 | it's really hard to hide sloppy thinking.
02:04:43.580 | So it forces the author to be at their best.
02:04:48.320 | And so you're getting
02:04:50.160 | somebody's very best thinking.
02:04:53.260 | And then you don't have to spend a lot of time
02:04:55.380 | trying to tease that thinking out of the person.
02:04:58.720 | And you've got it from the very beginning.
02:05:00.240 | And so it really saves you time in the long run.
02:05:03.280 | - So that part is crisp,
02:05:04.640 | and then the rest is messy.
02:05:06.120 | Crisp document--
02:05:06.960 | - Yes, and you don't want to pretend
02:05:09.040 | that the discussion should be crisp.
02:05:11.120 | - Yeah.
02:05:11.960 | - You know, in most meetings,
02:05:13.040 | you're trying to solve a really hard problem.
02:05:15.960 | There's a different kind of meeting,
02:05:16.980 | which we call weekly business reviews or business reviews.
02:05:19.800 | They may be weekly or monthly or daily, whatever they are.
02:05:22.420 | But these business review meetings,
02:05:24.500 | that's usually for incremental improvement.
02:05:27.100 | And you're looking at a series of metrics
02:05:29.180 | every time it's the same metrics.
02:05:30.760 | Those meetings can be very efficient.
02:05:32.560 | They can start on time and end on time.
02:05:35.040 | - So we're about to run out of time,
02:05:37.200 | which is a good time to ask about the 10,000 year clock.
02:05:41.800 | (laughing)
02:05:44.060 | - It's funny.
02:05:44.900 | That's what I'm known for, is the humor.
02:05:47.840 | Okay.
02:05:49.400 | Can you explain what the 10,000 year clock is?
02:05:51.920 | - The 10,000 year clock is a physical clock of monumental scale.
02:05:56.160 | It's about 500 feet tall.
02:05:57.800 | It's inside a mountain in West Texas,
02:06:00.280 | in a chamber that's about 12 feet in diameter
02:06:03.040 | and 500 feet tall.
02:06:04.840 | The 10,000 year clock is an idea conceived
02:06:07.760 | by a brilliant guy named Danny Hillis,
02:06:10.160 | way back in the '80s.
02:06:13.420 | The idea is to build a clock as a symbol
02:06:15.660 | for long-term thinking.
02:06:17.660 | And you can kind of just very conceptually
02:06:20.180 | think of the 10,000 year clock like this: it ticks once a year.
02:06:25.180 | It chimes once every 100 years
02:06:30.300 | and the cuckoo comes out once every 1,000 years.
02:06:33.100 | So it just sort of slows everything down.
02:06:36.420 | And it's a completely mechanical clock.
02:06:40.440 | It is designed to last 10,000 years
02:06:43.260 | with no human intervention.
02:06:45.220 | So the material choices and everything else.
02:06:47.940 | It's in a remote location, both to protect it,
02:06:51.500 | but also so that visitors have to kind of make a pilgrimage.
02:06:56.500 | The idea is that over time,
02:07:00.380 | and this will take hundreds of years,
02:07:02.660 | but over time it will take on the patina of age
02:07:06.600 | and then it will become a symbol for long-term thinking
02:07:10.060 | that will actually hopefully get humans
02:07:13.720 | to extend their thinking horizons.
02:07:18.360 | And in my view, that's really important
02:07:21.960 | as we have become, as a species,
02:07:24.840 | as a civilization, more powerful.
02:07:26.640 | You know, we're really affecting the planet now.
02:07:28.600 | We're really affecting each other.
02:07:30.440 | We have weapons of mass destruction.
02:07:32.840 | We have all kinds of things
02:07:35.540 | where we can really hurt ourselves.
02:07:37.740 | And the problems we create can be so large.
02:07:40.800 | You know, the unintended consequences of some of our actions
02:07:44.680 | like climate change,
02:07:45.520 | putting carbon in the atmosphere is a perfect example.
02:07:47.400 | That's an unintended consequence
02:07:48.780 | of the Industrial Revolution.
02:07:51.000 | We got a lot of benefits from it,
02:07:52.120 | but we've also got this side effect
02:07:54.560 | that is very detrimental.
02:07:56.600 | We need to start training ourselves
02:07:59.160 | to think longer term.
02:08:00.240 | Long-term thinking is a giant lever.
02:08:02.160 | You can literally solve problems if you think long-term
02:08:06.060 | that are impossible to solve if you think short-term.
02:08:09.040 | And we aren't really good at thinking long-term.
02:08:11.520 | As you know,
02:08:14.800 | five years is a tough timeframe
02:08:18.080 | for most institutions to think past.
02:08:21.260 | And we probably need to stretch that to 10 years
02:08:24.960 | and 15 years and 20 years and 25 years.
02:08:27.360 | And we'd do a better job for our children
02:08:29.480 | or our grandchildren
02:08:31.000 | if we could stretch those thinking horizons.
02:08:33.200 | And so the clock is, in a way, it's an art project.
02:08:37.600 | It's a symbol.
02:08:40.220 | And if it ever has any power to influence people
02:08:44.980 | to think longer term,
02:08:46.020 | that won't happen for hundreds of years,
02:08:47.540 | but, you know, we're gonna build it now
02:08:49.320 | and let it accrue the patina of age.
02:08:52.140 | - Do you think humans will be here
02:08:53.380 | when the clock runs out here on Earth?
02:08:56.460 | - I think so.
02:08:57.360 | But, you know, the United States won't exist.
02:09:01.840 | Like, whole civilizations rise and fall.
02:09:03.760 | 10,000 years is so long.
02:09:05.740 | Like, no nation state has ever survived
02:09:10.020 | for anywhere close to 10,000 years.
02:09:12.700 | - And the increasing rate of progress makes that even--
02:09:15.060 | - Even less likely.
02:09:15.940 | So do I think humans will be here?
02:09:18.100 | What, you know, how will we have changed ourselves
02:09:20.820 | and what will we be and so on and so on?
02:09:22.900 | I don't know, but I think we'll be here.
02:09:25.620 | - On that grand scale, a human life feels tiny.
02:09:28.460 | Do you ponder your own mortality?
02:09:31.060 | Are you afraid of death?
02:09:32.500 | - No. You know, I used to be afraid of death.
02:09:36.820 | I did, I remember as a young person
02:09:40.480 | being kind of, like, very scared of mortality.
02:09:44.880 | Like, didn't want to think about it and so on.
02:09:47.020 | And as I've gotten older,
02:09:49.920 | I'm 59 now, as I've gotten older,
02:09:52.940 | somehow that fear has sort of gone away.
02:09:56.180 | You know, I would like to stay alive
02:10:00.740 | for as long as possible, but
02:10:02.940 | I'm really more focused on healthspan.
02:10:05.340 | I want to be healthy.
02:10:06.760 | I want that square wave.
02:10:08.100 | I want to, you know, I want to be healthy, healthy, healthy
02:10:11.100 | and then gone.
02:10:12.140 | I don't want the long decay.
02:10:13.720 | But I'm curious, I want to see how things turn out.
02:10:18.860 | You know, I'd like to be here.
02:10:19.940 | I love my family and my close friends
02:10:22.820 | and I want to, I'm curious about them and I want to see.
02:10:24.960 | So I have a lot of reasons to stay around,
02:10:28.220 | but mortality doesn't have that effect on me
02:10:33.220 | that it did, you know, maybe when I was in my 20s.
02:10:37.940 | - Well, Jeff, thank you for creating Amazon,
02:10:40.940 | one of the most incredible companies in history
02:10:43.100 | and thank you for trying your best to make humans
02:10:46.140 | a multi-planetary species, expanding out
02:10:48.820 | into our solar system, maybe beyond
02:10:50.820 | to meet the aliens out there.
02:10:53.740 | And thank you for talking today.
02:10:55.780 | - Well, Lex, thank you for doing your part
02:10:58.020 | to lengthen our attention spans.
02:11:00.740 | Appreciate that very much.
02:11:02.500 | - Thanks for listening to this conversation
02:11:05.700 | with Jeff Bezos.
02:11:06.940 | To support this podcast,
02:11:08.140 | please check out our sponsors in the description.
02:11:10.900 | And now let me leave you with some words
02:11:12.740 | from Jeff Bezos himself.
02:11:15.020 | Be stubborn on vision, but flexible on the details.
02:11:19.100 | Thank you for listening and hope to see you next time.
02:11:23.080 | (upbeat music)
02:11:25.660 | (upbeat music)