
Keoki Jackson: Lockheed Martin | Lex Fridman Podcast #33


Chapters

0:00 Introduction
1:55 Skunkworks
2:31 Favorite Aircraft
3:38 Lockheed Martin
4:44 Do you Dream
5:59 Long Term Dream
8:41 Challenges
11:44 Orbiting an asteroid
13:11 Opportunities for humans in space
15:00 What is teaming
16:54 How do you develop systems
18:35 Verification and validation
21:58 Software and safety
25:20 Lessons from 737 Max
27:20 Categories of Lockheed Martin systems
30:10 Integrated air and missile defense
34:59 Hypersonics
38:15 Secret Engineering
42:41 Autonomous Aircraft
44:27 Humans in the Cockpit
48:22 Optimal Piloting
51:31 Policy Considerations


00:00:00.000 | The following is a conversation with Keoki Jackson.
00:00:03.400 | He's the CTO of Lockheed Martin,
00:00:06.700 | a company that through its long history
00:00:08.740 | has created some of the most incredible engineering marvels
00:00:11.580 | human beings have ever built,
00:00:13.960 | including planes that fly fast and undetected,
00:00:17.040 | defense systems that intercept nuclear threats
00:00:19.840 | that can take the lives of millions,
00:00:22.460 | and systems that venture out into space,
00:00:25.280 | the moon, Mars, and beyond.
00:00:28.340 | And these days, more and more,
00:00:30.580 | artificial intelligence has an assistive role
00:00:32.840 | to play in these systems.
00:00:34.800 | I've read several books in preparation
00:00:36.660 | for this conversation.
00:00:38.300 | It is a difficult one,
00:00:40.040 | because in part, Lockheed Martin builds military systems
00:00:43.440 | that operate in a complicated world
00:00:45.240 | that often does not have easy solutions
00:00:48.120 | in the gray area between good and evil.
00:00:51.480 | I hope one day this world will rid itself of war
00:00:56.400 | in all its forms.
00:00:58.620 | But the path to achieving that
00:00:59.940 | in a world that does have evil is not obvious.
00:01:02.940 | What is obvious is that good engineering
00:01:05.100 | and artificial intelligence research
00:01:07.180 | have a role to play on the side of good.
00:01:11.240 | Lockheed Martin and the rest of our community
00:01:14.060 | are hard at work at exactly this task.
00:01:17.100 | We talk about these and other important topics
00:01:19.740 | in this conversation.
00:01:21.380 | Also, most certainly,
00:01:23.480 | both Keoki and I have a passion for space:
00:01:27.100 | us humans venturing out toward the stars.
00:01:31.440 | We talk about this exciting future as well.
00:01:35.440 | This is the Artificial Intelligence Podcast.
00:01:38.080 | If you enjoy it, subscribe on YouTube,
00:01:40.540 | give it five stars on iTunes, support it on Patreon,
00:01:43.920 | or simply connect with me on Twitter @lexfridman,
00:01:47.540 | spelled F-R-I-D-M-A-N.
00:01:50.700 | And now, here's my conversation with Keoki Jackson.
00:01:55.580 | I read several books on Lockheed Martin recently.
00:01:57.940 | My favorite in particular is by Ben Rich,
00:02:00.580 | "Skunk Works," his personal memoir.
00:02:03.420 | It gets a little edgy at times.
00:02:05.100 | But from that, I was reminded that
00:02:08.700 | the engineers at Lockheed Martin
00:02:10.300 | have created some of the most
00:02:12.000 | incredible engineering marvels human beings have ever built
00:02:15.180 | throughout the 20th century and the 21st.
00:02:18.700 | Do you remember a particular project or system at Lockheed
00:02:25.480 | or before that, at the Space Shuttle Columbia,
00:02:25.480 | that you were just in awe at the fact that us humans
00:02:29.500 | could create something like this?
00:02:31.140 | - You know, that's a great question.
00:02:34.220 | There's a lot of things that I could draw on there.
00:02:37.460 | When you look at the Skunk Works
00:02:38.940 | and Ben Rich's book in particular,
00:02:40.620 | of course it starts off with basically
00:02:42.580 | the start of the jet age and the P-80.
00:02:44.920 | I had the opportunity to sit next to one of the
00:02:50.100 | Apollo astronauts, Charlie Duke, recently at dinner.
00:02:53.100 | And I said, "Hey, what's your favorite aircraft?"
00:02:56.080 | And he said, "Well, it was by far the F-104 Starfighter,"
00:02:59.320 | which was another aircraft that came out of Lockheed there.
00:03:02.760 | - What kind of?
00:03:03.600 | - It was the first Mach 2 jet fighter aircraft.
00:03:08.240 | They called it the missile with a man in it.
00:03:11.280 | And so those are the kinds of things
00:03:12.440 | I grew up hearing stories about.
00:03:14.320 | You know, of course, the SR-71 is incomparable
00:03:19.120 | as kind of the epitome of speed, altitude,
00:03:24.120 | and just the coolest looking aircraft ever.
00:03:26.940 | - That's a plane that--
00:03:29.200 | - That's a, yeah, intelligence, surveillance,
00:03:30.920 | and reconnaissance aircraft that was designed
00:03:33.340 | to be able to outrun, basically go faster
00:03:36.160 | than any air defense system.
00:03:38.600 | But I'll tell you, I'm a space junkie.
00:03:42.880 | That's why I came to MIT.
00:03:44.800 | That's really what took me ultimately to Lockheed Martin.
00:03:49.080 | And I grew up, and so Lockheed Martin, for example,
00:03:51.320 | has been essentially at the heart of every planetary mission
00:03:56.320 | like all the Mars missions we've had a part in.
00:03:59.560 | And we've talked a lot about the 50th anniversary
00:04:02.080 | of Apollo here in the last couple of weeks, right?
00:04:04.960 | But remember 1976, July 20th, again, National Space Day.
00:04:09.960 | So the landing of the Viking,
00:04:13.160 | the Viking lander on the surface of Mars,
00:04:16.240 | just a huge accomplishment.
00:04:18.000 | And when I was a young engineer at Lockheed Martin,
00:04:21.000 | I got to meet engineers who had designed
00:04:24.080 | various pieces of that mission as well.
00:04:26.800 | So that's what I grew up on is these planetary missions,
00:04:29.720 | the start of the space shuttle era,
00:04:31.560 | and ultimately had the opportunity
00:04:34.760 | to see Lockheed Martin's part.
00:04:39.080 | And we can maybe talk about some of these here,
00:04:41.100 | but Lockheed Martin's part
00:04:42.120 | in all of these space journeys over the years.
00:04:44.680 | - Do you dream, and I apologize
00:04:46.500 | for getting philosophical at times or sentimental,
00:04:49.920 | I do romanticize the notion of space exploration.
00:04:53.080 | So do you dream of the day when us humans
00:04:55.600 | colonize another planet like Mars,
00:04:57.560 | or a man, a woman, a human being steps on Mars?
00:05:02.560 | - Absolutely, and that's a personal dream of mine.
00:05:06.560 | I haven't given up yet on my own opportunity
00:05:09.200 | to fly into space.
00:05:10.440 | But as you know, from the Lockheed Martin perspective,
00:05:14.440 | this is something that we're working towards every day.
00:05:16.900 | And of course, we're building the Orion spacecraft,
00:05:20.060 | which is the most sophisticated human-rated spacecraft
00:05:23.380 | ever built, and it's really designed
00:05:24.860 | for these deep space journeys,
00:05:27.020 | starting with the moon, but ultimately going to Mars,
00:05:29.820 | and being the platform from a design perspective,
00:05:34.820 | we call the Mars Base Camp,
00:05:36.380 | to be able to take humans to the surface,
00:05:38.900 | and then after a mission of a couple of weeks,
00:05:41.020 | bring them back up safely.
00:05:42.300 | And so that is something I wanna see happen
00:05:44.600 | during my time at Lockheed Martin.
00:05:46.600 | So I'm pretty excited about that.
00:05:49.400 | And I think once we prove that's possible,
00:05:52.540 | colonization might be a little bit further out,
00:05:57.200 | but it's something that I'd hope to see.
00:06:00.040 | - So maybe you can give a little bit of an overview.
00:06:03.820 | So Lockheed Martin partnered,
00:06:05.880 | a few years ago, with Boeing to work with the DoD
00:06:09.380 | and NASA to build launch systems and rockets
00:06:11.900 | as the ULA.
00:06:13.540 | What's beyond that?
00:06:15.460 | What's Lockheed's mission timeline,
00:06:17.380 | long-term dream in terms of space?
00:06:19.340 | You mentioned the moon.
00:06:22.100 | I've heard you talk about asteroids.
00:06:25.180 | And Mars, what's the timeline?
00:06:27.620 | What's the engineering challenges,
00:06:29.260 | and what's the dream long-term?
00:06:31.300 | - Yeah, I think the dream long-term
00:06:33.240 | is to have a permanent presence in space
00:06:36.180 | beyond low-Earth orbit,
00:06:37.820 | ultimately with a long-term presence on the moon,
00:06:41.060 | and then to the planets, to Mars.
00:06:43.200 | - Sorry to interrupt on that.
00:06:45.620 | So long-term presence means--
00:06:47.980 | - Sustained and sustainable presence in an economy,
00:06:51.060 | a space economy, that really goes alongside that.
00:06:54.380 | - With human beings and being able to launch, perhaps,
00:06:58.260 | from those, so like, hop?
00:07:01.020 | - There's a lot of energy that goes in those hops, right?
00:07:06.060 | So I think the first step is being able to get there
00:07:09.740 | and to be able to establish sustained bases, right,
00:07:12.620 | and build from there.
00:07:14.820 | And a lot of that means getting, as you know,
00:07:18.940 | things like the cost of launch down,
00:07:21.500 | and you mentioned United Launch Alliance,
00:07:23.580 | and so I don't wanna speak for ULA,
00:07:26.100 | but obviously they're working really hard
00:07:29.020 | to, on their next generation of launch vehicles,
00:07:34.020 | to maintain that incredible mission success record
00:07:39.260 | that ULA has, but ultimately continue to drive down the cost
00:07:42.660 | and make the flexibility, the speed,
00:07:44.380 | and the access ever greater.
00:07:46.900 | - So what's the missions that are on the horizon
00:07:50.380 | that you could talk to?
00:07:51.660 | Is there a hope to get to the moon?
00:07:53.380 | - Absolutely, absolutely.
00:07:54.620 | I mean, I think you know this, or you may know this,
00:07:57.900 | you know, there's a lot of ways to accomplish
00:07:59.780 | some of these goals, and so that's a lot
00:08:01.980 | of what's in discussion today.
00:08:03.780 | But ultimately, the goal is to be able to establish a base,
00:08:09.060 | essentially in cislunar space,
00:08:11.100 | that would allow for ready transfer from orbit
00:08:16.100 | to the lunar surface and back again.
00:08:19.900 | And so that's sort of that near-term,
00:08:21.900 | I say near-term, in the next decade or so vision.
00:08:24.900 | Starting off with a stated objective by this administration
00:08:29.860 | to get back to the moon in the 2024, 2025 timeframe,
00:08:34.060 | which is right around the corner here.
00:08:36.740 | - How big of an engineering challenge is that?
00:08:39.140 | - I think the big challenge is not so much to go,
00:08:44.580 | but to stay, right?
00:08:46.140 | And so we demonstrated in the '60s
00:08:48.940 | that you could send somebody up,
00:08:50.860 | do a couple of days of mission,
00:08:52.900 | and bring 'em home again successfully.
00:08:55.580 | Now we're talking about doing that,
00:08:57.260 | I'd say more to, I don't wanna say an industrial scale,
00:08:59.780 | but a sustained scale, right?
00:09:01.380 | So permanent habitation, you know,
00:09:05.420 | regular reuse of vehicles,
00:09:09.460 | the infrastructure to get things like fuel, air,
00:09:13.820 | consumables, replacement parts,
00:09:17.100 | all the things that you need
00:09:18.260 | to sustain that kind of infrastructure.
00:09:20.740 | So those are certainly engineering challenges.
00:09:23.620 | There are budgetary challenges,
00:09:26.100 | and those are all things
00:09:28.980 | that we're gonna have to work through.
00:09:30.380 | You know, the other thing, and I shouldn't,
00:09:32.900 | I don't wanna minimize this.
00:09:35.060 | I mean, I'm excited about human exploration,
00:09:38.220 | but the reality is our technology
00:09:40.780 | and where we've come over the last, you know,
00:09:43.060 | 40 years essentially,
00:09:44.980 | has changed what we can do with robotic exploration as well.
00:09:48.820 | And, you know, to me, it's incredibly thrilling.
00:09:52.060 | This seems like old news now,
00:09:53.700 | but the fact that we have rovers driving around
00:09:57.340 | the surface of Mars and sending back data
00:10:00.300 | is just incredible.
00:10:01.300 | The fact that we have satellites in orbit around Mars
00:10:04.240 | that are collecting weather, you know,
00:10:06.380 | they're looking at the terrain, they're mapping,
00:10:08.300 | all of these kinds of things on a continuous basis.
00:10:11.340 | That's incredible.
00:10:12.740 | And the fact that, you know, you got the time lag,
00:10:15.460 | of course, going to the planets,
00:10:17.900 | but you can effectively have virtual human presence there
00:10:22.020 | in a way that we have never been able to do before.
00:10:25.860 | And now with the advent of even greater processing power,
00:10:30.060 | better AI systems,
00:10:32.340 | better cognitive systems and decision systems,
00:10:35.760 | you know, you put that together with the human piece
00:10:38.800 | and we really opened up the solar system
00:10:41.520 | in a whole different way.
00:10:42.540 | And I'll give you an example.
00:10:43.720 | We've got OSIRIS-REx,
00:10:44.880 | which is a mission to the asteroid Bennu.
00:10:47.800 | So the spacecraft is out there right now
00:10:50.920 | on basically a year mapping activity
00:10:54.260 | to map the entire surface of that asteroid in great detail.
00:10:59.200 | You know, all autonomously piloted, right?
00:11:02.540 | But the idea then that, and this is not too far away,
00:11:04.820 | it's gonna go in,
00:11:05.980 | it's got a sort of fancy vacuum cleaner with a bucket.
00:11:09.660 | It's gonna collect the sample off the asteroid
00:11:12.540 | and then send it back here to earth.
00:11:14.420 | And so, you know, we have gone
00:11:17.180 | from sort of those tentative steps in the 70s,
00:11:20.260 | you know, early landings, video of the solar system,
00:11:23.940 | to now we've sent spacecraft to Pluto,
00:11:27.060 | we have gone to comets and intercepted comets,
00:11:31.640 | we've brought stardust material back.
00:11:36.440 | So that's, we've gone far
00:11:40.780 | and there's incredible opportunity to go even farther.
00:11:43.720 | - So it seems quite crazy that this is even possible,
00:11:47.420 | that, can you talk a little bit about
00:11:50.340 | what it means to orbit an asteroid
00:11:54.060 | and with a bucket to try to pick up some soil samples?
00:11:58.380 | - Yeah, so part of it is just kind of the,
00:12:02.220 | you know, these are the same kinds of techniques
00:12:04.860 | we use here on earth for high speed,
00:12:09.580 | high accuracy imagery, stitching these scenes together
00:12:13.260 | and creating essentially high accuracy world maps, right?
00:12:17.500 | And so that's what we're doing,
00:12:19.740 | obviously on a much smaller scale with an asteroid.
00:12:23.220 | But the other thing that's really interesting,
00:12:24.980 | you put together sort of that neat control
00:12:28.500 | and data and imagery problem,
00:12:32.580 | but the stories around how we designed the collection,
00:12:37.020 | I mean, as essentially, you know,
00:12:38.460 | this is the sort of the human ingenuity element, right?
00:12:41.420 | That essentially, you know,
00:12:43.740 | had an engineer who had a, you know,
00:12:45.820 | one day he's like, "Oh,"
00:12:47.340 | starts messing around with parts,
00:12:49.260 | vacuum cleaner, bucket, you know,
00:12:51.940 | "Maybe we could do something like this."
00:12:53.500 | And that was what led to what we call
00:12:55.180 | the pogo stick collection, right?
00:12:57.060 | Where basically a thing comes down,
00:12:59.260 | it's only there for seconds, does that collection,
00:13:02.900 | grabs the, essentially blows the regolith material
00:13:07.580 | into the collection hopper and off it goes.
00:13:10.260 | - It doesn't really land almost.
00:13:12.140 | - It's a very short landing.
00:13:13.580 | - Wow, that's incredible.
00:13:15.500 | So what is in those,
00:13:19.700 | talk a little bit more about space.
00:13:22.180 | What's the role of the human in all of this?
00:13:24.380 | What are the challenges?
00:13:25.820 | What are the opportunities for humans
00:13:29.060 | as they pilot these vehicles in space?
00:13:33.780 | And for humans that may step foot
00:13:36.980 | on either the moon or Mars?
00:13:41.260 | - Yeah, it's a great question
00:13:42.660 | because, you know, I just have been extolling
00:13:45.140 | the virtues of robotic and, you know,
00:13:48.780 | rovers, autonomous systems,
00:13:50.820 | and those absolutely have a role.
00:13:53.740 | I think the thing that we don't know how to replace today
00:13:57.260 | is the ability to adapt on the fly to new information.
00:14:02.260 | And I believe that will come, but we're not there yet.
00:14:07.620 | There's a ways to go.
00:14:08.820 | And so, you know, you think back to Apollo 13
00:14:13.580 | and the ingenuity of the folks on the ground
00:14:15.940 | and on the spacecraft essentially cobbled together
00:14:19.140 | a way to get the carbon dioxide scrubbers to work.
00:14:22.740 | Those are the kinds of things that ultimately, you know,
00:14:28.380 | and I'd say not just from dealing with anomalies,
00:14:31.340 | but, you know, dealing with new information.
00:14:33.700 | You see something and rather than waiting 20 minutes
00:14:38.380 | or half an hour, an hour to try to get information
00:14:41.340 | back and forth, but be able to essentially
00:14:44.060 | re-vector on the fly, collect, you know,
00:14:46.380 | different samples, take a different approach,
00:14:49.140 | choose different areas to explore.
00:14:52.740 | Those are the kinds of things that human presence enables
00:14:56.780 | that is still a ways ahead of us on the AI side.
00:15:00.300 | - Yeah, there's some interesting stuff we'll talk about
00:15:02.220 | on the teaming side here on Earth.
00:15:04.580 | That's pretty cool to explore.
00:15:06.460 | - And in space, let's not leave the space piece out.
00:15:08.820 | - So what is teaming, what is AI and humans
00:15:11.720 | working together in space look like?
00:15:13.940 | - Yeah, one of the things we're working on
00:15:15.460 | is a system called Maya, which is,
00:15:18.060 | you can think of it, so it's an AI assistant.
00:15:20.900 | In space, exactly.
00:15:24.220 | And you think of it as the Alexa in space, right?
00:15:28.540 | But this goes hand in hand with a lot of other developments.
00:15:31.740 | And so today's world, everything is essentially model-based.
00:15:35.160 | Model-based systems engineering to the actual
00:15:39.740 | digital tapestry that goes through the design,
00:15:42.580 | the build, the manufacture, the testing,
00:15:44.820 | and ultimately the sustainment of these systems.
00:15:47.660 | And so our vision is really that
00:15:51.020 | when our astronauts are there around Mars,
00:15:54.800 | you're gonna have that entire digital library
00:15:58.060 | of the spacecraft, of its operations,
00:16:04.420 | all the test data and flight data from previous missions
00:16:08.100 | to be able to look and see if there are anomalous conditions
00:16:11.840 | and tell the humans and potentially deal with that
00:16:16.060 | before it becomes a bad situation
00:16:20.100 | and help the astronauts work through those kinds of things.
00:16:23.220 | And it's not just dealing with problems as they come up,
00:16:26.860 | but also offering up opportunities
00:16:29.220 | for additional exploration capability, for example.
00:16:32.540 | So that's the vision is that these are,
00:16:35.900 | take the best of the human to respond
00:16:37.820 | to changing circumstances and rely on the best
00:16:42.780 | of AI capabilities to monitor these,
00:16:46.180 | this almost infinite number of data points
00:16:49.520 | and correlations of data points
00:16:51.620 | that humans frankly aren't that good at.
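As a rough sketch of what that kind of always-on telemetry monitoring could look like (the channel names, baseline statistics, and threshold here are hypothetical illustrations, not the actual Maya design):

```python
import numpy as np

def fit_baseline(historical: np.ndarray):
    """Per-channel mean/std learned from prior test and flight data (rows = samples)."""
    return historical.mean(axis=0), historical.std(axis=0)

def flag_anomalies(sample, mean, std, channels, z_limit=4.0):
    """Return the channels whose current reading falls far outside the baseline."""
    z = np.abs((sample - mean) / np.where(std > 0, std, 1e-9))
    return [name for name, score in zip(channels, z) if score > z_limit]

# Hypothetical spacecraft telemetry channels and baseline data.
channels = ["cabin_pressure_kpa", "bus_voltage_v", "tank_temp_k"]
history = np.random.default_rng(0).normal(
    [101.3, 28.0, 290.0], [0.5, 0.3, 2.0], size=(10_000, 3))
mean, std = fit_baseline(history)

print(flag_anomalies(np.array([101.2, 26.0, 291.0]), mean, std, channels))
# -> ['bus_voltage_v']: a reading several sigma below its historical envelope
```

The point is simply that a machine can watch every channel against its historical envelope simultaneously, which is exactly the correlation-at-scale task he says humans aren't good at.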
00:16:54.040 | - So how do you develop systems in space like this,
00:16:56.320 | whether it's Alexa in space or in general,
00:17:01.280 | any kind of control systems,
00:17:03.600 | any kind of intelligent systems
00:17:05.000 | when you can't really test stuff too much out in space?
00:17:08.720 | It's very expensive to test stuff.
00:17:10.880 | So how do you develop such systems?
00:17:14.280 | - Yeah, that's the beauty of this digital twin, if you will.
00:17:19.040 | And of course, with Lockheed Martin,
00:17:21.160 | we've over the past five plus decades
00:17:24.640 | been refining our knowledge of the space environment,
00:17:28.240 | of how materials behave, dynamics, the controls,
00:17:33.360 | the radiation environments, all of these kinds of things.
00:17:37.260 | So we're able to create very sophisticated models.
00:17:39.980 | They're not perfect, but they're very good.
00:17:43.500 | And so you can actually do a lot.
00:17:46.700 | I spent part of my career simulating
00:17:50.780 | communication spacecraft, missile warning spacecraft,
00:17:54.820 | GPS spacecraft in all kinds of scenarios
00:17:58.100 | and all kinds of environments.
00:17:59.340 | So this is really just taking that to the next level.
00:18:01.940 | The interesting thing is that now
00:18:03.860 | you're bringing into that loop
00:18:06.400 | a system depending on how it's developed
00:18:08.380 | that may be non-deterministic,
00:18:10.580 | it may be learning as it goes.
00:18:13.260 | In fact, we anticipate that it will be learning as it goes.
00:18:16.620 | And so that brings a whole new level of interest, I guess,
00:18:21.620 | into how do you do verification and validation
00:18:25.420 | of these non-deterministic learning systems
00:18:28.580 | in scenarios that may go out of the bounds
00:18:31.780 | of the envelope that you have initially designed them to.
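A minimal sketch of that kind of simulation-based verification, assuming a toy digital twin and a made-up safety invariant (nothing here reflects Lockheed Martin's actual models):

```python
import random

def simulate_mission(controller, env_seed: int) -> float:
    """Toy digital-twin rollout: returns the minimum altitude seen in one scenario."""
    rng = random.Random(env_seed)
    altitude = min_altitude = 1000.0
    for _ in range(500):                  # 500 time steps per scenario
        gust = rng.gauss(0.0, 5.0)        # randomized environment disturbance
        altitude += controller(altitude) + gust
        min_altitude = min(min_altitude, altitude)
    return min_altitude

def verify(controller, runs: int = 1_000, floor: float = 0.0) -> float:
    """Monte Carlo V&V: fraction of sampled scenarios where the invariant held."""
    held = sum(simulate_mission(controller, seed) > floor for seed in range(runs))
    return held / runs

# Hypothetical controller that nudges the vehicle back toward a 1000 m setpoint.
controller = lambda alt: 0.1 * (1000.0 - alt)
print(f"safety invariant held in {verify(controller):.1%} of sampled scenarios")
```

For a non-deterministic, learning system the controller's behavior can change between runs, which is why scenario coverage and the choice of invariants become the hard research questions he describes.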
00:18:35.100 | - So this system in its intelligence
00:18:37.460 | has the same complexity,
00:18:39.220 | some of the same complexities a human does.
00:18:41.060 | And it learns over time, it's unpredictable
00:18:43.660 | in certain kinds of ways.
00:18:44.900 | So you also have to model that when you're thinking about it.
00:18:50.100 | So in your thoughts, is it possible
00:18:53.460 | to model the majority of situations,
00:18:57.260 | the important aspects of situations here on Earth
00:18:59.620 | and in space enough to test stuff?
00:19:02.340 | - Yeah, this is really an active area of research.
00:19:05.620 | And we're actually funding university research
00:19:07.500 | in a variety of places, including MIT.
00:19:10.140 | This is in the realm of trust
00:19:12.460 | and verification and validation of,
00:19:15.780 | I'd say autonomous systems in general,
00:19:17.980 | and then as a subset of that, autonomous systems
00:19:20.980 | that incorporate artificial intelligence capabilities.
00:19:24.580 | And this is not an easy problem.
00:19:27.940 | We're working with startup companies,
00:19:29.560 | we've got internal R&D, but our conviction is
00:19:33.660 | that autonomy and more and more AI-enabled autonomy
00:19:38.660 | is gonna be in everything
00:19:40.820 | that Lockheed Martin develops and fields.
00:19:44.260 | And it's gonna be retrofit,
00:19:46.660 | autonomy and AI are gonna be retrofit into existing systems.
00:19:50.140 | They're gonna be part of the design
00:19:52.460 | for all of our future systems.
00:19:54.500 | And so maybe I should take a step back
00:19:56.540 | and say the way we define autonomy.
00:19:58.660 | So we talk about autonomy essentially,
00:20:01.080 | a system that composes, selects,
00:20:04.780 | and then executes decisions
00:20:08.460 | with varying levels of human intervention.
00:20:12.500 | And so you could think of no autonomy.
00:20:15.660 | So this is essentially the human doing the task.
00:20:18.460 | You can think of effectively partial autonomy
00:20:23.080 | where the human is in the loop.
00:20:25.820 | So making decisions in every case
00:20:29.140 | about what the autonomous system can do.
00:20:31.140 | - Either in the cockpit or remotely.
00:20:33.220 | - Or remotely, exactly, but still in that control loop.
00:20:36.020 | And then there's what you'd call supervisory autonomy.
00:20:39.860 | So the autonomous system is doing most of the work.
00:20:42.420 | The human can intervene to stop it
00:20:44.380 | or to change the direction.
00:20:45.820 | And then ultimately full autonomy
00:20:47.940 | where the human is off the loop altogether.
00:20:50.300 | And for different types of missions,
00:20:52.860 | you wanna have different levels of autonomy.
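That spectrum can be made concrete with a small sketch; the level names and gating logic below are my own illustrative labels, not a Lockheed Martin interface:

```python
from enum import Enum, auto

class AutonomyLevel(Enum):
    NONE = auto()           # human performs the task entirely
    HUMAN_IN_LOOP = auto()  # system proposes, human approves every decision
    SUPERVISORY = auto()    # system acts, human can intervene or redirect
    FULL = auto()           # human is off the loop altogether

def execute(decision: str, level: AutonomyLevel, approve, veto) -> str:
    """Gate execution of an already-composed-and-selected decision by level."""
    if level is AutonomyLevel.NONE:
        return "human performs the task manually"
    if level is AutonomyLevel.HUMAN_IN_LOOP and not approve(decision):
        return "rejected by operator"
    if level is AutonomyLevel.SUPERVISORY and veto(decision):
        return "halted by operator"
    return f"executing: {decision}"

print(execute("adjust course", AutonomyLevel.SUPERVISORY,
              approve=lambda d: True, veto=lambda d: False))
```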
00:20:55.820 | So now take that spectrum and this conviction
00:20:58.380 | that autonomy and more and more AI
00:21:01.220 | are in everything that we develop.
00:21:03.480 | The kinds of things that Lockheed Martin does,
00:21:07.980 | a lot of times are safety of life critical
00:21:10.820 | kinds of missions.
00:21:12.700 | Think about aircraft, for example.
00:21:14.640 | And so we require and our customers require
00:21:20.140 | an extremely high level of confidence.
00:21:23.260 | One, that we're gonna protect life.
00:21:26.420 | Two, that we're going to,
00:21:28.460 | that these systems will behave in ways
00:21:31.300 | that their operators can understand.
00:21:33.900 | And so this gets into that whole field.
00:21:36.420 | Again, being able to verify and validate
00:21:40.340 | that the systems have been,
00:21:44.060 | they will operate the way they're designed
00:21:46.300 | and the way they're expected.
00:21:48.100 | And furthermore, that they will do that
00:21:50.740 | in ways that can be explained and understood.
00:21:55.460 | And that is an extremely difficult challenge.
00:21:58.860 | - Yeah, so here's a difficult question.
00:22:00.820 | I don't mean to bring this up,
00:22:04.420 | but I think it's a good case study
00:22:05.620 | that people are familiar with.
00:22:07.900 | Boeing 737 MAX commercial airplane
00:22:11.140 | has had two recent crashes
00:22:13.440 | where their flight control software system failed.
00:22:15.980 | And it's software.
00:22:17.540 | So I don't mean to speak about Boeing,
00:22:19.060 | but broadly speaking,
00:22:20.540 | we have this in the autonomous vehicle space too,
00:22:22.860 | semi-autonomous.
00:22:24.060 | When you have millions of lines of code software
00:22:27.820 | making decisions,
00:22:29.320 | there is a little bit of a clash of cultures
00:22:32.900 | because software engineers
00:22:35.300 | don't have the same culture of safety often
00:22:38.300 | that people who build systems like at Lockheed Martin
00:22:43.100 | do where it has to be exceptionally safe.
00:22:46.460 | You have to test this on.
00:22:48.060 | So how do we get this right
00:22:49.860 | when software is making so many decisions?
00:22:53.180 | - Yeah, and there's a lot of things that have to happen.
00:22:57.100 | And by and large, I think it starts with the culture,
00:23:01.260 | which is not necessarily something
00:23:03.340 | that A, is taught in school,
00:23:05.940 | or B, is something that would come,
00:23:07.980 | depending on what kind of software you're developing,
00:23:10.820 | it may not be relevant
00:23:13.100 | if you're targeting ads or something like that.
00:23:15.740 | So, and by and large,
00:23:19.020 | I'd say not just Lockheed Martin,
00:23:20.620 | but certainly the aerospace industry as a whole
00:23:23.700 | has developed a culture that does focus on safety,
00:23:27.260 | safety of life, operational safety, mission success.
00:23:31.020 | But as you know,
00:23:33.300 | these systems have gotten incredibly complex.
00:23:36.100 | And so they're to the point where it's almost impossible.
00:23:40.660 | The state space has become so huge
00:23:42.580 | that it's impossible to,
00:23:44.820 | or very difficult to do a systematic verification
00:23:48.860 | across the entire set of potential ways
00:23:52.260 | that an aircraft could be flown,
00:23:53.740 | all the conditions that could happen,
00:23:55.540 | all the potential failure scenarios.
00:23:59.300 | Now, maybe that's soluble one day,
00:24:01.100 | maybe when we have our quantum computers at our fingertips,
00:24:04.380 | we'll be able to actually simulate across an entire,
00:24:08.820 | you know, almost infinite state space.
00:24:11.260 | But today, you know, there's a lot of work
00:24:16.260 | to really try to bound the system,
00:24:20.940 | to make sure that it behaves in predictable ways,
00:24:24.780 | and then have this culture of continuous inquiry
00:24:29.060 | and skepticism and questioning to say,
00:24:33.140 | did we really consider the right realm of possibilities?
00:24:37.300 | Have we done the right range of testing?
00:24:40.140 | Do we really understand, you know, in this case,
00:24:42.100 | you know, human and machine interactions,
00:24:44.580 | the human decision process alongside the machine processes?
00:24:49.460 | And so that's that culture,
00:24:51.420 | that we call it the culture of mission success
00:24:53.500 | at Lockheed Martin, that really needs to be established.
00:24:56.700 | And it's not something, you know,
00:24:57.900 | it's something that people learn by living in it.
00:25:02.140 | And it's something that has to be promulgated,
00:25:04.860 | you know, and it's done, you know,
00:25:06.060 | from the highest levels at a company
00:25:09.140 | like Lockheed Martin.
00:25:10.180 | - Yeah, and the same is being faced
00:25:12.500 | at certain autonomous vehicle companies
00:25:13.980 | where that culture is not there
00:25:15.780 | because it's started mostly by software engineers.
00:25:18.580 | So that's what they're struggling with.
00:25:20.500 | Are there lessons that you think we should learn
00:25:25.700 | as an industry and a society
00:25:27.300 | from the Boeing 737 MAX crashes?
00:25:30.220 | - These crashes obviously are tremendous tragedies,
00:25:34.740 | they're tragedies for all of the people,
00:25:37.820 | the crew, the families, the passengers,
00:25:41.260 | the people on the ground involved.
00:25:43.220 | And, you know, it's also a huge business
00:25:47.460 | and economic setback as well.
00:25:49.060 | I mean, you know, we've seen it's impacting
00:25:51.100 | essentially the trade balance of the US.
00:25:53.820 | So these are important questions.
00:25:58.380 | And these are the kinds of, you know,
00:26:00.180 | we've seen similar kinds of questioning at times,
00:26:03.020 | you know, you go back to the Challenger accident,
00:26:06.900 | and it is, I think, always important to remind ourselves
00:26:10.580 | that humans are fallible,
00:26:11.900 | that the systems we create,
00:26:13.980 | as perfect as we strive to make them,
00:26:16.540 | we can always make them better.
00:26:18.900 | And so another element of that culture of mission success
00:26:21.700 | is really that commitment to continuous improvement.
00:26:24.940 | If there's something that goes wrong,
00:26:27.460 | a real commitment to root cause
00:26:31.140 | and true root cause understanding
00:26:33.300 | to taking the corrective actions
00:26:35.100 | and to making the future systems better.
00:26:38.900 | And certainly we strive for, you know, no accidents.
00:26:43.900 | And if you look at the record
00:26:47.780 | of the commercial airline industry as a whole
00:26:50.500 | and the commercial aircraft industry as a whole,
00:26:53.020 | you know, there's a very nice decaying exponential
00:26:57.660 | to years now where we have no commercial aircraft accidents
00:27:01.700 | at all, right, or fatal accidents at all.
00:27:04.780 | So that didn't happen by accident.
00:27:08.380 | It was through the regulatory agencies, FAA,
00:27:11.660 | the airframe manufacturers,
00:27:14.380 | really working on a system to identify root causes
00:27:18.660 | and drive them out.
00:27:20.500 | - So maybe we can take a step back,
00:27:23.860 | and many people are familiar,
00:27:25.540 | but Lockheed Martin broadly,
00:27:28.880 | what kind of categories of systems
00:27:31.260 | are you involved in building?
00:27:34.300 | - You know, Lockheed Martin,
00:27:35.140 | we think of ourselves as a company
00:27:37.300 | that solves hard mission problems.
00:27:39.900 | And the output of that might be an airplane or a spacecraft
00:27:43.060 | or a helicopter or a radar or something like that.
00:27:45.700 | But ultimately we're driven by these,
00:27:47.940 | you know, like what is our customer?
00:27:50.260 | What is that mission that they need to achieve?
00:27:52.860 | And so that's what drove the SR-71, right?
00:27:55.540 | How do you get pictures of a place
00:27:57.860 | where you've got sophisticated air defense systems
00:28:02.220 | that are capable of handling any aircraft
00:28:05.460 | that was out there at the time, right?
00:28:07.480 | So that's what yielded an SR-71.
00:28:10.420 | - Let's build a nice flying camera.
00:28:12.500 | - Exactly, and make sure it gets out and it gets back.
00:28:16.100 | And that led ultimately to really the start
00:28:18.300 | of the space program in the US as well.
00:28:20.480 | So now take a step back to Lockheed Martin of today.
00:28:24.980 | And we are on the order of 105 years old now
00:28:29.080 | between Lockheed and Martin, the two big heritage companies.
00:28:32.460 | Of course, we're made up of a whole bunch of other companies
00:28:34.640 | that came in as well.
00:28:36.140 | General Dynamics, you know, kind of go down the list.
00:28:39.840 | Today, you can think of us
00:28:42.640 | in this space of solving mission problems.
00:28:44.880 | So obviously on the aircraft side,
00:28:48.480 | tactical aircraft, building the most advanced fighter
00:28:52.400 | aircraft that the world has ever seen.
00:28:54.880 | You know, we're up to now several hundred
00:28:56.720 | of those delivered, building almost 100 a year.
00:29:00.120 | And of course, working on the things that come after that.
00:29:04.120 | On the space side, we are engaged in pretty much every venue
00:29:09.120 | of space utilization and exploration you can imagine.
00:29:14.320 | So I mentioned things like navigation, timing, GPS,
00:29:18.100 | communication satellites, missile warning satellites.
00:29:22.460 | We've built commercial surveillance satellites.
00:29:24.820 | We've built commercial communication satellites.
00:29:27.700 | We do civil space.
00:29:29.260 | So everything from human exploration
00:29:32.360 | to the robotic exploration of the outer planets.
00:29:35.040 | And keep going on the space front.
00:29:39.140 | But you know, a couple of other areas I'd like to put out.
00:29:42.540 | We're heavily engaged in building
00:29:45.580 | critical defensive systems.
00:29:47.440 | And so a couple that I'll mention,
00:29:50.080 | the Aegis Combat System.
00:29:51.680 | This is basically the integrated air
00:29:53.640 | and missile defense system for the US and allied fleets.
00:29:58.640 | And so protects carrier strike groups, for example,
00:30:03.680 | from incoming ballistic missile threats,
00:30:06.600 | aircraft threats, cruise missile threats,
00:30:08.520 | and you know, kind of go down the list.
00:30:10.120 | - So the carriers, the fleet itself is the thing
00:30:14.020 | that is being protected.
00:30:15.360 | The carriers aren't serving as a protection
00:30:18.180 | for something else.
00:30:19.400 | - Well, that's a little bit of a different application.
00:30:21.880 | We've actually built a version called Aegis Ashore,
00:30:24.400 | which is now deployed in a couple of places
00:30:26.960 | around the world.
00:30:28.000 | So that same technology, I mean,
00:30:29.760 | basically can be used to protect either
00:30:34.280 | an ocean going fleet or a land-based activity.
00:30:37.880 | Another one, the THAAD program.
00:30:39.720 | So THAAD, this is the Terminal High Altitude Area Defense.
00:30:44.780 | This is to protect, you know,
00:30:48.320 | relatively broad areas against sophisticated
00:30:52.320 | ballistic missile threats.
00:30:54.120 | And so now, you know, it's deployed
00:30:58.600 | with a lot of US capabilities.
00:31:00.560 | Now we have international customers
00:31:02.680 | that are looking to buy that capability as well.
00:31:05.240 | And so these are systems that defend,
00:31:07.720 | not just defend militaries and military capabilities,
00:31:10.840 | but defend population areas.
00:31:13.020 | We saw, you know, maybe the first public use of these
00:31:17.800 | back in the first Gulf War with the Patriot systems.
00:31:21.620 | And these are the kinds of things
00:31:24.560 | that Lockheed Martin delivers.
00:31:27.440 | And there's a lot of stuff that goes with it.
00:31:29.520 | So think about the radar systems and the sensing systems
00:31:33.120 | that cue these, the command and control systems
00:31:36.760 | that decide how you pair a weapon
00:31:39.360 | against an incoming threat.
00:31:40.920 | And then all the human and machine interfaces
00:31:45.400 | to make sure that they can be operated successfully
00:31:48.040 | in very strenuous environments.
00:31:51.040 | - Yeah, there's some incredible engineering
00:31:54.640 | that at every front, like you said.
00:31:57.280 | So maybe if we just take a look at Lockheed history broadly,
00:32:02.280 | maybe even looking at Skunk Works,
00:32:05.640 | what are the biggest,
00:32:08.520 | most impressive milestones of innovation?
00:32:11.200 | So if you look at stealth, I would have called you crazy
00:32:14.880 | if you said that's possible at the time.
00:32:16.880 | And supersonic and hypersonic.
00:32:21.360 | So traveling at, first of all,
00:32:24.080 | traveling at the speed of sound is pretty damn fast.
00:32:27.640 | And the supersonic and hypersonic,
00:32:29.760 | three, four, five times the speed of sound.
00:32:32.240 | That seems, I would also call you crazy
00:32:34.440 | if you say you can do that.
00:32:35.840 | So can you tell me how it's possible
00:32:38.120 | to do these kinds of things?
00:32:39.640 | And is there other milestones
00:32:41.120 | and innovation that's going on
00:32:44.360 | that you can talk about?
00:32:45.200 | - Yeah, well, let me start on the Skunk Works saga.
00:32:49.040 | And you kind of alluded to it in the beginning.
00:32:51.760 | Skunk Works is as much an idea as a place.
00:32:54.960 | And so it's driven really by Kelly Johnson's 14 principles.
00:32:59.580 | And I'm not gonna list all 14 of them off,
00:33:02.040 | but the idea, and this I'm sure will resonate
00:33:04.560 | with any engineer who's worked
00:33:06.280 | on a highly motivated small team before.
00:33:09.480 | The idea that if you can essentially have a small team
00:33:13.440 | of very capable people who wanna work
00:33:17.320 | on really hard problems, you can do almost anything.
00:33:20.560 | Especially if you kind of shield them
00:33:23.320 | from bureaucratic influences,
00:33:26.680 | if you create very tight relationships with your customers
00:33:30.760 | so that you have that team
00:33:33.020 | and shared vision with the customer.
00:33:36.040 | Those are the kinds of things that enable the Skunk Works
00:33:40.400 | to do these incredible things.
00:33:43.040 | - And we listed off a number that you brought up stealth.
00:33:46.360 | And I mean, this whole,
00:33:48.600 | I wish I could have seen Ben Rich with a ball bearing,
00:33:51.720 | rolling it across the desk to a general officer and saying,
00:33:55.800 | "Would you like to have an aircraft
00:33:58.320 | "that has the radar cross section of this ball bearing?"
00:34:01.800 | Probably one of the least expensive
00:34:04.280 | and most effective marketing campaigns
00:34:06.280 | in the history of the industry.
00:34:08.400 | - So just for people that are not familiar,
00:34:10.660 | I mean, the way you detect aircraft,
00:34:12.760 | 'cause I mean, I'm sure there's a lot of ways,
00:34:14.680 | but radar, for the longest time,
00:34:17.360 | there's a big blob that appears in the radar.
00:34:20.680 | How do you make a plane disappear
00:34:22.360 | so it looks as big as a ball bearing?
00:34:26.200 | What's involved in technology-wise there?
00:34:28.040 | What's, broadly, sort of the stuff you can speak about?
00:34:32.480 | - I'll stick to what's in Ben Rich's book.
00:34:34.840 | But obviously the geometry of how radar gets reflected
00:34:39.040 | and the kinds of materials that either reflect
00:34:41.420 | or absorb are kind of the couple
00:34:44.360 | of the critical elements there.
00:34:46.520 | I mean, it's a cat and mouse game, right?
00:34:48.120 | I mean, radars get better, stealth capabilities get better,
00:34:53.000 | and so it's a really a game of continuous improvement
00:34:57.720 | and innovation there, and I'll leave it at that.
00:35:00.200 | - Yeah, so the idea that something is essentially invisible
00:35:04.800 | is quite fascinating.
00:35:06.480 | But the other one is flying fast.
00:35:09.000 | So the speed of sound is 750, 760 miles an hour.
00:35:13.300 | So supersonic is Mach 3, something like that.
00:35:19.240 | - Yeah, we talk about the supersonic, obviously,
00:35:21.640 | and we kind of talk about that as that realm
00:35:24.160 | from Mach 1 up through about Mach 5.
00:35:26.760 | And then hypersonic, so high supersonic speeds
00:35:31.760 | would be past Mach 5.
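For rough numbers on those regimes, using a sea-level speed of sound of about 761 mph (it decreases with altitude):

```python
SPEED_OF_SOUND_MPH = 761  # approximate, at sea level and 15 C; lower at altitude

for mach in (1, 2, 3, 5):
    print(f"Mach {mach} ~ {mach * SPEED_OF_SOUND_MPH:,} mph")
# Mach 1 ~ 761 mph ... Mach 5 ~ 3,805 mph, the hypersonic threshold
```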
00:35:34.800 | And you gotta remember, Lockheed Martin
00:35:37.160 | and actually other companies have been involved
00:35:39.120 | in hypersonic development since the late '60s.
00:35:42.320 | You know, you think of everything from the X-15
00:35:45.400 | to the Space Shuttle as examples of that.
00:35:48.080 | I think the difference now is if you look around the world,
00:35:54.360 | particularly the threat environment that we're in today,
00:35:57.380 | you're starting to see publicly folks like the Russians
00:36:02.560 | and the Chinese saying they have hypersonic weapons
00:36:07.560 | capability that could threaten US and Allied capabilities.
00:36:12.600 | And also, basically, you know, the claims are
00:36:17.240 | that these could get around defensive systems
00:36:19.920 | that are out there today.
00:36:21.880 | And so there's a real sense of urgency.
00:36:24.560 | You hear it from folks like the Undersecretary of Defense
00:36:28.200 | for Research and Engineering, Dr. Mike Griffin,
00:36:30.840 | and others in the Department of Defense
00:36:32.840 | that hypersonics is something that's really important
00:36:37.240 | to the nation in terms of both parity
00:36:41.080 | but also defensive capabilities.
00:36:43.160 | And so that's something that, you know, we're pleased.
00:36:46.240 | It's something that Lockheed Martin's, you know,
00:36:47.880 | had a heritage in.
00:36:49.320 | We've invested R&D dollars on our side for many years.
00:36:53.840 | And we have a number of things going on
00:36:56.240 | with various US government customers in that field today
00:36:59.800 | that we're very excited about.
00:37:01.560 | So I would anticipate we'll be hearing more about that
00:37:04.560 | in the future from our customers.
00:37:06.280 | - And I've actually haven't read much about this.
00:37:08.920 | Probably you can't talk about much of it at all,
00:37:10.880 | but on the defensive side,
00:37:12.800 | it's a fascinating problem of perception,
00:37:15.640 | of trying to detect things that are really hard to see.
00:37:18.400 | Can you comment on how hard that problem is
00:37:21.600 | and how hard is it to stay ahead,
00:37:26.600 | even if we're going back a few decades,
00:37:29.240 | stay ahead of the competition?
00:37:30.520 | - Well, maybe I'd, again, you gotta think of these
00:37:33.760 | as ongoing capability development.
00:37:36.520 | And so think back to the early days of missile defense.
00:37:40.760 | So this would be in the '80s, the SDI program.
00:37:44.160 | And in that timeframe, we proved,
00:37:46.520 | Lockheed Martin proved that you could hit a bullet
00:37:48.920 | with a bullet, essentially,
00:37:50.320 | and which is something that had never been done before,
00:37:53.240 | to take out an incoming ballistic missile.
00:37:56.080 | And so that's led to these incredible
00:37:58.720 | hit-to-kill kinds of capabilities, PAC-3.
00:38:01.840 | That's the Patriot Advanced Capability,
00:38:05.280 | Model 3 that Lockheed Martin builds,
00:38:08.120 | the THAAD system that I talked about.
00:38:10.720 | So now hypersonics,
00:38:13.880 | they're different from ballistic systems.
00:38:17.520 | And so we gotta take the next step in defensive capability.
00:38:21.100 | - I can, I'll leave that there, but I can only imagine.
00:38:26.480 | Now, let me just comment.
00:38:27.720 | So if it's an engineer, it's sad to know that
00:38:31.440 | so much that Lockheed has done in the past is classified,
00:38:36.440 | or today, and it's shrouded in secrecy.
00:38:40.920 | It has to be by the nature of the application.
00:38:44.680 | So like what I do, so what we do here at MIT,
00:38:49.200 | we'd like to inspire young engineers, young scientists.
00:38:53.920 | And yet, in the Lockheed case,
00:38:56.440 | some of that engineer has to stay quiet.
00:38:59.680 | How do you think about that?
00:39:00.880 | How does that make you feel?
00:39:02.080 | Is there a future where more can be shown?
00:39:07.080 | Or is it just the nature of this world
00:39:11.200 | that it has to remain secret?
00:39:13.480 | - It's a good question.
00:39:15.640 | I think the public can see enough of,
00:39:21.800 | and including students who may be in grade school,
00:39:25.560 | high school, college today,
00:39:27.720 | to understand the kinds of really hard problems
00:39:32.400 | that we work on.
00:39:34.080 | And I mean, look at the F-35, right?
00:39:36.800 | And obviously a lot of the detailed performance levels
00:39:41.280 | are sensitive and controlled.
00:39:43.820 | But we can talk about what an incredible aircraft this is.
00:39:48.720 | Supersonic, supercruise kind of a fighter,
00:39:51.160 | with stealth capabilities.
00:39:55.160 | It's a flying information system in the sky
00:39:58.560 | with data fusion, sensor fusion capabilities
00:40:02.120 | that have never been seen before.
00:40:03.800 | So these are the kinds of things that I believe,
00:40:06.200 | these are the kinds of things that got me excited
00:40:07.980 | when I was a student.
00:40:08.920 | I think these still inspire students today.
00:40:12.600 | And the other thing, I'd say,
00:40:14.040 | I mean, people are inspired by space.
00:40:17.000 | People are inspired by aircraft.
00:40:20.180 | Our employees are also inspired by that sense of mission.
00:40:25.320 | And I'll just give you an example.
00:40:27.520 | I had the privilege to work
00:40:30.400 | and lead our GPS programs for some time.
00:40:34.400 | And that was a case where I actually worked on a program
00:40:39.160 | that touches billions of people every day.
00:40:41.720 | And so when I said I worked on GPS,
00:40:43.480 | everybody knew what I was talking about,
00:40:45.240 | even though they didn't maybe appreciate
00:40:46.880 | the technical challenges that went into that.
00:40:50.240 | But I'll tell you, I got a briefing one time
00:40:54.980 | from a major in the Air Force.
00:40:57.440 | And he said, "I go by call sign GIMP.
00:41:01.600 | GPS is my passion."
00:41:04.320 | So I love GPS.
00:41:05.720 | And he was involved in the operational test of the system.
00:41:08.960 | He said, "I was out in Iraq and I was on a helicopter,
00:41:14.640 | Black Hawk helicopter, and I was bringing back a sergeant
00:41:19.640 | and a handful of troops from a deployed location."
00:41:23.800 | And he said, "My job is GPS."
00:41:26.600 | So I asked that sergeant, and he's beat down
00:41:29.200 | and kind of half asleep.
00:41:31.400 | And I said, "What do you think about GPS?"
00:41:34.100 | And he brightened up, his eyes lit up, and he said,
00:41:36.300 | "Well, GPS, that brings me and my troops home every day.
00:41:39.960 | I love GPS."
00:41:41.120 | And that's the kind of story where it's like,
00:41:42.960 | okay, I'm really making a difference here
00:41:45.600 | in the kind of work.
00:41:46.440 | So that mission piece is really important.
00:41:48.920 | Last thing I'll say is, and this gets to some
00:41:52.840 | of these questions around advanced technologies.
00:41:56.120 | It's not, you know, they're not just airplanes
00:41:58.720 | and spacecraft anymore.
00:41:59.960 | For people who are excited
00:42:01.420 | about advanced software capabilities, about AI,
00:42:04.780 | about bringing machine learning,
00:42:06.040 | these are the things that we're doing to, you know,
00:42:09.200 | exponentially increase the mission capabilities
00:42:12.960 | that go on those platforms.
00:42:14.360 | And those are the kinds of things that I think are more
00:42:16.440 | and more visible to the public.
00:42:18.440 | - Yeah, I think autonomy,
00:42:19.920 | especially in flight is super exciting.
00:42:23.080 | Do you see a day, here we go, back into philosophy,
00:42:28.080 | a future when most fighter jets will be highly autonomous
00:42:33.120 | to a degree where a human doesn't need to be in the cockpit
00:42:38.900 | in almost all cases?
00:42:40.640 | - Well, I mean, that's a world that to a certain extent
00:42:43.520 | we're in today.
00:42:44.360 | Now, these are remotely piloted aircraft to be sure,
00:42:47.800 | but we have hundreds of thousands of flight hours a year now
00:42:52.800 | in remotely piloted aircraft.
00:42:55.800 | And then if you take the F-35,
00:42:58.400 | there are huge layers, I guess, in levels of autonomy
00:43:04.680 | built into that aircraft
00:43:06.200 | so that the pilot is essentially more of a mission manager
00:43:11.200 | rather than doing the, you know,
00:43:14.020 | second-to-second elements of flying the aircraft.
00:43:17.180 | So in some ways it's the easiest aircraft
00:43:19.500 | in the world to fly.
00:43:20.820 | And kind of a funny story on that.
00:43:22.500 | So I don't know if you know how aircraft carrier landings
00:43:26.940 | work, but basically there's what's called a tail hook
00:43:30.820 | and it catches wires on the deck of the carrier.
00:43:33.780 | And that's what brings the aircraft
00:43:37.320 | to a screeching halt, right?
00:43:39.440 | And there's typically three of these wires.
00:43:41.840 | So if you miss the first, the second one,
00:43:43.520 | you catch the next one, right?
00:43:46.000 | And, you know, we got a little criticism.
00:43:49.580 | I don't know how true this story is,
00:43:50.940 | but we got a little criticism.
00:43:52.440 | The F-35 is so perfect.
00:43:54.520 | It always gets the second wire.
00:43:56.160 | So we're wearing out the wire
00:43:57.720 | because it always hits that one.
00:44:00.640 | So that, but that's the kind of autonomy
00:44:03.040 | that just makes these, that essentially up levels
00:44:05.800 | what the human is doing to more of that mission manager.
00:44:08.600 | - So much of that landing by the F-35 is autonomous.
00:44:12.100 | - Well, it's just, you know, the control systems are such
00:44:14.460 | that you really have dialed out the variability
00:44:17.100 | that comes with all of the environmental conditions.
00:44:19.920 | - You're wearing it out.
00:44:20.840 | - So my point is to a certain extent,
00:44:24.360 | that world is here today.
00:44:26.180 | Do I think that we're gonna see a day anytime soon
00:44:30.020 | when there are no humans in the cockpit?
00:44:31.860 | I don't believe that,
00:44:33.380 | but I do think we're gonna see much more
00:44:35.900 | human machine teaming,
00:44:37.160 | and we're gonna see that much more at the tactical edge.
00:44:40.540 | And we did a demo,
00:44:41.500 | you asked about what the Skunk Works is doing these days.
00:44:43.800 | And so this is something I can talk about,
00:44:46.260 | but we did a demo with the Air Force Research Laboratory.
00:44:51.260 | We called it Have Raider.
00:44:52.660 | And so using an F-16 as an autonomous wingman,
00:44:59.860 | and we demonstrated all kinds of maneuvers
00:45:02.540 | and various mission scenarios with the autonomous F-16
00:45:06.340 | being that so-called loyal or trusted wingman.
00:45:09.580 | And so those are the kinds of things that, you know,
00:45:12.140 | we've shown what is possible now,
00:45:15.500 | given that you've up-leveled that pilot
00:45:17.940 | to be a mission manager,
00:45:19.040 | now they can control multiple other aircraft.
00:45:22.340 | Think of them almost as extensions of your own aircraft
00:45:25.100 | flying alongside with you.
00:45:27.220 | So that's another example of how this is really
00:45:30.340 | coming to fruition.
00:45:31.620 | And then, yeah, I mentioned the landings,
00:45:35.220 | but think about just the implications
00:45:38.160 | for humans and flight safety.
00:45:39.900 | And this goes a little bit back to the discussion
00:45:41.860 | we were having about how do you continuously improve
00:45:45.740 | the level of safety through automation
00:45:48.980 | while working through the complexities
00:45:51.400 | that automation introduces.
00:45:53.380 | So one of the challenges that you have
00:45:54.780 | in high-performance fighter aircraft
00:45:56.340 | is what's called G-LOC.
00:45:57.500 | So this is G-induced loss of consciousness.
00:46:00.000 | So you pull nine Gs, you're wearing a pressure suit,
00:46:02.860 | that's not enough to keep the blood going to your brain,
00:46:05.820 | you blackout.
00:46:06.940 | - Right.
00:46:07.780 | - Right?
00:46:08.620 | And of course that's bad if you happen to be flying low,
00:46:11.900 | you know, near the deck,
00:46:13.100 | and you know, or an obstacle or terrain environment.
00:46:17.580 | And so we developed a system in our aeronautics division
00:46:22.460 | called Auto GCAS,
00:46:23.780 | so Automatic Ground Collision Avoidance System.
00:46:27.460 | And we built that into the F-16.
00:46:30.100 | It's actually saved seven aircraft, eight pilots already,
00:46:33.020 | in the relatively short time it's been deployed.
00:46:35.860 | It was so successful that the Air Force said,
00:46:39.220 | "Hey, we need to have this in the F-35 right away."
00:46:41.460 | So we've actually done testing of that now on the F-35.
00:46:45.380 | And we've also integrated
00:46:47.860 | an autonomous air collision avoidance system.
00:46:51.000 | So think the air-to-air problem.
00:46:52.900 | So now it's the integrated collision avoidance system.
00:46:56.060 | But these are the kinds of capabilities,
00:46:58.600 | you know, I wouldn't call them AI.
00:46:59.940 | I mean, they're very sophisticated models,
00:47:03.540 | you know, of the aircraft dynamics,
00:47:06.260 | coupled with the terrain models,
00:47:08.080 | to be able to predict when essentially,
00:47:11.420 | you know, the pilot is doing something
00:47:12.820 | that is gonna take the aircraft into,
00:47:14.860 | or the pilot's not doing something in this case.
00:47:18.140 | But those, it just gives you an example
00:47:20.220 | of how autonomy can be really a lifesaver in today's world.
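As an illustrative toy version of that predict-and-recover logic (constant-rate dynamics, flat terrain, made-up margins; the real Auto GCAS is far more sophisticated):

```python
def predict_altitude(alt_m: float, climb_rate_ms: float, t_s: float) -> float:
    """Toy constant-rate projection of aircraft altitude t_s seconds ahead."""
    return alt_m + climb_rate_ms * t_s

def auto_gcas_trigger(alt_m: float, climb_rate_ms: float, terrain_elev_m: float,
                      reaction_window_s: int = 5, margin_m: float = 50.0) -> bool:
    """Fire the automatic fly-up if predicted terrain clearance drops below margin."""
    for t in range(reaction_window_s + 1):            # check each second ahead
        clearance = predict_altitude(alt_m, climb_rate_ms, t) - terrain_elev_m
        if clearance < margin_m:
            return True   # take over: the pilot may be unconscious (G-LOC)
    return False

# Diving at 120 m/s from 800 m toward terrain at 300 m elevation.
print(auto_gcas_trigger(alt_m=800.0, climb_rate_ms=-120.0, terrain_elev_m=300.0))
# -> True: predicted clearance falls below 50 m within ~4 s, so recover now
```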
00:47:25.220 | - It's like autonomous,
00:47:28.180 | automated emergency braking in cars.
00:47:30.580 | But is there any exploration of perception of,
00:47:34.660 | for example, detecting G-LOC, that the pilot is out?
00:47:39.660 | So as opposed to perceiving the external environment
00:47:43.020 | to infer that the pilot is out,
00:47:44.500 | but actually perceiving the pilot directly?
00:47:47.420 | - Yeah, this is one of those cases
00:47:48.740 | where you'd like to not take action
00:47:50.780 | if you think the pilot's there.
00:47:52.100 | And it's almost like systems that try to detect
00:47:54.740 | if a driver's falling asleep on the road, right?
00:47:57.700 | With limited success.
00:47:59.980 | So, I mean, this is what I'd call
00:48:02.060 | the system of last resort, right?
00:48:03.740 | Where if the aircraft has determined
00:48:06.940 | that it's going into the terrain, get it out of there.
00:48:10.900 | And this is not something that we're just doing
00:48:13.580 | in the aircraft world.
00:48:15.700 | And I wanted to highlight,
00:48:16.940 | we have a technology we call Matrix,
00:48:18.660 | which was developed at Sikorsky Innovations.
00:48:21.980 | The whole idea there is what we call optimal piloting.
00:48:26.140 | So not optional piloting or unpiloted, but optimal piloting.
00:48:31.140 | So an FAA-certified system.
00:48:35.100 | So you have a high degree of confidence.
00:48:37.460 | It's generally pretty deterministic.
00:48:40.620 | So we know what it'll do in different situations,
00:48:43.940 | but effectively be able to fly a mission
00:48:48.140 | with two pilots, one pilot, no pilots.
00:48:51.540 | And you can think of it almost as like a dial
00:48:56.020 | of the level of autonomy that you want,
00:48:58.340 | so it's running in the background at all times
00:49:01.300 | and able to pick up tasks,
00:49:03.260 | whether it's sort of autopilot kinds of tasks
00:49:05.900 | or more sophisticated path planning kinds of activities.
00:49:10.900 | To be able to do things like, for example,
00:49:14.220 | land on an oil rig in the North Sea
00:49:16.860 | in bad weather, zero-zero conditions.
00:49:19.540 | And you can imagine, of course,
00:49:20.740 | there's a lot of military utility to capability like that.
00:49:24.540 | You could have an aircraft that you want to send out
00:49:27.300 | for a crewed mission, but then at night,
00:49:29.780 | if you want to use it to deliver supplies
00:49:31.900 | in an unmanned mode, that could be done as well.
00:49:35.540 | And so there's clear advantages there.
00:49:39.980 | But think about on the commercial side,
00:49:41.860 | say you've got an aircraft, and
00:49:44.420 | you're gonna fly out to this oil rig.
00:49:46.100 | And if you get out there and you can't land,
00:49:48.020 | then you gotta bring all those people back,
00:49:50.660 | reschedule another flight, pay the overtime for the crew
00:49:53.140 | that you just brought back 'cause they didn't get
00:49:55.260 | where they were going, pay for the overtime
00:49:56.660 | for the folks that are out there on the oil rig.
00:49:58.660 | These are real economic, dollars-and-cents
00:50:01.900 | kinds of advantages that we're bringing
00:50:04.020 | in the commercial world as well.
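As a rough sketch of the "dial of autonomy" idea: the automation runs continuously, and the selected crewing level determines which tasks it picks up. The levels and the task split below are invented for illustration and are not Matrix's actual design.

```python
from enum import Enum

class CrewLevel(Enum):
    TWO_PILOTS = 2
    ONE_PILOT = 1
    NO_PILOT = 0

# Illustrative task split: the automation runs in the background at all
# times, and as the crewing level drops it picks up more of the work.
AUTOMATION_TASKS = {
    CrewLevel.TWO_PILOTS: {"stability augmentation"},
    CrewLevel.ONE_PILOT: {"stability augmentation", "autopilot", "radio and nav"},
    CrewLevel.NO_PILOT: {"stability augmentation", "autopilot", "radio and nav",
                         "path planning", "approach and landing"},
}

def delegated_tasks(level: CrewLevel) -> set:
    """Tasks the automation currently owns for the selected crewing level."""
    return AUTOMATION_TASKS[level]

# The same airframe can fly a crewed mission by day and, with the dial
# turned up, an unmanned resupply run at night.
print(sorted(delegated_tasks(CrewLevel.ONE_PILOT)))
print(sorted(delegated_tasks(CrewLevel.NO_PILOT)))
```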
00:50:06.060 | - So this is a difficult question from the AI space
00:50:09.180 | that I would love it if you're able to comment on.
00:50:11.660 | So a lot of this autonomy in AI you've mentioned just now
00:50:15.420 | has this empowering effect.
00:50:17.060 | One is the last resort, it keeps you safe.
00:50:20.420 | The other is, with the teaming
00:50:22.900 | and in general, assistive AI.
00:50:27.900 | And I think there's always a race.
00:50:33.140 | So the world is complex.
00:50:36.980 | It's full of bad actors.
00:50:41.140 | So there's often a race to make sure
00:50:43.640 | that we keep this country safe, right?
00:50:47.160 | But with AI, there is a concern
00:50:51.940 | that it's a slightly different race.
00:50:55.100 | There's a lot of people in the AI space
00:50:56.780 | that are concerned about the AI arms race.
00:50:59.580 | That as opposed to the United States,
00:51:03.300 | you know, having the best technology
00:51:05.420 | and therefore keeping us safe,
00:51:07.380 | we even lose the ability to keep control of it.
00:51:11.500 | So this is the AI arms race getting away
00:51:14.520 | from all of us humans.
00:51:16.780 | So do you share this worry?
00:51:19.440 | Do you share this concern when we're talking
00:51:21.520 | about military applications that too much control
00:51:25.720 | and decision-making capability is being given to software or AI?
00:51:30.400 | - Well, I don't see it happening today.
00:51:34.140 | And in fact, this is something from a policy perspective,
00:51:37.840 | you know, it's obviously a very dynamic space,
00:51:39.960 | but the Department of Defense has put quite a bit
00:51:42.400 | of thought into that.
00:51:44.280 | And maybe before talking about the policy,
00:51:46.580 | I'll just talk about some of the why.
00:51:48.920 | And you alluded to it being a sort of a complicated
00:51:52.240 | and a little bit scary world out there,
00:51:54.040 | but there's some big things happening today.
00:51:57.280 | You hear a lot of talk now about a return
00:51:59.420 | to great powers competition,
00:52:01.480 | particularly around China and Russia with the US,
00:52:05.440 | but there are some other big players out there as well.
00:52:09.620 | And what we've seen is the deployment of some very,
00:52:14.620 | I'd say concerning new weapon systems,
00:52:20.340 | you know, particularly with Russia breaching
00:52:23.040 | some of the IRBM, Intermediate-Range Ballistic Missile,
00:52:26.060 | treaties, that's been in the news a lot.
00:52:28.020 | You know, the building of islands, artificial islands
00:52:33.060 | in the South China Sea by the Chinese
00:52:35.160 | and then arming those islands.
00:52:38.740 | The annexation of Crimea by Russia,
00:52:42.920 | the invasion of Ukraine.
00:52:44.820 | So there's some pretty scary things.
00:52:47.180 | And then you add on top of that,
00:52:48.820 | the North Korean threat has certainly not gone away.
00:52:52.980 | There's a lot going on in the Middle East
00:52:54.740 | with Iran in particular.
00:52:56.680 | And we see this global terrorism threat
00:52:59.700 | has not abated, right?
00:53:02.380 | So there are a lot of reasons to look for technology
00:53:06.100 | to assist with those problems,
00:53:07.660 | whether it's AI or other technologies like hypersonics,
00:53:11.300 | which we discussed.
00:53:12.980 | So now let me give just a couple of hypotheticals.
00:53:17.340 | So people react sort of in the second timeframe, right?
00:53:22.260 | You know, from a
00:53:23.620 | photon hitting your eye to, you know,
00:53:27.180 | movement is, you know, on the order of a few tenths
00:53:30.000 | of a second kinds of processing times.
00:53:34.400 | Roughly speaking, you know,
00:53:36.520 | computers are operating in the nanosecond timescale, right?
00:53:41.520 | So just to bring home what that means,
00:53:44.600 | a nanosecond to a second is like a second to 32 years.
00:53:49.600 | So seconds on the battlefield in that sense,
00:53:53.920 | literally are lifetimes.
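The arithmetic behind that analogy checks out: a second contains a billion nanoseconds, and a billion seconds is roughly 32 years. A quick sanity check:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # about 3.156e7 seconds
nanoseconds_per_second = 1e9
# Scale a nanosecond up to a second, and a second scales up to ~32 years:
print(nanoseconds_per_second / SECONDS_PER_YEAR)   # ~31.7 years
```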
00:53:55.660 | And so if you can bring an autonomous
00:54:01.000 | or AI enabled capability
00:54:03.240 | that will enable the human to shrink,
00:54:05.600 | and maybe you've heard the term, the OODA loop.
00:54:07.520 | So this whole idea that a typical battlefield decision
00:54:12.120 | is characterized by observe.
00:54:15.800 | So information comes in, orient,
00:54:18.400 | what does that mean in the context?
00:54:21.240 | Decide, what do I do about it?
00:54:23.040 | And then act, take that action.
00:54:25.160 | If you can use these capabilities to compress that OODA loop
00:54:29.040 | to stay inside what your adversary is doing,
00:54:32.240 | that's an incredibly powerful force on the battlefield.
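A toy illustration of what compressing the loop buys: two sides run the same observe, orient, decide, act cycle, and the side with the shorter cycle time completes more decisions in the same window. The cycle times below are invented for illustration.

```python
def ooda_cycles(cycle_time_s, window_s):
    """Full observe -> orient -> decide -> act loops completed in the window."""
    return int(window_s // cycle_time_s)

human_in_loop_s = 2.0    # seconds per decision cycle, illustrative
ai_assisted_s = 0.25     # automation compresses each stage, illustrative

window_s = 60.0          # one minute of engagement
print("human-in-the-loop cycles:", ooda_cycles(human_in_loop_s, window_s))  # 30
print("AI-assisted cycles:", ooda_cycles(ai_assisted_s, window_s))          # 240
# The faster side can observe and react to the slower side's last action
# before the slower side finishes deciding: it stays inside that loop.
```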
00:54:37.240 | - That's a really nice way to put it.
00:54:39.160 | That the role of AI and computing in general
00:54:41.720 | has a lot to offer in just decreasing
00:54:45.180 | that, from 32 years to one second,
00:54:47.260 | as opposed to, on the scale of seconds and minutes and hours,
00:54:50.580 | making decisions that humans are better at making.
00:54:53.480 | - And it actually goes the other way too.
00:54:55.000 | So that's on the short time scale.
00:54:57.200 | So humans kind of work in the, you know,
00:54:58.780 | one second, two seconds to eight hours.
00:55:01.560 | After eight hours, you get tired, you know,
00:55:04.360 | you gotta go to the bathroom, whatever the case might be.
00:55:07.520 | So there's this whole range of other things.
00:55:09.760 | Think about, you know, surveillance and guarding,
00:55:14.760 | you know, facilities.
00:55:16.600 | Think about moving material, logistics, sustainment.
00:55:20.540 | A lot of these, what they call dull, dirty,
00:55:22.620 | and dangerous things, where you need to have
00:55:24.960 | sustained activity, but it's sort of beyond the length
00:55:27.600 | of time that a human can practically do as well.
00:55:30.960 | So there's this range of things that are critical
00:55:35.280 | in military and defense applications
00:55:39.160 | that AI and autonomy are particularly well suited to.
00:55:43.260 | Now, the interesting question that you brought up is,
00:55:46.080 | okay, how do you make sure that stays within human control?
00:55:49.040 | And so that was the context; now for the policy.
00:55:52.400 | And so there is a DOD directive called 3000.09,
00:55:56.200 | because that's the way we name stuff in this world.
00:55:59.300 | And, you know, I'd say it's well worth reading.
00:56:04.300 | It's only a couple pages long, but it makes some key points.
00:56:07.320 | And it's really around, you know,
00:56:09.000 | making sure that there's human agency and control
00:56:12.360 | over use of semi-autonomous and autonomous weapon systems.
00:56:17.360 | Making sure that these systems are tested, verified,
00:56:23.840 | and evaluated in realistic, real-world type scenarios.
00:56:28.240 | Making sure that the people are actually trained
00:56:30.440 | on how to use them.
00:56:31.920 | Making sure that the systems have human machine interfaces
00:56:36.040 | that can show what state they're in
00:56:38.120 | and what kinds of decisions they're making.
00:56:40.360 | Making sure that you've established doctrine and tactics
00:56:43.880 | and techniques and procedures
00:56:45.840 | for the use of these kinds of systems.
00:56:48.280 | And so, and by the way, I mean, none of this is easy,
00:56:52.920 | but I'm just trying to lay kind of the picture
00:56:56.080 | of how the US has said,
00:56:58.200 | this is the way we're gonna treat AI and autonomous systems.
00:57:02.620 | That it's not a free for all.
00:57:04.600 | And like there are rules of war and rules of engagement
00:57:08.160 | with other kinds of systems,
00:57:09.480 | think chemical weapons, biological weapons,
00:57:12.200 | we need to think about the same sorts of implications.
00:57:15.600 | And this is something that's really important
00:57:17.320 | for Lockheed Martin.
00:57:18.240 | I mean, obviously we are 100% complying
00:57:21.600 | with our customers and their policies and regulations.
00:57:26.440 | But I mean, AI is an incredible enabler,
00:57:30.580 | say within the walls of Lockheed Martin
00:57:32.360 | in terms of improving production efficiency,
00:57:35.220 | helping engineers doing generative design,
00:57:38.240 | improving logistics, driving down energy costs.
00:57:42.040 | I mean, there's so many applications.
00:57:44.380 | But we're also very interested in some of the elements
00:57:48.160 | of ethical application within Lockheed Martin.
00:57:51.800 | So we need to make sure that things like privacy
00:57:54.360 | are taken care of, that we do everything we can
00:57:58.480 | to drive out bias in AI enabled kinds of systems.
00:58:03.440 | That we make sure that humans are involved in decisions,
00:58:06.280 | that we're not just delegating accountability to algorithms.
00:58:10.600 | And so for us, it all comes back,
00:58:13.240 | I talked about culture before,
00:58:14.520 | and it comes back to sort of the Lockheed Martin culture.
00:58:17.880 | And our core values, and so it's pretty simple for us.
00:58:20.880 | And do what's right, respect others,
00:58:22.480 | perform with excellence.
00:58:24.240 | And now how do we tie that back to the ethical principles
00:58:27.880 | that will govern how AI is used within Lockheed Martin?
00:58:31.800 | And, so you might not know this,
00:58:35.520 | but there are actually awards for ethics programs.
00:58:37.680 | Lockheed Martin's had a recognized ethics program
00:58:41.400 | for many years, and this is one of the things
00:58:43.680 | that our ethics team is working
00:58:45.040 | with our engineering team on.
00:58:47.840 | - One of the miracles to me, perhaps as a layman,
00:58:51.320 | again, I was born in the Soviet Union,
00:58:53.760 | so I have echoes, at least in my family history
00:58:57.840 | of World War II and the Cold War.
00:59:00.640 | Do you have a sense of why human civilization
00:59:04.800 | has not destroyed itself through nuclear war,
00:59:07.120 | so nuclear deterrence?
00:59:09.200 | And thinking about the future,
00:59:11.960 | does technology have a role to play here?
00:59:14.620 | And what does the long-term future
00:59:17.420 | of nuclear deterrence look like?
00:59:20.460 | - Yeah, this is one of those hard, hard questions.
00:59:24.860 | And I should note that Lockheed Martin
00:59:27.980 | is both proud and privileged to play a part
00:59:30.420 | in multiple legs of our nuclear and strategic deterrent
00:59:35.420 | systems like the Trident submarine-launched
00:59:38.780 | ballistic missiles.
00:59:41.780 | You know, you talk about, you know,
00:59:45.300 | is there still a possibility that the human race
00:59:48.060 | could destroy itself?
00:59:49.100 | I'd say that possibility is real,
00:59:50.900 | but interestingly, in some sense,
00:59:55.460 | I think the strategic deterrents have prevented
01:00:00.060 | the kinds of, you know, incredibly destructive
01:00:03.100 | world wars that we saw in the first half
01:00:05.220 | of the 20th century.
01:00:07.300 | Now, things have gotten more complicated
01:00:10.040 | since that time and since the Cold War.
01:00:12.300 | It is more of a multipolar, great powers world today.
01:00:15.520 | Just to give you an example, back then,
01:00:18.620 | you know, there were, you know,
01:00:20.060 | in the Cold War time frame, just a handful of nations
01:00:23.060 | that had ballistic missile capability.
01:00:25.120 | By last count, and this is a few years old,
01:00:28.220 | there's over 70 nations today that have that.
01:00:31.220 | Similar kinds of numbers in terms
01:00:34.300 | of space-based capabilities.
01:00:38.080 | So, the world has gotten more complex
01:00:41.600 | and more challenging, and the threats, I think,
01:00:43.800 | have proliferated in ways that we didn't expect.
01:00:47.560 | You know, the nation today is in the middle
01:00:52.000 | of a recapitalization of our strategic deterrent.
01:00:55.360 | I look at that as one of the most important things
01:00:58.760 | that our nation can do.
01:01:00.320 | - What is involved in deterrence?
01:01:01.880 | Is it being ready to attack?
01:01:07.700 | Or is it the defensive systems that catch attacks?
01:01:11.580 | - A little bit of both.
01:01:12.580 | And so, it's a complicated game-theoretical kind of problem.
01:01:16.700 | But, ultimately, we are trying to prevent the use
01:01:21.700 | of any of these weapons.
01:01:24.900 | And the theory behind prevention is that even
01:01:29.700 | if an adversary uses a weapon against you,
01:01:33.340 | you have the capability to essentially strike back
01:01:37.660 | and do harm to them that's unacceptable.
01:01:40.860 | And so, that will deter them from making use
01:01:44.960 | of these weapon systems.
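That deterrence logic can be captured in a toy two-player game: when the defender has an assured second strike that imposes unacceptable cost, striking first is never the attacker's best response. The payoff numbers below are arbitrary illustrations, not any real model.

```python
# Attacker payoffs (higher is better) under two defender postures.
# The numbers are arbitrary; only the ordering matters.
payoff = {
    ("strike", False): 5,     # unanswered first strike looks attractive
    ("hold", False): 0,
    ("strike", True): -100,   # assured retaliation imposes unacceptable cost
    ("hold", True): 0,
}

for second_strike in (False, True):
    best = max(("strike", "hold"), key=lambda a: payoff[(a, second_strike)])
    print(f"assured second strike={second_strike}: attacker best response={best}")
# Without a second strike, striking is tempting; with one, holding
# dominates, which is the deterrence argument in miniature.
```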
01:01:46.300 | The deterrence calculus has changed, of course,
01:01:50.820 | with more nations now having these kinds of weapons.
01:01:55.140 | But I think, from my perspective, it's very important
01:02:02.220 | to maintain a strategic deterrent.
01:02:05.020 | You have to have systems that you know will work
01:02:08.820 | when they're required to work.
01:02:11.220 | You know that they have to be adaptable
01:02:13.540 | to a variety of different scenarios in today's world.
01:02:17.540 | And so, that's what this recapitalization is about: taking systems
01:02:20.380 | that were built over previous decades and
01:02:23.220 | making sure that they are appropriate, not just for today,
01:02:26.640 | but for the decades to come.
01:02:29.120 | So, the other thing I'd really like to note
01:02:32.220 | is strategic deterrence has a very different character today.
01:02:37.220 | You know, we used to think of weapons of mass destruction
01:02:42.420 | in terms of nuclear, chemical, biological.
01:02:45.780 | And today, we have a cyber threat.
01:02:48.700 | We've seen examples of the use of cyber weaponry.
01:02:53.700 | And if you think about the possibilities
01:02:58.580 | of using cyber capabilities or an adversary
01:03:02.260 | attacking the US to take out things
01:03:05.480 | like critical infrastructure, electrical grids,
01:03:08.860 | water systems, those are scenarios that are strategic
01:03:13.860 | in nature to the survival of a nation as well.
01:03:19.080 | So, that is the kind of world that we live in today.
01:03:23.060 | And, you know, part of my hope on this
01:03:26.180 | is that we can also develop technical
01:03:29.140 | or technological systems, perhaps enabled by AI and autonomy
01:03:33.620 | that will allow us to contain and to fight back
01:03:38.620 | against these kinds of new threats that were not conceived
01:03:43.500 | when we first developed our strategic deterrence.
01:03:46.260 | - Yeah, I know that Lockheed is involved in cyber.
01:03:48.340 | So, I saw that you mentioned that.
01:03:50.900 | It's incredibly hard,
01:03:54.380 | nuclear almost seems easier than cyber
01:03:57.580 | 'cause there's so many attacks,
01:03:58.780 | so many ways that cyber can evolve.
01:04:01.740 | It's such an uncertain future.
01:04:03.460 | But talking about engineering with a mission,
01:04:05.780 | I mean, in this case, you're engineering systems
01:04:09.580 | that basically save the world.
01:04:12.740 | - It's, like I said, we're privileged to work
01:04:18.000 | on some very challenging problems
01:04:19.980 | for very critical customers here in the US
01:04:23.300 | and with our allies abroad as well.
01:04:25.140 | - Lockheed builds both military and non-military systems.
01:04:30.740 | And perhaps the future of Lockheed
01:04:32.940 | may be more in non-military applications,
01:04:35.300 | if you talk about space and beyond.
01:04:37.180 | I say that as a preface to a difficult question.
01:04:41.420 | So, President Eisenhower in 1961,
01:04:45.100 | in his farewell address talked about
01:04:46.780 | the military-industrial complex
01:04:49.020 | and that it shouldn't grow beyond what is needed.
01:04:52.740 | So, what are your thoughts on those words
01:04:55.860 | on the military-industrial complex,
01:04:58.780 | on the concern of growth of those developments
01:05:03.780 | beyond what may be needed?
01:05:05.740 | - That 'what may be needed'
01:05:09.460 | is a critical phrase, of course.
01:05:12.260 | And I think it is worth pointing out,
01:05:14.300 | as you noted, that Lockheed Martin,
01:05:15.980 | we're in a number of commercial businesses
01:05:19.380 | from energy to space to commercial aircraft.
01:05:24.020 | And so, I wouldn't neglect the importance
01:05:28.660 | of those parts of our business as well.
01:05:32.100 | I think the world is dynamic and there was a time,
01:05:37.100 | it doesn't seem that long ago to me,
01:05:39.420 | I was a graduate student here at MIT
01:05:41.940 | and we were talking about the peace dividend
01:05:43.940 | at the end of the Cold War.
01:05:45.780 | If you look at expenditure on military systems
01:05:49.260 | as a fraction of GDP, we're far below peak levels
01:05:53.260 | of the past.
01:05:55.660 | And to me, at least, it looks like a time
01:05:59.180 | where you're seeing global threats changing
01:06:01.580 | in a way that would warrant relevant investments
01:06:06.580 | in defensive capabilities.
01:06:10.020 | The other thing I'd note,
01:06:12.180 | for military and defensive systems,
01:06:18.540 | it's not quite a free market, right?
01:06:21.500 | We don't sell to people on the street.
01:06:25.740 | And that warrants a very close partnership
01:06:29.500 | between, I'd say, the customers and the people
01:06:33.180 | that design, build, and maintain these systems.
01:06:38.180 | Because of the very unique nature,
01:06:42.020 | the very difficult requirements,
01:06:44.980 | the very great importance on safety
01:06:49.460 | and on operating the way they're intended every time.
01:06:54.460 | And so that does create, and frankly,
01:06:57.700 | it's one of Lockheed Martin's great strengths,
01:06:59.580 | is that we have this expertise built up over many years
01:07:03.500 | in partnership with our customers
01:07:05.420 | to be able to design and build these systems
01:07:08.260 | that meet these very unique mission needs.
01:07:11.580 | - Yeah, because building those systems is very costly.
01:07:14.420 | There's very little room for error.
01:07:16.100 | I mean, yeah, Ben Rich's book and so on
01:07:18.980 | just tells the story.
01:07:20.300 | It's nerve-wracking just reading it.
01:07:22.340 | If you're an engineer, it reads like a thriller.
01:07:24.380 | Okay, let's go back to space for a second.
01:07:29.380 | I guess--
01:07:30.700 | - I'm always happy to go back to space.
01:07:33.100 | - So a few quick, maybe out there, maybe fun questions,
01:07:38.100 | maybe a little provocative.
01:07:40.580 | What are your thoughts on the efforts
01:07:43.660 | of the new folks, SpaceX and Elon Musk?
01:07:48.660 | What are your thoughts about what Elon is doing?
01:07:50.900 | Do you see him as competition?
01:07:52.660 | Do you enjoy competition?
01:07:54.220 | What are your thoughts?
01:07:56.220 | - Yeah, first of all, certainly Elon, I'd say SpaceX
01:08:01.140 | and some of his other ventures
01:08:03.220 | are definitely a competitive force in the space industry.
01:08:08.220 | And do we like competition?
01:08:09.900 | Yeah, we do.
01:08:11.500 | And we think we're very strong competitors.
01:08:15.580 | I think competition is what the US is founded on
01:08:20.580 | in a lot of ways and always coming up with a better way.
01:08:24.700 | And I think it's really important to continue,
01:08:28.740 | to have fresh eyes coming in, new innovation.
01:08:33.020 | I do think it's important to have level playing fields.
01:08:35.500 | And so you wanna make sure that you're not giving
01:08:39.940 | different requirements to different players.
01:08:42.860 | But I tell people, I spent a lot of time at places like MIT,
01:08:47.580 | I'm gonna be at the MIT Beaverworks Summer Institute
01:08:50.640 | over the weekend here.
01:08:52.140 | And I tell people, this is the most exciting time
01:08:55.100 | to be in the space business in my entire life.
01:08:58.460 | And it is this explosion of new capabilities
01:09:03.020 | that have been driven by things like the massive increase
01:09:07.020 | in computing power, things like the massive increase
01:09:10.940 | in comms capabilities, advanced and additive manufacturing
01:09:15.180 | are really bringing down the barriers to entry in this field
01:09:19.540 | and it's driving just incredible innovation.
01:09:21.940 | And it's happening at startups,
01:09:23.080 | but it's also happening at Lockheed Martin.
01:09:25.460 | You may not realize this, but Lockheed Martin,
01:09:27.300 | working with Stanford actually built the first CubeSat
01:09:30.020 | that was launched here out of the US
01:09:33.660 | which was called QuakeSat.
01:09:35.180 | And we did that with Stellar Solutions.
01:09:37.500 | This was right around just after 2000, I guess.
01:09:41.660 | And so we've been in that from the very beginning.
01:09:45.580 | And I talked about some of these like Maya and Orion,
01:09:50.160 | but we're in the middle of what we call SmartSats
01:09:54.220 | and software-defined satellites
01:09:55.860 | that can essentially restructure and remap their purpose,
01:10:00.660 | their mission on orbit to give you
01:10:03.140 | almost unlimited flexibility for these satellites
01:10:06.540 | over their lifetimes.
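A minimal sketch of the software-defined-satellite idea: mission software is uploadable and swappable on orbit, so the same spacecraft can be re-tasked over its lifetime. The class and mission names are invented for illustration, not the actual SmartSat architecture.

```python
class SoftwareDefinedSatellite:
    """Toy model: mission software is uploadable and swappable on orbit."""

    def __init__(self):
        self.apps = {}       # mission applications, uplinked after launch
        self.active = None

    def upload(self, name, app):
        self.apps[name] = app

    def retask(self, name):
        """Remap the satellite's purpose by switching the active app."""
        self.active = self.apps[name]

    def run(self):
        if self.active:
            self.active()

sat = SoftwareDefinedSatellite()
sat.upload("earth_imaging", lambda: print("collecting imagery"))
sat.upload("comms_relay", lambda: print("relaying communications"))
sat.retask("earth_imaging"); sat.run()
sat.retask("comms_relay"); sat.run()   # same hardware, new mission on orbit
```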
01:10:08.020 | So those are just a couple of examples,
01:10:10.260 | but yeah, this is a great time to be in space.
01:10:13.520 | - Absolutely.
01:10:14.380 | So the Wright brothers flew for the first time 116 years ago.
01:10:19.380 | So now we have supersonic stealth planes
01:10:23.060 | and all the technology we've talked about.
01:10:25.420 | What innovations, obviously you can't predict the future,
01:10:29.300 | but what do you see Lockheed doing in the next 100 years?
01:10:32.420 | If you take that same leap,
01:10:34.100 | how will the world of technology and engineering change?
01:10:37.820 | I know it's an impossible question,
01:10:39.340 | but nobody could have predicted
01:10:42.300 | that we could even fly 120 years ago.
01:10:45.780 | So what do you think is the edge of possibility
01:10:50.620 | that we're going to be exploring in the next 100 years?
01:10:52.660 | - I don't know that there is an edge.
01:10:54.500 | We've been around for almost that entire time, right?
01:11:00.780 | The Lockheed brothers and Glenn L. Martin
01:11:03.860 | starting their companies in the basement of a church
01:11:07.980 | and an old service station.
01:11:10.680 | We're very different companies today
01:11:14.260 | than we were back then, right?
01:11:15.700 | And that's because we've continuously reinvented ourselves
01:11:19.060 | over all of those decades.
01:11:21.700 | I think it's fair to say, I know this for sure,
01:11:24.340 | the world of the future, it's gonna move faster,
01:11:27.840 | it's gonna be more connected,
01:11:29.340 | it's gonna be more autonomous,
01:11:31.700 | and it's gonna be more complex than it is today.
01:11:36.220 | And so this is the world, as a CTO at Lockheed Martin,
01:11:39.740 | that I think about,
01:11:40.580 | what are the technologies that we have to invest in,
01:11:42.700 | whether it's things like AI and autonomy,
01:11:45.500 | you can think about quantum computing,
01:11:47.300 | which is an area that we've invested in
01:11:49.140 | to try to stay ahead of these technological changes
01:11:53.540 | and frankly, some of the threats that are out there.
01:11:56.300 | I believe that we're gonna be out there
01:11:58.380 | in the solar system, that we're gonna be defending
01:12:00.860 | and defending well against probably military threats
01:12:04.980 | that nobody has even thought about today.
01:12:07.180 | We're gonna use these capabilities
01:12:12.380 | to have far greater knowledge of our own planet,
01:12:15.700 | the depths of the oceans, all the way to the upper reaches,
01:12:19.320 | the atmosphere and everything out to the sun
01:12:21.420 | and to the edge of the solar system.
01:12:23.420 | So that's what I look forward to.
01:12:26.740 | And I'm excited, I mean, just looking ahead
01:12:30.860 | in the next decade or so to the steps
01:12:33.380 | that I see ahead of us in that time.
01:12:35.360 | - I don't think there's a better place to end, Keoki.
01:12:38.580 | Thank you so much.
01:12:39.620 | - Lex, it's been a real pleasure
01:12:41.100 | and sorry it took so long to get up here,
01:12:43.460 | but glad we're able to make it happen.
01:12:45.540 | (upbeat music)