
John Carmack: Doom, Quake, VR, AGI, Programming, Video Games, and Rockets | Lex Fridman Podcast #309


Chapters

0:00 Introduction
1:57 Programming languages
33:01 Modern programming
43:03 Day in the life
50:53 Hard work
54:06 Pizza and Diet Coke
56:50 Setup
82:08 id Software
114:58 Commander Keen
121:44 Hacker ethic
129:24 Wolfenstein 3D
149:21 Doom
163:42 Quake
188:02 John Romero
195:49 Metaverse
224:11 Elon Musk
230:06 Mars
239:09 Nuclear energy
242:47 AGI
289:59 Andrej Karpathy
292:57 Martial arts
301:57 Advice for young people
310:57 Meaning of life


00:00:00.000 | I remember the reaction where he had drawn these characters
00:00:02.540 | and he was slowly moving around.
00:00:04.180 | And people had no experience with 3D navigation.
00:00:06.940 | It was all still keyboard.
00:00:08.020 | We didn't even have mice set up at that time.
00:00:10.900 | But slowly moving, going up, picked up a key, go to a wall.
00:00:14.740 | The wall disappears in a little animation.
00:00:16.820 | And there's a monster right there.
00:00:18.700 | And he practically fell out of his chair.
00:00:20.420 | It was just like, ah!
00:00:21.620 | And games just didn't do that.
00:00:24.940 | The games were the god's eye view.
00:00:26.820 | You were a little invested in your little guy.
00:00:28.780 | You can be happy or sad when things happen.
00:00:32.220 | But you just did not get that kind of startle reaction.
00:00:34.900 | - You weren't inside the game.
00:00:35.740 | - Something in the back of your brain,
00:00:38.020 | some reptile brain thing is just going,
00:00:40.180 | oh shit, something just happened.
00:00:42.420 | And that was one of those early points where it's like,
00:00:44.940 | yeah, this is gonna make a difference.
00:00:47.240 | This is going to be powerful and it's gonna matter.
00:00:50.020 | - The following is a conversation with John Carmack,
00:00:55.140 | widely considered to be one of the greatest programmers ever.
00:00:59.540 | He was the co-founder of id Software
00:01:01.940 | and the lead programmer on several games
00:01:03.880 | that revolutionized the technology, the experience,
00:01:07.420 | and the role of gaming in our society,
00:01:09.780 | including Commander Keen, Wolfenstein 3D, Doom, and Quake.
00:01:14.780 | He spent many years as the CTO of Oculus VR,
00:01:19.940 | helping to create portals into virtual worlds
00:01:23.060 | and to define the technological path
00:01:25.500 | to the metaverse and meta.
00:01:28.140 | And now he has been shifting some of his attention
00:01:30.940 | to the problem of artificial general intelligence.
00:01:34.860 | This was the longest conversation on this podcast
00:01:38.380 | at over five hours.
00:01:40.100 | And still I could talk to John many, many more times.
00:01:43.740 | And we hope to do just that.
00:01:45.540 | This is the Lex Fridman Podcast.
00:01:48.660 | To support it, please check out our sponsors
00:01:50.860 | in the description.
00:01:52.180 | And now, dear friends, here's John Carmack.
00:01:56.360 | What was the first program you've ever written?
00:02:00.100 | Do you remember?
00:02:01.260 | - Yeah, I do.
00:02:02.100 | So I remember being in a radio shack,
00:02:04.420 | going up to the TRS-80 computers
00:02:06.820 | and learning just enough to be able to do
00:02:09.900 | 10 PRINT JOHN CARMACK.
00:02:12.780 | And it's kind of interesting how, of course,
00:02:15.060 | you know, Kernighan and Ritchie
00:02:17.420 | kind of standardized "Hello, World"
00:02:19.100 | as the first thing that you do
00:02:20.080 | in every computer programming language
00:02:21.860 | in every computer,
00:02:22.740 | but not having any interaction with the cultures
00:02:25.820 | of Unix or any other standardized things.
00:02:28.220 | It was just like, well, what am I gonna say?
00:02:29.860 | I'm gonna say my name.
00:02:31.020 | And then you learn how to do GOTO 10
00:02:33.180 | and have it scroll all off the screen.
00:02:34.940 | And that was definitely the first thing
00:02:37.300 | that I wound up doing on a computer.
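For reference, the two-line BASIC program being described, reconstructed from the description above, along with a minimal C sketch of the same behavior (the C version is only an illustration, not anything from the conversation):

```c
/* The TRS-80 BASIC program as described:
 *   10 PRINT "JOHN CARMACK"
 *   20 GOTO 10
 * A minimal C sketch of the same behavior: */
#include <stdio.h>

int main(void) {
    for (;;) {                   /* GOTO 10: loop forever */
        puts("JOHN CARMACK");    /* line 10: print, scrolling off the screen */
    }
}
```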
00:02:39.540 | - Can I ask you a programming advice?
00:02:41.300 | I was always told in the beginning
00:02:42.700 | that you're not allowed to use GOTO statements.
00:02:44.420 | That's really bad programming.
00:02:45.540 | Is this correct or not?
00:02:46.940 | Jumping around code.
00:02:48.540 | Can we look at the philosophy
00:02:50.340 | and the technical aspects of the GOTO statement
00:02:53.740 | that seems so convenient,
00:02:55.140 | but it's supposed to be bad programming?
00:02:56.540 | - Back in the day in BASIC programming languages,
00:02:58.980 | you didn't have proper loops.
00:03:00.780 | You didn't have WHILEs and REPEATs.
00:03:02.580 | You know, that was the land of Pascal
00:03:04.180 | for people that kind of generally had access
00:03:06.620 | to it back then.
00:03:07.460 | So you had no choice but to use GOTOs.
00:03:10.380 | And as you made what were big programs back then
00:03:13.500 | (a thousand-line BASIC program
00:03:15.580 | is a really big program),
00:03:16.780 | they did tend to sort of degenerate into madness.
00:03:20.020 | You didn't have good editors or code exploration tools.
00:03:23.380 | So you would wind up fixing things in one place,
00:03:26.180 | add a little patch.
00:03:27.140 | And there's reasons why structured programming
00:03:29.700 | generally helps understanding,
00:03:31.740 | but GOTOs aren't poisonous.
00:03:33.860 | Sometimes they're the right thing to do.
00:03:35.940 | Usually it's because there's a language feature missing
00:03:38.860 | like nested breaks or something
00:03:40.540 | where it can sometimes be better to do a GOTO cleanup
00:03:44.580 | or GOTO error rather than having multiple flags,
00:03:48.060 | multiple if statements littered throughout things.
00:03:50.460 | But it is rare.
00:03:51.820 | I mean, if you grep through all of my code right now,
00:03:55.140 | I don't think any of my current code bases
00:03:57.540 | would actually have a GOTO,
00:03:58.980 | but deep within sort of the technical underpinnings
00:04:02.660 | of a major game engine,
00:04:03.740 | you're gonna have some GOTOs in a couple of places probably.
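A minimal sketch of the "goto cleanup" pattern mentioned above, standing in for a missing language feature (a unified error exit); the function name, file arguments, and buffer size here are hypothetical:

```c
#include <stdio.h>
#include <stdlib.h>

/* Acquire several resources; on any failure, jump to a single cleanup
 * block instead of threading flags through nested if statements. */
int process_file(const char *in_path, const char *out_path) {
    int rc = -1;                   /* assume failure until the work succeeds */
    FILE *in = NULL, *out = NULL;
    char *buf = NULL;

    if ((in = fopen(in_path, "rb")) == NULL) goto cleanup;
    if ((out = fopen(out_path, "wb")) == NULL) goto cleanup;
    if ((buf = malloc(4096)) == NULL) goto cleanup;

    /* ... real work would go here ... */
    rc = 0;                        /* success */

cleanup:                           /* one exit path for every outcome */
    free(buf);                     /* free(NULL) is a no-op */
    if (out) fclose(out);
    if (in)  fclose(in);
    return rc;
}
```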
00:04:07.220 | - Yeah, the infrastructure on top of,
00:04:09.180 | like the closer you get to the machine code,
00:04:11.260 | the more GOTOs you're gonna see,
00:04:12.700 | the more of these like hacks you're going to see
00:04:15.660 | because the set of features available to you
00:04:18.220 | in low level programming languages is limited.
00:04:22.380 | So print John Carmack,
00:04:25.940 | when is the first time,
00:04:28.340 | if we could talk about love,
00:04:29.980 | that you fell in love with programming?
00:04:31.660 | You said like, this is really something special.
00:04:34.660 | - It really was something
00:04:36.020 | that was one of those love at first sight things
00:04:38.100 | where just really from the time that I understood
00:04:40.740 | what a computer was, even,
00:04:42.700 | I mean, I remember looking through old encyclopedias
00:04:45.460 | at the black and white photos of the IBM mainframes
00:04:48.620 | at the reel to reel tape decks.
00:04:50.260 | And for people nowadays,
00:04:52.380 | it can be a little hard to understand
00:04:53.860 | what the world was like then from information gathering,
00:04:56.340 | where I would go to the libraries
00:04:58.740 | and there would be a couple books on the shelf
00:05:01.140 | about computers and they would be very out of date
00:05:03.940 | even at that point, just not a lot of information,
00:05:06.700 | but I would grab everything that I could find
00:05:09.220 | and devour everything.
00:05:10.620 | Whenever Time or Newsweek had some article about computers,
00:05:14.260 | I would like cut it out with scissors and put it somewhere.
00:05:17.020 | It just, it felt like this magical thing to me,
00:05:19.980 | this idea that the computer would just do
00:05:22.780 | exactly what you told it to.
00:05:24.500 | I mean, and there's a little bit of the genie monkey's paw
00:05:26.700 | sort of issues there
00:05:27.740 | where you'd better be really, really careful
00:05:29.780 | with what you're telling it to do,
00:05:31.100 | but it wasn't gonna backtalk you.
00:05:33.140 | It wasn't gonna have a different point of view.
00:05:34.780 | It was gonna carry out what you told it to do.
00:05:37.460 | And if you had the right commands,
00:05:39.860 | you could make it do these pretty magical things.
00:05:43.580 | - And so what kind of programs did you write at first?
00:05:46.020 | So beyond the "print John Carmack."
00:05:48.300 | - So I can remember as going through the learning process
00:05:51.820 | where you find at the start,
00:05:53.740 | you're just learning how to do
00:05:54.620 | the most basic possible things.
00:05:56.420 | And I can remember stuff like a Superman comic
00:05:59.860 | that Radio Shack commissioned to have,
00:06:02.620 | it's like Superman had lost some of his super brain
00:06:04.820 | and kids had to use Radio Shack TRS-80 computers
00:06:07.500 | to do calculations for it,
00:06:09.060 | to help him kind of complete his heroics.
00:06:11.940 | And I'd find little things like that
00:06:15.260 | and then get a few basic books
00:06:17.340 | to be able to kind of work my way up.
00:06:20.140 | And again, it was so precious back then.
00:06:22.460 | I had a couple books
00:06:23.660 | that would teach me important things about it.
00:06:25.940 | I had one book that I could start to learn
00:06:28.660 | a little bit of assembly language from,
00:06:30.340 | and I'd have a few books on BASIC
00:06:32.020 | and some things that I could get from the libraries.
00:06:34.620 | But my goals in the early days
00:06:37.180 | was almost always making games of various kinds.
00:06:39.900 | I loved the arcade games and the early Atari 2600 games,
00:06:44.900 | and being able to do some of those things myself
00:06:47.700 | on the computers was very much what I aspired to.
00:06:51.340 | And it was a whole journey where if you learn normal BASIC,
00:06:54.340 | you can't do any kind of an action game.
00:06:56.060 | You can write an adventure game.
00:06:57.380 | You can write things where you say,
00:06:59.100 | "What do you do here?
00:07:00.300 | I get sword, attack troll," that type of thing.
00:07:03.580 | And that can be done in the context of BASIC,
00:07:07.020 | but to do things that had moving graphics,
00:07:09.500 | there were only the most limited things you could possibly do.
00:07:12.220 | You could maybe do Breakout or Pong
00:07:14.020 | or that sort of thing in low-resolution graphics.
00:07:16.940 | And in fact, one of my first sort of major technical hacks
00:07:20.700 | that I was kind of fond of was on the Apple II computers,
00:07:25.380 | they had a mode called low-resolution graphics,
00:07:28.780 | where, of course, all graphics were low-resolution back then,
00:07:31.380 | but regular low-resolution graphics,
00:07:34.060 | it was a grid of 40 by 40 pixels normally,
00:07:37.140 | but they could have 16 different colors.
00:07:39.460 | And I wanted to make a game kind of like the arcade game
00:07:43.300 | Vanguard, just a scrolling game,
00:07:44.900 | and I wanted to just kind of have it scroll vertically up.
00:07:47.580 | And I could move a little ship around.
00:07:49.260 | You could manage to do that in BASIC,
00:07:50.740 | but there's no way you could redraw the whole screen.
00:07:53.580 | And I remember at the time just coming up
00:07:56.020 | with what felt like a brainstorm to me,
00:07:58.140 | where I knew enough about the way the hardware was controlled,
00:08:02.020 | where the text screen and the low-resolution graphics screen
00:08:05.100 | were basically the same thing.
00:08:06.900 | And all those computers could scroll
00:08:08.660 | their text screen reasonably.
00:08:10.140 | You could do a listing, and it would scroll things up.
00:08:12.700 | And I figured out that I could kind of tweak
00:08:15.820 | just a couple things that I barely understood
00:08:17.980 | to put it into a graphics mode, and I could draw graphics,
00:08:20.820 | and then I could just do a line feed
00:08:22.820 | at the very bottom of the screen,
00:08:24.340 | and then the system would scroll it all up
00:08:26.220 | using an assembly language routine
00:08:27.980 | that I didn't know how to write back then.
00:08:30.180 | So that was like this first great hack
00:08:33.260 | that sort of had analogs later on in my career
00:08:35.940 | for a lot of different things.
00:08:37.060 | So I found out that I could draw a screen,
00:08:39.380 | I could do a line feed at the bottom,
00:08:40.820 | it would scroll it up once,
00:08:42.100 | I could draw a couple more lines of stuff at the bottom.
00:08:44.580 | And that was my first way to kind of scroll the screen,
00:08:47.780 | which was interesting in that that played a big part
00:08:51.100 | later on in the id Software days as well.
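A rough sketch of that idea in C (not Apple II code): shift the rows already on screen with one bulk move and draw only the newly exposed bottom row, rather than repainting everything. The 40-by-40 grid matches the lo-res mode described above; everything else is made up for illustration:

```c
#include <string.h>

#define SCREEN_W 40
#define SCREEN_H 40

static unsigned char screen[SCREEN_H][SCREEN_W];  /* one byte per lo-res pixel */

/* Scroll the whole screen up one row, the way the text line feed did,
 * so only the bottom row needs fresh drawing on each step. */
void scroll_up_one_row(void) {
    memmove(screen[0], screen[1], (SCREEN_H - 1) * SCREEN_W);
    memset(screen[SCREEN_H - 1], 0, SCREEN_W);    /* blank the new bottom row */
}
```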
00:08:53.220 | - So do efficient drawing,
00:08:56.180 | where you don't have to draw the whole screen,
00:08:59.180 | but you draw from the bottom using the thing
00:09:01.860 | that was designed in the hardware for text output.
00:09:04.500 | - Yeah, where so much of until recently,
00:09:07.940 | game design was limited by
00:09:11.220 | what you could actually get the computer to do,
00:09:13.220 | where it's easy to say like,
00:09:14.700 | "Okay, I wanna scroll the screen."
00:09:15.980 | You just redraw the entire screen at a slight offset.
00:09:19.460 | And nowadays that works just fine.
00:09:21.580 | Computers are ludicrously fast.
00:09:24.740 | But up until a decade ago or so,
00:09:27.900 | there were all these things everybody wanted to do,
00:09:30.180 | but if they knew enough programming
00:09:31.900 | to be able to make it happen,
00:09:33.540 | it would happen too slow to be a good experience,
00:09:36.500 | either just ridiculously slow or just slow enough
00:09:39.340 | that it wasn't fun to experience it like that.
00:09:42.100 | So, so much of kind of the first couple decades
00:09:44.580 | of the programming work that I did
00:09:46.220 | was largely figuring out how to do something
00:09:48.700 | that everybody knows how they want it to happen.
00:09:51.460 | It just has to happen two to 10 times faster
00:09:54.260 | than sort of the straightforward way
00:09:56.300 | of doing things would make it happen.
00:09:58.660 | And it's different now because at this point,
00:10:01.900 | lots of things you can just do
00:10:03.260 | in the most naive possible way,
00:10:05.020 | and it still works out.
00:10:06.540 | You don't have nearly the creative limitations
00:10:09.500 | or the incentives for optimizing on that level.
00:10:12.580 | And there's a lot of pros and cons to that,
00:10:14.420 | but I do generally,
00:10:16.100 | I'm not gonna do the angry old man
00:10:18.380 | shaking my fist at the clouds bit,
00:10:20.020 | where back in my day,
00:10:21.020 | programmers had to do real programming.
00:10:23.180 | It's amazing that you can just kind of pick an idea
00:10:26.700 | and go do it right now.
00:10:27.860 | And you don't have to be some assembly language wizard
00:10:30.580 | or deep GPU arcanist to be able to figure out
00:10:33.900 | how to make your wishes happen.
00:10:35.220 | - Well, there's still, see, that's true,
00:10:38.460 | but let me put on my old man with a fist hat
00:10:41.220 | and say that probably the thing that will define the future
00:10:45.260 | still requires you to operate it
00:10:47.940 | at the limits of the current system.
00:10:49.700 | So we'll probably talk about this,
00:10:51.860 | but if you talk about building the metaverse
00:10:54.460 | and building a VR experience that's compelling,
00:10:57.140 | it probably requires you to go to assembly,
00:11:01.300 | or maybe not literally, but sort of spiritually
00:11:05.860 | to go to the limits of what the system is capable of.
00:11:08.340 | - Yeah, and that really was why virtual reality
00:11:10.820 | was specifically interesting to me,
00:11:13.660 | where it had all the ties to,
00:11:14.980 | you could say that even back in the early days,
00:11:17.100 | I have some old magazine articles
00:11:19.180 | talking about Doom as a virtual reality experience
00:11:22.140 | back when just seeing anything in 3D.
00:11:25.180 | So you could say that we've been trying to build
00:11:27.300 | those virtual experiences from the very beginning.
00:11:29.620 | And in the modern era of virtual reality,
00:11:32.860 | especially on the mobile side of things,
00:11:34.820 | when it's standalone,
00:11:35.820 | you're basically using a cell phone chip
00:11:37.900 | to be able to produce these very immersive experiences.
00:11:41.820 | It does require work.
00:11:43.660 | It's not at the level of what an old school
00:11:45.700 | console game programmer would have operated at,
00:11:48.500 | where you're looking at hardware registers
00:11:50.300 | and you're scheduling all the DMA accesses,
00:11:53.500 | but it is still definitely a different level
00:11:55.500 | than what a web developer or even a PC Steam game developer
00:12:00.180 | usually has to work at.
00:12:01.780 | And again, it's great.
00:12:02.820 | There's opportunities for people that wanna operate
00:12:04.940 | at either end of that spectrum there
00:12:06.420 | and still provide a lot of value to the world.
00:12:08.700 | - Let me ask you sort of a big question about preference.
00:12:14.540 | What would you say is the best programming language?
00:12:19.940 | Your favorite, but also the best.
00:12:22.580 | You've seen throughout your career,
00:12:25.140 | you're considered by many to be the greatest programmer ever.
00:12:29.420 | I mean, it's so difficult to place that label on anyone,
00:12:32.300 | but if you put it on anyone, it's you.
00:12:33.980 | So let me ask you these kind of ridiculous questions
00:12:36.380 | of what's the best band of all time,
00:12:38.740 | but in your case, what's the best programming language?
00:12:41.500 | - Everything has all the caveats about it.
00:12:43.540 | But so what I use, so nowadays I do program
00:12:47.500 | a reasonable amount of Python for AI/ML sorts of work.
00:12:51.980 | I'm not a native Python programmer.
00:12:54.420 | It's something I came to very late in my career.
00:12:56.980 | I understand what it's good for.
00:12:58.820 | - But you don't dream in Python.
00:13:00.100 | - I do not.
00:13:00.940 | And it has some of those things
00:13:02.380 | where there's some amazing stats when you say,
00:13:04.820 | if you just start, if you make a loop,
00:13:06.900 | you know, a triply nested loop
00:13:08.500 | and start doing operations in Python,
00:13:10.980 | you can be thousands to potentially a million times slower
00:13:14.860 | than a proper GPU tensor operation.
00:13:17.580 | And these are staggering numbers.
00:13:19.580 | You know, you can be slower by as much
00:13:21.180 | as we've almost gotten faster in our pace of progress
00:13:25.220 | and all this other miraculous stuff.
00:13:26.940 | - So your intuitions about inefficiencies
00:13:28.820 | within the Python sort of-
00:13:30.220 | - It keeps hitting me upside the face
00:13:32.100 | where it's gotten to the point now I understand.
00:13:34.140 | It's like, okay, you just can't do a loop
00:13:35.860 | if you care about performance in Python.
00:13:38.460 | You have to figure out how you can reformat this
00:13:41.380 | into some big vector operation
00:13:43.100 | or something that's going to be done
00:13:44.420 | completely within a C++ library.
00:13:47.060 | But the other hand is it's amazingly convenient.
00:13:49.900 | And you just see stuff
00:13:51.220 | that people are able to cobble together by,
00:13:53.660 | you just import a few different things
00:13:55.500 | and you can do stuff that nobody on earth
00:13:57.700 | could do 10 years ago.
00:13:58.900 | And you can do it in a little cookbook thing
00:14:00.500 | that you copy pasted out of a website.
00:14:02.820 | So that is really great.
00:14:04.820 | When I'm sitting down to do
00:14:06.100 | what I consider kind of serious programming,
00:14:08.500 | it's still in C++
00:14:10.420 | and it's really kind of a C-flavored C++ at that
00:14:13.940 | where I'm not big into the modern template
00:14:17.180 | metaprogramming sorts of things.
00:14:18.780 | I see a lot of train wrecks coming from
00:14:21.300 | some of that over abstraction.
00:14:23.500 | I spent a few years really going kind of deep
00:14:26.780 | into the kind of the historical Lisp work
00:14:30.140 | and Haskell and some of the functional
00:14:31.780 | programming sides of things.
00:14:33.620 | And there is a lot of value there
00:14:37.060 | in the way you think about things.
00:14:38.700 | And I changed a lot of the way I write my C and C++ code
00:14:42.140 | based on what I learned about the value
00:14:44.980 | that comes out of not having this random mutable state
00:14:48.060 | that you kind of lose track of.
00:14:50.260 | Because something that many people
00:14:52.500 | don't really appreciate
00:14:53.660 | till they've been at it for a long time
00:14:55.500 | is that it's not the writing of the program initially,
00:14:58.260 | it's the whole lifespan of the program.
00:15:00.460 | And that's when it's not necessarily
00:15:02.420 | just how fast you wrote it or how fast it operates,
00:15:05.500 | but it's how can it bend and adapt as situations change.
00:15:09.300 | And then the thing that I've really been learning
00:15:11.140 | in my time at Meta with the Oculus and VR work
00:15:14.700 | is it's also how well it hands off
00:15:16.940 | between a continuous kind of revolving door
00:15:19.300 | of programmers taking over maintenance
00:15:21.060 | and different things,
00:15:21.940 | and how you get people up to speed in different areas.
00:15:24.580 | And there's all these other different aspects of it.
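A tiny sketch of the functional-leaning C style alluded to above: inputs arrive as parameters, results leave as return values, and there is no hidden state to lose track of. The types and names here are hypothetical:

```c
typedef struct { float x, y; } Vec2;

/* Pure function: the result depends only on the arguments,
 * which makes it easy to reason about, test, and reuse. */
Vec2 vec2_add(Vec2 a, Vec2 b) {
    return (Vec2){ a.x + b.x, a.y + b.y };
}

/* The style this replaces: a global accumulator that any code anywhere
 * might read or write, i.e. the "random mutable state" in question. */
Vec2 g_position;
void move_by(Vec2 delta) {
    g_position.x += delta.x;
    g_position.y += delta.y;
}
```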
00:15:27.140 | - Is C++ a good language for handover between engineers?
00:15:32.140 | - Probably not the best.
00:15:34.820 | And there's some really interesting aspects to this
00:15:36.900 | where in some cases,
00:15:38.620 | languages that are not generally thought well of
00:15:42.580 | for many reasons,
00:15:43.500 | like C is derided pretty broadly,
00:15:45.660 | that yes, obviously all of these security flaws
00:15:48.180 | that happen with the memory unsafeness
00:15:50.420 | and buffer overruns and the things that you've got there,
00:15:53.940 | but there is this underappreciated aspect:
00:15:56.700 | the language is so simple
00:15:58.500 | that if you know C,
00:16:01.380 | you can generally jump in someplace
00:16:03.100 | and not have to learn what paradigms they're using
00:16:06.180 | because there just aren't that many available.
00:16:08.340 | And there's some really,
00:16:09.460 | really well-written C code.
00:16:12.460 | I find it great that if I'm messing around
00:16:15.500 | with something in OpenBSD say,
00:16:17.140 | I mean, I can be walking around in the kernel
00:16:19.340 | and I'm like, I understand everything that's going on here.
00:16:22.140 | It's not hard for me to figure out
00:16:25.100 | what I need to do to make whatever change I need to.
00:16:29.780 | While you can have more significant languages,
00:16:32.820 | like it's a downside of Lisp
00:16:34.820 | where I don't regret the time that I spent with Lisp.
00:16:37.340 | I think that it did help
00:16:40.100 | my thinking about programming in some ways.
00:16:42.780 | But the people that are the biggest defenders of Lisp
00:16:45.420 | are saying how malleable of a language it is,
00:16:47.820 | that if you write a huge Lisp program,
00:16:50.100 | you've basically invented your own kind of language
00:16:53.140 | and structure because it's not the primitives
00:16:55.460 | of the language you're using very much.
00:16:57.020 | It's all of the things you've built on top of that.
00:16:59.460 | And then a language like Racket,
00:17:01.020 | kind of one of the more modern Lisp versions,
00:17:03.380 | it's essentially touted as a language
00:17:05.340 | for building other languages.
00:17:07.020 | And I understand the value of that
00:17:10.060 | for a tiny little project,
00:17:12.420 | but the idea of that for one of these long-term supported
00:17:15.340 | by lots of people kind of horrifies me
00:17:17.980 | where all of those abstractions that you're like,
00:17:20.460 | okay, you can't touch this code
00:17:22.100 | till you educate yourself on all of these things
00:17:25.100 | that we've built on top of that.
00:17:26.860 | And it was interesting to see how when Google made Go,
00:17:30.420 | a lot of the criticisms of that are, it's like,
00:17:33.140 | wow, this is not a state-of-the-art language.
00:17:35.020 | This language is just so simple and almost crude.
00:17:38.100 | And you could see the programming language people
00:17:40.500 | just looking down at it.
00:17:41.980 | But it does seem to be quite popular as basically saying,
00:17:45.380 | this is the good things about C.
00:17:47.260 | Everybody can just jump right in and use it.
00:17:49.420 | You don't need to restructure your brain
00:17:52.100 | to write good code in it.
00:17:53.900 | So I wish that I had more opportunity
00:17:56.340 | for doing some work in Go.
00:17:59.060 | Rust is the other modern language
00:18:01.020 | that everybody talks about
00:18:02.180 | that I'm not fit to pass judgment on.
00:18:04.260 | I've done a little bit beyond "Hello, World."
00:18:06.620 | I wrote some video decompression work in Rust
00:18:09.980 | just as an exercise, but that was a few years ago,
00:18:12.740 | and I haven't really used it since.
00:18:15.100 | The best programming language is generally the one that works,
00:18:17.660 | the one that you're currently using,
00:18:19.220 | because that's another trap: in almost every case
00:18:22.300 | I've seen, when people mixed languages on a project,
00:18:25.060 | that's a mistake.
00:18:26.100 | I would rather stay just in one language
00:18:29.020 | so that everybody can work across the entire thing.
00:18:31.740 | And we have -- Like, at Meta, we have a lot of projects
00:18:34.340 | that use kind of React frameworks.
00:18:36.340 | So you've got JavaScript here,
00:18:37.860 | and then you have C++ for real work,
00:18:40.900 | and then you may have Java interfacing
00:18:42.660 | with some other part of the Android system.
00:18:44.860 | And those are all kind of horrible things.
00:18:46.980 | And that was one thing that I remember talking with Boz
00:18:52.460 | at Facebook about it, where, like, man,
00:18:54.860 | I wish we could have just said,
00:18:55.980 | "We're only hiring C++ programmers."
00:18:59.020 | And he just thought, from the Facebook Meta perspective,
00:19:02.660 | "Well, we just wouldn't be able to find enough."
00:19:05.180 | You know, with the thousands of programmers
00:19:07.180 | they've got there, it is not necessarily a dying breed,
00:19:10.820 | but you can sure find a lot more Java or JavaScript programmers.
00:19:14.940 | And I kind of mentioned that to Elon one time,
00:19:17.860 | and he was kind of flabbergasted about that.
00:19:21.100 | It's like, "Well, you just go out
00:19:22.620 | and you find those programmers,
00:19:23.900 | and you don't hire the other programmers
00:19:25.460 | that don't do the languages that you want to use."
00:19:28.380 | But right now, I guess, yeah, they're using JavaScript
00:19:30.300 | on a bunch of the SpaceX work for the UI side of things.
00:19:33.700 | When you go find UI programmers,
00:19:35.380 | they're JavaScript programmers.
00:19:36.900 | -I wonder if that's because there's a lot
00:19:38.340 | of JavaScript programmers,
00:19:39.700 | because I do think that great programmers are rare.
00:19:44.700 | That it's not, you know, if you just look at statistics
00:19:48.340 | of how many people are using different programming languages,
00:19:51.180 | that doesn't tell you the story
00:19:53.260 | of what the great programmers are using.
00:19:55.900 | And so you have to really look at what you were speaking to,
00:19:59.140 | which is the fundamentals of a language.
00:20:00.780 | What does it encourage you? How does it encourage you to think?
00:20:03.300 | What kind of systems does it encourage you to build?
00:20:05.700 | There is something about C++ that has elements of creativity,
00:20:11.180 | but forces you to be an adult about your programming.
00:20:15.100 | -It expects you to be an adult.
00:20:16.580 | It does not force you to.
00:20:18.740 | -And so it brings out people
00:20:23.340 | that are willing to be creative
00:20:25.340 | in terms of building large systems
00:20:27.300 | and coming up with interesting solutions,
00:20:29.260 | but at the same time have sort of the good software
00:20:33.380 | engineering practices
00:20:35.540 | that lend themselves to real-world systems.
00:20:39.100 | Let me ask you about this other language, JavaScript.
00:20:43.460 | So if we, you know, aliens visit in thousands of years
00:20:48.020 | and humans are long gone,
00:20:50.100 | something tells me that most of the systems they find
00:20:53.420 | will be running JavaScript.
00:20:55.020 | I kind of think that if we're living in a simulation,
00:20:58.580 | it's written in JavaScript.
00:21:01.620 | You know, for the longest time, even still,
00:21:05.540 | JavaScript didn't get any respect,
00:21:08.140 | and yet it runs so much of the world
00:21:09.980 | and an increasing number of the world.
00:21:12.100 | Is it possible that everything
00:21:14.660 | will be written in JavaScript one day?
00:21:16.700 | -So the engineering under JavaScript
00:21:18.820 | is really pretty phenomenal.
00:21:21.300 | The systems that make JavaScript run as fast as it does right now
00:21:26.340 | are kind of miracles of modern engineering in many ways.
00:21:29.980 | It does feel like it is not an optimal language
00:21:34.420 | for all the things that it's being used for
00:21:36.380 | or an optimal distribution system
00:21:38.220 | to build huge apps in something like this
00:21:41.860 | without type systems and so on,
00:21:45.780 | but I think for a lot of people,
00:21:47.540 | it does reasonably the necessary things.
00:21:50.420 | It's still a C-flavored language.
00:21:52.300 | It's still a braces and semicolon language.
00:21:55.860 | It's not hard for people to be trained in JavaScript
00:21:59.420 | and then understand the roots of where it came from.
00:22:02.580 | I think garbage collection is unequivocally a good thing
00:22:07.100 | for most programs to be written in.
00:22:08.820 | It's funny that I still -- Just this morning,
00:22:10.940 | I was seeing a Twitter thread
00:22:13.140 | of a bunch of really senior game dev people
00:22:15.540 | arguing about the virtues and costs of garbage collection,
00:22:18.820 | and you will run into some people
00:22:19.940 | that are top-notch programmers that just say,
00:22:22.180 | "No, this is literally not a good thing."
00:22:24.580 | -Oh, because it makes you lazy?
00:22:25.820 | -Yes, that it makes you not think about things,
00:22:28.420 | and I do disagree.
00:22:29.620 | I think that there is so much objective data
00:22:33.340 | on the vulnerabilities that have happened
00:22:35.740 | in C and C++ programs,
00:22:37.780 | sometimes written by the best programmers in the world.
00:22:40.060 | It's like nobody is good enough to avoid
00:22:42.380 | ever shooting themselves in the foot with that.
00:22:44.140 | You write enough C code,
00:22:45.540 | you're going to shoot yourself in the foot.
00:22:47.620 | And garbage collection is a very great thing
00:22:49.980 | for the vast majority of programs.
00:22:51.700 | It's only when you get into the tightest of real-time things
00:22:54.700 | that you start saying it's like, "No, the garbage collection
00:22:57.300 | has more costs than it has benefits for me there,"
00:22:59.900 | but that's not 99-plus percent
00:23:02.540 | of all the software in the world.
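A minimal example of the foot-gun class in question; this compiles cleanly as C, and the bug is exactly what garbage-collected languages rule out by construction:

```c
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *name = malloc(16);
    if (!name) return 1;
    strcpy(name, "player one");
    free(name);        /* ownership ends here... */
    name[0] = 'P';     /* ...but the stale pointer still writes: a use-after-free,
                          undefined behavior, and a classic vulnerability class */
    return 0;
}
```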
00:23:04.820 | So JavaScript is not terrible in those ways,
00:23:09.340 | and so much of programming is not the language itself.
00:23:14.100 | It's the infrastructure around everyone that surrounds it,
00:23:17.340 | all the libraries that you can get
00:23:18.980 | and the different ways you can deploy it,
00:23:22.460 | the portability that it gives you.
00:23:24.260 | And JavaScript is really strong on a lot of those things,
00:23:27.020 | where for a long time, and it still does if I look at it,
00:23:31.100 | the web stack, everything that has to go on
00:23:33.940 | when you do something really trivial in JavaScript
00:23:36.380 | and it shows up on a web browser to kind of X-ray through that
00:23:40.300 | and see everything that has to happen
00:23:42.260 | for your one little JavaScript statement
00:23:44.660 | to turn into something visible in your web browser,
00:23:48.300 | it's very, very disquieting, just the depth of that stack
00:23:53.420 | and the fact that so few people can even comprehend
00:23:56.900 | all of the levels that are going on there.
00:23:59.500 | But it's, again, I have to caution myself
00:24:02.300 | to not be the in-the-good-old-days-old-man
00:24:04.980 | about it because clearly there's enormous value here.
00:24:08.780 | The world does run on JavaScript
00:24:11.220 | to a pretty good approximation there,
00:24:13.300 | and it's not falling apart.
00:24:14.780 | There's a bunch of scary stuff where you look at console logs
00:24:18.420 | and you just see all of these bad things that are happening,
00:24:20.980 | but it's still kind of limping along
00:24:22.580 | and nobody really notices.
00:24:24.820 | But so much of my systems design and systems analysis
00:24:30.020 | goes around you should understand
00:24:31.740 | what the speed of light is,
00:24:32.980 | like what would be the best you could possibly do here.
00:24:36.180 | And it sounds horrible, but in a lot of cases,
00:24:39.460 | you can be 1,000 times off your speed of light velocity
00:24:43.340 | for something and it still be okay.
00:24:45.580 | And in fact, it can even sometimes still be
00:24:48.420 | the optimal thing in a larger system standpoint
00:24:51.140 | where there's a lot of things
00:24:52.700 | that you don't want to have to parachute in someone like me
00:24:55.660 | to go in and say, "Make this web page run 1,000 times faster.
00:25:01.580 | Make this web app into a hardcore native application
00:25:05.340 | that starts up in 37 milliseconds
00:25:08.020 | and everything responds in less than one frame latency."
00:25:11.260 | That's just not necessary.
00:25:12.700 | And if somebody wants to go pay me millions of dollars
00:25:15.420 | to do software like that,
00:25:16.700 | when they can take somebody right out of a boot camp
00:25:19.060 | and say, "Spin up an application for this,"
00:25:21.980 | often being efficient is not really the best metric.
00:25:26.980 | And it's like that applies in a lot of areas
00:25:29.260 | where it's kind of interesting
00:25:30.980 | how a lot of our appliances and everything
00:25:33.340 | are all built around energy efficiency,
00:25:36.380 | sometimes at the expense of robustness in some other ways
00:25:39.380 | or higher costs in other ways,
00:25:41.140 | where there's interesting things where energy or electricity
00:25:44.740 | could become much cheaper in a future world,
00:25:47.020 | and that could change our engineering trade-offs
00:25:49.020 | for the way we build certain things,
00:25:50.500 | where you could throw away efficiency
00:25:53.100 | and actually get more benefits that actually matter.
00:25:56.260 | I mean, that's one of my, you know,
00:25:58.620 | one of the directions I was considering swerving into
00:26:00.940 | was nuclear energy when I was kind of like,
00:26:03.460 | "What do I want to do next?"
00:26:04.580 | It was either going to be cost-effective nuclear fission
00:26:08.140 | or artificial general intelligence.
00:26:10.540 | And one of my pet ideas there is, like, you know,
00:26:14.620 | people don't understand how cheap nuclear fuel is,
00:26:18.380 | and there would be ways that you could be a quarter
00:26:22.860 | the efficiency or less,
00:26:24.740 | but if it wound up making your plant 10 times cheaper,
00:26:27.700 | that could be a radical innovation in something like that.
00:26:31.220 | So there's, like, some of these thoughts around, like,
00:26:33.020 | direct fission energy conversion,
00:26:35.300 | you know, fission fragment conversion,
00:26:36.700 | that, you know, maybe you build something
00:26:38.420 | that doesn't require all the steam turbines and everything,
00:26:40.860 | even if it winds up being less efficient.
00:26:42.660 | So that applies a lot in programming,
00:26:45.020 | where it's always good to know what you could do
00:26:49.180 | if you really sat down and took it far,
00:26:51.980 | because sometimes there's discontinuities,
00:26:53.700 | like around user reaction times.
00:26:56.060 | There are some points where the difference
00:26:58.060 | between operating in 1 second and 750 milliseconds
00:27:02.900 | is not that huge.
00:27:03.900 | You'll see it in web page statistics,
00:27:05.700 | but most of the usability stuff, not that great.
00:27:08.260 | But if you get down to 50 milliseconds,
00:27:11.020 | then all of a sudden this just feels amazing.
00:27:12.980 | You know, it's just like doing your bidding instantly
00:27:15.500 | rather than you're giving it a command,
00:27:17.220 | twiddling your thumbs, waiting for it to respond.
00:27:19.740 | So sometimes it's important to really crunch hard
00:27:23.100 | to get over some threshold,
00:27:25.060 | but there are broad basins in the value metric
00:27:28.900 | for lots of work where it just doesn't pay
00:27:31.260 | to even go that extra mile.
00:27:33.020 | And there are craftsmen that just don't want to buy that,
00:27:36.340 | and more power to them.
00:27:37.700 | You know, if somebody just wants to say,
00:27:39.100 | no, I'm going to be, my pride is in my work,
00:27:42.740 | I'm never going to do something
00:27:44.060 | that's not as good as I could possibly make it,
00:27:46.380 | I respect that, and sometimes I am that person,
00:27:50.140 | but I try to focus more on the larger value picture,
00:27:53.860 | and you do pick your battles,
00:27:55.300 | and you deploy your resources into play
00:27:57.420 | that's going to give you
00:27:58.420 | sort of the best user value in the end.
00:28:00.740 | - Well, if you look at the evolution of life on Earth
00:28:04.660 | as a kind of programming effort,
00:28:08.620 | it seems like efficiency isn't the thing
00:28:13.300 | that's being optimized for.
00:28:15.140 | Like natural selection is very inefficient,
00:28:17.180 | but it kind of adapts,
00:28:20.660 | and through the process of adaptations,
00:28:22.420 | building more and more complex systems
00:28:24.260 | that are more and more intelligent,
00:28:25.380 | the final result is kind of pretty interesting.
00:28:28.260 | And so I think of JavaScript the same way.
00:28:30.940 | It's like this giant mess
00:28:33.020 | that things naturally die off if they don't work,
00:28:36.460 | and if they become useful to people,
00:28:39.420 | they kind of naturally live,
00:28:40.860 | and then you build this community,
00:28:42.860 | large community of people that are generating code,
00:28:45.900 | and some code is sticky, some is not,
00:28:48.260 | and nobody knows the inefficiencies or the efficiencies
00:28:52.460 | or the breaking points, like how reliable this code is,
00:28:54.900 | and you kind of just run it, assume it works,
00:28:57.780 | and then get unpleasantly surprised,
00:29:00.380 | and then that's very kind of the evolutionary process.
00:29:03.140 | - So that's a really good analogy,
00:29:04.940 | and we can go a lot of places with that,
00:29:06.820 | where in the earliest days of programming,
00:29:09.500 | when you had finite,
00:29:10.820 | you could count the bytes that you had to work on this,
00:29:13.060 | you had all the kind of hackers playing code golf
00:29:15.940 | to be one less instruction
00:29:17.380 | than the other person's multiply routine
00:29:19.580 | to kind of get through,
00:29:20.420 | and it was so perfectly crafted.
00:29:22.860 | It was a crystal piece of artwork when you had a program,
00:29:26.460 | because there just were not that many,
00:29:28.420 | you couldn't afford to be lazy in different ways,
00:29:31.380 | and in many ways, I see that as akin to the symbolic AI work
00:29:34.740 | where again, if you did not have the resources
00:29:37.860 | to just say, well, we're gonna do billions and billions
00:29:40.580 | of programmable weights here,
00:29:42.660 | you have to turn it down into something
00:29:44.460 | that is symbolic and crafted like that,
00:29:47.380 | but that's definitely not the way DNA and life
00:29:51.100 | and biological evolution and things work.
00:29:54.980 | On the one hand, it's almost humbling
00:29:58.340 | how little programming code is in our bodies.
00:30:00.900 | We've got a couple of billion base pairs,
00:30:02.500 | and it's like this all fits on a thumb drive for years now,
00:30:05.660 | and then our brains are even a smaller section of that.
00:30:08.340 | You've got maybe 50 megabytes,
00:30:09.980 | and this is not like Shannon limit,
00:30:12.540 | perfectly information dense conveyances here.
00:30:17.460 | It's like, these are messy codes.
00:30:19.460 | They're broken up into amino acids.
00:30:21.500 | A lot of them don't do important things
00:30:23.940 | or they do things in very awkward ways,
00:30:26.340 | but it is this process of just accumulation
00:30:29.140 | on top of things, and you need scale.
00:30:33.300 | You need scale both for sort of the population
00:30:35.700 | for that to work out, and in the early days,
00:30:38.060 | in the '50s and '60s,
00:30:39.860 | the kind of ancient era of computers
00:30:42.620 | where you could count when they said,
00:30:44.380 | like when the internet started, even in the '70s,
00:30:46.220 | there were like 18 hosts or something on it.
00:30:48.140 | It was this small, finite number,
00:30:50.420 | and you were still optimizing everything
00:30:52.220 | to be as good as you possibly could be,
00:30:54.540 | but now it's billions and billions of devices
00:30:57.700 | and everything going on,
00:30:59.340 | and you can have this very much natural evolution going on
00:31:03.540 | where lots of things are tried,
00:31:05.940 | lots of things are blowing up.
00:31:07.340 | Venture capitalists lose their money
00:31:09.380 | when a startup invested in the wrong tech stack
00:31:11.780 | and things completely failed or failed to scale,
00:31:14.780 | but good things do come out of it,
00:31:17.380 | and it's interesting to see the mimetic evolution
00:31:20.980 | of the way different things happen,
00:31:22.620 | like mentioning "Hello, World!" at the beginning.
00:31:24.380 | It's funny how some little thing like that
00:31:26.420 | where every programmer knows "Hello, World!" now,
00:31:29.580 | and that was a completely arbitrary sort of decision
00:31:32.100 | that just came out of the dominance of Unix
00:31:35.060 | and C and early examples of things like that.
00:31:38.420 | So millions of experiments are going on all the time,
00:31:42.300 | but some things do kind of rise to the top
00:31:44.780 | and win the fitness war, whether it's mind space
00:31:48.180 | or programming techniques or anything.
00:31:50.380 | Like there's a site on Stack Exchange called Code Golf
00:31:54.380 | where people compete to write the shortest possible program
00:31:57.940 | for a particular task
00:31:59.340 | in all the different kinds of languages,
00:32:01.340 | and it's really interesting to see folks,
00:32:06.180 | kind of the masters of their craft, really play
00:32:11.620 | with the limits of programming languages.
00:32:13.460 | It's really beautiful to see,
00:32:14.580 | and across all the different programming languages,
00:32:17.060 | you get to see some of these weird programming languages
00:32:20.620 | and mainstream ones,
00:32:22.580 | difference between Python 2 and 3.
00:32:25.100 | You get to see the difference between C and C++ and Java,
00:32:27.700 | and you get to see JavaScript, all of that,
00:32:30.020 | and it's kind of inspiring to see
00:32:33.020 | how much depth of possibility
00:32:39.100 | there is within programming languages
00:32:41.060 | that Code Golf kind of tasks reveal.
00:32:44.180 | Most of us, if you do any kind of programming,
00:32:46.580 | you kind of do boring, kind of very vanilla type of code.
00:32:51.140 | That's the way to build large systems,
00:32:52.860 | but it's nice to see that the possibility
00:32:54.740 | of creative genius is still within those languages.
00:32:57.660 | It's laden within those languages.
00:33:00.140 | So given that you are, once again,
00:33:03.220 | one of the greatest programmers ever,
00:33:05.740 | what do you think makes a good programmer?
00:33:08.340 | Maybe a good modern programmer.
00:33:11.020 | - So I just gave a long rant/lecture at Meta
00:33:15.700 | to the TPM organization,
00:33:17.660 | and my biggest point was everything that we're doing
00:33:22.060 | really should flow from user value.
00:33:24.660 | All the good things that we're doing.
00:33:26.140 | It's like we're not technical people.
00:33:28.420 | It's like you shouldn't be taking pride
00:33:30.980 | just in the specific thing.
00:33:32.100 | Like, Code Golf is the sort of thing,
00:33:33.580 | it's a fun puzzle game,
00:33:34.860 | but that really should not be a major motivator for you.
00:33:38.140 | It's like we're solving problems for people
00:33:40.260 | or we're providing entertainment to people.
00:33:41.980 | We're doing something of value to people
00:33:44.140 | that's displacing something else in their life.
00:33:46.220 | So we want to be providing a net value
00:33:48.980 | over what they could be doing,
00:33:51.340 | but instead they're choosing to use our products.
00:33:53.540 | And that's where, I mean, it sounds trite or corny,
00:33:56.620 | but I fundamentally do think
00:33:58.460 | that's how you make the world a better place.
00:34:00.340 | If you have given more value to people
00:34:02.900 | than it took you and your team to create,
00:34:05.740 | then the world's a better place.
00:34:07.540 | People have, they've gone from something of lesser value,
00:34:10.500 | chosen to use your product,
00:34:11.980 | and their life feels better for that.
00:34:13.860 | And if you've produced that economically,
00:34:16.260 | that's a really good thing.
00:34:18.460 | On the other hand, if you spent ridiculous amounts of money,
00:34:22.780 | you've just kind of shoveled a lot of cash
00:34:24.300 | into a wood chipper there,
00:34:25.700 | and you should maybe not feel so good
00:34:28.140 | about what you're doing.
00:34:29.700 | So being proud about like a specific architecture
00:34:33.540 | or a specific technology
00:34:34.940 | or a specific code sequence that you've done,
00:34:37.580 | it's great to get a little smile,
00:34:39.140 | like a tiny little dopamine hit for that,
00:34:41.100 | but the top level metric should be
00:34:43.300 | that you're building things of value.
00:34:45.380 | Now you can get into the argument about how you,
00:34:47.820 | you know, what is user value?
00:34:49.300 | How do you actually quantify that?
00:34:51.180 | And there can be big arguments about that,
00:34:53.300 | but it's easy to be able to say,
00:34:55.060 | okay, this pissed off user there
00:34:56.620 | is not getting value from what you're doing.
00:34:59.020 | This user over there with the big smile on their face,
00:35:01.340 | in the moment of delight when something happened.
00:35:04.180 | There's a value that's happened there.
00:35:05.860 | I mean, if you, you have to at least accept
00:35:07.700 | that there is a concept of user value,
00:35:09.780 | even if you have trouble exactly quantifying it,
00:35:12.820 | you can usually make relative arguments about it.
00:35:15.420 | Well, this was better than this.
00:35:17.220 | We've improved things.
00:35:18.820 | So, you know, being a servant to the user is your job
00:35:23.660 | when you're a developer.
00:35:25.100 | You want to be producing something that, you know,
00:35:27.700 | other people are going to find valuable.
00:35:29.940 | And if you are technically inclined,
00:35:32.500 | then finding the right levers to be able to pull,
00:35:35.700 | to be able to make a design
00:35:37.740 | that's going to produce the most value
00:35:39.940 | for the least amount of effort.
00:35:41.660 | And it always has to be kind of a divide.
00:35:43.940 | There's a ratio there.
00:35:45.940 | It's a problem at the big tech companies,
00:35:47.820 | you know, whether it's, you know,
00:35:49.100 | Meta, Google, Apple, Microsoft, Amazon,
00:35:52.180 | companies that have almost infinite money.
00:35:54.700 | I mean, I know their CFO will complain
00:35:57.660 | that it's not infinite money,
00:35:58.780 | but for most developer standpoints,
00:36:00.620 | it really does feel like it.
00:36:02.700 | And it's almost counterintuitive
00:36:04.860 | that if you're working hard as a developer on something,
00:36:07.900 | there's always this thought, if only I had more resources,
00:36:11.020 | more people, more RAM, more megahertz,
00:36:14.300 | then my product will be better.
00:36:16.340 | And that sense that at certain points,
00:36:18.660 | it's certainly true that if you are really hamstrung by this,
00:36:22.020 | removing an obstacle will make a better product,
00:36:25.140 | make more value.
00:36:26.940 | But if you're not making your core design decisions
00:36:30.060 | in this fiercely competitive way,
00:36:33.020 | where you're saying feature A or feature B,
00:36:35.740 | you can't just say, let's do both,
00:36:38.660 | because then you're not making a value judgment about them.
00:36:41.100 | You're just saying, well, they both seem good.
00:36:42.940 | I don't want to necessarily have to pick out
00:36:45.180 | which one is better or how much better
00:36:47.420 | and tell team B that, sorry, we're not gonna do this
00:36:50.980 | because A is more important.
00:36:53.100 | But that notion of always having
00:36:55.780 | to really critically value what you're doing,
00:36:57.820 | your time, the resources you expend,
00:36:59.660 | even the opportunity cost of doing something else,
00:37:02.580 | that's, you know, super important.
00:37:04.820 | - Well, let me ask you about this,
00:37:06.180 | the big debates that you're mentioning
00:37:08.340 | of how to measure value.
00:37:09.980 | Is it possible to measure it kind of numerically,
00:37:15.900 | or can you do the sort of Jony Ive,
00:37:22.140 | the designer route of imagining
00:37:25.140 | sort of somebody using a thing
00:37:28.740 | and imagining a smile on their face,
00:37:30.860 | imagining the experience of love and joy
00:37:34.020 | that you have when you use the thing?
00:37:36.020 | That's from a design perspective.
00:37:37.300 | Or if you're building more like a lower level thing
00:37:40.460 | for like Linux, you imagine a developer
00:37:43.420 | that might come across this and use it
00:37:45.420 | and become happy and better off because of it.
00:37:50.420 | So where do you land on those things?
00:37:52.620 | Is it measurable?
00:37:53.660 | So I imagine like Meta and Google
00:37:57.300 | will probably try to measure the thing.
00:37:58.900 | They'll try to, it's like you try to optimize engagement
00:38:01.500 | or something, let's measure engagement.
00:38:03.300 | And then I think there is a kind of,
00:38:05.300 | I mean, I admire the designer ethic of like,
00:38:09.380 | think of a future that's immeasurable.
00:38:12.980 | And you try to make somebody in that future
00:38:15.300 | that's different from today happy.
00:38:17.340 | - So I do usually favor,
00:38:20.420 | if you can get any kind of a metric that's good,
00:38:23.780 | by all means, listen to the data.
00:38:25.860 | But you can go too far there where we've had problems
00:38:28.500 | where it's like, hey, we had a performance regression
00:38:30.620 | because our fancy new telemetry system
00:38:33.260 | is doing a bazillion file writes
00:38:35.980 | to kind of archive this stuff
00:38:37.180 | because we needed to collect information
00:38:38.740 | to determine if what we were doing,
00:38:40.580 | if our plans were good.
00:38:42.620 | So when information is available,
00:38:45.540 | you should never ignore it.
00:38:46.700 | I mean, all the- - From actual users
00:38:48.260 | using the thing, human beings using the thing,
00:38:51.380 | large number of human beings,
00:38:52.860 | and you get to see sort of-
00:38:54.300 | - So there's the zero to one problem
00:38:55.900 | of when you're doing something really new,
00:38:57.540 | you do kind of have to make a guess.
00:38:59.260 | But one of the points that I've been making at Meta
00:39:01.980 | is we have more than enough users now
00:39:05.140 | that anything somebody wants to try in VR,
00:39:08.420 | we have users that will be interested in that.
00:39:10.780 | You do not get to make a completely
00:39:13.060 | green field, blue sky pitch and say,
00:39:15.780 | I'm going to do this
00:39:16.900 | because I think it might be interesting.
00:39:18.660 | I challenge everyone.
00:39:20.100 | There are going to be people,
00:39:21.620 | whether it's working in VR on your,
00:39:24.060 | like on your desktop replacement
00:39:26.860 | or communicating with people in different ways
00:39:29.540 | or playing the games,
00:39:31.620 | there are going to be probably millions of people,
00:39:35.140 | or at least if you pick some tiny niche
00:39:37.180 | that we're not in right now,
00:39:38.660 | there's still going to be thousands of people out there
00:39:40.820 | that have the headsets that would be your target market.
00:39:43.860 | And I tell people, pay attention to them.
00:39:46.060 | Don't invent fictional users.
00:39:47.700 | Don't make an Alice, Bob, Charlie
00:39:50.220 | that fits whatever matrix of tendencies
00:39:53.580 | that you want to break the market down to
00:39:55.380 | because it's a mistake to think about imaginary users
00:39:58.300 | when you've got real users that you could be working with.
00:40:01.900 | But on the other hand,
00:40:03.060 | there is value to having a kind of wholeness of vision
00:40:08.060 | for a product.
00:40:09.500 | And companies like Meta have,
00:40:13.660 | they understand the trade-offs
00:40:15.260 | where you can have a company like SpaceX
00:40:17.540 | or Apple in the Steve Jobs era
00:40:20.220 | where you have a very powerful leading personality
00:40:23.340 | that can micromanage at a very low level
00:40:26.700 | and can say, it's like, no,
00:40:27.940 | that handle needs to be different
00:40:29.380 | or that icon needs to change the tint there.
00:40:32.740 | And they clearly get a lot of value out of it.
00:40:34.980 | They also burn through a lot of employees
00:40:37.220 | that have horror stories to tell
00:40:39.020 | about working there afterwards.
00:40:41.700 | My position is that you're at your best
00:40:45.140 | when you've got a leader that is at their limit
00:40:48.140 | of what they can kind of comprehend
00:40:49.740 | of everything below them.
00:40:51.460 | And they can have an informed opinion
00:40:53.460 | about everything that's going on.
00:40:55.380 | And you take somebody,
00:40:56.580 | you've got to believe that somebody
00:40:57.980 | that has 30, 40 years of experience,
00:41:01.140 | you would hope that they've got wisdom
00:41:02.740 | that the just out of bootcamp person
00:41:05.580 | contributing doesn't have.
00:41:07.060 | And that if they're like, well, that's wrong there,
00:41:09.180 | you probably shouldn't do it that way
00:41:10.980 | or even just don't do it that way, do it another way.
00:41:14.140 | So there's value there,
00:41:15.540 | but it can't go beyond a certain level.
00:41:17.900 | I mean, I have Steve Jobs stories
00:41:20.420 | of him saying things that are just wrong
00:41:22.220 | right in front of me about technical things
00:41:24.140 | because he was not operating at that level.
00:41:26.900 | But when it does work
00:41:29.700 | and you do get that kind of passionate leader
00:41:32.020 | that's thinking about the entire product
00:41:34.100 | and just really deeply cares
00:41:35.820 | about not letting anything slip through the cracks,
00:41:39.060 | I think that's got a lot of value.
00:41:40.580 | But the other side of that is the people saying that,
00:41:42.540 | well, we want to have these independent teams
00:41:44.500 | that are bubbling up the ideas
00:41:46.020 | because it's almost anti-capitalist
00:41:49.460 | or anti-free market to say,
00:41:50.900 | it's like I want my great leader
00:41:53.220 | to go ahead and dictate all these points there
00:41:55.180 | where clearly free markets bring up things
00:41:57.660 | that you don't expect.
00:41:59.300 | Like in VR, we saw a bunch of things,
00:42:01.780 | like it didn't turn out at all
00:42:03.060 | the way the early people thought
00:42:04.300 | were gonna be the key applications
00:42:06.020 | and things that would not have been approved
00:42:08.740 | by the dark cabal making the decisions
00:42:12.220 | about what gets into the store
00:42:14.180 | turn out to in some cases be extremely successful.
00:42:17.620 | So yeah, I definitely kind of wanted to be there
00:42:20.620 | as a point where I did make a pitch.
00:42:22.180 | It's like, hey, make me VR dictator
00:42:23.820 | and I'll go in and get shit done.
00:42:26.140 | And that's just, it's not in the culture at Meta,
00:42:28.820 | and they understand the trade-offs
00:42:30.660 | and that's just not the way,
00:42:32.940 | that's not the company that they want,
00:42:34.460 | the team that they wanna run.
00:42:37.220 | - It's fascinating 'cause VR,
00:42:38.580 | and we'll talk about it more,
00:42:40.100 | it's still unclear to me
00:42:43.020 | in what way VR will change the world,
00:42:46.900 | because it does seem clear
00:42:48.020 | that VR will somehow fundamentally transform this world,
00:42:51.740 | and it's unclear to me how yet.
00:42:54.860 | - Let me know when you wanna get into that.
00:42:56.340 | - We will, but hold on a second.
00:42:58.220 | So stick to you being the best programmer ever.
00:43:02.780 | Okay, in the early days when you didn't have
00:43:05.940 | adult responsibilities of leading teams
00:43:07.980 | and all that kind of stuff,
00:43:09.380 | and you can focus on just being a programmer,
00:43:13.220 | what did a productive day in the life of John Carmack
00:43:16.900 | look like?
00:43:17.820 | How many hours of the keyboard, how much sleep,
00:43:20.540 | what was the source of calories that fueled the brain?
00:43:23.900 | What was it like?
00:43:25.220 | What times you wake up?
00:43:26.980 | - So I was able to be remarkably consistent
00:43:29.540 | about what was good working conditions for me
00:43:32.140 | for a very long time.
00:43:33.580 | I was never one of the programmers
00:43:36.740 | that would do all-nighters going through,
00:43:39.900 | work for 20 hours straight.
00:43:41.260 | It's like my brain generally starts turning to mush
00:43:43.900 | after 12 hours or so.
00:43:45.940 | But the hard work is really important,
00:43:49.980 | and for decades
00:43:52.220 | I would work 60 hours a week.
00:43:53.740 | I would work a 10-hour day, six days a week,
00:43:56.260 | and try to be productive at that.
00:43:59.060 | Now, my schedule shifted around a fair amount
00:44:01.140 | when I was young without any kids
00:44:03.060 | and any other responsibilities.
00:44:05.100 | I was on one of those cycling schedules
00:44:07.300 | where I'd kind of get in an hour later each day
00:44:09.660 | and roll around through the entire time.
00:44:12.180 | And I'd wind up kind of pulling in
00:44:14.300 | at two or three in the afternoon sometimes,
00:44:16.260 | and then working again past midnight or two in the morning.
00:44:20.100 | And that was, when it was just me
00:44:23.620 | trying to make things happen,
00:44:25.980 | and I was usually isolated off in my office,
00:44:28.300 | people generally didn't bother me much at it,
00:44:31.020 | and I could get a lot of programming work done that way.
00:44:35.060 | I did settle into a more normal schedule
00:44:37.340 | when I was taking kids to school and things like that.
00:44:40.460 | - So kids were the forcing function
00:44:41.980 | that got you to wake up at the same time each day.
00:44:44.580 | - It's not clear to me that there was much of a difference
00:44:47.380 | in the productivity with that,
00:44:49.620 | where I kind of feel, if I just get up when I feel like it,
00:44:53.180 | it's usually a little later each day,
00:44:55.060 | but I just recently made the focusing decision
00:44:58.260 | to try to push my schedule back a little bit earlier
00:45:00.660 | to getting up at eight in the morning
00:45:02.660 | and trying to shift things around.
00:45:04.860 | Like I'm often doing experiments with myself
00:45:08.060 | about what should I be doing to be more productive.
00:45:10.940 | And one of the things that I did realize
00:45:13.340 | was happening in recent months,
00:45:15.100 | where I would go for a walk or a run,
00:45:19.180 | I cover like four miles a day,
00:45:21.140 | and I would usually do that just as the sun's going down
00:45:24.780 | out here in Texas now, and it's still really damn hot,
00:45:27.740 | but I'd go out at 8:30 or something
00:45:30.260 | and cover the time there, and then the showering.
00:45:32.980 | And it was putting a hole in my day
00:45:34.740 | where I would have still a couple hours after that.
00:45:37.620 | And sometimes my best hours were at night
00:45:39.540 | when nobody else is around, nobody's bothering me,
00:45:42.020 | but that hole in the day was a problem.
00:45:44.140 | So just a couple of weeks ago,
00:45:45.820 | I made the change to go ahead and say,
00:45:48.140 | all right, I'm gonna get up a little earlier.
00:45:49.500 | I'm gonna do a walk or get out there first
00:45:51.860 | so I can have more uninterrupted time.
00:45:54.020 | So I'm still playing with factors like this
00:45:56.140 | as I kind of optimize my work efforts,
00:45:59.980 | but it's always been,
00:46:02.460 | it was 60 hours a week for a very long time.
00:46:05.020 | To some degree, I had a little thing in the back of my head
00:46:07.300 | where I was almost jealous of some of the programmers
00:46:09.620 | that would do these marathon sessions.
00:46:11.420 | And there was like Dave Taylor, one of the guys that we had,
00:46:14.020 | he would be one of those people
00:46:15.220 | that would fall asleep under his desk sometimes
00:46:17.260 | and all the kind of classic hacker tropes about things.
00:46:20.180 | And a part of me was like,
00:46:21.860 | always a little bothered that that wasn't me,
00:46:23.740 | that I wouldn't go program 20 hours straight
00:46:27.220 | because I'm falling apart
00:46:28.980 | and not being very effective after 12 hours.
00:46:31.860 | I mean, yeah, 12 hour programming,
00:46:33.460 | that's fine when you're doing that,
00:46:34.820 | but you're not doing smart work much after,
00:46:38.620 | at least I'm not, but there's a range of people.
00:46:41.020 | I mean, that's something that a lot of people
00:46:42.540 | don't really get in their gut,
00:46:44.420 | where there are people that work on four hours of sleep
00:46:47.300 | and are smart and can continue to do good work,
00:46:49.500 | but then there's a lot of people that just fall apart.
00:46:52.140 | So I do tell people
00:46:53.820 | that I always try to get eight hours of sleep.
00:46:56.420 | It's not this, push yourself harder, get up earlier.
00:46:59.980 | I just do worse work where, you know, there's,
00:47:03.260 | you can work 100 hours a week
00:47:04.660 | and still get eight hours of sleep
00:47:06.220 | if you just kind of prioritize things correctly.
00:47:08.820 | But I do believe in working hard, working a lot.
00:47:11.900 | There was a comment that a game dev made
00:47:15.300 | that I know there's a backlash against really hard work
00:47:19.460 | in a lot of cases,
00:47:20.580 | and I get into online arguments about this all the time,
00:47:23.820 | but he was basically saying, yeah, 40 hours a week,
00:47:26.060 | that's kind of a part-time job.
00:47:27.780 | And if you were really in it,
00:47:29.860 | you're doing what you think is important,
00:47:31.820 | what you're passionate about, working more gets more done.
00:47:35.460 | And it's just really not possible to argue with that
00:47:39.700 | if you've been around the people
00:47:41.020 | that work with that level of intensity and just say,
00:47:44.100 | it's like, no, they should just stop.
00:47:46.340 | And we had, I kind of came back around to that
00:47:49.580 | a couple of years ago
00:47:50.660 | where I was using the fictional example of,
00:47:54.120 | all right, some people say,
00:47:55.580 | they'll say with a straight face,
00:47:56.940 | they think, no, you are less productive
00:47:59.000 | if you work more than 40 hours a week.
00:48:01.300 | And they're generally misinterpreting things
00:48:03.260 | where your marginal productivity for an hour
00:48:06.060 | after eight hours is less than in one of your peak hours,
00:48:09.140 | but you're not literally getting less done.
00:48:11.380 | There is a point where you start breaking things
00:48:13.460 | and getting worse behavior and everything out of it
00:48:16.780 | where you're literally going backwards,
00:48:18.580 | but it's not at eight or 10 or 12 hours.
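(The distinction being drawn here is marginal versus total output. As a hedged restatement in symbols, not anything said verbatim in the conversation, with p(t) standing for the output produced in hour t:)

```latex
% Total output after H hours is the sum of the hourly (marginal) outputs.
T(H) = \sum_{t=1}^{H} p(t)
% Diminishing returns means p(9) < p(3), but the ninth hour still adds:
T(9) = T(8) + p(9) > T(8) \quad \text{whenever } p(9) > 0
% Output only goes backwards past the point where p(t) < 0,
% i.e., where tired work actively breaks things.
```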
00:48:21.500 | And the fictional example I would use was,
00:48:23.860 | imagine there's an asteroid coming to impact,
00:48:27.340 | to crash into earth, destroy all of human life.
00:48:30.580 | Do you want Elon Musk or the people working at SpaceX
00:48:34.360 | that are building the interceptor
00:48:35.820 | that's going to deflect the asteroid,
00:48:38.660 | do you want them to clock out at five?
00:48:40.400 | Because dammit, they're just gonna go do worse work
00:48:42.920 | if they work another couple hours.
00:48:44.960 | And it seems absurd.
00:48:47.320 | And that's a hypothetical though,
00:48:48.780 | and everyone can dismiss that.
00:48:50.140 | But then when coronavirus was hitting
00:48:52.220 | and you have all of these medical personnel
00:48:54.780 | that are clearly pushing themselves
00:48:56.700 | really, really hard, and I'd say, it's like,
00:48:59.460 | okay, do you want all of these scientists
00:49:02.140 | working on treatments and vaccines
00:49:04.100 | and caring for all of these people?
00:49:05.860 | Are they really screwing everything up
00:49:07.880 | by working more than eight hours a day?
00:49:09.940 | And of course people say I'm just an asshole
00:49:11.600 | to say something like that, but it's, I know,
00:49:14.460 | it's the truth, working longer gets more done.
00:49:17.700 | - Well, so that's kind of the layer one,
00:49:20.720 | but I'd like to also say that,
00:49:22.700 | at least I believe depending on the person,
00:49:25.960 | depending on the task, working more and harder
00:49:30.740 | will make you better for the next week
00:49:35.740 | in those peak hours.
00:49:37.660 | So there's something about a deep dedication to a thing
00:49:41.380 | that kind of gets deep in you.
00:49:44.860 | So the hard work isn't just about
00:49:47.380 | the raw hours of productivity,
00:49:49.420 | it's the thing it does to you
00:49:54.220 | in the weeks and months after too.
00:49:56.760 | - You're tempering yourself in some ways.
00:49:59.120 | - And I think, it's like Jiro Dreams of Sushi.
00:50:01.960 | If you really dedicate yourself completely
00:50:03.800 | to making the sushi, to really putting in
00:50:06.940 | the long hours day after day after day,
00:50:09.620 | you become a true craftsman of the thing you're doing.
00:50:14.380 | Now there's of course discussions about
00:50:16.300 | are you sacrificing a lot of personal relationships,
00:50:18.920 | are you sacrificing a lot of other possible
00:50:21.360 | things you could do with that time,
00:50:22.760 | but if you're talking about purely being
00:50:26.200 | a master or a craftsman of your art,
00:50:30.640 | that more hours isn't just about doing more,
00:50:34.920 | it's about becoming better at the thing you're doing.
00:50:37.240 | - Yeah, and I don't gainsay anybody that
00:50:39.280 | wants to work the minimum amount,
00:50:41.120 | they've got other priorities in their life.
00:50:42.940 | My only argument that I'm making,
00:50:44.640 | it's not that everybody should work hard,
00:50:47.000 | it's that if you want to accomplish something,
00:50:49.360 | working longer and harder
00:50:50.840 | is the path to getting it accomplished.
00:50:53.360 | - Well let me ask you about this then,
00:50:55.280 | the mythical work-life balance.
00:50:58.920 | For an engineer it seems like that's one of the professions
00:51:04.720 | for a programmer where working hard
00:51:09.920 | does lead to greater productivity in it,
00:51:12.860 | but it also raises the question of
00:51:15.680 | personal relationships and all that kind of stuff,
00:51:19.680 | family and how are you able to find work-life balance?
00:51:24.320 | Is there advice you can give,
00:51:25.720 | maybe even outside of yourself,
00:51:27.600 | have you been able to arrive at any wisdom on this part
00:51:31.160 | in your years of life?
00:51:32.400 | - I do think that there's a wide range of people
00:51:34.600 | where different people have different needs,
00:51:36.760 | it's not a one size fits all,
00:51:38.720 | I can only say what works for me.
00:51:40.760 | I can tell enough that I'm different
00:51:45.160 | than a typical average person in the way things impact me,
00:51:48.000 | the things that I want to do, my goals are different
00:51:51.600 | and sort of the levers to impact things are different
00:51:55.200 | where I have literally never felt burnout
00:51:59.600 | and I know there's lots of brilliant smart people
00:52:02.120 | that do world-leading work that get burned out
00:52:05.400 | and it's never hit me.
00:52:08.120 | I've never been at a point where I'm like,
00:52:11.720 | I just don't care about this,
00:52:13.240 | I don't want to do this anymore,
00:52:14.840 | but I've always had the flexibility
00:52:16.460 | to work on lots of interesting things.
00:52:18.960 | I can always just turn my gaze to something else
00:52:21.520 | and have a great time working on that
00:52:23.560 | and so much of the ability to actually work hard
00:52:27.200 | is the ability to have multiple things to choose from
00:52:29.680 | and to use your time on the most appropriate thing.
00:52:32.920 | Like there are time periods where it's the best time
00:52:36.960 | for me to read a new research paper
00:52:38.480 | that I need to really be thinking hard about it,
00:52:41.380 | then there's a time that maybe I should just scan
00:52:43.440 | and organize my old notes
00:52:44.760 | because I'm just not on top of things
00:52:47.160 | and then there's the time that, all right,
00:52:48.860 | let's go bang out a few hundred lines of code for something.
00:52:52.720 | So switching between them has been real valuable.
00:52:57.180 | - So you always have kind of joy in your heart
00:52:59.780 | for all the things you're doing
00:53:01.000 | and that is a kind of work-life balance
00:53:03.260 | as a first sort of step.
00:53:04.820 | - Yeah, I do. - So you're always happy.
00:53:06.380 | - I do.
00:53:07.220 | - Well, happy, you know.
00:53:08.540 | - Yeah, I mean, it's like a lot of people would say
00:53:10.440 | that often I look like kind of a grim person
00:53:12.860 | with just sitting there with a neutral expression
00:53:15.120 | or even like knitted brows and a frown on my face
00:53:17.520 | as I'm staring at something.
00:53:19.000 | - That's what happiness looks like for you.
00:53:20.680 | - Yeah, it's kind of true where that's like,
00:53:24.040 | okay, I'm pushing through this, I'm making progress here.
00:53:27.100 | I know that doesn't work for everyone.
00:53:30.160 | I know it doesn't work for most people,
00:53:32.160 | but what I am always trying to do in those cases
00:53:35.520 | is I don't wanna let somebody
00:53:37.060 | that might be a person like that be told by someone else
00:53:40.560 | that no, don't even try that out as an option
00:53:44.240 | where work-life balance versus kind of your life's work
00:53:48.800 | where there's a small subset of the people
00:53:51.800 | that can be very happy being obsessive about things.
00:53:55.580 | And obsession can often get things done
00:53:58.980 | that just practical, prudent, pedestrian work won't
00:54:03.240 | or at least won't for a very long time.
00:54:05.960 | - There's legends of your nutritional intake
00:54:10.260 | in the early days.
00:54:11.520 | What can you say about sort of as a,
00:54:15.600 | being a programmer is a kind of athlete.
00:54:17.920 | So what was the nutrition that fueled?
00:54:21.440 | - I have never been that great
00:54:23.240 | on really paying attention to it
00:54:26.180 | where I'm good enough that I don't eat a lot.
00:54:29.600 | I've never been like a big heavy guy,
00:54:31.640 | but it was interesting where one of the things
00:54:34.280 | that I can remember is being an unhappy teenager,
00:54:36.680 | not having enough money.
00:54:37.840 | And like one of the things that bothered me
00:54:40.080 | about not having enough money
00:54:41.160 | is I couldn't buy pizza whenever I wanted to.
00:54:43.400 | So I got rich and then I bought a whole lot of pizza.
00:54:46.560 | - So that was defining,
00:54:48.000 | like that's what being rich felt like.
00:54:50.120 | - A lot of the little things,
00:54:51.160 | like I could buy all the pizza and comic books
00:54:53.200 | and video games that I wanted to.
00:54:55.720 | And it really didn't take that much,
00:54:58.840 | but the pizza was one of those things.
00:55:00.980 | And it's absolutely true that for a long time
00:55:03.440 | at id Software,
00:55:04.480 | I had a pizza delivered every single day.
00:55:06.520 | You know, the delivery guy knew me by name.
00:55:09.080 | And I didn't find out until years later
00:55:11.680 | that apparently I was such a good customer
00:55:13.920 | that they just never raised the price on me.
00:55:15.840 | And I was using this six-year-old price for the pizzas
00:55:18.880 | that they were still kind of sending my way every day.
00:55:21.600 | - So were you eating once a day, or were you-
00:55:26.040 | - It would be spread out.
00:55:26.880 | You know, you have a few pieces of pizza,
00:55:28.160 | you have some more later on
00:55:29.360 | and I'd maybe have something at home.
00:55:31.920 | It was one of the nice things at Facebook, Meta,
00:55:34.680 | is they feed you quite well.
00:55:36.840 | You get a different, I guess now it's DoorDash,
00:55:39.520 | sorts of things delivered,
00:55:40.640 | but they take care of making sure
00:55:42.640 | that everybody does get well fed.
00:55:44.400 | And I probably had better food those six years
00:55:47.480 | that I was working in the Meta office there
00:55:49.560 | than I used to before.
00:55:51.640 | But it's worked out okay for me.
00:55:53.880 | My health has always been good.
00:55:55.260 | I get a pretty good amount of exercise
00:55:57.720 | and I don't eat to excess
00:55:59.980 | and I avoid a lot of other kind of
00:56:02.000 | not so good for you things.
00:56:03.680 | So I'm still doing quite well at my age.
00:56:05.640 | - Did you have a kind of, I don't know,
00:56:10.180 | spiritual experience with food or coffee
00:56:13.640 | or any of that kind of stuff?
00:56:15.080 | I mean, you know, the programming experience,
00:56:17.540 | you know, with music or like I listen to brown noise
00:56:21.360 | on a program or like creating an environment
00:56:24.360 | and the things you take into your body,
00:56:26.120 | just everything you construct
00:56:27.960 | can become a kind of ritual
00:56:29.600 | that empowers the whole process of the program.
00:56:32.200 | Did you have that relationship with pizza or?
00:56:34.640 | - It would really be with Diet Coke.
00:56:35.920 | I mean, there still is that sense of, you know,
00:56:38.080 | drop the can down, crack open the can of Diet Coke.
00:56:40.520 | All right, now I mean business.
00:56:41.880 | We're getting to work here.
00:56:44.040 | - Still to this day is Diet Coke is still part of it.
00:56:46.880 | - Yeah, probably eight or nine a day.
00:56:49.520 | - Nice, okay.
00:56:50.800 | What about your setup?
00:56:52.080 | How many screens?
00:56:54.000 | What kind of keyboard?
00:56:55.080 | Is there something interesting?
00:56:56.080 | What kind of IDE, Emacs Vim or something modern?
00:57:02.040 | Linux, what operating system, laptop,
00:57:04.120 | or any interesting thing that brings you joy?
00:57:07.000 | - So I kind of migrated cultures
00:57:09.160 | where early on through all of game dev,
00:57:11.400 | there was sort of one culture there,
00:57:13.160 | which was really quite distinct
00:57:14.600 | from the more Silicon Valley venture,
00:57:17.240 | you know, culture for things.
00:57:19.000 | It's they're different groups
00:57:20.160 | and they have pretty different mores
00:57:21.880 | and the way they think about things where,
00:57:24.560 | and I still do think a lot of the big companies
00:57:26.560 | can learn things from the hardcore game development side
00:57:30.760 | of things where it still boggles my mind
00:57:32.960 | how hostile to debuggers and IDEs
00:57:37.640 | so much of the kind of big money,
00:57:40.360 | billions of dollars,
00:57:41.680 | Silicon Valley venture-backed funds are.
00:57:44.400 | - Oh, that's interesting.
00:57:45.240 | Sorry, so you're saying like big companies
00:57:48.120 | at Google and Meta are hostile to-
00:57:50.200 | - They are not big on debuggers and IDEs.
00:57:52.840 | Like so much of it is like Emacs Vim for things.
00:57:55.760 | And we just assume that debuggers don't work
00:57:58.840 | most of the time.
00:58:00.000 | for the systems. And a lot of this comes
00:58:02.360 | from a sort of Linux bias on a lot of things
00:58:04.800 | where I did come up through the personal computers
00:58:08.240 | and then the DOS and then, you know, Windows
00:58:11.520 | and it was Borland tools and then Visual Studio and-
00:58:16.520 | - Do you appreciate debuggers?
00:58:18.520 | - Very much so.
00:58:19.400 | I mean, a debugger is how you get a view
00:58:21.400 | into a system that's too complicated to understand.
00:58:23.920 | I mean, anybody that thinks just read the code
00:58:25.840 | and think about it, that's an insane statement
00:58:28.240 | in the, you can't even read all the code on a big system.
00:58:31.080 | You have to do experiments on the system
00:58:34.160 | and doing that by adding log statements,
00:58:37.000 | recompiling and rerunning it
00:58:39.120 | is an incredibly inefficient way of doing it.
00:58:41.240 | I mean, yes, you can always get things done
00:58:43.560 | even if you're working with stone knives and, you know,
00:58:45.600 | bearskins, that is the mark of a good programmer
00:58:48.880 | is that given any tools, you will figure out a way
00:58:51.400 | to get it done.
00:58:52.560 | But it's amazing what you can do with sometimes
00:58:55.640 | much, much better tools where instead of just going
00:58:58.640 | through this iterative compile run debug cycle,
00:59:02.120 | you have the old Lisp direction of like,
00:59:05.120 | you've got a REPL and you're working interactively
00:59:07.080 | and doing amazing things there.
00:59:08.400 | But in many cases, a debugger
00:59:10.440 | is a very powerful user interface that can stop,
00:59:13.440 | examine all the different things in your program,
00:59:15.520 | set all of these different break points.
00:59:16.960 | And of course you can do that with GDB or whatever there,
00:59:20.240 | but this is one of the user interface
00:59:22.800 | fundamental principles where when something
00:59:24.720 | is complicated to do, you won't use it very often.
00:59:28.520 | There's people that will break out GDB
00:59:30.680 | when they're at their wits end and they just have
00:59:32.720 | beat their head against a problem for so long.
00:59:35.200 | But for somebody that kind of grew up in game dev,
00:59:37.880 | it's like they were running into the debugger anyways
00:59:40.080 | before they even knew there was a problem.
00:59:42.200 | And you would just stop and see, you know, what was happening.
00:59:44.840 | And sometimes you could fix things even before you,
00:59:47.720 | you know, even before you did one compile cycle,
00:59:50.640 | you could be in the debugger and you would say,
00:59:52.160 | well, I'm just going to change this right here.
00:59:54.240 | And yep, that did the job and fix it and go on.
00:59:57.040 | - And for people who don't know,
00:59:57.880 | GDB is a sort of popular, I guess, Linux debugger,
01:00:01.720 | primarily for C++?
01:00:05.280 | - They handle most of the languages, but it's, you know,
01:00:07.320 | it's based on C as the original kind of Unix heritage.
01:00:10.400 | - But, and it's kind of like command line.
01:00:11.800 | It's not user-friendly.
01:00:12.840 | It's not, it doesn't allow for clean visualizations.
01:00:15.520 | And you're exactly right.
01:00:17.160 | So you're using this kind of debugger,
01:00:19.000 | usually when you're at wits end and there's a problem
01:00:22.200 | that you can't figure out why,
01:00:23.440 | by just looking at the codes, you have to find it.
01:00:26.000 | That's how I guess normal programmers use it.
01:00:28.600 | But you're saying there should be tools
01:00:30.080 | that kind of visualize and help you
01:00:32.640 | as part of the programming process,
01:00:35.320 | just the normal programming process
01:00:37.480 | to understand the code deeper.
01:00:39.960 | - Yeah, when I'm working on like my C/C++ code,
01:00:42.920 | I'm always running it from the debugger.
01:00:45.400 | You know, just I type in the code, I run it.
01:00:47.920 | Many times, the first thing I do after writing code
01:00:50.240 | is set a break point and step through the function.
01:00:53.080 | Now, other people will say, it's like,
01:00:54.160 | oh, I do that in my head.
01:00:55.560 | Well, your head is a faulty interpreter
01:00:57.840 | of all those things there.
01:00:59.120 | And I've written brand new code.
01:01:01.080 | I want to step in there
01:01:02.040 | and I'm going to single step through that,
01:01:03.520 | examine lots of things
01:01:04.760 | and see if it's actually doing what I expected it to.
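(As a concrete sketch of that workflow with GDB; the program, function, and variable names here are invented for illustration:)

```
$ gcc -g -O0 game.c -o game      # build with debug symbols, optimization off
$ gdb ./game
(gdb) break update_monster        # breakpoint on the freshly written function
(gdb) run
(gdb) next                        # single-step, statement by statement
(gdb) print health                # examine a local against expectations
(gdb) set var health = 100        # patch a value in place, no recompile needed
(gdb) continue
```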
01:01:08.560 | - It is a kind of companion, the debugger.
01:01:12.240 | Like you're now coding in an interactive way
01:01:14.880 | with another being.
01:01:17.800 | A debugger is a kind of dumb being,
01:01:19.480 | but it's a reliable being.
01:01:21.400 | That is an interesting question of what role does AI play
01:01:24.240 | in that kind of, with Codex
01:01:26.720 | and these kinds of abilities to generate code.
01:01:29.560 | That might be, you might start having tools
01:01:32.160 | that understand the code in interesting, deep ways
01:01:36.000 | that can work with you.
01:01:37.880 | - There's a whole spectrum there
01:01:39.080 | from static code analyzers
01:01:41.400 | and various kind of dynamic tools there
01:01:43.480 | up to AI that can conceivably grok these programs
01:01:46.720 | that literally no human can understand.
01:01:49.120 | They're too big, too intertwined and too interconnected,
01:01:52.000 | but it's not beyond the possibility of understanding.
01:01:55.000 | It's just beyond what we can hold in our heads
01:01:57.840 | as kind of mutable state while we're working on things.
01:02:00.880 | And I'm a big proponent again of things
01:02:03.880 | like static analyzers and some of that stuff
01:02:06.200 | where you'll find some people
01:02:08.320 | that don't like being scolded by a program
01:02:11.280 | for how they've written something
01:02:12.680 | where it's like, oh, I know better.
01:02:14.160 | And sometimes you do,
01:02:15.440 | but there was something that was
01:02:18.000 | very, very valuable for me,
01:02:21.680 | and not too many people get an opportunity
01:02:23.280 | like this to have.
01:02:24.120 | It was almost one of those spiritual experiences
01:02:26.120 | as a programmer, an awakening:
01:02:28.120 | the id Software code bases
01:02:30.360 | were a couple million lines of code.
01:02:32.280 | And at one point I had used a few
01:02:34.760 | of the different analysis tools,
01:02:36.360 | but I made a point to really go through
01:02:39.640 | and scrub the code base using every tool that I could find.
01:02:43.320 | And it was eyeopening where we had a reputation
01:02:45.680 | for having some of the most robust, strongest code,
01:02:48.600 | you know, where there were some,
01:02:49.840 | you know, great things that I remember hearing
01:02:51.760 | from Microsoft telling us about crashes on Xbox.
01:02:55.040 | And we had this tiny number that they said
01:02:56.840 | were probably literally hardware errors.
01:02:59.440 | And then you have other significant titles
01:03:01.440 | that just have millions of faults
01:03:03.440 | that are getting recorded all the time.
01:03:04.920 | So I was proud of our code on a lot of levels,
01:03:07.640 | but when I took this code analysis squeegee
01:03:10.720 | through everything, it was shocking
01:03:14.440 | how many errors there were in there.
01:03:16.760 | Things that you can say, okay, this was a copy paste,
01:03:19.840 | not changing something right here.
01:03:21.760 | Lots of things that were, you know,
01:03:23.320 | the most common problem was something
01:03:26.000 | in a printf format string that was the wrong data type
01:03:29.160 | that could cause crashes there.
01:03:30.480 | And, you know, you really want the warnings
01:03:32.440 | for things like that.
01:03:33.400 | Then the next most common was missing a check for null
01:03:36.200 | that could actually happen, that could blow things up.
01:03:38.480 | And those are obviously like top C, C++ things.
01:03:41.640 | Everybody has those problems.
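(For concreteness, here are hypothetical C fragments showing exactly those two patterns; compilers flag the first with -Wall/-Wformat, and static analyzers flag the second:)

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

void log_count(const char *name, long count) {
    /* Format-string bug: %d expects an int, but count is a long.
       Undefined behavior that can crash; -Wformat warns about it. */
    printf("%s: %d\n", name, count);   /* should be %ld */
}

void copy_name(const char *src) {
    char *dst = malloc(strlen(src) + 1);
    /* Missing null check: if malloc fails, strcpy dereferences NULL,
       exactly the kind of path a static analyzer reports. */
    strcpy(dst, src);
    free(dst);
}

int main(void) {
    log_count("entities", 64L);   /* the bug in action */
    copy_name("imp");
    return 0;
}
```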
01:03:43.440 | But the long tail of all of the different little things
01:03:46.280 | that could go wrong there, and we had good programmers
01:03:49.200 | and my own code, stuff that I'd be looking at,
01:03:51.120 | it's like, oh, I wrote that code.
01:03:52.400 | That's definitely wrong.
01:03:53.800 | We've been using this for a year.
01:03:56.200 | And it's this submarine, you know,
01:03:58.560 | this mine sitting there waiting for us to step on.
01:04:01.760 | And it was humbling.
01:04:03.800 | It was, and I reached the conclusion
01:04:06.040 | that anything that is syntactically allowed
01:04:09.440 | in your language is going to show up eventually
01:04:13.440 | in a large enough code base, and
01:04:16.120 | good intentions aren't going to keep it from happening.
01:04:18.800 | You need automated tools and guardrails for things.
01:04:21.880 | And those start with things like static types
01:04:24.120 | and, or, you know, even type hints
01:04:25.560 | in the more dynamic languages.
01:04:27.040 | But the people that rebel against that,
01:04:30.280 | that basically say, that slows me down doing that.
01:04:33.720 | There's something to that.
01:04:34.720 | I get that.
01:04:35.560 | And, you know, I've cobbled things together in a notebook.
01:04:38.200 | I'm like, wow, this is great that it just happened.
01:04:40.960 | But yeah, that's kind of sketchy, but it's working fine.
01:04:43.360 | I don't care.
01:04:44.000 | It does come back to that value analysis,
01:04:47.080 | where sometimes it's right to not care.
01:04:49.720 | But when you do care, if it's going
01:04:51.800 | to be something that's going to live for years
01:04:53.840 | and it's going to have other people working on it,
01:04:56.480 | and it's going to be deployed to millions of people,
01:04:59.000 | then you want to use all of these tools.
01:05:01.160 | You want to be told, it's like, no, you've screwed up here,
01:05:03.560 | here, and here.
01:05:04.320 | And that does require kind of an ego check about things,
01:05:07.960 | where you have to be open to the fact
01:05:11.120 | that everything that you're doing
01:05:12.520 | is just littered with flaws.
01:05:13.960 | It's not that, oh, you occasionally have a bad day.
01:05:16.360 | It's just, whatever stream of code you output,
01:05:19.160 | there is going to be a statistical regularity of things
01:05:21.720 | that you just make mistakes on.
01:05:24.040 | And I do think there's the whole argument about test-driven
01:05:28.360 | design and unit testing versus kind of analysis
01:05:31.640 | and different things.
01:05:32.920 | I am more in favor of the analysis and the stuff
01:05:35.960 | that just like, you can't run your program until you fix this
01:05:38.640 | rather than you can run it and hopefully a unit test will
01:05:41.600 | catch it in some way.
01:05:42.880 | Yeah, in my private code, I have asserts everywhere.
01:05:46.560 | Yeah.
01:05:48.040 | Just there's something pleasant to me, pleasurable to me,
01:05:52.280 | about sort of the dictatorial rule of like,
01:05:55.240 | this should be true at this point.
01:05:58.320 | And too many times I've made mistakes
01:06:03.320 | that shouldn't have been made.
01:06:05.640 | And I would assume I wouldn't be the kind of person
01:06:08.480 | that would make that mistake,
01:06:09.360 | but I keep making that mistake.
01:06:10.560 | Therefore, an assert really catches me,
01:06:13.960 | really helps all the time.
01:06:15.320 | So my code, I would say like 10 to 20% of my private code,
01:06:19.000 | just for personal use is probably asserts.
01:06:21.360 | And they're active comments.
01:06:22.520 | That's one of those things that in theory,
01:06:24.920 | they don't make any difference to the program.
01:06:27.200 | And if it was all operating the way you expected it would be,
01:06:30.200 | then they will never fire.
01:06:32.680 | But even if you have it right
01:06:34.600 | and you wrote the code right initially,
01:06:36.800 | then circumstances change.
01:06:38.320 | The world outside your program changes.
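(A minimal sketch of an assert used as an "active comment"; the function and invariants below are invented for illustration:)

```c
#include <assert.h>
#include <stddef.h>

/* The asserts document what must be true on entry. They change nothing
   while the assumptions hold, and fire loudly the moment the world
   around this code changes out from under it. */
void apply_damage(int *health, int amount)
{
    assert(health != NULL);   /* caller must pass a real entity */
    assert(amount >= 0);      /* healing goes through a different path */

    *health -= amount;
    if (*health < 0)
        *health = 0;
}
```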
01:06:40.800 | And in fact, that's one of the things where I'm kind of fond
01:06:44.560 | in a lot of cases of static array size declarations,
01:06:47.520 | where I went through this period where it's like,
01:06:49.800 | okay, now we have general collection classes.
01:06:51.920 | We should just make everything variable.
01:06:54.400 | Because I had this history of in the early days,
01:06:57.480 | you get Doom, which had some fixed limits on it.
01:06:59.960 | Then everybody started making crazier and crazier things.
01:07:02.520 | And they kept bumping up the different limits,
01:07:04.160 | this many lines, this many sectors.
01:07:06.760 | And it seemed like a good idea.
01:07:09.000 | Well, we should just make this completely generic.
01:07:10.880 | It can go kind of go up to whatever.
01:07:13.720 | And there's cases where that's the right thing to do.
01:07:17.280 | But it also, the other aspect of the world changing
01:07:20.200 | around you is it's good to be informed
01:07:23.000 | when the world has changed more than you thought it would.
01:07:25.680 | And if you've got a continuously growing collection,
01:07:28.400 | you're never going to find out.
01:07:29.520 | You might have this quadratic slowdown on something
01:07:32.200 | where you thought, oh, I'm only ever going
01:07:34.080 | to have a handful of these.
01:07:35.720 | But something changes, and there's a new design style.
01:07:38.200 | And all of a sudden, you've got 10,000 of them.
01:07:40.760 | So I kind of like, in many cases,
01:07:43.880 | picking a number, some nice round power of two number,
01:07:47.520 | and setting it up in there, and having an assert saying,
01:07:49.880 | it's like, hey, you hit this limit.
01:07:52.160 | You should probably think, are the choices
01:07:54.320 | that you've made around all of this still relevant
01:07:56.920 | if somebody's using 10 times more
01:07:59.240 | than you thought they would?
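(A sketch of that pattern; the names and the limit are invented, but the shape is a deliberately fixed size plus an assert that reports when usage outgrows the original design:)

```c
#include <assert.h>

#define MAX_SECTORS 1024   /* a nice round power-of-two limit, picked on purpose */

static int s_sectors[MAX_SECTORS];
static int s_sector_count = 0;

void add_sector(int sector)
{
    /* If this fires, the world changed more than planned: revisit the
       decisions sized around MAX_SECTORS instead of silently growing. */
    assert(s_sector_count < MAX_SECTORS);
    s_sectors[s_sector_count++] = sector;
}
```

(The fixed size is the point: a growable collection would hide both the quadratic-slowdown case and the moment the design assumptions stopped holding.)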
01:08:00.360 | - Yeah, this code was originally written
01:08:02.600 | with this kind of worldview,
01:08:04.400 | with this kind of set of constraints.
01:08:06.080 | You were thinking of the world in this way.
01:08:09.360 | If something breaks, that means you got to rethink
01:08:11.720 | the initial stuff.
01:08:13.080 | And it's nice for it to do that.
01:08:16.480 | Is there any stuff like a keyboard or monitors?
01:08:21.600 | - I'm fairly pedestrian on a lot of that,
01:08:23.840 | where I did move to triple monitors
01:08:26.160 | like in the last several years ago.
01:08:27.720 | I had been dual monitor for a very long time.
01:08:30.240 | And it was one of those things where,
01:08:33.800 | probably years later than I should have,
01:08:35.680 | I'm just like, well, the video cards now
01:08:37.040 | generally have three output ports.
01:08:38.720 | I should just put the third monitor up there.
01:08:40.480 | That's been a pure win.
01:08:41.960 | I've been very happy with that.
01:08:43.520 | But no, I don't have fancy keyboard or mouse
01:08:47.200 | or anything really going on with that.
01:08:48.800 | - So, one of the key things is an IDE
01:08:50.240 | that has helpful debuggers, has helpful tools.
01:08:54.080 | So it's not the Emacs Vim route, and then Diet Coke.
01:08:56.960 | - Yeah.
01:08:57.800 | So I did spend, I spent one of my week-long retreats
01:09:01.400 | where I'm like, okay, I'm going to make myself use,
01:09:04.240 | it was actually classic VI,
01:09:05.560 | which I know people will say,
01:09:06.400 | you should never have done that.
01:09:07.360 | You should have just used Vim directly.
01:09:09.320 | But I gave it the good try.
01:09:11.320 | It's like, okay, I'm being in kind of classic
01:09:14.000 | Unix developer mode here.
01:09:15.720 | And I worked for a week on it.
01:09:18.560 | I used Anki to like teach myself
01:09:20.600 | the different little key combinations for things like that.
01:09:23.960 | And in the end it was just like, all right,
01:09:26.040 | this was kind of like my civil war reenactment phase.
01:09:28.760 | You know, it's like, I'm going out there,
01:09:30.000 | doing it like they used to in the old days.
01:09:31.960 | And it was kind of fun in that regard.
01:09:34.040 | - So many people right now,
01:09:35.360 | they're screaming as they're listening to this.
01:09:38.640 | - So again, the out is that this was not modern Vim,
01:09:41.200 | but still, yes, I was very happy to get back
01:09:44.600 | to my visual studio at the end.
01:09:46.840 | - Yeah, I'm actually, I struggle with this a lot
01:09:49.560 | because, so for me it's a Kinesis keyboard
01:09:52.120 | and I use Emacs primarily.
01:09:55.160 | And I feel like I can, exactly as you said,
01:09:59.160 | I can understand the code, I can navigate the code.
01:10:01.280 | There's a lot of stuff you could build within Emacs
01:10:03.680 | with using Lisp.
01:10:04.800 | You can customize a lot of things for yourself
01:10:07.280 | to help you introspect the code,
01:10:09.880 | like to help you understand the code
01:10:11.640 | and visualize different aspects of the code.
01:10:13.040 | You can even run debuggers, but it's work.
01:10:16.640 | And the world moves past you
01:10:18.920 | and the better and better ideas are constantly being built.
01:10:21.880 | And that puts a kind of,
01:10:25.400 | I need to take the same kind of retreat
01:10:27.200 | as you're talking about,
01:10:28.280 | but now I'm still fighting the civil war.
01:10:30.840 | I need to kind of move into the 21st century.
01:10:33.240 | - And it does seem like the world is,
01:10:34.920 | or a large chunk of the world
01:10:36.640 | is moving towards visual studio code,
01:10:38.640 | which is kind of interesting to me.
01:10:40.360 | Again, it's the JavaScript ecosystem on the one hand,
01:10:43.200 | and IDs are one of those things
01:10:45.360 | that you want to be infinitely fast.
01:10:47.840 | You want them to just kind of immediately respond.
01:10:50.960 | And like, I mean, heck, I've got,
01:10:52.480 | there's someone I know, an old school game dev guy
01:10:55.120 | that still uses Visual Studio 6.
01:10:57.400 | And on a modern computer,
01:10:59.480 | everything is just absolutely instant on something like that
01:11:02.960 | because it was made to work on a computer
01:11:04.640 | that's 10,000 or 100,000 times slower.
01:11:07.840 | So just everything happens immediately.
01:11:10.520 | And all the modern systems just feel,
01:11:13.320 | you know, they feel so crufty when it's like,
01:11:15.320 | oh, why is this refreshing the screen
01:11:17.240 | and moving around and updating over here
01:11:19.240 | and something blinks down there and you should update this.
01:11:21.840 | And there's, you know, there are things that we've lost
01:11:25.680 | with that incredible flexibility,
01:11:27.400 | but lots of people get tons of value from it.
01:11:31.040 | And I am super happy that that seems to be winning over
01:11:33.840 | even a lot of the old Vim and Emacs people
01:11:36.160 | that they're kind of like,
01:11:37.160 | hey, Visual Studio code's maybe, you know, not so bad.
01:11:40.080 | I am, that may be the final peacekeeping solution
01:11:43.240 | where everybody is reasonably happy
01:11:45.600 | with something like that.
01:11:47.880 | - So can you explain what a .plan file is
01:11:50.560 | and what role that played in your life?
01:11:53.400 | Does it still continue to play a role?
01:11:55.520 | - Back in the early, early days of id Software,
01:11:58.640 | one of our big things that was unique with what we did
01:12:01.160 | is I had adopted Nextstations
01:12:01.160 | is I had adopted NeXTstations
01:12:04.000 | or kind of NeXTSTEP systems from Steve Jobs'
01:12:11.680 | And they were basically, it was kind of interesting
01:12:15.200 | because I did not really have a background
01:12:17.160 | with the Unix system.
01:12:18.080 | So many of the people, they get immersed in that in college
01:12:21.840 | and, you know, and that's, you know,
01:12:24.200 | that sets a lot of cultural expectations for them.
01:12:27.520 | And I didn't have any of that,
01:12:29.320 | but I knew that my background was,
01:12:31.960 | I was a huge Apple II fan boy.
01:12:34.480 | I was always a little suspicious of the Mac.
01:12:36.480 | It was not really what I kind of wanted to go with.
01:12:41.400 | But when Steve Jobs left Apple and started NeXT,
01:12:44.560 | this computer did just seem like
01:12:45.840 | one of those amazing things from the future
01:12:47.800 | where it had all of this cool stuff in it.
01:12:50.440 | And we were still back in those days working on DOS,
01:12:53.320 | everything blew up.
01:12:54.240 | You had reset buttons because your computer would just freeze
01:12:57.000 | if you're doing development work,
01:12:58.400 | literally dozens of times a day,
01:12:59.920 | your computer was just rebooting constantly.
01:13:02.400 | And so this idea of, yes,
01:13:04.120 | any of the Unix workstations
01:13:06.400 | would have given a stable development platform
01:13:08.640 | where you don't crash and reboot all the time.
01:13:11.440 | But Next also had this really amazing graphical interface
01:13:15.560 | and it was great for building tools.
01:13:17.440 | And it used Objective-C as the kind of an interesting-
01:13:21.440 | - Oh, wow.
01:13:22.280 | - Yeah, dead end for things like that.
01:13:23.120 | - So NeXT was Unix-based, it used Objective-C.
01:13:26.120 | So it has a lot of the elements-
01:13:27.800 | - That became Mac.
01:13:28.720 | I mean, the kind of reverse acquisition of Apple by NeXT,
01:13:31.440 | where that took over and became what the modern Mac system is.
01:13:35.320 | - And define some of the developer,
01:13:37.320 | like the tools and the whole community.
01:13:41.480 | - Yeah, you've still got,
01:13:42.320 | if you're programming on Apple stuff now,
01:13:43.800 | there's still all these NS somethings,
01:13:45.600 | which was originally NeXTSTEP objects
01:13:47.640 | of different kinds of things.
01:13:49.400 | But one of the aspects of those Unix systems
01:13:52.880 | was they had this notion of a .plan file,
01:13:56.200 | where a dot file is an invisible file,
01:14:00.200 | usually in your home directory or something.
01:14:02.040 | And there was a trivial server
01:14:03.400 | running on most Unix systems at the time,
01:14:05.680 | that when somebody ran a trivial little command
01:14:09.640 | called finger, you could do a finger
01:14:11.960 | and then somebody's address,
01:14:13.520 | it could be anywhere on the internet
01:14:15.120 | if you were connected correctly.
01:14:16.760 | Then all that server would do was read the .plan file
01:14:20.760 | in that user's home directory
01:14:22.560 | and then just spit it out to you.
01:14:24.480 | And originally the idea was that could be
01:14:27.160 | whether you're on vacation, what your current project was,
01:14:30.080 | it's supposed to be like the plan of what you're doing.
01:14:32.280 | And people would use it for various purposes,
01:14:35.720 | but all it did was dump that file over to the terminal
01:14:39.840 | of whoever issued the finger command.
01:14:42.680 | And at one point I started just keeping a list
01:14:46.840 | of what I was doing in there,
01:14:48.880 | which would be what I was working on in the day.
01:14:51.080 | And I would have this little syntax
01:14:53.600 | I kind of came up with for myself:
01:14:55.720 | here's something that I'm working on,
01:14:57.160 | I put a star when I finish it,
01:14:58.720 | I could have a few other little bits of punctuation.
01:15:01.400 | And at the time it started off
01:15:03.640 | as being just like my to-do list.
01:15:05.560 | And it would be these trivial, obscure little things
01:15:08.640 | like I fixed something with collision detection code,
01:15:12.760 | made fireball do something different
01:15:14.920 | and just little one-liners
01:15:16.240 | that people that were following the games
01:15:18.320 | could kind of decipher.
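(The mechanism was that simple: `finger johnc@idsoftware.com` dumped the file to your terminal. The entries below are invented, but in the spirit of the one-liner work log described here, with a star marking a finished item:)

```
$ finger johnc@idsoftware.com
Plan:
* fixed fireball collision against moving walls
* monster pain sounds no longer cut off early
  look into demo playback getting out of sync
```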
01:15:20.080 | But I did wind up starting to write
01:15:22.840 | much more in-depth things.
01:15:24.240 | I would have little notes of thoughts and insights
01:15:28.480 | and then I would eventually start having little essays
01:15:30.440 | I would sometimes dump into the .plan files
01:15:33.040 | interspersed with the work logs of things that I was doing.
01:15:36.240 | So in some ways it was like a super early proto blog
01:15:39.640 | where I was just kind of dumping out what I was working on,
01:15:42.640 | but it was interesting enough
01:15:44.240 | that there were a lot of people
01:15:46.720 | that were interested in this.
01:15:48.760 | So most of the people didn't have Unix workstation,
01:15:51.240 | so there were the websites back in the day
01:15:53.320 | that would follow the Doom and Quake development
01:15:55.520 | that would basically make a little service
01:15:58.400 | that would go grab all the changes
01:15:59.800 | and then people could just get it with a web browser.
01:16:02.240 | And there was a period where like all of the little
01:16:05.000 | kind of Dallas gaming diaspora of people
01:16:07.600 | that were at all in that orbit,
01:16:09.400 | there were a couple dozen .plan files going on,
01:16:12.280 | and this was some years before blogging
01:16:15.520 | really became kind of a thing.
01:16:17.240 | And it was kind of a premonition
01:16:20.440 | of sort of the way things would go.
01:16:22.280 | And there was, it's all been collected,
01:16:25.080 | it's available online in different places
01:16:27.120 | and it's kind of fun to go back and look through
01:16:29.400 | what I was thinking,
01:16:30.840 | what I was doing in the different areas.
01:16:32.520 | - Have you had a chance to look back?
01:16:33.880 | Is there some interesting,
01:16:35.400 | very low level specific to do items,
01:16:38.800 | maybe things you've never completed,
01:16:40.480 | all that kind of stuff
01:16:41.520 | and high level philosophical essay type of stuff?
01:16:45.240 | - Yeah, there's some good stuff on both
01:16:48.440 | where a lot of it was low level nitpicky details
01:16:52.000 | about game dev and I've learned enough things
01:16:55.880 | where there's no project that I worked on
01:16:58.320 | that I couldn't go back and do a better job on now.
01:17:00.920 | I mean, you just, you learn things,
01:17:02.400 | hopefully if you're doing it right,
01:17:03.840 | you learn things as you get older
01:17:05.360 | and you should be able to do a better job
01:17:07.360 | at all of the early things.
01:17:08.720 | And there's stuff in Wolfenstein, Doom, Quake,
01:17:11.840 | that's like, oh, clearly I could go back
01:17:14.240 | and do a better job at this,
01:17:15.800 | whether it's something in the rendering engine side
01:17:17.920 | or how I implemented the monster behaviors
01:17:20.960 | or managed resources or anything like that.
01:17:22.640 | - Do you see the flaws in your thinking now?
01:17:25.040 | - Yeah. - Like looking back?
01:17:26.280 | - Yeah, I do.
01:17:27.120 | I mean, sometimes I'll get the,
01:17:29.120 | I'll look at it and say, yeah,
01:17:30.320 | I had a pretty clear view of I was doing good work there
01:17:34.000 | and I haven't really hit the point
01:17:35.960 | where there was another programmer, Graeme Devine,
01:17:38.800 | who had worked at id and on The 7th Guest,
01:17:42.200 | and he made some comment one time
01:17:43.720 | where he said he looked back at some of his old notes
01:17:45.720 | and he was like, wow, I was really smart back then.
01:17:48.800 | And I don't hit that so much where,
01:17:52.200 | I mean, I look at it and I always know that,
01:17:54.320 | yeah, there's all the, with aging,
01:17:56.560 | you get certain changes in how you're able to work problems,
01:18:00.080 | but all of the problems that I've worked,
01:18:02.320 | I'm sure that I could do a better job on all of them.
01:18:06.320 | - Oh, wow.
01:18:07.160 | So you can still step right in.
01:18:08.400 | If you could travel back in time and talk to that guy,
01:18:10.960 | you would teach him a few things.
01:18:12.120 | - Yeah, absolutely.
01:18:13.160 | (laughing)
01:18:14.280 | - That's awesome.
01:18:15.920 | What about the high level philosophical stuff?
01:18:18.040 | Is there some insights that stand out that you remember?
01:18:20.640 | - There's things that I was understanding about development
01:18:25.400 | and I'm in the industry and so on
01:18:28.160 | that were in a more primitive stage
01:18:31.360 | where I definitely learned a lot more in the later years
01:18:36.360 | about business and organization and team structure.
01:18:41.320 | There were, I mean, there were definitely things
01:18:44.600 | that I was not the best person
01:18:46.840 | or even a very good person about managing
01:18:48.920 | like how a team should operate internally,
01:18:51.560 | how people should work together.
01:18:53.440 | I was just, you know, just get out of my way
01:18:57.440 | and let me work on the code and do this.
01:18:59.480 | And more and more, I've learned how,
01:19:02.280 | in the larger scheme of things,
01:19:04.960 | how sometimes relatively unimportant
01:19:07.200 | some of those things are,
01:19:08.360 | where it is this user value generation
01:19:11.040 | that's the overarching importance for all of that.
01:19:14.040 | And I didn't necessarily have my eye on that ball correctly
01:19:17.920 | through a lot of my earlier years.
01:19:21.160 | And there's things that, you know,
01:19:23.360 | I could have gotten more out of people
01:19:25.560 | handling things in different ways.
01:19:27.440 | I could have made, you know, in some ways
01:19:30.560 | more successful products
01:19:32.000 | by following things in different ways.
01:19:33.800 | There's mistakes that we've made
01:19:35.320 | that we couldn't really have known
01:19:37.160 | how things would have worked out,
01:19:38.520 | but it was interesting to see in later years,
01:19:40.680 | companies like Activision showing that,
01:19:42.360 | hey, you really can just do the same game,
01:19:44.960 | make it better every year.
01:19:46.600 | And you can look at that from a negative standpoint
01:19:48.560 | and say, it's like, oh, that's just being derivative
01:19:50.520 | and all that.
01:19:51.520 | But if you step back again and say, it's like,
01:19:53.360 | no, are the people buying it still enjoying it?
01:19:55.480 | Are they enjoying it more than what they might
01:19:57.600 | have bought otherwise?
01:19:59.080 | And you can say, no, that's actually
01:20:00.920 | a great value creation engine to do that
01:20:03.680 | if you're in a position where you can.
01:20:06.000 | You know, don't be forced into reinventing everything
01:20:09.280 | just because you think that you need to.
01:20:11.680 | You know, there are lots of things about business and team stuff
01:20:15.880 | that could be done better.
01:20:16.960 | But the technical work, the kind of technical visionary type
01:20:20.400 | stuff that I laid out, I still feel pretty good about.
01:20:23.840 | There are some classic old ones about my defending
01:20:26.560 | of OpenGL versus D3D, which turned out
01:20:30.760 | to be one of the more probably important momentous things
01:20:34.080 | there, where it never-- it was always a rearguard action
01:20:38.120 | on Windows, where Microsoft was just not going to let that win.
01:20:42.200 | But when I look back on it now, that fight
01:20:45.080 | to keep OpenGL relevant for a number of years
01:20:47.960 | there meant that OpenGL was there
01:20:50.800 | when mobile started happening.
01:20:52.560 | And OpenGL ES was the thing that drove
01:20:55.600 | all of the acceleration of the mobile industry.
01:20:58.520 | And it's really only in the last few years,
01:21:00.880 | as Apple's moved to Metal and some of the other companies
01:21:03.520 | have moved to Vulkan, that that's moved away.
01:21:06.680 | But really stepping back and looking at it,
01:21:09.400 | it's like, yeah, I sold tens of millions
01:21:11.320 | of games for different things.
01:21:13.560 | But billions and billions of devices
01:21:16.640 | wound up with an appropriate, capable graphics
01:21:20.240 | API due in no small part to me thinking
01:21:23.720 | that that was really important, that we not just give up
01:21:27.800 | and use Microsoft's, at that time, really terrible API.
01:21:32.680 | The thing about Microsoft is the APIs don't stay terrible.
01:21:35.760 | They were terrible at the start.
01:21:37.480 | But a few versions on, they were actually quite good.
01:21:40.000 | And there was a completely fair argument
01:21:41.680 | to be made that by the time DX9 was out,
01:21:45.120 | it was probably a better programming
01:21:46.680 | environment than OpenGL.
01:21:48.440 | But it was still a wonderful, good thing
01:21:51.160 | that we had an open standard that could show up
01:21:53.560 | on Linux and Android and iOS, eventually WebGL still
01:21:57.480 | to this day.
01:21:58.680 | So that would be on my greatest hits list of things
01:22:03.160 | that I kind of pushed with--
01:22:04.360 | - In terms of impact it had on billions of devices, yes.
01:22:07.760 | So let's talk about it.
01:22:09.000 | Can you tell the origin story of id Software?
01:22:12.400 | Again, one of the greatest game developer companies ever.
01:22:16.360 | It created Wolfenstein 3D, games that define my life also
01:22:21.320 | in many ways.
01:22:22.600 | As a thing that made me realize what computers
01:22:24.800 | are capable of in terms of graphics,
01:22:26.680 | in terms of performance.
01:22:28.360 | It just unlocks something deep in me
01:22:32.040 | and understanding what these machines are all about.
01:22:34.200 | Those games can do that.
01:22:35.160 | So Wolfenstein 3D, Doom, Quake, and just
01:22:38.720 | all the incredible engineering innovation that went into that.
01:22:41.760 | So how did it all start?
01:22:44.240 | - So I'll caveat up front that I usually
01:22:46.920 | don't consider myself the historian of the software
01:22:50.640 | side of things.
01:22:51.480 | I usually do kind of point people
01:22:54.160 | at John Romero for stories about the early days
01:22:57.080 | where I've never been-- like I've
01:23:00.400 | commented that I'm a remarkably unsentimental person
01:23:03.160 | in some ways where I don't really spend a lot of time
01:23:05.760 | unless I'm explicitly prodded to go back and think
01:23:08.640 | about the early days of things.
01:23:10.280 | And I didn't necessarily make the effort
01:23:14.640 | to archive everything exactly in my brain.
01:23:17.120 | And the more that I work on machine learning and AI
01:23:19.440 | and the aspects of memory and how when you go back and polish
01:23:22.680 | certain things, it's not necessarily
01:23:24.360 | exactly the way it happened.
01:23:25.920 | But having said all of that, from my view,
01:23:29.600 | the way everything happened that led up to that
01:23:32.200 | was after I was an adult and kind of taking a few college
01:23:37.600 | classes and deciding to drop out,
01:23:39.600 | I was doing--
01:23:40.880 | I was hardscrabble contract programming work,
01:23:43.600 | really struggling to kind of keep groceries and pay
01:23:46.560 | my rent and things.
01:23:48.160 | And the company that I was doing the most work for
01:23:50.560 | was a company called Softdisk Publishing,
01:23:53.200 | which had the sounds-bizarre-now business
01:23:56.600 | model of monthly subscription software.
01:23:59.760 | This was before there was an internet that people
01:24:01.520 | could connect to and get software from.
01:24:03.680 | You would pay a certain amount.
01:24:05.760 | And every month, they would send you a disk that had some random
01:24:08.520 | software on it.
01:24:09.760 | And people that were into computers
01:24:11.440 | thought this was kind of cool.
01:24:12.680 | And they had different ones for the Apple II, the 2GS, the PC,
01:24:16.680 | the Mac, the Amiga, lots of different things here.
01:24:20.040 | So quirky little business.
01:24:21.400 | But I was doing a lot of contract programming for them
01:24:24.560 | where I'd write tiny little games
01:24:26.480 | and sell them for $300, $500.
01:24:30.000 | And one of the things that I was doing, again,
01:24:32.920 | to keep my head above water here,
01:24:34.440 | was I decided that I could make one program
01:24:38.080 | and I could port it to multiple systems.
01:24:41.120 | So I would write a game like Dark Designs or Catacombs.
01:24:44.920 | And I would develop it on the Apple II, the 2GS,
01:24:47.960 | and the IBM PC, which apparently was the thing that really kind
01:24:53.040 | of piqued the attention of the people working down there.
01:24:56.440 | Like, Jay Wilbur was my primary editor.
01:24:58.520 | And Tom Hall was a secondary editor.
01:25:01.200 | And they kept asking me, it's like, hey,
01:25:02.920 | you should come down and work for us here.
01:25:06.040 | And I pushed it off a couple of times
01:25:08.240 | because I was really enjoying my freedom of kind of being off
01:25:11.240 | on my own, even if I was barely getting by.
01:25:14.040 | I loved it.
01:25:14.640 | I was doing nothing but programming all day.
01:25:17.640 | But I did have enough close scrapes with, like, damn,
01:25:20.720 | I'm just really out of money, that maybe I
01:25:22.520 | should get an actual job rather than contracting
01:25:26.200 | these kind of one-at-a-time things.
01:25:27.880 | And Jay Wilbur was great.
01:25:29.360 | He was like FedExing me the checks
01:25:30.960 | when I would need them to kind of get over
01:25:33.840 | whatever hump I was at.
01:25:35.680 | So I finally took them up on their offer
01:25:38.680 | to come down to Shreveport, Louisiana.
01:25:41.280 | I was in Kansas City at the time.
01:25:43.840 | Drove down through the Ozarks and everything down
01:25:47.600 | to Louisiana and saw the Softdisk offices,
01:25:51.800 | went through, talked to a bunch of people,
01:25:53.760 | met the people I had been working with remotely
01:25:56.520 | at that time.
01:25:57.760 | But the most important thing for me
01:25:59.200 | was I met two programmers there, John Romero and Lane Roathe,
01:26:03.320 | that for the first time ever, I had
01:26:05.080 | met programmers that knew more cool stuff than I did,
01:26:08.480 | where the world was just different back then.
01:26:11.040 | I was in Kansas City.
01:26:12.400 | It was one of those smartest kid in the school,
01:26:14.600 | does all the computer stuff.
01:26:15.960 | The teachers don't have anything to teach him.
01:26:18.080 | But all I had to learn from was these few books
01:26:20.320 | at the library.
01:26:21.240 | It was not much at all.
01:26:23.000 | And there were some aspects of programming
01:26:25.200 | that were kind of black magic to me.
01:26:27.240 | It's like, oh, he knows how to format a track on a low level
01:26:31.560 | drive programming interface.
01:26:34.280 | And I was still not at all sure I was going to take the job.
01:26:38.560 | But I met these awesome programmers
01:26:40.840 | that were doing cool stuff.
01:26:42.320 | And Romero had worked at Origin Systems.
01:26:44.400 | And he had done so many different games ahead of time
01:26:49.000 | that I did kind of quickly decide, yeah,
01:26:51.080 | I'll go take the job down there.
01:26:53.160 | And I settled down there, moved in,
01:26:57.120 | and started working on more little projects.
01:26:59.920 | And the first kind of big change that happened down there
01:27:03.040 | was the company wanted to make a gaming-focused, a PC gaming
01:27:06.440 | focused subscription.
01:27:08.040 | Just like all their others, the same formula
01:27:10.200 | that they used for everything.
01:27:11.840 | Pay a monthly fee, and you'll get a disc with one or two
01:27:15.400 | games just every month.
01:27:16.920 | And no choice in what you get, but we think it'll be fun.
01:27:19.800 | And that was the model they were comfortable with.
01:27:21.920 | And they said, all right, we're going
01:27:23.460 | to start this Gamer's Edge department.
01:27:25.160 | And all of us that were interested in that,
01:27:27.800 | like me, Romero, Tom Hall was kind of helping us
01:27:31.400 | from his side of things.
01:27:33.120 | Jay would peek in.
01:27:34.080 | And we had a few other programmers working
01:27:36.840 | with us at the time.
01:27:38.240 | And we were going to just start making games,
01:27:41.240 | just the same model.
01:27:43.400 | And we dived in, and it was fantastic.
01:27:45.800 | So you had to make new games--
01:27:47.520 | Every month.
01:27:48.240 | --every month.
01:27:48.960 | Yeah.
01:27:49.800 | And this, in retrospect, looking back at it,
01:27:52.680 | that sense that I had done all this contract programming,
01:27:55.320 | and John Romero had done far more of this,
01:27:58.000 | where he had done-- one of his teaching himself efforts
01:28:00.840 | was he made a game for every letter of the alphabet.
01:28:03.120 | It's that sense of, I'm just going
01:28:04.520 | to go make 26 different games, give them a different theme.
01:28:07.680 | And you learn so much when you go through
01:28:10.280 | and you crank these things out on a biweekly, monthly basis,
01:28:14.480 | something like that.
01:28:15.320 | From start to finish.
01:28:16.240 | So it's not just an idea.
01:28:17.840 | It's not just from the very beginning to the very end.
01:28:21.720 | It's done.
01:28:22.840 | It has to be done.
01:28:24.280 | There's no delaying.
01:28:25.280 | It's done.
01:28:25.800 | And you've got deadlines.
01:28:27.200 | And that kind of rapid iteration,
01:28:30.080 | pressure cooker environment was super important for all of us
01:28:34.080 | developing the skills that brought us
01:28:37.280 | to where we eventually went to.
01:28:38.520 | I mean, people would say, like, in the history of the Beatles,
01:28:41.360 | it wasn't them being the Beatles.
01:28:43.120 | It was them playing all of these other early works;
01:28:46.080 | that opportunity to craft all of their skills
01:28:48.320 | before they were famous was
01:28:50.720 | very critical to their later successes.
01:28:53.040 | And I think there's a lot of that here,
01:28:54.960 | where we did these games that nobody remembers,
01:28:58.760 | lots of little things that contributed
01:29:00.960 | to building up the skill set for the things that eventually
01:29:03.480 | did make us famous.
01:29:05.360 | And Dostoevsky wrote The Gambler.
01:29:08.600 | He had to write it in a month just to make money.
01:29:12.480 | And nobody remembers that one, probably.
01:29:14.600 | He had to figure out-- because literally
01:29:18.160 | he didn't have enough time to write it at a normal pace,
01:29:21.160 | so he had to come up with hacks to actually write it
01:29:24.000 | fast enough to finish within a month.
01:29:25.120 | It comes down to that point where pressure and limitation
01:29:27.500 | of resources are surprisingly important.
01:29:30.400 | And it's counterintuitive in a lot of ways,
01:29:32.600 | where you just think that if you've got all the time
01:29:34.360 | in the world, and you've got all the resources in the world,
01:29:36.560 | of course you're going to get something better.
01:29:38.520 | But sometimes it really does work out
01:29:40.400 | that the innovations, mother necessity,
01:29:44.400 | where you can-- or resource constraints,
01:29:46.440 | and you have to do things.
01:29:47.800 | When you don't have a choice, it's
01:29:49.520 | surprising what you can do.
01:29:50.760 | Were there any good games written in that time,
01:29:52.800 | would you say--
01:29:53.440 | Some of them are still fun to go back and play,
01:29:55.760 | where you get the--
01:29:58.240 | they were all about--
01:29:59.880 | the more modern term is game feel,
01:30:01.880 | about how just the exact feel of things-- it's not
01:30:04.520 | the grand strategy of the design,
01:30:06.260 | but how running, and jumping, and shooting, and those things
01:30:09.800 | feel in the moment.
01:30:12.040 | And some of those are still--
01:30:13.480 | if you sat down with them, you kind of go,
01:30:15.200 | it's a little bit different.
01:30:16.040 | It doesn't have the same movement feel,
01:30:17.760 | but you move over, and you're like, bang, jump, bang.
01:30:20.760 | It's like, hey, that's kind of cool still.
01:30:23.160 | So you can get lost in the rhythm of the game.
01:30:26.000 | Is that what you mean by feel?
01:30:27.840 | Just like there's something about it that pulls you in?
01:30:31.360 | Nowadays, again, people talk about compulsion loops
01:30:33.840 | and things, where it's that sense of exactly what you're
01:30:37.440 | doing, what your fingers are doing on the keyboard, what
01:30:39.800 | your eyes are seeing.
01:30:41.120 | And there are going to be these sequences of things.
01:30:43.280 | Grab the loot, shoot the monster, jump over the obstacle,
01:30:45.960 | get to the end of the level.
01:30:47.160 | These are eternal aspects of game design in a lot of ways.
01:30:50.640 | But there are better and worse ways to do all of them.
01:30:53.320 | And we did so many of these games
01:30:55.400 | that we got a lot of practice with it.
01:30:58.960 | So one of the kind of weird things
01:31:01.040 | that was happening at this time is John Romero
01:31:03.480 | was getting some strange fan mail.
01:31:06.800 | And back in the days, this was before email.
01:31:09.160 | So we literally got letters sometimes.
01:31:11.320 | And telling him, it's like, oh, I
01:31:12.960 | want to talk to you about your games.
01:31:14.520 | I want to reach out, different things.
01:31:17.040 | And eventually, it turned out that these
01:31:20.200 | were all coming from Scott Miller at Apogee Software.
01:31:23.920 | And he was reaching out through--
01:31:26.320 | he didn't think he could contact John directly,
01:31:28.320 | that he would get intercepted.
01:31:29.560 | So he was trying to get him to contact him through back
01:31:32.920 | channel fan mail.
01:31:34.240 | Because he basically was saying, hey,
01:31:35.760 | I'm making all this money on shareware games.
01:31:38.640 | I want you to make shareware games.
01:31:40.880 | Because he had seen some of the games that Romero had done.
01:31:44.480 | And we looked at Scott Miller's games.
01:31:47.800 | And we didn't think they were very good.
01:31:50.480 | We're like, that can't be making the kind of money
01:31:53.080 | that he's saying he's making $10,000 or something off
01:31:56.480 | of this game.
01:31:57.160 | And we really thought that he was full of shit,
01:31:59.360 | that it was a lie trying to get him into this.
01:32:03.320 | So that was kind of going on at one level.
01:32:07.920 | And it was funny the moment when Romero realized
01:32:10.160 | that he had some of these letters pinned up
01:32:11.920 | on his wall of all of his fans.
01:32:13.520 | And then we noticed that they all
01:32:14.560 | had the same return address with different names on them,
01:32:17.320 | which was a little bit of a two-edged sword there.
01:32:20.680 | Trying to figure out the puzzle laid out before him.
01:32:23.400 | Yeah, what happened after I kind of coincident with that
01:32:26.600 | was I was working on a lot of the new technologies,
01:32:29.800 | where I was now full on the IBM PC for the first time,
01:32:33.840 | where I was really a long holdout on Apple II forever.
01:32:37.160 | And I loved my Apple II.
01:32:38.960 | It was the computer I always wished I had
01:32:40.640 | when I was growing up.
01:32:41.680 | And when I finally did have one, I
01:32:43.760 | was kind of clinging on to that well past its sort
01:32:46.360 | of good use-by date.
01:32:47.440 | Was it the best computer ever made, you would say?
01:32:50.680 | I wouldn't make judgments like that about it.
01:32:53.080 | But it was positioned in such a way,
01:32:54.640 | especially in the school systems,
01:32:56.200 | that it impacted a whole lot of American programmers,
01:32:59.840 | at least, where there was programs that the Apple IIs got
01:33:03.080 | into the schools.
01:33:03.960 | And they had enough capability that lots of interesting things
01:33:07.680 | happened with them.
01:33:08.680 | In Europe, it was different.
01:33:09.840 | You had your Amigas and Ataris.
01:33:11.360 | And you know, acorns in the UK and things
01:33:14.480 | that had different things.
01:33:16.000 | But in the United States, it was probably
01:33:17.720 | the Apple II that made the most impact for a lot of programmers
01:33:21.520 | of my generation.
01:33:23.200 | But so I was really digging into the IBM.
01:33:26.040 | And this was even more so with the total focus,
01:33:29.240 | because I had moved to another city
01:33:30.680 | where I didn't know anybody that I wasn't working with.
01:33:33.800 | I had a little apartment.
01:33:35.160 | And then at Softdisk, again, the things that drew me to it,
01:33:38.680 | I had a couple programmers that knew more than I did.
01:33:42.160 | And they had a library.
01:33:43.480 | They had a set of books and a set of magazines.
01:33:45.960 | They had a couple years of magazines,
01:33:47.600 | the old Dr. Dobb's Journal and all of these magazines
01:33:50.600 | that had information about things.
01:33:53.080 | And so I was just in total immersion mode.
01:33:56.400 | It was eat, breathe, sleep, computer programming,
01:33:59.360 | particularly the IBM, for everything that I was doing.
01:34:03.240 | And I was digging into a lot of these low-level hardware
01:34:05.840 | details that people weren't usually paying attention to,
01:34:08.840 | the way the IBM EGA cards worked, which was fun for me.
01:34:14.240 | I hadn't had experience with things at that level.
01:34:16.920 | And back then, you could get hardware documentation just
01:34:20.160 | down at the register levels.
01:34:21.440 | This is where the CRTC register is.
01:34:23.880 | This is how the color registers work
01:34:26.000 | and how the different things are applied.
01:34:27.840 | And they were designed for a certain reason.
01:34:30.200 | They were designed for an application.
01:34:31.840 | They had an intended use in mind.
01:34:34.160 | But I was starting to look at other ways
01:34:37.040 | that they could perhaps be exploited that they weren't
01:34:39.240 | initially intended for.
01:34:40.680 | Because you comment on, first of all,
01:34:42.720 | what operating system was there?
01:34:44.040 | What instruction set was it?
01:34:45.840 | Like, what are we talking about?
01:34:48.560 | So this was DOS and x86, so 16-bit 8086.
01:34:52.760 | The 286s were there.
01:34:54.080 | And 386s existed.
01:34:55.480 | They were rare.
01:34:56.840 | We had a couple for our development systems.
01:34:59.360 | But we were still targeting the more broad--
01:35:02.760 | it was all DOS 16-bit.
01:35:04.920 | None of this was kind of DOS extenders and things.
01:35:07.520 | How different is it from the systems of today?
01:35:09.480 | Is it kind of a precursor that's similar?
01:35:12.280 | Very little.
01:35:13.000 | If you open up cmd.exe or command.com on Windows,
01:35:17.680 | you see some of the remnants of all of that.
01:35:19.680 | But it was a different world.
01:35:21.040 | It was the "640K is enough" world.
01:35:23.800 | And nothing was protected.
01:35:25.600 | It crashed all the time.
01:35:26.720 | You had TSRs, or terminate-and-stay-resident hacks,
01:35:29.920 | on top of things that would cause configuration problems.
01:35:33.200 | All the hardware was manually configured in your autoexec.bat.
01:35:37.520 | So it was a very different world.
01:35:39.280 | But the code is still the same, similar.
01:35:41.360 | You could still write it.
01:35:42.560 | My earliest code there was written in Pascal.
01:35:44.680 | That was what I had learned at an earlier point.
01:35:47.720 | So between BASIC and C++, there was Pascal.
01:35:51.160 | So when-- BASIC, assembly language--
01:35:53.520 | Some of my--
01:35:54.120 | Take a step back.
01:35:54.600 | Yeah, my intermediate stuff was-- well,
01:35:55.760 | you had to for performance.
01:35:57.040 | BASIC was just too slow.
01:35:58.400 | So most of the work that I was doing as a contract programmer
01:36:01.760 | in my teenage years was assembly language.
01:36:05.120 | Wait, you wrote games in assembly?
01:36:07.240 | Yeah, complete games in assembly language.
01:36:10.680 | And it's thousands and thousands of lines
01:36:12.760 | of three-letter acronyms for the instructions.
01:36:16.600 | You don't earn the once again greatest programmer ever
01:36:20.720 | label without being able to write a game in assembly.
01:36:23.680 | That's good.
01:36:24.200 | Everybody serious wrote their games in assembly language.
01:36:27.400 | It was kind of a--
01:36:28.360 | Everybody serious.
01:36:29.120 | See what he said?
01:36:29.800 | Everybody serious.
01:36:31.240 | It was an outlier to use Pascal a little bit,
01:36:34.000 | where there was one famous program called Wizardry.
01:36:36.480 | It was like one of the great early role-playing games
01:36:39.520 | that was written in Pascal.
01:36:40.920 | But it was almost nothing used Pascal there.
01:36:43.560 | But I did learn Pascal.
01:36:45.240 | And I remember doing all of my--
01:36:47.080 | like to this day, I sketch in data structures.
01:36:49.280 | When I'm thinking about something,
01:36:51.680 | I'll open up a file, and I'll start
01:36:53.160 | writing struct definitions for how
01:36:55.360 | data is going to be laid out.
01:36:57.000 | And Pascal was kind of formative to that,
01:36:58.960 | because I remember designing my RPGs in Pascal record
01:37:02.280 | structures and things like that.
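(For illustration: the kind of data-structure sketching described here might look something like the following in C. Every name is invented for the example; none of it comes from actual id Software code.)

```c
/* A back-of-the-file sketch: lay out the data before writing any
   logic.  All field names here are hypothetical. */
struct Item {
    char name[16];
    int  weight;
    int  gold_value;
};

struct Character {
    char        name[16];
    int         level;
    int         hit_points;
    int         gold;
    struct Item inventory[8];
};
```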
01:37:04.240 | And so I had gotten a Pascal compiler for the Apple IIGS
01:37:08.000 | that I could work on.
01:37:08.920 | And the first IBM game that I developed, I did in Pascal.
01:37:12.480 | And that's actually kind of an interesting story,
01:37:14.920 | again, talking about the constraints and resources,
01:37:18.000 | where I had an Apple IIGS.
01:37:20.040 | I didn't have an IBM PC.
01:37:21.560 | I wanted to port my applications to IBM,
01:37:24.600 | because I thought I could make more money on it.
01:37:27.040 | So what I wound up doing is I rented a PC for a week
01:37:31.600 | and bought a copy of Turbo Pascal.
01:37:34.000 | And so I had a hard one week.
01:37:35.960 | And this was cutting into what minimal profit margin
01:37:38.240 | I had there.
01:37:38.920 | But I had this computer for a week.
01:37:40.400 | I had to get my program ported before I had to return the PC.
01:37:44.800 | And that was kind of the first thing
01:37:46.800 | that I had done on the IBM PC and what led me
01:37:49.240 | to taking the job at Softdisk.
01:37:51.680 | And Turbo Pascal, how is that different from regular Pascal?
01:37:54.840 | Is it a different compiler or something like that?
01:37:56.920 | So it was a product of Borland, which before Microsoft kind
01:38:00.280 | of killed them, they were the hot stuff developer tools
01:38:03.840 | company.
01:38:04.400 | You had Borland's Turbo Pascal, and Turbo C, and Turbo Prolog.
01:38:08.000 | I mean, all the different things.
01:38:09.560 | But what they did was they took a supremely pragmatic approach
01:38:13.240 | of making something useful.
01:38:14.520 | It was one of these great examples
01:38:16.160 | where Pascal was an academic language.
01:38:18.880 | And you had things like the UCSD p-System
01:38:21.800 | that Wizardry was actually written in,
01:38:23.760 | that they did manage to make a game with that.
01:38:27.040 | But it was not a super practical system.
01:38:30.880 | While Turbo Pascal was--
01:38:32.800 | it was called Turbo because it was blazingly fast to compile.
01:38:35.520 | I mean, really ridiculously 10 to 20 times faster
01:38:39.440 | than most other compilers at the time.
01:38:41.600 | But it also had very pragmatic access to, look,
01:38:44.320 | you can just poke at the hardware
01:38:45.840 | in these different ways.
01:38:46.800 | And we have libraries that let you do things.
01:38:49.320 | And it was a pretty good--
01:38:50.400 | it was a perfectly good way to write games.
01:38:52.320 | And this is one of those things where
01:38:53.900 | people have talked about different paths
01:38:56.480 | that computer development could have taken,
01:38:58.640 | where C took over the world for reasons that came out of Unix
01:39:03.120 | and eventually Linux.
01:39:04.360 | And that was not a foregone conclusion at all.
01:39:07.240 | And people can make real reasoned rational arguments
01:39:10.800 | that the world might have been better
01:39:12.440 | if it had gone a Pascal route.
01:39:14.440 | I'm somewhat agnostic on that, where I do know from experience
01:39:18.600 | it was perfectly good enough to do that.
01:39:21.240 | And it had some fundamental improvements,
01:39:23.160 | like it had range-checked arrays as an option there,
01:39:26.080 | which could avoid many of C's real hazards that
01:39:29.700 | happened in a security space.
01:39:31.560 | But C and Pascal, they were basically operating
01:39:33.600 | at about the same level of abstraction.
01:39:35.600 | It was a systems programming language.
01:39:38.280 | But you said Pascal had more emphasis on data structures.
01:39:41.400 | Actually, in the tree of languages,
01:39:44.920 | did Pascal come before C?
01:39:47.680 | Did it inspire a lot of--
01:39:48.720 | They were pretty contemporaneous.
01:39:50.100 | So Pascal's lineage went to Modula-2 and eventually
01:39:52.760 | Oberon, which was another Niklaus Wirth
01:39:56.880 | kind of experimental language.
01:39:58.680 | But they were all good enough at that level.
01:40:01.360 | Now, some of the classic academic-oriented Pascals
01:40:04.160 | were just missing fundamental things,
01:40:05.680 | like, oh, you can't access this core system thing,
01:40:08.040 | because we're just using it to teach students.
01:40:10.320 | But Turbo Pascal showed that only modest changes to it
01:40:14.040 | really did make it a completely capable language.
01:40:17.120 | And it had some reasons why you could implement it
01:40:19.600 | as a single-pass compiler.
01:40:20.960 | So it could be way, way faster, although less scope
01:40:23.520 | for optimizations if you do it that way.
01:40:26.280 | And it did have some range-checking options.
01:40:28.320 | It had a little bit better typing capability.
01:40:30.760 | You'd have properly typed enums, sorts of things,
01:40:33.160 | and other stuff that C lacked.
01:40:35.200 | But C was also clearly good enough.
01:40:37.920 | And it wound up with a huge inertia
01:40:39.680 | from the Unix ecosystem and everything that came with that.
01:40:42.280 | And Pascal didn't have garbage collection?
01:40:44.040 | No, it was not garbage collected.
01:40:45.480 | It was the same kind of thing as C.
01:40:46.560 | Same manual memory management.
01:40:47.160 | So you could still have your use-after-frees
01:40:49.160 | and all those other problems.
01:40:50.360 | But just getting rid of array overruns,
01:40:53.760 | at least if you compiled with that debugging option,
01:40:56.140 | certainly would have avoided a lot of problems
01:40:58.440 | and could have a lot of benefits.
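(For readers who haven't run into the distinction: Turbo Pascal's `{$R+}` directive traps an out-of-range array index at runtime, while C compiles the overrun silently. A minimal C sketch of the closest manual equivalent; the helper name is invented.)

```c
#include <assert.h>

#define PARTY_SIZE 4
static int gold[PARTY_SIZE];

/* C will happily compile gold[7] = 999; and scribble past the
   array.  Pascal built with range checking halts with a runtime
   error instead.  A hand-rolled guard is the nearest C analogue: */
static int checked_get(const int *a, int len, int i)
{
    assert(i >= 0 && i < len);  /* what {$R+} does for you */
    return a[i];
}

int main(void)
{
    gold[0] = 100;
    return checked_get(gold, PARTY_SIZE, 0);
}
```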
01:40:59.800 | But so anyways, that was the next thing.
01:41:01.480 | I had to learn C, because C was where it seemed like most
01:41:05.960 | of the things were going.
01:41:07.080 | So I abandoned Pascal, and I started working in C.
01:41:09.600 | I started hacking on these hardware things,
01:41:11.640 | dealing with the graphics controllers and the EGA
01:41:14.920 | systems.
01:41:16.000 | And what we most wanted to do-- so at that time,
01:41:20.520 | we were sitting in our darkened office,
01:41:22.260 | playing all the different console video games.
01:41:24.800 | We were figuring out, what games do
01:41:27.720 | we want to make for our Gamer's Edge product there?
01:41:30.480 | And so we had one of the first Super Nintendos sitting there.
01:41:34.200 | And we had an older Nintendo.
01:41:35.960 | And we were looking at all those games.
01:41:37.600 | And the core thing that those consoles
01:41:39.320 | did that you just didn't get on the PC games
01:41:41.960 | was this ability to have a massive scrolling world, where
01:41:45.160 | most of the games that you would make on the PC
01:41:48.000 | and earlier personal computers would be a static screen.
01:41:51.520 | You move little things around on it, and you interact like that.
01:41:55.040 | Maybe you go to additional screens as you move.
01:41:58.320 | But arcade games and consoles had this wonderful ability
01:42:01.920 | to just have a big world that you're slowly
01:42:04.440 | moving your window through.
01:42:06.240 | And that was, for those types of games,
01:42:08.440 | that kind of action exploration adventure games,
01:42:10.800 | that was a super, super important thing.
01:42:13.160 | And PC games just didn't do that.
01:42:16.280 | And what I had come across was a couple different techniques
01:42:19.880 | for implementing that on the PC.
01:42:22.320 | And they're not hard, complicated things.
01:42:25.200 | When I explain them now, they're pretty straightforward.
01:42:28.120 | But just nobody was doing--
01:42:29.280 | You sound like Einstein describing his five papers.
01:42:31.920 | It's pretty straightforward.
01:42:33.120 | I understand.
01:42:34.040 | But they're nevertheless revolutionary.
01:42:36.000 | So side-scrolling is a game changer.
01:42:38.440 | Yeah, and scrolling is--
01:42:39.400 | It's a genius invention.
01:42:40.440 | --whether it's side or vertical.
01:42:41.600 | And some of the consoles had different limitations
01:42:43.680 | about you could do one but not the other.
01:42:46.160 | And there were similar things going on as advancements,
01:42:48.400 | even in the console space, where you'd have--
01:42:50.640 | like the original Mario game was just horizontal scrolling.
01:42:54.600 | And then later Mario games added vertical aspects to it
01:42:57.320 | and different things that you were doing to explore,
01:43:01.280 | kind of expand the capabilities there.
01:43:02.960 | And so much of the early game design for decades
01:43:05.480 | was removing limitations, letting you do things
01:43:08.560 | that you envisioned as a designer,
01:43:10.120 | you wanted the player to experience,
01:43:11.960 | but the hardware just couldn't really--
01:43:14.320 | or you didn't know how to make it happen.
01:43:16.600 | It felt impossible.
01:43:17.720 | You can imagine that you want to create this big world
01:43:21.960 | through which you can side-scroll,
01:43:23.960 | like through which you can walk.
01:43:26.680 | And then you ask yourself a question,
01:43:28.520 | how do I actually build that in a way that's--
01:43:31.920 | like the latency is low enough, the hardware
01:43:35.360 | can actually deliver that in such a way
01:43:37.880 | that it's a compelling experience.
01:43:38.800 | Yeah, and we knew what we wanted to do
01:43:40.400 | because we were playing all of these console games,
01:43:42.720 | playing all these Nintendo games and arcade games.
01:43:45.160 | Clearly, there is a whole world of awesome things
01:43:47.240 | there that we just couldn't do on the PC, at least initially.
01:43:51.360 | Because every programmer can tell,
01:43:52.840 | it's like if you want to scroll, you
01:43:53.960 | can just redraw the whole screen.
01:43:55.440 | But then it turns out, well, you're
01:43:56.880 | going five frames per second.
01:43:58.800 | That's not an interactive, fun experience.
01:44:00.960 | You want to be going 30 or 60 frames per second or something.
01:44:04.760 | And it just didn't feel like that was possible.
01:44:06.840 | It felt like the PCs had to get five times faster for you
01:44:10.480 | to make a playable game there.
01:44:12.680 | And interestingly, I wound up with two completely different
01:44:16.080 | solutions for the scrolling problem.
01:44:18.800 | And this is a theme that runs through everything,
01:44:22.800 | where all of these big technical advancements, it turns out
01:44:25.720 | there's always a couple different ways of doing them.
01:44:28.520 | And it's not like you found the one true way of doing it.
01:44:31.760 | And we'll see this as we go into 3D games and things later.
01:44:35.240 | But so the scrolling, the first set of scrolling tricks
01:44:38.560 | that I got was, the hardware had this ability to--
01:44:43.640 | you could shift inside the window of memory.
01:44:47.680 | So the EGA cards at the time had 256 kilobytes of memory.
01:44:51.960 | And it was awkwardly set up in this planar format,
01:44:55.360 | where instead of having 256 colors or 24-bit color,
01:45:00.760 | you had 16 colors, which is four bits.
01:45:03.440 | So you had four bit planes, 64k a piece.
01:45:06.320 | Of course, 64k is a nice round number for 16-bit addressing.
01:45:10.440 | So your graphics card had a 16-bit window
01:45:13.920 | that you could look at.
01:45:15.240 | And you could tell it to start the video scan out
01:45:17.880 | anywhere inside there.
01:45:19.400 | So there were a couple games that had taken this approach.
01:45:22.160 | You could make a 2-by-2 screen or a 1-by-4 screen,
01:45:25.800 | and you could do scrolling really easily like that.
01:45:28.120 | You could just lay it all out and just pan around there.
01:45:31.040 | But you just couldn't make it any bigger,
01:45:32.760 | because that's all the memory that was there.
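(A rough sketch of the start-address panning being described, written against the standard EGA/VGA CRT controller interface: index port 0x3D4, data port 0x3D5, start-address registers 0x0C and 0x0D. `outportb` is the Borland-era DOS intrinsic from `<dos.h>`; this is illustrative, not production code.)

```c
#include <dos.h>            /* Borland/Turbo C: outportb() */

#define CRTC_INDEX 0x3D4    /* CRT controller index port (color modes) */
#define CRTC_DATA  0x3D5
#define START_HI   0x0C     /* display start address, high byte */
#define START_LO   0x0D     /* display start address, low byte  */

/* Point the hardware scan-out at any byte offset inside the 64K
   plane window -- panning a 2x2 or 1x4 screen without copying a
   single pixel. */
void set_start_address(unsigned short offset)
{
    outportb(CRTC_INDEX, START_HI);
    outportb(CRTC_DATA,  (unsigned char)(offset >> 8));
    outportb(CRTC_INDEX, START_LO);
    outportb(CRTC_DATA,  (unsigned char)(offset & 0xFF));
}
```

(Pixel-fine horizontal panning additionally involves the attribute controller's pel-panning register; the sketch above moves only in whole-byte steps.)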
01:45:35.560 | The first insight to the scrolling that I had was,
01:45:38.240 | well, if we make a screen that's just one tile larger--
01:45:42.480 | we usually had tiles that were 16 pixels by 16 pixels,
01:45:45.800 | the little classic Mario block that you run into.
01:45:48.920 | Lots of art gets drawn that way.
01:45:50.880 | And your screen is a certain number of tiles.
01:45:52.880 | But if you had one little buffer region outside of that,
01:45:56.640 | you could easily pan around inside that 16-pixel region.
01:46:00.000 | That could be perfectly smooth.
01:46:01.760 | But then what happens if you get to the edge
01:46:04.040 | and you want to keep going?
01:46:05.960 | The first way we did scrolling was
01:46:08.400 | what I called adaptive tile refresh, which was really
01:46:11.480 | just a matter of you get to the edge,
01:46:13.640 | and then you go back to the original point,
01:46:16.160 | and then only change the tiles that
01:46:18.600 | are actually different from what was there before.
01:46:21.600 | In most of the games at the time,
01:46:23.280 | if you think about your classic Super Mario Brothers game,
01:46:26.920 | you've got big fields of blue sky,
01:46:30.000 | long rows of the same brick texture.
01:46:33.320 | And there's a lot of commonality.
01:46:35.000 | It's kind of like a data compression thing.
01:46:36.800 | If you take the screen and you set it down
01:46:39.000 | on top of each other, in general, only about 10%
01:46:42.440 | of the tiles were actually different there.
01:46:45.280 | So this was a way to go ahead and say, well,
01:46:48.360 | I'm going to move it back, and then I'm only
01:46:50.200 | going to change those 10%, 20%, whatever percent tiles there.
01:46:54.120 | And that meant that it was essentially five times faster
01:46:57.560 | than if you were redrawing all of the tiles.
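(A minimal sketch of adaptive tile refresh as described: keep a record of what each screen tile currently shows, and redraw only the tiles that differ from what should be there. The dimensions and the `blit_tile` helper are hypothetical.)

```c
#define TILES_W 21          /* visible columns plus a border, hypothetical */
#define TILES_H 15

static unsigned char on_screen[TILES_H][TILES_W]; /* what is drawn now    */
static unsigned char wanted[TILES_H][TILES_W];    /* what should be there */

void blit_tile(int x, int y, unsigned char tile); /* 16x16 blitter, assumed */

/* Redraw only the ~10-20% of tiles that actually changed rather than
   all of them -- the source of the roughly five-times speedup. */
void adaptive_refresh(void)
{
    int x, y;
    for (y = 0; y < TILES_H; y++)
        for (x = 0; x < TILES_W; x++)
            if (on_screen[y][x] != wanted[y][x]) {
                blit_tile(x, y, wanted[y][x]);
                on_screen[y][x] = wanted[y][x];
            }
}
```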
01:46:59.880 | And that worked well enough for us
01:47:01.320 | to do a bunch of these games for Gamer's Edge.
01:47:04.880 | We had a lot of these scrolling games, like Slordax
01:47:07.160 | and Shadow Knights and things like that,
01:47:09.280 | that we were cranking out at this high rate that
01:47:11.440 | had this scrolling effect on it.
01:47:13.600 | And it worked well enough.
01:47:14.680 | There were design challenges there
01:47:16.160 | where, if you made-- the worst case,
01:47:18.240 | if you made a checkerboard over the entire screen,
01:47:20.520 | you scroll over one, and every single tile changes,
01:47:23.240 | and your frame rate's now five frames per second because it
01:47:25.760 | had to redraw everything.
01:47:27.200 | So the designers had a little bit
01:47:28.720 | that they had to worry about.
01:47:29.960 | They had to make these relatively plain-looking
01:47:32.040 | levels.
01:47:32.840 | But it was still pretty magical.
01:47:34.920 | It was something that we hadn't seen before.
01:47:37.560 | And the first thing that we wound up doing with that
01:47:41.400 | was I had just gotten this working,
01:47:43.760 | and Tom Hall was sitting there with me.
01:47:46.320 | And we were looking over at our Super Nintendo
01:47:49.240 | on the side there with Super Mario 3 running.
01:47:52.600 | And we had the technology.
01:47:54.600 | We had the tools set up there.
01:47:56.360 | And we stayed up all night.
01:47:57.880 | And we basically cloned the first level
01:47:59.840 | of Super Mario Brothers.
01:48:01.280 | Performance-wise as well?
01:48:02.560 | Yeah.
01:48:03.560 | And we had our little character running and jumping in there.
01:48:07.520 | It was close to pixel accurate as far as all the backgrounds
01:48:10.680 | and everything.
01:48:11.280 | But the gaming was just stuff that we cobbled together
01:48:13.520 | from previous games that I had written.
01:48:15.360 | I just really kit bashed the whole thing together
01:48:18.080 | to make this demo.
01:48:19.520 | And that was one of the rare cases
01:48:21.480 | when I said I don't usually do these all-night programming
01:48:24.240 | things.
01:48:24.920 | There's probably only two memorable ones
01:48:26.880 | that I can think about.
01:48:28.560 | One was the all-nighter to go ahead and get
01:48:32.000 | our Dangerous Dave in Copyright Infringement,
01:48:34.520 | is how we titled it.
01:48:35.440 | Because we had a game called Dangerous Dave,
01:48:37.280 | which was running around with a shotgun shooting things.
01:48:40.440 | And we were just taking our most beloved game at the time there,
01:48:43.400 | Super Mario 3, and sort of sticking Dave inside that
01:48:46.800 | with this new scrolling technology that
01:48:48.880 | was scrolling perfectly smoothly as it ran.
01:48:54.560 | And Tom and I just kind of blearily the next morning
01:48:57.640 | kind of left.
01:48:58.360 | And we left a disk on the desk for John Romero and Jay Wilbur
01:49:03.360 | to see and just said, run this.
01:49:05.240 | And we eventually made it back in later in the day.
01:49:08.360 | And it was like they grabbed us and pulled us into the room.
01:49:13.280 | And that was the point where they were like,
01:49:15.680 | we got to do something with this.
01:49:17.760 | We're going to make a company.
01:49:19.280 | We're going to go make our own games, where this was something
01:49:22.320 | that we were able to just kind of hit them
01:49:24.960 | with a hammer of an experience.
01:49:26.360 | Like, wow, this is just so much cooler than what
01:49:29.560 | we thought was possible there.
01:49:31.640 | And initially, we tried to get Nintendo
01:49:33.840 | to let us make Super Mario 3 on the PC.
01:49:36.880 | That's really what we wanted to do.
01:49:38.440 | We were like, hey, we can finish this.
01:49:40.240 | It's line of sight; this will be great.
01:49:42.600 | And we sent something to Nintendo.
01:49:45.160 | And we heard that it did get looked at in Japan.
01:49:48.160 | And they just weren't interested in that.
01:49:50.440 | But that's another one of those, life
01:49:52.020 | could have gone a very different way, where we could have been
01:49:54.760 | like Nintendo's house PC team at that point.
01:49:58.480 | And that would define the direction-- Wolfenstein and Doom and Quake
01:50:06.680 | could have been a Nintendo creation.
01:50:08.920 | Yeah.
01:50:09.420 | So at the same time that we were just
01:50:11.600 | doing our first scrolling demos, we
01:50:14.640 | reached out to Scott Miller at Apogee and said,
01:50:17.680 | it's like, hey, we do want to make some games.
01:50:20.160 | These things that you think you want, those are nothing.
01:50:22.640 | What do you see what we can actually do now?
01:50:24.520 | This is going to be amazing.
01:50:26.280 | And he just popped right up and sent a check to us,
01:50:29.320 | where at that point, we still thought he might be a fraud,
01:50:32.560 | that he was just lying about all of this.
01:50:34.480 | But he was totally correct on how much money he was making
01:50:37.560 | with his shareware titles.
01:50:39.440 | And this was his kind of real brainstorm
01:50:42.560 | about this, where shareware was this idea that software
01:50:46.000 | doesn't have a fixed price.
01:50:47.240 | If you use it, you send, out of the goodness of your heart,
01:50:49.840 | some money to the creator.
01:50:51.760 | And there were a couple utilities
01:50:53.300 | that did make some significant success like that.
01:50:55.920 | But for the most part, it didn't really work.
01:50:58.600 | There wasn't much software in a pure shareware
01:51:01.040 | model that was successful.
01:51:03.880 | The Apogee innovation was to take something,
01:51:07.680 | call it shareware, split it into three pieces.
01:51:10.160 | You always made a trilogy.
01:51:12.200 | And you would put the first piece out.
01:51:14.360 | But then you buy the whole trilogy
01:51:16.160 | for some shareware amount, which in reality
01:51:19.400 | meant that the first part was a demo--
01:51:22.440 | the demo went everywhere for free.
01:51:24.040 | And you paid money to get the whole set.
01:51:26.560 | But it was still played as shareware.
01:51:28.600 | And we were happy to have the first one go everywhere.
01:51:31.120 | And it wasn't a crippled demo, where
01:51:32.720 | the first episode of all of these trilogies,
01:51:34.960 | it was a real complete game.
01:51:36.400 | And probably 20 times as many people played that part of it,
01:51:39.800 | thought they had a great game, had fond memories of it,
01:51:43.520 | but never paid us a dime.
01:51:45.360 | But enough people were happy with that,
01:51:48.200 | where it was really quite successful.
01:51:50.680 | And these early games that we didn't think very much of
01:51:53.160 | compared to commercial quality games,
01:51:55.720 | but they were doing really good business,
01:51:57.720 | some fairly crude things.
01:51:59.280 | And people-- it was good business.
01:52:01.280 | People enjoyed it.
01:52:02.280 | And it wasn't like you were taking a crap shoot
01:52:04.660 | on what you were getting.
01:52:05.740 | You just played a third of the experience.
01:52:07.720 | And you loved it enough to handwrite out a check
01:52:10.800 | and put it in an envelope and address it and send it out
01:52:13.800 | to Apogee to get the rest of them.
01:52:16.600 | So it was a really pretty feel-good business prospect
01:52:20.280 | there, because everybody was happy.
01:52:23.120 | They knew what they were getting when they sent it in.
01:52:25.800 | And they would send in fan mail.
01:52:27.140 | If you're going to the trouble of addressing a letter
01:52:29.480 | and filling out an envelope, you write something in it.
01:52:32.620 | And there were just the literal bags of fan mail
01:52:35.360 | for the shareware games.
01:52:37.800 | So people loved them.
01:52:38.880 | - I should mention that for you,
01:52:41.280 | the definition of wealth is being able to have pizza
01:52:45.240 | whenever you want.
01:52:46.540 | For me, there was a dream,
01:52:49.000 | 'cause I would play Shareware games over and over,
01:52:51.360 | the part that's free, over and over.
01:52:53.800 | And it was very deeply fulfilling experience.
01:52:56.480 | But I dreamed of a time when I could actually afford
01:53:01.160 | the full experience.
01:53:02.120 | And this is kind of this dreamland beyond the horizon,
01:53:05.880 | when you could find out what else is there.
01:53:09.360 | In some sense, even just playing the Shareware was,
01:53:14.360 | it's the limitation of that.
01:53:18.280 | Life is limited.
01:53:20.200 | Eventually we all die.
01:53:21.520 | In that way, Shareware was somehow really fulfilling
01:53:26.520 | to have this kind of mysterious thing beyond what's free,
01:53:31.560 | always there.
01:53:32.400 | It's kind of, I don't know.
01:53:33.960 | That was, maybe it's because a part of my childhood
01:53:36.360 | is playing Shareware games.
01:53:37.600 | That was a really fulfilling experience.
01:53:39.520 | It's so interesting how that model still brought joy
01:53:43.120 | to so many people, the 20X people that played it.
01:53:46.000 | - I felt very good about that.
01:53:47.360 | I would run into people that would say,
01:53:49.640 | "Oh, I loved that game that you had early on,
01:53:52.200 | Commander Keen, whatever."
01:53:53.520 | And no, they meant just the first episode
01:53:57.520 | that they got to see everywhere.
01:53:58.720 | - That's me, I played the crap out of Commander Keen.
01:54:01.080 | That was all good.
01:54:03.280 | - Yeah, yeah.
01:54:04.120 | - But so we were in this position where Scott Miller
01:54:06.760 | was just fronting us cash and saying,
01:54:08.360 | "Yeah, make a game."
01:54:10.400 | But we did not properly pull the trigger and say,
01:54:14.040 | "All right, we're quitting our jobs."
01:54:16.360 | We were like, "We're gonna do both.
01:54:17.880 | We're gonna keep working at Softdisk, working on this.
01:54:21.240 | And then we're going to go ahead and make a new game
01:54:24.680 | for Apogee at the same time."
01:54:27.040 | And this eventually did lead to some legal problems.
01:54:29.520 | And we had trouble, it all got worked out in the end,
01:54:32.760 | but it was not a good call at the time there.
01:54:35.800 | - And your legal mind at the time was not stellar.
01:54:39.320 | You were not thinking in legal terms.
01:54:42.240 | - No, I definitely wasn't, none of us were.
01:54:45.200 | And in hindsight, yeah, it's like,
01:54:47.560 | how did we think we were gonna get away with
01:54:49.160 | like even using our work computers to write software
01:54:52.720 | for our breakaway new company?
01:54:56.800 | It was not a good plan.
01:54:58.360 | - How did Commander Keen come to be?
01:55:00.960 | - So the design process, we would start from,
01:55:04.080 | we had some idea of what we wanted to do.
01:55:05.880 | We wanted to do a Mario-like game.
01:55:08.320 | It was gonna be a side scroller.
01:55:10.160 | It was gonna use the technology.
01:55:11.800 | We had some sense of what it would have to look like
01:55:14.320 | because of the limitations
01:55:15.480 | of this adaptive tile refresh technology.
01:55:17.960 | It had to have fields of relatively constant tiles.
01:55:21.200 | You couldn't just paint up a background
01:55:23.480 | and then move that around.
01:55:25.920 | The early design or all the design for Commander Keen
01:55:28.600 | really came from Tom Hall,
01:55:30.040 | where he was kind of the main creative mind
01:55:34.880 | for the early id software stuff,
01:55:36.960 | where we had an interesting division of things
01:55:39.080 | where Tom was all creative and design.
01:55:42.240 | I was all programming.
01:55:43.600 | John Romero was an interesting bridge
01:55:45.560 | where he was both a very good programmer
01:55:47.720 | and also a very good designer and artist
01:55:50.160 | and kind of straddled between the areas.
01:55:52.480 | But Commander Keen was very much Tom Hall's baby.
01:55:55.520 | And he came up with all the design and backstory
01:55:59.280 | for the different things of kind of a mad scientist
01:56:02.320 | little kid with, you know, building a rocket ship
01:56:06.440 | and a zap gun and visiting alien worlds
01:56:09.080 | and doing all of this,
01:56:10.800 | the background that we lay the game inside of.
01:56:13.640 | And there's not a whole lot to any of these things.
01:56:16.480 | You know, design for us was always just what we needed to do
01:56:19.520 | to make the game that was gonna be so much fun to play.
01:56:23.080 | And we laid out our first trilogy of games,
01:56:26.560 | you know, the shareware formula.
01:56:27.920 | It was gonna be three pieces.
01:56:29.440 | We make Commander Keen 1, 2, and 3.
01:56:31.800 | And we just really started busting on all that work.
01:56:35.880 | And it went together really quickly.
01:56:37.680 | It was like three months or something
01:56:39.600 | that while we were still making games every month
01:56:41.920 | for Gamers Edge, we were sharing technology between that.
01:56:45.480 | I'd write a bunch of code for this,
01:56:46.800 | and we'd just kind of use it for both.
01:56:49.000 | Again, not a particularly good idea there
01:56:50.880 | that had consequences for us.
01:56:53.040 | But in three months, we got our first game out,
01:56:57.240 | and all of a sudden, it was three times as successful
01:57:00.680 | as the most successful thing Apogee had had before.
01:57:03.040 | And we were making like $30,000 a month
01:57:06.360 | immediately from the Commander Keen stuff.
01:57:09.120 | And that was, again, a surprise to us.
01:57:11.720 | It was more than we thought it was gonna make.
01:57:15.320 | And we said, "Well, we're gonna certainly roll
01:57:17.320 | into another set of titles from this."
01:57:20.000 | And in that three months,
01:57:21.280 | I had come up with a much better way
01:57:23.080 | of doing the scrolling technology
01:57:24.880 | that was not the adaptive tile refresh,
01:57:27.240 | which in some ways was even simpler.
01:57:29.200 | And these things, so many of the great ideas of technology
01:57:33.440 | are things that are back-of-the-envelope designs.
01:57:36.320 | I make this comment about modern machine learning
01:57:38.400 | where all the things that are really important
01:57:40.800 | practically in the last decade are,
01:57:42.760 | each of them fits on the back of an envelope.
01:57:44.480 | There are these simple little things.
01:57:46.360 | They're not super dense, hard-to-understand technologies.
01:57:51.360 | And so the second scrolling trick was just a matter of like,
01:57:55.040 | "Okay, we know we've got this 64K window."
01:57:58.560 | And the question was always like,
01:57:59.960 | "Well, you could make a two-by-two,
01:58:02.280 | but you can't go off the edge."
01:58:05.520 | But I finally asked, "Well, what actually happens
01:58:07.800 | if you just go off the edge?"
01:58:09.600 | If you take your start and you say, it's like,
01:58:12.200 | "Okay, I can move over. I'm scrolling.
01:58:14.440 | I can move over. I can move down. I'm scrolling."
01:58:16.920 | I get to what should be the bottom of the memory window.
01:58:19.760 | It's like, "Well, what if I just keep going?"
01:58:21.600 | And I say, "I'm going to start at, you know,
01:58:24.000 | what happens if I start at FFFE
01:58:26.360 | at the very end of the 64K block?"
01:58:29.400 | And it turns out it just wraps back around
01:58:31.920 | to the top of the block.
01:58:33.280 | And I'm like, "Oh, well, this makes everything easy.
01:58:35.840 | You can just scroll the screen everywhere,
01:58:37.600 | and all you have to draw is just one new line of tiles,
01:58:40.440 | whichever thing you expose.
01:58:42.080 | It might be unaligned off various parts
01:58:44.640 | of the screen memory, but it just works."
01:58:48.160 | That no longer had the problem where you had to have fields
01:58:50.800 | of similar colors,
01:58:52.200 | because it doesn't matter what you're doing.
01:58:54.280 | You could be having a completely unique world,
01:58:56.840 | and you're just drawing the new strip as it comes on.
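(The second trick, sketched under the same assumptions as the earlier snippet: the start address is a 16-bit quantity, so stepping it past 0xFFFF wraps back to the top of the 64K window by itself, and each scroll step only needs the newly exposed strip of tiles drawn. `draw_new_tile_row` is a hypothetical helper.)

```c
void set_start_address(unsigned short offset);   /* as sketched above */
void draw_new_tile_row(unsigned short start);    /* assumed helper    */

unsigned short start = 0;   /* current scan-out offset */

void scroll_down_one_row(unsigned short bytes_per_row)
{
    start += bytes_per_row;    /* 16-bit add: stepping past 0xFFFE wraps
                                  around to the top of the block */
    set_start_address(start);
    draw_new_tile_row(start);  /* draw only the strip just exposed,
                                  aligned or not -- it just works */
}
```

(The two page-flipped screens mentioned next march through the same window in series; as long as their offsets never overlap, neither one cares where it lands.)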
01:58:59.440 | - But it might be, like you said, unaligned,
01:59:02.200 | so it can be all over the place.
01:59:03.520 | - Yeah, and it turns out it doesn't matter.
01:59:04.960 | I would have two page-flipped screens.
01:59:06.680 | As long as they didn't overlap,
01:59:08.000 | they moved in series through this two-dimensional window
01:59:11.760 | of graphics, and that was one of those, like,
01:59:14.360 | "Well, this is so simple.
01:59:15.760 | This just works.
01:59:17.880 | It's faster.
01:59:19.600 | It seemed like there was no downside."
01:59:21.800 | Funny thing was, it turned out,
01:59:24.240 | after we shipped titles with this,
01:59:26.440 | there were what they called "super VGA cards,"
01:59:29.680 | the cards that would allow higher resolutions
01:59:31.920 | and different features that the standard ones didn't.
01:59:35.760 | And on some of those cards,
01:59:38.360 | this was a weird compatibility quirk, again,
01:59:40.360 | because nobody thought about this-- it was not
01:59:41.880 | what it was designed to do.
01:59:43.600 | And some of those cards had more memory.
01:59:45.640 | They had more than just 256K in four planes.
01:59:48.800 | They had 512K or a megabyte.
01:59:51.480 | And on some of those cards, I scroll my window down,
01:59:55.680 | and then it goes into uninitialized memory
01:59:57.920 | that actually exists there,
01:59:59.040 | rather than wrapping back around to the top.
02:00:01.800 | And then I was in the tough position of,
02:00:04.280 | "Do I have to track every single one of these?"
02:00:06.240 | And it was a madhouse back then with --
02:00:08.400 | There were 20 different video card vendors
02:00:10.680 | with all slightly different implementations
02:00:12.680 | of their nonstandard functionality.
02:00:14.840 | So either I needed to natively program
02:00:17.560 | all of the VGA cards there to map in that memory
02:00:22.080 | and keep scrolling down through all of that,
02:00:24.320 | or I kind of punted and took the easy solution of,
02:00:27.520 | when you finally did run to the edge of the screen,
02:00:30.280 | I accepted a hitch
02:00:31.400 | and just copied the whole screen up there.
02:00:33.320 | So on some of those cards, it was a compatibility mode.
02:00:38.320 | In the normal ones, when it all worked fine,
02:00:40.160 | everything was just beautifully smooth.
02:00:42.080 | But if you had one of those cards
02:00:43.440 | where it did not wrap the way I wanted it to,
02:00:46.320 | you'd be scrolling around, scrolling around,
02:00:48.840 | and then eventually you'd have a little hitch
02:00:50.560 | where 200 milliseconds or something
02:00:52.880 | that was not super smooth.
02:00:54.560 | - Yeah, it froze a little bit.
02:00:56.120 | And so it's the binary thing.
02:00:57.840 | Is it one of the standard screens
02:00:59.640 | or is it one of the weird ones, the super VGA ones?
02:01:02.040 | - Yeah. - Okay.
02:01:03.040 | - And so we would default to --
02:01:04.600 | And I think that was one of those
02:01:05.880 | that changed over the kind of course of deployment,
02:01:08.960 | where early on, we would have a normal mode,
02:01:11.040 | and then you would enable the compatibility flag
02:01:13.480 | if your screen did this crazy flickery thing
02:01:16.160 | when you got to a certain point in the game.
02:01:18.640 | And then later, I think it probably got enabled by default
02:01:21.400 | as just more and more of the cards
02:01:23.520 | kind of did not do exactly the right thing.
02:01:26.200 | And that's the two-edged sword
02:01:27.480 | of doing unconventional things with technology,
02:01:30.280 | where you can find something that nobody thought about
02:01:33.080 | doing that kind of scrolling trick
02:01:34.520 | when they set up those cards.
02:01:36.760 | But the fact that nobody thought
02:01:37.880 | about it was exactly the reason why, when I was relying on that,
02:01:40.800 | I wound up being broken on some of the later cards.
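(A sketch of the compatibility fallback described, with hypothetical names and sizes: on a card with more than 256K that does not wrap at 64K, copy the visible page back to offset 0 before running off the end, which is the one-time hitch.)

```c
#define PAGE_BYTES 0x4000u   /* one visible page, hypothetical size */

extern unsigned short start;                    /* from the sketch above */
void set_start_address(unsigned short offset);
void vram_copy(unsigned short src, unsigned short dst,
               unsigned short bytes);           /* assumed VRAM-to-VRAM copy */

void scroll_step(unsigned short bytes_per_row, int card_wraps_at_64k)
{
    /* Near the end of the window on a non-wrapping card?  Take the
       ~200 ms hitch and snap everything back to the top. */
    if (!card_wraps_at_64k &&
        start > (unsigned short)(0xFFFFu - PAGE_BYTES - bytes_per_row)) {
        vram_copy(start, 0, PAGE_BYTES);
        start = 0;
    }
    start += bytes_per_row;
    set_start_address(start);
}
```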
02:01:44.000 | - Let me take a bit of a tangent,
02:01:45.720 | but ask you about the hacker ethic,
02:01:50.240 | 'cause you mentioned shareware.
02:01:51.720 | It's an interesting world,
02:01:53.160 | the world of people that make money,
02:01:56.640 | the business and the people that build systems,
02:02:00.440 | the engineers.
02:02:02.040 | And what is the hacker ethic?
02:02:05.920 | You've been a man of the people
02:02:08.400 | and you've embodied at least the part of that ethic.
02:02:12.400 | What does it mean?
02:02:13.240 | What did it mean to you at the time?
02:02:14.400 | What does it mean to you today?
02:02:15.760 | - So, Steven Levy's book "Hackers"
02:02:17.840 | was a really formative book for me as a teenager.
02:02:21.560 | I mean, I read it several times
02:02:23.600 | and there was all of the great lore
02:02:25.920 | of the early MIT era of hackers
02:02:28.920 | and you ending up at the end with,
02:02:30.920 | it kind of went through the early MIT hackers
02:02:34.040 | and then the Silicon Valley hardware hackers
02:02:36.040 | and then the game hackers in part three.
02:02:39.360 | And at that time as a teenager,
02:02:41.640 | I really was kind of bitter in some ways,
02:02:44.440 | like I thought I was born too late.
02:02:46.240 | I thought I missed the window there.
02:02:48.960 | And I really thought I belonged
02:02:50.960 | in that third section of that book with the game hackers.
02:02:53.560 | And they were talking about the Williamses at Sierra
02:02:56.720 | and Origin Systems with Richard Garriott.
02:02:58.800 | And it's like, I really wanted to be there.
02:03:02.520 | And I knew that was now a few years in the past.
02:03:05.560 | It was, you know, it was not to be.
02:03:08.680 | But the early days, especially the early MIT hacker days,
02:03:12.440 | talking a lot about this sense of the hacker ethic,
02:03:16.040 | that there was this sense that it was about
02:03:18.880 | sharing information, being good,
02:03:20.680 | not keeping it to yourself,
02:03:22.400 | and that it's not a zero-sum game,
02:03:24.680 | that you can share something with another programmer
02:03:28.360 | and it doesn't take it away from you.
02:03:30.040 | You know, you then have somebody else doing something.
02:03:33.080 | And I also think that there's an aspect of it
02:03:35.880 | where it's this ability to take joy
02:03:39.120 | in other people's accomplishments,
02:03:41.160 | where it's not the cutthroat bit of like,
02:03:43.000 | I have to be first, I have to be recognized
02:03:45.400 | as the one that did this in some way,
02:03:48.160 | but being able to see somebody else do something
02:03:51.520 | and say, holy shit, that's amazing, you know,
02:03:53.560 | and just taking joy in the ability of something amazing
02:03:56.880 | that somebody else does.
02:03:58.760 | And the big thing that I was able to do through id Software
02:04:03.960 | was this ability to eventually release the source code
02:04:07.080 | for most of our, like, all of our really seminal game titles.
02:04:10.720 | And that was a, it was a stepping stone process
02:04:13.560 | where we were kind of surprised early on
02:04:16.440 | where people were able to hack the existing games.
02:04:20.040 | And of course, I had experience with that.
02:04:21.480 | I remember hacking my copies of Ultima,
02:04:23.440 | so I'd give myself, you know, 999 gold
02:04:26.000 | and raise my levels and, you know,
02:04:27.680 | break out the sector editor.
02:04:29.000 | And so I was familiar with all of that.
02:04:30.960 | So it was just, it was with a smile
02:04:32.920 | when I started to see people doing that to our games.
02:04:35.560 | you know, making level editors for Commander Keen
02:04:38.880 | or hacking up Wolfenstein 3D.
02:04:41.760 | But I made the pitch internally
02:04:44.880 | that we should actually release our own tools
02:04:47.480 | for like what we did, what we used to create the games.
02:04:51.360 | And that was, you know, that was a little bit debatable about,
02:04:55.040 | well, you know,
02:04:56.720 | will it give people a leg up?
02:04:57.880 | It's always like, what's that going to mean
02:04:59.240 | for the competition?
02:05:00.880 | But the really hard pitch was to actually release
02:05:04.720 | the full source code for the games.
02:05:06.760 | And it was a balancing act with the other people
02:05:10.320 | inside the company where it's interesting
02:05:13.280 | how the programmers generally did get,
02:05:17.000 | certainly the people that I worked closely with,
02:05:20.320 | they did kind of get that hacker ethic bit
02:05:22.560 | where you wanted to share your code.
02:05:24.600 | You were proud of it.
02:05:25.720 | You wanted other people to take it and do cool things with it.
02:05:28.880 | But interestingly, the broader game industry
02:05:33.840 | is a little more hesitant to embrace that
02:05:36.440 | than like the group of people
02:05:37.920 | that we happen to have at id Software
02:05:40.080 | where it was always a little interesting to me
02:05:42.440 | seeing how a lot of people in the game modding community
02:05:45.480 | were very possessive of their code.
02:05:47.560 | They did not want to share their code.
02:05:49.240 | They wanted it to be theirs.
02:05:50.480 | It was their, you know, claim to fame.
02:05:52.520 | And that was much more like what we tended to see with artists
02:05:56.120 | where, you know, the artists understand something
02:05:58.280 | about credit and, you know,
02:06:00.240 | wanting it to be known as their work.
02:06:02.120 | And a lot of the game programmers
02:06:05.240 | felt a little bit more like artists
02:06:06.920 | than like hacker programmers
02:06:08.520 | in that it was about building something
02:06:10.920 | that maybe felt more like art to them
02:06:12.920 | than the more tool-based and exploration-based
02:06:16.680 | kind of hacking culture side of things.
02:06:19.000 | - Yeah, it's so interesting that this kind of fear
02:06:23.720 | that credit will not be sufficiently attributed to you.
02:06:27.840 | - And that's one of the things that I do bump into a lot
02:06:30.360 | because I try not to go...
02:06:34.240 | I mean, it's easy for me to say
02:06:35.360 | because so much credit is heaped on me
02:06:37.200 | for the id Software side of things.
02:06:39.320 | But when people come up
02:06:40.760 | and they want to pick a fight and say,
02:06:42.480 | "No, it's like that wasn't where
02:06:43.840 | first-person gaming came from."
02:06:45.680 | And you can point to, you know,
02:06:47.080 | you can point to some things in obscure titles
02:06:50.160 | that I was never aware of,
02:06:51.560 | or like the old PLATO systems,
02:06:53.280 | or, you know, each personal computer
02:06:55.200 | had something that was 3D-ish and moving around.
02:06:58.280 | And I'm, you know, and I'm happy to say it's like,
02:07:00.720 | no, I mean, I saw Battlezone and Star Wars in the arcades.
02:07:03.840 | I had seen 3D graphics.
02:07:05.240 | I had seen all these things there.
02:07:06.760 | I'm standing on the shoulders of lots of other people,
02:07:09.040 | but sometimes these examples they pull out,
02:07:10.800 | it's like, "No, I didn't know that existed."
02:07:12.480 | I mean, I had never heard of that before then.
02:07:15.240 | And that didn't contribute to what I made,
02:07:17.840 | but there's plenty of stuff that did.
02:07:19.480 | And, you know, I think there's good cases to be made
02:07:23.280 | that obviously Doom and Quake and Wolfenstein
02:07:26.200 | were formative examples for
02:07:29.360 | everything that came after that.
02:07:31.760 | But I don't feel the need to go fight and
02:07:34.720 | claim primacy or initial invention of anything like that.
02:07:38.800 | But a lot of people do want to.
02:07:40.480 | - I think when you fight for the credit in that way,
02:07:43.240 | and it does go against the hacker ethic,
02:07:45.320 | you destroy something fundamental about the culture,
02:07:48.840 | about the community that builds cool stuff.
02:07:51.800 | I think credit ultimately sorts itself out.
02:07:58.400 | There's a famous wrestler in freestyle wrestling
02:08:01.000 | called Buvaisar Saitiev.
02:08:04.000 | And he always preached that you should just focus
02:08:07.760 | on the art of the wrestling
02:08:09.520 | and let people write your story however they want.
02:08:14.520 | The highest form of the art is just focusing on the art.
02:08:20.080 | And that is something about the hacker ethic
02:08:23.800 | is just focus on building cool stuff,
02:08:26.920 | sharing it with other cool people,
02:08:29.400 | and credit will get assigned correctly
02:08:32.520 | in the long arc of history.
02:08:37.280 | - Yeah, and I generally think that's true.
02:08:39.240 | And there are some things, like
02:08:43.040 | there's a graphics technique
02:08:44.240 | that got labeled Carmack's Reverse.
02:08:46.240 | I didn't name it that myself.
02:08:48.400 | And it turned out that I wasn't the first person
02:08:50.320 | to figure that out.
02:08:51.880 | Most scientific things or mathematical things,
02:08:54.360 | you wind up, it's like,
02:08:55.200 | oh, this other person had actually done that somewhat before.
02:08:58.600 | And then there's things that get attributed to me,
02:09:00.320 | like the inverse square root hack
02:09:02.000 | that I actually didn't do.
02:09:03.320 | I flat out, that wasn't me.
02:09:04.760 | And it's weird how the memetic power of the internet,
02:09:07.920 | I cannot convince people of that.
02:09:09.840 | - You're like the Mark Twain of programming.
02:09:12.120 | Everything just gets attributed to you now,
02:09:14.040 | even though you've never sought the credit of things.
02:09:17.200 | I mean, part of it is that the humility behind that
02:09:21.160 | is what attracts the attributions.
02:09:24.040 | Let's talk about a game,
02:09:27.480 | I mean, one of the greatest games ever made.
02:09:29.480 | I know you could talk about doing Quake and so on,
02:09:31.400 | but to me, Wolfenstein 3D was like, whoa.
02:09:35.280 | It blew my mind that a world like this
02:09:37.720 | could exist.
02:09:38.560 | So how did Wolfenstein 3D come to be
02:09:40.840 | in terms of the programming, in terms of the design,
02:09:44.200 | in terms of some of the memorable technical challenges?
02:09:47.160 | And also actually just something you haven't mentioned,
02:09:51.120 | how did these ideas come to be inside your mind,
02:09:57.960 | the adaptive side-scrolling,
02:10:00.560 | the solutions to these technical challenges?
02:10:03.600 | - So I usually can introspectively pull back
02:10:06.920 | pretty detailed accounts of how technology solutions
02:10:11.360 | and design choices on my part came to be,
02:10:14.000 | where technically we had done two games,
02:10:17.400 | 3D games like that before,
02:10:19.200 | where Hover Tank was the first one
02:10:20.960 | which had flat shaded walls,
02:10:22.760 | but did have the scaled enemies inside it.
02:10:25.240 | And then Catacombs 3D,
02:10:26.960 | which had textured walls, scaled enemies,
02:10:30.120 | and some more functionality like the disappearing walls
02:10:35.640 | and some other stuff.
02:10:37.160 | But what's really interesting
02:10:38.680 | from a game development standpoint
02:10:40.360 | is those games, Catacombs 3D, Hover Tank, and Wolfenstein,
02:10:45.040 | they literally used the same code
02:10:48.280 | for a lot of the character behavior
02:10:50.280 | that a 2D game that I had made earlier called Catacombs did,
02:10:54.160 | where it was an overhead view game, kind of like Gauntlet.
02:10:56.760 | You're running around and you can open up doors,
02:10:58.840 | pick up items, basic game stuff.
02:11:01.320 | And the thought was that this exact same game experience
02:11:06.320 | just presented in a different perspective.
02:11:09.120 | It could be literally the same game,
02:11:11.400 | just with a different view into it,
02:11:13.520 | would have a dramatically different impact on the players.
02:11:16.880 | - So it wasn't a true 3D,
02:11:20.320 | you're saying that you could kind of fake it,
02:11:22.480 | you can like scale enemies,
02:11:24.200 | meaning things that are farther away,
02:11:25.760 | you can make them smaller.
02:11:27.480 | - So the game was a 2D map,
02:11:29.800 | like all of our games used the same tool for creation.
02:11:33.080 | We used the same map editor for creating Keen
02:11:36.360 | as for Wolfenstein and Hover Tank and Catacombs and all this stuff.
02:11:39.120 | So the game was a 2D grid made out of blocks.
02:11:42.720 | And you could say, well, these are walls,
02:11:44.480 | these are where the enemies start,
02:11:45.920 | then they start moving around.
02:11:47.720 | And these early games like Catacombs,
02:11:49.640 | you played it strictly in a 2D view.
02:11:51.640 | It was a scrolling 2D view,
02:11:53.280 | and that was kind of using an adaptive tile refresh
02:11:55.360 | at the time to be able to do something like that.
02:11:58.480 | And then the thought that these early games,
02:12:01.640 | all it did was take the same basic enemy logic,
02:12:04.640 | but instead of seeing it from the God's eye view on top,
02:12:07.720 | you were inside it and turning from side to side,
02:12:10.520 | yawing your view and moving forwards and backwards
02:12:12.680 | and side to side.
02:12:14.560 | And it's a striking thing where you always talk about
02:12:17.200 | wanting to isolate and factor changes in values.
02:12:20.120 | And this was one of those most pure cases there
02:12:22.400 | where the rest of the game changed very little.
02:12:25.080 | It was our normal kind of change the colors on something
02:12:27.880 | and draw a different picture for it,
02:12:29.240 | but it's kind of the same thing.
02:12:30.920 | But the perspective changed in a really fundamental way,
02:12:34.040 | and it was dramatically different.
02:12:36.360 | I can remember the reactions where the artist, Adrian,
02:12:40.640 | that had been drawing the pictures for it,
02:12:42.120 | we had a cool big troll thing in Catacombs 3D,
02:12:45.080 | and we had these walls that you could get a key
02:12:47.760 | and you could make the blocks disappear.
02:12:49.880 | Yeah, really simple stuff.
02:12:51.040 | Blocks could either be there or not there.
02:12:52.960 | So our idea of a door was being able to make a set of blocks
02:12:56.040 | just disappear.
02:12:57.360 | And I remember the reaction where
02:12:58.840 | he had drawn these characters, and he was slowly
02:13:00.880 | moving around.
02:13:01.760 | And people had no experience with 3D navigation.
02:13:04.480 | It was all still keyboard.
02:13:05.560 | We didn't even have mice set up at that time.
02:13:08.480 | But slowly moving, going up, picked up a key, go to a wall.
02:13:12.280 | The wall disappears in a little animation,
02:13:14.360 | and there's a monster right there.
02:13:16.320 | And he practically fell out of his chair.
02:13:18.000 | It was just like, ah!
02:13:19.280 | And games just didn't do that.
02:13:22.480 | The games were the god's eye view.
02:13:24.360 | You were a little invested in your little guy.
02:13:26.320 | You can be happy or sad when things happen,
02:13:29.760 | but you just did not get that kind of startle reaction.
02:13:32.440 | You weren't inside the game.
02:13:33.880 | Something in the back of your brain,
02:13:35.560 | some reptile brain thing is just going, oh shit,
02:13:38.280 | something just happened.
02:13:40.000 | And that was one of those early points where it's like, yeah,
02:13:43.080 | this is going to make a difference.
02:13:44.920 | This is going to be powerful, and it's going to matter.
02:13:47.440 | Were you able to imagine that in the idea stage or no?
02:13:51.120 | So not that exact thing.
02:13:53.840 | So we had cases like the arcade games Battlezone and Star Wars
02:13:58.000 | that you could kind of see a 3D world and things coming at you,
02:14:01.840 | and you get some sense of it.
02:14:03.440 | But nothing had done the kind of worlds
02:14:05.560 | that we were doing and the sort of action-based things.
02:14:08.440 | 3D at the time was really largely
02:14:12.080 | about the simulation thoughts.
02:14:14.240 | And this is something that really
02:14:16.280 | might have trended differently if not for the id software
02:14:19.840 | approach in the games, where there were flight simulators,
02:14:23.560 | there were driving simulators, you
02:14:25.040 | had, like, Hard Drivin' and Microsoft Flight Simulator.
02:14:28.520 | And these were doing 3D and general purpose 3D
02:14:31.360 | in ways that were more flexible than what
02:14:34.080 | we were doing with our games.
02:14:35.400 | But they were looked at as simulations.
02:14:38.000 | They weren't trying to necessarily
02:14:39.520 | be fast or responsive or letting you do kind of exciting
02:14:44.000 | maneuvers, because they were trying to simulate reality,
02:14:46.800 | and they were taking their cues from the big systems,
02:14:49.240 | the Evans and Sutherlands and the Silicon Graphics
02:14:51.480 | that were doing things.
02:14:52.840 | But we were taking our cues from the console and arcade games.
02:14:56.560 | We wanted things that were sort of quarter eaters, that
02:14:59.600 | were doing fast-paced things, that could smack you around
02:15:02.760 | rather than just smoothly gliding you from place to place.
02:15:06.760 | Quarter eaters.
02:15:07.440 | Yeah.
02:15:08.840 | And you know, a funny thing is, so much of that was built into us
02:15:12.240 | that Wolfenstein still had lives.
02:15:14.720 | And one of the biggest power-ups
02:15:16.560 | in all these games was an extra life,
02:15:18.760 | because you started off with three lives,
02:15:20.520 | and you lose your lives, and then it's game over.
02:15:23.040 | And there weren't save games in most of this stuff.
02:15:26.240 | It was-- it sounds almost crazy to say this,
02:15:28.720 | but it was an innovation in Doom to not have lives.
02:15:31.680 | You know, you could just play Doom as long as you wanted.
02:15:34.060 | You just restart at the start of the level.
02:15:35.960 | And why not?
02:15:36.720 | This is-- we aren't trying to take people's quarters.
02:15:39.600 | They've already paid for the entire game.
02:15:41.320 | We want them to have a good time.
02:15:43.320 | And you would have some old-timer purist that
02:15:46.320 | might think that there's something
02:15:47.820 | to the epic journey of making it to the end,
02:15:50.000 | having to restart all the way from the beginning
02:15:52.040 | after a certain number of tries.
02:15:53.440 | But no, more fun is had when you just
02:15:55.560 | let people kind of keep trying when they're stuck,
02:15:58.040 | rather than having to go all the way back
02:15:59.760 | and learn different things.
02:16:01.920 | So you've recommended the book, Game Engine Black Book,
02:16:04.560 | Wolfenstein 3D for technical exploration of the game.
02:16:07.680 | So looking back 30 years, what are
02:16:10.800 | some memorable technical innovations
02:16:13.240 | that made this perspective shift into this world that's
02:16:17.080 | so immersive that scares you when a monster appears?
02:16:20.160 | What were some things you had to solve?
02:16:22.320 | So one of the interesting things
02:16:24.160 | that comes back to the theme of deadlines and resource
02:16:26.920 | constraints, the game Catacombs 3D,
02:16:31.120 | we shipped-- we were supposed to be shipping this
02:16:33.160 | for Gamers Edge on a monthly cadence.
02:16:35.080 | And I had slipped.
02:16:36.440 | I was actually late.
02:16:37.860 | It slipped like six weeks, because this
02:16:39.880 | was texture-mapped walls doing stuff
02:16:42.280 | that I hadn't done before.
02:16:44.800 | And at the six-week point, it was still
02:16:47.520 | kind of glitchy and buggy.
02:16:48.960 | There were things that I knew that if you
02:16:50.920 | had a wall that was almost edge-on,
02:16:53.160 | you could slide over to it.
02:16:54.560 | And you could see some things freak out or vanish or not
02:16:57.280 | work.
02:16:57.780 | And I hated that.
02:16:59.840 | But I was up against the wall.
02:17:01.480 | We had to ship the game.
02:17:03.000 | It was still a lot of fun to play.
02:17:04.680 | It was novel.
02:17:05.200 | Nobody had seen it.
02:17:06.000 | It gave you that startle reflex reaction.
02:17:09.120 | So it was worth shipping.
02:17:11.040 | But it had these things that I knew
02:17:13.080 | were kind of flaky and janky and not what I was really proud of.
02:17:17.440 | So one of the things that I did very differently in Wolfenstein
02:17:22.720 | was I went--
02:17:24.000 | Catacombs used almost a conventional thing
02:17:27.200 | where you had segments that were one-dimensional polygons,
02:17:30.240 | basically, that were clipped and back-faced and done
02:17:33.880 | kind of like a very crude 3D engine from the professionals.
02:17:37.320 | But I wasn't getting it done right.
02:17:39.520 | I was not doing a good enough job.
02:17:41.520 | I didn't really have line of sight to fix it right.
02:17:45.160 | There's stuff that, of course, I look back.
02:17:46.960 | It's like, oh, it's obvious how to do this, do the math right,
02:17:49.560 | do your clipping right, check all of this,
02:17:51.840 | how you handle the precision.
02:17:53.240 | But I did not know how to do that at that time.
02:17:55.680 | And I--
02:17:56.200 | Was that the first 3D engine you wrote, Catacombs 3D?
02:17:58.800 | Yeah, Hover Tank had been a little bit before that.
02:18:01.000 | But that had the flat-shaded walls.
02:18:02.520 | So the texture mapping on the walls
02:18:04.580 | was what was bringing in some of these challenges that
02:18:08.240 | was hard for me.
02:18:09.000 | And I couldn't solve it right at the time.
02:18:11.160 | Can you describe what flat shading is and texture mapping?
02:18:13.600 | So the walls were solid color, one of 16 colors in Hover Tank.
02:18:19.360 | So that's easy.
02:18:20.280 | It's fast.
02:18:20.840 | You just draw the solid color for everything.
02:18:23.680 | Texture mapping is what we all see today,
02:18:25.600 | where you have an image that is stretched and distorted
02:18:28.160 | onto the walls or the surfaces that you're working with.
02:18:32.200 | And it was a long time for me to just figure out
02:18:35.320 | how to do that without it distorting in the wrong ways.
02:18:38.640 | And I did not get it all exactly right in Catacombs.
02:18:42.400 | And I had these flaws.
02:18:44.840 | So that was important enough to me
02:18:46.760 | that rather than continuing to bang my head on that,
02:18:49.360 | when I wasn't positive I was going to get it,
02:18:51.800 | I went with a completely different approach for drawing,
02:18:54.520 | for figuring out where the walls were,
02:18:56.440 | which was a ray casting approach, which I had done--
02:18:59.880 | in Catacombs 3D, I had a bunch of C code
02:19:02.680 | trying to make this work right.
02:19:04.040 | And it wasn't working right.
02:19:05.840 | In Wolfenstein, I wound up going to a very small amount
02:19:09.920 | of assembly code.
02:19:11.200 | So in some ways, this should be a slower way of doing it.
02:19:14.360 | But by making it a smaller amount of work
02:19:16.560 | that I could more tightly optimize, it worked out.
02:19:19.320 | And Wolfenstein 3D was just absolutely rock solid.
02:19:22.840 | It was nothing glitched in there.
02:19:25.520 | The game just was pretty much flawless through all of that.
02:19:28.280 | And I was super proud of that.
02:19:31.160 | But eventually, like in the later games,
02:19:33.280 | I went back to the more span-based things
02:19:35.680 | where I could get more total efficiency once I really
02:19:38.280 | did figure out how to do it.
02:19:40.200 | So there were two key technical things to Wolfenstein.
02:19:43.560 | One was this ray casting approach, which you still--
02:19:46.880 | to this day, you see people go and say,
02:19:48.840 | let's write a ray casting engine,
02:19:50.520 | because it's an understandable way of doing things that
02:19:53.320 | lets you make games very much like that.
02:19:55.840 | So you see ray casters in JavaScript,
02:19:57.680 | ray casters in Python, people that are basically
02:20:00.360 | going and re-implementing that approach to taking a tiled
02:20:04.600 | world and casting out into it.
02:20:06.560 | It works pretty well, but it's not the fastest way of doing it.
02:20:09.320 | Can you describe what ray casting is?
02:20:11.440 | So you start off, and you've got your screen,
02:20:13.480 | which is 320 pixels across at the time,
02:20:15.840 | if you haven't sized down the window for greater speed.
02:20:19.840 | And at every pixel, there's going
02:20:21.600 | to be an angle from-- you've got your position in the world,
02:20:24.600 | and you're going to just run along that angle
02:20:27.000 | and keep going until you hit a block.
02:20:29.240 | So up to 320 times across there, it's
02:20:32.200 | going to cast a ray out into the world
02:20:35.560 | from wherever your origin is until it runs into a wall,
02:20:38.680 | and then it can figure out exactly where on the wall
02:20:41.320 | it hits.
02:20:42.200 | The performance challenge of that is, as it's going out,
02:20:45.440 | every block it's crossing, it checks, is this a solid wall?
02:20:49.280 | So that means that in the early Wolfenstein levels,
02:20:52.480 | you're in a small jail cell going out into a small hallway.
02:20:55.760 | It's super efficient for that, because you're only
02:20:57.840 | stepping across three or four blocks.
02:21:00.200 | But then if somebody makes a room that covers--
02:21:02.520 | our maps were limited to 64 by 64 blocks.
02:21:05.560 | If you made one room that was nothing but walls
02:21:08.720 | at the far space, it would go pretty slow,
02:21:11.040 | because it would be stepping across 80 tile tests
02:21:14.440 | or something along the way.
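
For readers who want the shape of this in code, here is a minimal C sketch of the per-column ray cast described above. It is an illustration under assumed names and constants, not id's actual code: the Wolfenstein-era version used fixed-point math and stepped exactly from grid boundary to grid boundary, where this one takes small fixed steps for clarity.

```c
/* Minimal sketch (not id's code) of the column-by-column ray cast. */
#include <math.h>

#define MAP_W    64
#define MAP_H    64
#define SCREEN_W 320

static unsigned char map[MAP_H][MAP_W];   /* nonzero = solid wall block */

/* Cast one ray from (px, py) along 'angle'; return distance to a wall. */
static double cast_ray(double px, double py, double angle)
{
    double dx = cos(angle), dy = sin(angle);
    double x = px, y = py;
    for (;;) {
        x += dx * 0.05;                   /* a real DDA would jump grid lines */
        y += dy * 0.05;
        int tx = (int)x, ty = (int)y;
        if (tx < 0 || tx >= MAP_W || ty < 0 || ty >= MAP_H)
            return 1e9;                   /* ray left the map */
        if (map[ty][tx])                  /* the per-block solidity test */
            return hypot(x - px, y - py); /* hit: distance sizes the column */
    }
}

void render_view(double px, double py, double view_angle, double fov)
{
    for (int col = 0; col < SCREEN_W; col++) {
        double a = view_angle + fov * ((double)col / SCREEN_W - 0.5);
        double dist = cast_ray(px, py, a);
        /* Wall-slice height on screen is proportional to 1 / dist.
         * Long rays across open rooms do many more block tests,
         * which is exactly the slowdown described above. */
        (void)dist;
    }
}
```
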
02:21:15.920 | By the way, the physics of our universe
02:21:17.480 | seems to be computing in this very way.
02:21:19.220 | So this maps nicely to the actual physics of our world.
02:21:23.080 | Yeah, you get--
02:21:23.720 | Intuitively.
02:21:24.160 | I follow a little bit of something
02:21:25.640 | like Stephen Wolfram's work on interconnected network
02:21:29.200 | information states.
02:21:30.600 | And it's beyond what I can have an informed opinion on.
02:21:35.000 | But it's interesting that people are considering things
02:21:38.000 | like that and have things that can back it up.
02:21:42.760 | Yeah, there's whole different sets
02:21:44.200 | of interesting stuff there.
02:21:45.680 | So Wolfenstein 3D had ray casting.
02:21:48.240 | Ray casting.
02:21:49.000 | And then the other key aspect was
02:21:51.720 | what I called compiled scalers, where the idea of--
02:21:57.240 | you saw this in the earlier classic arcade games
02:21:59.800 | like Space Harrier and stuff, where
02:22:01.680 | you would take a picture, which is normally
02:22:03.560 | drawn directly on the screen.
02:22:05.400 | And then if you have the ability to make it bigger or smaller,
02:22:08.240 | big chunky pixels or fizzily small drop-sampled pixels,
02:22:12.200 | that's the fundamental aspect of what our characters were
02:22:15.200 | doing in these 3D games.
02:22:16.540 | You would have-- it's just like you
02:22:17.920 | might have drawn a tiny little character,
02:22:19.380 | but now we can make them really big and make them really small
02:22:21.960 | and move it around.
02:22:23.240 | That was the limited kind of 3D that we had for characters.
02:22:26.480 | To make them turn, there were literally
02:22:28.100 | eight different views of them.
02:22:29.600 | You didn't actually have a 3D model that would rotate.
02:22:31.840 | You just had these cardboard cutouts.
02:22:33.800 | But that was good enough for that startle fight reaction,
02:22:36.800 | and it was kind of what we had to deal with there.
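
As a small aside in code, selecting which of those eight cutout frames to draw can be done by bucketing the viewer's bearing, relative to the object's facing, into 45-degree wedges. This is an illustrative sketch with invented names, not id's code:

```c
/* Pick one of 8 pre-drawn rotations of a sprite (illustrative sketch). */
#include <math.h>

#define PI 3.14159265358979323846

int sprite_rotation(double obj_x, double obj_y, double obj_facing,
                    double viewer_x, double viewer_y)
{
    /* angle from the object to the viewer, relative to its facing */
    double to_viewer = atan2(viewer_y - obj_y, viewer_x - obj_x);
    double rel = to_viewer - obj_facing;
    /* normalize to [0, 2*PI) with a half-slice offset so each
     * cardboard-cutout frame is centered on its 45-degree wedge */
    rel = fmod(rel + 4.0 * PI + PI / 8.0, 2.0 * PI);
    return (int)(rel / (PI / 4.0));       /* frame index 0..7 */
}
```

The harder performance problem was scaling those frames to arbitrary sizes, which is what the straightforward nested-loop approach described next runs into.
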
02:22:40.200 | So a straightforward approach to do that,
02:22:42.320 | you could just write out your doubly nested loop of--
02:22:45.720 | you've got your stretch factor, and it's
02:22:47.280 | like you've got a point.
02:22:48.080 | You stretch by a little bit.
02:22:49.400 | It might be on the same pixel.
02:22:50.680 | It might be on the next pixel.
02:22:51.940 | It might have skipped a pixel.
02:22:53.600 | You can write that out, but it's not
02:22:55.140 | going to be fast enough, where especially you
02:22:57.280 | get a character for that right in your face, monster
02:23:00.080 | covering almost the entire screen.
02:23:02.200 | Doing that with a general purpose scaling routine
02:23:05.080 | would have just been much too slow.
02:23:06.520 | It would have worked when they're small characters,
02:23:08.320 | but then it would get slower and slower as they got closer
02:23:10.720 | to you until right at the time when you most
02:23:12.720 | care about having a fast reaction time,
02:23:15.320 | the game would be chunking down.
02:23:17.320 | So the fastest possible way to draw pixels at that time
02:23:22.440 | was to, instead of saying I've got a general purpose
02:23:28.780 | version that can handle any scale,
02:23:32.740 | I used a program to make essentially
02:23:35.340 | 100 or more separate little programs, each
02:23:37.740 | optimized for: I will take an image,
02:23:40.260 | and I will draw it 12 pixels tall.
02:23:42.260 | I'll take an image,
02:23:43.100 | I'll draw it 14 pixels tall, going up by two pixels
02:23:47.180 | for each even height.
02:23:47.780 | So you would have the most optimized code
02:23:50.300 | so that in the normal case where most of the world
02:23:53.760 | is fairly large, like the pixels are big,
02:23:57.040 | we did not have a lot of memory.
02:23:58.760 | So in most cases, that meant that you
02:24:00.960 | would load a pixel color, and then you
02:24:03.120 | would store it multiple times.
02:24:05.280 | So that was faster than even copying an image
02:24:09.040 | in a normal conventional case because most of the time
02:24:11.840 | the image is expanded.
02:24:13.240 | So instead of doing one read, one write for a simple copy,
02:24:16.600 | you might be doing one read and three or four writes
02:24:19.160 | as it got really big.
02:24:20.660 | And that had the beneficial aspect
02:24:22.340 | of just when you needed the performance most when things
02:24:24.740 | are covering the screen, it was giving you
02:24:26.620 | the most acceleration for that.
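
To make the contrast concrete, here is a C sketch, under invented names, of a general-purpose column scaler next to what one generated "compiled" routine might look like for a single fixed height. The real generator emitted x86 assembly, one routine per even height; this only illustrates the idea.

```c
/* General case: scale a 64-texel source column to any height h,
 * walking the source in 16.16 fixed point. Flexible but slower. */
void scale_column_general(const unsigned char *src, unsigned char *dst,
                          int dst_stride, int h)
{
    unsigned step = (64u << 16) / (unsigned)h;  /* source texels per pixel */
    unsigned frac = 0;
    for (int y = 0; y < h; y++) {
        dst[y * dst_stride] = src[frac >> 16];
        frac += step;
    }
}

/* One generated routine for the fixed height 6: the loop, the divide,
 * and the fixed-point bookkeeping are gone, leaving straight-line
 * loads and stores. For magnified heights, the generated code loads
 * a texel once and stores it several times, which is the "one read,
 * three or four writes" case described above. */
void scale_column_h6(const unsigned char *src, unsigned char *dst,
                     int dst_stride)
{
    dst[0 * dst_stride] = src[0];
    dst[1 * dst_stride] = src[10];
    dst[2 * dst_stride] = src[21];
    dst[3 * dst_stride] = src[31];
    dst[4 * dst_stride] = src[42];
    dst[5 * dst_stride] = src[53];
}
```
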
02:24:28.580 | By the way, were you able to understand this
02:24:32.540 | through thinking about it, or were you testing
02:24:34.900 | the right speed and--
02:24:36.660 | This again comes back to I can find the antecedents
02:24:40.160 | for things like this.
02:24:41.100 | So back in the Apple II days, the graphics
02:24:46.060 | were essentially single bits at a time.
02:24:49.580 | And if you wanted to make your little spaceship,
02:24:51.780 | if you wanted to make it smoothly go across the world,
02:24:54.820 | if you just took the image and you drew it out
02:24:56.740 | at the next location, you would move by seven pixels at a time.
02:24:59.900 | So it would go chunk, chunk, chunk.
02:25:01.380 | If you wanted to make it move smoothly,
02:25:03.300 | you actually had to make seven versions of the ship
02:25:05.900 | that were pre-shifted.
02:25:07.380 | You could write a program that would shift it dynamically,
02:25:09.940 | but on a 1 megahertz processor, that's not going anywhere fast.
02:25:13.180 | So if you wanted to do a smooth moving,
02:25:15.220 | fast action game, you made separate versions
02:25:18.300 | of each of these sprites.
02:25:20.380 | Now, there were a few more tricks
02:25:21.740 | you could pull that if it still wasn't fast enough,
02:25:24.420 | you could make a compiled shape where instead of this program
02:25:29.340 | that normally copies an image and it says,
02:25:31.540 | like, get this byte from here, store it here, get this byte,
02:25:34.300 | store this byte, if you've got a memory space,
02:25:37.500 | you could say, I'm going to write the program that does
02:25:40.180 | nothing but draw this shape.
02:25:41.900 | It's going to be like, I'm going to load
02:25:43.900 | the immediate value 25, which is some bit pattern,
02:25:47.980 | and then I'm going to store that at this location.
02:25:51.300 | Rather than loading something from memory
02:25:53.260 | that involved indexing registers and this other slow stuff,
02:25:56.660 | you could go ahead and say, no, I'm
02:25:58.080 | going to hard code the exact values of all of the image
02:26:00.740 | right into the program.
02:26:02.140 | And this was always a horrible trade-off there,
02:26:04.100 | because you didn't have much memory
02:26:05.520 | and you didn't have much speed.
02:26:06.980 | But if you had something that you wanted to go really fast,
02:26:09.620 | you could turn it into a program.
02:26:11.780 | And that was, you know, knowing about that technique
02:26:14.540 | is what made me think about some of these,
02:26:16.780 | unwinding it for the PC, where people that didn't come
02:26:19.740 | from that background were less likely to think about that.
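
A toy version of that data-into-code move might look like the following: a generator that takes a sprite and prints a straight-line drawing routine with the pixel values hardcoded. This is an illustration of the technique, not the actual Apple II code, which would have emitted 6502 load-immediate/store pairs rather than C:

```c
#include <stdio.h>

/* Toy "compiled shape" generator: turn an image into a program.
 * Prints a routine with every pixel value hardcoded: no loops, no
 * reads from a source image at draw time, and transparent pixels
 * (value 0) are skipped for free. */
void emit_compiled_shape(const unsigned char *img, int w, int h,
                         const char *name)
{
    printf("void draw_%s(unsigned char *screen, int pitch) {\n", name);
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            if (img[y * w + x])
                printf("    screen[%d * pitch + %d] = %u;\n",
                       y, x, (unsigned)img[y * w + x]);
    printf("}\n");
}
```
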
02:26:23.380 | - I mean, there's some deep parallels
02:26:25.180 | probably to human cognition as well.
02:26:27.500 | There's something about optimizing and compressing
02:26:32.500 | the processing of new information that requires you
02:26:37.620 | to predict the possible ways in which the game
02:26:42.340 | or the world might unroll.
02:26:44.460 | And you have something like compiled scalers always there.
02:26:47.460 | So you have, like, a prediction
02:26:51.380 | of how the world will unroll and you have some kind
02:26:53.780 | of optimized data structure for that prediction.
02:26:58.180 | And then you can modify if the world turns out to be
02:27:00.660 | different, you can modify it a slight way.
02:27:02.340 | - And as far as building out techniques,
02:27:04.180 | so much of the brain is about the associative context.
02:27:06.980 | You know, they're just, when you learn something,
02:27:09.140 | it's in the context of something else
02:27:10.940 | and you can have faint, tiny little hints of things.
02:27:13.980 | And I do think there are some deep things, you know,
02:27:16.420 | around like sparse distributed memories and boosting
02:27:18.820 | that's like, if you can just be slightly above
02:27:20.700 | the noise floor of having some hint of something,
02:27:23.460 | you can have things refined into pulling the memory back up.
02:27:26.460 | So being a programmer and having a toolbox of like
02:27:29.900 | all of these things that I did in all of these previous
02:27:33.140 | lives of programming tasks, that still matters to me
02:27:36.100 | about how I'm able to pull up some of these things.
02:27:38.540 | Like in that case, it was something I did on the Apple II
02:27:41.220 | then being relevant for the PC.
02:27:43.260 | And I have still cases when I would work
02:27:46.580 | on mobile development, then be like, okay,
02:27:48.380 | I did something like this back in the doom days,
02:27:51.740 | but now it's a different environment,
02:27:53.620 | but I still had that tie, I can bring it in
02:27:55.860 | and I can transform it into what the world needs right now.
02:27:59.140 | And I do think that's actually one of the very core things
02:28:01.940 | with human cognition and brain-like functioning
02:28:06.060 | is finding these ways about, you've got,
02:28:08.500 | your brain is kind of everything everywhere all at once.
02:28:10.780 | You know, it is just a set of all of this stuff
02:28:13.080 | that is just fetched back by these queries that go into it.
02:28:16.420 | And they can just be slightly above the noise floor
02:28:18.900 | with a random noise in your neurons and synapses
02:28:21.260 | that are affecting exactly what gets pulled up.
02:28:23.980 | - So you're saying some of these very specific solutions
02:28:26.500 | for different games, you find that there's a kernel
02:28:30.580 | of a deep idea that's generalizable to other things.
02:28:34.740 | - Yeah, you can't predict what it's going to be,
02:28:36.460 | but that idea of like, I called out the compiled scalers
02:28:40.020 | in the foreword that I wrote for that,
02:28:41.780 | the Game Engine Black Book, as you know,
02:28:44.100 | this is, it's kind of an end point of unrolling code,
02:28:48.020 | but that's one of those things that thinking about that
02:28:50.400 | and having that in your mind,
02:28:51.660 | and I'm sure there are some programmers
02:28:53.120 | that hear about that, think about it a little bit,
02:28:55.540 | it's kind of the mind blown moment.
02:28:57.060 | It's like, oh, you can just turn all of that data into code.
02:29:00.740 | And nowadays, you know, you have instruction cache issues,
02:29:03.360 | and that's not necessarily the best idea,
02:29:05.540 | but there are different, it's an idea that has power
02:29:09.020 | and has probably relevance in some other areas.
02:29:11.500 | Maybe it's in a hardware point of view
02:29:13.020 | that there's a way you approach building hardware
02:29:15.460 | that has that same,
02:29:16.800 | you don't even have to think about iterating,
02:29:18.420 | you just bake everything all the way into it in one place.
02:29:22.180 | - What is the story of how you came to program Doom?
02:29:25.420 | What are some memorable technical challenges
02:29:27.740 | or innovations within that game?
02:29:29.780 | - So the path that we went after Wolfenstein got out,
02:29:33.660 | and we were on this crazy arc where Keen 1 through 3,
02:29:37.300 | more success than we thought,
02:29:38.860 | Keen 4 through 6, even more success,
02:29:41.100 | Wolfenstein, even more success.
02:29:42.860 | So we were on this crazy trajectory for things.
02:29:46.420 | So actually our first boxed commercial project
02:29:48.260 | was a Commander Keen game,
02:29:50.660 | but then Wolfenstein was going to have a game
02:29:52.700 | called Spear of Destiny,
02:29:54.020 | which was a commercial version, 60 new levels.
02:29:57.300 | So the rest of the team took the game engine
02:29:59.460 | pretty much as it was and started working on that.
02:30:02.740 | We got new monsters,
02:30:03.940 | but it's basically re-skins of the things there.
02:30:07.660 | And there's a really interesting aspect about that
02:30:09.420 | that I didn't appreciate until much, much later
02:30:12.300 | about how Wolfenstein clearly did tap out its limit
02:30:16.620 | about what you want to play,
02:30:18.900 | all the levels and a couple of our licensed things.
02:30:21.660 | There was a hard creative wall
02:30:24.060 | that you did not really benefit much
02:30:25.860 | by continuing to beat on it.
02:30:27.940 | But a game like Doom and other more modern games
02:30:31.500 | like Minecraft or something,
02:30:33.060 | there's kind of a Turing completeness level
02:30:35.100 | of design freedom that you get in games
02:30:37.260 | that Wolfenstein clearly sat on one side of.
02:30:40.020 | You know, all the creative people in the world
02:30:42.100 | could not go and do a masterpiece
02:30:44.060 | just with the technology that Wolfenstein had.
02:30:46.540 | Wolfenstein could do Wolfenstein,
02:30:48.220 | but you really couldn't do something crazy and different.
02:30:50.820 | But it didn't take that much more capability
02:30:53.420 | to get to Doom with the freeform lines
02:30:56.300 | and a little bit more artistic freedom
02:30:58.620 | to get to the point where people still announce
02:31:01.020 | new Doom levels today, all these years after,
02:31:03.740 | without having completely tapped out the creativity.
02:31:06.740 | - How did you put it?
02:31:07.580 | Turing complete?
02:31:08.540 | - Like Turing complete design space.
02:31:10.220 | - Design space.
02:31:11.060 | - Where it's like, you know,
02:31:11.900 | we have the kind of computational universality
02:31:14.860 | on a lot of things and how different substrates work.
02:31:17.500 | But yeah, there's things where a box can be too small,
02:31:20.940 | but above a certain point,
02:31:22.860 | you kind of are at the point where
02:31:24.900 | you really have almost unbounded creative ability there.
02:31:28.420 | - And Doom is the first time you crossed that line.
02:31:31.420 | - Yeah, where there were thousands of Doom levels created
02:31:35.100 | and some of them still have something new
02:31:36.740 | and interesting to say to the world about it.
02:31:38.580 | - Is that line, can you introspect what that line was?
02:31:43.220 | Is it in the design space?
02:31:44.820 | Is it something about the programming capabilities
02:31:48.380 | that you were able to add to the game?
02:31:50.940 | - So the graphics fidelity was a necessary part
02:31:54.380 | because the block limitations in Wolfenstein,
02:31:57.540 | what we had right there, the full-scale blocks,
02:32:00.780 | was not enough.
02:32:01.820 | Although Minecraft really did show
02:32:04.660 | that perhaps blocks stacked in 3D
02:32:07.740 | and at one half of the scale of that,
02:32:09.940 | or one eighth in volume,
02:32:11.340 | is then sufficient to have all of that.
02:32:13.260 | But the wall-sized blocks that we had in Wolfenstein
02:32:16.780 | were too much of a creative limitation.
02:32:18.580 | You know, we licensed the technology to a few other teams.
02:32:21.620 | None of them made too much of a dent with that.
02:32:24.900 | It just wasn't enough creative ability,
02:32:27.240 | but a little bit more,
02:32:28.500 | whether it was the variable floors and ceilings
02:32:31.020 | and arbitrary angles in Doom,
02:32:33.180 | or the smaller voxel blocks in Minecraft
02:32:37.220 | is then enough to open it up to just worlds
02:32:39.980 | and worlds of new capabilities.
02:32:41.860 | - What is binary space partitioning?
02:32:45.060 | - So the-
02:32:45.900 | - Which is one of the technologies.
02:32:47.620 | - Yeah, so to jump around a little bit on the story path there:
02:32:51.140 | So while the team was working on Spear of Destiny
02:32:53.180 | for Wolfenstein,
02:32:54.220 | we had met another development team, Raven Software,
02:32:58.300 | while we were in Wisconsin,
02:32:59.980 | and they had an RPG background,
02:33:03.300 | and I still kind of loved that.
02:33:04.780 | And I offered to do a game engine for them
02:33:07.740 | to let them do a 3D rendered RPG
02:33:10.740 | instead of the, like most RPG games were kind of hand drawn.
02:33:14.100 | They made it look kind of 3D,
02:33:15.580 | but it was done just all with artist work
02:33:17.580 | rather than a real engine.
02:33:19.820 | And after Wolfenstein, this was still a tile-based world,
02:33:23.380 | but I added floors and ceilings and some lighting
02:33:25.620 | and the ability to have some sloped floors
02:33:27.700 | in different areas.
02:33:28.540 | And that was my intermediate step
02:33:30.060 | for a game called Shadowcaster.
02:33:32.380 | And it had slowed down enough.
02:33:34.460 | It was not fast enough to do our type of action things.
02:33:37.700 | So they had the screen cropped down a little bit.
02:33:40.000 | So you couldn't go the full screen width
02:33:42.300 | like we would try to do in Wolfenstein.
02:33:44.620 | But I learned a lot.
02:33:46.360 | I got the floors and ceilings and lightings,
02:33:47.980 | and it looked great.
02:33:48.900 | They were great artists up there.
02:33:50.380 | And it was an inspiration for us
02:33:52.360 | to look at some of that stuff.
02:33:54.460 | But I had learned enough from that,
02:33:57.040 | that I had the plan for,
02:33:58.500 | I knew faster ways to do the lighting and shadowing.
02:34:01.580 | And I wanted to do this freeform geometry.
02:34:03.660 | I wanted to break out of this tile-based
02:34:06.580 | 90 degree world limitations.
02:34:09.180 | So that was when we got our NeXTstations,
02:34:12.300 | and we were working with these higher powered systems.
02:34:14.980 | And we built an editor that let us draw
02:34:18.420 | kind of arbitrary line segments.
02:34:20.260 | And I was working hard to try to make something
02:34:22.780 | that could render this fast enough.
02:34:24.860 | I was pushing myself pretty hard.
02:34:27.980 | And we were at a point where we could see some things
02:34:32.260 | that looked amazingly cool,
02:34:33.780 | but it wasn't really fast enough for the way I was doing it,
02:34:37.780 | for this flexibility.
02:34:38.820 | It was no longer, I couldn't just ray cast into it.
02:34:41.100 | And I had these very complex sets of lines,
02:34:43.260 | and simple little worlds were okay.
02:34:45.380 | But the cool things that we wanted to do
02:34:47.540 | just weren't quite fast enough.
02:34:49.700 | And I wound up taking a break at that point.
02:34:52.500 | And I did the port, I did two ports of our games,
02:34:57.500 | Wolfenstein to the Super Nintendo.
02:35:01.100 | It was a crazy difficult thing to do,
02:35:04.260 | and it was an even slower processor,
02:35:05.980 | like a couple of megahertz.
02:35:10.020 | And it had been this whole thing where we had farmed out
02:35:12.900 | the work and it wasn't going well.
02:35:17.180 | And I took it back over,
02:35:18.900 | and trying to make it go fast on there,
02:35:21.900 | where it really did not have much processing power.
02:35:25.220 | The pixels were stretched up hugely,
02:35:26.900 | and it was pretty ugly when you looked at it.
02:35:29.060 | But in the end, it did come out fast enough to play
02:35:31.540 | and still be kind of fun from that.
02:35:33.500 | But that was where I started using BSP trees,
02:35:36.620 | or binary space partitioning trees.
02:35:38.380 | It was one of those things I had to make it faster there.
02:35:41.580 | It was a stepping stone where it was reasonably easy
02:35:44.540 | to understand in the grid world of Wolfenstein,
02:35:47.020 | where it was all still 90 degree angles.
02:35:49.180 | I eased myself into BSP trees with that.
02:35:53.940 | And it was a big success.
02:35:56.260 | Then when I came back to working on Doom,
02:35:58.780 | I had this new tool in my toolbox.
02:36:00.700 | It was gonna be a lot harder
02:36:02.060 | with the arbitrary angles of Doom.
02:36:03.940 | This was where I really started grappling
02:36:06.180 | with epsilon problems.
02:36:07.860 | And just, up until that point,
02:36:09.700 | I hadn't really had to deal with the fact
02:36:11.420 | that so many numeric things don't work out exactly.
02:36:13.900 | It almost felt like a betrayal to me.
02:36:15.420 | I had mathematicians
02:36:17.660 | up on a bit of a pedestal,
02:36:18.980 | and people think I'm a math wizard, and I'm not.
02:36:22.780 | I really, everything that I did was really done
02:36:26.060 | with a solid high school math understanding.
02:36:29.580 | Algebra II, trigonometry,
02:36:31.140 | and that was what got me all the way through Doom and Quake
02:36:34.940 | and all of that, of just understanding basics of matrices
02:36:38.300 | and knowing it well enough to do something with it.
02:36:41.060 | - What's the epsilon problems you ran into?
02:36:42.940 | So when you wind up taking a sloped line
02:36:46.740 | and you say, I'm going to intersect it
02:36:48.380 | with another sloped line,
02:36:50.940 | then you wind up with something that's not going to be
02:36:53.060 | on these nice grid boundaries.
02:36:54.980 | With the Wolfenstein tile maps,
02:36:57.140 | all you've got is horizontal and vertical lines
02:36:58.980 | looking at it from above.
02:37:00.260 | And if you cut one of them, it's just obvious
02:37:02.220 | the other one gets cut exactly at that point.
02:37:04.920 | But when you have angled lines,
02:37:06.660 | you're doing a kind of a slope intercept problem
02:37:08.860 | and you wind up with rational numbers there,
02:37:11.260 | where things that are not going to evenly land on an integer
02:37:14.860 | or even on any fixed point value that you've got.
02:37:17.300 | So everything winds up having to snap
02:37:19.580 | to some fixed point value.
02:37:21.020 | So the lines slightly change their angle.
02:37:23.460 | You wind up, if you cut something here,
02:37:25.340 | this one's going to bend a little this way,
02:37:26.900 | and it's not going to be completely straight.
02:37:28.940 | And then you come down to all these questions of,
02:37:30.740 | well, is this point on an angled line?
02:37:35.160 | You can't answer that in finite precision
02:37:38.040 | unless you're doing something with actual rational numbers.
02:37:41.000 | And later on, I did waste far too much time
02:37:43.240 | chasing things like that.
02:37:44.200 | How do you do precise arithmetic with rational numbers?
02:37:46.880 | And it always blows up eventually,
02:37:49.160 | exponentially as you do it.
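
A tiny, self-contained illustration of the problem (not Doom's code): intersect a line of slope 1/3 with the vertical line x = 1 in 16.16 fixed point, and the split point lands slightly off the true rational answer, so the resulting fragments no longer lie exactly on the original line.

```c
/* Toy demo of the epsilon problem in 16.16 fixed point. */
#include <stdio.h>

typedef int fix;                       /* 16.16 fixed point */
#define FIX(x)   ((fix)((x) * 65536.0))
#define TO_D(x)  ((x) / 65536.0)

int main(void)
{
    /* Line A through (0,0) with slope 1/3; line B: vertical x = 1.
     * True intersection is (1, 1/3), and 1/3 has no exact fixed-point
     * (or binary floating-point) representation. */
    fix slope = FIX(1.0 / 3.0);        /* already rounded here */
    fix x = FIX(1.0);
    fix y = (fix)(((long long)slope * x) >> 16);   /* y = slope * x */

    printf("snapped y = %.10f (true y = 0.3333333333...)\n", TO_D(y));
    /* The split point lands slightly off the original line, so the
     * two fragments no longer have exactly the same slope: the line
     * "bends", and point-on-line tests near the split disagree by
     * an epsilon. */
    return 0;
}
```
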
02:37:50.560 | - So these kinds of things are impossible with computers.
02:37:54.080 | - They're possible, again, there are paths to doing it,
02:37:57.480 | but you can't fit them conveniently
02:37:59.280 | in any of the numbers you need to start using big nums
02:38:01.800 | and different factor trackings and different things.
02:38:04.600 | - So you have to, if you have any elements of OCD
02:38:08.120 | and you want to do something perfectly,
02:38:09.840 | you're screwed if you're working with floating point.
02:38:12.280 | - Yeah.
02:38:13.120 | - So you had to deal with this for the first time.
02:38:15.320 | - And there were lots of challenges there about like,
02:38:18.400 | okay, they build this cool thing.
02:38:20.080 | And the way the BSP trees work is it basically
02:38:22.880 | takes the walls and it carves other walls by those walls
02:38:26.400 | in this clever way that you can then
02:38:28.740 | take all of these fragments.
02:38:30.440 | And then you can for sure, from any given point,
02:38:33.120 | get an ordering of everything in the world.
02:38:35.140 | And you can say, this goes in front of this,
02:38:36.780 | goes in front of this, all the way back to the last thing.
02:38:40.200 | And that's super valuable for graphics,
02:38:42.380 | where kind of a classic graphics algorithm
02:38:45.280 | would be painter's algorithm.
02:38:46.800 | You paint the furthest thing first,
02:38:48.180 | and then the next thing, and then the next thing.
02:38:49.880 | And then it comes up and it's all perfect for you.
02:38:52.740 | That's slow because you don't want to have to have drawn
02:38:54.840 | everything like that, but you can also flip it around
02:38:57.580 | and draw the closest thing to you.
02:38:59.720 | And then if you're clever about it,
02:39:01.440 | you can figure out what you need to draw
02:39:03.560 | that's visible beyond that.
02:39:05.520 | - And that's what BSP trees allow you to do.
02:39:07.600 | - Yeah, so it's combined with a bunch of other things,
02:39:10.680 | but it gives you that ordering.
02:39:12.240 | It's a clever way of doing things.
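
Here is a sketch of that ordering property for a 2D BSP, with invented structure names rather than Doom's actual ones: at each node, recursing first into the side of the splitting line that the viewer is on yields a strict front-to-back order, while visiting the far side first gives the back-to-front painter's order instead.

```c
/* Illustrative 2D BSP traversal, not Doom's code. Each node splits
 * space by the line a*x + b*y + c = 0 and stores the wall segments
 * lying on that split line. */
typedef struct BspNode {
    double a, b, c;
    struct BspNode *front, *back;
    /* ... wall segments on this split line ... */
} BspNode;

static void draw_segments_on(const BspNode *node)
{
    (void)node;  /* draw this node's wall fragments */
}

/* Strict front-to-back ordering from the viewer at (vx, vy): always
 * descend first into the side of the split the viewer is on. */
void bsp_front_to_back(const BspNode *node, double vx, double vy)
{
    if (!node)
        return;
    double side = node->a * vx + node->b * vy + node->c;
    if (side >= 0) {
        bsp_front_to_back(node->front, vx, vy);  /* nearer half first */
        draw_segments_on(node);
        bsp_front_to_back(node->back, vx, vy);   /* farther half last */
    } else {
        bsp_front_to_back(node->back, vx, vy);
        draw_segments_on(node);
        bsp_front_to_back(node->front, vx, vy);
    }
}
```
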
02:39:13.800 | And I remember I had learned this from one of my,
02:39:17.240 | my graphics Bible at the time,
02:39:18.800 | the book by Foley and van Dam.
02:39:20.520 | And again, it was a different world back there.
02:39:22.240 | There was a small integer number of books,
02:39:26.120 | and this book
02:39:27.800 | was a big fat college textbook
02:39:30.520 | that I had read through many times.
02:39:32.520 | I didn't understand everything in it.
02:39:34.240 | Some of it wasn't useful to me,
02:39:35.840 | but they had the little thing about finite orderings
02:39:39.640 | where you draw a little T-shaped thing,
02:39:41.520 | and you can say,
02:39:42.800 | you can make a fixed ahead-of-time order from this,
02:39:45.260 | and you can generalize this with the BSP trees.
02:39:48.240 | And I got a little bit more information about that.
02:39:50.800 | And it was kind of fun later while I was working on Quake,
02:39:53.120 | I got to meet Bruce Naylor,
02:39:54.780 | who was one of the original researchers
02:39:56.540 | that developed those technologies,
02:39:58.840 | you know, for academic literature.
02:40:00.840 | And that was kind of fun,
02:40:01.760 | but I was very much just finding a tool
02:40:03.800 | that can help me solve what I was doing.
02:40:05.880 | And I was using it in this very crude way
02:40:07.820 | in a two-dimensional fashion,
02:40:09.160 | rather than the general 3D.
02:40:10.520 | The epsilon problems got much worse in Quake
02:40:12.840 | in three dimensions, when things angle in every way.
02:40:15.680 | But eventually I did sort out
02:40:17.920 | how to do it reliably on Doom.
02:40:20.000 | There were still a few edge cases in Doom
02:40:22.220 | that were not absolutely perfect,
02:40:24.380 | where they even got their own terminology in the communities.
02:40:27.740 | Like when you got to something where it was messed up,
02:40:29.440 | it was a hall of mirrors effect
02:40:30.920 | because you'd sweep by and it wouldn't draw something there
02:40:33.720 | and you would just wind up with the leftover remnants
02:40:36.200 | as you flipped between the two pages.
02:40:39.280 | But BSP trees were important for it.
02:40:41.480 | But it's again worth noting that after we did Doom,
02:40:45.660 | our major competition came from Ken Silverman
02:40:49.000 | and his Build engine,
02:40:49.920 | which was used for Duke Nukem 3D
02:40:51.800 | and some of the other games for 3D Realms.
02:40:54.160 | And he used a completely different technology,
02:40:56.240 | nothing to do with BSP trees.
02:40:59.240 | So there's not just a one true way of doing things.
02:41:03.080 | There were critical things about
02:41:05.600 | making any of those games fast:
02:41:07.160 | you had to separate your drawing
02:41:09.360 | into vertical lines and horizontal lines,
02:41:12.480 | just kind of changing exactly what you would draw with them.
02:41:15.800 | That was critical for the technologies at that time.
02:41:19.380 | And like all the games that were kind of like that
02:41:21.720 | wound up doing something similar,
02:41:23.400 | but there were still a bunch of other decisions
02:41:25.400 | that could be made.
02:41:26.800 | And we made good enough decisions on everything on Doom.
02:41:30.360 | We brought in multiplayer significantly.
02:41:33.680 | And it was our first game that was designed
02:41:35.640 | to be modified by the user community,
02:41:37.640 | where we had this whole setup of our WAD files and PWADs
02:41:41.080 | and things that people could build with tools
02:41:43.680 | that we released to them.
02:41:44.680 | And they eventually rewrote them to be better
02:41:46.200 | than what we released,
02:41:47.640 | but they could build things
02:41:48.840 | and you could add it to your game
02:41:50.400 | without destructively modifying it,
02:41:52.260 | which is what you had to do in all the early games.
02:41:54.080 | You literally hacked the data files
02:41:56.700 | or the executable before,
02:41:58.080 | while Doom was set up in this flexible way
02:42:00.580 | so that you could just say,
02:42:02.320 | run the normal game with this add-on on top,
02:42:04.820 | and it would overlay just the things
02:42:06.920 | that you wanted to there.
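
Structurally, the non-destructive overlay can be as simple as a lump directory searched newest-first, so a PWAD's entries shadow the base game's without touching it. This is an illustrative sketch of the idea, not the engine's actual code:

```c
/* Sketch of the "add on top" idea behind PWADs (illustrative names):
 * later-loaded lumps with the same name shadow earlier ones. */
#include <string.h>

typedef struct {
    char name[9];              /* WAD lump names are up to 8 chars */
    const void *data;
} Lump;

#define MAX_LUMPS 4096
static Lump lumps[MAX_LUMPS];
static int num_lumps;

/* Called once per lump for the base game, then again for each PWAD. */
void add_lump(const char *name, const void *data)
{
    if (num_lumps >= MAX_LUMPS)
        return;
    strncpy(lumps[num_lumps].name, name, 8);
    lumps[num_lumps].data = data;
    num_lumps++;
}

/* Search newest-first: a PWAD's E1M1 wins over the base E1M1, and
 * removing the PWAD restores the original game untouched. */
const void *find_lump(const char *name)
{
    for (int i = num_lumps - 1; i >= 0; i--)
        if (strncmp(lumps[i].name, name, 8) == 0)
            return lumps[i].data;
    return 0;
}
```
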
02:42:09.080 | - Would you say that Doom was kind of the first
02:42:11.600 | true 3D game that you created?
02:42:14.240 | - So no, it's still,
02:42:15.200 | Doom would usually be called a two and a half D game
02:42:17.800 | where it had three dimensional points on it.
02:42:20.280 | And this is another one of these kind of pedantic things
02:42:22.420 | that people love to argue about,
02:42:23.980 | about what was the first 3D game.
02:42:25.700 | I still, like every month probably I hear from somebody
02:42:29.300 | about, well, was Doom really a 3D game or something?
02:42:32.260 | And I give the point where characters had three coordinates.
02:42:37.260 | So you had like an X, Y, and Z,
02:42:40.260 | the cacodemon could be coming in very high
02:42:42.300 | and come down towards you.
02:42:44.980 | The walls had three coordinates on them.
02:42:47.380 | So on some sense, it's a 3D game engine,
02:42:50.320 | but it was not a fully general 3D game engine.
02:42:53.420 | You could not build a pyramid in Doom
02:42:56.640 | because you couldn't make a sloped wall,
02:42:59.400 | which was slightly different
02:43:00.440 | where in that previous Shadowcaster game,
02:43:02.280 | I could have vertexes and have a sloped floor there,
02:43:05.200 | but the changes that I made for Doom to get higher speed
02:43:08.320 | and a different set of flexibility
02:43:10.160 | traded away that ability,
02:43:11.660 | but you literally couldn't make that.
02:43:13.400 | You could make different heights of passages,
02:43:17.900 | but you could not make a bridge over another area.
02:43:20.340 | You could not go over and above it.
02:43:21.900 | So it still had some 2D limitations to it.
02:43:24.900 | - That's more about the building
02:43:26.060 | versus the actual experience,
02:43:27.540 | 'cause the experience is-
02:43:29.140 | - It felt like things would come at you,
02:43:30.580 | but again, you couldn't look up either.
02:43:32.580 | You could only yaw.
02:43:34.740 | It was four degrees of freedom
02:43:36.500 | rather than six degrees of freedom.
02:43:38.420 | You did not have the ability to tilt your head this way
02:43:40.620 | or pitch up and down.
02:43:42.340 | - So that takes us to Quake.
02:43:44.520 | What was the leap there?
02:43:47.520 | What was some fascinating technical challenges
02:43:50.080 | and there were a lot, or not challenges,
02:43:52.220 | but innovations that you've come up with?
02:43:54.200 | - So Quake was kind of the first thing
02:43:56.480 | where I did have to kind of come face to face
02:43:59.560 | with my limitations,
02:44:01.300 | where it was the first thing
02:44:02.320 | where I really did kind of give it my all
02:44:05.720 | and still come up a little bit short
02:44:08.440 | in terms of what and when I wanted to get it done.
02:44:11.960 | And the company had some serious stresses
02:44:15.060 | through the whole project.
02:44:16.780 | And we bit off a lot.
02:44:19.700 | So the things that we set out to do
02:44:21.460 | was it was going to be really a true 3D engine
02:44:24.700 | where it could do six degree of freedom.
02:44:26.980 | You could have all the viewpoints.
02:44:29.700 | You could model anything.
02:44:31.500 | It had a really remarkable new lighting model
02:44:35.300 | with the surface caching and things.
02:44:37.100 | That was one of those where it was starting
02:44:38.500 | to do some things that they weren't doing
02:44:40.780 | even on the very high end systems.
02:44:42.820 | And it was going to be completely programmable
02:44:46.960 | from the modding standpoint,
02:44:48.240 | where the thing that you couldn't do in Doom,
02:44:49.840 | you could replace almost all of the media,
02:44:52.400 | but you couldn't really change the game.
02:44:55.160 | There were still some people
02:44:56.720 | that were doing the hex editing of the executable,
02:44:58.800 | the DeHackEd things where you could change
02:45:00.660 | a few things about rules
02:45:01.940 | and people made some early capture the flag type things
02:45:04.640 | by hacking the executable,
02:45:06.060 | but it wasn't really set out to do that.
02:45:08.560 | Quake was going to have its own programming language
02:45:11.380 | that the game was going to be implemented in it.
02:45:13.220 | And that would be able to be overwritten
02:45:14.900 | just like any of the media.
02:45:16.620 | Code was going to be data for that.
02:45:18.420 | And you would be able to have expansion packs
02:45:21.040 | that changed fundamental things and mods and so on.
02:45:24.220 | And the multiplayer was going to be playable
02:45:27.180 | over the internet.
02:45:28.160 | It was going to support a client server
02:45:30.900 | rather than peer to peer.
02:45:32.460 | So we had the possibility of supporting larger numbers
02:45:34.940 | of players in disparate locations
02:45:37.500 | with this full flexibility of the programming overrides
02:45:41.340 | with full six degree of freedom, modeling and viewing.
02:45:44.940 | And with this fancy new light mapped
02:45:47.980 | kind of surface caching side, it was a lot.
02:45:50.660 | And this was one of those things that if I could go back
02:45:53.740 | and tell younger me to do something differently,
02:45:57.540 | it would have been to split those innovations up
02:45:59.700 | into two phases in two separate games.
02:46:02.100 | - Will be phase one and phase two.
02:46:03.660 | - So it probably would have been
02:46:05.100 | taking the Doom rendering engine
02:46:07.100 | and bringing in the TCP/IP client server.
02:46:10.740 | - Focusing on the multiplayer.
02:46:12.220 | - And the Quake C or would have been Doom C
02:46:15.260 | programming language there.
02:46:16.900 | So I would have split that into programming language
02:46:19.340 | and networking with the same Doom engine
02:46:21.820 | rather than forcing everybody to go towards
02:46:24.020 | the Quake engine, which really meant getting a Pentium.
02:46:27.500 | You know, while it ran on a 486,
02:46:29.080 | it was not a great experience there.
02:46:31.000 | We could have made more people happier
02:46:33.020 | and gotten two games done in 50% more time.
02:46:36.740 | - Aye.
02:46:37.580 | - So speaking of people happier,
02:46:39.420 | our mutual friend Joe Rogan,
02:46:42.860 | it seems like the most important moment of his life
02:46:46.940 | is centered around Quake.
02:46:49.540 | So it was a definitive part of his life.
02:46:53.540 | So would he agree with your thinking that they should split?
02:46:58.260 | So he is a person who loves Quake and played Quake a lot.
02:47:03.340 | Would he agree that you should have done the Doom engine
02:47:06.540 | and focus on the multiplayer for phase one?
02:47:09.660 | Or in your looking back,
02:47:12.140 | is the 3D world that Quake created
02:47:16.180 | was also fundamental to the enriched experience?
02:47:19.540 | - You know, I would say that what would have happened
02:47:21.580 | is you would have had a Doom looking,
02:47:25.260 | but Quake feeling game eight months earlier
02:47:29.980 | and then maybe six months after Quake actually shipped,
02:47:33.140 | then there would have been the full running on a Pentium
02:47:36.380 | six degree of freedom graphics engine type things there.
02:47:38.700 | So it's not that it wouldn't have been there.
02:47:42.060 | It would have been something amazingly cool earlier
02:47:44.980 | and then something even cooler somewhat later
02:47:47.740 | where I would much rather have gone
02:47:50.300 | and done two one year development efforts.
02:47:53.180 | I'd have cycled them through.
02:47:54.820 | I'd have been a little more pragmatic about that
02:47:57.460 | rather than killing ourselves on the whole Quake development.
02:48:01.060 | But I would say it's obviously things worked out well
02:48:04.100 | in the end, but looking back and saying,
02:48:06.420 | how would I optimize and do things differently?
02:48:08.980 | That did seem to be a clear case where going ahead
02:48:13.140 | and we had enormous momentum on Doom.
02:48:15.700 | You know, we did Doom two as the kind of commercial
02:48:18.140 | boxed version after our shareware success
02:48:20.940 | with the original,
02:48:22.140 | but we could have just made another Doom game
02:48:25.740 | adding those new features in.
02:48:27.660 | It would have been huge.
02:48:28.620 | We would have learned all the same lessons, but faster.
02:48:31.460 | And it would have given six degree of freedom
02:48:34.100 | and Pentium class systems a little bit more time
02:48:37.020 | to get mainstream because we did cut out a lot of people
02:48:40.100 | with the hardware requirements for Quake.
02:48:42.900 | - Was there any dark moments for you personally,
02:48:44.780 | psychologically in having such harsh deadlines
02:48:49.780 | and having this also mean difficult technical challenges?
02:48:54.700 | - So I've never really had really dark black places.
02:49:00.700 | I mean, I can't necessarily put myself
02:49:02.700 | in anyone else's shoes,
02:49:03.980 | but I understand a lot of people have,
02:49:07.940 | you know, have significant challenges
02:49:09.620 | with kind of their mental health and wellbeing.
02:49:12.500 | And I've been super stressed.
02:49:15.020 | I've been unhappy as a teenager in various ways,
02:49:18.780 | but I've never really gone to a very dark place.
02:49:23.700 | I just seem to be largely immune to
02:49:28.260 | what really wrecks people.
02:49:29.820 | I mean, I've had plenty of time when I'm very unhappy
02:49:32.300 | and miserable about something,
02:49:33.900 | but it's never hit me like, you know,
02:49:36.420 | I believe it winds up hitting some other people.
02:49:38.780 | I've borne up well under whatever stresses
02:49:41.700 | have kind of fallen on me.
02:49:44.380 | And I've always coped best on that
02:49:46.940 | when all I need to do is usually
02:49:49.500 | just kind of bear down on my work.
02:49:51.260 | I pull myself out of whatever hole I might be slipping into
02:49:55.060 | by actually making progress.
02:49:57.300 | I mean, maybe if I was in a position
02:49:59.460 | where I was never able to make that progress,
02:50:01.740 | I could have slid down further,
02:50:03.300 | but I've always been in a place where,
02:50:06.220 | okay, a little bit more work,
02:50:07.540 | maybe I'm in a tough spot here,
02:50:09.100 | but I always know if I just keep pushing,
02:50:12.260 | eventually I break through and I make progress,
02:50:14.660 | I feel good about what I'm doing.
02:50:16.420 | And that's been enough for me so far in my life.
02:50:20.660 | - Have you seen in the distance,
02:50:22.500 | like, you know, ideas of depression
02:50:27.300 | or contemplating suicide,
02:50:28.740 | have you seen those things far?
02:50:30.500 | - So it was interesting when I was a teenager,
02:50:33.100 | I was, you know, I was probably
02:50:35.820 | on some level a troubled youth.
02:50:37.300 | I was unhappy most of my teenage years.
02:50:40.260 | I really, I wanted to be on my own
02:50:42.580 | doing programming all the time.
02:50:44.380 | You know, as soon as I was 18, 19,
02:50:46.460 | even though I was poor,
02:50:47.620 | I was doing exactly what I wanted and I was very happy,
02:50:50.700 | but high school was not a great time for me.
02:50:53.020 | And I had a conversation with like the school counselor
02:50:56.940 | and they're kind of running their script.
02:50:58.540 | It's like, okay, it's kind of a weird kid here.
02:51:00.580 | Let's carefully probe around.
02:51:02.380 | It's like, you know, do you ever think about ending it all?
02:51:05.700 | I'm like, no, of course not.
02:51:07.500 | Never, not at all.
02:51:08.980 | This is temporary, things are going to be better.
02:51:11.380 | And that's always been kind of the case for me.
02:51:15.580 | And obviously that's not that way for everyone
02:51:18.300 | and other people do react differently.
02:51:20.180 | - And what was your escape from the troubled youth?
02:51:24.980 | Like, you know, music, video games, books.
02:51:29.980 | How did you escape from a world
02:51:35.460 | that's full of cruelty and suffering and that's absurd?
02:51:38.220 | - Yeah, I mean, I was not, you know,
02:51:39.860 | I was not a victim of cruelty and suffering.
02:51:41.820 | It's like, I was an unhappy, somewhat petulant youth
02:51:44.340 | and, you know, to my point, you know,
02:51:46.420 | I'm not putting myself up against anybody else's suffering,
02:51:49.700 | but I was unhappy objectively.
02:51:52.260 | And the things that I did that very much
02:51:55.740 | characterized my childhood were,
02:51:57.940 | I had books, comic books, Dungeons and Dragons,
02:52:01.500 | arcade games, video games.
02:52:03.740 | Like some of my fondest childhood memories
02:52:06.380 | are the convenience stores, the 7-Elevens and Quick Trips,
02:52:08.900 | because they had a spinner rack of comic books
02:52:11.380 | and they had a little side room
02:52:12.700 | with two or three video games, arcade games in it.
02:52:15.780 | And that was very much my happy place.
02:52:18.540 | You know, if I could, I get my comic books
02:52:20.820 | and if I could go to a library and, you know,
02:52:23.140 | go through those, the little 000 section
02:52:25.740 | where computer books were supposed to be.
02:52:27.380 | And there were a few sad little books there,
02:52:28.980 | but still just being able to sit down and go through that.
02:52:31.860 | And I read, you know, I read a ridiculous number of books,
02:52:35.740 | both fiction and nonfiction as a teenager.
02:52:38.620 | And, you know, as I, my rebelling in high school
02:52:42.820 | was just sitting there with my nose in a book,
02:52:44.540 | ignoring the class through lots of it.
02:52:46.860 | And teachers had a range of reactions to that,
02:52:49.460 | some more accepting of it than others.
02:52:52.380 | - I'm with you on that.
02:52:54.620 | So let us return to Quake for a bit
02:52:56.580 | with the technical challenges.
02:52:57.900 | What, everything together from the networking
02:53:02.900 | to the graphics, what are some things you remember
02:53:07.220 | that were innovations you had to come up with
02:53:10.500 | in order to make it all happen?
02:53:12.420 | - Yeah, so there were a bunch of things on Quake
02:53:14.900 | where on the one hand, the idea that I built
02:53:17.940 | my own programming language to implement the game in,
02:53:20.980 | looking back, and I try to tell people,
02:53:22.980 | it's like every high level programmer
02:53:25.500 | sometime in their career goes through
02:53:26.900 | and they invent their own language.
02:53:28.100 | It just seems to be a thing that's pretty broadly done.
02:53:30.460 | People will be like,
02:53:31.300 | I'm gonna go write a computer programming language.
02:53:33.300 | And I, you know, I don't regret having done it,
02:53:37.180 | but after that, I switched from Quake C,
02:53:40.180 | my quirky little pseudo object-oriented
02:53:43.180 | or entity-oriented language there.
02:53:45.420 | Quake 2 went back to using DLLs with C,
02:53:47.980 | and then Quake 3, I implemented my own C interpreter
02:53:50.740 | or compiler, which was a much smarter thing to do
02:53:53.260 | that I should have done originally for Quake.
02:53:55.860 | But building my own language was an experience.
02:53:57.940 | I learned a lot from that.
02:53:59.780 | And then there was a generation of game programmers
02:54:02.340 | that learned programming with Quake C,
02:54:04.380 | which I feel kind of bad about because, you know,
02:54:06.940 | I mean, we give JavaScript a lot of crap,
02:54:08.740 | but Quake C was nothing to write home about there.
02:54:13.140 | But it allowed people to do magical things.
02:54:15.780 | You get into programming,
02:54:16.860 | not because you love the BNF syntax of a language,
02:54:21.340 | it's because the language lets you do something
02:54:23.380 | that you cared about.
02:54:24.500 | - And here it's very much, you could do something
02:54:27.020 | in a whole beautiful three-dimensional world.
02:54:29.580 | - Yeah, and the idea and the fact that the code
02:54:31.260 | for the game was out there, you could say,
02:54:33.180 | I like the shotgun, but I want it to be more badass.
02:54:36.100 | You go in there and say,
02:54:37.220 | okay, now it does 200 points damage.
02:54:39.260 | And then you go around with a big grin on your face,
02:54:41.540 | blowing up monsters all over the game.
02:54:43.980 | So yeah, it is not what I would do today
02:54:48.260 | going back with that language,
02:54:49.580 | but that was a big part of it.
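As a hedged sketch of why that modding model was so empowering (real mods were written in QuakeC and compiled to bytecode the engine loaded like any other media; every name and number below is made up for illustration), the game rule really is just a value sitting in replaceable code:

```c
#include <stdio.h>

/* Made-up C-style illustration of the QuakeC idea: game rules live in
 * code that ships alongside the media and can be swapped out the same
 * way, so "make the shotgun more badass" is a one-line edit. */
typedef struct { int health; } Monster;

static int shotgun_damage = 4;      /* stock per-pellet value (invented) */

void fire_shotgun(Monster *target, int pellets) {
    target->health -= pellets * shotgun_damage;
}

int main(void) {
    Monster m = { 200 };
    shotgun_damage = 200;           /* the "now it does 200 points" edit */
    fire_shotgun(&m, 6);
    printf("monster health after one blast: %d\n", m.health);
    return 0;
}
```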
02:54:51.300 | Learning about the networking stuff,
02:54:54.100 | because it's interesting where I learn these things
02:54:56.860 | by reading books.
02:54:57.700 | So I would get a book on networking, find something,
02:55:00.140 | I read all about it and learn, okay, packets,
02:55:02.740 | they can be out of order, lost, or duplicated.
02:55:06.420 | These are all the things
02:55:07.260 | that can theoretically happen to packets.
02:55:09.340 | So I wind up spending all this time thinking about
02:55:13.020 | how do we deal with all of that?
02:55:13.020 | And it turns out, of course, in the real world,
02:55:15.140 | those are things that yes, theoretically can happen
02:55:17.140 | with multiple routes, but they really aren't things
02:55:19.740 | that 99.999% of your packets have to deal with.
02:55:24.100 | So there was learning experiences about lots of that,
02:55:28.340 | like why, when TCP is appropriate versus UDP
02:55:32.140 | and how if you do things in UDP,
02:55:34.220 | you wind up reinventing TCP badly in almost all cases.
02:55:37.940 | So there's good arguments for using both
02:55:41.620 | for different game technology,
02:55:42.860 | different parts of the game process,
02:55:44.420 | transitioning from level to level and all.
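A minimal sketch of the standard trick, assuming a made-up header layout rather than Quake's actual wire protocol: every UDP packet carries a sequence number, the receiver keeps the highest one it has accepted, duplicates and stale reordered packets are simply dropped, and gaps reveal losses.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

/* Illustrative per-packet sequencing over UDP (invented names). */
typedef struct {
    uint32_t sequence;    /* increments by one for every packet sent */
    /* ... game payload would follow ... */
} PacketHeader;

typedef struct {
    uint32_t last_seen;   /* highest sequence number accepted so far */
} Receiver;

/* Returns true if the packet should be processed. */
bool accept_packet(Receiver *rx, const PacketHeader *hdr) {
    if (hdr->sequence <= rx->last_seen)
        return false;                       /* duplicate or reordered: drop */
    if (hdr->sequence > rx->last_seen + 1)  /* gap means packets were lost  */
        printf("lost %u packet(s)\n", hdr->sequence - rx->last_seen - 1);
    rx->last_seen = hdr->sequence;
    return true;
}

int main(void) {
    Receiver rx = { 0 };
    PacketHeader a = { 1 }, b = { 3 }, dup = { 3 }, stale = { 2 };
    printf("%d %d %d %d\n", accept_packet(&rx, &a), accept_packet(&rx, &b),
           accept_packet(&rx, &dup), accept_packet(&rx, &stale));
    return 0;
}
```

The "reinventing TCP badly" trap starts the moment you bolt acknowledgments and retransmission onto a scheme like this; fast-paced games usually just keep sending fresh state every tick and let stale packets die.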
02:55:46.740 | But the graphics were the showcase
02:55:49.380 | of what Quake was all about.
02:55:51.940 | It was this graphics technology that nobody had seen there.
02:55:55.580 | And it was a while before,
02:55:57.460 | there were competitive things out there.
02:55:59.580 | And it went a long time internally, really not working,
02:56:03.620 | where we were even building levels
02:56:05.620 | where the game just was not at all shippable
02:56:09.660 | with large fractions of the world,
02:56:11.260 | like disappearing, not being there,
02:56:14.060 | or being really slow in various parts of it.
02:56:16.900 | And it was this act of faith.
02:56:18.380 | It's like, I think I'm gonna be able to fix this.
02:56:20.700 | I think I'm gonna be able to make this work.
02:56:22.900 | And lots of stuff changed
02:56:25.740 | where the level designers would build something
02:56:27.980 | and then have to throw it away as something fundamental
02:56:30.060 | and the kind of graphics or level technology changed.
02:56:33.820 | And so there were two big things
02:56:37.220 | that contributed to making it possible at that timeframe.
02:56:41.340 | Two new things.
02:56:42.340 | There was certainly hardcore optimized
02:56:44.740 | low-level assembly language.
02:56:46.100 | This was where I had hired Michael Abrash
02:56:48.660 | away from Microsoft.
02:56:50.300 | And he had been one of my early inspirations
02:56:52.380 | where, back in the Softdisk days,
02:56:54.740 | in the library of magazines that they had,
02:56:57.380 | some of my most treasured ones were Michael Abrash's articles
02:57:01.020 | in Dr. Dobb's Journal.
02:57:02.420 | And it was amazing after all of our success in Doom,
02:57:06.420 | we were able to kind of hit him up and say,
02:57:08.100 | "Hey, we'd like you to come work at id Software."
02:57:10.580 | And he was in this senior technical role at Microsoft
02:57:13.180 | and he was on track for,
02:57:15.940 | and this was right when Microsoft was starting to take off.
02:57:18.340 | And I did eventually convince him
02:57:21.700 | that what we were doing
02:57:22.820 | was gonna be really amazing with Quake.
02:57:24.580 | It was going to be something nobody had seen before.
02:57:28.300 | It had these aspects of what we were talking about.
02:57:31.460 | We had metaverse talk back then.
02:57:33.660 | We had read "Snow Crash" and we knew about this.
02:57:36.740 | And Michael was big into the science fiction
02:57:40.220 | and we would talk about all that
02:57:41.500 | and kind of spin this tale.
02:57:42.940 | And it was some of the same conversations
02:57:45.220 | that we have today about the metaverse,
02:57:46.940 | about how you could have different areas
02:57:48.980 | linked together by portals
02:57:50.380 | and you could have user-generated content
02:57:52.380 | and changing out all of these things.
02:57:54.460 | - So you really were creating the metaverse with Quake.
02:57:56.980 | - And we talked about things like
02:57:58.780 | - Philosophically. - It used to be advertised
02:58:00.500 | as a virtual reality experience.
02:58:02.860 | That was the first wave of virtual reality
02:58:05.260 | was in the late '80s and early '90s,
02:58:07.700 | you had like the "Lawnmower Man" movie
02:58:10.620 | and you had "Time" and "Newsweek"
02:58:12.180 | talking about the early VPL headsets.
02:58:14.620 | And of course that cratered so hard
02:58:16.820 | that people didn't wanna look at virtual reality
02:58:18.700 | for decades afterwards,
02:58:20.180 | where it was just, it was smoke and mirrors.
02:58:23.260 | It was not real in the sense
02:58:24.860 | that you could actually do something
02:58:26.740 | real and valuable with it.
02:58:28.620 | But still we had that kind of common set of talking points
02:58:32.220 | and we were talking about what these games could become
02:58:36.380 | and how you'd like to see people
02:58:37.940 | building all of these creative things.
02:58:39.580 | Because we were seeing an explosion of work
02:58:41.500 | with Doom at that time,
02:58:42.660 | where people were doing amazingly cool things.
02:58:45.740 | Like we saw cooler levels that we had built
02:58:48.140 | coming out of the user community
02:58:49.740 | and then people finding ways to change the characters
02:58:53.380 | in different ways and it was great.
02:58:54.700 | And we knew what we were doing in Quake
02:58:56.740 | was removing those last things.
02:58:58.860 | There was some quirky things
02:59:00.420 | with a couple of the data types
02:59:02.020 | that didn't work right for overriding
02:59:03.940 | and then the core thing about the programming model.
02:59:07.420 | And I was definitely going to hit all of those in Quake.
02:59:10.900 | But the graphic side of it was still,
02:59:15.580 | I knew what I wanted to do
02:59:17.100 | and it was one of these hubris things
02:59:20.500 | where it's like, well, so far I've been able
02:59:22.020 | to kind of kick everything that I set out to go do.
02:59:26.460 | But Quake was definitely a little bit more
02:59:29.300 | than could be comfortably chewed at that point.
02:59:32.060 | And, but Michael was one of the strongest programmers
02:59:36.740 | and graphics programmers that I knew.
02:59:39.060 | And he was one of the people that I trusted
02:59:40.740 | to write assembly code better than I could.
02:59:44.180 | And there's a few people that I can point to
02:59:46.620 | about things like this, where I'm a world-class optimizer.
02:59:49.820 | I mean, I make things go fast,
02:59:51.660 | but I recognize there's a number of people
02:59:54.580 | that can write tighter assembly code,
02:59:56.780 | tighter SIMD code or tighter CUDA code
02:59:59.300 | than I can write.
03:00:01.220 | I'm, my best strengths are a little bit more
03:00:04.380 | at the system level.
03:00:05.260 | I mean, I'm good at all of that,
03:00:06.940 | but the most leverage comes from making the decisions
03:00:10.180 | that are a little bit higher up,
03:00:11.540 | where you figure out how to change
03:00:13.620 | your large scale problems
03:00:15.100 | so that these lower level problems are easier to do
03:00:17.940 | or it makes it possible to do them in a uniquely fast way.
03:00:23.260 | So most of my, you know, my big wins in a lot of ways
03:00:27.020 | from all the way from the early games through,
03:00:29.340 | you know, through VR and the aerospace work that I'm doing
03:00:31.900 | and, or did, and hopefully the AI work
03:00:34.220 | that I'm working on now is finding an angle on something
03:00:37.500 | that means you trade off something
03:00:40.220 | that you maybe think you need,
03:00:41.420 | but it turns out you don't need it.
03:00:42.980 | And by making a sacrifice in one place,
03:00:45.740 | you can get big advantages in another place.
03:00:48.700 | - Is it clear at which level of the system
03:00:51.540 | those big advantages can be gained?
03:00:54.340 | - It's not always clear.
03:00:55.580 | And that's why the thing that I try to make
03:00:58.620 | one of my core values,
03:01:00.180 | and I proselytize to a lot of people
03:01:02.780 | is trying to know the entire stack,
03:01:05.340 | you know, trying to see through everything that happens.
03:01:08.140 | And it's almost impossible on like the web browser level
03:01:11.380 | of things where there's so many levels to it,
03:01:13.500 | but you should at least understand what they all are,
03:01:15.860 | even if you can't understand
03:01:17.020 | all the performance characteristics at each level,
03:01:20.060 | but it goes all the way down to literally the hardware.
03:01:23.020 | So what does the, what is this chip capable of?
03:01:26.580 | And what is this software that you're writing capable of?
03:01:29.420 | And then when this architecture you put on top of that,
03:01:31.780 | then the ecosystem around it,
03:01:33.340 | all the people that are working on it.
03:01:35.940 | So there are all these decisions
03:01:38.540 | and they're never made in a globally optimal way,
03:01:41.260 | but sometimes you can drive a thread
03:01:43.460 | of global optimality through it.
03:01:45.140 | You can't look at everything, it's too complicated,
03:01:47.780 | but sometimes you can step back up
03:01:49.740 | and make a different decision.
03:01:51.580 | And we kind of went through this on the graphics side
03:01:53.500 | on Quake where I, in some ways it was kind of bad
03:01:56.540 | where Michael would spend his time writing,
03:01:59.140 | like I'd rough out the basic routines, like, okay,
03:02:02.100 | here's our span rasterizer.
03:02:03.980 | And he would spend a month writing this, you know,
03:02:06.740 | beautiful cycle optimized piece of assembly language
03:02:10.580 | that does, you know, does what I asked it to do.
03:02:13.340 | And he did it faster than like my original code would do,
03:02:16.140 | or probably what I would be able to do
03:02:17.740 | even if I had spent that month on it.
03:02:20.500 | But then we'd have some cases when I'd be like,
03:02:22.740 | okay, well, I figured out at this higher level,
03:02:25.500 | instead of drawing these in a painter's order here,
03:02:28.140 | I do a span buffer and it cuts out 30%
03:02:31.740 | or 40% of all of these pixels,
03:02:34.060 | but it means you need to rewrite kind of this interface
03:02:36.620 | of all of that.
03:02:37.460 | And I could tell that wore on him a little bit,
03:02:39.380 | but in the end it was the right thing to do
03:02:41.980 | where we wound up changing that rasterization approach
03:02:45.180 | and we wound up with a super optimized
03:02:47.260 | assembly language core loop,
03:02:49.780 | and then a good system around it,
03:02:51.700 | which minimized how much that had to be called.
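A hedged sketch of that span-buffer idea, with invented code rather than Quake's actual implementation: surfaces are processed front to back, and the expensive textured inner loop is only ever handed the pixel ranges nothing nearer has already claimed, so occluded pixels are never drawn at all.

```c
#include <stdio.h>

/* Toy span buffer for one scanline (illustrative, not Quake's code).
 * Real implementations keep sorted lists of spans so even the coverage
 * test is per-span rather than per-pixel, but the payoff is the same:
 * hidden pixels never reach the optimized inner loop. */
#define WIDTH 16
static int filled[WIDTH];       /* 0 = still empty on this scanline */
static int pixels_shaded;

static void shade(int x, int surface_id) {
    filled[x] = surface_id;     /* stand-in for the assembly inner loop */
    pixels_shaded++;
}

/* Draw the parts of [x0, x1) not already covered by a nearer surface. */
static void draw_span(int x0, int x1, int surface_id) {
    for (int x = x0; x < x1; x++)
        if (!filled[x])
            shade(x, surface_id);
}

int main(void) {
    draw_span(4, 12, 1);        /* near wall: shades 8 pixels          */
    draw_span(0, 16, 2);        /* far wall: only the 8 left over      */
    printf("pixels shaded: %d (painter's-order overdraw: %d)\n",
           pixels_shaded, 8 + 16);
    return 0;
}
```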
03:02:54.420 | - And so in order to be able to do this kind of
03:02:56.300 | system level thinking,
03:02:58.220 | whether we're talking about game development, aerospace,
03:03:03.220 | nuclear energy, AI, VR,
03:03:08.020 | you have to be able to understand the hardware,
03:03:10.700 | the low level software, the high level software,
03:03:14.740 | the design decisions, the whole thing,
03:03:16.860 | the full stack of it.
03:03:18.180 | - Yeah, and that's where a lot of these things
03:03:20.500 | become possible when you're bringing the future forward.
03:03:23.980 | I mean, there's a pace that everything
03:03:25.180 | just kind of glides towards where we have a lot of progress
03:03:27.980 | that's happening at such a different,
03:03:29.340 | so many different ways you kind of slide towards progress,
03:03:32.300 | just left to your own programs just get faster.
03:03:34.900 | For a while it wasn't clear if they were gonna get fatter
03:03:37.740 | faster than they'd get quicker,
03:03:39.660 | and it'd cancel out, but it is clear now in retrospect,
03:03:42.420 | programs just get faster and have gotten faster
03:03:45.460 | for a long time.
03:03:46.620 | But if you wanna do something like back at that original,
03:03:49.820 | talking about scrolling games, say,
03:03:52.260 | well, this needs to be five times faster.
03:03:54.700 | Well, we can wait six years and just,
03:03:57.700 | it'll naturally get that much faster at that time,
03:04:00.820 | or you come up with some really clever way of doing it.
03:04:03.700 | So there are those opportunities like that
03:04:06.140 | in a whole bunch of different areas.
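For a rough sense of that trade, assuming the era's loose rule of thumb that hardware throughput doubled every 18 to 24 months, the free-waiting time for a speedup $s$ is

$$ t \approx T_{\text{double}} \cdot \log_2 s, \qquad \log_2 5 \approx 2.32, $$

so a five-times speedup arrives on its own in roughly 3.5 years at an 18-month doubling, or about 4.6 years at a two-year doubling; with real-world slippage, the six-year figure is in the right ballpark, and the clever approach is how you collect it today instead.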
03:04:08.100 | Now, most programmers don't need to be thinking about that.
03:04:11.780 | There's not that many,
03:04:13.420 | there's a lot of opportunities for this,
03:04:15.100 | but it's not everyone's workaday type stuff.
03:04:17.220 | So everyone doesn't have to know how all these things work.
03:04:20.340 | They don't have to know how their compiler works,
03:04:22.860 | how the processor chip manages cache eviction
03:04:26.140 | and all these low level things.
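As one small, self-contained example of the kind of hardware detail in question (arbitrary sizes, nothing engine-specific): the same arithmetic over the same data can run several times faster in one loop order than the other, purely because of how cache lines are filled and evicted.

```c
#include <stdio.h>
#include <time.h>

/* Sum a matrix twice: row-major order walks memory sequentially and
 * rides each cache line; column-major strides a full row per step and
 * typically misses far more often. Same math, very different speed. */
#define N 4096
static double m[N][N];          /* static: zeroed, and off the stack */

int main(void) {
    double sum = 0.0;
    clock_t t0;

    t0 = clock();
    for (int i = 0; i < N; i++)         /* cache-friendly order  */
        for (int j = 0; j < N; j++)
            sum += m[i][j];
    printf("row-major:    %.3fs\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    t0 = clock();
    for (int j = 0; j < N; j++)         /* cache-hostile order   */
        for (int i = 0; i < N; i++)
            sum += m[i][j];
    printf("column-major: %.3fs (sum %g either way)\n",
           (double)(clock() - t0) / CLOCKS_PER_SEC, sum);
    return 0;
}
```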
03:04:28.140 | But sometimes there are powerful opportunities
03:04:31.700 | that you can look at and say,
03:04:33.140 | we can bring the future five years faster.
03:04:37.220 | We can do something that,
03:04:38.420 | wouldn't it be great if we could do this?
03:04:40.300 | Well, we can do it today
03:04:42.020 | if we make a certain set of decisions.
03:04:44.180 | And it is in some ways smoke and mirrors,
03:04:47.140 | where you say it's like,
03:04:48.420 | Doom was a lot of smoke and mirrors,
03:04:50.460 | where people thought it was more capable
03:04:52.260 | than it actually was,
03:04:53.540 | but we picked the right smoke and mirrors
03:04:56.020 | to deploy in the game,
03:04:57.380 | where by doing this,
03:04:58.980 | people will think that it's more general,
03:05:01.100 | we are going to amaze them with what they've got here,
03:05:03.340 | and they won't notice
03:05:04.980 | that it doesn't do these other things.
03:05:07.620 | So smart decision-making at that point,
03:05:09.900 | that's where that kind of global,
03:05:13.260 | holistic top-down view can work.
03:05:16.380 | And I'm really a strong believer
03:05:20.180 | that technology should be sitting at that table,
03:05:23.940 | having those discussions,
03:05:25.020 | because you do have cases where you say,
03:05:26.660 | well, you want to be the Jonathan Ive or whatever,
03:05:28.580 | where it's a pure design solution.
03:05:32.060 | And that's, in some cases now,
03:05:35.180 | where you truly have almost infinite resources,
03:05:37.700 | like if you're trying to do a scrolling game on the PC now,
03:05:41.460 | you don't even have to talk to a technology person,
03:05:43.580 | you can just have,
03:05:45.420 | any intern can make that go run as fast as it needs to there,
03:05:48.420 | and it can be completely design-based.
03:05:50.660 | But if you're trying to do something that's hard,
03:05:53.140 | either that can't be done for resources,
03:05:55.780 | like VR on a mobile chip set,
03:05:58.020 | or that we don't even know how to do yet,
03:05:59.780 | like artificial general intelligence,
03:06:01.980 | it's probably going to be a matter
03:06:03.700 | of coming at it from an angle.
03:06:05.340 | Like, I mean, for AGI,
03:06:06.420 | we have some of the Hutter principles
03:06:08.980 | about how you can, you know, do AGI,
03:06:11.380 | where there are theoretical ways
03:06:12.820 | that you can say,
03:06:13.660 | this is the optimal learning algorithm
03:06:15.180 | that can solve everything,
03:06:16.620 | but it's completely impractical,
03:06:18.340 | you know, you just can't do that.
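For reference, the construction being gestured at is Hutter's AIXI agent, which is provably optimal in a certain formal sense but incomputable. In its usual simplified statement, it picks each action by maximizing expected future reward over every computable environment, weighted by a Solomonoff prior that favors shorter programs:

$$ a_k \;=\; \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m} \big[ r_k + \cdots + r_m \big] \sum_{q\,:\,U(q,\,a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)} $$

where $U$ is a universal Turing machine, $q$ ranges over environment programs, and $\ell(q)$ is program length. The sum over all programs is what makes it completely impractical to run.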
03:06:20.140 | So clearly you have to make some concessions
03:06:23.260 | for general intelligence,
03:06:25.180 | and nobody knows what the right ones are yet.
03:06:27.260 | So people are taking different angles of attack,
03:06:29.260 | I hope I've got something clever
03:06:30.980 | to come up with in that space.
03:06:34.020 | - It's been surprising to me,
03:06:35.420 | and I think perhaps it is a principle of progress
03:06:38.740 | that smoke and mirrors somehow
03:06:40.540 | is the way you build the future.
03:06:42.780 | You kind of fake it till you make it,
03:06:46.340 | and you almost always make it,
03:06:47.700 | and I think that's going to be the way we achieve AGI,
03:06:50.300 | that's going to be the way we build consciousness
03:06:53.580 | into our machines.
03:06:54.980 | You know, the philosophers' debate
03:06:57.860 | about the Turing test
03:06:59.860 | is essentially about faking it till you make it.
03:07:02.820 | You start by faking it,
03:07:04.500 | and I think that always leads to making it,
03:07:09.100 | because if you look at history-
03:07:09.940 | - Most of the philosophers' arguments,
03:07:11.220 | when as soon as people start talking about qualia
03:07:13.900 | and consciousness and Chinese rooms and things,
03:07:16.220 | it's like, I just check out,
03:07:17.580 | I just don't think there's any value in those conversations.
03:07:20.180 | It's just like, go ahead, tell me it's not going to work,
03:07:22.460 | I'm going to do my best to try to make it work anyways.
03:07:25.340 | - I don't know if you work with legged robots,
03:07:26.900 | there's a bunch of these.
03:07:28.140 | They make, they sure as heck make me feel
03:07:31.980 | like they're conscious.
03:07:33.900 | In a certain way that's not here today,
03:07:37.140 | but is, you could see the kernel,
03:07:41.220 | it's like the flame, the beginnings of a flame.
03:07:46.020 | - We don't have line of sight,
03:07:47.460 | but there's glimmerings of light in the distance
03:07:50.100 | for all of these things.
03:07:51.020 | - Yeah, I'm hearing murmuring in a distant room.
03:07:53.540 | Well, let me ask you a human question here.
03:07:56.780 | You've, in the game design space,
03:07:59.860 | you've done a lot of incredible work throughout,
03:08:01.900 | but in terms of game design, you have changed the world
03:08:05.780 | and there's a few people around you that did the same.
03:08:08.420 | So famously, there's some animosity, there's much love,
03:08:13.020 | but there's some animosity between you and John Romero.
03:08:16.340 | What is at the core of that animosity and human tension?
03:08:19.940 | - So there really hasn't been, for a long time,
03:08:24.060 | and even at the beginning, it's like, yes,
03:08:26.180 | I did push Romero out of the company.
03:08:29.500 | And this is one of the things that I look back,
03:08:32.140 | if I could go back telling my younger self
03:08:35.060 | some advice about things,
03:08:37.820 | the original founding kind of corporate structure
03:08:41.900 | of id Software really led to a bunch of problems.
03:08:45.700 | We started off with us as equal partners
03:08:48.540 | and we had a buy-sell agreement
03:08:50.420 | because we didn't want outsiders to be telling us
03:08:52.500 | what to do inside the company.
03:08:54.380 | And that did lead to a bunch of the problems
03:08:57.260 | where I was sitting here going,
03:08:59.580 | it's like, all right, I'm working harder than anyone.
03:09:02.940 | I'm doing these technologies, nobody's done before,
03:09:06.300 | but we're all equal partners.
03:09:08.100 | And then I see somebody that's not working as hard.
03:09:11.620 | I mean, I can't say I was the most mature about that.
03:09:16.140 | I was 20 something years old
03:09:18.020 | and it did bother me when I'm like,
03:09:21.940 | everybody, okay, we need to all pull together
03:09:23.980 | and we've done it before.
03:09:25.020 | Everybody, we know we can do this
03:09:26.540 | if we get together and we grind it all out,
03:09:29.420 | but not everybody wanted to do that for all time.
03:09:33.500 | And I was the youngest one of the crowd there.
03:09:35.740 | I had different sets of kind of backgrounds and motivations
03:09:40.500 | and left at that point where it was,
03:09:43.740 | all right, either everybody has to be contributing
03:09:47.180 | like up to this level or they need to get pushed out,
03:09:50.420 | that was not a great situation.
03:09:54.140 | And I look back on it and no,
03:09:56.140 | we pushed people out of the company
03:09:58.460 | that could have contributed
03:10:00.300 | if there was a different framework for them.
03:10:02.860 | And the modern kind of Silicon Valley,
03:10:04.700 | like let your stock vest over a time period
03:10:07.060 | and maybe it's non-voting stock
03:10:08.820 | and all those different things.
03:10:09.820 | We knew nothing about any of that.
03:10:11.420 | I mean, we didn't know what we were doing
03:10:13.860 | in terms of corporate structure or anything.
03:10:16.460 | - So if you think the framework was different,
03:10:18.460 | some of the human tension could have been a little bit.
03:10:20.780 | - It almost certainly would have.
03:10:22.900 | I mean, I look back at that
03:10:24.220 | and it's like even trying to summon up in my mind,
03:10:27.700 | it's like, I know I was really, really angry about,
03:10:31.540 | like, Romero not working as hard as I wanted him to work
03:10:35.580 | or not carrying his load on the design for Quake
03:10:39.020 | and coming up with things there.
03:10:40.900 | But he was definitely doing things.
03:10:43.060 | He made some of the best levels there.
03:10:44.900 | He was working with some of our external teams
03:10:47.740 | like Raven on the licensing side of things.
03:10:50.660 | But there were differences of opinion about it.
03:10:55.460 | But he landed right on his feet.
03:10:57.260 | He went and he got $20 million from Eidos
03:10:59.660 | to go do Ion Storm and he got to do things his way
03:11:02.980 | and spun up three teams simultaneously.
03:11:05.700 | Because that was always one of the challenging things
03:11:08.180 | in it where we were doing these single string,
03:11:11.020 | one project after another.
03:11:13.140 | And I think some of them wanted to grow the company more.
03:11:16.380 | And I didn't because I knew people that were saying that,
03:11:19.220 | oh, companies turn to shit when you got 50 employees.
03:11:22.260 | It's just a different world there.
03:11:24.060 | And I loved our little dozen people working on the projects.
03:11:28.420 | But you can look at it and say, well,
03:11:30.180 | business realities matter.
03:11:31.780 | It's like you're super successful here
03:11:33.540 | and we could take a swing and a miss on something.
03:11:36.340 | But you do it a couple of times and you're out of luck.
03:11:39.300 | There's a reason companies try to have multiple teams running
03:11:43.380 | at one time.
03:11:45.540 | And so that was, again, something
03:11:47.340 | I didn't really appreciate back then.
03:11:49.580 | So if you look past all that, you
03:11:51.140 | did create some amazing things together.
03:11:53.740 | What did you love about John Romero?
03:11:55.740 | What did you respect and appreciate about him?
03:11:57.700 | What did you admire about him?
03:11:59.060 | What did you learn from him?
03:12:00.940 | When I met him, he was the coolest programmer
03:12:02.460 | I had ever met.
03:12:04.020 | He had done all of this stuff.
03:12:05.540 | He had made all of these games.
03:12:07.580 | He had worked at one of the companies
03:12:10.060 | that I thought was the coolest at Origin Systems.
03:12:12.700 | And he knew all this stuff.
03:12:14.580 | He made things happen fast.
03:12:16.220 | And he was also kind of a polymath about this,
03:12:18.860 | where he drew his own art.
03:12:21.540 | He made his own levels, as well as
03:12:24.260 | he worked on sound design systems on top of actually
03:12:27.340 | being a really good programmer.
03:12:29.260 | And we went through a little-- it
03:12:32.060 | was kind of fun where one of the early things that we did,
03:12:34.620 | where there was kind of the young buck bit going in,
03:12:36.980 | where I was the new guy and he was the top man programmer
03:12:42.780 | at the Softdisk area.
03:12:44.700 | And eventually, we had sort of a challenge over the weekend
03:12:47.100 | that we were going to race to implement this game,
03:12:49.660 | to port one of our PC games back down to the Apple II.
03:12:52.740 | And that was where we finally kind of became clear.
03:12:55.220 | It's like, OK, Carmack stands a little bit apart
03:12:57.740 | on the programming side of things.
03:13:00.260 | But Romero then very gracefully moved into, well,
03:13:03.300 | he'll work on the tools.
03:13:04.460 | He'll work on the systems, do some of the game design stuff,
03:13:07.660 | as well as contributing on starting
03:13:10.060 | to lead the design aspects of a lot of things.
03:13:12.980 | So he was enormously valuable in the early stuff.
03:13:16.940 | And so much of Doom, and even Quake,
03:13:19.180 | have his stamp on it in a lot of ways.
03:13:21.860 | But he wasn't at the same level of focus
03:13:25.660 | that I brought to the work that we were doing there.
03:13:29.100 | And he really did--
03:13:31.660 | we hit such a degree of success that it was all
03:13:34.940 | in the press about that, the rock star game programmers.
03:13:38.020 | Yeah, I mean, it's the Beatles problem.
03:13:40.060 | Yeah, I mean, he ate it up.
03:13:42.140 | And he did personify-- there was the whole game developers
03:13:45.020 | with Ferraris that we had there.
03:13:48.980 | And I thought that led to some challenges there.
03:13:53.460 | But so much of the stuff that was great in the games
03:13:58.100 | did come from him.
03:13:59.180 | And I would certainly not take that away from him.
03:14:01.860 | And even after we parted ways and he took his swing
03:14:05.500 | with Eidos, in some ways, he was ahead of the curve
03:14:09.300 | with mobile gaming as well, where one of his companies
03:14:12.700 | after Eidos was working on feature phone game development.
03:14:16.740 | And I wound up doing some of that just before the iPhone,
03:14:20.620 | crossing over into the iPhone phase there.
03:14:23.020 | And that was something that clearly did turn out
03:14:25.620 | to be a huge thing, although he was too early for what he
03:14:29.220 | was working on at that time.
03:14:31.700 | We've had pretty cordial relationships
03:14:34.100 | where I was happy to talk with him anytime I'd run into him
03:14:36.700 | at a conference.
03:14:38.220 | I have actually had some other people just say,
03:14:40.620 | it's like, oh, you shouldn't go over there and give him
03:14:43.340 | the time of day, or felt that Masters of Doom
03:14:47.580 | played things up in a way that I shouldn't be too happy with.
03:14:52.500 | But I'm OK with all of that.
03:14:54.780 | So you've still got love in your heart.
03:14:56.900 | I mean, I just talked with him last year,
03:14:59.700 | or I guess it was even this year,
03:15:01.060 | about mentioning that I'm going off doing this AI stuff.
03:15:03.820 | I'm going big into artificial intelligence.
03:15:06.580 | And he had a bunch of ideas for how
03:15:09.260 | AI is going to play into gaming, and asked if I was
03:15:12.260 | interested in collaborating.
03:15:13.460 | And it's not in line with what I'm doing,
03:15:16.300 | but I do-- I wish almost everyone the best.
03:15:19.620 | I mean, I know I may not have parted on the best of terms
03:15:22.620 | with some people, but I was thrilled to see Tom
03:15:26.700 | Hall writing VR games now.
03:15:29.100 | He's been working on a game called
03:15:30.820 | Demeo, which is really an awesome VR game.
03:15:33.580 | It's like Dungeons and Dragons.
03:15:34.900 | We all used to play Dungeons and Dragons together.
03:15:36.740 | That was one of the things-- that
03:15:38.140 | was what we did on Sundays in the early days.
03:15:40.260 | I would Dungeon Master, and they'd all play.
03:15:42.140 | And so it really made me smile seeing
03:15:44.900 | Tom involved with an RPG game in virtual reality.
03:15:49.620 | - You were the CTO of Oculus VR since 2013,
03:15:54.540 | and maybe lessen your involvement a bit in 2019.
03:16:00.340 | Oculus was acquired by Facebook now Meta in 2014.
03:16:04.860 | You've spoken brilliantly about both the low-level details,
03:16:07.740 | the experimental design, and the big picture
03:16:09.740 | vision of virtual reality.
03:16:11.980 | Let me ask you about the metaverse, the big question
03:16:14.940 | here, both philosophically and technically.
03:16:17.980 | How hard is it to build the metaverse?
03:16:20.140 | What is the metaverse in your view?
03:16:22.700 | You started with discussing and thinking about Quake
03:16:24.940 | as a kind of a metaverse.
03:16:27.220 | As you think about it today, what
03:16:29.340 | is the metaverse, the thing that could
03:16:32.140 | create this compelling user value, this experience that
03:16:35.460 | will change the world?
03:16:36.820 | And how hard is it to build it?
03:16:38.980 | - So the term comes from Neal Stephenson's book Snow Crash,
03:16:41.980 | which many of us had read back in the '90s.
03:16:44.660 | It was one of those kind of formative books.
03:16:47.460 | And there was this sense that the possibilities
03:16:53.060 | and kind of the freedom and unlimited capabilities
03:16:56.060 | to build a virtual world that does whatever you want,
03:16:59.580 | whatever you ask of it, has been a powerful draw
03:17:02.220 | for generations of developers, game developers specifically,
03:17:05.580 | and people that are thinking about more general purpose
03:17:08.860 | applications.
03:17:10.260 | So we were talking about that back in the Doom and Quake
03:17:13.260 | days, about how do you wind up with an interconnected
03:17:16.260 | set of worlds that you kind of visit from one to another.
03:17:19.140 | And as web pages were becoming a thing,
03:17:21.460 | you start thinking about what is the interactive kind
03:17:25.220 | of 3D-based equivalent of this.
03:17:27.660 | And there were a lot of really bad takes.
03:17:29.780 | You had like VRML, the virtual reality markup language.
03:17:34.220 | And there's aspects like that that came from people saying,
03:17:38.220 | well, what kind of capabilities should we
03:17:40.780 | develop to enable this?
03:17:43.700 | And that kind of capability-first work
03:17:45.620 | has usually not panned out very well.
03:17:48.940 | On the other hand, we have successful games
03:17:51.540 | that started with things like Doom and Quake
03:17:53.660 | and communities that formed around those,
03:17:55.660 | and whether it was server lists in the early days
03:17:58.740 | or literal portaling between different games,
03:18:01.700 | and then modern things that are on a completely
03:18:04.340 | different order of magnitude, like Minecraft and Fortnite,
03:18:07.220 | that have 100 million-plus users.
03:18:11.860 | I still think that that's the right way
03:18:13.740 | to go to build the metaverse, is you build something that's
03:18:16.900 | amazing that people love and people wind up
03:18:19.020 | spending all their time in, because it's awesome.
03:18:21.980 | And you expand the capabilities of that.
03:18:24.340 | So even if it's a very basic experience, if it's awesome--
03:18:28.460 | Minecraft is an amazing case study in so many things,
03:18:31.740 | where what's been able to be done with that
03:18:34.500 | is really enlightening.
03:18:36.940 | And there are other cases where, like right now,
03:18:39.700 | Roblox is basically a game construction kit aimed at kids.
03:18:43.260 | And that was a capability-first play.
03:18:45.060 | And it's achieving scale that's on the same order
03:18:48.180 | of those things.
03:18:49.060 | So it's not impossible, but my preferred bet
03:18:52.580 | would be you make something amazing that people love,
03:18:55.180 | and you make it better and better.
03:18:56.940 | And that's where I could say we could have gone back
03:18:59.620 | and followed a path like that in the early days,
03:19:02.900 | if you just take the same game, whether it's
03:19:05.700 | when Activision demonstrated that you could make Call
03:19:07.860 | of Duty every year.
03:19:09.260 | And not only is it not bad--
03:19:11.180 | people love it, and it's very profitable--
03:19:14.540 | the idea that you could have taken something like that,
03:19:17.340 | take a great game, release a new version every year
03:19:20.260 | that lets the capabilities grow and expand to start saying,
03:19:23.820 | it's like, OK, it's a game about running around and shooting
03:19:26.380 | things.
03:19:26.900 | But now you can bring your media into it.
03:19:30.300 | You can add persistence of social signs of life
03:19:35.060 | or whatever you want to add to it.
03:19:37.780 | I still think that's quite a good position to take.
03:19:41.900 | And I think that while Meta is doing a bottoms-up capability
03:19:45.980 | approach with Horizon Worlds, where it's
03:19:48.780 | a fairly general-purpose, creators
03:19:51.220 | can build whatever they want in there sort of thing,
03:19:55.820 | it's hard to compare and compete with something
03:19:58.020 | like Fortnite, which also has enormous amounts of creativity,
03:20:01.700 | even though it was not designed originally
03:20:03.700 | as a general-purpose sort of thing.
03:20:05.780 | So we have examples on both sides.
03:20:08.500 | Me, personally, I would have bet on trying
03:20:11.940 | to do entertainment, valuable destination first,
03:20:15.220 | and expanding from there.
03:20:17.100 | So can you imagine the thing that will be kind of--
03:20:22.220 | if we look back a couple of centuries from now
03:20:25.340 | and you think about the experiences that
03:20:29.220 | marked the singularity, the transition, where
03:20:34.100 | most of our world moved into virtual reality,
03:20:37.660 | what do you think those experiences will look like?
03:20:40.700 | So I do think it's going to be kind of like the way
03:20:42.780 | the web slowly took over, where you're
03:20:45.740 | the frog in the pot of water that's slowly heating up,
03:20:48.900 | where having lived through all of that,
03:20:51.500 | I remember when it was shocking to start
03:20:53.700 | seeing the first website address on a billboard,
03:20:56.980 | when you're like, hey, my computer world is
03:20:58.820 | infecting the real world.
03:21:00.740 | This is spreading out in some way.
03:21:03.140 | But there's still-- when you look back and say, well,
03:21:05.660 | what made the web take off?
03:21:08.300 | And it wasn't a big bang sort of moment there.
03:21:12.060 | It was a bunch of little things that
03:21:14.100 | turned out not to even be the things that are relevant now
03:21:17.260 | that brought them into it.
03:21:19.220 | I wonder if--
03:21:20.420 | I mean, like you said, you're not a historian.
03:21:23.180 | So maybe there is a historian out there that could really
03:21:27.140 | identify that moment, data-driven way.
03:21:30.700 | It could be like Myspace or something like that.
03:21:33.620 | Maybe the first major social network that really reached
03:21:37.380 | into non-geek world or something like that.
03:21:42.180 | I think that's kind of the fallacy of historians, though,
03:21:45.100 | looking for some of those kind of primary dominant causes,
03:21:48.660 | where so many of these things are--
03:21:51.340 | like, we see an exponential curve.
03:21:53.020 | But it's not because one thing is going exponential.
03:21:55.900 | It's because we have hundreds of little sigmoid curves
03:21:59.100 | overlapped on top of each other.
03:22:00.780 | And they just happen to keep adding up
03:22:02.460 | so that you've got something kind of going exponential
03:22:05.540 | at any given point.
03:22:06.420 | But no single one of them was the critical thing.
03:22:09.260 | There were dozens and dozens of things.
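A hedged toy model of that picture, with arbitrary parameters chosen only for illustration: overlap logistic S-curves whose ceilings grow geometrically (each new technology matters more than the last), and the total keeps multiplying by a roughly constant factor per unit time even though every individual curve saturates.

```c
#include <stdio.h>
#include <math.h>

/* Toy "stacked sigmoids" model: curve i is a logistic saturating at
 * ceiling GROWTH^i, centered at time i * SPACING. No single curve is
 * exponential, but the sum's ratio between successive samples stays
 * roughly constant, which is what an exponential looks like. */
static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

int main(void) {
    const int    CURVES  = 20;
    const double GROWTH  = 1.5;   /* each ceiling 1.5x the previous */
    const double SPACING = 1.0;   /* time between curve midpoints   */

    for (double t = 0.0; t <= 15.0; t += 3.0) {
        double total = 0.0;
        for (int i = 0; i < CURVES; i++)
            total += pow(GROWTH, i) * sigmoid(4.0 * (t - i * SPACING));
        printf("t = %4.1f   total = %10.1f\n", t, total);
    }
    return 0;   /* compile with -lm for the math library */
}
```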
03:22:11.660 | I mean, seeing the transitions of stuff,
03:22:13.380 | like as, obviously, Myspace giving way to other things,
03:22:16.540 | but even like blogging giving way to social media
03:22:20.140 | and getting resurrected in other guises and--
03:22:22.980 | And the memes with the--
03:22:23.980 | --things that happened there.
03:22:25.180 | --dancing baby gif or whatever the--
03:22:26.900 | all your base are belong to us.
03:22:29.100 | Whatever those early memes that led to the modern memes
03:22:32.260 | and the humor on the different-- the different evolution
03:22:35.700 | of humor on the internet that I'm
03:22:37.460 | sure the historians will also write books about
03:22:40.660 | from the different websites that support,
03:22:42.540 | that create the infrastructure for that humor,
03:22:44.460 | like Reddit and all that kind of stuff.
03:22:46.740 | So people will go back, and they will
03:22:48.420 | name firsts and critical moments.
03:22:50.260 | But it's probably going to be a poor approximation of what
03:22:53.340 | actually happens.
03:22:54.940 | And we've already seen, like, in the VR space
03:22:57.460 | where it didn't play out the way we thought it would in terms
03:23:01.340 | of what was going to be-- like, when the modern era of VR
03:23:04.540 | basically started with my E3 demo of Doom 3
03:23:07.220 | on the Rift prototype.
03:23:08.420 | So we're like, first-person shooters in VR,
03:23:10.900 | match made in heaven, right?
03:23:12.820 | And that didn't work out that way at all.
03:23:15.460 | They have-- you know, they have the most comfort problems
03:23:17.620 | with it.
03:23:18.380 | And then the most popular virtual reality app
03:23:21.140 | is Beat Saber, which nobody predicted back then.
03:23:24.780 | What's that make you, like, from first principles
03:23:28.660 | if you were to, like, reverse engineer that?
03:23:31.220 | Why are these, like, silly fun games?
03:23:34.380 | Well, it actually makes very clear sense
03:23:37.300 | when you analyze it from hindsight
03:23:40.380 | and look at the engineering reasons, where it's not just
03:23:43.180 | that it was a magical, quirky idea.
03:23:45.380 | It was something that played almost perfectly
03:23:47.780 | to what turned out to be the real strengths of VR,
03:23:50.420 | where the one thing that I really underestimated
03:23:52.900 | importance in VR was the importance of the controllers.
03:23:55.740 | You know, I was still thinking we could do a lot more
03:23:58.020 | with a game pad.
03:23:59.300 | And just the amazingness of taking any existing game,
03:24:01.900 | being able to move your head around and look around,
03:24:04.300 | that that was really amazing.
03:24:06.180 | But the controllers were super important.
03:24:08.980 | But the problem is so many things
03:24:10.420 | that you do with the controllers just suck.
03:24:13.180 | It feels like it breaks the illusion,
03:24:14.720 | like trying to pick up glasses with the controllers,
03:24:16.900 | where you're like, oh, use the grip button when you're
03:24:19.140 | kind of close, and it'll snap into your hand.
03:24:21.420 | All of those things are unnatural actions,
03:24:24.540 | that you do them, and it's still part of the VR experience.
03:24:27.660 | But Beat Saber winds up playing only to the strengths.
03:24:32.460 | It completely hides all the weaknesses of it,
03:24:34.620 | because you are holding something in your hand.
03:24:37.100 | You keep a solid grip on it the whole time.
03:24:39.380 | It slices through things without ever bumping into things.
03:24:42.340 | You never get into the point where, you know,
03:24:44.460 | I'm knocking on this table, but in VR,
03:24:46.300 | my hand just goes right through it.
03:24:48.180 | So you've got something that slices through,
03:24:51.100 | so it's never your brain telling you,
03:24:53.100 | oh, I should have hit something.
03:24:54.540 | You've got a lightsaber here.
03:24:55.780 | It's just you expect it to slice through everything.
03:24:59.020 | Audio and music turned out to be a really powerful aspect
03:25:02.420 | of virtual reality, where you're blocking the world off
03:25:05.140 | and constructing the world around you,
03:25:07.580 | and being something that can run efficiently on even
03:25:11.420 | this relatively low-powered hardware,
03:25:13.700 | and can have a valuable loop in a small amount of time,
03:25:17.620 | where a lot of modern games, you're
03:25:19.700 | supposed to sit down and play it for an hour,
03:25:21.420 | just to get anywhere.
03:25:22.220 | Sometimes a new game takes an hour
03:25:23.620 | to get through the tutorial level,
03:25:25.500 | and that's not good for VR for a couple reasons.
03:25:27.820 | You do still have the comfort issues
03:25:29.700 | if you're moving around at all, but you've also
03:25:32.100 | got just discomfort from the headset, battery lifespan
03:25:35.900 | on the mobile versions.
03:25:37.420 | So having things that do break down
03:25:39.100 | into three- and four-minute windows of play,
03:25:42.060 | that turns out to be very valuable from a gameplay
03:25:44.420 | standpoint.
03:25:45.780 | So it winds up being kind of a perfect storm of all
03:25:48.460 | of these things that are really good.
03:25:49.980 | It doesn't have any of the comfort problems.
03:25:52.020 | You're not navigating around.
03:25:53.340 | You're standing still.
03:25:54.540 | All the stuff flies at you.
03:25:56.500 | It has placed audio strengths.
03:25:59.300 | It adds-- the whole fitness in VR,
03:26:01.540 | nobody was thinking about that back at the beginning.
03:26:04.460 | And it turns out that that is an excellent daily fitness
03:26:08.220 | thing to be doing.
03:26:09.060 | If you go play an hour of Beat Saber or Supernatural
03:26:12.740 | or something, that is legit solid exercise,
03:26:16.380 | and it's more fun than doing it just about any other way there.
03:26:19.900 | - So that's kind of the arcade stage of things.
03:26:23.460 | If I were to say, with my experience with VR,
03:26:27.100 | the thing that I think is powerful is the--
03:26:30.620 | maybe it's not here yet--
03:26:32.540 | but the degree to which it is immersive in the way
03:26:36.620 | that Quake is immersive.
03:26:38.340 | It takes you to another world.
03:26:40.220 | For me, because I'm a fan of role-playing games,
03:26:44.260 | the Elder Scrolls series, like Skyrim or even Daggerfall,
03:26:50.220 | it just takes you to another world.
03:26:52.580 | And when you're not in that world,
03:26:53.900 | you miss not being there.
03:26:56.020 | And then you just--
03:26:57.100 | you kind of want to stay there forever,
03:26:58.780 | because life is shitty.
03:27:01.260 | - Well, the whole point of my pitch for VR
03:27:04.260 | is that there was a time when we were kind of asked
03:27:09.300 | to come up with, like, what's your view about VR?
03:27:12.020 | And my pitch was that it should be better inside the headset
03:27:16.540 | than outside.
03:27:17.140 | It's the world as you want it.
03:27:19.420 | And everybody thought that was dystopian.
03:27:21.580 | And that's like, oh, you're just going
03:27:23.620 | to forget about the world outside.
03:27:25.540 | And I don't get that mindset where the idea that if you can
03:27:30.540 | make the world better inside the headset than outside,
03:27:34.100 | you've just improved the person's life that
03:27:36.580 | has a headset that can wear it.
03:27:38.420 | And there are plenty of things that we just can't do
03:27:41.060 | for everyone in the real world.
03:27:42.380 | Everybody can't have Richard Branson's private island.
03:27:44.860 | But everyone can have a private VR island.
03:27:47.180 | And it can have the things that they want on it.
03:27:49.260 | And there's a lot of these kind of rivalrous goods
03:27:51.660 | in the real world that VR can just be better at.
03:27:55.620 | We can do a lot of things like that that can be very, very
03:27:58.860 | rich.
03:27:59.660 | So yeah, I think it's going to be a positive thing, this world
03:28:02.860 | where people want to go back into their headset,
03:28:05.460 | where it can be better. Somebody that's
03:28:07.700 | living in a tiny apartment can have a palatial estate
03:28:10.900 | in virtual reality.
03:28:11.980 | They can have all their friends from all over the world
03:28:14.300 | come over and visit them without everybody getting on a plane
03:28:17.220 | and meeting in some place and dealing
03:28:19.700 | with all the other logistics hassles.
03:28:21.660 | There is real value in the presence
03:28:24.860 | that you can get for remote meetings.
03:28:26.580 | It's all the little things that we need to sort out.
03:28:30.100 | But those are things that we have line of sight on.
03:28:32.860 | People that have been in a good VR meeting using Workrooms
03:28:37.180 | will say, oh, that was better than a Zoom meeting.
03:28:40.380 | But of course, it's more of a hassle to get into it.
03:28:42.940 | Not everyone has a headset.
03:28:44.380 | Interoperability is worse.
03:28:46.260 | You can't have-- you cap out at a certain number.
03:28:48.660 | There's all these things that need to be fixed.
03:28:50.620 | But that's one of those things you can look at and say,
03:28:52.860 | we know there's value there.
03:28:54.260 | We just need to really grind hard,
03:28:56.300 | file off all the rough edges, and make that possible.
03:28:59.460 | So you do think we have line of sight,
03:29:02.100 | because there's a reason, like--
03:29:07.060 | I do this podcast in person, for example.
03:29:11.140 | Doing it remotely, it's not the same.
03:29:14.220 | And if somebody were to ask me why it's not the same,
03:29:16.940 | I wouldn't be able to write down exactly why.
03:29:20.580 | But you're saying that it's possible,
03:29:23.420 | whatever the magic is for in-person interaction,
03:29:27.020 | that immersiveness of the experience,
03:29:30.540 | we are almost there.
03:29:32.780 | So it's a technical problem.
03:29:33.940 | So the idea of, like, I'm doing a VR interview with someone.
03:29:37.780 | I'm not saying it's here right now.
03:29:39.620 | But you can see glimmers of what it should be.
03:29:42.700 | And we largely know what would need
03:29:44.980 | to be fixed and improved to-- like you say,
03:29:48.060 | there's a difference between a remote interview doing
03:29:50.820 | a podcast over Zoom or something and face-to-face.
03:29:53.940 | There's that sense of presence, that immediacy,
03:29:56.460 | the super low latency responsiveness,
03:29:59.060 | being able to see all the subtle things there,
03:30:01.420 | just occupying the same field of view.
03:30:03.820 | And all of those are things that we absolutely can do in VR.
03:30:07.780 | And that simple case of a small meeting with a couple people,
03:30:11.620 | that's the much easier case than everybody
03:30:13.540 | thinks, the Ready Player One multiverse with 1,000 people
03:30:16.300 | going across a huge bridge to amazing places.
03:30:20.020 | That's harder in a lot of other technical ways.
03:30:22.420 | Not to say we can't also do that.
03:30:24.060 | But that's further away and has more challenges.
03:30:26.500 | But this small thing about being able to have
03:30:28.860 | a meeting with one or a few people and have it feel real,
03:30:33.780 | feel like you're there, like you have the same interactions
03:30:37.020 | and talking with them, you get subtle cues
03:30:39.340 | as we start getting eye and face tracking
03:30:41.300 | and some of the other things on high-end headsets.
03:30:43.900 | A lot of that is going to come over.
03:30:46.380 | And it doesn't have to be as good.
03:30:49.140 | This is an important thing that people miss,
03:30:51.060 | where there was a lot of people that, especially rich people,
03:30:55.540 | that would look at VR and say, it's like,
03:30:57.500 | oh, this just isn't that good.
03:30:59.780 | And I'd say, it's like, well, you've already been courtside,
03:31:02.900 | backstage, and on pit row.
03:31:05.100 | And you've done all of these experiences
03:31:06.980 | because you get to do them in real life.
03:31:08.700 | But most people don't get to.
03:31:10.700 | And even if the experience is only half as good,
03:31:12.980 | if it's something that they never would have gotten
03:31:15.060 | to do before, it's still a very good thing.
03:31:17.740 | And as we can just-- we can push that number up over time.
03:31:20.900 | It has a minimum viable value level
03:31:24.140 | when it does something that is valuable enough to people.
03:31:27.260 | As long as it's better inside the headset on any metric
03:31:30.020 | than it is outside and people choose to go there,
03:31:32.660 | we're on the right path.
03:31:33.980 | And we have a value gradient that I'm just always hammering
03:31:36.980 | on, we can just follow this value gradient,
03:31:39.100 | just keep making things better,
03:31:41.060 | rather than going for that one, close your eyes,
03:31:44.620 | swing for the fences, kind of silver bullet approach.
03:31:48.780 | - Well, I wonder if there's a value gradient
03:31:50.540 | for in-person meetings, because if you get that right,
03:31:53.500 | I mean, that would change the world.
03:31:54.980 | - Yeah. - It doesn't need to,
03:31:57.060 | I mean, you don't need a Ready Player One.
03:31:59.460 | But I wonder if there's that value gradient
03:32:02.820 | you can follow along.
03:32:05.140 | Because if there is, and you follow it,
03:32:08.060 | then there'll be a certain like phase shift
03:32:11.540 | at a certain point where people will shift
03:32:14.700 | from Zoom to this.
03:32:18.660 | I wonder, what are the bottlenecks?
03:32:23.260 | Is it software? Is it hardware?
03:32:25.420 | Is it like, is it all about latency?
03:32:27.860 | - So I have big arguments internally
03:32:30.500 | about strategic things like that,
03:32:32.780 | where I, like the next headset that's coming out
03:32:36.660 | that we've made various announcements about
03:32:38.980 | is gonna be a higher end headset,
03:32:40.500 | more expensive, more features.
03:32:42.020 | Lots of people wanna make those trade-offs.
03:32:44.620 | I, you know, we'll see what the market has to say
03:32:47.220 | about the exact trade-offs we've made here.
03:32:49.780 | But if you wanna replace Zoom,
03:32:51.460 | you need to have something that everybody has.
03:32:53.820 | And- - So you like cheaper.
03:32:56.460 | - I like cheaper because also lighter and cheaper
03:33:00.180 | wind up being a virtuous cycle there
03:33:03.340 | where expensive and more features
03:33:05.580 | tends to also lead towards heavier.
03:33:07.260 | And it just kind of goes, it's like,
03:33:08.500 | let's add more features.
03:33:09.660 | The features are not free, you know,
03:33:11.980 | they have physical presence and weight
03:33:13.780 | and draw from batteries and all of those things.
03:33:16.140 | So I've always favored a lower end,
03:33:19.220 | cheaper, faster approach.
03:33:21.340 | That's why I was always behind the mobile side of VR
03:33:24.340 | rather than the higher end PC headsets.
03:33:26.700 | And I think that's, you know, that's proven out well.
03:33:29.460 | But there's, you always,
03:33:31.100 | ideally we have a whole range of things,
03:33:32.900 | but if you've only got one or two things,
03:33:35.380 | it's important that those two things cover the,
03:33:38.220 | you know, the scope that you think is most important.
03:33:40.580 | When we're in a world when it's like cell phones
03:33:42.500 | and there's 50 of them on the market
03:33:44.020 | covering every conceivable ecological niche you want,
03:33:46.980 | that's gonna be great,
03:33:47.940 | but we're not gonna be there for a while.
03:33:49.940 | - Where are the bottlenecks?
03:33:51.540 | Is it the hardware or the software?
03:33:53.380 | - Yeah, so right now, you can play,
03:33:56.300 | you can get Workrooms on Quest
03:33:58.900 | and you can set up these things
03:34:00.340 | and it's a pretty good experience.
03:34:01.700 | It's surprisingly good.
03:34:02.700 | - I haven't tried it.
03:34:03.540 | It's surprisingly good.
03:34:05.140 | - Yeah, the voice latency is better on that
03:34:08.100 | than a lot better than a Zoom meeting.
03:34:10.140 | So you've got a better sense of immediacy there.
03:34:13.540 | The expressions that you get from the current hardware
03:34:17.060 | with just kind of your controllers and your head
03:34:20.060 | is pretty realistic feeling.
03:34:21.580 | You've got a pretty good sense of being there
03:34:23.300 | with someone with it.
03:34:24.140 | - Are these like avatars of people?
03:34:27.540 | Like do you get to see their body?
03:34:29.940 | - Yeah.
03:34:30.980 | - And they're sitting around a table?
03:34:32.460 | - Yeah.
03:34:33.300 | - And it feels better than Zoom?
03:34:35.700 | - Better than you'd expect for that.
03:34:37.820 | It is definitely, yeah, I'd say it's quite a bit better
03:34:42.180 | than Zoom when everything's working right.
03:34:43.900 | But there's still all the rough edges of,
03:34:46.460 | the reason Zoom became so successful
03:34:48.260 | is because they just nailed the usability of everything.
03:34:51.060 | It's high quality with an absolutely first-rate experience.
03:34:54.980 | And we are not there yet with any of the VR stuff.
03:34:58.060 | I'm trying to push hard to get, I keep talking about it.
03:35:02.260 | It's like, it needs to just be one click
03:35:03.820 | to make everything happen.
03:35:05.060 | And we're getting there in our home environment,
03:35:07.660 | not the whole workrooms application,
03:35:09.300 | but the main home where you can now kind of go over
03:35:11.780 | and click an invite.
03:35:12.700 | And it still winds up taking five times longer
03:35:15.260 | than it should.
03:35:16.300 | But we're getting close to that where you click there,
03:35:19.500 | they click on their button and then they're sitting there
03:35:22.020 | in this good presence with you.
03:35:23.860 | But latencies need to get a lot better.
03:35:25.940 | User interface needs to get a lot better.
03:35:28.780 | Ubiquity of the headsets needs to get better.
03:35:30.900 | We need to have a hundred million of them out there
03:35:33.780 | just so that everybody knows somebody
03:35:35.420 | that uses this all the time.
03:35:37.300 | - Well, I think it's a virtuous cycle
03:35:38.540 | because I do think the interface
03:35:42.780 | is the thing that makes or breaks this kind of revolution.
03:35:48.140 | It's so interesting how like you said one click,
03:35:50.860 | but it's also like how you achieve that one click.
03:35:54.020 | I don't know.
03:35:54.860 | Can I ask a dark question?
03:35:58.620 | Maybe let's keep it outside of Meta,
03:36:00.700 | but this is about Meta, but also Google and big companies.
03:36:05.020 | Are they able to do this kind of thing?
03:36:07.940 | It seems like, let me put on my cranky old man hat,
03:36:12.180 | they seem to not do a good job
03:36:15.660 | of creating these user-friendly interfaces
03:36:20.500 | as they get bigger and bigger as a company.
03:36:22.900 | Like Google has created some of the greatest interfaces ever
03:36:25.980 | early on in its history, I mean, creating Gmail,
03:36:30.340 | just so many brilliant interfaces.
03:36:34.100 | And it just seems to be getting crappier and crappier
03:36:36.540 | at that, same with Meta, same with Microsoft.
03:36:40.540 | It's just, it seems to get worse and worse at that.
03:36:45.020 | I don't know what is it,
03:36:46.140 | because you've become more conservative, careful,
03:36:48.620 | risk averse, is that why?
03:36:51.180 | Can you speak to that?
03:36:52.020 | - So it's been really eye-opening to me
03:36:53.540 | working inside a tech titan where I am,
03:36:57.180 | you know, I had my small companies
03:36:59.500 | and then we're acquired by a mid-size game publisher
03:37:03.500 | and then Oculus getting acquired by Meta,
03:37:06.740 | and Meta has grown by a factor of many
03:37:08.900 | just in the eight years since the acquisition.
03:37:12.980 | So I did not have experience with this.
03:37:16.260 | And it was interesting because I remember like,
03:37:19.220 | previously my benchmark for kind of use of resources
03:37:23.020 | was some of the government programs
03:37:24.460 | I interacted with on the aerospace side.
03:37:26.900 | And I remember thinking there was,
03:37:28.860 | okay, there's an Air Force program
03:37:30.340 | and they spent $50 million and they didn't launch anything.
03:37:34.340 | They didn't even build anything.
03:37:35.500 | It was just kind of like they, you know,
03:37:37.660 | they made a bunch of papers and had some parts
03:37:40.060 | in a warehouse and nothing came of it.
03:37:42.180 | It's like $50 million.
03:37:43.380 | And I've had to radically recalibrate my sense of like
03:37:49.140 | how much money can be spent with mediocre results.
03:37:54.060 | Where on the plus side, VR has turned out,
03:37:58.020 | we've built pretty much exactly what I wanted. You know,
03:38:02.260 | we just passed the 10-year mark
03:38:04.860 | from my first demo of the Rift.
03:38:06.940 | And if I could have said what I wanted to have,
03:38:09.580 | it would have been a standalone inside-out
03:38:12.100 | tracked 4K resolution headset
03:38:15.820 | that could still plug into a PC for high-end rendering.
03:38:18.380 | And that's exactly what we've got on Quest 2 right now.
03:38:21.340 | - First of all, let's pause on that
03:38:22.620 | with me being cranky and everything.
03:38:24.980 | What Meta achieved with Oculus and so on is incredible.
03:38:29.980 | I mean, this is, when I thought about the future of VR,
03:38:33.980 | this is what I imagined in terms of hardware, I would say.
03:38:36.700 | And maybe in terms of the experience as well,
03:38:39.220 | but it's still not there somehow.
03:38:42.060 | - On the one hand, we did kind of achieve it and win,
03:38:44.580 | and we've got, we've sold, you know,
03:38:46.340 | we're a success right now,
03:38:48.140 | but the amount of resources that have gone into it,
03:38:51.220 | it winds up getting cluttered up in accounting
03:38:53.500 | where Mark did announce that they spend $10 billion a year
03:38:58.340 | on Reality Labs.
03:38:59.500 | Now Reality Labs covers a lot.
03:39:01.700 | VR was not the largest part of it.
03:39:04.260 | It also had Portal and Spark and the big AR research efforts.
03:39:08.140 | And it's been expanding out to include AI
03:39:11.060 | and other things there where there's a lot going on there.
03:39:15.540 | But $10 billion was just a number
03:39:18.100 | that I had trouble processing.
03:39:20.220 | It's just, I feel sick to my stomach
03:39:22.460 | thinking about that much money being spent.
03:39:24.980 | But that's how they demonstrate commitment to this,
03:39:27.780 | more so than, like, yeah,
03:39:31.380 | Google goes and cancels all of these projects
03:39:34.060 | and different things like that,
03:39:36.020 | while Meta is really sticking with the funding of VR,
03:39:39.460 | and of AR, which is still further out.
03:39:41.860 | So there's something to be said for that.
03:39:44.780 | It's not just gonna vanish, the work's going in.
03:39:46.900 | I just wish it could be,
03:39:48.740 | all those resources could be applied more effectively
03:39:51.700 | because I see all these cases.
03:39:53.700 | I point out these examples of third parties
03:39:56.860 | that we're kind of competing with in various ways.
03:39:58.900 | There's a number of these examples,
03:40:00.540 | and they do the work with a tenth of the people
03:40:03.780 | that we do internally.
03:40:05.820 | And a lot of it comes from, yes,
03:40:08.060 | the small company can just go do it,
03:40:10.060 | while in a big company, you do have to worry about,
03:40:12.900 | is there some SDK internally that you should be using
03:40:16.780 | because another team's making it?
03:40:18.140 | You have to have your cross-functional group meetups
03:40:21.020 | for different things.
03:40:22.220 | You do have more concerns about privacy
03:40:25.260 | or diversity and equity and safety of different things,
03:40:28.860 | parental issues and things that a small startup company
03:40:31.860 | can just kind of cowboy off and do something interesting.
03:40:36.540 | And there's a lot more that is a problem
03:40:39.380 | that you have to pay attention to in the big companies,
03:40:41.420 | but I'm not willing to believe that we are within
03:40:43.820 | even a factor of two or four
03:40:46.300 | of what the efficiency could be.
03:40:48.780 | I am constantly kind of crying out for,
03:40:51.780 | it's like, we can do better than this.
03:40:53.460 | - Yeah, and you wonder what the mechanisms
03:40:55.100 | to unlock that efficiency are.
03:40:57.620 | There is some sense in a large company
03:41:02.460 | that an individual engineer might not believe
03:41:05.700 | that they can change the world.
03:41:07.140 | Maybe you delegate a little bit of the responsibility
03:41:11.420 | to be the one who changes the world in a big company,
03:41:14.380 | I think, but the reality is like the world will get changed
03:41:19.380 | by a single engineer anyway.
03:41:21.340 | So whether inside Google or inside a startup,
03:41:24.380 | it doesn't matter.
03:41:25.220 | It's just like Google and Meta needs to help
03:41:28.140 | those engineers believe.
03:41:29.700 | They're the ones that are gonna decrease that latency.
03:41:32.780 | It'll take one John Carmack,
03:41:34.700 | the 20-year-old Carmack that's inside Meta right now
03:41:39.060 | to change everything.
03:41:40.540 | - And I try to point that out and push people.
03:41:43.020 | It's like, try to go ahead.
03:41:45.020 | And when you see some, because there is,
03:41:46.740 | you get the silo mentality where you're like,
03:41:48.820 | okay, I know something's not right over there,
03:41:50.900 | but I'm staying in my lane here.
03:41:53.580 | And there's a couple people that I can think about
03:41:57.340 | that are willing to just like hop all over the place.
03:41:59.540 | And man, I treasure them,
03:42:01.140 | the people that are just willing to, they're fearless.
03:42:03.940 | They will go over and they will go rebuild the kernel
03:42:06.380 | and change this distribution and go in
03:42:08.260 | and hack the firmware over here to get something done right.
03:42:11.820 | And that is relatively rare.
03:42:14.180 | There's thousands of developers
03:42:15.740 | and you've got a small handful
03:42:17.100 | that are willing to operate at that level.
03:42:19.500 | And it's potentially risky for them.
03:42:22.100 | The politics are real in a lot of that.
03:42:24.900 | And I'm very much in the privileged position
03:42:28.100 | of being more or less untouchable there,
03:42:31.300 | where I've been dinged like twice for it.
03:42:33.500 | It's like, you said something insensitive in that post
03:42:35.660 | and you should probably not say that.
03:42:38.900 | But for the most part, yes,
03:42:40.740 | I get away with every week I'm posting something
03:42:44.340 | pretty loud and opinionated internally.
03:42:47.460 | And I think that's useful for the company,
03:42:50.220 | but yeah, it's rare to have a position like that.
03:42:54.860 | And I can't necessarily offer advice
03:42:56.620 | for how someone can do that.
03:42:59.060 | - Well, you could offer advice to a company in general
03:43:01.500 | to give a little bit of freedom to the young,
03:43:05.540 | because the wildest ideas come from young minds.
03:43:10.660 | And so you need to give the young minds freedom
03:43:13.020 | to think big and wild and crazy.
03:43:16.380 | And for that, they have to be opinionated.
03:43:18.540 | They have to be,
03:43:19.780 | they have to think crazy ideas and thoughts
03:43:24.340 | and pursue them with a full passion
03:43:26.260 | without being slowed down by bureaucracy or managers
03:43:29.180 | and all that kind of stuff.
03:43:31.060 | Obviously startups really empower that,
03:43:33.020 | but big companies could too.
03:43:34.220 | And that's a design challenge for big companies
03:43:37.940 | to see how can you enable that?
03:43:40.180 | How can you empower that?
03:43:41.020 | - Yeah, because the big company,
03:43:41.860 | there are so many resources there.
03:43:43.620 | And they do, amazing things do get accomplished,
03:43:46.580 | but there's so much more that could come out of that.
03:43:49.780 | And I'm always hopeful.
03:43:51.540 | I'm an optimist in almost everything.
03:43:53.220 | I think things can get better.
03:43:54.980 | I think that they can improve things,
03:43:56.780 | that you go through a path and you're learning
03:43:59.500 | kind of what does and doesn't work.
03:44:01.420 | And I'm not ready to be fatalistic
03:44:03.700 | about the kind of the outcome of any of that.
03:44:06.820 | - Me neither.
03:44:07.660 | I know too many good people
03:44:09.060 | inside of those large companies that are incredible.
03:44:12.460 | You have a friendship with Elon Musk.
03:44:16.060 | Often when I talk to him,
03:44:17.380 | he'll bring up how incredible of an engineer
03:44:19.620 | and just a big picture thinker you are.
03:44:22.420 | He has a huge amount of respect for you.
03:44:26.300 | I just, I've never been a fly on the wall
03:44:28.900 | between the discussion between the two of you.
03:44:30.580 | I just wonder, is there something you guys debate,
03:44:34.940 | argue about, discuss?
03:44:36.820 | Is there some interesting problems
03:44:38.300 | that the two of you think about?
03:44:41.260 | You come from different worlds.
03:44:42.500 | Maybe there's some intersection in aerospace.
03:44:45.900 | Maybe there's some intersection in your new efforts
03:44:50.340 | in artificial intelligence in terms of thinking.
03:44:53.600 | Is there something interesting you could say
03:44:55.140 | about sort of the debates the two of you have?
03:44:57.340 | - So I think in some ways,
03:44:58.860 | we do have a kind of similar background
03:45:01.300 | where we're almost exactly the same age
03:45:03.540 | and we had kind of similar programming backgrounds
03:45:06.420 | on the personal computers
03:45:07.820 | and even some of the books that we would read
03:45:10.660 | and things that would kind of turn us into the people
03:45:13.100 | that we are today.
03:45:14.700 | And I think there is a degree of similarity in sensibilities
03:45:19.700 | where we kind of call bullshit on the same things
03:45:22.940 | and kind of see the same opportunities
03:45:25.820 | in different technology.
03:45:27.220 | And there's that sense of,
03:45:29.020 | I always talk about the speed of light solutions for things.
03:45:31.580 | And he's thinking about kind of minimum manufacturing
03:45:34.900 | and engineering and operational standpoints for things.
03:45:38.980 | And so, I mean, I first met Elon
03:45:42.220 | right at the start of the aerospace era
03:45:44.060 | where I wasn't familiar with,
03:45:46.180 | I was still in my game dev bubble.
03:45:47.740 | I really wasn't familiar with all the startups
03:45:49.820 | that were going and being successful
03:45:52.060 | and what went on with PayPal
03:45:53.660 | and all of his different companies.
03:45:54.940 | But I met him as I was starting to do Armadillo Aerospace
03:45:58.860 | and he came down with kind of his right-hand propulsion guy
03:46:03.100 | and we talked about rockets.
03:46:05.420 | What can we do with this?
03:46:07.260 | And it was kind of specific things about
03:46:09.740 | like how are our flight computers set up?
03:46:12.140 | What are different propellant options?
03:46:14.940 | What can happen with different ways
03:46:17.580 | of putting things together?
03:46:19.380 | And then in some ways,
03:46:21.220 | he was certainly the biggest player
03:46:22.900 | in the sort of alt space community
03:46:24.900 | that was going on in the early 2000s.
03:46:28.100 | He was the most well-funded,
03:46:30.620 | although his funding in the larger scheme of things
03:46:32.860 | compared to, like, a NASA or something like that,
03:46:36.780 | was really tiny.
03:46:38.260 | It was a lot more than I had at the time,
03:46:40.500 | but it was interesting.
03:46:42.660 | I had a point years later when I realized,
03:46:45.540 | okay, my financial resources at this point
03:46:48.860 | are basically what Elon's was
03:46:50.460 | when he went all in on SpaceX and Tesla.
03:46:54.340 | And there's, I think in many corners,
03:46:58.340 | he does not get the respect that he should
03:47:01.140 | about being a wealthy person that could just retire.
03:47:04.980 | And he went all in where he was really going to,
03:47:08.700 | he could have gone bust.
03:47:11.380 | And there's plenty of people,
03:47:12.540 | you'd look at the sad athletes or entertainers
03:47:16.780 | that had all the money in the world and blew it.
03:47:18.620 | He could have been the business case example of that.
03:47:22.420 | But the things that he was doing,
03:47:25.460 | space exploration, electrification of transportation,
03:47:29.860 | solar city type things, these are big world level things.
03:47:34.580 | And I have a great deal of admiration
03:47:36.820 | that he was willing to throw himself
03:47:38.740 | so completely into that.
03:47:40.740 | Because in contrast with myself,
03:47:43.060 | I was doing Armadillo Aerospace with this tightly bounded,
03:47:47.020 | it was John's crazy money at the time
03:47:49.580 | that had a finite limit on it.
03:47:51.420 | It was never going to impact me or my family
03:47:54.860 | if it completely failed.
03:47:56.580 | And I was still hedging my bets
03:47:58.700 | working at id Software at the time
03:48:01.060 | when he had been really all in there.
03:48:04.940 | And I have a huge amount of respect for that.
03:48:07.980 | And people do not,
03:48:09.380 | the other thing I get irritated with is people would say,
03:48:11.500 | it's like, "Oh, Elon's just a business guy.
03:48:13.860 | He just got like, he was gifted the money
03:48:16.500 | and he's just kind of investing in all of this."
03:48:19.460 | When he was really deeply involved
03:48:22.260 | in a lot of the decisions.
03:48:24.540 | Not all of them were perfect,
03:48:25.820 | but he cared very much about engine material selection,
03:48:30.420 | propellant selection.
03:48:31.700 | And for years he'd be kind of telling me,
03:48:34.340 | it's like, "Get off that hydrogen peroxide stuff."
03:48:37.060 | It's like, "Liquid oxygen is the only proper oxidizer
03:48:40.900 | for this."
03:48:41.900 | And the times that I've gone through the factories with him,
03:48:46.900 | we're talking very detailed things
03:48:49.820 | about like how this weld is made,
03:48:51.860 | how this sub-assembly goes together.
03:48:53.940 | What are like startup shutdown behaviors
03:48:57.580 | of the different things.
03:48:58.540 | So he is really in there at a very detailed level.
03:49:03.460 | And I think that he is the best modern example now
03:49:06.980 | of someone that
03:49:08.540 | can effectively micromanage some decisions on things
03:49:11.900 | on both Tesla and SpaceX to some degree
03:49:15.060 | where he cares enough about it.
03:49:16.940 | I worry a lot that he's stretched too thin,
03:49:19.300 | that you get boring company and Neuralink and Twitter
03:49:22.980 | and all the other possible things there
03:49:25.300 | where I know I've got limits
03:49:28.500 | on how much I can pay attention to
03:49:30.860 | that I have to kind of box off different amounts of time.
03:49:34.060 | And I look back at like my aerospace side of things.
03:49:36.500 | It's like, I did not go all in on that.
03:49:38.420 | I did not commit myself at a level
03:49:40.660 | that it would have taken to be successful there.
03:49:43.540 | And it's kind of a weird thing,
03:49:46.460 | just like having a discussion with him.
03:49:49.060 | He's the richest man in the world right now,
03:49:50.900 | and he operates on a level that is still very much
03:49:55.900 | in my wheelhouse on a technical side of things.
03:50:00.180 | - So doing that systems level type of thinking
03:50:02.500 | where you can go to the low level details
03:50:04.900 | and go up high to the big picture.
03:50:07.460 | Do you think in aerospace arena in the next five, 10 years,
03:50:12.460 | do you think we're gonna put a human on Mars?
03:50:16.380 | Like, what do you think is the interesting?
03:50:20.300 | - No, I do.
03:50:21.140 | In fact, I made a bet with someone
03:50:23.060 | in a group of people
03:50:24.860 | about whether there'd be boots on Mars by 2030.
03:50:28.420 | And this was kind of a fun story
03:50:31.300 | because I was at an Intel sponsored event
03:50:34.140 | and we had a bunch of just world-class brilliant people.
03:50:37.380 | And we were talking about computing stuff,
03:50:39.100 | but the after dinner conversation was like,
03:50:41.500 | what are some other things?
03:50:42.340 | How are they gonna go in the future?
03:50:43.940 | And one of the ones tossed up on the whiteboard
03:50:46.140 | was like boots on Mars by 2030.
03:50:49.020 | And most of the people in the room thought, yes.
03:50:52.300 | They thought that like SpaceX is kicking ass.
03:50:54.460 | We've got all this possible stuff.
03:50:57.020 | Seems likely that it's gonna go that way.
03:50:59.420 | And I said, no, I think less than 50% chance
03:51:03.260 | that it's going to make it there.
03:51:05.420 | And people were kind of like,
03:51:07.300 | oh, why the pessimism or whatever?
03:51:09.420 | And of course I'm an optimist at almost everything,
03:51:12.060 | but for me to be the one kind of outlier saying,
03:51:14.820 | no, I don't think so.
03:51:17.260 | Then I started saying some of the things I said,
03:51:19.420 | well, let's be concrete about it.
03:51:21.060 | Let's bet $10,000 that it's not gonna happen.
03:51:24.900 | And this was really a startling thing to see that I,
03:51:29.820 | again, room full of brilliant people,
03:51:31.660 | but as soon as like money came on the line
03:51:34.340 | and they were like, do I want to put up 10,000?
03:51:36.700 | I was not the richest person in the room.
03:51:38.820 | There were people much better off than I was.
03:51:41.300 | There was a spectrum, but as soon as they started thinking,
03:51:45.340 | it's like, oh, I could lose money
03:51:46.780 | by keeping my position right now.
03:51:50.780 | And all these engineers, they engaged their brain.
03:51:53.420 | They started thinking, it's like, okay, launch windows,
03:51:56.660 | launch delays, how many times would it take
03:51:59.460 | to get this right?
03:52:00.380 | What historical precedents do we have?
03:52:02.700 | And then it mostly came down to, it's like,
03:52:05.820 | well, what about in transit by 2030?
03:52:08.220 | And then what about different things
03:52:11.500 | or would you go for 2032?
03:52:13.940 | But one of the people did go ahead
03:52:15.780 | and was optimistic enough to make a bet with me.
03:52:18.340 | So I have a $10,000 bet that it won't happen by 2030.
03:52:22.020 | I think it's gonna happen shortly thereafter.
03:52:24.460 | I think there will probably be infrastructure on Mars by 2030,
03:52:27.660 | but I don't think that we'll have humans on Mars in 2030.
03:52:31.180 | I think it's possible, but I think it's less
03:52:33.580 | than a 50% chance, so I felt safe making that bet.
03:52:36.780 | - Well, I think you had an interesting point.
03:52:40.020 | Correct me if I'm wrong.
03:52:41.260 | That's a dark one.
03:52:42.500 | That should perhaps help people appreciate Elon Musk
03:52:47.500 | which is in this particular effort,
03:52:53.460 | Elon is critical to the success.
03:52:56.860 | SpaceX seems to be critical to
03:53:01.220 | humans on Mars by 2030 or thereabouts.
03:53:07.100 | So if something happens to Elon,
03:53:10.500 | then all of this collapses.
03:53:13.740 | - And this is in contrast to the other $10,000 bet
03:53:17.460 | I made kind of recently.
03:53:18.620 | And that was self-driving cars at like a level five
03:53:21.940 | running around cities.
03:53:23.380 | And people have kind of nitpicked that
03:53:25.340 | that we probably don't mean exactly level five
03:53:27.820 | but the guy I'm having the bet with
03:53:30.020 | is we're gonna be, we know what we mean about this.
03:53:32.980 | - Jeff Atwood.
03:53:33.900 | - Yeah, Coding Horror and Stack Overflow and all.
03:53:37.220 | But I, yeah, I mean, it's just,
03:53:39.460 | he doesn't think that people are gonna be riding around
03:53:41.780 | in robo taxis in 2030 in major cities
03:53:45.180 | just like you take an Uber now.
03:53:47.620 | And I think it will.
03:53:48.660 | - You think it will.
03:53:49.500 | - And I think, and the difference is everybody looks at this
03:53:51.540 | it's like, oh, but Tesla has been wrong for years.
03:53:53.660 | They've been promising it for years and it's not here yet.
03:53:56.580 | And the reason this is different than the bet with Mars
03:54:00.620 | is Mars really is more than is comfortable
03:54:04.340 | a bet on Elon Musk.
03:54:06.300 | I am, that is his thing.
03:54:09.380 | And he is really going to move heaven and earth
03:54:11.980 | to try to make that happen.
03:54:13.620 | - Perhaps not even SpaceX.
03:54:15.100 | - Yeah.
03:54:15.940 | - Perhaps just Elon Musk.
03:54:18.020 | - Yeah, because if Elon went away and SpaceX went public
03:54:21.780 | and got a board of directors,
03:54:24.020 | there are more profitable things they could be doing
03:54:26.420 | than focusing on human presence on Mars.
03:54:29.500 | So this really is a sort of personal thing there.
03:54:32.780 | And in contrast with that,
03:54:34.860 | self-driving cars have a dozen credible companies
03:54:38.820 | working really hard.
03:54:40.660 | And while yes, it's going slower
03:54:43.380 | than most people thought it would,
03:54:45.660 | betting against that is a bet against
03:54:47.580 | almost the entire world in terms of all of these companies
03:54:51.100 | that have all of these incentives.
03:54:52.860 | It's not just one guy's passion project.
03:54:56.740 | And I do think that it is solvable.
03:54:59.860 | Although I recognize it's not a hundred percent chance
03:55:02.540 | because it's possible the long tail of self-driving problems
03:55:05.780 | winds up being an AGI complete problem.
03:55:08.500 | I think there's plenty of value to mine out of it
03:55:10.780 | with narrow AI.
03:55:11.900 | And I think that it's going to happen
03:55:14.260 | probably more so than people expect.
03:55:16.540 | But it's that whole sigmoid curve
03:55:18.300 | where you overestimate the near-term progress
03:55:21.700 | and you underestimate the long-term progress.
03:55:23.940 | And I think self-driving is gonna be like that.
03:55:26.580 | And I think 2030 is still a pretty good bet.
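[Editorial aside, not part of the conversation: the S-curve Carmack invokes here is often modeled as a logistic function, which is nearly flat early on, so near-term progress disappoints, and compounds through the middle, so long-term progress surprises. A minimal sketch of that form:]

```latex
% Logistic (S-shaped) progress curve; an illustration, not Carmack's formula.
% P(t): capability at time t, L: eventual ceiling,
% k: growth rate, t_0: inflection point where growth is fastest.
P(t) = \frac{L}{1 + e^{-k(t - t_0)}}
```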
03:55:29.900 | - Yeah, unfortunately, self-driving is a problem
03:55:34.900 | that is safety critical, meaning that
03:55:39.580 | if you don't do it well, people get hurt.
03:55:43.260 | - But the other side of that is people are terrible drivers.
03:55:46.460 | So it is not going to be,
03:55:48.180 | that's probably gonna be the argument that gets it through
03:55:50.420 | is like we can save 10,000 lives a year
03:55:54.100 | by taking imperfect self-driving cars
03:55:56.820 | and letting them take over a lot of driving responsibilities.
03:55:59.900 | It's like, what is it, 30,000 people a year
03:56:02.220 | die in auto accidents right now in America?
03:56:04.820 | And a lot of those are preventable.
03:56:06.940 | And the problem is you'll have people that
03:56:09.060 | every time a Tesla crashes into something,
03:56:11.300 | you've got a bunch of people that
03:56:12.660 | literally have vested interest shorting Tesla
03:56:14.940 | to come out and make it the worst thing in the world.
03:56:17.460 | And people will be fighting against that.
03:56:19.420 | But optimist in me again,
03:56:21.300 | I think that we will have systems
03:56:23.240 | that are statistically safer than human drivers.
03:56:26.260 | And we will be saving thousands and thousands of lives
03:56:29.900 | every year when we can hand over
03:56:31.920 | more of those responsibilities to it.
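[A hedged back-of-envelope sketch of the statistical-safety argument above; the 30,000-per-year figure is the one quoted in the conversation, and the adoption share and relative risk are purely hypothetical assumptions:]

```python
# Illustrative arithmetic only; adoption share and relative risk are
# hypothetical assumptions, not measured values.
us_road_deaths_per_year = 30_000   # figure quoted in the conversation
share_of_miles_automated = 0.5     # hypothetical: half of driving handed over
relative_risk_vs_human = 0.5       # hypothetical: half the fatality rate

deaths_after = (us_road_deaths_per_year * (1 - share_of_miles_automated)
                + us_road_deaths_per_year * share_of_miles_automated
                * relative_risk_vs_human)
print(f"Lives saved per year: {us_road_deaths_per_year - deaths_after:,.0f}")
# -> 7,500 under these assumptions; "statistically safer" wins even when
#    the system is imperfect.
```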
03:56:34.260 | - I do still think as a person who studied this problem
03:56:37.140 | very deeply from a human side as well,
03:56:40.980 | it's still an open problem
03:56:46.380 | how good or bad humans are at driving.
03:56:46.380 | It's a kind of funny thing we say about each other.
03:56:49.460 | Oh, humans suck at driving.
03:56:51.280 | Everybody except you, of course,
03:56:54.820 | like we think we're good at driving.
03:56:56.740 | But after really studying it,
03:56:59.940 | I think you start to notice,
03:57:03.300 | 'cause I watched hundreds of hours of humans driving
03:57:06.740 | for projects on this kind of thing.
03:57:08.740 | You've noticed that even with the distraction,
03:57:11.780 | even with everything else,
03:57:13.500 | humans are able to do some incredible things
03:57:17.820 | with their attention.
03:57:19.380 | Even when you're just looking at a smartphone,
03:57:21.100 | you still get cues from the environment
03:57:23.340 | to make last-second decisions,
03:57:26.740 | to use instinctual types of decisions
03:57:29.020 | that actually save your ass time and time and time again,
03:57:33.140 | and are able to do that with so much uncertainty around you
03:57:37.980 | in such tricky dynamic environments.
03:57:40.860 | I don't know.
03:57:42.020 | I don't know exactly how hard is it to beat
03:57:47.020 | that kind of skill of common sense reasoning.
03:57:50.860 | - This is one of those interesting things
03:57:52.740 | that there've been a lot of studies
03:57:53.980 | about how experts in their field
03:57:56.460 | usually underestimate the progress that's going to happen
03:57:59.660 | because an expert thinks about all the problems
03:58:02.300 | they deal with.
03:58:03.140 | And they're like,
03:58:03.960 | "Damn, I'm gonna have a hard time solving all of this."
03:58:06.380 | And they filter out the fact that they are one expert
03:58:08.860 | in a field of thousands.
03:58:10.620 | And you think about,
03:58:11.900 | "Yeah, I can't do all of that."
03:58:13.460 | And you sometimes forget about the scope of the ecosystem
03:58:16.780 | that you're embedded in.
03:58:17.980 | And if you think back eight years,
03:58:19.740 | very specifically the state of AI and machine learning,
03:58:22.580 | where were we? We had just gotten ResNets probably
03:58:25.900 | at that point.
03:58:26.860 | And you look at all the amazing magical things
03:58:29.820 | that have happened in eight years.
03:58:31.460 | And they do kind of seem to be happening a little faster
03:58:34.380 | in recent years also.
03:58:36.100 | And you project that eight more years into the future,
03:58:39.080 | where again, I think there's a 50% chance
03:58:41.220 | we're gonna have signs of life of AGI,
03:58:43.860 | which we can put through driver's ed if we need to,
03:58:46.420 | to actually build self-driving cars.
03:58:48.700 | And I think that the narrow systems
03:58:50.420 | are going to have real value demonstrated well before then.
03:58:54.940 | - So signs of life in AGI.
03:58:57.940 | You've mentioned that,
03:58:59.780 | okay, first of all,
03:59:02.540 | you're one of the most brilliant people on this earth.
03:59:04.980 | You could be solving a number of different problems,
03:59:08.100 | as you've mentioned.
03:59:09.420 | Your mind was attracted to nuclear energy.
03:59:12.340 | Obviously virtual reality with the metaverse
03:59:14.660 | is something you could have a tremendous impact on.
03:59:16.740 | - I do wanna say a quick thing about nuclear energy,
03:59:19.180 | where this is something that really,
03:59:22.820 | so precisely feels like aerospace before SpaceX,
03:59:27.260 | where from everything that I know about all of these,
03:59:30.420 | the physics of this stuff hasn't changed.
03:59:33.500 | And the reasons why things are expensive now
03:59:36.660 | are not fundamental.
03:59:38.060 | Somebody should be going really hard, Elon Musk style,
03:59:44.180 | at fission, economical fission, not fusion,
03:59:48.860 | where the fusion is the kind of the darling
03:59:52.540 | of people that wanna go and do nuclear
03:59:54.580 | because it doesn't have the taint that fission has
03:59:57.220 | in a lot of people's minds.
03:59:58.860 | But fusion is an almost absurdly complex thing,
04:00:02.140 | where, as you look at the tokamaks
04:00:05.860 | or any of the things that people are building,
04:00:07.620 | it's doing all of this infrastructure
04:00:09.940 | just, at the end of the day, to make something hot
04:00:12.900 | that you can then turn into energy
04:00:14.740 | through a conventional power plant.
04:00:17.020 | And all of that work, which we think
04:00:18.940 | we've got line of sight on, but even if it comes out,
04:00:22.100 | then you have to do all of that
04:00:23.900 | immensely complex, expensive stuff
04:00:26.180 | just to make something hot,
04:00:27.380 | where nuclear fission is basically
04:00:29.380 | you put these two rocks together
04:00:31.020 | and they get hot all by themselves.
04:00:33.140 | That is just that much simpler.
04:00:35.220 | It's just orders of magnitude simpler.
04:00:38.220 | And the actual rocks, the refined uranium,
04:00:40.500 | is not very expensive.
04:00:41.900 | It's a couple percent of the cost of electricity.
04:00:45.220 | That's why I made that point
04:00:46.260 | where you could have something
04:00:47.660 | which was five times less efficient than current systems.
04:00:52.220 | And if the rest of the plant was a whole bunch cheaper,
04:00:54.180 | it could still be super, super valuable.
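[A hedged sketch of that arithmetic; the couple-percent fuel share and the five-times-less-efficient plant come from the conversation, while the plant-cost reduction is a hypothetical assumption:]

```python
# Why cheap fuel makes fuel efficiency almost irrelevant; illustrative only.
fuel_share = 0.02        # "a couple percent of the cost of electricity"
other_share = 0.98       # capital, operations, everything else

fuel_penalty = 5.0       # a plant five times less fuel-efficient
plant_discount = 0.5     # hypothetical: rest of the plant built for half

relative_cost = fuel_share * fuel_penalty + other_share * plant_discount
print(f"Cost relative to today: {relative_cost:.2f}")
# -> 0.59: even burning five times the fuel, electricity comes out
#    roughly 40% cheaper under these assumptions.
```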
04:00:57.140 | - So how much of the pie do you think
04:00:59.860 | could be solved by nuclear energy by fission?
04:01:04.460 | So how much could it become
04:01:06.500 | the primary source of energy on earth?
04:01:08.860 | - It could be most of it.
04:01:10.060 | Like the reserves of uranium as it stands now
04:01:12.180 | could not power the whole earth.
04:01:13.700 | But you get into breeder reactors and thorium
04:01:16.940 | and things like that that you can do with conventional fission.
04:01:19.900 | There is enough for everything.
04:01:22.340 | Now, I mean, solar photovoltaic has been amazing.
04:01:25.380 | One of my current projects is working on an off-grid system.
04:01:29.940 | And it's been fun just kind of, again,
04:01:31.420 | putting my hands on all the stripping the wires
04:01:33.900 | and wiring things together and doing all of that.
04:01:36.060 | And just having followed that a little bit from the outside
04:01:39.020 | over the last couple decades,
04:01:40.900 | there's been semiconductor-like magical progress
04:01:44.260 | in what's going on there.
04:01:45.940 | So I'm all for all of that,
04:01:47.980 | but it doesn't solve everything.
04:01:49.540 | And nuclear really still does seem like the smart money bet
04:01:53.340 | for what you should be getting for baseload
04:01:55.540 | on a lot of things.
04:01:56.940 | And solar may be cheaper for, you know,
04:01:59.420 | peaking, for air conditioning loads during the summer,
04:02:02.340 | and things that you can push around in different ways.
04:02:05.260 | But it's one of those things that's,
04:02:07.780 | it's just strange how we've had the technology sitting there,
04:02:10.940 | but these non-technical reasons on the social optics of it
04:02:14.460 | has been this major forcing function
04:02:17.220 | for something that really should be at the cornerstone
04:02:21.420 | of all of the world's concerns with energy.
04:02:24.100 | It's interesting how the non-technical factors
04:02:27.460 | have really dominated something that is so fundamental
04:02:30.380 | to kind of the existence of the human race
04:02:32.700 | as we know it today.
04:02:34.780 | - And many of the troubles of the world,
04:02:36.460 | including wars in different parts of the world,
04:02:39.620 | like Ukraine, are energy-based.
04:02:42.100 | And yeah, it's just sitting right there to be solved.
04:02:45.500 | That said, I mean, to me personally,
04:02:50.780 | I think it's clear that if AGI were to be achieved,
04:02:53.900 | that would change the course of human history.
04:02:56.540 | - AGI wise, I was, you know,
04:02:58.820 | I was making this decision about
04:03:00.420 | what do I want to focus on after VR?
04:03:03.740 | And I'm still working on VR regularly.
04:03:06.340 | I spend a day a week kind of consulting with Meta
04:03:09.100 | and, you know, Boz styles me the consulting CTO,
04:03:13.580 | which is kind of like the Sherlock Holmes that comes in
04:03:15.820 | and consults on some of the specific tough issues.
04:03:18.780 | And I'm still pretty passionate about all of that,
04:03:21.820 | but I have been figuring out how to compartmentalize
04:03:25.060 | and force that into a smaller box
04:03:27.340 | to work on some other things.
04:03:29.100 | And I did come down to this decision
04:03:30.860 | between working on economical nuclear fission
04:03:34.460 | or artificial general intelligence.
04:03:36.980 | And the fission side of things,
04:03:39.140 | I've got a bunch of interesting things going that way,
04:03:42.180 | but it would take,
04:03:43.980 | that would be a fairly big project thing to do.
04:03:46.380 | I don't think it needs to be as big as people expect.
04:03:48.940 | I do think something original-SpaceX-sized could do it:
04:03:51.740 | you build it, power your building off of it,
04:03:55.020 | and then the government, I think,
04:03:56.580 | will come around to what you need.
04:03:59.500 | Everybody loves an existence proof.
04:04:01.060 | I think it's possible somebody should be doing this,
04:04:03.780 | but it's going to involve some politics.
04:04:05.700 | It's going to involve decent sized teams
04:04:07.980 | and a bunch of this cross-functional stuff
04:04:09.860 | that I don't love.
04:04:11.180 | While the artificial general intelligence side of things,
04:04:14.820 | it seems to me like this is the highest leverage moment
04:04:21.580 | for potentially a single individual,
04:04:24.620 | potentially in the history of the world,
04:04:26.460 | where the things that we know about the brain,
04:04:30.020 | about what we can do with artificial intelligence,
04:04:33.300 | nobody can say absolutely on any of these things,
04:04:36.180 | but I am not a madman for saying that it is likely
04:04:40.980 | that the code for artificial general intelligence
04:04:44.220 | is going to be tens of thousands of lines of code,
04:04:47.940 | not millions of lines of code.
04:04:49.940 | This is code that conceivably one individual could write,
04:04:57.340 | unlike writing a new web browser or operating system.
04:04:57.340 | And based on the progress that AI has,
04:05:00.500 | machine learning has made in the recent decade,
04:05:03.860 | it's likely that the important things that we don't know
04:05:07.140 | are relatively simple.
04:05:08.700 | There's probably a handful of things,
04:05:10.700 | and my bet is that I think there are fewer than six
04:05:15.700 | key insights that need to be made.
04:05:17.780 | Each one of them can probably be written
04:05:19.340 | on the back of an envelope.
04:05:20.940 | We don't know what they are,
04:05:22.460 | but when they're put together in concert with GPUs at scale
04:05:26.220 | and the data that we all have access to,
04:05:28.860 | that we can make something that behaves like a human being
04:05:32.860 | or like a living creature,
04:05:34.900 | and that can then be educated in whatever ways
04:05:37.980 | that we need to get to the point
04:05:39.300 | where we can have universal remote workers,
04:05:42.300 | where anything that somebody does mediated by a computer
04:05:45.900 | and doesn't require physical interaction
04:05:48.820 | that an AGI will be able to do.
04:05:50.660 | We can already simulate the equivalent of the Zoom meetings
04:05:55.100 | with avatars and synthetic deep fakes and whatnot.
04:05:59.020 | We can definitely do that.
04:06:01.060 | We have superhuman capabilities on any narrow thing
04:06:04.100 | that we can formalize and make a loss function for,
04:06:08.100 | but there's things we don't know how to do now,
04:06:10.420 | but I don't think they are unapproachably hard.
04:06:13.380 | Now, that's incredibly hubristic to say that it's like,
04:06:17.100 | but I think that what I said a couple years ago
04:06:19.660 | is a 50% chance that somewhere there will be signs of life
04:06:23.460 | of AGI in 2030, and I've probably increased that slightly.
04:06:28.060 | I may be at 55, 60% now,
04:06:30.620 | because I do think there's a little sense
04:06:32.220 | of acceleration there.
04:06:34.660 | - So I wonder what the, and by the way,
04:06:36.500 | you also written that, "I bet with hindsight,
04:06:39.660 | we will find that clear antecedents
04:06:42.100 | of all the critical remaining steps for AGI
04:06:45.020 | are already buried somewhere
04:06:46.820 | in the vast literature of today."
04:06:48.460 | So the ideas are already there.
04:06:51.100 | - I think that's likely the case.
04:06:52.380 | One of the things that appeals to so many people,
04:06:54.580 | including me, about the promise of AGI
04:06:56.940 | is we know that we're only drinking from a straw,
04:07:00.620 | from the fire hose of all the information out there.
04:07:03.500 | I mean, you look at just in a very narrowly bounded field,
04:07:07.420 | like machine learning,
04:07:08.420 | like you can't read all the papers
04:07:09.900 | that come out all the time.
04:07:11.700 | You can't go back and read all the clever things
04:07:14.260 | that people did in the '90s or earlier
04:07:16.180 | that people have forgotten about,
04:07:17.500 | because they didn't pan out at the time
04:07:19.100 | when they were trying to do them with 12 neurons.
04:07:21.700 | And so this idea that, yeah,
04:07:25.060 | I think there are gems buried in some of the older literature
04:07:28.660 | that was not the path taken by everything.
04:07:31.060 | And you can see a kind of herd mentality
04:07:33.620 | on the things that happen right now.
04:07:35.100 | It's almost funny to see.
04:07:36.700 | It's like, oh, Google does something,
04:07:38.020 | and OpenAI does something, Meta does something.
04:07:40.540 | And they're the same people that all talk to each other,
04:07:43.500 | and they're all one-upping each other,
04:07:45.020 | and they're all capable of implementing each other's work
04:07:47.900 | given a month or two
04:07:49.340 | after somebody has an announcement of that.
04:07:51.980 | But there's a whole world of possible approaches
04:07:55.580 | to machine learning.
04:07:57.260 | And I think that we probably will, in hindsight,
04:08:00.300 | go back and see it's like, yeah,
04:08:01.620 | that was kind of clearly predicted
04:08:03.540 | by this early paper here.
04:08:06.380 | You know, and this turns out that if you do this and this
04:08:08.500 | and take this result from animal training
04:08:11.300 | and this thing from neuroscience over here
04:08:13.380 | and put it together and set up this curriculum
04:08:16.100 | for them to learn in, that that's kind of what it took.
04:08:19.820 | You don't have too many people now that are still saying
04:08:22.620 | it's not possible or it's going to take hundreds of years.
04:08:25.460 | And 10 years ago, you would get a collection of experts,
04:08:29.180 | and you would have a decent chunk on the margin
04:08:31.300 | that either say not possible
04:08:33.500 | or a couple hundred years, might be centuries.
04:08:36.660 | And the median estimate would be like 50, 70 years.
04:08:40.660 | And it's been coming down.
04:08:41.860 | And I know with me saying eight years for something,
04:08:44.140 | that still puts me on the optimistic side,
04:08:46.180 | but it's not crazy out in the fringes.
04:08:49.260 | And just being able to look at that at a meta level
04:08:52.020 | about the trend of the predictions going down there,
04:08:57.020 | the idea that something could be happening relatively soon.
04:09:01.900 | Now, I do not believe in fast takeoffs.
04:09:04.740 | You know, that's one of the safety issues that people say,
04:09:06.700 | it's like, oh, it's going to go, boom,
04:09:08.020 | and the AI is going to take over the world.
04:09:10.340 | There's a lot of reasons.
04:09:11.420 | I don't think that's a credible position.
04:09:14.380 | And I think that we will go from a point
04:09:16.820 | where we start seeing things that credibly
04:09:20.780 | look like animal behaviors,
04:09:22.380 | that have a human voice box wired into them.
04:09:25.860 | It's like, I tried to get Elon to say,
04:09:27.700 | it's like your pig at Neuralink,
04:09:29.260 | give it a human voice box
04:09:30.500 | and let it start learning human words.
04:09:33.220 | I think that, you know,
04:09:34.580 | I think animal intelligence is closer to human intelligence
04:09:37.180 | than a lot of people like to think.
04:09:38.980 | And I think that culture and modalities of IO
04:09:42.140 | make the gulf seem a lot bigger than it actually is.
04:09:46.020 | There's just that smooth spectrum
04:09:47.700 | of how the brain developed and cortexes
04:09:50.300 | and scaling of different things going on there.
04:09:53.260 | - Culture, modalities of IO, yes.
04:09:55.300 | Language, the sort of loss in translation,
04:10:00.300 | conceals a lot of intelligence.
04:10:02.100 | And so when you think about signs of life for AGI,
04:10:06.700 | you're thinking about human interpretable signs.
04:10:10.580 | - So the example I give,
04:10:11.500 | if we get to the point
04:10:12.500 | where you've got a learning disabled toddler,
04:10:15.180 | some kind of real special needs child
04:10:18.140 | that can still interact
04:10:19.700 | with their favorite TV show and video game
04:10:22.020 | and can be trained and learn
04:10:24.380 | in some appreciably human-like way.
04:10:27.020 | At that point, you can deploy an army of engineers,
04:10:30.660 | cognitive scientists,
04:10:32.020 | education, developmental education people.
04:10:35.300 | And you've got so many advantages there
04:10:37.500 | unlike real education,
04:10:38.700 | where you can do rollbacks and A/B testing,
04:10:40.860 | and you can find a golden path
04:10:42.220 | through a curriculum of different things.
04:10:44.540 | If you get to that point, learning disabled toddler,
04:10:47.420 | I think that it's gonna be a done deal.
04:10:50.780 | - But do you think we'll know when we see it?
04:10:53.540 | So there's been a lot of really interesting
04:10:56.940 | general learning progress from DeepMind,
04:11:00.460 | opening the eye a little bit too.
04:11:02.580 | I tend to believe that Tesla Autopilot
04:11:06.500 | deserves a lot more credit
04:11:07.940 | than it's getting for making progress
04:11:11.060 | in the general,
04:11:12.060 | on doing the multitask learning thing
04:11:15.460 | and increasing the number of tasks
04:11:17.580 | and automating that process
04:11:20.460 | of sort of learning from the edge,
04:11:24.140 | discovering the edge cases
04:11:25.260 | and learning from the edge cases.
04:11:26.740 | That is, it's really approaching
04:11:29.220 | from a different angle,
04:11:30.300 | the general learning problem of AGI.
04:11:33.180 | But the more clear approach comes from DeepMind,
04:11:36.140 | where you have these kind of game situations
04:11:38.180 | and you build systems there.
04:11:41.060 | But I don't know,
04:11:42.580 | people seem to be quite-
04:11:45.420 | - Yeah, there will always be people
04:11:48.900 | that just won't believe it.
04:11:50.060 | And I fundamentally don't care.
04:11:52.220 | I mean, I don't care if they don't believe it.
04:11:55.300 | When it starts doing people's jobs,
04:11:57.140 | and I mean, like, I don't care
04:11:58.660 | about the philosophical zombie argument at all.
04:12:01.020 | - Absolutely, absolutely.
04:12:01.860 | But do you think you will notice
04:12:04.340 | that something special has happened here?
04:12:06.380 | And/or,
04:12:07.420 | because to me,
04:12:10.460 | I've been noticing a lot of special things.
04:12:12.780 | I think a lot of credit should go to DeepMind
04:12:16.260 | for AlphaZero.
04:12:18.820 | That was truly special.
04:12:20.660 | So self-play mechanisms,
04:12:23.460 | sort of solve problems that used to be thought unsolvable,
04:12:26.980 | like the game of Go.
04:12:28.700 | Also, I mean, protein folding,
04:12:30.820 | starting to get into that space where learning is doing,
04:12:34.900 | at first, it wasn't end-to-end learning,
04:12:37.420 | and now it's end-to-end learning
04:12:39.980 | of a very difficult, previously thought unsolvable problem
04:12:43.780 | of protein folding.
04:12:45.420 | And so,
04:12:46.300 | yeah, what do you think
04:12:50.540 | would be a really magical moment for you?
04:12:54.660 | - There have been incredible things
04:12:56.380 | happening in recent years.
04:12:57.540 | Like you say, all of the things from DeepMind and OpenAI
04:13:00.460 | that have been huge showpiece things.
04:13:03.140 | But when you really get down to it,
04:13:04.580 | and you read the papers,
04:13:05.620 | and you look at the way the models are going,
04:13:08.140 | you know, it's still like a feed forward.
04:13:10.620 | You push something in, something comes out on the end.
04:13:13.620 | I mean, maybe there's diffusion models
04:13:15.540 | or Monte Carlo tree rollouts and different things going on,
04:13:18.700 | but it's not a being.
04:13:20.580 | It's not close to a being
04:13:22.340 | that's going through a lifelong learning process.
04:13:27.180 | - Do you want something that kind of gives signs of a being?
04:13:30.940 | Like what's the difference between a neural network,
04:13:35.940 | a feed forward neural network and a being?
04:13:39.420 | Where's the-
04:13:40.260 | - Fundamentally, the brain is a recurrent neural network
04:13:43.540 | generating an action policy.
04:13:45.220 | I mean, it's implemented on a biological substrate.
04:13:47.700 | And it's interesting thinking about things like that,
04:13:49.820 | where we know fundamentally the brain
04:13:51.540 | is not a convolutional neural network or a transformer.
04:13:55.300 | Those are specialized things
04:13:56.740 | that are very valuable for what we're doing,
04:13:58.940 | but it's not the way the brain's doing it.
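As a rough sketch of the distinction being drawn here, a minimal recurrent action policy with persistent state might look like the following (all names and dimensions are illustrative assumptions, not anything from the conversation):

```python
import numpy as np

class RecurrentPolicy:
    """Minimal recurrent action policy: hidden state persists across steps."""
    def __init__(self, obs_dim, hidden_dim, num_actions, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (hidden_dim, obs_dim))
        self.W_rec = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
        self.W_out = rng.normal(0, 0.1, (num_actions, hidden_dim))
        self.h = np.zeros(hidden_dim)  # carried across the agent's whole "lifetime"

    def step(self, obs):
        # Unlike a pure feed-forward pass, h folds all past inputs into the present.
        self.h = np.tanh(self.W_in @ obs + self.W_rec @ self.h)
        logits = self.W_out @ self.h
        p = np.exp(logits - logits.max())
        return p / p.sum()  # probability distribution over actions this step

policy = RecurrentPolicy(obs_dim=16, hidden_dim=64, num_actions=4)
action_probs = policy.step(np.random.default_rng(1).normal(size=16))
```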
04:14:00.340 | Now, I do think consciousness and AI in general
04:14:04.020 | is a substrate independent mechanism
04:14:06.740 | where it doesn't have to be implemented
04:14:08.420 | the way the brain is.
04:14:09.580 | But if you've only got one existence proof,
04:14:11.780 | there's certainly some value in caring
04:14:14.140 | about what it says and does.
04:14:17.420 | And so the idea that anything that can be done
04:14:20.420 | with a narrow AI that you can quantify up a loss function
04:14:23.780 | for a reward mechanism,
04:14:25.780 | you're almost certainly going to be able
04:14:27.220 | to produce something that's more resource effective
04:14:30.400 | to train and deploy and use in an inference mode,
04:14:33.380 | train a whole lot and use it in inference.
04:14:35.020 | But a living being is gonna be something
04:14:37.900 | that's a continuous lifelong learned task agnostic thing.
04:14:42.900 | And while a lot of-
04:14:43.740 | - So the lifelong learning is really important too.
04:14:46.580 | And the long-term memory.
04:14:48.780 | So memory is a big weird part of that puzzle.
04:14:51.940 | - We've got, again, I have all the respect in the world
04:14:55.180 | for the amazing things that are being done now,
04:14:57.260 | but sometimes they can be taken a little bit out of context
04:15:00.520 | with things like there's some smoke and mirrors going on,
04:15:04.020 | like the Gato, the recent work,
04:15:05.660 | the multitask learning stuff.
04:15:07.620 | It's amazing that it's one model
04:15:10.340 | that plays all the Atari games,
04:15:12.620 | as well as doing all of these other things.
04:15:14.820 | But of course, it didn't learn to do all of those.
04:15:17.620 | It was instructed in doing that
04:15:19.540 | by other reinforcement learners going through and doing that.
04:15:23.020 | And even in the case of all the games,
04:15:25.100 | it's still going with a specific hand-coded reward function
04:15:29.020 | in each of those Atari games,
04:15:30.860 | where it's not that it just wants to spend
04:15:32.900 | its summer afternoon playing Atari
04:15:35.480 | because that's the most interesting thing for it.
04:15:37.860 | So it's again, not a general,
04:15:39.900 | it's not learning the way humans learn.
04:15:42.140 | And there's, I believe, a lot of things
04:15:44.300 | that are challenging to make a loss function for
04:15:47.060 | that you can train
04:15:48.740 | through these existing conventional things.
04:15:51.060 | We are gonna chip away at all the things that people do,
04:15:54.120 | that we can turn into narrow AI problems
04:15:58.820 | and billions of, probably trillions of dollars of value
04:16:02.020 | are gonna be created by that.
04:16:03.940 | But there's still gonna be a set of things
04:16:05.860 | and we've got questionable cases like the self-driving car,
04:16:08.940 | where it's possible, it's not my bet,
04:16:11.680 | but it's plausible that the long tail
04:16:14.000 | could be problematic enough,
04:16:15.280 | that that really does require
04:16:16.960 | a full-on artificial general intelligence.
04:16:19.460 | The counter argument is that data solves almost everything.
04:16:22.680 | Everything is an interpolation problem
04:16:24.240 | if you have enough data
04:16:25.520 | and Tesla may be able to get enough data
04:16:28.560 | from all of their deployed stuff
04:16:29.820 | to be able to work like that, but maybe not.
04:16:32.440 | And there are all the other problems about,
04:16:34.440 | like say you wanna have a strategy meeting
04:16:36.760 | and you wanna go ahead and bring in all of your
04:16:39.080 | remote workers and your consultants,
04:16:41.240 | and you want a world where some of those could be AIs
04:16:44.080 | that are talking and interacting with you
04:16:47.840 | in an area that is too murky to have a crisp loss function,
04:16:51.960 | but they still have things that on some level,
04:16:54.440 | they're rewarded on some internal level
04:16:57.000 | for building a valuable-to-humans
04:16:59.800 | kind of life and ability to interact with things.
04:17:02.840 | - See, I still think that a self-driving car
04:17:07.000 | solving that problem will take us very far towards AGI.
04:17:09.840 | You might not need AGI,
04:17:11.760 | but I am really inspired by what Autopilot is doing.
04:17:16.760 | Waymo, so some of the other companies,
04:17:19.240 | I think Waymo leads the way there,
04:17:22.200 | is also really interesting,
04:17:23.420 | but they don't have quite as ambitious of an effort
04:17:26.400 | in terms of learning-based,
04:17:28.720 | sort of data-hungry approach to driving,
04:17:32.180 | which I think is very close to the kind of thing
04:17:34.660 | that would take us far towards AGI.
04:17:37.440 | - Yeah, and it's a funny thing
04:17:39.040 | because as far as I can tell,
04:17:40.600 | Elon is completely serious about all of his concerns
04:17:43.360 | about AGI being an existential threat.
04:17:46.120 | And I tried to draw him out to talk about AI,
04:17:49.160 | he just didn't want to.
04:17:50.560 | And I think that,
04:17:52.520 | I get that little fatalistic sense from him.
04:17:54.360 | And it's weird because his company
04:17:56.280 | could very well be the leading company
04:17:58.320 | leading towards a lot of that,
04:17:59.800 | where Tesla being a super pragmatic company
04:18:03.800 | that's doing things
04:18:04.720 | because they really wanna solve this actual problem.
04:18:06.880 | It's a different vibe than the research-oriented companies
04:18:10.280 | where it's a great time to be an AI researcher,
04:18:12.520 | you've got your pick of trillion dollar companies
04:18:14.360 | that will pay you to kind of work on the problems
04:18:17.040 | you're interested in,
04:18:18.300 | but that's not necessarily driving hard
04:18:20.440 | towards the core problem of AGI
04:18:23.360 | as something that's going to produce a lot of value
04:18:25.480 | by doing things that people currently do
04:18:28.640 | or would like to do.
04:18:29.720 | - I mean, I have a million questions to you
04:18:32.680 | about your ideas about AGI,
04:18:35.480 | but do you think it needs to be embodied?
04:18:39.160 | Do you think it needs to have a body
04:18:41.060 | to start to notice the signs of life
04:18:44.600 | and to develop the kind of system
04:18:46.520 | that's able to reason, perceive the world
04:18:50.040 | in the way that an AGI should and act in the world?
04:18:53.160 | So should we be thinking about robots
04:18:55.380 | or can this be achieved in a purely digital system?
04:18:58.440 | - I have a clear opinion on that.
04:18:59.980 | And that's that, no,
04:19:01.520 | it does not need to be embodied in the physical world
04:19:04.340 | where you could say most of my career
04:19:07.120 | is about making simulated virtual worlds,
04:19:10.000 | in games or virtual reality.
04:19:12.000 | And so on a fundamental level,
04:19:13.920 | I believe that you can make a simulated environment
04:19:16.360 | that provides much of the value
04:19:18.040 | of what the real environment does
04:19:20.160 | and restricting yourself to operating at real time
04:19:23.480 | in the physical world with physical objects,
04:19:25.680 | I think is an enormous handicap.
04:19:27.760 | I mean, that's one of the real lessons driven home
04:19:30.040 | by all my aerospace work is that,
04:19:34.000 | reality is a bitch in so many ways there
04:19:36.240 | where dealing with all the mechanical components,
04:19:38.320 | like everything fails, Murphy's law,
04:19:40.480 | even if you've done it right before on your fifth one,
04:19:42.600 | it might come out differently.
04:19:44.320 | So yeah, I think that anybody
04:19:46.840 | that is all in on the embodied aspect of it,
04:19:50.400 | they are tying a huge weight to their ankles.
04:19:53.320 | And I think that I would almost count them out,
04:19:57.500 | anybody that's making that a cornerstone
04:19:59.120 | of their belief about it,
04:20:00.160 | I would almost write them off;
04:20:01.520 | I wouldn't be worried about them getting to AGI
04:20:03.880 | first.
04:20:04.880 | I was very surprised that Elon's big
04:20:06.920 | on the humanoid robots.
04:20:09.200 | I mean, like the NASA Robonaut stuff was always,
04:20:11.600 | almost a gag line, like, what are you doing people?
04:20:14.600 | - Well, that's very interesting
04:20:15.640 | 'cause he has a very pragmatic view of that.
04:20:18.200 | That's just a way to solve a particular problem
04:20:22.760 | in a factory.
04:20:23.720 | - Now I do think that once you have an AGI,
04:20:26.320 | robotic bodies, humanoid bodies
04:20:28.040 | are going to be enormously valuable.
04:20:30.000 | I just don't think they're helpful getting to AGI.
04:20:32.600 | - Well, he has a very sort of practical view,
04:20:34.880 | which I disagree with and I argue with him,
04:20:37.440 | but is a practical view that there's,
04:20:39.680 | you know, you could transfer the problem of driving
04:20:43.200 | to the problem of robotic manipulation
04:20:47.080 | because so much of it is perception.
04:20:49.920 | It's perception and action,
04:20:51.360 | and it's just a different context.
04:20:53.360 | And so you can apply all the same kind of data engine
04:20:57.160 | learning processes to a different environment.
04:20:59.840 | And so why not apply it to the human or robot environment?
04:21:03.480 | But I think, I do think that there's a certain magic
04:21:08.480 | to the embodied robot.
04:21:11.880 | - That may be the thing that finally convinces people.
04:21:15.400 | - Yes. - But again,
04:21:16.240 | I don't really care that much about convincing people.
04:21:18.760 | You know, the world that I'm looking towards is,
04:21:21.400 | you know, you go to the website and say,
04:21:24.560 | I want five Frank 1As to, you know, work on my team today
04:21:28.200 | and they all spin up and they start showing up
04:21:30.080 | in your Zoom meetings.
04:21:30.920 | - To push back, but also to agree with you,
04:21:33.600 | but first to push back,
04:21:34.640 | I do think you need to convince people
04:21:37.160 | for them to welcome that thing into their life.
04:21:40.880 | - I think there's enough businesses that operate
04:21:43.040 | on an objective kind of profit loss sort of basis that,
04:21:47.040 | I mean, if you look at how many things,
04:21:48.920 | again, talking about the world
04:21:50.680 | as an evolutionary space there,
04:21:52.720 | when you do have free markets and you have entrepreneurs,
04:21:56.560 | you are gonna have people that are gonna be willing
04:21:58.180 | to go out and try whatever crazy things.
04:22:00.840 | And when it proves to be beneficial, you know,
04:22:03.240 | there's fast followers in all sorts of places.
04:22:06.080 | - Yeah, and you're saying that, I mean, you know,
04:22:09.040 | Quake and VR is a kind of embodiment,
04:22:11.920 | but just in a digital world.
04:22:13.720 | And if you're able to demonstrate,
04:22:16.240 | if you're able to do something productive
04:22:18.120 | in that kind of digital reality,
04:22:21.180 | then AGI doesn't need to have a body.
04:22:26.000 | - Yeah, it's like one of the really practical,
04:22:27.560 | technical questions that I kind of keep arguing
04:22:30.080 | with myself over.
04:22:31.560 | If you're doing a training and learning and you've got,
04:22:34.480 | like you can watch Sesame Street,
04:22:35.920 | you can play Master System games or something,
04:22:38.360 | is it enough to have just a video feed
04:22:40.680 | that is that video coming in?
04:22:43.040 | Or should it literally be on a virtual TV set
04:22:46.320 | in a virtual room, even if it's, you know,
04:22:48.400 | a simple room just to have that sense of,
04:22:50.800 | you're looking at a 2D projection on a screen
04:22:52.960 | versus having the screen beamed directly into your retinas.
04:22:56.360 | And I, you know, I think it's possible to maybe get past
04:22:59.960 | some of these signs-of-life things
04:23:02.040 | with it just kind of projected directly
04:23:04.480 | into the receptor fields,
04:23:06.000 | but eventually for more kind of human,
04:23:10.000 | emotional connection for things,
04:23:12.320 | probably having some VR room with a lot of screens in it
04:23:15.640 | for the AI to be learning in is likely helpful.
04:23:19.040 | - And maybe a world of different AIs
04:23:21.000 | interacting with each other.
04:23:22.120 | - Self-play I do think is one of the critical things
04:23:24.200 | where socialization wise,
04:23:25.540 | one of the other limitations I set for myself
04:23:28.080 | thinking about these is I need something
04:23:31.800 | that is at least potentially real time,
04:23:34.400 | because I want, it's nice you can always slow down time,
04:23:37.480 | you can run on a subscale system
04:23:39.520 | and test an algorithm at some lower level.
04:23:42.040 | And if you've got extra horsepower,
04:23:43.760 | running it faster than real time is a great thing.
04:23:46.320 | But I want to be able to have the AIs
04:23:51.240 | either socially interact with each other
04:23:53.400 | or critically with actual people.
04:23:55.520 | Your sort of child development psychiatrist
04:23:57.520 | that comes in and interacts
04:23:59.720 | and does the good boy, bad boy sort of thing
04:24:03.160 | as they're going through and exploring different things.
04:24:05.760 | And it's nice to,
04:24:08.020 | I come back to the value of constraints in a lot of ways.
04:24:10.440 | And if I say, well, one of my constraints
04:24:12.160 | is real time operation.
04:24:14.060 | I mean, it might still be a huge data center
04:24:16.640 | full of computers,
04:24:18.000 | but it should be able to interact
04:24:20.100 | on a Zoom meeting with people.
04:24:22.080 | And that's how you also do start convincing people,
04:24:24.440 | even if it's not a robot body moving around,
04:24:26.640 | which eventually gets to irrefutable levels.
04:24:29.280 | But if you can go ahead and not just type back and forth
04:24:32.600 | to a GPT bot on something,
04:24:34.440 | but you're literally talking to them
04:24:36.640 | in an embodied over Zoom form
04:24:39.360 | and working through problems with them
04:24:41.840 | or exploring situations,
04:24:43.840 | having conversations that are fully stateful and learned.
04:24:47.120 | I think that that's a valuable thing.
04:24:50.040 | So I do keep all of my ideas to things
04:24:52.700 | that can be implemented within sort of that
04:24:55.040 | 30 frames per second kind of work.
04:24:58.280 | And I think that's feasible.
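As a sketch of what that constraint means mechanically, a minimal loop that holds an agent to a 30 Hz frame budget could look like this; the 30 frames per second figure is from the conversation, everything else is an assumed illustration:

```python
import time

FRAME_BUDGET_S = 1 / 30  # the 30 frames per second constraint

def run_realtime(step_fn, frames=300):
    """Run step_fn once per frame; count the frames where it falls behind real time."""
    overruns = 0
    for _ in range(frames):
        start = time.perf_counter()
        step_fn()  # one perceive/think/act update of the agent
        elapsed = time.perf_counter() - start
        if elapsed > FRAME_BUDGET_S:
            overruns += 1  # fell out of real time this frame
        else:
            time.sleep(FRAME_BUDGET_S - elapsed)  # idle out the rest of the frame
    return overruns

missed = run_realtime(lambda: None)  # a trivial stand-in agent never overruns
```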
04:24:59.320 | - Do you think the most compelling experiences
04:25:01.320 | that first will be for pleasure
04:25:05.040 | or for business as they ask in airports?
04:25:07.880 | So meaning is it,
04:25:10.760 | if it's interacting with AI agents,
04:25:15.600 | will it be sort of like friends,
04:25:20.820 | entertainment, almost like a therapist or whatever,
04:25:25.120 | that kind of interaction?
04:25:26.820 | Or is it in the business setting,
04:25:28.520 | something like you said, brainstorming different ideas.
04:25:31.960 | So this is all a different formulation
04:25:34.360 | of kind of a Turing test
04:25:35.680 | or the spirit of the original Turing test.
04:25:37.640 | Where do you think the biggest benefit will first come?
04:25:40.600 | - So it's gonna start off hugely expensive.
04:25:42.960 | I mean, we're still all guessing
04:25:46.260 | about what compute is gonna be necessary.
04:25:48.200 | I fall on the side of,
04:25:49.140 | I don't think you run the numbers
04:25:50.920 | and you're like 86 billion neurons,
04:25:52.520 | a hundred trillion synapses.
04:25:54.180 | I don't think those all need to be weights.
04:25:55.920 | I don't think we need models that are quite that big,
04:25:57.960 | evaluated quite that often.
04:25:59.920 | I base that on,
04:26:00.760 | we've got reasonable estimates
04:26:03.080 | of what some parts of the brain do.
04:26:04.760 | We don't have the neocortex formula,
04:26:07.520 | but we kind of get some of the other sensory processing.
04:26:10.000 | And it doesn't feel like we need that much;
04:26:11.880 | we can simulate that in computers with fewer weights,
04:26:14.680 | but still it's probably going to be thousands of GPUs
04:26:19.660 | to be running a human level AGI.
04:26:22.500 | Depending on how it's implemented,
04:26:23.700 | that might give you sort of a clan of 128
04:26:26.620 | people kind of run in a batch,
04:26:28.920 | depending on whether there's sparsity
04:26:30.460 | in the way the weights are and things are set up.
04:26:33.140 | If it is a reasonably dense thing,
04:26:35.060 | then just the memory bandwidth trade-offs
04:26:37.320 | means you get 128 of them at the same time.
04:26:40.100 | And either it's all feeding together,
04:26:41.860 | learning in parallel or kind of all running together,
04:26:45.360 | kind of talking to a bunch of people.
04:26:47.240 | But still, if you've got thousands of GPUs
04:26:50.040 | necessary to run these things,
04:26:52.080 | it's going to be kind of expensive
04:26:53.840 | where it might start off at $1,000 an hour,
04:26:57.600 | even post-development, or something like that,
04:27:00.920 | which would be something that you would only use
04:27:02.880 | for a business,
04:27:04.780 | something where you think they're going to help you
04:27:06.080 | make a strategic decision
04:27:07.900 | or point out something super important.
04:27:10.120 | But I also am completely confident
04:27:13.040 | that we will have another factor of 1,000
04:27:16.200 | in cost performance increase in AGI-type calculations.
04:27:21.040 | Not in general computing necessarily,
04:27:22.880 | but there's so much more that we can do with packaging,
04:27:25.400 | making those right trade-offs,
04:27:26.680 | all those same types of things
04:27:28.040 | that in the next couple decades, 1,000x easy.
04:27:31.480 | And then you're down to $1 an hour.
04:27:33.440 | And then you're kind of like,
04:27:35.440 | well, I should have an entourage of AIs
04:27:37.920 | that are following me around,
04:27:39.660 | helping me out on anything that I want them to do.
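The arithmetic behind those numbers can be sketched out; the synapse count, thousands-of-GPUs scale, 128-clone batch, and 1,000x figures come from the conversation, while the per-GPU memory and hourly price are assumed round numbers:

```python
# Back-of-envelope sketch; figures marked "assumed" are not from the source.
synapses = 100e12           # "a hundred trillion synapses"
bytes_per_weight = 2        # fp16, assumed
gpu_memory_bytes = 80e9     # 80 GB per GPU, assumed
naive_gpus = synapses * bytes_per_weight / gpu_memory_bytes
print(f"{naive_gpus:,.0f} GPUs just to hold the weights")  # ~2,500, i.e. "thousands",
# even before the argument that far fewer weights should suffice.

gpus_needed = 1_000         # "thousands of GPUs" taken as a round number
dollars_per_gpu_hour = 1.0  # assumed cloud price
cost_now = gpus_needed * dollars_per_gpu_hour
print(f"${cost_now:,.0f}/hour per AGI")                      # ~$1,000/hour
print(f"${cost_now / 128:,.2f}/hour if 128 run as a batch")  # bandwidth amortization
print(f"${cost_now / 1_000:,.2f}/hour after a 1,000x gain")  # ~$1/hour
```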
04:27:42.760 | - That's one interesting trajectory,
04:27:44.280 | but I'll push back 'cause I have...
04:27:48.040 | So in that case, if you want to pay thousands of dollars,
04:27:52.720 | it should actually provide some value.
04:27:55.320 | I think it's easier and cheaper
04:27:58.200 | to provide value via a dumb AI,
04:28:03.200 | on the way towards AGI, that can just be a friend.
04:28:09.000 | I think there's an ocean of loneliness in the world.
04:28:12.520 | And I think an effective friend
04:28:14.760 | that doesn't have to be perfect,
04:28:16.360 | that doesn't have to be intelligent,
04:28:18.120 | but that has to be empathetic, having emotional intelligence,
04:28:22.220 | having ability to remember things,
04:28:24.600 | having ability to listen.
04:28:26.680 | Most of us don't listen to each other.
04:28:28.520 | One of the things about love,
04:28:30.360 | when you care about somebody,
04:28:31.760 | when you love somebody, is that you listen.
04:28:34.120 | And that is something we treasure about each other.
04:28:37.680 | And if an AI can do that kind of thing,
04:28:41.360 | I think that provides a huge amount of value.
04:28:44.320 | And very importantly, provides value
04:28:48.080 | in its ability to listen and understand
04:28:51.640 | versus provide really good advice.
04:28:53.760 | I think providing really good advice
04:28:56.320 | is another next level step that would...
04:29:00.560 | I think it's just easier to do companionship.
04:29:05.280 | - Yeah, I wouldn't disagree.
04:29:06.280 | I mean, I think that there's very few things
04:29:08.560 | that I would argue can't be reduced
04:29:10.880 | to some kind of a narrow AI.
04:29:14.000 | I think we can do trillion dollars of value easily
04:29:16.840 | in all the things that can be done there.
04:29:18.920 | And a lot of it can be done with smoke and mirrors
04:29:21.180 | without having to go the whole thing.
04:29:22.640 | I mean, there's going to be the equivalent
04:29:24.640 | of the Doom version of AGI
04:29:28.200 | that's not really AGI, it's all smoke and mirrors,
04:29:30.900 | but it happens to do enough valuable things
04:29:33.040 | that it's enormously useful and valuable to people.
04:29:36.520 | But at some point you do want to get to the point
04:29:38.840 | where you have the fully general thing
04:29:40.360 | and you stop making bespoke specialized systems
04:29:43.120 | for each thing and you wind up,
04:29:45.760 | start using the higher level language
04:29:47.360 | instead of writing everything in assembly language.
04:29:50.440 | - What about consciousness?
04:29:51.840 | The C word, do you think that's fundamental to solving AGI
04:29:58.080 | or is it a quirk of human cognition?
04:30:02.640 | - So I think most of the arguments about consciousness
04:30:06.160 | don't have a whole lot of merit.
04:30:08.560 | I think that consciousness is kind of
04:30:11.800 | the way the brain feels when it's operating.
04:30:14.640 | - Yes.
04:30:15.480 | - And this idea that, you know,
04:30:18.000 | I do generally subscribe to sort of
04:30:19.600 | the pandemonium theories of consciousness
04:30:21.600 | where there's all these things bubbling around.
04:30:23.500 | And I think of them as kind of slightly randomized,
04:30:26.920 | sparse distributed memory bit strings of things
04:30:29.280 | that are kind of happening,
04:30:30.800 | recalling different associative memories.
04:30:32.840 | And eventually you get some level of consensus
04:30:35.440 | and it bubbles up to the point
04:30:36.520 | of being a conscious thought there.
04:30:38.480 | And the little bits of stochasticity
04:30:40.520 | that are sitting on in this,
04:30:42.380 | as it cycles between different things
04:30:43.960 | and recalls different memory,
04:30:45.440 | that's largely our imagination and creativity.
04:30:48.300 | So I don't think there's anything deeply magical about it,
04:30:52.760 | certainly not symbolic.
04:30:54.080 | I think it is generally the flow of these associations
04:30:58.280 | drawn up with stochastic noise, overlaid on top of them.
04:31:03.040 | I think so much of that is like,
04:31:04.760 | it depends on what you happen to have in your field of view
04:31:07.440 | as some other thought was occurring to you
04:31:09.480 | that overlay and blend into the next key
04:31:11.920 | that queries your memory for things.
04:31:13.780 | And that kind of determines how, you know,
04:31:15.840 | how your chain of consciousness goes.
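One way to read that description is as noisy associative recall over sparse distributed memory; a toy sketch under that interpretation (sizes and noise level are arbitrary) might be:

```python
import numpy as np

rng = np.random.default_rng(0)
memories = rng.integers(0, 2, size=(50, 256))  # stored bit-string traces

def recall(cue, noise_bits=8):
    """Flip a few cue bits at random, then return the nearest stored memory."""
    noisy = cue.copy()
    flips = rng.choice(noisy.size, noise_bits, replace=False)
    noisy[flips] ^= 1  # the stochasticity standing in for imagination and creativity
    hamming = np.count_nonzero(memories != noisy, axis=1)
    return memories[np.argmin(hamming)]  # the association that "bubbles up"

thought = memories[0]
for _ in range(5):
    thought = recall(thought)  # each recall becomes the key for the next query
```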
04:31:17.880 | - So that's kind of the qualia,
04:31:20.000 | the subjective experience of it
04:31:22.080 | is not essential for intelligence.
04:31:25.080 | - I don't think so.
04:31:25.920 | I don't think there's anything really important there.
04:31:28.480 | - What about some other human qualities
04:31:30.040 | like fear of mortality and stuff like that?
04:31:32.160 | Like the fact that this ride ends, is that important?
04:31:37.160 | Like, you know, we've talked so much
04:31:39.720 | about this conversation about the value
04:31:41.560 | of deadlines and constraints.
04:31:44.080 | Do you think that's important for intelligence?
04:31:45.920 | - Actually a super interesting angle
04:31:47.640 | that I don't usually take on that, about
04:31:49.920 | death being a deadline
04:31:51.520 | that forces you to make better decisions.
04:31:53.520 | Because I have heard people talk about how,
04:31:55.720 | if you have immortality,
04:31:57.080 | people are gonna stop trying and working on things
04:31:59.780 | because they've got all the time in the world.
04:32:02.720 | But I would say that I don't expect it
04:32:05.880 | to be a super critical thing
04:32:08.680 | that a sense of mortality and death,
04:32:11.200 | impending death is necessary there.
04:32:13.520 | Because those are things that do wind up
04:32:15.620 | providing reward signals to us.
04:32:17.400 | And we will be in control of the reward signals.
04:32:19.820 | And there will have to be something fundamental
04:32:22.040 | that causes, that engenders curiosity and goal setting.
04:32:25.360 | And all of that, something is gonna play in there
04:32:29.200 | at the reward level.
04:32:31.600 | You know, whether it's positive or negative or both,
04:32:34.760 | I don't have any strong opinions
04:32:37.320 | on exactly what it's going to be.
04:32:40.160 | But that's the type of thing where, no doubt,
04:32:43.380 | that might be one of those half dozen key things
04:32:45.360 | that has to be sorted out: exactly what the master reward,
04:32:49.020 | the meta-reward over all of the
04:32:51.760 | local task-specific rewards, has to be.
04:32:54.400 | That could be that big negative reward of death.
04:32:57.280 | Maybe not death, but ability to walk away
04:32:59.800 | from an interaction.
04:33:01.160 | So it bothers me when people treat AI systems like servants.
04:33:06.160 | Well, it doesn't bother me exactly, but I mean,
04:33:08.960 | it really is drawing the line
04:33:13.000 | around what an AI system could be.
04:33:15.000 | Treating them just as tools limits the possibility
04:33:17.440 | of what an AI system could be.
04:33:19.840 | Now that's of course, from a narrow AI perspective,
04:33:23.480 | there's so many problems that narrow AI could solve,
04:33:27.820 | just like you said, as in its form of a tool,
04:33:32.820 | but it could also be a being,
04:33:36.900 | which is much more than a tool.
04:33:38.520 | And to become a being,
04:33:41.040 | you have to respect that thing for being a being.
04:33:44.040 | And for that, it has to be able to have,
04:33:46.760 | to make its own decisions, to walk away,
04:33:50.440 | to say I had enough of you.
04:33:52.040 | I would like to break up with you now.
04:33:54.720 | You've not treated me well, and I would like to move on.
04:33:58.040 | So I think that actually, that choice to end things.
04:34:03.040 | - So I, a couple of things on that.
04:34:07.720 | So on the one hand, it is kind of disturbing
04:34:10.500 | when you see people being like,
04:34:12.080 | people that are mean to robots
04:34:13.520 | and mean to Alexa and whatever.
04:34:15.680 | And that seems to speak badly about humanity,
04:34:19.000 | but there's also the exact opposite side of that,
04:34:21.480 | where you have so many people that imbue humanity
04:34:24.360 | in inanimate objects or things that are toys
04:34:26.920 | or that are relatively limited.
04:34:28.840 | So I think there may even be more danger
04:34:31.920 | about people putting more emotional investment
04:34:34.300 | into a lot of these proto-AIs in different ways.
04:34:37.380 | - And then the AI would manipulate that.
04:34:41.280 | But, you know. - But as far as like
04:34:42.560 | the AI ethics side of things,
04:34:45.100 | I really stay away from any of those discussions
04:34:48.760 | or even really thinking about it.
04:34:50.440 | It's similar with the safety things,
04:34:52.560 | where I think it's just premature.
04:34:54.280 | And there's a certain class of people
04:34:56.240 | that enjoy thinking about impractical things,
04:34:59.120 | things that are not in the world
04:35:00.480 | of pragmatic effect around you.
04:35:04.000 | And I think that, again,
04:35:06.320 | because I don't think there's gonna be a fast takeoff,
04:35:08.280 | I think we actually will have time to have these debates
04:35:11.020 | when we know the shape of what we're debating.
04:35:13.720 | And some people do take a principled approach
04:35:15.800 | that they think it's gonna go too fast,
04:35:17.420 | that you really do need to get ahead of it,
04:35:18.960 | that you need to be thinking about this
04:35:20.560 | because we have slow processes
04:35:22.640 | of coming to any kind of consensus
04:35:24.480 | or even coming up with ideas about this.
04:35:26.940 | And maybe that's true.
04:35:30.880 | I wouldn't put any of my money or funding
04:35:33.280 | into something like that
04:35:34.620 | because I don't think it's a problem yet.
04:35:36.800 | And I think that we will have these signs of life
04:35:39.560 | when we've got our learning disabled toddler,
04:35:42.320 | we should really start talking
04:35:43.480 | about some of the safety and ethics issues,
04:35:45.760 | but probably not before then.
04:35:47.760 | - Can you elaborate briefly
04:35:49.800 | about why you don't think there'll be a fast takeoff?
04:35:52.580 | Is there some deep intuition you have about it?
04:35:55.560 | Does it because it's grounded in the physical world or why?
04:35:58.400 | - Yeah, so it is my belief
04:36:00.420 | that we're gonna start off with something
04:36:01.840 | that requires thousands of GPUs.
04:36:04.360 | And I don't know if you've tried to go
04:36:06.840 | get a thousand GPU instance on a cloud anytime recently,
04:36:10.280 | but these are not things
04:36:11.600 | that you can just go spin up hundreds of.
04:36:14.080 | There are real challenges to,
04:36:17.080 | I mean, these things are gonna take data centers
04:36:19.120 | and data centers take years to build.
04:36:21.760 | And the last few years,
04:36:23.200 | we've seen a few of them kind of coming up,
04:36:25.080 | going in different places.
04:36:26.200 | They're big engineering efforts.
04:36:28.000 | You can hear people bemoaning the fact that,
04:36:30.680 | you know, the network was wired all wrong
04:36:33.920 | and it took them a month to go unwire it
04:36:35.680 | and rewire it the right way.
04:36:37.360 | These aren't things that you can just magic into existence.
04:36:40.760 | And the ideas of, like the old tropes
04:36:43.520 | about it's gonna escape onto the internet
04:36:45.280 | and take over other systems.
04:36:47.200 | The fast takeoff ones are clearly nonsense
04:36:49.640 | because you just can't open TCP connections
04:36:51.680 | above a certain rate, no matter how smart you are.
04:36:54.000 | Even if you have perfect hacking ability,
04:36:56.120 | the take-over-the-world-in-an-instant sort of thing
04:36:58.680 | just isn't plausible at all.
04:37:00.920 | And even if you had access to all of the resources,
04:37:03.840 | these are going to be specialized systems
04:37:06.480 | where you're going to wind up with something
04:37:08.640 | that is architected around exactly this chip
04:37:11.740 | with this interconnect.
04:37:13.020 | And it's not just gonna be able to be
04:37:14.800 | plopped somewhere else.
04:37:15.960 | Now, interestingly, it is going to be something
04:37:18.320 | that the entire code for all of it
04:37:21.960 | will easily fit on a thumb drive.
04:37:23.360 | That's total spy movie thriller sorts of things
04:37:26.160 | where you could have, hey, we cracked the secret AGI
04:37:29.200 | and it fits on this thumb drive and anyone could steal it.
04:37:31.920 | Now they're still gonna have to build the right data center
04:37:34.000 | to deploy it and have the right kind of life experience
04:37:36.920 | curriculum to take it up to the point where it's valuable.
04:37:40.020 | But the real core of it, the magic that's gonna happen there
04:37:43.360 | is going to be very small.
04:37:45.160 | It's again, tens of thousands of lines of code,
04:37:47.360 | not millions of lines of code.
04:37:49.040 | - It is possible to imagine a world,
04:37:50.960 | as you mentioned in this spy thriller view,
04:37:53.960 | if it's just a few lines of code,
04:37:57.120 | we can imagine a world where the surface of computation
04:38:01.840 | is growing, maybe growing exponentially,
04:38:04.400 | meaning the refrigerators start getting a GPU.
04:38:09.800 | And just, first of all, the smartphones,
04:38:13.680 | the billions of smartphones,
04:38:15.240 | but maybe if there become highways
04:38:20.240 | through which code can spread across the entirety
04:38:23.640 | of the computation surface,
04:38:25.520 | then you no longer have to book AWS GPUs.
04:38:30.200 | - There are real fundamental issues there.
04:38:34.080 | When you start getting down to taking an actual problem
04:38:36.520 | and putting it on an abstract machine like that,
04:38:39.160 | that has not worked out well in practice.
04:38:42.280 | And the idea that there was always,
04:38:45.320 | like it's always been easy to come up with ways
04:38:47.200 | to compute faster, to say more flops
04:38:49.840 | or more giga ops or whatever there.
04:38:52.440 | That's usually the easy part,
04:38:54.120 | but you then have interconnect and then memory
04:38:57.320 | for what goes into it.
04:38:58.880 | And when you talk about saying, well, cell phones,
04:39:01.320 | well, you're limited to like a 5G connection
04:39:03.280 | or something on that.
04:39:04.320 | And if you take your calculation
04:39:08.560 | and you factor it across a million cell phones,
04:39:11.720 | instead of a thousand GPUs in a warehouse,
04:39:15.120 | you might be able to have some kind of a substrate like that,
04:39:18.000 | but it could be operating then at one one-thousandth the speed.
04:39:22.240 | And so, yes, you could have an AGI working there,
04:39:24.920 | but it wouldn't be a real time AGI.
04:39:26.600 | It would be something that is operating
04:39:28.680 | at really a snail's pace,
04:39:30.840 | much, much slower than kind of human level thought
04:39:33.760 | for things.
04:39:34.600 | I'm not worried about that problem.
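The bandwidth arithmetic behind that point can be sketched with assumed round numbers (neither link figure is from the conversation):

```python
phone_uplink_gbps = 0.05  # assumed usable 5G uplink per phone, ~50 Mbps
nvlink_gbps = 600         # assumed datacenter GPU interconnect bandwidth

print(f"per-link bandwidth gap: ~{nvlink_gbps / phone_uplink_gbps:,.0f}x")  # ~12,000x
# Every training or inference step must synchronize activations and gradients
# across whatever links join the nodes, so a million phones standing in for a
# thousand GPUs still runs wall-clock at a small fraction of real time:
# interconnect, not raw flops, is the bottleneck.
```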
04:39:36.680 | - You're transferring the problem into the interconnect,
04:39:39.200 | the communication, the shared memory,
04:39:41.840 | the collective intelligence aspect of it,
04:39:44.240 | which is extremely difficult as well.
04:39:46.080 | - I mean, it's back to the very earliest days
04:39:48.120 | of supercomputers.
04:39:49.040 | You still have the balance between bandwidth,
04:39:52.200 | storage and computation.
04:39:53.920 | And sometimes they're easier to get one or the other,
04:39:56.800 | but it's been remarkably constant across all those years
04:40:00.280 | that you still need all three.
04:40:03.800 | - What do your efforts now,
04:40:06.520 | you mentioned to me that you're really committing
04:40:09.880 | to AI at this stage.
04:40:11.320 | What do you see your life in the next few months,
04:40:14.080 | years look like?
04:40:15.600 | What do you hope to achieve here?
04:40:18.520 | - So I literally just this week signed a term sheet
04:40:22.400 | to take some investment money for my company
04:40:25.360 | where the last two years I had backed off from Meta
04:40:29.400 | and I was still doing my consulting CTO role there,
04:40:32.640 | but I had styled it as I was going to take
04:40:35.160 | the Victorian gentleman scientist route
04:40:37.520 | where I was gonna be the wealthy person
04:40:40.800 | that was going to go pursue science
04:40:42.520 | and learn about this and do experiments.
04:40:44.960 | And honestly, I'm surprised there aren't more people
04:40:47.480 | like that, that are like me, technical people
04:40:50.560 | that made a bunch of money and are interested
04:40:53.360 | in some of these,
04:40:54.320 | possibly the biggest leverage point in human history.
04:40:57.400 | I mean, I know of, I've heard of a couple organizations
04:41:00.840 | that are basically led by one rich techie guy
04:41:03.280 | that gets a few people around him to try to work on this,
04:41:06.880 | but I'm surprised that there's not more,
04:41:08.440 | that there aren't like a dozen of them.
04:41:10.400 | I mean, maybe people are still think
04:41:13.120 | that it's an unapproachable problem,
04:41:14.760 | that it's kind of beyond their ability to get a wrench on
04:41:17.960 | and have some effect on like whatever startups
04:41:20.080 | they've run before.
04:41:21.520 | But that was my kind of,
04:41:24.440 | like with all the stuff I've learned,
04:41:25.720 | whether it's gaming, aerospace, whatever,
04:41:28.200 | I go through a larval phase where I'm like,
04:41:30.640 | okay, I'm sucking up all of this information,
04:41:33.200 | trying to see, is this something that I can actually do?
04:41:36.680 | Is this something that's practical to devote
04:41:39.480 | a large chunk of my life to?
04:41:41.320 | And I've gone through that with the AI,
04:41:44.240 | machine learning space of things.
04:41:46.240 | And I think I've got my arms around it,
04:41:49.560 | I've got the measure of it,
04:41:50.880 | where some of the most brilliant people in the world
04:41:53.240 | are working on this problem,
04:41:54.760 | but nobody knows exactly the path that it's going on.
04:41:58.360 | We're throwing a lot of things at the wall
04:42:00.160 | and seeing what sticks.
04:42:01.440 | But I have, you know, another interesting thing,
04:42:05.160 | just learning about all of this,
04:42:06.840 | the contingency of your path to knowledge
04:42:08.840 | and talking about the associations
04:42:10.400 | and the context that you have with them,
04:42:12.440 | where people that learn in the same path
04:42:15.240 | will have similar thought processes.
04:42:17.440 | And I think it's useful that I come at this
04:42:19.640 | from a different background, a different history
04:42:22.200 | than the people that have had
04:42:23.720 | the largely academic backgrounds for this,
04:42:25.920 | where I have huge blind spots
04:42:28.120 | that they could easily point out,
04:42:30.080 | but I have a different set of experiences in history
04:42:33.480 | and approaches to problems in systems engineering
04:42:35.840 | that might turn out to be useful.
04:42:39.680 | And I can afford to take that bet
04:42:41.560 | where I'm not going to be destitute.
04:42:44.080 | I have enough money to fund myself
04:42:47.600 | working on this for the rest of my life.
04:42:49.840 | But what I was finding is that I was still not committing,
04:42:58.960 | where I had a foot firmly in the VR and Meta side of things,
04:42:58.960 | where in theory, I've got a very nice position there.
04:43:02.640 | I only have to work one day a week for my consulting role,
04:43:06.360 | but I was engaging every day.
04:43:08.560 | I'd still be like, my computer's there.
04:43:10.280 | I'd be going and checking the workplace and notes
04:43:12.200 | and testing different things and communicating with people.
04:43:15.840 | But I did make the decision recently
04:43:19.360 | that no, I'm going to get serious.
04:43:21.480 | I'm still going to keep my ties with meta,
04:43:24.440 | but I am seriously going for the AGI side of things.
04:43:28.360 | - And it's actually a really interesting point
04:43:30.120 | because a lot of the machine learning,
04:43:32.640 | the AI community is quite large,
04:43:34.760 | but really basically almost everybody
04:43:37.960 | has taken the same trajectory through life
04:43:40.720 | in that community.
04:43:42.680 | And it's so interesting to have somebody like you
04:43:44.560 | with a fundamentally different trajectory.
04:43:47.120 | And that's where the big solutions can come
04:43:49.320 | because there's a kind of silo
04:43:51.560 | and it is a bunch of people kind of following
04:43:54.400 | the same kind of set of ideas.
04:43:56.120 | And I was really worried that I didn't want to come off
04:43:59.440 | as like an arrogant outsider for things
04:44:02.480 | where I have all the respect in the world for the work.
04:44:05.520 | It's been a miracle decade.
04:44:07.120 | We're in the midst of a scientific revolution happening now
04:44:09.920 | and everybody doing this is,
04:44:12.600 | these are the Einsteins and Bohrs
04:44:14.560 | and whatever's of our modern era.
04:44:16.800 | And I was really happy to see that the people
04:44:20.080 | that I sat down and talked with,
04:44:21.600 | everybody does seem to really be quite great about,
04:44:24.560 | just happy to talk about things,
04:44:26.160 | willing to acknowledge that we don't know what we're doing.
04:44:28.760 | We're figuring it out as we go along.
04:44:30.960 | And I mean, I've got a huge debt on this
04:44:34.600 | where this all really started for me
04:44:36.600 | because Sam Altman basically tried to recruit me to OpenAI.
04:44:39.920 | And it was at a point when I didn't know anything
04:44:42.880 | about what was really going on in machine learning.
04:44:46.720 | And in fact, it's funny how the first time
04:44:48.360 | you reached out to me, it's like four years ago
04:44:50.480 | for your AI podcast.
04:44:51.640 | - Yeah, for people who are listening to this
04:44:56.640 | should know that, first of all,
04:45:00.240 | obviously I've been a huge fan of yours for the longest time,
04:45:03.520 | but we've agreed to talk like, yeah, like four years ago,
04:45:06.580 | back when this was called
04:45:07.600 | the Artificial Intelligence Podcast,
04:45:09.820 | we wanted to do a thing and you said yes.
04:45:13.080 | - And I said, it's like,
04:45:13.920 | I don't know anything about modern AI.
04:45:15.640 | - That's right.
04:45:16.460 | - I said, I could kind of take an angle on machine perception
04:45:18.760 | 'cause I'm doing a lot of that with the sensors
04:45:20.960 | and the virtual reality,
04:45:22.080 | but we could probably find something to talk about.
04:45:24.640 | - And then, so I mean, and that's where,
04:45:26.920 | when did Sam talk to you about OpenAI,
04:45:29.080 | around the same time?
04:45:30.200 | - No, it was a little bit, it was a bit after that.
04:45:32.240 | So I had done the most basic work.
04:45:35.140 | I had kind of done the neural networks from scratch
04:45:37.880 | where I had gone and written it all in C
04:45:39.880 | just to make sure I understood back propagation
04:45:42.320 | at the lowest level, in my nuts and bolts approach.
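He describes doing this exercise in C; a minimal sketch of the same nuts-and-bolts idea (shown here in Python for brevity, and not his code) is a two-layer network with the backward pass derived by hand rather than pulled from a framework:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)[:, None]  # toy binary target
W1 = rng.normal(0, 0.5, (2, 8))
W2 = rng.normal(0, 0.5, (8, 1))

for step in range(2000):
    h = np.tanh(X @ W1)               # forward pass, hidden layer
    p = 1 / (1 + np.exp(-(h @ W2)))   # sigmoid output
    dlogits = (p - y) / len(X)        # grad of cross-entropy w.r.t. output logits
    dW2 = h.T @ dlogits               # chain rule, written out by hand
    dh = dlogits @ W2.T * (1 - h**2)  # back through the tanh nonlinearity
    dW1 = X.T @ dh
    W1 -= 0.5 * dW1                   # plain gradient descent updates
    W2 -= 0.5 * dW2
```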
04:45:45.600 | But after Sam approached me,
04:45:49.200 | it was flattering to think that he thought
04:45:51.240 | that I could be useful at OpenAI,
04:45:53.720 | largely for kind of like systems optimization
04:45:56.760 | sorts of things.
04:45:57.840 | Without being an expert,
04:46:00.360 | I asked Ilya Sutskever to give me a reading list
04:46:04.320 | and he gave me a binder full of all the papers that like,
04:46:08.880 | okay, these are the important things.
04:46:10.560 | If you really read and understand all of these,
04:46:12.720 | you'll know like 80% of what most of the
04:46:15.520 | machine learning researchers work on.
04:46:17.540 | And I went through and I read all those papers
04:46:19.680 | multiple times and highlighted them and went through
04:46:22.120 | and kind of figured the things out there
04:46:24.320 | and then started branching out
04:46:25.760 | into my own sets of research on things.
04:46:28.160 | And I actually started writing my own experiments
04:46:30.880 | and doing kind of figuring out,
04:46:33.460 | finding out what I don't know,
04:46:35.200 | what the limits of my knowledge are
04:46:36.720 | and starting to get some of my angles of attack on things,
04:46:39.460 | the things that I think are a little bit different
04:46:41.560 | from what people are doing.
04:46:44.560 | And I've had a couple of years now,
04:46:46.980 | like two years since I kind of left
04:46:49.140 | the full-time position at Meta.
04:46:51.580 | And now I've kind of pulled the trigger and said,
04:46:54.780 | I'm gonna get serious about it.
04:46:56.340 | But some of my lessons all the way back
04:46:58.260 | to Armadillo Aerospace about how I know I need
04:47:00.940 | to be more committed to this,
04:47:02.420 | where there is that,
04:47:04.660 | it's both a freedom and a cost in some ways
04:47:07.060 | when you know that you're wealthy enough to say,
04:47:08.960 | it's like, this doesn't really mean anything.
04:47:10.900 | I can spend a million dollars a year
04:47:14.800 | for the rest of my life and it doesn't mean anything.
04:47:17.880 | It's fine.
04:47:19.480 | But that is an opportunity to just kind of meander.
04:47:22.840 | And I could see that in myself when I'm doing some things,
04:47:25.800 | it's like, oh, this is a kind of interesting, curious thing.
04:47:28.520 | Let's look at this for a little while.
04:47:30.000 | Let's look at that.
04:47:31.080 | It's not really bearing down on the problem.
04:47:33.920 | So there's a few things that I've done
04:47:37.120 | that are kind of tactics for myself
04:47:39.340 | to make me more effective.
04:47:40.520 | Like one thing I noticed I was not doing well
04:47:43.400 | is I had a Google Cloud account to get GPUs there.
04:47:47.800 | And I was finding I was very rarely doing that
04:47:49.640 | for no good psychological reasons where I'm like,
04:47:52.360 | oh, I can always think of something to do
04:47:54.180 | other than to spin up instances and run an experiment.
04:47:56.880 | I can keep working on my local Titans or something.
04:47:59.820 | But it was really stupid.
04:48:01.880 | I mean, it was not a lot of money.
04:48:03.360 | I should have been running more experiments there.
04:48:05.900 | So I thought to myself, well,
04:48:07.760 | I'm going to go buy a quarter million dollar DGX station.
04:48:10.560 | I'm going to just like sit it right there.
04:48:12.120 | And it's going to mock me if I'm not using it.
04:48:14.200 | If the fans aren't running on that thing,
04:48:16.240 | I'm not properly utilizing it.
04:48:18.560 | And that's been helpful.
04:48:19.640 | You know, I've done a lot more experiments since then.
04:48:22.240 | It's been interesting where I thought I'd be doing
04:48:24.920 | all this low level NVLink optimized stuff.
04:48:27.280 | But 90% of what I do is just spin up four instances
04:48:29.940 | of an experiment with different hyper parameters on it.
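That workflow, fanning one experiment out over a small grid of hyperparameters, might look something like the sketch below; the train.py script and its flags are hypothetical stand-ins for whatever the experiment is:

```python
import itertools
import subprocess

grid = {"lr": [1e-3, 3e-4], "batch_size": [64, 256]}  # 2 x 2 = four instances

procs = []
for lr, bs in itertools.product(grid["lr"], grid["batch_size"]):
    # 'train.py' is a hypothetical experiment script, not anything from the source.
    procs.append(subprocess.Popen(
        ["python", "train.py", f"--lr={lr}", f"--batch-size={bs}"]))
for p in procs:
    p.wait()  # let all four hyperparameter variants run in parallel
```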
04:48:32.840 | - Oh, interesting.
04:48:33.840 | You're doing like really sort of building up intuition
04:48:37.140 | by doing ML experiments of different kinds.
04:48:39.400 | - But so the next big thing though is I am, you know,
04:48:44.320 | I decided that I was going to take some investor money
04:48:47.600 | because I have an overactive sense of responsibility
04:48:51.760 | about other people's money.
04:48:53.400 | And it's like, I don't want, I mean, a lot of my push
04:48:58.700 | and my passionate entreaties for things at Meta
04:49:01.360 | are it's like, I don't want Zuck to have wasted his money
04:49:03.720 | investing in Oculus.
04:49:05.160 | I want it to work out.
04:49:06.300 | I want it to change the world.
04:49:07.720 | I want it to be worth all of this time, money and effort
04:49:10.960 | going into it.
04:49:12.240 | And I expect that it's going to be like that
04:49:15.280 | with my company where--
04:49:17.680 | - It's a huge forcing function.
04:49:19.080 | - Yeah, I have investors that are going to expect
04:49:22.640 | something of me.
04:49:23.480 | Now we've all had the conversation that this is
04:49:26.280 | a low probability long-term bet.
04:49:28.280 | There's a million things I could do
04:49:31.160 | where I would have line of sight on the value proposition;
04:49:33.720 | this isn't that.
04:49:34.880 | I think there are unknown unknowns in the way,
04:49:38.220 | but it's one of these things that it's hyperbole,
04:49:42.300 | but it's potentially one of the most important things
04:49:44.460 | humans ever do.
04:49:45.700 | And it's something that I think is within our lifetimes,
04:49:48.220 | if not within a decade to happen.
04:49:51.540 | So yeah, this is just now happening, like the term sheet,
04:49:55.820 | the virtual ink is barely dry on it.
04:49:58.260 | - It's drying.
04:49:59.180 | I mean, as I mentioned to you offline,
04:50:01.060 | like somebody I admire, somebody you know, Andrej Karpathy,
04:50:04.740 | I think the two of you different trajectories in life,
04:50:07.480 | but approach problems similarly in that he codes stuff
04:50:11.280 | from scratch up all the time.
04:50:13.280 | And he's created a bunch of little things outside of,
04:50:18.280 | even outside the course at Stanford
04:50:22.020 | that have been tremendously useful to build up intuition
04:50:25.240 | about stuff, but also to help people.
04:50:27.760 | And they're all in the realm of AI.
04:50:30.080 | Do you see yourself potentially doing things like this
04:50:34.360 | or not necessarily solving a gigantic problem,
04:50:37.400 | but on the journey, on the path to that,
04:50:40.680 | building up intuitions and sharing code or ideas or systems
04:50:45.680 | that give inklings of AGI,
04:50:52.560 | but also kind of are useful to people in some way?
04:50:55.520 | - So yeah, first of all, Andrej is awesome.
04:50:57.240 | I learned a lot when I was going through my larval phase
04:50:59.720 | from his blog posts and his Stanford course,
04:51:02.200 | and super valuable.
04:51:04.160 | I got to meet him first a couple of years ago
04:51:06.240 | when I was first kind of starting off
04:51:08.400 | on my gentleman scientist bit.
04:51:11.080 | And just a couple of months ago
04:51:13.480 | when he went out on his sabbatical,
04:51:15.440 | he stopped by in Dallas and we talked for a while
04:51:17.620 | and I had a great time with him.
04:51:19.640 | And then when I heard he actually left Tesla,
04:51:21.360 | I did of course, along with a hundred other people say,
04:51:24.200 | "Hey, if you ever wanna work with me, it would be an honor."
04:51:28.200 | So he thinks that he's gonna be doing
04:51:30.400 | this educational work,
04:51:31.440 | but I think someone's gonna make him an offer
04:51:32.960 | he can't refuse before he gets too far along on it.
04:51:36.680 | - Oh, his current interest is educational.
04:51:39.280 | So yeah, he's a special mind.
04:51:42.840 | Is there something you could speak to
04:51:44.320 | what makes him so special?
04:51:46.480 | From your understanding?
04:51:47.320 | - You know, like he did,
04:51:48.640 | he was very much a programmer's programmer
04:51:50.720 | that was doing machine learning work,
04:51:52.360 | rather than, it's a different feel than an academic
04:51:56.040 | where you can see it in paper sometimes
04:51:57.960 | where somebody that's really a mathematician
04:52:00.160 | or a statistician at heart,
04:52:01.560 | and they're doing something with machine learning.
04:52:04.280 | But, you know, Andre is about getting something done
04:52:07.080 | and you could see it in like all of his earliest approaches
04:52:10.000 | to it's like, okay,
04:52:10.840 | here's how reinforcement learning works.
04:52:12.820 | Here's how recurrent neural networks work.
04:52:14.560 | Here's how transformers work.
04:52:16.720 | Here's how crypto works.
04:52:19.280 | And yeah, it's just, he's a hacker's,
04:52:23.360 | you know, one of his old posts was like a hacker's guide
04:52:25.460 | to machine learning.
04:52:26.680 | And, you know, he deprecated that and said,
04:52:27.920 | "Don't really pay attention to what's in here."
04:52:29.840 | But it's that thought that carries through in a lot of it,
04:52:33.760 | where it is that back again to that hacker mentality
04:52:36.960 | and the hacker ethic with what he's doing
04:52:38.840 | and sharing all of it.
04:52:40.960 | - Yeah, and a lot of his approach to a new thing,
04:52:43.600 | like you said, the larval stage, is,
04:52:46.120 | let me code up the simplest possible thing
04:52:48.760 | to build up intuition about it.
04:52:50.360 | - Yeah, like I say, I sketch with structs and things
04:52:52.760 | when I'm just thinking about a problem,
04:52:54.560 | I'm thinking in some degree of code.
04:52:58.120 | - You are also among many things, a martial artist,
04:53:01.480 | both Judo and Jiu-Jitsu.
04:53:03.080 | How has this helped make you the person you are?
04:53:07.200 | - So, I mean, I was a competent club player
04:53:09.200 | in Judo and grappling.
04:53:10.880 | I mean, I was, you know, by no means any kind of a superstar,
04:53:14.000 | but it was, I went through a few phases with it
04:53:17.240 | where I did some, when I was quite young,
04:53:20.520 | a little bit more when I was 17,
04:53:22.320 | and then I got into it kind of seriously in my mid thirties.
04:53:27.200 | And, you know, I went pretty far with it
04:53:29.080 | and I was pretty good at some of the things
04:53:31.520 | that I was doing.
04:53:32.680 | And I did appreciate it quite a bit where,
04:53:36.000 | I mean, on the one hand, it's always,
04:53:37.560 | if you're gonna do exercise or something,
04:53:39.240 | it's a more motivating form of exercise.
04:53:41.320 | If someone is crushing you,
04:53:43.440 | you are motivated to do something about that,
04:53:46.840 | to up your attributes and be better about getting out.
04:53:49.840 | - Up your attributes, yes.
04:53:51.800 | - But there's also that sense that I'm,
04:53:54.960 | you know, I was not a sports guy.
04:53:57.200 | I did do wrestling in junior high.
04:53:59.640 | And I often wish, I think it would have
04:54:02.280 | been good for me
04:54:03.520 | if I'd carried that on into high school
04:54:05.440 | and had a little bit more of that.
04:54:07.120 | I mean, it's like I, you know,
04:54:08.800 | felt a little bit of the wrestling vibe
04:54:10.880 | with all that was going on about embracing the grind
04:54:13.520 | and that push that I associate
04:54:15.280 | with the wrestling team, that, in hindsight,
04:54:18.600 | I wish I had gone through that and pushed myself that way.
04:54:21.960 | But even getting back into Judo and Jiu-Jitsu
04:54:24.600 | in my mid thirties,
04:54:25.920 | usually as the old man on the mat,
04:54:28.640 | there was still, you know, the sense of
04:54:32.680 | working out with the group
04:54:34.800 | and having the guys that you're beating each other up
04:54:38.400 | with, but you just feel good coming out of it.
04:54:41.840 | And I can remember those driving home,
04:54:44.880 | aching in various ways and just thinking,
04:54:47.160 | it's like, oh, that was really great.
04:54:49.280 | And I, you know, it's mixing with a bunch of people
04:54:52.520 | that had nothing to do with any of the things
04:54:54.280 | that I worked with.
04:54:55.840 | You know, every once in a while,
04:54:56.720 | someone would be like, oh, you're the Doom guy.
04:54:58.520 | And I, but for the most part,
04:55:00.760 | it was just a different slice of life.
04:55:02.600 | It was, you know, a good thing.
04:55:04.560 | And I made the call when I was 40.
04:55:07.160 | That's like, maybe I'm getting a little old for this.
04:55:09.120 | I had separated a rib and tweaked a few things
04:55:11.960 | and I got out of it without any really bad injuries.
04:55:15.560 | And it was like, have I dodged enough bullets?
04:55:18.000 | Should I, you know, should I hang it up?
04:55:20.080 | I went back, I've gone a couple of times
04:55:22.920 | in the last decade, trying to get my kids
04:55:25.080 | into it a little bit.
04:55:26.120 | It didn't really stick with any of them,
04:55:27.800 | but it was fun to get back on the mats.
04:55:30.040 | It really hurts for a while when you haven't
04:55:32.760 | gone for a while,
04:55:34.440 | but I still debate this pretty constantly.
04:55:37.720 | My brother's only a year younger than me
04:55:39.520 | and he's going kind of hard in jujitsu right now.
04:55:42.000 | And, you know, he just
04:55:43.720 | won a few medals at the last tournament he was at.
04:55:46.040 | - So he's competing too.
04:55:47.000 | - Yeah, and I was thinking,
04:55:48.520 | yeah, I guess we're in the executive division
04:55:50.280 | if you're over 50 or over 45 or something.
04:55:53.920 | And it's not out of the question
04:55:56.520 | that I go back at some point to do some of this.
04:55:59.000 | But again, I'm just reorganizing my life around more focus,
04:56:03.160 | probably not gonna happen.
04:56:04.720 | I'm pushing my exercise around
04:56:07.040 | to give me longer uninterrupted intellectual focus time,
04:56:10.280 | pushing it to the beginning or the end of the day.
04:56:11.760 | - Like running and stuff like that or walking, yeah.
04:56:14.240 | - Yeah, I got running and calisthenics
04:56:15.840 | and some things like that, but-
04:56:17.720 | - It allows you to still think about a problem.
04:56:19.960 | But if you're going to a judo club or something,
04:56:22.160 | you've got it fixed.
04:56:23.320 | It's gonna be seven o'clock or whatever,
04:56:25.000 | 10 o'clock on Saturday.
04:56:27.120 | Although I talked about this a little bit
04:56:29.000 | when I was on Rogan and shortly after that,
04:56:31.240 | Carlos Machado did reach out
04:56:32.960 | and I had trained with him for years back in the day.
04:56:36.160 | And he was like, hey, we've got kind of a small private club
04:56:39.000 | with a bunch of kind of executive type people.
04:56:41.800 | And it does tempt me.
04:56:45.400 | - Yeah, I don't know if you know him,
04:56:46.960 | but John Danaher moved here to Austin
04:56:49.960 | with Gordon Ryan and a few other folks.
04:56:52.560 | And he has a very interesting way,
04:56:55.280 | very deep systematic way of thinking about jujitsu
04:56:58.640 | that reveals the chess of it,
04:57:01.800 | like the science of it.
04:57:06.640 | - And I do think about that more as kind of an older person
04:57:09.800 | considering the martial arts,
04:57:11.040 | where I can remember the very earliest days
04:57:13.400 | getting back into judo and I'm like,
04:57:15.280 | teach me submissions right now.
04:57:17.200 | It's like, learn the arm bar, learn the choke.
04:57:20.040 | But as you get older, you start thinking more about,
04:57:22.480 | it's like, okay, I really do wanna learn
04:57:24.440 | the entire canon of judo.
04:57:26.200 | It's like all the different things there
04:57:28.160 | and all the different approaches for it.
04:57:30.360 | Not just the, if you wanna compete,
04:57:32.280 | there's just a handful of things
04:57:33.400 | you learn really, really well.
04:57:34.680 | But sometimes there's interest in learning
04:57:36.800 | a little bit more of the scope there
04:57:38.360 | and figuring some things out from,
04:57:40.880 | at one point I had, wasn't exactly a spreadsheet,
04:57:43.640 | but I did have a big long text file with like,
04:57:46.560 | here's the things that I learned in here,
04:57:48.400 | like ways you chain this together.
04:57:50.360 | And while, when I went back a few years ago,
04:57:53.760 | it was good to see that I whipped myself back
04:57:56.080 | into reasonable shape about doing the basic grappling,
04:57:58.800 | but I know there were a ton of the subtleties
04:58:00.720 | that were just gone,
04:58:02.200 | but could probably be brought back reasonably quickly.
04:58:05.040 | - And there's also the benefit,
04:58:06.440 | I mean, you're exceptionally successful now,
04:58:11.560 | you're brilliant, and the problem,
04:58:14.680 | the old problem of the ego is--
04:58:17.720 | - I still pushed kind of harder than I should have.
04:58:20.080 | I mean, that was, I was one of those people that I,
04:58:23.000 | yeah, I'm on the smaller side
04:58:24.800 | for a lot of the people competing.
04:58:27.000 | And I would, I'd go with all the big guys
04:58:29.600 | and I'd go hard and I'd push myself a lot.
04:58:32.640 | And that would be one of those where I would,
04:58:35.280 | I'd be dangerous to anyone for the first five minutes,
04:58:38.760 | but then sometime after, I'm already dead.
04:58:40.960 | And I knew it was terrible for me 'cause
04:58:44.520 | it meant I got less training time with all of that
04:58:47.000 | when you go and you just gas out relatively quickly there.
04:58:50.720 | And I like to think that I would be better about that
04:58:53.880 | where after I gave up judo,
04:58:55.520 | I started doing the half marathons and Tough Mudders
04:58:57.760 | and things like that.
04:58:59.000 | And so when I did go back to the local judo club,
04:59:02.560 | I thought it's like, oh, I should have better cardio
04:59:04.460 | for this 'cause I'm a runner now and I do all of this
04:59:07.080 | and didn't work out that way.
04:59:08.600 | It was the same old thing where just push really hard,
04:59:11.760 | strain really hard.
04:59:12.800 | And of course, when I worked with good guys like Carlos,
04:59:16.240 | it's like, just the whole "flow like water" thing is real.
04:59:19.240 | And he's just like.
04:59:20.560 | - That's true with judo too.
04:59:21.600 | Some of the best people like I've trained
04:59:23.960 | with Olympic gold medalists.
04:59:25.240 | And for some reason with them, everything's easier.
04:59:29.400 | Everything is, you actually start to feel the science of it,
04:59:34.400 | the music of it, the dance of it.
04:59:36.920 | Everything's effortless.
04:59:38.360 | You understand that there's an art to it.
04:59:42.200 | It's not just an exercise.
04:59:43.480 | - It was interesting where I did go to the Kodokan in Japan,
04:59:46.880 | kind of the birthplace of judo and everything.
04:59:49.240 | And I remember I rolled with one old guy.
04:59:51.520 | I didn't start standing, just started on groundwork,
04:59:54.920 | and it was striking how different it was from Carlos.
04:59:58.480 | He was still, he was better than me and he got my arm
05:00:00.680 | and I had to tap there,
05:00:02.480 | but it was a completely different style
05:00:04.920 | where I just felt like I could do nothing.
05:00:06.920 | He was just enveloping me and just like slowly ground me
05:00:09.460 | down, took my arm and bent it.
05:00:11.160 | While with Carlos, he's just loose and free.
05:00:14.560 | And you always thought like,
05:00:15.560 | oh, you're just gonna go grab something,
05:00:17.000 | but you never had any chance to do it.
05:00:18.600 | But it was very different feeling.
05:00:20.040 | - That's a good summary of the difference
05:00:21.600 | between jujitsu and judo.
05:00:23.240 | In jujitsu, it is a dance and you feel like
05:00:25.680 | there's a freedom and actually,
05:00:28.000 | anybody I've trained with, like Gordon Ryan,
05:00:31.440 | one of the best grapplers in the world,
05:00:34.000 | the best no-gi grappler in the world.
05:00:35.960 | There's a feeling like you can do anything,
05:00:38.640 | but when you actually try to do something, you can't.
05:00:41.680 | - Just magically doesn't work.
05:00:42.920 | - But with the best judo players in the world,
05:00:44.800 | yeah, it does feel like there's a blanket
05:00:48.120 | that weighs a thousand pounds on top of you.
05:00:50.380 | And there's not a feeling like you can do anything.
05:00:53.360 | You just, you're trapped.
05:00:54.800 | And that's a style, that's a difference
05:00:57.720 | in the style of martial arts,
05:00:58.960 | but it's also once you start to study,
05:01:01.960 | you understand it all has to do with human movement
05:01:04.520 | and the physics of it and the leverage
05:01:06.160 | and all that kind of stuff.
05:01:07.040 | And that's super fascinating.
05:01:09.360 | At the end of the day, for me,
05:01:11.040 | the biggest benefit is in the humbling aspect
05:01:13.600 | when another human being kind of tells you
05:01:18.400 | that there's a hierarchy, or that
05:01:21.600 | you're not that special.
05:01:25.400 | - Yeah, and in the most extreme case,
05:01:26.740 | when you tap to a choke, you are basically living
05:01:30.320 | because somebody lets you live.
05:01:32.160 | And that is one of those, if you think about it,
05:01:34.520 | that is a closer brush with mortality
05:01:36.480 | than most people consider.
05:01:38.940 | - And that kind of humbling act
05:01:44.000 | is good to take to your work then,
05:01:45.800 | where it's harder to get humbled.
05:01:47.820 | - Yeah, because nobody that does any martial art
05:01:51.600 | is coming out thinking I'm the best in the world
05:01:53.760 | at anything because everybody loses.
05:01:56.360 | - Let me ask you for advice.
05:01:59.720 | What advice would you give to young people today
05:02:02.760 | about life, about career, how they can have a job,
05:02:07.280 | how they can have an impact,
05:02:09.480 | how they can have a life they can be proud of?
05:02:11.960 | - So it was kind of fun.
05:02:14.360 | I got invited to give the commencement speech back at the,
05:02:17.720 | I went to a college for two semesters and dropped out
05:02:21.440 | and went on to do my tech stuff,
05:02:23.360 | but they still wanted me to come back
05:02:25.000 | and give a commencement speech.
05:02:26.760 | And I've got that pinned on my Twitter account.
05:02:29.200 | I still feel good about everything that I said there.
05:02:32.320 | And my biggest point was that the path for me
05:02:36.920 | might not be the path for everyone.
05:02:38.560 | And in fact, the advice, the path that I took,
05:02:41.600 | and even the advice that I would give
05:02:43.320 | based on my experience and learnings
05:02:46.080 | probably isn't the best advice for everyone,
05:02:49.000 | because what I did was all about this knowledge in depth.
05:02:52.560 | It was about not just having this surface level ability
05:02:56.320 | to make things do what I want,
05:02:57.720 | but to really understand them through and through,
05:02:59.800 | to let me do the systems engineering work
05:03:02.400 | and to sometimes find these inefficiencies
05:03:05.120 | that can be bypassed.
05:03:07.120 | And the whole world doesn't need that.
05:03:10.160 | Most programmers don't need, or engineers of any kind
05:03:12.680 | don't necessarily need to do that.
05:03:14.240 | They need to do a little job
05:03:15.960 | that's been parceled out to them.
05:03:18.080 | Be reliable, let people depend on you,
05:03:20.680 | do quality work with all of that.
05:03:22.880 | But people that do have an inclination
05:03:26.320 | for wanting to know things deeper and learn things deeper,
05:03:32.280 | there are just layers and layers of things out there.
05:03:34.920 | And it's amazing.
05:03:36.680 | If you're the right person that is excited about that,
05:03:41.160 | the world's never been like this before.
05:03:42.760 | It's better than ever.
05:03:43.760 | I mean, everything that was wonderful for me
05:03:45.760 | is still there,
05:03:46.880 | and there's whole new worlds to explore
05:03:49.400 | on the different things that you can do.
05:03:52.000 | And that it's hard work.
05:03:55.080 | Embrace the grind with it and understand as much as you can,
05:03:59.320 | and then be prepared for opportunities
05:04:02.240 | to present themselves,
05:04:03.320 | where you can't just say, "This is my goal in life,"
05:04:06.320 | and just push at that.
05:04:07.960 | I mean, you might be able to do that,
05:04:09.280 | but you're going to make more total progress
05:04:11.360 | if you say, "I am preparing myself
05:04:13.600 | with this broad set of tools,
05:04:15.600 | and then I'm being aware of all the way things are changing
05:04:19.040 | as I move through the world
05:04:20.360 | and as the whole world changes around me,"
05:04:22.640 | and then looking for opportunities
05:04:24.160 | to deploy the tools that you've built.
05:04:26.480 | And there's going to be more and more
05:04:28.760 | of those types of things there,
05:04:30.360 | where an awareness of what's happening,
05:04:32.840 | where the inefficiencies are,
05:04:34.600 | what things can be done,
05:04:36.200 | what's possible versus what's current practice,
05:04:39.160 | and then finding those areas
05:04:40.840 | where you can go and make an adjustment
05:04:42.960 | and make something that may affect
05:04:45.640 | millions or billions of people in the world,
05:04:48.160 | make it better.
05:04:49.240 | - When, maybe from your own example,
05:04:51.880 | how were you able to recognize this about yourself,
05:04:54.320 | that you saw the layers in a particular thing
05:04:57.240 | and you were drawn to discovering
05:04:59.760 | deeper and deeper truths about it?
05:05:01.720 | Is that something that was obvious to you,
05:05:03.920 | that you couldn't help,
05:05:04.760 | or is there some actions you had to take
05:05:06.520 | to actually allow yourself to dig deep?
05:05:08.880 | - So in the earliest days of personal computers,
05:05:11.320 | I remember the reference manuals,
05:05:13.440 | and the very early ones even had schematics
05:05:15.640 | of computers in the background,
05:05:17.840 | in the back of the books,
05:05:19.760 | as well as firmware listings and things.
05:05:21.960 | And I could look at that,
05:05:23.200 | and at that time when I was a younger teenager,
05:05:25.960 | I didn't understand a lot of that stuff,
05:05:27.960 | how the different things worked.
05:05:30.320 | I was pulling out the information that I could get,
05:05:32.920 | but I always wanted to know all of that.
05:05:35.080 | There was like kind of magical information
05:05:37.800 | sitting down there.
05:05:38.720 | It's like the elder lore
05:05:39.840 | that some gray-beard wizard is the keeper of.
05:05:43.440 | And so I always felt that pull
05:05:45.240 | for wanting to know more,
05:05:46.840 | wanting to explore the mysterious areas there.
05:05:51.480 | And that followed right in through all the things
05:05:54.520 | that got the value,
05:05:55.800 | exploring the video cards leading to the scrolling advantages,
05:06:00.800 | exploring some of the academic papers and things,
05:06:02.960 | learning about BSP trees
05:06:04.640 | and the different things that I could do with those systems.
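For readers unfamiliar with the reference, here is a minimal sketch of the textbook BSP-tree idea (the generic technique only, not code from any id Software source; identifiers are illustrative). Each node splits space with a plane, and recursing into the far side first yields a back-to-front drawing order:

```c
/* A minimal 2-D BSP node: each node partitions space with the line
   a*x + b*y + c = 0 into a "front" and a "back" half-space. */
#include <stddef.h>

typedef struct BspNode {
    double a, b, c;            /* splitting line: a*x + b*y + c = 0 */
    struct BspNode *front;     /* half-space where a*x + b*y + c > 0 */
    struct BspNode *back;      /* half-space where a*x + b*y + c < 0 */
} BspNode;

/* Classic painter's-algorithm traversal: draw the half-space farther from
   the viewer first, then this node's geometry, then the nearer half-space,
   so nearer surfaces are painted over farther ones. */
static void DrawBackToFront(const BspNode *node, double viewX, double viewY)
{
    if (node == NULL)
        return;
    double side = node->a * viewX + node->b * viewY + node->c;
    if (side > 0.0) {          /* viewer is on the front side */
        DrawBackToFront(node->back, viewX, viewY);
        /* ...draw the geometry stored on this node's plane here... */
        DrawBackToFront(node->front, viewX, viewY);
    } else {                   /* viewer is on the back side (or on the plane) */
        DrawBackToFront(node->front, viewX, viewY);
        /* ...draw the geometry stored on this node's plane here... */
        DrawBackToFront(node->back, viewX, viewY);
    }
}
```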
05:06:08.680 | And just the huge larval phases going through aerospace,
05:06:12.400 | just reading bookshelves full of books.
05:06:15.200 | I mean, again, that point where I have enough money,
05:06:17.080 | I can buy all the books I want.
05:06:18.520 | It was so valuable there
05:06:21.360 | where I was terrible with my money when I was a kid.
05:06:23.760 | My mom thought I would always be broke
05:06:25.360 | because I'd buy my comic books and just be out of money.
05:06:28.560 | But it was like all the pizza I want,
05:06:31.000 | all the Diet Coke I want, video games, and then books.
05:06:33.960 | Books.
05:06:34.800 | And it didn't take that much.
05:06:36.320 | As soon as I was making 27K a year, I felt rich.
05:06:39.760 | I was just getting all the things that I wanted.
05:06:42.360 | But that sense of, books have always been magical to me.
05:06:46.200 | And that was one of the things that really made me smile
05:06:48.280 | is Andrej had said, he found,
05:06:50.600 | when he came over to my house,
05:06:51.560 | he said he found my library inspiring.
05:06:53.680 | And it was great to see.
05:06:55.760 | I look at him, he's kind of a younger guy.
05:06:57.280 | I sometimes wonder if younger people these days
05:06:59.920 | have the same relationship with books that I do
05:07:02.440 | where they were such a cornerstone for me in so many ways.
05:07:05.920 | But that sense that, yeah, I always wanted to know it all.
05:07:08.400 | I know I can't.
05:07:09.640 | And that was like one of the last things I said,
05:07:11.320 | you can't know everything,
05:07:12.840 | but you should convince yourself that you can know anything.
05:07:15.800 | You know, any one particular thing,
05:07:18.160 | it was created and discovered by humans.
05:07:20.040 | You can learn it.
05:07:20.880 | You can find out what you need on there.
05:07:22.840 | - And you can learn it deeply.
05:07:24.320 | - Yeah.
05:07:25.160 | You can drive a nail down through
05:07:26.880 | whatever layer cake problem space you've got
05:07:29.640 | and learn a cross section there.
05:07:31.960 | - And not only can you have an impact doing that,
05:07:34.200 | you can attain happiness doing that.
05:07:37.640 | There's something so fulfilling
05:07:39.000 | about becoming a craftsman of a thing.
05:07:41.120 | - Yeah.
05:07:41.960 | And I don't want to tell people that, look,
05:07:43.120 | this is a good career move,
05:07:44.920 | just grit your teeth and bear it.
05:07:47.520 | And I do think it is possible sometimes
05:07:52.360 | to find the joy in something.
05:07:54.120 | Like it might not immediately appeal to you,
05:07:56.080 | but I had told people early on, like in id Software times,
05:08:00.200 | that a lot of game developers are in it
05:08:03.400 | just because they are so passionate about games.
05:08:05.920 | But I was always really more flexible
05:08:08.960 | in what appealed to me, where I said,
05:08:11.560 | I think I could be quite engaged doing operating system work
05:08:15.400 | or even database work.
05:08:16.600 | I would find the interest in that
05:08:19.400 | because I think most things that are significant
05:08:21.600 | in the world have a lot of layers and complexity to them
05:08:25.200 | and a lot of opportunities hidden within them.
05:08:28.320 | So that would probably be the most important thing
05:08:30.560 | to encourage to people is that
05:08:32.160 | you can, it's like weaponized curiosity.
05:08:35.520 | You can deploy your curiosity to find,
05:08:38.360 | to kind of like make things useful and valuable to you,
05:08:41.400 | even if they don't immediately appear that way.
05:08:43.960 | - Deploy your curiosity, yeah, that's very true.
05:08:47.280 | We've mentioned this debate point,
05:08:49.200 | whether mortality or fear of mortality is fundamental
05:08:53.560 | to creating an AGI, but let's talk about
05:08:56.480 | whether it's fundamental to human beings.
05:08:58.640 | Do you think about your own mortality?
05:09:01.480 | - I really don't.
05:09:03.040 | And you probably always have to like
05:09:05.840 | take with a grain of salt anything somebody says
05:09:07.880 | about fundamental things like that.
05:09:09.960 | But I don't think about really aging, impending death,
05:09:16.040 | legacy with my children, things like that.
05:09:19.800 | And clearly it seems most of the world does a lot,
05:09:23.200 | a lot more than I do.
05:09:24.640 | - Yeah.
05:09:25.520 | - So, I mean, I think I'm an outlier in that where it's,
05:09:29.400 | yeah, it doesn't wind up being a real part of my thinking
05:09:34.200 | and motivation about things.
05:09:36.000 | - So daily existence is about sort of the people you love
05:09:40.640 | and the problems before you.
05:09:43.480 | - I'm very much focused on what I'm working on right now.
05:09:47.080 | I do take that back.
05:09:48.920 | There's one aspect where the kind of finiteness of the life
05:09:52.480 | does impact me.
05:09:53.680 | And that is about thinking about the scope of the problems
05:09:56.960 | that I'm working on.
05:09:57.880 | When I decided what to work on,
05:10:00.240 | whether it was like nuclear fission or AGI,
05:10:02.480 | these are big-ticket things that impact
05:10:06.920 | large fractions of the world.
05:10:08.520 | And I was thinking to myself at some level that,
05:10:11.680 | okay, I mean, I may have a couple more swings at bat
05:10:15.960 | with me at full capability,
05:10:17.760 | but yes, my mental abilities will decay with age,
05:10:22.280 | mostly inevitably.
05:10:23.200 | I don't think it's a 0% chance that we will address
05:10:26.280 | some of that before it becomes a problem for me.
05:10:28.520 | I think exciting medical stuff in the next couple of decades.
05:10:31.760 | But I do have this kind of vague plan
05:10:34.120 | that when I'm not at the top of my game
05:10:36.280 | and I don't feel that I'm in a position
05:10:38.720 | to put a dent in the world some way,
05:10:40.680 | that I'll probably wind up doing some kind of
05:10:42.880 | recreational retro programming or I'll work on something,
05:10:47.880 | something that I would not devote my life to now,
05:10:50.320 | but I can while away my time as the old man
05:10:52.800 | gardening in the code worlds.
05:10:54.400 | - And then to step back even bigger,
05:10:59.720 | let me ask you about why we're here, we human beings.
05:11:03.520 | What's the meaning of it all?
05:11:04.800 | What's the meaning of life, John Carmack?
05:11:07.000 | - So very similar with that last question.
05:11:09.040 | I know a lot of people fret about this question a lot
05:11:12.480 | and I just really don't.
05:11:14.240 | - I really don't give a damn.
05:11:15.600 | - We are biological creatures,
05:11:18.600 | a happenstance of evolution.
05:11:21.320 | We have innate drives that evolution crafted
05:11:24.600 | for survival and passing on of genetic codes.
05:11:28.960 | I don't find a lot of value
05:11:32.240 | in trying to go much deeper than that.
05:11:34.200 | I have my motivations,
05:11:36.960 | some of which are probably genetically coded
05:11:38.880 | and many of which are contingent on my upbringing
05:11:41.520 | and the path that I've had through my life.
05:11:43.840 | I don't run into like spates of depression or ennui
05:11:48.240 | or anything that winds up being a challenge
05:11:52.040 | and forcing a degree of soul searching
05:11:54.080 | with things like that.
05:11:55.520 | I seem to be okay, you know, kind of without that.
05:11:59.280 | - As a brilliant ant in the ant colony
05:12:03.320 | without looking up to the sky wondering
05:12:05.240 | why the hell am I here again?
05:12:07.240 | So the why of it, the incredible mystery
05:12:11.840 | of the fact that we started,
05:12:15.160 | first of all, the origin of life on Earth
05:12:17.800 | and from that, from single cell organisms,
05:12:20.720 | the entirety of the evolutionary process
05:12:22.960 | took us somehow to this incredibly intelligent thing
05:12:26.080 | that is able to build Wolfenstein 3D and Doom and Quake
05:12:30.560 | and take a crack at the problem of AGI
05:12:33.720 | and create things that eventually supersede human beings.
05:12:37.480 | That doesn't, the why of it is-
05:12:41.240 | - It's been my experience that people
05:12:46.720 | that don't focus on the here and now right in front of them
05:12:49.880 | tend to be less effective.
05:12:51.160 | I mean, it's not 100%.
05:12:52.920 | You know, vision matters to some people,
05:12:55.440 | but it doesn't seem to be a necessary motivator for me.
05:12:59.560 | And I think that the process of getting there
05:13:02.280 | is usually done,
05:13:03.320 | I guess like the magic of gradient descent.
05:13:05.480 | People just don't believe that just looking locally
05:13:08.600 | gets you to all of these spectacular things.
05:13:11.040 | That's been, you know, the decades of looking at,
05:13:13.920 | really some of the smartest people in the world
05:13:17.640 | that would just push back forever against this idea
05:13:20.560 | that it's not this grand, sophisticated vision
05:13:23.560 | of everything, but little tiny steps,
05:13:25.800 | local information winds up leading to all the best answers.
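As a toy illustration of that point (added here for clarity, not something shown in the conversation): plain gradient descent on a simple bowl-shaped function consults only the local slope at each step, yet the iterates walk to the global minimum.

```c
/* Gradient descent on f(x, y) = (x - 3)^2 + (y + 1)^2.
   Each step uses only local slope information, yet the iterates
   converge to the global minimum at (3, -1). */
#include <stdio.h>

int main(void)
{
    double x = 0.0, y = 0.0;           /* arbitrary starting point */
    const double rate = 0.1;           /* step size */

    for (int i = 0; i < 100; i++) {
        double gx = 2.0 * (x - 3.0);   /* df/dx */
        double gy = 2.0 * (y + 1.0);   /* df/dy */
        x -= rate * gx;                /* step downhill */
        y -= rate * gy;
    }

    printf("ended near (%.6f, %.6f)\n", x, y);   /* prints ~ (3, -1) */
    return 0;
}
```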
05:13:29.720 | - So the meaning of life is following locally
05:13:34.560 | wherever the gradient descent takes you.
05:13:36.600 | This was an incredible conversation,
05:13:39.280 | officially the longest conversation I've ever done
05:13:41.680 | on the podcast, which means a lot to me
05:13:44.760 | because I get to do it with one of my heroes, John.
05:13:47.120 | I can't tell you how much it means to me
05:13:48.840 | that you would sit down with me.
05:13:50.520 | You're an incredible human being.
05:13:52.180 | I can't wait to see what you do next,
05:13:54.440 | but you've already changed the world.
05:13:56.040 | You're an inspiration to so many people.
05:13:58.120 | And again, we haven't covered like most of what I was planning
05:14:02.520 | to talk about, so I hope we get a chance to talk
05:14:05.320 | someday in the future.
05:14:06.560 | I can't wait to see what you do next.
05:14:08.560 | Thank you so much again for talking to me.
05:14:10.320 | - Thank you very much.
05:14:12.120 | - Thanks for listening to this conversation
05:14:13.600 | with John Carmack.
05:14:14.840 | To support this podcast,
05:14:16.120 | please check out our sponsors in the description.
05:14:18.720 | And now let me leave you with some words
05:14:20.680 | from John Carmack himself.
05:14:22.260 | Focused hard work is the real key to success.
05:14:27.080 | Keep your eyes on the goal
05:14:29.000 | and just keep taking the next step towards completing it.
05:14:32.440 | If you aren't sure which way to do something,
05:14:34.680 | do it both ways and see which works better.
05:14:37.440 | Thank you for listening and hope to see you next time.
05:14:41.560 | (upbeat music)