
James Gosling: Java, JVM, Emacs, and the Early Days of Computing | Lex Fridman Podcast #126


Chapters

0:00 Introduction
4:45 Irrational numbers
8:04 Math and programming
10:36 Coding style
14:41 First computer
23:54 Lisp
27:22 Write an Emacs implementation in C
35:15 Early days of the Internet
45:57 Elon Musk, Steve Jobs, Jeff Bezos
56:13 Work hard and smart
58:48 Open source
70:25 Java
88:31 Java virtual machine
104:05 Android
107:04 Advice

Whisper Transcript

00:00:00.000 | The following is a conversation with James Gosling,
00:00:02.440 | the founder and lead designer
00:00:03.880 | behind the Java programming language,
00:00:05.960 | which in many indices is the most popular
00:00:08.800 | programming language in the world,
00:00:10.680 | or is always at least in the top two or three.
00:00:14.120 | We only had a limited time for this conversation,
00:00:16.520 | but I'm sure we'll talk again several times in this podcast.
00:00:19.880 | Quick summary of the sponsors,
00:00:21.480 | Public Goods, BetterHelp, and ExpressVPN.
00:00:24.800 | Please check out these sponsors in the description
00:00:26.660 | to get a discount and to support this podcast.
00:00:29.840 | As a side note, let me say that Java is the language
00:00:33.420 | with which I first learned object-oriented programming,
00:00:36.520 | and with it, the art and science of software engineering.
00:00:40.300 | Also, early on in my undergraduate education,
00:00:43.360 | I took a course on concurrent programming with Java.
00:00:47.880 | Looking back at that time,
00:00:49.440 | before I fell in love with neural networks,
00:00:51.820 | the art of parallel computing was both algorithmically
00:00:55.660 | and philosophically fascinating to me.
00:00:58.560 | The concept of a computer in my mind before then
00:01:01.420 | was something that does one thing at a time.
00:01:04.840 | The idea that we could create an abstraction of parallelism
00:01:07.920 | where you could do many things at the same time
00:01:10.480 | while still guaranteeing stability
00:01:11.980 | and correctness was beautiful.
00:01:15.080 | While some folks in college took drugs
00:01:17.200 | to expand their mind, I took concurrent programming.
00:01:21.140 | If you enjoy this thing, subscribe on YouTube,
00:01:23.440 | review it with 5 Stars on Apple Podcasts,
00:01:25.600 | follow on Spotify, support on Patreon,
00:01:28.160 | or connect with me on Twitter @lexfridman.
00:01:31.280 | As usual, I'll do a few minutes of ads now
00:01:33.520 | and no ads in the middle.
00:01:34.820 | I try to make these interesting,
00:01:36.280 | but I do give you timestamps, so go ahead and skip,
00:01:39.200 | but please do check out the sponsors
00:01:41.560 | by clicking the links in the description.
00:01:43.520 | It's the best way to support this podcast.
00:01:46.200 | This show, sponsored by Public Goods,
00:01:49.520 | the one-stop shop for affordable, sustainable,
00:01:51.960 | healthy household products.
00:01:54.580 | I take their fish oil and use their toothbrush,
00:01:57.160 | for example.
00:01:58.520 | Their products often have a minimalist
00:02:00.280 | black and white design that I find to be just beautiful.
00:02:03.680 | Some people ask why I wear this black suit and tie.
00:02:07.240 | There's a simplicity to it that, to me,
00:02:09.440 | focuses my mind on the most important bits
00:02:12.480 | of every moment of every day,
00:02:14.920 | pulling only at the thread of the essential
00:02:17.860 | and all that life has to throw at me.
00:02:20.360 | It's not about how I look, it's about how I feel.
00:02:23.040 | That's what design is to me,
00:02:24.520 | creating an inner conscious experience,
00:02:26.960 | not an external look.
00:02:29.280 | Anyway, Public Goods plants one tree
00:02:32.080 | for every order placed, which is kind of cool.
00:02:35.000 | Visit publicgoods.com/lex, or use code LEX at checkout
00:02:38.880 | to get 15 bucks off your first order.
00:02:41.340 | This show is also sponsored by BetterHelp,
00:02:45.200 | spelled H-E-L-P, help.
00:02:47.780 | Check it out at betterhelp.com/lex.
00:02:50.640 | They figure out what you need
00:02:52.040 | and match you with a licensed professional therapist
00:02:54.720 | in under 48 hours.
00:02:56.640 | I chat with the person on there and enjoy it.
00:02:59.800 | Of course, I also regularly talk to David Goggins these days
00:03:03.600 | who is definitely not a licensed professional therapist,
00:03:07.120 | but he does help me meet his and my demons
00:03:11.280 | and become comfortable to exist in their presence.
00:03:14.640 | Everyone is different, but for me,
00:03:16.800 | I think suffering is essential for creation,
00:03:19.460 | but you can suffer beautifully
00:03:21.140 | in a way that doesn't destroy you.
00:03:23.360 | I think therapy can help
00:03:24.900 | in whatever form that therapy takes,
00:03:27.200 | and I do think that BetterHelp is an option worth trying.
00:03:30.000 | They're easy, private, affordable, and available worldwide.
00:03:35.080 | You can communicate by text anytime
00:03:37.400 | and schedule weekly audio and video sessions.
00:03:39.960 | Check it out at betterhelp.com/lex.
00:03:42.760 | This show is also sponsored by ExpressVPN.
00:03:46.200 | You can use it to unlock movies and shows
00:03:48.880 | that are only available in other countries.
00:03:51.320 | I did this recently with Star Trek Discovery
00:03:53.440 | and UK Netflix mostly because I wonder what it's like
00:03:57.240 | to live in London.
00:03:58.840 | I'm thinking of moving from Boston
00:04:00.480 | to a place where I can build the business
00:04:02.220 | I've always dreamed of building.
00:04:04.160 | London is probably not in the top three,
00:04:06.740 | but top 10 for sure.
00:04:08.680 | The number one choice currently is Austin
00:04:11.720 | for many reasons that I'll probably speak to another time.
00:04:14.840 | San Francisco, unfortunately,
00:04:17.040 | dropped out from the number one spot,
00:04:18.940 | but it's still in the running.
00:04:20.660 | If you have advice, let me know.
00:04:23.040 | Anyway, check out ExpressVPN.
00:04:25.000 | It lets you change your location to almost 100 countries
00:04:27.880 | and it's super fast.
00:04:29.640 | Go to expressvpn.com/lexpod
00:04:32.560 | to get an extra three months of ExpressVPN for free.
00:04:36.040 | That's expressvpn.com/lexpod.
00:04:39.720 | And now, here's my conversation with James Gosling.
00:04:44.200 | I've read somewhere that the square root of two
00:04:47.760 | is your favorite irrational number.
00:04:49.600 | - I have no idea where that got started.
00:04:52.180 | (laughing)
00:04:53.960 | - Is there any truth to it?
00:04:55.080 | Is there anything in mathematics or numbers
00:04:57.160 | that you find beautiful?
00:04:58.880 | - Oh, well, there's lots of things in math
00:05:02.120 | that's really beautiful.
00:05:03.720 | You know, I used to consider myself really good at math
00:05:09.080 | and these days I consider myself really bad at math.
00:05:12.040 | I never really had a thing for the square root of two,
00:05:17.000 | but when I was a teenager,
00:05:20.840 | there was this book called
00:05:22.340 | The Dictionary of Curious and Interesting Numbers,
00:05:26.100 | which for some reason I read through
00:05:32.300 | and damn near memorized the whole thing.
00:05:35.900 | And I started this weird habit
00:05:41.380 | of when I was like filling out checks,
00:05:45.300 | you know, or paying for things with credit cards,
00:05:49.580 | I would want to make the receipt
00:05:52.720 | add up to an interesting number.
00:05:54.780 | - Is there some numbers that stuck with you
00:05:57.680 | that just kind of make you feel good?
00:06:00.000 | - They all have a story.
00:06:01.200 | And fortunately, I've actually mostly forgotten all of them.
00:06:06.200 | (laughing)
00:06:08.620 | - Are they, so like 42?
00:06:11.240 | - Well, yeah, I mean, 42 is pretty magical.
00:06:14.640 | - And then the irrationals.
00:06:15.880 | I mean, but is there a square root of two
00:06:17.920 | story in there somewhere?
00:06:19.540 | How did the number get started?
00:06:21.420 | - It's like the only number that has destroyed a religion.
00:06:26.420 | - In which way?
00:06:28.020 | - Well, the Pythagoreans,
00:06:31.180 | they believed that all numbers were perfect
00:06:33.900 | and you could represent anything as a rational number.
00:06:38.900 | And in that time period,
00:06:45.340 | this proof came out that there was no,
00:06:50.340 | you know, rational fraction whose value
00:06:57.840 | was equal to the square root of two.
00:06:59.800 | - And that means nothing in this world is perfect,
00:07:04.040 | not even mathematics.
00:07:05.640 | - Well, it means that your definition
00:07:08.720 | of perfect was imperfect.
00:07:10.360 | - Well, then there's the Gödel incompleteness theorems
00:07:14.120 | in the 20th century that ruined it once again for everybody.
00:07:17.420 | - Yeah, although, although, although Gödel's theorem,
00:07:21.120 | you know, the lesson I take from Gödel's theorem
00:07:26.580 | is not that, you know, there are things you can't know,
00:07:30.540 | which is fundamentally what it says.
00:07:33.780 | But, you know, people want black and white answers.
00:07:39.380 | They want true or false.
00:07:43.580 | But if you allow a three-state logic
00:07:48.020 | that is true, false, or maybe,
00:07:50.060 | then life's good.
00:07:54.060 | - I feel like there's a parallel
00:07:56.540 | to modern political discourse in there somewhere.
00:07:59.540 | But let me ask, so with your kind of early love
00:08:04.540 | or appreciation of the beauty of mathematics,
00:08:11.500 | do you see a parallel between that world
00:08:15.060 | and the world of programming?
00:08:16.760 | - You know, programming is all about logical structure,
00:08:22.020 | understanding the patterns that come out of computation,
00:08:28.040 | understanding sort of, I mean, it's often like,
00:08:37.700 | you know, the path through the graph of possibilities.
00:08:41.700 | To find a short route.
00:08:44.960 | - Meaning like find a short program
00:08:48.340 | that gets the job done kind of thing?
00:08:51.020 | But so then on the topic of irrational numbers,
00:08:54.580 | do you see programming, you just painted it so cleanly.
00:08:59.580 | It's a little of this trajectory
00:09:02.900 | to find like a nice little program,
00:09:05.240 | but do you see it as fundamentally messy?
00:09:07.580 | Maybe unlike mathematics?
00:09:10.900 | - I don't think of it as, I mean, you know,
00:09:13.780 | you watch somebody who's good at math do math,
00:09:16.860 | and you know, often it's fairly messy.
00:09:21.860 | Sometimes it's kind of magical.
00:09:25.240 | When I was a grad student, one of the students,
00:09:32.700 | his name was Jim Saxe, was, he had this,
00:09:37.780 | he had this reputation of being sort of a walking,
00:09:42.780 | talking human theorem proving machine.
00:09:48.440 | And if you were having a hard problem with something,
00:09:51.540 | you could just like accost him in the hall and say,
00:09:54.740 | "Jim," and he would do this funny thing
00:09:59.740 | where he would stand up straight,
00:10:01.860 | his eyes would kind of defocus.
00:10:03.580 | He'd go, you know, just like, you know,
00:10:06.900 | like something in today's movies, he's just,
00:10:09.380 | and then he'd straighten up and say,
00:10:12.020 | "N log N," and walk away.
00:10:13.900 | And you'd go, well, okay, so N log N is the answer.
00:10:20.520 | How did he get there?
00:10:21.980 | By which time he's, you know, down the hallway somewhere.
00:10:27.100 | - Yeah, it's just the Oracle,
00:10:29.580 | the black box just gives you the answer.
00:10:31.380 | - Yeah, and then you have to figure out the path
00:10:33.340 | from the question to the answer.
00:10:35.380 | - I think in one of the videos I watched,
00:10:38.180 | you mentioned Don Knuth, well, at least recommending his,
00:10:43.180 | you know, his book is something people should read.
00:10:47.820 | - Oh yeah.
00:10:48.660 | - But in terms of, you know, theoretical computer science,
00:10:52.380 | do you see something beautiful
00:10:57.220 | that has been inspiring to you, speaking of N log N,
00:11:00.780 | in your work on programming languages,
00:11:03.900 | that's in that whole world of algorithms and complexity
00:11:08.740 | and, you know, these kinds of more formal
00:11:11.660 | mathematical things?
00:11:13.580 | Or did that not really stick with you
00:11:16.780 | in your programming life?
00:11:18.820 | - It did stick pretty clearly for me,
00:11:24.100 | because one of the things that I care about
00:11:27.060 | is being able to sort of look at a piece of code
00:11:32.060 | and be able to prove to myself that it works.
00:11:39.300 | And, you know, so for example, I find that I'm at odds
00:11:46.460 | with many of the people around me
00:11:54.220 | over issues about like how you lay out
00:11:59.220 | a piece of software, right?
00:12:03.460 | You know, so software engineers get really cranky
00:12:07.660 | about how they format the documents that are the programs,
00:12:11.460 | you know, where they put new lines
00:12:13.340 | and where they put, you know.
00:12:14.700 | - The braces.
00:12:15.740 | - The braces and all the rest of that, right?
00:12:18.780 | And I tend to go for a style that's very, very simple
00:12:23.860 | that's very dense.
00:12:25.580 | - To minimize the white space.
00:12:29.100 | - Yeah, well, to maximize the amount
00:12:34.380 | that I can see at once, right?
00:12:37.620 | So I like to be able to see a whole function
00:12:40.780 | and to understand what it does,
00:12:43.100 | rather than have to go scroll, scroll, scroll
00:12:44.980 | and remember, right?
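
(A tiny, purely hypothetical Kotlin example, just to make the layout point concrete: the same small function written first in a sprawling style and then dense enough to take in at a glance. The function names and code are illustrative, not Gosling's.)

```kotlin
// Sprawling layout: nothing wrong with it, but the shape of the function
// no longer fits in one glance.
fun clamp(x: Int, lo: Int, hi: Int): Int
{
    if (x < lo)
    {
        return lo
    }
    if (x > hi)
    {
        return hi
    }
    return x
}

// The same function, dense: the whole thing is visible at once.
fun clampDense(x: Int, lo: Int, hi: Int): Int = if (x < lo) lo else if (x > hi) hi else x

fun main() {
    println(clamp(15, 0, 10))       // prints 10
    println(clampDense(-3, 0, 10))  // prints 0
}
```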
00:12:46.180 | - Yeah, I'm with you on that.
00:12:47.540 | Yeah, that's, and people don't like that?
00:12:52.900 | - Yeah, I've had, you know, multiple times
00:12:56.980 | when engineering teams have staged
00:13:01.060 | what was effectively an intervention.
00:13:03.700 | You know, where they invite me to a meeting
00:13:10.180 | and everybody's arrived before me
00:13:12.140 | and they sort of all look at me and say,
00:13:14.580 | "James, about your coding style."
00:13:21.140 | I'm sort of an odd person to be programming
00:13:24.820 | because I don't think very well verbally.
00:13:29.660 | I am just naturally a slow reader.
00:13:35.820 | I'm what most people would call a visual thinker.
00:13:42.220 | - So when you think about a program, what do you see?
00:13:45.180 | - I see pictures, right?
00:13:47.460 | So when I look at a piece of code on a piece of paper,
00:13:52.260 | it very quickly gets transformed into a picture.
00:13:55.940 | And, you know, it's almost like a piece of machinery
00:14:02.060 | with, you know, this connected to that and-
00:14:05.620 | - Like these gears of different sizes.
00:14:07.660 | - Yeah, yeah.
00:14:09.580 | I see them more like that than I see
00:14:12.100 | the sort of verbal structure
00:14:16.140 | or the lexical structure of letters.
00:14:18.660 | - So then when you look at the program,
00:14:19.980 | that's why you wanna see it all in the same place,
00:14:21.900 | then you can just map it to something visual.
00:14:24.180 | - Yeah, and it just kind of like it leaps off the page at me.
00:14:28.260 | - Yeah, what are the inputs, what are the outputs,
00:14:30.780 | what the heck is this thing doing?
00:14:32.180 | - Yeah, yeah.
00:14:33.660 | - Getting a whole vision of it.
00:14:35.700 | Can we go back into your memory, long-term memory access?
00:14:40.700 | What's the first program you've ever written?
00:14:45.920 | - Oh, I have no idea what the first one was.
00:14:50.920 | I mean, I know the first machine
00:14:54.800 | that I learned to program on.
00:14:58.560 | - What is it?
00:14:59.400 | - Was a PDP-8 at the University of Calgary.
00:15:04.400 | - Do you remember the specs?
00:15:08.120 | - Oh yeah, so the thing had 4K of RAM.
00:15:12.720 | - Nice.
00:15:13.720 | - 12-bit words.
00:15:16.400 | The clock rate was,
00:15:18.560 | it was about a third of a megahertz.
00:15:25.720 | - Oh, so you didn't even get to the M, okay.
00:15:29.280 | - Yeah, yeah, so we're like 10,000 times faster these days.
00:15:34.280 | - And was this kind of like a super computer,
00:15:40.560 | like a serious computer for--
00:15:42.360 | - No, the PDP-8/I was the first thing
00:15:46.240 | that people were calling, like, a minicomputer.
00:15:49.760 | - Got it.
00:15:50.720 | - They were sort of inexpensive enough
00:15:53.640 | that a university lab could maybe afford to buy one.
00:15:57.940 | - And was there time-sharing, all that kind of stuff?
00:16:02.440 | - There actually was a time-sharing OS for that,
00:16:06.240 | but it wasn't used really widely.
00:16:10.520 | The machine that I learned on was one
00:16:13.040 | that was kind of hidden in the back corner
00:16:16.720 | of the computer center.
00:16:20.000 | And it was bought as part of a project
00:16:27.360 | to do computer networking.
00:16:32.400 | But they didn't actually use it very much.
00:16:38.800 | It was mostly just kind of sitting there.
00:16:40.840 | And it was kind of sitting there,
00:16:44.240 | and I noticed it was just kind of sitting there.
00:16:47.040 | And so I started fooling around with it,
00:16:50.640 | and nobody seemed to mind, so I just kept doing that.
00:16:54.880 | - It had a keyboard and like a monitor?
00:16:59.040 | - Oh, this is way before monitors were common.
00:17:02.600 | So it was literally a Model 33 Teletype.
00:17:07.440 | - Okay.
00:17:08.280 | - Paper tape reader.
00:17:09.400 | (laughing)
00:17:12.120 | - Okay, so the user interface wasn't very good.
00:17:14.640 | - Yeah, yeah.
00:17:15.880 | It was the first computer ever built
00:17:20.160 | with integrated circuits.
00:17:22.140 | But by integrated circuits, I mean that they would have
00:17:26.440 | like 10 or 12 transistors on one piece of silicon.
00:17:31.440 | Not the 10 or 12 billion that machines have today.
00:17:38.000 | - So what did that, I mean, feel like,
00:17:41.680 | if you remember those, I mean,
00:17:45.040 | did you have kind of inklings of the magic
00:17:48.960 | of exponential kind of improvement of Moore's Law,
00:17:52.920 | of the potential of the future that was at your fingertips
00:17:56.520 | kind of thing, or was it just a cool--
00:17:58.560 | - Yeah, it was just a toy.
00:18:00.920 | You know, I had always liked building stuff,
00:18:04.160 | but one of the problems with building stuff
00:18:06.920 | is that you need to have parts.
00:18:08.800 | You need to have pieces of wood or wire
00:18:12.320 | or switches or stuff like that.
00:18:15.040 | And those all cost money.
00:18:16.880 | - And here you could build--
00:18:18.040 | - You could build arbitrarily complicated things,
00:18:20.720 | and I didn't need any physical materials.
00:18:23.860 | It required no money.
00:18:27.280 | - That's such a good way to put programming.
00:18:29.120 | You're right.
00:18:30.960 | If you love building things, it's completely accessible.
00:18:35.960 | You don't need anything.
00:18:38.800 | Anybody from anywhere could just build
00:18:40.680 | something really cool.
00:18:41.640 | - Yeah, yeah.
00:18:43.080 | If you've got access to a computer,
00:18:44.760 | you can build all kinds of crazy stuff.
00:18:49.400 | And when you were somebody like me
00:18:58.280 | who had really no money,
00:19:01.280 | and I remember just lusting after being able
00:19:08.360 | to buy a transistor.
00:19:10.920 | (laughing)
00:19:13.160 | And when I would do electronics kind of projects,
00:19:20.360 | they were mostly made, done by dumpster diving for trash.
00:19:27.520 | And one of my big hauls was discarded relay racks
00:19:32.520 | from the back of the phone company switching center.
00:19:37.520 | - Oh, nice.
00:19:38.360 | That was the big memorable treasure.
00:19:41.560 | - Oh yeah, yeah.
00:19:43.400 | - What do you use that for?
00:19:45.320 | - I built a machine that played tic-tac-toe.
00:19:47.700 | (laughing)
00:19:50.600 | - Nice.
00:19:51.440 | - Out of relays.
00:19:52.260 | Of course, the thing that was really hard
00:19:55.920 | was that all the relays required a specific voltage,
00:20:00.280 | but getting a power supply that would do that voltage
00:20:04.640 | was pretty hard.
00:20:06.400 | And since I had a bunch of trashed television sets,
00:20:09.440 | I had to sort of cobble together something
00:20:15.320 | that was wrong, but worked.
00:20:19.400 | So I was actually running these relays at 300 volts.
00:20:25.360 | - Huh.
00:20:26.200 | And none of the electrical connections
00:20:30.480 | were properly sealed off.
00:20:32.540 | - Surprised you survived that period of your life.
00:20:36.400 | - Oh, for so many reasons.
00:20:38.960 | For so many reasons.
00:20:40.480 | I mean, it's pretty common for teenage geeks to discover,
00:20:45.480 | oh, thermite, that's real easy to make.
00:20:48.960 | - Yeah, well, I'm glad you did.
00:20:52.800 | But do you remember what program in Calgary that you wrote?
00:20:57.800 | Anything that stands out?
00:21:01.920 | And what language?
00:21:03.880 | - Well, so mostly, anything of any size was assembly code.
00:21:08.880 | And actually, before I learned assembly code,
00:21:17.560 | there was this programming language on the PDP-8
00:21:20.040 | called Focal 5.
00:21:22.580 | And Focal 5 was kind of like a really stripped down Fortran.
00:21:26.860 | And I remember playing, you know,
00:21:31.700 | building programs that did things like play Blackjack
00:21:36.140 | or Solitaire or, and for some reason or other,
00:21:42.940 | the things that I really liked were ones
00:21:45.580 | where they were just like plotting graphs.
00:21:50.420 | - So something with like a function or a data,
00:21:54.140 | and then you'd plot it.
00:21:55.540 | - Yeah, yeah, I did a bunches of those things
00:21:59.420 | and went, ooh, pretty pictures.
00:22:01.700 | - And so this would like print out, again, no monitors.
00:22:07.500 | - Right, so it was like on a teletype.
00:22:10.700 | - Yeah.
00:22:13.420 | - So it's using something that's kind of like a typewriter.
00:22:18.820 | And then using those to plot functions.
00:22:22.180 | - So when, I apologize to romanticize things,
00:22:26.220 | but when did you first fall in love with programming?
00:22:31.220 | You know, what was the first programming language?
00:22:34.300 | Like as a serious, maybe software engineer,
00:22:36.340 | where you thought this is a beautiful thing?
00:22:39.460 | - I guess I never really thought
00:22:41.860 | of any particular language as being like beautiful,
00:22:45.700 | 'cause it was never really about the language for me.
00:22:47.940 | It was about what you could do with it.
00:22:49.940 | And, you know, even today, you know,
00:22:54.860 | people try to get me into arguments
00:22:56.580 | about particular forms of syntax or this or that.
00:23:00.660 | And I'm like, who cares?
00:23:03.180 | You know, it's about what you can do,
00:23:04.940 | not how you spell the word.
00:23:08.100 | And, you know, so back in those days,
00:23:12.820 | I learned like PL/I and Fortran and COBOL.
00:23:18.780 | And, you know, by the time that people were willing
00:23:22.660 | to hire me to do stuff, you know,
00:23:24.860 | it was mostly assembly code and, you know,
00:23:28.580 | PDP assembly code and Fortran code
00:23:32.300 | and control data assembly code for like the CDC 6400,
00:23:37.300 | which was an early, I guess, supercomputer.
00:23:41.780 | Even though that supercomputer has less compute power
00:23:46.300 | than my phone by a lot.
00:23:49.100 | - And that was mostly, like you said, Fortran world.
00:23:54.820 | That said, you've also showed appreciation
00:23:57.340 | for the greatest language ever
00:24:00.740 | that I think everyone agrees is Lisp.
00:24:03.540 | - Well, Lisp is definitely on my list
00:24:07.900 | of the greatest ones that have existed.
00:24:11.700 | - Is it at number one or, I mean, are you, I mean.
00:24:16.180 | - You know, the thing is that it's, you know,
00:24:19.220 | I wouldn't put it number one, no.
00:24:23.460 | - Is it the parentheses?
00:24:24.780 | What do you love and what do you not love about Lisp?
00:24:30.380 | - Well, I guess the number one thing to not love about it
00:24:36.860 | is so freaking many parentheses.
00:24:39.100 | - Yeah.
00:24:39.940 | - On the love thing is, you know,
00:24:44.020 | out of those tons of parentheses,
00:24:48.020 | you actually get an interesting language structure.
00:24:51.020 | And I've always thought that there was a friendlier version
00:24:54.420 | of Lisp hiding out there somewhere,
00:24:56.540 | but I've never really spent much time.
00:25:01.740 | - Thinking about it.
00:25:02.580 | - Thinking about it, but, you know,
00:25:03.980 | so like up the food chain for me,
00:25:07.980 | then from Lisp is Simula,
00:25:12.460 | which a very small number of people have ever used.
00:25:16.260 | - But a lot of people, I think,
00:25:18.180 | had a huge influence, right?
00:25:19.580 | - Yeah.
00:25:20.420 | - On the programming, but in Simula,
00:25:23.180 | I apologize if I'm wrong on this,
00:25:24.980 | but is that one of the first functional languages or no?
00:25:28.220 | - No, it was the first object oriented programming language.
00:25:32.340 | - Got it.
00:25:33.180 | - It's really where object oriented
00:25:35.620 | and languages sort of came together.
00:25:39.820 | And it was also the language where coroutines
00:25:44.820 | first showed up as a part of the language.
00:25:48.700 | So you could have a programming style that was,
00:25:53.060 | you could think of it as multi-threaded
00:25:57.820 | with a lot of parallelism.
00:26:00.060 | - Really?
00:26:01.820 | There's ideas of parallelism in there?
00:26:03.660 | - Yeah.
00:26:04.500 | Yeah, so that was back, you know,
00:26:08.300 | so the first Simula spec was Simula 67.
00:26:11.340 | - From like 1967?
00:26:14.900 | - Yeah.
00:26:15.980 | - Wow.
00:26:16.820 | - So it had coroutines, which are almost threads.
00:26:21.820 | The thing about coroutines
00:26:23.940 | is that they don't have true concurrency.
00:26:26.340 | So you can get away without really complex locking.
00:26:31.340 | You can't usably do coroutines
00:26:36.380 | on the multi-core machine.
00:26:38.860 | Or if you try to do coroutines
00:26:41.900 | on a multi-core machine,
00:26:44.260 | you don't actually get to use the multiple cores.
00:26:47.660 | - Got it.
00:26:48.660 | - Either that or you, you know,
00:26:49.980 | 'cause you start then having to get into the universe
00:26:54.140 | of semaphores and locks and things like that.
00:26:58.380 | But, you know, in terms of the style of programming,
00:27:04.900 | you could write code and think of it
00:27:08.820 | as being multi-threaded.
00:27:11.580 | The mental model was very much a multi-threaded one.
00:27:16.020 | And all kinds of problems
00:27:18.660 | you could approach very differently.
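
(A rough modern sketch of that coroutine style, written in Kotlin with the kotlinx.coroutines library rather than Simula, and purely illustrative: two tasks interleave cooperatively on one thread, so the shared counter needs no locks, but a second core is never used.)

```kotlin
import kotlinx.coroutines.*

// Two coroutines take turns on the single thread that runBlocking owns.
// Only one runs at any instant, so incrementing the shared counter is
// safe without locks; the flip side is that a second core is never used.
fun main() = runBlocking {
    var counter = 0
    val a = launch { repeat(5) { counter++; println("A sees $counter"); yield() } }
    val b = launch { repeat(5) { counter++; println("B sees $counter"); yield() } }
    joinAll(a, b)
    println("final counter = $counter")  // always 10, deterministically
}
```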
00:27:20.620 | - To return to the world of Lisp for a brief moment,
00:27:27.420 | you, at CMU, you wrote a version of Emacs
00:27:32.620 | that I think was very impactful on the history of Emacs.
00:27:36.220 | What was your motivation for doing so?
00:27:42.420 | - At that time, so that was in like '85 or '86.
00:27:47.420 | I had been using Unix for a few years.
00:27:57.740 | And most of the editing was this tool called ED,
00:28:02.740 | which was sort of an ancestor of VI.
00:28:08.260 | And--
00:28:11.460 | - Is it a pretty good editor, not a good editor?
00:28:14.180 | - Well, if what you're using,
00:28:16.780 | if your input device is a teletype, it's pretty good.
00:28:21.700 | - Yeah.
00:28:22.820 | - It's certainly more humane than TECO,
00:28:25.900 | which was kind of the common thing
00:28:28.260 | in a lot of the DEC universe at the time.
00:28:32.700 | - TECO is spelled T-K?
00:28:34.700 | Is that the--
00:28:35.540 | - No, TECO, T-E-C-O, the text editor and corrector.
00:28:39.780 | - Corrector, wow, so many features.
00:28:41.740 | - And the original Emacs came out as,
00:28:48.860 | so Emacs stands for editor macros.
00:28:52.300 | And TECO had a way of writing macros.
00:28:55.740 | And so the original Emacs from MIT,
00:29:01.740 | sort of started out as a collection of macros for TECO.
00:29:06.220 | But then, the sort of Emacs style got popular
00:29:12.180 | originally at MIT, and then people did a few
00:29:19.140 | other implementations of Emacs that were,
00:29:22.540 | you know, the code base was entirely different,
00:29:26.340 | but it was sort of the philosophical style
00:29:29.020 | of the original Emacs.
00:29:30.900 | - What was the philosophy of Emacs?
00:29:33.420 | And by the way, were all the implementations always in C?
00:29:36.820 | And then--
00:29:37.660 | - No, no.
00:29:38.500 | - And how does Lisp fit into the picture?
00:29:39.860 | - No, so the very first Emacs was written
00:29:43.420 | as a bunch of macros for the TECO text editor.
00:29:46.580 | - Wow, that's so interesting.
00:29:47.940 | - And the macro language for TECO
00:29:52.940 | was probably the most ridiculously obscure format.
00:29:58.340 | You know, if you just look at a TECO program on a page,
00:30:02.500 | you think it was just random characters.
00:30:06.060 | It really looks like just line noise.
00:30:08.860 | - So it's kind of like LaTeX or something?
00:30:11.580 | - Oh, way worse than LaTeX.
00:30:15.020 | Way, way worse than LaTeX.
00:30:17.980 | But, you know, if you use Tico a lot, which I did,
00:30:21.980 | the Tico was completely optimized
00:30:25.460 | for touch typing at high speed.
00:30:29.220 | So there were no two character commands.
00:30:34.340 | Well, there were a few,
00:30:36.340 | but mostly they were just one character.
00:30:38.980 | So every character on the keyboard was a separate command.
00:30:41.900 | And actually every character on the keyboard
00:30:45.740 | was usually two or three commands
00:30:48.060 | because you can hit shift and control
00:30:51.380 | and all of those things.
00:30:52.580 | It's just a way of very tightly encoding it.
00:30:55.300 | And mostly what Emacs did was it made that visual.
00:31:00.900 | So one way to think of TECO is use Emacs
00:31:08.580 | with your eyes closed.
00:31:14.100 | Where you have to maintain a mental model of,
00:31:17.140 | you know, sort of a mental image of your document.
00:31:20.660 | You have to go, okay, so the cursor is between the A
00:31:25.660 | and the E and I want to exchange those.
00:31:29.780 | So I do these things, right?
00:31:31.860 | So it is almost exactly the Emacs command set.
00:31:36.860 | Well, it's roughly the same as Emacs command set,
00:31:42.860 | but using Emacs with your eyes closed.
00:31:45.180 | So what Emacs, you know,
00:31:50.980 | part of what Emacs added to the whole thing
00:31:52.940 | was being able to visually see what you were editing
00:31:57.700 | in a form that matched your document.
00:32:02.660 | And, you know, a lot of things changed in the command set.
00:32:11.700 | You know, because it was programmable,
00:32:13.820 | it was really flexible.
00:32:15.980 | You could add new commands for all kinds of things.
00:32:18.540 | And then people rewrote Emacs like multiple times in Lisp.
00:32:23.540 | There was one done at MIT for the Lisp machine.
00:32:28.180 | There was one done for Multics.
00:32:31.500 | And one summer I got a summer job
00:32:35.100 | to work on the Pascal compiler for Multics.
00:32:40.300 | And that was actually the first time I used Emacs.
00:32:43.700 | And so-
00:32:47.820 | - To write the compiler.
00:32:48.740 | So you've worked on compilers too.
00:32:50.420 | That's fascinating.
00:32:52.220 | - Yeah, so I did a lot of work.
00:32:53.860 | You know, I mean, I spent like a really intense three months
00:32:59.580 | working on this Pascal compiler, basically living in Emacs.
00:33:05.300 | And it was the one written in Maclisp
00:33:08.940 | by Bernie Greenberg.
00:33:11.380 | And I thought, wow, this is just a way better way
00:33:15.420 | to do editing.
00:33:16.460 | And then I got back to CMU where we had kind of one
00:33:24.140 | of everything and two of a bunch of things
00:33:28.700 | and four of a few things.
00:33:30.620 | And since I mostly worked in the Unix universe
00:33:35.620 | and Unix didn't have an Emacs,
00:33:38.740 | I decided that I needed to fix that problem.
00:33:41.420 | So I wrote this implementation of Emacs in C
00:33:47.700 | 'cause at the time C was really the only language
00:33:51.020 | that worked on Unix.
00:33:54.940 | - And you were comfortable with C as well?
00:33:57.780 | - Oh yeah.
00:33:58.620 | - At that point?
00:33:59.540 | - Yeah, at that time I had done a lot of C coding.
00:34:02.500 | This was in like '86.
00:34:04.340 | And it was running well enough
00:34:11.980 | for me to use it to edit itself within a month or two.
00:34:17.100 | And then it kind of took over the university.
00:34:22.460 | - And it spread outside.
00:34:25.580 | - Yeah, and then it went outside.
00:34:28.300 | And largely because Unix kind of took over
00:34:32.460 | the research community on the ARPANET.
00:34:37.220 | And Emacs was kind of the best editor out there.
00:34:43.300 | It kind of took over.
00:34:44.380 | And there was actually a brief period
00:34:48.020 | where I actually had login IDs
00:34:53.020 | on every non-military host on the ARPANET.
00:34:57.780 | (laughing)
00:34:59.220 | Because people would say, "Oh, can we install this?"
00:35:01.460 | And I'd like, "Well, yeah, but you'll need some help."
00:35:06.460 | (laughing)
00:35:09.780 | - The days when security wasn't--
00:35:11.980 | - When nobody cared.
00:35:12.940 | - Nobody cared.
00:35:14.060 | - Yeah.
00:35:14.900 | - Can I ask briefly, what were those early days
00:35:19.340 | of ARPANET and the internet like?
00:35:24.940 | I mean, did you, again, sorry for the silly question,
00:35:28.940 | but could you have possibly imagined
00:35:31.980 | that the internet would look like what it is today?
00:35:36.780 | - You know, some of it is remarkably unchanged.
00:35:41.460 | So like one of the things that I noticed really early on
00:35:46.700 | when I was at Carnegie Mellon was that
00:35:53.860 | a lot of social life became centered around the ARPANET.
00:35:58.860 | So things like, you know, between email and text messaging,
00:36:05.140 | because text messaging was a part of the ARPANET
00:36:09.700 | really early on.
00:36:11.780 | There were no cell phones, but you're sitting at a terminal
00:36:15.220 | and you're typing stuff.
00:36:16.420 | - So essentially email, or like what is--
00:36:20.020 | - Well, just like a one-line message, right?
00:36:23.940 | - Oh, cool, so like chat.
00:36:25.580 | - Like chat.
00:36:26.740 | - Yeah.
00:36:27.580 | - Right, so it's like sending a one-line message
00:36:30.260 | to somebody, right?
00:36:31.980 | And so pretty much everything from, you know,
00:36:36.980 | arranging lunch to going out on dates, you know,
00:36:43.700 | it was all like driven by social media.
00:36:48.180 | - Social media. (laughs)
00:36:49.820 | - Right, in the '80s.
00:36:52.980 | - Easier than phone calls, yeah.
00:36:55.060 | - You know, and my life had gotten to where, you know,
00:36:59.660 | I was, you know, living on social media, you know,
00:37:03.940 | from like the early mid '80s.
00:37:08.140 | And so when it sort of transformed into the internet
00:37:16.340 | and social media explodes, I was kind of like,
00:37:19.980 | what's the big deal?
00:37:21.060 | - Yeah, it's just a scale thing.
00:37:23.500 | - It's, right, the scale thing is just astonishing.
00:37:27.940 | - Yeah, but the fundamentals in some ways--
00:37:32.100 | - The fundamentals have hardly changed.
00:37:36.220 | And, you know, the technologies behind the networking
00:37:40.940 | have changed significantly.
00:37:42.580 | The, you know, the watershed moment of, you know,
00:37:48.300 | going from the ARPANET to the internet,
00:37:50.460 | and then people starting to just scale and scale and scale.
00:37:57.340 | I mean, the scaling that happened in the early '90s
00:38:02.340 | and the way that so many vested interests
00:38:08.700 | fought the internet.
00:38:11.340 | - Oh, who, oh, interesting.
00:38:14.220 | What was the, oh, because you can't really control
00:38:16.820 | the internet?
00:38:18.020 | So who fought the internet?
00:38:19.180 | - So fundamentally, the, you know,
00:38:23.700 | the cable TV companies and broadcasters
00:38:27.500 | and phone companies, you know,
00:38:30.180 | at the deepest fibers of their being,
00:38:35.620 | they hated the internet.
00:38:38.380 | But it was often kind of a funny thing
00:38:43.380 | because, you know, so think of a cable company, right?
00:38:48.380 | Most of the employees of the cable company,
00:38:57.340 | their job is getting TV shows, movies,
00:39:02.340 | whatever out to their customers.
00:39:06.060 | They view their business as serving their customers.
00:39:13.020 | But as you climb up the hierarchy in the cable companies,
00:39:18.020 | that view shifts because really the business
00:39:24.980 | of the cable companies had always been
00:39:32.580 | selling eyeballs to advertisers.
00:39:34.900 | - Right.
00:39:40.020 | - And, you know, that view of like a cable company
00:39:45.020 | didn't really dawn on most people
00:39:49.020 | who worked at the cable companies.
00:39:51.060 | But, you know, I had various dustups
00:39:55.180 | with various cable companies where you could see,
00:39:58.460 | you know, in the stratified layers of the corporation
00:40:01.420 | that this view of, you know,
00:40:05.900 | the reason that you have, you know, cable TV
00:40:10.020 | is to capture eyeballs, you know, they're--
00:40:13.580 | - So they didn't see it that way.
00:40:15.180 | - Well, so the people who,
00:40:17.020 | most of the people who worked at the phone company
00:40:19.940 | or at the cable companies, their view was
00:40:24.020 | that their job was getting delightful content
00:40:28.220 | out to their customers and their customers
00:40:30.460 | would pay for that.
00:40:33.260 | Higher up, they viewed this as a way
00:40:36.700 | of attracting eyeballs to them.
00:40:40.980 | And then what they were really doing
00:40:45.700 | was selling the eyeballs that were glued to their content,
00:40:50.700 | to the advertisers.
00:40:52.580 | - To the advertisers, yeah.
00:40:54.620 | And so the internet was a competition in that sense.
00:40:57.380 | - Right, and so--
00:40:58.220 | - They were right.
00:41:00.500 | - Well, yeah, I mean, there was one proposal
00:41:04.940 | that we sent that we, one detailed proposal
00:41:09.940 | that we wrote up, you know, back at Sun
00:41:15.580 | in the early '90s that was essentially like,
00:41:19.380 | look, anybody, you know, with internet technologies,
00:41:22.380 | anybody can become provider of content.
00:41:27.220 | So, you know, you could be distributing home movies
00:41:32.220 | to your parents or your cousins who are anywhere else, right?
00:41:37.820 | So anybody can become a publisher.
00:41:41.500 | - Wow, you were thinking about that already, yeah.
00:41:43.460 | Netflix, Netflix.
00:41:45.780 | - Yeah, that was like in the early '90s.
00:41:48.980 | - Yeah.
00:41:49.980 | - And we thought this would be great.
00:41:52.460 | And the kind of content we were thinking about
00:41:56.900 | at the time was like, you know, home movies, kids' essays,
00:42:01.900 | you know, stuff from like grocery stores
00:42:08.660 | or, you know, or a restaurant
00:42:12.500 | that they could actually like start
00:42:15.500 | sending information about.
00:42:17.100 | - That's brilliant.
00:42:21.500 | - And the reaction of the cable companies was like,
00:42:27.140 | fuck no, because then we're out of business.
00:42:32.140 | - What is it about companies that,
00:42:36.900 | 'cause they could have just,
00:42:38.500 | they could have been ahead of that wave.
00:42:40.140 | They could have listened to that and they could have--
00:42:42.660 | - They didn't see a path to revenue.
00:42:44.700 | - You know, there's somewhere in there,
00:42:47.620 | there's a lesson for like big companies, right?
00:42:50.420 | Like to listen, to try to anticipate the renegade,
00:42:56.100 | the out there, out of the box people like yourself
00:42:59.500 | in the early days writing proposals
00:43:01.580 | about what this could possibly be.
00:43:03.900 | - Well, and that, you know, it wasn't,
00:43:06.780 | you know, if you're in a position
00:43:08.580 | where you're making truckloads of money
00:43:12.900 | off of a particular business model,
00:43:15.660 | you know, the whole thought of like,
00:43:25.060 | you know, leaping the chasm, right?
00:43:27.540 | You know, you can see, oh, new models
00:43:31.780 | that are more effective are emerging, right?
00:43:35.580 | So like digital cameras versus film cameras.
00:43:39.860 | You know, I mean--
00:43:44.140 | - Why take the leap?
00:43:45.300 | - Why take the leap?
00:43:46.660 | Because you're making so much money off of film.
00:43:51.140 | And, you know, in my past at Sun,
00:43:56.140 | one of our big customers was Kodak.
00:43:59.660 | And I ended up interacting with folks from Kodak quite a lot
00:44:03.340 | and they actually had a big digital camera research
00:44:08.340 | and, you know, digital imaging business
00:44:12.220 | or development group.
00:44:16.020 | And they knew that, you know,
00:44:21.620 | you just look at the trend lines
00:44:24.420 | and you look at, you know, the emerging quality
00:44:28.660 | of these, you know, digital cameras.
00:44:33.660 | And, you know, you can just plot it on a graph, you know?
00:44:37.980 | And it's like, you know, sure film is better today,
00:44:42.820 | but, you know, digital is improving like this.
00:44:49.420 | The lines are gonna cross and, you know,
00:44:53.260 | the point at which the lines cross
00:44:55.140 | is gonna be a collapse in their business.
00:44:57.740 | And they could see that, right?
00:45:03.580 | They absolutely knew that.
00:45:06.100 | The problem is that, you know,
00:45:07.860 | up to the point where they hit the wall,
00:45:10.380 | they were making truckloads of money, right?
00:45:14.860 | - Yeah.
00:45:15.980 | - Right, and when they did the math,
00:45:20.140 | it never started to make sense for them
00:45:26.140 | to kind of lead the charge.
00:45:29.420 | And part of the issues for a lot of companies
00:45:32.540 | for this kind of stuff is that, you know,
00:45:36.820 | if you're gonna leap over a chasm like that,
00:45:39.300 | like with Kodak going from film to digital,
00:45:45.300 | that's a transition that's gonna take a while, right?
00:45:49.940 | We had fights like this with people over things like smart cards.
00:45:53.540 | The smart card fights were just ludicrous.
00:45:56.980 | - But that's where visionary leadership comes in, right?
00:46:00.060 | Yeah, somebody needs to roll in and say,
00:46:02.740 | then take the leap.
00:46:04.780 | - Well, it partly take the leap,
00:46:07.420 | but it's also partly take the hit.
00:46:09.580 | - Take the hit in the short term.
00:46:10.780 | - So you can draw the graphs you want that show
00:46:15.780 | that if we leap from here,
00:46:18.420 | on our present trajectory, we're doing this
00:46:21.660 | and there's a cliff.
00:46:23.140 | If we force ourselves into a transition
00:46:27.740 | and we proactively do that, we can be on the next wave,
00:46:32.740 | but there will be a period when we're in a trough.
00:46:39.220 | And pretty much always there ends up being a trough
00:46:44.020 | as you leap the chasm.
00:46:46.220 | But the way that public companies work on this planet,
00:46:51.220 | they're reporting every quarter.
00:46:55.900 | And the one thing that a CEO must never do
00:47:00.140 | is take a big hit.
00:47:02.820 | - Take a big hit.
00:47:03.740 | - Over some quarter.
00:47:06.260 | And many of these transitions involve a big hit
00:47:11.260 | for a period of time, one, two, three quarters.
00:47:16.300 | And so you get some companies and like Tesla and Amazon
00:47:22.500 | are really good examples of companies that take huge hits,
00:47:31.100 | but they have the luxury of being able
00:47:34.660 | to ignore the stock market for a little while.
00:47:38.220 | And that's not so true today really,
00:47:42.220 | but in the early days of both of those companies,
00:47:47.140 | they both did this thing of,
00:47:53.040 | I don't care about the quarterly reports,
00:47:57.740 | I care about how many happy customers we have.
00:48:02.380 | And having as many happy customers as possible
00:48:05.940 | can often be an enemy of the bottom line.
00:48:10.940 | - Yeah, so how do they make that work?
00:48:12.820 | I mean, Amazon operated in the negative for a long time.
00:48:15.500 | It's like investing into the future.
00:48:17.340 | - Right, but so Amazon and Google and Tesla and Facebook,
00:48:22.340 | a lot of those had what amounted to patient money.
00:48:30.380 | Often because there's like a charismatic central figure
00:48:35.380 | who has a really large block of stock
00:48:39.980 | and they can just make it so.
00:48:44.700 | - So on that topic, just maybe it's a little small tangent,
00:48:49.500 | but you've gotten the chance to work
00:48:51.500 | with some pretty big leaders.
00:48:53.300 | What are your thoughts about on Tesla's side,
00:48:55.780 | Elon Musk leadership, on the Amazon side, Jeff Bezos,
00:49:00.180 | all of these folks with large amounts of stock
00:49:02.900 | and vision in their company.
00:49:05.020 | I mean, they're founders,
00:49:06.380 | either complete founders or early on folks.
00:49:11.460 | And Amazon have taken a lot of leaps
00:49:16.300 | that probably at the time people would criticize
00:49:21.860 | as like, what is this bookstore thing?
00:49:24.940 | Why? (laughs)
00:49:26.500 | - Yeah, and Bezos had a vision
00:49:30.300 | and he had the ability to just follow it.
00:49:36.380 | Lots of people have visions
00:49:37.900 | and the average vision is completely idiotic
00:49:41.580 | and you crash and burn.
00:49:42.940 | The Silicon Valley crash and burn rate is pretty high.
00:49:50.100 | And they don't necessarily crash and burn
00:49:54.620 | because they were dumb ideas,
00:49:55.980 | but often it's just timing and luck.
00:50:00.980 | And you take companies like Tesla
00:50:06.020 | and really the original Tesla,
00:50:12.020 | sort of pre-Elon was kind of doing sort of okay,
00:50:18.820 | but he just drove them.
00:50:26.500 | And because he had a really strong vision,
00:50:31.500 | he would make calls that were always,
00:50:36.420 | well, mostly pretty good.
00:50:40.380 | I mean, the Model X was kind of a goofball thing to do.
00:50:44.740 | - But he did it boldly anyway.
00:50:46.540 | Like there's so many people that just said,
00:50:49.660 | like there's so many people that oppose them
00:50:51.900 | on the falcon wing door, like the door.
00:50:54.460 | From an engineering perspective,
00:50:55.860 | those doors are ridiculous.
00:50:57.740 | It's like--
00:50:58.580 | - Yeah, they are a complete travesty.
00:51:00.780 | - But they're exactly the symbol
00:51:04.260 | of what great leadership is,
00:51:05.860 | which is like you have a vision and you just go like--
00:51:09.140 | - If you're gonna do something stupid,
00:51:10.740 | make it really stupid.
00:51:12.140 | - Yeah, and go all in.
00:51:13.620 | - Yeah, yeah.
00:51:15.100 | And to his credit, he's a really sharp guy.
00:51:22.620 | So going back in time a little bit to Steve Jobs,
00:51:26.020 | Steve Jobs was a similar sort of character
00:51:29.700 | who had a strong vision and was really, really smart.
00:51:33.660 | And he wasn't smart about the technology parts of things,
00:51:39.340 | but he was really sharp about the sort of human relationship
00:51:44.340 | between humans and objects.
00:51:52.500 | And, but he was a jerk.
00:51:57.500 | - You know--
00:52:00.100 | - Right?
00:52:01.140 | - Can we just linger on that a little bit?
00:52:02.580 | Like people say he's a jerk.
00:52:04.420 | Is that a feature or a bug?
00:52:08.180 | - Well, that's the question, right?
00:52:11.300 | So you take people like Steve,
00:52:13.820 | who was really hard on people.
00:52:18.500 | And so the question is, was he really,
00:52:22.980 | was he needlessly hard on people
00:52:25.460 | or was he just making people reach to meet his vision?
00:52:30.460 | And you could kind of spin it either way.
00:52:39.220 | - Well, the results tell a story.
00:52:46.020 | Whatever jerk ways he had,
00:52:47.900 | he made people often do the best work of their life.
00:52:51.580 | - Yeah, yeah.
00:52:52.700 | And that was absolutely true.
00:52:54.620 | And I interviewed with him several times.
00:52:58.580 | I did various negotiations with him.
00:53:04.540 | And even though kind of personally I liked him,
00:53:09.540 | I could never work for him.
00:53:14.540 | - Why do you think that,
00:53:22.620 | can you put into words the kind of tension
00:53:25.540 | that you feel would be destructive
00:53:29.220 | as opposed to constructive?
00:53:30.660 | - Oh, he'd yell at people, he'd call them names.
00:53:39.140 | - And you don't like that?
00:53:40.220 | - No, no, I don't think you need to do that.
00:53:44.180 | And I think there's pushing people to excel
00:53:51.820 | and then there's too far.
00:54:01.620 | And I think he was on the wrong side of the line.
00:54:04.740 | And I've never worked for Musk.
00:54:07.420 | I know a number of people who have,
00:54:10.220 | many of them have said,
00:54:11.900 | and it shows up in the press a lot
00:54:13.900 | that Musk is kind of that way.
00:54:18.340 | And one of the things that I sort of loathe
00:54:21.940 | about Silicon Valley these days
00:54:24.460 | is that a lot of the high flying successes
00:54:29.460 | are run by people who are complete jerks.
00:54:33.500 | But it seems like there's become this,
00:54:36.860 | there's come this sort of mythology out of Steve Jobs
00:54:41.140 | that the reason that he succeeded
00:54:44.540 | was because he was super hard on people.
00:54:49.380 | And in a number of corners, people start going,
00:54:55.340 | oh, if I wanna succeed, I need to be a real jerk.
00:55:01.060 | - Yeah.
00:55:01.900 | - Right, and that for me just does not compute.
00:55:05.740 | I mean, I know a lot of successful people
00:55:07.740 | who are not jerks, who are perfectly fine people.
00:55:12.060 | You know, they tend to not be in the public eye.
00:55:19.420 | - The general public somehow lifts the jerks up
00:55:25.020 | into the hero status.
00:55:28.700 | - Right, well, because they do things
00:55:30.700 | that get them in the press.
00:55:32.380 | - Yeah.
00:55:33.220 | - And the people who don't do the kind of things
00:55:38.220 | that spill into the press.
00:55:46.060 | - Yeah, I just talked to Chris Lattner for the second time.
00:55:52.460 | He's a super nice guy, just an example
00:55:57.620 | of this kind of individual
00:55:58.900 | that's in the background.
00:56:00.420 | I feel like he's behind like a million technologies,
00:56:03.100 | but he also talked about the jerkiness
00:56:05.660 | of some of the folks.
00:56:06.900 | - Yeah, yeah, and the fact that being a jerk
00:56:10.660 | has become your required style.
00:56:13.180 | - But one thing I maybe wanna ask on that
00:56:15.820 | is maybe to push back a little bit.
00:56:17.820 | So there's the jerk side, but there's also,
00:56:21.420 | if I were to criticize what I've seen in Silicon Valley,
00:56:24.420 | which is almost the resistance to working hard.
00:56:28.420 | So on the jerkiness side is,
00:56:31.100 | so both Jobs and Elon kind of push people
00:56:38.740 | to work really hard to do.
00:56:41.820 | And there's a question whether it's possible
00:56:44.300 | to do that nicely.
00:56:45.940 | But one of the things that bothers me,
00:56:47.580 | maybe I'm just a Russian and just kind of,
00:56:51.100 | romanticize the whole suffering thing.
00:56:53.380 | But I think working hard is essential
00:56:56.620 | for accomplishing anything interesting, like really hard.
00:57:00.180 | And in the parlance of Silicon Valley,
00:57:02.980 | it's probably too hard.
00:57:04.860 | This idea of the, you should work smart, not hard,
00:57:07.500 | often to me sounds like you should be lazy,
00:57:12.500 | because of course you wanna work smart.
00:57:15.300 | Of course you would be maximally efficient,
00:57:18.300 | but in order to discover the efficient path,
00:57:20.180 | like we're talking about with the short programs.
00:57:21.980 | - Yeah, well, you know,
00:57:25.180 | the smart hard thing isn't an either or, it's an and.
00:57:29.700 | - It's an and, yeah.
00:57:30.820 | - Right?
00:57:31.660 | And, you know,
00:57:35.020 | the people who say you should work smart, not hard,
00:57:41.540 | they pretty much always fail.
00:57:44.500 | - Yeah, thank you.
00:57:46.380 | - Right, I mean, that's just a recipe for disaster.
00:57:49.820 | I mean, there are counter examples,
00:57:53.580 | but there are more people who benefited from luck.
00:57:57.620 | - And you're saying, yeah, exactly.
00:58:00.340 | Luck and timing, like you said, is often an essential thing.
00:58:04.740 | But you're saying, you know,
00:58:06.860 | you can push people to work hard
00:58:08.540 | and do incredible work without--
00:58:10.380 | - Without being nasty.
00:58:13.700 | - Yeah, without being nasty.
00:58:14.980 | I think Google is a good example
00:58:19.980 | of the leadership of Google throughout its history
00:58:23.060 | has been a pretty good example of not being nasty.
00:58:28.060 | - Yeah, I mean, the two, Larry and Sergey,
00:58:33.260 | are both pretty nice people.
00:58:37.700 | - Sundar Pichai is very nice.
00:58:39.740 | - Yeah, yeah, and, you know,
00:58:43.020 | it's a culture of people who work really, really hard.
00:58:49.860 | - Let me ask maybe a little bit of a tense question.
00:58:53.180 | We're talking about Emacs.
00:58:56.500 | It seems like you've done some incredible work,
00:58:58.900 | so outside of Java, you've done some incredible work
00:59:01.900 | that didn't become as popular as it could have
00:59:04.980 | because of like licensing issues
00:59:07.020 | and open sourcing issues.
00:59:10.060 | What are your thoughts about that entire mess?
00:59:18.740 | Like what's about open source now in retrospect,
00:59:22.220 | looking back, about licensing, about open sourcing,
00:59:27.100 | do you think open source is a good thing, a bad thing?
00:59:32.100 | Do you have regrets?
00:59:35.700 | Do you have wisdom that you've learned
00:59:38.100 | from that whole experience?
00:59:39.860 | - So in general, I'm a big fan of open source.
00:59:45.420 | The way that it can be used to build communities
00:59:49.300 | and promote the development of things
00:59:52.060 | and promote collaboration and all of that
00:59:55.220 | is really pretty grand.
00:59:56.660 | When open source turns into a religion
01:00:03.420 | that says all things must be open source,
01:00:05.900 | I get kind of weird about that
01:00:10.940 | because it's sort of like saying,
01:00:13.340 | some versions of that end up saying
01:00:18.340 | that all software engineers must take a vow of poverty.
01:00:23.420 | Right, as though--
01:00:26.980 | - It's unethical to have money.
01:00:30.140 | - Yeah.
01:00:31.580 | - To build a company to, right.
01:00:34.620 | - And there's a slice of me
01:00:38.740 | that actually kind of buys into that
01:00:41.340 | because people who make billions of dollars
01:00:46.340 | off of a patent and the patent came from literally
01:00:52.500 | a stroke of lightning that hits you
01:00:58.700 | as you lie half awake in bed,
01:01:01.500 | yeah, that's lucky, good for you.
01:01:06.020 | The way that that sometimes sort of explodes
01:01:09.740 | into something that looks to me a lot like exploitation,
01:01:14.140 | you see a lot of that in the drug industry.
01:01:19.060 | When you've got medications that cost you $100 a day
01:01:27.460 | and it's like, no.
01:01:36.940 | - Yeah, so the interesting thing about sort of open source,
01:01:41.220 | what bothers me is when something is not open source
01:01:47.300 | and because of that, it's a worse product.
01:01:51.380 | - Yeah.
01:01:52.220 | - So like, I mean, if I look at your just implementation
01:01:55.380 | of Emacs, like that could have been the dominant,
01:01:59.140 | like I use Emacs, that's my main ID,
01:02:01.300 | I apologize to the world, but I still love it.
01:02:05.060 | And I could have been using your implementation of Emacs
01:02:10.060 | and why aren't I?
01:02:13.020 | - So are you using the GNU Emacs?
01:02:16.340 | - I guess the default on Linux, is that GNU?
01:02:18.580 | - Yeah, and that through a strange passage
01:02:22.700 | started out as the one that I wrote.
01:02:24.500 | - Exactly, so it still has, yeah.
01:02:28.940 | - Well, and part of that was because
01:02:32.980 | in the last couple of years of grad school,
01:02:37.980 | it became really clear to me that I was either going
01:02:44.140 | to be Mr. Emacs forever or I was gonna graduate.
01:02:49.540 | - Got it.
01:02:52.660 | - I couldn't actually do both.
01:02:54.940 | - Was that a hard decision?
01:02:58.140 | That's so interesting to think about you as a,
01:03:00.540 | like it's a different trajectory that could have happened.
01:03:02.860 | - Yeah.
01:03:03.700 | - That's fascinating.
01:03:04.540 | - And maybe I could be fabulously wealthy today
01:03:10.300 | if I had become Mr. Emacs and Emacs had mushroomed
01:03:14.420 | into a series of text processing applications
01:03:19.300 | and all kinds of stuff and I would have,
01:03:22.500 | but I have a long history
01:03:28.420 | of financially suboptimal decisions
01:03:32.620 | because I didn't want that life, right?
01:03:37.620 | And I went to grad school because I wanted to graduate
01:03:43.500 | and being Mr. Emacs for a while was kind of fun
01:03:52.540 | and then it kind of became--
01:04:00.060 | - Not fun.
01:04:00.900 | - Not fun.
01:04:02.580 | And when it was not fun and I was,
01:04:07.580 | there was no way I could pay my rent, right?
01:04:14.780 | - Yeah.
01:04:17.580 | - And I was like, okay, do I carry on as a grad student?
01:04:21.580 | I had a research assistantship
01:04:25.420 | and I was sort of living off of that
01:04:27.300 | and I was trying to do my,
01:04:30.260 | I was doing all my RA work, all my
01:04:34.060 | being-a-grad-student work, and being Mr. Emacs
01:04:36.900 | all at the same time.
01:04:38.140 | And I decided to pick one.
01:04:44.060 | And one of the things that I did at the time
01:04:48.460 | was I went around all the people I knew on the ARPANET
01:04:53.620 | who might be able to take over looking after Emacs
01:04:58.620 | and pretty much everybody said, I got a day job.
01:05:04.100 | So I actually found a couple of folks
01:05:11.060 | in a garage in New Jersey, complete with a dog,
01:05:17.900 | who were willing to take it over,
01:05:23.140 | but they were gonna have to charge money.
01:05:25.220 | But my deal with them was that they would,
01:05:28.780 | only that they would make it free
01:05:32.420 | for universities and schools and stuff.
01:05:35.180 | And they said, sure.
01:05:36.860 | And that upset some people.
01:05:41.220 | - So you have some,
01:05:42.340 | now I don't know the full history of this,
01:05:44.020 | but I think it's kind of interesting.
01:05:46.940 | You have some tension with Mr. Richard Stallman.
01:05:52.560 | And he kind of represents this kind of,
01:05:56.800 | like you mentioned, free software,
01:05:59.360 | sort of a dogmatic focus on-
01:06:06.880 | - Yeah, all information must be free.
01:06:10.280 | - Must be free.
01:06:11.100 | So what, is there an interesting way to paint a picture
01:06:15.760 | of the disagreement you have with Richard through the years?
01:06:20.040 | - My basic opposition is that,
01:06:23.840 | when you take "information must be free"
01:06:28.240 | to a really extreme form, that turns into:
01:06:33.520 | all people whose job is the production of
01:06:41.200 | everything from movies to software,
01:06:49.440 | they must all take a vow of poverty
01:06:51.640 | because information must be free.
01:06:56.920 | And that doesn't work for me.
01:06:58.580 | Right, and I don't wanna be wildly rich.
01:07:04.760 | I am not wildly rich.
01:07:07.660 | I do okay.
01:07:10.760 | But I do actually,
01:07:18.880 | I can feed my children.
01:07:20.760 | - Yeah, I totally agree with you.
01:07:22.160 | It does just make me sad that sometimes
01:07:25.440 | the closing of the source,
01:07:27.040 | for some reason the people that,
01:07:29.640 | like a bureaucracy begins to build,
01:07:33.800 | and sometimes it doesn't, it hurts the product.
01:07:37.560 | - Oh, absolutely, absolutely.
01:07:39.520 | - It's always sad.
01:07:40.560 | - And there is a balance in there.
01:07:44.920 | - There's a balance.
01:07:47.280 | And it's not hard over rapacious capitalism,
01:07:52.280 | and it's not hard over in the other direction.
01:08:00.960 | And a lot of the open source movement,
01:08:07.880 | they have been managing to find a path
01:08:11.920 | to actually making money, right?
01:08:15.360 | So doing things like service and support
01:08:19.040 | works for a lot of people.
01:08:21.480 | And there are some ways where it's kind of,
01:08:29.520 | some of them are a little perverse, right?
01:08:35.920 | So as a part of things like the Sarbanes-Oxley Act
01:08:43.280 | that would be various people's interpretations
01:08:46.000 | of all kinds of accounting principles,
01:08:48.320 | and this is kind of a worldwide thing,
01:08:52.000 | but if you've got a corporation
01:08:55.280 | that is depending on some piece of software,
01:08:57.920 | often the various accounting and reporting standards say,
01:09:04.880 | if you don't have a support contract on this thing
01:09:08.560 | that your business is depending on,
01:09:11.960 | then that's bad.
01:09:14.800 | So if you've got a database, you need to pay for support.
01:09:21.120 | But there's a difference between
01:09:26.480 | the sort of support contracts
01:09:30.280 | that the average open source database producer charges
01:09:37.320 | and what somebody who is truly rapacious,
01:09:42.200 | like Oracle charges.
01:09:43.840 | - Yeah, so it's a balance, like you said.
01:09:47.400 | - It is absolutely a balance.
01:09:49.920 | And there are a lot of different ways
01:09:54.920 | to make the math work out for everybody.
01:10:05.160 | And the very unbalanced,
01:10:10.160 | like the winner takes all thing
01:10:18.320 | that happens in so much of modern commerce,
01:10:21.960 | that just doesn't work for me either.
01:10:25.240 | - I know you've talked about this in quite a few places,
01:10:31.640 | but you have created one of the most popular
01:10:36.600 | programming languages in the world.
01:10:39.360 | This is the programming language that I first learned
01:10:42.760 | about object-oriented programming with.
01:10:45.200 | I think it's a programming language
01:10:49.400 | that a lot of people use in a lot of different places
01:10:52.840 | and millions of devices today, Java.
01:10:55.920 | So the absurd question,
01:10:59.680 | but can you tell the origin story of Java?
01:11:03.200 | - So, a long time ago at Sun, in about 1990,
01:11:07.440 | there was a group of us who were kind of worried
01:11:12.440 | that there was stuff going on in the universe of computing
01:11:17.880 | that the computing industry was missing out on.
01:11:25.200 | And so a few of us started this project at Sun.
01:11:30.200 | It really got going,
01:11:32.480 | I mean, we started talking about it in 1990
01:11:34.960 | and it really got going in '91.
01:11:37.000 | And it was all about what was happening
01:11:44.160 | in terms of computing hardware,
01:11:48.080 | processors and networking and all of that
01:11:51.080 | that was outside of the computer industry.
01:11:54.440 | And that was everything from the sort of early glimmers
01:11:59.440 | of cell phones that were happening then to,
01:12:04.800 | you know, you look at elevators and locomotives
01:12:08.040 | and process control systems in factories
01:12:12.560 | and all kinds of audio equipment and video equipment.
01:12:17.560 | They all had processors in them
01:12:22.360 | and they were all doing stuff with them.
01:12:24.960 | And it sort of felt like there was something going on there
01:12:29.960 | that we needed to understand.
01:12:32.640 | And--
01:12:36.040 | - So C and C++ was in the air already.
01:12:39.040 | - Oh no, C and C++ absolutely owned the universe
01:12:42.320 | at that time.
01:12:43.160 | Everything was written in C and C++.
01:12:45.240 | - So where was the hunch
01:12:46.600 | that there was a need for a revolution?
01:12:48.880 | - Well, so the need for a revolution
01:12:50.800 | was not about a language.
01:12:54.680 | It was about, it was just as simple and vague
01:12:58.440 | as there are things happening out there.
01:13:03.440 | - We need to understand them.
01:13:05.160 | - We need to understand them.
01:13:06.680 | And so a few of us went on several
01:13:12.800 | somewhat epic road trips.
01:13:19.600 | - Literal road trips?
01:13:20.840 | - Literal road trips.
01:13:22.000 | It's like get on an airplane, go to Japan,
01:13:24.880 | visit Toshiba and Sharp and Mitsubishi
01:13:29.880 | and Sony and all of these folks.
01:13:33.920 | And because we worked for Sun,
01:13:36.560 | we had folks who were willing to give us introductions.
01:13:41.560 | We visited Samsung and a bunch of Korean companies
01:13:48.040 | and we went all over Europe, went to places
01:13:53.640 | like Philips and Siemens and Thomson.
01:13:56.840 | - What did you see there?
01:14:00.200 | - For me, one of the things that sort of leapt out
01:14:03.200 | was that they were doing all the usual computer things
01:14:07.480 | that people had been doing like 20 years before.
01:14:10.760 | The thing that really leapt out at me
01:14:13.240 | was that they were sort of reinventing computer networking
01:14:18.120 | and they were making all the mistakes
01:14:24.040 | that people in the computer industry had made.
01:14:28.840 | And since I had been doing a lot of work
01:14:31.000 | in the networking area, we'd go and visit Company X,
01:14:36.000 | they'd describe this networking thing that they were doing.
01:14:40.760 | And just without any thought, I could tell them
01:14:43.400 | like the 25 things that were going to be complete disasters
01:14:48.360 | with that thing that they were doing.
01:14:50.240 | And I don't know whether that had any impact on any of them,
01:14:55.840 | but that particular story of sort of repeating
01:15:00.840 | the disasters of the computer science industry was there.
01:15:08.120 | And one of the things we thought was,
01:15:10.320 | well, maybe we could do something useful here
01:15:13.080 | with like bringing them forward somewhat.
01:15:16.520 | But also at the same time,
01:15:20.240 | we learned a bunch of things from these,
01:15:26.400 | mostly consumer electronics companies.
01:15:29.520 | And high on the list was that
01:15:38.160 | they viewed their like relationship
01:15:41.960 | with the customer as sacred.
01:15:43.560 | They were never, ever willing to make trade-offs
01:15:50.640 | when it came to safety, right?
01:15:56.680 | So one of the things that had always made me nervous
01:16:01.680 | in the computer industry was that
01:16:07.120 | people were willing to make trade-offs in reliability
01:16:11.120 | to get performance.
01:16:12.680 | They want faster, faster,
01:16:17.720 | it breaks a little more often because it's fast.
01:16:20.520 | Maybe you run it a little hotter than you should,
01:16:22.960 | or like the one that always blew my mind
01:16:26.640 | was the way that the folks at Cray Supercomputers
01:16:33.200 | got their division to be really fast
01:16:38.200 | was that they did Newton-Raphson approximations.
01:16:43.040 | And so, the bottom several bits of A over B
01:16:49.360 | were essentially random numbers.
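For context, the trick being described works roughly like this (a hedged Java sketch, not Cray's actual implementation; the function name and seed choice are just for illustration): refine an estimate x of 1/b with x = x * (2 - b*x), then compute a/b as a*x. Each step roughly doubles the number of correct bits, so stopping the iteration early leaves the low-order bits of the quotient as effectively noise.

```java
public class NewtonDivideSketch {
    // Approximate a / b by multiplying a with a Newton-Raphson estimate of 1/b.
    static double nrDivide(double a, double b, int iterations) {
        double x = 1.0f / (float) b;      // deliberately crude seed, good to roughly 24 bits
        for (int i = 0; i < iterations; i++) {
            x = x * (2.0 - b * x);        // each refinement roughly doubles the correct bits
        }
        return a * x;
    }

    public static void main(String[] args) {
        System.out.println(1.0 / 3.0);             // the correctly rounded quotient
        System.out.println(nrDivide(1.0, 3.0, 1)); // stopped early: the bottom bits differ
    }
}
```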
01:16:53.680 | - What could possibly go wrong?
01:16:56.960 | - What could go wrong, right?
01:16:59.840 | And just figuring out how to nail the bottom bit,
01:17:04.840 | how to make sure that if you put a piece of toast
01:17:13.200 | in a toaster, it's not going to kill the customer.
01:17:17.280 | It's not gonna burst into flames and burn the house down.
01:17:21.800 | - So those are, I guess those are the principles
01:17:25.400 | that were inspiring, but how did,
01:17:28.800 | from the days of Java's called Oak,
01:17:33.200 | because of a tree outside the window story
01:17:35.080 | that a lot of people know,
01:17:36.760 | how did it become this incredible, like,
01:17:40.640 | powerful language?
01:17:44.040 | - Well, so it was a bunch of things.
01:17:46.920 | So we, you know, after all that, we started,
01:17:49.800 | you know, the way that we decided
01:17:51.320 | that we could understand things better
01:17:54.640 | was by building a demo, building a prototype of something.
01:17:59.440 | So kind of because it was easy and fun,
01:18:02.720 | we decided to build a control system
01:18:05.520 | for some home electronics, you know, TV, VCR,
01:18:09.440 | that kind of stuff.
01:18:11.240 | And as we were building it, we, you know,
01:18:14.440 | we sort of discovered that there were some things
01:18:18.240 | about standard practice in C programming
01:18:22.080 | that were really getting in the way.
01:18:26.560 | And it wasn't exactly, you know,
01:18:29.840 | because we were writing this,
01:18:31.440 | all the C code and C++ code,
01:18:34.240 | that we couldn't write it to do the right thing,
01:18:37.840 | but that one of the things that was weird in the group
01:18:42.440 | was that we had a guy whose, you know,
01:18:49.560 | sort of top-level job was that he was a business guy.
01:18:52.760 | You know, he was sort of an MBA kind of person,
01:18:56.320 | you know, think about business plans and all of that.
01:18:58.840 | And, you know, there were a bunch of things
01:19:03.280 | that were kind of, you know,
01:19:05.320 | and we would talk about things that were going wrong
01:19:10.160 | and things that were going right.
01:19:11.360 | And, you know, as we thought about, you know,
01:19:14.360 | things like the requirements for security and safety,
01:19:19.120 | we would look at some low level details
01:19:21.600 | and see like naked pointers.
01:19:24.200 | - Yeah.
01:19:25.120 | - And, you know, so back in the early nineties,
01:19:30.080 | it was well understood that, you know,
01:19:36.400 | the number one source of like security vulnerabilities.
01:19:41.400 | - Is pointers.
01:19:42.520 | - Was just pointers, was just bugs.
01:19:44.880 | - Yeah.
01:19:46.040 | - And it was like, you know, 50, 60, 70%
01:19:50.160 | of all security vulnerabilities were bugs.
01:19:53.360 | And the vast majority of them were like buffer overflows.
01:19:57.240 | - Yeah.
01:19:58.080 | So you're like, we have to fix this.
01:20:00.240 | - We have to make sure that this cannot happen.
01:20:03.200 | And that was kind of the original thing for me
01:20:07.440 | was this cannot continue.
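To make the contrast concrete, a minimal Java sketch (the buffer size and index are arbitrary): the out-of-bounds write that in C could silently corrupt neighboring memory is checked on every array access in Java, so the bug surfaces immediately and visibly at the offending line instead of becoming a security hole.

```java
public class BoundsDemo {
    public static void main(String[] args) {
        byte[] buffer = new byte[16];
        int index = 32;  // well past the end of the buffer

        try {
            buffer[index] = 1;  // in C this could silently scribble over other memory
        } catch (ArrayIndexOutOfBoundsException e) {
            // In Java the bad write cannot happen; it fails loudly, right here.
            System.out.println("Out-of-bounds write rejected: " + e.getMessage());
        }
    }
}
```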
01:20:11.600 | And one of the things I find really entertaining this year
01:20:16.600 | was I forget which rag published it,
01:20:21.640 | but there was this article that came out
01:20:24.960 | that was an examination.
01:20:28.040 | It was sort of the result of an examination
01:20:31.320 | of all the security vulnerabilities in Chrome.
01:20:34.560 | And Chrome is like a giant piece of C++ code.
01:20:39.320 | And 60 or 70% of all the security vulnerabilities
01:20:44.240 | were stupid pointer tricks.
01:20:46.960 | And I thought it's 30 years later and we're still there.
01:20:53.320 | - Still there.
01:20:55.280 | - And we're still there.
01:20:56.440 | And you know, that's one of those, you know,
01:20:59.680 | slap your forehead and just want to cry.
01:21:04.680 | - Would you attribute,
01:21:07.120 | or is that too much of a simplification,
01:21:09.200 | but would you attribute the creation of Java
01:21:11.440 | to C pointers?
01:21:14.720 | Obvious problem.
01:21:16.800 | - Well, I mean, that was one of the trigger points.
01:21:21.800 | - And concurrency, you've mentioned.
01:21:23.560 | - Concurrency was a big deal.
01:21:25.640 | You know, because when you're interacting with people,
01:21:30.720 | you know, the last thing you ever want to see
01:21:32.840 | is the thing like waiting.
01:21:35.880 | And, you know, issues about the software development process,
01:21:40.880 | you know, when faults happen, can you recover from them?
01:21:45.960 | What can you do to make it easier to create
01:21:50.320 | and eliminate complex data structures?
01:21:54.520 | What can you do to fix, you know,
01:21:57.320 | one of the most common C problems, which is storage leaks.
01:22:03.720 | And its evil twin, the freed
01:22:07.040 | but still being used piece of memory.
01:22:14.160 | You know, you free something and then you keep using it.
01:22:17.920 | - Oh, yeah.
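A small Java sketch of how garbage collection removes both halves of that problem (WeakReference is only used here to observe collection, and System.gc() is just a hint, so the output may vary): there is no free(), so a reference can never point at memory that has already been reclaimed and reused.

```java
import java.lang.ref.WeakReference;

public class NoUseAfterFree {
    public static void main(String[] args) {
        byte[] data = new byte[1 << 20];
        WeakReference<byte[]> watcher = new WeakReference<>(data);

        data = null;   // drop the last strong reference; there is no explicit free()
        System.gc();   // only a hint: the collector may or may not run right away

        // Either the object is still reachable through the weak reference, or it has
        // been reclaimed and get() returns null. There is no state in which a live
        // reference points at memory that has already been freed and reused.
        System.out.println(watcher.get() == null ? "collected" : "still reachable");
    }
}
```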
01:22:19.240 | - You know, so when I was originally thinking about that,
01:22:21.680 | I was thinking about it in terms of,
01:22:23.760 | of sort of safety and security issues.
01:22:26.760 | And one of the things I sort of came to believe,
01:22:28.920 | came to understand was that it wasn't just about safety
01:22:32.720 | and security, but it was about developer velocity, right?
01:22:37.720 | So, and I got really religious about this
01:22:43.440 | because at that point I had spent an ungodly amount
01:22:46.680 | of my life hunting down mystery pointer bugs.
01:22:51.680 | And, you know, like two thirds of my time
01:22:59.400 | as a software developer was, you know,
01:23:02.000 | because the mystery pointer bugs tend to be
01:23:04.360 | the hardest to find because they tend to be
01:23:08.840 | very, very statistical.
01:23:11.960 | The ones that hurt, you know, they're, you know,
01:23:14.080 | they're like a one in a million chance.
01:23:16.440 | And--
01:23:20.800 | - But nevertheless create an infinite amount of suffering.
01:23:23.520 | - Right.
01:23:24.600 | Because when you're doing a billion operations a second,
01:23:27.720 | - Yeah.
01:23:28.560 | - You know, one in a million chance means
01:23:31.640 | it's going to happen.
01:23:32.800 | And so I got really religious about this thing,
01:23:38.520 | about, you know, making it so that if something fails,
01:23:41.360 | it fails immediately and visibly.
01:23:44.120 | And, you know, one of the things that was a real
01:23:50.120 | attraction of Java to lots of development shops was that,
01:23:55.560 | you know, we get our code up and running twice as fast.
01:24:01.080 | - You mean like the entirety of the development process,
01:24:03.640 | debugging, all that kind of stuff?
01:24:05.000 | - Yeah, if you, you know, so if you measure time from,
01:24:09.560 | you know, you first touch fingers to keyboard
01:24:13.840 | until you get your first demo out,
01:24:15.880 | not much different.
01:24:21.120 | But if you look from fingers touching keyboard
01:24:23.720 | to solid piece of software that you could release
01:24:27.960 | in production, it would be way faster.
01:24:32.200 | - And I think what people don't often realize is,
01:24:34.600 | yeah, there's things that really slow you down,
01:24:36.800 | like it's hard to catch bugs probably is the thing
01:24:41.800 | that really slows down that.
01:24:43.240 | - It really slows things down.
01:24:45.080 | But also there were, you know, one of the things
01:24:47.720 | that you get out of object oriented programming
01:24:51.320 | is a strict methodology about, you know,
01:24:53.520 | what are the interfaces between things?
01:24:56.000 | And being really clear about how parts relate to each other.
01:25:00.760 | And what that helps with is so many times what people do
01:25:07.200 | is they kind of like sneak around the side.
01:25:12.520 | So if you've built something and people are using it,
01:25:15.840 | and then, and you say, well, okay, you know,
01:25:20.320 | I built this thing, you use it this way.
01:25:23.920 | And then you change it in such a way that it still does
01:25:27.840 | what you said it does,
01:25:28.680 | it just does it a little bit different.
01:25:30.400 | Then you find out that somebody out there
01:25:33.400 | was sneaking around the side,
01:25:35.800 | they sort of tunneled in a back door,
01:25:38.360 | and this person, their code broke.
01:25:42.040 | And because they were sneaking through a side door.
01:25:47.040 | And normally the attitude is,
01:25:52.800 | (laughs)
01:25:54.200 | dummy.
01:25:55.040 | But a lot of times, you know, you can't get away,
01:26:01.440 | you can't just slap their hand and tell them to not do that.
01:26:06.800 | Because, you know, it's, you know, somebody's,
01:26:10.800 | you know, some banks, you know,
01:26:15.240 | account reconciliation system that, you know,
01:26:19.320 | some developer decided, oh, I'm lazy, you know,
01:26:22.800 | I'll just sneak through the back door.
01:26:24.880 | - And because the language allows it,
01:26:26.440 | I mean, you can't even be mad at them.
01:26:28.120 | - Yeah, and so one of the things I did
01:26:30.360 | that on the one hand upset a bunch of people,
01:26:33.680 | it was that I made it so that
01:26:36.040 | you really couldn't go through back doors, right?
01:26:38.800 | So the whole point of that was to say,
01:26:41.400 | if you need, you know, if the interface here isn't right,
01:26:47.160 | the wrong way to deal with that is go through a back door.
01:26:51.440 | The right way to deal with it is to walk up
01:26:53.400 | to the developer of this thing and say,
01:26:55.960 | - Change the interface.
01:26:56.800 | - Fix it.
01:26:57.640 | - Yep.
01:26:58.480 | - Right, and so it was kind of like
01:27:00.160 | a social engineering thing.
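As a rough illustration of the "no back doors" point (the class and method names here are hypothetical): callers can only reach the object through its declared interface, so the internals can change later without breaking anyone who tried to sneak around the side.

```java
// A hypothetical ledger: the balance can only be touched through the public methods.
class Ledger {
    private long balanceCents;   // internal representation, free to change later

    public void deposit(long cents) {
        if (cents < 0) throw new IllegalArgumentException("negative deposit");
        balanceCents += cents;
    }

    public long balance() {
        return balanceCents;
    }
}

class Client {
    void reconcile(Ledger ledger) {
        ledger.deposit(500);
        // ledger.balanceCents = -1;   // does not compile: the back door is closed
        System.out.println(ledger.balance());
    }
}
```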
01:27:01.800 | - Yeah.
01:27:02.640 | - And,
01:27:03.480 | - It's brilliant.
01:27:05.480 | - And people ended up discovering
01:27:07.160 | that that really made a difference in terms of,
01:27:12.160 | you know, and a bunch of this stuff, you know,
01:27:14.560 | if you're just like screwing around,
01:27:16.120 | writing your own, like, you know,
01:27:18.160 | class project scale stuff,
01:27:20.040 | a lot of this stuff doesn't,
01:27:22.680 | isn't quite so important because, you know,
01:27:27.120 | you're, you know, both sides of the interface,
01:27:29.640 | but, you know, when you're building, you know,
01:27:34.000 | sort of larger, more complex pieces of software
01:27:37.760 | that have a lot of people working on them,
01:27:40.040 | and especially when they like span organizations,
01:27:46.080 | you know, having really clear,
01:27:49.280 | having clarity about how that stuff gets structured
01:27:52.920 | saves your life.
01:27:54.000 | - Yeah.
01:27:54.840 | - And, you know, especially, you know,
01:27:58.920 | there's so much software that is fundamentally untestable,
01:28:03.120 | you know, until you do the real thing.
01:28:08.120 | - It's better to write good code in the beginning
01:28:12.000 | as opposed to writing crappy code
01:28:13.640 | and then trying to fix it and,
01:28:15.440 | - Yeah. - trying to scramble
01:28:16.760 | and figure out and through testing,
01:28:19.120 | figure out where the bugs are.
01:28:20.400 | - Yeah, it's like,
01:28:23.360 | which shortcut caused that rocket
01:28:28.080 | to not get where it needed to go?
01:28:30.680 | - So I think one of the most beautiful ideas
01:28:36.640 | philosophically and technically is of a virtual machine,
01:28:41.760 | the Java virtual machine.
01:28:45.240 | Again, I apologize to romanticize things,
01:28:47.480 | but how did the idea of the JVM come to be?
01:28:52.480 | How radical of an idea is it?
01:28:56.480 | 'Cause it seems to me to be just a really interesting idea
01:29:01.480 | in the history of programming.
01:29:03.520 | - So, - And what is it?
01:29:05.720 | - So the Java virtual machine,
01:29:07.720 | you can think of it in different ways
01:29:14.600 | because it was carefully designed
01:29:17.120 | to have different ways of viewing it.
01:29:19.840 | So one view of it that most people
01:29:22.840 | don't really realize is there,
01:29:25.120 | is that you can view it as sort of an encoding
01:29:30.120 | of the abstract syntax tree in reverse Polish notation.
01:29:36.520 | I don't know if that makes any sense at all.
01:29:41.560 | I could explain it, and that would blow all of our time.
01:29:44.280 | - Yeah.
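A quick illustration of what that means (the method is made up, but the instruction mnemonics are real JVM bytecodes): for an expression like a + b * c, a postfix (reverse Polish) walk of its syntax tree is a, b, c, multiply, add, and JVM bytecode is essentially that walk written down as instructions for a stack machine.

```java
public class PostfixDemo {
    // For "a + b * c", javac emits (roughly) the postfix walk of the syntax tree
    // as stack-machine instructions; with a, b, c in local slots 1, 2, 3 of an
    // instance method, that is:
    //
    //   iload_1   // push a
    //   iload_2   // push b
    //   iload_3   // push c
    //   imul      // pop b and c, push b * c
    //   iadd      // pop a and b*c, push a + b*c
    //   ireturn   // return the value on top of the stack
    int combine(int a, int b, int c) {
        return a + b * c;
    }

    public static void main(String[] args) {
        System.out.println(new PostfixDemo().combine(2, 3, 4)); // prints 14
    }
}
```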
01:29:45.120 | - But the other way to think of it
01:29:46.960 | and the way that it ends up being explained
01:29:50.160 | is that it's like the instruction set of an abstract machine
01:29:55.160 | that's designed such that you can translate
01:30:00.520 | that abstract machine to a physical machine.
01:30:03.120 | And the reason that that's important,
01:30:07.960 | so if you wind back to the early '90s
01:30:10.560 | when we were talking to all of these companies
01:30:14.000 | doing consumer electronics
01:30:16.880 | and you talk to the purchasing people,
01:30:20.760 | there were interesting conversations with purchasing.
01:30:25.440 | So if you look at how these devices come together,
01:30:32.000 | they're sheet metal and gears and circuit boards
01:30:36.120 | and capacitors and resistors and stuff.
01:30:40.120 | And everything you buy has multiple sources, right?
01:30:45.120 | So you can buy a capacitor from here,
01:30:49.840 | you can buy a capacitor from there
01:30:52.320 | and you've got kind of a market
01:30:54.360 | so that you can actually get a decent price for a capacitor.
01:30:59.360 | But CPUs and particularly in the early '90s
01:31:08.480 | CPUs were all different and all proprietary.
01:31:13.480 | So if you use the chip from Intel,
01:31:18.200 | you had to be an Intel customer
01:31:22.600 | till the end of time.
01:31:25.480 | Because if you wrote a bunch of software,
01:31:28.520 | when you wrote software using whatever technique you wanted
01:31:33.520 | and C was particularly important
01:31:36.080 | and C was particularly bad about this
01:31:39.560 | because there was a lot of properties
01:31:43.760 | of the underlying machine that came through.
01:31:47.480 | - So you were stuck, so the code you wrote,
01:31:50.120 | you were stuck to that particular machine.
01:31:51.760 | - You were stuck to that particular machine,
01:31:54.280 | which meant that they couldn't decide,
01:31:56.960 | you know, Intel is screwing us.
01:32:01.480 | I'll start buying chips from, you know, Bob's Better Chips.
01:32:06.480 | This drove the purchasing people absolutely insane
01:32:12.960 | that they were welded into this decision.
01:32:19.400 | And they would have to make this decision
01:32:22.880 | before the first line of software was written.
01:32:25.920 | - It's funny that you're talking about the purchasing people.
01:32:28.120 | So that's one perspective, right?
01:32:31.320 | There's a lot of other perspectives
01:32:32.760 | that all probably hated this idea.
01:32:35.280 | - Right.
01:32:36.120 | - But from a technical aspect,
01:32:37.560 | just like the creation of an abstraction layer
01:32:41.560 | that's agnostic to the underlying machine
01:32:45.320 | from the perspective of the developer,
01:32:48.280 | I mean, that's brilliant.
01:32:50.120 | - Right, well, and, you know,
01:32:54.320 | so that's like across the spectrum of providers of chips.
01:32:58.520 | But then there's also the time thing
01:33:00.720 | because, you know, as you went from one generation
01:33:04.840 | to the next generation to the next generation,
01:33:06.840 | they were all different.
01:33:07.960 | And you would often have to rewrite your software.
01:33:10.200 | - Oh, you mean generations of machines of different kinds?
01:33:14.760 | - Yeah, so like one of the things
01:33:18.520 | that sucked about a year out of my life
01:33:20.520 | was when Sun went from the Motorola 68010 processor
01:33:28.400 | to the 68020 processor.
01:33:31.640 | Then they had a number of differences
01:33:33.960 | and one of them hit us really hard.
01:33:36.720 | And I ended up being the point guy
01:33:41.720 | on the worst case of where
01:33:44.120 | the new instruction cache architecture hurt us.
01:33:49.160 | - Well, okay, so I mean, so when did this idea,
01:33:52.760 | I mean, okay, so yeah, you articulate a really clear
01:33:56.480 | fundamental problem in all of computing,
01:33:59.240 | but where do you get the guts to think
01:34:02.200 | we can actually solve this?
01:34:05.360 | - You know, in our conversations with all of these vendors,
01:34:09.480 | you know, these problems started to show up.
01:34:12.800 | And I kind of had this epiphany
01:34:19.320 | because it reminded me of a summer job
01:34:27.080 | that I had had in grad school.
01:34:29.520 | So back in grad school, my thesis advisor,
01:34:36.360 | well, I had two thesis advisors for bizarre reasons.
01:34:41.880 | One of them was a guy named Raj Reddy,
01:34:44.960 | the other one was Bob Sproull.
01:34:46.920 | And Raj, I love Raj, I really love both of them.
01:34:53.520 | - Yeah, Raj is amazing.
01:34:56.240 | - So the department had bought a bunch of
01:35:01.240 | like early workstations
01:35:06.480 | from a company called Three Rivers Computer Company.
01:35:09.680 | And Three Rivers Computer Company
01:35:13.440 | was a bunch of electrical engineers
01:35:15.360 | who wanted to do as little software as possible.
01:35:17.880 | So they knew that they'd need to have like compilers
01:35:23.080 | and an OS and stuff like that,
01:35:24.520 | but they didn't wanna do any of that.
01:35:26.920 | And they wanted to do that
01:35:28.320 | for as close to zero money as possible.
01:35:31.360 | So what they did was they built a machine
01:35:36.360 | whose instruction set was literally the byte code
01:35:43.120 | for UCSD Pascal, the P code.
01:35:50.680 | And so we had a bunch of software
01:35:55.120 | that was written for this machine.
01:36:00.480 | And for various reasons,
01:36:05.200 | the company wasn't doing terrifically well.
01:36:08.480 | We had all this software on these machines
01:36:10.200 | and we wanted it to run on other machines,
01:36:12.800 | principally the VAX.
01:36:19.800 | And so Raj asked me if I could come up with a way
01:36:24.240 | to port all of this software
01:36:27.440 | - Translate it.
01:36:28.280 | - From the PERQ machines to VAXes.
01:36:33.280 | And I think he, you know what he had in mind
01:36:40.240 | was something that would translate from like Pascal to C
01:36:48.080 | or Pascal to, actually at those times,
01:36:51.480 | pretty much it was, you could translate to C or C.
01:36:55.560 | And if you didn't like translate to C,
01:36:57.120 | you could translate to C.
01:36:58.720 | There was, you know, it's like the Henry Ford,
01:37:04.440 | you know, any color you want,
01:37:05.920 | just as long as it's black.
01:37:07.320 | And I went, that's really hard.
01:37:13.760 | - That's a...
01:37:16.720 | - And I noticed that, you know,
01:37:19.160 | and I was like looking at stuff and I went,
01:37:21.480 | ooh, I bet I could rewrite the P code
01:37:26.000 | into VAX assembly code.
01:37:27.840 | And then I started to realize that, you know,
01:37:33.880 | there were some properties of P code
01:37:36.000 | that made that really easy,
01:37:38.600 | some properties that made it really hard.
01:37:40.840 | So I ended up writing this thing that translated
01:37:44.760 | from P code on the Three Rivers PERQs
01:37:49.760 | into assembly code on the VAX.
01:37:52.440 | And I actually got higher quality code than the C compiler.
01:37:58.920 | And so everything just went, got really fast.
01:38:03.360 | It was really easy.
01:38:04.360 | It was like, wow, I thought that was a sleazy hack
01:38:08.800 | 'cause I was lazy.
01:38:10.880 | And in actual fact, it worked really well.
01:38:14.200 | And I tried to convince people
01:38:16.880 | that that was maybe a good thesis topic.
01:38:19.320 | And nobody was, you know, it was like, nah.
01:38:26.440 | - Really?
01:38:27.280 | That's, I mean, yeah.
01:38:29.000 | It's kind of a brilliant idea, right?
01:38:31.400 | Maybe you didn't have the,
01:38:34.360 | you weren't able to articulate the big picture of it.
01:38:37.280 | - Yeah, and I think, you know, that was a key part.
01:38:42.560 | But so then, you know, clock comes forward a few years
01:38:47.000 | and it's like, we've got to be able to, you know,
01:38:51.400 | if they want to be able to switch from, you know,
01:38:54.240 | this weird microprocessor to that weird
01:38:57.160 | and totally different microprocessor, how do you do that?
01:39:00.840 | And I kind of went, oh, maybe by doing something
01:39:05.840 | kind of in the space of, you know, Pascal P code,
01:39:11.760 | you know, I could do like multiple translators.
01:39:14.640 | And I spent some time thinking about that
01:39:17.320 | and thinking about, you know, what worked
01:39:18.920 | and what didn't work when I did the P code to VAX translator.
01:39:23.920 | And I talked to some of the folks
01:39:29.480 | who were involved in Smalltalk
01:39:30.840 | because Smalltalk also did a bytecode.
01:39:32.960 | And then I kind of went, yeah, I want to do that.
01:39:41.400 | 'Cause that, you know, and it had the other advantage
01:39:45.040 | that you could either interpret it or compile it.
01:39:48.600 | And interpreters are usually easier to do
01:39:55.120 | but not as fast as a compiler.
01:39:58.800 | And so I figured, good, I can be lazy again.
01:40:03.720 | You know, sometimes I think that most of my good ideas
01:40:09.040 | are driven by laziness.
01:40:12.480 | And often I find that some of the people's stupidest ideas
01:40:15.920 | are because they're insufficiently lazy.
01:40:17.920 | They just want to build something really complicated.
01:40:23.960 | It's like, it doesn't need to be that complicated.
01:40:26.840 | Yeah, and so that's how that came out.
01:40:30.600 | You know, but that also turned into kind of,
01:40:37.600 | you know, almost a religious position on my part,
01:40:39.920 | which got me in several other fights.
01:40:44.400 | So like one of the things that was a real difference
01:40:49.120 | was the way that arithmetic worked.
01:40:51.240 | You know, once upon a time,
01:40:57.840 | there were, you know, it wasn't always
01:40:59.400 | just two's complement arithmetic.
01:41:01.600 | There were some machines
01:41:02.440 | that had one's complement arithmetic,
01:41:04.200 | which was like almost anything built by CDC.
01:41:07.120 | And occasionally there were machines
01:41:10.680 | that were decimal arithmetic.
01:41:12.240 | And I was like, this is crazy.
01:41:16.240 | You know, pretty much two's complement
01:41:20.400 | integer arithmetic has one.
01:41:22.680 | So just--
01:41:24.000 | - Let's just do that.
01:41:26.080 | - Just do that.
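A tiny illustration of what fixing that in the language buys (a sketch; nothing here is platform-specific, which is the point): Java defines int as 32-bit two's complement everywhere, so overflow wraps the same way on every machine, unlike, say, signed overflow in C, which is left undefined.

```java
public class TwosComplementDemo {
    public static void main(String[] args) {
        int max = Integer.MAX_VALUE;      //  2147483647
        System.out.println(max + 1);      // -2147483648: wraparound is defined, same everywhere
        System.out.println(-8 >> 1);      // -4: >> is an arithmetic shift on two's complement ints
    }
}
```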
01:41:27.320 | One of the other places where there was a lot of variability
01:41:31.240 | was in the way that floating point behaved.
01:41:34.640 | And that was causing people
01:41:37.880 | throughout the software industry much pain
01:41:42.280 | because you couldn't do a numerical computing library
01:41:46.520 | that would work on CDC
01:41:48.240 | and then have it work on an IBM machine
01:41:50.080 | and then work on a DEC machine.
01:41:52.080 | And as a part of that whole struggle,
01:41:56.880 | there had been this big body of work
01:42:00.200 | on floating point standards.
01:42:04.640 | And this thing emerged that came to be called IEEE 754,
01:42:09.480 | which is the floating point standard
01:42:12.920 | that pretty much has taken over the entire universe.
01:42:16.680 | And at the time I was doing Java,
01:42:21.160 | it had pretty much completed taking over the universe.
01:42:25.000 | There were still a few pockets of holdouts,
01:42:27.480 | but I was like, you know,
01:42:30.400 | it's important to be able to say what two plus two means.
01:42:33.240 | - Yeah.
01:42:40.000 | - And so I went with that.
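And a small sketch of what "saying what two plus two means" looks like in practice: Java's double is IEEE 754 binary64 on every platform, so a computation like 0.1 + 0.2 gives a slightly-off but bit-for-bit identical answer everywhere, which is what makes a numerical library portable.

```java
public class Ieee754Demo {
    public static void main(String[] args) {
        double x = 0.1 + 0.2;
        System.out.println(x);   // 0.30000000000000004 -- not exactly 0.3, but the same everywhere
        // The underlying bit pattern is identical on every conforming JVM.
        System.out.println(Long.toHexString(Double.doubleToLongBits(x)));
    }
}
```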
01:42:40.000 | And one of the ways that I got into fights with people
01:42:45.440 | was that there were a few machines
01:42:48.320 | that did not implement IEEE 754 correctly.
01:42:52.960 | - Of course, that's all short term.
01:42:57.000 | - Short term kind of fights.
01:42:58.600 | I think in the long term, this vision has won out.
01:43:03.200 | - Yeah, and I think it's, you know,
01:43:05.080 | and it worked out over time.
01:43:06.240 | I mean, the biggest fights were with Intel
01:43:09.280 | because they had done some strange things with rounding.
01:43:15.520 | They'd done some strange things
01:43:16.840 | with their transcendental functions,
01:43:19.480 | which turned into a mushroom cloud of, you know, weirdness.
01:43:25.000 | And in the name of optimization,
01:43:27.520 | but from the perspective of the developer,
01:43:30.600 | that's not good.
01:43:32.680 | - Well, their issues with transcendental functions
01:43:34.640 | were just stupid.
01:43:35.920 | - Okay, so that's not even a trade off.
01:43:39.440 | That's just absolutely.
01:43:40.800 | - Yeah, they were doing range reduction
01:43:42.960 | for sine and cosine using a slightly wrong value for pi.
01:43:47.960 | - Got it.
01:43:49.760 | We've got 10 minutes.
01:43:50.680 | So in the interest of time, two questions.
01:43:53.480 | So one about Android and one about life.
01:43:55.680 | So one, I mean, we could talk for many more hours.
01:44:02.400 | I hope eventually we might talk again,
01:44:05.280 | but I gotta ask you about Android
01:44:07.520 | and the use of Java there
01:44:09.960 | because it's one of the many places where Java
01:44:13.200 | just has a huge impact on this world.
01:44:16.520 | Just on your opinion,
01:44:18.200 | is there things that make you happy
01:44:22.520 | about the way Java is used in the Android world?
01:44:25.960 | And are there things that you wish were different?
01:44:29.760 | - I don't know how to do a short answer to that,
01:44:32.800 | but I have to do a short answer to that.
01:44:34.520 | So, you know, I'm happy that they did it.
01:44:37.160 | Java had been running on cell phones at that time
01:44:41.720 | for quite a few years and it worked really, really well.
01:44:44.520 | There were things about how they did it.
01:44:49.480 | And in particular,
01:44:51.640 | various ways that they
01:44:55.800 | kind of, you know, violated all kinds of contracts.
01:45:00.680 | The guy who led it, Andy Rubin,
01:45:03.360 | he crossed a lot of lines.
01:45:06.560 | - There's some lines crossed.
01:45:07.880 | - Yeah, lines were crossed
01:45:09.800 | that have since, you know, mushroomed into giant court cases.
01:45:16.840 | And, you know, they didn't need to do that.
01:45:19.840 | And in fact, it would have been so much cheaper
01:45:22.920 | for them to not cross lines.
01:45:25.440 | - I mean, I suppose they didn't anticipate
01:45:28.320 | the success
01:45:30.000 | of this whole endeavor.
01:45:33.800 | Or do you think at that time it was already clear
01:45:36.600 | that this is gonna blow up?
01:45:38.880 | - I guess I sort of came to believe
01:45:42.040 | that it didn't matter what Andy did, it was gonna blow up.
01:45:46.080 | (laughing)
01:45:47.880 | - Okay.
01:45:48.720 | - He's, you know, I kind of started to think of him
01:45:52.360 | as like a manufacturer of bombs.
01:45:55.840 | - Yeah.
01:45:59.240 | Some of the best things in this world come about
01:46:02.040 | through a little bit of explosive.
01:46:05.840 | - Well, and some of the worst.
01:46:07.160 | - And some of the worst, beautifully put.
01:46:09.280 | But is there, and like you said, I mean,
01:46:13.040 | does that make you proud that Java is in--
01:46:16.640 | - Yeah.
01:46:17.480 | - Is in millions, I mean, it could be billions of devices.
01:46:21.720 | - Yeah, well, I mean, it was in billions of phones
01:46:25.040 | before Android came along.
01:46:26.560 | And, you know, I'm just as proud as, you know,
01:46:33.240 | of the way that like the smart card standards adopted Java.
01:46:38.240 | And they did a, you know, everybody involved in that
01:46:41.960 | did a really good job.
01:46:43.320 | And that's, you know, billions and billions.
01:46:46.840 | - That's crazy.
01:46:49.560 | - The SIM cards, you know, the SIM cards in your pocket.
01:46:53.680 | - Yeah, I mean, it's--
01:46:54.520 | - I've been outside of that world for a decade,
01:46:57.000 | so I don't know how that has evolved,
01:47:00.160 | but, you know, it's just been crazy.
01:47:04.520 | - So on that topic, let me ask,
01:47:06.760 | again, there's a million technical things
01:47:11.360 | we could talk about, but let me ask the absurd,
01:47:14.080 | the old philosophical question about life.
01:47:17.940 | What do you hope when you look back at your life
01:47:23.240 | and people talk about you, write about you
01:47:27.360 | 500 years from now, what do you hope your legacy is?
01:47:31.400 | - People not being afraid to take a leap of faith.
01:47:39.920 | I mean, you know, I've got this kind of weird history
01:47:44.440 | of doing weird stuff and--
01:47:46.160 | - It worked out pretty damn well.
01:47:49.320 | - It worked out, right?
01:47:50.920 | And I think some of the weirder stuff that I've done
01:47:53.680 | has been the coolest and some of it crashed and burned.
01:48:00.120 | You know, I think well over half of the stuff
01:48:05.320 | that I've done has crashed and burned,
01:48:08.720 | which has occasionally been really annoying.
01:48:10.920 | - But still you kept doing it.
01:48:13.760 | - But yeah, yeah, yeah.
01:48:16.120 | And you know, even when things crash and burn,
01:48:19.720 | you at least learn something from it.
01:48:22.360 | - By way of advice, you know, people, developers,
01:48:26.320 | engineers, scientists, or just people who are young
01:48:30.320 | to look up to you, what advice would you give them?
01:48:34.480 | How to approach their life?
01:48:37.720 | - Don't be afraid of risk.
01:48:39.640 | It's okay to do stupid things once.
01:48:42.180 | (both laughing)
01:48:45.440 | - Maybe a couple of times.
01:48:46.880 | - You know, you get a pass on the first time or two
01:48:50.920 | that you do something stupid.
01:48:53.080 | You know, the third or fourth time, yeah, not so much.
01:48:55.760 | But also, you know, I don't know why,
01:49:04.400 | but really early on I started to think about
01:49:09.400 | ethical choices in my life.
01:49:13.000 | And because I'm a big science fiction fan,
01:49:18.040 | I got to thinking about like just about
01:49:24.520 | every technical decision I make in terms of,
01:49:28.640 | how do you want, you know,
01:49:29.720 | are you building Blade Runner or Star Trek?
01:49:33.880 | - Which one's better?
01:49:34.720 | - Which future would you rather live in?
01:49:37.200 | You know?
01:49:39.080 | - So what's the answer to that?
01:49:40.440 | - Well, I would sure rather live
01:49:43.280 | in the universe of Star Trek.
01:49:45.160 | - Yeah, that opens up a whole topic about AI,
01:49:48.240 | but that's a really interesting.
01:49:49.640 | - Yeah, yeah, yeah.
01:49:51.800 | - It's a really interesting idea.
01:49:53.200 | So your favorite AI system would be Data from Star Trek.
01:49:58.200 | - And my least favorite would easily be Skynet.
01:50:00.800 | - Yeah.
01:50:01.640 | (laughing)
01:50:02.460 | Beautifully put.
01:50:03.300 | I don't think there's a better way to end it, James.
01:50:05.740 | I can't say enough how much of an honor it is
01:50:08.460 | to meet you, to talk to you.
01:50:09.500 | Thanks so much for wasting your time with me today.
01:50:12.700 | - Not a waste at all.
01:50:13.980 | - Thanks, James.
01:50:14.820 | - All right, thanks.
01:50:16.620 | - Thanks for listening to this conversation
01:50:18.100 | with James Gosling, and thank you to our sponsors,
01:50:20.960 | Public Goods, BetterHelp, and ExpressVPN.
01:50:24.360 | Please check out these sponsors in the description
01:50:26.620 | to get a discount and to support this podcast.
01:50:30.520 | If you enjoy this thing, subscribe on YouTube,
01:50:32.820 | review it with Five Stars on Apple Podcasts,
01:50:34.980 | follow on Spotify, support on Patreon,
01:50:37.660 | or connect with me on Twitter @LexFriedman.
01:50:40.780 | And now, let me leave you with some words
01:50:42.660 | from James Gosling.
01:50:44.460 | One of the toughest things about life is making choices.
01:50:48.440 | Thank you for listening, and hope to see you next time.
01:50:52.380 | (upbeat music)
01:50:54.960 | (upbeat music)