
Grant Sanderson: Math, Manim, Neural Networks & Teaching with 3Blue1Brown | Lex Fridman Podcast #118


Chapters

0:00 Introduction
5:13 Richard Feynman
9:41 Learning deeply vs broadly
13:56 Telling a story with visualizations
18:43 Topology
23:52 Intuition about exponential growth
32:28 Elon Musk's exponential view of the world
40:09 SpaceX and space exploration
45:28 Origins of the Internet
49:50 Does teaching on YouTube get lonely?
54:31 Daily routine
60:20 Social media
70:38 Online education in a time of COVID
87:03 Joe Rogan moving to Spotify
92:09 Neural networks
98:30 GPT-3
106:52 Manim
111:01 Python
116:21 Theory of everything
123:53 Meaning of life


00:00:00.000 | The following is a conversation with Grant Sanderson,
00:00:03.020 | his second time on the podcast.
00:00:04.940 | He's known to millions of people
00:00:06.860 | as the mind behind 3Blue1Brown,
00:00:09.360 | a YouTube channel where he educates and inspires the world
00:00:12.820 | with the beauty and power of mathematics.
00:00:15.780 | Quick summary of the sponsors,
00:00:17.380 | Dollar Shave Club, DoorDash, and Cash App.
00:00:19.980 | Click the sponsor links in the description
00:00:21.660 | to get a discount and to support this podcast,
00:00:24.580 | especially for the two new sponsors,
00:00:26.620 | Dollar Shave Club and DoorDash.
00:00:29.020 | Let me say as a side note,
00:00:30.940 | I think that this pandemic challenged millions of educators
00:00:33.940 | to rethink how they teach,
00:00:35.740 | to rethink the nature of education.
00:00:38.540 | As people know, Grant is a master elucidator
00:00:41.460 | of mathematical concepts that may otherwise seem difficult
00:00:44.940 | or out of reach for students and curious minds.
00:00:47.780 | But he's also an inspiration to teachers,
00:00:50.060 | researchers, and people who just enjoy sharing knowledge,
00:00:53.500 | like me, for what it's worth.
00:00:56.140 | It's one thing to give a semester's worth
00:00:57.860 | of multi-hour lectures.
00:00:59.260 | It's another to extract from those lectures
00:01:01.400 | the most important, interesting, beautiful,
00:01:04.060 | and difficult concepts and present them in a way
00:01:06.660 | that makes everything fall into place.
00:01:08.860 | That is the challenge that is worth taking on.
00:01:11.420 | My dream is to see more and more of my colleagues at MIT
00:01:14.340 | and world experts across the world
00:01:16.660 | summon their inner 3Blue1Brown
00:01:19.420 | and create the canonical explainer videos on a topic
00:01:22.380 | that they know more than almost anyone else in the world.
00:01:25.860 | Amidst the political division,
00:01:27.500 | the economic pain, the psychological medical toll
00:01:30.540 | of the virus, masterfully crafted educational content
00:01:34.540 | feels like one of the beacons of hope
00:01:36.580 | that we can hold onto.
00:01:37.720 | If you enjoy this thing, subscribe on YouTube,
00:01:41.060 | review it with 5 Stars on Apple Podcasts,
00:01:43.200 | follow on Spotify, support on Patreon,
00:01:45.860 | or connect with me on Twitter @lexfridman.
00:01:48.540 | Of course, after you go immediately,
00:01:51.220 | which you already probably have done a long time ago,
00:01:53.580 | and subscribe to 3Blue1Brown YouTube channel,
00:01:56.780 | you will not regret it.
00:01:58.040 | As usual, I'll do a few minutes of ads now
00:02:00.820 | and no ads in the middle.
00:02:02.140 | I try to make these interesting,
00:02:03.940 | but I give you timestamps so you can skip.
00:02:06.840 | But still, please do check out the sponsors
00:02:08.660 | by clicking the links in the description,
00:02:10.540 | especially the two new ones,
00:02:11.780 | DoorDash and Dollar Shave Club.
00:02:14.080 | They're evaluating us,
00:02:15.700 | looking at how many people go to their site
00:02:17.540 | and get their stuff in order to determine
00:02:19.420 | if they want to support us for the long term.
00:02:21.280 | So you know what to do.
00:02:22.380 | It's the best way to support this podcast as always.
00:02:26.580 | This show is sponsored by Dollar Shave Club.
00:02:28.820 | Try them out with a one-time offer
00:02:31.060 | for only $5 and free shipping
00:02:33.220 | at dollarshaveclub.com/lex.
00:02:36.300 | Starter kit comes with a six-blade razor,
00:02:38.980 | refills, and all kinds of other stuff
00:02:41.120 | that makes shaving feel great.
00:02:43.220 | I've been a member of Dollar Shave Club
00:02:44.740 | for over five years now,
00:02:46.300 | and actually signed up when I first heard about them
00:02:48.220 | on the Joe Rogan podcast.
00:02:50.700 | And now we have come full circle.
00:02:53.140 | I feel like I've made it.
00:02:54.780 | Now that I can do a read for them,
00:02:56.460 | just like Joe did all those years ago.
00:02:58.740 | For the most part,
00:02:59.580 | I've just used the razor and the refills,
00:03:01.680 | but they encouraged me to try the shave butter,
00:03:03.660 | which I've never used before.
00:03:05.360 | So I did, and I love it.
00:03:07.740 | Not sure how the chemistry of it works out,
00:03:09.900 | but it's translucent somehow,
00:03:11.660 | which is a cool new experience.
00:03:13.580 | Again, try the Ultimate Shave Starter Set today
00:03:15.980 | for just five bucks,
00:03:17.260 | plus free shipping at dollarshaveclub.com/lex.
00:03:21.340 | This show is also sponsored by DoorDash.
00:03:24.300 | Get five bucks off and zero delivery fees
00:03:26.580 | on your first order of $15 or more
00:03:28.900 | when you download the DoorDash app and enter code LEX.
00:03:33.180 | I have so many memories of working late nights
00:03:35.340 | for a deadline with a team of engineers,
00:03:37.500 | and eventually taking a break
00:03:38.980 | to argue about which DoorDash restaurant to order from.
00:03:41.940 | And when the food came,
00:03:43.180 | those moments of bonding, of exchanging ideas,
00:03:45.900 | of pausing to shift attention from the programs
00:03:49.460 | to the humans were special.
00:03:53.020 | These days, for a bit of time, I'm on my own, sadly,
00:03:56.340 | so I miss that camaraderie.
00:03:58.260 | But actually, DoorDash is still there for me.
00:04:00.540 | There's a million options that fit into my keto diet ways.
00:04:04.040 | Also, it's a great way to support restaurants
00:04:06.020 | in these challenging times.
00:04:07.620 | Once again, download the DoorDash app
00:04:09.580 | and enter code LEX to get five bucks off
00:04:12.340 | and zero delivery fees on your first order of $15 or more.
00:04:15.740 | Finally, this show is presented by Cash App,
00:04:19.420 | the number one finance app in the App Store.
00:04:21.660 | When you get it, use code LEXPODCAST.
00:04:24.580 | Cash App lets you send money to friends,
00:04:26.540 | buy Bitcoin, and invest in the stock market
00:04:28.860 | with as little as $1.
00:04:30.940 | It's one of the best designed interfaces of an app
00:04:33.260 | that I've ever used.
00:04:34.700 | To me, good design is when everything is easy and natural.
00:04:38.060 | Bad design is when the app gets in the way,
00:04:40.580 | either because it's buggy
00:04:41.940 | or because it tries too hard to be helpful.
00:04:44.880 | I'm looking at you, Clippy.
00:04:46.380 | Anyway, there's a big part of my brain and heart
00:04:48.640 | that love to design things
00:04:50.060 | and also to appreciate great design by others.
00:04:52.660 | So again, if you get Cash App from the App Store,
00:04:54.940 | Google Play, and use code LEXPODCAST, you get $10,
00:04:59.020 | and Cash App will also donate $10 to FIRST,
00:05:01.740 | an organization that is helping to advance robotics
00:05:04.700 | and STEM education for young people around the world.
00:05:08.460 | And now, here's my conversation with Grant Sanderson.
00:05:12.500 | You've spoken about Richard Feynman as someone you admire.
00:05:17.740 | I think last time we spoke, we ran out of time.
00:05:20.100 | (both laughing)
00:05:21.340 | So I wanted to talk to you about him.
00:05:23.700 | Who is Richard Feynman to you in your eyes?
00:05:27.180 | What impact did he have on you?
00:05:29.460 | - I mean, I think a ton of people like Feynman.
00:05:31.740 | It's a little bit cliche to say that you like Feynman.
00:05:34.120 | That's almost like when you don't know
00:05:36.540 | what to say about sports,
00:05:37.540 | and you just point to the Super Bowl or something
00:05:39.780 | as something you enjoy watching.
00:05:41.240 | But I do actually think there's a layer to Feynman
00:05:43.100 | that sits behind the iconography.
00:05:45.480 | One thing that just really struck me was this letter
00:05:48.980 | that he wrote to his wife two years after she died.
00:05:51.700 | So during the Manhattan Project, she had polio.
00:05:54.260 | Tragically, she died.
00:05:55.460 | They were just young, madly in love.
00:05:57.700 | And the icon of Feynman is this,
00:06:01.900 | almost this mildly sexist, womanizing philanderer,
00:06:05.660 | at least on the personal side.
00:06:07.100 | But you read this letter,
00:06:09.140 | and I can try to pull it up for you if I want,
00:06:10.760 | and it's just this absolutely heartfelt letter
00:06:13.660 | to his wife saying how much he loves her,
00:06:15.720 | even though she's dead,
00:06:16.560 | and what she means to him,
00:06:18.820 | how no woman can ever measure up to her.
00:06:21.440 | And it shows you that the Feynman that we've all seen
00:06:25.200 | in "Surely You're Joking"
00:06:26.460 | is different from the Feynman in reality.
00:06:28.760 | And I think the same kind of goes in his science,
00:06:31.360 | where he sometimes has this output
00:06:35.160 | of being this, "Aw, shucks," character.
00:06:36.960 | Everyone else is coming in
00:06:37.880 | with these highfalutin formulas,
00:06:39.920 | but I'm just gonna try to whittle it down
00:06:41.280 | to its essentials, which is so appealing,
00:06:43.160 | 'cause we love to see that kind of thing.
00:06:45.000 | But when you get into it,
00:06:46.240 | what he was doing was actually quite deep,
00:06:49.320 | very much mathematical.
00:06:50.740 | That should go without saying,
00:06:52.920 | but I remember reading a book about Feynman in a cafe once,
00:06:55.240 | and this woman looked at me
00:06:56.200 | and saw that it was about Feynman.
00:06:58.680 | She was like, "Oh, I love him.
00:06:59.680 | "I read 'Surely You're Joking.'"
00:07:01.120 | And she started explaining to me
00:07:02.520 | how he was never really a math person.
00:07:04.640 | And I don't understand how that can possibly be
00:07:08.320 | a public perception about any physicist,
00:07:09.980 | but for whatever reason,
00:07:10.820 | that worked into his aura
00:07:11.960 | that he sort of shooed off math
00:07:13.920 | in place of true science.
00:07:16.000 | The reality of it is he was deeply in love with math
00:07:18.000 | and was much more going in that direction
00:07:19.360 | and had a clicking point
00:07:20.560 | into seeing that physics was a way to realize that,
00:07:22.880 | and all the creativity
00:07:23.760 | that he could output in that direction
00:07:26.680 | was instead poured towards things like fundamental,
00:07:28.880 | not even fundamental theories,
00:07:29.900 | just emergent phenomena and everything like that.
00:07:32.640 | So to answer your actual question,
00:07:35.480 | what I like about his way of going at things
00:07:38.760 | is this constant desire to reinvent it for himself.
00:07:42.240 | Like when he would consume papers,
00:07:43.760 | the way he'd describe it,
00:07:44.600 | he would start to see what problem he was trying to solve
00:07:46.840 | and then just try to solve it himself
00:07:48.160 | to get a sense of personal ownership.
00:07:49.960 | And then from there, see what others had done.
00:07:52.520 | - Is that how you see problems yourself?
00:07:54.080 | Like that's actually an interesting point
00:07:56.320 | when you first are inspired by a certain idea
00:08:00.480 | that you maybe wanna teach or visualize
00:08:03.360 | or just explore on your own.
00:08:05.080 | I'm sure you're captured by some possibility
00:08:07.080 | and magic of it.
00:08:08.360 | Do you read the work of others?
00:08:10.560 | Like do you go through the proofs?
00:08:12.080 | Do you try to rediscover everything yourself?
00:08:14.920 | - So I think the things that I've learned best
00:08:17.640 | and have the deepest ownership of
00:08:18.960 | are the ones that have some element of rediscovery.
00:08:21.180 | The problem is that really slows you down.
00:08:23.200 | And this is, for my part, it's actually a big fault.
00:08:25.360 | Like this is part of why I'm not an active researcher.
00:08:28.280 | I'm not at the depth of the field
00:08:29.920 | that a lot of other people are.
00:08:31.340 | The stuff that I do learn, I try to learn it really well.
00:08:34.800 | But other times you do need to get through it
00:08:36.760 | at a certain pace.
00:08:37.600 | You need to get to a point of a problem
00:08:39.000 | you're trying to solve.
00:08:39.840 | So obviously you need to be well-equipped to read things
00:08:42.800 | without that reinvention component
00:08:44.440 | and see how others have done it.
00:08:45.880 | But I think if you choose a few core building blocks
00:08:47.840 | along the way and you say,
00:08:48.840 | I'm really gonna try to approach this
00:08:51.240 | before I see how this person went at it.
00:08:54.640 | I'm really gonna try to approach it for myself.
00:08:56.800 | No matter what, you gain all sorts of inarticulatable
00:08:59.200 | intuitions about that topic,
00:09:01.000 | which aren't gonna be there
00:09:02.640 | if you simply go through the proof.
00:09:04.520 | For example, you're gonna be trying to come up
00:09:06.680 | with counter examples.
00:09:07.820 | You're gonna try to come up with intuitive examples,
00:09:11.120 | all sorts of things where you're populating
00:09:12.480 | your brain with data.
00:09:13.480 | And the ones that you come up with
00:09:14.560 | are likely to be different than the one
00:09:15.720 | that the text comes up with.
00:09:17.000 | And that like lends it a different angle.
00:09:19.120 | So that aspect also slowed Feynman down
00:09:21.920 | in a lot of respects.
00:09:22.740 | I think there was a period when like the rest of physics
00:09:25.080 | was running away from him.
00:09:27.160 | But insofar as it got him to where he was,
00:09:29.600 | I kind of resonate with that.
00:09:32.280 | I just, I would be nowhere near it
00:09:34.920 | 'cause I'm not like him at all.
00:09:36.360 | But it's like a state to aspire to.
00:09:41.280 | - You know, just to link in a small point you made,
00:09:44.440 | that you're not a quote unquote active researcher.
00:09:47.180 | Do you, you're swimming often in reasonably good depth
00:09:53.160 | about a lot of topics.
00:09:55.760 | Do you sometimes wanna like dive deep
00:09:58.640 | at a certain moment and say like,
00:10:01.080 | 'cause you probably built up a hell of an amazing intuition
00:10:04.040 | about what is and isn't true within these worlds.
00:10:07.480 | Do you ever wanna just dive in
00:10:09.960 | and see if you can discover something new?
00:10:14.960 | - Yeah, I think one of my biggest regrets from undergrad
00:10:18.480 | is not having built better relationships
00:10:20.540 | with the professors I had there.
00:10:21.720 | And I think a big part of success in research
00:10:23.940 | is that element of like mentorship
00:10:26.240 | and like people giving you the kind of scaffolded problems
00:10:28.360 | to carry along.
00:10:29.760 | For my own like goals right now,
00:10:31.280 | I feel like I'm pretty good at exposing math to others
00:10:35.400 | and like want to continue doing that.
00:10:38.020 | For my personal learning,
00:10:39.840 | I, are you familiar with like the hedgehog fox dynamic?
00:10:44.720 | I think this was either the ancient Greeks came up with it
00:10:47.700 | or it was purported to be something drawn
00:10:50.760 | from the ancient Greeks.
00:10:51.600 | I don't know who to point it to, but the--
00:10:53.360 | - Probably Mark Twain.
00:10:55.080 | - It is that you've got two types of people
00:10:57.200 | or especially two types of researchers.
00:10:59.060 | There's the fox that knows many different things
00:11:01.960 | and then the hedgehog that knows one thing very deeply.
00:11:04.560 | So like von Neumann would have been the fox.
00:11:06.800 | He's someone who knows many different things,
00:11:08.220 | just very foundational in a lot of different fields.
00:11:11.120 | Einstein would have been more of a hedgehog
00:11:12.640 | thinking really deeply about one particular thing
00:11:15.160 | and both are very necessary for making progress.
00:11:18.240 | So between those two, I would definitely see myself
00:11:20.480 | as like the fox where I'll try to get my paws
00:11:23.440 | in like a whole bunch of different things.
00:11:25.040 | And at the moment,
00:11:26.120 | I just think I don't know enough of anything
00:11:27.640 | to make like a significant contribution to any of them.
00:11:29.980 | But I do see value in like having a decently deep
00:11:34.420 | understanding of a wide variety of things.
00:11:36.640 | Like most people who know computer science really deeply
00:11:41.060 | don't necessarily know physics very deeply
00:11:42.940 | or many of the aspects, like different fields in math even.
00:11:46.780 | Let's say you have like an analytic number theory
00:11:48.420 | versus an algebraic number theory.
00:11:49.700 | Like these two things end up being related
00:11:51.420 | to very different fields.
00:11:52.380 | Like some of them more complex analysis,
00:11:54.260 | some of them more like algebraic geometry.
00:11:56.140 | And then when you just go out so far
00:11:57.440 | as to take those adjacent fields,
00:11:58.680 | place one PhD student into a seminar of another one's,
00:12:02.120 | they don't understand what the other one's saying at all.
00:12:04.280 | Like you take the complex analysis specialist
00:12:06.480 | inside the algebraic geometry seminar,
00:12:08.380 | they're as lost as you or I would be.
00:12:10.260 | But I think going around and like trying to have some sense
00:12:13.200 | of what this big picture is,
00:12:15.160 | certainly has personal value for me.
00:12:16.600 | I don't know if I would ever make like new contributions
00:12:19.320 | in those fields,
00:12:20.160 | but I do think I could make new
00:12:21.680 | like expositional contributions
00:12:24.000 | where there's kind of a notion of things that are known,
00:12:27.320 | but like haven't been explained very well.
00:12:30.000 | - Well, first of all, I think most people would agree
00:12:32.360 | your videos, your teaching,
00:12:34.300 | the way you see the world is fundamentally often new.
00:12:38.380 | Like you're creating something new.
00:12:40.680 | And it almost feels like research,
00:12:43.520 | even just like the visualizations,
00:12:45.560 | the multi-dimensional visualization we'll talk about.
00:12:48.480 | I mean, you're revealing something very interesting
00:12:51.360 | that yeah, just feels like research, feels like science,
00:12:55.720 | feels like the cutting edge of the very thing
00:12:59.680 | of which like new ideas and new discoveries are made of.
00:13:03.200 | - I do think you're being a little bit more generous
00:13:05.280 | than is necessarily true.
00:13:06.720 | And I promise that's not even false humility
00:13:08.880 | because I sometimes think when I research a video,
00:13:11.440 | I'll learn like 10 times as much as I need
00:13:13.080 | for the video itself.
00:13:14.000 | And it ends up feeling kind of elementary.
00:13:16.680 | So I have a sense of just how far away
00:13:18.560 | like the stuff that I cover is from the actual depth.
00:13:22.520 | - I think that's natural,
00:13:23.360 | but I think that could also be a mathematics thing.
00:13:26.360 | I feel like in the machine learning world,
00:13:28.240 | you're like two weeks in,
00:13:29.840 | you feel like you've basically mastered the field.
00:13:33.160 | In mathematics, it's like--
00:13:34.640 | - Well, everything is either trivial or impossible.
00:13:37.360 | And it's like a shockingly thin line between the two
00:13:40.000 | where you can find something that's totally impenetrable.
00:13:42.240 | And then after you get a feel for it, it's like,
00:13:43.560 | oh yeah, that whole subject is actually trivial in some way.
00:13:48.040 | So maybe that's what goes on.
00:13:50.200 | Every researcher is just on the other end of that hump
00:13:52.280 | and it feels like it's so far away,
00:13:53.520 | but one step actually gets them there.
00:13:56.040 | - What do you think about Feynman's teaching style
00:13:59.480 | or another perspective is of use of visualization?
00:14:04.480 | - Well, his teaching style is interesting
00:14:07.520 | because people have described like the Feynman effect
00:14:09.760 | where while you're watching his lectures
00:14:12.120 | or while you're reading his lectures,
00:14:13.200 | everything makes such perfect sense.
00:14:15.280 | So as an entertainment session, it's wonderful
00:14:19.200 | because it gives you this intellectual satisfaction
00:14:23.120 | that you don't get from anywhere else,
00:14:24.480 | that you like finally understand it.
00:14:26.320 | But the Feynman effect is that you can't really recall
00:14:29.200 | what it is that gave you that insight even a week later.
00:14:32.560 | And this is true of a lot of books and a lot of lectures
00:14:35.520 | where the retention is never quite what we hope it is.
00:14:38.160 | So there is a risk that the stuff that I do
00:14:44.920 | also fits that same bill,
00:14:46.440 | where at best it's giving this kind of intellectual candy
00:14:48.680 | on giving a glimpse of feeling
00:14:50.240 | like you understand something.
00:14:51.640 | But unless you do something active,
00:14:53.400 | like reinventing it yourself,
00:14:54.960 | like doing problems to solidify it,
00:14:57.500 | even things like spaced repetition memory
00:15:00.240 | to just make sure that you have like the building blocks
00:15:02.680 | of what do all the terms mean,
00:15:04.200 | unless you're doing something like that,
00:15:05.440 | it's not actually gonna stick.
00:15:06.920 | So the very same thing that's so admirable
00:15:10.120 | about Feynman's lectures,
00:15:11.140 | which is how damn satisfying they are to consume
00:15:14.940 | might actually also reveal a little bit of the flaw
00:15:18.620 | that we should as educators all look out for,
00:15:21.220 | which is that that does not correlate
00:15:22.640 | with long-term learning.
00:15:24.240 | - We'll talk about it a little bit.
00:15:25.660 | I think you've done some interactive stuff.
00:15:28.900 | I mean, even in your videos,
00:15:31.460 | the awesome thing that Feynman couldn't do at the time
00:15:34.260 | is you could, since it's programmed,
00:15:37.620 | you can like tinker, like play with stuff.
00:15:40.780 | You could take this value and change it.
00:15:42.660 | You can like, here, let's take the value of this variable
00:15:45.700 | and change it to build up an intuition,
00:15:47.700 | to move along the surface
00:15:48.940 | or to change the shape of something.
00:15:52.020 | I think that's almost an equivalent
00:15:54.900 | of you doing it yourself.
00:15:57.080 | It's not quite there, but as a viewer.
00:15:59.300 | Yeah, do you think there's some value
00:16:02.100 | in that interactive element?
00:16:04.660 | - Yeah, well, so what's interesting is you're saying that,
00:16:06.780 | and the videos are non-interactive
00:16:08.340 | in the sense that there's a play button and a pause button.
00:16:10.460 | And you could ask like,
00:16:11.780 | hey, while you're programming these things,
00:16:13.300 | why don't you program it into an interactable version?
00:16:15.860 | You know, make it a Jupyter notebook
00:16:17.340 | that people can play with,
00:16:18.460 | which I should do and that like would be better.
00:16:20.820 | I think the thing about interactives though
00:16:22.220 | is most people consuming them
00:16:24.020 | just sort of consume what the author had in mind.
00:16:28.300 | And that's kind of what they want.
00:16:29.620 | Like I have a ton of friends
00:16:31.420 | who make interactive explanations.
00:16:33.300 | And when you look into the analytics
00:16:34.620 | of how people use them,
00:16:36.900 | there's a small sliver that generally use it
00:16:38.660 | as a playground to have experiments.
00:16:40.340 | And maybe that small sliver is actually who you're targeting
00:16:42.340 | and the rest don't matter.
00:16:43.900 | But most people consume it just as a piece of
00:16:46.780 | like well-constructed literature
00:16:48.460 | that maybe you tweak with the example a little bit
00:16:50.420 | to see what it's getting at.
00:16:51.860 | But in that way, I do think like a video can get
00:16:54.460 | most of the benefits of the interactive,
00:16:56.680 | like the interactive app,
00:16:58.880 | as long as you make the interactive for yourself
00:17:00.700 | and you decide what the best narrative to spin is.
00:17:03.140 | As a more concrete example, like my process with,
00:17:06.660 | I made this video about SIR models for epidemics.
00:17:09.740 | And it's like this agent-based modeling thing
00:17:11.900 | where you tweak some things about how the epidemic spreads
00:17:14.900 | and you wanna see how that affects its evolution.
00:17:17.340 | My format for making that was very different than others
00:17:21.300 | where rather than scripting it ahead of time,
00:17:23.180 | I just made the playground and then I played a bunch
00:17:26.740 | and then I saw what stories there were to tell within that.
00:17:30.340 | - Yeah, that's cool.
00:17:31.160 | So your video had that kind of structure.
00:17:32.860 | It had like five or six stories or whatever it was.
00:17:37.020 | And like, it was basically,
00:17:38.980 | okay, here's a simulation, here's a model.
00:17:41.900 | What can we discover with this model?
00:17:44.340 | - And here's five things I found after playing with it.
00:17:47.020 | - Well, 'cause the thing is,
00:17:49.100 | a way that you could do that project is you make the model
00:17:51.740 | and then you put it out and you say,
00:17:52.860 | here's a thing for the world to play with.
00:17:54.500 | Like come to my website where you interact with this thing.
00:17:57.260 | And people did like sort of remake it in a JavaScript way
00:18:00.740 | so that you can go to that website
00:18:02.380 | and you can test your own hypotheses.
00:18:04.580 | But I think a meaningful part of the value to add
00:18:07.060 | is not just the technology,
00:18:08.420 | but to give the story around it as well.
00:18:10.580 | And like, that's kind of my job.
00:18:12.780 | It's not just to like make the visuals
00:18:15.700 | that someone will look at.
00:18:16.540 | It's to be the one to decide
00:18:18.140 | what's the interesting thing to walk through here.
00:18:21.060 | And even though there's lots of other interesting paths
00:18:22.780 | that one could take,
00:18:23.980 | that can be kind of daunting
00:18:25.140 | when you're just sitting there in a sandbox
00:18:26.580 | and you're given this tool with like five different sliders
00:18:28.980 | and you're told to like play and discover things.
00:18:31.860 | What do you do?
00:18:32.700 | Where do you start?
00:18:33.540 | What are my hypotheses?
00:18:34.420 | What should I be asking?
00:18:35.460 | Like a little bit of guidance in that direction
00:18:38.220 | can be what actually sparks curiosity
00:18:39.780 | to make someone want to imagine more about it.
00:18:43.100 | - A few videos I've seen you do,
00:18:45.100 | I don't know how often you do it,
00:18:46.220 | but there's almost a tangential like pause
00:18:49.500 | where you, here's a cool thing.
00:18:52.100 | You say like, here's a cool thing,
00:18:54.100 | but it's outside the scope of this video essentially.
00:18:56.860 | But I'll leave it to you as homework essentially
00:18:59.580 | to like figure out it's a cool thing to explore.
00:19:02.740 | - I wish I could say that wasn't a function of laziness.
00:19:05.140 | (laughs)
00:19:05.980 | - Right, and that's like you've worked so hard
00:19:07.380 | on making the 20 minutes already
00:19:09.540 | that to extend it out even further would take more time.
00:19:12.100 | - No wonder your cooler videos,
00:19:14.340 | the homeomorphic, like from the Mobius strip to the--
00:19:17.860 | - What do you describe, rectangle?
00:19:18.980 | - Yeah, that's a super,
00:19:21.180 | and you're like, yeah, you can't transform
00:19:24.900 | the Mobius strip into a surface
00:19:29.900 | without it intersecting itself.
00:19:33.660 | But I'll leave it to you to see why that is.
00:19:36.460 | (laughs)
00:19:37.300 | - Well, I hope that's not exactly how I phrase it
00:19:39.260 | 'cause I think what my hope would be is that
00:19:42.220 | I leave it to you to think about
00:19:43.300 | why you would expect that to be true
00:19:45.100 | and then to want to know what aspects of a Mobius strip
00:19:48.660 | do you wanna formalize such that you can prove
00:19:50.480 | that intuition that you have?
00:19:51.900 | 'Cause at some point now you're starting to
00:19:54.540 | invent algebraic topology.
00:19:56.700 | If you have these vague instincts like,
00:19:59.380 | I wanna get this Mobius strip,
00:20:00.460 | I want to fit it such that it's all above the plane
00:20:05.460 | but its boundary sits exactly on the plane.
00:20:08.180 | I don't think I can do that without crossing itself
00:20:10.300 | but that feels really vague.
00:20:11.300 | How do I formalize it?
00:20:12.260 | And as you're starting to formalize that,
00:20:14.140 | that's what's gonna get you to
00:20:16.140 | try to come up with a definition
00:20:17.340 | for what it means to be orientable or non-orientable.
00:20:19.660 | And once you have that motivation,
00:20:21.160 | a lot of the otherwise arbitrary things
00:20:22.880 | that are sitting at the very beginning
00:20:23.980 | of a topology textbook start to make a little more sense.
00:20:27.640 | - Yeah, and I mean, that whole video, beautifully,
00:20:29.900 | was a motivation for topology is cool.
00:20:32.900 | - That was my, well, my hope with that
00:20:34.140 | is I feel like topology is,
00:20:36.060 | I don't wanna say it's taught wrong
00:20:37.400 | but I do think sometimes it's popularized in the wrong way
00:20:40.480 | where you'll hear these things of people saying,
00:20:43.380 | oh, topologists, they're very interested in surfaces
00:20:45.740 | that you can bend and stretch but you can't cut or glue.
00:20:50.100 | Are they?
00:20:51.780 | Like, there's all sorts of things you can be interested in
00:20:54.380 | with random, like, imaginative manipulations of things.
00:20:57.340 | Is that really what like mathematicians are into?
00:20:59.980 | And the short answer is not, not really.
00:21:03.060 | It's not as if someone was sitting there thinking like,
00:21:06.660 | I wonder what the properties of clay are
00:21:08.500 | if I add some arbitrary rules about when I can't cut it
00:21:11.260 | and when I can't glue it.
00:21:12.920 | Instead, there's a ton of pieces of math
00:21:16.040 | that can actually be equivalent
00:21:18.740 | to like these very general structures
00:21:21.420 | that's like geometry,
00:21:22.620 | except you don't have exact distances.
00:21:24.140 | You just wanna maintain a notion of closeness.
00:21:26.300 | And once you get it to those general structures,
00:21:28.900 | constructing mappings between them
00:21:30.820 | translate into non-trivial facts about other parts of math.
00:21:35.060 | And that, I don't think that's actually like popularized.
00:21:38.860 | I don't even think it's emphasized well enough
00:21:40.440 | when you're starting to take a topology class
00:21:42.420 | 'cause you kind of have these two problems.
00:21:43.620 | It's like either it's too squishy,
00:21:45.260 | you're just talking about coffee mugs and donuts,
00:21:47.220 | or it's a little bit too rigor first
00:21:49.660 | and you're talking about the axiom systems with open sets
00:21:54.060 | and an open set is not the opposite of closed set.
00:21:56.540 | So sorry about that, everyone.
00:21:57.820 | We have a notion of clopen sets
00:21:59.660 | for ones that are both at the same time.
00:22:02.500 | And just, it's not an intuitive axiom system
00:22:05.920 | in comparison to other fields of math.
00:22:07.320 | So you as a student,
00:22:08.220 | like really have to walk through mud to get there.
00:22:10.500 | And you're constantly confused
00:22:11.620 | about how this relates to the beautiful things
00:22:13.220 | about coffee mugs and Mobius strips and such.
00:22:16.020 | And it takes a really long time to actually see,
00:22:19.180 | like see topology in the way
00:22:20.500 | that mathematicians see topology.
00:22:22.340 | But I don't think it needs to take that time.
00:22:23.900 | I think there's, this is making me feel
00:22:26.340 | like I need to make more videos on the topic
00:22:27.900 | 'cause I think I've only done two.
00:22:28.740 | - 100% you do.
00:22:30.060 | But I've also seen it in my narrow view of like,
00:22:34.340 | I find game theory very beautiful.
00:22:36.380 | And I know topology has been used elegantly
00:22:39.860 | to prove things in game theory.
00:22:41.380 | - Yeah, you have like facts that seem very strange.
00:22:43.660 | Like I could tell you, you stir your coffee
00:22:45.860 | and after you stir it,
00:22:47.500 | and like, let's say all the molecules settled
00:22:48.940 | to like not moving again,
00:22:49.940 | one of the molecules will be basically
00:22:51.580 | in the same position it was before.
00:22:54.020 | You have all sorts of fixed point theorems like this, right?
00:22:57.020 | That kind of fixed point theorem,
00:22:58.620 | directly relevant to Nash equilibriums, right?
00:23:01.940 | So you can imagine popularizing it
00:23:03.900 | by describing the coffee fact,
00:23:05.380 | but then you're left to wonder like,
00:23:06.300 | who cares about if a molecule of coffee
00:23:08.060 | like stays in the same spot?
00:23:09.140 | Is this what we're paying our mathematicians for?
00:23:11.580 | You have this very elegant mapping onto economics
00:23:14.220 | in a way that's very concrete,
00:23:15.580 | or very, I shouldn't say concrete, very tangible,
00:23:18.820 | like actually adds value to people's lives
00:23:20.620 | through the predictions that it makes.
00:23:22.900 | But that line isn't always drawn
00:23:24.300 | because you have to get a little bit technical
00:23:26.860 | in order to properly draw that line out.
00:23:30.820 | And often I think popularized forms of media
00:23:34.260 | just shy away from being a little too technical.
00:23:37.380 | - For sure.
00:23:38.220 | By the way, for people who are watching the video,
00:23:40.580 | I do not condone the message in this mug.
00:23:42.900 | It's the only one I have,
00:23:44.020 | which is the snuggle is real.
00:23:46.100 | - By the way, for anyone watching,
00:23:48.540 | I do condone the message of that mug.
00:23:50.180 | - The snuggle is real. - The snuggle is real.
00:23:51.980 | Okay, so you mentioned the SIR model.
00:23:55.780 | I think there's certain ideas there of growth,
00:24:00.740 | of exponential growth.
00:24:03.020 | What maybe have you learned about pandemics
00:24:08.020 | from making that video?
00:24:11.460 | Because it was kind of exploratory.
00:24:13.020 | You were kind of building up an intuition.
00:24:14.940 | And it's, again, people should watch the video.
00:24:17.620 | It's kind of an abstract view.
00:24:19.060 | It's not really modeling in detail.
00:24:23.420 | The whole field of epidemiology,
00:24:25.580 | those people, they go really far in terms of modeling,
00:24:30.580 | like how people move about.
00:24:32.700 | I don't know if you've seen it,
00:24:33.580 | but like there is the mobility patterns,
00:24:35.940 | like how many people you encounter in certain situations,
00:24:40.940 | when you go to a school, when you go to a mall,
00:24:43.620 | they like model every aspect of that for a particular city.
00:24:46.340 | Like they have maps of actual city streets.
00:24:49.860 | They model it really well.
00:24:51.340 | And natural patterns of the people have, it's crazy.
00:24:54.220 | So you don't do any of that.
00:24:55.460 | You're just doing an abstract model
00:24:56.820 | to explore different ideas of--
00:24:58.620 | - Simple pedic--
00:24:59.460 | Well, because I don't want to pretend like I'm an epidemiologist.
00:25:02.620 | Like we have a ton of armchair epidemiologists.
00:25:04.820 | And the spirit of that was more like,
00:25:08.260 | can we, through a little bit of play,
00:25:10.500 | draw like reasonable-ish conclusions?
00:25:12.700 | And also just like get ourselves in a position
00:25:15.780 | where we can judge the validity of a model.
00:25:18.900 | Like I think people should look at that
00:25:20.340 | and they should criticize it.
00:25:21.180 | They should point to all the ways that it's wrong,
00:25:22.500 | 'cause it's definitely naive, right?
00:25:24.620 | In the way that it's set up.
00:25:26.580 | But to say like what lessons from that hold,
00:25:28.980 | like thinking about the R-naught value
00:25:31.020 | and what that represents and what it can imply.
00:25:34.100 | - What's R-naught?
00:25:35.380 | - So R-naught is if you are infectious
00:25:38.140 | and you're in a population which is completely susceptible,
00:25:41.500 | what's the average number of people
00:25:42.980 | that you're gonna infect during your infectiousness?
00:25:45.940 | So certainly during the beginning of an epidemic,
00:25:49.340 | this basically gives you kind of the exponential growth rate.
00:25:53.180 | Like if every person infects two others,
00:25:54.820 | you've got that one, two, four, eight
00:25:56.700 | exponential growth pattern.
00:25:58.100 | As it goes on, and let's say it's something endemic
00:26:03.500 | where you've got like a ton of people
00:26:05.100 | who have had it and are recovered,
00:26:07.540 | then the R-naught value doesn't tell you that as directly
00:26:12.260 | because a lot of the people you interact with
00:26:14.140 | aren't susceptible, but in the early phases it does.
00:26:17.540 | And this is like the fundamental constant
00:26:19.500 | that it seems like epidemiologists look at
00:26:21.420 | and the whole goal is to get that down.
00:26:23.420 | If you can get it below one, then it's no longer epidemic.
00:26:26.460 | If it's equal to one, then it's endemic
00:26:28.700 | and it's above one, then you're epidemic.
00:26:30.820 | So like just teaching what that value is
00:26:33.820 | and giving some intuitions on how do certain changes
00:26:36.540 | in behavior change that value?
00:26:38.660 | And then what does that imply for exponential growth?
00:26:40.940 | I think those are general enough lessons
00:26:44.020 | and they're like resilient to all of the chaoses
00:26:47.700 | of the world that it's still like valid
00:26:51.460 | to take from the video.
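To make the R-naught threshold above concrete, here is a minimal Python sketch (an illustration only, not Grant's agent-based simulation, and it ignores the susceptible pool being depleted) of how case counts evolve generation by generation for different values:

```python
# Hypothetical illustration: each infected person infects r0 others on
# average, in a population that stays fully susceptible.
def cases_by_generation(r0, generations=10, initial_cases=1):
    cases = [initial_cases]
    for _ in range(generations):
        cases.append(cases[-1] * r0)  # next generation is r0 times the last
    return cases

for r0 in (0.8, 1.0, 2.0):  # below, at, and above the epidemic threshold
    print(r0, [round(c, 2) for c in cases_by_generation(r0)])
# r0 = 2.0 reproduces the 1, 2, 4, 8, ... doubling pattern;
# r0 = 1.0 stays flat (endemic); r0 = 0.8 dies out.
```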
00:26:52.380 | - I mean, one of the interesting aspects of that
00:26:54.100 | is just exponential growth and we think about growth.
00:26:57.660 | Is that one of the first times you've done a video on,
00:27:00.780 | no, of course not, the whole Euler's identity.
00:27:07.260 | Okay, so.
00:27:08.340 | - Sure.
00:27:09.540 | I've done a lot of videos about exponential growth
00:27:11.340 | in the circular direction,
00:27:13.220 | only minimal in the normal direction.
00:27:15.300 | - I mean, another way to ask,
00:27:17.540 | do you think we're able to reason intuitively
00:27:20.780 | about exponential growth?
00:27:22.980 | - It's funny, I think it's extremely intuitive to humans
00:27:28.940 | and then we train it out of ourselves
00:27:30.420 | such that it's then really not intuitive
00:27:32.220 | and then I think it can become intuitive again
00:27:33.780 | when you study a technical field.
00:27:35.420 | So what I mean by that is,
00:27:38.500 | have you ever heard of these studies
00:27:39.700 | where in a anthropological setting
00:27:42.580 | where you're studying a group that has been disassociated
00:27:46.060 | from a lot of modern society
00:27:47.900 | and you ask what number is between one and nine?
00:27:51.620 | And maybe you would ask,
00:27:52.580 | you've got one rock and you've got nine rocks,
00:27:54.420 | you're like, what pile is halfway in between these?
00:27:56.900 | And our instinct is usually to say five,
00:27:59.740 | that's the number that sits right between one and nine.
00:28:02.260 | But sometimes when numeracy
00:28:04.180 | and the kind of just basic arithmetic that we have
00:28:07.500 | isn't in a society, the natural instinct is three
00:28:10.620 | because it's in between in an exponential sense
00:28:13.860 | and a geometric sense that one is three times bigger
00:28:16.700 | and then the next one is three times bigger than that.
00:28:18.820 | So it's like, if you have one friend versus 100 friends,
00:28:22.060 | what's in between that?
00:28:22.940 | Yeah, 10 friends seems like the social status
00:28:25.340 | in between those two states.
00:28:26.740 | So that's like deeply intuitive to us
00:28:28.340 | to think logarithmically like that.
00:28:30.060 | And for some reason, we kind of train it out of ourselves
00:28:33.900 | to start thinking linearly about things.
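The instinct Grant describes here is the geometric mean rather than the arithmetic mean; a tiny sketch, purely for illustration:

```python
import math

def arithmetic_mean(a, b):
    return (a + b) / 2

def geometric_mean(a, b):
    return math.sqrt(a * b)

print(arithmetic_mean(1, 9), geometric_mean(1, 9))      # 5.0  3.0
print(arithmetic_mean(1, 100), geometric_mean(1, 100))  # 50.5 10.0
```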
00:28:35.740 | - So in the sense, yeah, the early basic math
00:28:39.300 | forces us to take a step back.
00:28:43.020 | It's the same criticism if there's any of science
00:28:47.580 | is the lessons of science make us like see the world
00:28:52.580 | in a slightly narrow sense
00:28:55.540 | to where we have an over-exaggerated confidence
00:28:58.260 | that we understand everything
00:29:00.180 | as opposed to just understanding a small slice of it.
00:29:03.380 | But I think that probably only really goes for small numbers
00:29:06.420 | 'cause the real counterintuitive thing
00:29:07.580 | about exponential growth is like
00:29:08.900 | as the numbers start to get big.
00:29:10.300 | So I bet if you took that same setup and you asked them,
00:29:13.340 | oh, if I keep tripling the size of this rock pile,
00:29:15.940 | you know, seven times, how big will it be?
00:29:18.460 | I bet it would be surprisingly big
00:29:19.860 | even to like a society without numeracy.
00:29:23.620 | And that's the side of it
00:29:24.700 | that I think is pretty counterintuitive to us,
00:29:28.140 | but that you can basically train into people.
00:29:30.420 | Like I think computer scientists and physicists,
00:29:33.380 | when they're looking at the early numbers of like COVID,
00:29:37.140 | were, they were the ones thinking like,
00:29:39.020 | oh God, this is following an exact exponential curve.
00:29:41.620 | - Yeah.
00:29:43.220 | - And I heard that from a number of people.
00:29:45.380 | So it's, and almost all of them are like techies
00:29:47.980 | in some capacity,
00:29:49.140 | probably just 'cause I like live in the Bay Area, but.
00:29:51.700 | - But for sure, they're cognizant of this kind of,
00:29:54.940 | this kind of growth is present in a lot of natural systems
00:29:57.260 | and a lot of, in a lot of systems.
00:30:01.540 | I don't know if you've seen like,
00:30:02.660 | I mean, there's a lot of ways to visualize this obviously,
00:30:04.740 | but Ray Kurzweil, I think, was the one
00:30:07.180 | that had this like chessboard
00:30:09.340 | where every square on the chessboard,
00:30:13.180 | you double the number of stones
00:30:15.060 | or something in that chessboard.
00:30:16.620 | - I've heard this is like an old proverb
00:30:18.420 | where it's like, you know, someone,
00:30:20.100 | the king offered him a gift and he said,
00:30:22.580 | the only gift I would like, very modest,
00:30:24.100 | give me a single grain of rice.
00:30:25.540 | - Rice, that's right.
00:30:26.380 | - For the first square of the chessboard and then two grains of rice
00:30:28.740 | for the next square, then twice that for the next square
00:30:31.780 | and just continue on.
00:30:32.620 | That's my only modest ask, your sire.
00:30:34.540 | And like, it's all, you know,
00:30:36.220 | more grains of rice than there are anything in the world
00:30:40.460 | by the time you get to the end.
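For reference, the total in that proverb works out as a quick sum of powers of two (a sketch, assuming the standard 64-square telling):

```python
# One grain on the first square, doubling on each of the 64 squares.
total_grains = sum(2**k for k in range(64))  # equals 2**64 - 1
print(total_grains)  # 18446744073709551615 grains of rice
```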
00:30:41.540 | - And I, my intuition falls apart there.
00:30:44.060 | Like I would have never predicted that.
00:30:46.620 | Like for some reason, that's a really compelling
00:30:49.660 | illustration how poorly breaks down, just like you said,
00:30:54.780 | maybe we're okay for the first few piles,
00:30:57.100 | but of rocks, but after a while it's game over.
00:31:00.780 | - You know, the other classic example
00:31:02.340 | for gauging someone's intuitive understanding
00:31:04.860 | of exponential growth is I've got like a lily pad
00:31:08.060 | on a lake, really big lake, okay, like Lake Michigan.
00:31:12.020 | And that lily pad replicates, it doubles one day
00:31:15.860 | and then it doubles the next day
00:31:16.860 | and it doubles the next day.
00:31:18.180 | And after 50 days, it actually is gonna cover
00:31:21.100 | the entire lake, okay?
00:31:22.820 | So after how many days does it cover
00:31:24.700 | half the lake?
00:31:25.620 | - 49.
00:31:28.220 | - So you have a good instinct for exponential growth.
00:31:31.180 | So I think a lot of like the knee jerk reaction
00:31:33.900 | is sometimes to think that it's like half the amount
00:31:36.140 | of time or to at least be like surprised
00:31:38.700 | that like after 49 days, you've only covered half of it.
00:31:42.860 | - Yeah, I mean, that's the reason you heard a pause from me.
00:31:46.340 | I literally thought that can't be right.
00:31:48.460 | - Right, yeah, exactly.
00:31:49.740 | So even when you know the fact and you do the division,
00:31:52.340 | it's like, wow, so you've gotten like that whole time
00:31:55.060 | and then day 49, it's only covering half.
00:31:56.980 | And then after that, it gets the whole thing.
00:31:58.900 | But I think you can make that even more visceral
00:32:00.500 | if rather than going one day before you say,
00:32:02.260 | how long until it's covered 1% of the lake, right?
00:32:06.180 | And it's, so what would that be?
00:32:08.700 | How many times you have to double to get over 100?
00:32:10.900 | Like seven, six and a half times, something like that.
00:32:13.820 | So at that point, you're looking at 43, 44 days into it.
00:32:17.860 | You're not even at 1% of the lake.
00:32:19.820 | So you've experienced 44 out of 50 days.
00:32:22.740 | And you're like, yeah, that's really bad.
00:32:23.780 | It's just 1% of the lake.
00:32:25.500 | But then next thing you know, it's the entire lake.
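The arithmetic behind those numbers, as a small sketch (assuming clean daily doubling and full coverage on day 50):

```python
# Fraction of the lake covered on day d if coverage doubles every day
# and the lake is fully covered on day 50.
def coverage(day, full_day=50):
    return 2 ** (day - full_day)

print(coverage(49))  # 0.5      -> half the lake one day before the end
print(coverage(44))  # ~0.0156  -> first day above 1%
print(coverage(43))  # ~0.0078  -> still under 1% with a week to go
```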
00:32:28.700 | - You're wearing a SpaceX shirt, so let me ask you.
00:32:30.740 | - Sure.
00:32:31.820 | - Let me ask you, one person who talks about exponential,
00:32:36.100 | just the miracle of the exponential function
00:32:41.100 | in general is Elon Musk.
00:32:42.580 | So he kind of advocates the idea of exponential thinking
00:32:49.700 | realizing that technological development can,
00:32:53.180 | at least in the short term, follow exponential improvement,
00:32:56.780 | which breaks apart our intuition,
00:32:59.780 | our ability to reason about what is and isn't impossible.
00:33:03.060 | So he's a big, one, it's a good leadership kind of style
00:33:06.560 | of saying like, look, the thing that everyone thinks
00:33:09.180 | is impossible is actually possible because exponentials.
00:33:13.500 | But what's your sense about that kind of way
00:33:18.140 | to see the world?
00:33:19.860 | - Well, so I think it can be very inspiring
00:33:23.260 | to note when something, like Moore's Law
00:33:25.580 | is another great example where you have
00:33:27.300 | this exponential pattern that holds shockingly well.
00:33:30.240 | And it enables just better lives to be led.
00:33:33.580 | I think the people who took Moore's Law seriously
00:33:35.980 | in the '60s were seeing that, wow,
00:33:37.980 | it's not gonna be too long before these giant computers
00:33:40.420 | that are either batch processing or time-shared,
00:33:42.700 | you could actually have one small enough
00:33:43.980 | to put on your desk, on top of your desk,
00:33:45.700 | and you could do things.
00:33:46.740 | And if they took it seriously,
00:33:47.860 | you have people predicting smartphones a long time ago.
00:33:51.340 | And it's only out of this, I don't wanna say faith
00:33:54.380 | in exponentials, but an understanding
00:33:56.500 | that that's what's happening.
00:33:58.020 | What's more interesting, I think,
00:33:59.020 | is to really understand why exponential growth happens
00:34:03.500 | and that the mechanism behind it is when the rate of change
00:34:06.420 | is proportional to the thing in and of itself.
00:34:08.840 | So the reason that technology would grow exponentially
00:34:11.300 | is only gonna be if the rate of progress
00:34:14.300 | is proportional to the amount that you have.
00:34:16.300 | So that the software you write
00:34:17.980 | enables you to write more software.
00:34:19.740 | And I think we see this with the internet.
00:34:23.420 | The advent of the internet makes it faster to learn things,
00:34:25.780 | which makes it faster to create new things.
00:34:28.980 | I think this is oftentimes why investment
00:34:33.900 | will grow exponentially, that the more resources
00:34:36.660 | a company has, if it knows how to use them well,
00:34:39.060 | the more it can actually grow.
00:34:41.940 | So I mean, you referenced Elon Musk.
00:34:43.540 | I think he seems to really be into
00:34:45.620 | vertically integrating his companies.
00:34:47.380 | I think a big part of that is 'cause you have the sense
00:34:49.060 | what you want is to make sure that the things
00:34:50.580 | that you develop, you have ownership of,
00:34:52.500 | and they enable further development of the adjacent parts.
00:34:55.780 | So it's not just this, you see a curve
00:34:58.460 | and you're blindly drawing a line through it.
00:35:00.660 | What's much more interesting is to ask,
00:35:01.940 | when do you have this proportional growth property?
00:35:05.800 | Because then you can also recognize when it breaks down.
00:35:08.140 | Like in an epidemic, as you approach saturation,
00:35:10.700 | that would break down.
00:35:12.340 | As you do anything that skews
00:35:14.380 | what that proportionality constant is,
00:35:16.460 | you can make it maybe not break down as being an exponential,
00:35:19.220 | but it can seriously slow what that exponential rate is.
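The mechanism being described is the standard one: when the rate of change of a quantity is proportional to the quantity itself, the solution is an exponential,

$$\frac{dx}{dt} = kx \quad\Longrightarrow\quad x(t) = x_0 e^{kt},$$

and anything that lowers the proportionality constant k (saturation, barriers to spread, resources that don't feed back into growth) slows or breaks the exponential.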
00:35:21.900 | - This is the opposite of a pandemic,
00:35:25.180 | is you want, in terms of ideas,
00:35:28.220 | you want to minimize barriers that prevent the spread.
00:35:31.620 | You wanna maximize the spread of impact.
00:35:33.560 | So like you want it to grow
00:35:35.420 | when you're doing technological development,
00:35:37.740 | is so that you do hold up, that rate holds up.
00:35:41.920 | And that's almost like an operational challenge
00:35:46.740 | of like how you run a company,
00:35:48.060 | how you run a group of people,
00:35:49.700 | is that any one invention has a ripple that's unstopped.
00:35:54.300 | And that ripple effect then has
00:35:56.580 | its own ripple effects and so on.
00:35:57.920 | And that continues.
00:35:59.060 | Yeah, like Moore's law is fascinating.
00:36:00.960 | On a psychological level, on a human level,
00:36:04.960 | 'cause it's not exponential,
00:36:06.620 | it's just a consistent set of like,
00:36:10.540 | what you would call like S-curves,
00:36:12.500 | which is like, it's constantly like,
00:36:15.200 | breakthrough innovations, nonstop.
00:36:17.980 | - That's a good point.
00:36:18.820 | Like it might not actually be an example of exponentials
00:36:21.300 | because of something which grows in proportion to itself,
00:36:23.820 | but instead it's almost like a benchmark that was set out
00:36:26.740 | that everyone's been pressured to meet.
00:36:28.780 | And it's like all these innovations
00:36:30.620 | and micro inventions along the way,
00:36:33.100 | rather than some consistent sit back
00:36:35.060 | and just let the lily pad grow across the lake phenomenon.
00:36:38.380 | - And it's also, there's a human psychological level for sure
00:36:41.260 | of like the four minute mile.
00:36:42.540 | Like it's something about it,
00:36:44.940 | like saying that, look, there is, you know, Moore's law.
00:36:49.860 | It's a law.
00:36:51.700 | So like, it's certainly an achievable thing.
00:36:56.700 | You know, we've achieved it for the last decade,
00:36:58.620 | for the last two decades, for the last three decades.
00:37:00.300 | You just keep going and it somehow makes it happen.
00:37:04.400 | I mean, it makes people,
00:37:06.140 | I'm continuously surprised in this world
00:37:08.560 | how few people do the best work in the world,
00:37:13.560 | like in that particular, whatever that field is.
00:37:16.580 | Like it's very often that like the genius,
00:37:21.060 | I mean, you can argue that community matters,
00:37:24.780 | but it's certain, like I've been in groups of engineers
00:37:27.600 | where like one person is clearly like
00:37:30.760 | doing an incredible amount of work and just is the genius.
00:37:35.180 | And it's fascinating to see,
00:37:36.900 | basically it's kind of the Steve Jobs idea
00:37:40.700 | is maybe the whole point is to create an atmosphere
00:37:45.700 | where the genius can discover themselves,
00:37:49.220 | like have the opportunity to do the best work of their life.
00:37:53.320 | And yeah, and that the exponential is just milking that.
00:37:57.820 | It's like rippling the idea that it's possible.
00:38:01.060 | And that idea that it's possible finds the right people
00:38:03.820 | for the four minute mile.
00:38:05.260 | The idea that it's possible finds the right runners
00:38:07.860 | to run it and then it explodes the number of people
00:38:10.060 | who can run faster than four minutes.
00:38:12.380 | It's kind of interesting to, I don't know.
00:38:15.260 | Basically the positive way to see that
00:38:17.060 | is most of us are way more intelligent,
00:38:20.060 | have way more potential than we ever realized.
00:38:22.860 | I guess that's kind of depressing.
00:38:24.020 | But I mean like the ceiling for most of us
00:38:25.960 | is much higher than we ever realized.
00:38:28.540 | - That is true.
00:38:29.380 | A good book to read if you want that sense is "Peak,"
00:38:32.780 | which essentially talks about peak performance
00:38:35.260 | in a lot of different ways, like chess, London cab drivers,
00:38:38.620 | how many pushups people can do, short-term memory tasks.
00:38:41.420 | And it's meant to be like a concrete manifesto
00:38:45.700 | about deliberate practice and such.
00:38:47.100 | But the one sensation you come out with is,
00:38:49.740 | wow, no matter how good people are at something,
00:38:52.500 | they can get better and like way better
00:38:54.420 | than we think they could.
00:38:55.740 | I don't know if that's actually related
00:38:57.220 | to exponential growth,
00:38:58.820 | but I do think it's a true phenomenon that's interesting.
00:39:01.540 | - Yeah, I mean, there's certainly no law
00:39:04.060 | of exponential growth in human innovation.
00:39:06.780 | Well, I don't know.
00:39:08.780 | - Well, kind of there is.
00:39:11.540 | I think it's very interesting to see
00:39:12.860 | when innovations in one field
00:39:14.340 | allow for innovations in another.
00:39:16.020 | Like the advent of computing seems like a prerequisite
00:39:18.860 | for the advent of chaos theory.
00:39:20.420 | You have this truth about physics and the world
00:39:22.940 | that in theory could be known.
00:39:25.060 | You could find Lorenz's equations without computers.
00:39:28.300 | But in practice, it was just never gonna be analyzed
00:39:31.100 | that way unless you were doing like a bunch of simulations
00:39:33.980 | and that you could computationally see these models.
00:39:36.460 | So it's like physics allowed for computers,
00:39:37.900 | computers allowed for better physics
00:39:39.740 | and wash, rinse and repeat.
00:39:42.100 | That self-proportionality, that's exponential.
00:39:45.660 | So I think I wouldn't,
00:39:47.820 | it's probably too far to say
00:39:48.940 | that that's a law of some kind.
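(A quick aside, not from the conversation: "self-proportionality" has a precise meaning here. When a quantity's rate of change is proportional to its current size, the quantity grows exponentially.)

```latex
% Self-proportional growth: rate of change proportional to the current amount.
\frac{dy}{dt} = k\,y \quad (k > 0)
\;\;\Longrightarrow\;\;
y(t) = y(0)\,e^{kt}
```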
00:39:50.460 | - Yeah, a fundamental law of the universe
00:39:53.780 | is that these descendants of apes
00:39:57.820 | will exponentially improve their technology
00:40:00.980 | and one day be taken over by the AGI.
00:40:03.340 | That's built in the simulation.
00:40:06.180 | That'll make the video game fun, whoever created this thing.
00:40:09.580 | So, I mean, since you're wearing a SpaceX shirt,
00:40:12.100 | let me ask.
00:40:13.460 | - I didn't realize that was my spotlight.
00:40:15.420 | - I apologize for calling you out.
00:40:17.140 | - It's on topic, I'll take it.
00:40:18.500 | - So Crew Dragon, the first crewed mission out into space
00:40:23.500 | since the space shuttle.
00:40:29.380 | And the first time ever by a commercial company,
00:40:34.380 | I mean, it's an incredible accomplishment, I think,
00:40:37.780 | but it's also just an incredible,
00:40:41.060 | it inspires imagination amongst people
00:40:43.900 | that this is the first step in a long,
00:40:47.100 | like vibrant journey of humans into space.
00:40:50.100 | - Oh yeah.
00:40:50.940 | - So what are your, how do you feel?
00:40:52.820 | Is this exciting to you?
00:40:55.140 | - Yeah, it is.
00:40:55.980 | I think it's great.
00:40:56.800 | The idea of seeing it basically done by smaller entities.
00:40:59.360 | Instead of by governments.
00:41:00.420 | I mean, it's a heavy collaboration
00:41:02.300 | between SpaceX and NASA in this case,
00:41:04.380 | but moving in the direction of not necessarily requiring
00:41:07.420 | an entire country and its government to make it happen,
00:41:10.240 | but that you can have something closer
00:41:13.580 | to a single company doing it.
00:41:14.820 | We're not there yet,
00:41:15.740 | 'cause it's not like they're unilaterally saying,
00:41:17.980 | like we're just shooting people up into space.
00:41:20.880 | It's just a sign that we're able to do more powerful things
00:41:23.000 | with smaller groups of people.
00:41:24.500 | I find that inspiring.
00:41:26.820 | - Innovate quickly.
00:41:28.580 | - I hope we see people land on Mars in my lifetime.
00:41:31.260 | - Do you think we will?
00:41:32.460 | - I think so.
00:41:33.820 | I think there's a ton of challenges there, right?
00:41:35.580 | Like radiation being kind of the biggest one.
00:41:37.660 | And I think there's a ton of people who look at that
00:41:40.460 | and say, why?
00:41:42.100 | Why would you want to do that?
00:41:43.660 | Let's let the robots do the science for us.
00:41:45.980 | But I think there's enough people
00:41:47.500 | who are like genuinely inspired about broadening
00:41:50.000 | like the worlds that we've touched.
00:41:52.220 | Or people who think about things like
00:41:54.060 | backing up the light of consciousness
00:41:55.860 | with like super long-term visions of terraforming.
00:41:58.260 | Like as long as there's--
00:41:59.100 | - Sorry, backing up the light of consciousness?
00:42:01.100 | - Yeah, the thought that if Earth goes to hell,
00:42:04.980 | we gotta have a backup somewhere.
00:42:06.620 | A lot of people see that as pretty out there
00:42:09.300 | and it's like not in the short-term future.
00:42:10.660 | But I think that's an inspiring thought.
00:42:12.740 | I think that's a reason to like get up in the morning.
00:42:14.740 | And I feel like most employees at SpaceX feel that way too.
00:42:18.700 | - Do you think we'll colonize Mars one day?
00:42:21.280 | - No idea.
00:42:22.120 | Like either AGI kills us first, or if we're like allowed.
00:42:25.420 | I don't know if it'll take--
00:42:26.260 | - If we're allowed.
00:42:27.100 | - Well, like honestly, it would take such a long time.
00:42:30.180 | Like, okay, you might have a small colony, right?
00:42:32.500 | Something like what you see in the Martian.
00:42:35.580 | But not like people living comfortably there.
00:42:38.540 | But if you wanna talk about actual
00:42:41.340 | like second Earth kind of stuff,
00:42:45.780 | that's just like way far out there.
00:42:47.660 | And the future moves so fast that--
00:42:49.380 | - It's hard to predict.
00:42:50.220 | - It's like we might just kill ourselves
00:42:51.660 | before that even becomes viable.
00:42:54.420 | - Yeah, I mean, there's a lot of possibilities
00:42:56.460 | where it could be just, it doesn't have to be on a planet.
00:42:58.700 | We could be floating out in space,
00:43:00.340 | have a space-faring backup solution.
00:43:05.340 | That doesn't have to deal with the constraints
00:43:08.980 | that a planet, I mean, a planet provides
00:43:10.620 | a lot of possibilities and resources,
00:43:12.280 | but also some constraints.
00:43:14.220 | Yeah, I mean, for me, for some reason,
00:43:17.660 | it's a deeply exciting possibility.
00:43:19.660 | - Oh yeah.
00:43:20.660 | Yeah, all of the people who are like skeptical about it
00:43:22.780 | are like, "Why do we care about going to Mars?"
00:43:25.460 | It's like, what makes you care about anything?
00:43:27.380 | That's inspiring.
00:43:28.540 | - It's hard, actually it's hard to hear that
00:43:30.620 | because exactly as you put it on a philosophical level,
00:43:34.660 | it's hard to say why do anything.
00:43:37.580 | I don't know, it's like the people say like,
00:43:40.280 | I've been doing like an insane challenge
00:43:44.280 | last 30 something days.
00:43:46.660 | - Your pull-ups?
00:43:47.500 | - The pull-ups and push-ups.
00:43:48.540 | And like, a bunch of people are like,
00:43:52.680 | "Awesome, you're insane, but awesome."
00:43:56.900 | And then some people are like, "Why?"
00:43:59.500 | - Why do anything?
00:44:00.780 | - I don't know, there's a calling.
00:44:03.620 | I'm with JFK a little bit,
00:44:06.780 | is because we do these things because they're hard.
00:44:09.620 | There's something in the human spirit that says like,
00:44:12.300 | same with like a math problem,
00:44:13.980 | there's something you fail once
00:44:16.540 | and it's like this feeling that,
00:44:19.460 | you know what, I'm not gonna back down from this.
00:44:21.540 | There's something to be discovered in overcoming this thing.
00:44:25.020 | - Well, so what I like about it is,
00:44:27.020 | and I also like this about the moon missions,
00:44:29.100 | sure it's kind of arbitrary, but you can't move the target.
00:44:31.900 | So you can't make it easier
00:44:33.740 | and say that you've accomplished the goal.
00:44:35.980 | And when that happens, it just demands actual innovation.
00:44:39.500 | Like protecting humans from the radiation in space
00:44:43.060 | on the flight there, while there, hard problem,
00:44:45.740 | demands innovation.
00:44:46.740 | You can't move the goalposts to make that easier.
00:44:49.180 | Almost certainly the innovations required
00:44:50.860 | for things like that will be relevant
00:44:52.260 | in a bunch of other domains too.
00:44:54.700 | So like the idea of doing something
00:44:55.820 | merely because it's hard,
00:44:57.420 | it's like loosely productive, great.
00:44:59.980 | But as long as you can't move the goalposts,
00:45:01.660 | there's probably gonna be these secondary benefits
00:45:03.900 | that like we should all strive for.
00:45:07.380 | - Yeah, I mean, it's hard to formulate
00:45:09.740 | the Mars colonization problem
00:45:11.380 | as something that has a deadline, which is the problem.
00:45:15.140 | But if there was a deadline,
00:45:16.840 | then the amount of things we would come up with
00:45:21.740 | by forcing ourselves to figure out how to colonize
00:45:25.700 | that place would be just incredible.
00:45:28.980 | This is what people, like the internet didn't get created
00:45:32.660 | because people sat down and tried to figure out
00:45:35.500 | how do I, you know, send TikTok videos
00:45:39.780 | of myself dancing to people.
00:45:41.380 | They, you know, it was, there's an application.
00:45:44.900 | I mean, actually I don't even know how--
00:45:46.100 | - What do you think the application
00:45:47.620 | for the internet was when it was--
00:45:49.020 | - It must've been very low level basic network communication
00:45:52.240 | within DARPA, like military based,
00:45:55.040 | like how do I send, like a networking,
00:45:58.560 | how do I send information securely between two places?
00:46:02.820 | Maybe it was an encryption.
00:46:04.180 | I'm totally speaking totally outside of my knowledge,
00:46:06.300 | but like it was probably intended
00:46:08.220 | for a very narrow, small group of people.
00:46:10.300 | - Well, so, I mean, it was,
00:46:11.720 | there was like this small community of people
00:46:13.200 | who were really interested in time-sharing computing
00:46:15.100 | and like interactive computing
00:46:16.700 | in contrast with batch processing.
00:46:19.860 | And then the idea that as you set up
00:46:21.340 | like a time-sharing center,
00:46:23.880 | basically meaning you can have multiple people
00:46:25.340 | like logged in and using that like central computer,
00:46:28.300 | why not make it accessible to others?
00:46:30.260 | And this was kind of what I had always thought like,
00:46:31.920 | oh, is this like fringe group that was interested
00:46:34.300 | in this new kind of computing
00:46:35.460 | and they all like got themselves together.
00:46:37.740 | But the thing is like DARPA wouldn't actually,
00:46:39.820 | you wouldn't have the US government funding that
00:46:41.660 | just for the funds of it, right?
00:46:43.920 | In some sense, that's what ARPA was all about
00:46:45.800 | was like just really advanced research
00:46:48.760 | for the sake of having advanced research
00:46:50.240 | and it doesn't have to pay out with utility soon.
00:46:53.160 | But the core parts of its development were happening
00:46:56.420 | like in the middle of the Vietnam War
00:46:58.040 | when there was budgetary constraints all over the place.
00:47:01.380 | I only learned this recently, actually,
00:47:02.880 | like if you look at the documents,
00:47:04.960 | basically justifying the budget for the ARPANET
00:47:09.240 | as they were developing it
00:47:11.060 | and not just keeping it where it was,
00:47:12.780 | but actively growing it
00:47:13.900 | while all sorts of other departments
00:47:15.220 | were having their funding cut 'cause of the war.
00:47:18.580 | A big part of it was national defense
00:47:20.580 | in terms of having like a more robust communication system,
00:47:23.820 | like the idea of packet switching versus circuit switching.
00:47:26.460 | You could kind of make this case
00:47:27.580 | that in some calamitous circumstance
00:47:29.600 | where a central location gets nuked,
00:47:32.500 | this is a much more resilient way
00:47:34.560 | to still have your communication lines
00:47:36.500 | that like traditional telephone lines
00:47:39.720 | weren't as resilient to,
00:47:41.400 | which I just found very interesting.
00:47:43.360 | - Yeah. - That even something
00:47:45.960 | that we see as so happy-go-lucky
00:47:47.280 | is just a bunch of computer nerds
00:47:48.600 | trying to get like interactive computing out there.
00:47:51.000 | The actual like thing that made it funded
00:47:54.440 | and thing that made it advance when it did
00:47:57.080 | was because of this direct national security
00:47:59.860 | question and concern.
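(A toy sketch, not from the conversation, of why packet switching is more resilient than a fixed circuit: packets can be re-routed over any surviving path, so losing one node need not sever communication. The network and the failed node below are made-up examples.)

```python
# Toy illustration: packet switching can route around a failed node;
# a fixed end-to-end circuit through that node cannot.
from collections import deque

# Hypothetical four-node network (adjacency list).
NETWORK = {
    "A": ["B", "D"],
    "B": ["A", "C"],
    "C": ["B", "D"],
    "D": ["A", "C"],
}

def find_route(network, src, dst, failed=frozenset()):
    """Breadth-first search for any surviving path from src to dst."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in network[node]:
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving route

print(find_route(NETWORK, "A", "C"))                # e.g. ['A', 'B', 'C']
print(find_route(NETWORK, "A", "C", failed={"B"}))  # re-routed: ['A', 'D', 'C']
```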
00:48:00.960 | - I don't know if you've read it.
00:48:02.440 | I haven't read it.
00:48:03.320 | I've been meaning to read it,
00:48:04.440 | but Neil deGrasse Tyson actually came out
00:48:06.380 | with a book that talks about like science
00:48:09.960 | in the context of the military,
00:48:11.240 | like basically saying all the great science we've done
00:48:14.480 | in the 20th century was like because of the military.
00:48:18.320 | I mean, he paints a positive.
00:48:19.680 | It's not like a critical.
00:48:21.840 | It's not, you know, a lot of people say
00:48:23.000 | like military industrial complex and so on.
00:48:25.680 | Another way to see the military and national security
00:48:28.160 | is like a source of, like you said, deadlines
00:48:30.980 | and like hard things you can't move.
00:48:33.080 | Like almost, you know, almost like scaring yourself
00:48:36.860 | into being productive.
00:48:38.620 | - It is that.
00:48:39.460 | I mean, the Manhattan Project is a perfect example,
00:48:41.660 | probably the quintessential example.
00:48:43.160 | That one is a little bit more macabre than others
00:48:46.460 | because of like what they were building,
00:48:48.180 | but in terms of how many focused, smart hours
00:48:52.140 | of human intelligence get pointed towards a topic per day,
00:48:56.540 | you're just maxing it out with that sense of worry.
00:48:58.540 | In that context, everyone there was saying like,
00:49:00.380 | we've got to get the bomb before Hitler does,
00:49:02.360 | and that just lights a fire under you.
00:49:04.860 | Again, like the circumstances macabre,
00:49:08.260 | but I think that's actually pretty healthy,
00:49:09.940 | especially for researchers that are otherwise
00:49:11.620 | going to be really theoretical.
00:49:13.540 | To take these like theorizers and say,
00:49:16.020 | make this real physical thing happen,
00:49:18.420 | meaning a lot of it is going to be unsexy.
00:49:20.440 | A lot of it's going to be like young Feynman
00:49:22.500 | sitting there kind of inventing a notion of computation
00:49:26.740 | in order to like compute what they needed to compute
00:49:29.300 | more quickly with like the rudimentary automated tools
00:49:31.820 | that they had available.
00:49:33.020 | I think you see this with Bell Labs also
00:49:36.140 | where you've got otherwise very theorizing minds
00:49:39.500 | in very pragmatic contexts that I think is like
00:49:42.180 | really helpful for the theory
00:49:43.340 | as well as for the applications.
00:49:44.940 | So I think that stuff can be positive for progress.
00:49:50.220 | - You mentioned Bell Labs and Manhattan Project.
00:49:52.720 | This kind of makes me curious about the things you've created,
00:49:56.980 | which are quite singular.
00:49:58.580 | Like if you look at all YouTube,
00:50:01.140 | or just not YouTube, it doesn't matter what it is.
00:50:03.940 | It's just teaching content, art, doesn't matter.
00:50:06.780 | It's like, yep, that's Grant, right?
00:50:11.380 | That's unique.
00:50:12.420 | I know your teaching style and everything.
00:50:14.620 | Does it, Manhattan Project and Bell Labs
00:50:19.740 | was like famously a lot of brilliant people,
00:50:21.760 | but there's a lot of them.
00:50:23.380 | They play off of each other.
00:50:25.100 | So like my question for you is that, does it get lonely?
00:50:28.820 | - Honestly, that right there I think is the biggest part
00:50:31.740 | of my life that I would like to change in some way
00:50:34.300 | that I look at a Bell Labs type situation
00:50:38.420 | and I'm like, goddamn, I love that whole situation.
00:50:40.900 | And I'm so jealous of it.
00:50:42.100 | And you're like reading about Hamming
00:50:44.180 | and then you see that he also shared an office with Shannon.
00:50:46.660 | And you're like, of course he did.
00:50:47.620 | Of course they shared an office.
00:50:48.740 | That's how these ideas get like--
00:50:50.700 | - And they actually probably very likely worked separately.
00:50:53.740 | - Yeah, totally separate.
00:50:55.260 | - But there's a literally, and sorry to interrupt,
00:50:57.140 | there's literally a magic that happens
00:50:59.620 | when you run into each other,
00:51:01.580 | like on the way to like getting a snack or something.
00:51:05.820 | - Conversations you overhear,
00:51:07.180 | it's other projects you're pulled into.
00:51:08.620 | It's like puzzles that colleagues are sharing,
00:51:09.980 | like all of that.
00:51:11.060 | I have some extent of it just because
00:51:14.060 | I try to stay well connected in communities
00:51:16.420 | of people who think in similar ways.
00:51:19.540 | But it's not in the day-to-day in the same way,
00:51:22.680 | which I would like to fix somehow.
00:51:24.940 | - That's one of the, I would say one of the biggest,
00:51:28.540 | well, one of the many drawbacks,
00:51:33.020 | negative things about this current pandemic
00:51:38.140 | is that whatever the term is,
00:51:40.280 | but like chance collisions are significantly reduced.
00:51:44.220 | - I saw, I don't know why I saw this,
00:51:46.540 | but on my brother's work calendar,
00:51:49.300 | he had a scheduled slot with someone
00:51:52.620 | that he scheduled a meeting.
00:51:53.700 | And the title of the whole meeting was,
00:51:57.140 | no specific agenda,
00:51:58.220 | I just missed the happenstance serendipitous conversations
00:52:00.820 | that we used to have,
00:52:01.660 | which the pandemic and remote work
00:52:03.340 | has so cruelly taken away from us.
00:52:05.700 | - Brilliant.
00:52:06.540 | - That was the whole title of the meeting.
00:52:07.860 | - That's brilliant.
00:52:08.700 | - I'm like, that's the way to do it.
00:52:09.620 | You just schedule those things.
00:52:10.900 | You schedule the serendipitous interaction.
00:52:13.540 | - It's like, I mean, you can't do it in academic setting,
00:52:15.660 | but it's basically like going to a bar
00:52:17.860 | and sitting there just for the strangers you might meet,
00:52:21.020 | just the strangers or striking up a conversation
00:52:24.380 | with strangers on the train.
00:52:25.980 | Harder to do when you're deeply,
00:52:30.100 | like maybe myself or maybe a lot of academic types
00:52:33.700 | who are like introverted
00:52:34.860 | and avoid human contact as much as possible.
00:52:38.340 | So it's nice when it's forced, those chance collisions,
00:52:40.720 | but maybe scheduling is a possibility.
00:52:43.660 | But for the most part, do you work alone?
00:52:47.740 | Like, I'm sure you struggle,
00:52:49.860 | like a lot, like this,
00:52:53.180 | like you probably hit moments when you look at this
00:52:56.500 | and you say like, this is the wrong way to show it.
00:53:00.940 | It's the wrong way to visualize it.
00:53:02.500 | I'm making it too hard for myself.
00:53:04.180 | I'm going down the wrong direction.
00:53:05.660 | This is too long.
00:53:06.500 | This is too short.
00:53:07.560 | All those self-doubt that's like could be paralyzing.
00:53:10.900 | Like, what do you do in those moments?
00:53:12.980 | - Honestly, I actually much prefer like work
00:53:16.620 | to be a solitary affair for me.
00:53:18.580 | That's like a personality quirk.
00:53:19.940 | I would like it to be in an environment with others
00:53:21.980 | and like collaborative in the sense of ideas exchanged.
00:53:24.580 | But those phenomena you're describing
00:53:26.060 | when you say this is too long, this is too short,
00:53:27.940 | this visualization sucks.
00:53:29.740 | It's way easier to say that to yourself
00:53:31.400 | than it is to say to a collaborator.
00:53:33.200 | And I know that's just a thing that I'm not good at.
00:53:36.340 | So in that way, it's very easy to just throw away a script
00:53:39.740 | because the script isn't working.
00:53:41.100 | It's hard to tell someone else they should do the same.
00:53:43.180 | - Actually, last time we talked,
00:53:44.340 | I think it was like very close to me talking to Don Knuth.
00:53:47.500 | It was kind of cool.
00:53:48.340 | Like two people that--
00:53:49.660 | - Can't believe you got that interview.
00:53:51.500 | - It's the hard, no, can I brag about something?
00:53:54.540 | - Please.
00:53:55.860 | - My favorite thing is Don Knuth, after we did the interview,
00:53:59.900 | he offered to go out to hot dogs with me.
00:54:02.580 | We get hot dogs.
00:54:04.100 | That was never, like people ask me,
00:54:05.700 | "What's the favorite interview you've ever done?"
00:54:07.340 | I mean, that has to be.
00:54:09.420 | But unfortunately, I couldn't.
00:54:11.060 | I had a thing after.
00:54:12.700 | So I had to turn down Don Knuth--
00:54:14.420 | - You missed Knuth dogs?
00:54:15.740 | - Knuth dogs.
00:54:16.980 | Sorry, so that was a little bragging,
00:54:18.420 | but the hot dogs, he's such a sweet.
00:54:20.380 | So, but the reason I bring that up
00:54:23.620 | is he works through problems alone as well.
00:54:27.980 | He prefers that struggle, the struggle of it.
00:54:30.340 | You know, writers like Stephen King,
00:54:35.780 | you know, often talk about like their process
00:54:38.020 | of, you know, what they do,
00:54:40.020 | like what they eat when they wake up,
00:54:42.460 | like when they sit down, like how they like their desk.
00:54:46.740 | You know, on a perfectly productive day,
00:54:49.940 | like what they like to do, how long they like to work for,
00:54:54.540 | what enables them to think deeply, all that kind of stuff.
00:54:58.060 | Hunter S. Thompson did a lot of drugs.
00:55:00.260 | Everybody has their own thing.
00:55:01.800 | What's, do you have a thing?
00:55:04.420 | Is there, if you were to lay out a perfect, productive day,
00:55:09.340 | what would that schedule look like, do you think?
00:55:11.780 | - Part of that's hard to answer,
00:55:14.500 | 'cause like the mode of work I do changes a lot
00:55:18.460 | from day to day.
00:55:20.020 | Like some days I'm writing.
00:55:21.380 | The thing I have to do is write a script.
00:55:22.820 | Some days I'm animating.
00:55:23.660 | The thing I have to do is animate.
00:55:25.060 | Sometimes I'm like working on the animation library.
00:55:27.140 | The thing I have to do is like a little,
00:55:29.100 | I'm not a software engineer,
00:55:30.100 | but something in the direction of software engineering.
00:55:32.460 | Some days it's like a variant of research.
00:55:34.460 | It's like, learn this topic well
00:55:35.920 | and try to learn it differently.
00:55:37.460 | So those are like four very different modes of what it is,
00:55:41.380 | some days it's like get through the email backlog
00:55:43.220 | of people I've been, tasks I've been putting off.
00:55:46.740 | - It goes research, scripting,
00:55:48.940 | like the idea starts with research and then there's scripting
00:55:52.060 | and then there's programming and then there's the showtime.
00:55:56.700 | - And the research side, by the way,
00:55:58.220 | like what's I think a problematic way to do it
00:56:00.420 | is to say I'm starting this project
00:56:02.180 | and therefore I'm starting the research.
00:56:03.660 | Instead, it should be that you're like ambiently learning
00:56:05.620 | a ton of things just in the background.
00:56:08.020 | And then once you feel like you have the understanding
00:56:10.280 | for one, you put it on the list of things
00:56:11.740 | that there can be a video for.
00:56:13.660 | Otherwise, either you're gonna end up roadblocked forever
00:56:16.820 | or you're just not gonna like have a good way
00:56:19.860 | of talking about it.
00:56:20.860 | But still some of the days it's like the thing to do
00:56:23.700 | is learn new things.
00:56:25.020 | - So what's the most painful one?
00:56:26.340 | I think you mentioned scripting.
00:56:28.860 | - Scripting is, yeah, that's the worst.
00:56:30.720 | Yeah, writing is the worst.
00:56:31.860 | - So what's your, on a perfectly,
00:56:33.420 | so let's take the hardest one.
00:56:35.300 | What's a perfectly productive day?
00:56:37.260 | You wake up and it's like, damn it,
00:56:39.060 | this is the day I need to do some scripting.
00:56:41.620 | And like you didn't do anything last two days,
00:56:43.540 | so you came up with excuses to procrastinate,
00:56:45.660 | so today must be the day.
00:56:47.380 | - Yeah, I wake up early.
00:56:49.300 | I guess I exercise.
00:56:52.460 | And then I turn the internet off.
00:56:54.900 | If we're writing, yeah, that's what's required
00:56:59.980 | is having the internet off.
00:57:01.100 | And then maybe you keep notes on the things
00:57:02.500 | that you wanna Google when you're allowed
00:57:03.940 | to have the internet again.
00:57:05.140 | I'm not great about doing that,
00:57:06.360 | but when I do, that makes it happen.
00:57:08.700 | And then when I hit writer's block,
00:57:10.380 | like the solution to writer's block is to read.
00:57:13.020 | Doesn't even have to be related,
00:57:14.180 | just read something different,
00:57:15.740 | just for like 15 minutes, half an hour,
00:57:17.900 | and then go back to writing.
00:57:19.860 | That, when it's a nice cycle, I think can work very well.
00:57:22.420 | - And when you're writing the script,
00:57:23.820 | you don't know where it ends, right?
00:57:26.060 | Like you have a--
00:57:27.820 | - Problem-solving videos, I know where it ends.
00:57:29.860 | Expositional videos, I don't know where it ends.
00:57:32.740 | - Like coming up with the magical thing
00:57:36.140 | that makes this whole story,
00:57:37.460 | like ties this whole story together.
00:57:39.500 | - When does that happen?
00:57:40.660 | - That's the thing that makes it such
00:57:43.460 | that a topic gets put on the list of like--
00:57:45.260 | - Oh, that's an issue.
00:57:46.100 | - Yeah, you shouldn't start the project
00:57:48.180 | unless there's one of those.
00:57:49.420 | - And you have so many nice bag,
00:57:51.580 | you have such a big bag of aha moments already
00:57:54.460 | that you could just pull at it.
00:57:56.580 | That's one of the things,
00:57:57.700 | and one of the sad things about time
00:58:01.940 | and that nothing lasts forever,
00:58:04.180 | and that we're all mortal.
00:58:05.300 | Let's not get into that.
00:58:06.580 | (both laughing)
00:58:08.980 | - That discussion is, you know,
00:58:12.300 | if I see like, even when I asked for people to ask,
00:58:15.820 | like I did a call for questions
00:58:18.020 | and people wanna ask you questions,
00:58:19.220 | and there's so many requests from people
00:58:20.780 | about like certain videos they would love you to do.
00:58:23.380 | It's such a pile.
00:58:25.020 | And I think that's a sign of like admiration
00:58:29.900 | from people for sure.
00:58:31.420 | But it's like, it makes me sad
00:58:32.740 | 'cause like whenever I see them, people give ideas,
00:58:35.160 | they're all like very often really good ideas.
00:58:38.540 | And it's like, it's such a,
00:58:41.580 | makes me sad in the same kind of way
00:58:44.040 | when I go through a library or through a bookstore,
00:58:47.300 | you see all these amazing books
00:58:48.780 | that you'll never get to open.
00:58:50.280 | (laughing)
00:58:52.620 | So, so yeah, so you did, yeah.
00:58:55.460 | - Gotta enjoy the ones that you have.
00:58:56.860 | Enjoy the books that are open
00:58:58.300 | and don't let yourself lament the ones that stay closed.
00:59:02.000 | - What else?
00:59:03.680 | Is there any other magic to that day?
00:59:05.340 | Do you try to dedicate like a certain number of hours?
00:59:08.620 | Do you, Cal Newport has this deep work kind of idea.
00:59:12.940 | - There's systematic people who like get really on top of,
00:59:16.380 | you know, their checklist of what they're gonna do in the day
00:59:18.740 | and they like count their hours.
00:59:20.200 | And I am not a systematic person in that way.
00:59:22.540 | It's, which is probably a problem.
00:59:24.620 | I very likely would get more done
00:59:26.540 | if I was systematic in that way, but that doesn't happen.
00:59:29.400 | So, you know, you talk to me, talk to me later in life
00:59:33.660 | and maybe I'll have like changed my ways
00:59:35.500 | and give you a very different answer.
00:59:37.420 | - I think Benjamin Franklin, like later in life,
00:59:39.860 | figured out the rigor.
00:59:41.340 | He has these like very rigorous schedules
00:59:43.340 | and what, how to be productive.
00:59:45.020 | - I think those schedules are much more fun to write.
00:59:47.540 | Like, it's very fun to like write a schedule
00:59:49.200 | and make a blog post about like the perfect productive day.
00:59:52.740 | That like might work for one person,
00:59:54.300 | but I don't know how much people get out of like reading them
00:59:56.420 | or trying to adopt someone else's style.
00:59:59.020 | - And I'm not even sure that they're ever followed.
01:00:01.620 | - Yeah, exactly.
01:00:02.460 | You're always gonna write it as the best version of yourself.
01:00:05.580 | You're not going to explain the phenomenon
01:00:08.420 | of like wanting to get out of the bed,
01:00:10.140 | but not really wanting to get out of the bed and all of that.
01:00:13.440 | - And just like zoning out for random reasons
01:00:16.180 | or the one that people probably don't touch at all is,
01:00:20.500 | I try to check social media once a day,
01:00:22.940 | but I'm like only, so I post and that's it.
01:00:26.340 | When I post, I check the previous days.
01:00:28.700 | That's like my, what I try to do.
01:00:31.740 | That's what I do like 90% of the days,
01:00:34.020 | but then I'll go, I'll have like a two week period
01:00:36.340 | where it's just like, I'm checking the internet.
01:00:39.300 | Like, I mean, it's probably some scary number of times.
01:00:43.580 | - I think a lot of people can resonate with that.
01:00:45.220 | I think it's a legitimate addiction.
01:00:47.420 | It's like, it's a dopamine addiction.
01:00:49.340 | And I don't know if it's a problem
01:00:52.220 | because as long as it's a kind of socializing,
01:00:54.340 | like if you're actually engaging with friends
01:00:55.920 | and engaging with other people's ideas,
01:00:58.460 | I think it can be really useful.
01:01:00.580 | - Well, I don't know.
01:01:01.420 | So like, for sure, I agree with you,
01:01:03.540 | but it's definitely an addiction because for me,
01:01:07.560 | I think it's true for a lot of people.
01:01:09.740 | I am very cognizant of the fact
01:01:11.740 | I just don't feel that happy.
01:01:14.440 | If I look at a day where I've checked social media a lot,
01:01:18.380 | like if I just aggregate, I did a self-report,
01:01:21.340 | I'm sure I would find that I'm just like literally
01:01:24.500 | on like less happy with my life and myself
01:01:28.180 | after I've done that check.
01:01:29.780 | When I check it once a day, I'm very, like I'm happy.
01:01:33.940 | Even like, 'cause I've seen it.
01:01:35.900 | Okay, one way to measure that is
01:01:37.660 | when somebody says something not nice to you on the internet
01:01:42.420 | is like when I check it once a day,
01:01:44.400 | I'm able to just like, like I smile,
01:01:46.620 | like I virtually, I think about them positively,
01:01:50.120 | empathetically, I send them love.
01:01:51.580 | I don't ever respond,
01:01:53.400 | but I just feel positively about the whole thing.
01:01:56.140 | If I check it, if I check like more than that,
01:01:59.440 | it starts eating at me.
01:02:01.400 | Like it start, there's an eating thing that happens,
01:02:05.340 | like anxiety, it occupies a part of your mind
01:02:09.060 | that's not, doesn't seem to be healthy.
01:02:10.940 | Same with, I mean, you put stuff out on YouTube.
01:02:15.780 | I think it's important.
01:02:17.540 | I think you have a million dimensions
01:02:18.940 | that are interesting to you,
01:02:19.780 | but one of the interesting ones is the study of education
01:02:24.780 | and the psychological aspect of putting stuff up on YouTube.
01:02:28.940 | I like now have completely stopped checking statistics
01:02:33.380 | of any kind.
01:02:34.620 | I've released an episode, 100 with my dad,
01:02:38.420 | conversation with my dad.
01:02:39.660 | He checks, he's probably listening to this, stop.
01:02:43.320 | He checks the number of views on his video,
01:02:48.260 | on his conversation.
01:02:49.540 | So he discovered like a reason,
01:02:51.220 | he's new to this whole addiction and he just checks.
01:02:54.360 | And he like, he'll text me or write to me,
01:02:56.540 | I just passed Dawkins in the top.
01:02:59.140 | (both laughing)
01:03:02.140 | - Oh my God, I love that so much.
01:03:04.300 | - Yeah, so he's--
01:03:05.940 | - Oh, can I tell you a funny story in that effect
01:03:07.600 | of like parental use of YouTube?
01:03:10.420 | Early on in the channel, my mom would like text me.
01:03:14.380 | She's like, the channel has had 990,000 views.
01:03:19.180 | The channel has had 991,000 views.
01:03:21.140 | I'm like, oh, that's cute.
01:03:21.980 | She's going to the little part on the about page
01:03:23.780 | where you see the total number of channel views.
01:03:25.780 | No, she didn't know about that.
01:03:27.820 | She had been going every day through all the videos
01:03:30.940 | and then adding them up.
01:03:31.900 | - Adding them up.
01:03:32.820 | - And she thought she was like doing me this favor
01:03:34.620 | of providing me this like global analytic
01:03:36.740 | that otherwise wouldn't be visible.
01:03:39.060 | - That's awesome.
01:03:39.900 | - It's just like this addiction
01:03:40.780 | where you have some number you want to follow
01:03:42.340 | and like, yeah, it's funny that your dad had this.
01:03:44.900 | I think a lot of people have it.
01:03:47.020 | - I think that's probably a beautiful thing for like parents
01:03:49.540 | 'cause they're legitimately, they're proud.
01:03:53.280 | - Yeah, it's born of love.
01:03:55.020 | - It's great.
01:03:56.500 | - The downside I feel, one of them,
01:03:59.420 | is this is one interesting experience
01:04:02.900 | that you probably don't know much about
01:04:04.780 | 'cause comments on your videos are super positive,
01:04:08.020 | but people judge the quality of how something went,
01:04:12.260 | like I see that with these conversations, by the comments.
01:04:16.260 | I'm not talking about like, you know,
01:04:20.540 | people in their 20s and their 30s.
01:04:22.500 | I'm talking about like CEOs of major companies
01:04:25.300 | who don't have time.
01:04:27.060 | They basically, they literally,
01:04:29.620 | this is their evaluation metric.
01:04:31.540 | They're like, ooh, the comments seem to be positive
01:04:33.500 | and that's really concerning to me.
01:04:35.540 | - Most important lesson for any content creator to learn
01:04:38.700 | is that the commenting public
01:04:40.220 | is not representative of the actual public.
01:04:42.740 | And this is easy to see.
01:04:43.940 | Ask yourself, how often do you write comments
01:04:46.060 | on YouTube videos?
01:04:47.180 | Most people will realize I never do it.
01:04:49.580 | Some people realize they do,
01:04:50.780 | but the people who realize they never do it
01:04:53.100 | should understand that that's a sign
01:04:54.700 | the kind of people who are like you
01:04:56.100 | aren't the ones leaving comments.
01:04:58.020 | And I think this is important in a number of respects.
01:04:59.580 | Like in my case, I think I would think my content
01:05:02.740 | was better than it was if I just read comments
01:05:04.780 | 'cause people are super nice.
01:05:06.340 | The thing is, the people who are bored by it,
01:05:08.940 | are put off by it in some way, are frustrated by it,
01:05:11.740 | usually they just go away.
01:05:13.340 | They're certainly not gonna watch the whole video,
01:05:14.780 | much less leave a comment on it.
01:05:16.620 | So there's a huge under-representation
01:05:18.340 | of negative feedback, well-intentioned negative feedback
01:05:21.460 | because very few people actively do that.
01:05:23.460 | Watch the whole thing that they dislike,
01:05:25.060 | figure out what they disliked, articulate what they dislike.
01:05:28.900 | There's plenty of negative feedback
01:05:30.020 | that's not well-intentioned, but for that golden kind.
01:05:33.140 | I think a lot of YouTuber friends I have,
01:05:37.580 | at least have gone through phases of anxiety
01:05:39.620 | about the nature of comments that stem from
01:05:44.020 | basically just this, that it's people
01:05:46.140 | who aren't necessarily representative
01:05:47.820 | of who they were going for, misinterpreted
01:05:49.300 | what they were trying to say or whatever have you.
01:05:51.820 | Or we're focusing on things like personal appearances
01:05:54.100 | as opposed to like substance.
01:05:55.700 | And they come away thinking like,
01:05:57.860 | oh, that's what everyone thinks, right?
01:05:59.380 | That's what everyone's response to this video was.
01:06:02.100 | But a lot of the people who had the reaction
01:06:03.660 | you wanted them to have,
01:06:05.260 | they probably didn't write it down.
01:06:06.980 | So very important to learn.
01:06:09.540 | It also translates to realizing that you're not
01:06:13.420 | as important as you might think you are, right?
01:06:15.100 | Because all of the people commenting
01:06:16.620 | are the ones who love you the most
01:06:17.820 | and are really asking you to create certain things
01:06:20.660 | or mad that you didn't create a past thing.
01:06:22.860 | I don't know, I have such a problem.
01:06:26.660 | I have a very real problem with making promises
01:06:28.700 | about a type of content that I'll make
01:06:30.420 | and then either not following up on it soon
01:06:32.860 | or just never following up on it.
01:06:34.580 | - Yeah, you actually, last time we talked,
01:06:36.460 | I think, I'm not sure, promised to me
01:06:38.340 | that you'll have music incorporated into your-
01:06:41.460 | - I'll share it with you in private.
01:06:43.500 | But there's an example of what I had in mind.
01:06:45.700 | I did a version of it and I'm like,
01:06:48.500 | "Oh, I think there's a better version of this
01:06:50.140 | "that might exist one day."
01:06:51.980 | - So it's now on the back burner.
01:06:54.420 | It's sitting there.
01:06:56.220 | - It was like a live performance of this one thing.
01:06:57.900 | I think next circumstance that I'm doing
01:06:59.980 | another recorded live performance
01:07:01.500 | that fits having that in a better recording context,
01:07:04.620 | maybe I'll make it nice and public.
01:07:05.820 | - Maybe a while.
01:07:06.940 | - But exactly, right?
01:07:09.020 | The point I was gonna make though is like,
01:07:10.300 | I know I'm bad about following up on stuff,
01:07:13.220 | which is an actual problem.
01:07:14.660 | It's born of the fact that I have a sense
01:07:16.780 | of what will be good content and when it won't be.
01:07:19.300 | But this can actually be incredibly disheartening
01:07:22.060 | 'cause a ton of comments that I see
01:07:24.140 | are people who are frustrated,
01:07:26.740 | usually in a benevolent way,
01:07:27.780 | that I haven't followed through on X and X, which I get.
01:07:31.220 | And I should do that.
01:07:32.500 | But what's comforting thought for me
01:07:34.180 | is that when there's a topic I haven't promised
01:07:36.300 | but I am working on and I'm excited about,
01:07:38.340 | it's like the people who would really like this
01:07:40.660 | don't know that it's coming
01:07:41.780 | and don't know to comment to that effect.
01:07:43.940 | And the commenting public that I'm seeing
01:07:45.580 | is not representative of who I think
01:07:47.860 | this other project will touch meaningfully.
01:07:50.220 | - Yeah, so focus on the future,
01:07:51.340 | on the thing you're creating now,
01:07:52.540 | just like the art of it.
01:07:55.060 | One of the people that's really inspiring to me
01:07:58.260 | in that regard, 'cause I've really seen it in person,
01:08:02.140 | Joe Rogan, he doesn't read comments, but not just that.
01:08:06.900 | He doesn't give a damn.
01:08:09.540 | He like legitimate, he's not like clueless about it.
01:08:13.780 | He's like, just like the richness
01:08:16.460 | and the depth of a smile he has
01:08:18.500 | when he just experiences the moment with you, like offline.
01:08:21.940 | You can tell he doesn't give a damn about like,
01:08:25.300 | about anything, about what people think about
01:08:30.220 | whether if it's on a podcast, you talk to him
01:08:31.940 | or whether offline about just, it's not there.
01:08:36.380 | Like what other people think,
01:08:37.940 | how even like what the rest of the day looks like
01:08:41.500 | is just deeply in the moment or like,
01:08:44.940 | especially like is what we're doing gonna make
01:08:48.420 | for a good Instagram photo or something like that.
01:08:50.660 | He doesn't think like that at all.
01:08:52.740 | It's, I think for actually quite a lot of people,
01:08:55.940 | he's an inspiration in that way, but it was,
01:08:58.460 | and in real life, he shows that you can be very successful
01:09:02.380 | not giving a damn about comments.
01:09:07.180 | And it sounds bad not to read comments
01:09:11.660 | 'cause it's like, well, there's a huge number of people
01:09:13.420 | who are deeply passionate about what you do.
01:09:15.060 | So you're ignoring them.
01:09:16.820 | But at the same time, the nature of our platforms
01:09:20.500 | is such that the cost of listening to all the positive people
01:09:25.500 | who are really close to you, who are incredible people
01:09:28.180 | have been, made a great community
01:09:31.780 | that you can learn a lot from.
01:09:32.980 | The cost of listening to those folks
01:09:35.380 | is also the cost of your psychology
01:09:40.380 | slowly being degraded by the natural underlying toxicity
01:09:45.540 | of the internet.
01:09:46.380 | - Engage with a handful of people deeply
01:09:49.500 | rather than like as many people as you can in a shallow way.
01:09:52.620 | I think that's a good lesson for social media usage.
01:09:55.180 | - Platforms in general, yeah.
01:09:58.300 | - Choose just a handful of things to engage with
01:10:00.460 | and engage with it very well in a way that you feel proud of
01:10:03.140 | and don't worry about the rest.
01:10:04.740 | Honestly, I think the best social media platform is texting.
01:10:07.740 | That's my favorite.
01:10:10.100 | That's my go-to social media platform.
01:10:12.260 | - Well, yeah, the best social media interaction
01:10:15.380 | is like real life, not social media, but social interaction.
01:10:18.580 | - Well, yeah, no question there.
01:10:20.140 | I think everyone should agree with that.
01:10:21.820 | - Which sucks because it's been challenged now
01:10:24.660 | with the current situation.
01:10:25.980 | And we're trying to figure out
01:10:27.820 | what kind of platform can be created
01:10:30.060 | that we can do remote communication
01:10:31.660 | that still is effective.
01:10:33.020 | It's important for education.
01:10:34.420 | It's important for just--
01:10:35.660 | - That is the question of education right now.
01:10:37.580 | - Yeah.
01:10:39.060 | So on that topic, you've done a series of live streams
01:10:41.940 | called Lockdown Math.
01:10:43.260 | And you went live, which is different than you usually do.
01:10:48.620 | Maybe one, can you talk about how'd that feel?
01:10:53.020 | What's that experience like?
01:10:54.260 | Like in your own, when you look back,
01:10:56.460 | like is that an effective way,
01:10:58.540 | did you find, of being able to teach?
01:11:00.940 | And if so, is there a lessons for this world
01:11:05.020 | where all of these educators are now trying to figure out
01:11:08.780 | how the heck do I teach remotely?
01:11:11.100 | - For me, it was very different, as different as you can get.
01:11:13.340 | I'm on camera, which I'm usually not.
01:11:15.140 | I'm doing it live, which is nerve wracking.
01:11:18.100 | It was a slightly different like level of topics,
01:11:20.860 | although realistically, I'm just talking about things
01:11:22.540 | I'm interested in no matter what.
01:11:24.420 | I think the reason I did that was this thought
01:11:26.980 | that a ton of people are looking to learn remotely,
01:11:29.420 | the rate at which I usually put out content
01:11:31.060 | is too slow to be actively helpful.
01:11:33.100 | Let me just do some biweekly lectures
01:11:34.820 | that if you're looking for a place to point your students,
01:11:37.020 | if you're a student looking for a place
01:11:38.300 | to be edified about math, just tune in at these times.
01:11:40.980 | And in that sense, I think it was a success
01:11:43.900 | for those who followed with it.
01:11:45.260 | It was a really rewarding experience for me
01:11:47.580 | to see how people engaged with it.
01:11:49.780 | Part of the fun of the live interaction was to actually,
01:11:53.740 | like I'd do these live quizzes
01:11:54.900 | and see how people would answer
01:11:55.900 | and try to shape the lesson based on that
01:11:57.380 | or see what questions people were asking in the audience.
01:11:59.980 | I would love to, if I did more things like that
01:12:02.340 | in the future, kind of tighten that feedback loop even more.
01:12:05.340 | I think for, you know, you asked about like,
01:12:09.060 | if this can be relevant to educators,
01:12:10.860 | like 100% online teaching is basically
01:12:13.660 | a form of live streaming now.
01:12:15.340 | And usually it happens through Zoom.
01:12:17.300 | I think if teachers view what they're doing
01:12:20.220 | as a kind of performance and a kind of livestream performance
01:12:23.540 | that would probably be pretty healthy
01:12:26.380 | because Zoom can be kind of awkward.
01:12:28.420 | And I wrote up this little blog post actually
01:12:31.140 | just on like, just what our setup looked like
01:12:33.420 | if you want to adopt it yourself
01:12:34.740 | and how to integrate like the broadcasting software OBS
01:12:38.020 | with Zoom or things like that.
01:12:39.420 | - It'd be nice to pause on that.
01:12:40.900 | I mean, yeah, maybe we could look at the blog post,
01:12:42.780 | but it looked really nice.
01:12:45.500 | - The thing is, I knew nothing about any of that stuff
01:12:47.700 | before I started.
01:12:48.540 | I had a friend who knew a fair bit.
01:12:50.700 | And so he kind of helped show me the ropes.
01:12:52.500 | One of the things that I realized is that
01:12:54.980 | you could, as a teacher, like it doesn't take that much
01:12:57.460 | to make things look and feel pretty professional.
01:13:00.260 | Like one component of it is as soon as you hook things up
01:13:02.940 | with the broadcasting software,
01:13:04.100 | rather than just doing like screen sharing,
01:13:06.060 | you can set up different scenes
01:13:07.580 | and then you can like have keyboard shortcuts
01:13:09.220 | to transition between those scenes.
01:13:11.220 | So you don't need a production studio
01:13:12.740 | with a director calling like, go to camera three,
01:13:14.500 | go to camera two, like onto the screen capture.
01:13:16.680 | Instead, you can have control of that.
01:13:18.700 | And it took a little bit of practice
01:13:19.980 | and I would mess it up now and then.
01:13:21.300 | But I think I had it decently smooth such that,
01:13:24.260 | you know, I'm talking to the camera
01:13:25.380 | and then we're doing something on the paper.
01:13:26.680 | Then we're doing like a,
01:13:28.660 | playing with a Desmos graph or something.
01:13:30.780 | And something that I think in the past
01:13:32.540 | would have required a production team,
01:13:33.780 | you can actually do as a solo operation.
01:13:36.020 | And in particular as a teacher.
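(For anyone wanting to try that kind of solo scene-switching, here is a minimal sketch; it is an assumption on my part, not the setup described in the blog post. It presumes OBS with the obs-websocket plugin (4.x protocol) and the obs-websocket-py Python client; the host, port, password, and scene names are placeholders.)

```python
# Minimal sketch of driving OBS scene changes from a script,
# so one person can act as their own "director" during a lecture.
# Assumes the obs-websocket plugin (4.x protocol) is enabled in OBS and
# the obs-websocket-py package is installed; the credentials and scene
# names below are placeholders for whatever your own setup uses.
from obswebsocket import obsws, requests

SCENES = ["Face Camera", "Paper Overhead", "Screen Capture"]  # hypothetical scene names

ws = obsws("localhost", 4444, "your-password")
ws.connect()
try:
    for scene in SCENES:
        input(f"Press Enter to switch to '{scene}'...")
        ws.call(requests.SetCurrentScene(scene))
finally:
    ws.disconnect()
```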
01:13:37.900 | And I think it's worth it to try to do that
01:13:39.740 | because two reasons.
01:13:41.980 | One, you might get more engagement from the students.
01:13:44.600 | But the biggest reason, I think one of the like
01:13:46.180 | best things that can come out of this pandemic
01:13:48.060 | education wise, is if we turn a bunch of teachers
01:13:50.540 | into content creators.
01:13:51.800 | And if we take lessons that are usually done
01:13:53.760 | in these one-off settings.
01:13:55.140 | And like start to get in the habit of,
01:13:57.360 | sometimes I'll use the phrase commoditizing explanation.
01:14:01.180 | Where what you want is,
01:14:03.220 | whatever a thing a student wants to learn,
01:14:06.580 | it just seems inefficient to me
01:14:08.180 | that that lesson is taught millions of times over
01:14:10.940 | in parallel across many different classrooms
01:14:12.860 | in the world, like year to year.
01:14:15.060 | You've got a given algebra one lesson
01:14:16.620 | that's just taught like literally millions of times
01:14:19.860 | by different people.
01:14:21.540 | What should happen is that there's the small handful
01:14:24.040 | of explanations online that exist.
01:14:27.160 | So that when someone needs that explanation,
01:14:28.680 | they can go to it.
01:14:29.680 | That the time in classroom is spent on all of the parts
01:14:31.700 | of teaching and education that aren't explanation,
01:14:33.800 | which is most of it, right?
01:14:35.920 | And the way to get there is to basically have more people
01:14:38.760 | who are already explaining, publish their explanations
01:14:41.960 | and have it in a publicized forum.
01:14:43.840 | So if during a pandemic, you can have people
01:14:46.600 | automatically creating online content
01:14:49.240 | 'cause it has to be online.
01:14:50.700 | But getting in the habit of doing it
01:14:51.920 | in a way that doesn't just feel like a Zoom call
01:14:55.640 | that happened to be recorded,
01:14:56.880 | but it actually feels like a piece
01:14:59.880 | that was always gonna be publicized to more people
01:15:02.160 | than just your students.
01:15:03.520 | That can be really powerful.
01:15:05.320 | - And there's an improvement process there.
01:15:07.480 | So being self-critical and growing,
01:15:10.760 | I guess YouTubers go through this process
01:15:15.000 | of putting out some content and nobody caring about it.
01:15:20.560 | And then trying to figure out,
01:15:21.960 | and basically improving,
01:15:24.080 | figure out why did nobody care?
01:15:26.140 | And they come up with all kinds of answers
01:15:30.400 | which may or may not be correct, but doesn't matter
01:15:33.120 | because the answer leads to improvement.
01:15:35.520 | So you're being constantly self-critical,
01:15:37.440 | self-analytical, it should be better to say.
01:15:40.440 | So you think of how can I make the audio better?
01:15:43.200 | All the basic things.
01:15:44.600 | Maybe one question to ask,
01:15:47.560 | well, by way of Russ Tedrake,
01:15:51.240 | he's a robotics professor at MIT,
01:15:53.160 | one of my favorite people, big fan of yours.
01:15:56.120 | He watched our first conversation.
01:15:57.760 | I just interviewed him a couple of weeks ago.
01:16:00.760 | He teaches this course in underactuated robotics,
01:16:05.120 | which is like robotic systems
01:16:09.000 | when you can't control everything.
01:16:10.720 | We as humans, when we walk, we're always falling forward,
01:16:16.360 | which means like it's gravity, you can't control it.
01:16:18.680 | You just hope you can catch yourself,
01:16:20.200 | but that's not all guaranteed.
01:16:21.800 | It depends on the surface.
01:16:22.880 | So like that's underactuated, you can't control everything.
01:16:25.880 | The number of actuators,
01:16:29.340 | is fewer than the degrees of freedom you have, so it's not enough
01:16:31.040 | to fully control the system.
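(For reference, a standard way to make that precise, stated here as my own gloss rather than a quote from the course: write the dynamics with the control entering affinely; the system is underactuated at a state when the controls cannot command an arbitrary acceleration.)

```latex
% q: configuration, u: control input, dynamics affine in u.
\ddot{q} = f_1(q, \dot{q}, t) + f_2(q, \dot{q}, t)\, u
% Underactuated at (q, \dot{q}, t) when
\operatorname{rank} f_2(q, \dot{q}, t) < \dim(q)
```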
01:16:32.660 | So I don't know, it's a really,
01:16:34.080 | I think beautiful, fascinating class.
01:16:35.640 | He puts it online.
01:16:36.720 | It's quite popular.
01:16:39.000 | He does an incredible job teaching.
01:16:40.400 | He puts it online every time,
01:16:42.160 | but he's kind of been interested in like crisping it up.
01:16:45.080 | Like, you know, making it, you know,
01:16:48.640 | innovating in different kinds of ways.
01:16:50.320 | And he was inspired by the work you do,
01:16:53.080 | because I think in his work,
01:16:54.600 | he can do similar kinds of explanations as you're doing,
01:16:57.900 | like revealing the beauty of it
01:17:00.080 | and spending like months in preparing a single video.
01:17:03.080 | And he's interested in how to do that.
01:17:06.480 | That's why he listened to the conversation.
01:17:07.780 | He's playing with Manum.
01:17:09.800 | But he had this question of, you know,
01:17:14.320 | like in my apartment where we did the interview,
01:17:18.640 | I have like curtains, like a black curtain,
01:17:22.320 | not this, this is an adjacent mansion that we're in
01:17:27.080 | that I also own.
01:17:27.920 | But you basically just have like a black curtain,
01:17:32.400 | whatever, that, you know,
01:17:33.480 | it makes it really easy to set up a filming situation
01:17:35.680 | with cameras that we have here, these microphones.
01:17:38.080 | He was asking, you know,
01:17:39.600 | what kind of equipment do you recommend?
01:17:41.600 | I guess like your blog post is a good one.
01:17:43.760 | I said, I don't recommend,
01:17:45.640 | this is excessive and actually really hard to work with.
01:17:48.960 | So I wonder, I mean,
01:17:52.560 | is there something you would recommend
01:17:53.920 | in terms of equipment?
01:17:55.000 | Like, is it, do you think like lapel mics,
01:17:58.920 | like USB mics, what do you?
01:18:00.800 | - For my narration, I use a USB mic.
01:18:02.560 | For the streams, I used a lapel mic.
01:18:04.520 | The narration, it's a Blue Yeti.
01:18:07.720 | I'm forgetting actually the name of the lapel mic,
01:18:09.680 | but it was probably like a Rode of some kind.
01:18:14.120 | But--
01:18:14.960 | - Is it hard to figure out how to make the audio sound good?
01:18:17.520 | - Oh, I mean, listen to all the early videos on my channel
01:18:20.040 | and clearly like I'm terrible at this.
01:18:22.320 | For some reason, I just couldn't get audio for a while.
01:18:25.800 | I think it's weird when you hear your own voice.
01:18:28.520 | So you hear it, you're like, this sounds weird.
01:18:30.320 | And it's hard to know, does it sound weird
01:18:31.600 | because you're not used to your own voice
01:18:33.000 | or they're like actual audio artifacts at play.
01:18:38.320 | - And then video is just for the lockdown,
01:18:41.640 | it was just the camera.
01:18:42.480 | You said it was probably streaming somehow through the--
01:18:45.360 | - Yeah, there were two GH5 cameras,
01:18:47.120 | one that was mounted overhead over a piece of paper.
01:18:49.440 | You could also use like an iPad or a Wacom tablet
01:18:51.560 | to do your writing electronically,
01:18:53.600 | but I just wanted the paper feel.
01:18:55.240 | One on the face, there's two, again, I don't know.
01:19:00.240 | I'm like just not actually the one to ask this
01:19:02.120 | 'cause I like animate stuff usually,
01:19:03.400 | but each of them like has a compressor object
01:19:07.140 | that makes it such that the camera output
01:19:10.000 | goes into the computer USB,
01:19:11.820 | but like gets compressed before it does that.
01:19:14.120 | - The live aspect of it, do you regret doing it live?
01:19:19.120 | - Not at all.
01:19:20.840 | I do think the content might be like much less sharp
01:19:25.080 | and tight than if it were something,
01:19:27.520 | even that I just recorded like that and then edited later.
01:19:30.180 | But I do like something that I do to be out there to show
01:19:33.380 | like, hey, this is what it's like raw,
01:19:34.760 | this is what it's like when I make mistakes.
01:19:36.840 | This is like the pace of thinking.
01:19:38.760 | I like the live interaction of it.
01:19:41.300 | I think that made it better.
01:19:42.700 | I probably would do it on a different channel, I think,
01:19:46.280 | if I did series like that in the future,
01:19:47.880 | just because it's a different style.
01:19:49.440 | It's probably a different target audience
01:19:50.780 | and kind of keep clean what 3Blue1Brown is about
01:19:54.120 | versus the benefits of like live lectures.
01:19:57.920 | - Do you suggest like in this time of COVID
01:20:01.480 | that people like Russ or other educators try to go
01:20:05.880 | like the shorter, like 20 minute videos
01:20:09.840 | that are like really well planned out or scripted,
01:20:12.960 | you really think through, you slowly design,
01:20:15.620 | so it's not live?
01:20:16.800 | Do you see like that being an important part
01:20:18.520 | of what they do?
01:20:20.440 | - Yeah, well, what I think teachers like Russ should do
01:20:23.280 | is choose the small handful of topics
01:20:25.960 | that they're gonna do just really well.
01:20:27.360 | They wanna create the best short explanation of it
01:20:30.140 | in the world that will be one of those handfuls
01:20:32.200 | in a world where you have commoditized explanation, right?
01:20:34.980 | Most of the lectures should be done just normally.
01:20:37.900 | So put thought and planning into it.
01:20:39.080 | I'm sure he's a wonderful teacher
01:20:40.800 | and like knows all about that.
01:20:42.240 | But maybe choose those small handful of topics.
01:20:45.280 | What's beneficial for me sometimes
01:20:47.280 | is I do sample lessons with people on that topic
01:20:49.580 | to get some sense of how other people think about it.
01:20:52.400 | Let that inform how you want to edit it or script it
01:20:55.640 | or whatever format you wanna do.
01:20:57.560 | Some people are comfortable just explaining it
01:20:59.160 | and editing later.
01:21:00.000 | I'm more comfortable like writing it out
01:21:01.420 | and thinking in that setting.
01:21:03.000 | - Yeah, it's kind of, sorry to interrupt.
01:21:05.040 | It's a little bit sad to me to see
01:21:08.640 | how much knowledge is lost.
01:21:10.920 | Like just like you mentioned, there's professors,
01:21:14.120 | like we can take my dad for example,
01:21:16.160 | to blow up his ego a little bit.
01:21:17.480 | But he's a great teacher and he knows plasma,
01:21:21.620 | plasma chemistry, plasma physics really well.
01:21:23.840 | So he can very simply explain some beautiful
01:21:27.600 | but otherwise complicated concepts.
01:21:31.440 | And it's sad that like if you Google plasma
01:21:34.560 | or like for plasma physics, like there's no videos.
01:21:38.480 | - And just imagine if every one of those excellent teachers
01:21:40.960 | like your father or like Russ,
01:21:42.940 | even if they just chose one topic this year.
01:21:45.080 | They're like, I'm gonna make the best video
01:21:46.960 | that I can on this topic.
01:21:48.080 | If every one of the great teachers did that,
01:21:50.160 | the internet would be replete.
01:21:51.840 | And it's already replete with great explanations,
01:21:53.720 | but it would be even more so with all the niche
01:21:55.440 | great explanations and like anything you wanna learn.
01:21:58.080 | - And there's a self-interest to it in terms of teachers,
01:22:01.200 | in terms of even, so if you take Russ for example,
01:22:05.060 | it's not that he's teaching something,
01:22:07.060 | like he teaches his main thing,
01:22:09.100 | his thing he's deeply passionate about.
01:22:11.540 | And from a selfish perspective, it's also just like,
01:22:15.420 | I mean, it's like publishing a paper in a really,
01:22:21.960 | like Nature has like Letters, like an accessible publication.
01:22:27.620 | It's just going to guarantee that your work,
01:22:30.940 | that your passion is seen by a huge number of people.
01:22:35.940 | Whatever the definition of huge is, it doesn't matter.
01:22:39.060 | It's much more than it otherwise would be.
01:22:42.340 | - And it's those lectures that tell early students
01:22:45.380 | what to be interested in.
01:22:47.300 | At the moment, I think students are disproportionately
01:22:49.520 | interested in the things that are well-represented
01:22:51.240 | on YouTube.
01:22:52.420 | So to any educator out there, if you're wondering,
01:22:54.140 | hey, I want more like grad students in my department,
01:22:57.060 | like what's the best way to recruit grad students?
01:22:58.900 | It's like, make the best video you can
01:23:00.780 | and then wait eight years.
01:23:01.980 | And then you're going to have a pile of like excellent
01:23:03.900 | grad students for that department.
01:23:05.500 | - And one of the lessons I think your channel teaches
01:23:08.220 | is there's appeal of explaining just something beautiful,
01:23:13.220 | explaining it cleanly, technically,
01:23:17.140 | not doing a marketing video about why topology is great.
01:23:21.620 | - Yeah, there's people interested in this stuff.
01:23:24.060 | I mean, one of the greatest channels,
01:23:27.180 | like it's not even a math channel,
01:23:29.060 | but the channel with greatest math content
01:23:30.740 | is Vsauce, who I interviewed.
01:23:33.580 | Imagine you were to propose making a video
01:23:35.580 | that explains the Banach-Tarski paradox substantively,
01:23:39.180 | right, not shying around,
01:23:40.740 | and maybe not describing things in terms of
01:23:42.820 | like the group theoretic terminology
01:23:46.340 | that you'd usually see in a paper,
01:23:47.540 | but the actual results that went into this idea
01:23:52.420 | of like breaking apart a sphere,
01:23:54.300 | proposing that to like a network TV station,
01:23:56.700 | saying, yeah, I'm going to do this in-depth talk
01:23:58.740 | of the Banach-Tarski paradox.
01:23:59.940 | I'm pretty sure it's going to reach 20 million people.
01:24:03.060 | It's like, get out of here.
01:24:03.940 | Like no one cares about that.
01:24:05.700 | No one's interested in anything even anywhere near that.
01:24:08.580 | But then you have Michael's quirky personality around it
01:24:10.900 | and just people that are actually hungry
01:24:12.740 | for that kind of depth,
01:24:13.940 | then you don't need like the approval of some higher network.
01:24:19.180 | You can just do it and let the people speak for themselves.
01:24:21.980 | So I think, you know,
01:24:23.180 | if your father was to make something on plasma physics,
01:24:25.420 | or if we were to have like underactualized robotics,
01:24:29.700 | that would-- - Underactuated.
01:24:31.020 | - Underactuated.
01:24:32.300 | Yes, not underactualized.
01:24:33.740 | (laughing)
01:24:34.740 | Plenty actualized.
01:24:35.700 | Underactuated robotics.
01:24:36.980 | - Yeah, most robotics is underactualized currently.
01:24:39.340 | (laughing)
01:24:40.340 | - That's true.
01:24:41.460 | So even if it's things that you might think are niche,
01:24:44.660 | I bet you'll be surprised by how many people
01:24:47.740 | actually engage with it really deeply.
01:24:49.340 | - Although I just psychologically watching him,
01:24:51.660 | I can't speak for a lot of people.
01:24:52.740 | I can speak for my dad.
01:24:54.020 | I think there's a little bit of a skill gap,
01:24:58.020 | but I think that could be overcome.
01:25:00.140 | That's pretty basic.
01:25:00.980 | - None of us know how to make videos when we start.
01:25:02.900 | The first stuff I made was terrible in a number of respects.
01:25:05.220 | Like look at the earliest videos on any YouTube channel,
01:25:07.900 | except for Captain Disillusion.
01:25:09.180 | And they're all like terrible versions
01:25:11.140 | of whatever they are now.
01:25:13.340 | - But the thing I've noticed,
01:25:15.780 | especially like with world experts,
01:25:18.160 | is it's the same thing that I'm sure you went through,
01:25:21.060 | which is like fear of like embarrassment.
01:25:25.180 | Like they definitely, it's the same reason.
01:25:29.660 | Like I feel that anytime I put out a video,
01:25:34.020 | I don't know if you still feel that,
01:25:35.300 | but like, I don't know, it's this imposter syndrome.
01:25:39.220 | Like who am I to talk about this?
01:25:40.940 | And that's true for like even things
01:25:43.700 | that you've studied for like your whole life.
01:25:46.400 | I don't know, it's scary to post stuff on YouTube.
01:25:50.100 | - It is scary.
01:25:51.580 | I honestly wish that more of the people
01:25:53.620 | who had that modesty to say who am I to post this
01:25:58.500 | were the ones actually posting it.
01:25:59.340 | - They're posting it, that's right.
01:26:00.740 | - I mean, the honest problem is like
01:26:02.340 | a lot of the educational content is posted by people
01:26:04.780 | who like were just starting to research it two weeks ago
01:26:08.060 | and are on a certain schedule,
01:26:09.740 | and who maybe should think like who am I to explain,
01:26:14.580 | choose your favorite topic,
01:26:15.860 | quantum mechanics or something.
01:26:17.420 | And the people who have the self-awareness to not post
01:26:22.500 | are probably the people also best positioned
01:26:24.780 | to give a good, honest explanation of it.
01:26:27.260 | - That's why there's a lot of value in a channel
01:26:29.980 | like Numberphile, where they basically trap
01:26:32.740 | a really smart person and force them to explain stuff
01:26:36.460 | on a brown sheet of paper.
01:26:38.100 | So, but of course that's not scalable as a single channel.
01:26:41.660 | If there's anything beautiful that it could be done
01:26:44.540 | is people take it in their own hands, educators.
01:26:48.500 | - Which is again, circling back,
01:26:49.900 | I do think the pandemic will serve to force
01:26:52.940 | a lot of people's hands.
01:26:54.420 | You're gonna be making online content anyway,
01:26:56.820 | it's happening, right?
01:26:58.740 | Just hit that publish button and see how it goes.
01:27:01.180 | - Yeah, see how it goes.
01:27:03.500 | The cool thing about YouTube is it might not go for a while,
01:27:07.780 | but like 10 years later, it'll be like, this is the thing,
01:27:12.540 | what people don't understand with YouTube,
01:27:14.220 | at least for now, at least that's my hope with it,
01:27:17.940 | is it's literally better than publishing a book
01:27:22.940 | in terms of the legacy.
01:27:24.940 | It will live for a long, long time.
01:27:27.740 | Of course, it's one of the things,
01:27:31.260 | I mentioned Joe Rogan before, it's kinda,
01:27:33.940 | there's a sad thing 'cause I'm a fan,
01:27:36.140 | he's moving to Spotify.
01:27:38.980 | - Yeah, yeah, nine digit numbers will do that to you.
01:27:42.140 | - But he doesn't really, he's one of the person
01:27:44.780 | that doesn't actually care that much about money.
01:27:46.820 | Like having talked to him, it wasn't because of money,
01:27:50.660 | it's because he legitimately thinks
01:27:54.140 | that they're going to do a better job.
01:27:59.140 | So from his perspective, YouTube,
01:28:02.180 | you have to understand where they're coming from.
01:28:04.180 | YouTube has been cracking down on people who they,
01:28:09.180 | Joe Rogan talks to Alex Jones and conspiracy theories,
01:28:12.820 | and YouTube is really careful with that kind of stuff.
01:28:15.980 | And that's not a good feeling.
01:28:19.140 | And Joe doesn't feel like YouTube is on his side.
01:28:22.100 | He often has videos that they don't put in trending
01:28:27.420 | that obviously should be in trending
01:28:30.300 | because they're nervous about like,
01:28:32.420 | is this content going to upset people,
01:28:38.980 | all that kind of stuff, or have misinformation.
01:28:41.380 | And that's not a good place for a person to be in.
01:28:44.420 | And Spotify is giving them, we're never going to censor you.
01:28:48.460 | We're never going to do that.
01:28:50.260 | But the reason I bring that up,
01:28:52.540 | whatever you think about that,
01:28:53.540 | I personally think that's bullshit
01:28:55.260 | because podcasting should be free
01:28:57.220 | and not constrained to a platform.
01:28:59.660 | It's pirate radio, what the hell?
01:29:01.620 | You can't, as much as I love Spotify,
01:29:03.380 | you can't just, you can't put fences around it.
01:29:06.400 | But anyway, the reason I bring that up
01:29:10.460 | is Joe's gonna remove his entire library from YouTube.
01:29:14.180 | - Whoa, really?
01:29:15.180 | I didn't know that.
01:29:16.020 | - His full length, the clips are gonna stay,
01:29:17.620 | but the full length videos are all,
01:29:20.220 | I mean, made private or deleted.
01:29:22.220 | That's part of the deal.
01:29:23.540 | And like, that's the first time where I was like,
01:29:26.820 | oh, YouTube videos might not live forever.
01:29:29.100 | Like things you find, like, okay, sorry.
01:29:32.540 | - This is why you need IPFS or something
01:29:35.100 | where it's like, if there's a content link,
01:29:37.180 | are you familiar with this system at all?
01:29:39.180 | Like right now, if you have a URL that points to a server,
01:29:41.860 | there's like a system where the address
01:29:43.660 | points to content and then it's like distributed.
01:29:46.140 | So you can't actually delete what's at an address
01:29:48.940 | because it's content addressed.
01:29:50.820 | And as long as there's someone on the network who hosts it,
01:29:53.180 | it's always accessible at the address that it once was.
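A toy sketch of the content-addressing idea being described, in Python; this only illustrates the general principle (hash the bytes, use the hash as the address), not IPFS's actual CID format or network protocol.

```python
import hashlib

# Toy content addressing: the address is derived from the content itself,
# so the same bytes always live at the same address no matter who hosts them,
# and whatever sits at an address can never silently change.
content = b"a full-length episode"
address = hashlib.sha256(content).hexdigest()

# Any node that keeps a copy can answer requests for that address.
node_storage = {address: content}
assert node_storage[hashlib.sha256(content).hexdigest()] == content
```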
01:29:56.740 | - But I mean, that raises a question.
01:29:58.660 | I'm not gonna put you on the spot,
01:30:00.740 | but like somebody like Vsauce, right?
01:30:03.020 | Spotify comes along and gives him,
01:30:05.660 | let's say $100 billion, okay?
01:30:08.740 | Let's say some crazy number
01:30:10.300 | and then removes it from YouTube, right?
01:30:13.020 | It's made me, I don't know,
01:30:17.060 | for some reason I thought YouTube was forever.
01:30:19.900 | - I don't think it will be.
01:30:20.740 | I mean, another variant that this might take
01:30:22.860 | is like that you fast forward 50 years
01:30:27.060 | and Google or Alphabet isn't the company that it once was
01:30:31.180 | and it's kind of struggling to make ends meet
01:30:33.060 | and it's been supplanted by whoever wins on the AR game
01:30:38.060 | or whatever it might be.
01:30:39.300 | And then they're like, you know,
01:30:41.540 | all of these videos that we're hosting are pretty costly.
01:30:43.780 | So we're gonna start deleting the ones
01:30:46.020 | that aren't watched that much
01:30:47.180 | and tell people to like try to back them up on their own
01:30:50.220 | or whatever it is.
01:30:51.260 | Or even if it does exist in some form forever,
01:30:54.500 | it's like if people are not habituated
01:30:57.180 | to watching YouTube in 50 years,
01:30:58.500 | they're watching something else,
01:30:59.340 | which seems pretty likely.
01:31:00.940 | Like it would be shocking if YouTube remained as popular
01:31:04.060 | as it is now indefinitely into the future.
01:31:07.140 | - That's true.
01:31:07.980 | - So it won't be forever.
01:31:10.060 | - It makes me sad still, but, 'cause it's such a nice,
01:31:13.620 | it's like, just like you said of the canonical videos.
01:31:16.700 | - Sorry, I didn't mean to interrupt.
01:31:17.700 | Do you know, you should get Juan Benet on the thing
01:31:20.900 | and then talk to him about permanence.
01:31:22.520 | I think you would have a good conversation.
01:31:24.260 | - Who's that?
01:31:25.140 | - So he's the one that founded this thing called IPFS
01:31:27.340 | that I'm talking about.
01:31:28.740 | And if you have him talk about basically
01:31:31.460 | what you're describing, like, oh, it's sad
01:31:32.540 | that this isn't forever,
01:31:33.820 | then you'll get some articulate pontification around it.
01:31:36.860 | - Yeah.
01:31:37.700 | - That's like been pretty well thought through.
01:31:39.820 | - But yeah, I do see YouTube, just like you said,
01:31:42.020 | as a place, like what your channel creates,
01:31:44.900 | which is like a set of canonical videos on a topic.
01:31:48.060 | Now, others could create videos on that topic as well,
01:31:52.220 | but as a collection, it creates a nice set of places to go
01:31:57.220 | if you're curious about a particular topic.
01:31:59.340 | And it seems like coronavirus is a nice opportunity
01:32:02.500 | to put that knowledge out there in the world
01:32:06.060 | at MIT and beyond.
01:32:08.860 | I have to talk to you a little bit about machine learning,
01:32:12.140 | deep learning, and so on.
01:32:13.300 | Again, we talked about last time,
01:32:15.380 | you have a set of beautiful videos on neural networks.
01:32:19.100 | Let me ask you first,
01:32:22.900 | what is the most beautiful aspect of neural networks
01:32:27.860 | and machine learning to you?
01:32:29.260 | From making those videos,
01:32:32.420 | from watching how the field is evolving,
01:32:35.260 | is there something mathematically or in applied sense
01:32:40.260 | just beautiful to you about them?
01:32:42.060 | - Well, I think what I would go to is the layered structure
01:32:46.260 | and how you can have,
01:32:48.020 | what feel like qualitatively distinct things happening
01:32:50.380 | going from one layer to another,
01:32:52.340 | but that are following the same mathematical rule.
01:32:55.500 | 'Cause you look at it as a piece of math,
01:32:56.860 | it's like you got a non-linearity
01:32:58.500 | and then you've got a matrix multiplication.
01:33:00.540 | That's what's happening on all the layers.
01:33:03.060 | But especially if you look at like
01:33:04.940 | some of the visualizations that like Chris Olah has done
01:33:08.020 | with respect to like convolutional nets
01:33:11.140 | that have been trained on ImageNet,
01:33:12.460 | trying to say, what does this neuron do?
01:33:14.020 | What does this family of neurons do?
01:33:17.100 | What you can see is that the ones closer to the input side
01:33:21.420 | are picking up on very low level ideas like the texture.
01:33:24.620 | And then as you get further back,
01:33:25.740 | you have higher level ideas like,
01:33:27.020 | what is the, where are the eyes in this picture?
01:33:29.140 | And then how do the eyes form like an animal?
01:33:31.460 | Is this animal a cat or a dog or a deer?
01:33:33.700 | You have this series of qualitatively
01:33:36.180 | different things happening,
01:33:37.020 | even though it's the same piece of math on each one.
01:33:39.580 | So that's a pretty beautiful idea
01:33:41.140 | that you can have like a generalizable object
01:33:44.180 | that runs through the layers of abstraction,
01:33:47.660 | which in some sense constitute intelligence
01:33:50.260 | is having those many different layers
01:33:53.180 | of an understanding to something.
01:33:54.700 | - Yeah, form abstractions in an automated way.
01:33:58.060 | - Exactly, it's automated abstracting,
01:33:59.740 | which I mean, that just feels very powerful.
01:34:02.940 | And the idea that it can be
01:34:04.460 | so simply mathematically represented.
01:34:06.220 | I mean, a ton of like modern ML research
01:34:08.380 | seems a little bit like you do a bunch of ad hoc things,
01:34:10.860 | then you decide which one worked
01:34:12.020 | and then you retrospectively come up
01:34:13.860 | with the mathematical reason that it always had to work.
01:34:16.980 | But who cares how you came to it
01:34:18.500 | when you have like that elegant piece of math,
01:34:20.980 | it's hard not to just smile seeing it work in action.
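A minimal sketch of that layered structure, assuming NumPy; the layer sizes here are purely illustrative. Every layer really is the same piece of math: a matrix multiplication, a bias, a nonlinearity.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

rng = np.random.default_rng(0)
sizes = [784, 128, 64, 10]   # e.g., a flattened image in, 10 class scores out
weights = [0.01 * rng.standard_normal((m, n)) for n, m in zip(sizes, sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

def forward(x):
    # The same rule at every layer; what changes is only what each layer has learned.
    for W, b in zip(weights, biases):
        x = relu(W @ x + b)
    return x

print(forward(rng.standard_normal(784)).shape)  # (10,)
```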
01:34:24.620 | - Well, and when you talked about topology before,
01:34:26.940 | one of the really interesting things
01:34:28.900 | is beginning to be investigated
01:34:32.060 | under kind of the field of like science and deep learning,
01:34:34.500 | which is like the craziness of the surface
01:34:39.300 | that is trying to be optimized in neural networks.
01:34:43.780 | I mean, the amount of local minima,
01:34:47.180 | local optima there is in these surfaces
01:34:50.020 | and somehow a dumb gradient descent algorithm
01:34:53.820 | is able to find really good solutions.
01:34:55.500 | That's like, that's really surprising.
01:34:58.540 | - Well, so on the one hand it is,
01:35:00.580 | but also it's like not,
01:35:02.500 | it's not terribly surprising
01:35:03.820 | that you have these interesting points that exist
01:35:06.340 | when you make your space so high dimensional.
01:35:08.780 | Like GPT-3, what did it have, 175 billion parameters?
01:35:12.540 | So it doesn't feel as mesmerizing to think about,
01:35:17.420 | oh, there's some surface of intelligent behavior
01:35:20.380 | in this crazy high dimensional space.
01:35:21.940 | Like there's so many parameters that of course,
01:35:24.020 | but what's more interesting is like,
01:35:25.060 | how is it that you're able to efficiently get there?
01:35:27.900 | Which is maybe what you're describing
01:35:29.020 | that something as dumb as gradient descent does it.
01:35:32.060 | But like the reason the gradient descent works well
01:35:36.380 | with neural networks and not just,
01:35:37.860 | you know, choose however you want to parameterize this space
01:35:40.140 | and then like apply gradient descent to it
01:35:41.820 | is that that layered structure
01:35:43.180 | lets you decompose the derivative
01:35:44.580 | in a way that makes it computationally feasible.
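A small sketch of what that decomposition looks like on a tiny two-layer network, assuming NumPy and a squared-error loss; the chain rule peels the gradient back one layer at a time, reusing the values saved from the forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(2)   # input and target
W1, b1 = rng.standard_normal((4, 3)), np.zeros(4)
W2, b2 = rng.standard_normal((2, 4)), np.zeros(2)

# Forward pass, keeping the intermediate values.
z1 = W1 @ x + b1
a1 = np.maximum(0, z1)                 # ReLU
z2 = W2 @ a1 + b2
loss = 0.5 * np.sum((z2 - y) ** 2)

# Backward pass: each step is the local derivative of one layer.
dz2 = z2 - y                           # dL/dz2
dW2 = np.outer(dz2, a1)                # gradient for the second layer's weights
da1 = W2.T @ dz2                       # pass the signal back through the matrix
dz1 = da1 * (z1 > 0)                   # ...and through the nonlinearity
dW1 = np.outer(dz1, x)                 # gradient for the first layer's weights
```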
01:35:46.980 | - Yeah, it's just that there's so many good solutions,
01:35:51.620 | probably infinitely, infinitely many good solutions,
01:35:55.620 | not best solutions, but good solutions.
01:35:58.540 | That's what's interesting.
01:36:00.740 | It's similar to how Stephen Wolfram has this idea of like,
01:36:04.360 | if you just look at all space of computations,
01:36:09.060 | of all space of basically algorithms,
01:36:11.820 | that you'd be surprised
01:36:12.740 | how many of them are actually intelligent.
01:36:14.780 | (laughs)
01:36:15.660 | Like if you just randomly pick from the bucket,
01:36:18.660 | that's surprising.
01:36:19.620 | We tend to think like a tiny, tiny minority of them
01:36:23.100 | would be intelligent.
01:36:25.340 | But his sense is like, it seems weirdly easy
01:36:29.260 | to find computations that do something interesting.
01:36:32.620 | - Well, okay, so that,
01:36:34.940 | from like a Kolmogorov complexity standpoint,
01:36:38.360 | almost everything will be interesting.
01:36:39.860 | What's fascinating is to find the stuff
01:36:41.980 | that's describable with low information,
01:36:43.860 | but still does interesting things.
01:36:46.060 | Like one fun example of this,
01:36:47.460 | you know, Shannon's noisy coding theorem,
01:36:50.780 | the noisy coding theorem in information theory
01:36:53.940 | that basically says if, you know,
01:36:55.300 | I want to send some bits to you,
01:36:57.140 | maybe some of them are going to get flipped.
01:36:59.940 | There's some noise along the channel.
01:37:01.700 | I can come up with some way of coding it
01:37:04.180 | that's resilient to that noise, that's very good.
01:37:07.700 | And then he quantitatively describes what very good is.
01:37:10.260 | What's funny about how he proves the existence
01:37:12.540 | of good error correction codes
01:37:14.780 | is rather than saying like, here's how to construct it,
01:37:17.160 | or even like a sensible non-constructive proof,
01:37:19.980 | the nature of his non-constructive proof is to say,
01:37:23.220 | if we chose a random encoding,
01:37:25.180 | it would be almost at the limit, which is weird,
01:37:28.420 | because then it took decades for people to actually find any
01:37:31.060 | that were anywhere close to the limit.
01:37:32.860 | And what his proof was saying is choose a random one,
01:37:35.760 | and it's like the best kind of encoding you'll ever find.
01:37:39.020 | But what that tells us is that sometimes
01:37:43.100 | when you choose a random element from this ungodly huge set,
01:37:46.260 | that's a very different task
01:37:47.620 | from finding an efficient way to actively describe it.
01:37:49.940 | Because in that case, the random element
01:37:51.220 | to actually implement it as a bit of code,
01:37:52.860 | you would just have this huge table
01:37:54.780 | of like telling you how to encode one thing into another
01:37:58.080 | that's totally computationally infeasible.
01:38:00.460 | So on the side of like how many possible programs
01:38:03.660 | are interesting in some way, it's like, yeah, tons of them.
01:38:07.020 | But the much, much more delicate question
01:38:09.620 | is when you can have a low information description
01:38:12.020 | of something that still becomes interesting.
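For concreteness, a small sketch of the quantitative side of that story for a binary symmetric channel with flip probability p: the capacity has a simple closed form, while a random codebook near that rate, the thing the existence proof hands you, is hopeless to actually write down.

```python
from math import log2

def binary_entropy(p):
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.1                               # chance each bit gets flipped in transit
capacity = 1 - binary_entropy(p)      # Shannon's limit, in bits per channel use
print(f"capacity ~ {capacity:.3f} bits per use")

# A random code at rate R with block length n is a lookup table of 2**(n*R)
# codewords, each n bits long: easy to prove one exists, absurd to store or search.
n, R = 1000, 0.5
print(f"random codebook: 2^{int(n * R)} codewords of {n} bits each")
```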
01:38:14.500 | - And thereby, this kind of gives you a blueprint
01:38:16.600 | for how to engineer that kind of thing.
01:38:18.520 | - Right. - Yeah.
01:38:19.420 | - Chaos theory is another good instance there
01:38:21.220 | where it's like, yeah, a ton of things are hard to describe,
01:38:23.620 | but how do you have ones that have a simple set
01:38:25.740 | of governing equations that remain
01:38:28.280 | like arbitrarily hard to describe?
01:38:30.620 | - Well, let me ask you, you mentioned GPT-3.
01:38:33.580 | It's interesting to ask, what are your thoughts
01:38:36.780 | about the recently released OpenAI GPT-3 model
01:38:41.780 | that I believe is already trying to learn
01:38:44.820 | how to communicate like Grant Sanderson?
01:38:47.620 | - You know, I think I got an email a day or two ago
01:38:49.620 | about someone who wanted to try to use GPT-3 with Manim,
01:38:53.940 | where you would like give it a high level description
01:38:56.940 | of something and then it'll like automatically create
01:38:59.000 | the mathematical animation.
01:39:00.660 | Like trying to put me out of a job here.
01:39:02.300 | (both laughing)
01:39:05.180 | - I mean, it probably won't put you out of a job,
01:39:07.460 | but it'll create something visually beautiful for sure.
01:39:09.900 | - I would be surprised if that worked as stated,
01:39:12.100 | but maybe there's like variants of it
01:39:14.860 | like that you can get to.
01:39:16.620 | - I mean, like a lot of those demos, it's interesting.
01:39:18.660 | I think there's a lot of failed experiments,
01:39:23.660 | like depending on how you prime the thing,
01:39:26.260 | you're going to have a lot of failed,
01:39:27.660 | I mean, certainly with code and with program synthesis,
01:39:29.960 | most of it won't even run.
01:39:32.060 | But eventually I think if you pick the right examples,
01:39:37.000 | you'll be able to generate something cool.
01:39:38.620 | And I think that even that's good enough,
01:39:40.260 | even though if you're being very selective,
01:39:43.380 | it's still cool that something can be generated.
01:39:46.180 | - Yeah, that's huge value.
01:39:48.620 | I mean, think of the writing process.
01:39:49.980 | Sometimes a big part of it is just getting a bunch of stuff
01:39:52.100 | on the page and then you can decide what to whittle down to.
01:39:54.780 | So if it can be used in like a man-machine symbiosis
01:39:58.060 | where it's just giving you a spew of potential ideas
01:40:01.060 | that then you can refine down,
01:40:03.260 | like it's serving as the generator
01:40:05.100 | and then the human serves as the refiner,
01:40:07.200 | that seems like a pretty powerful dynamic.
01:40:10.140 | - Yeah, have you gotten a chance to see any of the demos
01:40:13.120 | like on Twitter?
01:40:14.020 | Is there a favorite you've seen or?
01:40:15.860 | - Oh, my absolute favorite.
01:40:17.220 | Yeah, so Tim Blais,
01:40:19.420 | who runs a channel called Acapella Science,
01:40:21.460 | he was like tweeting a bunch about playing with it.
01:40:25.180 | And so GPT-3 was trained on the internet from before COVID.
01:40:30.180 | So in a sense it doesn't know about the coronavirus.
01:40:33.580 | So what he seeded it with was just a short description
01:40:35.820 | about like a novel virus emerges in Wuhan, China
01:40:39.780 | and starts to spread around the globe.
01:40:41.500 | What follows is a month by month description
01:40:43.340 | of what happens, January colon, right?
01:40:46.340 | That's what he seeds it with.
01:40:47.340 | So then what GPT-3 generates is like January,
01:40:49.900 | then a paragraph of description, February and such.
01:40:52.540 | And it's the funniest thing you'll ever read
01:40:54.380 | because it predicts a zombie apocalypse,
01:40:57.060 | which of course it would because it's trained
01:40:59.940 | on like the internet data, zombie stories.
01:41:02.700 | But what you see unfolding is a description of COVID-19
01:41:06.180 | if it were a zombie apocalypse.
01:41:08.120 | And like the early aspects of it are kind of shockingly
01:41:11.260 | in line with what's reasonable.
01:41:12.740 | And then it gets out of hand so quickly.
01:41:14.780 | - And the other flip side of that is I wouldn't be surprised
01:41:18.420 | if it's onto something at some point here.
01:41:20.740 | 2020 has been full of surprises.
01:41:24.660 | - Who knows, like we might all be in like this crazy
01:41:27.620 | militarized zone as it predicts just a couple months off.
01:41:31.500 | - Yeah, I think there's definitely an interesting tool
01:41:34.580 | of storytelling.
01:41:36.100 | It has struggled with mathematics, which is interesting
01:41:38.420 | or just even numbers.
01:41:40.700 | It's able to, it's not able to generate like patterns,
01:41:45.180 | you know, like you give it in like five digit numbers
01:41:50.180 | and it's not able to figure out the sequence, you know,
01:41:53.500 | or like I didn't look in too much, but I'm talking
01:41:57.500 | about like sequences like the Fibonacci numbers
01:42:00.100 | and to see how far it can go.
01:42:02.100 | Because obviously it's leveraging stuff from the internet
01:42:04.340 | and it starts to lose it.
01:42:05.340 | But it is also cool that I've seen it able to generate
01:42:08.960 | some interesting patterns that are mathematically correct.
01:42:12.500 | - Yeah, I honestly haven't dug into like what's going on
01:42:16.220 | within it in a way that I can speak intelligently to.
01:42:19.680 | I guess it doesn't surprise me that it's bad
01:42:22.740 | at numerical patterns because, I mean, maybe I should be
01:42:26.140 | more impressed with it, but like that requires having
01:42:29.260 | a weird combination of intuitive and formulaic worldview.
01:42:35.500 | So you're not just going off of intuition
01:42:37.260 | when you see Fibonacci numbers.
01:42:38.380 | You're not saying like intuitively,
01:42:39.460 | what do I think will follow the 13?
01:42:41.300 | Like I've seen patterns a lot where like 13s
01:42:43.460 | are followed by 21s.
01:42:45.100 | Instead it's the, like the way you're starting
01:42:47.220 | to see a shape of things is by knowing what hypotheses
01:42:50.620 | to test where you're saying, oh, maybe it's generated
01:42:53.100 | based on the previous terms, or maybe it's generated
01:42:55.020 | based on like multiplying by a constant or whatever it is.
01:42:57.460 | You like have a bunch of different hypotheses
01:42:59.580 | and your intuitions are around those hypotheses,
01:43:01.380 | but you still need to actively test it.
01:43:05.220 | And it seems like GPT-3 is extremely good at like that sort
01:43:09.500 | of pattern matching recognition that usually is very hard
01:43:12.180 | for computers, that is what humans get good at
01:43:15.300 | through expertise and exposure to lots of things.
01:43:17.460 | It's why it's good to learn from as many examples
01:43:19.460 | as you can, rather than just from the definitions.
01:43:21.980 | It's to get that level of intuition,
01:43:24.420 | but to actually concretize it into a piece of math,
01:43:27.580 | you do need to like test your hypotheses
01:43:31.060 | and if not prove it, like have an actual explanation
01:43:34.380 | for what's going on, not just a pattern that you've seen.
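A small sketch of the distinction being drawn, in Python with made-up sequences: rather than eyeballing "what usually follows 13," you state candidate generating rules and actually check them against the data.

```python
def fits_fibonacci_rule(seq):
    """Hypothesis 1: each term is the sum of the previous two."""
    return all(seq[i] == seq[i - 1] + seq[i - 2] for i in range(2, len(seq)))

def fits_geometric_rule(seq):
    """Hypothesis 2: each term is a constant multiple of the previous one."""
    if any(term == 0 for term in seq[:-1]):
        return False
    ratios = [seq[i] / seq[i - 1] for i in range(1, len(seq))]
    return all(abs(r - ratios[0]) < 1e-9 for r in ratios)

print(fits_fibonacci_rule([1, 1, 2, 3, 5, 8, 13, 21]))   # True
print(fits_geometric_rule([1, 1, 2, 3, 5, 8, 13, 21]))   # False
print(fits_geometric_rule([3, 6, 12, 24, 48]))           # True
```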
01:43:37.940 | - Yeah, and but then the flip side to play devil's advocate,
01:43:41.180 | that's a very kind of probably correct,
01:43:44.060 | intuitive understanding of just like we said,
01:43:46.260 | a few layers creating abstractions,
01:43:49.220 | but it's been able to form something that looks like
01:43:54.220 | a compression of the data that it's seen
01:43:59.140 | that looks awfully a lot like it understands
01:44:01.420 | what the heck it's talking about.
01:44:03.020 | - Well, I think a lot of understanding is,
01:44:05.140 | like I don't mean to denigrate pattern recognition,
01:44:07.940 | pattern recognition is most of understanding
01:44:09.980 | and it's super important and it's super hard.
01:44:12.280 | And so like when it's demonstrating this kind
01:44:14.860 | of real understanding, compressing down some data,
01:44:17.100 | like that might be pattern recognition at its finest.
01:44:20.660 | My only point would be that like what differentiates math,
01:44:24.820 | I think to a large extent is that the pattern recognition
01:44:28.340 | isn't sufficient and that the kind of patterns
01:44:30.740 | that you're recognizing are not like the end goals,
01:44:34.540 | but instead they are the little bits and paths
01:44:37.180 | that get you to the end goal.
01:44:39.340 | - That's only true for mathematics in general.
01:44:41.300 | It's an interesting question if that might,
01:44:44.820 | for certain kinds of series of numbers,
01:44:47.120 | it might not be true.
01:44:48.220 | Like you might, because that's a basic,
01:44:52.000 | you know, like Taylor series, like certain kinds of series,
01:44:54.580 | it feels like compressing the internet is enough
01:45:00.420 | to figure out, 'cause those patterns in some form appear
01:45:04.140 | in the text somewhere.
01:45:06.060 | - Well, I mean, there's all sorts of wonderful examples
01:45:08.300 | of false patterns in math where one of the earliest videos
01:45:11.220 | I put on the channel was talking about,
01:45:13.300 | you could kind of divide a circle up using these chords
01:45:15.500 | and you see this pattern of one, two, four, eight, 16.
01:45:18.580 | I was like, okay, pretty easy to see what that pattern is.
01:45:21.140 | It's powers of two.
01:45:22.180 | You've seen it a million times, but it's not powers of two.
01:45:25.780 | The next term is 31.
01:45:27.620 | And so it's like almost a power of two,
01:45:29.180 | but it's a little bit shy.
01:45:30.700 | And there's actually a very good explanation
01:45:32.540 | for what's going on.
01:45:34.300 | But I think it's a good test of whether you're thinking
01:45:37.700 | clearly about mechanistic explanations of things,
01:45:41.660 | how quickly you jump to thinking it must be powers of two.
01:45:44.380 | 'Cause the problem itself, there's really no good way to,
01:45:48.620 | I mean, there can't be a good way to think about it
01:45:50.300 | as like doubling a set because ultimately it doesn't.
01:45:53.380 | But even before it starts to, it's not something
01:45:55.300 | that screams out as being a doubling phenomenon.
01:45:58.420 | So at best, if it did turn out to be powers of two,
01:46:00.900 | it would have only been so very subtly.
01:46:02.900 | And I think the difference between like a math student
01:46:05.500 | making the mistake and a mathematician who's experienced
01:46:07.620 | seeing that kind of pattern is that they'll have a sense
01:46:10.620 | from what the problem itself is,
01:46:12.340 | whether the pattern that they're observing is reasonable
01:46:15.540 | and how to test it.
01:46:16.820 | And like, I would just be very impressed
01:46:20.460 | if there was any algorithm that was actively
01:46:23.500 | accomplishing that goal.
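For reference, the chord-counting example mentioned above (n points on a circle, every pair joined, count the regions) has a clean closed form, and computing a few terms shows exactly where the apparent doubling gives out.

```python
from math import comb

def circle_regions(n):
    """Max regions when n points on a circle are all connected by chords."""
    return 1 + comb(n, 2) + comb(n, 4)

print([circle_regions(n) for n in range(1, 8)])
# [1, 2, 4, 8, 16, 31, 57] -- it tracks the powers of two, right up until it doesn't.
```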
01:46:25.260 | - Yeah, like a learning-based algorithm.
01:46:28.060 | - Yeah, like a little scientist, I guess, basically.
01:46:30.260 | - Yeah, it's a fascinating thought because GPT-3,
01:46:35.140 | these language models are already accomplishing
01:46:36.980 | way more than I've expected.
01:46:38.540 | So I'm learning not to doubt.
01:46:40.780 | - I bet we'll get there.
01:46:42.940 | Yeah, I'm not saying I'd be impressed, but like surprised.
01:46:45.900 | Like I'll be impressed, but I think we'll get there
01:46:47.980 | on algorithms doing math like that.
01:46:51.520 | - So one of the amazing things you've done for the world
01:46:57.060 | is to some degree, open sourcing the tooling
01:47:01.140 | that you use to make your videos with Manim,
01:47:04.300 | this Python library.
01:47:07.860 | Now it's quickly evolving because I think you're inventing
01:47:10.700 | new things every time you make a video.
01:47:12.900 | In fact, I've been working on playing around with some,
01:47:17.300 | I wanted to do like an ode to 3Blue1Brown.
01:47:20.060 | Like I love playing Hendrix.
01:47:22.220 | I wanted to do like a cover of a concept
01:47:24.820 | I wanted to visualize and use Manim.
01:47:27.100 | And I saw that you had like a little piece of code
01:47:29.420 | on like Mobius strip.
01:47:31.300 | And I tried to do some cool things
01:47:33.180 | with spinning a Mobius strip,
01:47:35.900 | like continue twisting it, I guess is the term.
01:47:40.040 | And it was easier to, it was tough.
01:47:44.580 | So I haven't figured it out yet.
01:47:45.620 | Well, so I guess the question I want to ask
01:47:48.060 | is so many people love it, that you've put that out there.
01:47:51.660 | They want to do the same thing as I do with Hendrix.
01:47:54.260 | They want to cover it.
01:47:55.100 | They want to explain an idea using the tool, including Russ.
01:47:58.180 | How would you recommend they try to, I'm very sorry.
01:48:02.380 | They try to go about it.
01:48:06.260 | - Well, so I-
01:48:08.100 | - And what kind of choices should they choose
01:48:11.300 | to be most effective?
01:48:13.180 | - That I can answer.
01:48:14.020 | So I always feel guilty if this comes up
01:48:16.300 | because I think of it like this scrappy tool.
01:48:19.420 | It's like a math teacher who put together some code.
01:48:22.100 | People asked what it was, so they made it open source
01:48:24.500 | and they kept scrapping it together.
01:48:26.140 | And there's a lot of things about it
01:48:27.900 | that make it harder to work with than it needs to be
01:48:30.420 | that are a function of me not being a software engineer.
01:48:33.220 | I've put some work this year trying to make it better
01:48:38.140 | and more flexible that is still just kind of
01:48:41.460 | like a work in progress.
01:48:43.020 | One thing I would love to do is just get my act together
01:48:46.900 | about properly integrating with what the community
01:48:49.380 | wants to work with and what stuff I work on
01:48:52.700 | and making that not deviate.
01:48:56.260 | And just actually fostering that community
01:48:58.380 | in a way that I've been shamefully neglectful of.
01:49:01.100 | So I'm just always guilty if it comes up.
01:49:03.140 | - So let's put that guilt aside.
01:49:04.740 | Just kind of Zen-like.
01:49:05.860 | - Sorry, Zen-like.
01:49:06.700 | I'll pretend like it isn't terrible.
01:49:08.140 | For someone like Russ, I think step one is
01:49:11.420 | make sure that what you're animating
01:49:12.860 | should be done so programmatically.
01:49:14.300 | 'Cause a lot of things maybe shouldn't.
01:49:16.980 | Like if you're just making a quick graph of something,
01:49:19.860 | if it's a graphical intuition that maybe
01:49:21.580 | has a little motion to it, use Desmos, use Grapher,
01:49:24.660 | use GeoGebra, use Mathematica.
01:49:26.780 | Certain things that are like really oriented around graphs.
01:49:28.540 | - GeoGebra is kind of cool.
01:49:29.860 | I've been playing with it, it's amazing.
01:49:31.180 | - You can get very, very far with it.
01:49:33.660 | And in a lot of ways, it would make more sense
01:49:35.940 | for some stuff that I do to just do in GeoGebra.
01:49:38.780 | But I kind of have this cycle of liking to try
01:49:41.020 | to improve Manim by doing videos and such.
01:49:42.980 | So do as I say, not as I do.
01:49:45.540 | The original thought I had in making Manim
01:49:47.940 | was that there's so many different ways
01:49:49.740 | of representing functions other than graphs.
01:49:52.020 | In particular, things like transformations.
01:49:54.940 | Like use movement over time to communicate relationships
01:49:57.940 | between inputs and outputs instead of like
01:50:00.300 | x direction and y direction.
01:50:02.820 | Or like vector fields or things like that.
01:50:04.740 | So I wanted something that was flexible enough
01:50:06.340 | that you didn't feel constrained
01:50:07.500 | into a graphical environment.
01:50:08.940 | By graphical, I mean like graphs with like
01:50:12.980 | x-coordinate, y-coordinate kind of stuff.
01:50:15.380 | But also make sure that you're taking advantage
01:50:19.060 | of the fact that it's programmatic.
01:50:20.700 | You have loops, you have conditionals,
01:50:22.260 | you have abstraction.
01:50:23.500 | If any of those are like well fit for what you wanna teach
01:50:26.140 | to have a scene type that you tweak a little bit
01:50:28.820 | based on parameters or to have conditionals
01:50:31.300 | so that things can go one way or another
01:50:32.620 | or loops so that you can create these things
01:50:34.660 | of like arbitrarily increasing complexity.
01:50:36.820 | That's the stuff that's like meant
01:50:37.860 | to be animated programmatically.
01:50:39.580 | If it's just like writing some text on the screen
01:50:42.300 | or shifting around objects or something like that,
01:50:45.740 | things like that, you should probably just use Keynote.
01:50:48.620 | It'd be a lot simpler.
01:50:49.900 | So try to find a workflow that distills down
01:50:53.340 | that which should be programmatic into Manim
01:50:55.500 | and that which doesn't need to be into like other domains.
01:50:58.700 | Again, do as I say, not as I do.
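As a point of reference for the advice about loops, conditionals, and parameters, here is a minimal sketch of a programmatic scene. It assumes the community edition of Manim (whose API differs a bit from the version used for the channel); the scene name and shapes are just placeholders.

```python
from manim import *  # community edition: pip install manim

class ParametrizedRow(Scene):
    def construct(self):
        # The kind of thing that's tedious by hand but trivial in code:
        # a loop drives the size and position of each square.
        for i in range(5):
            square = Square(side_length=0.3 * (i + 1), color=BLUE)
            square.shift(LEFT * 3 + RIGHT * 1.5 * i)
            self.play(Create(square), run_time=0.4)
        self.wait()
```

Rendered with something like `manim -pql scene.py ParametrizedRow`. For a static graph or a few lines of text, the advice above stands: a plotting tool or Keynote is simpler.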
01:51:01.180 | - I mean, Python is an integral part of it.
01:51:03.500 | Just for the fun of it, let me ask,
01:51:05.900 | what's your most and least favorite aspects of Python?
01:51:10.900 | - Ooh, most and least.
01:51:12.540 | I mean, I love that it's like object-oriented
01:51:15.820 | and functional, I guess, that you can kind of like
01:51:18.780 | get both of those benefits for how you structure things.
01:51:23.780 | So if you would just want to quickly whip something together,
01:51:25.900 | the functional aspects are nice.
01:51:27.540 | - Is it your primary language
01:51:28.860 | like for programmatically generating stuff?
01:51:31.100 | - Yeah, it's home for me.
01:51:32.060 | Python is home. - It's home.
01:51:33.700 | - Yeah.
01:51:34.540 | - Sometimes you travel, but it's home.
01:51:35.660 | Got it. - It's home.
01:51:37.340 | I mean, the biggest disadvantage is that it's slow.
01:51:39.180 | So when you're doing computationally intensive things,
01:51:41.700 | either you have to think about it more than you should,
01:51:43.620 | how to make it efficient, or it just takes long.
01:51:47.260 | - Do you run into that at all, like with your work?
01:51:49.780 | - Well, so certainly old Manom is way slower
01:51:52.700 | than it needs to be because of how it renders things
01:51:57.060 | on the back end, it's kind of absurd.
01:51:58.580 | I've rewritten things such that it's all done
01:52:00.940 | with shaders in such a way that it should be just live
01:52:03.700 | and actually interactive while you're coding it,
01:52:06.580 | if you want to.
01:52:07.700 | You have a 3D scene, you can move around,
01:52:09.540 | you can have elements respond
01:52:12.620 | to where your mouse is or things.
01:52:14.020 | That's not something that user of a video
01:52:16.620 | is gonna get to experience
01:52:17.580 | 'cause there's just a play button and a pause button.
01:52:19.220 | But while you're developing, that can be nice.
01:52:21.900 | So it's gotten better in speed in that sense,
01:52:23.860 | but that's basically because the hard work is being done
01:52:26.140 | in a language that's not Python, but GLSL, right?
01:52:29.540 | But yeah, there are some times when it's like a,
01:52:33.980 | there's just a lot of data that goes into the object
01:52:36.060 | that I want to animate that then it,
01:52:38.740 | just like Python is slow.
01:52:40.540 | - Well, let me ask, quickly ask,
01:52:42.620 | what do you think about the walrus operator,
01:52:44.340 | if you're familiar with it at all?
01:52:45.900 | The reason it's interesting,
01:52:47.060 | there's a new operator in Python 3.8.
01:52:49.660 | I find it psychologically interesting
01:52:51.380 | 'cause the toxicity over it led Guido to resign,
01:52:54.740 | to step down from his--
01:52:55.580 | - Is that actually true or was it like,
01:52:57.100 | there's a bunch of surrounding things that also,
01:52:59.780 | was it actually the walrus operator that--
01:53:01.900 | - Well, it was an accumulation of toxicity,
01:53:06.300 | but that was the most toxic one.
01:53:09.860 | Like the discussion,
01:53:11.260 | that's the one where the most Python core developers
01:53:14.140 | were opposed to Guido's decision.
01:53:16.100 | He didn't particularly,
01:53:18.220 | I don't think, care about it either way.
01:53:20.340 | He just thought it was a good idea,
01:53:21.780 | this is where you approve it.
01:53:23.380 | And like the structure of the idea of a BDFL is like,
01:53:27.220 | you listen to everybody, hear everybody out,
01:53:29.540 | you make a decision and you move forward.
01:53:33.180 | And he didn't like the negativity
01:53:35.260 | that burdened him after that.
01:53:37.420 | - People like some parts of the benevolent dictator
01:53:39.740 | for life mantra,
01:53:40.580 | but once the dictator does things different than you want,
01:53:42.660 | suddenly dictatorship doesn't seem so great.
01:53:45.060 | - Yeah, I mean, they still liked it,
01:53:46.460 | he just couldn't because he truly is the B in the benevolent.
01:53:50.580 | He really is a nice guy.
01:53:52.700 | I mean, and I think he can't,
01:53:55.700 | it's a lot of toxicity, it's difficult,
01:53:57.420 | it's a difficult job.
01:53:58.660 | That's why Linus Torvalds is perhaps the way he is,
01:54:01.380 | you have to have a thick skin to fight off,
01:54:04.180 | fight off the warring masses.
01:54:06.460 | It's kind of surprising to me how many people
01:54:08.900 | can like threaten to murder each other
01:54:11.180 | over whether we should have braces or not,
01:54:12.980 | or whether, like, it's incredible.
01:54:15.860 | - Yeah, I mean, that's my knee jerk reaction
01:54:17.220 | to the walrus operator,
01:54:18.060 | like, I don't actually care that much,
01:54:19.540 | either way, I'm not gonna get personally passionate.
01:54:21.820 | My initial reaction was like,
01:54:23.700 | yeah, this seems to make things more confusing to read.
01:54:26.100 | But then again, so does list comprehension
01:54:27.980 | until you're used to it.
01:54:29.140 | So like, if there's a use for it, great,
01:54:31.260 | if not, great, but like, let's just all calm down
01:54:33.860 | about our spaces versus tabs debates here
01:54:35.980 | and like, be chill.
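For anyone who hasn't run into it, the operator in question is the assignment expression added in Python 3.8; a couple of small illustrations with made-up data:

```python
data = [3, 1, 4, 1, 5, 9, 2, 6]

# Without it: bind on one line, test on the next.
n = len(data)
if n > 5:
    print(f"{n} elements is a lot")

# With the walrus operator: bind and test in a single expression.
if (n := len(data)) > 5:
    print(f"{n} elements is a lot")

# It also lets a comprehension reuse a computed value instead of recomputing it.
big_squares = [y for x in data if (y := x * x) > 10]
```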
01:54:38.020 | - Yeah, to me, it just represents
01:54:39.620 | the value of great leadership,
01:54:42.380 | even in open source communities.
01:54:44.020 | - Does it represent that if he stepped down as a leader?
01:54:46.900 | - Well, he fought for it, no, he got it passed.
01:54:49.540 | - I guess, but, I guess, sure.
01:54:51.980 | - It could represent multiple things too.
01:54:53.820 | It can represent like failed dictatorships,
01:54:57.620 | or it can represent a lot of things,
01:54:59.180 | but to me, great leaders take risks,
01:55:02.740 | even if it's a mistake at the end.
01:55:06.860 | Like, you have to make decisions.
01:55:09.020 | The thing is, this world won't go anywhere
01:55:11.580 | if you constantly, if whenever there's a divisive thing,
01:55:14.940 | you wait until the division is no longer there.
01:55:17.420 | Like, that's the paralysis we experienced
01:55:19.780 | with like Congress and political systems.
01:55:22.180 | It's good to be slow when there's indecision,
01:55:26.020 | when there's people disagree,
01:55:28.340 | it's good to take your time,
01:55:29.620 | but like at a certain point, it results in paralysis,
01:55:31.980 | and you just have to make a decision.
01:55:33.900 | The background of the site,
01:55:34.980 | whether it's yellow, blue, or red,
01:55:38.580 | can cause people to like go to war over each other.
01:55:41.500 | I've seen this with design.
01:55:42.740 | People are very touchy on color, color choices.
01:55:46.140 | At the end of the day, just make a decision,
01:55:49.020 | and go with it.
01:55:49.860 | I think that that's what the Walrus operator
01:55:53.420 | represents to me.
01:55:54.540 | - It represents the fighter pilot instinct
01:55:56.380 | of like quick action is more important than--
01:55:59.140 | - Than just like carrying everybody out
01:56:02.180 | and really thinking through it,
01:56:03.460 | because that's going to lead to paralysis.
01:56:06.660 | - Yeah, like if that's the actual case,
01:56:08.380 | that it's something where he's consciously
01:56:10.540 | hearing people's disagreement,
01:56:12.860 | disagreeing with that disagreement,
01:56:14.340 | and saying he wants to move forward anyway,
01:56:16.780 | that's an admirable aspect of leadership.
01:56:20.980 | - So we don't have much time,
01:56:22.940 | but I wanna ask just 'cause it's
01:56:25.300 | some beautiful mathematics involved.
01:56:27.380 | 2020 brought us a couple of, in the physics world,
01:56:31.940 | theories of everything.
01:56:33.900 | Eric Weinstein kind of,
01:56:36.020 | I mean, he's been working for probably decades,
01:56:38.060 | but he put out this idea of geometric unity,
01:56:42.420 | or started sort of publicly thinking
01:56:44.140 | and talking about it more.
01:56:45.820 | Stephen Wolfram put out his physics project,
01:56:50.100 | which is kind of this hypergraph view
01:56:51.740 | of a theory of everything.
01:56:53.420 | Do you find interesting, beautiful things
01:56:57.420 | to these theories of everything?
01:56:58.780 | What do you think about the physics world
01:57:00.940 | and sort of the beautiful, interesting,
01:57:04.020 | insightful mathematics in that world,
01:57:07.140 | whether we're talking about quantum mechanics,
01:57:08.940 | which you touched on in a bunch of your videos a little bit,
01:57:11.460 | quaternions, like just the mathematics involved,
01:57:13.860 | or the general relativity,
01:57:15.460 | which is more about surfaces and topology, all that stuff?
01:57:19.620 | - Well, I think as far as popularized science is concerned,
01:57:24.100 | people are more interested in theories of everything
01:57:26.060 | than they should be.
01:57:27.060 | 'Cause the problem is, whether we're talking about
01:57:30.380 | trying to make sense of Weinstein's lectures
01:57:33.020 | or Wolfram's project, or let's just say
01:57:35.180 | listening to Witten talk about string theory,
01:57:38.300 | whatever proposed path to a theory of everything,
01:57:40.940 | you're not actually gonna understand it.
01:57:43.900 | Some physicists will, but you're just not actually
01:57:47.100 | gonna understand the substance of what they're saying.
01:57:49.100 | What I think is way, way more productive
01:57:50.940 | is to let yourself get really interested
01:57:53.500 | in the phenomena that are still deep,
01:57:55.980 | but which you have a chance of understanding.
01:57:58.100 | 'Cause the path to getting to even understanding
01:58:00.700 | what questions these theories of everything
01:58:02.300 | are trying to answer involves walking down that.
01:58:05.780 | I mean, I was watching a video before I came here
01:58:07.980 | from Steve Mould talking about
01:58:09.820 | why sugar polarizes light in a certain way.
01:58:12.420 | So fascinating, really, really interesting.
01:58:15.220 | It's not this novel theory of everything type thing,
01:58:18.260 | but to understand what's going on there
01:58:20.140 | really requires digging in in depth to certain ideas.
01:58:23.300 | And if you let yourself think past
01:58:25.020 | what the video tells you about,
01:58:26.620 | what does circularly polarized light mean
01:58:28.340 | and things like that, it actually would get you
01:58:30.380 | to a pretty good appreciation of two-state systems
01:58:32.740 | in quantum mechanics in a way that just trying to read about,
01:58:36.900 | like, oh, what are the hard parts about resolving
01:58:40.620 | quantum field theories with general relativity
01:58:42.780 | is never gonna get you.
01:58:44.220 | So as far as popularizing science is concerned,
01:58:47.340 | like, the audience should be less interested
01:58:50.380 | than they are in theories of everything.
01:58:53.340 | The popularizers should be less emphatic
01:58:56.620 | than they are about that.
01:58:57.900 | For, like, actual practicing physicists,
01:59:00.100 | you know, it might be the case maybe more people
01:59:01.580 | should think about fundamental questions.
01:59:03.700 | But--
01:59:04.540 | - It's difficult to create, like, a 3Blue1Brown
01:59:06.620 | video on a theory of everything.
01:59:09.500 | So basically, we should really try to find the beauty
01:59:14.420 | in mathematics or physics by looking at concepts
01:59:17.060 | that are, like, within reach.
01:59:18.380 | - Yeah, I think that's super important.
01:59:20.340 | I mean, so you see this in math too
01:59:23.100 | with the big unsolved problems.
01:59:25.260 | So like the Clay Millennium problems, the Riemann hypothesis.
01:59:28.620 | - Have you ever done a video on Fermat's last theorem?
01:59:30.900 | - No, I have not yet, no.
01:59:32.180 | But if I did, do you know what I would do?
01:59:34.140 | I would talk about proving Fermat's last theorem
01:59:36.840 | in the specific case of n equals three.
01:59:39.320 | - Is that still accessible, though?
01:59:41.560 | - Yes, actually, barely.
01:59:43.980 | Mathologer might be able to do, like, a great job on this.
01:59:46.080 | He does a good job of taking stuff
01:59:47.260 | that's barely accessible and making it.
01:59:49.060 | But the core ideas of proving it for n equals three are hard,
01:59:53.860 | but they do get you real ideas
01:59:55.340 | about algebraic number theory.
01:59:57.360 | It involves looking at a number field that's,
01:59:59.540 | it lives in the complex plane.
02:00:00.720 | It looks like a hexagonal lattice.
02:00:02.460 | And you start asking questions about factoring numbers
02:00:04.700 | in this hexagonal lattice.
02:00:06.380 | So it takes a while, but I've talked about this sort of,
02:00:08.100 | like, lattice arithmetic in other contexts.
02:00:11.580 | And you can get to an okay understanding of that.
02:00:15.580 | And the things that make Fermat's last theorem hard
02:00:17.460 | are actually quite deep.
02:00:19.340 | And so the cases that we can solve it for,
02:00:21.340 | it's like you can get these broad sweeps
02:00:23.060 | based on some hard, but, like,
02:00:25.220 | accessible bits of number theory.
02:00:28.700 | But before you can even understand
02:00:30.100 | why the general case is as hard as it is,
02:00:31.960 | you have to walk through those.
02:00:33.640 | And so any other attempt to describe it
02:00:35.840 | would just end up being, like, shallow
02:00:37.500 | and not really productive for the viewer's time.
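A sketch, in symbols, of the setup being alluded to for the n equals three case: the hexagonal-looking lattice is the ring of Eisenstein integers, and the classical argument leans on a factorization that only appears once you work in that ring.

```latex
% Eisenstein integers: \mathbb{Z}[\omega], with \omega a primitive cube root of unity.
\[
  \omega = e^{2\pi i/3} = \frac{-1 + i\sqrt{3}}{2},
  \qquad \omega^2 + \omega + 1 = 0 .
\]
% In this ring the n = 3 Fermat equation factors completely:
\[
  x^3 + y^3 = (x + y)\,(x + \omega y)\,(x + \omega^2 y) = z^3 ,
\]
% and because \mathbb{Z}[\omega] has unique factorization, one can analyze how three
% such factors could multiply to a perfect cube, which drives the proof for this exponent.
```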
02:00:40.340 | I think the same goes for most, like,
02:00:43.260 | unsolved problem type things,
02:00:44.440 | where I think, you know, as a kid,
02:00:46.820 | I was actually very inspired by the twin prime conjecture.
02:00:49.620 | That, like, totally sucked me in.
02:00:50.700 | It's this thing that was understandable.
02:00:52.420 | I kind of had this dream, like,
02:00:53.260 | "Oh, maybe I'll be the one
02:00:54.100 | to prove the twin prime conjecture."
02:00:56.020 | And new math that I would learn
02:00:57.520 | would be, like, viewed through this lens of, like,
02:00:59.600 | "Oh, maybe I can apply it to that in some way."
02:01:01.800 | But you sort of mature to a point where you realize
02:01:04.900 | that you should spend your brain cycles
02:01:09.060 | on problems that you will see resolved,
02:01:11.020 | 'cause then you're gonna grow to see
02:01:13.360 | what it feels like for these things to be resolved,
02:01:15.420 | rather than spending your brain cycles
02:01:16.900 | on something where it's not gonna pan out.
02:01:19.620 | And the people who do make progress towards these things,
02:01:22.580 | like James Maynard is a great example here
02:01:25.300 | of, like, young, creative mathematician
02:01:27.060 | who pushes in the direction of things
02:01:29.340 | like the twin prime conjecture.
02:01:31.080 | Rather than hitting that head on,
02:01:32.160 | just see all the interesting questions
02:01:33.940 | that are hard for similar reasons,
02:01:35.260 | but become more tractable,
02:01:36.300 | and let themselves really engage with those.
02:01:38.460 | So I think people should get in that habit.
02:01:41.300 | I think the popularization of physics
02:01:42.860 | should encourage that habit through things like
02:01:46.100 | the physics of simple everyday phenomena,
02:01:48.040 | because it can get quite deep.
02:01:49.700 | And yeah, I think I've heard a lot of the interest
02:01:53.940 | that people send me messages
02:01:55.980 | asking to explain Weinstein's thing,
02:01:57.700 | or asking to explain Wolfram's thing.
02:01:59.520 | One, I don't understand them, but more importantly--
02:02:02.460 | - It's too big a bite to--
02:02:06.140 | - You shouldn't be interested in those, right?
02:02:08.900 | - It's a giant sort of ball of interesting ideas.
02:02:12.660 | There are probably a million interesting ideas in there
02:02:14.940 | that individually could be explored effectively.
02:02:17.700 | - And to be clear,
02:02:18.540 | you should be interested in fundamental questions.
02:02:20.100 | I think that's a good habit
02:02:21.100 | to ask what the fundamentals of things are.
02:02:23.060 | But I think it takes a lot of steps to...
02:02:27.780 | Certainly you shouldn't be trying to answer that
02:02:29.380 | unless you actually understand quantum field theory,
02:02:31.340 | and you actually understand general relativity.
02:02:33.380 | - That's the cool thing about your videos,
02:02:35.260 | even for people who haven't done mathematics.
02:02:37.220 | If you really give it time,
02:02:38.580 | watch it a couple of times,
02:02:39.700 | and try to reason about it,
02:02:42.660 | you can actually understand the concept
02:02:44.140 | that's being explained.
02:02:45.260 | - And it's not a coincidence
02:02:46.440 | that the things I'm describing aren't
02:02:48.740 | the most up-to-date progress
02:02:51.380 | on the Riemann hypothesis cousins,
02:02:53.340 | or the contexts in which the analog
02:02:56.300 | of the Riemann hypothesis has been solved
02:02:57.980 | in more discrete-feeling finite settings
02:03:01.060 | that are more well-behaved.
02:03:02.460 | I'm not describing that
02:03:03.800 | because it just takes a ton to get there.
02:03:05.780 | And instead, I think it'll be productive
02:03:08.520 | to have an actual understanding of something
02:03:10.860 | that you can pack into 20 minutes.
02:03:12.980 | - I think that's beautifully put.
02:03:14.620 | Ultimately, that's where the most satisfying thing
02:03:16.780 | is when you really understand.
02:03:18.280 | Yeah, really understand.
02:03:20.900 | - Build a habit of feeling what it's like
02:03:22.900 | to actually come to resolution.
02:03:24.900 | - As opposed to, and it can also be enjoyable,
02:03:29.560 | just being in awe of the fact
02:03:31.980 | that you don't understand anything.
02:03:33.060 | - Yeah, that's not like...
02:03:34.460 | I don't know, maybe people get entertainment out of that,
02:03:37.100 | but it's not as fulfilling as understanding.
02:03:40.480 | - You won't grow.
02:03:42.100 | - Yeah, but also just the fulfillment.
02:03:45.000 | It really does feel good
02:03:46.160 | when you first don't understand something, and then you do.
02:03:49.620 | That's a beautiful feeling.
02:03:51.480 | Hey, let me ask you one last...
02:03:54.340 | Last time, it got awkward and weird
02:03:55.860 | about a fear of mortality, which you made fun of me for,
02:03:59.260 | but let me ask you the other absurd question:
02:04:02.080 | what do you think is the meaning of our life,
02:04:06.540 | the meaning of life?
02:04:08.020 | - I'm sorry if I made fun of you about mortality.
02:04:10.520 | - No, you didn't.
02:04:11.360 | I'm just joking.
02:04:12.180 | It was great.
02:04:13.800 | - I don't think life has a meaning.
02:04:15.320 | I think meaning...
02:04:16.960 | I don't understand the question.
02:04:18.560 | I think meaning is something that's ascribed
02:04:20.320 | to stuff that's created with purpose.
02:04:22.360 | There's a meaning to this water bottle label,
02:04:25.320 | in that someone created it
02:04:26.400 | with a purpose of conveying meaning,
02:04:27.880 | and there was one consciousness
02:04:29.360 | that wanted to get its ideas into another consciousness.
02:04:32.200 | Most things don't have that property.
02:04:35.520 | - It's a little bit like if I ask you,
02:04:37.800 | what is the height?
02:04:38.760 | - All right, so it's all relative.
02:04:42.080 | - Yeah, you'd be like, the height of what?
02:04:44.040 | You can't ask what is the height without an object.
02:04:46.560 | You can't ask what is the meaning of life
02:04:48.020 | without an intentful consciousness putting it...
02:04:52.040 | I guess I'm revealing I'm not very religious.
02:04:54.320 | - But the mathematics of everything
02:04:57.240 | seems kind of beautiful.
02:04:58.680 | It seems like there's some kind of structure
02:05:03.560 | relative to which you could calculate the height.
02:05:07.600 | - But what I'm saying is I don't understand the question,
02:05:09.720 | what is the meaning of life,
02:05:10.560 | in that I think people might be asking something very real.
02:05:13.680 | I don't understand what they're asking.
02:05:14.760 | Are they asking why does life exist?
02:05:16.880 | How did it come about?
02:05:17.760 | What are the natural laws?
02:05:19.240 | Are they asking, as I'm making decisions day by day
02:05:22.160 | for what should I do, what is the guiding light
02:05:24.480 | that inspires what should I do?
02:05:26.280 | I think that's what people are kind of asking.
02:05:27.920 | - But also, the thing that gives you joy
02:05:32.200 | about education, about mathematics,
02:05:35.880 | what the hell is that?
02:05:37.200 | - Interactions with other people.
02:05:40.160 | Interactions with like-minded people,
02:05:41.560 | I think is the meaning of, in that sense.
02:05:44.160 | - Bringing others joy, essentially.
02:05:46.240 | In something you've created,
02:05:48.840 | it connects with others somehow.
02:05:50.760 | And the same vice versa.
02:05:53.560 | - I think that is what,
02:05:55.600 | when we use the word meaning, we mean:
02:05:57.200 | you're sort of filled with a sense of happiness
02:05:59.280 | and energy to create more things.
02:06:01.120 | "I have so much meaning taken from this."
02:06:03.040 | Like that, yeah, that's what fuels my pump, at least.
02:06:06.520 | - So a life alone on a deserted island
02:06:09.160 | would be kind of meaningless.
02:06:11.120 | - Yeah, you wanna be alone together with someone.
02:06:14.000 | - I think we're all alone together.
02:06:15.720 | I think there's no better way to end it, Grant.
02:06:18.440 | You've been... The first time we talked, it was amazing.
02:06:20.600 | Again, it's a huge honor that you make time for me.
02:06:23.360 | I appreciate talking with you.
02:06:24.360 | Thanks, man. - This was fun.
02:06:25.200 | - Awesome.
02:06:26.760 | Thanks for listening to this conversation
02:06:28.320 | with Grant Sanderson.
02:06:29.520 | And thank you to our sponsors,
02:06:31.520 | Dollar Shave Club, DoorDash, and Cash App.
02:06:34.640 | Click the sponsor links in the description
02:06:36.600 | to get a discount and to support this podcast.
02:06:39.760 | If you enjoy this thing, subscribe on YouTube,
02:06:42.200 | review it with Five Stars on Apple Podcasts,
02:06:44.280 | follow on Spotify, support on Patreon,
02:06:46.840 | or connect with me on Twitter @LexFriedman.
02:06:50.680 | And now let me leave you with some words
02:06:52.680 | from Richard Feynman.
02:06:54.200 | "I have a friend who's an artist
02:06:56.360 | "and has sometimes taken a view
02:06:57.920 | "which I don't agree with very well.
02:07:00.360 | "He'll hold up a flower and say,
02:07:01.720 | "Look how beautiful it is.
02:07:03.240 | "And I'll agree.
02:07:04.760 | "Then he says, I as an artist
02:07:06.960 | "can see how beautiful this is,
02:07:08.560 | "but you as a scientist take this all apart
02:07:11.480 | "and it becomes a dull thing.
02:07:13.760 | "And I think he's kind of nutty.
02:07:15.960 | "First of all, the beauty that he sees
02:07:17.560 | "is available to other people and to me too, I believe.
02:07:21.360 | "Although I may not be quite as refined aesthetically
02:07:24.280 | "as he is, I can appreciate the beauty of a flower.
02:07:27.680 | "At the same time, I see much more
02:07:29.200 | "about the flower than he sees.
02:07:31.120 | "I can imagine the cells in there,
02:07:33.200 | "the complicated actions inside, which also have a beauty.
02:07:36.960 | "I mean, it's not just beauty at this dimension
02:07:38.800 | "at one centimeter,
02:07:39.920 | "there's also beauty at smaller dimensions,
02:07:42.400 | "the inner structure, also the processes.
02:07:46.160 | "The fact that the colors in the flower evolved
02:07:48.240 | "in order to attract insects to pollinate it is interesting.
02:07:51.880 | "It means that insects can see the color.
02:07:54.720 | "It adds a question.
02:07:56.000 | "Does this aesthetic sense also exist in the lower forms?
02:07:59.580 | "Why is it aesthetic?
02:08:01.140 | "All kinds of interesting questions
02:08:02.620 | "which the science knowledge only adds to the excitement,
02:08:05.780 | "the mystery and the awe of a flower.
02:08:08.560 | "It only adds.
02:08:10.000 | "I don't understand how it subtracts."
02:08:12.280 | Thank you for listening and hope to see you next time.
02:08:16.560 | (upbeat music)
02:08:19.140 | (upbeat music)