Grant Sanderson: Math, Manim, Neural Networks & Teaching with 3Blue1Brown | Lex Fridman Podcast #118
Chapters
0:00 Introduction
5:13 Richard Feynman
9:41 Learning deeply vs broadly
13:56 Telling a story with visualizations
18:43 Topology
23:52 Intuition about exponential growth
32:28 Elon Musk's exponential view of the world
40:09 SpaceX and space exploration
45:28 Origins of the Internet
49:50 Does teaching on YouTube get lonely?
54:31 Daily routine
1:00:20 Social media
1:10:38 Online education in a time of COVID
1:27:03 Joe Rogan moving to Spotify
1:32:09 Neural networks
1:38:30 GPT-3
1:46:52 Manim
1:51:01 Python
1:56:21 Theory of everything
2:03:53 Meaning of life
The following is a conversation with Grant Sanderson,
the creator of 3Blue1Brown,
a YouTube channel where he educates and inspires the world
to get a discount and to support this podcast,
I think that this pandemic challenged millions of educators
of mathematical concepts that may otherwise seem difficult
or out of reach for students and curious minds.
researchers, and people who just enjoy sharing knowledge,
and difficult concepts and present them in a way
That is the challenge that is worth taking on.
My dream is to see more and more of my colleagues at MIT
and create the canonical explainer videos on a topic
that they know more than almost anyone else in the world.
the economic pain, the psychological and medical toll
of the virus, masterfully crafted educational content
If you enjoy this thing, subscribe on YouTube,
which you already probably have done a long time ago,
and subscribe to the 3Blue1Brown YouTube channel,
It's the best way to support this podcast as always.
and actually signed up when I first heard about them
but they encouraged me to try the shave butter,
Again, try the Ultimate Shave Starter Set today
plus free shipping at dollarshaveclub.com/lex.
when you download the DoorDash app and enter code LEX.
I have so many memories of working late nights
to argue about which DoorDash restaurant to order from.
those moments of bonding, of exchanging ideas,
of pausing to shift attention from the programs
These days, for a bit of time, I'm on my own, sadly,
But actually, DoorDash is still there for me.
There's a million options that fit into my keto diet ways.
Also, it's a great way to support restaurants
and zero delivery fees on your first order of $15 or more.
It's one of the best-designed interfaces of an app
To me, good design is when everything is easy and natural.
Anyway, there's a big part of my brain and heart
and also to appreciate great design by others.
So again, if you get Cash App from the App Store,
Google Play, and use code LEXPODCAST, you get $10,
an organization that is helping to advance robotics
and STEM education for young people around the world.
And now, here's my conversation with Grant Sanderson.
You've spoken about Richard Feynman as someone you admire.
I think last time we spoke, we ran out of time.
- I mean, I think a ton of people like Feynman.
It's a little bit cliche to say that you like Feynman.
and you just point to the Super Bowl or something
But I do actually think there's a layer to Feynman
One thing that just really struck me was this letter
that he wrote to his wife two years after she died.
So during the Manhattan Project, she had tuberculosis.
almost this mildly sexist, womanizing philanderer,
and I can try to pull it up for you if you want,
and it's just this absolutely heartfelt letter
And it shows you that the Feynman that we've all seen
And I think the same kind of goes in his science,
but I remember reading a book about Feynman in a cafe once,
And I don't understand how that can possibly be
The reality of it is he was deeply in love with math
into seeing that physics was a way to realize that,
was instead poured towards things like fundamental,
just emergent phenomena and everything like that.
is this constant desire to reinvent it for himself.
he would start to see what problem he was trying to solve
And then from there, see what others had done.
when you first are inspired by a certain idea
Do you try to rediscover everything yourself?
- So I think the things that I've learned best
are the ones that have some element of rediscovery.
And this is, for my part, it's actually a big fault.
Like this is part of why I'm not an active researcher.
The stuff that I do learn, I try to learn it really well.
But other times you do need to get through it
So obviously you need to be well-equipped to read things
But I think if you choose a few core building blocks
I'm really gonna try to approach it for myself.
No matter what, you gain all sorts of inarticulatable
For example, you're gonna be trying to come up
You're gonna try to come up with intuitive examples,
I think there was a period when like the rest of physics
- You know, just to link in a small point you made,
that you're not a quote unquote active researcher.
Do you, you're swimming often in reasonably good depth
'cause you probably built up a hell of an amazing intuition
about what is and isn't true within these worlds.
- Yeah, I think one of my biggest regrets from undergrad
And I think a big part of success in research
and like people giving you the kind of scaffolded problems
I feel like I'm pretty good at exposing math to others
I, are you familiar with like the hedgehog fox dynamic?
I think this was either the ancient Greeks came up with it
There's the fox that knows many different things
and then the hedgehog that knows one thing very deeply.
He's someone who knows many different things,
just very foundational in a lot of different fields.
thinking really deeply about one particular thing
and both are very necessary for making progress.
So between those two, I would definitely see myself
as like the fox where I'll try to get my paws
to make like a significant contribution to any of them.
But I do see value in like having a decently deep
Like most people who know computer science really deeply
or many of the aspects, like different fields in math even.
Let's say you have like an analytic number theory
place one PhD student into a seminar of another one's,
they don't understand what the other one's saying at all.
Like you take the complex analysis specialist
But I think going around and like trying to have some sense
I don't know if I would ever make like new contributions
where there's kind of a notion of things that are known,
- Well, first of all, I think most people would agree
the way you see the world is fundamentally often new.
the multi-dimensional visualization we'll talk about.
I mean, you're revealing something very interesting
that yeah, just feels like research, feels like science,
feels like the cutting edge of the very thing
of which like new ideas and new discoveries are made of.
- I do think you're being a little bit more generous
because I sometimes think when I research a video,
like the stuff that I cover is from the actual depth.
but I think that could also be a mathematics thing.
you feel like you've basically mastered the field.
- Well, everything is either trivial or impossible.
And it's like a shockingly thin line between the two
where you can find something that's totally impenetrable.
And then after you get a feel for it, it's like,
oh yeah, that whole subject is actually trivial in some way.
Every researcher is just on the other end of that hump
- What do you think about Feynman's teaching style
or, from another perspective, his use of visualization?
because people have described like the Feynman effect
So as an entertainment session, it's wonderful
because it gives you this intellectual satisfaction
But the Feynman effect is that you can't really recall
what it is that gave you that insight even a week later.
And this is true of a lot of books and a lot of lectures
where the retention is never quite what we hope it is.
where at best it's giving this kind of intellectual candy
to just make sure that you have like the building blocks
which is how damn satisfying they are to consume
might actually also reveal a little bit of the flaw
that we should as educators all look out for,
the awesome thing that Feynman couldn't do at the time
You can like, here, let's take the value of this variable
- Yeah, well, so what's interesting is you're saying that,
in the sense that there's a play button and a pause button.
why don't you program it into an interactable version?
which I should do and that like would be better.
just sort of consume what the author had in mind.
And maybe that small sliver is actually who you're targeting
But most people consume it just as a piece of
that maybe you tweak with the example a little bit
But in that way, I do think like a video can get
as long as you make the interactive for yourself
and you decide what the best narrative to spin is.
As a more concrete example, like my process with,
I made this video about SIR models for epidemics.
And it's like this agent-based modeling thing
where you tweak some things about how the epidemic spreads
and you wanna see how that affects its evolution.
My format for making that was very different than others
where rather than scripting it ahead of time,
I just made the playground and then I played a bunch
and then I saw what stories there were to tell within that.
It had like five or six stories or whatever it was.
- And here's five things I found after playing with it.
a way that you could do that project is you make the model
Like come to my website where you interact with this thing.
And people did like sort of remake it in a JavaScript way
But I think a meaningful part of the value to add
what's the interesting thing to walk through here.
And even though there's lots of other interesting paths
and you're given this tool with like five different sliders
and you're told to like play and discover things.
Like a little bit of guidance in that direction
to make someone want to imagine more about it.
but it's outside the scope of this video essentially.
But I'll leave it to you as homework essentially
to like figure out it's a cool thing to explore.
- I wish I could say that wasn't a function of laziness.
- Right, and that's like you've worked so hard
that to extend it out even further would take more time.
the homeomorphic, like from the Möbius strip to the--
- Well, I hope that's not exactly how I phrase it
and then to want to know what aspects of a Möbius strip
do you wanna formalize such that you can prove
I want to fit it such that it's all above the plane
I don't think I can do that without crossing itself
for what it means to be orientable or non-orientable.
of a topology textbook start to make a little more sense.
- Yeah, and I mean, that whole video, beautifully,
but I do think sometimes it's popularized in the wrong way
where you'll hear these things of people saying,
oh, topologists, they're very interested in surfaces
that you can bend and stretch but you can't cut or glue.
Like, there's all sorts of things you can be interested in
with random, like, imaginative manipulations of things.
Is that really what like mathematicians are into?
It's not as if someone was sitting there thinking like,
if I add some arbitrary rules about when I can't cut it
You just wanna maintain a notion of closeness.
And once you get it to those general structures,
translate into non-trivial facts about other parts of math.
And that, I don't think that's actually like popularized.
I don't even think it's emphasized well enough
when you're starting to take a topology class
you're just talking about coffee mugs and donuts,
and you're talking about the axiom systems with open sets
and an open set is not the opposite of a closed set.
like really have to walk through mud to get there.
about how this relates to the beautiful things
about coffee mugs and Möbius strips and such.
And it takes a really long time to actually see,
But I don't think it needs to take that time.
But I've also seen it in my narrow view of like,
- Yeah, you have like facts that seem very strange.
and like, let's say all the molecules settled
You have all sorts of fixed point theorems like this, right?
directly relevant to Nash equilibriums, right?
Is this what we're paying our mathematicians for?
You have this very elegant mapping onto economics
or very, I shouldn't say concrete, very tangible,
because you have to get a little bit technical
just shy away from being a little too technical.
By the way, for people who are watching the video,
- The snuggle is real. - The snuggle is real.
I think there's certain ideas there of growth,
And it's, again, people should watch the video.
those people, they go really far in terms of modeling,
like how many people you encounter in certain situations,
when you go to a school, when you go to a mall,
they like model every aspect of that for a particular city.
And natural patterns that people have, it's crazy.
Well, because I don't want to pretend like I'm an epidemiologist.
Like we have a ton of armchair epidemiologists.
And also just like get ourselves in a position
They should point to all the ways that it's wrong,
and what that represents and what it can imply.
and you're in a population which is completely susceptible,
that you're gonna infect during your infectiousness?
So certainly during the beginning of an epidemic,
this basically gives you kind of the exponential growth rate.
As it goes on, and let's say it's something endemic
then the R-naught value doesn't tell you that as directly
because a lot of the people you interact with
aren't susceptible, but in the early phases it does.
If you can get it below one, then it's no longer epidemic.
and giving some intuitions on how do certain changes
And then what does that imply for exponential growth?
and they're like resilient to all of the chaoses
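To ground the R-naught numbers, here is a minimal sketch in Python, assuming the textbook SIR equations rather than the agent-based simulation from the video; simulate_sir, beta, and gamma are illustrative names. R0 = beta / gamma, and early on the infected fraction grows roughly like exp((beta - gamma) * t), so R0 above one means growth and below one means decline.

```python
def simulate_sir(beta=0.3, gamma=0.1, days=160, dt=1.0):
    """Euler-step the standard SIR model; s, i, r are population fractions."""
    s, i, r = 0.999, 0.001, 0.0
    history = []
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt   # transmission
        new_recoveries = gamma * i * dt      # recovery
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# R0 = beta / gamma = 3 here: early exponential growth, then the curve
# bends as the susceptible pool s shrinks, exactly as described above.
peak = max(i for _, i, _ in simulate_sir())
print(f"peak infected fraction ~ {peak:.2f}")
```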
- I mean, one of the interesting aspects of that
is just exponential growth and we think about growth.
Is that one of the first times you've done a video on,
no, of course not, the whole Euler's identity.
I've done a lot of videos about exponential growth
do you think we're able to reason intuitively
- It's funny, I think it's extremely intuitive to humans
and then I think it can become intuitive again
where you're studying a group that has been disassociated
and you ask what number is between one and nine?
you've got one rock and you've got nine rocks,
you're like, what pile is halfway in between these?
that's the number that sits right between one and nine.
and the kind of just basic arithmetic that we have
isn't in a society, the natural instinct is three
because it's in between in an exponential sense
and a geometric sense that one is three times bigger
and then the next one is three times bigger than that.
So it's like, if you have one friend versus 100 friends,
Yeah, 10 friends seems like the social status
And for some reason, we kind of train it out of ourselves
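The instinct that three sits halfway between one and nine is just the geometric mean, equal multiplicative steps rather than equal additive ones. A two-line illustration:

```python
import math

a, b = 1, 9
print((a + b) / 2)       # 5.0: halfway in equal *additive* steps
print(math.sqrt(a * b))  # 3.0: halfway in equal *multiplicative* steps,
                         # since 1 -> 3 -> 9 multiplies by 3 each time
```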
- So in the sense, yeah, the early basic math
It's the same criticism if there's any of science
is the lessons of science make us like see the world
to where we have an over-exaggerated confidence
as opposed to just understanding a small slice of it.
But I think that probably only really goes for small numbers
So I bet if you took that same setup and you asked them,
oh, if I keep tripling the size of this rock pile,
that I think is pretty counterintuitive to us,
but that you can basically train into people.
Like I think computer scientists and physicists,
when they're looking at the early numbers of like COVID,
oh God, this is following an exact exponential curve.
So it's, and almost all of them are like techies
probably just 'cause I like live in the Bay Area, but.
- But for sure, they're cognizant of this kind of,
this kind of growth is present in a lot of natural systems
I mean, there's a lot of ways to visualize this obviously,
- One grain of rice for the first square of the chessboard, then two grains of rice
for the next square, then twice that for the next square
more grains of rice than there are anything in the world
Like for some reason, that's a really compelling
illustration of how poorly our intuition breaks down, just like you said,
but of rocks, but after a while it's game over.
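The chessboard arithmetic checks out: doubling from one grain across 64 squares totals 2^64 - 1 grains.

```python
total = sum(2**k for k in range(64))  # 1 + 2 + 4 + ... + 2**63
assert total == 2**64 - 1
print(f"{total:.3e} grains")  # ~1.845e+19, far beyond any real harvest
```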
for gauging someone's intuitive understanding
of exponential growth is I've got like a lily pad
on a lake, really big lake, okay, like Lake Michigan.
And that lily pad replicates, it doubles each day
And after 50 days, it actually is gonna cover
- So you have a good instinct for exponential growth.
So I think a lot of like the knee jerk reaction
is sometimes to think that it's like half the amount
that like after 49 days, you've only covered half of it.
- Yeah, I mean, that's the reason you heard a pause from me.
So even when you know the fact and you do the division,
it's like, wow, so you've gotten like that whole time
And then after that, it gets the whole thing.
But I think you can make that even more visceral
how long until it's covered 1% of the lake, right?
How many times you have to double to get over 100?
Like seven, six and a half times, something like that.
So at that point, you're looking at 43, 44 days into it.
But then next thing you know, it's the entire lake.
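Concretely: if coverage doubles daily and the lake is full on day 50, coverage on day d is 2^(d - 50) of the lake, so day 49 is half and roughly day 43 is one percent.

```python
import math

def coverage(day, full_day=50):
    return 2.0 ** (day - full_day)  # fraction of the lake covered

print(coverage(49))          # 0.5: half covered the day before
print(50 + math.log2(0.01))  # ~43.4: the ~1% day, i.e. about six and a
                             # half doublings short of the full lake
```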
- You're wearing a SpaceX shirt, so let me ask you.
- Let me ask you, one person who talks about exponential,
So he kind of advocates the idea of exponential thinking
realizing that technological development can,
at least in the short term, follow exponential improvement,
our ability to reason about what is and isn't impossible.
So he's a big, one, it's a good leadership kind of style
of saying like, look, the thing that everyone thinks
is impossible is actually possible because exponentials.
this exponential pattern that holds shockingly well.
I think the people who took Moore's Law seriously
it's not gonna be too long before these giant computers
that are either batch processing or time-shared,
you have people predicting smartphones a long time ago.
And it's only out of this, I don't wanna say faith
is to really understand why exponential growth happens
and that the mechanism behind it is when the rate of change
is proportional to the thing in and of itself.
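That mechanism in symbols: when the rate of change is proportional to the current amount, the solution is an exponential.

```latex
\frac{dN}{dt} = kN \quad\Longrightarrow\quad N(t) = N_0\, e^{kt}
```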
So the reason that technology would grow exponentially
The advent of the internet makes it faster to learn things,
will grow exponentially, that the more resources
a company has, if it knows how to use them well,
I think a big part of that is 'cause you have the sense
what you want is to make sure that the things
and they enable further development of the adjacent parts.
and you're blindly drawing a line through it.
when do you have this proportional growth property?
Because then you can also recognize when it breaks down.
Like in an epidemic, as you approach saturation,
you can make it maybe not break down as being an exponential,
but it can seriously slow what that exponential rate is.
you want to minimize barriers that prevent the spread.
is so that you do hold up, that rate holds up.
And that's almost like an operational challenge
is that any one invention has a ripple that's unstopped.
Like it might not actually be an example of exponentials
because of something which grows in proportion to itself,
but instead it's almost like a benchmark that was set out
and just let the lily pad grow across the lake phenomenon.
- And it's also, there's a human psychological level for sure
like saying that, look, there is, you know, Moore's law.
You know, we've achieved it for the last decade,
for the last two decades, for the last three decades.
You just keep going and it somehow makes it happen.
how few people do the best work in the world,
like in that particular, whatever that field is.
I mean, you can argue that community matters,
but it's certain, like I've been in groups of engineers
doing an incredible amount of work and just is the genius.
is maybe the whole point is to create an atmosphere
like have the opportunity to do the best work of their life.
And yeah, and that the exponential is just milking that.
It's like rippling the idea that it's possible.
And that idea that it's possible finds the right people
The idea that it's possible finds the right runners
to run it and then it explodes the number of people
have way more potential than we ever realized.
A good book to read if you want that sense is "Peak,"
which essentially talks about peak performance
in a lot of different ways, like chess, London cab drivers,
how many pushups people can do, short-term memory tasks.
And it's meant to be like a concrete manifesto
wow, no matter how good people are at something,
but I do think it's a true phenomenon that's interesting.
Like the advent of computing seems like a prerequisite
You have this truth about physics and the world
You could find Lorenz's equations without computers.
But in practice, it was just never gonna be analyzed
that way unless you were doing like a bunch of simulations
and that you could computationally see these models.
That self-proportionality, that's exponential.
That'll make the video game fun, whoever created this thing.
So, I mean, since you're wearing a SpaceX shirt,
- So Crew Dragon, the first crewed mission out into space
And it's the first time ever by a commercial company,
I mean, it's an incredible accomplishment, I think,
The idea of seeing it basically done by smaller entities.
but moving in the direction of not necessarily requiring
an entire country and its government to make it happen,
'cause it's not like they're unilaterally saying,
like we're just shooting people up into space.
It's just a sign that we're able to do more powerful things
- I hope we see people land on Mars in my lifetime.
I think there's a ton of challenges there, right?
Like radiation being kind of the biggest one.
And I think there's a ton of people who look at that
who are like genuinely inspired about broadening
with like super long-term visions of terraforming.
- Sorry, backing up the light of consciousness?
- Yeah, the thought that if Earth goes to hell,
I think that's a reason to like get up in the morning.
And I feel like most employees at SpaceX feel that way too.
Like either AGI kills us first, or if we're like allowed.
- Well, like honestly, it would take such a long time.
Like, okay, you might have a small colony, right?
But not like people living comfortably there.
- Yeah, I mean, there's a lot of possibilities
where it could be just, it doesn't have to be on a planet.
That doesn't have to deal with the constraints
Yeah, all of the people who are like skeptical about it
are like, "Why do we care about going to Mars?"
It's like, what makes you care about anything?
because exactly as you put it on a philosophical level,
is because we do these things because they're hard.
There's something in the human spirit that says like,
you know what, I'm not gonna back down from this.
There's something to be discovered in overcoming this thing.
and I also like this about the moon missions,
sure it's kind of arbitrary, but you can't move the target.
And when that happens, it just demands actual innovation.
Like protecting humans from the radiation in space
on the flight there, while there, hard problem,
You can't move the goalposts to make that easier.
there's probably gonna be these secondary benefits
as something that has a deadline, which is the problem.
then the amount of things we would come up with
by forcing ourselves to figure out how to colonize
This is what people, like the internet didn't get created
because people sat down and tried to figure out
They, you know, it was, there's an application.
- It must've been very low level basic network communication
how do I send information securely between two places?
I'm totally speaking totally outside of my knowledge,
there was like this small community of people
who were really interested in time-sharing computing
basically meaning you can have multiple people
like logged in and using that like central computer,
And this was kind of what I had always thought like,
oh, is this like fringe group that was interested
But the thing is like DARPA wouldn't actually,
you wouldn't have the US government funding that
In some sense, that's what ARPA was all about
and it doesn't have to pay out with utility soon.
But the core parts of its development were happening
when there were budgetary constraints all over the place.
basically justifying the budget for the ARPANET
were having their funding cut 'cause of the war.
in terms of having like a more robust communication system,
like the idea of packet switching versus circuit switching.
trying to get like interactive computing out there.
like basically saying all the great science we've done
in the 20th century was like because of the military.
Another way to see the military and national security
is like a source of, like you said, deadlines
Like almost, you know, almost like scaring yourself
I mean, the Manhattan Project is a perfect example,
That one is a little bit more macabre than others
but in terms of how many focused, smart hours
of human intelligence get pointed towards a topic per day,
you're just maxing it out with that sense of worry.
In that context, everyone there was saying like,
we've got to get the bomb before Hitler does,
especially for researchers that are otherwise
sitting there kind of inventing a notion of computation
in order to like compute what they needed to compute
more quickly with like the rudimentary automated tools
where you've got otherwise very theorizing minds
in very pragmatic contexts that I think is like
So I think that stuff can be positive for progress.
- You mentioned Bell Labs and Manhattan Project.
This kind of makes me curious for the things you've created
or just not YouTube, it doesn't matter what it is.
It's just teaching content, art, doesn't matter.
So like my question for you is that, does it get lonely?
- Honestly, that right there I think is the biggest part
of my life that I would like to change in some way
and I'm like, goddamn, I love that whole situation.
and then you see that he also shared an office with Shannon.
- And they actually probably very likely worked separately.
- But there's a literally, and sorry to interrupt,
like on the way to like getting a snack or something.
It's like puzzles that colleagues are sharing,
But it's not in the day-to-day in the same way,
- That's one of the, I would say one of the biggest,
but like chance collisions are significantly reduced.
I just miss the happenstance serendipitous conversations
- It's like, I mean, you can't do it in an academic setting,
and sitting there just for the strangers you might meet,
just the strangers or striking up a conversation
like maybe myself or maybe a lot of academic types
So it's nice when it's forced, those chance collisions,
like you probably hit moments when you look at this
and you say like, this is the wrong way to show it.
All that self-doubt that could be paralyzing.
I would like it to be in an environment with others
and like collaborative in the sense of ideas exchanged.
when you say this is too long, this is too short,
And I know that's just a thing that I'm not good at.
So in that way, it's very easy to just throw away a script
It's hard to tell someone else they should do the same.
I think it was like very close to me talking to Don Knuth.
- It's the hard, no, can I brag about something?
- My favorite thing is Don Knuth, after we did the interview,
"What's the favorite interview you've ever done?"
He prefers that struggle, the struggle of it.
you know, often talk about like their process
like when they sit down, like how they like their desk.
like what they like to do, how long they like to work for,
what enables them to think deeply, all that kind of stuff.
Is there, if you were to lay out a perfect, productive day,
what would that schedule look like, do you think?
'cause like the mode of work I do changes a lot
Sometimes I'm like working on the animation library.
but something in the direction of software engineering.
So those are like four very different modes of what it,
some days it's like get through the email backlog
of people I've been, tasks I've been putting off.
like the idea starts with research and then there's scripting
and then there's programming and then there's the showtime.
like what's I think a problematic way to do it
Instead, it should be that you're like ambiently learning
And then once you feel like you have the understanding
Otherwise, either you're gonna end up roadblocked forever
or you're just not gonna like have a good way
But still some of the days it's like the thing to do
And like you didn't do anything the last two days,
so you came up with excuses to procrastinate,
If we're writing, yeah, that's what's required
like the solution to writer's block is to read.
That, when it's a nice cycle, I think can work very well.
- Problem-solving videos, I know where it ends.
Expositional videos, I don't know where it ends.
you have such a big bag of aha moments already
if I see like, even when I asked for people to ask,
about like certain videos they would love you to do.
'cause like whenever I see them, people give ideas,
they're all like very often really good ideas.
when I go through a library or through a bookstore,
and don't let yourself lament the ones that stay closed.
Do you try to dedicate like a certain number of hours?
Do you, Cal Newport has this deep work kind of idea.
- There's systematic people who like get really on top of,
you know, their checklist of what they're gonna do in the day
And I am not a systematic person in that way.
if I was systematic in that way, but that doesn't happen.
So, you know, you talk to me, talk to me later in life
- I think Benjamin Franklin, like later in life,
- I think those schedules are much more fun to write.
and make a blog post about like the perfect productive day.
but I don't know how much people get out of like reading them
- And I'm not even sure that they're ever followed.
You're always gonna write it as the best version of yourself.
but not really wanting to get out of bed and all of that.
- And just like zoning out for random reasons
or the one that people probably don't touch at all is,
but then I'll go, I'll have like a two week period
where it's just like, I'm checking the internet.
Like, I mean, it's probably some scary number of times.
- I think a lot of people can resonate with that.
because as long as it's a kind of socializing,
like if you're actually engaging with friends
but it's definitely an addiction because for me,
If I look at a day where I've checked social media a lot,
like if I just aggregate, I did a self-report,
I'm sure I would find that I'm just like literally
When I check it once a day, I'm very, like I'm happy.
when somebody says something not nice to you on the internet
like I virtually, I think about them positively,
but I just feel positively about the whole thing.
If I check it, if I check like more than that,
Like it starts, there's an eating thing that happens,
like anxiety, it occupies a part of your mind
Same with, I mean, you put stuff out on YouTube.
but one of the interesting ones is the study of education
and the psychological aspect of putting stuff up on YouTube.
I like now have completely stopped checking statistics
He checks, he's probably listening to this, stop.
he's new to this whole addiction and he just checks.
- Oh, can I tell you a funny story to that effect
Early on in the channel, my mom would like text me.
She's like, the channel has had 990,000 views.
She's going to the little part on the about page
where you see the total number of channel views.
She had been going every day through all the videos
- And she thought she was like doing me this favor
where you have some number you want to follow
and like, yeah, it's funny that your dad had this.
- I think that's probably a beautiful thing for like parents
'cause comments on your videos are super positive,
but people judge the quality of how something went,
like I see that with these conversations, by the comments.
I'm talking about like CEOs of major companies
They're like, ooh, the comments seem to be positive
- Most important lesson for any content creator to learn
Ask yourself, how often do you write comments
And I think this is important in a number of respects.
Like in my case, I think I would think my content
was better than it was if I just read comments
The thing is, the people who are bored by it,
are put off by it in some way, are frustrated by it,
They're certainly not gonna watch the whole video,
of negative feedback, well-intentioned negative feedback
figure out what they disliked, articulate what they dislike.
that's not well-intentioned, but for that golden kind.
what they were trying to say or whatever have you.
Or we're focusing on things like personal appearances
That's what everyone's response to this video was.
It also translates to realizing that you're not
as important as you might think you are, right?
and are really asking you to create certain things
I have a very real problem with making promises
that you'll have music incorporated into your-
But there's an example of what I had in mind.
"Oh, I think there's a better version of this
- It was like a live performance of this one thing.
that fits having that in a better recording context,
of what will be good content and when it won't be.
But this can actually be incredibly disheartening
that I haven't followed through on X and X, which I get.
is that when there's a topic I haven't promised
it's like the people who would really like this
One of the people that's really inspiring to me
in that regard, 'cause I've really seen it in person,
Joe Rogan, he doesn't read comments, but not just that.
He like legitimate, he's not like clueless about it.
when he just experiences the moment with you, like offline.
You can tell he doesn't give a damn about like,
about anything, about what people think about
whether it's on a podcast, you talk to him
or whether offline about just, it's not there.
how even like what the rest of the day looks like
especially like is what we're doing gonna make
for a good Instagram photo or something like that.
It's, I think for actually quite a lot of people,
and in real life, I show that you can be very successful
'cause it's like, well, there's a huge number of people
But at the same time, the nature of our platforms
is such that the cost of listening to all the positive people
who are really close to you, who are incredible people
slowly being degraded by the natural underlying toxicity
rather than like as many people as you can in a shallow way.
I think that's a good lesson for social media usage.
- Choose just a handful of things to engage with
and engage with it very well in a way that you feel proud of
Honestly, I think the best social media platform is texting.
- Well, yeah, the best social media interaction
is like real life, not social media, but social interaction.
- Which sucks because it's been challenged now
- That is the question of education right now.
So on that topic, you've done a series of live streams
And you went live, which is different than you usually do.
Maybe one, can you talk about how'd that feel?
where all of these educators are now trying to figure out
- For me, it was very different, as different as you can get.
It was a slightly different like level of topics,
although realistically, I'm just talking about things
I think the reason I did that was this thought
that a ton of people are looking to learn remotely,
that if you're looking for a place to point your students,
to be edified about math, just tune in at these times.
Part of the fun of the live interaction was to actually,
or see what questions people were asking in the audience.
I would love to, if I did more things like that
in the future, kind of tighten that feedback loop even more.
as a kind of performance and a kind of livestream performance
And I wrote up this little blog post actually
just on like, just what our setup looked like
and how to integrate like the broadcasting software OBS
I mean, yeah, maybe we could look at the blog post,
- The thing is, I knew nothing about any of that stuff
you could, as a teacher, like it doesn't take that much
to make things look and feel pretty professional.
Like one component of it is as soon as you hook things up
and then you can like have keyboard shortcuts
with a director calling like, go to camera three,
go to camera two, like onto the screen capture.
But I think I had it decently smooth such that,
One, you might get more engagement from the students.
But the biggest reason, I think one of the like
best things that can come out of this pandemic
education-wise, is if we turn a bunch of teachers
sometimes I'll use the phrase commoditizing explanation.
that that lesson is taught millions of times over
that's just taught like literally millions of times
What should happen is that there's the small handful
That the time in the classroom is spent on all of the parts
of teaching and education that aren't explanation,
And the way to get there is to basically have more people
who are already explaining, publish their explanations
in a way that doesn't just feel like a Zoom call
that was always gonna be publicized to more people
of putting out some content and nobody caring about it.
which may or may not be correct, but doesn't matter
So you think of how can I make the audio better?
I just interviewed him a couple of weeks ago.
He teaches this course in underactuated robotics,
We as humans, when we walk, we're always falling forward,
which means like it's gravity, you can't control it.
So like that's underactuated, you can't control everything.
the degrees of freedom you have are not enough
but he's kind of been interested in like crisping it up.
he can do similar kinds of explanations as you're doing,
and spending like months in preparing a single video.
like in my apartment where we did the interview,
not this, this is an adjacent mansion that we're in
But you basically just have like a black curtain,
it makes it really easy to set up a filming situation
with cameras that we have here, these microphones.
this is excessive and actually really hard to work with.
I'm forgetting actually the name of the lapel mic,
but it was probably like a Rode of some kind.
- Is it hard to figure out how to make the audio sound good?
- Oh, I mean, listen to all the early videos on my channel
For some reason, I just couldn't get audio for a while.
I think it's weird when you hear your own voice.
So you hear it, you're like, this sounds weird.
or they're like actual audio artifacts at play.
You said it was probably streaming somehow through the--
one that was mounted overhead over a piece of paper.
You could also use like an iPad or a Wacom tablet
One on the face, there's two, again, I don't know.
I'm like just not actually the one to ask this
but each of them like has a compressor object
but like gets compressed before it does that.
- The live aspect of it, do you regret doing it live?
I do think the content might be like much less sharp
even than if I just recorded like that and then edited later.
But I do like something that I do to be out there to show
I probably would do it on a different channel, I think,
and kind of keep clean what 3Blue1Brown is about
that people like Russ or other educators try to go
that are like really well planned out or scripted,
- Yeah, well, what I think teachers like Russ should do
They wanna create the best short explanation of it
in the world that will be one of those handfuls
in a world where you have commoditized explanation, right?
Most of the lectures should be done just normally.
But maybe choose that small handful of topics.
is I do sample lessons with people on that topic
to get some sense of how other people think about it.
Let that inform how you want to edit it or script it
Some people are comfortable just explaining it
Like just like you mentioned, there's professors,
But he's a great teacher and he knows plasma,
plasma chemistry, plasma physics really well.
or like for plasma physics, like there's no videos.
- And just imagine if every one of those excellent teachers
And it's already replete with great explanations,
but it would be even more so with all the niche
great explanations and like anything you wanna learn.
- And there's a self-interest to it in terms of teachers,
in terms of even, so if you take Russ for example,
And from a selfish perspective, it's also just like,
I mean, it's like publishing a paper in a really,
like Nature has like Letters, like accessible publication.
that your passion is seen by a huge number of people.
Whatever the definition of huge is, it doesn't matter.
- And it's those lectures that tell early students
At the moment, I think students are disproportionately
interested in the things that are well-represented
So to any educator out there, if you're wondering,
hey, I want more like grad students in my department,
like what's the best way to recruit grad students?
And then you're going to have a pile of like excellent
- And one of the lessons I think your channel teaches
is there's an appeal to explaining just something beautiful,
not doing a marketing video about why topology is great.
- Yeah, there's people interested in this stuff.
that explains the Banach-Tarski paradox substantively,
but the actual results that went into this idea
saying, yeah, I'm going to do this in-depth talk
I'm pretty sure it's going to reach 20 million people.
No one's interested in anything even anywhere near that.
But then you have Michael's quirky personality around it
then you don't need like the approval of some higher network.
You can just do it and let the people speak for themselves.
if your father were to make something on plasma physics,
or if we were to have like underactuated robotics,
- Yeah, most robotics is underactuated currently.
So even if it's things that you might think are niche,
- Although I just psychologically watching him,
- None of us know how to make videos when we start.
The first stuff I made was terrible in a number of respects.
Like look at the earliest videos on any YouTube channel,
is it's the same thing that I'm sure you went through,
but like, I don't know, it's this imposter syndrome.
that you've studied for like your whole life.
I don't know, it's scary to post stuff on YouTube.
who had that modesty to say who am I to post this
a lot of the educational content is posted by people
who like were just starting to research it two weeks ago
and who maybe should think like who am I to explain,
And the people who have the self-awareness to not post
- That's why there's a lot of value in a channel
a really smart person and force them to explain stuff
So, but of course that's not scalable as a single channel.
If there's anything beautiful that could be done,
it's people taking it into their own hands, educators.
You're gonna be making online content anyway,
Just hit that publish button and see how it goes.
The cool thing about YouTube is it might not go for a while,
but like 10 years later, it'll be like, this is the thing,
at least for now, at least that's my hope with it,
is it's literally better than publishing a book
- Yeah, yeah, nine-digit numbers will do that to you.
- But he doesn't really, he's one of the people
that doesn't actually care that much about money.
Like having talked to him, it wasn't because of money,
you have to understand where they're coming from.
YouTube has been cracking down on people who they,
Joe Rogan talks to Alex Jones and conspiracy theories,
and YouTube is really careful with that kind of stuff.
And Joe doesn't feel like YouTube is on his side.
He often has videos that they don't put in trending
And that's not a good place for a person to be in.
And Spotify is telling him, we're never going to censor you.
you can't just, you can't put fences around it.
is Joe's gonna remove his entire library from YouTube.
And like, that's the first time where I was like,
Like right now, if you have a URL that points to a server,
points to content and then it's like distributed.
So you can't actually delete what's at an address
And as long as there's someone on the network who hosts it,
it's always accessible at the address that it once was.
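The content-addressing idea can be sketched in a few lines: the address is a hash of the bytes themselves, so any peer holding the bytes can serve them at the same address. This is a toy illustration of the principle, not the actual IPFS protocol (which uses multihashes and a distributed hash table), and publish and store are made-up names:

```python
import hashlib

store = {}  # stand-in for a network of peers

def content_address(data: bytes) -> str:
    # The address derives from the content itself, so it is identical
    # no matter which peer stores or serves the bytes.
    return hashlib.sha256(data).hexdigest()

def publish(data: bytes) -> str:
    addr = content_address(data)
    store[addr] = data
    return addr

addr = publish(b"some video bytes")
assert store[addr] == b"some video bytes"  # retrievable while anyone hosts it
```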
for some reason I thought YouTube was forever. 01:30:27.060 |
and Google or Alphabet isn't the company that it once was 01:30:31.180 |
and it's kind of struggling to make ends meet 01:30:33.060 |
and it's been supplanted by whoever wins on the AR game 01:30:41.540 |
all of these videos that we're hosting are pretty costly. 01:30:47.180 |
Maybe they'd purge them and tell people to like try to back them up on their own. 01:30:51.260 |
Or even if it does exist in some form forever, 01:31:00.940 |
Like it would be shocking if YouTube remained as popular 01:31:10.060 |
- It makes me sad still, but, 'cause it's such a nice, 01:31:13.620 |
it's like, just like you said of the canonical videos. 01:31:17.700 |
Do you know, you should get Juan Benet on the thing 01:31:25.140 |
- So he's the one that founded this thing called IPFS 01:31:33.820 |
then you'll get some articulate pontification around it. 01:31:37.700 |
- That's like been pretty well thought through. 01:31:39.820 |
- But yeah, I do see YouTube, just like you said, 01:31:44.900 |
which is like a set of canonical videos on a topic. 01:31:48.060 |
Now, others could create videos on that topic as well, 01:31:52.220 |
but as a collection, it creates a nice set of places to go 01:31:59.340 |
And it seems like coronavirus is a nice opportunity 01:32:08.860 |
I have to talk to you a little bit about machine learning, 01:32:15.380 |
you have a set of beautiful videos on neural networks. 01:32:22.900 |
what is the most beautiful aspect of neural networks to you? 01:32:35.260 |
Is there something mathematically or in an applied sense that stands out? 01:32:42.060 |
- Well, I think what I would go to is the layered structure 01:32:48.020 |
what feel like qualitatively distinct things happening 01:32:52.340 |
but that are following the same mathematical rule. 01:33:04.940 |
some of the visualizations that like Chris Olah has done 01:33:17.100 |
What you can see is that the ones closer to the input side 01:33:21.420 |
are picking up on very low level ideas like the texture. 01:33:27.020 |
what is the, where are the eyes in this picture? 01:33:29.140 |
And then how do the eyes form like an animal? 01:33:37.020 |
even though it's the same piece of math on each one. 01:33:41.140 |
It shows that you can have like a generalizable object that forms its own abstractions. 01:33:54.700 |
- Yeah, form abstractions in an automated way. 01:34:08.380 |
seems a little bit like you do a bunch of ad hoc things, 01:34:13.860 |
rather than something with a mathematical reason that it always had to work. 01:34:18.500 |
when you have like that elegant piece of math, 01:34:20.980 |
it's hard not to just smile seeing it work in action. 01:34:24.620 |
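A minimal sketch of the "same rule at every layer" point, in plain NumPy with random placeholder weights (the layer sizes are made up for illustration):

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

rng = np.random.default_rng(0)
layer_sizes = [784, 128, 64, 10]  # hypothetical image-classifier shape
weights = [rng.standard_normal((m, n)) * 0.01
           for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(m) for m in layer_sizes[1:]]

def forward(x):
    # Identical math at every layer: a weighted sum, then a nonlinearity.
    # Yet in trained networks, early layers end up picking out textures
    # and later layers whole objects.
    for W, b in zip(weights, biases):
        x = relu(W @ x + b)
    return x

print(forward(rng.standard_normal(784)).shape)  # (10,)
```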
- Well, and when you talked about topology before, 01:34:32.060 |
there's interesting work, under kind of the field of like the science of deep learning, 01:34:39.300 |
on the shape of the cost surface that is trying to be optimized in neural networks. 01:34:50.020 |
and somehow a dumb gradient descent algorithm finds its way down it, 01:35:03.820 |
and that you have these interesting points that exist 01:35:06.340 |
when you make your space so high dimensional. 01:35:08.780 |
Like GPT-3, what did it have, 175 billion parameters? 01:35:12.540 |
So it doesn't feel as mesmerizing to think about, 01:35:17.420 |
oh, there's some surface of intelligent behavior 01:35:21.940 |
Like there's so many parameters that of course, 01:35:25.060 |
how is it that you're able to efficiently get there? 01:35:29.020 |
that something as dumb as gradient descent does it. 01:35:32.060 |
But like the reason the gradient descent works well is that, 01:35:37.860 |
you know, you can choose however you want to parameterize this space 01:35:44.580 |
in a way that makes it computationally feasible. 01:35:46.980 |
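For reference, the "dumb" algorithm in question, sketched on a toy cost function (the function and step size here are illustrative, nothing like an actual training setup):

```python
import numpy as np

def cost(w):
    return np.sum((w - 3.0) ** 2)  # a simple bowl-shaped surface

def grad(w):
    return 2.0 * (w - 3.0)

w = np.zeros(5)          # start somewhere in parameter space
learning_rate = 0.1
for _ in range(100):
    w -= learning_rate * grad(w)  # repeatedly step downhill

print(cost(w))  # near zero: gradient descent found a good solution
```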
- Yeah, it's just that there's so many good solutions, 01:35:51.620 |
probably infinitely, infinitely many good solutions, 01:36:00.740 |
It's similar to, Stephen Wolfram has this idea of like, 01:36:04.360 |
if you just look at the whole space of computations, 01:36:15.660 |
like if you just randomly pick from the bucket, you often get something interesting. 01:36:19.620 |
We tend to think like a tiny, tiny minority of them would be, that you have to search hard 01:36:29.260 |
to find computations that do something interesting. 01:36:34.940 |
And it's similar from like a Kolmogorov complexity standpoint. 01:37:04.180 |
Like Shannon showed there exists an encoding that's resilient to that noise, that's very good. 01:37:07.700 |
And then he quantitatively describes what very good is. 01:37:10.260 |
What's funny about how he proves the existence 01:37:14.780 |
is rather than saying like, here's how to construct it, 01:37:17.160 |
or even like a sensible non-constructive proof, 01:37:19.980 |
the nature of his non-constructive proof is to say, 01:37:25.180 |
choose a code at random and it would be almost at the limit, which is weird, 01:37:28.420 |
because then it took decades for people to actually find any that came close. 01:37:32.860 |
And what his proof was saying is choose a random one, 01:37:35.760 |
and it's like the best kind of encoding you'll ever find. 01:37:43.100 |
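For reference, the theorem being described is Shannon's noisy-channel coding theorem; a compressed statement (not Grant's notation) for the binary symmetric channel:

```latex
% Capacity of a binary symmetric channel with bit-flip probability p:
C = 1 - H(p), \qquad H(p) = -p \log_2 p - (1 - p)\log_2(1 - p).
% For any rate R < C there exist codes whose error probability goes to
% zero as block length grows, and the non-constructive proof is exactly
% the move described above: a code chosen at random almost surely
% performs near this limit.
```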
when you choose a random element from this ungodly huge set, 01:37:43.100 |
you're still very far from finding an efficient way to actively describe it, 01:37:47.620 |
from having a way of telling you how to encode one thing into another. 01:37:54.780 |
So on the side of like how many possible programs 01:38:00.460 |
are interesting in some way, it's like, yeah, tons of them. 01:38:03.660 |
What's rare is when you can have a low information description 01:38:09.620 |
- And thereby, this kind of gives you a blueprint 01:38:19.420 |
- Chaos theory is another good instance there 01:38:21.220 |
where it's like, yeah, a ton of things are hard to describe, 01:38:23.620 |
but how do you have ones that have a simple set of underlying rules? 01:38:23.620 |
It's interesting to ask, what are your thoughts 01:38:36.780 |
about the recently released OpenAI GPT-3 model 01:38:47.620 |
- You know, I think I got an email a day or two ago 01:38:49.620 |
about someone who wanted to try to use GPT-3 with Manim, 01:38:53.940 |
where you would like give it a high level description 01:38:56.940 |
of something and then it'll like automatically create the animation. 01:39:05.180 |
- I mean, it probably won't put you out of a job, 01:39:07.460 |
but it'll create something visually beautiful for sure. 01:39:09.900 |
- I would be surprised if that worked as stated, 01:39:16.620 |
- I mean, like a lot of those demos, it's interesting. 01:39:27.660 |
I mean, certainly with code and with program synthesis, 01:39:32.060 |
But eventually I think if you pick the right examples, 01:39:43.380 |
it's still cool that something can be generated. 01:39:49.980 |
Sometimes a big part of it is just getting a bunch of stuff 01:39:52.100 |
on the page and then you can decide what to whittle down to. 01:39:54.780 |
So if it can be used in like a man-machine symbiosis 01:39:58.060 |
where it's just giving you a spew of potential ideas 01:40:10.140 |
- Yeah, have you gotten a chance to see any of the demos 01:40:21.460 |
he was like tweeting a bunch about playing with it. 01:40:25.180 |
And so GPT-3 was trained on the internet from before COVID. 01:40:30.180 |
So in a sense it doesn't know about the coronavirus. 01:40:33.580 |
So what he seeded it with was just a short description 01:40:35.820 |
about like a novel virus emerges in Wuhan, China 01:40:47.340 |
So then what GPT-3 generates is like January, 01:40:49.900 |
then a paragraph of description, February and such. 01:40:57.060 |
which of course it would because it's trained 01:41:02.700 |
But what you see unfolding is a description of COVID-19 01:41:08.120 |
And like the early aspects of it are kind of shockingly accurate. 01:41:08.120 |
- And the other flip side of that is I wouldn't be surprised if the later parts are way off. 01:41:14.780 |
- Who knows, like we might all be in like this crazy 01:41:27.620 |
militarized zone as it predicts just a couple months off. 01:41:31.500 |
- Yeah, I think there's definitely an interesting tool in there somewhere. 01:41:36.100 |
It has struggled with mathematics, which is interesting. 01:41:40.700 |
It's not able to complete numerical patterns, 01:41:45.180 |
you know, like you give it like five digit numbers 01:41:50.180 |
and it's not able to figure out the sequence, you know, 01:41:53.500 |
or, like, I didn't look into it too much, but I'm talking 01:41:57.500 |
about sequences like the Fibonacci numbers, 01:42:02.100 |
Because obviously it's leveraging stuff from the internet 01:42:05.340 |
But it is also cool that I've seen it able to generate 01:42:08.960 |
some interesting patterns that are mathematically correct. 01:42:12.500 |
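The kind of pattern-completion test being described, sketched as a prompt one might feed a language model (an illustrative test, not a specific experiment from the conversation):

```python
def fibonacci(n):
    """Yield the first n Fibonacci numbers."""
    a, b = 1, 1
    for _ in range(n):
        yield a
        a, b = b, a + b

# The model sees the prefix and must infer the rule a(n) = a(n-1) + a(n-2)
# to continue correctly, rather than just recalling memorized text.
prompt = ", ".join(str(x) for x in fibonacci(8)) + ", "
print(prompt)  # "1, 1, 2, 3, 5, 8, 13, 21, " -- correct next term: 34
```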
- Yeah, I honestly haven't dug into like what's going on 01:42:16.220 |
within it in a way that I can speak intelligently to. 01:42:22.740 |
at numerical patterns because, I mean, maybe I should be 01:42:26.140 |
more impressed with it, but like that requires having 01:42:29.260 |
a weird combination of intuitive and formulaic worldview. 01:42:45.100 |
Instead it's the, like the way you're starting 01:42:47.220 |
to see a shape of things is by knowing what hypotheses 01:42:50.620 |
to test where you're saying, oh, maybe it's generated 01:42:53.100 |
based on the previous terms, or maybe it's generated 01:42:55.020 |
based on like multiplying by a constant or whatever it is. 01:42:57.460 |
You like have a bunch of different hypotheses 01:42:59.580 |
and your intuitions are around those hypotheses, 01:43:05.220 |
And it seems like GPT-3 is extremely good at like that sort 01:43:09.500 |
of pattern matching recognition that usually is very hard 01:43:12.180 |
for computers, that is what humans get good at 01:43:15.300 |
through expertise and exposure to lots of things. 01:43:17.460 |
It's why it's good to learn from as many examples 01:43:19.460 |
as you can, rather than just from the definitions. 01:43:19.460 |
But noticing the pattern is one thing; to actually concretize it into a piece of math, 01:43:24.420 |
and if not prove it, like have an actual explanation 01:43:34.380 |
for what's going on, not just a pattern that you've seen. 01:43:37.940 |
- Yeah, but then the flip side, to play devil's advocate: 01:43:44.060 |
it doesn't have a deep intuitive understanding, just like we said, 01:43:49.220 |
but it's been able to form something that looks like one. 01:44:05.140 |
like I don't mean to denigrate pattern recognition, 01:44:09.980 |
and it's super important and it's super hard. 01:44:12.280 |
And so like when it's demonstrating this kind 01:44:14.860 |
of real understanding, compressing down some data, 01:44:17.100 |
like that might be pattern recognition at its finest. 01:44:20.660 |
My only point would be that like what differentiates math, 01:44:24.820 |
I think to a large extent is that the pattern recognition 01:44:28.340 |
isn't sufficient and that the kind of patterns 01:44:30.740 |
that you're recognizing are not like the end goals, 01:44:34.540 |
but instead they are the little bits and paths 01:44:39.340 |
- That's only true for mathematics in general, though. For simpler patterns, 01:44:52.000 |
you know, like Taylor series, like certain kinds of series, 01:44:54.580 |
it feels like compressing the internet is enough 01:45:00.420 |
to figure them out, 'cause those patterns in some form appear there already. 01:45:06.060 |
- Well, I mean, there's all sorts of wonderful examples 01:45:08.300 |
of false patterns in math, where one of the earliest videos I made was about this: 01:45:13.300 |
you're kind of dividing a circle up using these chords, 01:45:15.500 |
and you see this pattern of one, two, four, eight, 16. 01:45:18.580 |
I was like, okay, pretty easy to see what that pattern is. 01:45:22.180 |
You've seen it a million times, but it's not powers of two. 01:45:34.300 |
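This is Moser's circle problem, and the actual count is worth seeing; for n points on the circle with all chords drawn, the number of regions is:

```latex
R(n) = 1 + \binom{n}{2} + \binom{n}{4}
% = 1, 2, 4, 8, 16 for n = 1, ..., 5, and then 31 (not 32) for n = 6:
% the apparent "powers of two" pattern breaks.
```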
But I think it's a good test of whether you're thinking 01:45:37.700 |
clearly about mechanistic explanations of things, 01:45:41.660 |
how quickly you jump to thinking it must be powers of two. 01:45:44.380 |
'Cause the problem itself, there's really no good way to, 01:45:48.620 |
I mean, there can't be a good way to think about it 01:45:50.300 |
as like doubling a set because ultimately it doesn't. 01:45:53.380 |
But even before it starts to, it's not something 01:45:55.300 |
that screams out as being a doubling phenomenon. 01:45:58.420 |
So at best, if it did turn out to be powers of two, it would have been a coincidence. 01:46:02.900 |
And I think the difference between like a math student 01:46:05.500 |
making the mistake and a mathematician who's experienced 01:46:07.620 |
seeing that kind of pattern is that they'll have a sense of 01:46:12.340 |
whether the pattern that they're observing is reasonable, and of how to go about testing it. 01:46:28.060 |
- Yeah, like a little scientist, I guess, basically. 01:46:30.260 |
- Yeah, it's a fascinating thought, because GPT-3, 01:46:30.260 |
these language models, are already accomplishing some of this. 01:46:42.940 |
- Yeah, I'm not saying I wouldn't be impressed, but I wouldn't be surprised. 01:46:45.900 |
Like I'll be impressed, but I think we'll get there, 01:46:51.520 |
- So one of the amazing things you've done for the world is open-source Manim, the animation engine. 01:47:07.860 |
Now it's quickly evolving, because I think you're inventing new things with it. 01:47:12.900 |
In fact, I've been working on playing around with some, 01:47:17.300 |
I wanted to do like an ode to "3Blue1Brown." 01:47:27.100 |
And I saw that you had like a little piece of code 01:47:35.900 |
like continue twisting it, I guess is the term. 01:47:48.060 |
The thing is, so many people love it, that you've put that out there. 01:47:51.660 |
They want to do the same thing as I do with Hendrix. 01:47:55.100 |
They want to explain an idea using the tool, including Rust. 01:47:58.180 |
How would you recommend they try to get started? 01:48:08.100 |
- And what kind of choices should they make? 01:48:16.300 |
because I think of it like this scrappy tool. 01:48:19.420 |
It's like a math teacher who put together some code. 01:48:22.100 |
People asked what it was, so they made it open source. 01:48:27.900 |
There are a lot of things that make it harder to work with than it needs to be, 01:48:30.420 |
that are a function of me not being a software engineer. 01:48:33.220 |
I've put some work this year trying to make it better 01:48:43.020 |
One thing I would love to do is just get my act together 01:48:46.900 |
about properly integrating with what the community has contributed, 01:48:58.380 |
in a way that I've been shamefully neglectful of. 01:49:16.980 |
Like if you're just making a quick graph of something, 01:49:21.580 |
or something that has a little motion to it, use Desmos, use Grapher, 01:49:26.780 |
certain things that are like really oriented around graphs. 01:49:33.660 |
And in a lot of ways, it would make more sense 01:49:35.940 |
for some stuff that I do to just do in Geogebra. 01:49:38.780 |
But I kind of have this cycle of liking to try things my own way. 01:49:54.940 |
Like use movement over time to communicate relationships 01:50:04.740 |
So I wanted something that was flexible enough 01:50:15.380 |
But also make sure that you're taking advantage of what's already in there. 01:50:23.500 |
If any of those are like well fit for what you wanna teach, 01:50:26.140 |
it's easiest to have a scene type that you tweak a little bit. 01:50:39.580 |
If it's just like writing some text on the screen 01:50:42.300 |
or shifting around objects or something like that, 01:50:45.740 |
things like that, you should probably just use Keynote. 01:50:55.500 |
Separate that which needs to be programmatic from that which doesn't need to be. 01:51:05.900 |
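For anyone hitting that publish button, a minimal Manim scene, written against the community edition's API (Grant's own version differs in places, e.g. ShowCreation instead of Create):

```python
from manim import Scene, Square, Circle, Create, Transform

class SquareToCircle(Scene):
    def construct(self):
        square = Square()
        circle = Circle()
        self.play(Create(square))             # draw the square
        self.play(Transform(square, circle))  # morph it into a circle
        self.wait()
```

Rendered with something like `manim -pql scene.py SquareToCircle`.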
what's your most and least favorite aspects of Python? 01:51:12.540 |
I mean, I love that it's like object-oriented 01:51:15.820 |
and functional, I guess, that you can kind of like 01:51:18.780 |
get both of those benefits for how you structure things. 01:51:23.780 |
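A toy illustration of getting both styles at once (the names here are made up for the example):

```python
from dataclasses import dataclass

@dataclass
class Point:                 # object-oriented: data bundled with methods
    x: float
    y: float
    def scaled(self, k: float) -> "Point":
        return Point(self.x * k, self.y * k)

points = [Point(1, 2), Point(3, 4)]
# Functional style over the same objects: comprehensions and generators.
doubled = [p.scaled(2) for p in points]
print(sum(p.x for p in doubled))  # 8
```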
So if you just want to quickly whip something together, it's great for that. 01:51:37.340 |
I mean, the biggest disadvantage is that it's slow. 01:51:39.180 |
So when you're doing computationally intensive things, 01:51:41.700 |
either you have to think about it more than you should, 01:51:43.620 |
how to make it efficient, or it just takes long. 01:51:47.260 |
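A quick sketch of that tradeoff: the usual workaround is pushing the loop into compiled code, e.g. with NumPy (timings will vary by machine):

```python
import time
import numpy as np

xs = np.random.rand(1_000_000)

start = time.perf_counter()
total = 0.0
for x in xs:                 # pure-Python loop: per-iteration interpreter overhead
    total += x * x
print("python loop:", time.perf_counter() - start)

start = time.perf_counter()
total = float(np.dot(xs, xs))  # one call into compiled code
print("numpy:      ", time.perf_counter() - start)
```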
- Do you run into that at all, like with your work? 01:51:52.700 |
than it needs to be because of how it renders things 01:51:58.580 |
I've rewritten things such that it's all done 01:52:00.940 |
with shaders in such a way that it should be just live 01:52:03.700 |
and actually interactive while you're coding it. 01:52:17.580 |
The final product isn't interactive, 'cause there's just a play button and a pause button. 01:52:19.220 |
But while you're developing, that can be nice. 01:52:21.900 |
So it's gotten better in speed in that sense, 01:52:23.860 |
but that's basically because the hard work is being done 01:52:26.140 |
in a language that's not Python, but GLSL, right? 01:52:29.540 |
But yeah, there are some times when it's like a, 01:52:33.980 |
there's just a lot of data that goes into the object 01:52:51.380 |
- Let me ask you about the walrus operator, 'cause the toxicity over it led Guido to resign. 01:52:57.100 |
- There's a bunch of surrounding things that also played in. 01:53:11.260 |
I think that drew in the largest number of Python core developers ever. 01:53:23.380 |
And like the whole structure of the idea of a BDFL is fraught. 01:53:37.420 |
- People like some parts of the benevolent dictator for life model, 01:53:40.580 |
but once the dictator does things differently than you want, they turn on him. 01:53:46.460 |
He could have pushed through it, but he just couldn't, because he truly is the B in the benevolent. 01:53:58.660 |
That's why Linus Torvalds is perhaps the way he is, 01:54:06.460 |
It's kind of surprising to me how many people had strong feelings about that operator. 01:54:19.540 |
either way, I'm not gonna get personally passionate. 01:54:23.700 |
yeah, this seems to make things more confusing to read. 01:54:31.260 |
if not, great, but like, let's just all calm down 01:54:44.020 |
- Did it get rejected, is that why he stepped down as a leader? 01:54:46.900 |
- Well, he fought for it, no, he got it passed. 01:55:11.580 |
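For context, the operator in question is PEP 572's `:=`, which binds a value inside an expression (Python 3.8+); the contested readability tradeoff in a few lines:

```python
import re

text = "order 42 shipped"

# Without the walrus operator: assign, then test.
match = re.search(r"\d+", text)
if match:
    print(match.group())

# With it: bind and test in one expression.
if (m := re.search(r"\d+", text)):
    print(m.group())  # prints "42"
```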
if you constantly, if whenever there's a divisive thing, 01:55:14.940 |
you wait until the division is no longer there. 01:55:22.180 |
It's good to be slow when there's indecision, 01:55:29.620 |
but like at a certain point, it results in paralysis, 01:55:38.580 |
Even tiny decisions can cause people to like go to war with each other. 01:55:42.740 |
People are very touchy on color, color choices. 01:55:56.380 |
Sometimes the value of like quick action is more important than-- 01:56:27.380 |
2020 brought us a couple of, in the physics world, proposed theories of everything. 01:56:36.020 |
I mean, he's been working for probably decades, 01:57:07.140 |
whether we're talking about quantum mechanics, 01:57:08.940 |
which you touched on in a bunch of your videos a little bit, 01:57:11.460 |
quaternions, like just the mathematics involved, 01:57:15.460 |
which is more about surfaces and topology, all that stuff? 01:57:19.620 |
- Well, I think as far as popularized science is concerned, 01:57:24.100 |
people are more interested in theories of everything 01:57:27.060 |
'Cause the problem is, whether we're talking about 01:57:35.180 |
listening to Witten talk about string theory, 01:57:38.300 |
or whatever proposed path to a theory of everything, 01:57:43.900 |
most people won't really follow it. Some physicists will, but you're just not actually 01:57:47.100 |
gonna understand the substance of what they're saying. 01:57:55.980 |
It's better to focus on things that are lower level, but which you have a chance of understanding. 01:57:58.100 |
'Cause the path to getting to even understanding the questions these theories 01:58:02.300 |
are trying to answer involves walking down that path. 01:58:05.780 |
I mean, I was watching a video before I came here 01:58:15.220 |
It's not this novel theory of everything type thing, 01:58:20.140 |
really requires digging in depth into certain ideas. 01:58:20.140 |
And learning things like that, it actually would get you 01:58:28.340 |
to a pretty good appreciation of two-state systems 01:58:30.380 |
in quantum mechanics, in a way that just trying to read about, 01:58:32.740 |
like, oh, what are the hard parts about resolving 01:58:36.900 |
quantum field theories with general relativity, never would. 01:58:40.620 |
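For reference, a "two-state system" here means a state that is just a unit vector in a two-dimensional complex vector space, exactly the linear-algebra territory such videos prepare you for:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad |\alpha|^2 + |\beta|^2 = 1.
```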
So as far as popularizing science is concerned, 01:58:44.220 |
you know, it might be the case that maybe more people would engage with that. 01:59:00.100 |
- It's difficult to create, like, a 3Blue1Brown video on those topics. 01:59:04.540 |
So basically, we should really try to find the beauty 01:59:14.420 |
in mathematics or physics by looking at concepts 01:59:25.260 |
So like the Clay Millennium Problems, the Riemann hypothesis. 01:59:28.620 |
- Have you ever done a video on Fermat's last theorem? 01:59:34.140 |
No, but if I did, I would talk about proving Fermat's last theorem for a specific case. 01:59:34.140 |
Mathologer might be able to do, like, a great job on this. 01:59:49.060 |
But the core ideas of proving it for n equals three are hard, though accessible. 01:59:49.060 |
It involves looking at a certain number field, 01:59:57.360 |
and you start asking questions about factoring numbers in it. 02:00:02.460 |
So it takes a while, but I've talked about this sort of thing before, 02:00:06.380 |
and you can get to an okay understanding of that. 02:00:11.580 |
And the things that make Fermat's last theorem hard in general 02:00:15.580 |
are a different beast, and not really productive for the viewer's time. 02:00:37.500 |
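A hint of the idea the n = 3 case turns on (a sketch, not Grant's exposition): over the Eisenstein integers, with omega a primitive cube root of unity, the left side factors completely:

```latex
\omega = e^{2\pi i/3}, \qquad
x^3 + y^3 = (x + y)(x + \omega y)(x + \omega^2 y),
% and the proof proceeds by studying how cubes factor in that number field.
```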
I was actually very inspired by the twin prime conjecture. 02:00:57.520 |
Everything I learned would be, like, viewed through this lens of, like, 02:00:57.520 |
"Oh, maybe I can apply it to that in some way." 02:01:01.800 |
But you sort of mature to a point where you realize 02:01:13.360 |
what it feels like for these things to be resolved, 02:01:19.620 |
And the people who do make progress towards these things, 02:01:42.860 |
should encourage that habit through things like 02:01:49.700 |
And yeah, I think I've heard a lot of the interest in theories of everything. 02:01:59.520 |
One, I don't understand them, but more importantly-- 02:02:06.140 |
- You shouldn't be interested in those, right? 02:02:08.900 |
- It's a giant sort of ball of interesting ideas. 02:02:12.660 |
There's probably a million interesting ideas in there 02:02:14.940 |
that individually could be explored effectively. 02:02:18.540 |
you should be interested in fundamental questions. 02:02:27.780 |
Certainly you shouldn't be trying to answer that 02:02:29.380 |
unless you actually understand quantum field theory, 02:02:31.340 |
and you actually understand general relativity. 02:03:14.620 |
Ultimately, that's where the most satisfying thing comes from. 02:03:14.620 |
- As opposed to just being in awe of the mystery, which it can also be enjoyable, 02:03:24.900 |
I don't know, maybe people get entertainment out of that, 02:03:34.460 |
but there's nothing like the moment when you first don't understand something, and then you do. 02:03:46.160 |
We talked last time about a fear of mortality, which you made fun of me for, 02:03:55.860 |
but let me ask you the other absurd question, 02:03:59.260 |
which is, what do you think is the meaning of our life? 02:04:02.080 |
- I'm sorry if I made fun of you about mortality. 02:04:22.360 |
There's a meaning to this water bottle label, 02:04:22.360 |
in that it came from a consciousness that wanted to get its ideas into another consciousness. 02:04:29.360 |
You can't ask what is the height without an object. 02:04:48.020 |
without an intentful consciousness putting it... 02:04:52.040 |
I guess I'm revealing I'm not very religious. 02:05:03.560 |
If I were, maybe I'd have a reference point relative to which you could calculate the height. 02:05:07.600 |
- But what I'm saying is I don't understand the question, 02:05:10.560 |
in that I think people might be asking something very real. 02:05:19.240 |
Are they asking, as I'm making decisions day by day 02:05:22.160 |
for what should I do, what is the guiding light 02:05:26.280 |
I think that's what people are kind of asking. 02:05:57.200 |
When you understand something, or connect with someone, you're sort of filled with a sense of happiness. 02:06:03.040 |
Like that, yeah, that's what fuels my pump, at least. 02:06:11.120 |
- Yeah, you wanna be alone together with someone. 02:06:15.720 |
I think there's no better way to end it, Grant. 02:06:18.440 |
You've been great. The first time we talked, it was amazing. 02:06:20.600 |
Again, it's a huge honor that you make time for me. 02:06:36.600 |
to get a discount and to support this podcast. 02:06:39.760 |
If you enjoy this thing, subscribe on YouTube, 02:07:17.560 |
"is available to other people and to me too, I believe. 02:07:21.360 |
"Although I may not be quite as refined aesthetically 02:07:24.280 |
"as he is, I can appreciate the beauty of a flower. 02:07:33.200 |
"the complicated actions inside, which also have a beauty. 02:07:36.960 |
"I mean, it's not just beauty at this dimension 02:07:46.160 |
"The fact that the colors in the flower evolved 02:07:48.240 |
"in order to attract insects to pollinate it is interesting. 02:07:56.000 |
"Does this aesthetic sense also exist in the lower forms? 02:08:02.620 |
"which the science knowledge only adds to the excitement, 02:08:12.280 |
Thank you for listening and hope to see you next time.