Michael Stevens: Vsauce | Lex Fridman Podcast #58
Chapters
0:00 Michael Stevens
5:03 Why Are We Aware of Ourselves
19:47 Flat Earth Theory
38:00 Elon Musk
57:56 Words of Wisdom
00:00:00.000 |
The following is a conversation with Michael Stevens, the creator of Vsauce, 00:00:04.480 |
one of the most popular educational YouTube channels 00:00:07.000 |
in the world, with over 15 million subscribers 00:01:08.140 |
I'll do one or two minutes after introducing the episode, 00:01:24.000 |
I personally use Cash App to send money to friends, 00:01:34.980 |
say $1 worth, no matter what the stock price is. 00:01:38.300 |
Brokerage services are provided by Cash App Investing, 00:01:46.220 |
to support one of my favorite organizations called FIRST, 00:01:49.160 |
best known for their FIRST Robotics and Lego competitions. 00:01:52.480 |
They educate and inspire hundreds of thousands of students 00:01:57.740 |
and have a perfect rating on Charity Navigator, 00:02:04.300 |
When you get Cash App from the App Store, Google Play, 00:02:09.940 |
you'll get $10 and Cash App will also donate $10 to FIRST, 00:02:15.540 |
that I've personally seen inspire girls and boys 00:02:21.700 |
And now here's my conversation with Michael Stevens. 00:02:31.760 |
You've pointed out how messy studying human behavior is 00:02:55.400 |
finding the physical foundations of psychology, right? 00:03:01.200 |
If all of our emotions and moods and feelings and behaviors 00:03:17.760 |
and the uncertainty principle and all these things. 00:03:22.760 |
of every single quantum state in a brain, probably. 00:03:27.160 |
But I think that if we can get to that point with psychology, 00:03:32.160 |
then we can start to think about consciousness 00:03:41.000 |
When we ask questions like, well, what is self-reference? 00:03:52.440 |
- In terms of consciousness, 00:03:59.240 |
there are ideas of panpsychism, where people believe that 00:04:03.480 |
whatever consciousness is is a fundamental part of reality. 00:04:08.840 |
What are your views on consciousness? 00:04:11.560 |
Do you think it's this deep part of reality, 00:04:24.240 |
- Nothing I ask you today has an actually proven answer, 00:04:31.680 |
this is all speculation and I'm not an expert 00:04:49.080 |
I think that our bodies and brains and the universe 00:04:52.720 |
and at the quantum level is so rich and complex, 00:05:15.700 |
we're going to have to believe in answers purely on faith. 00:05:29.880 |
there are some that contain memories of others. 00:05:34.280 |
Literally, Julian Barbour calls them time capsule states 00:05:38.180 |
where you're like, yeah, not only do I have a scratch 00:05:40.440 |
on my arm, but also this state of the universe 00:05:48.600 |
And for some reason, those kinds of states of the universe 00:06:06.840 |
because I think the consciousness then emerges 00:06:13.200 |
that contains fragments or memories of other states 00:06:18.200 |
is one where you're going to feel like there's time. 00:06:26.440 |
And I don't know what'll happen in the future 00:06:28.000 |
because these states don't contain information 00:06:42.520 |
But until you are in one, or if you are in one, 00:06:55.240 |
- You've kind of implied that you have a sense, 00:07:08.080 |
Do you think of the universe as deterministic 00:07:14.040 |
like it's operating under a specific set of physical laws, 00:07:17.960 |
and when you have set the initial conditions, 00:07:23.540 |
it would unfold the same way in our particular line of the universe every time? 00:07:28.480 |
- That is a very useful way to think about the universe. 00:07:33.780 |
It's brought us to where we are today, right? 00:07:36.960 |
I would not say that I believe in determinism 00:08:14.960 |
How big is the gap between reality and perception? 00:08:22.640 |
There is no experiment to find an answer to that question. 00:08:27.000 |
Everything we experience is an event in our brain. 00:08:36.700 |
All I am experiencing is the perception of a cat 00:08:43.140 |
I am only a witness to the events of my mind. 00:08:49.340 |
that if I witness the event of cat in my head, 00:08:54.100 |
it's because I'm looking at a cat that is literally there 00:09:00.060 |
and should be pet and given food and water and love. 00:09:03.100 |
I think that's the way you should live your life. 00:09:19.540 |
And it's a fantastic way to get people excited 00:09:22.020 |
about all kinds of topics, physics, psychology, 00:09:28.060 |
But at the end of the day, what would the difference be? 00:09:31.840 |
- The cat needs to be fed at the end of the day. 00:09:42.460 |
between killing a digital cat in a video game 00:09:48.340 |
It seems very different to us psychologically. 00:09:51.940 |
oh my gosh, I forgot to feed my Tamagotchi, right? 00:09:58.980 |
- So can you just touch on the topic of simulation? 00:10:08.800 |
Do you find it useful, inspiring, or constructive in any kind of way? 00:10:17.520 |
- I think it is extremely useful as a thought experiment 00:10:24.680 |
especially as we see virtual reality and computer games 00:10:30.600 |
You're not talking to an audience in like Newton's time 00:10:45.920 |
man, at what point is this little robot friend of mine 00:10:48.960 |
gonna be like someone I don't want to cancel plans with? 00:11:00.320 |
Am I a brain in a vat that is just being given 00:11:03.800 |
electrical impulses from some nefarious other beings 00:11:13.100 |
And the fact that you can't prove it either way 00:11:23.120 |
that you would want to cancel an appointment with. 00:11:27.820 |
That's what my research is, it's artificial intelligence. 00:11:30.760 |
And I apologize, but you're such a fun person 00:11:36.920 |
- Well, I hope I can give some answers that are interesting. 00:11:40.240 |
- Well, because you've sharpened your brain's ability 00:11:45.240 |
to explore some of the questions 00:11:47.960 |
that many scientists are actually afraid of even touching, 00:11:56.640 |
through this process of sharpening your brain. 00:12:04.960 |
And there are a lot of questions I investigate 00:12:24.240 |
But I think I would just describe myself as curious, 00:12:27.640 |
and I hope that that curiosity is contagious. 00:12:33.760 |
because your curiosity took you to asking questions. 00:12:40.680 |
even if you feel, society feels that it's not a question 00:12:47.360 |
to me, asking the question is the biggest step 00:12:57.280 |
and that may be what traditionally is called science. 00:13:07.360 |
is truly what it means to be a scientist, to me. 00:13:11.600 |
- It's certainly a huge part of what it means to be a human. 00:13:15.280 |
If I were to say, you know what, I don't believe in forces. 00:13:19.120 |
I think that when I push on a massive object, 00:13:22.160 |
a ghost leaves my body and enters the object I'm pushing, 00:13:25.720 |
and these ghosts happen to just get really lazy 00:13:31.260 |
Oh, and by the way, the laziness of the ghost 00:13:37.800 |
Every experiment, well, you can never find the ghost. 00:13:45.920 |
But once I start saying, can I see the ghost? 00:13:50.920 |
And if there aren't ghosts, what might I expect? 00:13:58.040 |
Are there things that should happen if there are ghosts, 00:14:06.960 |
I don't think of science as, wow, a picture of a black hole. 00:14:13.100 |
That's data, that's a sensory and perception experience. 00:14:16.480 |
Science is how we got that and how we understand it 00:14:19.640 |
and how we believe in it and how we reduce our uncertainty 00:14:28.260 |
and I'm sometimes disheartened by the elitism 00:14:31.600 |
of the thinking, sort of not allowing yourself 00:14:49.860 |
And revolutions in our knowledge of the world occur 00:15:00.760 |
It is very scary to challenge conventional thinking 00:15:05.700 |
and risky because, let's go back to elitism and ego, 00:15:14.020 |
and all forces are actually created by invisible creatures 00:15:28.820 |
then ego gets involved and you just don't go anywhere. 00:15:40.040 |
but it needs to be done in this very political way 00:15:44.660 |
and let's realize that we're all learning together 00:15:49.060 |
And so when you look at a lot of revolutionary ideas, 00:15:57.500 |
And, you know, Galileo had a couple of problems 00:16:11.400 |
I'll have to create and invent and write different things 00:16:20.480 |
Before we get to AI, on the topic of revolutionary ideas, 00:16:28.960 |
is one of the favorite questions you've ever answered. 00:16:36.020 |
people should definitely watch, is really fascinating. 00:16:50.940 |
So it had to have been like 2009 or something. 00:17:10.540 |
but I went there and I was reading their ideas 00:17:14.220 |
and how they responded to typical criticisms of, 00:17:17.900 |
well, the earth isn't flat because what about this? 00:17:20.300 |
And I could not tell, and I mentioned this in my video, 00:17:23.800 |
I couldn't tell how many of these community members 00:17:28.680 |
actually believe the earth was flat or were just trolling. 00:17:41.660 |
versus a maybe not so tenable or good belief. 00:17:49.960 |
It's about, look, there are a lot of reasons. 00:18:00.640 |
to every single one of them, but it's all in an ad hoc way. 00:18:04.140 |
And all of their rebuttals aren't necessarily 00:18:06.540 |
gonna form a cohesive non-contradictory whole. 00:18:10.780 |
And I believe that's the episode where I talk 00:18:13.100 |
about Occam's razor and Newton's flaming laser sword. 00:18:17.420 |
And then I say, well, you know what, wait a second, 00:18:24.500 |
And so to a particle moving near the speed of light 00:18:35.540 |
Like we need to be really generous to even wild ideas 00:19:00.500 |
what it means to explore the edge of science and so on. 00:19:07.140 |
nobody gets hurt whether the earth is flat or round, 00:19:15.760 |
I find that scientists roll their eyes way too fast 00:19:21.420 |
The kind of dismissal that I see to even this notion, 00:19:27.780 |
what are the arguments that are being proposed? 00:19:30.260 |
And this is why these arguments are incorrect. 00:19:32.580 |
So that should be something that scientists should always do 00:19:37.460 |
even to the sort of ideas that seem most ridiculous. 00:19:45.880 |
when I ask people what they think about flat earth theory 00:20:02.020 |
However, I don't know that and it has not been proven. 00:20:21.660 |
if we care about how probable and certain our ideas might be 00:20:41.300 |
And I think we're kind of like being pretty pedantic 00:20:56.900 |
But I still can't prove that I'm not living in a simulation, 00:21:06.540 |
So there's always some doubt and that's fine. 00:21:15.380 |
when you start talking about quantum mechanics 00:21:21.900 |
adds a little spice into the thinking process 00:21:30.060 |
your video kind of, okay, say the earth is flat, 00:21:38.380 |
That's a really nice thought experiment to think about. 00:21:45.380 |
but you actually wind up accidentally learning 00:21:48.060 |
a whole lot about gravity and about relativity 00:21:51.620 |
and geometry and I think that's really the goal 00:21:56.540 |
I'm not trying to convince people that the earth is round. 00:21:58.820 |
I feel like you either believe that it is or you don't 00:22:06.900 |
and how you are introduced to important concepts 00:22:13.740 |
Oh, it's all about the center of mass of an object. 00:22:40.020 |
'cause you always imagine gravity with spheres, 00:22:46.340 |
on masses that are not spherical, some other shape, 00:22:49.740 |
but here, a plate, a flat object, is really interesting. 00:22:57.740 |
- Yeah, even a disc the size of Earth would be impossible. 00:23:02.740 |
I think anything larger than like the moon, basically, 00:23:09.020 |
needs to be a sphere because gravity will round it out. 00:23:13.740 |
So you can't have a teacup the size of Jupiter, right? 00:23:18.140 |
There's a great book about a teacup in the universe 00:23:24.700 |
I forget her name, but it's a wonderful book, 00:23:28.180 |
I think it's called "Teacup in the Universe." 00:23:32.680 |
your videos are generally super, people love them, right? 00:23:37.100 |
If you look at the sort of number of likes versus dislikes, 00:23:39.940 |
this measure of YouTube, right, is incredible, as do I, 00:24:13.700 |
like they're bought by some, it's the kind of mechanism 00:24:21.100 |
What's your sense of the size of that community? 00:24:24.060 |
You're one of the sort of great educators in the world 00:24:28.940 |
that educates people on the exciting power of science, 00:24:44.220 |
on the flat Earth video, and so I would wonder 00:24:47.300 |
if it has a greater percentage of dislikes than usual, 00:24:53.580 |
because they think that it's a video about Earth being flat 00:24:58.580 |
and they find that ridiculous and they dislike it 00:25:12.140 |
that kind of go through the episode and are pro-flat Earth, 00:25:17.140 |
but I don't know if there's a larger community 00:25:36.660 |
that does no good because how idiotic are they really? 00:25:47.780 |
and that in and of itself is really interesting. 00:25:50.380 |
And I investigated that in "Which Way Is Down?", 00:25:58.940 |
depending on what's underneath you and what's above you 00:26:11.740 |
Like if you, there's this, I guess, thought experiment, 00:26:14.420 |
if you build a bridge all the way across the Earth 00:26:19.420 |
and then just knock out its pillars, what would happen? 00:26:25.820 |
like a very chaotic, unstable thing that's happening 00:26:28.980 |
because gravity is non-uniform throughout the Earth. 00:26:31.820 |
- Yeah, in small spaces, like the ones we work in, 00:26:36.620 |
we can essentially assume that gravity is uniform. 00:26:40.900 |
It is weaker the further you are from the Earth. 00:26:47.580 |
it's radially pointed towards the middle of the Earth. 00:26:50.060 |
So a really large object will feel tidal forces 00:26:55.660 |
And we can take advantage of that with satellites, right? 00:27:01.700 |
without having to use fuel or any kind of engine. 00:27:08.340 |
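To make that point concrete, gravity weakening with distance and a large object feeling slightly different pulls at each end, here is a minimal sketch, not from the conversation, assuming simple Newtonian point-mass gravity g(r) = GM/r² with standard values for Earth's mass and radius; the 400 km altitude and the 1 km satellite length are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the conversation): Newtonian
# point-mass gravity, g(r) = G*M / r^2.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of Earth, kg
R_EARTH = 6.371e6    # mean radius of Earth, m

def g(r: float) -> float:
    """Gravitational acceleration (m/s^2) at distance r from Earth's center."""
    return G * M_EARTH / r**2

# Weaker the further you are from Earth:
print(f"g at the surface: {g(R_EARTH):.2f} m/s^2")           # ~9.82
print(f"g at 400 km up:   {g(R_EARTH + 400e3):.2f} m/s^2")   # ~8.69

# A hypothetical 1 km-long satellite at 400 km altitude: the near end is pulled
# slightly harder than the far end. That gradient is what gravity-gradient
# stabilization exploits to keep one end pointed at Earth without fuel.
near, far = g(R_EARTH + 400e3), g(R_EARTH + 401e3)
print(f"difference across 1 km: {near - far:.6f} m/s^2")     # ~0.0026
```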
What are your thoughts on the state of where we are at 00:27:12.980 |
And what do you think it takes to build human level 00:27:22.860 |
And I think it's 'cause my instinct is always to go, 00:27:25.060 |
well, what are the foundations here of our discussion? 00:27:31.060 |
How do we measure the intelligence of an artificial machine 00:27:47.060 |
saying that whatever this fuzzy intelligence thing 00:28:01.580 |
A chatbot that you'd want to hang out and have a beer with 00:28:04.540 |
for a bunch of hours or have dinner plans with. 00:28:10.220 |
Is there something else that would impress you? 00:28:12.260 |
Or is that also too difficult to think about? 00:28:13.620 |
- Oh yeah, I'm pretty much impressed by everything. 00:28:18.740 |
- If there was a chatbot that was like incredibly, 00:28:27.940 |
Like if I'm unable to tell that it's not another person, 00:28:39.100 |
and it was like, that's actually what you're talking to. 00:28:42.700 |
I don't know if I would feel that guilty destroying it. 00:28:46.460 |
I would feel guilty because clearly it's well-made 00:28:51.100 |
It's like destroying a really cool car or something. 00:28:56.300 |
So yeah, at what point would I start to feel that way? 00:28:58.820 |
And this is such a subjective psychological question. 00:29:20.540 |
So if you just have a robot that not screams, 00:29:28.500 |
that immediately just puts it in a class that we humans, 00:29:40.980 |
- Right, I think that's a really good instinct to have. 00:29:48.480 |
even if you don't believe that it has the mental experience, 00:30:00.020 |
- The problem is that instinct can get us in trouble 00:30:08.300 |
There's robots like the Facebook and the YouTube algorithm 00:30:11.980 |
and they can manipulate in the same kind of way. 00:30:17.900 |
Do you have worries about existential threats from AI 00:30:21.780 |
or existential threats from other technologies 00:30:23.740 |
like nuclear weapons that could potentially destroy life 00:30:27.780 |
on earth or damage it to a very significant degree? 00:30:35.340 |
There's all kinds of famous ways to think about this. 00:30:39.300 |
what if we don't see advanced alien civilizations 00:30:55.060 |
Jeez, I wish I remembered the name of the channel. 00:31:00.020 |
of maybe once you discover radioactivity and its power, 00:31:09.460 |
is that no one's ever managed to survive as a civilization 00:31:16.900 |
And when it comes to AI, I'm not really very worried 00:31:21.900 |
because I think that there are plenty of other people 00:31:32.940 |
And they're questions that we should address later. 00:31:36.740 |
And I think I talk about this in my interview 00:31:54.260 |
if this really weird contrived scenario happens 00:31:57.040 |
where it has to swerve and save the driver, but kill a kid? 00:32:07.280 |
because we're worried about all of these little issues, 00:32:14.340 |
but we shouldn't allow them to be stumbling blocks 00:32:24.540 |
So worry should not paralyze technological progress, 00:32:28.560 |
but we're sort of marching, technology is marching forward 00:32:32.580 |
without the key scientists, the developers of the technology, 00:32:37.900 |
worrying about it overnight having some effects 00:32:49.540 |
that there's enough people worrying about it, 00:32:51.260 |
Elon Musk says there's not enough people worrying about it. 00:33:06.100 |
So it's an interesting question of what is a good threshold 00:33:12.580 |
And if it's too many people that are worried, you're right. 00:33:15.140 |
It'll be like, the press would over-report on it 00:33:18.740 |
and it'll halt technological progress. 00:33:21.820 |
If not enough, then we can march straight ahead 00:33:24.420 |
into that abyss that human beings might be destined for 00:33:31.340 |
- Yeah, I don't know what the right balance is 00:33:36.980 |
but we're always worried about new technology. 00:33:40.460 |
We know that Plato was worried about the written word. 00:33:42.980 |
He's like, we shouldn't teach people to write 00:33:45.020 |
because then they won't use their minds to remember things. 00:33:51.260 |
and its advancement since the beginning of recorded history. 00:33:55.140 |
And so, I think, however, these conversations 00:34:01.100 |
because again, we learn a lot about ourselves. 00:34:06.220 |
like coming into being that is conscious or whatever 00:34:09.380 |
and can self-replicate, we already do that every day. 00:34:27.720 |
Maybe we should just like, what, not have babies born? 00:34:34.020 |
It's like, we'll want to have safeguards in place 00:34:41.780 |
a kid could be born that becomes some kind of evil person, 00:34:47.940 |
- And it's possible that with advanced genetics in general, 00:35:08.740 |
Like being able to, if it's something genetic, 00:35:11.500 |
if there's some sort of test, and how to use that information, 00:35:20.020 |
- Yeah, I'd like to find an answer that isn't, 00:35:24.940 |
You know, I'd like to find an answer that is, 00:35:30.340 |
And if you have an 83% chance of becoming a psychopath, 00:35:43.340 |
- At least in this part of the world, at least in America, 00:35:45.980 |
there's a respect for individual life in that way. 00:35:55.740 |
But there's other cultures where individual human life 00:36:04.700 |
where the strength of nation and society together 00:36:07.420 |
is more important than any one particular individual. 00:36:15.940 |
but it's unclear that that's what the future holds. 00:36:19.160 |
- Well, yeah, and I mean, let me even throw this out. 00:36:28.000 |
and stuck on the idea that there is some thing 00:36:38.380 |
I think that humans and human technology are one organism. 00:36:48.620 |
would they necessarily know that this is an invention, 00:36:51.340 |
that I don't grow these organically from my body? 00:37:10.860 |
whether it be video games or artificial intelligence 00:37:18.700 |
When you're in a car, where do you end and the car begin? 00:37:21.040 |
It seems like a really easy question to answer, 00:37:25.740 |
we are in this symbiotic relationship with our inventions. 00:37:29.840 |
And there are plenty of people who are worried about it, 00:37:35.700 |
- And I think that even just us thinking of ourselves 00:37:38.800 |
as individual intelligences may be a silly notion, 00:37:43.800 |
because it's much better to think of the entirety 00:37:48.500 |
of human civilization, all living organisms on earth 00:37:55.140 |
'cause you're right, everything's intertwined. 00:38:06.140 |
What do you think of the efforts that Elon Musk is doing 00:38:09.860 |
with space exploration, with electric vehicles, 00:38:13.180 |
with autopilot, sort of getting into the space 00:38:16.580 |
of autonomous vehicles, with boring under LA, 00:38:24.020 |
brain machine interfaces, communicate between machines 00:38:30.940 |
I mean, look at the fandom that he's amassed. 00:38:45.520 |
but I also think that a lot of responsibility 00:38:51.100 |
how he feels about the responsibility he has. 00:38:53.880 |
When there are people who are such a fan of your ideas 00:39:00.020 |
and your dreams and share them so closely with you, 00:39:12.040 |
Well, he was, but well, he was named that later. 00:39:18.460 |
the psychology of becoming a figure like him. 00:39:23.460 |
Well, I don't even know how to phrase the question right, 00:39:28.020 |
when your following, your fans, become so large 00:39:41.060 |
And maybe it doesn't worry him at all, and that's fine too. 00:39:45.500 |
And I think there are a lot of people that go through this 00:39:47.700 |
when they realize, whoa, there are a lot of eyes on me. 00:39:50.420 |
There are a lot of people who really take what I say 00:39:53.940 |
very earnestly and take it to heart and will defend me. 00:40:36.580 |
How do you psychologically deal with the responsibility? 00:40:44.560 |
What is the burden that you feel in educating, 00:40:49.340 |
being one of the biggest educators in the world 00:40:53.500 |
And actually everybody, like most of the world 00:41:01.060 |
trusts you as a source of good, strong scientific thinking. 00:41:16.020 |
I'm not out there doing a lot of scientific experiments. 00:41:23.200 |
and I'm celebrating their work and the way that they think 00:41:29.480 |
But I wanna make it clear at all times that like, 00:41:35.240 |
and I don't think we're ever going to reach a point 00:41:40.880 |
You solve this equation, you plug in some conditions 00:41:46.120 |
I don't think we're ever gonna reach that point 00:41:51.920 |
to sometimes believe in science and become elitist 00:41:56.100 |
and become, I don't know, hard when in reality 00:41:58.860 |
it should humble you and make you feel smaller. 00:42:16.700 |
When I start the episodes, I say, hey, Vsauce, Michael here. 00:42:20.500 |
Vsauce and Michael are actually different things in my mind. 00:42:33.620 |
You're just sort of plugging into this big thing 00:42:36.100 |
that is scientific exploration of our reality 00:42:42.660 |
but you're just plugging into this big Vsauce ball 00:42:47.660 |
that others, millions of others are plugged into. 00:42:49.860 |
- Yeah, and I'm just hoping to encourage curiosity 00:42:56.380 |
and an embracement of doubt and being okay with that. 00:43:25.100 |
that if you can get an honest answer that you would ask 00:43:38.900 |
as you ask and answer some of the questions you do? 00:43:49.500 |
how would you like to see the algorithm improve? 00:43:52.660 |
- Well, I think of the algorithm as a mirror. 00:43:58.940 |
and we don't always like what we see in that mirror. 00:44:07.120 |
it's reflecting back what people on average want to watch. 00:44:11.400 |
And when you see things being recommended to you, 00:44:15.380 |
it's reflecting back what it thinks you want to see. 00:44:41.720 |
As long as the next thing you do is open another video. 00:44:53.820 |
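As a toy illustration of that "mirror" framing, a recommender that ranks candidates purely by predicted engagement and so reflects back whatever keeps you watching, here is a small sketch; the scoring, the field names, and the example videos are all assumptions for illustration, not YouTube's actual system.

```python
# Toy sketch (an assumption for illustration, not YouTube's actual system):
# rank candidate videos purely by predicted engagement, e.g. expected watch
# time for this viewer, and ignore everything else.
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_minutes: float  # what the model guesses this viewer will watch
    educational: bool               # a label this ranker never looks at

def rank_by_engagement(candidates: list[Candidate]) -> list[Candidate]:
    """Pure engagement objective: whatever makes you open the next video."""
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)

videos = [
    Candidate("Deep dive on gravity", 6.0, educational=True),
    Candidate("Celebrity reads a love letter", 9.5, educational=False),
    Candidate("Five-minute physics puzzle", 4.0, educational=True),
]

for v in rank_by_engagement(videos):
    print(f"{v.predicted_watch_minutes:>4.1f} min  {v.title}")
```

Nothing in that objective rewards the educational flag, which is the tension the conversation keeps circling: optimizing for the next click rather than for a viewer's longer-term development.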
you're giving us 20 bucks a month no matter what, 00:45:09.740 |
your longer term development as a human being, 00:45:12.680 |
which I think ultimately will make you feel better 00:45:17.520 |
and allowing you to stick with it for longer. 00:45:25.600 |
eventually you sort of wake up like from a drug 00:45:30.900 |
So I wonder how much they're trying to optimize 00:45:35.840 |
your videos aren't exactly sort of, no offense, 00:45:47.320 |
and I feel like a better human after I've watched it. 00:45:49.680 |
So they're not just optimizing for the clickability. 00:46:03.300 |
how profound you explore the directions and so on. 00:46:07.020 |
- I've been really lucky in that I don't worry too much 00:46:18.140 |
where I'm in partnership with YouTube on the thumbnails, 00:46:37.820 |
And I kind of feel like all of the Vsauce channels 00:46:41.240 |
have cultivated an audience that expects that. 00:46:51.000 |
But there are other audiences out there that want that. 00:47:03.700 |
I mean, that's what makes YouTube really different 00:47:12.760 |
You open up Facebook, you can open up YouTube 00:47:16.280 |
are like what happened amongst the traditional 00:47:31.440 |
When they see Ariana Grande reads a love letter 00:47:41.840 |
they're like, well, I mean, I'm just on the bus. 00:47:46.000 |
Like I don't have time to dive into a whole lesson. 00:47:55.560 |
- Is there something you would improve about the algorithm? 00:47:59.180 |
Knowing of course, that as far as we're concerned, 00:48:02.800 |
it's a black box or we don't know how it works. 00:48:04.600 |
- Right, and I don't think that even anyone at YouTube 00:48:07.920 |
They know what they've tweaked, but then it learns. 00:48:09.840 |
I think that it learns and it decides how to behave. 00:48:13.840 |
And sometimes the YouTube employees are left going, 00:48:16.680 |
I don't know, maybe we should like change the value 00:48:22.680 |
and maybe it should worry more about something. 00:48:24.720 |
I don't know, but I mean, I would like to see, 00:48:27.420 |
I don't know what they're doing and not doing. 00:48:32.760 |
that you think they should be having just internally, 00:48:38.960 |
should they be thinking about the long-term future? 00:48:41.160 |
Should they be thinking about educational content 00:48:48.960 |
news or educational content, like what you're providing, 00:48:51.560 |
which is asking big sort of timeless questions 00:48:56.600 |
- Well, it's interesting, what should they think about? 00:49:08.400 |
You don't have shows like "3Blue1Brown" 00:49:11.720 |
or "Physics Girl" or "Looking Glass Universe" 00:49:21.240 |
and they don't have commissioned shows from those platforms. 00:49:26.680 |
that want to share their passion for learning, 00:49:32.540 |
And YouTube could promote those kinds of shows more, 00:49:43.280 |
and YouTube needs to make sure that the average user 00:49:47.760 |
They could still promote it more for the good of society, 00:49:51.080 |
but then we're making some really weird claims 00:50:00.400 |
I mentioned this quote before from Unamuno about, 00:50:02.920 |
look, I've seen a cat like estimate distances 00:50:05.440 |
and calculate a jump more often than I've seen a cat cry. 00:50:12.480 |
and make us feel things can be cheesy and can feel cheap, 00:50:18.040 |
And so even the dumbest vlog is still so important 00:50:23.800 |
that I don't think I have a better claim to take its spot 00:50:33.960 |
the shallow parts, the deep parts, you're right. 00:50:38.520 |
I miss the days when engaging with content on YouTube 00:50:43.400 |
helped push it into my subscribers timelines. 00:50:51.200 |
it would show up in the feed on the front page of the app 00:51:07.340 |
When I subscribe to someone, when I'm following them, 00:51:15.400 |
And I think that Twitter and Facebook are doing that 00:51:22.380 |
And I think we would see communities being stronger 00:51:25.200 |
on YouTube if it was that way, instead of YouTube going, 00:51:27.320 |
well, technically Michael liked this Veritasium video, 00:51:29.800 |
but people are way more likely to click on Carpool Karaoke. 00:51:33.620 |
So I don't even care who they are, just give them that. 00:51:38.960 |
that is an extremely important part of our society, 00:51:43.280 |
what it means to be a human on earth, you know, but- 00:51:49.680 |
- But a lot of people would disagree with you 00:51:51.360 |
and they should be able to see as much of that as they want. 00:51:53.860 |
And I think even people who don't think they like it 00:52:00.920 |
But yeah, I just wish that like new channels I discover 00:52:04.380 |
I wish that my subscribers found out about that 00:52:06.960 |
because especially in the education community, 00:52:14.040 |
you're just more likely to wanna watch an episode from me, 00:52:18.560 |
It's not competitive in the way that traditional TV was, 00:52:21.880 |
where it's like, well, if you tune into that show, 00:52:26.160 |
So helping each other out through collaborations 00:52:29.360 |
takes a lot of work, but just through engaging, 00:52:31.760 |
commenting on their videos, liking their videos, 00:52:36.720 |
that I would love to see become easier and more powerful. 00:52:48.960 |
You've spoken about death as an interesting topic. 00:52:59.740 |
- So what do you think is the meaning of life 00:53:17.360 |
What does mortality in the context of the whole universe 00:53:39.600 |
I think there's really exciting things being done 00:53:42.880 |
to extend life, but we still don't know how to like, 00:53:46.600 |
protect you from some accident that could happen, 00:53:54.120 |
and like recreate my consciousness digitally. 00:53:59.400 |
if it's stored on a physical medium or something. 00:54:13.400 |
that can be like, no, that's not what I meant. 00:54:39.000 |
I think I'm glad to get out of their way at some point 00:54:50.360 |
that far outlives you in ways that you can't control, 00:54:59.240 |
when they finally arrive and destroy all of us 00:55:25.840 |
in some of the ways that I have found useful and satisfying. 00:55:35.320 |
like burnt to a crisp, everything on it today, 00:55:50.560 |
- So we do, however, have the power to launch things 00:55:55.160 |
into space to try to extend how long our memory exists. 00:56:06.960 |
and we're learning things and writing stories 00:56:10.680 |
is truly what I think is the essence of being a human. 00:56:21.720 |
We're better than fossils, we're better than light spectrum, 00:56:26.800 |
We collect much more detailed memories of what's happening, 00:56:44.840 |
But even if I don't, you can't not have an effect. 00:56:47.600 |
That's the thing, this is not me feeling like, 00:56:57.760 |
that that impact might look really small and that's okay. 00:57:01.400 |
One of my favorite quotes is from Tess of the D'Urbervilles 00:57:04.520 |
and it's along the lines of the measure of your life 00:57:13.120 |
If I am happy and those that I love are happy, 00:57:20.120 |
- I think there's no better place to end it, Michael. 00:57:23.480 |
Thank you so much, it was an honor to meet you. 00:57:30.480 |
And thank you to our presenting sponsor, Cash App. 00:57:43.300 |
to learn, to dream of engineering our future. 00:57:47.040 |
If you enjoy this podcast, subscribe on YouTube, 00:57:51.920 |
support it on Patreon or connect with me on Twitter. 00:57:55.520 |
And now let me leave you with some words of wisdom from Albert Einstein. 00:58:00.720 |
"The important thing is not to stop questioning. 00:58:09.160 |
"when he contemplates the mysteries of eternity, 00:58:11.660 |
"of life, the marvelous structure of reality. 00:58:14.920 |
"It is enough if one tries merely to comprehend 00:58:21.080 |
Thank you for listening and hope to see you next time.