
Michael Stevens: Vsauce | Lex Fridman Podcast #58


Chapters

0:00 Michael Stevens
5:03 Why Are We Aware of Ourselves
19:47 Flat Earth Theory
38:00 Elon Musk
57:56 Words of Wisdom


00:00:00.000 | The following is a conversation with Michael Stevens,
00:00:02.680 | the creator of Vsauce,
00:00:04.480 | one of the most popular educational YouTube channels
00:00:07.000 | in the world, with over 15 million subscribers
00:00:10.080 | and over 1.7 billion views.
00:00:13.040 | His videos often ask and answer questions
00:00:16.160 | that are both profound and entertaining,
00:00:18.760 | spanning topics from physics to psychology.
00:00:21.780 | Popular questions include,
00:00:23.640 | what if everyone jumped at once?
00:00:25.440 | Or what if the sun disappeared?
00:00:27.680 | Or why are things creepy?
00:00:29.820 | Or what if the earth stopped spinning?
00:00:32.840 | As part of his channel,
00:00:34.080 | he created three seasons of "Mind Field,"
00:00:36.320 | a series that explored human behavior.
00:00:38.760 | His curiosity and passion are contagious
00:00:41.600 | and inspiring to millions of people.
00:00:44.080 | And so as an educator,
00:00:45.240 | his impact and contribution to the world
00:00:47.540 | is truly immeasurable.
00:00:49.780 | This is the Artificial Intelligence Podcast.
00:00:52.840 | If you enjoy it, subscribe on YouTube,
00:00:55.200 | give it five stars on Apple Podcast,
00:00:57.160 | support it on Patreon,
00:00:58.600 | or simply connect with me on Twitter,
00:01:00.640 | @LexFridman, spelled F-R-I-D-M-A-N.
00:01:03.800 | I recently started doing ads
00:01:06.380 | at the end of the introduction.
00:01:08.140 | I'll do one or two minutes after introducing the episode,
00:01:10.780 | and never any ads in the middle
00:01:12.740 | that break the flow of the conversation.
00:01:14.780 | I hope that works for you
00:01:16.120 | and doesn't hurt the listening experience.
00:01:19.000 | This show is presented by Cash App,
00:01:21.500 | the number one finance app in the App Store.
00:01:24.000 | I personally use Cash App to send money to friends,
00:01:26.580 | but you can also use it to buy, sell,
00:01:28.300 | and deposit Bitcoin in just seconds.
00:01:30.740 | Cash App also has a new investing feature.
00:01:33.520 | You can buy fractions of a stock,
00:01:34.980 | say $1 worth, no matter what the stock price is.
00:01:38.300 | Brokerage services are provided by Cash App Investing,
00:01:40.940 | a subsidiary of Square and member SIPC.
00:01:44.260 | I'm excited to be working with Cash App
00:01:46.220 | to support one of my favorite organizations called FIRST,
00:01:49.160 | best known for their FIRST Robotics and Lego competitions.
00:01:52.480 | They educate and inspire hundreds of thousands of students
00:01:56.020 | in over 110 countries
00:01:57.740 | and have a perfect rating on Charity Navigator,
00:02:00.340 | which means the donated money
00:02:01.660 | is used to maximum effectiveness.
00:02:04.300 | When you get Cash App from the App Store, Google Play,
00:02:07.220 | and use code LEXPODCAST,
00:02:09.940 | you'll get $10 and Cash App will also donate $10 to FIRST,
00:02:13.780 | which again is an organization
00:02:15.540 | that I've personally seen inspire girls and boys
00:02:18.400 | to dream of engineering a better world.
00:02:21.700 | And now here's my conversation with Michael Stevens.
00:02:27.020 | One of your deeper interests is psychology,
00:02:30.260 | understanding human behavior.
00:02:31.760 | You've pointed out how messy studying human behavior is
00:02:35.500 | and that it's far from the scientific rigor
00:02:37.420 | of something like physics, for example.
00:02:40.900 | How do you think we can take psychology
00:02:43.660 | from where it's been in the 20th century
00:02:45.680 | to something more like what the physicists,
00:02:49.100 | theoretical physicists are doing,
00:02:50.460 | something precise, something rigorous?
00:02:53.280 | - Well, we could do it by
00:02:55.400 | finding the physical foundations of psychology, right?
00:03:01.200 | If all of our emotions and moods and feelings and behaviors
00:03:05.840 | are the result of mechanical behaviors
00:03:10.400 | of atoms and molecules in our brains,
00:03:12.800 | then can we find correlations?
00:03:15.400 | Perhaps chaos makes that really difficult
00:03:17.760 | and the uncertainty principle and all these things.
00:03:19.600 | We can't know the position and velocity
00:03:22.760 | of every single quantum state in a brain, probably.
00:03:27.160 | But I think that if we can get to that point with psychology,
00:03:32.160 | then we can start to think about consciousness
00:03:37.400 | in a physical and mathematical way.
00:03:41.000 | When we ask questions like, well, what is self-reference?
00:03:44.800 | How can you think about your self-thinking?
00:03:47.360 | What are some mathematical structures
00:03:49.200 | that could bring that about?
00:03:52.440 | - There's ideas of, in terms of consciousness
00:03:55.560 | and breaking it down into physics,
00:03:59.240 | there's ideas of panpsychism where people believe that
00:04:03.480 | whatever consciousness is is a fundamental part of reality.
00:04:07.480 | It's almost like a physics law.
00:04:08.840 | Do you think, what's your views on consciousness?
00:04:11.560 | Do you think it has this deep part of reality
00:04:15.600 | or is it something that's deeply human
00:04:17.840 | and constructed by us humans?
00:04:21.640 | - Start nice and light and easy.
00:04:24.240 | - Nothing I ask you today has an actually proven answer,
00:04:28.320 | so we're just hypothesizing.
00:04:29.760 | - So yeah, I mean, I should clarify,
00:04:31.680 | this is all speculation and I'm not an expert
00:04:33.800 | in any of these topics and I'm not God.
00:04:36.600 | But I think that consciousness is probably
00:04:41.160 | something that can be fully explained
00:04:45.720 | within the laws of physics.
00:04:49.080 | I think that our bodies and brains and the universe,
00:04:52.720 | at the quantum level, are so rich and complex,
00:04:57.080 | I'd be surprised if we couldn't find room
00:04:59.320 | for consciousness there.
00:05:00.720 | And why should we be conscious?
00:05:05.200 | Why are we aware of ourselves?
00:05:06.960 | That is a very strange and interesting
00:05:11.200 | and important question.
00:05:12.720 | And I think for the next few thousand years,
00:05:15.700 | we're going to have to believe in answers purely on faith.
00:05:20.240 | But my guess is that we will find that
00:05:25.120 | within the configuration space
00:05:27.120 | of possible arrangements of the universe,
00:05:29.880 | there are some that contain memories of others.
00:05:34.280 | Literally, Julian Barbour calls them time capsule states
00:05:38.180 | where you're like, yeah, not only do I have a scratch
00:05:40.440 | on my arm, but also this state of the universe
00:05:43.200 | also contains a memory in my head
00:05:45.760 | of being scratched by my cat three days ago.
00:05:48.600 | And for some reason, those kinds of states of the universe
00:05:52.960 | are more plentiful or more likely.
00:05:55.800 | - When you say those states,
00:05:57.160 | the ones that contain memories of its past
00:06:00.020 | or ones that contain memories of its past
00:06:02.780 | and have degrees of consciousness?
00:06:05.420 | - Just the first part,
00:06:06.840 | because I think the consciousness then emerges
00:06:10.240 | from the fact that a state of the universe
00:06:13.200 | that contains fragments or memories of other states
00:06:18.200 | is one where you're going to feel like there's time.
00:06:22.600 | You're going to feel like, yeah,
00:06:24.160 | things happened in the past.
00:06:26.440 | And I don't know what'll happen in the future
00:06:28.000 | because these states don't contain information
00:06:29.640 | about the future.
00:06:30.800 | For some reason, those kinds of states
00:06:34.040 | are either more common, more plentiful,
00:06:38.280 | or you could use the anthropic principle
00:06:40.200 | and just say, well, they're extremely rare.
00:06:42.520 | But until you are in one, or if you are in one,
00:06:45.760 | then you can ask questions
00:06:46.880 | like you're asking me on this podcast.
00:06:48.860 | - Why questions?
00:06:50.720 | - But yeah, it's like, why are we conscious?
00:06:52.600 | Well, because if we weren't,
00:06:53.440 | we wouldn't be asking why we were.
00:06:55.240 | - You've kind of implied that you have a sense,
00:06:59.520 | again, hypothesis, theorizing,
00:07:02.400 | that the universe is deterministic.
00:07:05.580 | What's your thoughts about free will?
00:07:08.080 | Do you think of the universe as deterministic
00:07:10.360 | in a sense that it's unrolling a particular,
00:07:14.040 | like it's operating under a specific set of physical laws,
00:07:17.960 | and when you have set the initial conditions,
00:07:21.380 | it will unroll in the exact same way
00:07:23.540 | in our particular line of the universe every time?
00:07:28.480 | - That is a very useful way to think about the universe.
00:07:31.660 | It's done us well.
00:07:32.500 | It's brought us to the moon.
00:07:33.780 | It's brought us to where we are today, right?
00:07:36.960 | I would not say that I believe in determinism
00:07:40.320 | in that kind of an absolute form,
00:07:43.240 | or actually, I just don't care.
00:07:45.600 | Maybe it's true,
00:07:46.780 | but I'm not gonna live my life like it is.
00:07:48.880 | - What in your sense,
00:07:50.940 | 'cause you've studied kind of
00:07:52.120 | how we humans think of the world,
00:07:55.880 | what's in your view is the difference
00:07:58.000 | between our perception,
00:07:59.260 | like how we think the world is, and reality?
00:08:02.360 | Do you think there's a huge gap there?
00:08:04.180 | Like, do we delude ourselves,
00:08:05.440 | is the whole thing an illusion,
00:08:07.280 | just everything about human psychology,
00:08:09.400 | the way we see things,
00:08:10.760 | and how things actually are.
00:08:12.920 | All the things you've studied,
00:08:14.120 | what's your sense?
00:08:14.960 | How big is the gap between reality and perception?
00:08:16.960 | - Well, again, purely speculative.
00:08:18.960 | I think that we will never know the answer.
00:08:21.000 | We cannot know the answer.
00:08:22.640 | There is no experiment to find an answer to that question.
00:08:27.000 | Everything we experience is an event in our brain.
00:08:30.280 | When I look at a cat, I'm not even,
00:08:34.100 | I can't prove that there's a cat there.
00:08:36.700 | All I am experiencing is the perception of a cat
00:08:40.940 | inside my own brain.
00:08:43.140 | I am only a witness to the events of my mind.
00:08:46.340 | I think it is very useful to infer
00:08:49.340 | that if I witness the event of cat in my head,
00:08:54.100 | it's because I'm looking at a cat that is literally there
00:08:57.220 | and has its own feelings and motivations
00:09:00.060 | and should be pet and given food and water and love.
00:09:03.100 | I think that's the way you should live your life.
00:09:05.900 | But whether or not we live in a simulation,
00:09:09.460 | or I'm a brain in a vat, I don't know.
00:09:13.020 | - Do you care?
00:09:13.860 | - I don't really, well, I care
00:09:17.460 | because it's a fascinating question.
00:09:19.540 | And it's a fantastic way to get people excited
00:09:22.020 | about all kinds of topics, physics, psychology,
00:09:26.220 | consciousness, philosophy.
00:09:28.060 | But at the end of the day, what would the difference be?
00:09:31.020 | If you--
00:09:31.840 | - The cat needs to be fed at the end of the day.
00:09:33.820 | Otherwise, it'll be a dead cat.
00:09:35.660 | - Right, but if it's not even a real cat,
00:09:38.460 | then it's just like a video game cat.
00:09:40.380 | And, right, so what's the difference
00:09:42.460 | between killing a digital cat in a video game
00:09:45.820 | because of neglect versus a real cat?
00:09:48.340 | It seems very different to us psychologically.
00:09:50.580 | Like I don't really feel bad about,
00:09:51.940 | oh my gosh, I forgot to feed my Tamagotchi, right?
00:09:54.420 | But I would feel terrible
00:09:55.820 | if I forgot to feed my actual cats.
00:09:58.980 | - So can you just touch on the topic of simulation?
00:10:03.440 | Do you find this thought experiment
00:10:05.640 | that we're living in a simulation useful,
00:10:08.800 | inspiring, or constructive in any kind of way?
00:10:11.560 | Do you think it's ridiculous?
00:10:12.960 | Do you think it could be true?
00:10:15.000 | Or is it just a useful thought experiment?
00:10:17.520 | - I think it is extremely useful as a thought experiment
00:10:20.920 | because it makes sense to everyone,
00:10:24.680 | especially as we see virtual reality and computer games
00:10:28.520 | getting more and more complex.
00:10:30.600 | You're not talking to an audience in like Newton's time
00:10:33.880 | where you're like, imagine a clock
00:10:36.660 | that has mechanics in it that are so complex
00:10:38.800 | that it can create love.
00:10:40.240 | And everyone's like, no.
00:10:42.400 | But today you really start to feel,
00:10:45.920 | man, at what point is this little robot friend of mine
00:10:48.960 | gonna be like someone I don't want to cancel plans with?
00:10:56.000 | And so it's a great, the thought experiment
00:10:59.040 | of do we live in a simulation?
00:11:00.320 | Am I a brain in a vat that is just being given
00:11:03.800 | electrical impulses from some nefarious other beings
00:11:08.800 | so that I believe that I live on Earth
00:11:11.040 | and that I have a body and all of this?
00:11:13.100 | And the fact that you can't prove it either way
00:11:15.520 | is a fantastic way to introduce people
00:11:17.520 | to some of the deepest questions.
00:11:19.840 | - So you mentioned a little buddy
00:11:23.120 | that you would want to cancel an appointment with.
00:11:25.720 | So that's a lot of our conversations.
00:11:27.820 | That's what my research is, it's artificial intelligence.
00:11:30.760 | And I apologize, but you're such a fun person
00:11:34.560 | to ask these big questions with.
00:11:36.920 | - Well, I hope I can give some answers that are interesting.
00:11:40.240 | - Well, because you've sharpened your brain's ability
00:11:45.240 | to explore some of the questions
00:11:47.960 | that many scientists are actually afraid of even touching,
00:11:51.160 | which is fascinating.
00:11:52.440 | I think you're, in that sense,
00:11:54.040 | ultimately a great scientist
00:11:56.640 | through this process of sharpening your brain.
00:11:58.960 | - Well, I don't know if I am a scientist.
00:12:01.760 | I think science is a way of knowing.
00:12:04.960 | And there are a lot of questions I investigate
00:12:09.480 | that are not scientific questions.
00:12:12.000 | On, like, Mind Field, we have definitely done
00:12:14.240 | scientific experiments and studies
00:12:16.700 | that had hypotheses and all of that.
00:12:18.800 | But not to be too precious about
00:12:22.520 | what does the word science mean.
00:12:24.240 | But I think I would just describe myself as curious,
00:12:27.640 | and I hope that that curiosity is contagious.
00:12:29.960 | - So to you, the scientific method
00:12:31.880 | is deeply connected to science,
00:12:33.760 | because your curiosity took you to asking questions.
00:12:38.320 | To me, asking a good question,
00:12:40.680 | even if you feel, society feels that it's not a question
00:12:45.440 | within the reach of science currently,
00:12:47.360 | to me, asking the question is the biggest step
00:12:51.360 | of the scientific process.
00:12:53.360 | The scientific method is the second part,
00:12:57.280 | and that may be what traditionally is called science.
00:12:59.440 | But to me, asking the questions,
00:13:00.880 | being brave enough to ask the questions,
00:13:03.040 | being curious and not constrained
00:13:05.040 | by what you're supposed to think,
00:13:07.360 | is just truly what it means to be a scientist to me.
00:13:11.600 | - It's certainly a huge part of what it means to be a human.
00:13:15.280 | If I were to say, you know what, I don't believe in forces.
00:13:19.120 | I think that when I push on a massive object,
00:13:22.160 | a ghost leaves my body and enters the object I'm pushing,
00:13:25.720 | and these ghosts happen to just get really lazy
00:13:28.080 | when they're around massive things.
00:13:29.800 | And that's why F equals MA.
00:13:31.260 | Oh, and by the way, the laziness of the ghost
00:13:34.400 | is in proportion to the mass of the object.
00:13:36.360 | So boom, prove me wrong.
00:13:37.800 | Every experiment, well, you can never find the ghost.
00:13:41.100 | And so none of that theory is scientific.
00:13:45.920 | But once I start saying, can I see the ghost?
00:13:49.440 | Why should there be a ghost?
00:13:50.920 | And if there aren't ghosts, what might I expect?
00:13:53.120 | And I start to do different tests to see,
00:13:56.580 | is this falsifiable?
00:13:58.040 | Are there things that should happen if there are ghosts,
00:14:01.520 | or are there things that shouldn't happen?
00:14:02.780 | And do they, you know, what do I observe?
00:14:05.160 | Now I'm thinking scientifically.
00:14:06.960 | I don't think of science as, wow, a picture of a black hole.
00:14:11.000 | That's just a photograph.
00:14:12.280 | That's an image.
00:14:13.100 | That's data, that's a sensory and perception experience.
00:14:16.480 | Science is how we got that and how we understand it
00:14:19.640 | and how we believe in it and how we reduce our uncertainty
00:14:22.180 | around what it means.
00:14:24.040 | - But I would say I'm deeply
00:14:26.060 | within the scientific community
00:14:28.260 | and I'm sometimes disheartened by the elitism
00:14:31.600 | of the thinking, sort of not allowing yourself
00:14:34.640 | to think outside the box.
00:14:36.280 | So allowing the possibility of going against
00:14:38.860 | the conventions of science, I think,
00:14:40.980 | is a beautiful part of some
00:14:43.940 | of the greatest scientists in history.
00:14:46.300 | - I don't know.
00:14:47.140 | I'm impressed by scientists every day.
00:14:49.860 | And revolutions in our knowledge of the world occur
00:14:54.860 | only under very special circumstances.
00:15:00.760 | It is very scary to challenge conventional thinking
00:15:05.700 | and risky because, let's go back to elitism and ego,
00:15:10.240 | right, if you just say, you know what,
00:15:11.460 | I believe in the spirits of my body
00:15:14.020 | and all forces are actually created by invisible creatures
00:15:17.900 | that transfer themselves between objects.
00:15:21.160 | If you ridicule every other theory
00:15:26.700 | and say that you're correct,
00:15:28.820 | then ego gets involved and you just don't go anywhere.
00:15:31.700 | But fundamentally, the question of, well,
00:15:34.340 | what is a force is incredibly important.
00:15:38.580 | We need to have that conversation,
00:15:40.040 | but it needs to be done in this very political way
00:15:42.660 | of like, let's be respectful of everyone
00:15:44.660 | and let's realize that we're all learning together
00:15:46.980 | and not shutting out other people.
00:15:49.060 | And so when you look at a lot of revolutionary ideas,
00:15:54.060 | they were not accepted right away.
00:15:57.500 | And, you know, Galileo had a couple of problems
00:16:00.580 | with the authorities and later thinkers,
00:16:04.220 | Descartes was like, all right, look,
00:16:05.580 | I kind of agree with Galileo,
00:16:07.020 | but I'm gonna have to not say that.
00:16:11.400 | I'll have to create and invent and write different things
00:16:13.760 | that keep me from being in trouble,
00:16:15.120 | but we still slowly made progress.
00:16:17.280 | - Revolutions are difficult in all forms
00:16:19.280 | and certainly in science.
00:16:20.480 | Before we get to AI, on topic of revolutionary ideas,
00:16:23.840 | let me ask, on a Reddit AMA,
00:16:26.640 | you said that "Is the earth flat?"
00:16:28.960 | is one of the favorite questions you've ever answered.
00:16:31.600 | Speaking of revolutionary ideas.
00:16:33.640 | So your video on that,
00:16:36.020 | people should definitely watch, is really fascinating.
00:16:38.820 | Can you elaborate why you enjoyed
00:16:41.680 | answering this question so much?
00:16:43.820 | - Yeah, well, it's a long story.
00:16:45.640 | I remember a long time ago,
00:16:49.400 | I was living in New York at the time.
00:16:50.940 | So it had to have been like 2009 or something.
00:16:53.980 | I visited the flat earth forums
00:16:57.600 | and this was before the flat earth theories
00:17:00.280 | became as sort of mainstream as they are.
00:17:03.160 | - Sorry to ask the dumb question.
00:17:05.400 | Forums, online forums.
00:17:06.740 | - Yeah, the flat earth society.
00:17:09.060 | I don't know if it's .com or .org,
00:17:10.540 | but I went there and I was reading their ideas
00:17:14.220 | and how they responded to typical criticisms of,
00:17:17.900 | well, the earth isn't flat because what about this?
00:17:20.300 | And I could not tell, and I mentioned this in my video,
00:17:23.800 | I couldn't tell how many of these community members
00:17:28.680 | actually believe the earth was flat or were just trolling.
00:17:32.420 | And I realized that the fascinating thing
00:17:35.300 | is how do we know anything?
00:17:38.500 | And what makes for a good belief
00:17:41.660 | versus a maybe not so tenable or good belief.
00:17:45.460 | And so that's really what my video
00:17:47.600 | about earth being flat is about.
00:17:49.960 | It's about, look, there are a lot of reasons.
00:17:52.460 | The earth is probably not flat,
00:17:55.720 | but a flat earth believer can respond
00:18:00.640 | to every single one of them, but it's all in an ad hoc way.
00:18:04.140 | And all of their rebuttals aren't necessarily
00:18:06.540 | gonna form a cohesive non-contradictory whole.
00:18:10.780 | And I believe that's the episode where I talk
00:18:13.100 | about Occam's razor and Newton's flaming laser sword.
00:18:17.420 | And then I say, well, you know what, wait a second,
00:18:19.500 | we know that space contracts as you move.
00:18:24.500 | And so to a particle moving near the speed of light
00:18:27.220 | towards earth, earth would be flattened
00:18:29.620 | in the direction of that particle's travel.
00:18:32.220 | So to them, earth is flat.
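(For reference, the length contraction Michael is invoking here is the standard special-relativity relation below; the worked number is illustrative and not from the conversation.)

```latex
% Length contraction: an object of rest length L_0, moving at speed v along
% that length, is measured to have length
L = L_0 \sqrt{1 - \frac{v^2}{c^2}}
% Illustrative value: for a particle approaching Earth at v = 0.999c, Earth's
% extent along the direction of motion is contracted by a factor of
% \sqrt{1 - 0.999^2} \approx 0.045 in the particle's frame.
```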
00:18:35.540 | Like we need to be really generous to even wild ideas
00:18:40.540 | because they're all thinking.
00:18:43.840 | They're all the communication of ideas
00:18:45.860 | and what else can it mean to be a human?
00:18:48.300 | - Yeah, and I think I'm a huge fan
00:18:51.000 | of the flat earth theory, quote unquote,
00:18:54.900 | in the sense that to me, it feels harmless
00:18:57.540 | to explore some of the questions
00:18:59.020 | of what it means to believe something,
00:19:00.500 | what it means to explore the edge of science and so on.
00:19:04.700 | It's 'cause it's harmless; to me,
00:19:07.140 | nobody gets hurt whether the earth is flat or round,
00:19:09.680 | not literally, but I mean, intellectually
00:19:11.820 | when we're just having a conversation.
00:19:13.340 | That said, again, to elitism,
00:19:15.760 | I find that scientists roll their eyes way too fast
00:19:20.060 | on the flat earth.
00:19:21.420 | The kind of dismissal that I see of even this notion,
00:19:25.540 | they haven't, like, sat down and said,
00:19:27.780 | what are the arguments that are being proposed?
00:19:30.260 | And this is why these arguments are incorrect.
00:19:32.580 | So that should be something that scientists should always do
00:19:37.460 | even to the sort of ideas that seem most ridiculous.
00:19:42.140 | So I like this, it's almost my test
00:19:45.880 | when I ask people what they think about flat earth theory
00:19:48.420 | to see how quickly they roll their eyes.
00:19:51.100 | - Well, yeah, I mean, let me go on record
00:19:53.900 | and say that the earth is not flat.
00:19:58.380 | It is a three-dimensional spheroid.
00:20:02.020 | However, I don't know that and it has not been proven.
00:20:06.980 | Science doesn't prove anything.
00:20:08.820 | It just reduces uncertainty.
00:20:10.660 | Could the earth actually be flat?
00:20:12.300 | Extremely unlikely, extremely unlikely.
00:20:18.940 | And so it is a ridiculous notion
00:20:21.660 | if we care about how probable and certain our ideas might be
00:20:26.660 | but I think it's incredibly important
00:20:28.380 | to talk about science in that way
00:20:32.100 | and to not resort to, well, it's true.
00:20:35.260 | It's true in the same way
00:20:38.260 | that a mathematical theorem is true.
00:20:41.300 | And I think we're kind of like being pretty pedantic
00:20:46.300 | about defining this stuff, but like,
00:20:49.100 | sure, I could take a rocket ship out
00:20:51.660 | and I could orbit earth and look at it
00:20:53.680 | and it would look like a ball, right?
00:20:56.900 | But I still can't prove that I'm not living in a simulation,
00:20:59.380 | that I'm not a brain in a vat,
00:21:00.540 | that this isn't all an elaborate ruse
00:21:02.660 | created by some technologically advanced
00:21:04.580 | extraterrestrial civilization.
00:21:06.540 | So there's always some doubt and that's fine.
00:21:11.020 | That's exciting.
00:21:12.300 | - And I think that kind of doubt,
00:21:14.020 | practically speaking, is useful
00:21:15.380 | when you start talking about quantum mechanics
00:21:17.700 | or string theory, sort of, it helps.
00:21:20.340 | To me, that kind of little,
00:21:21.900 | adds a little spice into the thinking process
00:21:25.460 | of scientists.
00:21:26.460 | So, I mean, just as a thought experiment,
00:21:30.060 | your video kind of, okay, say the earth is flat,
00:21:33.420 | what would the forces when you walk about
00:21:35.940 | this flat earth feel like to the human?
00:21:38.380 | That's a really nice thought experiment to think about.
00:21:40.620 | - Right, 'cause what's really nice about it
00:21:42.420 | is that it's a funny thought experiment,
00:21:45.380 | but you actually wind up accidentally learning
00:21:48.060 | a whole lot about gravity and about relativity
00:21:51.620 | and geometry and I think that's really the goal
00:21:55.700 | of what I'm doing.
00:21:56.540 | I'm not trying to convince people that the earth is round.
00:21:58.820 | I feel like you either believe that it is or you don't
00:22:01.300 | and that's, you know, how can I change that?
00:22:04.660 | What I can do is change how you think
00:22:06.900 | and how you are introduced to important concepts
00:22:10.940 | like, well, how does gravity operate?
00:22:13.740 | Oh, it's all about the center of mass of an object.
00:22:16.500 | So right, on a sphere,
00:22:17.780 | we're all pulled towards the middle,
00:22:19.460 | essentially the centroid, geometrically,
00:22:21.460 | but on a disc, ooh, you're gonna be pulled
00:22:23.820 | at a weird angle if you're out near the edge
00:22:25.940 | and that stuff's fascinating.
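(For reference, here is a minimal numerical sketch of the effect Michael describes, written in Python with arbitrary units and an assumed small standing height; it is an illustration, not a computation from the video. On a uniform thin disc the net pull is straight down at the center and tilts toward the center as you approach the edge.)

```python
import numpy as np

# Minimal sketch, arbitrary units (G = surface density = 1): direction of the
# net gravitational pull on a uniform thin disc, sampled at several distances
# from the center. The small height h is an assumption that places the
# observer just above the disc to avoid the in-plane singularity.
R = 1.0                  # disc radius
h = 0.02                 # observer height above the disc (assumed)

# Polar grid of mass elements over the disc (cell centers).
nr, nt = 600, 600
r = np.linspace(R / (2 * nr), R - R / (2 * nr), nr)
t = np.linspace(0.0, 2.0 * np.pi, nt, endpoint=False)
rr, tt = np.meshgrid(r, t, indexing="ij")
dA = rr * (R / nr) * (2.0 * np.pi / nt)   # area of each mass element

def pull(s):
    """Net gravitational acceleration on an observer at radius s, height h."""
    dx = rr * np.cos(tt) - s              # observer -> mass element, x
    dy = rr * np.sin(tt)                  # observer -> mass element, y
    dz = -h                               # observer -> mass element, z
    d3 = (dx * dx + dy * dy + dz * dz) ** 1.5
    gx = np.sum(dA * dx / d3)             # horizontal (radial) component
    gz = np.sum(dA * dz / d3)             # vertical component (negative = down)
    return gx, gz

for s in (0.0, 0.5, 0.9, 0.99):
    gx, gz = pull(s)
    tilt = np.degrees(np.arctan2(-gx, -gz))   # 0 deg = straight down
    print(f"standing at {s:4.2f} R from the center: pull tilts {tilt:5.1f} deg toward the center")
```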
00:22:28.420 | - Yeah, and to me, that particular video
00:22:32.900 | opened my eyes even more to what gravity is.
00:22:37.500 | It's just a really nice visualization tool
00:22:40.020 | 'cause you always imagine gravity with spheres,
00:22:43.060 | with masses that are spheres.
00:22:44.620 | - Yeah. - And imagining gravity
00:22:46.340 | on masses that are not spherical, some other shape,
00:22:49.740 | but in here, a plate, a flat object is really interesting.
00:22:54.100 | It makes you really kind of visualize
00:22:56.300 | in a three-dimensional way the force.
00:22:57.740 | - Yeah, even a disc the size of Earth would be impossible.
00:23:02.740 | I think anything larger than like the moon, basically,
00:23:09.020 | needs to be a sphere because gravity will round it out.
00:23:13.740 | So you can't have a teacup the size of Jupiter, right?
00:23:18.140 | There's a great book about a teacup in the universe
00:23:21.020 | that I highly recommend.
00:23:22.900 | I don't remember the author.
00:23:24.700 | I forget her name, but it's a wonderful book,
00:23:26.780 | so look it up.
00:23:28.180 | I think it's called "Teacup in the Universe."
00:23:30.180 | - Just to linger on this point briefly,
00:23:32.680 | your videos are generally super popular, people love them, right?
00:23:37.100 | If you look at the sort of number of likes versus dislikes,
00:23:39.940 | this measure on YouTube, right, is incredible, as do I,
00:23:45.220 | but this particular flat Earth video
00:23:48.180 | has more dislikes than usual.
00:23:51.380 | On that topic in general, what's your sense,
00:23:56.920 | how big is the community,
00:23:58.820 | not just who believes in flat Earth,
00:24:00.740 | but sort of the anti-scientific community
00:24:03.720 | that naturally distrusts scientists in a way
00:24:07.620 | that's not an open-minded way,
00:24:12.180 | like really just distrust scientists,
00:24:13.700 | like they're bought by some, it's the kind of mechanism
00:24:17.060 | of some kind of bigger system
00:24:18.860 | that's trying to manipulate human beings.
00:24:21.100 | What's your sense of the size of that community?
00:24:24.060 | You're one of the sort of great educators in the world
00:24:28.940 | that educates people on the exciting power of science,
00:24:33.940 | so you're kind of up against this community.
00:24:38.100 | What's your sense of it?
00:24:39.980 | - I really have no idea.
00:24:41.980 | I haven't looked at the likes and dislikes
00:24:44.220 | on the flat Earth video, and so I would wonder
00:24:47.300 | if it has a greater percentage of dislikes than usual,
00:24:51.380 | is that because of people disliking it
00:24:53.580 | because they think that it's a video about Earth being flat
00:24:58.580 | and they find that ridiculous and they dislike it
00:25:02.020 | without even really watching much?
00:25:04.260 | Do they wish that I was more dismissive
00:25:07.380 | of flat Earth theories?
00:25:08.620 | - That's possible too.
00:25:10.660 | There are a lot of response videos
00:25:12.140 | that kind of go through the episode and are pro-flat Earth,
00:25:17.140 | but I don't know if there's a larger community
00:25:21.900 | of unorthodox thinkers today
00:25:25.180 | than there have been in the past.
00:25:27.500 | And I just wanna not lose them.
00:25:29.980 | I want them to keep listening and thinking.
00:25:32.620 | And by calling them all idiots or something,
00:25:36.660 | that does no good because how idiotic are they really?
00:25:41.060 | I mean, the Earth isn't a sphere at all.
00:25:45.380 | We know that it's an oblate spheroid,
00:25:47.780 | and that in and of itself is really interesting.
00:25:50.380 | And I investigated that in "Which Way Is Down?",
00:25:52.300 | where I'm like, really, down does not point
00:25:54.260 | towards the center of the Earth.
00:25:56.660 | It points in a different direction
00:25:58.940 | depending on what's underneath you and what's above you
00:26:01.660 | and what's around you.
00:26:02.500 | The whole universe is tugging on me.
00:26:06.060 | - And then you also show that gravity
00:26:08.420 | is non-uniform across the globe.
00:26:11.740 | Like if you, there's this, I guess, thought experiment,
00:26:14.420 | if you build a bridge all the way across the Earth
00:26:19.420 | and then just knock out its pillars, what would happen?
00:26:23.500 | - Yeah.
00:26:24.340 | - And you describe how it would be
00:26:25.820 | like a very chaotic, unstable thing that's happening
00:26:28.980 | because gravity is non-uniform throughout the Earth.
00:26:31.820 | - Yeah, in small spaces, like the ones we work in,
00:26:36.620 | we can essentially assume that gravity is uniform.
00:26:39.340 | But it's not.
00:26:40.900 | It is weaker the further you are from the Earth.
00:26:43.040 | And it also is going to be,
00:26:47.580 | it's radially pointed towards the middle of the Earth.
00:26:50.060 | So a really large object will feel tidal forces
00:26:54.140 | because of that non-uniformness.
00:26:55.660 | And we can take advantage of that with satellites, right?
00:26:58.500 | Gravitationally induced torque.
00:27:00.300 | It's a great way to align your satellite
00:27:01.700 | without having to use fuel or any kind of engine.
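(For reference, a small numerical illustration of the falloff and tidal differential Michael describes; the altitude and satellite length below are arbitrary example values, not figures from the conversation.)

```python
# Inverse-square gravity and the tidal (gravity-gradient) differential across
# a long, radially oriented satellite. Altitude and length are example values.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of Earth, kg
R_EARTH = 6.371e6    # mean radius of Earth, m

def g(r):
    """Gravitational acceleration magnitude at distance r from Earth's center."""
    return G * M_EARTH / (r * r)

altitude = 400e3                     # example: roughly low Earth orbit
r_orbit = R_EARTH + altitude
print(f"g at the surface:  {g(R_EARTH):.2f} m/s^2")                  # about 9.8
print(f"g at {altitude / 1e3:.0f} km up:    {g(r_orbit):.2f} m/s^2")  # noticeably weaker

# A 100 m object pointed along the radial direction: its near end is pulled
# slightly harder than its far end. That imbalance produces the torque used
# for gravity-gradient attitude stabilization, with no fuel required.
length = 100.0
delta_g = g(r_orbit - length / 2) - g(r_orbit + length / 2)
print(f"difference in g across {length:.0f} m: {delta_g:.1e} m/s^2")
```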
00:27:05.580 | - So let's jump back to it.
00:27:07.060 | Artificial intelligence.
00:27:08.340 | What's your thought of the state of where we are at
00:27:11.180 | currently with artificial intelligence?
00:27:12.980 | And what do you think it takes to build human level
00:27:15.900 | or superhuman level intelligence?
00:27:17.980 | - I don't know what intelligence means.
00:27:20.420 | That's my biggest question at the moment.
00:27:22.860 | And I think it's 'cause my instinct is always to go,
00:27:25.060 | well, what are the foundations here of our discussion?
00:27:27.980 | What does it mean to be intelligent?
00:27:31.060 | How do we measure the intelligence of an artificial machine
00:27:35.540 | or a program or something?
00:27:37.500 | - Can we say that humans are intelligent?
00:27:39.860 | Because there's also a fascinating field
00:27:42.620 | of how do you measure human intelligence?
00:27:44.260 | - Of course.
00:27:45.380 | - But if we just take that for granted,
00:27:47.060 | saying that whatever this fuzzy intelligence thing
00:27:50.020 | we're talking about, humans kind of have it.
00:27:52.220 | What would be a good test for you?
00:27:56.660 | So Turing developed a test
00:27:58.380 | that's natural language conversation.
00:28:00.420 | Would that impress you?
00:28:01.580 | A chatbot that you'd want to hang out and have a beer with
00:28:04.540 | for a bunch of hours or have dinner plans with.
00:28:08.380 | Is that a good test?
00:28:09.220 | Natural language conversation.
00:28:10.220 | Is there something else that would impress you?
00:28:12.260 | Or is that also too difficult to think about?
00:28:13.620 | - Oh yeah, I'm pretty much impressed by everything.
00:28:15.820 | I think that if-
00:28:17.740 | - Roomba?
00:28:18.740 | - If there was a chatbot that was like incredibly,
00:28:22.180 | I don't know, really had a personality.
00:28:24.460 | And if I didn't, the Turing test, right?
00:28:27.940 | Like if I'm unable to tell that it's not another person,
00:28:32.940 | but then I was shown a bunch of wires
00:28:36.620 | and mechanical components,
00:28:39.100 | and it was like, that's actually what you're talking to.
00:28:42.700 | I don't know if I would feel that guilty destroying it.
00:28:46.460 | I would feel guilty because clearly it's well-made
00:28:49.260 | and it's a really cool thing.
00:28:51.100 | It's like destroying a really cool car or something.
00:28:53.820 | But I would not feel like I was a murderer.
00:28:56.300 | So yeah, at what point would I start to feel that way?
00:28:58.820 | And this is such a subjective psychological question.
00:29:02.700 | If you give it movement,
00:29:05.020 | or if you have it act as though,
00:29:08.620 | or perhaps really feel pain as I destroy it
00:29:11.860 | and scream and resist, then I'd feel bad.
00:29:15.660 | - Yeah, it's beautifully put.
00:29:16.700 | And let's just say act like it's a pain.
00:29:20.540 | So if you just have a robot that, not screams,
00:29:25.660 | just, like, moans in pain if you kick it,
00:29:28.500 | that immediately just puts it in a class that we humans,
00:29:32.900 | it becomes, it anthropomorphizes it.
00:29:35.300 | It almost immediately becomes human.
00:29:37.980 | So that's a psychology question
00:29:39.380 | as opposed to sort of a physics question.
00:29:40.980 | - Right, I think that's a really good instinct to have.
00:29:43.140 | If the robot-
00:29:44.140 | - Screams.
00:29:46.620 | - Screams and moans,
00:29:48.480 | even if you don't believe that it has the mental experience,
00:29:52.700 | the qualia of pain and suffering,
00:29:55.340 | I think it's still a good instinct to say,
00:29:56.820 | you know what, I'd rather not hurt it.
00:30:00.020 | - The problem is that instinct can get us in trouble
00:30:02.620 | because then robots can manipulate that.
00:30:05.620 | And there's different kinds of robots.
00:30:08.300 | There's robots like the Facebook and the YouTube algorithm
00:30:10.700 | that recommends the video,
00:30:11.980 | and they can manipulate in the same kind of way.
00:30:14.500 | Well, let me ask you just to stick
00:30:16.300 | on artificial intelligence for a second.
00:30:17.900 | Do you have worries about existential threats from AI
00:30:21.780 | or existential threats from other technologies
00:30:23.740 | like nuclear weapons that could potentially destroy life
00:30:27.780 | on earth or damage it to a very significant degree?
00:30:31.260 | - Yeah, of course I do,
00:30:32.340 | especially the weapons that we create.
00:30:35.340 | There's all kinds of famous ways to think about this.
00:30:38.140 | And one is that, wow,
00:30:39.300 | what if we don't see advanced alien civilizations
00:30:43.980 | because of the danger of technology?
00:30:47.980 | What if we reach a point,
00:30:51.860 | and I think there's a channel, Thoughty2.
00:30:55.060 | Jeez, I wish I remembered the name of the channel.
00:30:57.460 | But he delves into this kind of limit
00:31:00.020 | of maybe once you discover radioactivity and its power,
00:31:05.020 | you've reached this important hurdle.
00:31:07.220 | And the reason that the skies are so empty
00:31:09.460 | is that no one's ever managed to survive as a civilization
00:31:13.940 | once they have that destructive power.
00:31:16.900 | And when it comes to AI, I'm not really very worried
00:31:21.900 | because I think that there are plenty of other people
00:31:24.820 | that are already worried enough.
00:31:26.500 | And oftentimes these worries are just,
00:31:30.580 | they just get in the way of progress.
00:31:32.940 | And they're questions that we should address later.
00:31:36.740 | And I think I talk about this in my interview
00:31:42.020 | with the self-driving autonomous vehicle guy
00:31:47.020 | as I think it was a bonus scene
00:31:48.660 | from the "Trolley Problem" episode.
00:31:52.140 | And I'm like, wow, what should a car do
00:31:54.260 | if this really weird contrived scenario happens
00:31:57.040 | where it has to swerve and save the driver, but kill a kid?
00:32:00.200 | And he's like, well, what would a human do?
00:32:03.540 | And if we resist technological progress
00:32:07.280 | because we're worried about all of these little issues,
00:32:10.420 | then it gets in the way.
00:32:12.020 | And we shouldn't avoid those problems,
00:32:14.340 | but we shouldn't allow them to be stumbling blocks
00:32:16.780 | to advancement.
00:32:18.940 | - So the folks like Sam Harris or Elon Musk
00:32:22.500 | are saying that we're not worried enough.
00:32:24.540 | So worry should not paralyze technological progress,
00:32:28.560 | but we're sort of marching, technology is marching forward
00:32:32.580 | without the key scientists, the developing of technology,
00:32:37.900 | worrying about the overnight having some effects
00:32:42.100 | that would be very detrimental to society.
00:32:45.220 | So to push back on your thought of the idea
00:32:49.540 | that there's enough people worrying about it,
00:32:51.260 | Elon Musk says there's not enough people worrying about it.
00:32:54.620 | That's the kind of balance is,
00:32:56.600 | you know, it's like folks who really focus
00:33:01.260 | on nuclear deterrence are saying
00:33:03.680 | there's not enough people worried
00:33:04.860 | about nuclear deterrence, right?
00:33:06.100 | So it's an interesting question of what is a good threshold
00:33:10.220 | of people to worry about these?
00:33:12.580 | And if it's too many people that are worried, you're right.
00:33:15.140 | It'll be like the press would over-report on it
00:33:18.740 | and it would halt technological progress.
00:33:21.820 | If not enough, then we can march straight ahead
00:33:24.420 | into that abyss that human beings might be destined for
00:33:29.420 | with the progress of technology.
00:33:31.340 | - Yeah, I don't know what the right balance is
00:33:33.700 | of how many people should be worried
00:33:35.700 | and how worried should they be,
00:33:36.980 | but we're always worried about new technology.
00:33:40.460 | We know that Plato was worried about the written word.
00:33:42.980 | He's like, we shouldn't teach people to write
00:33:45.020 | because then they won't use their minds to remember things.
00:33:48.100 | There have been concerns over technology
00:33:51.260 | and its advancement since the beginning of recorded history.
00:33:55.140 | And so, I think, however, these conversations
00:33:59.980 | are really important to have,
00:34:01.100 | because again, we learn a lot about ourselves.
00:34:03.420 | If we're really scared of some kind of AI
00:34:06.220 | like coming into being that is conscious or whatever
00:34:09.380 | and can self-replicate, we already do that every day.
00:34:13.180 | It's called humans being born.
00:34:14.580 | They're not artificial.
00:34:15.980 | They're humans, but they're intelligent.
00:34:18.100 | And I don't wanna live in a world
00:34:20.180 | where we're worried about babies being born
00:34:21.980 | because what if they become evil?
00:34:24.260 | - Right.
00:34:25.100 | - What if they become mean people?
00:34:25.940 | What if they're thieves?
00:34:27.720 | Maybe we should just like, what, not have babies born?
00:34:31.820 | Like maybe we shouldn't create AI?
00:34:34.020 | It's like, we'll want to have safeguards in place
00:34:39.020 | in the same way that we know, look,
00:34:41.780 | a kid could be born that becomes some kind of evil person,
00:34:44.420 | but we have laws, right?
00:34:47.940 | - And it's possible that with advanced genetics in general,
00:34:51.620 | be able to, it's a scary thought to say that
00:34:59.020 | my child, if born, would have an 83% chance
00:35:04.020 | of being a psychopath, right?
00:35:08.740 | Like being able to, if it's something genetic,
00:35:11.500 | if there's some sort of, and what to use that information,
00:35:15.060 | what to do with that information
00:35:16.220 | is a difficult ethical thought.
00:35:20.020 | - Yeah, I'd like to find an answer that isn't,
00:35:22.060 | well, let's not have them live.
00:35:24.940 | You know, I'd like to find an answer that is,
00:35:26.860 | well, all human life is worthy.
00:35:30.340 | And if you have an 83% chance of becoming a psychopath,
00:35:33.640 | well, you still deserve dignity.
00:35:37.780 | - Yeah.
00:35:38.620 | - And you still deserve to be treated well.
00:35:42.380 | You still have rights.
00:35:43.340 | - At least at this part of the world, at least in America,
00:35:45.980 | there's a respect for individual life in that way.
00:35:49.540 | That's, well, to me, but again,
00:35:52.600 | I'm in this bubble, is a beautiful thing.
00:35:55.740 | But there's other cultures where individual human life
00:35:58.980 | is not that important, where a society,
00:36:02.660 | so I was born in Soviet Union,
00:36:04.700 | where the strength of nation and society together
00:36:07.420 | is more important than any one particular individual.
00:36:10.260 | So it's an interesting also notion,
00:36:12.020 | the stories we tell ourselves.
00:36:13.500 | I like the one where individuals matter,
00:36:15.940 | but it's unclear that that's what the future holds.
00:36:19.160 | - Well, yeah, and I mean, let me even throw this out.
00:36:21.440 | Like what is artificial intelligence?
00:36:23.800 | How can it be artificial?
00:36:25.180 | I really think that we get pretty obsessed
00:36:28.000 | and stuck on the idea that there is some thing
00:36:30.740 | that is a wild human, a pure human organism
00:36:34.300 | without technology.
00:36:35.780 | But I don't think that's a real thing.
00:36:38.380 | I think that humans and human technology are one organism.
00:36:43.380 | Look at my glasses, okay?
00:36:45.220 | If an alien came down and saw me,
00:36:48.620 | would they necessarily know that this is an invention,
00:36:51.340 | that I don't grow these organically from my body?
00:36:53.940 | They wouldn't know that right away.
00:36:56.240 | And the written word and spoons and cups,
00:37:01.020 | these are all pieces of technology.
00:37:03.140 | We are not alone as an organism.
00:37:08.520 | And so the technology we create,
00:37:10.860 | whether it be video games or artificial intelligence
00:37:13.780 | that can self-replicate and hate us,
00:37:15.840 | it's actually all the same organism.
00:37:18.700 | When you're in a car, where do you end and the car begin?
00:37:21.040 | It seems like a really easy question to answer,
00:37:22.900 | but the more you think about it,
00:37:24.500 | the more you realize, wow,
00:37:25.740 | we are in this symbiotic relationship with our inventions.
00:37:29.840 | And there are plenty of people who are worried about it,
00:37:31.940 | and there should be, but it's inevitable.
00:37:35.700 | - And I think that even just us thinking of ourselves
00:37:38.800 | as individual intelligences may be a silly notion
00:37:43.800 | because it's much better to think of the entirety
00:37:48.500 | of human civilization, all living organisms on earth
00:37:51.100 | as a single living organism,
00:37:53.740 | as a single intelligent creature,
00:37:55.140 | 'cause you're right, everything's intertwined.
00:37:57.340 | Everything is deeply connected.
00:38:00.060 | So we mentioned Elon Musk.
00:38:01.500 | So you're a curious lover of science.
00:38:06.140 | What do you think of the efforts that Elon Musk is doing
00:38:09.860 | with space exploration, with electric vehicles,
00:38:13.180 | with autopilot, sort of getting into the space
00:38:16.580 | of autonomous vehicles, with boring under LA,
00:38:20.320 | and Neuralink trying to communicate
00:38:24.020 | brain machine interfaces, communicate between machines
00:38:27.100 | and human brains?
00:38:28.720 | - Well, it's really inspiring.
00:38:30.940 | I mean, look at the fandom that he's amassed.
00:38:35.820 | It's not common for someone like that
00:38:40.060 | to have such a following.
00:38:41.460 | And so it's-- - Engineering nerd.
00:38:43.240 | - Yeah, so it's really exciting,
00:38:45.520 | but I also think that a lot of responsibility
00:38:47.380 | comes with that kind of power.
00:38:48.540 | So like if I met him, I would love to hear
00:38:51.100 | how he feels about the responsibility he has.
00:38:53.880 | When there are people who are such a fan of your ideas
00:39:00.020 | and your dreams and share them so closely with you,
00:39:04.660 | you have a lot of power.
00:39:06.460 | And he didn't always have that.
00:39:09.720 | He wasn't born as Elon Musk.
00:39:12.040 | Well, he was, but well, he was named that later.
00:39:14.000 | But the point is that I wanna know
00:39:18.460 | the psychology of becoming a figure like him.
00:39:23.460 | Well, I don't even know how to phrase the question right,
00:39:25.700 | but it's a question about what do you do
00:39:28.020 | when you're following, your fans become so large
00:39:33.020 | that it's almost bigger than you?
00:39:38.000 | And how do you responsibly manage that?
00:39:41.060 | And maybe it doesn't worry him at all, and that's fine too.
00:39:43.540 | But I'd be really curious.
00:39:45.500 | And I think there are a lot of people that go through this
00:39:47.700 | when they realize, whoa, there are a lot of eyes on me.
00:39:50.420 | There are a lot of people who really take what I say
00:39:53.940 | very earnestly and take it to heart and will defend me.
00:39:57.740 | And whew, that can be dangerous.
00:40:00.240 | And you have to be responsible with it.
00:40:07.540 | - Both in terms of impact on society
00:40:09.300 | and psychologically for the individual,
00:40:11.300 | just the burden psychologically on Elon?
00:40:15.060 | - Yeah, yeah, how does he think about that
00:40:18.860 | part of his persona?
00:40:21.220 | - Well, let me throw that right back at you
00:40:23.380 | because in some ways you're just a funny guy
00:40:27.540 | that's gotten a humongous following,
00:40:31.700 | a funny guy with a curiosity.
00:40:33.580 | You've got a huge following.
00:40:36.580 | How do you psychologically deal with the responsibility?
00:40:40.100 | In many ways, you have a reach
00:40:42.020 | in many ways bigger than Elon Musk.
00:40:44.560 | What is the burden that you feel in educating,
00:40:49.340 | being one of the biggest educators in the world
00:40:51.980 | where everybody's listening to you?
00:40:53.500 | And actually everybody, like most of the world
00:40:58.380 | that uses YouTube for educational material
00:41:01.060 | trust you as a source of good, strong scientific thinking.
00:41:05.980 | - It's a burden and I try to approach it
00:41:11.020 | with a lot of humility and sharing.
00:41:16.020 | I'm not out there doing a lot of scientific experiments.
00:41:20.320 | I am sharing the work of real scientists
00:41:23.200 | and I'm celebrating their work and the way that they think
00:41:26.440 | and the power of curiosity.
00:41:29.480 | But I wanna make it clear at all times that like,
00:41:32.200 | look, we don't know all the answers
00:41:35.240 | and I don't think we're ever going to reach a point
00:41:37.640 | where we're like, wow, and there you go.
00:41:39.480 | That's the universe.
00:41:40.880 | You solve this equation, you plug in some conditions
00:41:42.740 | or whatever and you do the math
00:41:44.640 | and you know what's gonna happen tomorrow.
00:41:46.120 | I don't think we're ever gonna reach that point
00:41:47.920 | but I think that there is a tendency
00:41:51.920 | to sometimes believe in science and become elitist
00:41:56.100 | and become, I don't know, hard when in reality
00:41:58.860 | it should humble you and make you feel smaller.
00:42:01.780 | I think there's something very beautiful
00:42:03.080 | about feeling very, very small and very weak
00:42:07.420 | and to feel that you need other people.
00:42:10.940 | So I try to keep that in mind and say,
00:42:13.220 | look, thanks for watching.
00:42:14.340 | Vsauce is not, I'm not Vsauce, you are.
00:42:16.700 | When I start the episodes, I say, hey, Vsauce, Michael here.
00:42:20.500 | Vsauce and Michael are actually a different thing in my mind.
00:42:22.780 | I don't know if that's always clear
00:42:24.380 | but yeah, I have to approach it that way
00:42:26.900 | because it's not about me.
00:42:29.040 | - Yeah, so it's not even,
00:42:31.860 | you're not feeling the responsibility.
00:42:33.620 | You're just sort of plugging into this big thing
00:42:36.100 | that is scientific exploration of our reality
00:42:40.020 | and you're a voice that represents a bunch
00:42:42.660 | but you're just plugging into this big Vsauce ball
00:42:47.660 | that others, millions of others are plugged into.
00:42:49.860 | - Yeah, and I'm just hoping to encourage curiosity
00:42:53.060 | and responsible thinking
00:42:56.380 | and an embracement of doubt and being okay with that.
00:43:05.020 | - So next week talking to Cristos Goodrow.
00:43:08.200 | I'm not sure if you're familiar who he is
00:43:09.980 | but he's the VP of engineering,
00:43:11.660 | head of the "YouTube algorithm"
00:43:14.660 | or the search and discovery.
00:43:16.140 | So let me ask, first high level,
00:43:20.180 | do you have a question for him
00:43:25.100 | that if you can get an honest answer that you would ask
00:43:28.860 | but more generally,
00:43:30.140 | how do you think about the YouTube algorithm
00:43:32.620 | that drives some of the motivation behind,
00:43:36.060 | no, some of the design decisions you make
00:43:38.900 | as you ask and answer some of the questions you do?
00:43:42.220 | How would you improve this algorithm
00:43:43.780 | in your mind in general?
00:43:45.140 | So just what would you ask him?
00:43:47.540 | And outside of that,
00:43:49.500 | how would you like to see the algorithm improve?
00:43:52.660 | - Well, I think of the algorithm as a mirror.
00:43:56.780 | It reflects what people put in
00:43:58.940 | and we don't always like what we see in that mirror.
00:44:01.140 | From the individual mirror
00:44:02.780 | to the individual mirror to the society.
00:44:05.460 | - Both.
00:44:06.300 | In the aggregate,
00:44:07.120 | it's reflecting back what people on average want to watch.
00:44:11.400 | And when you see things being recommended to you,
00:44:15.380 | it's reflecting back what it thinks you want to see.
00:44:19.260 | And specifically, I would guess that it's
00:44:22.340 | not just what you want to see,
00:44:23.600 | but what you will click on
00:44:25.600 | and what you will watch some of
00:44:27.980 | and stay on YouTube.
00:44:31.080 | Because,
00:44:32.500 | I don't think that,
00:44:33.340 | this is all me guessing,
00:44:34.880 | but I don't think that YouTube cares
00:44:39.000 | if you only watch like a second of a video.
00:44:41.720 | As long as the next thing you do is open another video.
00:44:45.140 | If you close the app or close the site,
00:44:49.420 | that's a problem for them.
00:44:50.800 | Because they're not a subscription platform.
00:44:52.980 | They're not like, look,
00:44:53.820 | you're giving us 20 bucks a month no matter what,
00:44:56.160 | so who cares?
00:44:57.700 | They need you to watch and spend time there
00:45:00.940 | and see ads.
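(For illustration, a toy sketch of the kind of engagement-driven objective being guessed at here; the field names, weights, and numbers are hypothetical and are not YouTube's actual system.)

```python
from dataclasses import dataclass

# Toy model of an engagement-driven ranking objective. Everything here is a
# hypothetical illustration of the guess above, not a real recommender.
@dataclass
class Candidate:
    title: str
    p_click: float             # predicted probability the viewer clicks
    expected_watch_min: float  # predicted minutes watched if clicked
    p_keep_browsing: float     # predicted probability the viewer opens another video

def engagement_score(c: Candidate) -> float:
    # Expected on-platform minutes, discounted when the session is likely to
    # end afterward (closing the app is the outcome being penalized).
    return c.p_click * c.expected_watch_min * (0.5 + 0.5 * c.p_keep_browsing)

candidates = [
    Candidate("Celebrity reads a love letter", 0.20, 4.0, 0.90),
    Candidate("A 40-minute lesson with equations", 0.03, 25.0, 0.60),
]

for c in sorted(candidates, key=engagement_score, reverse=True):
    print(f"{engagement_score(c):5.2f}  {c.title}")
```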
00:45:02.140 | - So one of the things I'm curious about
00:45:03.680 | whether they do consider longer term,
00:45:07.680 | sort of develop,
00:45:09.740 | your longer term development as a human being,
00:45:12.680 | which I think ultimately will make you feel better
00:45:15.180 | about using YouTube in the long term
00:45:17.520 | and allowing you to stick with it for longer.
00:45:20.000 | Because even if you feed the dopamine rush
00:45:22.440 | in the short term,
00:45:23.360 | and you keep clicking on cat videos,
00:45:25.600 | eventually you sort of wake up like from a drug
00:45:29.020 | and say, I need to quit this.
00:45:30.900 | So I wonder how much they're trying to optimize
00:45:32.720 | for the long term.
00:45:34.040 | Because when I look at the,
00:45:35.840 | your videos aren't exactly sort of, no offense,
00:45:39.440 | but they're not the most clickable.
00:45:41.760 | They're both the most clickable
00:45:44.200 | and I feel I watched the entire thing
00:45:47.320 | and I feel a better human after I watched it.
00:45:49.680 | So they're not just optimizing for the clickability.
00:45:54.000 | So my thought is, how do you think of it?
00:45:59.920 | And does it affect your own content?
00:46:02.260 | Like how deep you go,
00:46:03.300 | how profound you explore the directions and so on.
00:46:07.020 | - I've been really lucky in that I don't worry too much
00:46:11.500 | about the algorithm.
00:46:12.520 | I mean, look at my thumbnails.
00:46:13.780 | I don't really go too wild with them.
00:47:17.220 | And with Mind Field,
00:46:18.140 | where I'm in partnership with YouTube on the thumbnails,
00:46:20.860 | I'm often like, let's pull this back.
00:46:22.420 | Let's be mysterious.
00:46:23.980 | But usually I'm just trying to do
00:46:25.660 | what everyone else is not doing.
00:47:27.540 | So if everyone's doing crazy Photoshopped
00:47:29.740 | kind of thumbnails,
00:47:31.040 | I'm like, what if the thumbnail is just a line?
00:46:34.200 | And what if the title is just a word?
00:46:37.820 | And I kind of feel like all of the Vsauce channels
00:46:41.240 | have cultivated an audience that expects that.
00:46:43.240 | And so they would rather Jake make a video
00:46:45.400 | that's just called stains
00:46:47.080 | than one called I explored stains, shocking.
00:46:51.000 | But there are other audiences out there that want that.
00:46:53.360 | And I think most people kind of want
00:46:57.120 | what you see the algorithm favoring,
00:46:58.840 | which is mainstream traditional celebrity
00:47:02.300 | and news kind of information.
00:47:03.700 | I mean, that's what makes YouTube really different
00:47:05.240 | than other streaming platforms.
00:47:06.720 | No one's like, what's going on in the world?
00:47:08.820 | I'll open up Netflix to find out.
00:47:10.560 | But you do open up Twitter to find that out.
00:47:12.760 | You open up Facebook, you can open up YouTube
00:47:14.880 | 'cause you'll see that the trending videos
00:47:16.280 | are like what happened amongst the traditional
00:47:19.640 | mainstream people in different industries.
00:47:22.180 | That's what's being shown.
00:47:24.100 | And it's not necessarily YouTube saying,
00:47:27.600 | we want that to be what you see.
00:47:29.280 | It's that that's what people click on.
00:47:31.440 | When they see Ariana Grande reads a love letter
00:47:34.280 | from like her high school sweetheart,
00:47:36.320 | they're like, I wanna see that.
00:47:38.040 | And when they see a video from me
00:47:39.360 | that's got some lines and math
00:47:40.560 | and it's called "Laws & Causes,"
00:47:41.840 | they're like, well, I mean, I'm just on the bus.
00:47:46.000 | Like I don't have time to dive into a whole lesson.
00:47:48.680 | So before you get super mad at YouTube,
00:47:52.400 | you should say, really,
00:47:53.680 | they're just reflecting back human behavior.
00:47:55.560 | - Is there something you would improve about the algorithm?
00:47:59.180 | Knowing of course, that as far as we're concerned,
00:48:02.800 | it's a black box, or at least we don't know how it works.
00:48:04.600 | - Right, and I don't think that even anyone at YouTube
00:48:06.800 | really knows what it's doing.
00:48:07.920 | They know what they've tweaked, but then it learns.
00:48:09.840 | I think that it learns and it decides how to behave.
00:48:13.840 | And sometimes the YouTube employees are left going,
00:48:16.680 | I don't know, maybe we should like change the value
00:48:19.640 | of how much it worries about watch time
00:48:22.680 | and maybe it should worry more about something else.
00:48:24.720 | I don't know, but I mean, I would like to see,
00:48:27.420 | I don't know what they're doing and not doing.
00:48:30.720 | - Well, is there a conversation
00:48:32.760 | that you think they should be having just internally,
00:48:35.760 | whether they're having it or not?
00:48:37.340 | Is there something,
00:48:38.960 | should they be thinking about the long-term future?
00:48:41.160 | Should they be thinking about educational content,
00:48:44.480 | whether that's educating
00:48:47.360 | about what just happened in the world today, the news,
00:48:48.960 | or content like what you're providing,
00:48:51.560 | which is asking big, sort of timeless questions
00:48:54.400 | about the way the world works.
00:48:56.600 | - Well, it's interesting, what should they think about?
00:48:59.440 | Because it's called YouTube, not our tube.
00:49:02.640 | And that's why I think they have
00:49:04.720 | so many phenomenal educational creators.
00:49:08.400 | You don't have shows like "3Blue1Brown"
00:49:11.720 | or "Physics Girl" or "Looking Glass Universe"
00:49:14.240 | or "Up and Atom" or "Brain Scoop" or,
00:49:16.560 | I mean, I could go on and on.
00:49:18.760 | They aren't on Amazon Prime and Netflix
00:49:21.240 | and they don't have commissioned shows from those platforms.
00:49:24.040 | It's all organically happening
00:49:25.560 | because there are people out there
00:49:26.680 | that want to share their passion for learning,
00:49:30.160 | that wanna share their curiosity.
00:49:32.540 | And YouTube could promote those kinds of shows more,
00:49:37.480 | but like, first of all,
00:49:39.080 | they probably wouldn't get as many clicks
00:49:43.280 | and YouTube needs to make sure that the average user
00:49:45.360 | is always clicking and staying on the site.
00:49:47.760 | They could still promote it more for the good of society,
00:49:51.080 | but then we're making some really weird claims
00:49:52.760 | about what's good for society
00:49:54.000 | because I think that cat videos
00:49:55.640 | are also an incredibly important part
00:49:58.080 | of what it means to be a human.
00:50:00.400 | I mentioned this quote before from Unamuno about,
00:50:02.920 | look, I've seen a cat like estimate distances
00:50:05.440 | and calculate a jump more often than I've seen a cat cry.
00:50:09.480 | And so things that play with our emotions
00:50:12.480 | and make us feel things can be cheesy and can feel cheap,
00:50:15.400 | but like, man, that's very human.
00:50:18.040 | And so even the dumbest vlog is still so important
00:50:23.800 | that I don't think I have a better claim to take its spot
00:50:27.400 | than it has to have that spot.
00:50:29.880 | - It puts a mirror to us,
00:50:31.640 | the beautiful parts, the ugly parts,
00:50:33.960 | the shallow parts, the deep parts, you're right.
00:50:36.520 | - What I would like to see is,
00:50:38.520 | I miss the days when engaging with content on YouTube
00:50:43.400 | helped push it into my subscribers' timelines.
00:50:47.640 | It used to be that when I liked a video,
00:50:49.600 | say from Veritasium,
00:50:51.200 | it would show up in the feed on the front page of the app
00:50:56.120 | or the website of my subscribers.
00:50:58.380 | And I knew that if I liked a video,
00:51:00.480 | I could send it 100,000 views or more.
00:51:03.520 | That no longer is true.
00:51:05.320 | But I think that was a good user experience.
00:51:07.340 | When I subscribe to someone, when I'm following them,
00:51:09.840 | I want to see more of what they like.
00:51:13.080 | I want them to also curate the feed for me.
00:51:15.400 | And I think that Twitter and Facebook are doing that
00:51:17.920 | in some ways that are also kind of annoying,
00:51:20.360 | but I would like that to happen more.
00:51:22.380 | And I think we would see communities being stronger
00:51:25.200 | on YouTube if it was that way, instead of YouTube going,
00:51:27.320 | well, technically Michael liked this Veritasium video,
00:51:29.800 | but people are way more likely to click on Carpool Karaoke.
00:51:33.620 | So I don't even care who they are, just give them that.
00:51:36.200 | Not saying anything against Carpool Karaoke,
00:51:38.960 | that is an extremely important part of our society,
00:51:43.280 | what it means to be a human on earth, you know, but-
00:51:46.800 | - I'll say it, it sucks, but-
00:51:48.400 | (both laughing)
00:51:49.680 | - But a lot of people would disagree with you
00:51:51.360 | and they should be able to see as much of that as they want.
00:51:53.860 | And I think even people who don't think they like it
00:51:55.680 | should still be really aware of it
00:51:57.080 | 'cause it's such an important thing
00:51:59.400 | and such an influential thing.
00:52:00.920 | But yeah, I just wish that like new channels I discover
00:52:03.320 | and that I subscribe to,
00:52:04.380 | I wish that my subscribers found out about that
00:52:06.960 | because especially in the education community,
00:52:10.040 | a rising tide floats all boats.
00:52:11.620 | If you watch a video from Numberphile,
00:52:14.040 | you're just more likely to wanna watch an episode from me,
00:52:16.760 | whether it be on Vsauce1 or D!NG.
00:52:18.560 | It's not competitive in the way that traditional TV was,
00:52:21.880 | where it's like, well, if you tune into that show,
00:52:23.240 | it means you're not watching mine
00:52:24.600 | 'cause they both air at the same time.
00:52:26.160 | So helping each other out through collaborations
00:52:29.360 | takes a lot of work, but just through engaging,
00:52:31.760 | commenting on their videos, liking their videos,
00:52:34.040 | subscribing to them, whatever,
00:52:36.720 | that I would love to see become easier and more powerful.
00:52:41.540 | - So a quick and impossibly deep question,
00:52:46.080 | last question about mortality.
00:52:48.960 | You've spoken about death as an interesting topic.
00:52:52.600 | Do you think about your own mortality?
00:52:55.960 | - Yeah, every day.
00:52:57.760 | It's really scary.
00:52:59.740 | - So what do you think is the meaning of life
00:53:04.020 | that mortality makes very explicit?
00:53:07.320 | So why are you here on earth, Michael?
00:53:12.320 | What's the point of this whole thing?
00:53:14.520 | (Michael mumbles)
00:53:17.360 | What does mortality in the context of the whole universe
00:53:24.960 | make you realize about yourself?
00:53:26.400 | Just you, Michael Stevens.
00:53:28.080 | - Well, it makes me realize
00:53:31.320 | that I am destined to become a notion.
00:53:35.620 | I'm destined to become a memory.
00:53:37.960 | And we can extend life.
00:53:39.600 | I think there's really exciting things being done
00:53:42.880 | to extend life, but we still don't know how to like,
00:53:46.600 | protect you from some accident that could happen,
00:53:48.860 | some unforeseen thing.
00:53:50.280 | Maybe we could like save my connectome
00:53:54.120 | and like recreate my consciousness digitally.
00:53:56.560 | But even that could be lost
00:53:59.400 | if it's stored on a physical medium or something.
00:54:02.580 | So basically I just think that
00:54:04.520 | embracing and realizing how cool it is
00:54:09.000 | that like someday I will just be an idea
00:54:11.560 | and there won't be a Michael anymore
00:54:13.400 | that can be like, no, that's not what I meant.
00:54:16.240 | It'll just be that people, like,
00:54:17.560 | have to guess what I meant.
00:54:19.640 | And they'll remember me
00:54:21.760 | and how I live on as that memory
00:54:25.840 | will maybe not even be who I wanted to be.
00:54:29.640 | But there's something powerful about that.
00:54:31.980 | And there's something powerful about letting
00:54:34.320 | future people run the show themselves.
00:54:39.000 | I think I'm glad to get out of their way at some point
00:54:42.000 | and say, all right, it's your world now.
00:54:43.720 | - So you, the physical entity, Michael,
00:54:47.320 | have ripple effects in the space of ideas
00:54:54.320 | that far outlive you in ways that you can't control,
00:54:54.320 | but it's nevertheless fascinating to think,
00:54:56.160 | I mean, especially with you,
00:54:57.640 | you can imagine an alien species
00:54:59.240 | when they finally arrive and destroy all of us
00:55:01.800 | would watch your videos to try to figure out
00:55:04.640 | what were the questions that these people--
00:55:06.040 | - But even if they didn't,
00:55:08.700 | I still think that there will be ripples.
00:55:11.560 | When I say memory, I don't specifically mean
00:55:14.640 | people remember my name and my birth date
00:55:17.640 | and there's a photo of me on Wikipedia.
00:55:19.840 | All that can be lost, but I still would hope
00:55:21.840 | that people ask questions and teach concepts
00:55:25.840 | in some of the ways that I have found useful and satisfying.
00:55:28.400 | Even if they don't know that I was the one
00:55:29.840 | who tried to popularize it, that's fine.
00:55:32.640 | But if Earth was completely destroyed,
00:55:35.320 | like burnt to a crisp, everything on it today,
00:55:38.720 | the universe wouldn't care.
00:55:42.760 | Like Jupiter is not gonna go, "Oh no."
00:55:45.640 | And that could happen.
00:55:49.720 | - That could happen.
00:55:50.560 | - So we do, however, have the power to launch things
00:55:55.160 | into space to try to extend how long our memory exists.
00:56:02.960 | And what I mean by that is,
00:56:04.680 | we are recording things about the world
00:56:06.960 | and we're learning things and writing stories
00:56:08.480 | and all of this and preserving that
00:56:10.680 | is truly what I think is the essence of being a human.
00:56:16.560 | We are autobiographers of the universe
00:56:20.640 | and we're really good at it.
00:56:21.720 | We're better than fossils, we're better than light spectra,
00:56:25.560 | we're better than any of that.
00:56:26.800 | We collect much more detailed memories of what's happening,
00:56:31.480 | much better data.
00:56:32.880 | And so that should be our legacy.
00:56:37.360 | And I hope that that's kind of mine too,
00:56:40.200 | in terms of people remembering something
00:56:42.480 | or having some kind of effect.
00:56:44.840 | But even if I don't, you can't not have an effect.
00:56:47.600 | That's the thing, this is not me feeling like,
00:56:49.240 | I hope that I have this powerful legacy.
00:56:50.880 | It's like, no matter who you are, you will.
00:56:53.040 | But you also have to embrace the fact
00:56:57.760 | that that impact might look really small and that's okay.
00:57:01.400 | One of my favorite quotes is from Tess of the D'Urbervilles
00:57:04.520 | and it's along the lines of, the measure of your life
00:57:08.240 | depends not on your external displacement,
00:57:10.960 | but on your subjective experience.
00:57:13.120 | If I am happy and those that I love are happy,
00:57:15.800 | can that be enough?
00:57:17.740 | Because if so, excellent.
00:57:20.120 | - I think there's no better place to end it, Michael.
00:57:23.480 | Thank you so much, it was an honor to meet you.
00:57:25.040 | Thanks for talking to me.
00:57:25.880 | - Thank you, it was a pleasure.
00:57:27.720 | - Thanks for listening to this conversation
00:57:29.200 | with Michael Stevens.
00:57:30.480 | And thank you to our presenting sponsor, Cash App.
00:57:33.340 | Download it, use code LEXPODCAST,
00:57:35.920 | you'll get $10 and $10 will go to FIRST,
00:57:39.000 | a STEM education nonprofit that inspires
00:57:41.320 | hundreds of thousands of young minds
00:57:43.300 | to learn, to dream of engineering our future.
00:57:47.040 | If you enjoy this podcast, subscribe on YouTube,
00:57:49.960 | give it five stars on Apple Podcast,
00:57:51.920 | support it on Patreon or connect with me on Twitter.
00:57:55.520 | And now let me leave you with some words of wisdom
00:57:58.600 | from Albert Einstein.
00:58:00.720 | "The important thing is not to stop questioning.
00:58:03.960 | "Curiosity has its own reason for existence.
00:58:06.860 | "One cannot help but be in awe
00:58:09.160 | "when he contemplates the mysteries of eternity,
00:58:11.660 | "of life, the marvelous structure of reality.
00:58:14.920 | "It is enough if one tries merely to comprehend
00:58:18.020 | "a little of this mystery every day."
00:58:21.080 | Thank you for listening and hope to see you next time.
00:58:24.180 | (upbeat music)