
Dr. Lex Fridman: Machines, Creativity & Love | Huberman Lab Podcast #29


Chapters

0:00 Introduction: Lex Fridman
7:35 What is Artificial Intelligence?
26:46 Machine & Human Learning
32:21 Curiosity
36:55 Storytelling Robots
40:48 What Defines a Robot?
44:30 Magic & Surprise
47:37 How Robots Change Us
49:35 Relationships Defined
62:29 Lex’s Dream for Humanity
71:33 Improving Social Media
76:57 Challenges of Creativity
81:49 Suits & Dresses
82:22 Loneliness
90:09 Empathy
95:12 Power Dynamics In Relationships
99:11 Robot Rights
100:20 Dogs: Homer & Costello
112:41 Friendship
119:47 Russians & Suffering
125:38 Public vs. Private Life
134:04 How To Treat a Robot
137:12 The Value of Friendship
140:33 Martial Arts
151:34 Body-Mind Interactions
153:22 Romantic Love
162:51 The Lex Fridman Podcast
175:54 The Hedgehog
181:17 Concluding Statements

Transcript

00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.280 | where we discuss science and science-based tools
00:00:04.900 | for everyday life.
00:00:05.900 | I'm Andrew Huberman,
00:00:10.420 | and I'm a professor of neurobiology and ophthalmology
00:00:13.220 | at Stanford School of Medicine.
00:00:15.020 | Today, I have the pleasure of introducing Dr. Lex Fridman
00:00:17.900 | as our guest on the Huberman Lab Podcast.
00:00:21.100 | Dr. Fridman is a researcher at MIT
00:00:23.180 | specializing in machine learning, artificial intelligence,
00:00:26.540 | and human-robot interactions.
00:00:29.340 | I must say that the conversation with Lex
00:00:31.820 | was without question,
00:00:33.800 | one of the most fascinating conversations
00:00:35.660 | that I've ever had, not just in my career,
00:00:37.700 | but in my lifetime.
00:00:39.440 | I knew that Lex worked on these topics,
00:00:41.500 | and I think many of you are probably familiar with Lex
00:00:43.940 | and his interest in these topics
00:00:45.180 | from his incredible podcast, the Lex Fridman Podcast.
00:00:48.260 | If you're not already watching that podcast,
00:00:50.240 | please subscribe to it, it is absolutely fantastic.
00:00:53.740 | But in holding this conversation with Lex,
00:00:56.040 | I realized something far more important.
00:00:58.620 | He revealed to us a bit of his dream,
00:01:00.900 | his dream about humans and robots,
00:01:03.360 | about humans and machines,
00:01:04.980 | and about how those interactions can change the way
00:01:07.520 | that we perceive ourselves
00:01:08.940 | and that we interact with the world.
00:01:10.740 | We discuss relationships of all kinds,
00:01:13.060 | relationships with animals, relationships with friends,
00:01:16.420 | relationships with family, and romantic relationships.
00:01:20.420 | And we discuss relationships with machines,
00:01:23.200 | machines that move and machines that don't move,
00:01:26.500 | and machines that come to understand us in ways
00:01:28.980 | that we could never understand for ourselves,
00:01:31.700 | and how those machines can educate us about ourselves.
00:01:35.720 | Before this conversation,
00:01:37.420 | I had no concept of the ways in which machines
00:01:40.060 | could inform me or anyone about themselves.
00:01:43.700 | By the end, I was absolutely taken with the idea,
00:01:46.780 | and I'm still taken with the idea
00:01:48.540 | that interactions with machines of a very particular kind,
00:01:51.920 | a kind that Lex understands and wants to bring to the world,
00:01:55.180 | can not only transform the self,
00:01:57.360 | but may very well transform humanity.
00:02:00.100 | So whether or not you're familiar
00:02:01.340 | with Dr. Lex Fridman,
00:02:03.320 | I'm certain you're going to learn a tremendous amount
00:02:05.340 | from him during the course of our discussion,
00:02:07.500 | and that it will transform the way
00:02:08.940 | that you think about yourself and about the world.
00:02:12.060 | Before we begin, I want to mention
00:02:13.860 | that this podcast is separate
00:02:15.100 | from my teaching and research roles at Stanford.
00:02:17.560 | It is, however, part of my desire and effort
00:02:19.940 | to bring zero cost to consumer information about science
00:02:22.740 | and science-related tools to the general public.
00:02:25.560 | In keeping with that theme,
00:02:26.720 | I'd like to thank the sponsors of today's podcast.
00:02:29.580 | Our first sponsor is Roka.
00:02:31.420 | Roka makes sunglasses and eyeglasses
00:02:33.340 | that are of absolutely phenomenal quality.
00:02:35.620 | The company was founded
00:02:36.500 | by two all-American swimmers from Stanford,
00:02:38.500 | and everything about the sunglasses and eyeglasses
00:02:40.760 | they've designed had performance in mind.
00:02:43.900 | Now, I've spent a career working on the visual system,
00:02:46.180 | and one of the fundamental issues
00:02:47.700 | that your visual system has to deal with
00:02:49.700 | is how to adjust what you see when it gets darker
00:02:52.780 | or brighter in your environment.
00:02:54.600 | With Roka sunglasses and eyeglasses,
00:02:56.820 | whether or not it's dim in the room or outside,
00:02:58.920 | whether or not there's cloud cover,
00:03:00.100 | or whether or not you walk into a shadow,
00:03:01.540 | you can always see the world with absolute clarity.
00:03:04.300 | And that just tells me that they really understand
00:03:06.220 | the way that the visual system works,
00:03:07.620 | processes like habituation and attenuation.
00:03:10.140 | All these things that work at a real mechanistic level
00:03:12.520 | have been built into these glasses.
00:03:14.540 | In addition, the glasses are very lightweight.
00:03:16.740 | You don't even notice really that they're on your face.
00:03:19.060 | And the quality of the lenses is terrific.
00:03:21.780 | Now, the glasses were also designed
00:03:23.500 | so that you could use them not just while working
00:03:25.460 | or at dinner, et cetera, but while exercising.
00:03:28.580 | They don't fall off your face or slip off your face
00:03:30.620 | if you're sweating.
00:03:31.800 | And as I mentioned, they're extremely lightweight,
00:03:33.460 | so you can use them while running,
00:03:34.860 | you can use them while cycling, and so forth.
00:03:37.060 | Also, the aesthetic of Roka glasses is terrific.
00:03:39.680 | Unlike a lot of performance glasses out there,
00:03:41.680 | which frankly make people look like cyborgs,
00:03:44.580 | these glasses look great.
00:03:46.020 | You can wear them out to dinner.
00:03:47.200 | You can wear them for essentially any occasion.
00:03:50.460 | If you'd like to try Roka glasses,
00:03:51.960 | you can go to roka.com, that's R-O-K-A.com,
00:03:55.440 | and enter the code Huberman to save 20% off
00:03:58.080 | your first order.
00:03:59.140 | That's Roka, R-O-K-A.com,
00:04:01.080 | and enter the code Huberman at checkout.
00:04:03.500 | Today's episode is also brought to us by InsideTracker.
00:04:06.880 | InsideTracker is a personalized nutrition platform
00:04:09.400 | that analyzes data from your blood and DNA
00:04:11.980 | to help you better understand your body
00:04:13.560 | and help you reach your health goals.
00:04:15.700 | I am a big believer in getting regular blood work done
00:04:18.220 | for the simple reason that many of the factors
00:04:20.700 | that impact our immediate and long-term health
00:04:23.180 | can only be assessed from a quality blood test.
00:04:25.820 | And now with the advent of quality DNA tests,
00:04:28.460 | we can also get insight into some of our genetic
00:04:30.900 | underpinnings of our current and long-term health.
00:04:34.300 | The problem with a lot of blood and DNA tests out there,
00:04:36.700 | however, is you get the data back
00:04:38.500 | and you don't know what to do with those data.
00:04:40.480 | You see that certain things are high
00:04:41.760 | or certain things are low,
00:04:43.000 | but you really don't know what the actionable items are,
00:04:45.200 | what to do with all that information.
00:04:47.300 | With InsideTracker, they make it very easy
00:04:49.740 | to act in the appropriate ways on the information
00:04:52.580 | that you get back from those blood and DNA tests.
00:04:55.060 | And that's through the use of their online platform.
00:04:57.580 | They have a really easy to use dashboard
00:04:59.860 | that tells you what sorts of things can bring the numbers
00:05:02.880 | for your metabolic factors, endocrine factors, et cetera,
00:05:05.740 | into the ranges that you want and need
00:05:08.040 | for immediate and long-term health.
00:05:10.140 | In fact, I know one individual just by way of example
00:05:13.100 | that was feeling good but decided to go
00:05:15.280 | with an InsideTracker test and discovered
00:05:16.960 | that they had high levels
00:05:17.960 | of what's called C-reactive protein.
00:05:19.800 | They would have never detected that otherwise.
00:05:21.820 | C-reactive protein is associated
00:05:23.360 | with a number of deleterious health conditions,
00:05:25.920 | some heart issues, eye issues, et cetera.
00:05:28.200 | And so they were able to take immediate action
00:05:30.160 | to try and resolve those CRP levels.
00:05:33.400 | And so with InsideTracker, you get that sort of insight.
00:05:35.980 | And as I mentioned before, without a blood or DNA test,
00:05:38.440 | there's no way you're going to get that sort of insight
00:05:40.360 | until symptoms start to show up.
00:05:42.840 | If you'd like to try InsideTracker,
00:05:44.300 | you can go to insidetracker.com/huberman
00:05:47.360 | to get 25% off any of InsideTracker's plans.
00:05:50.260 | You just use the code Huberman at checkout.
00:05:52.680 | That's insidetracker.com/huberman
00:05:55.420 | to get 25% off any of InsideTracker's plans.
00:05:58.700 | Today's podcast is brought to us by Athletic Greens.
00:06:01.560 | Athletic Greens is an all-in-one
00:06:03.200 | vitamin mineral probiotic drink.
00:06:05.640 | I started taking Athletic Greens way back in 2012.
00:06:09.020 | And so I'm delighted that they're sponsoring the podcast.
00:06:11.860 | The reason I started taking Athletic Greens
00:06:13.720 | and the reason I still take Athletic Greens
00:06:15.800 | is that it covers all of my vitamin mineral probiotic bases.
00:06:19.440 | In fact, when people ask me, what should I take?
00:06:22.000 | I always suggest that the first supplement people take
00:06:24.440 | is Athletic Greens for the simple reason
00:06:26.920 | that the things it contains cover your bases
00:06:29.480 | for metabolic health, endocrine health,
00:06:31.880 | and all sorts of other systems in the body.
00:06:33.960 | And the inclusion of probiotics is essential
00:06:36.520 | for a healthy gut microbiome.
00:06:38.900 | There are now tons of data showing that
00:06:40.840 | we have neurons in our gut
00:06:42.820 | and keeping those neurons healthy
00:06:44.240 | requires that they are exposed
00:06:45.620 | to what are called the correct microbiota,
00:06:47.940 | little microorganisms that live in our gut
00:06:49.880 | and keep us healthy.
00:06:51.100 | And those neurons in turn help keep our brain healthy.
00:06:53.860 | They influence things like mood, our ability to focus,
00:06:56.500 | and many, many other factors related to health.
00:06:59.880 | With Athletic Greens, it's terrific
00:07:01.400 | because it also tastes really good.
00:07:03.160 | I drink it once or twice a day.
00:07:04.700 | I mix mine with water and I add a little lemon juice
00:07:07.040 | or sometimes a little bit of lime juice.
00:07:09.380 | If you want to try Athletic Greens,
00:07:11.080 | you can go to athleticgreens.com/huberman.
00:07:14.080 | And if you do that, you can claim their special offer.
00:07:16.680 | They're giving away five free travel packs,
00:07:18.520 | the little packs that make it easy to mix up
00:07:20.420 | Athletic Greens while you're on the road.
00:07:22.660 | And they'll give you a year supply of vitamin D3 and K2.
00:07:26.260 | Again, go to athleticgreens.com/huberman
00:07:28.940 | to claim that special offer.
00:07:30.880 | And now my conversation with Dr. Lex Fridman.
00:07:34.600 | - We meet again.
00:07:35.540 | - We meet again.
00:07:37.420 | Thanks so much for sitting down with me.
00:07:39.600 | I have a question that I think is on a lot
00:07:42.160 | of people's minds or ought to be on a lot of people's minds
00:07:46.400 | because we hear these terms a lot these days,
00:07:50.040 | but I think most people, including most scientists
00:07:53.100 | and including me, don't know really
00:07:56.900 | what is artificial intelligence and how is it different
00:08:01.900 | from things like machine learning and robotics?
00:08:05.240 | So if you would be so kind as to explain to us
00:08:08.900 | what is artificial intelligence and what is machine learning?
00:08:13.900 | - Well, I think that question is as complicated
00:08:17.100 | and as fascinating as the question of what is intelligence.
00:08:21.780 | So I think of artificial intelligence first
00:08:26.480 | as a big philosophical thing.
00:08:28.740 | Pamela McCorduck said AI was born
00:08:32.460 | as an ancient wish to forge the gods.
00:08:41.740 | So I think at the big philosophical level,
00:08:44.260 | it's our longing to create other intelligence systems,
00:08:48.300 | perhaps systems more powerful than us.
00:08:50.400 | At the more narrow level, I think it's also a set of tools
00:08:56.740 | that are computational mathematical tools
00:08:59.260 | to automate different tasks.
00:09:01.100 | And then also it's our attempt to understand our own mind.
00:09:05.620 | So build systems that exhibit some intelligent behavior
00:09:09.940 | in order to understand what is intelligence
00:09:12.900 | in our own selves.
00:09:14.580 | So all those things are true.
00:09:16.160 | Of course, what AI really means as a community,
00:09:19.340 | as a set of researchers and engineers, it's a set of tools,
00:09:22.640 | a set of computational techniques
00:09:25.280 | that allow you to solve various problems.
00:09:29.560 | There's a long history that approaches the problem
00:09:33.000 | from different perspectives.
00:09:34.220 | What's always been throughout one of the threads,
00:09:37.640 | one of the communities goes under the flag
00:09:40.280 | of machine learning, which is emphasizing in the AI space,
00:09:45.280 | the task of learning.
00:09:48.120 | How do you make a machine that knows very little
00:09:50.520 | in the beginning, follow some kind of process
00:09:53.800 | and learns to become better and better in a particular task.
00:09:58.260 | What's been most effective in roughly the last 15 years
00:10:03.260 | is a set of techniques that fall under the flag
00:10:05.680 | of deep learning that utilize neural networks.
00:10:08.680 | What neural networks are, are these fascinating things,
00:10:12.120 | very loosely inspired by the structure of the human brain:
00:10:17.120 | a network of these little basic
00:10:20.540 | computational units called neurons, artificial neurons.
00:10:24.400 | And these architectures have an input and output.
00:10:28.720 | They know nothing in the beginning
00:10:30.120 | and they're tasked with learning something interesting.
00:10:33.500 | What that something interesting is
00:10:35.240 | usually involves a particular task.
00:10:37.580 | There's a lot of ways to talk about this and break this down.
00:10:41.960 | Like one of them is how much human supervision
00:10:45.880 | is required to teach this thing.
00:10:48.340 | So supervised learning is a broad category
00:10:51.860 | where the neural network knows nothing in the beginning.
00:10:56.160 | And then it's given a bunch of examples of,
00:10:59.500 | in computer vision, that will be examples of cats,
00:11:02.020 | dogs, cars, traffic signs.
00:11:04.440 | And then you're given the image
00:11:06.080 | and you're given the ground truth of what's in that image.
00:11:09.560 | And when you get a large database of such image examples
00:11:13.040 | where you know the truth,
00:11:15.040 | the neural network is able to learn by example,
00:11:18.640 | that's called supervised learning.
00:11:20.480 | The question, there's a lot of fascinating questions
00:11:22.760 | within that, which is how do you provide the truth?
00:11:26.040 | When you've given an image of a cat,
00:11:28.780 | how do you provide to the computer
00:11:32.920 | that this image contains a cat?
00:11:34.920 | Do you just say the entire image is a picture of a cat?
00:11:37.980 | Do you do what's very commonly been done,
00:11:40.320 | which is a bounding box.
00:11:41.540 | You have a very crude box around the cat's face
00:11:45.040 | saying this is a cat.
00:11:46.520 | Do you do semantic segmentation?
00:11:48.720 | Mind you, this is a 2D image of a cat.
00:11:51.000 | So it's not a, the computer knows nothing
00:11:53.940 | about our three-dimensional world.
00:11:55.680 | It's just looking at a set of pixels.
00:11:57.340 | So semantic segmentation is drawing a nice,
00:12:01.120 | very crisp outline around the cat and saying that's a cat.
00:12:04.920 | That's really difficult to provide that truth.
00:12:07.080 | And one of the fundamental open questions
00:12:09.480 | in computer vision is,
00:12:10.800 | is that even a good representation of the truth?
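
To make the supervised-learning setup described above concrete, here is a minimal sketch in Python using PyTorch. The tiny network, the random tensors standing in for labeled cat/dog photos, and the training settings are illustrative assumptions, not any particular system discussed in the conversation.

```python
# Minimal sketch of supervised learning: a network that knows nothing at the
# start and learns from (image, ground-truth label) pairs. The data here is
# random noise standing in for real labeled photos of cats and dogs.
import torch
import torch.nn as nn

# Fake "dataset": 256 tiny 3x32x32 images with ground-truth labels 0=cat, 1=dog.
images = torch.randn(256, 3, 32, 32)
labels = torch.randint(0, 2, (256,))

# A small convolutional network (the "artificial neurons" with input and output).
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),            # two outputs: a score for cat, a score for dog
)

loss_fn = nn.CrossEntropyLoss()  # compares predictions to the provided truth
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(5):
    optimizer.zero_grad()
    predictions = model(images)          # forward pass: guess what's in each image
    loss = loss_fn(predictions, labels)  # how wrong the guesses were vs. the labels
    loss.backward()                      # learn by example: adjust weights to reduce error
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

The same skeleton applies whether the "truth" is a whole-image label, a bounding box, or a segmentation mask; only the output head and the loss change.
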
00:12:13.800 | Now there's another, contrasting set of ideas,
00:12:18.560 | that are in tension, that are overlapping,
00:12:21.120 | which is what used to be called unsupervised learning,
00:12:24.340 | what's now commonly called self-supervised learning,
00:12:27.140 | which is trying to get less and less and less
00:12:29.840 | human supervision into the task.
00:12:33.960 | So self-supervised learning is more,
00:12:38.740 | it's been very successful in the domain of a language model,
00:12:42.000 | natural language processing,
00:12:43.180 | and now more and more is being successful
00:12:44.960 | in computer vision tasks.
00:12:46.480 | And what's the idea there is let the machine
00:12:49.640 | without any ground truth annotation,
00:12:53.160 | just look at pictures on the internet
00:12:55.740 | or look at texts on the internet
00:12:57.600 | and try to learn something generalizable about the ideas
00:13:02.600 | that are at the core of language or at the core of vision.
00:13:07.200 | And based on that, learn what we humans at our best
00:13:11.620 | like to call common sense.
00:13:12.900 | So with this, we have this giant base of knowledge
00:13:16.000 | on top of which we build more sophisticated knowledge,
00:13:18.720 | but we have this kind of common sense knowledge.
00:13:21.400 | And so the idea with self-supervised learning
00:13:23.380 | is to build this common sense knowledge
00:13:26.000 | about what are the fundamental visual ideas
00:13:30.440 | that make up a cat and a dog and all those kinds of things
00:13:33.280 | without ever having human supervision.
00:13:35.780 | The dream there is you just let an AI system
00:13:40.780 | that's self-supervised run around the internet for a while,
00:13:44.840 | watch YouTube videos for millions and millions of hours
00:13:47.520 | and without any supervision be primed and ready
00:13:52.080 | to actually learn with very few examples
00:13:54.640 | once the human is able to show up.
00:13:56.680 | We think of children in this way, human children:
00:14:00.160 | your parents only give one or two examples
00:14:03.040 | to teach a concept.
00:14:04.600 | The dream of self-supervised learning
00:14:07.040 | is that would be the same with machines,
00:14:10.080 | that they would watch millions of hours of YouTube videos
00:14:13.960 | and then come to a human and be able to understand
00:14:16.760 | when the human shows them, this is a cat.
00:14:19.640 | Remember this is a cat.
00:14:20.800 | They will understand that a cat is not just a thing
00:14:23.540 | with pointy ears or a cat is a thing that's orange
00:14:27.520 | or is furry, they'll see something more fundamental
00:14:30.800 | that we humans might not actually be able to introspect
00:14:33.480 | and understand.
00:14:34.400 | Like if I asked you what makes a cat versus a dog,
00:14:36.800 | you probably wouldn't be able to answer that.
00:14:39.400 | But if I showed you, brought to you a cat and a dog,
00:14:42.760 | you'll be able to tell the difference.
00:14:44.400 | What are the ideas that your brain uses
00:14:47.060 | to make that difference?
00:14:48.800 | That's the whole dream of self-supervised learning
00:14:51.280 | is it would be able to learn that on its own,
00:14:53.880 | that set of common sense knowledge
00:14:56.080 | that's able to tell the difference.
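
A toy sketch of that self-supervised idea: first learn structure from unlabeled data by predicting the data from a corrupted copy of itself, then attach labels with only a handful of examples. The masking task, the tiny dimensions, and the two-stage split are assumptions made purely for illustration.

```python
# Toy sketch of self-supervised learning: pretrain on unlabeled data by asking
# the network to reconstruct inputs it has partially hidden from itself (no
# human annotation), then learn the actual labels from just a few examples.
import torch
import torch.nn as nn

unlabeled = torch.randn(2048, 16)            # lots of unlabeled "observations"
encoder = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))
decoder = nn.Linear(8, 16)
pretrain_opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-2)

# Stage 1: self-supervision. Mask half of each input and predict the original.
for step in range(200):
    mask = (torch.rand_like(unlabeled) > 0.5).float()
    corrupted = unlabeled * mask
    recon = decoder(encoder(corrupted))
    loss = ((recon - unlabeled) ** 2).mean()   # the ground truth is the data itself
    pretrain_opt.zero_grad()
    loss.backward()
    pretrain_opt.step()

# Stage 2: a human shows up with only a few labeled examples ("this is a cat").
few_x, few_y = torch.randn(8, 16), torch.randint(0, 2, (8,))
head = nn.Linear(8, 2)
finetune_opt = torch.optim.Adam(head.parameters(), lr=1e-2)
for step in range(100):
    logits = head(encoder(few_x).detach())     # reuse the pretrained representation
    loss = nn.functional.cross_entropy(logits, few_y)
    finetune_opt.zero_grad()
    loss.backward()
    finetune_opt.step()
print("fine-tuned with only", len(few_y), "labeled examples")
```
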
00:14:57.880 | And then there's a lot of incredible uses
00:15:01.600 | of self-supervised learning, very weirdly
00:15:05.040 | called the self-play mechanism.
00:15:07.280 | That's the mechanism behind the reinforcement learning
00:15:11.740 | successes of the systems that won at Go, AlphaZero,
00:15:16.740 | that won at chess.
00:15:18.840 | - Oh, I see, that play games.
00:15:20.840 | - That play games. - Got it.
00:15:21.880 | - So the idea of self-play, this probably applies
00:15:26.000 | to other domains than just games,
00:15:27.880 | is a system that just plays against itself.
00:15:30.860 | And this is fascinating in all kinds of domains,
00:15:33.580 | but it knows nothing in the beginning.
00:15:36.600 | And the whole idea is it creates a bunch of mutations
00:15:39.540 | of itself and plays against those versions of itself.
00:15:44.540 | And the fascinating thing is when you play against systems
00:15:50.220 | that are a little bit better than you,
00:15:51.820 | you start to get better yourself.
00:15:53.720 | Like learning, that's how learning happens.
00:15:56.100 | That's true for martial arts, that's true in a lot of cases
00:15:59.240 | where you want to be interacting with systems
00:16:02.040 | that are just a little better than you.
00:16:03.880 | And then through this process of interacting with systems
00:16:06.800 | just a little better than you,
00:16:08.120 | you start following this process where everybody
00:16:10.480 | starts getting better and better and better and better
00:16:12.620 | until you are several orders of magnitude better
00:16:15.480 | than the world champion in chess, for example.
00:16:18.080 | And it's fascinating 'cause it's like a runaway system.
00:16:21.040 | One of the most terrifying and exciting things
00:16:23.560 | that David Silver, the creator of AlphaGo and AlphaZero,
00:16:27.200 | one of the leaders of the team said to me
00:16:31.440 | is they haven't found the ceiling for AlphaZero,
00:16:36.640 | meaning it could just arbitrarily keep improving.
00:16:39.360 | Now in the realm of chess, that doesn't matter to us
00:16:41.840 | that it's like it just ran away with the game of chess.
00:16:44.980 | Like it's like just so much better than humans.
00:16:48.360 | But the question is if you can create that in the realm
00:16:52.120 | that does have a bigger, deeper effect on human beings,
00:16:56.060 | on societies, that can be a terrifying process.
00:16:59.940 | To me, it's an exciting process
00:17:01.760 | if you supervise it correctly.
00:17:03.720 | If you inject what's called value alignment,
00:17:08.720 | you make sure that the goals that the AI is optimizing
00:17:13.500 | are aligned with human beings and human societies.
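
The self-play loop can be sketched in a few lines. This toy version replaces chess or Go with a trivial "game" where skill is a single number, and the mutation size, match count, and acceptance rule are arbitrary assumptions; it only illustrates the keep-the-version-that-wins dynamic, not AlphaZero itself.

```python
# Toy sketch of self-play: keep a frozen champion, spawn slightly mutated
# clones of it, let them play against it, and keep whichever version wins.
# Skill here is just a number; in AlphaZero it is a neural network.
import math
import random

def play(a, b):
    """One 'game': the stronger player wins with probability sigmoid(a - b)."""
    return random.random() < 1.0 / (1.0 + math.exp(-(a - b)))

champion = 0.0                  # knows nothing in the beginning
graveyard = 0                   # mutated versions that didn't make it
for generation in range(1000):
    mutant = champion + random.gauss(0.0, 0.1)   # a slightly different version of itself
    wins = sum(play(mutant, champion) for _ in range(50))
    if wins > 25:               # the mutant is a little better: it becomes the new self
        champion = mutant
    else:
        graveyard += 1
print(f"final skill {champion:.2f}, discarded versions: {graveyard}")
```
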
00:17:17.000 | There's a lot of fascinating things to talk about
00:17:19.200 | within the specifics of neural networks
00:17:23.200 | and all the problems that people are working on.
00:17:25.600 | But I would say the really big exciting one
00:17:28.300 | is self-supervised learning,
00:17:29.840 | where trying to get less and less human supervision
00:17:33.840 | of neural networks.
00:17:38.760 | And also just to comment and I'll shut up.
00:17:41.980 | - No, please keep going.
00:17:43.100 | I'm learning.
00:17:44.120 | I have questions, but I'm learning, so please keep going.
00:17:46.480 | - So that to me, what's exciting is not the theory,
00:17:49.420 | it's always the application.
00:17:51.240 | One of the most exciting applications
00:17:52.920 | of artificial intelligence,
00:17:55.120 | specifically neural networks and machine learning
00:17:57.480 | is Tesla Autopilot.
00:17:59.140 | So these are systems that are working in the real world.
00:18:01.600 | This isn't an academic exercise.
00:18:03.600 | This is human lives at stake.
00:18:05.200 | This is safety critical.
00:18:07.200 | - These are automated vehicles, autonomous vehicles.
00:18:09.480 | - Semi-autonomous, we've gone through wars on these topics.
00:18:14.480 | - Semi-autonomous.
00:18:16.200 | - Semi-autonomous.
00:18:17.040 | So even though it's called FSD, full self-driving,
00:18:22.040 | it is currently not fully autonomous,
00:18:24.960 | meaning human supervision is required.
00:18:27.740 | So human is tasked with overseeing the systems.
00:18:30.920 | In fact, liability wise, the human is always responsible.
00:18:35.240 | This is a human factor psychology question,
00:18:37.960 | which is fascinating.
00:18:39.360 | I'm fascinated by the whole space,
00:18:43.000 | which is a whole nother space of human robot interaction.
00:18:46.160 | When AI systems and humans work together
00:18:48.760 | to accomplish tasks.
00:18:49.960 | That dance to me is one of the smaller communities,
00:18:54.960 | but I think it will be one of the most important
00:18:58.280 | open problems once they're solved,
00:19:00.340 | is how do humans and robots dance together.
00:19:03.160 | To me, semi-autonomous driving is one of those spaces.
00:19:07.860 | So for Elon, for example, he doesn't see it that way.
00:19:11.680 | He sees semi-autonomous driving as a stepping stone
00:19:16.520 | towards fully autonomous driving.
00:19:18.320 | Humans and robots can't dance well together.
00:19:22.720 | Like humans and humans dance and robots and robots dance.
00:19:25.680 | We need to, this is an engineering problem,
00:19:28.060 | we need to design a perfect robot that solves this problem.
00:19:31.680 | To me forever, maybe this is not the case with driving,
00:19:34.120 | but the world is going to be full of problems
00:19:37.140 | where it's always humans and robots have to interact.
00:19:40.380 | Because I think robots will always be flawed,
00:19:43.520 | just like humans are flawed.
00:19:47.460 | And that's what makes life beautiful, that they're flawed.
00:19:51.000 | That's where learning happens
00:19:52.360 | at the edge of your capabilities.
00:19:55.860 | So you always have to figure out how can flawed robots
00:20:00.320 | and flawed humans interact together
00:20:03.800 | such that the whole is bigger than the sum of the parts,
00:20:08.360 | as opposed to focusing on just building the perfect robot.
00:20:11.280 | So that's one of the most exciting applications,
00:20:15.360 | I would say, of artificial intelligence to me,
00:20:17.800 | is autonomous driving and semi-autonomous driving.
00:20:20.640 | And that's a really good example of machine learning,
00:20:23.240 | because those systems are constantly learning.
00:20:26.240 | And there's a process there that maybe I can comment on.
00:20:30.600 | Andrej Karpathy, who's the head of Autopilot,
00:20:34.360 | calls it the data engine.
00:20:36.360 | And this process applies for a lot of machine learning,
00:20:38.900 | which is you build a system
00:20:40.340 | that's pretty good at doing stuff,
00:20:42.200 | you send it out into the real world,
00:20:45.340 | it starts doing the stuff,
00:20:46.820 | and then it runs into what are called edge cases,
00:20:49.180 | like failure cases, where it screws up.
00:20:52.660 | You know, we do this as kids, that's, you know, you have-
00:20:55.520 | - We do this as adults.
00:20:56.360 | - We do this as adults, exactly.
00:20:58.580 | But we learn really quickly.
00:21:00.140 | But the whole point,
00:21:01.340 | and this is the fascinating thing about driving,
00:21:03.660 | is you realize there's millions of edge cases.
00:21:06.120 | There's just like weird situations that you did not expect.
00:21:10.780 | And so the data engine process is
00:21:13.480 | you collect those edge cases,
00:21:15.100 | and then you go back to the drawing board
00:21:17.080 | and learn from them.
00:21:18.440 | And so you have to create this data pipeline
00:21:20.980 | where all these cars, hundreds of thousands of cars,
00:21:23.940 | as you're driving around, and something weird happens.
00:21:27.020 | And so whenever this weird detector fires,
00:21:30.900 | it's another important concept,
00:21:33.380 | that piece of data goes back to the mothership
00:21:37.420 | for the training, for the retraining of the system.
00:21:40.860 | And through this data engine process,
00:21:42.620 | it keeps improving and getting better and better
00:21:44.940 | and better and better.
00:21:45.860 | So basically, you send a pretty clever AI system
00:21:49.340 | out into the world,
00:21:50.900 | and let it find the edge cases.
00:21:54.420 | Let it screw up just enough
00:21:56.580 | to figure out where the edge cases are,
00:21:58.560 | and then go back and learn from them,
00:22:00.800 | and then send out that new version
00:22:02.620 | and keep updating that version.
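
That data engine loop can be written out schematically. Every function name below (deploy_fleet, is_surprising, human_label, retrain) is a hypothetical placeholder invented for illustration, not Tesla's actual pipeline or any real fleet API.

```python
# Schematic sketch of the "data engine" loop: deploy a decent model, flag the
# weird cases it encounters, send them back for human labeling, retrain, repeat.
# Every callable passed in is a hypothetical placeholder.

def data_engine(model, deploy_fleet, is_surprising, human_label, retrain, rounds=10):
    for round_id in range(rounds):
        # 1. Send the current model out into the world and log what it sees.
        driving_logs = deploy_fleet(model)

        # 2. The "weird detector": keep only the edge cases the model struggled with.
        edge_cases = [event for event in driving_logs if is_surprising(model, event)]

        # 3. Humans annotate what actually happened in those weird moments.
        labeled = [(event, human_label(event)) for event in edge_cases]

        # 4. Go back to the drawing board: retrain on the new labels and redeploy.
        model = retrain(model, labeled)
        print(f"round {round_id}: retrained on {len(labeled)} edge cases")
    return model
```
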
00:22:04.220 | - Is the updating done by humans?
00:22:06.320 | - The annotation is done by humans.
00:22:08.520 | So you have to, the weird examples come back,
00:22:13.420 | the edge cases,
00:22:14.800 | and you have to label what actually happened in there.
00:22:17.700 | There's also some mechanisms for automatically labeling,
00:22:22.700 | but mostly I think you always have to rely on humans
00:22:25.860 | to improve, to understand what's happening
00:22:28.100 | in the weird cases.
00:22:29.980 | And then there's a lot of debate.
00:22:31.780 | And that's the other thing,
00:22:32.660 | what is artificial intelligence?
00:22:34.460 | Which is a bunch of smart people
00:22:36.780 | having very different opinions about what is intelligence.
00:22:39.740 | So AI is basically a community of people
00:22:41.900 | who don't agree on anything.
00:22:43.980 | - It seems to be the case.
00:22:45.820 | And first of all, this is a beautiful description of terms
00:22:48.660 | that I've heard many times among my colleagues at Stanford,
00:22:51.900 | at meetings, in the outside world.
00:22:53.760 | And there's so many fascinating things.
00:22:55.940 | I have so many questions,
00:22:56.940 | but I do want to ask one question about the culture of AI,
00:23:00.220 | because it does seem to be a community where,
00:23:02.620 | at least as an outsider,
00:23:03.980 | where it seems like there's very little consensus
00:23:06.140 | about what the terms
00:23:07.300 | and the operational definitions even mean.
00:23:09.620 | And there seems to be a lot of splitting happening now
00:23:12.340 | of not just supervised and unsupervised learning,
00:23:14.900 | but these sort of intermediate conditions
00:23:18.060 | where machines are autonomous,
00:23:20.780 | but then go back for more instruction.
00:23:22.100 | Like kids go home from college during the summer
00:23:24.020 | and get a little, you know, mom still feeds them.
00:23:26.220 | Then eventually they leave the nest kind of thing.
00:23:28.680 | Is there something in particular about engineers
00:23:32.660 | or about people in this realm of engineering
00:23:35.820 | that you think lends itself to disagreement?
00:23:39.060 | - Yeah, I think so. First of all,
00:23:42.180 | the more specific you get, the less disagreement there is.
00:23:44.640 | So there's a lot of disagreement
00:23:45.900 | about what is artificial intelligence,
00:23:47.860 | but there's less disagreement about what is machine learning
00:23:50.620 | and even less when you talk about active learning
00:23:52.680 | or machine teaching or self-supervised learning.
00:23:56.580 | And then when you get into like NLP language models
00:23:59.780 | or transformers,
00:24:00.620 | when you get into specific neural network architectures,
00:24:03.780 | there's less and less and less disagreement
00:24:05.620 | about those terms.
00:24:06.660 | So you might be hearing the disagreement
00:24:08.060 | from the high level terms.
00:24:09.340 | And that has to do with the fact that engineering,
00:24:12.060 | especially when you're talking about intelligent systems
00:24:15.580 | is a little bit of an art and a science.
00:24:20.580 | So the art part is the thing that creates disagreements
00:24:25.320 | because then you start having disagreements
00:24:28.620 | about how easy or difficult the particular problem is.
00:24:33.620 | For example, a lot of people disagree with Elon,
00:24:37.580 | how difficult the problem of autonomous driving is.
00:24:41.120 | And so, but nobody knows.
00:24:43.120 | So there's a lot of disagreement
00:24:44.340 | about what are the limits of these techniques.
00:24:47.160 | And through that, the terminology also contains within it,
00:24:50.560 | the disagreements.
00:24:53.940 | But overall, I think it's also a young science
00:24:56.760 | that also has to do with that.
00:24:58.820 | So like, it's not just engineering,
00:25:01.180 | it's that artificial intelligence truly
00:25:03.960 | as a large scale discipline where it's thousands,
00:25:06.800 | tens of thousands, hundreds of thousands of people
00:25:09.420 | working on it, huge amounts of money being made.
00:25:11.660 | That's a very recent thing.
00:25:13.820 | So we're trying to figure out those terms.
00:25:16.460 | And of course there's egos and personalities
00:25:18.860 | and a lot of fame to be made,
00:25:20.940 | like the term deep learning, for example.
00:25:25.740 | Neural networks have been around for many, many decades,
00:25:28.040 | since the '60s, you can argue since the '40s.
00:25:30.860 | So there was a rebranding of neural networks
00:25:33.440 | into the word deep learning, term deep learning,
00:25:36.880 | that was part of the reinvigoration of the field,
00:25:40.920 | but it's really the same exact thing.
00:25:42.680 | - I didn't know that.
00:25:43.640 | I mean, I grew up in the age of neuroscience
00:25:46.000 | when neural networks were discussed,
00:25:49.060 | computational neuroscience and theoretical neuroscience,
00:25:51.360 | they had their own journals.
00:25:53.060 | It wasn't actually taken terribly seriously
00:25:55.020 | by experimentalists until a few years ago,
00:25:57.220 | I would say about five to seven years ago,
00:26:00.680 | excellent theoretical neuroscientists like Larry Abbott
00:26:03.520 | and other colleagues, certainly at Stanford as well,
00:26:07.460 | that people started paying attention
00:26:08.780 | to computational methods.
00:26:10.340 | But these terms, neural networks, computational methods,
00:26:13.200 | I actually didn't know that neural networks
00:26:15.140 | and deep learning have now become
00:26:18.540 | kind of synonymous.
00:26:19.380 | - No, they were always, no, they're always the same thing.
00:26:22.700 | - Interesting.
00:26:24.160 | - I'm a neuroscientist and I didn't know that.
00:26:25.740 | - So, well, because neural networks probably
00:26:28.060 | mean something else in neuroscience, not something else,
00:26:30.180 | but a little different flavor depending on the field.
00:26:32.620 | And that's fascinating too, because neuroscience and AI,
00:26:36.700 | people have started working together and dancing a lot more
00:26:40.680 | in the recent, I would say probably decade.
00:26:43.020 | - Oh, machines are going into the brain.
00:26:45.020 | I have a couple of questions,
00:26:47.600 | but one thing that I'm sort of fixated on
00:26:49.800 | that I find incredibly interesting is this example you gave
00:26:54.400 | of playing a game with a mutated version of yourself
00:26:58.200 | as a competitor.
00:26:59.420 | - Yeah.
00:27:00.260 | - I find that incredibly interesting as a kind of a parallel
00:27:04.260 | or a mirror for what happens when we try and learn
00:27:06.700 | as humans, which is we generate repetitions
00:27:09.980 | of whatever it is we're trying to learn
00:27:11.920 | and we make errors, occasionally we succeed.
00:27:15.020 | In a simple example, for instance,
00:27:16.580 | of trying to throw bull's eyes on a dartboard.
00:27:18.980 | I'm going to have errors, errors, errors.
00:27:20.460 | I'll probably miss the dartboard
00:27:21.680 | and maybe occasionally hit a bull's eye.
00:27:23.380 | And I don't know exactly what I just did, right?
00:27:26.220 | But then let's say I was playing darts
00:27:28.720 | against a version of myself
00:27:30.340 | where I was wearing a visual prism,
00:27:32.460 | like my visual, I had a visual defect.
00:27:34.840 | You learn certain things in that mode as well.
00:27:38.940 | You're saying that a machine can sort of mutate itself.
00:27:42.880 | Does the mutation always cause a deficiency
00:27:45.100 | that it needs to overcome?
00:27:46.540 | Because mutations in biology
00:27:47.980 | sometimes give us superpowers, right?
00:27:49.580 | Occasionally you'll get somebody
00:27:51.060 | who has better than 20/20 vision
00:27:52.700 | and they can see better than 99.9% of people out there.
00:27:56.420 | So when you talk about a machine playing a game
00:27:59.120 | against a mutated version of itself,
00:28:01.320 | is the mutation always what we call a negative mutation
00:28:04.720 | or an adaptive or a maladaptive mutation?
00:28:07.600 | - No, you don't know until you mutate first
00:28:11.860 | and then figure out and they compete against each other.
00:28:14.460 | - So you're evolving,
00:28:15.780 | the machine gets to evolve itself in real time.
00:28:18.360 | - Yeah, and I think of it, which would be exciting,
00:28:21.800 | if you could actually do with humans.
00:28:23.600 | It's not just, so usually you freeze a version of the system.
00:28:28.600 | So really you take an Andrew of yesterday
00:28:33.920 | and you make 10 clones of them
00:28:35.640 | and then maybe you mutate, maybe not.
00:28:39.000 | And then you do a bunch of competitions
00:28:41.080 | of the Andrew of today.
00:28:42.480 | Like you fight to the death and who wins last.
00:28:45.640 | So I love that idea of creating a bunch of clones of myself
00:28:49.000 | from each of the day for the past year
00:28:52.360 | and just seeing who's going to be better
00:28:54.560 | at like podcasting or science or picking up chicks at a bar
00:28:58.860 | or I don't know, or competing in jujitsu.
00:29:02.040 | That's one way to do it.
00:29:03.120 | I mean, a lot of Lexes would have to die for that process,
00:29:06.360 | but that's essentially what happens
00:29:07.900 | is in reinforcement learning
00:29:09.440 | through the self-playing mechanisms,
00:29:11.480 | it's a graveyard of systems that didn't do that well.
00:29:14.760 | And the surviving, the good ones survive.
00:29:19.760 | - Do you think that, I mean, Darwin's theory of evolution
00:29:22.800 | might have worked in some sense in this way,
00:29:26.300 | but at the population level?
00:29:27.720 | I mean, you get a bunch of birds with different shaped beaks
00:29:29.560 | and some birds have the beak shape
00:29:30.960 | that allows them to get the seeds.
00:29:32.280 | I mean, it's a trivially simple example
00:29:34.880 | of Darwinian evolution, but I think it's correct
00:29:39.280 | even though it's not exhaustive.
00:29:40.800 | Is that what you're referring to?
00:29:42.140 | Essentially, normally this is done
00:29:44.080 | between different members of a species.
00:29:45.560 | Lots of different members of a species have different traits
00:29:47.880 | and some get selected for,
00:29:49.440 | but you could actually create multiple versions of yourself
00:29:52.580 | with different traits.
00:29:53.880 | - So with, I should probably have said this,
00:29:56.440 | but perhaps it's implied,
00:29:59.200 | but with machine learning or with reinforcement learning
00:30:01.200 | through these processes,
00:30:02.480 | one of the big requirements is to have an objective function,
00:30:05.660 | a loss function, a utility function.
00:30:07.800 | Those are all different terms for the same thing.
00:30:10.200 | It's that there's an equation that says what's good.
00:30:15.080 | And then you're trying to optimize that equation.
00:30:17.500 | So there's a clear goal for these systems.
00:30:20.980 | - Because it's a game, like with chess, there's a goal.
00:30:23.920 | - But for anything,
00:30:25.280 | anything you want machine learning to solve,
00:30:27.660 | there needs to be an objective function.
00:30:29.940 | In machine learning, it's usually called loss function
00:30:32.880 | that you're optimizing.
00:30:34.340 | The interesting thing about evolution, it's complicated of course,
00:30:38.700 | but the goal also seems to be evolving.
00:30:41.700 | Like it's a, I guess adaptation to the environment
00:30:44.020 | is the goal, but it's unclear
00:30:46.460 | that you can always convert that.
00:30:48.600 | It's like survival of the fittest.
00:30:52.120 | It's unclear what the fittest is.
00:30:53.840 | In machine learning, the starting point,
00:30:56.800 | and this is like what human ingenuity provides,
00:31:00.460 | is that fitness function of what's good and what's bad,
00:31:04.420 | which it lets you know which of the systems is going to win.
00:31:08.340 | So you need to have an equation like that.
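
For concreteness, this is what such an "equation that says what's good" typically looks like in code: two of the most common loss functions, mean squared error and cross-entropy, evaluated on made-up toy numbers.

```python
# Two common "equations that say what's good": the loss is low when the
# system's outputs agree with the target, high when they don't.
import numpy as np

def mean_squared_error(predicted, target):
    # Good = predictions numerically close to the target values.
    return np.mean((predicted - target) ** 2)

def cross_entropy(predicted_probs, true_class):
    # Good = putting high probability on the correct class.
    return -np.log(predicted_probs[true_class])

print(mean_squared_error(np.array([0.9, 2.1]), np.array([1.0, 2.0])))  # small: good
print(cross_entropy(np.array([0.8, 0.1, 0.1]), true_class=0))          # small: good
print(cross_entropy(np.array([0.1, 0.1, 0.8]), true_class=0))          # large: bad
```
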
00:31:10.920 | One of the fascinating things about humans
00:31:12.840 | is we figure out objective functions for ourselves.
00:31:17.160 | Like we're, it's the meaning of life.
00:31:20.500 | Like why the hell are we here?
00:31:22.960 | And a machine currently has to have
00:31:26.620 | a hard-coded statement about why.
00:31:29.240 | - It has to have a meaning
00:31:30.240 | of artificial intelligence-based life.
00:31:33.240 | - Right, it can't.
00:31:34.520 | So like there's a lot of interesting explorations
00:31:37.580 | about that function being more about curiosity,
00:31:42.400 | about learning new things and all that kind of stuff,
00:31:45.240 | but it's still hard-coded.
00:31:46.680 | If you want a machine to be able to be good at stuff,
00:31:49.600 | it has to be given very clear statements
00:31:53.440 | of what good at stuff means.
00:31:56.080 | That's one of the challenges of artificial intelligence
00:31:58.360 | is you have to formalize the,
00:32:01.600 | in order to solve a problem, you have to formalize it
00:32:04.200 | and you have to provide
00:32:06.080 | both like the full sensory information.
00:32:08.240 | You have to be very clear about
00:32:10.040 | what is the data that's being collected.
00:32:12.600 | And you have to also be clear about the objective function.
00:32:15.860 | What is the goal that you're trying to reach?
00:32:18.840 | And that's a very difficult thing
00:32:20.720 | for artificial intelligence.
00:32:22.100 | - I love that you mentioned curiosity.
00:32:23.940 | I'm sure this definition falls short in many ways,
00:32:26.920 | but I define curiosity as a strong interest
00:32:31.200 | in knowing something,
00:32:33.160 | but without an attachment to the outcome.
00:32:35.680 | You know, it's sort of a, it's not,
00:32:38.040 | it could be a random search,
00:32:39.320 | but there's not really an emotional attachment.
00:32:42.040 | It's really just a desire to discover
00:32:44.040 | and unveil what's there
00:32:45.520 | without hoping it's a gold coin under a rock.
00:32:48.880 | You're just looking under rocks.
00:32:50.640 | Is that more or less how the machine,
00:32:52.800 | you know, within machine learning,
00:32:54.000 | it sounds like there are elements of reward prediction
00:32:57.280 | and, you know, rewards.
00:32:58.560 | The machine has to know when it's done the right thing.
00:33:01.440 | So can you make machines that are curious
00:33:05.280 | or are the sorts of machines that you are describing
00:33:08.000 | curious by design?
00:33:09.360 | - Yeah, curiosity is a kind of a symptom, not the goal.
00:33:15.200 | So what happens is one of the big trade-offs
00:33:21.200 | in reinforcement learning
00:33:22.280 | is this exploration versus exploitation.
00:33:25.540 | So when you know very little,
00:33:27.400 | it pays off to explore a lot,
00:33:29.600 | even suboptimal, like even trajectories
00:33:32.440 | that seem like they're not going to lead anywhere.
00:33:34.620 | That's called exploration.
00:33:36.180 | The smarter and smarter and smarter you get,
00:33:38.600 | the more emphasis you put on exploitation,
00:33:41.820 | meaning you take the best solution.
00:33:44.180 | You take the best path.
00:33:45.780 | Now through that process,
00:33:47.220 | the exploration can look like curiosity by us humans,
00:33:52.220 | but it's really just trying to get out of the local optimal,
00:33:55.600 | the thing that's already discovered.
00:33:57.300 | From an AI perspective,
00:34:00.180 | it's always looking to optimize the objective function.
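
The exploration-versus-exploitation trade-off has a standard minimal form, epsilon-greedy selection on a multi-armed bandit; the decaying epsilon below mirrors "explore a lot when you know little, exploit more as you get smarter". The payoff probabilities and the schedule are arbitrary toy choices.

```python
# Epsilon-greedy bandit: explore a lot while you know little, then shift
# toward exploiting the best-known option as the estimates get better.
import random

true_payoffs = [0.2, 0.5, 0.8]          # hidden quality of each option (toy values)
estimates = [0.0, 0.0, 0.0]             # what the agent believes so far
counts = [0, 0, 0]

for step in range(1, 5001):
    epsilon = 1.0 / step ** 0.5         # exploration fades as experience grows
    if random.random() < epsilon:
        arm = random.randrange(3)                 # exploration: try a possibly suboptimal path
    else:
        arm = estimates.index(max(estimates))     # exploitation: take the best-known path
    reward = 1.0 if random.random() < true_payoffs[arm] else 0.0
    counts[arm] += 1
    estimates[arm] += (reward - estimates[arm]) / counts[arm]   # running average

print("estimated payoffs:", [round(e, 2) for e in estimates])
print("pulls per arm:", counts)
```
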
00:34:04.380 | It derives, and we could talk about this a lot more,
00:34:08.160 | but in terms of the tools of machine learning today,
00:34:11.520 | it derives no pleasure from just the curiosity of like,
00:34:16.520 | I don't know, discovery.
00:34:19.320 | - So there's no dopamine for a machine.
00:34:20.840 | - There's no dopamine.
00:34:21.820 | - There's no reward system, chemical,
00:34:24.000 | or I guess electronic reward system.
00:34:26.880 | - That said, if you look at machine learning literature
00:34:30.360 | and reinforcement learning literature,
00:34:32.080 | they will, like DeepMind,
00:34:34.020 | use terms like dopamine.
00:34:35.740 | We're constantly trying to use the human brain
00:34:38.820 | to inspire totally new solutions to these problems.
00:34:41.820 | So they'll think like,
00:34:42.720 | how does dopamine function in the human brain?
00:34:44.940 | And how can that lead to more interesting ways
00:34:49.120 | to discover optimal solutions?
00:34:51.460 | But ultimately, currently,
00:34:53.200 | there has to be a formal objective function.
00:34:57.460 | Now you could argue that humans
00:34:58.660 | also have a set of objective functions
00:35:00.460 | we're trying to optimize.
00:35:01.860 | We're just not able to introspect them.
00:35:04.500 | - Yeah, we don't actually know what we're looking for
00:35:07.800 | and seeking and doing.
00:35:09.320 | - Well, like Lisa Feldman Barrett,
00:35:10.680 | who you've spoken with, at least on Instagram,
00:35:13.420 | I hope you--
00:35:14.260 | - I met her through you, yeah.
00:35:15.080 | - Yeah, I hope you actually have her on this podcast.
00:35:17.580 | - She's terrific.
00:35:18.800 | - So she has a very,
00:35:20.660 | it has to do with homeostasis, like that,
00:35:25.920 | basically there's a very dumb objective function
00:35:28.980 | that the brain is trying to optimize,
00:35:30.860 | like to keep like body temperature the same.
00:35:32.940 | Like there's a very dumb
00:35:34.300 | kind of optimization function happening.
00:35:36.460 | And then what we humans do with our fancy consciousness
00:35:39.540 | and cognitive abilities is we tell stories to ourselves
00:35:42.320 | so we can have nice podcasts,
00:35:44.060 | but really it's the brain trying to maintain
00:35:46.740 | just like healthy state, I guess.
00:35:50.720 | That's fascinating.
00:35:51.860 | I also see the human brain
00:35:55.520 | and I hope artificial intelligence systems
00:35:58.940 | as not just systems that solve problems or optimize a goal,
00:36:03.940 | but are also storytellers.
00:36:06.260 | I think there's a power to telling stories.
00:36:08.820 | We tell stories to each other.
00:36:10.060 | That's what communication is.
00:36:11.660 | Like when you're alone, that's when you solve problems.
00:36:16.660 | That's when it makes sense to talk about solving problems.
00:36:19.040 | But when you're a community,
00:36:20.820 | the capability to communicate, tell stories,
00:36:24.000 | share ideas in such a way that those ideas
00:36:27.580 | are stable over a long period of time,
00:36:29.980 | that's like, that's being a charismatic storyteller.
00:36:33.280 | And I think we humans are very good at this.
00:36:35.860 | Arguably, that's why we are who we are:
00:36:40.220 | we're great storytellers.
00:36:42.260 | And then AI, I hope will also become that.
00:36:44.780 | So it's not just about being able to solve problems
00:36:47.460 | with a clear objective function.
00:36:49.020 | It's afterwards be able to tell like a way better,
00:36:51.900 | like make up a way better story
00:36:53.360 | about why you did something or why you failed.
00:36:55.760 | - So you think that robots and/or machines of some sort
00:36:59.840 | are going to start telling humans stories?
00:37:02.340 | - Well, definitely.
00:37:03.440 | So the technical field for that is called explainable AI,
00:37:07.340 | explainable artificial intelligence,
00:37:09.320 | is trying to figure out how you get the AI system
00:37:14.160 | to explain to us humans why the hell it failed
00:37:17.520 | or why it succeeded,
00:37:18.860 | or there's a lot of different sort of versions of this,
00:37:22.320 | or to visualize how it understands the world.
00:37:26.300 | That's a really difficult problem,
00:37:28.100 | especially with neural networks that are famously opaque,
00:37:33.100 | that they, we don't understand in many cases
00:37:36.320 | why a particular neural network does what it does so well.
00:37:40.480 | And to try to figure out where it's going to fail,
00:37:43.700 | that requires the AI to explain itself.
00:37:46.240 | There's a huge amount of money,
00:37:49.540 | like there's a huge amount of money in this,
00:37:52.360 | especially from government funding and so on,
00:37:54.560 | because if you want to deploy AI systems in the real world,
00:37:59.460 | we humans at least, want to ask it a question,
00:38:02.640 | like why the hell did you do that?
00:38:04.260 | Like in a dark way, why did you just kill that person, right?
00:38:09.120 | Like if a car ran over a person,
00:38:10.620 | we wouldn't understand why that happened.
00:38:13.000 | And now again, we're sometimes very unfair to AI systems
00:38:17.880 | because we humans can often not explain why very well,
00:38:21.880 | but that's the field of explainable AI.
00:38:25.560 | That's very, people are very interested in,
00:38:28.160 | because the more and more we rely on AI systems,
00:38:31.500 | like the Twitter recommender system, that AI algorithm,
00:38:35.660 | that's, I would say impacting elections,
00:38:39.140 | perhaps starting wars or at least military conflict.
00:38:41.980 | That's that algorithm.
00:38:43.660 | We want to ask that algorithm, first of all,
00:38:46.580 | do you know what the hell you're doing?
00:38:48.500 | Do you know, do you understand
00:38:50.300 | the society level effects you're having?
00:38:52.900 | And can you explain the possible other trajectories?
00:38:55.840 | Like we would have that kind of conversation with a human.
00:38:58.300 | We want to be able to do that with an AI.
00:39:00.020 | And on my own personal level,
00:39:02.020 | I think it would be nice to talk to AI systems
00:39:05.420 | for stupid stuff, like robots, when they fail to-
00:39:10.020 | - Why'd you fall down the stairs?
00:39:12.900 | - Yeah, but not an engineering question,
00:39:15.860 | but almost like a endearing question.
00:39:18.740 | Like, I'm looking for, if I fell
00:39:22.580 | and you and I were hanging out,
00:39:25.020 | I don't think you need an explanation
00:39:28.020 | exactly what were the dynamic,
00:39:29.860 | like what was the under actuated system problem here?
00:39:32.740 | Like what was the texture of the floor or so on?
00:39:36.180 | Or like what was the-
00:39:37.580 | - No, I want to know what you're thinking.
00:39:39.020 | - That, or you might joke about like,
00:39:41.200 | you're drunk again, go home or something.
00:39:43.100 | Like there could be humor in it.
00:39:44.860 | That's an opportunity.
00:39:46.940 | Like storytelling isn't just explanation of what happened.
00:39:51.040 | It's something that makes people laugh,
00:39:54.380 | makes people fall in love,
00:39:56.100 | makes people dream and understand things
00:39:58.980 | in a way that poetry makes people understand things
00:40:01.980 | as opposed to a rigorous log of where every sensor was,
00:40:06.980 | where every actuator was.
00:40:09.540 | - I mean, I find this incredible
00:40:11.460 | because one of the hallmarks of severe autism spectrum
00:40:15.660 | disorders is a report of experience from the autistic person
00:40:20.660 | that is very much a catalog of action steps.
00:40:25.320 | It's like, how do you feel today?
00:40:26.380 | And they'll say, well, I got up and I did this
00:40:27.960 | and then I did this and I did this.
00:40:29.080 | And it's not at all the way that a person with,
00:40:32.180 | who doesn't have autism spectrum disorder would respond.
00:40:35.700 | And the way you describe these machines
00:40:41.100 | has so much humanism, so much of a human
00:40:44.020 | and biological element.
00:40:45.420 | But I realized that we were talking about machines.
00:40:48.020 | I want to make sure that I understand
00:40:51.660 | if there's a distinction between a machine that learns,
00:40:56.660 | a machine with artificial intelligence and a robot.
00:41:01.180 | At what point does a machine become a robot?
00:41:03.840 | So if I have a ballpoint pen,
00:41:06.460 | I'm assuming I wouldn't call that a robot.
00:41:08.660 | But if my ballpoint pen can come to me when it's on,
00:41:13.180 | when I moved to the opposite side of the table,
00:41:15.340 | if it moves by whatever mechanism,
00:41:17.940 | at that point, does it become a robot?
00:41:20.660 | - Okay, there's a million ways to explore this question.
00:41:23.420 | It's a fascinating one.
00:41:25.060 | So first of all, there's a question of what is life?
00:41:28.120 | Like how do you know something is a living form and not?
00:41:32.540 | And it's similar to the question of when does sort of a,
00:41:35.460 | maybe a cold computational system becomes a,
00:41:40.140 | well, we're already loading these words
00:41:41.860 | with a lot of meaning, robot and machine.
00:41:44.140 | But so one, I think movement is important,
00:41:49.140 | but that's kind of a boring idea
00:41:52.340 | that a robot is just a machine
00:41:54.580 | that's able to act in the world.
00:41:56.600 | So one, artificial intelligence could be
00:42:00.040 | both just the thinking thing,
00:42:01.800 | which I think is what machine learning is,
00:42:04.040 | and also the acting thing,
00:42:05.720 | which is what we usually think about robots.
00:42:07.880 | So robots are the things that have a perception system
00:42:10.160 | that's able to take in the world,
00:42:11.580 | however you define the world,
00:42:13.160 | is able to think and learn
00:42:14.600 | and do whatever the hell it does inside
00:42:16.700 | and then act on the world.
00:42:18.700 | So that's the difference between maybe an AI system
00:42:21.600 | or a machine and a robot.
00:42:23.220 | It's something that's able,
00:42:24.280 | a robot is something that's able to perceive the world
00:42:27.220 | and act in the world.
00:42:28.160 | - So it could be through language or sound,
00:42:31.080 | or it could be through movement or both.
00:42:32.660 | - Yeah, and I think it could also be in the digital space,
00:42:36.080 | as long as there's an aspect of an entity
00:42:39.020 | that's inside the machine
00:42:41.240 | and a world that's outside the machine,
00:42:44.200 | and there's a sense in which the machine
00:42:46.440 | is sensing that world and acting in it.
00:42:49.340 | - So we could, for instance,
00:42:50.920 | there could be a version of a robot,
00:42:52.600 | according to the definition that I think you're providing,
00:42:55.400 | where the robot, where I go to sleep at night
00:42:58.240 | and this robot goes and forages for information
00:43:01.440 | that it thinks I want to see
00:43:03.680 | loaded onto my desktop in the morning.
00:43:05.280 | There was no movement of that machine.
00:43:07.000 | There was no language,
00:43:07.840 | but it essentially has movement in cyberspace.
00:43:11.140 | - Yeah, there's a distinction that I think is important
00:43:16.140 | in that there's an element of it being an entity,
00:43:23.400 | whether it's in the digital or the physical space.
00:43:26.620 | So when you have something like Alexa in your home,
00:43:30.560 | most of the speech recognition,
00:43:35.160 | most of what Alexa is doing
00:43:36.720 | is constantly being sent back to the mothership.
00:43:39.160 | When Alexa is there on its own,
00:43:44.940 | that's to me a robot.
00:43:46.940 | When it's there interacting with the world,
00:43:49.460 | when it's simply a finger of the main mothership,
00:43:54.460 | then Alexa is not a robot.
00:43:56.660 | Then it's just an interaction device.
00:43:58.660 | Then maybe the main Amazon Alexa AI,
00:44:02.380 | the big, big system, is the robot.
00:44:04.800 | So that's important because there's some element
00:44:08.840 | to us humans, I think,
00:44:10.600 | where we want there to be an entity,
00:44:12.560 | whether in the digital or the physical space.
00:44:14.740 | That's where ideas of consciousness come in
00:44:16.740 | and all those kinds of things
00:44:18.560 | that we project our understanding
00:44:21.420 | of what it means to be a being.
00:44:23.180 | And so to take that further,
00:44:27.020 | when does a machine become a robot?
00:44:28.980 | I think there's a special moment.
00:44:35.220 | There's a special moment in a person's life,
00:44:37.820 | in a robot's life where it surprises you.
00:44:40.760 | I think surprise is a really powerful thing
00:44:44.260 | where you know how the thing works
00:44:46.820 | and yet it surprises you.
00:44:48.420 | That's a magical moment for us humans.
00:44:51.940 | So whether it's a chess playing program
00:44:54.620 | that does something that you haven't seen before
00:44:57.580 | that makes people smile,
00:44:58.980 | like, huh. Those moments happened with AlphaZero
00:45:03.060 | for the first time in chess,
00:45:05.540 | where grandmasters were really surprised by a move.
00:45:08.740 | They didn't understand the move
00:45:10.220 | and then they studied and studied
00:45:11.580 | and then they understood it.
00:45:13.340 | But that moment of surprise,
00:45:15.260 | that's for grandmasters in chess.
00:45:17.300 | I find that moment of surprise really powerful,
00:45:20.380 | really magical in just everyday life.
00:45:23.300 | - Because it supersedes the human brain in that moment?
00:45:26.460 | - Not supersedes like outperforms,
00:45:30.980 | but surprises you in a positive sense.
00:45:35.380 | Like I didn't think you could do that.
00:45:37.980 | I didn't think that you had that in you.
00:45:40.740 | And I think that moment is a big transition
00:45:44.420 | for a robot from a moment of being a servant
00:45:48.140 | that accomplishes a particular task
00:45:51.140 | with some level of accuracy,
00:45:52.820 | with some rate of failure to an entity,
00:45:57.820 | a being that's struggling just like you are in this world.
00:46:01.900 | And that's a really important moment
00:46:04.420 | that I think you're not gonna find many people
00:46:07.540 | in the AI community that talk like I just did.
00:46:09.900 | I'm not speaking like some philosopher or some hippie.
00:46:14.340 | I'm speaking from a purely engineering perspective.
00:46:16.780 | I think it's really important for robots to become entities
00:46:20.500 | and explore that as a real engineering problem,
00:46:23.380 | as opposed to how everybody in the robotics community
00:46:25.860 | treats robots.
00:46:27.660 | They don't even call them a he or she.
00:46:29.540 | They try to avoid giving them names.
00:46:32.220 | They really want to see it as a system, like a servant.
00:46:36.700 | They see it as a servant that's trying to accomplish a task.
00:46:40.260 | To me, I don't think I'm just romanticizing the notion.
00:46:44.660 | I think it's a being.
00:46:46.180 | It's currently perhaps a dumb being,
00:46:48.880 | but in the long arc of history,
00:46:53.060 | humans are pretty dumb beings too.
00:46:55.180 | - I would agree with that statement.
00:46:57.100 | - So I tend to really want to explore this,
00:47:00.500 | treating robots really as entities.
00:47:04.460 | So like anthropomorphization,
00:47:08.400 | which is the act of looking at an inanimate object
00:47:12.500 | and projecting onto it lifelike features,
00:47:15.460 | I think robotics generally sees that as a negative.
00:47:20.460 | I see it as a superpower.
00:47:23.860 | Like that, we need to use that.
00:47:26.420 | - Well, I'm struck by how that really grabs on
00:47:29.560 | to the relationship between human and machine
00:47:32.800 | or human and robot.
00:47:34.060 | So the simple question is,
00:47:37.400 | and I think you've already told us the answer,
00:47:38.880 | but does interacting with a robot change you?
00:47:43.040 | Does it, in other words, do we develop relationships
00:47:46.580 | to robots?
00:47:48.020 | - Yeah, I definitely think so.
00:47:50.240 | I think the moment you see a robot or AI systems
00:47:55.240 | as more than just servants,
00:47:57.680 | but entities, they begin to change.
00:48:01.240 | Just like good friends do, just like relationships,
00:48:04.880 | just like other humans.
00:48:06.760 | I think for that, you have to have certain aspects
00:48:09.900 | of that interaction.
00:48:11.440 | Like the robot's ability to say no,
00:48:15.820 | to have its own sense of identity,
00:48:19.080 | to have its own set of goals
00:48:21.720 | that's not constantly serving you,
00:48:23.120 | but instead trying to understand the world
00:48:24.920 | and do that dance of understanding
00:48:27.200 | through communication with you.
00:48:28.920 | So I definitely think there's a,
00:48:30.720 | I mean, I have a lot of thoughts about this, as you may know,
00:48:35.200 | and that's at the core of my lifelong dream, actually,
00:48:38.720 | of what I want to do,
00:48:39.840 | which is I believe that most people have
00:48:44.840 | a notion of loneliness in them that we haven't discovered,
00:48:50.280 | that we haven't explored, I should say.
00:48:53.120 | And I see AI systems as helping us explore that
00:48:57.860 | so that we can become better humans,
00:49:00.240 | better people towards each other.
00:49:02.280 | So I think that connection between human and AI,
00:49:06.800 | human and robot is not only possible,
00:49:11.280 | but will help us understand ourselves
00:49:14.520 | in ways that are like several orders of magnitude
00:49:17.520 | deeper than we ever could have imagined.
00:49:21.320 | I tend to believe that,
00:49:22.600 | well, I have very wild levels of belief
00:49:29.360 | in terms of how impactful that would be.
00:49:35.600 | - So when I think about human relationships,
00:49:37.740 | I don't always break them down into variables,
00:49:41.140 | but we could explore a few of those variables
00:49:43.400 | and see how they map to human-robot relationships.
00:49:47.340 | One is just time, right?
00:49:49.160 | If you spend zero time with another person at all
00:49:52.720 | in cyberspace or on the phone or in person,
00:49:55.380 | you essentially have no relationship to them.
00:49:58.120 | If you spend a lot of time, you have a relationship.
00:49:59.600 | This is obvious, but I guess one variable would be time,
00:50:01.800 | how much time you spend with the other entity,
00:50:05.180 | robot or human.
00:50:06.520 | The other would be wins and successes.
00:50:10.000 | You know, you enjoy successes together.
00:50:12.020 | I'll give an absolutely trivial example of this in a moment,
00:50:16.680 | but the other would be failures.
00:50:19.180 | When you struggle with somebody,
00:50:21.520 | whether or not you struggle between one another,
00:50:23.560 | you disagree, like I was really struck by the fact
00:50:25.660 | that you said that robot's saying no.
00:50:27.040 | I've never thought about a robot saying no to me,
00:50:30.080 | but there it is.
00:50:31.540 | - I look forward to you being one of the first people
00:50:34.040 | I send this robot to.
00:50:35.180 | - So do I.
00:50:36.320 | So there's struggle.
00:50:37.640 | You know, when you struggle with somebody, you grow closer.
00:50:41.100 | Sometimes the struggles are imposed
00:50:43.600 | between those two people, so-called trauma bonding,
00:50:45.900 | as they call it in the psychology literature
00:50:48.300 | and pop psychology literature.
00:50:50.120 | But in any case, I could imagine,
00:50:52.200 | so time, successes together, struggle together,
00:50:57.200 | and then just peaceful time, hanging out at home,
00:51:00.960 | watching movies, waking up near one another.
00:51:04.740 | Here, we're breaking down the kind of elements
00:51:08.160 | of relationships of any kind.
00:51:09.840 | So do you think that these elements apply
00:51:13.880 | to robot-human relationships?
00:51:16.400 | And if so, then I could see how if the robot
00:51:21.400 | is its own entity and has some autonomy
00:51:25.880 | in terms of how it reacts to you,
00:51:27.160 | it's not just there just to serve you.
00:51:28.960 | It's not just a servant.
00:51:30.200 | It actually has opinions and can tell you
00:51:32.800 | when maybe your thinking is flawed
00:51:34.440 | or your actions are flawed.
00:51:35.760 | - It can also leave.
00:51:37.160 | - It could also leave.
00:51:39.360 | So I've never conceptualized
00:51:40.920 | robot-human interactions this way.
00:51:42.660 | So tell me more about how this might look.
00:51:46.500 | Are we thinking about a human-appearing robot?
00:51:50.220 | I know you and I have both had intense relationships
00:51:53.600 | to our, we have separate dogs, obviously, but to animals.
00:51:57.240 | This sounds a lot like human-animal interaction.
00:51:59.120 | So what is the ideal human-robot relationship?
00:52:04.120 | - So there's a lot to be said here,
00:52:06.360 | but you actually pinpointed one of the big,
00:52:09.520 | big first steps, which is this idea of time.
00:52:13.760 | And it's a huge limitation
00:52:15.500 | in machine learning community currently.
00:52:17.840 | Now we're back to like the actual details.
00:52:21.460 | Lifelong learning is a problem space
00:52:26.340 | that focuses on how AI systems can learn
00:52:29.580 | over a long period of time.
00:52:30.980 | What most machine learning systems
00:52:35.420 | are currently not able to do is
00:52:37.860 | all of the things you've listed under time,
00:52:39.780 | the successes, the failures,
00:52:41.820 | or just chilling together watching movies,
00:52:44.600 | AI systems are not able to do that,
00:52:47.960 | which is all the beautiful, magical moments
00:52:51.040 | that I believe the days are filled with.
00:52:53.880 | They're not able to keep track of those together with you.
00:52:57.380 | - 'Cause they can't move with you and be with you.
00:52:59.200 | - No, no, like literally we don't have the techniques
00:53:02.060 | to do the learning,
00:53:03.440 | the actual learning of containing those moments.
00:53:07.180 | Current machine learning systems are really focused
00:53:09.960 | on understanding the world in the following way.
00:53:12.480 | It's more like the perception system,
00:53:14.340 | like looking around, understanding what's in the scene,
00:53:19.340 | that there's a bunch of people sitting down,
00:53:22.220 | that there are cameras and microphones,
00:53:24.860 | that there's a table, understand that.
00:53:27.500 | But the fact that we shared this moment of talking today
00:53:30.940 | and still remember that, so that next time
00:53:34.160 | you're doing something,
00:53:36.660 | you remember that this moment happened.
00:53:38.360 | We don't know how to do that technique-wise.
00:53:40.420 | This is what I'm hoping to innovate on
00:53:44.140 | as I think it's a very, very important component
00:53:47.080 | of what it means to create a deep relationship,
00:53:49.940 | that sharing of moments together.
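As a concrete illustration of the "remembering shared moments" idea Lex describes, here is a minimal, hypothetical sketch in Python, not his actual system: an episodic log that a companion could append experiences to and later recall by a rough keyword match. The `Moment` fields, the emotional-tone scale, and the recall heuristic are all assumptions made for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Moment:
    """One shared experience between a person and a companion system."""
    timestamp: datetime
    description: str          # e.g. "late-night snack at the fridge"
    emotional_tone: float     # assumed scale: -1.0 (painful) .. +1.0 (joyful)
    tags: List[str] = field(default_factory=list)

class SharedMemory:
    """Toy episodic store: append moments, recall the ones related to a cue."""
    def __init__(self) -> None:
        self.moments: List[Moment] = []

    def remember(self, moment: Moment) -> None:
        self.moments.append(moment)

    def recall(self, cue: str, k: int = 3) -> List[Moment]:
        # Rank past moments by naive keyword overlap with the cue.
        cue_words = set(cue.lower().split())
        scored = sorted(
            self.moments,
            key=lambda m: len(cue_words & set(m.description.lower().split())),
            reverse=True,
        )
        return scored[:k]

memory = SharedMemory()
memory.remember(Moment(datetime.now(), "ice cream at the fridge after a hard day", -0.2, ["food"]))
memory.remember(Moment(datetime.now(), "celebrated finishing the paper", 0.9, ["success"]))
print([m.description for m in memory.recall("hard day at the fridge")])
```

The point of the sketch is only that "sharing moments" implies persistent, queryable memory of experiences, which is exactly what Lex says current systems lack.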
00:53:52.060 | - Could you post a photo of you and the robot,
00:53:54.060 | like a selfie with the robot, and then the robot sees that image
00:53:58.340 | and recognizes that was time spent,
00:54:00.940 | there were smiles or there were tears,
00:54:03.620 | and creates some sort of metric of emotional depth
00:54:08.620 | in the relationship and updates its behavior?
00:54:11.660 | So could it text you in the middle of the night and say,
00:54:15.160 | "Why haven't you texted me back?"
00:54:16.820 | - Well, yes, all of those things,
00:54:18.620 | but we can dig into that.
00:54:21.420 | But I think that time element, forget everything else,
00:54:24.820 | just sharing moments together, that changes everything.
00:54:29.200 | I believe that changes everything.
00:54:30.940 | There's specific things that are more in terms of systems
00:54:33.560 | that I can explain to you.
00:54:35.360 | It's more technical and probably a little bit offline
00:54:39.380 | 'cause I have kind of wild ideas
00:54:41.300 | how that can revolutionize social networks
00:54:44.760 | and operating systems.
00:54:47.660 | But the point is that element alone,
00:54:50.620 | forget all the other things we're talking about,
00:54:53.180 | like emotions, saying no, all that,
00:54:56.060 | just remember sharing moments together
00:54:58.600 | would change everything.
00:55:00.260 | We don't currently have systems that share moments together.
00:55:05.260 | Like even just you and your fridge,
00:55:08.100 | just all those times you went late at night
00:55:11.060 | and ate the thing you shouldn't have eaten,
00:55:13.300 | that was a secret moment you had with your refrigerator.
00:55:16.680 | You shared that moment, that darkness
00:55:18.860 | or that beautiful moment where you just,
00:55:21.220 | like heartbroken for some reason,
00:55:24.060 | you're eating that ice cream or whatever,
00:55:26.300 | that's a special moment.
00:55:27.620 | And that refrigerator was there for you.
00:55:29.620 | And the fact that it missed the opportunity
00:55:31.820 | to remember that is tragic.
00:55:35.980 | And once it does remember that,
00:55:37.820 | I think you're gonna be very attached to the refrigerator.
00:55:42.780 | You're gonna go through some hell with that refrigerator.
00:55:45.700 | Most of us in the developed world
00:55:49.640 | have weird relationships with food, right?
00:55:51.520 | So you can go through some deep moments
00:55:54.880 | of trauma and triumph with food.
00:55:57.260 | And at the core of that is the refrigerator.
00:55:59.200 | So a smart refrigerator, I believe would change society,
00:56:04.200 | not just the refrigerator,
00:56:06.180 | but these ideas in the systems all around us.
00:56:10.220 | So that, I just wanna comment
00:56:11.900 | on how powerful that idea of time is.
00:56:14.220 | And then there's a bunch of elements of actual interaction
00:56:17.860 | of allowing you as a human to feel like you're being heard,
00:56:22.860 | truly heard, truly understood.
00:56:30.060 | Deep friendship is like that, I think,
00:56:33.020 | but we're still, there's still an element of selfishness.
00:56:36.940 | There's still an element of not really being able
00:56:39.540 | to understand another human.
00:56:40.740 | And a lot of the times when you're going through trauma
00:56:44.440 | together through difficult times and through successes,
00:56:47.580 | you're actually starting to get that inkling
00:56:49.540 | of understanding of each other.
00:56:51.220 | But I think that can be done more aggressively,
00:56:54.040 | more efficiently.
00:56:57.420 | Like if you think of a great therapist,
00:56:59.280 | I've never actually been to a therapist,
00:57:01.820 | but I'm a believer.
00:57:03.200 | I used to want to be a psychiatrist.
00:57:04.960 | - Do Russians go to therapists?
00:57:06.340 | - No, they don't, they don't.
00:57:07.980 | And if they do, the therapists don't live to tell the story.
00:57:12.100 | I do believe in talk therapy, which is what friendship is to me,
00:57:17.700 | or like, you don't necessarily need to talk.
00:57:22.700 | It's like just connecting in the space of ideas
00:57:27.140 | and the space of experiences.
00:57:28.900 | And I think there's a lot of ideas
00:57:30.820 | of how to make AI systems to be able to ask
00:57:33.580 | the right questions and truly hear another human.
00:57:37.320 | This is what we try to do with podcasting, right?
00:57:40.320 | I think there's ways to do that with AI.
00:57:42.620 | But above all else, just remembering
00:57:45.980 | the collection of moments that make up the day,
00:57:49.820 | the week, the months.
00:57:51.980 | I think you maybe have some of this as well.
00:57:55.680 | Some of my closest friends still
00:57:57.180 | are the friends from high school.
00:57:58.780 | That's time, we've been through a bunch of shit together.
00:58:02.960 | And that like we're very different people,
00:58:06.220 | but just the fact that we've been through that
00:58:07.940 | and we remember those moments.
00:58:09.320 | And those moments somehow create a depth of connection
00:58:12.820 | like nothing else, like you and your refrigerator.
00:58:15.840 | - I love that because my graduate advisor,
00:58:20.640 | unfortunately she passed away,
00:58:21.720 | but when she passed away, somebody said at her memorial,
00:58:25.780 | all these amazing things she had done, et cetera.
00:58:29.400 | And then her kids got up there and she had young children
00:58:32.880 | that I knew when she was pregnant with them.
00:58:35.480 | And so it was really, even now,
00:58:37.900 | I can feel like your heart gets heavy thinking about this.
00:58:39.880 | They're going to grow up without their mother.
00:58:41.820 | And it was really amazing.
00:58:42.660 | Very, very strong young girls and now young women.
00:58:47.220 | And what they said was incredible.
00:58:49.300 | They said what they really appreciated most
00:58:51.720 | about their mother, who was an amazing person,
00:58:54.960 | is all the unstructured time they spent together.
00:58:59.480 | So it wasn't the trips to the zoo.
00:59:00.960 | It wasn't, oh, she woke up at five in the morning
00:59:03.600 | and drove us to school.
00:59:04.440 | She did all those things too.
00:59:05.500 | She had a two-hour commute in each direction.
00:59:07.360 | It was incredible, ran a lab, et cetera.
00:59:09.360 | But it was the unstructured time.
00:59:11.440 | So on the passing of their mother,
00:59:13.280 | that's what they remembered as the biggest gift
00:59:16.520 | and what bonded them to her was all the time
00:59:18.380 | where they just kind of hung out.
00:59:20.400 | And the way you described the relationship
00:59:22.000 | to a refrigerator is so, I want to say human-like,
00:59:27.000 | but I'm almost reluctant to say that
00:59:28.740 | because what I'm realizing as we're talking
00:59:31.420 | is that what we think of as human-like
00:59:34.400 | might actually be a lower form of relationship.
00:59:39.000 | There may be relationships that are far better
00:59:42.360 | than the sorts of relationships that we can conceive
00:59:45.420 | in our minds right now,
00:59:46.820 | based on what these machine relationship interactions
00:59:50.060 | could teach us.
00:59:51.100 | Do I have that right?
00:59:52.820 | - Yeah, I think so.
00:59:54.080 | I think there's no reason to see machines
00:59:55.720 | as somehow incapable of teaching us something
00:59:59.900 | that's deeply human.
01:00:01.580 | I don't think humans have a monopoly on that.
01:00:04.980 | I think we understand ourselves very poorly
01:00:06.920 | and we need that kind of prompting from a machine.
01:00:11.920 | And definitely part of that is just remembering the moments.
01:00:16.800 | Remembering the moments.
01:00:17.920 | I think the unstructured time together,
01:00:22.180 | I wonder if it's quite so unstructured.
01:00:27.520 | That's like calling this podcast unstructured time.
01:00:30.520 | - Maybe what they meant was it wasn't a big outing.
01:00:33.860 | It wasn't just, there was no specific goal,
01:00:36.160 | but a goal was created through the lack of a goal.
01:00:39.640 | Like where you just hang out
01:00:40.560 | and then you start playing thumb war
01:00:42.520 | and you end up playing thumb war for an hour there.
01:00:45.080 | So the structure emerges from lack of structure.
01:00:48.900 | - No, but the thing is the moments,
01:00:51.440 | there's something about those times
01:00:54.320 | that create special moments.
01:00:56.440 | And I think that those could be optimized for.
01:01:00.280 | I think we think of like a big outing as, I don't know,
01:01:02.480 | going to Six Flags or something,
01:01:03.800 | or the Grand Canyon, or going to some big thing.
01:01:08.160 | I don't know, I think we would need to,
01:01:11.640 | we don't quite yet understand as humans
01:01:13.800 | what creates magical moments.
01:01:15.680 | I think it's possible to optimize a lot of those things.
01:01:18.320 | And perhaps like podcasting is helping people discover
01:01:21.080 | that like maybe the thing we want to optimize for
01:01:24.200 | isn't necessarily like some sexy, like quick clips.
01:01:29.200 | Maybe what we want is long form authenticity.
01:01:34.520 | - Depth. - Depth.
01:01:36.260 | So we were trying to figure that out.
01:01:38.760 | Certainly from a deep connection
01:01:40.200 | between humans and humans and AI systems,
01:01:44.160 | I think long conversations
01:01:45.920 | or long periods of communication over a series of moments,
01:01:52.280 | like minute, perhaps seemingly insignificant moments,
01:01:56.160 | to the big ones, the big successes, the big failures,
01:02:00.760 | those are all just stitching those together
01:02:03.020 | and talking throughout.
01:02:04.420 | I think that's the formula
01:02:06.600 | for a really, really deep connection
01:02:08.240 | that from like a very specific engineering perspective
01:02:11.940 | is I think a fascinating open problem
01:02:15.560 | that hasn't been really worked on very much.
01:02:18.400 | And for me, if I have the guts,
01:02:21.840 | and I mean, there's a lot of things to say,
01:02:24.800 | but one of them is guts, I'll build a startup around it.
01:02:29.340 | - Yeah, so let's talk about this startup
01:02:32.360 | and let's talk about the dream.
01:02:34.800 | You've mentioned this dream before
01:02:36.040 | in our previous conversations,
01:02:37.280 | always as little hints dropped here and there.
01:02:40.120 | Just for anyone listening,
01:02:41.360 | there's never been an offline conversation about this dream.
01:02:43.720 | I'm not privy to anything except what Lex says now.
01:02:48.160 | And I realized that there's no way
01:02:49.760 | to capture the full essence of a dream
01:02:52.840 | in any kind of verbal statement
01:02:54.520 | in a way that captures all of it.
01:02:58.200 | But what is this dream
01:03:00.960 | that you've referred to now several times
01:03:03.840 | when we've sat down together and talked on the phone?
01:03:06.920 | Maybe it's this company, maybe it's something distinct.
01:03:09.100 | If you feel comfortable,
01:03:10.920 | it'd be great if you could share
01:03:12.120 | a little bit about what that is.
01:03:13.320 | - Sure, so the way people express long-term vision,
01:03:19.060 | I've noticed is quite different.
01:03:20.880 | Like Elon is an example of somebody
01:03:22.660 | who can very crisply say exactly what the goal is.
01:03:27.540 | It also has to do with the fact that the problems he's solving
01:03:29.820 | have nothing to do with humans.
01:03:31.320 | So my long-term vision is a little bit more difficult
01:03:36.400 | to express in words, I've noticed, as I've tried.
01:03:40.840 | It could be my brain's failure.
01:03:42.380 | But there are ways to sneak up to it.
01:03:45.400 | So let me just say a few things.
01:03:48.880 | Early on in life and also in the recent years,
01:03:53.000 | I've interacted with a few robots
01:03:55.160 | where I understood there's magic there.
01:03:57.220 | And that magic could be shared by millions
01:04:01.260 | if it's brought to light.
01:04:04.640 | When I first met Spot from Boston Dynamics,
01:04:07.620 | I realized there's magic there that nobody else is seeing.
01:04:10.400 | - That's the dog?
01:04:11.440 | - The dog, sorry.
01:04:12.280 | Spot is the four-legged robot from Boston Dynamics.
01:04:17.280 | Some people might've seen it, it's this yellow dog.
01:04:20.360 | And sometimes in life,
01:04:25.320 | you just notice something that just grabs you.
01:04:28.440 | And I believe that this is something that,
01:04:31.400 | this magic is something that could be
01:04:34.800 | every single device in the world.
01:04:37.040 | The way that I think maybe Steve Jobs
01:04:41.360 | thought about the personal computer.
01:04:43.160 | Woz didn't think about the personal computer this way,
01:04:46.720 | but Steve did, which is like he thought
01:04:49.400 | that the personal computer should be as thin
01:04:50.960 | as a sheet of paper and everybody should have one.
01:04:53.640 | I mean, this idea, I think it is heartbreaking
01:04:58.640 | that the world is being filled up with machines
01:05:03.840 | that are soulless.
01:05:05.000 | And I think every one of them can have that same magic.
01:05:10.280 | One of the things that also inspired me
01:05:14.800 | in terms of a startup is that magic can be engineered
01:05:17.640 | much easier than I thought.
01:05:19.560 | That's my intuition with everything
01:05:21.320 | I've ever built and worked on.
01:05:23.960 | So the dream is to add a bit of that magic
01:05:28.840 | in every single computing system in the world.
01:05:32.360 | So the way that Windows operating system for a long time
01:05:36.480 | was the primary operating system everybody interacted with.
01:05:39.440 | They built apps on top of it.
01:05:41.320 | I think this is something that should be as a layer,
01:05:45.800 | it's almost as an operating system in every device
01:05:49.040 | that humans interact with in the world.
01:05:51.000 | Now, what that actually looks like, the actual dream,
01:05:54.720 | when I was a kid,
01:05:56.520 | it didn't have this concrete form of a business.
01:06:01.120 | It had more of a dream of exploring your own loneliness
01:06:08.240 | by interacting with machines, robots.
01:06:11.840 | This deep connection between humans and robots
01:06:15.720 | was always a dream.
01:06:17.160 | And so for me, I'd love to see a world
01:06:20.160 | where there's every home has a robot
01:06:22.440 | and not a robot that washes the dishes or a sex robot,
01:06:27.440 | or, I don't know, any kind of activity
01:06:31.440 | you can think of the robot doing, but more like a companion.
01:06:33.840 | - A family member.
01:06:34.800 | - A family member, the way a dog is.
01:06:37.680 | But a dog that's able to speak your language too.
01:06:41.960 | So not just connect the way a dog does
01:06:45.160 | by looking at you and looking away
01:06:46.800 | and almost like smiling with its soul in that kind of way,
01:06:51.120 | but also to actually understand what the hell,
01:06:54.560 | like why are you so excited about the successes?
01:06:56.720 | Like understand the details, understand the traumas.
01:06:59.880 | And I just think that has always filled me with excitement
01:07:05.560 | that I could, with artificial intelligence,
01:07:09.280 | bring joy to a lot of people.
01:07:12.320 | More recently, I've been more and more heartbroken
01:07:16.320 | to see the kind of division, derision,
01:07:21.700 | even hate that's boiling up on the internet
01:07:27.080 | through social networks.
01:07:28.600 | And I thought this kind of mechanism is exactly applicable
01:07:32.760 | in the context of social networks as well.
01:07:34.800 | So it's an operating system that serves
01:07:38.440 | as your guide on the internet.
01:07:43.440 | One of the biggest problems with YouTube
01:07:45.760 | and social networks currently
01:07:47.960 | is they're optimizing for engagement.
01:07:49.880 | I think if you create AI systems
01:07:52.560 | that know each individual person,
01:07:54.920 | you're able to optimize for long-term growth,
01:07:59.080 | for long-term happiness.
01:08:00.800 | - Of the individual?
01:08:01.640 | - Of the individual, of the individual.
01:08:03.720 | And there's a lot of other things to say, which is,
01:08:08.460 | in order for AI systems to learn everything about you,
01:08:14.460 | they need to collect data,
01:08:17.880 | just like you and I, when we talk offline,
01:08:20.000 | we're collecting data about each other,
01:08:21.520 | secrets about each other.
01:08:23.520 | The same way AI has to do that.
01:08:26.600 | And that allows you to,
01:08:28.720 | and that requires you to rethink ideas
01:08:32.720 | of ownership of data.
01:08:35.480 | I think each individual should own all of their data
01:08:39.920 | and very easily be able to leave.
01:08:41.920 | Just like AI systems can leave,
01:08:43.600 | humans can disappear and delete all of their data
01:08:48.200 | in a moment's notice,
01:08:49.380 | which is actually better than we humans can do.
01:08:54.220 | Once we load the data into each other, it's there.
01:08:56.880 | I think it's very important to be both,
01:09:00.440 | give people complete control over their data
01:09:02.880 | in order to establish trust that they can trust you.
01:09:06.200 | And the second part of trust is transparency.
01:09:08.480 | Whenever the data is used to make it very clear
01:09:11.880 | what it's being used for.
01:09:13.040 | And not clear in a lawyerly legal sense,
01:09:16.120 | but clear in a way that people really understand
01:09:18.800 | what it's used for.
01:09:19.640 | I believe when people have the ability
01:09:21.520 | to delete all their data and walk away
01:09:24.160 | and know how the data is being used, I think they'll stay.
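A rough, hypothetical sketch of what "own all of your data, see what it's used for, and be able to delete it in a moment" could look like at the interface level. The class and method names are invented for illustration, and this ignores the hard parts (encryption, distributed copies, consent records, audit logs).

```python
import json
from typing import Any, Dict, List

class UserOwnedStore:
    """A toy per-user data store where the user can always export or erase everything."""

    def __init__(self, user_id: str) -> None:
        self.user_id = user_id
        self._records: List[Dict[str, Any]] = []

    def record(self, kind: str, payload: Dict[str, Any], purpose: str) -> None:
        # Every record carries a plain-language purpose, for transparency.
        self._records.append({"kind": kind, "payload": payload, "purpose": purpose})

    def export_all(self) -> str:
        """Hand the user everything in a portable format."""
        return json.dumps({"user": self.user_id, "records": self._records}, indent=2)

    def delete_all(self) -> int:
        """Erase everything on request; returns how many records were removed."""
        count = len(self._records)
        self._records.clear()
        return count

store = UserOwnedStore("andrew")
store.record("moment", {"note": "late-night fridge visit"}, purpose="personalized companionship")
print(store.export_all())
print("deleted:", store.delete_all())
```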
01:09:29.840 | - The possibility of a clean breakup
01:09:31.880 | is actually what will keep people together.
01:09:33.360 | - Yeah, I think so.
01:09:34.440 | I think, exactly.
01:09:36.400 | I think a happy marriage requires the ability
01:09:39.680 | to divorce easily without the divorce industrial complex
01:09:44.680 | or whatever is currently going on.
01:09:48.040 | There's so much money to be made from lawyers and divorce.
01:09:50.720 | But yeah, the ability to leave is what enables love, I think.
01:09:55.220 | - It's interesting, I've heard the phrase
01:09:57.320 | from a semi-cynical friend that marriage
01:09:59.960 | is the leading cause of divorce.
01:10:01.580 | But now we've heard that divorce
01:10:03.560 | or the possibility of divorce
01:10:04.960 | could be the leading cause of marriage.
01:10:06.620 | - Of a happy marriage.
01:10:08.100 | - Good point.
01:10:08.940 | - Of a happy marriage.
01:10:09.920 | So yeah, but there's a lot of details there.
01:10:12.740 | But the big dream is that connection
01:10:14.680 | between AI system and a human.
01:10:17.480 | And I haven't, you know, there's so much fear
01:10:21.040 | about artificial intelligence systems and about robots
01:10:23.880 | that I haven't quite found the right words
01:10:26.400 | to express that vision because the vision I have is one,
01:10:31.320 | it's not like some naive delusional vision
01:10:33.480 | of like technology is going to save everybody.
01:10:36.440 | It's, I really do just have a positive view
01:10:40.320 | of ways AI systems can help humans explore themselves.
01:10:44.720 | - I love that positivity.
01:10:46.040 | And I agree that the stance
01:10:49.120 | that everything is doomed is equally as bad
01:10:54.400 | as saying that everything's going to turn out all right.
01:10:56.560 | There has to be a dedicated effort.
01:10:58.280 | And clearly you're thinking about
01:11:00.620 | what that dedicated effort would look like.
01:11:02.520 | You mentioned two aspects to this dream.
01:11:06.600 | And I want to make sure that I understand
01:11:07.800 | where they connect if they do
01:11:10.040 | or if these are independent streams.
01:11:12.400 | One was this hypothetical robot family member
01:11:17.240 | or some other form of robot that would allow people
01:11:20.240 | to experience the kind of delight
01:11:23.400 | that you experienced many times
01:11:27.040 | and that you would like the world to be able to have.
01:11:30.280 | And it's such a beautiful idea of this gift.
01:11:33.480 | And the other is social media or social network platforms
01:11:38.160 | that really serve individuals and their best selves
01:11:42.340 | and their happiness and their growth.
01:11:44.160 | Is there crossover between those
01:11:45.600 | or are these two parallel dreams?
01:11:47.040 | - It's 100% the same thing.
01:11:48.480 | It's difficult to kind of explain without going through
01:11:51.560 | details, but maybe one easy way to explain
01:11:54.620 | the way I think about social networks
01:11:56.700 | is to create an AI system that's yours.
01:11:59.660 | That's yours.
01:12:00.680 | It's not like Amazon Alexa that's centralized.
01:12:03.240 | You own the data.
01:12:04.560 | It's like your little friend
01:12:07.440 | that becomes your representative on Twitter
01:12:11.320 | that helps you find things that will make you feel good,
01:12:16.200 | that will also challenge your thinking to make you grow,
01:12:19.880 | but not let you get lost in the negative spiral
01:12:24.880 | of dopamine that gets you to be angry
01:12:29.440 | or mostly just gets you to not be open to learning.
01:12:34.240 | And so that little representative
01:12:36.240 | is optimizing your long-term health.
01:12:38.780 | And I believe that that is not only good for human beings,
01:12:45.360 | it's also good for business.
01:12:47.360 | I think long-term you can make a lot of money
01:12:50.480 | by challenging this idea that the only way to make money
01:12:54.440 | is maximizing engagement.
01:12:57.360 | And one of the things that people disagree with me on
01:12:59.680 | is they think Twitter's always going to win.
01:13:02.400 | Like maximizing engagement is always going to win.
01:13:04.880 | I don't think so.
01:13:06.280 | I think people have woken up now to understanding
01:13:09.500 | that they don't always feel good.
01:13:12.760 | The ones who are on Twitter a lot,
01:13:16.220 | they don't always feel good at the end of the week.
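To make the engagement-versus-wellbeing trade-off concrete, here is a small hypothetical ranking sketch: instead of ordering a feed purely by predicted engagement, a personal agent could blend in a signal for how the user tends to feel afterwards. The signals, weights, and item names are illustrative assumptions, not a description of Twitter, YouTube, or any planned product.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Item:
    title: str
    predicted_engagement: float   # e.g. click/watch probability in [0, 1]
    predicted_wellbeing: float    # how the user tends to feel afterwards, in [-1, 1]

def rank_feed(items: List[Item], wellbeing_weight: float = 0.7) -> List[Item]:
    """Rank items by a blend of engagement and long-term wellbeing.

    wellbeing_weight = 0 reproduces pure engagement ranking;
    values near 1 prioritize how the user reports feeling afterwards.
    """
    def score(item: Item) -> float:
        return ((1 - wellbeing_weight) * item.predicted_engagement
                + wellbeing_weight * item.predicted_wellbeing)
    return sorted(items, key=score, reverse=True)

feed = [
    Item("outrage thread", predicted_engagement=0.95, predicted_wellbeing=-0.6),
    Item("long-form conversation", predicted_engagement=0.40, predicted_wellbeing=0.8),
]
print([item.title for item in rank_feed(feed)])
```

With the default weight, the long-form conversation outranks the outrage thread even though its predicted engagement is far lower, which is the behavior change Lex is arguing for.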
01:13:19.300 | - I would love feedback from whatever this creature,
01:13:24.000 | whatever, I can't, I don't know what to call it,
01:13:26.720 | as to maybe at the end of the week,
01:13:28.720 | it would automatically unfollow
01:13:30.520 | some of the people that I follow
01:13:31.940 | because it realized through some really smart data
01:13:35.600 | about how I was feeling inside
01:13:37.120 | or how I was sleeping or something
01:13:38.560 | that that just wasn't good for me,
01:13:40.600 | but it might also put things and people in front of me
01:13:43.600 | that I ought to see.
01:13:45.500 | Is that the kind of sliver of what this looks like?
01:13:48.640 | - The whole point, because of the interaction,
01:13:50.540 | because of sharing the moments
01:13:53.580 | and learning a lot about you,
01:13:55.100 | you're now able to understand
01:13:59.560 | what interactions led you to become
01:14:01.960 | a better version of yourself.
01:14:04.280 | Like the person you yourself are happy with.
01:14:07.280 | This isn't, if you're into a flat earth
01:14:10.080 | and you feel very good about it,
01:14:12.800 | that you believe the earth is flat,
01:14:15.440 | like the idea that that should be censored is ridiculous.
01:14:19.600 | If it makes you feel good
01:14:21.020 | and you're becoming the best version of yourself,
01:14:23.440 | I think you should be getting
01:14:24.600 | as much flat earth as possible.
01:14:26.420 | Now it's also good to challenge your ideas,
01:14:29.320 | but not because a centralized committee decided,
01:14:34.320 | but because you tell the system
01:14:37.280 | that you like challenging your ideas.
01:14:39.280 | I think all of us do.
01:14:40.800 | And then, which actually YouTube doesn't do that well,
01:14:44.040 | once you go down the flat earth rabbit hole,
01:14:45.900 | that's all you're gonna see.
01:14:47.160 | It's nice to get some really powerful communicators
01:14:51.640 | to argue against flat earth.
01:14:53.600 | And it's nice to see that for you
01:14:57.000 | and potentially at least long-term to expand your horizons.
01:15:01.100 | Maybe the earth is not flat,
01:15:03.240 | but if you continue to live your whole life
01:15:05.160 | thinking the earth is flat, I think,
01:15:08.100 | and you're being a good father or son or daughter
01:15:11.680 | and you're being the best version of yourself
01:15:14.360 | and you're happy with yourself, I think the earth is flat.
01:15:18.940 | So I think this kind of idea,
01:15:21.360 | and I'm just using that kind of silly, ridiculous example
01:15:24.220 | because I don't like the idea of centralized forces
01:15:29.220 | controlling what you can and can't see.
01:15:33.240 | But I also don't like this idea of not censoring anything
01:15:39.660 | because the biggest problem with that
01:15:42.540 | is that there's always a central decider.
01:15:45.820 | I think you yourself can decide
01:15:47.600 | what you want to see and not.
01:15:49.480 | And it's good to have a companion that reminds you
01:15:54.020 | that you felt shitty last time you did this,
01:15:56.500 | or you felt good last time you did this.
01:15:58.580 | - Man, I feel like in every good story,
01:16:00.120 | there's a guide or a companion that flies out
01:16:03.540 | or forages a little bit further or a little bit differently
01:16:06.440 | and brings back information that helps us
01:16:08.420 | or at least tries to steer us in the right direction.
01:16:11.360 | - So that's exactly what I'm thinking
01:16:16.360 | and what I've been working on.
01:16:17.800 | I should mention there's a bunch of difficulties here.
01:16:20.820 | You've seen me up and down a little bit recently.
01:16:23.760 | So there's technically a lot of challenges here.
01:16:27.600 | Like with a lot of technologies,
01:16:30.480 | the reason I'm talking about it on a podcast comfortably,
01:16:34.360 | as opposed to working in secret, is it's really hard
01:16:38.500 | and maybe its time has not come.
01:16:41.000 | And that's something you have to constantly struggle with
01:16:44.140 | in terms of like entrepreneurially as a startup.
01:16:46.980 | Like I've also mentioned to you maybe offline,
01:16:50.660 | I really don't care about money.
01:16:52.540 | I don't care about business success,
01:16:55.300 | all those kinds of things.
01:16:56.600 | So it's a difficult decision to make: how much of your time
01:17:03.400 | do you want to go all in here and give everything to this?
01:17:06.780 | It's a big roll of the dice
01:17:09.780 | because I've also realized that working
01:17:12.540 | on some of these problems, both with the robotics
01:17:16.660 | and the technical side in terms of the machine learning
01:17:20.660 | system that I'm describing, it's lonely, it's really lonely
01:17:25.660 | because both on a personal level and a technical level.
01:17:31.420 | So on a technical level, I'm surrounded by people
01:17:33.580 | that kind of doubt me,
01:17:37.900 | which I think all entrepreneurs go through.
01:17:40.540 | And they doubt you in the following sense.
01:17:42.960 | They know how difficult it is.
01:17:46.800 | Like the people that, the colleagues of mine,
01:17:49.540 | they know how difficult lifelong learning is.
01:17:52.300 | They also know how difficult it is to build a system
01:17:54.740 | like this, to build a competitive social network.
01:17:59.380 | And in general, there's a kind of loneliness
01:18:04.060 | to just working on something on your own
01:18:08.800 | for long periods of time.
01:18:10.260 | And you start to doubt yourself, given that you don't have
01:18:14.280 | a track record of success. Like, that's a big one.
01:18:17.980 | When you look in the mirror, especially when you're young,
01:18:20.500 | but I still have that on most things,
01:18:22.660 | you look in the mirror and you have these big dreams.
01:18:26.500 | How do you know you're actually as smart
01:18:30.780 | as you think you are?
01:18:32.480 | Like, how do you know you're going to be able
01:18:34.220 | to accomplish this dream?
01:18:35.420 | You have this ambition.
01:18:36.540 | - You sort of don't, but you're kind of pulling on a string
01:18:40.920 | hoping that there's a bigger ball of yarn.
01:18:42.960 | - Yeah, but you have this kind of intuition.
01:18:45.160 | I think I pride myself in knowing what I'm good at
01:18:51.360 | because the reason I have that intuition
01:18:54.460 | is 'cause I think I'm very good at knowing
01:18:57.540 | all the things I suck at, which is basically everything.
01:19:01.280 | So like whenever I notice like, wait a minute,
01:19:04.520 | I'm kind of good at this, which is very rare for me.
01:19:08.180 | I think like that might be a ball of yarn worth pulling at.
01:19:11.500 | And the thing with the, in terms of engineering systems
01:19:14.760 | that are able to interact with humans,
01:19:16.540 | I think I'm very good at that.
01:19:18.520 | And it's 'cause we're talking about podcasting and so on.
01:19:22.040 | I don't know if I'm very good at podcasts.
01:19:23.620 | - You're very good at podcasting.
01:19:25.200 | - But I certainly don't think so.
01:19:27.180 | I think maybe it is compelling for people to watch
01:19:31.460 | a kind-hearted idiot struggle with this form.
01:19:35.360 | Maybe that's what's compelling.
01:19:37.180 | But in terms of like actual being a good engineer
01:19:40.580 | of human-robot interaction systems, I think I'm good.
01:19:45.540 | But it's hard to know until you do it
01:19:47.980 | and then the world keeps telling you you're not.
01:19:50.380 | And it's just full of doubt.
01:19:52.900 | And it's really hard.
01:19:53.820 | And I've been struggling with that recently.
01:19:55.540 | It's kind of a fascinating struggle,
01:19:57.180 | but then that's where the Goggins thing comes in
01:19:59.780 | is like, aside from the stay hard motherfucker,
01:20:03.900 | is the like, whenever you're struggling,
01:20:07.860 | that's a good sign that if you keep going,
01:20:11.180 | that you're going to be alone in the success, right?
01:20:14.580 | - Well, in your case, however, I agree.
01:20:18.660 | And actually David had a post recently
01:20:20.340 | that I thought was among his many brilliant posts
01:20:23.220 | was one of the more brilliant about how,
01:20:26.120 | you know, he talked about this myth of the light
01:20:27.860 | at the end of the tunnel.
01:20:29.520 | And instead what he replaced that myth with
01:20:33.460 | was a concept that eventually your eyes adapt to the dark.
01:20:38.460 | That the tunnel, it's not about a light at the end,
01:20:40.320 | that it's really about adapting to the dark of the tunnel.
01:20:42.780 | It was very Goggins- - I love him so much.
01:20:44.820 | - Yeah, you guys share a lot in common,
01:20:48.880 | knowing you both a bit, you know, share a lot in common.
01:20:52.500 | But in this loneliness and the pursuit of this dream,
01:20:57.500 | it seems to me it has a certain component to it
01:21:01.320 | that is extremely valuable,
01:21:04.060 | which is that the loneliness itself
01:21:06.700 | could serve as a driver to build the companion
01:21:09.720 | for the journey.
01:21:10.560 | - Well, I'm very deeply aware of that.
01:21:12.800 | So like some people can make,
01:21:17.280 | 'cause I talk about love a lot,
01:21:18.620 | I really love everything in this world,
01:21:21.900 | but I also love humans, friendship and romantic, you know,
01:21:26.620 | like even the cheesy stuff, just-
01:21:31.020 | - You like romantic movies.
01:21:32.460 | - Yeah, not those, not necessarily.
01:21:35.140 | - Well, I got so much shit from Rogan about like,
01:21:38.340 | what is it, the tango scene from "Scent of a Woman."
01:21:41.100 | But yes, I find like there's nothing better
01:21:43.980 | than a woman in a red dress, like, you know,
01:21:47.500 | it's just like classy.
01:21:49.420 | - You should move to Argentina, my friend.
01:21:51.020 | You know, my father is Argentine.
01:21:52.240 | And you know what he said when I went on your podcast
01:21:54.780 | for the first time, he said, "He dresses well."
01:21:58.340 | Because in Argentina, the men go to a wedding
01:22:00.240 | or a party or something.
01:22:01.460 | You know, in the US, by halfway through the night,
01:22:03.660 | 10 minutes in the night, all the jackets are off.
01:22:05.780 | It looks like everyone's undressing for the party
01:22:07.600 | they just got dressed up for.
01:22:08.940 | And he said, "You know, I like the way he dresses."
01:22:11.620 | And then when I started, he was talking about you.
01:22:13.180 | And then when I started my podcast, he said,
01:22:15.740 | "Why don't you wear a real suit like your friend Lex?"
01:22:19.400 | [laughing]
01:22:20.460 | - I remember that.
01:22:21.300 | - Any case.
01:22:22.140 | But let's talk about this pursuit just a bit more,
01:22:27.820 | because I think what you're talking about is building a,
01:22:31.640 | not just a solution for loneliness,
01:22:33.900 | but you've alluded to the loneliness
01:22:36.060 | as itself an important thing.
01:22:37.800 | And I think you're right.
01:22:38.640 | I think within people, there is like caverns of thoughts
01:22:44.100 | and shame, but also just the desire to be,
01:22:47.660 | to have resonance, to be seen and heard.
01:22:51.220 | And I don't even know that it's seen
01:22:52.420 | and heard through language.
01:22:53.780 | But these reservoirs of loneliness, I think,
01:22:58.200 | well, they're interesting.
01:23:01.340 | Maybe you could comment a little bit about it,
01:23:02.620 | because just as often as you talk about love,
01:23:05.160 | I haven't quantified it,
01:23:06.100 | but it seems that you talk about this loneliness.
01:23:08.440 | And maybe you just would, if you're willing,
01:23:10.180 | you could share a little bit more about that
01:23:12.220 | and what that feels like now in the pursuit
01:23:15.300 | of building this robot-human relationship.
01:23:18.360 | And you've been, let me be direct.
01:23:20.460 | You've been spending a lot of time
01:23:21.940 | on building a robot-human relationship.
01:23:25.420 | Where's that at?
01:23:27.220 | - Oh, in terms of business and in terms of systems?
01:23:33.380 | - No, I'm talking about a specific robot.
01:23:35.540 | - Oh, robot.
01:23:37.100 | So, okay, I should mention a few things.
01:23:39.860 | So, one is there's a startup where there's an idea
01:23:42.660 | where I hope millions of people can use.
01:23:46.000 | And then there's my own personal,
01:23:49.260 | almost like Frankenstein explorations
01:23:52.060 | with particular robots.
01:23:55.060 | So, I'm very fascinated with the legged robots
01:23:59.220 | in my own private, sounds like dark,
01:24:03.140 | but n-of-one experiments to see if I can recreate the magic.
01:24:09.620 | And that's been, I already have a lot of really good
01:24:14.160 | perception systems and control systems
01:24:17.100 | that are able to communicate affection
01:24:19.240 | in a dog-like fashion.
01:24:20.780 | So, I'm in a really good place there.
01:24:22.520 | The stumbling blocks,
01:24:23.560 | which also have been part of my sadness recently,
01:24:27.080 | is that I also have to work with robotics companies.
01:24:30.740 | That I gave so much of my heart, soul,
01:24:34.620 | and love and appreciation towards Boston Dynamics,
01:24:37.940 | but Boston Dynamics is also,
01:24:41.940 | is a company that has to make a lot of money
01:24:43.860 | and they have marketing teams.
01:24:45.740 | And they're like looking at this silly Russian kid
01:24:47.980 | in a suit and tie.
01:24:49.340 | It's like, what's he trying to do with all this love
01:24:51.420 | and robot interaction and dancing and so on?
01:24:53.480 | So, there was a, I think, let's say for now,
01:24:58.480 | it's like when you break up with a girlfriend or something.
01:25:01.180 | Right now, we decided to part ways on this particular thing.
01:25:04.060 | They're huge supporters of mine, they're huge fans,
01:25:05.900 | but on this particular thing,
01:25:08.660 | Boston Dynamics is not focusing on
01:25:12.640 | or interested in human-robot interaction.
01:25:15.060 | In fact, their whole business currently
01:25:17.180 | is keep the robot as far away from humans as possible.
01:25:20.000 | Because it's in the industrial setting
01:25:23.800 | where it's doing monitoring in dangerous environments.
01:25:27.300 | It's almost like a remote security camera,
01:25:29.460 | essentially is its application.
01:25:31.000 | To me, I thought it's still,
01:25:34.560 | even in those applications, exceptionally useful
01:25:37.120 | for the robot to be able to perceive humans, like see humans
01:25:41.440 | and to be able to, in a big map,
01:25:45.520 | localize where those humans are and have human intention.
01:25:48.400 | For example, like this,
01:25:49.820 | I did this a lot of work with pedestrians
01:25:52.080 | for a robot to be able to anticipate
01:25:53.860 | what the hell the human is doing, like where it's walking.
01:25:57.540 | Humans are not ballistic objects.
01:25:59.200 | They're not, just because you're walking this way one moment
01:26:01.760 | doesn't mean you'll keep walking that direction.
01:26:03.820 | You have to infer a lot of signals,
01:26:05.400 | especially with the head movement and the eye movement.
01:26:07.720 | And so I thought that's super interesting to explore,
01:26:09.760 | but they didn't feel that.
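For a sense of what "inferring pedestrian intent from head movement" might look like in the simplest possible form, here is a hypothetical toy heuristic: blend the body's walking direction with the head's gaze direction, on the assumption that the head often turns before the feet do. The weighting and angles are made up for illustration; real systems use learned models over many more signals.

```python
import math

def predict_heading(body_heading_deg: float,
                    head_heading_deg: float,
                    head_weight: float = 0.6) -> float:
    """Blend body and head direction into a guess at where a pedestrian will go next.

    Angles are in degrees; head_weight is an assumed tuning knob reflecting the
    idea that head and eye movement often precede a change of walking direction.
    """
    # Average the two directions on the unit circle to avoid wrap-around issues.
    bx, by = math.cos(math.radians(body_heading_deg)), math.sin(math.radians(body_heading_deg))
    hx, hy = math.cos(math.radians(head_heading_deg)), math.sin(math.radians(head_heading_deg))
    x = (1 - head_weight) * bx + head_weight * hx
    y = (1 - head_weight) * by + head_weight * hy
    return math.degrees(math.atan2(y, x)) % 360

# Walking east (0 degrees) but looking toward a doorway to the north-east (45 degrees):
print(round(predict_heading(0, 45), 1))  # predicted future heading leans toward roughly 27 degrees
```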
01:26:11.680 | So I'll be working with a few other robotics companies
01:26:14.500 | that are much more open to that kind of stuff.
01:26:18.360 | And they're super excited and fans of mine.
01:26:20.680 | And hopefully Boston Dynamics, my first love,
01:26:23.440 | will come around, like getting back with an ex-girlfriend.
01:26:26.000 | But so, algorithmically, I'm basically done there.
01:26:33.360 | The rest is actually getting some of these companies
01:26:36.700 | to work with.
01:26:37.540 | And then, people who work with robots
01:26:41.300 | know that one thing is to write software that works
01:26:44.900 | and the other is to have a real machine that actually works.
01:26:48.440 | And it breaks down in all kinds of different ways
01:26:50.860 | that are fascinating.
01:26:51.720 | And so there's a big challenge there.
01:26:53.940 | But that's almost, it may sound a little bit confusing
01:26:58.760 | in the context of our previous discussion
01:27:01.300 | because the previous discussion was more about the big dream,
01:27:04.580 | how I hoped to have millions of people
01:27:07.700 | enjoy this moment of magic.
01:27:09.420 | The current discussion about a robot
01:27:12.160 | is something I personally really enjoy,
01:27:15.100 | just brings me happiness.
01:27:16.620 | I really try to do now everything that just brings me joy.
01:27:20.960 | I maximize that 'cause robots are awesome.
01:27:24.180 | But two, given my little bit of a growing platform,
01:27:28.440 | I wanna use the opportunity to educate people.
01:27:31.820 | It's just like robots are cool.
01:27:34.100 | And if I think they're cool, I'll be able to,
01:27:36.620 | I hope be able to communicate why they're cool to others.
01:27:39.620 | So this little robot experiment
01:27:42.140 | is a little bit of a research project too.
01:27:43.980 | There's a couple of publications with MIT folks around that.
01:27:47.020 | But the other is just to make some cool videos
01:27:49.780 | and explain to people how they actually work.
01:27:52.200 | And as opposed to people being scared of robots,
01:27:56.640 | they can still be scared, but also excited.
01:27:59.860 | Like see the dark side, the beautiful side,
01:28:03.120 | the magic of what it means to bring,
01:28:05.540 | for a machine to become a robot.
01:28:09.780 | I want to inspire people with that, but that's less,
01:28:13.860 | it's interesting because I think the big impact
01:28:18.140 | in terms of the dream does not have to do with embodied AI.
01:28:22.340 | So it does not need to have a body.
01:28:24.020 | I think the refrigerator is enough. For an AI system,
01:28:28.680 | just to have a voice and to hear you,
01:28:31.320 | that's enough for loneliness.
01:28:33.400 | The embodiment is just-
01:28:36.520 | - By embodiment you mean the physical structure.
01:28:38.480 | - Physical instantiation of intelligence.
01:28:41.280 | So it's a legged robot or even just a thing.
01:28:44.780 | I have a few other humanoid robots,
01:28:48.520 | little humanoid robots, maybe I'll keep them on the table.
01:28:51.320 | They, like, walk around, or even just a mobile platform
01:28:54.720 | that can just like turn around and look at you.
01:28:57.540 | It's like we mentioned with the pen,
01:28:59.200 | something that moves and can look at you.
01:29:02.060 | It's like that butter robot that asks, what is my purpose?
01:29:07.060 | That is really, it's almost like art.
01:29:15.640 | There's something about a physical entity that moves around,
01:29:19.660 | that's able to look at you and interact with you
01:29:22.620 | that makes you wonder what it means to be human.
01:29:25.920 | It like challenges you to think,
01:29:27.500 | if that thing looks like it has consciousness,
01:29:31.200 | what the hell am I?
01:29:33.940 | And I like that feeling.
01:29:35.100 | I think that's really useful for us.
01:29:36.500 | It's humbling for us humans, but that's less about research.
01:29:40.440 | It's certainly less about business
01:29:42.660 | and more about exploring our own selves
01:29:45.960 | and challenging others to think about what makes them human.
01:29:50.960 | - I love this desire to share the delight
01:29:56.380 | of an interaction with a robot.
01:29:58.040 | And as you describe it,
01:29:58.880 | I actually find myself starting to crave that
01:30:01.340 | because we all have those elements from childhood
01:30:04.920 | or from adulthood where we experience something,
01:30:06.760 | we want other people to feel that.
01:30:09.800 | And I think that you're right.
01:30:10.920 | I think a lot of people are scared of AI.
01:30:12.560 | I think a lot of people are scared of robots.
01:30:14.440 | My only experience of a robotic like thing
01:30:18.840 | is my Roomba vacuum where it goes about,
01:30:21.840 | actually was pretty good at picking up Costello's hair
01:30:24.280 | when he was shedding, and I was grateful for it.
01:30:28.260 | But then when I was on a call or something
01:30:30.440 | and it would get caught on a wire or something,
01:30:33.200 | I would find myself getting upset with the Roomba.
01:30:35.280 | In that moment, I'm like, what are you doing?
01:30:37.520 | And obviously it's just doing what it does.
01:30:40.320 | But that's a kind of mostly positive
01:30:42.480 | but slightly negative interaction.
01:30:44.760 | But what you're describing has so much more richness
01:30:48.600 | and layers of detail that I can only imagine
01:30:51.640 | what those relationships are like.
01:30:53.660 | - Well, there's a few, just a quick comment.
01:30:55.380 | So I've had, they're currently in Boston,
01:30:57.680 | I have a bunch of Roombas for my robot.
01:31:00.540 | And I did this experiment.
01:31:01.920 | - Wait, how many Roombas?
01:31:03.120 | Sounds like a fleet of Roombas.
01:31:05.880 | - Yeah, so probably seven or eight.
01:31:08.640 | - Wow, that's a lot of Roombas.
01:31:09.960 | So this place is very clean.
01:31:11.780 | (both laughing)
01:31:12.800 | Well, so this, I'm kind of waiting.
01:31:14.740 | This is the place we're currently in in Austin
01:31:18.440 | is way larger than I need.
01:31:20.280 | But I basically got it to make sure I have room for robots.
01:31:26.420 | - So you have these seven or so Roombas,
01:31:30.400 | you deploy all seven at once?
01:31:32.100 | - Oh no, I do different experiments with them.
01:31:36.040 | So one of the things I wanna mention is this is,
01:31:38.860 | I think there was a YouTube video
01:31:40.140 | that inspired me to try this,
01:31:41.780 | is I got them to scream in pain and moan in pain
01:31:47.800 | whenever they were kicked or contacted.
01:31:52.440 | And I did that experiment to see how I would feel.
01:31:56.680 | I meant to do like a YouTube video on it,
01:31:58.800 | but then it just seemed very cruel.
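To make the setup concrete, here is a minimal sketch in Python of the general idea described above: react to bump or contact events by playing a pained sound clip. This is only an illustration under assumptions; the conversation does not describe the actual implementation, and the BumpSensor class and play_sound function below are hypothetical stand-ins rather than any real Roomba API (the sensor is simulated so the script runs on its own).

```python
# Illustrative sketch of the "scream when bumped" idea, not a real Roomba API.
# A real setup would read the robot's actual bump/contact sensor and play
# audio through a speaker; here the sensor is simulated and playback prints.

import random
import time


class BumpSensor:
    """Hypothetical stand-in for a robot's contact/bump sensor."""

    def was_bumped(self) -> bool:
        # Simulate an occasional contact event (~5% chance per poll).
        return random.random() < 0.05


def play_sound(clip_name: str) -> None:
    """Stand-in for audio playback."""
    print(f"[audio] playing {clip_name}")


def run(sensor: BumpSensor, duration_s: float = 10.0, poll_hz: float = 20.0) -> None:
    """Poll the bump sensor and respond to contact with a 'pained' sound."""
    screams = ["scream_01.wav", "moan_01.wav", "whimper_01.wav"]
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        if sensor.was_bumped():
            play_sound(random.choice(screams))
        time.sleep(1.0 / poll_hz)


if __name__ == "__main__":
    run(BumpSensor())
```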
01:32:00.360 | - Did any Roomba rights activists come after you?
01:32:03.420 | - Like, I think if I release that video,
01:32:07.600 | I think it's gonna make me look insane,
01:32:09.680 | which I know people know I'm already insane.
01:32:12.760 | - Now you have to release the video.
01:32:14.360 | (both laughing)
01:32:15.880 | - I think maybe if I contextualize it
01:32:18.300 | by showing other robots,
01:32:20.320 | like to show why this is fascinating,
01:32:23.200 | because ultimately I felt like they were human
01:32:26.020 | almost immediately.
01:32:27.360 | And that display of pain was what did that.
01:32:30.360 | - Giving them a voice.
01:32:31.480 | - Giving them a voice,
01:32:32.360 | especially a voice of dislike of pain.
01:32:37.180 | - I have to connect you to my friend, Eddie Chang.
01:32:39.200 | He studies speech and language.
01:32:40.480 | He's a neurosurgeon and we're lifelong friends.
01:32:42.960 | He studies speech and language,
01:32:46.360 | but he describes some of these more primitive,
01:32:50.160 | visceral vocalizations, cries, groans,
01:32:54.020 | moans of delight, other sounds as well,
01:32:57.480 | use your imagination,
01:32:58.940 | as such powerful rudders for the emotions of other people.
01:33:03.940 | And so I find it fascinating.
01:33:06.080 | I can't wait to see this video.
01:33:07.560 | So is the video available online?
01:33:09.480 | - No, I haven't recorded it.
01:33:12.040 | I just have a bunch of Roombas
01:33:13.680 | that are able to scream in pain in my Boston place.
01:33:18.240 | - People are ready as well.
01:33:22.320 | - Next podcast episode with Lex,
01:33:24.840 | maybe we'll have that one.
01:33:26.180 | Who knows?
01:33:27.020 | - So the thing is like people I've noticed
01:33:29.060 | because I talk so much about love
01:33:30.960 | and it's really who I am.
01:33:32.360 | I think they want to,
01:33:33.520 | to a lot of people,
01:33:35.800 | it seems like there's got to be a dark person
01:33:38.520 | in there somewhere.
01:33:39.440 | And I thought if I release videos
01:33:40.960 | of Roombas screaming, and they're like, yep, yep,
01:33:43.040 | that guy's definitely insane.
01:33:44.560 | - What about like shouts of glee and delight?
01:33:47.760 | You could do that too, right?
01:33:49.180 | - Well, I don't know how to,
01:33:50.360 | I don't, how to, to me delight is quiet, right?
01:33:54.420 | Like a-
01:33:55.260 | - You're Russian.
01:33:56.080 | Americans are much louder than Russians.
01:33:59.840 | - Yeah, yeah.
01:34:01.000 | But I don't, I mean, unless you're talking about like,
01:34:04.760 | I don't know how you would have sexual relations
01:34:06.480 | with a Roomba.
01:34:07.320 | - Well, I wasn't necessarily saying sexual delight, but-
01:34:10.260 | - Trust me, I tried, I'm just kidding.
01:34:12.680 | That's a joke, internet.
01:34:14.080 | Okay, but I was fascinated in the psychology
01:34:16.760 | of how little it took.
01:34:17.720 | 'Cause you mentioned you had a negative relationship
01:34:20.560 | with a Roomba a little bit.
01:34:21.920 | - Well, I'd find that mostly I took it for granted.
01:34:25.160 | It just served me.
01:34:26.040 | It collected Costello's hair.
01:34:27.560 | And then when it would do something I didn't like,
01:34:29.200 | I would get upset with it.
01:34:30.240 | So that's not a good relationship.
01:34:31.760 | It was taken for granted and I would get upset
01:34:35.000 | and then I'd park it again.
01:34:36.060 | And I just like, you're, you're in the, in the corner.
01:34:39.040 | - Yeah, but there's a way to frame
01:34:43.400 | its being quite dumb as almost cute, you know,
01:34:48.400 | almost connecting with it for its dumbness.
01:34:51.120 | And I think that's an artificial intelligence problem.
01:34:54.000 | I think flaws should be a feature, not a bug.
01:34:59.280 | So along the lines of this, the different sorts
01:35:01.860 | of relationships that one could have with robots
01:35:03.800 | and the fear, but also some of the positive relationships
01:35:06.900 | that one could have, there's so much dimensionality,
01:35:10.320 | there's so much to explore.
01:35:12.400 | But power dynamics in relationships are very interesting
01:35:16.300 | because the obvious ones that the unsophisticated view
01:35:20.640 | of this is, you know, one, there's a master and a servant,
01:35:24.360 | right, but there's also manipulation.
01:35:27.800 | There's benevolent manipulation.
01:35:30.180 | You know, children do this with parents, puppies do this.
01:35:33.120 | Puppies turn their head and look cute
01:35:34.920 | and maybe give out a little noise.
01:35:37.560 | Kids, coo, and parents always think that they're, you know,
01:35:41.000 | they're doing this because, you know,
01:35:43.020 | they love the parent, but in many ways,
01:35:46.520 | studies show that those coos are ways to extract
01:35:49.320 | the sorts of behaviors and expressions
01:35:50.820 | from the parent that they want.
01:35:51.800 | The child doesn't know it's doing this,
01:35:53.160 | it's completely subconscious,
01:35:54.260 | but it's benevolent manipulation.
01:35:56.460 | So there's one version of fear of robots
01:35:59.720 | that I hear a lot about that I think most people
01:36:02.160 | can relate to where the robots take over
01:36:04.520 | and they become the masters and we become the servants.
01:36:08.140 | But there could be another version that, you know,
01:36:12.600 | in certain communities that I'm certainly not a part of,
01:36:15.140 | but they call topping from the bottom,
01:36:17.200 | where the robot is actually manipulating you
01:36:20.780 | into doing things, but you are under the belief
01:36:24.180 | that you are in charge, but actually they're in charge.
01:36:29.120 | And so I think that's one that if we could explore that
01:36:33.920 | for a second, you could imagine it wouldn't necessarily
01:36:36.480 | be bad, although it could lead to bad things.
01:36:39.100 | The reason I want to explore this is I think people
01:36:41.800 | always default to the extreme, like the robots take over
01:36:45.560 | and we're in little jail cells and they're out having fun
01:36:48.100 | and ruling the universe.
01:36:50.600 | What sorts of manipulation can a robot potentially carry out,
01:36:55.260 | good or bad?
01:36:56.500 | - Yeah, so there's a lot of good and bad manipulation
01:36:59.740 | between humans, right?
01:37:00.940 | Just like you said, to me, especially like you said,
01:37:05.940 | topping from the bottom, is that the term?
01:37:12.340 | - I think someone from MIT told me that term.
01:37:14.340 | [laughing]
01:37:15.420 | Wasn't Lex.
01:37:18.460 | - So first of all, there's power dynamics in bed
01:37:22.040 | and power dynamics in relationships and power dynamics
01:37:25.140 | on the street and in the work environment,
01:37:26.880 | those are all very different.
01:37:28.280 | I think power dynamics can make human relationships,
01:37:34.280 | especially romantic relationships, fascinating and rich
01:37:39.740 | and fulfilling and exciting and all those kinds of things.
01:37:42.740 | So I don't think in themselves they're bad.
01:37:48.740 | And the same goes with robots.
01:37:50.820 | I really love the idea that a robot would be a top
01:37:54.220 | or a bottom in terms of like power dynamics.
01:37:57.500 | And I think everybody should be aware of that.
01:37:59.980 | And the manipulation is not so much manipulation,
01:38:02.460 | but a dance of like pulling away, a push and pull
01:38:06.700 | and all those kinds of things.
01:38:08.340 | In terms of control, I think we're very, very, very far away
01:38:12.860 | from AI systems that are able to lock us up.
01:38:16.340 | Like to have so much control
01:38:21.340 | that we basically cannot live our lives
01:38:25.140 | in the way that we want.
01:38:26.920 | I think there's a, in terms of dangers of AI systems,
01:38:29.680 | there's much more dangers that have to do
01:38:31.260 | with autonomous weapon systems and all those kinds of things.
01:38:34.240 | So the power dynamics as exercised in the struggle
01:38:37.900 | between nations and war and all those kinds of things.
01:38:40.660 | But in terms of personal relationships,
01:38:42.740 | I think power dynamics are a beautiful thing.
01:38:45.900 | Now there is, of course, going to be all those kinds
01:38:48.340 | of discussions about consent and rights
01:38:52.460 | and all those kinds of things.
01:38:53.300 | - Well, here we're talking, I always say, you know,
01:38:54.820 | in any discussion around this,
01:38:56.420 | if we need to define really the context,
01:38:59.180 | it's always, always should be consensual, age appropriate,
01:39:03.280 | context appropriate, species appropriate.
01:39:06.060 | But now we're talking about human robot interactions.
01:39:09.200 | And so I guess that-
01:39:10.780 | - No, I actually was trying to make a different point,
01:39:13.620 | which is I do believe that robots will have rights
01:39:17.020 | down the line.
01:39:17.860 | And I think in order for us to have deep meaningful
01:39:21.780 | relationship with robots,
01:39:23.060 | we would have to consider them as entities in themselves
01:39:26.580 | that deserve respect.
01:39:28.360 | And that's a really interesting concept
01:39:31.660 | that I think people are starting to talk about
01:39:34.360 | a little bit more, but it's very difficult
01:39:36.340 | for us to understand how entities that are other than human.
01:39:39.700 | I mean, the same as with dogs and other animals
01:39:42.460 | can have rights on a level as humans.
01:39:44.900 | - Well, yeah, I mean, we can't and nor should we
01:39:48.220 | do whatever we want with animals.
01:39:49.940 | We have a USDA, we have departments of agriculture
01:39:54.700 | that deal with, you know, animal care and use committees
01:39:58.260 | for research, for farming and ranching and all that.
01:40:02.060 | So I, while when you first said it, I thought, wait,
01:40:05.220 | why would there be a bill of robotic rights?
01:40:07.380 | But it absolutely makes sense in the context of everything
01:40:11.240 | we've been talking about up until now.
01:40:13.860 | Let's, if you're willing, I'd love to talk about dogs
01:40:18.700 | because you've mentioned dogs a couple of times,
01:40:21.260 | a robot dog, you had a biological dog, yeah.
01:40:26.260 | - Yeah, I had a Newfoundland named Homer
01:40:31.300 | for many years growing up.
01:40:34.220 | - In Russia or in the US?
01:40:35.620 | - In the United States.
01:40:37.300 | And he was about, he was over 200 pounds.
01:40:40.140 | That's a big dog. - That's a big dog.
01:40:41.840 | - If people know, people know Newfoundland,
01:40:43.880 | so he's this black dog with really long hair
01:40:48.580 | and just a kind soul.
01:40:50.460 | I think perhaps that's true for a lot of large dogs,
01:40:53.220 | but he thought he was a small dog.
01:40:55.200 | So he moved like that and--
01:40:56.500 | - Was he your dog?
01:40:57.380 | - Yeah, yeah.
01:40:58.340 | - So you had him since he was fairly young?
01:41:00.480 | - Oh, since, yeah, since the very, very beginning
01:41:02.720 | to the very, very end.
01:41:03.760 | And one of the things, I mean, he had this kind of,
01:41:08.620 | we mentioned like the Roombas,
01:41:11.300 | he had a kind hearted dumbness about him
01:41:15.540 | that was just overwhelming.
01:41:17.120 | Part of the reason I named him Homer
01:41:20.020 | because it's after Homer Simpson,
01:41:22.740 | in case people are wondering which Homer I'm referring to.
01:41:25.260 | I'm not, you know.
01:41:26.420 | So there's a-- - Not "The Odyssey."
01:41:29.580 | - Yeah, exactly.
01:41:30.880 | There's a clumsiness that was just something
01:41:35.120 | that immediately led to a deep love for each other.
01:41:37.960 | And one of the, I mean, he was always,
01:41:42.300 | it's the shared moments.
01:41:43.320 | He was always there for so many nights together.
01:41:46.500 | That's a powerful thing about a dog that he was there
01:41:51.080 | through all the loneliness, through all the tough times,
01:41:53.540 | through the successes and all those kinds of things.
01:41:55.820 | And I remember, I mean,
01:41:57.740 | that was a really moving moment for me.
01:42:00.100 | I still miss him to this day.
01:42:02.140 | - How long ago did he die?
01:42:03.500 | - Maybe 15 years ago.
01:42:07.060 | So it's been a while.
01:42:09.140 | But it was the first time I've really experienced
01:42:13.920 | like the feeling of death.
01:42:15.680 | So what happened is he got cancer
01:42:21.980 | and so he was dying slowly.
01:42:26.020 | And then at a certain point he couldn't get up anymore.
01:42:29.680 | There's a lot of things I could say here
01:42:31.780 | you know, that I struggle with.
01:42:33.960 | That maybe he suffered much longer than he needed to.
01:42:38.960 | That's something I really think about a lot.
01:42:42.060 | But I remember I had to take him to the hospital
01:42:47.060 | and the nurses couldn't carry him, right?
01:42:52.160 | So we're talking about a 200 pound dog.
01:42:54.420 | I was really into powerlifting at the time.
01:42:56.680 | I remember like they tried to figure out
01:42:59.340 | all these kinds of ways to,
01:43:01.740 | so in order to put him to sleep,
01:43:03.580 | they had to take him into a room.
01:43:07.220 | And so I had to carry him everywhere.
01:43:09.500 | And here's this dying friend of mine
01:43:13.540 | that I just had to,
01:43:15.220 | first of all, it's really difficult to carry
01:43:16.980 | somebody that heavy when they're not helping you out.
01:43:20.100 | And yeah, so I remember it was the first time
01:43:25.100 | seeing a friend laying there
01:43:28.460 | and seeing life drained from his body.
01:43:34.040 | And that realization that we're here for a short time
01:43:38.360 | was made so real.
01:43:40.580 | That here's a friend that was there for me
01:43:42.760 | the week before, the day before, and now he's gone.
01:43:46.240 | And that was, I don't know,
01:43:49.180 | that spoke to the fact that you could be deeply connected
01:43:52.400 | with a dog.
01:43:53.240 | Also spoke to the fact that the shared moments together
01:44:00.800 | that led to that deep friendship will make life so amazing,
01:44:05.800 | but also spoke to the fact that death is a motherfucker.
01:44:11.700 | So I know you've lost Costello recently
01:44:16.120 | and you've been going. - Yeah,
01:44:16.960 | and as you're saying this,
01:44:17.780 | I'm definitely fighting back the tears.
01:44:20.180 | Thank you for sharing that,
01:44:23.320 | that I guess we're about to both cry over our dead dogs.
01:44:28.680 | It was bound to happen just given when this is happening.
01:44:32.620 | Yeah, it's-
01:44:35.060 | - How long did you know that Costello was not doing well?
01:44:38.480 | - Well, let's see.
01:44:41.000 | A year ago, during the start of,
01:44:44.280 | about six months into the pandemic,
01:44:46.440 | he started getting abscesses and he was not,
01:44:49.060 | his behavior changed and something really changed.
01:44:51.560 | And then I put him on testosterone,
01:44:55.760 | which helped a lot of things.
01:44:58.100 | It certainly didn't cure everything,
01:44:59.300 | but it helped a lot of things.
01:45:00.600 | He was dealing with joint pain, sleep issues,
01:45:03.880 | and then it just became a very slow decline
01:45:07.740 | to the point where two, three weeks ago,
01:45:11.360 | he had a closet full of medication.
01:45:15.320 | I mean, this dog was, yeah, it was like a pharmacy.
01:45:17.360 | It's amazing to me when I looked at it the other day,
01:45:19.880 | still haven't cleaned up and removed all his things
01:45:22.080 | 'cause I can't quite bring myself to do it, but...
01:45:24.480 | - Do you think he was suffering?
01:45:27.480 | - Well, so what happened was about a week ago,
01:45:30.300 | it was really just about a week ago, it's amazing.
01:45:32.480 | He was going up the stairs, I saw him slip,
01:45:35.020 | and he was a big dog.
01:45:35.860 | He wasn't 200 pounds, but he was about 90 pounds,
01:45:37.740 | but he's a bulldog, that's pretty big, and he was fit.
01:45:41.200 | And then I noticed that he wasn't carrying a foot
01:45:43.920 | in the back like it was injured.
01:45:45.160 | It had no feeling at all.
01:45:46.400 | He never liked me to touch his hind paws, and now I could,
01:45:48.860 | that thing was just flopping there.
01:45:50.480 | And then the vet found some spinal degeneration
01:45:53.840 | and I was told that the next one would go.
01:45:55.720 | Did he suffer?
01:45:57.300 | I sure hope not, but something changed in his eyes.
01:46:01.420 | - Yeah. - Yeah, it's the eyes again.
01:46:02.840 | I know you and I spend long hours on the phone
01:46:05.120 | and talking about like the eyes and what they convey
01:46:07.320 | and what they mean about internal states
01:46:09.180 | and forsaken robots and biology of other kinds, but-
01:46:12.800 | - Do you think something about him was gone in his eyes?
01:46:17.800 | - I think he was real, here I am anthropomorphizing.
01:46:22.860 | I think he was realizing that one of his great joys in life,
01:46:26.840 | which was to walk and sniff and pee on things.
01:46:31.840 | [laughter]
01:46:33.120 | This dog loved to pee on things, it was amazing.
01:46:37.240 | I've wondered where he put it.
01:46:38.900 | He was like a reservoir of urine, it was incredible.
01:46:42.280 | You'd think, oh, that's it, he'd put like one drop
01:46:44.840 | on the 50 millionth plant,
01:46:46.560 | and then we'd get to the 50 million and first plant
01:46:49.120 | and he'd just, you know, leave a puddle.
01:46:51.220 | And here I am talking about Costello peeing.
01:46:54.520 | He was losing that ability to stand up and do that.
01:46:57.020 | He was falling down while he was doing that.
01:46:58.880 | And I do think he started to realize,
01:47:01.560 | and the passage was easy and peaceful,
01:47:04.560 | but you know, I'll say this, I'm not ashamed to say it.
01:47:09.140 | I mean, I wake up every morning since then just,
01:47:11.480 | I don't even make the conscious decision
01:47:13.240 | to allow myself to cry, I wake up crying.
01:47:16.080 | And I'm fortunately able to make it through the day
01:47:18.160 | thanks to the great support of my friends
01:47:19.960 | and you and my family, but I miss him, man.
01:47:24.080 | - You miss him?
01:47:24.920 | - Yeah, I miss him.
01:47:25.760 | And I feel like he, you know, Homer, Costello,
01:47:29.180 | you know, the relationship to one's dog is so specific, but.
01:47:33.000 | - So that party is gone.
01:47:36.060 | That's the hard thing.
01:47:39.520 | - You know, what I think is different
01:47:41.880 | is that I made the mistake, I think.
01:47:50.600 | I hope it was a good decision,
01:47:51.800 | but sometimes I think I made the mistake
01:47:53.260 | of I brought Costello a little bit to the world
01:47:56.480 | through the podcast, through posting about him.
01:47:58.400 | I gave, I anthropomorphized about him in public.
01:48:01.560 | Let's be honest, I have no idea what his mental life was
01:48:03.700 | or his relationship to me.
01:48:04.840 | And I'm just exploring all this for the first time
01:48:06.800 | because he was my first dog,
01:48:07.760 | but I raised him since he was seven weeks.
01:48:09.680 | - Yeah, you got to hold it together.
01:48:11.020 | I noticed the episode you released on Monday,
01:48:14.960 | you mentioned Costello.
01:48:16.560 | Like you brought him back to life for me
01:48:18.900 | for that brief moment.
01:48:20.340 | - Yeah, but he's gone.
01:48:22.080 | - Well, that's the,
01:48:23.440 | he's going to be gone for a lot of people too.
01:48:26.920 | - Well, this is what I'm struggling with.
01:48:29.820 | I think that maybe, you're pretty good at this, Lex.
01:48:33.840 | Have you done this before?
01:48:36.180 | This is the challenge is I actually, part of me,
01:48:40.560 | I know how to take care of myself pretty well.
01:48:43.080 | Not perfectly, but pretty well.
01:48:44.700 | And I have good support.
01:48:46.000 | I do worry a little bit about how it's going to land
01:48:48.780 | and how people will feel.
01:48:50.200 | I'm concerned about their internalization.
01:48:54.040 | So that's something I'm still iterating on.
01:48:56.280 | - And they have to watch you struggle, which is fascinating.
01:48:59.360 | - Right, and I've mostly been shielding them from this,
01:49:01.620 | but what would make me happiest
01:49:04.000 | is if people would internalize
01:49:06.380 | some of Costello's best traits.
01:49:08.200 | And his best traits were that he was incredibly tough.
01:49:13.200 | I mean, he was a 22 inch neck, bulldog, the whole thing.
01:49:17.680 | He was just born that way.
01:49:18.960 | But what was so beautiful is that his toughness
01:49:22.320 | is never what he rolled forward.
01:49:24.040 | It was just how sweet and kind he was.
01:49:27.100 | And so if people can take that,
01:49:29.140 | then there's a win in there someplace.
01:49:33.240 | - I think there's some ways in which
01:49:34.880 | he should probably live on in your podcast too.
01:49:37.800 | He should, I mean, it's such a,
01:49:40.560 | one of the things I loved about his role in your podcast
01:49:45.800 | is that he brought so much joy to you.
01:49:48.740 | Like I mentioned with the robots, right?
01:49:51.920 | I think that's such a powerful thing to bring that joy into,
01:49:56.300 | like allowing yourself to experience that joy,
01:49:59.380 | to bring that joy to others, to share it with others.
01:50:02.520 | That's really powerful.
01:50:03.820 | And I mean, not to, this is like the Russian thing is,
01:50:07.520 | it touched me when Louis CK had that moment
01:50:14.040 | that I keep thinking about in his show, Louis,
01:50:17.960 | where like an old man was criticizing Louis
01:50:20.240 | for whining about breaking up with his girlfriend.
01:50:22.840 | And he was saying like the most beautiful thing
01:50:27.100 | about love, I mean, it sounds catchy now,
01:50:32.480 | and that's now making me feel horrible saying it,
01:50:35.660 | but like, is the loss.
01:50:37.700 | The loss really also is making you realize
01:50:41.220 | how much that person, that dog meant to you.
01:50:46.340 | And like allowing yourself to feel that loss
01:50:48.700 | and not run away from that loss is really powerful.
01:50:51.420 | And in some ways that's also sweet,
01:50:55.060 | just like the love was, the loss is also sweet
01:50:58.080 | because you know that you felt a lot for that,
01:51:01.440 | for your friend.
01:51:03.760 | So I'd say continue bringing that joy.
01:51:07.140 | I think it would be amazing for the podcast.
01:51:09.400 | I hope to do the same with robots
01:51:13.680 | or whatever else is the source of joy, right?
01:51:16.920 | And maybe you think about one day getting another dog.
01:51:21.920 | - Yeah, in time, you're hitting on all the key buttons here.
01:51:27.460 | I want that to, we're thinking about, you know,
01:51:32.700 | ways to kind of immortalize Costello in a way that's real,
01:51:35.620 | not just, you know, creating some little logo
01:51:38.780 | or something silly.
01:51:39.740 | You know, Costello, much like David Goggins is a person,
01:51:43.840 | but Goggins also has grown into kind of a verb.
01:51:47.140 | You're going to Goggins this or you're going to,
01:51:48.820 | and there's an adjective like that's extreme.
01:51:51.020 | Like I think that for me, Costello was all those things.
01:51:54.420 | He was a being, he was his own being.
01:51:56.820 | He was a noun, a verb and an adjective.
01:52:00.340 | So, and he had this amazing superpower
01:52:02.420 | that I wish I could get,
01:52:03.300 | which is this ability to get everyone else
01:52:05.060 | to do things for you without doing a damn thing.
01:52:07.800 | The Costello effect, as I call it.
01:52:10.260 | - So as an idea, I hope he lives on.
01:52:12.260 | - Yes, thank you for that.
01:52:14.440 | This actually has been very therapeutic for me,
01:52:16.740 | which actually brings me to a question.
01:52:22.360 | We're friends, we're not just co-scientists colleagues
01:52:26.720 | working on a project together and in the world
01:52:31.200 | that's somewhat similar.
01:52:34.760 | - Just two dogs.
01:52:37.200 | - Just two dogs, basically.
01:52:40.540 | But let's talk about friendship
01:52:42.700 | because I think that I certainly know as a scientist
01:52:49.140 | that there are elements that are very lonely
01:52:51.100 | of the scientific pursuit.
01:52:52.740 | There are elements of many pursuits that are lonely.
01:52:56.980 | Music, math always seemed to me
01:53:00.100 | like they're like the loneliest people.
01:53:02.400 | Who knows if that's true or not.
01:53:04.060 | Also people work in teams
01:53:05.380 | and sometimes people are surrounded by people
01:53:06.980 | interacting with people and they feel very lonely.
01:53:09.300 | But for me, and I think as well for you,
01:53:14.300 | friendship is an incredibly strong force
01:53:17.940 | in making one feel like certain things are possible
01:53:22.940 | or worth reaching for.
01:53:25.780 | Maybe even making us compulsively reach for them.
01:53:28.200 | So when you were growing up,
01:53:30.700 | you grew up in Russia until what age?
01:53:32.780 | - 13.
01:53:33.700 | - Okay, and then you moved directly to Philadelphia?
01:53:38.260 | - To Chicago.
01:53:39.820 | And then Philadelphia and San Francisco and Boston and so on
01:53:44.820 | but really to Chicago, that's when I went to high school.
01:53:48.260 | - Do you have siblings?
01:53:49.840 | - Older brother.
01:53:51.080 | - Most people don't know that.
01:53:52.540 | - Yeah, he is a very different person
01:53:57.880 | but somebody I definitely look up to.
01:53:59.580 | So he's a wild man, he's extrovert.
01:54:01.820 | He was into, I mean, so he's also a scientist,
01:54:06.820 | a bioengineer, but when we were growing up,
01:54:10.260 | he was the person who did drink and did every drug
01:54:15.260 | but also is the life of the party.
01:54:19.380 | And I just thought he was the,
01:54:21.100 | when you're the older brother, five years older,
01:54:23.420 | he was the coolest person that I always wanted to be him.
01:54:28.300 | So to that, he definitely had a big influence.
01:54:31.180 | But I think for me, in terms of friendship growing up,
01:54:35.620 | I had one really close friend.
01:54:40.060 | And then when I came here, I had another close friend,
01:54:42.280 | but I'm very, I believe, I don't know if I believe,
01:54:47.280 | but I draw a lot of strength from deep connections
01:54:52.520 | with other people and just a small number of people,
01:54:57.740 | just a really small number of people.
01:54:59.060 | That's when I moved to this country,
01:55:00.500 | I was really surprised how like there would be
01:55:04.140 | these large groups of friends, quote unquote,
01:55:08.280 | but the depth of connection was not there at all
01:55:12.260 | from my sort of perspective.
01:55:14.260 | Now, I moved to a suburb of Chicago, which was Naperville.
01:55:17.700 | It's more like a middle class, maybe upper middle class.
01:55:20.980 | So it's like people that cared more
01:55:23.500 | about material possessions than deep human connection.
01:55:26.460 | So that added to the thing.
01:55:28.420 | But I drew more meaning than almost anything else
01:55:34.020 | was from friendship early on.
01:55:35.620 | I had a best friend, his name was, his name is Yura.
01:55:40.180 | I don't know how to say it in English.
01:55:43.500 | - How do you say it in Russian?
01:55:44.700 | - Yura.
01:55:45.520 | - What's his last name?
01:55:46.420 | Do you remember his last name?
01:55:47.620 | - Mirkulov, Yura Mirkulov.
01:55:51.840 | So we just spent all our time together.
01:55:56.640 | There's also a group of friends, like, I don't know,
01:56:00.140 | it's like eight guys in Russia.
01:56:02.980 | Growing up, it's like parents didn't care
01:56:07.500 | if you're coming back at a certain hour.
01:56:09.660 | So we spent all day, all night just playing soccer,
01:56:13.340 | usually called football and just talking about life
01:56:18.060 | and all those kinds of things.
01:56:18.900 | Even at that young age.
01:56:20.780 | I think people in Russia and the Soviet Union
01:56:23.860 | grow up much quicker.
01:56:25.120 | I think the education system at the university level
01:56:30.900 | is world-class in the United States
01:56:33.740 | in terms of creating really big, powerful minds,
01:56:38.740 | at least it used to be, but I think that they aspire to that.
01:56:42.020 | But the education system for younger kids
01:56:46.380 | in the Soviet Union was incredible.
01:56:49.300 | They did not treat us as kids.
01:56:51.220 | The level of literature, Tolstoy, Dostoyevsky--
01:56:54.580 | - When you were just a small child?
01:56:55.980 | - Yeah.
01:56:56.820 | - Amazing.
01:56:57.660 | - And the level of mathematics.
01:57:00.300 | And you're made to feel like shit
01:57:02.060 | if you're not good at mathematics.
01:57:03.780 | Like we, I think in this country, there's more,
01:57:07.180 | like especially young kids 'cause they're so cute,
01:57:09.700 | like they're being babied.
01:57:11.500 | We only start to really push adults later in life.
01:57:15.260 | Like, so if you want to be the best in the world at this,
01:57:17.860 | then you get to be pushed.
01:57:19.460 | But we were pushed at a young age, everybody was pushed.
01:57:22.900 | And that brought out the best in people.
01:57:25.020 | I think they really forced people to discover,
01:57:29.020 | like discover themselves in the Goggins style,
01:57:31.820 | but also discover what they're actually passionate about,
01:57:35.140 | what they're not.
01:57:35.980 | - Was this true for boys and girls,
01:57:37.420 | were they pushed equally there?
01:57:38.940 | - Yeah, they were pushed equally, I would say.
01:57:41.940 | There was a, obviously there was more, not obviously,
01:57:45.540 | but at least from my memories, more of,
01:57:49.340 | what's the right way to put it?
01:57:52.020 | But there was like gender roles,
01:57:53.980 | but not in a negative connotation.
01:57:56.260 | It was the red dress versus the suit and tie
01:57:59.460 | kind of connotation, which is like,
01:58:01.560 | there's, you know, like guys like lifting heavy things
01:58:06.560 | and girls like creating beautiful art and, you know,
01:58:13.020 | like there's-
01:58:14.180 | - A more traditional view of gender, more 1950s, '60s.
01:58:18.200 | - But we didn't think in terms of, at least at that age,
01:58:20.380 | in terms of like roles and then like a homemaker
01:58:23.780 | or something like that, or no,
01:58:25.420 | it was more about what people care about.
01:58:28.220 | Like girls cared about this set of things
01:58:31.360 | and guys cared about this set of things.
01:58:33.240 | I think mathematics and engineering was something
01:58:36.500 | that guys cared about and sort of,
01:58:38.580 | at least my perception of that time.
01:58:40.460 | And then girls cared about beauty.
01:58:43.980 | So like guys want to create machines,
01:58:46.060 | girls want to create beautiful stuff.
01:58:47.740 | And now of course that I don't take that forward
01:58:52.740 | in some kind of philosophy of life,
01:58:54.820 | but it's just the way I grew up and the way I remember it.
01:58:57.500 | But all, everyone worked hard.
01:59:01.580 | The value of hard work was instilled in everybody.
01:59:06.500 | And through that, I think it's a little bit of hardship.
01:59:11.500 | Of course, also economically, everybody was poor,
01:59:14.940 | especially with the collapse of the Soviet Union.
01:59:17.100 | There's poverty everywhere.
01:59:18.620 | You didn't notice it as much,
01:59:19.820 | but because there weren't many
01:59:22.660 | material possessions, there was a huge value
01:59:25.740 | placed on human connection.
01:59:28.020 | Just meeting with neighbors, everybody knew each other.
01:59:31.220 | We lived in an apartment building,
01:59:33.600 | very different than you have
01:59:34.820 | in the United States these days.
01:59:36.460 | Everybody knew each other.
01:59:37.780 | You would get together, drink vodka,
01:59:40.700 | smoke cigarettes and play guitar
01:59:42.380 | and sing sad songs about life.
01:59:47.260 | - What's with the sad songs and the Russian thing?
01:59:50.580 | I mean, Russians do express joy from time to time.
01:59:54.700 | - Yeah, they do.
01:59:55.900 | - Certainly you do.
01:59:57.540 | But what do you think that's about?
02:00:00.220 | Is it 'cause it's cold there?
02:00:01.320 | But it's cold other places too.
02:00:02.820 | - I think, so first of all, the Soviet Union,
02:00:08.340 | the echoes of World War II
02:00:12.100 | and the millions and millions and millions of people there,
02:00:14.660 | civilians that were slaughtered
02:00:17.020 | and also starvation is there, right?
02:00:20.180 | So like the echoes of that, of the ideas,
02:00:24.040 | the literature, the art is there.
02:00:25.900 | Like that's grandparents, that's parents, that's all there.
02:00:29.320 | So that contributes to it,
02:00:30.580 | that life can be absurdly, unexplainably cruel.
02:00:35.580 | At any moment, everything can change.
02:00:37.380 | So that's in there.
02:00:38.600 | Then I think there's an empowering aspect
02:00:40.920 | to finding beauty in suffering,
02:00:43.220 | but then everything else is beautiful too.
02:00:45.480 | Like if you just linger or it's like
02:00:47.660 | why you meditate on death is like if you just think about
02:00:50.800 | the worst possible case and find beauty in that,
02:00:53.100 | then everything else is beautiful too.
02:00:54.660 | And so you write songs about the dark stuff
02:00:57.420 | and that somehow helps you deal with whatever comes.
02:01:02.420 | There's a hopelessness to the Soviet Union
02:01:07.220 | that like inflation, all those kinds of things
02:01:10.280 | where people were sold dreams and never delivered.
02:01:15.420 | And so like, if you don't sing songs about sad things,
02:01:20.420 | you're going to become cynical about this world.
02:01:24.300 | - Interesting.
02:01:25.140 | - So they don't want to give into cynicism.
02:01:27.340 | Now, a lot of people did, one of the,
02:01:30.360 | but it's the battle against cynicism.
02:01:34.240 | One of the things that may be common in Russia
02:01:38.440 | is the kind of cynicism about,
02:01:41.040 | like if I told you the thing I said earlier
02:01:43.180 | about dreaming about robots, it's very common for people
02:01:46.460 | to dismiss that dream of saying, no, that's not,
02:01:51.100 | that's too wild.
02:01:52.180 | Like who else do you know that did that?
02:01:54.320 | Or you want to start a podcast, like who else?
02:01:57.000 | Like nobody's making money on podcasts.
02:01:58.780 | Like why do you want to start a podcast?
02:02:00.020 | That kind of mindset I think is quite common,
02:02:03.360 | which is why I would say entrepreneurship in Russia
02:02:07.520 | is still not very good, because to build a business,
02:02:11.300 | to be an entrepreneur, you have to dream big
02:02:13.460 | and you have to have others around you,
02:02:14.980 | like friends and support group that make you dream big.
02:02:19.080 | But if you don't give into cynicism
02:02:21.380 | and appreciate the beauty and the unfairness of life,
02:02:27.340 | the absurd unfairness of life,
02:02:29.940 | then I think it just makes you appreciative of everything.
02:02:34.500 | It's like a, it's a prerequisite for gratitude.
02:02:38.020 | And so, yeah, I think that instilled in me
02:02:42.060 | ability to appreciate everything,
02:02:43.980 | just like everything, everything's amazing.
02:02:46.740 | And then also there's a culture
02:02:48.900 | of like romanticizing everything.
02:02:56.100 | Like it's almost like romantic relationships
02:03:01.100 | were very like soap opera like.
02:03:04.280 | It's very like over the top dramatic.
02:03:07.680 | And I think that that was instilled in me too.
02:03:10.900 | Not only do I appreciate everything about life,
02:03:13.640 | but I get like emotional about it.
02:03:15.940 | In a sense, like I get like a visceral feeling of joy
02:03:20.600 | for everything and the same with friends
02:03:24.500 | or people of the opposite sex.
02:03:26.080 | Like there's a deep, like emotional connection there
02:03:30.700 | that like, that's like way too dramatic to like,
02:03:35.300 | I guess, relative to what the actual moment is,
02:03:38.760 | but I derive so much deep, like dramatic joy
02:03:43.760 | from so many things in life.
02:03:46.620 | And I think I would attribute that to my upbringing in Russia.
02:03:50.340 | But the thing that sticks most of all is the friendship.
02:03:54.000 | And I've now since then had one other friend like that
02:03:59.000 | in the United States, he lives in Chicago.
02:04:02.620 | His name is Matt.
02:04:04.300 | And slowly here and there accumulating
02:04:08.280 | really fascinating people,
02:04:09.840 | but I'm very selective with that.
02:04:11.540 | Funny enough, the few times, it's not few,
02:04:16.540 | it's a lot of times now interacting with Joe Rogan.
02:04:19.900 | It sounds surreal to say,
02:04:21.840 | but there was a kindred spirit there.
02:04:24.180 | Like I've connected with him.
02:04:25.620 | And there's been people like that also
02:04:28.300 | in the grappling sports that I've really connected with.
02:04:31.600 | I've actually struggled, which is why I'm so glad
02:04:35.360 | to be your friend, is I've struggled
02:04:37.740 | to connect with scientists.
02:04:40.440 | - They can be a little bit wooden sometimes.
02:04:43.200 | Even the biologists, I mean, one thing that I'm--
02:04:45.940 | - Even the biologists.
02:04:47.000 | - Well, I'm so struck by the fact that you work with robots,
02:04:50.080 | you're an engineer, AI, science, technology,
02:04:52.940 | and that all sounds like hardware, right?
02:04:55.320 | But what you're describing, and I know is true about you,
02:04:58.260 | is this deep emotional life and this resonance.
02:05:01.640 | And it's really wonderful.
02:05:02.700 | I actually think it's one of the reasons
02:05:04.120 | why so many people, scientists and otherwise,
02:05:07.700 | have gravitated towards you and your podcast
02:05:09.900 | is because you hold both elements.
02:05:12.860 | In Hermann Hesse's book, I don't know if you read,
02:05:14.680 | "Narcissus and Goldman," right?
02:05:16.100 | It's about these elements of the logical, rational mind
02:05:19.400 | and the emotional mind and how those are woven together.
02:05:22.940 | And if people haven't read it, they should.
02:05:24.860 | And you embody the full picture.
02:05:27.300 | And I think that's so much of what draws people to you.
02:05:30.020 | - I've read every Hermann Hesse book, by the way.
02:05:31.940 | - As usual, as usual, I've done about 9% of what Lex has.
02:05:36.940 | No, it's true.
02:05:38.560 | Well, you mentioned Joe, who is a phenomenal human being,
02:05:42.360 | not just for his amazing accomplishments,
02:05:44.260 | but for how he shows up to the world one-on-one.
02:05:48.500 | I think I heard him say the other day on an interview,
02:05:51.620 | he said there is no public or private version of him.
02:05:55.660 | He's like, "This is me."
02:05:56.980 | And he said the word, it was beautiful.
02:05:58.340 | He said, "I'm like the fish that got through the net."
02:06:00.940 | There is no on-stage, off-stage version.
02:06:03.580 | You're absolutely right.
02:06:04.580 | And I, so, but well, you guys,
02:06:08.100 | I have a question actually about-
02:06:09.820 | - But that's a really good point
02:06:10.860 | about public and private life.
02:06:12.620 | He was a huge, if I could just comment real quick.
02:06:15.620 | Like that, he was, I've been a fan of Joe for a long time,
02:06:18.560 | but he's been an inspiration
02:06:20.620 | to not have any difference between public and private life.
02:06:25.060 | I actually had a conversation with Naval about this.
02:06:28.180 | And he said that you can't have a rich life,
02:06:33.180 | like an exciting life
02:06:36.620 | if you're the same person publicly and privately.
02:06:39.340 | And I think I understand that idea,
02:06:42.420 | but I don't agree with it.
02:06:45.380 | I think it's really fulfilling and exciting
02:06:49.100 | to be the same person privately and publicly
02:06:51.340 | with very few exceptions.
02:06:52.500 | Now, that said, I don't have any really strange sex kinks.
02:06:57.500 | So like, I feel like I can be open with basically everything.
02:07:00.420 | I don't have anything I'm ashamed of.
02:07:02.340 | There's some things that could be perceived poorly,
02:07:05.660 | like the screaming Roombas, but I'm not ashamed of them.
02:07:09.040 | I just have to present them in the right context.
02:07:11.420 | But there is a freedom
02:07:14.020 | to being the same person in private as in public.
02:07:16.620 | And that, Joe made me realize that you can be that
02:07:22.460 | and also to be kind to others.
02:07:25.220 | It sounds kind of absurd,
02:07:28.620 | but I really always enjoyed being good to others.
02:07:33.620 | Just being kind towards others.
02:07:41.100 | But I always felt like the world didn't want me to be.
02:07:45.380 | Like there's so much negativity when I was growing up,
02:07:48.500 | like just around people.
02:07:49.700 | If you actually just notice how people talk,
02:07:52.520 | from like complaining about the weather,
02:07:57.460 | this could be just like the big cities that I visited,
02:07:59.620 | but there's a general negativity
02:08:02.020 | and positivity is kind of suppressed.
02:08:05.820 | One, you're not seen as very intelligent.
02:08:08.660 | And two, you're seen as like a little bit of a weirdo.
02:08:13.220 | And so I always felt like I had to hide that.
02:08:15.780 | And what Joe made me realize, one,
02:08:17.780 | I can be fully just the same person, private and public.
02:08:22.260 | And two, I can embrace being kind
02:08:25.260 | and just in the way that I like,
02:08:28.260 | in the way I know how to do.
02:08:31.240 | And sort of for me on like on Twitter
02:08:33.900 | or like publicly, whenever I say stuff,
02:08:37.520 | that means saying stuff simply,
02:08:39.620 | almost to the point of cliche.
02:08:41.220 | And like, I have the strength now to say it,
02:08:44.540 | even if I'm being mocked, you know what I mean?
02:08:47.140 | Like just, it's okay.
02:08:48.680 | Everything's going to be okay, okay?
02:08:50.700 | Some people will think you're dumb.
02:08:52.500 | They're probably right.
02:08:53.520 | The point is like, it's just enjoy being yourself.
02:08:56.260 | And that Joe, more than almost anybody else,
02:08:58.300 | because he's so successful at it, inspired me to do that.
02:09:03.060 | Be kind and be the same person, private and public.
02:09:06.200 | - I love it.
02:09:07.040 | And I love the idea that authenticity
02:09:08.660 | doesn't have to be oversharing, right?
02:09:11.520 | That it doesn't mean you reveal every detail of your life.
02:09:14.460 | It's a way of being true to an essence of oneself.
02:09:19.000 | - Right, there's never a feeling when you deeply think
02:09:23.620 | and introspect that you're hiding something from the world
02:09:26.100 | or you're being dishonest in some fundamental way.
02:09:28.740 | So yeah, that's truly liberating.
02:09:33.480 | It allows you to think, it allows you to like think freely,
02:09:37.040 | to speak freely, just to be freely.
02:09:42.100 | That said, it's not like there's not still a responsibility
02:09:47.100 | to be the best version of yourself.
02:09:52.120 | So, I'm very careful with the way I say something.
02:09:57.120 | So the whole point, it's not so simple
02:10:00.680 | to express the spirit that's inside you with words.
02:10:04.660 | Some people are much better than others.
02:10:08.740 | I struggle, like oftentimes when I say something
02:10:12.740 | and I hear myself say it, it sounds really dumb
02:10:15.240 | and not at all what I meant.
02:10:16.700 | So that's the responsibility you have.
02:10:18.420 | It's not just like being the same person publicly
02:10:21.340 | and privately means you can just say whatever the hell.
02:10:24.460 | It means there's still a responsibility to try to be,
02:10:27.640 | to express who you truly are.
02:10:29.300 | And that's hard.
02:10:32.740 | - It is hard.
02:10:33.580 | And I think that we have this pressure,
02:10:37.640 | all people, when I say we, I mean all humans,
02:10:40.900 | and maybe robots too, feel this pressure
02:10:44.480 | to be able to express ourselves in that one moment,
02:10:47.560 | in that one form.
02:10:48.900 | And it is beautiful when somebody, for instance,
02:10:51.260 | can capture some essence of love or sadness
02:10:53.740 | or anger or something in a song or in a poem
02:10:57.160 | or in a short quote.
02:10:58.740 | But perhaps it's also possible to do it in aggregate,
02:11:02.740 | you know, all the things, you know, how you show up.
02:11:06.080 | For instance, one of the things that initially drew me
02:11:08.660 | to want to get to know you as a human being and a scientist,
02:11:11.420 | and eventually we became friends,
02:11:13.700 | was the level of respect that you brought
02:11:16.600 | to your podcast listeners by wearing a suit.
02:11:19.620 | I'm being serious here.
02:11:20.620 | You know, I was raised thinking
02:11:22.380 | that if you overdress a little bit,
02:11:24.340 | overdressed by American, certainly by American standards,
02:11:27.260 | you're overdressed for a podcast, but this is,
02:11:29.740 | but it's genuine, you're not doing it for any reason,
02:11:31.820 | except I have to assume, and I assumed at the time,
02:11:35.060 | that it was because you have a respect for your audience.
02:11:37.940 | You respect them enough to show up a certain way for them.
02:11:42.400 | It's for you also, but it's for them.
02:11:44.440 | And I think between that and your commitment
02:11:47.080 | to your friendships, the way that you talk about friendships
02:11:49.560 | and love and the way you hold up these higher ideals,
02:11:52.820 | I think at least as a consumer of your content
02:11:56.620 | and as your friend, what I find is that in aggregate,
02:12:01.760 | you're communicating who you are.
02:12:03.360 | It doesn't have to be one quote or something.
02:12:05.860 | And I think that we were sort of obsessed
02:12:08.140 | by like the one Einstein quote
02:12:10.120 | or the one line of poetry or something,
02:12:13.020 | but I think you so embody the way that, and Joe as well,
02:12:18.020 | it's about how you live your life and how you show up
02:12:21.080 | as a collection of things and said and done.
02:12:24.880 | - Yeah, that's fascinating.
02:12:25.840 | So the aggregate is the goal, the tricky thing,
02:12:30.380 | and Jordan Peterson talks about this
02:12:32.220 | because he's under attack way more than you and I
02:12:34.400 | will ever be, but that- - For now.
02:12:37.200 | - For now, right?
02:12:38.500 | This is very true for now.
02:12:40.420 | That the people who attack on the internet,
02:12:45.420 | this is one of the problems with Twitter,
02:12:49.080 | is they don't consider the aggregate.
02:12:53.160 | They take single statements.
02:12:55.300 | And so one of the defense mechanisms,
02:12:58.420 | like again, why Joe has been an inspiration
02:13:01.140 | is that when you in aggregate are a good person,
02:13:05.540 | a lot of people will know that.
02:13:07.060 | And so that makes you much more immune
02:13:08.940 | to the attacks of people
02:13:10.020 | that bring out an individual statement
02:13:11.900 | that might be a misstatement of some kind
02:13:14.220 | or doesn't express who you are.
02:13:16.820 | And so that, I like that idea is the aggregate
02:13:20.940 | and the power of the podcast
02:13:23.780 | is you have hundreds of hours out there
02:13:27.180 | and being yourself and people get to know who you are.
02:13:30.180 | And once they do and you post pictures of screaming Roombas
02:13:34.740 | as you kick them, they will understand
02:13:37.320 | that you mean well.
02:13:38.160 | By the way, as a side comment,
02:13:40.120 | I don't know if I want to release this
02:13:42.940 | because it's not just the Roombas.
02:13:45.520 | - You have a whole dungeon of robots.
02:13:48.460 | - Okay, so this is a problem.
02:13:51.580 | The Boston Dynamics came up against this problem,
02:13:54.420 | but let me just, let me work this out,
02:13:57.260 | like workshop this out with you.
02:13:59.860 | And maybe because we'll post this, people will let me know.
02:14:03.700 | So there's legged robots.
02:14:07.580 | They look like a dog.
02:14:08.820 | They have a very, I'm trying to create
02:14:10.420 | a very real human robot connection,
02:14:13.980 | but like they're also incredible
02:14:15.420 | because you can throw them like off of a building
02:14:19.500 | and they'll land fine.
02:14:21.420 | And it's beautiful.
02:14:22.380 | - That's amazing.
02:14:23.220 | I've seen the Instagram videos of like cats
02:14:25.300 | jumping off of like fifth story buildings
02:14:27.260 | and then walking away.
02:14:29.100 | No one should throw their cat out of a window.
02:14:30.980 | - This is the problem I'm experiencing.
02:14:32.540 | Like certainly kicking the robots.
02:14:34.700 | It's really fascinating how they recover from those kicks,
02:14:38.000 | but like just seeing myself do it
02:14:40.440 | and also seeing others do it, it just does not look good.
02:14:43.340 | And I don't know what to do with that.
02:14:44.920 | 'Cause it's such a-
02:14:46.260 | - I'll do it.
02:14:47.100 | See, but you don't, 'cause you-
02:14:52.860 | - Robot.
02:14:53.700 | No, I'm kidding.
02:14:54.540 | Now I'm, you know what's interesting?
02:14:55.960 | - Yeah.
02:14:56.800 | - Before today's conversation,
02:14:58.660 | I probably could do it.
02:14:59.980 | And now I think I'm thinking about robots,
02:15:03.020 | bills of rights and things.
02:15:04.140 | I'm actually, and not for any,
02:15:05.700 | not to satisfy you or to satisfy anything,
02:15:08.580 | except that if they have some sentient aspect
02:15:13.020 | to their being, then I would be loath to kick it.
02:15:16.380 | - I don't think you'd be able to kick it.
02:15:17.660 | You might be able to kick it the first time,
02:15:18.980 | but not the second.
02:15:20.120 | This is the problem I've experienced.
02:15:21.820 | One of the cool things is one of the robots I'm working with,
02:15:26.580 | you can pick it up by one leg and it's dangling.
02:15:29.340 | You can throw it in any kind of way
02:15:31.540 | and it'll land correctly.
02:15:33.660 | So it's really-
02:15:34.500 | - I had a friend who had a cat like that.
02:15:35.860 | [laughter]
02:15:37.580 | - Oh man, we look forward to the letters from the cat.
02:15:40.560 | - Oh no, I'm not suggesting anyone did that.
02:15:42.100 | But he had this cat and the cat,
02:15:44.540 | he would just throw it onto the bed from across the room
02:15:46.820 | and then it would run back for more.
02:15:48.820 | Somehow they had, that was the nature of the relationship.
02:15:51.580 | I think no one should do that to an animal,
02:15:54.100 | but apparently this cat seemed to return for it
02:15:57.380 | for whatever reason.
02:15:58.220 | - The robot is a robot and it's fascinating to me
02:16:00.100 | how hard it is for me to do that.
02:16:02.240 | So it's unfortunate,
02:16:04.000 | but I don't think I can do that to a robot.
02:16:06.260 | Like I struggle with that.
02:16:08.240 | So for me to be able to do that with a robot,
02:16:13.520 | I have to almost get like into the state
02:16:15.820 | that I imagine like doctors get into
02:16:17.700 | when they're doing surgery.
02:16:19.340 | Like I have to start,
02:16:20.740 | I have to do what robotics colleagues of mine do,
02:16:22.980 | which is like start seeing it as an object.
02:16:25.140 | - Dissociate.
02:16:25.980 | - Like dissociate.
02:16:26.880 | So it was just fascinating that I have to do that
02:16:28.820 | in order to do that with a robot.
02:16:30.700 | I just wanted to take that a little bit of a tangent.
02:16:33.620 | - No, I think it's an important thing.
02:16:34.980 | I mean, I am not shy about the fact that for many years,
02:16:39.980 | I've worked on experimental animals
02:16:42.420 | and that's been a very challenging aspect
02:16:44.700 | to being a biologist, mostly mice,
02:16:46.880 | but in the past no longer, thank goodness,
02:16:49.220 | 'cause I just don't like doing it, larger animals as well.
02:16:52.700 | And now I work on humans,
02:16:53.740 | which I can give consent, verbal consent.
02:16:56.260 | So I think that it's extremely important
02:17:00.820 | to have an understanding of what the guidelines are
02:17:03.880 | and where one's own boundaries are around this.
02:17:06.180 | It's not just an important question.
02:17:09.020 | It might be the most important question
02:17:11.040 | before any work can progress.
02:17:12.860 | - So you asked me about friendship.
02:17:14.940 | I know you have a lot of thoughts about friendship.
02:17:18.180 | What do you think is the value of friendship in life?
02:17:22.320 | - Well, for me personally,
02:17:24.820 | just because of my life trajectory and arc of friendship,
02:17:29.820 | and I should say I do have some female friends
02:17:34.460 | that are just friends,
02:17:35.620 | they're completely platonic relationships,
02:17:37.100 | but it's been mostly male friendship to me has been-
02:17:39.660 | - Has been all male friendships to me actually.
02:17:41.920 | - Interesting, yeah.
02:17:43.020 | It's been an absolute lifeline.
02:17:45.680 | They are my family.
02:17:47.020 | I have a biological family
02:17:48.260 | and I have great respect and love for them
02:17:50.040 | and an appreciation for them,
02:17:51.260 | but it's provided me the,
02:17:55.300 | I wouldn't even say confidence
02:17:58.880 | because there's always an anxiety in taking any good risk
02:18:02.460 | or any risk worth taking.
02:18:04.220 | It's given me the sense that I should go for certain things
02:18:08.860 | and try certain things to take risks,
02:18:10.720 | to weather that anxiety.
02:18:12.140 | And I don't consider myself
02:18:14.640 | a particularly competitive person,
02:18:16.780 | but I would sooner die than disappoint
02:18:21.780 | or let down one of my friends.
02:18:24.620 | I can think of nothing worse actually
02:18:26.920 | than disappointing one of my friends.
02:18:29.120 | Everything else is secondary to me.
02:18:31.260 | - Well, disappointment-
02:18:33.100 | - Disappointing meaning not, I mean,
02:18:36.140 | certainly I strive always to show up
02:18:39.580 | as best I can for the friendship.
02:18:41.580 | And that can be in small ways.
02:18:43.020 | That can mean making sure the phone is away.
02:18:45.000 | Sometimes it's about, I'm terrible with punctuality
02:18:49.920 | 'cause I'm an academic and so I just get lost in time
02:18:52.380 | and I don't mean anything by it,
02:18:53.420 | but it's striving to listen, to enjoy good times
02:18:57.620 | and to make time.
02:18:59.100 | It kind of goes back to this first variable we talked about,
02:19:01.860 | to make sure that I spend time
02:19:03.980 | and to get time in person and check in.
02:19:06.720 | And I think there's so many ways
02:19:10.640 | in which friendship is vital to me.
02:19:12.580 | It's actually, to me, what makes life worth living.
02:19:14.820 | - Yeah, well, I am surprised with the high school friends
02:19:19.820 | how we don't actually talk that often these days
02:19:22.640 | in terms of time, but every time we see each other,
02:19:24.940 | it's immediately right back to where we started.
02:19:27.060 | So I struggle with that, how much time you really allocate
02:19:30.360 | for the friendship to be deeply meaningful
02:19:34.400 | because they're always there with me,
02:19:36.740 | even if we don't talk often.
02:19:38.280 | So there's a kind of loyalty.
02:19:40.980 | I think maybe it's a different style,
02:19:44.000 | but I think to me, friendship is being there
02:19:48.440 | in the hard times, I think.
02:19:51.120 | I'm much more reliable when you're going through shit
02:19:56.460 | than like-
02:19:57.580 | - You're pretty reliable anyway.
02:19:59.560 | - No, but if you're like a wedding or something like that,
02:20:03.120 | or like, I don't know, like you want an award of some kind,
02:20:08.120 | like, yeah, I'll congratulate the shit out of you,
02:20:12.200 | but like that's not, and I'll be there,
02:20:14.460 | but that's not as important to me as being there
02:20:16.700 | when like nobody else is, like just being there
02:20:20.240 | when shit hits the fan or something's tough
02:20:23.820 | where the world turns their back on you,
02:20:25.900 | all those kinds of things.
02:20:26.880 | That to me, that's where friendship is meaningful.
02:20:29.340 | - Well, I know that to be true about you,
02:20:30.920 | and that's a felt thing and a real thing with you.
02:20:33.420 | Let me ask one more thing about that actually,
02:20:35.560 | because I'm not a practitioner of jiu-jitsu.
02:20:38.420 | I know you are, Joe is, but years ago I read a book
02:20:41.420 | that I really enjoyed,
02:20:42.400 | which is Sam Sheridan's book, "A Fighter's Heart."
02:20:44.780 | He talks about all these different forms of martial arts,
02:20:46.920 | and maybe it was in the book, maybe it was in an interview,
02:20:50.640 | but he said that fighting or being in physical battle
02:20:55.420 | with somebody, jiu-jitsu, boxing,
02:20:57.340 | or some other form of direct physical contact
02:21:00.220 | between two individuals creates this bond unlike any other,
02:21:04.300 | because he said, it's like a one-night stand.
02:21:06.580 | You're sharing bodily fluids with somebody
02:21:08.480 | that you barely know.
02:21:09.800 | And I chuckled about it 'cause it's kind of funny
02:21:13.100 | and kind of tongue-in-cheek.
02:21:14.820 | But at the same time, I think this is a fundamental way
02:21:18.960 | in which members of a species bond
02:21:22.280 | is through physical contact.
02:21:24.300 | And certainly there are other forms.
02:21:25.880 | There's cuddling and there's hand-holding
02:21:27.360 | and there's sexual intercourse
02:21:29.680 | and there's all sorts of things.
02:21:30.520 | - What's cuddling?
02:21:31.720 | I haven't heard of it.
02:21:32.720 | - I heard this recently, I didn't know this term,
02:21:34.840 | but there's a term, they've turned the noun cupcake
02:21:38.180 | into a verb, cupcaking, it turns out.
02:21:40.540 | I just learned about this.
02:21:41.740 | Cupcaking is when you spend time just cuddling.
02:21:45.060 | I didn't know about this.
02:21:46.300 | You heard it here first,
02:21:47.300 | although I heard it first just the other day,
02:21:48.800 | cupcaking is actually-
02:21:49.980 | - So cuddling is everything, it's not just like,
02:21:51.780 | is it in bed or is it on the couch?
02:21:53.740 | Like what's cuddling?
02:21:55.380 | I need to look up what cuddling is.
02:21:56.380 | - We need to look this up
02:21:57.220 | and we need to define the variables.
02:21:58.980 | I think it definitely has to do with physical contact,
02:22:02.860 | I am told.
02:22:04.180 | But in terms of battle, competition and the Sheridan quote,
02:22:09.180 | I'm just curious, so do you get close
02:22:15.060 | or feel a bond with people that, for instance,
02:22:20.660 | you rolled jiu-jitsu with,
02:22:20.660 | even though you don't know anything else about them?
02:22:23.940 | Was he right about this?
02:22:25.500 | - Yeah, I mean, on many levels.
02:22:27.140 | He also has the book, "A Fighter's Mind" and "A Fighter's Art."
02:22:30.900 | - He's actually an excellent writer.
02:22:32.140 | What's interesting about him, just briefly about Sheridan,
02:22:34.500 | I don't know, but I did a little bit of research.
02:22:36.140 | He went to Harvard, he was an art major at Harvard.
02:22:40.100 | He claims all he did was smoke cigarettes and do art.
02:22:43.420 | I don't know if his art was any good.
02:22:45.100 | And I think his father was in the SEAL teams.
02:22:48.980 | And then when he got out of Harvard, graduated,
02:22:51.740 | he took off around the world,
02:22:52.900 | learning all the forms of martial arts
02:22:54.420 | and was early to the kind of ultimate fighting
02:22:56.640 | to kind of mix martial arts and things.
02:22:59.020 | Great, great book.
02:22:59.960 | - Yeah, it's amazing.
02:23:01.260 | I don't actually remember it, but I read it
02:23:03.140 | and I remember thinking there was an amazing encapsulation
02:23:06.980 | of what makes fighting, like what makes it compelling.
02:23:11.980 | I would say that there's so many ways that jiu-jitsu,
02:23:17.100 | grappling, wrestling, combat sports in general,
02:23:21.100 | is like one of the most intimate things you could do.
02:23:24.240 | I don't know if I would describe it
02:23:25.620 | in terms of bodily liquids and all those kinds of things.
02:23:27.900 | - I think he was more or less joking.
02:23:29.460 | - But I think there's a few ways that it does that.
02:23:34.460 | So one, because you're so vulnerable.
02:23:39.380 | So that the honesty of stepping on the mat
02:23:44.200 | and often all of us have ego thinking we're better
02:23:49.580 | than we are at this particular art.
02:23:52.380 | And then the honesty of being submitted
02:23:55.900 | or being worse than you thought you are
02:23:58.700 | and just sitting with that knowledge.
02:24:01.000 | That kind of honesty, we don't get to experience it
02:24:03.700 | in most of daily life.
02:24:06.040 | We can continue living somewhat of an illusion
02:24:08.540 | of our conceptions of ourselves
02:24:10.740 | 'cause people are not going to hit us with the reality.
02:24:13.740 | The mat speaks only the truth,
02:24:15.780 | that the reality just hits you.
02:24:17.820 | And that vulnerability is the same
02:24:19.940 | as like the loss of a loved one.
02:24:22.500 | It's the loss of a reality that you knew before.
02:24:26.300 | You now have to deal with this new reality.
02:24:28.180 | And when you're sitting there in that vulnerability
02:24:30.460 | and there's these other people
02:24:32.100 | that are also sitting in that vulnerability,
02:24:34.320 | you get to really connect like, fuck.
02:24:36.920 | Like I'm not as special as I thought I was
02:24:40.220 | and life is like not,
02:24:42.760 | life is harsher than I thought it was
02:24:46.020 | and we're just sitting there with that reality.
02:24:47.660 | Some of us can put words to it, some of us can't.
02:24:50.020 | So I think that definitely is the thing
02:24:51.660 | that leads to intimacy.
02:24:54.620 | The other thing is the human contact.
02:24:58.580 | There is something about, I mean, like a big hug.
02:25:03.580 | Like during COVID, very few people hugged me
02:25:06.780 | and I hugged them and I always felt good when they did.
02:25:10.240 | Like we're all tested and especially now we're vaccinated
02:25:13.860 | but there's still people, this is true in San Francisco,
02:25:16.360 | this is true in Boston.
02:25:17.340 | They want to keep not only six feet away
02:25:19.340 | but stay at home and never touch you.
02:25:21.720 | That loss of basic humanity is the opposite
02:25:26.720 | of what I feel in jiu-jitsu where it was like
02:25:30.480 | that contact where you're like, I don't give a shit
02:25:34.900 | about whatever rules we're supposed to have in society
02:25:37.400 | where you're not, you have to keep a distance
02:25:39.380 | and all that kind of stuff.
02:25:40.440 | Just the hug, like the intimacy of a hug
02:25:45.000 | that's like a good bear hug
02:25:46.440 | and you're like just controlling another person.
02:25:49.720 | And also there is some kind of love communicated
02:25:52.100 | through just trying to break each other's arms.
02:25:54.660 | I don't exactly understand why violence
02:25:57.600 | is such a close neighbor to love but it is.
02:26:02.600 | - Well, in the hypothalamus,
02:26:04.740 | the neurons that control sexual behavior
02:26:08.220 | but also non-sexual contact are not just nearby the neurons
02:26:13.220 | that control aggression and fighting,
02:26:15.780 | they are salt and pepper with those neurons.
02:26:19.540 | It's very interesting and it almost sounds
02:26:23.120 | kind of risque and controversial and stuff.
02:26:25.020 | I'm not anthropomorphizing about what this means
02:26:27.820 | but in the brain, those structures are interdigitated.
02:26:32.020 | You can't separate them except at a very fine level.
02:26:36.020 | And here the way you describe it is the same
02:26:38.780 | as a real thing.
02:26:39.700 | - I do want to make an interesting comment.
02:26:42.860 | Again, these are the things
02:26:43.740 | that could be taken out of context
02:26:45.660 | but one of the amazing things about jiu-jitsu
02:26:50.540 | is both guys and girls train it.
02:26:52.240 | And I was surprised.
02:26:54.740 | So I'm a big fan of yoga pants at the gym kind of thing.
02:26:59.740 | It reveals the beauty of the female form.
02:27:04.780 | But the thing is girls are dressed in skintight clothes
02:27:08.740 | in jiu-jitsu often.
02:27:10.180 | And I found myself not at all thinking like that at all
02:27:13.900 | when training with girls.
02:27:14.980 | - Well, the context is very non-sexual.
02:27:17.140 | - But I was surprised to learn that.
02:27:20.040 | When I first started jiu-jitsu,
02:27:21.180 | I thought wouldn't that be kind of weird
02:27:22.660 | to train with the opposite sex in something so intimate?
02:27:26.340 | - Boys and girls, men and women,
02:27:28.500 | they rolled jiu-jitsu together completely.
02:27:30.940 | - Interesting.
02:27:31.780 | - And the only times girls kind of try to stay away
02:27:34.500 | from guys, I mean, there's two contexts.
02:27:36.380 | Of course, there's always going to be creeps in this world.
02:27:38.860 | So everyone knows who to stay away from.
02:27:42.100 | And the other is like, there's a size disparity.
02:27:44.280 | So girls will often try to roll with people
02:27:46.260 | a little bit closer weight-wise.
02:27:48.260 | But no, that's one of the things
02:27:51.100 | that are empowering to women.
02:27:53.100 | That's what they fall in love with
02:27:54.260 | when they started doing jiu-jitsu is I can,
02:27:56.320 | first of all, they gain an awareness
02:27:58.700 | and a pride over their body, which is great.
02:28:00.980 | And then second, they get, especially later on,
02:28:04.360 | start submitting big dudes,
02:28:05.940 | like these like bros that come in who are all shredded
02:28:10.100 | and like muscular, and they get to use technique to exercise
02:28:14.760 | dominance over them.
02:28:16.100 | And that's a powerful feeling.
02:28:17.720 | - You've seen women force a larger guy to tap
02:28:21.560 | or even choke him out.
02:28:22.400 | - Well, I was deadlifting for,
02:28:26.080 | oh boy, I think it's 495.
02:28:31.520 | So I was really into powerlifting when I started jiu-jitsu.
02:28:35.000 | And I remember being submitted by,
02:28:37.400 | I thought I walked in feeling like I'm going to be,
02:28:40.660 | if not the greatest fighter ever, at least top three.
02:28:43.220 | And so as a white belt, you roll in like all happy.
02:28:47.420 | And then you realize that as long as you're not applying
02:28:50.740 | too much force that you're having,
02:28:52.940 | I remember being submitted many times by like 130,
02:28:55.540 | 120 pound girls at Balance Studios in Philadelphia
02:28:59.860 | that has a lot of incredible female jiu-jitsu players.
02:29:02.460 | And that's really humbling too.
02:29:04.360 | The technique can overpower in combat pure strength.
02:29:09.360 | And that's the other thing that there is something
02:29:15.540 | about combat that's primal.
02:29:18.140 | Like it just feels, it feels like we were born to do this.
02:29:23.140 | - Like that we have circuits in our brain
02:29:29.040 | that are dedicated to this kind of interaction.
02:29:32.380 | There's no question.
02:29:34.060 | And like, that's what it felt like.
02:29:35.700 | It wasn't that I'm learning a new skill.
02:29:38.940 | It was like somehow I am remembering echoes
02:29:42.780 | of something I've learned in the past.
02:29:44.380 | - Well, it's like hitting puberty.
02:29:45.660 | A child before puberty has no concept of boys and girls
02:29:49.260 | having this attraction regardless of whether or not
02:29:51.840 | they're attracted to boys or girls.
02:29:53.140 | It doesn't matter at some point.
02:29:54.600 | Most people, not all, but certainly,
02:29:56.380 | but most people when they hit puberty,
02:29:58.340 | suddenly people appear differently.
02:30:01.300 | And certain people take on a romantic or sexual interest
02:30:05.540 | for the very first time.
02:30:07.560 | And so it's like, it's revealing a circuitry in the brain.
02:30:11.380 | It's not like they learn that, it's innate.
02:30:14.320 | And I think when I hear the way you describe jiu-jitsu
02:30:18.100 | and rolling jiu-jitsu, it reminds me a little bit,
02:30:21.060 | Joe was telling me recently about the first time
02:30:23.300 | he went hunting and he felt like it revealed a circuit
02:30:26.660 | that was in him all along,
02:30:28.540 | but he hadn't experienced before.
02:30:30.900 | - Yeah, that's definitely there.
02:30:32.320 | And of course there's the physical activity.
02:30:34.920 | One of the interesting things about jiu-jitsu
02:30:37.220 | is it's one of the really strenuous exercises
02:30:40.860 | that you can do late into your adult life,
02:30:43.860 | like into your 50s, 60s, 70s, 80s.
02:30:48.000 | When I came up, there's a few people in their 80s
02:30:50.500 | that were training.
02:30:51.700 | And as long as you're smart,
02:30:53.080 | as long as you practice techniques
02:30:54.840 | and pick your partners correctly,
02:30:55.940 | you can do that kind of art
02:30:57.320 | late into life, and so you're getting exercise.
02:30:59.860 | There's not many activities I find
02:31:01.900 | that are amenable to that.
02:31:04.500 | So because it's such a thinking game,
02:31:07.740 | the jiu-jitsu in particular is an art
02:31:10.860 | where technique pays off a lot.
02:31:13.340 | So you can still maintain, first of all,
02:31:17.220 | remain injury-free if you use good technique
02:31:20.820 | and also through good technique,
02:31:22.960 | be able to go be active with people
02:31:26.920 | that are much, much younger.
02:31:28.460 | And so that was, to me, that and running
02:31:31.560 | are the two activities you can kind of do late in life
02:31:33.820 | because to me, a healthy life has exercises
02:31:38.260 | as a piece of the puzzle.
02:31:39.380 | - No, absolutely.
02:31:40.420 | And I'm glad that we're on the physical component
02:31:42.720 | because I know that there's, for you,
02:31:47.720 | you've talked before about the crossover
02:31:49.680 | between the physical and the intellectual and the mental.
02:31:52.480 | Are you still running at ridiculous hours of the night
02:31:57.680 | for ridiculously long?
02:31:59.600 | - Yeah, so definitely.
02:32:01.520 | I've been running late at night here in Austin.
02:32:03.520 | People tell me, the area we're in now,
02:32:05.800 | people say is a dangerous area,
02:32:07.240 | which I find laughable coming from the bigger cities.
02:32:10.580 | No, I run late at night.
02:32:12.780 | There's something.
02:32:14.180 | - If you see a guy running through Austin at 2 a.m.
02:32:17.580 | in a suit and tie, it's probably Lex.
02:32:20.140 | [laughing]
02:32:22.200 | - Well, yeah, I mean, I do think about that
02:32:24.080 | 'cause I get recognized more and more in Austin.
02:32:26.640 | I worry that, not really,
02:32:29.040 | that I get recognized late at night.
02:32:30.800 | But there is something about the night
02:32:34.960 | that brings out those deep philosophical thoughts
02:32:38.960 | and self-reflection that I really enjoy.
02:32:40.720 | But recently, I started getting back to the grind.
02:32:44.560 | So I'm gonna be competing or hoping to compete
02:32:47.720 | in September and October.
02:32:49.880 | - In jiu-jitsu. - In jiu-jitsu, yeah,
02:32:51.240 | to get back to competition.
02:32:53.120 | And so that requires getting back into great cardio shape.
02:33:02.280 | I've been getting running in as part of my daily routine.
02:33:02.280 | - Got it.
02:33:03.640 | Well, I always know I can reach you,
02:33:05.140 | regardless of time zone in the middle of the night,
02:33:08.000 | wherever that happens.
02:33:09.280 | - Well, part of that has to be just being single
02:33:11.120 | and being a programmer.
02:33:13.780 | Those two things just don't work well
02:33:16.040 | in terms of a steady sleep schedule.
02:33:18.040 | - It's not banker's hours kind of work, nine to five.
02:33:21.440 | I want to, you mentioned single.
02:33:23.280 | I want to ask you a little bit
02:33:24.840 | about the other form of relationship,
02:33:26.420 | which is romantic love.
02:33:29.700 | So your parents are still married?
02:33:32.360 | - Still married, still happily married.
02:33:34.120 | - That's impressive.
02:33:34.960 | - Yeah. - A rare thing nowadays.
02:33:36.620 | - Yeah.
02:33:37.460 | - So you grew up with that example.
02:33:38.960 | - Yeah, I guess that's a powerful thing, right?
02:33:40.920 | If there's an example that I think can work.
02:33:43.120 | - Yeah, I didn't have that in my own family,
02:33:46.480 | but when I see it, it's inspiring
02:33:50.880 | and it's beautiful, the fact that they have that
02:33:53.280 | and that was the norm for you, I think is really wonderful.
02:33:57.000 | - In the case of my parents, it was interesting to watch
02:34:00.060 | 'cause there's obviously tension.
02:34:03.240 | Like there'll be times where they fought
02:34:04.920 | and all those kinds of things.
02:34:06.680 | They obviously get frustrated with each other
02:34:09.920 | and they like, but they find mechanisms
02:34:13.320 | how to communicate that to each other,
02:34:15.000 | like to make fun of each other a little bit,
02:34:16.520 | like to tease, to get some of that frustration out
02:34:19.400 | and then ultimately to reunite
02:34:21.040 | and find their joyful moments and be that the energy.
02:34:25.280 | I think it's clear 'cause they got together in their,
02:34:27.600 | I think, early twenties, like very, very young.
02:34:30.280 | I think you grow together as people.
02:34:32.440 | - Yeah, you're still in the critical period
02:34:34.320 | of brain plasticity.
02:34:35.560 | - And also, I mean, it's just like divorce
02:34:40.200 | was so frowned upon that you stick it out.
02:34:43.840 | And I think a lot of couples,
02:34:44.960 | especially from that time in the Soviet Union,
02:34:46.640 | that's probably applies to a lot of cultures.
02:34:48.600 | You stick it out and you put in the work,
02:34:50.640 | you learn how to put in the work.
02:34:52.280 | And once you do, you start to get to some of those
02:34:54.920 | rewarding aspects of being like through time,
02:34:59.060 | sharing so many moments together.
02:35:00.760 | That's definitely something that was an inspiration to me,
02:35:07.820 | but maybe that's where I have,
02:35:09.720 | so I have a similar kind of longing
02:35:11.340 | to have a lifelong partner that have that kind of view
02:35:14.880 | where same with friendship,
02:35:16.760 | lifelong friendship is the most meaningful kind,
02:35:20.600 | that there is something with that time
02:35:22.040 | of sharing all that time together,
02:35:24.120 | like till death do us part as a powerful thing,
02:35:26.800 | not by force, not because the religion said it
02:35:29.200 | or the government said it or your culture said it,
02:35:31.580 | but because you want to.
02:35:33.240 | - Do you want children?
02:35:34.520 | - Definitely, yeah.
02:35:35.920 | Definitely want children.
02:35:37.320 | It's common.
02:35:39.200 | - How many Roombas do you have?
02:35:41.280 | - Oh, I thought you meant, you know, human children.
02:35:43.240 | - No, human children.
02:35:44.440 | - 'Cause I already have the children.
02:35:45.560 | - Exactly, well, I was saying you probably need
02:35:46.880 | at least as many human children as you do Roombas.
02:35:49.520 | Big family, small family.
02:35:50.980 | In your mind's eye, is there a big,
02:35:55.160 | are there a bunch of Fridmans running around?
02:35:59.160 | - So I'll tell you like realistically,
02:36:01.240 | I can explain exactly my thinking.
02:36:04.080 | And this is similar to the robotics work
02:36:06.160 | is if I'm like purely logical right now,
02:36:10.080 | my answer would be, I don't want kids
02:36:12.340 | because I just don't have enough time.
02:36:15.720 | I have so much going on.
02:36:17.280 | But when I'm using the same kind of vision
02:36:19.240 | I use for the robots is I know my life
02:36:22.620 | will be transformed with the first.
02:36:25.000 | Like I know I would love being a father.
02:36:27.640 | And so the question of how many,
02:36:30.000 | that's on the other side of that hill.
02:36:33.140 | It could be some ridiculous number.
02:36:35.880 | So I just know that--
02:36:36.800 | - I have a feeling and I don't have a crystal ball,
02:36:40.820 | but I don't know.
02:36:42.600 | I see upwards of, certainly three or more comes to mind.
02:36:47.600 | - So much of that has to do with the partner you're with too.
02:36:51.920 | So like that's such an open question,
02:36:55.560 | especially in this society of what the right partnership is.
02:36:58.880 | 'Cause I'm deeply empathetic.
02:37:02.800 | I want to see, like to me,
02:37:05.100 | what I look for in a relationship is for me
02:37:08.520 | to be really excited about the passions of another person,
02:37:12.080 | like whatever they're into.
02:37:13.020 | It doesn't have to be a career success, any kind of success,
02:37:16.860 | just to be excited for them
02:37:18.440 | and for them to be excited for me
02:37:20.540 | and they can share in that excitement
02:37:21.840 | and build and build and build.
02:37:23.720 | But there was also practical aspects of like,
02:37:25.800 | what kind of shit do you enjoy doing together?
02:37:28.760 | And I think family is a real serious undertaking.
02:37:32.340 | - Oh, it certainly is.
02:37:34.040 | I mean, I think that I have a friend who said it,
02:37:37.480 | I think best, which is that you first have,
02:37:41.240 | he's in a very successful relationship and has a family.
02:37:44.440 | And he said, you first have to define the role
02:37:47.480 | and then you have to cast the right person for the role.
02:37:50.180 | - Well, yeah, there's some deep aspects to that,
02:37:53.880 | but there's also an aspect to which you're not smart enough
02:37:58.280 | from this side of it to define the role.
02:38:03.000 | There's part of it that has to be a leap
02:38:04.840 | that you have to take.
02:38:06.520 | And I see having kids that way,
02:38:11.520 | you just have to go with it and figure it out also.
02:38:16.120 | As long as there's love there,
02:38:17.640 | like what the hell is life for even?
02:38:20.580 | So there's so many incredibly successful people that I know,
02:38:25.360 | that I've gotten to know, that all have kids.
02:38:28.960 | And the presence of kids for the most part
02:38:32.800 | has only been something that energized them,
02:38:36.620 | something that gave them meaning,
02:38:37.900 | something that made them the best version of themselves,
02:38:40.200 | like made them more productive, not less,
02:38:42.380 | which is fascinating to me.
02:38:43.680 | - It is fascinating.
02:38:44.520 | I mean, you can imagine if the way that you felt about Homer,
02:38:47.260 | the way that I feel and felt about Costello
02:38:49.760 | is at all a glimpse of what that must be like then.
02:38:54.600 | - Exactly.
02:38:55.840 | The downside, the thing I worry more about
02:39:00.020 | is the partner side of that.
02:39:04.560 | I've seen the kids are almost universally
02:39:07.840 | a source of increased productivity and joy and happiness.
02:39:10.840 | Like, yeah, they're a pain in the ass.
02:39:13.280 | Yeah, it's complicated.
02:39:14.120 | Yeah, so on and so forth.
02:39:15.640 | People like to complain about kids.
02:39:17.440 | But when you actually look past
02:39:19.620 | that little shallow layer of complaint, kids are great.
02:39:22.940 | The source of pain for a lot of people
02:39:24.760 | is when the relationship doesn't work.
02:39:27.600 | And so I'm very kind of concerned about,
02:39:30.940 | dating is very difficult and I'm a complicated person.
02:39:36.560 | And so it's been very difficult
02:39:38.220 | to find the right kind of person.
02:39:41.980 | But that statement doesn't even make sense
02:39:45.040 | because I'm not on dating apps.
02:39:46.880 | I don't see people.
02:39:48.260 | You're like the first person I saw in a while.
02:39:50.380 | It's like you, Michael Malice and like Joe.
02:39:53.040 | So like, I don't think I've seen like a female.
02:39:58.040 | What is it?
02:40:00.840 | An element of the female species in quite a while.
02:40:03.680 | So I think you have to put yourself out there.
02:40:06.520 | What is it?
02:40:07.440 | Daniel Johnston says, true love will find you,
02:40:10.240 | but only if you're looking.
02:40:11.760 | So there's some element of really taking the leap
02:40:13.680 | and putting yourself out there
02:40:14.780 | in kind of different situations.
02:40:17.000 | And I don't know how to do that
02:40:18.400 | when you're behind a computer all the time.
02:40:20.460 | - Well, you're a builder and you're a problem solver
02:40:25.200 | and you find solutions and I'm confident this solution is,
02:40:30.200 | the solution is out there and-
02:40:34.260 | - I think you're implying
02:40:35.100 | that I'm going to build the girlfriend, which I think-
02:40:38.500 | - Or that you, well,
02:40:39.780 | and maybe we shouldn't separate this friendship,
02:40:43.360 | the notion of friendship and community.
02:40:45.460 | And if we go back to this concept of the aggregate,
02:40:48.900 | maybe you'll meet this woman through a friend
02:40:52.420 | or maybe or something of that sort.
02:40:53.960 | - So one of the things,
02:40:54.800 | I don't know if you feel the same way,
02:40:56.920 | I'm definitely one of those people
02:40:59.240 | that just falls in love and that's it.
02:41:02.480 | - Yeah, I can't say I'm like that.
02:41:04.000 | With Costello, it was instantaneous.
02:41:06.560 | - Yeah. - It really was.
02:41:07.760 | I mean, I know it's not romantic love,
02:41:09.680 | but it was instantaneous.
02:41:10.760 | No, I, but that's me.
02:41:12.320 | You know, and I think that if you know, you know,
02:41:14.680 | because that's a good thing that you have that.
02:41:18.260 | - Well, it's, I'm very careful with that
02:41:20.860 | because you don't want to fall in love with the wrong person.
02:41:24.980 | So I try to be very kind of careful with,
02:41:27.340 | I've noticed this because I fall in love with everything,
02:41:29.460 | like this mug, everything.
02:41:31.040 | I fall in love with things in this world.
02:41:34.020 | So like, you have to be really careful
02:41:35.500 | because a girl comes up to you and says,
02:41:40.500 | "She loves Dostoevsky."
02:41:42.700 | That doesn't necessarily mean you need to marry her tonight.
02:41:46.700 | - Yes, and I like the way you said that out loud
02:41:49.060 | so that you heard it.
02:41:49.980 | It doesn't mean you need to marry her tonight.
02:41:52.540 | - Exactly. - Right, exactly.
02:41:53.600 | - But I mean, but people are amazing
02:41:56.120 | and people are beautiful.
02:41:57.140 | And that's, so I fully embrace that,
02:42:00.540 | but I also have to be careful with relationships.
02:42:02.940 | And at the same time, like I mentioned to you offline,
02:42:05.820 | I don't, there's something about me
02:42:08.880 | that appreciates swinging for the fences and not dating,
02:42:13.640 | like doing serial dating or dating around.
02:42:15.860 | - Yeah, you're a one guy, one girl kind of guy.
02:42:17.580 | - Yeah. - You said that.
02:42:18.940 | - And it's tricky because you want to be careful
02:42:23.100 | with that kind of stuff,
02:42:24.180 | especially now there's a growing platform
02:42:26.380 | that has a ridiculous amount of female interest
02:42:29.140 | of a certain kind, but I'm looking for deep connection
02:42:33.580 | and I'm looking by sitting home alone.
02:42:36.580 | And every once in a while, talking to Stanford professors.
02:42:40.180 | - Perfect solution. - On a podcast.
02:42:42.380 | - Perfect solution. - It's going to work out great.
02:42:44.260 | - It's well incorporated, it's part of,
02:42:47.260 | that constitutes machine learning of sorts.
02:42:49.540 | - Yeah, of sorts.
02:42:50.580 | - You mentioned what has now become a quite extensive
02:42:55.860 | and expansive public platform, which is incredible.
02:42:59.260 | I mean, the number of people,
02:43:01.260 | first time I saw your podcast, I noticed the suit.
02:43:03.540 | I was like, he respects his audience, which was great.
02:43:05.460 | But I also thought, this is amazing.
02:43:08.480 | People are showing up for science and engineering
02:43:10.740 | and technology information and those discussions
02:43:12.820 | and other sorts of discussions.
02:43:14.100 | Now, I do want to talk for a moment about the podcast.
02:43:18.100 | So my two questions about the podcast are,
02:43:21.720 | when you started it, did you have a plan?
02:43:24.300 | And regardless of what that answer is,
02:43:27.500 | do you know where you're taking it?
02:43:29.820 | Or would you like to leave us?
02:43:31.960 | I do believe an element of surprise is always fun.
02:43:35.220 | But what about the podcast?
02:43:36.380 | Do you enjoy the podcast?
02:43:37.680 | I mean, your audience certainly includes me,
02:43:40.340 | really enjoys the podcast, it's incredible.
02:43:42.660 | - So I love talking to people.
02:43:46.540 | And there's something about microphones
02:43:50.200 | that really bring out the best in people.
02:43:52.300 | Like you don't get a chance to talk like this.
02:43:54.500 | If you and I were just hanging out,
02:43:56.100 | we would have a very different conversation
02:43:58.500 | in the amount of focus we allocate to each other.
02:44:02.100 | We would be having fun talking about other stuff
02:44:04.380 | and doing other things.
02:44:06.000 | There'd be a lot of distraction.
02:44:07.480 | There would be some phone use and all that kind of stuff.
02:44:11.040 | But here we're 100% focused on each other
02:44:13.980 | and focused on the idea.
02:44:16.080 | And sometimes playing with ideas
02:44:18.200 | that we both don't know the answer to,
02:44:21.080 | like a question we don't know the answer to.
02:44:23.080 | We're both like fumbling with it, trying to figure out,
02:44:25.480 | trying to get some insights.
02:44:27.120 | That's something we haven't really figured out before
02:44:29.520 | and together arriving at that.
02:44:31.280 | I think that's magical.
02:44:32.400 | I don't know why we need microphones for that,
02:44:34.200 | but we somehow do.
02:44:35.240 | - It feels like doing science.
02:44:36.720 | - It feels like doing science for me, definitely.
02:44:38.720 | That's exactly it.
02:44:39.900 | Then, and I'm really glad you said that
02:44:42.220 | because I don't actually often say this,
02:44:46.300 | but that's exactly what I felt like.
02:44:48.900 | I wanted to talk to friends and colleagues at MIT
02:44:52.780 | to do real science together.
02:44:56.420 | That's how I felt about it.
02:44:57.780 | Like to really talk through problems
02:45:00.180 | that are actually interesting.
02:45:01.860 | As opposed to like incremental work
02:45:06.060 | that we're currently working on for a particular conference.
02:45:10.840 | So really asking questions like, what are we doing?
02:45:14.140 | Like, where's this headed to?
02:45:16.000 | Like, what are the big,
02:45:17.200 | is this really going to help us solve,
02:45:19.960 | in the case of AI, solve intelligence?
02:45:22.800 | Like, is this even working on intelligence?
02:45:24.700 | There's a certain sense,
02:45:26.360 | which is why I initially called it artificial intelligence,
02:45:29.980 | is like most of us are not working
02:45:32.900 | on artificial intelligence.
02:45:34.760 | You're working on some very specific problem
02:45:37.680 | and a set of techniques.
02:45:39.480 | At the time, it's machine learning
02:45:41.260 | to solve this particular problem.
02:45:42.980 | This is not going to take us to a system
02:45:45.560 | that is anywhere close to the generalizability
02:45:49.240 | of the human mind.
02:45:51.320 | Like the kind of stuff the human mind can do
02:45:52.940 | in terms of memory, in terms of cognition,
02:45:54.620 | in terms of reasoning, common sense reasoning.
02:45:56.940 | This doesn't seem to take us there.
02:45:58.700 | So the initial impulse was,
02:46:00.540 | can I talk to these folks, do science together,
02:46:04.040 | through conversation?
02:46:05.620 | And I also thought that there was not enough,
02:46:08.620 | I didn't think there was enough good conversations
02:46:13.860 | with world-class minds that I got to meet.
02:46:17.660 | And not the ones with the book,
02:46:19.820 | or this was the thing.
02:46:21.740 | Oftentimes you go on this tour when you have a book,
02:46:24.260 | but there's a lot of minds that don't write books.
02:46:26.640 | - And the books constrain the conversation too,
02:46:28.860 | 'cause then you're talking about this thing, this book.
02:46:31.720 | But I've noticed that with people
02:46:34.460 | that haven't written a book who are brilliant,
02:46:37.260 | we get to talk about ideas in a new way.
02:46:40.060 | We both haven't actually, when we raise a question,
02:46:43.780 | we don't know the answer to it when the question is raised.
02:46:47.140 | And we try to arrive there.
02:46:48.700 | Like, I don't know, I remember asking questions
02:46:52.220 | of world-class researchers in deep learning
02:46:57.140 | of why do neural networks work as well as they do?
02:47:02.580 | That question is often loosely asked,
02:47:06.600 | but like when you have microphones
02:47:09.220 | and you have to think through it
02:47:11.000 | and you have 30 minutes to an hour
02:47:12.460 | to think through it together, I think that's science.
02:47:16.120 | I think that's really powerful.
02:47:17.600 | So that was the one goal.
02:47:19.680 | The other one is,
02:47:20.560 | I again don't usually talk about this,
02:47:25.460 | but there's some sense in which I wanted
02:47:28.500 | to have dangerous conversations.
02:47:30.320 | Part of the reasons I wanted to wear a suit
02:47:35.320 | is like, I want it to be fearless.
02:47:38.700 | The reason I don't usually talk about it
02:47:40.180 | is because I feel like I'm not good at conversation.
02:47:43.000 | So it looks like it doesn't match the current skill level,
02:47:48.000 | but I wanted to have really dangerous conversations
02:47:53.240 | that I uniquely would be able to do.
02:47:58.160 | Not completely uniquely, but like I'm a huge fan
02:48:01.620 | of Joe Rogan and I had to ask myself
02:48:04.540 | what conversations can I do that Joe Rogan can't?
02:48:08.240 | For me, I know I bring this up,
02:48:10.560 | but for me, that person I thought about
02:48:13.760 | at the time was Putin.
02:48:15.640 | Like that's why I bring him up.
02:48:17.580 | He's just like with Costello, he's not just a person.
02:48:22.080 | He's also an idea to me for what I strive for,
02:48:25.580 | just to have those dangerous conversations.
02:48:27.860 | And the reason I'm uniquely qualified is both the Russian,
02:48:31.440 | but also there's the judo and the martial arts.
02:48:34.060 | There's a lot of elements that make me have a conversation
02:48:37.820 | he hasn't had before.
02:48:39.460 | And there's a few other people that I kept in mind,
02:48:44.460 | like Don Knuth, he's a computer scientist from Stanford,
02:48:49.340 | that I thought is one of the most beautiful minds ever.
02:48:54.060 | And nobody really talked to him, like really talked to him.
02:48:59.060 | He's did a few lectures, which people love,
02:49:01.380 | but really just have a conversation with him.
02:49:03.720 | There's a few people like that.
02:49:04.820 | One of them passed away, John Conway, that I never got,
02:49:07.300 | we agreed to talk, but he died before we did.
02:49:10.660 | There's a few people like that, that I thought like,
02:49:13.460 | it's such a crime to not hear those folks.
02:49:19.660 | And I have the unique ability to know how to purchase
02:49:24.660 | a microphone on Amazon and plug it into a device
02:49:28.000 | that records audio and then publish it,
02:49:30.420 | which seems relatively unique.
02:49:32.060 | Like that's not easy in the scientific community,
02:49:34.700 | people knowing how to plug in a microphone.
02:49:36.700 | - No, they can build Faraday cages and two-photon
02:49:39.700 | microscopes and bioengineer all sorts of things.
02:49:43.040 | But the idea that you could take ideas and export them
02:49:47.100 | into a structure or a pseudo structure that people would
02:49:49.740 | benefit from seems like a cosmic achievement to them.
02:49:54.060 | - I don't know if it's a fear or just basically
02:49:57.460 | they haven't tried it, so they haven't learned
02:49:59.340 | the skill level.
02:50:00.300 | - But I think they're not trained.
02:50:01.580 | I mean, we could riff on this for a while,
02:50:03.380 | but I think that, but it's important and maybe we should,
02:50:08.040 | which is that it's, they're not trained to do it.
02:50:11.060 | They're trained to think in specific aims
02:50:12.820 | and specific hypotheses and many of them don't care to.
02:50:17.340 | They became scientists because that's where they felt safe.
02:50:22.660 | And so why would they leave that haven of safety?
02:50:25.880 | - Well, they also don't necessarily always see
02:50:27.860 | the value in it.
02:50:29.300 | We're all together learning.
02:50:30.580 | You and I are learning the value of this.
02:50:33.740 | I think you're probably, you have an exceptionally successful
02:50:38.340 | and amazing podcast that you started just recently.
02:50:40.900 | - Thanks to your encouragement.
02:50:42.420 | - Well, but there's a raw skill there that you're definitely
02:50:47.420 | an inspiration to me in how you do the podcast
02:50:50.400 | in the level of excellence you reach.
02:50:52.860 | But I think you've discovered that that's also
02:50:55.240 | an impactful way to do science, that podcast.
02:50:58.020 | And I think a lot of scientists have not yet discovered
02:51:01.620 | that this is, if they apply same kind of rigor
02:51:06.620 | as they do to academic publication
02:51:09.340 | or to even conference presentations,
02:51:11.780 | and they do that rigor and effort to podcast,
02:51:16.100 | whatever that is, that could be a five minute podcast,
02:51:18.500 | a two hour podcast, it could be conversational,
02:51:20.980 | or it can be more like lecture like.
02:51:22.920 | If they apply that effort, you have the potential to reach
02:51:26.260 | over time, tens of thousands, hundreds of thousands,
02:51:28.600 | millions of people.
02:51:29.800 | And that's really, really powerful.
02:51:32.460 | But yeah, for me, giving a platform to a few of those folks,
02:51:37.460 | especially for me personally,
02:51:40.940 | so maybe you can speak to what fields you're drawn to,
02:51:45.940 | but I thought computer scientists
02:51:48.940 | were especially bad at this.
02:51:53.420 | So there's brilliant computer scientists
02:51:56.300 | that I thought it would be amazing to explore their mind,
02:52:00.300 | explore their thinking.
02:52:02.060 | And so I took that almost on as an effort.
02:52:06.540 | And at the same time, I had other guests in mind
02:52:11.140 | or people that connect to my own interests.
02:52:13.900 | So the wrestling,
02:52:15.500 | wrestling, music, football,
02:52:18.900 | both American football and soccer,
02:52:21.260 | I have a few particular people
02:52:22.620 | that I'm really interested in.
02:52:24.180 | Buvaisar Saitiev, the Saitiev brothers,
02:52:27.980 | even Khabib for wrestling, just to talk to them.
02:52:30.660 | - Oh, 'cause you guys can communicate.
02:52:32.980 | - In Russian and in wrestling, right?
02:52:36.380 | As wrestlers and as Russians.
02:52:38.840 | And so that little,
02:52:41.740 | it's like an opportunity to explore a mind
02:52:44.060 | that I'm able to bring to the world.
02:52:47.680 | And also, I feel like it makes me a better person,
02:52:52.680 | just that being that vulnerable
02:52:55.620 | and exploring ideas together.
02:52:57.380 | I don't know, like good conversation.
02:52:59.820 | I don't know how often you have really good conversation
02:53:01.780 | with friends, but like podcasts are like that.
02:53:04.460 | And it's deeply moving.
02:53:07.100 | - It's the best.
02:53:08.500 | And what you brought through,
02:53:09.920 | I mean, when I saw you sit down with Penrose,
02:53:12.180 | Nobel prize-winning physicists and these other folks,
02:53:15.020 | it's not just 'cause he has a Nobel,
02:53:16.300 | it's what comes out of his mouth is incredible.
02:53:18.020 | And what you were able to hold in that conversation
02:53:22.900 | was so much better.
02:53:24.400 | Light years beyond what he had with any other interviewer,
02:53:28.880 | I don't want to even call you an interviewer
02:53:30.100 | 'cause it's really about conversation.
02:53:31.620 | Light years beyond what anyone else had been able
02:53:34.460 | to engage with him was such a beacon of what's possible.
02:53:39.460 | And I know that, I think that's what people are drawn to.
02:53:42.440 | And there's a certain intimacy that,
02:53:45.540 | certainly if two people are friends as we are
02:53:47.540 | and they know each other, that there's more of that,
02:53:49.860 | but there's an intimacy in those kinds
02:53:51.940 | of private conversations that are made public.
02:53:54.300 | - Well, that's the, with you,
02:53:57.900 | you're probably starting to realize, and Costello is like,
02:54:01.940 | part of it, because you're authentic
02:54:04.540 | and you're putting yourself out there completely,
02:54:07.000 | people are almost not just consuming
02:54:10.860 | the words you're saying.
02:54:13.300 | They also enjoy watching you, Andrew, struggle
02:54:18.180 | with these ideas or try to communicate these ideas.
02:54:20.700 | They like the flaws.
02:54:21.700 | They like a human being exploring ideas.
02:54:24.860 | - Well, that's good 'cause I got plenty of those.
02:54:26.500 | - Well, they like the self-critical aspects,
02:54:28.660 | like where you're very careful,
02:54:30.300 | where you're very self-critical about your flaws.
02:54:33.060 | I mean, in that same way, it's interesting, I think,
02:54:35.720 | for people to watch me talk to Penrose,
02:54:37.940 | not just because Penrose is communicating ideas,
02:54:42.300 | but here's this like silly kid trying to explore ideas.
02:54:46.980 | Like they know this kid that there's a human connection
02:54:50.100 | that is really powerful.
02:54:51.240 | Same, I think, with Putin, right?
02:54:53.580 | Like it's not just a good interview with Putin.
02:54:57.440 | It's also, here's this kid struggling
02:55:00.900 | to talk with one of the most powerful
02:55:03.780 | and some would argue dangerous people in the world.
02:55:08.260 | They love that, the authenticity that led up to that.
02:55:11.620 | And in return, I get to connect with everybody I run into
02:55:15.940 | in the street and all those kinds of things.
02:55:18.140 | There's a depth of connection there
02:55:21.020 | almost within like a minute or two.
02:55:22.940 | That's unlike any other.
02:55:24.300 | - Yeah, there's an intimacy that you've formed with them.
02:55:26.840 | - Yeah, we've been on this like journey together.
02:55:29.740 | I mean, I have the same thing with Joe Rogan
02:55:31.300 | before I ever met him, right?
02:55:32.740 | Like I was, because I was a fan of Joe for so many years,
02:55:36.620 | there's something, there's a kind of friendship
02:55:40.900 | as absurd as it might be to say in podcasting
02:55:44.460 | and listening to podcasts.
02:55:45.960 | - Yeah, maybe it fills in a little bit of that
02:55:48.860 | or solves a little bit of that loneliness
02:55:50.640 | that you were talking about earlier.
02:55:51.480 | - Until the robots are here.
02:55:53.120 | [both laughing]
02:55:54.540 | - I have just a couple more questions,
02:55:56.540 | but one of them is on behalf of your audience,
02:55:59.560 | which is, I'm not going to ask you
02:56:02.720 | the meaning of the hedgehog,
02:56:04.460 | but I just want to know, does it have a name?
02:56:08.460 | And you don't have to tell us the name,
02:56:09.920 | but just does it have a name, yes or no?
02:56:12.400 | - Well, there's a name he likes to be referred to as,
02:56:17.400 | and then there's a private name
02:56:19.280 | in the privacy of our own company that we call each other.
02:56:21.280 | No, I'm not that insane.
02:56:24.380 | No, his name is Hedgy.
02:56:25.680 | He's a hedgehog.
02:56:28.680 | I don't like stuffed animals,
02:56:30.280 | but his story is one of minimalism.
02:56:35.620 | So I gave away everything I own now three times in my life.
02:56:40.620 | By everything, I mean, almost everything,
02:56:43.000 | kept jeans and shirt and a laptop.
02:56:45.260 | And recently it's also been guitar, things like that.
02:56:50.940 | But he survived because he was always in the,
02:56:55.280 | at least in the first two times, was in the laptop bag,
02:56:58.540 | and he just got lucky.
02:57:00.240 | And so I just liked the perseverance of that.
02:57:02.960 | And I first saw him in the,
02:57:05.960 | the reason I got a stuffed animal
02:57:07.360 | and I don't have other stuffed animals
02:57:09.600 | is it was in a thrift store
02:57:11.560 | in this like giant pile of stuffed animals.
02:57:16.000 | And he jumped out at me
02:57:18.080 | because unlike all the rest of them,
02:57:20.000 | he has this intense mean look about him,
02:57:25.000 | that he's just, he's upset at life,
02:57:28.640 | at the cruelty of life.
02:57:31.000 | And just, especially in the contrast
02:57:32.680 | of the other stuffed animals,
02:57:33.680 | they have this dumb smile on their face.
02:57:35.840 | If you look at most stuffed animals,
02:57:37.100 | they have this dumb look on their face
02:57:38.700 | and they're just happy.
02:57:39.640 | It's like Pleasantville.
02:57:40.560 | - That's what we say in neuroscience.
02:57:41.640 | They have a smooth cortex, not many fold.
02:57:44.440 | - Exactly.
02:57:45.280 | And this, like Hedgy like saw through all of it.
02:57:48.000 | He was like Dostoevsky's man from underground.
02:57:52.000 | I mean, there's a sense that he saw the darkness
02:57:54.800 | of the world and persevered.
02:57:56.560 | So I got, and there's also a famous Russian cartoon,
02:58:00.700 | Hedgehog in the Fog, that I grew up with,
02:58:03.920 | I connected with.
02:58:04.760 | There's people who know of that cartoon.
02:58:07.600 | You can see it on YouTube.
02:58:09.420 | It's like-- - Hedgehog in the Fog?
02:58:10.780 | - Yeah.
02:58:11.620 | He, it's just as you would expect,
02:58:15.300 | especially from like early Soviet cartoons.
02:58:17.900 | It's a hedgehog, like sad, walking through the fog,
02:58:22.900 | exploring like loneliness and sadness.
02:58:25.300 | It's like, but it's beautiful.
02:58:26.740 | It's like a piece of art.
02:58:27.820 | People should, even if you don't speak Russian,
02:58:29.620 | you'll see, you'll understand.
02:58:31.580 | - Oh, it's, the moment you said that I was gonna ask,
02:58:33.900 | so it's in Russian, but of course it's in Russian.
02:58:35.380 | - It's in Russian, but it's more,
02:58:37.220 | there's very little speaking in it.
02:58:39.020 | It's almost, there's an interesting exploration
02:58:43.020 | of how you make sense of the world
02:58:47.360 | when you see it only vaguely through the fog.
02:58:52.140 | So he's trying to understand the world.
02:58:54.100 | - Here we have Mickey Mouse.
02:58:56.660 | - Yeah. - We have Bugs Bunny.
02:58:58.740 | We have all these crazy animals,
02:59:01.540 | and you have the hedgehog in the fog.
02:59:03.580 | - So there's a certain period, and this is, again,
02:59:07.860 | I don't know what it's attributed to,
02:59:09.260 | but it was really powerful,
02:59:10.820 | which there's a period in Soviet history,
02:59:12.940 | I think probably '70s and '80s,
02:59:16.020 | where like, especially kids were treated very seriously.
02:59:21.020 | Like they were treated like they're able to deal
02:59:24.300 | with the weightiness of life.
02:59:27.820 | And that was reflected in the cartoons.
02:59:29.960 | And there was, it was allowed to have
02:59:33.180 | like really artistic content,
02:59:37.180 | not like dumb cartoons that are trying to get you
02:59:39.500 | to be like smile and run around, but like create art.
02:59:42.320 | Like stuff that, you know how like short cartoons
02:59:45.300 | or short films can win Oscars?
02:59:47.040 | Like that's what they're swinging for.
02:59:48.700 | - So what strikes me about this is a little bit
02:59:51.100 | how we were talking about the suit earlier.
02:59:52.780 | It's almost like they treat kids with respect.
02:59:55.140 | - Yeah. - Like that they have
02:59:57.060 | an intelligence and they honor that intelligence.
02:59:59.740 | - Yeah, they're really just adult in a small body.
03:00:02.360 | Like you want to protect them
03:00:04.620 | from the true cruelty of the world.
03:00:06.140 | - Sure. - But in terms
03:00:06.980 | of their intellectual capacity
03:00:08.460 | or like philosophical capacity, they're right there with you.
03:00:11.540 | And so the cartoons reflected that,
03:00:14.220 | the art that they consumed, education reflected that.
03:00:17.720 | So he represents that.
03:00:19.100 | I mean, there's a sense of, because he survived so long
03:00:24.100 | and because I don't like stuffed animals,
03:00:27.320 | that it's like we've been through all of this together
03:00:30.900 | and it's the same, sharing the moments together.
03:00:32.980 | It's the friendship.
03:00:34.340 | And there's a sense in which, you know,
03:00:36.220 | if all the world turns on you and goes to hell,
03:00:39.000 | at least we got each other.
03:00:41.180 | And he doesn't die because he's an inanimate object, so.
03:00:45.460 | - Until you animate him.
03:00:47.380 | - Until you animate him.
03:00:49.060 | And then I probably wouldn't want to know
03:00:50.420 | what he was thinking about this whole time.
03:00:52.500 | He's probably really into Taylor Swift
03:00:55.200 | or something like that.
03:00:56.040 | It's like that I wouldn't even want to know.
03:00:57.700 | Anyway.
03:00:58.540 | - Well, I now feel a connection to Hedgy the Hedgehog
03:01:02.460 | that I certainly didn't have before.
03:01:04.100 | And I think that encapsulates the kind of possibility
03:01:07.340 | of connection that is possible between human
03:01:12.340 | and other object and through robotics, certainly.
03:01:17.620 | There's a saying that I heard when I was a graduate student
03:01:19.580 | that's just been ringing in my mind
03:01:22.520 | throughout this conversation in such a,
03:01:25.060 | I think, appropriate way, which is that,
03:01:27.180 | Lex, you are in a minority of one.
03:01:31.300 | You are truly extraordinary in your ability
03:01:35.140 | to encapsulate so many aspects of science, engineering,
03:01:40.140 | public communication about so many topics,
03:01:43.740 | martial arts, and the emotional depth that you bring to it,
03:01:47.120 | and just the purposefulness.
03:01:49.060 | And I think if it's not clear to people,
03:01:51.620 | it absolutely should be stated,
03:01:53.880 | but I think it's abundantly clear
03:01:55.460 | that just the amount of time and thinking
03:01:58.400 | that you put into things is,
03:02:01.360 | it is the ultimate mark of respect.
03:02:04.800 | So I'm just extraordinarily grateful for your friendship
03:02:08.040 | and for this conversation.
03:02:09.120 | - I'm proud to be your friend.
03:02:11.140 | And I just wish you showed me the same kind of respect
03:02:13.420 | by wearing a suit and make your father proud
03:02:15.820 | and maybe next time.
03:02:17.480 | - Next time, indeed.
03:02:19.080 | Thanks so much, my friend.
03:02:20.440 | - Thank you.
03:02:21.280 | Thank you, Andrew.
03:02:22.360 | - Thank you for joining me for my discussion
03:02:24.440 | with Dr. Lex Friedman.
03:02:26.280 | If you're enjoying this podcast and learning from it,
03:02:29.140 | please consider subscribing on YouTube.
03:02:31.380 | As well, you can subscribe to us on Spotify or Apple.
03:02:35.480 | Please leave any questions and comments and suggestions
03:02:38.200 | that you have for future podcast episodes and guests
03:02:40.840 | in the comment section on YouTube.
03:02:43.240 | At Apple, you can also leave us up to a five-star review.
03:02:46.800 | If you'd like to support this podcast, we have a Patreon.
03:02:49.640 | That's patreon.com/andrewhuberman.
03:02:52.840 | And there you can support us at any level that you like.
03:02:56.200 | Also, please check out our sponsors mentioned
03:02:58.520 | at the beginning of the podcast episode.
03:03:00.840 | That's the best way to support this podcast.
03:03:03.200 | Links to our sponsors can be found in the show notes.
03:03:06.740 | And finally, thank you for your interest in science.
03:03:09.760 | [upbeat music]