Dr. Lex Fridman: Machines, Creativity & Love | Huberman Lab Podcast #29
Chapters
0:00 Introduction: Lex Fridman
7:35 What is Artificial Intelligence?
26:46 Machine & Human Learning
32:21 Curiosity
36:55 Story Telling Robots
40:48 What Defines a Robot?
44:30 Magic & Surprise
47:37 How Robots Change Us
49:35 Relationships Defined
62:29 Lex’s Dream for Humanity
71:33 Improving Social Media
76:57 Challenges of Creativity
81:49 Suits & Dresses
82:22 Loneliness
90:09 Empathy
95:12 Power Dynamics In Relationships
99:11 Robot Rights
100:20 Dogs: Homer & Costello
112:41 Friendship
119:47 Russians & Suffering
125:38 Public vs. Private Life
134:04 How To Treat a Robot
137:12 The Value of Friendship
140:33 Martial Arts
151:34 Body-Mind Interactions
153:22 Romantic Love
162:51 The Lex Fridman Podcast
175:54 The Hedgehog
181:17 Concluding Statements
00:00:02.280 |
where we discuss science and science-based tools 00:00:10.420 |
and I'm a professor of neurobiology and ophthalmology 00:00:15.020 |
Today, I have the pleasure of introducing Dr. Lex Fridman 00:00:23.180 |
specializing in machine learning, artificial intelligence, 00:00:41.500 |
and I think many of you are probably familiar with Lex 00:00:45.180 |
from his incredible podcast, the Lex Fridman Podcast. 00:00:50.240 |
please subscribe to it, it is absolutely fantastic. 00:01:04.980 |
and about how those interactions can change the way 00:01:13.060 |
relationships with animals, relationships with friends, 00:01:16.420 |
relationships with family, and romantic relationships. 00:01:23.200 |
machines that move and machines that don't move, 00:01:26.500 |
and machines that come to understand us in ways 00:01:28.980 |
that we could never understand for ourselves, 00:01:31.700 |
and how those machines can educate us about ourselves. 00:01:37.420 |
I had no concept of the ways in which machines 00:01:43.700 |
By the end, I was absolutely taken with the idea, 00:01:48.540 |
that interactions with machines of a very particular kind, 00:01:51.920 |
a kind that Lex understands and wants to bring to the world, 00:02:03.320 |
I'm certain you're going to learn a tremendous amount 00:02:05.340 |
from him during the course of our discussion, 00:02:08.940 |
that you think about yourself and about the world. 00:02:15.100 |
from my teaching and research roles at Stanford. 00:02:19.940 |
to bring zero cost to consumer information about science 00:02:22.740 |
and science-related tools to the general public. 00:02:26.720 |
I'd like to thank the sponsors of today's podcast. 00:02:38.500 |
and everything about the sunglasses and eyeglasses 00:02:43.900 |
Now, I've spent a career working on the visual system, 00:02:49.700 |
is how to adjust what you see when it gets darker 00:02:56.820 |
whether or not it's dim in the room or outside, 00:03:01.540 |
you can always see the world with absolute clarity. 00:03:04.300 |
And that just tells me that they really understand 00:03:10.140 |
All these things that work at a real mechanistic level 00:03:14.540 |
In addition, the glasses are very lightweight. 00:03:16.740 |
You don't even notice really that they're on your face. 00:03:23.500 |
so that you could use them not just while working 00:03:25.460 |
or at dinner, et cetera, but while exercising. 00:03:28.580 |
They don't fall off your face or slip off your face 00:03:31.800 |
And as I mentioned, they're extremely lightweight, 00:03:34.860 |
you can use them while cycling, and so forth. 00:03:37.060 |
Also, the aesthetic of Roca glasses is terrific. 00:03:39.680 |
Unlike a lot of performance glasses out there, 00:03:47.200 |
You can wear them for essentially any occasion. 00:04:03.500 |
Today's episode is also brought to us by InsideTracker. 00:04:06.880 |
InsideTracker is a personalized nutrition platform 00:04:15.700 |
I am a big believer in getting regular blood work done 00:04:18.220 |
for the simple reason that many of the factors 00:04:20.700 |
that impact our immediate and long-term health 00:04:23.180 |
can only be assessed from a quality blood test. 00:04:25.820 |
And now with the advent of quality DNA tests, 00:04:28.460 |
we can also get insight into some of our genetic 00:04:30.900 |
underpinnings of our current and long-term health. 00:04:34.300 |
The problem with a lot of blood and DNA tests out there, 00:04:38.500 |
and you don't know what to do with those data. 00:04:43.000 |
but you really don't know what the actionable items are, 00:04:49.740 |
to act in the appropriate ways on the information 00:04:52.580 |
that you get back from those blood and DNA tests. 00:04:55.060 |
And that's through the use of their online platform. 00:04:59.860 |
that tells you what sorts of things can bring the numbers 00:05:02.880 |
for your metabolic factors, endocrine factors, et cetera, 00:05:10.140 |
In fact, I know one individual just by way of example 00:05:19.800 |
They would have never detected that otherwise. 00:05:23.360 |
with a number of deleterious health conditions, 00:05:28.200 |
And so they were able to take immediate action 00:05:33.400 |
And so with InsideTracker, you get that sort of insight. 00:05:35.980 |
And as I mentioned before, without a blood or DNA test, 00:05:38.440 |
there's no way you're going to get that sort of insight 00:05:58.700 |
Today's podcast is brought to us by Athletic Greens. 00:06:05.640 |
I started taking Athletic Greens way back in 2012. 00:06:09.020 |
And so I'm delighted that they're sponsoring the podcast. 00:06:15.800 |
is that it covers all of my vitamin mineral probiotic bases. 00:06:19.440 |
In fact, when people ask me, what should I take? 00:06:22.000 |
I always suggest that the first supplement people take 00:06:26.920 |
is that the things it contains covers your bases 00:06:33.960 |
And the inclusion of probiotics are essential 00:06:51.100 |
And those neurons in turn help keep our brain healthy. 00:06:53.860 |
They influence things like mood, our ability to focus, 00:06:56.500 |
and many, many other factors related to health. 00:07:04.700 |
I mix mine with water and I add a little lemon juice 00:07:14.080 |
And if you do that, you can claim their special offer. 00:07:22.660 |
And they'll give you a year supply of vitamin D3 and K2. 00:07:30.880 |
And now my conversation with Dr. Lex Fridman. 00:07:42.160 |
of people's minds or ought to be on a lot of people's minds 00:07:46.400 |
because we hear these terms a lot these days, 00:07:50.040 |
but I think most people, including most scientists 00:07:56.900 |
what is artificial intelligence and how is it different 00:08:01.900 |
from things like machine learning and robotics? 00:08:05.240 |
So if you would be so kind as to explain to us 00:08:08.900 |
what is artificial intelligence and what is machine learning? 00:08:13.900 |
- Well, I think that question is as complicated 00:08:17.100 |
and as fascinating as the question of what is intelligence. 00:08:37.460 |
or was born as an ancient wish to forge the gods. 00:08:44.260 |
it's our longing to create other intelligence systems, 00:08:50.400 |
At the more narrow level, I think it's also a set of tools 00:09:01.100 |
And then also it's our attempt to understand our own mind. 00:09:05.620 |
So build systems that exhibit some intelligent behavior 00:09:16.160 |
Of course, what AI really means as a community, 00:09:19.340 |
as a set of researchers and engineers, it's a set of tools, 00:09:29.560 |
There's a long history that approaches the problem 00:09:34.220 |
What's always been one of the threads throughout 00:09:40.280 |
is machine learning, which is emphasizing, in the AI space, 00:09:48.120 |
How do you make a machine that knows very little 00:09:50.520 |
in the beginning, follow some kind of process 00:09:53.800 |
and learns to become better and better at a particular task? 00:09:58.260 |
What's been most effective in roughly the last 15 years 00:10:03.260 |
is a set of techniques that fall under the flag 00:10:05.680 |
of deep learning that utilize neural networks. 00:10:08.680 |
What neural networks are, are these fascinating things 00:10:12.120 |
inspired by the structure of the human brain very loosely, 00:10:17.120 |
but they have, it's a network of these little basic 00:10:20.540 |
computational units called neurons, artificial neurons. 00:10:24.400 |
And they have, these architectures have an input and output. 00:10:30.120 |
and they're tasked with learning something interesting. 00:10:37.580 |
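As a rough sketch of the kind of input-output architecture being described here, the following PyTorch snippet builds a tiny network of artificial neurons; the layer sizes and the ten output classes are arbitrary assumptions chosen only for illustration, not anything specified in the conversation.

```python
import torch
import torch.nn as nn

# A tiny feed-forward network: artificial "neurons" arranged in layers,
# mapping an input vector to an output prediction.
model = nn.Sequential(
    nn.Linear(784, 128),  # input layer -> hidden layer (e.g., a flattened image)
    nn.ReLU(),            # nonlinearity applied to each artificial neuron's output
    nn.Linear(128, 10),   # hidden layer -> scores for 10 hypothetical classes
)

x = torch.randn(1, 784)   # one made-up input example
scores = model(x)         # forward pass from input to output
print(scores.shape)       # torch.Size([1, 10])
```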
There's a lot of ways to talk about this and break this down. 00:10:41.960 |
Like one of them is how much human supervision 00:10:51.860 |
is the neural network knows nothing in the beginning. 00:10:59.500 |
in computer vision, that will be examples of cats, 00:11:06.080 |
and you're given the ground truth of what's in that image. 00:11:09.560 |
And when you get a large database of such image examples 00:11:15.040 |
the neural network is able to learn by example, 00:11:20.480 |
The question, there's a lot of fascinating questions 00:11:22.760 |
within that, which is how do you provide the truth? 00:11:34.920 |
Do you just say the entire image is a picture of a cat? 00:11:41.540 |
You have a very crude box around the cat's face 00:12:01.120 |
very crisp outline around the cat and saying that's a cat. 00:12:04.920 |
That's really difficult to provide that truth. 00:12:10.800 |
is that even a good representation of the truth? 00:12:21.120 |
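To make the supervised-learning picture concrete, here is a minimal training-loop sketch in PyTorch; the random tensors, the two classes standing in for "cat" and "dog", and the simple linear model are all hypothetical placeholders for a real labeled dataset and a real network.

```python
import torch
import torch.nn as nn

# Hypothetical labeled data: images paired with ground truth supplied by humans
# (say, 0 = "cat", 1 = "dog"). Random tensors stand in for real photos.
images = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 2, (64,))

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 2))
loss_fn = nn.CrossEntropyLoss()                  # penalizes disagreement with the labels
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(100):                          # learning by example, one pass at a time
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)        # compare predictions to the ground truth
    loss.backward()                              # work out how to nudge every weight
    optimizer.step()                             # nudge the weights to fit the labels better
```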
is what's used to be called unsupervised learning, 00:12:24.340 |
what's commonly now called self-supervised learning, 00:12:27.140 |
which is trying to get less and less and less 00:12:38.740 |
it's been very successful in the domain of language models, 00:12:57.600 |
and try to learn something generalizable about the ideas 00:13:02.600 |
that are at the core of language or at the core of vision. 00:13:12.900 |
So with this, we have this giant base of knowledge 00:13:16.000 |
on top of which we build more sophisticated knowledge, 00:13:18.720 |
but we have this kind of common sense knowledge. 00:13:21.400 |
And so the idea with self-supervised learning 00:13:30.440 |
that make up a cat and a dog and all those kinds of things 00:13:40.780 |
that's self-supervised run around the internet for a while, 00:13:44.840 |
watch YouTube videos for millions and millions of hours 00:13:47.520 |
and without any supervision be primed and ready 00:13:56.680 |
We think of children in this way, human children, 00:14:00.160 |
is your parents only give one or two examples 00:14:10.080 |
that they would watch millions of hours of YouTube videos 00:14:13.960 |
and then come to a human and be able to understand 00:14:20.800 |
They will understand that a cat is not just a thing 00:14:23.540 |
with pointy ears or a cat is a thing that's orange 00:14:27.520 |
or is furry, they'll see something more fundamental 00:14:30.800 |
that we humans might not actually be able to introspect 00:14:34.400 |
Like if I asked you what makes a cat versus a dog, 00:14:36.800 |
you probably wouldn't be able to answer that. 00:14:39.400 |
But if I showed you, brought to you a cat and a dog, 00:14:48.800 |
That's the whole dream of self-supervised learning 00:14:51.280 |
is it would be able to learn that on its own, 00:14:57.880 |
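A small, hedged sketch of that self-supervised idea: the training signal comes from the data itself rather than from human annotation. Here the toy task, loosely in the spirit of language-model pretraining, is to predict a hidden token from its neighbors; the vocabulary, the random token stream, and the crude averaging model are all invented for illustration.

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 1000, 64
embed = nn.Embedding(vocab_size, embed_dim)
predict = nn.Linear(embed_dim, vocab_size)
optimizer = torch.optim.Adam(list(embed.parameters()) + list(predict.parameters()))
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, vocab_size, (32, 8))     # stand-in for raw, unlabeled text

for step in range(100):
    masked_pos = 4                                  # hide the middle token of each sequence
    target = tokens[:, masked_pos]                  # the "label" comes from the data itself
    context = torch.cat([tokens[:, :masked_pos], tokens[:, masked_pos + 1:]], dim=1)
    context_vec = embed(context).mean(dim=1)        # crude summary of the surrounding tokens
    loss = loss_fn(predict(context_vec), target)    # predict the hidden token from context
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```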
And then there's like a lot of incredible uses 00:15:07.280 |
That's the mechanism behind the reinforcement learning 00:15:11.740 |
successes of the systems that won at Go, AlphaZero, 00:15:21.880 |
- So the idea of self-play, this probably applies 00:15:30.860 |
And this is fascinating in all kinds of domains, 00:15:36.600 |
And the whole idea is it creates a bunch of mutations 00:15:39.540 |
of itself and plays against those versions of itself. 00:15:44.540 |
And the fascinating thing is when you play against systems 00:15:56.100 |
That's true for martial arts, that's true in a lot of cases 00:15:59.240 |
where you want to be interacting with systems 00:16:03.880 |
And then through this process of interacting with systems 00:16:08.120 |
you start following this process where everybody 00:16:10.480 |
starts getting better and better and better and better 00:16:12.620 |
until you are several orders of magnitude better 00:16:15.480 |
than the world champion in chess, for example. 00:16:18.080 |
And it's fascinating 'cause it's like a runaway system. 00:16:21.040 |
One of the most terrifying and exciting things 00:16:23.560 |
that David Silver, the creator of AlphaGo and AlphaZero, 00:16:31.440 |
is they haven't found the ceiling for AlphaZero, 00:16:36.640 |
meaning it could just arbitrarily keep improving. 00:16:39.360 |
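As a schematic sketch of the self-play loop being described (not AlphaZero's actual algorithm, which combines tree search with learned policy and value networks), the core loop is simply: pit the current system against mutated copies of itself, keep whichever version wins, and repeat. The game interface and mutation rule below are made-up placeholders.

```python
import random

def mutate(params):
    # A slightly perturbed copy of the current player: a stand-in for whatever
    # mutation or training update the real system would apply to itself.
    return [w + random.gauss(0.0, 0.1) for w in params]

def play_match(a, b):
    # Placeholder for an actual game engine; returns True if player `a` wins.
    # In a real system this would be full games of chess or Go.
    return sum(a) + random.gauss(0.0, 1.0) > sum(b) + random.gauss(0.0, 1.0)

champion = [0.0] * 10                      # the current best version of the system
for generation in range(1000):
    challenger = mutate(champion)          # a mutation of itself
    wins = sum(play_match(challenger, champion) for _ in range(20))
    if wins > 10:                          # the new version beat the old self
        champion = challenger              # keep it and keep climbing
```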
Now in the realm of chess, that doesn't matter to us 00:16:41.840 |
that it's like it just ran away with the game of chess. 00:16:44.980 |
Like it's like just so much better than humans. 00:16:48.360 |
But the question is if you can create that in the realm 00:16:52.120 |
that does have a bigger, deeper effect on human beings, 00:16:56.060 |
on societies, that can be a terrifying process. 00:17:08.720 |
you make sure that the goals that the AI is optimizing 00:17:13.500 |
is aligned with human beings and human societies. 00:17:17.000 |
There's a lot of fascinating things to talk about 00:17:23.200 |
and all the problems that people are working on. 00:17:29.840 |
where trying to get less and less human supervision 00:17:44.120 |
I have questions, but I'm learning, so please keep going. 00:17:46.480 |
- So that to me, what's exciting is not the theory, 00:17:55.120 |
specifically neural networks and machine learning 00:17:59.140 |
So these are systems that are working in the real world. 00:18:07.200 |
- These are automated vehicles, autonomous vehicles. 00:18:09.480 |
- Semi-autonomous, we've gone through wars on these topics. 00:18:17.040 |
So even though it's called FSD, full self-driving, 00:18:27.740 |
So human is tasked with overseeing the systems. 00:18:30.920 |
In fact, liability wise, the human is always responsible. 00:18:43.000 |
which is a whole nother space of human robot interaction. 00:18:49.960 |
That dance to me is one of the smaller communities, 00:18:54.960 |
but I think it will be one of the most important 00:19:03.160 |
To me, semi-autonomous driving is one of those spaces. 00:19:07.860 |
So for Elon, for example, he doesn't see it that way. 00:19:11.680 |
He sees semi-autonomous driving as a stepping stone 00:19:22.720 |
Like humans and humans dance and robots and robots dance. 00:19:28.060 |
we need to design a perfect robot that solves this problem. 00:19:31.680 |
To me forever, maybe this is not the case with driving, 00:19:34.120 |
but the world is going to be full of problems 00:19:37.140 |
where it's always humans and robots have to interact. 00:19:40.380 |
Because I think robots will always be flawed, 00:19:43.520 |
just like humans are going to be flawed, are flawed. 00:19:47.460 |
And that's what makes life beautiful, that they're flawed. 00:19:55.860 |
So you always have to figure out how can flawed robots 00:20:08.360 |
as opposed to focusing on just building the perfect robot. 00:20:11.280 |
So that's one of the most exciting applications, 00:20:15.360 |
I would say, of artificial intelligence to me, 00:20:17.800 |
is autonomous driving and semi-autonomous driving. 00:20:20.640 |
And that's a really good example of machine learning, 00:20:23.240 |
because those systems are constantly learning. 00:20:26.240 |
And there's a process there that maybe I can comment on. 00:20:30.600 |
Andrej Karpathy, who's the head of Autopilot, 00:20:36.360 |
And this process applies for a lot of machine learning, 00:20:46.820 |
and then it runs into what are called edge cases, 00:20:52.660 |
You know, we do this as kids, that's, you know, you have- 00:21:01.340 |
and this is the fascinating thing about driving, 00:21:03.660 |
is you realize there's millions of edge cases. 00:21:06.120 |
There's just like weird situations that you did not expect. 00:21:20.980 |
where all these cars, hundreds of thousands of cars, 00:21:23.940 |
as you're driving around, and something weird happens. 00:21:33.380 |
that piece of data goes back to the mothership 00:21:37.420 |
for the training, for the retraining of the system. 00:21:42.620 |
it keeps improving and getting better and better 00:21:45.860 |
So basically, you send out pretty clever AI systems 00:22:08.520 |
So you have to, the weird examples come back, 00:22:14.800 |
and you have to label what actually happened in there. 00:22:17.700 |
There's also some mechanisms for automatically labeling, 00:22:22.700 |
but mostly I think you always have to rely on humans 00:22:36.780 |
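Here is a runnable toy version of that deploy, collect edge cases, label, retrain loop. Everything in it, the two-dimensional "scenarios", the confidence threshold, and the oracle standing in for human labelers, is invented purely to make the loop concrete; it is not any company's actual data pipeline.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

def oracle_label(x):                          # stand-in for the human labelers
    return (x[:, 0] * x[:, 1] > 0).long()     # the ground-truth rule to be learned

dataset_x = torch.randn(100, 2)               # the initial training set
dataset_y = oracle_label(dataset_x)

for iteration in range(10):
    # "Deploy": run the current model on a stream of fresh scenarios.
    fleet_x = torch.randn(5000, 2)
    with torch.no_grad():
        probs = model(fleet_x).softmax(dim=1)
    # "Edge cases": keep the scenarios the model is least confident about.
    uncertain = probs.max(dim=1).values < 0.6
    new_x = fleet_x[uncertain][:200]
    new_y = oracle_label(new_x)               # humans label what came back
    dataset_x = torch.cat([dataset_x, new_x])
    dataset_y = torch.cat([dataset_y, new_y])
    # "Retrain" on the grown dataset, then send the better model back out.
    for _ in range(50):
        optimizer.zero_grad()
        loss = loss_fn(model(dataset_x), dataset_y)
        loss.backward()
        optimizer.step()
```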
having very different opinions about what is intelligence. 00:22:45.820 |
And first of all, this is a beautiful description of terms 00:22:48.660 |
that I've heard many times among my colleagues at Stanford, 00:22:56.940 |
but I do want to ask one question about the culture of AI, 00:23:00.220 |
because it does seem to be a community where, 00:23:03.980 |
where it seems like there's very little consensus 00:23:09.620 |
And there seems to be a lot of splitting happening now 00:23:12.340 |
of not just supervised and unsupervised learning, 00:23:22.100 |
Like kids go home from college during the summer 00:23:24.020 |
and get a little, you know, mom still feeds them. 00:23:26.220 |
Then eventually they leave the nest kind of thing. 00:23:28.680 |
Is there something in particular about engineers 00:23:42.180 |
the more specific you get, the less disagreement there is. 00:23:47.860 |
but there's less disagreement about what is machine learning 00:23:50.620 |
and even less when you talk about active learning 00:23:52.680 |
or machine teaching or self-supervised learning. 00:23:56.580 |
And then when you get into like NLP language models 00:24:00.620 |
when you get into specific neural network architectures, 00:24:09.340 |
And that has to do with the fact that engineering, 00:24:12.060 |
especially when you're talking about intelligent systems 00:24:20.580 |
So the art part is the thing that creates disagreements 00:24:28.620 |
about how easy or difficult the particular problem is. 00:24:33.620 |
For example, a lot of people disagree with Elon, 00:24:37.580 |
how difficult the problem of autonomous driving is. 00:24:44.340 |
about what are the limits of these techniques. 00:24:47.160 |
And through that, the terminology also contains within it, 00:24:53.940 |
But overall, I think it's also a young science 00:25:03.960 |
as a large scale discipline where it's thousands, 00:25:06.800 |
tens of thousands, hundreds of thousands of people 00:25:09.420 |
working on it, huge amounts of money being made. 00:25:25.740 |
Neural networks have been around for many, many decades, 00:25:28.040 |
since the '60s, you can argue since the '40s. 00:25:33.440 |
into the word deep learning, term deep learning, 00:25:36.880 |
that was part of the reinvigoration of the field, 00:25:49.060 |
computational neuroscience and theoretical neuroscience, 00:26:00.680 |
excellent theoretical neuroscientists like Larry Abbott 00:26:03.520 |
and other colleagues, certainly at Stanford as well, 00:26:10.340 |
But these terms, neural networks, computational methods, 00:26:13.200 |
I actually didn't know that neural networks 00:26:15.140 |
in deep learning, whether those have now become 00:26:19.380 |
- No, they were always, no, they're always the same thing. 00:26:24.160 |
- I'm a neuroscientist and I didn't know that. 00:26:28.060 |
mean something else in neuroscience, not something else, 00:26:30.180 |
but a little different flavor depending on the field. 00:26:32.620 |
And that's fascinating too, because neuroscience and AI, 00:26:36.700 |
people have started working together and dancing a lot more 00:26:49.800 |
that I find incredibly interesting is this example you gave 00:26:54.400 |
of playing a game with a mutated version of yourself 00:27:00.260 |
- I find that incredibly interesting as a kind of a parallel 00:27:04.260 |
or a mirror for what happens when we try and learn 00:27:16.580 |
of trying to throw bull's eyes on a dartboard. 00:27:23.380 |
And I don't know exactly what I just did, right? 00:27:34.840 |
You learn certain things in that mode as well. 00:27:38.940 |
You're saying that a machine can sort of mutate itself. 00:27:52.700 |
and they can see better than 99.9% of people out there. 00:27:56.420 |
So when you talk about a machine playing a game 00:28:01.320 |
is the mutation always what we call a negative mutation 00:28:11.860 |
and then figure out and they compete against each other. 00:28:15.780 |
the machine gets to evolve itself in real time. 00:28:18.360 |
- Yeah, and I think of it, which would be exciting, 00:28:23.600 |
It's not just, so usually you freeze a version of the system. 00:28:42.480 |
Like you fight to the death and who wins last. 00:28:45.640 |
So I love that idea of creating a bunch of clones of myself 00:28:54.560 |
at like podcasting or science or picking up chicks at a bar 00:29:03.120 |
I mean, a lot of Lexes would have to die for that process, 00:29:11.480 |
it's a graveyard of systems that didn't do that well. 00:29:19.760 |
- Do you think that, I mean, Darwin's theory of evolution 00:29:27.720 |
I mean, you get a bunch of birds with different shaped beaks 00:29:34.880 |
of Darwinian evolution, but I think it's correct 00:29:45.560 |
Lots of different members of species have different traits 00:29:49.440 |
but you could actually create multiple versions of yourself 00:29:59.200 |
but with machine learning or with reinforcement learning 00:30:02.480 |
one of the big requirements is to have an objective function, 00:30:07.800 |
Those are all different terms for the same thing. 00:30:10.200 |
Is there's like an equation that says what's good. 00:30:15.080 |
And then you're trying to optimize that equation. 00:30:20.980 |
- Because it's a game, like with chess, there's a goal. 00:30:29.940 |
In machine learning, it's usually called loss function 00:30:34.340 |
The interesting thing about evolution, it's complicated of course, 00:30:41.700 |
Like it's a, I guess adaptation to the environment 00:30:56.800 |
and this is like what human ingenuity provides, 00:31:00.460 |
is that fitness function of what's good and what's bad, 00:31:04.420 |
which it lets you know which of the systems is going to win. 00:31:12.840 |
is we figure out objective functions for ourselves. 00:31:34.520 |
So like there's a lot of interesting explorations 00:31:37.580 |
about that function being more about curiosity, 00:31:42.400 |
about learning new things and all that kind of stuff, 00:31:46.680 |
If you want a machine to be able to be good at stuff, 00:31:56.080 |
That's one of the challenges of artificial intelligence 00:32:01.600 |
in order to solve a problem, you have to formalize it 00:32:12.600 |
And you have to also be clear about the objective function. 00:32:15.860 |
What is the goal that you're trying to reach? 00:32:23.940 |
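"An equation that says what's good" can sound abstract, so here is a minimal, hypothetical example of writing an objective function down explicitly and optimizing it by gradient descent; the function itself is arbitrary and chosen only to show the mechanic.

```python
import torch

def objective(x):
    # The "equation that says what's good": lower values are better here.
    return (x - 3.0) ** 2 + 1.0      # an arbitrary made-up objective

x = torch.tensor(0.0, requires_grad=True)
optimizer = torch.optim.SGD([x], lr=0.1)

for step in range(100):
    optimizer.zero_grad()
    loss = objective(x)              # measure how good the current x is
    loss.backward()                  # compute which direction improves it
    optimizer.step()                 # move x in that direction

print(x.item())                      # ends up near 3.0, the optimum of the objective
```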
I'm sure this definition falls short in many ways, 00:32:39.320 |
but there's not really an emotional attachment. 00:32:45.520 |
without hoping it's a gold coin under a rock. 00:32:54.000 |
it sounds like there are elements of reward prediction 00:32:58.560 |
The machine has to know when it's done the right thing. 00:33:05.280 |
or are the sorts of machines that you are describing 00:33:09.360 |
- Yeah, curiosity is a kind of a symptom, not the goal. 00:33:32.440 |
that seem like they're not going to lead anywhere. 00:33:47.220 |
the exploration can look like curiosity by us humans, 00:33:52.220 |
but it's really just trying to get out of the local optimal, 00:34:00.180 |
it's always looking to optimize the objective function. 00:34:04.380 |
It derives, and we could talk about this a lot more, 00:34:08.160 |
but in terms of the tools of machine learning today, 00:34:11.520 |
it derives no pleasure from just the curiosity of like, 00:34:26.880 |
- That said, if you look at machine learning literature 00:34:35.740 |
We're constantly trying to use the human brain 00:34:38.820 |
to inspire totally new solutions to these problems. 00:34:42.720 |
how does dopamine function in the human brain? 00:34:44.940 |
And how can that lead to more interesting ways 00:35:04.500 |
- Yeah, we don't actually know what we're looking for 00:35:10.680 |
who you've spoken with, at least on Instagram, 00:35:15.080 |
- Yeah, I hope you actually have her on this podcast. 00:35:25.920 |
basically there's a very dumb objective function 00:35:36.460 |
And then what we humans do with our fancy consciousness 00:35:39.540 |
and cognitive abilities is we tell stories to ourselves 00:35:58.940 |
as not just systems that solve problems or optimize a goal, 00:36:11.660 |
Like when you're alone, that's when you solve problems. 00:36:16.660 |
That's when it makes sense to talk about solving problems. 00:36:29.980 |
that's like, that's being a charismatic storyteller. 00:36:33.280 |
And I think both humans are very good at this. 00:36:35.860 |
Arguably, I would argue that's why we are who we are 00:36:44.780 |
So it's not just about being able to solve problems 00:36:49.020 |
It's afterwards be able to tell like a way better, 00:36:53.360 |
about why you did something or why you failed. 00:36:55.760 |
- So you think that robots and/or machines of some sort 00:37:03.440 |
So the technical field for that is called explainable AI, 00:37:09.320 |
is trying to figure out how you get the AI system 00:37:14.160 |
to explain to us humans why the hell it failed 00:37:18.860 |
or there's a lot of different sort of versions of this, 00:37:22.320 |
or to visualize how it understands the world. 00:37:28.100 |
especially with neural networks that are famously opaque, 00:37:36.320 |
why a particular neural network does what it does so well. 00:37:40.480 |
And to try to figure out where it's going to fail, 00:37:52.360 |
especially from government funding and so on, 00:37:54.560 |
because if you want to deploy AI systems in the real world, 00:37:59.460 |
we humans at least, want to ask it a question, 00:38:04.260 |
Like in a dark way, why did you just kill that person, right? 00:38:13.000 |
And now again, we're sometimes very unfair to AI systems 00:38:17.880 |
because we humans can often not explain why very well, 00:38:28.160 |
because the more and more we rely on AI systems, 00:38:31.500 |
like the Twitter recommender system, that AI algorithm, 00:38:39.140 |
perhaps starting wars or at least military conflict. 00:38:52.900 |
And can you explain the possible other trajectories? 00:38:55.840 |
Like we would have that kind of conversation with a human. 00:39:02.020 |
I think it would be nice to talk to AI systems 00:39:05.420 |
for stupid stuff, like robots, when they fail to- 00:39:29.860 |
like what was the under actuated system problem here? 00:39:32.740 |
Like what was the texture of the floor or so on? 00:39:46.940 |
Like storytelling isn't just explanation of what happened. 00:39:58.980 |
in a way that poetry makes people understand things 00:40:01.980 |
as opposed to a rigorous log of where every sensor was, 00:40:11.460 |
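Explainable AI spans many techniques; as one small, hedged example (not necessarily the approach being described here), a gradient-based saliency map asks which input pixels most influenced a network's decision. The untrained toy classifier below is a placeholder for a real trained model.

```python
import torch
import torch.nn as nn

# A hypothetical image classifier; in practice this would be a trained network.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))

image = torch.randn(1, 3, 32, 32, requires_grad=True)
scores = model(image)
top_class = scores.argmax(dim=1).item()

# Ask: how much does each pixel influence the score of the predicted class?
scores[0, top_class].backward()
saliency = image.grad.abs().max(dim=1).values   # one importance value per pixel

print(saliency.shape)   # torch.Size([1, 32, 32]) -- a crude "why" map over the image
```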
because one of the hallmarks of severe autism spectrum 00:40:15.660 |
disorders is a report of experience from the autistic person 00:40:26.380 |
And they'll say, well, I got up and I did this 00:40:29.080 |
And it's not at all the way that a person with, 00:40:32.180 |
who doesn't have autism spectrum disorder would respond. 00:40:35.700 |
And the way you describe these machines has so much human, 00:40:45.420 |
But I realized that we were talking about machines. 00:40:51.660 |
if there's a distinction between a machine that learns, 00:40:56.660 |
a machine with artificial intelligence and a robot. 00:41:08.660 |
But if my ballpoint pen can come to me when it's on, 00:41:13.180 |
when I moved to the opposite side of the table, 00:41:20.660 |
- Okay, there's a million ways to explore this question. 00:41:25.060 |
So first of all, there's a question of what is life? 00:41:28.120 |
Like how do you know something is a living form and not? 00:41:32.540 |
And it's similar to the question of when does sort of a, 00:42:07.880 |
So robots are the things that have a perception system 00:42:18.700 |
So that's the difference between maybe an AI system 00:42:24.280 |
a robot is something that's able to perceive the world 00:42:32.660 |
- Yeah, and I think it could also be in the digital space, 00:42:52.600 |
according to the definition that I think you're providing, 00:42:55.400 |
where the robot, where I go to sleep at night 00:42:58.240 |
and this robot goes and forages for information 00:43:07.840 |
but it essentially has movement in cyberspace. 00:43:11.140 |
- Yeah, there's a distinction that I think is important 00:43:16.140 |
in that there's an element of it being an entity, 00:43:23.400 |
whether it's in the digital or the physical space. 00:43:26.620 |
So when you have something like Alexa in your home, 00:43:36.720 |
is constantly being sent back to the mothership. 00:43:49.460 |
when it's simply a finger of the main mothership, 00:44:04.800 |
So that's important because there's some element 00:44:12.560 |
whether in the digital or the physical space. 00:44:54.620 |
that does something that you haven't seen before 00:44:58.980 |
like, huh, those moments happen with AlphaZero 00:45:05.540 |
or grandmasters were really surprised by a move. 00:45:17.300 |
I find that moment of surprise really powerful, 00:45:23.300 |
- Because it supersedes the human brain in that moment? 00:45:57.820 |
a being that's struggling just like you are in this world. 00:46:04.420 |
that I think you're not gonna find many people 00:46:07.540 |
in the AI community that talk like I just did. 00:46:09.900 |
I'm not speaking like some philosopher or some hippie. 00:46:14.340 |
I'm speaking from purely engineering perspective. 00:46:16.780 |
I think it's really important for robots to become entities 00:46:20.500 |
and explore that as a real engineering problem, 00:46:29.540 |
They don't give them, try to avoid giving them names. 00:46:32.220 |
They really want to see it like a system, like a servant. 00:46:36.700 |
They see it as a servant that's trying to accomplish a task. 00:46:40.260 |
To me, I don't think I'm just romanticizing the notion. 00:47:08.400 |
which is the act of looking at an inanimate object 00:47:15.460 |
I think robotics generally sees that as a negative. 00:47:26.420 |
- Well, I'm struck by how that really grabs on 00:47:29.560 |
to the relationship between human and machine 00:47:37.400 |
and I think you've already told us the answer, 00:47:38.880 |
but does interacting with a robot change you? 00:47:43.040 |
Does it, in other words, do we develop relationships 00:47:50.240 |
I think the moment you see a robot or AI systems 00:48:01.240 |
Just like good friends do, just like relationships, 00:48:06.760 |
I think for that, you have to have certain aspects 00:48:30.720 |
I mean, I have a lot of thoughts about this, as you may know, 00:48:35.200 |
and that's at the core of my lifelong dream, actually, 00:48:44.840 |
a notion of loneliness in them that we haven't discovered, 00:48:53.120 |
And I see AI systems as helping us explore that 00:49:02.280 |
So I think that connection between human and AI, 00:49:14.520 |
in ways that are like several orders of magnitude, 00:49:37.740 |
I don't always break them down into variables, 00:49:41.140 |
but we could explore a few of those variables 00:49:43.400 |
and see how they map to human-robot relationships. 00:49:49.160 |
If you spend zero time with another person at all 00:49:55.380 |
you essentially have no relationship to them. 00:49:58.120 |
If you spend a lot of time, you have a relationship. 00:49:59.600 |
This is obvious, but I guess one variable would be time, 00:50:01.800 |
how much time you spend with the other entity, 00:50:12.020 |
I'll give a absolutely trivial example of this in a moment, 00:50:21.520 |
whether or not you struggle between one another, 00:50:23.560 |
you disagree, like I was really struck by the fact 00:50:27.040 |
I've never thought about a robot saying no to me, 00:50:31.540 |
- I look forward to you being one of the first people 00:50:37.640 |
You know, when you struggle with somebody, you grow closer. 00:50:43.600 |
between those two people, so-called trauma bonding. 00:50:45.900 |
They call it in the whole psychology literature 00:50:52.200 |
so time, successes together, struggle together, 00:50:57.200 |
and then just peaceful time, hanging out at home, 00:51:04.740 |
Here, we're breaking down the kind of elements 00:51:46.500 |
Are we thinking about a human-appearing robot? 00:51:50.220 |
I know you and I have both had intense relationships 00:51:53.600 |
to our, we have separate dogs, obviously, but to animals. 00:51:57.240 |
This sounds a lot like human-animal interaction. 00:51:59.120 |
So what is the ideal human-robot relationship? 00:52:30.980 |
What's currently most machine learning systems 00:52:53.880 |
They're not able to keep track of those together with you. 00:52:57.380 |
- 'Cause they can't move with you and be with you. 00:52:59.200 |
- No, no, like literally we don't have the techniques 00:53:03.440 |
the actual learning of containing those moments. 00:53:07.180 |
Current machine learning systems are really focused 00:53:09.960 |
on understanding the world in the following way. 00:53:14.340 |
like looking around, understand like what's in the scene, 00:53:27.500 |
But the fact that we shared this moment of talking today 00:53:30.940 |
and still remember that for next time you're, 00:53:44.140 |
as I think it's a very, very important component 00:53:47.080 |
of what it means to create a deep relationship, 00:53:52.060 |
- Could you post a photo of you and the robot, 00:53:54.060 |
like selfie with robot and then the robot sees that image 00:54:03.620 |
and create some sort of metric of emotional depth 00:54:11.660 |
So could it text you in the middle of the night and say, 00:54:21.420 |
But I think that time element, forget everything else, 00:54:24.820 |
just sharing moments together, that changes everything. 00:54:30.940 |
There's specific things that are more in terms of systems 00:54:35.360 |
It's more technical and probably a little bit offline 00:54:50.620 |
forget all the other things we're talking about, 00:55:00.260 |
We don't currently have systems that share moments together. 00:55:13.300 |
that was a secret moment you had with your refrigerator. 00:55:37.820 |
I think you're gonna be very attached to the refrigerator. 00:55:42.780 |
You're gonna go through some hell with that refrigerator. 00:55:59.200 |
So a smart refrigerator, I believe would change society, 00:56:06.180 |
but these ideas in the systems all around us. 00:56:14.220 |
And then there's a bunch of elements of actual interaction 00:56:17.860 |
of allowing you as a human to feel like you're being heard, 00:56:33.020 |
but we're still, there's still an element of selfishness. 00:56:36.940 |
There's still an element of not really being able 00:56:40.740 |
And a lot of the times when you're going through trauma 00:56:44.440 |
together through difficult times and through successes, 00:56:51.220 |
But I think that can be done more aggressively, 00:56:59.280 |
I think I've never actually been to a therapist, 00:57:07.980 |
And if they do, the therapist don't live to tell the story. 00:57:12.100 |
I do believe in talk there, which friendship is to me, 00:57:17.700 |
is talk therapy or like, you don't necessarily need to talk. 00:57:22.700 |
It's like just connecting in the space of ideas 00:57:33.580 |
the right questions and truly hear another human. 00:57:37.320 |
This is what we try to do with podcasting, right? 00:57:45.980 |
the collection of moments that make up the day, 00:57:58.780 |
That's time, we've been through a bunch of shit together. 00:58:06.220 |
but just the fact that we've been through that 00:58:09.320 |
And those moments somehow create a depth of connection 00:58:12.820 |
like nothing else, like you and your refrigerator. 00:58:21.720 |
but when she passed away, somebody said at her memorial, 00:58:25.780 |
all these amazing things she had done, et cetera. 00:58:29.400 |
And then her kids got up there and she had young children 00:58:37.900 |
I can feel like your heart gets heavy thinking about this. 00:58:39.880 |
They're going to grow up without their mother. 00:58:42.660 |
Very, very strong young girls and now young women. 00:58:51.720 |
about their mother, who was an amazing person, 00:58:54.960 |
is all the unstructured time they spent together. 00:59:00.960 |
It wasn't, oh, she woke up at five in the morning 00:59:13.280 |
that's what they remembered was the biggest give 00:59:22.000 |
to a refrigerator is so, I want to say human-like, 00:59:34.400 |
might actually be a lower form of relationship. 00:59:39.000 |
There may be relationships that are far better 00:59:42.360 |
than the sorts of relationships that we can conceive 00:59:46.820 |
based on what these machine relationship interactions 00:59:55.720 |
as somehow incapable of teaching us something 01:00:01.580 |
I don't think humans have a monopoly on that. 01:00:06.920 |
and we need to have the kind of prompting from a machine. 01:00:11.920 |
And definitely part of that is just remembering the moments. 01:00:27.520 |
That's like calling this podcast unstructured time. 01:00:30.520 |
- Maybe what they meant was it wasn't a big outing. 01:00:36.160 |
but a goal was created through the lack of a goal. 01:00:42.520 |
and you end up playing thumb war for an hour there. 01:00:45.080 |
So the structure emerges from lack of structure. 01:00:56.440 |
And I think that those could be optimized for. 01:01:00.280 |
I think we think of like a big outing as, I don't know, 01:01:03.800 |
or some big, the Grand Canyon, or go into some, 01:01:15.680 |
I think there's possible to optimize a lot of those things. 01:01:18.320 |
And perhaps like podcasting is helping people discover 01:01:21.080 |
that like maybe the thing we want to optimize for 01:01:24.200 |
isn't necessarily like some sexy, like quick clips. 01:01:29.200 |
Maybe what we want is long form authenticity. 01:01:45.920 |
or long periods of communication over a series of moments, 01:01:56.160 |
to the big ones, the big successes, the big failures, 01:02:08.240 |
that from like a very specific engineering perspective 01:02:24.800 |
but one of it is guts is I'll build a startup around it. 01:02:37.280 |
always as little hints dropped here and there. 01:02:41.360 |
there's never been an offline conversation about this dream. 01:02:43.720 |
I'm not privy to anything except what Lex says now. 01:03:03.840 |
when we've sat down together and talked on the phone? 01:03:06.920 |
Maybe it's this company, maybe it's something distinct. 01:03:13.320 |
- Sure, so the way people express long-term vision, 01:03:22.660 |
who can very crisply say exactly what the goal is. 01:03:27.540 |
Also has to do with the fact the problems he's solving 01:03:31.320 |
So my long-term vision is a little bit more difficult 01:03:36.400 |
to express in words, I've noticed, as I've tried. 01:03:48.880 |
Early on in life and also in the recent years, 01:04:07.620 |
I realized there's magic there that nobody else is seeing. 01:04:12.280 |
The Spot is the four-legged robot from Boston Dynamics. 01:04:17.280 |
Some people might've seen it, it's this yellow dog. 01:04:25.320 |
you just notice something that just grabs you. 01:04:43.160 |
Woz didn't think about it, the personal computer this way, 01:04:50.960 |
as a sheet of paper and everybody should have one. 01:04:53.640 |
I mean, this idea, I think it is heartbreaking 01:04:58.640 |
that the world is being filled up with machines 01:05:05.000 |
And I think every one of them can have that same magic. 01:05:14.800 |
in terms of a startup is that magic can be engineered 01:05:28.840 |
in every single computing system in the world. 01:05:32.360 |
So the way that Windows operating system for a long time 01:05:36.480 |
was the primary operating system everybody interacted with. 01:05:41.320 |
I think this is something that should be as a layer, 01:05:45.800 |
it's almost as an operating system in every device 01:05:51.000 |
Now what that actually looks like the actual dream 01:05:56.520 |
it didn't have this concrete form of a business. 01:06:01.120 |
It had more of a dream of exploring your own loneliness 01:06:11.840 |
This deep connection between humans and robots 01:06:22.440 |
and not a robot that washes the dishes or a sex robot, 01:06:27.440 |
or I don't know, I think of any kind of activity 01:06:37.680 |
But a dog that's able to speak your language too. 01:06:46.800 |
and almost like smiling with its soul in that kind of way, 01:06:51.120 |
but also to actually understand what the hell, 01:06:54.560 |
like why are you so excited about the successes? 01:06:56.720 |
Like understand the details, understand the traumas. 01:06:59.880 |
And I just think that has always filled me with excitement 01:07:12.320 |
More recently, I've been more and more heartbroken 01:07:28.600 |
And I thought this kind of mechanism is exactly applicable 01:07:54.920 |
you're able to optimize for long-term growth, 01:08:03.720 |
And there's a lot of other things to say, which is the, 01:08:08.460 |
in order for AI systems to learn everything about you, 01:08:35.480 |
I think each individual should own all of their data 01:08:43.600 |
humans can disappear and delete all of their data 01:08:49.380 |
which is actually better than we humans can do. 01:08:54.220 |
Once we load the data into each other, it's there. 01:09:02.880 |
in order to establish trust that they can trust you. 01:09:06.200 |
And the second part of trust is transparency. 01:09:08.480 |
Whenever the data is used to make it very clear 01:09:16.120 |
but clear in a way that people really understand 01:09:24.160 |
and know how the data is being used, I think they'll stay. 01:09:36.400 |
I think a happy marriage requires the ability 01:09:39.680 |
to divorce easily without the divorce industrial complex 01:09:48.040 |
There's so much money to be made from lawyers and divorce. 01:09:50.720 |
But yeah, the ability to leave is what enables love, I think. 01:10:17.480 |
And I haven't, you know, there's so much fear 01:10:21.040 |
about artificial intelligence systems and about robots 01:10:26.400 |
to express that vision because the vision I have is one, 01:10:33.480 |
of like technology is going to save everybody. 01:10:40.320 |
of ways AI systems can help humans explore themselves. 01:10:54.400 |
to say that everything's going to turn out all right. 01:11:12.400 |
One was this hypothetical robot family member 01:11:17.240 |
or some other form of robot that would allow people 01:11:27.040 |
and that you would like the world to be able to have. 01:11:33.480 |
And the other is social media or social network platforms 01:11:38.160 |
that really serve individuals and their best selves 01:11:48.480 |
It's difficult to kind of explain without going through 01:12:00.680 |
It's not like Amazon Alexa that's centralized. 01:12:11.320 |
that helps you find things that will make you feel good, 01:12:16.200 |
that will also challenge your thinking to make you grow, 01:12:19.880 |
but not let you get lost in the negative spiral 01:12:29.440 |
or most just get you to be not open to learning. 01:12:38.780 |
And I believe that that is not only good for human beings, 01:12:47.360 |
I think long-term you can make a lot of money 01:12:50.480 |
by challenging this idea that the only way to make money 01:12:57.360 |
And one of the things that people disagree with me on 01:13:02.400 |
Like maximizing engagement is always going to win. 01:13:06.280 |
I think people have woken up now to understanding 01:13:16.220 |
that they don't always feel good at the end of the week. 01:13:19.300 |
- I would love feedback from whatever this creature, 01:13:24.000 |
whatever, I can't, I don't know what to call it, 01:13:31.940 |
because it realized through some really smart data 01:13:40.600 |
but it might also put things and people in front of me 01:13:45.500 |
Is that the kind of sliver of what this looks like? 01:13:48.640 |
- The whole point, because of the interaction, 01:14:15.440 |
like the idea that you should censor, that is ridiculous. 01:14:21.020 |
and you're becoming the best version of yourself, 01:14:29.320 |
but not because the centralized committee decided, 01:14:40.800 |
And then, which actually YouTube doesn't do that well, 01:14:47.160 |
It's nice to get some really powerful communicators 01:14:57.000 |
and potentially at least long-term to expand your horizons. 01:15:08.100 |
and you're being a good father or son or daughter 01:15:11.680 |
and you're being the best version of yourself 01:15:14.360 |
and you're happy with yourself, I think the earth is flat. 01:15:21.360 |
and I'm just using that kind of silly, ridiculous example 01:15:24.220 |
because I don't like the idea of centralized forces 01:15:33.240 |
But I also don't like this idea of not censoring anything 01:15:39.660 |
because that's always the biggest problem with that 01:15:49.480 |
And it's good to have a companion that reminds you 01:16:00.120 |
there's a guide or a companion that flies out 01:16:03.540 |
or forages a little bit further or a little bit differently 01:16:08.420 |
or at least tries to steer us in the right direction. 01:16:17.800 |
I should mention there's a bunch of difficulties here. 01:16:20.820 |
You see me up and down a little bit recently. 01:16:23.760 |
So there's technically a lot of challenges here. 01:16:30.480 |
and the reason I'm talking about it on a podcast comfortably 01:16:34.360 |
as opposed to working in secret is it's really hard 01:16:41.000 |
And that's something you have to constantly struggle with 01:16:44.140 |
in terms of like entrepreneurially as a startup. 01:16:46.980 |
Like I've also mentioned to you maybe offline, 01:16:56.600 |
So it's a difficult decision to make how much of your time 01:17:03.400 |
do you want to go all in here and give everything to this? 01:17:12.540 |
on some of these problems, both with the robotics 01:17:16.660 |
and the technical side in terms of the machine learning 01:17:20.660 |
system that I'm describing, it's lonely, it's really lonely 01:17:25.660 |
because both on a personal level and a technical level. 01:17:31.420 |
So on a technical level, I'm surrounded by people 01:17:46.800 |
Like the people that, the colleagues of mine, 01:17:49.540 |
they know how difficult lifelong learning is. 01:17:52.300 |
They also know how difficult it is to build a system 01:17:54.740 |
like this, to build a competitive social network. 01:18:10.260 |
And you start to doubt whether given that you don't have 01:18:14.280 |
a track record of success, like that's a big one. 01:18:17.980 |
When you look in the mirror, especially when you're young, 01:18:22.660 |
you look in the mirror and you have these big dreams. 01:18:32.480 |
Like, how do you know you're going to be able 01:18:36.540 |
- You sort of don't, but you're kind of pulling on a string 01:18:45.160 |
I think I pride myself in knowing what I'm good at 01:18:57.540 |
all the things I suck at, which is basically everything. 01:19:01.280 |
So like whenever I notice like, wait a minute, 01:19:04.520 |
I'm kind of good at this, which is very rare for me. 01:19:08.180 |
I think like that might be a ball of yarn worth pulling at. 01:19:11.500 |
And the thing with the, in terms of engineering systems 01:19:18.520 |
And it's 'cause we're talking about podcasting and so on. 01:19:27.180 |
I think maybe it is compelling for people to watch 01:19:31.460 |
a kind-hearted idiot struggle with this form. 01:19:37.180 |
But in terms of like actual being a good engineer 01:19:40.580 |
of human robot interaction systems, I think I'm good. 01:19:47.980 |
and then the world keeps telling you you're not. 01:19:57.180 |
but then that's where the Goggins thing comes in 01:19:59.780 |
is like, aside from the stay hard motherfucker, 01:20:11.180 |
that you're going to be alone in the success, right? 01:20:20.340 |
that I thought was among his many brilliant posts 01:20:26.120 |
you know, he talked about this myth of the light 01:20:33.460 |
was a concept that eventually your eyes adapt to the dark. 01:20:38.460 |
That the tunnel, it's not about a light at the end, 01:20:40.320 |
that it's really about adapting to the dark of the tunnel. 01:20:48.880 |
knowing you both a bit, you know, share a lot in common. 01:20:52.500 |
But in this loneliness and the pursuit of this dream, 01:20:57.500 |
it seems to me it has a certain component to it 01:21:06.700 |
could serve as a driver to build the companion 01:21:21.900 |
but I also love humans, friendship and romantic, you know, 01:21:35.140 |
- Well, I got so much shit from Rogan about like, 01:21:38.340 |
what is it, the tango scene from "Scent of a Woman." 01:21:52.240 |
And you know what he said when I went on your podcast 01:21:54.780 |
for the first time, he said, "He dresses well." 01:21:58.340 |
Because in Argentina, the men go to a wedding 01:22:01.460 |
You know, in the US, by halfway through the night, 01:22:03.660 |
10 minutes in the night, all the jackets are off. 01:22:05.780 |
It looks like everyone's undressing for the party 01:22:08.940 |
And he said, "You know, I like the way he dresses." 01:22:11.620 |
And then when I started, he was talking about you. 01:22:15.740 |
"Why don't you wear a real suit like your friend Lex?" 01:22:22.140 |
But let's talk about this pursuit just a bit more, 01:22:27.820 |
because I think what you're talking about is building a, 01:22:38.640 |
I think within people, there is like caverns of thoughts 01:23:01.340 |
Maybe you could comment a little bit about it, 01:23:02.620 |
because just as often as you talk about love, 01:23:06.100 |
but it seems that you talk about this loneliness. 01:23:27.220 |
- Oh, in terms of business and in terms of systems? 01:23:39.860 |
So, one is there's a startup where there's an idea 01:23:55.060 |
So, I'm very fascinated with the legged robots 01:24:03.140 |
but n-of-one experiments to see if I can recreate the magic. 01:24:09.620 |
And that's been, I have a lot of really good already, 01:24:23.560 |
which also have been part of my sadness recently, 01:24:27.080 |
is that I also have to work with robotics companies. 01:24:34.620 |
and love and appreciation towards Boston Dynamics, 01:24:45.740 |
And they're like looking at this silly Russian kid 01:24:49.340 |
It's like, what's he trying to do with all this love 01:24:58.480 |
it's like when you break up with a girlfriend or something. 01:25:01.180 |
Right now, we decided to part ways on this particular thing. 01:25:04.060 |
They're huge supporters of mine, they're huge fans, 01:25:17.180 |
is keep the robot as far away from humans as possible. 01:25:23.800 |
where it's doing monitoring in dangerous environments. 01:25:34.560 |
even in those applications, exceptionally useful 01:25:37.120 |
for the robot to be able to perceive humans, like see humans 01:25:45.520 |
localize where those humans are and have human intention. 01:25:53.860 |
what the hell the human is doing, like where it's walking. 01:25:59.200 |
They're not, just because you're walking this way one moment 01:26:01.760 |
doesn't mean you'll keep walking that direction. 01:26:05.400 |
especially with the head movement and the eye movement. 01:26:07.720 |
And so I thought that's super interesting to explore, 01:26:11.680 |
So I'll be working with a few other robotics companies 01:26:14.500 |
that are much more open to that kind of stuff. 01:26:20.680 |
And hopefully Boston Dynamics, my first love, 01:26:23.440 |
that getting back with an ex-girlfriend will come around. 01:26:26.000 |
But so the algorithmically, I'm basically done there. 01:26:33.360 |
The rest is actually getting some of these companies 01:26:37.540 |
And then there's, for people who'd work with robots 01:26:41.300 |
know that one thing is to write software that works 01:26:44.900 |
and the other is to have a real machine that actually works. 01:26:48.440 |
And it breaks down in all kinds of different ways 01:26:53.940 |
But that's almost, it may sound a little bit confusing 01:27:01.300 |
because the previous discussion was more about the big dream, 01:27:16.620 |
I really try to do now everything that just brings me joy. 01:27:24.180 |
But two, given my little bit growing platform, 01:27:28.440 |
I wanna use the opportunity to educate people. 01:27:34.100 |
And if I think they're cool, I'll be able to, 01:27:36.620 |
I hope be able to communicate why they're cool to others. 01:27:43.980 |
There's a couple of publications with MIT folks around that. 01:27:47.020 |
But the other is just to make some cool videos 01:27:49.780 |
and explain to people how they actually work. 01:27:52.200 |
And as opposed to people being scared of robots, 01:28:09.780 |
I want to inspire people with that, but that's less, 01:28:13.860 |
it's interesting because I think the big impact 01:28:18.140 |
in terms of the dream does not have to do with embodied AI. 01:28:24.020 |
I think the refrigerator is enough that for an AI system, 01:28:36.520 |
- By embodiment you mean the physical structure. 01:28:48.520 |
a little humanoid robot, maybe I'll keep them on the table. 01:28:51.320 |
It's like walks around or even just like a mobile platform 01:28:54.720 |
that can just like turn around and look at you. 01:29:02.060 |
It's like that butter robot that asks, what is my purpose? 01:29:15.640 |
There's something about a physical entity that moves around, 01:29:19.660 |
that's able to look at you and interact with you 01:29:22.620 |
that makes you wonder what it means to be human. 01:29:27.500 |
if that thing looks like it has consciousness, 01:29:36.500 |
It's humbling for us humans, but that's less about research. 01:29:45.960 |
and challenging others to think about what makes them human. 01:29:58.880 |
I actually find myself starting to crave that 01:30:01.340 |
because we all have those elements from childhood 01:30:04.920 |
or from adulthood where we experience something, 01:30:12.560 |
I think a lot of people are scared of robots. 01:30:21.840 |
actually was pretty good at picking up Costello's hair 01:30:30.440 |
and it would get caught on a wire or something, 01:30:33.200 |
I would find myself getting upset with the Roomba. 01:30:35.280 |
In that moment, I'm like, what are you doing? 01:30:44.760 |
But what you're describing has so much more richness 01:31:14.740 |
This is the place we're currently in in Austin 01:31:20.280 |
But I basically got it to make sure I have room for robots. 01:31:32.100 |
- Oh no, I do different experiments with them. 01:31:36.040 |
So one of the things I wanna mention is this is, 01:31:41.780 |
is I got them to scream in pain and moan in pain 01:31:52.440 |
And I did that experiment to see how I would feel. 01:32:00.360 |
- Did any Roomba rights activists come after you? 01:32:23.200 |
because ultimately I felt like they were human 01:32:37.180 |
- I have to connect you to my friend, Eddie Chang. 01:32:40.480 |
He's a neurosurgeon and we're lifelong friends. 01:32:46.360 |
but he describes some of these more primitive, 01:32:58.940 |
as such powerful rudders for the emotions of other people. 01:33:13.680 |
that are able to scream in pain in my Boston place. 01:33:35.800 |
it seems like there's got to be a dark person 01:33:40.960 |
and Roomba's screaming and they're like, yep, yep, 01:33:44.560 |
- What about like shouts of glee and delight? 01:33:50.360 |
I don't, how to, to me delight is quiet, right? 01:34:01.000 |
But I don't, I mean, unless you're talking about like, 01:34:04.760 |
I don't know how you would have sexual relations 01:34:07.320 |
- Well, I wasn't necessarily saying sexual delight, but- 01:34:17.720 |
'Cause you mentioned you had a negative relationship 01:34:21.920 |
- Well, I'd find that mostly I took it for granted. 01:34:27.560 |
And then when it would do something I didn't like, 01:34:31.760 |
It was taken for granted and I would get upset 01:34:36.060 |
And I just like, you're, you're in the, in the corner. 01:34:43.400 |
it being quite dumb as a almost cute, you know, 01:34:51.120 |
And I think that's a artificial intelligence problem. 01:34:54.000 |
I think flaws are, should be a feature, not a bug. 01:34:59.280 |
So along the lines of this, the different sorts 01:35:01.860 |
of relationships that one could have with robots 01:35:03.800 |
and the fear, but also some of the positive relationships 01:35:06.900 |
that one could have, there's so much dimensionality, 01:35:12.400 |
But power dynamics in relationships are very interesting 01:35:16.300 |
because the obvious ones that the unsophisticated view 01:35:20.640 |
of this is, you know, one, there's a master and a servant, 01:35:30.180 |
You know, children do this with parents, puppies do this. 01:35:37.560 |
Kids, coo, and parents always think that they're, you know, 01:35:46.520 |
studies show that those coos are ways to extract 01:35:59.720 |
that I hear a lot about that I think most people 01:36:04.520 |
and they become the masters and we become the servants. 01:36:08.140 |
But there could be another version that, you know, 01:36:12.600 |
in certain communities that I'm certainly not a part of, 01:36:20.780 |
into doing things, but you are under the belief 01:36:24.180 |
that you are in charge, but actually they're in charge. 01:36:29.120 |
And so I think that's one that if we could explore that 01:36:33.920 |
for a second, you could imagine it wouldn't necessarily 01:36:36.480 |
be bad, although it could lead to bad things. 01:36:39.100 |
The reason I want to explore this is I think people 01:36:41.800 |
always default to the extreme, like the robots take over 01:36:45.560 |
and we're in little jail cells and they're out having fun 01:36:50.600 |
What sorts of manipulation can a robot potentially carry out, 01:36:56.500 |
- Yeah, so there's a lot of good and bad manipulation 01:37:00.940 |
Just like you said, to me, especially like you said, 01:37:12.340 |
- I think someone from MIT told me that term. 01:37:18.460 |
- So first of all, there's power dynamics in bed 01:37:22.040 |
and power dynamics in relationships and power dynamics 01:37:28.280 |
I think power dynamics can make human relationships, 01:37:34.280 |
especially romantic relationships, fascinating and rich 01:37:39.740 |
and fulfilling and exciting and all those kinds of things. 01:37:50.820 |
I really love the idea that a robot would be a top 01:37:57.500 |
And I think everybody should be aware of that. 01:37:59.980 |
And the manipulation is not so much manipulation, 01:38:02.460 |
but a dance of like pulling away, a push and pull 01:38:08.340 |
In terms of control, I think we're very, very, very far away 01:38:16.340 |
They, to lock us up in, like to have so much control 01:38:26.920 |
I think there's a, in terms of dangers of AI systems, 01:38:31.260 |
with autonomous weapon systems and all those kinds of things. 01:38:34.240 |
So the power dynamics as exercised in the struggle 01:38:37.900 |
between nations and war and all those kinds of things. 01:38:42.740 |
I think power dynamics are a beautiful thing. 01:38:45.900 |
Now there is, of course, going to be all those kinds 01:38:53.300 |
- Well, here we're talking, I always say, you know, 01:38:59.180 |
it's always, always should be consensual, age appropriate, 01:39:06.060 |
But now we're talking about human robot interactions. 01:39:10.780 |
- No, I actually was trying to make a different point, 01:39:13.620 |
which is I do believe that robots will have rights 01:39:17.860 |
And I think in order for us to have deep meaningful 01:39:23.060 |
we would have to consider them as entities in themselves 01:39:31.660 |
that I think people are starting to talk about 01:39:36.340 |
for us to understand how entities that are other than human. 01:39:39.700 |
I mean, the same as with dogs and other animals 01:39:44.900 |
- Well, yeah, I mean, we can't and nor should we 01:39:49.940 |
We have a USDA, we have departments of agriculture 01:39:54.700 |
that deal with, you know, animal care and use committees 01:39:58.260 |
for research, for farming and ranching and all that. 01:40:02.060 |
So I, while when you first said it, I thought, wait, 01:40:07.380 |
But it absolutely makes sense in the context of everything 01:40:13.860 |
Let's, if you're willing, I'd love to talk about dogs 01:40:18.700 |
because you've mentioned dogs a couple of times, 01:40:43.880 |
so he's this black dog that has really long hair 01:40:50.460 |
I think perhaps that's true for a lot of large dogs, 01:41:00.480 |
- Oh, since, yeah, since the very, very beginning 01:41:03.760 |
And one of the things, I mean, he had this kind of, 01:41:22.740 |
in case people are wondering which Homer I'm referring to. 01:41:35.120 |
that immediately led to a deep love for each other. 01:41:43.320 |
He was always there for so many nights together. 01:41:46.500 |
That's a powerful thing about a dog that he was there 01:41:51.080 |
through all the loneliness, through all the tough times, 01:41:53.540 |
through the successes and all those kinds of things. 01:42:09.140 |
But it was the first time I've really experienced 01:42:26.020 |
And then at a certain point he couldn't get up anymore. 01:42:33.960 |
That maybe he suffered much longer than he needed to. 01:42:42.060 |
But I remember I had to take him to the hospital 01:43:16.980 |
somebody that heavy when they're not helping you out. 01:43:20.100 |
And yeah, so I remember it was the first time 01:43:34.040 |
And that realization that we're here for a short time 01:43:42.760 |
the week before, the day before, and now he's gone. 01:43:49.180 |
that spoke to the fact that he could be deeply connected 01:43:53.240 |
Also spoke to the fact that the shared moments together 01:44:00.800 |
that led to that deep friendship will make life so amazing, 01:44:05.800 |
but also spoke to the fact that death is a motherfucker. 01:44:23.320 |
that I guess we're about to both cry over our dead dogs. 01:44:28.680 |
It was bound to happen just given when this is happening. 01:44:35.060 |
- How long did you know that Costello was not doing well? 01:44:49.060 |
his behavior changed and something really changed. 01:45:00.600 |
He was dealing with joint pain, sleep issues, 01:45:15.320 |
I mean, this dog was, yeah, it was like a pharmacy. 01:45:17.360 |
It's amazing to me when I looked at it the other day, 01:45:19.880 |
still haven't cleaned up and removed all his things 01:45:22.080 |
'cause I can't quite bring myself to do it, but... 01:45:27.480 |
- Well, so what happened was about a week ago, 01:45:30.300 |
it was really just about a week ago, it's amazing. 01:45:35.860 |
He wasn't 200 pounds, but he was about 90 pounds, 01:45:37.740 |
but he's a bulldog, that's pretty big, and he was fit. 01:45:41.200 |
And then I noticed that he wasn't carrying a foot 01:45:46.400 |
He never liked me to touch his hind paws and I could do, 01:45:50.480 |
And then the vet found some spinal degeneration 01:45:57.300 |
I sure hope not, but something changed in his eyes. 01:46:02.840 |
I know you and I spend long hours on the phone 01:46:05.120 |
and talking about like the eyes and what they convey 01:46:09.180 |
and, say, in robots and biology of other kinds, but- 01:46:12.800 |
- Do you think something about him was gone in his eyes? 01:46:17.800 |
- I think he was real, here I am anthropomorphizing. 01:46:22.860 |
I think he was realizing that one of his great joys in life, 01:46:26.840 |
which was to walk and sniff and pee on things. 01:46:33.120 |
This dog loved to pee on things, it was amazing. 01:46:38.900 |
He was like a reservoir of urine, it was incredible. 01:46:42.280 |
I think, oh, that's it, he'd put like one drop 01:46:46.560 |
and then we get to the 50 millionth in one plant 01:46:49.120 |
and he'd just have, you know, leave a puddle. 01:46:54.520 |
He was losing that ability to stand up and do that. 01:47:04.560 |
but you know, I'll say this, I'm not ashamed to say it. 01:47:09.140 |
I mean, I wake up every morning since then just, 01:47:16.080 |
And I'm fortunately able to make it through the day 01:47:25.760 |
And I feel like he, you know, Homer, Costello, 01:47:29.180 |
you know, the relationship to one's dog is so specific, but. 01:47:53.260 |
of I brought Costello a little bit to the world 01:47:56.480 |
through the podcast, through posting about him. 01:47:58.400 |
I gave, I anthropomorphized about him in public. 01:48:01.560 |
Let's be honest, I have no idea what his mental life was 01:48:04.840 |
And I'm just exploring all this for the first time 01:48:11.020 |
I noticed the episode you released on Monday, 01:48:23.440 |
he's going to be gone for a lot of people too. 01:48:29.820 |
I think that maybe, you're pretty good at this, Lex. 01:48:36.180 |
This is the challenge is I actually, part of me, 01:48:40.560 |
I know how to take care of myself pretty well. 01:48:46.000 |
I do worry a little bit about how it's going to land 01:48:56.280 |
- And they have to watch you struggle, which is fascinating. 01:48:59.360 |
- Right, and I've mostly been shielding them from this, 01:49:08.200 |
And his best traits were that he was incredibly tough. 01:49:13.200 |
I mean, he was a 22 inch neck, bulldog, the whole thing. 01:49:18.960 |
But what was so beautiful is that his toughness 01:49:34.880 |
you should probably live on in your podcast too. 01:49:40.560 |
one of the things I loved about his role in your podcast 01:49:51.920 |
I think that's such a powerful thing to bring that joy into, 01:49:56.300 |
like allowing yourself to experience that joy, 01:49:59.380 |
to bring that joy to others, to share it with others. 01:50:03.820 |
And I mean, not to, this is like the Russian thing is, 01:50:14.040 |
that I keep thinking about in his show, Louis, 01:50:20.240 |
for whining about breaking up with his girlfriend. 01:50:22.840 |
And he was saying like the most beautiful thing 01:50:27.100 |
about love, I mean, a song that's catchy now, 01:50:32.480 |
that's now making me feel horrible saying it, 01:50:48.700 |
and not run away from that loss is really powerful. 01:50:55.060 |
just like the love was, the loss is also sweet 01:50:58.080 |
because you know that you felt a lot for that, 01:51:13.680 |
or whatever else is the source of joy, right? 01:51:16.920 |
And maybe you think about one day getting another dog. 01:51:21.920 |
- Yeah, in time, you're hitting on all the key buttons here. 01:51:27.460 |
I want that to, we're thinking about, you know, 01:51:32.700 |
ways to kind of immortalize Costello in a way that's real, 01:51:35.620 |
not just, you know, creating some little logo 01:51:39.740 |
You know, Costello, much like David Goggins is a person, 01:51:43.840 |
but Goggins also has grown into kind of a verb. 01:51:47.140 |
You're going to Goggins this or you're going to, 01:51:48.820 |
and there's an adjective like that's extreme. 01:51:51.020 |
Like I think that for me, Costello was all those things. 01:52:05.060 |
to do things for you without doing a damn thing. 01:52:14.440 |
This actually has been very therapeutic for me, 01:52:22.360 |
We're friends, we're not just co-scientists colleagues 01:52:26.720 |
working on a project together and in the world 01:52:42.700 |
because I think that I certainly know as a scientist 01:52:52.740 |
There are elements of many pursuits that are lonely. 01:53:05.380 |
and sometimes people are surrounded by people 01:53:06.980 |
interacting with people and they feel very lonely. 01:53:17.940 |
in making one feel like certain things are possible 01:53:25.780 |
Maybe even making us compulsively reach for them. 01:53:33.700 |
- Okay, and then you moved directly to Philadelphia? 01:53:39.820 |
And then Philadelphia and San Francisco and Boston and so on 01:53:44.820 |
but really to Chicago, that's when I went to high school. 01:54:01.820 |
He was into, I mean, so he's also a scientist, 01:54:10.260 |
he was the person who did drink and did every drug 01:54:21.100 |
when you're the older brother, five years older, 01:54:23.420 |
he was the coolest person that I always wanted to be him. 01:54:28.300 |
So to that, he definitely had a big influence. 01:54:31.180 |
But I think for me, in terms of friendship growing up, 01:54:40.060 |
And then when I came here, I had another close friend, 01:54:42.280 |
but I'm very, I believe, I don't know if I believe, 01:54:47.280 |
but I draw a lot of strength from deep connections 01:54:52.520 |
with other people and just a small number of people, 01:55:00.500 |
I was really surprised how like there would be 01:55:04.140 |
these large groups of friends, quote unquote, 01:55:08.280 |
but the depth of connection was not there at all 01:55:14.260 |
Now I moved to a suburb of Chicago, which was Naperville. 01:55:17.700 |
It's more like a middle class, maybe upper middle class. 01:55:23.500 |
about material possessions than deep human connection. 01:55:28.420 |
But I drew more meaning than almost anything else 01:55:35.620 |
I had a best friend, his name was, his name is Yura. 01:55:56.640 |
There's also a group of friends, like, I don't know, 01:56:09.660 |
So we spent all day, all night just playing soccer, 01:56:13.340 |
usually called football and just talking about life 01:56:20.780 |
I think people in Russia and the Soviet Union 01:56:25.120 |
I think the education system at the university level 01:56:33.740 |
in terms of creating really big, powerful minds, 01:56:38.740 |
at least it used to be, but I think that they aspire to that. 01:56:51.220 |
The level of literature, Tolstoy, Dostoyevsky-- 01:57:03.780 |
Like we, I think in this country, there's more, 01:57:07.180 |
like especially young kids 'cause they're so cute, 01:57:11.500 |
We only start to really push adults later in life. 01:57:15.260 |
Like, so if you want to be the best in the world at this, 01:57:19.460 |
But we were pushed at a young age, everybody was pushed. 01:57:25.020 |
I think they really forced people to discover, 01:57:29.020 |
like discover themselves in the Goggin style, 01:57:31.820 |
but also discover what they're actually passionate about, 01:57:38.940 |
- Yeah, they were pushed equally, I would say. 01:57:41.940 |
There was a, obviously there was more, not obviously, 01:58:01.560 |
there's, you know, like guys like lifting heavy things 01:58:06.560 |
and girls like creating beautiful art and, you know, 01:58:14.180 |
- A more traditional view of gender, more 1950s, '60s. 01:58:18.200 |
- But we didn't think in terms of, at least at that age, 01:58:20.380 |
in terms of like roles and then like a homemaker 01:58:33.240 |
I think mathematics and engineering was something 01:58:47.740 |
And now of course that I don't take that forward 01:58:54.820 |
but it's just the way I grew up and the way I remember it. 01:59:01.580 |
The value of hard work was instilled in everybody. 01:59:06.500 |
And through that, I think it's a little bit of hardship. 01:59:11.500 |
Of course, also economically, everybody was poor, 01:59:14.940 |
especially with the collapse of the Soviet Union. 01:59:28.020 |
Just meeting with neighbors, everybody knew each other. 01:59:47.260 |
- What's with the sad songs and the Russian thing? 01:59:50.580 |
I mean, Russians do express joy from time to time. 02:00:02.820 |
- I think, so first of all, the Soviet Union, 02:00:12.100 |
and the millions and millions and millions of people there, 02:00:25.900 |
Like that's grandparents, that's parents, that's all there. 02:00:30.580 |
that life can be absurdly, unexplainably cruel. 02:00:47.660 |
why you meditate on death is like if you just think about 02:00:50.800 |
the worst possible case and find beauty in that, 02:00:57.420 |
and that somehow helps you deal with whatever comes. 02:01:07.220 |
that like inflation, all those kinds of things 02:01:10.280 |
where people were sold dreams and never delivered. 02:01:15.420 |
And so like, if you don't sing songs about sad things, 02:01:20.420 |
you're going to become cynical about this world. 02:01:34.240 |
One of the things that may be common in Russia 02:01:43.180 |
about dreaming about robots, it's very common for people 02:01:46.460 |
to dismiss that dream of saying, no, that's not, 02:01:54.320 |
Or you want to start a podcast, like who else? 02:02:00.020 |
That kind of mindset I think is quite common, 02:02:03.360 |
which is why I would say entrepreneurship in Russia 02:02:07.520 |
is still not very good, which to be a business, 02:02:14.980 |
like friends and support group that make you dream big. 02:02:21.380 |
and appreciate the beauty and the unfairness of life, 02:02:29.940 |
then I think it just makes you appreciative of everything. 02:02:34.500 |
It's like a, it's a prerequisite for gratitude. 02:03:07.680 |
And I think that that was instilled in me too. 02:03:10.900 |
Not only do I appreciate everything about life, 02:03:15.940 |
In a sense, like I get like a visceral feeling of joy 02:03:26.080 |
Like there's a deep, like emotional connection there 02:03:30.700 |
that like, that's like way too dramatic to like, 02:03:35.300 |
I guess, relative to what the actual moment is, 02:03:46.620 |
And I think I would attribute that to my upbringing in Russia. 02:03:50.340 |
But the thing that sticks most of all is the friendship. 02:03:54.000 |
And I've now since then had one other friend like that 02:04:16.540 |
it's a lot of times now interacting with Joe Rogan. 02:04:28.300 |
in the grappling sports that I'm really connected with. 02:04:31.600 |
I've actually struggled, which is why I'm so glad 02:04:43.200 |
Even the biologists, I mean, one thing that I'm-- 02:04:47.000 |
- Well, I'm so struck by the fact that you work with robots, 02:04:55.320 |
But what you're describing, and I know is true about you, 02:04:58.260 |
is this deep emotional life and this resonance. 02:05:04.120 |
why so many people, scientists and otherwise, 02:05:12.860 |
In Hermann Hesse's book, I don't know if you read, 02:05:16.100 |
It's about these elements of the logical, rational mind 02:05:19.400 |
and the emotional mind and how those are woven together. 02:05:27.300 |
And I think that's so much of what draws people to you. 02:05:30.020 |
- I've read every Hermann Hesse book, by the way. 02:05:31.940 |
- As usual, as usual, I've done about 9% of what Lex has. 02:05:38.560 |
Well, you mentioned Joe, who is a phenomenal human being, 02:05:44.260 |
but for how he shows up to the world one-on-one. 02:05:48.500 |
I think I heard him say the other day on an interview, 02:05:51.620 |
he said there is no public or private version of him. 02:05:58.340 |
He said, "I'm like the fish that got through the net." 02:06:12.620 |
He was a huge, if I could just comment real quick. 02:06:15.620 |
Like that, he was, I've been a fan of Joe for a long time, 02:06:20.620 |
to not have any difference between public and private life. 02:06:25.060 |
I actually had a conversation with Naval about this. 02:06:36.620 |
if you're the same person publicly and privately. 02:06:52.500 |
Now, that said, I don't have any really strange sex kinks. 02:06:57.500 |
So like, I feel like I can be open with basically everything. 02:07:02.340 |
There's some things that could be perceived poorly, 02:07:05.660 |
like the screaming Roombas, but I'm not ashamed of them. 02:07:09.040 |
I just have to present them in the right context. 02:07:14.020 |
to being the same person in private as in public. 02:07:16.620 |
And that, Joe made me realize that you can be that 02:07:28.620 |
but I really always enjoyed being good to others. 02:07:41.100 |
But I always felt like the world didn't want me to be. 02:07:45.380 |
Like there's so much negativity when I was growing up, 02:07:57.460 |
this could be just like the big cities that I visited in, 02:08:08.660 |
And two, you're seen as like a little bit of a weirdo. 02:08:13.220 |
And so I always felt like I had to hide that. 02:08:17.780 |
I can be fully just the same person, private and public. 02:08:44.540 |
even if I'm being mocked, you know what I mean? 02:08:53.520 |
The point is like, it's just enjoy being yourself. 02:08:58.300 |
because he's so successful at it, inspired me to do that. 02:09:03.060 |
Be kind and be the same person, private and public. 02:09:11.520 |
That it doesn't mean you reveal every detail of your life. 02:09:14.460 |
It's a way of being true to an essence of oneself. 02:09:19.000 |
- Right, there's never a feeling when you deeply think 02:09:23.620 |
and introspect that you're hiding something from the world 02:09:26.100 |
or you're being dishonest in some fundamental way. 02:09:33.480 |
It allows you to think, it allows you to like think freely, 02:09:42.100 |
That said, it's not like there's not still a responsibility 02:09:52.120 |
So, I'm very careful with the way I say something. 02:10:00.680 |
to express the spirit that's inside you with words. 02:10:08.740 |
I struggle, like oftentimes when I say something 02:10:12.740 |
and I hear myself say it, it sounds really dumb 02:10:18.420 |
It's not just like being the same person publicly 02:10:21.340 |
and privately means you can just say whatever the hell. 02:10:24.460 |
It means there's still a responsibility to try to be, 02:10:37.640 |
all people, when I say we, I mean all humans, 02:10:44.480 |
to be able to express ourselves in that one moment, 02:10:48.900 |
And it is beautiful when somebody, for instance, 02:10:58.740 |
But perhaps it's also possible to do it in aggregate, 02:11:02.740 |
you know, all the things, you know, how you show up. 02:11:06.080 |
For instance, one of the things that initially drew me 02:11:08.660 |
to want to get to know you as a human being and a scientist, 02:11:24.340 |
overdressed by American, certainly by American standards, 02:11:27.260 |
you're overdressed for a podcast, but this is, 02:11:29.740 |
but it's genuine, you're not doing it for any reason, 02:11:31.820 |
except I have to assume, and I assumed at the time, 02:11:35.060 |
that it was because you have a respect for your audience. 02:11:37.940 |
You respect them enough to show up a certain way for them. 02:11:47.080 |
to your friendships, the way that you talk about friendships 02:11:49.560 |
and love and the way you hold up these higher ideals, 02:11:52.820 |
I think at least as a consumer of your content 02:11:56.620 |
and as your friend, what I find is that in aggregate, 02:12:03.360 |
It doesn't have to be one quote or something. 02:12:13.020 |
but I think you so embody the way that, and Joe as well, 02:12:18.020 |
it's about how you live your life and how you show up 02:12:25.840 |
So the aggregate is the goal, the tricky thing, 02:12:32.220 |
because he's under attack way more than you and I 02:13:01.140 |
is that when you in aggregate are a good person, 02:13:16.820 |
And so that, I like that idea is the aggregate 02:13:27.180 |
and being yourself and people get to know who you are. 02:13:30.180 |
And once they do and you post pictures of screaming Roombas 02:13:51.580 |
The Boston Dynamics came up against this problem, 02:13:59.860 |
And maybe because we'll post this, people will let me know. 02:14:15.420 |
because you can throw them like off of a building 02:14:29.100 |
No one should throw their cat out of a window. 02:14:34.700 |
It's really fascinating how they recover from those kicks, 02:14:40.440 |
and also seeing others do it, it just does not look good. 02:15:08.580 |
except that if they have some sentient aspect 02:15:13.020 |
to their being, then I would loathe to kick it. 02:15:21.820 |
One of the cool things is one of the robots I'm working with, 02:15:26.580 |
you can pick it up by one leg and it's dangling. 02:15:37.580 |
- Oh man, we look forward to the letters from the cat. 02:15:44.540 |
he would just throw it onto the bed from across the room 02:15:48.820 |
Somehow they had, that was the nature of the relationship. 02:15:54.100 |
but apparently this cat seemed to return for it 02:15:58.220 |
- The robot is a robot and it's fascinating to me 02:16:08.240 |
So for me to be able to do that with a robot, 02:16:20.740 |
I have to do what robotics colleagues of mine do, 02:16:26.880 |
So it was just fascinating that I have to do that 02:16:30.700 |
I just wanted to take that on a little bit of a tangent. 02:16:34.980 |
I mean, I am not shy about the fact that for many years, 02:16:49.220 |
'cause I just don't like doing it, larger animals as well. 02:17:00.820 |
to have an understanding of what the guidelines are 02:17:03.880 |
and where one's own boundaries are around this. 02:17:14.940 |
I know you have a lot of thoughts about friendship. 02:17:18.180 |
What do you think is the value of friendship in life? 02:17:24.820 |
just because of my life trajectory and arc of friendship, 02:17:29.820 |
and I should say I do have some female friends 02:17:37.100 |
but it's been mostly male friendship to me has been- 02:17:39.660 |
- Has been all male friendships to me actually. 02:17:58.880 |
because there's always an anxiety in taking any good risk 02:18:04.220 |
It's given me the sense that I should go for certain things 02:18:45.000 |
Sometimes it's about, I'm terrible with punctuality 02:18:49.920 |
'cause I'm an academic and so I just get lost in time 02:18:53.420 |
but it's striving to listen, to enjoy good times 02:18:59.100 |
It kind of goes back to this first variable we talked about, 02:19:12.580 |
It's actually, to me, what makes life worth living. 02:19:14.820 |
- Yeah, well, I am surprised with the high school friends 02:19:19.820 |
how we don't actually talk that often these days 02:19:22.640 |
in terms of time, but every time we see each other, 02:19:24.940 |
it's immediately right back to where we started. 02:19:27.060 |
So I struggle with that, how much time you really allocate 02:19:51.120 |
I'm much more reliable when you're going through shit 02:19:59.560 |
- No, but if you're like a wedding or something like that, 02:20:03.120 |
or like, I don't know, like you want an award of some kind, 02:20:08.120 |
like, yeah, I'll congratulate the shit out of you, 02:20:14.460 |
but that's not as important to me as being there 02:20:16.700 |
when like nobody else is, like just being there 02:20:26.880 |
That to me, that's where friendship is meaningful. 02:20:30.920 |
and that's a felt thing and a real thing with you. 02:20:33.420 |
Let me ask one more thing about that actually, 02:20:38.420 |
I know you are, Joe is, but years ago I read a book 02:20:42.400 |
which is Sam Sheridan's book, "A Fighter's Heart." 02:20:44.780 |
He talks about all these different forms of martial arts, 02:20:46.920 |
and maybe it was in the book, maybe it was in an interview, 02:20:50.640 |
but he said that fighting or being in physical battle 02:20:57.340 |
or some other form of direct physical contact 02:21:00.220 |
between two individuals creates this bond unlike any other, 02:21:04.300 |
because he said, it's like a one-night stand. 02:21:09.800 |
And I chuckled about it 'cause it's kind of funny 02:21:14.820 |
But at the same time, I think this is a fundamental way 02:21:32.720 |
- I heard this recently, I didn't know this term, 02:21:34.840 |
but there's a term, they've turned the noun cupcake 02:21:41.740 |
Cupcaking is when you spend time just cuddling. 02:21:47.300 |
although I heard it first just the other day, 02:21:49.980 |
- So cuddling is everything, it's not just like, 02:21:58.980 |
I think it definitely has to do with physical contact, 02:22:04.180 |
But in terms of battle, competition and the Sheridan quote, 02:22:15.060 |
or feel a bond with people that, for instance, 02:22:20.660 |
even though you don't know anything else about them? 02:22:27.140 |
He also has the book, "A Fighter's Mind" and "A Fighter's Art." 02:22:32.140 |
What's interesting about him, just briefly about Sheridan, 02:22:34.500 |
I don't know, but I did a little bit of research. 02:22:36.140 |
He went to Harvard, he was an art major at Harvard. 02:22:40.100 |
He claims all he did was smoke cigarettes and do art. 02:22:45.100 |
And I think his father was in the SEAL teams. 02:22:48.980 |
And then when he got out of Harvard, graduated, 02:22:54.420 |
and was early to the kind of ultimate fighting 02:23:03.140 |
and I remember thinking there was an amazing encapsulation 02:23:06.980 |
of what makes fighting, like what makes it compelling. 02:23:11.980 |
I would say that there's so many ways that jiu-jitsu, 02:23:17.100 |
grappling, wrestling, combat sports in general, 02:23:21.100 |
is like one of the most intimate things you could do. 02:23:25.620 |
in terms of bodily liquids and all those kinds of things. 02:23:29.460 |
- But I think there's a few ways that it does that. 02:23:44.200 |
and often all of us have ego thinking we're better 02:24:01.000 |
That kind of honesty, we don't get to experience it 02:24:06.040 |
We can continue living somewhat of an illusion 02:24:10.740 |
'cause people are not going to hit us with the reality. 02:24:22.500 |
It's the loss of a reality that you knew before. 02:24:28.180 |
And when you're sitting there in that vulnerability 02:24:46.020 |
and we're just sitting there with that reality. 02:24:47.660 |
Some of us can put words to them, some of them can't. 02:24:58.580 |
There is something about, I mean, like a big hug. 02:25:06.780 |
and I hugged them and I always felt good when they did. 02:25:10.240 |
Like we're all tested and especially now we're vaccinated 02:25:13.860 |
but there's still people, this is true in San Francisco, 02:25:26.720 |
of what I feel in jiu-jitsu where it was like 02:25:30.480 |
that contact where you're like, I don't give a shit 02:25:34.900 |
about whatever rules we're supposed to have in society 02:25:37.400 |
where you're not, you have to keep a distance 02:25:46.440 |
and you're like just controlling another person. 02:25:49.720 |
And also there is some kind of love communicated 02:25:52.100 |
through just trying to break each other's arms. 02:26:08.220 |
but also non-sexual contact are not just nearby the neurons 02:26:25.020 |
I'm not anthropomorphizing about what this means 02:26:27.820 |
but in the brain, those structures are interdigitated. 02:26:32.020 |
You can't separate them except at a very fine level. 02:26:45.660 |
but one of the amazing things about jiu-jitsu 02:26:54.740 |
So I'm a big fan of yoga pants at the gym kind of thing. 02:27:04.780 |
But the thing is girls are dressed in skintight clothes 02:27:10.180 |
And I found myself not at all thinking like that at all 02:27:22.660 |
to train with the opposites in something so intimate? 02:27:31.780 |
- And the only times girls kind of try to stay away 02:27:36.380 |
Of course, there's always going to be creeps in this world. 02:27:42.100 |
And the other is like, there's a size disparity. 02:28:00.980 |
And then second, they get, especially later on, 02:28:05.940 |
like these like bros that come in who are all shredded 02:28:10.100 |
and like muscular, and they get to use technique to exercise 02:28:17.720 |
- You've seen women force a larger guy to tap 02:28:31.520 |
So I was really into powerlifting when I started jiu-jitsu. 02:28:37.400 |
I thought I walked in feeling like I'm going to be, 02:28:40.660 |
if not the greatest fighter ever, at least top three. 02:28:43.220 |
And so as a white belt, you roll in like all happy. 02:28:47.420 |
And then you realize that as long as you're not applying 02:28:52.940 |
I remember being submitted many times by like 130, 02:28:55.540 |
120 pound girls at Balance Studios in Philadelphia 02:28:59.860 |
that has a lot of incredible female jiu-jitsu players. 02:29:04.360 |
Technique can overpower pure strength in combat. 02:29:09.360 |
And that's the other thing that there is something 02:29:18.140 |
Like it just feels, it feels like we were born to do this. 02:29:29.040 |
that are dedicated to this kind of interaction. 02:29:45.660 |
A child before puberty has no concept of boys and girls 02:29:49.260 |
having this attraction regardless of whether or not 02:30:01.300 |
And certain people take on a romantic or sexual interest 02:30:07.560 |
And so it's like, it's revealing a circuitry in the brain. 02:30:14.320 |
And I think when I hear the way you describe jiu-jitsu 02:30:18.100 |
and rolling jiu-jitsu, it reminds me a little bit, 02:30:21.060 |
Joe was telling me recently about the first time 02:30:23.300 |
he went hunting and he felt like it revealed a circuit 02:30:34.920 |
One of the interesting things about jiu-jitsu 02:30:37.220 |
is it's one of the really strenuous exercises 02:30:48.000 |
When I came up, there's a few people in their 80s 02:30:57.320 |
That's late into life and so you're getting exercise. 02:31:31.560 |
are the two activities you can kind of do late in life 02:31:40.420 |
And I'm glad that we're on the physical component 02:31:49.680 |
between the physical and the intellectual and the mental. 02:31:52.480 |
Are you still running at ridiculous hours of the night 02:32:01.520 |
I've been running late at night here in Austin. 02:32:07.240 |
which I find laughable coming from the bigger cities. 02:32:14.180 |
- If you see a guy running through Austin at 2 a.m. 02:32:24.080 |
'cause I get recognized more and more in Austin. 02:32:34.960 |
that brings out those deep philosophical thoughts 02:32:40.720 |
But recently, I started getting back to the grind. 02:32:44.560 |
So I'm gonna be competing, or hoping to compete, 02:32:53.120 |
And so that requires getting back into great cardio shape. 02:32:58.120 |
I've been getting running as part of my daily routine. 02:33:05.140 |
regardless of time zone in the middle of the night, 02:33:09.280 |
- Well, part of that has to be just being single 02:33:18.040 |
- It's not banker's hours kind of work, nine to five. 02:33:38.960 |
- Yeah, I guess that's a powerful thing, right? 02:33:50.880 |
and it's beautiful, the fact that they have that 02:33:53.280 |
and that was the norm for you, I think is really wonderful. 02:33:57.000 |
- In the case of my parents, it was interesting to watch 02:34:06.680 |
They obviously get frustrated with each other 02:34:16.520 |
like to tease, to get some of that frustration out 02:34:21.040 |
and find their joyful moments and let that be the energy. 02:34:25.280 |
I think it's clear 'cause they got together in their, 02:34:27.600 |
I think early twenties, like very, very young. 02:34:44.960 |
especially from that time in the Soviet Union, 02:34:46.640 |
that's probably applies to a lot of cultures. 02:34:52.280 |
And once you do, you start to get to some of those 02:34:54.920 |
rewarding aspects of being like through time, 02:35:00.760 |
That's definitely something that was an inspiration to me, 02:35:11.340 |
to have a lifelong partner that have that kind of view 02:35:16.760 |
lifelong friendship is the most meaningful kind, 02:35:24.120 |
like till death do us part as a powerful thing, 02:35:26.800 |
not by force, not because the religion said it 02:35:29.200 |
or the government said it or your culture said it, 02:35:41.280 |
- Oh, I thought you meant, you know, human children. 02:35:45.560 |
- Exactly, well, I was saying you probably need 02:35:46.880 |
at least as many human children as you do Roombas. 02:35:55.160 |
are there a bunch of Fridmans running around? 02:36:36.800 |
- I have a feeling and I don't have a crystal ball, 02:36:42.600 |
I see an upwards of certainly three or more comes to mind. 02:36:47.600 |
- So much of that has to do with the partner you're with too. 02:36:55.560 |
especially in this society of what the right partnership is. 02:37:08.520 |
to be really excited about the passions of another person, 02:37:13.020 |
It doesn't have to be a career success, any kind of success, 02:37:23.720 |
But there was also practical aspects of like, 02:37:25.800 |
what kind of shit do you enjoy doing together? 02:37:28.760 |
And I think family is a real serious undertaking. 02:37:34.040 |
I mean, I think that I have a friend who said it, 02:37:41.240 |
he's in a very successful relationship and has a family. 02:37:44.440 |
And he said, you first have to define the role 02:37:47.480 |
and then you have to cast the right person for the role. 02:37:50.180 |
- Well, yeah, there's some deep aspects to that, 02:37:53.880 |
but there's also an aspect to which you're not smart enough 02:38:11.520 |
you just have to go with it and figure it out also. 02:38:20.580 |
So there's so many incredibly successful people that I know, 02:38:25.360 |
that I've gotten to know, that all have kids. 02:38:37.900 |
something that made them the best version of themselves, 02:38:44.520 |
I mean, you can imagine if the way that you felt about Homer, 02:38:49.760 |
is at all a glimpse of what that must be like then. 02:39:07.840 |
a source of increased productivity and joy and happiness. 02:39:19.620 |
that little shallow layer of complaint, kids are great. 02:39:30.940 |
dating is very difficult and I'm a complicated person. 02:39:48.260 |
You're like the first person I saw in a while. 02:39:53.040 |
So like, I don't think I've seen like a female. 02:40:00.840 |
An element of the female species in quite a while. 02:40:03.680 |
So I think you have to put yourself out there. 02:40:07.440 |
Daniel Johnston says, true love will find you, 02:40:11.760 |
So there's some element of really taking the leap 02:40:20.460 |
- Well, you're a builder and you're a problem solver 02:40:25.200 |
and you find solutions and I'm confident this solution is, 02:40:35.100 |
that I'm going to build the girlfriend, which I think- 02:40:39.780 |
and maybe we shouldn't separate this friendship, 02:40:45.460 |
And if we go back to this concept of the aggregate, 02:40:48.900 |
maybe you'll meet this woman through a friend 02:41:12.320 |
You know, and I think that if you know, you know, 02:41:14.680 |
because that's a good thing that you have that. 02:41:20.860 |
because you don't want to fall in love with the wrong person. 02:41:27.340 |
I've noticed this because I fall in love with everything, 02:41:42.700 |
That doesn't necessarily mean you need to marry her tonight. 02:41:46.700 |
- Yes, and I like the way you said that out loud 02:41:49.980 |
It doesn't mean you need to marry her tonight. 02:42:00.540 |
but I also have to be careful with relationships. 02:42:02.940 |
And at the same time, like I mentioned to you offline, 02:42:08.880 |
that appreciates swinging for the fences and not dating, 02:42:15.860 |
- Yeah, you're a one guy, one girl kind of guy. 02:42:18.940 |
- And it's tricky because you want to be careful 02:42:26.380 |
that have a ridiculous amount of female interest 02:42:29.140 |
of a certain kind, but I'm looking for deep connection 02:42:36.580 |
And every once in a while, talking to Stanford professors. 02:42:42.380 |
- Perfect solution. - It's going to work out great. 02:42:50.580 |
- You mentioned what has now become a quite extensive 02:42:55.860 |
and expansive public platform, which is incredible. 02:43:01.260 |
first time I saw your podcast, I noticed the suit. 02:43:03.540 |
I was like, he respects his audience, which was great. 02:43:08.480 |
People are showing up for science and engineering 02:43:10.740 |
and technology information and those discussions 02:43:14.100 |
Now, I do want to talk for a moment about the podcast. 02:43:31.960 |
I do believe in an element of surprise is always fun. 02:43:52.300 |
Like you don't get a chance to talk like this. 02:43:58.500 |
in the amount of focus we allocate to each other. 02:44:02.100 |
We would be having fun talking about other stuff 02:44:07.480 |
There would be some phone use and all that kind of stuff. 02:44:23.080 |
We're both like fumbling with it, trying to figure out, 02:44:27.120 |
That's something we haven't really figured out before 02:44:32.400 |
I don't know why we need microphones for that, 02:44:36.720 |
- It feels like doing science for me, definitely. 02:44:48.900 |
I wanted to talk to friends and colleagues at MIT 02:45:06.060 |
that we're currently working on for a particular conference. 02:45:10.840 |
So really asking questions like, what are we doing? 02:45:26.360 |
which is why I initially called it artificial intelligence, 02:45:45.560 |
that is anywhere close to the generalizability 02:45:54.620 |
in terms of reasoning, common sense reasoning. 02:46:00.540 |
can I talk to these folks, do science together, 02:46:05.620 |
And I also thought that there was not enough, 02:46:08.620 |
I didn't think there was enough good conversations 02:46:21.740 |
Oftentimes you go on this tour when you have a book, 02:46:24.260 |
but there's a lot of minds that don't write books. 02:46:26.640 |
- And the books constrain the conversation too, 02:46:28.860 |
'cause then you're talking about this thing, this book. 02:46:34.460 |
that haven't written a book who are brilliant, 02:46:40.060 |
We both haven't actually, when we raise a question, 02:46:43.780 |
we don't know the answer to it when the question is raised. 02:46:48.700 |
Like, I don't know, I remember asking questions 02:46:57.140 |
of why do neural networks work as well as they do? 02:47:12.460 |
to think through it together, I think that's science. 02:47:40.180 |
is because I feel like I'm not good at conversation. 02:47:43.000 |
So it looks like it doesn't match the current skill level, 02:47:48.000 |
but I want it to have really dangerous conversations 02:47:58.160 |
Not completely uniquely, but like I'm a huge fan 02:48:04.540 |
what conversations can I do that Joe Rogan can't? 02:48:17.580 |
He's just like with Costello, he's not just a person. 02:48:22.080 |
He's also an idea to me for what I strive for, 02:48:27.860 |
And the reason I'm uniquely qualified is both the Russian, 02:48:31.440 |
but also there's the judo and the martial arts. 02:48:34.060 |
There's a lot of elements that make me have a conversation 02:48:39.460 |
And there's a few other people that I kept in mind, 02:48:44.460 |
like Don Knuth, he's a computer scientist from Stanford, 02:48:49.340 |
that I thought is one of the most beautiful minds ever. 02:48:54.060 |
And nobody really talked to him, like really talked to him. 02:49:01.380 |
but really just have a conversation with him. 02:49:04.820 |
One of them passed away, John Conway, that I never got, 02:49:07.300 |
we agreed to talk, but he died before we did. 02:49:10.660 |
There's a few people like that, that I thought like, 02:49:19.660 |
And I have the unique ability to know how to purchase 02:49:24.660 |
a microphone on Amazon and plug it into a device 02:49:32.060 |
Like that's not easy in the scientific community, 02:49:36.700 |
- No, they can build Faraday cages and two photon 02:49:39.700 |
microscopes and bioengineer, all sorts of things. 02:49:43.040 |
But the idea that you could take ideas and export them 02:49:47.100 |
into a structure or a pseudo structure that people would 02:49:49.740 |
benefit from seems like a cosmic achievement to them. 02:49:54.060 |
- I don't know if it's a fear or just basically 02:49:57.460 |
they haven't tried it, so they haven't learned 02:50:03.380 |
but I think that, but it's important and maybe we should, 02:50:08.040 |
which is that it's, they're not trained to do it. 02:50:12.820 |
and specific hypotheses and many of them don't care to. 02:50:17.340 |
They became scientists because that's where they felt safe. 02:50:22.660 |
And so why would they leave that haven of safety? 02:50:25.880 |
- Well, they also don't necessarily always see 02:50:33.740 |
I think you're probably, you have an exceptionally successful 02:50:38.340 |
and amazing podcast that you started just recently. 02:50:42.420 |
- Well, but there's a raw skill there that you're definitely 02:50:47.420 |
an inspiration to me in how you do the podcast 02:50:52.860 |
But I think you've discovered that that's also 02:50:55.240 |
an impactful way to do science, that podcast. 02:50:58.020 |
And I think a lot of scientists have not yet discovered 02:51:01.620 |
that this is, if they apply the same kind of rigor 02:51:11.780 |
and they do that rigor and effort to podcast, 02:51:16.100 |
whatever that is, that could be a five minute podcast, 02:51:18.500 |
a two hour podcast, it could be conversational, 02:51:22.920 |
If they apply that effort, you have the potential to reach 02:51:26.260 |
over time, tens of thousands, hundreds of thousands, 02:51:32.460 |
But yeah, for me, giving a platform to a few of those folks, 02:51:40.940 |
so maybe you can speak to what fields you're drawn to, 02:51:56.300 |
that I thought it would be amazing to explore their mind, 02:52:06.540 |
And at the same time, I had other guests in mind 02:52:27.980 |
even Khabib for wrestling, just to talk to them. 02:52:47.680 |
And also, I feel like it makes me a better person, 02:52:59.820 |
I don't know how often you have really good conversation 02:53:01.780 |
with friends, but like podcasts are like that. 02:53:09.920 |
I mean, when I saw you sit down with Penrose, 02:53:12.180 |
Nobel Prize-winning physicist, and these other folks, 02:53:16.300 |
it's what comes out of his mouth is incredible. 02:53:18.020 |
And what you were able to hold in that conversation 02:53:24.400 |
Light years beyond what he had with any other interviewer, 02:53:31.620 |
Light years beyond what anyone else had been able 02:53:34.460 |
to engage with him was such a beacon of what's possible. 02:53:39.460 |
And I know that, I think that's what people are drawn to. 02:53:45.540 |
certainly if two people are friends as we are 02:53:47.540 |
and they know each other, that there's more of that, 02:53:51.940 |
of private conversations that are made public. 02:53:57.900 |
you're probably starting to realize, and Costello is like, 02:54:04.540 |
and you're putting yourself out there completely, 02:54:13.300 |
They also enjoy watching you, Andrew, struggle 02:54:18.180 |
with these ideas or try to communicate these ideas. 02:54:24.860 |
- Well, that's good 'cause I got plenty of those. 02:54:30.300 |
where you're very self-critical about your flaws. 02:54:33.060 |
I mean, in that same way, it's interesting, I think, 02:54:37.940 |
not just because Penrose is communicating ideas, 02:54:42.300 |
but here's this like silly kid trying to explore ideas. 02:54:46.980 |
Like they know this kid that there's a human connection 02:54:53.580 |
Like it's not just a good interview with Putin. 02:55:03.780 |
and some would argue dangerous people in the world. 02:55:08.260 |
They love that, the authenticity that led up to that. 02:55:11.620 |
And in return, I get to connect everybody I run to 02:55:24.300 |
- Yeah, there's an intimacy that you've formed with them. 02:55:26.840 |
- Yeah, we've been on this like journey together. 02:55:32.740 |
Like I was, because I was a fan of Joe for so many years, 02:55:36.620 |
there's something, there's a kind of friendship 02:55:40.900 |
as absurd as it might be to say in podcasting 02:55:45.960 |
- Yeah, maybe it fills in a little bit of that 02:55:56.540 |
but one of them is on behalf of your audience, 02:56:04.460 |
but I just want to know, does it have a name? 02:56:12.400 |
- Well, there's a name he likes to be referred to as, 02:56:19.280 |
in the privacy of our own company that we call each other. 02:56:35.620 |
So I gave away everything I own now three times in my life. 02:56:45.260 |
And recently it's also been guitar, things like that. 02:56:50.940 |
But he survived because he was always in the, 02:56:55.280 |
at least in the first two times, was in the laptop bag, 02:57:00.240 |
And so I just liked the perseverance of that. 02:57:45.280 |
And this, like Hedgy like saw through all of it. 02:57:48.000 |
He was like Dostoevsky's man from underground. 02:57:52.000 |
I mean, there's a sense that he saw the darkness 02:57:56.560 |
So I got, and there's also a famous Russian cartoon, 02:58:17.900 |
It's a hedgehog, like sad, walking through the fog, 02:58:27.820 |
People should, even if you don't speak Russian, 02:58:31.580 |
- Oh, it's, the moment you said that I was gonna ask, 02:58:33.900 |
so it's in Russian, but of course it's in Russian. 02:58:39.020 |
It's almost, there's an interesting exploration 02:58:47.360 |
when you see it only vaguely through the fog. 02:59:03.580 |
- So there's a certain period, and this is, again, 02:59:16.020 |
where like, especially kids were treated very seriously. 02:59:21.020 |
Like they were treated like they're able to deal 02:59:37.180 |
not like dumb cartoons that are trying to get you 02:59:39.500 |
to be like smile and run around, but like create art. 02:59:42.320 |
Like stuff that, you know how like short cartoons 02:59:48.700 |
- So what strikes me about this is a little bit 02:59:52.780 |
It's almost like they treat kids with respect. 02:59:57.060 |
an intelligence and they honor that intelligence. 02:59:59.740 |
- Yeah, they're really just an adult in a small body. 03:00:08.460 |
or like philosophical capacity, they're right there with you. 03:00:14.220 |
the art that they consumed, education reflected that. 03:00:19.100 |
I mean, there's a sense of, because he survived so long 03:00:27.320 |
that it's like we've been through all of this together 03:00:30.900 |
and it's the same, sharing the moments together. 03:00:36.220 |
if all the world turns on you and goes to hell, 03:00:41.180 |
And he doesn't die because he's an inanimate object, so. 03:00:58.540 |
- Well, I now feel a connection to Hedgy the Hedgehog 03:01:04.100 |
And I think that encapsulates the kind of possibility 03:01:12.340 |
and other object and through robotics, certainly. 03:01:17.620 |
There's a saying that I heard when I was a graduate student 03:01:35.140 |
to encapsulate so many aspects of science, engineering, 03:01:43.740 |
martial arts, and the emotional depth that you bring to it, 03:02:04.800 |
So I'm just extraordinarily grateful for your friendship 03:02:11.140 |
And I just wish you showed me the same kind of respect 03:02:26.280 |
If you're enjoying this podcast and learning from it, 03:02:31.380 |
As well, you can subscribe to us on Spotify or Apple. 03:02:35.480 |
Please leave any questions and comments and suggestions 03:02:38.200 |
that you have for future podcast episodes and guests 03:02:43.240 |
At Apple, you can also leave us up to a five-star review. 03:02:46.800 |
If you'd like to support this podcast, we have a Patreon. 03:02:52.840 |
And there you can support us at any level that you like. 03:02:56.200 |
Also, please check out our sponsors mentioned 03:03:03.200 |
Links to our sponsors can be found in the show notes. 03:03:06.740 |
And finally, thank you for your interest in science.