Jaron Lanier: Virtual Reality, Social Media & the Future of Humans and AI | Lex Fridman Podcast #218
Chapters
0:00 Introduction
1:39 What is reality?
5:52 Turing machines
7:10 Simulating our universe
13:25 Video games and other immersive experiences
17:12 Death and consciousness
25:43 Designing human-centric AI
27:17 Empathy with robots
31:09 Social media incentives
43:29 Data dignity
51:01 Jack Dorsey and Twitter
62:46 Bitcoin and cryptocurrencies
67:26 Government overreach and freedom
77:41 GitHub and TikTok
79:51 The Autodidactic Universe
84:42 Humans and the mystery of music
90:53 Defining moments
101:39 Mortality
103:31 The meaning of life
00:00:00.000 |
The following is a conversation with Jaron Lanier, 00:00:03.060 |
a computer scientist, visual artist, philosopher, 00:00:08.060 |
and the founder of the field of virtual reality. 00:00:12.780 |
please check out our sponsors in the description. 00:00:18.060 |
is a staunch critic of social media platforms. 00:00:26.340 |
about it being possible to build better platforms. 00:00:36.660 |
Let me also say a general comment about these conversations. 00:00:54.020 |
that are close to my heart are being criticized. 00:00:57.180 |
In those cases, I do offer a little pushback, 00:01:06.620 |
I think there's no such thing as winning in conversations, 00:01:24.120 |
That said, I also often just do a bad job of talking, 00:01:35.540 |
and here is my conversation with Jaron Lanier. 00:01:38.640 |
You're considered the founding father of virtual reality. 00:01:47.300 |
or all of our lives in virtual reality worlds? 00:01:52.420 |
- I have always found the very most valuable moment 00:01:58.320 |
when you take off the headset and your senses are refreshed 00:02:08.940 |
So you can really notice just how incredibly strange 00:02:13.720 |
and delicate and peculiar and impossible the real world is. 00:02:20.580 |
- So the magic is, and perhaps forever will be, 00:02:27.780 |
I mean, I think I don't get to tell everybody else 00:02:31.140 |
how to think or how to experience virtual reality, 00:02:33.740 |
and at this point, there have been multiple generations 00:02:36.780 |
of younger people who've come along and liberated me 00:02:47.540 |
well, I called it mixed reality back in the day, 00:02:49.580 |
and these days it's called augmented reality, 00:02:58.960 |
not because I think the forest needs augmentation, 00:03:01.300 |
but when you look at the augmentation next to a real tree, 00:03:05.100 |
the real tree just pops out as being astounding. 00:03:08.260 |
It's interactive, it's changing slightly all the time 00:03:13.180 |
if you pay attention, and it's hard to pay attention 00:03:15.500 |
to that, but when you compare it to virtual reality, 00:03:21.980 |
my favorite early application of virtual reality, 00:03:28.140 |
when I was working with Dr. Joe Rosen at Stanford Med 00:03:35.740 |
and to go from the fake anatomy of the simulation, 00:03:40.740 |
which is incredibly valuable for many things, 00:03:45.020 |
for all kinds of things, then to go to the real person, 00:03:50.340 |
Surgeons really get woken up by that transition. 00:03:54.140 |
So I think the transition is actually more valuable 00:04:05.620 |
in the physical space can help you appreciate 00:04:08.220 |
how much you value your home once you return. 00:04:20.380 |
between the virtual world and the physical meat space world 00:04:23.780 |
that you are still drawn, for you personally, 00:04:33.220 |
or is it the peculiarities of the current set of technology? 00:04:37.540 |
- In terms of the kind of virtual reality that we have now, 00:04:41.460 |
it's made of software and software is terrible stuff. 00:04:45.980 |
Software is always the slave of its own history, 00:04:52.020 |
It's always infinitely arbitrarily messy and arbitrary. 00:05:00.220 |
of nerdy personality in people, or at least in me, 00:05:09.100 |
And so that's different from the physical world. 00:05:11.580 |
It's not something we understand, as you just pointed out. 00:05:18.180 |
"Well, do you think the universe is a computer?" 00:05:41.220 |
so we can do technology, so we can program it. 00:05:43.700 |
So I mean, of course it's some kind of computer, 00:05:45.700 |
but I think trying to understand it as a Turing machine 00:05:54.460 |
whether it performs, this computer we call the universe, 00:05:58.820 |
performs the kind of computation that can be modeled 00:06:03.940 |
or is it something much more fancy, so fancy, in fact, 00:06:08.500 |
that it may be beyond our cognitive capabilities 00:06:18.660 |
'cause if you have an infinitely smart programmer 00:06:25.900 |
and an infinite clock speed, then they're universal, 00:06:29.900 |
but that cannot exist, so they're not universal in practice. 00:06:40.660 |
within the conservation principles of any reality 00:06:46.380 |
And so, I think universality of a particular model 00:07:00.740 |
of course, something like that's gotta be true 00:07:07.380 |
but it's just not accessible to us, so what's the point? 00:07:10.460 |
- Well, to me, the question of whether we're living 00:07:37.820 |
but we would prefer to stay in the virtual world anyway. 00:07:43.980 |
has the most practical importance to human beings right now, 00:07:51.580 |
sort of built into the way the question's usually asked 00:08:03.100 |
And actually, people are always learning, evolving, 00:08:20.980 |
So for instance, there was a peer-reviewed paper 00:08:26.420 |
playing back an opera singer behind a curtain 00:08:28.620 |
was indistinguishable from a real opera singer. 00:08:39.180 |
without the experience of it, it seemed plausible. 00:08:49.580 |
between New York and DC in the '30s, I think so, 00:08:54.320 |
that people viewed as being absolutely realistic 00:08:56.420 |
and indistinguishable, which to us would be horrible. 00:09:07.780 |
who just looked kind of like a few archetypes. 00:09:17.060 |
'cause actually photographing them was inconceivable, 00:09:25.460 |
How would they even know what they looked like? 00:09:32.300 |
we perceived the media as being really great, 00:09:34.300 |
but then we evolved through the experience of the media. 00:09:46.720 |
is that we can distinguish that opera singer now 00:09:57.460 |
by some assumption of stasis that's incorrect. 00:10:01.260 |
So that's the first thing, that's my first answer, 00:10:05.260 |
Now, of course, somebody might come back and say, 00:10:09.340 |
there must be some point at which it would surpass. 00:10:23.140 |
which you're now making me realize is way different. 00:10:28.500 |
in which people would want to stay instead of the real world? 00:10:33.860 |
- Like en masse, like large numbers of people. 00:10:44.260 |
helps people appreciate this physical world we have 00:11:02.780 |
the more sort of technology-free aspect of life. 00:11:20.020 |
The program you just described is totally doable. 00:11:26.060 |
you could find 20 PhD theses that do that already. 00:11:38.740 |
Claiming that reality isn't a computer in some sense 00:11:41.580 |
seems incoherent to me 'cause we can program it. 00:11:48.780 |
What more do you want from it to be a computer? 00:11:55.820 |
- Sorry to interrupt, but you're absolutely right. 00:12:07.220 |
that help us understand ourselves, understand us humans. 00:12:19.780 |
- Appreciate and open our eyes more richly to reality. 00:12:26.100 |
And I wish people who become incredibly fascinated, 00:12:29.900 |
who go down the rabbit hole of the different fascinations 00:12:35.900 |
or there's a whole world of variations on that. 00:12:41.940 |
their own motivations and exactly what they mean. 00:12:56.380 |
it has to be 'cause it's not coherent to say that it isn't. 00:13:02.260 |
you know anything about what kind of computer, 00:13:12.220 |
We have to have a bit of modesty about where we stand. 00:13:15.020 |
And the problem I have with these framings of computation 00:13:32.300 |
it's a role-playing game, Skyrim, for example. 00:13:36.860 |
Why do I enjoy so deeply just walking around that world 00:13:51.340 |
but I also am happy with the music that's playing 00:13:55.180 |
and the mountains and carrying around a sword 00:14:06.580 |
- I think it's wonderful to love artistic creations, 00:14:11.580 |
it's wonderful to love contact with other people, 00:14:19.900 |
evolving meaning and patterns with other people. 00:14:26.940 |
I'm not anti-tech and I'm certainly not anti-digital tech, 00:14:37.260 |
I think the manipulative economy of social media 00:14:54.340 |
And by the way, there's a thing about humans, 00:15:03.900 |
Any kind of social interaction with other people 00:15:16.220 |
but when you actually go to a classical music thing 00:15:19.700 |
this is like a backroom power deal kind of place 00:15:38.780 |
is gonna get you anywhere, so I'm not worried about that. 00:15:44.500 |
where we're making ourselves crazy or cruel enough 00:15:49.260 |
and I think the social media criticism rises to that level, 00:15:59.060 |
that every experience is both beauty and darkness, 00:16:03.620 |
I also play classical piano, so I appreciate it very much. 00:16:10.180 |
in "A Man's Search for Meaning" with Viktor Frankl 00:16:15.780 |
even there, there's opportunity to discover beauty. 00:16:19.220 |
And so it's, that's the interesting thing about humans, 00:16:42.140 |
We perceive socially, we depend on each other 00:16:44.780 |
for our sense of place and perception of the world. 00:16:52.340 |
and yet there's also a degree in which we're inevitably, 00:16:59.980 |
We are set up to be competitive as well as supportive. 00:17:06.540 |
our fundamental situation is complicated and challenging, 00:17:13.540 |
- Okay, let's talk about one of the most challenging things. 00:17:17.020 |
One of the things I unfortunately am very afraid of, 00:17:23.420 |
You wrote an essay on death and consciousness, 00:17:33.380 |
"and in the formation of the character of civilization, 00:17:45.460 |
I'm Russian, so I have to ask you about this. 00:17:51.740 |
- See, you would have enjoyed coming to our house, 00:17:58.660 |
- We have a piano of such spectacular qualities, 00:18:07.260 |
So the context in which, I remember that essay, 00:18:12.020 |
sort of, this was from maybe the '90s or something, 00:18:20.580 |
'cause I was interested in these endless debates 00:18:40.420 |
played into different philosophical approaches 00:18:58.540 |
meaning the feeling that there's something apart 00:19:08.020 |
that whatever that is will survive death and continue, 00:19:15.460 |
Not all of them, not really, but most of them. 00:19:19.420 |
The thing I noticed is that the opposite of those, 00:19:36.340 |
with a remarkably similar chain of arguments, 00:19:45.620 |
and upload myself and I'll live forever. (laughs) 00:19:50.580 |
Yeah, that's like the implied thought, right? 00:20:03.580 |
who would appear to be opposites in character 00:20:24.620 |
about the brain has turned into medieval Christianity. 00:20:34.740 |
who will come back and zap you and all that stuff. 00:20:38.460 |
It's really turned into medieval Christianity 00:20:44.860 |
that the fear of death is the worm at the core, 00:21:02.200 |
- So you just moved across this vast cultural chasm 00:21:08.340 |
that separates me from most of my colleagues in a way, 00:21:13.220 |
and I can't answer what you just said on the level 00:21:26.220 |
There's just algorithms, we make them, we control them. 00:21:30.740 |
Now, this is something that rubs a lot of people 00:21:36.140 |
When I was young, my main mentor was Marvin Minsky, 00:21:47.020 |
He was the first person to have the idea at all, 00:21:58.620 |
And I'm like, "Yeah, I heard it in 1978, sure. 00:22:03.660 |
and Marvin and I used to argue all the time about this stuff 00:22:08.540 |
'cause I always rejected it, and of all of his, 00:22:19.780 |
but of all of his students and student-like people 00:22:27.540 |
who argued with him about this stuff in particular, 00:22:31.220 |
- Yeah, I would've loved to hear that conversation. 00:22:37.540 |
So the very last time I saw him, he was quite frail, 00:22:40.260 |
and I was in Boston, and I was going to the old house 00:23:01.540 |
and he looked up, and he said, "Are you ready to argue?" 00:23:16.220 |
The first thing to say is that nobody can claim 00:23:28.680 |
I think the whole idea of faith needs to be updated, 00:23:42.540 |
called the circle of empathy in my old papers, 00:23:58.260 |
might be conscious, might be deserving of your empathy, 00:24:12.120 |
- And that circle is fundamentally based on faith. 00:24:18.000 |
- The thing about the circle is it can't be pure faith. 00:24:48.640 |
on the bottom of society that they are always ready 00:24:56.280 |
we're always trying to shove somebody out of the circle. 00:25:05.840 |
And so, and the biggest questions are probably fetuses 00:25:21.560 |
people say, "But aren't you afraid if you exclude AI, 00:25:26.320 |
And then I would say, "Well, if you include AI, 00:25:35.800 |
"and so you're facing incompetence immediately." 00:25:38.760 |
So I really think we need to subordinate algorithms 00:25:43.600 |
- Your intuition, you speak about this brilliantly 00:25:57.760 |
but give you control and make your life better 00:26:08.520 |
in that way, they grow to become better people. 00:26:11.520 |
I don't understand why that's fundamentally not possible. 00:26:14.440 |
You're saying oftentimes you get into trouble 00:26:27.680 |
So Alan Watts once said, "Morality is like gravity, 00:26:34.400 |
"there can't be morality because at some point 00:26:36.360 |
"it all becomes relative and who are we anyway? 00:26:45.560 |
this is our frame, and morality is a very real thing. 00:26:48.760 |
At some point, you get into interstellar space 00:27:06.600 |
from a feeling of wanting to feel sort of separate from 00:27:10.360 |
and superior to other people or something like that. 00:27:12.920 |
There's an impulse behind it that I really have to reject. 00:27:21.000 |
- Okay, so I agree with you that a lot of technologies 00:27:36.240 |
and you can create a new kind of technologist, engineer, 00:27:41.160 |
that does build systems that respect humanity, 00:27:46.920 |
that have empathy for common humans, have compassion. 00:27:52.600 |
I think, yeah, I mean, I think musical instruments 00:28:05.760 |
My invention or design during the pandemic period 00:28:14.120 |
sort of in a classroom or a theater instead of in squares. 00:28:19.120 |
And it allows them to semi-consciously perform to each other 00:28:29.480 |
as if they're paying attention to each other non-verbally. 00:28:34.040 |
And so it promotes empathy so far as I can tell. 00:28:43.120 |
I would say it was born with Adam Smith's "Invisible Hand," 00:28:47.160 |
with this idea that we build this algorithmic thing 00:28:51.880 |
and then we think it must be smarter than us. 00:28:55.680 |
is absolutely everybody has some line they draw 00:29:09.240 |
It was sometimes smart and sometimes it failed. 00:29:11.600 |
And so if you really, people who really, really, really 00:29:16.600 |
wanna believe in the "Invisible Hand" is infinitely smart, 00:29:22.760 |
You have to recognize the economy as a subservient tool. 00:29:29.920 |
They might not when it's not to their advantage. 00:29:31.760 |
That's kind of an interesting game that happens. 00:29:34.280 |
But the thing is, it's just like that with our algorithms. 00:29:37.080 |
Like you can have a sort of a Chicago economic philosophy 00:29:49.120 |
- I think that there is a deep loneliness within all of us. 00:29:59.640 |
Like this is what you criticize social media for. 00:30:02.360 |
I think there's much better ways of doing social media 00:30:06.640 |
but instead leads to deeper connection between humans, 00:30:12.000 |
And what that requires is some agency on the part of AI 00:30:15.200 |
to be almost like a therapist, I mean, a companion. 00:30:22.080 |
It's not guiding you as if it's an all-knowing thing. 00:30:25.400 |
It's just another companion that you can leave at any time. 00:30:28.840 |
You have complete transparency and control over. 00:30:31.960 |
There's a lot of mechanisms that you can have 00:30:34.320 |
that are counter to how current social media operates 00:30:48.920 |
I think it's possible to create AI systems like that. 00:30:51.680 |
And I think they need, I mean, that's a technical discussion 00:31:11.560 |
how AI systems manipulate you within social networks. 00:31:19.680 |
It isn't necessarily that there's advertisements, 00:31:24.680 |
that social networks present you with advertisements 00:31:32.400 |
The biggest problem is they then manipulate you. 00:31:35.320 |
They alter your human nature to get you to buy stuff, 00:31:41.360 |
or to get you to do whatever the advertiser wants. 00:31:49.920 |
but we can work with that as an approximation. 00:31:53.240 |
- I think the actual thing is even sort of more ridiculous 00:31:58.240 |
- So my question is, let's not use the word AI, 00:32:14.280 |
but we have no choice but to turn into economists 00:32:19.880 |
But I've been around this thing since it started, 00:32:27.240 |
where the social media companies sell themselves 00:32:31.600 |
to the people who put the most money into them, 00:32:33.880 |
which are usually the big advertising holding companies 00:32:41.400 |
and maybe it's even been recognized as that by everybody, 00:32:49.760 |
'cause I think people have looked at their returns 00:32:53.200 |
and everybody recognizes it's not exactly right. 00:32:56.320 |
It's more like a cognitive access blackmail payment 00:33:03.560 |
Like, just to be connected, you're paying the money. 00:33:06.040 |
It's not so much that the persuasion algorithms. 00:33:15.840 |
but the thing is that once people are engaged, 00:33:27.080 |
- But they're still, it's a giant cognitive access 00:33:32.560 |
So, because the science behind the persuade part, 00:33:42.800 |
we play make believe that it works more than it does. 00:33:47.000 |
The damage doesn't come, honestly, as I've said in my books, 00:33:53.440 |
I actually think advertising can be demeaning and annoying 00:34:01.400 |
and take up a lot of our time with stupid stuff. 00:34:04.320 |
Like, there's a lot of ways to criticize advertising 00:34:14.000 |
I think advertising, at least as it was understood 00:34:17.160 |
before social media, helped bring people into modernity 00:34:20.360 |
in a way that actually did benefit people overall. 00:34:27.600 |
because I was saying you shouldn't manipulate people? 00:34:30.520 |
I mean, I'm not pretending to have this perfect airtight 00:34:35.520 |
I think there's a bit of a contradiction there, so. 00:34:39.400 |
I think advertising has, in some parts, benefited society 00:34:43.920 |
because it funded some efforts that perhaps benefited society. 00:34:46.720 |
- Yeah, I mean, I think there's a thing where, 00:34:51.120 |
sometimes I think it's actually been of some use. 00:34:53.960 |
Now, where the damage comes is a different thing, though. 00:35:06.880 |
they have to see if you respond to the stimulus. 00:35:09.080 |
Now, the problem is that the measurement mechanism 00:35:12.560 |
for telling if you respond in the engagement feedback loop 00:35:19.600 |
or occasionally if you're staring at the screen more, 00:35:21.680 |
if there's a forward-facing camera that's activated, 00:35:29.000 |
And so it's crude enough that it only catches 00:35:32.680 |
sort of the more dramatic responses from you, 00:35:37.760 |
Those are the things where you get scared or pissed off 00:35:48.080 |
these fast response, old, old, old evolutionary business 00:35:54.160 |
circuits that we have that are helpful in survival 00:36:07.560 |
intrinsically, totally aside from whatever the topic is, 00:36:11.080 |
you start to get incrementally just a little bit 00:36:16.920 |
you get a little stupid and you become a jerk. 00:36:22.440 |
It's not like everybody's instantly transformed, 00:36:28.040 |
where people who get hooked kind of get drawn 00:36:30.320 |
more and more into this pattern of being at their worst. 00:36:40.360 |
and say, "I am less happy with who I am now," 00:36:48.800 |
Are they able to self-reflect when you take yourself 00:36:54.160 |
I wrote a book about people, suggesting people 00:36:56.720 |
take a break from their social media to see what happens 00:36:58.920 |
and maybe even, actually the title of the book 00:37:06.480 |
Although I always said, "I don't know that you should. 00:37:08.480 |
"I can give you the arguments, it's up to you." 00:37:11.800 |
But I get, I don't have a social media account, obviously, 00:37:15.600 |
and it's not that easy for people to reach me. 00:37:18.920 |
They have to search out an old-fashioned email address 00:37:26.160 |
And even with that, I get this huge flood of mail 00:37:28.720 |
from people who say, "Oh, I quit my social media. 00:37:33.280 |
But the thing is, what's for me a huge flood of mail 00:37:39.960 |
And so I think it's rare for somebody to look at themselves 00:37:44.560 |
and say, "Oh boy, I sure screwed myself over." 00:37:54.320 |
is it possible to design social media systems 00:38:08.360 |
I think what you should do is design a system 00:38:24.440 |
What does that actually mean engineering-wise? 00:38:31.680 |
Algorithms don't understand enough to recommend. 00:38:46.960 |
Our office is funding GPT-3 and all these things 00:38:57.440 |
I mean, it still is statistical emergent pseudo-semantics. 00:39:39.760 |
- Have you done-- - But I'm still in control. 00:39:42.320 |
Have you done the experiment of letting YouTube 00:39:46.480 |
either starting from an absolutely anonymous random place 00:39:57.560 |
the top video recommendation and then just go 20 hops? 00:40:20.240 |
the majority of times, after about 17 or 18 hops, 00:40:23.520 |
you end up in really weird, paranoid, bizarre territory. 00:40:46.360 |
that promotes xenophobia, promotes fear, anger, 00:40:50.120 |
promotes selfishness, promotes separation between people. 00:41:02.200 |
I'd like to do a large citizen science thing sometime 00:41:05.560 |
and do it, but then I think the problem with that 00:41:07.320 |
is YouTube would detect it and then change it. 00:41:24.040 |
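For readers who want to try the recommendation-hop experiment described above, here is a minimal Python sketch of the procedure only. Everything in it is hypothetical scaffolding: the fetcher is passed in as a parameter and the toy graph stands in for YouTube, since the conversation doesn't specify any particular API for reading the top recommendation.

```python
# Minimal sketch of the "start somewhere, take the top recommendation,
# repeat for ~20 hops" experiment discussed above. The fetcher is a
# parameter; the toy graph below is purely hypothetical stand-in data.
from typing import Callable

def recommendation_walk(seed: str,
                        top_recommendation: Callable[[str], str],
                        hops: int = 20) -> list[str]:
    """Follow the top recommendation `hops` times and return the chain visited."""
    chain = [seed]
    for _ in range(hops):
        chain.append(top_recommendation(chain[-1]))
    return chain

if __name__ == "__main__":
    # Hypothetical stand-in for a real recommender: a tiny fixed graph.
    toy_graph = {
        "cat_video": "cute_compilation",
        "cute_compilation": "mild_rant",
        "mild_rant": "angry_rant",
        "angry_rant": "conspiracy_video",
        "conspiracy_video": "conspiracy_video",  # absorbing state
    }
    walk = recommendation_walk("cat_video", lambda v: toy_graph.get(v, v), hops=6)
    print(" -> ".join(walk))
```

Passing the fetcher in as a callable keeps the bookkeeping of the walk separate from however the recommendations would actually be obtained.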
of what healthy conversations actually look like, 00:41:27.040 |
and how do you incentivize those healthy conversations? 00:41:49.880 |
maybe not things that are officially social media, 00:41:54.040 |
and see which ones have more healthy conversations, 00:42:07.760 |
- Yeah, one that I've been really interested in is GitHub, 00:42:10.840 |
'cause it could change, I'm not saying it'll always be, 00:42:17.920 |
GitHub has had a relatively quite low poison quotient, 00:42:22.280 |
and I think there's a few things about GitHub 00:42:26.640 |
One thing about it is that people have a stake in it. 00:42:31.880 |
There's actual code, or there's actual stuff being done, 00:42:38.280 |
you have a motivation to not screw up that thing, 00:42:51.560 |
from dumping on somebody's TikTok or something. 00:42:56.960 |
but you have to kind of get decent with people 00:43:00.760 |
when you have a shared stake, a little secret, 00:43:08.640 |
but I'm tempted to play the journey back at you, 00:43:22.520 |
to manipulate you because you wanna preserve the stake. 00:43:33.000 |
I wrote a book about an earlier version of it 00:43:44.280 |
- Let me do the fastest version of this I can do. 00:43:56.880 |
and future two is gonna be data dignity, okay? 00:44:06.960 |
is that as the climate changes, we might burn down 00:44:11.480 |
Like, it's dangerous, you know, and it didn't used to be. 00:44:26.960 |
That's all nice, but there's this other middle layer, 00:44:40.520 |
cooperate with each other to make sure trees don't touch 00:44:46.640 |
They have this whole little web that's keeping us safe. 00:44:52.480 |
'cause they were out there during the pandemic. 00:44:54.320 |
And so I'd try to just see who are these people? 00:44:56.800 |
Who are these people who are keeping us alive? 00:44:59.320 |
Now, I wanna talk about the two different fates 00:45:01.480 |
for those people under Future One and Future Two. 00:45:04.880 |
Future One, some weird, like, kindergarten paint job van 00:45:09.880 |
with all these, like, cameras and weird things, 00:45:23.160 |
And the robots are good, and they can scale to more land. 00:45:28.480 |
but then there are all these people out of work. 00:45:39.040 |
My problem with that solution is every time in history 00:45:47.280 |
because it's too centralized and it gets seized. 00:45:49.560 |
That's happened to every communist experiment I can find. 00:46:06.800 |
And you'll say, "Oh, no, an algorithm can do it." 00:46:18.160 |
- 60-something people own a quarter of all the Bitcoin. 00:46:22.440 |
Like the things that we think are decentralized 00:46:27.840 |
Future two, the gardeners see that van with all the cameras 00:46:39.000 |
And amazingly, California has a little baby data union law 00:46:48.520 |
and they say, "We're gonna form a data union, 00:46:52.600 |
and we're gonna, not only are we gonna sell our data 00:46:55.880 |
to this place, but we're gonna make it better 00:46:57.280 |
than it would have been if they were just grabbing it 00:47:11.760 |
there's two things that are different about that world 00:47:15.680 |
One thing, of course, the people have more pride. 00:47:17.680 |
They have more sense of ownership, of agency, 00:47:30.040 |
like, we'll figure out how to keep the neighborhood 00:47:31.640 |
from burning down, you have this whole creative community 00:47:44.360 |
with spiral pumpkin patches and waves of cultural things. 00:47:51.680 |
about climate change mitigation with how we do this. 00:48:00.560 |
there'll be this whole creative community on the case. 00:48:03.400 |
And isn't it nicer to have a high-tech future 00:48:12.600 |
future one and future two have the same robots 00:48:21.720 |
And that second future, future two, that's data dignity. 00:48:21.720 |
- Yeah, you know, I mean, I think you can believe in AI 00:48:51.480 |
- Yeah, I think what should happen is in the future, 00:49:04.000 |
And those data unions should smooth out the results 00:49:08.640 |
But at the same time, and people have to pay for it too. 00:49:11.680 |
They have to pay for Facebook the way they pay for Netflix 00:49:40.920 |
the outcomes for people form a normal distribution, 00:49:47.000 |
who do really well, a lot of people who do okay. 00:49:51.480 |
reflecting more and more creativity and expertise 00:50:02.920 |
So it gradually expands the economy and lifts all boats. 00:50:05.600 |
And the society has to support the lower wing 00:50:09.400 |
of the bell curve too, but not universal basic income. 00:50:22.840 |
But see what I believe, I'm not gonna talk about AI, 00:50:31.160 |
And so I don't think everybody's gonna be supplying data 00:50:36.160 |
nor do I think everybody's gonna make their living 00:50:38.880 |
I think in both cases, there'll be a rather small contingent 00:50:42.840 |
that do well enough at either of those things. 00:51:01.640 |
- Do you think it's possible to create a social network 00:51:13.120 |
so I gotta tell you how to get from what I'm talking, 00:51:16.600 |
how to get from where we are to anything kind of in the zone 00:51:44.240 |
And he's under enormous business pressure too. 00:51:46.520 |
- So Jack Dorsey to me is a fascinating study 00:51:49.640 |
because I think his mind is in a lot of good places. 00:52:04.240 |
is that if you just wanna look at the human side, 00:52:11.480 |
Almost all of the social media platforms that get big 00:52:18.040 |
where they're actually kind of sweet and cute. 00:52:20.320 |
Like if you look at the early years of Twitter, 00:52:34.080 |
when it gets big enough that it's the algorithm running it, 00:52:36.920 |
then you start to see the rise of the paranoid style 00:52:40.760 |
And we've seen that shift in TikTok rather recently. 00:52:43.800 |
- But I feel like that scaling reveals the flaws 00:52:54.560 |
No, because I have hope for the world with humans 00:52:59.560 |
and I have hope for a lot of things that humans create, 00:53:08.960 |
that incentivize different things than the current. 00:53:18.040 |
that was invented like 20 years ago, however long. 00:53:21.720 |
And it just works and so nobody's changing it. 00:53:24.080 |
I just think that there could be a lot of innovation 00:53:26.600 |
for more, see, you kind of push back this idea 00:53:29.480 |
that we can't know what long-term growth or happiness is. 00:53:35.600 |
to define what their long-term happiness and goals are, 00:53:55.240 |
to make their living doing TikTok dance videos, 00:54:10.920 |
So the future is like a cross between TikTok and GitHub 00:54:18.280 |
They're negotiating, they're negotiating for returns. 00:54:23.680 |
in order to soften the blow of the randomness 00:54:37.800 |
in the course of their lives, or maybe even 10,000. 00:54:44.200 |
distributed portfolios of different data unions 00:54:47.880 |
And some of them might just trickle in a little money 00:54:57.000 |
They'll find their way to the right GitHub-like community 00:55:09.880 |
into the algorithms and the robots of the future. 00:55:33.080 |
There's not some communist person deciding who's valuable. 00:55:47.640 |
which is definitely how you get the lizard brain. 00:55:59.280 |
So yeah, I'll tell you how I think to fix social media. 00:56:05.480 |
So one, I think people should have complete control 00:56:08.000 |
over their data and transparency of what that data is 00:56:11.720 |
and how it's being used if they do hand over the control. 00:56:14.760 |
Another thing, they should be able to delete, 00:56:16.520 |
walk away with their data at any moment, easy, 00:56:19.720 |
like with a single click of a button, maybe two buttons. 00:56:28.120 |
individualized control of the algorithm for them. 00:56:34.880 |
They get to be the decider of what they see in this world. 00:56:39.120 |
And to me, that's, I guess, fundamentally decentralized 00:56:50.120 |
over Twitter of today, over Facebook of today, 00:57:03.000 |
In this case, you have full control of the algorithms 00:57:10.400 |
But you can also say F you to those algorithms 00:57:12.880 |
and just consume the raw, beautiful waterfall 00:57:19.440 |
I think that, to me, that's not only fixes social media, 00:57:33.840 |
I think you can make more money by giving people control. 00:57:48.600 |
which is making a future that benefits programmers 00:57:55.280 |
So years ago, I co-founded an advisory board for the EU 00:57:59.800 |
with a guy named Giovanni Buttarelli who passed away. 00:58:02.120 |
It's one of the reasons I wanted to mention it. 00:58:12.040 |
So he was like this intense guy who was like, 00:58:22.200 |
let's make it all about transparency and consent. 00:58:24.240 |
And it was one of the threads that led to this huge 00:58:26.960 |
data privacy and protection framework in Europe 00:58:34.000 |
And so therefore, we've been able to have empirical feedback 00:58:39.120 |
And the problem is that most people actually get stymied 00:58:44.000 |
by the complexity of that kind of management. 00:58:51.520 |
I can go in and I can figure out what's going on. 00:58:56.800 |
And so there's a problem that it differentially benefits 00:59:09.000 |
I kind of still want to come back to incentives. 00:59:15.080 |
if the commercial incentive is to help the creative people 00:59:26.720 |
I'm just saying that it's not only programmers. 00:59:30.840 |
- So, yeah, you have to make sure the incentives are right. 00:59:35.640 |
I mean, I like control is an interface problem 00:59:48.120 |
I mean, there's, I don't know, Creative Commons, 01:00:01.880 |
in the way that you can truly, simply understand. 01:00:07.880 |
In the same way, it should be very simple to understand 01:00:17.600 |
But then you're arguing that in order for that to happen, 01:00:22.480 |
- I mean, a lot of the reason that money works 01:00:26.640 |
is actually information hiding and information loss. 01:00:40.400 |
if you wanna give the most charitable interpretation possible 01:00:43.520 |
to the invisible hand, is what he was saying, 01:00:46.000 |
is that there's this whole complicated thing, 01:00:48.600 |
and not only do you not need to know about it, 01:00:50.640 |
the truth is you'd never be able to follow it if you tried. 01:00:57.720 |
And that, in a sense, every transaction's like a neuron 01:01:02.680 |
If he'd had that metaphor, he would have used it. 01:01:13.680 |
that reduce complexity for people can be made to work. 01:01:22.000 |
can you do it in a way that's not manipulative? 01:01:24.480 |
And I would say a GitHub-like, if you just have this vision, 01:01:34.400 |
I really think it is. - I'm not gonna be able 01:02:00.160 |
but I was just using the cat as this exemplar 01:02:03.000 |
of what we're talking about, so I don't know. 01:02:06.280 |
there's all this cattiness where people are like, 01:02:13.640 |
and kind of saying, okay, we're gonna work on this move, 01:02:16.560 |
we're gonna get a better, can we get a better musician? 01:02:22.040 |
that's kind of off the books right now, you know? 01:02:29.480 |
- Well, that's where the invention of Git, period, 01:02:32.040 |
the versioning is brilliant, and so some of the things 01:02:35.600 |
you're talking about, technology, algorithms, 01:02:41.800 |
for humans to connect, to collaborate, and so on. 01:02:50.800 |
- No, no, can we, can I ask you to elaborate, 01:02:53.920 |
'cause my intuition was that you would be a supporter 01:02:57.000 |
of something like cryptocurrency and Bitcoin, 01:02:59.360 |
because it fundamentally emphasizes decentralization. 01:03:05.760 |
- Yeah, okay, look-- - Your thoughts on Bitcoin. 01:03:12.960 |
I've been advocating some kind of digital currency 01:03:19.560 |
when Bitcoin came out, and the original paper on blockchain, 01:03:29.440 |
oh my God, we're applying all of this fancy thought, 01:03:32.800 |
and all these very careful distributed security measures 01:03:38.600 |
Like, it's just so retro, it's so dysfunctional, 01:03:42.080 |
it's so useless from an economic point of view, 01:03:46.440 |
is using computational inefficiency at a boundless scale 01:03:50.240 |
as your form of security is a crime against the atmosphere, 01:03:57.720 |
Like, the thing is, when the first paper came out, 01:03:59.800 |
I remember a lot of people saying, oh my God, 01:04:01.520 |
this thing scales, it's a carbon disaster, you know? 01:04:09.320 |
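As a concrete illustration of the "computational inefficiency as security" point above, here is a minimal hashcash-style proof-of-work sketch, the general mechanism Bitcoin mining is built on: validity is demonstrated by burning compute to find a nonce whose hash clears a difficulty target. This is a toy sketch of the idea only, not Bitcoin's actual block format or difficulty rules.

```python
# Toy hashcash-style proof of work: "security" is literally the expected
# amount of computation spent searching for a nonce whose SHA-256 hash
# has a required number of leading zero bits. Each extra difficulty bit
# roughly doubles the expected work (and thus the energy used).
import hashlib

def proof_of_work(block_data: bytes, difficulty_bits: int) -> int:
    """Return a nonce whose hash of (block_data + nonce) clears the target."""
    target = 1 << (256 - difficulty_bits)  # hashes below this value are valid
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

if __name__ == "__main__":
    data = b"toy block header"
    for bits in (8, 12, 16, 20):
        nonce = proof_of_work(data, bits)
        print(f"{bits:2d} leading zero bits -> nonce {nonce} "
              f"(~{2**bits:,} hashes expected)")
```

The point of the sketch is just that the difficulty knob buys security only by making the search more wasteful, which is the scaling property being criticized here.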
but that's a different question than when you asked, 01:04:17.400 |
that's of a benefit, and absolutely, like I'm, 01:04:27.640 |
- Okay, so like, there are people in the community 01:04:31.040 |
and trying to figure out how to do this better. 01:04:36.160 |
like, government centralized, it's hard to control. 01:04:39.320 |
And then the other one, to fix some of the issues 01:04:41.400 |
that you're referring to, I'm sort of playing 01:04:55.240 |
not on the blockchain, but outside the blockchain, 01:05:03.360 |
- So Bitcoin's not new, it's been around for a while. 01:05:05.920 |
I've been watching it closely, I've not seen one example 01:05:18.520 |
let's say government earned that wrath honestly. 01:05:25.160 |
that governments have done in recent decades, 01:05:33.440 |
in the US government decided to bomb and landmine 01:05:37.480 |
Southeast Asia, it's hard to come back and say, 01:05:41.840 |
But then the problem is that this resistance to government 01:05:54.320 |
It's a way of not having obligations to others. 01:05:56.960 |
And that ultimately is a very suspect motivation. 01:06:04.240 |
that the government should not overreach its power is flawed? 01:06:12.160 |
is to replace the word government with politics. 01:06:15.560 |
Like, our politics is people having to deal with each other. 01:06:19.840 |
My theory about freedom is that the only authentic 01:06:26.560 |
All right, so annoyance means you're actually dealing 01:06:31.880 |
Perpetual means that that annoyance is survivable, 01:06:36.240 |
So if you have perpetual annoyance, then you have freedom. 01:06:42.920 |
something's gone very wrong, and you've suppressed 01:06:51.080 |
I'll invite you to a Berkeley City Council meeting 01:07:03.560 |
If you're not, you're trapped in some temporary illusion 01:07:07.840 |
Now, this quest to avoid government is really a quest 01:07:11.360 |
to avoid that political feeling, but you have to have it. 01:07:15.560 |
And it sucks, but that's the human situation. 01:07:20.680 |
And this idea that we're gonna have this abstract thing 01:07:22.920 |
that protects us from having to deal with each other 01:07:26.760 |
- The idea, and I apologize, I overstretched the use 01:07:32.320 |
The idea is there should be some punishment from the people 01:07:41.120 |
or a particular leader, like in an authoritarian regime, 01:07:44.580 |
which more than half the world currently lives under, 01:07:47.280 |
if they become, they stop representing the people, 01:07:56.680 |
and starts being more like a dictatorial kind of situation. 01:08:15.520 |
- Yeah, but people, see this idea that the problem 01:08:18.360 |
is always the government being powerful is false. 01:08:30.440 |
The problem can be infrastructure that fails. 01:08:51.400 |
There are all these other problems that are different 01:08:54.660 |
Like you have to keep some sense of perspective 01:08:57.000 |
and not be obsessed with only one kind of problem 01:09:05.320 |
So like some groups of people, like governments or gangs 01:09:10.200 |
or companies lead to problems more than others. 01:09:14.440 |
- Has the government ever really been a problem for you? 01:09:17.400 |
So first of all, I grew up in the Soviet Union 01:09:28.900 |
I would say that that's a really complicated question, 01:09:32.820 |
especially because the United States is such, 01:09:35.480 |
it's a special place like a lot of other countries. 01:09:58.000 |
- I would say, 'cause you did a little trick of, 01:10:03.720 |
to the United States to talk about government. 01:10:06.260 |
So I believe, unlike my friend, Michael Malice, 01:10:12.080 |
I believe government can do a lot of good in the world. 01:10:19.720 |
The thing that Bitcoin folks and cryptocurrency folks argue 01:10:26.840 |
is a centralized bank, like control the money. 01:10:38.640 |
And so what they argue is this is one way to go around 01:11:10.200 |
to prop up corrupt, murderous, horrible regimes 01:11:20.580 |
of cryptocurrency folks, what they would tell me, right? 01:11:30.860 |
- There's no way to know the answer perfectly. 01:11:32.740 |
- However, I gotta say, if you look at people 01:11:53.860 |
There's evidence that a lot of them are quite, 01:11:57.720 |
not the people you'd wanna support, let's say. 01:12:16.160 |
than new villains, or even villainous governments 01:12:18.960 |
will be empowered, there's no basis for that assertion. 01:12:24.600 |
And I think in general, Bitcoin ownership is one thing, 01:12:33.600 |
to support criminality more than productivity. 01:12:37.440 |
- Of course, they would argue that was the story 01:12:39.680 |
of its early days, that now more and more Bitcoin 01:12:52.320 |
I think what's happening is people are using it 01:13:02.760 |
for this and that, they buy a Tesla with it or something. 01:13:05.600 |
Investing in a startup, hard, it might have happened 01:13:11.040 |
a little bit, but it's not an engine of productivity, 01:13:28.200 |
I am anti the idea of economics wiping out politics 01:13:42.480 |
- In some ways, there's parallels to our discussion 01:13:44.800 |
of algorithms and cryptocurrency is you're pro the idea, 01:13:54.440 |
you can be used poorly by aforementioned humans. 01:13:59.360 |
- Well, I think that you can make better designs 01:14:03.520 |
And I think, and you know, the thing about cryptocurrency 01:14:06.320 |
that's so interesting is how many of us are responsible 01:14:11.320 |
for the poor designs because we're all so hooked 01:14:17.280 |
I'm gonna be the one who gets the viral benefit. 01:14:20.040 |
You know, way back when all this stuff was starting, 01:14:24.840 |
somebody had the idea of using viral as a metaphor 01:14:33.680 |
that it always created distortions that ruined 01:14:42.520 |
Like, but then somehow, even after the pandemic, 01:14:47.160 |
'cause we imagine ourselves as the virus, right? 01:14:54.560 |
- There is a sense because money is involved, 01:15:01.560 |
because they want to be part of that first viral wave 01:15:07.640 |
And that blinds people from their basic morality. 01:15:13.440 |
I don't, I sort of feel like I should respect 01:15:16.480 |
but some of the initial people who started Bitcoin, 01:15:26.580 |
Like, the early people have more than the later people. 01:15:31.820 |
the more you're subject to gambling-like dynamics 01:15:36.160 |
and more and more subject to weird network effects 01:15:37.920 |
and whatnot, unless you're a very small player, perhaps, 01:15:43.080 |
But even then you'll be subject to fluctuations 01:15:49.080 |
it's going to wave around the little people more. 01:15:51.800 |
And I remember the conversation turned to gambling 01:15:55.360 |
because gambling is a pretty large economic sector. 01:15:58.200 |
And it's always struck me as being non-productive. 01:16:01.560 |
Like somebody goes to Las Vegas, they lose money. 01:16:03.800 |
And so one argument is, well, they got entertainment. 01:16:07.040 |
They paid for entertainment as they lost money. 01:16:17.640 |
The argument that was made to me was different from that. 01:16:21.360 |
is they're getting a chance to experience hope. 01:16:25.520 |
And so that's really worth it, even if they're going to lose. 01:16:33.960 |
- That's so heartbreaking 'cause I've seen it. 01:16:38.160 |
But I've seen that, I have that a little bit of a sense. 01:16:52.000 |
that you've gotten hope from, so much is invested. 01:17:04.960 |
not going to be a source of that deep meaning. 01:17:16.160 |
- Yeah, you've just described the psychology of virality 01:17:21.160 |
or the psychology of trying to base a civilization 01:17:25.720 |
on semi-random occurrences of network effect peaks. 01:17:32.240 |
I mean, I think we need to get away from that. 01:17:36.080 |
and except Microsoft, which deserves every penny, 01:17:43.880 |
I think what Microsoft did with GitHub was brilliant. 01:17:51.160 |
but on Microsoft, 'cause they recently purchased Bethesda. 01:18:03.880 |
- Yeah, well, look, I'm not speaking for Microsoft. 01:18:23.720 |
Like, so, you know, we have, it's kind of extraordinary, 01:18:36.800 |
to see how things work from the inside of some big thing. 01:18:39.920 |
And you know, it's always just people kind of, 01:18:51.560 |
- And there's some good people, there's some bad people. 01:18:53.600 |
- I hope Microsoft doesn't screw up your game. 01:19:04.120 |
- Well, that's why, this is why I think you're wrong. 01:19:16.920 |
Cortana, Cortana, would Cortana do it for you? 01:19:24.920 |
- There's a woman in Seattle who's like the model 01:19:38.200 |
I think, I don't think you should turn a software 01:19:51.920 |
- You co-authored a paper, you mentioned Lee Smolin, 01:20:05.320 |
That's a trippy and beautiful and powerful idea. 01:20:08.320 |
What are, what would you say are the key ideas 01:20:22.320 |
that's not published that I'm quite excited about. 01:20:29.120 |
So I have to try to be a little careful about that. 01:20:33.680 |
We can think about it in a few different ways. 01:20:35.960 |
The core of the paper, the technical core of it 01:20:49.560 |
The part that was established was, of course, 01:20:57.000 |
The part that was fresher is understanding those 01:21:04.760 |
between these different ways of describing systems. 01:21:12.520 |
because, well, theoretical physics is really hard, 01:21:17.480 |
and a lot of programs have kind of run into a state 01:21:36.680 |
- As we start to publish more about where it's gone, 01:21:58.640 |
But then there's also the storytelling part of it. 01:22:15.080 |
is that there's some kind of starting condition, 01:22:22.440 |
And the question is like, why the starting condition? 01:22:26.240 |
Like how, oh, the starting condition has to get kind of, 01:22:31.240 |
it has to be fine-tuned and all these things about it 01:22:40.720 |
about where the universe comes from much further back 01:22:42.960 |
by starting with really simple things that evolve, 01:22:51.200 |
And so we've been exploring a variety of ways 01:23:09.560 |
to have a radical quality in the physics world, 01:23:13.760 |
but he still is like, no, this is gonna be like 01:23:19.800 |
in which evolution happens is the same time we're in now. 01:23:22.320 |
And we're talking about something that starts 01:23:26.040 |
well, what if there's some other kind of time 01:23:28.320 |
that's time-like, and it sounds like metaphysics, 01:23:38.760 |
a lot of the math can be thought of either way, 01:23:44.000 |
- So push it so far back that basically all the things 01:23:46.520 |
we take for granted in physics start becoming emergent. 01:23:49.600 |
- I really wanna emphasize, this is all super baby steps. 01:23:54.440 |
It's like, I think a lot of the things we're doing, 01:24:02.240 |
There's been a zillion papers about how you can think 01:24:06.160 |
or how you can think of different ideas in physics 01:24:09.080 |
as being quite similar to, or even equivalent to, 01:24:21.840 |
Like, there's probably two or three dozen papers 01:24:25.880 |
that have this quality, and some of them are just crazy good. 01:24:30.600 |
What we're trying to do is take those kinds of observations 01:24:38.760 |
with landscapes of theories that you couldn't do before, 01:24:47.920 |
How unlikely are we, this intelligent civilization? 01:25:20.280 |
- That's a feature, not a bug, I think, the weirdness. 01:25:25.880 |
I think if I'm just gonna answer you in terms of truth, 01:25:40.800 |
at least as yet, to really know much about who we are, 01:25:59.440 |
or charming like that first year of TikTok or something. 01:26:07.480 |
There's another level at which I can think about it 01:26:12.120 |
where I sometimes think that if you're a person 01:26:17.120 |
that if you are just quiet and you do something 01:26:22.120 |
that gets you in touch with the way reality happens, 01:26:32.960 |
and it feels like there's a lot more going on in it, 01:26:36.160 |
and there is a lot more life and a lot more stuff happening 01:26:43.000 |
This is kind of a more my artist side talking, 01:26:56.600 |
What do you, it sounds like you might be at least in part. 01:27:01.560 |
- There's a passage in Kerouac's book, "Dr. Sax," 01:27:05.640 |
where somebody tries to just explain the whole situation 01:27:12.000 |
but it's like, yeah, like there are these bulbous things 01:27:16.520 |
You can sort of understand them, but only kind of, 01:27:18.360 |
and then there's like this, and it's just like this amazing, 01:27:20.560 |
like just really quick, like if some spirit being 01:27:24.600 |
or something was gonna show up in our reality 01:27:36.560 |
in "Hitchhiker's Guide to the Galaxy," right, 01:27:48.960 |
- Yeah, music is something that just towers above me. 01:28:02.240 |
Like you can say, oh, it's a thing people evolved 01:28:11.920 |
There's all these things you can say about music, 01:28:13.840 |
which are, you know, some of that's probably true. 01:28:31.320 |
music can have like this kind of substantiality to it 01:28:52.560 |
Anything I can understand, read through in language. 01:29:04.960 |
And I've never ever, I've read across a lot of explanations 01:29:16.480 |
between people or between people and how they perceive 01:29:34.120 |
which is that everything in the universe is conscious. 01:29:39.600 |
makes me be humble in how much or how little I understand 01:29:50.560 |
- Most people interested in theoretical physics 01:30:00.200 |
I still think there's this pragmatic imperative 01:30:14.920 |
- Yeah, I'm not quite sure where to draw the line 01:30:19.280 |
or why the lines there or anything like that, 01:30:22.840 |
to all the same questions are equally mysterious 01:30:30.480 |
But if you listen to anyone trying to explain 01:30:38.600 |
either believing in souls or some special thing 01:30:42.360 |
you pretty much say, screw this, I'm gonna be a panpsychist. 01:30:55.880 |
that were defining in the way that you hope others, 01:31:02.560 |
the moments that defined me were not the good ones. 01:31:06.280 |
The moments that defined me were often horrible. 01:31:09.560 |
I've had successes, but if you ask what defined me, 01:31:19.360 |
being under the World Trade Center and the attack, 01:31:30.760 |
the ones that defined me the most were sort of real-world terrible things, 01:31:37.580 |
And this is the thing that's hard about giving advice 01:31:42.000 |
to young people that they have to learn their own lessons 01:32:03.000 |
that has a bit of a fatalistic quality to it, 01:32:30.560 |
a little grace period of naivety that's pleasant. 01:32:45.480 |
okay, so let me try a little bit on this advice thing. 01:32:52.460 |
and any serious broad advice will have been given 01:32:56.400 |
a thousand times before for a thousand years. 01:33:29.280 |
whatever you wish, not to escape it entirely, 01:33:44.120 |
Believing in experience as a real thing is very dualistic. 01:33:52.680 |
and instead of squirting the magic dust on the programs, 01:34:00.360 |
- Your own personal experience that you just have, 01:34:06.140 |
silence the rest of the world enough to hear that, 01:34:08.260 |
like whatever that magic dust is in that experience. 01:34:30.020 |
and that it'll take you a while to have the skills, 01:35:35.600 |
And it's hard, 'cause right when it's happening, 01:35:39.220 |
And if I was actually giving this advice to my daughter, 01:35:56.640 |
that really wants to sit and listen to my voice. 01:36:13.880 |
- Yeah, I mean, I still connect to her through music. 01:36:17.680 |
She was a young prodigy piano player in Vienna. 01:36:27.960 |
and then died in a car accident here in the US. 01:36:40.680 |
so she had the whole Viennese music thing going, 01:36:51.440 |
of absolute skill and romance bundled together 01:36:58.920 |
I learned to play some of the Beethoven sonatas for her 01:37:01.920 |
and I played them in this exaggerated drippy way, 01:37:16.320 |
I mean, the fashion these days is to be slightly Apollonian 01:37:22.800 |
that actual Beethoven playing might've been different. 01:37:26.840 |
I've gotten to play a few instruments he played 01:37:35.080 |
- I was always against the clinical precision 01:37:39.400 |
I thought a great piano player should be like in pain, 01:37:58.360 |
- Maybe play classical music the way, I don't know, 01:38:17.080 |
I think the blues, the whole African-American tradition 01:38:21.680 |
was initially surviving awful, awful circumstances. 01:38:29.560 |
And it's not that Beethoven's circumstances were brilliant, 01:38:34.600 |
but he kind of also, I don't know, this is hard. 01:38:42.520 |
was somewhat self-imposed maybe through, I don't know. 01:38:47.240 |
I've known some people who loathed Beethoven, 01:38:54.040 |
I played in her band for a while and she was like, 01:39:00.400 |
"It completely, it turns emotion into your enemy. 01:39:05.400 |
"And it's ultimately all about your own self-importance, 01:39:20.000 |
but I'm just saying like her position on Beethoven 01:39:31.680 |
So it's a little hard for me to answer that question. 01:39:34.760 |
But it was interesting 'cause I'd always thought 01:39:36.000 |
of Beethoven as like, "Whoa, this is like Beethoven 01:39:43.320 |
Beethoven, Schmadovan, it's like not really happening. 01:40:05.520 |
One of the musicians I play with is Jon Batiste, 01:40:05.520 |
He's like, "Hey, do you have the book of Beethoven's sonatas?" 01:40:30.280 |
I say, "Yeah, I wanna find one I haven't played." 01:40:31.720 |
And he sight read through the whole damn thing perfectly. 01:40:34.080 |
And I'm like, "Oh God, I just need to get out of here. 01:40:44.840 |
of with the same persona and the same philosophy 01:41:03.320 |
Like he's like, "Oh yeah, here, he's doing this." 01:41:16.720 |
that was really kind of oppressed and hard to deal with. 01:41:21.400 |
- He's playing James P. Johnson or something. 01:41:53.600 |
- You know, what's funny is I used to not be able to, 01:41:56.240 |
but as you get older, you just know people who die 01:41:59.320 |
and it just becomes familiar and more ordinary, 01:42:11.840 |
And it's not like I didn't have some kind of insight 01:42:20.240 |
I think I just, like I say, it's kind of familiarity. 01:42:55.360 |
that optimism has to also come with its own like humility. 01:42:58.680 |
You have to make yourself small to believe in the future. 01:43:02.040 |
And so it actually in a funny way comforts me. 01:43:08.600 |
And optimism requires you to kind of step down after time. 01:43:15.600 |
- Yeah, I mean, that said, life seems kind of short, 01:43:22.720 |
- I've tried to find, I can't find the complaint department. 01:43:24.760 |
You know, I really want to, I want to bring this up, 01:43:27.080 |
but the customer service number never answers 01:43:32.120 |
- Do you think there's meaning to it, to life? 01:43:38.320 |
Like we say all these things as if we know what they mean, 01:43:40.480 |
but meaning, we don't know what we mean when we say meaning. 01:43:48.520 |
I think it ultimately connects to that sense of experience 01:43:54.720 |
- I guess there are why, like if you look up to the stars 01:43:59.040 |
and you experience that awe inspiring, like joy, 01:44:06.320 |
I don't know why, for me, that kind of makes me feel 01:44:06.320 |
- Well, just because, I don't know what meaning is. 01:45:01.720 |
I grew up in Southern New Mexico and the stars were so vivid. 01:45:15.960 |
One of our near neighbors was the head of optics research 01:45:22.000 |
he discovered Pluto, his name was Clyde Tombaugh. 01:45:28.960 |
And my dad had also made telescopes when he was a kid, 01:45:37.320 |
I mean, he really, he did his telescopes, you know? 01:45:39.960 |
And so I remember he'd let me go and play with them 01:45:47.760 |
and with a good telescope, it's really like this object. 01:45:50.120 |
Like you can really tell this isn't coming through 01:45:59.680 |
And you have even a feeling for the vastness of it. 01:46:07.640 |
I was very, very fortunate to have a connection 01:46:28.680 |
I took my daughter and her friends to a telescope. 01:46:31.960 |
There are a few around here that our kids can go and use 01:46:34.680 |
and they would look at Jupiter's moons or something. 01:46:52.200 |
And maybe it's too confusable with the screen. 01:47:04.520 |
that if humans, early humans weren't able to see the stars, 01:47:08.680 |
like if Earth's atmosphere was such that it was cloudy, 01:47:12.080 |
that we would not develop human civilization. 01:47:14.920 |
There's something about being able to look up 01:47:20.600 |
that's fundamental to the development of human civilization. 01:47:23.840 |
I thought that was a curious kind of thought. 01:47:26.720 |
- That reminds me of that old Isaac Asimov story 01:47:30.280 |
where there's this planet where they finally get to see 01:47:35.040 |
And it turns out they're in the middle of a globular cluster 01:47:39.760 |
God, that's from when I was the same age as a kid. 01:48:05.160 |
they were trying to navigate boats in the North Atlantic 01:48:10.680 |
without being able to see the sun 'cause it was cloudy. 01:48:12.600 |
And so they used a chunk of mica to diffract it 01:48:19.960 |
in order to be able to align where the sun really was 01:48:22.520 |
'cause they couldn't tell by eye and navigate. 01:48:24.680 |
So I'm just saying there are a lot of civilizations 01:48:37.440 |
- To me personally, the question of the meaning of life 01:48:53.400 |
- But then you ask, it still feels that we're special. 01:49:01.640 |
well, if we are as special as I think we are, 01:49:05.280 |
why the heck are we here in this vast universe? 01:49:08.840 |
That ultimately is the question of the meaning of life. 01:49:14.160 |
- I mean, look, there's a confusion sometimes 01:49:18.600 |
in trying to set up a question or a thought experiment 01:49:23.600 |
or something that's defined in terms of a context 01:49:29.920 |
to explain something where there is no larger context. 01:49:34.200 |
If we wanna do it in physics or in computer science, 01:49:40.360 |
it's hard to talk about the universe as a Turing machine 01:49:44.000 |
because a Turing machine has an external clock 01:49:54.160 |
you can't talk about it coherently as a Turing machine. 01:50:11.440 |
So maybe Turing machines and quantum mechanics 01:50:18.560 |
But the thing is, if you have something that's defined 01:50:29.280 |
So there's some ideas that are their own context. 01:50:48.680 |
and to talk about ultimate meaning is therefore a category error. 01:50:48.680 |
It might be a way of thinking that is experientially 01:51:06.920 |
or aesthetically valuable because it is awesome 01:51:14.880 |
But to try to treat it analytically is not sensible. 01:51:20.440 |
- Maybe that's what music and poetry are for. 01:51:24.160 |
I think music actually does escape any particular context. 01:51:27.280 |
That's how it feels to me, but I'm not sure about that. 01:51:29.200 |
That's once again, crazy artist talking, not scientist. 01:51:37.880 |
And like I said, I'm a big fan of everything you've done, 01:51:53.760 |
that you spent your really valuable time with me today. 01:52:01.820 |
To support this podcast, please check out our sponsors 01:52:10.000 |
A real friendship ought to introduce each person 01:52:16.720 |
Thank you for listening, and hope to see you next time.