Peter Wang: Python and the Source Code of Humans, Computers, and Reality | Lex Fridman Podcast #250
Chapters
0:00 Introduction
0:33 Python
4:04 Programming language design
24:07 Virtuality
34:07 Human layers
41:05 Life
46:29 Origin of ideas
49:01 Eric Weinstein
54:00 Human source code
57:58 Love
72:16 AI
85:39 Meaning crisis
108:12 Travis Oliphant
114:38 Python continued
144:21 Best setup
151:39 Advice for the youth
160:12 Meaning of Life
00:00:00.000 |
The following is a conversation with Peter Wang, 00:00:02.400 |
one of the most impactful leaders and developers 00:00:28.800 |
And now, here's my conversation with Peter Wang. 00:00:51.120 |
or maybe the most beautiful feature of Python, 00:00:54.040 |
or maybe the thing that made you fall in love 00:01:01.960 |
what made me fall in love with it, and stay in love. 00:01:05.760 |
was when I was a C++ computer graphics performance nerd. 00:01:14.120 |
And we kept trying to do more and more like abstract 00:01:20.520 |
which at the time was quite difficult with templates. 00:01:24.160 |
The compiler support wasn't great, et cetera. 00:01:26.560 |
So when I started playing around with Python, 00:01:37.200 |
So that was what kind of made me fall in love 00:01:46.200 |
that basically runs and works the first time is amazing. 00:01:53.680 |
I was decent at both, but Python just made everything, 00:01:59.680 |
I could script this and that and the other network things, 00:02:08.640 |
- Is there something specific you could put your finger on 00:02:19.640 |
as much as the design motif of both the creator 00:02:27.480 |
There was definitely, there was a taste to it. 00:02:32.200 |
I mean, Steve Jobs, you know, used that term, 00:02:35.720 |
but I think it's a real thing that it was designed to fit. 00:02:39.240 |
A friend of mine actually expressed this really well. 00:02:56.560 |
- Okay, so the most beautiful feature of Python 00:03:03.040 |
It's like over the years, what has like, you know, 00:03:06.560 |
you do a double take and you return too often 00:03:11.560 |
- I really still like the ability to play with metaclasses 00:03:24.400 |
'cause I'm pretty expert as a Python programmer. 00:03:27.080 |
I can easily put all sorts of lovely things together 00:03:34.680 |
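A minimal sketch of the kind of metaclass play being described here; the class names are illustrative, not anything from the conversation:

```python
# Hypothetical example: a metaclass that auto-registers every subclass
# in a lookup table, so "lovely things" can be wired together by name.
class PluginMeta(type):
    registry = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # skip the abstract root class itself
            PluginMeta.registry[name.lower()] = cls
        return cls

class Plugin(metaclass=PluginMeta):
    pass

class CsvLoader(Plugin):
    pass

print(PluginMeta.registry)  # {'csvloader': <class '__main__.CsvLoader'>}
```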
So that to me, I would say that's tied with the NumPy 00:03:40.640 |
I love thinking in terms of the matrices and the vectors 00:03:46.080 |
So I would say those two are kind of tied for me. 00:03:49.400 |
- So the elegance of the NumPy data structure, 00:03:52.720 |
like slicing through the different multi-dimensional-- 00:03:56.200 |
It's like a very, it's a very simple, comfortable tool. 00:04:05.000 |
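For concreteness, a small example of the multi-dimensional slicing being praised here (standard NumPy, nothing exotic):

```python
import numpy as np

volume = np.arange(24).reshape(2, 3, 4)  # a 2x3x4 array

plane = volume[0]               # first 3x4 plane
rows = volume[:, 1, :]          # middle row of every plane, shape (2, 4)
cols = volume[..., ::2]         # every other column, shape (2, 3, 2)
mask = volume[volume % 3 == 0]  # boolean mask: all multiples of 3
```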
- Can you put your finger on how to design a language 00:04:18.680 |
Is it something you have to kind of write out on paper, 00:04:24.680 |
Is it a taste thing or is there a systematic process? 00:04:33.600 |
is that you have to pick your audience, right? 00:04:43.840 |
because their needs will be more compact and coherent. 00:04:50.960 |
The more diverse the user base, the harder that is. 00:04:57.200 |
that's also naturally created more complexity 00:05:06.320 |
And so I do think that's the downside of popularity. 00:05:19.520 |
were just solving a problem that you yourself had? 00:05:21.880 |
- Well, so Clay Shirky in his book on crowdsourcing 00:05:27.560 |
he identifies the first step of crowdsourcing 00:05:34.320 |
It's very telling that when you look at all of the impactful 00:05:37.760 |
big projects, well, they're fundamental projects now 00:05:42.600 |
they all started with the people in the domain 00:05:48.240 |
And the whole idea of scratching your own itch 00:05:51.360 |
or the free software world has known for a long time. 00:06:00.560 |
They didn't have really a lot of programming skill 00:06:16.880 |
for utility and compactness and expressiveness. 00:06:23.360 |
'cause there was just too much trouble, right? 00:06:26.160 |
- And also necessity creates a kind of deadline. 00:06:29.440 |
are quickly thrown together in the first step. 00:06:34.520 |
that just seems to work well for software projects. 00:06:38.840 |
- Well, it does work well for software projects in general. 00:06:43.280 |
one of my colleagues, Stan Seibert, identified this, 00:06:46.320 |
that all the projects in the SciPy ecosystem, 00:06:49.040 |
if we just rattle them off, there's NumPy, there's SciPy, 00:06:55.000 |
although Travis is the heart of both of them. 00:07:00.640 |
And then you've got Pandas, you've got Jupyter or IPython. 00:07:06.880 |
There's just so many others, I'm not gonna do justice 00:07:10.720 |
but all of them are actually different people. 00:07:29.280 |
And the thing is with these scientists turned programmers, 00:07:34.800 |
that was a little bit better for what they needed, 00:07:39.960 |
So naturally, almost in kind of this annealing process 00:07:46.520 |
of the basic needs of a scientific computing library. 00:07:55.520 |
all kinds of methodologies for communication, 00:07:57.760 |
and it just kind of like grew this collective intelligence, 00:08:23.100 |
at a higher level of abstraction, something else? 00:08:44.860 |
when we try to give instructions to computing systems, 00:08:49.000 |
all of our computers, well, actually, this is not quite true, 00:08:54.980 |
But for the most part, we can think of computers 00:08:58.900 |
So when we program, we're giving very precise instructions 00:09:02.740 |
to iterated systems that then run at incomprehensible speed 00:09:08.820 |
In my experience, some people are just better equipped 00:09:16.860 |
well, whatever, iterated systems in their head. 00:09:31.260 |
And the issue is you can have a drag and drop thing, 00:09:33.700 |
but once you start having to iterate the system 00:09:35.220 |
with conditional logic, handling case statements 00:09:37.140 |
and branch statements and all these other things, 00:09:39.540 |
the visual drag and drop part doesn't save you anything. 00:09:42.080 |
You still have to reason about this giant iterated system 00:09:44.720 |
with all these different conditions around it. 00:09:48.220 |
So handling iterated logic, that's the hard part. 00:09:52.300 |
The languages we use then emerge to give us ability 00:09:57.300 |
Now, the one exception to this rule, of course, 00:09:58.900 |
is the most popular programming system in the world, 00:10:05.420 |
data transformation-oriented programming system. 00:10:10.400 |
that that system is the most popular programming system 00:10:16.940 |
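A sketch of the contrast being drawn, using pandas as a stand-in for the spreadsheet style: the same computation written once as iterated logic and once as a declarative data transformation. The data here is made up for illustration.

```python
import pandas as pd

orders = pd.DataFrame({"qty": [3, 0, 5], "price": [9.99, 4.50, 2.00]})

# Iterated-logic style: explicit loop, branching, mutable state.
totals = []
for _, row in orders.iterrows():
    totals.append(row["qty"] * row["price"] if row["qty"] > 0 else 0.0)

# Data-transformation style: describe the result, not the steps,
# much as a spreadsheet column of formulas does.
orders["total"] = (orders["qty"] * orders["price"]).where(orders["qty"] > 0, 0.0)
```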
I do think as we build future computing systems, 00:10:21.260 |
you're actually already seeing this a little bit, 00:10:22.900 |
it's much more about composition of modular blocks. 00:10:25.660 |
They themselves actually maintain all their internal state, 00:10:32.980 |
And so to stitch these things together using like IFTTT 00:10:38.780 |
I would say compositional scripting kinds of things, 00:10:41.980 |
I mean, HyperCard was also a little bit in this vein. 00:10:54.460 |
where you're standing on the shoulders of giants. 00:10:55.940 |
So you're building like higher and higher levels 00:10:58.740 |
of abstraction, but you do that a little bit with language. 00:11:07.580 |
And then you kind of leverage those philosophies 00:11:09.540 |
as you try to have deeper and deeper conversations. 00:11:13.540 |
it seems like you can build much more complicated systems. 00:11:27.860 |
ability to express ideas in a computational space. 00:11:31.300 |
- I think it's worth pondering the difference here 00:11:43.260 |
but the idea is when we have a human conversation, 00:11:55.540 |
which is that the person we're communicating with 00:11:57.700 |
is a person and they would communicate back to us. 00:12:01.220 |
And so we sort of hit a resonance point, right? 00:12:07.460 |
So there's a messiness to it and there's a fluidity to it. 00:12:11.860 |
when we express something to the computer and it's wrong, 00:12:17.500 |
of having failed at expressing ourselves to the computer 00:12:20.180 |
until the one time we expressed ourselves right. 00:12:32.460 |
one has to really pay attention to the difference 00:12:35.580 |
between when an end user is expressing something 00:12:49.500 |
And in most cases, but it doesn't have to be that. 00:12:52.060 |
And Excel is actually a great example of this, 00:12:56.220 |
- Okay, so what about the idea of, you said messiness. 00:13:27.380 |
then there's Python, the programming languages 00:13:38.140 |
And so the programming is almost in the space of data 00:13:45.900 |
is a lot of the programming happens in the space of data. 00:13:50.900 |
So back to Excel, all roads lead back to Excel. 00:13:54.060 |
In the space of data and also the hyperparameters 00:13:57.380 |
And all of those allow the same kind of messiness 00:14:09.940 |
So I don't have, now I did cram a bunch of CS in prep 00:14:19.740 |
But what I have observed in studying programming languages 00:14:27.340 |
It's one of these beautiful little iron triangles 00:14:32.060 |
And it's the connection between the code correctness 00:14:39.900 |
and then the kind of correctness or parameters 00:14:44.940 |
So there's the algorithms that you wanna apply. 00:14:53.340 |
So the semantics of the data within the representation, 00:14:56.940 |
and then there's what the computer can actually do. 00:14:59.740 |
And every programming system, every information system 00:15:07.420 |
Sometimes some systems collapse them into just one edge. 00:15:11.100 |
- Are we including humans as a system in this? 00:15:13.500 |
- No, no, I'm just thinking about computing systems here. 00:15:17.820 |
I believe there's no free lunch around this stuff. 00:15:39.580 |
That training process itself is intrinsically sensitive 00:15:45.580 |
It's quite sensitive to the boundary conditions 00:15:49.340 |
So I think even as we move to using automated systems 00:15:56.980 |
figuring out, well, this schema and this database 00:15:59.300 |
and these columns get selected for this algorithm. 00:16:01.660 |
And here we put a Fibonacci heap for some other thing. 00:16:08.140 |
the boundaries that we hit with these information systems 00:16:10.620 |
is when the representation of the data hits the real world 00:16:14.220 |
is where there's a lot of slop and a lot of interpretation. 00:16:20.860 |
is actually understanding kind of how to better 00:16:26.220 |
how to better encode the semantics of the world 00:16:33.460 |
- Okay, but given the semantics of the real world 00:16:40.780 |
- There's a lot of dimensions to correctness. 00:16:42.940 |
Historically, and this is one of the reasons I say 00:16:45.140 |
that we're coming to the end of the era of software, 00:17:03.180 |
I mean, in fact, I think the bright line in the sand 00:17:14.260 |
actually have to be considered with the function together 00:17:19.420 |
And usually there's a performance SLA as well. 00:17:29.180 |
to return a prediction of this level of accuracy, right? 00:17:32.700 |
So these are things that were not traditionally 00:17:35.020 |
in most business computing systems for the last 20 years 00:17:39.420 |
But now we have value dependence on functional correctness. 00:17:48.180 |
- We've thought about software as just this thing 00:17:50.500 |
that you can do in isolation with some test trial inputs 00:18:05.140 |
When we start turning this stuff into prediction systems, 00:18:10.140 |
you're going to find scenarios where you get inputs 00:18:17.460 |
So then the software has a varying amount of runtime 00:18:22.260 |
And that is a different kind of system altogether. 00:18:27.620 |
that is not like traditional software systems. 00:18:30.180 |
- Can you maybe describe what is a cybernetic system? 00:18:35.060 |
So is it human in the loop kind of complex mess 00:18:38.900 |
of the whole kind of interactivity of software 00:18:41.500 |
with the real world, or is it something more concrete? 00:18:47.700 |
is closing the observe, orient, decide, act loop by itself. 00:19:01.940 |
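A toy sketch of what "closing the loop by itself" means: a controller that observes, orients (updates its internal model), decides, and acts with no human in the loop. Everything here is illustrative, not any real system.

```python
import random

class Thermostat:
    """Illustrative cybernetic loop: observe, orient, decide, act."""

    def __init__(self, setpoint):
        self.setpoint = setpoint
        self.temperature = setpoint - 2.0  # true state of the room
        self.estimate = setpoint           # internal model of that state

    def observe(self):
        return self.temperature + random.uniform(-0.5, 0.5)  # noisy sensor

    def orient(self, reading):
        # Fold the new reading into a smoothed internal model.
        self.estimate = 0.8 * self.estimate + 0.2 * reading

    def decide(self):
        return "heat" if self.estimate < self.setpoint else "idle"

    def act(self, decision):
        if decision == "heat":
            self.temperature += 0.3  # heating nudges the room upward

    def step(self):
        self.orient(self.observe())
        self.act(self.decide())

t = Thermostat(setpoint=21.0)
for _ in range(20):
    t.step()  # the loop closes with no human intervention
```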
when the machine is actually sort of deciding on its own 00:19:05.220 |
what it should do next to get more information, 00:19:11.380 |
I think everyone talking about ML and AI, it's great, 00:19:15.380 |
but really the thing we should be talking about 00:19:20.380 |
and all of the questions of ethics and governance 00:19:24.620 |
they really are the most important questions. 00:19:28.580 |
What does it mean for the human to be out of the loop 00:19:34.100 |
that's ultimately accomplishing some kind of purpose 00:19:37.420 |
that at the bottom, the turtles all the way down, 00:19:44.140 |
- Well, the human may have set some criteria, 00:19:55.220 |
sent out some automated killer drones with explosives. 00:19:58.220 |
And there was no human in the loop at that point. 00:19:59.980 |
They basically put them in a geo-fenced area, 00:20:03.420 |
like a truck or vehicle that looks like this, and boom. 00:20:09.300 |
- So increasingly, the less human there is in the loop, 00:20:12.700 |
the more concerned you are about these kinds of systems, 00:20:18.340 |
like less the original designer and engineer of the system 00:20:21.940 |
is able to predict, even one with good intent 00:20:24.980 |
is able to predict the consequences of such a system. 00:20:30.140 |
that run without humans in the loop that are quite complex. 00:20:35.900 |
We get, you know, in the heyday of high-frequency trading, 00:20:43.620 |
that the market designers had never really thought about, 00:20:52.860 |
now they become automated, you know, killer drones, 00:21:00.580 |
and the end of the era of just pure software. 00:21:09.980 |
so systems that aren't just doing a particular task, 00:21:20.340 |
are you more concerned about like the lobster being boiled, 00:21:27.360 |
collapse of civilization, or a big explosion, 00:21:33.260 |
like oops, kind of a big thing where everyone notices, 00:21:40.100 |
- I think that it will be a different experience 00:21:52.180 |
people who are concerned about climate change 00:21:53.740 |
and just the big existential risks that we have, 00:21:58.740 |
but unlike a lot of people who share my level of concern, 00:22:02.540 |
I think the collapse will not be quite so dramatic 00:22:07.540 |
And what I mean is that I think that for certain tiers 00:22:10.780 |
of let's say economic class or certain locations 00:22:14.620 |
people will experience dramatic collapse scenarios. 00:22:17.740 |
But for a lot of people, especially in the developed world, 00:22:29.120 |
the middle class will be used to insulate the upper class 00:22:31.300 |
from the pitchforks and the flaming torches and everything. 00:22:39.540 |
my question was more about cybernetic systems or software. 00:22:44.420 |
but it would nevertheless perhaps be about class. 00:22:46.660 |
So the effect of algorithms might affect certain classes 00:22:51.120 |
- I was more thinking about whether it's social media 00:23:05.140 |
Or is it something truly dramatic where there's, 00:23:08.420 |
like a meltdown of a nuclear reactor kind of thing, 00:23:21.100 |
- Yeah, I'm not as concerned about the visible stuff. 00:23:26.100 |
And the reason is because the big visible explosions, 00:23:29.540 |
I mean, this is the thing I said about social media 00:23:33.180 |
when a nuke goes off, you can see it and you're like, 00:23:35.100 |
"Well, that's really, wow, that's kind of bad." 00:23:37.780 |
I mean, Oppenheimer was reciting the Bhagavad Gita, 00:23:51.160 |
when you have all these different things that conspire 00:23:54.380 |
to create a layer of virtual experience for people 00:23:57.100 |
that alienates them from reality and from each other, 00:24:07.180 |
- You've written about this idea of virtuality 00:24:09.580 |
on this topic, which you define as the subjective phenomenon 00:24:14.100 |
of knowingly engaging with virtual sensation and perception 00:24:26.460 |
Is there a hard line between reality and virtuality? 00:24:30.440 |
Like perception drifts from some kind of physical reality. 00:24:33.460 |
We have to kind of have a sense of what is the line 00:24:38.760 |
For me, it's not about any hard line about physical reality, 00:24:46.180 |
does the particular technology help people connect 00:24:55.980 |
with all of the full spectrum of things around them? 00:24:58.540 |
So it's less about, oh, this is a virtual thing 00:25:02.980 |
more about when we create virtual representations 00:25:10.740 |
Usually many, many dimensions are lost in translation. 00:25:14.340 |
We're now coming to almost two years of COVID, 00:25:17.940 |
You know it's different when you meet somebody in person 00:25:24.020 |
And so I think when we engage in virtual experiences 00:25:42.500 |
It's hard to say, oh, we're gonna spend $100 million 00:25:44.540 |
building a new system that captures this 5% better 00:25:52.580 |
So when we rush madly into a world of simulacrum 00:25:57.420 |
and virtuality, the things that are lost are, 00:26:05.420 |
it can be hard to look back and see what we've lost. 00:26:14.220 |
is it possible for more to be gained than is lost? 00:26:18.700 |
they create virtual experiences that are surreal 00:26:29.900 |
So it can bring out the best and the worst in people. 00:26:39.400 |
I mean, it's possible to have that in the current world, 00:26:41.560 |
but when literally trillions of dollars of capital 00:26:52.840 |
and to attack our weaknesses in the limbic system 00:27:10.920 |
to create fulfilling connections and relationships 00:27:13.720 |
in the digital world and make a shit ton of money? 00:27:22.000 |
without concretely knowing what's the solution? 00:27:24.680 |
- My intuition is that a lot of our digital technologies 00:27:27.720 |
give us the ability to have synthetic connections 00:27:33.020 |
They have co-evolved with sort of the human expectations. 00:27:42.320 |
they need more sugary drinks to get that same hit, right? 00:27:45.860 |
So with these virtual things and with TV and fast cuts 00:27:50.400 |
and TikToks and all these different kinds of things, 00:28:00.340 |
It gets difficult for people to hold their attention 00:28:07.300 |
So mindfulness now more than ever is so important 00:28:10.340 |
in schools and as a therapy technique for people 00:28:13.520 |
because our environment has been accelerated. 00:28:17.340 |
in the electric environment of the television. 00:28:19.480 |
And that was before TikTok and before front-facing cameras. 00:28:25.320 |
it's not like we can ever switch to doing something better, 00:28:33.520 |
The technology that we use kind of molds what we need 00:28:40.680 |
and they're introspective and they can reflect 00:28:45.800 |
So for example, there's been many years in my life 00:28:50.900 |
And then a certain moment I woke up and said, 00:28:59.000 |
And I think, so going through the TikTok process 00:29:02.300 |
of realizing, okay, when I shorten my attention span, 00:29:06.280 |
actually that does not make me feel good longer term. 00:29:10.240 |
And realizing that, and then going to platforms, 00:29:13.160 |
going to places that are away from the sugar. 00:29:21.080 |
that can make a lot of money to help people wake up 00:29:24.000 |
to what actually makes them feel good long-term. 00:29:28.280 |
And it just feels like humans are more intelligent 00:29:40.720 |
things like long-term love and we can have a long-term fear 00:30:00.200 |
we can make financial decisions in using services 00:30:03.800 |
and paying for services that are making us better people. 00:30:06.880 |
So it just seems that we're in the very first stages 00:30:16.520 |
But in bringing out sometimes the bad parts of human nature, 00:30:28.520 |
"Hey, we're gonna start having like sugar-free social media." 00:30:34.800 |
I think some people certainly have the capacity for that. 00:30:37.560 |
And I certainly think, I mean, it's very interesting 00:30:44.200 |
Well, that's still your limbic system saying, 00:30:47.280 |
Right, you have a cat brain's worth of neurons 00:30:50.720 |
And so maybe that saturated and that was telling you, 00:30:55.040 |
Humans are more than just mice looking for cheese 00:31:02.640 |
Now a lot of people would argue with you on that one. 00:31:15.080 |
that we could be better and that better platforms exist. 00:31:21.720 |
- That's an awesome verb. - It's a great term. 00:31:26.560 |
nope out of that. - I'm gonna nope out of that. 00:31:34.240 |
that's the first generation of front-facing cameras, 00:31:43.560 |
have the capacity to say, "Yeah, I'm not gonna do that. 00:31:46.320 |
"I'm gonna go and spend time on long-form reads. 00:32:01.680 |
"Hey, I need to get the guy or the girl or the whatever, 00:32:05.520 |
And so one of the things that we have to reason about here 00:32:07.800 |
is the social media systems or social media, I think, 00:32:11.080 |
is our first encounter with a technological system 00:32:31.880 |
which is each person gets to make their own determination. 00:32:34.320 |
Each person is an individual that's sacrosanct 00:32:37.200 |
in their agency and their sovereignty and all these things. 00:32:39.960 |
The problem with these systems is they come down 00:32:44.360 |
they come down and they are able to manage everyone en masse. 00:32:47.960 |
And so every person is making their own decision, 00:32:50.160 |
but together the bigger system is causing them to act 00:32:53.600 |
with a group dynamic that's very profitable for people. 00:33:00.960 |
is that our philosophies are actually not geared 00:33:06.480 |
to have a high trust connection as part of a collective 00:33:16.160 |
That's something like when a social media app 00:33:21.600 |
it's done harm to more than just individuals. 00:33:24.520 |
So that concept is not something we really talk about 00:33:30.040 |
is that we're vaporizing molecules into atomic units 00:33:32.960 |
and then we're hitting all the atoms with certain things 00:33:35.040 |
that's like, yeah, well, that person chose to look at my app. 00:33:44.680 |
because ultimately society operates at the collective level. 00:33:55.940 |
we have to think of the collective level too. 00:34:01.740 |
- Individuals, family units, social collectives, 00:34:07.120 |
- So you've said that individual humans are multi-layered. 00:34:10.320 |
Susceptible to signals and waves and multiple strata. 00:34:13.600 |
The physical, the biological, social, cultural, 00:34:16.120 |
intellectual, so sort of going along these lines, 00:34:25.320 |
and maybe the human collective, human society? 00:34:29.180 |
- So I'm just stealing wholesale here from Robert Pirsig, 00:34:32.880 |
who is the author of "Zen and the Art of Motorcycle Maintenance," 00:34:42.280 |
but it's a crude approach to thinking about people, 00:34:51.200 |
where we look at people as a dualist would say, 00:34:57.140 |
is that just merely the matter that's in your brain 00:35:01.120 |
or is there something kind of more beyond that? 00:35:16.820 |
Collectives of things can emerge structures and patterns 00:35:19.760 |
that are just as real as the underlying pieces, 00:35:28.400 |
I mean, we just know physically you consist of atoms 00:35:31.000 |
and whatnot, and then the atoms are arranged into molecules, 00:35:34.680 |
which then arrange into certain kinds of structures 00:35:40.600 |
and those cells form sort of biological structures. 00:35:46.800 |
its physical ability and the biological ability 00:35:49.000 |
to consume energy and to maintain homeostasis, 00:35:54.000 |
I mean, human by themselves is not very long for the world. 00:36:02.200 |
From the mirror neurons to our language centers 00:36:09.200 |
there's a part of us that wants to be part of a thing. 00:36:12.560 |
If we're around other people, not saying a word, 00:36:14.680 |
but they're just up and down jumping and dancing, laughing, 00:36:18.580 |
And there was no exchange of physical anything. 00:36:21.760 |
They didn't give us like five atoms of happiness, right? 00:36:24.880 |
But there's an induction in our own sense of self 00:36:35.080 |
I think they're actually more intertwined than that. 00:36:37.080 |
But the intellectual level is the level of pure ideas 00:36:45.000 |
You will conduct yourself in a particular way. 00:36:47.720 |
I mean, I think part of this is if we think about it 00:36:50.880 |
there's the joke that physicists like to approximate things. 00:36:55.080 |
And we'll say, well, approximate a spherical cow, right? 00:36:57.480 |
You're not a spherical cow, you're not a spherical human, 00:37:00.720 |
And we can't even say what the dynamics of your emotion 00:37:03.960 |
will be unless we analyze all four of these layers, right? 00:37:08.520 |
If you're Muslim at a certain time of day, guess what? 00:37:11.720 |
You're gonna be on the ground, kneeling and praying, right? 00:37:13.960 |
And that has nothing to do with your biological need 00:37:23.720 |
So that's what the four layered stack is all about. 00:37:28.080 |
It's that a person is not only one of these things, 00:37:30.320 |
they're all of these things at the same time. 00:37:31.720 |
It's a superposition of dynamics that run through us, 00:37:48.280 |
- Yeah, each layer is a part of what you are. 00:37:50.280 |
You are a layer cake, right, of all these things. 00:38:02.480 |
because there's a lot of other things that are only atoms. 00:38:04.080 |
I can reduce a human being to a bunch of soup 00:38:32.240 |
is the superposition of these different layers. 00:38:39.760 |
Is consciousness a particular quality of one of the layers? 00:38:45.000 |
if you have a consciousness detector at these layers? 00:38:47.940 |
Or is something that just permeates all of these layers 00:38:51.920 |
- I believe what humans experience as consciousness 00:39:00.540 |
that seems to look for order and reach for order 00:39:06.780 |
You know, it would be odd to say a proton is alive, right? 00:39:27.460 |
will form very interesting and beautiful structures. 00:39:29.960 |
This gets kind of into weird mathematical territories. 00:39:33.000 |
You start thinking about Penrose and Game of Life stuff 00:39:37.920 |
like the hyperreal numbers, things like that. 00:39:42.340 |
I would say that there seems to be a tendency in the world 00:39:49.120 |
things will structure and pattern themselves. 00:39:51.360 |
And they will then actually furthermore try to create 00:39:53.760 |
an environment that furthers their continued stability. 00:39:58.080 |
It's the concept of externalized extended phenotype 00:40:02.280 |
So this is ultimately what leads to certain kinds 00:40:06.280 |
of amino acids forming certain kinds of structures 00:40:09.200 |
and so on and so forth until you get the ladder of life. 00:40:12.920 |
no, I don't think cells are conscious of that level. 00:40:15.600 |
But is there something beyond mere equilibrium state biology 00:40:32.440 |
and you look at any kind of statistical physics 00:40:37.440 |
when you look at things far out of equilibrium, 00:40:39.740 |
when you have excess energy, what happens then? 00:41:00.360 |
so I have to sort of linger on that for a little bit. 00:41:04.420 |
So cellular automata, I guess, or game of life 00:41:09.340 |
is a very simple example of reaching for order 00:41:14.040 |
or reaching for order and somehow creating complexity. 00:41:31.520 |
What intuition do you draw from the simplest mechanism? 00:41:38.000 |
and look at it as what if every single one of the patterns 00:41:50.560 |
And other times you start with certain things 00:41:54.000 |
that then construct like AND gates and NOT gates, right? 00:41:59.240 |
All of these rules that create these patterns 00:42:00.800 |
that we can see, those are just the patterns we can see. 00:42:04.600 |
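For reference, Conway's Game of Life is small enough to state completely; a minimal NumPy version of one generation, seeded with a glider:

```python
import numpy as np

def life_step(grid):
    """One generation of Conway's Game of Life on a wrap-around grid."""
    # Count each cell's eight neighbors by summing shifted copies.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A cell is alive next step if it has 3 neighbors,
    # or 2 neighbors and is currently alive.
    return ((neighbors == 3) | ((neighbors == 2) & (grid == 1))).astype(int)

grid = np.zeros((8, 8), dtype=int)
for y, x in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:  # a glider
    grid[y, x] = 1
for _ in range(4):      # after 4 steps the glider has translated itself
    grid = life_step(grid)
```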
What if our subjectivity is actually limiting our ability 00:42:09.680 |
What if some of the things that we think are random 00:42:13.260 |
We're simply not integrating at a final level 00:42:17.920 |
And this is again, I said, we go down the rabbit holes 00:42:20.560 |
and the Penrose stuff or like Wolfram's explorations 00:42:28.560 |
That is, hopefully one day I'll have enough money 00:42:30.360 |
to work and retire and just ponder those questions. 00:42:36.080 |
when you have enough money and you retire and you ponder, 00:42:38.440 |
there's a ceiling to how much you can truly ponder 00:42:43.020 |
in what you're able to perceive as a pattern. 00:42:47.180 |
- And maybe mathematics extends your perception capabilities 00:42:55.460 |
- Yeah, the mathematics we use is the mathematics 00:43:09.380 |
- She, I just think did she create all of it? 00:43:15.740 |
and then we screwed it all up with zero and then, I guess, okay. 00:43:18.540 |
- But we did, we created mathematical operations 00:43:26.000 |
I mean, the entire point of the Arabic numeral system 00:43:29.040 |
and it's a rubric for mapping a certain set of operations 00:43:32.580 |
of folding them into a simple little expression. 00:43:35.360 |
But that's just the operations that we can fit in our heads. 00:43:38.740 |
There are many other operations besides, right? 00:43:41.160 |
- The thing that worries me the most about aliens and humans 00:44:07.460 |
- Well, we're looking for a particular kind of thing. 00:44:14.720 |
and she would tell this story about a musical, 00:44:17.800 |
a musician, a Western musician who went to Japan 00:44:27.520 |
he would basically be looking for things in the style of 00:44:30.640 |
Western chromatic scale and these kinds of things. 00:44:38.840 |
They're playing different kinds of instruments. 00:44:40.480 |
The same thing she was using as sort of a metaphor 00:44:52.480 |
spirituality and practice rather than belief. 00:45:06.160 |
But if we kind of broaden and generalize this thing 00:45:10.880 |
under which conditions can they then create an environment 00:45:17.240 |
the invention of death is an interesting thing. 00:45:23.420 |
And it's not like they're incredibly complex, 00:45:25.560 |
they're actually simpler than the cells that comprise us, 00:45:37.160 |
And why is that along with the sexual reproduction, right? 00:45:41.440 |
There is something about the renewal process, 00:45:48.120 |
where it just becomes, just killing off the old generation 00:45:54.200 |
seems to be the best way to fit into the niche. 00:45:56.680 |
- You know, human historians seem to write about wheels 00:46:01.600 |
but it seems like death and sex are pretty good. 00:46:19.500 |
it's a team project, just like you're saying. 00:46:36.880 |
Is that the basic atom of collaboration is ideas? 00:46:40.400 |
- It's not not ideas, but it's not only ideas. 00:46:49.160 |
which is that humans are the only conspecific, 00:47:01.040 |
you see like pronghorns, butting heads, right? 00:47:07.180 |
Humans, we develop the ability to chuck rocks at each other, 00:47:11.960 |
And that means the beta male can chuck a rock 00:47:20.040 |
miss a bunch of times, but just hit once and be good. 00:47:22.400 |
So this ability to actually kill members of our own species 00:47:26.600 |
from range without a threat of harm to ourselves 00:47:29.960 |
created essentially mutually assured destruction 00:47:34.000 |
If we didn't, then if we just continue to try to do, 00:47:39.480 |
and I'm gonna own this tribe and you have to go, 00:47:43.240 |
if we do it that way, then those tribes basically failed. 00:47:48.440 |
and that have now given rise to the modern Homo sapiens 00:47:53.860 |
that we can kill each other from range without harm, 00:48:05.200 |
Right, come back here, don't throw that rock at me. 00:48:12.300 |
Well, the recognition, maybe the better way to put it 00:48:17.480 |
by working together than the prisoner's dilemma 00:48:22.580 |
- So mutually assured destruction in all its forms 00:48:28.720 |
- Well, and Eric Weinstein talks about our nuclear peace. 00:48:32.440 |
that we have thousands of warheads aimed at each other, 00:48:36.320 |
on the other hand, we only fought proxy wars. 00:48:43.160 |
dying to like machine gun fire and giant guided missiles. 00:48:52.800 |
- Yeah, well, the original scope of the world 00:49:08.000 |
'cause I didn't know there's this depth to you 00:49:10.840 |
'cause I knew you as an amazing leader of engineers 00:49:18.480 |
Maybe just as a comment, a side tangent that we can take, 00:49:23.840 |
what's the nature of your friendship with Eric Weinstein? 00:49:45.120 |
We were both working at a company called Enthought 00:49:54.400 |
and Eric was trying to use some of these Python tools 00:49:57.400 |
to visualize that he had a fiber bundle approach 00:50:04.280 |
and that's how he kind of got in touch with us. 00:50:16.200 |
to visualize fiber bundles. - Right, to visualize 00:50:41.160 |
government's in its pockets with regulatory capture. 00:50:45.280 |
And all sorts of people, Nassim Taleb was there 00:50:56.880 |
and another one of our coworkers, Robert Kern, 00:51:00.280 |
who is anyone in the SciPy, NumPy community knows Robert, 00:51:04.720 |
So the three of us also got invited to go to this thing. 00:51:13.520 |
But anyway, so we met then and kind of had a friendship. 00:51:26.800 |
in terms of just thinking about the world deeply 00:51:30.440 |
And then there's Eric's interest in programming. 00:51:41.080 |
But I never kind of pushed the point of like, 00:51:43.640 |
what's the nature of your interest in programming? 00:51:55.040 |
what's his depth of interest and also his vision 00:51:59.360 |
for what programming would look like in the future? 00:52:07.400 |
like discussion in the space of Python programming? 00:52:09.840 |
- Well, in the sense of sometimes he asks me, 00:52:30.120 |
is that he does use programming metaphors a lot, right? 00:52:42.880 |
I haven't pair programmed with him in a very long time. 00:52:50.200 |
some of the visualizations around these things, 00:52:51.560 |
but it's been a very, not really pair program, 00:53:02.680 |
- Well, honestly, Robert Kern did all the heavy lifting. 00:53:05.480 |
So I have to give credit where credit is due. 00:53:18.560 |
as Travis and I were starting our company in 2012 timeframe, 00:53:31.560 |
Maybe it was Keen's, I don't know, somewhere in New York. 00:53:53.920 |
- Recreate it somewhere in LA, or maybe he comes here. 00:53:56.720 |
'Cause a lot of cool people are here in Austin, right? 00:54:10.160 |
and somebody who's quite a bit of an expert in source code, 00:54:14.160 |
do you think we'll ever figure out our own source code 00:54:19.040 |
Do you think we'll figure out the nature of it? 00:54:19.880 |
- Well, I think we're constantly working on that problem. 00:54:21.760 |
I mean, I think we'll make more and more progress. 00:54:24.440 |
For me, there's some things I don't really doubt too much. 00:54:28.120 |
Like I don't really doubt that one day we will create 00:54:37.760 |
that rivals the biological 20 watt computers in our heads. 00:54:53.240 |
- Doesn't the Roomba vacuum cleaner already do that? 00:54:55.440 |
Or do you mean, oh, it doesn't ask questions. 00:55:03.680 |
- It doesn't ask questions about what is this wall? 00:55:05.360 |
It now, as a new feature, asks: is this poop or not, apparently. 00:55:08.960 |
- Yes, a lot of our current cybernetic systems, 00:55:12.680 |
It will go and it will happily vacuum up some poop, right? 00:55:16.720 |
- The new one, just released, does not vacuum up the poop. 00:55:23.760 |
In any case, these cybernetic systems we have, 00:55:27.120 |
they are molded, they're designed to be sent off 00:55:34.040 |
And whatever dynamic things happen in the environment, 00:55:36.360 |
they have a very limited capacity to respond to. 00:55:38.840 |
A human baby, a human toddler of 18 months of age, 00:55:43.040 |
has more capacity to manage its own attention 00:55:45.800 |
and its own capacity to make better sense of the world 00:55:51.680 |
So again, my cat, I think, can do a better job of my two, 00:55:56.360 |
So I do think though, back to my kind of original point, 00:55:59.440 |
I think that it's not, for me, it's not a question at all 00:56:02.680 |
that we will be able to create synthetic systems 00:56:05.960 |
that are able to do this better than the human, 00:56:09.200 |
at an equal level or better than the human mind. 00:56:14.840 |
that we will be able to put them alongside humans 00:56:25.360 |
and also looking at our responses, listening to our responses, 00:56:28.880 |
even maybe measuring certain vital signs about us. 00:56:37.600 |
in our whatever, 80 years of life, to train itself up, 00:56:42.200 |
and then be a very good simulacrum of us moving forward. 00:56:45.040 |
- So who is in the sidecar in that picture of the future, 00:57:03.240 |
we need to keep humans around for a long time. 00:57:05.600 |
And I would hope that anyone making those systems 00:57:20.640 |
- First, I mean, first we have to build systems 00:57:21.680 |
that help us do the things that we do know about, 00:57:24.400 |
that can then probe the unknowns that we know about, 00:57:37.560 |
even after our immortal selves have transcended 00:57:49.520 |
these seem like things that are going to happen. 00:57:55.720 |
Assuming we don't completely destroy ourselves. 00:58:02.480 |
that you fall in love with and it falls in love with you 00:58:10.740 |
- I would hope that that is the design criteria 00:58:14.500 |
If we cannot have a meaningful relationship with it, 00:58:40.700 |
So what role does love play in the human condition 00:58:45.340 |
at the individual level and at the group level? 00:58:52.260 |
both at the individual and the group and the societal level. 00:59:02.500 |
At which point did we invent love, by the way? 00:59:11.100 |
this is sort of beyond just romantic, sensual, 00:59:16.800 |
but actually genuine love as we have for another person, 00:59:19.680 |
love as it would be used in a religious text, right? 00:59:25.440 |
more than consciousness, that is the universal thing. 00:59:28.380 |
Our feeling of love is actually a sense of that generativity. 00:59:32.900 |
and see that they can be something more than they are, 00:59:47.620 |
that you should see the grace of God in the other person. 00:59:54.300 |
the love that God feels for his creation or her creation. 00:59:57.060 |
And so I think this thing is actually the root of it. 01:00:00.140 |
So I would say before, I don't think molecules of water 01:00:06.220 |
but there is some proto micro quantum thing of love. 01:00:10.900 |
That's the generativity when there's more energy 01:00:19.780 |
I mean, I had my mind blown one day as an undergrad 01:00:24.540 |
I logged in and when you log into Bash for a long time, 01:00:28.260 |
there was a little fortune that would come out. 01:00:34.000 |
And I was logging in to work on some problem set. 01:00:37.980 |
And I logged in and I saw that and I just said, 01:00:44.380 |
and I got a coffee and I sat there on the quad. 01:00:46.180 |
I'm like, "You know, it's not wrong and yet WTF, right?" 01:00:55.820 |
yeah, okay, non-equilibrium physics is a thing. 01:01:03.080 |
I would say that in the modern day human condition, 01:01:08.100 |
there's a lot of talk about freedom and individual liberty 01:01:17.220 |
it's very kind of following from the Western philosophy 01:01:22.700 |
But it's not really couched, I think, the right way 01:01:26.300 |
because it should be how do we maximize people's ability 01:01:29.420 |
to love each other, to love themselves first, 01:01:34.500 |
to the previous generation, to the future generations. 01:01:41.780 |
Those should be what we start with to then come up 01:01:52.200 |
I think when we design systems for cognition, 01:01:58.660 |
I think if we simply focus on efficiency and productivity, 01:02:04.840 |
all the things that Marx had issues with, right? 01:02:08.300 |
Those, that's a way to go and really, I think, 01:02:21.060 |
of an individual human, and then there's groups 01:02:41.260 |
in which sense do you believe that corporations are people? 01:02:47.620 |
- Right, so the belief is that groups of people 01:03:01.980 |
individuals have claims to agency and sovereignty. 01:03:09.500 |
nations have rights to sovereignty and agency. 01:03:26.660 |
Well, in our laws, we actually do encode this concept. 01:03:30.460 |
I believe that in a relationship, in a marriage, right, 01:03:33.820 |
one partner can sue for loss of consortium, right, 01:03:37.740 |
if someone breaks up the marriage or whatever. 01:03:41.580 |
we do respect that there is something about the union 01:03:48.300 |
to think that groups of people have a right to, 01:03:51.820 |
a claim to rights and sovereignty of some degree. 01:03:54.660 |
I mean, we look at our clubs, we look at churches. 01:03:59.020 |
These are, we talk about these collectives of people 01:04:05.780 |
But I think if we take that one step further, 01:04:10.300 |
Well, yes, check, you know, and by law, they can. 01:04:12.740 |
They can own land, they can engage in contracts, 01:04:17.060 |
they can do all these different kinds of things. 01:04:38.440 |
than anyone else in the landscape, anything else, 01:04:43.100 |
And they're able to essentially bully around individuals, 01:05:05.500 |
But the idea that collectives and collections of people, 01:05:13.100 |
- Some agency and some mass at a mesoscopic level. 01:05:17.980 |
because one thing I do think we underappreciate sometimes 01:05:22.340 |
is the fact that relationships have relationships. 01:05:26.180 |
So it's not just individuals having relationships 01:05:29.060 |
But if you have eight people seated around a table, right? 01:05:32.060 |
Each person has a relationship with each of the others, 01:05:42.720 |
And if it's couples, but one is the father and mother older, 01:05:47.040 |
and then, you know, one of their children and their spouse, 01:05:55.720 |
So the idea that relationships have relationships 01:06:01.780 |
but it's not something I hear expressed like that. 01:06:09.480 |
So I think the reason why I care a lot about this 01:06:21.520 |
at something, you know, around Dunbar number, 01:06:39.320 |
go to the dark force of the internet by ourselves. 01:06:45.300 |
I think maybe calling it a convenient fiction 01:06:53.920 |
because we are, like you said, made of cells, 01:07:02.860 |
But it seems like some fictions are better than others. 01:07:07.840 |
that argue the fiction of nation is a bad idea. 01:07:18.260 |
who are into meditation that believe the idea, 01:07:21.840 |
this useful fiction of agency of an individual 01:07:33.720 |
but suffering or to elevate the experience of life. 01:07:40.760 |
okay, so we have some of these useful fictions of agency. 01:07:47.880 |
that we tell ourselves about the agency of groups 01:07:51.480 |
in the hundreds of the half of Dunbar's number, 01:08:01.540 |
Rules that we feel are fair or rules that we consent to. 01:08:05.680 |
- I always question the rules when I lose, like a monopoly. 01:08:09.800 |
when I'm winning, I don't question the rules. 01:08:12.840 |
There's a trippy version of it that we could do. 01:08:16.240 |
- Contract Monopoly was introduced to me by a friend of mine, 01:08:19.160 |
where you can write contracts on future earnings 01:08:27.780 |
you land on Park Place, it's free, or whatever. 01:08:44.160 |
and we were to make NFTs out of the contracts we wrote, 01:08:50.160 |
but I bet we could actually sell the NFTs around. 01:08:52.920 |
- I have other ideas to make money that I could tell you, 01:09:19.240 |
requires modified fictions, stories of agency. 01:09:25.040 |
And also, how do you select the group of people? 01:09:38.000 |
we can have deep human connection at this scale. 01:09:41.280 |
Like some of it feels like an interface problem too. 01:09:48.080 |
I can deeply connect with a larger number of people. 01:09:57.980 |
getting to share traumatic experiences together, 01:10:04.080 |
like that in the digital space that you can share. 01:10:10.800 |
perhaps not to the level of millions and billions, 01:10:16.280 |
So how do we find the right interface, you think, 01:10:26.080 |
- You're right that there's many different ways 01:10:30.840 |
My friend, Joe Edelman talks about a few different ways 01:10:41.320 |
There's a variety of different things that we can do, 01:10:43.640 |
but all those things take time and you have to be present. 01:10:48.480 |
The less present you are, I mean, there's just, again, 01:10:51.560 |
The less present you are, the more of them you can do, 01:10:56.800 |
So I think there is sort of a human capacity issue 01:11:00.260 |
Now, that being said, if we can use certain technologies, 01:11:04.800 |
so for instance, if I write a little monograph 01:11:12.800 |
I read it, I'm like, "Wow, Lex, this is awesome." 01:11:15.320 |
We can be friends without having to spend 10 years, 01:11:20.560 |
We can just read each other's thing and be like, 01:11:22.120 |
"Oh yeah, this guy's exactly in my wheelhouse 01:11:26.080 |
And we can then connect just a few times a year 01:11:33.040 |
It can be expanded a little bit, but it also requires, 01:11:35.840 |
these things are not all technological in nature, 01:11:44.760 |
If you wanna use the OCEAN Big Five sort of model, 01:11:50.640 |
the fewer authentic connections you can really build 01:11:57.360 |
There's just a lot of the stuff that you have to deal with 01:12:04.760 |
to where they can develop more relationships faster, 01:12:06.880 |
and then you can maybe expand Dunbar number by quite a bit, 01:12:10.640 |
I think it's gonna be hard to get it beyond 10X, 01:12:23.480 |
- Do you count as one system or multiple AI systems? 01:12:28.800 |
for them to integrate into human society as it is now, 01:12:35.280 |
because otherwise we wouldn't relate to them. 01:12:37.600 |
- We could engage certain kinds of individuals 01:12:40.240 |
to make sense of them for us and be almost like, 01:12:42.680 |
did you ever watch "Star Trek," like Voyager? 01:12:45.520 |
Like there's the Vorta, who are like the interfaces, 01:13:02.100 |
but the biggest cybernetic system in the world 01:13:07.960 |
And you have an entire stack of people on Wall Street, 01:13:09.880 |
Wall Street analysts, to CNBC reporters, whatever. 01:13:13.240 |
They're all helping to communicate what does this mean? 01:13:19.580 |
Like all of these people are part of that lowering 01:13:29.760 |
And I don't see this changing with AI systems. 01:13:34.560 |
that this AI system is trying to do over here, over here. 01:13:39.120 |
So if you wanna talk about humans interfacing, 01:13:40.800 |
making first contact with the super intelligence, 01:13:44.800 |
And if you look at the gradient of power and money, 01:14:10.760 |
of like millions of AI systems that have agencies. 01:14:22.280 |
it's able to solve particular problems extremely well. 01:14:26.080 |
But there's some aspect of human-like intelligence 01:14:29.240 |
that's necessary to be integrated into human society. 01:14:39.680 |
I'm more referring to things that you interact with 01:14:45.120 |
And that I think requires, there has to be a backstory. 01:14:50.080 |
I believe it has to fear its own mortality in a genuine way. 01:14:56.540 |
many of the elements that we humans experience 01:15:05.840 |
But I don't think having a deep connection with it 01:15:07.780 |
is necessarily going to stop us from building a thing 01:15:10.580 |
that has quite an alien intelligence aspect to it. 01:15:13.340 |
So the other kind of alien intelligence on this planet 01:15:16.620 |
is octopuses or octopodes or whatever you wanna call them. 01:15:31.140 |
as a collective intelligence of eight intelligent arms. 01:15:34.340 |
Its arms have a tremendous amount of neural density to them. 01:15:40.340 |
I mean, just let's go with what you're saying. 01:15:44.380 |
that interfaces with humans, that has a sense of agency 01:16:07.380 |
And if their cognitive systems are already digitized 01:16:15.400 |
that maybe doesn't make all the individual decisions 01:16:25.860 |
that is then able to perceive much broader dynamics 01:16:31.060 |
In the same way that a phased array radar, right? 01:16:32.580 |
You think about how phased array radar works. 01:16:36.220 |
It's just radars, and then it's hypersensitivity 01:16:50.140 |
much, much better, much higher fidelity understanding 01:16:55.060 |
of what's actually happening in the environment. 01:16:57.740 |
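The phased-array analogy, made concrete: many simple elements, steered purely by per-element phase shifts and a sum. A rough NumPy sketch with illustrative parameters:

```python
import numpy as np

c, f = 3e8, 10e9                           # speed of light; 10 GHz carrier
wavelength = c / f
n = 16                                     # number of antenna elements
positions = np.arange(n) * wavelength / 2  # half-wavelength spacing

def array_gain(steer_deg, source_deg):
    """Relative gain toward source_deg when steered to steer_deg."""
    k = 2 * np.pi / wavelength
    steer = np.exp(-1j * k * positions * np.sin(np.radians(steer_deg)))
    signal = np.exp(1j * k * positions * np.sin(np.radians(source_deg)))
    return abs(np.sum(steer * signal)) / n

print(array_gain(30, 30))  # ~1.0: beam pointed right at the source
print(array_gain(30, 50))  # much smaller: off-beam energy suppressed
```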
the more understanding the central superintelligence has, 01:17:11.820 |
There has to be the experience of the individual agent 01:17:15.660 |
has to have the full richness of the human-like experience. 01:17:20.660 |
You have to be able to be driving the car in the rain, 01:17:28.260 |
because remembering something that happened to you 01:17:47.340 |
So each one in excess of energy has to reach for order 01:18:07.000 |
but you're connected with a few other individuals 01:18:14.640 |
what do you think is the experience of if you are, 01:18:18.960 |
If you are one, if you're part of this hive mind, 01:18:22.640 |
outside of all the aesthetics, forget the aesthetics, 01:18:28.400 |
'Cause I have a theory as to what that looks like. 01:18:30.660 |
The one question I have for you about that experience 01:18:33.640 |
is how much is there a feeling of freedom, of free will? 01:18:38.640 |
Because I, obviously, as a human, very unbiased, 01:18:43.340 |
but also somebody who values freedom and biased, 01:18:46.160 |
it feels like the experience of freedom is essential 01:18:54.680 |
and doing something truly novel, which is at the core of-- 01:18:59.140 |
- Yeah, well, I don't think you have to lose any freedom 01:19:04.600 |
we still think, I mean, you're still thinking about this 01:19:06.960 |
in a sense of a top-down command and control hierarchy, 01:19:12.320 |
I think the experience, so I'll just show my cards here. 01:19:16.080 |
I think the experience of being a robot in that robot swarm, 01:19:19.780 |
a robot who has agency over their own local environment 01:19:39.660 |
"I think this is what's gonna happen over here. 01:19:43.760 |
"this will make this change happen in the environment." 01:19:46.420 |
And then, God, she may tell you, "That's great. 01:19:51.200 |
"And in fact, your brothers and sisters will join you 01:19:55.040 |
And then she can let your brothers and sisters 01:20:00.660 |
"Because we think that this will make this thing go better." 01:20:03.840 |
So the whole thing could be actually a very emergent, 01:20:06.420 |
the sense of what does it feel like to be a cell 01:20:10.080 |
in a network that is alive, that is generative? 01:20:12.740 |
And I think actually the feeling is serendipity, 01:20:19.380 |
not random disorder or chaos, but random order. 01:20:22.380 |
Just when you need it to hear Bruce Springsteen, 01:20:30.100 |
I feel like this is a bit of a flight of fancy, 01:20:35.940 |
what does it feel like to be a cell in your body? 01:20:54.460 |
what does it feel like to be an independent individual 01:21:01.860 |
that still lives within a network that has order to it. 01:21:04.980 |
And I feel like it has to be a feeling of serendipity. 01:21:07.220 |
- So the cell, there's a feeling of serendipity, 01:21:12.260 |
why it's getting oxygen and sugar when it gets it. 01:21:16.180 |
has to be too dumb to understand the big picture. 01:21:35.220 |
The moment you understand, I feel like that leads to, 01:21:38.900 |
if you tell me now that there's some bigger intelligence 01:21:48.900 |
even the Sam Harris thing, there's no free will. 01:21:54.460 |
that that's the case, that's gonna, I don't know if I- 01:22:01.020 |
Because we're in the West and we're pumped full 01:22:10.540 |
No, it's actually, you don't actually have a lot of these. 01:22:21.900 |
You can say whatever you want on this podcast. 01:22:25.460 |
- You're not, I mean, you have a lot of this kind of freedom, 01:22:27.860 |
but even as you're doing this, you are actually, 01:22:30.140 |
I guess where the denouement of all of this is that 01:22:33.500 |
we are already intelligent agents in such a system, right? 01:22:37.820 |
In that one of these like robots of one of 5 million 01:22:43.580 |
they're just posting an internal bulletin board. 01:22:45.420 |
I mean, maybe the Borg cube is just a giant Facebook machine 01:22:47.620 |
floating in space and everyone's just posting on there. 01:22:50.540 |
They're just posting really fast and like, oh yeah. 01:22:57.200 |
Yeah, everyone upvotes and they're gonna go shoot it. 01:23:07.380 |
It's got the overhangs of zombie sense-making institutions 01:23:15.540 |
we are going to see humanity improving at speeds 01:23:21.980 |
And it's not because anyone's freedoms were limited. 01:23:24.660 |
I mean, we started this with open-source software, right? 01:23:26.820 |
The collaboration, what the internet surfaced 01:23:29.340 |
was the ability for people all over the world 01:23:32.620 |
some of the most foundational software that's in use today. 01:23:38.920 |
So these online kind of swarm kind of things are not novel. 01:23:44.400 |
It's just, I'm just suggesting that future AI systems, 01:23:57.800 |
And that thing will certainly have emergent intelligence 01:24:03.560 |
will be able to really put a bow around and explain. 01:24:09.960 |
still be able to go like rural Texas, buy a ranch, 01:24:28.120 |
- You have access to way more intelligence capability 01:24:34.920 |
- So like there's a word control that comes to mind. 01:24:45.360 |
- I think systems, well, this is to your point. 01:24:50.600 |
I think systems that feel like overbearing control 01:24:55.760 |
I think systems that give their individual elements 01:24:58.380 |
the feeling of serendipity and the feeling of agency, 01:25:05.960 |
that there will not be emergent higher level order. 01:25:08.840 |
And that's the thing, that's the philosophical breakdown 01:25:31.480 |
Everything we see is the emergent patterns of other things. 01:25:35.440 |
And there is agency when there's extra energy. 01:25:38.140 |
- So you have spoken about a kind of meaning crisis 01:25:44.980 |
But it feels like since we invented sex and death, 01:26:02.320 |
Why is, how is this particular meaning crisis different? 01:26:07.320 |
Or is it really a crisis and it wasn't previously? 01:26:15.840 |
and not getting eaten by bears crisis, right? 01:26:18.680 |
Once you get to a point where you can make food, 01:26:23.640 |
So sitting around wondering what is it all about, 01:26:31.600 |
the meaning crisis coming out of that is precisely because, 01:26:53.300 |
then it feels like what was the point, right? 01:26:55.480 |
But if there's all these big things happening, 01:27:09.800 |
acting on it and then seeing the consequences of it. 01:27:12.160 |
So historically, just when humans are in survival mode, 01:27:16.720 |
you're making consequential decisions all the time. 01:27:20.900 |
because like you either got eaten or you didn't, right? 01:27:23.440 |
You got some food and that's great, you feel good. 01:27:27.440 |
Only in the post fossil fuel and industrial revolution, 01:27:36.920 |
I could sit around not being threatened by bears, 01:27:44.840 |
but a lot of times not seeing the consequences 01:28:05.760 |
- Oh, there's high-end luxury purses and crap like that. 01:28:10.760 |
But the point is that we give people the idea 01:28:15.200 |
that making a choice of this team versus that team, 01:28:31.080 |
- Well, you're saying choosing between Chanel 01:28:35.440 |
I mean, why is one more meaningful than the other? 01:28:38.320 |
- It's not that it's more meaningful than the other, 01:28:39.640 |
it's that you make a decision between these two brands 01:28:42.600 |
and you're told this brand will make me look better 01:28:54.320 |
but consumption by itself doesn't actually yield meaning. 01:29:00.080 |
So that's why in this era of abundant production, 01:29:07.520 |
The NFT kind of explosion is a similar kind of thing. 01:29:12.000 |
because we just have so much excess production. 01:29:15.480 |
- But aren't those status games a source of meaning? 01:29:18.440 |
Like why do the games we play have to be grounded 01:29:26.200 |
Why can't we in this virtuality world, on social media, 01:29:35.280 |
- But you're saying that's creating a meaning crisis. 01:29:41.120 |
Number one, playing those kinds of status games 01:29:52.620 |
consuming the latest and greatest version of a thing, 01:30:01.740 |
and destroy every year to create artificial scarcity 01:30:09.600 |
So conspicuous consumption fueling status games 01:30:14.120 |
is really bad for the planet, not sustainable. 01:30:17.200 |
The second thing is you can play these kinds of status games 01:30:20.780 |
but then what it does is it renders you captured 01:30:25.420 |
The status games that really wealthy people are playing 01:30:31.460 |
they're gonna have the fuel and the rare earths 01:30:41.100 |
- So you're saying ultimately the big picture game 01:30:49.860 |
So you don't see a society where most of the games 01:30:59.780 |
It's just like the stack of human being, right? 01:31:08.820 |
and access to layer zero physical are going to own you. 01:31:26.260 |
is because the big sovereignties where one spends money 01:31:31.260 |
and uses money and plays money games and inflates money, 01:31:36.260 |
their ability to adjudicate the physical resources 01:31:40.340 |
and hard resources and land and things like that, 01:31:42.380 |
those have not been challenged in a very long time. 01:31:47.660 |
Most money is not connected to physical resources. 01:31:52.960 |
And that idea is very closely connected to status. 01:31:57.880 |
- But it's also tied to, it's actually tied to law. 01:32:03.160 |
It is tied to some physical hard things, right? 01:32:06.080 |
- Yes, so it's always at the end going to be connected 01:32:21.640 |
I'm playing on the stacks of devil's advocates here. 01:32:25.320 |
I'm popping one devil off the stack at a time. 01:32:33.160 |
but just because people control the physical reality 01:32:47.500 |
and now you're playing in the space of virtual, 01:32:51.900 |
And so it just feels like there could be games 01:33:00.120 |
Like I can imagine such things being possible. 01:33:19.040 |
And into like, well, you look at Mr. Beast, right? 01:33:31.140 |
Yeah, like it's not that those kinds of games 01:33:44.660 |
I would say mostly I'm thinking about middle-class consumers. 01:33:55.660 |
the need to buy the latest model of whatever, 01:33:59.220 |
the need to pursue status games as a driver of meaning. 01:34:11.380 |
it's like eating a lot of empty calories, right? 01:34:16.220 |
it was not enough protein to help build your muscles. 01:34:22.660 |
and setting aside our discussion on currency, 01:34:28.860 |
part of it being created by the fact that we don't, 01:34:46.520 |
We're encouraged to relate to these kinds of things 01:34:53.800 |
And that's where the meaning crisis comes from. 01:34:57.240 |
so there's somebody you mentioned who's Jacques Ellul, 01:34:57.240 |
he warns about the towering piles of technique, 01:35:19.340 |
My question broadly speaking, this meaning crisis, 01:35:21.460 |
can technology, what are the pros and cons of technology? 01:35:26.900 |
I certainly draw on some of Ellul's ideas 01:35:26.900 |
I mean, he speaks to the general mentality of efficiency, 01:35:40.940 |
homogenized processes, homogenized production, 01:35:43.340 |
homogenized labor to produce homogenized artifacts 01:36:01.940 |
maybe a piece of wood and they need to make into a chair, 01:36:04.200 |
it may be a site to build a house or build a stable 01:36:08.440 |
and they will consider how to bring various things in 01:36:15.000 |
that's in right relationship with that environment. 01:36:22.280 |
over the last 100, 150 years is not that at all. 01:36:25.720 |
It is how can we make sure the input materials 01:36:46.780 |
is because we have broadcasts that tell everyone 01:36:46.780 |
And, like Baudrillard and other critics, 01:36:52.600 |
their point is that at this point in time, 01:36:59.260 |
but the need to consume and build status games on top. 01:37:12.140 |
I think this is really like Bernays and stuff, right? 01:37:14.820 |
In the early 20th century, we discovered we can create, 01:37:31.100 |
to build a relationship with their neighbor or their spouse. 01:37:33.660 |
We are telling them, you need to consume this brand. 01:37:40.940 |
So creating homogenized demand makes it really cheap 01:37:50.060 |
give it to all the kids and all the kids are like, 01:37:54.440 |
So this is ultimately where this ties in then 01:38:07.960 |
So you have to squeeze more and more demand out. 01:38:12.280 |
but tell everyone they're still getting meaning from it. 01:38:15.120 |
You're still like, this is still your Tickle Me Elmo, right? 01:38:15.120 |
critiques of this dripping in popular culture. 01:38:22.840 |
You see it sometimes it's when Buzz Lightyear 01:38:31.800 |
or there's hundreds of other Buzz Lightyears 01:38:31.800 |
- I agree with you on most of the things you're saying. 01:38:54.280 |
if channeled correctly, innovation, invention, 01:39:05.640 |
the quality of lives for all kinds of people. 01:39:15.160 |
and more and more experiences that would then give meaning? 01:39:21.080 |
I mean, it's not all good or bad in my perspective. 01:39:30.680 |
but that's a different, that's somewhat different 01:39:35.940 |
but it's somewhat different than the question 01:39:40.640 |
Is this still the same rocket we need to ride 01:39:44.600 |
- Well, how does this, so you're predicting the future, 01:39:54.000 |
to where we can actually, I think, sustainably produce 01:40:20.200 |
have been, a lot of them have been just inherited 01:40:32.000 |
but a lot of these modes of organizing people 01:40:43.080 |
I think, a very industrial mode perspective on human labor. 01:40:54.920 |
If you look at the core SciPy sort of collection of libraries, 01:41:00.680 |
There's IPython Notebook, let's throw Pandas in there, 01:41:23.600 |
I mean, it's like, it's similar question of like, 01:41:36.040 |
Now, some of that stuff runs through TensorFlow, 01:41:45.160 |
they're using some aspect of these kinds of tools. 01:41:47.600 |
So I would say that these create billions of dollars 01:41:51.960 |
- Oh, you mean like direct use of tools that leverage-- 01:42:01.920 |
- So that's billions of dollars a day, great. 01:42:05.720 |
Now, if we take how many people did it take to make that? 01:42:09.760 |
Right, and there was a point in time, not anymore, 01:42:11.680 |
but there was a point in time when they could fit in a van. 01:42:13.520 |
I could have fit them in my Mercedes Sprinter, right? 01:42:13.520 |
could create value to the tune of billions of dollars a day. 01:42:36.280 |
The way I've talked about this in other environments is, 01:42:39.680 |
when we use generative participatory crowdsourced approaches 01:42:50.720 |
I would challenge anyone to go and try to hire 01:43:02.440 |
They'd be very, very hard pressed to do that. 01:43:04.080 |
If a hedge fund could just hire a dozen people 01:43:10.040 |
every single one of them would be racing to do it, right? 01:43:24.080 |
and it took the right kinds of people, right? 01:43:27.800 |
I need to have a part of a multi-billion dollar a day 01:43:37.800 |
is to say that our way of thinking about value, 01:43:40.720 |
our way of thinking about allocation of resources, 01:43:54.160 |
to some extent we're sort of in a post-scarcity era, 01:43:57.040 |
although some people are hoarding a whole lot of stuff. 01:44:01.280 |
if not now, soon we'll be in a post-scarcity era. 01:44:08.680 |
because the kind of software these people built, 01:44:23.040 |
So that's different than any other physical resource 01:44:34.560 |
created this much value so efficiently, so cheaply, 01:44:37.520 |
'cause feeding a dozen people for 10 years is really cheap. 01:44:41.600 |
That's the reason I care about this right now, 01:44:54.040 |
to build something sustainable for you and your tribe 01:44:56.960 |
to deliver the right medicines, to take care of the kids, 01:45:07.280 |
where all of this additional generative things 01:45:12.320 |
they don't have to be wrapped up in a container 01:45:22.960 |
'Cause the old internet was connecting people just fine. 01:45:34.640 |
And then I said, how do I make money off that? 01:45:45.560 |
How do we have millions of vans full of people 01:45:47.760 |
that create NumPy, SciPy, that create Python? 01:45:54.320 |
is often they have some kind of job outside of this. 01:46:04.880 |
Isn't that what, like, isn't this consumerism 01:46:15.120 |
every once in a while to create something magical? 01:46:17.240 |
Like at the edges is where the innovation happens. 01:46:21.360 |
Like if everyone were to go and run their own farm, 01:46:24.400 |
no one would have time to go and write NumPy, SciPy, right? 01:46:29.960 |
when I say we're maybe at a post-scarcity point 01:46:34.160 |
The question that we're never encouraged to ask 01:46:42.000 |
Do you need to have a new car every two years, every five? 01:46:46.200 |
can you drive one for 10 years, is that all right? 01:46:47.960 |
You know, I had a car for 10 years and it was fine. 01:46:52.800 |
I mean, it's sort of, you're using the same apps 01:46:58.320 |
- This should be a Super Bowl ad, that's great. 01:47:01.520 |
- Maybe one of our listeners will fund something like this 01:47:03.920 |
of like, no, but just actually bringing it back, 01:47:13.560 |
for collectives of people to live on the basis of 01:47:18.200 |
providing what we need, meeting people's needs 01:47:21.040 |
with a little bit of excess to handle emergencies 01:47:26.320 |
to handle the really, really big emergencies, 01:47:30.880 |
or some massive fire sweeps through, you know, 01:47:44.120 |
to explore how to be the best version of themselves? 01:47:49.000 |
throwing away his shot at tenure in order to write NumPy. 01:47:49.000 |
For others, there is a saying in the sci-fi community 01:47:59.800 |
And that's, you know, we can do these things. 01:48:03.800 |
We can actually do this kind of collaboration 01:48:05.600 |
because code, software, information organization, 01:48:09.880 |
Those bits are very cheap to fling across the oceans. 01:48:15.680 |
and we'll continue to talk about open source. 01:48:24.080 |
What's your relationship been like through the years? 01:48:44.200 |
you know, working on scientific computing, consulting. 01:48:58.240 |
One of the founders of Enthought was the CEO, Eric Jones. 01:49:01.920 |
And we were all very excited that Travis was joining us. 01:49:12.120 |
I mean, it was just a really, it was a good time there. 01:49:23.280 |
we started getting called into more and more finance shops. 01:49:29.880 |
I did some work at like a high-frequency trading shop, 01:49:29.880 |
at a couple of investment banks in Manhattan. 01:49:39.880 |
And so we started seeing that there was a potential 01:49:42.720 |
to take Python in the direction of business computing. 01:49:46.840 |
like MATLAB replacement for big vector computing. 01:49:51.840 |
you could actually use Python as a Swiss army knife 01:49:53.920 |
to do a lot of shadow data transformation kind of stuff. 01:49:56.880 |
So that's when we realized the potential is much greater. 01:50:03.400 |
I mean, it was called Continuum Analytics at the time, 01:50:07.560 |
with a vision of shoring up the parts of Python 01:50:10.800 |
that needed to get expanded to handle data at scale, 01:50:13.800 |
to do web visualization, application development, et cetera. 01:50:18.080 |
So he was CEO and I was president for the first five years. 01:50:23.080 |
And then we raised some money and then the board 01:50:35.240 |
Travis then left after a year to do his own thing, 01:50:39.600 |
which was more oriented around some of the bootstrap years 01:50:46.160 |
It wasn't sort of like gung-ho product development. 01:50:50.080 |
we accidentally stumbled into the package management problem 01:50:53.840 |
at Anaconda, but then we had a lot of other visions 01:50:56.920 |
of other technology that we built in the open source. 01:51:02.360 |
the frontiers of numerical computing, vector computing, 01:51:05.280 |
handling things like auto differentiation and stuff 01:51:18.280 |
We remain great friends and colleagues and collaborators, 01:51:22.520 |
even though he's no longer day-to-day working at Anaconda, 01:51:29.000 |
- What's a big lesson you've learned from Travis 01:51:32.200 |
about life or about programming or about leadership? 01:51:43.080 |
- I've gotten that sense having to interact with him. 01:52:00.780 |
and then be on a conversation and be eye-to-eye 01:52:08.280 |
no matter how much fog settles in over the ocean, 01:52:21.880 |
I mean, I hope he knows that over the years now. 01:52:28.120 |
I would say this about Travis, it's interesting. 01:52:29.880 |
For someone who cares so deeply about the nerd details 01:52:33.360 |
of like type system design and vector computing 01:52:36.000 |
and efficiency of expressing this and that and the other, 01:52:53.080 |
But for me, the beauty of what this human ecology 01:53:03.680 |
how do we replicate this for humanity at scale? 01:53:05.880 |
What this open source collaboration was able to produce? 01:53:08.840 |
How can we be generative in human collaboration 01:53:17.520 |
'Cause like a lot of the other open source movements, 01:53:19.800 |
it's all nerds nerding out on code for nerds. 01:53:34.440 |
- Is there a way for this kind of open source vision 01:53:51.020 |
they have needs that like business specific needs 01:53:54.960 |
They really can't tell their VPs and their investors, 01:54:02.100 |
installing random packages from who knows where 01:54:08.380 |
So we are a governed source of packages for them. 01:54:13.140 |
We take some of that and we just take that as a dividend. 01:54:18.240 |
and write that as a dividend for the open source community. 01:54:21.040 |
But beyond that, I really see the development 01:54:24.440 |
of a marketplace for people to create notebooks, 01:54:33.100 |
and to really have a long tail marketplace dynamic with that. 01:54:39.620 |
that you stumbled into of package management, 01:54:48.980 |
which is part of Anaconda, which is a package manager. 01:55:21.140 |
but they need to be compiled with all of the right settings. 01:55:29.340 |
when you look at different operating systems, 01:55:33.820 |
but if you're running Mac versus Linux versus Windows 01:55:37.220 |
on the same x86 chip, you compile and link differently. 01:55:40.040 |
All of this complexity is beyond the capability 01:55:46.780 |
And it's also beyond what most of the package developers 01:55:52.860 |
you're like, I code on Linux, this works for me, I'm good. 01:55:55.820 |
It is not my problem to figure out how to build this 01:56:05.140 |
or create a very creative crowdsourced environment 01:56:08.580 |
where people want to use this stuff, but they can't. 01:56:11.380 |
And so we ended up creating a new set of technologies 01:56:27.700 |
on each of these different kinds of platforms 01:56:30.860 |
when people want to install something, they can. 01:56:34.460 |
They don't have to set up a big compiler system 01:56:40.420 |
Now, the difficulty is we have literally thousands 01:56:48.740 |
they may take a dependence on something else. 01:57:06.620 |
They're like, I want to install NumPy and Pandas. 01:57:09.180 |
I want this version of some like geospatial library. 01:57:17.680 |
you're installing this on a version of Windows, right? 01:57:20.740 |
And half of these libraries are not built for Windows. 01:57:26.260 |
If you go to the old version of this library, 01:57:27.540 |
that means you need to go to a different version 01:57:34.500 |
we were able to fill a hundred thousand different niches. 01:57:49.200 |
So we end up sort of having to do a lot of this. 01:57:58.660 |
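To make the scale of that build matrix concrete, here is a minimal sketch of the kind of environment specification Conda is asked to solve; the package names and version pins are hypothetical, chosen only to illustrate the constraints being described.

```yaml
# Hypothetical environment.yml: the solver must find builds of every package
# (and of their underlying C/Fortran dependencies) that can coexist on the
# target OS and architecture.
name: geo-analysis
channels:
  - defaults
dependencies:
  - python=3.9
  - numpy=1.21
  - pandas=1.3
  - gdal        # native geospatial library; not every version exists for every platform
```

Creating it with `conda env create -f environment.yml` asks the solver to pick, from thousands of prebuilt binaries, one mutually compatible combination for your exact platform.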
Now, Pip is a tool that came along after Conda 01:58:04.460 |
for the Python developers writing Python code 01:58:15.360 |
And what ended up happening in the Python ecosystem 01:58:17.820 |
was that a lot of the core Python and web Python developers, 01:58:20.820 |
they never ran into any of this compilation stuff at all. 01:58:29.500 |
you know what, the scientific community's packaging problems 01:58:33.140 |
I mean, you're talking about Fortran compilers, right? 01:58:41.740 |
and built its own sort of packaging technologies, 01:58:57.540 |
The instant you want to also install some other packages 01:59:09.860 |
OpenCV can have a different version of libjpeg over here 01:59:17.380 |
they have to all use the same underlying drivers 01:59:32.860 |
I mean, you said that you don't want to think, 01:59:35.900 |
but how much is it a little bit on the developer 01:59:42.020 |
of that subgraph of dependencies that's necessary? 01:59:42.020 |
look, can we pull some of the most popular packages together 01:59:51.180 |
and get them to work on a coordinated release timeline, 01:59:53.580 |
get them to build against the same test matrix, 01:59:56.980 |
And there is a little bit of dynamic around this, 02:00:07.540 |
So we end up trying to pull these things together 02:00:13.020 |
and I would recommend just as a business tip, 02:00:16.500 |
where when your hard work works, you're invisible. 02:00:19.500 |
And when it breaks because of someone else's problem, 02:00:25.540 |
When something doesn't conda install properly, 02:00:38.140 |
So we end up in this kind of problematic scenario, 02:00:50.460 |
also appreciate the work we've done over the years 02:01:05.700 |
So there's all these ways to mess with these kinds 02:01:11.460 |
So I just wanna ask you about that particular one. 02:01:13.700 |
What do you think about the move from Python two to three? 02:01:23.620 |
and the community all struggled through this process, 02:01:29.380 |
- Looking back, some people perhaps underestimated 02:01:36.660 |
I think some people also underestimated how much, 02:01:44.380 |
some of the new features in Python three really provided. 02:01:47.340 |
Like the things they really loved about Python three 02:01:49.660 |
just didn't matter to some of these people in Python two. 02:01:52.540 |
'Cause this change was happening as Python, SciPy 02:01:59.940 |
in the early data science era, in the early 2010s. 02:02:08.220 |
these libraries I need are not supported in Python three yet, 02:02:15.660 |
So I think it wasn't any particular single thing, 02:02:18.780 |
but it was one of those death by a dozen cuts, 02:02:21.740 |
which just really made it hard to move off of Python two. 02:02:30.660 |
there's a lot of stuff that was happening there 02:02:39.620 |
So I think that's essentially what happened there. 02:02:43.660 |
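For readers who were not in the trenches, a few of the breaking changes are easy to show; this is standard Python 2 versus 3 behavior, written in Python 3 with the old behavior noted in comments.

```python
print("hello")         # Python 3: print is a function
                       # Python 2: `print "hello"` was a statement

print(7 / 2)           # Python 3: 3.5 (true division)
                       # Python 2: 7 / 2 == 3 (floor division on ints)

s = "naïve"            # Python 3: str is Unicode text by default
b = s.encode("utf-8")  # bytes and text are distinct types; mixing them raises
                       # Python 2: str was bytes, u"..." was separate, and
                       # UnicodeDecodeError haunted data-handling code
```

Each change was defensible on its own; it was the accumulation across large codebases and dependency stacks that made migration expensive.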
the strength of the Python data science movement, 02:02:48.580 |
I think is what kept Python alive in that transition. 02:03:02.620 |
would have just left for Go and Rust and stayed. 02:03:06.100 |
They moved to Go and Rust and they just never looked back. 02:03:08.740 |
The fact that we were able to grow by millions of users, 02:03:15.700 |
that is what kept the momentum for Python going. 02:03:18.180 |
And now the usage of Python for data is over 50% 02:03:24.180 |
So I'm happy to debate that on stage somewhere 02:03:28.660 |
if they really wanna take issue with that statement. 02:03:33.380 |
the idea is that the switch from Python 2 to Python 3 02:03:40.260 |
if it didn't also coincide with Python for whatever reason, 02:03:53.940 |
that this maybe imperfect decision was coupled 02:03:57.580 |
with a great timing on the value of data in our world. 02:04:01.900 |
- I would say the troubled execution of a good decision. 02:04:08.660 |
we could have done in a way that was a little bit smoother, 02:04:15.020 |
I bought them at the time and I buy them now, right? 02:04:32.940 |
It's like now the most popular language on the planet, 02:04:37.020 |
So the lack of resources meant that they had to really, 02:04:40.300 |
they had to do things in a very hamstrung way. 02:04:47.740 |
the data movement was a critical part of that. 02:05:15.940 |
when NumPy announced that they're going to end support 02:05:23.060 |
- So like when I realized, oh, this is going to end. 02:05:31.660 |
It was like all of these packages were saying, 02:05:34.380 |
okay, we have Python 3 support now, come join us. 02:05:40.940 |
I sort of love and depend on said like, nope, it's over. 02:05:50.340 |
I wonder if you think it was possible much earlier 02:05:53.820 |
for somebody like NumPy or some major package 02:05:58.820 |
to step into the cold and say like, we're ending this. 02:06:01.820 |
- Well, it's a chicken and egg problem too, right? 02:06:09.380 |
So the decisions for the scientific community 02:06:16.060 |
we'll only be releasing new features on Python 3. 02:06:27.900 |
But then for others, yeah, NumPy in particular, 02:06:30.620 |
'cause it's at the base of the dependency stack 02:06:32.620 |
for so many things, that was the final stick. 02:06:38.420 |
if I have to keep maintaining my releases for Python 2, 02:06:49.940 |
So people were also getting kind of pulled by this tension. 02:06:53.300 |
So the overall community sort of had a lot of input 02:07:01.380 |
- So, these numbers are a little bit loose, 02:07:03.980 |
but there are about 10 million Python programmers 02:07:17.220 |
You mentioned in a talk that changes need to be made 02:07:20.540 |
for there to be 100 million Python programmers. 02:07:26.020 |
where there's 100 million Python programmers? 02:07:28.300 |
And second, what kind of changes need to be made? 02:07:34.900 |
So I think the idea that there's only 10 million 02:07:41.980 |
There are a lot of people who escape traditional counting 02:07:44.860 |
that are using Python and data in their jobs. 02:07:48.460 |
I do believe that the future world for it to, 02:07:58.620 |
that let them express their questions and ideas fluidly. 02:08:03.180 |
And the data variety and data complexity will not go down. 02:08:08.340 |
So I think some level of code or code-like things 02:08:19.740 |
that allow people to more seamlessly integrate 02:08:22.900 |
Python kinds of expressivity with data systems 02:08:26.060 |
and operationalization methods that are much more seamless. 02:08:32.380 |
right now you can't punch Python code into an Excel cell. 02:08:35.660 |
I mean, there's some tools you can use to kind of do this. 02:08:35.660 |
We didn't build a thing for doing this back in the day, 02:08:51.180 |
So I think Python has to get better at being embedded, 02:09:32.500 |
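As one example of the "tools that kind of do this": the third-party xlwings package can expose a Python function to Excel as a worksheet function (it requires the xlwings add-in on the Excel side). A minimal sketch; the function name and logic here are invented for illustration.

```python
import xlwings as xw

@xw.func
def clean_mean(values):
    # Callable from a cell as =clean_mean(A1:A10). Depending on the range's
    # shape, xlwings passes a scalar, a list, or a list of lists.
    if not isinstance(values, list):
        values = [values]
    flat = []
    for v in values:
        flat.extend(v if isinstance(v, list) else [v])
    nums = [v for v in flat if isinstance(v, (int, float))]
    return sum(nums) / len(nums) if nums else None
```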
and I'm kind of forcing them or inspiring them 02:09:32.500 |
to learn Python to do a bunch of stuff that helps them. 02:09:43.820 |
I would love it if the tools like Photoshop and Premiere 02:09:46.500 |
and all those kinds of tools that are targeted 02:09:48.740 |
towards creative people, I guess that's where Excel, 02:09:52.100 |
Excel is targeted towards a certain kind of audience 02:10:09.740 |
that I'm hopeful about looking at OpenAI Codex 02:10:20.780 |
from kind of visual interface to generating programs 02:10:28.980 |
but kind of without having to read the manual, 02:10:32.340 |
without having to do a Google search and stack overflow, 02:10:34.900 |
which is essentially what a neural network does 02:10:39.020 |
is actually generating code and allowing a human 02:10:48.860 |
- So that to me is a really exciting possibility 02:10:51.220 |
'cause I think there's a friction to kind of, 02:10:55.980 |
like how do I learn how to use Python in my life? 02:11:01.820 |
when you start a class, you start learning about types. 02:11:01.820 |
Like this is, you know, Python is the first language 02:11:11.980 |
But I feel like that's going to take a long time 02:11:23.860 |
behind programming languages and types and all that, 02:11:35.580 |
So types are there because compiler writers are human 02:11:53.900 |
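Python itself illustrates the point: annotations are optional and can be layered on later for tools rather than for the interpreter. A small sketch (the `list[float]` syntax needs Python 3.9+):

```python
def mean(xs):                  # a beginner can write this with no types at all
    return sum(xs) / len(xs)

def mean_annotated(xs: list[float]) -> float:
    # Identical runtime behavior; the annotations exist for humans,
    # IDEs, and static checkers like mypy, not for the interpreter.
    return sum(xs) / len(xs)

print(mean([1, 2, 3]), mean_annotated([1.0, 2.0, 3.0]))
```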
when I could, when Steve Jobs was still pitching 02:11:59.900 |
They were supposed to not be just media consumption devices, 02:12:03.300 |
but they were actually, you could write some code. 02:12:06.780 |
you could write some stuff to do some things. 02:12:19.500 |
in the generation of youth around Minecraft or Roblox, right? 02:12:28.780 |
of people actually shaping and using their computers 02:12:37.860 |
So you talk about scripting the Adobe suite with Python 02:12:49.540 |
We should better support those kinds of things. 02:12:51.260 |
But ultimately the idea that I should be able 02:12:56.260 |
If I want these things to happen repeatedly all the time, 02:12:59.660 |
I should be able to say that somehow to the computer, right? 02:13:02.660 |
Now, whether the operating systems get there faster 02:13:06.540 |
by having some Siri backed with OpenAI, with whatever. 02:13:06.540 |
So you can just say, "Siri, make this do this, 02:13:15.900 |
There's the AppleScript in the menu that no one ever uses, 02:13:15.900 |
But when you start doing that kind of scripting, 02:13:32.860 |
Like who's got time to learn all that stuff, right? 02:13:55.780 |
and scientific, let's say, expression system, 02:14:01.340 |
which is that it gives us mathematical precision. 02:14:04.380 |
It gives us actually quite a lot of precision 02:14:06.780 |
over precisely what we mean about this data set, 02:14:10.780 |
And it's the fact that we can have that precision 02:14:13.700 |
that lets Python be powerful as a duct tape for data. 02:14:13.700 |
and if you give me some massively expensive vendor tool 02:14:26.460 |
I don't know I'm gonna be able to solve your problem. 02:14:34.420 |
So that ability to take it as sort of this like, 02:14:44.340 |
we're not gonna get away from some of these expressions 02:14:47.780 |
and APIs and libraries in Python for data transformation. 02:14:51.700 |
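The "15 minutes and a few lines of Python" claim is easy to picture with pandas; a hypothetical cleanup, with the file name and column names invented for the sketch.

```python
import pandas as pd

# Hypothetical messy export: inconsistent headers, currency strings, blank rows.
df = pd.read_csv("vendor_export.csv")
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

df["amount"] = (
    df["amount"].astype(str).str.replace("[$,]", "", regex=True).astype(float)
)

monthly = (
    df.dropna(subset=["date"])
      .assign(date=lambda d: pd.to_datetime(d["date"]))
      .groupby(pd.Grouper(key="date", freq="MS"))["amount"]
      .sum()
)
print(monthly)
```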
- You've been at the center of the Python community 02:14:58.420 |
If you could change one thing about the community 02:15:05.580 |
to help it flourish and prosper, what would it be? 02:15:09.500 |
I mean, you know, it doesn't have to be one thing, 02:15:17.980 |
but it's also one of the values in the community 02:15:21.380 |
that it's been breached a little bit in the last few years, 02:15:37.940 |
I don't know how many people in the core Python community 02:15:41.780 |
really understand that they stand perched at the edge 02:15:46.780 |
of an opportunity to transform how people use computers. 02:15:50.340 |
And actually PyCon, I think it was the last physical PyCon 02:15:53.900 |
I went to, Russell Keith-Magee gave a great keynote 02:15:57.740 |
about very much along the lines of the challenges I have, 02:16:01.620 |
which is Python for a language that doesn't actually, 02:16:09.380 |
It's done really well as a language, hasn't it? 02:16:11.820 |
You can't write a web front end with Python, really. 02:16:17.020 |
So for a language that you can't actually write apps 02:16:20.580 |
in any of the front end runtime environments, 02:16:24.340 |
And so that wasn't to pat ourselves on the back. 02:16:28.740 |
That was to challenge ourselves as a community to say, 02:16:36.740 |
you know, we've caught the tiger by the tail. 02:16:38.620 |
How do we make sure we keep up with it as it goes forward? 02:16:48.500 |
Does that humility prevent you from having a vision 02:16:48.500 |
for creating something like very new and powerful? 02:16:55.620 |
- And you've brought us back to consciousness again. 02:16:57.660 |
The collaboration is a swarm emergent dynamic. 02:17:10.700 |
how does that emerge from, you know, billions of neurons? 02:17:13.860 |
So how can you have a swarm of people emerge a consensus 02:17:17.660 |
that has a singular vision to say, we will do this. 02:17:20.620 |
And most importantly, we're not gonna do these things. 02:17:23.860 |
Emerging a coherent, pointed, focused leadership dynamic 02:17:28.860 |
from a collaboration, being able to do that kind of, 02:17:32.060 |
and then dissolve it so people can still do the swarm thing. 02:17:37.220 |
- So do you have to have a charismatic leader? 02:17:39.580 |
For some reason, Linus Torvalds comes to mind, 02:17:39.580 |
There's, every leader is different, I would say, 02:17:56.700 |
So he doesn't, I don't even know if you can say 02:18:01.860 |
There's such a meritocracy of ideas that like, 02:18:19.500 |
obviously that will not stand in an open source community 02:18:25.860 |
by that one particular person is not actually that good. 02:18:30.260 |
So you actually have to be really excellent at what you do. 02:18:39.020 |
where you can get thrown out, people can just leave. 02:18:42.620 |
You know, that's how it works with open source, the fork. 02:18:45.940 |
But at the same time, you want to sometimes be a leader, 02:18:58.500 |
And I didn't, you know, I'm not one of these guys 02:19:01.020 |
"I'm gonna be an entrepreneur and I'm gonna be a leader. 02:19:03.500 |
"And I'm gonna read all these Harvard Business Review 02:19:05.220 |
"articles on leadership and all this other stuff." 02:19:07.740 |
Like I was a physicist turned into a software nerd 02:19:14.860 |
I saw a business opportunity around the use of Python 02:19:16.620 |
for data, but for me, what has been interesting 02:19:16.620 |
is how much I started really enjoying the understanding 02:19:27.140 |
and thinking deeper about organizational dynamics 02:19:30.460 |
And leadership does come down to a few core things. 02:19:49.380 |
- So can you say belief in a singular vision? 02:19:56.980 |
And this is a valid thing to do and we can do it. 02:20:00.020 |
That you have to be able to drive that belief. 02:20:08.780 |
has to help you amplify that belief to more people. 02:20:12.660 |
I mean, I think at a fundamental level, that's what it is. 02:20:20.900 |
or you have to convince people to believe in the vision 02:20:28.180 |
- There's all different flavors of leadership. 02:20:31.420 |
we could talk about Elon Musk and Steve Jobs. 02:20:38.420 |
There's people that kind of put themselves at the center 02:20:42.540 |
And some people are more like consensus builders. 02:20:49.740 |
So you've been a programmer, you've led many programmers 02:20:53.300 |
and are now sort of at the center of this ecosystem. 02:20:55.620 |
What works well in the programming world, would you say? 02:21:09.140 |
the leader has to also be the high priest of values, right? 02:21:28.340 |
then the leader has to live those values unequivocally 02:21:34.380 |
So in our case, in this collaborative community 02:21:45.300 |
You have to walk the walk, not just talk the talk. 02:21:51.140 |
really demands that much from a vision standpoint. 02:21:59.060 |
like so many people use Python from where it comes 02:22:03.420 |
the vision, you know, like you have an Elon Musk 02:22:03.420 |
And it's like, I think a lot of people that work 02:22:36.500 |
Get to where, you know, Python is at the center 02:22:42.300 |
of machine learning and data science, 02:22:42.300 |
Like in many ways, perhaps the Python community 02:23:05.860 |
and the PyData community, they would submit talks. 02:23:15.740 |
because there was the separate sort of PyData conferences. 02:23:21.180 |
And instead there'd be yet another talk about, 02:23:26.400 |
And it's like, that was an interesting dynamic 02:23:34.280 |
and get more people talking about these things. 02:23:40.220 |
but then also came to appreciate that, you know, 02:23:44.480 |
that allows parallel innovation is not bad, right? 02:23:47.400 |
There are people doing embedded Python stuff. 02:23:50.640 |
people doing scripting, there's cyber users of Python. 02:23:55.100 |
if your slime mold covers so much stuff, 02:23:55.100 |
you have to respect that different things are growing 02:24:04.140 |
and the central body has to provide resources. 02:24:11.740 |
to then allocate as they see fit in their niches. 02:24:17.500 |
It's not like they had that many resources to start with. 02:24:21.200 |
- What was or is your favorite programming setup? 02:24:23.960 |
What operating system, what keyboard, how many screens? 02:24:33.000 |
- Tea, sometimes coffee, depending on how well I slept. 02:24:46.000 |
but like a family, you know, you lead a company 02:24:54.240 |
- Yeah, I think I've gotten a little bit better balance. 02:24:56.320 |
I have a really great leadership team now supporting me. 02:24:58.880 |
And so that takes a lot of the day-to-day stuff 02:25:01.680 |
off my plate and my kids are getting a little older. 02:25:09.200 |
that I'm not able to take care of and she's great. 02:25:13.700 |
because I have to get up every morning at six 02:25:22.000 |
Like I go to bed at nine, wake up at like 2 a.m., 02:25:24.600 |
work till five, sleep three hours, wake up at eight. 02:25:31.960 |
I didn't keep it up for years, but once I have travel, 02:25:34.960 |
then it just, everything goes out the window, right? 02:25:37.360 |
'Cause then you're like time zones and all these things. 02:25:40.880 |
were you able to live outside of how you felt? 02:25:47.320 |
that wasn't out hanging out with people or whatever, 02:25:50.680 |
I wake up at two, I'm still responding to their slacks, 02:25:59.720 |
- Right, and then you go to bed for a few hours 02:26:01.720 |
and you wake up, it's like you had an extra day 02:26:05.020 |
- And I'd read somewhere that humans naturally 02:26:06.320 |
have biphasic sleep or something, I don't know. 02:26:14.520 |
I will say that that worked out for me for a while, 02:26:19.380 |
I had a 27 inch high DPI setup that I really liked, 02:26:31.280 |
plus communications, plus various kinds of things. 02:26:42.300 |
- Is that what happens when you become important? 02:26:48.520 |
on the next table over, but I have three desks, right? 02:26:54.960 |
- So main one is the standing desk so that I can, 02:26:58.360 |
I have a teleprompter set up and everything else. 02:26:59.920 |
And then I've got my iMac and then eGPU and then Windows PC. 02:27:04.920 |
The reason I moved to Mac was it's got a Linux prompt 02:27:12.160 |
It's got a Unix prompt so I can do all my stuff. 02:27:18.300 |
like when I'm presenting for clients or investors, whatever, 02:27:20.920 |
like I don't have to worry about any like ACPI-related 02:27:25.120 |
fsck things in the middle of a presentation, 02:27:38.440 |
I feel like a traitor to my community saying this, right? 02:27:48.360 |
- Can I just defend something that nobody respectable 02:27:51.360 |
seems to do, which is, so I dual-boot Linux and Windows, 02:27:55.800 |
but in Windows, I have the Windows Subsystem for Linux 02:28:02.940 |
And I find myself being able to handle everything I need 02:28:06.500 |
and almost everything I need in Linux for basic sort of tasks, 02:28:17.000 |
like they're all on iPhone and a Mac and it's like, yeah. 02:28:33.600 |
Windows Subsystem for Linux is very tempting, 02:28:33.600 |
where I don't know where, and I've been, okay, 02:28:41.000 |
I've used DOS since version 1.11 or 1.21 or something. 02:28:48.320 |
And I will say that like, it's really hard for me to know 02:28:57.560 |
And just things like changing group permissions 02:29:01.360 |
just everything seems a little bit more awkward, 02:29:07.760 |
like hidden attributes and all this other happy stuff 02:29:16.800 |
that'll be very interesting, with the new M1. 02:29:19.040 |
There were some dark years, the last few years 02:29:21.600 |
when I was like, I think maybe I have to move off of Mac 02:29:25.640 |
- But this, I mean, like my keyboard was just not working. 02:29:29.040 |
Like literally my keyboard just wasn't working, right? 02:29:31.160 |
I had this touch bar, didn't have a physical escape button 02:29:37.400 |
- So you use Vim and you have a, what kind of keyboard? 02:29:48.480 |
- Oh no, 'cause I say that because I use a Kinesis 02:29:51.440 |
and I had, you said some dark, you said you had dark moments. 02:29:58.760 |
So I remember sort of flying in a very kind of tight space 02:30:02.960 |
and as I'm working, this is what I do on an airplane. 02:30:06.560 |
I pull out a laptop and on top of the laptop, 02:30:16.160 |
'Cause I'm on Emacs with this Kinesis keyboard 02:30:28.720 |
And like everybody around me is using their iPhone 02:30:36.080 |
you know what, maybe I need to become an adult 02:30:38.440 |
and put the 90s behind me and use like a normal keyboard. 02:31:05.320 |
maybe we need to have a Kinesis support group. 02:31:10.280 |
- I don't know, there's gotta be an IRC channel, man. 02:31:20.600 |
Honestly, the last thing I did was I had written, 02:31:25.880 |
I was working with my son to script some Minecraft stuff. 02:31:29.560 |
That was the last, literally the last code I wrote. 02:31:33.440 |
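He doesn't say which library they used; one common way to script Minecraft from Python is the mcpi package, which talks to the Raspberry Pi edition or to a server running the RaspberryJuice plugin. Purely as a hedged sketch of what that kind of father-and-son scripting can look like:

```python
from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create()            # connects to localhost:4711 by default
mc.postToChat("Hello from Python!")

# Build a small stone platform directly under the player.
pos = mc.player.getTilePos()
mc.setBlocks(pos.x - 2, pos.y - 1, pos.z - 2,
             pos.x + 2, pos.y - 1, pos.z + 2,
             block.STONE.id)
```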
Also I wrote some code to do some cap table evaluation, 02:31:38.240 |
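Cap-table arithmetic is simple enough to sketch in a few lines; every number below is invented, and this ignores real-world wrinkles like option pools and liquidation preferences.

```python
# Toy cap table: ownership before and after a priced round.
founders, employees = 8_000_000, 2_000_000     # existing shares
pre_money, new_money = 20_000_000, 5_000_000   # dollars

pre_shares = founders + employees
price = pre_money / pre_shares                 # implied price per share
investor = new_money / price                   # new shares issued
total = pre_shares + investor

for name, shares in [("founders", founders),
                     ("employees", employees),
                     ("investor", investor)]:
    print(f"{name:9s} {shares / total:6.1%}")  # founders dilute from 80% to 64%
```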
- What advice would you give to a young person, 02:31:47.280 |
- This may be where I get into trouble a little bit. 02:31:53.360 |
We're rapidly entering a time between worlds. 02:31:56.480 |
So we have a world now that's starting to really crumble 02:32:01.880 |
that no longer even pretend to serve the purposes 02:32:05.720 |
We are creating technologies that are hurtling 02:32:08.600 |
billions of people headlong into philosophical crises 02:32:11.440 |
who they don't even know the philosophical operating systems 02:32:15.080 |
And they're heading into a time when that gets vaporized. 02:32:29.680 |
You're going to have to have a pioneer spirit, 02:32:36.320 |
All of human reality around you is the result 02:32:55.080 |
Collapse is non-linear, but it will be managed. 02:32:59.240 |
And so if you are in a particular social caste 02:33:07.280 |
I think it's not kosher to say that about America, 02:33:10.200 |
but America is a very stratified and classist society. 02:33:14.160 |
There's some mobility, but it's really quite classist. 02:33:17.080 |
And in America, unless you're in the upper middle class, 02:33:23.520 |
So it is really, really good to think and understand 02:33:33.880 |
And almost all of the technology being created 02:33:38.000 |
that's consumer-facing is designed to own people, 02:33:41.760 |
to take the full stack of people, to delaminate them, 02:33:41.760 |
And so if you want to be an integral human being, 02:33:54.400 |
and you want to find your own way in the world, 02:33:57.640 |
when you're young would be a great time to spend time 02:34:05.880 |
what it means to build connection with people. 02:34:08.080 |
And so much of the status game, so much of the stuff, 02:34:22.640 |
And this is Jacques Ellul's point to some extent as well. 02:34:22.640 |
That gradient of power is not going to go away. 02:34:38.640 |
So as the world gets more and more technological, 02:34:44.240 |
where people will seize power, economic fortunes, 02:34:48.400 |
and the way they make the people who are left behind 02:34:51.320 |
okay with their lot in life is they create lottery systems. 02:35:00.000 |
of your own being trapped in your own economic sort of zone. 02:35:04.200 |
So avoiding those kinds of things is really important, 02:35:07.920 |
knowing when someone is running game on you basically. 02:35:10.740 |
So these are the things I would tell young people. 02:35:15.880 |
- So after you gave some realism, you sit back, 02:35:38.760 |
what is in human nature you can find meaning. 02:35:47.480 |
- What gives me hope is that we have little tremors now, 02:35:54.640 |
of the fiction of modernity that they've been living in, 02:36:06.400 |
people are burning out on some of the social media stuff. 02:36:21.840 |
and it'll incentivize other competitors to be built. 02:36:32.160 |
capital coming in and saying, look, you own a network, 02:36:36.640 |
give me some exponential dynamics out of this network. 02:36:39.840 |
You're gonna just basically put a toll keeper 02:36:41.460 |
at every single node and every single graph edge, 02:37:03.000 |
So the greatest and biggest social network in the world 02:37:10.560 |
The issue with the social media, as we call it now, 02:37:13.200 |
is they're actually just new amplification systems, right? 02:37:16.600 |
Now it's benefit of certain people like yourself 02:37:18.640 |
who have interesting content to be amplified. 02:37:23.120 |
So it's created a greater economy and that's cool. 02:37:26.800 |
but giving everyone a shot at the fame lottery, 02:37:31.480 |
if you wiggle your butt the right way on TikTok, 02:37:40.200 |
that help people be conscientious about their attention, 02:37:49.600 |
not calling, but processing and thinking about that, 02:38:02.360 |
is that these early shocks of COVID lockdowns 02:38:08.120 |
and remote work and all these different kinds of things, 02:38:14.000 |
where they're sort of no longer in the reverie, right? 02:38:21.040 |
there's more people with ears to hear now, right? 02:38:23.680 |
With pandemic and education, everyone's like, wait, wait, 02:38:30.120 |
What is this crap you're giving them as homework, right? 02:38:33.960 |
that are getting in the supply chain disruptions, 02:38:44.840 |
it's still gonna take a while for these things, 02:38:50.540 |
and to be in right relationship with each other 02:38:55.880 |
So the message of hope is still people are resilient 02:38:58.420 |
and we are building some really amazing technology. 02:39:01.400 |
- And I also, like to me, I derive a lot of hope 02:39:08.160 |
The power of a single individual to transform the world, 02:39:12.000 |
to do positive things for the world is quite incredible. 02:39:16.000 |
it's nice to have as many of those individuals as possible, 02:39:18.840 |
but even the power of one, it's kind of magical. 02:39:29.020 |
and then spamming you with all this philosophical stuff 02:39:34.800 |
trying to put words around the current technological, 02:39:44.600 |
I think there has been a lot of great content 02:39:46.360 |
produced around this stuff for people who wanna see, 02:39:49.560 |
wanna find out more or think more about this. 02:39:52.120 |
We're popularizing certain kinds of philosophical ideas 02:39:56.560 |
oh, you're communist, oh, you're capitalist kind of stuff. 02:40:01.200 |
So that also gives me hope that I feel like I myself 02:40:04.620 |
am getting a handle on how to think about these things. 02:40:07.320 |
It makes me feel like I can hopefully affect, 02:40:12.540 |
- We've been sneaking up on this question all over the place. 02:40:29.700 |
I mean, I've never really understood that question. 02:40:53.400 |
or maybe you have created something beautiful, 02:41:01.720 |
So some of that is just chemicals coming together 02:41:12.620 |
that's providing meaning in all kinds of ways to its members. 02:41:35.200 |
of meaning for people, like innovation of different kinds. 02:41:46.500 |
But you were a physicist, so there's a desire to say, 02:41:51.840 |
okay, yeah, but these seem to be like symptoms 02:42:00.960 |
Why are we reaching for order when there is excess of energy? 02:42:09.360 |
Any why that I come up with, I think, is gonna be, 02:42:18.220 |
We do look at the world through a traditional, 02:42:22.280 |
I think most people look at the world through 02:42:26.040 |
kind of metaphysical lens, that we have our own subjectivity 02:42:29.600 |
and then there's all of these object things that are not us. 02:42:34.220 |
So I'm me and these things are not me, right? 02:42:37.240 |
And I'm interacting with them, I'm doing things to them. 02:42:39.880 |
But a different view of the world that looks at it 02:42:45.520 |
oh, I'm really quite embedded in a soup of other things. 02:42:50.520 |
And I'm simply almost like a standing wave pattern 02:43:07.240 |
that everything that we touch with our hands, 02:43:25.060 |
and all this other kind of stuff and quantum energy stuff, 02:43:44.220 |
a part of you, a bit of your life force that goes into it. 02:43:48.500 |
Okay, now this is of course completely mumbo jumbo stuff. 02:43:51.500 |
This is not like, I don't actually think this is real, 02:43:57.740 |
What if there actually was some quantum magnetic crystal 02:44:01.760 |
and energy field thing that just by touching this can, 02:44:09.080 |
And it's not much unless you put a lot into it 02:44:11.980 |
and you touch it all the time, like your phone, right? 02:44:19.680 |
but what if there's something that technical objects, 02:44:25.300 |
It does not really receive attention or intimacy 02:44:29.220 |
and then allow itself to be transformed by it. 02:44:33.420 |
if it's the handle of a knife that your mother used 02:44:57.000 |
So if you walk that thought experiment through, 02:45:02.280 |
And the reason this ties into my answer for your question 02:45:15.720 |
hypothesize such a thing, it could be that the purpose 02:45:27.580 |
- That's a beautiful answer and a beautiful way to end it. 02:45:39.220 |
in the space of engineering and in the space of philosophy. 02:45:44.780 |
I'm really proud to be living in the same city as you. 02:45:51.120 |
that you would spend your valuable time with me today. 02:45:53.840 |
I appreciate the opportunity to speak with you. 02:46:00.640 |
please check out our sponsors in the description. 02:46:17.060 |
"there exists a third thing, malicious incompetence. 02:46:27.940 |
Thank you for listening and hope to see you next time.