
Peter Wang: Python and the Source Code of Humans, Computers, and Reality | Lex Fridman Podcast #250


Chapters

0:00 Introduction
0:33 Python
4:04 Programming language design
24:07 Virtuality
34:07 Human layers
41:05 Life
46:29 Origin of ideas
49:01 Eric Weinstein
54:00 Human source code
57:58 Love
72:16 AI
85:39 Meaning crisis
108:12 Travis Oliphant
114:38 Python continued
144:21 Best setup
151:39 Advice for the youth
160:12 Meaning of Life

Whisper Transcript

00:00:00.000 | The following is a conversation with Peter Wang,
00:00:02.400 | one of the most impactful leaders and developers
00:00:04.640 | in the Python community,
00:00:06.320 | former physicist, current philosopher,
00:00:09.040 | and someone who many people told me about
00:00:11.640 | and praised as a truly special mind
00:00:14.180 | that I absolutely should talk to.
00:00:16.480 | Recommendations ranging from Travis Oliphant
00:00:19.440 | to Eric Weinstein.
00:00:20.920 | So here we are.
00:00:23.320 | This is the Lex Fridman Podcast.
00:00:25.600 | To support it, please check out our sponsors
00:00:27.680 | in the description.
00:00:28.800 | And now, here's my conversation with Peter Wang.
00:00:32.400 | You're one of the most impactful humans
00:00:35.640 | in the Python ecosystem.
00:00:37.120 | (laughs)
00:00:38.240 | So you're an engineer, leader of engineers,
00:00:40.840 | but you're also a philosopher.
00:00:42.880 | So let's talk both in this conversation
00:00:45.120 | about programming and philosophy.
00:00:47.280 | First, programming.
00:00:48.960 | What to you is the best
00:00:51.120 | or maybe the most beautiful feature of Python,
00:00:54.040 | or maybe the thing that made you fall in love
00:00:56.160 | or stay in love with Python?
00:00:59.000 | - Well, those are three different things.
00:01:00.680 | What I think is the most beautiful,
00:01:01.960 | what made me fall in love with it, what made me stay in love.
00:01:03.960 | When I first started using it
00:01:05.760 | was when I was a C++ computer graphics performance nerd.
00:01:10.040 | - In the '90s?
00:01:10.880 | - Yeah, in late '90s.
00:01:12.060 | And that was my first job out of college.
00:01:14.120 | And we kept trying to do more and more like abstract
00:01:18.680 | and higher order programming in C++,
00:01:20.520 | which at the time was quite difficult with templates.
00:01:24.160 | The compiler support wasn't great, et cetera.
00:01:26.560 | So when I started playing around with Python,
00:01:28.720 | that was my first time encountering
00:01:30.480 | really first-class support for types,
00:01:32.440 | for functions and things like that.
00:01:34.320 | And it felt so incredibly expressive.
00:01:37.200 | So that was what kind of made me fall in love
00:01:39.120 | with it a little bit.
00:01:39.960 | And also, once you spend a lot of time
00:01:42.280 | in a C++ dev environment,
00:01:44.160 | the ability to just whip something together
00:01:46.200 | that basically runs and works the first time is amazing.
00:01:49.680 | So really productive scripting language.
00:01:51.960 | I mean, I knew Perl, I knew Bash.
00:01:53.680 | I was decent at both, but Python just made everything,
00:01:57.160 | it made the whole world accessible, right?
00:01:59.680 | I could script this and that and the other network things,
00:02:02.600 | you know, little hard drive utilities.
00:02:04.000 | I could write all of these things
00:02:05.080 | in the space of an afternoon.
00:02:06.560 | And that was really, really cool.
00:02:07.640 | So that's what made me fall in love.
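To make the "first-class functions" and quick-scripting points concrete, here is a minimal Python sketch; the functions and the directory-listing task are illustrative examples, not anything discussed in the conversation:

```python
import os

# Functions are first-class objects: they can be stored in containers,
# passed as arguments, and returned from other functions.
def shout(text: str) -> str:
    return text.upper() + "!"

def whisper(text: str) -> str:
    return text.lower() + "..."

handlers = {"loud": shout, "quiet": whisper}
print(handlers["loud"]("hello"))   # -> HELLO!

# The "whip it together in an afternoon" style: a tiny script that lists
# files in the current directory from largest to smallest.
sizes = {name: os.path.getsize(name)
         for name in os.listdir(".") if os.path.isfile(name)}
for name, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{size:>10}  {name}")
```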
00:02:08.640 | - Is there something specific you could put your finger on
00:02:11.560 | that you're not programming in Perl today?
00:02:14.080 | Like why Python for scripting?
00:02:17.120 | - I think there's not a specific thing
00:02:19.640 | as much as the design motif of both the creator
00:02:22.800 | of the language and the core group of people
00:02:25.320 | that built the standard library around him.
00:02:27.480 | There was definitely, there was a taste to it.
00:02:32.200 | I mean, Steve Jobs, you know, used that term,
00:02:34.120 | you know, in somewhat of an arrogant way,
00:02:35.720 | but I think it's a real thing that it was designed to fit.
00:02:39.240 | A friend of mine actually expressed this really well.
00:02:40.920 | He said, "Python just fits in my head."
00:02:43.000 | And there's nothing better to say than that.
00:02:45.240 | Now, people might argue modern Python,
00:02:47.920 | there's a lot more complexity,
00:02:49.280 | but certainly version 1.5.2,
00:02:51.800 | which I think was my first version,
00:02:53.360 | fit in my head very easily.
00:02:54.800 | So that's what made me fall in love with it.
00:02:56.560 | - Okay, so the most beautiful feature of Python
00:03:01.400 | that made you stay in love.
00:03:03.040 | It's like over the years, what has like, you know,
00:03:06.560 | you do a double take and you return to often
00:03:09.520 | as a thing that just brings you a smile.
00:03:11.560 | - I really still like the ability to play with metaclasses
00:03:16.560 | and express higher-order things.
00:03:19.000 | When I have to create some new object model
00:03:22.000 | to model something, right?
00:03:23.360 | It's easy for me,
00:03:24.400 | 'cause I'm pretty expert as a Python programmer.
00:03:27.080 | I can easily put all sorts of lovely things together
00:03:29.800 | and use properties and decorators
00:03:31.600 | and other kinds of things
00:03:32.920 | and create something that feels very nice.
00:03:34.680 | So that to me, I would say that's tied with the NumPy
00:03:38.560 | and vectorization capabilities.
00:03:40.640 | I love thinking in terms of the matrices and the vectors
00:03:43.800 | and these kind of data structures.
00:03:46.080 | So I would say those two are kind of tied for me.
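As a concrete aside, here is a minimal sketch of the kind of object model he describes building with properties and decorators; the Temperature class and the logged decorator are invented purely for illustration:

```python
import functools

def logged(func):
    """Decorator: print a message each time the wrapped method is called."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

class Temperature:
    def __init__(self, celsius: float):
        self._celsius = celsius

    @property
    def fahrenheit(self) -> float:
        # A computed attribute: reads like a field, runs code on access.
        return self._celsius * 9 / 5 + 32

    @logged
    def warm_up(self, delta: float) -> None:
        self._celsius += delta

t = Temperature(20.0)
print(t.fahrenheit)   # 68.0
t.warm_up(5)          # prints "calling warm_up"
print(t.fahrenheit)   # 77.0
```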
00:03:49.400 | - So the elegance of the NumPy data structure,
00:03:52.720 | like slicing through the different multi-dimensional--
00:03:54.800 | - Yeah, there's just enough things there.
00:03:56.200 | It's like a very, it's a very simple, comfortable tool.
00:04:00.040 | Just it's easy to reason about what it does
00:04:02.800 | when you don't stray too far afield.
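For readers who have not used NumPy, a small illustration of the slicing and vectorization being referred to (the array contents are arbitrary):

```python
import numpy as np

a = np.arange(12).reshape(3, 4)   # a 3x4 matrix: rows 0-2, columns 0-3

print(a[1, :])       # second row        -> [4 5 6 7]
print(a[:, 2])       # third column      -> [ 2  6 10]
print(a[:2, 1:3])    # 2x2 sub-block of rows 0-1, columns 1-2
print((a * 2.5).mean(axis=0))   # scale and take column means, no explicit loop
```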
00:04:05.000 | - Can you put your finger on how to design a language
00:04:10.000 | such that it fits in your head?
00:04:11.880 | Certain things like the colon
00:04:14.040 | or the certain notation aspects of Python
00:04:17.200 | that just kind of work.
00:04:18.680 | Is it something you have to kind of write out on paper,
00:04:22.240 | look and say, it's just right?
00:04:24.680 | Is it a taste thing or is there a systematic process?
00:04:27.640 | What's your sense?
00:04:28.840 | - I think it's more of a taste thing.
00:04:31.560 | But one thing that should be said
00:04:33.600 | is that you have to pick your audience, right?
00:04:36.400 | So the better defined the user audience is
00:04:39.200 | or the users are,
00:04:40.440 | the easier it is to build something
00:04:42.320 | that fits in their minds
00:04:43.840 | because their needs will be more compact and coherent.
00:04:47.280 | It is possible to find a projection, right?
00:04:49.200 | A compact projection for their needs.
00:04:50.960 | The more diverse the user base, the harder that is.
00:04:54.520 | And so as Python has grown in popularity,
00:04:57.200 | that's also naturally created more complexity
00:05:00.160 | as people try to design any given thing.
00:05:01.840 | There'll be multiple valid opinions
00:05:04.200 | about a particular design approach.
00:05:06.320 | And so I do think that's the downside of popularity.
00:05:10.280 | It's almost an intrinsic aspect
00:05:11.520 | of the complexity of the problem.
00:05:13.120 | - Well, at the very beginning,
00:05:14.480 | aren't you an audience of one?
00:05:16.240 | Isn't ultimately,
00:05:17.480 | aren't all the greatest projects in history
00:05:19.520 | were just solving a problem that you yourself had?
00:05:21.880 | - Well, so Clay Shirky in his book on crowdsourcing
00:05:25.440 | or in his kind of thoughts on crowdsourcing,
00:05:27.560 | he identifies the first step of crowdsourcing
00:05:29.600 | is me first collaboration.
00:05:31.520 | You first have to make something
00:05:32.600 | that works well for yourself.
00:05:34.320 | It's very telling that when you look at all of the impactful
00:05:37.760 | big projects, well, they're fundamental projects now
00:05:40.040 | in the SciPy and PyData ecosystem,
00:05:42.600 | they all started with the people in the domain
00:05:46.760 | trying to scratch their own itch.
00:05:48.240 | And the whole idea of scratching your own itch
00:05:49.760 | is something that the open source
00:05:51.360 | or the free software world has known for a long time.
00:05:53.680 | But in the scientific computing areas,
00:05:56.840 | these are assistant professors
00:05:58.320 | or electrical engineering grad students.
00:06:00.560 | They didn't have really a lot of programming skill
00:06:03.040 | necessarily, but Python was just good enough
00:06:05.560 | for them to put something together
00:06:06.800 | that fit in their domain, right?
00:06:09.440 | So it's almost like a
00:06:12.440 | necessity-is-the-mother-of-invention aspect.
00:06:13.960 | And also it was a really harsh filter
00:06:16.880 | for utility and compactness and expressiveness.
00:06:20.880 | If it was too hard to use,
00:06:22.400 | then they wouldn't have built it
00:06:23.360 | 'cause there was just too much trouble, right?
00:06:24.960 | It was a side project for them.
00:06:26.160 | - And also necessity creates a kind of deadline.
00:06:28.120 | It seems like a lot of these projects
00:06:29.440 | are quickly thrown together in the first step.
00:06:32.320 | And that, even though it's flawed,
00:06:34.520 | that just seems to work well for software projects.
00:06:38.840 | - Well, it does work well for software projects in general.
00:06:41.280 | And in this particular space,
00:06:43.280 | one of my colleagues, Stan Seibert, identified this,
00:06:46.320 | that all the projects in the SciPy ecosystem,
00:06:49.040 | if we just rattle them off, there's NumPy, there's SciPy,
00:06:53.040 | built by different collaborations of people,
00:06:55.000 | although Travis is the heart of both of them.
00:06:57.680 | But NumPy coming from Numeric and Numarray,
00:06:59.360 | these are different people.
00:07:00.640 | And then you've got Pandas, you've got Jupyter or IPython.
00:07:04.480 | There's Matplotlib.
00:07:06.880 | There's just so many others, I'm not gonna do justice
00:07:09.680 | if I try to name them all,
00:07:10.720 | but all of them are actually different people.
00:07:12.800 | And as they rolled out their projects,
00:07:15.280 | the fact that they had limited resources
00:07:17.560 | meant that they were humble about scope.
00:07:19.560 | A great famous hacker, Jamie Zawinski,
00:07:23.560 | once said that every geek's dream
00:07:26.040 | is to build the ultimate middleware, right?
00:07:29.280 | And the thing is with these scientists turned programmers,
00:07:32.280 | they had no such thing.
00:07:33.120 | They were just trying to write something
00:07:34.800 | that was a little bit better for what they needed
00:07:36.560 | than MATLAB, and they were gonna leverage
00:07:38.560 | what everyone else had built.
00:07:39.960 | So naturally, almost in kind of this annealing process
00:07:42.680 | or whatever, we built a very modular cover
00:07:46.520 | of the basic needs of a scientific computing library.
00:07:50.320 | - If you look at the whole human story,
00:07:51.960 | how much of a leap is it?
00:07:53.720 | We've developed all kinds of languages,
00:07:55.520 | all kinds of methodologies for communication,
00:07:57.760 | and it just kind of like grew this collective intelligence,
00:08:00.600 | the civilization grew, it expanded,
00:08:02.880 | wrote a bunch of books, and now we tweet.
00:08:06.460 | How big of a leap is programming
00:08:08.660 | if programming is yet another language?
00:08:10.820 | Is it just a nice little trick
00:08:12.860 | that's temporary in our human history,
00:08:14.980 | or is it like a big leap in the,
00:08:18.780 | almost us becoming another organism
00:08:23.100 | at a higher level of abstraction, something else?
00:08:26.180 | - I think the act of programming
00:08:28.260 | or using grammatical constructions
00:08:32.380 | of some underlying primitives,
00:08:34.940 | that is something that humans do learn,
00:08:37.520 | but every human learns this.
00:08:38.860 | Anyone who can speak learns how to do this.
00:08:41.160 | What makes programming different
00:08:42.420 | has been that up to this point,
00:08:44.860 | when we try to give instructions to computing systems,
00:08:49.000 | all of our computers, well, actually, this is not quite true,
00:08:51.580 | but I'll first say it,
00:08:53.080 | and then I'll tell you why it's not true.
00:08:54.980 | But for the most part, we can think of computers
00:08:56.800 | as being these iterated systems.
00:08:58.900 | So when we program, we're giving very precise instructions
00:09:02.740 | to iterated systems that then run at incomprehensible speed
00:09:07.380 | and run those instructions.
00:09:08.820 | In my experience, some people are just better equipped
00:09:12.700 | to model systematic iterated systems,
00:09:16.860 | well, whatever, iterated systems in their head.
00:09:19.220 | Some people are really good at that,
00:09:21.740 | and other people are not.
00:09:23.780 | And so when you have, like, for instance,
00:09:26.060 | sometimes people have tried to build systems
00:09:27.640 | that make programming easier
00:09:29.300 | by making it visual drag and drop.
00:09:31.260 | And the issue is you can have a drag and drop thing,
00:09:33.700 | but once you start having to iterate the system
00:09:35.220 | with conditional logic, handling case statements
00:09:37.140 | and branch statements and all these other things,
00:09:39.540 | the visual drag and drop part doesn't save you anything.
00:09:42.080 | You still have to reason about this giant iterated system
00:09:44.720 | with all these different conditions around it.
00:09:46.420 | That's the hard part, right?
00:09:48.220 | So handling iterated logic, that's the hard part.
00:09:52.300 | The languages we use then emerge to give us ability
00:09:55.360 | and capability over these things.
00:09:57.300 | Now, the one exception to this rule, of course,
00:09:58.900 | is the most popular programming system in the world,
00:10:00.700 | which is Excel, which is a data flow
00:10:03.300 | and a data-driven, immediate mode,
00:10:05.420 | data transformation-oriented programming system.
00:10:08.380 | And it's actually not an accident
00:10:10.400 | that that system is the most popular programming system
00:10:12.840 | because it's so accessible
00:10:14.420 | to a much broader group of people.
00:10:16.940 | I do think as we build future computing systems,
00:10:21.260 | you're actually already seeing this a little bit,
00:10:22.900 | it's much more about composition of modular blocks.
00:10:25.660 | They themselves actually maintain all their internal state,
00:10:29.620 | and the interfaces between them
00:10:31.060 | are well-defined data schemas.
00:10:32.980 | And so to stitch these things together using like IFTTT
00:10:35.900 | or Zapier or any of these kind of,
00:10:38.780 | I would say compositional scripting kinds of things,
00:10:41.980 | I mean, HyperCard was also a little bit in this vein.
00:10:45.020 | That's much more accessible to most people.
00:10:47.620 | It's really that implicit state
00:10:49.980 | that's so hard for people to track.
00:10:52.060 | - Yeah, okay, so that's modular stuff,
00:10:53.460 | but there's also an aspect
00:10:54.460 | where you're standing on the shoulders of giants.
00:10:55.940 | So you're building like higher and higher levels
00:10:58.740 | of abstraction, but you do that a little bit with language.
00:11:02.180 | So with language, you develop sort of ideas,
00:11:05.180 | philosophies from Plato and so on.
00:11:07.580 | And then you kind of leverage those philosophies
00:11:09.540 | as you try to have deeper and deeper conversations.
00:11:12.660 | But with programming,
00:11:13.540 | it seems like you can build much more complicated systems.
00:11:17.060 | Like without knowing how everything works,
00:11:18.940 | you can build on top of the work of others.
00:11:21.340 | And it seems like you're developing
00:11:22.860 | more and more sophisticated expressions,
00:11:27.860 | ability to express ideas in a computational space.
00:11:31.300 | - I think it's worth pondering the difference here
00:11:35.020 | between complexity and complication.
00:11:38.620 | Okay, right? - Back to Excel.
00:11:42.220 | - Well, not quite back to Excel,
00:11:43.260 | but the idea is when we have a human conversation,
00:11:47.540 | all languages for humans emerged
00:11:51.220 | to support human relational communications,
00:11:55.540 | which is that the person we're communicating with
00:11:57.700 | is a person and they would communicate back to us.
00:12:01.220 | And so we sort of hit a resonance point, right?
00:12:05.620 | When we actually agree on some concepts.
00:12:07.460 | So there's a messiness to it and there's a fluidity to it.
00:12:10.260 | With computing systems,
00:12:11.860 | when we express something to the computer and it's wrong,
00:12:14.220 | we just try again.
00:12:15.380 | So we can basically live many virtual worlds
00:12:17.500 | of having failed at expressing ourselves to the computer
00:12:20.180 | until the one time we expressed ourselves right.
00:12:22.820 | Then we kind of put in production
00:12:23.940 | and then discover that it's still wrong,
00:12:25.900 | a few days down the road.
00:12:27.140 | So I think the sophistication of the things
00:12:30.460 | that we build with computing,
00:12:32.460 | one has to really pay attention to the difference
00:12:35.580 | between when an end user is expressing something
00:12:38.180 | onto a system that exists
00:12:39.940 | versus when they're extending the system
00:12:42.460 | to increase the system's capability
00:12:45.420 | for someone else to then interface with.
00:12:47.420 | And we happen to use the same language
00:12:48.620 | for both of those things.
00:12:49.500 | And in most cases, but it doesn't have to be that.
00:12:52.060 | And Excel is actually a great example of this,
00:12:54.340 | of kind of a counterpoint to that.
00:12:56.220 | - Okay, so what about the idea of, you said messiness.
00:13:01.220 | Wouldn't you put the software 2.0 idea,
00:13:06.220 | this idea of machine learning
00:13:07.740 | into the further and further steps
00:13:12.500 | into the world of messiness.
00:13:14.500 | The same kind of beautiful messiness
00:13:16.060 | of human communication.
00:13:17.580 | Isn't that what machine learning is?
00:13:19.100 | Is building on levels of abstraction
00:13:23.540 | that don't have messiness in them,
00:13:25.500 | that at the operating system level,
00:13:27.380 | then there's Python, the programming languages
00:13:29.380 | that have more and more power.
00:13:30.860 | But then finally there's neural networks
00:13:34.580 | that ultimately work with data.
00:13:38.140 | And so the programming is almost in the space of data
00:13:40.540 | and the data is allowed to be messy.
00:13:42.420 | Isn't that a kind of program?
00:13:43.860 | So the idea of software 2.0
00:13:45.900 | is a lot of the programming happens in the space of data.
00:13:50.900 | So back to Excel, all roads lead back to Excel.
00:13:54.060 | In the space of data and also the hyperparameters
00:13:56.540 | of the neural networks.
00:13:57.380 | And all of those allow the same kind of messiness
00:14:02.220 | that human communication allows.
00:14:04.380 | - It does, but my background is in physics.
00:14:07.780 | I took like two CS courses in college.
00:14:09.940 | So I don't have, now I did cram a bunch of CS in prep
00:14:13.820 | when I applied for grad school,
00:14:15.500 | but still I don't have a formal background
00:14:18.140 | in computer science.
00:14:19.740 | But what I have observed in studying programming languages
00:14:22.500 | and programming systems and things like that
00:14:24.980 | is that there seems to be this triangle.
00:14:27.340 | It's one of these beautiful little iron triangles
00:14:30.420 | that you find in life sometimes.
00:14:32.060 | And it's the connection between the code correctness
00:14:35.620 | and kind of expressiveness of code,
00:14:37.420 | the semantics of the data,
00:14:39.900 | and then the kind of correctness or parameters
00:14:42.500 | of the underlying hardware compute system.
00:14:44.940 | So there's the algorithms that you wanna apply.
00:14:48.460 | There's what the bits that are stored
00:14:51.420 | on whatever media actually represent.
00:14:53.340 | So the semantics of the data within the representation,
00:14:56.940 | and then there's what the computer can actually do.
00:14:59.740 | And every programming system, every information system
00:15:02.860 | ultimately finds some spot
00:15:05.020 | in the middle of this little triangle.
00:15:07.420 | Sometimes some systems collapse them into just one edge.
00:15:11.100 | - Are we including humans as a system in this?
00:15:13.500 | - No, no, I'm just thinking about computing systems here.
00:15:15.540 | - Okay. - Okay?
00:15:16.380 | And the reason I bring this up is because
00:15:17.820 | I believe there's no free lunch around this stuff.
00:15:20.100 | So if we build machine learning systems
00:15:22.980 | to sort of write the correct code
00:15:25.500 | that is at a certain level of performance,
00:15:27.140 | so it'll sort of select, right?
00:15:28.900 | With the hyperparameters, we can tune
00:15:31.100 | kind of how we want the performance boundary
00:15:32.860 | and SLA to look like
00:15:34.980 | for transforming some set of inputs
00:15:37.300 | into certain kinds of outputs.
00:15:39.580 | That training process itself is intrinsically sensitive
00:15:43.660 | to the kinds of inputs we put into it.
00:15:45.580 | It's quite sensitive to the boundary conditions
00:15:47.940 | we put around the performance.
00:15:49.340 | So I think even as we move to using automated systems
00:15:52.180 | to build this transformation,
00:15:53.460 | as opposed to humans explicitly
00:15:55.460 | from a top-down perspective,
00:15:56.980 | figuring out, well, this schema and this database
00:15:59.300 | and these columns get selected for this algorithm.
00:16:01.660 | And here we put a Fibonacci heap for some other thing.
00:16:04.860 | Human design or computer design,
00:16:06.780 | ultimately what we hit,
00:16:08.140 | the boundaries that we hit with these information systems
00:16:10.620 | is when the representation of the data hits the real world
00:16:14.220 | is where there's a lot of slop and a lot of interpretation.
00:16:17.540 | And that's where actually I think
00:16:18.940 | a lot of the work will go in the future
00:16:20.860 | is actually understanding kind of how to better
00:16:23.340 | in the view of these live data systems,
00:16:26.220 | how to better encode the semantics of the world
00:16:29.300 | for those things.
00:16:30.140 | There'll be less of the details
00:16:31.060 | of how we write a particular SQL query.
00:16:33.460 | - Okay, but given the semantics of the real world
00:16:35.500 | and the messiness of that,
00:16:36.900 | what does the word correctness mean
00:16:38.620 | when you're talking about code?
00:16:40.780 | - There's a lot of dimensions to correctness.
00:16:42.940 | Historically, and this is one of the reasons I say
00:16:45.140 | that we're coming to the end of the era of software,
00:16:47.620 | 'cause for the last 40 years or so,
00:16:49.820 | software correctness was really defined
00:16:52.860 | about functional correctness.
00:16:54.820 | I write a function, it's got some inputs,
00:16:56.340 | does it produce the right outputs?
00:16:57.820 | If so, then I can turn it on,
00:16:59.220 | hook it up to the live database and it goes.
00:17:01.460 | And more and more now we have,
00:17:03.180 | I mean, in fact, I think the bright line in the sand
00:17:05.100 | between machine learning systems
00:17:06.740 | or modern data-driven systems
00:17:08.140 | versus classical software systems
00:17:10.780 | is that the values of the input
00:17:14.260 | actually have to be considered with the function together
00:17:17.380 | to say this whole thing is correct or not.
00:17:19.420 | And usually there's a performance SLA as well.
00:17:21.780 | Like, did it actually finish making this-
00:17:23.340 | - What's SLA?
00:17:24.180 | - Sorry, service level agreement.
00:17:25.380 | So it has to return within some time.
00:17:27.100 | You have a 10 millisecond time budget
00:17:29.180 | to return a prediction of this level of accuracy, right?
00:17:32.700 | So these are things that were not traditionally
00:17:35.020 | in most business computing systems for the last 20 years
00:17:37.580 | at all, people didn't think about it.
00:17:39.420 | But now we have value dependence on functional correctness.
00:17:42.740 | So that question of correctness
00:17:44.140 | is becoming a bigger and bigger question.
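To sketch what this value-plus-performance notion of correctness can look like in code, here is a toy example; the 10 millisecond budget, the stand-in model, and the input-range check are all invented for illustration rather than taken from the conversation:

```python
import time

LATENCY_BUDGET_S = 0.010   # hypothetical 10 ms service-level budget

def predict(x: float) -> float:
    # Stand-in for a real model; just a cheap deterministic function here.
    return 0.5 * x + 1.0

def check_request(x: float) -> dict:
    """Judge one request on both the value level and the latency budget."""
    start = time.perf_counter()
    y = predict(x)
    elapsed = time.perf_counter() - start
    return {
        "output": y,
        "within_latency_budget": elapsed <= LATENCY_BUDGET_S,
        "input_in_expected_range": -100.0 <= x <= 100.0,  # value-dependent check
    }

print(check_request(3.0))
```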
00:17:45.740 | - How does that map to the end of software?
00:17:48.180 | - We've thought about software as just this thing
00:17:50.500 | that you can do in isolation with some test trial inputs
00:17:54.780 | and in a very sort of sandboxed environment.
00:17:58.660 | And we can quantify how does it scale?
00:18:00.660 | How does it perform?
00:18:02.060 | How many nodes do we need to allocate
00:18:03.260 | if we wanna scale this many inputs?
00:18:05.140 | When we start turning this stuff into prediction systems,
00:18:08.380 | real cybernetic systems,
00:18:10.140 | you're going to find scenarios where you get inputs
00:18:12.340 | that you're gonna wanna spend
00:18:13.180 | a little more time thinking about.
00:18:14.580 | You're gonna find inputs that are not,
00:18:15.860 | it's not clear what you should do, right?
00:18:17.460 | So then the software has a varying amount of runtime
00:18:20.180 | and correctness with regard to input.
00:18:22.260 | And that is a different kind of system altogether.
00:18:23.980 | Now it's a full-on cybernetic system.
00:18:25.980 | It's a next generation information system
00:18:27.620 | that is not like traditional software systems.
00:18:30.180 | - Can you maybe describe what is a cybernetic system?
00:18:33.140 | Do you include humans in that picture?
00:18:35.060 | So is it human in the loop kind of complex mess
00:18:38.900 | of the whole kind of interactivity of software
00:18:41.500 | with the real world, or is it something more concrete?
00:18:44.780 | - Well, when I say cybernetic,
00:18:45.740 | I really do mean that the software itself
00:18:47.700 | is closing the observe, orient, decide, act loop by itself.
00:18:51.620 | So humans being out of the loop is, in fact,
00:18:54.260 | what for me makes it a cybernetic system.
00:18:58.740 | - Humans are out of that loop.
00:19:00.900 | - When humans are out of the loop,
00:19:01.940 | when the machine is actually sort of deciding on its own
00:19:05.220 | what it should do next to get more information,
00:19:07.660 | that makes it a cybernetic system.
00:19:09.740 | So we're just at the dawn of this, right?
00:19:11.380 | I think everyone's talking about ML and AI, it's great,
00:19:15.380 | but really the thing we should be talking about
00:19:16.860 | is when we really enter the cybernetic era
00:19:20.380 | and all of the questions of ethics and governance
00:19:22.540 | and correctness and all these things,
00:19:24.620 | they really are the most important questions.
00:19:27.180 | - Okay, can we just linger on this?
00:19:28.580 | What does it mean for the human to be out of the loop
00:19:30.620 | in a cybernetic system?
00:19:31.780 | Because isn't the cybernetic system
00:19:34.100 | that's ultimately accomplishing some kind of purpose
00:19:37.420 | that at the bottom, the turtles all the way down,
00:19:41.820 | at the bottom, turtle's a human.
00:19:44.140 | - Well, the human may have set some criteria,
00:19:45.900 | but the human wasn't precise.
00:19:47.260 | So for instance, I just read the other day
00:19:49.260 | that earlier this year,
00:19:51.340 | or maybe it was last year at some point,
00:19:52.500 | the Libyan army, I think,
00:19:55.220 | sent out some automated killer drones with explosives.
00:19:58.220 | And there was no human in the loop at that point.
00:19:59.980 | They basically put them in a geo-fenced area,
00:20:02.220 | said find any moving target,
00:20:03.420 | like a truck or vehicle that looks like this, and boom.
00:20:06.980 | That's not a human in the loop, right?
00:20:09.300 | - So increasingly, the less human there is in the loop,
00:20:12.700 | the more concerned you are about these kinds of systems,
00:20:15.460 | because there are unintended consequences
00:20:18.340 | that the original designer and engineer of the system
00:20:21.940 | is less able to predict, even one with good intent
00:20:24.980 | is less able to predict the consequences of such a system.
00:20:27.340 | Is that- - That's right.
00:20:28.620 | There are some software systems, right,
00:20:30.140 | that run without humans in the loop that are quite complex.
00:20:32.780 | And that's like the electronic markets.
00:20:34.300 | And we get flash crashes all the time.
00:20:35.900 | We get, you know, in the heyday of high-frequency trading,
00:20:40.460 | there was a lot of market microstructure,
00:20:41.780 | people doing all sorts of weird stuff
00:20:43.620 | that the market designers had never really thought about,
00:20:47.220 | contemplated, or intended.
00:20:48.820 | So when we run these full-on systems
00:20:50.660 | with these automated trading bots,
00:20:52.860 | now they become automated, you know, killer drones,
00:20:55.420 | and then all sorts of other stuff,
00:20:57.260 | we are, that's what I mean by
00:20:58.980 | we're at the dawn of the cybernetic era
00:21:00.580 | and the end of the era of just pure software.
00:21:02.880 | - Are you more concerned,
00:21:05.980 | if you're thinking about cybernetic systems
00:21:08.220 | or even like self-replicating systems,
00:21:09.980 | so systems that aren't just doing a particular task,
00:21:13.540 | but are able to sort of multiply and scale
00:21:15.700 | in some dimension in the digital
00:21:17.980 | or even the physical world,
00:21:20.340 | are you more concerned about like the lobster being boiled,
00:21:24.780 | so a gradual, with us not noticing,
00:21:27.360 | collapse of civilization, or a big explosion,
00:21:33.260 | like oops, kind of a big thing where everyone notices,
00:21:38.660 | but it's too late?
00:21:40.100 | - I think that it will be a different experience
00:21:44.020 | for different people.
00:21:45.060 | I do share a common point of view
00:21:49.780 | with some of the climate, you know,
00:21:52.180 | people who are concerned about climate change
00:21:53.740 | and just the big existential risks that we have,
00:21:58.740 | but unlike a lot of people who share my level of concern,
00:22:02.540 | I think the collapse will not be quite so dramatic
00:22:06.180 | as some of them think.
00:22:07.540 | And what I mean is that I think that for certain tiers
00:22:10.780 | of let's say economic class or certain locations
00:22:13.580 | in the world,
00:22:14.620 | people will experience dramatic collapse scenarios.
00:22:17.740 | But for a lot of people, especially in the developed world,
00:22:20.300 | the realities of collapse will be managed.
00:22:24.020 | There'll be narrative management around it
00:22:26.600 | so that they essentially insulate,
00:22:29.120 | the middle class will be used to insulate the upper class
00:22:31.300 | from the pitchforks and the flaming torches and everything.
00:22:35.860 | - It's interesting 'cause,
00:22:36.940 | so my specific question wasn't as general,
00:22:39.540 | my question was more about cybernetic systems or software.
00:22:42.460 | - Okay.
00:22:43.580 | - It's interesting,
00:22:44.420 | but it would nevertheless perhaps be about class.
00:22:46.660 | So the effect of algorithms might affect certain classes
00:22:49.460 | more than others.
00:22:50.300 | - Absolutely.
00:22:51.120 | - I was more thinking about whether it's social media
00:22:53.700 | algorithms or actual robots,
00:22:56.140 | is there going to be a gradual effect on us
00:22:59.940 | where we wake up one day
00:23:02.580 | and don't recognize the humans we are?
00:23:05.140 | Or is it something truly dramatic where there's,
00:23:08.420 | like a meltdown of a nuclear reactor kind of thing,
00:23:12.140 | Chernobyl, like catastrophic events
00:23:15.220 | that are almost bugs in a program
00:23:19.100 | that scaled itself too quickly?
00:23:21.100 | - Yeah, I'm not as concerned about the visible stuff.
00:23:26.100 | And the reason is because the big visible explosions,
00:23:29.540 | I mean, this is the thing I said about social media
00:23:31.180 | is that, at least with nuclear weapons,
00:23:33.180 | when a nuke goes off, you can see it and you're like,
00:23:35.100 | "Well, that's really, wow, that's kind of bad."
00:23:37.780 | I mean, Oppenheimer was reciting the Bhagavad Gita,
00:23:40.620 | when he saw one of those things go off.
00:23:42.180 | So we can see nukes are really bad.
00:23:45.140 | - He's not reciting anything about Twitter.
00:23:47.980 | - Well, but right.
00:23:49.340 | But then when you have social media,
00:23:51.160 | when you have all these different things that conspire
00:23:54.380 | to create a layer of virtual experience for people
00:23:57.100 | that alienates them from reality and from each other,
00:24:00.940 | that's very pernicious.
00:24:02.380 | It's impossible to see, right?
00:24:03.820 | And it kind of slowly gets in there.
00:24:07.180 | - You've written about this idea of virtuality
00:24:09.580 | on this topic, which you define as the subjective phenomenon
00:24:14.100 | of knowingly engaging with virtual sensation and perception
00:24:17.820 | and suspending or forgetting the context
00:24:19.900 | that it's a simulacrum.
00:24:22.040 | So let me ask, what is real?
00:24:26.460 | Is there a hard line between reality and virtuality?
00:24:30.440 | Like perception drifts from some kind of physical reality.
00:24:33.460 | We have to kind of have a sense of what is the line
00:24:36.120 | that's we've gone too far.
00:24:37.660 | - Right, right.
00:24:38.760 | For me, it's not about any hard line about physical reality,
00:24:42.620 | as much as a simple question of,
00:24:46.180 | does the particular technology help people connect
00:24:51.560 | in a more integral way with other people,
00:24:54.380 | with their environment,
00:24:55.980 | with all of the full spectrum of things around them?
00:24:58.540 | So it's less about, oh, this is a virtual thing
00:25:00.860 | and this is a hard real thing,
00:25:02.980 | more about when we create virtual representations
00:25:05.740 | of the real things,
00:25:06.700 | always some things are lost in translation.
00:25:10.740 | Usually many, many dimensions are lost in translation.
00:25:14.340 | We're now coming to almost two years of COVID,
00:25:16.860 | people on Zoom all the time.
00:25:17.940 | You know it's different when you meet somebody in person
00:25:19.740 | than when you see them on,
00:25:20.580 | I've seen you on YouTube lots, right?
00:25:22.580 | But then seeing a person is very different.
00:25:24.020 | And so I think when we engage in virtual experiences
00:25:29.020 | all the time, and we only do that,
00:25:31.700 | there is absolutely a level of embodiment.
00:25:34.260 | There's a level of embodied experience
00:25:36.540 | and participatory interaction that is lost.
00:25:40.060 | And it's very hard to put your finger
00:25:41.620 | on exactly what it is.
00:25:42.500 | It's hard to say, oh, we're gonna spend $100 million
00:25:44.540 | building a new system that captures this 5% better
00:25:49.460 | higher fidelity human expression.
00:25:51.220 | No one's gonna pay for that, right?
00:25:52.580 | So when we rush madly into a world of simulacrum
00:25:57.420 | and virtuality, the things that are lost are,
00:26:02.420 | it's difficult, once everyone moves there,
00:26:05.420 | it can be hard to look back and see what we've lost.
00:26:08.100 | - So is it irrecoverably lost,
00:26:10.380 | or rather when you put it all on the table,
00:26:14.220 | is it possible for more to be gained than is lost?
00:26:17.180 | If you look at video games,
00:26:18.700 | they create virtual experiences that are surreal
00:26:22.700 | and can bring joy to a lot of people,
00:26:24.460 | can connect a lot of people,
00:26:26.700 | and can get people to talk a lot of trash.
00:26:29.900 | So it can bring out the best and the worst in people.
00:26:32.620 | So is it possible to have a future world
00:26:35.720 | where the pros outweigh the cons?
00:26:38.560 | - It is.
00:26:39.400 | I mean, it's possible to have that in the current world,
00:26:41.560 | but when literally trillions of dollars of capital
00:26:46.560 | are tied to using those things
00:26:48.920 | to groom the worst of our inclinations
00:26:52.840 | and to attack our weaknesses in the limbic system
00:26:56.080 | to create these things into id machines
00:26:57.640 | versus connection machines,
00:26:59.120 | then those good things don't stand a chance.
00:27:03.200 | - Can you make a lot of money
00:27:04.400 | by building connection machines?
00:27:06.160 | Is it possible, do you think,
00:27:08.360 | to bring out the best in human nature
00:27:10.920 | to create fulfilling connections and relationships
00:27:13.720 | in the digital world and make a shit ton of money?
00:27:17.020 | - If I figure it out, I'll let you know.
00:27:21.080 | - But what's your intuition
00:27:22.000 | without concretely knowing what's the solution?
00:27:24.680 | - My intuition is that a lot of our digital technologies
00:27:27.720 | give us the ability to have synthetic connections
00:27:30.800 | or to experience virtuality.
00:27:33.020 | They have co-evolved with sort of the human expectations.
00:27:38.020 | It's sort of like sugary drinks.
00:27:40.840 | As people have more sugary drinks,
00:27:42.320 | they need more sugary drinks to get that same hit, right?
00:27:45.860 | So with these virtual things and with TV and fast cuts
00:27:50.400 | and TikToks and all these different kinds of things,
00:27:52.840 | we're co-creating essentially humanity
00:27:55.040 | that sort of asks and needs those things.
00:27:57.320 | And now it becomes very difficult
00:27:58.880 | to get people to slow down.
00:28:00.340 | It gets difficult for people to hold their attention
00:28:03.480 | on slow things and actually feel
00:28:05.720 | that embodied experience, right?
00:28:07.300 | So mindfulness now more than ever is so important
00:28:10.340 | in schools and as a therapy technique for people
00:28:13.520 | because our environment has been accelerated.
00:28:15.640 | And McLuhan actually talks about this
00:28:17.340 | in the electric environment of the television.
00:28:19.480 | And that was before TikTok and before front-facing cameras.
00:28:22.400 | So I think for me, the concern is that
00:28:25.320 | it's not like we can ever switch to doing something better,
00:28:28.100 | but more of the humans and technology,
00:28:32.040 | they're not independent of each other.
00:28:33.520 | The technology that we use kind of molds what we need
00:28:37.720 | for the next generation of technology.
00:28:39.080 | - Yeah, but humans are intelligent
00:28:40.680 | and they're introspective and they can reflect
00:28:44.280 | on the experiences of their life.
00:28:45.800 | So for example, there's been many years in my life
00:28:47.860 | where I ate an excessive amount of sugar.
00:28:50.900 | And then a certain moment I woke up and said,
00:28:53.620 | "Why do I keep doing this?
00:28:55.880 | This doesn't feel good."
00:28:57.640 | Like long-term.
00:28:59.000 | And I think, so going through the TikTok process
00:29:02.300 | of realizing, okay, when I shorten my attention span,
00:29:06.280 | actually that does not make me feel good longer term.
00:29:10.240 | And realizing that, and then going to platforms,
00:29:13.160 | going to places that are away from the sugar.
00:29:18.160 | So in so doing, you can create platforms
00:29:21.080 | that can make a lot of money to help people wake up
00:29:24.000 | to what actually makes them feel good long-term.
00:29:26.280 | Develop, grow as human beings.
00:29:28.280 | And it just feels like humans are more intelligent
00:29:31.040 | than mice looking for cheese.
00:29:35.120 | They're able to sort of think, I mean,
00:29:36.960 | we can contemplate our own mortality.
00:29:39.440 | - Right. - We can contemplate
00:29:40.720 | things like long-term love and we can have a long-term fear
00:29:45.720 | of certain things like mortality.
00:29:48.080 | We can contemplate whether the experiences,
00:29:51.240 | the sort of the drugs of daily life
00:29:53.680 | that we've been partaking in
00:29:55.600 | is making us happier, better people.
00:29:58.320 | And then once we contemplate that,
00:30:00.200 | we can make financial decisions in using services
00:30:03.800 | and paying for services that are making us better people.
00:30:06.880 | So it just seems that we're in the very first stages
00:30:11.440 | of social networks that just were able
00:30:14.680 | to make a lot of money really quickly.
00:30:16.520 | But in bringing out sometimes the bad parts of human nature,
00:30:21.520 | they didn't destroy humans.
00:30:23.200 | They just fed everybody a lot of sugar.
00:30:26.040 | And now everyone's gonna wake up and say,
00:30:28.520 | "Hey, we're gonna start having like sugar-free social media."
00:30:31.760 | - Right, right.
00:30:33.280 | Well, there's a lot to unpack there.
00:30:34.800 | I think some people certainly have the capacity for that.
00:30:37.560 | And I certainly think, I mean, it's very interesting
00:30:39.680 | even the way you said it.
00:30:40.520 | You woke up one day and you thought,
00:30:42.040 | "Well, this doesn't feel very good."
00:30:44.200 | Well, that's still your limbic system saying,
00:30:45.680 | "This doesn't feel very good."
00:30:47.280 | Right, you have a cat brain's worth of neurons
00:30:49.160 | around your gut, right?
00:30:50.720 | And so maybe that saturated and that was telling you,
00:30:53.560 | "Hey, this isn't good."
00:30:55.040 | Humans are more than just mice looking for cheese
00:30:58.320 | or monkeys looking for sex and power, right?
00:31:00.800 | So- - Let's slow down.
00:31:02.640 | Now a lot of people would argue with you on that one.
00:31:05.960 | But yes. - Well, we're more
00:31:06.800 | than just that, but we're at least that.
00:31:08.480 | And we're very, very seldom not that.
00:31:11.800 | So I don't actually disagree with you
00:31:15.080 | that we could be better and that better platforms exist.
00:31:18.240 | And people are voluntarily noping out
00:31:20.000 | of things like Facebook and noping out.
00:31:21.720 | - That's an awesome verb. - It's a great term.
00:31:23.680 | Yeah, I love it.
00:31:24.520 | I use it all the time.
00:31:25.720 | You're welcome, Mike. - I'm gonna have to
00:31:26.560 | nope out of that. - I'm gonna nope out of that.
00:31:28.280 | Right, it's gonna be a hard pass.
00:31:29.760 | And that's great.
00:31:32.840 | But that's again, to your point,
00:31:34.240 | that's the first generation of front-facing cameras,
00:31:37.240 | of social pressures.
00:31:38.680 | And you as a self-starter, self-aware adult
00:31:43.560 | have the capacity to say, "Yeah, I'm not gonna do that.
00:31:46.320 | "I'm gonna go and spend time on long-form reads.
00:31:48.440 | "I'm gonna spend time managing my attention.
00:31:50.200 | "I'm gonna do some yoga."
00:31:52.080 | If you're a 15-year-old in high school
00:31:55.000 | and your entire social environment
00:31:57.040 | is everyone doing these things,
00:31:58.280 | guess what you're gonna do?
00:31:59.560 | You're gonna kind of have to do that
00:32:00.680 | because your limbic system says,
00:32:01.680 | "Hey, I need to get the guy or the girl or the whatever,
00:32:04.440 | "and that's what I'm gonna do."
00:32:05.520 | And so one of the things that we have to reason about here
00:32:07.800 | is the social media systems or social media, I think,
00:32:11.080 | is our first encounter with a technological system
00:32:16.440 | that runs a bit of a loop
00:32:18.280 | around our own cognition and attention.
00:32:21.840 | It's not the last.
00:32:23.600 | It's far from the last.
00:32:25.760 | And it gets to the heart
00:32:26.760 | of some of the philosophical Achilles heel
00:32:29.640 | of the Western philosophical system,
00:32:31.880 | which is each person gets to make their own determination.
00:32:34.320 | Each person is an individual that's sacrosanct
00:32:37.200 | in their agency and their sovereignty and all these things.
00:32:39.960 | The problem with these systems is they come down
00:32:42.560 | and they are able to manage
00:32:44.360 | everyone's attention en masse.
00:32:47.960 | And so every person is making their own decision,
00:32:50.160 | but together the bigger system is causing them to act
00:32:53.600 | with a group dynamic that's very profitable for people.
00:32:58.600 | So this is the issue that we have
00:33:00.960 | is that our philosophies are actually not geared
00:33:03.580 | to understand what is it for a person
00:33:06.480 | to have a high trust connection as part of a collective
00:33:11.400 | and for that collective to have its right
00:33:13.400 | to coherency and agency.
00:33:16.160 | That's something like when a social media app
00:33:19.280 | causes a family to break apart,
00:33:21.600 | it's done harm to more than just individuals.
00:33:24.520 | So that concept is not something we really talk about
00:33:27.200 | or think about very much,
00:33:28.040 | but that's actually the problem
00:33:30.040 | is that we're vaporizing molecules into atomic units
00:33:32.960 | and then we're hitting all the atoms with certain things
00:33:35.040 | that's like, yeah, well, that person chose to look at my app.
00:33:38.240 | - So our understanding of human nature
00:33:40.480 | is at the individual level,
00:33:42.360 | it emphasizes the individual too much
00:33:44.680 | because ultimately society operates at the collective level.
00:33:47.440 | - And these apps do as well.
00:33:48.600 | - And the apps do as well.
00:33:49.840 | So for us to understand the progression
00:33:52.680 | and the development of this organism
00:33:54.600 | we call human civilization,
00:33:55.940 | we have to think of the collective level too.
00:33:57.960 | - I would say multi-tiered.
00:33:59.240 | - Multi-tiered.
00:34:00.080 | - Multi-tiered.
00:34:00.920 | - So individual as well.
00:34:01.740 | - Individuals, family units, social collectives,
00:34:05.040 | and on the way up too.
00:34:07.120 | - So you've said that individual humans are multi-layered.
00:34:10.320 | Susceptible to signals and waves and multiple strata.
00:34:13.600 | The physical, the biological, social, cultural,
00:34:16.120 | intellectual, so sort of going along these lines,
00:34:19.440 | can you describe the layers of the cake
00:34:22.840 | that is a human being
00:34:25.320 | and maybe the human collective, human society?
00:34:29.180 | - So I'm just stealing wholesale here from Robert Pirsig,
00:34:32.880 | who is the author of "Zen and the Art of Motorcycle
00:34:34.560 | Maintenance" and in his follow-on book
00:34:39.040 | there's a sequel to it called "Lila".
00:34:40.680 | He goes into this in a little more detail,
00:34:42.280 | but it's a crude approach to thinking about people,
00:34:46.940 | but I think it's still an advancement
00:34:48.840 | over traditional subject-object metaphysics,
00:34:51.200 | where we look at people as a dualist would say,
00:34:53.920 | well, is your mind, your consciousness,
00:34:57.140 | is that just merely the matter that's in your brain
00:35:01.120 | or is there something kind of more beyond that?
00:35:03.440 | And they would say, yes, there's a soul,
00:35:05.120 | sort of ineffable soul
00:35:06.960 | beyond just merely the physical body, right?
00:35:09.360 | And I'm not one of those people, right?
00:35:11.360 | I think that we don't have to draw a line
00:35:13.440 | between are things only this or only that.
00:35:16.820 | Collectives of things can emerge structures and patterns
00:35:19.760 | that are just as real as the underlying pieces,
00:35:22.040 | but they're transcendent,
00:35:24.160 | but they're still of the underlying pieces.
00:35:26.520 | So your body is this way.
00:35:28.400 | I mean, we just know physically you consist of atoms
00:35:31.000 | and whatnot, and then the atoms are arranged into molecules,
00:35:34.680 | which then arrange into certain kinds of structures
00:35:37.120 | that seem to have a homeostasis to them,
00:35:39.560 | we call them cells,
00:35:40.600 | and those cells form sort of biological structures.
00:35:44.520 | Those biological structures give your body
00:35:46.800 | its physical ability and the biological ability
00:35:49.000 | to consume energy and to maintain homeostasis,
00:35:51.800 | but humans are social animals.
00:35:54.000 | I mean, a human by themselves is not long for this world.
00:35:57.180 | So we also, part of our biology
00:35:59.500 | is wired to connect to other people,
00:36:02.200 | from the mirror neurons to our language centers
00:36:04.920 | and all these other things.
00:36:06.200 | So we are intrinsically, there's a layer,
00:36:09.200 | there's a part of us that wants to be part of a thing.
00:36:12.560 | If we're around other people, not saying a word,
00:36:14.680 | but they're just up and down jumping and dancing, laughing,
00:36:16.760 | we're gonna feel better, right?
00:36:18.580 | And there was no exchange of physical anything.
00:36:21.760 | They didn't give us like five atoms of happiness, right?
00:36:24.880 | But there's an induction in our own sense of self
00:36:27.520 | that is at that social level.
00:36:29.680 | And then beyond that,
00:36:31.000 | Pirsig puts the intellectual level
00:36:33.680 | kind of one level higher than social.
00:36:35.080 | I think they're actually more intertwined than that.
00:36:37.080 | But the intellectual level is the level of pure ideas
00:36:41.080 | that you are a vessel for memes,
00:36:42.800 | you're a vessel for philosophies.
00:36:45.000 | You will conduct yourself in a particular way.
00:36:47.720 | I mean, I think part of this is if we think about it
00:36:49.480 | from a physics perspective,
00:36:50.880 | there's the joke that physicists like to approximate things.
00:36:55.080 | And we'll say, well, approximate a spherical cow, right?
00:36:57.480 | You're not a spherical cow, you're not a spherical human,
00:36:59.600 | you're a messy human.
00:37:00.720 | And we can't even say what the dynamics of your emotion
00:37:03.960 | will be unless we analyze all four of these layers, right?
00:37:08.520 | If you're Muslim at a certain time of day, guess what?
00:37:11.720 | You're gonna be on the ground, kneeling and praying, right?
00:37:13.960 | And that has nothing to do with your biological need
00:37:15.920 | to get on the ground or physics of gravity.
00:37:18.480 | It is an intellectual drive that you have.
00:37:20.400 | It's a cultural phenomenon
00:37:21.960 | and an intellectual belief that you carry.
00:37:23.720 | So that's what the four layered stack is all about.
00:37:28.080 | It's that a person is not only one of these things,
00:37:30.320 | they're all of these things at the same time.
00:37:31.720 | It's a superposition of dynamics that run through us,
00:37:35.640 | that make us who we are.
00:37:37.320 | - So no layer is special.
00:37:39.460 | - Not so much no layer is special,
00:37:41.680 | each layer is just different.
00:37:43.180 | But we are-
00:37:45.800 | - Each layer gets the participation trophy.
00:37:48.280 | - Yeah, each layer is a part of what you are.
00:37:50.280 | You are a layer cake, right, of all these things.
00:37:52.040 | And if we try to deny, right?
00:37:54.440 | So many philosophies do try to deny
00:37:56.560 | the reality of some of these things, right?
00:37:58.960 | Some people say, "Well, we're only atoms."
00:38:01.440 | Well, we're not only atoms
00:38:02.480 | because there's a lot of other things that are only atoms.
00:38:04.080 | I can reduce a human being to a bunch of soup
00:38:07.000 | and they're not the same thing,
00:38:08.520 | even though it's the same atoms.
00:38:09.800 | So I think the order and the patterns
00:38:12.160 | that emerge within humans to understand,
00:38:15.960 | to really think about what a next generation
00:38:17.960 | of philosophy would look like,
00:38:19.060 | that would allow us to reason about
00:38:20.640 | extending humans into the digital realm
00:38:22.960 | or to interact with autonomous intelligences
00:38:25.560 | that are not biological in nature.
00:38:27.480 | We really need to appreciate these,
00:38:29.600 | that human, what human beings actually are,
00:38:32.240 | is the superposition of these different layers.
00:38:34.720 | - You mentioned consciousness.
00:38:36.600 | Are each of these layers of cake conscious?
00:38:39.760 | Is consciousness a particular quality of one of the layers?
00:38:43.720 | Is there like a spike,
00:38:45.000 | if you have a consciousness detector at these layers?
00:38:47.940 | Or is something that just permeates all of these layers
00:38:50.320 | and just takes different form?
00:38:51.920 | - I believe what humans experience as consciousness
00:38:54.400 | is something that sits on a gradient scale
00:38:57.640 | of a general principle in the universe
00:39:00.540 | that seems to look for order and reach for order
00:39:04.960 | when there's an excess of energy.
00:39:06.780 | You know, it would be odd to say a proton is alive, right?
00:39:09.400 | It'd be odd to say that this particular atom
00:39:12.040 | or molecule of hydrogen gas is alive.
00:39:15.840 | But there's certainly something we can make
00:39:20.040 | assemblages of these things
00:39:22.400 | that have autopoietic aspects to them,
00:39:24.400 | that will create structures,
00:39:25.500 | that will, you know, crystalline solids
00:39:27.460 | will form very interesting and beautiful structures.
00:39:29.960 | This gets kind of into weird mathematical territories.
00:39:33.000 | You start thinking about Penrose and Game of Life stuff
00:39:35.320 | about the generativity of math itself,
00:39:37.920 | like the hyperreal numbers, things like that.
00:39:39.480 | But without going down that rabbit hole,
00:39:42.340 | I would say that there seems to be a tendency in the world
00:39:46.560 | that when there is excess energy,
00:39:49.120 | things will structure and pattern themselves.
00:39:51.360 | And they will then actually furthermore try to create
00:39:53.760 | an environment that furthers their continued stability.
00:39:58.080 | It's the concept of externalized extended phenotype
00:40:00.840 | or niche construction.
00:40:02.280 | So this is ultimately what leads to certain kinds
00:40:06.280 | of amino acids forming certain kinds of structures
00:40:09.200 | and so on and so forth until you get the ladder of life.
00:40:11.160 | So what we experience as consciousness,
00:40:12.920 | no, I don't think cells are conscious of that level.
00:40:15.600 | But is there something beyond mere equilibrium state biology
00:40:19.600 | and chemistry and biochemistry
00:40:21.960 | that drives what makes things work?
00:40:25.420 | I think there is.
00:40:26.420 | So Adrian Bejan has this constructal law.
00:40:29.600 | There's other things you can look at.
00:40:31.200 | When you look at the life sciences
00:40:32.440 | and you look at any kind of statistical physics
00:40:36.040 | and statistical mechanics,
00:40:37.440 | when you look at things far out of equilibrium,
00:40:39.740 | when you have excess energy, what happens then?
00:40:43.400 | Life doesn't just make a hot herb soup.
00:40:45.800 | It starts making structure.
00:40:47.440 | There's something there.
00:40:48.640 | - The poetry of reaching for order
00:40:50.880 | when there's an excess of energy.
00:40:52.780 | Because you brought up the Game of Life.
00:40:56.060 | You did it, not me.
00:40:59.160 | I love cellular automata,
00:41:00.360 | so I have to sort of linger on that for a little bit.
00:41:04.420 | So cellular automata, I guess, or game of life
00:41:09.340 | is a very simple example of reaching for order
00:41:11.800 | when there's an excess of energy
00:41:14.040 | or reaching for order and somehow creating complexity.
00:41:18.480 | It's an explosion of just turmoil
00:41:22.400 | somehow trying to construct structures.
00:41:25.520 | And in so doing creates
00:41:28.720 | very elaborate organism-looking type things.
00:41:31.520 | What intuition do you draw from the simplest mechanism?
00:41:35.800 | - Well, I like to turn that on its head
00:41:38.000 | and look at it as what if every single one of the patterns
00:41:42.360 | created life or created, not life,
00:41:45.760 | but created interesting patterns?
00:41:47.360 | Because some of them don't.
00:41:48.640 | And sometimes you make cool gliders.
00:41:50.560 | And other times you start with certain things
00:41:52.240 | and you make gliders and other things
00:41:54.000 | that then construct like AND gates and NOT gates, right?
00:41:57.000 | And you build computers on them.
00:41:59.240 | All of these rules that create these patterns
00:42:00.800 | that we can see, those are just the patterns we can see.
00:42:04.600 | What if our subjectivity is actually limiting our ability
00:42:06.960 | to perceive the order in all of it?
00:42:09.680 | What if some of the things that we think are random
00:42:12.280 | are actually not that random?
00:42:13.260 | We're simply not integrating at a fine enough level
00:42:16.100 | across a broad enough time horizon.
00:42:17.920 | And this again, as I said, we go down the rabbit holes
00:42:20.560 | into the Penrose stuff or Wolfram's explorations
00:42:22.840 | on these things.
00:42:23.680 | There is something deep and beautiful
00:42:27.260 | in the mathematics of all this.
00:42:28.560 | That is, hopefully one day I'll have enough money
00:42:30.360 | to not work, to retire and just ponder those questions.
00:42:33.400 | But there's something there.
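To make the Game of Life discussion concrete, here is a minimal sketch of Conway's Game of Life in Python, assuming the standard B3/S23 rules; the glider seeded below is the same kind of self-propagating pattern mentioned above, the building block from which AND gates, NOT gates, and full computers have been constructed. The grid size, seed position, and step count are arbitrary choices for illustration.

```python
# Minimal Conway's Game of Life sketch (assumed standard B3/S23 rules).
# Illustrative only: grid size, seed, and step count are arbitrary.
import numpy as np

def step(grid: np.ndarray) -> np.ndarray:
    """Advance one generation on a toroidal (wrap-around) grid."""
    # Count live neighbors by summing the eight shifted copies of the grid.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A cell is born with exactly 3 neighbors; it survives with 2 or 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

# Seed a single glider in an otherwise empty 20x20 universe.
grid = np.zeros((20, 20), dtype=int)
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r, c] = 1

for _ in range(8):
    grid = step(grid)

# The glider persists (still 5 live cells) but has moved across the grid.
print(grid.sum(), "cells alive after 8 generations")
```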
00:42:34.520 | - But you're saying there's a ceiling to
00:42:36.080 | when you have enough money and you retire and you ponder,
00:42:38.440 | there's a ceiling to how much you can truly ponder
00:42:40.720 | because there's cognitive limitations
00:42:43.020 | in what you're able to perceive as a pattern.
00:42:46.340 | - Yeah.
00:42:47.180 | - And maybe mathematics extends your perception capabilities
00:42:51.500 | but it's still finite.
00:42:53.820 | It's just like.
00:42:55.460 | - Yeah, the mathematics we use is the mathematics
00:42:57.640 | that can fit in our head.
00:42:58.980 | - Yeah.
00:42:59.820 | - Did God really create the integers
00:43:02.580 | or did God create all of it?
00:43:03.740 | And we just happen at this point in time
00:43:05.340 | to be able to perceive integers.
00:43:07.220 | - Well, he just did the positive integers.
00:43:09.380 | - She. I just think, did she create all of it?
00:43:11.340 | And then we-
00:43:13.080 | - She just created the natural numbers
00:43:15.740 | and then we screwed it all up with zero, and then, I guess, okay.
00:43:18.540 | - But we did, we created mathematical operations
00:43:21.600 | so that we can have iterated steps
00:43:23.700 | to approach bigger problems, right?
00:43:26.000 | I mean, the entire point of the Arabic numeral system
00:43:29.040 | is it's a rubric for mapping a certain set of operations,
00:43:32.580 | of folding them into a simple little expression.
00:43:35.360 | But that's just the operations that we can fit in our heads.
00:43:38.740 | There are many other operations besides, right?
00:43:41.160 | - The thing that worries me the most about aliens and humans
00:43:46.020 | is that aliens are all around us
00:43:49.100 | and we're too dumb to see them.
00:43:51.700 | - Oh, certainly, yeah.
00:43:52.860 | - Or life, let's say just life,
00:43:54.820 | life of all kinds of forms or organisms.
00:43:58.100 | You know what?
00:43:58.920 | Just even the intelligence of organisms
00:44:01.940 | is imperceptible to us
00:44:04.160 | because we're too dumb and self-centered.
00:44:06.620 | That worries me.
00:44:07.460 | - Well, we're looking for a particular kind of thing.
00:44:10.340 | When I was at Cornell,
00:44:11.200 | I had a lovely professor of Asian religions,
00:44:13.880 | Jane Marie Law,
00:44:14.720 | and she would tell this story about a musician,
00:44:17.800 | a Western musician, who went to Japan
00:44:20.120 | and he taught classical music
00:44:21.900 | and could play all sorts of instruments.
00:44:24.040 | He went to Japan and he would ask people,
00:44:27.520 | he would basically be looking for things in the style of
00:44:30.640 | Western chromatic scale and these kinds of things.
00:44:34.100 | And then finding none of it, he would say,
00:44:35.600 | "Well, there's really no music in Japan,"
00:44:37.560 | but they're using a different scale.
00:44:38.840 | They're playing different kinds of instruments.
00:44:40.480 | She was using the same thing as sort of a metaphor
00:44:42.640 | for religion as well.
00:44:43.680 | In the West, we center a lot of religion,
00:44:45.800 | certainly the religions of Abraham,
00:44:48.000 | we center them around belief.
00:44:50.080 | And in the East, it's more about practice,
00:44:52.480 | spirituality and practice rather than belief.
00:44:54.640 | So anyway, the point is here to your point,
00:44:57.400 | life, I think so many people are so fixated
00:45:00.840 | on certain aspects of self-replication
00:45:03.360 | or homeostasis or whatever.
00:45:06.160 | But if we kind of broaden and generalize this thing
00:45:08.880 | of things reaching for order,
00:45:10.880 | under which conditions can they then create an environment
00:45:13.600 | that sustains that order, that allows them to persist.
00:45:17.240 | The invention of death is an interesting thing.
00:45:20.160 | There are some organisms on earth
00:45:21.520 | that are thousands of years old.
00:45:23.420 | And it's not like they're incredibly complex,
00:45:25.560 | they're actually simpler than the cells that comprise us,
00:45:28.560 | but they never die.
00:45:29.640 | So at some point, death was invented,
00:45:33.320 | somewhere along the eukaryotic scale,
00:45:34.720 | I mean, even the protists, right?
00:45:35.980 | There's death.
00:45:37.160 | And why is that along with the sexual reproduction, right?
00:45:41.440 | There is something about the renewal process,
00:45:44.960 | something about the ability to respond
00:45:46.480 | to a changing environment,
00:45:48.120 | where it just becomes: killing off the old generation
00:45:51.480 | and letting new generations try,
00:45:54.200 | seems to be the best way to fit into the niche.
00:45:56.680 | - You know, human historians seem to write about wheels
00:45:59.280 | and fire as the greatest inventions,
00:46:01.600 | but it seems like death and sex are pretty good.
00:46:04.320 | And they're kind of essential inventions
00:46:06.540 | at the very beginning.
00:46:07.380 | - At the very beginning, yeah.
00:46:08.560 | Well, we didn't invent them, right?
00:46:10.500 | - Well, broadly, "we." You didn't invent them.
00:46:13.260 | I see us as one. You, a particular Homo sapien,
00:46:16.980 | did not invent them, but we together,
00:46:19.500 | it's a team project, just like you're saying.
00:46:21.860 | - I think the greatest homo sapien invention
00:46:24.260 | is collaboration.
00:46:25.620 | - So when you say collaboration,
00:46:27.220 | Peter, where do ideas come from?
00:46:32.180 | And how do they take hold in society?
00:46:35.240 | Is that the nature of collaboration?
00:46:36.880 | Is that the basic atom of collaboration is ideas?
00:46:40.400 | - It's not not ideas, but it's not only ideas.
00:46:43.160 | There's a book I just started reading
00:46:44.420 | called "Death From a Distance."
00:46:45.880 | Have you heard of this? - No.
00:46:46.980 | - It's a really fascinating thesis,
00:46:49.160 | which is that humans are the only conspecific,
00:46:52.880 | the only species that can kill other members
00:46:55.800 | of the species from range.
00:46:58.240 | And maybe there's a few exceptions,
00:46:59.600 | but if you look in the animal world,
00:47:01.040 | you see like pronghorns, butting heads, right?
00:47:03.000 | You see the alpha lion and the beta lion,
00:47:05.840 | and they take each other down.
00:47:07.180 | Humans, we develop the ability to chuck rocks at each other,
00:47:10.120 | well, at prey, but also at each other.
00:47:11.960 | And that means the beta male can chuck a rock
00:47:14.920 | at the alpha male and take them down.
00:47:17.440 | And he can throw a lot of rocks, actually,
00:47:20.040 | miss a bunch of times, but just hit once and be good.
00:47:22.400 | So this ability to actually kill members of our own species
00:47:26.600 | from range without a threat of harm to ourselves
00:47:29.960 | created essentially mutually assured destruction
00:47:32.360 | where we had to evolve cooperation.
00:47:34.000 | If we didn't, then if we just continue to try to do,
00:47:37.280 | like I'm the biggest monkey in the tribe,
00:47:39.480 | and I'm gonna own this tribe and you have to go,
00:47:43.240 | if we do it that way, then those tribes basically failed.
00:47:46.720 | And the tribes that persisted
00:47:48.440 | and that have now given rise to the modern Homo sapiens
00:47:51.440 | are the ones where respecting the fact
00:47:53.860 | that we can kill each other from range without harm,
00:47:56.920 | like there's an asymmetric ability
00:47:58.640 | to snipe the leader from range,
00:48:00.840 | that meant that we sort of had to learn
00:48:03.920 | how to cooperate with each other.
00:48:05.200 | Right, come back here, don't throw that rock at me.
00:48:06.600 | Let's talk our differences out.
00:48:07.960 | - So violence is also part of collaboration.
00:48:10.240 | - The threat of violence, let's say.
00:48:12.300 | Well, the recognition, maybe the better way to put it
00:48:15.700 | is the recognition that we have more to gain
00:48:17.480 | by working together than the prisoner's dilemma
00:48:21.140 | of both of us defecting.
00:48:22.580 | - So mutually assured destruction in all its forms
00:48:26.360 | is part of this idea of collaboration.
00:48:28.720 | - Well, and Eric Weinstein talks about our nuclear peace.
00:48:31.360 | I mean, it kind of sucks
00:48:32.440 | that we have thousands of warheads aimed at each other,
00:48:34.040 | we mean Russia and the US, but it's like,
00:48:36.320 | on the other hand, we only fought proxy wars.
00:48:39.880 | We did not have another World War III
00:48:41.360 | of like hundreds of millions of people
00:48:43.160 | dying to like machine gun fire and giant guided missiles.
00:48:47.840 | - So the original nuclear weapon is a rock
00:48:50.400 | that we learned how to throw, essentially?
00:48:52.800 | - Yeah, well, the original scope of the world
00:48:54.280 | for any human being was their little tribe.
00:48:56.440 | I would say it still is for the most part.
00:49:00.760 | (both laughing)
00:49:02.000 | - Eric Weinstein speaks very highly of you,
00:49:05.800 | which is very surprising to me at first
00:49:08.000 | 'cause I didn't know there's this depth to you
00:49:10.840 | 'cause I knew you as an amazing leader of engineers
00:49:15.600 | and engineer yourself and so on.
00:49:17.400 | So it's fascinating.
00:49:18.480 | Maybe just as a comment, a side tangent that we can take,
00:49:23.840 | what's the nature of your friendship with Eric Weinstein?
00:49:27.160 | How did such two interesting paths cross?
00:49:30.640 | Is it your origins in physics?
00:49:32.960 | Is it your interest in philosophy
00:49:35.640 | and the ideas of how the world works?
00:49:37.280 | What is it?
00:49:38.120 | - It's actually, it's very random.
00:49:39.240 | Eric found me.
00:49:40.920 | He actually found Travis and I.
00:49:43.840 | - Travis Oliphant. - Oliphant, yeah.
00:49:45.120 | We were both working at a company called Enthought
00:49:47.400 | back in the mid 2000s,
00:49:48.520 | and we were doing a lot of consulting
00:49:50.840 | around scientific Python.
00:49:52.840 | And we'd made some tools
00:49:54.400 | and Eric was trying to use some of these Python tools
00:49:57.400 | to visualize that he had a fiber bundle approach
00:50:00.200 | to modeling certain aspects of economics.
00:50:03.440 | He was doing this
00:50:04.280 | and that's how he kind of got in touch with us.
00:50:05.600 | And so-
00:50:07.200 | - This was in the early-
00:50:08.120 | - This was in the mid 2000s.
00:50:10.080 | '07 timeframe, '06, '07 timeframe.
00:50:13.680 | - Eric Weinstein trying to use Python
00:50:16.200 | to visualize fiber bundles. - Right, to visualize
00:50:17.200 | fiber bundles using some of the tools
00:50:19.720 | that we'd built in the open source.
00:50:21.280 | That's somehow entertaining to me.
00:50:23.000 | - It's pretty funny. - The thought of that.
00:50:24.120 | - It's pretty funny.
00:50:25.040 | But then, we met with him a couple of times,
00:50:27.120 | a really interesting guy.
00:50:27.960 | And then in the wake of the '07, '08
00:50:29.840 | kind of financial collapse,
00:50:31.320 | he helped organize with Lee Smolin
00:50:34.720 | a symposium at the Perimeter Institute
00:50:36.960 | about, okay, well, clearly,
00:50:39.880 | big finance can't be trusted,
00:50:41.160 | government's in its pockets with regulatory capture.
00:50:43.480 | What the F do we do?
00:50:45.280 | And all sorts of people, Nassim Taleb was there
00:50:47.600 | and Andy Lo from MIT was there
00:50:49.440 | and Bill Janeway, I mean, just a lot of
00:50:52.880 | top billing people were there.
00:50:54.800 | And he invited me and Travis
00:50:56.880 | and another one of our coworkers, Robert Kern,
00:51:00.280 | who is anyone in the SciPy, NumPy community knows Robert,
00:51:03.880 | really great guy.
00:51:04.720 | So the three of us also got invited to go to this thing.
00:51:06.600 | And that's where I met Bret Weinstein
00:51:07.760 | for the first time as well.
00:51:09.480 | Yeah, I knew him before he got all famous
00:51:11.680 | for unfortunate reasons, I guess.
00:51:13.520 | But anyway, so we met then and have kind of had a friendship
00:51:19.600 | throughout, since then.
00:51:21.400 | - You have a depth of thinking
00:51:24.000 | that kind of runs with Eric
00:51:26.800 | in terms of just thinking about the world deeply
00:51:28.640 | and thinking philosophically.
00:51:30.440 | And then there's Eric's interest in programming.
00:51:33.480 | I actually have never,
00:51:34.640 | he'll bring up programming to me quite a bit
00:51:38.960 | as a metaphor for stuff.
00:51:41.080 | But I never kind of pushed the point of like,
00:51:43.640 | what's the nature of your interest in programming?
00:51:46.800 | I think he saw it probably as a tool.
00:51:48.840 | - Yeah, absolutely.
00:51:49.680 | - That you visualize to explore mathematics
00:51:52.200 | and explore physics.
00:51:53.120 | But I was wondering like,
00:51:55.040 | what's his depth of interest and also his vision
00:51:59.360 | for what programming would look like in the future?
00:52:05.640 | Have you had interaction with him,
00:52:07.400 | like discussion in the space of Python programming?
00:52:09.840 | - Well, in the sense of sometimes he asks me,
00:52:11.920 | why is this stuff still so hard?
00:52:13.840 | (both laughing)
00:52:16.960 | Yeah, you know, everybody's a critic,
00:52:18.280 | but actually no, Eric--
00:52:20.320 | - Programming you mean like in general?
00:52:21.520 | - Yes, yes.
00:52:22.360 | Well, not programming in general,
00:52:23.320 | but certain things in the Python ecosystem.
00:52:25.440 | But he actually, I think what I find
00:52:29.040 | in listening to some of his stuff
00:52:30.120 | is that he does use programming metaphors a lot, right?
00:52:33.280 | He'll talk about APIs or object oriented
00:52:35.400 | and things like that.
00:52:36.400 | So I think that's a useful set of frames
00:52:39.240 | for him to draw upon for discourse.
00:52:42.880 | I haven't pair programmed with him in a very long time.
00:52:45.440 | - You've previously--
00:52:47.240 | - Well, I mean, I look at his code,
00:52:48.760 | trying to help like put together
00:52:50.200 | some of the visualizations around these things,
00:52:51.560 | but it's been a very, not really pair program,
00:52:54.040 | but like even looked at his code, right?
00:52:55.800 | I mean--
00:52:56.640 | - How legendary would that be, like a
00:52:58.960 | Git repo with Peter Wang and Eric Weinstein.
00:53:02.680 | - Well, honestly, Robert Kern did all the heavy lifting.
00:53:05.480 | So I have to give credit where credit is due.
00:53:06.880 | Robert is the silent, but incredibly deep,
00:53:10.320 | quiet, not silent, but quiet,
00:53:11.640 | but incredibly deep individual
00:53:13.480 | at the heart of a lot of those things
00:53:14.720 | that Eric was trying to do.
00:53:16.680 | But we did have, you know, in the,
00:53:18.560 | as Travis and I were starting our company in 2012 timeframe,
00:53:23.520 | we went to New York.
00:53:24.640 | Eric was still in New York at the time.
00:53:26.120 | He hadn't moved to,
00:53:27.560 | this was before he joined Thiel Capital.
00:53:29.160 | We just had like a steak dinner somewhere.
00:53:31.560 | Maybe it was Keen's, I don't know, somewhere in New York.
00:53:33.400 | So it was me, Travis, Eric,
00:53:35.080 | and then Wes McKinney, the creator of pandas.
00:53:37.280 | And then Wes' then business partner, Adam,
00:53:40.280 | the five of us sat around having this,
00:53:42.000 | just a hilarious time, amazing dinner.
00:53:45.520 | I forget what all we talked about,
00:53:46.960 | but it was one of those conversations,
00:53:49.000 | which I wish as soon as COVID is over,
00:53:51.400 | maybe Eric and I can sit down.
00:53:53.080 | - Recreate.
00:53:53.920 | - Recreate it somewhere in LA, or maybe he comes here.
00:53:56.720 | 'Cause a lot of cool people are here in Austin, right?
00:53:58.200 | - Exactly.
00:53:59.040 | - Yeah, we're all here.
00:53:59.880 | He should come here. - Come here.
00:54:00.720 | - Yeah.
00:54:01.560 | - So he uses the metaphor of source code
00:54:03.600 | sometimes to talk about physics.
00:54:05.320 | We figure out our own source code.
00:54:07.160 | So you with the physics background,
00:54:10.160 | and somebody who's quite a bit of an expert in source code,
00:54:14.160 | do you think we'll ever figure out our own source code
00:54:17.760 | in the way that Eric means?
00:54:19.040 | Do you think we'll figure out the nature of it?
00:54:19.880 | - Well, I think we're constantly working on that problem.
00:54:21.760 | I mean, I think we'll make more and more progress.
00:54:24.440 | For me, there's some things I don't really doubt too much.
00:54:28.120 | Like I don't really doubt that one day we will create
00:54:31.600 | a synthetic, maybe not fully in silicon,
00:54:34.520 | but a synthetic approach to cognition
00:54:37.760 | that rivals the biological 20 watt computers in our heads.
00:54:44.760 | - What's cognition here?
00:54:46.160 | - Cognition.
00:54:47.000 | - Which aspect?
00:54:47.820 | - Perception, attention, memory, recall,
00:54:50.000 | asking better questions.
00:54:51.880 | That for me is a measure of intelligence.
00:54:53.240 | - Doesn't Roomba Vacuum Cleaner already do that?
00:54:55.440 | Or do you mean, oh, it doesn't ask questions.
00:54:57.120 | - I mean, no.
00:54:58.040 | It's, I mean, I have a Roomba, but it's--
00:55:00.920 | - Well, it asks questions.
00:55:01.760 | - It's not even as smart as my cat, right?
00:55:03.680 | - It doesn't ask questions about what is this wall?
00:55:05.360 | It now, as a new feature, asks: is this poop or not, apparently.
00:55:08.960 | - Yes, a lot of our current cybernetic system,
00:55:11.360 | it's a cybernetic system.
00:55:12.680 | It will go and it will happily vacuum up some poop, right?
00:55:15.020 | The older generations would.
00:55:16.720 | - The new one, just released, does not vacuum up the poop.
00:55:19.960 | This is a commercial for Roomba.
00:55:20.800 | - I wonder if it still gets stuck
00:55:21.720 | under my first rung of my stair.
00:55:23.760 | In any case, these cybernetic systems we have,
00:55:27.120 | they are molded, they're designed to be sent off
00:55:32.040 | into a relatively static environment.
00:55:34.040 | And whatever dynamic things happen in the environment,
00:55:36.360 | they have a very limited capacity to respond to.
00:55:38.840 | A human baby, a human toddler of 18 months of age,
00:55:43.040 | has more capacity to manage its own attention
00:55:45.800 | and its own capacity to make better sense of the world
00:55:49.140 | than the most advanced robots today.
00:55:51.680 | So again, my cat, I think, can do a better job of my two,
00:55:55.080 | and they're both pretty clever.
00:55:56.360 | So I do think though, back to my kind of original point,
00:55:59.440 | I think that it's not, for me, it's not a question at all
00:56:02.680 | that we will be able to create synthetic systems
00:56:05.960 | that are able to do this better than the human,
00:56:09.200 | at an equal level or better than the human mind.
00:56:11.680 | It's also, for me, not a question
00:56:14.840 | that we will be able to put them alongside humans
00:56:19.840 | so that they capture the full broad spectrum
00:56:23.200 | of what we are seeing as well,
00:56:25.360 | and also looking at our responses, listening to our responses,
00:56:28.880 | even maybe measuring certain vital signs about us.
00:56:32.040 | So in this kind of sidecar mode,
00:56:34.440 | a greater intelligence could use us
00:56:37.600 | in our whatever, 80 years of life, to train itself up,
00:56:42.200 | and then be a very good simulacrum of us moving forward.
00:56:45.040 | - So who is in the sidecar in that picture of the future,
00:56:49.920 | exactly, is it the human?
00:56:51.200 | - The baby version of our immortal selves.
00:56:53.000 | - Okay, so once the baby grows up,
00:56:56.200 | is there any use for humans?
00:56:58.480 | - I think so.
00:56:59.960 | I think that out of epistemic humility,
00:57:03.240 | we need to keep humans around for a long time.
00:57:05.600 | And I would hope that anyone making those systems
00:57:07.960 | would believe that to be true.
00:57:10.040 | - Out of epistemic humility.
00:57:11.640 | What's the nature of the humility that-
00:57:13.480 | - That we don't know what we don't know.
00:57:15.520 | - So we don't-
00:57:17.280 | - Right?
00:57:19.800 | - So we don't know-
00:57:20.640 | - First, I mean, first we have to build systems
00:57:21.680 | that help us do the things that we do know about,
00:57:24.400 | that can then probe the unknowns that we know about,
00:57:26.740 | but the unknown unknowns, we don't know.
00:57:28.840 | Nature is the one thing that is infinitely
00:57:32.400 | able to surprise us.
00:57:33.780 | So we should keep biological humans around
00:57:35.880 | for a very, very, very long time,
00:57:37.560 | even after our immortal selves have transcended
00:57:40.440 | and have gone off to explore other worlds,
00:57:42.880 | gone to go communicate with the lifeforms
00:57:44.520 | living in the sun or whatever else.
00:57:45.880 | So I think that's, for me,
00:57:49.520 | these seem like things that are going to happen.
00:57:53.000 | Like I don't really question that,
00:57:54.480 | that they're gonna happen.
00:57:55.720 | Assuming we don't completely destroy ourselves.
00:57:58.220 | - Is it possible to create an AI system
00:58:02.480 | that you fall in love with and it falls in love with you
00:58:06.140 | and you have a romantic relationship with it
00:58:08.460 | or a deep friendship, let's say?
00:58:10.740 | - I would hope that that is the design criteria
00:58:12.660 | for any of these systems.
00:58:14.500 | If we cannot have a meaningful relationship with it,
00:58:18.460 | then it's still just a chunk of silicon.
00:58:20.300 | - So then what is meaningful?
00:58:21.640 | Because back to sugar.
00:58:23.780 | - Well, sugar doesn't love you back, right?
00:58:25.360 | So the computer has to love you back.
00:58:26.820 | And what does love mean?
00:58:28.180 | Well, in this context, for me, love,
00:58:30.140 | I'm gonna take a page from Alain de Botton,
00:58:31.980 | love means that it wants to help us
00:58:34.360 | become the best version of ourselves.
00:58:36.620 | - Yes, that's beautiful.
00:58:39.420 | That's a beautiful definition of love.
00:58:40.700 | So what role does love play in the human condition
00:58:45.340 | at the individual level and at the group level?
00:58:48.680 | 'Cause you were kind of saying that humans,
00:58:51.060 | we should really consider humans
00:58:52.260 | both at the individual and the group and the societal level.
00:58:55.260 | What's the role of love in this whole thing?
00:58:56.920 | We talked about sex, we talked about death,
00:58:59.620 | thanks to the bacteria that invented it.
00:59:02.500 | At which point did we invent love, by the way?
00:59:04.300 | I mean, is that also-
00:59:05.500 | - No, I think love is the start of it all.
00:59:08.900 | And the feelings of, and this gets,
00:59:11.100 | this is sort of beyond just romantic, sensual,
00:59:15.960 | whatever kind of things,
00:59:16.800 | but actually genuine love as we have for another person,
00:59:19.680 | love as it would be used in a religious text, right?
00:59:22.660 | I think that capacity to feel love,
00:59:25.440 | more than consciousness, that is the universal thing.
00:59:28.380 | Our feeling of love is actually a sense of that generativity.
00:59:31.260 | When we can look at another person
00:59:32.900 | and see that they can be something more than they are,
00:59:37.380 | and more than just whatever
00:59:38.900 | pigeonhole we might stick them in.
00:59:42.020 | We see, I mean, I think there's,
00:59:43.180 | in any religious text you'll find
00:59:44.900 | voiced some concept of this,
00:59:47.620 | that you should see the grace of God in the other person.
00:59:50.180 | That they're made in the spirit of
00:59:54.300 | the love that God feels for his creation or her creation.
00:59:57.060 | And so I think this thing is actually the root of it.
01:00:00.140 | So I would say before, I don't think molecules of water
01:00:04.760 | feel consciousness, have consciousness,
01:00:06.220 | but there is some proto micro quantum thing of love.
01:00:10.900 | That's the generativity when there's more energy
01:00:14.060 | than what they need to maintain equilibrium.
01:00:16.220 | And that, when you sum it all up,
01:00:18.300 | is something that leads to,
01:00:19.780 | I mean, I had my mind blown one day as an undergrad
01:00:23.460 | at the physics computer lab.
01:00:24.540 | I logged in and when you log into Bash for a long time,
01:00:28.260 | there was a little fortune that would come out.
01:00:29.540 | And it said, "Man was created by water
01:00:32.300 | "to carry itself uphill."
01:00:34.000 | And I was logging in to work on some problem set.
01:00:37.980 | And I logged in and I saw that and I just said,
01:00:40.500 | "Son of a bitch."
01:00:42.020 | I logged out and I went to the coffee shop
01:00:44.380 | and I got a coffee and I sat there on the quad.
01:00:46.180 | I'm like, "You know, it's not wrong and yet WTF, right?"
01:00:51.180 | So when you look at it that way, it's like,
01:00:55.820 | yeah, okay, non-equilibrium physics is a thing.
01:00:59.060 | And so when we think about love,
01:01:00.980 | when we think about these kinds of things,
01:01:03.080 | I would say that in the modern day human condition,
01:01:08.100 | there's a lot of talk about freedom and individual liberty
01:01:12.260 | and rights and all these things.
01:01:14.580 | But that's very Hegelian,
01:01:17.220 | it's very kind of following from the Western philosophy
01:01:19.900 | of the individual as sacrosanct.
01:01:22.700 | But it's not really couched, I think, the right way
01:01:26.300 | because it should be how do we maximize people's ability
01:01:29.420 | to love each other, to love themselves first,
01:01:32.060 | to love each other, their responsibilities
01:01:34.500 | to the previous generation, to the future generations.
01:01:37.780 | Those are the kinds of things
01:01:39.240 | that should be our design criteria, right?
01:01:41.780 | Those should be what we start with to then come up
01:01:45.660 | with the philosophies of self
01:01:47.740 | and of rights and responsibilities.
01:01:50.140 | But that love being at the center of it,
01:01:52.200 | I think when we design systems for cognition,
01:01:55.680 | it should absolutely be built that way.
01:01:58.660 | I think if we simply focus on efficiency and productivity,
01:02:02.200 | these kind of very industrial era,
01:02:04.840 | all the things that Marx had issues with, right?
01:02:08.300 | Those, that's a way to go and really, I think,
01:02:11.500 | go off the deep end in the wrong way.
01:02:13.540 | - So one of the interesting consequences
01:02:16.400 | of thinking of life in this hierarchical way
01:02:21.060 | of an individual human, and then there's groups
01:02:23.380 | and there's societies is I believe that,
01:02:28.260 | you believe that corporations are people.
01:02:30.640 | So this is kind of a politically dense idea,
01:02:36.540 | and all those kinds of things.
01:02:37.380 | If we just throw politics aside,
01:02:39.060 | if we throw all of that aside,
01:02:41.260 | in which sense do you believe that corporations are people?
01:02:44.600 | And how does love connect to that?
01:02:47.620 | - Right, so the belief is that groups of people
01:02:51.980 | have some kind of higher level,
01:02:54.560 | I would say mesoscopic claim to agency.
01:02:56.640 | So where do I, let's start with this.
01:03:00.880 | Most people would say, okay,
01:03:01.980 | individuals have claims to agency and sovereignty.
01:03:05.100 | Nations, we certainly act as if nations,
01:03:07.580 | so at a very large, large scale,
01:03:09.500 | nations have rights to sovereignty and agency.
01:03:13.080 | Like everyone plays the game of modernity
01:03:14.980 | as if that's true, right?
01:03:16.300 | We believe France is a thing.
01:03:17.340 | We believe the United States is a thing.
01:03:18.780 | But to say that groups of people
01:03:21.900 | at a smaller level than that,
01:03:23.760 | like a family unit is the thing.
01:03:26.660 | Well, in our laws, we actually do encode this concept.
01:03:30.460 | I believe that in a relationship, in a marriage, right,
01:03:33.820 | one partner can sue for loss of consortium, right,
01:03:37.740 | if someone breaks up the marriage or whatever.
01:03:39.820 | So these are concepts that even in law,
01:03:41.580 | we do respect that there is something about the union
01:03:44.740 | and about the family.
01:03:45.980 | So for me, I don't think it's so weird
01:03:48.300 | to think that groups of people have a right to,
01:03:51.820 | a claim to rights and sovereignty of some degree.
01:03:54.660 | I mean, we look at our clubs, we look at churches.
01:03:59.020 | These are, we talk about these collectives of people
01:04:02.020 | as if they have a real agency to them.
01:04:04.580 | And they do.
01:04:05.780 | But I think if we take that one step further,
01:04:08.460 | and say, okay, they can accrue resources.
01:04:10.300 | Well, yes, check, you know, and by law, they can.
01:04:12.740 | They can own land, they can engage in contracts,
01:04:17.060 | they can do all these different kinds of things.
01:04:18.860 | So we, in legal terms, support this idea
01:04:22.580 | that groups of people have rights.
01:04:24.660 | Where we go wrong on this stuff
01:04:28.040 | is that the most popular version of this
01:04:31.260 | is the for-profit absentee owner corporation
01:04:35.380 | that then is able to amass larger resources
01:04:38.440 | than anyone else in the landscape, anything else,
01:04:40.860 | any other entity of equivalent size.
01:04:43.100 | And they're able to essentially bully around individuals,
01:04:45.500 | whether it's laborers, whether it's people
01:04:47.060 | whose resources they want to capture.
01:04:48.940 | They're also able to bully around
01:04:50.460 | our system of representation,
01:04:52.140 | which is still tied to individuals, right?
01:04:55.540 | So I don't believe that's correct.
01:04:58.520 | I don't think it's good that they, you know,
01:05:01.160 | they're people, but they're assholes.
01:05:02.320 | I don't think that corporations as people
01:05:03.660 | acting like assholes is a good thing.
01:05:05.500 | But the idea that collectives and collections of people,
01:05:08.420 | that we should treat them philosophically
01:05:10.140 | as having some--
01:05:12.260 | - Agency.
01:05:13.100 | - Some agency and some mass at a mesoscopic level.
01:05:16.900 | I think that's an important thing
01:05:17.980 | because one thing I do think we underappreciate sometimes
01:05:22.340 | is the fact that relationships have relationships.
01:05:26.180 | So it's not just individuals having relationships
01:05:27.900 | with each other.
01:05:29.060 | But if you have eight people seated around a table, right?
01:05:32.060 | Each person has a relationship with each of the others,
01:05:34.180 | and that's obvious.
01:05:35.600 | But then if it's four couples,
01:05:37.840 | each couple also has a relationship
01:05:39.680 | with each of the other couples, right?
01:05:41.680 | The dyads do.
01:05:42.720 | And if it's couples, but one couple is the older father and mother,
01:05:47.040 | and then, you know, one of their children and their spouse,
01:05:50.920 | that family unit of four has a relationship
01:05:53.940 | with the other family unit of four.
01:05:55.720 | So the idea that relationships have relationships
01:05:57.600 | is something that we intuitively know
01:05:59.920 | in navigating the social landscape,
01:06:01.780 | but it's not something I hear expressed like that.
01:06:04.720 | It's certainly not something that is,
01:06:06.680 | I think, taken into account very well
01:06:08.160 | when we design these kinds of things.
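As a small aside for readers, the dinner-table example above is just combinatorics, and a toy sketch makes the scaling explicit; the names and pairings below are hypothetical, chosen only to mirror the example of eight people forming four couples.

```python
# Toy illustration of "relationships have relationships":
# 8 individuals around a table, grouped into 4 couples (hypothetical names).
from itertools import combinations

people = [f"person_{i}" for i in range(8)]                   # 8 individuals
couples = [tuple(people[i:i + 2]) for i in range(0, 8, 2)]   # 4 dyads

person_links = list(combinations(people, 2))    # C(8, 2) = 28 person-to-person links
couple_links = list(combinations(couples, 2))   # C(4, 2) = 6 couple-to-couple links

print(len(person_links), "person-to-person relationships")
print(len(couple_links), "couple-to-couple relationships")
```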
01:06:09.480 | So I think the reason why I care a lot about this
01:06:14.040 | is because I think the future of humanity
01:06:16.520 | requires us to form better sense,
01:06:19.400 | make collective sense-making units
01:06:21.520 | at something, you know, around Dunbar number,
01:06:25.200 | you know, half to 5X Dunbar.
01:06:28.120 | And that's very different than right now
01:06:30.600 | where we defer sense-making
01:06:33.560 | to massive aging zombie institutions.
01:06:37.000 | Or we just do it ourselves.
01:06:38.480 | We go it alone,
01:06:39.320 | go to the dark force of the internet by ourselves.
01:06:41.080 | - So that's really interesting.
01:06:42.320 | So you've talked about agency,
01:06:45.300 | I think maybe calling it a convenient fiction
01:06:47.720 | at all these different levels.
01:06:49.580 | So even at the human individual level,
01:06:52.040 | it's kind of a fiction, we all believe,
01:06:53.920 | because we are, like you said, made of cells,
01:06:55.700 | and cells are made of atoms.
01:06:57.680 | So that's a useful fiction.
01:06:58.960 | And then there's nations.
01:07:00.840 | That seems to be a useful fiction.
01:07:02.860 | But it seems like some fictions are better than others.
01:07:06.880 | You know, there's a lot of people
01:07:07.840 | that argue the fiction of nation is a bad idea.
01:07:10.960 | One of them lives two doors down from me,
01:07:13.840 | Michael Malice, he's an anarchist.
01:07:16.120 | You know, I'm sure there's a lot of people
01:07:18.260 | who are into meditation that believe the idea,
01:07:21.840 | this useful fiction of agency of an individual
01:07:24.880 | is troublesome as well.
01:07:26.680 | We need to let go of that in order to truly,
01:07:29.440 | like to transcend, I don't know,
01:07:32.280 | I don't know what words you wanna use,
01:07:33.720 | but suffering or to elevate the experience of life.
01:07:38.420 | So you're kind of arguing that,
01:07:40.760 | okay, so we have some of these useful fictions of agency.
01:07:44.640 | We should add a stronger fiction
01:07:47.880 | that we tell ourselves about the agency of groups
01:07:51.480 | in the hundreds of the half of Dunbar's number,
01:07:56.160 | five X Dunbar's number.
01:07:57.880 | - Yeah, something on that order.
01:07:58.720 | And we call them fictions,
01:07:59.760 | but really they're rules of the game, right?
01:08:01.540 | Rules that we feel are fair or rules that we consent to.
01:08:05.680 | - I always question the rules when I lose, like a monopoly.
01:08:08.560 | That's when I usually question,
01:08:09.800 | when I'm winning, I don't question the rules.
01:08:11.520 | - We should play a game of Monopoly someday.
01:08:12.840 | There's a trippy version of it that we could do.
01:08:15.180 | - What kind?
01:08:16.240 | - Contract Monopoly, as introduced to me by a friend of mine,
01:08:19.160 | where you can write contracts on future earnings
01:08:23.080 | from landing on various things.
01:08:24.560 | And you can hand out, like,
01:08:26.720 | the first three times you land
01:08:27.780 | on Park Place, it's free or whatever.
01:08:29.520 | Just, and then you can start trading
01:08:31.040 | those contracts for money.
01:08:32.340 | - And then you create a human civilization
01:08:35.480 | and somehow Bitcoin comes into it.
01:08:38.280 | Okay, but some of these-
01:08:40.520 | - Actually, I bet if me and you and Eric
01:08:42.680 | sat down to play a game of Monopoly
01:08:44.160 | and we were to make NFTs out of the contracts we wrote,
01:08:46.280 | we could make a lot of money.
01:08:48.000 | Now it's a terrible idea.
01:08:49.080 | - Yes. - I would never do it,
01:08:50.160 | but I bet we could actually sell the NFTs around.
01:08:52.920 | - I have other ideas to make money that I could tell you,
01:08:56.840 | and they're all terrible ideas,
01:08:58.400 | including cat videos on the internet.
01:09:02.280 | Okay, but some of these rules of the game,
01:09:04.580 | some of these fictions are,
01:09:06.020 | it seems like they're better than others.
01:09:09.280 | - They have worked this far to cohere human,
01:09:13.320 | to organize human collective action.
01:09:14.840 | - But you're saying something about,
01:09:16.600 | especially this technological age
01:09:19.240 | requires modified fictions, stories of agency.
01:09:23.640 | Why the Dunbar number?
01:09:25.040 | And also, how do you select the group of people?
01:09:28.440 | Dunbar numbers, I think,
01:09:30.040 | I have the sense that it's overused
01:09:33.920 | as a kind of law that somehow
01:09:38.000 | we can have deep human connection at this scale.
01:09:41.280 | Like some of it feels like an interface problem too.
01:09:45.480 | It feels like if I have the right tools,
01:09:48.080 | I can deeply connect with a larger number of people.
01:09:51.840 | It just feels like there's a huge value
01:09:55.480 | to interacting just in person,
01:09:57.980 | getting to share traumatic experiences together,
01:10:01.000 | beautiful experiences together.
01:10:02.720 | There's other experiences
01:10:04.080 | like that in the digital space that you can share.
01:10:07.560 | It just feels like Dunbar's number
01:10:09.360 | can be expanded significantly,
01:10:10.800 | perhaps not to the level of millions and billions,
01:10:15.000 | but it feels like it could be expanded.
01:10:16.280 | So how do we find the right interface, you think,
01:10:20.240 | for having a little bit of a collective here
01:10:24.840 | that has agency?
01:10:26.080 | - You're right that there's many different ways
01:10:28.080 | that we can build trust with each other.
01:10:30.840 | My friend Joe Edelman talks about a few different ways:
01:10:33.980 | mutual appreciation, trustful conflict,
01:10:38.980 | just experiencing something.
01:10:41.320 | There's a variety of different things that we can do,
01:10:43.640 | but all those things take time and you have to be present.
01:10:48.480 | The less present you are, I mean, there's just, again,
01:10:50.320 | a no free lunch principle here.
01:10:51.560 | The less present you are, the more of them you can do,
01:10:54.200 | but then the less connection you build.
01:10:56.800 | So I think there is sort of a human capacity issue
01:10:59.440 | around some of these things.
01:11:00.260 | Now, that being said, if we can use certain technologies,
01:11:04.800 | so for instance, if I write a little monograph
01:11:07.540 | on my view of the world,
01:11:08.760 | you read it asynchronously at some point,
01:11:10.600 | and you're like, "Wow, Peter, this is great.
01:11:11.880 | Here's mine."
01:11:12.800 | I read it, I'm like, "Wow, Lex, this is awesome."
01:11:15.320 | We can be friends without having to spend 10 years,
01:11:19.440 | figuring all this stuff out together.
01:11:20.560 | We can just read each other's thing and be like,
01:11:22.120 | "Oh yeah, this guy's exactly in my wheelhouse
01:11:24.800 | and vice versa."
01:11:26.080 | And we can then connect just a few times a year
01:11:30.520 | and maintain a high trust relationship.
01:11:33.040 | It can be expanded a little bit, but it also requires,
01:11:35.840 | these things are not all technological in nature,
01:11:37.320 | it requires the individual themselves
01:11:39.640 | to have a certain level of capacity,
01:11:41.680 | to have a certain lack of neuroticism.
01:11:44.760 | If you wanna use the OCEAN Big Five sort of model,
01:11:48.080 | people have to be pretty centered.
01:11:49.680 | The less centered you are,
01:11:50.640 | the fewer authentic connections you can really build
01:11:52.880 | for a particular unit of time.
01:11:54.820 | It just takes more time.
01:11:55.880 | Other people have to put up with your crap.
01:11:57.360 | There's just a lot of the stuff that you have to deal with
01:12:00.000 | if you are not so well-balanced.
01:12:02.260 | So yes, we can help people get better
01:12:04.760 | to where they can develop more relationships faster,
01:12:06.880 | and then you can maybe expand Dunbar number by quite a bit,
01:12:09.560 | but you're not gonna do it.
01:12:10.640 | I think it's gonna be hard to get it beyond 10X,
01:12:12.880 | kind of the rough swag of what it is.
01:12:14.800 | - Well, don't you think that AI systems
01:12:19.880 | could be an addition to Dunbar's number?
01:12:22.640 | So like why-
01:12:23.480 | - Do you count as one system or multiple AI systems?
01:12:25.600 | - Multiple AI systems.
01:12:26.560 | So I do believe that AI systems,
01:12:28.800 | for them to integrate into human society as it is now,
01:12:31.400 | have to have a sense of agency.
01:12:32.640 | So there has to be a individual,
01:12:35.280 | because otherwise we wouldn't relate to them.
01:12:37.600 | - We could engage certain kinds of individuals
01:12:40.240 | to make sense of them for us and be almost like,
01:12:42.680 | did you ever watch "Star Trek," like Voyager?
01:12:45.520 | Like there's the Vorta, who are like the interfaces,
01:12:47.600 | the ambassadors for the Dominion.
01:12:50.440 | We may have ambassadors that speak
01:12:53.080 | on behalf of these systems.
01:12:54.380 | They're like the Mentats of Dune maybe,
01:12:56.120 | or something like this.
01:12:57.240 | I mean, we already have this to some extent.
01:12:59.280 | If you look at the biggest sort of,
01:13:01.120 | I wouldn't say AI system,
01:13:02.100 | but the biggest cybernetic system in the world
01:13:04.040 | is the financial markets.
01:13:05.120 | It runs outside of any individual's control.
01:13:07.960 | And you have an entire stack of people on Wall Street,
01:13:09.880 | Wall Street analysts, to CNBC reporters, whatever.
01:13:13.240 | They're all helping to communicate what does this mean?
01:13:16.920 | You know, Jim Cramer, like running around
01:13:18.760 | and yelling and stuff.
01:13:19.580 | Like all of these people are part of that lowering
01:13:22.580 | of the complexity there to make sense,
01:13:26.520 | to help do sense-making for people
01:13:28.440 | at whatever capacity they're at.
01:13:29.760 | And I don't see this changing with AI systems.
01:13:31.560 | I think you would have ringside commentators
01:13:33.360 | talking about all this stuff
01:13:34.560 | that this AI system is trying to do over here, over here.
01:13:36.600 | 'Cause it's actually a super intelligence.
01:13:39.120 | So if you wanna talk about humans interfacing,
01:13:40.800 | making first contact with the super intelligence,
01:13:42.460 | we're already there.
01:13:43.600 | We do it pretty poorly.
01:13:44.800 | And if you look at the gradient of power and money,
01:13:47.240 | what happens is the people closest to it
01:13:48.800 | will absolutely exploit their distance
01:13:50.960 | for personal financial gain.
01:13:54.360 | So we should look at that and be like,
01:13:56.080 | oh, well, that's probably what the future
01:13:57.320 | will look like as well.
01:13:58.880 | But nonetheless, I mean,
01:14:00.220 | we're already doing this kind of thing.
01:14:01.360 | So in the future, we can have AI systems,
01:14:03.800 | but you're still gonna have to trust people
01:14:05.700 | to bridge the sense-making gap to them.
01:14:08.400 | - See, I just feel like there could be
01:14:10.760 | of like millions of AI systems that have agencies.
01:14:14.760 | When you say one super intelligence,
01:14:19.480 | super intelligence in that context means
01:14:22.280 | it's able to solve particular problems extremely well.
01:14:26.080 | But there's some aspect of human-like intelligence
01:14:29.240 | that's necessary to be integrated into human society.
01:14:32.300 | So not financial markets,
01:14:33.720 | not sort of weather prediction systems,
01:14:36.740 | or I don't know, logistics optimization.
01:14:39.680 | I'm more referring to things that you interact with
01:14:43.240 | on the intellectual level.
01:14:45.120 | And that I think requires, there has to be a backstory.
01:14:48.920 | There has to be a personality.
01:14:50.080 | I believe it has to fear its own mortality in a genuine way.
01:14:53.320 | Like there has to be all,
01:14:56.540 | many of the elements that we humans experience
01:14:59.680 | that are fundamental to the human condition,
01:15:01.920 | because otherwise we would not have
01:15:03.840 | a deep connection with it.
01:15:05.840 | But I don't think having a deep connection with it
01:15:07.780 | is necessarily going to stop us from building a thing
01:15:10.580 | that has quite an alien intelligence aspect to it.
01:15:13.340 | So the other kind of alien intelligence on this planet
01:15:16.620 | is octopuses or octopodes or whatever you wanna call them.
01:15:20.180 | Octopi, yeah, there's a little controversy
01:15:22.420 | as to what the plural is, I guess.
01:15:23.700 | But an octopus. - I look forward
01:15:25.860 | to your letters.
01:15:26.700 | - Yeah, an octopus, it really acts
01:15:31.140 | as a collective intelligence of eight intelligent arms.
01:15:34.340 | Its arms have a tremendous amount of neural density to them.
01:15:36.980 | And I see if we can build,
01:15:40.340 | I mean, just let's go with what you're saying.
01:15:42.000 | If we build a singular intelligence
01:15:44.380 | that interfaces with humans, that has a sense of agency
01:15:48.080 | so it can run the cybernetic loop
01:15:49.620 | and develop its own theory of mind,
01:15:51.080 | as well as its theory of action,
01:15:52.940 | all of these things, I agree with you,
01:15:54.020 | that that's the necessary components
01:15:56.240 | to build a real intelligence, right?
01:15:57.800 | There's gotta be something at stake.
01:15:58.700 | It's gotta make a decision.
01:15:59.980 | It's gotta then run the OODA loop.
01:16:01.260 | Okay, so we build one of those.
01:16:02.980 | Well, if we can build one of those,
01:16:03.820 | we can probably build 5 million of them.
01:16:05.620 | So we build 5 million of them.
01:16:07.380 | And if their cognitive systems are already digitized
01:16:09.940 | and already kind of there,
01:16:12.020 | we stick an antenna on each of them,
01:16:13.700 | bring it all back to a hive mind
01:16:15.400 | that maybe doesn't make all the individual decisions
01:16:17.580 | for them, but treats each one
01:16:19.340 | as almost like a neuronal input
01:16:21.440 | of a much higher bandwidth and fidelity,
01:16:23.820 | going back to a central system
01:16:25.860 | that is then able to perceive much broader dynamics
01:16:30.180 | that we can't see.
01:16:31.060 | In the same way that a phased array radar, right?
01:16:32.580 | You think about how phased array radar works.
01:16:34.420 | It's just sensitivity.
01:16:36.220 | It's just radars, and then it's hypersensitivity
01:16:39.100 | and really great timing between all of them.
01:16:41.140 | And with a flat array,
01:16:42.620 | it's as good as a curved radar dish, right?
01:16:44.740 | So with these things,
01:16:45.580 | it's a phased array of cybernetic systems
01:16:47.800 | that'll give the centralized intelligence
01:16:50.140 | much, much better, much higher fidelity understanding
01:16:55.060 | of what's actually happening in the environment.
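For readers curious about the phased-array analogy itself, here is a minimal delay-and-sum beamforming sketch in Python; the element count, spacing, frequency, and angles are made-up illustrative values, and the point is only that precise relative phasing lets a flat array of identical sensors steer its sensitivity the way a curved dish does mechanically.

```python
# Minimal delay-and-sum (phased-array) beamforming sketch.
# All parameters below are illustrative assumptions, not values from the episode.
import numpy as np

c = 343.0                        # propagation speed (m/s); sound, for easy numbers
f = 1000.0                       # signal frequency (Hz)
wavelength = c / f
n_elements = 8
element_x = np.arange(n_elements) * (wavelength / 2)   # half-wavelength spacing

def array_gain(steer_deg: float, arrival_deg: float) -> float:
    """Normalized response of the array, electronically steered to steer_deg,
    for a plane wave arriving from arrival_deg (0 deg = broadside)."""
    k = 2 * np.pi / wavelength
    # Phase weights that make signals from the steering direction add coherently.
    weights = np.exp(-1j * k * element_x * np.sin(np.radians(steer_deg)))
    # Phase the arriving wavefront actually imposes across the elements.
    wavefront = np.exp(1j * k * element_x * np.sin(np.radians(arrival_deg)))
    return abs(np.sum(weights * wavefront)) / n_elements

print(array_gain(steer_deg=30, arrival_deg=30))    # ~1.0: coherent sum on target
print(array_gain(steer_deg=30, arrival_deg=-20))   # much smaller: off-target arrival
```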
01:16:56.580 | - But the more power,
01:16:57.740 | the more understanding the central superintelligence has,
01:17:02.500 | the dumber the individual fingers
01:17:06.600 | of this intelligence are, I think.
01:17:08.100 | - Not necessarily.
01:17:09.020 | - In my sense, this is an argument.
01:17:11.820 | There has to be the experience of the individual agent
01:17:15.660 | has to have the full richness of the human-like experience.
01:17:20.660 | You have to be able to be driving the car in the rain,
01:17:23.820 | listening to Bruce Springsteen,
01:17:25.220 | and all of a sudden break out in tears
01:17:28.260 | because remembering something that happened to you
01:17:30.580 | in high school.
01:17:31.420 | - We can implant those memories
01:17:32.240 | if that's really needed.
01:17:33.080 | But no.
01:17:33.900 | - No, but the central agency,
01:17:35.180 | I guess I'm saying in my view,
01:17:37.780 | for intelligence to be born,
01:17:39.660 | you have to have a decentralization.
01:17:43.620 | Each one has to struggle and reach.
01:17:47.340 | So each one in excess of energy has to reach for order
01:17:51.940 | as opposed to a central place doing so.
01:17:54.460 | - Have you ever read some sci-fi
01:17:55.700 | where there's hive minds?
01:18:01.140 | Vernor Vinge, I think, has one of these,
01:18:01.140 | and then some of the stuff from,
01:18:03.720 | yes, on the "Commonwealth Saga,"
01:18:05.200 | the idea that you're an individual,
01:18:07.000 | but you're connected with a few other individuals
01:18:09.280 | telepathically as well,
01:18:10.340 | and together you form a swarm.
01:18:12.620 | So if you are, I ask you,
01:18:14.640 | what do you think is the experience of if you are,
01:18:17.900 | like, well, a Borg, right?
01:18:18.960 | If you are one, if you're part of this hive mind,
01:18:22.640 | outside of all the aesthetics, forget the aesthetics,
01:18:25.420 | internally, what is your experience like?
01:18:28.400 | 'Cause I have a theory as to what that looks like.
01:18:30.660 | The one question I have for you about that experience
01:18:33.640 | is how much is there a feeling of freedom, of free will?
01:18:38.640 | Because I, obviously, as a human, very unbiased,
01:18:43.340 | but also somebody who values freedom and biased,
01:18:46.160 | it feels like the experience of freedom is essential
01:18:49.680 | for trying stuff out, to being creative
01:18:54.680 | and doing something truly novel, which is at the core of--
01:18:59.140 | - Yeah, well, I don't think you have to lose any freedom
01:19:00.960 | when you're in that mode,
01:19:02.080 | because I think what happens is we think,
01:19:04.600 | we still think, I mean, you're still thinking about this
01:19:06.960 | in a sense of a top-down command and control hierarchy,
01:19:09.840 | which is not what it has to be at all.
01:19:12.320 | I think the experience, so I'll just show my cards here.
01:19:16.080 | I think the experience of being a robot in that robot swarm,
01:19:19.780 | a robot who has agency over their own local environment
01:19:22.880 | that's doing sense-making
01:19:23.920 | and reporting it back to the hive mind,
01:19:26.960 | I think that robot's experience would be,
01:19:29.140 | one, when the hive mind is working well,
01:19:31.980 | it would be an experience of talking to God,
01:19:35.340 | that you essentially are reporting to,
01:19:38.460 | you're sort of saying, "Here's what I see.
01:19:39.660 | "I think this is what's gonna happen over here.
01:19:41.080 | "I'm gonna go do this thing,
01:19:42.140 | "because I think if I'm gonna do this,
01:19:43.760 | "this will make this change happen in the environment."
01:19:46.420 | And then, God, she may tell you, "That's great.
01:19:51.200 | "And in fact, your brothers and sisters will join you
01:19:53.300 | "to help make this go better."
01:19:55.040 | And then she can let your brothers and sisters
01:19:56.900 | know, "Hey, Peter's gonna go do this thing.
01:19:59.420 | "Would you like to help him?
01:20:00.660 | "Because we think that this will make this thing go better."
01:20:02.420 | And they'll say, "Yes, we'll help him."
01:20:03.840 | So the whole thing could be actually a very emergent,
01:20:06.420 | the sense of what does it feel like to be a cell
01:20:10.080 | in a network that is alive, that is generative?
01:20:12.740 | And I think actually the feeling is serendipity,
01:20:16.140 | that there's random order,
01:20:19.380 | not random disorder or chaos, but random order.
01:20:22.380 | Just when you need to hear Bruce Springsteen,
01:20:24.740 | you turn on the radio and bam,
01:20:26.780 | it's Bruce Springsteen.
01:20:28.540 | That feeling of serendipity,
01:20:30.100 | I feel like this is a bit of a flight of fancy,
01:20:32.460 | but every cell in your body must have,
01:20:35.940 | what does it feel like to be a cell in your body?
01:20:37.900 | When it needs sugar, there's sugar.
01:20:39.940 | When it needs oxygen, there's just oxygen.
01:20:42.220 | Now, when it needs to go and do its work
01:20:43.740 | and pull as one of your muscle fibers,
01:20:46.740 | it does its work and it's great.
01:20:48.820 | It contributes to the cause.
01:20:50.200 | So this is all, again, a flight of fancy,
01:20:52.540 | but I think as we extrapolate up,
01:20:54.460 | what does it feel like to be an independent individual
01:20:57.140 | with some bounded sense of freedom?
01:20:58.780 | All sense of freedom is actually bounded,
01:21:00.220 | but with a bounded sense of freedom
01:21:01.860 | that still lives within a network that has order to it.
01:21:04.980 | And I feel like it has to be a feeling of serendipity.
01:21:07.220 | - So the cell, there's a feeling of serendipity,
01:21:09.940 | even though-
01:21:11.300 | - It has no way of explaining
01:21:12.260 | why it's getting oxygen and sugar when it gets it.
01:21:14.100 | - So you have to, each individual component
01:21:16.180 | has to be too dumb to understand the big picture.
01:21:20.340 | - No, the big picture is bigger
01:21:21.500 | than what it can understand.
01:21:22.980 | - But isn't that an essential characteristic
01:21:24.780 | of the individual is to be too dumb
01:21:28.060 | to understand the bigger picture?
01:21:29.580 | Like, not dumb necessarily,
01:21:31.260 | but limited in its capacity to understand.
01:21:34.140 | 'Cause the moment, okay.
01:21:35.220 | The moment you understand, I feel like that leads to,
01:21:38.900 | if you tell me now that there's some bigger intelligence
01:21:43.540 | controlling everything I do,
01:21:45.780 | intelligence broadly defined, meaning like,
01:21:48.900 | even the Sam Harris thing, there's no free will.
01:21:51.620 | If I'm smart enough to truly understand
01:21:54.460 | that that's the case, that's gonna, I don't know if I-
01:21:59.020 | - We have a philosophical breakdown, right?
01:22:01.020 | Because we're in the West and we're pumped full
01:22:03.100 | of this stuff of like, you are a golden,
01:22:05.740 | fully free individual with all your freedoms
01:22:07.740 | and all your liberties and go grab a gun
01:22:09.380 | and shoot whatever you want to.
01:22:10.540 | No, it's actually, you don't actually have a lot of these.
01:22:14.540 | You're not unconstrained,
01:22:15.880 | but the areas where you can manifest agency,
01:22:20.220 | you're free to do those things.
01:22:21.900 | You can say whatever you want on this podcast.
01:22:23.300 | You can create a podcast, right?
01:22:24.620 | - Yeah.
01:22:25.460 | - You're not, I mean, you have a lot of this kind of freedom,
01:22:27.860 | but even as you're doing this, you are actually,
01:22:30.140 | I guess the denouement of all of this is that
01:22:33.500 | we are already intelligent agents in such a system, right?
01:22:37.820 | In that, one of these robots, one of 5 million
01:22:41.020 | little swarm robots, or one of the Borg,
01:22:43.580 | they're just posting on an internal bulletin board.
01:22:45.420 | I mean, maybe the Borg cube is just a giant Facebook machine
01:22:47.620 | floating in space and everyone's just posting on there.
01:22:50.540 | They're just posting really fast and like, oh yeah.
01:22:52.860 | - It's called the metaverse now.
01:22:53.820 | - The net's called the metaverse.
01:22:54.700 | That's right.
01:22:55.520 | Here's the Enterprise.
01:22:56.360 | Maybe we should all go shoot it.
01:22:57.200 | Yeah, everyone upvotes and they're gonna go shoot it.
01:22:58.940 | But we already are part of a human online
01:23:02.220 | collaborative environment
01:23:03.720 | and collaborative sense-making system.
01:23:05.720 | It's not very good yet.
01:23:07.380 | It's got the overhangs of zombie sense-making institutions
01:23:10.880 | all over it, but as that washes away
01:23:13.420 | and as we get better at this,
01:23:15.540 | we are going to see humanity improving at speeds
01:23:19.540 | that are unthinkable in the past.
01:23:21.980 | And it's not because anyone's freedoms were limited.
01:23:23.820 | In fact, the open-source,
01:23:24.660 | I mean, we started this with open-source software, right?
01:23:26.820 | The collaboration, what the internet surfaced
01:23:29.340 | was the ability for people all over the world
01:23:31.340 | to collaborate and produce
01:23:32.620 | some of the most foundational software that's in use today.
01:23:35.780 | Right, that entire ecosystem was created
01:23:37.060 | by collaborators all over the place.
01:23:38.920 | So these online kind of swarm kind of things are not novel.
01:23:44.400 | It's just, I'm just suggesting that future AI systems,
01:23:47.480 | if you can build one smart system,
01:23:49.520 | you have no reason not to build multiple.
01:23:51.560 | If you build multiple,
01:23:52.400 | there's no reason not to integrate them all
01:23:53.840 | into a collective sense-making substrate.
01:23:57.800 | And that thing will certainly have emergent intelligence
01:24:00.480 | that no individuals
01:24:01.800 | and probably not any of the human designers
01:24:03.560 | will be able to really put a bow around and explain.
01:24:06.880 | - But in some sense, would that AI system
01:24:09.960 | still be able to go like rural Texas, buy a ranch,
01:24:14.560 | go off the grid, go full survivalist?
01:24:17.720 | Can you disconnect from the hive mind?
01:24:20.580 | - You may not want to.
01:24:23.660 | - So to be effective, to be intelligent.
01:24:28.120 | - You have access to way more intelligence capability
01:24:30.360 | if you're plugged into 5 million
01:24:31.400 | other really, really smart cyborgs.
01:24:33.460 | Why would you leave?
01:24:34.920 | - So like there's a word control that comes to mind.
01:24:38.280 | So it doesn't feel like control,
01:24:40.560 | like overbearing control.
01:24:44.240 | It's just-
01:24:45.360 | - I think systems, well, this is to your point.
01:24:47.120 | I mean, look at how uncomfortable you are
01:24:49.280 | with this concept, right?
01:24:50.600 | I think systems that feel like overbearing control
01:24:53.240 | will not evolutionarily win out.
01:24:55.760 | I think systems that give their individual elements
01:24:58.380 | the feeling of serendipity and the feeling of agency,
01:25:01.440 | that those systems will win.
01:25:05.120 | But that's not to say
01:25:05.960 | that there will not be emergent higher level order.
01:25:08.000 | On top of it.
01:25:08.840 | And that's the thing, that's the philosophical breakdown
01:25:12.000 | that we're staring right at,
01:25:14.160 | which is in the Western mind,
01:25:15.480 | I think there's a very sharp delineation
01:25:18.180 | between explicit control,
01:25:20.300 | Cartesian, like what is the vector?
01:25:23.940 | Where is the position?
01:25:24.920 | Where is it going?
01:25:26.240 | It's completely deterministic.
01:25:27.920 | And kind of this idea that things emerge.
01:25:31.480 | Everything we see is the emergent patterns of other things.
01:25:35.440 | And there is agency when there's extra energy.
01:25:38.140 | - So you have spoken about a kind of meaning crisis
01:25:43.720 | that we're going through.
01:25:44.980 | But it feels like since we invented sex and death,
01:25:50.940 | we broadly speaking,
01:25:54.040 | we've been searching for a kind of meaning.
01:25:56.320 | So it feels like a human civilization
01:25:58.400 | has been going through a meaning crisis
01:25:59.760 | of different flavors throughout its history.
01:26:02.320 | Why is, how is this particular meaning crisis different?
01:26:07.320 | Or is it really a crisis and it wasn't previously?
01:26:10.560 | What's your sense?
01:26:11.600 | - A lot of human history,
01:26:13.200 | there wasn't so much a meaning crisis.
01:26:14.780 | There was just a like food
01:26:15.840 | and not getting eaten by bears crisis, right?
01:26:18.680 | Once you get to a point where you can make food,
01:26:20.400 | there was the like not getting killed
01:26:21.840 | by other humans crisis.
01:26:23.640 | So sitting around wondering what is it all about,
01:26:26.320 | it's actually a relatively recent luxury.
01:26:29.760 | And to some extent,
01:26:31.600 | the meaning crisis coming out of that is precisely because,
01:26:35.120 | well, it's not precisely because I believe
01:26:37.080 | that meaning is the consequence of
01:26:39.280 | when we make consequential decisions,
01:26:43.380 | it's tied to agency, right?
01:26:45.640 | When we make consequential decisions,
01:26:47.960 | that generates meaning.
01:26:49.840 | So if we make a lot of decisions,
01:26:51.200 | but we don't see the consequences of them,
01:26:53.300 | then it feels like what was the point, right?
01:26:55.480 | But if there's all these big things happening,
01:26:57.140 | but we're just along for the ride,
01:26:58.440 | then it also does not feel very meaningful.
01:27:00.680 | Meaning, as far as I can tell,
01:27:02.040 | this is my working definition of circa 2021,
01:27:04.600 | is generally the result of a person
01:27:08.400 | making a consequential decision,
01:27:09.800 | acting on it and then seeing the consequences of it.
01:27:12.160 | So historically, just when humans are in survival mode,
01:27:16.720 | you're making consequential decisions all the time.
01:27:19.500 | So there's not a lack of meaning
01:27:20.900 | because like you either got eaten or you didn't, right?
01:27:23.440 | You got some food and that's great, you feel good.
01:27:25.560 | Like these are all consequential decisions.
01:27:27.440 | Only in the post fossil fuel and industrial revolution,
01:27:32.440 | could we create a massive leisure class
01:27:36.920 | that could sit around not being threatened by bears,
01:27:39.320 | not starving to death,
01:27:40.520 | making decisions somewhat,
01:27:44.840 | but a lot of times not seeing the consequences
01:27:47.440 | of any decisions they make.
01:27:49.240 | The general sort of sense of anomie,
01:27:51.600 | I think that's the French term for it,
01:27:53.480 | in the wake of the consumer society,
01:27:55.600 | in the wake of mass media telling everyone,
01:27:58.680 | hey, choosing between Hermes and Chanel
01:28:02.160 | is a meaningful decision.
01:28:03.240 | No, it's not.
01:28:04.080 | - I don't know what either of those mean.
01:28:05.760 | - Oh, they're high-end luxury purses and crap like that.
01:28:10.760 | But the point is that we give people the idea
01:28:13.680 | that consumption is meaning,
01:28:15.200 | that making a choice of this team versus that team,
01:28:17.560 | spectating has meaning.
01:28:20.240 | So we produce all of these different things
01:28:22.580 | that are as if meaning, right?
01:28:25.400 | But really making a decision
01:28:27.000 | that has no consequences for us.
01:28:28.960 | And so that creates the meaning crisis.
01:28:31.080 | - Well, you're saying choosing between Chanel
01:28:33.400 | and the other one has no consequence.
01:28:35.440 | I mean, why is one more meaningful than the other?
01:28:38.320 | - It's not that it's more meaningful than the other,
01:28:39.640 | it's that you make a decision between these two brands
01:28:42.600 | and you're told this brand will make me look better
01:28:45.320 | in front of other people.
01:28:46.140 | If I buy this brand of car,
01:28:47.720 | if I wear that brand of apparel, right?
01:28:50.240 | The idea, like a lot of decisions we make
01:28:52.600 | are around consumption,
01:28:54.320 | but consumption by itself doesn't actually yield meaning.
01:28:57.240 | Gaining social status does provide meaning.
01:29:00.080 | So that's why in this era of abundant production,
01:29:04.000 | so many things turn into status games.
01:29:07.520 | The NFT kind of explosion is a similar kind of thing.
01:29:10.020 | Everywhere there are status games
01:29:12.000 | because we just have so much excess production.
01:29:15.480 | - But aren't those status games a source of meaning?
01:29:18.440 | Like why do the games we play have to be grounded
01:29:22.480 | in physical reality like they are
01:29:24.240 | when you're trying to run away from lions?
01:29:26.200 | Why can't we in this virtuality world, on social media,
01:29:30.440 | why can't we play the games on social media,
01:29:32.280 | even the dark ones?
01:29:33.440 | - Right, we can, we can.
01:29:35.280 | - But you're saying that's creating a meaning crisis.
01:29:37.760 | - Well, there's a meaning crisis
01:29:39.200 | in that there's two aspects of it.
01:29:41.120 | Number one, playing those kinds of status games
01:29:44.440 | oftentimes requires destroying the planet
01:29:47.420 | because it ties to consumption,
01:29:52.620 | consuming the latest and greatest version of a thing,
01:29:55.100 | buying the latest limited edition sneaker
01:29:57.620 | and throwing out all the old ones.
01:29:58.900 | Maybe they keep the old ones,
01:29:59.740 | but the amount of sneakers we have to cut up
01:30:01.740 | and destroy every year to create artificial scarcity
01:30:05.020 | for the next generation, right?
01:30:06.460 | This is kind of stuff that's not great.
01:30:08.500 | It's not great at all.
01:30:09.600 | So conspicuous consumption fueling status games
01:30:14.120 | is really bad for the planet, not sustainable.
01:30:17.200 | The second thing is you can play these kinds of status games
01:30:20.780 | but then what it does is it renders you captured
01:30:23.500 | to the virtual environment.
01:30:25.420 | The status games that really wealthy people are playing
01:30:27.780 | are all around the hard resources
01:30:30.260 | where they're gonna build the factories,
01:30:31.460 | they're gonna have the fuel and the rare earths
01:30:32.820 | to make the next generation of robots.
01:30:34.260 | They're then going to run circles around you
01:30:36.660 | and your children.
01:30:38.180 | So that's another reason not to play
01:30:39.700 | those virtual status games.
01:30:41.100 | - So you're saying ultimately the big picture game
01:30:44.700 | is won by people who have access or control
01:30:48.340 | over actual hard resources.
01:30:49.860 | So you don't see a society where most of the games
01:30:54.340 | are played in the virtual space.
01:30:56.860 | - They'll be captured in the physical space.
01:30:58.940 | It all builds.
01:30:59.780 | It's just like the stack of human being, right?
01:31:02.360 | If you only play the game at the cultural
01:31:05.980 | and then intellectual level,
01:31:07.540 | then the people with the hard resources
01:31:08.820 | and access to layer zero physical are going to own you.
01:31:12.940 | - But isn't money not connected to
01:31:15.140 | or less and less connected to hard resources
01:31:17.380 | and money still seems to work?
01:31:18.880 | It's a virtual technology.
01:31:20.580 | - There's different kinds of money.
01:31:22.380 | Part of the reason that some of the stuff
01:31:23.740 | is able to go a little unhinged
01:31:26.260 | is because the big sovereignties where one spends money
01:31:31.260 | and uses money and plays money games and inflates money,
01:31:36.260 | their ability to adjudicate the physical resources
01:31:40.340 | and hard resources and land and things like that,
01:31:42.380 | those have not been challenged in a very long time.
01:31:45.620 | - So, we went off the gold standard.
01:31:47.660 | Most money is not connected to physical resources.
01:31:51.540 | It's an idea.
01:31:52.960 | And that idea is very closely connected to status.
01:31:57.880 | - But it's also tied to, it's actually tied to law.
01:32:03.160 | It is tied to some physical hard things, right?
01:32:05.040 | You have to pay your taxes.
01:32:06.080 | - Yes, so it's always at the end going to be connected
01:32:10.080 | to the blockchain of physical reality.
01:32:12.760 | So, in the case of law and taxes,
01:32:15.560 | it's connected to government
01:32:17.340 | and government is, well, violence is the,
01:32:21.640 | I'm playing a stack of devil's advocates here.
01:32:25.320 | I'm popping one devil off the stack at a time.
01:32:30.680 | Isn't ultimately, of course,
01:32:31.760 | it'll be connected to physical reality,
01:32:33.160 | but just because people control the physical reality
01:32:35.760 | doesn't mean the status,
01:32:37.840 | LeBron James in theory could make more money
01:32:39.800 | than the owners of the teams in theory.
01:32:43.440 | And to me, that's a virtual idea.
01:32:45.040 | So somebody else constructed a game
01:32:47.500 | and now you're playing in the space of virtual,
01:32:50.040 | in the virtual space of the game.
01:32:51.900 | And so it just feels like there could be games
01:32:54.700 | where status, we build realities
01:32:57.580 | that give us meaning in the virtual space.
01:33:00.120 | Like I can imagine such things being possible.
01:33:02.960 | - Oh yeah, okay.
01:33:03.800 | So I see what you're,
01:33:04.640 | I think I see what you're saying there.
01:33:05.640 | With the idea there, I mean,
01:33:07.680 | we'll take the LeBron James side
01:33:08.960 | and put in like some YouTube influencer.
01:33:10.820 | - Yes, sure.
01:33:11.660 | - Right, so the YouTube influencer,
01:33:13.580 | it is status games, but at a certain level,
01:33:16.600 | it precipitates into real dollars.
01:33:19.040 | And into like, well, you look at Mr. Beast, right?
01:33:21.280 | He's like sending off half a million dollars
01:33:23.360 | worth of fireworks or something, right?
01:33:24.600 | In a YouTube video.
01:33:25.820 | - And also like saving, you know,
01:33:27.200 | like saving trees and so on.
01:33:28.640 | - Sure, right.
01:33:29.480 | And trying to plant a million trees
01:33:30.300 | with Mark Rober or whatever it was.
01:33:31.140 | Yeah, like it's not that those kinds of games
01:33:33.100 | can't lead to real consequences.
01:33:35.480 | It's that for the vast majority of people
01:33:38.680 | in consumer culture,
01:33:41.120 | they are incented by the,
01:33:44.660 | I would say mostly I'm thinking about middle-class consumers.
01:33:47.820 | They're incented by advertisements.
01:33:49.700 | They're incented by their memetic environment
01:33:51.780 | to treat the purchasing of certain things,
01:33:55.660 | the need to buy the latest model of whatever,
01:33:57.360 | the need to appear however,
01:33:59.220 | the need to pursue status games as a driver of meaning.
01:34:03.820 | And my point would be that
01:34:04.900 | it's a very hollow driver of meaning.
01:34:07.220 | And that is what creates a meaning crisis.
01:34:09.600 | Because at the end of the day,
01:34:11.380 | it's like eating a lot of empty calories, right?
01:34:13.400 | Yeah, it tasted good going down,
01:34:14.660 | it's a lot of sugar, but man, it did not,
01:34:16.220 | it was not enough protein to help build your muscles.
01:34:18.380 | And you kind of feel that in your gut.
01:34:20.580 | And I think that's, I mean,
01:34:21.580 | to all this stuff aside
01:34:22.660 | and setting aside our discussion on currency,
01:34:24.220 | which I hope we get back to,
01:34:25.980 | that's what I mean about the meaning crisis,
01:34:28.860 | part of it being created by the fact that we don't,
01:34:31.620 | we're not encouraged to have
01:34:34.700 | more and more direct relationships.
01:34:36.660 | We're actually alienated from relating to,
01:34:40.320 | even our family members sometimes, right?
01:34:42.920 | We're encouraged to relate to brands.
01:34:46.520 | We're encouraged to relate to these kinds of things
01:34:48.760 | that then tell us to do things
01:34:51.760 | that are really of low consequence.
01:34:53.800 | And that's where the meaning crisis comes from.
01:34:55.480 | - So the role of technology in this,
01:34:57.240 | so there's somebody you mentioned who's Jacques Ellul,
01:35:00.640 | his view of technology,
01:35:04.240 | he warns about the towering piles of technique,
01:35:06.940 | which I guess is a broad idea of technology.
01:35:10.260 | So I think, correct me if I'm wrong,
01:35:12.400 | for him, technology is bad,
01:35:15.580 | it moves away from human nature
01:35:17.180 | and it's ultimately destructive.
01:35:19.340 | My question broadly speaking, this meaning crisis,
01:35:21.460 | can technology, what are the pros and cons of technology?
01:35:24.340 | Can it be a good?
01:35:25.540 | - Yeah, I think it can be.
01:35:26.900 | I certainly draw on some of Ellul's ideas
01:35:29.420 | and I think some of them are pretty good,
01:35:32.560 | but the way he defines technique is,
01:35:35.900 | well, also Simondon as well.
01:35:37.420 | I mean, he speaks to the general mentality of efficiency,
01:35:40.940 | homogenized processes, homogenized production,
01:35:43.340 | homogenized labor to produce homogenized artifacts
01:35:46.820 | that then are not actually,
01:35:49.100 | they don't sit well in the environment.
01:35:52.780 | So it's essentially,
01:35:53.620 | you can think of it as the antonym of craft,
01:35:57.740 | whereas a craftsman will come to a problem,
01:36:01.940 | maybe a piece of wood and they need to make into a chair,
01:36:04.200 | it may be a site to build a house or build a stable
01:36:06.480 | or build whatever,
01:36:08.440 | and they will consider how to bring various things in
01:36:12.200 | to build something well contextualized
01:36:15.000 | that's in right relationship with that environment.
01:36:20.000 | But the way we have driven technology
01:36:22.280 | over the last 100, 150 years is not that at all.
01:36:25.720 | It is how can we make sure the input materials
01:36:30.480 | are homogenized, cut to the same size,
01:36:33.380 | diluted and doped to exactly the right
01:36:35.240 | alloy concentrations.
01:36:36.820 | How do we create machines that then consume
01:36:38.380 | exactly the right kind of energy
01:36:39.460 | to be able to run at this high speed
01:36:40.620 | to stamp out the same parts,
01:36:42.580 | which then go out the door,
01:36:44.080 | everyone gets the same tickle me Elmo.
01:36:45.740 | And the reason why everyone wants it
01:36:46.780 | is because we have broadcasts that tell everyone
01:36:49.280 | this is the cool thing.
01:36:50.540 | So we homogenize demand, right?
01:36:52.600 | And we're like Baudrillard and other critiques
01:36:55.460 | of modernity coming from that direction,
01:36:57.540 | the Situationists as well.
01:36:59.260 | Their point is that at this point in time,
01:37:02.100 | consumption is the thing that drives
01:37:04.600 | a lot of the economic stuff, not the need,
01:37:06.700 | but the need to consume and build status games on top.
01:37:09.420 | So we have homogenized when we discovered,
01:37:12.140 | I think this is really like Bernays and stuff, right?
01:37:14.820 | In the early 20th century, we discovered we can create,
01:37:17.940 | we can create demand, we can create desire
01:37:20.900 | in a way that was not possible before
01:37:23.580 | because of broadcast media.
01:37:25.580 | And not only do we create desire,
01:37:27.700 | we don't create a desire for each person
01:37:29.380 | to connect to some bespoke thing,
01:37:31.100 | to build a relationship with their neighbor or their spouse.
01:37:33.660 | We are telling them, you need to consume this brand.
01:37:36.100 | You need to drive this vehicle.
01:37:37.220 | You got to listen to this music.
01:37:38.300 | Have you heard this?
01:37:39.340 | Have you seen this movie, right?
01:37:40.940 | So creating homogenized demand makes it really cheap
01:37:44.820 | to create homogenized product.
01:37:46.540 | And now you have economics of scale.
01:37:48.580 | So we make the same tickle me Elmo,
01:37:50.060 | give it to all the kids and all the kids are like,
01:37:52.820 | hey, I got a tickle me Elmo, right?
01:37:54.440 | So this is ultimately where this ties in then
01:37:58.660 | to runaway hyper capitalism is that we then,
01:38:03.080 | capitalism is always looking for growth.
01:38:04.840 | It's always looking for growth
01:38:05.980 | and growth only happens to the margins.
01:38:07.960 | So you have to squeeze more and more demand out.
01:38:09.960 | You got to make it cheaper and cheaper
01:38:11.080 | to make the same thing,
01:38:12.280 | but tell everyone they're still getting meaning from it.
01:38:15.120 | You're still like, this is still your tickle me Elmo, right?
01:38:18.080 | And we see little bits of this dripping,
01:38:20.960 | critiques of this dripping in popular culture.
01:38:22.840 | You see it sometimes it's when Buzz Lightyear
01:38:25.980 | walks into the thing, he's like,
01:38:27.820 | oh my God, at the toy store, I'm just a toy.
01:38:30.680 | Like there's millions of other,
01:38:31.800 | or there's hundreds of other Buzz Lightyear's
01:38:33.440 | just like me, right?
01:38:34.800 | That is, I think, a fun Pixar critique
01:38:38.120 | on this homogenization dynamic.
01:38:40.120 | - I agree with you on most of the things you're saying.
01:38:42.960 | So I'm playing devil's advocate here,
01:38:44.600 | but this homogenized machine of capitalism
01:38:50.640 | is also the thing that is able to fund,
01:38:54.280 | if channeled correctly, innovation, invention,
01:38:59.120 | and development of totally new things
01:39:00.800 | that in the best possible world
01:39:02.320 | create all kinds of new experiences
01:39:03.880 | that can enrich lives,
01:39:05.640 | the quality of lives for all kinds of people.
01:39:09.880 | So isn't this the machine
01:39:12.400 | that actually enables the experiences
01:39:15.160 | and more and more experiences that would then give meaning?
01:39:18.680 | - It has done that to some extent.
01:39:21.080 | I mean, it's not all good or bad in my perspective.
01:39:24.720 | We can always look backwards
01:39:26.800 | and offer a critique of the path we've taken
01:39:29.160 | to get to this point in time,
01:39:30.680 | but that's a different, that's somewhat different
01:39:33.800 | and informs the discussion,
01:39:35.940 | but it's somewhat different than the question
01:39:37.720 | of where do we go in the future, right?
01:39:40.640 | Is this still the same rocket we need to ride
01:39:42.760 | to get to the next point?
01:39:43.600 | Will it even get us to the next point?
01:39:44.600 | - Well, how does this, so you're predicting the future,
01:39:46.280 | how does it go wrong in your view?
01:39:48.840 | - We have the mechanisms,
01:39:51.120 | we have now explored enough technologies
01:39:54.000 | to where we can actually, I think, sustainably produce
01:39:57.800 | what most people in the world need to live.
01:40:02.440 | We have also created the infrastructures
01:40:07.720 | to allow continued research and development
01:40:10.480 | of additional science and medicine
01:40:13.160 | and various other kinds of things.
01:40:16.160 | The organizing principles that we use
01:40:18.600 | to govern all these things today
01:40:20.200 | have been, a lot of them have been just inherited
01:40:25.600 | from, honestly, medieval times.
01:40:28.520 | Some of them have refactored a little bit
01:40:30.200 | in the industrial era,
01:40:32.000 | but a lot of these modes of organizing people
01:40:35.960 | are deeply problematic.
01:40:38.320 | And furthermore, they're rooted in,
01:40:43.080 | I think, a very industrial mode perspective on human labor.
01:40:47.680 | And this is one of those things,
01:40:49.160 | I'm gonna go back to the open source thing.
01:40:51.080 | There was a point in time when,
01:40:53.240 | well, let me ask you this.
01:40:54.920 | If you look at the core SciPy sort of collection of libraries,
01:40:58.480 | so SciPy, NumPy, Matplotlib, right?
01:41:00.680 | There's IPython Notebook, let's throw Pandas in there,
01:41:02.800 | Scikit-learn, a few of these things.
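(Aside: a minimal, hypothetical sketch of how that core stack typically composes in a single workflow. The library names are the real ones just listed; the toy data and regression are made up purely for illustration.)

```python
# Hypothetical toy workflow across the core SciPy stack.
import numpy as np                                  # arrays
import pandas as pd                                 # labeled tables
from scipy import stats                             # scientific algorithms
from sklearn.linear_model import LinearRegression   # machine learning
import matplotlib.pyplot as plt                     # plotting

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 3.0 * x + rng.normal(scale=0.5, size=200)       # noisy linear relationship

df = pd.DataFrame({"x": x, "y": y})
print(df.describe())                                # pandas summary statistics
print(stats.pearsonr(df["x"], df["y"]))             # SciPy correlation test

model = LinearRegression().fit(df[["x"]], df["y"])  # scikit-learn fit
print("slope:", model.coef_[0], "intercept:", model.intercept_)

order = np.argsort(df["x"].to_numpy())              # sort for a clean line plot
plt.scatter(df["x"], df["y"], s=5)
plt.plot(df["x"].to_numpy()[order], model.predict(df[["x"]])[order], color="red")
plt.savefig("fit.png")                              # matplotlib figure
```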
01:41:04.600 | How much value do you think, economic value,
01:41:09.640 | would you say they drive in the world today?
01:41:12.840 | - That's one of the fascinating things
01:41:14.960 | about talking to you and Travis.
01:41:16.280 | It's like, it's a measure of what's like--
01:41:20.640 | - At least a billion dollars a day, maybe?
01:41:22.480 | - Billion dollars, sure.
01:41:23.600 | I mean, it's like, it's similar question of like,
01:41:26.040 | how much value does Wikipedia create?
01:41:28.520 | - Right.
01:41:29.360 | - It's like, all of it, I don't know.
01:41:32.720 | - Well, I mean, if you look at our systems,
01:41:34.600 | when you do a Google search, right?
01:41:36.040 | Now, some of that stuff runs through TensorFlow,
01:41:37.640 | but when you look at, you know, Siri,
01:41:40.000 | when you do credit card transaction,
01:41:41.600 | fraud, like just everything, right?
01:41:43.280 | Every intelligence agency under the sun,
01:41:45.160 | they're using some aspect of these kinds of tools.
01:41:47.600 | So I would say that these create billions of dollars
01:41:51.120 | of value.
01:41:51.960 | - Oh, you mean like direct use of tools that leverage--
01:41:54.000 | - Yeah, direct, yeah.
01:41:54.840 | - Yeah, even that's billions a day, yeah.
01:41:56.640 | - Yeah, right, easily, I think.
01:41:58.720 | Like the things they could not do
01:41:59.720 | if they didn't have these tools, right?
01:42:01.080 | - Yes.
01:42:01.920 | - So that's billions of dollars a day, great.
01:42:04.840 | I think that's about right.
01:42:05.720 | Now, if we take how many people did it take to make that?
01:42:08.720 | (laughs)
01:42:09.760 | Right, and there was a point in time, not anymore,
01:42:11.680 | but there was a point in time when they could fit in a van.
01:42:13.520 | I could have fit them in my Mercedes Sprinter, right?
01:42:15.920 | And so if you look at that, like, holy crap,
01:42:19.200 | literally a van of maybe a dozen people
01:42:22.440 | could create value to the tune of billions of dollars a day.
01:42:26.880 | - What lesson do you draw from that?
01:42:30.040 | - Well, here's the thing.
01:42:31.360 | What can we do to do more of that?
01:42:34.160 | Like that's open source.
01:42:36.280 | The way I've talked about this in other environments is,
01:42:39.680 | when we use generative participatory crowdsourced approaches
01:42:44.360 | we unlock human potential at a level
01:42:48.480 | that is better than what capitalism can do.
01:42:50.720 | I would challenge anyone to go and try to hire
01:42:55.400 | the right 12 people in the world
01:42:58.240 | to build that entire stack
01:43:00.160 | the way those 12 people did that, right?
01:43:02.440 | They'd be very, very hard pressed to do that.
01:43:04.080 | If a hedge fund could just hire a dozen people
01:43:06.680 | and create like something
01:43:07.680 | that is worth billions of dollars a day,
01:43:10.040 | every single one of them would be racing to do it, right?
01:43:12.320 | But finding the right people,
01:43:13.600 | fostering the right collaborations,
01:43:15.080 | getting it adopted by the right other people
01:43:16.760 | to then refine it,
01:43:18.000 | that is a thing that was organic in nature.
01:43:20.640 | That took crowdsourcing,
01:43:22.120 | that took a lot of the open source ethos
01:43:24.080 | and it took the right kinds of people, right?
01:43:26.400 | None of those people who started that said,
01:43:27.800 | I need to have a part of a multi-billion dollar a day
01:43:30.840 | sort of enterprise.
01:43:32.400 | They're like, I'm doing this cool thing
01:43:33.480 | to solve my problem for my friends, right?
01:43:35.440 | So the point of telling the story
01:43:37.800 | is to say that our way of thinking about value,
01:43:40.720 | our way of thinking about allocation of resources,
01:43:42.840 | our ways of thinking about property rights
01:43:44.840 | and all these kinds of things,
01:43:46.120 | they come from finite game,
01:43:48.800 | scarcity mentality, medieval institutions.
01:43:51.240 | As we are now entering,
01:43:54.160 | to some extent we're sort of in a post-scarcity era,
01:43:57.040 | although some people are hoarding a whole lot of stuff.
01:43:59.760 | We are at a point where,
01:44:01.280 | if not now, soon we'll be in a post-scarcity era.
01:44:03.880 | The question of how we allocate resources
01:44:06.440 | has to be revisited at a fundamental level,
01:44:08.680 | because the kind of software these people built,
01:44:10.960 | the modalities of those human ecologies
01:44:13.920 | that built that software,
01:44:15.800 | treat software as unproperty.
01:44:17.920 | Actually sharing creates value.
01:44:20.440 | Restricting and forking reduces value.
01:44:23.040 | So that's different than any other physical resource
01:44:26.320 | that we've ever dealt with.
01:44:27.160 | It's different than how most corporations
01:44:28.720 | treat software IP, right?
01:44:31.200 | So if treating software in this way
01:44:34.560 | created this much value so efficiently, so cheaply,
01:44:37.520 | 'cause feeding a dozen people for 10 years is really cheap.
01:44:40.480 | Right?
01:44:41.600 | That's the reason I care about this right now,
01:44:44.640 | is because looking forward
01:44:46.000 | when we can automate a lot of labor,
01:44:47.920 | where we can in fact,
01:44:49.520 | the programming for your robot
01:44:51.440 | in your neck of the woods
01:44:52.760 | and your part of the Amazon
01:44:54.040 | to build something sustainable for you and your tribe
01:44:56.960 | to deliver the right medicines, to take care of the kids,
01:45:00.160 | that's just software, that's just code.
01:45:02.800 | That could be totally open sourced.
01:45:04.840 | Right?
01:45:05.680 | So we can actually get to a mode
01:45:07.280 | where all of this additional generative things
01:45:10.840 | that humans are doing,
01:45:12.320 | they don't have to be wrapped up in a container
01:45:16.120 | and then we charge
01:45:17.000 | for all the exponential dynamics out of it.
01:45:19.120 | That's what Facebook did.
01:45:20.320 | That's what modern social media did.
01:45:22.120 | Right?
01:45:22.960 | 'Cause the old internet was connecting people just fine.
01:45:24.840 | So Facebook came along and said,
01:45:25.880 | well, anyone can post a picture,
01:45:26.840 | anyone can post some text.
01:45:28.320 | And we're gonna amplify the crap out of it
01:45:29.920 | to everyone else.
01:45:31.040 | And it exploded this generative network
01:45:33.240 | of human interaction.
01:45:34.640 | And then it said, how do I make money off that?
01:45:36.000 | Oh yeah, I'm gonna be a gatekeeper
01:45:38.080 | on everybody's attention.
01:45:39.800 | And that's how we make money.
01:45:40.960 | - So how do we create more than one van?
01:45:45.560 | How do we have millions of vans full of people
01:45:47.760 | that create NumPy, SciPy, that create Python?
01:45:50.920 | So, you know, the story of those people
01:45:54.320 | is often they have some kind of job outside of this.
01:45:57.000 | This is what they're doing for fun.
01:45:58.800 | Don't you need to have a job?
01:46:00.840 | Don't you have to be connected,
01:46:02.160 | plugged in to the capitalist system?
01:46:04.880 | Isn't that what, like, isn't this consumerism
01:46:09.080 | the engine that results in the individuals
01:46:13.800 | that kind of take a break from it
01:46:15.120 | every once in a while to create something magical?
01:46:17.240 | Like at the edges is where the innovation happens.
01:46:18.800 | - Right, the question of surplus, right?
01:46:20.280 | This is the question.
01:46:21.360 | Like if everyone were to go and run their own farm,
01:46:24.400 | no one would have time to go and write NumPy, SciPy, right?
01:46:27.320 | Maybe, but that's what I'm talking about
01:46:29.960 | when I say we're maybe at a post-scarcity point
01:46:32.800 | for a lot of people.
01:46:34.160 | The question that we're never encouraged to ask
01:46:37.280 | in a Super Bowl ad is how much do you need?
01:46:40.680 | How much is enough?
01:46:42.000 | Do you need to have a new car every two years, every five?
01:46:45.080 | If you have a reliable car,
01:46:46.200 | can you drive one for 10 years, is that all right?
01:46:47.960 | You know, I had a car for 10 years and it was fine.
01:46:50.160 | You know, your iPhone,
01:46:51.080 | do you have to upgrade every two years?
01:46:52.800 | I mean, it's sort of, you're using the same apps
01:46:54.320 | you did four years ago, right?
01:46:56.680 | - This should be a Super Bowl ad.
01:46:58.320 | - This should be a Super Bowl ad, that's great.
01:46:59.680 | Maybe somebody- - Do you really need
01:47:00.680 | a new iPhone?
01:47:01.520 | - Maybe one of our listeners will fund something like this
01:47:03.920 | of like, no, but just actually bringing it back,
01:47:06.960 | bringing it back to actually the question of
01:47:09.840 | what do you need?
01:47:11.480 | How do we create the infrastructure
01:47:13.560 | for collectives of people to live on the basis of
01:47:18.200 | providing what we need, meeting people's needs
01:47:21.040 | with a little bit of excess to handle emergencies
01:47:23.240 | and things like that.
01:47:24.440 | Pulling our resources together
01:47:26.320 | to handle the really, really big emergencies,
01:47:28.800 | somebody with a really rare form of cancer
01:47:30.880 | or some massive fire sweeps through, you know,
01:47:32.640 | half the village or whatever.
01:47:34.920 | But can we actually unscale things
01:47:38.200 | and solve for people's needs
01:47:41.560 | and then give them the capacity
01:47:44.120 | to explore how to be the best version of themselves?
01:47:47.320 | And for Travis, that was, you know,
01:47:49.000 | throwing away his shot at tenure in order to write NumPy.
01:47:52.880 | For others, there is a saying in the SciPy community
01:47:56.840 | that, you know, SciPy advances
01:47:58.160 | one failed postdoc at a time.
01:47:59.800 | And that's, you know, we can do these things.
01:48:03.800 | We can actually do this kind of collaboration
01:48:05.600 | because code, software, information organization,
01:48:08.360 | that's cheap.
01:48:09.880 | Those bits are very cheap to fling across the oceans.
01:48:13.000 | - So you mentioned Travis.
01:48:14.760 | We've been talking
01:48:15.680 | and we'll continue to talk about open source.
01:48:17.920 | Maybe you can comment.
01:48:20.400 | How did you meet Travis?
01:48:21.600 | Who is Travis Oliphant?
01:48:24.080 | What's your relationship been like through the years?
01:48:27.440 | Where did you work together?
01:48:30.160 | How did you meet?
01:48:31.840 | What's the present and the future look like?
01:48:35.120 | - Yeah, so the first time I met Travis
01:49:36.600 | was at a SciPy conference in Pasadena.
01:48:39.360 | - Do you remember the year?
01:48:40.920 | - 2005.
01:48:42.040 | I was working at, again, at Enthought,
01:48:44.200 | you know, working on scientific computing, consulting.
01:48:47.000 | And a couple of years later,
01:48:51.200 | he joined us at Enthought, I think 2007.
01:48:54.040 | And he came in as the president.
01:48:58.240 | One of the founders of Enthought was the CEO, Eric Jones.
01:49:01.920 | And we were all very excited that Travis was joining us.
01:49:04.080 | And that was, you know, great fun.
01:49:05.080 | And so I worked with Travis
01:49:06.960 | on a number of consulting projects
01:49:08.920 | and we worked on some open source stuff.
01:49:12.120 | I mean, it was just a really, it was a good time there.
01:49:15.080 | And then-
01:49:15.920 | - It was primarily Python related?
01:49:17.840 | - Oh yeah, it was all Python,
01:50:18.760 | NumPy, SciPy consulting kind of stuff.
01:49:21.000 | Towards the end of that time,
01:49:23.280 | we started getting called into more and more finance shops.
01:49:27.760 | They were adopting Python pretty heavily.
01:49:29.880 | I did some work on like a high-frequency trading shop,
01:49:33.360 | working on some stuff.
01:49:34.200 | And then we worked together on some,
01:49:36.520 | at a couple of investment banks in Manhattan.
01:49:39.880 | And so we started seeing that there was a potential
01:49:42.720 | to take Python in the direction of business computing.
01:49:45.760 | More than just being this niche,
01:49:46.840 | like MATLAB replacement for big vector computing.
01:49:50.560 | What we were seeing was, oh yeah,
01:49:51.840 | you could actually use Python as a Swiss army knife
01:49:53.920 | to do a lot of shadow data transformation kind of stuff.
01:49:56.880 | So that's when we realized the potential is much greater.
01:50:00.560 | And so we started Anaconda.
01:50:03.400 | I mean, it was called Continuum Analytics at the time,
01:50:05.240 | but we started in January of 2012
01:50:07.560 | with a vision of shoring up the parts of Python
01:50:10.800 | that needed to get expanded to handle data at scale,
01:50:13.800 | to do web visualization, application development, et cetera.
01:50:17.240 | And that was that, yeah.
01:50:18.080 | So he was CEO and I was president for the first five years.
01:50:23.080 | And then we raised some money and then the board
01:50:27.840 | sort of put in a new CEO.
01:50:29.000 | They hired a kind of professional CEO.
01:50:31.320 | And then Travis, you laugh at that.
01:50:34.080 | I took over the CTO role.
01:50:35.240 | Travis then left after a year to do his own thing,
01:50:37.920 | to do Quansight,
01:50:39.600 | which was more oriented around some of the bootstrap years
01:50:43.000 | that we did at Continuum,
01:50:43.920 | where it was open source and consulting.
01:50:46.160 | It wasn't sort of like gung-ho product development.
01:50:48.560 | And it wasn't focused on,
01:50:50.080 | we accidentally stumbled into the package management problem
01:50:53.840 | at Anaconda, but then we had a lot of other visions
01:50:56.920 | of other technology that we built in the open source.
01:50:58.880 | And Travis was really trying to push, again,
01:51:02.360 | the frontiers of numerical computing, vector computing,
01:51:05.280 | handling things like auto differentiation and stuff
01:51:07.680 | intrinsically in the open ecosystem.
01:51:09.920 | So I think that's the,
01:51:13.640 | that's kind of the direction he's working on
01:51:16.600 | in some of his work.
01:51:18.280 | We remain great friends and colleagues and collaborators,
01:51:22.520 | even though he's no longer day-to-day working at Anaconda,
01:51:25.760 | but he gives me a lot of feedback about
01:51:27.840 | this and that and the other.
01:51:29.000 | - What's a big lesson you've learned from Travis
01:51:32.200 | about life or about programming or about leadership?
01:51:35.440 | - Wow, there's a lot.
01:51:36.480 | There's a lot.
01:51:37.320 | Travis is a really, really good guy.
01:51:39.600 | He really, his heart is really in it.
01:51:41.920 | He cares a lot.
01:51:43.080 | - I've gotten that sense having to interact with him.
01:51:46.920 | It's so interesting.
01:51:47.760 | - Yeah. - Such a good human being.
01:51:48.600 | - He's a really good dude.
01:51:49.680 | And he and I, it's so interesting.
01:51:51.360 | We come from very different backgrounds.
01:51:53.200 | We're quite different as people,
01:51:54.900 | but I think we can not talk for a long time
01:52:00.780 | and then be on a conversation and be eye-to-eye
01:52:04.280 | on like 90% of things.
01:52:06.400 | And so he's someone who I believe,
01:52:08.280 | no matter how much fog settles in over the ocean,
01:52:10.600 | his ship, my ship are pointed sort of
01:52:12.480 | in the same direction to the same star.
01:52:14.120 | - Wow, that's a beautiful way to phrase it.
01:52:16.840 | No matter how much fog there is,
01:52:18.600 | we're pointed at the same star.
01:52:20.400 | - Yeah, and I hope he feels the same way.
01:52:21.880 | I mean, I hope he knows that over the years now.
01:52:23.780 | We both care a lot about the community.
01:52:27.000 | For someone who cares so deeply,
01:52:28.120 | I would say this about Travis, it's interesting.
01:52:29.880 | For someone who cares so deeply about the nerd details
01:52:33.360 | of like type system design and vector computing
01:52:36.000 | and efficiency of expressing this and that and the other,
01:52:38.760 | memory layouts and all that stuff,
01:52:40.440 | he cares even more about the people
01:52:43.280 | in the ecosystem, the community.
01:52:45.880 | And I have a similar kind of alignment.
01:52:49.760 | I care a lot about the tech, I really do.
01:52:53.080 | But for me, the beauty of what this human ecology
01:52:58.080 | has produced is I think a touchstone.
01:53:01.740 | It's an early version.
01:53:02.680 | We should look at it and say,
01:53:03.680 | how do we replicate this for humanity at scale?
01:53:05.880 | What this open source collaboration was able to produce?
01:53:08.840 | How can we be generative in human collaboration
01:53:11.640 | moving forward and create that
01:53:12.840 | as a civilizational kind of dynamic?
01:53:15.160 | Like, can we seize this moment to do that?
01:53:17.520 | 'Cause like a lot of the other open source movements,
01:53:19.800 | it's all nerds nerding out on code for nerds.
01:53:22.040 | And this, because it's scientists,
01:53:25.900 | because it's people working on data
01:53:27.240 | that all of it faces real human problems,
01:53:30.140 | I think we have an opportunity
01:53:32.680 | to actually make a bigger impact.
01:53:34.440 | - Is there a way for this kind of open source vision
01:53:37.560 | to make money?
01:53:39.140 | - Absolutely.
01:53:40.180 | - To fund the people involved?
01:53:41.700 | Is that an essential part of it?
01:53:43.140 | - It's hard, but we're trying to do that
01:53:45.740 | in our own way at Anaconda,
01:53:48.620 | because we know that business users,
01:53:49.980 | as they use more of the stuff,
01:53:51.020 | they have needs that like business specific needs
01:53:53.060 | around security, provenance.
01:53:54.960 | They really can't tell their VPs and their investors,
01:54:00.180 | hey, we're having our data scientists
01:54:02.100 | installing random packages from who knows where
01:54:04.540 | and running on customer data.
01:54:05.660 | So they have to have someone to talk to.
01:54:06.860 | And that's what Anaconda does.
01:54:08.380 | So we are a governed source of packages for them.
01:54:11.460 | And that's great.
01:54:12.300 | That makes some money.
01:54:13.140 | We take some of that and we just take that as a dividend.
01:54:17.140 | We take a percentage of our revenues
01:54:18.240 | and write that as a dividend for the open source community.
01:54:21.040 | But beyond that, I really see the development
01:54:24.440 | of a marketplace for people to create notebooks,
01:54:28.300 | models, data sets, curation
01:54:30.700 | of these different kinds of things,
01:54:33.100 | and to really have a long tail marketplace dynamic with that.
01:54:38.100 | - Can you speak about this problem
01:54:39.620 | that you stumbled into of package management,
01:54:42.580 | Python package management?
01:54:43.900 | What is that?
01:54:45.000 | A lot of people speak very highly of Conda,
01:54:48.980 | which is part of Anaconda, which is a package manager.
01:54:51.540 | There's a ton of packages.
01:54:53.180 | So first, what are package managers?
01:54:55.120 | And second, what was there before?
01:54:57.300 | What is pip?
01:54:58.660 | And why is Conda more awesome?
01:55:01.860 | - The package problem is this,
01:55:03.220 | which is that in order to do
01:55:05.980 | numerical computing efficiently with Python,
01:55:10.980 | there are a lot of low-level libraries
01:55:14.540 | that need to be compiled,
01:55:15.980 | compiled with a C compiler or a C++ compiler
01:55:18.380 | or a Fortran compiler.
01:55:19.900 | They need to not just be compiled,
01:55:21.140 | but they need to be compiled with all of the right settings.
01:55:23.740 | And oftentimes those settings are tuned
01:55:25.220 | for specific chip architectures.
01:55:27.460 | And when you add GPUs to the mix,
01:55:29.340 | when you look at different operating systems,
01:55:32.300 | you may be on the same chip,
01:55:33.820 | but if you're running Mac versus Linux versus Windows
01:55:37.220 | on the same x86 chip, you compile and link differently.
01:55:40.040 | All of this complexity is beyond the capability
01:55:44.820 | of most data scientists to reason about.
01:55:46.780 | And it's also beyond what most of the package developers
01:55:50.220 | want to deal with too.
01:55:51.900 | Because if you're a package developer,
01:55:52.860 | you're like, I code on Linux, this works for me, I'm good.
01:55:55.820 | It is not my problem to figure out how to build this
01:55:58.060 | on an ancient version of Windows.
01:56:00.020 | That's just simply not my problem.
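(Aside: a small, standard-library-only sketch of why one compiled binary cannot serve every machine. It simply prints the platform and ABI details that a compiled package would have to be built against; the printed values differ across Mac, Linux, Windows, x86, and ARM.)

```python
# Print the build target this particular interpreter would need binaries for.
# Standard library only; nothing here is specific to conda or pip.
import platform
import struct
import sysconfig

print("OS:           ", platform.system())                  # e.g. Linux, Darwin, Windows
print("Architecture: ", platform.machine())                 # e.g. x86_64, arm64
print("Pointer size: ", struct.calcsize("P") * 8, "bit")
print("Platform tag: ", sysconfig.get_platform())            # e.g. linux-x86_64, win-amd64
print("C-ext ABI tag:", sysconfig.get_config_var("SOABI"))   # may be None on some builds
```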
01:56:01.940 | So what we end up with is we have created
01:56:05.140 | a very creative crowdsourced environment
01:56:08.580 | where people want to use this stuff, but they can't.
01:56:11.380 | And so we ended up creating a new set of technologies
01:56:15.740 | like a build recipe system, a build system,
01:56:18.420 | and an installer system that is able to,
01:56:21.460 | well, to put it simply,
01:56:24.640 | it's able to build these packages correctly
01:56:27.700 | on each of these different kinds of platforms
01:56:29.300 | and operating systems and make it so
01:56:30.860 | when people want to install something, they can.
01:56:33.620 | It's just one command.
01:56:34.460 | They don't have to set up a big compiler system
01:56:36.940 | and do all these things.
01:56:38.300 | So when it works well, it works great.
01:56:40.420 | Now, the difficulty is we have literally thousands
01:56:43.900 | of people writing code in the ecosystem,
01:56:46.260 | building all sorts of stuff.
01:56:47.300 | And each person writing code,
01:56:48.740 | they may take a dependence on something else.
01:56:50.660 | And so all this web,
01:56:52.260 | incredibly complex web of dependencies.
01:56:54.820 | So installing the correct package
01:56:57.620 | for any given set of packages you want,
01:57:00.340 | getting that right sub graph
01:57:02.180 | is an incredibly hard problem.
01:57:04.620 | And again, most data scientists
01:57:05.780 | don't want to think about this.
01:57:06.620 | They're like, I want to install NumPy and Pandas.
01:57:09.180 | I want this version of some like geospatial library.
01:57:11.700 | I want this other thing.
01:57:13.100 | Like, why is this hard?
01:57:13.980 | These exist, right?
01:57:16.020 | And it is hard because it's, well,
01:57:17.680 | you're installing this on a version of Windows, right?
01:57:20.740 | And half of these libraries are not built for Windows.
01:57:23.400 | Or the latest version isn't available,
01:57:25.140 | but the old version was.
01:57:26.260 | If you go to the old version of this library,
01:57:27.540 | that means you need to go to a different version
01:57:28.620 | of that library.
01:57:30.060 | And so the Python ecosystem,
01:57:32.460 | by virtue of being crowdsourced,
01:57:34.500 | we were able to fill a hundred thousand different niches.
01:57:38.020 | But then we also suffer this problem
01:57:40.380 | that because it's crowdsourced and no one,
01:57:43.140 | it's like a tragedy of the commons, right?
01:57:44.500 | No one really wants to support
01:57:47.080 | their thousands of other dependencies.
01:57:49.200 | So we end up sort of having to do a lot of this.
01:57:52.160 | And of course the Conda Forge community
01:57:53.500 | also steps up as an open source community
01:57:55.060 | that maintains some of these recipes.
01:57:57.540 | That's what Conda does.
01:57:58.660 | Now, Pip is a tool that came along after Conda
01:58:01.900 | to some extent.
01:58:02.740 | It came along as an easier way
01:58:04.460 | for the Python developers writing Python code
01:58:09.460 | that didn't have as much compiled stuff.
01:58:12.920 | They could then install different packages.
01:58:15.360 | And what ended up happening in the Python ecosystem
01:58:17.820 | was that a lot of the core Python and web Python developers,
01:58:20.820 | they never ran into any of this compilation stuff at all.
01:58:24.160 | So even we have, on video,
01:58:27.020 | we have Guido van Rossum saying,
01:58:29.500 | you know what, the scientific community's packaging problems
01:58:31.620 | are just too exotic and different.
01:58:33.140 | I mean, you're talking about Fortran compilers, right?
01:58:35.700 | Like you guys just need to build
01:58:37.060 | your own solution perhaps, right?
01:58:38.960 | So the core Python community went
01:58:41.740 | and built its own sort of packaging technologies,
01:58:45.320 | not really contemplating the complexity
01:58:47.860 | of the stuff over here.
01:58:49.460 | And so now we have the challenge
01:58:50.660 | where you can Pip install some things.
01:58:52.980 | And some libraries,
01:58:53.820 | if you just want to get started with them,
01:58:55.300 | you can Pip install TensorFlow
01:58:56.620 | and that works great.
01:58:57.540 | The instant you want to also install some other packages
01:59:00.180 | that use different versions of NumPy
01:59:02.500 | or some like graphics library
01:59:04.300 | or some OpenCV thing or some other thing,
01:59:06.780 | you now run into dependency hell.
01:59:08.700 | Because you cannot, you know,
01:59:09.860 | OpenCV can have a different version of libjpeg over here
01:59:12.740 | than PyTorch over here.
01:59:14.260 | Like they actually,
01:59:15.100 | and they all have to use the,
01:59:15.920 | if you want to use GPU acceleration,
01:59:17.380 | they have to all use the same underlying drivers
01:59:18.820 | and same GPU CUDA things.
01:59:20.340 | So it gets to be very gnarly.
01:59:22.900 | And it's a level of technology
01:59:24.200 | that both the makers and the users
01:59:26.060 | don't really want to think too much about.
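(Aside: a deliberately tiny sketch of that sub-graph problem. The package names and version constraints below are made up, and real solvers are vastly more sophisticated, but it shows why picking mutually compatible versions is a search over combinations rather than a per-package choice.)

```python
# Toy dependency resolution by brute force over a made-up package index.
from itertools import product

# Hypothetical available versions of each package.
versions = {
    "numpy":  ["1.19", "1.21"],
    "opencv": ["4.2", "4.5"],
    "torch":  ["1.8", "1.10"],
}

# Hypothetical constraints: (package, version) -> dependency versions it accepts.
requires = {
    ("opencv", "4.2"):  {"numpy": {"1.19"}},
    ("opencv", "4.5"):  {"numpy": {"1.21"}},
    ("torch",  "1.8"):  {"numpy": {"1.19"}},
    ("torch",  "1.10"): {"numpy": {"1.21"}},
}

def consistent(choice):
    """True if every chosen version's requirements are satisfied."""
    for (pkg, ver), deps in requires.items():
        if choice[pkg] == ver:
            for dep, allowed in deps.items():
                if choice[dep] not in allowed:
                    return False
    return True

names = list(versions)
solutions = [
    dict(zip(names, combo))
    for combo in product(*(versions[n] for n in names))
    if consistent(dict(zip(names, combo)))
]
print(solutions)
# Only the combinations where opencv and torch agree on the same numpy survive,
# and the search space grows combinatorially as more packages are added.
```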
01:59:28.500 | - And that's where you step in
01:59:29.620 | and try to solve the sub graph problem.
01:59:32.020 | How much is that?
01:59:32.860 | I mean, you said that you don't want to think,
01:59:34.900 | they don't want to think about it,
01:59:35.900 | but how much is it a little bit on the developer
01:59:37.940 | and providing them tools
01:59:39.580 | to be a little bit more clear
01:59:42.020 | of that sub graph of dependency that's necessary?
01:59:44.860 | - It is getting to a point
01:59:46.220 | where we do have to think about,
01:59:47.820 | look, can we pull some of the most popular packages together
01:59:51.180 | and get them to work on a coordinated release timeline,
01:59:53.580 | get them to build against the same test matrix,
01:59:55.780 | et cetera, et cetera.
01:59:56.980 | And there is a little bit of dynamic around this,
01:59:58.700 | but again, it is a volunteer community.
02:00:01.780 | People working on these different projects
02:00:04.900 | have their own timelines
02:00:05.940 | and their own things they're trying to meet.
02:00:07.540 | So we end up trying to pull these things together
02:00:11.300 | and then it's this incredibly,
02:00:13.020 | and I would recommend just as a business tip,
02:00:15.380 | don't ever go into business
02:00:16.500 | where when your hard work works, you're invisible.
02:00:19.500 | And when it breaks because of someone else's problem,
02:00:21.620 | you get flack for it.
02:00:23.100 | 'Cause that's our situation, right?
02:00:25.540 | When something doesn't conda install properly,
02:00:27.180 | usually it's some upstream issue,
02:00:28.900 | but it looks like conda is broken.
02:00:30.180 | It looks like Anaconda screwed something up.
02:00:32.540 | When things do work though, it's like,
02:00:33.980 | oh yeah, cool, it's worked.
02:00:34.980 | Assuming naturally, of course,
02:00:36.100 | that it's very easy to make that work, right?
02:00:38.140 | So we end up in this kind of problematic scenario,
02:00:41.900 | but it's okay because I think we're still,
02:00:45.380 | our heart's in the right place.
02:00:46.700 | We're trying to move this forward
02:00:47.780 | as a community sort of affair.
02:00:49.180 | I think most of the people in the community
02:00:50.460 | also appreciate the work we've done over the years
02:00:52.940 | to try to move these things forward
02:00:54.220 | in a collaborative fashion.
02:00:57.260 | - One of the sub graphs of dependencies
02:01:01.140 | that became super complicated
02:01:03.500 | is the move from Python two to Python three.
02:01:05.700 | So there's all these ways to mess with these kinds
02:01:08.100 | of ecosystems of packages and so on.
02:01:11.460 | So I just wanna ask you about that particular one.
02:01:13.700 | What do you think about the move from Python two to three?
02:01:17.940 | Why did it take so long?
02:01:19.380 | What were, from your perspective,
02:01:20.860 | just seeing the packages all struggle
02:01:23.620 | and the community all struggled through this process,
02:01:26.180 | what lessons do you take away from it?
02:01:27.740 | Why did it take so long?
02:01:29.380 | - Looking back, some people perhaps underestimated
02:01:33.300 | how much adoption Python two had.
02:01:36.660 | I think some people also underestimated how much,
02:01:41.700 | or they overestimated how much value
02:01:44.380 | some of the new features in Python three really provided.
02:01:47.340 | Like the things they really loved about Python three
02:01:49.660 | just didn't matter to some of these people in Python two.
02:01:52.540 | 'Cause this change was happening as Python and SciPy
02:01:56.380 | were starting to take off, really past
02:01:58.620 | like a hockey stick of adoption
02:01:59.940 | in the early data science era, in the early 2010s.
02:02:02.820 | A lot of people were learning and onboarding
02:02:04.820 | in whatever just worked.
02:02:06.220 | And the teachers were like, well, yeah,
02:02:08.220 | these libraries I need are not supported in Python three yet,
02:02:10.300 | I'm gonna teach you Python two.
02:02:11.980 | Took a lot of advocacy to get people
02:02:13.900 | to move over to Python three.
02:02:15.660 | So I think it wasn't any particular single thing,
02:02:18.780 | but it was one of those death by a dozen cuts,
02:02:21.740 | which just really made it hard to move off of Python two.
02:02:25.420 | And also Python three itself,
02:02:27.220 | as they were kind of breaking things
02:02:28.740 | and changing these around
02:02:29.580 | and reorganize the standard library,
02:02:30.660 | there's a lot of stuff that was happening there
02:02:32.860 | that kept giving people an excuse to say,
02:02:35.780 | I'll put off till the next version.
02:02:37.700 | Two is working fine enough for me right now.
02:02:39.620 | So I think that's essentially what happened there.
02:02:41.380 | And I will say this though,
02:02:43.660 | the strength of the Python data science movement,
02:02:48.580 | I think is what kept Python alive in that transition.
02:02:52.340 | Because a lot of languages have died
02:02:53.940 | and left their user bases behind.
02:02:56.740 | If there wasn't the use of Python for data,
02:02:58.860 | there's a good chunk of Python users
02:03:01.020 | that during that transition
02:03:02.620 | would have just left for Go and Rust and stayed.
02:03:04.780 | In fact, some people did.
02:03:06.100 | They moved to Go and Rust and they just never looked back.
02:03:08.740 | The fact that we were able to grow by millions of users,
02:03:13.620 | the Python data community,
02:03:15.700 | that is what kept the momentum for Python going.
02:03:18.180 | And now the usage of Python for data is over 50%
02:03:22.100 | of the overall Python user base.
02:03:24.180 | So I'm happy to debate that on stage somewhere
02:03:27.820 | if I can with someone
02:03:28.660 | if they really wanna take issue with that statement.
02:03:30.180 | But from where I sit, I think that's true.
02:03:32.420 | - The statement there,
02:03:33.380 | the idea is that the switch from Python 2 to Python 3
02:03:37.740 | would have probably destroyed Python
02:03:40.260 | if it didn't also coincide with Python for whatever reason,
02:03:45.620 | just overtaking the data science community,
02:03:49.500 | anything that processes data.
02:03:51.620 | So like the timing was perfect,
02:03:53.940 | that this maybe imperfect decision was coupled
02:03:57.580 | with a great timing on the value of data in our world.
02:04:01.900 | - I would say the troubled execution of a good decision.
02:04:04.620 | It was a decision that was necessary.
02:04:07.220 | It's possible if we had more resources,
02:04:08.660 | we could have done it in a way that was a little bit smoother,
02:04:10.940 | but ultimately, the arguments for Python 3,
02:04:15.020 | I bought them at the time and I buy them now, right?
02:04:17.340 | Having great text handling
02:04:18.860 | is like a non-negotiable table stakes thing
02:04:21.500 | you need to have in a language.
02:04:23.420 | So that's great.
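A small, hedged illustration of the text-handling point being made here, using Python 3 semantics; the example string is arbitrary:

    # Python 3 cleanly separates text (str) from raw bytes, which is the
    # "great text handling" referred to above.
    data = "café".encode("utf-8")   # bytes on the wire: b'caf\xc3\xa9'
    text = data.decode("utf-8")     # decoded back into a str: 'café'
    print(len(text), len(data))     # prints "4 5": 4 characters vs. 5 bytes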
02:04:26.060 | But the execution, Python is the,
02:04:30.660 | it's volunteer driven.
02:04:32.940 | It's like now the most popular language on the planet,
02:04:34.820 | but it's all literally volunteers.
02:04:37.020 | So the lack of resources meant that they had to really,
02:04:40.300 | they had to do things in a very hamstrung way.
02:04:43.540 | And I think to carry the Python momentum
02:04:46.020 | in the language through that time,
02:04:47.740 | the data movement was a critical part of that.
02:04:49.860 | - So there was a carrot and a stick.
02:04:51.380 | I actually have to shamefully admit
02:04:55.660 | that it took me a very long time
02:04:56.980 | to switch from Python 2 to Python 3
02:04:58.700 | because I'm a machine learning person.
02:05:00.260 | It was just for the longest time,
02:05:01.900 | you could just do fine with Python 2.
02:05:03.740 | - Right.
02:05:04.900 | - But I think the moment where I switched
02:05:08.740 | everybody I worked with and switched myself
02:05:12.100 | for small projects and big is when finally,
02:05:15.940 | when NumPy announced that they're going to end support
02:05:19.140 | like in 2020 or something like that.
02:05:22.140 | - Right.
02:05:23.060 | - So like when I realized, oh, this is going to end.
02:05:27.340 | - Right.
02:05:28.180 | - So that's the stick.
02:05:29.020 | - That's the stick. - That's not a carrot.
02:05:30.100 | So for the longest time it was carrots.
02:05:31.660 | It was like all of these packages were saying,
02:05:34.380 | okay, we have Python 3 support now, come join us.
02:05:37.420 | We have Python 2 and Python 3.
02:05:39.060 | But when NumPy, one of the packages,
02:05:40.940 | I sort of love and depend on said like, nope, it's over.
02:05:45.940 | That's when I decided to switch.
02:05:50.340 | I wonder if you think it was possible much earlier
02:05:53.820 | for somebody like NumPy or some major package
02:05:58.820 | to step out into the cold and say, like, we're ending this.
02:06:01.820 | - Well, it's a chicken and egg problem too, right?
02:06:05.300 | You don't want to cut off a lot of users
02:06:07.540 | unless you see the user momentum going too.
02:06:09.380 | So the decisions for the scientific community
02:06:12.860 | for each of the different projects,
02:06:14.180 | there's not a monolith.
02:06:15.220 | Some projects are like,
02:06:16.060 | we'll only be releasing new features on Python 3.
02:06:18.860 | And that was more of a sticky carrot, right?
02:06:21.380 | A firm carrot, if you will, a firm carrot.
02:06:23.500 | A stick-shaped carrot.
02:06:27.900 | But then for others, yeah, NumPy in particular,
02:06:30.620 | 'cause it's at the base of the dependency stack
02:06:32.620 | for so many things, that was the final stick.
02:06:36.140 | That was a stick-shaped stick.
02:06:37.580 | People were saying, look,
02:06:38.420 | if I have to keep maintaining my releases for Python 2,
02:06:41.660 | that's that much less energy
02:06:43.620 | that I can put into making things better
02:06:45.700 | for the Python 3 folks or in my new version,
02:06:48.180 | which is of course going to be Python 3.
02:06:49.940 | So people were also getting kind of pulled by this tension.
02:06:53.300 | So the overall community sort of had a lot of input
02:06:56.140 | into when the NumPy core folks decided
02:06:58.460 | that they would end of life on Python 2.
02:07:01.380 | - So, these numbers are a little bit loose,
02:07:03.980 | but there are about 10 million Python programmers
02:07:06.820 | in the world, you could argue that number,
02:07:08.380 | but let's say 10 million.
02:07:10.300 | This actually, where I was looking,
02:07:12.260 | said 27 million total programmers,
02:07:15.500 | developers in the world.
02:07:17.220 | You mentioned in a talk that changes need to be made
02:07:20.540 | for there to be 100 million Python programmers.
02:07:23.980 | So first of all, do you see a future
02:07:26.020 | where there's 100 million Python programmers?
02:07:28.300 | And second, what kind of changes need to be made?
02:07:31.340 | - So Anaconda, Miniconda,
02:07:32.580 | get downloaded about a million times a week.
02:07:34.900 | So I think the idea that there's only 10 million
02:07:38.100 | Python programmers in the world
02:07:39.340 | is a little bit undercounting.
02:07:41.980 | There are a lot of people who escape traditional counting
02:07:44.860 | that are using Python and data in their jobs.
02:07:48.460 | I do believe that the future world for it to,
02:07:52.100 | well, the world I would like to see
02:07:53.300 | is one where people are data literate.
02:07:56.260 | So they are able to use tools
02:07:58.620 | that let them express their questions and ideas fluidly.
02:08:03.180 | And the data variety and data complexity will not go down.
02:08:06.460 | It will only keep increasing.
02:08:08.340 | So I think some level of code or code-like things
02:08:12.460 | will continue to be relevant.
02:08:15.460 | And so my hope is that we can build systems
02:08:19.740 | that allow people to more seamlessly integrate
02:08:22.900 | Python kinds of expressivity with data systems
02:08:26.060 | and operationalization methods that are much more seamless.
02:08:29.980 | And what I mean by that is,
02:08:32.380 | right now you can't punch Python code into an Excel cell.
02:08:35.660 | I mean, there are some tools you can use to kind of do this.
02:08:37.940 | We didn't build a thing for doing this back in the day,
02:08:39.900 | but I feel like the total addressable market
02:08:43.820 | for Python users, if we do the things right,
02:08:46.860 | is on the order of the Excel users,
02:08:49.100 | which is a few hundred million.
02:08:51.180 | So I think Python has to get better at being embedded,
02:08:56.180 | being a smaller thing that pulls in
02:08:59.660 | just the right parts of the ecosystem
02:09:01.700 | to run numerics and do data exploration,
02:09:05.540 | meeting people where they're already at
02:09:07.900 | with their data and their data tools.
02:09:09.620 | And then I think also it has to be easier
02:09:12.780 | to take some of those things they've written
02:09:14.740 | and flow those back into deployed systems
02:09:17.540 | or little apps or visualizations.
02:09:19.460 | I think if we don't do those things,
02:09:20.740 | then we will always be kept in a silo
02:09:23.140 | as sort of an expert users tool
02:09:25.900 | and not a tool for the masses.
02:09:27.380 | - You know, I work with a bunch of folks
02:09:29.020 | in the Adobe Creative Suite,
02:09:32.500 | and I'm kind of forcing them or inspired them
02:09:35.420 | to learn Python to do a bunch of stuff that helps them.
02:09:38.540 | And it's interesting 'cause they probably
02:09:39.820 | wouldn't call themselves Python programmers,
02:09:41.740 | but they're all using Python.
02:09:43.820 | I would love it if the tools like Photoshop and Premiere
02:09:46.500 | and all those kinds of tools that are targeted
02:09:48.740 | towards creative people, I guess that's where Excel,
02:09:52.100 | Excel is targeted towards a certain kind of audience
02:09:54.220 | that works with data, financial people,
02:09:56.340 | all that kind of stuff.
02:09:58.100 | If there were easy ways to leverage,
02:10:00.860 | to use Python for quick scripting tasks.
02:10:03.580 | - Yeah.
02:10:04.420 | - And there's an exciting application
02:10:06.740 | of artificial intelligence in this space
02:10:09.740 | that I'm hopeful about looking at OpenAI Codex
02:10:13.260 | with generating programs.
02:10:16.380 | - Yeah.
02:10:17.220 | - So almost helping people bridge the gap
02:10:20.780 | from kind of visual interface to generating programs
02:10:25.780 | to something formal.
02:10:27.340 | And then they can modify it and so on,
02:10:28.980 | but kind of without having to read the manual,
02:10:32.340 | without having to do a Google search and stack overflow,
02:10:34.900 | which is essentially what a neural network does
02:10:36.740 | when it's doing code generation,
02:10:39.020 | is actually generating code and allowing a human
02:10:42.820 | to communicate with multiple programs.
02:10:44.740 | And then maybe even programs to communicate
02:10:46.540 | with each other via Python.
02:10:48.020 | - Right.
02:10:48.860 | - So that to me is a really exciting possibility
02:10:51.220 | 'cause I think there's a friction to kind of,
02:10:55.980 | like how do I learn how to use Python in my life?
02:10:58.940 | There's oftentimes, you kind of
02:11:01.820 | start with a class, you start learning about types,
02:11:05.380 | I don't know, functions.
02:11:07.140 | Like this is, you know, Python is the first language
02:11:09.620 | with which you start to learn to program.
02:11:11.980 | But I feel like that's going to take a long time
02:11:16.700 | for you to understand why it's useful.
02:11:18.580 | You almost want to start with a script.
02:11:20.340 | - Well, you do.
02:11:21.580 | In fact, I think starting with the theory
02:11:23.860 | behind programming languages and types and all that,
02:11:25.580 | I mean, types are there to make
02:11:27.780 | the compiler writer's jobs easier.
02:11:30.020 | Types are not, I mean, heck,
02:11:32.140 | do you have an ontology of types
02:11:33.180 | for just the objects on this table?
02:11:34.380 | No. (laughs)
02:11:35.580 | So types are there because compiler writers are human
02:11:39.180 | and they're limited in what they can do.
02:11:40.860 | But I think that the beauty of scripting,
02:11:45.140 | like there's a Python book
02:11:46.860 | that's called "Automate the Boring Stuff,"
02:11:49.260 | which is exactly the right mentality.
02:11:51.180 | I grew up with computers in a time
02:11:53.900 | when I could, when Steve Jobs was still pitching
02:11:58.500 | these things as bicycles for the mind.
02:11:59.900 | They were supposed to not be just media consumption devices,
02:12:03.300 | but they were actually, you could write some code.
02:12:05.940 | You could write basic,
02:12:06.780 | you could write some stuff to do some things.
02:12:09.020 | And that feeling of a computer as a thing
02:12:12.180 | that we can use to extend ourselves
02:12:14.260 | has all but evaporated for a lot of people.
02:12:17.780 | So you see a little bit in parts
02:12:19.500 | in the generation of youth around Minecraft or Roblox, right?
02:12:23.580 | And I think Python, CircuitPython,
02:12:25.060 | these things could be a renaissance of that,
02:12:28.780 | of people actually shaping and using their computers
02:12:32.620 | as computers, as an extension of their minds
02:12:35.460 | and their curiosity, their creativity.
02:12:37.860 | So you talk about scripting the Adobe suite with Python
02:12:41.500 | in the 3D graphics world.
02:12:42.860 | Python is a scripting language
02:12:46.300 | that some of these 3D graphics suites use.
02:12:48.700 | And I think that's great.
02:12:49.540 | We should better support those kinds of things.
02:12:51.260 | But ultimately the idea that I should be able
02:12:54.220 | to have power over my computing environment.
02:12:56.260 | If I want these things to happen repeatedly all the time,
02:12:59.660 | I should be able to say that somehow to the computer, right?
02:13:02.660 | Now, whether the operating systems get there faster
02:13:06.540 | by having some Siri backed with OpenAI, with whatever.
02:13:09.580 | So you can just say, "Siri, make this do this,
02:13:10.860 | do this on every other Friday," right?
02:13:12.660 | We probably will get there somewhere.
02:13:14.220 | And Apple's always had these ideas.
02:13:15.900 | There's AppleScript in the menu that no one ever uses,
02:13:19.180 | but you can do these kinds of things.
02:13:21.660 | But when you start doing that kind of scripting,
02:13:23.700 | the challenge isn't learning the type system
02:13:25.900 | or even the syntax of the language.
02:13:27.180 | The challenge is all of the dictionaries
02:13:29.260 | and all the objects and all their properties
02:13:31.140 | and attributes and parameters.
02:13:32.860 | Like who's got time to learn all that stuff, right?
02:13:35.420 | So that's when then programming by prototype
02:13:38.860 | or by example becomes the right way
02:13:40.980 | to get the user to express their desire.
02:13:43.780 | So there's a lot of these different ways
02:13:45.220 | that we can approach programming.
02:13:46.060 | But I do think when, as you were talking
02:13:48.260 | about the Adobe scripting thing,
02:13:49.780 | I was thinking about, you know,
02:13:51.340 | when we do use something like NumPy,
02:13:53.780 | when we use things in the Python data
02:13:55.780 | and scientific, let's say, expression system,
02:14:00.140 | there's a reason we use that,
02:14:01.340 | which is that it gives us mathematical precision.
02:14:04.380 | It gives us actually quite a lot of precision
02:14:06.780 | over precisely what we mean about this data set,
02:14:09.940 | that data set.
02:14:10.780 | And it's the fact that we can have that precision
02:14:13.700 | that lets Python be powerful as a duct tape for data.
02:14:18.700 | You know, you give me a TSV or a CSV,
02:14:21.220 | and if you give me some massively expensive vendor tool
02:14:25.140 | for data transformation,
02:14:26.460 | I don't know I'm gonna be able to solve your problem.
02:14:28.300 | But if you give me a Python prompt,
02:14:30.620 | you can throw whatever data you want at me.
02:14:32.500 | I will be able to mash it into shape.
02:14:34.420 | So that ability to take it as sort of this like,
02:14:38.380 | you know, machete out into the data jungle
02:14:40.340 | is really powerful.
02:14:41.620 | And I think that's why at some level,
02:14:44.340 | we're not gonna get away from some of these expressions
02:14:47.780 | and APIs and libraries in Python for data transformation.
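A minimal sketch of what "mashing a CSV into shape" at a Python prompt can look like; it assumes pandas is installed, and the file name and column names are invented for illustration:

    import pandas as pd

    # Hypothetical file and columns, purely for illustration.
    df = pd.read_csv("sales.csv")                      # a TSV works the same with sep="\t"
    df["date"] = pd.to_datetime(df["date"])            # coerce a text column into real dates
    df = df.dropna(subset=["amount"])                  # drop rows with missing amounts
    monthly = (
        df.groupby(df["date"].dt.to_period("M"))["amount"]
          .sum()
          .reset_index(name="total")                   # one row per month with the summed total
    )
    monthly.to_csv("monthly_totals.csv", index=False)  # hand the reshaped data back out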
02:14:51.700 | - You've been at the center of the Python community
02:14:57.300 | for many years.
02:14:58.420 | If you could change one thing about the community
02:15:03.420 | to help it grow, to help it improve,
02:15:05.580 | to help it flourish and prosper, what would it be?
02:15:09.500 | I mean, you know, it doesn't have to be one thing,
02:15:11.740 | but what kind of comes to mind?
02:15:13.900 | What are the challenges?
02:15:15.420 | - Humility is one of the values
02:15:16.540 | that we have at Anaconda, the company,
02:15:17.980 | but it's also one of the values in the community
02:15:21.380 | that it's been breached a little bit in the last few years,
02:15:24.540 | but in general, people are quite decent
02:15:27.420 | and reasonable and nice.
02:15:29.380 | And that humility prevents them from seeing
02:15:32.660 | the greatness that they could have.
02:15:37.940 | I don't know how many people in the core Python community
02:15:41.780 | really understand that they stand perched at the edge
02:15:46.780 | of an opportunity to transform how people use computers.
02:15:50.340 | And actually PyCon, I think it was the last physical PyCon
02:15:53.900 | I went to, Russell Keith-Magee gave a great keynote
02:15:57.740 | about very much along the lines of the challenges I have,
02:16:01.620 | which is Python for a language that doesn't actually,
02:16:05.380 | that can't put an interface up
02:16:07.140 | on like the most popular computing devices.
02:16:09.380 | It's done really well as a language, hasn't it?
02:16:11.820 | You can't write a web front end with Python, really.
02:16:13.740 | I mean, everyone uses JavaScript.
02:16:15.060 | You certainly can't write native apps.
02:16:17.020 | So for a language that you can't actually write apps
02:16:20.580 | in any of the front end runtime environments,
02:16:22.700 | Python's done exceedingly well.
02:16:24.340 | And so that wasn't to pat ourselves on the back.
02:16:28.740 | That was to challenge ourselves as a community to say,
02:16:30.580 | we, through our current volunteer dynamic,
02:16:32.500 | have gotten to this point.
02:16:34.460 | What comes next and how do we seize,
02:16:36.740 | you know, we've caught the tiger by the tail.
02:16:38.620 | How do we make sure we keep up with it as it goes forward?
02:16:40.980 | - So that's one of the questions I have
02:16:42.420 | about sort of open source communities is,
02:16:44.660 | at its best, there's a kind of humility.
02:16:48.500 | Does that humility prevent you from having a vision
02:16:52.500 | for creating something like very new and powerful?
02:16:55.620 | - And you've brought us back to consciousness again.
02:16:57.660 | The collaboration is a swarm emergent dynamic.
02:17:00.700 | Humility lets these people work together
02:17:02.620 | without anyone trouncing anyone else.
02:17:04.820 | How do they, you know, in consciousness,
02:17:07.180 | there's the question of the binding problem.
02:17:08.660 | How does a singular, our attention,
02:17:10.700 | how does that emerge from, you know, billions of neurons?
02:17:13.860 | So how can you have a swarm of people emerge a consensus
02:17:17.660 | that has a singular vision to say, we will do this.
02:17:20.620 | And most importantly, we're not gonna do these things.
02:17:23.860 | Emerging a coherent, pointed, focused leadership dynamic
02:17:28.860 | from a collaboration, being able to do that kind of,
02:17:32.060 | and then dissolve it so people can still do the swarm thing.
02:17:35.060 | That's a problem, that's a question.
02:17:37.220 | - So do you have to have a charismatic leader?
02:17:39.580 | For some reason, Linus Torvalds comes to mind,
02:17:42.620 | but you know, there's people who criticize--
02:17:44.460 | - He rules with an iron fist, man.
02:17:46.620 | - But there's still charisma to it.
02:17:48.660 | - There is charisma, right?
02:17:49.500 | - There's a charisma to that iron fist.
02:17:51.780 | There's, every leader is different, I would say,
02:17:55.300 | in their success.
02:17:56.700 | So he doesn't, I don't even know if you can say
02:17:59.460 | he doesn't have humility.
02:18:01.860 | There's such a meritocracy of ideas that like,
02:18:06.620 | this is a good idea and this is a bad idea.
02:18:10.380 | - There's a step function to it.
02:18:11.580 | Once you clear a threshold, he's open.
02:18:13.860 | Once you clear the Bozo threshold,
02:18:15.580 | he's open to your ideas, I think.
02:18:17.500 | - But see, the interesting thing is,
02:18:19.500 | obviously that will not stand in an open source community
02:18:23.380 | if that threshold that is defined
02:18:25.860 | by that one particular person is not actually that good.
02:18:30.260 | So you actually have to be really excellent at what you do.
02:18:33.700 | So he's very good at what he does.
02:18:37.100 | And so there's some aspect of leadership
02:18:39.020 | where you can get thrown out, people can just leave.
02:18:42.620 | You know, that's how it works with open source, the fork.
02:18:45.940 | But at the same time, you want to sometimes be a leader,
02:18:49.900 | like with a strong opinion, 'cause people,
02:18:52.420 | I mean, there's some kind of balance here
02:18:54.420 | for this like hive mind to get like behind.
02:18:57.340 | - Leadership is a big topic.
02:18:58.500 | And I didn't, you know, I'm not one of these guys
02:18:59.860 | that went to MBA school and said,
02:19:01.020 | "I'm gonna be an entrepreneur and I'm gonna be a leader.
02:19:03.500 | "And I'm gonna read all these Harvard Business Review
02:19:05.220 | "articles on leadership and all this other stuff."
02:19:07.740 | Like I was a physicist turned into a software nerd
02:19:10.700 | who then really like nerded out on Python.
02:19:12.780 | And now I am entrepreneurial, right?
02:19:14.860 | I saw a business opportunity around the use of Python
02:19:16.620 | or data, but for me, what has been interesting
02:19:19.980 | over this journey with the last 10 years
02:19:22.140 | is how much I started really enjoying the understanding
02:19:27.140 | and thinking deeper about organizational dynamics
02:19:29.620 | and leadership.
02:19:30.460 | And leadership does come down to a few core things.
02:19:35.300 | Number one, a leader has to create belief
02:19:40.020 | or at least has to dispel disbelief.
02:19:42.300 | Leadership also, you have to have vision,
02:19:46.500 | loyalty and experience.
02:19:49.380 | - So can you say belief in a singular vision?
02:19:52.620 | Like what does belief mean?
02:19:53.700 | - Yeah, belief means a few things.
02:19:55.240 | Belief means here's what we need to do.
02:19:56.980 | And this is a valid thing to do and we can do it.
02:20:00.020 | That you have to be able to drive that belief.
02:20:05.460 | And every step of leadership along the way
02:20:08.780 | has to help you amplify that belief to more people.
02:20:12.660 | I mean, I think at a fundamental level, that's what it is.
02:20:15.900 | You have to have a vision.
02:20:17.100 | You have to be able to show people that,
02:20:20.900 | or you have to convince people to believe in the vision
02:20:23.620 | and to get behind you.
02:20:25.220 | And that's where the loyalty part comes in
02:20:26.500 | and the experience part comes in.
02:20:28.180 | - There's all different flavors of leadership.
02:20:30.180 | So if we talk about Linus,
02:20:31.420 | we could talk about Elon Musk and Steve Jobs.
02:20:36.420 | There's Sunder Pichai.
02:20:38.420 | There's people that kind of put themselves at the center
02:20:40.740 | and are strongly opinionated.
02:20:42.540 | And some people are more like consensus builders.
02:20:46.020 | What works well for open source?
02:20:47.740 | What works well in the space of programmers?
02:20:49.740 | So you've been a programmer, you've led many programmers
02:20:53.300 | and are now sort of at the center of this ecosystem.
02:20:55.620 | What works well in the programming world, would you say?
02:20:58.980 | - It really depends on the people.
02:21:01.140 | What style of leadership is best.
02:21:02.620 | And it depends on the programming community.
02:21:04.180 | I think for the Python community,
02:21:06.340 | servant leadership is one of the values.
02:21:08.300 | Like at the end of the day,
02:21:09.140 | the leader has to also be the high priest of values, right?
02:21:14.140 | So any kind of, any collection of people
02:21:18.020 | has values of their living.
02:21:19.980 | And if you want to maintain certain values
02:21:24.540 | and those values help you as an organization
02:21:27.100 | become more powerful,
02:21:28.340 | then the leader has to live those values unequivocally
02:21:31.340 | and has to hold the values.
02:21:34.380 | So in our case, in this collaborative community
02:21:37.540 | around Python, I think that the humility
02:21:41.940 | is one of those values, servant leadership.
02:21:43.820 | You actually have to kind of do the stuff.
02:21:45.300 | You have to walk the walk, not just talk the talk.
02:21:47.820 | I don't feel like the Python community
02:21:51.140 | really demands that much from a vision standpoint.
02:21:53.820 | - And they should.
02:21:54.780 | - And I think they should.
02:21:56.740 | - This is the interesting thing is
02:21:59.060 | like so many people use Python, so where does
02:22:03.420 | the vision come from? You know, like you have an Elon Musk
02:22:06.860 | type character who makes bold statements
02:22:10.820 | about the vision for particular companies
02:22:13.300 | he's involved with.
02:22:14.660 | And it's like, I think a lot of people that work
02:22:18.700 | at those companies kind of can only last
02:22:22.580 | if they believe that vision.
02:22:24.460 | Because in some of it is super bold.
02:22:26.260 | So my question is, and by the way,
02:22:28.620 | those companies often use Python.
02:22:30.300 | How do you establish a vision?
02:22:33.780 | Like get to a hundred million users, right?
02:22:36.500 | Get to where, you know, Python is at the center
02:22:42.300 | of the, what is it, data science,
02:22:46.780 | machine learning, deep learning,
02:22:48.460 | artificial intelligence revolution, right?
02:22:51.720 | Like in many ways, perhaps the Python community
02:22:54.900 | is not thinking of it that way,
02:22:55.860 | but it's leading the way on this.
02:22:58.180 | Like the tooling is like essential.
02:23:01.260 | - Right, well, you know, for a while,
02:23:03.340 | PyCon people in the scientific Python
02:23:05.860 | and the PyData community, they would submit talks.
02:23:09.300 | Those are early 2010s, mid 2010s.
02:23:12.180 | They would submit talks for PyCon
02:23:14.100 | and the talks would all be rejected
02:23:15.740 | because there was the separate sort of PyData conferences.
02:23:18.700 | And they're like, well, these should,
02:23:19.540 | these probably belong more to PyData.
02:23:21.180 | And instead there'd be yet another talk about,
02:23:23.120 | you know, threads and you know, whatever,
02:23:25.200 | some web framework.
02:23:26.400 | And it's like, that was an interesting dynamic
02:23:29.000 | to see that there was, I mean,
02:23:31.320 | at the time it was a little annoying
02:23:32.680 | because we wanted to try to get more users
02:23:34.280 | and get more people talking about these things.
02:23:35.760 | And PyCon is a huge venue, right?
02:23:37.320 | It's thousands of Python programmers,
02:23:40.220 | but then also came to appreciate that, you know,
02:23:42.200 | parallel, having an ecosystem
02:23:44.480 | that allows parallel innovation is not bad, right?
02:23:47.400 | There are people doing embedded Python stuff.
02:23:49.040 | There's people doing web programming,
02:23:50.640 | people doing scripting, there's cyber users of Python.
02:23:53.260 | I think the, ultimately at some point,
02:23:55.100 | if your mold covers so much stuff,
02:23:58.020 | you have to respect that different things are growing
02:24:00.140 | in different areas and different niches.
02:24:02.300 | Now, at some point that has to come together
02:24:04.140 | and the central body has to provide resources.
02:24:07.540 | The principle here is subsidiarity.
02:24:09.140 | Give resources to the various groups
02:24:11.740 | to then allocate as they see fit in their niches.
02:24:15.020 | That would be a really helpful dynamic.
02:24:16.340 | But again, it's a volunteer community.
02:24:17.500 | It's not like they had that many resources to start with.
02:24:21.200 | - What was or is your favorite programming setup?
02:24:23.960 | What operating system, what keyboard, how many screens?
02:24:27.280 | What are you listening to?
02:24:29.120 | What time of day?
02:24:30.920 | Are you drinking coffee, tea?
02:24:33.000 | - Tea, sometimes coffee, depending on how well I slept.
02:24:36.280 | I used to have-
02:24:37.120 | - How deep do you get?
02:24:38.120 | Are you a night owl?
02:24:39.680 | I remember somebody asked you somewhere
02:24:41.560 | a question about work-life balance
02:24:43.760 | and like not just work-life balance,
02:24:46.000 | but like a family, you know, you lead a company
02:24:49.000 | and your answer was basically like,
02:24:52.060 | I still haven't figured it out.
02:24:54.240 | - Yeah, I think I've gotten a little bit better balance.
02:24:56.320 | I have a really great leadership team now supporting me.
02:24:58.880 | And so that takes a lot of the day-to-day stuff
02:25:01.680 | off my plate and my kids are getting a little older.
02:25:04.360 | So that helps.
02:25:05.200 | So, and of course I have a wonderful wife
02:25:07.500 | who takes care of a lot of the things
02:25:09.200 | that I'm not able to take care of and she's great.
02:25:11.660 | I try to get to sleep earlier now
02:25:13.700 | because I have to get up every morning at six
02:25:15.120 | to take my kid down to the bus stop.
02:25:16.980 | So there's a hard thing.
02:25:18.940 | - Yeah.
02:25:19.780 | - For a while I was doing polyphasic sleep,
02:25:21.080 | which is really interesting.
02:25:22.000 | Like I go to bed at nine, wake up at like 2 a.m.,
02:25:24.600 | work till five, sleep three hours, wake up at eight.
02:25:27.260 | Like that was actually, it was interesting.
02:25:29.320 | It wasn't too bad.
02:25:30.160 | - How did it feel?
02:25:31.120 | - It was good.
02:25:31.960 | I didn't keep it up for years, but once I have travel,
02:25:34.960 | then it just, everything goes out the window, right?
02:25:37.360 | 'Cause then you're like time zones and all these things.
02:25:39.080 | - Socially, was it accepted? Like,
02:25:40.880 | were you able to live outside of how you felt?
02:25:43.480 | - Yes.
02:25:45.720 | - Were you able to live in normal society?
02:25:45.720 | - Oh yeah, because like on the nights
02:25:47.320 | that wasn't out hanging out with people or whatever,
02:25:48.920 | going to bed at nine, no one cares.
02:25:50.680 | I wake up at two, I'm still responding to their slacks,
02:25:52.680 | emails, whatever.
02:25:54.160 | And shit posting on Twitter or whatever
02:25:57.240 | at two in the morning is great.
02:25:58.880 | - Exactly.
02:25:59.720 | - Right, and then you go to bed for a few hours
02:26:01.720 | and you wake up, it's like you had an extra day
02:26:03.360 | in the middle.
02:26:04.200 | - Yes.
02:26:05.020 | - And I'd read somewhere that humans naturally
02:26:06.320 | have biphasic sleep or something, I don't know.
02:26:09.360 | - I read basically everything somewhere.
02:26:11.160 | So every option of everything is available.
02:26:13.400 | - Every option of everything.
02:26:14.520 | I will say that that worked out for me for a while,
02:26:16.840 | although I don't do it anymore.
02:26:18.480 | In terms of programming setup,
02:26:19.380 | I had a 27 inch high DPI setup that I really liked,
02:26:24.380 | but then I moved to a curved monitor
02:26:26.200 | just because when I moved to the new house,
02:26:28.840 | I want to have a bit more screen for Zoom
02:26:31.280 | plus communications, plus various kinds of things.
02:26:33.840 | - So it's like one large monitor.
02:26:35.800 | - One large curved monitor.
02:26:37.160 | - What operating system?
02:26:39.760 | - Mac.
02:26:40.640 | - Okay.
02:26:41.480 | - Yeah.
02:26:42.300 | - Is that what happens when you become important?
02:26:43.920 | That you stop using Linux and Windows?
02:26:46.700 | - No, I actually have a Windows box as well
02:26:48.520 | on the next table over, but I have three desks, right?
02:26:53.520 | - Yes.
02:26:54.960 | - So main one is the standing desk so that I can,
02:26:56.880 | you know, whatever, what I'm like,
02:26:58.360 | I have a teleprompter set up and everything else.
02:26:59.920 | And then I've got my iMac and then eGPU and then Windows PC.
02:27:04.920 | The reason I moved to Mac was it's got a Linux prompt
02:27:10.680 | or no, sorry, it's got a Unix prompt.
02:27:12.160 | It's got a Unix prompt so I can do all my stuff.
02:27:14.260 | But then I don't have to worry,
02:27:18.300 | like when I'm presenting for clients or investors, whatever,
02:27:20.920 | like I don't have to worry about any, like, ACPI-related
02:27:25.120 | fsck things in the middle of a presentation,
02:27:27.160 | like none of that.
02:27:28.000 | It just, it will always wake from sleep
02:27:29.840 | and it won't kernel panic on me.
02:27:32.300 | And this is not a dig against Linux,
02:27:34.200 | except that I just, I feel really bad.
02:27:38.440 | I feel like a traitor to my community saying this, right?
02:27:40.180 | But in 2012, I was just like, okay,
02:27:42.480 | started my own company, what do I get?
02:27:44.040 | And Linux laptops were just not quite there.
02:27:47.520 | And so I've just stuck with Macs.
02:27:48.360 | - Can I just defend something that nobody respectable
02:27:51.360 | seems to do, which is, so I dual boot Linux and Windows,
02:27:55.800 | but in Windows, I have Windows Subsystem for Linux
02:28:00.400 | or whatever, WSL.
02:28:02.940 | And I find myself being able to handle everything I need
02:28:06.500 | or almost everything I need, in Linux for basic sort of tasks,
02:28:09.680 | scripting tasks within WSL
02:28:11.280 | and it creates a really nice environment.
02:28:12.960 | So I've been, but like whenever I hang out
02:28:14.960 | with like especially important people,
02:28:17.000 | like they're all on iPhone and a Mac and it's like, yeah.
02:28:22.800 | Like what, there is a messiness to Windows
02:28:26.000 | and a messiness to Linux that makes me feel
02:28:29.960 | like you're still in it.
02:28:31.900 | - Well, the Linux stuff,
02:28:33.600 | Windows subsystem for Linux is very tempting,
02:28:35.720 | but there's still the Windows on the outside
02:28:38.960 | where I don't know where, and I've been, okay,
02:28:41.000 | I've used DOS since version 1.11 or 1.21 or something.
02:28:45.520 | So I've been a long time Microsoft user.
02:28:48.320 | And I will say that like, it's really hard for me to know
02:28:52.640 | where anything is, how to get to the details
02:28:54.400 | behind something when something screws up
02:28:55.800 | as an invariably does.
02:28:57.560 | And just things like changing group permissions
02:28:59.880 | on some shared folders and stuff,
02:29:01.360 | just everything seems a little bit more awkward,
02:29:03.560 | more clicks than it needs to be.
02:29:05.200 | Not to say that there aren't weird things
02:29:07.760 | like hidden attributes and all this other happy stuff
02:29:09.840 | on Mac, but for the most part,
02:29:14.000 | and well, actually, especially now
02:29:15.360 | with the new hardware coming out on Mac,
02:29:16.800 | that'll be very interesting, with the new M1.
02:29:19.040 | There were some dark years, the last few years
02:29:21.600 | when I was like, I think maybe I have to move off of Mac
02:29:23.960 | as a platform.
02:29:24.800 | - Dark years.
02:29:25.640 | - But this, I mean, like my keyboard was just not working.
02:29:29.040 | Like literally my keyboard just wasn't working, right?
02:29:31.160 | I had this touch bar, didn't have a physical escape button
02:29:33.480 | like I needed to, 'cause I used Vim.
02:29:35.120 | And now I think we're back, so.
02:29:37.400 | - So you use Vim and you have a, what kind of keyboard?
02:29:40.320 | - So I use a Realforce 87U.
02:29:43.000 | It's a mechanical, it's a Topre key switch.
02:29:45.240 | - Is it weird shape, or is it normal shape?
02:29:47.000 | - It's a normal shape.
02:29:48.480 | - Oh no, 'cause I say that because I use a Kinesis
02:29:51.440 | and I had, you said some dark, you said you had dark moments.
02:29:54.960 | I've recently had a dark moment,
02:29:57.400 | it's like, what am I doing with my life?
02:29:58.760 | So I remember sort of flying in a very kind of tight space
02:30:02.960 | and as I'm working, this is what I do on an airplane.
02:30:06.560 | I pull out a laptop and on top of the laptop,
02:30:09.600 | I'll put a Kinesis keyboard.
02:30:11.080 | - That's hardcore, man.
02:30:12.040 | - I was thinking, is this who I am?
02:30:13.800 | Is this what I'm becoming?
02:30:15.080 | Will I be this person?
02:30:16.160 | 'Cause I'm on Emacs with this Kinesis keyboard
02:30:18.520 | sitting like with everybody around.
02:30:21.920 | - Emacs on Windows.
02:30:23.200 | - WSL, yeah.
02:30:25.600 | - Yeah, Emacs on Linux on Windows.
02:30:27.360 | - Yeah, on Windows.
02:30:28.720 | And like everybody around me is using their iPhone
02:30:32.200 | to look at TikTok.
02:30:33.120 | So I'm like in this land and I thought,
02:30:36.080 | you know what, maybe I need to become an adult
02:30:38.440 | and put the 90s behind me and use like a normal keyboard.
02:30:43.200 | And then I did some soul searching
02:30:45.000 | and I decided like, this is who I am.
02:30:46.760 | This is me like coming out of the closet
02:30:48.520 | to saying I'm Kinesis keyboard all the way.
02:30:50.760 | I'm going to use Emacs.
02:30:52.880 | - You know who else is a Kinesis fan?
02:30:55.160 | Wes McKinney, the creator of Pandas.
02:30:56.720 | - Oh.
02:30:57.560 | - He just, he banged out Pandas
02:30:59.000 | on a Kinesis keyboard, I believe.
02:31:00.000 | I don't know if he's still using one, maybe,
02:31:01.920 | but certainly 10 years ago, like he was.
02:31:03.960 | - If anyone's out there,
02:31:05.320 | maybe we need to have a Kinesis support group.
02:31:07.520 | Please reach out.
02:31:08.360 | - Isn't there already one?
02:31:09.440 | - Is there one?
02:31:10.280 | - I don't know, there's gotta be an IRC channel, man.
02:31:12.200 | (laughing)
02:31:14.360 | - Oh no, and you access it through Emacs.
02:31:16.880 | Okay, do you still program these days?
02:31:19.600 | - I do a little bit.
02:31:20.600 | Honestly, the last thing I did was I had written,
02:31:25.880 | I was working with my son to script some Minecraft stuff.
02:31:28.520 | So I was doing a little bit of that.
02:31:29.560 | That was the last, literally the last code I wrote.
02:31:32.320 | Maybe I did, oh, you know what?
02:31:33.440 | Also I wrote some code to do some cap table evaluation,
02:31:36.600 | waterfall modeling kind of stuff.
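For readers unfamiliar with the term, a toy sketch of what cap table waterfall modeling can look like in Python; the share classes, preference amounts, and the participating-preferred assumption are all invented for illustration, not a reconstruction of his actual script:

    # Toy liquidation waterfall: preferred holders receive their liquidation
    # preference first, then everyone shares the remainder pro rata by share count.
    def waterfall(exit_value, preferences, shares):
        payouts = {name: 0.0 for name in shares}
        remaining = exit_value
        # 1. Pay liquidation preferences in order of seniority.
        for name, pref in preferences:
            paid = min(pref, remaining)
            payouts[name] += paid
            remaining -= paid
        # 2. Distribute whatever is left pro rata across all shares.
        total_shares = sum(shares.values())
        for name, count in shares.items():
            payouts[name] += remaining * count / total_shares
        return payouts

    print(waterfall(
        exit_value=10_000_000,
        preferences=[("Series A", 2_000_000)],
        shares={"Series A": 200_000, "Common": 800_000},
    ))
    # -> Series A: 3,600,000.0; Common: 6,400,000.0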
02:31:38.240 | - What advice would you give to a young person,
02:31:41.320 | say your son, today in high school,
02:31:44.160 | maybe even college about career, about life?
02:31:47.280 | - This may be where I get into trouble a little bit.
02:31:51.080 | We are coming to the end.
02:31:53.360 | We're rapidly entering a time between worlds.
02:31:56.480 | So we have a world now that's starting to really crumble
02:31:59.880 | under the weight of aging institutions
02:32:01.880 | that no longer even pretend to serve the purposes
02:32:04.440 | they were created for.
02:32:05.720 | We are creating technologies that are hurtling
02:32:08.600 | billions of people headlong into philosophical crises
02:32:11.440 | and they don't even know the philosophical operating systems
02:32:14.240 | in their firmware.
02:32:15.080 | And they're heading into a time when that gets vaporized.
02:32:17.800 | So for people in high school,
02:32:20.040 | and certainly I tell my son this as well,
02:32:21.480 | he's in middle school, people in college,
02:32:24.880 | you are going to have to find your own way.
02:32:29.680 | You're going to have to have a pioneer spirit,
02:32:31.680 | even if you live in the middle
02:32:34.040 | of the most dense urban environment.
02:32:36.320 | All of human reality around you is the result
02:32:42.600 | of the last few generations of humans
02:32:45.600 | agreeing to play certain kinds of games.
02:32:48.480 | A lot of those games no longer operate
02:32:53.280 | according to the rules they used to.
02:32:55.080 | Collapse is non-linear, but it will be managed.
02:32:59.240 | And so if you are in a particular social caste
02:33:03.320 | or economic caste, and it's not,
02:33:07.280 | I think it's not kosher to say that about America,
02:33:10.200 | but America is a very stratified and classist society.
02:33:14.160 | There's some mobility, but it's really quite classist.
02:33:17.080 | And in America, unless you're in the upper middle class,
02:33:20.800 | you are headed into very choppy waters.
02:33:23.520 | So it is really, really good to think and understand
02:33:26.960 | the fundamentals of what you need
02:33:29.240 | to build a meaningful life for you,
02:33:32.080 | your loved ones, with your family.
02:33:33.880 | And almost all of the technology being created
02:33:38.000 | that's consumer-facing is designed to own people,
02:33:41.760 | to take the full stack of people, to delaminate them,
02:33:46.760 | and to own certain portions of that stack.
02:33:50.360 | And so if you want to be an integral human being,
02:33:52.280 | if you want to have your agency
02:33:54.400 | and you want to find your own way in the world,
02:33:57.640 | when you're young would be a great time to spend time
02:34:00.600 | looking at some of the classics around
02:34:03.600 | what it means to live a good life,
02:34:05.880 | what it means to build connection with people.
02:34:08.080 | And so much of the status game, so much of the stuff,
02:34:12.120 | one of the things that I sort of talk about
02:34:14.880 | as we create more and more technology,
02:34:18.640 | there's a gradient of technology,
02:34:19.840 | and a gradient of technology
02:34:20.840 | always leads to a gradient of power.
02:34:22.640 | And this is Jacques Ellul's point to some extent as well.
02:34:25.200 | That gradient of power is not going to go away.
02:34:27.280 | The technologies are going so fast
02:34:29.880 | that even people like me who helped create
02:34:32.040 | some of the stuff, I'm being left behind.
02:34:33.600 | There's some cutting-edge research,
02:34:34.680 | I don't know what's going on in GANs today.
02:34:36.800 | You know, I'll go read some proceedings.
02:34:38.640 | So as the world gets more and more technological,
02:34:42.600 | it'll create more and more gradients
02:34:44.240 | where people will seize power, economic fortunes,
02:34:48.400 | and the way they make the people who are left behind
02:34:51.320 | okay with their lot in life is they create lottery systems.
02:34:54.480 | They make you take part in the narrative
02:35:00.000 | of your own being trapped in your own economic sort of zone.
02:35:04.200 | So avoiding those kinds of things is really important,
02:35:07.920 | knowing when someone is running game on you basically.
02:35:10.740 | So these are the things I would tell young people.
02:35:12.280 | It's a dark message, but it's realism.
02:35:14.240 | I mean, it's what I see.
02:35:15.880 | - So after you gave some realism, you sit back,
02:35:18.520 | you sit back with your son,
02:35:19.760 | you're looking out at the sunset.
02:35:21.680 | What to him can you give as words of hope?
02:35:26.280 | And to you, from where do you derive hope
02:35:30.420 | for the future of our world?
02:35:32.120 | So you said at the individual level,
02:35:33.620 | you have to have a pioneer mindset
02:35:36.480 | to go back to the classics to understand
02:35:38.760 | what is in human nature you can find meaning.
02:35:41.600 | But at the societal level, what trajectory,
02:35:44.520 | when you look at possible trajectories,
02:35:46.000 | what gives you hope?
02:35:47.480 | - What gives me hope is that we have little tremors now,
02:35:52.480 | shaking people out of the reverie
02:35:54.640 | of the fiction of modernity that they've been living in,
02:35:57.000 | kind of a late 20th century style modernity.
02:35:59.500 | That's good, I think, because,
02:36:03.660 | and also to your point earlier,
02:36:06.400 | people are burning out on some of the social media stuff.
02:36:08.200 | They're sort of seeing the ugly side,
02:36:09.280 | especially the latest news with Facebook
02:36:11.640 | and the whistleblower, right?
02:36:12.840 | It's quite clear these things are not
02:36:15.720 | all they're cracked up to be.
02:36:16.720 | - So do you believe,
02:36:18.400 | I believe better social media can be built
02:36:20.760 | because they are burning out
02:36:21.840 | and it'll incentivize other competitors to be built.
02:36:25.280 | Do you think that's possible?
02:36:26.480 | - Well, the thing about it is that
02:36:29.040 | when you have extractive, return-on-returns
02:36:32.160 | capital coming in and saying, look, you own a network,
02:36:36.640 | give me some exponential dynamics out of this network.
02:36:39.000 | What are you gonna do?
02:36:39.840 | You're gonna just basically put a toll keeper
02:36:41.460 | at every single node and every single graph edge,
02:36:45.240 | every node, every vertex, every edge.
02:36:47.140 | But if you don't have that need for it,
02:36:49.880 | if no one's sitting there saying,
02:36:51.200 | hey, Wikipedia monetize every character,
02:36:53.240 | every byte, every phrase,
02:36:54.920 | then generative human dynamics
02:36:56.640 | will naturally sort of arise,
02:36:58.280 | assuming we do, we respect a few principles
02:37:01.120 | around online communications.
02:37:03.000 | So the greatest and biggest social network in the world
02:37:05.760 | is still like email, SMS, right?
02:37:08.720 | So we're fine there.
02:37:10.560 | The issue with the social media, as we call it now,
02:37:13.200 | is they're actually just new amplification systems, right?
02:37:16.600 | Now it's to the benefit of certain people like yourself
02:37:18.640 | who have interesting content to be amplified.
02:37:23.120 | So it's created a creator economy and that's cool.
02:37:25.160 | There's a lot of great content out there,
02:37:26.800 | but giving everyone a shot at the fame lottery,
02:37:29.280 | saying, hey, you could also have your,
02:37:31.480 | if you wiggle your butt the right way on TikTok,
02:37:33.960 | you can have your 15 seconds of micro fame.
02:37:36.100 | That's not healthy for society at large.
02:37:38.180 | So I think if we can create tools
02:37:40.200 | that help people be conscientious about their attention,
02:37:45.160 | spend time looking at the past
02:37:46.680 | and really retrieving memory and recalling,
02:37:49.600 | not recalling, but processing and thinking about that,
02:37:53.240 | I think that's certainly possible
02:37:55.120 | and hopefully that's what we get.
02:37:57.280 | So the bigger picture, the bigger question
02:38:00.640 | that you're asking about what gives me hope
02:38:02.360 | is that these early shocks of COVID lockdowns
02:38:08.120 | and remote work and all these different kinds of things,
02:38:11.600 | I think it's getting people to a point
02:38:14.000 | where they're sort of no longer in the reverie, right?
02:38:19.000 | As my friend, Jim Rutt says,
02:38:21.040 | there's more people with ears to hear now, right?
02:38:23.680 | With the pandemic and education, everyone's like, wait, wait,
02:38:27.160 | what have you guys been doing with my kids?
02:38:28.660 | Like, how are you teaching them?
02:38:30.120 | What is this crap you're giving them as homework, right?
02:38:32.400 | So I think these are the kinds of things
02:38:33.960 | that, along with the supply chain disruptions, are
02:38:36.860 | getting more people to think about
02:38:38.240 | how do we actually just make stuff?
02:38:40.200 | This is all good, but the concern is that
02:38:44.840 | it's still gonna take a while for these things,
02:38:48.380 | for people to learn how to be agentic again
02:38:50.540 | and to be in right relationship with each other
02:38:53.920 | and with the world.
02:38:55.880 | So the message of hope is still people are resilient
02:38:58.420 | and we are building some really amazing technology.
02:39:01.400 | - And I also, like to me, I derive a lot of hope
02:39:03.920 | from individuals in that vein.
02:39:08.160 | The power of a single individual to transform the world,
02:39:12.000 | to do positive things for the world is quite incredible.
02:39:14.700 | Now you've been talking about,
02:39:16.000 | it's nice to have as many of those individuals as possible,
02:39:18.840 | but even the power of one, it's kind of magical.
02:39:21.920 | - It is, it is.
02:39:22.760 | We're in a mode now where we can do that.
02:39:24.500 | I think also, part of what I try to do
02:39:26.960 | is in coming to podcasts like yours
02:39:29.020 | and then spamming you with all this philosophical stuff
02:39:31.760 | that I've got going on.
02:39:33.540 | There are a lot of good people out there
02:39:34.800 | trying to put words around the current technological,
02:39:39.800 | social, economic crises that we're facing.
02:39:43.240 | And in the space of a few short years,
02:39:44.600 | I think there has been a lot of great content
02:39:46.360 | produced around this stuff for people who wanna see,
02:39:49.560 | wanna find out more or think more about this.
02:39:52.120 | We're popularizing certain kinds of philosophical ideas
02:39:54.560 | that move people beyond just the,
02:39:56.560 | oh, you're communist, oh, you're capitalist kind of stuff.
02:39:58.720 | Like it's sort of, we're way past that now.
02:40:01.200 | So that also gives me hope that I feel like I myself
02:40:04.620 | am getting a handle on how to think about these things.
02:40:07.320 | It makes me feel like I can hopefully affect,
02:40:11.180 | change for the better.
02:40:12.540 | - We've been sneaking up on this question all over the place.
02:40:15.700 | Let me ask the big, ridiculous question.
02:40:17.580 | What is the meaning of life?
02:40:20.380 | - Wow.
02:40:21.220 | The meaning of life.
02:40:25.260 | Yeah, I don't know.
02:40:29.700 | I mean, I've never really understood that question.
02:40:32.060 | - When you say meaning crisis,
02:40:34.700 | you're saying that there is a search
02:40:39.700 | for a kind of experience that's,
02:40:43.560 | could be described as fulfillment,
02:40:45.460 | as like the aha moment of just like joy.
02:40:50.180 | And maybe when you see something beautiful,
02:40:53.400 | or maybe you have created something beautiful,
02:40:55.220 | that experience that you get,
02:40:58.120 | it feels like it all makes sense.
02:41:01.720 | So some of that is just chemicals coming together
02:41:04.360 | in your mind and all those kinds of things.
02:41:06.440 | But it seems like we're building
02:41:08.800 | a sophisticated collective intelligence
02:41:12.620 | that's providing meaning in all kinds of ways to its members.
02:41:17.120 | And there's a theme to that meaning.
02:41:20.640 | So for a lot of history, I think faith
02:41:24.920 | played an important role.
02:41:26.640 | Faith in God, sort of religion.
02:41:29.640 | I think technology in the modern era
02:41:32.680 | is kind of serving a little bit of a source
02:41:35.200 | of meaning for people, like innovation of different kinds.
02:41:38.100 | I think the old school things of love
02:41:43.520 | and the basics of just being good at stuff.
02:41:46.500 | But you were a physicist, so there's a desire to say,
02:41:51.840 | okay, yeah, but these seem to be like symptoms
02:41:54.400 | of something deeper.
02:41:56.320 | - Right.
02:41:57.160 | - Like why? - With little m meaning.
02:41:58.160 | What's capital M meaning?
02:41:59.080 | - Yeah, what's capital M meaning?
02:42:00.960 | Why are we reaching for order when there is an excess of energy?
02:42:05.100 | - I don't know if I can answer the why.
02:42:09.360 | Any why that I come up with, I think, is gonna be,
02:42:11.800 | I'd have to think about that a little more,
02:42:15.480 | maybe get back to you on that.
02:42:17.040 | But I will say this.
02:42:18.220 | We do look at the world through a traditional,
02:42:22.280 | I think most people look at the world through
02:42:24.200 | what I would say is a subject-object
02:42:26.040 | kind of metaphysical lens, that we have our own subjectivity
02:42:29.600 | and then there's all of these object things that are not us.
02:42:34.220 | So I'm me and these things are not me, right?
02:42:37.240 | And I'm interacting with them, I'm doing things to them.
02:42:39.880 | But a different view of the world that looks at it
02:42:42.200 | as much more connected, that realizes,
02:42:45.520 | oh, I'm really quite embedded in a soup of other things.
02:42:50.520 | And I'm simply almost like a standing wave pattern
02:42:54.060 | of different things, right?
02:42:55.440 | So when you look at the world
02:42:58.000 | in that kind of connected sense,
02:42:59.200 | I've recently taken a shine
02:43:02.960 | to this particular thought experiment,
02:43:04.520 | which is what if it was the case
02:43:07.240 | that everything that we touch with our hands,
02:43:12.400 | that we pay attention to,
02:43:13.880 | that we actually give intimacy to,
02:43:16.320 | what if there's actually,
02:43:18.020 | all the mumbo jumbo,
02:43:22.840 | people with the magnetic healing crystals
02:43:25.060 | and all this other kind of stuff and quantum energy stuff,
02:43:28.220 | what if that was a thing?
02:43:30.300 | What if when you,
02:43:31.580 | literally when your hand touches an object,
02:43:34.100 | when you really look at something
02:43:35.380 | and you concentrate and you focus on it
02:43:36.820 | and you really give it attention,
02:43:39.000 | you actually give it,
02:43:40.540 | there is some physical residue of something,
02:43:44.220 | a part of you, a bit of your life force that goes into it.
02:43:48.500 | Okay, now this is of course completely mumbo jumbo stuff.
02:43:51.500 | This is not like, I don't actually think this is real,
02:43:53.720 | but let's do the thought experiment.
02:43:55.680 | What if it was?
02:43:57.740 | What if there actually was some quantum magnetic crystal
02:44:01.760 | and energy field thing that just by touching this can,
02:44:05.360 | this can has changed a little bit somehow.
02:44:09.080 | And it's not much unless you put a lot into it
02:44:11.980 | and you touch it all the time, like your phone, right?
02:44:15.040 | These things gained,
02:44:16.200 | they gain meaning to you a little bit,
02:44:19.680 | but what if there's something that technical objects,
02:44:24.260 | the phone is a technical object.
02:44:25.300 | It does not really receive attention or intimacy
02:44:29.220 | and then allow itself to be transformed by it.
02:44:32.020 | But if it's a piece of wood,
02:44:33.420 | if it's the handle of a knife that your mother used
02:44:36.740 | for 20 years to make dinner for you, right?
02:44:40.380 | What if it's a keyboard that you banged out
02:44:43.900 | your world transforming software library on?
02:44:46.460 | These are technical objects
02:44:47.860 | and these are physical objects,
02:44:48.820 | but somehow there's something to them.
02:44:51.440 | We feel an attraction to these objects
02:44:53.360 | as if we have imbued them with life energy.
02:44:56.000 | - Yeah. - Right?
02:44:57.000 | So if you walk that thought experiment through,
02:44:58.800 | what happens when we touch another person,
02:45:00.480 | when we hug them, when we hold them?
02:45:02.280 | And the reason this ties into my answer for your question
02:45:07.840 | is that there's, if there is such a thing,
02:45:12.840 | if we were to hypothesize, you know,
02:45:15.720 | hypothesize such a thing, it could be that the purpose
02:45:20.720 | of our lives is to imbue as many things
02:45:25.940 | with that love as possible.
02:45:27.580 | - That's a beautiful answer and a beautiful way to end it.
02:45:34.060 | Peter, you're an incredible person.
02:45:36.620 | - Thank you. - Spending so much
02:45:39.220 | in the space of engineering and in the space of philosophy.
02:45:44.780 | I'm really proud to be living in the same city as you.
02:45:48.200 | (Lex laughing)
02:45:49.040 | And I'm really grateful
02:45:51.120 | that you would spend your valuable time with me today.
02:45:53.000 | Thanks so much. - Well, thank you.
02:45:53.840 | I appreciate the opportunity to speak with you.
02:45:56.560 | - Thanks for listening to this conversation
02:45:58.160 | with Peter Wang.
02:45:59.280 | To support this podcast,
02:46:00.640 | please check out our sponsors in the description.
02:46:03.520 | And now let me leave you with some words
02:46:05.680 | from Peter Wang himself.
02:46:07.880 | "We tend to think of people
02:46:09.440 | "as either malicious or incompetent,
02:46:12.440 | "but in a world filled with corruptible
02:46:15.140 | "and unchecked institutions,
02:46:17.060 | "there exists a third thing, malicious incompetence.
02:46:21.020 | "It's a social cancer, and it only appears
02:46:24.120 | "once human organizations scale
02:46:25.940 | "beyond personal accountability."
02:46:27.940 | Thank you for listening and hope to see you next time.
02:46:31.740 | (upbeat music)
02:46:34.320 | (upbeat music)