Jim Keller: The Future of Computing, AI, Life, and Consciousness | Lex Fridman Podcast #162
Chapters
0:00 Introduction
1:33 Good design is both science and engineering
7:33 Javascript
11:40 RISC vs CISC
15:39 What makes a great processor?
17:09 Intel vs ARM
18:58 Steve Jobs and Apple
21:36 Elon Musk and Steve Jobs
27:21 Father
31:03 Perfection
37:18 Modular design
42:52 Moore's law
49:50 Hardware for deep learning
56:44 Making neural networks fast at scale
64:22 Andrej Karpathy and Chris Lattner
68:36 How GPUs work
72:43 Tesla Autopilot, NVIDIA, and Mobileye
77:23 Andrej Karpathy and Software 2.0
83:43 Tesla Dojo
86:20 Neural networks will understand physics better than humans
88:33 Re-engineering the human brain
93:26 Infinite fun and the Culture Series by Iain Banks
95:20 Neuralink
100:43 Dreams
104:37 Ideas
114:49 Aliens
119:46 Jordan Peterson
124:44 Viruses
127:52 WallStreetBets and Robinhood
135:55 Advice for young people
137:45 Human condition
140:14 Fear is a cage
145:04 Love
151:27 Regrets
The following is a conversation with Jim Keller, 00:00:08.480 |
and is widely seen as one of the greatest engineering minds 00:00:14.620 |
In a peculiar twist of space-time in our simulation, 00:00:18.840 |
Jim is also a brother-in-law of Jordan Peterson. 00:00:25.320 |
artificial intelligence, consciousness, and life. 00:00:43.940 |
As a side note, let me say that Jim is someone 00:00:46.200 |
who on a personal level inspired me to be myself. 00:00:50.160 |
There was something in his words on and off the mic, 00:00:53.340 |
or perhaps that he even paid attention to me at all, 00:00:59.120 |
"A kind of pat on the back that can make the difference 00:01:12.840 |
gratitude for the people who have given me a chance 00:01:19.040 |
If you enjoy this thing, subscribe on YouTube, 00:01:21.240 |
review it on Apple Podcasts, follow on Spotify, 00:01:25.680 |
or connect with me on Twitter, Lex Fridman. 00:01:28.620 |
And now, here's my conversation with Jim Keller. 00:01:35.380 |
of theory versus engineering, this dichotomy, 00:01:38.120 |
in building good software or hardware systems? 00:01:53.280 |
And then science is the pursuit of discovering things 00:02:12.800 |
the pragmatic, like, okay, we have these nice models, 00:02:23.680 |
and how different policies will have an effect, 00:02:33.240 |
- So, computer design is almost all engineering, 00:02:38.200 |
Now, because of the complexity of the computers we built, 00:02:46.600 |
and then we'll verify it, and then we'll put it together, 00:02:59.760 |
And then, every so often, some big idea happens, 00:03:06.360 |
- And that idea is in what, in the space of engineering, 00:03:11.400 |
So, one of the limits of computer performance 00:03:34.600 |
So, the engineers who build branch prediction hardware 00:03:37.960 |
were happy to drop the one kind of training array 00:03:44.840 |
- And branch prediction is one of the key problems 00:03:48.520 |
underlying all of sort of the lowest level of software, 00:03:56.280 |
single-thread computers are limited by two things. 00:03:58.640 |
The predictability of the path of the branches 00:04:01.400 |
and the predictability of the locality of data. 00:04:15.720 |
virtually all the data has to be in the local cache. 00:04:23.280 |
it's really easy to see what the stream of data will be. 00:04:26.680 |
But you might have a more complicated program 00:04:35.200 |
And you can think, that's really unpredictable. 00:04:39.200 |
that looks at this kind of pattern and you realize, 00:04:44.560 |
And if you get this one and this one and this one, 00:04:54.680 |
or is it more like, here's a hack that works well? 00:04:59.160 |
Like there's information theory, I think, somewhere. 00:05:23.520 |
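To make the branch prediction idea concrete, here is a minimal sketch (Python, purely illustrative) of the classic two-bit saturating-counter scheme: a small table of counters, indexed by the branch's address, that learns whether each branch tends to be taken. Real hardware predictors are far more elaborate (global history, perceptron- and TAGE-style schemes), so treat this only as an illustration of the idea.

```python
# Illustrative two-bit saturating-counter branch predictor (not real hardware).
class TwoBitPredictor:
    def __init__(self, table_bits=10):
        self.size = 1 << table_bits
        # 0,1 = predict not-taken; 2,3 = predict taken. Start weakly not-taken.
        self.counters = [1] * self.size

    def _index(self, branch_pc):
        return branch_pc % self.size

    def predict(self, branch_pc):
        return self.counters[self._index(branch_pc)] >= 2   # True means "taken"

    def update(self, branch_pc, taken):
        i = self._index(branch_pc)
        if taken:
            self.counters[i] = min(3, self.counters[i] + 1)
        else:
            self.counters[i] = max(0, self.counters[i] - 1)

# Tiny demo: a loop branch at address 0x400 that is taken 9 times, then falls through.
predictor = TwoBitPredictor()
outcomes = [True] * 9 + [False]
correct = 0
for taken in outcomes:
    correct += predictor.predict(0x400) == taken
    predictor.update(0x400, taken)
print(f"{correct}/{len(outcomes)} predictions correct")
```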
So lots of companies will reward you for filing patents. 00:05:27.560 |
Some many big companies get stuck because to get promoted, 00:05:34.760 |
to do some random new thing, 99% of which doesn't matter, 00:05:45.600 |
They think like the cell library and the basic CAD tools, 00:05:49.560 |
or basic software validation methods, that's simple stuff. 00:05:56.960 |
And then they spend lots of time trying to figure out 00:05:58.960 |
how to patent something, and that's mostly useless. 00:06:02.280 |
- But the breakthroughs are on the simple stuff. 00:06:04.620 |
- No, no, you have to do the simple stuff really well. 00:06:14.960 |
one guy says, "Yeah, they're over there in an ugly pile." 00:06:18.800 |
"Lovingly tells you about the 50 kinds of bricks, 00:06:21.240 |
"and how hard they are, and how beautiful they are, 00:06:32.040 |
the person who understands bricks, who loves bricks, 00:06:36.120 |
You know, good engineering is great craftsmanship. 00:06:39.400 |
And when you start thinking engineering is about invention, 00:06:44.880 |
and you set up a system that rewards invention, 00:06:50.680 |
- Okay, so maybe one perspective is the theory, 00:07:02.880 |
it doesn't matter what you do, theory, engineering-- 00:07:06.200 |
They're always talking about some breakthrough, 00:07:17.200 |
And innovation creates a whole new opportunity. 00:07:19.800 |
Like when some guy invented the internet, right? 00:07:25.880 |
The million people that wrote software against that 00:07:28.200 |
were mostly doing engineering software writing. 00:07:58.520 |
You don't get a Nobel Prize, or a Fields Medal, or-- 00:08:22.360 |
'Cause like, when stuff like JavaScript came out, 00:08:32.360 |
Where you write simple code, it might be interpreted, 00:08:35.200 |
it has lots of libraries, productivity is high, 00:08:41.360 |
all the world's problems, it was complicated. 00:08:50.120 |
- But was it the right thing at the right time? 00:09:09.520 |
- Well, I think there was a bunch of blog posts 00:09:11.920 |
written about it, which is like, wrong is right, 00:09:23.280 |
and then iterating over time, listening to developers, 00:09:26.120 |
like listening to people who actually use the thing. 00:09:28.240 |
This is something you can do more in software. 00:09:31.800 |
- But the right time, like you have to sense, 00:09:35.120 |
of when is the right time for the right tool, 00:09:37.560 |
and make it super simple, and just get it out there. 00:09:56.640 |
- There's something about that, and it wasn't accidental. 00:10:02.560 |
you have to have this broad sense of what's needed now, 00:10:10.840 |
and just like, it was obvious that there was no, 00:10:17.960 |
is everything that ran in the browser at the time, 00:10:25.920 |
they were all in a separate external container. 00:10:30.480 |
And then JavaScript was literally just injected 00:10:33.640 |
into the webpage, it was the dumbest possible thing 00:10:36.360 |
running in the same thread as everything else. 00:10:43.080 |
So JavaScript code is inserted as a comment in the HTML code. 00:10:47.560 |
And it was, I mean, it's either genius or super dumb, 00:10:56.680 |
it just executed in the framework of the program 00:11:00.920 |
- And then because something about that accessibility, 00:11:11.400 |
I mean, I don't even know what to make of that, 00:11:13.680 |
but it does seem to echo across different software, 00:11:19.720 |
PHP has the same story, really crappy language. 00:11:28.360 |
variable length instructions, that's always won, 00:11:34.440 |
x86 is arguably the worst architecture on the planet, 00:11:47.440 |
that us in this evolutionary process is valued. 00:11:52.440 |
If it's simple, it spreads faster, it seems like. 00:12:01.140 |
Yeah, it could be simple is good, but too simple is bad. 00:12:21.100 |
that run little programs like normal all over the place. 00:12:23.900 |
But we're going through another transformation, 00:12:39.460 |
you know, predictability of instructions and data. 00:12:43.380 |
And then the usability of it is some, you know, 00:12:46.720 |
quality of design, quality of tools, availability. 00:12:52.140 |
Like right now, x86 is proprietary with Intel and AMD, 00:12:56.420 |
but they can change it any way they want independently. 00:13:05.700 |
And RISC-V is open source, so anybody can change it, 00:13:09.100 |
which is super cool, but that also might mean 00:13:13.620 |
that there's no common subset of it that people can use. 00:13:19.900 |
Like if you were to bet all your money on one or the other, 00:13:37.460 |
But there was seven different people making x86 00:13:40.260 |
'cause at the time there was 6502 and Z80s and 8086. 00:13:56.140 |
So there's like four or five different microprocessors. 00:14:02.420 |
'cause people felt like they had multiple sources from it. 00:14:04.700 |
And then over time it narrowed down to two players. 00:14:09.880 |
why did Intel win for so long with their processors? 00:14:31.740 |
because they aggressively stole other people's ideas. 00:14:43.780 |
- They started making RAMs, random access memories. 00:14:48.260 |
And then at the time when the Japanese manufacturers 00:14:51.700 |
came up, they were getting out competed on that. 00:14:56.620 |
and they made the first integrated microprocessor 00:15:16.020 |
All kinds of big companies had boatloads of money 00:15:23.300 |
- So it's not like marketing, it's not any of that stuff. 00:15:28.460 |
I think the Core 2 was probably the first one 00:15:52.700 |
it's just like literally just raw performance. 00:16:08.620 |
- Well, there's the fastest in the environment. 00:16:10.620 |
Like for years you made the fastest one you could 00:16:13.100 |
and then people started to have power limits. 00:16:14.980 |
So then you made the fastest at the right power point. 00:16:17.700 |
And then when we started doing multi-processors, 00:16:24.460 |
you could be 10% faster on like a single thread, 00:16:35.060 |
you know, they have the A series and the R series 00:16:37.500 |
and the M series, like a family of processors 00:16:56.940 |
- Well, there's people that make microcontrollers 00:16:58.660 |
that are small, but they don't have a fast one. 00:17:09.380 |
- So what's the difference between the Arm folks and Intel 00:17:13.380 |
in terms of the way they're approaching this problem? 00:17:15.620 |
- Well, Intel, almost all their processor designs 00:17:27.540 |
- Yeah, and they architecturally are really good, 00:17:33.380 |
to what's going on in the industry with CAD tools and stuff. 00:17:36.300 |
And there's this debate about custom design versus synthesis 00:17:45.700 |
Arm came in from the bottom and they generated IP, 00:17:54.980 |
So Arm is super friendly to the synthesis IP environment. 00:17:59.460 |
Whereas Intel said, we're gonna make this great client chip 00:18:04.340 |
with our own process, with our own, you know, 00:18:06.660 |
other supporting IP and everything only works with our stuff. 00:18:11.340 |
- So is that, is Arm winning the mobile platform space 00:18:26.420 |
So they controlled the process architecture and IP, 00:18:29.420 |
but they let people put in lots of different chips. 00:18:32.060 |
And there was a lot of variability in what happened there. 00:18:37.140 |
their foray into mobile, they had one team doing one part. 00:18:48.100 |
And that brought a whole bunch of things along 00:18:49.860 |
that the mobile world, the embedded world don't do. 00:18:52.500 |
- Do you think it was possible for Intel to pivot hard 00:18:58.260 |
That's a hell of a difficult thing to do, right? 00:19:07.420 |
It's like, it's clear that PCs were dominating 00:19:19.380 |
Like Apple under Steve Jobs, when he came back, 00:19:24.780 |
You know, they build iPads and iTunes and phones 00:19:30.060 |
Like who knew computers should be made out of aluminum? 00:19:40.540 |
And the old Intel, they did that multiple times. 00:19:45.860 |
They made DRAMs and processors and processes. 00:20:08.260 |
He couldn't find a table 'cause the cafeteria was packed. 00:20:13.700 |
But I worked for Mike Culbert, who talked to, 00:20:20.300 |
And he worked for Steve for 25 years, maybe more. 00:20:25.740 |
And he was one of the people who could put up with Steve's, 00:20:33.580 |
And Steve trusted Mike to translate the shit he thought up 00:20:40.980 |
And then Mike ran a group called Platform Architecture. 00:20:50.500 |
'cause Steve would be yelling about something or other. 00:20:54.980 |
And then he would say, "Steve wants us to do this." 00:21:04.620 |
- And he's a really good selector for talent. 00:21:08.420 |
That seems to be one of the key elements of leadership. 00:21:10.980 |
- And then he was a really good first principles guy. 00:21:15.100 |
and he would just think, "That's obviously wrong." 00:21:31.620 |
There's a whole bunch of things you could think about. 00:21:38.100 |
so it seems like Elon Musk is more engineering centric. 00:21:42.860 |
But is also, I think he considers himself a designer too. 00:21:47.020 |
Steve Jobs feels like he is much more idea space, 00:21:59.140 |
He had computer people talk to him all the time. 00:22:16.060 |
It wasn't like he was just finger painting on the wall 00:22:23.420 |
because he wasn't a computer architect or designer, 00:22:28.340 |
but he had an intuition from the computers we had 00:22:35.300 |
because it seems like he was pissing off a lot of engineers 00:22:40.020 |
in his intuition about what can and can't be done. 00:22:52.100 |
like he'd go into a lab and look at what's going on 00:22:55.460 |
and hate it and fire people or ask somebody in the elevator 00:23:00.460 |
what they're doing for Apple and not be happy. 00:23:10.700 |
and didn't really interact outside of that as much. 00:23:13.940 |
And then the joke was, you'd see like somebody moving 00:23:16.380 |
a prototype through the quad with a black blanket over it. 00:23:20.860 |
And that was 'cause it was secret, partly from Steve 00:23:24.260 |
'cause they didn't want Steve to see it until it was ready. 00:23:26.980 |
- Yeah, the dynamic with Jony Ive and Steve 00:23:43.540 |
like Gordon Bell was famous for ideas, right? 00:23:47.300 |
And it wasn't that the percentage of good ideas 00:23:53.180 |
and he was also good at talking to people about it 00:23:55.860 |
and getting the filters right and seeing through stuff. 00:24:00.220 |
Whereas Elon was like, hey, I wanna build rockets. 00:24:11.500 |
or like more like a love and passion for the manuals. 00:24:16.500 |
- And the details, the data and the understanding. 00:24:27.980 |
what do you make of like the anger and the passion 00:24:33.420 |
and the madness, being emotional and all that, 00:24:45.060 |
So there's a graph, which is y-axis productivity, 00:25:00.960 |
as you improve order, you improve productivity. 00:25:15.040 |
is once you start moving in a direction of order, 00:25:16.960 |
the force vector to drive you towards order is unstoppable. 00:25:21.880 |
- And every organization will move to the place 00:25:24.920 |
where their productivity is stymied by order. 00:25:28.000 |
- So the question is, who's the counterforce? 00:25:33.400 |
As you get more organized and productivity goes up, 00:25:36.280 |
the organization feels it, they orient towards it, right? 00:25:41.120 |
They get more guys who can run process, you get bigger. 00:25:46.080 |
the organization gets captured by the bureaucracy 00:26:07.920 |
- I can't tell you on how many levels that's profound. 00:26:23.520 |
you need people who can manage stuff and manage people, 00:26:29.880 |
and they're doing good stuff, and pat 'em on the back. 00:26:40.860 |
You have to praise them before they do anything. 00:26:46.560 |
Now I tell 'em what a great job they're doing 00:26:57.560 |
successful people, that you need to first do the rough stuff 00:27:12.280 |
- And lots of people only work for praise, which is weird. 00:27:21.160 |
- Well, you're probably looking for somebody's approval. 00:27:29.560 |
- Maybe somebody who's no longer with us kind of thing. 00:27:34.160 |
- I used to call up my dad and tell him what I was doing. 00:27:36.040 |
He was very excited about engineering and stuff. 00:27:49.280 |
So when I did poorly in school, I was dyslexic. 00:27:52.560 |
I didn't read until I was third or fourth grade. 00:28:23.760 |
and the chemotherapy, I think, accelerated it. 00:28:37.920 |
- Do you remember conversations from that time? 00:28:41.560 |
Like what, do you have fond memories of the guy? 00:28:46.440 |
- A friend told me one time I could draw a computer 00:28:50.400 |
on the whiteboard faster than anybody he'd ever met, 00:28:54.960 |
Like when I was a kid, he'd come home and say, 00:28:56.800 |
"I was driving by this bridge, and I was thinking about it, 00:29:08.720 |
And he had this idea that he could understand 00:29:13.440 |
And I just grew up with that, so that was natural. 00:29:16.480 |
So when I interview people, I ask them to draw a picture 00:29:24.800 |
and then they'll say, "And it just talks to this." 00:29:30.000 |
And then I had this other guy come in one time, 00:29:31.760 |
he says, "Well, I designed a floating point in this chip, 00:29:34.460 |
"but I'd really like to tell you how the whole thing works, 00:29:36.280 |
"and then tell you how the floating point works inside it. 00:29:39.080 |
And he covered two whiteboards in like 30 minutes. 00:29:53.600 |
Real view of the balance of how the design worked. 00:30:06.720 |
Your ability to lay it out in an understandable way 00:30:11.500 |
- And be able to sort of zoom into the detail 00:30:17.620 |
You said your dad believed that you could do anything. 00:30:26.700 |
- It seems that that echoes in your own behavior. 00:30:32.100 |
- Well, it's not that anybody can do anything right now. 00:30:36.220 |
It's that if you work at it, you can get better at it 00:30:46.140 |
So at the end of his life, he started playing the piano. 00:30:51.580 |
But he thought if he really worked at it in this life, 00:31:02.940 |
- Do you think the perfect is the enemy of the good 00:31:08.180 |
It's like we were talking about JavaScript a little bit 00:31:10.500 |
and the messiness of the 10 day building process. 00:31:17.140 |
So creative tension is you have two different ideas 00:31:37.060 |
and anything that doesn't fit in the schedule we can't do. 00:31:50.540 |
and no matter what, you know, more people, more money, right? 00:31:54.540 |
And there's a really clear idea about what you want. 00:31:57.860 |
Some people are really good at articulating it. 00:32:18.340 |
I work with a guy that I really like working with, 00:32:26.620 |
and as soon as he figured out what's wrong with it, 00:32:36.740 |
'cause sometimes, you know, you figure out how to tweak it, 00:32:49.740 |
But you also have to execute programs and get shit done. 00:32:53.380 |
And then it turns out, computer engineering's fun 00:32:56.900 |
to build a computer, 200 or 300, whatever the number is, 00:33:04.540 |
temperament and, you know, skill sets and stuff, 00:33:07.660 |
that in a big organization, you find the people 00:33:11.860 |
and the people that wanna get stuff done yesterday, 00:33:16.460 |
and people like to, let's say, shoot down ideas, 00:33:19.260 |
and it takes the whole, it takes a large group of people. 00:33:32.100 |
for that giant mess of people to find the perfect path 00:33:40.980 |
you said there's some people good at articulating 00:33:42.920 |
what perfect looks like, what a good design is. 00:33:55.300 |
how do you know this is something special here, 00:34:04.580 |
And you kinda go into it, and you don't quite understand it, 00:34:07.540 |
and you're working on it, and then you start, you know, 00:34:10.980 |
talking about it, putting it on the whiteboard, 00:34:16.180 |
and then your brains start to kinda synchronize. 00:34:19.580 |
Like, you start to see what each other is thinking. 00:34:30.180 |
in computer design is I can see how computers work 00:34:37.340 |
And when you're working with people that can do that, 00:34:47.180 |
you get to that place, and then you find the flaw, 00:34:49.420 |
which is kinda funny, 'cause you can fool yourself. 00:35:08.100 |
and I know some architects who really love ideas, 00:35:11.140 |
and then they work on 'em, and they put it on the shelf, 00:35:14.220 |
and put it on the shelf, and they never reduce it 00:35:15.740 |
to practice, so they find out what's good and bad. 00:35:18.780 |
'Cause almost every time I've done something really new, 00:35:22.500 |
by the time it's done, like, the good parts are good, 00:35:30.060 |
just your own experience, is your career defined 00:35:36.060 |
- Again, there's great tension between those. 00:35:46.260 |
then you're not gonna be facing the challenges 00:35:52.620 |
- But when you look back, do you see problems, or? 00:35:56.420 |
- When I look back, I think earlier in my career, 00:36:08.620 |
And it was in the Guinness Book of World Records, 00:36:10.380 |
and it was the fastest processor on the planet. 00:36:20.020 |
We did a bunch of new things, and some worked out great, 00:36:22.140 |
and some were bad, and we learned a lot from it, 00:36:28.020 |
That also, EV6 also had some really cool things in it. 00:36:31.820 |
I think the proportion of good stuff went up, 00:36:34.240 |
but it had a couple fatal flaws in it that were painful. 00:36:41.500 |
- You learned to channel the pain into, like, pride. 00:37:00.500 |
No, it's, you know, there's this kind of weird combination 00:37:10.260 |
Yeah, there's definitely lots of suffering in the world. 00:37:28.740 |
From your own work, from other people's work, 00:37:35.180 |
the battleground of flaws and mistakes and errors, 00:37:47.900 |
usually there's a well-thought-out set of abstraction layers. 00:38:01.160 |
when they work together, they work independently. 00:38:04.920 |
They don't have to know what the other one is doing. 00:38:08.640 |
- Yeah, so the famous one was the network stack. 00:38:13.080 |
you know, data transport and protocol and all the layers, 00:38:16.400 |
and the innovation was is when they really got that right, 00:38:20.000 |
'cause networks before that didn't define those very well, 00:38:30.960 |
And that let, you know, the design space breathe. 00:38:37.800 |
without having to worry about how layer four worked. 00:39:03.520 |
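As an aside for readers, the "each layer only talks through its interface" point fits in a few lines. Below is a toy encapsulation sketch, not any real protocol: each layer adds and strips only its own header and never inspects the payload it carries, which is what lets the layers evolve independently.

```python
# Toy layered encapsulation (invented headers, not a real network stack).
def app_send(message: str) -> bytes:
    return message.encode("utf-8")

def transport_send(payload: bytes, port: int) -> bytes:
    return port.to_bytes(2, "big") + payload            # "layer 4" adds its header

def network_send(segment: bytes, addr: int) -> bytes:
    return addr.to_bytes(4, "big") + segment            # "layer 3" adds its header

def network_recv(packet: bytes) -> bytes:
    return packet[4:]                                    # strips only its own header

def transport_recv(segment: bytes) -> bytes:
    return segment[2:]                                   # strips only its own header

wire = network_send(transport_send(app_send("hello"), port=80), addr=0x0A000001)
print(transport_recv(network_recv(wire)).decode("utf-8"))  # -> hello
```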
"so well independently when we put it together, 00:39:08.040 |
"'cause the floating point knows how the cache works." 00:39:10.720 |
And I was a little skeptical, but he was mostly right. 00:39:32.240 |
So, you know, a beautiful design can't be bigger 00:39:40.000 |
Like the odds of you doing a really beautiful design 00:39:42.440 |
with something that's way too hard for you is low, right? 00:39:46.640 |
If it's way too simple for you, it's not that interesting. 00:39:50.680 |
But when you get the right match of your expertise 00:39:54.800 |
and, you know, mental power to the right design size, 00:40:12.440 |
when you put it together, it's sufficiently interesting 00:40:16.840 |
to be used, and so that's what a beautiful design is. 00:40:21.400 |
- Matching the limits of that human cognitive capacity 00:40:30.320 |
and creating a nice interface between those modules, 00:40:37.120 |
we can build with this kind of modular design? 00:40:41.000 |
It's like, you know, we build increasingly more complicated, 00:41:06.840 |
- So if an alien showed up and looked at Twitter, 00:41:11.160 |
simple thing that everybody uses, which is really big. 00:41:19.200 |
the computers, the whole thing is so bloody complicated, 00:41:25.720 |
So yeah, if an alien showed up and looked at Twitter, 00:41:28.760 |
or looked at the various different networked systems 00:41:41.560 |
no human on this planet comprehends the system they built. 00:41:44.660 |
- No individual, well, would they even see individual humans? 00:41:48.880 |
Like we humans are very human-centric, entity-centric, 00:41:52.720 |
and so we think of us as the central organism 00:41:56.840 |
and the networks as just a connection of organisms. 00:41:59.800 |
But from a perspective of, from an outside perspective, 00:42:07.040 |
We're the ants and they'd see the ant colony. 00:42:10.480 |
Or the result of production of the ant colony, 00:42:26.800 |
- Well, that's 'cause it's stress-tested all the time. 00:42:29.360 |
- You know, you build all these cities with buildings 00:42:52.560 |
- Well, let's go, let's talk about Moore's Law a little bit. 00:43:05.320 |
Like, OpenAI, for example, recently published 00:43:14.120 |
in the training efficiency of neural networks. 00:43:17.080 |
For like ImageNet and all that kind of stuff, 00:43:22.360 |
just figuring out better tricks and algorithms 00:43:27.040 |
And that seems to be improving significantly faster 00:43:39.200 |
or if the general version of Moore's Law continues, 00:43:42.920 |
do you think that comes mostly from the hardware, 00:43:50.040 |
so not the reduction of the size of the transistor 00:43:52.840 |
kind of thing, but more in the totally interesting 00:44:30.600 |
they were going from like a single computer application 00:44:39.480 |
How many computers can I put on this problem? 00:44:42.280 |
'Cause the computers themselves are getting better 00:44:44.200 |
on like a Moore's Law rate, but their ability 00:44:58.280 |
It's been quite, you know, steady improvements. 00:45:09.880 |
what's the most productive, rich source of S-curves 00:45:18.760 |
- So, hardware is gonna move along relatively slowly. 00:45:23.600 |
Like, you know, double performance every two years. 00:45:31.440 |
The snail's pace of Moore's Law, maybe we should, 00:45:43.960 |
I'm sure at some point, Google had a, you know, 00:45:46.320 |
their initial search engine was running on a laptop, 00:45:50.120 |
- And at some point, they really worked on scaling that, 00:45:52.520 |
and then they factored the indexer from, you know, 00:45:57.440 |
and they spread the data on more and more things, 00:46:02.760 |
But as they scaled up the number of computers on that, 00:46:05.360 |
it kept breaking, finding new bottlenecks in their software 00:46:13.920 |
across a thousand computers, to schedule parts of it, 00:46:19.000 |
But if you wanna schedule a million searches, 00:46:31.960 |
like a network that was great on a hundred computers 00:46:36.560 |
You may pick a network that's 10 times slower 00:46:42.520 |
But if you go from a hundred to 10,000, that's a hundred times. 00:46:47.240 |
when we did internet scaling, is the efficiency went down. 00:46:52.560 |
The future of computing is inefficiency, not efficiency. 00:46:57.600 |
- It's scaling faster than inefficiency bites you. 00:47:06.000 |
But Google showed, Facebook showed, everybody showed 00:47:17.760 |
the entirety of Earth will be like a computing surface? 00:47:29.000 |
- The science fiction books, they call it computronium. 00:47:34.680 |
Well, most of the elements aren't very good for anything. 00:47:38.000 |
Like, you're not gonna make a computer out of iron. 00:47:39.920 |
Like, you know, silicon and carbon have nice structures. 00:47:45.440 |
- Well, we'll see what you can do with the rest of it. 00:47:48.640 |
People talk about, well, maybe we can turn the sun 00:48:05.760 |
- That'd be ironic from the simulation point of view, 00:48:07.600 |
is like, the simulator built mass to simulate, like. 00:48:14.120 |
this is all heading towards a simulation, yes. 00:48:15.960 |
- Yeah, well, I think I might have told you this story. 00:48:18.440 |
At Tesla, they were deciding, so they wanna measure 00:48:22.400 |
and they decide between putting a resistor in there 00:48:24.960 |
and putting a computer with a sensor in there. 00:48:29.360 |
And the computer was faster than the computer 00:48:34.160 |
And we chose the computer 'cause it was cheaper 00:48:37.340 |
So, sure, this hedgehog, you know, it costs $13, 00:48:51.800 |
- I was hoping it wouldn't be smarter than me, because-- 00:48:54.640 |
- Well, everything's gonna be smarter than you. 00:48:58.040 |
I thought it was better to have a lot of dumb things. 00:49:00.240 |
- Well, Moore's Law will slowly compact that stuff. 00:49:02.760 |
- So even the dumb things will be smarter than us? 00:49:10.520 |
It's like, well, just remember, a big computer chip, 00:49:15.520 |
it's like an inch by an inch, and 40 microns thick. 00:49:33.520 |
- But they still can't write compelling poetry or music 00:49:37.640 |
or understand what love is or have a fear of mortality, 00:49:48.080 |
But speaking about this walk along the path of innovation 00:49:55.880 |
towards the dumb things being smarter than humans, 00:50:00.060 |
you are now the CTO of Tenstorrent, as of two months ago. 00:50:08.060 |
How do you build scalable and efficient deep learning? 00:50:18.380 |
There are serial computers that run like C programs, 00:50:27.940 |
Like, GPUs are great 'cause you have a million pixels, 00:50:30.740 |
and modern GPUs run a program 00:50:42.060 |
You build something, you make this into little tiny chunks, 00:50:49.780 |
But most C programs, you write this linear narrative, 00:50:55.180 |
To make it go fast, you predict all the branches, 00:50:57.260 |
all the data fetches, and you run that more in parallel, 00:51:10.900 |
But the way people describe the neural networks, 00:51:14.780 |
and then how they write them in PyTorch, it makes graphs. 00:51:17.900 |
- Yeah, that might be fundamentally different 00:51:23.300 |
Because when you run the GPU program on all the pixels, 00:51:29.500 |
you know, this group of pixels say it's background blue, 00:51:34.020 |
This pixel is, you know, some patch of your face, 00:51:36.940 |
so you have some really interesting shader program 00:51:41.740 |
But the pixels themselves don't talk to each other. 00:51:46.620 |
So, you do the image, and then you do the next image, 00:51:55.620 |
and modern GPUs have like 6,000 thread engines in them. 00:52:02.100 |
each one runs a program on, you know, 10 or 20 pixels, 00:52:06.140 |
and that's how they work, but there's no graph. 00:52:09.380 |
- But you think graph might be a totally new way 00:52:17.100 |
this good conversation about given versus found parallelism, 00:52:20.580 |
and then the kind of walk, 'cause we got more transistors, 00:52:27.820 |
Now we did it on vector data, famous vector machines. 00:52:30.740 |
Now we're making computers that operate on matrices, right? 00:52:34.500 |
And then the category we said was next was spatial. 00:52:38.900 |
Like, imagine you have so much data that, you know, 00:52:53.060 |
than to move all the data to a central processor 00:52:57.580 |
- So, spatially, you mean moving in the space of data 00:53:24.020 |
do another computation, do a data transformation, 00:53:26.380 |
do a merging, do a pooling, do another computation. 00:53:34.500 |
this whole process efficient, this different? 00:53:37.220 |
- So first, the fundamental elements in the graphs 00:53:40.860 |
are things like matrix multiplies, convolutions, 00:53:46.380 |
- So GPUs emulate those things with their little singles, 00:53:49.580 |
you know, basically running a single-threaded program. 00:53:56.060 |
a bunch of programs that are similar together, 00:54:03.980 |
you take this graph and you say this part of the graph 00:54:06.060 |
is a matrix multiplier, which runs on these 32 threads. 00:54:12.620 |
for running programs on pixels, not executing graphs. 00:54:38.980 |
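For illustration only, here is a toy sketch, unrelated to any real compiler, of the "execute the graph directly" idea: the nodes are coarse operations like convolution and matrix multiply, the edges say which results feed which ops, and a trivial scheduler assigns each node to a processing core. A real graph compiler would balance compute, memory, and data movement rather than going round-robin.

```python
# Toy graph of coarse ops mapped onto cores (purely illustrative).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    op: str                                  # e.g. "conv", "matmul", "softmax"
    inputs: list = field(default_factory=list)

graph = [
    Node("a", "conv"),
    Node("b", "relu", ["a"]),
    Node("c", "matmul", ["b"]),
    Node("d", "softmax", ["c"]),
]

NUM_CORES = 4

def schedule(nodes, num_cores):
    """Round-robin placement; real compilers balance compute and data movement."""
    return {node.name: i % num_cores for i, node in enumerate(nodes)}

placement = schedule(graph, NUM_CORES)
for node in graph:
    print(f"{node.op:8s} {node.name} -> core {placement[node.name]}, reads {node.inputs}")
```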
So I've been, you know, kind of following him 00:54:43.540 |
And in the fall, when I was considering things to do, 00:54:46.960 |
I decided, you know, we held a conference last year 00:55:10.160 |
where you write programs, to data program computers. 00:55:21.340 |
And then Chris has been working, he worked on LLVM, 00:55:31.340 |
And now he's working on another project called MLIR, 00:55:33.620 |
which is mid-level intermediate representation, 00:55:39.820 |
about how do you represent that kind of computation 00:55:58.300 |
But it's in service of executing graph programs. 00:56:10.100 |
they did a test chip and two production chips. 00:56:18.780 |
if you don't build the hardware to run the software 00:56:22.900 |
then you have to fix it by writing lots more software. 00:56:26.060 |
So the hardware naturally does matrix multiply, 00:56:31.820 |
and the data movement between processing elements 00:56:45.100 |
- So I think it's called the Grayskull processor 00:56:45.100 |
It's, you know, there's a bunch of measures of performance. 00:56:55.580 |
It does 368 trillion operations per second. 00:56:55.580 |
Seems to outperform NVIDIA's Tesla T4 system. 00:57:04.700 |
What do they actually mean in real world performance? 00:57:07.660 |
Like what are the metrics for you that you're chasing 00:57:13.900 |
- Well, first, so the native language of, you know, 00:57:17.860 |
people who write AI network programs is PyTorch now. 00:57:21.380 |
PyTorch, TensorFlow, there's a couple others. 00:57:23.980 |
- Do you think PyTorch has won over TensorFlow? 00:57:37.100 |
- But the deepest love is for PyTorch currently. 00:57:42.580 |
So the first thing is when they write their programs, 00:57:46.640 |
can the hardware execute it pretty much as it was written? 00:57:53.300 |
We have a graph compiler that makes that graph. 00:58:05.100 |
There's a couple of mid-level representations of it 00:58:12.140 |
you can see how it's gonna go through the machine, 00:58:17.740 |
like math, data manipulation, data movement kernels, 00:58:35.980 |
So one of the goals is if you write a piece of PyTorch code 00:58:41.260 |
you should be able to compile it, run it on the hardware 00:58:44.740 |
and do all kinds of crazy things to get performance. 00:58:52.180 |
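To see the "your PyTorch program already describes a graph" point concretely, stock PyTorch can trace a model into an explicit operator graph with torch.fx. The TinyNet module below is made up for the example; this is plain PyTorch, not the toolchain being discussed here.

```python
# Trace a small PyTorch model into an explicit graph of operations with torch.fx.
import torch
import torch.nn as nn
import torch.fx as fx

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(64, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

traced = fx.symbolic_trace(TinyNet())
for node in traced.graph.nodes:
    # Each node is a placeholder, call_module, call_function, or output.
    print(node.op, node.name, node.target, list(node.args))
```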
if you write a large matrix multiply naively, 00:58:54.580 |
you'll get five to 10% of the peak performance of the GPU. 00:59:01.580 |
and I read them about what steps do you have to do. 00:59:10.840 |
You know, block it so that you can put a block 00:59:13.960 |
of the matrix on different SMs, you know, groups of threads. 00:59:37.880 |
or you have to be an expert in microarchitecture 01:00:06.120 |
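The blocking step being described can be sketched in plain Python. This is only an illustration of how the loop nest gets restructured; on a GPU the same tiling is expressed with thread blocks and shared memory, and in practice you would call a tuned library.

```python
# Naive vs. tiled matrix multiply (illustration of blocking, not GPU code).
def matmul_naive(A, B):
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for p in range(k):
                C[i][j] += A[i][p] * B[p][j]
    return C

def matmul_blocked(A, B, T=32):
    # Work on T x T tiles so each tile of A and B is reused many times while it is
    # still "hot" (in cache / shared memory) instead of being re-fetched from memory.
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i0 in range(0, n, T):
        for j0 in range(0, m, T):
            for p0 in range(0, k, T):
                for i in range(i0, min(i0 + T, n)):
                    for p in range(p0, min(p0 + T, k)):
                        a = A[i][p]
                        for j in range(j0, min(j0 + T, m)):
                            C[i][j] += a * B[p][j]
    return C
```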
So the native, you know, data item is a packet. 01:00:15.400 |
and then it may send packets to other processors, 01:00:24.400 |
and then 16, the next second chip has 16 ethernet ports 01:00:29.560 |
and it's the same graph compiler across multiple chips. 01:00:35.120 |
Now, my experience with scaling is as you scale, 01:01:03.120 |
The header bit says which processor to send it to, 01:01:05.840 |
and we basically take a packet off our on-chip network, 01:01:09.560 |
put an ethernet header on it, send it to the other end, 01:01:13.000 |
strip the header off and send it to the local thing. 01:01:16.120 |
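A rough sketch of that wrap-and-forward idea, with an invented packet format rather than the actual protocol: a packet whose destination is on another chip gets an Ethernet-style header put on it, goes over the link, and is unwrapped straight back onto the remote chip's on-chip network.

```python
# Toy chip-to-chip packet forwarding (invented format, purely illustrative).
from dataclasses import dataclass

@dataclass
class Packet:
    dest_chip: int
    dest_core: int
    payload: bytes

LOCAL_CHIP = 0

def deliver_to_core(core: int, payload: bytes):
    print(f"core {core} got {payload!r}")

def send_over_link(frame: bytes):
    print(f"-> remote chip: {frame!r}")        # stand-in for the physical Ethernet link
    receive_from_link(frame)

def receive_from_link(frame: bytes):
    dest_core, payload = frame[1], frame[2:]   # strip the header, back on-chip
    deliver_to_core(dest_core, payload)

def route(pkt: Packet):
    if pkt.dest_chip == LOCAL_CHIP:
        deliver_to_core(pkt.dest_core, pkt.payload)
    else:
        frame = bytes([pkt.dest_chip, pkt.dest_core]) + pkt.payload  # add header
        send_over_link(frame)

route(Packet(dest_chip=1, dest_core=5, payload=b"tile of activations"))
```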
- Human to human interaction is pretty straightforward too, 01:01:32.320 |
Am I going to ever use Tenstorrent or is this more for? 01:01:32.320 |
or small training problems or big training problems. 01:01:46.760 |
- One of the goals is to scale from 100 milliwatts 01:01:51.720 |
so like really have some range on the problems 01:02:03.600 |
that we can move around, it's built to scale, 01:02:06.680 |
but so many people have, you know, small problems. 01:02:13.240 |
- Like inside that phone is a small problem to solve. 01:02:16.360 |
So do you see Tenstorrent potentially being inside a phone? 01:02:19.960 |
- Well, the power efficiency of local memory, 01:02:22.600 |
local computation and the way we built it is pretty good. 01:02:28.480 |
on being able to do conditional graphs and sparsity. 01:02:34.520 |
I want to go in a small factor, it's quite good, 01:02:36.920 |
but we have to prove that that's a fun problem. 01:02:40.760 |
- And that's the early days of the company, right? 01:02:44.600 |
But you think you invested, you think they're legit 01:02:49.960 |
Well, it's also, it's a really interesting place to be. 01:02:58.480 |
like build a faster processor, which people want, 01:03:03.720 |
than what's gonna happen in AI in the next 10 years. 01:03:18.120 |
why some of them, you know, aren't gonna work out that well. 01:03:27.520 |
Like we've talked to customers about exciting features. 01:03:33.920 |
they wanna hear first about memory bandwidth, 01:03:35.880 |
local bandwidth, compute intensity, programmability. 01:03:39.240 |
They want to know the basics, power management, 01:03:42.000 |
how the network ports work, what are the basics, 01:03:46.120 |
'Cause it's easy to say, we've got this great idea, 01:03:58.680 |
If you buy the card, you plug it in your machine, 01:04:01.960 |
how long does it take me to get my network to run? 01:04:24.800 |
with Karpathy, Andrej Karpathy and Chris Lattner. 01:04:24.800 |
Very, very interesting, very brilliant people, 01:04:42.680 |
They only get stuff done to get their own projects done. 01:04:48.760 |
and they've created platforms for other people 01:04:52.040 |
- Yeah, the clear thinking that's able to be communicated 01:05:00.800 |
- Well, let me ask, 'cause I talk to Chris actually 01:05:05.040 |
He's been one of the, just to give him a shout out, 01:05:17.680 |
but he's been like sensitive to the human element 01:05:23.800 |
on this stupid podcast that I do to say like, 01:05:27.920 |
don't quit this thing and also talk to whoever 01:05:34.160 |
That kind of, from a legit engineer to get like props 01:05:40.000 |
That was, I mean, that's what a good leader does, right? 01:06:00.960 |
What's really impressive to you about the things 01:06:11.920 |
Then there's, he's also at Google worked at the TPU stuff. 01:06:21.360 |
Talking about people that work in the entirety of the stack. 01:06:24.360 |
From your time interacting with Chris and knowing the guy, 01:06:28.840 |
what's really impressive to you that just inspires you? 01:06:32.120 |
- Well, like LLVM became the de facto platform 01:06:43.780 |
And it was good code quality, good design choices. 01:06:48.820 |
There's a little bit of the right time, the right place. 01:06:51.980 |
And then he built a new programming language called Swift, 01:06:55.420 |
which after, let's say some adoption resistance 01:07:01.140 |
I don't know that much about his work at Google, 01:07:07.140 |
they started TensorFlow stuff and it was new. 01:07:11.580 |
They wrote a lot of code and then at some point 01:07:19.100 |
why PyTorch started a little later and then passed it. 01:07:28.220 |
is the complexity of the software stack above 01:07:33.500 |
that forcing the features of that into a level 01:07:41.620 |
And that was one of the inspirations for our software stack 01:07:43.860 |
where we have several intermediate representations 01:07:46.720 |
that are all executable and you can look at them 01:07:49.740 |
and do transformations on them before you lower the level. 01:08:06.660 |
So he, and there seems to be some profound ideas on that 01:08:14.940 |
as the world of software gets more and more complicated, 01:08:17.780 |
how do we create the right abstraction levels 01:08:20.060 |
to simplify it in a way that people can now work independently 01:08:35.660 |
So on either the TPU or maybe the NVIDIA GPU side, 01:08:53.900 |
deep learning centric hardware beat NVIDIA's? 01:09:08.060 |
- Well, GPUs were built to run shader programs 01:09:24.100 |
And then the primitives is not a SIMD program, 01:09:30.060 |
And then the data manipulations are fairly extensive 01:09:32.940 |
about like how do you do a fast transpose with a program? 01:09:36.340 |
I don't know if you've ever written a transpose program. 01:09:43.260 |
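For anyone who hasn't written one: a naive transpose walks one matrix in row order and the other in column order, so one side misses cache on almost every access once the matrix is large. Working tile by tile, as in the plain-Python illustration below, keeps both access patterns local; that is roughly what fast software transposes do.

```python
# Tiled (cache-blocked) matrix transpose, purely as an illustration.
def transpose_blocked(A, T=32):
    n, m = len(A), len(A[0])
    out = [[0.0] * n for _ in range(m)]
    for i0 in range(0, n, T):
        for j0 in range(0, m, T):
            # Both A and out are touched within a small T x T window here.
            for i in range(i0, min(i0 + T, n)):
                for j in range(j0, min(j0 + T, m)):
                    out[j][i] = A[i][j]
    return out
```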
So when GPU accelerator started doing triangles, 01:09:59.220 |
and then you have to go back down to the next row 01:10:04.100 |
if the line of the triangle is like half on the pixel, 01:10:09.180 |
'Cause it's half of this pixel and half the next one. 01:10:12.980 |
- And you're saying that can be done in hardware? 01:10:22.140 |
I've written a program that did rasterization. 01:10:24.460 |
The hardware that does it is actually less code 01:10:33.460 |
when the abstraction you have, rasterize a triangle, 01:10:41.300 |
but the right thing to do in the hardware-software boundary 01:10:50.100 |
- Well, no, that's just, well, like in a modern, 01:10:56.980 |
What they did is they still rasterized triangles 01:11:00.940 |
but for the most part, most of the computation here 01:11:05.900 |
but they're single-threaded programs on pixels, not graphs. 01:11:09.580 |
- And to be honest, let's say I don't actually know 01:11:17.780 |
- They look like little simple floating-point programs 01:11:21.220 |
You can have 8,000 instructions in a shader program. 01:11:37.260 |
like say this is a line of pixels across this table, 01:11:40.740 |
the amount of light on each pixel is subtly different. 01:11:43.620 |
- And each pixel is responsible for figuring out 01:11:52.380 |
Like every single pixel here is a different color. 01:11:54.380 |
Every single pixel gets a different amount of light. 01:11:57.120 |
Every single pixel has a subtly different translucency. 01:12:02.140 |
the solution was you run a separate program on every pixel. 01:12:14.140 |
And then when the pixel's looking at the reflection map, 01:12:16.300 |
it has to calculate what the normal of the surface is, 01:12:20.880 |
By the way, there's boatloads of hacks on that. 01:12:29.180 |
But at the end of the day, it's per pixel computation. 01:12:32.900 |
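A toy version of that per-pixel model, in plain Python rather than a real shading language: every pixel runs the same little program on its own data and never talks to its neighbors, and a GPU simply runs thousands of these invocations in parallel. The lighting math here is deliberately simplified.

```python
# Run an independent "shader" per pixel (serial toy version of what a GPU parallelizes).
import math

WIDTH, HEIGHT = 8, 4
LIGHT_DIR = (0.577, 0.577, 0.577)              # normalized light direction

def shade_pixel(x, y):
    # Pretend each pixel sees a slightly different surface normal (a curved surface).
    nx, ny, nz = math.sin(x * 0.3), math.sin(y * 0.3), 1.0
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    diffuse = max(0.0, nx * LIGHT_DIR[0] + ny * LIGHT_DIR[1] + nz * LIGHT_DIR[2])
    return round(255 * diffuse)                # brightness of this one pixel

image = [[shade_pixel(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]
for row in image:
    print(row)
```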
- And it's so happening you can map graph-like computation 01:12:46.180 |
First for HPC, and then they got lucky with the AI trend. 01:12:50.100 |
- But do you think they're going to essentially 01:13:13.400 |
but they've also worked really hard on mobile. 01:13:28.520 |
Or semi-autonomous, like playing with Tesla and so on, 01:13:31.120 |
and seeing that's dipping a toe into that kind of pivot. 01:13:46.240 |
- I don't know if it's interesting technically. 01:13:49.920 |
Technically, I don't know if it's the execution, 01:13:55.440 |
- But they were repurposing GPUs for an automotive solution. 01:14:02.820 |
Like the chips inside Tesla are pretty cheap. 01:14:08.040 |
They're doing the classic work from the simplest thing. 01:14:11.240 |
They were building 40 square millimeter chips, 01:14:14.240 |
and Nvidia, their solution, had two 800 millimeter chips 01:14:31.280 |
and then they added features as it was economically viable. 01:14:43.680 |
where they have a 5,000 watt server in their trunk. 01:14:54.720 |
Elon's approach was that part has to be cheap enough 01:14:54.720 |
whether they turn on autonomous driving or not. 01:15:02.080 |
And Mobileye was like, "We need to fit in the BOM 01:15:15.160 |
- Well, and for Mobileye, it seems like neural networks 01:15:20.120 |
were not first-class citizens, like the computation. 01:15:27.120 |
- And did classic CV and found stoplights and lines, 01:15:37.960 |
Then, as opposed to, so if you look at the new Tesla work, 01:15:42.000 |
it's like neural networks from the ground up, right? 01:15:45.560 |
- Yeah, and even Tesla started with a lot of CV stuff in it, 01:15:54.360 |
- So without, this isn't like confidential stuff, 01:15:57.960 |
but you sitting on a porch looking over the world, 01:16:06.440 |
do you like the trajectory of where things are going 01:16:10.920 |
I like the videos of people driving the beta stuff. 01:16:14.160 |
Like, it's taken some pretty complicated intersections 01:16:16.520 |
and all that, but it's still an intervention per drive. 01:16:19.600 |
I mean, I have Autopilot, the current Autopilot, 01:16:39.080 |
Off by two, off by five, off by 10, off by 100. 01:16:50.640 |
- And one thing is, the data set gets bigger, 01:16:57.360 |
you sort of train and build an arbitrary size network 01:17:01.360 |
And then you refactor the network down to the thing 01:17:06.760 |
So the goal isn't to build a network that fits in the phone, 01:17:23.520 |
- Well, the one really important thing is also 01:17:25.760 |
what they're doing well is how to iterate that quickly, 01:17:28.640 |
which means like, it's not just about one time deployment, 01:17:31.720 |
one building, it's constantly iterating the network 01:17:34.200 |
and trying to automate as many steps as possible, right? 01:17:37.560 |
- And that's actually the principles of the Software 2.0, 01:17:41.680 |
like you mentioned with Andrej, is it's not just, 01:17:50.880 |
if it's just high-level philosophical or their specifics, 01:17:53.520 |
but the interesting thing about what that actually looks 01:18:02.680 |
It's like, it's the iterative improvement of the thing. 01:18:19.920 |
like figuring out, it's kind of what you were talking about 01:18:23.040 |
with Tenstorrent is you have the data landscape. 01:18:27.560 |
in a way that's constantly improving the neural network. 01:18:46.920 |
Like the amazing thing about like the GPT-3 stuff 01:18:51.520 |
So there's essentially infinite amount of data. 01:18:53.280 |
Now there's obviously infinite amount of data 01:18:55.800 |
available from cars of people who are successfully driving. 01:18:59.240 |
But the current pipelines are mostly running on labeled data, 01:19:22.360 |
Now can I turn that into something that fits in the car? 01:19:25.840 |
And that process is gonna happen all over the place. 01:19:29.200 |
Every time you get to the place where you have unlimited data 01:19:43.800 |
the self-supervised formulation of the problem. 01:19:47.240 |
So the unsupervised formulation of the problem. 01:19:49.640 |
Like in driving, there's this really interesting thing, 01:19:53.520 |
which is you look at a scene that's before you 01:19:58.120 |
and you have data about what a successful human driver did 01:20:09.360 |
Currently, even though Tesla says they're using that, 01:20:17.440 |
with just that self-supervised piece of data? 01:20:32.240 |
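A bare-bones sketch of that formulation, with toy tensors standing in for real data and no claim about how Tesla actually trains: the "label" is simply what the human driver did a moment later, so the fleet generates supervision without any hand labeling.

```python
# Toy behavior-cloning setup: predict the human driver's next action from the scene.
import torch
import torch.nn as nn

class DrivingPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 2)           # predict [steering, acceleration]

    def forward(self, frames):
        return self.head(self.backbone(frames))

policy = DrivingPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

frames = torch.randn(8, 3, 96, 96)             # stand-in for logged camera frames
human_action = torch.randn(8, 2)               # stand-in for what the driver did next

optimizer.zero_grad()
loss = nn.functional.mse_loss(policy(frames), human_action)
loss.backward()
optimizer.step()
print(float(loss))
```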
is as good of a data engine, for example, as Tesla does. 01:20:35.920 |
That's where the, like the organization of the data. 01:20:38.580 |
I mean, as far as I know, I haven't talked to George, 01:20:54.360 |
I don't know if you think it's still an open question, 01:21:10.800 |
I think nobody actually knows the answer to that question. 01:21:17.840 |
There's another funny thing is you don't learn to drive 01:21:22.280 |
You learn to drive with an intellectual framework 01:21:24.320 |
that understands physics and color and horizontal surfaces 01:21:40.680 |
driving is a subset of this conceptual framework 01:21:48.520 |
we're teaching them to drive with driving data. 01:21:53.560 |
You teach a human all kinds of interesting things, 01:21:55.720 |
like language, like don't do that, watch out. 01:22:02.880 |
we talked about where you poetically disagreed 01:22:20.800 |
- It's a ballistics, humans are a ballistics problem, 01:22:28.480 |
And I think that's probably the right way to think about it. 01:22:30.880 |
But I still, they still continue to surprise me, 01:22:38.480 |
- Yeah, but it's gonna be one of these compensating things. 01:22:43.960 |
you have an intuition about what humans are going to do, 01:22:51.800 |
So the self-driving car comes in with no attention problem, 01:22:58.760 |
So they'll wipe out a whole class of accidents. 01:23:15.560 |
but then the cars also have a set of hardware features 01:23:22.760 |
is if you wipe out a huge number of kind of accidents, 01:23:27.000 |
then it might be just way safer than a human driver, 01:23:36.200 |
Autonomous cars will have a small number of accidents 01:23:43.800 |
- What do you think about like Tesla's dojo efforts, 01:23:59.200 |
build its own neural network training hardware? 01:24:23.640 |
we said, well, we have this 10,000 watt board to cool. 01:24:29.080 |
and they think 10,000 watts is a really small number, 01:24:41.640 |
- So the cooling does seem to be a big problem. 01:24:53.000 |
So it has to be way better than racks of GPUs. 01:25:09.160 |
If you're building a general purpose AI solution, 01:25:16.400 |
Now, something Andrej said is, I think this is amazing, 01:25:19.800 |
10 years ago, like vision, recommendation, language 01:25:27.120 |
He said the people literally couldn't talk to each other. 01:25:29.720 |
And three years ago, it was all neural networks, 01:25:34.840 |
And recently it's converging on one set of networks. 01:25:37.720 |
They vary a lot in size, obviously they vary in data, 01:25:47.400 |
it seems like they could be applied to video, 01:25:50.560 |
and it's like, and they're all really simple. 01:25:52.520 |
- And it was like, they literally replace letters 01:26:02.080 |
So the bigger it gets, the more compute you throw at it, 01:26:05.640 |
- And the more data you have, the better it gets. 01:26:12.520 |
or is this just another step to some fundamental 01:26:16.080 |
understanding about this kind of computation? 01:26:20.240 |
- Us humans don't want to believe that that kind of thing 01:26:22.200 |
will achieve conceptual understanding, as you were saying, 01:26:24.400 |
like you'll figure out physics, but maybe it will. 01:26:31.000 |
It'll understand physics in ways that we can't understand. 01:26:33.760 |
I like your Stephen Wolfram talk where he said, 01:26:40.080 |
well, big things should fall faster than small things, 01:26:46.280 |
But the number of programs in the world that are solved 01:26:51.960 |
Almost all programs have more than one line of code, 01:26:56.840 |
So he said, now we're going to physics by equation, 01:27:01.680 |
I might point out there was two generations of physics 01:27:07.240 |
before reasoning, habit, like all animals know things fall 01:27:12.240 |
and birds fly and predators know how to solve 01:27:46.360 |
- And actually, there's no reason that I can see 01:27:55.560 |
I mean, usually when you have this hierarchy, 01:28:03.080 |
and conceptually different, it's not obvious why, 01:28:05.080 |
you know, six is the right number of hierarchy steps 01:28:20.820 |
understands the thing that's not explainable to us, 01:28:25.100 |
and like, I'm not sure why there's a limit to it. 01:28:36.540 |
which is an interesting illustrative example, 01:28:42.640 |
and trying to design deep learning architectures, 01:28:53.460 |
if you could change something about the brain, 01:29:10.260 |
Like, all the big networks are way bigger than that. 01:29:18.140 |
if the input generates a result you can lose, 01:29:25.900 |
which turns into an input, and then your brain, 01:29:28.340 |
to the point where you mull things over for days, 01:29:30.660 |
and how many trips through your brain is that, right? 01:29:33.380 |
Like, it's, you know, 300 milliseconds or something 01:29:39.860 |
But then it does it over and over and over as it searches. 01:29:43.300 |
And the brain clearly looks like some kind of graph, 01:29:49.220 |
and it's locally very computationally intense, 01:29:57.820 |
- There's a lot of messy biological type of things, 01:30:03.700 |
there's mechanical, chemical, and electrical signals, 01:30:18.620 |
and it's unclear whether that's a good thing, 01:30:22.640 |
or it's a bad thing, because if it's a good thing, 01:30:26.320 |
then we need to run the entirety of the evolution. 01:30:29.260 |
Well, we're gonna have to start with basic bacteria 01:30:32.420 |
- But imagine you could build a brain with 10 layers. 01:30:45.500 |
like, you know you can only hold seven numbers in your head. 01:30:53.700 |
- And why can't we have a floating point processor 01:31:03.180 |
And why can't we see in four or eight dimensions? 01:31:18.260 |
you could enhance with a whole bunch of features 01:31:25.980 |
you're describing are actually essential for, 01:31:28.780 |
like, the constraints are essential for creating, 01:31:39.060 |
'cause, like, your brain is clearly a parallel processor. 01:31:43.020 |
You know, 10 billion neurons talking to each other 01:31:59.060 |
Like, I think I'm a relatively visual thinker. 01:32:02.300 |
I can imagine any object and rotate it in my head 01:32:12.420 |
who say they don't have a voice in their head. 01:32:31.740 |
if we dedicated more hardware to holding information, 01:32:34.960 |
like, you know, 10 numbers or a million numbers, 01:32:37.940 |
like, would that distract us from our ability 01:32:53.120 |
but can actually do lots more things in parallel. 01:32:55.620 |
- Yeah, there's no reason, if we're thinking modularly, 01:32:57.900 |
there's no reason we can't have multiple consciousnesses 01:33:01.540 |
- Yeah, and maybe there's some way to make it faster 01:33:03.700 |
so that the, you know, the area of the computation 01:33:22.900 |
Actually, people don't give it enough credit. 01:33:30.240 |
give a nice, like, spark of beauty to the whole experience. 01:33:55.820 |
- So do you know, you know, Iain Banks, his stories? 01:34:03.580 |
mostly live in the world of what they call infinite fun 01:34:12.220 |
So they interact in, you know, the story has it. 01:34:15.780 |
and they're very smart and they can do all kinds of stuff. 01:34:23.220 |
And for reasons, you know, artificial to the story, 01:34:26.300 |
they're interested in people and doing stuff, 01:34:28.260 |
but they mostly live in this other land of thinking. 01:34:33.020 |
- My inclination is to think that the ability 01:34:44.100 |
Imagine being able to make a star, move planets around. 01:34:47.620 |
- Yeah, yeah, but because we can imagine that 01:34:50.020 |
as why life is fun, if we actually were able to do it, 01:34:53.340 |
it'd be a slippery slope where fun wouldn't even 01:34:58.940 |
desensitize ourselves by the infinite amounts 01:35:04.140 |
The sadness, the dark stuff is what makes it fun, I think. 01:35:22.540 |
not through the biology side, but through the BCI, 01:35:27.180 |
Now you got a chance to check out the Neuralink stuff. 01:35:31.460 |
Like humans, like our thoughts to manifest as action. 01:35:46.140 |
for a lot of kids became the thing where they, 01:35:53.580 |
But you have to have this physical interaction. 01:35:55.860 |
Now imagine, you know, you could just imagine stuff 01:36:08.100 |
Like dreams are funny because like if you have 01:36:14.380 |
like it's very realistic looking or not realistic, 01:36:17.900 |
depends on the dream, but you can also manipulate that. 01:36:26.220 |
And the fact that nobody understands it's hilarious, but. 01:36:29.020 |
- Do you think it's possible to expand that capability 01:36:36.500 |
so from a hardware designer perspective, is there, 01:36:39.740 |
do you think it'll present totally new challenges 01:36:49.420 |
So today, computer games are rendered by GPUs. 01:36:53.700 |
- Right, so, but you've seen the GAN stuff, right? 01:36:56.820 |
Where trained neural networks render realistic images, 01:37:00.900 |
but there's no pixels, no triangles, no shaders, 01:37:05.380 |
So the future of graphics is probably AI, right? 01:37:10.380 |
- Now that AI is heavily trained by lots of real data. 01:37:13.740 |
Right, so if you have an interface with a AI renderer, 01:37:22.780 |
it won't say, well, how tall is the cat and how big, 01:37:26.260 |
And you might say, well, a little bigger, a little smaller, 01:37:28.220 |
you know, make it a tabby, shorter hair, you know, 01:37:32.900 |
Like the amount of data you'll have to send to interact 01:37:37.420 |
with a very powerful AI renderer could be low. 01:37:41.420 |
- But the question is, for brain-computer interfaces, 01:37:58.580 |
and we could feel like we're participating in it. 01:38:04.860 |
It's gonna be so good when a projection to your eyes, 01:38:08.020 |
You know, they're slowly solving those problems. 01:38:11.620 |
And I suspect when the renderer of that information 01:38:19.740 |
you know, they'll be able to give you the cues that, 01:38:23.140 |
you know, you really want for depth and all kinds of stuff. 01:38:26.220 |
Like your brain is partly faking your visual field, right? 01:38:33.820 |
Occasionally they blank, you don't notice that. 01:38:45.540 |
- So if you have an AI renderer that's trained 01:38:51.700 |
and the kind of things that enhance the realism 01:38:54.780 |
of the experience, it could be super real actually. 01:39:10.460 |
in a better way than your eyes do, which is possible, 01:39:19.780 |
- But the really cool thing is that it has to do 01:39:21.580 |
with the infinite fun that you were referring to, 01:39:33.660 |
- The interesting open question is the limits 01:39:44.940 |
- We know about the experiments where they put 01:39:54.340 |
Especially at a young age, if you throw a lot at it, 01:40:00.220 |
so can you like arbitrarily expand it with computing power? 01:40:06.900 |
So connected to the internet directly somehow? 01:40:11.940 |
- So the problem with biology and ethics is like, 01:40:15.540 |
Like us humans are perhaps unwilling to take risks 01:40:20.540 |
into directions that are full of uncertainty. 01:40:26.460 |
90% of the population's unwilling to take risks. 01:40:49.340 |
He's doing this large-scale study of psychedelics. 01:40:55.220 |
with that community of scientists working on psychedelics. 01:40:57.780 |
But because of that, that opened the door to me 01:41:00.100 |
to all these, what do they call it, psychonauts, 01:41:03.660 |
the people who, like you said, the 10% who are like, 01:41:07.260 |
I don't care, I don't know if there's a science behind this. 01:41:14.180 |
psychedelics are interesting in the sense that 01:41:21.380 |
it's a way to explore the limits of the human mind. 01:41:28.180 |
'Cause you kinda, like when you dream, you detach it. 01:41:33.020 |
but you detach your reality from what your mind, 01:41:50.240 |
but you start to have like these weird, vivid worlds that-- 01:42:03.060 |
- I know, I haven't, I don't for some reason. 01:42:06.100 |
I just knock out, and I have sometimes anxiety-inducing 01:42:10.980 |
kinda like very pragmatic nightmare type of dreams, 01:42:20.620 |
I try, I unfortunately mostly have fun in the waking world, 01:42:25.620 |
which is very limited in the amount of fun you can have. 01:42:44.700 |
- You know, years ago, and I read about, you know, 01:43:00.620 |
But my mostly, when I'm thinking about things 01:43:04.340 |
or working on problems, I prep myself before I go to sleep. 01:43:15.380 |
And then that, let's say, greatly improves the chances 01:43:24.220 |
- And then I also, you know, basically ask to remember it. 01:43:43.340 |
You say, you know, to prepare yourself to do that. 01:43:50.580 |
still gnashing your teeth about some random thing 01:43:52.980 |
that happened that you're not that really interested in, 01:43:59.660 |
- But you can direct your dreams somewhat by prepping. 01:44:08.460 |
not like, what did this guy send in an email, 01:44:14.100 |
but like fundamental problems you're actually concerned 01:44:16.780 |
- And interesting things you're worried about 01:44:32.540 |
my percentage of interesting dreams and memories went up. 01:44:44.600 |
is there a process that's at the core of that? 01:44:49.440 |
Like so some people, you know, walk and think, 01:44:52.460 |
some people like in the shower, the best ideas hit 'em. 01:45:03.220 |
So like in college, I had friends who could study 01:45:05.680 |
at the last minute and get an A the next day. 01:45:17.760 |
Because I want, you know, 'cause like a new fact 01:45:20.440 |
day before finals may screw up my understanding 01:45:32.080 |
I remember when we were doing like 3D calculus, 01:45:33.800 |
I would have these amazing dreams of 3D surfaces 01:45:36.320 |
with normal, you know, calculating the gradient 01:45:43.900 |
And if I got cycles of that, that was useful. 01:45:47.440 |
And the other is, don't over-filter your ideas. 01:46:16.220 |
Like how do you be both open and, you know, precise? 01:46:22.320 |
that sit in your mind for like years before the? 01:46:44.760 |
- For the slow thinkers in the room, I suppose. 01:46:49.420 |
As I, some people, like you said, are just like, like the. 01:46:54.900 |
There's so much diversity in how people think. 01:47:01.700 |
Like, you know, I'm not super good at remembering facts, 01:47:06.500 |
Like in our engineering, I went to Penn State 01:47:08.060 |
and almost all our engineering tests were open book. 01:47:11.900 |
I could remember the page and not the formula. 01:47:15.940 |
I could remember the whole method if I'd learned it. 01:47:23.260 |
you know, I just watched friends like flipping 01:47:27.480 |
even knowing that they'd done just as much work. 01:47:39.020 |
and figure out what the, how to function optimally. 01:47:45.740 |
He had lots of ideas, but he said they just sort of popped up 01:47:49.120 |
like you'd be working on something, you have this idea, 01:47:54.820 |
Like how your brain works is a little murky 01:48:03.900 |
Like when you visualize something, how does that happen? 01:48:07.380 |
- You know, if I say, you know, visualize volcano, 01:48:09.900 |
- And what does it actually look like when you visualize it? 01:48:12.540 |
- I can visualize to the point where I don't see 01:48:14.380 |
very much out of my eyes and I see the colors 01:48:29.660 |
- Yeah, yeah, yeah, that's a good way to say it. 01:48:31.800 |
You know, you have this kind of almost peripheral vision 01:48:42.320 |
- And somehow you can walk along those visualizations 01:48:47.240 |
- But when you're thinking about solving problems, 01:48:55.760 |
you're sort of teasing the area that you don't understand 01:49:06.480 |
Like I know sometimes when I'm working really hard 01:49:12.040 |
on something, I get really hot when I'm sleeping. 01:49:17.320 |
I wake up, the whole blanket's thrown on the floor. 01:49:20.080 |
And every time, it's wow, I wake up and think, 01:49:30.360 |
and sometimes it's this kind of, like you say, 01:49:32.480 |
like shadow thinking that you sort of have this feeling 01:49:35.120 |
you're going through this stuff, but it's not that obvious. 01:49:46.040 |
that you can't, you're just there for the ride. 01:49:55.160 |
you need to learn the language of your own mind. 01:50:04.020 |
- It's somewhat comprehensible and observable 01:50:17.960 |
working on neural networks now, what's consciousness? 01:50:50.160 |
It's a post hoc narrative about what happened. 01:51:07.980 |
And there's a really big sorting thing going on there. 01:51:13.040 |
in the sense that you create a space in your head. 01:51:19.640 |
Like photons hit your eyes, it gets turned into signals, 01:51:31.080 |
Like how the resolution of your vision is so high 01:51:36.080 |
Where for most of it, it looks nothing like vision. 01:51:46.820 |
We're literally just isolated behind our sensors. 01:51:55.600 |
speculate about alternatives, problem solve, what if. 01:52:07.600 |
even though the underlying thing is like massively parallel. 01:52:16.400 |
well you'd have huge arrays of neural networks 01:52:28.280 |
you would train the network to create basically 01:52:36.840 |
- Well, create the world, tell stories in the world 01:52:53.560 |
Like if you're hungry, it dominates your thinking. 01:53:01.320 |
but it certainly disrupts, intrudes in the consciousness. 01:53:13.460 |
And the somewhat circular observation of that 01:53:27.560 |
dwelled on past events accurately or semi-accurately. 01:53:31.300 |
- Will consciousness just spring up like naturally? 01:53:35.320 |
- Well, would that look and feel conscious to you? 01:53:38.040 |
Like you seem conscious to me, but I don't know. 01:53:41.760 |
Do you think a thing that looks conscious is conscious? 01:53:44.960 |
Like do you, again, this is like an engineering 01:53:56.840 |
is it okay to engineer something that just looks conscious? 01:54:04.080 |
'cause it's a super effective way to manage our affairs. 01:54:16.200 |
is we're modeling each other in really high-level detail. 01:54:29.600 |
that we're like wondering what they're thinking, 01:54:36.480 |
they're interesting, surprising, fascinating, 01:55:29.600 |
There seems to be more than enough planets out there. 01:56:03.620 |
if they become sufficiently different than us. 01:56:10.660 |
and the geometry defines a certain amount of physics. 01:56:32.000 |
Because organics aren't stable, too cold or too hot. 01:56:38.240 |
So if you specify the list of things that input to that, 01:56:57.340 |
where all the society had uploaded into this matrix. 01:57:01.620 |
And at some point, some of the beings in the matrix thought, 01:57:05.340 |
I wonder if there's intelligent life out there. 01:57:07.900 |
So they had to do a whole bunch of work to figure out 01:57:17.760 |
When they got there, there was like life running around, 01:57:22.660 |
And then they figured out that there was these huge, 01:57:30.540 |
and they uploaded themselves into that matrix. 01:57:34.980 |
So everywhere intelligent life was, soon as it got smart, 01:57:39.980 |
it up-leveled itself into something way more interesting 01:57:49.780 |
The essence of what we think of as an intelligent being, 01:57:53.220 |
I tend to like the thought experiment of the organism, 01:58:00.380 |
I like the notion of like Richard Dawkins and memes 01:58:26.700 |
you think you have ideas, but ideas have you. 01:58:31.580 |
And then we know about the phenomenon of groupthink 01:58:34.220 |
and there's so many things that constrain us. 01:58:51.700 |
which isn't, it's one of the creative tension things again. 01:58:55.900 |
You're constructed by it, but you can still observe it 01:58:59.460 |
and you can think about it and you can make choices 01:59:01.780 |
about to some level, how constrained you are by it. 01:59:09.740 |
But at the same time, and it could be by doing that, 01:59:49.340 |
it seems like you're related to Jordan Peterson, 02:00:04.260 |
- Well, I became an expert in Benzo withdrawal, 02:00:09.260 |
which is, you take benzodiazepines and at some point 02:00:14.860 |
they interact with GABA circuits to reduce anxiety 02:00:24.700 |
they do 'cause they interact with so many parts 02:00:28.220 |
And then once you're on them, you habituate to them 02:01:03.900 |
So there's a process called the Ashton Protocol 02:01:07.420 |
where you taper it down slowly over two years. 02:01:10.340 |
The people go through that, go through unbelievable hell. 02:01:13.740 |
And what Jordan went through seemed to be worse 02:01:24.940 |
He seems to be doing quite a bit better intellectually. 02:01:29.220 |
You can see his brain clicking back together. 02:01:34.940 |
- Well, his brain is also like this powerhouse, right? 02:01:37.700 |
So I wonder, does a brain that's able to think deeply 02:01:42.500 |
about the world suffer more through these kinds of withdrawals? 02:01:47.100 |
I've watched videos of people going through withdrawal. 02:02:09.340 |
which is against the protocol, but it's common, right? 02:02:18.540 |
half the people have difficulty, but 75% get off okay. 02:02:34.940 |
- So you put some of the fault at the doctors. 02:02:36.820 |
They just don't know what the hell they're doing. 02:02:40.540 |
It's one of those commonly prescribed things. 02:02:49.860 |
the protocol basically says you're either crazy or dependent 02:02:53.220 |
and you get kind of pushed into a different treatment regime 02:02:58.340 |
where you're a drug addict or a psychiatric patient. 02:03:08.580 |
And, you know, the awareness of that is slowly coming up. 02:03:14.400 |
The fact that they're casually prescribed to people 02:03:26.200 |
Like once you, you know, it's another one of those drugs. 02:03:29.220 |
But benzos long range have real impacts on your personality. 02:03:41.680 |
We were talking about how the infinite possibility of fun, 02:03:45.440 |
but like it's the infinite possibility of suffering too, 02:03:48.640 |
which is one of the dangers of like expansion 02:03:53.480 |
It's like, I wonder if all the possible experiences 02:04:10.160 |
the set of possibilities, like are you going to run 02:04:29.580 |
You know, all the literature on religion and stuff is, 02:04:41.640 |
But that's a long philosophical conversation. 02:04:48.640 |
of the more important moments in human history 02:04:51.600 |
with this particular virus, it seems like pandemics 02:05:03.040 |
And there's just fascinating 'cause there's so many viruses 02:05:08.560 |
in the sense that they've been around a very long time. 02:05:17.240 |
but at the same time, they're not intelligent 02:05:21.240 |
Do you have like high level thoughts about this virus 02:05:42.760 |
And it's not competitive out of a sense of evil, 02:05:44.880 |
it's competitive in a sense of there's endless variation 02:05:57.720 |
because of the competition between different kinds 02:06:04.240 |
And we know sex partly exists to scramble our genes 02:06:06.880 |
so that we have genetic variation against the invasion 02:06:11.880 |
of the bacteria and the viruses and it's endless. 02:06:18.040 |
like the density of viruses and bacteria in the ocean 02:06:47.800 |
- It just feels so peaceful from a human perspective 02:06:50.240 |
when we sit back and are able to have a relaxed conversation 02:06:56.800 |
- Like right now, you're harboring how many bacteria? 02:07:01.480 |
- There's tons, many of them are parasites on you 02:07:18.400 |
and the political response that it engendered 02:07:23.840 |
and the technology it engendered, it's kind of wild. 02:07:27.120 |
- Yeah, the communication on Twitter that it led to. 02:07:30.840 |
- Yeah, all that kind of stuff, at every single level, yeah. 02:07:34.600 |
the big extinctions are caused by meteors and volcanoes. 02:07:40.800 |
as opposed to human-created bombs that we launch-- 02:07:52.760 |
- Another historic moment, this is perhaps outside 02:08:06.640 |
I don't know if you're paying attention at all, 02:08:16.520 |
There's kind of a theme to this conversation we're having 02:08:20.720 |
it's cool how there's a large number of people 02:08:25.040 |
in a distributed way, almost having a kind of fun, 02:08:30.040 |
were able to take on the powerful elite hedge funds, 02:08:45.040 |
but it was like when Elon and the Robinhood guy talked. 02:08:52.680 |
how the finance system worked, that was clear. 02:08:55.560 |
He was treating the people who settled the transactions 02:08:58.560 |
as a black box and suddenly somebody called him up 02:09:08.880 |
"I don't even make any money on these trades. 02:09:10.440 |
"Why do I owe $3 billion while you're sponsoring the trade?" 02:09:16.160 |
I don't think either, like now we understand it. 02:09:29.040 |
and then chip comes back and it doesn't work. 02:09:31.320 |
And then suddenly you start having to open the black boxes. 02:09:37.640 |
There's a whole set of things that created this opportunity 02:09:46.280 |
Now, people spot these kinds of opportunities all the time. 02:09:58.480 |
because they're trying to manipulate their stock 02:10:03.840 |
and deprive value from both the company and the investors. 02:10:08.840 |
So the fact that some of these stocks were so short, 02:10:13.680 |
it's hilarious, that this hasn't happened before. 02:10:18.160 |
I don't actually know why some serious hedge funds 02:10:26.600 |
So my guess is we know 5% of what really happened 02:10:31.600 |
and that a lot of the players don't know what happened 02:10:34.440 |
and the people who probably made the most money 02:10:37.440 |
aren't the people that they're talking about. 02:11:03.680 |
It's the basic questions that everybody wants to know about. 02:11:06.320 |
- Yeah, so we're in a very hyper-competitive world, right? 02:11:10.320 |
But transactions like buying and selling stock 02:11:13.800 |
I trust the company represented themselves properly. 02:11:17.000 |
I bought the stock 'cause I think it's gonna go up. 02:11:22.680 |
Now, inside of that, there's all kinds of places 02:11:26.120 |
where humans over-trust, and this exposed, let's say, 02:11:37.320 |
I don't know if we have close to the real story. 02:11:44.480 |
And listen to that guy, he was like a little wide-eyed 02:11:47.280 |
about and then he did this and then he did that. 02:12:00.720 |
You pay attention to the stuff that's bugging you or new. 02:12:07.080 |
You just, you know, sky's blue every day, California. 02:12:20.840 |
- It was blue for like 100 days and now it's, you know. 02:12:29.560 |
that we've been talking about is there's a lot 02:12:32.560 |
of unexpected things that happen with the scaling. 02:12:36.040 |
And you have to be, I think the scaling forces you 02:12:41.800 |
- Well, it's interesting because when you buy 02:13:01.120 |
where he wasn't actually capitalized for the downside. 02:13:06.280 |
Now, whether something nefarious has happened, 02:13:12.000 |
the financial risk to both him and his customers 02:13:19.200 |
and his understanding of how the system worked was clearly weak 02:13:28.840 |
it could have been the surprise question was like, 02:13:32.680 |
it sounded like he was treating stuff as a black box. 02:13:36.280 |
Maybe he shouldn't have, but maybe he has a whole pile 02:13:38.560 |
of experts somewhere else and it was going on. 02:13:42.320 |
I mean, this is one of the qualities of a good leader 02:13:49.120 |
And that means to think clearly and to speak clearly. 02:13:59.720 |
like at the basic level, like what the hell happened. 02:14:10.880 |
experts/insiders/people with, you know, special information. 02:14:17.720 |
And the insiders, you know, my guess is the next time 02:14:25.080 |
- Well, they have more tools and more incentive. 02:14:44.000 |
- But this could be a new era because, I don't know, 02:14:46.040 |
at least I didn't expect that a bunch of Redditors could, 02:14:49.080 |
you know, there's millions of people can get together. 02:15:00.480 |
It has to be a surprise, it can't be the same game. 02:15:05.400 |
- It could be there's a very large number of games to play 02:15:34.560 |
they're trying to lower volatility in the short run 02:15:56.000 |
Let me ask you some advice to put on your profound hat. 02:15:59.800 |
There's a bunch of young folks who listen to this thing 02:16:07.480 |
Undergraduate students, maybe high school students, 02:16:16.880 |
What advice would you give to a young person today 02:16:19.360 |
about life, maybe career, but also life in general? 02:16:30.640 |
You have to love what you're doing to get good at it. 02:16:35.800 |
that's just boring or bland or numbing, right? 02:16:41.720 |
Well, people get talked into doing all kinds of shit 02:16:49.320 |
and like there's so much crap going on, you know? 02:17:03.320 |
Just because you're old doesn't mean you stop thinking. 02:17:18.960 |
I mean, when I hear young people spouting opinions, 02:17:22.080 |
it sounds like they come from Fox News or CNN, 02:17:24.400 |
I think they've been captured by groupthink and memes. 02:17:38.480 |
It seems safe, but it puts you in great jeopardy 02:17:52.160 |
I've been a fun person since I was pretty little. 02:18:06.000 |
Like you go through mental transitions in high school. 02:18:10.720 |
And I think I had my first midlife crisis at 26. 02:18:19.360 |
but I was going to work and all my time was consumed. 02:18:34.520 |
Like there's work, friends, family, personal time. 02:18:51.800 |
Like young people are often passionate about work. 02:19:05.840 |
to the deep dips into depression kind of thing. 02:19:10.840 |
- Yeah, well, you have to get to know yourself too. 02:19:14.480 |
Some physical, something physically intense helps. 02:19:18.520 |
- Like the weird places your mind goes kind of thing. 02:19:24.880 |
- Like triggers, like the things that cause your mind 02:19:33.760 |
whether your parents are great people or not, 02:19:35.680 |
you come into adulthood with all kinds of emotional burdens. 02:19:40.680 |
And you can see some people are so bloody stiff 02:20:06.360 |
You might as well get over that, like 100%, that's right. 02:20:14.520 |
- There's something about embarrassment that's, 02:20:22.000 |
the thing I'm most afraid of is being humiliated, I think. 02:20:41.160 |
and they walk out and they didn't think about it 02:20:43.800 |
Or maybe somebody told a funny story to somebody else 02:20:59.680 |
- Yeah, but that's a cage and you have to get out of it. 02:21:15.440 |
- So the goal, I guess it's like a cage within a cage. 02:21:18.440 |
I guess the goal is to die in the biggest possible cage. 02:21:23.760 |
People do get enlightened, I've got a few, it's great. 02:21:35.520 |
There's like enlightened people who write books 02:21:38.280 |
- It's a good way to sell a book, I'll give you that. 02:21:40.840 |
- You've never met somebody you just thought, 02:21:54.320 |
- You secretly suspect there's always a cage. 02:22:03.880 |
- You work, you worked at a bunch of companies, 02:22:13.680 |
I don't, I'm not sure if you've ever been like 02:22:19.440 |
but do you have advice for somebody that wants to do 02:22:28.320 |
like build a strong team of engineers that are passionate 02:22:35.000 |
Like is there a more specifically on that point? 02:22:43.080 |
you better be really interested in how people work and think. 02:22:50.200 |
One is how people work and the other is the-- 02:22:52.920 |
- Well, actually, there's quite a few successful startups. 02:22:55.680 |
It's pretty clear the founders don't know anything 02:22:58.380 |
Like the idea was so powerful that it propelled them. 02:23:03.760 |
they hired some people who understood people. 02:23:06.980 |
'Cause people really need a lot of care and feeding 02:23:12.760 |
Like startups are all about outproducing other people. 02:23:17.000 |
Like you're nimble because you don't have any legacy. 02:23:19.820 |
You don't have a bunch of people who are depressed 02:23:24.700 |
So startups have a lot of advantages that way. 02:23:28.020 |
- Do you like the, Steve Jobs talked about this idea 02:23:52.600 |
And there's so many people who are happy to do 02:24:01.880 |
but you need an organization that both values 02:24:05.880 |
but doesn't let them take over the leadership of it. 02:24:14.280 |
the notion with Steve was that like one B player 02:24:18.840 |
in a room of A players will be like destructive to the whole. 02:24:26.520 |
Like, you run into people who are clearly B players, 02:24:37.480 |
I just want to work with cool people on cool shit 02:24:39.680 |
and just tell me what to do and I'll go get it done. 02:24:42.520 |
So you have to, again, this is like people skills. 02:24:47.960 |
I've met some really great people I love working with 02:24:55.520 |
They create connection and community that people value. 02:25:10.400 |
Do you think this is your solution to your depression? 02:25:14.840 |
the enlightened people on occasion, trying to sell a book. 02:25:26.120 |
you should really write a book about your management 02:25:37.800 |
What role do you think love, family, friendship, 02:25:40.440 |
all that kind of human stuff play in a successful life? 02:25:44.400 |
You've been exceptionally successful in the space of 02:25:47.480 |
like running teams, building cool shit in this world, 02:26:07.320 |
- So, we habituate ourselves to the environment. 02:26:13.960 |
So, you go through life and you just get used to everything, 02:26:22.480 |
like other people's children and dogs and trees. 02:26:26.120 |
You just don't pay that much attention to them. 02:26:27.760 |
Your own kids, you're monitoring them really closely. 02:26:47.560 |
you're not gonna put the time in somebody else. 02:27:01.680 |
and surprises you and all those kinds of things. 02:27:06.400 |
There's people who figured out lots of frameworks for this. 02:27:20.080 |
And then, you know, different people have ideas 02:27:26.600 |
which keeps us together and it's super functional 02:27:29.520 |
for creating families and creating communities 02:27:41.960 |
You know, now, in the work-life balance scheme, 02:27:49.760 |
you think you may be optimizing your work potential, 02:28:04.720 |
you went home and took two days off and you came back in. 02:28:19.200 |
It's like changes the color of the light in a room. 02:28:22.120 |
Like, it creates a spaciousness that's different. 02:28:29.560 |
- Bukowski had this line about love being a fog 02:28:32.520 |
that dissipates with the first light of reality 02:28:37.100 |
- That's depressing, I think it's the other way around. 02:28:42.920 |
- It can be the light that actually enlivens your world 02:28:45.640 |
and creates the interest and the power and the strength 02:28:54.360 |
you know, there's like physical love, emotional love, 02:29:01.080 |
You should differentiate that, maybe that's your problem. 02:29:04.000 |
In your book, you should refine that a little bit. 02:29:09.920 |
aren't these just different layers of the same thing, 02:29:14.320 |
- People, some people are addicted to physical love 02:29:17.360 |
and they have no idea about emotional or intellectual love. 02:29:26.200 |
I guess the ultimate goal is for it to be the same. 02:29:28.200 |
- Well, if you want something to be bigger and interesting, 02:29:30.200 |
you should find all its components and differentiate them, 02:29:34.520 |
Like, people do this all the time, they, yeah, 02:29:41.560 |
- Well, maybe you can write the forward to my book 02:29:47.720 |
I feel like Lex has made a lot of progress in this book. 02:29:52.120 |
Well, you have things in your life that you love. 02:30:00.880 |
- And you can have multiple things with the same person 02:30:09.680 |
- Yeah, there's, like, what Bukowski described 02:30:13.160 |
is that moment when you go from being in love 02:30:19.480 |
- But when it happens, if you'd read the owner's manual 02:30:21.720 |
and you believed it, you would've said, oh, this happened. 02:30:27.960 |
But maybe there's something better about that. 02:30:40.720 |
'cause, like, who you can be in your future self 02:30:44.000 |
is actually more interesting and possibly delightful 02:30:46.720 |
than being a mad kid in love with the next person. 02:31:01.080 |
- Yeah, that's right, that there's a lot more fun 02:31:19.420 |
But you have to look carefully and you have to work at it 02:31:23.000 |
- Yeah, you have to see the surprises when they happen. 02:31:28.320 |
From the branching perspective, you mentioned regrets. 02:31:31.260 |
Do you have regrets about your own trajectory? 02:31:43.160 |
I would say, in terms of working with people, 02:32:02.120 |
And this psychologist I heard a long time ago said, 02:32:05.920 |
"You tend to think somebody does something to you, 02:32:09.160 |
"but really what they're doing is they're doing 02:32:10.780 |
"what they're doing while they're in front of you. 02:32:26.640 |
And then when they do stuff, I'm way less surprised. 02:32:41.000 |
It's like you do something and you think you're embarrassed, 02:32:49.680 |
no, they're getting mad the way they're doing that 02:32:53.160 |
And maybe you can help them if you care enough about it. 02:32:57.240 |
Or you could see it coming and step out of the way. 02:33:06.000 |
You said with Steve that was a feature, not a bug. 02:33:08.920 |
- Yeah, well, he was using it as the counterforce 02:33:18.960 |
It was more like I just got pissed off and did stuff. 02:33:32.440 |
It was more like I just got pissed off and left 02:33:46.120 |
and his staff had to be or they wouldn't survive him. 02:33:51.400 |
And then I've been in companies where they say it's political 02:33:53.320 |
but it's all fun and games compared to Apple. 02:33:56.960 |
And it's not that the people at Apple are bad people, 02:34:00.320 |
it's just they operate politically at a higher level. 02:34:03.600 |
It's not like, oh, somebody said something bad 02:34:10.080 |
They had strategies about accomplishing their goals, 02:34:15.720 |
sometimes over the dead bodies of their enemies. 02:34:24.520 |
sophistication and a big time factor rather than a-- 02:34:27.760 |
- That requires a lot of control over your emotions, 02:34:31.320 |
I think, to have a bigger strategy in the way you behave. 02:34:55.860 |
Like how do you convince people, how do you leverage them, 02:34:57.920 |
how do you motivate them, how do you get rid of them? 02:35:01.280 |
There's so many layers of that that are interesting. 02:35:04.480 |
And even though some of it, let's say, may be tough, 02:35:08.520 |
it's not evil unless you use that skill to evil purposes. 02:35:20.360 |
but it was sort of like, I'm an engineer, I do my thing. 02:35:23.800 |
And there's times when I could have had a way bigger impact 02:35:36.640 |
- Yeah, that human political power expression layer 02:35:43.680 |
I mean, people are good at it, they're just amazing. 02:35:50.280 |
relatively kind and oriented in a good direction, 02:35:54.200 |
you can really feel, you can get lots of stuff done 02:35:57.520 |
and coordinate things that you never thought possible. 02:36:00.160 |
But all people like that also have some pretty hard edges 02:36:09.400 |
And I wish I'd spent more time with that when I was younger. 02:36:14.160 |
You know, I was a wide-eyed kid for 30 years. 02:36:20.040 |
- What do you hope your legacy is when there's a book, 02:36:28.080 |
and this is like a one-sentence entry by Jim Keller. 02:36:37.800 |
You're one of the sparkling little human creatures 02:36:52.400 |
But they took it out, so she put it back in, she's 15. 02:36:56.580 |
I think that was probably the best part of my legacy. 02:37:01.200 |
She got her sister, and they were all excited. 02:37:06.640 |
'cause there's articles with that in the title. 02:37:09.440 |
So in the eyes of your kids, you're a legend. 02:37:21.620 |
In terms of the big legend stuff, I don't care. 02:37:28.600 |
- Yeah, I've been thinking about building a big pyramid. 02:37:33.600 |
about whether pyramids or craters are cooler. 02:37:36.880 |
And he realized that there's craters everywhere, 02:37:39.280 |
but they built a couple pyramids 5,000 years ago. 02:37:49.560 |
And they don't actually know how they built them, 02:37:52.920 |
- It's either AGI or aliens could be involved. 02:38:16.320 |
Like those two things would pretty much make it. 02:38:18.720 |
- Jim, it's a huge honor talking to you again. 02:38:20.680 |
I hope we talk many more times in the future. 02:38:22.800 |
I can't wait to see what you do with Tenstorrent. 02:38:58.360 |
Those who can imagine anything can create the impossible. 02:39:03.280 |
Thank you for listening and hope to see you next time.