Jack Dorsey: Square, Cryptocurrency, and Artificial Intelligence | Lex Fridman Podcast #91
Chapters
0:00 Introduction
2:48 Engineering at scale
8:36 Increasing access to the economy
13:09 Machine learning at Square
15:18 Future of the digital economy
17:17 Cryptocurrency
25:31 Artificial intelligence
27:49 Her
29:12 Exchange with Elon Musk about bots
32:05 Concerns about artificial intelligence
35:40 Andrew Yang
40:57 Eating one meal a day
45:49 Mortality
47:50 Meaning of life
48:59 Simulation
00:00:00.000 |
The following is a conversation with Jack Dorsey, 00:00:13.960 |
we decided to focus this conversation on Square 00:00:29.480 |
For everyone feeling the medical, psychological, 00:00:47.520 |
to form an organization that funds COVID-19 relief. 00:00:59.520 |
"by posting all its donations to a single Google Doc. 00:01:37.880 |
to get a discount and to support this podcast. 00:01:46.520 |
you get an all-access pass to watch courses from, 00:01:53.680 |
Neil deGrasse Tyson on scientific thinking and communication, 00:01:59.960 |
both among my favorite games, on game design, 00:02:16.960 |
and the experience of being launched into space alone 00:02:29.360 |
but it's an experience that will stick with you 00:02:40.960 |
to get a discount and to support this podcast. 00:02:44.360 |
And now, here's my conversation with Jack Dorsey. 00:03:01.760 |
Sort of machine learning, artificial intelligence, 00:03:08.520 |
So there's a lot of incredible engineering going on 00:03:17.700 |
maybe we'll get to about life and death and meaning 00:03:23.840 |
some of the biggest network systems in the world, 00:03:30.840 |
The cool thing about that is the infrastructure, 00:03:52.800 |
- Hire people that I can learn from, number one. 00:04:12.120 |
is a matter of being able to bring it to more people, 00:04:22.000 |
that either have experience or are extremely fast learners 00:04:35.480 |
we benefit a ton from the open-source community 00:04:48.640 |
It's a very slow-moving process usually, open-source, 00:05:05.300 |
about hacking and programming and engineering 00:05:20.800 |
sacrifice their time without any expectation in return, 00:05:27.320 |
much larger than themselves, which I think is great. 00:05:45.880 |
not often talked about how incredible that is 00:05:48.920 |
to sort of have a system that doesn't go down often, 00:05:53.480 |
is able to take care of all these transactions. 00:06:25.520 |
Like what's the interesting challenges there? 00:06:28.560 |
- By the way, you're the best-dressed hacker I've met. 00:06:55.200 |
is trying to solve or address too many at once 00:07:05.200 |
or not being critical of the answers you find 00:07:12.840 |
credible hypotheses that you can actually test 00:07:27.840 |
And if there's one skill I wanna improve every day, 00:07:34.920 |
and the only way we can evolve any of these things 00:07:47.240 |
Seems like fundamental to this whole process. 00:07:56.960 |
And vice versa, if you focus too much on the software, 00:08:01.960 |
there are hardware solutions that can 10x the thing. 00:08:06.700 |
So I try to resist the categories of thinking 00:08:40.560 |
is to increase people's access to the economy. 00:08:49.560 |
peer-to-peer payments, even crypto, cryptocurrency, 00:08:52.760 |
digital cryptocurrency, what do you see as the major ways 00:08:56.120 |
our society can increase participation in the economy? 00:08:59.420 |
So if we look at today and the next 10 years, 00:09:03.480 |
maybe in Africa and all kinds of other places 00:09:07.540 |
- If there was one word that I think represents 00:09:13.960 |
what we're trying to do at Square, it is that word access. 00:09:23.480 |
When we started, we thought we were just building 00:09:29.120 |
to plug it into their phone and swipe a credit card. 00:09:34.400 |
who actually tried to accept credit cards in the past, 00:09:42.560 |
not enabled, but allowed to process credit cards. 00:09:46.600 |
And we dug a little bit deeper, again, asking that question, 00:09:50.640 |
and we found that a lot of them would go to banks 00:10:03.280 |
And many of the businesses that we talked to, 00:10:09.600 |
they don't have good credit or a credit history. 00:10:13.600 |
They're entrepreneurs who are just getting started, 00:10:17.800 |
taking a lot of personal risk, financial risk. 00:10:24.480 |
for the job of being able to accept money from people, 00:10:36.520 |
that wasn't the intention of the financial industry, 00:10:38.600 |
but it's the only tool they had available to them 00:10:49.160 |
So that's the first thing we actually looked at. 00:10:50.880 |
And that's where the, you know, we built the hardware, 00:10:52.640 |
but the software really came in in terms of risk modeling. 00:11:03.080 |
We started with a very strong data science discipline 00:11:07.680 |
because we knew that our business was not necessarily 00:11:21.280 |
so to enable more people to come into the system, 00:11:26.800 |
that that person will be a legitimate vendor. 00:11:34.000 |
I think a lot of the financial industry had a mindset 00:11:36.400 |
of kind of distrust and just constantly looking 00:11:41.400 |
for opportunities to prove why people shouldn't get 00:11:53.000 |
Whereas we took on a mindset of trust and then verify, 00:12:03.840 |
only about 30 to 40% of the people who applied 00:12:08.240 |
to accept credit cards would actually get through the system. 00:12:20.160 |
And we had this mindset of, we're going to watch 00:12:24.520 |
not at the merchant level, but we're going to watch 00:12:36.320 |
high integrity, credible, and don't look suspicious, 00:12:43.080 |
If we see any interestingness in how you use our system, 00:12:50.560 |
to figure out if there's something nefarious going on, 00:12:55.800 |
So the change in the mindset led to the technology 00:13:00.800 |
that we needed to enable more people to get through 00:13:06.520 |
and to enable more people to access the system. 00:13:09.000 |
- What role does machine learning play into that, 00:13:24.240 |
and then you just have to verify and catch the ones 00:13:26.840 |
who are not, as opposed to assuming everybody's bad, 00:13:37.320 |
has machine learning played in doing that verification? 00:13:41.800 |
I mean, we weren't calling it machine learning, 00:13:47.600 |
machine learning became more of the nomenclature, 00:13:50.000 |
and as that evolved, it became more sophisticated 00:13:54.840 |
with deep learning, and as that continues to evolve, 00:14:06.960 |
because we also had, we had to partner with a bank, 00:14:14.360 |
and we had to show that by bringing more people 00:14:17.800 |
into the system, that we could do so in a responsible way, 00:14:30.600 |
is able to deliver on this sort of trustworthy 00:14:49.920 |
So again, it's kind of getting something tangible out there. 00:14:54.160 |
I wanna show what we can do rather than talk about it. 00:14:57.160 |
And that put a lot of pressure on us to do the right things. 00:15:01.600 |
And it also created a culture of accountability, 00:15:10.760 |
and I think incentivized all of our early folks 00:15:27.240 |
You can see there's some people even talking about 00:15:30.280 |
our personal data as a thing that we could monetize 00:15:35.920 |
Sort of everything can become part of the economy. 00:15:38.480 |
Do you see, so what does the future of Square look like 00:15:41.400 |
in sort of giving people access in all kinds of ways 00:15:45.600 |
to being part of the economy as merchants and as consumers? 00:16:01.760 |
and that's why I'm such a huge believer in Bitcoin, 00:16:07.240 |
because it just, our biggest problem as a company right now 00:16:16.360 |
Open a new market, we have to have a partnership 00:16:30.000 |
takes a bunch of that away, where we can potentially 00:16:33.040 |
launch a product in every single market around the world. 00:16:41.160 |
and we have consistent understanding of regulation 00:16:49.520 |
So I think the internet continuing to be accessible 00:17:00.280 |
And it will just allow for a lot more innovation, 00:17:04.280 |
a lot more speed in terms of what we can build 00:17:24.040 |
I talked to Vitalik Buterin yesterday on this podcast. 00:17:33.960 |
talking a lot about Bitcoin and Ethereum, of course. 00:17:43.720 |
Where do you see it going in the next 10, 20 years? 00:17:51.760 |
for globally, for our world, for the way we think about money? 00:18:01.560 |
is there's no one person setting the direction. 00:18:08.640 |
So we have something that is pretty organic in nature 00:18:22.400 |
is one of the most seminal works of computer science 00:18:34.440 |
There's so much hype around digital currency, 00:18:42.160 |
- Yeah, and the underlying principles behind it 00:18:47.680 |
I think that's a very, very powerful statement. 00:18:51.280 |
The timing of when it was released is powerful. 00:19:05.540 |
and honorable and enables everyone to be part of the story, 00:19:12.200 |
So you asked a question around 10 years and 20 years. 00:19:15.600 |
I mean, I think the amazing thing is no one knows. 00:19:23.920 |
whether they be a developer or someone who uses it, 00:19:29.440 |
can change its direction in small and large ways. 00:19:35.020 |
'cause that's what the internet has shown is possible. 00:19:38.060 |
Now, there's complications with that, of course, 00:19:40.280 |
and there's certainly companies that own large parts 00:19:44.440 |
of the internet and can direct it more than others, 00:19:46.260 |
and there's not equal access to every single person 00:19:50.980 |
in the world just yet, but all those problems 00:19:56.560 |
and to me, that gives confidence that they're solvable 00:20:02.660 |
I think the world changes a lot as we get these satellites 00:20:10.620 |
'cause it just removes a bunch of the former constraints 00:20:18.180 |
But a global currency, which a native currency 00:20:22.060 |
for the internet is a proxy for, is a very powerful concept, 00:20:26.940 |
and I don't think any one person on this planet 00:20:35.800 |
- Do you think it's possible, sorry to interrupt, 00:20:39.080 |
of digital currency would redefine the nature of money, 00:20:54.780 |
- Definitely, but I think the bigger ramification 00:21:02.220 |
and I think there are many positive ramifications-- 00:21:09.180 |
Money is a foundational layer that enables so much more. 00:21:12.100 |
I was meeting with an entrepreneur in Ethiopia, 00:21:14.880 |
and payments is probably the number one problem 00:21:26.640 |
or the amount of corruption within the current system. 00:21:42.400 |
I met an entrepreneur who started the Lyft/Uber of Ethiopia, 00:21:49.800 |
is that it's not easy for her riders to pay the company, 00:21:54.720 |
and it's not easy for her to pay the drivers. 00:22:02.360 |
So the fact that she even has to think about payments 00:22:07.120 |
instead of thinking about the best rider experience 00:22:10.240 |
and the best driver experience is pretty telling. 00:22:15.020 |
So I think as we get a more durable, resilient, 00:22:20.020 |
and global standard, we see a lot more innovation everywhere. 00:22:25.700 |
And I think there's no better case study for this 00:22:32.060 |
and their entrepreneurs who are trying to start things 00:22:35.100 |
within health or sustainability or transportation 00:22:38.780 |
or a lot of the companies that we've seen here. 00:22:42.500 |
So the majority of companies I met in November 00:22:51.260 |
- You mentioned, and this is a small tangent, 00:22:54.300 |
you mentioned the anonymous launch of Bitcoin 00:22:58.340 |
is a sort of profound philosophical statement. 00:23:04.500 |
First of all, let me ask-- - Well, there's an identity 00:23:32.040 |
There's no intention to build an identity around it. 00:23:34.460 |
And while the identity being built was a short time window, 00:23:41.240 |
it was meant to stick around, I think, and to be known. 00:23:46.240 |
And it's being honored in how the community thinks 00:23:51.540 |
about building it, like the concept of satoshis, 00:23:58.600 |
But I think it was smart not to do it anonymously, 00:24:05.420 |
but to do it as a pseudonym because I think it builds 00:24:11.420 |
that this was a human or a set of humans behind it. 00:24:14.660 |
And there's this natural identity that I can imagine. 00:24:22.420 |
That's a pretty powerful thing from your perspective. 00:25:08.700 |
My particular identity, my real identity associated with it, 00:25:17.620 |
from doing something good and being able to see it 00:25:21.020 |
and see people use it is pretty overwhelming and powerful. 00:25:25.860 |
More so than maybe seeing your name in the headlines. 00:25:34.860 |
70 years ago, Alan Turing formulated the Turing test. 00:25:38.020 |
To me, natural language is one of the most interesting 00:25:53.340 |
How hard do you think it is to pass the Turing test 00:26:00.260 |
I think where we are now and for at least years out 00:26:13.820 |
can bubble up interestingness very, very quickly 00:26:17.340 |
and pair that with human discretion around severity, 00:26:30.740 |
I think for me, the chasm to cross for general intelligence 00:26:48.980 |
- So the explainability part is kind of essential 00:26:54.260 |
why the decisions were made, that kind of thing. 00:26:56.540 |
- Yeah, I mean, I think that's one of our biggest risks 00:27:03.460 |
that can't necessarily explain why they made a decision 00:27:06.780 |
or what criteria they used to make the decision. 00:27:11.100 |
from lending decisions to content recommendation 00:27:18.060 |
Like a lot of us have watches that tell us when to stand. 00:27:34.940 |
Although it's hard-- - Which is a very hard problem 00:27:39.740 |
- So that's what I was, I think we're being sometimes 00:27:42.700 |
a little bit unfair to artificial intelligence systems 00:27:45.700 |
'cause we're not very good at some of these things. 00:27:52.580 |
romanticized question, but on that line of thought, 00:27:55.820 |
do you think we'll ever be able to build a system 00:27:58.820 |
like in the movie "Her" that you could fall in love with? 00:28:07.420 |
Hasn't someone in Japan fallen in love with his AI? 00:28:27.420 |
- That's less a function of the artificial intelligence 00:28:32.220 |
and how they find meaning and where they find meaning. 00:28:34.700 |
- Do you think we humans can find meaning in technology? 00:28:37.380 |
In this kind of way? - Yeah, yeah, yeah, 100%. 00:28:40.740 |
And I don't necessarily think it's a negative, 00:28:56.240 |
And I don't think it's going to be a function 00:29:08.820 |
- But that question really gets at the difference 00:29:12.700 |
You had a little bit of an exchange with Elon Musk. 00:29:16.200 |
Basically, I mean, it's the trivial version of that, 00:29:20.140 |
but I think there's a more fundamental question 00:29:29.260 |
if we look into the future, 10, 20 years out, 00:29:34.060 |
or is it even necessary to tell the difference 00:29:36.340 |
in the digital space between a human and a robot? 00:29:40.380 |
Can we have fulfilling relationships with each 00:29:42.700 |
or do we need to tell the difference between them? 00:29:45.180 |
- I think it's certainly useful in certain problem domains 00:30:02.860 |
- Well, what's interesting is I think the technology 00:30:14.340 |
So if you look at adversarial machine learning, 00:30:21.220 |
And at least for me, the hope is that the technology 00:30:23.900 |
to defend will always be right there, at least. 00:30:36.940 |
two or 10 steps ahead of the creation technologies. 00:30:42.060 |
This is a problem that I think the financial industry 00:30:47.660 |
risk models, for instance, are built around identity. 00:30:53.420 |
And you can imagine a world where all this conversation 00:31:00.940 |
of driver's license or passports or state identities. 00:31:05.940 |
And people construct identities in order to get through 00:31:10.740 |
a system such as ours to start accepting credit cards 00:31:15.940 |
And those technologies seem to be moving very, very quickly. 00:31:25.080 |
But certainly with more focus, we can get ahead of it. 00:31:51.180 |
And being able to take signals that show correctness 00:31:59.780 |
and to be able to build that into our newer models 00:32:05.580 |
- Do you have other worries, like some people, 00:32:07.380 |
like Elon and others, have worries of existential threats 00:32:14.260 |
Or if you think more narrowly about threats and concerns 00:32:24.380 |
Do you have concerns or are you more optimistic? 00:32:50.240 |
Google has a stronger sense of their preferences 00:33:01.680 |
I can easily imagine today that Google probably knows 00:33:10.460 |
Maybe not me, per se, but for someone growing up, 00:33:19.060 |
or Facebook or Twitter or Square or any of these things, 00:33:22.620 |
the self-awareness is being offloaded to other systems, 00:33:32.060 |
And his concern is that we lose that self-awareness 00:33:36.040 |
because the self-awareness is now outside of us 00:33:47.460 |
what doctor should I choose, who should I date? 00:33:50.340 |
All these things we're now seeing play out very quickly. 00:33:54.200 |
So he sees meditation as a tool to build that self-awareness 00:34:15.540 |
yes, I am offloading this decision to this algorithm 00:34:21.700 |
and can't tell me why it's doing the things it's doing 00:34:31.380 |
the best of what they can do is to help guide you 00:34:34.940 |
in a journey of learning new ideas, of learning, period. 00:34:39.740 |
- It can be a great thing, but do you know you're doing that? 00:34:41.900 |
Are you aware that you're inviting it to do that to you? 00:34:53.420 |
that you have that invitation and it's being acted upon? 00:35:29.020 |
- It's the entrepreneurial story in the form of a rat. 00:35:33.820 |
I just remember just the soundtrack was really good. 00:35:42.300 |
sticking on artificial intelligence a little bit, 00:35:57.580 |
- I know, it was very disappointing and depressing. 00:36:22.100 |
instead of the, of course, there's positive impacts 00:36:30.820 |
what are your thoughts about universal basic income? 00:36:39.380 |
- I think he was 100% right on almost every dimension. 00:36:48.700 |
I mean, he identified truck drivers, I'm from Missouri, 00:36:59.420 |
concern and the issue that people from where I'm from 00:37:04.420 |
feel every single day that is often invisible 00:37:12.100 |
This is where it pertains to Square's business. 00:37:14.460 |
We are seeing more and more of the point of sale 00:38:07.540 |
to set people up with new skills and new careers, 00:38:12.540 |
they need to have a floor to be able to survive. 00:38:22.620 |
It's not going to incentivize you to quit your job 00:38:34.980 |
so that you can focus on what am I going to do now 00:38:55.780 |
factory lines, and everything worked out okay. 00:39:04.220 |
and the centralization of a lot of the things 00:39:22.060 |
who actually owns the data and who can operate on it. 00:39:26.300 |
And are we able to share the insights from the data 00:39:35.420 |
that help our needs or help our business or whatnot? 00:39:39.140 |
So that's where I think regulation could play 00:39:52.660 |
that will ultimately touch every single aspect 00:39:56.540 |
And then where data is owned and how it's shared. 00:40:01.540 |
So those are the answers that as a society, as a world, 00:40:19.580 |
But I think it was spot on with identifying the problem 00:40:23.260 |
and proposing solutions that would actually work. 00:40:31.340 |
But I mean, I think UBI is well past its due. 00:40:36.340 |
I mean, it was certainly trumpeted by Martin Luther King 00:40:43.820 |
- And like you said, the exact thousand dollar mark 00:40:52.100 |
to implement these solutions and see what works. 00:41:05.420 |
- First time anyone has said that to me in this case. 00:41:08.620 |
- Yeah, but it's becoming more and more cool. 00:41:13.700 |
So the intermittent fasting and fasting in general, 00:41:26.400 |
makes you appreciate what it is to be human somehow. 00:41:37.940 |
It also helps me as a programmer and a deep thinker, 00:41:43.980 |
to sit there for many hours and focus deeply. 00:41:55.060 |
mindset that helps you maximize mental performance 00:42:04.020 |
- I think I just took it for granted for too long. 00:42:09.660 |
- Just the social structure of we eat three meals a day 00:42:15.180 |
And I just never really asked the question why. 00:42:22.180 |
but you at least, you famously eat once a day. 00:42:29.700 |
- By the way, what made you decide to eat once a day? 00:42:32.100 |
Like, 'cause to me that was a huge revolution 00:42:41.180 |
- When you first, like the first week you start doing it, 00:42:43.580 |
it feels like you kind of like have a superpower. 00:42:46.180 |
- And you realize it's not really a superpower. 00:42:50.860 |
like just how much our mind dictates what we're capable of. 00:43:02.260 |
that incentivize like, you know, this three meal a day thing 00:43:09.700 |
versus necessity for our health and for our bodies. 00:43:17.500 |
because I played a lot with my diet when I was a kid 00:43:25.900 |
Just because I, you know, health is the most precious thing 00:43:33.900 |
So being able to ask the question through experiments 00:43:44.180 |
And I heard this one guy on a podcast, Wim Hof, 00:43:47.660 |
who's famous for doing ice baths and holding his breath 00:44:02.020 |
So I just, I learn the most when I make myself, 00:44:10.420 |
Because everything comes to bear in those moments. 00:44:14.140 |
And you really learn what you're about or what you're not. 00:44:31.940 |
And then one day I realized I can't keep doing this 00:44:53.540 |
not being able to use the note card while speaking 00:44:56.580 |
and speaking for five minutes about that topic. 00:45:09.980 |
and around if I set my mind to do something, I'll do it. 00:45:23.840 |
and has given me so much learning and benefit as a result. 00:45:49.700 |
you kind of talk about facing difficult ideas. 00:45:53.060 |
You meditate, you think about the broad context of life 00:46:00.300 |
Let me ask, I apologize again for the romanticized question, 00:46:36.300 |
but it's also a tool to feel the importance of every moment. 00:46:48.620 |
Is this really what I'm going to spend the hour doing? 00:46:52.540 |
I only have so many more sunsets and sunrises to watch. 00:46:58.320 |
Am I not going to make sure that I try to see it? 00:47:09.640 |
I don't see it as something that I dread or is dreadful. 00:47:15.680 |
It's a tool that is available to every single person 00:47:19.000 |
to use every day because it shows how precious life is 00:47:24.560 |
whether it be your own health or a friend or a coworker 00:47:34.820 |
And for me, it's am I really focused on what matters? 00:47:42.240 |
Sometimes that might be friendships or family 00:47:47.020 |
but it's the ultimate clarifier in that sense. 00:48:11.120 |
of the fact that I'm alive is pretty meaningful. 00:48:25.040 |
or long-lasting friendships or my family is meaningful. 00:48:29.940 |
Seeing people use something that I helped build 00:48:36.300 |
But that sense of, I mean, I think ultimately 00:48:47.220 |
I am part of something that's bigger than myself 00:48:49.960 |
and I can feel it directly in small ways or large ways, 00:49:07.920 |
but also crazy and random and wrought with tons of problems. 00:49:20.240 |
I mean, I just think it's taken us way too long 00:49:24.560 |
as a planet to realize we're all in this together 00:49:27.200 |
and we all are connected in very significant ways. 00:49:32.200 |
I think we hide our connectivity very well through ego, 00:49:46.240 |
towards changing and that's how I would have it. 00:49:52.880 |
then how are we gonna connect to all the other simulations? 00:50:07.340 |
I don't think there's a better way to end it. 00:50:09.900 |
Jack, thank you so much for all the work you do. 00:50:12.020 |
- There's probably other ways that we've ended this 00:50:13.900 |
in other simulations that may have been better. 00:50:23.040 |
with Jack Dorsey and thank you to our sponsor, Masterclass. 00:50:29.120 |
by signing up to Masterclass at masterclass.com/lex. 00:50:34.120 |
If you enjoy this podcast, subscribe on YouTube, 00:50:37.060 |
review it with five stars on Apple Podcast, 00:50:39.380 |
support on Patreon or simply connect with me on Twitter 00:50:54.980 |
"Hackers love it, yet it is described as a toy, 00:51:01.080 |
Thank you for listening and hope to see you next time.