
Jack Dorsey: Square, Cryptocurrency, and Artificial Intelligence | Lex Fridman Podcast #91


Chapters

0:00 Introduction
2:48 Engineering at scale
8:36 Increasing access to the economy
13:09 Machine learning at Square
15:18 Future of the digital economy
17:17 Cryptocurrency
25:31 Artificial intelligence
27:49 Her
29:12 Exchange with Elon Musk about bots
32:05 Concerns about artificial intelligence
35:40 Andrew Yang
40:57 Eating one meal a day
45:49 Mortality
47:50 Meaning of life
48:59 Simulation


00:00:00.000 | The following is a conversation with Jack Dorsey,
00:00:02.520 | co-founder and CEO of Twitter,
00:00:05.160 | and founder and CEO of Square.
00:00:08.760 | Given the happenings at the time
00:00:10.560 | related to Twitter leadership
00:00:12.240 | and the very limited time we had,
00:00:13.960 | we decided to focus this conversation on Square
00:00:16.720 | and some broader philosophical topics,
00:00:18.880 | and to save an in-depth conversation
00:00:20.840 | on engineering and AI at Twitter
00:00:23.080 | for a second appearance in this podcast.
00:00:25.840 | This conversation was recorded
00:00:27.320 | before the outbreak of the pandemic.
00:00:29.480 | For everyone feeling the medical, psychological,
00:00:31.840 | and financial burden of this crisis,
00:00:33.800 | I'm sending love your way.
00:00:35.480 | Stay strong.
00:00:36.520 | We're in this together.
00:00:37.680 | We'll beat this thing.
00:00:38.760 | As an aside, let me mention that Jack moved
00:00:42.440 | $1 billion of Square equity,
00:00:45.240 | which is 28% of his wealth,
00:00:47.520 | to form an organization that funds COVID-19 relief.
00:00:51.400 | First, as Andrew Yang tweeted,
00:00:53.760 | "This is a spectacular commitment."
00:00:56.240 | And second, "It is amazing
00:00:57.960 | "that it operates transparently
00:00:59.520 | "by posting all its donations to a single Google Doc.
00:01:03.240 | "To me, true transparency is simple,
00:01:06.080 | "and this is as simple as it gets."
00:01:08.400 | This is the Artificial Intelligence Podcast.
00:01:11.800 | If you enjoy it, subscribe on YouTube,
00:01:13.720 | review it with five stars on Apple Podcasts,
00:01:15.960 | support it on Patreon,
00:01:17.080 | or simply connect with me on Twitter
00:01:18.960 | at Lex Fridman, spelled F-R-I-D-M-A-N.
00:01:22.200 | As usual, I'll do a few minutes of ads now,
00:01:24.600 | and never any ads in the middle
00:01:26.080 | that can break the flow of the conversation.
00:01:28.200 | I hope that works for you
00:01:29.480 | and doesn't hurt the listening experience.
00:01:32.160 | This show is presented by Masterclass.
00:01:34.800 | Sign up on masterclass.com/lex
00:01:37.880 | to get a discount and to support this podcast.
00:01:40.840 | When I first heard about Masterclass,
00:01:42.440 | I thought it was too good to be true.
00:01:44.360 | For $180 a year,
00:01:46.520 | you get an all-access pass to watch courses from,
00:01:49.600 | to list some of my favorites,
00:01:51.360 | Chris Hadfield on space exploration,
00:01:53.680 | Neil deGrasse Tyson on scientific thinking and communication,
00:01:56.960 | Will Wright, creator of SimCity and The Sims,
00:01:59.960 | both among my favorite games, on game design,
00:02:03.080 | Jane Goodall on conservation,
00:02:05.440 | Carlos Santana on guitar,
00:02:07.160 | one of my favorite guitar players,
00:02:08.920 | Garry Kasparov on chess,
00:02:10.680 | Daniel Negreanu on poker,
00:02:12.240 | and many, many more.
00:02:13.880 | Chris Hadfield explaining how rockets work
00:02:16.960 | and the experience of being launched into space alone
00:02:19.480 | is worth the money.
00:02:20.800 | For me, the key is to not be overwhelmed
00:02:22.560 | by the abundance of choice.
00:02:24.240 | Pick three courses you want to complete,
00:02:26.320 | watch each all the way through.
00:02:28.160 | It's not that long,
00:02:29.360 | but it's an experience that will stick with you
00:02:31.040 | for a long time.
00:02:32.800 | It's easily worth the money.
00:02:34.560 | You can watch it on basically any device.
00:02:37.200 | Once again, sign up on masterclass.com/lex
00:02:40.960 | to get a discount and to support this podcast.
00:02:44.360 | And now, here's my conversation with Jack Dorsey.
00:02:47.720 | You've been on several podcasts,
00:02:50.640 | Joe Rogan, Sam Harris, Rich Roll,
00:02:52.760 | others, excellent conversations.
00:02:55.240 | But I think there's several topics
00:02:57.520 | that you didn't talk about
00:02:58.840 | that I think are fascinating
00:03:00.040 | that I'd love to talk to you about.
00:03:01.760 | Sort of machine learning, artificial intelligence,
00:03:04.120 | both the narrow kind and the general kind,
00:03:06.240 | and engineering at scale.
00:03:08.520 | So there's a lot of incredible engineering going on
00:03:11.200 | that you're a part of.
00:03:12.400 | Crypto, cryptocurrency, blockchain, UBI,
00:03:16.200 | all kinds of philosophical questions
00:03:17.700 | maybe we'll get to about life and death and meaning
00:03:20.400 | and beauty.
00:03:21.640 | So you're involved in building
00:03:23.840 | some of the biggest network systems in the world,
00:03:27.800 | sort of trillions of interactions a day.
00:03:30.840 | The cool thing about that is the infrastructure,
00:03:33.120 | the engineering at scale.
00:03:35.080 | You started as a programmer with C.
00:03:37.840 | - A hacker. - Building, yeah.
00:03:38.880 | - I'm a hacker, I'm not really an engineer.
00:03:41.080 | - Not a legit software engineer,
00:03:43.000 | you're a hacker at heart.
00:03:44.360 | But to achieve scale,
00:03:46.000 | you have to do some, unfortunately,
00:03:48.040 | legit large-scale engineering.
00:03:49.920 | So how do you make that magic happen?
00:03:52.800 | - Hire people that I can learn from, number one.
00:03:57.540 | I mean I'm a hacker in the sense that I,
00:04:00.120 | you know, my approach has always been
00:04:01.600 | do whatever it takes to make it work,
00:04:03.440 | so that I can see and feel the thing
00:04:07.640 | and then learn what needs to come next.
00:04:09.540 | And oftentimes what needs to come next
00:04:12.120 | is a matter of being able to bring it to more people,
00:04:16.260 | which is scale.
00:04:17.560 | And there's a lot of great people out there
00:04:22.000 | that either have experience or are extremely fast learners
00:04:26.240 | that we've been lucky enough to find
00:04:30.200 | and work with for years.
00:04:33.700 | But I think a lot of it,
00:04:35.480 | we benefit a ton from the open-source community
00:04:39.600 | and just all the learnings there
00:04:41.480 | that are laid bare in the open.
00:04:44.520 | All the mistakes, all the success,
00:04:46.520 | all the problems.
00:04:48.640 | It's a very slow-moving process usually, open-source,
00:04:53.440 | but it's very deliberate.
00:04:54.600 | And you get to see, because of the pace,
00:04:58.520 | you get to see what it takes
00:05:00.280 | to really build something meaningful.
00:05:02.620 | So I learned, most of everything I learned
00:05:05.300 | about hacking and programming and engineering
00:05:08.240 | has been due to open-source
00:05:11.000 | and the generosity that people have
00:05:17.000 | given to give up their time,
00:05:20.800 | sacrifice their time without any expectation in return,
00:05:24.440 | other than being a part of something
00:05:27.320 | much larger than themselves, which I think is great.
00:05:29.800 | - The open-source movement is amazing.
00:05:31.400 | But if you just look at the scale,
00:05:33.520 | like Square has to take care of,
00:05:35.660 | is this fundamentally a software problem
00:05:38.720 | or a hardware problem?
00:05:39.760 | You mentioned hiring a bunch of people,
00:05:41.900 | but it's not, maybe from my perspective,
00:05:45.880 | not often talked about how incredible that is
00:05:48.920 | to sort of have a system that doesn't go down often,
00:05:52.160 | that is secure,
00:05:53.480 | is able to take care of all these transactions.
00:05:55.940 | Like maybe I'm also a hacker at heart
00:05:58.800 | and it's incredible to me
00:06:00.120 | that that kind of scale could be achieved.
00:06:02.200 | Is there some insight, some lessons,
00:06:06.380 | some interesting tidbits that you can say
00:06:09.560 | about how to make that scale happen?
00:06:12.280 | Is it the hardware fundamentally challenge?
00:06:14.240 | Is it a software challenge?
00:06:16.440 | Is it a social challenge
00:06:21.440 | of building large teams of engineers
00:06:24.360 | that work together, that kind of thing?
00:06:25.520 | Like what's the interesting challenges there?
00:06:28.560 | - By the way, you're the best-dressed hacker I've met.
00:06:31.040 | I think the-- - Thank you.
00:06:32.440 | (laughing)
00:06:34.120 | - Of the enumeration you just went through,
00:06:35.960 | I don't think there's one.
00:06:37.360 | You have to kind of focus on all
00:06:39.720 | and the ability to focus on all that
00:06:43.080 | really comes down to how you face problems
00:06:47.080 | and whether you can break them down
00:06:49.720 | into parts that you can focus on.
00:06:53.600 | Because I think the biggest mistake
00:06:55.200 | is trying to solve or address too many at once
00:07:00.200 | or not going deep enough with the questions
00:07:05.200 | or not being critical of the answers you find
00:07:08.660 | or not taking the time to form
00:07:12.840 | credible hypotheses that you can actually test
00:07:17.360 | and you can see the results of.
00:07:19.480 | So all of those fall in the face
00:07:23.200 | of ultimately critical thinking skills,
00:07:26.600 | problem solving skills.
00:07:27.840 | And if there's one skill I wanna improve every day,
00:07:30.680 | it's that.
00:07:31.520 | That's what contributes to learning
00:07:34.920 | and the only way we can evolve any of these things
00:07:38.520 | is learning what it's currently doing
00:07:41.040 | and how to take it to the next step.
00:07:44.320 | - And questioning assumptions,
00:07:45.560 | the first principles kind of thinking.
00:07:47.240 | Seems like fundamental to this whole process.
00:07:50.640 | - Yeah, but if you get too overextended
00:07:52.960 | into well, this is a hardware issue,
00:07:55.000 | you miss all the software solutions.
00:07:56.960 | And vice versa, if you focus too much on the software,
00:08:01.960 | there are hardware solutions that can 10x the thing.
00:08:06.700 | So I try to resist the categories of thinking
00:08:11.700 | and look for the underlying systems
00:08:16.080 | that make all these things work.
00:08:18.080 | But those only emerge when you have a skill
00:08:22.560 | around creative thinking, problem solving,
00:08:27.160 | and being able to ask critical questions
00:08:32.160 | and having the patience to go deep.
00:08:36.440 | - So one of the amazing things,
00:08:38.480 | if we look at the mission of Square,
00:08:40.560 | is to increase people's access to the economy.
00:08:45.000 | Maybe you can correct me if I'm wrong,
00:08:46.840 | that's from my perspective.
00:08:47.900 | So from the perspective of merchants,
00:08:49.560 | peer-to-peer payments, even crypto, cryptocurrency,
00:08:52.760 | digital cryptocurrency, what do you see as the major ways
00:08:56.120 | our society can increase participation in the economy?
00:08:59.420 | So if we look at today and the next 10 years,
00:09:01.400 | next 20 years, you go into Africa,
00:09:03.480 | maybe in Africa and all kinds of other places
00:09:06.080 | outside of North America.
00:09:07.540 | - If there was one word that I think represents
00:09:13.960 | what we're trying to do at Square, it is that word access.
00:09:17.160 | One of the things we found is that
00:09:21.800 | we weren't expecting this at all.
00:09:23.480 | When we started, we thought we were just building
00:09:25.360 | a piece of hardware to enable people
00:09:29.120 | to plug it into their phone and swipe a credit card.
00:09:32.080 | And then as we talked with people
00:09:34.400 | who actually tried to accept credit cards in the past,
00:09:37.680 | we found a consistent theme,
00:09:39.640 | which many of them weren't even enabled,
00:09:42.560 | not enabled, but allowed to process credit cards.
00:09:46.600 | And we dug a little bit deeper, again, asking that question,
00:09:50.640 | and we found that a lot of them would go to banks
00:09:54.360 | or these merchant acquirers,
00:09:57.280 | and waiting for them was a credit check
00:10:01.040 | and looking at a FICO score.
00:10:03.280 | And many of the businesses that we talked to,
00:10:07.720 | and many small businesses,
00:10:09.600 | they don't have good credit or a credit history.
00:10:13.600 | They're entrepreneurs who are just getting started,
00:10:17.800 | taking a lot of personal risk, financial risk.
00:10:20.160 | And it just felt ridiculous to us that
00:10:24.480 | for the job of being able to accept money from people,
00:10:32.040 | you had to get your credit checked.
00:10:33.720 | And as we dug deeper, we realized that
00:10:36.520 | that wasn't the intention of the financial industry,
00:10:38.600 | but it's the only tool they had available to them
00:10:42.240 | to understand authenticity, intent,
00:10:45.560 | predictor of future behavior.
00:10:49.160 | So that's the first thing we actually looked at.
00:10:50.880 | And that's where the, you know, we built the hardware,
00:10:52.640 | but the software really came in in terms of risk modeling.
00:10:57.520 | And that's when we started down the path
00:11:00.760 | that eventually leads to AI.
00:11:03.080 | We started with a very strong data science discipline
00:11:07.680 | because we knew that our business was not necessarily
00:11:12.200 | about making hardware.
00:11:13.880 | It was more about enabling more people
00:11:17.120 | to come into the system.
00:11:18.680 | - So the fundamental challenge there is,
00:11:21.280 | so to enable more people to come into the system,
00:11:23.480 | you have to lower the barrier of checking
00:11:26.800 | that that person will be a legitimate vendor.
00:11:30.560 | Is that the fundamental problem here?
00:11:31.800 | - Yeah, and a different mindset.
00:11:34.000 | I think a lot of the financial industry had a mindset
00:11:36.400 | of kind of distrust and just constantly looking
00:11:41.400 | for opportunities to prove why people shouldn't get
00:11:51.160 | into the system.
00:11:53.000 | Whereas we took on a mindset of trust and then verify,
00:11:56.080 | verify, verify, verify, verify.
00:11:58.840 | So we moved, when we entered the space,
00:12:03.840 | only about 30 to 40% of the people who applied
00:12:08.240 | to accept credit cards would actually get through the system.
00:12:10.840 | We took that number to 99%.
00:12:13.160 | And that's because we reframed the problem.
00:12:16.640 | We built credible models.
00:12:20.160 | And we had this mindset of, we're going to watch
00:12:24.520 | not at the merchant level, but we're going to watch
00:12:26.800 | at the transaction level.
00:12:28.520 | So come in, perform some transactions.
00:12:32.680 | And as long as you're doing things that feel
00:12:36.320 | high integrity, credible, and don't look suspicious,
00:12:40.200 | we'll continue to serve you.
00:12:43.080 | If we see any interestingness in how you use our system,
00:12:47.360 | that will be bubbled up to people to review,
00:12:50.560 | to figure out if there's something nefarious going on,
00:12:53.200 | and that's when we might ask you to leave.
00:12:55.800 | So the change in the mindset led to the technology
00:13:00.800 | that we needed to enable more people to get through
00:13:06.520 | and to enable more people to access the system.
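To make the "trust, then verify at the transaction level" idea concrete, here is a minimal, hypothetical sketch of letting everyone in by default and flagging only unusual activity for human review. All names, features, and thresholds here are invented for illustration; Square's actual risk models are learned from data rather than hand-written rules like these.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    merchant_id: str
    amount: float          # transaction amount in dollars
    hour_of_day: int       # 0-23, local time of the charge
    card_present: bool     # True if the card was physically swiped

def risk_score(tx: Transaction, typical_amount: float) -> float:
    """Return a rough 0-1 score; higher means more worth a second look."""
    score = 0.0
    if typical_amount > 0 and tx.amount > 10 * typical_amount:
        score += 0.5                     # far outside the merchant's usual ticket size
    if tx.hour_of_day < 5:
        score += 0.2                     # unusual hour for most storefronts
    if not tx.card_present:
        score += 0.2                     # card-not-present carries more fraud risk
    return min(score, 1.0)

def review_queue(transactions, typical_amounts, threshold=0.6):
    """Serve everyone by default; bubble up only suspicious transactions."""
    flagged = []
    for tx in transactions:
        typical = typical_amounts.get(tx.merchant_id, 0.0)
        if risk_score(tx, typical) >= threshold:
            flagged.append(tx)           # a human reviewer decides what happens next
    return flagged

if __name__ == "__main__":
    history = {"coffee_cart_42": 6.50}
    txs = [
        Transaction("coffee_cart_42", 5.75, 9, True),    # ordinary sale, never escalated
        Transaction("coffee_cart_42", 900.0, 3, False),  # large, late, card-not-present
    ]
    print(review_queue(txs, history))    # only the second transaction is flagged
```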
00:13:09.000 | - What role does machine learning play into that,
00:13:11.840 | in that context of, you said, first of all,
00:13:15.760 | that's a beautiful shift.
00:13:17.640 | Anytime you shift your viewpoint into seeing
00:13:20.560 | that people are fundamentally good,
00:13:24.240 | and then you just have to verify and catch the ones
00:13:26.840 | who are not, as opposed to assuming everybody's bad,
00:13:30.480 | this is a beautiful thing.
00:13:31.560 | So what role does the, to you,
00:13:35.000 | throughout the history of the company,
00:13:37.320 | has machine learning played in doing that verification?
00:13:40.480 | - It was immediate.
00:13:41.800 | I mean, we weren't calling it machine learning,
00:13:43.400 | but it was data science.
00:13:44.760 | And then as the industry evolved,
00:13:47.600 | machine learning became more of the nomenclature,
00:13:50.000 | and as that evolved, it became more sophisticated
00:13:54.840 | with deep learning, and as that continues to evolve,
00:13:58.760 | it'll be another thing.
00:13:59.960 | But they're all in the same vein.
00:14:02.200 | But we built that discipline up
00:14:04.400 | within the first year of the company,
00:14:06.960 | because we also had, we had to partner with a bank,
00:14:11.960 | we had to partner with Visa and Mastercard,
00:14:14.360 | and we had to show that by bringing more people
00:14:17.800 | into the system, that we could do so in a responsible way,
00:14:21.440 | that would not compromise their systems,
00:14:23.680 | and that they would trust us.
00:14:25.280 | - How do you convince them that this upstart company
00:14:27.800 | with some cool machine learning tricks
00:14:30.600 | is able to deliver on this sort of trustworthy
00:14:34.440 | set of merchants?
00:14:35.720 | - We staged it out in tiers.
00:14:37.920 | We had a bucket of 500 people using it,
00:14:41.320 | and then we showed results, and then 1,000,
00:14:43.960 | and then 10,000, then 50,000,
00:14:45.720 | and then the constraint was lifted.
00:14:49.920 | So again, it's kind of getting something tangible out there.
00:14:54.160 | I wanna show what we can do rather than talk about it.
00:14:57.160 | And that put a lot of pressure on us to do the right things.
00:15:01.600 | And it also created a culture of accountability,
00:15:07.320 | of a little bit more transparency,
00:15:10.760 | and I think incentivized all of our early folks
00:15:15.760 | and the company in the right way.
00:15:18.120 | - So what does the future look like
00:15:19.880 | in terms of increasing people's access?
00:15:21.920 | Or if you look at IoT, Internet of Things,
00:15:25.440 | there's more and more intelligent devices.
00:15:27.240 | You can see there's some people even talking about
00:15:30.280 | our personal data as a thing that we could monetize
00:15:33.720 | more explicitly versus implicitly.
00:15:35.920 | Sort of everything can become part of the economy.
00:15:38.480 | Do you see, so what does the future of Square look like
00:15:41.400 | in sort of giving people access in all kinds of ways
00:15:45.600 | to being part of the economy as merchants and as consumers?
00:15:49.240 | - I believe that the currency we use
00:15:52.040 | is a huge part of the answer,
00:15:55.720 | and I believe that the internet deserves
00:15:58.960 | and requires a native currency,
00:16:01.760 | and that's why I'm such a huge believer in Bitcoin,
00:16:07.240 | because it just, our biggest problem as a company right now
00:16:12.240 | is we cannot act like an internet company.
00:16:16.360 | Open a new market, we have to have a partnership
00:16:18.600 | with a local bank.
00:16:20.040 | We have to pay attention to different
00:16:21.960 | regulatory onboarding environments.
00:16:24.380 | And a digital currency like Bitcoin
00:16:30.000 | takes a bunch of that away, where we can potentially
00:16:33.040 | launch a product in every single market around the world.
00:16:37.960 | Because they're all using the same currency,
00:16:41.160 | and we have consistent understanding of regulation
00:16:46.160 | and onboarding and what that means.
00:16:49.520 | So I think the internet continuing to be accessible
00:16:54.400 | to people is number one,
00:16:57.440 | and then I think currency is number two.
00:17:00.280 | And it will just allow for a lot more innovation,
00:17:04.280 | a lot more speed in terms of what we can build
00:17:07.760 | and others can build.
00:17:09.120 | And it's just really exciting.
00:17:11.820 | So I mean, I want to be able to see that
00:17:13.720 | and feel that in my lifetime.
00:17:16.240 | - So in this aspect and in other aspects,
00:17:19.000 | you have a deep interest in cryptocurrency
00:17:22.040 | and distributed ledger tech in general.
00:17:24.040 | I talked to Vitalik Buterin yesterday on this podcast.
00:17:27.800 | He says hi, by the way.
00:17:29.080 | - Hey.
00:17:29.920 | (laughing)
00:17:32.200 | - He's a brilliant, brilliant person,
00:17:33.960 | talking a lot about Bitcoin and Ethereum, of course.
00:17:36.160 | So can you maybe linger on this point?
00:17:38.560 | What do you find appealing about Bitcoin,
00:17:42.400 | about digital currency?
00:17:43.720 | Where do you see it going in the next 10, 20 years?
00:17:46.960 | And what are some of the challenges
00:17:49.440 | with respect to Square, but also just bigger
00:17:51.760 | for globally, for our world, for the way we think about money?
00:17:56.760 | - I think the most beautiful thing about it
00:18:01.560 | is there's no one person setting the direction.
00:18:04.200 | And there's no one person on the other side
00:18:07.800 | that can stop it.
00:18:08.640 | So we have something that is pretty organic in nature
00:18:15.040 | and very principled in its original design.
00:18:18.420 | And I think the Bitcoin white paper
00:18:22.400 | is one of the most seminal works of computer science
00:18:24.640 | in the last 20, 30 years.
00:18:27.240 | It's poetry.
00:18:30.040 | I mean, it really is.
00:18:30.880 | - And the technology, I mean,
00:18:32.140 | that's not often talked about.
00:18:34.440 | There's so much hype around digital currency,
00:18:36.920 | about the financial impacts of it,
00:18:38.560 | but the actual technology is quite beautiful
00:18:40.480 | from a computer science perspective.
00:18:42.160 | - Yeah, and the underlying principles behind it
00:18:44.280 | that went into it, even to the point
00:18:45.960 | of releasing it under a pseudonym.
00:18:47.680 | I think that's a very, very powerful statement.
00:18:51.280 | The timing of when it was released is powerful.
00:18:54.160 | It was a total activist move.
00:18:58.140 | I mean, it's moving the world forward,
00:19:00.540 | and in a way that I think is extremely noble
00:19:05.540 | and honorable and enables everyone to be part of the story,
00:19:10.760 | which is also really cool.
00:19:12.200 | So you asked a question around 10 years and 20 years.
00:19:15.600 | I mean, I think the amazing thing is no one knows.
00:19:18.500 | And it can emerge, and every person
00:19:21.980 | that comes into the ecosystem,
00:19:23.920 | whether they be a developer or someone who uses it,
00:19:29.440 | can change its direction in small and large ways.
00:19:32.500 | And that's what I think it should be,
00:19:35.020 | 'cause that's what the internet has shown is possible.
00:19:38.060 | Now, there's complications with that, of course,
00:19:40.280 | and there's certainly companies that own large parts
00:19:44.440 | of the internet and can direct it more than others,
00:19:46.260 | and there's not equal access to every single person
00:19:50.980 | in the world just yet, but all those problems
00:19:54.300 | are visible enough to speak about them,
00:19:56.560 | and to me, that gives confidence that they're solvable
00:19:59.460 | in a relatively short time frame.
00:20:02.660 | I think the world changes a lot as we get these satellites
00:20:06.860 | projecting the internet down to Earth,
00:20:10.620 | 'cause it just removes a bunch of the former constraints
00:20:14.260 | and really levels the playing field.
00:20:18.180 | But a global currency, which a native currency
00:20:22.060 | for the internet is a proxy for, is a very powerful concept,
00:20:26.940 | and I don't think any one person on this planet
00:20:29.000 | truly understands the ramifications of that.
00:20:31.460 | I think there's a lot of positives to it.
00:20:34.040 | There's some negatives as well.
00:20:35.800 | - Do you think it's possible, sorry to interrupt,
00:20:37.400 | do you think it's possible that this kind
00:20:39.080 | of digital currency would redefine the nature of money,
00:20:43.360 | so become the main currency of the world,
00:20:46.320 | as opposed to being tied to fiat currency
00:20:48.960 | of different nations and sort of really push
00:20:51.880 | the decentralization of control of money?
00:20:54.780 | - Definitely, but I think the bigger ramification
00:20:58.060 | is how it affects how society works,
00:21:02.220 | and I think there are many positive ramifications--
00:21:06.220 | - Outside of just money?
00:21:07.580 | - Outside of just money.
00:21:09.180 | Money is a foundational layer that enables so much more.
00:21:12.100 | I was meeting with an entrepreneur in Ethiopia,
00:21:14.880 | and payments is probably the number one problem
00:21:19.140 | to solve across the continent.
00:21:21.820 | Both in terms of moving money across borders
00:21:24.640 | between nations on the continent,
00:21:26.640 | or the amount of corruption within the current system.
00:21:31.640 | But the lack of easy ways to pay people
00:21:37.820 | makes starting anything really difficult.
00:21:42.400 | I met an entrepreneur who started the Lyft/Uber of Ethiopia,
00:21:47.400 | and one of the biggest problems she has
00:21:49.800 | is that it's not easy for her riders to pay the company,
00:21:54.720 | and it's not easy for her to pay the drivers.
00:21:57.760 | And that definitely has stunted her growth
00:22:00.760 | and made everything more challenging.
00:22:02.360 | So the fact that she even has to think about payments
00:22:07.120 | instead of thinking about the best rider experience
00:22:10.240 | and the best driver experience is pretty telling.
00:22:15.020 | So I think as we get a more durable, resilient,
00:22:20.020 | and global standard, we see a lot more innovation everywhere.
00:22:25.700 | And I think there's no better case study for this
00:22:29.060 | than the various countries within Africa
00:22:32.060 | and their entrepreneurs who are trying to start things
00:22:35.100 | within health or sustainability or transportation
00:22:38.780 | or a lot of the companies that we've seen here.
00:22:42.500 | So the majority of companies I met in November
00:22:47.280 | when I spent a month on the continent
00:22:49.100 | were payments-oriented.
00:22:51.260 | - You mentioned, and this is a small tangent,
00:22:54.300 | you mentioned the anonymous launch of Bitcoin
00:22:58.340 | is a sort of profound philosophical statement.
00:23:00.740 | - Pseudonymous.
00:23:02.060 | - What's that even mean?
00:23:03.660 | There's a pseudonym.
00:23:04.500 | First of all, let me ask-- - Well, there's an identity
00:23:06.020 | tied to it.
00:23:06.840 | It's not just anonymous.
00:23:08.260 | It's Nakamoto. - I see.
00:23:10.080 | - So a Nakamoto might represent one person
00:23:11.960 | or multiple people.
00:23:13.420 | But-- - Let me ask,
00:23:14.580 | are you Satoshi Nakamoto?
00:23:15.860 | Just checking.
00:23:16.700 | - No. - Catch you up.
00:23:17.540 | - And if I were, would I tell you?
00:23:18.500 | - Yes, sure.
00:23:19.620 | - But-- - Maybe you slip.
00:23:21.100 | - A pseudonym is constructed identity.
00:23:25.080 | Anonymity is just kind of this random,
00:23:28.020 | like drop something off and leave.
00:23:32.040 | There's no intention to build an identity around it.
00:23:34.460 | And while the identity being built was a short time window,
00:23:41.240 | it was meant to stick around, I think, and to be known.
00:23:46.240 | And it's being honored in how the community thinks
00:23:51.540 | about building it, like the concept of satoshis,
00:23:55.100 | for instance, is one such example.
00:23:58.600 | But I think it was smart not to do it anonymously,
00:24:03.100 | not to do it as a real identity,
00:24:05.420 | but to do it as a pseudonym, because I think it builds
00:24:08.660 | tangibility and a little bit of empathy
00:24:11.420 | that this was a human or a set of humans behind it.
00:24:14.660 | And there's this natural identity that I can imagine.
00:24:19.660 | - But there is also a sacrifice of ego.
00:24:22.420 | That's a pretty powerful thing from your perspective.
00:24:24.260 | - Yeah, which is beautiful, yeah.
00:24:25.380 | - Would you do, sort of philosophically,
00:24:28.740 | to ask you the question, would you do
00:24:31.080 | all the same things you're doing now
00:24:32.920 | if your name wasn't attached to it?
00:24:35.020 | Sort of if you had to sacrifice the ego.
00:24:39.340 | Put another way, is your ego deeply tied
00:24:41.980 | in the decisions you've been making?
00:24:43.880 | - I hope not.
00:24:45.780 | I mean, I believe I would certainly attempt
00:24:49.260 | to do the things without my name having
00:24:51.460 | to be attached with it.
00:24:53.420 | But it's hard to do that in a corporation.
00:24:58.240 | Legally, that's the issue.
00:25:02.620 | If I were to do more open source things,
00:25:05.220 | then absolutely, I don't need
00:25:08.700 | my particular identity, my real identity, associated with it,
00:25:12.620 | but I think the appreciation that comes
00:25:17.620 | from doing something good and being able to see it
00:25:21.020 | and see people use it is pretty overwhelming and powerful.
00:25:25.860 | More so than maybe seeing your name in the headlines.
00:25:31.300 | - Let's talk about artificial intelligence
00:25:32.820 | a little bit, if we could.
00:25:34.860 | 70 years ago, Alan Turing formulated the Turing test.
00:25:38.020 | To me, natural language is one of the most interesting
00:25:41.220 | spaces of problems that are tackled
00:25:44.420 | by artificial intelligence.
00:25:45.580 | It's the canonical problem of what it means
00:25:47.420 | to be intelligent.
00:25:48.620 | He formulated the Turing test.
00:25:50.700 | Let me ask sort of the broad question.
00:25:53.340 | How hard do you think is it to pass the Turing test
00:25:56.420 | in the space of language?
00:25:58.620 | - Just from a very practical standpoint,
00:26:00.260 | I think where we are now and for at least years out
00:26:05.260 | is one where the artificial intelligence,
00:26:11.620 | machine learning, the deep learning models
00:26:13.820 | can bubble up interestingness very, very quickly
00:26:17.340 | and pair that with human discretion around severity,
00:26:22.340 | around depth, around nuance and meaning.
00:26:30.740 | I think for me, the chasm to cross for general intelligence
00:26:35.740 | is to be able to explain why and the meaning
00:26:41.060 | behind something.
00:26:43.180 | - Behind a decision?
00:26:44.820 | So being able to-- - Behind a decision
00:26:46.060 | or a set of data. - Tell a story.
00:26:47.820 | - Set sets of data.
00:26:48.980 | - So the explainability part is kind of essential
00:26:51.620 | to be able to explain using natural language
00:26:54.260 | why the decisions were made, that kind of thing.
00:26:56.540 | - Yeah, I mean, I think that's one of our biggest risks
00:26:58.820 | in artificial intelligence going forward
00:27:01.180 | is we are building a lot of black boxes
00:27:03.460 | that can't necessarily explain why they made a decision
00:27:06.780 | or what criteria they used to make the decision.
00:27:09.380 | And we're trusting them more and more
00:27:11.100 | from lending decisions to content recommendation
00:27:14.660 | to driving to health.
00:27:18.060 | Like a lot of us have watches that tell us when to stand.
00:27:22.220 | How is it deciding that?
00:27:23.100 | I mean, that one's pretty simple.
00:27:25.340 | But you can imagine how complex they get
00:27:28.220 | and being able to explain the reasoning
00:27:31.620 | behind some of those recommendations
00:27:33.260 | seems to be an essential part.
00:27:34.940 | Although it's hard-- - Which is a very hard problem
00:27:36.460 | because sometimes even we can't explain
00:27:38.740 | why we make decisions.
00:27:39.740 | - So that's what I was, I think we're being sometimes
00:27:42.700 | a little bit unfair to artificial intelligence systems
00:27:45.700 | 'cause we're not very good at some of these things.
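As a small aside, one simple form of the explainability discussed here can be sketched with an invented linear scoring model: each prediction decomposes into per-feature contributions, so the "why" can be reported alongside the decision. Deep models generally need dedicated attribution methods; this only illustrates the idea on the simplest possible case, with made-up feature names and weights.

```python
# Hypothetical weights for a toy linear risk/lending score.
weights = {"income": 0.4, "years_at_address": 0.2, "late_payments": -0.9}

def score_with_explanation(features):
    """Return a score plus each feature's signed contribution to it."""
    contributions = {
        name: weights[name] * value for name, value in features.items()
    }
    return sum(contributions.values()), contributions

if __name__ == "__main__":
    applicant = {"income": 1.2, "years_at_address": 3.0, "late_payments": 2.0}
    score, why = score_with_explanation(applicant)
    print(f"score = {score:.2f}")
    # Sort contributions so the largest drivers of the decision come first.
    for name, c in sorted(why.items(), key=lambda kv: -abs(kv[1])):
        print(f"  {name}: {c:+.2f}")
```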
00:27:48.260 | Do you think, I apologize for the ridiculous
00:27:52.580 | romanticized question, but on that line of thought,
00:27:55.820 | do you think we'll ever be able to build a system
00:27:58.820 | like in the movie "Her" that you could fall in love with?
00:28:03.180 | So have that kind of deep connection with.
00:28:05.420 | - Hasn't that already happened?
00:28:07.420 | Hasn't someone in Japan fallen in love with his AI?
00:28:12.000 | - There's always going to be somebody
00:28:14.820 | that does that kind of thing.
00:28:15.980 | I mean, at a much larger scale
00:28:18.420 | of actually building relationships,
00:28:20.100 | of being deeper connections.
00:28:21.620 | It doesn't have to be love,
00:28:22.620 | but it's just deeper connections
00:28:24.100 | with artificial intelligence systems.
00:28:26.580 | So you mentioned explainability.
00:28:27.420 | - That's less a function of the artificial intelligence
00:28:29.780 | and more a function of the individual
00:28:32.220 | and how they find meaning and where they find meaning.
00:28:34.700 | - Do you think we humans can find meaning in technology?
00:28:37.380 | In this kind of way? - Yeah, yeah, yeah, 100%.
00:28:39.500 | 100%.
00:28:40.740 | And I don't necessarily think it's a negative,
00:28:43.220 | but it's constantly going to evolve.
00:28:50.180 | So I don't know, but meaning is something
00:28:54.460 | that's entirely subjective.
00:28:56.240 | And I don't think it's going to be a function
00:28:59.340 | of finding the magic algorithm
00:29:02.380 | that enables everyone to love it.
00:29:06.300 | But maybe, I don't know.
00:29:08.820 | - But that question really gets at the difference
00:29:10.780 | between human and machine.
00:29:12.700 | You had a little bit of an exchange with Elon Musk.
00:29:16.200 | Basically, I mean, it's the trivial version of that,
00:29:20.140 | but I think there's a more fundamental question
00:29:22.060 | of is it possible to tell the difference
00:29:24.220 | between a bot and a human?
00:29:27.260 | And do you think it's,
00:29:29.260 | if we look into the future, 10, 20 years out,
00:29:32.740 | do you think it would be possible
00:29:34.060 | or is it even necessary to tell the difference
00:29:36.340 | in the digital space between a human and a robot?
00:29:40.380 | Can we have fulfilling relationships with each
00:29:42.700 | or do we need to tell the difference between them?
00:29:45.180 | - I think it's certainly useful in certain problem domains
00:29:49.340 | to be able to tell the difference.
00:29:51.040 | I think in others it might not be as useful.
00:29:56.260 | - Do you think it's possible for us today
00:29:58.180 | to tell that difference?
00:30:00.060 | Is the reverse the meta of the Turing test?
00:30:02.860 | - Well, what's interesting is I think the technology
00:30:07.060 | to create is moving much faster
00:30:10.620 | than the technology to detect.
00:30:12.400 | - You think so?
00:30:14.340 | So if you look at adversarial machine learning,
00:30:17.220 | there's a lot of systems that try to fool
00:30:19.380 | machine learning systems.
00:30:21.220 | And at least for me, the hope is that the technology
00:30:23.900 | to defend will always be right there, at least.
00:30:28.900 | Your sense is that--
00:30:30.880 | - I don't know if they'll be right there.
00:30:31.980 | I mean, it's a race, right?
00:30:34.140 | So the detection technologies have to be
00:30:36.940 | two or 10 steps ahead of the creation technologies.
00:30:42.060 | This is a problem that I think the financial industry
00:30:44.180 | will face more and more because a lot of our
00:30:47.660 | risk models, for instance, are built around identity.
00:30:50.780 | Payments ultimately comes down to identity.
00:30:53.420 | And you can imagine a world where all this conversation
00:30:57.620 | around deep fakes goes towards a direction
00:31:00.940 | of driver's license or passports or state identities.
00:31:05.940 | And people construct identities in order to get through
00:31:10.740 | a system such as ours to start accepting credit cards
00:31:13.380 | or into the cash app.
00:31:15.940 | And those technologies seem to be moving very, very quickly.
00:31:19.700 | Our ability to detect them, I think,
00:31:22.820 | is probably lagging at this point.
00:31:25.080 | But certainly with more focus, we can get ahead of it.
00:31:29.800 | But this is gonna touch everything.
00:31:34.700 | So I think it's like security.
00:31:39.260 | We're never going to be able to build
00:31:40.900 | a perfect detection system.
00:31:43.620 | We're only going to be able to,
00:31:46.220 | you know, what we should be focused on
00:31:47.860 | is the speed of evolving it.
00:31:51.180 | And being able to take signals that show correctness
00:31:56.180 | or errors as quickly as possible and move
00:31:59.780 | and to be able to build that into our newer models
00:32:03.660 | or the self-learning models.
00:32:05.580 | - Do you have other worries, like some people,
00:32:07.380 | like Elon and others, have worries of existential threats
00:32:10.940 | of artificial intelligence,
00:32:12.220 | of artificial general intelligence?
00:32:14.260 | Or if you think more narrowly about threats and concerns
00:32:18.500 | about more narrow artificial intelligence,
00:32:20.900 | like what are your thoughts in this domain?
00:32:24.380 | Do you have concerns or are you more optimistic?
00:32:27.180 | - I think Yuval, in his book,
00:32:29.620 | "21 Lessons for the 21st Century,"
00:32:32.660 | his last chapter is around meditation.
00:32:35.460 | And you look at the title of the chapter
00:32:37.700 | and you're like, oh, it's all meditation.
00:32:40.260 | But what was interesting about that chapter
00:32:43.220 | is he believes that, you know,
00:32:46.820 | kids being born today, growing up today,
00:32:50.240 | Google has a stronger sense of their preferences
00:32:57.120 | than they do, which you can easily imagine.
00:33:01.680 | I can easily imagine today that Google probably knows
00:33:06.680 | my preferences more than my mother does.
00:33:10.460 | Maybe not me, per se, but for someone growing up,
00:33:15.140 | only knowing the internet,
00:33:16.060 | only knowing what Google is capable of,
00:33:19.060 | or Facebook or Twitter or Square or any of these things,
00:33:22.620 | the self-awareness is being offloaded to other systems,
00:33:28.980 | and particularly these algorithms.
00:33:32.060 | And his concern is that we lose that self-awareness
00:33:36.040 | because the self-awareness is now outside of us
00:33:39.020 | and it's doing such a better job
00:33:40.860 | at helping us direct our decisions around,
00:33:45.180 | should I stand, should I walk today,
00:33:47.460 | what doctor should I choose, who should I date?
00:33:50.340 | All these things we're now seeing play out very quickly.
00:33:54.200 | So he sees meditation as a tool to build that self-awareness
00:33:58.540 | and to bring the focus back on,
00:34:00.680 | why do I make these decisions?
00:34:02.260 | Why do I react in this way?
00:34:03.780 | Why did I have this thought?
00:34:06.500 | Where did that come from?
00:34:07.780 | That's a way to regain control.
00:34:09.420 | - Or awareness, maybe not control,
00:34:12.380 | but awareness so that you can be aware that,
00:34:15.540 | yes, I am offloading this decision to this algorithm
00:34:19.820 | that I don't fully understand
00:34:21.700 | and can't tell me why it's doing the things it's doing
00:34:24.620 | because it's so complex.
00:34:26.780 | - That's not to say that the algorithm
00:34:28.180 | can't be a good thing.
00:34:29.060 | And to me, recommender systems,
00:34:31.380 | the best of what they can do is to help guide you
00:34:34.940 | in a journey of learning new ideas, of learning, period.
00:34:39.740 | - It can be a great thing, but do you know you're doing that?
00:34:41.900 | Are you aware that you're inviting it to do that to you?
00:34:45.560 | I think that's the risk he identifies.
00:34:48.080 | That's perfectly okay, but are you aware
00:34:53.420 | that you have that invitation and it's being acted upon?
00:34:58.420 | - And so that's a concern.
00:35:01.540 | You're kind of highlighting that
00:35:03.420 | without a lack of awareness,
00:35:04.700 | you can just be floating at sea.
00:35:06.340 | So awareness is key in the future
00:35:08.940 | of these artificial intelligence systems.
00:35:10.900 | - Yeah, the movie "Wall-E,"
00:35:12.820 | which I think is one of Pixar's best movies
00:35:15.380 | besides "Ratatouille."
00:35:17.180 | "Ratatouille" was incredible.
00:35:20.680 | - You had me until "Ratatouille," okay.
00:35:22.260 | - "Ratatouille" was incredible.
00:35:23.820 | - All right, we've come to the first point
00:35:28.180 | where we disagree.
00:35:29.020 | - It's the entrepreneurial story in the form of a rat.
00:35:32.980 | - Hmm.
00:35:33.820 | I just remember just the soundtrack was really good.
00:35:38.660 | - Excellent.
00:35:39.500 | - What are your thoughts,
00:35:42.300 | sticking on artificial intelligence a little bit,
00:35:44.400 | about the displacement of jobs?
00:35:46.020 | That's another perspective that candidates
00:35:48.260 | like Andrew Yang talk about.
00:35:50.300 | - Yang gang forever.
00:35:51.300 | - Yang gang.
00:35:54.140 | So he unfortunately, speaking of Yang gang,
00:35:56.340 | has recently dropped out.
00:35:57.580 | - I know, it was very disappointing and depressing.
00:36:00.220 | - Yeah, but on the positive side,
00:36:02.220 | he's, I think, launching a podcast.
00:36:05.340 | - Really, cool.
00:36:06.220 | - Yeah, he just announced that.
00:36:07.980 | I'm sure he'll try to talk you
00:36:09.460 | into trying to come onto the podcast.
00:36:10.780 | - I would love to.
00:36:12.100 | - So-- - What about "Ratatouille"?
00:36:14.620 | - Yeah, maybe he'll be more welcoming
00:36:16.100 | of the "Ratatouille" argument.
00:36:18.220 | What are your thoughts on his concerns
00:36:20.300 | of the displacement of jobs, of automation,
00:36:22.100 | instead of the, of course, there's positive impacts
00:36:24.980 | that could come from automation and AI,
00:36:26.700 | but there could also be negative impacts.
00:36:29.340 | And within that framework,
00:36:30.820 | what are your thoughts about universal basic income?
00:36:33.420 | So these interesting new ideas
00:36:36.060 | of how we can empower people in the economy.
00:36:39.380 | - I think he was 100% right on almost every dimension.
00:36:45.420 | We see this in Square's business.
00:36:48.700 | I mean, he identified truck drivers, I'm from Missouri,
00:36:53.140 | and he certainly pointed to the,
00:36:59.420 | concern and the issue that people from where I'm from
00:37:04.420 | feel every single day that is often invisible
00:37:07.780 | and not talked about enough.
00:37:09.580 | You know, the next big one is cashiers.
00:37:12.100 | This is where it pertains to Square's business.
00:37:14.460 | We are seeing more and more of the point of sale
00:37:19.780 | move to the individual customer's hand
00:37:23.020 | in the form of their phone and apps
00:37:24.580 | and pre-order and order ahead.
00:37:27.340 | We're seeing more kiosks,
00:37:29.380 | we're seeing more things like Amazon Go,
00:37:32.380 | and the number of workers in,
00:37:35.860 | as a cashier in retail is immense.
00:37:40.380 | And, you know, there's no real answers
00:37:43.660 | on how they transform their skills
00:37:47.980 | and work into something else.
00:37:51.180 | And I think that does lead to a lot
00:37:53.140 | of really negative ramifications.
00:37:56.180 | And the important point that he brought up
00:37:59.020 | around universal basic income
00:38:01.460 | is given that this shift is going to come,
00:38:03.660 | and given it is going to take time
00:38:07.540 | to set people up with new skills and new careers,
00:38:12.540 | they need to have a floor to be able to survive.
00:38:17.700 | And this $1,000 a month is such a floor.
00:38:22.620 | It's not going to incentivize you to quit your job
00:38:25.100 | because it's not enough.
00:38:26.620 | But it will enable you to not have to worry
00:38:30.220 | as much about just getting on day to day
00:38:34.980 | so that you can focus on what am I going to do now
00:38:39.860 | and what skills do I need to acquire?
00:38:44.860 | And I think a lot of people point to
00:38:49.340 | the fact that during the industrial age,
00:38:53.260 | we had the same concerns around automation,
00:38:55.780 | factory lines, and everything worked out okay.
00:38:59.220 | But the biggest change is just the velocity
00:39:04.220 | and the centralization of a lot of the things
00:39:08.060 | that make this work, which is the data
00:39:10.900 | and the algorithms that work on this data.
00:39:14.900 | I think the second biggest scary thing
00:39:18.900 | around AI is just
00:39:22.060 | who actually owns the data and who can operate on it.
00:39:26.300 | And are we able to share the insights from the data
00:39:31.300 | so that we can also build algorithms
00:39:35.420 | that help our needs or help our business or whatnot?
00:39:39.140 | So that's where I think regulation could play
00:39:43.500 | a strong and positive part.
00:39:45.320 | First, looking at the primitives of AI
00:39:50.460 | and the tools we use to build these services
00:39:52.660 | that will ultimately touch every single aspect
00:39:54.820 | of the human experience.
00:39:56.540 | And then where data is owned and how it's shared.
00:40:01.540 | So those are the answers that as a society, as a world,
00:40:10.540 | we need to have better answers around,
00:40:12.460 | which we're currently not.
00:40:13.860 | They're just way too centralized
00:40:15.380 | into a few very, very large companies.
00:40:19.580 | But I think it was spot on with identifying the problem
00:40:23.260 | and proposing solutions that would actually work.
00:40:25.840 | At least that we'd learn from that
00:40:28.340 | you could expand or evolve.
00:40:31.340 | But I mean, I think UBI is well past its due.
00:40:36.340 | I mean, it was certainly trumpeted by Martin Luther King
00:40:41.620 | and even before him as well.
00:40:43.820 | - And like you said, the exact thousand dollar mark
00:40:48.060 | might not be the correct one,
00:40:50.260 | but you should take the steps to try
00:40:52.100 | to implement these solutions and see what works.
00:40:56.260 | - 100%.
00:40:57.500 | - So I think you and I eat similar diets.
00:40:59.380 | And at least I was--
00:41:01.780 | - First time I've heard this.
00:41:03.220 | - Yeah, so I was doing it--
00:41:05.420 | - First time anyone has said that to me in this case.
00:41:08.620 | - Yeah, but it's becoming more and more cool.
00:41:11.060 | But I was doing it before it was cool.
00:41:13.700 | So the intermittent fasting and fasting in general,
00:41:16.300 | I really enjoy.
00:41:17.420 | I love food, but I enjoy the,
00:41:21.420 | I also love suffering 'cause I'm Russian.
00:41:23.100 | So fasting kind of makes you appreciate the,
00:41:26.400 | makes you appreciate what it is to be human somehow.
00:41:32.460 | But I have, outside the philosophical stuff,
00:41:36.700 | I have a more specific question.
00:41:37.940 | It also helps me as a programmer and a deep thinker,
00:41:42.020 | like from the scientific perspective
00:41:43.980 | to sit there for many hours and focus deeply.
00:41:46.940 | Maybe you were a hacker before you were CEO.
00:41:50.900 | What have you learned about diet, lifestyle,
00:41:55.060 | mindset that helps you maximize mental performance
00:41:57.940 | to be able to focus for, to think deeply
00:42:01.740 | in this world of distractions?
00:42:04.020 | - I think I just took it for granted for too long.
00:42:07.240 | - Which aspect?
00:42:09.660 | - Just the social structure of we eat three meals a day
00:42:13.500 | and there's snacks in between.
00:42:15.180 | And I just never really asked the question why.
00:42:19.060 | - Oh, by the way, in case people don't know,
00:42:20.860 | I think a lot of people know,
00:42:22.180 | but you at least, you famously eat once a day.
00:42:25.980 | - Yeah. - You still eat once a day?
00:42:27.700 | - Yep, I eat dinner.
00:42:29.700 | - By the way, what made you decide to eat once a day?
00:42:32.100 | Like, 'cause to me that was a huge revolution
00:42:34.060 | that you don't have to eat breakfast.
00:42:35.500 | That was like, I felt like I was a rebel.
00:42:37.420 | Like I abandoned my parents or something
00:42:39.980 | and became an anarchist.
00:42:41.180 | - When you first, like the first week you start doing it,
00:42:43.580 | it feels like you kind of like have a superpower.
00:42:45.340 | - Yeah.
00:42:46.180 | - And you realize it's not really a superpower.
00:42:47.660 | But I think you realize, at least I realize,
00:42:50.860 | like just how much our mind dictates what we're possible of.
00:42:55.860 | And sometimes we have structures around us
00:43:02.260 | that incentivize like, you know, this three meal a day thing
00:43:05.340 | which was purely social structure
00:43:09.700 | versus necessity for our health and for our bodies.
00:43:14.180 | And I did it just, I started doing it
00:43:17.500 | because I played a lot with my diet when I was a kid
00:43:21.540 | and I was vegan for two years
00:43:23.340 | and just went all over the place.
00:43:25.900 | Just because I, you know, health is the most precious thing
00:43:30.560 | we have and none of us really understand it.
00:43:33.900 | So being able to ask the question through experiments
00:43:37.980 | that I can perform on myself and learn about
00:43:41.860 | is compelling to me.
00:43:44.180 | And I heard this one guy on a podcast, Wim Hof,
00:43:47.660 | who's famous for doing ice baths and holding his breath
00:43:50.940 | and all these things.
00:43:53.580 | He said he only eats one meal a day.
00:43:56.620 | I'm like, wow, that sounds super challenging
00:43:59.220 | and uncomfortable.
00:44:00.740 | I'm gonna do it.
00:44:02.020 | So I just, I learn the most when I make myself,
00:44:06.300 | I wouldn't say suffer,
00:44:07.460 | but when I make myself feel uncomfortable.
00:44:10.420 | Because everything comes to bear in those moments.
00:44:14.140 | And you really learn what you're about or what you're not.
00:44:19.140 | So I've been doing that my whole life.
00:44:23.920 | Like when I was a kid, I could not speak.
00:44:27.600 | Like I had to go to a speech therapist
00:44:29.140 | and it made me extremely shy.
00:44:31.940 | And then one day I realized I can't keep doing this
00:44:34.820 | and I signed up for the speech club.
00:44:39.700 | And it was the most uncomfortable thing
00:44:44.700 | I could imagine doing.
00:44:46.260 | Getting a topic on a note card,
00:44:49.500 | having five minutes to write a speech
00:44:51.420 | about whatever that topic is,
00:44:53.540 | not being able to use the note card while speaking
00:44:56.580 | and speaking for five minutes about that topic.
00:45:00.860 | But it just, it puts so much,
00:45:03.900 | it gave me so much perspective
00:45:06.220 | around the power of communication,
00:45:08.480 | around my own deficiencies
00:45:09.980 | and around if I set my mind to do something, I'll do it.
00:45:14.140 | So it gave me a lot more confidence.
00:45:16.420 | So I see fasting in the same light.
00:45:18.900 | This is something that was interesting,
00:45:21.460 | challenging, uncomfortable,
00:45:23.840 | and has given me so much learning and benefit as a result.
00:45:30.200 | And it will lead to other things
00:45:31.460 | that I'll experiment with and play with.
00:45:34.120 | But yeah, it does feel a little bit
00:45:36.860 | like a superpower sometimes.
00:45:38.780 | The most boring superpower one can imagine.
00:45:42.700 | - No, it's quite incredible.
00:45:44.060 | The clarity of mind is pretty interesting.
00:45:46.800 | Speaking of suffering,
00:45:49.700 | you kind of talk about facing difficult ideas.
00:45:53.060 | You meditate, you think about the broad context of life
00:45:58.940 | of our society.
00:46:00.300 | Let me ask, I apologize again for the romanticized question,
00:46:03.680 | but do you ponder your own mortality?
00:46:06.520 | Do you think about death,
00:46:09.920 | about the finiteness of human existence
00:46:13.760 | when you meditate, when you think about it?
00:46:15.440 | And if you do, how do you make sense of it,
00:46:19.480 | that this thing ends?
00:46:20.540 | - Well, I don't try to make sense of it.
00:46:23.720 | I do think about it every day.
00:46:25.480 | I mean, it's a daily, multiple times a day.
00:46:29.060 | - Are you afraid of death?
00:46:30.580 | - No, I'm not afraid of it.
00:46:32.380 | I think it's a transformation.
00:46:35.380 | I don't know to what,
00:46:36.300 | but it's also a tool to feel the importance of every moment.
00:46:41.300 | So I just use it as a reminder.
00:46:46.260 | I have an hour.
00:46:48.620 | Is this really what I'm going to spend the hour doing?
00:46:52.540 | I only have so many more sunsets and sunrises to watch.
00:46:56.520 | Am I not going to get up for it?
00:46:58.320 | Am I not going to make sure that I try to see it?
00:47:02.080 | So it just puts a lot into perspective
00:47:06.680 | and it helps me prioritize, I think.
00:47:09.640 | I don't see it as something that I dread or is dreadful.
00:47:15.680 | It's a tool that is available to every single person
00:47:19.000 | to use every day because it shows how precious life is
00:47:21.800 | and there's reminders every single day,
00:47:24.560 | whether it be your own health or a friend or a coworker
00:47:27.960 | or something you see in the news.
00:47:29.660 | So to me, it's just a question
00:47:32.400 | of what we do with that daily reminder.
00:47:34.820 | And for me, it's am I really focused on what matters?
00:47:39.820 | And sometimes that might be work.
00:47:42.240 | Sometimes that might be friendships or family
00:47:45.200 | or relationships or whatnot,
00:47:47.020 | but it's the ultimate clarifier in that sense.
00:47:51.320 | - So on the question of what matters,
00:47:53.600 | another ridiculously big question
00:47:55.880 | of once you try to make sense of it,
00:47:58.480 | what do you think is the meaning of it all?
00:48:00.680 | The meaning of life?
00:48:02.020 | What gives you purpose, happiness, meaning?
00:48:06.860 | - A lot does.
00:48:08.520 | I mean, just being able to be aware
00:48:11.120 | of the fact that I'm alive is pretty meaningful.
00:48:20.040 | The connections I feel with individuals,
00:48:23.080 | whether they're people I just meet
00:48:25.040 | or long-lasting friendships or my family is meaningful.
00:48:29.940 | Seeing people use something that I helped build
00:48:33.760 | is really meaningful and powerful to me.
00:48:36.300 | But that sense of, I mean, I think ultimately
00:48:41.120 | it comes down to a sense of connection
00:48:43.440 | and just feeling like I am bigger.
00:48:47.220 | I am part of something that's bigger than myself
00:48:49.960 | and I can feel it directly in small ways or large ways,
00:48:54.000 | however it manifests is probably it.
00:48:58.260 | - Last question.
00:49:00.680 | Do you think we're living in a simulation?
00:49:02.840 | - I don't know.
00:49:06.700 | It's a pretty fun one if we are,
00:49:07.920 | but also crazy and random and wrought with tons of problems.
00:49:14.000 | But yeah.
00:49:17.240 | - Would you have it any other way?
00:49:19.400 | - Yeah.
00:49:20.240 | I mean, I just think it's taken us way too long
00:49:24.560 | as a planet to realize we're all in this together
00:49:27.200 | and we all are connected in very significant ways.
00:49:32.200 | I think we hide our connectivity very well through ego,
00:49:38.940 | through whatever it is of the day.
00:49:42.040 | But that is the one thing I would wanna work
00:49:46.240 | towards changing and that's how I would have it.
00:49:51.240 | 'Cause if we can't do that,
00:49:52.880 | then how are we gonna connect to all the other simulations?
00:49:55.560 | 'Cause that's the next step is like
00:49:57.360 | what's happening in the other simulation.
00:49:58.840 | - Escaping this one and yeah.
00:50:01.100 | Spanning across the multiple simulations
00:50:05.460 | and sharing in on the fun.
00:50:07.340 | I don't think there's a better way to end it.
00:50:09.900 | Jack, thank you so much for all the work you do.
00:50:12.020 | - There's probably other ways that we've ended this
00:50:13.900 | in other simulations that may have been better.
00:50:16.580 | - We'll have to wait and see.
00:50:18.680 | Thanks so much for talking today.
00:50:19.840 | - Thank you.
00:50:20.680 | - Thanks for listening to this conversation
00:50:23.040 | with Jack Dorsey and thank you to our sponsor, Masterclass.
00:50:26.940 | Please consider supporting this podcast
00:50:29.120 | by signing up to Masterclass at masterclass.com/lex.
00:50:34.120 | If you enjoy this podcast, subscribe on YouTube,
00:50:37.060 | review it with five stars on Apple Podcasts,
00:50:39.380 | support on Patreon or simply connect with me on Twitter
00:50:42.640 | at Lex Fridman.
00:50:45.120 | And now let me leave you with some words
00:50:47.160 | about Bitcoin from Paul Graham.
00:50:49.560 | "I'm very intrigued by Bitcoin.
00:50:52.480 | "It has all the signs of a paradigm shift.
00:50:54.980 | "Hackers love it, yet it is described as a toy,
00:50:58.760 | "just like microcomputers."
00:51:01.080 | Thank you for listening and hope to see you next time.