
Sergey Nazarov: Chainlink, Smart Contracts, and Oracle Networks | Lex Fridman Podcast #181


Chapters

0:00 Introduction
1:33 Hedgy
1:59 Digital vs physical world
9:53 Definitive truth
14:08 Decentralized finance
22:37 Smart contracts
28:53 Hybrid smart contracts and oracle networks
45:39 Applications of smart contracts
63:45 AI and smart contracts
71:17 What agreements can be turned into smart contracts?
82:19 Privacy
92:38 Trust
108:12 Bitcoin
116:26 Satoshi Nakamoto
123:46 Ethereum
129:36 Chainlink design decisions
142:04 Dogecoin
146:21 Book recommendations
158:31 Advice for young people
170:35 Meaning of life

Transcript

00:00:00.000 | The following is a conversation with Sergey Nazarov,
00:00:03.040 | CEO of Chainlink, which is a decentralized Oracle network
00:00:07.360 | that provides data to smart contracts.
00:00:10.120 | He and his team have done seminal research
00:00:12.760 | and engineering in the space of smart contracts.
00:00:15.320 | Check out the Chainlink 2.0 white paper
00:00:17.420 | that I found to be a great overview
00:00:19.380 | of their technology and vision.
00:00:21.320 | It's 136 pages, but very accessible.
00:00:24.700 | Quick mention of our sponsors, Wine Access,
00:00:27.800 | Athletic Greens, Magic Spoon, Indeed, and BetterHelp.
00:00:32.080 | Check them out in the description to support this podcast.
00:00:35.400 | As a side note, let me say that externally connected
00:00:38.300 | smart contracts that combine the ocean of data out there
00:00:42.340 | with the security of the blockchain are fascinating to me,
00:00:45.660 | both technically and philosophically.
00:00:48.240 | Data is knowledge, and knowledge is power.
00:00:51.960 | I think the more reliable data sources we integrate
00:00:55.120 | into our decision-making, especially when those decisions
00:00:59.200 | are executed by programs, the more efficient
00:01:01.880 | and productive our decisions become.
00:01:04.480 | There are interactions between humans
00:01:07.460 | that should not be formalized digitally,
00:01:10.060 | like love, for example, but for all the others,
00:01:13.680 | there's no reason for smart contracts
00:01:15.680 | not to automate away the menial parts of life,
00:01:18.840 | making more room for good conversation over brisket
00:01:22.960 | and maybe some vodka with old and new friends.
00:01:26.800 | This is the Lex Fridman Podcast,
00:01:29.000 | and here is my conversation with Sergey Nazarov.
00:01:32.720 | - Is that Jozhek there?
00:01:35.400 | - So I gave away everything I own a few times in my life,
00:01:38.420 | and he accidentally survived,
00:01:41.040 | and I don't like stuffed animals.
00:01:42.720 | What I really liked about, I got him in a thrift store.
00:01:44.840 | What I liked about him is 'cause I'd never seen
00:01:46.800 | a stuffed animal that looks pissed off at life.
00:01:50.600 | Like they're usually smiling in the dumbest of ways,
00:01:53.440 | and this guy was just pissed.
00:01:55.200 | - Yeah, I gotta tell you, that's actually pretty funny.
00:01:59.000 | - I like this guy.
00:01:59.960 | If you had to live only in the digital world
00:02:02.720 | or the physical world, which would you choose?
00:02:06.160 | - So I think this is actually a question more about
00:02:09.160 | what the fidelity of the digital world would be
00:02:11.740 | versus the physical world.
00:02:13.380 | I think this type of question, this whole simulation thing
00:02:16.400 | actually comes from papers about 20, 30 years ago
00:02:19.640 | in the philosophical world where people tried to make
00:02:22.800 | this thought experiment of would you be comfortable
00:02:25.520 | if everything that was happening to you
00:02:27.360 | happened in a simulation?
00:02:29.040 | What they were trying to do is they were intuitively
00:02:30.800 | trying to understand is there some kind of intuitive
00:02:34.920 | personal connection we have to something being
00:02:38.680 | the real world, right?
00:02:40.400 | And then the Matrix movie actually came out of these papers,
00:02:43.120 | and then these ideas made their way
00:02:45.800 | into the public consciousness.
00:02:47.960 | I personally think that if I had the choice
00:02:50.500 | to be in the digital world at the same fidelity
00:02:53.120 | as the real world with immortality,
00:02:56.100 | I would absolutely go with the digital world.
00:02:57.600 | - Wait, wait, wait, wait, wait.
00:02:58.640 | How'd you add the immortality part?
00:03:00.440 | That's a, you don't get immortality.
00:03:02.680 | - If you think about how we would go into the digital world,
00:03:04.760 | right, our brain patterns would be mapped
00:03:07.440 | onto some kind of probably virtual machine, right?
00:03:10.880 | And that would mean immortality, right?
00:03:13.040 | Because the virtual machine has no limit
00:03:15.240 | to how long it can exist.
00:03:16.080 | - Well, don't you think there would be like
00:03:17.280 | a versioning system?
00:03:18.480 | Like there'd be, this is a soft fork
00:03:22.080 | versus hard fork question.
00:03:23.760 | Whether Sergey version 2.0 would be different
00:03:27.160 | from Sergey version 1.0, there'd be an upgrade.
00:03:30.600 | So that's mortality.
00:03:32.040 | Sergey 1.0 would die in the digital world.
00:03:34.040 | You get like a software update, and then that's it.
00:03:38.000 | - Well, yeah, when people go into the Star Trek transporter,
00:03:41.340 | are they killed or are they transported?
00:03:43.640 | I don't really know.
00:03:45.040 | I haven't read any papers on this.
00:03:46.520 | I haven't really thought about it too much.
00:03:48.040 | - There's no white paper on the transporter.
00:03:49.800 | - Not at this point, so.
00:03:51.240 | - Well, what does fidelity mean exactly to you?
00:03:53.720 | Is it like strictly, so the fidelity of the physics world,
00:03:58.040 | the physical world is maybe now questions of physics,
00:04:01.760 | quantum mechanics, what is at the bottom of it all?
00:04:05.320 | Or do you mean the fidelity of the actual experience?
00:04:07.640 | - It's just perception.
00:04:09.400 | It's just perception.
00:04:11.400 | - But that's limited by human cognitive capabilities.
00:04:14.280 | - It is, but I don't really have anything else, right?
00:04:16.160 | I think all of these papers that brought up these questions
00:04:18.320 | of a simulation, they were like in epistemology
00:04:20.560 | and metaphysics, and what they were trying to do,
00:04:22.960 | I think, was they were trying to put people
00:04:25.080 | through a thought experiment where they would come out
00:04:27.620 | on the other end and say,
00:04:29.600 | the reality of life is really worth something,
00:04:33.120 | and ignorance isn't bliss, which is that consistent
00:04:37.400 | statement in the Matrix, right?
00:04:38.600 | Ignorance is bliss, that's what one of the guys says
00:04:40.880 | when he's doing something wrong
00:04:42.520 | and trying to get back into the Matrix.
00:04:44.400 | And the question is, is ignorance bliss?
00:04:46.600 | And it's like a different version of that.
00:04:49.840 | I think from a perceptual point of view,
00:04:52.480 | if my perceptions aren't in any way different,
00:04:55.680 | so fidelity is very good, it doesn't matter.
00:04:59.600 | I don't know, right?
00:05:00.960 | So if I don't know something, it doesn't really exist,
00:05:04.880 | and if it doesn't exist in my perception
00:05:07.800 | or my consciousness, then it doesn't exist,
00:05:09.840 | period, for me, at least.
00:05:11.680 | And then whether it exists in some more metaphysical
00:05:15.160 | version of things, I personally never really got
00:05:17.760 | into the metaphysics stuff because I could never really,
00:05:20.600 | I couldn't understand what the point of it was, right?
00:05:25.720 | It's one of these things where I couldn't really get
00:05:28.160 | what the practical application of it was,
00:05:32.320 | and this is from those realm of questions, right?
00:05:34.560 | Like if there was something about the world,
00:05:37.040 | but you didn't have a capacity to perceive it,
00:05:39.800 | would it matter to you?
00:05:41.760 | To me, it wouldn't matter.
00:05:43.160 | - Right, to me, by the way, the simulation thing
00:05:45.440 | is a really interesting engineering question,
00:05:47.760 | which is how difficult is it to engineer a virtual reality,
00:05:52.760 | a digital world that is sufficiently of high fidelity
00:05:57.600 | where you would want to live in it?
00:05:59.080 | I think that's a really testable
00:06:01.400 | and a fascinating engineering question
00:06:03.240 | 'cause my intuition says it's not as difficult as we think.
00:06:06.560 | It's not nearly as difficult as having to create
00:06:10.080 | a quantum mechanical simulation that's large enough
00:06:12.400 | to capture the full human experience.
00:06:14.520 | It might be just as simple as just a really nice Quake game,
00:06:18.760 | like with a nice engine, just creating
00:06:22.800 | all the basic visual elements that tricked our cognitive,
00:06:26.440 | our visual cortex into believing that we're actually
00:06:29.520 | in a physical environment.
00:06:31.720 | And I think that if that's true,
00:06:34.280 | then that's quite, a high fidelity digital world
00:06:38.440 | is actually achievable within a century.
00:06:41.640 | And that changes things.
00:06:44.880 | - Yeah, yeah, maybe in our lifetime.
00:06:47.000 | I'm really hoping for that.
00:06:47.840 | I'm hoping somebody can copy my brain waves
00:06:49.840 | onto a virtual machine and allow that consciousness
00:06:54.400 | to continue to exist.
00:06:55.760 | Whether that's death or not, I don't know.
00:06:59.040 | But I think it's actually gonna require some serious leaps.
00:07:03.360 | Like even the VR headsets, right?
00:07:05.560 | They don't work if they go below 90 frame rates, right?
00:07:09.440 | People start getting freaked out.
00:07:11.200 | So you have to go from one gaming screen
00:07:13.800 | of 60 frames per second to two screens
00:07:17.920 | of 90 frames per second.
00:07:19.360 | And so the people's hardware today can't even handle that.
00:07:22.240 | And that's for these two little screens by your eyeballs.
00:07:26.720 | What it's gonna take to completely trick my consciousness
00:07:30.200 | into not knowing the difference in terms of like,
00:07:32.600 | all the sensory inputs.
00:07:34.520 | I'm keeping my fingers crossed.
00:07:37.880 | Whoever does that and is close to doing that,
00:07:41.320 | they should contact me.
00:07:42.720 | I want to have my brain waves turned into a virtual machine.
00:07:46.680 | - Would you in that context, if Morpheus came to you,
00:07:50.280 | would you take the blue pill or the red pill?
00:07:53.520 | Meaning, would you be happy just living in that world
00:07:58.320 | and not knowing that you're living inside
00:08:01.280 | that virtual world that's running a computer?
00:08:03.800 | Or would you want to know the truth of it?
00:08:06.640 | - Well, actually, I think that's a very different question.
00:08:08.680 | There's actually moral ethical question there
00:08:12.520 | about whether you should allow a bunch of people
00:08:15.120 | to get manipulated and killed and enslaved.
00:08:17.940 | 'Cause in the Matrix, they're all enslaved
00:08:20.200 | as like an AAA battery, to turn a human being
00:08:24.200 | into the battery, right?
00:08:26.720 | So I think the moral and ethical question of that,
00:08:31.560 | fascinating enough, isn't actually different
00:08:33.120 | than the moral and ethical questions we face today
00:08:35.160 | in modern daily life.
00:08:36.960 | But I probably have given the choice
00:08:39.600 | of just completely going along or going against it.
00:08:43.480 | I would probably go against it
00:08:45.400 | if I had to make this kind of binary choice.
00:08:48.040 | Because going along with it,
00:08:50.000 | I think at that scale of scary stuff happening to people,
00:08:57.160 | is probably something really, really, really difficult.
00:09:00.000 | - But for your individual life,
00:09:01.200 | it's way more fun to go along with it.
00:09:03.240 | So you're saying you value
00:09:05.360 | opposing a system that includes the suffering of others
00:09:11.760 | versus just for yourself enjoying the ride?
00:09:15.840 | I mean, if there is such a binary choice,
00:09:19.400 | why choose the opposite system?
00:09:21.920 | - I think it's the nature of kind of the ethical dilemma
00:09:24.440 | that you face in that situation.
00:09:26.040 | There's kind of some, you know,
00:09:27.560 | this is obviously not something that's happening now, right?
00:09:29.880 | - We don't know this, right?
00:09:31.320 | - We don't know this, but at the end of the day,
00:09:35.240 | at that scale of something like that happening,
00:09:38.080 | yeah, at that scale of people being manipulated and harmed,
00:09:43.080 | then I think pretty much almost all people
00:09:45.760 | have an obligation to go against it.
00:09:48.600 | Probably that's what that looks like in my opinion.
00:09:53.620 | - So you've talked about the concept of definitive truth.
00:09:57.660 | What is it?
00:09:58.500 | And in general, what is the nature of truth
00:10:00.320 | in human civilization?
00:10:01.800 | And just talking about the digital age,
00:10:04.900 | the nature of truth in the digital age.
00:10:08.080 | - So the interesting thing about definitive truth
00:10:11.600 | is that it actually exists on this,
00:10:14.800 | at least in my mind, on this spectrum
00:10:16.820 | between objective truth and just, you know,
00:10:19.720 | somebody made something up and nobody else agrees.
00:10:22.800 | So what I think definitive truth is,
00:10:24.960 | is it's somewhere in the middle on that spectrum
00:10:27.100 | where if you and me define what truth is, right?
00:10:31.840 | Like if you and me have an agreement of some kind
00:10:34.240 | and we say, as long as the weather is sunny
00:10:38.580 | or the weather isn't, there is no rain on that day,
00:10:41.600 | then there'll be an insurance policy that results.
00:10:44.460 | And you and me both agree that as long as three sensors,
00:10:47.880 | three weather monitoring stations all say that,
00:10:51.480 | then the definitive truth for us and for that agreement
00:10:54.740 | is the result of those systems coming to consensus
00:10:59.500 | about what happened out in the real world.
00:11:01.920 | I think the objective truth definition
00:11:06.040 | from kind of the philosophical world
00:11:08.120 | is really, really stringent and very, very hard to attain.
00:11:12.120 | And that's not what this is.
00:11:13.960 | And that's actually not what commerce
00:11:16.240 | or the ability for people to interact about contracts needs.
00:11:20.140 | What I think the world of commerce needs is an upgrade
00:11:23.940 | from someone can unilaterally decide what the truth is
00:11:28.640 | to there can be a pre-agreed set of conditions
00:11:33.640 | where we define what the truth is under those conditions.
00:11:37.880 | And then you and me basically say,
00:11:40.600 | if these 20 nodes or of these 30 data sources
00:11:43.600 | come to consensus within this method of consensus
00:11:46.440 | with this threshold of agreement,
00:11:48.280 | then definitive truth has been achieved for you and me
00:11:51.240 | in our relationship for this specific agreement
00:11:54.340 | and the specificity and our shared agreement
00:11:58.200 | to that kind of truth or that definitive truth
00:12:01.480 | being acceptable to both of us
00:12:04.000 | is probably what's kind of necessary and sufficient
00:12:08.220 | for everything to move forward in a better way.
00:12:10.600 | In any case, much better than,
00:12:12.640 | I'm a bank or an insurance company,
00:12:14.440 | I'm gonna unilaterally decide what happens.
00:12:16.800 | It's definitely an upgrade from that.
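As a rough illustration of the weather-insurance example above, here is a minimal Python sketch of that kind of pre-agreed "definitive truth": the parties fix a list of data sources and an agreement threshold in advance, and the payout is released only when that threshold is met. The station names, the 1 mm rainfall cutoff, the all-three threshold, and the payout direction are all illustrative assumptions, not part of any real contract or of Chainlink's implementation.

```python
# Minimal sketch of a pre-agreed "definitive truth" condition for a
# weather-insurance agreement. The data sources, rainfall cutoff,
# threshold, and payout direction are hypothetical, chosen only to
# illustrate the idea of consensus conditions agreed on in advance.

SOURCES = ["station_a", "station_b", "station_c"]  # pre-agreed weather stations
THRESHOLD = 3                                      # all three must agree

def fetch_rainfall_mm(source: str) -> float:
    """Placeholder for querying one weather station's API."""
    readings = {"station_a": 0.0, "station_b": 0.0, "station_c": 0.2}
    return readings[source]

def no_rain_agreed(max_rain_mm: float = 1.0) -> bool:
    """Definitive truth for this agreement: enough sources report no rain."""
    votes = sum(1 for s in SOURCES if fetch_rainfall_mm(s) <= max_rain_mm)
    return votes >= THRESHOLD

def settle_insurance(payout: float) -> float:
    """Release the payout only if the pre-agreed condition was met."""
    return payout if no_rain_agreed() else 0.0

print("Payout owed:", settle_insurance(payout=1000.0))
```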
00:12:18.720 | - Do you think it's possible to define formally in this way,
00:12:21.900 | a definitive truth for many things in this world?
00:12:24.360 | Like you talked about weather,
00:12:26.280 | basically defining that if three sensors of weather agree,
00:12:30.400 | then that we're going to agree
00:12:31.520 | that that is a definitive useful truth
00:12:34.640 | for us to operate under.
00:12:36.640 | So how many things in this world
00:12:39.360 | can be formalized in this way, do you think?
00:12:41.800 | - A huge amount.
00:12:43.240 | So there's actually two things going on here.
00:12:48.240 | One thing is the amount of data that already exists,
00:12:51.880 | and the pieces of data coming off of markets,
00:12:55.440 | IOT, shipment of goods, any number of other things.
00:12:59.240 | Like even your YouTube channel
00:13:00.800 | has a certain amount of likes
00:13:02.560 | or a certain amount of clicks or a certain amount of views,
00:13:05.240 | and even that's quantifiable.
00:13:06.600 | So even to a certain degree, what we do here today,
00:13:09.020 | you and me right now can be quantified
00:13:11.720 | as far as the amount of views,
00:13:12.920 | the amount of clicks, the amount of any number of other--
00:13:15.000 | - You, the viewer, have power of data in your hands
00:13:18.320 | by clicking like or dislike right now,
00:13:20.680 | or the subscribe button or the unsubscribe button,
00:13:23.360 | which I encourage you to do.
00:13:24.960 | Anyway, okay, so there's data flowing
00:13:27.280 | into all interactions in this world, there's data.
00:13:30.280 | - There's more and more data, right?
00:13:31.640 | - More and more data.
00:13:32.480 | - More and more data,
00:13:33.300 | that data is more and more accessible to everybody,
00:13:36.420 | and that accessibility and the fact that there's more of it
00:13:39.140 | means we can form more definitive truth proofs.
00:13:42.840 | We can form more and more proofs,
00:13:44.400 | and as we form those proofs,
00:13:46.280 | well, we can provide them to these blockchains
00:13:48.520 | and smart contract systems that consume them,
00:13:50.960 | and then they're tamper proof, right?
00:13:52.600 | So they can't be manipulated.
00:13:54.280 | And so now we've combined a system that can prove things
00:13:57.160 | with a system that guarantees us certain outcomes,
00:14:00.120 | and we have a better system of contracts,
00:14:02.440 | which is actually an unbelievably powerful tool
00:14:06.680 | that has never existed before.
00:14:08.560 | - Can we talk about the world of commerce and finance,
00:14:11.480 | decentralized finance, what is it,
00:14:14.080 | what's its promise from both the philosophical
00:14:18.720 | and technical perspective,
00:14:20.200 | if we just zoom in on that particular space
00:14:22.080 | of the digital world?
00:14:24.040 | - Sure, so decentralized finance is the instantiation
00:14:28.160 | of a specific type of smart contract, right?
00:14:31.360 | Or what I call hybrid smart contracts,
00:14:33.680 | which are these contracts that combine the on-chain code
00:14:36.480 | together with the off-chain proofs that something happened.
00:14:40.200 | They're called a hybrid because they basically use
00:14:42.520 | both of these systems, right?
00:14:43.880 | The blockchain and the proofs about what happened.
00:14:47.040 | And what DeFi is, is one specific type
00:14:51.080 | of hybrid smart contract that is taking on
00:14:54.360 | the contractual agreements you traditionally find
00:14:57.880 | in the global financial system, right?
00:15:01.360 | And that's basically the world of lending,
00:15:04.440 | the world of yield generation for people giving me
00:15:07.200 | or giving whoever their money
00:15:08.800 | and somebody giving yield back to them,
00:15:11.680 | which is what bonds do and what treasuries do
00:15:13.960 | and what a lot of the global financial markets do,
00:15:16.120 | as well as the ability to gain exposure and protection
00:15:20.360 | from different types of events and risks.
00:15:23.000 | That's a lot of what derivatives do, right?
00:15:24.760 | Derivatives allow us to say,
00:15:26.240 | "Hey, something's gonna happen
00:15:27.480 | and I'm either gonna protect myself by getting paid
00:15:30.080 | if it happens, or I'm going to benefit from it happening
00:15:33.280 | by basically saying it's gonna happen,
00:15:34.960 | putting money down on that
00:15:36.120 | and that prediction will get me a return."
00:15:38.880 | Now, that's a very large part
00:15:41.640 | of the global financial system,
00:15:43.120 | excluding all the stuff for global trade
00:15:45.120 | and letters of credit
00:15:45.960 | and all the stuff that facilitates international trade.
00:15:48.320 | So excluding that at least for now.
00:15:50.920 | So if we look at what decentralized finance does,
00:15:53.560 | it takes all of those agreements
00:15:55.280 | about generating yield, lending,
00:15:57.400 | and all of these types of things you find in global finance
00:15:59.840 | and the world of derivatives
00:16:01.160 | and a few other types of financial products.
00:16:04.040 | And it basically puts them into a different format, right?
00:16:07.920 | So the format you have for centralized financial agreements
00:16:11.680 | is that you go to a bank, even if you're a hedge fund,
00:16:14.280 | even if you're like the richest people,
00:16:15.880 | you go to a bank, they make a product for you
00:16:18.560 | and you hope that they honor the product
00:16:20.640 | that they made for you.
00:16:21.720 | Or you do a deal with another hedge fund
00:16:23.360 | or whoever, some counterparty,
00:16:25.520 | and you hope that that deal is honored.
00:16:28.400 | - Yeah.
00:16:29.240 | - And then a number of very freaky things
00:16:31.720 | start to take place.
00:16:33.120 | One of them is people don't have clarity
00:16:36.640 | about what the agreement is, right?
00:16:38.840 | So a lot of people don't know exactly
00:16:41.120 | what the agreement is between those parties
00:16:44.920 | because they can't actually see it.
00:16:46.800 | Sometimes agreements are kept very private
00:16:48.800 | or parts of them are kept private.
00:16:50.360 | And that keeps other counterparties,
00:16:52.960 | other people in the system
00:16:53.840 | from understanding what's going on.
00:16:55.480 | This is actually partly what happened
00:16:56.920 | with the mortgage crisis.
00:16:58.160 | The mortgage crisis in 2008 was basically,
00:17:00.520 | there were a lot of agreements, there were a lot of assets,
00:17:02.680 | but because the centralized financial system
00:17:05.120 | worked in such an opaque way,
00:17:06.960 | it was so unbelievably difficult
00:17:08.560 | to understand what was going on, right?
00:17:11.120 | And so that lack of understanding
00:17:12.640 | for the global financial system
00:17:13.800 | basically led to a big boom
00:17:15.360 | and then correspondingly, very, very big bust,
00:17:18.800 | which amazingly enough had a huge impact on everybody,
00:17:21.240 | even though they didn't participate
00:17:22.440 | in the boom part of the equation.
00:17:25.320 | In any case, what decentralized finance does
00:17:28.640 | is it takes these financial contracts
00:17:31.480 | that power the global financial system.
00:17:33.320 | It puts them in this new blockchain-based format
00:17:36.240 | that basically at this point
00:17:37.400 | provides three very powerful things.
00:17:39.880 | The first thing that it provides is complete transparency
00:17:42.400 | over what's going on with your financial product.
00:17:45.320 | So this means when you use a financial product
00:17:47.120 | in the DeFi format, you, and you as a technical person
00:17:50.240 | actually can drill down very, very, very deeply.
00:17:53.240 | And you can understand where the collateral is,
00:17:56.000 | you can understand how much collateral there is,
00:17:57.720 | you can understand what format it's in,
00:17:59.280 | you can understand how it's changing,
00:18:00.680 | you can understand this on a second to second
00:18:03.440 | or block to block basis.
00:18:05.320 | So you have complete transparency
00:18:07.800 | into what's going on in the financial protocol
00:18:11.120 | that you have your assets in,
00:18:12.800 | which is because blockchains and infrastructure,
00:18:15.640 | all of these things are built on,
00:18:17.320 | force that transparency.
00:18:19.720 | Whereas the centralized financial system
00:18:21.280 | is very, very good at hiding it.
00:18:23.440 | It's very good at hiding it
00:18:24.880 | and packaging things in a glossy wrapper,
00:18:27.800 | creating a boom, then a bust.
00:18:30.360 | Decentralized finance is built on infrastructure
00:18:32.560 | that forces transparency,
00:18:34.200 | such that everyone can understand
00:18:35.840 | what the financial product does from day one.
00:18:38.080 | And in fact, escaping that property
00:18:40.320 | is practically impossible.
00:18:41.720 | Or if someone tries to escape it,
00:18:43.360 | it becomes immediately obvious
00:18:44.720 | and people don't use their financial product.
00:18:46.680 | So that's number one.
00:18:48.600 | Number two is control.
00:18:50.680 | So if you look at what happened with Robinhood,
00:18:53.120 | everybody thought the system worked a certain way.
00:18:55.520 | Everybody thought I have a brokerage account,
00:18:57.720 | I can trade things under a certain set of market conditions.
00:19:02.240 | And then the market conditions changed
00:19:05.440 | within the band of what people thought they could do.
00:19:08.200 | And everybody was fascinated to find out that,
00:19:10.280 | oh my God, I thought my band of market conditions
00:19:13.400 | in which I can control my assets is X,
00:19:15.800 | but it is actually Y.
00:19:17.440 | It's actually much, much smaller band.
00:19:20.400 | And the reason it is a much, much smaller
00:19:22.760 | group of market conditions
00:19:24.400 | is that the system doesn't work
00:19:26.760 | the way people think it works.
00:19:28.160 | The system was wrapped up in a nice glossy wrapper
00:19:30.280 | and given to them to get them to participate in the system
00:19:32.840 | 'cause the system requires and needs their participation.
00:19:35.800 | But if you actually look at how the system works underneath,
00:19:39.480 | you will see that it does not work
00:19:41.080 | the way people think that it works.
00:19:43.120 | And this is actually another reason
00:19:44.400 | that DeFi is so powerful
00:19:45.520 | because DeFi actually, and these blockchain contracts,
00:19:49.280 | give people the version of the world
00:19:52.040 | they think they already have,
00:19:54.040 | which is why they don't beg for it.
00:19:56.480 | So everybody thinks they're in a certain version
00:19:58.320 | of the world that works in this reliable way,
00:20:00.440 | transparent way.
00:20:01.440 | They're not.
00:20:02.760 | They don't realize it.
00:20:03.960 | And so they're confused when you tell them,
00:20:05.520 | I'm gonna make the world work this way
00:20:07.120 | because they think they're already in that world.
00:20:09.160 | But then things like Robinhood
00:20:10.240 | make it immediately painfully clear
00:20:12.720 | that that's not how the world works.
00:20:14.000 | So the second real property of DeFi is control,
00:20:17.080 | which means that you control your assets,
00:20:20.200 | not a bank, not a broker, not a third party, you.
00:20:23.840 | You control your Bitcoins,
00:20:24.960 | you control your tokens in the finance protocol.
00:20:27.280 | If you don't like how something's going in that protocol,
00:20:30.200 | you can remove it.
00:20:31.040 | You can send it to another protocol,
00:20:32.800 | or you can use a feature of the protocol
00:20:34.440 | to do something it's supposed to do.
00:20:35.960 | And guess what?
00:20:36.800 | Nobody can just say, oops, that feature,
00:20:38.880 | that isn't so good for my friends over here.
00:20:41.240 | That feature is actually,
00:20:42.560 | we're just gonna pause that feature
00:20:43.960 | in the critical moment when you need it
00:20:46.480 | to execute your strategy,
00:20:49.440 | which is why you took all the risks to begin with.
00:20:52.200 | And then the final reason,
00:20:54.440 | the final thing to know about DeFi
00:20:56.480 | is that DeFi is inherently global,
00:20:59.160 | and actually right now provides better yield globally.
00:21:02.920 | So if you go to a bank right now with the US dollar,
00:21:05.640 | you get 1% or less.
00:21:07.800 | If you go to DeFi with the US dollar,
00:21:10.000 | you get 7% or 8%.
00:21:12.720 | So if we think about that in a world
00:21:16.600 | where there's a lot of inflation coming down the road,
00:21:19.080 | and we think about, well,
00:21:23.280 | a lot more systems might be failing soon,
00:21:25.440 | and they might be highlighting these types of problems
00:21:28.280 | that were there for,
00:21:30.200 | or as a result of the type of control
00:21:32.240 | that you see in Robinhood,
00:21:34.640 | and people are more and more concerned
00:21:37.240 | about both transparency and control,
00:21:39.960 | and they're looking for yield to combat inflation,
00:21:43.040 | I think that's what DeFi is about in a practical sense.
00:21:46.800 | It is this clarity about your risk,
00:21:48.880 | it is control over your assets,
00:21:51.320 | and amazingly, at the same time
00:21:53.600 | as having those two unbelievably useful properties,
00:21:56.760 | it is actually superior yield,
00:21:59.200 | which just leads me to the very obvious conclusion
00:22:03.000 | that the only reason DeFi isn't more used
00:22:05.880 | is 'cause more people don't know about it,
00:22:08.240 | and by virtue of this long kind of explanation
00:22:12.000 | here and elsewhere, more people will know about it,
00:22:14.880 | and it's just such an obviously superior solution
00:22:17.680 | that I haven't heard a single explanation
00:22:19.720 | as to why, no, no, don't earn 8% and take less risk
00:22:24.160 | and have more transparency with your assets,
00:22:26.640 | earn 7% less, take more risk,
00:22:29.800 | and give people the ability to change the rules on you
00:22:32.640 | at their discretion, go do that.
00:22:35.200 | Who's gonna do that?
00:22:37.280 | - And in general, on the first two
00:22:38.960 | of transparency and control,
00:22:40.440 | first of all, I do think, maybe you can correct me,
00:22:43.040 | but from my perspective, they're deeply tied together
00:22:47.800 | in the sense that transparency gives control.
00:22:50.760 | - Transparency creates accountability,
00:22:52.880 | and there's this kind of game being played,
00:22:55.400 | game theoretic game, where if I know,
00:22:58.280 | if you know I'm gonna discover your deviation,
00:23:00.600 | you're not gonna deviate.
00:23:02.240 | - Yes, this could be a whole 'nother conversation,
00:23:04.760 | but just as a small aside,
00:23:06.560 | on the social network side of things,
00:23:08.560 | which I've been thinking deeply about in the past year or so
00:23:12.160 | of how to do it right there,
00:23:16.040 | how to fix our social media,
00:23:18.360 | and I tend to believe that human beings,
00:23:22.800 | if they're given clear transparency
00:23:25.240 | about which data is being stored, how it's being used,
00:23:28.160 | where it's being moved about,
00:23:29.800 | just all a clear, simple transparency
00:23:32.840 | of how their data is being used,
00:23:36.440 | and them having the control at the very minimal level
00:23:40.720 | of being able to participate or to walk away,
00:23:43.840 | and walk away means delete everything
00:23:45.920 | you ever known about me,
00:23:47.400 | that will create a much, much better world.
00:23:51.960 | That currently there's a complete lack of transparency
00:23:54.480 | in the social media, how the data is being used
00:23:56.720 | for your own protection.
00:23:57.560 | I mean, there's a lot of parallels
00:23:58.520 | to the central bank situation,
00:24:00.480 | and there's not a control element
00:24:02.640 | of being able to walk away.
00:24:04.080 | Like being able to delete all your data,
00:24:06.360 | delete your account on Facebook is very difficult.
00:24:09.400 | It doesn't take a single click,
00:24:11.600 | which I think is what it should take.
00:24:12.920 | There should be a big red button that says,
00:24:15.000 | delete everything you've ever known about me,
00:24:17.360 | or like forget me.
00:24:19.240 | So I think that couple together can create
00:24:21.680 | a very different kind of world,
00:24:23.760 | and create an incentivization that will lead
00:24:28.760 | to like progress and innovation,
00:24:30.640 | and just like a much better social network,
00:24:32.640 | and a really good business for the future social networks.
00:24:37.640 | But so I tend to see like control as naturally
00:24:43.000 | being a sort of an outgrowth from the transparency.
00:24:47.680 | It should all start at the transparency,
00:24:49.520 | which is why the smart contract formulation is fascinating.
00:24:54.000 | 'Cause like you're formalizing in a simple, clear way,
00:24:59.000 | any agreements that you're participating in.
00:25:01.480 | And as a side comment also,
00:25:03.640 | what's really inspiring to me is that I think
00:25:07.000 | there's a greater, I don't know if this is always the case,
00:25:09.760 | but it seems like from having talked to people
00:25:12.520 | on the psychological element,
00:25:14.640 | there's a hunger amongst people for transparency
00:25:19.640 | and for control.
00:25:23.280 | Like transparency, another word for that is authenticity.
00:25:26.240 | If you look at the kind of stuff that people hunger for now,
00:25:29.560 | they want to know the reality of who you are as an individual.
00:25:33.840 | So that means you can create businesses,
00:25:35.880 | you can create tools that are built on authenticity,
00:25:40.000 | and transparency.
00:25:41.200 | And then the same, I'm inspired by the intelligence
00:25:45.920 | of people if you give them control,
00:25:48.120 | if you give them power, that they would make good choices.
00:25:52.800 | That's really exciting.
00:25:54.080 | Of course, not everybody,
00:25:55.320 | but that means that decentralized power
00:25:58.320 | can create effective systems.
00:26:01.720 | So couple that: there's a hunger for transparency
00:26:03.880 | so we can move to a world where everyone's being
00:26:06.400 | just like real, conveying their genuine human nature.
00:26:10.920 | And people are sufficiently intelligent
00:26:14.160 | that if they're given power in a distributed mass scale sense
00:26:18.800 | that we're going to build a better world through that,
00:26:21.480 | as opposed to centralized supervised control,
00:26:24.160 | where only a small percent of the population
00:26:26.600 | know what the hell they're doing
00:26:27.560 | and everybody else is clueless sheep.
00:26:30.880 | So those two coupled together is really, to me, inspiring.
00:26:35.240 | - Just to really quickly comment on the stuff
00:26:36.880 | that you just said, which I think is super,
00:26:38.480 | super, super fascinating.
00:26:40.440 | I think that's all exactly right.
00:26:42.760 | I think everything that you said is right.
00:26:44.720 | And I think it's actually going to be the same
00:26:46.040 | for social media and banking and every other type of contract
00:26:49.040 | is that all of those systems that house people's value
00:26:52.680 | for them and take control of either their social media value
00:26:56.760 | or their financial value or whatever for them,
00:27:00.120 | all of that is going to be made available to people
00:27:02.480 | in this autonomous piece of code
00:27:05.200 | that does the same thing
00:27:06.480 | that the centralized entity used to do.
00:27:08.800 | So they get all the features,
00:27:10.760 | but the autonomous piece of code gives them the ability
00:27:14.360 | to have control while getting all the features.
00:27:17.040 | - Right.
00:27:18.040 | - So banks give you features,
00:27:19.840 | social media sites give you features,
00:27:22.520 | whatever other system that you use online gives you features
00:27:25.560 | and then it takes your data
00:27:27.120 | and it takes control of your assets from you
00:27:29.720 | in return for those features.
00:27:31.480 | Right?
00:27:32.320 | I think the whole big difference here,
00:27:34.520 | partly in line with the definition of smart contracts
00:27:37.040 | and its evolution,
00:27:38.240 | is that there's this autonomous piece of code
00:27:41.960 | that's giving you all those features
00:27:44.400 | without requiring the ownership and lock-in and control
00:27:49.400 | and unilateral kind of ownership of your data or your value
00:27:54.440 | or whatever it is that you're giving it.
00:27:57.920 | Right?
00:27:58.800 | And I think what this will lead to fundamentally
00:28:01.200 | is just more of a free market dynamic
00:28:04.800 | among how people make...
00:28:07.120 | I think with the social media folks,
00:28:08.880 | you should just make some kind of law or something
00:28:11.760 | where you can just export all your data from them.
00:28:14.240 | Everyone should be able to get their data exported
00:28:16.960 | by another application.
00:28:18.680 | And then the network effect of all these social media sites
00:28:21.520 | will kind of crumble
00:28:22.360 | 'cause people will just combine your Twitter data
00:28:24.840 | with your Facebook data, with everything else,
00:28:27.000 | into an application that you control.
00:28:29.000 | And there'll just be thousands of different interfaces
00:28:31.800 | competing for how to consume all the social media data
00:28:34.680 | because it isn't locked in
00:28:35.920 | in one centralized actor's control.
00:28:38.400 | And so this is just the recurring pattern
00:28:40.840 | of what I think all of this will do
00:28:42.600 | is it'll give people a better deal.
00:28:45.760 | Right?
00:28:46.600 | It gives them features without ownership of data,
00:28:48.960 | without ownership of value.
00:28:50.800 | And that's really the difference.
00:28:53.600 | - So I think this is a good place
00:28:54.880 | to talk about smart contracts then.
00:28:56.520 | Can you tell me the history of smart contracts
00:28:58.880 | and the basic sort of definitions of what is it?
00:29:02.640 | - Sure.
00:29:03.480 | So I think smart contracts as a definition
00:29:05.840 | has actually gone through some kind of changes
00:29:08.200 | or a small evolution.
00:29:09.960 | Initially, I think it was actually a conception
00:29:12.200 | of a digital agreement that was tamper-proof
00:29:14.880 | and could know things about the world, right?
00:29:17.080 | So it could get proof
00:29:18.280 | and it could define that something happened
00:29:20.240 | and it could conclude an outcome
00:29:22.520 | and release payment or do something else.
00:29:24.120 | That's actually the definition of smart contracts
00:29:26.320 | that I began working in this industry with
00:29:28.400 | seven or eight years ago
00:29:29.560 | when I started making smart contracts.
00:29:31.480 | That is the conception that I had of a smart contract.
00:29:34.880 | Then what happened was that was really hard to do, right?
00:29:38.880 | Building that type of tamper-proof digital agreement
00:29:41.200 | that could also know things about the real world
00:29:43.640 | and release payments back to people about those events
00:29:47.200 | that were codified in this tamper-proof format
00:29:49.440 | was actually a very tall order.
00:29:51.400 | Turns out it's consistent of three parts.
00:29:53.040 | It's consisting of the contract,
00:29:54.560 | the proof about what happened
00:29:55.720 | and the release of value.
00:29:57.480 | The way things have evolved so far
00:30:00.400 | is that the definition has now come to mean on-chain code,
00:30:05.240 | right?
00:30:06.080 | So it's come to mean the codification
00:30:08.360 | of contractual agreement on a blockchain, right?
00:30:11.600 | So there's some code somewhere on some blockchain
00:30:14.000 | that defines what the agreement is.
00:30:16.160 | Now, that eliminates the part of the definition
00:30:20.720 | that's related to knowing things about the world
00:30:23.040 | and it partly eliminates the definition about payments
00:30:26.000 | and stuff like that.
00:30:26.840 | But basically it's on-chain code, right?
00:30:30.480 | We in our recent work on a second white paper
00:30:33.920 | have actually put out a different definition
00:30:36.080 | that we call hybrid smart contracts
00:30:38.400 | that actually tries to go back to the initial definition
00:30:41.360 | that I started with seven or eight years ago,
00:30:43.080 | which basically says that there's some proof somewhere
00:30:47.120 | that's proven to the contract
00:30:48.400 | and the contract can know that
00:30:49.840 | and the contract can gain proof.
00:30:51.800 | Then it can use that proof to settle the agreement
00:30:55.000 | that's codified on a blockchain.
00:30:57.480 | So you both need a mechanism to provide proof,
00:31:00.400 | you need a mechanism to codify the contract
00:31:02.680 | in a tamper-proof way on something like a blockchain.
00:31:05.440 | And then as with all contracts,
00:31:06.840 | there's a presumption that there's some kind
00:31:08.080 | of release of value.
00:31:09.440 | So I think a smart contract in our industry right now
00:31:13.240 | means on-chain code,
00:31:15.000 | which limits it to whatever can be done on-chain only.
00:31:19.040 | And then in our internal definition for us,
00:31:22.000 | and for us at Chainlink and for me,
00:31:24.480 | it's hybrid smart contracts,
00:31:26.600 | which is actually the original definition.
00:31:28.480 | It's the idea that a contract can both know what happened
00:31:32.520 | and automatically resolve to the proper outcome
00:31:36.720 | based on what happened.
00:31:38.000 | - So you're referring to the Chainlink 2.0 white paper,
00:31:41.080 | which is a paper that I recommend people look.
00:31:45.640 | It's a very easy read and very well-structured
00:31:48.200 | and very thorough, so I really enjoyed it.
00:31:50.400 | Very recently released, I guess.
00:31:53.000 | Can you dig in deeper?
00:31:53.960 | What is a hybrid smart contract?
00:31:57.120 | You mentioned sort of this idea of data
00:31:59.800 | or knowing about the world and on-chain and off-chain.
00:32:05.640 | So what are the different roles in this?
00:32:07.680 | So hybrid, by the way, refers to the fact
00:32:10.600 | that it's on-chain and off-chain contracts.
00:32:14.840 | So maybe dig in deeper of what the heck is it
00:32:18.560 | and what does it mean to know stuff about the world?
00:32:21.640 | Like how do you actually achieve that?
00:32:23.720 | - Yeah, absolutely.
00:32:25.080 | So the on-chain part is where the agreement itself is.
00:32:30.520 | That's the smart contract itself.
00:32:32.960 | And that's where you codify certain conditions,
00:32:35.080 | such as the conditions under which
00:32:38.120 | an interest payment is made,
00:32:39.320 | or the conditions under which the contract
00:32:41.320 | pays out the full amount that it holds
00:32:43.360 | to someone based on a derivative outcome
00:32:45.440 | or something like that.
00:32:47.080 | Now, what the on-chain code is very good at
00:32:49.160 | is creating transparency about what the core conditions
00:32:52.160 | of the contract are.
00:32:53.960 | It's very good at taking in money
00:32:55.920 | from other private keys that send it tokens
00:32:58.680 | and send it value to hold.
00:33:00.840 | And then it's also very good at returning money
00:33:03.680 | or returning value back to other addresses
00:33:06.480 | or other private keys.
00:33:08.040 | It can also be involved in governance.
00:33:09.560 | It can be involved in a few other
00:33:10.640 | private key signature-based operations.
00:33:14.400 | But primarily the on-chain part of a hybrid smart contract,
00:33:18.160 | from what I've seen so far,
00:33:19.880 | defines the agreement, takes in value,
00:33:22.440 | and returns value based upon the conditions codified
00:33:25.360 | in the agreement on a blockchain.
00:33:27.480 | The second and equally important off-chain part
00:33:30.400 | is where the term an oracle comes in,
00:33:34.280 | or an oracle mechanism,
00:33:35.480 | or a decentralized oracle network,
00:33:37.520 | as we describe it in the paper.
00:33:39.320 | And this is another decentralized computational system
00:33:44.320 | that has a different goal.
00:33:46.040 | So blockchains have the goal
00:33:48.280 | of packaging transactions into blocks
00:33:50.640 | and connecting them in a cryptographically unique way
00:33:54.320 | to create security and assurance
00:33:55.880 | about that chain of transactions.
00:33:58.720 | Oracles and decentralized oracle networks
00:34:01.640 | achieve consensus and they achieve decentralization
00:34:06.280 | about the topic of what happened.
00:34:08.960 | So blockchains structure transactions.
00:34:11.440 | Some of those transactions might be the state changes
00:34:14.200 | in different pieces of on-chain code.
00:34:16.880 | And then those on-chain pieces of code require input.
00:34:21.880 | I think the thing that people
00:34:24.800 | get kind of a little bit thrown by
00:34:26.560 | is despite being called smart contracts,
00:34:29.760 | the on-chain code on a blockchain
00:34:32.080 | cannot actually speak to any other system.
00:34:35.480 | So blockchains are valuable and useful
00:34:38.280 | as far as they're tamper-proof and secure.
00:34:40.600 | And to be tamper-proof and secure,
00:34:42.120 | they're made this kind of walled garden
00:34:44.080 | that is able to know and interact
00:34:46.800 | only with the highly reliable information
00:34:49.520 | that's within that system,
00:34:51.720 | which is basically tokens and private key signatures.
00:34:56.160 | All the other world's information
00:34:57.760 | is not available in a blockchain inherently.
00:35:01.160 | And a smart contract or a piece of on-chain code
00:35:03.920 | can't just say,
00:35:04.920 | "Hey, I'm going to go get some data from over here,"
00:35:07.600 | because the API they would get it from
00:35:09.680 | creates a whole bunch of security concerns
00:35:11.720 | for the blockchain itself
00:35:13.320 | and a whole bunch of consensus issues
00:35:14.840 | about how to agree on what that API said
00:35:17.600 | or what the truth of the world is, right?
00:35:19.200 | Because it's not even agreeing on what one API said.
00:35:22.240 | It's more so creating a reliable form
00:35:25.480 | of decentralized computation
00:35:27.280 | that can give you a definitive proof of what happened
00:35:29.880 | and not just what one API said.
00:35:31.680 | So for example, some of our most widely used networks
00:35:34.360 | have well over 30 nodes and well over 10 data sources
00:35:37.320 | that are all providing information
00:35:38.880 | about the same type of data.
00:35:40.520 | And then there's consensus on that one piece of data,
00:35:43.840 | which is then written in and essentially given back
00:35:46.280 | into the on-chain code to tell it what happened.
00:35:49.760 | Because you can't really make an agreement
00:35:52.160 | unless you know what happened, right?
00:35:54.200 | If you and me were to make an agreement
00:35:55.560 | and set some contractual conditions,
00:35:57.320 | but our agreement could never know what happened,
00:35:59.840 | it would be completely useless.
00:36:02.560 | However, if you and me made an agreement
00:36:04.280 | and there was another system called an Oracle mechanism
00:36:06.520 | or decentralized Oracle network
00:36:07.920 | that proved what happened definitively,
00:36:10.520 | and you and me pre-agreed
00:36:11.920 | that whatever this mechanism says is what happened,
00:36:15.000 | then we can achieve an entirely new level of automation, right?
00:36:18.680 | We can suddenly say,
00:36:20.480 | there's this piece of on-chain code that's highly reliable.
00:36:23.400 | We can give it millions, billions,
00:36:25.480 | eventually trillions of dollars in value.
00:36:27.760 | And it is controlled by this other system over here
00:36:30.720 | that's also highly reliable
00:36:32.480 | under this configurable set of definitive truth
00:36:34.920 | and decentralization conditions,
00:36:36.400 | which we all agree are sufficiently stringent
00:36:39.360 | to control that much value.
00:36:41.240 | And therefore, the combination
00:36:42.720 | of this tamper-proof on-chain representation of a contract
00:36:46.120 | and this mutually agreed upon definition
00:36:50.080 | of a trigger or a proof system combined
00:36:54.520 | is a hybrid smart contract,
00:36:56.640 | which as you can see probably already
00:36:59.200 | does a lot more than just a contract on-chain, right?
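The "well over 30 nodes and well over 10 data sources coming to consensus on one piece of data" idea can be sketched in a few lines of Python. This is only an illustrative model, not Chainlink's actual protocol: each node aggregates its own sources, the network requires a quorum of reports and takes their median, and a stand-in class plays the role of the on-chain contract that settles against the delivered value. The node count, the median rule, the OptionContract class, and its strike price are all assumptions made up for the example.

```python
# Illustrative sketch of a hybrid smart contract: an off-chain oracle
# network aggregates many node reports into one value, and a simple
# on-chain-style contract settles against that delivered value.
# Node counts, the median rule, and the settlement logic are assumptions
# for illustration, not Chainlink's exact protocol.
from statistics import median

def node_report(sources: list[float]) -> float:
    """Each oracle node independently aggregates its own data sources."""
    return median(sources)

def oracle_network_round(reports: list[float], min_reports: int = 21) -> float:
    """Network-level aggregation: require a quorum, then take the median."""
    if len(reports) < min_reports:
        raise RuntimeError("not enough oracle reports to reach consensus")
    return median(reports)

class OptionContract:
    """Stand-in for on-chain code: holds value, settles on the reported price."""
    def __init__(self, strike: float, collateral: float):
        self.strike, self.collateral = strike, collateral

    def settle(self, reported_price: float) -> float:
        # Pays out the held collateral if the reported price clears the strike.
        return self.collateral if reported_price >= self.strike else 0.0

# 31 nodes, each looking at a handful of data sources (made-up numbers).
reports = [node_report([1999.5, 2000.1, 2000.4]) for _ in range(31)]
price = oracle_network_round(reports)
print(OptionContract(strike=1900.0, collateral=10.0).settle(price))
```

Taking a quorum and a median rather than a single API's answer is what makes the delivered value a consensus about "what happened" rather than a report of what one source said.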
00:37:03.760 | - Can you talk about this consensus mechanism,
00:37:05.520 | which by the way is just fascinating.
00:37:07.480 | So there is the on-chain consensus mechanism
00:37:11.400 | of proof of work and proof of stake.
00:37:14.560 | And then there is this Oracle network consensus mechanism
00:37:19.560 | of what is true.
00:37:22.480 | So how do you, can you compare the two?
00:37:26.920 | Like, how do you achieve that kind of consensus?
00:37:28.800 | How do you achieve security
00:37:30.840 | in integrating data about the world?
00:37:35.440 | In a way that's definitively true,
00:37:39.160 | in a way that is usefully true,
00:37:41.120 | such that we can rely on it in making major agreements
00:37:44.400 | that as you said, involve billions of trillions of dollars.
00:37:48.160 | - Right, so this is the challenging question, right?
00:37:51.800 | This is the challenging problem that Oracle networks,
00:37:56.440 | Oracles, we at Chainlink that we work on
00:37:59.560 | in order to create this definitive truth
00:38:01.720 | to trigger and create hyper automation
00:38:04.280 | in this more advanced form,
00:38:06.520 | more advanced form of hybrid smart contracts.
00:38:09.120 | The reality I think of this problem
00:38:12.280 | is that it is very specific to each use case.
00:38:16.360 | And this is actually how we've architected our system
00:38:20.640 | is in a very flexible way.
00:38:22.640 | So for example, you need an ability for an Oracle network
00:38:27.120 | to grow in the amount of nodes that it has
00:38:30.200 | relative to the value it secures, right?
00:38:33.080 | So if you have an Oracle network
00:38:34.840 | that secures $100,000 in like a beta of a financial product,
00:38:39.840 | maybe it can be fine with only seven nodes
00:38:42.520 | and only two or three data sources, right?
00:38:44.320 | Because the risk to that Oracle network
00:38:47.000 | is relatively low based on the value it secures.
00:38:49.440 | So the first question is actually
00:38:53.040 | how do you scale security relative to value
00:38:57.880 | secured by that Oracle network?
00:38:59.360 | Because it wouldn't be very efficient
00:39:00.720 | to have a thousand nodes securing $100,000 worth of value.
00:39:05.720 | So one of the first questions is how do we properly scale
00:39:09.280 | and how do we compose ensembles of nodes
00:39:12.600 | in a decentralized way where we can know that,
00:39:15.400 | okay, we're going from seven nodes in a network
00:39:17.880 | to 15 to 31 to 57 to 105 to a thousand, right?
00:39:22.880 | So that's one dimension of the problem.
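To make that scaling dimension concrete, here is a toy Python sketch in which the number of oracle nodes grows with the value the network secures, using the node counts mentioned above (7, 15, 31, 57, 105, a thousand). The dollar tiers attached to those counts are invented for illustration; only the general principle, that more value secured means more nodes, comes from the conversation.

```python
# Toy illustration of scaling oracle-network decentralization with the
# value it secures. The node counts (7, 15, 31, 57, 105, 1000) follow the
# numbers mentioned in the conversation; the dollar tiers attached to them
# are invented for illustration only.

TIERS = [                 # (max value secured in USD, nodes required)
    (100_000, 7),
    (1_000_000, 15),
    (10_000_000, 31),
    (50_000_000, 57),
    (250_000_000, 105),
]

def required_nodes(value_secured_usd: float) -> int:
    """Pick a network size that grows with the value being secured."""
    for max_value, nodes in TIERS:
        if value_secured_usd <= max_value:
            return nodes
    return 1000  # beyond the largest tier, fall back to a very large network

print(required_nodes(100_000))     # -> 7
print(required_nodes(40_000_000))  # -> 57
```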
00:39:26.360 | - So you have to be scaling the number of nodes
00:39:28.000 | relative to the value that's derived
00:39:31.640 | from the truth integrated into those nodes.
00:39:34.960 | - Well, that's not the only problem, right?
00:39:36.440 | The other side of this is that you're trying to create
00:39:39.120 | a deterministic result, a deterministic output
00:39:42.440 | from a set of non-deterministic disparate systems,
00:39:44.800 | data sources, or places that prove things.
00:39:47.480 | - Can you also just as an aside, what is an Oracle node?
00:39:51.520 | What is the role of an Oracle node?
00:39:53.880 | - Sure, so an Oracle node essentially exists
00:39:57.680 | in both places, it exists in both worlds.
00:40:00.520 | It exists as an on-chain contract that represents
00:40:04.920 | either an Oracle network or an Oracle node.
00:40:07.400 | So there's an on-chain interface in the form of a contract
00:40:10.320 | that says, I exist to give you this list of inputs.
00:40:15.000 | You can request weather data from me,
00:40:16.880 | you can request price data from me,
00:40:18.880 | you can ask me to send a payment somewhere.
00:40:20.480 | - Like an API, so it's a pointer to a API
00:40:24.440 | that provides truth about this world.
00:40:28.880 | - It's an interface.
00:40:30.040 | So just like an API is an interface for Web 2.0 engineers,
00:40:33.800 | Oracle networks and the contracts that represent them
00:40:39.480 | or individual nodes are the interface
00:40:42.480 | of Web 3.0's use of services.
00:40:45.560 | And services includes all services, data, payment systems,
00:40:51.680 | messaging systems, whatever Web 2.0
00:40:54.040 | or any kind of computing service that you can conceptualize
00:40:58.280 | needs an interface on-chain in the form of a contract
00:41:01.720 | that says, here are the services I can provide for you,
00:41:04.840 | here are the transactions you need to send me
00:41:07.320 | to get back this data or that computation or this result.
00:41:11.280 | And then what you actually see
00:41:12.400 | is that decentralized Oracle networks,
00:41:14.040 | because they're uniquely capable of generating
00:41:17.000 | their own computations in a decentralized way
00:41:19.800 | around the data that they have access to,
00:41:22.400 | you actually see decentralized Oracle networks
00:41:24.760 | generating a lot of these services.
00:41:26.360 | So for example, we have a randomness service,
00:41:29.480 | a verifiable randomness function service
00:41:32.120 | that basically provides randomness on-chain
00:41:35.040 | and that randomness is then used in lotteries
00:41:38.160 | and various other contracts that need randomness.
00:41:40.240 | But that randomness, it's not a piece of data
00:41:42.400 | that comes from somewhere else.
00:41:43.680 | We don't go to another data source and get it,
00:41:46.560 | we generate it within an Oracle node
00:41:49.040 | that then provides it over into Oracle node
00:41:52.120 | or Oracle nodes that provide it
00:41:53.280 | into the contracts themselves.
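Here is a sketch of what an oracle node's service interface might look like, including a randomness service that the node generates itself rather than fetching from a data source. The class and method names are hypothetical, and the hash-commitment scheme below is only a simplified stand-in for a real verifiable randomness function (VRF), which uses cryptographic proofs instead of revealing a secret.

```python
# Simplified sketch of an oracle node's service interface, including a
# randomness service generated inside the node rather than fetched from
# an external data source. The hash-commitment scheme is a stand-in for
# a real verifiable randomness function (VRF), and all method names are
# hypothetical.
import hashlib
import secrets

class OracleNode:
    def get_price(self, pair: str) -> float:
        """Placeholder for fetching and aggregating market data."""
        return {"ETH/USD": 2000.0}.get(pair, 0.0)

    def get_weather(self, city: str) -> float:
        """Placeholder for fetching weather data (rainfall in mm)."""
        return {"Austin": 0.0}.get(city, 0.0)

    def get_randomness(self, request_seed: bytes) -> tuple[int, bytes]:
        """Generate randomness inside the node and return it with a
        commitment a consumer can recheck (a real VRF proof is stronger)."""
        node_secret = secrets.token_bytes(32)
        digest = hashlib.sha256(request_seed + node_secret).digest()
        return int.from_bytes(digest, "big"), node_secret

def verify_randomness(request_seed: bytes, randomness: int, node_secret: bytes) -> bool:
    """Anyone can recompute the hash once the node reveals its secret."""
    digest = hashlib.sha256(request_seed + node_secret).digest()
    return randomness == int.from_bytes(digest, "big")

node = OracleNode()
rnd, proof = node.get_randomness(b"lottery-round-42")
assert verify_randomness(b"lottery-round-42", rnd, proof)
print("randomness:", rnd)
```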
00:41:55.160 | - So why do you say Oracle nodes are non-deterministic?
00:41:58.760 | - Well, they are as far as they come to consensus,
00:42:01.080 | but see, there's this kind of different problem here, right?
00:42:05.720 | The blockchains are very focused
00:42:08.160 | on generating blocks of transactions
00:42:10.720 | within a smaller universe of transaction types,
00:42:13.600 | a certain block size and a certain set of conditions.
00:42:16.880 | And then they have an economic system that says,
00:42:20.640 | I will perpetually generate blocks of this size
00:42:23.280 | with these transaction types
00:42:24.760 | in this kind of limited set of transaction types,
00:42:27.320 | whether those are UTXO transactions
00:42:29.240 | or scripted Solidity or whatever it is.
00:42:31.840 | Oracles and Oracle networks,
00:42:35.400 | we don't have a blockchain, for example,
00:42:36.840 | there is no Chainlink blockchain.
00:42:38.680 | Our goal is not to generate a certain set
00:42:41.920 | of very clearly predetermined transaction types
00:42:45.280 | into a set of transactions that are put into blocks
00:42:48.120 | and it will infinitely be done that way.
00:42:50.720 | Our goal is actually to create what we call a meta layer,
00:42:54.840 | a decentralized meta layer
00:42:56.440 | between the non-deterministic, highly unreliable world
00:43:01.440 | and the highly hyper-reliable world of blockchains
00:43:05.760 | so that the unreliable world can be passed
00:43:08.480 | through this decentralized meta layer.
00:43:10.120 | - And it can coexist with the reliable on-chain world.
00:43:13.920 | - Exactly, it can coexist and in some cases,
00:43:16.000 | the meta layer might generate it.
00:43:17.840 | So the problem in giving you this straight answer
00:43:20.280 | is that there's just such a wide array of services.
00:43:24.200 | If you were to say, well, Sergey,
00:43:25.880 | how do we generate randomness from a data source?
00:43:28.880 | Well, we don't use a data source
00:43:30.040 | to generate the randomness,
00:43:31.320 | that's the type of service that can be generated
00:43:33.200 | in an Oracle network itself.
00:43:35.280 | And so there'll be certain computations
00:43:37.160 | that Oracle networks themselves generate themselves
00:43:40.280 | to augment and improve blockchains.
00:43:43.040 | And it is actually the goal of Oracle networks
00:43:44.600 | to consistently do that.
00:43:45.760 | So if you were to think about the stack
00:43:48.320 | in a very generic high level,
00:43:50.220 | you would see blockchains are databases.
00:43:52.400 | They're basically the data structures
00:43:53.720 | that retain a lot of information
00:43:55.800 | in this transparent, highly reliable form.
00:43:58.400 | Smart contract code is the application logic.
00:44:01.880 | It is the logic under which all of this
00:44:04.240 | kind of activity occurs,
00:44:06.320 | storing data in the data structure, in the blockchain
00:44:09.480 | as a database in a certain conceptualization of it.
00:44:13.720 | And then Oracles and Oracle networks
00:44:15.760 | are all the services that are used by the application code.
00:44:21.840 | So by analogy, let's take Uber.
00:44:24.340 | Uber initially, some core code goes and gets the GPS API
00:44:28.880 | from Google Maps about the user's location,
00:44:31.000 | sends a message to the user through Twilio,
00:44:33.280 | pays the driver through Stripe.
00:44:34.840 | If those services weren't available
00:44:37.760 | to the people who made Uber,
00:44:39.280 | they wouldn't have made Uber, right?
00:44:40.680 | Because they would have written their core code
00:44:42.160 | on some database,
00:44:43.280 | and then they would have had to make a geolocation company,
00:44:45.840 | a telecom messaging company,
00:44:47.680 | and the global payments company.
00:44:49.320 | And they wouldn't have done that because it's too hard.
00:44:51.640 | And that's the weird scenario
00:44:53.460 | that a lot of people in our industry are in.
00:44:55.480 | And that's the problem that Oracles
00:44:57.340 | and Oracle networks fix,
00:44:58.980 | is they provide these decentralized services
00:45:01.960 | to take this developer ecosystem,
00:45:04.920 | the blockchain and smart contract developer ecosystem from,
00:45:08.680 | hey, I can have a database
00:45:10.520 | and write some application logic
00:45:12.120 | about tokenization and voting and private key signing,
00:45:15.000 | all of which is super useful and is a critical foundation.
00:45:18.260 | But now if you just layer on all the world's services,
00:45:21.720 | whether that's market data, weather data, randomness,
00:45:24.560 | suddenly people can build DeFi, fraud-proof gaming,
00:45:27.800 | fraud-proof global trade, fraud-proof ad networks.
00:45:31.200 | And that's why this world of decentralized services
00:45:34.640 | and decentralized Oracle networks is particularly,
00:45:38.000 | in my opinion, important to our industry.
00:45:39.760 | - Yeah, it's funny.
00:45:40.600 | And you talk about how the current
00:45:42.640 | decentralized world, DeFi,
00:45:44.200 | the decentralized services world, is primarily just tokens.
00:45:48.200 | And it's basically just financial transactions.
00:45:51.920 | And the kind of thing,
00:45:53.800 | the reason why it's super exciting,
00:45:55.520 | the kind of thing you're doing with Chainlink
00:45:56.800 | and Oracle networks,
00:45:58.180 | is that you can basically open up
00:46:00.160 | the whole world of services
00:46:01.600 | to this kind of decentralized smart contract world.
00:46:08.920 | I mean, you're talking about just orders of magnitude
00:46:13.920 | greater impact financially and just socially
00:46:18.440 | and philosophically.
00:46:20.440 | Are there interesting near-term
00:46:23.320 | and long-term applications that excite you?
00:46:26.000 | - Yeah, there's a lot that excites me.
00:46:27.680 | And that is how I think about it,
00:46:28.920 | that it's not just about,
00:46:29.880 | we made a decentralized Oracle network.
00:46:31.880 | It's about, we made a decentralized service
00:46:34.120 | or collection of services
00:46:35.400 | that's going from hundreds to thousands
00:46:37.600 | and then people are able to build
00:46:39.000 | the hybrid smart contracts,
00:46:40.440 | which I think will redefine what our industry is about.
00:46:43.200 | Because for example,
00:46:44.040 | for the people that only learned about blockchains
00:46:45.960 | through the lens of NFTs,
00:46:48.080 | they understand blockchains through NFTs,
00:46:50.360 | not through speculative tokens or Bitcoins, right?
00:46:53.160 | And I think that will continue.
00:46:55.800 | I think the use cases that excite me,
00:46:58.280 | they vary between the developed market,
00:47:01.000 | the developed world's economies and emerging markets.
00:47:04.440 | I think in the developed world,
00:47:06.200 | what you will see is that transparency,
00:47:09.160 | creating a level, a new level of information
00:47:12.840 | for how markets work and the risk that is in markets
00:47:16.480 | and kind of the dynamics that put
00:47:19.640 | the global financial system at systemic financial risk,
00:47:22.400 | like 2008.
00:47:23.600 | And my hope is that all of this infrastructure
00:47:25.880 | will soften the boom and bust cycles
00:47:29.960 | by making information immediately available
00:47:32.280 | to all market participants,
00:47:34.000 | which is by the way, what all market participants want,
00:47:36.680 | except for the very, very, very small minority
00:47:39.280 | that are able to game the system to their benefit
00:47:41.800 | and benefit from booms, but avoid busts
00:47:43.640 | because of their asymmetric access to information,
00:47:46.200 | which really everybody should have
00:47:47.680 | and which this technically solves.
00:47:49.840 | I think in the process of doing that,
00:47:51.760 | and which is happening, I think right about now,
00:47:54.320 | you see a polishing of the technology
00:47:56.640 | such that it can be made available to emerging markets.
00:47:59.680 | And on a personal level,
00:48:00.960 | I feel that the emerging markets will benefit much more
00:48:04.800 | from this technology,
00:48:06.120 | just like the emerging markets benefit much more
00:48:08.360 | from the internet or from those $50 Android phones
00:48:12.440 | that people can have,
00:48:13.760 | because it's such a massive shift
00:48:15.960 | in how people's lives work, right?
00:48:17.920 | I have always had access to books and a library,
00:48:20.320 | which has been fantastic and very important,
00:48:23.200 | but there are places in the world
00:48:24.520 | where people don't have libraries,
00:48:26.480 | but now they have the internet and a $50 Android phone,
00:48:29.920 | and they can watch the same Stanford lecture that I watch.
00:48:32.960 | I mean, that's kind of mind-blowing, realistically, right?
00:48:35.800 | They just went from zero to one
00:48:38.200 | in a very, very dramatic way.
00:48:40.300 | I think all of these smart contracts,
00:48:43.440 | and in my case, I think the one
00:48:45.080 | that I seem to keep coming back to is crop insurance,
00:48:48.380 | where partly 'cause it doesn't have
00:48:49.880 | a tokenization component,
00:48:51.000 | partly 'cause it's actually much more important
00:48:52.520 | than it might seem.
00:48:54.300 | - What is crop insurance?
00:48:57.880 | - Right, so, exactly.
00:49:00.640 | So this is the nature of why it's sometimes hard
00:49:03.440 | to see the full value of what our industry does,
00:49:05.320 | because it solves all these kinds of backend problems
00:49:07.480 | that we don't have, right?
00:49:08.520 | So crop insurance is if I own a farm and it doesn't rain,
00:49:12.640 | I get an insurance payout,
00:49:14.280 | so I don't need to close down my farm,
00:49:16.320 | because if it didn't rain, I don't have crops, right?
00:49:19.680 | So people in the developed world can get crop insurance,
00:49:23.800 | and there's all kinds of systems
00:49:25.640 | that basically pay them out,
00:49:26.800 | and then they can argue with the insurance company
00:49:29.600 | if they don't get paid out properly and whatever.
00:49:32.000 | And this allows people to smooth out risk.
00:49:36.120 | In fact, a lot of the global options markets
00:49:38.840 | were about this, right?
00:49:40.320 | They were initially about people selling their produce
00:49:43.500 | or their crops ahead of time,
00:49:45.320 | so that if there was a risk of drought,
00:49:47.400 | they weren't impacted by it, right?
00:49:49.760 | And that's where a lot of options trading
00:49:51.480 | and all this kind of stuff came from,
00:49:53.840 | even though it's now turned into this kind of global casino.
00:49:57.160 | But in the emerging market,
00:49:59.080 | there are literally people that,
00:50:01.400 | if they don't have rain for two seasons,
00:50:03.640 | they need to close down their farm
00:50:05.000 | and become a migrant worker of some kind.
00:50:07.380 | And now they have a $50 Android phone
00:50:10.280 | where they can read Wikipedia,
00:50:12.220 | but they're still decades away from an insurance company
00:50:15.520 | coming to their geography and offering them insurance
00:50:18.400 | because their local legal system
00:50:19.640 | simply doesn't allow that type of thing to exist.
00:50:22.160 | No insurance company is gonna go and create an insurance
00:50:24.640 | entity and offer them insurance
00:50:26.120 | because the levels of fraud
00:50:27.800 | and the ability to resolve that fraud through courts
00:50:29.760 | would just not exist.
00:50:31.400 | So now these people have to wait for decades
00:50:34.280 | to have this very basic form of financial protection
00:50:37.280 | or something like a bank account even.
00:50:39.660 | And with this technology, they don't, right?
00:50:42.120 | So with this technology, if I have a $50 Android phone
00:50:45.680 | and the smart contract has data from satellites
00:50:48.960 | or weather stations about the weather conditions
00:50:51.320 | in the geography that my farm is in,
00:50:53.600 | I can put value into the smart contract
00:50:57.400 | and the smart contract will automatically pay me out,
00:51:00.800 | pay me back out at my Android phone.
00:51:02.800 | And guess what?
00:51:04.200 | I just leapfrogged past my corrupt government
00:51:07.760 | not being able to provide a legal infrastructure
00:51:10.880 | to create insurance.
00:51:12.360 | I just leapfrogged past dealing with insurance companies
00:51:15.160 | that will probably price gouge me and often not pay out.
00:51:19.600 | And I leapfrogged into the world of hyper-reliable
00:51:23.560 | kind of guaranteed smart contract outcomes
00:51:27.600 | that are as good or in many cases better
00:51:29.520 | than what farmers in other parts
00:51:31.480 | of the world have.
00:51:32.760 | And this type of dynamic for the emerging markets
00:51:35.200 | of creating a way for people to control
00:51:37.240 | and manage risk in their economic life,
00:51:39.320 | I think extends way past insurance.
00:51:41.240 | It extends to them having bank accounts
00:51:42.840 | to combat local inflation.
00:51:44.400 | It extends to them being able to sell their goods
00:51:46.720 | on the global free market of global trade
00:51:48.540 | without middlemen.
00:51:49.680 | It extends to all these things
00:51:51.360 | that we don't really care about, right?
00:51:53.160 | Because we're not farmers,
00:51:54.900 | but are unbelievably impactful for people
00:51:57.960 | that don't have a bank account
00:51:59.520 | and their inflation rate in their country is double digits
00:52:03.380 | or their farm completely depends on rain
00:52:06.080 | or their livelihood completely depends
00:52:07.840 | on their ability to sell goods.
00:52:10.160 | And they can't sell those goods
00:52:11.400 | because there's a middleman
00:52:12.760 | who essentially controls all the trust relationships.
00:52:16.200 | But now we have the internet and smart contracts
00:52:18.660 | and that might not have to be the case
00:52:20.860 | in the next five or 10 years.
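A minimal Python sketch of the parametric crop-insurance logic described above, assuming a hypothetical rainfall figure delivered by weather or satellite oracles; the names and numbers are illustrative, not a deployed contract:

```python
# Minimal sketch of the parametric crop-insurance contract described above:
# the farmer locks in a premium, the insurer escrows the payout, and a
# rainfall figure reported by weather/satellite oracles decides who gets paid.
from dataclasses import dataclass

@dataclass
class CropInsurancePolicy:
    farmer: str
    insurer: str
    premium: float              # paid by the farmer into the contract
    payout: float               # escrowed by the insurer
    rainfall_threshold_mm: float
    settled: bool = False

    def settle(self, season_rainfall_mm: float) -> dict:
        """Called once per season with rainfall aggregated by an oracle network."""
        if self.settled:
            raise RuntimeError("policy already settled")
        self.settled = True
        if season_rainfall_mm < self.rainfall_threshold_mm:
            # Drought season: escrowed payout (plus premium) goes to the farmer.
            return {self.farmer: self.payout + self.premium}
        # Normal season: the insurer keeps the premium and recovers its escrow.
        return {self.insurer: self.payout + self.premium}

# Usage with a stubbed oracle report.
policy = CropInsurancePolicy("farmer_wallet", "insurer_wallet",
                             premium=20.0, payout=400.0,
                             rainfall_threshold_mm=150.0)
print(policy.settle(season_rainfall_mm=93.4))  # drought -> pays the farmer
```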
00:52:22.500 | - Yeah, so that definitely has a quality of life impact
00:52:25.580 | on the particular farmer's life,
00:52:27.060 | but I suspect it has a huge,
00:52:30.200 | like down the line ripple effect on the whole supply chain.
00:52:33.940 | So if you think about farmers,
00:52:36.180 | but any other people that produce things
00:52:41.220 | that are part of a large logistics network,
00:52:46.220 | like a supply chain network,
00:52:48.960 | that means when you increase reliability,
00:52:52.600 | you sort of increase transparency and control,
00:52:57.600 | but like where any one node in that supply chain network
00:53:01.920 | can formalize the way it operates
00:53:06.320 | in its agreements with others,
00:53:08.680 | you could just have a very like at scale
00:53:13.680 | transformative effect on how people that down the line
00:53:18.900 | use the services that you provide,
00:53:20.940 | the products that you create operate.
00:53:23.280 | So like, it's almost hard to imagine
00:53:26.740 | the possible ways it might transform the world.
00:53:30.580 | I wonder how much friction there is in the system,
00:53:33.220 | I guess, currently that smart contracts might remove.
00:53:37.580 | That's almost unknown.
00:53:40.720 | You can sort of hypothesize and stuff, but I wonder.
00:53:44.480 | I've seen enough bureaucracy in my life
00:53:47.880 | to know that smart contracts in many cases
00:53:50.440 | would remove bureaucracy.
00:53:52.620 | And I wonder how the world will be
00:53:55.620 | once you remove much of the bureaucracy.
00:53:58.360 | Coming from the Soviet Union,
00:54:00.160 | where I just have seen the life sucked out
00:54:05.720 | of the innovative spirit of human nature by bureaucracy.
00:54:10.720 | I wonder, the kind of amazing world
00:54:14.160 | that could be created once bureaucracy is removed.
00:54:16.660 | - Yeah, I think it's fascinating how the world can evolve.
00:54:22.320 | I think this extends a lot further than people think
00:54:25.280 | into many, many different parts of the global economy.
00:54:28.040 | It might start with NFTs for art,
00:54:30.920 | or it might start with DeFi, right?
00:54:33.280 | Or it might start with fraud-proof ad networks next.
00:54:36.600 | We don't know what it's going to go to next.
00:54:39.320 | But I think the implication of people being
00:54:42.440 | in a system of contracts that holds them accountable
00:54:46.360 | and guarantees contractual outcomes,
00:54:48.960 | regardless of a local legal system,
00:54:51.240 | is something that I think extends to the supply chain.
00:54:53.960 | You can prove that goods were sourced in an ethical way,
00:54:57.720 | and you can prove that in a way that can't be gamed.
00:55:00.200 | That'll change buying power and supplier power
00:55:03.480 | and how people produce goods that we all consume.
00:55:06.760 | And then on the political level,
00:55:08.840 | I personally think that in a number of decades,
00:55:11.760 | we could literally be in a place where politicians
00:55:14.840 | can commit to a certain set of smart contract
00:55:18.700 | kind of budget definitional kind of results.
00:55:22.280 | For example, we discovered oil.
00:55:24.540 | I promise as a politician, I'm going to take the oil
00:55:26.840 | and I'm going to redistribute it to all of you.
00:55:29.200 | Well, that's wonderful.
00:55:30.440 | That's a great idea.
00:55:31.640 | Sounds very nice when you're running for office.
00:55:34.240 | Why don't we codify that in a smart contract?
00:55:36.760 | And why don't we put those conditions
00:55:38.760 | very solidly on a blockchain?
00:55:40.680 | And then once you've been elected,
00:55:44.040 | we'll just turn that one on,
00:55:45.760 | and it'll distribute the money just like you said,
00:55:48.880 | and everything will be fine.
00:55:50.520 | I personally think that this new level of systems
00:55:53.880 | that allows trustworthy collaboration between everybody,
00:55:57.160 | between supply chain partners, ad network users,
00:55:59.680 | the financial system, insurance companies, and farmers,
00:56:04.080 | all of these are just interactions
00:56:06.480 | that require a trusted entity,
00:56:09.520 | or in this case, a trusted piece of code
00:56:12.200 | to orchestrate the interaction
00:56:14.440 | in the way that everyone agrees.
00:56:17.480 | - Yeah, one of the things that makes the United States
00:56:20.200 | fascinating is the founding documents.
00:56:23.080 | And it's fascinating to think of us moving into the new
00:56:26.000 | in the 21st century to a digital version of that.
00:56:30.240 | So the constitution, a smart constitution,
00:56:33.320 | no offense to the paper constitution,
00:56:36.320 | but, and that would have transformative effects
00:56:39.800 | on politicians and governments,
00:56:42.160 | holding people accountable.
00:56:45.640 | Oh man, that's so exciting to think that
00:56:52.080 | we might enforce accountability
00:56:57.080 | through the smart contract process.
00:57:00.960 | - Exactly, why can't that happen?
00:57:02.440 | Anything that we could codify into a smart contract,
00:57:05.720 | and anything that we all agree
00:57:07.200 | is the way the world should work.
00:57:09.160 | And then anything that we can get proof about, right?
00:57:12.320 | Anything that a system somewhere could tell us happened,
00:57:15.840 | those are the pieces of the puzzle, right?
00:57:19.060 | We need a trusted piece of code,
00:57:21.120 | we need to have agreement
00:57:23.040 | that that's how the world should work,
00:57:24.580 | and we need a system that'll tell
00:57:26.040 | that trusted piece of code what happened.
00:57:28.480 | As long as we have those three things,
00:57:30.760 | we can theoretically codify any set of agreements
00:57:34.040 | about anything where those three properties take hold.
00:57:39.040 | - I wonder if you could apply that
00:57:40.920 | to like military conflict and so on.
00:57:43.800 | Recently, Biden announced that we're going to pull off
00:57:48.080 | from Afghanistan after 20 years in the war.
00:57:52.480 | I wonder, there's a lot of debacles
00:57:56.200 | around war in Afghanistan and invasion of Iraq,
00:57:59.880 | all those kinds of things.
00:58:01.000 | I wonder if that was instead formulated as a smart contract.
00:58:04.700 | Like that might have actually huge impact
00:58:11.000 | on the way we do conflict.
00:58:12.640 | So you think of a smart contract
00:58:15.360 | as a kind of win-win situation,
00:58:18.040 | where you're doing like financial transactions
00:58:20.040 | or something like that.
00:58:21.560 | But you could see that also about military conflict
00:58:25.360 | or like whenever two nations are at tension with each other,
00:58:30.080 | different scales of conflict,
00:58:32.040 | that you could have conflict codified.
00:58:35.260 | And that would potentially resolve conflict much faster
00:58:40.960 | because there's honesty, transparency,
00:58:42.840 | and control within that conflict.
00:58:44.200 | 'Cause there's conflict in this world.
00:58:46.240 | And again, very, very inspiring to think about
00:58:50.760 | the kind of effects it might have
00:58:52.920 | on the negative kinds of contracts,
00:58:56.400 | on the tense, painful kinds of contracts.
00:58:59.600 | - I haven't thought about that as much
00:59:01.280 | It's actually kind of scary,
00:59:02.240 | the stuff you're thinking through now
00:59:04.040 | with like the war contracts or something.
00:59:06.080 | You know, that's not in the white paper.
00:59:07.400 | We don't have anything about war contracts or anything.
00:59:09.640 | - Again, this is the Russian, we're both Russian,
00:59:12.560 | but I'm a little more Russian on the suffering side.
00:59:15.320 | Maybe I read way too much Dostoevsky
00:59:17.240 | and military kind of ideas.
00:59:19.400 | But anyway, holding politicians accountable in all forms,
00:59:24.400 | I think is really powerful.
00:59:26.200 | Is there something you could say as a small aside
00:59:29.200 | on how smart contracts actually work?
00:59:32.760 | If we look at the code,
00:59:33.600 | is there some nice way to say technically,
00:59:37.160 | what is a smart contract?
00:59:39.020 | What does it mean to codify these agreements,
00:59:41.640 | the actual process for people
00:59:42.920 | who might not at all be familiar?
00:59:45.400 | - I think you just write it into code
00:59:48.640 | that operates in this kind of decentralized infrastructure.
00:59:52.560 | You usually write code
00:59:54.160 | that runs in a central server somewhere.
00:59:56.160 | Now you write code that runs
00:59:58.160 | across a lot of different machines
00:59:59.640 | in this decentralized way.
01:00:01.440 | And then after you write it, you need services.
01:00:04.440 | And that's where Oracles come in.
01:00:05.560 | They provide all the services.
01:00:07.200 | So just like you would be writing code in Web 2.0 land,
01:00:10.360 | running it on a server somewhere and using an API,
01:00:12.800 | here you'd be writing code,
01:00:14.880 | putting it on a decentralized infrastructure
01:00:16.640 | like a blockchain or a smart contract platform like Ethereum.
01:00:20.120 | And then you would be using various services
01:00:22.640 | in the form of Oracle.
01:00:23.620 | So they'll just be called Oracles
01:00:24.960 | or decentralized services instead of APIs.
01:00:27.400 | And you're basically composing the same type of architecture,
01:00:31.120 | except it's hyper reliable.
01:00:33.120 | At the moment, it's a little bit less efficient
01:00:34.760 | because we're at an early stage in our industry.
01:00:39.160 | But it provides this extreme level of reliability
01:00:43.200 | and transparency, which for certain use cases
01:00:46.240 | is an absolute critical component
01:00:48.720 | and is completely reinventing how they work.
01:00:50.840 | So I think people should look at what are the use cases
01:00:53.920 | where that trust dynamic can be so heavily improved.
01:00:56.520 | And that's probably the ones
01:00:57.720 | where this is maybe initially useful.
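A hedged sketch of that hybrid pattern in Python: deterministic application logic consuming a decentralized oracle service instead of a Web 2.0 API, mimicked here by taking the median of several independent node reports (illustrative names, not a real framework):

```python
# The "hybrid" pattern described above: on-chain application logic that pulls
# its inputs from an oracle service rather than calling an API directly.
# Decentralization is mimicked by aggregating several independent node reports.
from statistics import median
from typing import Callable, List

def decentralized_price_feed(node_reports: List[float]) -> float:
    """Aggregate independent oracle node answers into one on-chain value."""
    if not node_reports:
        raise ValueError("no oracle reports received")
    return median(node_reports)

def settle_option(strike: float, feed: Callable[[], float]) -> str:
    """On-chain application logic: pure, deterministic, auditable."""
    observed = feed()
    return "pay_buyer" if observed >= strike else "pay_seller"

# Usage: three hypothetical nodes report slightly different prices; the
# contract settles on their median rather than trusting any single source.
reports = [1841.9, 1842.3, 1839.8]
outcome = settle_option(strike=1800.0,
                        feed=lambda: decentralized_price_feed(reports))
print(outcome)  # -> "pay_buyer"
```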
01:00:59.560 | - But I mean, just to emphasize,
01:01:01.680 | I don't think people realize when you say code
01:01:04.360 | that we're talking about an actual, non-obfuscated program.
01:01:09.360 | You can read it, you can understand it.
01:01:11.960 | And there's something about,
01:01:14.120 | maybe this is my computer science perspective
01:01:17.320 | of like software engineering perspective,
01:01:19.640 | but there's something about the formalism
01:01:21.320 | of programming languages, which enforces simplicity,
01:01:26.320 | clarity and transparency.
01:01:28.040 | And because it's visible to everybody,
01:01:31.960 | I mean, simplicity is enforced.
01:01:34.080 | There's something about natural language,
01:01:36.600 | like language as written in the constitution, for example,
01:01:40.780 | where there's so many interpretations.
01:01:43.220 | With the nice thing about programs,
01:01:46.960 | there's not going to be a huge number of books written
01:01:51.120 | about what was meant by this particular line
01:01:54.160 | 'cause it's pretty clear.
01:01:55.560 | Like programming languages have a clarity to them
01:01:58.880 | that natural language does not,
01:02:01.480 | and they don't have ambiguity.
01:02:04.340 | Which I think it's important to pause on
01:02:06.740 | 'cause it's really powerful.
01:02:08.260 | It's really difficult to think about.
01:02:09.900 | I think we live in a world where all the philosophers
01:02:12.340 | and legal minds don't know how to program.
01:02:15.340 | So I think, not all, most don't.
01:02:18.620 | And so we don't often see the philosophical impact
01:02:22.820 | of this kind of idea that the agreements between humans
01:02:26.580 | can be written in a programming language.
01:02:30.020 | That's a really transformative idea.
01:02:32.760 | That, I mean, yeah, it's an idea that's not just technical,
01:02:37.760 | it's not just financial, it's philosophical.
01:02:41.020 | It's rethinking human nature from a digital perspective.
01:02:44.420 | Like what is human civilization?
01:02:48.100 | It's interaction between humans.
01:02:49.800 | And rethinking that interaction as a digital interaction
01:02:55.860 | that is managed by programming languages,
01:02:58.100 | by programs, by code.
01:03:00.860 | I mean, that's fascinating.
01:03:03.320 | That we'll look back at this time potentially
01:03:05.760 | as one where us little descendants of apes
01:03:10.120 | did not realize how important this moment in history is.
01:03:15.120 | Like human beings might be totally different
01:03:18.560 | a century from now because we codified
01:03:23.640 | the interaction between humans.
01:03:25.840 | That might have more of an impact
01:03:27.440 | than anything else we do today.
01:03:30.360 | You think about the impact of the internet,
01:03:31.780 | one of the cool things is digitization of data.
01:03:35.020 | But we have not yet integrated the tools,
01:03:38.100 | the mechanisms fully that use that data
01:03:42.340 | and interact with humans yet.
01:03:43.860 | And that's what smart contracts do.
01:03:45.840 | I wonder if you think about the role
01:03:47.620 | of artificial intelligence in all of this.
01:03:50.180 | Because your smart contracts are kind of agreements,
01:03:54.940 | maybe you disagree with this,
01:03:57.620 | but at least the way I'm thinking about it
01:03:59.860 | is agreements between humans or groups of humans.
01:04:03.060 | But it seems like because everything's operating
01:04:07.220 | in the digital space that you can integrate
01:04:10.620 | non-humans into this.
01:04:12.960 | Or AI systems that help out humans, managed by humans.
01:04:17.580 | Like what do you think about a world
01:04:19.980 | of hybrid smart contracts,
01:04:25.500 | codifying agreements between hybrid
01:04:30.500 | intelligent being networks of humans and AI systems?
01:04:34.720 | - Yeah, yeah, I think that makes perfect sense.
01:04:39.080 | In terms of AI, I'm not an expert, right?
01:04:42.300 | So it might be a bit simplistic or naive,
01:04:45.300 | my ideas in this field.
01:04:48.380 | I think everyone saw the Terminator movie, right?
01:04:51.720 | Everybody kind of saw the Terminator movie in the '90s.
01:04:54.980 | And it was like, this is really scary.
01:04:57.500 | I personally think AI is amazing.
01:05:00.020 | It makes perfect sense.
01:05:02.140 | I think it will evolve to a place where people have...
01:05:06.380 | Just to understand, I work in the world of trust issues.
01:05:09.460 | I work in the world of how can technology solve trust
01:05:12.940 | and collaboration issues using encryption,
01:05:16.040 | using cryptographically guaranteed systems,
01:05:18.260 | using decentralized infrastructure, right?
01:05:20.020 | So that's the world that I've been inhabiting
01:05:22.220 | for many, many years now,
01:05:24.780 | building smart contracts for seven or eight,
01:05:26.460 | doing stuff before that.
01:05:27.480 | It's kind of what I'm focused on.
01:05:29.320 | So I view AI through that same lens.
01:05:32.900 | And my brain naturally asks,
01:05:34.260 | well, what is the trust issue that people might have with AI?
01:05:37.860 | And my natural kind of response is,
01:05:40.580 | well, let's say AI continues to be built and improve.
01:05:44.820 | At some point, I have no clue where we are on this now.
01:05:47.980 | I've seen different ideas that were very far from this.
01:05:51.020 | I've seen other ideas were very close to this.
01:05:52.820 | At a certain point, we'd arrive at a place with AI
01:05:55.460 | where we would be a little bit worried
01:05:57.840 | about just how much it could do, right?
01:05:59.600 | We might be worried that AI could do things
01:06:02.180 | we don't want it to do,
01:06:04.240 | but we still want to give AI
01:06:06.300 | a level of control over our lives, right?
01:06:08.580 | So in my world, that's a trust issue.
01:06:11.660 | And the way that that trust issue
01:06:13.140 | would be solved with blockchains
01:06:14.980 | is actually very straightforward.
01:06:16.820 | And I think in its simplicity, quite powerful.
01:06:19.780 | You could have an AI that has an ability to do
01:06:23.420 | and control key parts of your and our lives, right?
01:06:28.420 | But then you could limit it with private keys
01:06:32.060 | and blockchains and create certain guardrails
01:06:35.800 | and firm kind of walls and limits
01:06:38.800 | to what the AI could never go past,
01:06:44.260 | assuming that encryption, right?
01:06:46.960 | That encryption continues to work, right?
01:06:49.000 | And assuming that if it's not that AI's specialization
01:06:53.060 | to break encryption, that it wouldn't be able to do that.
01:06:56.180 | So if you have an AI that controls
01:06:59.220 | something very important, whatever it is,
01:07:01.460 | shipping or something in defense
01:07:03.560 | or something in the financial system, whatever it is,
01:07:07.380 | but you're sitting there and you're kind of worried,
01:07:09.060 | hey, this thing is unbelievable.
01:07:11.540 | It's coming up with things
01:07:13.060 | we wouldn't have thought of in a hundred years,
01:07:14.980 | but maybe it's a little too unbelievable.
01:07:17.900 | How do you limit it?
01:07:19.680 | Well, if you bake in private keys
01:07:22.720 | and you bake in these kind of blockchain based limitations,
01:07:27.020 | you can create the conditions
01:07:28.940 | beyond which an AI could never act.
01:07:31.640 | And those could once again be codified
01:07:33.500 | in the very specific unambiguous terms
01:07:36.700 | in which you described,
01:07:38.040 | which once again, in my trust issue focused world
01:07:41.660 | would solve the trust issue for users
01:07:45.020 | and make them comfortable with using the AI
01:07:47.860 | or ceding control to the AI,
01:07:50.780 | which I think in more advanced versions of AI
01:07:53.860 | will continue to be a concern, right?
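A toy Python sketch of the guardrail idea just described, with hypothetical names, and with an HMAC standing in for the public-key signatures a blockchain would actually use:

```python
# Toy sketch of the guardrail idea above: an AI agent can only act through a
# contract that enforces hard, codified limits, and anything outside those
# limits requires a signature from a human-held key the AI never possesses.
# The "signature" here is an HMAC stand-in for real public-key cryptography.
import hmac
import hashlib

class GuardrailContract:
    def __init__(self, human_key: bytes, daily_limit: float):
        self._human_key = human_key
        self.daily_limit = daily_limit
        self.spent_today = 0.0

    def _human_approved(self, action: str, signature: bytes) -> bool:
        expected = hmac.new(self._human_key, action.encode(), hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

    def execute(self, action: str, amount: float, signature: bytes = b"") -> str:
        within_limits = self.spent_today + amount <= self.daily_limit
        if within_limits or self._human_approved(action, signature):
            self.spent_today += amount
            return f"executed: {action} ({amount})"
        return f"blocked: {action} exceeds codified limits and lacks human sign-off"

# Usage: small actions pass freely; a large transfer is refused unless a human
# signs it with the offline key.
human_key = b"held-offline-by-a-person"
vault = GuardrailContract(human_key, daily_limit=100.0)
print(vault.execute("rebalance portfolio", 40.0))
print(vault.execute("wire everything to unknown address", 10_000.0))
sig = hmac.new(human_key, b"wire everything to unknown address", hashlib.sha256).digest()
print(vault.execute("wire everything to unknown address", 10_000.0, signature=sig))
```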
01:07:55.860 | - This is fascinating.
01:07:57.100 | So smart contracts actually provide a mechanism
01:07:59.260 | for human supervision of AI systems.
01:08:01.600 | - With encryption, very encryption heavy.
01:08:04.580 | So it's not about like, is it smarter than us?
01:08:07.320 | It's about, will the encryption hold up?
01:08:09.860 | - Yep.
01:08:10.780 | So that's based on the assumption that encryption holds up.
01:08:15.060 | I think that's a safe assumption
01:08:16.340 | we can get into that whole discussion,
01:08:18.100 | but from quantum computing,
01:08:19.940 | but cracking encryption is very difficult.
01:08:22.020 | That's a whole nother discussion.
01:08:23.700 | I think we're on the safe ground for quite a long time
01:08:27.660 | assuming encryption holds.
01:08:29.160 | There's a space that is at the cutting edge
01:08:36.820 | of general intelligence research in the AI community,
01:08:41.380 | which is the space of program synthesis
01:08:44.620 | or AI generating programs.
01:08:47.300 | So that's different than what you're referring to
01:08:50.220 | is AI being able to generate smart contracts.
01:08:54.720 | And that to me is kind of fascinating
01:08:59.060 | to think of especially two AI systems between each other
01:09:05.340 | generating contracts,
01:09:07.020 | sort of almost creating a world
01:09:10.740 | where most of the contracts are between non-human beings.
01:09:15.460 | - I think an AI system, as I think about it,
01:09:17.820 | and once again, this is not my field.
01:09:19.340 | This is something I might watch a YouTube video on
01:09:21.340 | or just see something interesting about at some point.
01:09:24.040 | I think if I were to just reason through it, even now,
01:09:28.400 | I think the highly deterministic
01:09:30.980 | and guaranteed nature of smart contracts
01:09:34.420 | would probably be preferable to an AI
01:09:37.940 | because I'm guessing an AI would have a lot of problems
01:09:41.860 | with dealing with the human element
01:09:43.780 | of how contracts work today.
01:09:45.660 | So an AI, for example,
01:09:47.380 | couldn't pick up the phone and call Dave at a bank
01:09:52.060 | to do a derivative and kind of discuss with Dave
01:09:54.860 | and have a call with him and kind of have a conversation
01:09:57.340 | and get him comfortable and tell him it's gonna be fine
01:10:00.380 | and kind of smooth out all the weird social cues
01:10:03.680 | that have to do with making certain derivatives.
01:10:06.260 | I'm assuming that that's a pretty complicated
01:10:09.980 | neural net AI kind of problem.
01:10:13.260 | Yeah, so if I think about it,
01:10:17.320 | the deterministic guaranteed nature of smart contracts
01:10:20.340 | probably would, and assuming they're accessible to AIs,
01:10:25.180 | could actually, interestingly enough, be the format
01:10:28.620 | that they prefer to codify their relationship
01:10:32.420 | with non-AI systems and very possibly other AI systems
01:10:37.420 | because it is very, I mean, it's pretty guaranteed, right?
01:10:42.700 | All the other types of contracts that an AI
01:10:46.740 | could go out there and seek to do
01:10:49.780 | would require some language processing around the law.
01:10:55.660 | And I think, I don't know if this is a term,
01:10:59.540 | but probably not, a smart AI or a good AI
01:11:01.700 | or whatever the term is for a high-quality AI
01:11:04.660 | would probably realize some of the limitations
01:11:08.100 | and the risks.
01:11:09.260 | - Yeah, yeah, AI definitely dislikes ambiguity
01:11:12.060 | and would prefer the deterministic nature
01:11:15.740 | of smart contracts.
01:11:17.540 | I do wonder about this particular problem
01:11:19.700 | and maybe you could speak to it
01:11:21.340 | of how smart contracts can take over certain industries
01:11:26.340 | in a sense or how certain industries can convert
01:11:30.400 | their sets of agreements into smart contracts,
01:11:34.180 | which is, you mentioned sort of talking to Dave
01:11:36.380 | from the bank, you know, many of our laws,
01:11:39.940 | many of our agreements are currently
01:11:43.300 | through natural language, through words.
01:11:46.900 | And so there is a process of mapping that has to occur
01:11:50.340 | in order to convert the legal agreements,
01:11:53.820 | legal contracts of today to smart contracts
01:11:58.300 | that by the way, AI may be able to help with.
01:12:01.080 | But by way of question,
01:12:03.840 | how do you think we convert the legal contracts
01:12:08.160 | on which many industries currently function today
01:12:11.040 | or not even legal contracts,
01:12:13.560 | but ambiguous kind of agreements,
01:12:16.000 | maybe they're loose sometimes,
01:12:17.840 | into more formal deterministic agreements
01:12:21.480 | that are represented by smart contracts?
01:12:25.060 | - So I think there's two, maybe two sides to this.
01:12:29.420 | I think the first one is actually not a huge problem
01:12:32.980 | where you have things like
01:12:34.120 | the master agreement for derivatives
01:12:36.000 | or you have these agreements
01:12:37.620 | that basically already reference a system somewhere, right?
01:12:41.100 | Like for example, many legal agreements
01:12:43.260 | already accept e-signature.
01:12:45.060 | And so they're saying,
01:12:45.900 | hey, I'm gonna use this computing system over here
01:12:48.060 | around signatures and I'm gonna consider,
01:12:50.140 | and there's laws around that.
01:12:51.340 | And there's clauses that say e-signature
01:12:53.220 | is good enough for this agreement.
01:12:55.140 | I actually don't think this is a big problem
01:12:57.300 | for the vast majority of legal agreements
01:12:59.820 | that use systems already, right?
01:13:02.080 | So what you'll do is you'll swap out one repository
01:13:05.500 | or one set of system of contract settlement.
01:13:09.180 | And you'll just say, hey,
01:13:10.020 | this blockchain system over here
01:13:11.540 | is my new system of contract settlement.
01:13:13.580 | Whatever it says is the state of the agreement
01:13:16.860 | instead of the centralized system over there, right?
01:13:20.220 | And so there's actually a huge amount of agreements
01:13:22.320 | that are already able to do that.
01:13:24.900 | And I think we'll do that.
01:13:26.740 | I think there's another side to your question,
01:13:28.900 | which is the amount of agreements that are very ambiguous
01:13:33.900 | that can be turned into smart contracts.
01:13:36.140 | And I think the limitation there is twofold.
01:13:39.300 | First of all, like you said earlier,
01:13:41.780 | the highly reliable smart contract
01:13:44.140 | and the lack of opaqueness
01:13:45.900 | and the clarity of smart contracts
01:13:49.500 | is very high and very powerful and very clear.
01:13:53.700 | And it's, in my opinion,
01:13:54.840 | going to be much, much easier to take a smart contract
01:13:58.400 | and turn it into a set of natural language explanations
01:14:01.720 | and just say, hey, this is what this does, right?
01:14:05.680 | So I think that many contracts are,
01:14:08.320 | and even now in decentralized finance and DeFi
01:14:11.040 | and in decentralized insurance,
01:14:12.060 | they're basically being rebuilt in this format.
01:14:14.880 | And that rebuilding will make them clearer, like you said,
01:14:18.240 | and then restating those in natural language
01:14:20.280 | and explaining to people,
01:14:21.120 | well, you know, whether there's this,
01:14:22.800 | I think it'll actually be a lot simpler
01:14:24.400 | to explain to people what the contract is about.
01:14:26.080 | - That's fascinating.
01:14:27.240 | Mapping smart contracts into natural language,
01:14:29.240 | I didn't even think about that.
01:14:30.280 | So that's, you're saying that's doable
01:14:33.440 | and natural and easy to do.
01:14:35.980 | - Because there's so much clear, right?
01:14:37.400 | There's that forced clarity that you talked about.
01:14:40.080 | I think the second aspect of this problem
01:14:42.860 | is the nuance around what contracts
01:14:45.680 | can be made unambiguous.
01:14:47.700 | And I think that comes down to,
01:14:49.320 | often comes down to proving what happened,
01:14:52.000 | which is where Oracle networks
01:14:53.280 | and decentralized Oracle networks
01:14:54.600 | and Chainlink would come in.
01:14:56.120 | And our experience there is quite extensive
01:14:58.760 | over the many years that we've worked
01:15:00.120 | on many different contract types.
01:15:02.360 | I think what it fundamentally comes down to
01:15:05.600 | is whether there is data.
01:15:07.680 | So we're not gonna be able to make a hybrid smart contract
01:15:11.560 | about whether somebody painted your house
01:15:13.280 | the right color blue.
01:15:15.080 | We're just not gonna be doing that
01:15:16.380 | because there's no data feed that tells us
01:15:18.480 | that your house was painted blue
01:15:19.760 | or that it was the right color of blue.
01:15:22.440 | Unless somebody sets up a drone
01:15:24.160 | with a color analysis tool and they generate that data.
01:15:28.200 | - Which by the way, it could be possible, right?
01:15:29.960 | There could be, if there's enough demand
01:15:32.240 | then the service would be created
01:15:33.800 | that has drones flying around
01:15:35.500 | that's telling you about the colors of,
01:15:38.040 | all those kinds of stuff.
01:15:38.920 | So if there's actual demand that that will be created
01:15:41.560 | and because there'll be value to connect that data feed
01:15:44.640 | to the smart contracts and so on.
01:15:46.360 | - I think you have it unbelievably right
01:15:48.660 | because there are already insurance companies
01:15:51.060 | that use drones to monitor construction sites from overhead
01:15:54.380 | and see how many people are wearing hard hats.
01:15:57.140 | And if the percentage of people wearing hard hats
01:15:59.100 | isn't sufficiently high, then the policy is voided.
01:16:02.700 | And so in that case, there is a data source
01:16:05.260 | and that data source can be put
01:16:06.660 | into a hybrid smart contract.
01:16:08.340 | So the limitation of hybrid smart contracts is,
01:16:11.020 | is there a data source or a set of data sources
01:16:13.820 | to create definitive truth,
01:16:15.640 | to settle the contract and eliminate ambiguity.
01:16:18.940 | And then as you said, I think as people realize
01:16:22.520 | that smart contracts are a format
01:16:24.800 | in which they can form agreement
01:16:26.580 | about things like that insurance product around,
01:16:29.300 | how many people are wearing hard hats.
01:16:31.200 | If I'm the construction site owner,
01:16:33.220 | well, I would really like a guarantee
01:16:35.660 | that your insurance policy is gonna pay me out
01:16:38.020 | if everyone is wearing hard hats.
01:16:40.780 | And in that case, there is demand for the data
01:16:44.580 | and people will generate the data.
01:16:46.540 | And I actually think the insurance industry
01:16:48.500 | is interestingly a precursor of this
01:16:50.120 | because they're so data driven.
01:16:51.680 | You already see insurance companies paying IoT companies
01:16:56.080 | to put data sensors into their customers' infrastructure
01:16:59.120 | at the cost of the insurance company
01:17:01.360 | to generate the data that the insurance company uses
01:17:03.880 | to make a policy for the customer.
01:17:05.820 | So you basically already have people
01:17:08.280 | who really want to price data into their agreements
01:17:12.240 | when they're of sufficiently high value
01:17:14.220 | paying for their own customers
01:17:17.240 | to get data sensors into their infrastructure.
01:17:20.640 | And I think as smart contracts
01:17:22.680 | become more of a requested format
01:17:25.540 | or data driven contracts become more of a format,
01:17:28.280 | there will be a growing demand
01:17:31.200 | about proving what happened through data.
01:17:33.720 | - So it'd be motivating totally new data feeds being created.
01:17:36.880 | By the way, the insurance industry broadly,
01:17:40.280 | the revolutions there will be huge.
01:17:42.840 | I've worked quite a bit with autonomous vehicles,
01:17:44.780 | semi-autonomous and just vehicles in general.
01:17:47.780 | The insurance industry there, by the way,
01:17:49.780 | makes a huge amount of money,
01:17:51.620 | but is using very crappy data feeds.
01:17:54.520 | Revolutionizing how... like, by crappy, I mean very crude.
01:18:01.180 | Like literally the insurance is based on things like age,
01:18:06.340 | gender, like basic demographic information
01:18:09.140 | as opposed to really high resolution information
01:18:13.380 | about you as an individual,
01:18:15.320 | which you may or may not want to provide.
01:18:18.200 | So you can choose from an individual perspective
01:18:20.360 | to provide a data feed.
01:18:21.420 | And there, like the power of insurance
01:18:26.420 | to enable the individual, to empower the individual
01:18:35.420 | could be huge because ultimately smart contracts
01:18:38.380 | motivate the use of data, the creation of new data feeds,
01:18:42.220 | but grounding whatever service it provides in truth
01:18:47.220 | as opposed to some kind of very loose notion of who you are.
01:18:52.740 | So that, again, not sure how that would change things,
01:18:57.220 | but in terms of the fundamental experience of life,
01:19:02.220 | 'cause I think we all rely on insurance,
01:19:04.100 | not just in business, but in life.
01:19:06.200 | And grounding that insurance in more and more
01:19:10.660 | accurate representation of reality
01:19:13.340 | might just have transformative effects on society.
01:19:15.940 | - Well, just to mention one quick thing that you said,
01:19:18.860 | where I noticed another trust issue,
01:19:20.580 | you said the user might not want to share their data.
01:19:23.060 | - Yes.
01:19:23.900 | - So what you could actually do,
01:19:25.140 | and what we've already worked on is
01:19:26.980 | you can have a smart contract that holds the data
01:19:30.220 | and evaluates the data of the user
01:19:32.900 | without sharing it with the insurance company.
01:19:35.540 | And the insurance company knows that the smart contract
01:19:38.100 | will evaluate it according to the policy,
01:19:40.380 | so they don't need the data.
01:19:41.900 | And the user can provide the data
01:19:45.300 | knowing it'll never touch the insurance company
01:19:47.220 | because it's only provided to the smart contract.
01:19:49.900 | And suddenly you've solved another trust issue
01:19:52.500 | because the autonomous piece of code
01:19:54.580 | can evaluate information separately from the interests
01:19:58.300 | of both of the counterparties.
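A toy sketch of that pattern, with Python scoping standing in for the trusted-hardware or cryptographic isolation a real deployment would need; all names are illustrative:

```python
# Toy sketch of the pattern just described: an impartial piece of code
# evaluates the user's raw data against the insurer's policy and returns only
# the result, so neither party sees the other's inputs.
from typing import Callable, Dict

def confidential_evaluation(user_data: Dict[str, float],
                            policy: Callable[[Dict[str, float]], float]) -> float:
    """Runs inside the 'trusted' contract environment; only the premium leaves."""
    return policy(user_data)

# Insurer publishes a policy function but never receives the raw telemetry.
def driving_policy(data: Dict[str, float]) -> float:
    base = 1000.0
    base -= 300.0 * min(data["safe_braking_rate"], 1.0)
    base += 50.0 * data["hard_accelerations_per_week"]
    return round(base, 2)

# User submits telemetry to the contract, not to the insurer.
telemetry = {"safe_braking_rate": 0.92, "hard_accelerations_per_week": 3.0}
print("quoted premium:", confidential_evaluation(telemetry, driving_policy))
```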
01:20:00.380 | And so this is the recurring theme.
01:20:02.080 | I think you're seeing this recurring theme
01:20:03.620 | where there's a trust issue,
01:20:05.320 | people can't use a system, they can't collaborate,
01:20:07.880 | they can't share information
01:20:09.320 | that would make a better agreement for both of them,
01:20:11.280 | they can't solve a risk in their daily life,
01:20:14.680 | they can't participate in a market,
01:20:16.300 | they can't have a bank account
01:20:17.580 | 'cause nobody will give it to them
01:20:18.660 | 'cause they can't give it to them in that legal system.
01:20:22.040 | And once you have an autonomous piece of code
01:20:25.380 | that can also know what's going on,
01:20:27.980 | thanks to Oracle Networks and that combination of the code
01:20:30.620 | and the Oracle Network for the hybrid smart contract,
01:20:34.260 | the same pattern just recurs.
01:20:36.460 | It's really the same pattern.
01:20:38.200 | And this is why I keep saying trust issues.
01:20:40.720 | It's because I basically,
01:20:43.080 | almost every contractual trust issue that I see
01:20:45.780 | where there is a piece of data to prove
01:20:47.620 | and settle the trust issue
01:20:49.740 | in a way that works for both parties,
01:20:51.620 | there is no reason not to use an autonomous,
01:20:55.840 | highly reliable contract and piece of code.
01:20:59.600 | And I have to tell you,
01:21:01.240 | I've seen this in a lot of different industries.
01:21:04.080 | I've seen it insurance, ad networks,
01:21:07.120 | global finance, global trade.
01:21:09.200 | Those are all multi-trillion dollar industries.
01:21:11.680 | And then there are other smaller industries.
01:21:14.040 | Like even one of the first smart contracts
01:21:16.240 | we worked on many years ago
01:21:17.780 | was for search engine optimization firms,
01:21:20.660 | where they would tell you,
01:21:21.640 | "Hey, I'm gonna raise your search engine ranking.
01:21:24.120 | "Give me the money."
01:21:25.520 | And people wouldn't wanna give them the money
01:21:26.920 | because they never knew if they were gonna do it.
01:21:29.760 | And then the search engine firm
01:21:30.840 | doesn't wanna do any work
01:21:31.720 | thinking they'll never get any money.
01:21:33.600 | So we just initially even came up with a system
01:21:36.200 | where you could put Bitcoin into a smart contract
01:21:38.960 | and it would be released
01:21:40.440 | based on whether the search rank of a website
01:21:43.520 | got to a certain level on Google for a certain keyword.
01:21:46.280 | And so the trust problem was solved.
01:21:49.140 | But it's just the same story.
01:21:50.760 | It's kind of like trust issues around AI,
01:21:52.560 | trust issues around financial products,
01:21:54.220 | trust issues around insurance,
01:21:55.480 | trust issues around social media, whatever it is.
01:21:58.960 | I think that's what people looking at this industry
01:22:03.960 | really need to understand.
01:22:06.160 | And once they do understand,
01:22:07.200 | they realize what this is all about.
01:22:09.440 | This is about redefining how everyone collaborates
01:22:14.240 | with everyone about everything
01:22:17.360 | where we can prove something through data.
01:22:20.000 | - You've mentioned confidentiality and privacy
01:22:23.520 | that the parties don't need to necessarily know
01:22:26.200 | private data in this interaction.
01:22:28.360 | You talk about confidentiality in the white paper
01:22:32.360 | for Chainlink 2.0.
01:22:33.960 | Can you talk more about how to achieve confidentiality
01:22:38.080 | in this process?
01:22:39.000 | - Sure, sure, absolutely.
01:22:41.740 | So I think you once again need to think of the contract
01:22:45.560 | as existing in two parts, right?
01:22:47.020 | You have the on-chain code
01:22:48.320 | and then you have this off-chain system
01:22:49.860 | called the decentralized Oracle network.
01:22:51.800 | So the question is really what portion of the contract
01:22:56.720 | should live in what part of these two systems, right?
01:23:00.680 | So if you want to create transparency,
01:23:03.720 | you should put more information on-chain
01:23:06.960 | because that's what blockchains are very good at.
01:23:09.240 | - They're public, transparent,
01:23:12.220 | but they don't necessarily have privacy.
01:23:15.000 | - Well, you can see how those two things
01:23:16.680 | are a little bit kind of completely
01:23:19.520 | diametrically opposed, right?
01:23:21.480 | So I do think, and I do see blockchains working
01:23:25.360 | on on-chain encrypted smart contracts.
01:23:28.800 | That's very inefficient.
01:23:29.960 | It has a lot of nuances around it.
01:23:32.640 | That I think will appear at some point.
01:23:35.040 | I think until it appears,
01:23:37.320 | you have an option of taking a part of the computation
01:23:41.580 | and putting it into the decentralized Oracle network.
01:23:44.800 | We actually did an entire paper about this
01:23:47.160 | that we presented at Stanford in February of last year,
01:23:51.200 | something called Mixicles,
01:23:53.420 | which basically talks about how you can take
01:23:55.620 | an Oracle network and you can put a portion
01:23:58.360 | of the computation into the Oracle network,
01:24:01.040 | assuming that you're comfortable with that limited set
01:24:04.120 | of nodes, knowing what the computation is.
01:24:07.020 | And you could actually provide additional confidentiality
01:24:11.040 | through special hardware
01:24:11.040 | called trusted execution environments
01:24:12.860 | that all those nodes are forced to run.
01:24:15.280 | So they won't even know what they're operating.
01:24:17.860 | And so at the end of the day,
01:24:19.580 | if you look at a hybrid smart contract
01:24:22.120 | as gaining functionality from its on-chain code
01:24:24.640 | and gaining other functionality from its off-chain,
01:24:27.440 | the decentralized Oracle network component,
01:24:30.480 | you can place the part of the computation
01:24:32.800 | that you would like to be private
01:24:34.560 | in the decentralized Oracle network
01:24:37.240 | because you can control the set of nodes.
01:24:40.760 | You can control the committee of nodes
01:24:42.780 | and you can require that they run certain hardware
01:24:46.080 | to keep the information private.
01:24:47.960 | So you could basically make a derivative that,
01:24:50.720 | or a binary option is the example used
01:24:52.560 | in the Mixicles paper where the payout happened on-chain,
01:24:57.200 | but it was actually impossible to tell
01:24:59.680 | what the outcome of the contract was.
01:25:02.400 | So the outcome of the contract was computed
01:25:04.840 | in the decentralized Oracle network.
01:25:06.560 | And then there was a switch that triggered
01:25:08.680 | who received the payment,
01:25:10.440 | but from the point of view of analyzing
01:25:13.760 | the on-chain transactions and seeing who received
01:25:19.040 | the payment or what the outcome of the contract was,
01:25:21.400 | you couldn't derive that,
01:25:23.960 | you couldn't backward engineer what that was,
01:25:26.320 | but the users of that hybrid smart contract
01:25:30.120 | still had on-chain code that guaranteed them
01:25:33.520 | that as long as the decentralized Oracle network
01:25:36.040 | found a certain outcome,
01:25:37.840 | determined a certain outcome,
01:25:39.480 | that the relevant user would get paid
01:25:42.020 | and there was still a place to put value.
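A heavily simplified sketch in the spirit of that example (not the actual Mixicles protocol): the outcome is computed off-chain, and the chain records only a payment to one of two pseudonymous slots:

```python
# Simplified sketch of the split described above: the oracle network evaluates
# the derivative off-chain, and the chain only sees an opaque "pay slot 0/1"
# instruction to pseudonymous addresses, so outside observers cannot recover
# the observed price or which named counterparty won.
import secrets

def offchain_binary_option(observed_price: float, strike: float) -> int:
    """Runs inside the oracle network: returns a slot index, not a name."""
    return 0 if observed_price >= strike else 1

def onchain_settle(escrow: float, slot_addresses: list, slot: int) -> dict:
    """All the chain records: escrowed value moved to one pseudonymous slot."""
    return {slot_addresses[slot]: escrow}

# The two counterparties privately agree which pseudonymous address is theirs.
slots = [f"0x{secrets.token_hex(4)}", f"0x{secrets.token_hex(4)}"]
winning_slot = offchain_binary_option(observed_price=1842.0, strike=1800.0)
print(onchain_settle(escrow=2_000.0, slot_addresses=slots, slot=winning_slot))
```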
01:25:45.620 | So there is this kind of fundamental tension
01:25:49.300 | between confidentiality, privacy,
01:25:52.340 | which is very important for many contracts,
01:25:54.180 | which is critical to many contracts
01:25:56.100 | and the public and transparent nature of blockchains,
01:25:59.500 | which I think eventually will be solved
01:26:01.860 | through encrypted on-chain smart contracts.
01:26:04.660 | That'll take some time,
01:26:05.860 | I think that'll take years in my opinion.
01:26:08.180 | And before we arrive there,
01:26:10.940 | I think people will put the private portion
01:26:13.060 | into the decentralized Oracle network.
01:26:15.260 | Once again, going back to what
01:26:16.700 | the decentralized Oracle networks do,
01:26:19.060 | they seek to provide these services, right?
01:26:21.620 | So the ability to do a privacy preserving computation
01:26:25.340 | is perhaps a service without which
01:26:28.140 | a certain type of contract might never come into existence
01:26:31.460 | in the form of an on-chain hybrid smart contract.
01:26:34.540 | And so this is once again,
01:26:35.860 | what we see decentralized Oracle networks
01:26:38.220 | and decentralized services doing
01:26:40.020 | is providing people these tools and building blocks
01:26:43.020 | to compose, like I'm great at making
01:26:46.740 | these derivatives contracts,
01:26:47.980 | but I can't make them
01:26:49.340 | unless I can retain the privacy of them.
01:26:52.180 | And our goal is to provide the infrastructure
01:26:56.060 | that gives you as a developer
01:26:57.660 | and as a creator of smart contracts, that capability.
01:27:00.900 | And what we've seen is that as we provide that capability,
01:27:03.620 | people create more,
01:27:04.900 | which is also really the story of the internet, right?
01:27:07.220 | The story of the internet is,
01:27:08.700 | it was really tough to do e-commerce
01:27:10.980 | while everything was in HTTP
01:27:13.180 | and credit cards were transmitted publicly.
01:27:15.740 | And so e-commerce was kind of tough
01:27:17.180 | 'cause how am I gonna send my credit card
01:27:18.820 | over public unencrypted channels, right?
01:27:21.340 | But the second HTTPS appears,
01:27:23.260 | e-commerce becomes a lot easier
01:27:24.900 | because I can put in my credit card number
01:27:26.900 | and it can be sent over an encrypted channel
01:27:29.060 | and it's not at risk.
01:27:30.460 | And so I can participate in e-commerce
01:27:31.980 | as long as I have a credit card.
01:27:33.660 | I think those types,
01:27:34.940 | and I'm sure that was unexpected, right?
01:27:37.020 | I'm sure at the time that was an unexpected outcome
01:27:40.060 | from that technology.
01:27:41.980 | And so I think this is why
01:27:43.900 | we sometimes have this focus on privacy
01:27:46.420 | because in our work with contracts
01:27:48.540 | and their transition into this hybrid smart contract form,
01:27:51.740 | we see a substantial amount of need for privacy
01:27:55.740 | as an inherent property of these contracts.
01:27:59.820 | - And it'll take a while before that's possible
01:28:01.860 | to create the kind of technology innovation
01:28:04.100 | required to do that on chain.
01:28:05.580 | I know there's a few ideas that are being floated about,
01:28:07.860 | but so currently the distributed Oracle networks
01:28:10.820 | provide that feature,
01:28:12.380 | which is essential to many contracts.
01:28:14.260 | What brings to mind in this whole space,
01:28:16.860 | again, it might be outside of your expertise,
01:28:20.260 | but within the world, which I'm passionate about,
01:28:23.500 | which is machine learning.
01:28:25.140 | And it seems like very naturally
01:28:27.860 | because current machine learning systems
01:28:31.260 | are very data hungry
01:28:33.340 | and much of the value mined by companies
01:28:37.900 | in the digital space is from data.
01:28:40.420 | They often want their data to maintain privacy.
01:28:44.900 | So you think about an autonomous vehicle space,
01:28:47.500 | Tesla is collecting a huge amount of data,
01:28:49.500 | Waymo is collecting a huge amount of data.
01:28:53.180 | It seems like it would be very beneficial
01:28:54.740 | to form contracts where one could use the data
01:28:57.180 | from the other in some kind of privacy preserving way,
01:29:00.700 | but also where all the uses of data are codified
01:29:05.700 | and you can exchange value cleanly,
01:29:08.540 | basically contracts over data,
01:29:10.700 | over machine learning systems use of different data.
01:29:15.700 | I don't know, do you talk to machine learning folks
01:29:19.180 | that use ideas of smart contracts
01:29:21.460 | or is that from outside your interest?
01:29:23.300 | 'Cause it seems like an exceptionally applicable set of...
01:29:27.460 | When we talk about different services
01:29:29.100 | that might be created and revolutionized by smart contracts,
01:29:33.300 | especially hybrid smart contracts,
01:29:36.180 | I think machine learning systems come to mind for me
01:29:40.100 | in all industries.
01:29:42.180 | I don't know if you've gotten a chance
01:29:43.460 | to interact with those folks, with those services.
01:29:45.860 | - I think what you're talking about
01:29:47.700 | is more data marketplaces
01:29:49.100 | in the data marketplace side of things.
01:29:51.700 | Well, this is actually once again, very applicable
01:29:55.340 | 'cause there's a trust issue.
01:29:56.820 | At the end of the day,
01:29:58.500 | let's say I'm trying to sell you some data.
01:30:00.580 | You don't know the quality of the data,
01:30:03.020 | so you don't know what you wanna pay for it.
01:30:04.700 | And I can't give you the data for you to determine
01:30:07.060 | the quality 'cause I've given you the data.
01:30:09.860 | Guess what?
01:30:10.700 | We need an autonomous impartial agent.
01:30:13.700 | We need an impartial computational kind of agent
01:30:17.140 | and on-chain smart contract with an Oracle network
01:30:20.940 | to assess my data, to basically take
01:30:25.540 | random cross-section samples of the data,
01:30:28.340 | assess it for quality, assess it for signal
01:30:31.940 | from the algorithm you have,
01:30:33.580 | which you don't wanna share with me
01:30:35.220 | 'cause you don't want me to know the algorithm
01:30:36.420 | you're working on, right?
01:30:37.740 | You don't want me to know what you want the data for.
01:30:40.860 | So now the autonomous agent takes your algorithm,
01:30:44.140 | keeping it private from me and takes my data,
01:30:46.020 | keeping it private from you,
01:30:47.700 | assesses it on a random cross-section sampling
01:30:50.700 | for quality of data, returns the scoring back to you,
01:30:54.700 | allows you to determine a price.
01:30:57.460 | And now both you and me know that we've arrived
01:31:00.380 | at a fair price for the quality of my data
01:31:03.700 | for what you wanna do with it.
01:31:05.620 | And that's once again, from what I've seen
01:31:10.060 | in the data marketplaces, which are full of people
01:31:13.940 | who want that data for these learning models,
01:31:15.780 | often for financial markets, often for other reasons,
01:31:20.220 | this is their fundamental problem, which, amazingly enough,
01:31:24.460 | is a trust issue that is now getting solved.
01:31:28.060 | And I think you can see even on the face of it,
01:31:31.140 | once that trust issue is solved,
01:31:33.620 | those markets can work a lot better, right?
01:31:36.660 | I don't need to know your algorithm,
01:31:37.980 | you don't need to know my data.
01:31:39.540 | We both know that the autonomous agent
01:31:41.900 | is not under either of our control
01:31:43.660 | and gave us a fair assessment and a fair price.
01:31:46.340 | And that's it.
01:31:47.940 | And we're all very comfortable with that.
01:31:50.940 | I could even make conditions that your algorithm
01:31:53.980 | isn't analyzing the data for something
01:31:55.780 | I don't want you to analyze it for,
01:31:57.100 | or you could make conditions that the data has to have
01:32:00.820 | any number of properties.
01:32:02.420 | And once again, you haven't leaked any signal to me,
01:32:05.100 | and I haven't leaked any data to you,
01:32:07.460 | which is once again, just another type of trust issue
01:32:11.260 | that all of this solves.
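As a rough sketch of the pattern described above (not Chainlink's actual protocol), here is a minimal Python illustration of an impartial agent that runs a buyer's private scoring function over a random cross-section of a seller's private data and returns only an aggregate quality score. All function names and data fields are hypothetical.

```python
import random
from typing import Callable, Optional, Sequence

def impartial_data_assessment(
    dataset: Sequence[dict],                     # seller's private data (only the agent sees it)
    buyer_scoring_fn: Callable[[dict], float],   # buyer's private quality/signal check
    sample_size: int = 100,
    seed: Optional[int] = None,
) -> float:
    """Run the buyer's scoring function over a random cross-section of the
    seller's data and return only an aggregate quality score. Neither the raw
    data nor the scoring logic is handed to the other party; in a real system
    this evaluation would run inside a decentralized oracle network rather
    than in a single process like this."""
    rng = random.Random(seed)
    sample = rng.sample(list(dataset), min(sample_size, len(dataset)))
    scores = [buyer_scoring_fn(row) for row in sample]
    return sum(scores) / len(scores) if scores else 0.0

if __name__ == "__main__":
    # Hypothetical usage: the buyer only checks that rows carry a usable signal.
    sellers_data = [{"price": 100 + i, "volume": i % 50} for i in range(10_000)]
    quality = impartial_data_assessment(
        sellers_data,
        buyer_scoring_fn=lambda row: 1.0 if row["price"] > 0 and row["volume"] > 0 else 0.0,
        sample_size=200,
        seed=42,
    )
    print(f"aggregate quality score: {quality:.2f}")  # the buyer prices the data from this
```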
01:32:12.260 | So it's the same pattern.
01:32:14.180 | If you work in this industry long enough,
01:32:15.820 | or if you really look at these use cases long enough,
01:32:18.420 | you'll simply come to the question,
01:32:20.060 | and this is the useful question,
01:32:22.260 | what is the trust issue this is solving?
01:32:24.740 | And then if you can get an answer to that question
01:32:27.100 | on a case by case basis,
01:32:28.940 | that's when you'll understand why blockchains are relevant.
01:32:32.620 | And then once you do that with enough use cases,
01:32:35.220 | it becomes a little bit mind-blowing.
01:32:38.620 | - You've mentioned trust quite a bit.
01:32:40.580 | You also mentioned trust minimization
01:32:45.860 | in the Chainlink white paper.
01:32:47.540 | Can we dig into trust a little bit more?
01:32:51.540 | What is the nature of trust
01:32:53.820 | that you think about in these smart contracts?
01:32:56.140 | What is trust minimization?
01:32:58.180 | How do we accomplish, achieve trust minimization?
01:33:01.140 | - Sure, sure.
01:33:03.020 | I think it's important maybe to have a conception
01:33:05.660 | of what the alternative is, right?
01:33:07.460 | What is highly reliable, trust minimized off-chain
01:33:11.620 | and on-chain computation an alternative to?
01:33:14.060 | So this is just kind of how I see the world
01:33:16.980 | in these two camps.
01:33:18.940 | One camp is the traditional,
01:33:20.780 | what I call brand-based or paper guarantee camp.
01:33:25.260 | And this is the world as pretty much most
01:33:27.500 | or all people know it today.
01:33:29.100 | This is the world where there's a bank logo
01:33:31.420 | or an insurance company logo or some kind of logo.
01:33:34.140 | There's a very big building with marble arches and columns.
01:33:37.700 | You know, it's the biggest building in the town.
01:33:39.460 | It's bigger than the church.
01:33:40.980 | And everybody feels very good.
01:33:42.380 | Everybody's got such a nice logo.
01:33:43.700 | It's such a big building.
01:33:45.220 | Why don't I give them my money?
01:33:47.620 | Why don't I interact with them
01:33:49.140 | on the basis of any kind of agreement?
01:33:51.780 | And that's good.
01:33:52.860 | And that is definitely better than that not being there.
01:33:56.060 | And that is definitely a huge improvement
01:33:57.940 | for how people conduct commerce,
01:34:00.260 | letters of credit from branded entities
01:34:04.100 | are very important for global trade to take place
01:34:06.700 | in the early stages of global trade.
01:34:09.260 | So that's good, but it is fundamentally
01:34:13.380 | just a paper agreement with a legal framework behind it.
01:34:16.740 | And if the paper agreement you have with, say, Robinhood
01:34:20.660 | or somebody else suddenly has to change, well, it changes
01:34:24.860 | and you can't really do anything about it.
01:34:26.500 | You won't be able to change anything
01:34:28.260 | about what happened there.
01:34:29.700 | There's some long terms of service.
01:34:31.220 | There's some other agreements around all this stuff.
01:34:34.500 | At the end of the day, that's the brand-based
01:34:37.740 | and paper guarantee world where it's all very vague
01:34:41.420 | and opaque and you're kind of hoping for the best
01:34:44.060 | because there's a nice logo.
01:34:45.620 | It's been around 100 years, a lot of marble.
01:34:48.300 | - Put a lot of marble.
01:34:49.300 | - Big building, lots of marbles.
01:34:50.780 | This is why banks have such nice buildings.
01:34:53.740 | It's not because they wanna spend money on buildings.
01:34:56.020 | It's to create confidence in them as an entity
01:34:59.980 | in order for people to transact through them.
01:35:03.220 | This is why all these kind of go-to cities
01:35:06.460 | that had gold rushes, go-to cities that needed banking
01:35:10.020 | as a service in certain time periods,
01:35:12.580 | they're the most beautiful buildings,
01:35:14.060 | at least in the United States.
01:35:15.940 | So this is the brand-based paper guarantee model
01:35:19.060 | for which up until now,
01:35:20.500 | there has never been an alternative.
01:35:22.060 | So up until now, if you had a bad experience with a bank
01:35:25.100 | or insurance company or some logo somewhere,
01:35:28.340 | you would only have one option.
01:35:30.340 | Your option would be to go across the road
01:35:32.380 | and down the block to another building
01:35:35.420 | with another color of marble and another set of agreements
01:35:39.300 | that are fundamentally still paper brand agreements.
01:35:43.140 | Now, for the first time, you have mathematical agreements.
01:35:47.580 | You have mathematically guaranteed encryption-secured,
01:35:52.500 | decentralized infrastructure-powered agreements.
01:35:55.580 | This is really the shift.
01:35:59.500 | This is really the comparison and the alternative
01:36:03.420 | through which people should view all of this, in my opinion,
01:36:05.540 | because there's once again this conception
01:36:07.940 | that everything is fine, everything works very well.
01:36:11.060 | Well, it does, it works fine and very well
01:36:13.500 | as long as nothing goes wrong.
01:36:15.140 | And then in the cases when things go wrong,
01:36:17.500 | which they pretty much invariably at some point do,
01:36:20.340 | then you find out that, well, turns out
01:36:23.020 | they don't have to pay me, or turns out I can't trade,
01:36:25.220 | or turns out the ATMs can be locked up
01:36:27.900 | and only give me 66 euros per day,
01:36:30.100 | whether I'm a business or an individual,
01:36:31.740 | like what happened in Greece a few years ago, right?
01:36:35.060 | And the reality is that once that becomes strong enough
01:36:41.060 | kind of realization for people,
01:36:43.820 | I think they will all just migrate
01:36:45.260 | to mathematically guaranteed contracts,
01:36:47.460 | because why wouldn't you?
01:36:49.500 | So in the world of mathematically guaranteed contracts,
01:36:53.220 | kind of how do we, and cryptographically secured
01:36:56.220 | and decentralized infrastructure powered,
01:36:57.940 | how do we evolve into that world?
01:37:00.180 | Well, at the end of the day, it comes down to consensus.
01:37:05.380 | It comes down to a collection of independent nodes,
01:37:09.580 | a collection of provably independent computing systems
01:37:13.300 | arriving at the same conclusion impartially.
01:37:17.140 | That conclusion might be the transaction is valid
01:37:22.100 | between address A and address B,
01:37:25.260 | address A has one Bitcoin, wants to send it to address B,
01:37:27.940 | and address B has one Bitcoin, right?
01:37:29.940 | So that's one degree of validation.
01:37:31.980 | It has certain cryptographic primitives that are used,
01:37:34.340 | certain levels of cryptography, encryption,
01:37:36.700 | and other methods that basically provide
01:37:39.340 | clarity and those guarantees.
01:37:42.500 | But fundamentally, it's this level of consensus
01:37:44.660 | that multiple independent computing systems
01:37:48.100 | came to the same conclusion, verified that conclusion,
01:37:51.260 | and created a sense of finality,
01:37:53.420 | created a final state that is globally considered
01:37:56.860 | to be the state of a transaction.
01:37:59.500 | And that is how it's achieved, right?
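A toy illustration (not any specific blockchain's protocol) of the consensus idea being described: several independent nodes each check the same transaction against their own copy of the ledger state, and only a conclusion a quorum of them agrees on is treated as final. The names and the quorum threshold are made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    sender: str
    receiver: str
    amount: float

def node_validates(balances, tx):
    """One node's independent check: does the sender actually hold the funds?"""
    return tx.amount > 0 and balances.get(tx.sender, 0.0) >= tx.amount

def reach_consensus(node_views, tx, quorum=2 / 3):
    """Treat the transaction as final only if at least a quorum of independent
    nodes, each using its own copy of the ledger state, arrive at the same
    'valid' conclusion."""
    votes = [node_validates(view, tx) for view in node_views]
    return sum(votes) / len(votes) >= quorum

if __name__ == "__main__":
    ledger = {"address_A": 1.0, "address_B": 1.0}
    # Each of seven independent nodes keeps its own copy of the ledger state.
    nodes = [dict(ledger) for _ in range(7)]
    tx = Transaction(sender="address_A", receiver="address_B", amount=1.0)
    print("finalized:", reach_consensus(nodes, tx))  # True: address A holds the coin it sends
```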
01:38:03.820 | So it's achieved by users looking
01:38:06.660 | at these mathematical contract systems and saying,
01:38:11.580 | if I have money in a bank,
01:38:13.660 | there's one single person who controls that money,
01:38:16.340 | that's the bank, they could choose to give me my money
01:38:18.620 | or choose not to give me my money.
01:38:20.820 | And that's great, but maybe there's a percentage
01:38:23.220 | of what I own that I want to put into another system
01:38:26.180 | where there's thousands of independent computing systems
01:38:29.100 | that are promising me,
01:38:31.100 | with the help of cryptographic primitives,
01:38:33.140 | that I will be able to always have access to this.
01:38:36.540 | Whatever this is, whatever this token is,
01:38:41.060 | I will at least, or at the very least,
01:38:43.460 | I will always have unfettered,
01:38:45.900 | complete control and access to it.
01:38:48.220 | So that's one example.
01:38:50.380 | Another example is, hey, we have a hybrid smart contract
01:38:52.940 | for something like crop insurance.
01:38:55.100 | I, as the user, evaluate where this smart contract runs.
01:38:59.140 | Oh, wow, the smart contract runs on Ethereum.
01:39:01.500 | Great, thousands of nodes,
01:39:03.540 | lots of computational security,
01:39:06.700 | hash power, so on and so on.
01:39:08.540 | Then I look at, oh, well, what triggers the contract?
01:39:10.860 | Oh, there's this Oracle network.
01:39:12.420 | Okay, it's composed of 25 nodes or 15 nodes,
01:39:15.460 | gets data from five different weather stations.
01:39:18.060 | You know, I'm comfortable with that.
01:39:19.900 | I have a certain level of comfort
01:39:21.460 | with that hybrid smart contract
01:39:23.140 | and its ability to provide me consensus
01:39:26.700 | about the transaction,
01:39:28.340 | once the contract knows what's happened,
01:39:30.580 | and I'm comfortable with the consensus
01:39:33.100 | around the event that controls the contract, right?
01:39:36.220 | Because once again, that event is what determines
01:39:39.860 | what happens with the contract.
01:39:41.420 | And even if the contract is super well-written,
01:39:43.900 | it doesn't matter if the event isn't reliable, right?
01:39:47.020 | So now I've made this determination.
01:39:48.580 | I've gotten all this clear, transparent information
01:39:51.220 | about this system that combines the contract code
01:39:55.580 | with a decentralized Oracle network,
01:39:57.540 | and I've made my decision to participate
01:39:59.580 | in this decentralized insurance,
01:40:01.940 | kind of crop insurance policy.
01:40:03.780 | I've sent the Bitcoin or the stable coin
01:40:06.540 | or whatever I have on my Android phone.
01:40:09.180 | And then time goes by, and let's say it doesn't rain.
01:40:12.340 | Lo and behold, the smart contract returns
01:40:15.460 | the relevant amount from the policy back to me.
01:40:18.940 | I continue my life as a farmer.
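A minimal Python sketch of the parametric crop-insurance flow just described, under the assumption that the oracle network aggregates independent weather-station reports by taking the median and the contract pays out when the aggregate falls below a drought threshold. The station values, threshold, and amounts are invented; a real version would run as on-chain code fed by a decentralized oracle network.

```python
import statistics

def aggregate_rainfall(station_reports_mm):
    """Oracle-network style aggregation: take the median of independent
    weather-station reports so one faulty or manipulated source can't
    move the answer on its own."""
    return statistics.median(station_reports_mm)

def settle_crop_policy(payout_amount, rainfall_mm, drought_threshold_mm):
    """Parametric payout rule: if aggregated rainfall over the policy period
    is below the drought threshold, pay the farmer; otherwise pay nothing."""
    return payout_amount if rainfall_mm < drought_threshold_mm else 0.0

if __name__ == "__main__":
    # Hypothetical season-total reports (mm) from five independent weather stations.
    reports = [12.0, 9.5, 11.0, 10.2, 300.0]   # one faulty outlier station
    observed = aggregate_rainfall(reports)      # median = 11.0 mm
    paid = settle_crop_policy(payout_amount=5_000.0,
                              rainfall_mm=observed,
                              drought_threshold_mm=50.0)
    print(f"aggregated rainfall: {observed} mm, payout to farmer: {paid}")
```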
01:40:22.100 | And by the way, the fact that that happened
01:40:24.980 | contributes reputation and contributes proof
01:40:28.300 | back to both the contract as something
01:40:30.860 | that can prove to other people that it has settled,
01:40:34.260 | and the Oracle network as something that can prove
01:40:36.820 | that it has properly assessed reality
01:40:39.460 | or properly triggered a contract.
01:40:41.660 | And this is where there's one of many network effects
01:40:44.300 | where the more that smart contracts
01:40:47.620 | and Oracle networks are used,
01:40:49.260 | they themselves generate this immutable on-chain data
01:40:53.300 | that proves their value and reliability.
01:40:57.820 | And in proving more and more of that
01:41:00.540 | in more and more kind of use cases
01:41:03.700 | and more and more variants of the same contract,
01:41:06.740 | they arrive at a greater body of proof that they,
01:41:10.580 | like I am the decentralized crop,
01:41:12.860 | the decentralized insurance contract for crop insurance
01:41:16.620 | used by a million users.
01:41:19.780 | And my failure rate is non-existent or really low.
01:41:24.460 | And here's my Oracle network.
01:41:25.620 | And by the way, it's also settled a million of these.
01:41:28.740 | And so it's not the logo, right?
01:41:31.380 | It's not, "Hey, what a nice logo you have
01:41:35.460 | on top of a building above a train terminal or something."
01:41:40.100 | It's much more, "Hey, there's a million people.
01:41:44.020 | There's a million separate contracts
01:41:45.500 | that got settled correctly.
01:41:46.700 | I have all the proof that I could ever need about that."
01:41:50.340 | And it's not something that's very easy to game, right?
01:41:53.460 | Because real value was at stake, real value was moved around.
01:41:57.100 | And so I think once again,
01:41:59.180 | the transparency aspect comes in
01:42:00.820 | where you're able to prove
01:42:02.580 | that the cryptographically enforced contracts are better.
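A small sketch of the kind of reputation signal being described: given a public, immutable log of past settlements, anyone can recompute a contract's or oracle network's track record rather than trusting a brand. The log format here is invented for illustration.

```python
def settlement_reputation(settlement_log):
    """Recompute a track record from a public settlement log: how many
    contracts settled, how much value moved, and the observed failure rate."""
    total = len(settlement_log)
    failures = sum(1 for s in settlement_log if not s["settled_correctly"])
    value_moved = sum(s["value"] for s in settlement_log)
    return {
        "settlements": total,
        "value_moved": value_moved,
        "failure_rate": failures / total if total else 0.0,
    }

if __name__ == "__main__":
    # Hypothetical log entries (kept small here; the example in the conversation
    # imagines a million of them).
    log = [{"value": 5_000.0, "settled_correctly": True} for _ in range(9_999)]
    log.append({"value": 5_000.0, "settled_correctly": False})
    print(settlement_reputation(log))  # near-zero failure rate, recomputable by anyone
```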
01:42:07.580 | - That said, you can still integrate the traditional banks
01:42:12.020 | as long as you create a data feed
01:42:13.540 | on the amount of marble that's included.
01:42:16.940 | So if that's valuable to you in terms of reputation,
01:42:19.300 | you could still integrate the marble,
01:42:20.740 | the amount of marble and the size of the logo.
01:42:25.980 | We could still keep the banks around.
01:42:28.580 | - I think we will.
01:42:29.420 | I think what'll happen with the banks
01:42:30.820 | and all the insurance companies, by the way,
01:42:32.500 | is not that they'll all just die or something.
01:42:34.860 | I think it'll be just like the internet.
01:42:36.420 | There'll be some of them that adopt this
01:42:38.660 | and some of them that don't
01:42:40.140 | and some of them that do it faster,
01:42:41.340 | some of them that do it slower.
01:42:42.820 | And that's an economic decision that they'll make.
01:42:45.380 | I think their whole question is,
01:42:46.900 | "Is this a foregone conclusion?"
01:42:48.900 | I mean, I think my answer is yes,
01:42:51.100 | this is definitely gonna be happening.
01:42:52.980 | I think they still have a question of,
01:42:55.020 | "Is this gonna change my industry?"
01:42:58.580 | But I'm seeing a definite shift in people's understanding.
01:43:03.460 | And I think that shift is gonna accelerate rapidly
01:43:06.100 | as one or two of their competitors throw their hat
01:43:09.660 | in the smart contract ring and say,
01:43:12.060 | "Well, I have smart contracts.
01:43:14.500 | "I guarantee my outcomes to you.
01:43:17.260 | "What do they do for you?
01:43:18.500 | "It's risky, just use mine."
01:43:22.020 | And the second some of them start losing business
01:43:24.260 | because of that, they're gonna move very quickly
01:43:27.700 | because that's what all of their compensation structures
01:43:30.100 | and all their goal planning structures are based around.
01:43:32.980 | They're based around what is losing us business
01:43:35.260 | or getting us business.
01:43:36.820 | - Yeah, it's fascinating organizationally though
01:43:38.980 | to think about banks, they're very old school
01:43:41.940 | and their ability to move quickly is questionable to me.
01:43:46.340 | I just look at basic online banking,
01:43:50.740 | how good banks are creating a frictionless online experience
01:43:55.100 | and I think they're not very good.
01:43:58.660 | And so that speaks to the kind of people
01:44:01.180 | who are in leadership positions at banks,
01:44:03.900 | the kind of people they hire, the kind of culture there is.
01:44:07.540 | So I do wonder if banks will from inside
01:44:12.300 | revolutionize themselves to include smart contracts
01:44:15.420 | or whether totally new competitors will have to emerge
01:44:18.780 | that basically create new kinds of banks.
01:44:22.340 | Whether, what is it, the company Square?
01:44:26.260 | I think it comes up out of nowhere really
01:44:29.620 | with Cash App and they have Bitcoin on Cash App,
01:44:31.900 | whether they will start incorporating smart contracts
01:44:34.460 | and they will revolutionize the whole banking industry
01:44:39.460 | or whether Bank of America will revolutionize themselves
01:44:44.660 | from within.
01:44:46.020 | I'm skeptical on Bank of America,
01:44:48.140 | but you never know.
01:44:50.100 | It's in general, I'm fascinated by how big organizations,
01:44:53.700 | whether it's Google or Microsoft or Bank of America
01:44:56.820 | pivot hard in a world that's quickly changing.
01:45:00.740 | I think that takes bold leadership and a lot of firing
01:45:05.500 | and a lot of pain and a lot of meetings
01:45:08.140 | where the one asshole brings up the,
01:45:10.700 | from first principles idea that, you know what,
01:45:13.500 | the ways we've been doing stuff in the past require,
01:45:16.340 | you know, we need to throw that out
01:45:17.980 | and do stuff totally differently.
01:45:20.420 | I know a lot of those assholes
01:45:22.460 | in a lot of these different industries.
01:45:25.300 | First of all, I think they're getting listened to more.
01:45:27.060 | And second of all, I think all of these places,
01:45:30.460 | as I look at it more and more,
01:45:32.380 | I think they have a fundamental line of business
01:45:35.420 | that they try to protect.
01:45:37.060 | And then everybody's compensation
01:45:38.980 | and everybody's metrics and goals
01:45:40.380 | is focused around that line of business.
01:45:43.060 | So the second that things begin to impact that,
01:45:48.060 | then everybody will be in a senior meeting
01:45:52.500 | and that asshole will be quite listened to
01:45:55.180 | because he will have the only thoughtful explanation
01:45:58.020 | as to why this is happening.
01:45:59.900 | How things will evolve from there, I actually don't know
01:46:02.940 | because that hasn't been the case yet.
01:46:05.740 | But my thinking is that there will be people
01:46:08.700 | who don't wanna cannibalize certain parts of their business
01:46:12.020 | or don't wanna change certain parts of their business.
01:46:15.020 | And then there'll be people who say,
01:46:16.420 | "Look, I think this is how the world's gonna work.
01:46:18.940 | We're gonna make a very, very heavy
01:46:20.740 | kind of set of commitments to put resources towards this."
01:46:27.300 | I already see that with a few banks
01:46:29.820 | working on various blockchain-based systems.
01:46:33.100 | But granted, they've been working on those for years.
01:46:34.860 | So I think all of this comes down
01:46:37.420 | to these kind of quarterly earnings calls
01:46:39.860 | where somebody asks them,
01:46:41.300 | "Hey, I saw that bank over there launched a blockchain bond
01:46:45.420 | or a smart contract derivative platform.
01:46:47.860 | And I also saw that they made $10 billion in revenues
01:46:51.580 | or $10 billion in volume or whatever it is from that.
01:46:55.020 | What's your plan?"
01:46:56.380 | Right, on the earnings call.
01:46:58.100 | And I promise you by the next earnings call, there's a plan.
01:47:02.060 | And then the question on the next one is,
01:47:03.380 | "Well, when's the plan gonna happen?"
01:47:04.700 | And then by the next earnings call, a plan is happening.
01:47:08.420 | And that's what these people are sensitive to.
01:47:11.540 | That's what these organizations are structured around.
01:47:13.740 | It's not completely economically disconnected, right?
01:47:17.180 | They have this core business, they wanna protect it.
01:47:20.420 | I understand that idea,
01:47:22.980 | but I think that the problem with that
01:47:26.020 | is sometimes it requires this myopic focus, right?
01:47:31.020 | And that's what all the innovation stuff is about.
01:47:34.500 | Every time somebody at a corporate entity
01:47:36.580 | talks about innovation, they're trying to sidestep this.
01:47:40.260 | But once again, the incentives to maintain
01:47:42.340 | whatever the core business is, is so strong
01:47:45.260 | that the innovation people, even though they are there,
01:47:49.180 | I think they get a phone call and go like,
01:47:53.740 | "What are we doing for this?"
01:47:55.540 | And the ones that actually did good work
01:47:58.540 | and got ready to do something for this
01:48:00.620 | have done their employer and their organization
01:48:03.980 | a very positive service.
01:48:05.820 | Whereas the ones that aren't ready,
01:48:07.820 | I mean, they'll make up something
01:48:09.940 | and maybe they're really smart and they'll get it together.
01:48:11.940 | I don't know.
01:48:12.900 | - Can we talk about tokens a little bit?
01:48:15.220 | Generally speaking, there's been a meteoric rise
01:48:19.620 | of a bunch of different tokens.
01:48:20.940 | We could just talk about Bitcoin and Ethereum as examples.
01:48:24.300 | Bitcoin, I think, crossed $60,000 in value.
01:48:27.980 | What are your thoughts in general on this rise?
01:48:31.100 | What's the future of Bitcoin?
01:48:32.500 | What's the future of Ethereum?
01:48:35.300 | There's the total value locked metric
01:48:38.900 | that I think generalizes the different kind of value
01:48:42.060 | of these tokens.
01:48:45.260 | What is the future value and impact
01:48:50.980 | of cryptocurrency look like
01:48:52.380 | if we look through the lens of these tokens?
01:48:54.900 | - I think valuing all these tokens
01:48:58.420 | and determining that isn't something
01:48:59.940 | I'm particularly great at.
01:49:01.300 | I haven't spent a lot of time on that.
01:49:02.940 | I've spent the majority,
01:49:04.140 | vast majority of my time on building these systems
01:49:06.860 | and architecting them and getting them to fruition
01:49:08.980 | and getting them to a place where they operate properly
01:49:11.580 | on both the technical and the crypto economic
01:49:13.260 | and in every other sense.
01:49:15.580 | I think with Bitcoin, there is a certain conception
01:49:20.580 | of non-governmental fiat money
01:49:24.460 | that Bitcoin is really the first creator of.
01:49:28.820 | So there's this very powerful idea called fiat money.
01:49:31.940 | It's basically more or less a 50-year experiment.
01:49:36.940 | I think August 15th of this year
01:49:39.260 | is maybe the 50th anniversary of the idea that a
01:49:41.340 | government can say, "Hey, I have a currency
01:49:42.940 | "and it's worth something and here it is."
01:49:46.180 | The way that governments have stopped that
01:49:50.820 | in the past is, if anyone tries to make another fiat currency
01:49:53.500 | in their country, they immediately shut it down.
01:49:56.460 | They immediately say, "Hey, this is really bad.
01:49:58.860 | "You've done something really bad.
01:50:00.380 | "It's time for you to stop.
01:50:01.580 | "Don't do it anymore."
01:50:03.060 | And it stops.
01:50:04.260 | That's been the history of non-governmental fiat currency.
01:50:07.860 | Bitcoin is really due to its decentralized nature,
01:50:12.220 | the first and possibly in some cases,
01:50:15.300 | in many people's minds,
01:50:16.140 | it's still the only true non-governmental fiat currency.
01:50:19.740 | Now, how powerful is non-governmental fiat currency?
01:50:24.460 | I have no idea.
01:50:25.540 | It's really as powerful as the ideas
01:50:31.540 | that people ascribe to it are.
01:50:33.620 | So let's say people start saying,
01:50:36.900 | like right now people are saying,
01:50:38.060 | "Hey, it's internet money.
01:50:39.340 | "It's the money of the internet."
01:50:41.780 | Okay, great.
01:50:42.820 | What's that worth?
01:50:43.660 | I don't know.
01:50:44.500 | It's probably worth a lot.
01:50:45.340 | I have no idea what it's worth.
01:50:46.180 | But as an idea, as a concept to underpin the fiat money,
01:50:51.180 | the let there be aspect of fiat and of Bitcoin,
01:50:56.300 | you basically look at it and you say,
01:50:58.500 | "Yeah, internet money.
01:51:00.140 | "Okay, that could be worth whatever,
01:51:02.460 | "60,000, 600,000."
01:51:04.940 | Great question, right?
01:51:06.220 | There are other versions of the world, right?
01:51:08.780 | Where people say,
01:51:09.920 | "There are countries that don't have a good fiat currency.
01:51:14.600 | "And I see a lot of people using Bitcoin."
01:51:17.060 | So Bitcoin isn't internet money.
01:51:19.580 | It's countries without a good currency money.
01:51:24.260 | So all the countries without a good currency
01:51:25.980 | now use Bitcoin and let there be Bitcoin
01:51:30.100 | as this, right?
01:51:31.940 | As this conception of Bitcoin.
01:51:33.260 | What's the value of that?
01:51:34.740 | I don't know.
01:51:35.580 | That's a great question.
01:51:36.400 | Probably huge amount of value.
01:51:37.820 | Then there's a further conception of Bitcoin
01:51:42.740 | as some digital gold.
01:51:45.060 | There's a scarcity dynamic.
01:51:46.660 | There's all these other kinds of dynamics.
01:51:49.580 | What is a portable version of digital gold
01:51:52.700 | with some kind of built-in scarcity worth,
01:51:59.180 | kind of artificially created scarcity?
01:52:02.060 | What's that worth?
01:52:03.860 | I don't know.
01:52:05.100 | That's a great question.
01:52:06.700 | I haven't done the analysis on that, is the point.
01:52:08.820 | Might be worth a lot.
01:52:10.300 | What is it all worth if all three of these things
01:52:13.740 | flow into the same fiat,
01:52:17.100 | kind of let there be Bitcoin as these three things,
01:52:20.140 | conception of Bitcoin?
01:52:21.580 | I don't know what that's worth.
01:52:23.580 | I also don't know what that's worth.
01:52:24.420 | But could be worth a huge amount.
01:52:26.800 | So I think it's not,
01:52:28.940 | I personally don't think it's super important
01:52:30.980 | what I think it's worth
01:52:32.260 | or what many other people think it's worth.
01:52:34.420 | I don't think that's really that important.
01:52:36.460 | I think what's probably important
01:52:38.220 | is understanding what the societal conception of Bitcoin is
01:52:43.220 | and how does that societal conception evolve over time.
01:52:49.440 | And that, interestingly enough,
01:52:52.020 | doesn't just depend on you or me
01:52:56.220 | or the people who made Bitcoin or anything else.
01:52:58.100 | It actually depends on current events.
01:53:00.700 | So for example, if people suddenly say,
01:53:02.700 | I'm more and more worried about fiat currency,
01:53:05.940 | I'm more and more worried that governmental fiat,
01:53:09.400 | even if it's the most reliable version of that,
01:53:12.500 | is not as good as I thought it was,
01:53:14.540 | maybe I should go on the PayPal app
01:53:17.780 | and maybe I should get some Bitcoin just in case.
01:53:20.420 | What's the world where Bitcoin is a certain percentage
01:53:25.740 | of everyone's ownership as a hedge
01:53:28.020 | against governmental fiat money not being so good?
01:53:32.780 | Haven't done the analysis, but another example, right?
01:53:35.140 | Here's this conception.
01:53:36.940 | That's the conception.
01:53:38.260 | So when I look at Bitcoin,
01:53:40.000 | what I see is a lot of these fascinating conceptions
01:53:43.620 | of what the fiat, let there be value of Bitcoin is.
01:53:47.620 | By the way, all of them could be true.
01:53:49.520 | Maybe some of them are true,
01:53:51.820 | maybe some of them aren't true.
01:53:53.340 | And the fascinating thing is,
01:53:54.580 | is that I've seen this conception change, right?
01:53:56.900 | So when I started in the Bitcoin space,
01:53:59.020 | the conception was micropayments.
01:54:01.940 | The cost of Bitcoin is low, we'll have micropayments.
01:54:05.380 | Micropayments are wonderful
01:54:06.660 | for machine to machine transactions.
01:54:09.740 | Micropayments are wonderful in the emerging market.
01:54:12.420 | And that's fine, right?
01:54:13.380 | And that was one conception of Bitcoin
01:54:15.780 | as let there be Bitcoin as micropayments platform, right?
01:54:19.620 | But then the value rose and certain things changed.
01:54:21.900 | There was enough expansion in certain ways.
01:54:23.980 | And now the conception has evolved
01:54:26.580 | into this other conception.
01:54:28.540 | But at the end of the day,
01:54:30.740 | I think governments have a very clear set of steps
01:54:34.660 | for directing the public's conception of their fiat, right?
01:54:39.660 | They say our fiat is worth this for these reasons.
01:54:44.220 | Bitcoin doesn't have that.
01:54:45.620 | Bitcoin doesn't have an official Bitcoin spokesperson
01:54:49.220 | that goes out and says,
01:54:50.780 | the non-governmental money called Bitcoin,
01:54:53.220 | the non-governmental fiat money called Bitcoin
01:54:55.380 | has value on the basis of this, this, this, and this.
01:54:57.980 | Here's our fiscal budget, here's our future plans.
01:55:00.740 | Our money will continue to be safe and secure and reliable.
01:55:04.100 | And so what that hole creates
01:55:07.340 | is a hole that we all fill, right?
01:55:09.140 | We all basically come to some vague
01:55:11.500 | kind of group understanding that Bitcoin is worth this
01:55:16.500 | because it is tied to,
01:55:23.020 | let's say all non-governmental fiat money
01:55:27.260 | comes into question, everybody doubts it,
01:55:30.420 | possibly due to inflation.
01:55:32.260 | And everybody says, this is nice,
01:55:35.540 | but I'd like to keep 10, 20% of my wealth
01:55:39.460 | in non-governmental fiat just in case.
01:55:42.940 | What are those numbers?
01:55:46.300 | I mean, if that happens,
01:55:49.340 | I'm guessing you can add a few zeros.
01:55:52.660 | - I like how you say, I haven't done the analysis
01:55:55.060 | as if I'm sure a lot of people have done
01:55:56.900 | quote unquote analysis, but it's not,
01:55:59.380 | it's still speculation.
01:56:00.540 | Nobody can predict the future,
01:56:02.060 | especially when so much of it has to do
01:56:04.780 | with a large number of people
01:56:07.420 | holding an idea in their mind
01:56:09.340 | as to the importance of a particular technology like Bitcoin.
01:56:13.300 | There's a lot of excitement by its possibilities,
01:56:15.340 | but the number of zeros you add is an open question
01:56:19.420 | and nobody can do a perfect analysis
01:56:21.980 | except whoever created this simulation.
01:56:24.380 | Let me ask you this question.
01:56:28.660 | Who is Satoshi Nakamoto?
01:56:33.100 | There's quite a few people who suggest that person is you.
01:56:37.760 | So is it you?
01:56:42.660 | - No.
01:56:43.500 | - Who do you think it could be?
01:56:46.740 | - I don't know who it is.
01:56:48.620 | I think if I had to guess,
01:56:50.100 | it's probably a group of people,
01:56:52.060 | some of which might not even be around anymore.
01:56:54.420 | Obviously I'm very grateful to,
01:56:57.380 | if this is a singular or a group of people
01:56:59.180 | for kicking off this entire industry
01:57:00.660 | and making this amazing change in the world
01:57:02.740 | that I have the privilege and luxury
01:57:05.500 | of being part of in some small way in the work that I do.
01:57:08.460 | I think also this kind of focus on who is Satoshi
01:57:13.460 | or who isn't Satoshi shouldn't in my opinion matter so much
01:57:17.660 | because regardless of who it is,
01:57:20.020 | that in my opinion should have no substantial,
01:57:24.060 | significant effect or bearing on the functioning
01:57:27.740 | or the value or the use or the security
01:57:31.300 | of the Bitcoin system.
01:57:33.460 | So I think whoever it is,
01:57:35.820 | they're probably better off not making that public.
01:57:38.860 | And I think beyond that,
01:57:40.220 | whoever it turned out to be shouldn't matter
01:57:43.740 | because it has nothing to do
01:57:46.340 | with how the system is made useful
01:57:48.620 | or secure or anything else.
01:57:50.740 | And so I think that's the point of view that I have.
01:57:54.460 | - Now, if you were Satoshi Nakamoto, would you tell me?
01:57:58.160 | Because you said they shouldn't,
01:58:02.220 | whoever Satoshi is, he should keep that private.
01:58:05.500 | So would you tell it to me or no?
01:58:07.580 | - We're in some kind of weird like thought experiment here.
01:58:12.060 | If I was this guy, let me think about this,
01:58:16.980 | which I'm not by the way,
01:58:18.140 | I am not this person.
01:58:19.260 | - But if you were, would you say it?
01:58:21.580 | - I think probably not.
01:58:29.380 | I don't see the,
01:58:30.220 | I think that they would cause a lot of distraction
01:58:33.020 | and a lot of weird stuff.
01:58:34.940 | And so realistically, I don't think it would help anybody
01:58:38.900 | or even the person who discloses it.
01:58:40.860 | But just to be clear, I am not.
01:58:42.500 | And whoever it is, I think they haven't said anything
01:58:45.020 | because they don't want the attention
01:58:46.540 | and they don't want the distraction
01:58:47.940 | and they don't want all the problems from this.
01:58:49.820 | And that makes sense to me conceptually.
01:58:53.060 | - It's fascinating to think if they're still out there
01:58:55.180 | and part of the Bitcoin, the cryptocurrency community.
01:58:58.800 | And it is inspiring to think that if they're out there,
01:59:04.940 | that they're not revealing their identity
01:59:09.100 | because it would be a distraction.
01:59:11.380 | That's kind of inspiring that people are like that.
01:59:13.080 | Just like George Washington,
01:59:15.740 | relinquishing power is inspiring
01:59:18.020 | 'cause it's ultimately about the progress of the community.
01:59:20.580 | Not some kind of ego driven attention scheme.
01:59:24.040 | Again, very inspiring.
01:59:26.380 | The humans at their best are inspiring.
01:59:29.740 | What do you think about the certainty
01:59:31.700 | that people in the Bitcoin maximalist community
01:59:34.780 | have about this particular piece of technology, Bitcoin?
01:59:37.720 | Is there something interesting that you think
01:59:41.560 | that you might wanna say about this community
01:59:44.380 | or is it just is what it is?
01:59:46.600 | - I think at the end of the day,
01:59:49.780 | results speak for themselves
01:59:51.740 | and Bitcoin has had an amazing impact on our industry
01:59:55.900 | and has had an amazing impact on the world.
01:59:59.020 | And I think the result is still
02:00:01.140 | that Bitcoin is very widely adopted
02:00:03.480 | and driving the adoption of our industry in many ways.
02:00:07.260 | So I think it's very difficult for people to say that,
02:00:09.860 | Bitcoin maximalists don't have something
02:00:12.820 | that they can latch onto and say,
02:00:14.140 | hey, there's something very real here.
02:00:16.380 | I think there's been decisions made by the Bitcoin community
02:00:20.020 | and the people who made the Bitcoin protocol
02:00:21.900 | to focus it on Bitcoin
02:00:24.460 | and to focus it on the kind of storing
02:00:28.220 | of the ledger of Bitcoin and the information about Bitcoin
02:00:31.100 | and the transaction of Bitcoin and to focus on securing that.
02:00:34.900 | And I understand why that decision was made
02:00:37.580 | to a certain degree, right?
02:00:38.580 | It was about focus.
02:00:39.980 | It was about getting something worthwhile, right?
02:00:43.100 | Without adding additional features and additional risk.
02:00:46.820 | And that decision is a decision that was made
02:00:49.980 | and has kind of the benefits of focus
02:00:53.780 | and the benefits of a certain amount of security
02:00:57.460 | and a certain amount of guarantees around Bitcoin
02:01:01.140 | and what that is and the value of that.
02:01:04.380 | And then it has certain limitations
02:01:07.780 | as a consequence of doing less
02:01:10.860 | of not having the system hold data that isn't related to Bitcoin
02:01:13.700 | or not having the system hold contracts,
02:01:16.820 | contractual outcomes or smart contract code.
02:01:19.980 | So I think it's just kind of a decision, right?
02:01:22.060 | And I understand why they're excited and I'm very excited.
02:01:25.620 | I started in this industry going to Bitcoin meetups
02:01:28.340 | and I met a lot of fantastic people, libertarian people
02:01:32.020 | that wanted to see the world work differently
02:01:34.500 | and shared a lot of my beliefs and a lot of my points of view.
02:01:37.660 | And so, anyone who's been in the industry
02:01:40.100 | as long as I have, has had to come
02:01:44.260 | from the Bitcoin ecosystem by virtue
02:01:45.900 | of kind of starting out that early.
02:01:48.300 | So I have an unbelievable amount of respect
02:01:50.740 | and admiration and gratitude for Bitcoin
02:01:55.100 | and that it exists and everything that it's done
02:01:56.660 | and that it birthed this industry.
02:01:58.180 | There's absolutely no doubt about that.
02:02:00.140 | At the same time, whatever design decisions people make
02:02:04.540 | are the design decisions they make, right?
02:02:07.100 | And so if you've made a design decision
02:02:09.100 | that this ledger and this thing will be about Bitcoin,
02:02:12.020 | it won't be about colored coins,
02:02:14.020 | it won't be about OP_RETURN at 80 bytes,
02:02:16.620 | it won't be about these other kind of nuances
02:02:20.620 | that you don't want this to be about, then that's fine.
02:02:23.820 | That's fine and that's a logical decision
02:02:26.540 | and that's called focus.
02:02:28.260 | And focus has a lot of value
02:02:30.220 | and a lot of great technology products
02:02:32.780 | have focused on something and done that.
02:02:36.660 | And then there's a lot of smart people around Bitcoin
02:02:39.140 | building kind of additional systems
02:02:41.900 | that anchor their security within Bitcoin.
02:02:44.300 | And I think that's an interesting approach
02:02:46.020 | that could bear fruit.
02:02:47.620 | I think it'll eventually require an interaction
02:02:51.220 | with a Bitcoin protocol in more advanced ways.
02:02:54.380 | And then there'll be another question of,
02:02:56.260 | what is the design decision for Bitcoin?
02:02:59.020 | Is it that Bitcoin will be just about the Bitcoin ledger
02:03:02.620 | or does Bitcoin want to evolve into an anchor
02:03:07.620 | for all these other systems
02:03:10.660 | and maybe create additional data store,
02:03:13.300 | kind of more data on the Bitcoin blockchain related to that.
02:03:17.340 | So I'm excited to see how that evolves,
02:03:20.620 | but until then kind of results speak for themselves
02:03:23.620 | and the results that Bitcoin has achieved for our industry
02:03:26.500 | and for itself as kind of the dominant cryptocurrency
02:03:30.300 | and the conception of our industry
02:03:31.980 | that people interact with first is obviously very important
02:03:35.260 | and something that I think really everybody in our industry
02:03:39.220 | is grateful for, right?
02:03:40.220 | Because without Bitcoin, where would our industry be?
02:03:44.020 | And that's obviously something that we can't forget.
02:03:46.980 | - What are your thoughts about Ethereum
02:03:49.540 | in the Chainlink distributed Oracle network world?
02:03:54.500 | Is it competition?
02:03:57.380 | Is it collaboration?
02:03:59.460 | Is it complementary technology?
02:04:02.220 | What do you think about Ethereum?
02:04:03.500 | How much do you think about Ethereum?
02:04:05.300 | What role does it have?
02:04:07.140 | - Yeah, I think about a lot.
02:04:08.500 | I think we're completely complementary.
02:04:11.100 | So there's no competitive dynamics in my opinion.
02:04:13.940 | We are completely collaborative and complementary
02:04:16.420 | with Ethereum and all other blockchains
02:04:18.780 | and all other layer twos that operate a contract, right?
02:04:23.020 | So we do not seek to operate a smart contract.
02:04:26.220 | We seek to augment and enable smart contracts
02:04:29.860 | to go further in what they're able to do.
02:04:32.940 | In fact, Oracle networks have some value
02:04:36.060 | but they don't have nearly as much value in what they do
02:04:38.500 | if there isn't a mission critical system
02:04:40.580 | like a smart contract that needs their data, right?
02:04:43.420 | So we've made our own explicit design decisions
02:04:46.100 | and created our own focus
02:04:48.660 | around guaranteeing that smart contracts can go further.
02:04:53.660 | We've already done that, right?
02:04:54.940 | Decentralized finance, the rate at which we put data
02:04:57.860 | is to a degree the rate at which certain
02:04:59.380 | decentralized financial markets grow.
02:05:01.460 | And as we put more data,
02:05:02.540 | we see more financial products go live.
02:05:04.860 | Gaming, we provide VRF.
02:05:06.020 | So we have this kind of focus
02:05:07.460 | and it's a very useful and valuable
02:05:09.660 | focus for our industry.
02:05:13.780 | At the end of the day,
02:05:15.420 | I think that smart contract platforms like Ethereum
02:05:19.620 | made a different set of design decisions
02:05:22.460 | from Bitcoin and others.
02:05:23.500 | And they focused on creating the smart contract capability
02:05:26.940 | and they kind of wanted that functionality to exist.
02:05:32.700 | And I think since then,
02:05:33.540 | there's been a number of people
02:05:34.380 | that try to improve on that
02:05:35.500 | or try to make variants of that.
02:05:37.940 | From our point of view,
02:05:39.540 | we want to support smart contracts
02:05:41.860 | in all of their variations and in all of their use cases.
02:05:46.380 | So one of the things that I personally like about Chainlink
02:05:49.220 | is their ability or Chainlink's ability
02:05:52.420 | and the Chainlink network's ability
02:05:53.580 | to be useful to many different chains
02:05:55.820 | and across many different use cases.
02:05:58.420 | I'm personally a fan of Ethereum.
02:06:00.420 | Ethereum has done a huge amount for our industry as well.
02:06:03.220 | Ethereum took us from a world where
02:06:05.340 | it literally took months to make a new smart contract
02:06:07.980 | by being forced to code it into a protocol.
02:06:10.980 | You had to go to the protocol developers
02:06:12.580 | and you had to say, "Hey, I need a DEX
02:06:15.340 | or I need some kind of smart contract."
02:06:17.380 | Put it in the protocol itself.
02:06:19.340 | Put it in the actual blockchain mining
02:06:21.900 | and kind of block generation,
02:06:23.460 | transaction generation protocol.
02:06:25.420 | That would take months or sometimes even over a year.
02:06:27.700 | That was a horrible experience.
02:06:29.100 | And obviously very few people wanted to participate in that.
02:06:31.660 | And so very few people made smart contracts,
02:06:33.820 | which I was not a fan of.
02:06:35.940 | And then Ethereum came along
02:06:37.220 | and really did a lot of innovative things
02:06:39.740 | and introduced this approach to scriptable smart contracts
02:06:43.380 | where you could script all of these different conditions.
02:06:47.020 | And I found that fascinating.
02:06:49.020 | Before Ethereum, I found that fascinating.
02:06:50.740 | Once Ethereum arrived,
02:06:52.180 | I found it fascinating after Ethereum launched
02:06:54.420 | and I still find it fascinating.
02:06:55.700 | And I'm also very grateful to Vitalik
02:06:58.700 | and the Ethereum community
02:06:59.660 | and all the core developers there
02:07:01.380 | for taking our industry a step further.
02:07:03.980 | So I think they absolutely deserve a huge amount of credit
02:07:06.700 | for taking our industry from,
02:07:09.140 | it takes months to make a really small smart contract
02:07:12.180 | to it takes weeks to make a relatively secure,
02:07:15.980 | relatively advanced piece of on-chain code
02:07:18.260 | that anybody can script and people can do audits on.
02:07:21.580 | And that's an unbelievable leap forward for our industry.
02:07:25.580 | And I'm genuinely grateful to them for that.
02:07:28.260 | I think the next step in line with our body of work
02:07:32.180 | is how does that scriptable on-chain code
02:07:36.420 | become more advanced in its interaction
02:07:40.100 | with all of the systems and events in the real world,
02:07:44.380 | which is in my opinion,
02:07:45.420 | the final missing piece of the puzzle, right?
02:07:48.380 | So my body of work,
02:07:49.660 | the body of work that I'm involved in
02:07:51.220 | would not be where it is right now
02:07:52.580 | without Bitcoin by any measure.
02:07:54.620 | It wouldn't even be where it is now without Ethereum
02:07:57.140 | and the growth in smart contract development
02:07:59.700 | that they've created.
02:08:01.380 | And now what I think is gonna happen next
02:08:04.820 | is there'll be a lot of different smart contract platforms,
02:08:07.580 | a lot of different layer twos.
02:08:09.300 | Some of them will be private for enterprise.
02:08:11.020 | Some of them will be public.
02:08:12.500 | There'll be some public winners in certain geographies
02:08:15.140 | for maybe regulation reasons, maybe other reasons.
02:08:18.100 | There'll be other public winners,
02:08:21.940 | for the larger internet,
02:08:21.940 | and there'll be a number of different people
02:08:23.580 | building smart contracts in different languages.
02:08:26.580 | We are excited and I am excited
02:08:29.260 | and the Chainlink community is excited.
02:08:30.940 | And basically there's a lot of,
02:08:33.340 | I mean, for lack of a better word, excitement
02:08:37.020 | in seeing our industry graduate
02:08:40.060 | to providing more use cases,
02:08:43.940 | more usable hybrid smart contracts, right?
02:08:46.820 | Because once again, it's absolutely amazing
02:08:49.300 | that Bitcoin created non-governmental fiat money.
02:08:51.780 | It's an unbelievable innovation
02:08:53.380 | and invented decentralized infrastructure
02:08:55.860 | and birthed our industry.
02:08:57.540 | It's an unbelievably great achievement,
02:09:00.380 | an amazing achievement
02:09:01.540 | that we now have scriptable smart contracts
02:09:03.300 | through something like Ethereum.
02:09:04.700 | Once again, monumental achievement in my opinion.
02:09:07.740 | Once again, we still need to look to the future.
02:09:10.820 | We need to look to how do we take
02:09:13.500 | the decentralized infrastructure concepts
02:09:16.140 | that Bitcoin initially put forward,
02:09:17.940 | that Ethereum then improved upon
02:09:19.700 | and created into these scriptable smart contract formats,
02:09:23.420 | and how do we expand that
02:09:25.020 | into the world of real world outcomes
02:09:28.060 | to change the global financial industry,
02:09:29.900 | the global trade industry,
02:09:31.380 | the global data marketplace industry,
02:09:33.580 | and many other global industries.
02:09:36.260 | - You mentioned results speak for themselves
02:09:38.340 | and how design decisions have consequences.
02:09:43.100 | The Chainlink community have come up
02:09:45.020 | with a lot of brilliant designs.
02:09:48.180 | So how do you think through the design choices
02:09:51.780 | that you're facing,
02:09:53.180 | where you can't predict the future,
02:09:55.580 | but you're trying to create a better future?
02:09:58.740 | Is there something low level,
02:10:02.540 | introspective advice that you can give
02:10:06.060 | or describe as to how you think through those decisions
02:10:09.180 | or high level, how you think about those decisions?
02:10:13.100 | - Sure, absolutely.
02:10:14.540 | I think that's a great question.
02:10:17.220 | And I think that actually gets to the core
02:10:19.460 | of what the Chainlink network is supposed to achieve.
02:10:23.180 | We are supposed to achieve a maximally flexible system.
02:10:27.580 | So once again, this is the big difference
02:10:29.980 | between Chainlink and Oracle networks in general
02:10:32.580 | and blockchains in my opinion.
02:10:34.580 | Blockchains do not seek to be maximally flexible, right?
02:10:37.340 | They say, here's my block size,
02:10:40.260 | here's the transaction types you can put in those blocks.
02:10:43.940 | Here's the contract language I have.
02:10:46.180 | Here's kind of my blockchain system, right?
02:10:48.620 | Here's the fee structure for those blocks.
02:10:51.660 | They're gonna keep getting, you know,
02:10:52.860 | kind of composed transactions are gonna get put into blocks,
02:10:56.500 | blocks will get connected and it'll continue, right?
02:10:58.380 | And that's a very focused type of system.
02:11:00.460 | And that's great.
02:11:01.300 | And that makes sense because it's focused
02:11:03.060 | on creating security for that category
02:11:05.900 | of on-chain activity, which is once again,
02:11:08.340 | a critical, critical part of building
02:11:10.300 | a highly transparent system
02:11:11.780 | and something that Chainlink enables
02:11:13.860 | and, you know, doesn't compete with
02:11:14.940 | and just enables to do more.
02:11:16.340 | Oracle networks conversely have to interact
02:11:21.380 | with all the world's data and provide all the services
02:11:25.540 | that blockchains don't provide, right?
02:11:27.220 | So there's kind of a spectrum.
02:11:28.540 | On one end of the spectrum, you have blockchains
02:11:31.020 | that are highly secure, highly reliable,
02:11:33.340 | highly tamper-proof, highly transparent,
02:11:36.620 | but are not very feature-rich.
02:11:38.500 | For example, they cannot talk to an API.
02:11:41.020 | Many of them can't generate randomness.
02:11:42.540 | They cannot do some kind of privacy-preserving computation.
02:11:46.180 | So they're very secure and there are these kind
02:11:48.220 | of data structures and smart contract platforms
02:11:52.060 | to hold on-chain code that can define conditions,
02:11:55.700 | receive value, pay value back out under conditions
02:11:59.340 | and create transparency around all that,
02:12:01.060 | which makes perfect sense.
02:12:03.100 | And then there's oracles and oracle networks.
02:12:06.460 | That is all the world's data, right?
02:12:08.420 | We're talking about taking all the world's data
02:12:11.100 | and making it consumable for all the world's use cases
02:12:14.620 | that have trust issues.
02:12:16.300 | So the amount of variability there is absolutely massive.
02:12:19.180 | Right?
02:12:20.020 | It's like the decentralized oracle network
02:12:22.340 | and the conditions that that decentralized oracle network
02:12:24.780 | needs to meet are going to vary very widely
02:12:28.340 | from an insurance contract to a lending contract
02:12:32.060 | to an ad network contract to the data sales contract
02:12:36.740 | that we discussed to any number of other smart contracts.
02:12:40.740 | So really the ability of a decentralized oracle network
02:12:44.660 | to flexibly address all of those requirements
02:12:47.820 | is what's necessary.
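A simplified illustration of the split being described, assuming a toy price feed: the "on-chain" piece is deterministic and cannot call external APIs itself, so independent oracle nodes fetch the data off-chain and deliver reports that the contract then aggregates deterministically. All feed names and values here are hypothetical.

```python
import statistics
from dataclasses import dataclass
from typing import Optional

@dataclass
class MiniContract:
    """Stand-in for on-chain code: deterministic, unable to call external APIs
    itself, it only reacts to values that oracle nodes deliver to it."""
    latest_price: Optional[float] = None

    def fulfill(self, reported_prices):
        # Deterministic aggregation of what independent oracle nodes reported.
        self.latest_price = statistics.median(reported_prices)

def oracle_node_fetch(source_name):
    """Stand-in for the off-chain half: a node talks to a data source the chain
    can't reach. Hardcoded here; a real node would make the API call and sign
    the report it delivers on-chain."""
    fake_feeds = {"exchange_a": 101.0, "exchange_b": 99.5, "exchange_c": 100.2}
    return fake_feeds[source_name]

if __name__ == "__main__":
    contract = MiniContract()
    reports = [oracle_node_fetch(s) for s in ("exchange_a", "exchange_b", "exchange_c")]
    contract.fulfill(reports)
    print("value visible to on-chain code after oracle delivery:", contract.latest_price)
```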
02:12:48.980 | - So flexibility is the goal, whereas with on-chain
02:12:53.700 | like Bitcoin, flexibility is the enemy
02:12:57.620 | in the sense that you want security,
02:13:01.020 | you want the focus there.
02:13:03.380 | And in that kind of world,
02:13:05.500 | design decisions have huge consequences.
02:13:08.220 | And then if you look at the distributed oracle network side,
02:13:12.060 | you want to remove the restrictions of design choices.
02:13:16.220 | You want to provide maximal flexibility then.
02:13:19.060 | So it's a completely separate kind of design framework.
02:13:23.740 | - It's a slightly different problem, right?
02:13:25.180 | Because we're not trying to define transaction types
02:13:28.220 | fitting into blocks on a certain timeline
02:13:30.460 | and have those blocks being generated.
02:13:32.260 | We're trying to say, "Hey, there's this world of services
02:13:35.260 | or this world of data that's not very deterministic,
02:13:37.780 | but it's unbelievably useful
02:13:39.340 | to these smart contracts over here."
02:13:40.940 | And actually they needed to even exist.
02:13:43.060 | And we really want them to exist because once they exist,
02:13:45.580 | it's going to completely redefine
02:13:46.780 | what our whole industry is known for, right?
02:13:48.780 | And DeFi and NFTs are not even the tip of the iceberg.
02:13:51.660 | They're like the snow coming off the top of the iceberg.
02:13:54.420 | And so our goal is to create a framework
02:13:58.540 | and an infrastructure and a software
02:14:01.180 | that allows people to compose
02:14:04.420 | decentralized oracle networks, right?
02:14:06.180 | So initially you can compose a decentralized oracle network
02:14:10.060 | of seven nodes that goes to three data sources
02:14:12.940 | to trigger your contract worth a million dollars.
02:14:15.420 | And that's where you could start.
02:14:16.580 | And then let's say your smart contract,
02:14:18.100 | your DeFi smart contract goes to a billion dollars.
02:14:20.620 | Well, then you need to make some changes, right?
02:14:22.580 | You need to go from seven nodes to 15 or maybe 31 nodes.
02:14:26.860 | And you need to go from three data sources to five or seven.
02:14:30.100 | And you maybe need to create some kind of
02:14:32.940 | what we call circuit breakers and some other checks.
02:14:36.980 | And you need to make sure that the decentralized oracle
02:14:39.020 | network comes to consensus around those checks.
02:14:41.540 | Because now the decentralized oracle network
02:14:43.660 | isn't controlling a million dollars,
02:14:44.700 | it's controlling a billion dollars.
02:14:46.180 | And we have decentralized oracle networks
02:14:48.020 | that control well over a billion dollars,
02:14:49.580 | multiple billions of dollars.
02:14:51.300 | And we see them growing and getting more advanced
02:14:53.260 | data sources and more advanced features.
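
A minimal sketch of the aggregation-and-safety-check idea described above, written in Python with made-up node reports and thresholds (this is not Chainlink's actual implementation): a set of oracle nodes report values, the median filters out a misbehaving minority, and a circuit breaker refuses updates that move too far from the last accepted answer.

```python
from statistics import median

# Hypothetical reports for one round, e.g. an ETH/USD price reported
# by 7 independent oracle nodes (illustrative values only).
node_reports = [1804.2, 1805.0, 1804.9, 1803.7, 1805.3, 1804.8, 9999.0]

MIN_RESPONSES = 5        # quorum: require a majority of the 7 nodes
MAX_DEVIATION = 0.10     # circuit breaker: reject >10% jumps in one round
last_accepted = 1802.5   # value the network agreed on last round

def aggregate(reports, last_value):
    """Median-aggregate node reports with a simple circuit breaker."""
    if len(reports) < MIN_RESPONSES:
        raise RuntimeError("not enough node responses to reach consensus")
    answer = median(reports)  # median tolerates a minority of bad nodes
    if abs(answer - last_value) / last_value > MAX_DEVIATION:
        raise RuntimeError("circuit breaker tripped: answer moved too far")
    return answer

print(aggregate(node_reports, last_accepted))  # outlier 9999.0 is ignored
```

Scaling from seven nodes to 31, or from three data sources to seven, changes only the inputs to this kind of function; the consensus and circuit-breaker checks keep the same shape as the value at stake grows.
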
02:14:55.940 | And then if somebody else comes and says,
02:14:57.660 | "Well, I don't really wanna make a DeFi product,
02:14:59.900 | I wanna make crop insurance.
02:15:01.500 | And I have a completely different set of conditions.
02:15:03.540 | I want this method of consensus.
02:15:05.380 | And I want data to be aggregated in this way,
02:15:07.780 | but not the way that you do
02:15:08.780 | for decentralized financial products."
02:15:10.860 | I mean, what are we supposed to tell them?
02:15:11.980 | We're supposed to tell them,
02:15:12.820 | "No, our decentralized oracle network can't let you do that.
02:15:16.540 | And you can go and wait another five years
02:15:19.580 | until somebody builds it for you."
02:15:22.980 | That's not what we wanna do, right?
02:15:24.580 | What we wanna do is be able to say,
02:15:26.300 | "Absolutely, here's an example of how somebody else
02:15:30.460 | made a decentralized oracle network
02:15:32.020 | for weather insurance, right?
02:15:34.860 | Here's a template.
02:15:37.820 | Change that template, evolve it to meet your needs."
02:15:41.300 | And then someone else comes and says,
02:15:42.620 | "Hey, I have some other use case in gaming, right?
02:15:46.260 | I wanna make NFTs related to real world sports events,
02:15:49.540 | or I wanna do whatever I wanna do
02:15:50.980 | with some kind of sports related data."
02:15:54.100 | Wonderful.
02:15:55.140 | Here's the framework.
02:15:56.140 | Here are your risk dynamics.
02:15:58.180 | Here's a collection of node operators.
02:16:00.460 | Here's a set of pre-integrated data sources.
02:16:03.940 | Here's a reputation system to assess the quality
02:16:08.060 | of your ensemble of nodes.
02:16:09.900 | Here's a way to scale that up
02:16:11.260 | as the value in your contract scales.
02:16:14.580 | Here's all the tools that you need to build this contract.
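
As a rough, hypothetical illustration of the reputation-system idea (operator names and numbers here are invented), node operators could be scored on uptime and historical agreement with consensus, and the top performers selected for a new oracle network:

```python
# Hypothetical operator records: (name, uptime fraction, past deviation from consensus)
operators = [
    ("node-a", 0.999, 0.001),
    ("node-b", 0.990, 0.004),
    ("node-c", 0.950, 0.020),
    ("node-d", 0.998, 0.002),
    ("node-e", 0.880, 0.050),
]

def reputation(uptime, deviation):
    # Toy score: reward uptime, penalize disagreement with past consensus.
    return uptime - 10 * deviation

# Pick the best-scoring operators for, say, a 3-node network.
ranked = sorted(operators, key=lambda o: reputation(o[1], o[2]), reverse=True)
selected = [name for name, _, _ in ranked[:3]]
print(selected)  # ['node-a', 'node-d', 'node-b']
```
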
02:16:18.540 | And what we actually see now
02:16:20.420 | as there are multiple types of computations
02:16:23.100 | and data sources that are provided
02:16:25.100 | by different decentralized oracle networks,
02:16:27.420 | of which there are now hundreds,
02:16:29.460 | we now see that a single hybrid smart contract
02:16:33.220 | might use multiple decentralized oracle networks.
02:16:35.980 | So there might be a hybrid smart contract
02:16:38.140 | that uses a price-data decentralized oracle network,
02:16:41.300 | a proof of reserve oracle network,
02:16:43.300 | a randomness oracle network.
02:16:45.580 | And I think we're gonna continue to see this dynamic
02:16:48.820 | that more and more advanced contracts
02:16:51.220 | compose various decentralized oracle networks
02:16:55.060 | into more advanced use cases.
02:16:57.340 | And this is the dynamic that we're focused on enabling.
02:17:02.340 | And I think it's actually a very virtuous cycle
02:17:05.060 | for everybody because the more of these hybrid smart
02:17:08.100 | contracts we enable on Ethereum and other blockchains,
02:17:12.060 | the more our industry provides real world outcomes
02:17:15.620 | to the larger world, which is at the end of the day,
02:17:18.420 | what I think everybody in our industry wants.
02:17:21.180 | Everybody in our industry wants hybrid smart contracts
02:17:24.780 | to become the way that global finance works,
02:17:28.220 | global trade works, global insurance products work,
02:17:31.340 | because they will inherently need both a blockchain
02:17:34.380 | on which the contract itself lives,
02:17:37.020 | and an oracle network that powers
02:17:39.340 | all of the other interactions, right?
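
To make the composition idea concrete, here is a toy Python sketch of the settlement logic a hybrid smart contract might run: the conditional logic is the part that would live on-chain, and each input would be delivered by a different decentralized oracle network (price feed, proof of reserve, verifiable randomness). The names and values are illustrative assumptions, not real contract code.

```python
from dataclasses import dataclass

@dataclass
class OracleInputs:
    eth_usd_price: float          # from a price-feed oracle network
    reserves_fully_backed: bool   # from a proof-of-reserve oracle network
    random_word: int              # from a verifiable-randomness oracle network

def settle(inputs, strike_price, players):
    """Toy hybrid-contract logic combining three oracle-delivered values."""
    payout_triggered = (
        inputs.eth_usd_price >= strike_price  # price condition
        and inputs.reserves_fully_backed      # collateral sanity check
    )
    # Use the random word to pick a winner, e.g. for an NFT drop or raffle.
    winner = players[inputs.random_word % len(players)] if players else None
    return {"payout_triggered": payout_triggered, "winner": winner}

print(settle(OracleInputs(1804.9, True, 42), strike_price=1800.0,
             players=["alice", "bob", "carol"]))
```
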
02:17:41.940 | - As a developer, how would you recommend
02:17:44.540 | somebody listening to this, but also me,
02:17:50.380 | to get started with smart contracts
02:17:53.060 | and to get started with hybrid smart contracts?
02:17:56.180 | - Well, for hybrid smart contracts,
02:17:59.260 | I'm gonna have to do some kind of shameless promotion.
02:18:01.860 | - Please, let me twist your arm.
02:18:04.460 | - Thank you.
02:18:06.780 | I think you can go to our YouTube.
02:18:08.660 | We have a number of developer tutorials.
02:18:11.220 | - Chainlink YouTube?
02:18:12.300 | - Yeah, Chainlink.
02:18:13.500 | I think if you just search Chainlink on YouTube,
02:18:15.020 | you should find it.
02:18:16.260 | Beyond that, we recently had a hackathon
02:18:19.620 | where we had a huge amount of very,
02:18:21.660 | very kind of advanced hybrid smart contracts getting built.
02:18:25.100 | - To elaborate on that, you had a hackathon.
02:18:28.380 | Is that something that people can follow along
02:18:30.700 | like a video or there's webpage traces of what happened,
02:18:35.660 | or is there future actual hackathons
02:18:37.780 | that people could literally participate in?
02:18:39.700 | - There's plenty of more hackathons coming up.
02:18:41.620 | We wanna enable as many developers in web three and web two
02:18:44.580 | to build hybrid smart contracts
02:18:46.180 | as a way to redefine our industry
02:18:48.020 | and kind of make all of these smart contracts come to life.
02:18:51.540 | There are definitely gonna be more hackathons.
02:18:54.140 | So people should go and pre-register,
02:18:56.340 | register on a list to get involved in that.
02:18:58.340 | That's a great resource where we have a lot of speakers
02:19:00.340 | and a lot of educational tools.
02:19:01.540 | They happen over a course of weeks, not days.
02:19:04.780 | So there's a long time for people to work on these things
02:19:07.180 | at the speed that they find comfortable.
02:19:09.900 | - Two questions.
02:19:10.740 | One, is there a kind of a hello world entry point
02:19:14.420 | for hybrid smart contracts?
02:19:17.820 | And two, on the hackathon side,
02:19:19.420 | like what kind of stuff do you see people building at first?
02:19:22.380 | Just kind of getting their feet wet,
02:19:24.860 | like in terms of the kind of applications
02:19:27.260 | that could be enabled.
02:19:28.900 | - I mean, there's unbelievable things
02:19:30.500 | that we see people building.
02:19:32.460 | I think how to get your feet wet,
02:19:35.180 | I think the hello world is probably DeFi
02:19:37.180 | 'cause it's pretty straightforward.
02:19:38.580 | And there's a large amount of data sources
02:19:40.300 | that we already have putting data on chain on test net,
02:19:42.820 | which is the test environment in which people would build.
02:19:45.380 | So I think DeFi is probably to a certain degree,
02:19:47.780 | the most exciting for certain people
02:19:49.220 | and pretty, pretty expansive in terms of the tutorials
02:19:54.140 | and the amount of contracts
02:19:55.340 | to see how people have already built it.
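
For a hello-world flavor of that DeFi starting point, a common first step is simply reading a price feed. Below is a rough off-chain sketch using web3.py; the RPC URL and feed address are placeholders you would replace with a test-network endpoint and a real ETH/USD feed address from Chainlink's documentation, and it assumes a recent web3.py version. (Chainlink's own tutorials typically show the equivalent on-chain Solidity consumer.)

```python
# pip install web3 -- an off-chain sketch; assumes a recent web3.py version.
from web3 import Web3

RPC_URL = "https://example-testnet-rpc.invalid"  # placeholder: your test-network RPC
FEED_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder: a real
# ETH/USD feed address from the Chainlink docs for your chosen test network.

# Minimal ABI covering the two read functions of an AggregatorV3Interface-style feed.
FEED_ABI = [
    {"name": "decimals", "type": "function", "stateMutability": "view",
     "inputs": [], "outputs": [{"name": "", "type": "uint8"}]},
    {"name": "latestRoundData", "type": "function", "stateMutability": "view",
     "inputs": [],
     "outputs": [{"name": "roundId", "type": "uint80"},
                 {"name": "answer", "type": "int256"},
                 {"name": "startedAt", "type": "uint256"},
                 {"name": "updatedAt", "type": "uint256"},
                 {"name": "answeredInRound", "type": "uint80"}]},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
feed = w3.eth.contract(address=Web3.to_checksum_address(FEED_ADDRESS), abi=FEED_ABI)

decimals = feed.functions.decimals().call()
_, answer, _, updated_at, _ = feed.functions.latestRoundData().call()
print(f"latest price: {answer / 10 ** decimals} (last updated at {updated_at})")
```
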
02:19:57.780 | I think beyond that,
02:19:58.620 | we see people building amazing things at these hackathons.
02:20:01.020 | In the previous hackathon,
02:20:02.380 | we saw somebody build a smart contract
02:20:04.700 | that allows someone to rent out their Tesla, right?
02:20:09.380 | So it allows the Tesla API to give someone else access
02:20:14.380 | and rent out someone's Tesla
02:20:16.500 | on the basis of a smart contract
02:20:18.060 | kind of coordinating payment,
02:20:19.820 | which was kind of amazing.
02:20:22.420 | The more recent hackathon,
02:20:23.820 | we saw something called D-Bridge,
02:20:25.580 | which is a cross-chain solution that uses Oracle networks
02:20:28.340 | to confirm data on different chains.
02:20:30.580 | So I think the things that people build
02:20:33.820 | will just become expansive and varied in ways
02:20:37.180 | that I can't even imagine.
02:20:38.880 | But I think this recent hackathon saw a huge, huge list
02:20:44.380 | of different kind of winners in different categories.
02:20:48.020 | And there's so many different categories.
02:20:49.300 | We even have a GovTech category
02:20:50.780 | and a whole bunch of things.
02:20:53.060 | If people wanna see what's possible,
02:20:54.580 | they can go look at the winners.
02:20:55.900 | I think that's probably a good idea.
02:20:57.900 | Yeah, that'll be on the site of the hackathon.
02:21:01.340 | There's a blog related to that
02:21:03.820 | and we're gonna have more of these.
02:21:05.340 | And once again, our explicit goal
02:21:08.420 | is to take our industry
02:21:10.860 | into this world of hybrid smart contracts,
02:21:13.180 | which just benefits everybody.
02:21:14.780 | It makes more on-chain activity.
02:21:16.860 | It helps provide real-world value to the average person
02:21:21.020 | from all of this infrastructure, period.
02:21:23.500 | And at the end of the day,
02:21:25.000 | I think that it just redefines what our industry
02:21:28.060 | is about through use cases, right?
02:21:29.540 | Because if you only learn about our industry
02:21:33.060 | from the point of view of a single use case,
02:21:34.980 | like the NFT use case or some other use case,
02:21:37.640 | that's what you'll think our industry is about.
02:21:39.700 | And the more of these use cases
02:21:41.600 | that people can make available to the average person
02:21:44.280 | or to the fintech world
02:21:45.640 | or to the insurance world or wherever,
02:21:47.620 | the faster our industry will not just be
02:21:50.700 | about Bitcoins or tokens.
02:21:52.620 | It will be about changing global finance,
02:21:55.180 | changing global insurance, changing global trade.
02:21:58.080 | And that's the change in the world
02:22:00.360 | that I and a lot of other people in this industry,
02:22:02.660 | I think, got into this for.
02:22:04.680 | - Now, it's funny, you've mentioned about,
02:22:06.940 | you've had a lot of kind words to say about Bitcoin
02:22:09.460 | and Ethereum as important technology
02:22:12.420 | that paved the way for the future.
02:22:14.440 | And you somehow did not mention
02:22:16.920 | one of the most profound pieces of technology,
02:22:18.780 | which is Dogecoin.
02:22:20.380 | What are your thoughts about this particular
02:22:22.900 | revolutionary technology?
02:22:26.280 | And what are your thoughts about Dogecoin
02:22:28.000 | going to the moon, to Mars,
02:22:30.140 | and outside of the solar system?
02:22:32.540 | - I think Dogecoin is a very interesting kind of,
02:22:36.580 | probably closer to a social experiment
02:22:38.700 | than anything else.
02:22:40.040 | - Isn't everything a social experiment?
02:22:42.820 | - Yeah, I guess that's fair to a degree.
02:22:45.160 | I think it's fascinating how that's evolved.
02:22:49.160 | I think the people that made it
02:22:51.300 | had certain goals in mind,
02:22:53.060 | and then it's kind of taken on a life of its own.
02:22:56.080 | I don't fully understand exactly why
02:22:58.060 | it's taken on a life of its own at this point.
02:23:01.020 | I, once again, I don't spend too much time
02:23:03.020 | thinking about different tokens and how they're evolving.
02:23:07.060 | I'm much more focused on the launching and--
02:23:10.700 | - The technology around trust,
02:23:13.080 | and all those kinds of ideas.
02:23:15.020 | But I think one of the fascinating things about Dogecoin
02:23:18.820 | is how it's technology
02:23:22.240 | that leverages social dynamics,
02:23:26.340 | technology's ability to utilize fun
02:23:29.800 | and memes to spread.
02:23:32.180 | I think it's really interesting.
02:23:34.380 | I don't think it should be discounted
02:23:36.540 | as if, I think I tweeted today,
02:23:39.300 | something about the fundamental force field of fun,
02:23:43.060 | that fun has an effect on the space-time.
02:23:47.700 | So general relativity describes how mass and energy
02:23:52.700 | can curve space-time.
02:23:55.340 | And I was just giving an example that when life is fun,
02:23:58.460 | it seems short, when life is not fun, it seems very long.
02:24:01.520 | So fun has a very similar effect on space-time,
02:24:04.140 | like in curved space-time.
02:24:05.740 | In that same sense, there is a power to the meme.
02:24:10.100 | And I think Dogecoin illustrates that.
02:24:12.500 | I think Elon is an example of somebody that uses Dogecoin.
02:24:15.640 | I don't know his philosophy in particular on this aspect,
02:24:19.540 | but he does use it effectively to excite the world
02:24:23.720 | in a fun way about the possibilities of future technologies
02:24:27.440 | like cryptocurrency.
02:24:29.060 | I think the Bitcoin world is very serious right now.
02:24:32.700 | And we've spoken about Bitcoin maximalists.
02:24:36.740 | There is very little space for fun and joking
02:24:40.780 | in the Bitcoin world,
02:24:42.660 | but there's still a little bit of fun and humor left
02:24:47.260 | in the Dogecoin world.
02:24:48.620 | In that sense, I think it's exceptionally powerful
02:24:51.660 | to inspire, to excite,
02:24:53.260 | to be able to talk about stuff
02:25:00.340 | without the seriousness of financial impact
02:25:05.140 | that now certain cryptocurrencies have like Bitcoin.
02:25:08.940 | So I keep an eye on,
02:25:11.220 | I've previously mentioned that Dogecoin, I think,
02:25:13.820 | is a fascinating piece of technology
02:25:16.140 | because I do think cryptocurrency is much bigger
02:25:19.140 | than the technology that you focus on.
02:25:23.900 | There is also a social element that you also spoke to
02:25:27.780 | that's, I think, not quite yet understood.
02:25:31.100 | And it's fascinating to watch,
02:25:33.580 | especially as it co-evolves
02:25:35.460 | with the different tools on the internet,
02:25:37.340 | the different social networks,
02:25:39.620 | social network mechanisms on the internet.
02:25:42.180 | So I'm a huge supporter of Dogecoin
02:25:46.220 | because I'm a huge supporter of fun.
02:25:48.060 | - I'm fascinated to see how it'll work out.
02:25:52.140 | - You think it'll go to the moon?
02:25:55.740 | You think it'll be the first cryptocurrency
02:25:57.660 | to land on the moon?
02:25:59.260 | - I couldn't say.
02:26:00.100 | I haven't done the analysis, as I've said before.
02:26:04.300 | - I haven't done the analysis.
02:26:05.660 | Well, yeah.
02:26:06.500 | No matter what, I do hope we get humans back on the moon
02:26:12.500 | and hopefully get humans on Mars soon.
02:26:15.300 | Dogecoin, Bitcoin,
02:26:18.100 | or not.
02:26:21.380 | Let me ask you about books and movies.
02:26:26.020 | What books and movies in your life,
02:26:29.100 | long ago, when you were a baby Sergei,
02:26:32.300 | or today, had an impact on you?
02:26:35.500 | Maybe you would recommend to others.
02:26:39.140 | And maybe what ideas you took away from those books,
02:26:43.660 | movies, coloring books, children's books,
02:26:46.980 | blogs, whatever.
02:26:48.340 | - Yeah, yeah, sure.
02:26:51.460 | So I think one of the things that had a very big impact
02:26:53.980 | on me were Plato's dialogues,
02:26:55.980 | and particularly Protagoras and Gorgias
02:26:58.220 | as two of the initial ones.
02:27:01.500 | I think what Plato's dialogues do very well
02:27:05.620 | is they give people a clear picture
02:27:08.500 | of what dialogue looks like
02:27:10.020 | and what the assessment of information
02:27:12.540 | probably should look like, right?
02:27:15.300 | And how the dissection and analysis
02:27:18.500 | of an idea is very important.
02:27:20.500 | And how it can actually be taken in either direction.
02:27:22.860 | But at the end of the day,
02:27:24.580 | that the process of eliminating
02:27:26.820 | kind of this fuzzy thinking
02:27:30.020 | and arriving at whether it's an external dialogue
02:27:33.380 | or an internal dialogue about an accurate picture of reality
02:27:37.060 | is actually very important.
02:27:38.940 | And so I think I'm very lucky to have read the dialogues
02:27:42.460 | when I was in my early teenage years.
02:27:44.900 | And it had a very large impact on me
02:27:46.460 | because it kind of showed me that,
02:27:48.780 | nobody knows what they're talking about.
02:27:50.980 | I would play out dialogues in my mind
02:27:54.220 | and I would engage in certain dialogues
02:27:56.780 | with different people.
02:27:58.900 | And what the Platonic dialogue showed me
02:28:01.340 | was kind of how to tell when someone has no clue.
02:28:06.020 | And a lot of people are very good at
02:28:07.780 | kind of saying they have a clue, right?
02:28:12.660 | Saying like, here's how the world works.
02:28:14.260 | Here's what you should do with your life.
02:28:15.540 | Here's what you should do with your time.
02:28:16.780 | Here's what you should do with your money.
02:28:18.020 | Here's what you should do with your attention.
02:28:19.820 | Here's what you should do with all these things.
02:28:22.420 | And I think the ability to evaluate information
02:28:26.300 | generally is something that is surprisingly under taught.
02:28:31.020 | I don't actually understand why there isn't a course
02:28:33.900 | in like high schools or universities
02:28:35.940 | that's just like, here is how you evaluate information.
02:28:39.380 | Here's how you engage in external dialogue
02:28:41.700 | and internal dialogue
02:28:43.180 | to arrive at an accurate picture of reality
02:28:45.540 | rather than the picture of reality
02:28:47.260 | that other people want you to have
02:28:48.620 | for their benefit most often, right?
02:28:50.580 | And at the end of the day,
02:28:52.780 | I think that put me down a path
02:28:54.780 | to really try and understand.
02:28:57.140 | Beyond that, I think biographies
02:29:00.300 | have had a very large impact on me.
02:29:02.900 | Plutarch's Greek and Roman lives.
02:29:05.820 | After I read Plato, I started reading a bunch of stuff,
02:29:08.380 | Greek stuff.
02:29:09.220 | I was just like, these Greek guys,
02:29:10.340 | they really know how it is.
02:29:12.420 | They did this 2000 years ago and they still got it right.
02:29:15.340 | There's something here.
02:29:16.620 | It's kind of this like theory of time around
02:29:19.620 | the value of intellectual ideas, right?
02:29:21.060 | If an intellectual idea has survived the test of time,
02:29:24.780 | it's much more valuable than the intellectual idea
02:29:27.180 | that I just came up with 10 minutes ago,
02:29:28.700 | haven't told anybody and hasn't gone up against
02:29:31.300 | all of the kind of rebuttals.
02:29:35.380 | - So what's your favorite,
02:29:37.740 | what would you say would be a most impactful biography
02:29:41.260 | that you've come across?
02:29:43.260 | - I don't think it was those Greek or Roman biographies
02:29:45.620 | 'cause they were very far away.
02:29:48.580 | I think that probably one of the most impactful ones
02:29:53.580 | that I can remember recently is around Vanderbilt.
02:29:58.420 | And so Vanderbilt was this guy who basically
02:30:03.660 | without that much of an education,
02:30:05.140 | he would invent or work with people
02:30:08.780 | to make these steamboats.
02:30:10.180 | And then he had a lot of acumen around
02:30:13.020 | creating certain monopolies regardless of
02:30:16.940 | what was right or wasn't right.
02:30:22.100 | And then fascinatingly enough,
02:30:23.180 | it all hinged on like a Supreme Court case
02:30:25.460 | that decided if monopolies were acceptable
02:30:28.340 | in the form of state created monopolies or not.
02:30:31.940 | And if it was deemed that state created monopolies
02:30:36.780 | were acceptable, he would have had a huge problem, this guy.
02:30:39.460 | But it was deemed that state created monopolies
02:30:41.860 | through these licenses for steamboat routes
02:30:46.020 | was not acceptable.
02:30:47.900 | And that did two interesting things.
02:30:50.060 | It unseated some kind of old time landed gentry
02:30:54.540 | in the Americas in like the 1830s and '40s.
02:30:57.940 | And it basically made him right
02:31:00.820 | and he saw it before other people.
02:31:03.180 | So I think Vanderbilt was a very interesting personality
02:31:08.180 | first of all, of all the biographies that I read
02:31:11.500 | is somebody who really took the situation in hand
02:31:15.660 | and kind of took action to achieve an outcome
02:31:21.300 | which I think was an amazing result.
02:31:25.180 | The fascinating thing by the way is,
02:31:27.620 | or amazing way of looking at things.
02:31:29.500 | The fascinating thing by the way is that the ferries now
02:31:32.020 | in New York Harbor are all run as a public good.
02:31:36.980 | So the fascinating thing is that the guy,
02:31:38.540 | he focused on an industry and he worked on something
02:31:41.700 | that was so important that it ended up
02:31:43.860 | becoming a public good.
02:31:45.860 | And I think that that's an interesting conception
02:31:48.740 | of how to look at this industry.
02:31:51.820 | I think there's a lot of economics dynamics
02:31:54.420 | around this industry, but I think I might've said this,
02:31:56.860 | I might've said this somewhere else before,
02:31:58.980 | but really the success of someone in this industry
02:32:02.380 | is whether they're able to make a Linux or HTTP
02:32:06.660 | or an HTTPS like system that lives on for a very long time
02:32:11.660 | and is essentially a kind of public good.
02:32:16.540 | - The success of an idea,
02:32:19.020 | even if that idea is originally sort of a capitalist idea
02:32:22.060 | or one that's grounded in financial benefit,
02:32:25.180 | the success of it is if it becomes a public good.
02:32:28.180 | It is so universal, it is so fundamental
02:32:31.700 | to the quality of life,
02:32:34.380 | that it's deemed to be so valuable
02:32:37.100 | that it should be a public good.
02:32:38.700 | - Yeah, I think so.
02:32:40.620 | I think that's a pretty good definition of success
02:32:43.380 | that you work on a body of work
02:32:45.580 | and that body of work isn't just some commercial enterprise.
02:32:49.300 | It's a body of work that whatever commercial aspects
02:32:53.540 | or economic incentive aspects it might have,
02:32:56.620 | it eventually is so important
02:32:58.660 | that it becomes critical to how society functions.
02:33:02.500 | I'm personally quite lucky and grateful to be,
02:33:06.780 | in my opinion, working on something like that
02:33:09.460 | with an amazing team and an amazing community
02:33:13.100 | that seems to really very much care
02:33:15.340 | about this hybrid smart contract, transparent world
02:33:20.260 | that a lot of people in our industry,
02:33:22.700 | realistically, I think this is why a lot of them signed up.
02:33:25.620 | This is why I came into our industry.
02:33:27.460 | It wasn't because of Bitcoin itself,
02:33:29.860 | it was because Bitcoin was a picture
02:33:34.140 | of how the world could work in so many other ways.
02:33:37.500 | And that picture of how the world could work
02:33:39.220 | in so many other ways attracted me a very long time ago.
02:33:43.460 | And I think that all of this stuff
02:33:49.100 | will eventually become a public good.
02:33:51.100 | I think it'll become so critical
02:33:52.740 | to how societies function internally and internationally
02:33:56.300 | that just like there are systems,
02:33:58.700 | like the Federal Reserve, like global payment systems,
02:34:01.100 | like all these types of things,
02:34:02.580 | I think eventually all of this technology
02:34:05.420 | will be baked into these societally critical systems.
02:34:10.380 | And if I and our community and the people I work with
02:34:13.300 | and the body of work that we're working on
02:34:15.660 | can make some kind of contribution to that shift
02:34:18.940 | towards a fair, economically fair, transparent society,
02:34:25.780 | from my point of view, it's a very worthwhile body of work.
02:34:29.020 | In terms of the show, you also mentioned the show.
02:34:32.220 | One of the shows that I really seem to like more and more
02:34:36.380 | for some reason is Star Trek, not the old Star Trek.
02:34:40.100 | I don't really get the old Star Trek.
02:34:41.380 | The special effects aren't good enough.
02:34:43.340 | (laughing)
02:34:44.180 | Star Trek, like The Next Generation and Voyager
02:34:47.300 | and Deep Space Nine and all those.
02:34:50.260 | I think whenever I happen to watch a Star Trek show again,
02:34:54.900 | I have a very simple conception in my mind
02:34:57.340 | that I really didn't have whenever I saw it way back when.
02:35:00.300 | It's that this is what the world looks like
02:35:03.380 | if technology takes us towards a utopia.
02:35:07.020 | So I think there's this fascinating thing
02:35:08.980 | where technology can take us towards a utopia
02:35:11.860 | or towards a dystopia.
02:35:13.820 | And in my mind, those kind of three Star Trek shows
02:35:18.140 | are a picture of what human civilization looks like
02:35:23.500 | if everybody's technological ambitions
02:35:26.260 | successfully take us towards a utopia.
02:35:28.540 | Because in the Star Trek universe,
02:35:30.940 | you're not seeking money or you're not seeking safety
02:35:34.220 | or you're not seeking,
02:35:36.140 | you're not really seeking anything for yourself.
02:35:38.540 | Everybody within Maslow's hierarchy of needs
02:35:42.060 | has gotten so many things for themselves
02:35:44.060 | that their goal is learning and discovering and/or helping.
02:35:49.060 | And I think there is this conception of human life
02:35:53.300 | once the baser needs are satisfied.
02:35:56.260 | And at the end of the day,
02:35:59.180 | I think that's what technology generally can elevate
02:36:02.740 | all of human civilization to.
02:36:04.540 | It can elevate us to Star Trek world
02:36:07.460 | where if people want to invent,
02:36:10.500 | they can do that all day and nothing else.
02:36:12.140 | If people wanna explore the stars,
02:36:13.980 | they can explore the stars
02:36:15.020 | and they don't have to worry about economic scarcity
02:36:18.780 | or any number of these other conceptions.
02:36:21.220 | So I don't know what the shows most impactful
02:36:23.100 | on me have been,
02:36:24.780 | but for some reason recently it's Star Trek
02:36:27.740 | in the newer variant,
02:36:30.020 | not the newest Star Trek shows.
02:36:32.140 | Those shows are a little strange.
02:36:33.460 | The kind of middle Star Trek universe
02:36:35.980 | where everybody is doing something
02:36:37.980 | with a very important purpose
02:36:40.540 | and nobody's thinking about like,
02:36:42.220 | where's my paycheck or where's my whatever.
02:36:46.060 | They're all kind of like,
02:36:47.180 | we have to discover the formula to this
02:36:50.700 | to save the planet over there.
02:36:52.580 | And literally every episode you're discovering a formula
02:36:54.540 | to save a planet of some kind
02:36:55.940 | or a universe or ecosystem or whatever.
02:36:58.620 | And you're looking at it, you're like,
02:37:00.500 | you know, this is a pretty good place to end up.
02:37:03.700 | This is where we might wanna end up.
02:37:05.420 | - So it gives you hope.
02:37:06.540 | I mean, it's funny that we don't often think about the,
02:37:11.540 | I think it's very useful to think about positive visions
02:37:17.020 | of the future when we're trying to design technology.
02:37:21.100 | There's a lot of sort of in public discourse,
02:37:25.220 | a lot of people are thinking about
02:37:27.180 | kind of how everything goes wrong.
02:37:29.540 | It's important to think about that sometimes,
02:37:31.660 | but in moderation, I think,
02:37:33.180 | 'cause there's not enough,
02:37:35.500 | in my little corner of artificial intelligence world,
02:37:39.140 | people are very kind of fear monger centered.
02:37:42.940 | There's a lot of discussions
02:37:44.380 | about how everything goes wrong.
02:37:46.180 | It's important to do,
02:37:47.540 | but it's also really important to talk about
02:37:50.100 | how things can go right,
02:37:53.460 | because we ultimately want to guide the design
02:37:56.260 | of the systems we create to make things right.
02:37:59.140 | And I think with hope and optimism,
02:38:02.300 | not naiveness, but optimism,
02:38:06.060 | you can actually create the better world.
02:38:08.500 | Like you have to think about a positive,
02:38:10.880 | a better world as you create,
02:38:14.320 | because then you can actually create it.
02:38:16.320 | Yeah, I'm one of the people that thinks that
02:38:20.300 | having an optimistic view of the world
02:38:24.780 | is better for design and creativity
02:38:26.860 | than having a pessimistic one.
02:38:28.860 | It's hard to design when you're in fear.
02:38:30.940 | Do you have advice for young people,
02:38:37.060 | speaking of being excited about
02:38:38.980 | and hopeful about the future world,
02:38:40.900 | do you have advice for young people today
02:38:43.860 | in a computer science world,
02:38:46.780 | in software engineering world, in crypto world,
02:38:48.920 | but maybe in any world whatsoever,
02:38:51.260 | for life, how to pick a career
02:38:55.240 | or how to live life in general?
02:38:57.040 | - I think the thing that young people should do
02:39:02.280 | is not any one specific thing
02:39:04.140 | for any one specific young person.
02:39:07.420 | I think what they should do
02:39:08.860 | is what they won't be able to do
02:39:11.920 | in the later stages of their life.
02:39:13.680 | And the way, in my opinion,
02:39:16.200 | from a framework point of view to think about that,
02:39:19.560 | is that the amount of obligations
02:39:23.360 | and the amount of time that a person has
02:39:27.720 | seems to just diminish over time, right?
02:39:30.560 | So the amount of free time they have, right?
02:39:32.520 | So you start your job,
02:39:33.640 | you get a bunch of responsibilities,
02:39:35.360 | something with your partner, spouse, more responsibilities,
02:39:38.600 | kids, probably even more responsibilities,
02:39:40.800 | and soon enough, the time that you have
02:39:43.840 | to educate yourself, to travel,
02:39:46.960 | to experience the world however,
02:39:50.160 | create whatever creative endeavor you're interested in,
02:39:53.360 | slowly but surely disappears.
02:39:55.880 | I think this is something
02:39:57.880 | that young people don't fully realize.
02:40:01.040 | They assume that the world as it is now
02:40:04.320 | and the amount of free time that they have
02:40:07.480 | to travel, to educate themselves, to make new friends,
02:40:11.360 | to do all these things
02:40:12.640 | will somehow maybe diminish by 10%.
02:40:16.760 | It won't diminish by 10%.
02:40:18.640 | It'll diminish by 90%.
02:40:20.800 | And the 10% that you have,
02:40:22.560 | you'll be resting to get back to work and get things done.
02:40:26.000 | So what I think young people should do,
02:40:30.240 | and this is why it's very different for each of them,
02:40:32.680 | I can't tell young people,
02:40:33.720 | "Hey, you should study philosophy, travel,
02:40:36.480 | and start your own enterprise
02:40:38.240 | to achieve something worthwhile in the world."
02:40:39.800 | That might be something that's good for me
02:40:41.680 | with my values and my kind of worldview,
02:40:45.480 | but for other people might be something else.
02:40:47.880 | I think the way that they should conceptualize it
02:40:51.200 | is imagine if over the next 10, 12 years,
02:40:56.200 | the amount of choice that you had about what you could do
02:41:02.240 | was cut down by 90%.
02:41:05.360 | What would you, and this is copying
02:41:07.320 | from this kind of Jeff Bezos regret minimization framework.
02:41:10.760 | In that framework, it's like,
02:41:12.760 | what would I regret not doing at 80?
02:41:16.080 | And that's kind of meant to create this long-term view
02:41:19.440 | and make these decisions now
02:41:21.440 | that'll get you to a long-term future
02:41:23.880 | that you can look back on and be proud of your life.
02:41:27.000 | What I think young people should do
02:41:28.840 | is they should say to themselves,
02:41:30.560 | "Look, if I never get the chance to travel
02:41:34.000 | for as long as I live,
02:41:36.040 | assuming that after 25, after 27, after 29, that's the case,
02:41:41.040 | how will I feel about that?
02:41:43.440 | If I never get to start a company after 25,
02:41:46.960 | after I get married, after I have kids,
02:41:49.120 | how will I feel about that?"
02:41:51.000 | And whatever they feel the worst about
02:41:53.320 | is what they should do.
02:41:54.880 | Whatever they feel like when they say to themselves,
02:41:57.080 | "You know, if I don't travel now, I will never travel."
02:42:01.400 | And they feel horrible about that.
02:42:02.920 | They just have an overwhelming fear
02:42:06.000 | and disgust at themselves in that type of state
02:42:10.800 | at 25, 27, 29, that's what they should do.
02:42:14.640 | And they shouldn't listen to anybody else.
02:42:16.760 | Let me put it to you this way.
02:42:20.960 | If you're really smart, you're gonna make it anyway.
02:42:24.600 | There's a lot of people putting a lot of pressure on you
02:42:26.400 | because they're afraid whether you're gonna make it.
02:42:28.600 | If you're really smart, you're gonna make it anyway.
02:42:31.320 | If you're not really smart, you're screwed anyway.
02:42:33.760 | (laughing)
02:42:35.440 | - Either way, just relax with it and use your time well
02:42:39.040 | to do the things you would most regret not doing.
02:42:43.120 | That's really fascinating.
02:42:44.760 | - I wouldn't say relax.
02:42:46.160 | I would say very much cherish the free time,
02:42:50.400 | the discretionary time that you have
02:42:52.680 | from the age of 18 to maybe 25.
02:42:56.400 | Because at 25, everyone's gonna start looking at each other
02:42:59.800 | and asking, "What have I achieved?
02:43:02.280 | My friends have achieved, I haven't achieved."
02:43:04.560 | And then by the time you get to 30,
02:43:06.240 | you're gonna look at each other again and go,
02:43:07.960 | "Well, my friends have a family or a company or a PhD
02:43:12.960 | or a whatever, what do I have?"
02:43:14.520 | And the pressure will just increase.
02:43:16.080 | And it'll increase so much that even if you want to go
02:43:20.080 | and do the fun thing, it will not be fun
02:43:22.800 | because the pressure of comparing yourself
02:43:26.120 | to your friends at 25 or your peers at 30
02:43:29.280 | will be so great that it will no longer be normal
02:43:33.600 | for you to be in a hostel at 30,
02:43:36.480 | kind of like living it up, right?
02:43:38.960 | And this is why I also can't tell you
02:43:42.440 | specifically what it is.
02:43:43.400 | For me, it was getting an education in philosophy
02:43:46.040 | that was rigorous and in-depth.
02:43:47.800 | It was traveling and it was starting an enterprise
02:43:50.320 | that I thought that was worthwhile, that I directed,
02:43:52.680 | that I could make into something great.
02:43:54.360 | That's what it was for me.
02:43:55.640 | For other people, it might be something with a band,
02:43:58.080 | it might be something with painting,
02:43:59.920 | it might be an education.
02:44:02.360 | You, by the way, also should not assume
02:44:04.840 | that your ability to get an education will improve.
02:44:09.840 | All of those responsibilities will take away
02:44:13.240 | your ability to get an education.
02:44:15.520 | So if you value having an education,
02:44:17.560 | if you value being a deeply educated, well-rounded person
02:44:22.040 | with a wide array of knowledge on a wide array of topics,
02:44:26.320 | capitalism will force you to specialize.
02:44:28.960 | That's what it's good at.
02:44:30.000 | It's going to take you, it's going to fashion you
02:44:32.000 | into a very specific tool for most people,
02:44:35.400 | into a very specific set of tasks.
02:44:37.840 | If you want to have an education in something, get it now.
02:44:41.560 | If you want to travel somewhere, travel there now.
02:44:44.120 | If you want to do some kind of creative endeavor
02:44:46.400 | that you doubt whether you'll have time for in the future,
02:44:49.560 | do it now.
02:44:50.480 | You won't have time for it in the future.
02:44:52.320 | You won't have time to read philosophy books all day,
02:44:55.320 | unfortunately.
02:44:56.400 | You won't have time to fly to Italy
02:45:00.240 | and kind of hang out with people.
02:45:02.720 | If you're serious about your life,
02:45:04.720 | you're going to get more responsibilities.
02:45:06.120 | You're going to get more stuff to do.
02:45:08.000 | And so my advice to you is do not piss away
02:45:11.520 | this rare, unique, discretionary time.
02:45:15.120 | And if your friends are, get new friends.
02:45:18.720 | Get smarter friends.
02:45:20.240 | Get people who are using the limited time they have better.
02:45:24.840 | That's my advice.
02:45:25.920 | - So just a quick comment.
02:45:27.960 | It's brilliant.
02:45:29.240 | You know, to reframe high school
02:45:31.480 | and undergraduate college education,
02:45:34.120 | sometimes people want to quickly get it over with.
02:45:37.400 | But one thing I remember thinking,
02:45:42.400 | and it's very true about high school,
02:45:45.480 | is one of the only times in your life
02:45:48.120 | you'll get a chance to truly get a broad education.
02:45:51.280 | You don't often think of it that way,
02:45:53.820 | but it's a chance to really enjoy learning things
02:45:57.160 | that are outside of the specialty
02:45:58.720 | that you'll eventually end up with.
02:46:00.480 | And that's how college education is.
02:46:02.280 | And on a more fun side, I played music.
02:46:05.640 | I did martial arts,
02:46:07.960 | and, as we mentioned offline, played video games.
02:46:10.720 | I find it fascinating and brilliant what you said,
02:46:14.120 | which is the world will not give you a chance
02:46:17.440 | to truly enjoy many of these things
02:46:19.720 | and truly get value from many of those things
02:46:23.600 | once you get older.
02:46:25.540 | I find it exceptionally difficult to enjoy video games now.
02:46:29.420 | - There's so much stuff to do.
02:46:30.260 | There's so much responsibility.
02:46:31.720 | - And at the time when I played Elder Scrolls
02:46:37.260 | and Baldur's Gate and Diablo II,
02:46:39.900 | and at the time I thought maybe that was a waste of time.
02:46:44.440 | But now looking back, I realize,
02:46:49.100 | 'cause I always thought, you know,
02:46:50.500 | let me get the career first,
02:46:52.140 | and then I'll have a chance to play video games.
02:46:54.100 | That's the way I was thinking.
02:46:55.620 | You know, it was a waste of time
02:46:57.100 | because I should really progress on the career,
02:47:01.120 | and then I'll have time to play video games.
02:47:03.160 | No, the reality is that was really fulfilling.
02:47:06.160 | Those are some of the happiest travel experiences
02:47:10.560 | of my life is me traveling to those virtual worlds
02:47:14.080 | and spending time in them.
02:47:15.680 | And it was really fulfilling,
02:47:16.860 | and they stayed with me for the rest of my life.
02:47:19.140 | And I get to experience echoes of that
02:47:21.840 | when I play video games these days
02:47:23.400 | for an hour here, an hour there,
02:47:25.240 | like one hour a month or something like that.
02:47:27.820 | But even those experiences, as silly as they are,
02:47:30.920 | they seem like a waste of time at the time,
02:47:33.420 | enjoying them fully, unapologetically.
02:47:37.020 | And in a framework exactly as you said,
02:47:41.120 | would I regret being the kind of person
02:47:43.780 | who's never played those video games?
02:47:46.520 | And I can, for myself, honestly say that yes.
02:47:50.120 | Look, when I'm on my deathbed, I'm glad I-
02:47:54.680 | - Play Baldur's Gate.
02:47:55.520 | - Yeah, I played Baldur's Gate 2
02:47:57.400 | and all those Arena, Daggerfall, Morrowind,
02:48:00.960 | and all the Elder Scroll games.
02:48:03.620 | And yeah, the things that don't necessarily fit
02:48:08.620 | into this kind of storyline
02:48:12.300 | of what a career is supposed to be,
02:48:14.480 | travel and all those experiences that you mentioned.
02:48:16.440 | - I think I'd just like to say one final quick thing on this.
02:48:21.000 | I think this extends to really hard things as well.
02:48:24.000 | It extends to the things you wanna do,
02:48:25.800 | but one of the best pieces of advice one of my mentors
02:48:28.760 | gave me early on in my career around this time
02:48:32.720 | is that it will actually become harder
02:48:35.380 | to start a company as you get older.
02:48:38.760 | Once again, because you have more responsibilities.
02:48:41.360 | You're responsible to your partner for some kind of income
02:48:44.280 | to create a life together.
02:48:45.720 | Once you have kids, you're responsible
02:48:47.160 | for an even greater income to create a life for kids.
02:48:50.320 | And startups do not generate income, right?
02:48:53.300 | They take many, many years before anything happens.
02:48:56.040 | People are getting evicted.
02:48:57.400 | People are eating ramen noodles.
02:48:59.160 | That is a thing that happens, that will happen.
02:49:02.160 | So I'm not saying that you should do the fun things
02:49:05.920 | or the enjoyable things.
02:49:07.360 | I'm saying the things that you would regret not doing,
02:49:12.200 | that you can uniquely do in the time span from 18 to 25.
02:49:17.200 | Which one of which is, if you plan to have a family
02:49:22.360 | and start a family when you're 25,
02:49:24.680 | you should start a company now.
02:49:26.840 | You should not wait until a bunch of people depend on you
02:49:29.460 | for income to eat, to start a company.
02:49:32.120 | The amount of pressure that will be on you at that point
02:49:34.720 | will be monumental.
02:49:36.480 | You should start a company when nobody depends on you
02:49:39.200 | and you can sleep on the floor, eating ramen noodles
02:49:42.680 | and still have a great time and show up
02:49:45.060 | with a lot of enthusiasm and be excited.
02:49:47.720 | So I just mean whatever you want to really devote yourself
02:49:52.720 | to and really do, don't put it off.
02:49:56.200 | Don't go to consulting or banking or any other industry
02:49:59.080 | and say, I'm gonna do this for three years
02:50:00.600 | and I'll get experience.
02:50:02.080 | The only way you get experience is by doing something.
02:50:04.680 | You go, you do it, you fail, you do it again and again
02:50:07.440 | and again and again and again.
02:50:08.680 | And then you have experience and then you can do it right.
02:50:10.760 | That's the only way experience happens.
02:50:12.400 | There is no other way short of mentorship.
02:50:14.760 | If you're lucky to get mentorship,
02:50:16.280 | 99% of people don't get mentorship.
02:50:18.800 | - And even though we're talking about young people,
02:50:20.680 | I feel like you're speaking to me.
02:50:21.820 | As somebody who spent the last two weeks
02:50:24.760 | sleeping on the floor 'cause there's no mattress
02:50:27.040 | and somebody who is single and somebody who's thinking
02:50:29.360 | about doing a startup, I felt like you were speaking to me
02:50:32.800 | as a fellow young person.
02:50:35.680 | Let me ask you about this whole life of ours
02:50:40.680 | to zoom out on the big philosophical question,
02:50:43.160 | the ridiculous question.
02:50:44.520 | What do you think is the meaning of it all?
02:50:46.680 | Do you think about this kind of stuff
02:50:49.320 | as you're creating all the technology,
02:50:51.920 | as you're thinking about this future?
02:50:54.520 | You ever zoom out and think like, why?
02:50:57.160 | Why are you, Sergey, striving?
02:50:59.860 | Why are we, the human species, striving for the stars?
02:51:05.040 | - So I think it comes down to whether people
02:51:08.880 | wanna live in society.
02:51:10.260 | So if people decide to be part of society,
02:51:14.980 | they have a certain set of conditions
02:51:19.980 | that they decide to take part in.
02:51:21.920 | So I think what this comes down to is a lot of
02:51:27.600 | really involved conversations,
02:51:30.320 | but if we assume people have free will and choice,
02:51:33.040 | we just kind of make that blanket assumption,
02:51:35.280 | then the question starts to become,
02:51:37.600 | well, what choices do we make
02:51:39.200 | and how do we live with those choices?
02:51:41.900 | And I think probably the most fundamental choice
02:51:44.640 | is whether we exist in a society
02:51:47.560 | or we choose to leave society.
02:51:49.840 | And there are people that do this.
02:51:50.860 | There are people that go live in the woods.
02:51:52.000 | There are people that immigrate to other societies
02:51:55.080 | and they make a choice, right?
02:51:56.960 | And as they enter those other societies
02:51:58.840 | or they choose to leave society and go live in the woods,
02:52:01.840 | they adopt a certain set of values, right?
02:52:05.960 | They adopt values that the society prescribes,
02:52:08.400 | they compromise their own values,
02:52:09.680 | they define their own values,
02:52:11.000 | and they create a set of values for themselves, right?
02:52:13.760 | I think at the end of the day,
02:52:18.080 | if you're going to choose to live in society,
02:52:20.980 | in addition to all the minimums of, you know,
02:52:24.680 | not throwing garbage on the floor and, you know,
02:52:27.040 | doing nice things for people that need help
02:52:29.640 | and doing any number of things to just be
02:52:31.960 | a normal human being within society,
02:52:34.680 | you have to ask yourself,
02:52:37.160 | what am I doing as part of society, right?
02:52:39.920 | You can always say, hey, I'm going to leave society,
02:52:42.680 | I'm going to live in the woods.
02:52:43.720 | I did that, right?
02:52:44.760 | I went and I lived in the woods and I gave it a shot,
02:52:47.000 | realized a ton of stuff, huge amount of clarity from that.
02:52:49.800 | But when you decide to live in society,
02:52:52.800 | you take on, first of all, certain minimal agreements,
02:52:57.560 | you mold your values a little bit to that society,
02:53:01.120 | that's another choice that people inherently make.
02:53:04.040 | And then there's a question of,
02:53:07.200 | well, what am I doing here, right?
02:53:09.020 | What am I doing in society, right?
02:53:11.080 | So when people say the meaning of life,
02:53:13.060 | I don't know what the meaning of life is.
02:53:13.900 | - What's the meaning of life in society?
02:53:16.580 | - Right, what's the meaning of life for the choice
02:53:18.480 | that you've made within society, right?
02:53:20.580 | 'Cause that's maybe the first fundamental choice you made.
02:53:23.380 | You made a choice and you continue to make a choice
02:53:25.560 | to be part of society and a specific society, right?
02:53:30.560 | So you've made this choice, you're part of a society,
02:53:34.160 | and now you kind of have a life
02:53:39.160 | and you have people around you.
02:53:40.760 | And then the question is, in my opinion,
02:53:44.560 | the question is what is the body of work
02:53:46.860 | that you want to make, right?
02:53:50.160 | I think personally that life is kind of so short
02:53:55.120 | and the ability to get enough resources for yourself
02:53:57.800 | in at least the developed markets where we're lucky to be
02:54:01.440 | is so relatively abundant that we, you and me,
02:54:06.440 | have the luxury, by your pursuit of a PhD,
02:54:09.880 | you've had this luxury, I've had this luxury
02:54:12.880 | through the work that I've been doing
02:54:14.960 | to pursue something that makes society better.
02:54:18.540 | So this is kind of the question, I would say,
02:54:25.040 | the question is, am I gonna live in society, yes or no?
02:54:28.280 | Yes, okay, most people choose yes.
02:54:30.200 | I understand why to a degree,
02:54:31.600 | I understand why some people choose no.
02:54:33.480 | And then what is the, and I'm gonna be in society,
02:54:36.520 | I'm gonna, if you choose to be in society,
02:54:38.320 | you're just choosing to abide by the rules,
02:54:40.240 | you're choosing to just do the minimum, right?
02:54:42.640 | That's what being part of society means.
02:54:44.380 | People that choose to be part of society
02:54:45.840 | but don't wanna do this, it's very confusing,
02:54:48.000 | they should just leave, they should just go,
02:54:49.520 | look, I don't like this deal, I'm gonna go somewhere else.
02:54:52.600 | I'm gonna live in Tibet, I'm gonna live in the woods,
02:54:55.080 | I'm gonna live wherever where the rules are to my liking.
02:54:57.920 | Right?
02:54:58.740 | You've chosen to be in society.
02:55:02.240 | Next question, kind of final question is,
02:55:04.280 | what is the body of work that I'm gonna be involved in?
02:55:07.280 | Because in looking at that Jeff Bezos
02:55:09.280 | kind of regret minimization framework thing,
02:55:12.080 | I think that's what a lot of it really comes down to,
02:55:14.280 | is you kind of, the framework is at 80 years old,
02:55:16.960 | you look back over your life,
02:55:17.960 | what would you regret not doing?
02:55:19.520 | What would you regret not pursuing?
02:55:22.640 | I think there are a number of things
02:55:24.880 | on a personal level each person has,
02:55:26.800 | but I think, at least for me,
02:55:29.460 | and probably for many of the other people I know,
02:55:32.400 | there's a question of, what is the body of work
02:55:35.480 | that I was involved in?
02:55:36.960 | What did I do?
02:55:38.400 | What happened?
02:55:39.640 | Right, what was I involved in?
02:55:41.140 | And in my opinion, you should have a good answer to that.
02:55:47.400 | - And you mentioned the body of work in relation to
02:55:51.720 | whether it helped make a better world.
02:55:54.760 | And the fundamental question there is,
02:55:58.280 | what does better mean?
02:56:00.120 | So it's our striving to understand what is better.
02:56:03.920 | What kind of world would we love to exist after we're gone?
02:56:08.920 | And I think that's another thing,
02:56:12.040 | almost unanswerable question,
02:56:13.600 | but it's one we can strive towards,
02:56:16.440 | is what is a better world?
02:56:20.000 | - Right, I think that's, once again,
02:56:22.560 | that's a very personal question.
02:56:23.840 | I'm not sure if there's an objective moral truth
02:56:26.240 | that's gonna suddenly give us all an answer.
02:56:28.480 | I think it's actually quite fascinating to me
02:56:30.540 | when people feel they have this objective moral truth,
02:56:32.600 | they're so sure in their opinions,
02:56:34.600 | this is what we do, we should go hurt them,
02:56:37.120 | or help them, or kill them, or rescue them, or whatever.
02:56:41.080 | There's all these kind of very situational specific,
02:56:44.320 | kind of like, this is the right thing to do,
02:56:46.560 | the objective moral truth told me that this is it.
02:56:49.960 | - But maybe there's a definitive truth
02:56:51.960 | that we can arrive towards, sort of a consensus
02:56:56.360 | of what that is within the little local pocket
02:56:59.680 | of society that you're in.
02:57:01.400 | - Yeah, that's the point.
02:57:02.240 | - That's what happens, people just then mislabel it
02:57:04.360 | and they go like, objective moral truth.
02:57:06.520 | This is not our idea.
02:57:09.280 | This is coming up from on high here.
02:57:10.760 | This is the objective moral truth
02:57:14.000 | that I think exists in some metaphysical form somewhere.
02:57:18.640 | - And then you build a building with marble
02:57:20.520 | and it's big and usually what happens?
02:57:22.880 | And then you convince yourself
02:57:23.940 | that that building represents objective truth.
02:57:24.780 | - I think those people actually,
02:57:26.440 | the people who build those buildings
02:57:27.920 | probably understand that there is no metaphysical objective.
02:57:30.280 | They're just like, we're all just coming to consensus.
02:57:32.640 | I'm gonna build the biggest building
02:57:33.680 | and you're like me and that's what we're gonna do, right?
02:57:35.840 | They just look at it that way probably.
02:57:37.960 | I think what ends up happening with all these values
02:57:43.080 | is yeah, people should determine that for themselves.
02:57:45.080 | I agree that there's a second order question here
02:57:47.320 | of what is the best body of work to work on.
02:57:50.300 | Personally, I think that's probably a mix
02:57:53.580 | of what could you realistically achieve?
02:57:56.080 | Is that gonna have an impact on society
02:57:59.120 | that you feel good about?
02:58:00.200 | - Yeah. - Right?
02:58:01.560 | So these are probably the two aspects of this question
02:58:04.800 | and is this gonna have a good impact on society
02:58:07.200 | that you feel good about?
02:58:08.200 | Obviously very subjective, right?
02:58:09.820 | Some people save animals, some people save forests.
02:58:13.240 | We're creating this system
02:58:16.280 | of economic fairness and transparency.
02:58:19.160 | I feel that I'm in a good position to enable that.
02:58:22.960 | I feel that I have a good chance of succeeding at that.
02:58:25.740 | And I think that the impact will be quite meaningful
02:58:29.160 | for a large number of people.
02:58:30.760 | And so I'm completely happy to look back once I'm 80
02:58:34.840 | and see a body of work that achieved that
02:58:37.400 | and be very proud of that, right?
02:58:38.600 | Because I think that's what I'll be doing
02:58:40.520 | when I'm looking back.
02:58:42.180 | - Well, I agree with you.
02:58:43.080 | The scale of impact as a hybrid smart contracts,
02:58:48.080 | this whole idea that you're working on
02:58:50.800 | has a potential to transform the world for the better
02:58:55.400 | at a scale that I can't even imagine.
02:59:00.400 | So speaking of which, means even more
02:59:03.760 | that you would waste so many hours
02:59:05.920 | of that exciting life with me.
02:59:07.560 | Thank you so much for talking today, Sergey.
02:59:09.200 | This is a really fascinating conversation,
02:59:11.360 | a really fascinating space,
02:59:13.100 | and I can't wait to learn more.
02:59:14.720 | So thank you so much for talking today.
02:59:17.080 | - Thank you for having me.
02:59:17.920 | It's been an absolute pleasure.
02:59:20.020 | - Thanks for listening to this conversation
02:59:21.580 | with Sergey Nazarov.
02:59:22.920 | And thank you to Wine Access, Athletic Greens,
02:59:26.340 | Magic Spoon, Indeed, and BetterHelp.
02:59:29.820 | Check them out in the description to support this podcast.
02:59:33.540 | And now let me leave you with some words from Copernicus.
02:59:37.100 | To know that we know what we know
02:59:39.580 | and to know that we do not know what we do not know,
02:59:43.180 | that is true knowledge.
02:59:45.680 | Thank you for listening and hope to see you next time.
02:59:48.380 | (upbeat music)