Po-Shen Loh: Mathematics, Math Olympiad, Combinatorics & Contact Tracing | Lex Fridman Podcast #183


Chapters

0:00 Introduction
1:43 Planes and bridges
5:21 Writing a computer game from scratch
7:46 Programming competitions
11:21 Math is hard
16:52 Contact tracing that preserves privacy
54:09 Math Olympiad
69:49 Hard math problem
77:06 Is math discovered or invented?
82:02 Intelligence
88:52 Math education
93:03 How to learn math
101:58 Combinatorics
105:05 Voting trees
115:29 Stochastic coalescence
125:15 P=NP
129:32 Tolkien and WWII
131:52 Advice for young people
133:57 Meaning of life


00:00:00.000 | The following is a conversation with Po-Shen Loh,
00:00:02.720 | a professor of mathematics at Carnegie Mellon University,
00:00:06.300 | national coach of the USA International Math Olympiad team,
00:00:09.960 | and founder of Expii,
00:00:11.920 | which does online education of basic math and science.
00:00:15.260 | He's also the founder of NOVID,
00:00:17.080 | an app that takes a really interesting approach
00:00:19.560 | to contact tracing,
00:00:20.800 | making sure you stay completely anonymous,
00:00:23.120 | and it gives you statistical information about COVID cases
00:00:25.880 | in your physical network of interactions.
00:00:28.920 | So you can maintain privacy, very important,
00:00:31.680 | and make informed decisions.
00:00:34.240 | In my opinion, we desperately needed solutions like this
00:00:37.000 | in early 2020.
00:00:38.800 | And unfortunately, I think,
00:00:41.480 | we will again need it for the next pandemic.
00:00:44.760 | To me, solutions that require large-scale,
00:00:46.960 | distributed coordination of human beings
00:00:49.680 | need ideas that emphasize freedom and knowledge.
00:00:53.480 | Quick mention of our sponsors,
00:00:55.160 | Jordan Harbinger Show, Onnit, BetterHelp,
00:00:58.760 | Eight Sleep, and Element.
00:01:01.240 | Check them out in the description to support this podcast.
00:01:04.360 | As a side note, let me say that Po and I
00:01:07.160 | filmed a few short videos
00:01:08.440 | about simple, beautiful math concepts
00:01:10.660 | that I will release soon.
00:01:12.980 | It was really fun.
00:01:14.120 | I really enjoyed Po sharing his passion for math with me
00:01:16.680 | in those videos.
00:01:17.880 | I'm hoping to do a few more short videos
00:01:20.000 | in the coming months that are educational in nature,
00:01:23.180 | on AI, robotics, math, science, philosophy,
00:01:26.560 | or if all else fails,
00:01:28.880 | just fun snippets into my life on music,
00:01:31.280 | books, martial arts, and other random things,
00:01:34.760 | if that's of interest to anyone at all.
00:01:38.260 | This is the Lex Fridman Podcast,
00:01:40.240 | and here's my conversation with Po-Shen Loh.
00:01:42.980 | You know, you mentioned you really enjoy flying
00:01:46.540 | and experiencing different people in different places.
00:01:49.760 | There's something about flying for me,
00:01:51.200 | I don't know if you have the same experience,
00:01:53.120 | that every time I get on an airplane,
00:01:55.720 | it's incredible to me that human beings
00:01:58.080 | have actually been able to achieve this.
00:02:00.280 | And when I look at what's happening now
00:02:03.760 | with humans traveling out into space,
00:02:06.440 | I see it as all the same thing.
00:02:08.120 | It's incredible that humans are able to get into a box
00:02:11.280 | and fly in the air and land safely.
00:02:16.280 | And at the same time, it seems like
00:02:18.080 | everybody's taking it for granted.
00:02:19.960 | So when I observe them, it's quite fascinating
00:02:22.520 | because I see that cleanly mapping to the world
00:02:25.920 | where we're now in rockets and traveling to the moon,
00:02:30.920 | traveling to Mars, and at the same kind of way,
00:02:34.360 | I can already see the future
00:02:36.480 | where we will all take it for granted.
00:02:38.700 | So I don't know if you have, you personally,
00:02:43.600 | when you fly, have the same kind of magical experience
00:02:46.440 | of how the heck did humans actually accomplish this?
00:02:49.100 | - So I do, especially when there's turbulence.
00:02:52.600 | Which is, you know, like on the way here,
00:02:55.240 | there was turbulence and the plane jiggled,
00:02:58.120 | even the flight attendant had to hold on to the side.
00:03:00.280 | And I was just thinking to myself,
00:03:01.400 | it's amazing that this happens all the time
00:03:03.280 | and the wings don't fall off,
00:03:04.480 | you know, given how many planes are flying.
00:03:06.680 | But then I often think about it and I'm like,
00:03:08.560 | you know, a long time ago,
00:03:09.880 | I think people didn't trust elevators
00:03:12.200 | in a 40-story building in New York City.
00:03:14.440 | And now we just take it completely for granted
00:03:17.080 | that you can step into this shaft,
00:03:18.640 | which is 40 floors up and down.
00:03:21.560 | And it will just not fail.
00:03:23.220 | - Yeah, again, I'm the same way with elevators,
00:03:26.800 | but also buildings.
00:03:28.160 | When I stand on the 40th floor, I wonder,
00:03:31.640 | how the heck are we not falling right now?
00:03:35.360 | Like how amazing it is with the high winds,
00:03:39.160 | like structurally just the earthquakes and the vibrations,
00:03:42.260 | I mean, natural vibrations in the ground.
00:03:44.560 | Like how is this? How are all of these,
00:03:47.000 | you go to like New York City,
00:03:48.080 | how are all of these buildings standing?
00:03:49.960 | I mean, to me, one of the most beautiful things,
00:03:52.160 | actually mathematically too, is bridges.
00:03:54.800 | I used to build bridges in high school from like toothpicks,
00:03:57.600 | just like out of the pure joy of like physics
00:04:02.400 | making some structure really strong.
00:04:05.840 | Understanding like from a civil engineering perspective,
00:04:09.640 | what kind of structure will be stronger
00:04:12.080 | than another kind of structure, like suspension bridges.
00:04:14.880 | And then you see that at scale,
00:04:17.200 | humans being able to span a body of water
00:04:20.440 | with a giant bridge.
00:04:22.280 | And it's, I don't know, it's so humbling.
00:04:26.520 | It makes you realize how dependent we are on each other.
00:04:31.520 | Sort of, I talk about level up,
00:04:33.400 | but there's a certain element in which we little ants
00:04:37.980 | have just a small amount of knowledge
00:04:39.660 | about our particular thing.
00:04:41.920 | And then we're depending on a network of knowledge
00:04:45.400 | that other experts hold.
00:04:47.440 | And then most of our lives,
00:04:49.280 | most of the quality of life we have
00:04:51.160 | has to do with the richness of that network of knowledge,
00:04:56.160 | of that collaboration,
00:04:58.360 | and then sort of the ability to build on top of it.
00:05:02.040 | Levels of abstractions.
00:05:03.280 | You start from like bits in a computer,
00:05:05.800 | then you can have assembly,
00:05:07.520 | then you can have C++,
00:05:08.840 | or you have an operating system,
00:05:10.000 | then you can have C++ and Python,
00:05:11.760 | finally some machine learning on top.
00:05:13.480 | All of these are abstractions.
00:05:15.000 | And eventually we'll have AI that runs all of us humans.
00:05:17.160 | But anyway, but speaking of abstractions and programming,
00:05:21.320 | in high school you wrote some impressive games for MS-DOS.
00:05:26.120 | I got a chance to, in browser somehow, it's magic.
00:05:28.840 | I got a chance to play them.
00:05:30.540 | Alien Attack 1, 2, 3, and 4.
00:05:34.640 | What's the hardest part about programming those games?
00:05:37.280 | And maybe can you tell the story
00:05:38.440 | about building those games?
00:05:41.120 | - Sure.
00:05:41.960 | I actually tried to do those in high school
00:05:44.280 | because I was just curious if I could.
00:05:46.280 | Yeah.
00:05:48.080 | - That's a good starting point for anything, right?
00:05:49.280 | - Yeah, yeah, yeah.
00:05:50.120 | It's like, could you?
00:05:50.960 | But the appealing thing was also,
00:05:52.040 | it was a soup to nuts kind of thing.
00:05:54.200 | So something that has always attracted me
00:05:55.880 | is I like beautiful ideas.
00:05:58.460 | I like seeing beautiful ideas,
00:05:59.960 | but I actually also like seeing execution of an idea
00:06:03.800 | all the way from beginning to end in something that works.
00:06:06.400 | So for example, in high school,
00:06:08.040 | I was lucky enough to grow up in the late '90s
00:06:10.920 | when even a high school student
00:06:13.320 | could hope to make something sort of comparable
00:06:16.240 | to the shareware games that were out there.
00:06:18.720 | I say the word sort of, like still quite far away,
00:06:21.280 | but at least I didn't need to hire a 3D CG artist.
00:06:25.040 | There weren't enough pixels to draw anyway,
00:06:27.240 | even I can draw, right?
00:06:28.400 | Bad art, of course.
00:06:30.320 | But the point is I wanted to know,
00:06:31.680 | is it possible for me to try to do those things
00:06:34.660 | where back in those days,
00:06:36.580 | you didn't even have an easy way
00:06:38.440 | to draw letters on the screen in a particular font.
00:06:41.200 | You couldn't just say import a font.
00:06:42.880 | It wasn't like Python.
00:06:44.080 | So for example, back then,
00:06:45.440 | if you play those games in the web browser,
00:06:47.560 | which is emulating the old school computer,
00:06:51.800 | those, even the letters you see,
00:06:53.520 | those are made by individual calls
00:06:55.120 | to draw pixels on the screen.
00:06:56.800 | - So you built that from scratch,
00:06:58.320 | almost building a computer graphics library from scratch?
00:07:01.000 | - Yes.
00:07:01.840 | The primitive that I got to use
00:07:03.120 | was some code I copied off of a book in assembly
00:07:05.600 | of how to put a pixel on a screen in a particular color.
00:07:08.680 | - And the programming language was Pascal?
00:07:11.760 | - Ah, yeah.
00:07:12.720 | The first one was in Pascal,
00:07:14.440 | but the other ones were in C++ after that.
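
For flavor, here's a minimal sketch of what such a pixel primitive looked like in the DOS era, assuming the classic VGA mode 13h (320x200, 256 colors) and an old Borland-style compiler. This is an illustrative reconstruction of the well-known technique, not Po's actual code.

```cpp
// Illustrative DOS-era pixel primitive (VGA mode 13h: 320x200, 256 colors).
// A reconstruction of the classic technique, not Po's actual code.
#include <dos.h>  // union REGS, int86, MK_FP in Borland/Turbo C++ for DOS

void set_mode_13h() {
    union REGS r;
    r.x.ax = 0x0013;      // AH=0 (set video mode), AL=0x13 (mode 13h)
    int86(0x10, &r, &r);  // call the video BIOS interrupt
}

void put_pixel(int x, int y, unsigned char color) {
    // Video memory is a flat 320*200 byte array at segment 0xA000,
    // one byte per pixel, so the offset is simply y*320 + x.
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    vga[y * 320 + x] = color;
}
```

Everything else, fonts included, has to be built on top of a primitive like this, which is why even the letters in those games were drawn with individual pixel calls.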
00:07:18.200 | - How's the emulation in the browser work, by the way?
00:07:20.720 | Is that trivial?
00:07:21.680 | 'Cause it's pretty cool.
00:07:22.560 | You get to play these games
00:07:23.560 | that have a very much 90s feeling to them.
00:07:26.520 | - Ah, so it's literally making an MS-DOS environment,
00:07:29.360 | which is literally running the old .exe file.
00:07:32.280 | Wow, I didn't have to do that.
00:07:33.120 | - In the browser.
00:07:33.960 | That could be more amazing than the airplane.
00:07:37.840 | So it wasn't so much about the video games.
00:07:40.280 | It was more about,
00:07:41.400 | can you build something really cool from scratch?
00:07:44.120 | - Yes.
00:07:45.440 | - And you did a bunch of programming competitions.
00:07:50.280 | What was your interest, your love for programming?
00:07:54.080 | What did you learn through that experience?
00:07:56.880 | Especially now that much of your work
00:07:59.680 | has taken a long journey through mathematics.
00:08:02.240 | - I think I always was amazed
00:08:04.920 | by how computers could do things fast.
00:08:08.440 | If I wanted to make it an abstract analysis
00:08:11.560 | of why it is that I saw some power in the computer.
00:08:14.600 | Because if the computer can do things
00:08:16.360 | so many times faster than humans,
00:08:18.280 | where the hard part is telling the computer
00:08:20.240 | what to do and how to do it,
00:08:21.840 | if you can master that,
00:08:23.840 | asking the computer what to do,
00:08:25.560 | then you could conceivably achieve more things.
00:08:28.200 | And those contests I was in,
00:08:29.640 | those were the opposite in some sense
00:08:31.840 | of making a complete product,
00:08:34.120 | like a game is a product.
00:08:35.880 | Those contests were effectively write a function
00:08:38.560 | to do something extremely efficiently.
00:08:40.800 | And if you are able to do that,
00:08:42.480 | then you can unlock more of the power of the computer.
00:08:45.800 | - But also doing it quickly.
00:08:47.440 | There's a time element from the human perspective
00:08:49.640 | to be able to program quickly.
00:08:53.400 | There's something nice.
00:08:54.880 | So there's almost like an athletics component
00:08:57.920 | to where you're almost like an athlete
00:09:01.280 | seeking optimal performance as a human being
00:09:03.920 | trying to write these programs.
00:09:05.680 | And at the same time,
00:09:06.520 | it's kind of art because you're,
00:09:08.840 | the best way to write a program quickly
00:09:11.080 | is to write a simple program
00:09:13.440 | that's still a damn good solution.
00:09:14.840 | So it's not necessarily you have to type fast.
00:09:17.040 | You have to think through a really clean,
00:09:19.520 | beautiful solution.
00:09:22.760 | I mean, what do you think is the use
00:09:26.080 | of those programming competitions?
00:09:27.760 | Do you think they're ultimately something
00:09:29.160 | you would recommend for students,
00:09:30.600 | for people interested in programming
00:09:32.000 | or people interested in building stuff?
00:09:33.960 | - Yes.
00:09:34.800 | I think so because especially with the work
00:09:37.120 | that I've been doing nowadays,
00:09:38.120 | even trying to control COVID,
00:09:40.240 | something that was very helpful from day one
00:09:42.600 | was understanding that the kinds of computations
00:09:45.600 | we would want to do,
00:09:47.120 | we could conceivably do on like a four-core cloud machine
00:09:50.480 | on Amazon Web Services out to a population
00:09:53.800 | which might have hundreds of thousands
00:09:55.320 | or millions of people.
00:09:56.520 | The reason why that was important
00:09:57.880 | to have that back of the envelope calculation
00:10:00.800 | with efficient algorithms
00:10:02.760 | is because if we couldn't do that,
00:10:05.000 | then we would bankrupt ourselves
00:10:06.320 | before we could get to a big enough scale.
00:10:08.200 | If you think about how you grow anything from small to big,
00:10:11.520 | if in order to grow it from small to big,
00:10:13.640 | you also already need 10,000 cloud servers,
00:10:16.480 | you'll never get to big.
00:10:17.680 | - And also the nice thing about programming competitions
00:10:22.360 | is that you actually build a thing that works.
00:10:26.440 | So you finish it.
00:10:28.080 | There's a completion thing and you realize,
00:10:30.800 | I think there's a magic to it
00:10:32.560 | where you realize that it's not so hard
00:10:35.560 | to build something that works.
00:10:37.560 | To have a system that successfully takes in inputs
00:10:40.600 | and produces outputs and solves a difficult problem
00:10:43.400 | and that directly transfers to building a startup,
00:10:46.560 | essentially that can help some aspect of this world
00:10:50.040 | as long as it's mostly based on software engineering.
00:10:53.880 | Things get really tricky
00:10:55.000 | when you have to manufacture stuff.
00:10:56.760 | That's why people like Elon Musk are so impressive
00:11:00.760 | that it's not just software.
00:11:02.880 | Tesla Autopilot is not just software.
00:11:05.360 | It's like you have to actually have factories
00:11:07.680 | that build cars and there's a million components
00:11:11.440 | involved in the machinery required
00:11:14.320 | to assemble those cars and so on.
00:11:16.200 | But in software, one person can change the world,
00:11:19.040 | which is incredible.
00:11:21.440 | But on the mathematics side,
00:11:23.240 | what if you look back or maybe today,
00:11:26.760 | what made you fall in love with mathematics?
00:11:29.680 | - For me, I think I've always been very attracted
00:11:33.320 | to challenge, as I already indicated
00:11:35.720 | with writing the program.
00:11:37.320 | I guess if I see something that's hard
00:11:40.440 | or supposed to be impossible,
00:11:41.960 | sometimes I say, "Maybe I want to see
00:11:45.720 | "if I can pull that off."
00:11:47.120 | And with the mathematics, the math competitions
00:11:49.920 | presented problems that were hard,
00:11:53.560 | that I didn't know how to start,
00:11:55.040 | but for which I could conceivably try to learn
00:11:57.640 | how to solve them.
00:11:59.120 | So, I mean, there are other things that are hard
00:12:00.480 | like getting something to Mars, getting people to Mars.
00:12:03.360 | And I didn't, and I still don't think
00:12:05.840 | that I'm able to solve that problem.
00:12:08.200 | On the other hand, the math problems struck me
00:12:10.000 | as things which are hard
00:12:11.440 | and with significant amount of extra work,
00:12:13.560 | I could figure it out.
00:12:14.760 | And maybe they would actually even be useful.
00:12:16.560 | Like that mathematical skill is the core
00:12:18.960 | of lots of other things.
00:12:20.240 | - That's really interesting.
00:12:23.720 | Maybe you could speak to that
00:12:25.120 | because a lot of people say that math is hard.
00:12:29.480 | As a kind of negative statement.
00:12:31.640 | It always seemed to me a little bit like
00:12:35.160 | that's kind of a positive statement
00:12:37.400 | that all things that are worth having in this world,
00:12:40.400 | they're hard.
00:12:41.480 | I mean, everything that people think about
00:12:44.520 | that they would love to do,
00:12:46.440 | whether it's sports, whether it's art, music,
00:12:49.960 | and all the sciences,
00:12:52.760 | it's going to be hard if you want to do something special.
00:12:56.560 | So is there something you could say to that idea
00:12:59.040 | that math is hard?
00:13:00.200 | Should it be made easy or should it be hard?
00:13:03.880 | - So I think maybe I want to dig in a little bit
00:13:07.160 | onto this hard part and say,
00:13:09.680 | I think the interesting thing about the math
00:13:12.120 | is that you can see a question
00:13:14.800 | that you didn't know how to start doing it before.
00:13:18.240 | And over a course of thinking about it,
00:13:20.920 | you can come up with a way to solve it.
00:13:24.440 | And so you can move from a state
00:13:26.000 | of not being able to do something
00:13:28.200 | to a state of being able to do something
00:13:30.320 | where you help to take yourself through that
00:13:33.320 | instead of somebody else spoon feeding you that technique.
00:13:37.640 | So actually here, I'm already digging into
00:13:39.520 | maybe part of my teaching philosophy also,
00:13:42.080 | which is that I actually don't want to ever
00:13:45.440 | just tell somebody, here's how you do something.
00:13:48.040 | I actually prefer to say, here's an interesting question.
00:13:52.200 | I know you don't quite know how to do it.
00:13:54.280 | Do you have any ideas?
00:13:55.440 | I'm actually explaining another way
00:13:58.960 | that you could try to do teaching.
00:14:00.440 | And I'm contrasting this to a method of,
00:14:03.080 | watch me do this, now practice it 20 times.
00:14:06.360 | I'm trying to say a lot of people consider math to be hard
00:14:09.080 | because maybe they can't remember
00:14:10.800 | all of the methods that were taught.
00:14:12.800 | But for me, I look at the hardness
00:14:15.560 | and I don't think of it as a memory hardness.
00:14:17.600 | I think of it as a, can you invent something hardness?
00:14:21.960 | And I think that if we can teach more people
00:14:24.360 | how to do that art of invention in a pure cognitive way,
00:14:28.680 | not as hard as the actual hardware stuff, right?
00:14:31.280 | But like in terms of the concepts
00:14:32.680 | and the thoughts and the mathematics,
00:14:34.040 | teaching people how to invent,
00:14:36.400 | then suddenly, actually they might not even find math
00:14:38.880 | to be that tiresome kind of hard anymore,
00:14:42.680 | but that rewarding kind of hard of,
00:14:45.440 | I have the capability of looking at something
00:14:48.440 | which I don't know what to do
00:14:49.920 | and coming up with how to do it.
00:14:51.400 | I actually think we should be doing that,
00:14:52.840 | giving people that capability.
00:14:55.040 | - So hard in the same way that invention is hard,
00:14:58.160 | that is ultimately rewarding.
00:14:59.520 | So maybe you can dig in that a little bit longer,
00:15:03.640 | which is, do you see basically the way to teach math
00:15:08.640 | is to present a problem and to give a person a chance
00:15:14.840 | to try to invent a solution
00:15:17.000 | with minimum amount of information first?
00:15:21.080 | Is that basically,
00:15:22.840 | how do you build that muscle of invention in a student?
00:15:26.400 | - Yes, so the way that,
00:15:27.840 | I guess I have two different sort of ways
00:15:30.120 | that I try to teach.
00:15:30.960 | Actually, one of them is, in fact, this semester,
00:15:32.960 | because all my classes were remotely delivered.
00:15:35.560 | I even threw them all onto my YouTube channel.
00:15:37.200 | So you can see how I teach at Carnegie Mellon.
00:15:39.960 | But I'd often say, "Hey, everyone, let's try to do this.
00:15:43.240 | "Any ideas?"
00:15:44.680 | And that actually changes my role as a professor
00:15:47.760 | from a person who shows up for class
00:15:50.040 | with a script of what I want to talk through.
00:15:52.480 | I actually, I don't have a script.
00:15:54.000 | The way I show up for classes,
00:15:55.680 | there's something that we want to learn how to do,
00:15:58.080 | and we're going to do it by improv.
00:16:00.120 | I'm talking about the same method as improv comedy,
00:16:02.760 | which is where you tell me some ideas
00:16:05.240 | and I'll try to yes and them.
00:16:07.040 | (both laughing)
00:16:08.360 | You know what I mean?
00:16:09.320 | And then together, we're going to come up with a proof
00:16:11.880 | of this concept where you were deeply involved
00:16:15.000 | in creating the proof.
00:16:16.360 | Actually, every time I teach the class,
00:16:18.040 | we do every proof slightly differently
00:16:19.800 | because it's based on how the students came up with it.
00:16:23.360 | And that's how I do it when I'm in person.
00:16:25.760 | I also have another line of courses that we make
00:16:27.880 | that is delivered online.
00:16:29.200 | Those things are where I can't do it live,
00:16:31.640 | but the teaching method became also similar.
00:16:34.560 | It was just, "Here's an interesting question.
00:16:37.080 | "I know it's out of reach.
00:16:38.160 | "Why don't you think about it?"
00:16:39.400 | And then automatic hints, we automatically feed hints
00:16:42.240 | through the internet to go and let the person try to invent.
00:16:47.560 | So that's like a more rigorous prodding of invention.
00:16:52.240 | But you did mention disease and COVID
00:16:56.000 | and you've been doing some very interesting stuff
00:16:58.160 | from a mathematical, but also software engineering angle
00:17:01.920 | of coming up with ideas.
00:17:04.520 | It's back to the, "I see a problem.
00:17:07.360 | "I think I can help."
00:17:08.720 | So you stepped into this world.
00:17:11.120 | Can you tell me about your work there
00:17:13.840 | under the flag of NOVID, both the software
00:17:18.840 | and the technical details of how the thing works?
00:17:21.640 | - Sure, sure.
00:17:22.480 | So first I want to make sure that I say,
00:17:24.480 | this is actually team effort.
00:17:25.960 | I happen to be the one speaking,
00:17:27.440 | but there's no way this would exist
00:17:29.040 | without an incredible team of people
00:17:30.560 | who inspire me every day to work on this.
00:17:33.120 | But I'll speak on behalf of them.
00:17:34.560 | So the idea was indeed that we stepped forward
00:17:39.560 | in March of last year,
00:17:41.440 | when the world started to become,
00:17:43.320 | our part of the world started to become,
00:17:44.800 | our part meaning the United States,
00:17:46.360 | started to become paralyzed by COVID.
00:17:48.920 | The shutdown started to happen.
00:17:50.760 | And at that time, it started as a figment of an idea,
00:17:54.280 | which was network theory,
00:17:57.160 | which is the area of math that I work in,
00:17:59.520 | could potentially be combined with smartphones
00:18:02.480 | and some kind of health information, anonymized.
00:18:06.040 | Exactly how?
00:18:07.080 | We didn't know yet.
00:18:07.960 | We tried to crystallize it.
00:18:09.440 | And many months into this work,
00:18:11.400 | we ended up accidentally discovering a new way
00:18:15.240 | to control diseases,
00:18:17.000 | which is now what is the main impetus of all of this work,
00:18:20.880 | is to take this idea and polish it
00:18:23.480 | and hopefully have it be useful,
00:18:25.000 | not only now, but for future pandemics.
00:18:27.440 | The idea is really simple to describe.
00:18:29.720 | Actually, my main thing in the world
00:18:31.120 | is I come up with obvious observations.
00:18:33.760 | That's, I'll explain it now.
00:18:35.120 | - Einstein did the same thing
00:18:36.920 | and he wrote a few short papers.
00:18:38.960 | - But so the idea is like this.
00:18:41.800 | If we describe how usually people control disease,
00:18:46.480 | for a lot of history,
00:18:47.960 | it was that you'd find out who was sick,
00:18:51.000 | you'd find out who they have been around,
00:18:53.280 | and you try to remove all of those people from society
00:18:56.120 | against their will.
00:18:57.200 | - Yes.
00:18:58.040 | - Now that's the problem.
00:18:58.960 | The against their will part
00:19:00.880 | gives you the wrong kind of a feedback loop,
00:19:03.560 | which makes it hard to control the disease
00:19:05.520 | because then the people you're trying to control
00:19:07.000 | keep getting other people sick.
00:19:08.760 | You can see already how I'm thinking
00:19:10.120 | and talking about this, feedback loops.
00:19:11.960 | This is actually related to something you said earlier
00:19:14.240 | about even like how skyscrapers stay in the air.
00:19:17.160 | The whole point is control theory.
00:19:19.400 | You actually want to, or even how an airplane stays,
00:19:22.640 | you need to have control loops
00:19:24.920 | which are feedbacking in the right way.
00:19:27.080 | And what we observed was that the feedback control loop
00:19:29.920 | for controlling disease by asking people
00:19:32.160 | to be removed from society against their will
00:19:34.520 | was not working.
00:19:35.600 | It was running against human incentives,
00:19:37.640 | and you suddenly are trying to control 7 billion,
00:19:40.200 | 8 billion people in ways that they don't individually want
00:19:43.640 | to necessarily do.
00:19:45.200 | So here's the idea.
00:19:46.160 | And this is inspired by the fact that at the core of our team
00:19:49.480 | were user experience designers.
00:19:51.280 | That's actually, in fact, the first thing I knew we needed
00:19:53.920 | when we started was to bring user experience at the core.
00:19:57.160 | Okay.
00:19:58.160 | But so the idea was, suppose hypothetically,
00:20:03.160 | there was a pandemic.
00:20:05.320 | What would you want?
00:20:07.040 | You would want a way to be able to live your life
00:20:10.280 | as much as possible and avoid getting sick.
00:20:13.640 | Can we make an app to help you avoid getting sick?
00:20:17.400 | Notice how I've just articulated the problem.
00:20:19.680 | It is not, can we make an app
00:20:21.840 | so that after you are around somebody who's sick,
00:20:24.720 | you can be removed from society.
00:20:27.080 | It's, can we make an app so that you can avoid getting sick?
00:20:30.520 | That would run a positive,
00:20:32.360 | I don't know if I want to call it positive or negative,
00:20:35.360 | but it would run a good feedback loop.
00:20:36.880 | Okay.
00:20:37.720 | So then how would you do this?
00:20:38.880 | The only problem is that you don't know who's sick
00:20:41.520 | because especially with this disease,
00:20:44.160 | if I see somebody who looks perfectly healthy,
00:20:47.000 | the disease spreads two days before you have any symptoms.
00:20:50.040 | And so it's actually not possible.
00:20:52.400 | That's where the network theory comes in.
00:20:54.800 | You caught it from someone.
00:20:56.600 | What if we changed the paradigm and we said,
00:21:00.240 | whenever there's a sickness,
00:21:02.440 | tell everybody how many physical relationships
00:21:06.200 | separate them from the sickness.
00:21:08.000 | That is the trivial idea we added.
00:21:09.680 | The trivial idea was the distance between you and a disease
00:21:13.360 | is not measured in feet or seconds.
00:21:16.600 | It's measured in terms of how many
00:21:19.040 | close physical relationships separate you,
00:21:22.040 | like these six degrees of separation, like LinkedIn.
00:21:25.400 | Simple idea.
00:21:26.480 | What if we told everyone that?
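
To make the mechanics concrete: "how many relationships separate you from the sickness" is exactly a multi-source breadth-first search over the contact graph, started from everyone who reported sick. Here is a minimal sketch, my own illustration rather than the NOVID codebase:

```cpp
// Minimal sketch (not the NOVID code): hop distance from the nearest
// reported case, via multi-source BFS over an anonymous contact graph.
#include <queue>
#include <vector>

// adj[u] lists the people with a close physical relationship to u.
// Returns dist[v] = relationships separating v from the nearest case
// (0 for the cases themselves, -1 if unreachable).
std::vector<int> hopDistances(const std::vector<std::vector<int>>& adj,
                              const std::vector<int>& sickNodes) {
    std::vector<int> dist(adj.size(), -1);
    std::queue<int> q;
    for (int s : sickNodes) {  // all reported cases start at distance 0
        dist[s] = 0;
        q.push(s);
    }
    while (!q.empty()) {
        int u = q.front(); q.pop();
        for (int v : adj[u]) {
            if (dist[v] == -1) {  // first visit gives the shortest hop count
                dist[v] = dist[u] + 1;
                q.push(v);
            }
        }
    }
    return dist;
}
```

On a sparse graph this runs in time linear in people plus relationships, which is what makes recomputing it for an entire population tractable, a point that comes up again later in the conversation.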
00:21:28.080 | It turns out that actually unlocks
00:21:30.200 | some interesting behavioral feedback loops,
00:21:33.040 | which for example, let me now jump to a non-COVID example
00:21:37.000 | to show why this maybe could be useful.
00:21:39.240 | Actually, we think it could be quite useful.
00:21:40.920 | Imagine there was Ebola or some hemorrhagic fever.
00:21:44.080 | Imagine it spread through contact, through the air, in fact.
00:21:47.040 | Pretend, pretend.
00:21:48.120 | That's a disastrous disease.
00:21:52.720 | It has high fatality rate.
00:21:54.680 | And as you die, you're bleeding out of every orifice.
00:21:59.120 | Okay?
00:22:01.440 | - Yeah, not pleasant.
00:22:02.400 | - Not pleasant.
00:22:03.240 | So the question is, suppose that such a disease broke,
00:22:06.560 | who would want to install an app that would tell them
00:22:10.400 | how many relationships away from them
00:22:12.600 | this disease had struck?
00:22:14.200 | - A lot of people.
00:22:16.240 | - A lot of people.
00:22:17.080 | In fact, almost, I don't want to say almost everyone.
00:22:20.480 | That's a very strong statement,
00:22:21.400 | but a very large number of people.
00:22:23.200 | - That's fascinating framing.
00:22:24.600 | Like the more deadly and transmissible the disease,
00:22:28.400 | the stronger the incentive to install it in a positive sense,
00:22:32.520 | in the good feedback loop sense.
00:22:36.400 | That's a really good example.
00:22:37.600 | It's a really good way to frame it.
00:22:38.920 | 'Cause with COVID, it was not as deadly
00:22:42.640 | as potential pandemics could have been,
00:22:45.440 | viruses could have been.
00:22:46.320 | So it's sometimes muddled with how we think about it.
00:22:49.200 | But yeah, this is a really good framing.
00:22:50.840 | If the virus was a lot more deadly,
00:22:53.760 | you want to create a system that has a set of incentives
00:22:56.320 | that it quickly spreads to the population
00:22:59.440 | where everybody is using it
00:23:01.000 | and is contributing in a positive way to the system.
00:23:04.240 | - Exactly.
00:23:05.080 | And actually that point you just made,
00:23:06.120 | I don't take credit for that observation.
00:23:07.800 | There was another person I talked to who pointed out
00:23:09.760 | that it's very interesting that this feedback loop
00:23:12.840 | is even more effective when the disease is worse.
00:23:17.000 | And that's actually not a bad characteristic
00:23:19.400 | to have in your feedback loop.
00:23:20.680 | If you're trying to help civilization keep running.
00:23:24.360 | - Yeah, it's really, it's dynamic.
00:23:27.360 | Like people figure out,
00:23:29.360 | they dynamically figure out how bad the disease is.
00:23:31.960 | The more it spreads and the deadlier it is
00:23:35.120 | as the people observe it,
00:23:37.240 | as long as the spread of information,
00:23:39.680 | like semantic information, natural language information,
00:23:43.480 | is closely aligned with the reality of the disease,
00:23:46.080 | which is a whole nother conversation, right?
00:23:49.880 | Maybe we'll chat about that,
00:23:51.040 | how we sort of make sure there's not misinformation
00:23:53.840 | where there's accurate information.
00:23:55.000 | But that aside, okay, so this is a really nice property.
00:23:58.880 | - Right.
00:23:59.720 | And just going on on that,
00:24:00.920 | actually just talking more about what that could do
00:24:02.840 | and why we're so excited about it,
00:24:04.480 | it's that not only would people want to install it,
00:24:07.400 | what would they do?
00:24:09.720 | If you start to see that this disease
00:24:11.960 | is getting closer and closer,
00:24:13.160 | we surveyed informally people,
00:24:15.080 | but they said as we saw it getting closer, we would hide.
00:24:18.600 | We would try to not have contacts.
00:24:21.840 | But now you notice what this has just achieved.
00:24:24.120 | The whole goal on this whole exercise was
00:24:27.440 | you got the people who might be sick
00:24:29.760 | and you got everyone else, set A and set B.
00:24:32.040 | Set A is the people who might be sick,
00:24:33.400 | set B is everyone else.
00:24:34.880 | And for the entirety of the past contact tracing approaches,
00:24:39.720 | you tried to get set A to do things that might not be
00:24:43.600 | to their liking or their will,
00:24:44.960 | because that's removing them from society.
00:24:46.960 | - Yes.
00:24:47.800 | - And we found out that there's two ways
00:24:49.040 | to separate set A from set B.
00:24:51.160 | You can also let the people at set B,
00:24:53.440 | at the fringe of set A,
00:24:55.760 | attempt to remove themselves from this interface.
00:24:58.640 | It's the symmetry of A and B separation.
00:25:01.600 | Everyone was looking at A, we look at B,
00:25:04.560 | and suddenly it's in B's incentive to do so.
00:25:07.000 | - Beautiful.
00:25:09.000 | So there's a virus that jumps from human to human.
00:25:12.000 | So there's a network sometimes called graph
00:25:16.280 | of the spread of a virus.
00:25:18.440 | It hops from person to person to person to person.
00:25:21.680 | And each one of us individuals are sitting
00:25:25.800 | or plopped into that network.
00:25:28.240 | We have close friends and relations and so on.
00:25:31.720 | It's kind of fascinating to actually think
00:25:33.000 | about this network.
00:25:33.920 | And we can maybe talk about the shapes
00:25:35.560 | of this kind of network.
00:25:36.760 | 'Cause I was trying to think exactly this,
00:25:39.940 | like how many people do I,
00:25:41.440 | also I'm kind of an introvert, not kind of,
00:25:43.520 | I'm very much an introvert.
00:25:45.200 | But so can I be explicit about the kind of people
00:25:48.240 | I meet in regular life?
00:25:50.080 | Is the same way it was completely opened up,
00:25:52.120 | there's no pandemic.
00:25:53.280 | There is a kind of network.
00:25:57.040 | Of course, and there's maybe in the graph,
00:26:00.240 | theoretic sense, there's some weights or something
00:26:02.960 | about how close that relationship is
00:26:06.960 | in terms of the frequency of visits,
00:26:08.840 | the duration of visits and all of those kinds of things.
00:26:11.600 | So you're saying we might want to be,
00:26:14.840 | to create on top of that network,
00:26:18.080 | a spread of information to let you know,
00:26:22.280 | as the virus travels through this network,
00:26:24.440 | how close is it getting to you?
00:26:26.280 | And the number of hops away it is on that network
00:26:29.280 | is really powerful information
00:26:31.360 | that creates a positive feedback loop
00:26:34.000 | where you can act essentially anonymously
00:26:37.440 | and on your own.
00:26:41.240 | Like nobody's telling you what to do,
00:26:44.000 | which is really important, is decentralized
00:26:46.320 | and not whatever the opposite of authoritarian is.
00:26:52.240 | But you get to sort of the American way,
00:26:54.720 | you get to choose to do it yourself,
00:26:56.400 | you have the freedom to do it yourself
00:26:58.480 | and you're incentivized to do it.
00:27:00.200 | And you're most likely going to do it
00:27:01.960 | to protect yourself against you getting the disease
00:27:06.960 | as the closer it gets to you
00:27:10.040 | based on the information that you have.
00:27:12.040 | But can you maybe elaborate, first of all, brilliant.
00:27:15.940 | When I saw the thing you're working on,
00:27:20.360 | so forget COVID,
00:27:21.640 | this is of course really relevant for COVID,
00:27:25.140 | but it's also probably relevant for future diseases as well.
00:27:28.200 | So that was the thing I'm nervous about,
00:27:30.440 | is like if this whole,
00:27:31.840 | if our society shut down because of COVID,
00:27:34.720 | like what the heck is gonna happen
00:27:38.040 | when there's a much deadlier disease?
00:27:40.480 | Like this is disappointing, the whole time,
00:27:43.560 | 2020, the whole time I'm just sitting like this,
00:27:45.960 | like is the incompetence of everybody
00:27:49.840 | except the people developing vaccines.
00:27:53.000 | The biologists are the only ones
00:27:54.400 | that got their stuff together.
00:27:56.060 | But in terms of institutions and all that kind of stuff,
00:27:58.600 | it's just been terrible.
00:28:00.720 | But this is exactly the power of information
00:28:04.080 | and the power of information
00:28:05.960 | that doesn't limit personal freedom.
00:28:08.080 | So your idea is brilliant.
00:28:09.400 | Okay, mathematically, can you maybe elaborate
00:28:12.680 | what are we talking about?
00:28:14.120 | Like how do you actually make that work?
00:28:16.480 | What's involved?
00:28:17.680 | - Sure, first I'm gonna reply to something you said
00:28:19.720 | about the freedom inside this,
00:28:22.400 | because actually that was the idea.
00:28:24.480 | The idea is this is game theory, right?
00:28:27.320 | And effectively what we did is analogous
00:28:29.600 | to free market economy as opposed to central planning.
00:28:34.880 | If you just line up the set of incentives correctly
00:28:38.120 | so that people have in their purely selfish behavior
00:28:43.120 | are contributing to the optimization
00:28:45.560 | of the global function, that's it.
00:28:47.720 | And the point of what we do, I guess, in mathematics
00:28:50.420 | is we try to explore the search space
00:28:52.800 | to go and find out as many possibilities as there are.
00:28:55.040 | And in this case, it's an applied search space.
00:28:58.080 | That's why the inputs from design, user experience design
00:29:01.040 | and actual people are important.
00:29:02.720 | But you asked about, I guess, the mathematical
00:29:05.760 | or the technical things underpinning it.
00:29:07.640 | So I think the first thing I'll say is
00:29:09.800 | we wanted to make this thing
00:29:11.480 | not require your personal information.
00:29:14.760 | And so in order to do that, what gave me the confidence
00:29:17.840 | to, I guess, lead our team to run at the beginning
00:29:20.560 | is we saw that this could be done
00:29:21.920 | without using GPS information.
00:29:23.800 | So technically what's going on is if two smartphones,
00:29:28.160 | it's a smartphone app,
00:29:29.040 | if two smartphones have this thing installed,
00:29:31.460 | they just communicate with each other by Bluetooth
00:29:34.520 | to go and find out how far,
00:29:37.160 | they can detect nearby things by Bluetooth.
00:29:40.280 | And then they can find out that these two phones
00:29:42.040 | were approximately such and such distance apart.
00:29:44.880 | And that kind of relative proximity information
00:29:47.660 | is enough to construct this big network.
00:29:50.640 | - Okay, so the physical network is constructed
00:29:53.160 | based on proximity that's through Bluetooth
00:29:56.040 | and you don't have to specify your exact location,
00:29:59.300 | it's the proximity.
00:30:01.200 | - We're not using the Pythagorean theorem, basically.
00:30:03.280 | I mean, if we just knew the GPS coordinates,
00:30:05.520 | we could use the Pythagorean theorem too.
00:30:07.280 | Sorry, that's just what I call it.
00:30:08.520 | Distance formula, whatever you wanna call it.
00:30:10.480 | (both laughing)
00:30:13.720 | - Yeah, so we're not doing the old Pythagorean
00:30:16.760 | based violation of privacy, okay.
00:30:18.740 | (both laughing)
00:30:21.240 | But so is that enough to form,
00:30:26.240 | to give you enough information
00:30:28.600 | about physical connection to another human being?
00:30:32.880 | Is there a time element there?
00:30:35.240 | Is there, so, okay.
00:30:37.400 | That sounds like a really strong, like low hanging fruit.
00:30:41.560 | Like if you have that,
00:30:42.480 | you could probably go really, really far.
00:30:44.920 | My natural question is,
00:30:46.880 | is there extra information you can add on top of that?
00:30:49.760 | Like the duration of the physical proximity?
00:30:53.520 | - So first of all, we actually do estimate the duration,
00:30:56.660 | but the way we estimate the duration
00:30:58.340 | is like how a movie is filmed,
00:31:00.440 | in the sense that every so often, every few minutes,
00:31:03.200 | we check what's nearby.
00:31:06.240 | You take lots of snapshots.
00:31:07.960 | So there's no way in a battery efficient way
00:31:11.280 | to really keep track of that proximity.
00:31:14.440 | However, fortunately, we're using probability.
00:31:17.120 | The fact is, the paradigm that we're using
00:31:20.120 | is it's not super important if you run into that person
00:31:23.200 | only for 10 minutes at the grocery store.
00:31:25.640 | If that's a stranger that you run into 10 minutes
00:31:27.640 | in this grocery store,
00:31:28.920 | that's not going to be relevant for our paradigm
00:31:30.880 | because our paradigm is not telling you
00:31:33.100 | who were you around before
00:31:35.000 | and might therefore have gotten infected by already.
00:31:38.600 | Ours is about predicting the future.
00:31:40.200 | We change from, I mean, the standard paradigm
00:31:42.360 | was what already happened, quick damage control.
00:31:45.200 | Ours is predict the future.
00:31:46.600 | If you run into that person once in the grocery store today
00:31:49.320 | and never see them again,
00:31:50.400 | it's irrelevant for predicting the future.
00:31:52.540 | And therefore, for ours, what really matters
00:31:54.820 | is the many hours around the other person,
00:31:57.960 | at which point if you're scanning every five to eight minutes
00:32:00.640 | - That's going to come out in the,
00:32:02.040 | like statistically speaking,
00:32:03.240 | it's going to come out as a strong relationship
00:32:05.400 | and a person in the grocery store is going to wash out
00:32:08.760 | as not an important physical relationship.
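
A hedged sketch of how that washing-out can work in code: each periodic scan bumps a counter for every pair of devices seen together, and only pairs with enough accumulated co-presence become edges of the graph. The threshold of 50 scans below is a made-up illustrative number, not NOVID's actual parameter.

```cpp
// Illustrative sketch (not the NOVID code): turning periodic Bluetooth
// snapshots into a sparse graph of strong relationships.
#include <cstdint>
#include <map>
#include <set>
#include <utility>

// How many scans each (unordered) pair of devices has been co-present for.
std::map<std::pair<uint64_t, uint64_t>, int> coPresenceCount;

// Called once per scan (every few minutes) with anonymous IDs seen nearby.
void recordScan(uint64_t self, const std::set<uint64_t>& nearby) {
    for (uint64_t other : nearby) {
        uint64_t a = self, b = other;
        if (a > b) std::swap(a, b);  // normalize the undirected pair
        coPresenceCount[{a, b}]++;
    }
}

// A pair counts as a close physical relationship only after enough
// co-present scans; a one-off stranger at the grocery store never qualifies.
// The threshold 50 is purely illustrative.
bool isStrongEdge(uint64_t a, uint64_t b) {
    if (a > b) std::swap(a, b);
    auto it = coPresenceCount.find({a, b});
    return it != coPresenceCount.end() && it->second >= 50;
}
```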
00:32:11.120 | Okay, I mean, this is brilliant.
00:32:13.200 | What, how difficult is it to make work?
00:32:15.880 | So you said, one, there's the mathematical component
00:32:19.000 | that we just kind of talked about.
00:32:21.280 | And then there's the user experience component.
00:32:24.000 | So how difficult is it to go,
00:32:26.080 | just like you built the video game, "Alien Attack",
00:32:28.720 | from zero to completion, what's involved?
00:32:33.800 | How difficult is it?
00:32:34.900 | - So I'm going to answer that question
00:32:36.720 | in terms of building the product,
00:32:39.360 | but then I'm also going to acknowledge
00:32:40.860 | that just having an app doesn't make it useful
00:32:44.720 | because that's actually maybe the easy part.
00:32:47.640 | If you know what I mean,
00:32:49.400 | there's like all of this stuff
00:32:50.300 | about rollout adoption and awareness,
00:32:52.060 | but let's focus on the app part first.
00:32:53.740 | So that's again, why I said that the team is incredible.
00:32:56.320 | So we have a bunch of people who,
00:32:58.840 | let's just say that the technology that we use to make it
00:33:02.720 | is not the standard way you make an app.
00:33:04.960 | If you think about a standard iOS app or Android app,
00:33:08.660 | those are a user interface that contacts a web server
00:33:12.080 | and sends some information back and forth.
00:33:14.080 | We're doing some stuff that has to hook
00:33:16.000 | into the operating system of saying,
00:33:17.880 | let's go use Bluetooth for something
00:33:19.280 | it wasn't really meant for.
00:33:21.320 | Right, so there's that part.
00:33:22.560 | - And by the way, what is the app called?
00:33:24.500 | - Oh, it's called NOVID, COVID with an N.
00:33:27.500 | - Very nice.
00:33:29.580 | So you have to hook into Bluetooth.
00:33:31.340 | You're saying you have to do that
00:33:32.980 | beyond the permissions that are like
00:33:37.180 | at the very surface level provided on the phone?
00:33:40.660 | - Well, I don't want to call them permissions.
00:33:42.420 | I just want to say
00:33:43.260 | that's not what you usually do with Bluetooth.
00:33:45.420 | - Gotcha.
00:33:46.260 | - Usually with Bluetooth, you say,
00:33:47.660 | do I have headphones nearby?
00:33:49.060 | - Yes.
00:33:49.900 | - Okay, I'm done.
00:33:50.720 | So I go and say, do I have headphones nearby?
00:33:53.140 | Or do I have another phone nearby, which is doing something?
00:33:55.500 | And then keep asking that same question.
00:33:56.340 | - Keep asking the question.
00:33:58.340 | - Right, so it's actually not easy.
00:34:00.220 | And I mean, there were some parts of it,
00:34:02.180 | which actually a lot of people had tried unsuccessfully.
00:34:05.820 | Actually, it's known that, for example,
00:34:07.960 | the UK was trying to do something similar.
00:34:11.580 | And the problem they ran into was
00:34:13.620 | when you program things on iOS,
00:34:16.100 | iOS is very good at making it hard
00:34:19.300 | to do things in the background.
00:34:20.900 | And so there was quite a lot of effort required
00:34:23.780 | to go and make this thing work.
00:34:25.420 | - So the whole point, this thing would run in the background
00:34:28.180 | and iOS, I mean, most,
00:34:31.180 | Android probably as well, right?
00:34:33.740 | But yeah, iOS certainly makes it difficult
00:34:35.500 | for something to run in the background,
00:34:36.900 | especially when it's eating up your battery, right?
00:34:40.200 | - Ah, well, we wanted to make sure
00:34:41.300 | we didn't eat up the battery.
00:34:42.260 | So that one, we actually are very proud of the fact
00:34:44.940 | that ours uses very little battery.
00:34:47.780 | Actually, even if compared to Apple's own system.
00:34:51.580 | - Beautiful.
00:34:52.420 | So what else is required to make this thing work?
00:34:54.500 | - Right, so the key was that you had to do
00:34:56.860 | a significant amount of work
00:34:57.980 | on the actual mobile app development,
00:35:00.180 | which fortunately the team that we brought
00:35:02.140 | was this kind of general thinkers
00:35:04.020 | where we would dig in deep
00:35:05.740 | into the operating system documentation
00:35:07.940 | and the API libraries.
00:35:09.440 | So we got that working.
00:35:10.620 | But there's another angle, which is,
00:35:12.300 | you also need the servers to be able to compute fast enough,
00:35:14.980 | which is tying back to this old school
00:35:17.340 | computer programming competitions and math Olympiads.
00:35:20.020 | In fact, our team that was working on the algorithm
00:35:22.980 | and backend side included several people
00:35:25.440 | who had been in these competitions from before,
00:35:29.500 | which I happen to know
00:35:30.420 | because I do coach the team for the math.
00:35:33.340 | And so we were able to bring people in to build servers,
00:35:37.220 | a server infrastructure in C++ actually,
00:35:40.100 | so that we could support significant numbers of people
00:35:43.060 | without needing tons of servers.
00:35:45.340 | - Is there some distributed algorithms working here
00:35:47.980 | or you basically have to keep in the same place
00:35:51.540 | the entire graph as it builds?
00:35:53.700 | 'Cause especially the more and more people use it,
00:35:56.260 | the bigger, the bigger the graph gets.
00:35:58.240 | I mean, this is very difficult scaling problem, right?
00:36:02.300 | - Ah, so that's actually why
00:36:04.100 | this computer algorithm competition stuff was handy.
00:36:07.140 | It's because there are only about
00:36:10.100 | seven to eight giga people in the world.
00:36:12.900 | - Yeah.
00:36:14.020 | - That's not that many.
00:36:15.140 | So if you can make your algorithms linear time
00:36:17.500 | or almost linear time, a computer operates in gigahertz.
00:36:22.300 | I only need to do one run, one recalculation every hour
00:36:25.940 | in terms of telling people how far away these dangers are.
00:36:29.540 | So I'd suddenly have 3,600 seconds
00:36:33.500 | and my CPU cores are running in gigahertz
00:36:36.460 | and at most they're eight giga people.
00:36:39.500 | - Well, you're skipping over the fact that
00:36:42.020 | there's N squared potential connections between people.
00:36:45.980 | So how do you get around the fact that, you know,
00:36:51.380 | that we, you know, the potential set of relationship
00:36:54.420 | any one of us could have is 8 billion.
00:36:56.140 | So it's 8 billion times squared.
00:36:59.180 | That's the potential amount of data you have to be storing
00:37:02.700 | and computing over and constantly updating.
00:37:05.260 | - So the way we dealt with that is we actually expect
00:37:08.100 | that the typical network is very sparse.
00:37:10.820 | The technical term sparse would mean that the average degree
00:37:14.820 | or the average number of connections that a person has
00:37:17.580 | is going to be at most like a hundred strong connections
00:37:21.100 | that you care about.
00:37:22.500 | If you think of it almost in terms of the heavy hitters,
00:37:25.500 | actually in most people's lives,
00:37:27.580 | if we just kept track of their top hundred interactions,
00:37:32.420 | that's probably most of the signal.
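
To put rough numbers on that (my own back-of-envelope, not theirs): 8 billion people times roughly 100 strong connections each is about 800 billion edge traversals per full recalculation. At a few billion simple operations per second per core, a four-core machine covers that in minutes, comfortably inside the 3,600-second hourly budget mentioned earlier. That is the whole point of keeping the graph sparse and the algorithm near-linear.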
00:37:34.180 | - Yeah, yeah.
00:37:37.660 | I'm saddened to think that I might not be even
00:37:40.340 | in a double digits, but...
00:37:41.980 | - Oh, I was intentionally giving a crazy number
00:37:44.660 | to account for college students.
00:37:46.260 | - You call, oh, those are the,
00:37:48.900 | who you call on the heavy hitters,
00:37:49.940 | the people who are like the social butterflies.
00:37:51.980 | Yeah. - Yeah, yeah.
00:37:52.940 | - I need to, I'd love to know that information
00:37:56.260 | about myself, by the way, that, do you expose the graph?
00:38:01.260 | Like how many, like about yourself,
00:38:04.420 | how many connections you have?
00:38:06.020 | - We do expose to each person
00:38:07.860 | how many direct connections they have.
00:38:09.540 | - That's great.
00:38:10.380 | - But for privacy purposes,
00:38:11.540 | we don't tell anybody who their connections,
00:38:14.020 | like how their connections are interconnected.
00:38:15.940 | - Yes, gotcha.
00:38:16.820 | - But at the same time, we do expose also to everyone
00:38:19.340 | an interesting chart that says,
00:38:20.820 | here's how many people you have
00:38:22.740 | that you're connected to directly.
00:38:24.260 | Here's how many at distance two, meaning via people.
00:38:27.780 | And then here's how many at distance three.
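
In terms of the BFS sketch from earlier (again my illustration, not the app's internals), that chart is just a histogram of hop distances computed from your own node:

```cpp
#include <vector>

// counts[k] = how many people are exactly k relationships away from you,
// given dist[] from a BFS started at your own node.
std::vector<int> countsByDistance(const std::vector<int>& dist, int maxHops) {
    std::vector<int> counts(maxHops + 1, 0);
    for (int d : dist)
        if (d >= 1 && d <= maxHops)
            counts[d]++;
    return counts;
}
```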
00:38:29.540 | And the reason we do that is that actually ends up
00:38:31.940 | being a dynamic that also boosts adoption.
00:38:34.340 | It drives another feedback loop.
00:38:36.260 | The reason is because we saw actually
00:38:38.020 | when we deployed this in some universities,
00:38:40.460 | that when people see on their app
00:38:42.700 | that they are indirectly connected to hundreds
00:38:45.940 | or thousands of other people, they get excited
00:38:48.180 | and they tell other people,
00:38:49.020 | "Hey, let's download this app."
00:38:50.660 | But you know, we also saw in those examples,
00:38:52.860 | especially looking at the screenshots people gave,
00:38:55.420 | that is hit as soon as the typical person
00:38:58.500 | has two or three other direct connections on the system,
00:39:02.540 | because that means that our app has reached a virality
00:39:05.700 | or not of two to three.
00:39:07.620 | The key is we were making a viral app to fight a virus
00:39:10.820 | spreading on the same network that the virus spreads on.
00:39:13.860 | - So you're trying to out-virus the virus.
00:39:17.020 | - That's right.
00:39:17.860 | (laughing)
00:39:18.740 | That's exactly right.
00:39:20.220 | - Okay, great.
00:39:21.340 | What have you learned from this whole experience
00:39:23.580 | in terms of, let's say for COVID,
00:39:26.500 | but for future pandemics as well,
00:39:28.740 | is it possible to use the power information here
00:39:33.700 | of networked information as the virus spreads
00:39:37.100 | and travels in order to basically keep the society open?
00:39:41.340 | Is it possible for people to protect themselves
00:39:44.740 | with this information?
00:39:46.100 | Or do you still have to have most,
00:39:48.820 | like in this overarching policy of everybody
00:39:51.100 | should stay at home, that kind of thing?
00:39:53.580 | - We are trying to answer that question right now.
00:39:55.340 | So the answer is we don't know yet,
00:39:57.500 | but that's actually why we're very happy
00:39:59.220 | that now the idea has started to become more widely known,
00:40:02.660 | and we're already starting to collaborate
00:40:04.580 | with epidemiologists.
00:40:06.420 | Again, I'm just a mathematician, right?
00:40:08.780 | And a mathematician should not be the person
00:40:10.940 | who is telling everybody, "This will definitely work."
00:40:13.620 | But because of the potential power of this approach,
00:40:17.580 | especially the potential power of this being an end game
00:40:21.060 | for COVID, we have gotten the interest of real researchers.
00:40:26.060 | And we're now working together to try to actually understand
00:40:28.980 | the answer to that question.
00:40:30.140 | Because you see, there's a theory.
00:40:31.500 | So what I can share is the mathematics of,
00:40:34.020 | "Here's why there's some hope that this would work."
00:40:36.660 | And that's because I'm talking about end game now.
00:40:39.420 | End game means you have very few cases.
00:40:41.540 | But everywhere, we're always thinking,
00:40:43.540 | "Once there's few cases,
00:40:44.500 | "then does that mean we now open up?"
00:40:46.500 | Once you open up in the past,
00:40:48.020 | then the cases go up again
00:40:49.540 | until you have to lock down again.
00:40:51.460 | And now, when we talk about the dynamic process, that means
00:40:54.180 | it's guaranteed you always have cases
00:40:55.860 | until you have the great vaccines,
00:40:57.140 | which is, you know, we both got vaccinated.
00:40:59.020 | This is good.
00:41:00.700 | But at the same time, why I'm thinking
00:41:02.500 | this is still important
00:41:03.500 | is because we know that many vaccine makers have said
00:41:06.540 | they're preparing for the next dose next year.
00:41:09.740 | And if we have a perpetual thing
00:41:11.700 | where you just always need a new vaccine every year,
00:41:14.500 | it could actually be beneficial
00:41:15.820 | to make sure we have as many other techniques as possible
00:41:18.820 | for parts of the world that can't afford,
00:41:20.820 | for example, that kind of distribution.
00:41:23.180 | - Yeah, so actually, no matter how deadly the virus is,
00:41:26.380 | no matter how many things,
00:41:27.740 | whether you have a vaccine or not,
00:41:29.700 | it's still useful to be having this information.
00:41:31.940 | - Yes.
00:41:32.780 | - To stay home or not, depending on how risk,
00:41:35.020 | like I'm a big fan, just like you said,
00:41:37.380 | of having the freedom for you to decide
00:41:40.260 | how risk averse you want to be, right?
00:41:43.220 | Depending on your own conditions,
00:41:44.380 | but also in the state of like what you,
00:41:47.220 | just how dangerously you like to live.
00:41:50.100 | - So I think that actually makes a lot of sense.
00:41:51.940 | And I also think that since we're,
00:41:54.940 | when you think of disease spreading,
00:41:56.980 | it spreads in aggregate in the sense that
00:42:00.500 | if there are some people who maybe are more risk tolerant
00:42:04.660 | because of other things in their life,
00:42:06.460 | well, there might also be other people
00:42:08.060 | who are less risk tolerant.
00:42:09.820 | And then those people decide to isolate.
00:42:12.740 | But what matters is in the aggregate
00:42:14.500 | that this R naught of the infection spreading
00:42:17.660 | drops below one.
00:42:18.980 | And so the key is if you can empower people
00:42:21.180 | with that power to make that decision,
00:42:23.460 | you might actually still be able to drive
00:42:25.060 | that R naught down below one.
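A minimal sketch of the threshold effect described here, as a toy branching process in Python (the contact counts and probabilities are made-up illustration, not NOVID's actual epidemic model):

```python
import random

def outbreak_size(r_eff, initial_cases=10, cap=100_000):
    """Toy branching process: each case infects Binomial(4, r_eff/4)
    new cases, so the mean number of offspring per case is r_eff."""
    total = active = initial_cases
    while active and total < cap:
        new = sum(random.random() < r_eff / 4
                  for _ in range(4 * active))
        total += new
        active = new
    return total

random.seed(0)
for r in (1.5, 0.8):
    sizes = sorted(outbreak_size(r) for _ in range(20))
    print(f"R_eff = {r}: median outbreak size ~ {sizes[10]}")
# Above 1 the toy epidemic explodes to the cap; below 1 it fizzles out.
```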
00:42:26.540 | - Yeah, and also, this is me talking,
00:42:30.100 | people get a little bit nervous, I think,
00:42:33.700 | with information somehow mapping to privacy violation.
00:42:38.300 | But I, first of all, in the approach you're describing,
00:42:42.280 | that's respecting anonymity.
00:42:44.640 | But I would love to have information
00:42:49.260 | from the very beginning, from March and April of last year,
00:42:52.800 | almost like a map of like where it's risky
00:42:59.140 | and where it's not to go.
00:43:01.420 | And not map based on sort of the exact location of people,
00:43:05.220 | but where people usually hang out kind of thing.
00:43:07.620 | And maybe not necessarily about actual location,
00:43:13.120 | but just maybe activities.
00:43:15.260 | Like just to have information about what is good to do
00:43:19.700 | and not, in terms of like safety.
00:43:23.100 | Is it okay to run outside or not,
00:43:25.460 | is it okay to go to a restaurant or not?
00:43:27.780 | I just feel like we're operating in the blind.
00:43:29.660 | And then what you had is a very imperfect signal,
00:43:33.900 | which is like basically politicians
00:43:36.180 | desperately trying to make statements
00:43:38.700 | about what is safe and not.
00:43:40.060 | They don't know what the heck they're doing.
00:43:41.620 | They have a bunch of smart scientists telling them stuff.
00:43:44.140 | And the scientists themselves, also, very important,
00:43:47.740 | don't always know what they're doing.
00:43:49.620 | Epidemiology is as much an art as a science.
00:43:54.620 | You're desperately trying to predict the future,
00:43:56.420 | which nobody can do.
00:43:57.980 | And then you're trying to speak
00:43:59.580 | with some level of authority.
00:44:01.260 | I mean, if I were to criticize scientists,
00:44:02.940 | they spoke with too much authority.
00:44:04.460 | It's okay to say, I'm not sure.
00:44:06.500 | But then they think like, if I say I'm not sure,
00:44:10.700 | then there's going to be a distrust.
00:44:12.420 | What they realize is when you're wrong
00:44:14.100 | and you say, I'm sure, it's going to lead to more distrust.
00:44:16.880 | So there's this imperfect, like just chaotic,
00:44:19.780 | messy system of people trying to figure out
00:44:23.600 | with very little information.
00:44:25.340 | And what you're proposing is just a huge amount
00:44:27.860 | of information, and information is power.
00:44:31.100 | Is there challenges with adoption
00:44:33.700 | that you see in the future here?
00:44:36.260 | So maybe we could speak to
00:44:38.660 | the approaches, I guess, from Google.
00:44:40.520 | There are different people who've tried
00:44:42.620 | similar kinds of ideas.
00:44:44.740 | You have quite a novel idea, actually.
00:44:49.620 | But speaking the umbrella idea of contact tracing,
00:44:54.780 | is there something you can comment about
00:44:58.820 | why their approaches haven't been fully adopted?
00:45:02.060 | Is there challenges there?
00:45:03.220 | Is there reasons why Novid might be a better idea
00:45:06.400 | moving forward, in general, just about adoption?
00:45:09.260 | - Yeah, so first of all, I want to say,
00:45:10.700 | I always have respect for the methods that other people use.
00:45:13.300 | And so it's good to see that other people have been trying.
00:45:16.180 | But what we have noticed is that the difference
00:45:19.060 | between our value proposition to the user
00:45:22.380 | and the value proposition to the user
00:45:23.900 | delivered by everything that was made before,
00:45:26.500 | is that, unfortunately, the action of installing
00:45:30.500 | a standard contact tracing app will then tell you
00:45:34.500 | after you have already been exposed to the disease,
00:45:37.740 | so that you can protect other people from you.
00:45:40.660 | And what that does to your own direct probability
00:45:43.640 | of getting sick, if you think about it,
00:45:45.740 | suppose you were making the decision,
00:45:47.120 | should I or should I not install one of those apps?
00:45:50.040 | What does that do to your own probability of getting sick?
00:45:53.100 | (chuckles)
00:45:55.020 | It's close to zero.
00:45:56.340 | - This is the sad thing you're speaking to.
00:46:00.020 | Not sad, I suppose it's the way the world is.
00:46:03.340 | The only incentive there is to just help other people,
00:46:06.100 | I suppose, but a much stronger incentive
00:46:09.700 | is anything that allows you to help yourself.
00:46:13.200 | - Yes, so what I'm saying is that,
00:46:15.580 | let's just say free market capitalism
00:46:17.180 | was not based on altruism.
00:46:19.860 | I think it's based on, if you make a system of incentives
00:46:23.260 | so that everybody trying to maximize their own situation
00:46:26.940 | somehow contributes to the whole,
00:46:28.740 | that's a game theoretic solution to a very hard problem.
00:46:31.780 | And so this is actually basically mechanism design.
00:46:34.180 | Like we've basically come up with a different mechanism,
00:46:36.620 | different set of incentives,
00:46:38.240 | which incentivizes the adoption.
00:46:40.920 | Because actually, whenever we've been rolling it out,
00:46:43.100 | usually the first question we ask people,
00:46:45.220 | like say in a university is, do you know what Novid does?
00:46:48.140 | And most of them have read about the other apps
00:46:50.620 | and they say, oh, Novid will tell you
00:46:51.900 | after you've been around someone so you can quarantine.
00:46:54.060 | And we have to explain to them,
00:46:55.380 | actually, Novid never wants to ask you to quarantine.
00:46:58.380 | That's not the principle.
00:46:59.220 | Our principle isn't based on that at all.
00:47:01.180 | We just want to let you know if something is coming close
00:47:04.520 | so that you can protect yourself.
00:47:06.180 | - If you want.
00:47:08.140 | - If you want, if you want, if you want.
00:47:09.300 | And then the quarantine is like, yes,
00:47:11.260 | in that case, if you're quarantining,
00:47:13.380 | it's because you're shutting the door from the inside,
00:47:16.180 | if that makes sense.
00:47:17.020 | - Yes, exactly, exactly.
00:47:18.700 | I mean, this is brilliant.
00:47:20.100 | So what do you think the future looks like
00:47:23.380 | for future pandemics?
00:47:24.580 | What's your plan with Novid?
00:47:26.700 | What's your plan with these set of ideas?
00:47:28.780 | - I am actually still an academic and a researcher.
00:47:31.200 | So the biggest work I'm working on right now
00:47:33.380 | is to try to build as many collaborations
00:47:35.580 | with other public health researchers at other universities
00:47:39.140 | to actually work on pilot deployments together
00:47:42.180 | in various places.
00:47:43.060 | That's the goal.
00:47:44.100 | That's actually ongoing work right now.
00:47:46.060 | And so for example, if anyone's watching this
00:47:47.820 | and you happen to be a public health researcher
00:47:49.780 | and you want to be involved in something like this,
00:47:52.380 | I'm just gonna say, I'm still thinking in terms of incentives.
00:47:55.280 | There is something in it for the researchers too.
00:47:57.380 | This could open up an entire new way of controlling disease.
00:48:00.380 | That's my hope.
00:48:01.940 | I mean, it might actually be true.
00:48:03.580 | And people who are involved in figuring out
00:48:06.380 | how to make this work,
00:48:08.080 | well, it could actually be good for their careers too.
00:48:10.020 | I always have to think like,
00:48:11.100 | if a researcher was getting involved,
00:48:12.720 | what are they getting out of it?
00:48:14.420 | - So you mean like from a research perspective,
00:48:16.660 | you can like publications and sets of ideas
00:48:20.300 | about how to, from a sort of a network theory perspective,
00:48:25.300 | understand how we control the spread of a pandemic.
00:48:30.120 | - Yes, and what I'm doing right now
00:48:31.740 | is this is basically interdisciplinary research
00:48:33.780 | where maybe our side is bringing the technology
00:48:36.020 | and the network theory,
00:48:37.100 | and the missing parts are epidemiology
00:48:39.260 | and public health expertise.
00:48:41.020 | And if the two things start to join,
00:48:42.900 | also because everywhere that you deploy,
00:48:45.380 | let's just say that the world is different
00:48:46.860 | in the Philippines as it is in the United States.
00:48:49.460 | And just the natures of the locality
00:48:52.100 | would mean that someone like me
00:48:53.620 | should not be trying to figure out how to do that.
00:48:55.340 | But if we can work with the researchers who are based there,
00:48:57.900 | now suddenly we might come up with a solution
00:48:59.780 | that will help scale in parts of the world
00:49:02.020 | where they aren't all getting
00:49:03.240 | the Moderna and Pfizer vaccines,
00:49:04.740 | which cost like $20 a pop in the US.
00:49:07.460 | - So if they want to participate,
00:49:09.300 | who do they reach out to?
00:49:10.780 | - Oh, that would just be us.
00:49:11.680 | I mean, the novid.org website has--
00:49:13.620 | - Novid.org?
00:49:14.460 | - It has a feedback reach out form.
00:49:16.900 | And actually we are, I mean, again,
00:49:18.860 | this is the DNA of being a researcher.
00:49:21.060 | I am actually very excited by the idea
00:49:23.340 | that this could contribute knowledge
00:49:25.600 | that will outlast all of our generations,
00:49:28.260 | like all of our lifetimes.
00:49:29.980 | - There you go.
00:49:30.820 | Reach out to novid.org.
00:49:33.240 | What about individual people?
00:49:36.060 | Should they install the app and try it out?
00:49:37.700 | Or is this really geographically restricted?
00:49:40.100 | - Oh yeah, I didn't come on here
00:49:41.520 | to tell everyone to install the app.
00:49:42.800 | I did not come to tell everyone to install the app
00:49:44.780 | because it works best if your local health authority
00:49:48.240 | is working with us.
00:49:49.400 | - Gotcha.
00:49:50.240 | - There's a reason.
00:49:51.060 | It's because, this is back to the game theory.
00:49:53.460 | If anyone could just say, "I'm positive,"
00:49:57.220 | the high school senior prank would be to say
00:50:01.480 | that we have a massive outbreak on finals week.
00:50:04.000 | Let's not have final exams.
00:50:05.280 | So the way that our system works,
00:50:06.640 | it actually borrows some ideas, not borrows,
00:50:08.760 | we came up with them independently.
00:50:10.280 | But this idea is similar to what Google and Apple do,
00:50:13.240 | which is that if the local health authority
00:50:14.880 | is working with this, they can, for everyone who's positive,
00:50:17.960 | give them a passcode that expires in a short time.
00:50:20.680 | So for ours, if you're on the app and saying, "I'm positive,"
00:50:23.600 | you can either just say that, and that's called unverified,
00:50:26.800 | or you can enter in one of these codes
00:50:28.360 | that you got from the local health authority.
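For flavor, one minimal way such an expiring code could be built is with a keyed hash; this is a generic sketch under assumed names, not NOVID's or Google/Apple's actual protocol, and the key handling is deliberately naive:

```python
import hashlib
import hmac
import time

SECRET = b"health-authority-signing-key"  # hypothetical key held by the authority

def issue_code(case_id: str, valid_seconds: int = 3600) -> str:
    """Health authority signs (case_id, expiry) so the code self-expires."""
    expiry = int(time.time()) + valid_seconds
    msg = f"{case_id}:{expiry}".encode()
    tag = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()[:8]
    return f"{case_id}:{expiry}:{tag}"

def verify_code(code: str) -> bool:
    """Check that the tag is genuine and the code hasn't expired."""
    case_id, expiry, tag = code.split(":")
    expected = hmac.new(SECRET, f"{case_id}:{expiry}".encode(),
                        hashlib.sha256).hexdigest()[:8]
    return hmac.compare_digest(tag, expected) and time.time() < int(expiry)

print(verify_code(issue_code("case-042")))  # True while fresh, False after expiry
```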
00:50:30.280 | So basically, for anyone who's watching this,
00:50:32.420 | it's not that you should just go and download it,
00:50:34.000 | unless you wanna go and look at it, that's cool.
00:50:36.120 | But if you, on the other hand,
00:50:37.480 | if you happen to know anyone at the local health authority
00:50:39.840 | which is trying to figure out how to handle COVID,
00:50:42.600 | well then, I mean, we'd be very happy to also work with you.
00:50:46.400 | - Gotcha, so the verified there is really important,
00:50:49.080 | because you're maintaining anonymity,
00:50:51.280 | and because of that,
00:50:52.120 | you have to have some source of verification
00:50:54.560 | in order to make sure that it's not possible to manipulate,
00:50:59.040 | because it's ultimately about trust and information,
00:51:02.960 | verification is really important there.
00:51:06.080 | - So basically, individual people should
00:51:08.080 | ask their local health authorities
00:51:11.240 | to sign up to contact you.
00:51:13.780 | I hope this spreads,
00:51:16.280 | I hope this spreads for future pandemics,
00:51:18.640 | 'cause it's the amount,
00:51:21.400 | the millions of people who are hurt by this,
00:51:25.560 | I think our response to the virus, economically speaking,
00:51:30.000 | the number of people who lost their dream,
00:51:32.480 | lost their jobs, but also lost their dream,
00:51:35.080 | entrepreneurs, jobs often give meaning,
00:51:38.360 | there's people who financially and psychologically
00:51:41.040 | are suffering because of our,
00:51:42.720 | I'll say incompetent response to the virus across the world,
00:51:48.240 | but certainly in the United States,
00:51:49.720 | which should be the beacon
00:51:51.000 | of entrepreneurial hope for the world.
00:51:54.360 | So, I hope that we'll be able to respond
00:51:59.360 | to these kinds of events much better in the future,
00:52:02.720 | and this is exactly the right kind of idea,
00:52:05.000 | and now's the time to do the investment.
00:52:07.180 | Let's step back to the beauty of mathematics.
00:52:13.000 | Maybe ask the big, silly question first,
00:52:16.000 | which is, what do you find beautiful about mathematics?
00:52:19.740 | - I think that being able to look at a complicated problem,
00:52:25.800 | which looks unsolvable,
00:52:28.520 | and then to be able to change the perspective
00:52:30.760 | to come from a different angle,
00:52:32.440 | and suddenly see that there's a nice solution.
00:52:35.300 | I don't mean that every problem in math
00:52:37.860 | is supposed to be this way,
00:52:39.120 | but I think that these reframings
00:52:40.880 | and changing of perspectives
00:52:42.120 | that cause difficult things to get simplified
00:52:44.600 | and crystallized and factored in certain ways is beautiful.
00:52:48.560 | Actually, that's related to what we were just talking about
00:52:50.920 | with even this fighting pandemics.
00:52:52.560 | The crystal idea was just quantify proximity
00:52:57.560 | by the number of relationships in the physical network,
00:53:01.640 | instead of just by the feet and meters.
00:53:03.920 | If you change that perspective,
00:53:07.320 | now all of these things follow.
00:53:09.200 | And so, mathematics, to me, is beautiful
00:53:12.240 | in the pure sense, just for that.
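In graph terms, the reframing is just multi-source breadth-first search: your distance to the virus is counted in relationship hops rather than meters. A minimal sketch over a made-up anonymized contact graph:

```python
from collections import deque

def hops_to_infection(graph, reported_cases):
    """Multi-source BFS: each person's distance, in relationships,
    to the nearest reported case."""
    dist = {person: 0 for person in reported_cases}
    queue = deque(reported_cases)
    while queue:
        person = queue.popleft()
        for neighbor in graph[person]:
            if neighbor not in dist:
                dist[neighbor] = dist[person] + 1
                queue.append(neighbor)
    return dist

# Hypothetical anonymized IDs; edges are relationships, not locations.
graph = {"u1": ["u2"], "u2": ["u1", "u3"],
         "u3": ["u2", "u4"], "u4": ["u3"]}
print(hops_to_infection(graph, ["u1"]))
# {'u1': 0, 'u2': 1, 'u3': 2, 'u4': 3}
```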
00:53:14.960 | - Yeah, it's quite interesting to see
00:53:16.520 | human civilization as a network, as a graph,
00:53:20.000 | and our relationships as kind of edges in that graph,
00:53:25.000 | and to then do, outside of just pandemic,
00:53:29.320 | do interesting inferences based on that.
00:53:31.880 | This is true for Twitter, social networks, and so on,
00:53:36.920 | how we expand the kind of things we talk about,
00:53:40.080 | think about, sort of politically,
00:53:42.200 | if you have this little bubble, quote-unquote,
00:53:44.680 | of ideas that you play with,
00:53:46.880 | it's nice from a recommender system perspective,
00:53:50.200 | how do you jump out of those bubbles?
00:53:52.000 | It's really fascinating.
00:53:53.600 | YouTube was working on that, Twitter's working on that,
00:53:57.520 | but not always so successfully.
00:53:59.680 | But there's a lot of interesting work
00:54:02.680 | from a mathematical and a psychological,
00:54:05.160 | sociological perspective there, within those graphs.
00:54:08.540 | But if we look at the cleanest formulation of that,
00:54:13.360 | of looking at a problem from a different perspective,
00:54:16.280 | you're also involved
00:54:17.360 | with the International Mathematics Olympiad,
00:54:20.200 | which takes small, clean problems
00:54:26.220 | that are really hard,
00:54:27.640 | but once you look at them differently, can become easy.
00:54:31.140 | But that little jump of innovation is the entire trick.
00:54:36.140 | So maybe at the high level,
00:54:38.560 | can you say what is the International Mathematical Olympiad?
00:54:41.480 | - Sure, so this is the competition
00:54:44.600 | for people who aren't yet in college, math competition,
00:54:47.860 | which is the most prestigious one in the entire world.
00:54:50.720 | It's the Olympics of mathematics,
00:54:52.860 | but only for people who aren't yet in college.
00:54:55.040 | Now, the kinds of questions that they ask you to do
00:54:58.000 | are not computational.
00:54:59.600 | Usually you're not supposed to find that the answer is 42.
00:55:02.440 | (both laughing)
00:55:03.800 | Instead, you're supposed to explain why something is true.
00:55:07.300 | And the problem is that at the beginning,
00:55:09.840 | when you look at each of the questions,
00:55:11.560 | first of all, you have four and a half hours
00:55:13.560 | to solve three questions, and this is one day,
00:55:16.100 | and then you have a second day,
00:55:16.940 | which is four and a half hours, three questions.
00:55:19.400 | But when you look at the questions,
00:55:20.640 | they're all asking you,
00:55:21.480 | explain why the following thing is true,
00:55:23.280 | which you've never seen before.
00:55:25.220 | And by the way, even though there are six questions,
00:55:27.400 | if you solve any one of them, you're a genius,
00:55:29.160 | and you get an honorable mention.
00:55:30.360 | So this is hard to solve.
00:55:32.120 | - Really hard problem.
00:55:32.960 | So what about, is it one person, is it a team?
00:55:35.360 | - Ah, so it's each country can send six people,
00:55:38.760 | and the score of the country is actually unofficial.
00:55:42.360 | There's not an official country versus country system,
00:55:45.400 | although everyone just adds up the point scores
00:55:47.500 | of the six people, and they say,
00:55:48.800 | well, now which country stacked up where?
00:55:51.460 | - Yeah, so maybe as a side comment,
00:55:53.420 | I should say that there's a bunch of countries,
00:55:56.580 | including the former Soviet Union and Russia,
00:55:59.500 | where I grew up, where this is one of the
00:56:03.500 | most important competitions that the country participates in.
00:56:08.380 | It was a source of pride for a lot of the country.
00:56:11.920 | You look at the Olympic sports,
00:56:14.460 | like wrestling, weightlifting,
00:56:17.140 | there's certain sports and hockey
00:56:20.340 | that Russia and the Soviet Union truly took pride in.
00:56:25.020 | And actually the Mathematical Olympiad,
00:56:28.660 | it was one of them for many years, is still one of them.
00:56:32.620 | And that's kind of fascinating.
00:56:33.780 | We don't think about it this way in the United States.
00:56:36.740 | Maybe you can correct me if I'm wrong,
00:56:38.260 | but it's not nearly as popular in the United States
00:56:42.420 | in terms of its integration into the culture,
00:56:45.580 | into just basic conversation, into the pride.
00:56:49.100 | Like, if you won an Olympic gold medal,
00:56:52.420 | or if you won the Superbowl, you can walk around proud.
00:56:56.100 | I think that was the case
00:56:57.100 | with the Mathematical Olympiad in Russia.
00:56:59.260 | Not as much the case in the United States, I think.
00:57:03.100 | So I just wanna give that a little aside
00:57:04.940 | because beating anybody from Russia,
00:57:07.560 | from the Eastern Bloc, or from China
00:57:09.420 | is very, very difficult.
00:57:11.900 | Like if I remember correctly,
00:57:13.540 | for these people, this was a multi-year training process.
00:57:18.900 | They train hard.
00:57:20.700 | And this is everything that they're focused on.
00:57:25.060 | My dad was a participant in this.
00:57:29.340 | And it's, I mean, it's as serious as Olympic sports.
00:57:33.140 | You think about like gymnastics,
00:57:34.540 | like young athletes participating in gymnastics.
00:57:36.620 | This is as serious as that, if not more serious.
00:57:38.940 | So I just wanna give that a little bit of context
00:57:41.380 | 'cause we're talking about serious, high-level math,
00:57:44.660 | athletics almost here.
00:57:46.380 | - Yeah, and actually I also think that it made sense
00:57:49.820 | from the Soviet Union's perspective.
00:57:51.420 | Because if you look at what these people do eventually,
00:57:55.460 | even though, let's look at the USSR's
00:57:58.820 | International Math Olympiad record.
00:58:00.620 | Even though they won
00:58:03.100 | a lot of awards at the high school thing,
00:58:05.260 | many of them went on to do incredible things
00:58:07.820 | in research mathematics or research other things.
00:58:10.580 | And that's showing the generalizability
00:58:14.340 | of what they were working on.
00:58:15.980 | Because ultimately we're just playing with ideas
00:58:20.180 | of how to prove things.
00:58:22.380 | And if you get pretty good at inventing creative ways
00:58:26.060 | to take problems apart, split them apart,
00:58:29.020 | observe neat ways to turn messy things into simple crystals.
00:58:34.020 | Well, if you're gonna try to solve any real problem
00:58:36.180 | in the real world, that could be a really handy tool too.
00:58:39.260 | So I don't think it was a bad investment.
00:58:41.180 | I think it clearly worked well for Soviet Union.
00:58:44.940 | - Yeah, so this is interesting.
00:58:47.060 | People sometimes ask me, you grew up under communism,
00:58:50.960 | was there anything good about communism?
00:58:54.100 | And it's difficult for me to talk about it
00:58:58.140 | because communism is one of those things
00:59:00.820 | that's looked down on
00:59:02.940 | in absolutist terms currently.
00:59:05.380 | But you could still, in my perspective,
00:59:07.340 | talk about, forget communism
00:59:09.500 | or whatever the actual term is, certain
00:59:14.680 | ways that the society functioned
00:59:16.760 | that we can learn lessons from.
00:59:18.120 | And one of the things in the Soviet Union
00:59:20.280 | that was highly prized is knowledge,
00:59:25.160 | not even knowledge, it's wisdom and the skill of invention,
00:59:30.160 | of innovation at a young age.
00:59:34.140 | So we're not talking about a selection process
00:59:37.240 | where you pick the best students in the school
00:59:40.440 | to do the mathematics or to read literature.
00:59:44.120 | It's like everybody did it, everybody.
00:59:48.740 | It was almost treated as if anyone
00:59:51.640 | could be the next Einstein, anybody could be the next,
00:59:54.800 | I don't know, Hemingway, James Joyce.
00:59:56.880 | And so you're forcing an education on the populace
01:00:01.200 | and a rigorous, deep education,
01:00:03.960 | like as opposed to kind of like,
01:00:06.440 | oh, we want to make sure we teach
01:00:10.320 | to the weakest student in the class,
01:00:12.420 | which American systems can sometimes do
01:00:16.960 | because we don't want to leave anyone behind.
01:00:19.200 | The Russian system was anyone can be the strongest student
01:00:25.040 | and we're going to teach to the strongest student
01:00:26.880 | and we're going to pretend or force everybody,
01:00:30.880 | even the weakest student to be strong.
01:00:32.920 | And what that results in, it's obviously,
01:00:35.240 | this is what people talk about
01:00:36.520 | is a huge amount of pressure.
01:00:38.120 | Like it's psychologically very difficult.
01:00:40.640 | This is why people struggle when they go to MIT,
01:00:42.640 | this very competitive environment.
01:00:44.400 | It can be very psychologically difficult,
01:00:46.100 | but at the same time,
01:00:47.360 | it's bringing out the best out of people.
01:00:49.800 | And that mathematics was certainly one of those things.
01:00:53.240 | And exactly what you're saying,
01:00:54.640 | which kind of clicked with me just now,
01:00:56.400 | as opposed to kind of a spelling bee in the United States,
01:01:00.760 | which I guess you spell, I'm horrible at this,
01:01:03.360 | but it's a competition about spelling,
01:01:04.980 | which, I'm not sure, but you could argue,
01:01:07.200 | doesn't generalize well to future skills.
01:01:10.020 | Mathematics, especially this kind of mathematics,
01:01:13.320 | is essentially formalized competition of invention,
01:01:17.160 | of creating new ideas.
01:01:21.780 | And that generalizes really, really well.
01:01:23.980 | So that's quite brilliantly put.
01:01:25.840 | I didn't really think about that.
01:01:27.360 | So this is not just about the competition.
01:01:29.200 | This is about developing minds
01:01:31.960 | that will come to do some incredible stuff in the future.
01:01:36.960 | - Yeah, actually, I want to respond
01:01:38.660 | to a couple of things there.
01:01:39.620 | The first one is this notion
01:01:42.020 | of whether or not that is possible
01:01:43.800 | in a non-authoritarian regime.
01:01:46.320 | I think it is.
01:01:47.160 | And that's actually why I spent some of my efforts
01:01:49.540 | before the COVID thing,
01:01:51.060 | actually trying to work towards there.
01:01:53.260 | The reason is because if you think about it,
01:01:55.720 | let's say in America,
01:01:57.420 | lots of people are pretty serious
01:01:58.860 | about training very hard for football or baseball,
01:02:01.940 | or basketball, basketball is very, very accessible,
01:02:04.020 | but lots of people are doing that.
01:02:06.660 | Well, actually, I think that what was going on
01:02:09.900 | with the authoritarian thing was at least the message
01:02:13.540 | that was universally sent was being a good thinker
01:02:17.860 | and a creator of ideas is a good thing.
01:02:21.800 | - Yes, exactly.
01:02:23.380 | - There's no reason why that message can't be sent.
01:02:25.740 | - That's right. - Everywhere.
01:02:26.940 | And I think it actually should be.
01:02:28.660 | So that's the first thing.
01:02:29.740 | The second thing is what you commented about,
01:02:31.900 | this thing about the generalizable skill
01:02:35.720 | and what could people do with Olympiads afterwards.
01:02:37.960 | So that's actually my interest in the whole thing.
01:02:40.340 | I don't just coach students how to do problems.
01:02:45.340 | In fact, I'm not even the best person for that.
01:02:47.140 | I'm not the best at solving these problems.
01:02:49.500 | There are other people who are much better
01:02:50.820 | at making problems and teaching people how to solve problems.
01:02:53.300 | In fact, when the Mathematical Association of America,
01:02:57.140 | which is the group which is in charge
01:02:58.640 | of the US participation in these Olympiads,
01:03:01.280 | when they were deciding whether or not to put me in
01:03:04.300 | back in 2013 as the head coach,
01:03:06.900 | I had a conversation with their executive director
01:03:09.260 | where I commented that we might do worse
01:03:12.680 | because my position was,
01:03:14.500 | I actually didn't want to focus on winning.
01:03:17.660 | I said, if you're going to let me work
01:03:19.840 | with 60 very strong minds as picked through this system,
01:03:24.420 | 'cause the coach works with these,
01:03:26.080 | gets to run a camp for these students,
01:03:27.700 | I said, I'm actually not going to define my success
01:03:30.420 | in terms of winning this contest.
01:03:33.100 | I said, I wanted to maximize the number of the students
01:03:36.060 | that I read about in the New York Times in 20 years.
01:03:39.160 | And the executive director
01:03:41.540 | of the Mathematical Association of America
01:03:44.040 | was fully in support of this
01:03:45.860 | because that's also how their philosophy is.
01:03:48.020 | So in America, the way we run this
01:03:49.860 | is we're actually not just training to win,
01:03:52.940 | even though the students are very good
01:03:54.860 | and they can win anyway.
01:03:56.460 | One reason, for example,
01:03:57.580 | I went and even did the COVID thing
01:03:59.100 | involving quite a few of them
01:04:00.540 | is so that hopefully some of them get ideas
01:04:04.260 | because in 20, 30 years,
01:04:05.480 | I won't have the energy or the insight to solve problems.
01:04:08.500 | We'll have another catastrophe.
01:04:10.500 | And hopefully some of these people will step up and do it.
01:04:13.180 | - And ultimately have that long-term impact.
01:04:14.980 | I wonder if this is scalable to,
01:04:17.300 | 'cause that's such a great metric for education,
01:04:20.340 | not how to get an A on the test,
01:04:27.140 | but how to be on the cover of New York Times
01:04:31.300 | for inventing something new.
01:04:32.740 | Do you think that's generalizable to education
01:04:37.340 | beyond just this particular Olympiad?
01:04:39.100 | Like even you saying this feels like a rare statement,
01:04:42.700 | almost like a radical statement as a goal for education.
01:04:45.900 | - So actually the way I teach my classes at Carnegie Mellon,
01:04:48.740 | which I will admit right away
01:04:49.900 | is not equivalent to the average in the world,
01:04:52.420 | but it's already not just the top 60 in the country
01:04:56.260 | as picked by something.
01:04:58.140 | Let me just explain.
01:04:58.980 | I have exams in my class, which are 90% of the grade.
01:05:01.500 | So the exams are the whole thing,
01:05:02.820 | or most of the whole thing.
01:05:03.900 | And the way that I let students prepare for the exams
01:05:06.620 | is I show them all the problems I've ever given
01:05:08.860 | on the previous exams.
01:05:10.380 | And the exam that they will take is open notes.
01:05:12.660 | They can take all the notes they want
01:05:13.740 | on the previous problems.
01:05:14.820 | And the guarantee is that the exam problems this time
01:05:17.220 | will have no overlap with anything
01:05:18.780 | you have seen me give in the past,
01:05:20.980 | as well as no overlap with anything I taught in the class.
01:05:24.380 | So the entire exam is invention.
01:05:26.060 | - Wow.
01:05:28.580 | - But that's how I go, right?
01:05:29.740 | My point is, I have explained to people,
01:05:32.140 | when I teach you, I don't want you to have remembered
01:05:35.340 | a method I showed you.
01:05:36.700 | I want you to have learned enough about this area
01:05:39.300 | that if you face a new question,
01:05:40.900 | which I came up with the night before,
01:05:42.500 | by thinking about like, what could I ask
01:05:44.620 | that I have never asked before?
01:05:46.180 | Oh, that's cute.
01:05:47.020 | I wonder what the answer is.
01:05:48.060 | Aha, that's an exam problem.
01:05:49.180 | That's exactly what I do before the exam.
01:05:51.340 | And then that's what I want them to learn.
01:05:53.860 | And the first exam, usually people have a rough time
01:05:56.100 | because it's like, what kind of crazy class is this?
01:05:58.420 | The professor doesn't teach you anything for the exam.
01:06:01.940 | But then by the second or third,
01:06:03.500 | and by the time they finish the class,
01:06:05.300 | they have learned how to solve anything in the area.
01:06:09.500 | - How to invent.
01:06:10.340 | - How to invent in that area, yeah.
01:06:12.300 | - Can we walk back to the Mathematical Olympiad?
01:06:15.540 | What's the scoring and format like?
01:06:17.580 | And also what does it take to win?
01:06:20.860 | - So the way it works is that each of the six students
01:06:24.300 | do the problems and there are six problems.
01:06:27.860 | All the problems are equally weighted.
01:06:29.620 | So each one's worth seven points.
01:06:31.540 | That means that your maximum score
01:06:33.420 | is six problems times seven points,
01:06:35.060 | which is the nice number of 42.
01:06:37.220 | And now the way that they're scored, by the way,
01:06:40.380 | is there's partial credit.
01:06:41.700 | So your question is asking you,
01:06:43.260 | explain why this weird fact is true.
01:06:46.380 | Okay, if you explain why, you get seven points.
01:07:48.740 | If you make a minor mistake, maybe you get six points.
01:06:51.300 | But if you don't succeed in explaining why,
01:06:53.860 | but you explain some other true fact,
01:06:57.940 | which is along the way of proving it,
01:07:02.220 | then you get partial credit.
01:07:03.980 | And actually now this is tricky
01:07:05.700 | because how do you score such a thing?
01:07:07.620 | It's not like the answer was 72 and you wrote 71
01:07:12.340 | and it's close, right?
01:07:13.180 | The answer is 72 and you wrote 36.
01:07:15.300 | Oh, but that's pretty close because you were,
01:07:17.700 | maybe you were just off by,
01:07:18.900 | by the way, they're not numerical anyway,
01:07:20.420 | but I'm just giving some numerical analog
01:07:22.700 | to the way the scoring might work.
01:07:24.580 | They're all essays.
01:07:25.860 | And that's where I guess I have some role
01:07:28.260 | as well as some other people
01:07:29.340 | who helped me in the US delegation for coaches.
01:07:32.380 | We actually debate with the country which is organizing it.
01:07:37.380 | The country which is organizing the Olympiad
01:07:39.500 | brings about 50 people to help judge the written solutions.
01:07:45.140 | And you schedule these half hour appointments
01:07:48.380 | where the delegation from one country
01:07:50.500 | sits down at a table like this,
01:07:52.340 | opposite side is two or three people from the host country.
01:07:55.540 | And they're just looking over these exam papers saying,
01:07:58.700 | well, how many points is this worth
01:08:00.620 | based on some rubric that has been designed?
01:08:03.100 | And this is a negotiation process
01:08:05.580 | where we're not trying to bargain
01:08:07.740 | and get the best score we can.
01:08:09.340 | In fact, sometimes we go to this table and we will say,
01:08:11.660 | we think we want less than what you gave us.
01:08:13.980 | These are our principles.
01:08:16.260 | If you give us too much, we say, no, you gave us too much.
01:08:18.780 | We do that.
01:08:19.700 | However, the reason why this is an interesting process
01:08:22.260 | is because if you can imagine every country
01:08:24.180 | which is participating has its own language.
01:08:26.740 | And so if you're trying to grade the Mongolian scripts
01:08:28.740 | and they're written in Mongolian,
01:08:31.020 | if you don't read Mongolian, which most people don't,
01:08:33.540 | then the coaches are explaining to you,
01:08:36.700 | this is what the student has written.
01:08:38.740 | It's actually quite interesting process.
01:08:40.460 | - So it's almost like a jury.
01:08:43.420 | - Yes.
01:08:44.260 | - You have, in the American legal system,
01:08:47.140 | you have a jury where they're deliberating,
01:08:49.860 | but unlike a jury, there's the members of the jury
01:08:53.620 | speaking different languages sometimes.
01:08:55.780 | - Yes. - That's fascinating.
01:08:57.140 | But I mean, it's hard to know what to do
01:09:01.660 | 'cause it's probably really, really competitive,
01:09:04.540 | but your sense is that ultimately people,
01:09:10.300 | like how do you prevent manipulation here, right?
01:09:15.220 | - Well, we just hope that it's not happening.
01:09:18.020 | So we write in English,
01:09:20.260 | therefore everything that the US does, everyone can look at.
01:09:23.740 | So it's very hard for me.
01:09:25.180 | - It's very hard for you to manipulate.
01:09:26.980 | - We don't manipulate.
01:09:28.580 | We only hope that other people aren't,
01:09:30.420 | but at the same time, as you see,
01:09:32.260 | our philosophy was we want to use this
01:09:34.300 | as a way to develop general talent.
01:09:36.500 | And although we do this for the six people who go
01:09:39.380 | to the International Math Olympiad,
01:09:41.460 | we really want that everyone touched at any stage
01:09:44.820 | of this process gets some skills
01:09:46.980 | that can help to contribute more later.
01:09:48.980 | - So I don't know if you can say something insightful
01:09:52.340 | to this question, but what do you think
01:09:54.820 | makes a really hard math problem on this Olympiad,
01:09:58.660 | maybe in the courses you teach or in general,
01:10:01.940 | what makes for a hard problem?
01:10:04.220 | You've seen, I'm sure, a lot of really difficult problems.
01:10:07.060 | What makes a hard problem?
01:10:08.620 | - So I could quantify it by the number of leaps of insight,
01:10:12.940 | of changes of perspective that are along the way.
01:10:15.220 | And here's why.
01:10:16.460 | This is like a very theoretical computer science
01:10:18.300 | way of looking at it.
01:10:19.300 | Okay, it's that each reframing of the problem
01:10:23.260 | and using of some tool,
01:10:24.580 | I actually call that a leap of insight.
01:10:26.100 | When you say, oh, wow, now I see I should kind
01:10:29.020 | of put these plugs into those sockets, like so.
01:10:32.620 | And suddenly I get to use that machine.
01:10:34.540 | Oh, but I'm not done yet.
01:10:35.780 | Now I need to do it again.
01:10:37.020 | Each such step is a large possible,
01:10:39.820 | large fan out in the search space.
01:10:41.900 | The number of these tells you the exponent.
01:10:44.860 | The base of the exponent is like how big,
01:10:47.380 | how many different possibilities you could try.
01:10:50.220 | And that's actually why,
01:10:51.940 | like if you have a three-insight problem,
01:10:55.020 | that is not three times as hard as a one-insight problem,
01:10:58.380 | because after you've made the one insight,
01:10:59.900 | it's not clear that that was the right track necessarily.
01:11:03.740 | Well, unless you're-
01:11:04.580 | - It's a branching of possible, yeah.
01:11:07.180 | (laughs)
01:11:09.260 | You're saying there are problems on the math Olympiad
01:11:12.900 | that require more than one insight?
01:11:14.340 | - Yes, those are the hard ones.
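A rough gloss on the arithmetic here: if each insight is a choice among roughly $b$ candidate reframings and a hard problem needs $k$ of them strung together, the search space scales like

$$b^k, \qquad \text{e.g. with an assumed } b = 20:\quad 20^1 = 20, \qquad 20^3 = 8000,$$

so under these made-up numbers a three-insight problem is hundreds of times harder than a one-insight one, not three times harder.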
01:11:15.700 | And also I can tell you how you can tell.
01:11:18.100 | So this is how I also taught myself math
01:11:20.300 | when I was in college.
01:11:21.220 | So, not exactly taught myself,
01:11:24.060 | I was taking classes, of course,
01:11:25.380 | but I was trying to read the textbook
01:11:27.220 | and I found out I was very bad at reading math textbooks.
01:11:29.980 | A math textbook has a long page of stuff that is all true,
01:11:33.260 | which after you read the page,
01:11:34.540 | you have no idea what you just read.
01:11:36.140 | - Yeah.
01:11:37.420 | - This is just-
01:11:38.260 | - A good summary of a math textbook.
01:11:39.660 | - Okay, yeah, because it's not clear
01:11:42.700 | why anything was done that way.
01:11:44.660 | And yes, everything is true,
01:11:45.900 | but how the heck did anyone think of that?
01:11:47.860 | So the way that I taught myself math eventually was,
01:11:51.380 | the way I read a math textbook is I would look at
01:11:54.580 | the theorem statement,
01:11:56.140 | I would look at the length of the proof,
01:11:58.980 | and then I would close the book
01:12:00.060 | and attempt to reprove it myself.
01:12:01.340 | - Yeah, that's brilliant.
01:12:03.580 | - The length of the proof
01:12:04.980 | is telling you the number of insights,
01:12:06.820 | because the length of the proof
01:12:08.060 | is linear in the number of insights.
01:12:10.820 | Each insight takes space.
01:12:12.220 | - Yeah.
01:12:13.060 | - And if I know that it's a short proof,
01:12:14.660 | I know that there's only one insight.
01:12:16.140 | So when I'm doing my own way of solving the problem,
01:12:19.020 | like finding the proof,
01:12:20.540 | I quit if I have to do too many plug-ins.
01:12:23.140 | It's equivalent to a math contest.
01:12:24.900 | In a math contest, I look, is it problem one, two, or three?
01:12:27.220 | That tells me how many insights there are.
01:12:28.940 | This is exactly what I did.
01:12:29.980 | - That's brilliant.
01:12:31.100 | Linear in the number.
01:12:32.100 | I don't know.
01:12:32.940 | I think, oh, it's possible that that's true.
01:12:36.380 | - Approximately, approximately.
01:12:37.220 | - Approximately, yeah.
01:12:38.500 | I don't know, it would be,
01:12:39.780 | somebody out there is gonna try to formally prove this.
01:12:43.060 | - Oh, no, I mean, you're right.
01:12:44.180 | There are cases where maybe it's not quite linear,
01:12:46.100 | but in general--
01:12:47.060 | - Well, some of it is notation too,
01:12:48.380 | and some of it is style, and all those kinds of things,
01:12:50.940 | but within a textbook.
01:12:51.900 | - Within the same book.
01:12:52.780 | - Within the same book with the same--
01:12:53.900 | - Yes, within the same book on the same subject.
01:12:56.180 | - Yeah.
01:12:57.020 | - This is what I was using.
01:12:57.860 | - That's hilarious.
01:12:58.780 | - Because you know, if it's a two-page proof,
01:13:00.420 | you just know this is gonna be insane, right?
01:13:02.780 | - That's the scary thing about insights.
01:13:06.940 | You look at, like, Andrew Wiles
01:13:08.220 | working on Fermat's Last Theorem,
01:13:10.180 | is you don't know.
01:13:13.620 | Something seems like a good idea,
01:13:16.220 | and you have that idea,
01:13:17.300 | and it feels like this is a leap,
01:13:19.260 | like a totally new way to see it,
01:13:22.300 | but you have no idea if it's at all useful.
01:13:25.780 | Even if you think it's correct,
01:13:27.020 | you have no idea if this is going to go down a path
01:13:30.140 | that's completely counterproductive
01:13:32.500 | or not productive at all.
01:13:34.580 | That's the crappy thing about invention,
01:13:37.860 | like I have, I'm sure you do,
01:13:41.140 | I have a lot of really good ideas every single day,
01:13:44.740 | and I'll go along them inside my head,
01:13:49.660 | along that little trajectory,
01:13:52.100 | but it could be just a total waste.
01:13:54.700 | And it's, you know what that feels like?
01:13:57.220 | It just feels like patience is required,
01:13:59.500 | not to get excited at any one thing.
01:14:01.820 | - So I think this is interesting
01:14:03.340 | because you raised Andrew Wiles,
01:14:04.820 | he spent seven years attacking the same thing, right?
01:14:08.380 | And so I think that what attracts
01:14:10.340 | professional researchers to this
01:14:12.300 | is because even though it's very painful
01:14:14.900 | that you keep fighting with something,
01:14:16.700 | when you finally find the right insights
01:14:19.500 | and string them together, it feels really good.
01:14:23.140 | - Well, there's also like short-term,
01:14:26.060 | it feels good to, whether it's real or not,
01:14:31.060 | to pretend like you've solved something
01:14:33.540 | in the sense like you have an insight
01:14:35.660 | and there's a sense like this might be the insight
01:14:37.940 | that solves it.
01:14:38.980 | So at least for me, I just enjoy that rush of positivity,
01:14:43.980 | even though I know statistically speaking
01:14:46.460 | it's probably going to be a dead end.
01:14:48.500 | - I'm the same way, I'm the same way.
01:14:49.820 | In fact, that's how I know whether I might want
01:14:52.460 | to keep thinking about this general problem.
01:14:54.340 | It's like, if I still see that I'm getting some insights,
01:14:57.300 | I'm not at a dead end yet.
01:14:59.020 | But that's also where I learned something
01:15:00.660 | from my PhD advisor.
01:15:01.940 | Actually, he was a real big inspiration on my life.
01:15:04.260 | His name is Benny Sudakov.
01:15:05.740 | In fact, he grew up in the former Soviet Union.
01:15:08.100 | He was from Georgia, but he's an incredible person.
01:15:12.260 | But one thing I learned was choose the problems to work on
01:15:16.660 | that might matter if you succeed.
01:15:21.100 | Because that's why, for example, we dug into COVID.
01:15:23.420 | It was just, well, suppose we succeed
01:15:25.820 | in finding some interesting insight here.
01:15:27.860 | Well, it actually matters.
01:15:29.140 | Then it's worthwhile.
01:15:31.020 | - Yeah, and I think COVID, the way you're approaching COVID
01:15:36.020 | has two interesting possibilities.
01:15:38.260 | One, it might help with COVID or another pandemic.
01:15:41.740 | But two, I mean, just this whole network theory space,
01:15:46.740 | you might unlock some deep understanding
01:15:51.240 | about the interaction with human beings
01:15:53.100 | that might have nothing to do with the pandemic.
01:15:55.900 | There's a space of possible impacts
01:15:58.380 | that may be direct or indirect.
01:16:00.460 | And the same thing is with Andrew Wiles' proof.
01:16:03.300 | I don't understand, but apparently the pieces of it
01:16:08.140 | are really impactful for mathematics,
01:16:12.060 | even if the main theorem is not.
01:16:14.900 | So along the way, the insights you have
01:16:18.300 | might be really powerful for unexpected reasons.
01:16:22.960 | - So I like what you said.
01:16:23.820 | This is something that I learned from another friend of mine
01:16:26.180 | who's also, he's a very famous researcher.
01:16:28.100 | All these people are more famous than I am.
01:16:30.020 | His name is Jacob Fox.
01:16:30.980 | He's Jacob Fox at Stanford.
01:16:32.220 | Also a very big inspiration for me.
01:16:33.820 | We were both grad students together at the same time.
01:16:36.060 | - Well, most importantly,
01:16:36.880 | you're good at selecting good friends.
01:16:38.260 | - Ah, yeah, well, that's the key.
01:16:39.940 | You gotta find good people to learn things from.
01:16:41.980 | But his thing was, he often said,
01:16:44.340 | if you solve a math problem and have this math proof,
01:16:46.780 | math problem for him is like a proof, right?
01:16:48.900 | So suppose you came up with this proof.
01:16:50.300 | He always asks, what have we learned from this
01:16:53.400 | that we could potentially use for something else?
01:16:56.000 | It's not just, did you solve the problem
01:16:58.040 | that was supposed to be famous?
01:16:59.400 | It was, and is there something new
01:17:01.520 | in the course of solving this that you had to invent
01:17:04.560 | that we could now use as a tool elsewhere?
01:17:06.680 | - Yeah, there's this funny effect
01:17:08.880 | where just looking at different fields
01:17:12.680 | where people discover parallels.
01:17:15.000 | They'll prove something, it'll be a totally new result,
01:17:17.520 | and then somebody later realizes
01:17:19.040 | this was already done 30 years ago
01:17:20.660 | in another discipline in another way.
01:17:23.360 | And it's really interesting.
01:17:25.080 | We talked about this offline, with another illustration you showed to me.
01:17:30.220 | It's interesting to see the different perspectives
01:17:33.900 | on a problem.
01:17:35.560 | It kind of points like there's just very few novel ideas,
01:17:40.560 | that everything else, that most of us
01:17:43.280 | are just looking at different perspective on the same idea.
01:17:47.200 | And it makes you wonder this old silly question
01:17:51.540 | that I have to ask you is,
01:17:53.840 | do you think mathematics is discovered or invented?
01:18:02.560 | Do you think we're creating new ideas,
01:18:02.560 | we're building a set of knowledge
01:18:05.160 | that's distinct from reality,
01:18:09.180 | or are we actually, is math almost like a shovel
01:18:13.440 | where we're digging to this core set of truths
01:18:16.800 | that were always there all along?
01:18:19.960 | - So I personally feel like it's discovered,
01:18:22.760 | but that's also because I guess the way
01:18:24.680 | that I like to choose what questions to work on
01:18:27.360 | are questions that maybe we'll get to learn something
01:18:29.960 | about why is this hard?
01:18:32.160 | I mean, I'm often attracted to questions
01:18:33.840 | that look simple, but are hard, right?
01:18:36.840 | And what could you possibly learn from that?
01:18:38.640 | Sort of like probably the attraction
01:18:40.240 | of Fermat's last theorem, as you mentioned,
01:18:42.640 | simple statement, why is it so hard?
01:18:44.920 | So I'm more on the discovered side.
01:18:47.480 | And I also feel like if we ever ran
01:18:49.320 | into an intelligent other species in the universe,
01:18:53.060 | probably if we compared notes,
01:18:56.920 | there might be some similarities
01:18:58.880 | between both of us realizing that pi is important.
01:19:02.200 | Because you might say, why?
01:19:03.560 | Why humans, do humans like circles more than others?
01:19:06.000 | I think stars also like circles.
01:19:08.040 | I think planets like circles.
01:19:09.800 | They're not perfect circles,
01:19:10.840 | but nevertheless, the concept of a circle
01:19:13.080 | is just point and constant distance.
01:19:15.640 | Doesn't get any simpler than that.
01:19:17.280 | - It's possible that like an alien species
01:19:19.880 | will have, depending on different cognitive capabilities
01:19:23.160 | and different perception systems,
01:19:24.920 | will be able to see things
01:19:28.080 | that are much different than circles.
01:19:30.320 | And so if it's discovered, it will still be pointing
01:19:34.600 | at a lot of same geometrical concepts,
01:19:37.200 | mathematical concepts, but it's interesting to think
01:19:42.280 | of how many things we would have to still align,
01:19:45.680 | not just based on notation, but based on understanding,
01:19:48.480 | like some basic mathematical concepts,
01:19:53.480 | like how much work is there going to be
01:19:56.840 | in trying to find a common language?
01:19:59.200 | I mean, this is, I think Stephen Wolfram,
01:20:01.680 | I mean, his son helped with the movie "Arrival,"
01:20:04.560 | like developing an alien language,
01:20:07.200 | like how would aliens communicate with humans?
01:20:10.600 | It's fascinating 'cause like math seems
01:20:12.900 | to be the most promising thing, but even like math,
01:20:15.880 | like how do you visualize mathematical ideas?
01:20:20.880 | It feels like there has to be an interactive component,
01:20:24.620 | just like we have a conversation.
01:20:26.520 | There has to be, this is something we don't, I think,
01:20:29.080 | think about often, which is like,
01:20:31.420 | with somebody who doesn't know anything about math,
01:20:33.840 | doesn't know anything about English
01:20:35.440 | or any other natural language, how would we describe,
01:20:40.080 | we talked offline about visual proofs.
01:20:42.400 | How would we, through visual proofs, have a conversation
01:20:47.400 | where we say something, here's the concept,
01:20:50.160 | the way we see it, does that make sense to you?
01:20:53.780 | And like, can you mess with that concept
01:20:57.400 | to make it sense for you?
01:20:58.840 | And then go back and forth in this kind of way.
01:21:01.440 | So purely through mathematics, I'm sure it's possible
01:21:03.760 | to have those kinds of experiments with like tribes
01:21:05.880 | on earth that don't, there's no common language.
01:21:08.400 | Through math, like draw a circle
01:21:10.840 | and see what they do with it, right?
01:21:13.200 | Do some of these visual proofs,
01:21:15.640 | like how the summation of the odds adds up to the squares.
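The visual proof referenced here is the classic gnomon picture: the $n$-th odd number is exactly the L-shaped border that grows an $(n-1)\times(n-1)$ square into an $n\times n$ square, so

$$1 + 3 + 5 + \cdots + (2n-1) \;=\; \sum_{k=1}^{n}(2k-1) \;=\; n^2.$$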
01:21:19.720 | Yes, I wonder how difficult that is
01:21:21.960 | before one or the other species murders the other.
01:21:24.320 | (all laughing)
01:21:26.120 | - That's a good question.
01:21:27.720 | - I hope that the curiosity for knowledge
01:21:29.760 | will overpower the greed.
01:21:31.480 | This is back to our game theory thing,
01:21:33.520 | that the curiosity of like discovering math together
01:21:37.260 | will overpower the desire for resources
01:21:40.040 | and ultimately like, you know,
01:21:42.840 | willing to commit violence in order to gain those resources.
01:21:46.440 | I think as we progress,
01:21:47.920 | become more and more intelligent as a species,
01:21:50.080 | I'm hoping we would value more and more of the knowledge
01:21:53.320 | because we'll come up with clever ways
01:21:54.880 | to gain more resources so we won't be so resource starved.
01:21:58.160 | I don't know.
01:21:59.000 | That's a hopeful message for when we finally meet aliens.
01:22:01.240 | - Yeah, yeah.
01:22:02.080 | - See, the cool thing about the math Olympiad,
01:22:07.100 | I don't know if you know the work from Francois Chollet
01:22:11.360 | from Google, he came up with this kind of IQ test
01:22:16.360 | that kind of has similar aspects
01:22:20.920 | to what the math Olympiad does, for AI.
01:22:25.920 | So he came up with these tests
01:22:27.920 | where they're very simple for humans,
01:22:31.120 | but very difficult for AI to illustrate exactly
01:22:34.340 | why we're just not good at seeing a totally new problem.
01:22:39.080 | We, sorry, AI systems are not good at looking
01:22:45.120 | at a new problem that requires you to detect
01:22:48.720 | that there's a symmetry of some kind,
01:22:50.560 | or there's a pattern that hasn't seen before.
01:22:55.560 | The pattern is like obvious to us humans,
01:22:58.760 | but it's not so obvious to find that kind of,
01:23:01.560 | you're inventing a pattern that's there
01:23:04.260 | in order to then find a solution.
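For a concrete sense of the kind of structure these puzzles hinge on, here is a toy symmetry check on a small grid; a made-up illustration of the flavor of such a task, not Chollet's actual benchmark code:

```python
def symmetries(grid):
    """Report which simple symmetries a square grid of values has."""
    n = len(grid)
    found = []
    if all(row == row[::-1] for row in grid):
        found.append("left-right mirror")
    if grid == grid[::-1]:
        found.append("top-bottom mirror")
    if all(grid[r][c] == grid[c][r] for r in range(n) for c in range(n)):
        found.append("transpose (diagonal)")
    return found

grid = [[1, 0, 1],
        [0, 2, 0],
        [1, 0, 1]]
print(symmetries(grid))
# ['left-right mirror', 'top-bottom mirror', 'transpose (diagonal)']
```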
01:23:08.500 | I don't know if you can comment on,
01:23:14.220 | but from an AI perspective
01:23:16.280 | and from a math problem perspective,
01:23:19.420 | what do you think is intelligence?
01:23:22.140 | What do you think is the thing that allows us
01:23:24.380 | to solve that problem?
01:23:25.740 | And how hard is it to build a machine to do that?
01:23:29.540 | Asking for a friend.
01:23:30.580 | - Yeah, well, so I guess, you see,
01:23:33.000 | because if I just think of the raw search space, it's huge.
01:23:35.920 | That's why you can't do it.
01:23:37.000 | And if I think about what makes somebody good
01:23:38.960 | at doing these things, they have this heuristic sense.
01:23:42.320 | It's almost like a good chess player of saying,
01:23:44.320 | let's not keep analyzing down this way
01:23:45.960 | because there's some heuristic reason
01:23:47.600 | why that's a bad way to go.
01:23:49.240 | Where did they get that heuristic from?
01:23:50.520 | Now, that's a good question.
01:23:51.760 | I don't know.
01:23:53.000 | Because that, if you asked them to explain to you,
01:23:56.680 | they could probably say something in words
01:23:58.200 | that sounds like it makes sense,
01:23:59.640 | but I'm guessing that's only a part
01:24:01.260 | of what's really going on in their brain
01:24:03.020 | of evaluating that position.
01:24:04.780 | You know what I mean?
01:24:05.620 | If you ask Garry Kasparov, what is good,
01:24:06.820 | or why is this position good?
01:24:08.620 | He will say something.
01:24:10.220 | But probably not approximating everything
01:24:12.540 | that's going on inside.
01:24:14.020 | So there's basically a function being computed.
01:24:16.900 | But it's hard to articulate what that function is.
01:24:19.180 | Now, the question is, could a computer get as good
01:24:21.740 | at computing these kinds of heuristic functions?
01:24:24.220 | Maybe.
01:24:25.060 | I'm not enough of an expert to understand,
01:24:27.860 | but one bit of me has always been a little bit curious
01:24:30.320 | of whether or not the human brain has a particular tendency
01:24:34.180 | due to its wiring to come up with certain kinds of things,
01:24:37.480 | which is just natural due to the way
01:24:39.520 | that the topology of the neurons and whatever is there,
01:24:43.520 | for which if you tried to just build from scratch
01:24:46.120 | a computer to do it,
01:24:47.280 | would it naturally have different tendencies?
01:24:49.640 | I don't know.
01:24:50.480 | This is just me being completely ignorant
01:24:52.240 | and just saying a few ideas.
01:24:53.640 | - Well, this is a good thing that mathematics shows
01:24:56.160 | is we don't have to be, so math and physics
01:24:59.600 | or mathematical physics operates in a world
01:25:01.840 | that's different than the one our evolved brains operate in.
01:25:06.840 | So it allows us to have multiple, many, many dimensions.
01:25:11.980 | It allows us to work on weird surfaces.
01:25:15.520 | Like, topology as a discipline is just weird to me.
01:25:20.040 | It's really complicated,
01:25:21.220 | but it allows us to work in that space,
01:25:23.600 | differential geometry and all those kinds of things
01:25:25.920 | where it's totally outside of our natural day-to-day
01:25:30.640 | four-dimensional experience,
01:25:33.360 | three spatial dimensions plus time.
01:25:35.400 | So math gives me hope that we can see,
01:25:40.080 | we can discover the processes of intelligence
01:25:46.320 | outside the limited nature of our own human experiences.
01:25:52.000 | But you said that you're not an expert.
01:25:55.960 | It's kind of funny.
01:25:57.240 | I find that we know so little about intelligence
01:26:02.240 | that I think, I honestly think like almost children
01:26:06.800 | are more expert at creating artificial intelligence systems
01:26:11.400 | than adults.
01:26:14.400 | I feel like we know so little,
01:26:15.780 | we really need to think outside the box.
01:26:18.200 | And those little, I think people should check out
01:26:22.120 | Francois Chollet's little tests,
01:26:24.640 | but even just solving math problems,
01:26:27.080 | I don't know if you've ever done this for yourself,
01:26:30.560 | but when you solve a math problem,
01:26:33.400 | you kind of then trace back and try to figure out
01:26:38.160 | where did that idea come from?
01:26:39.840 | Like, what was I visualizing in my head?
01:26:45.280 | How did I start visualizing it that way?
01:26:47.340 | Why did I start rotating that cube in my head in that way?
01:26:52.280 | Like, what is that?
01:26:53.120 | If I were to try to build a program that does that,
01:26:55.520 | where did that come from?
01:26:56.920 | - So this is interesting.
01:26:58.200 | So I tried to do this to teach middle school students
01:27:02.760 | how to learn how to create and think and invent.
01:27:05.640 | And the way I do it is there are
01:27:07.360 | these math competition problems,
01:27:09.000 | and I'm working in collaboration
01:27:10.600 | with the people who run those,
01:27:12.080 | and I will turn on my YouTube Live,
01:27:14.120 | and for the first time look at those questions
01:27:16.440 | and live solve them.
01:27:17.520 | The reason I do this is to let the middle school students
01:27:21.240 | and the high school students and the adults,
01:27:22.480 | whoever wants to watch,
01:27:23.480 | just see what exactly goes on through someone's head
01:27:27.440 | as they go and attempt to invent what they need to do
01:27:30.480 | to solve the question.
01:27:32.120 | So I've actually thought about that.
01:27:34.560 | I think that, first of all, as a teacher,
01:27:37.640 | I think about that because whenever I want to explain
01:27:40.000 | to a student how to do something,
01:27:42.080 | I want to explain how it made sense,
01:27:44.160 | why it's intuitive to do the following things,
01:27:46.200 | and why the wrong things are wrong,
01:27:48.720 | not just by this one short, fast way.
01:27:51.900 | But why this is the right way, if that makes sense.
01:27:54.680 | So my point is I'm actually always thinking about that.
01:27:57.280 | Like, how would you think about these things?
01:27:58.960 | And then I eventually decided the easiest way to expose this
01:28:01.800 | would just be to go live on YouTube and just say,
01:28:05.200 | I've never seen any of these questions before, here we go.
01:28:07.840 | - Don't you get, that's anxiety inducing for me.
01:28:12.760 | Don't you get trapped in a kind of like
01:28:16.200 | little dead ends of confusion,
01:28:18.760 | even on middle school problems?
01:28:20.400 | - Yes, that's what the comments are for.
01:28:22.020 | The live comments come in and students say, try this.
01:28:24.640 | - Oh, wow.
01:28:25.640 | - It's actually pretty good.
01:28:26.680 | And I'll never get stuck.
01:28:27.800 | I mean, I'm willing to go on camera and say,
01:28:30.480 | guess what, Po-Shen Loh can't do this, that's fine.
01:28:33.180 | But then what ends up happening is you will then see
01:28:35.800 | how maybe somebody's saying something,
01:28:37.360 | and I look at the chat and I say, aha,
01:28:39.360 | that actually looks useful.
01:28:40.680 | Now that also shows how not all ideas,
01:28:44.080 | not all suggestions have the same power, if that makes sense.
01:28:46.920 | Because if I actually do get stuck,
01:28:48.180 | I'll go fishing through the chat.
01:28:49.440 | - You haven't got any ideas.
01:28:50.840 | - I don't know if you can speak to this,
01:28:53.340 | but is there a moment for the middle school students,
01:28:57.400 | maybe high school as well,
01:28:59.720 | where there's like a turning point for them
01:29:04.400 | where they maybe fall in love with mathematics
01:29:07.860 | or they get it?
01:29:09.800 | Is there something to be said about like discovering
01:29:13.440 | that moment and trying to grab them,
01:29:16.400 | to get them to understand that mathematics,
01:29:19.960 | no matter what they want to do in life,
01:29:21.560 | could be part of their life?
01:29:23.120 | - Yes, I actually do think that the middle school
01:29:25.520 | is exactly the right time,
01:29:26.840 | because that's the place where your mathematical
01:29:29.280 | understanding gets just sophisticated enough
01:29:32.520 | that you can start doing interesting things.
01:29:34.720 | Because if you're early on in counting,
01:29:37.680 | I'm honestly not very good at teaching you new insights.
01:29:40.760 | My wife is pretty good at that.
01:29:41.960 | But somehow once you get to this part
01:29:44.200 | where you know what a fraction is,
01:29:45.760 | and when you know how to add and how to multiply
01:29:49.600 | and what the area of a triangle is,
01:29:51.240 | at that point to me, the whole world opens up
01:29:54.120 | and you can start observing
01:29:55.240 | there are really nifty coincidences,
01:29:57.360 | the things that made the Greek mathematicians
01:29:59.880 | and the ancient mathematicians excited.
01:30:02.120 | Actually back then it was exciting
01:30:03.880 | to discover the Pythagorean theorem.
01:30:05.720 | It wasn't just homework.
01:30:06.920 | - So is there, which discipline do you think
01:30:11.800 | has the most exciting coincidences?
01:30:14.120 | So is it geometry?
01:30:16.360 | Is it algebra?
01:30:17.960 | Is it calculus?
01:30:21.520 | - Well, you see, you're asking me
01:30:22.680 | and I'm the guy who gets the most excited
01:30:24.520 | when the combinatorics shows up in the geometry.
01:30:27.280 | (laughs)
01:30:28.480 | - Is it?
01:30:29.360 | Okay, so it's the combinatorics in the geometry.
01:30:33.080 | So first of all, the nice thing about geometry,
01:30:35.240 | this is the same nice thing about computer vision,
01:30:37.880 | is it's visual.
01:30:39.120 | So geometry, you can draw circles and triangles and stuff.
01:30:42.720 | So it naturally presents itself to the visual proof.
01:30:47.720 | But also the nice thing about geometry,
01:30:51.160 | I think, for me, is that it's the earliest class,
01:30:56.160 | the earliest discipline
01:30:59.720 | that's most amenable to the exploration,
01:31:02.600 | the invention, the proofs.
01:31:04.960 | The idea of proofs I think is most easily shown in geometry
01:31:09.640 | 'cause it's so visual, I guess.
01:31:12.280 | So that to me is like, if I were to think about
01:31:15.440 | when I first fell in love with math, it would be geometry.
01:31:18.340 | And sadly enough, that's underused.
01:31:21.240 | Geometry appears only briefly
01:31:25.020 | in the journey of a student, and it kind of disappears,
01:31:30.020 | not returning until much later,
01:31:32.280 | maybe as differential geometry.
01:31:36.560 | I don't know where else it shows up.
01:31:37.680 | For me, in computer science,
01:31:39.760 | you could start to think about like computational geometry
01:31:43.520 | or even graph theory as a kind of geometry.
01:31:45.520 | You could start to think about it visually,
01:31:47.180 | although it's pretty tricky.
01:31:49.300 | But yeah, it was always, that was the most beautiful one.
01:31:53.000 | Everything else, I guess calculus can be kind of visual too.
01:31:56.200 | That can be pretty beautiful.
01:31:57.800 | But is there something you try to look for in the student
01:32:05.360 | to see like, how can I inspire them at this moment?
01:32:09.920 | Or is this like individual student to student?
01:32:12.160 | Is there something you could say there?
01:32:13.880 | - So first of all, I really think that every student
01:32:16.440 | can pick up all of this skill.
01:32:17.880 | I really do think so.
01:32:18.720 | I don't think it's something only for a few.
01:32:20.520 | And so if I'm looking for a student,
01:32:23.680 | actually oftentimes if I'm looking at a particular student,
01:32:26.800 | the question is, how can we help you feel
01:32:30.320 | like you have the power to invent also?
01:32:32.880 | Because I think a lot of people are used
01:32:34.640 | to thinking about math as something
01:32:36.080 | where the teacher will show you what to do
01:32:37.960 | and then you will do it.
01:32:39.600 | So I think that the key is to show that they have some,
01:32:42.120 | let them see that they have some power to invent.
01:32:44.240 | And at that point, it's often starting
01:32:45.960 | by trying to give a question that they don't know how to do.
01:32:48.720 | You want to find these questions
01:32:49.860 | that they don't know how to do,
01:32:51.120 | that they can think about and then they can solve.
01:32:54.480 | And then suddenly they say, my gosh, I've had a situation.
01:32:57.800 | I've had an experience where I didn't know what to do.
01:33:00.420 | And after a while I did.
01:33:03.320 | - Is there advice you can give on how to learn math
01:33:07.800 | for people, whether it's middle school,
01:33:10.680 | whether it's somebody as an adult
01:33:14.480 | kind of gave up on math maybe early on?
01:33:18.060 | - I actually think that these math competition problems,
01:33:22.000 | middle school and high school are really good.
01:33:23.760 | They're actually very hard.
01:33:25.040 | So if you haven't had this kind of experience before
01:33:29.120 | and you grab a middle school math competition problem
01:33:32.920 | from the state level, which is used to decide
01:33:35.000 | who represents the state in the country,
01:33:37.000 | in the United States, for example, those are pretty tricky.
01:33:40.640 | And even if you are a professional,
01:33:43.540 | maybe not doing mathematical things
01:33:45.480 | and you're not a middle school student, you'll struggle.
01:33:48.260 | So I find that these things really do teach you things
01:33:51.080 | by trying to work on these questions.
01:33:53.080 | - Is there a Googleable term that you can use
01:33:56.760 | for the organization for the state competitions?
01:33:59.440 | - Ah, yeah.
01:34:00.260 | So there are a number of different ones
01:34:02.280 | that are quite popular.
01:34:03.600 | One of them is called Math Counts, M-A-T-H-C-O-U-N-T-S.
01:34:07.760 | And that's a big tournament,
01:34:08.880 | which actually has a state level.
01:34:10.480 | There's also mathleague.org, which
01:34:15.360 | also has this kind of tiered tournament structure.
01:34:18.660 | There's also the American Math Competitions, AMC8.
01:34:22.600 | AMC also has AMC10, that's for 10th grade and below,
01:34:25.920 | and AMC12.
01:34:27.240 | These are all run by the Mathematical Association of America.
01:34:30.260 | And these are all ways to find old questions.
01:34:32.880 | - What about the daily challenges that you run?
01:34:35.160 | What are those about?
01:34:36.000 | - We do that too.
01:34:36.820 | But I mean, the difference was, that one's not free.
01:34:39.920 | So I should actually probably be careful.
01:34:42.080 | Not all of the things I've just mentioned
01:34:44.280 | are free either.
01:34:46.760 | - People can figure out what is free and what's not.
01:34:48.760 | But this is really nice to know what's out there.
01:34:51.040 | But can you speak a little bit to the daily challenges?
01:34:53.720 | - Sure, sure.
01:34:54.560 | So that's actually what we did when,
01:34:56.800 | I guess I was thinking about,
01:34:58.480 | how would I try to develop that skill in people
01:35:02.060 | if we had the power to architect the entire system ourselves?
01:35:05.380 | So that's called the Daily Challenge with Po-Shen Loh.
01:35:07.420 | It's not free because that's actually how I pay
01:35:09.720 | for everything else I do.
01:35:11.240 | So that was the idea.
01:35:12.860 | But the concept was, aha, now let's invent from scratch.
01:35:16.500 | So if we're going to go from scratch
01:35:17.980 | and we're going to use technology,
01:35:19.700 | what if we made every single lesson
01:35:22.060 | something where first I say,
01:35:24.700 | hey, here's an interesting question.
01:35:25.820 | Recorded, of course, not live.
01:35:27.100 | But it's like, I say, hey, here's an interesting question.
01:35:28.660 | Why don't we think about this?
01:35:29.880 | But I know you don't know how to do it.
01:35:32.000 | So now you think, and a minute later,
01:35:33.740 | a hint pops on the screen.
01:35:35.500 | But you still think, and a minute later,
01:35:37.000 | a big hint pops on the screen, and you still think.
01:35:39.260 | And then finally, after the three minutes,
01:35:41.260 | hopefully you got some ideas, you try to answer.
01:35:43.700 | And then suddenly there's this pretty extended explanation
01:35:47.300 | of, oh yeah, so here's multiple different ways
01:35:50.300 | that you can do the question.
01:35:51.580 | And by accident, you also just learned this other concept.
01:35:54.660 | That's what we did.
01:35:55.500 | So yeah.
01:35:56.340 | - Is it targeted towards middle school students,
01:35:58.140 | high school students?
01:35:59.500 | - It's targeted towards middle school students
01:36:01.340 | with competitions, but there's a lot of high school students
01:36:04.060 | who didn't do competitions in middle school,
01:36:06.420 | where they would also learn how to think.
01:36:07.860 | You see, the whole concept was,
01:36:09.820 | can we teach people how to think?
01:36:11.700 | How would you do that?
01:36:12.760 | You need to give people the chance to, on their own,
01:36:15.640 | invent without that kid in the front row
01:36:17.940 | answering every question in two seconds.
01:36:20.340 | - And people can find it, I think, what, daily.--
01:36:24.040 | - It's daily.poshenloh.com.
01:36:26.100 | But if you go to find my website,
01:36:27.860 | you'll be able to find it.
01:36:29.220 | - Beautiful.
01:36:30.620 | Can we zoom out a little bit?
01:36:31.940 | And so day to day, week to week, month to month,
01:36:36.260 | year to year, what does the lifelong educational process
01:36:41.260 | look like, do you think?
01:36:42.820 | For yourself, but for me, what would you recommend
01:36:48.460 | in the world of mathematics?
01:36:50.260 | Or sort of as opposed to studying for a test,
01:36:52.620 | but just like lifelong expanding of knowledge
01:36:57.620 | in that skill for invention?
01:37:02.100 | - I think I often articulate this as,
01:37:05.020 | can you always try to do more
01:37:07.740 | than you could do in the past?
01:37:09.480 | - Yeah.
01:37:11.660 | - But that comes in many ways.
01:37:13.660 | And I will say it's great if one wants to build that
01:37:16.620 | with mathematics, but it's also great to use
01:37:19.300 | that philosophy with all other things.
01:37:21.060 | In fact, if I just think of myself,
01:37:23.060 | I just think, what do I know now that I didn't know
01:37:25.820 | a year ago or a month ago or a week ago?
01:37:28.500 | And not just know, but like,
01:37:29.740 | what do I have the capability of doing?
01:37:31.300 | - Yes.
01:37:32.140 | - And if you just have that attitude, it brings more.
01:37:35.020 | - See, the thing is, there's also a habit,
01:37:38.260 | like it is a skill, like I've been using Anki,
01:37:43.020 | it's an app that helps you memorize things.
01:37:46.640 | And I've actually, a few months ago,
01:37:50.720 | started doing this daily of setting aside time
01:37:54.980 | to think about an idea that's outside of my work.
01:37:59.980 | Let's say, it's all over the place, by the way,
01:38:04.680 | but let's say politics, like gun control.
01:38:07.160 | Is it good to have a lot of guns or not in society?
01:38:11.800 | And just, I've set aside time every day.
01:38:15.020 | I do at least 10 minutes, but I try to do 30,
01:38:17.640 | where I think about a problem.
01:38:19.000 | And I kind of outline it for myself from scratch,
01:38:20.920 | from not looking anything up,
01:38:22.200 | just thinking about it using common sense.
01:38:24.960 | And I think the practice of that is really important.
01:38:29.120 | It's the daily routine of it.
01:38:31.040 | It's the discipline of it.
01:38:32.560 | It's not just that I figured something out
01:38:35.860 | from that thinking about gun control.
01:38:38.760 | It's more that that muscle is built too.
01:38:43.040 | It's that thinking muscle.
01:38:44.140 | So I'm kind of interested in, you know, math has,
01:38:49.140 | especially because I've gotten specialized
01:38:52.040 | in machine learning
01:38:53.020 | and because I love programming so much,
01:38:55.320 | I've lost touch with math a little bit
01:38:59.720 | to where I feel quite sad about it.
01:39:02.360 | And I want to fix that.
01:39:03.900 | Even just not math, like pure knowledge math,
01:39:07.480 | but math like these middle school problems,
01:39:10.160 | the challenges, right?
01:39:13.320 | Is that something you see a person be able to do
01:39:15.260 | every single day, kind of just practice
01:39:17.320 | every single day for years?
01:39:19.560 | - So I can give an answer to that
01:39:21.480 | that gives a practical way you could do it,
01:39:23.120 | assuming you have kids.
01:39:24.400 | (laughing)
01:39:25.920 | No, no, you can do it yourself.
01:39:26.760 | - Okay, step one, get kids.
01:39:27.960 | - No, no, I'm just saying this
01:39:29.600 | because I'm just thinking out loud right now.
01:39:31.720 | What could I do?
01:39:32.560 | Or what could I do to suggest?
01:39:33.740 | Because what I have noticed is that, for example,
01:39:36.000 | if you do have kids who are in elementary school
01:39:37.800 | or middle school, if you yourself go and look
01:39:41.120 | at those middle school math problems,
01:39:43.120 | to think about interesting ways that you can teach
01:39:45.320 | your elementary school or middle school kid, it works.
01:39:48.200 | That's what my wife did.
01:39:49.040 | She never did any of those contests before,
01:39:51.120 | but now she knows quite a lot about them.
01:39:53.080 | I didn't teach her anything.
01:39:53.960 | I don't do that.
01:39:55.200 | She just was messing around with them
01:39:57.360 | and taught herself all of that stuff.
01:39:59.640 | And that had the automatic daily built in.
01:40:01.440 | I'm always thinking, how do you make it practical, right?
01:40:03.880 | And the way to make it practical is if the trigger
01:40:06.240 | for the automatic daily is that you are going
01:40:08.600 | to do something with your own kids, automatically, daily.
01:40:11.280 | Now it feeds back.
01:40:12.680 | - Okay.
01:40:13.520 | - And that includes the whole lesson
01:40:14.800 | that if you want to learn something, you should teach it.
01:40:17.000 | - Oh, I strongly believe that.
01:40:18.160 | - Yes.
01:40:19.200 | - I strongly believe that.
01:40:21.240 | - And so I currently don't have kids.
01:40:23.520 | So that's, maybe I should just get kids
01:40:25.640 | to help me with the math thing.
01:40:27.080 | But outside of that, I do want to integrate math
01:40:31.200 | into daily practice.
01:40:32.080 | So I'll definitely check out the daily challenges
01:40:35.960 | and see because, what is it?
01:40:39.240 | Grant Sanderson, we talked about offline,
01:40:42.720 | 3Blue1Brown.
01:40:42.720 | He speaks to this as well,
01:40:44.920 | that his videos aren't necessarily,
01:40:48.200 | that they don't speak to the thing that I'm referring to,
01:40:50.520 | which is the daily practice.
01:40:52.640 | They're more almost tools of inspiration.
01:40:56.320 | They kind of show you the beauty of a particular problem
01:41:01.320 | in mathematics, but they're not a daily ritual.
01:41:05.760 | And I'm in search of that daily ritual mathematics.
01:41:09.520 | It's not trivial to find,
01:41:11.280 | but I hope to find that.
01:41:16.880 | 'Cause I think math gives you a perspective on the world
01:41:20.960 | that enriches everything else.
01:41:23.120 | - So I like what you said about the daily also,
01:41:25.840 | because that's also one reason
01:41:27.400 | why I put my Carnegie Mellon class online.
01:41:29.960 | It's not every day, it's every other day.
01:41:31.960 | Semester is almost over.
01:41:33.280 | But the idea was, I guess my philosophy was,
01:41:35.880 | if I'm already doing the class,
01:41:37.360 | let's just like put it there, right?
01:41:38.920 | But I do know that there are people
01:41:40.880 | who have been following it, who are not in my class at all,
01:41:43.920 | who have just been following it because,
01:41:45.920 | yes, it's combinatorics.
01:41:47.560 | And the value of that is you could,
01:41:49.880 | you don't really need to know calculus to follow it,
01:41:51.800 | if that makes sense.
01:41:52.720 | So it's actually something that people could follow.
01:41:54.600 | So again, and that one's free.
01:41:56.000 | So that one's just there on YouTube.
01:41:58.640 | - Well, speaking of combinatorics, what is it?
01:42:02.440 | What do you find interesting?
01:42:03.800 | What do you find beautiful about combinatorics?
01:42:07.240 | - So combinatorics to me is the study of things
01:42:11.480 | where they might be more finite and more discrete.
01:42:16.480 | What I mean is like, if I look at a network,
01:42:18.960 | actually a lot of times the combinatorics
01:42:20.640 | will boil down to something,
01:42:21.840 | and the combinatorics I think about
01:42:23.240 | might be something related to graphs or networks.
01:42:25.960 | And they're very discrete because if you have a node,
01:42:28.800 | it's not that you have 0.7 of a node
01:42:32.160 | and 0.3 of a node over there.
01:42:33.520 | It's like you got one node and then you jump one step
01:42:36.080 | to go to the next node.
01:42:37.440 | So that notion is different from say calculus,
01:42:39.840 | which is very continuous, where you go and say,
01:42:42.800 | I have this speed, which is changing over time.
01:42:46.160 | And now what's the distance I've traveled?
01:42:47.760 | That's the notion of an integral,
01:42:49.040 | where you have to think of subdividing time
01:42:50.880 | into very, very small pieces.
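As a tiny illustration of that subdividing picture, here is a Riemann-sum sketch in Python (the speed function is made up for the example): distance emerges as the sum of speed times dt over finer and finer time slices, approaching the exact integral.

```python
# Distance from speed by subdividing time: sum(speed(t) * dt) approaches
# the integral as dt shrinks. The speed function here is made up.
def distance(speed, t_end, steps):
    dt = t_end / steps
    return sum(speed(k * dt) * dt for k in range(steps))

speed = lambda t: 3 * t * t                    # exact integral on [0, 1] is 1
for steps in (10, 100, 1000):
    print(steps, distance(speed, 1.0, steps))  # 0.855, 0.985, 0.9985...
```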
01:42:52.600 | So the kinds of things that you do when you reason
01:42:55.160 | about these finite discrete structures
01:42:59.320 | often might be iterative, algorithmic, inductive.
01:43:03.240 | These are ideas where I go from one step to the next step
01:43:06.320 | and so on and make progress.
01:43:08.160 | I guess I actually personally like all kinds of math.
01:43:11.280 | My area of research just ended up in here
01:43:13.520 | because I met a really interesting PhD advisor.
01:43:17.120 | Potentially, that's honestly the reason
01:43:18.960 | I went into that direction.
01:43:20.280 | I met a really interesting guy.
01:43:21.920 | He seemed like he did good stuff, interesting stuff,
01:43:24.840 | and he looked like he cared about students.
01:43:26.640 | And I said, let me just go and learn whatever you do.
01:43:29.200 | Even though my prior practice and preparation
01:43:32.200 | before my PhD was not combinatorics,
01:43:34.240 | but analysis, the continuous stuff.
01:43:36.880 | - So the annoying thing about combinatorics
01:43:40.560 | and discrete stuff is it's often really difficult to solve
01:43:45.560 | from a sort of running time complexity perspective.
01:43:51.200 | Is there, could you speak to the idea
01:43:56.240 | of complexity analysis of problems?
01:44:00.640 | Do you find it useful, do you find it interesting?
01:44:03.120 | Do you find that lens of studying the difficulty
01:44:08.200 | of how difficult the computer science problem
01:44:11.560 | is a useful lens onto the world?
01:44:15.200 | - Oh, very much so.
01:44:16.160 | Because if you want to make something practical,
01:44:20.360 | which has large numbers of people using it,
01:44:22.720 | the computational complexity to me is almost question one.
01:44:27.200 | And that's, again, that's at the origin
01:44:29.040 | of when we started doing this stuff with disease control.
01:44:31.720 | From the very beginning, the deep questions
01:44:33.440 | that were running through my mind were,
01:44:35.360 | would we be able to support a large population
01:44:38.440 | with only one server?
01:44:39.640 | And if the answer is no, we can't start
01:44:43.640 | because I don't have enough money.
01:44:45.340 | - Yeah, and there the question is very much linear time
01:44:51.960 | versus anything slower than linear time.
01:44:58.320 | As a very specific thing,
01:44:59.800 | you have a bunch of really interesting papers.
01:45:01.360 | If I could ask, maybe we could pull out some cool insights
01:45:04.000 | at the high level.
01:45:05.240 | Can you describe the data structure of a voting tree
01:45:08.920 | and what are some interesting results on it?
01:45:11.240 | You have a paper that I noticed on it.
01:45:13.700 | - Yeah, so this is an example of, I guess,
01:45:17.380 | how in math we might say,
01:45:19.800 | here's an interesting kind of a question
01:45:22.240 | that we just can't seem to understand enough about.
01:45:25.720 | Maybe there's something else going on here.
01:45:27.560 | And the way to describe this is,
01:45:30.160 | you could imagine trying to hold elections
01:45:32.740 | where if you have only two candidates, that's kind of easy.
01:45:35.880 | You just run them against each other
01:45:37.160 | and see who gets more votes.
01:45:38.600 | But as you know, once you have more candidates,
01:45:40.680 | it's very difficult to decide who wins the election.
01:45:43.120 | And there's an entire voting theory around this.
01:45:46.320 | So a theoretical question became,
01:45:49.440 | what if you made a system of runoffs,
01:45:53.560 | like a system of head-to-head contests,
01:45:57.160 | which is structured like a tree,
01:45:58.600 | almost looking like a circuit.
01:46:00.200 | I'm using that way of thinking
01:46:01.800 | because it's sort of like electrical engineering
01:46:04.500 | or computer science.
01:46:05.680 | You might imagine having a bunch of leads that carry signal,
01:46:09.260 | which are going through AND gates and OR gates and whatnot,
01:46:11.600 | and you've managed to compute beautiful things.
01:46:13.600 | This is just from a purely abstract point of view.
01:46:16.260 | What if the inputs are candidates?
01:46:18.560 | And for every two candidates,
01:46:20.200 | it is known which of the candidates
01:46:21.760 | is more popular than the other.
01:46:23.440 | Now can you build some kind of a circuit board,
01:46:25.840 | which says, first, candidate number four
01:46:28.000 | will play against five and see who wins and so on.
01:46:31.560 | Okay, so now what would be a nice outcome?
01:46:34.560 | This is a general question of,
01:46:35.800 | could I make a big circuit board to feed an election into?
01:46:39.120 | Like maybe one nice outcome would be,
01:46:40.640 | whoever wins at least is preferred over a lot of people.
01:46:45.200 | So for example, if you ran an election with 1,024 candidates,
01:46:48.480 | ideally we would like a guarantee that says
01:46:51.080 | that the winner beats a lot of people.
01:46:54.080 | Actually, in any system where there are 1,024 candidates,
01:46:58.560 | there's always a candidate
01:46:59.880 | who beats at least 512 of the others.
01:47:02.880 | This is a mathematical fact
01:47:04.400 | that there's actually always a person
01:47:05.800 | who beats at least half of the other people.
01:47:08.040 | - I'm trying to make sense of that mathematical fact.
01:47:13.200 | Is this supposed to be obvious?
01:47:15.040 | - No, but I can explain it.
01:47:17.000 | No, no, I can.
01:47:17.840 | The way it works is that, think of it this way.
01:47:21.400 | Every time, I think, imagine I have all these candidates
01:47:24.320 | and everyone is competing,
01:47:26.080 | everyone is like compared with everyone else at some point.
01:47:29.240 | Well, think of it this way.
01:47:30.640 | Whenever there's a comparison, somebody gets a point.
01:47:34.240 | That's the one who is better than the other one.
01:47:37.040 | My claim is there's somebody whose score
01:47:39.480 | is at least half of how many other people there are.
01:47:42.920 | - Yeah, I'm just trying to,
01:47:44.880 | like my intuition is very close to that being true,
01:47:47.640 | but it's beautiful.
01:47:48.720 | I didn't at first,
01:47:50.400 | that's not an obvious fact.
01:47:52.280 | - No, it's not.
01:47:53.120 | - And it feels like a beautiful fact.
01:47:55.800 | - Well, let me explain it this way.
01:47:57.160 | Imagine that for every match,
01:48:00.560 | you didn't give one point, but you gave two points.
01:48:03.880 | You gave one point to each person.
01:48:05.960 | Now that's not what we're really doing.
01:48:07.080 | We really want to give one point to the winner of the match,
01:48:10.880 | but instead we'll just give two.
01:48:12.280 | If you gave two points to everyone on every matchup,
01:48:15.920 | actually everyone has the same number of points.
01:48:18.440 | And the number of points they get
01:48:19.720 | is how many other people there are.
01:48:21.480 | Does that sort of make sense?
01:48:23.560 | I'm just like saying--
01:48:24.400 | - No, no, everyone getting the same makes perfect sense.
01:48:26.200 | - Okay, so the point is if for every comparison
01:48:29.480 | between two people, which I'm doing for every two people,
01:48:32.520 | I gave one point to each person,
01:48:34.440 | your score, everyone's score is the same.
01:48:36.680 | It's how many other people there are.
01:48:38.600 | Now we only make one change.
01:48:40.240 | For each matchup, you give one point only to the winner.
01:48:44.360 | So we're awarding half the points.
01:48:47.080 | So now the deal is if in the original situation,
01:48:50.280 | everyone's score was equal,
01:48:52.160 | which is how many other people there are.
01:48:54.880 | Now there's only half the number of points to go around.
01:48:57.640 | So what ends up happening is that there's always going to be,
01:49:02.360 | like the average number of points per person
01:49:04.720 | is going to be half of how many other people there are.
01:49:07.120 | And somebody is going to be above average.
01:49:08.600 | - Somebody is going to be above that.
01:49:09.880 | - At least average.
01:49:10.800 | Yeah, this is this notion of expected value
01:49:13.240 | that if I have a random variable,
01:49:14.560 | which has an expected value,
01:49:16.240 | there's going to be some possibility
01:49:17.840 | in the probability space
01:49:19.320 | where you're at least as big as the expected value.
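A minimal Python sketch of that counting argument, run on a randomly generated tournament (made up purely for illustration): one point per matchup means the average score is (n - 1) / 2, so the top score is always at least that.

```python
import random

# In any round-robin tournament on n candidates, each matchup awards one
# point to its winner, so the average score is (n - 1) / 2 and somebody
# is at or above average. Random tournament here, just for illustration.
def random_tournament_scores(n, rng):
    wins = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            wins[rng.choice((i, j))] += 1  # one point per head-to-head
    return wins

rng = random.Random(0)
n = 1024
wins = random_tournament_scores(n, rng)
assert max(wins) >= (n - 1) / 2   # someone beats at least 512 others
print(max(wins))
```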
01:49:21.600 | - Yeah, when you describe it like that, it's obvious.
01:49:23.720 | But when you're first saying in this little circuit
01:49:26.680 | that there's going to be one candidate better than half,
01:49:31.040 | that's not obvious.
01:49:33.400 | - Yeah, it's not. - It's funny.
01:49:35.160 | Math, this is nice.
01:49:37.200 | Okay, so you have this,
01:49:38.640 | but ultimately you're trying to with a voting tree,
01:49:42.820 | I don't know if you're trying this,
01:49:43.920 | but to have a circuit that's like compact, that's small.
01:49:48.400 | - Well, you'd like it to be small.
01:49:49.240 | - That achieves the same kind of,
01:49:53.960 | I mean, the smaller it is,
01:49:57.160 | if we look at practically speaking,
01:49:59.080 | the lower the cost of running the election,
01:50:01.600 | of running through, of computing the circuit.
01:50:03.840 | - That is true.
01:50:04.680 | But actually at this point,
01:50:05.840 | the reason the question was interesting
01:50:08.620 | is because there was no good guarantee
01:50:12.800 | that the winner of that circuit
01:50:15.440 | would have beaten a lot of people.
01:50:18.440 | Let me give an example.
01:50:19.680 | The best known circuit,
01:50:20.840 | when we started thinking about this,
01:50:22.420 | was the circuit called candidate one
01:50:24.800 | plays against candidate two,
01:50:26.440 | candidate three plays against four,
01:50:28.680 | and then the winners play against each other.
01:50:30.520 | And then by the way, five plays against six,
01:50:32.560 | seven against eight, the winners play against each other.
01:50:34.680 | You understand, it's like a giant binary tree.
01:50:36.440 | - Yeah, it's a binary, like a balanced binary tree?
01:50:38.760 | - Yeah, it's a balanced binary tree.
01:50:40.760 | One, two, three, four, up to 1,024,
01:50:42.720 | everyone going up to find the winner.
01:50:44.160 | - Beautiful.
01:50:45.000 | - Well, you know what?
01:50:45.820 | There's a system in the world
01:50:47.440 | where it could just be
01:50:49.760 | that there's a candidate called number one
01:50:52.320 | that just beats like 10 other people,
01:50:55.260 | just the 10 that they need to beat on their way up,
01:50:59.820 | and they lose to everyone else.
01:51:01.360 | But somehow they would get all the way up.
01:51:04.760 | My point is it is possible to outsmart that circuit
01:51:10.760 | in one weird way of the world,
01:51:13.780 | which makes that circuit a bad one,
01:51:15.360 | because you want to say,
01:51:16.200 | I will use this circuit for all elections.
01:51:18.960 | And you might have a system of inputs that go in there
01:51:22.480 | where the winner only beat 10 other people,
01:51:24.800 | which is the people they had to beat on their way up.
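A small Python sketch of that failure mode, under an assumed comparison rule (lower index always wins, purely for illustration): in a balanced 1,024-player bracket, the champion plays only log2(1024) = 10 matches, so the bracket by itself certifies only 10 head-to-head wins.

```python
from collections import Counter

# Balanced single-elimination bracket: each round, adjacent survivors
# play and the winners move up. We count matches per player to see how
# few the champion actually plays.
def run_bracket(players, beats, match_count):
    while len(players) > 1:
        survivors = []
        for a, b in zip(players[::2], players[1::2]):
            match_count[a] += 1
            match_count[b] += 1
            survivors.append(a if beats(a, b) else b)
        players = survivors
    return players[0]

counts = Counter()
champion = run_bracket(list(range(1024)), lambda a, b: a < b, counts)
print(champion, counts[champion])  # 0 10 -- the winner played only 10 matches
```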
01:51:26.720 | - So you want to have a circuit where there's as many,
01:51:29.720 | like the final result is as strong as possible.
01:51:33.160 | - Yes.
01:51:34.200 | - And so what ideas do you have for that?
01:51:37.440 | - So we actually only managed to improve it
01:51:40.480 | to square root of N.
01:51:41.800 | So if N is the number of vertices,
01:51:43.680 | N over two would be the ideal.
01:51:46.080 | We got it to square root of N.
01:51:48.320 | - Versus log of N.
01:51:50.120 | - Yeah, exactly.
01:51:51.080 | - Yeah.
01:51:52.400 | Which is-
01:51:53.440 | - Well, that is halfway.
01:51:54.360 | - It could be a lot.
01:51:55.880 | - Yeah.
01:51:56.720 | - It could be a big improvement.
01:51:57.760 | So that's a, okay, cool.
01:51:58.980 | Is there something you can say with words
01:52:01.600 | about what kind of circuit, what that looks like?
01:52:04.600 | - I can give an idea of one of the tools inside.
01:52:07.520 | - Yeah.
01:52:08.360 | - But the actual execution ends up being more complicated.
01:52:10.560 | But one of the widgets inside this
01:52:12.800 | is building a system where you have like a candidate
01:52:16.960 | who plays in one part of this whole huge, huge tree,
01:52:20.720 | and that same candidate, let's call him seven.
01:52:23.280 | Seven plays against somebody.
01:52:25.560 | Let's make up some numbers.
01:52:26.720 | Let's call the others like letters.
01:52:27.920 | So seven plays against A.
01:52:29.320 | Seven's also going to play against B separately.
01:52:33.720 | And the winners of each of those will play each other.
01:52:36.600 | By the way, seven's also going to play C.
01:52:38.440 | Seven's going to play D.
01:52:39.680 | And the winners are going to play each other.
01:52:41.080 | And the winners are going to play each other.
01:52:42.520 | We call this seven against all.
01:52:45.000 | Well, seven against like everyone from a bunch of-
01:52:47.720 | - Got it.
01:52:48.560 | So there's some nice overlap between the matchups.
01:52:50.880 | - Yeah.
01:52:51.720 | - That somehow has a nice feature to it.
01:52:53.040 | - Yes. And I can tell you the nice feature
01:52:54.240 | because if at the base of this giant tree,
01:52:56.520 | at the base of this giant circuit, like this is a widget.
01:52:58.680 | You rebuild the things out of widgets.
01:52:59.920 | So I'm just describing one widget.
01:53:01.360 | But in the base of this widget,
01:53:03.440 | you have lots of things which are seven against someone,
01:53:05.640 | seven against someone, seven against someone.
01:53:07.400 | In fact, every matchup at the bottom
01:53:09.880 | is seven against someone.
01:53:11.560 | What that means is
01:53:12.880 | if seven actually beat everyone
01:53:17.360 | they were matched up against,
01:53:18.440 | well, seven would rise to the top.
01:53:20.360 | So one possibility is if you see a seven
01:53:22.920 | emerge from the top,
01:53:24.040 | you know that seven actually beat everyone
01:53:25.960 | they were against.
01:53:27.480 | On the other hand, if anyone else is on top,
01:53:30.160 | let's call it F.
01:53:31.320 | If F is on top, how did F get there?
01:53:33.600 | Well, F beat seven on the way at the beginning.
01:53:36.360 | So the point is the outcome of this circuit
01:53:38.680 | has a certain property.
01:53:40.160 | If you see a seven,
01:53:41.320 | you know that the seven actually beat a bazillion people.
01:53:43.960 | If you see anyone else, at least you know they beat seven.
01:53:47.040 | - Yeah. Then you can prove that it has a nice property.
01:53:49.880 | That's really interesting.
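A rough Python sketch of that widget, with consistent random head-to-head preferences assumed purely to exercise it (the candidate names are made up): every leaf match pits the special candidate "S" against a different opponent, winners meet in a balanced tree, and the property just described is asserted at the end.

```python
import random

prefs = {}
def beats(a, b):
    # Consistent random head-to-head preferences, fixed once per pair.
    key = (min(a, b, key=str), max(a, b, key=str))
    if key not in prefs:
        prefs[key] = random.choice(key)
    return prefs[key] == a

def widget(special, others):
    # Leaves: the special candidate plays each opponent separately.
    layer = [special if beats(special, x) else x for x in others]
    while len(layer) > 1:  # winners play winners; equal names just pass up
        layer = [a if a == b or beats(a, b) else b
                 for a, b in zip(layer[::2], layer[1::2])]
    return layer[0]

others = list(range(8))
winner = widget("S", others)
if winner == "S":
    assert all(beats("S", x) for x in others)  # S beat every opponent it met
else:
    assert beats(winner, "S")                  # anyone else beat S directly
print(winner)
```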
01:53:50.880 | Is there something you can say,
01:53:54.160 | perhaps going completely outside of what we're talking about
01:53:56.760 | is how we may have mathematical ideas
01:54:01.760 | of improving the electoral process?
01:54:06.400 | - That one, no.
01:54:07.240 | No, I can't give you that one.
01:54:09.080 | - I mean, is there, like, do you ever see as,
01:54:11.720 | as there be, do you see as there being a lot of opportunities
01:54:17.360 | for improving how we vote?
01:54:19.160 | Like from your, I don't know if you saw parallels,
01:54:23.680 | but, you know, it seems like if this actually kind of maps
01:54:27.680 | to your sort of COVID work,
01:54:29.640 | which is there's a network effect, right?
01:54:32.200 | It seems like we should be able to apply
01:54:34.280 | similar kind of effects of how we decide other things
01:54:38.560 | in our lives.
01:54:39.400 | And one of the big decisions we'll make
01:54:42.160 | is who represents us in government.
01:54:44.480 | Do you ever think about like mathematically
01:54:46.200 | about those kinds of systems?
01:54:48.160 | - I think a little bit about those
01:54:49.520 | because where I went to college,
01:54:51.480 | the way we voted for student government
01:54:53.200 | was based on this, is it called ranked choice,
01:54:56.080 | where you eliminate the bottom and run runoff elections?
01:55:00.800 | So that was the first time I ever saw that.
01:55:02.800 | And I thought that made sense.
01:55:04.440 | The only problem is it doesn't seem so easy
01:55:06.880 | to get something that makes sense adopted
01:55:08.520 | as the new voting system.
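A minimal sketch of that ranked-choice (instant-runoff) rule in Python, with made-up ballots and the assumption that every ballot ranks every candidate: keep eliminating whoever has the fewest first-choice votes until someone holds a majority.

```python
from collections import Counter

def instant_runoff(ballots):
    active = {c for ballot in ballots for c in ballot}
    while True:
        # Each ballot counts for its highest-ranked still-active candidate.
        tally = Counter(next(c for c in ballot if c in active)
                        for ballot in ballots)
        leader, votes = tally.most_common(1)[0]
        if 2 * votes > len(ballots):
            return leader
        active.remove(min(tally, key=tally.get))  # drop the bottom candidate

ballots = [("A", "B", "C")] * 4 + [("B", "C", "A")] * 3 + [("C", "B", "A")] * 2
print(instant_runoff(ballots))  # C is eliminated, then B wins 5-4
```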
01:55:09.720 | - That's a whole nother, that's not a math solution.
01:55:12.720 | That's a, well, it's math in the sense that it's game theory
01:55:16.000 | so you have to come up with incentives,
01:55:17.280 | it's mechanism design.
01:55:18.280 | You have to figure out how to trick us
01:55:21.040 | despite our basic human nature
01:55:24.280 | to adopt solutions that are better.
01:55:27.160 | - Yeah.
01:55:28.000 | - That's a whole nother conversation, I think.
01:55:30.520 | Can you just, 'cause it sounded really cool,
01:55:33.240 | talk a little bit about stochastic coalescence
01:55:36.120 | and you have a paper on showing that,
01:55:39.440 | something you could describe what it is,
01:55:40.760 | but I guess it's a super linear, super logarithmic time
01:55:44.760 | and you came up with some kind of trick that make it faster.
01:55:47.680 | Just, can you just talk about it a little bit?
01:55:49.320 | - Yeah, so this was something which came up
01:55:51.720 | when I was at Microsoft Research for a summer
01:55:54.200 | and I'm putting that context because that shows
01:55:56.720 | that it has some practical motivation at some point.
01:55:59.600 | Actually, I think it's still--
01:56:01.920 | - It doesn't need to.
01:56:02.880 | - Yeah.
01:56:03.720 | - It doesn't need to, it can be beautiful and it's all right.
01:56:05.240 | - Yeah, so the easiest way to describe this is,
01:56:07.400 | suppose you got like a big crowd of people
01:56:10.000 | and everybody knows how many hours of sleep
01:56:12.200 | they got last night.
01:56:13.320 | And you wanna know how many total hours of sleep
01:56:15.320 | were gotten by this big crowd of people.
01:56:17.640 | At the beginning, you might say,
01:56:18.840 | that sounds like a linear time algorithm of saying,
01:56:21.160 | hey, how many hours you got?
01:56:22.760 | How many you got?
01:56:23.600 | How many you got?
01:56:24.440 | Add, add, add.
01:56:25.280 | - Yes.
01:56:26.120 | - But there's a way to do this if you remember
01:56:27.680 | that there are people and they presumably know how to add.
01:56:30.440 | You could make a distributed algorithm to make this happen.
01:56:33.480 | For example, while we're thinking of these trees,
01:56:35.920 | imagine you had 1,024 people.
01:56:38.640 | If you could just say, hey, person number one
01:56:40.760 | and person number two, you will add your hours of sleep.
01:56:43.920 | Person number two will go away
01:56:46.120 | and person number one is gonna remember the sum.
01:56:48.400 | Person three and four add up
01:56:50.960 | and person three takes charge of remembering it.
01:56:53.800 | Person four goes away.
01:56:54.880 | Now this like person one knows the sum of these two,
01:56:57.000 | person three knows the sum of those two, they talk.
01:56:58.920 | You see what I mean?
01:56:59.760 | It's like you're going up this tree,
01:57:02.200 | same tree that we talked about earlier.
01:57:03.640 | - Built up a tree from the bottom up.
01:57:05.920 | - Yeah, built up a tree from the bottom up.
01:57:07.600 | And the beautiful thing is,
01:57:09.200 | since everyone's doing stuff in parallel,
01:57:11.400 | the amount of time it takes to get the total sum
01:57:14.600 | is actually just the number of layers in the tree,
01:57:17.360 | which is 10.
01:57:18.960 | So now that's logarithmic time
01:57:20.360 | to add up the number of hours that people slept today.
01:57:23.760 | Sounds fantastic.
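A tiny Python sketch of that pairwise tree-sum (the hours are randomly made up): 1,024 numbers get combined in log2(1024) = 10 rounds, because every round the surviving representatives pair off in parallel.

```python
import random

hours = [random.randint(4, 10) for _ in range(1024)]

values, rounds = list(hours), 0
while len(values) > 1:
    # Each even-indexed person absorbs their odd-indexed neighbor's sum;
    # all pairs combine in parallel, so this whole line is one round.
    values = [values[i] + values[i + 1] for i in range(0, len(values), 2)]
    rounds += 1

assert values[0] == sum(hours)
print(values[0], rounds)  # total hours, reached in 10 rounds
```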
01:57:25.280 | There's only one problem.
01:57:26.400 | How do you decide who's person number one
01:57:28.000 | and person number two?
01:57:29.720 | - Yes.
01:57:30.560 | - So if, for example, you just went out into downtown
01:57:32.680 | and said, hey, get these thousand people, go.
01:57:34.760 | Well, if you're gonna go and say,
01:57:35.960 | and by the way, you're one and you're two and you're three,
01:57:37.720 | that's linear time.
01:57:38.760 | - Yes.
01:57:39.680 | - That's cheating.
01:57:40.520 | So now the question is how to do this in a distributed way.
01:57:43.000 | And there were some people
01:57:44.200 | who proposed a very elegant algorithm.
01:57:47.320 | And they wanted to analyze it.
01:57:48.840 | So I came in onto the analyze side.
01:57:50.680 | But the elegant algorithm was like this.
01:57:52.760 | It was like, well, we don't actually know
01:57:55.720 | what this big tree is.
01:57:57.600 | There isn't any big tree.
01:57:58.880 | So what's gonna happen is first,
01:58:01.000 | everyone is going to decide right now.
01:58:04.280 | Oh, one important thing.
01:58:05.640 | Everyone is going to,
01:58:07.000 | at the very beginning of the whole game,
01:58:09.920 | they will have delegated responsibility to themselves
01:58:13.360 | as the one who knows the sum so far.
01:58:16.400 | So the point is there's gonna be,
01:58:18.920 | people are all gonna have like a pointer,
01:58:20.760 | which says you are the one who knows my,
01:58:24.440 | you've taken care of my ticket, my number.
01:58:26.680 | - Yeah, they select the representative
01:58:28.280 | for this particular piece of knowledge.
01:58:31.360 | - And at the very beginning, you're your own representative.
01:58:33.680 | The thing has to start simple, right?
01:58:35.120 | So at the beginning, you're your own representative.
01:58:36.440 | - You're pointing to yourself, got it.
01:58:38.040 | - Yep, and the way this works is that at every time step,
01:58:41.560 | someone blares a ding dong on the town clock or whatever.
01:58:45.840 | And each person flips a coin themselves to decide,
01:58:48.720 | am I going to hunt for somebody to give my number to
01:58:53.600 | and let them represent me?
01:58:55.120 | Or am I going to sit here and wait for someone to come?
01:58:58.960 | Okay, well, they flip their coin.
01:59:02.640 | Some of the people start asking other people saying,
01:59:04.840 | hey, I would like you to be my representative.
01:59:08.600 | Here is my number.
01:59:10.240 | But the problem is that there's limited bandwidth
01:59:12.080 | of the people who are getting asked.
01:59:13.280 | It's like you can't go out to prom with five people.
01:59:16.720 | That is not what we're doing.
01:59:17.880 | We're adding numbers, okay?
01:59:19.200 | But you can only add one number.
01:59:20.760 | So the person who has suddenly gotten asked
01:59:22.640 | by all these people, well, they'll have to decide
01:59:25.000 | who they're going to take it from.
01:59:27.320 | And they randomly just choose one.
01:59:29.480 | When they randomly choose one, all the others are rejected
01:59:31.920 | and they don't get to delegate anything in that round.
01:59:34.960 | But now if this person has absorbed this one who said,
01:59:38.400 | okay, here, you take charge of my number,
01:59:40.720 | this person now updates their pointer.
01:59:42.640 | You're in charge.
01:59:43.960 | And this person adds the two numbers.
01:59:46.680 | That was the first round.
01:59:48.920 | In the next round, when they do the coin flipping,
01:59:52.880 | this person doesn't flip anymore
01:59:54.320 | because they're just delegating.
01:59:56.160 | It's that anyone who has the pointers themselves,
01:59:59.040 | that's like a person who is in charge
02:00:01.640 | of some number of informations,
02:00:03.320 | they flip the coin to decide,
02:00:04.680 | should I find other people who are agents
02:00:08.240 | or should I wait for people to ask me?
02:00:10.040 | - Yes, brilliant.
02:00:11.640 | - This is somebody else's idea.
02:00:12.840 | And now the idea is, okay,
02:00:14.360 | if you just keep doing this process, what ends up happening?
02:00:16.920 | Oh yeah, and also, by the way,
02:00:18.680 | if you decide that you want to go reach out to other people,
02:00:21.040 | here's the catch.
02:00:23.120 | When you're one of these agents saying,
02:00:24.680 | okay, I'm going to go look for someone,
02:00:27.400 | you have no idea who in this crowd is an agent
02:00:30.760 | or somebody who delegated it to someone else.
02:00:33.400 | You just pick a random person.
02:00:35.640 | When you pick the random person,
02:00:37.040 | if it lands on someone and the person says,
02:00:38.880 | oh, I actually delegated it to someone,
02:00:41.840 | then you follow the point.
02:00:43.800 | - You walk up the delegation chain.
02:00:45.280 | - Walk up the delegation chain.
02:00:46.120 | And you can do like path compression in the algorithm
02:00:49.120 | to make it so you don't consistently do lots of walking up.
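For reference, the path-compression trick mentioned here is the classic union-find "find" step; a minimal sketch in Python, with a made-up parent array:

```python
# Walk the delegation chain to the current representative, then repoint
# everyone met along the way straight at them, so later walks are short.
def find(parent, i):
    root = i
    while parent[root] != root:       # walk up to the representative
        root = parent[root]
    while parent[i] != root:          # compress the path just walked
        parent[i], i = root, parent[i]
    return root

parent = [1, 2, 3, 3]                 # 0 -> 1 -> 2 -> 3, and 3 is the agent
print(find(parent, 0), parent)        # 3 [3, 3, 3, 3]
```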
02:00:52.040 | But the bottom line is that what ends up happening
02:00:54.680 | is that you end up reaching out.
02:00:57.280 | Whenever you're one of the ones reaching out,
02:00:59.120 | you can think of it as each agent is responsible
02:01:01.880 | for some number of people.
02:01:03.360 | It's almost like they're the leader of a bunch.
02:01:05.520 | As the process is evolving,
02:01:07.200 | you have these lumps.
02:01:09.960 | Each lump has an agent.
02:01:11.720 | And when the agent reaches out,
02:01:13.400 | they reach out to another lump
02:01:15.720 | where the probability of them hitting that lump
02:01:18.160 | is proportional to the size of the lump.
02:01:20.160 | That is the one funny thing about this process.
02:01:25.480 | It's not that they reach out
02:01:27.400 | to a uniformly random lump
02:01:29.320 | where every lump has the same chance
02:01:30.960 | of getting reached out to.
02:01:32.480 | The bigger the lump is,
02:01:34.560 | the more likely it is
02:01:37.280 | that you end up reaching that lump.
02:01:38.760 | - Which is a problem.
02:01:40.200 | - Let me explain why that's a problem.
02:01:41.600 | Because you see,
02:01:42.800 | you're hoping that this has a small number of steps.
02:01:45.440 | But here's a bad situation that could happen.
02:01:47.720 | Imagine if you had like,
02:01:50.040 | there are N people that you're adding up.
02:01:52.040 | Imagine that you have exactly square root of N lumps left
02:01:57.040 | of which almost all of them are just one person
02:02:01.480 | who's still their own boss, their own manager.
02:02:04.360 | - Except one giant one.
02:02:05.200 | - One giant one.
02:02:06.040 | Now what's gonna happen?
02:02:07.000 | It's gonna be a huge bottleneck
02:02:08.400 | because every round the giant one
02:02:09.840 | can only absorb one of the others.
02:02:11.960 | And now you suddenly have time
02:02:13.640 | which is about square root of N.
02:02:15.760 | The square root of N is chosen
02:02:16.880 | because that is one where the lumps are such
02:02:20.320 | that you really are limited by this large one
02:02:23.600 | slowly sucking up the rest of them.
02:02:26.040 | So the heart of the question became,
02:02:28.200 | well, but is that just so unusual
02:02:30.080 | that it doesn't usually happen?
02:02:32.760 | Because remember you start with everyone
02:02:34.680 | just being independent.
02:02:36.080 | It's like a lot of lumps of size one.
02:02:37.640 | - How naturally do the big lumps emerge?
02:02:39.640 | - Yes.
02:02:40.480 | And so the heart of the proof,
02:02:42.120 | which was joint work with Eyal Lubetzky,
02:02:45.120 | was showing that actually in that thing,
02:02:49.080 | the lumps do kind of get out of whack.
02:02:50.920 | And so it's not the purely logarithmic number of steps.
02:02:54.720 | But if you make one very slight change,
02:02:56.760 | which is if you are one of the agents
02:03:00.400 | and you have just been propositioned,
02:03:02.400 | possibly relayed along by a couple of different people,
02:03:05.320 | if you just say, don't take a random one,
02:03:07.640 | but accept the smallest lump,
02:03:10.400 | that actually does enough to even it all out.
02:03:14.320 | - Distributes the lump size.
02:03:15.720 | - Yeah.
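Here is a toy Python simulation of that process, under assumed details not fully pinned down in the conversation (a seeker lands on a lump with probability proportional to its size, matching the "pick a random person and follow the pointers" step, and a waiting agent absorbs exactly one suitor per round). It compares random acceptance against the accept-the-smallest-lump tweak by printing the round count for each:

```python
import random

def coalesce(n, accept_smallest, rng):
    """Rounds for n size-1 lumps to merge into one, per the process above."""
    lumps = [1] * n        # each entry: how many people this agent represents
    rounds = 0
    while len(lumps) > 1:
        rounds += 1
        is_seeker = [rng.random() < 0.5 for _ in lumps]
        seekers = [i for i, s in enumerate(is_seeker) if s]
        # A seeker lands on a lump with probability proportional to size.
        targets = rng.choices(range(len(lumps)), weights=lumps, k=len(seekers))
        proposals = {}     # waiting agent -> list of proposing seekers
        for i, j in zip(seekers, targets):
            if j != i and not is_seeker[j]:
                proposals.setdefault(j, []).append(i)
        absorbed = set()
        for waiter, suitors in proposals.items():
            pick = (min(suitors, key=lambda s: lumps[s]) if accept_smallest
                    else rng.choice(suitors))
            lumps[waiter] += lumps[pick]  # waiter now represents pick's people
            absorbed.add(pick)
        lumps = [size for i, size in enumerate(lumps) if i not in absorbed]
    return rounds

rng = random.Random(0)
n = 1 << 12
print(coalesce(n, accept_smallest=False, rng=rng),
      coalesce(n, accept_smallest=True, rng=rng))
```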
02:03:16.560 | - I mean, yeah, it's fascinating
02:03:17.680 | how with the distributed algorithms,
02:03:19.000 | a little adjustment can make all the difference in the world.
02:03:22.440 | Yeah.
02:03:23.280 | Actually, by the way,
02:03:24.200 | this goes back to our voting conversation.
02:03:26.920 | This makes me think of like,
02:03:28.440 | these networking systems are so fascinating to study.
02:03:32.240 | They immediately spring to mind ideas
02:03:35.080 | of how to have representation.
02:03:37.640 | Like maybe as opposed to me voting for a president,
02:03:42.240 | I want to vote for like,
02:03:45.600 | for you, Po, to represent me,
02:03:48.400 | maybe on a particular issue.
02:03:50.600 | And then you'll delegate that further.
02:03:52.560 | And then we naturally construct those kinds of networks
02:03:55.120 | because that feels like I can have a good conversation
02:03:58.880 | with you and figure out that you know what you're doing
02:04:00.760 | and I can delegate it to you.
02:04:01.800 | And in that way, construct a representative government,
02:04:05.560 | a representative decision-maker.
02:04:08.400 | That feels really nice as opposed to like us,
02:04:12.440 | like a tree of height one or something,
02:04:14.520 | where it's like, everybody's just,
02:04:16.240 | it feels like there's a lot of room
02:04:19.560 | for layers of representation to form organically
02:04:22.480 | from the bottom up.
02:04:23.800 | I wonder if there are systems like that.
02:04:25.360 | This is the cool thing about the internet
02:04:27.040 | and the digital space where we're so well connected,
02:04:29.520 | just like with the Novid app to distribute information
02:04:34.040 | about the spread of the disease,
02:04:36.960 | we can in the same way, in a distributed sense,
02:04:39.200 | form anything like any kind of knowledge bases
02:04:44.200 | that are formed in a decentralized way
02:04:48.720 | and in a hierarchical way,
02:04:51.440 | as opposed to sort of the old way
02:04:54.280 | where there's no mechanism for large-scale,
02:04:56.800 | fast, distributed transmission of information.
02:05:01.720 | This is really interesting.
02:05:02.560 | This is where almost like network graph theory
02:05:06.760 | becomes practical.
02:05:07.920 | Yeah, most of that exciting work was done
02:05:10.800 | in the 20th century,
02:05:11.800 | but most of the application will be in the 21st,
02:05:14.040 | which is cool to think about.
02:05:15.920 | Let me ask the most ridiculous question.
02:05:17.680 | You think P equals NP?
02:05:19.840 | - Wow, I don't know.
02:05:22.360 | I mean, I would say,
02:05:23.800 | I know there are enough people
02:05:28.080 | who have very strong interest in trying to show that it is.
02:05:31.480 | I'm talking about government agencies.
02:05:34.600 | (laughing)
02:05:37.360 | - For security purposes.
02:05:38.440 | - For security purposes.
02:05:39.280 | - And most computer scientists,
02:05:40.520 | you'd say, believe that P does not equal NP.
02:05:42.920 | My question almost like,
02:05:45.240 | this is back to our aliens discussion.
02:05:46.960 | You want to think outside the box,
02:05:48.400 | the low probability event.
02:05:50.880 | What is the world,
02:05:53.120 | what kind of discoveries would lead us
02:06:01.000 | to prove that P actually equals NP?
02:06:01.000 | Like there could be giant misunderstandings
02:06:05.440 | or gaps in our knowledge about computer science,
02:06:08.040 | about theoretical computer science,
02:06:09.480 | about computation,
02:06:14.760 | which would allow us to, like, flatten all problems.
02:06:14.760 | - Yeah, so I don't know the answer to this question.
02:06:17.040 | I think it's very interesting,
02:06:18.600 | but I actually,
02:06:20.160 | let's put it this way.
02:06:21.240 | By being at Carnegie Mellon
02:06:22.520 | and being around the theoretical computer scientists,
02:06:24.880 | I know enough about what I don't know to say.
02:06:27.160 | I'm the wrong person. - To be humble.
02:06:28.840 | - I'm the wrong person to answer this question.
02:06:31.000 | - Yeah. - Yeah.
02:06:32.360 | It's a great one.
02:06:33.240 | - Well, Scott Aaronson, who's now here at UT Austin,
02:06:35.920 | he used to be at MIT,
02:06:37.800 | puts the probability of P equals NP at 3%.
02:06:42.800 | He put, you know, I always love it.
02:06:46.800 | When you ask, it's very rare in science and academics,
02:06:51.040 | because most folks are humble
02:06:53.720 | in the face of the mystery,
02:06:56.800 | the uncertainty of everything around us,
02:06:59.360 | to have both the humor and the guts to say,
02:07:02.800 | like what are the chance that there's aliens in our galaxy,
02:07:07.120 | intelligent alien civilizations?
02:07:09.400 | As opposed to saying, I don't know,
02:07:10.760 | it could be zero,
02:07:12.280 | it could be, depending on the factors,
02:07:13.640 | saying it's 2.5%.
02:07:16.080 | (laughing)
02:07:17.680 | There's something very pleasant about just having,
02:07:20.280 | it's the number thing,
02:07:22.440 | there's power to the number, it's just like 42.
02:07:26.440 | It's like, why 42?
02:07:27.280 | I don't know, but it's a powerful number.
02:07:29.640 | And then, this is the power of human psychology,
02:07:32.600 | is once you have the number 42,
02:07:34.760 | it's not that the number has meaning,
02:07:39.240 | but because it's placed in a book with humor around it,
02:07:43.920 | it has the meme effect of actually creating reality.
02:07:48.920 | I mean, you could say that 42 made a strong contribution
02:07:53.480 | to helping us colonize Mars,
02:07:57.720 | because it gave a sort of existential crisis to many of us,
02:08:00.480 | including Elon Musk when he was young,
02:08:03.160 | reading a book like that,
02:08:04.280 | and now 42 is part of his humor
02:08:07.040 | that he doesn't shut up about,
02:08:08.880 | he's constantly joking about,
02:08:09.880 | and that humor is spreading through our minds,
02:08:12.320 | and somehow this like silly number just had an effect.
02:08:15.160 | In that same way, after Scott told me the 3% chance,
02:08:19.360 | it's stuck in my head,
02:08:20.680 | and I think it's been having a ripple effect
02:08:22.680 | in everybody else.
02:08:23.720 | The belief that P might equal NP,
02:08:27.720 | with Scott, almost as a joke, saying it's 3%,
02:08:32.200 | is actually motivating a large number of researchers
02:08:34.920 | to work on it.
02:08:35.760 | - Like 3% is high.
02:08:37.080 | - It's very high.
02:08:37.920 | - Because of the potential impact that it might have.
02:08:39.880 | (laughing)
02:08:41.320 | But then 3% is not that high,
02:08:43.480 | because, you know,
02:08:44.320 | we're not very good with probabilities.
02:08:46.480 | I feel like humans are only able to really think
02:08:48.560 | about like 1%, 50%,
02:08:51.280 | and we kind of,
02:08:52.520 | I think a lot of people round 3% up to 50%,
02:08:57.160 | like in our minds,
02:08:58.800 | like 3%,
02:09:00.120 | like this--
02:09:00.960 | - It could happen.
02:09:01.800 | - It could happen.
02:09:02.640 | And it could happen,
02:09:03.480 | and it's like, yeah,
02:09:04.720 | like half the time it'll probably happen.
02:09:07.040 | So we're not very good at that.
02:09:08.520 | That's the other thing with the pandemic:
02:09:10.240 | the exponential growth
02:09:13.560 | that we also talked about offline
02:09:15.800 | is something that we can't quite intuit.
02:09:20.200 | And that's something we probably should get better at
02:09:22.680 | if we want to predict the future,
02:09:24.120 | to anticipate the future,
02:09:25.240 | and to understand how to create technologies
02:09:27.920 | that let us sort of control the future.
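
A small worked example of why exponential growth defeats intuition (the numbers here are invented purely for illustration): a process that doubles weekly looks quiet for a month and then explodes.

```python
# Hypothetical: 100 initial cases, doubling every week.
cases = 100
for week in range(1, 11):
    cases *= 2
    print(f"week {week:2d}: {cases:,} cases")
# After 4 weeks: 1,600 cases. After 10 weeks: 102,400.
# Ten doublings multiply the count by 2**10 = 1024, which is
# why a quiet-looking first month gives no hint of what follows.
```
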
02:09:30.680 | Can I ask you for some recommendations,
02:09:35.040 | maybe for books or movies in your life?
02:09:42.080 | Long ago, when you were baby Po,
02:09:42.080 | or today,
02:09:43.680 | that you found insightful
02:09:50.360 | or learned a lot from,
02:09:52.040 | and would recommend to others?
02:09:52.040 | - Yeah, so I think,
02:09:53.880 | I don't necessarily have exact names for these old things,
02:09:56.600 | but I was generally inspired by stories,
02:10:00.640 | true or fictional,
02:10:02.640 | of campaigns.
02:10:04.640 | For example, like the Lord of the Rings,
02:10:07.560 | that's a campaign, right?
02:10:09.080 | But the thing that always inspired me was
02:10:11.360 | it could be possible for somebody who's crazy enough
02:10:16.360 | to go up against adversity after adversity after adversity,
02:10:19.960 | and succeed.
02:10:21.080 | I mean, those are fictitious,
02:10:23.920 | but I also spent a lot of time, I guess,
02:10:25.280 | reading about, I don't know,
02:10:26.600 | I was interested somehow in like World War II history,
02:10:29.280 | for whatever reason.
02:10:30.360 | That's a campaign which is much more brutal,
02:10:32.360 | but nevertheless, the idea of difficulty, strategy,
02:10:38.360 | fighting even when things,
02:10:40.600 | in that case, it was really fighting,
02:10:41.640 | but just pushing on even when things are difficult.
02:10:44.520 | I guess these are the kinds of general stories
02:10:47.760 | that made me, I guess, want to work on things
02:10:51.560 | that would be hard,
02:10:53.080 | and where it could be a campaign.
02:10:55.600 | It could be that you work on something for a year,
02:10:58.160 | multiple years,
02:10:59.280 | because that was the point, I guess.
02:11:02.880 | - Yeah, it starts with a single person.
02:11:04.800 | That's the interesting thing.
02:11:06.320 | I obviously don't shut up about it recently,
02:11:09.560 | about World War II,
02:11:10.720 | especially on the Hitler side and the Stalin side.
02:11:13.280 | Some of that has really affected my own family,
02:11:16.240 | the roots of my family very much,
02:11:18.760 | but it's interesting to think
02:11:20.280 | that it was just an idea,
02:11:24.800 | and one person decided to do stuff,
02:11:26.880 | and it just builds and builds and builds,
02:11:29.480 | and you can truly have an impact on the world,
02:11:31.840 | both horrendous
02:11:34.240 | and exceptionally positive and inspiring.
02:11:40.960 | So yeah, it's the agency of us individuals.
02:11:46.320 | Sometimes we think we're just reacting to the world,
02:11:49.880 | but we have the full power to actually change the world.
02:11:53.040 | Is there advice you can give to young folks?
02:11:56.880 | We covered a bunch of advice
02:11:57.960 | on middle school and high school mathematics.
02:12:00.560 | Is there more general advice you would give
02:12:02.440 | about how to succeed in life,
02:12:04.760 | how to learn for high school students,
02:12:07.400 | for college students, career or life in general?
02:12:10.560 | - So I think the first one would be
02:12:12.280 | to make sure that you're learning to invent,
02:12:14.680 | and to make sure you're not just learning how to mimic,
02:12:19.040 | because a lot of times you learn how to do X
02:12:21.840 | by watching somebody do X,
02:12:23.160 | and then repeating X many times with different inputs.
02:12:26.360 | I've just been very generic in explaining this,
02:12:28.520 | but I guess this is just my own attitude towards the world.
02:12:31.840 | I never liked following anyone's directions exactly.
02:12:34.960 | Even if you told me the way to do your homework
02:12:37.680 | is to write in pencil, I would say,
02:12:39.320 | "But I think pen is nice.
02:12:41.200 | "Let's try."
02:12:42.040 | (laughs)
02:12:42.880 | So I've been that kind of a funny person,
02:12:45.680 | but I do encourage that if you can learn how to invent
02:12:50.680 | as your core skill, then you can do a lot.
02:12:52.760 | But then the second piece that comes with that
02:12:54.320 | is something I learned from my PhD advisor,
02:12:56.320 | which was, "Well, make sure that what you're working on
02:12:59.880 | "is big enough."
02:13:01.280 | And so in that sense, I usually advise people,
02:13:03.760 | once they have learned how to invent,
02:13:05.680 | ideally, don't just try to settle
02:13:08.840 | for something comfortable.
02:13:10.480 | Try to see if you can aim for something which is hard,
02:13:14.000 | which might involve a campaign,
02:13:15.760 | which might be important, which might make a difference.
02:13:19.000 | And it's more of, I guess, rather than worrying,
02:13:22.880 | what if you didn't achieve that?
02:13:25.800 | There's also the regret of, what if I didn't try?
02:13:30.320 | See, that's how I operate.
02:13:31.560 | I don't operate based on, did I succeed or fail?
02:13:33.680 | It was hard anyway.
02:13:34.600 | If I did this novid thing and the whole thing failed,
02:13:36.600 | would I feel terrible?
02:13:37.600 | No, it's a very hard problem.
02:13:39.440 | But would I have had the regret of not jumping in?
02:13:44.040 | So it's that different mentality of,
02:13:45.520 | don't worry about the failing part as much as
02:13:48.640 | making sure you give yourself the shot
02:13:50.720 | at those potentially unbounded opportunities.
02:13:55.160 | - You almost make it sound like there's a meaning to it all.
02:13:58.280 | Let me ask the big, ridiculous question.
02:13:59.880 | What do you think is the meaning of life?
02:14:01.760 | Or maybe the easier version of that
02:14:04.120 | is what brings your life joy?
02:14:06.040 | - So I'll just answer that one personally.
02:14:07.800 | For me, I'm a little bit weird.
02:14:10.000 | I sort of, I guess you can tell by now.
02:14:13.440 | - See the pen and pencil discussion from earlier, yes.
02:14:15.880 | - Yeah, yeah.
02:14:16.840 | So, I mean, my thing is, I guess I personally
02:14:20.280 | just wanted to maximize a certain score,
02:14:24.280 | which was: for how many person-years
02:14:28.080 | after I'm no longer here anymore
02:14:30.960 | does what I did matter?
02:14:33.880 | And it didn't matter if it's necessarily attributed to me.
02:14:36.440 | It's just like, did it matter?
02:14:38.560 | And so that's what I wanted.
02:14:41.760 | I guess that is very inspired by how scientists work.
02:14:45.560 | It's like, why do we keep talking about Newton?
02:14:47.720 | It's because Newton discovered some interesting things.
02:14:50.800 | And so Newton's score is pretty high.
02:14:53.800 | It's going to be infinity, right?
02:14:56.240 | - Well, let's hope it's infinity, but pretty high.
02:14:58.520 | - Ah, yes, yes.
02:15:00.080 | - So you're going for, with person-years,
02:15:03.720 | you're going for like triple digits.
02:15:05.440 | So like Newton is like four digits,
02:15:08.880 | probably like a thousand years of person lifetimes.
02:15:13.160 | Like, how do you like to think about it? What are we--
02:15:15.280 | - Sorry, I meant people times years.
02:15:17.440 | - People times.
02:15:18.280 | - So then it's like, actually his is huge.
02:15:19.920 | His is like going to be billions or trillions, right?
02:15:22.600 | Trillions, but I guess for me,
02:15:26.360 | I actually changed the metric after a while.
02:15:28.320 | And the reason is because you may have seen,
02:15:30.040 | I found some simple way to solve quadratic equations
02:15:33.600 | that is easier than every textbook.
02:15:35.720 | So my score might already be not bad,
02:15:40.720 | which is why I decided to change it
02:15:44.600 | to count the number of hours within those lifetimes as well.
02:15:44.600 | So the way I was doing it before is that
02:15:49.400 | if a person was sort of remembering or using
02:15:53.440 | or appreciating what I had done
02:15:56.040 | for like 10 years of their life, that would count as 10.
02:16:01.040 | - I see.
02:16:02.480 | - So if there was one person who for 10 years
02:16:04.600 | remembered or appreciated something I did,
02:16:06.200 | that counts as a score of 10, and we add up over all people.
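
A minimal sketch of this score, with made-up numbers: each person contributes the years (and, in the revised version Po mentions, the hours) for which the work mattered to them, summed over all people.

```python
# Made-up data: for each person, (years the work mattered to them,
# hours they actually spent using it within those years).
people = {"alice": (10, 40), "bob": (3, 5), "carol": (25, 200)}

# Original metric: person-years, summed over all people.
person_years = sum(years for years, _ in people.values())

# Revised metric: count the hours of actual use as well.
person_hours = sum(hours for _, hours in people.values())

print(person_years, person_hours)  # 38 245
```
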
02:16:09.360 | And that was with the hypothesis
02:16:13.480 | that the score would be very finite in the sense that
02:16:17.640 | if I didn't come up with anything
02:16:19.040 | that might potentially help a lot of generations
02:16:21.360 | in a forever way, then the score will be finite,
02:16:23.840 | because at some point
02:16:25.960 | people don't remember that you made
02:16:27.880 | like nice bottles or something, right?
02:16:30.160 | But then after the quadratic equation thing,
02:16:33.040 | it was that there's some chance
02:16:35.000 | that that actually might make it into textbooks.
02:16:37.720 | And if it makes it in textbooks,
02:16:39.080 | the chance that there'll be an easier way discovered
02:16:40.960 | is actually quite small.
02:16:42.680 | So in that case, then the score might get bigger.
02:16:46.240 | I was just saying the score might actually
02:16:47.760 | already have been achieved in a non-trivial way.
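
For context, the quadratic approach Po is presumably referring to here is the one he published in 2019: for x^2 + Bx + C = 0, the two roots average to -B/2, so write them as -B/2 plus or minus some u and solve (-B/2)^2 - u^2 = C for u, with no formula to memorize. A sketch under that assumption:

```python
import cmath

def solve_monic_quadratic(B, C):
    # Roots of x^2 + Bx + C = 0: they sum to -B, so both sit at
    # -B/2 plus or minus some u; their product (B/2)^2 - u^2 must
    # equal C, which pins down u directly.
    mid = -B / 2
    u = cmath.sqrt(mid * mid - C)
    return mid + u, mid - u

print(solve_monic_quadratic(-5, 6))  # ((3+0j), (2+0j)) for x^2 - 5x + 6
```
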
02:16:51.240 | - I see.
02:16:52.080 | - Because--
02:16:52.920 | - It's fun to think about, 'cause it could be different.
02:16:54.600 | You can achieve a high score by a small number of people
02:16:58.960 | using it for most of their lifetime
02:17:01.240 | and then generations and generations,
02:17:03.400 | or you can have, if we do disperse,
02:17:05.960 | if we do spread, colonize, become a multi-planetary species,
02:17:10.120 | you could have that little clever way
02:17:13.200 | to solve differential equations.
02:17:15.160 | - Yeah.
02:17:16.000 | - Spread through like trillions of people
02:17:19.640 | as they spread throughout the galaxy.
02:17:21.600 | And each one would only use it
02:17:24.800 | for a few hours in their lifetime,
02:17:26.760 | but their kids will use it,
02:17:28.280 | their kids' kids will use it, it will spread,
02:17:30.080 | and you'll have that impact in that kind of way.
02:17:33.240 | - Yes, so that's why I renormalized it,
02:17:34.920 | because I was like, well, that's kind of dumb,
02:17:36.440 | because what's the importance of that?
02:17:37.720 | That'll save people 15 minutes.
02:17:39.880 | So what I meant is I didn't want to count that
02:17:42.400 | as the main score.
02:17:43.240 | (both laughing)
02:17:45.480 | - Well, I'm gonna have to try to come up
02:17:47.880 | with some kind of device that everyone would want to use,
02:17:50.360 | maybe to make coffee,
02:17:51.520 | 'cause coffee seems to be the prevalent performance
02:17:55.560 | enhancing chemical that everyone uses.
02:17:57.160 | So I'll have to think about those kinds of metrics.
02:17:59.880 | - Yeah.
02:18:00.880 | But you see, that's just giving an idea
02:18:02.560 | of, I guess, what I found meaningful in general,
02:18:06.200 | whether or not that quadratic thing is important.
02:18:08.320 | The general idea was I wanted to do things
02:18:10.720 | that would outlast me.
02:18:11.840 | - Yes.
02:18:12.680 | - And that was what inspired me.
02:18:13.520 | That's just how I choose what problems to work on.
02:18:15.720 | - And that's a kind of immortality,
02:18:17.240 | is ideas that you've invented living on long after you
02:18:22.240 | in the minds of others.
02:18:24.960 | And humans, ultimately,
02:18:27.200 | are like meat vehicles that carry ideas
02:18:32.200 | for just a few years; we may not be the important thing.
02:18:34.920 | It might be the ideas that we carry with us
02:18:37.560 | and invent new ones.
02:18:38.760 | Like we get a bunch of baby ideas in our head.
02:18:41.680 | We borrow them from others,
02:18:43.160 | and then maybe we invent a new one
02:18:45.080 | and then that one might have a life of its own.
02:18:47.960 | And it's fun.
02:18:49.560 | It's fun to think about that idea of living
02:18:51.200 | for many centuries to come, unless we destroy ourselves.
02:18:54.960 | But maybe AI will borrow it and we'll remember Po
02:18:58.920 | as like that one human that helped us out
02:19:01.840 | before we of course killed him
02:19:04.680 | and the rest of human civilization.
02:19:06.760 | On that note, Po, this is a huge honor.
02:19:09.840 | You're one of the greatest educators
02:19:12.880 | I've ever gotten a chance to interact with.
02:19:15.240 | So it's truly an honor that you would talk with me today.
02:19:18.520 | It especially means a lot that you would travel out
02:19:21.160 | to Austin to talk to me.
02:19:22.480 | It really means a lot.
02:19:23.480 | So thank you so much.
02:19:25.180 | Keep on inspiring.
02:19:26.480 | And I'm one of your many, many students.
02:19:30.320 | Thank you so much for talking today.
02:19:32.160 | - Thank you.
02:19:32.980 | Thank you.
02:19:33.820 | It's actually a real honor for me to talk to you
02:19:34.640 | and to get this chance to have this
02:19:36.680 | really intellectual conversation
02:19:38.480 | through all of these topics.
02:19:39.800 | - Thanks, Po.
02:19:41.440 | Thanks for listening to this conversation
02:19:44.240 | with Po-Shen Loh.
02:19:47.040 | And thank you to the Jordan Harbinger Show,
02:19:51.400 | Onnit, BetterHelp, Eight Sleep, and Element.
02:19:51.400 | Check them out in the description to support this podcast.
02:19:54.480 | And now let me leave you with some words from Isaac Newton.
02:19:58.320 | I can calculate the motion of heavenly bodies,
02:20:01.280 | but not the madness of people.
02:20:03.700 | Thank you for listening.
02:20:04.640 | I hope to see you next time.
02:20:06.360 | (upbeat music)