
Garry Kasparov: Chess, Deep Blue, AI, and Putin | Lex Fridman Podcast #46


Chapters

0:00 Intro
1:33 Garry Kasparov
4:50 Haunted by demons
9:03 Creating new moves
16:42 Ranking Magnus Carlsen
30:52 Chess as the epitome
31:22 Inside humans
38:42 Machine safety
42:02 Lessons from the USSR
44:40 Stalin and Hitler
45:52 Life in danger
49:22 Did Russia interfere in 2016 US election


00:00:00.000 | The following is a conversation with Garry Kasparov.
00:00:03.560 | He's considered by many to be the greatest
00:00:05.880 | chess player of all time.
00:00:07.680 | From 1986 until his retirement in 2005,
00:00:11.240 | he dominated the chess world,
00:00:13.360 | ranking world number one for most of those 19 years.
00:00:17.200 | While he has many historical matches
00:00:18.880 | against human chess players,
00:00:20.840 | in the long arc of history,
00:00:22.600 | he may be remembered for his match against the machine,
00:00:26.480 | IBM's Deep Blue.
00:00:28.840 | His initial victories and eventual loss to Deep Blue
00:00:31.840 | captivated the imagination of the world
00:00:34.680 | as to what role artificial intelligence systems
00:00:36.920 | may play in our civilization's future.
00:00:39.720 | That excitement inspired an entire generation
00:00:42.400 | of AI researchers, including myself,
00:00:45.160 | to get into the field.
00:00:47.240 | Garry is also a pro-democracy political thinker and leader,
00:00:51.320 | a fearless human rights activist,
00:00:53.280 | and author of several books,
00:00:54.960 | including "How Life Imitates Chess,"
00:00:57.320 | which is a book on strategy and decision making,
00:01:00.000 | "Winter is Coming," which is a book
00:01:02.120 | articulating his opposition to the Putin regime,
00:01:05.040 | and "Deep Thinking," which is a book
00:01:07.160 | on the role of both artificial intelligence
00:01:09.480 | and human intelligence in defining our future.
00:01:13.600 | This is the Artificial Intelligence Podcast.
00:01:16.560 | If you enjoy it, subscribe on YouTube,
00:01:19.080 | give it five stars on iTunes,
00:01:21.080 | support it on Patreon,
00:01:22.320 | or simply connect with me on Twitter,
00:01:24.360 | @lexfriedman, spelled F-R-I-D-M-A-N.
00:01:28.160 | And now, here's my conversation with Garry Kasparov.
00:01:32.400 | As perhaps the greatest chess player of all time,
00:01:35.960 | when you look introspectively at your psychology
00:01:39.080 | throughout your career,
00:01:40.720 | what was the bigger motivator,
00:01:42.560 | the love of winning or the hatred of losing?
00:01:45.280 | - Tough question.
00:01:47.840 | Have to confess, I never heard it before,
00:01:51.800 | which is, again, congratulations.
00:01:53.720 | It's quite an accomplishment.
00:01:55.120 | Losing was always painful.
00:02:01.200 | For me, it was almost like a physical pain
00:02:04.480 | because I knew that if I lost the game,
00:02:08.120 | it's just because I made a mistake.
00:02:11.400 | I always believed that the result of the game
00:02:16.520 | had to be decided by the quality of my play.
00:02:20.960 | Okay, you may say it sounds arrogant,
00:02:24.360 | but it helped me to move forward
00:02:26.480 | because I always knew that there was room for improvement.
00:02:30.000 | - Was there the fear of the mistake?
00:02:32.920 | - Actually, fear of mistake guarantees mistakes.
00:02:36.640 | And the difference between top players,
00:02:39.360 | the very top, is the ability to make a decision
00:02:44.360 | without predictable consequences.
00:02:46.360 | You don't know what's happening.
00:02:47.520 | It's just intuitively.
00:02:48.640 | I can go this way or that way.
00:02:50.560 | And there are always hesitations.
00:02:52.640 | People are like, "You are just at the crossroads.
00:02:55.000 | You can go right, you can go left, you can go straight.
00:02:57.200 | You can turn and go back."
00:02:58.720 | And the consequences are just very uncertain.
00:03:03.120 | Yes, you have certain ideas what happens on the right
00:03:07.000 | or on the left, or just if you go straight,
00:03:09.000 | but it's not enough to make a well-calculated choice.
00:03:13.400 | And when you play chess at the very top,
00:03:16.360 | it's about your inner strength.
00:03:18.720 | So I can make this decision.
00:03:21.080 | I will stand firm, and I'm not going to waste my time
00:03:24.880 | because I have full confidence that I will go through.
00:03:28.720 | Now, going back to your original question,
00:03:32.120 | I would say neither.
00:03:34.320 | It's just, love for winning, hate for losing,
00:03:39.040 | they were important elements, psychological elements.
00:03:41.640 | But the key element, I would say the driving force
00:03:46.960 | was always my passion for making a difference.
00:03:51.960 | It's just, I can move forward, and I can always enjoy,
00:03:58.360 | not just playing, but creating something new.
00:04:01.680 | - Creating something new.
00:04:03.160 | How do you think about that?
00:04:04.560 | - It's just finding new ideas in the openings,
00:04:07.480 | some original plan in the middle game.
00:04:09.520 | Actually, that helped me to make the transition
00:04:13.320 | from the game of chess, where I was on the very top,
00:04:15.960 | to another life, where I knew I would not be number one.
00:04:20.200 | I would not be necessarily on the top,
00:04:22.120 | but I could still be very active and productive
00:04:26.760 | by my ability to make a difference,
00:04:30.280 | by influencing people, say,
00:04:32.440 | joining the democratic movement in Russia,
00:04:34.960 | or talking to people about human-machine relations.
00:04:38.600 | There's so many things where I knew my influence
00:04:41.960 | may not be as decisive as in chess,
00:04:45.760 | but still strong enough to help people
00:04:49.080 | to make their choices.
00:04:51.680 | - So you can still create something new
00:04:53.160 | that makes a difference in the world outside of chess.
00:04:57.200 | But wait, you've kind of painted a beautiful picture
00:05:01.960 | of your motivations in chess, to create something new,
00:05:04.280 | to look for those moments of some brilliant new ideas.
00:05:08.520 | But were you haunted by something?
00:05:11.120 | So you make it seem like to be at the level you're at,
00:05:15.160 | you can get away without having demons,
00:05:18.520 | without having fears,
00:05:21.800 | without being driven by some of the darker forces.
00:05:25.440 | - I mean, you sound almost religious.
00:05:29.440 | You know, darker forces, spiritual demons.
00:05:32.840 | I mean, do we have to call for a priest?
00:05:34.680 | (both laughing)
00:05:36.640 | - That's what I'm addressing.
00:05:37.480 | - Now, let's go back to these crucial chess moments,
00:05:42.400 | where I had to make big decisions.
00:05:44.480 | As I said, it was all about my belief from very early days
00:05:49.480 | that I can make all the difference by playing well
00:05:52.920 | or by making mistakes.
00:05:53.960 | So yes, I always had an opponent across the chessboard,
00:05:58.960 | opposite me.
00:06:00.240 | But no matter how strong the opponent was,
00:06:02.920 | whether just an ordinary player
00:06:04.360 | or another world champion like Anatoly Karpov,
00:06:07.720 | having all respect for my opponent,
00:06:10.880 | I still believed that it was up to me to make the difference.
00:06:14.400 | And I knew I was not invincible.
00:06:19.400 | I made mistakes.
00:06:22.560 | I made some blunders.
00:06:23.800 | And with age, I made more blunders.
00:06:28.200 | So I knew it, but still, you know,
00:06:32.360 | it was very much up to me to be the decisive factor in the game.
00:06:37.000 | I mean, even now, look,
00:06:37.840 | I just, you know, my latest chess experience was horrible.
00:06:40.360 | I mean, I played Fabiano Caruana,
00:06:44.920 | the number two, number three player in the world these days.
00:06:47.840 | We played chess 960, the so-called Fischer random chess,
00:06:51.760 | with reshuffled pieces.
00:06:52.960 | Yeah, I lost very badly, but it's because I made mistakes.
00:06:56.240 | I mean, I had so many winning positions.
00:06:57.720 | I mean, 15 years ago, I would have crushed him.
00:07:00.000 | So, and it's, you know, while I lost,
00:07:03.640 | I was not so much upset.
00:07:05.160 | I mean, I know, as I said in the interview,
00:07:08.280 | I can fight any opponent,
00:07:09.440 | but not my biological clock.
00:07:10.840 | So fighting time is always a losing proposition.
00:07:15.840 | But even today at age 56, you know,
00:07:18.920 | I knew that, you know, I could play a great game.
00:07:22.120 | I couldn't finish it because I didn't have enough energy
00:07:24.400 | or just, you know,
00:07:25.560 | I couldn't have the same level of concentration.
00:07:27.640 | But, you know, in a number of games
00:07:29.720 | where I completely outplayed one of the top players
00:07:31.960 | in the world, I mean,
00:07:32.800 | it gave me a certain amount of pleasure.
00:07:35.360 | That is, even today, I haven't lost my touch.
00:07:38.440 | Not the same, you know, okay,
00:07:41.000 | the jaws are not as strong and the teeth are not as sharp,
00:07:44.760 | but I could get him just, you know,
00:07:47.040 | almost, you know, on the ropes.
00:07:49.120 | - You still got it.
00:07:49.960 | - Still got it.
00:07:50.800 | And it's, you know, and it's,
00:07:52.320 | I think it's my wife said it well.
00:07:54.560 | I mean, she said, "Look, Garry,
00:07:55.920 | it's somehow, it's not you just fighting
00:07:58.240 | your biological clock.
00:07:59.720 | It's just, you know, maybe it's a signal."
00:08:01.480 | Because, you know, the goddess of chess,
00:08:03.720 | since you spoke of spirits, right?
00:08:04.760 | - Yeah, religiously.
00:08:05.680 | - The goddess of chess, Caissa,
00:08:07.560 | maybe she didn't want you to win
00:08:09.000 | because, you know, if you could beat number two,
00:08:13.120 | number three player in the world,
00:08:14.480 | I mean, that's one of the top players
00:08:16.600 | who just recently played a world championship match.
00:08:19.280 | If you could beat him,
00:08:20.840 | it's not really good for the game of chess.
00:08:23.000 | Because what will people say?
00:08:24.840 | "Oh, look, the game of chess, you know,
00:08:26.280 | it's not making any progress.
00:08:28.400 | The game is just, you know, it's totally devalued
00:08:31.200 | because, look, the guy coming out of retirement,
00:08:34.080 | you know, just, you know, winning games."
00:08:35.640 | Maybe that was good for chess.
00:08:37.080 | Not good for you. But look,
00:08:39.200 | I've been following your logic.
00:08:41.560 | We should always look for, you know, demons,
00:08:43.480 | you know, superior forces,
00:08:45.720 | and other things that could, you know,
00:08:47.520 | if not dominate our lives,
00:08:49.000 | then somehow, you know, play a significant role
00:08:52.680 | in the outcome.
00:08:55.600 | - Yeah, so the goddess of chess had to send a message.
00:08:58.280 | - Yeah, that's okay, okay.
00:08:59.880 | So, Garry, you should do something else.
00:09:02.680 | Time.
00:09:04.040 | - Now, for a question that you have heard before,
00:09:06.960 | but give me a chance.
00:09:09.360 | You've dominated the chess world for 20 years,
00:09:12.320 | you even still got it.
00:09:14.280 | Is there a moment, you said,
00:09:15.920 | you always look to create something new.
00:09:17.560 | Are there games or moments
00:09:21.360 | that you're especially proud of,
00:09:23.400 | in terms of your brilliance,
00:09:25.280 | of a new creative move?
00:09:27.240 | You've talked about Mikhail Tal
00:09:28.640 | as somebody who was an aggressive and creative chess player.
00:09:31.920 | In your own games?
00:09:33.520 | - Look, you mentioned Mikhail Tal.
00:09:35.600 | He was a very aggressive, very sharp player,
00:09:39.440 | famous for his combinations and sacrifices,
00:09:42.280 | even called the Magician from Riga
00:09:44.160 | for his very unique style.
00:09:47.440 | But any world champion, you know,
00:09:50.960 | was a creator.
00:09:53.200 | Some of them were flamboyant and flashy, like Tal.
00:09:57.640 | Some of them were, you know,
00:09:59.200 | less discernible at the chessboard,
00:10:02.960 | like Tigran Petrosian,
00:10:04.040 | but every world champion,
00:10:05.680 | every top player brought something
00:10:07.840 | into the game of chess.
00:10:09.080 | And each contribution was priceless
00:10:11.720 | because it's not just about sacrifices.
00:10:13.720 | Of course, amateurs, they enjoy, you know,
00:10:15.960 | the brilliant games where pieces are being sacrificed.
00:10:18.880 | It's all just, you know,
00:10:20.120 | all pieces are hanging,
00:10:21.560 | and all of a sudden, you know,
00:10:24.080 | being material down, a rook down,
00:10:26.360 | or, you know, a queen down,
00:10:28.320 | the weaker side delivers the final blow
00:10:33.920 | and, you know, mates the opponent's king.
00:10:36.200 | But there are other kinds of beauty.
00:10:38.560 | I mean, it's a slow positional maneuvering,
00:10:41.120 | you know, looking for weaknesses
00:10:42.520 | and just gradually strangling your opponent
00:10:47.280 | and eventually delivering sort of a positional masterpiece.
00:10:51.720 | So I think I made more difference in the game of chess
00:10:55.280 | than I could have imagined when I started playing.
00:10:58.800 | And the reason I thought it was time for me to leave
00:11:01.600 | was that, I mean, I knew that I was not,
00:11:04.760 | I was no longer in the position
00:11:08.400 | to bring the same kind of contribution,
00:11:13.400 | the same kind of new knowledge into the game.
00:11:18.360 | So, and going back,
00:11:21.680 | I could immediately look at my games
00:11:24.200 | against Anatoly Karpov.
00:11:25.320 | It's not just that I won the match in 1985
00:11:27.840 | and became a world champion at age 22,
00:11:31.200 | but there were at least two games in that match.
00:11:34.640 | Of course, the last one, game 24,
00:11:36.720 | that was the decisive game of the match,
00:11:38.160 | I won and became world champion.
00:11:40.800 | But also the way I won, it was a very sharp game
00:11:45.680 | and I found a unique maneuver that was absolutely new
00:11:48.960 | and it has become quite typical now.
00:11:52.360 | Though when the move was first made on the board
00:11:57.360 | and put on display, a lot of people thought it was ugly.
00:12:00.480 | So, and another game, game 16 in the match
00:12:03.680 | where I just also managed to outplay Karpov completely
00:12:06.840 | with black pieces, just paralyzing his entire army
00:12:10.880 | in its own camp.
00:12:13.320 | - Technically or psychologically,
00:12:14.840 | or was that a mix of both in game 16?
00:12:17.600 | - Yeah, I think it was a big blow to Karpov.
00:12:20.000 | I think it was a big psychological victory
00:12:21.880 | for a number of reasons.
00:12:22.960 | One, the score was equal at the time
00:12:25.320 | and the world champion, by the rules,
00:12:28.080 | could retain his title in case of a tie.
00:12:30.800 | So before game 16, we still had nine games to go.
00:12:35.120 | And also it was some sort of a bluff
00:12:37.120 | because neither me nor Karpov saw the refutation
00:12:41.560 | of this opening idea.
00:12:42.920 | And I think for Karpov it was a double blow,
00:12:46.680 | no, a triple blow, not just that he lost the game.
00:12:48.880 | He lost the game, it was a brilliant game,
00:12:51.000 | and I played impeccably after just this opening bluff.
00:12:55.280 | And only then they discovered that it was a bluff.
00:12:57.480 | Again, I didn't know; I was not bluffing.
00:13:00.520 | So that's why it happens very often.
00:13:03.160 | Some ideas could be refuted and it's just,
00:13:05.200 | what I found out, and this is again,
00:13:07.080 | going back to your spiritual theme,
00:13:09.200 | is that you could spend a lot of time working.
00:13:13.000 | And when I say you could, I mean in the '80s, in the '90s.
00:13:16.400 | It doesn't happen these days
00:13:17.280 | because everybody has a computer.
00:13:18.880 | You can immediately see if it works or it doesn't work.
00:13:21.400 | The machine shows you the refutation in a split second.
00:13:24.600 | But much of our analysis in the '80s or in the '90s
00:13:28.640 | was not perfect simply because we're humans
00:13:31.560 | and you analyze the game, you look for some fresh ideas.
00:13:36.000 | And then just it happens that there was something
00:13:39.280 | that you missed because the level of concentration
00:13:42.720 | at the chessboard is different from the one
00:13:45.080 | you have when you analyze the game,
00:13:46.360 | just moving the pieces around.
00:13:47.880 | But somehow if you spend a lot of time
00:13:52.480 | at the chessboard preparing,
00:13:54.120 | so in your studies with your coaches,
00:13:58.120 | hours and hours and hours,
00:13:59.840 | and nothing of what you found had materialized
00:14:04.680 | on the chessboard.
00:14:08.920 | Somehow these hours helped, I don't know why,
00:14:12.240 | always helped you.
00:14:13.360 | It's as if the amount of work you did
00:14:16.960 | could be transformed into some sort of spiritual energy
00:14:23.000 | that helped you to come up with other great ideas
00:14:26.040 | at the board.
00:14:27.320 | Again, even if there was no direct connection
00:14:30.280 | between your preparation and your victory in the game,
00:14:33.840 | there was always some sort of invisible connection
00:14:37.680 | between the amount of work you did,
00:14:39.840 | your dedication,
00:14:42.080 | and your passion to discover new ideas,
00:14:45.280 | and your ability during the game at the chessboard
00:14:48.320 | when the clock was ticking,
00:14:49.280 | we still had a ticking clock, not a digital clock, at the time.
00:14:52.040 | So to come up with some brilliance.
00:14:55.120 | And I also can mention many games from the '90s.
00:14:59.760 | Obviously, all amateurs would pick my game
00:15:04.200 | against Veselin Topalov in 1999,
00:15:06.440 | in Wijk aan Zee, again, because it was a brilliant game.
00:15:09.640 | The Black King traveled from its own camp
00:15:13.560 | into White's camp, across the entire board.
00:15:17.720 | It doesn't happen often, trust me, as you know,
00:15:20.560 | in the games with professional players,
00:15:23.280 | top professional players.
00:15:24.480 | So that's why visually it was one
00:15:26.680 | of the most impressive victories.
00:15:28.320 | But I could bring to your attention
00:15:32.400 | many other games that were not so impressive for amateurs,
00:15:37.400 | not so beautiful, just because a sacrifice is
00:15:42.560 | always beautiful, you sacrifice pieces.
00:15:44.840 | And then eventually you have very few resources left,
00:15:48.840 | and you use them just to crush your opponent, basically.
00:15:53.840 | You have to mate the king,
00:15:55.680 | because you have almost nothing left at your disposal.
00:16:00.680 | But up to the very end, again, less and less,
00:16:06.160 | but still up to the very end,
00:16:07.800 | I always had games with some sort of interesting ideas
00:16:11.880 | and games that gave me great satisfaction.
00:16:14.800 | But I think what happened from 2005 up to these days
00:16:19.800 | was also a very big accomplishment,
00:16:24.600 | since I had to find myself, to sort of relocate myself.
00:16:28.640 | - Yeah, rechannel the creative energies
00:16:30.680 | into other pursuits. - Exactly.
00:16:31.520 | And to find something where I feel comfortable,
00:16:35.920 | even confident that my participation
00:16:39.400 | still makes the difference.
00:16:41.360 | - Beautifully put.
00:16:42.560 | So let me ask perhaps a silly question,
00:16:44.880 | but sticking on chess for just a little longer.
00:16:48.440 | Where do you put Magnus Carlsen,
00:16:49.960 | the current world champion in the list of all-time greats?
00:16:52.880 | In terms of style, moments of brilliance, consistency?
00:16:57.880 | - It's a tricky question.
00:16:59.520 | The moment you start ranking
00:17:01.600 | a world champion-- - Yeah, you lose something?
00:17:04.560 | - I think it's not fair,
00:17:06.840 | because any new generation
00:17:12.040 | knows much more about the game than the previous one.
00:17:15.120 | So when people say, "Oh, Garry was the greatest,
00:17:17.560 | "Fischer was the greatest, Magnus was the greatest,"
00:17:19.720 | it disregards the fact that the great players of the past,
00:17:24.360 | whether Lasker or Capablanca or Alekhine,
00:17:26.800 | I mean, they knew so little about chess by today's standards.
00:17:29.920 | I mean, today, just any kid that spent a few years
00:17:32.720 | with his or her chess computer
00:17:36.760 | knows much more about the game,
00:17:38.120 | simply because you have access to this information.
00:17:40.960 | And it has been discovered generation after generation,
00:17:43.480 | we added more and more knowledge to the game of chess.
00:17:46.880 | It's about the gap between the world champion
00:17:50.360 | and the rest of the field.
00:17:51.760 | Now, if you look at the gap,
00:17:55.240 | then probably Fisher could be on top,
00:17:58.320 | but very short period of time.
00:17:59.800 | Then you should also add a time factor.
00:18:01.960 | I was on top, not as big as Fisher, but much longer.
00:18:06.320 | So, and also, unlike Fisher,
00:18:08.440 | I succeeded in beating next generation.
00:18:11.800 | - Here's the question.
00:18:13.280 | Let's see if you still got the fire,
00:18:14.800 | speaking of the next generation,
00:18:15.960 | because you did succeed beating the next generation.
00:18:19.000 | - It's close.
00:18:19.840 | Okay, Anand, short Anand, the sheer of,
00:18:22.920 | Kramnik is already 12 years younger, so that's a neck.
00:18:25.560 | But still yet, I competed with them
00:18:28.200 | and I just had beat most of them.
00:18:30.000 | And I was still dominant when I left at age 41.
00:18:34.240 | So, back to Magnus.
00:18:36.840 | Magnus, I mean, consistency is phenomenal.
00:18:40.280 | The reason Magnus is on top and it seems unbeatable today,
00:18:45.280 | Magnus is a lethal combination of Fisher and Karpov,
00:18:50.720 | which is very, it's very unusual,
00:18:52.960 | because Fisher's style was very dynamic,
00:18:54.880 | just fighting to the last point,
00:18:56.760 | I mean, just using every resource available.
00:18:59.160 | Karpov was very different.
00:19:01.360 | It's just, he had an unparalleled ability
00:19:04.680 | to use every piece with a maximum effect.
00:19:07.600 | Just its minimal resources always produce maximum effect.
00:19:12.080 | So, now imagine that you merge these two styles.
00:19:15.160 | So, it's like squeezing every stone for a drop of water,
00:19:20.160 | but doing it just for 50, 60, 70, 80 moves.
00:19:25.080 | I mean, Magnus could go on as long as Fisher
00:19:27.320 | with all his passion and energy,
00:19:28.960 | and at the same time being as meticulous
00:19:31.480 | and deadly as Karpov by just using every little advantage.
00:19:36.480 | So, and he has good, very good health, it's important.
00:19:41.280 | I mean, physical conditions are, by the way, very important.
00:19:43.480 | So, a lot of people don't recognize it.
00:19:44.800 | There are latest studies shows that chess players
00:19:46.960 | burn thousands of calories during the game.
00:19:50.960 | So, that puts him on the top of this field
00:19:54.560 | of the world champions.
00:19:56.000 | But again, it's the discussion that is,
00:19:59.480 | I saw recently in the internet,
00:20:00.720 | whether Garry Kasparov of his peak,
00:20:03.200 | let's say late 80s, could beat Magnus Carlsen today.
00:20:06.600 | I mean, it's totally irrelevant
00:20:07.560 | because Garry Kasparov in 1989, okay,
00:20:10.920 | he's played great chess,
00:20:12.760 | but still I knew very little about chess
00:20:15.200 | compared to Magnus Carlsen in 2019,
00:20:17.160 | who, by the way, learned from me as well.
00:20:18.840 | So, that's why, yeah.
00:20:20.440 | I'm extremely cautious in making any judgment
00:20:24.040 | that involves time gaps.
00:20:26.560 | You ask soccer fans, so who is your favorite?
00:20:30.120 | Pelé, Maradona, or Messi?
00:20:31.920 | Yeah. Yeah.
00:20:33.040 | Who's your favorite?
00:20:33.960 | Messi. Messi.
00:20:35.360 | Yeah, why?
00:20:36.720 | Because- Maybe Maradona, maybe.
00:20:38.520 | No, because you're younger, but that's simple.
00:20:40.040 | Your instinctive answer is correct
00:20:41.720 | because you didn't see Maradona in action.
00:20:44.480 | I saw all of them in action, so that's why.
00:20:47.120 | But since when I was just following it,
00:20:50.520 | it's Pelé and Maradona, they were big stars,
00:20:53.680 | and it's Messi's already just,
00:20:55.280 | I was gradually losing interest in just other things.
00:20:58.800 | So, I remember Pelé in 1970, the final match, Brazil-Italy.
00:21:02.200 | So, that's the first World Cup soccer I watched.
00:21:05.760 | So, that's the...
00:21:06.800 | And actually, my answer, when I just, you know,
00:21:10.360 | because I was asked this question as well.
00:21:12.480 | So, I say that it's just,
00:21:13.800 | while it's impossible to make a choice,
00:21:15.600 | I would still probably go with Maradona for a simple reason.
00:21:18.440 | The Brazilian team in 1970 could have won without Pelé.
00:21:21.920 | It was absolutely great.
00:21:23.160 | Still could have won.
00:21:24.080 | Maybe, but it is...
00:21:25.880 | The Argentinian team in 1986 without Maradona
00:21:28.680 | would not be in the final.
00:21:29.760 | So, this is...
00:21:30.600 | And Messi, he still hasn't won a title.
00:21:33.320 | That's...
00:21:34.160 | - Could argue for that for an hour,
00:21:35.600 | but you could say, if you ask Maradona,
00:21:38.680 | if you look in his eyes,
00:21:39.880 | especially, let's say, Garry Kasparov in 1989,
00:21:43.480 | he would have said,
00:21:45.200 | "I was sure as hell would beat Magnus Carlsen."
00:21:48.360 | - Yeah, just simply because...
00:21:49.200 | - The confidence, the fire.
00:21:50.440 | - Simply because, simply because, again,
00:21:52.680 | it's just, they saw me in action.
00:21:54.040 | So, this, again, it's the age factor that's important.
00:21:56.840 | Therefore, with the passion and energy
00:21:58.600 | and being equipped with all modern ideas,
00:22:01.680 | but, again, then you make a very just important assumption
00:22:05.840 | that you could empower Garry Kasparov in '89
00:22:08.680 | with all ideas that have been accumulated over 30 years.
00:22:11.520 | That would not be Garry Kasparov.
00:22:12.680 | That would be someone else.
00:22:14.280 | Because, again, I belong to 1989.
00:22:16.240 | I was way ahead of the field,
00:22:18.920 | and I beat Karpov several times
00:22:21.720 | in the world championship matches,
00:22:23.120 | and I crossed 2,800, which, by the way,
00:22:26.320 | if you look at the rating, which is just, it's...
00:22:29.400 | Even today, so this is the rating that I retired,
00:22:33.600 | so it's still, you know, it's just, it's a top two, two, three.
00:22:37.000 | So, that's, it's Karwana and Deag.
00:22:38.760 | It's about the same rating now.
00:22:40.480 | And I crossed 2,800 in 1990.
00:22:43.280 | Well, just you look at the inflation.
00:22:44.880 | When I crossed 2,800 in 1990,
00:22:47.360 | there was only one player in 2,700 category,
00:22:49.640 | Anatoly Karpov.
00:22:50.640 | Now, we had more than 50.
00:22:52.160 | So, just, when you see this, so if you add inflation,
00:22:55.080 | so I think my 2,851, it could probably,
00:22:58.080 | could be more valuable as Magnus 2,882,
00:23:02.080 | which was his highest rating.
00:23:04.160 | But anyway, again, too many hypotheticals.
00:23:07.040 | - You're lost to IBM DBlue in 1997.
00:23:10.880 | In my eyes, that is one of the most seminal moments
00:23:13.360 | in the history.
00:23:14.960 | Again, I apologize for being romanticizing the notion,
00:23:18.520 | but in the history of our civilization,
00:23:20.440 | because humans, as a civilization,
00:23:25.200 | for centuries saw chess as the peak
00:23:27.920 | of what man can accomplish, of intellectual mastery.
00:23:30.680 | And that moment when a machine could beat a human being
00:23:36.560 | was inspiring to just an entire,
00:23:40.160 | anyone who cares about science, innovation,
00:23:43.280 | an entire generation of AI researchers.
00:23:46.280 | And yet, to you, that loss, at least reading your face,
00:23:50.800 | seemed like a tragedy, extremely painful.
00:23:53.600 | Like you said, physically painful.
00:23:56.560 | When you look back at your psychology of that loss,
00:23:59.720 | why was it so painful?
00:24:01.240 | Were you not able to see the seminal nature of that moment?
00:24:05.640 | Or was that exactly why it was that painful?
00:24:10.240 | - As I already said, losing was painful,
00:24:15.680 | physically painful.
00:24:17.120 | And the match I lost in 1997
00:24:19.040 | was not the first match I lost to a machine.
00:24:22.080 | It was the first match I lost, period.
00:24:24.040 | - Yeah.
00:24:24.880 | - That's...
00:24:25.720 | - Oh, wow.
00:24:28.160 | Oh, wow.
00:24:30.480 | - Yeah, it's...
00:24:31.880 | - Right.
00:24:32.720 | - Yeah, that makes all the difference to me.
00:24:35.920 | - Yes.
00:24:36.760 | - First time I lost, it's just...
00:24:38.440 | Now, I lost, and the reason I was so angry
00:24:42.440 | was that I just, you know, had suspicions
00:24:46.200 | that my loss was not just a result of my bad play.
00:24:49.240 | - Yes.
00:24:50.080 | - So though I played quite poorly, you know,
00:24:51.600 | just when you start looking at the games today,
00:24:53.320 | I made tons of mistakes.
00:24:54.960 | But, you know, I had all reasons to believe
00:24:57.680 | that, you know, there were other factors
00:25:00.240 | that had nothing to do with the game of chess.
00:25:01.800 | And that's why I was angry.
00:25:03.000 | But look, it was 22 years ago.
00:25:05.520 | It's water under the bridge.
00:25:07.280 | We can analyze this match,
00:25:09.040 | and with everything you said,
00:25:10.720 | I agree, with probably one exception:
00:25:13.360 | that considering chess, you know,
00:25:16.440 | as a pinnacle of intellectual activities,
00:25:20.280 | was our mistake.
00:25:21.400 | Because, you know, we just thought,
00:25:23.040 | oh, it's a game of the highest intellect,
00:25:25.760 | and it's just, you know, you have to be so, you know,
00:25:28.040 | intelligent, and you could see things that, you know,
00:25:30.640 | the ordinary mortals could not see.
00:25:35.080 | It's a game.
00:25:37.400 | And all machines had to do in this game
00:25:41.160 | is just to make fewer mistakes,
00:25:42.720 | not to solve the game,
00:25:44.240 | because the game cannot be solved.
00:25:45.560 | I mean, according to Claude Shannon,
00:25:46.800 | the number of legal positions is 10 to the 46th power.
00:25:50.080 | Too many zeros, you know,
00:25:51.200 | just for any computer to finish the job, you know,
00:25:54.520 | in the next few billion years.
00:25:58.600 | But it doesn't have to.
00:26:00.200 | It's all about making fewer mistakes.
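
A minimal back-of-the-envelope sketch of that claim, assuming a hypothetical machine that evaluates one billion positions per second (the 10^46 figure is the estimate quoted above; the evaluation rate is an assumption):

```python
# Rough feasibility check: how long would it take to visit 10^46 positions?
POSITIONS = 10 ** 46              # estimate quoted above
RATE = 10 ** 9                    # positions evaluated per second (assumed)
SECONDS_PER_YEAR = 3600 * 24 * 365

years = POSITIONS / (RATE * SECONDS_PER_YEAR)
print(f"{years:.1e} years")       # ~3.2e+29 years, far beyond "a few billion"
```
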
00:26:02.680 | And I think this match, actually,
00:26:04.880 | and what happened afterwards with other games,
00:26:07.240 | with Go, with Shogi, with video games,
00:26:12.240 | is a demonstration that machines
00:26:16.040 | will always beat humans in what I call closed systems.
00:26:19.360 | The moment you build a closed system,
00:26:22.000 | no matter how the system is called,
00:26:23.680 | chess, Go, Shogi, Dota,
00:26:27.000 | machines will prevail simply
00:26:30.720 | because they will bring down the number of mistakes.
00:26:35.960 | Machines don't have to solve it.
00:26:37.600 | The way they outplay us,
00:26:40.800 | it's not by just being more intelligent.
00:26:43.360 | It's by doing something else,
00:26:46.080 | eventually capitalizing on our mistakes.
00:26:49.800 | When you look at the chess machines' ratings today
00:26:52.760 | and compare them to Magnus Carlsen's,
00:26:55.200 | it's the same as comparing a Ferrari to Usain Bolt.
00:26:58.240 | The gap, I mean, by chess standards, is insane.
00:27:04.240 | 3,400 or 3,500 to Magnus's 2,800 or 2,850.
00:27:08.680 | It's like the difference between Magnus
00:27:10.400 | and an ordinary player
00:27:12.200 | from an open international tournament.
00:27:14.120 | It's not because the machine understands chess
00:27:17.600 | better than Magnus Carlsen,
00:27:19.400 | but simply because it's steady.
00:27:21.720 | The machine has a steady hand.
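
To put that gap in numbers, the standard Elo expected-score formula is E = 1 / (1 + 10^((R_opponent - R_player) / 400)). A minimal sketch, using illustrative round ratings rather than official figures:

```python
# Expected score (win = 1, draw = 0.5) for a player under the Elo model.
def expected_score(r_player: float, r_opponent: float) -> float:
    return 1.0 / (1.0 + 10 ** ((r_opponent - r_player) / 400.0))

# A roughly 2,850-rated human against a roughly 3,500-rated engine (assumed numbers):
print(f"{expected_score(2850, 3500):.3f}")  # ~0.023, about 2 points per 100 games
```

A 650-point gap is indeed about the distance between a world champion and a strong open-tournament player, as the conversation notes.
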
00:27:23.840 | And I think that is what we have to learn
00:27:28.520 | from the 1997 experience and from further encounters
00:27:32.600 | with computers and sort of the current state of affairs
00:27:36.360 | with AlphaZero, beating other machines.
00:27:40.040 | The idea that we can compete with computers
00:27:43.600 | in so-called intellectual fields,
00:27:45.360 | it was wrong from the very beginning.
00:27:49.120 | And by the way,
00:27:50.880 | the 1997 match was not the first victory of machines over.
00:27:54.880 | - Over grandmasters.
00:27:56.720 | - Over grandmasters.
00:27:57.640 | No, actually, I played against
00:28:00.720 | the first decent chess computers from the late '80s.
00:28:04.120 | So I played with the prototype of Deep Blue
00:28:07.160 | called Deep Thought in 1989,
00:28:09.240 | two rapid chess games in New York.
00:28:10.560 | I won both games handily.
00:28:13.080 | We played against new chess engines
00:28:16.280 | like Fritz and other programs.
00:28:18.760 | And then there was the Israeli program Junior
00:28:21.320 | that appeared in 1995.
00:28:22.160 | - Right, right, right, I remember.
00:28:23.240 | - Yeah, so there were several programs.
00:28:25.440 | I, you know, I lost a few games in blitz.
00:28:28.240 | I lost one match against a computer,
00:28:30.600 | a chess engine, in 1994, rapid chess.
00:28:33.120 | So I lost one game to Deep Blue in the 1996 match,
00:28:36.120 | the match I won.
00:28:38.160 | Some people, you know, tend to forget about it,
00:28:40.000 | that I won the first match.
00:28:41.360 | - Yes.
00:28:42.280 | - But we made a very important
00:28:47.280 | psychological mistake in thinking that
00:28:49.120 | the reason we lost blitz matches, five-minute games,
00:28:52.440 | and some of the rapid chess matches,
00:28:55.400 | 25-minute chess, was that we didn't have enough time.
00:28:58.160 | If you play a longer match,
00:29:00.000 | we will not make the same mistakes.
00:29:02.040 | Nonsense.
00:29:02.920 | So yeah, we had more time, but we still made mistakes.
00:29:06.120 | And the machine also has more time.
00:29:07.560 | And the machine will always, you know,
00:29:10.360 | always be steady and consistent
00:29:13.040 | compared to humans' instabilities and inconsistencies.
00:29:18.040 | And today we are at the point where, yes,
00:29:20.760 | nobody talks about, you know, humans playing against machines.
00:29:24.480 | Now machines can offer a handicap to top players
00:29:27.520 | and still, you know, be the favorite.
00:29:31.160 | I think we're just learning that it's no longer
00:29:34.080 | human versus machines.
00:29:35.280 | It's about human working with machines.
00:29:37.760 | That's what I recognized in 1998,
00:29:41.640 | just after licking my wounds and spending one year
00:29:43.880 | just, you know, ruminating
00:29:45.960 | on what happened in that match.
00:29:48.280 | And I knew that though,
00:29:49.680 | we still could play against the machines.
00:29:51.400 | I had two more matches in 2003,
00:29:53.880 | playing both Deep Fritz and Deep Junior.
00:29:56.520 | Both matches ended in a tie.
00:29:58.480 | These machines were at least as strong as,
00:30:02.320 | actually probably stronger than, Deep Blue.
00:30:04.880 | And by the way, today, a chess app on your mobile phone
00:30:08.680 | is probably stronger than Deep Blue.
00:30:10.000 | - Than Deep Blue, yeah.
00:30:10.840 | - I'm not speaking about chess engines
00:30:12.320 | that are so much superior.
00:30:13.840 | And by the way, when you analyze games
00:30:16.240 | we played against Deep Blue in 1997 on your chess engine,
00:30:19.120 | it will be laughing.
00:30:20.480 | And it also shows how chess has changed,
00:30:23.120 | because chess commentators,
00:30:25.360 | they look at some of our games,
00:30:26.760 | like game four, game five, brilliant idea.
00:30:29.360 | Now you ask Stockfish, you ask Houdini,
00:30:34.360 | you ask Komodo, all the leading chess engines.
00:30:37.600 | Within 30 seconds, they will show you how many mistakes
00:30:40.120 | both Garry and Deep Blue made in the game
00:30:43.880 | that was trumpeted as a great chess match in 1997.
00:30:48.880 | - Well, okay, so you've made an interesting comment,
00:30:54.000 | if you can untangle it.
00:30:56.400 | So now in retrospect, it was a mistake to see chess
00:31:01.040 | as the peak of human intellect.
00:31:03.560 | Nevertheless, that was done for centuries.
00:31:06.640 | So let me-
00:31:07.480 | - By the way, that was in Europe, because if you move to the Far East,
00:31:12.040 | they had Go, they had Shogi-
00:31:14.720 | - Games, games.
00:31:15.560 | - Again, some of the games, like board games.
00:31:20.080 | - Yes.
00:31:21.200 | - Yeah, I agree.
00:31:22.040 | - So if I push back a little bit,
00:31:23.520 | so now you say that, okay, but it was a mistake
00:31:27.640 | to see chess as the epitome.
00:31:29.240 | And now there are other things, maybe,
00:31:32.280 | like language, like conversation,
00:31:34.880 | like some of the things that, in your view,
00:31:36.840 | are still way out of reach of computers, but inside humans.
00:31:40.840 | Do you think, can you talk about what those things might be?
00:31:44.320 | And do you think, just like chess, they might fall
00:31:47.560 | with the same set of approaches, if you look at AlphaZero,
00:31:52.920 | the same kind of learning approaches
00:31:55.240 | as the machines grow in size?
00:31:57.400 | - No, no, it's not about growing in size.
00:31:59.280 | It's about, again, it's about understanding the difference
00:32:02.360 | between a closed system and an open-ended system.
00:32:05.200 | - So you think that key difference,
00:32:06.880 | so the board games are closed in terms of the rule set,
00:32:11.000 | the actions, the state space, everything is just constrained.
00:32:15.960 | You think once you open it, the machines are lost?
00:32:19.800 | - Not lost, but again,
00:32:21.360 | the effectiveness is very different
00:32:22.960 | because a machine does not understand the moment
00:32:25.600 | it's reaching the territory of diminishing returns.
00:32:28.560 | To put it a different way,
00:32:32.520 | a machine doesn't know how to ask the right questions.
00:32:35.920 | It can ask questions, but it will never tell you
00:32:38.200 | which questions are relevant.
00:32:39.440 | So it's about direction.
00:32:42.360 | So I think in human-machine relations,
00:32:45.040 | we have to consider our role.
00:32:47.240 | And many people feel uncomfortable
00:32:49.600 | that the territory that belongs to us is shrinking.
00:32:54.320 | I'm saying, so what?
00:32:56.640 | You know, eventually this will belong
00:32:58.760 | to the last few decimal points,
00:33:00.720 | but it's like having a very powerful gun,
00:33:05.720 | and all you can do there is slightly, you know,
00:33:10.280 | alter the direction of the bullet,
00:33:11.640 | maybe, you know, by 0.1 degrees.
00:33:16.160 | But a mile away, that means 10 meters off target.
00:33:21.080 | So we have to recognize
00:33:24.080 | that there are certain unique human qualities
00:33:26.600 | that machines in the foreseeable future
00:33:30.200 | will not be able to reproduce.
00:33:33.000 | And the effectiveness of this cooperation,
00:33:35.920 | collaboration depends on our understanding
00:33:38.320 | what exactly we can bring into the game.
00:33:40.400 | So the greatest danger is when we try to interfere
00:33:43.840 | with the machine's superior knowledge.
00:33:45.600 | So that's why I always say that sometimes,
00:33:47.600 | for reading, say,
00:33:49.800 | pictures in radiology,
00:33:51.640 | you may probably prefer an experienced nurse
00:33:55.440 | rather than a top professor,
00:33:57.640 | because she will not try to interfere
00:34:00.600 | with machines' understanding.
00:34:02.200 | So it's very important to know that
00:34:04.040 | if the machine knows how to do things better in 95%,
00:34:07.560 | 96% of the territory, we should not touch it,
00:34:09.840 | because we've seen what happens.
00:34:11.360 | It's like in chess: recognize they do it better.
00:34:15.400 | See where we can make the difference.
00:34:17.160 | You mentioned AlphaZero.
00:34:18.440 | I mean, AlphaZero is actually a first step
00:34:22.240 | into what you may call AI,
00:34:24.160 | because everything that's being called AI today
00:34:26.800 | is just one or another variation
00:34:30.840 | of what Claude Shannon characterized as brute force.
00:34:34.000 | It's a Type A machine, whether it's Deep Blue
00:34:36.680 | or whether it's Watson.
00:34:39.320 | All these modern technologies
00:34:41.480 | that are being trumpeted as AI are still brute force.
00:34:45.320 | All they do is optimization.
00:34:48.760 | They keep improving
00:34:53.240 | the way they process human-generated data.
00:34:56.920 | Now, AlphaZero is the first step
00:35:00.680 | towards machine produced knowledge.
00:35:04.840 | Which is, by the way, quite ironic:
00:35:06.880 | the first company that championed that was IBM.
00:35:10.400 | It was in backgammon.
00:35:13.560 | - Interesting, in backgammon.
00:35:15.400 | - Yes, you should look at IBM's TD-Gammon,
00:35:19.640 | by the scientist called Gerald Tesauro.
00:35:22.040 | He's still working at IBM.
00:35:23.520 | They had it in the early '90s.
00:35:25.520 | It's a program that played in the AlphaZero style,
00:35:29.400 | just trying to come up with its own strategies.
00:35:31.720 | But because of the success of Deep Blue,
00:35:34.360 | this project was not abandoned,
00:35:36.720 | but it was put on hold.
00:35:40.000 | And now everybody talks about
00:35:43.480 | machine-generated knowledge
00:35:46.840 | as revolutionary, and it is,
00:35:49.440 | but there are still many open-ended questions.
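
As a loose illustration of the self-play idea described here, a minimal sketch: tabular temporal-difference learning on single-pile Nim, where the program generates all of its own training games. The game, constants, and update rule are simplifications chosen for brevity, not TD-Gammon's or AlphaZero's actual algorithms:

```python
import random

# Self-play sketch: learn state values for single-pile Nim (take 1-3 stones;
# taking the last stone wins) purely from games played against itself.
N, ALPHA, EPS, EPISODES = 21, 0.1, 0.1, 50_000
V = [0.5] * (N + 1)   # V[s]: estimated win chance for the player to move with s stones
V[0] = 0.0            # no stones left: the player to move has already lost

for _ in range(EPISODES):
    s = N
    while s > 0:
        moves = [m for m in (1, 2, 3) if m <= s]
        if random.random() < EPS:                    # explore occasionally
            m = random.choice(moves)
        else:                                        # leave the opponent the worst state
            m = min(moves, key=lambda k: V[s - k])
        V[s] += ALPHA * ((1.0 - V[s - m]) - V[s])    # TD update: my value = 1 - opponent's
        s -= m

# Nim theory says positions with s % 4 == 0 are lost for the player to move,
# so after training those entries should sit near 0 and the rest near 1.
print([round(v, 2) for v in V])
```

The only "teacher" here is the program's own games, which is the conceptual shift being described.
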
00:35:54.120 | Yes, AlphaZero generates its own data.
00:35:58.160 | Many ideas that AlphaZero generated in chess
00:36:00.800 | were quite intriguing.
00:36:02.160 | So I looked at these games
00:36:04.600 | not just with interest;
00:36:08.080 | it was quite exciting to learn how the machine
00:36:11.360 | could actually juggle all the pieces
00:36:13.880 | and just play positions with a broken material balance,
00:36:17.400 | sacrificing material, always being ahead of other programs,
00:36:20.800 | one or two moves ahead by foreseeing the consequences,
00:36:24.840 | not by out-calculating, because
00:36:27.440 | other machines were at least as powerful in calculating,
00:36:30.280 | but by having this unique knowledge
00:36:33.200 | based on discovered patterns.
00:36:35.240 | After playing 60 million games.
00:36:37.240 | - Almost something that feels like intuition.
00:36:39.680 | - Exactly, but there's one problem.
00:36:41.760 | Now, the simple question,
00:36:44.800 | if AlphaZero faces a superior opponent,
00:36:47.640 | let's say another powerful computer
00:36:50.400 | accompanied by a human who could help
00:36:54.800 | just to discover certain problems,
00:36:56.240 | because I have looked at many AlphaZero games,
00:36:58.720 | I visited their lab,
00:37:00.280 | spoke to Demis Hassabis and his team,
00:37:01.840 | and I know there are certain weaknesses there.
00:37:04.200 | Now, if these weaknesses are exposed,
00:37:05.680 | then the question is,
00:37:06.520 | how many games will it take for AlphaZero to correct it?
00:37:09.560 | The answer is hundreds of thousands.
00:37:11.520 | Even if it keeps losing,
00:37:13.320 | that's just because of how the whole system is built.
00:37:16.400 | Now, imagine:
00:37:19.160 | a human could fix it by just making a few tweaks.
00:37:21.680 | So humans are still more flexible.
00:37:24.520 | And as long as we recognize what our role is,
00:37:28.360 | where we can play, sort of,
00:37:30.720 | the most valuable part in this collaboration,
00:37:34.280 | it will help us to understand
00:37:36.960 | the next steps in human-machine collaboration.
00:37:40.400 | - Beautifully put.
00:37:41.240 | So let's talk about the thing that machines
00:37:43.200 | certainly don't know how to do yet, which is morality.
00:37:46.080 | - Machines and morality.
00:37:47.480 | It's another question that, you know,
00:37:48.760 | is being asked all the time these days.
00:37:51.560 | And I think it's another phantom
00:37:54.280 | that is haunting the general public,
00:37:57.320 | because it's just being fed with these, you know,
00:38:00.120 | illusions: how can we avoid machines, you know,
00:38:04.800 | having biases, being prejudiced?
00:38:07.680 | You cannot, because it's like looking in the mirror
00:38:10.480 | and complaining about what you see.
00:38:12.120 | If you have a certain bias in the society,
00:38:14.800 | the machine will just follow it.
00:38:17.440 | It's just, you know, you look in the mirror,
00:38:19.800 | you don't like what you see there,
00:38:21.000 | you can, you know, break it,
00:38:23.720 | you can try to distort it,
00:38:25.520 | or you can try to actually change something.
00:38:28.000 | - By yourself.
00:38:29.440 | - By yourself, yes.
00:38:30.400 | So it's very important to understand
00:38:31.920 | that you cannot expect machines
00:38:33.680 | to cure the ills of our society.
00:38:37.360 | And moreover, machines will simply, you know,
00:38:39.720 | just, you know, amplify them.
00:38:41.280 | - Yes. - Yeah.
00:38:42.200 | - But the thing is people are more comfortable
00:38:45.640 | with other people doing injustice, with being biased.
00:38:50.640 | We're not comfortable with machines
00:38:52.480 | having the same kind of bias.
00:38:54.320 | So that's an interesting standard
00:38:58.080 | that we place on machines.
00:38:59.360 | With autonomous vehicles, they have to be much safer.
00:39:02.000 | With automated systems--
00:39:03.920 | - Of course, of course they're much safer.
00:39:05.240 | Statistically, they're much safer than--
00:39:07.080 | - It's not an "of course."
00:39:08.600 | Why would they be? It's not an "of course."
00:39:10.760 | It's not a given.
00:39:12.720 | With autonomous vehicles, you have to work really hard
00:39:16.400 | to make them safer.
00:39:19.120 | - I think it goes without saying;
00:39:22.800 | the outcome of this, I wouldn't call it competition,
00:39:26.800 | but comparison, is very clear.
00:39:29.040 | But the problem is not about being, you know, safer.
00:39:32.360 | It's that 40,000 people or so every year
00:39:35.760 | die in car accidents in the United States.
00:39:38.320 | And it's a statistic.
00:39:40.160 | One accident with an autonomous vehicle
00:39:42.720 | and it's the front page of a newspaper.
00:39:43.960 | - Yes, so yes.
00:39:44.800 | - So again, it's psychological.
00:39:47.280 | So it's while people, you know,
00:39:49.440 | kill each other in car accidents
00:39:50.640 | because they make mistakes, they make more mistakes.
00:39:52.720 | For me, it's not a question.
00:39:54.440 | Of course we make more mistakes because we're human.
00:39:57.080 | Yes, machines make mistakes also,
00:39:58.560 | and by the way, no machine will ever reach 100% perfection.
00:40:01.440 | That's another important fake story
00:40:04.040 | that is being fed to the public.
00:40:05.920 | That if a machine doesn't reach 100% performance, it's not safe.
00:40:09.120 | No, all you can ask of any computer,
00:40:11.720 | whether it's, you know, playing chess
00:40:13.600 | or doing the stock market calculations
00:40:16.360 | or driving your autonomous vehicle,
00:40:18.720 | is to make fewer mistakes.
00:40:21.280 | And yes, I know it's not, you know,
00:40:23.240 | it's not easy for us to accept.
00:40:25.680 | If you have two humans, you know, colliding in their cars,
00:40:30.240 | okay. But if one of these cars
00:40:33.040 | is an autonomous vehicle,
00:40:34.200 | and by the way, even if it's the human's fault: terrible.
00:40:37.400 | How could you allow a machine to run
00:40:40.640 | without a driver at the wheel?
00:40:42.680 | - So, you know, let's linger on that for a second,
00:40:45.240 | that double standard.
00:40:46.840 | The way you felt with your first loss against Deep Blue,
00:40:51.640 | were you treating the machine differently
00:40:54.480 | than you would have a human?
00:40:55.880 | So what do you think about that difference
00:40:58.720 | between the way we see machines and humans?
00:41:02.200 | - No, at that time, you know,
00:41:03.440 | for me it was a match.
00:41:04.280 | And that's why I was angry, because I believed that
00:41:06.560 | the match was not, you know, fairly organized.
00:41:08.920 | There were definitely unfair advantages for IBM,
00:41:13.040 | and I wanted to play another match, like a rubber match.
00:41:16.880 | - So your anger or displeasure was aimed more like
00:41:20.680 | at the humans behind IBM versus the actual pure algorithm.
00:41:24.120 | - Absolutely, look, I knew at the time,
00:41:26.800 | and by the way, I was, objectively speaking,
00:41:29.080 | I was stronger at that time.
00:41:30.360 | That probably added to my anger
00:41:32.560 | because I knew I could beat the machine.
00:41:34.080 | - Yeah.
00:41:34.920 | - Yeah, and I lost,
00:41:36.600 | and I knew I was not well prepared.
00:41:38.080 | I have to give them credit,
00:41:39.960 | they did some good work from 1996,
00:41:42.760 | but I still could have beaten the machine.
00:41:45.280 | I made too many mistakes.
00:41:47.040 | Also, there was
00:41:48.480 | the publicity around the match.
00:41:49.840 | I underestimated its effect, you know.
00:41:52.760 | Being called, you know, the brain's last stand,
00:41:56.720 | it's okay, no pressure.
00:41:59.320 | (both laughing)
00:42:02.200 | - Okay, well, let me ask.
00:42:04.080 | So I was also born in the Soviet Union.
00:42:06.880 | What lessons do you draw from the rise and fall
00:42:09.280 | of the Soviet Union in the 20th century?
00:42:11.480 | When you just look at this nation
00:42:15.280 | that is now pushing forward into what Russia is,
00:42:19.480 | and you look at the long arc of history
00:42:21.360 | of the 20th century,
00:42:25.480 | what do we take away from that?
00:42:28.840 | - I think the lesson of history is clear.
00:42:32.720 | Undemocratic systems, totalitarian regimes,
00:42:38.560 | systems that are based on controlling their citizens
00:42:42.840 | and just every aspect of their life,
00:42:46.040 | not offering opportunities
00:42:49.720 | for private initiative, central planning systems,
00:42:54.280 | they're doomed.
00:42:55.120 | They're just, you know,
00:42:55.960 | they cannot be a driving force for innovation.
00:43:00.120 | On the timeline of history,
00:43:02.600 | I mean, they could cause a certain, you know,
00:43:05.080 | distortion of the concept of progress.
00:43:11.240 | By the way, they may call themselves progressive,
00:43:13.240 | but we know that the damage that they caused
00:43:16.200 | to humanity is yet to be measured.
00:43:19.800 | But at the end of the day, they fail.
00:43:22.240 | They fail and the end of the Cold War
00:43:24.920 | was a great triumph of the free world.
00:43:28.360 | It's not that the free world is perfect.
00:43:30.120 | It's very important to recognize the fact that,
00:43:32.880 | I always like to mention, you know,
00:43:34.320 | one of my favorite books, "The Lord of the Rings,"
00:43:36.560 | that there's no absolute good,
00:43:40.520 | but there is an absolute evil.
00:43:42.120 | Good, you know, comes in many forms,
00:43:44.000 | but for all of us, you know, whether humans
00:43:47.240 | or even, you know, humans from fairy tales
00:43:49.680 | or just some sort of mythical creatures,
00:43:52.200 | you can always find spots on the sun.
00:43:57.200 | So even when you're conducting a war
00:44:00.880 | and fighting for justice,
00:44:03.320 | There are always things that, you know,
00:44:04.840 | can be easily criticized.
00:44:06.480 | And human history is a never-ending quest for perfection.
00:44:11.040 | But we know that there is absolute evil.
00:44:13.280 | For me, it's now clear.
00:44:15.840 | I mean, nobody argues about Hitler being absolute evil,
00:44:18.680 | but I think it's very important
00:44:19.520 | to recognize that Stalin was absolute evil.
00:44:21.240 | Communism caused more damage than any other ideology
00:44:25.440 | in the 20th century.
00:44:26.320 | And unfortunately, while we all know
00:44:28.400 | that fascism was condemned,
00:44:30.400 | there was no Nuremberg for communism.
00:44:32.160 | And that's why we can still see, you know,
00:44:33.600 | the successors of Stalin
00:44:36.840 | feeling far more comfortable.
00:44:39.040 | And Putin is one of them.
00:44:40.880 | - You highlight a few interesting connections,
00:44:43.000 | actually, between Stalin and Hitler.
00:44:45.000 | I mean, in terms of adjusting
00:44:49.480 | or clarifying the history of World War II,
00:44:53.240 | which is very interesting.
00:44:54.080 | Of course, we don't have time, so let me ask.
00:44:55.920 | - You can ask it.
00:44:56.760 | I just recently delivered a speech in Toronto
00:44:59.960 | at the 80th anniversary of the Molotov-Ribbentrop Pact.
00:45:02.360 | It's something that I believe, you know,
00:45:03.840 | just, you know, must be taught in schools:
00:45:07.040 | that World War II was started by two dictators
00:45:11.520 | signing this criminal treaty,
00:45:15.440 | a collusion of two tyrants in August 1939,
00:45:19.120 | that led to the beginning of the war.
00:45:21.240 | And the fact is that eventually Stalin had no choice
00:45:24.160 | but to join the Allies because Hitler attacked him.
00:45:27.560 | But that doesn't, you know, eliminate the fact
00:45:31.040 | that Stalin helped Hitler to start World War II.
00:45:34.360 | And he was one of the beneficiaries at early stage
00:45:37.800 | by annexing part of Eastern Europe.
00:45:40.440 | And as a result of World War II,
00:45:42.120 | he annexed almost the entire Eastern Europe.
00:45:44.560 | And for many Eastern European nations,
00:45:46.480 | the end of World War II
00:45:47.800 | was the beginning of communist occupation.
00:45:50.560 | - So Putin, you've talked about him as the man who stands
00:45:55.560 | between Russia and democracy, essentially, today.
00:46:00.480 | You've been a strong opponent and critic of Putin.
00:46:04.320 | Let me ask again,
00:46:06.120 | how much does fear enter your mind and heart?
00:46:09.320 | So in 2007, there's this interesting comment
00:46:12.760 | from Oleg Kalugin, KGB general.
00:46:17.760 | He said that, "I do not talk details.
00:46:19.760 | "People who knew them are all dead now
00:46:21.800 | "because they were vocal.
00:46:23.740 | "I'm quiet.
00:46:24.940 | "There's only one man who's vocal,
00:46:26.880 | "and he may be in trouble, World Chess Champion Kasparov.
00:46:30.600 | "He has been very outspoken in his attacks on Putin,
00:46:33.600 | "and I believe he's probably next on the list."
00:46:36.240 | So clearly your life has been,
00:46:38.760 | and perhaps continues to be, in danger.
00:46:41.760 | How do you think about having the views you have,
00:46:45.520 | the ideas you have, being in opposition as you are
00:46:48.280 | in this kind of context when your life could be in danger?
00:46:53.860 | - That's the reason I live in New York.
00:46:58.120 | So it was not my first choice,
00:47:00.200 | but I knew I had to leave Russia at one point.
00:47:01.960 | And among other places, New York is the safest.
00:47:05.480 | Is it safe?
00:47:07.040 | No. I mean, I know what happened,
00:47:11.280 | and what is happening, with many of Putin's enemies.
00:47:14.760 | But at the end of the day, what can I do?
00:47:18.400 | I could be very proactive
00:47:21.640 | by trying to change things I can influence,
00:47:24.600 | but here are the facts:
00:47:26.480 | I cannot stop doing what I've been doing for a long time.
00:47:30.600 | It's the right thing to do.
00:47:32.520 | I grew up with my family teaching me
00:47:36.300 | sort of the wisdom of Soviet dissidents,
00:47:38.120 | do what you must, and so be it.
00:47:39.620 | I could try to be cautious by not traveling
00:47:44.960 | to certain places where my security could be at risk.
00:47:49.620 | There's so many invitations to speak
00:47:51.220 | at different locations in the world.
00:47:52.680 | And I have to say that many countries
00:47:57.480 | are now not destinations that I can afford to travel to.
00:48:00.460 | My mother still lives in Moscow.
00:48:02.680 | I meet her a few times a year.
00:48:05.240 | She was devastated when I had to leave Russia,
00:48:08.220 | because since my father died in 1971,
00:48:11.340 | when she was 33, she had dedicated her entire life
00:48:14.880 | to her only son.
00:48:16.500 | But she recognized in just a year or so
00:48:20.260 | after I left Russia that it was the only chance
00:48:22.940 | for me to continue my normal life.
00:48:26.660 | So, I mean, to be relatively safe
00:48:30.460 | and to do what she taught me to do: to make the difference.
00:48:35.460 | - Do you think you will ever return to Russia?
00:48:37.920 | Or let me ask a different way. - Oh, I'm sure.
00:48:39.400 | - When?
00:48:40.240 | - It will be sooner than many people think
00:48:41.640 | because I think Putin's regime
00:48:43.080 | is facing insurmountable difficulties.
00:48:46.680 | And again, I read enough historical books
00:48:49.440 | to know that dictatorships, they end suddenly.
00:48:57.120 | On Sunday the dictator feels comfortable,
00:49:01.140 | he believes he's popular; on Monday morning, he's bust.
00:49:06.020 | There's good news and bad news.
00:49:07.140 | The bad news is that I don't know when
00:49:10.100 | and how Putin's rule ends.
00:49:13.060 | The good news: he also doesn't know.
00:49:14.860 | (laughing)
00:49:17.120 | - Okay, well put.
00:49:19.300 | Let me ask a question that seems to preoccupy
00:49:24.860 | the American mind from the perspective of Russia.
00:49:28.940 | One, did Russia interfere in the 2016 US election,
00:49:33.940 | government-sanctioned?
00:49:35.700 | And two, looking to the future, will Russia interfere
00:49:39.420 | in the 2020 US election?
00:49:41.840 | And what does that interference look like?
00:49:45.000 | - It's very odd, you know.
00:49:46.700 | We had such an intelligent conversation.
00:49:48.500 | (laughing)
00:49:49.900 | And you are ruining everything
00:49:51.940 | by asking such a stupid question.
00:49:53.540 | (laughing)
00:49:55.380 | - It's been going downhill the entire way.
00:49:57.500 | - But it's insulting to my intellect.
00:50:00.740 | - Okay.
00:50:01.580 | - Of course they did interfere.
00:50:03.540 | Of course they did absolutely everything to elect Trump.
00:50:05.780 | I mean, they said it many times.
00:50:07.460 | I met enough KGB colonels in my life
00:50:11.380 | to tell you that, you know,
00:50:12.740 | it's just the way Putin looks at Trump.
00:50:15.340 | This is the way he looks, and I don't have to hear
00:50:17.920 | what he says, what Trump says.
00:50:19.700 | I don't need to go through
00:50:21.780 | congressional investigations.
00:50:23.060 | The way Putin looks at Trump is the way
00:50:25.420 | KGB officers looked at their assets.
00:50:28.480 | And moving to 2020,
00:50:31.500 | of course they will do absolutely everything
00:50:33.180 | to help Trump to survive,
00:50:35.180 | because I think the damage that Trump's re-election
00:50:37.900 | could cause to America and to the free world
00:50:40.360 | is beyond one's imagination.
00:50:42.780 | I think basically if Trump is re-elected,
00:50:44.540 | he will ruin NATO,
00:50:45.940 | because he's already heading in this direction,
00:50:48.100 | but for now he's still limited
00:50:50.380 | by the re-election hurdles.
00:50:55.260 | If he's still in office after November 2020,
00:51:01.500 | okay, January 2021, I don't want to think about it.
00:51:05.980 | My problem is not just Trump,
00:51:07.340 | because Trump is basically a symptom,
00:51:09.540 | but the problem is that I don't see,
00:51:11.980 | on the American political horizon,
00:51:18.020 | politicians who could take on Trump
00:51:19.940 | for all the damage that he's doing to the free world,
00:51:25.980 | not just the things that went wrong
00:51:27.420 | in America.
00:51:28.820 | It seems to me that the political campaign
00:51:30.940 | on the Democratic side
00:51:32.700 | is fixated on certain important, but still secondary, issues.
00:51:37.700 | Because when you have the foundation
00:51:40.180 | of the Republic in jeopardy,
00:51:41.820 | I mean, you cannot talk about healthcare.
00:51:44.060 | I mean, I understand how important it is,
00:51:45.920 | but it's still secondary,
00:51:47.180 | because the entire framework of American political life
00:51:49.540 | is at risk.
00:51:50.700 | And you have Vladimir Putin,
00:51:53.340 | who has, unfortunately, free hands
00:51:56.180 | in attacking America and other free countries.
00:52:00.580 | And by the way, we have so much evidence
00:52:02.420 | about Russian interference in Brexit,
00:52:04.380 | in elections in almost every European country.
00:52:07.260 | And thinking that they will shy away
00:52:09.660 | from attacking America in 2020,
00:52:12.220 | now with Trump in office,
00:52:14.940 | yeah, I think, yeah,
00:52:18.100 | it definitely diminishes the intellectual quality
00:52:20.340 | of our conversation.
00:52:21.180 | (laughing)
00:52:23.260 | - I do what I can.
00:52:24.820 | Last question.
00:52:26.020 | If you can go back,
00:52:27.100 | just looking at the entirety of your life: you accomplished
00:52:30.180 | more than most humans ever will.
00:52:33.420 | If you could go back and relive a single moment
00:52:35.660 | in your life, what would that moment be?
00:52:43.380 | - There are moments in my life when I think about
00:52:46.980 | what could be done differently, but.
00:52:50.680 | - No, to experience happiness and joy and pride,
00:52:56.660 | just to touch it once again. - I know, I know.
00:52:58.660 | But look, I made many mistakes in my life.
00:53:01.780 | I know that.
00:53:04.740 | At the end of the day,
00:53:05.980 | I believe in the butterfly effect.
00:53:07.540 | So I know moments where,
00:53:11.540 | if I were there at that point in '89, in '93,
00:53:16.540 | pick a year, I could improve my actions
00:53:20.820 | by not doing this or that stupid thing.
00:53:22.380 | But then how do you know that I would have
00:53:26.580 | all the other accomplishments?
00:53:27.820 | Yeah, I'm afraid that we just have to
00:53:32.460 | follow this, if you may call it wisdom,
00:53:35.660 | of Forrest Gump, you know?
00:53:36.700 | Life is, you know,
00:53:37.940 | a box of chocolates and you don't know what's inside,
00:53:42.780 | but you have to go one by one.
00:53:44.620 | So I'm happy with who I am and where I am today.
00:53:49.020 | And I'm very proud, not only with my chess accomplishments,
00:53:52.940 | but that I made this transition.
00:53:54.900 | And since I left chess, I built my own reputation
00:53:58.900 | that had some influence on the game of chess,
00:54:00.940 | but it's not directly derived from the game.
00:54:06.620 | I'm grateful to my wife, who helped me to build this life.
00:54:10.180 | We actually married in 2005.
00:54:11.580 | It was my third marriage, that's why I said
00:54:13.020 | I made mistakes in my life.
00:54:14.340 | And by the way, I'm close with two kids
00:54:17.180 | from my previous marriages.
00:54:18.460 | So I managed to sort of balance my life.
00:54:22.940 | And I live here in New York, so we have our two kids
00:54:26.300 | born here in New York.
00:54:28.020 | It's a new life and it's busy.
00:54:31.620 | Sometimes I wish I could limit my engagement
00:54:34.500 | in many other things that are still
00:54:37.620 | taking time and energy, but life is exciting.
00:54:44.860 | And as long as I can feel that I have energy,
00:54:48.780 | I have strength, I have passion to make the difference,
00:54:53.620 | I'm happy.
00:54:55.460 | - I think that's a beautiful moment to end on.
00:54:59.420 | Garry, spasibo bolshoe.
00:55:00.940 | Thank you very much for talking today.
00:55:02.540 | - Thank you.
00:55:03.380 | Spasibo. [Thank you.]
00:55:04.220 | (upbeat music)