
Nicole Perlroth: Cybersecurity and the Weapons of Cyberwar | Lex Fridman Podcast #266


Chapters

0:00 Introduction
0:55 Zero-day vulnerability
6:56 History of hackers
21:48 Interviewing hackers
25:50 Ransomware attack
38:34 Cyberwar
51:42 Cybersecurity
60:49 Social engineering
77:42 Snowden and whistleblowers
87:12 NSA
96:59 Fear for cyberattacks
104:30 Self-censorship
108:51 Advice for young people
114:08 Hope for the future


00:00:00.000 | If one site is hacked, you can just unleash all hell.
00:00:03.520 | We have stumbled into this new era
00:00:06.840 | of mutually assured digital destruction.
00:00:08.960 | How far are people willing to go?
00:00:11.200 | You can capture their location,
00:00:13.040 | you can capture their contacts
00:00:16.160 | you can record their telephone calls,
00:00:18.000 | record their camera without them knowing about it.
00:00:20.760 | Basically, you can put an invisible ankle bracelet
00:00:24.400 | on someone without them knowing.
00:00:26.000 | You could sell that to a zero-day broker for $2 million.
00:00:31.000 | - The following is a conversation with Nicole Perlroth,
00:00:37.320 | cybersecurity journalist and author
00:00:40.080 | of "This Is How They Tell Me The World Ends,"
00:00:42.560 | the cyberweapons arms race.
00:00:44.920 | This is the Lex Fridman Podcast.
00:00:46.960 | To support it, please check out our sponsors
00:00:49.080 | in the description.
00:00:50.360 | And now, dear friends, here's Nicole Perlroth.
00:00:55.220 | You've interviewed hundreds of cybersecurity hackers,
00:00:58.560 | activists, dissidents, computer scientists,
00:01:01.040 | government officials, forensic investigators,
00:01:03.760 | and mercenaries.
00:01:05.700 | So let's talk about cybersecurity and cyber war.
00:01:09.400 | Start with the basics.
00:01:10.500 | What is a zero-day vulnerability
00:01:13.560 | and then a zero-day exploit or attack?
00:01:16.800 | - So at the most basic level, let's say I'm a hacker
00:01:22.560 | and I find a bug in your iPhone iOS software
00:01:27.560 | that no one else knows about, especially Apple.
00:01:31.260 | That's called a zero-day because the minute it's discovered,
00:01:34.360 | engineers have had zero days to fix it.
00:01:36.560 | If I can study that zero-day,
00:01:40.920 | I could potentially write a program to exploit it.
00:01:44.940 | And that program would be called a zero-day exploit.
00:01:48.860 | And for iOS, the dream is that you craft a zero-day exploit
00:01:53.860 | that can remotely exploit someone else's iPhone
00:01:57.380 | without them ever knowing about it.
00:01:59.940 | And you can capture their location,
00:02:01.900 | you can capture their contacts
00:02:04.980 | you can record their telephone calls,
00:02:06.860 | record their camera without them knowing about it.
00:02:09.620 | Basically, you can put an invisible ankle bracelet
00:02:13.240 | on someone without them knowing.
00:02:15.060 | And you can see why that capability,
00:02:17.220 | that zero-day exploit would have immense value
00:02:20.500 | for a spy agency or a government
00:02:23.740 | that wants to monitor its critics or dissidents.
00:02:27.700 | And so there's a very lucrative market now
00:02:30.620 | for zero-day exploits.
00:02:32.100 | - So you said a few things there.
00:02:33.460 | One is iOS, why iOS?
00:02:36.360 | Which operating system?
00:02:37.660 | Which one is the sexier thing to try to get to
00:02:40.220 | or the most impactful thing?
00:02:42.740 | And the other thing you mentioned is remote
00:02:45.500 | versus having to actually come in physical contact with it.
00:02:49.300 | Is that the distinction?
00:02:50.900 | - So iPhone exploits have just been
00:02:54.540 | a government's number one priority.
00:02:58.260 | Recently, actually, the price of an Android
00:03:02.100 | remote zero-day exploit,
00:03:03.540 | something that can get you into Android phones,
00:03:06.580 | is actually higher.
00:03:08.140 | The value of that is now higher
00:03:09.660 | on this underground market for zero-day exploits
00:03:12.540 | than an iPhone iOS exploit.
00:03:15.400 | So things are changing.
00:03:16.740 | - So there's probably more Android devices,
00:03:20.100 | so that's why it's better.
00:03:21.780 | But on the iPhone side,
00:03:23.260 | so I'm an Android person 'cause I'm a man of the people,
00:03:28.420 | but it seems like all the elites use iPhone,
00:03:31.180 | all the people at nice dinner parties.
00:03:33.060 | So is that the reason that the more powerful people
00:03:37.300 | use iPhones, is that why?
00:03:38.820 | - I don't think so.
00:03:39.820 | I actually, so it was about two years ago
00:03:42.500 | that the prices flipped.
00:03:43.660 | It used to be that if you could craft
00:03:46.760 | a remote zero-click exploit for iOS,
00:03:51.760 | then that was about as good as it gets.
00:03:55.420 | You could sell that to a zero-day broker for $2 million.
00:04:00.420 | The caveat is you can never tell anyone about it
00:04:04.680 | because the minute you tell someone about it,
00:04:07.180 | Apple learns about it,
00:04:08.880 | they patch it, and that $2.5 million investment
00:04:12.560 | that that zero-day broker just made goes to dust.
00:04:16.120 | So a couple years ago, and don't quote me on the prices,
00:04:20.780 | but an Android zero-click remote exploit
00:04:25.780 | for the first time topped the iOS.
00:04:29.020 | And actually a lot of people's read on that
00:04:32.180 | was that it might be a sign that Apple security was falling
00:04:40.280 | and that it might actually be easier
00:04:43.060 | to find an iOS zero-day exploit
00:04:46.700 | than find an Android zero-day exploit.
00:04:48.940 | The other thing is market share.
00:04:51.140 | There are just more people around the world
00:04:52.860 | that use Android.
00:04:54.700 | And a lot of governments that are paying top dollar
00:04:58.540 | for zero-day exploits these days
00:05:01.340 | are deep-pocketed governments in the Gulf
00:05:05.280 | that wanna use these exploits to monitor their own citizens,
00:05:08.920 | monitor their critics.
00:05:10.780 | And so it's not necessarily
00:05:12.400 | that they're trying to find elites.
00:05:14.660 | It's that they wanna find out who these people are
00:05:17.300 | that are criticizing them
00:05:18.540 | or perhaps planning the next Arab Spring.
00:05:21.220 | - So in your experience,
00:05:23.300 | are most of these attacks targeted
00:05:24.980 | to cover a large population,
00:05:26.540 | or are there attacks that are targeted
00:05:29.140 | towards specific individuals?
00:05:31.340 | - So I think it's both.
00:05:32.780 | Some of the zero-day exploits that have fetched top dollar
00:05:37.020 | that I've heard of in my reporting in the United States
00:05:39.400 | were highly targeted.
00:05:41.320 | There was a potential terrorist attack.
00:05:43.680 | They wanted to get into this person's phone.
00:05:45.520 | It had to be done in the next 24 hours.
00:05:48.120 | They approached hackers and said, "We'll pay you
00:05:50.880 | X millions of dollars if you can do this."
00:05:53.840 | But then you look at when we've discovered
00:05:57.480 | iOS zero-day exploits in the wild,
00:06:00.800 | some of them have been targeting
00:06:02.400 | large populations like Uyghurs.
00:06:05.280 | So a couple of years ago, there was a watering hole attack.
00:06:10.280 | Okay, what's a watering hole attack?
00:06:12.180 | There was a website.
00:06:13.300 | It was actually, it had information aimed at Uyghurs
00:06:17.900 | and you could access it all over the world.
00:06:21.000 | And if you visited this website,
00:06:23.280 | it would drop an iOS zero-day exploit onto your phone.
00:06:29.180 | And so anyone that visited this website
00:06:32.260 | that was about Uyghurs, anywhere,
00:06:34.500 | I mean, Uyghurs living abroad,
00:06:37.420 | basically the Uyghur diaspora,
00:06:39.800 | would have gotten infected with this zero-day exploit.
00:06:43.980 | So in that case, they were targeting huge swaths
00:06:48.980 | of this one population or people interested
00:06:51.740 | in this one population basically in real time.
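
A minimal sketch of the distinction being drawn here, in Python with placeholder strings and hypothetical addresses only (no real exploit logic): a targeted attack selects its victim before delivering anything, while a watering-hole attack serves every visitor of the compromised site indiscriminately.

    # Illustrative only: hypothetical names, a stand-in payload string,
    # and no actual exploitation code.
    PAYLOAD = "<simulated exploit placeholder>"
    PRESELECTED_TARGETS = {"specific.dissident@example.com"}  # hypothetical target list

    def targeted_delivery(visitor):
        # Targeted model: only a pre-selected victim ever receives the payload.
        return PAYLOAD if visitor in PRESELECTED_TARGETS else None

    def watering_hole_delivery(visitor):
        # Watering-hole model: no selection step, anyone who loads the page gets it.
        return PAYLOAD

    for visitor in ("random.reader@example.com", "specific.dissident@example.com"):
        print(visitor,
              "| targeted:", targeted_delivery(visitor) is not None,
              "| watering hole:", watering_hole_delivery(visitor) is not None)
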
00:06:54.780 | - Who are these attackers?
00:06:59.380 | From the individual level to the group level,
00:07:02.720 | psychologically speaking, what's their motivation?
00:07:05.180 | Is it purely money?
00:07:07.460 | Is it the challenge?
00:07:09.260 | Are they malevolent?
00:07:10.620 | Is it power?
00:07:12.220 | These are big philosophical human questions, I guess.
00:07:15.460 | - So these are the questions I set out to answer
00:07:18.940 | for my book.
00:07:20.380 | I wanted to know,
00:07:22.040 | are these people that are just after money?
00:07:25.620 | If they're just after money, how do they sleep at night
00:07:29.700 | not knowing whether that zero-day exploit
00:07:31.900 | they just sold to a broker is being used
00:07:34.740 | to basically make someone's life a living hell?
00:07:37.280 | And what I found was there's kind of this long, sordid
00:07:41.220 | history to this question.
00:07:43.660 | It started out in the '80s and '90s
00:07:46.920 | when hackers were just finding holes and bugs in software
00:07:51.200 | for curiosity's sake, really as a hobby.
00:07:54.380 | And some of them would go to the tech companies
00:07:56.820 | like Microsoft or Sun Microsystems at the time, or Oracle.
00:08:01.780 | And they'd say, "Hey, I just found this zero-day
00:08:04.500 | in your software and I can use it to break into NASA."
00:08:08.140 | And the general response at the time wasn't,
00:08:11.140 | "Thank you so much for pointing out this flaw
00:08:13.940 | and our software, we'll get it fixed as soon as possible."
00:08:17.240 | It was, "Don't ever poke around our software ever again,
00:08:21.200 | or we'll stick our general counsel on you."
00:08:24.160 | And that was really sort of the common thread for years.
00:08:30.380 | And so hackers who set out to do the right thing
00:08:34.420 | were basically told to shut up
00:08:37.660 | and stop doing what you're doing.
00:08:40.140 | And what happened next was they basically started
00:08:44.340 | trading this information online.
00:08:46.220 | Now, when you go back and interview people
00:08:48.180 | from those early days, they all tell a very similar story,
00:08:53.020 | which is they're curious, they're tinkers.
00:08:57.260 | You know, they remind me of like the kid down the block
00:08:59.740 | that was constantly poking around the hood of his dad's car.
00:09:03.500 | You know, they just couldn't help themselves.
00:09:06.140 | They wanted to figure out how a system is designed
00:09:09.820 | and how they could potentially exploit it
00:09:11.860 | for some other purpose.
00:09:13.080 | It doesn't have to be good or bad.
00:09:14.900 | But they were basically kind of beat down for so long
00:09:20.620 | by these big tech companies
00:09:22.940 | that they started just silently trading them
00:09:26.620 | with other hackers.
00:09:28.060 | And that's how you got these really heated debates
00:09:32.620 | in the '90s about disclosure.
00:09:34.740 | Should you just dump these things online?
00:09:38.140 | Because any script kiddie can pick them up
00:09:40.300 | and use it for all kinds of mischief.
00:09:42.540 | But, you know, don't you wanna just stick a middle finger
00:09:46.100 | to all these companies
00:09:47.260 | that are basically threatening you all the time?
00:09:50.020 | So there was this really interesting dynamic at play.
00:09:53.580 | And what I learned in the course of doing my book
00:09:57.060 | was that government agencies and their contractors
00:10:01.740 | sort of tapped into that frustration and that resentment.
00:10:06.180 | And they started quietly reaching out to hackers
00:10:09.140 | on these forums.
00:10:11.100 | And they said, "Hey, you know that zero day
00:10:13.820 | you just dropped online?
00:10:14.940 | Could you come up with something custom for me?
00:10:17.980 | And I'll pay you six figures for it
00:10:20.060 | so long as you shut up and never tell anyone
00:10:22.300 | that I paid you for this."
00:10:24.740 | And that's what happened.
00:10:27.060 | So throughout the '90s,
00:10:28.420 | there was a bunch of boutique contractors
00:10:31.460 | that started reaching out to hackers on these forums
00:10:34.660 | and saying, "Hey, I'll pay you six figures
00:10:37.340 | for that bug you were trying to get Microsoft
00:10:39.620 | to fix for free."
00:10:41.420 | And sort of so began or so catalyzed this market
00:10:45.540 | where governments and their intermediaries
00:10:48.020 | started reaching out to these hackers
00:10:50.420 | and buying their bugs.
00:10:53.060 | And in those early days, I think a lot of it
00:10:55.100 | was just for quiet counterintelligence,
00:10:57.620 | traditional espionage.
00:11:00.060 | But as we started baking the software,
00:11:04.260 | Windows software, Schneider Electric,
00:11:07.060 | Siemens Industrial Software, into our nuclear plants
00:11:11.300 | and our factories and our power grid
00:11:14.940 | and our petrochemical facilities and our pipelines,
00:11:18.720 | those same zero days came to be just as valuable
00:11:22.060 | for sabotage and war planning.
00:11:25.140 | - Does the fact that the market sprung up
00:11:27.220 | and you can now make a lot of money
00:11:28.500 | change the nature of the attackers that came to the table
00:11:32.100 | or grow the number of attackers?
00:11:34.540 | I mean, what is, I guess,
00:11:35.740 | you told the psychology of the hackers in the '90s,
00:11:40.580 | what is the culture today and where is it heading?
00:11:44.020 | - So I think there are people who will tell you
00:11:47.140 | they would never sell a zero day
00:11:49.100 | to a zero day broker or a government.
00:11:52.100 | One, because they don't know how it's gonna get used
00:11:54.580 | when they throw it over the fence.
00:11:56.380 | Most of these get rolled into classified programs
00:11:58.820 | and you don't know how they get used.
00:12:01.300 | If you sell it to a zero day broker,
00:12:02.700 | you don't even know which nation state might use it
00:12:05.500 | or potentially which criminal group might use it
00:12:09.340 | if you sell it on the dark web.
00:12:11.400 | The other thing that they say is that
00:12:15.420 | they wanna be able to sleep at night
00:12:18.020 | and they lose a lot of sleep
00:12:19.940 | if they found out their zero day was being used
00:12:22.140 | to make a dissident's life a living hell.
00:12:24.860 | But there are a lot of people, good people,
00:12:28.500 | who also say, no, this is not my problem.
00:12:32.620 | This is the technology company's problem.
00:12:35.380 | If they weren't writing new bugs
00:12:37.020 | into their software every day,
00:12:39.180 | then there wouldn't be a market,
00:12:41.060 | then there wouldn't be a problem.
00:12:42.860 | But they continue to write bugs
00:12:44.780 | into their software all the time
00:12:46.180 | and they continue to profit off that software.
00:12:48.300 | So why shouldn't I profit off my labor too?
00:12:53.300 | And one of the things that has happened,
00:12:55.740 | which is I think a positive development
00:12:57.980 | over the last 10 years, are bug bounty programs.
00:13:02.260 | Companies like Google and Facebook and then Microsoft
00:13:06.180 | and finally Apple, which resisted it for a really long time,
00:13:10.180 | have said, okay, we are gonna shift our perspective
00:13:14.780 | about hackers, we're no longer going to treat them
00:13:17.400 | as the enemy here.
00:13:18.960 | We're going to start paying them
00:13:20.340 | for what is essentially free quality assurance.
00:13:23.800 | And we're gonna pay them good money in some cases,
00:13:26.820 | six figures in some cases.
00:13:28.900 | We're never gonna be able to bid against a zero day broker
00:13:32.380 | who sells to government agencies.
00:13:34.740 | But we can reward them and hopefully get to that bug earlier
00:13:38.900 | where we can neutralize it
00:13:40.980 | so that they don't have to spend another year
00:13:43.060 | developing the zero day exploit.
00:13:44.760 | And in that way, we can keep our software more secure.
00:13:48.180 | But every week I get messages from some hacker that says,
00:13:53.180 | I tried to see this zero day exploit
00:13:55.940 | that was just found in the wild,
00:13:58.020 | being used by this nation state.
00:14:00.380 | I tried to tell Microsoft about this two years ago
00:14:04.540 | and they were gonna pay me peanuts, so it never got fixed.
00:14:08.900 | There are all sorts of those stories that can continue on.
00:14:12.700 | And I think just generally,
00:14:16.700 | hackers are not very good at diplomacy.
00:14:19.860 | They tend to be a pretty snipey, technical crowd
00:14:24.500 | and very philosophical in my experience.
00:14:28.260 | But diplomacy is not their strong suit.
00:14:31.900 | - Well, there almost has to be a broker
00:14:33.700 | between companies and hackers
00:14:35.980 | where you can translate effectively,
00:14:38.000 | just like you have a zero day broker
00:14:39.540 | between governments and hackers.
00:14:41.700 | You have to speak their language.
00:14:43.100 | - Yeah, and there have been some of those companies
00:14:45.900 | who've risen up to meet that demand
00:14:47.780 | and HackerOne is one of them, BugCrowd is another.
00:14:52.460 | Synack has an interesting model.
00:14:54.280 | So that's a company that you pay
00:14:56.900 | for a private bug bounty program, essentially.
00:14:59.900 | So you pay this company,
00:15:00.940 | they tap hackers all over the world
00:15:04.460 | to come hack your software, hack your system.
00:15:07.580 | And then they'll quietly tell you what they found.
00:15:10.820 | And I think that's a really positive development.
00:15:13.740 | And actually the Department of Defense
00:15:16.300 | hired all three of those companies I just mentioned
00:15:20.460 | to help secure their systems.
00:15:22.020 | Now, I think they're still a little timid
00:15:24.140 | in terms of letting those hackers
00:15:25.780 | into the really sensitive high side classified stuff,
00:15:30.300 | but baby steps. (laughs)
00:15:33.080 | - Just to understand what you were saying,
00:15:34.740 | you think it's impossible for companies
00:15:37.780 | to financially compete with the zero day brokers,
00:15:40.540 | with governments.
00:15:42.020 | So like the defense can't outpay the hackers?
00:15:47.020 | - It's interesting, they shouldn't outpay them
00:15:51.860 | because what would happen if Apple started offering
00:15:56.180 | $2.5 million for any zero-day exploit
00:16:01.180 | that governments would pay that much for
00:16:04.740 | is their own engineers would say,
00:16:06.260 | "Why the hell am I working for less than that?"
00:16:10.300 | And doing my nine to five every day.
00:16:12.420 | So you would create a perverse incentive.
00:16:14.740 | And I didn't think about that
00:16:16.940 | until I started this research and I realized,
00:16:19.220 | okay, yeah, that makes sense.
00:16:20.540 | You don't want to incentivize offense so much
00:16:25.300 | that it's to your own detriment.
00:16:27.500 | And so I think what they have though,
00:16:29.540 | what the companies have on government agencies
00:16:32.660 | is if they pay you, you get to talk about it.
00:16:36.460 | You get the street cred.
00:16:38.180 | You get to brag about the fact you just found
00:16:41.260 | that $2.5 million iOS zero day that no one else did.
00:16:46.260 | And if you sell it to a broker,
00:16:48.780 | you never get to talk about it.
00:16:50.100 | And I think that really does eat at people.
00:16:52.300 | - Can I ask you a big philosophical question
00:16:55.140 | about human nature here?
00:16:56.540 | (laughing)
00:16:57.380 | So if you have, in what you've seen,
00:17:00.820 | if a human being has a zero day,
00:17:03.460 | they found a zero day vulnerability
00:17:06.460 | that can hack into, I don't know,
00:17:09.860 | what's the worst thing you can hack into?
00:17:11.980 | Something that could launch nuclear weapons.
00:17:14.900 | What percentage of the people in the world
00:17:16.980 | that have the skill would not share that with anyone,
00:17:20.660 | with any bad party?
00:17:23.420 | I guess how many people are completely devoid
00:17:27.740 | of ethical concerns in your sense?
00:17:31.780 | So my belief is all the ultra competent people
00:17:36.340 | or very, very high percentage of ultra competent people
00:17:39.580 | are also ethical people.
00:17:41.580 | That's been my experience.
00:17:42.980 | But then again, my experience is narrow.
00:17:45.900 | What's your experience been like?
00:17:48.500 | - So this was another question I wanted to answer.
00:17:52.620 | Who are these people who would sell a zero day exploit
00:17:57.740 | that would neutralize a Schneider Electric safety lock
00:18:01.420 | at a petrochemical plant?
00:18:03.100 | Basically the last thing you would need to neutralize
00:18:05.380 | before you trigger some kind of explosion.
00:18:07.980 | Who would sell that?
00:18:09.100 | And I got my answer.
00:18:14.100 | Well, the answer was different.
00:18:16.780 | A lot of people said, I would never even look there
00:18:19.860 | 'cause I don't even wanna know.
00:18:21.020 | I don't even wanna have that capability.
00:18:23.020 | I don't even wanna have to make that decision
00:18:26.580 | about whether I'm gonna profit off of that knowledge.
00:18:29.940 | I went down to Argentina
00:18:31.860 | and this whole kind of moral calculus I had in my head
00:18:36.860 | was completely flipped around.
00:18:39.260 | So just to back up for a moment.
00:18:41.620 | So Argentina actually is a real hacker's paradise.
00:18:46.460 | People grew up in Argentina and I went down there,
00:18:50.820 | I guess I was there around 2015, 2016,
00:18:54.260 | but you still couldn't get an iPhone.
00:18:56.620 | They didn't have Amazon Prime.
00:18:58.940 | You couldn't get access to any of the apps
00:19:00.860 | we all take for granted.
00:19:02.620 | To get those things in Argentina as a kid,
00:19:05.300 | you have to find a way to hack 'em.
00:19:07.620 | And it's the whole culture is really like a hacker culture.
00:19:12.100 | They say like, it's really like a MacGyver culture.
00:19:15.300 | You have to figure out how to break into something
00:19:17.620 | with wire and tape.
00:19:19.500 | And that means that there are a lot of really good hackers
00:19:24.460 | in Argentina who specialize in developing zero-day exploits.
00:19:30.580 | And I went down to this Argentina conference
00:19:34.020 | called Echo Party.
00:19:35.740 | And I asked the organizer,
00:19:38.220 | okay, can you introduce me to someone
00:19:40.060 | who's selling zero-day exploits to governments?
00:19:43.100 | And he was like, just throw a stone.
00:19:45.020 | Throw a stone anywhere and you're gonna hit someone.
00:19:48.940 | And all over this conference,
00:19:50.700 | you saw these guys who were clearly from these Gulf States
00:19:54.140 | who only spoke Arabic.
00:19:55.700 | What are they doing at a young hacking conference
00:19:59.380 | in Buenos Aires? - Oh boy.
00:20:01.180 | - And so I went out to lunch
00:20:03.260 | with kind of this godfather of the hacking scene there.
00:20:07.340 | And I asked this really dumb question
00:20:10.260 | and I'm still embarrassed about how I phrased it.
00:20:13.180 | But I said, so, you know,
00:20:14.620 | well, these guys only sell these zero-day exploits
00:20:17.700 | to good Western governments.
00:20:20.380 | And he said, Nicole, last time I checked,
00:20:22.380 | the United States wasn't a good Western government.
00:20:25.660 | You know, the last country that bombed another country
00:20:28.260 | into oblivion wasn't China or Iran.
00:20:31.900 | It was the United States.
00:20:33.580 | So if we're gonna go by your whole moral calculus,
00:20:36.380 | you know, just know that we have
00:20:37.540 | a very different calculus down here.
00:20:39.860 | And we'd actually rather sell to Iran or Russia or China,
00:20:44.780 | maybe, than the United States.
00:20:46.900 | And that just blew me away.
00:20:48.860 | Like, wow.
00:20:50.020 | You know, he's like, we'll just sell
00:20:51.860 | to whoever brings us the biggest bag of cash.
00:20:53.940 | Have you checked into our inflation situation recently?
00:20:57.980 | So, you know, I had some of those,
00:20:59.980 | like, reality checks along the way.
00:21:02.260 | You know, we tend to think of things as,
00:21:04.500 | is this moral, you know, is this ethical,
00:21:06.980 | especially as journalists.
00:21:08.780 | You know, we kind of sit on our high horse sometimes
00:21:11.020 | and write about a lot of things
00:21:13.660 | that seem to push the moral bounds.
00:21:16.380 | But in this market,
00:21:17.820 | which is essentially an underground market,
00:21:20.220 | that, you know, the one rule is like Fight Club,
00:21:22.420 | you know, no one talks about Fight Club.
00:21:24.420 | First rule of the zero-day market,
00:21:25.940 | nobody talks about the zero-day market.
00:21:27.820 | On both sides.
00:21:29.180 | Because the hacker doesn't wanna lose
00:21:30.900 | their $2.5 million bounty.
00:21:33.940 | And governments roll these into classified programs
00:21:36.860 | and they don't want anyone to know what they have.
00:21:39.180 | So no one talks about this thing.
00:21:41.220 | And when you're operating in the dark like that,
00:21:43.860 | it's really easy to put aside your morals sometimes.
00:21:47.000 | - Can I, as a small tangent, ask you, by way of advice,
00:21:52.060 | you must have done some incredible interviews.
00:21:55.580 | And you've also spoken about how serious
00:21:58.300 | you take protecting your sources.
00:22:00.060 | If you were to give me advice for interviewing
00:22:04.420 | when you're recording on mic with a video camera,
00:22:07.980 | how is it possible to get into this world?
00:22:13.060 | Like, is it basically impossible?
00:22:16.060 | So you've spoken with a few people,
00:22:19.020 | what is it, like the godfather of cyber war, cybersecurity?
00:22:23.300 | So people that are already out.
00:22:25.260 | And they still have to be pretty brave to speak publicly.
00:22:28.440 | But is it virtually impossible to really talk to anybody
00:22:32.740 | who's a current hacker?
00:22:34.660 | You're always like 10, 20 years behind?
00:22:37.740 | - It's a good question.
00:22:38.740 | And this is why I'm a print journalist.
00:22:40.880 | But, you know, a lot, when I've seen people do it,
00:22:45.940 | it's always the guy who's behind the shadows,
00:22:49.260 | whose voice has been altered.
00:22:51.580 | You know, when they've gotten someone on camera,
00:22:53.420 | that's usually how they do it.
00:22:55.160 | You know, very, very few people talk in this space.
00:22:59.020 | And there's actually a pretty well known case study
00:23:02.060 | in why you don't talk publicly in this space
00:23:04.460 | and you don't get photographed.
00:23:06.100 | And that's the Grugq.
00:23:07.420 | So, you know, the Grugq is or was this zero-day broker,
00:23:12.060 | South African guy, lives in Thailand.
00:23:15.260 | And right when I was starting on this subject
00:23:18.860 | at the New York Times, he'd given an interview to Forbes.
00:23:22.700 | And he talked about being a zero day broker.
00:23:25.580 | And he even posed next to this giant duffel bag
00:23:29.180 | filled with cash, ostensibly.
00:23:31.740 | And later he would say he was speaking off the record,
00:23:35.480 | he didn't understand the rules of the game.
00:23:38.020 | But what I heard from people who did business with him
00:23:41.020 | was that the minute that that story came out,
00:23:43.120 | he became PNG'd.
00:23:45.320 | No one did business with him.
00:23:47.300 | You know, his business plummeted by at least half.
00:23:50.100 | No one wants to do business with anyone
00:23:52.140 | who's gonna get on camera and talk about
00:23:54.420 | how they're selling zero days to governments.
00:23:56.680 | It puts you at danger.
00:23:59.740 | And I did hear that he got some visits
00:24:01.660 | from some security folks.
00:24:04.040 | And, you know, it's another thing
00:24:05.060 | for these people to consider.
00:24:06.940 | If they have those zero day exploits at their disposal,
00:24:11.940 | they become a huge target for nation states
00:24:16.420 | all over the world.
00:24:18.140 | You know, talk about having perfect OPSEC.
00:24:20.660 | You know, you better have some perfect OPSEC
00:24:23.580 | if people know that you have access
00:24:25.620 | to those zero day exploits.
00:24:27.900 | - Which sucks because, I mean, transparency here
00:24:32.900 | would be really powerful for educating the world
00:24:36.460 | and also inspiring other engineers to do good.
00:24:40.180 | It just feels like when you're operating in shadows,
00:24:42.780 | it doesn't help us move in the positive direction
00:24:46.380 | in terms of like getting more people on the defense side
00:24:48.980 | versus on the attack side.
00:24:50.380 | - Right.
00:24:51.220 | - But of course, what can you do?
00:24:52.060 | I mean, the best you can possibly do
00:24:53.660 | is have great journalists, just like you did,
00:24:57.060 | interview and write books about it
00:24:58.900 | and integrate the information you get
00:25:01.060 | while hiding the sources.
00:25:02.940 | - Yeah, and I think, you know,
00:25:04.580 | what HackerOne has told me was,
00:25:07.620 | okay, let's just put away the people
00:25:09.180 | that are finding and developing zero day exploits
00:25:13.240 | all day long, let's put that aside.
00:25:15.540 | What about the, you know, however many millions
00:25:18.540 | of programmers all over the world
00:25:20.740 | who've never even heard of a zero day exploit?
00:25:23.300 | Why not tap into them and say,
00:25:25.980 | "Hey, we'll start paying you if you can find a bug
00:25:29.220 | in United Airlines software or in Schneider Electric
00:25:34.140 | or in Ford or Tesla."
00:25:36.880 | And I think that is a really smart approach.
00:25:39.940 | Let's go find this untapped army of programmers
00:25:43.940 | to neutralize these bugs before the people
00:25:46.900 | who will continue to sell these to governments
00:25:49.160 | can find them and exploit them.
00:25:50.860 | - Okay, I have to ask you about this.
00:25:53.220 | From a personal side, it's funny enough,
00:25:55.780 | after we agreed to talk, I've gotten,
00:25:59.320 | for the first time in my life,
00:26:01.180 | was a victim of a cyber attack.
00:26:03.920 | So this is ransomware, it's called Deadbolt.
00:26:08.580 | People can look it up.
00:26:10.060 | I have a QNAP device for basically kind of coldish storage.
00:26:15.620 | So it's about 60 terabytes with 50 terabytes of data on it
00:26:20.260 | in RAID 5 and apparently about four to 5,000 QNAP devices
00:26:25.260 | were hacked and taken over with this ransomware.
00:26:30.660 | And what ransomware does there is it goes file by file,
00:26:35.540 | almost all the files on the QNAP storage device
00:26:39.020 | and encrypts them.
00:26:40.420 | And then there's this very eloquently
00:26:43.020 | and politely written page that pops up.
00:26:45.240 | It describes what happened.
00:26:48.300 | All your files have been encrypted.
00:26:50.060 | This includes, but is not limited to photos,
00:26:52.140 | documents, and spreadsheets.
00:26:53.820 | Why me?
00:26:54.740 | This is, a lot of people commented
00:26:57.980 | about how friendly and eloquent this is written.
00:27:00.780 | And I have to commend them.
00:27:01.820 | It is, and it's pretty user-friendly.
00:27:04.060 | Why me?
00:27:06.860 | This is not a personal attack.
00:27:08.100 | You have been targeted because of the inadequate security
00:27:11.060 | provided by your vendor, QNAP.
00:27:13.580 | What now?
00:27:16.340 | You can make a payment of exactly 0.03 Bitcoin,
00:27:19.820 | which is about a thousand dollars,
00:27:21.540 | to the following address.
00:27:23.380 | Once the payment has been made,
00:27:24.860 | we'll follow up with a transaction to the same address,
00:27:27.540 | blah, blah, blah.
00:27:28.600 | They give you instructions of what happens next
00:27:31.540 | and they'll give you a decryption key that you can then use.
00:27:34.620 | And then there's another message for QNAP that says,
00:27:37.600 | all your affected customers have been targeted
00:27:40.900 | using a zero-day vulnerability in your product.
00:27:43.860 | We offer you two options to mitigate this and future damage.
00:27:48.420 | One, make a Bitcoin payment of five Bitcoin
00:27:51.780 | to the following address,
00:27:53.820 | and that will reveal to QNAP the,
00:27:56.340 | I'm summarizing things here,
00:27:57.940 | what the actual vulnerability is.
00:28:00.180 | Or you can make a Bitcoin payment of 50 Bitcoin
00:28:03.900 | to get a master decryption key for all your customers.
00:28:07.060 | 50 Bitcoin is about $1.8 million.
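
To put the ransom note's figures side by side, here is a quick arithmetic check in Python; the BTC/USD rate is not quoted in the note itself, it is implied by the "50 Bitcoin is about $1.8 million" figure above.

    # Sanity-check the amounts quoted in the Deadbolt note.
    implied_btc_usd = 1_800_000 / 50                  # 50 BTC described as ~$1.8 million
    print(f"implied BTC price:       ${implied_btc_usd:,.0f}")
    print(f"per-victim demand:       0.03 BTC ~ ${0.03 * implied_btc_usd:,.0f}")  # "about a thousand dollars"
    print(f"vendor option 1 (bug):   5 BTC    ~ ${5 * implied_btc_usd:,.0f}")
    print(f"vendor option 2 (key):   50 BTC   ~ ${50 * implied_btc_usd:,.0f}")
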
00:28:10.380 | - Okay.
00:28:11.980 | So first of all, on a personal level,
00:28:14.500 | this one hurt for me.
00:28:16.300 | There's, I mean, I learned a lot
00:28:21.100 | 'cause I wasn't, for the most part,
00:28:23.820 | backing up much of that data
00:28:27.020 | because I thought I can afford to lose that data.
00:28:30.900 | It's not like horrible.
00:28:32.460 | I mean, I think you've spoken about the crown jewels,
00:28:35.900 | like making sure there's things you really protect.
00:28:38.380 | And I have, I'm very conscious security-wise
00:28:43.300 | on the crown jewels.
00:28:45.100 | But there's a bunch of stuff, like personal videos.
00:28:49.020 | They're not, I don't have anything creepy,
00:28:50.900 | but just fun things I did
00:28:53.340 | that because they're very large or 4K or something like that,
00:28:56.100 | I kept them on there thinking RAID 5 will protect it.
00:28:59.740 | And just, I lost a bunch of stuff,
00:29:01.260 | including raw footage from interviews
00:29:06.100 | and all that kind of stuff.
00:29:08.340 | So it's painful.
00:29:09.540 | And I'm sure there's a lot of painful stuff like that
00:29:12.100 | for the 4,000 to 5,000 people that use QNAP.
00:29:15.580 | And there's a lot of interesting ethical questions here.
00:29:18.460 | Do you pay them?
00:29:19.460 | Does QNAP pay them?
00:29:22.020 | Do the individuals pay them?
00:29:25.660 | Especially when you don't know if it's going to work or not.
00:29:29.060 | Do you wait?
00:29:30.260 | So QNAP said that please don't pay them.
00:29:35.860 | We're working very hard day and night to solve this.
00:29:39.060 | It's so philosophically interesting to me
00:29:44.100 | because I also project onto them thinking,
00:29:46.580 | what is their motivation?
00:29:48.260 | Because the way they phrased it on purpose, perhaps,
00:29:51.940 | but I'm not sure if that actually reflects
00:29:53.360 | their real motivation,
00:29:54.820 | is maybe they're trying to help themselves sleep at night.
00:29:59.260 | Basically saying, this is not about you.
00:30:01.300 | This is about the company with the vulnerabilities.
00:30:04.380 | Just like you mentioned, this is the justification they have
00:30:07.340 | but they're hurting real people.
00:30:09.580 | They hurt me but I'm sure there's a few others
00:30:12.060 | that are really hurt.
00:30:13.420 | - And the zero day factor is a big one.
00:30:17.220 | QNAP right now is trying to figure out
00:30:22.340 | what the hell is wrong with their system
00:30:23.880 | that would let this in.
00:30:25.520 | And even if they pay,
00:30:28.580 | if they still don't know where the zero day is,
00:30:30.820 | what's to say that they won't just hit them again
00:30:32.780 | and hit you again?
00:30:34.260 | So that really complicates things.
00:30:36.580 | And that is a huge advancement for ransomware.
00:30:40.860 | It's really only been, I think, in the last 18 months
00:30:44.740 | that we've ever really seen ransomware exploit zero days
00:30:48.420 | to pull these off.
00:30:49.260 | Usually 80% of them, I think the data shows 80% of them
00:30:54.260 | come down to a lack of two-factor authentication.
00:30:58.460 | So when someone gets hit by a ransomware attack,
00:31:01.420 | they don't have two-factor authentication on,
00:31:04.420 | their employees were using stupid passwords.
00:31:07.660 | You can mitigate that in the future.
00:31:09.740 | This one, they don't know.
00:31:10.940 | They probably don't know.
00:31:11.860 | - Yeah, and it was, I guess it's zero click
00:31:14.380 | 'cause I didn't have to do anything.
00:31:16.220 | The only thing, well, here's the thing.
00:31:20.020 | I did basics of, I put it behind a firewall,
00:31:26.220 | I followed instructions.
00:31:27.980 | But I didn't really pay attention.
00:31:30.380 | So maybe there's a misconfiguration of some sort
00:31:34.260 | that's easy to make.
00:31:36.280 | It's difficult.
00:31:37.120 | We have a personal NAS.
00:31:39.120 | So I'm not willing to sort of say
00:31:43.500 | that I did everything I possibly could.
00:31:45.540 | But I did a lot of reasonable stuff
00:31:49.820 | and they still hit it with zero clicks.
00:31:51.540 | I didn't have to do anything.
00:31:52.540 | - Yeah, well, it's like a zero day
00:31:54.140 | and it's a supply chain attack.
00:31:55.880 | You're getting hit from your supplier.
00:31:59.260 | You're getting hit because of your vendor.
00:32:01.700 | And it's also a new thing for ransomware groups
00:32:04.340 | to go to the individuals to pressure them to pay.
00:32:07.700 | There was this really interesting case,
00:32:09.900 | I think it was in Norway,
00:32:12.020 | where there was a mental health clinic that got hit.
00:32:16.140 | And the cyber criminals were going
00:32:17.940 | to the patients themselves to say,
00:32:20.620 | pay this or we're going to release your psychiatric records.
00:32:25.460 | I mean, talk about hell.
00:32:28.220 | In terms of whether to pay,
00:32:30.620 | that is on the cheaper end of the spectrum.
00:32:33.740 | - From the individual or from the company?
00:32:35.620 | - Both.
00:32:36.740 | We've seen, for instance,
00:32:39.420 | there was an Apple supplier in Taiwan.
00:32:43.020 | They got hit and the ransom demand was 50 million.
00:32:47.380 | I'm surprised it's only 1.8 million.
00:32:49.460 | I'm sure it's going to go up.
00:32:50.900 | And it's hard.
00:32:53.340 | There's obviously governments
00:32:55.660 | and maybe in this case,
00:32:57.100 | the company are going to tell you,
00:32:58.940 | we recommend you don't pay or please don't pay.
00:33:02.160 | But the reality on the ground
00:33:04.940 | is that some businesses can't operate.
00:33:08.020 | Some countries can't function.
00:33:09.700 | I mean, the under reported storyline
00:33:13.740 | of Colonial Pipeline was after the company got hit
00:33:18.740 | and took the preemptive step of shutting down the pipeline
00:33:22.180 | because their billing systems were frozen,
00:33:24.260 | they couldn't charge customers downstream.
00:33:27.180 | My colleague, David Sanger,
00:33:29.100 | and I got our hands on a classified assessment
00:33:32.820 | that said that as a country,
00:33:35.780 | we could have only afforded two to three more days
00:33:38.500 | of Colonial Pipeline being down.
00:33:40.700 | And it was really interesting.
00:33:42.100 | I thought it was the gas and the jet fuel,
00:33:44.300 | but it wasn't.
00:33:45.660 | We were sort of prepared for that.
00:33:47.460 | It was the diesel.
00:33:49.020 | Without the diesel, the refineries couldn't function
00:33:52.300 | and it would have totally screwed up the economy.
00:33:54.780 | And so there was almost this like national security,
00:33:59.540 | economic impetus for them to pay this ransom.
00:34:04.540 | And the other one I always think about is Baltimore.
00:34:06.980 | You know, when the city of Baltimore got hit,
00:34:09.100 | I think the initial ransom demand
00:34:11.420 | was something around 76,000.
00:34:13.820 | It may have even started smaller than that.
00:34:16.780 | And Baltimore stood its ground and didn't pay.
00:34:20.060 | But ultimately the cost to remediate was $18 million.
00:34:24.260 | It's a lot for the city of Baltimore.
00:34:26.820 | That's money that could have gone
00:34:28.100 | to public school education and roads and public health.
00:34:32.500 | And instead it just went to rebuilding
00:34:35.180 | these systems from scratch.
00:34:36.340 | And so a lot of residents in Baltimore were like,
00:34:39.180 | why the hell didn't you pay the $76,000?
00:34:43.540 | So it's not obvious.
00:34:45.700 | You know, it's easy to say don't pay
00:34:48.140 | because why you're funding their R&D for the next go round.
00:34:51.780 | But it's too often, it's too complicated.
00:34:56.940 | - So on the individual level, just like, you know,
00:34:59.740 | the way I feel personally from this attack,
00:35:03.260 | have you talked to people that were kind of victims
00:35:05.300 | in the same way I was,
00:35:06.400 | but maybe more dramatic ways or so on?
00:35:08.540 | You know, in the same way that violence hurts people.
00:35:13.100 | - Yeah. - How much does this
00:35:13.940 | hurt people in your sense and the way you researched it?
00:35:16.700 | - The worst ransomware attack I've covered
00:35:21.020 | on a personal level was an attack on a hospital in Vermont.
00:35:26.020 | And, you know, you think of this as like,
00:35:30.140 | okay, it's hitting their IT networks.
00:35:31.820 | They should still be able to treat patients.
00:35:34.660 | But it turns out that cancer patients
00:35:37.380 | couldn't get their chemo anymore
00:35:39.260 | because the protocol of who gets what is very complicated.
00:35:43.000 | And without it, nurses and doctors couldn't access it.
00:35:47.080 | So they were turning chemo patients away,
00:35:50.520 | cancer patients away.
00:35:52.160 | One nurse told us,
00:35:54.640 | "I don't know why people aren't screaming about this.
00:35:57.240 | That the only thing I've seen that even compares
00:35:59.680 | to what we're seeing at this hospital right now
00:36:02.040 | was when I worked in the burn unit
00:36:04.440 | after the Boston Marathon bombing."
00:36:06.840 | You know, they really put it in these super dramatic terms.
00:36:10.520 | And last year there was a report in the Wall Street Journal
00:36:15.040 | where they attributed an infant death
00:36:18.880 | to a ransomware attack because a mom came in
00:36:23.560 | and whatever device they were using to monitor the fetus
00:36:28.480 | wasn't working because of the ransomware attack.
00:36:30.680 | And so they attributed this infant death
00:36:33.600 | to the ransomware attack.
00:36:34.680 | Now on a bigger scale, but less personal,
00:36:39.120 | when there was the NotPetya attack.
00:36:41.320 | So this was an attack by Russia on Ukraine
00:36:45.520 | that came at them through a supplier,
00:36:49.580 | a tax software company in that case,
00:36:53.080 | that didn't just hit any government agency
00:36:56.800 | or business in Ukraine that used this tax software.
00:36:59.680 | It actually hit any business all over the world
00:37:02.680 | that had even a single employee working remotely in Ukraine.
00:37:07.360 | So it hit Maersk, the shipping company,
00:37:09.520 | it hit Pfizer, it hit FedEx,
00:37:11.680 | but the one I will never forget is Merck.
00:37:14.560 | It paralyzed Merck's factories.
00:37:17.760 | I mean, it really created an existential crisis
00:37:20.300 | for the company.
00:37:21.600 | Merck had to tap into the CDC's emergency supplies
00:37:25.400 | of the Gardasil vaccine that year
00:37:27.800 | because their whole vaccine production line
00:37:29.920 | had been paralyzed in that attack.
00:37:32.160 | Imagine if that was gonna happen right now
00:37:36.040 | to Pfizer or Moderna or Johnson & Johnson.
00:37:39.480 | You know, imagine.
00:37:41.200 | I mean, that would really create
00:37:43.720 | a global cyber terrorist attack, essentially.
00:37:47.240 | - And that's almost unintentional.
00:37:49.360 | - I thought for a long time,
00:37:51.280 | I always labeled it as collateral damage.
00:37:53.960 | But actually just today,
00:37:57.300 | there was a really impressive threat researcher at Cisco,
00:38:02.300 | which has this threat intelligence division called Talos,
00:38:05.400 | who said, "Stop calling it collateral damage."
00:38:09.000 | They could see who was gonna get hit
00:38:12.280 | before they deployed that malware.
00:38:15.640 | It wasn't collateral damage.
00:38:17.960 | It was intentional.
00:38:19.080 | They meant to hit any business
00:38:21.680 | that did business with Ukraine.
00:38:23.240 | It was to send a message to them, too.
00:38:26.320 | So I don't know if that's accurate.
00:38:28.760 | I always thought of it
00:38:29.940 | as sort of the sloppy collateral damage,
00:38:32.000 | but it definitely made me think.
00:38:34.840 | - So how much of this between states
00:38:37.040 | is going to be a part of war,
00:38:40.080 | these kinds of attacks on Ukraine,
00:38:43.980 | between Russia and US,
00:38:47.800 | Russia and China, China and US?
00:38:49.980 | Let's look at China and US.
00:38:53.240 | Do you think China and US are going to escalate
00:38:58.240 | something that would be called a war
00:39:01.360 | purely in the space of cyber?
00:39:04.240 | - I believe any geopolitical conflict
00:39:08.840 | from now on is guaranteed to have some cyber element to it.
00:39:15.200 | The Department of Justice recently declassified a report
00:39:20.480 | that said China's been hacking into our pipelines,
00:39:22.700 | and it's not for intellectual property theft.
00:39:25.260 | It's to get a foothold
00:39:26.980 | so that if things escalate in Taiwan, for example,
00:39:30.460 | they are where they need to be to shut our pipelines down,
00:39:33.280 | and we just got a little glimpse
00:39:34.960 | of what that looked like with Colonial Pipeline
00:39:37.840 | and the panic buying and the jet fuel shortages
00:39:40.920 | and that assessment I just mentioned about the diesel.
00:39:44.560 | So they're there, they've gotten there.
00:39:48.180 | Anytime I read a report about new aggression
00:39:53.440 | from fighter jets, Chinese fighter jets in Taiwan,
00:39:57.120 | or what's happening right now with Russia's buildup
00:40:00.700 | on the Ukraine border, or India, Pakistan,
00:40:04.660 | I'm always looking at it through a cyber lens,
00:40:07.440 | and it really bothers me that other people aren't
00:40:10.400 | because there is no way that these governments
00:40:15.520 | and these nation states are not going to use their access
00:40:19.260 | to gain some advantage in those conflicts.
00:40:23.760 | And I'm now in a position where I'm an advisor
00:40:28.640 | to the Cybersecurity and Infrastructure Security Agency at DHS.
00:40:33.640 | So I'm not saying anything classified here,
00:40:37.540 | but I just think that it's really important
00:40:41.220 | to understand just generally what the collateral damage
00:40:46.020 | could be for American businesses and critical infrastructure
00:40:50.020 | in any of these escalated conflicts around the world.
00:40:54.060 | Because just generally, our adversaries have learned
00:40:59.060 | that they might never be able to match us
00:41:02.660 | in terms of our traditional military spending
00:41:05.080 | on traditional weapons and fighter jets.
00:41:08.060 | But we have a very soft underbelly when it comes to cyber.
00:41:12.980 | 80% or more of America's critical infrastructure,
00:41:17.400 | so pipelines, power grid, nuclear plants, water systems,
00:41:23.600 | is owned and operated by the private sector.
00:41:26.860 | And for the most part, there is nothing out there
00:41:30.500 | legislating that those companies
00:41:33.820 | share the fact they've been breached.
00:41:35.760 | They don't even have to tell the government
00:41:37.320 | they've been hit.
00:41:38.760 | There's nothing mandating that they even meet
00:41:40.840 | a bare minimum standard of cybersecurity.
00:41:43.740 | And that's it.
00:41:46.700 | So even when there are these attacks,
00:41:49.020 | most of the time we don't even know about it.
00:41:51.360 | So if you were gonna design a system
00:41:54.360 | to be as blind and vulnerable as possible,
00:41:57.980 | that's pretty much it.
00:42:00.680 | That's what it looks like;
00:42:01.840 | that's what we have here in the United States.
00:42:04.480 | And everyone here is just operating like,
00:42:08.440 | let's just keep hooking up everything for convenience.
00:42:12.200 | Software eats the world.
00:42:13.840 | Let's just keep going for cost, for convenience sake,
00:42:18.640 | just because we can.
00:42:20.760 | And when you study these issues,
00:42:22.840 | and you study these attacks,
00:42:24.400 | and you study the advancement,
00:42:26.960 | and the uptick in frequency,
00:42:29.400 | and the lower barrier to entry
00:42:32.320 | that we see every single year,
00:42:34.760 | you realize just how dumb "software eats the world" is.
00:42:39.760 | And no one has ever stopped to pause and think,
00:42:43.180 | should we be hooking up these systems to the internet?
00:42:48.040 | They've just been saying, can we?
00:42:49.760 | Let's do it.
00:42:51.280 | And that's a real problem.
00:42:52.480 | And just in the last year,
00:42:54.680 | we've seen a record number of zero-day attacks.
00:42:56.920 | I think there were 80 last year,
00:42:59.080 | which is probably more than double what it was in 2019.
00:43:02.920 | A lot of those were nation states.
00:43:05.240 | We live in a world
00:43:08.220 | with a lot of geopolitical hot points right now.
00:43:11.640 | And where those geopolitical hot points are,
00:43:15.120 | are places where countries have been investing heavily
00:43:19.120 | in offensive cyber tools.
00:43:21.760 | - If you're a nation state,
00:43:23.440 | the goal would be to maximize the footprint of zero-day,
00:43:29.520 | like super-secret zero-day that nobody's aware of.
00:43:33.360 | And whenever war is initiated,
00:43:36.000 | the huge negative effects of shutting down infrastructure,
00:43:38.920 | or any kind of zero-day, is the chaos it creates.
00:43:42.000 | So if you just, there's a certain threshold
00:43:43.600 | when you create the chaos,
00:43:45.240 | the markets plummet, just everything goes to hell.
00:43:48.920 | - So it's not just zero-days.
00:43:52.880 | We make it so easy for threat actors.
00:43:56.640 | I mean, we're not using two-factor authentication.
00:44:00.440 | We're not patching.
00:44:01.780 | There was the Shellshock vulnerability
00:44:04.880 | that was discovered a couple years ago.
00:44:08.200 | It's still being exploited,
00:44:09.920 | because so many people haven't fixed it.
00:44:13.280 | So the zero-days are really the sexy stuff.
00:44:16.600 | And what really drew me to the zero-day market
00:44:19.240 | was the moral calculus we talked about.
00:44:21.440 | Particularly from the US government's point of view.
00:44:26.240 | How do they justify leaving these systems so vulnerable
00:44:31.240 | when we use them here,
00:44:33.560 | and we're baking more of our critical infrastructure
00:44:36.080 | with this vulnerable software?
00:44:38.160 | It's not like we're using one set of technology,
00:44:41.120 | and Russia's using another, and China's using this.
00:44:43.800 | We're all using the same technology.
00:44:45.960 | So when you find a zero-day in Windows,
00:44:48.920 | you're not just leaving it open so you can spy on Russia
00:44:52.400 | or implant yourself in the Russian grid.
00:44:54.600 | You're leaving Americans vulnerable too.
00:44:57.520 | But zero-days are like, that is the secret sauce.
00:45:02.240 | That's the superpower.
00:45:04.440 | And I always say, every country now,
00:45:07.920 | with the exception of Antarctica,
00:45:09.440 | someone added the Vatican to my list,
00:45:11.960 | is trying to find offensive hacking tools in zero-days
00:45:16.760 | to make 'em work.
00:45:17.600 | And those that don't have the skills
00:45:20.680 | now have this market that they can tap into
00:45:23.520 | where $2.5 million, that's chump change
00:45:26.480 | for a lot of these nation states.
00:45:28.200 | It's a hell of a lot less
00:45:29.400 | than trying to build the next fighter jet.
00:45:31.940 | But yeah, the goal is chaos.
00:45:34.520 | I mean, why did Russia turn off the lights twice in Ukraine?
00:45:39.060 | You know, I think part of it is chaos.
00:45:42.740 | I think part of it is to sow the seeds of doubt
00:45:46.220 | in their current government.
00:45:47.940 | Your government can't even keep your lights on.
00:45:50.380 | Why are you sticking with them?
00:45:52.420 | You know, come over here
00:45:54.140 | and we'll keep your lights on at least.
00:45:56.180 | You know, there's like a little bit of that.
00:45:58.300 | - Nuclear weapons seems to have helped prevent nuclear war.
00:46:03.300 | Is it possible that we have so many vulnerabilities
00:46:08.260 | and so many attack vectors on each other
00:46:11.260 | that it will kind of achieve the same kind of equilibrium
00:46:15.220 | like mutually assured destruction?
00:46:17.060 | - Yeah.
00:46:18.700 | - That's one hopeful solution to this.
00:46:20.700 | Do you have any hope for this particular solution?
00:46:23.780 | - You know, nuclear analogies always tend to fall apart
00:46:26.480 | when it comes to cyber,
00:46:27.500 | mainly because you don't need fissile material.
00:46:30.940 | You know, you just need a laptop and the skills
00:46:33.260 | and you're in the game.
00:46:34.580 | So it's a really low barrier to entry.
00:46:38.220 | The other thing is attributions harder.
00:46:40.940 | And we've seen countries muck around with attribution.
00:46:44.300 | We've seen, you know, nation states piggyback
00:46:47.120 | on other countries' spy operations
00:46:49.300 | and just sit there and siphon out whatever they're getting.
00:46:53.360 | We learned some of that from the Snowden documents.
00:46:56.160 | We've seen Russia hack into Iran's
00:46:58.500 | command and control attack servers.
00:47:01.380 | We've seen them hit a Saudi petrochemical plant
00:47:05.420 | where they did neutralize the safety locks at the plant
00:47:08.180 | and everyone assumed that it was Iran,
00:47:10.140 | given Iran had been targeting Saudi oil companies forever.
00:47:13.640 | But nope, it turned out that it was
00:47:15.220 | a graduate research institute outside Moscow.
00:47:17.760 | So you see countries kind of playing around
00:47:20.260 | with attribution.
00:47:22.260 | I think because they think, okay, if I do this,
00:47:25.380 | like how am I gonna cover up that it came from me
00:47:27.800 | because I don't wanna risk the response.
00:47:30.880 | So people are sort of dancing around this.
00:47:33.180 | It's just in a very different way.
00:47:34.980 | And, you know, at the Times, I'd covered the Chinese hacks
00:47:39.620 | of infrastructure companies like pipelines.
00:47:42.860 | I'd covered the Russian probes of nuclear plants.
00:47:46.120 | I'd covered the Russian attacks on the Ukraine grid.
00:47:50.100 | And then in 2018, my colleague David Sanger and I
00:47:53.860 | covered the fact that US Cyber Command
00:47:57.140 | had been hacking into the Russian grid
00:47:59.620 | and making a pretty loud show of it.
00:48:02.220 | And when we went to the National Security Council,
00:48:05.260 | because that's what journalists do
00:48:06.820 | before they publish a story,
00:48:08.100 | they give the other side a chance to respond.
00:48:11.400 | I assumed we would be in for that really awkward,
00:48:14.500 | painful conversation where they would say,
00:48:17.020 | "You will have blood on your hands
00:48:18.460 | "if you publish this story."
00:48:20.280 | And instead, they gave us the opposite answer.
00:48:23.000 | They said, "We have no problem
00:48:25.060 | "with you publishing this story."
00:48:28.060 | Well, they didn't say it out loud,
00:48:29.360 | but it was pretty obvious they wanted Russia to know
00:48:33.140 | that we're hacking into their power grid too,
00:48:35.340 | and they better think twice before they do to us
00:48:38.460 | what they had done to Ukraine.
00:48:40.220 | So yeah, you know, we have stumbled into this new era
00:48:44.660 | of mutually assured digital destruction.
00:48:47.780 | I think another sort of quasi-norm we've stumbled into
00:48:52.780 | is proportional responses.
00:48:56.940 | You know, there's this idea that if you get hit,
00:49:00.560 | you're allowed to respond proportionally
00:49:03.480 | at a time and place of your choosing.
00:49:05.480 | You know, that is how the language always goes.
00:49:08.440 | That's what Obama said after North Korea hit Sony.
00:49:12.800 | "We will respond at a time and place of our choosing."
00:49:15.700 | But no one really knows what that response looks like.
00:49:21.120 | And so what you see a lot of the time
00:49:22.760 | are just these like just short of war attacks.
00:49:27.140 | You know, Russia turned off the power in Ukraine,
00:49:29.320 | but it wasn't like it stayed off for a week.
00:49:31.860 | You know, it stayed off for a number of hours.
00:49:34.820 | You know, NotPetya hit those companies pretty hard,
00:49:39.660 | but no one died, you know?
00:49:41.380 | And the question is, what's gonna happen when someone dies?
00:49:44.560 | And can a nation-state masquerade as a cyber-criminal group,
00:49:49.560 | as a ransomware group?
00:49:51.620 | And that's what really complicates
00:49:53.500 | coming to some sort of digital Geneva Convention.
00:49:57.140 | Like there's been a push from Brad Smith at Microsoft.
00:50:01.140 | We need a digital Geneva Convention.
00:50:03.700 | And on its face, it sounds like a no-brainer.
00:50:06.060 | Yeah, why wouldn't we all agree to stop hacking
00:50:08.860 | into each other's civilian hospital systems,
00:50:11.140 | elections, power grid, pipelines?
00:50:15.580 | But when you talk to people in the West,
00:50:19.780 | officials in the West, they'll say,
00:50:20.900 | "We would never, we'd love to agree to it,
00:50:24.220 | but we'd never do it when you're dealing with Xi or Putin
00:50:28.460 | or Kim Jong-un."
00:50:30.540 | Because a lot of times,
00:50:32.860 | they outsource these operations to cyber-criminals.
00:50:37.060 | In China, we see a lot of these attacks come
00:50:39.340 | from this loose satellite network of private citizens
00:50:43.060 | that work at the behest of the Ministry of State Security.
00:50:46.660 | So how do you come to some sort of state-to-state agreement
00:50:51.340 | when you're dealing with transnational actors
00:50:55.700 | and cyber-criminals, where it's really hard to pin down
00:50:59.140 | whether that person was acting alone
00:51:01.660 | or whether they were acting at the behest of the MSS
00:51:04.980 | or the FSB?
00:51:06.580 | And a couple of years ago, I remember,
00:51:09.420 | can't remember if it was before or after NotPetya,
00:51:11.740 | but Putin said, "Hackers are like artists
00:51:14.740 | who wake up in the morning in a good mood
00:51:16.540 | and start painting."
00:51:18.020 | In other words, I have no say over what they do or don't do.
00:51:21.420 | So how do you come to some kind of norm
00:51:24.300 | when that's how he's talking about these issues
00:51:26.900 | and he's just decimated Merck and Pfizer
00:51:30.180 | and another however many thousand companies?
00:51:34.220 | - That is the fundamental difference
00:51:35.660 | between nuclear weapons and cyber-attacks
00:51:38.860 | is the attribution,
00:51:40.380 | or one of the fundamental differences.
00:51:42.540 | If you can fix one thing in the world
00:51:45.180 | in terms of cybersecurity,
00:51:47.180 | that would make the world a better place,
00:51:48.940 | what would you fix?
00:51:51.100 | So you're not allowed to fix like authoritarian regimes
00:51:54.100 | and you can't. (laughs)
00:51:55.420 | - Right.
00:51:56.620 | - You have to keep that,
00:51:57.940 | you have to keep human nature as it is.
00:52:00.580 | In terms of on the security side,
00:52:02.820 | technologically speaking,
00:52:05.100 | you mentioned there's no regulation on companies
00:52:07.860 | in the United States,
00:52:09.020 | so if you could just fix one thing with the snap of a finger,
00:52:14.860 | what would you fix?
00:52:15.740 | - Two-factor authentication,
00:52:17.620 | multi-factor authentication.
00:52:19.820 | It's ridiculous how many of these attacks come in
00:52:24.780 | because someone didn't turn on multi-factor authentication.
00:52:27.620 | I mean, Colonial Pipeline, okay,
00:52:30.700 | they took down the biggest conduit
00:52:34.260 | for gas, jet fuel, and diesel
00:52:35.860 | to the East Coast of the United States of America, how?
00:52:39.180 | Because they forgot to deactivate an old employee account
00:52:42.180 | whose password had been traded on the dark web
00:52:44.700 | and they'd never turned on two-factor authentication.
00:52:48.020 | This water treatment facility in Florida
00:52:50.140 | was hacked last year, how did it happen?
00:52:53.220 | They were using Windows XP from like a decade ago
00:52:56.500 | that can't even get patches if you want it to
00:52:59.300 | and they didn't have two-factor authentication.
00:53:01.700 | Time and time again,
00:53:02.700 | if they just switched on two-factor authentication,
00:53:06.740 | some of these attacks wouldn't have been possible.
00:53:08.340 | Now, if I could snap my fingers,
00:53:10.060 | that's the thing I would do right now.
00:53:11.780 | But of course, this is a cat and mouse game
00:53:15.100 | and then the attacker's onto the next thing.
00:53:17.540 | But I think right now, that is like bar none,
00:53:21.700 | that is just, that is the easiest, simplest way
00:53:24.660 | to deflect the most attacks.
00:53:25.900 | And the name of the game right now isn't perfect security.
00:53:29.780 | Perfect security is impossible.
00:53:32.220 | They will always find a way in.
00:53:34.340 | The name of the game right now is
00:53:35.880 | make yourself a little bit harder to attack
00:53:39.060 | than your competitor or than anyone else out there
00:53:41.460 | so that they just give up and move along.
00:53:44.300 | And maybe if you are a target for an advanced nation state
00:53:48.740 | or the SVR, you're gonna get hacked no matter what.
00:53:53.740 | But you can make cyber criminal groups,
00:53:56.540 | DeadBolt or whoever it is, you can make their jobs a lot harder
00:53:59.780 | simply by doing the bare basics.
00:54:03.180 | And the other thing is stop reusing your passwords.
00:54:05.260 | But if I only get one, then two-factor authentication.
00:54:08.060 | - So what is two-factor authentication?
00:54:10.460 | Factor one is what, logging in with a password?
00:54:13.260 | And factor two is like have another device
00:54:15.900 | or another channel through which you can confirm,
00:54:18.420 | yeah, that's me.
00:54:19.460 | - Yes, usually this happens through some kind of text.
00:54:23.500 | You get your one-time code from Bank of America
00:54:26.700 | or from Google.
00:54:28.700 | The better way to do it is spend $20
00:54:31.460 | buying yourself a FIDO key on Amazon.
00:54:34.240 | That's a hardware device.
00:54:36.060 | And if you don't have that hardware device with you,
00:54:39.460 | then you're not gonna get in.
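For readers who want to see the mechanics behind that one-time code, here is a minimal sketch of the time-based scheme (TOTP, RFC 6238) that most authenticator apps and services use. The shared secret and the `login` helper are hypothetical, just to illustrate why a stolen password alone no longer gets an attacker in.

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time code from a shared secret (RFC 6238 style)."""
    counter = int(time.time()) // interval            # time step shared by phone and server
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Hypothetical secret provisioned when the user enrolls their second factor.
SECRET = b"example-shared-secret"

def login(password_ok: bool, submitted_code: str) -> bool:
    # Factor one: the password check. Factor two: the one-time code from the device.
    return password_ok and hmac.compare_digest(submitted_code, totp(SECRET))

if __name__ == "__main__":
    print("code on the device right now:", totp(SECRET))
    print("login with only a stolen password:", login(True, "000000"))
```

A hardware FIDO key goes a step further: it signs a challenge with a private key that never leaves the device, so there is no shared code for an attacker to phish at all.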
00:54:41.300 | And the whole goal is, I mean, basically,
00:54:43.780 | my first half of my decade at the Times
00:54:46.020 | was spent covering like the cop beat.
00:54:48.980 | It was like Home Depot got breached,
00:54:51.340 | News at 11, Target, Neiman Marcus,
00:54:54.380 | like who wasn't hacked over the course of those five years?
00:54:58.380 | And a lot of those companies that got hacked,
00:55:01.060 | what did hackers take?
00:55:02.160 | They took the credentials, they took the passwords.
00:55:05.480 | They can make a pretty penny selling them on the dark web.
00:55:08.980 | And people reuse their passwords.
00:55:11.540 | So you get one from God knows who, I don't know,
00:55:15.540 | LastPass, the worst-case example, actually LastPass.
00:55:19.340 | But you get one and then you go test it
00:55:21.860 | on their email account.
00:55:23.420 | And you go test it on their brokerage account.
00:55:25.580 | And you test it on their cold storage account.
00:55:28.620 | That's how it works.
00:55:29.580 | But if you have multi-factor authentication,
00:55:32.900 | then they can't get in
00:55:34.500 | because they might have your password,
00:55:36.740 | but they don't have your phone,
00:55:38.040 | they don't have your FIDO key.
00:55:39.740 | So you keep them out.
00:55:42.700 | And I get a lot of alerts that tell me
00:55:46.540 | someone is trying to get into your Instagram account
00:55:49.860 | or your Twitter account or your email account.
00:55:52.060 | And I don't worry because I use multi-factor authentication.
00:55:55.840 | They can try all day.
00:55:57.200 | Okay, I worry a little bit.
00:55:59.500 | But it's the simplest thing to do and we don't even do it.
00:56:04.500 | - Well, there's an interface aspect to it
00:56:06.860 | 'cause it's pretty annoying if it's implemented poorly.
00:56:09.900 | - Yeah, true.
00:56:11.420 | - So actually bad implementation
00:56:13.060 | of two-factor authentication, not just bad,
00:56:16.400 | but just something that adds friction
00:56:19.040 | is a security vulnerability, I guess,
00:56:21.360 | because it's really annoying.
00:56:23.480 | Like I think MIT for a while had two-factor authentication.
00:56:27.560 | It was really annoying.
00:56:28.680 | The number of times it pings you,
00:56:35.040 | it asks to re-authenticate across multiple subdomains.
00:56:39.960 | It just feels like a pain.
00:56:41.720 | I don't know what the right balance is there.
00:56:44.120 | - Yeah, it feels like friction in our frictionless society.
00:56:48.680 | It feels like friction.
00:56:49.800 | It's annoying.
00:56:51.040 | That's security's biggest problem.
00:56:52.840 | It's annoying.
00:56:54.520 | We need the Steve Jobs of security to come along
00:56:57.880 | and we need to make it painless.
00:56:59.600 | And actually, on that point,
00:57:02.060 | Apple has probably done more for security than anyone else
00:57:07.060 | simply by introducing biometric authentication,
00:57:10.880 | first with the fingerprint and then with Face ID.
00:57:13.640 | And it's not perfect, but if you think just eight years ago,
00:57:17.440 | everyone was running around with either no passcode,
00:57:20.400 | an optional passcode, or a four-digit passcode
00:57:22.960 | on their phone that anyone,
00:57:24.500 | think of what you can get when you get someone's iPhone,
00:57:27.320 | if you steal someone's iPhone.
00:57:29.060 | And props to them for introducing the fingerprint
00:57:32.600 | and Face ID.
00:57:33.440 | And again, it wasn't perfect,
00:57:34.880 | but it was a huge step forward.
00:57:36.960 | Now it's time to make another huge step forward.
00:57:40.200 | I wanna see the password die.
00:57:42.880 | I mean, it's gotten us as far as it was ever gonna get us
00:57:46.880 | and I hope whatever we come up with next
00:57:49.560 | is not gonna be annoying, is gonna be seamless.
00:57:52.360 | - When I was at Google, that's what we worked on,
00:57:55.080 | and there are a lot of names for this: active authentication
00:57:58.200 | or passive authentication.
00:57:59.900 | So basically use biometric data,
00:58:02.500 | not just like a fingerprint,
00:58:03.780 | but everything from your body to identify who you are,
00:58:07.180 | like movement patterns.
00:58:09.140 | So basically create a lot of layers of protection
00:58:12.700 | where it's very difficult to fake,
00:58:15.420 | including like face unlock,
00:58:18.780 | checking that it's your actual face,
00:58:21.260 | like the liveness tests.
00:58:23.020 | So like from video, so unlocking it with video,
00:58:26.260 | voice, the way you move the phone,
00:58:30.040 | the way you take it out of the pocket, that kind of thing.
00:58:33.280 | All of those factors.
00:58:35.040 | It's a really hard problem though.
00:58:37.360 | And ultimately it's very difficult to beat the password
00:58:42.080 | in terms of security.
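As a rough sketch of the idea being described here: instead of one pass/fail secret, several weak behavioral signals get fused into a confidence score, and only a low score triggers an explicit challenge. The signal names, weights, and threshold below are all invented for illustration, not any product's actual model.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    face_match: float      # 0..1 similarity from a liveness-checked face unlock
    gait_match: float      # 0..1 similarity of how the phone is carried and picked up
    typing_match: float    # 0..1 similarity of typing rhythm
    known_location: bool   # device seen at a familiar place

# Hypothetical weights and threshold; a real system would tune or learn these per user.
WEIGHTS = {"face": 0.5, "gait": 0.2, "typing": 0.2, "location": 0.1}
THRESHOLD = 0.8

def trust_score(s: Signals) -> float:
    return (WEIGHTS["face"] * s.face_match
            + WEIGHTS["gait"] * s.gait_match
            + WEIGHTS["typing"] * s.typing_match
            + WEIGHTS["location"] * (1.0 if s.known_location else 0.0))

def authenticate(s: Signals) -> str:
    # High confidence: stay unlocked. Low confidence: fall back to an explicit factor.
    return "unlocked" if trust_score(s) >= THRESHOLD else "ask for passcode / second factor"

print(authenticate(Signals(0.95, 0.8, 0.7, True)))    # looks like the owner
print(authenticate(Signals(0.20, 0.3, 0.4, False)))   # looks like someone else
```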
00:58:43.560 | - Well, there's a company that I actually will call out
00:58:46.160 | and that's Abnormal Security.
00:58:48.220 | So they work on email attacks.
00:58:51.440 | And it was started by a couple of guys
00:58:55.080 | who were doing, I think, ad tech at Twitter.
00:58:59.360 | So, you know, ad technology now,
00:59:01.560 | like it's a joke how much they know about us.
00:59:03.780 | You know, you always hear the conspiracy theories
00:59:05.840 | that you saw someone's shoes
00:59:08.240 | and next thing you know, it's on your phone.
00:59:10.400 | It's amazing what they know about you.
00:59:12.300 | And they're basically taking that
00:59:15.800 | and they're applying it to attacks.
00:59:19.680 | So they're saying, okay, you know,
00:59:21.920 | if you're, this is what your email patterns are.
00:59:24.960 | It might be different for you and me
00:59:26.360 | because we're emailing strangers all the time.
00:59:28.640 | But for most people,
00:59:31.000 | their email patterns are pretty predictable.
00:59:33.920 | And if something strays from that pattern, that's abnormal.
00:59:38.400 | And they'll block it, they'll investigate it, you know,
00:59:41.760 | and that's great.
00:59:43.560 | You know, let's start using that kind of targeted
00:59:46.820 | ad technology to protect people.
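A bare-bones sketch of that "abnormal" idea: learn a per-user baseline of who normally emails them, then flag mail that is both outside the baseline and pushing for something sensitive. The sample history and the two rules here are made up purely for illustration; a real product models far richer behavior.

```python
from collections import Counter

# Hypothetical history of senders this user normally hears from.
history = ["bob@corp.example", "payroll@corp.example", "bob@corp.example", "news@vendor.example"]
baseline = Counter(history)

SUSPICIOUS_PHRASES = ("verify your password", "wire transfer", "gift card")

def is_abnormal(sender: str, body: str) -> bool:
    never_seen = baseline[sender] == 0
    pushy = any(phrase in body.lower() for phrase in SUSPICIOUS_PHRASES)
    return never_seen and pushy

print(is_abnormal("ceo@corp-payments.example", "Urgent: buy gift cards today"))  # True
print(is_abnormal("bob@corp.example", "lunch tomorrow?"))                        # False
```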
00:59:50.560 | And yeah, I mean, it's not gonna get us away
00:59:52.980 | from the password and using multi-factor authentication,
00:59:56.320 | but you know, the technology is out there
00:59:59.960 | and we just have to figure out how to use it
01:00:02.080 | in a really seamless way,
01:00:03.400 | because it doesn't matter
01:00:05.680 | if you have the perfect security solution
01:00:07.600 | if no one uses it.
01:00:08.440 | I mean, when I started at the Times,
01:00:10.360 | when I was trying to be really good
01:00:12.200 | about protecting sources,
01:00:14.880 | I was trying to use PGP encryption
01:00:17.760 | and it's like, it didn't work.
01:00:19.480 | You know, the number of mistakes I would probably make
01:00:22.440 | just trying to email someone with PGP just wasn't worth it.
01:00:27.100 | And then Signal came along and Signal made it,
01:00:31.360 | Wickr, you know, they made it a lot easier
01:00:34.700 | to send someone an encrypted text message.
01:00:37.040 | So we have to start investing in creative minds
01:00:42.040 | in good security design.
01:00:45.180 | You know, I really think that's the hack
01:00:46.940 | that's gonna get us out of where we are today.
01:00:50.100 | - What about social engineering?
01:00:52.420 | Do you worry about this sort of hacking people?
01:00:56.760 | - Yes, I mean, this is the worst nightmare
01:01:00.940 | of every chief information security officer out there.
01:01:04.200 | You know, social engineering, we work from home now.
01:01:09.040 | I saw this woman post online, it went viral today,
01:01:15.100 | about how her husband
01:01:18.620 | had this problem at work.
01:01:20.020 | They hired a guy named John and now the guy
01:01:23.660 | that shows up for work every day doesn't act like John.
01:01:27.540 | I mean, think about that.
01:01:31.060 | Like think about the potential for social engineering
01:01:34.020 | in that context.
01:01:35.540 | You know, you apply for a job and you put on a pretty face,
01:01:38.940 | you hire an actor or something,
01:01:40.420 | and then you just get inside the organization
01:01:42.680 | and get access to all that organization's data.
01:01:45.580 | You know, a couple of years ago,
01:01:47.460 | Saudi Arabia planted spies inside Twitter.
01:01:51.180 | You know, why?
01:01:52.020 | Probably because they were trying to figure out
01:01:54.500 | who these people were who were criticizing
01:01:56.460 | the regime on Twitter.
01:01:58.020 | You know, they couldn't do it with a hack from the outside,
01:02:00.040 | so why not plant people on the inside?
01:02:02.260 | And that's like the worst nightmare.
01:02:04.540 | And it also, unfortunately, creates all kinds of xenophobia
01:02:09.540 | at a lot of these organizations.
01:02:11.340 | I mean, if you're gonna have to take that into consideration
01:02:14.780 | then organizations are gonna start looking
01:02:16.700 | really skeptically and suspiciously
01:02:19.420 | at someone who applies for that job from China.
01:02:23.020 | And we've seen that go really badly
01:02:25.780 | at places like the Department of Commerce
01:02:28.540 | where they basically accuse people of being spies
01:02:31.180 | that aren't spies.
01:02:32.020 | So it is the hardest problem to solve.
01:02:35.380 | And it's never been harder to solve
01:02:37.380 | than right at this very moment
01:02:39.180 | when there's so much pressure for companies
01:02:41.300 | to let people work remotely.
01:02:43.940 | - That's actually why I'm single, I'm suspicious
01:02:46.260 | of China and Russia every time I meet somebody,
01:02:49.660 | that they're trying to plant someone and get insider information.
01:02:52.940 | So I'm very, very suspicious.
01:02:54.700 | I keep putting the Turing test in front, no.
01:02:58.420 | - No, I have a friend who worked inside NSA
01:03:02.700 | and was one of their top hackers.
01:03:04.820 | And he's like, "Every time I go to Russia,
01:03:08.500 | "I get hit on by these 10s."
01:03:10.820 | And I come home, my friends are like,
01:03:12.160 | "I'm sorry, you're not a 10."
01:03:13.820 | Like, it's a common story.
01:03:17.020 | - I mean, it's difficult to trust humans
01:03:20.940 | in this day and age online.
01:03:23.820 | So we're working remotely, that's one thing.
01:03:27.460 | But just interacting with people on the internet,
01:03:31.340 | it sounds ridiculous.
01:03:32.500 | But because of this podcast in part,
01:03:35.300 | I've gotten to meet some incredible people.
01:03:37.920 | But it makes you nervous to trust folks.
01:03:43.300 | And I don't know how to solve that problem.
01:03:47.240 | So I'm talking with Mark Zuckerberg,
01:03:51.460 | who dreams about creating the metaverse.
01:03:53.500 | What do you do about that world
01:03:56.820 | where more and more of our lives are in the digital sphere?
01:04:01.500 | Like, one way to phrase it is,
01:04:05.320 | most of our meaningful experiences
01:04:07.940 | at some point will be online.
01:04:12.140 | Like falling in love, getting a job,
01:04:15.800 | or experiencing a moment of happiness with a friend,
01:04:19.900 | with a new friend made online.
01:04:21.900 | All of those things.
01:04:23.100 | Like more and more, the fun we do,
01:04:25.540 | the things that make us love life will happen online.
01:04:29.100 | And if those things have an avatar that's digital,
01:04:32.960 | that's like a way to hack into people's minds.
01:04:35.820 | Whether it's with AI or kind of troll farms
01:04:39.500 | or something like that.
01:04:40.980 | I don't know if there's a way to protect against that.
01:04:43.540 | That might fundamentally rely on our faith
01:04:49.420 | in how good human nature is.
01:04:51.940 | So if most people are good, we're going to be okay.
01:04:54.900 | But if people will tend towards manipulation
01:04:59.220 | and malevolent behavior in search of power,
01:05:03.300 | then we're screwed.
01:05:04.520 | So I don't know if you can comment
01:05:07.860 | on how to keep the metaverse secure.
01:05:10.380 | - Yeah, I mean, all I thought about
01:05:13.700 | when you were talking just now is my three-year-old son.
01:05:16.740 | - Yeah.
01:05:17.580 | - Yeah, he asked me the other day, what's the internet, mom?
01:05:22.540 | And I just almost wanted to cry.
01:05:25.860 | (laughs)
01:05:28.020 | I don't want that for him.
01:05:30.380 | I don't want all of his most meaningful experiences
01:05:33.200 | to be online.
01:05:34.260 | By the time that happens,
01:05:36.120 | how do you know that person's human?
01:05:40.420 | That avatar's human?
01:05:42.500 | I believe in free speech.
01:05:43.620 | I don't believe in free speech for robots and bots.
01:05:47.540 | And look what just happened over the last six years.
01:05:52.540 | We had bots pretending to be Black Lives Matter activists
01:05:56.940 | just to sow some division,
01:05:59.380 | or Texas secessionists,
01:06:01.780 | or organizing anti-Hillary protests,
01:06:06.620 | or just to sow more division,
01:06:09.020 | to tie us up in our own politics
01:06:12.460 | so that we're so paralyzed we can't get anything done.
01:06:15.700 | We can't make any progress,
01:06:17.260 | and we definitely can't handle our adversaries
01:06:19.980 | and their long-term thinking.
01:06:21.600 | It really scares me.
01:06:25.300 | And here's where I just come back to,
01:06:28.340 | just because we can create the metaverse,
01:06:32.540 | just because it sounds like the next logical step
01:06:36.420 | in our digital revolution.
01:06:39.820 | Do I really want my child's most significant moments
01:06:43.980 | to be online?
01:06:45.580 | They weren't for me.
01:06:46.760 | So maybe I'm just stuck in that old-school thinking,
01:06:51.940 | or maybe I've seen too much.
01:06:54.580 | And I'm really sick of being the guinea pig parent generation
01:06:59.580 | for these things.
01:07:01.860 | I mean, it's hard enough with screen time.
01:07:04.660 | But thinking about how to manage the metaverse as a parent
01:07:09.660 | to a young boy, I can't even let my head go there.
01:07:13.740 | That's so terrifying for me.
01:07:16.400 | But we've never stopped any new technology
01:07:21.360 | just because it introduces risks.
01:07:24.040 | We've always said, okay, the promise of this technology
01:07:27.920 | means we should keep going, keep pressing ahead.
01:07:31.760 | We just need to figure out new ways to manage that risk.
01:07:35.560 | And that's the blockchain right now.
01:07:39.520 | When I was covering all of these ransomware attacks,
01:07:44.880 | I thought, okay, this is gonna be it for cryptocurrency.
01:07:49.000 | Governments are gonna put the kibosh down.
01:07:51.400 | They're gonna put the hammer down and say, enough is enough.
01:07:54.780 | We have to put this genie back in the bottle
01:07:56.900 | because it's enabled ransomware.
01:07:59.480 | Five years ago, they would hijack your PC
01:08:02.680 | and they'd say, go to the local pharmacy,
01:08:05.520 | get a e-gift card and tell us what the pin is
01:08:08.340 | and then we'll get your $200.
01:08:10.440 | Now it's pay us five Bitcoin.
01:08:13.420 | And so there's no doubt cryptocurrencies
01:08:16.160 | enabled ransomware attacks.
01:08:17.840 | But after the Colonial Pipeline attack, the ransom was seized.
01:08:22.600 | 'Cause if you remember, the FBI was actually able to go in
01:08:25.760 | and claw some of it back from DarkSide,
01:08:28.120 | which was the ransomware group that hit it.
01:08:30.280 | And I spoke to these guys at TRM Labs.
01:08:34.080 | So they're one of these blockchain intelligence companies.
01:08:37.480 | And a lot of people that work there
01:08:38.680 | used to work at the treasury.
01:08:40.880 | And what they said to me was,
01:08:42.160 | yeah, cryptocurrency has enabled ransomware.
01:08:46.440 | But to track down that ransom payment would have taken,
01:08:51.440 | if we were dealing with fiat currency,
01:08:54.760 | would have taken us years to get to that one bank account
01:08:58.100 | or belonging to that one front company in the Seychelles.
01:09:01.560 | And now thanks to the blockchain,
01:09:04.040 | we can track the movement of those funds in real time.
01:09:08.320 | And you know what?
01:09:09.960 | These payments are not as anonymous as people think.
01:09:13.300 | Like we still can use our old hacking ways and zero days
01:09:16.640 | and old school intelligence methods
01:09:19.560 | to find out who owns that private wallet
01:09:21.800 | and how to get to it.
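A toy illustration of why a public ledger helps investigators: every payment is an edge in a public transaction graph, so "following the money" becomes a graph walk rather than years of subpoenas. The wallet labels and amounts below are entirely invented.

```python
from collections import deque

# Hypothetical on-chain transfers: (from_address, to_address, amount).
transfers = [
    ("victim_wallet", "ransom_wallet", 75.0),
    ("ransom_wallet", "mixer_1", 40.0),
    ("ransom_wallet", "exchange_A_deposit", 35.0),
    ("mixer_1", "exchange_B_deposit", 39.5),
]

def trace(start: str):
    """Breadth-first walk over outgoing transfers from a starting address."""
    graph = {}
    for src, dst, amt in transfers:
        graph.setdefault(src, []).append((dst, amt))
    seen, queue, hops = {start}, deque([start]), []
    while queue:
        addr = queue.popleft()
        for dst, amt in graph.get(addr, []):
            hops.append((addr, dst, amt))
            if dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return hops

for src, dst, amt in trace("victim_wallet"):
    print(f"{src} -> {dst}: {amt} BTC")
```

Once the funds land at something like an exchange deposit address, old-fashioned legal process and intelligence work take over from the graph walk.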
01:09:23.480 | So it's a curse in some ways in that it's an enabler,
01:09:27.640 | but it's also a blessing.
01:09:29.440 | And they said that same thing to me that I just said to you.
01:09:32.080 | They said, we've never shut down a promising new technology
01:09:37.080 | because it introduced risk.
01:09:39.160 | We just figured out how to manage that risk.
01:09:42.720 | - And I think that's where the conversation,
01:09:44.400 | unfortunately, has to go,
01:09:45.880 | is how do we, in the metaverse,
01:09:48.800 | use technology to fix things.
01:09:53.600 | So maybe we'll finally be able to, not finally,
01:09:57.000 | but figure out a way to solve the identity problem
01:10:00.960 | on the internet, meaning like a blue check mark
01:10:03.400 | for actual human and connect it to identity
01:10:06.960 | like a fingerprint so you can prove you're you,
01:10:11.360 | and yet do it in a way that doesn't involve
01:10:15.120 | the company having all your data.
01:10:17.580 | So giving you, allowing you to maintain control
01:10:20.760 | over your data, or if you don't,
01:10:23.800 | then there's complete transparency
01:10:25.960 | of how that data is being used, all those kinds of things.
01:10:28.880 | And maybe as you educate more and more people,
01:10:32.020 | they would demand in a capitalist society
01:10:36.080 | that the companies that they give their data to
01:10:38.440 | will respect that data.
01:10:41.040 | - Yeah, I mean, there is this company,
01:10:43.600 | and I hope they succeed, their name's Piiano, P-I-I-A-N-O,
01:10:48.520 | and they wanna create a vault for your personal information
01:10:52.160 | inside every organization.
01:10:54.440 | And ultimately, if I'm gonna call Delta Airlines
01:10:58.000 | to book a flight, they don't need to know
01:11:00.360 | my social security number,
01:11:02.520 | they don't need to know my birth date,
01:11:05.540 | they're just gonna send me a one-time token to my phone.
01:11:08.920 | My phone's gonna say, or my FIDO key is gonna say,
01:11:11.680 | "Yep, it's her."
01:11:13.520 | And then we're gonna talk about my identity like a token,
01:11:16.600 | you know, some random token.
01:11:17.660 | They don't need to know exactly who I am,
01:11:20.120 | they just need the system to trust
01:11:23.240 | that I am who I say I am,
01:11:24.880 | but they don't get access to my PII data,
01:11:27.960 | they don't get access to my social security number,
01:11:30.440 | my location, or the fact I'm a Times journalist.
01:11:34.580 | You know, I think that's the way the world's gonna go.
01:11:37.380 | We have, enough is enough,
01:11:39.960 | sort of losing our personal information everywhere,
01:11:44.160 | letting data marketing companies track our every move.
01:11:48.320 | You know, they don't need to know who I am.
01:11:50.680 | You know, okay, I get it.
01:11:52.280 | You know, we're stuck in this world
01:11:53.680 | where the internet runs on ads,
01:11:57.480 | so ads are not gonna go away,
01:11:59.960 | but they don't need to know I'm Nicole Perlroth.
01:12:03.040 | They can know that I am token number, you know, x567.
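A minimal sketch of that vault-and-token pattern, assuming a hypothetical in-process vault: the service keeps only an opaque token per booking, while the real personal data sits in one tightly controlled store whose reads would, in practice, be authenticated, purpose-limited, and logged.

```python
import secrets

_vault = {}            # token -> personal data; only the vault can read this
_service_records = {}  # booking id -> token; all the airline's system ever stores

def tokenize(pii: dict) -> str:
    token = "tok_" + secrets.token_urlsafe(16)
    _vault[token] = pii
    return token

def book_flight(booking_id: str, pii: dict) -> None:
    # The booking system never stores raw personal data, only the opaque token.
    _service_records[booking_id] = tokenize(pii)

def detokenize(token: str, purpose: str) -> dict:
    # In a real vault this lookup would be access-controlled, scoped to 'purpose', and audited.
    return _vault[token]

book_flight("BK123", {"name": "N. P.", "dob": "1900-01-01", "ssn": "000-00-0000"})
print(_service_records["BK123"])   # the service only holds something like 'tok_...'
```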
01:12:07.040 | - And they can let you know what they know
01:12:11.040 | and give you control over removing the things they know.
01:12:14.080 | - Yeah, right, to be forgotten.
01:12:15.880 | - To me, you should be able to walk away
01:12:17.960 | with a single press of a button.
01:12:20.300 | And I also believe that most people,
01:12:22.080 | given the choice to walk away, won't walk away.
01:12:25.240 | They'll just feel better about having the option
01:12:28.560 | to walk away when they understand the trade-offs.
01:12:30.720 | If you walk away, you're not gonna get
01:12:32.580 | some of the personalized experiences
01:12:34.160 | that you would otherwise get,
01:12:36.000 | like a personalized feed and all those kinds of things.
01:12:38.620 | But the freedom to walk away is,
01:12:40.920 | I think, really powerful.
01:12:44.240 | And obviously, what you're saying,
01:12:45.600 | there's all of these HTML forms
01:12:48.660 | where you have to enter your phone number
01:12:50.220 | and email and private information
01:12:52.840 | from every single airline.
01:12:54.700 | New York Times.
01:12:56.780 | I have so many opinions on this.
01:13:00.640 | Just the friction and the sign-up
01:13:03.440 | and all of those kinds of things.
01:13:04.840 | I should be able to, this has to do with everything.
01:13:07.200 | This has to do with payment, too.
01:13:08.840 | Payment should be trivial.
01:13:11.800 | It should be one click,
01:13:13.920 | and one click to unsubscribe and subscribe,
01:13:16.960 | and one click to provide all of your information
01:13:19.560 | that's necessary for the subscription service,
01:13:21.700 | for the transaction service, whatever that is,
01:13:24.140 | getting a ticket, as opposed to,
01:13:25.940 | I have all of these fake phone numbers and emails
01:13:28.120 | that I use now to sign up.
01:13:29.380 | 'Cause you never know.
01:13:31.940 | If one site is hacked,
01:13:34.320 | then it's just going to propagate to everything else.
01:13:37.580 | - Yeah.
01:13:38.780 | And there's low-hanging fruit,
01:13:41.080 | and I hope Congress does something.
01:13:44.440 | And frankly, I think it's negligent they haven't
01:13:46.620 | on the fact that elderly people are getting spammed to death
01:13:51.620 | on their phones these days with fake car warranty scams.
01:13:56.980 | I mean, my dad was in the hospital last year,
01:13:59.540 | and I was in the hospital room,
01:14:00.820 | and his phone kept buzzing, and I look at it,
01:14:03.660 | and it's just spam attack after spam attack,
01:14:08.260 | people nonstop calling about his freaking car warranty,
01:14:13.200 | why they're trying to get his social security number.
01:14:16.020 | They're trying to get his PII.
01:14:17.700 | They're trying to get this information.
01:14:19.900 | We need to figure out how to put those people
01:14:24.060 | in jail for life. (laughs)
01:14:26.720 | And we need to figure out why in the hell
01:14:30.300 | we are being required or asked to hand over
01:14:34.800 | our social security number, and our home address,
01:14:38.860 | and our passport, all of that information
01:14:41.420 | to every retailer who asks.
01:14:43.360 | I mean, that's insanity.
01:14:46.240 | And there's no question they're not protecting it
01:14:49.760 | because it keeps showing up in spam,
01:14:52.760 | or identity theft, or credit card theft, or worse.
01:14:57.080 | - Well, the spam is getting better,
01:14:58.280 | and maybe I need to, as a side note,
01:15:01.160 | make a public announcement.
01:15:02.560 | Please clip this out, which is if you get an email
01:15:07.360 | or a message from Lex Fridman saying how much I, Lex,
01:15:13.960 | appreciate you and love you and so on,
01:15:16.920 | and please connect with me on my WhatsApp number,
01:15:19.640 | and I will give you Bitcoin or something like that,
01:15:23.440 | please do not click.
01:15:25.020 | And I'm aware that there's a lot of this going on,
01:15:29.040 | a very large amount.
01:15:30.120 | I can't do anything about it.
01:15:32.200 | This is on every single platform.
01:15:33.800 | It's happening more and more and more,
01:15:36.040 | which I've been recently informed
01:15:38.880 | that they're now emailing.
01:15:40.960 | So it's cross-platform.
01:15:42.920 | They're taking people's, they're somehow,
01:15:46.040 | this is fascinating to me,
01:15:47.360 | because they are taking people who comment
01:15:51.360 | on various social platforms,
01:15:53.840 | and they somehow reverse engineer,
01:15:56.040 | they figure out what their email is,
01:15:57.760 | and they send an email to that person
01:16:00.320 | saying, "From Lex Fridman,"
01:16:02.240 | and it's like a heartfelt email with links.
01:16:05.160 | It's fascinating because it's cross-platform now.
01:16:07.440 | It's not just a spam bot that's messaging us
01:16:11.040 | in a comment or in a reply.
01:16:13.480 | They are saying, "Okay, this person cares
01:16:16.320 | "about this other person on social media,
01:16:18.420 | "so I'm going to find another channel,"
01:16:20.440 | which in their mind probably increases,
01:16:22.600 | and it does, the likelihood
01:16:25.080 | that they'll get the people to click, and they do.
01:16:28.960 | I don't know what to do about that.
01:16:30.160 | It makes me really, really sad,
01:16:32.120 | especially with podcasting.
01:16:33.680 | There's an intimacy that people feel connected,
01:16:36.720 | and they get really excited.
01:16:37.760 | "Okay, cool, I wanna talk to Lex."
01:16:41.560 | (sighs)
01:16:42.840 | And they click.
01:16:43.840 | And I get angry at the people that do this.
01:16:50.360 | It's like the John that gets hired, the fake employee.
01:16:57.240 | I don't know what to do about that.
01:16:58.440 | I suppose the solution is education.
01:17:02.160 | It's telling people to be skeptical on stuff they click.
01:17:06.240 | That balance with the technology solution
01:17:09.480 | of creating maybe two-factor authentication
01:17:14.120 | and maybe helping identify things
01:17:17.880 | that are likely to be spam, I don't know.
01:17:20.360 | But then the machine learning there is tricky,
01:17:21.980 | 'cause you don't wanna add a lot of extra friction
01:17:25.280 | that just annoys people, because they'll turn it off,
01:17:28.200 | 'cause you have the accept cookies thing, right,
01:17:30.720 | that everybody has to click on now,
01:17:32.440 | so now they completely ignore the accept cookies.
01:17:34.600 | This is very difficult (laughs)
01:17:38.560 | to find that frictionless security.
01:17:41.160 | You mentioned Snowden.
01:17:43.760 | You've talked about looking through the NSA documents
01:17:48.360 | he leaked and doing the hard work of that.
01:17:52.000 | What do you make of Edward Snowden?
01:17:54.520 | What have you learned from those documents?
01:17:56.720 | What do you think of him?
01:17:57.960 | In the long arc of history,
01:18:02.440 | is Edward Snowden a hero or a villain?
01:18:05.520 | - I think he's neither.
01:18:07.480 | I have really complicated feelings about Edward Snowden.
01:18:11.460 | On the one hand, I'm a journalist at heart,
01:18:15.760 | and more transparency is good.
01:18:19.600 | And I'm grateful for the conversations that we had
01:18:24.040 | in the post-Snowden era about the limits to surveillance
01:18:29.640 | and how critical privacy is.
01:18:33.120 | And when you have no transparency
01:18:35.640 | and you don't really know, in that case,
01:18:38.080 | what our secret courts were doing,
01:18:40.060 | how can you truly believe that our country
01:18:45.680 | is taking our civil liberties seriously?
01:18:47.880 | So on the one hand, I'm grateful
01:18:51.320 | that he cracked open these debates.
01:18:56.280 | On the other hand, when I walked into the storage closet
01:19:01.280 | of classified NSA secrets,
01:19:05.920 | I had just spent two years
01:19:09.480 | covering Chinese cyber espionage almost every day
01:19:14.240 | and the sort of advancement of Russian attacks.
01:19:19.240 | They were just getting worse and worse and more destructive.
01:19:23.200 | And there were no limits to Chinese cyber espionage
01:19:27.440 | and Chinese surveillance of its own citizens.
01:19:30.880 | And there seemed to be no limit
01:19:32.840 | to what Russia was willing to do in terms of cyber attacks
01:19:37.280 | and also, in some cases, assassinating journalists.
01:19:41.280 | So when I walked into that room,
01:19:44.020 | there was a part of me, quite honestly,
01:19:46.760 | that was relieved to know that the NSA
01:19:50.040 | was as good as I hoped they were.
01:19:52.640 | And we weren't using that knowledge
01:19:57.880 | to, as far as I know, assassinate journalists.
01:20:02.220 | We weren't using our access
01:20:06.240 | to take out pharmaceutical companies.
01:20:11.000 | For the most part, we were using it
01:20:12.740 | for traditional espionage.
01:20:15.600 | Now, that set of documents
01:20:18.160 | also set me on the journey of my book
01:20:20.200 | because to me, the American people's reaction
01:20:24.440 | to the Snowden documents was a little bit misplaced.
01:20:28.000 | They were upset
01:20:29.920 | about the phone call metadata collection program.
01:20:34.020 | Angela Merkel, I think, rightfully was upset
01:20:36.360 | that we were hacking her cell phone.
01:20:38.340 | But in sort of the spy eats spy world,
01:20:42.580 | hacking world leader's cell phones
01:20:44.360 | is pretty much what most spy agencies do.
01:20:47.320 | And there wasn't a lot that I saw in those documents
01:20:51.560 | that was beyond what I thought a spy agency does.
01:20:56.560 | And I think if there was another 9/11 tomorrow,
01:21:01.000 | God forbid, we would all say,
01:21:03.520 | how did the NSA miss this?
01:21:05.720 | Why weren't they spying on those terrorists?
01:21:07.900 | Why weren't they spying on those world leaders?
01:21:10.760 | You know, there's some of that too.
01:21:13.160 | But I think that there was great damage
01:21:16.960 | done to the US's reputation.
01:21:21.680 | I think we really lost our halo
01:21:25.480 | in terms of a protector of civil liberties.
01:21:30.440 | And I think a lot of what was reported
01:21:33.680 | was unfortunately reported in a vacuum.
01:21:37.000 | That was my biggest gripe,
01:21:39.080 | that we were always reporting,
01:21:42.320 | the NSA has this program and here's what it does.
01:21:45.220 | And the NSA is in Angela Merkel's cell phone
01:21:48.680 | and the NSA can do this.
01:21:50.760 | And no one was saying, and by the way,
01:21:55.640 | China has been hacking into our pipelines
01:22:00.240 | and they've been making off
01:22:01.480 | with all of our intellectual property.
01:22:04.400 | And Russia has been hacking into our energy infrastructure.
01:22:07.800 | And they've been using the same methods to spy on track
01:22:11.340 | and in many cases, kill their own journalists.
01:22:13.960 | And the Saudis have been doing this
01:22:15.760 | to their own critics and dissidents.
01:22:17.320 | And so you can't talk about
01:22:19.840 | any of these countries in isolation.
01:22:22.720 | It is really like spy, spy out there.
01:22:25.840 | And so I just have complicated feelings.
01:22:29.040 | You know, and the other thing is,
01:22:30.320 | and I'm sorry, this is a little bit of a tangent,
01:22:32.000 | but the amount of documents that we had,
01:22:36.760 | like thousands of documents,
01:22:39.800 | most of which were just crap,
01:22:41.880 | but had people's names on them.
01:22:45.980 | You know, part of me wishes that those documents
01:22:48.900 | had been released in a much more targeted, limited way.
01:22:52.980 | It's just a lot of it just felt like a PowerPoint
01:22:56.340 | that was taken out of context.
01:22:58.580 | And you just sort of wish
01:23:03.020 | that there had been a little bit more thought
01:23:05.880 | into what was released.
01:23:07.700 | Because I think a lot of the impact from Snowden
01:23:10.060 | was just the volume of the reporting.
01:23:13.380 | But I think, you know, based on what I saw personally,
01:23:17.040 | there was a lot of stuff that I just,
01:23:20.380 | I don't know why that particular thing got released.
01:23:24.100 | - As a whistleblower, what's a better way to do it?
01:23:26.780 | 'Cause I mean, there's fear, there's,
01:23:28.660 | it takes a lot of effort to do a more targeted release.
01:23:33.560 | You know, if there's proper channels,
01:23:34.960 | you're afraid that those channels will be manipulated.
01:23:38.220 | Like, who do you trust?
01:23:39.460 | What's a better way to do this, do you think?
01:23:43.540 | As a journalist, this is almost a good journalistic question.
01:23:46.580 | Reveal some fundamental flaw in the system
01:23:49.620 | without destroying the system.
01:23:51.220 | I bring up, you know, again, Mark Zuckerberg and Meta.
01:23:54.860 | There was a whistleblower that came out
01:23:59.100 | about Instagram internal studies.
01:24:02.100 | And I also am torn about how to feel about that whistleblower
01:24:06.980 | because from a company perspective, that's an open culture.
01:24:10.800 | How can you operate successfully if you have an open culture
01:24:14.880 | where any one whistleblower can come out
01:24:17.900 | and take a study out of context,
01:24:19.360 | whether it represents the larger context or not.
01:24:21.800 | And the press eats it up.
01:24:25.400 | And then that creates a narrative that is,
01:24:28.980 | just like with the NSA, you said,
01:24:30.660 | that's out of context, very targeted to where,
01:24:34.560 | well, Facebook is evil, clearly, because of this one leak.
01:24:38.880 | It's really hard to know what to do there
01:24:40.520 | 'cause we're now in a society
01:24:42.040 | that deeply distrusts institutions.
01:24:44.520 | And so narratives by whistleblowers make that whistleblower
01:24:49.120 | and their forthcoming book very popular.
01:24:52.240 | And so there's a huge incentive to take stuff
01:24:54.680 | out of context and to tell stories
01:24:56.800 | that don't represent the full context, the full truth.
01:25:00.340 | It's hard to know what to do with that
01:25:03.100 | 'cause then that forces Facebook and Meta and governments
01:25:06.940 | to be much more conservative, much more secretive.
01:25:09.800 | It's like a race to the bottom.
01:25:13.180 | I don't know.
01:25:14.580 | I don't know if you can comment on any of that,
01:25:16.260 | how to be a whistleblower ethically and properly.
01:25:19.320 | - I don't know.
01:25:21.620 | I mean, these are hard questions.
01:25:23.420 | And even for myself, in some ways,
01:25:27.280 | I think of my book as sort of blowing the whistle
01:25:31.740 | on the underground zero-day market.
01:25:33.980 | But it's not like I was in the market myself.
01:25:38.860 | It's not like I had access to classified data
01:25:41.460 | when I was reporting out that book.
01:25:43.460 | As I say in the book,
01:25:46.380 | "Listen, I'm just trying to scrape the surface here
01:25:49.500 | "so we can have these conversations before it's too late."
01:25:53.040 | And I'm sure there's plenty in there
01:25:57.100 | and someone who's the US intelligence agencies'
01:26:01.800 | preeminent zero-day broker
01:26:03.340 | probably has some voodoo doll of me out there.
01:26:05.940 | And you're never gonna get it 100%.
01:26:10.700 | But I really applaud whistleblowers
01:26:14.900 | like the whistleblower who blew the whistle
01:26:19.060 | on the Trump call with Zelensky.
01:26:22.340 | I mean, people needed to know about that,
01:26:25.340 | that we were basically, in some ways, blackmailing an ally
01:26:30.340 | to try to influence an election.
01:26:33.780 | I mean, they went through the proper channels.
01:26:37.300 | They weren't trying to profit off of it, right?
01:26:39.480 | There was no book that came out afterwards
01:26:42.100 | from that whistleblower.
01:26:44.100 | That whistleblower's not like,
01:26:45.780 | they went through the channels.
01:26:47.900 | They're not living in Moscow.
01:26:49.340 | You know, let's put it that way.
01:26:51.180 | - Can I ask you a question?
01:26:52.020 | You mentioned NSA, one of the things it showed
01:26:54.300 | is they're pretty good at what they do.
01:26:58.180 | Again, this is a touchy subject, I suppose,
01:27:03.460 | but there's a lot of conspiracy theories
01:27:06.260 | about intelligence agencies.
01:27:08.020 | From your understanding of intelligence agencies,
01:27:11.180 | the CIA, NSA, and the equivalent in other countries,
01:27:15.980 | are they, one question, this could be a dangerous question,
01:27:19.980 | are they competent, are they good at what they do?
01:27:23.560 | And two, are they malevolent in any way?
01:27:28.880 | Sort of, I recently had a conversation
01:27:32.620 | about tobacco companies.
01:27:35.140 | They kind of see their customers as dupes.
01:27:39.500 | Like, they can just play games with people.
01:27:42.440 | Conspiracy theories tell that similar story
01:27:46.400 | about intelligence agencies,
01:27:48.500 | that they're interested in manipulating the populace
01:27:51.700 | for whatever ends the powerful in dark rooms,
01:27:56.700 | cigarette smoke, cigar smoke-filled rooms.
01:28:01.280 | What's your sense?
01:28:04.620 | Do these conspiracy theories have kind of any truth to them?
01:28:09.620 | Or are intelligence agencies, for the most part,
01:28:14.420 | good for society?
01:28:15.700 | - Okay, well, that's an easy one.
01:28:17.340 | (laughing)
01:28:18.660 | - Is it? - No.
01:28:20.140 | I think, you know, depends which intelligence agency.
01:28:23.780 | Think about the Mossad.
01:28:25.340 | You know, they're killing every Iranian nuclear scientist
01:28:30.340 | they can over the years, you know?
01:28:34.540 | But have they delayed the time horizon
01:28:38.740 | before Iran gets the bomb?
01:28:40.540 | Yeah.
01:28:41.380 | Have they probably staved off terror attacks
01:28:45.860 | on their own citizens? Yeah.
01:28:47.400 | You know, none of these, intelligence is intelligence.
01:28:53.320 | You know, you can't just say, like,
01:28:54.820 | they're malevolent or they're heroes, you know?
01:28:59.820 | Everyone I have met in this space
01:29:02.620 | is not like the pound-your-chest patriot
01:29:07.420 | that you see on the beach on the 4th of July.
01:29:11.580 | A lot of them have complicated feelings
01:29:15.180 | about their former employers.
01:29:17.700 | Well, people at the NSA, at least, reminded me:
01:29:20.540 | to do what we were accused of doing after Snowden,
01:29:25.540 | to spy on Americans,
01:29:28.820 | you have no idea the amount of red tape
01:29:33.340 | and paperwork and bureaucracy it would have taken
01:29:37.820 | to do what everyone thinks that we were supposedly doing.
01:29:42.500 | But then, you know, we find out
01:29:45.060 | in the course of the Snowden reporting
01:29:46.860 | about a program called LOVEINT
01:29:49.480 | where a couple of the NSA analysts
01:29:51.900 | were using their access to spy on their ex-girlfriends.
01:29:55.380 | So, you know, there's an exception to every case.
01:29:59.280 | Generally, I will probably get, you know,
01:30:05.020 | accused of my Western bias here again,
01:30:07.620 | but I think you can almost barely compare
01:30:12.620 | some of these Western intelligence agencies
01:30:17.300 | to China, for instance.
01:30:20.040 | And the surveillance that they're deploying on the Uyghurs
01:30:25.040 | to the level they're deploying it,
01:30:28.140 | and the surveillance they're starting to export abroad
01:30:32.160 | with some of the programs like the watering hole attack
01:30:34.620 | I mentioned earlier,
01:30:35.540 | where it's not just hitting the Uyghurs inside China,
01:30:39.020 | it's hitting anyone interested
01:30:40.380 | in the Uyghur plight outside China.
01:30:42.100 | I mean, it could be an American high school student
01:30:44.620 | writing a paper on the Uyghurs.
01:30:46.560 | They wanna spy on that person too.
01:30:49.140 | You know, there's no rules in China
01:30:51.780 | really limiting the extent of that surveillance.
01:30:55.820 | And we all better pay attention
01:30:58.620 | to what's happening with the Uyghurs
01:30:59.980 | because just as Ukraine has been to Russia
01:31:04.060 | in terms of a test kitchen for its cyber attacks,
01:31:07.420 | the Uyghurs are China's test kitchen for surveillance.
01:31:12.900 | And there's no doubt in my mind
01:31:15.300 | that they're testing them on the Uyghurs.
01:31:17.660 | Uyghurs are their Petri dish,
01:31:19.140 | and eventually they will export
01:31:21.220 | that level of surveillance overseas.
01:31:23.860 | I mean, in 2015, Obama and Xi Jinping reached a deal
01:31:31.700 | where basically the White House said,
01:31:34.460 | "You better cut it out on intellectual property theft."
01:31:38.620 | And so they made this agreement
01:31:40.140 | that they would not hack each other for commercial benefit.
01:31:43.620 | And for a period of about 18 months,
01:31:45.700 | we saw this huge drop-off in Chinese cyber attacks
01:31:49.060 | on American companies, but some of them continued.
01:31:53.100 | Where did they continue?
01:31:54.300 | They continued on aviation companies,
01:31:58.420 | on hospitality companies like Marriott,
01:32:02.900 | Because that was still considered fair game to China.
01:32:05.700 | It wasn't IP theft they were after.
01:32:07.420 | They wanted to know who was staying in this city
01:32:11.820 | at this time when Chinese citizens were staying there
01:32:15.020 | so they could cross match for counterintelligence
01:32:17.420 | who might be a likely Chinese spy.
01:32:20.220 | I'm sure we're doing some of that too.
01:32:22.740 | Counterintelligence is counterintelligence.
01:32:24.780 | It's considered fair game.
01:32:26.300 | But where I think it gets evil
01:32:30.420 | is when you use it for censorship,
01:32:34.220 | to suppress any dissent,
01:32:36.100 | to do what I've seen the UAE do to its citizens
01:32:41.780 | where people who've gone on Twitter
01:32:44.180 | just to advocate for better voting rights,
01:32:47.580 | more enfranchisement,
01:32:49.340 | suddenly find their passports confiscated.
01:32:52.480 | I talked to one critic, Ahmed Mansoor,
01:32:57.340 | and he told me, "You might find yourself a terrorist,
01:33:00.980 | "labeled a terrorist one day,
01:33:02.320 | "you don't even know how to operate a gun."
01:33:04.700 | I mean, he'd been beaten up
01:33:06.500 | every time he tried to go somewhere.
01:33:08.020 | His passport had been confiscated.
01:33:09.660 | By that point, it turned out
01:33:10.600 | they'd already hacked into his phone
01:33:12.100 | so they were listening to us talking.
01:33:14.160 | They'd hacked into his baby monitor
01:33:16.300 | so they're spying on his child.
01:33:18.540 | And they stole his car.
01:33:21.340 | And then they created a new law
01:33:24.680 | that you couldn't criticize the ruling family
01:33:27.780 | or the ruling party on Twitter.
01:33:29.620 | And he's been in solitary confinement every day
01:33:32.300 | since on hunger strike.
01:33:34.180 | So that's evil.
01:33:36.260 | That's evil.
01:33:37.760 | And we don't do that here.
01:33:40.780 | We have rules here.
01:33:42.060 | We don't cross that line.
01:33:44.500 | So yeah, in some cases, I won't go to Dubai.
01:33:47.860 | I won't go to Abu Dhabi.
01:33:50.000 | If I ever wanna go to the Maldives, too bad.
01:33:52.660 | Most of the flights go through Dubai.
01:33:54.920 | - So there's some lines we're not willing to cross.
01:33:57.200 | But then again, just like you said,
01:33:59.000 | there's individuals within NSA, within CIA,
01:34:02.720 | and they may have power.
01:34:05.800 | And to me, there's levels of evil.
01:34:07.880 | To me personally, this is the stuff of conspiracy theories,
01:34:11.360 | is the things you've mentioned as evil
01:34:13.920 | are more direct attacks.
01:34:16.200 | But there's also psychological warfare.
01:34:19.200 | So blackmail.
01:34:20.840 | So what does spying allow you to do?
01:34:25.600 | Allow you to collect information
01:34:27.860 | if you have something that's embarrassing.
01:34:30.180 | Or if you have like Jeffrey Epstein conspiracy theories,
01:34:33.620 | active, what is it, manufacture of embarrassing things,
01:34:38.500 | and then use blackmail to manipulate the population
01:34:41.120 | or all the powerful people involved.
01:34:42.900 | It troubles me deeply that MIT allowed somebody
01:34:45.860 | like Jeffrey Epstein in their midst,
01:34:48.620 | especially some of the scientists I admire
01:34:51.660 | that they would hang out with that person at all.
01:34:54.300 | And so I'll talk about it sometimes.
01:34:57.880 | And then a lot of people tell me,
01:35:00.220 | "Well, obviously,
01:35:01.340 | "Jeffrey Epstein is a front for intelligence."
01:35:04.380 | And I just, I struggle to see that level of competence
01:35:09.380 | and malevolence.
01:35:10.760 | But who the hell am I?
01:35:17.260 | And I guess I was trying to get to that point.
01:35:21.180 | You said that there's bureaucracy and so on,
01:35:23.020 | which makes some of these things very difficult.
01:35:25.740 | I wonder how much malevolence,
01:35:27.420 | how much competence there is in these institutions.
01:35:31.660 | Like how far, this takes us back to the hacking question.
01:35:34.860 | How far are people willing to go if they have the power?
01:35:39.860 | This has to do with social engineering.
01:35:41.700 | This has to do with hacking.
01:35:42.780 | This has to do with manipulating people,
01:35:45.380 | attacking people, doing evil onto people,
01:35:47.340 | psychological warfare and stuff like that.
01:35:50.260 | I don't know.
01:35:51.460 | I believe that most people are good.
01:35:53.980 | And I don't think that kind of manipulation is possible in a free society.
01:35:59.380 | There's something that happens
01:36:00.420 | when you have a centralized government
01:36:02.540 | where power corrupts over time
01:36:05.580 | and you start surveillance programs.
01:36:08.940 | It's like a slippery slope that over time
01:36:12.940 | starts to both use fear and direct manipulation
01:36:17.940 | to control the populace.
01:36:20.020 | But in a free society, I just,
01:36:22.200 | it's difficult for me to imagine
01:36:25.100 | that you can have somebody like a Jeffrey Epstein
01:36:27.660 | in front for intelligence.
01:36:29.340 | I don't know what I'm asking you, but I'm just,
01:36:31.700 | I have a hope that for the most part,
01:36:36.900 | intelligence agencies are trying to do good
01:36:39.860 | and are actually doing good for the world.
01:36:42.960 | When you view it in the full context
01:36:45.680 | of the complexities of the world.
01:36:47.680 | But then again, if they're not, would we know?
01:36:55.160 | That's why Edward Snowden might be a good thing.
01:36:58.360 | Let me ask you on a personal question.
01:37:00.480 | You have investigated some of the most powerful organizations
01:37:03.240 | and people in the world of cyber warfare, cybersecurity.
01:37:07.600 | Are you ever afraid for your own life,
01:37:09.600 | your own well-being, digital or physical?
01:37:13.280 | - I mean, I've had my moments.
01:37:14.920 | I've had our security team at the Times
01:37:20.120 | called me at one point and said,
01:37:21.920 | "Someone's on the dark web offering good money
01:37:25.760 | "to anyone who can hack your phone or your laptop."
01:37:28.980 | I describe in my book how when I was
01:37:32.660 | at that hacking conference in Argentina,
01:37:34.480 | I came back and I brought a burner laptop with me,
01:37:38.680 | but I'd kept it in the safe anyway
01:37:40.760 | and it didn't have anything on it,
01:37:42.480 | but someone had broken in and it was moved.
01:37:45.320 | I've had all sorts of sort of scary moments.
01:37:51.720 | And then I've had moments where I think I went
01:37:55.520 | just way too far into the paranoid side.
01:37:58.920 | I mean, I remember writing about the Times hack by China
01:38:03.920 | and I just covered a number of Chinese cyber attacks
01:38:07.460 | where they'd gotten into the thermostat
01:38:10.080 | at someone's corporate apartment
01:38:11.600 | and they'd gotten into all sorts of stuff.
01:38:15.800 | And I was living by myself.
01:38:17.760 | I was single in San Francisco
01:38:19.960 | and my cable box on my television started making
01:38:24.680 | some weird noises in the middle of the night.
01:38:26.960 | And I got up and I ripped it out of the wall
01:38:29.720 | and I think I said something like embarrassing,
01:38:32.080 | like, "Fuck you, China."
01:38:33.560 | (both laughing)
01:38:35.720 | And then I went back to bed and I woke up
01:38:39.360 | and this beautiful morning light,
01:38:41.680 | I mean, I'll never forget it.
01:38:42.760 | This is like glimmering morning light
01:38:44.640 | is shining on my cable box, which has now been ripped out
01:38:48.120 | and is sitting on my floor and the morning light.
01:38:50.680 | And I was just like, "No, no, no.
01:38:54.160 | "I'm not going down that road."
01:38:57.240 | Basically, I came to a fork in the road
01:39:03.440 | where I could either go full tinfoil hat,
01:39:06.160 | go live off the grid, never have a car with navigation,
01:39:10.080 | never use Google Maps, never own an iPhone,
01:39:12.520 | never order diapers off Amazon, create an alias,
01:39:17.520 | or I could just do the best I can
01:39:22.280 | and live in this new digital world we're living in.
01:39:26.100 | And what does that look like for me?
01:39:28.120 | I mean, what are my crown jewels?
01:39:30.800 | This is what I tell people.
01:39:31.680 | What are your crown jewels?
01:39:32.800 | 'Cause just focus on that.
01:39:34.200 | You can't protect everything,
01:39:35.680 | but you can protect your crown jewels.
01:39:37.520 | For me, for the longest time,
01:39:39.040 | my crown jewels were my sources.
01:39:42.360 | I was nothing without my sources.
01:39:44.560 | So I had some sources
01:39:46.400 | I would meet at the same dim sum place,
01:39:49.440 | or maybe it was a different restaurant,
01:39:51.680 | on the same date every quarter.
01:39:55.460 | And we would never drive there.
01:39:59.080 | We would never Uber there.
01:40:00.400 | We wouldn't bring any devices.
01:40:02.080 | I could bring a pencil and a notepad.
01:40:05.040 | And if someone wasn't in town,
01:40:07.440 | there were a couple times where I'd show up
01:40:09.160 | and the source never came,
01:40:11.160 | but we never communicated digitally.
01:40:14.040 | And those were the lengths I was willing to go
01:40:16.480 | to protect that source, but you can't do it for everyone.
01:40:19.560 | So for everyone else, it was signal,
01:40:22.320 | using two-factor authentication,
01:40:24.760 | keeping my devices up to date,
01:40:26.980 | not clicking on phishing emails,
01:40:28.760 | using a password manager,
01:40:30.680 | all the things that we know we're supposed to do.
01:40:34.520 | And that's what I tell everyone.
01:40:36.120 | Don't go crazy, because then that's the ultimate hack.
01:40:39.320 | Then they've hacked your mind,
01:40:41.160 | whoever they is for you.
01:40:43.640 | But just do the best you can.
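The two-factor authentication mentioned above usually means an authenticator-app code, i.e. TOTP (RFC 6238): the service and your app share a secret at enrollment, and both derive a short-lived six-digit code from that secret plus the current time. A minimal illustrative sketch in Python of how such a code is computed; the base32 secret below is a made-up placeholder, since real secrets come from the service's enrollment QR code:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Time-based one-time password per RFC 6238 (built on RFC 4226 HOTP)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period            # 30-second time step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Hypothetical enrollment secret, for illustration only.
print(totp("JBSWY3DPEHPK3PXP"))
```

Because the code rotates every 30 seconds and is derived from a secret that never travels with the login itself, a phished password alone isn't enough to get in, which is why it sits alongside Signal, software updates, and a password manager in the basic-hygiene list above.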
01:40:45.200 | Now, my whole risk model changed when I had a kid.
01:40:49.720 | Now it's, oh God,
01:40:54.760 | if anyone threatened my family,
01:40:58.640 | God help them.
01:40:59.820 | (both laughing)
01:41:02.760 | But it changes you.
01:41:07.560 | And unfortunately, there are some things
01:41:12.560 | I was really scared to go deep on,
01:41:15.040 | like Russian cybercrime, like Putin himself.
01:41:20.040 | And it's interesting, I have a mentor
01:41:22.560 | who's an incredible person
01:41:24.600 | who was the Times Moscow Bureau Chief during the Cold War.
01:41:29.560 | And after I wrote a series of stories
01:41:32.040 | about Chinese cyberespionage, he took me out to lunch.
01:41:35.040 | And he told me that when he was living in Moscow,
01:41:37.960 | he would drop his kids off at preschool
01:41:40.100 | when they were my son's age now,
01:41:42.660 | and the KGB would follow him.
01:41:44.800 | And they would make a really loud show of it.
01:41:47.720 | They'd tail him, they'd honk,
01:41:51.360 | they'd just make a ruckus.
01:41:55.240 | And he said, "You know what?
01:41:56.080 | "They never actually did anything,
01:41:57.620 | "but they wanted me to know that they were following me.
01:42:00.720 | "And I operated accordingly."
01:42:03.000 | And he says, "That's how you should operate
01:42:05.760 | "in the digital world.
01:42:08.140 | "Know that there are probably people following you.
01:42:11.580 | "Sometimes they'll make a little bit of noise.
01:42:14.320 | "But one thing you need to know is that
01:42:17.460 | "while you're at the New York Times,
01:42:18.540 | "you have a little bit of an invisible shield on you.
01:42:21.300 | "You know, if something were to happen to you,
01:42:23.580 | "that would be a really big deal.
01:42:25.000 | "That would be an international incident."
01:42:27.000 | So I kind of carried that invisible shield
01:42:29.380 | with me for years.
01:42:31.540 | And then Jamal Khashoggi happened.
01:42:34.360 | And that destroyed my vision of my invisible shield.
01:42:39.340 | You know, sure, he was a Saudi,
01:42:42.820 | but he was a Washington Post columnist.
01:42:45.620 | You know, for the most part,
01:42:46.940 | he was living in the United States.
01:42:48.140 | He was a journalist.
01:42:50.020 | And for them to do what they did to him
01:42:53.620 | pretty much in the open and get away with it,
01:42:58.200 | and for the United States to let them get away with it
01:43:01.880 | because we wanted to preserve diplomatic relations
01:43:04.680 | with the Saudis,
01:43:06.080 | that really threw my worldview upside down.
01:43:10.520 | And, you know, I think that sent a message
01:43:13.760 | to a lot of countries
01:43:15.600 | that it was sort of open season on journalists.
01:43:19.520 | And to me, that was one of the most destructive things
01:43:23.000 | that happened under the previous administration.
01:43:27.400 | And, you know, I don't really know
01:43:30.280 | what to think of my invisible shield anymore.
01:43:32.320 | - Take a second.
01:43:33.160 | That really worries me on the journalism side
01:43:34.920 | that people would be afraid to dig deep
01:43:37.400 | on fascinating topics.
01:43:39.080 | And, you know, I have my own reasons.
01:43:48.280 | I would love to have kids,
01:43:50.160 | I would love to have a family.
01:43:52.720 | Part of the reason I'm a little bit afraid,
01:43:56.680 | there's many ways to phrase this,
01:43:57.940 | but it's the loss of freedom to keep
01:44:01.120 | doing all the crazy shit that I naturally do,
01:44:05.000 | which I would say is kind of the ethic of journalism,
01:44:07.880 | doing crazy shit without really thinking about it,
01:44:11.720 | letting your curiosity
01:44:13.440 | really allow you to be free and explore.
01:44:18.480 | It's, I mean, whether it's stupidity or fearlessness,
01:44:22.040 | whatever it is, that's what great journalism is.
01:44:25.280 | And all the concerns about security risks
01:44:30.240 | have made me become a better person.
01:44:32.800 | The way I approach it is
01:44:35.000 | just make sure you don't have anything to hide.
01:44:37.320 | I know this is not a thing,
01:44:39.040 | this is not an approach to security.
01:44:41.920 | I'm just, this is like a motivational speech or something.
01:44:44.920 | It's just like, you can lose it all,
01:44:47.360 | you can be hacked at any moment.
01:44:49.280 | Just don't be a douchebag secretly.
01:44:52.140 | Just be like a good person.
01:44:54.420 | 'Cause then, I see this actually
01:44:56.660 | with social media in general.
01:44:58.660 | Just present yourself in the most authentic way possible,
01:45:03.820 | meaning be the same person online as you are privately,
01:45:06.860 | have nothing to hide.
01:45:08.060 | That's one, not the only,
01:45:09.960 | but one of the ways to achieve security.
01:45:13.540 | Maybe I'm totally wrong on this,
01:45:15.780 | but don't be secretly weird.
01:45:19.500 | If you're weird, be publicly weird
01:45:21.860 | so it's impossible to blackmail you.
01:45:24.300 | That's my approach to security.
01:45:25.420 | - Yeah, well, they call it
01:45:26.580 | the New York Times front page phenomenon.
01:45:29.900 | Don't put anything in email,
01:45:31.660 | or I guess social media these days,
01:45:33.300 | that you wouldn't want to read
01:45:35.780 | on the front page of the New York Times.
01:45:37.900 | And that works, but sometimes I even get carried,
01:45:41.620 | I mean, I have not as many followers as you,
01:45:45.540 | but a lot of followers,
01:45:47.060 | and sometimes even I get carried away.
01:45:49.100 | - You have to be emotional and stuff and say something.
01:45:51.300 | - Yeah, I mean, just the cortisol response on Twitter.
01:45:56.300 | Twitter is basically designed to elicit those responses.
01:46:01.580 | I mean, every day I turn on my computer,
01:46:04.740 | I look at my phone, I look at what's trending on Twitter,
01:46:07.740 | and it's like, what are the topics
01:46:10.060 | that are gonna make people the most angry today?
01:46:13.700 | (both laughing)
01:46:16.140 | And it's easy to get carried away,
01:46:19.180 | but it's also just, that sucks too,
01:46:22.340 | that you have to be constantly censoring yourself.
01:46:25.300 | And maybe it's for the better,
01:46:26.620 | maybe you can't be a secret asshole,
01:46:29.500 | and we can put that in the good bucket.
01:46:31.380 | But at the same time, there is a danger
01:46:35.820 | to that other voice, to creativity, to being weird.
01:46:40.820 | There's a danger to that little whispered voice
01:46:45.580 | that's like, well, how would people read that?
01:46:49.100 | How could that be manipulated?
01:46:51.180 | How could that be used against you?
01:46:53.540 | And that stifles creativity and innovation and free thought.
01:46:58.540 | And that is on a very micro level.
01:47:05.300 | And that's something I think about a lot.
01:47:08.940 | And that's actually something
01:47:10.300 | that Tim Cook has talked about a lot,
01:47:13.220 | and why he has said he goes full force on privacy
01:47:17.660 | is it's just that little voice
01:47:21.020 | that is at some level censoring you.
01:47:24.860 | And what is sort of the long-term impact
01:47:28.300 | of that little voice over time?
01:47:31.020 | - I think there's a way.
01:47:32.300 | I think that self-censorship is an attack vector
01:47:36.460 | that there are solutions to.
01:47:37.740 | The way I'm really inspired by Elon Musk,
01:47:40.220 | the solution to that is just be privately
01:47:43.740 | and publicly the same person and be ridiculous.
01:47:46.780 | Embrace the full weirdness and show it more and more.
01:47:49.780 | So there are memes that have ridiculous humor.
01:47:54.020 | And if there is something you really wanna hide,
01:47:59.300 | deeply consider if you wanna be that.
01:48:02.500 | Like, why are you hiding it?
01:48:05.300 | What exactly are you afraid of?
01:48:07.420 | Because I think my hopeful vision for the internet
01:48:10.700 | is the internet loves authenticity.
01:48:13.340 | They wanna see you weird.
01:48:15.180 | So be that and live that fully.
01:48:18.300 | Because I think that gray area
01:48:20.260 | where you're kind of censoring yourself,
01:48:22.260 | that's where the destruction is.
01:48:25.260 | You have to go all the way, step over, be weird.
01:48:28.780 | And then it feels, it can be painful
01:48:31.060 | 'cause people can attack you and so on, but just ride it.
01:48:33.740 | I mean, that's just a skill
01:48:35.980 | on the social psychological level
01:48:38.060 | that ends up being an approach to security,
01:48:42.100 | which is like remove the attack vector
01:48:45.060 | of having private information
01:48:46.980 | by being your full weird self publicly.
01:48:50.080 | What advice would you give to young folks today
01:48:56.420 | operating in this complicated space
01:49:00.660 | about how to have a successful life,
01:49:02.500 | a life they can be proud of,
01:49:03.980 | a career they can be proud of?
01:49:05.580 | Maybe somebody in high school and college
01:49:09.460 | thinking about what they're going to do.
01:49:11.580 | - Be a hacker.
01:49:12.580 | If you have any interest, become a hacker
01:49:16.740 | and apply yourself to defense.
01:49:19.020 | We do have
01:49:21.780 | these amazing scholarship programs, for instance,
01:49:24.580 | where they find you early
01:49:26.580 | and they'll pay for your college as long as you make
01:49:30.020 | some kind of federal commitment
01:49:31.940 | to help federal agencies with cybersecurity.
01:49:35.300 | And where does everyone wanna go every year
01:49:37.860 | from the scholarship program?
01:49:39.060 | They wanna go work at the NSA or Cyber Command.
01:49:42.220 | They wanna go work on offense.
01:49:44.180 | They wanna go do the sexy stuff.
01:49:46.180 | It's really hard to get people to work on defense.
01:49:49.900 | It's just, it's always been more fun to be a pirate
01:49:52.580 | than be in the Coast Guard.
01:49:54.540 | And so we have a huge deficit
01:49:59.100 | when it comes to filling those roles.
01:50:01.300 | There's 3.5 million unfilled cybersecurity positions
01:50:06.300 | around the world.
01:50:07.940 | I mean, talk about job security.
01:50:09.660 | Like be a hacker and work on cybersecurity.
01:50:12.700 | You will always have a job.
01:50:14.260 | And we're actually at a huge deficit
01:50:18.540 | and disadvantage as a free market economy
01:50:21.260 | because we can't match cybersecurity salaries
01:50:26.740 | at Palantir or Facebook or Google or Microsoft.
01:50:30.020 | And so it's really hard for the United States
01:50:32.460 | to fill those roles.
01:50:33.820 | And other countries have had this workaround
01:50:38.420 | where they basically have forced conscription on some level.
01:50:41.820 | China tells people like,
01:50:44.380 | you do whatever you're gonna do during the day,
01:50:46.860 | work at Alibaba.
01:50:48.860 | If you need to do some ransomware, okay.
01:50:51.340 | But the minute we tap you on the shoulder
01:50:53.980 | and ask you to come do this sensitive operation for us,
01:50:57.180 | the answer is yes.
01:50:59.380 | Same with Russia.
01:51:00.820 | You know, a couple of years ago when Yahoo was hacked
01:51:03.660 | and they laid it all out in an indictment,
01:51:05.700 | it came down to two cyber criminals
01:51:07.660 | and two guys from the FSB.
01:51:09.580 | Cyber criminals were allowed to have their fun,
01:51:12.180 | but the minute they came across the username and password
01:51:14.860 | for someone's personal Yahoo account
01:51:16.740 | that worked at the White House or the State Department
01:51:19.460 | or military, they were expected to pass that over to the FSB.
01:51:23.660 | So we don't do that here.
01:51:24.980 | And it's even worse on defense.
01:51:27.420 | You really can't fill these positions.
01:51:29.900 | So, you know, if you are a hacker,
01:51:33.020 | if you're interested in code, if you're a tinker,
01:51:36.620 | you know, learn how to hack.
01:51:38.460 | There are all sorts of amazing hacking competitions
01:51:42.700 | you can do through the SANS org, for example, S-A-N-S.
01:51:47.500 | And then use those skills for good.
01:51:50.900 | You know, neuter the bugs in that code
01:51:53.500 | that get used by autocratic regimes
01:51:56.300 | to make people's lives, you know, a living prison.
01:51:59.300 | You know, plug those holes, you know,
01:52:01.940 | defend industrial systems,
01:52:03.700 | defend our water treatment facilities
01:52:06.020 | from hacks where people are trying to come in
01:52:07.820 | and poison the water.
01:52:09.500 | You know, that I think is just an amazing,
01:52:12.660 | it's an amazing job on so many levels.
01:52:16.260 | It's intellectually stimulating.
01:52:19.700 | You can tell yourself you're serving your country.
01:52:22.900 | You can tell yourself you're saving lives
01:52:24.700 | and keeping people safe.
01:52:26.140 | And you'll always have amazing job security.
01:52:28.460 | And if you need to go get that job that pays you,
01:52:30.980 | you know, two million bucks a year, you can do that too.
01:52:33.540 | - And you can have a public profile,
01:52:34.860 | more so of a public profile, you can be a public rockstar.
01:52:38.780 | I mean, it's the same thing as sort of the military.
01:52:42.780 | There's a lot of,
01:52:44.020 | there's a lot of well-known sort of people
01:52:49.980 | commenting on the fact that veterans
01:52:51.700 | are not treated as well as they should be.
01:52:54.060 | But it's still the fact that soldiers
01:52:56.140 | are deeply respected for defending the country,
01:53:00.380 | the freedoms, the ideals that we stand for.
01:53:02.820 | And in the same way, I mean, in some ways,
01:53:05.820 | the cybersecurity defense are the soldiers of the future.
01:53:09.020 | - Yeah, and you know what's interesting?
01:53:10.740 | I mean, in cybersecurity, the difference is
01:53:14.140 | oftentimes you see the more interesting threats
01:53:17.180 | in the private sector
01:53:18.660 | because that's where the attacks come.
01:53:20.740 | When cyber criminals and nation state adversaries
01:53:24.340 | come for the United States,
01:53:25.580 | they don't go directly for Cyber Command or the NSA.
01:53:29.060 | No, they go for banks, they go for Google,
01:53:32.940 | they go for Microsoft, they go for critical infrastructure.
01:53:36.660 | And so those companies, those private sector companies
01:53:39.580 | get to see some of the most advanced,
01:53:41.820 | sophisticated attacks out there.
01:53:45.700 | And if you're working at FireEye
01:53:48.620 | and you're calling out the SolarWinds attack, for instance,
01:53:51.580 | I mean, you just saved God knows how many systems
01:53:56.140 | from that compromise turning into something
01:53:59.980 | that more closely resembles sabotage.
01:54:02.560 | So go be a hacker or go be a journalist.
01:54:08.700 | (both laughing)
01:54:10.980 | - So you wrote the book,
01:54:13.100 | "This is How They Tell Me the World Ends,"
01:54:15.900 | as we've been talking about,
01:54:17.540 | of course, referring to cyber war, cybersecurity.
01:54:20.260 | What gives you hope about the future of our world?
01:54:25.380 | If it doesn't end, how will it not end?
01:54:28.040 | - That's a good question.
01:54:32.820 | I mean, I have to have hope, right?
01:54:34.600 | Because I have a kid and I have another on the way.
01:54:37.420 | And if I didn't have hope, I wouldn't be having kids.
01:54:41.040 | But it's a scary time to be having kids.
01:54:46.580 | And it's like pandemic, climate change,
01:54:50.620 | disinformation, increasingly advanced,
01:54:54.820 | perhaps deadly cyber attacks.
01:54:56.860 | What gives me hope is that I share your worldview
01:55:01.500 | that I think people are fundamentally good.
01:55:04.060 | And sometimes, and this is why the metaverse
01:55:07.620 | scares me to death, but where I'm reminded of that
01:55:10.820 | is not online.
01:55:12.200 | Like online, I get the opposite.
01:55:15.500 | You start to lose hope in humanity
01:55:17.420 | when you're on Twitter half your day.
01:55:19.360 | It's like when I go to the grocery store
01:55:22.620 | or I go on a hike or like someone smiles at me,
01:55:26.440 | or someone just says something nice,
01:55:30.140 | people are fundamentally good.
01:55:33.300 | We just don't hear from those people enough.
01:55:37.140 | And my hope is, I just think our current political climate,
01:55:42.140 | we've hit rock bottom.
01:55:44.900 | This is as bad as it gets.
01:55:46.680 | We can't do anything.
01:55:47.860 | - Don't jinx it.
01:55:48.860 | (laughs)
01:55:49.780 | - But I think it's a generational thing.
01:55:52.460 | I think baby boomers, it's time to move along.
01:55:56.060 | I think it's time for a new generation to come in.
01:56:01.140 | And I actually have a lot of hope when I look at,
01:56:06.040 | I'm sort of like this, I guess they call me
01:56:08.820 | a geriatric millennial or a young Gen X.
01:56:12.120 | But we have this unique responsibility
01:56:14.500 | because I grew up without the internet
01:56:17.620 | and without social media, but I'm native to it.
01:56:21.020 | So I know the good and I know the bad.
01:56:25.300 | And that's true on so many different things.
01:56:28.660 | I grew up without climate change anxiety
01:56:32.100 | and now I'm feeling it and I know it's not a given.
01:56:34.900 | We don't have to just resign ourselves to climate change.
01:56:38.980 | Same with disinformation.
01:56:41.240 | And I think a lot of the problems we face today
01:56:44.140 | have just exposed the sort of inertia
01:56:47.660 | that there's been on so many of these issues.
01:56:49.980 | And I really think it's a generational shift
01:56:52.940 | that has to happen.
01:56:54.540 | And I think this next generation is gonna come in
01:56:57.660 | and say like, we're not doing business
01:56:59.540 | like you guys did it anymore.
01:57:01.380 | We're not just gonna like rape and pillage the earth
01:57:03.860 | and try and turn everyone against each other
01:57:06.540 | and play dirty tricks and let lobbyists dictate
01:57:09.620 | what we do or don't do as a country anymore.
01:57:14.100 | And that's really where I see the hope.
01:57:16.540 | - It feels like there's a lot of low hanging fruit
01:57:19.220 | for young minds to step up and create solutions and lead.
01:57:23.780 | So whenever like politicians or leaders that are older,
01:57:28.780 | like you said, are acting shitty,
01:57:32.980 | I see that as a positive.
01:57:34.180 | They're inspiring a large number of young people
01:57:38.140 | to replace them.
01:57:39.700 | And so I think you're right.
01:57:41.900 | It's almost like you need people
01:57:44.260 | to act shitty to remind us, oh, wow, we need good leaders.
01:57:47.860 | We need great creators and builders and entrepreneurs
01:57:51.340 | and scientists and engineers and journalists.
01:57:54.140 | You know, all the discussions about how the journalism
01:57:56.540 | is quote unquote broken and so on.
01:57:58.660 | That's just an inspiration for new institutions to rise up
01:58:02.100 | that do journalism better,
01:58:03.760 | new journalists to step up and do journalism better.
01:58:06.300 | So I, and I've been constantly,
01:58:08.660 | when I talk to young people, I'm constantly impressed
01:58:11.780 | by the ones that dream to build solutions.
01:58:16.420 | And so that's ultimately why I put the hope.
01:58:21.300 | But the world is a messy place,
01:58:23.140 | like we've been talking about.
01:58:24.940 | It's a scary place.
01:58:26.180 | - Yeah, and I think you hit something,
01:58:29.700 | hit on something earlier, which is authenticity.
01:58:33.100 | Like no one who is plastic is going to rise to the top anymore.
01:58:40.060 | You know, people are craving authenticity.
01:58:43.100 | You know, the benefit of the internet is it's really hard
01:58:46.500 | to hide who you are on every single platform.
01:58:49.940 | You know, on some level it's going to come out
01:58:51.820 | who you really are.
01:58:53.460 | And so you hope that, you know,
01:58:57.380 | by the time my kids are grown, like no one's going to care
01:59:01.380 | if they made one mistake online,
01:59:04.060 | so long as they're authentic, you know?
01:59:06.980 | And I used to worry about this.
01:59:09.580 | My nephew was born the day I graduated from college.
01:59:13.260 | And I just always, you know, he's like born into Facebook.
01:59:17.740 | And I just think like, how is a kid like that
01:59:20.980 | ever going to be president of the United States of America?
01:59:24.020 | Because if Facebook had been around when I was in college,
01:59:27.740 | you know, like Jesus, you know,
01:59:30.980 | how are those kids going to ever be president?
01:59:34.220 | There's going to be some photo of them at some point
01:59:37.460 | making some mistake,
01:59:39.340 | and then it's going to be all over for them.
01:59:41.780 | And now I take that back.
01:59:43.100 | Now it's like, no, everyone's going to make mistakes.
01:59:46.700 | There's going to be a picture for everyone.
01:59:49.380 | And we're all going to have to grow up and come around
01:59:53.020 | to the view that as humans, we're going to make huge mistakes
01:59:56.500 | and hopefully they're not so big
01:59:58.140 | that they're going to ruin the rest of your life.
02:00:00.660 | But we're going to have to come around to this view
02:00:02.980 | that we're all human,
02:00:04.500 | and we're going to have to be a little bit more forgiving
02:00:07.300 | and a little bit more tolerant when people mess up.
02:00:10.300 | And we're going to have to be a little bit more humble
02:00:12.100 | when we do and like keep moving forward.
02:00:15.620 | Otherwise you can't like cancel everyone.
02:00:17.620 | - Nicole, this was an incredible, hopeful conversation.
02:00:21.700 | Also one that reveals that in the shadows,
02:00:26.700 | there's a lot of challenges to be solved.
02:00:30.220 | So I really appreciate that you took on
02:00:32.340 | this really difficult subject with your book.
02:00:34.300 | That's journalism at its best.
02:00:35.860 | So I'm really grateful that you did it,
02:00:37.860 | that you took the risk, that you took that on.
02:00:40.180 | And that you plugged the cable box back in.
02:00:42.580 | That means you have hope.
02:00:43.820 | And thank you so much for spending
02:00:46.540 | your valuable time with me today.
02:00:47.900 | - Thank you.
02:00:48.740 | Thanks for having me.
02:00:49.860 | - Thanks for listening to this conversation
02:00:52.180 | with Nicole Perlroth.
02:00:53.660 | To support this podcast,
02:00:54.900 | please check out our sponsors in the description.
02:00:57.660 | And now let me leave you with some words
02:01:00.020 | from Nicole herself.
02:01:01.780 | "Here we are, entrusting our entire digital lives
02:01:05.540 | "passwords, texts, love letters, banking records,
02:01:08.940 | "health records, credit cards, sources,
02:01:10.740 | "and deepest thoughts to this mystery box
02:01:13.900 | "whose inner circuitry most of us would never vet.
02:01:17.660 | "Run by code, written in a language
02:01:19.580 | "most of us will never fully understand."
02:01:22.460 | Thank you for listening and hope to see you next time.
02:01:26.420 | (upbeat music)
02:01:29.020 | (upbeat music)