Nicole Perlroth: Cybersecurity and the Weapons of Cyberwar | Lex Fridman Podcast #266
Chapters
0:00 Introduction
0:55 Zero-day vulnerability
6:56 History of hackers
21:48 Interviewing hackers
25:50 Ransomware attack
38:34 Cyberwar
51:42 Cybersecurity
60:49 Social engineering
77:42 Snowden and whistleblowers
87:12 NSA
96:59 Fear of cyberattacks
104:30 Self-censorship
108:51 Advice for young people
114:08 Hope for the future
00:00:00.000 |
If one site is hacked, you can just unleash all hell. 00:00:18.000 |
record their camera without them knowing about it. 00:00:20.760 |
Basically, you can put an invisible ankle bracelet 00:00:26.000 |
You could sell that to a zero-day broker for $2 million. 00:00:31.000 |
- The following is a conversation with Nicole Perlroth, 00:00:40.080 |
of "This Is How They Tell Me The World Ends," 00:00:50.360 |
And now, dear friends, here's Nicole Perlroth. 00:00:55.220 |
You've interviewed hundreds of cybersecurity hackers, 00:01:01.040 |
government officials, forensic investigators, 00:01:05.700 |
So let's talk about cybersecurity and cyber war. 00:01:16.800 |
- So at the most basic level, let's say I'm a hacker 00:01:27.560 |
that no one else knows about, especially Apple. 00:01:31.260 |
That's called a zero-day because the minute it's discovered, 00:01:40.920 |
I could potentially write a program to exploit it. 00:01:44.940 |
And that program would be called a zero-day exploit. 00:01:48.860 |
And for iOS, the dream is that you craft a zero-day exploit 00:01:53.860 |
that can remotely exploit someone else's iPhone 00:02:06.860 |
record their camera without them knowing about it. 00:02:09.620 |
Basically, you can put an invisible ankle bracelet 00:02:17.220 |
that zero-day exploit would have immense value 00:02:23.740 |
that wants to monitor its critics or dissidents. 00:02:37.660 |
Which one is the sexier thing to try to get to 00:02:45.500 |
versus having to actually come in physical contact with it. 00:03:03.540 |
something that can get you into Android phones, 00:03:09.660 |
on this underground market for zero-day exploits 00:03:23.260 |
so I'm an Android person 'cause I'm a man of the people, 00:03:33.060 |
So is that the reason that the more powerful people 00:03:55.420 |
You could sell that to a zero-day broker for $2 million. 00:04:00.420 |
The caveat is you can never tell anyone about it 00:04:04.680 |
because the minute you tell someone about it, 00:04:08.880 |
they patch it, and that $2.5 million investment 00:04:12.560 |
that that zero-day broker just made goes to dust. 00:04:16.120 |
So a couple years ago, and don't quote me on the prices, 00:04:32.180 |
was that it might be a sign that Apple security was falling 00:04:54.700 |
And a lot of governments that are paying top dollar 00:05:05.280 |
that wanna use these exploits to monitor their own citizens, 00:05:14.660 |
It's that they wanna find out who these people are 00:05:32.780 |
Some of the zero-day exploits that have fetched top dollar 00:05:37.020 |
that I've heard of in my reporting in the United States 00:05:48.120 |
They approached hackers and say, "We'll pay you 00:06:05.280 |
So a couple of years ago, there was a watering hole attack. 00:06:13.300 |
It was actually, it had information aimed at Uyghurs 00:06:23.280 |
it would drop an iOS zero-day exploit onto your phone. 00:06:39.800 |
would have gotten infected with this zero-day exploit. 00:06:43.980 |
So in that case, they were targeting huge swaths 00:06:51.740 |
in this one population basically in real time. 00:06:59.380 |
From the individual level to the group level, 00:07:02.720 |
psychologically speaking, what's their motivation? 00:07:12.220 |
These are big philosophical human questions, I guess. 00:07:15.460 |
- So these are the questions I set out to answer 00:07:25.620 |
If they're just after money, how do they sleep at night 00:07:34.740 |
to basically make someone's life a living hell? 00:07:37.280 |
And what I found was there's kind of this long, sordid 00:07:46.920 |
when hackers were just finding holes and bugs in software 00:07:54.380 |
And some of them would go to the tech companies 00:07:56.820 |
like Microsoft or Sun Microsystems at the time, or Oracle. 00:08:01.780 |
And they'd say, "Hey, I just found this zero-day 00:08:04.500 |
in your software and I can use it to break into NASA." 00:08:11.140 |
"Thank you so much for pointing out this flaw 00:08:13.940 |
in our software, we'll get it fixed as soon as possible." 00:08:17.240 |
It was, "Don't ever poke around our software ever again, 00:08:24.160 |
And that was really sort of the common thread for years. 00:08:30.380 |
And so hackers who set out to do the right thing 00:08:40.140 |
And what happened next was they basically started 00:08:48.180 |
from those early days, they all tell a very similar story, 00:08:57.260 |
You know, they remind me of like the kid down the block 00:08:59.740 |
that was constantly poking around the hood of his dad's car. 00:09:03.500 |
You know, they just couldn't help themselves. 00:09:06.140 |
They wanted to figure out how a system is designed 00:09:14.900 |
But they were basically kind of beat down for so long 00:09:28.060 |
And that's how you got these really heated debates 00:09:42.540 |
But, you know, don't you wanna just stick a middle finger 00:09:47.260 |
that are basically threatening you all the time? 00:09:50.020 |
So there was this really interesting dynamic at play. 00:09:53.580 |
And what I learned in the course of doing my book 00:09:57.060 |
was that government agencies and their contractors 00:10:01.740 |
sort of tapped into that frustration and that resentment. 00:10:06.180 |
And they started quietly reaching out to hackers 00:10:14.940 |
Could you come up with something custom for me? 00:10:31.460 |
that started reaching out to hackers on these forums 00:10:37.340 |
for that bug you were trying to get Microsoft 00:10:41.420 |
And sort of so began or so catalyzed this market 00:11:07.060 |
Siemens Industrial Software, into our nuclear plants 00:11:14.940 |
and our petrochemical facilities and our pipelines, 00:11:18.720 |
those same zero days came to be just as valuable 00:11:28.500 |
change the nature of the attackers that came to the table 00:11:35.740 |
you told the psychology of the hackers in the '90s, 00:11:40.580 |
what is the culture today and where is it heading? 00:11:44.020 |
- So I think there are people who will tell you 00:11:52.100 |
One, because they don't know how it's gonna get used 00:11:56.380 |
Most of these get rolled into classified programs 00:12:02.700 |
you don't even know which nation state might use it 00:12:05.500 |
or potentially which criminal group might use it 00:12:19.940 |
if they found out their zero day was being used 00:12:46.180 |
and they continue to profit off that software. 00:12:57.980 |
over the last 10 years, are bug bounty programs. 00:13:02.260 |
Companies like Google and Facebook and then Microsoft 00:13:06.180 |
and finally Apple, which resisted it for a really long time, 00:13:10.180 |
have said, okay, we are gonna shift our perspective 00:13:14.780 |
about hackers, we're no longer going to treat them 00:13:20.340 |
for what is essentially free quality assurance. 00:13:23.800 |
And we're gonna pay them good money in some cases, 00:13:28.900 |
We're never gonna be able to bid against a zero day broker 00:13:34.740 |
But we can reward them and hopefully get to that bug earlier 00:13:40.980 |
so that they don't have to spend another year 00:13:44.760 |
And in that way, we can keep our software more secure. 00:13:48.180 |
But every week I get messages from some hacker that says, 00:14:00.380 |
I tried to tell Microsoft about this two years ago 00:14:04.540 |
and they were gonna pay me peanuts, so it never got fixed. 00:14:08.900 |
There are all sorts of those stories that can continue on. 00:14:19.860 |
They tend to be pretty snipey, technical crowd 00:14:43.100 |
- Yeah, and there have been some of those companies 00:14:47.780 |
and HackerOne is one of them, Bugcrowd is another. 00:14:56.900 |
for a private bug bounty program, essentially. 00:15:04.460 |
to come hack your software, hack your system. 00:15:07.580 |
And then they'll quietly tell you what they found. 00:15:10.820 |
And I think that's a really positive development. 00:15:16.300 |
hired all three of those companies I just mentioned 00:15:25.780 |
into the really sensitive high side classified stuff, 00:15:37.780 |
to financially compete with the zero day brokers, 00:15:42.020 |
So like the defense can't outpay the hackers? 00:15:47.020 |
- It's interesting, they shouldn't outpay them 00:15:51.860 |
because what would happen if they started offering 00:15:56.180 |
$2.5 million at Apple for any zero day exploit 00:16:06.260 |
"Why the hell am I working for less than that?" 00:16:16.940 |
until I started this research and I realized, 00:16:20.540 |
You don't want to incentivize offense so much 00:16:29.540 |
what the companies have on government agencies 00:16:32.660 |
is if they pay you, you get to talk about it. 00:16:38.180 |
You get to brag about the fact you just found 00:16:41.260 |
that $2.5 million iOS zero day that no one else did. 00:17:16.980 |
that have the skill would not share that with anyone, 00:17:23.420 |
I guess how many people are completely devoid 00:17:31.780 |
So my belief is all the ultra competent people 00:17:36.340 |
or very, very high percentage of ultra competent people 00:17:48.500 |
- So this was another question I wanted to answer. 00:17:52.620 |
Who are these people who would sell a zero day exploit 00:17:57.740 |
that would neutralize a Schneider Electric safety lock 00:18:03.100 |
Basically the last thing you would need to neutralize 00:18:16.780 |
A lot of people said, I would never even look there 00:18:23.020 |
I don't even wanna have to make that decision 00:18:26.580 |
about whether I'm gonna profit off of that knowledge. 00:18:31.860 |
and this whole kind of moral calculus I had in my head 00:18:41.620 |
So Argentina actually is a real hacker's paradise. 00:18:46.460 |
People grew up in Argentina and I went down there, 00:19:07.620 |
And it's the whole culture is really like a hacker culture. 00:19:12.100 |
They say like, it's really like a MacGyver culture. 00:19:15.300 |
You have to figure out how to break into something 00:19:19.500 |
And that means that there are a lot of really good hackers 00:19:24.460 |
in Argentina who specialize in developing zero-day exploits. 00:19:40.060 |
who's selling zero-day exploits to governments? 00:19:45.020 |
Throw a stone anywhere and you're gonna hit someone. 00:19:50.700 |
you saw these guys who were clearly from these Gulf States 00:19:55.700 |
What are they doing at a young hacking conference 00:20:03.260 |
with kind of this godfather of the hacking scene there. 00:20:10.260 |
and I'm still embarrassed about how I phrased it. 00:20:14.620 |
well, these guys only sell these zero-day exploits 00:20:22.380 |
the United States wasn't a good Western government. 00:20:25.660 |
You know, the last country that bombed another country 00:20:33.580 |
So if we're gonna go by your whole moral calculus, 00:20:39.860 |
And we'd actually rather sell to Iran or Russia or China, 00:20:51.860 |
to whoever brings us the biggest bag of cash. 00:20:53.940 |
Have you checked into our inflation situation recently? 00:21:08.780 |
You know, we kind of sit on our high horse sometimes 00:21:20.220 |
that, you know, the one rule is like Fight Club, 00:21:33.940 |
And governments roll these into classified programs 00:21:36.860 |
and they don't want anyone to know what they have. 00:21:41.220 |
And when you're operating in the dark like that, 00:21:43.860 |
it's really easy to put aside your morals sometimes. 00:21:47.000 |
- Can I, as a small tangent, ask you, by way of advice, 00:21:52.060 |
you must have done some incredible interviews. 00:22:00.060 |
If you were to give me advice for interviewing 00:22:04.420 |
when you're recording on mic with a video camera, 00:22:19.020 |
what is it, like the godfather of cyber war, cybersecurity? 00:22:25.260 |
And they still have to be pretty brave to speak publicly. 00:22:28.440 |
But is it virtually impossible to really talk to anybody 00:22:40.880 |
But, you know, a lot, when I've seen people do it, 00:22:45.940 |
it's always the guy who's behind the shadows, 00:22:51.580 |
You know, when they've gotten someone on camera, 00:22:55.160 |
You know, very, very few people talk in this space. 00:22:59.020 |
And there's actually a pretty well known case study 00:23:07.420 |
So, you know, the Grugq is, or was, this zero-day broker, 00:23:15.260 |
And right when I was starting on this subject 00:23:18.860 |
at the New York Times, he'd given an interview to Forbes. 00:23:25.580 |
And he even posed next to this giant duffel bag 00:23:31.740 |
And later he would say he was speaking off the record, 00:23:38.020 |
But what I heard from people who did business with him 00:23:41.020 |
was that the minute that that story came out, 00:23:47.300 |
You know, his business plummeted by at least half. 00:23:54.420 |
how they're selling zero days to governments. 00:24:06.940 |
If they have those zero day exploits at their disposal, 00:24:27.900 |
- Which sucks because, I mean, transparency here 00:24:32.900 |
would be really powerful for educating the world 00:24:36.460 |
and also inspiring other engineers to do good. 00:24:40.180 |
It just feels like when you're operating in shadows, 00:24:42.780 |
it doesn't help us move in the positive direction 00:24:46.380 |
in terms of like getting more people on the defense side 00:24:53.660 |
is have great journalists, just like you did, 00:25:09.180 |
that are finding and developing zero day exploits 00:25:15.540 |
What about the, you know, however many millions 00:25:20.740 |
who've never even heard of a zero day exploit? 00:25:25.980 |
"Hey, we'll start paying you if you can find a bug 00:25:29.220 |
in United Airlines software or in Schneider Electric 00:25:39.940 |
Let's go find this untapped army of programmers 00:25:46.900 |
who will continue to sell these to governments 00:26:10.060 |
I have a QNAP device for basically kind of coldish storage. 00:26:15.620 |
So it's about 60 terabytes with 50 terabytes of data on it 00:26:20.260 |
in RAID 5, and apparently about 4,000 to 5,000 QNAP devices 00:28:25.260 |
were hacked and taken over with this ransomware. 00:26:30.660 |
And what ransomware does there is it goes file by file, 00:26:35.540 |
almost all the files on the QNAP storage device 00:26:57.980 |
about how friendly and eloquent this is written. 00:27:08.100 |
You have been targeted because of the inadequate security 00:27:16.340 |
You can make a payment of exactly 0.03 Bitcoin, 00:27:24.860 |
we'll follow up with transaction to the same address, 00:27:28.600 |
They give you instructions of what happens next 00:27:31.540 |
and they'll give you a decryption key that you can then use. 00:27:34.620 |
And then there's another message for QNAP that says, 00:27:37.600 |
all your affected customers have been targeted 00:27:40.900 |
using a zero-day vulnerability in your product. 00:27:43.860 |
We offer you two options to mitigate this and future damage. 00:28:00.180 |
Or you can make a Bitcoin payment of 50 Bitcoin 00:28:03.900 |
to get a master decryption key for all your customers. 00:28:27.020 |
because I thought I can afford to lose that data. 00:28:32.460 |
I mean, I think you've spoken about the crown jewels, 00:28:35.900 |
like making sure there's things you really protect. 00:28:45.100 |
But there's a bunch of stuff, like personal videos. 00:28:53.340 |
that because they're very large or 4K or something like that, 00:28:56.100 |
I kept them on there thinking RAID 5 will protect it. 00:29:09.540 |
And I'm sure there's a lot of painful stuff like that 00:29:15.580 |
And there's a lot of interesting ethical questions here. 00:29:25.660 |
Especially when you don't know if it's going to work or not. 00:29:35.860 |
We're working very hard day and night to solve this. 00:29:48.260 |
Because the way they phrased it on purpose, perhaps, 00:29:54.820 |
is maybe they're trying to help themselves sleep at night. 00:30:01.300 |
This is about the company with the vulnerabilities. 00:30:04.380 |
Just like you mentioned, this is the justification they have 00:30:09.580 |
They hurt me but I'm sure there's a few others 00:30:28.580 |
if they still don't know where the zero day is, 00:30:30.820 |
what's to say that they won't just hit them again 00:30:36.580 |
And that is a huge advancement for ransomware. 00:30:40.860 |
It's really only been, I think, in the last 18 months 00:30:44.740 |
that we've ever really seen ransomware exploit zero days 00:30:49.260 |
Usually 80% of them, I think the data shows 80% of them 00:30:54.260 |
come down to a lack of two-factor authentication. 00:30:58.460 |
So when someone gets hit by a ransomware attack, 00:31:01.420 |
they don't have two-factor authentication on, 00:31:30.380 |
So maybe there's a misconfiguration of some sort 00:32:01.700 |
And it's also a new thing for ransomware groups 00:32:04.340 |
to go to the individuals to pressure them to pay. 00:32:12.020 |
where there was a mental health clinic that got hit. 00:32:20.620 |
pay this or we're going to release your psychiatric records. 00:32:43.020 |
They got hit and the ransom demand was 50 million. 00:32:58.940 |
we recommend you don't pay or please don't pay. 00:33:13.740 |
of Colonial Pipeline was after the company got hit 00:33:18.740 |
and took the preemptive step of shutting down the pipeline 00:33:29.100 |
and I got our hands on a classified assessment 00:33:35.780 |
we could have only afforded two to three more days 00:33:49.020 |
Without the diesel, the refineries couldn't function 00:33:52.300 |
and it would have totally screwed up the economy. 00:33:54.780 |
And so there was almost this like national security, 00:33:59.540 |
economic impetus for them to pay this ransom. 00:34:04.540 |
And the other one I always think about is Baltimore. 00:34:06.980 |
You know, when the city of Baltimore got hit, 00:34:16.780 |
And Baltimore stood its ground and didn't pay. 00:34:20.060 |
But ultimately the cost to remediate was $18 million. 00:34:28.100 |
to public school education and roads and public health. 00:34:36.340 |
And so a lot of residents in Baltimore were like, 00:34:48.140 |
because why you're funding their R&D for the next go round. 00:34:56.940 |
- So on the individual level, just like, you know, 00:35:03.260 |
have you talked to people that were kind of victims 00:35:08.540 |
You know, in the same way that violence hurts people. 00:35:13.940 |
hurt people in your sense and the way you researched it? 00:35:21.020 |
on a personal level was an attack on a hospital in Vermont. 00:35:39.260 |
because the protocol of who gets what is very complicated. 00:35:43.000 |
And without it, nurses and doctors couldn't access it. 00:35:54.640 |
"I don't know why people aren't screaming about this. 00:35:57.240 |
That the only thing I've seen that even compares 00:35:59.680 |
to what we're seeing at this hospital right now 00:36:06.840 |
You know, they really put it in these super dramatic terms. 00:36:10.520 |
And last year there was a report in the Wall Street Journal 00:36:23.560 |
and whatever device they were using to monitor the fetus 00:36:28.480 |
wasn't working because of the ransomware attack. 00:36:56.800 |
or business in Ukraine that used this tax software. 00:36:59.680 |
It actually hit any business all over the world 00:37:02.680 |
that had even a single employee working remotely in Ukraine. 00:37:17.760 |
I mean, it really created an existential crisis 00:37:21.600 |
Merck had to tap into the CDC's emergency supplies 00:37:43.720 |
a global cyber terrorist attack, essentially. 00:37:57.300 |
there was a really impressive threat researcher at Cisco, 00:38:02.300 |
which has this threat intelligence division called Talos, 00:38:05.400 |
who said, "Stop calling it collateral damage." 00:38:53.240 |
Do you think China and US are going to escalate 00:39:08.840 |
from now on is guaranteed to have some cyber element to it. 00:39:15.200 |
The Department of Justice recently declassified a report 00:39:20.480 |
that said China's been hacking into our pipelines, 00:39:22.700 |
and it's not for intellectual property theft. 00:39:26.980 |
so that if things escalate in Taiwan, for example, 00:39:30.460 |
they are where they need to be to shut our pipelines down, 00:39:34.960 |
of what that looked like with Colonial Pipeline 00:39:37.840 |
and the panic buying and the jet fuel shortages 00:39:40.920 |
and that assessment I just mentioned about the diesel. 00:39:53.440 |
from fighter jets, Chinese fighter jets in Taiwan, 00:39:57.120 |
or what's happening right now with Russia's buildup 00:40:04.660 |
I'm always looking at it through a cyber lens, 00:40:07.440 |
and it really bothers me that other people aren't 00:40:10.400 |
because there is no way that these governments 00:40:15.520 |
and these nation states are not going to use their access 00:40:23.760 |
And I'm now in a position where I'm an advisor 00:40:28.640 |
to the Cybersecurity and Infrastructure Security Agency at DHS. 00:40:41.220 |
to understand just generally what the collateral damage 00:40:46.020 |
could be for American businesses and critical infrastructure 00:40:50.020 |
in any of these escalated conflicts around the world. 00:40:54.060 |
Because just generally, our adversaries have learned 00:41:02.660 |
in terms of our traditional military spending 00:41:08.060 |
But we have a very soft underbelly when it comes to cyber. 00:41:12.980 |
80% or more of America's critical infrastructure, 00:41:17.400 |
so pipelines, power grid, nuclear plants, water systems, 00:41:26.860 |
And for the most part, there is nothing out there 00:41:38.760 |
There's nothing mandating that they even meet 00:41:49.020 |
most of the time we don't even know about it. 00:41:51.360 |
So that is, if you were gonna design a system 00:42:08.440 |
let's just keep hooking up everything for convenience. 00:42:13.840 |
Let's just keep going for cost, for convenience sake, 00:42:34.760 |
you realize just how dumb "software eats the world" is. 00:42:39.760 |
And no one has ever stopped to pause and think, 00:42:43.180 |
should we be hooking up these systems to the internet? 00:42:54.680 |
we've seen a record number of zero-day attacks. 00:42:59.080 |
which is probably more than double what it was in 2019. 00:43:08.220 |
with a lot of geopolitical hot points right now. 00:43:15.120 |
are places where countries have been investing heavily 00:43:23.440 |
the goal would be to maximize the footprint of zero-day, 00:43:29.520 |
like super-secret zero-day that nobody's aware of. 00:43:36.000 |
the huge negative effects of shutting down infrastructure, 00:43:38.920 |
or any kind of zero-day, is the chaos it creates. 00:43:45.240 |
the markets plummet, just everything goes to hell. 00:43:56.640 |
I mean, we're not using two-factor authentication. 00:44:16.600 |
And what really drew me to the zero-day market 00:44:21.440 |
Particularly from the US government's point of view. 00:44:26.240 |
How do they justify leaving these systems so vulnerable 00:44:33.560 |
and we're baking more of our critical infrastructure 00:44:38.160 |
It's not like we're using one set of technology, 00:44:41.120 |
and Russia's using another, and China's using this. 00:44:48.920 |
you're not just leaving it open so you can spy on Russia 00:44:57.520 |
But zero-days are like, that is the secret sauce. 00:45:11.960 |
is trying to find offensive hacking tools in zero-days 00:45:34.520 |
I mean, why did Russia turn off the lights twice in Ukraine? 00:45:42.740 |
I think part of it is to sow the seeds of doubt 00:45:47.940 |
Your government can't even keep your lights on. 00:45:58.300 |
- Nuclear weapons seems to have helped prevent nuclear war. 00:46:03.300 |
Is it possible that we have so many vulnerabilities 00:46:11.260 |
that it will kind of achieve the same kind of equilibrium 00:46:20.700 |
Do you have any hope for this particular solution? 00:46:23.780 |
- You know, nuclear analogies always tend to fall apart 00:46:27.500 |
mainly because you don't need fissile material. 00:46:30.940 |
You know, you just need a laptop and the skills 00:46:40.940 |
And we've seen countries muck around with attribution. 00:46:44.300 |
We've seen, you know, nation states piggyback 00:46:49.300 |
and just sit there and siphon out whatever they're getting. 00:46:53.360 |
We learned some of that from the Snowden documents. 00:47:01.380 |
We've seen them hit a Saudi petrochemical plant 00:47:05.420 |
where they did neutralize the safety locks at the plant 00:47:10.140 |
given Iran had been targeting Saudi oil companies forever. 00:47:15.220 |
a graduate research institute outside Moscow. 00:47:22.260 |
I think because they think, okay, if I do this, 00:47:25.380 |
like how am I gonna cover up that it came from me 00:47:34.980 |
And, you know, at the Times, I'd covered the Chinese hacks 00:47:42.860 |
I'd covered the Russian probes of nuclear plants. 00:47:46.120 |
I'd covered the Russian attacks on the Ukraine grid. 00:47:50.100 |
And then in 2018, my colleague David Sanger and I 00:48:02.220 |
And when we went to the National Security Council, 00:48:08.100 |
they give the other side a chance to respond. 00:48:11.400 |
I assumed we would be in for that really awkward, 00:48:20.280 |
And instead, they gave us the opposite answer. 00:48:29.360 |
but it was pretty obvious they wanted Russia to know 00:48:33.140 |
that we're hacking into their power grid too, 00:48:35.340 |
and they better think twice before they do to us 00:48:40.220 |
So yeah, you know, we have stumbled into this new era 00:48:47.780 |
I think another sort of quasi-norm we've stumbled into 00:48:56.940 |
You know, there's this idea that if you get hit, 00:49:05.480 |
You know, that is how the language always goes. 00:49:08.440 |
That's what Obama said after North Korea hit Sony. 00:49:12.800 |
"We will respond at a time and place of our choosing." 00:49:15.700 |
But no one really knows what that response looks like. 00:49:22.760 |
are just these like just short of war attacks. 00:49:27.140 |
You know, Russia turned off the power in Ukraine, 00:49:31.860 |
You know, it stayed off for a number of hours. 00:49:34.820 |
You know, NotPetya hit those companies pretty hard, 00:49:41.380 |
And the question is, what's gonna happen when someone dies? 00:49:44.560 |
And can a nation-state masquerade as a cyber-criminal group, 00:49:53.500 |
coming to some sort of digital Geneva Convention. 00:49:57.140 |
Like there's been a push from Brad Smith at Microsoft. 00:50:03.700 |
And on its face, it sounds like a no-brainer. 00:50:06.060 |
Yeah, why wouldn't we all agree to stop hacking 00:50:24.220 |
but we'd never do it when you're dealing with Xi or Putin 00:50:32.860 |
they outsource these operations to cyber-criminals. 00:50:39.340 |
from this loose satellite network of private citizens 00:50:43.060 |
that work at the behest of the Ministry of State Security. 00:50:46.660 |
So how do you come to some sort of state-to-state agreement 00:50:51.340 |
when you're dealing with transnational actors 00:50:55.700 |
and cyber-criminals, where it's really hard to pin down 00:51:01.660 |
or whether they were acting at the behest of the MSS 00:51:09.420 |
can't remember if it was before or after NotPetya, 00:51:18.020 |
In other words, I have no say over what they do or don't do. 00:51:24.300 |
when that's how he's talking about these issues 00:51:51.100 |
So you're not allowed to fix like authoritarian regimes 00:52:05.100 |
you mentioned there's no regulation on companies, 00:52:09.020 |
what if you could just fix with the snap of a finger, 00:52:19.820 |
It's ridiculous how many of these attacks come in 00:52:24.780 |
because someone didn't turn on multi-factor authentication. 00:52:35.860 |
to the East Coast of the United States of America, how? 00:52:39.180 |
Because they forgot to deactivate an old employee account 00:52:42.180 |
whose password had been traded on the dark web 00:52:44.700 |
and they'd never turned on two-factor authentication. 00:52:48.020 |
This water treatment facility outside Florida 00:52:53.220 |
They were using Windows XP from like a decade ago 00:52:56.500 |
that can't even get patches if you want it to 00:52:59.300 |
and they didn't have two-factor authentication. 00:53:02.700 |
if they just switched on two-factor authentication, 00:53:06.740 |
some of these attacks wouldn't have been possible. 00:53:17.540 |
But I think right now, that is like bar none, 00:53:21.700 |
that is just, that is the easiest, simplest way 00:53:25.900 |
And the name of the game right now isn't perfect security. 00:53:39.060 |
than your competitor or than anyone else out there 00:53:44.300 |
And maybe if you are a target for an advanced nation state 00:53:48.740 |
or the SVR, you're gonna get hacked no matter what. 00:53:56.540 |
Deadbolt is it, you can make their jobs a lot harder 00:54:03.180 |
And the other thing is stop reusing your passwords. 00:54:05.260 |
But if I only get one, then two-factor authentication. 00:54:10.460 |
Factor one is what, logging in with a password? 00:54:15.900 |
or another channel through which you can confirm, 00:54:19.460 |
- Yes, usually this happens through some kind of text. 00:54:23.500 |
You get your one-time code from Bank of America 00:54:36.060 |
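[The one-time codes described here are typically TOTP values per RFC 6238: an HMAC over the current 30-second time step, truncated to six digits. A minimal stdlib sketch, using the RFC's published test secret rather than any real key:]

```python
import base64
import hmac
import struct
import time


def totp(secret_b32, now=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (SHA-1, 30-second steps)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if now is None else now) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test vector: ASCII secret "12345678901234567890" at T = 59 s
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, now=59))  # -> 287082
```

[The bank and your device share the secret, so both can compute the code; an attacker who only has the password cannot.]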
And if you don't have that hardware device with you, 00:54:54.380 |
like who wasn't hacked over the course of those five years? 00:54:58.380 |
And a lot of those companies that got hacked, 00:55:02.160 |
They took the credentials, they took the passwords. 00:55:05.480 |
They can make a pretty penny selling them on the dark web. 00:55:11.540 |
So you get one from God knows who, I don't know, 00:55:15.540 |
LastPass, the worst case example, actually LastPass. 00:55:23.420 |
And you go test it on their brokerage account. 00:55:25.580 |
And you test it on their cold storage account. 00:55:46.540 |
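[This is why sites store salted, slow hashes instead of passwords, and why reuse still defeats that defense: a breach that yields the plaintext replays everywhere. A minimal sketch using stdlib scrypt; the function name is illustrative:]

```python
import hashlib
import os


def hash_password(password, salt=None):
    """Salted scrypt hash, roughly as a site might store it (stdlib only)."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest


# The same reused password yields unrelated digests at two sites...
_, site_a = hash_password("hunter2")
_, site_b = hash_password("hunter2")
print(site_a != site_b)  # -> True: different salts, hashes can't be cross-matched
# ...but an attacker who recovers the *plaintext* from one breach
# can still replay it against every account where it was reused.
```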
someone is trying to get into your Instagram account 00:55:49.860 |
or your Twitter account or your email account. 00:55:52.060 |
And I don't worry because I use multi-factor authentication. 00:55:59.500 |
But it's the simplest thing to do and we don't even do it. 00:56:06.860 |
'cause it's pretty annoying if it's implemented poorly. 00:56:23.480 |
Like I think MIT for a while had two-factor authentication. 00:56:35.040 |
it asks to re-authenticate across multiple subdomains. 00:56:44.120 |
- Yeah, it feels like friction in our frictionless society. 00:56:54.520 |
We need the Steve Jobs of security to come along 00:57:02.060 |
Apple has probably done more for security than anyone else 00:57:07.060 |
simply by introducing biometric authentication, 00:57:10.880 |
first with the fingerprint and then with Face ID. 00:57:13.640 |
And it's not perfect, but if you think just eight years ago, 00:57:17.440 |
everyone was running around with either no passcode, 00:57:20.400 |
an optional passcode, or a four-digit passcode 00:57:24.500 |
think of what you can get when you get someone's iPhone, 00:57:29.060 |
And props to them for introducing the fingerprint 00:57:36.960 |
Now it's time to make another huge step forward. 00:57:42.880 |
I mean, it's gotten us as far as it was ever gonna get us 00:57:49.560 |
is not gonna be annoying, is gonna be seamless. 00:57:52.360 |
- When I was at Google, that's what we worked on is, 00:57:55.080 |
and there's a lot of ways to call this active authentication 00:58:03.780 |
but everything from your body to identify who you are, 00:58:09.140 |
So basically create a lot of layers of protection 00:58:23.020 |
So like from video, so unlocking it with video, 00:58:30.040 |
the way you take it out of the pocket, that kind of thing. 00:58:37.360 |
And ultimately it's very difficult to beat the password 00:58:43.560 |
- Well, there's a company that I actually will call out 00:59:01.560 |
like it's a joke how much they know about us. 00:59:03.780 |
You know, you always hear the conspiracy theories 00:59:21.920 |
if you're, this is what your email patterns are. 00:59:26.360 |
because we're emailing strangers all the time. 00:59:33.920 |
And if something strays from that pattern, that's abnormal. 00:59:38.400 |
And they'll block it, they'll investigate it, you know, 00:59:43.560 |
You know, let's start using that kind of targeted 00:59:52.980 |
from the password and using multi-factor authentication, 01:00:19.480 |
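The pattern-based blocking described here can be sketched in miniature (a toy illustration, not any vendor's actual model): one classic signal is a sender domain that is a near-miss of a domain the user already corresponds with.

```python
# Toy lookalike-domain check: a brand-new sender domain that closely resembles
# a trusted one is a strong spoofing signal worth blocking or investigating.
from difflib import SequenceMatcher

def is_suspicious(sender_domain, trusted_domains, threshold=0.8):
    """True if the domain is new but nearly identical to a trusted domain."""
    if sender_domain in trusted_domains:
        return False                                   # known correspondent
    return any(SequenceMatcher(None, sender_domain, d).ratio() >= threshold
               for d in trusted_domains)

trusted = {"nytimes.com", "gmail.com"}
print(is_suspicious("nytirnes.com", trusted))  # lookalike of nytimes.com → True
```

Real systems layer many such signals, including who emails whom and when, which is why deviations stand out.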
You know, the number of mistakes I would probably make 01:00:22.440 |
just trying to email someone with PGP just wasn't worth it. 01:00:27.100 |
And then Signal came along and Signal made it, 01:00:37.040 |
So we have to start investing in creative minds 01:00:46.940 |
that's gonna get us out of where we are today. 01:00:52.420 |
Do you worry about this sort of hacking people? 01:01:00.940 |
of every chief information security officer out there. 01:01:04.200 |
You know, social engineering, we work from home now. 01:01:09.040 |
I saw this woman posted online about how her husband, 01:01:23.660 |
that shows up for work every day doesn't act like John. 01:01:31.060 |
Like think about the potential for social engineering 01:01:35.540 |
You know, you apply for a job and you put on a pretty face, 01:01:40.420 |
and then you just get inside the organization 01:01:42.680 |
and get access to all that organization's data. 01:01:52.020 |
Probably because they were trying to figure out 01:01:58.020 |
You know, they couldn't do it with a hack from the outside, 01:02:04.540 |
And it also, unfortunately, creates all kinds of xenophobia 01:02:11.340 |
I mean, if you're gonna have to take that into consideration 01:02:19.420 |
at someone who applies for that job from China. 01:02:28.540 |
where they basically accuse people of being spies 01:02:43.940 |
- That's actually why I'm single, I'm suspicious 01:02:46.260 |
of China and Russia every time I meet somebody 01:02:49.660 |
or trying to plant and get insider information. 01:03:27.460 |
But just interacting with people on the internet, 01:03:56.820 |
where more and more of our lives are in the digital sphere? 01:03:56.820 |
or experiencing a moment of happiness with a friend, 01:04:25.540 |
the things that make us love life will happen online. 01:04:29.100 |
And if those things have an avatar that's digital, 01:04:32.960 |
that's like a way to hack into people's minds. 01:04:40.980 |
I don't know if there's a way to protect against that. 01:04:51.940 |
So if most people are good, we're going to be okay. 01:05:13.700 |
when you were talking just now is my three-year-old son. 01:05:17.580 |
- Yeah, he asked me the other day, what's the internet, mom? 01:05:30.380 |
I don't want all of his most meaningful experiences 01:05:43.620 |
I don't believe in free speech for robots and bots. 01:05:47.540 |
And look what just happened over the last six years. 01:05:52.540 |
We had bots pretending to be Black Lives Matter activists 01:06:12.460 |
so that we're so paralyzed we can't get anything done. 01:06:17.260 |
and we definitely can't handle our adversaries 01:06:32.540 |
just because it sounds like the next logical step 01:06:39.820 |
Do I really want my child's most significant moments 01:06:46.760 |
So maybe I'm just stuck in that old-school thinking, 01:06:54.580 |
And I'm really sick of being the guinea pig parent generation 01:07:04.660 |
But thinking about how to manage the metaverse as a parent 01:07:09.660 |
to a young boy, I can't even let my head go there. 01:07:24.040 |
We've always said, okay, the promise of this technology 01:07:27.920 |
means we should keep going, keep pressing ahead. 01:07:31.760 |
We just need to figure out new ways to manage that risk. 01:07:39.520 |
When I was covering all of these ransomware attacks, 01:07:44.880 |
I thought, okay, this is gonna be it for cryptocurrency. 01:07:51.400 |
They're gonna put the hammer down and say, enough is enough. 01:08:05.520 |
get an e-gift card and tell us what the PIN is 01:08:05.520 |
But after the Colonial Pipeline ransom was seized. 01:08:17.840 |
'Cause if you remember, the FBI was actually able to go in 01:08:34.080 |
So they're one of these blockchain intelligence companies. 01:08:46.440 |
But to track down that ransom payment would have taken, 01:08:54.760 |
would have taken us years to get to that one bank account 01:08:58.100 |
or belonging to that one front company in the Seychelles. 01:09:04.040 |
we can track the movement of those funds in real time. 01:09:09.960 |
These payments are not as anonymous as people think. 01:09:13.300 |
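The real-time tracing described here boils down to graph traversal over a public ledger. A toy sketch (the addresses and transaction graph are invented for illustration): breadth-first search from a ransom address to wherever the funds eventually cash out.

```python
# Toy chain-analysis-style fund tracing: follow payments hop by hop from a
# ransom address through a transaction graph until the funds hit an exchange.
from collections import deque

def trace_funds(tx_graph, start_addr):
    """BFS over a {address: [addresses paid]} graph; returns reachable addresses."""
    seen, queue = {start_addr}, deque([start_addr])
    while queue:
        addr = queue.popleft()
        for nxt in tx_graph.get(addr, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start_addr}

graph = {
    "ransom_addr": ["mixer_1", "mixer_2"],
    "mixer_1": ["exchange_hotwallet"],   # funds converge at an exchange wallet
    "mixer_2": ["exchange_hotwallet"],
}
print(sorted(trace_funds(graph, "ransom_addr")))
# → ['exchange_hotwallet', 'mixer_1', 'mixer_2']
```

Once funds reach a regulated exchange, a subpoena can attach a real identity to the address, which is why these payments are less anonymous than they look.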
Like we still can use our old hacking ways and zero days 01:09:23.480 |
So it's a curse in some ways in that it's an enabler, 01:09:29.440 |
And they said that same thing to me that I just said to you. 01:09:32.080 |
They said, we've never shut down a promising new technology 01:09:53.600 |
So maybe we'll finally be able to, not finally, 01:09:57.000 |
but figure out a way to solve the identity problem 01:10:00.960 |
on the internet, meaning like a blue check mark 01:10:06.960 |
like a fingerprint so you can prove you're you, 01:10:17.580 |
So giving you, allowing you to maintain control 01:10:25.960 |
of how that data is being used, all those kinds of things. 01:10:28.880 |
And maybe as you educate more and more people, 01:10:36.080 |
that the companies that they give their data to 01:10:43.600 |
and I hope they succeed, their name's Piiano, P-I-I-A-N-O, 01:10:43.600 |
and they wanna create a vault for your personal information 01:10:54.440 |
And ultimately, if I'm gonna call Delta Airlines 01:11:05.540 |
they're just gonna send me a one-time token to my phone. 01:11:08.920 |
My phone's gonna say, or my FIDO key is gonna say, 01:11:08.920 |
And then we're gonna talk about my identity like a token, 01:11:27.960 |
they don't get access to my social security number, 01:11:30.440 |
my location, or the fact I'm a Times journalist. 01:11:34.580 |
You know, I think that's the way the world's gonna go. 01:11:39.960 |
sort of losing our personal information everywhere, 01:11:44.160 |
letting data marketing companies track our every move. 01:11:59.960 |
but they don't need to know I'm Nicole Perlroth. 01:11:59.960 |
They can know that I am token number, you know, x567. 01:12:11.040 |
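The flow she describes can be sketched as a vault that keeps the PII server-side and hands out short-lived, single-use handles in its place (a hypothetical illustration; `TokenVault` and its API are invented for this sketch, not Piiano's actual product):

```python
# Hypothetical "identity as a token" vault: callers see an opaque handle,
# never the underlying personal record.
import secrets, time

class TokenVault:
    def __init__(self, ttl=300):
        self._vault, self._ttl = {}, ttl       # token -> (record, expiry time)

    def issue(self, record):
        token = "tok_" + secrets.token_hex(8)  # opaque handle, no PII inside
        self._vault[token] = (record, time.time() + self._ttl)
        return token

    def verify(self, token):
        entry = self._vault.pop(token, None)   # single use: consumed on lookup
        if entry is None or time.time() > entry[1]:
            return None
        return entry[0]

vault = TokenVault()
t = vault.issue("customer-record-x567")
print(vault.verify(t))   # → customer-record-x567 (valid exactly once)
print(vault.verify(t))   # → None (token already consumed)
```

The call-center agent only ever sees the token, so a breach of their systems leaks handles, not identities.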
and give you control about removing the things they know. 01:12:22.080 |
given the choice to walk away, won't walk away. 01:12:25.240 |
They'll just feel better about having the option 01:12:28.560 |
to walk away when they understand the trade-offs. 01:12:36.000 |
like a personalized feed and all those kinds of things. 01:13:04.840 |
I should be able to, this has to do with everything. 01:13:16.960 |
and one click to provide all of your information 01:13:19.560 |
that's necessary for the subscription service, 01:13:21.700 |
for the transaction service, whatever that is, 01:13:25.940 |
I have all of these fake phone numbers and emails 01:13:34.320 |
then it's just going to propagate to everything else. 01:13:44.440 |
And frankly, I think it's negligent they haven't 01:13:46.620 |
on the fact that elderly people are getting spammed to death 01:13:51.620 |
on their phones these days with fake car warranty scams. 01:13:56.980 |
I mean, my dad was in the hospital last year, 01:14:00.820 |
and his phone kept buzzing, and I look at it, 01:14:08.260 |
people nonstop calling about his freaking car warranty, 01:14:13.200 |
why they're trying to get his social security number. 01:14:19.900 |
We need to figure out how to put those people 01:14:34.800 |
our social security number, and our home address, 01:14:46.240 |
And there's no question they're not protecting it 01:14:52.760 |
or identity theft, or credit card theft, or worse. 01:15:02.560 |
Please clip this out, which is if you get an email 01:15:07.360 |
or a message from Lex Fridman saying how much I, Lex, 01:15:16.920 |
and please connect with me on my WhatsApp number, 01:15:19.640 |
and I will give you Bitcoin or something like that, 01:15:25.020 |
And I'm aware that there's a lot of this going on, 01:16:05.160 |
It's fascinating because it's cross-platform now. 01:16:25.080 |
that they'll get the people to click, and they do. 01:16:33.680 |
There's an intimacy that people feel connected, 01:16:50.360 |
It's like the John that gets hired, the fake employee. 01:17:02.160 |
It's telling people to be skeptical on stuff they click. 01:17:20.360 |
But then the machine learning there is tricky, 01:17:21.980 |
'cause you don't wanna add a lot of extra friction 01:17:25.280 |
that just annoys people, because they'll turn it off, 01:17:28.200 |
'cause you have the accept cookies thing, right, 01:17:32.440 |
so now they completely ignore the accept cookies. 01:17:43.760 |
You've talked about looking through the NSA documents 01:18:07.480 |
I have really complicated feelings about Edward Snowden. 01:18:19.600 |
And I'm grateful for the conversations that we had 01:18:24.040 |
in the post-Snowden era about the limits to surveillance 01:18:56.280 |
On the other hand, when I walked into the storage closet 01:19:09.480 |
covering Chinese cyber espionage almost every day 01:19:14.240 |
and the sort of advancement of Russian attacks. 01:19:19.240 |
They were just getting worse and worse and more destructive. 01:19:23.200 |
And there were no limits to Chinese cyber espionage 01:19:27.440 |
and Chinese surveillance of its own citizens. 01:19:32.840 |
to what Russia was willing to do in terms of cyber attacks 01:19:37.280 |
and also, in some cases, assassinating journalists. 01:19:57.880 |
to, as far as I know, assassinate journalists. 01:20:20.200 |
because to me, the American people's reaction 01:20:24.440 |
to the Snowden documents was a little bit misplaced. 01:20:29.920 |
about the phone call metadata collection program. 01:20:47.320 |
And there wasn't a lot that I saw in those documents 01:20:51.560 |
that was beyond what I thought a spy agency does. 01:20:56.560 |
And I think if there was another 9/11 tomorrow, 01:21:07.900 |
Why weren't they spying on those world leaders? 01:21:42.320 |
the NSA has this program and here's what it does. 01:22:04.400 |
And Russia has been hacking into our energy infrastructure. 01:22:07.800 |
And they've been using the same methods to spy on, track, 01:22:11.340 |
and in many cases, kill their own journalists. 01:22:30.320 |
and I'm sorry, this is a little bit of a tangent, 01:22:45.980 |
You know, part of me wishes that those documents 01:22:48.900 |
had been released in a much more targeted, limited way. 01:22:52.980 |
It's just a lot of it just felt like a PowerPoint 01:23:03.020 |
that there had been a little bit more thought 01:23:07.700 |
Because I think a lot of the impact from Sony 01:23:13.380 |
But I think, you know, based on what I saw personally, 01:23:20.380 |
I don't know why that particular thing got released. 01:23:24.100 |
- As a whistleblower, what's a better way to do it? 01:23:28.660 |
it takes a lot of effort to do a more targeted release. 01:23:34.960 |
you're afraid that those channels will be manipulated. 01:23:39.460 |
What's a better way to do this, do you think? 01:23:43.540 |
As a journalist, this is almost a good journalistic question. 01:23:51.220 |
I bring up, you know, again, Mark Zuckerberg and Meta. 01:24:02.100 |
And I also am torn about how to feel about that whistleblower 01:24:06.980 |
because from a company perspective, that's an open culture. 01:24:10.800 |
How can you operate successfully if you have an open culture 01:24:19.360 |
whether it represents a larger context or not. 01:24:30.660 |
that's out of context, very targeted to where, 01:24:34.560 |
well, Facebook is evil, clearly, because of this one leak. 01:24:44.520 |
And so narratives by whistleblowers make that whistleblower 01:24:52.240 |
And so there's a huge incentive to take stuff 01:24:56.800 |
that don't represent the full context, the full truth. 01:25:03.100 |
'cause then that forces Facebook and Meta and governments 01:25:06.940 |
to be much more conservative, much more secretive. 01:25:14.580 |
I don't know if you can comment on any of that, 01:25:16.260 |
how to be a whistleblower ethically and properly. 01:25:27.280 |
I think of my book as sort of blowing the whistle 01:25:33.980 |
But it's not like I was in the market myself. 01:25:38.860 |
It's not like I had access to classified data 01:25:46.380 |
"Listen, I'm just trying to scrape the surface here 01:25:49.500 |
"so we can have these conversations before it's too late." 01:26:03.340 |
probably has some voodoo doll of me out there. 01:26:25.340 |
that we were basically, in some ways, blackmailing an ally 01:26:33.780 |
I mean, they went through the proper channels. 01:26:37.300 |
They weren't trying to profit off of it, right? 01:26:52.020 |
You mentioned NSA, one of the things it showed 01:27:08.020 |
From your understanding of intelligence agencies, 01:27:11.180 |
the CIA, NSA, and the equivalent in other countries, 01:27:15.980 |
are they, one question, this could be a dangerous question, 01:27:19.980 |
are they competent, are they good at what they do? 01:27:48.500 |
that they're interested in manipulating the populace 01:27:51.700 |
for whatever ends the powerful in dark rooms, 01:28:04.620 |
Do these conspiracy theories have kind of any truth to them? 01:28:09.620 |
Or are intelligence agencies, for the most part, 01:28:20.140 |
I think, you know, depends which intelligence agency. 01:28:25.340 |
You know, they're killing every Iranian nuclear scientist 01:28:47.400 |
You know, none of these, intelligence is intelligence. 01:28:54.820 |
they're malevolent or they're heroes, you know? 01:29:07.420 |
that you see on the beach on the 4th of July. 01:29:20.540 |
to do what we were accused of doing after Snowden, 01:29:33.340 |
and paperwork and bureaucracy it would have taken 01:29:37.820 |
to do what everyone thinks that we were supposedly doing. 01:29:51.900 |
were using their access to spy on their ex-girlfriends. 01:29:55.380 |
So, you know, there's an exception to every case. 01:30:20.040 |
And the surveillance that they're deploying on the Uyghurs 01:30:28.140 |
and the surveillance they're starting to export abroad 01:30:32.160 |
with some of the programs like the watering hole attack 01:30:35.540 |
where it's not just hitting the Uyghurs inside China, 01:30:42.100 |
I mean, it could be an American high school student 01:30:51.780 |
really limiting the extent of that surveillance. 01:31:04.060 |
in terms of a test kitchen for its cyber attacks, 01:31:07.420 |
the Uyghurs are China's test kitchen for surveillance. 01:31:23.860 |
I mean, in 2015, Obama and Xi Jinping reached a deal 01:31:34.460 |
"You better cut it out on intellectual property theft." 01:31:40.140 |
that they would not hack each other for commercial benefit. 01:31:45.700 |
we saw this huge drop-off in Chinese cyber attacks 01:31:49.060 |
on American companies, but some of them continued. 01:32:02.900 |
Because that was still considered fair game to China. 01:32:07.420 |
They wanted to know who was staying in this city 01:32:11.820 |
at this time when Chinese citizens were staying there 01:32:15.020 |
so they could cross match for counterintelligence 01:32:36.100 |
to do what I've seen the UAE do to its citizens 01:32:57.340 |
and he told me, "You might find yourself a terrorist, 01:33:24.680 |
that you couldn't criticize the ruling family 01:33:29.620 |
And he's been in solitary confinement every day 01:33:54.920 |
- So there's some lines we're not willing to cross. 01:34:07.880 |
To me personally, this is the stuff of conspiracy theories, 01:34:30.180 |
Or if you have like Jeffrey Epstein conspiracy theories, 01:34:33.620 |
active, what is it, manufacture of embarrassing things, 01:34:38.500 |
and then use blackmail to manipulate the population 01:34:42.900 |
It troubles me deeply that MIT allowed somebody 01:34:51.660 |
that they would hang out with that person at all. 01:35:01.340 |
"Jeffrey Epstein is a front for intelligence." 01:35:04.380 |
And I just, I struggle to see that level of competence 01:35:17.260 |
And I guess I was trying to get to that point. 01:35:23.020 |
which makes some of these things very difficult. 01:35:27.420 |
how much competence there is in these institutions. 01:35:31.660 |
Like how far, this takes us back to the hacking question. 01:35:34.860 |
How far are people willing to go if they have the power? 01:35:53.980 |
And I don't think that's possible in a free society. 01:36:12.940 |
starts to both use fear and direct manipulation 01:36:25.100 |
that you can have somebody like a Jeffrey Epstein 01:36:29.340 |
I don't know what I'm asking you, but I'm just, 01:36:47.680 |
But then again, if they're not, would we know? 01:36:55.160 |
That's why Edward Snowden might be a good thing. 01:37:00.480 |
You have investigated some of the most powerful organizations 01:37:03.240 |
and people in the world of cyber warfare, cybersecurity. 01:37:21.920 |
"Someone's on the dark web offering good money 01:37:25.760 |
"to anyone who can hack your phone or your laptop." 01:37:34.480 |
I came back and I brought a burner laptop with me, 01:37:51.720 |
And then I've had moments where I think I went 01:37:58.920 |
I mean, I remember writing about the Times hack by China 01:38:03.920 |
and I just covered a number of Chinese cyber attacks 01:38:19.960 |
and my cable box on my television started making 01:38:24.680 |
some weird noises in the middle of the night. 01:38:29.720 |
and I think I said something like embarrassing, 01:38:44.640 |
is shining on my cable box, which has now been ripped out 01:38:48.120 |
and is sitting on my floor and the morning light. 01:39:06.160 |
go live off the grid, never have a car with navigation, 01:39:12.520 |
never order diapers off Amazon, create an alias, 01:39:22.280 |
and live in this new digital world we're living in. 01:40:14.040 |
And those were the lengths I was willing to go 01:40:16.480 |
to protect that source, but you can't do it for everyone. 01:40:30.680 |
all the things that we know we're supposed to do. 01:40:36.120 |
Don't go crazy, because then that's the ultimate hack. 01:40:45.200 |
Now, my whole risk model changed when I had a kid. 01:41:24.600 |
who was the Times Moscow Bureau Chief during the Cold War. 01:41:32.040 |
about Chinese cyberespionage, he took me out to lunch. 01:41:35.040 |
And he told me that when he was living in Moscow, 01:41:44.800 |
And they would make a really loud show of it. 01:41:57.620 |
"but they wanted me to know that they were following me. 01:42:08.140 |
"Know that there are probably people following you. 01:42:11.580 |
"Sometimes they'll make a little bit of noise. 01:42:18.540 |
"you have a little bit of an invisible shield on you. 01:42:21.300 |
"You know, if something were to happen to you, 01:42:34.360 |
And that destroyed my vision of my invisible shield. 01:42:53.620 |
pretty much in the open and get away with it, 01:42:58.200 |
and for the United States to let them get away with it 01:43:01.880 |
because we wanted to preserve diplomatic relations 01:43:15.600 |
that it was sort of open season on journalists. 01:43:19.520 |
And to me, that was one of the most destructive things 01:43:23.000 |
that happened under the previous administration. 01:43:30.280 |
what to think of my invisible shield anymore. 01:43:33.160 |
That really worries me on the journalism side 01:44:01.120 |
doing all the crazy shit that I naturally do, 01:44:08.840 |
is doing crazy shit without really thinking about it, 01:44:18.480 |
It's, I mean, whether it's stupidity or fearlessness, 01:44:22.040 |
whatever it is, that's what great journalism is. 01:44:35.000 |
just make sure you don't have anything to hide. 01:44:41.920 |
I'm just, this is like a motivational speech or something. 01:44:58.660 |
Just present yourself in the most authentic way possible, 01:45:03.820 |
meaning be the same person online as you are privately, 01:45:37.900 |
And that works, but sometimes I even get carried, 01:45:49.100 |
- You have to be emotional and stuff and say something. 01:45:51.300 |
- Yeah, I mean, just the cortisol response on Twitter. 01:45:56.300 |
Twitter is basically designed to elicit those responses. 01:46:04.740 |
I look at my phone, I look at what's trending on Twitter, 01:46:10.060 |
that are gonna make people the most angry today? 01:46:22.340 |
that you have to be constantly censoring yourself. 01:46:35.820 |
to that other voice, to creativity, to being weird. 01:46:40.820 |
There's a danger to that little whispered voice 01:46:45.580 |
that's like, well, how would people read that? 01:46:53.540 |
And that stifles creativity and innovation and free thought. 01:47:13.220 |
and why he has said he goes full force on privacy 01:47:32.300 |
I think that self-censorship is an attack vector 01:47:43.740 |
and publicly the same person and be ridiculous. 01:47:46.780 |
Embrace the full weirdness and show it more and more. 01:47:54.020 |
And if there is something you really wanna hide, 01:48:07.420 |
Because I think my hopeful vision for the internet 01:48:25.260 |
You have to go all the way, step over, be weird. 01:48:31.060 |
'cause people can attack you and so on, but just ride it. 01:48:50.080 |
What advice would you give to young folks today 01:49:21.780 |
these amazing scholarship programs, for instance, 01:49:26.580 |
they'll pay your college as long as you commit 01:49:31.940 |
to sort of help federal agencies with cybersecurity. 01:49:39.060 |
They wanna go work at the NSA or Cyber Command. 01:49:46.180 |
It's really hard to get people to work on defense. 01:49:49.900 |
It's just, it's always been more fun to be a pirate 01:50:01.300 |
There's 3.5 million unfilled cybersecurity positions 01:50:21.260 |
because we can't match cybersecurity salaries 01:50:26.740 |
at Palantir or Facebook or Google or Microsoft. 01:50:30.020 |
And so it's really hard for the United States 01:50:38.420 |
where they basically have forced conscription on some level. 01:50:44.380 |
you do whatever you're gonna do during the day, 01:50:53.980 |
and ask you to come do this sensitive operation for us, 01:51:00.820 |
You know, a couple of years ago when Yahoo was hacked 01:51:09.580 |
Cyber criminals were allowed to have their fun, 01:51:12.180 |
but the minute they came across the username and password 01:51:16.740 |
that worked at the White House or the State Department 01:51:19.460 |
or military, they were expected to pass that over to the FSB. 01:51:33.020 |
if you're interested in code, if you're a tinker, 01:51:38.460 |
There are all sorts of amazing hacking competitions 01:51:42.700 |
you can do through the SANS org, for example, S-A-N-S. 01:51:56.300 |
to make people's life, you know, a living prison. 01:52:06.020 |
from hacks where people are trying to come in 01:52:19.700 |
You can tell yourself you're serving your country. 01:52:28.460 |
And if you need to go get that job that pays you, 01:52:30.980 |
you know, two million bucks a year, you can do that too. 01:52:34.860 |
more so of a public profile, you can be a public rockstar. 01:52:38.780 |
I mean, it's the same thing as sort of the military. 01:52:56.140 |
are deeply respected for defending the country, 01:53:05.820 |
the cybersecurity defense are the soldiers of the future. 01:53:14.140 |
oftentimes you see the more interesting threats 01:53:20.740 |
When cyber criminals and nation state adversaries 01:53:25.580 |
they don't go directly for Cyber Command or the NSA. 01:53:32.940 |
they go for Microsoft, they go for critical infrastructure. 01:53:36.660 |
And so those companies, those private sector companies 01:53:48.620 |
and you're calling out the SolarWinds attack, for instance, 01:53:51.580 |
I mean, you just saved God knows how many systems 01:54:17.540 |
of course, referring to cyber war, cybersecurity. 01:54:20.260 |
What gives you hope about the future of our world? 01:54:34.600 |
Because I have a kid and I have another on the way. 01:54:37.420 |
And if I didn't have hope, I wouldn't be having kids. 01:54:56.860 |
What gives me hope is that I share your worldview 01:55:07.620 |
scares me to death, but when I'm reminded of that 01:55:22.620 |
or I go on a hike or like someone smiles at me, 01:55:37.140 |
And my hope is, I just think our current political climate, 01:55:52.460 |
I think baby boomers, it's time to move along. 01:55:56.060 |
I think it's time for a new generation to come in. 01:56:01.140 |
And I actually have a lot of hope when I look at, 01:56:17.620 |
and without social media, but I'm native to it. 01:56:32.100 |
and now I'm feeling it and I know it's not a given. 01:56:34.900 |
We don't have to just resign ourselves to climate change. 01:56:41.240 |
And I think a lot of the problems we face today 01:56:47.660 |
that there's been on so many of these issues. 01:56:54.540 |
And I think this next generation is gonna come in 01:57:01.380 |
We're not just gonna like rape and pillage the earth 01:57:06.540 |
and play dirty tricks and let lobbyists dictate 01:57:16.540 |
- It feels like there's a lot of low hanging fruit 01:57:19.220 |
for young minds to step up and create solutions and lead. 01:57:23.780 |
So whenever like politicians or leaders that are older, 01:57:34.180 |
They're inspiring a large number of young people 01:57:41.900 |
There's going to be, it's almost like you need people 01:57:44.260 |
to act shitty to remind them, oh, wow, we need good leaders. 01:57:47.860 |
We need great creators and builders and entrepreneurs 01:57:51.340 |
and scientists and engineers and journalists. 01:57:54.140 |
You know, all the discussions about how the journalism 01:57:58.660 |
That's just an inspiration for new institutions to rise up 01:58:03.760 |
new journalists to step up and do journalism better. 01:58:08.660 |
when I talk to young people, I'm constantly impressed 01:58:29.700 |
hit on something earlier, which is authenticity. 01:58:33.100 |
Like no one who is plastic is going to rise above anymore. 01:58:43.100 |
You know, the benefit of the internet is it's really hard 01:58:46.500 |
to hide who you are on every single platform. 01:58:49.940 |
You know, on some level it's going to come out 01:58:57.380 |
by the time my kids are grown, like no one's going to care 01:59:09.580 |
My nephew was born the day I graduated from college. 01:59:13.260 |
And I just always, you know, he's like born into Facebook. 01:59:17.740 |
And I just think like, how is a kid like that 01:59:20.980 |
ever going to be president of the United States of America? 01:59:24.020 |
Because if Facebook had been around when I was in college, 01:59:30.980 |
how are those kids going to ever be president? 01:59:34.220 |
There's going to be some photo of them at some point 01:59:43.100 |
Now it's like, no, everyone's going to make mistakes. 01:59:49.380 |
And we're all going to have to come and grow up 01:59:53.020 |
to the view that as humans, we're going to make huge mistakes 01:59:58.140 |
that they're going to ruin the rest of your life. 02:00:00.660 |
But we're going to have to come around to this view 02:00:04.500 |
and we're going to have to be a little bit more forgiving 02:00:07.300 |
and a little bit more tolerant when people mess up. 02:00:10.300 |
And we're going to have to be a little bit more humble 02:00:17.620 |
- Nicole, this was an incredible, hopeful conversation. 02:00:32.340 |
this really difficult subject with your book. 02:00:37.860 |
that you took the risk, that you took that on. 02:00:54.900 |
please check out our sponsors in the description. 02:01:01.780 |
"Here we are, entrusting our entire digital lives 02:01:05.540 |
"passwords, texts, love letters, banking records, 02:01:13.900 |
"whose inner circuitry most of us would never vet. 02:01:22.460 |
Thank you for listening and hope to see you next time.