
Chris Tarbell: FBI Agent Who Took Down Silk Road | Lex Fridman Podcast #340


Chapters

0:00 Introduction
1:16 Silk Road
11:39 Mass surveillance
15:50 Operation Onion Peeler
21:06 Hacker Avunit
31:56 Ross Ulbricht and Silk Road
44:39 Edward Snowden
46:44 NSA surveillance
58:51 Silk Road murders
67:37 Dark web
71:39 Ross Ulbricht's arrest
79:37 Aaron Swartz
82:55 Donald Trump and the Mar-a-Lago raid
86:01 Tech companies and censorship
95:00 War in Ukraine
98:58 Anonymous and LulzSec
109:10 FBI
112:11 Personal threats
117:57 Hector Monsegur a.k.a. Sabu
131:07 Cyber attack threats against civilians
147:55 Most secure operating system
151:44 Cyber war
159:38 Advice for young people
164:50 FBI's credibility
173:21 Love

Whisper Transcript

00:00:00.000 | You could buy literally whatever else you wanted.
00:00:03.160 | You could host things. - Drugs.
00:00:04.080 | - Drugs.
00:00:04.900 | You could buy heroin right from Afghanistan,
00:00:06.640 | the good stuff.
00:00:08.120 | Hacking tools, you could hack for hire.
00:00:09.920 | You could buy murders for hire.
00:00:11.480 | - The following is a conversation with Chris Tarbell,
00:00:16.200 | a former FBI special agent and cyber crime specialist
00:00:19.800 | who tracked down and arrested Ross Ulbricht,
00:00:23.240 | the leader of Silk Road,
00:00:25.000 | the billion dollar drug marketplace.
00:00:27.240 | And he tracked down and arrested Hector Monsegur,
00:00:30.880 | AKA Sabu, of LulzSec and Anonymous,
00:00:34.520 | which are some of the most influential
00:00:36.120 | hacker groups in history.
00:00:38.040 | He is co-founder of Naxo,
00:00:40.240 | a complex cyber crime investigation firm,
00:00:42.960 | and is a co-host of a podcast called
00:00:45.920 | The Hacker and the Fed.
00:00:47.920 | This conversation gives the perspective
00:00:50.160 | of the FBI cyber crime investigator,
00:00:52.860 | both the technical and the human story.
00:00:55.760 | I would also like to interview people on the other side,
00:00:58.960 | the cyber criminals who have been caught,
00:01:01.240 | and perhaps the cyber criminals who have not been caught
00:01:04.800 | and are still out there.
00:01:06.400 | This is the Lex Fridman Podcast.
00:01:09.200 | To support it,
00:01:10.040 | please check out our sponsors in the description.
00:01:12.360 | And now, dear friends, here's Chris Tarbell.
00:01:15.720 | You are one of the most successful
00:01:18.600 | cybersecurity law enforcement agents of all time.
00:01:21.720 | You tracked and brought down Ross Ulbricht,
00:01:25.040 | AKA Dread Pirate Roberts, who ran Silk Road,
00:01:27.600 | and Sabu of LulzSec and Anonymous,
00:01:32.440 | who was one of the most influential hackers in the world.
00:01:35.700 | So first, can you tell me the story
00:01:38.400 | of tracking down Ross Ulbricht and Silk Road?
00:01:41.200 | Let's start from the very beginning.
00:01:42.640 | And maybe let's start by explaining what is the Silk Road.
00:01:45.280 | - It was really the first dark market website.
00:01:49.520 | You literally could buy anything there.
00:01:51.520 | Well, I'll take that back.
00:01:53.600 | There's two things you couldn't buy there.
00:01:55.200 | You couldn't buy guns, because that was a different website,
00:01:58.760 | and you couldn't buy fake degrees.
00:02:00.360 | So no one could become a doctor,
00:02:02.960 | but you could buy literally whatever else you wanted.
00:02:06.280 | You could- - Drugs.
00:02:07.120 | - Host things, drugs.
00:02:07.940 | You could buy heroin right from Afghanistan,
00:02:09.800 | the good stuff.
00:02:11.260 | Hacking tools, you could hack for hire.
00:02:13.080 | You could buy murders for hire
00:02:14.800 | if you wanted someone killed.
00:02:16.120 | Now, so when I was an FBI agent,
00:02:18.780 | I had to kind of sell some of these cases,
00:02:20.440 | and this was a big drug case.
00:02:22.400 | That's the way people saw Silk Road.
00:02:23.920 | So internally to the FBI, how I had to sell it,
00:02:26.920 | I had to find the worst thing on there
00:02:28.820 | that I could possibly find.
00:02:30.360 | And I think one time I saw a posting for baby parts.
00:02:34.400 | So let's say that you had a young child
00:02:36.840 | and that needed a liver.
00:02:38.280 | You could literally go on there and ask
00:02:40.000 | for a six-month-old liver if you wanted to.
00:02:43.040 | - For like surgical operations versus something darker.
00:02:46.440 | - Yeah, I never saw anything that dark
00:02:48.200 | as far as people that wanted to eat body parts.
00:02:51.000 | I did interview a cannibal once when I was in the FBI.
00:02:53.480 | That's another crazy story,
00:02:54.800 | but that one actually weirded me out.
00:02:57.240 | - So I just watched the Jeffrey Dahmer documentary on Netflix,
00:03:00.920 | and it just changed the way I see human beings
00:03:04.040 | because it's a portrayal of a normal-looking person
00:03:08.300 | doing really dark things,
00:03:11.240 | and doing so not out of a place of insanity, seemingly,
00:03:15.240 | but just because he has almost like a fetish
00:03:18.120 | for that kind of thing.
00:03:19.400 | It's disturbing that people like that are out there.
00:03:22.080 | So people like that would then be using Silk Road,
00:03:26.880 | not like that necessarily,
00:03:28.040 | but people of different walks of life
00:03:30.080 | would be using Silk Road to primarily,
00:03:31.960 | what was the primary thing, drugs?
00:03:34.240 | - It was primarily drugs, and that's the way it started.
00:03:35.920 | It started off with Ross Ulbricht growing mushrooms
00:03:38.440 | out in the wilderness of California and selling them.
00:03:41.240 | But really his was more of a libertarian viewpoint.
00:03:43.720 | I mean, it was like you choose
00:03:45.560 | what you wanna do for yourself and do it.
00:03:48.000 | And the way Silk Road kind of had the anonymity
00:03:50.800 | is it used what's called Tor, the Onion Router,
00:03:54.120 | which is an anonymizing function on the deep web.
00:03:59.120 | It was actually invented by the US Navy
00:04:01.440 | back in the mid '90s or so,
00:04:03.840 | but it also used cryptocurrency.
00:04:05.320 | So it was the first time that we saw this birth
00:04:07.560 | on the internet of mixing cryptocurrency
00:04:10.600 | and IP-blocking software.
00:04:13.320 | So in cybercrime, you go after, one, the IP address
00:04:16.720 | and trace it through the network,
00:04:18.240 | or two, you go after the cash,
00:04:19.760 | and this one kind of blocked both.
00:04:21.600 | - Cash meaning the flow of money, physical or digital,
00:04:25.480 | and then IP is some kind of identifying thing
00:04:30.200 | of the computer.
00:04:31.080 | - It's your telephone number on your computer.
00:04:33.040 | So yeah, all computers have a unique four-octet number.
00:04:38.040 | So 123.123.123.123.
00:04:43.760 | And the computer uses DNS or domain name services
00:04:48.120 | to render that name.
00:04:49.440 | So if you were looking for CNN.com,
00:04:52.040 | your computer then translates that to that IP address
00:04:54.400 | or that telephone number
00:04:55.280 | where it can find that information.
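
For illustration: a minimal Python sketch of the hostname-to-IP lookup described above. The domain is just the example from the conversation, and the address printed will vary.

```python
import socket

# Ask DNS to resolve a hostname into its four-octet IPv4 address,
# the "telephone number" the computer actually connects to.
hostname = "cnn.com"  # example domain from the conversation
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")  # e.g. 151.101.x.x
```
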
00:04:56.840 | - Didn't Silk Road used to have guns in the beginning?
00:04:59.200 | Or was that considered to have guns?
00:05:01.720 | Or did it naturally emerge
00:05:04.320 | and then Ross realized like, this is not good?
00:05:07.440 | - It went back and forth.
00:05:08.680 | I think there were guns on there and he tried to police it.
00:05:12.360 | He told himself that they're the captain of the boat,
00:05:15.000 | so you had to follow his rules.
00:05:16.280 | So I think he took off those posts eventually
00:05:19.160 | and moved guns elsewhere.
00:05:21.080 | - What was the system of censorship that he used
00:05:23.960 | of selecting what is okay and not okay?
00:05:27.560 | I mean-- - Him alone.
00:05:28.520 | He's the captain of the boat.
00:05:30.000 | - Do you know by chance if there was a lot of debates
00:05:33.880 | and criticisms internally amongst the criminals
00:05:37.080 | of what is and isn't allowed?
00:05:38.480 | I mean, it's interesting to see
00:05:40.400 | a totally different moral code emerge
00:05:42.800 | that's outside the legal code of society.
00:05:46.560 | - We did get the server and were able to read
00:05:48.680 | all of the chat logs that happened.
00:05:50.920 | I mean, all the records were there.
00:05:52.920 | I don't remember big debates.
00:05:54.240 | I mean, there was a clear leadership
00:05:56.840 | and that was the final decision.
00:05:58.120 | That was the CEO of Silk Road.
00:06:00.400 | - And so primarily it was drugs
00:06:02.160 | and primarily out of an ideology of freedom,
00:06:06.800 | which is if you want to use drugs,
00:06:10.760 | you should be able to use drugs.
00:06:12.500 | - You should put into your body
00:06:13.600 | what you wanna put in your body.
00:06:15.160 | - And when you were presenting the case
00:06:16.920 | of why this should be investigated,
00:06:19.160 | you're trying to find, as you mentioned,
00:06:21.440 | the worst possible things on there.
00:06:23.640 | Is that what you were saying?
00:06:24.680 | - So we had arrested a guy named Jeremy Hammond
00:06:26.720 | and he hid himself.
00:06:27.800 | He was a hacker and when we arrested him,
00:06:30.000 | it was the second time he had been arrested for hacking.
00:06:32.960 | He used Tor.
00:06:34.240 | And so that kind of brought us to a point.
00:06:37.220 | The FBI has a computer system where you look up things.
00:06:41.960 | You look up anything.
00:06:42.800 | I could look up your name or whatever
00:06:44.440 | if you're associated with my case.
00:06:46.480 | And we were finding at the time a lot of things
00:06:48.800 | where you'd look it up, and a case would end.
00:06:51.480 | It'd be like, oh, this is Tor.
00:06:53.120 | It just stopped.
00:06:54.280 | Like we couldn't get any further.
00:06:55.720 | So we had just had this big arrest of Sabu
00:07:00.000 | and took down Anonymous.
00:07:01.160 | And sometimes in the FBI, the way it used to be,
00:07:04.880 | the old school FBI, when you had a big case
00:07:07.520 | and you're working seven days a week
00:07:09.020 | and 14 hours, 15 hours a day, you sort of take a break.
00:07:13.000 | The boss kind of said, yeah, I'll see you in a few months.
00:07:14.800 | Go get to know your family a little bit and come back.
00:07:17.440 | But the group of guys I was with was like,
00:07:20.480 | let's find the next big challenge.
00:07:21.960 | And that's when we were finding case closed, it was Tor.
00:07:24.760 | Case closed, it was Tor.
00:07:25.740 | So we said, let's take a look at Tor
00:07:27.400 | and let's see what we can do.
00:07:28.240 | Maybe we'll take a different approach.
00:07:29.480 | And Silk Road was being looked at by other law enforcement,
00:07:33.000 | but it was taking like a drug approach
00:07:34.440 | where I'm going to find a drug buyer
00:07:37.440 | who got the drug sent to them in the mail
00:07:39.800 | and let's arrest up, let's go up the chain.
00:07:42.120 | But the buyers didn't know their dealers.
00:07:43.800 | They never met them.
00:07:44.800 | - And so you were taking a cyber security approach.
00:07:48.160 | - Yeah, we said, let's try to look at this
00:07:49.920 | from a cyber approach and see if we can
00:07:52.520 | glean anything out of it.
00:07:54.320 | - So I'm actually indirectly connected
00:07:58.600 | to, I'm sure I'm not admitting anything
00:08:01.560 | that's not already on my FBI file.
00:08:03.440 | - Oh, I can already tell you
00:08:04.360 | what you're gonna tell me though.
00:08:05.480 | - What's that?
00:08:06.320 | - That when you were at college, you wrote a paper
00:08:08.400 | and you're connected to the person that started it.
00:08:10.760 | - You son of a bitch.
00:08:12.200 | You clever son of a bitch.
00:08:13.400 | - I'm an FBI agent or a former FBI agent.
00:08:15.760 | How would I not have already known that?
00:08:16.600 | - No, but I could have told you other stuff.
00:08:18.720 | - No, that's exactly what you were about to tell me.
00:08:20.720 | - I was looking up his name 'cause I forgot it.
00:08:22.320 | So one of my advisors for my PhD was Rachel Greenstadt
00:08:26.040 | and she is married to Roger Dingledine,
00:08:29.720 | who is the co-founder of the Tor Project.
00:08:32.120 | And I actually reached out to him last night
00:08:33.900 | to do a podcast together.
00:08:35.480 | I don't know.
00:08:36.320 | (laughing)
00:08:38.580 | No, it was a good party trick.
00:08:43.140 | I mean, it's cool that you know this
00:08:46.840 | and the timing of it, it was just like beautiful.
00:08:50.400 | But just to linger on the Tor Project,
00:08:55.600 | so we understand, so Tor is this black box
00:09:00.520 | that people disappear in,
00:09:01.940 | in terms of like when you were tracking people.
00:09:05.160 | Can you paint a picture of what Tor is used for in general?
00:09:09.240 | Like when you talk about Bitcoin,
00:09:12.000 | for example, cryptocurrency, especially today,
00:09:15.480 | many more people use it for legal activity
00:09:18.120 | versus illegal activity.
00:09:19.600 | What about Tor?
00:09:20.960 | - Tor was originally invented by the US Navy
00:09:23.560 | so that like spies inside countries could talk to spies
00:09:26.240 | and no one could find them.
00:09:28.180 | There was no way of tracing them.
00:09:29.720 | And then they released that information free to the world.
00:09:32.700 | So Tor has two different,
00:09:35.280 | two different ways it can be utilized.
00:09:38.240 | There's .onion sites, which is like a normal website,
00:09:40.800 | a .com, but it's only found within the Tor browser.
00:09:43.200 | You can only get there if you know the whole address
00:09:45.560 | and get there.
00:09:46.400 | The other way Tor is used is to go through the internet
00:09:49.720 | and then come out the other side
00:09:51.200 | if you want a different IP address.
00:09:53.040 | If you're trying to hide your identity.
00:09:54.720 | So if you were doing like, say, cyber crime,
00:09:57.680 | I would have the victim computer
00:09:59.200 | and I would trace it back out to a Tor relay.
00:10:01.800 | And then because you don't have an active connection
00:10:04.720 | or what's called a circuit at the time,
00:10:06.220 | I wouldn't be able to trace it back.
00:10:07.640 | But even if you had an active circuit,
00:10:09.360 | I would have to go to each machine physically live
00:10:11.920 | and try to rebuild that, which is literally impossible.
00:10:15.520 | - So what do you feel about Tor,
00:10:17.280 | ethically, philosophically, as a human being
00:10:21.120 | on this world that spent quite a few years of your life
00:10:25.080 | and still trying to protect people?
00:10:27.680 | - So part of my time in the FBI
00:10:29.440 | was working on child exploitation,
00:10:31.640 | kiddie porn, as they call it.
00:10:33.560 | That really changed my life in a way.
00:10:35.200 | And so anything that helps facilitate
00:10:37.640 | the exploitation of children fucking pisses me off.
00:10:41.120 | And that sort of jaded my opinion towards Tor
00:10:46.120 | because it helps facilitate those sites.
00:10:49.680 | - So this ideal of freedom that Ross Ulbricht,
00:10:53.160 | for example, tried to embody is something
00:10:57.360 | that you don't connect with anymore
00:11:00.920 | because of what you've seen that ideal being used for.
00:11:05.920 | - I mean, the child exploitation
00:11:08.240 | is the specific example for it.
00:11:09.920 | You know, and it's easy for me to sit here
00:11:11.800 | and say child exploitation, child porn,
00:11:13.520 | 'cause no one listening to this is ever gonna say
00:11:15.600 | that I'm wrong and that we should allow child porn.
00:11:19.240 | Should, because some people utilize it in a bad way,
00:11:22.480 | should it go away?
00:11:23.600 | No, I mean, I'm a technologist.
00:11:26.680 | I want technology to move forward.
00:11:28.560 | People are gonna do bad things
00:11:32.080 | and they're going to use technology
00:11:33.600 | to help them do bad things.
00:11:35.720 | - Well, let me ask you then,
00:11:37.120 | oh, we'll jump around a little bit,
00:11:38.840 | but the things you were able to do
00:11:41.560 | in tracking down information, and we'll get to it,
00:11:44.760 | there is some suspicion that this was only possible
00:11:49.720 | with mass surveillance, like with NSA, for example.
00:11:54.720 | First of all, is there any truth to that?
00:11:56.640 | And second of all, what do you feel
00:11:58.960 | are the pros and cons of mass surveillance?
00:12:01.440 | - There is no truth to that.
00:12:04.680 | And then my feelings on mass surveillance--
00:12:07.040 | - If there was, would you tell me?
00:12:08.480 | - Probably not.
00:12:09.320 | - Yeah. (laughs)
00:12:11.200 | I love this conversation so much.
00:12:13.680 | But what do you feel about the,
00:12:16.480 | given that you said child porn,
00:12:18.440 | what are the pros and cons of surveillance
00:12:21.560 | at a society level?
00:12:22.840 | - I mean, nobody wants to give up their privacy.
00:12:26.240 | I say that, I say no one wants to give up their privacy,
00:12:28.720 | but I mean, I used to have to get a search warrant
00:12:30.360 | to look inside your house,
00:12:31.760 | or I can just log onto your Facebook
00:12:33.320 | and you've got pictures of all inside your house
00:12:35.120 | and what's going on.
00:12:36.280 | I mean, it's not, you know,
00:12:37.160 | so people like the idea of not giving up their privacy,
00:12:40.760 | but they do it anyways.
00:12:43.520 | They're giving away their freedoms all the time.
00:12:44.960 | They're carrying watches that give out their heartbeat
00:12:47.960 | to a host of companies that are storing that.
00:12:49.680 | I mean, what's more personal than your heartbeat?
00:12:52.480 | - So I think people on mass
00:12:55.680 | really want to protect their privacy.
00:12:57.480 | And I would say most people don't really need
00:13:00.280 | to protect their privacy.
00:13:01.640 | But the case against mass surveillance
00:13:03.720 | is that if you want to criticize the government
00:13:08.120 | in a very difficult time,
00:13:10.200 | you should be able to do it.
00:13:12.400 | So when you need the freedom, you should have it.
00:13:15.880 | So when you wake up one day and realize
00:13:17.960 | there's something going wrong with the country I love,
00:13:21.280 | I want to be able to help.
00:13:23.760 | And one of the great things about
00:13:25.920 | the United States of America
00:13:28.120 | is there's that individual revolutionary spirit,
00:13:31.480 | like so that the government doesn't become too powerful.
00:13:35.440 | You can always protest.
00:13:37.200 | There's always the best of the ideal of freedom of speech.
00:13:40.760 | You can always say, "Fuck you," to the man.
00:13:43.200 | And I think there's a concern of direct
00:13:46.760 | or indirect suppression of that through mass surveillance.
00:13:49.920 | You might not notice it, that little subtle fear
00:13:54.200 | that grows with time,
00:13:56.000 | that why bother criticizing the government?
00:13:59.560 | It's gonna be a headache.
00:14:00.440 | I'm gonna get a ticket every time I say something bad,
00:14:03.360 | that kind of thing.
00:14:04.280 | So it can get out of hand.
00:14:05.840 | The bureaucracy grows and the freedoms slip away.
00:14:09.760 | That's the criticism.
00:14:11.560 | - I completely see your point and I agree with it.
00:14:15.040 | But on the other side,
00:14:16.600 | people criticize the government over these freedoms,
00:14:19.040 | but tech companies talk about destroying your privacy
00:14:21.600 | and controlling what you can say.
00:14:23.480 | I realize they're private platforms
00:14:25.160 | and they can decide what's on their platform,
00:14:28.560 | but they're taking away your freedoms of what you can say.
00:14:30.800 | And we've heard some things
00:14:32.600 | where maybe government officials were in line
00:14:35.360 | with tech companies to take away some of that freedom.
00:14:38.480 | And I agree with you.
00:14:39.680 | That gets scary.
00:14:40.560 | - Yeah, there's something about government that feels,
00:14:43.560 | maybe because of the history of human civilization,
00:14:47.240 | maybe because tech companies are a new thing,
00:14:50.040 | but just knowing the history of abuses of government,
00:14:53.400 | there's something about government
00:14:57.160 | that enables the corrupting nature of power
00:14:59.920 | to take hold at scale more than tech companies,
00:15:02.600 | at least what we've seen so far.
00:15:04.200 | - I agree, I agree.
00:15:06.720 | But I mean, we haven't had a voice like we've had
00:15:09.640 | until recently.
00:15:10.520 | I mean, anyone that has a Twitter account now can speak
00:15:13.480 | and become a news article.
00:15:15.160 | My parents didn't have that voice.
00:15:18.720 | If they wanted to speak out against the government
00:15:21.360 | or do something, they had to go to a protest
00:15:22.880 | or organize a protest or do something along those lines.
00:15:26.160 | So we have more of a place to put our voice out now.
00:15:30.240 | - Yeah, it's incredible, but that's why it hurts.
00:15:32.560 | And that's why you notice it
00:15:34.120 | when certain voices get removed.
00:15:36.240 | The president of the United States of America
00:15:39.840 | was removed from one such or all such platforms.
00:15:43.320 | And that hurts.
00:15:45.400 | - Yeah, that's crazy to me.
00:15:46.520 | That's insane.
00:15:47.720 | That's insane that we took that away.
00:15:49.680 | - But let's return to Silk Road and Ross Ulbricht.
00:15:54.680 | So how did your path with this very difficult,
00:15:58.680 | very fascinating case cross?
00:16:02.680 | - We were looking to open a case against Tor
00:16:05.600 | because it was a problem.
00:16:06.440 | All the cases were closing because of Tor.
00:16:08.480 | So we went on Tor and we came up
00:16:11.840 | with 26 different onion, dot onions that we targeted.
00:16:16.760 | We were looking for nexuses to hacking
00:16:19.600 | 'cause I was on a squad called CY2
00:16:21.720 | and we were like the premier squad in New York
00:16:25.400 | that was working criminal cyber intrusions.
00:16:29.320 | And so, any website that was offering hackers for hire
00:16:33.400 | or hacking tools for free or paid services,
00:16:38.400 | now we're seeing ransomware as a paid service
00:16:42.800 | and phishing as a paid service,
00:16:44.920 | anything that offered that.
00:16:45.840 | So we opened this case on, I think we called it,
00:16:49.200 | so you have to name cases.
00:16:50.160 | One of the fun things in the FBI is when you start a case,
00:16:52.160 | you get to name it.
00:16:53.800 | You would not believe how much time is spent
00:16:55.880 | in coming up with the name
00:16:57.240 | a case goes by.
00:16:59.120 | I think we called this Onion Peeler because of the, yeah.
00:17:02.240 | - So a little bit of humor, a little bit of wit
00:17:05.440 | and some profundity to the language, yeah, yeah.
00:17:08.200 | - Yeah. - 'Cause you're gonna have
00:17:09.040 | to work with this for quite a lot, so.
00:17:11.040 | - Yeah, this one had the potential of being a big one
00:17:13.480 | because I think Silk Road was like the sixth on the list
00:17:16.400 | for that case, but we all knew
00:17:18.640 | that was sort of the golden ring.
00:17:20.080 | If you could make the splash
00:17:21.920 | that that onion site was going down,
00:17:24.160 | then it would probably get some publicity.
00:17:25.880 | And that's part of law enforcement
00:17:27.720 | is getting some publicity out of it
00:17:30.640 | that makes others think not to do it.
00:17:33.720 | - I should say that Tor is the name of the project,
00:17:36.960 | the browser.
00:17:38.280 | What is the onion technology behind Tor?
00:17:40.600 | - Let's say you wanna go to a .onion site.
00:17:42.960 | You'll put in the .onion you wanna go to
00:17:45.400 | and your computer will build communications
00:17:47.800 | with a Tor relay, which are all publicly available out there.
00:17:51.000 | But you'll encrypt it.
00:17:53.320 | You'll put a package around your data.
00:17:56.040 | And so it's encrypted and so they can't read it.
00:17:59.120 | It goes to that first relay.
00:18:01.480 | That first relay knows about you
00:18:03.880 | and then knows about the next relay down the chain.
00:18:06.400 | And so it takes your data
00:18:07.880 | and then encrypts that on the outside
00:18:09.400 | and sends it to the relay number two.
00:18:11.400 | Now, relay number two only knows about relay number one.
00:18:14.240 | It doesn't know who you are asking for this.
00:18:16.360 | And it goes through there, adding those layers on top,
00:18:19.000 | layers of encryption till it gets to where it is.
00:18:21.480 | And then even the onion service doesn't know,
00:18:23.640 | except for the relay it came from, who it's talking to.
00:18:27.320 | And so it peels back that, gives the information,
00:18:29.720 | puts another layer back on.
00:18:31.280 | And so it's layers, like you're peeling an onion back
00:18:34.720 | of the different relays and that encryption protects
00:18:38.920 | who the sender is and what information they're sending.
00:18:41.360 | - The more layers there are,
00:18:42.360 | the more exponentially difficult it is to decrypt it.
00:18:47.000 | - I mean, you get to a place
00:18:48.280 | where you don't have to have so many layers
00:18:50.200 | because it doesn't matter anymore.
00:18:52.080 | It's mathematically impossible to decrypt it.
00:18:54.320 | But the more relays you have, the slower it is.
00:18:58.240 | I mean, that's one of the big drawbacks on Tor
00:19:00.640 | is how slow it operates.
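
To make the layering concrete, here is a toy sketch of onion-style encryption in Python, not the real Tor protocol: the sender wraps the payload once per relay, and each relay peels exactly one layer, learning only its neighbors. It assumes the third-party `cryptography` package.

```python
from cryptography.fernet import Fernet

relay_keys = [Fernet.generate_key() for _ in range(3)]  # one key per relay

def wrap(payload: bytes, keys: list) -> bytes:
    # Encrypt innermost-first so the first relay's layer ends up outermost.
    for key in reversed(keys):
        payload = Fernet(key).encrypt(payload)
    return payload

def peel(onion: bytes, key: bytes) -> bytes:
    # Each relay strips only the layer encrypted with its own key.
    return Fernet(key).decrypt(onion)

onion = wrap(b"GET /index.html", relay_keys)
for key in relay_keys:  # relays peel the layers in path order
    onion = peel(onion, key)
print(onion)  # b'GET /index.html'
```
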
00:19:03.200 | - So how do you peel the onion?
00:19:04.940 | So what are the different methodologies
00:19:07.760 | for trying to get some information
00:19:10.880 | from a cybersecurity perspective
00:19:12.840 | on these operations like the Silk Road?
00:19:15.380 | - It's very difficult.
00:19:17.120 | People have come up with different techniques.
00:19:19.760 | There've been techniques put out in the news media
00:19:22.760 | about how they do it, running massive amounts of relays
00:19:26.920 | and you're controlling those relays.
00:19:28.480 | I think somebody tried that once.
00:19:30.360 | - So there's a technical solution.
00:19:31.640 | And what about social engineering?
00:19:34.560 | What about trying to infiltrate the actual humans
00:19:39.560 | that are using the Silk Road and trying to get in that way?
00:19:44.140 | - Yeah, I mean, I definitely could see the way of doing that
00:19:47.760 | and in this case, in our takedown, we used that.
00:19:51.880 | There was one of my partners, Jared Der-Yeghiayan,
00:19:54.320 | he was an HSI investigator and he had worked his way up
00:19:57.320 | to be a system admin on the site.
00:19:59.420 | So that did glean quite a bit of information
00:20:02.640 | because he was inside and talking to, at that time,
00:20:07.120 | we only knew it as DPR or Dread Pirate Roberts.
00:20:09.840 | We didn't know who that was yet,
00:20:11.560 | but we had that open communication.
00:20:13.640 | And one of the things, the technical aspects on that
00:20:18.180 | is there was a Jabber server.
00:20:20.600 | There was, that's a type of communication server
00:20:24.120 | that was being used and we knew that Ross
00:20:28.360 | had his Jabber set to Pacific time.
00:20:32.000 | So we had a pretty good idea what part of the country
00:20:37.000 | he was in.
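
A hypothetical sketch of that kind of timezone clue; the timestamp below is invented, not from the actual Jabber server. A client that stamps its messages with a local UTC offset narrows down where its owner probably lives.

```python
from datetime import datetime, timedelta

client_stamp = "2013-06-01T14:30:00-07:00"  # invented example timestamp
offset = datetime.fromisoformat(client_stamp).utcoffset()

# UTC-7 corresponds to Pacific Daylight Time, so the sender is
# plausibly on the US West Coast (assuming an honestly set clock).
if offset == timedelta(hours=-7):
    print("Offset matches US Pacific time")
```
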
00:20:38.520 | - I mean, isn't that, from DPR's perspective,
00:20:42.160 | from Ross's perspective, isn't that clumsy?
00:20:45.280 | - He wasn't a big computer guy.
00:20:47.680 | - Do you notice that aspect of the technical savvy
00:20:50.200 | of some of these guys doesn't seem to be quite,
00:20:52.920 | why weren't they good at this?
00:20:55.120 | - Well, the real techie savvy ones, we don't arrest.
00:20:57.760 | We don't get to 'em, we don't find 'em.
00:20:59.000 | - We don't get to them.
00:21:00.840 | Shout out to the techie criminals.
00:21:04.920 | They're probably watching this.
00:21:06.880 | - I mean, yeah, I mean, we're getting the low-hanging fruit.
00:21:08.880 | I mean, we're getting the ones that can be caught.
00:21:10.660 | I mean, I'm sure we'll talk about it,
00:21:13.120 | but the Anonymous case, there was a guy named AV Unit.
00:21:15.200 | He's still, I lose sleep over him
00:21:16.920 | 'cause we didn't catch him.
00:21:18.080 | We caught everybody else, we didn't catch him.
00:21:20.960 | He's good, though.
00:21:22.680 | He pops up, too, once in a while on the internet,
00:21:24.560 | and it pisses me off.
00:21:25.520 | - Yeah, what's his name again?
00:21:26.720 | - AV Unit, that's all I know is AV Unit.
00:21:29.800 | - AV Unit.
00:21:30.640 | - Yeah, I got a funny story about him
00:21:32.280 | and who people think he is.
00:21:34.280 | - Can I actually, can we go on that brief tangent?
00:21:36.520 | - Sure, I love tangents.
00:21:37.800 | - Well, let me ask you, since he's probably he or she,
00:21:43.680 | do we know it's a he?
00:21:44.880 | - We have no idea.
00:21:46.000 | - Okay.
00:21:46.840 | - Another funny story about hackers, the he/she issue.
00:21:49.920 | - What's the funny story there?
00:21:51.080 | - Well, one of the guys in LulzSec was a she,
00:21:53.720 | was a 17-year-old girl.
00:21:55.920 | And my source in the case, the guy, Sabu,
00:21:59.240 | that I arrested and part of,
00:22:01.440 | we sat side by side for nine months
00:22:03.720 | and then took down the case and all that.
00:22:06.840 | He was convinced she was a girl,
00:22:08.920 | and he was in love with her almost at one point.
00:22:12.720 | It turns out to be a 35-year-old guy living in England.
00:22:16.080 | - Oh, so he was convinced it was a...
00:22:18.920 | - Yes, he was absolutely convinced.
00:22:20.400 | - Based on what exactly?
00:22:21.560 | By a linguistic, human-based linguistic analysis or what?
00:22:25.520 | - She, he, whatever, Kayla,
00:22:28.840 | so it ended up being a modification of his sister's name,
00:22:32.520 | the real guy's sister's name,
00:22:34.720 | was so good at building the backstory.
00:22:37.560 | All these guys, and it's funny,
00:22:39.360 | these guys are part of a hacking crew.
00:22:40.600 | They social engineer the shit out of each other
00:22:43.440 | just to build, if one of them ever gets caught,
00:22:45.960 | they'll convince everybody else
00:22:47.400 | that they're a Brazilian ISP owner or something like that,
00:22:52.200 | and that's how I'm so powerful.
00:22:53.680 | - Well, yeah, that social engineering aspect
00:22:55.480 | is part of living a life of cyber crime or cybersecurity
00:22:59.000 | on the offensive or defensive.
00:23:00.640 | So AV Unit, can I ask you also just
00:23:03.600 | a tangent of a tangent first?
00:23:05.600 | - That's my favorite tangent.
00:23:06.920 | - Okay.
00:23:07.760 | Is it possible for me to have a podcast conversation
00:23:14.560 | with somebody who hasn't been caught yet,
00:23:16.920 | and because they have the conversation,
00:23:20.200 | they still won't be caught?
00:23:22.200 | And is that a good idea?
00:23:24.360 | Meaning, is there a safe way for a criminal
00:23:26.320 | to talk to me on a podcast?
00:23:27.960 | - I would think so.
00:23:31.400 | I would think that someone could,
00:23:33.600 | I mean, someone who has been living a double life
00:23:36.440 | for long enough, where you think they're not a criminal.
00:23:39.520 | - No, no, no, they would have to admit
00:23:42.760 | that they would say, I am AV Unit.
00:23:45.080 | - Oh, you would wanna have a conversation with AV Unit?
00:23:47.160 | - Yes.
00:23:48.000 | I'm just speaking from an FBI perspective,
00:23:52.400 | technically speaking, 'cause I,
00:23:54.520 | so let me explain my motivation.
00:23:57.280 | I think I would like to be able to talk
00:24:02.280 | to people from all walks of life
00:24:04.880 | and understanding criminals,
00:24:07.640 | understanding their mind, I think is very important.
00:24:12.600 | And I think there's fundamentally something different
00:24:14.840 | between a criminal who's still active
00:24:17.040 | versus one that's been caught.
00:24:18.760 | The mind, just from observing it,
00:24:21.000 | changes completely once you're caught.
00:24:24.320 | You have a big shift in your understanding of the world.
00:24:27.680 | I mean, I do have a question about the ethics
00:24:31.320 | of having such conversations, but first, technically,
00:24:34.000 | is it possible?
00:24:36.760 | - If I was technically advising you,
00:24:39.760 | I would say, first off, don't advertise it.
00:24:42.080 | The fewer people that you're gonna tell
00:24:44.040 | that you're having this conversation with, the better.
00:24:46.720 | And yeah, you could, are you doing it in person?
00:24:50.000 | Are you doing it--
00:24:50.840 | - In person would be amazing, yeah,
00:24:52.120 | but their face would not be shown.
00:24:54.000 | - Face would not be shown, yeah.
00:24:55.440 | I mean, you couldn't publish the show for a while.
00:24:57.800 | They'd have to put a lot of trust in you
00:24:59.200 | that you are not going to,
00:25:01.040 | you're gonna have to alter those tapes.
00:25:03.380 | I say tapes 'cause it's old school, the opt-out, you know.
00:25:07.200 | - It's a tape.
00:25:08.040 | - Exactly, I'm sure a lot of people just said that,
00:25:09.920 | like, oh shit, this old guy just said tape.
00:25:11.800 | - I heard it, VHS was in the 1800s, I think.
00:25:14.560 | - But yeah, yeah, you could do it.
00:25:18.600 | They'd have to have complete faith and trust in you
00:25:21.080 | that you destroy the originals after you've altered it.
00:25:24.560 | - What about if they don't have faith?
00:25:25.920 | Is there a way for them to attain security?
00:25:28.960 | So like for me to go through some kind of process
00:25:34.080 | where I meet them somewhere where--
00:25:36.080 | - I mean, you're not gonna do it
00:25:37.040 | without a bag over your head.
00:25:38.400 | I don't know if that's the life you wanna live.
00:25:40.280 | - I'm fine with a bag over my head.
00:25:42.320 | That's gonna get taken out of context.
00:25:44.920 | But I just, I think it's a worthy effort.
00:25:47.160 | It's worthy to go through the hardship of that
00:25:50.440 | to understand the mind of somebody.
00:25:52.400 | I think fundamentally conversations are a different thing
00:25:56.840 | than the operation of law enforcement.
00:25:58.840 | Understanding the mind of a criminal,
00:26:00.160 | I think, is really important.
00:26:01.680 | - I don't know if you're gonna have
00:26:03.160 | the honest conversation that you're looking for.
00:26:05.520 | I mean, it may sound honest, but it may not be the truth.
00:26:08.320 | I found most times when I was talking to criminals,
00:26:10.260 | it's lies mixed with half-truths.
00:26:12.680 | And you kinda, if they're good,
00:26:15.400 | they can keep that story going for long enough.
00:26:18.080 | If they're not, you kind of see the relief in them
00:26:21.120 | when you finally break that wall down.
00:26:24.120 | - That's the job of an interviewer.
00:26:26.720 | If the interviewer is good, then perhaps not directly,
00:26:30.600 | but through the gaps, seeps out the truth
00:26:34.600 | of the human being.
00:26:35.980 | So not necessarily the details
00:26:37.480 | of how they do the operations and so on,
00:26:39.800 | but just who they are as a human being,
00:26:41.320 | what their motivations are, what their ethics are,
00:26:44.260 | how they see the world, what is good, what is evil,
00:26:46.440 | do they see themselves as good?
00:26:47.960 | What do they see their motivation as?
00:26:50.360 | Do they have resentment?
00:26:51.920 | What do they think about love for the people
00:26:54.200 | within their small community?
00:26:55.920 | Do they have resentment for the government
00:26:57.740 | or for other nations or for other people?
00:27:00.160 | Do they have childhood issues that led
00:27:02.160 | to a different view of the world than others perhaps have?
00:27:05.800 | Do they have certain fetishes, like sexual and otherwise,
00:27:09.060 | that led to the construction of their world?
00:27:12.120 | They might be able to reveal some deep flaws
00:27:15.960 | to the cybersecurity infrastructure of our world,
00:27:20.120 | not in detail, but like philosophically speaking.
00:27:24.240 | They might have, I know you might say it's just a narrative,
00:27:29.240 | but they might have a kind of ethical concern
00:27:33.480 | for the well-being of the world,
00:27:35.240 | that they're essentially attacking the weakness
00:27:38.800 | of the cybersecurity infrastructure
00:27:40.440 | because they believe ultimately
00:27:42.380 | that would lead to a safer world.
00:27:44.900 | So the attacks will reveal the weaknesses.
00:27:47.940 | And if they're stealing a bunch of money, that's okay,
00:27:51.060 | because that's gonna force you to invest a lot more money
00:27:53.780 | in defending, yeah, defending things that actually matter,
00:27:58.380 | you know, nuclear warheads and all those kinds of things.
00:28:01.340 | I mean, I could see, you know, it's fascinating
00:28:05.100 | to explore the mind of a human being like that
00:28:07.220 | because I think it will help people understand.
00:28:11.740 | Now, of course, it's still a person
00:28:16.740 | that's creating a lot of suffering in the world,
00:28:19.660 | which is a problem.
00:28:20.820 | So do you think ethically it's a good thing to do?
00:28:23.860 | - I don't, I mean, I feel like I have a fairly high
00:28:28.000 | ethical bar that I have to hold myself to,
00:28:30.840 | and I don't think I have a problem with it.
00:28:32.460 | I would love to listen to it.
00:28:34.340 | - Okay, great.
00:28:36.220 | - I mean, not that I'm your ethical coach or anything.
00:28:39.060 | - Well, that's interesting, I mean,
00:28:40.340 | so 'cause I thought you would have become jaded
00:28:44.420 | and exhausted by the criminal mind.
00:28:49.420 | - It's funny, you know, fast forward in our story,
00:28:56.580 | I'm very good friends with Hector Monsegur,
00:28:58.900 | Sabu, the guy I arrested,
00:29:00.340 | and he tells stories of what he did in his past,
00:29:04.540 | and I'm like, oh, that Hector, you know?
00:29:06.820 | But then I listened to your episode with Brett Johnson,
00:29:11.220 | and I was like, ah, this guy's stealing money
00:29:14.500 | from the US government and welfare fraud
00:29:17.300 | and all this sort of thing, it just pissed me off.
00:29:19.340 | And I don't know why I have that differentiation in my head.
00:29:24.060 | I don't know why I think one's just,
00:29:26.900 | oh, Hector will be Hector,
00:29:28.140 | and then this guy just pissed me off.
00:29:30.140 | - Well, you didn't feel that way about Hector
00:29:31.940 | until you probably met him.
00:29:33.580 | - Well, I didn't know Hector, I knew Sabu.
00:29:36.340 | So I hunted down Sabu,
00:29:38.380 | and I learned about Hector over those nine months.
00:29:41.340 | - We'll talk about a little, let's finish with,
00:29:43.940 | let's return tangent to back to tangent.
00:29:46.020 | Oh, one tangent up, who's AV Unit?
00:29:50.060 | - I don't know. - That's interesting.
00:29:51.220 | So he's at the core of Anonymous,
00:29:54.780 | he's one of the critical people in Anonymous.
00:29:56.620 | What is known about him?
00:29:57.940 | - There's what's known in public and what was known
00:29:59.900 | because I sat with Hector,
00:30:01.340 | and he was sort of like the set things up guy.
00:30:05.900 | So if, LulzSec had like their hackers,
00:30:09.380 | which was Sabu and Kayla,
00:30:11.060 | and they had their media guy, this guy Topiary,
00:30:15.900 | he lived up in the Northern end of England.
00:30:18.140 | And they had a few other guys,
00:30:20.100 | but AV Unit was the guy that set up infrastructure.
00:30:22.860 | So if you need a VPN in Brazil
00:30:24.460 | or something like that to pop through.
00:30:26.360 | One of the first things Hector told me
00:30:29.580 | after we arrested him is that AV Unit
00:30:31.500 | was a secret service agent.
00:30:33.620 | And I was like, oh shit.
00:30:34.860 | Just because he kind of lived that lifestyle.
00:30:38.740 | He'd be around for a bunch of days
00:30:40.500 | and then all of a sudden gone for three weeks.
00:30:42.380 | And I tried to get more out of Hector
00:30:44.660 | and that early on in that relationship,
00:30:47.020 | I'm sure he was a little bit guarded,
00:30:49.740 | maybe trying to social engineer me.
00:30:51.260 | Maybe he wanted that, oh shit,
00:30:53.580 | there's law enforcement involved in this.
00:30:56.860 | And not to say, I mean, I was in over my head
00:31:00.220 | with that case, just the amount of work that was going on.
00:31:02.940 | So to track them all down,
00:31:04.780 | plus the 350 hacks that came in
00:31:07.940 | about just military institutions,
00:31:09.900 | it was swimming in the deep end.
00:31:13.100 | So it was just at the end of the case,
00:31:14.840 | I looked back and I was like,
00:31:15.980 | oh fuck, AV Unit, I could have had them all.
00:31:18.660 | Maybe that's the perfectionist in me.
00:31:21.700 | - Oh man, well, reach out somehow.
00:31:24.700 | I can't, I won't say how, right?
00:31:26.180 | We'll have to figure out.
00:31:27.140 | - Would you have him on?
00:31:28.100 | - Yeah.
00:31:28.940 | - Oh my God, just let me know.
00:31:30.380 | - And just talk shit about you the whole time.
00:31:32.220 | - That's perfect.
00:31:33.140 | He probably doesn't even care about me.
00:31:34.740 | - Well, now he will.
00:31:36.340 | Because there's a certain pleasure
00:31:38.140 | of a guy who's extremely good at his job,
00:31:41.300 | not catching another guy who's extremely good at his job.
00:31:44.620 | - Obviously better, he got away.
00:31:46.780 | - There you go, he's still eating at you, I love it.
00:31:49.540 | He or she.
00:31:50.380 | - If I can meet that guy one day, he or she,
00:31:52.780 | that'd be great.
00:31:53.620 | I mean, I have no power.
00:31:55.540 | - So yes, Silk Road, can you speak to the scale
00:31:58.620 | of this thing?
00:31:59.460 | What, just for people who are not familiar,
00:32:02.740 | how big was it?
00:32:04.340 | And any other interesting things you understand
00:32:08.300 | about its operation when it was active?
00:32:10.980 | - So it was when we finally got looking through the books
00:32:14.780 | and the numbers came out, it was about $1.2 billion
00:32:19.500 | in sales.
00:32:20.460 | It's kind of hard with the fluctuation value
00:32:22.220 | of Bitcoin at the time to come up with a real number.
00:32:24.500 | So you kind of pick a daily average and go across.
00:32:27.420 | - Most of the operation was done in Bitcoin.
00:32:29.540 | - It was all done in Bitcoin.
00:32:30.700 | You couldn't, you had escrow accounts on,
00:32:33.700 | you came in and you put money in an escrow account
00:32:37.060 | and the transaction wasn't done until the client
00:32:40.900 | got the drugs or whatever they had bought.
00:32:43.460 | And then the drug dealers had sent it in.
00:32:46.580 | There was some talk at the time that the cartel
00:32:49.140 | was starting to sell on there.
00:32:50.980 | So that started getting a little hairy there at the end.
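
A toy model of the escrow flow described here, purely illustrative and not Silk Road's actual code: the market holds the buyer's coins until delivery is confirmed, then releases them to the seller.

```python
class Escrow:
    def __init__(self, buyer: str, seller: str, amount_btc: float):
        self.buyer, self.seller, self.amount_btc = buyer, seller, amount_btc
        self.state = "FUNDED"  # buyer has deposited coins with the market

    def confirm_delivery(self):
        assert self.state == "FUNDED"
        self.state = "RELEASED"  # coins go to the seller
        return self.seller, self.amount_btc

    def dispute(self):
        assert self.state == "FUNDED"
        self.state = "REFUNDED"  # coins go back to the buyer
        return self.buyer, self.amount_btc
```
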
00:32:53.260 | - What was the understanding of the relationship
00:32:54.940 | between organized crime like the cartels
00:32:58.020 | and this kind of more ad hoc new age market
00:33:03.020 | that is the Silk Road?
00:33:06.820 | - I mean, it was all just chatter.
00:33:08.020 | It was just, 'cause like I said, Jared was in the inside.
00:33:10.900 | So we saw some of it from the admin sides
00:33:13.540 | and Ross had a lot of private conversations
00:33:15.540 | with the different people that he advised him,
00:33:18.220 | but no one knew each other.
00:33:20.300 | And I mean, the only thing that they knew
00:33:23.780 | were the admins had to send an ID to Ross,
00:33:27.900 | had to send a picture of their driver's license or passport,
00:33:30.700 | which I always found very strange
00:33:32.180 | because if you are an admin on a site that sells fake IDs,
00:33:36.400 | why would you send your real ID?
00:33:38.260 | And then why would the guy running the site
00:33:40.060 | who profits from selling fake IDs believe that it was?
00:33:45.060 | But fast forward, they were all real IDs.
00:33:48.620 | All the IDs that we found on Ross's computer as the admins
00:33:51.020 | were the real people's IDs.
00:33:52.620 | - What do you make of that?
00:33:53.460 | Just other clumsiness?
00:33:55.220 | - Yeah, low-hanging fruit, I guess.
00:33:56.820 | I guess that's what it is.
00:33:57.740 | I mean, I would have bought,
00:33:59.460 | I mean, even Ross bought fake IDs off the site.
00:34:02.780 | He had federal agents knock on his door.
00:34:04.780 | You know, and then he got a little cocky about it.
00:34:08.100 | - The landscape, the dynamics of trust is fascinating here.
00:34:11.620 | So you trust certain ideas are,
00:34:13.740 | like who do you trust in that kind of market?
00:34:15.740 | What was your understanding of the network of trust?
00:34:19.180 | - I don't think anyone trusts anybody, you know?
00:34:20.980 | I mean, I think Ross had his advisors of trust,
00:34:22.940 | but outside of that, I mean,
00:34:24.980 | he required people to send their ID for their trust.
00:34:28.980 | People stole from him.
00:34:31.340 | There's open cases of that.
00:34:33.220 | It's a criminal world.
00:34:36.540 | You can't trust anybody.
00:34:37.820 | - What was his life like, you think?
00:34:40.700 | - Lonely.
00:34:42.460 | Can you imagine being trapped in something like that
00:34:44.980 | where the whole world focused on that
00:34:47.380 | and you can't tell people what you do all day?
00:34:49.940 | - Could he have walked away?
00:34:51.340 | - Like someone else take over or the site just shut down?
00:34:56.140 | - Either one.
00:34:56.980 | Just you putting yourself in his shoes,
00:34:59.460 | the loneliness, the anxiety,
00:35:02.100 | the just the growing immensity of it.
00:35:05.340 | So walk away with some kind of financial stability.
00:35:08.340 | - I couldn't have made it past two days.
00:35:10.380 | I don't like loneliness.
00:35:12.860 | I mean, if my wife's away,
00:35:15.300 | I'd probably call her 10, 12 times a day.
00:35:17.980 | We just talk about things.
00:35:19.020 | You know, something crossed my mind.
00:35:20.660 | I want to talk about it.
00:35:21.500 | And I'm sure she--
00:35:22.940 | - And you'd like to talk to her honestly about everything.
00:35:25.620 | So if you were running Silk Road,
00:35:27.420 | you wouldn't be able to like--
00:35:31.180 | - Hopefully I'd have a little protection.
00:35:32.340 | I'd only mention it to her when we were in bed
00:35:35.100 | to have that marital connection.
00:35:37.780 | But who knows?
00:35:38.620 | I mean, she's gonna question why the Ferrari is outside
00:35:41.420 | and things like that.
00:35:42.940 | - Yeah.
00:35:43.780 | Well, I'm sure you can come up with something.
00:35:46.420 | Why didn't he walk away?
00:35:47.860 | It's another question of why don't criminals walk away
00:35:50.500 | in these situations?
00:35:51.900 | - Well, I mean, I don't know every criminal mind
00:35:53.780 | and some do.
00:35:54.620 | I mean, AV Unit walked away.
00:35:55.900 | I mean, not to go back to that son of a bitch, but--
00:35:58.620 | (laughing)
00:35:59.820 | - There's a theme to this.
00:36:00.780 | - But you know, Ross started counting his dollars.
00:36:04.340 | I mean, he really kept track of how much money
00:36:06.300 | he was making and it started growing exponentially.
00:36:09.500 | I mean, if he would have stayed at it,
00:36:12.060 | he would have probably been one of the richest people
00:36:13.820 | in the world.
00:36:14.940 | - And do you think he liked the actual money
00:36:17.820 | or the fact of the number growing?
00:36:20.220 | - I mean, have you ever held a Bitcoin?
00:36:22.060 | - Yeah.
00:36:22.900 | - Oh, you have?
00:36:23.720 | Well, he never did.
00:36:24.560 | - What do you mean held a Bitcoin?
00:36:25.380 | - You can't hold it.
00:36:26.220 | It's not real.
00:36:27.060 | It's not like I can give you a briefcase of Bitcoin
00:36:29.220 | or something like that.
00:36:31.140 | He liked the idea of it growing.
00:36:32.620 | He liked the idea.
00:36:33.540 | I mean, I think it started off as sharing this idea,
00:36:35.680 | but then he really did turn to,
00:36:37.300 | like I am the captain of this ship
00:36:39.420 | and that's what goes and he was making a lot of money.
00:36:42.840 | And again, my interaction with Ross was about
00:36:46.940 | maybe five or six hours over a two day period.
00:36:52.360 | I knew DPR 'cause I read his words and all that.
00:36:56.100 | I didn't really know Ross.
00:36:57.760 | There was a journal found on his computer
00:37:00.900 | and so it sort of kind of gave me a little insight.
00:37:03.940 | So I don't like to do a playbook for criminals,
00:37:06.500 | but I'll tell you right now, don't write things down.
00:37:09.860 | There was a big fad about people,
00:37:11.460 | like remember kids going around
00:37:12.540 | shooting people with paintballs and filming it?
00:37:14.900 | I don't know why you would do that.
00:37:16.140 | Why would you videotape yourself committing crime
00:37:18.460 | and then publish it?
00:37:19.300 | Like if there's one thing I've taught my children,
00:37:21.960 | don't record yourself doing bad things.
00:37:23.740 | It never goes well.
00:37:24.980 | - And you actually give advice on the other end,
00:37:26.860 | that logs are very useful from the defense perspective,
00:37:30.260 | that the information is useful
00:37:36.220 | for being able to figure out what the attacks were all about.
00:37:39.260 | - Logs are the only reason I found Hector Monsegur.
00:37:41.340 | I mean, the one time his VPN dropped during a Fox hack
00:37:46.340 | and he says what he did wasn't even hacking.
00:37:48.380 | He just was sent a link and he clicked on it.
00:37:50.180 | And in 10 million lines of logs,
00:37:52.580 | there was one IP address that stuck out.
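
As a sketch of how one IP can stick out of millions of log lines, assuming a hypothetical log format with the client IP as the first whitespace-separated field:

```python
from collections import Counter

def rare_ips(log_path: str, threshold: int = 5) -> list:
    # Count how often each source IP appears; a real home or VPN-dropped
    # address among millions of relay entries stands out by rarity.
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            fields = line.split()
            if fields:
                counts[fields[0]] += 1
    return [ip for ip, n in counts.items() if n <= threshold]

# Usage (file name is hypothetical):
# print(rare_ips("access.log"))
```
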
00:37:54.580 | - This is fascinating.
00:37:56.820 | We'll explore several angles of that.
00:37:58.460 | So what was the process of bringing down Ross
00:38:04.740 | and the Silk Road?
00:38:06.660 | - All right, so that's a long story.
00:38:07.820 | You want the whole thing or you want to break it up?
00:38:09.620 | - Let's start at the beginning.
00:38:11.620 | - Once we had the information of the chat logs
00:38:15.860 | and all that from the server, we found--
00:38:18.140 | - What's the server?
00:38:18.980 | What's the chat log?
00:38:19.940 | - So the dot onion was running the website,
00:38:23.000 | the Silk Road was running on a server in Iceland.
00:38:27.020 | - How did you figure that out?
00:38:28.420 | That was one of the claims that the NSA.
00:38:31.680 | - Yeah, that's the one that we said that, yeah,
00:38:34.300 | I wouldn't tell you if it was.
00:38:35.700 | It's on the internet.
00:38:36.540 | I mean, the internet has their conspiracy theories
00:38:38.140 | and all that, so.
00:38:39.380 | - But you figure out, that's the part of the thing you do.
00:38:41.780 | It's puzzle pieces and you have to put them together
00:38:44.220 | and look for different pieces of information
00:38:45.900 | and figure out, okay,
00:38:46.780 | so you figure out the server is in Iceland.
00:38:48.420 | - We get a copy of it.
00:38:49.540 | And so we started getting clues off of that.
00:38:51.260 | - Was it a physical copy of the server?
00:38:53.020 | - Yeah, you fly over there.
00:38:54.720 | So you go, if you've been to Iceland,
00:38:56.660 | if you've never been, you should definitely go to Iceland.
00:38:58.940 | - Is it beautiful?
00:38:59.780 | - I love it, I love it.
00:39:00.620 | It was what, so I'll tell you this.
00:39:02.080 | So, sorry, tangents.
00:39:03.380 | - Yeah, I love this, yeah.
00:39:05.300 | - So I went to Iceland for the Anonymous case.
00:39:09.020 | Then I went to Iceland for the Silk Road case.
00:39:10.820 | And I was like, oh shit, all cyber crime goes through Iceland.
00:39:13.700 | It was just my sort of thing.
00:39:15.020 | And I was over there for like the third time.
00:39:16.860 | And I said, if I ever can bring my family here.
00:39:19.380 | Like, so there's a place called Thingvellir,
00:39:20.900 | and I'm sure I'm fucking up the name.
00:39:22.240 | The Icelanders are pissed right now.
00:39:24.020 | But it's where the North American continental plate
00:39:27.420 | and the European continental plate are pulling apart.
00:39:29.780 | And it's being filled in with volcanic material
00:39:33.260 | in the middle.
00:39:34.660 | And it's so cool.
00:39:36.300 | Like, I was like, one day I'll be able to afford
00:39:39.740 | to bring my family here.
00:39:41.380 | And once I left--
00:39:42.780 | - Just like the humbling and the beauty of nature.
00:39:45.060 | - Just everything, man, it was a different world.
00:39:46.980 | It was insane how great Iceland is.
00:39:49.660 | And so we went back and we rented a van
00:39:51.820 | and we took friends.
00:39:53.340 | And we drove around the entire country.
00:39:56.560 | Absolutely, like a beautiful place.
00:40:01.260 | Like, Reykjavik's nice, but get out of Reykjavik
00:40:04.020 | as quick as you can and see the countryside.
00:40:06.060 | - How is this place even real?
00:40:07.560 | - Well, it's so new.
00:40:08.800 | I mean, that's, so you know, our rivers
00:40:10.380 | have been going through here for millions of years
00:40:11.900 | and flattened everything out and all that.
00:40:13.380 | These are new, this is new land
00:40:15.380 | being carved by these rivers.
00:40:17.020 | You can walk behind a waterfall in one place.
00:40:20.060 | It's the most beautiful place I've ever been.
00:40:23.140 | - You understand why this is a place
00:40:24.860 | where a lot of hacking is being done?
00:40:26.840 | - Because the energy is free and it's cool.
00:40:29.740 | So you have a lot of servers going on there.
00:40:31.500 | Server farms, you know, the energy has come
00:40:34.460 | up out of the ground, geothermal.
00:40:36.980 | And so, and then it keeps all the servers nice and cool.
00:40:40.060 | So why not keep your computers there at a cheap rate?
00:40:43.200 | - I'll definitely visit for several reasons,
00:40:46.540 | including to talk to AV Unit.
00:40:48.340 | - Yeah, he'll meet you there.
00:40:50.320 | - Well, the servers are there,
00:40:51.420 | but they don't probably live there.
00:40:53.200 | I mean, that's interesting.
00:40:54.220 | I mean, the Pacific, the PST, the time zones,
00:40:58.100 | there's so many fascinating things to explore here.
00:41:00.340 | But so you got--
00:41:01.180 | - Sorry, to add to that, I mean,
00:41:02.340 | the European internet cable goes through there.
00:41:04.160 | So, you know, across to Greenland
00:41:06.300 | and down through Canada and all that.
00:41:07.460 | So they have backbone access with cheap energy
00:41:10.620 | and free cold weather, you know.
00:41:13.640 | - And beautiful.
00:41:14.480 | - Oh, and beautiful, yes.
00:41:15.680 | - So chat logs on that server,
00:41:19.020 | what was in the chat logs?
00:41:22.780 | - Everything, he kept them all.
00:41:24.020 | That's another issue.
00:41:25.180 | If you're running a criminal enterprise,
00:41:27.020 | please don't keep all, again,
00:41:28.460 | I'm not making a guidebook
00:41:29.380 | of how to commit the perfect crime,
00:41:31.500 | but you know, every chat he ever had,
00:41:33.740 | and everyone's chat,
00:41:34.580 | it was like going into the Facebook of criminal activity.
00:41:38.220 | - Yeah, just looking at texts with Elon Musk
00:41:42.540 | being part of the conversations.
00:41:45.300 | I don't know if you're familiar,
00:41:46.380 | but they've been made public
00:41:48.300 | for the court cases going through,
00:41:50.260 | was going through, is going through,
00:41:52.020 | was going through with Twitter.
00:41:53.380 | - I don't know where it is.
00:41:55.260 | But it made me realize that, oh, okay.
00:41:58.460 | I'm generally, that's my philosophy on life,
00:42:01.640 | is like anything I text or email or say,
00:42:05.300 | publicly or privately, I should be proud of.
00:42:07.700 | So I tried to kind of do that
00:42:10.100 | because you basically, you say don't keep chat logs,
00:42:13.700 | but it's very difficult to erase chat logs from this world.
00:42:18.700 | I guess if you're a criminal, that should be,
00:42:22.140 | like you have to be exceptionally competent
00:42:24.300 | at that kind of thing.
00:42:25.180 | To erase your footprints is very, very difficult.
00:42:27.380 | - Can't make one mistake.
00:42:28.420 | All it takes is one mistake of keeping it.
00:42:30.580 | But yeah, I mean, whatever you put in a chat log
00:42:34.620 | or whatever you put in an email,
00:42:36.700 | it has to hold up, and you have to be able
00:42:38.580 | to stand behind it publicly when it comes out.
00:42:40.260 | But if it comes out 10 years from now,
00:42:42.480 | you have to stand behind it.
00:42:43.660 | I mean, we're seeing that now in today's society.
00:42:45.900 | - Yeah, but that's a responsibility
00:42:47.740 | you have to take really, really seriously.
00:42:49.500 | If I was a parent and advising teens,
00:42:52.500 | like you kind of have to teach them that.
00:42:54.580 | I know there's a sense like,
00:42:56.860 | no, we'll become more accustomed to that kind of thing.
00:42:58.980 | But in reality, no, I think in the future
00:43:02.140 | we'll still be held responsible
00:43:03.860 | for the weird shit we do.
00:43:05.220 | - Yeah, a friend of mine,
00:43:06.040 | his daughter got kicked out of college
00:43:07.340 | because of something she posted in high school
00:43:09.360 | and the shittiest thing for him, but great for my kids.
00:43:13.060 | Great lesson, look over there
00:43:14.580 | and you don't want that to happen to you.
00:43:16.380 | - Yeah, okay.
00:43:17.580 | So in the chat logs was useful information,
00:43:21.300 | like breadcrumbs of information
00:43:24.780 | that you can then pull out.
00:43:26.740 | - Yeah, great evidence and stuff.
00:43:28.140 | I mean, obviously-- - Evidence too.
00:43:29.460 | - Yeah, a lot of evidence.
00:43:30.700 | Here's a sale of this much heroin
00:43:33.640 | because Ross ended up getting charged
00:43:35.940 | with czar status on certain things.
00:43:37.580 | And it's a certain weight in each type of drug
00:43:41.580 | that you had, I think it's four or five employees
00:43:45.100 | of your empire and that you made more than $10 million.
00:43:48.660 | And so it's just like what the narco traffickers
00:43:52.140 | get charged with or anybody out of Colombia.
00:43:55.660 | - And that was primarily what he was charged with
00:44:00.140 | when he was arrested: the drugs.
00:44:03.020 | - Yeah, and he got charged
00:44:03.940 | with some of the hacking tools too.
00:44:05.980 | - Okay, because he's in prison, what, for--
00:44:08.900 | - Two life sentences plus 40 years.
00:44:11.540 | - And no possibility of parole?
00:44:13.780 | - In the federal system,
00:44:14.740 | there's no possibility of parole when you have life.
00:44:16.940 | The only way you get out is if the president pardons you.
00:44:19.780 | - There's always a chance.
00:44:25.060 | - There is, I think it was close.
00:44:26.700 | I heard rumors it was close.
00:44:29.620 | - Well, right, so it depends.
00:44:31.180 | Given, it's fascinating, but given the political,
00:44:33.940 | the ideological ideas that he represented and espoused,
00:44:38.940 | it's not out of the realm of possibility.
00:44:42.700 | - Yeah, I mean, I've been asked before,
00:44:44.940 | does he get out of prison first
00:44:46.100 | or does Snowden come back into America?
00:44:48.060 | I don't know, I have no idea.
00:44:49.740 | - Snowden just became a Russian citizen.
00:44:50.900 | - I saw that, and I've heard a lot of weird theories
00:44:53.820 | about that one.
00:44:54.820 | - Well, actually, on another tangent, let me ask you,
00:44:57.740 | do you think Snowden is a good or a bad person?
00:45:02.740 | - A bad person.
00:45:04.700 | - Can you make the case that he's a bad person?
00:45:07.980 | - There's ways of being a whistleblower,
00:45:10.020 | and there's rules set up on how to do that.
00:45:15.620 | He didn't follow those rules.
00:45:17.300 | I mean, I'm red, white, and blue, so I'm pretty,
00:45:22.300 | - So you think his actions were anti-American?
00:45:24.500 | - I think the results of his actions were anti-American.
00:45:27.260 | I don't know if his actions were anti-American.
00:45:29.180 | - Do you think he could have anticipated
00:45:30.540 | the negative consequences of his action?
00:45:33.420 | - Yes.
00:45:34.260 | - Should we judge him by the consequences
00:45:35.460 | or the ideals of the intent of his actions?
00:45:39.500 | - I think we all get to judge him based on our own beliefs,
00:45:41.940 | but I believe what he did was wrong.
00:45:44.500 | - Can you steelman the case that he's actually
00:45:47.300 | a good person and good for this country,
00:45:50.460 | for the United States of America,
00:45:52.660 | as a flag bearer for the whistleblowers,
00:45:56.900 | the check on the power of government?
00:45:59.400 | - Yeah, I mean, I'm not a big-government-type guy,
00:46:04.180 | you know, so that sounds weird
00:46:06.140 | coming from a government guy for so many years,
00:46:08.500 | but there's rules in place for a reason.
00:46:12.500 | I mean, he put some of our best capabilities,
00:46:16.500 | he made them publicly available.
00:46:18.100 | It really kind of set us back in the,
00:46:21.540 | and this isn't my world at all,
00:46:23.080 | but the offensive side of cybersecurity.
00:46:25.460 | - Right, so he revealed stuff that he didn't need to reveal
00:46:27.940 | in order to make the point.
00:46:29.180 | - Correct.
00:46:30.020 | - So if you could imagine a world where he leaked stuff
00:46:35.900 | that revealed the mass surveillance efforts
00:46:40.500 | and not reveal other stuff.
00:46:43.220 | Like, is the mass surveillance,
00:46:46.660 | I mean, that's the thing that,
00:46:48.620 | of course, in the interpretation of that,
00:46:50.900 | there's fear-mongering, but at the core,
00:46:53.620 | that was a real shock to people
00:46:55.740 | that it's possible for a government to collect data at scale.
00:47:00.740 | - It's surprising to me that people are that shocked by it.
00:47:06.920 | - Well, there's conspiracies,
00:47:08.220 | and then there's like actual evidence
00:47:12.340 | that that is happening.
00:47:13.940 | I mean, there's a lot of reality that people ignore,
00:47:18.940 | but when it hits you in the face,
00:47:20.620 | you realize, holy shit, we're living in a new world.
00:47:22.740 | This is the new reality,
00:47:25.060 | and we have to deal with that reality.
00:47:26.060 | Just like you work in cybersecurity,
00:47:28.100 | I think it really hasn't hit most people
00:47:31.900 | how fucked we all are in terms of cybersecurity.
00:47:36.700 | Okay, let me rephrase that.
00:47:38.580 | How many dangers there are in a digital world,
00:47:41.940 | how much under attack we all are,
00:47:44.100 | and how more intense the attacks are getting,
00:47:47.620 | and how difficult the defense is,
00:47:49.300 | and how important it is,
00:47:50.900 | and how much we should value it,
00:47:52.420 | and all the different things we should do
00:47:53.660 | at the small and large scale to defend.
00:47:55.700 | Like, most people really haven't woken up.
00:47:58.260 | They think about privacy from tech companies.
00:48:01.020 | They don't think about attacks, cyber attacks.
00:48:03.940 | - People don't think they're a target,
00:48:04.940 | and that message definitely has to get out there.
00:48:07.780 | I mean, if you have a voice, you're a target.
00:48:10.780 | If the place you work, you might be a target.
00:48:13.460 | Say, your husband might work at some place,
00:48:15.820 | because now people are working from home,
00:48:17.460 | so they're gonna target you to get access
00:48:19.860 | to his network in order to get in.
00:48:21.900 | - Well, in that same way,
00:48:23.180 | the idea that the US government or any government
00:48:26.340 | could be doing mass surveillance on its citizens
00:48:28.940 | is one that was a wake-up call,
00:48:32.900 | because you could imagine the ways in which that could be,
00:48:37.500 | like, you could abuse the power of that
00:48:42.020 | to control a citizenry for political reasons and purposes.
00:48:45.980 | - Absolutely.
00:48:46.820 | You know, you could abuse it.
00:48:48.220 | I think during, in part of the Snowden leak,
00:48:50.740 | we saw that two NSA guys were monitoring their girlfriends,
00:48:55.540 | and there's rules in place for that.
00:48:56.660 | Those people should be punished for abusing that.
00:48:58.860 | But how else are we going to hear about, you know,
00:49:02.340 | terrorists that are in the country
00:49:04.220 | talking about birthday cakes?
00:49:06.500 | And, you know, that was a case where that was the trip word,
00:49:09.220 | that, you know, we're gonna go bomb New York City's subway.
00:49:12.700 | - Yeah, it's complicated,
00:49:13.660 | but it just feels like there should be
00:49:15.140 | some balance of transparency.
00:49:17.260 | There should be a check in that power.
00:49:19.460 | Because, like, you know, in the name of the war on terror,
00:49:22.660 | you can sort of sacrifice,
00:49:26.580 | there is a trade-off between security and freedom,
00:49:29.700 | but it just feels like there's a giant, slippery slope
00:49:33.060 | on the sacrificing of freedom in the name of security.
00:49:36.020 | - I hear you.
00:49:36.860 | And, you know, we live in a world where,
00:49:40.180 | well, I live in a world where I had to tell you exactly
00:49:43.860 | when I arrested someone,
00:49:44.740 | I had to write a 50-page document of how I arrested you
00:49:47.900 | and all the probable cause I have against you and all that.
00:49:50.140 | Well, you know, bad guys are reading that.
00:49:51.940 | They're reading how I caught you,
00:49:53.500 | and they're changing the way they're doing things.
00:49:54.900 | They're changing their MO.
00:49:56.820 | You know, they're doing it to be more secure.
00:49:59.300 | If we tell people how we're monitoring,
00:50:02.540 | how, what we're surveilling, we're gonna lose that.
00:50:05.100 | I mean, the terrorists are just gonna go a different way.
00:50:07.740 | And I'm not trying to, again, I'm not big government.
00:50:10.380 | I'm not trying to say that, you know,
00:50:12.060 | it's cool that we're monitoring,
00:50:14.460 | the US government's monitoring everything.
00:50:16.660 | You know, big tech's monitoring everything.
00:50:18.100 | They're just monetizing it
00:50:19.300 | versus possibly using it against you.
00:50:21.940 | - But there is a balance.
00:50:22.900 | And those 50 pages, they have a lot of value.
00:50:27.200 | They make your job harder,
00:50:28.920 | but they prevent you from abusing the power of the job.
00:50:33.680 | - Yeah. - There's a balance.
00:50:34.880 | - Yeah. - That's a tricky balance.
00:50:36.640 | So the chat logs in Iceland
00:50:40.400 | give you evidence of the heroin
00:50:46.440 | and all the large-scale czar-level drug trading.
00:50:51.440 | What else did it give you in terms of how to catch him?
00:50:55.720 | - It gave us the infrastructure.
00:50:56.800 | So the onion name was actually running on a server in France,
00:51:02.420 | and it only communicated through a back channel of VPN
00:51:04.960 | to connect to the Iceland server.
00:51:07.660 | There was a Bitcoin vault server that was also in Iceland.
00:51:13.560 | And I think that was so that the admins
00:51:17.040 | couldn't get into the Bitcoins,
00:51:19.600 | the other admins that were hired to work on the site.
00:51:21.560 | So you could get into the site,
00:51:23.040 | but you couldn't touch the money.
00:51:24.520 | Only Ross had access to that.
00:51:26.440 | And then another big mistake on Ross's part
00:51:29.400 | is he had the backups for everything
00:51:31.760 | at a data center in Philadelphia.
00:51:34.080 | Don't put your infrastructure in the United States.
00:51:38.200 | I mean, again, let's not make a playbook, but you know.
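For the curious, a crude sketch of the front/back split he's describing, with made-up addresses (the VPN-internal backend at 10.8.0.2 is hypothetical): the public-facing relay is all a visitor ever touches, and it just pipes bytes over a private tunnel to the real application server, which is why seizing the front box alone reveals little.

```python
import asyncio

# Toy relay: the public-facing box accepts connections and forwards bytes to the
# real application server over a private tunnel, so the application server's
# true location never appears to visitors. Addresses are made up.
BACKEND = ("10.8.0.2", 8080)  # hypothetical VPN-internal address of the backend

async def pipe(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    # Copy bytes one direction until the peer closes.
    try:
        while data := await reader.read(4096):
            writer.write(data)
            await writer.drain()
    finally:
        writer.close()

async def handle(client_r, client_w):
    # One backend connection per visitor; relay both directions concurrently.
    backend_r, backend_w = await asyncio.open_connection(*BACKEND)
    await asyncio.gather(pipe(client_r, backend_w), pipe(backend_r, client_w))

async def main():
    server = await asyncio.start_server(handle, "0.0.0.0", 8000)
    async with server:
        await server.serve_forever()

asyncio.run(main())
```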
00:51:41.920 | - Well, I think these are low-hanging fruit
00:51:43.520 | that people of competence would know already.
00:51:46.480 | - I agree. - But it's interesting
00:51:47.520 | that he wasn't competent enough to make,
00:51:50.960 | so he was incompetent in certain ways.
00:51:53.240 | - Yeah, I don't think he was a mastermind
00:51:56.080 | of setting up an infrastructure
00:51:58.320 | that would protect his online business
00:52:03.080 | because keeping chat logs, keeping a diary,
00:52:06.200 | putting infrastructure where it shouldn't be,
00:52:08.500 | bad decisions.
00:52:12.000 | - How did you figure out that he's in San Francisco?
00:52:16.200 | - So we had that part with Jared
00:52:18.560 | that he was on the West Coast, and then--
00:52:20.840 | - Who again is Jared?
00:52:22.160 | - Jared Der-Yeghiayan, he was a partner.
00:52:25.840 | He was a DHS agent,
00:52:28.360 | worked for HSI, Homeland Security Investigations in Chicago.
00:52:32.240 | He started his Silk Road investigation
00:52:33.920 | because he was working at O'Hare
00:52:35.600 | and a weird package came in,
00:52:37.600 | come to find out he traced it back to Silk Road.
00:52:39.600 | So he started working a Silk Road investigation
00:52:42.720 | long before I started my case,
00:52:44.560 | and he made his way up undercover
00:52:46.880 | all the way to be an admin on Silk Road.
00:52:48.880 | So he was talking to Ross on a Jabber server,
00:52:53.080 | a private Jabber server, a private chat communication server.
00:52:56.640 | And we noticed that Ross's time zone
00:53:01.160 | on that Jabber server was set to the West Coast.
00:53:03.940 | So we had Pacific time on there.
00:53:05.980 | So we had a region, 1/24 of the world was covered
00:53:09.640 | of where we thought he might be.
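A minimal sketch of that narrowing step, with fabricated timestamps (chat software can expose a client's clock; XMPP has Entity Time, XEP-0202, for exactly this): the difference between the client's reported local time and true UTC yields an offset, and the offset alone cuts the search to one slice of the globe.

```python
from datetime import datetime

# Fabricated example: the chat client reports its local time; we know true UTC.
reported_local = datetime(2013, 7, 4, 9, 30)   # what the client's clock showed
actual_utc     = datetime(2013, 7, 4, 16, 30)  # true UTC at the same moment

offset_hours = (reported_local - actual_utc).total_seconds() / 3600
print(f"UTC offset: {offset_hours:+.0f}h")     # -> -7h, Pacific Daylight Time

# Illustrative (not exhaustive) zones sharing a -7 offset:
candidates = {-7: ["America/Los_Angeles (summer)",
                   "America/Phoenix (year-round)",
                   "America/Denver (winter)"]}
print(candidates.get(int(offset_hours), []))
```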
00:53:11.960 | - And from there, how do you get to San Francisco?
00:53:14.460 | - There was another guy, an IRS agent
00:53:16.520 | that was part of the team,
00:53:17.840 | and he used a powerful tool
00:53:21.240 | to find his clue.
00:53:24.160 | He used the world of Google.
00:53:26.320 | He simply just went back and Googled around
00:53:28.760 | for Silk Road at the time it was coming up
00:53:31.880 | and found some posts on some help forums
00:53:36.080 | that this guy was starting an Onion website
00:53:38.960 | and wanted some cryptocurrency help.
00:53:41.120 | And if you could help him,
00:53:42.840 | please reach out to rossulbricht@gmail.com.
00:53:47.080 | In my world, that's a clue.
00:53:50.880 | - Okay, so that's as simple as that.
00:53:53.000 | - Yeah, and the name he used on that post was Frosty.
00:53:56.440 | - Yeah, so you had to connect Frosty
00:53:59.560 | and other uses of Frosty,
00:54:01.320 | and here's a Gmail, and the Gmail has the name.
00:54:05.200 | - The Gmail posted that I need help
00:54:07.720 | under the name Frosty on this forum.
00:54:09.840 | - So what's the connection of Frosty elsewhere?
00:54:12.560 | - The person logging into the Philadelphia backup server,
00:54:15.960 | the name of the computer was Frosty.
00:54:19.400 | Another clue in my world.
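A toy sketch of that kind of identifier linkage; the handle and email are the ones from the story, but the record structure and sources here are fabricated for illustration. The whole trick is that one string recurring across unrelated data sets ties the records together.

```python
from collections import defaultdict

# Fabricated records: (source, identifier, detail).
records = [
    ("help forum post",  "frosty",                 "asked for help with an onion site"),
    ("help forum post",  "rossulbricht@gmail.com", "contact email left in the post"),
    ("server auth logs", "frosty",                 "client machine name on backup-server logins"),
]

by_identifier = defaultdict(list)
for source, ident, detail in records:
    by_identifier[ident].append((source, detail))

# An identifier seen in more than one independent source is a lead.
for ident, hits in by_identifier.items():
    if len(hits) > 1:
        print(ident, "->", hits)
```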
00:54:21.320 | - And that's it.
00:54:22.640 | The name is there, the connection to the Philadelphia server
00:54:25.960 | and then to Iceland is there,
00:54:28.000 | and so the rest is small details in terms of,
00:54:31.840 | or is there interesting details?
00:54:33.440 | - No, I mean, there's some electronic surveillance
00:54:35.280 | that found Ross Ulbricht living in a house,
00:54:37.640 | and is there, you know, a computer at his house
00:54:40.520 | attaching to, you know, does it have Tor traffic
00:54:44.780 | at the same time that DPR's on?
00:54:47.040 | Another big clue.
00:54:48.920 | (laughing)
00:54:49.760 | - Matching up timeframes?
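A minimal sketch of that time-frame matching, with invented windows: count how many of the account's online sessions coincide with Tor traffic observed at the residence; consistent overlap across many sessions is what makes the correlation persuasive.

```python
# Invented example windows (hours of one day) standing in for real logs.
dpr_sessions = [(9, 12), (14, 17), (21, 23)]   # when the admin account was online
house_tor    = [(9, 11), (15, 16), (22, 23)]   # when Tor traffic left the house

def overlaps(a, b):
    # Two half-open intervals overlap iff each starts before the other ends.
    return max(a[0], b[0]) < min(a[1], b[1])

matched = sum(any(overlaps(s, t) for t in house_tor) for s in dpr_sessions)
print(f"{matched}/{len(dpr_sessions)} sessions coincide with Tor traffic")
```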
00:54:51.880 | - Again, just putting your email out there,
00:54:54.840 | putting your name out there like that,
00:54:56.880 | like what I see from that, just at the scale of that market,
00:55:02.280 | it just makes me wonder
00:55:04.320 | how many criminals are out there
00:55:05.520 | that are not making these low-hanging fruit mistakes
00:55:08.480 | and are still successfully operating.
00:55:10.680 | To me, it seems like
00:55:13.940 | it's much easier to be a criminal on the internet.
00:55:18.680 | - What else to you is interesting to understand
00:55:20.760 | about that case of Ross and Silk Road
00:55:25.640 | and just the history of it from your own relationship with it
00:55:30.640 | from a cybersecurity perspective,
00:55:33.040 | from an ethical perspective, all that kind of stuff.
00:55:35.360 | Like when you look back,
00:55:36.600 | what's interesting to you about that case?
00:55:39.640 | - I think my views on the case have changed over time.
00:55:42.280 | I mean, it was my job back then,
00:55:44.640 | so I just looked at it as of, you know,
00:55:46.920 | I'm going after this.
00:55:48.160 | I sort of made a name for myself in the bureau
00:55:52.520 | for the anonymous case, and then this one was just,
00:55:55.040 | I mean, this was a bigger deal.
00:55:56.120 | I mean, they flew me down to DC
00:55:57.920 | to meet with the director about this case.
00:56:00.560 | The president of the United States
00:56:01.640 | was gonna announce this case, the arrest.
00:56:03.600 | Unfortunately, the government shut down two days before,
00:56:06.680 | so it was just us.
00:56:07.600 | And that's really the only reason
00:56:08.800 | I had any publicity out of it
00:56:10.480 | is because the government shut down
00:56:12.720 | and the only thing that went public
00:56:14.280 | was that affidavit with my signature at the end.
00:56:17.360 | Otherwise, it would have just been the attorney general
00:56:20.000 | and the president announcing the rest of this big thing,
00:56:22.440 | and you wouldn't have seen me.
00:56:24.320 | - Did you understand that this was a big case?
00:56:26.160 | - Yeah, I knew at the time.
00:56:28.560 | - Was it because of the scale of it or what it stood for?
00:56:32.380 | - I just knew that the public was gonna react in a big way.
00:56:37.400 | Like the media was, now, did I think
00:56:39.160 | that it was gonna be on the front page of every newspaper
00:56:41.080 | the day after the arrest?
00:56:42.280 | No, but I could sense it.
00:56:44.000 | Like I went like three or four days without sleep.
00:56:47.120 | When I was out in San Francisco to arrest Ross,
00:56:49.200 | I had sent three guys to Iceland to...
00:56:52.280 | So it was a three-prong approach for the takedown.
00:56:54.560 | It was get Ross, get the Bitcoins, and seize the site.
00:56:58.280 | Like we didn't want someone else taking control of the site
00:57:00.560 | and we wanted that big splash of that banner.
00:57:02.560 | Like, look, the government found this site.
00:57:04.440 | Like you might not wanna think about doing this again.
00:57:08.600 | - And you were able to pull off all three?
00:57:10.080 | - Maybe that's my superpower.
00:57:11.480 | I'm really good about putting smarter people
00:57:13.560 | than I am together on the right things.
00:57:16.680 | - It's the only way to do it.
00:57:18.880 | - In the business I formed, that's what I did.
00:57:20.400 | I hired only smarter people than me.
00:57:21.840 | And I'm not that smart,
00:57:23.280 | but smart enough to know who the smart people are.
00:57:26.720 | - The team was able to do all three?
00:57:28.560 | - Yeah, we were able to get all three done.
00:57:31.280 | Yeah, and the one guy, one of the guys,
00:57:32.760 | the main guys I sent to Iceland, man, he was so smart.
00:57:35.240 | I sent another guy from the FBI to France to get that part,
00:57:42.520 | and he couldn't do it,
00:57:43.720 | so the guy in Iceland did it from Iceland.
00:57:46.360 | They had to pull some stuff out of memory on a computer.
00:57:49.240 | You know, it's live process stuff.
00:57:51.360 | I'm sure you've done that before, but.
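For a flavor of what live-memory collection can look like, here is one generic Linux approach, not necessarily what the team in Iceland used: walk a running process's readable mappings from /proc/<pid>/maps and copy them out of /proc/<pid>/mem before the process, and the evidence in it, goes away. Requires root (or ptrace rights) on the target machine.

```python
import sys

# Generic live-process memory dump sketch for Linux; run as root.
pid = int(sys.argv[1])

with open(f"/proc/{pid}/maps") as maps, \
     open(f"/proc/{pid}/mem", "rb") as mem, \
     open(f"pid{pid}.dump", "wb") as out:
    for line in maps:
        addr_range, perms = line.split()[:2]
        if "r" not in perms:                  # skip unreadable regions
            continue
        start, end = (int(x, 16) for x in addr_range.split("-"))
        try:
            mem.seek(start)
            out.write(mem.read(end - start))
        except OSError:                       # some regions (e.g. [vsyscall]) fail
            continue
```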
00:57:53.260 | - I'm sure you did.
00:57:55.960 | Look what you're doing.
00:57:57.200 | You're, this is like a multi-layer interrogation going on.
00:58:01.300 | Was there a concern that somebody else
00:58:04.960 | would step in and control the site?
00:58:06.280 | - Absolutely.
00:58:07.120 | We didn't have insight on who exactly would take control.
00:58:10.120 | So it turns out that Ross had like dictatorial control,
00:58:13.760 | so it wasn't easy to delegate to somebody else.
00:58:17.280 | - He hadn't.
00:58:18.120 | I think he had some sort of ideas.
00:58:19.840 | I mean, his diary talked about walking away
00:58:21.960 | and giving it to somebody else,
00:58:23.120 | but he couldn't give up that control on anybody apparently.
00:58:26.620 | - Which makes you think that power corrupts,
00:58:30.520 | and his ideals were not as strong as he espoused about,
00:58:34.680 | because if it was about the freedom
00:58:36.880 | of being able to buy drugs,
00:58:39.720 | if you want to,
00:58:41.480 | then he surely should have found ways
00:58:43.640 | to delegate that power.
00:58:45.280 | - He changed over time.
00:58:46.360 | You could see it in his writings that he changed.
00:58:49.560 | Like, so people argue back and forth
00:58:53.320 | that there was never murders on Silk Road.
00:58:55.920 | When we were doing the investigation,
00:58:57.400 | to us, there were six murders.
00:58:59.720 | So there was, the way we saw him at the time
00:59:04.320 | was Ross ordered people to be murdered.
00:59:07.960 | People stole from him and all that.
00:59:09.720 | It was sort of an evolution from,
00:59:12.320 | oh man, I can't deal with this, I can't do it,
00:59:14.360 | it's too much, to the last one was like,
00:59:17.240 | the guy said, well, he's got three roommates.
00:59:20.300 | It's like, oh, we'll kill them too.
00:59:22.960 | - Was that ever proven in court?
00:59:24.760 | - No. - Just the murder part?
00:59:26.040 | - The murders never went forward
00:59:27.560 | because there was some stuff, problems in that case.
00:59:31.120 | So there was a separate case in Baltimore
00:59:33.480 | that they had been working on for a lot longer.
00:59:35.360 | And so, during the investigation,
00:59:37.760 | that caused a bunch of problems
00:59:39.080 | because now we have multiple federal agencies,
00:59:42.000 | a case against the same thing.
00:59:43.760 | - How do you decide not to push forward
00:59:45.760 | the murder investigations?
00:59:48.560 | - So there was a de-confliction meeting that happened in DC.
00:59:52.920 | I didn't happen to go to that meeting, but Jared went,
00:59:56.320 | this is before I ever knew Jared,
00:59:58.040 | and we have like televisions where we can just sit in a room
01:00:04.000 | and sit in on the meeting,
01:00:06.160 | but it's all secured network and all that.
01:00:07.800 | So we can talk openly about secure things.
01:00:11.280 | And we sat in on the meeting
01:00:13.640 | and people just kept saying the term sweat equity.
01:00:17.000 | I've got sweat equity,
01:00:18.200 | meaning that they had worked on the case for so long
01:00:21.880 | that they deserve to take them down.
01:00:24.640 | And by this time, no one knew about us,
01:00:28.880 | but we told them at the meeting
01:00:30.640 | that we had found the server and we have a copy of it
01:00:33.120 | and we have the infrastructure.
01:00:34.640 | And these guys had just had communications through undercovers.
01:00:38.840 | They didn't really know what was going on.
01:00:40.640 | And this wasn't my first de-confliction meeting.
01:00:42.640 | We had a huge de-confliction meeting
01:00:44.080 | during the Anonymous case.
01:00:46.480 | - What's a de-confliction meeting?
01:00:47.880 | - Agents within your agency or other federal agencies
01:00:52.240 | have an open investigation that if you expose your case
01:00:55.840 | or took down your case would hurt their case
01:00:58.080 | or the other way.
01:00:58.920 | - Oh, so you kind of have a,
01:01:00.480 | it's like the rival gangs meet at the table
01:01:02.720 | in a smoke filled room and-
01:01:05.360 | - Less bullets at the end, but yes.
01:01:07.160 | - Boy, with the sweat equity.
01:01:10.240 | - Yeah.
01:01:11.080 | - I mean, there's careers at stake, right?
01:01:13.200 | Yeah.
01:01:14.320 | You hate that idea.
01:01:15.600 | - Yeah, I mean, why is that at stake?
01:01:17.720 | Just because you've worked on it long enough,
01:01:19.960 | longer than I have, that means you did better?
01:01:23.400 | - Yeah.
01:01:24.240 | - That's insane to me.
01:01:25.520 | That's rewarding bad behavior.
01:01:28.040 | - And so part of the sweat equity discussion
01:01:30.840 | was about the murders.
01:01:31.680 | This was, here's a chance to actually bust them
01:01:34.200 | given the data you have from Iceland
01:01:35.880 | and all that kind of stuff.
01:01:37.120 | So why?
01:01:37.960 | - They wanted us just to turn the data over to them.
01:01:41.040 | - To them.
01:01:41.880 | - Yeah, thanks for getting us this far.
01:01:43.880 | Here it is.
01:01:44.720 | I mean, it came to the point where they sent us,
01:01:46.880 | like they had a picture of who they thought Ross was
01:01:50.920 | and it was an internet meme.
01:01:52.560 | It really was a meme.
01:01:53.560 | It was a photo that we could look up.
01:01:55.960 | Like it was insane.
01:01:58.040 | - All right, so there's different degrees of competence
01:02:01.120 | all across the world between different people.
01:02:04.520 | Okay.
01:02:05.360 | Does part of you regret that, because you pushed forward
01:02:10.280 | the heroin and drug trade charges,
01:02:12.960 | we never got to the murder discussion?
01:02:15.800 | - I mean, the only regret is that the internet
01:02:18.320 | doesn't seem to understand.
01:02:19.640 | Like they just kind of blow that part off
01:02:22.040 | that he literally paid people to have people murdered.
01:02:24.880 | It didn't result in a murder.
01:03:25.920 | And I thank God no one was actually murdered.
01:02:27.960 | - But that's where his mind was.
01:02:29.360 | - His mind and where he wrote in his diary
01:02:31.480 | was that I had people killed and here's the money.
01:02:33.320 | He paid it.
01:02:34.160 | He paid a large amount of Bitcoins for that murder.
01:02:38.480 | - So he didn't just even think about it.
01:02:40.880 | He actually took action, but the murders never happened.
01:02:43.760 | He took action by paying the money.
01:02:45.320 | - Correct.
01:02:46.160 | And the people came back with results.
01:02:47.800 | He thought they were murdered.
01:02:50.000 | - That said, can you understand the steel man,
01:02:53.480 | the case for the drug trade on Silk Road?
01:02:56.760 | Like can you make the case
01:02:58.400 | that it's a net positive for society?
01:03:00.280 | - So there was a time period
01:03:03.440 | of when we found out the infrastructure
01:03:05.920 | and when we built the case against Ross.
01:03:08.600 | I don't remember exactly, six weeks, a month, two months,
01:03:12.640 | I don't know, somewhere in there.
01:03:14.800 | But then at Ross's sentencing,
01:03:16.640 | there was a father that stood up
01:03:18.520 | and talked about his son dying.
01:03:21.160 | And I went back and kind of did the math.
01:03:22.640 | And it was between those time periods
01:03:24.000 | of when we knew we could shut it down.
01:03:25.920 | We could have pulled the plug on the server and gone.
01:03:27.920 | And when Ross was arrested,
01:03:29.400 | his son died from buying drugs on Silk Road.
01:03:32.680 | And I still think about that father a lot.
01:03:37.280 | - But if we look at the scale at the war on drugs,
01:03:41.040 | let's just even outside of Silk Road,
01:03:43.680 | do you think the war on drugs by the United States
01:03:47.440 | has alleviated more suffering
01:03:52.080 | or caused more suffering in the world?
01:03:53.980 | - That might be above my pay scale.
01:03:57.320 | I mean, I understand the other side of the argument.
01:03:59.560 | I mean, people said that I don't have to go down
01:04:01.480 | to the corner to buy drugs.
01:04:03.320 | I'm not gonna get shot on the corner
01:04:04.880 | buying drugs or something.
01:04:05.920 | I can just have them sent to my house.
01:04:07.440 | People are gonna do drugs anyways.
01:04:08.800 | I understand that argument.
01:04:10.420 | From my personal standpoint,
01:04:13.560 | if I made it more difficult for my children to get drugs,
01:04:16.480 | then I'm satisfied.
01:04:17.480 | - So your personal philosophy is that
01:04:20.280 | if we legalize all drugs, including heroin and cocaine,
01:04:23.800 | that that would not make for a better world.
01:04:27.000 | - I don't, no, personally, I don't believe
01:04:30.160 | legalizing all drugs would make for a better world.
01:04:32.440 | - Can you imagine that it would?
01:04:36.840 | Do you understand that argument?
01:04:38.640 | - Sure, I mean, as I've gotten older,
01:04:40.200 | I've started to, I like to see both sides of an argument.
01:04:42.640 | And when I can't see the other side,
01:04:43.880 | that's when I really like to dive into it.
01:04:46.120 | And I can see the other side.
01:04:47.280 | I can see why people would say that.
01:04:49.520 | But I don't wanna raise my children in a world
01:04:54.480 | where drugs are just free for use.
01:04:57.480 | - Well, and then the other side of it is with Silk Road,
01:05:00.600 | taking down Silk Road, did that increase or decrease
01:05:06.720 | the number of drug trading criminals in the world?
01:05:11.280 | It's unclear.
01:05:12.240 | - Online, I think it increased.
01:05:14.360 | I think, that's one of the things I think about a lot
01:05:17.680 | with Silk Road was that no one really knew.
01:05:21.160 | I mean, there was thousands of users,
01:05:24.040 | but then after that, it was on the front page of the paper,
01:05:25.960 | and there was millions of people that knew about Tor
01:05:28.360 | and Onion Sites.
01:05:29.800 | It was an advertisement.
01:05:31.080 | I would have thought, I thought crypto was gonna crash
01:05:34.000 | right after that.
01:05:34.840 | Like, I don't know, people now see that bad people
01:05:37.040 | are doing bad things with crypto.
01:05:38.560 | That'll crash.
01:05:39.400 | Well, I'm obviously wrong on that one.
01:05:41.680 | And I thought, Ross was sentenced to two life sentences
01:05:45.160 | plus 40 years.
01:05:46.200 | No one's gonna start up these.
01:05:48.120 | Dark markets exploded after that.
01:05:50.200 | Some of them started as opportunistic,
01:05:53.840 | I'm gonna take those escrow accounts
01:05:55.840 | and I'm gonna steal all the money that came in.
01:05:58.120 | They were for that.
01:05:59.080 | But there were a lot of dark markets
01:06:01.600 | that popped up after that.
01:06:02.560 | Now we put the playbook out there.
01:06:04.320 | - Yeah, yeah.
01:06:07.040 | But and also there's a case for,
01:06:09.920 | do you ever think about not taking down,
01:06:13.520 | if you've not taken down Silk Road,
01:06:16.080 | you could use it because it's a market.
01:06:19.160 | It itself is not necessarily
01:06:21.280 | the primary criminal organization.
01:06:23.480 | It's a market for criminals.
01:06:25.200 | So it could be used to track down criminals
01:06:28.440 | in the physical world.
01:06:30.040 | So if you don't take it down,
01:06:31.840 | given that it was, you know, the central,
01:06:35.000 | how centralized it was,
01:06:36.680 | it could be used as a place to find criminals, right?
01:06:40.320 | As opposed to-- - So the dealers,
01:06:41.280 | the drug dealers take down the drug dealers?
01:06:42.800 | - Yeah.
01:06:43.640 | So if you have the cartels
01:06:45.120 | starting to get involved, you go after the dealers.
01:06:48.200 | - It would have been very difficult.
01:06:50.160 | - Because of Tor and all that.
01:06:51.280 | - Because of all the protections, the anonymity.
01:06:53.680 | De-cloaking all that would have been
01:06:56.160 | drastically more difficult.
01:06:58.480 | And a lot of people in upper management of the FBI
01:07:00.920 | didn't have the appetite for running something like that.
01:07:04.240 | That would have been the FBI running a drug market.
01:07:06.920 | How many kids, how many fathers would have to come in
01:07:09.440 | and said, "My kid bought while the FBI was running a site,
01:07:12.840 | a drug site, my kid died."
01:07:14.800 | So I didn't know of anybody in the FBI and management
01:07:17.760 | that would have the appetite to let us run
01:07:20.360 | what was happening on Silk Road.
01:07:21.960 | 'Cause remember at that time we're still believing
01:07:25.000 | six people are dead.
01:07:26.280 | We're still investigating where are all these bodies.
01:07:29.000 | That's pretty much why we took down Ross when we did.
01:07:33.560 | I mean, we had to jump on it fast.
01:07:35.600 | - What else can you say about this complicated world
01:07:40.360 | that has grown of the dark web?
01:07:42.040 | - I don't understand it.
01:07:46.240 | It would have been something for me,
01:07:47.960 | I thought it was gonna collapse.
01:07:49.520 | But I mean, it's just gotten bigger
01:07:52.120 | in what's going on out there.
01:07:53.040 | Now, I'm really surprised that it hasn't grown
01:07:55.880 | into other networks or people haven't developed
01:07:57.800 | other networks, but Tor--
01:07:58.640 | - You mean like instead of Tor?
01:08:00.160 | - Yeah, Tor is still the main one out there.
01:08:02.520 | I mean, there's a few others and I'm not gonna put
01:08:05.360 | an advertisement out for them.
01:08:06.800 | But I thought that market would have grown.
01:08:10.480 | - Yeah, my sense when I interacted with Tor
01:08:12.720 | was that there's huge usability issues.
01:08:15.280 | But that's for like legal activity.
01:08:17.640 | 'Cause like if you care about privacy,
01:08:19.080 | it's just not as good of a browser to look at stuff.
01:08:24.080 | - No, it's way too slow.
01:08:26.880 | It's way too slow.
01:08:27.840 | But I mean, you can't even like,
01:08:29.960 | I know some people would use it to like view movies
01:08:31.880 | like Netflix, so you can only view certain movies
01:08:33.800 | in certain countries.
01:08:34.640 | You can use it for that, but it's too slow even for that.
01:08:37.880 | - Were you ever able to hold in your mind
01:08:40.120 | the landscape of the dark web?
01:08:42.040 | Like what's going on out there?
01:08:44.240 | To me as a human being,
01:08:46.720 | it's just difficult to understand the digital world.
01:08:51.280 | Like these anonymous usernames,
01:08:53.480 | like doing anonymous activity.
01:08:56.200 | It's just, it's hard to, what am I trying to say?
01:09:00.160 | It's hard to visualize it in the way I can visualize it.
01:09:02.800 | Like I've been reading a lot about Hitler.
01:09:04.600 | I can visualize meetings between people, military strategy,
01:09:09.320 | deciding on certain evil atrocities,
01:09:13.080 | all that kind of stuff.
01:09:14.040 | I can visualize the people, there's agreements,
01:09:16.640 | hands, handshakes, stuff signed, groups built.
01:09:21.640 | Like in the digital space, like with bots, with anonymity,
01:09:25.760 | anyone human can be multiple people.
01:09:28.920 | It's just-
01:09:29.760 | - Yeah, it's all lies.
01:09:30.580 | It's all lies.
01:09:31.420 | - Like, yeah, it feels like I can't trust anything.
01:09:33.080 | - No, you can't.
01:09:33.920 | You honestly can't.
01:09:34.760 | And like, you can talk to two different people
01:09:36.560 | and it's the same person.
01:09:37.600 | Like there's so many different, you know,
01:09:40.400 | Hector had so many different identities online,
01:09:42.640 | you know, identities that, you know,
01:09:45.880 | lied to each other.
01:09:47.120 | I mean, he lied to people inside his group
01:09:49.400 | just to use another name to spy on,
01:09:51.280 | make sure what they were, you know,
01:09:52.560 | weren't talking shit behind his back
01:09:53.840 | or weren't doing anything.
01:09:55.840 | It's all lies and people that can keep
01:09:58.000 | all those lies straight.
01:09:59.240 | It's unbelievable to me.
01:10:00.840 | - Ross Ulbricht represents the very early days of that.
01:10:03.440 | That's why the competence wasn't there.
01:10:06.160 | Just imagine how good the people are now,
01:10:09.040 | the kids that grow up.
01:10:10.520 | Oh, they've learned from his mistakes.
01:10:12.600 | - Just the extreme competence.
01:10:15.880 | You just see how good people are at video games.
01:10:17.760 | Like the level of play in terms of video games.
01:10:22.720 | Like I used to think I sucked.
01:10:24.880 | And now I'm not even like,
01:10:27.680 | I'm not even in the like consideration
01:10:31.280 | of calling myself shitty at video games.
01:10:33.840 | I'm not even, I'm like non-existent.
01:10:35.760 | I'm like the mold.
01:10:37.640 | - Yeah, I stopped playing because it's so embarrassing.
01:10:39.440 | - It's embarrassing.
01:10:40.280 | It's like wrestling with your kid
01:10:41.520 | and he finally beats you.
01:10:42.360 | And he's like, well, fuck that.
01:10:43.280 | I'm not wrestling with my kid ever again.
01:10:45.360 | - And in some sense, hacking at its best and its worst
01:10:48.840 | is a kind of game.
01:10:50.360 | And you can get exceptionally good at that kind of game.
01:10:53.400 | - And you get the accolades of it.
01:10:56.200 | I mean, there's power that comes along.
01:10:58.840 | If you have success,
01:11:00.160 | I look at the kid that was hacking into Uber
01:11:02.360 | and Rockstar Games.
01:11:03.280 | He put it out there that he was doing it.
01:11:04.840 | I mean, he used the name,
01:11:06.280 | whatever hacked into Uber was his screen name.
01:11:09.840 | He was very proud of it.
01:11:10.720 | I mean, one, he was building evidence against himself,
01:11:13.640 | but he wanted that slap on the back.
01:11:16.680 | Like, look at what a great hacker you are.
01:11:18.600 | - Yeah.
01:11:19.480 | What do you think is in the mind of that guy?
01:11:22.120 | What do you think is in the mind of Ross?
01:11:24.520 | Do you think they see themselves as good people?
01:11:26.880 | Do you think they acknowledge
01:11:31.080 | the bad they're doing onto the world?
01:11:33.760 | - So that Uber hacker, I think that's just youth
01:11:36.120 | not realizing what consequences are,
01:11:37.840 | I mean, based on his actions.
01:11:39.440 | Ross was a little bit older.
01:11:41.480 | I think Ross truly is a libertarian.
01:11:45.080 | He truly had his beliefs that he could provide
01:11:50.080 | the gateway for other people
01:11:51.800 | to live that libertarian lifestyle
01:11:53.480 | and put in their body what they want.
01:11:55.400 | I don't think that was a front or a lie.
01:11:58.280 | - What's the difference between DPR and Ross?
01:12:01.120 | You said like, "I had never met Ross until then,
01:12:03.840 | "I have only had those two days' worth of interaction."
01:12:07.080 | - Yeah.
01:12:07.920 | - It's just interesting given how long you've chased him
01:12:10.960 | and then having met him,
01:12:11.920 | what was the difference to you as a human being?
01:12:14.480 | - He was a human being.
01:12:15.440 | He was an actual person.
01:12:17.480 | He was nervous when we arrested him.
01:12:20.440 | So one of the things that I learned
01:12:23.320 | through my law enforcement career is
01:12:25.320 | if I'm gonna be the case agent,
01:12:26.800 | I'm gonna be the one in charge of dealing with this person,
01:12:29.600 | I'm not putting handcuffs on him.
01:12:30.880 | Somebody else is gonna do that.
01:12:32.320 | Like I'm gonna be there to help him.
01:12:34.120 | I'm your conduit to help.
01:12:36.000 | And so right after someone's arrested,
01:12:38.320 | you obviously have patted them down for weapons
01:12:40.000 | to make sure for everybody's safety,
01:12:41.400 | but then I just put my hand on their chest,
01:12:43.520 | just feel their heart, feel their breathing.
01:12:45.960 | I'm sure it's the scariest day,
01:12:48.080 | but then to have that human contact
01:12:51.720 | kind of settles people down.
01:12:53.200 | And you can kind of like, "Let's start thinking about this.
01:12:55.560 | "I'm gonna tell you,
01:12:56.600 | "I'm gonna be open and honest with you."
01:12:58.720 | There's a lot of cops out there and federal agents, cops,
01:13:01.720 | that just go to the hard-ass tactic.
01:13:04.760 | You don't get very far with that.
01:13:06.360 | You don't get very far being a mean asshole to somebody.
01:13:09.880 | Be compassionate, be human,
01:13:12.160 | and it's gonna go a lot further.
01:13:13.640 | - So given everything he's done,
01:13:15.080 | you were still able to have compassion for him?
01:13:17.600 | - Yeah, we took him to the jail.
01:13:19.880 | So it was after hours,
01:13:22.520 | so he didn't get to see a judge that day.
01:13:24.040 | So we stuck him in the San Francisco jail.
01:13:26.840 | I hadn't slept for about four days
01:13:29.600 | because I was dealing with people in Iceland,
01:13:32.720 | bosses in DC, bosses in New York.
01:13:35.400 | So, and I was in San Francisco, so timeframe,
01:13:39.440 | like the Iceland people were calling me
01:13:40.720 | when I was supposed to be sleeping.
01:13:41.640 | It was insane.
01:13:42.720 | But I still went out that night
01:13:44.720 | while Ross sat in jail and bought him breakfast.
01:13:46.520 | I said, "What do you want for breakfast?
01:13:47.760 | "I'll have a nice breakfast for you."
01:13:49.440 | 'Cause we picked him up in the morning
01:13:50.920 | and took him over to the FBI to do the FBI booking,
01:13:53.720 | the fingerprints and all that.
01:13:55.240 | And I got him breakfast.
01:13:56.520 | I mean, and you don't get paid back for that sort of thing.
01:13:59.240 | I'm not looking, but out of my own--
01:14:00.840 | - Did he make special requests for breakfast?
01:14:02.880 | - Yeah, he asked for certain things.
01:14:04.440 | - What, can you mention, is that top secret FBI?
01:14:06.680 | - No, that's not top secret.
01:14:07.520 | I think he wanted some granola bars.
01:14:10.000 | And, you know, but I mean, he already had lawyered up,
01:14:13.560 | so we, you know, which is his right, he can do that.
01:14:16.000 | So I knew we weren't gonna work together,
01:14:18.600 | you know, like I did with Hector.
01:14:21.000 | But I mean, this is--
01:14:22.280 | - So most of the conversations--
01:14:23.120 | - His last day.
01:14:23.940 | - Most of the conversations have to be then with lawyers.
01:14:26.480 | - From that point on, I can't question him
01:14:28.720 | when he asked for a lawyer,
01:14:30.500 | or if I did, it couldn't be used against him.
01:14:33.240 | So we just had conversation where I talked to him.
01:14:36.000 | You know, he could say things to me,
01:14:38.080 | but then I have to remind him that he asked for a lawyer,
01:14:39.880 | and he'd have to waive that and all that.
01:14:41.280 | But we didn't talk about his case so much.
01:14:43.320 | We just talked about like human beings.
01:14:45.040 | - Did he, with his eyes, with his words,
01:14:49.080 | reveal any kind of regret,
01:14:52.600 | or did you see a human being changing,
01:14:55.400 | understanding something about themselves
01:14:57.000 | in the process of being caught?
01:14:59.320 | - No, I don't think that.
01:15:01.020 | I mean, he did offer me $20 million to let him go
01:15:03.340 | when we were driving to the jail.
01:15:06.340 | - Oh, no.
01:15:07.660 | - And I asked him what we were gonna do
01:15:09.300 | with the agent that sat in the front seat.
01:15:11.100 | - The money really broke him, huh?
01:15:13.060 | - I think so.
01:15:13.900 | I think he kind of got caught up in how much money it was,
01:15:17.580 | and how, you know, when crypto started, it was pennies,
01:15:20.980 | and by the time he got arrested, it was 120 bucks,
01:15:23.060 | and you know, 177,000 Bitcoins.
01:15:26.300 | Even today, you know, that's a lot of Bitcoins.
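For the arithmetic, using the two figures quoted here: 177,000 coins at roughly $120 apiece was already on the order of twenty million dollars on the day of the arrest.

```python
btc_seized = 177_000  # coins, as quoted in the conversation
usd_price  = 120      # rough USD per coin at arrest, as quoted

print(f"~${btc_seized * usd_price:,}")  # ~$21,240,000
```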
01:15:29.960 | - So you really could have been,
01:15:31.280 | if you continued, one of the richest people
01:15:33.640 | in the world.
01:15:34.640 | - I possibly could have been if I took that 20 million then.
01:15:37.640 | I could have been living,
01:15:38.480 | we could have this conversation in Venezuela.
01:15:40.640 | (laughing)
01:15:43.040 | - In a castle, in a palace.
01:15:44.960 | - Yeah, until it runs out,
01:15:46.000 | and then the government storms the castle.
01:15:48.480 | - Yeah.
01:15:49.320 | Have you talked to Ross since?
01:15:52.180 | - No, no.
01:15:53.360 | I'd be open to it.
01:15:54.840 | I don't think he probably wants to hear from me.
01:15:57.100 | - And do you know where, in which prison he is?
01:15:59.940 | - I think he's somewhere out in Arizona.
01:16:02.340 | I know he was in the one next to Supermax
01:16:04.820 | for a little while, like the high security one
01:16:06.980 | that's like, shares the fence with Supermax,
01:16:09.580 | but I don't think he's there anymore.
01:16:10.740 | I think he's out in Arizona.
01:16:11.820 | I haven't seen him in a while.
01:16:13.900 | - I wonder if he can do interviews in prison.
01:16:15.820 | That'd be nice.
01:16:16.820 | - Some people are allowed to,
01:16:18.340 | so I've not seen an interview with him.
01:16:20.940 | I know people have wanted to interview him
01:16:22.540 | about books and that sort of thing.
01:16:24.340 | - Right, because the story really blew up.
01:16:26.020 | Did it surprise you how much the story
01:16:28.820 | and many elements of it blew up?
01:16:31.740 | Movies, books?
01:16:32.580 | - It did surprise me.
01:16:34.460 | My wife's uncle, who I didn't,
01:16:36.300 | I've been married to my wife for 22 years now.
01:16:38.420 | I don't think he knew my name,
01:16:40.140 | and he was excited about that.
01:16:41.740 | He reached out when "Silk Road" came out,
01:16:43.800 | so that was surprising to see.
01:16:46.340 | - Did you think the movie on the topic was good?
01:16:49.820 | - I didn't have anything to do with that movie.
01:16:51.540 | I've watched it once.
01:16:52.660 | It was kind of cool that Jimmi Simpson
01:16:54.660 | played me, by name, in the movie,
01:16:55.780 | but outside of that, I thought it sort of
01:16:58.040 | missed the mark on some things.
01:16:59.260 | - When Hollywood, I don't think they understand
01:17:02.120 | what's interesting about these kinds of stories,
01:17:06.060 | and there's a lot of things that are interesting,
01:17:08.220 | and they missed all of them.
01:17:09.740 | So for example, I recently talked to John Carmack,
01:17:12.940 | who's a world-class developer and so on.
01:17:16.020 | So Hollywood would think that the interesting thing
01:17:19.140 | about John Carmack is some kind of shitty,
01:17:23.740 | like a parody of a hacker or something like that.
01:17:26.740 | They would show really crappy emulation
01:17:31.740 | of some kind of Linux terminal thing.
01:17:34.440 | The reality is the technical details,
01:17:36.420 | for five hours with him, for 10 hours with him,
01:17:38.580 | is what people actually wanna see,
01:17:39.700 | even people that don't program.
01:17:41.240 | They want to see a brilliant mind,
01:17:43.280 | the details that they're not,
01:17:45.620 | even if they don't understand all the details,
01:17:47.380 | they want to have an inkling of the genius
01:17:49.100 | that's there.
01:17:49.920 | That's just one way of saying that you wanna reveal
01:17:54.220 | the genius, the complexity of that world
01:17:57.180 | in interesting ways, and to make a Hollywood,
01:17:59.960 | almost parody caricature of it,
01:18:02.040 | it just destroys the spirit of the thing.
01:18:04.220 | So one, the FBI operation is fascinating,
01:18:09.220 | just tracking down these people
01:18:11.660 | on the cyber security front is fascinating.
01:18:13.920 | The other is just how you run Tor,
01:18:17.900 | how you run this kind of organization,
01:18:19.760 | the trust issues of the different criminal entities involved,
01:18:23.380 | the anonymity, the low-hanging fruit,
01:18:28.340 | the being shitty at certain parts on the technical front,
01:18:31.460 | all of those are fascinating things.
01:18:33.700 | That's what a movie should reveal.
01:18:35.580 | Should probably be a series, honestly,
01:18:37.180 | a Netflix series than a movie.
01:18:38.780 | - Yeah, an FX show or something like that,
01:18:40.820 | 'cause they're kinda gritty.
01:18:42.300 | - Yeah, yeah, gritty, exactly, gritty.
01:18:44.740 | I mean, shows like Chernobyl from HBO
01:18:47.540 | made me realize, okay, you can do a good job
01:18:49.580 | of a difficult story and reveal the human side,
01:18:53.540 | but also reveal the technical side
01:18:55.380 | and have some deep, profound understanding on that case,
01:18:58.260 | on the bureaucracy of a Soviet regime.
01:19:02.220 | In this case, you could reveal the bureaucracy,
01:19:04.740 | the chaos of a criminal organization,
01:19:07.700 | of a law enforcement organization.
01:19:09.820 | I mean, there's so much to explore.
01:19:11.380 | It's fascinating.
01:19:12.540 | - Yeah, I like Chernobyl.
01:19:13.580 | When I rewatch it, I can't watch episode three, though,
01:19:15.740 | the animal scene, the episode.
01:19:19.260 | They go around shooting all the dogs and all that.
01:19:21.100 | I gotta skip that part.
01:19:21.940 | - You're a big softie, aren't you?
01:19:23.260 | - I really am.
01:19:24.380 | I'm sure I'll probably cry at some point.
01:19:26.340 | (both laughing)
01:19:27.820 | - I love it, I love it.
01:19:29.860 | - Don't get me talking about that episode
01:19:31.340 | you made about your grandmother.
01:19:32.420 | Oh my God, that was rough.
01:19:34.620 | - Just to linger on this ethical versus legal question,
01:19:37.900 | what do you think about people like Aaron Swartz?
01:19:39.740 | I don't know if you're familiar with him,
01:19:41.140 | but he was somebody who broke the law
01:19:46.140 | in the name of an ethical ideal.
01:19:47.980 | He downloaded and released academic publications
01:19:52.980 | that were behind a paywall,
01:19:56.660 | and he was arrested for that and then committed suicide,
01:20:02.620 | and a lot of people see him,
01:20:05.260 | certainly in the MIT community,
01:20:06.700 | but throughout the world as a hero,
01:20:10.460 | because you look at the way knowledge,
01:20:15.460 | scientific knowledge is being put behind paywalls,
01:20:18.740 | it does seem somehow unethical,
01:20:20.780 | and he basically broke the law to do the ethical thing.
01:20:26.620 | Now, you could challenge it, maybe it is unethical,
01:20:31.980 | but there's a gray area, and to me at least, it is ethical.
01:20:36.980 | To me at least, he is a hero,
01:20:39.460 | because I'm familiar with the paywall
01:20:43.500 | created by the institutions that hold these publications.
01:20:48.500 | They're adding very little value.
01:20:51.820 | So it is basically holding hostage
01:20:55.180 | the work of millions of brilliant scientists
01:20:57.900 | for some kind of, honestly,
01:21:03.060 | a crappy capitalist institution.
01:21:05.500 | Like they're not actually making that much money.
01:21:07.300 | It doesn't make any sense to me.
01:21:08.900 | It should, to me, it should all be open public access.
01:21:13.900 | There's no reason it shouldn't be,
01:21:15.780 | all publications should be.
01:21:16.780 | So he stood for that ideal
01:21:19.300 | and was punished harshly for it.
01:21:23.060 | That's the other criticism, that it was too harsh.
01:21:25.420 | And of course, deeply unfortunately,
01:21:29.940 | that also led to his suicide,
01:21:31.660 | 'cause he was also tormented on many levels.
01:21:34.060 | I mean, are you familiar with him?
01:21:36.020 | What do you think about that line
01:21:37.740 | between what is legal and what is ethical?
01:21:40.100 | - So it's a tough case.
01:21:42.780 | I mean, the outcome was tragic, obviously.
01:21:45.260 | Unfortunately, when you're in law enforcement,
01:21:51.060 | you have to, your job is to enforce the laws.
01:21:53.660 | I mean, it's not, if you're told
01:21:55.540 | that you have to do a certain case,
01:21:57.540 | and there is a violation of, at the time,
01:22:00.260 | 18 USC 1030, computer hacking,
01:22:03.780 | you have to press forward with that.
01:22:07.500 | I mean, you have to charge,
01:22:09.260 | you bring the case to the U.S. Attorney's Office,
01:22:11.740 | and whether they're gonna press charges or not,
01:22:14.180 | you can't really pick and choose what you press
01:22:17.860 | and don't press forward.
01:22:18.900 | I never felt that, at least that flexibility,
01:22:20.980 | not in the FBI.
01:22:21.820 | I mean, maybe when you're a street cop
01:22:23.780 | and you pull somebody over,
01:22:24.660 | you can let them go with a warning.
01:22:26.380 | - So in the FBI, you're sitting in a room,
01:22:28.180 | but you're also a human being.
01:22:30.540 | You have compassion.
01:22:31.660 | You arrested Ross, the hand on the chest.
01:22:35.100 | I mean, that's a human thing.
01:22:37.180 | - Sure.
01:22:38.020 | - So there's a--
01:22:39.380 | - But I can't be the jury for whether
01:22:41.660 | it was a good hack or a bad hack.
01:22:44.460 | It's all someone, a victim has come forward
01:22:46.940 | and said, "We're the victim of this."
01:22:48.420 | And I agree with you, 'cause again,
01:22:50.460 | the basis of the internet was to share academic thought.
01:22:53.580 | I mean, that's where the internet was born.
01:22:55.740 | - But it's not up to you.
01:22:58.180 | So the role of the FBI is to enforce the law.
01:23:01.940 | - Correct.
01:23:05.420 | And there's a limited number of tools
01:23:08.580 | on our Batman belt that we can use.
01:23:10.860 | Not to get into all the aspects of the Trump case
01:23:14.620 | and Mar-a-Lago and the documents there.
01:23:16.580 | I mean, the FBI has so many tools they can use
01:23:19.860 | and a search warrant is the only way they could get in there.
01:23:22.780 | I mean, that's it.
01:23:24.060 | There's no other legal document or legal way
01:23:26.340 | to enter and get those documents.
01:23:28.660 | - What do you think about the FBI and Mar-a-Lago
01:23:32.820 | and the FBI taking the documents for Donald Trump?
01:23:36.180 | - You know, it's a tough spot.
01:23:37.620 | It's a really tough spot.
01:23:38.500 | The FBI has gotten a lot of black eyes recently.
01:23:42.540 | And I don't know if it's the same FBI
01:23:45.580 | that I remember when I was there.
01:23:47.820 | - Do you think they deserve it in part?
01:23:49.780 | Was it done clumsily, their raiding
01:23:54.220 | of the former president's residence?
01:23:56.580 | - It's tough.
01:24:00.740 | It's tough, you know, because again,
01:24:02.140 | they're only limited to what they're legally allowed to do
01:24:05.420 | and a search warrant is the only legal way of doing it.
01:24:08.860 | I have my personal and political views on certain things.
01:24:11.660 | I think it might be surprising to some
01:24:15.660 | where those political points stand.
01:24:19.580 | But-- - You told me offline
01:24:21.500 | that you're a hardcore communist.
01:24:23.020 | That was very surprising to me.
01:24:25.220 | - Well, that's only 'cause you tried to bring me
01:24:26.860 | into the Communist Party.
01:24:28.020 | - Exactly, I was trying to recruit you.
01:24:29.820 | I was giving you all kinds of flyers.
01:24:31.660 | Okay, but you said like, you know,
01:24:37.460 | people in the FBI are just following the law,
01:24:38.980 | but there's a chain of command and so on.
01:24:41.100 | What do you think about the conspiracy theories
01:24:42.780 | that people, some small number of people inside the FBI
01:24:47.780 | conspire to undermine the presidency of Donald Trump?
01:24:51.500 | - If you would have asked me when I was inside
01:24:54.020 | and before all this happened,
01:24:54.900 | I would say it never happened.
01:24:56.220 | I don't believe in conspiracies.
01:24:58.180 | You know, there's too many people involved.
01:24:59.700 | Somebody's gonna come out with some sort of information.
01:25:02.140 | But I mean, with more of the stuff that comes out,
01:25:04.860 | it's surprising that, you know,
01:25:06.460 | agents are being fired because of certain actions
01:25:08.780 | that are taken inside and being dismissed
01:25:12.180 | because of politically motivated actions.
01:25:14.760 | - So do you think it's explicit or just pressure?
01:25:19.500 | Just, do you think there could exist just pressure
01:25:21.900 | at the higher ups that has a political leaning
01:25:25.260 | and you kinda maybe don't explicitly order
01:25:28.060 | any kind of thing, but just kinda pressure people
01:25:29.900 | to lean one way or the other and then create a culture
01:25:32.700 | that leans one way or the other
01:25:34.220 | based on political leanings?
01:25:35.860 | - You would really, really hope not.
01:25:37.720 | But I mean, that seems to be the narrative
01:25:39.780 | that's being written.
01:25:41.380 | - But when you were operating,
01:25:43.020 | you didn't feel that pressure?
01:25:45.140 | - Man, I was at such a low level.
01:25:46.700 | You know, I had no aspirations of being a boss.
01:25:49.260 | I wanted to be a case agent my entire life.
01:25:51.220 | - So you love the puzzle of it, the chase.
01:25:54.140 | - I love solving things, yeah.
01:25:56.220 | To be management and manage people and all that,
01:25:58.660 | like no desire whatsoever.
01:26:01.020 | - What do you think about Mark Zuckerberg
01:26:05.620 | on Joe Rogan's podcast saying that the FBI
01:26:08.820 | warned Facebook about potential foreign interference?
01:26:11.480 | And then Facebook inferred from that
01:26:16.980 | that they're talking about the Hunter Biden laptop story
01:26:20.060 | and thereby censored it.
01:26:22.500 | What do you think about that whole story?
01:26:25.220 | - Again, if you'd asked me when I was in the FBI,
01:26:27.020 | I wouldn't have believed it from being on the inside,
01:26:28.900 | and I wouldn't believe these things.
01:26:30.100 | But there's a certain narrative being written
01:26:32.020 | that is surprising to me
01:26:33.500 | that the FBI is involved in these stories.
01:26:36.140 | - So, but the interesting thing there is
01:26:38.180 | the FBI is saying that they didn't really
01:26:41.300 | make that implication.
01:26:42.260 | They're saying that there's interference activity happening.
01:26:45.540 | Just watch out.
01:26:46.900 | And it's a weird relationship between FBI and Facebook.
01:26:50.260 | You could see from the best possible interpretation
01:26:53.300 | that the FBI just wants Facebook to be aware
01:26:55.820 | 'cause it is a powerful platform,
01:26:57.660 | a platform for viral spread of misinformation.
01:27:01.720 | So in the best possible interpretation of it,
01:27:05.100 | it makes sense for FBI to send some information
01:27:07.660 | saying like we're seeing some shady activity.
01:27:10.700 | - Absolutely.
01:27:11.540 | - But it seems like all of that somehow escalated
01:27:13.580 | to a political interpretation.
01:27:15.860 | - I mean, yeah, it sounded like there was a wink-wink
01:27:17.980 | with it.
01:27:20.020 | I don't know if Mark meant for that to be that way.
01:27:24.220 | Again, are we being social engineered
01:27:27.340 | or was that a true expression that Mark had?
01:27:31.180 | - And I wonder if the wink-wink is direct
01:27:34.100 | or it's just culture.
01:27:35.860 | You know, maybe certain people responsible
01:27:38.140 | on the Facebook side have a certain political lean
01:27:41.100 | and then certain people on the FBI side
01:27:43.140 | have a political lean when they're interacting together.
01:27:45.780 | And it's like literally has nothing to do
01:27:48.220 | with a giant conspiracy theory,
01:27:50.920 | but just with a culture that has a particular political lean
01:27:54.540 | during a particular time in history.
01:27:57.700 | And so like maybe it could be Hunter Biden laptop one time
01:28:02.700 | and then it could be whoever,
01:28:05.620 | Donald Trump Jr.'s laptop another time.
01:28:09.580 | - It's a tough job.
01:28:10.420 | I mean, if you're the liaison,
01:28:11.620 | if you're the FBI's liaison to Facebook,
01:28:14.380 | you know, there are certain people
01:28:17.580 | that, I'm sure, were offered a position at some point.
01:28:19.860 | You know, there are FBI agents that go over;
01:28:22.180 | I know of a couple that have gone to Facebook.
01:28:25.780 | One is a really good agent
01:28:27.020 | that now leads up their child exploitation stuff.
01:28:29.460 | Another squad mate runs their internal investigations,
01:28:33.900 | both great investigators.
01:28:35.060 | So, you know, there's good money,
01:28:36.820 | especially when you're an FBI agent that's capped out
01:28:38.760 | at a, you know, a GS-13 step 10 or whatever pay scale
01:28:41.600 | you're capped out at.
01:28:44.740 | It's alluring to, you know,
01:28:46.620 | maybe want to please them and be asked to join them.
01:28:51.620 | - Yeah, and over time that corrupts.
01:28:56.140 | I think there has to be an introspection in tech companies
01:28:59.620 | about the culture that they develop,
01:29:01.660 | about the political ideology, the bubble.
01:29:05.860 | It's interesting to see that bubble.
01:29:07.980 | Like I've asked myself a lot of questions.
01:29:13.940 | I've interviewed the Pfizer CEO,
01:29:16.460 | what seems now a long time ago,
01:29:20.420 | and I've gotten a lot of criticism,
01:29:23.020 | positive comments, but also criticism
01:29:24.940 | from that conversation.
01:29:26.260 | And I did a lot of soul searching
01:29:28.220 | about the kind of bubbles we have in this world.
01:29:30.620 | And it makes me wonder, pharmaceutical companies,
01:29:34.740 | they all believe they're doing good.
01:29:40.380 | And I wonder, because the ideal they have
01:29:44.560 | is to create drugs that help people and do so at scale.
01:29:49.280 | And it's hard to know at which point that can be corrupted.
01:29:54.940 | It's hard to know when it was corrupted
01:29:57.180 | and if it was corrupted and where,
01:29:58.860 | which drugs and which companies and so on.
01:30:01.460 | And I don't know.
01:30:03.980 | I don't know that complicated.
01:30:05.660 | It seems like inside a bubble,
01:30:07.060 | you can convince yourself that anything is good.
01:30:09.700 | People inside the Third Reich regime
01:30:13.100 | were able to convince themselves.
01:30:14.460 | I'm sure many did. "Bloodlands" is another book
01:30:18.900 | I've recently been reading about it.
01:30:21.220 | And the ability of humans to convince themselves they're doing good
01:30:23.980 | when they're clearly murdering and torturing people
01:30:27.740 | in front of their eyes is fascinating.
01:30:31.460 | They're able to convince themselves they're doing good.
01:30:33.500 | It's crazy.
01:30:35.140 | Like there's not even an inkling of doubt.
01:30:38.220 | Yeah, I don't know what to make of that.
01:30:40.340 | So it has taught me to be a little bit more careful
01:30:43.740 | when I enter into different bubbles
01:30:45.500 | to be skeptical about what's taken
01:30:51.540 | as an assumption of truth.
01:30:53.580 | Like you always have to be skeptical
01:30:56.220 | about what's assumed is true.
01:30:58.060 | Is it possible it's not true?
01:30:59.580 | You know, if you're talking about America,
01:31:05.300 | it's assumed that in certain places
01:31:08.460 | that surveillance is good.
01:31:10.100 | Well, let's question that assumption.
01:31:13.460 | Yeah, and also it inspired me to question my own assumptions
01:31:18.740 | that I hold as true constantly, constantly.
01:31:23.500 | It's tough, it's tough.
01:31:25.180 | - But otherwise you don't grow.
01:31:26.000 | I mean, do you wanna be just static and not grow?
01:31:28.260 | You have to question yourself on some of these things
01:31:30.540 | if you wanna grow as a person.
01:31:32.780 | - Yeah, for sure.
01:31:33.700 | Now, one of the tough things actually
01:31:35.300 | of being a public personality when you speak publicly
01:31:38.300 | is you get attacked all along the way as you're growing.
01:31:42.460 | In part because I'm a big softy as well, if I may say.
01:31:48.460 | And those hurt, it hurts, it hurts, it hurts.
01:31:52.020 | - Do you pay attention to it?
01:31:54.020 | - Yeah, yeah, yeah, yeah.
01:31:56.700 | It's very hard.
01:31:57.560 | Like I have two choices.
01:32:01.220 | One, you can shut yourself off from the world and ignore it.
01:32:05.700 | I never found that compelling,
01:32:07.060 | this kind of idea of like haters gonna hate.
01:32:09.320 | This idea that anyone with a big platform
01:32:15.660 | or anyone who's ever done anything has always gotten hate.
01:32:18.860 | Okay, maybe.
01:32:20.340 | But I still wanna be vulnerable,
01:32:22.740 | wear my heart on my sleeve, really show myself,
01:32:25.380 | open myself to the world, really listen to people.
01:32:28.060 | And that means every once in a while
01:32:30.300 | somebody will say something that touches me
01:32:32.540 | in a way that's like, what if they're right?
01:32:35.780 | - Do you let that hate influence you?
01:32:37.340 | I mean, can you be bullied into a different opinion
01:32:40.780 | than you think you really are just because of that hate?
01:32:43.460 | - No, no, I believe not, but it hurts
01:32:47.820 | in a way that's hard to explain.
01:32:49.420 | Yeah, it gets to like,
01:32:53.540 | it shakes your faith in humanity actually
01:32:56.860 | is probably why it hurts.
01:32:59.580 | Like people that call me a Putin apologist
01:33:04.580 | or a Zelensky apologist, which I'm currently getting
01:33:08.140 | almost an equal amount of, but it hurts.
01:33:13.100 | It hurts because I,
01:33:15.700 | it hurts because it damages slightly my faith in humanity
01:33:22.900 | to be able to see the love that connects us
01:33:28.900 | and then to see that I'm trying to find that.
01:33:32.620 | And that's, I'm doing my best
01:33:34.380 | in the limited capabilities I have to find that.
01:33:37.420 | And so to call me something like a bad actor,
01:33:41.100 | essentially from whatever perspective,
01:33:43.300 | it just makes me realize, well,
01:33:44.940 | people don't have empathy and compassion for each other.
01:33:48.980 | And it makes me question that for a brief moment.
01:33:51.300 | And that's like a crack and it hurts.
01:33:53.860 | - How many people do this to your face?
01:33:57.940 | - Very few.
01:33:59.220 | - It's online e-muscles, man.
01:34:00.500 | They're just flexing.
01:34:01.340 | - I have to be honest that it happens
01:34:05.620 | because I've hung around with Rogan enough.
01:34:09.460 | When your platform grows,
01:34:11.460 | there's people that will come up to Joe
01:34:13.300 | and say stuff to his face that they forget.
01:34:16.580 | They still, they forget he's an actual real human being.
01:34:21.300 | They'll make accusations about him.
01:34:23.660 | - So does that cause him to wall himself off more?
01:34:26.300 | - No, he's pretty gangster on that.
01:34:28.860 | But yeah, it still hurts.
01:34:30.900 | If you're human, if you really feel others,
01:34:35.460 | I think that's also the difference with Joe and me.
01:34:39.020 | He has a family that he deeply loves,
01:34:43.340 | and that's an escape from the world for him.
01:34:45.540 | There's a loneliness in me
01:34:50.060 | that I'm always longing to connect with people
01:34:52.460 | and with regular people.
01:34:54.300 | And just to learn their stories and so on.
01:34:57.700 | And so if you open yourself up that way,
01:34:59.900 | the things they tell you can really hurt in every way.
01:35:02.820 | Like just me going to Ukraine,
01:35:05.440 | just seeing so much loss and death.
01:35:08.960 | Some of it is, I mean, unforgettably haunting.
01:35:15.180 | Not in some kind of political way, activist way,
01:35:18.700 | or who's right, who's wrong way,
01:35:21.300 | but just like, man, like so much pain.
01:35:25.500 | You see it and it just stays with you.
01:35:27.380 | - When you see a human being bad to another human,
01:35:29.580 | you can't get rid of that in your head.
01:35:31.580 | You can't imagine that we can treat each other like that.
01:35:37.180 | That's the hard part, I think.
01:35:38.260 | I mean, for me it is.
01:35:41.180 | When I saw parents,
01:35:42.740 | like when I did the child exploitation stuff,
01:35:45.060 | when they rented their children out,
01:35:46.660 | they literally rented infant children out
01:35:48.580 | to others for sexual gratification.
01:35:51.380 | Like I don't know how a human being
01:35:52.980 | could do that to another human being.
01:35:55.460 | And that sounds like the kind of thing you're going through.
01:35:58.260 | I mean, I went through a huge funk
01:35:59.740 | when I did those cases afterwards.
01:36:01.860 | I should have talked to somebody,
01:36:03.180 | but in the FBI you have to keep that machismo up
01:36:06.600 | or they're gonna take your gun away from you.
01:36:08.460 | - Well, I think that's examples of evil
01:36:11.100 | that that's like the worst of human nature.
01:36:16.660 | But just because I have--
01:36:18.980 | - War is just as bad, I mean.
01:36:21.580 | - Somehow war, it's somehow understandable
01:36:26.580 | given all the very intense propaganda that's happening.
01:36:30.740 | So you can understand that there is love in the heart
01:36:35.740 | of the soldiers on each side
01:36:38.940 | given the information they're given.
01:36:40.840 | There's a lot of people on the Russian side
01:36:42.380 | who believe they're saving these Ukrainian cities
01:36:44.980 | from Nazi occupation.
01:36:48.260 | Now there are stories, there is a lot of evidence
01:36:53.260 | of people for fun murdering civilians.
01:36:57.500 | Now that is closer to the things you've experienced
01:37:01.940 | of like evil, of evil embodied.
01:37:06.940 | And I haven't interacted with that directly
01:37:10.460 | with people who for fun murdered civilians.
01:37:13.340 | - But you know it's there in the world.
01:37:14.780 | I mean, you're not naive to it.
01:37:16.220 | - Yes, but if you experience that directly,
01:37:18.940 | if somebody shot somebody for fun in front of me,
01:37:22.100 | that would probably break me, yeah.
01:37:24.440 | Knowing that it exists
01:37:27.460 | is different than seeing it yourself.
01:37:29.980 | Now I've interacted with the victims of that
01:37:32.380 | and they tell me stories and you see their homes destroyed,
01:37:37.460 | destroyed for no good military reason.
01:37:39.540 | It's civilians with civilian homes being destroyed.
01:37:42.500 | That really lingers with you.
01:37:44.680 | It's, yeah, the people that are capable of that.
01:37:48.520 | - That goes with the propaganda.
01:37:49.880 | I mean, if you were to build a story,
01:37:51.160 | you have to have on the other side,
01:37:54.880 | the homes are gonna be destroyed,
01:37:56.000 | the non-military targets are gonna be destroyed.
01:37:59.620 | - To put it in perspective, I'm not sure a lot of people
01:38:01.640 | understand the deep human side
01:38:04.800 | or even the military strategy side of this war.
01:38:07.720 | There's a lot of experts outside of the situation
01:38:10.320 | that are commenting on it with certainty.
01:38:12.400 | And that kind of hurts me because I feel like
01:38:14.200 | there's a lot of uncertainty.
01:38:15.920 | There's so much propaganda,
01:38:17.240 | it's very difficult to know what is true.
01:38:19.280 | Yeah, so my whole hope was to travel to Ukraine,
01:38:27.600 | to travel to Russia, to talk to soldiers,
01:38:30.360 | to talk to leaders, to talk to real people
01:38:32.800 | that have lost homes, that have lost family members,
01:38:35.880 | who this war has divided,
01:38:39.120 | who this war changed completely how they see the world.
01:38:42.600 | Whether they have love or hate in their heart
01:38:44.400 | to understand their stories.
01:38:46.440 | I've learned a lot on the human side of things
01:38:48.720 | by having talked to a lot of people there.
01:38:51.120 | But it has been on the Ukrainian side for me currently.
01:38:54.320 | Traveling to the Russian side is more difficult.
01:38:56.720 | Let me ask you about your now friend,
01:39:02.680 | can we go as far as to say your friend, Sabu,
01:39:05.960 | Hector Monsegur.
01:39:08.600 | What's the story, what's your long story with him?
01:39:13.600 | Can you tell me about what is LulzSec,
01:39:16.960 | who is Sabu, and who's Anonymous, what is Anonymous?
01:39:22.560 | Where's the right place to start that story?
01:39:24.480 | - Probably Anonymous.
01:39:25.960 | Anonymous was a, it still is I guess,
01:39:28.920 | a decentralized organization.
01:39:31.360 | They call themselves headless,
01:39:32.560 | but once you look into them a little ways,
01:39:34.480 | they're not really headless.
01:39:37.320 | The power struggle comes with whoever has a hacking ability.
01:39:42.200 | That might be that you're a good hacker
01:39:45.120 | or you have a giant botnet used for DDoS.
01:39:47.880 | So you're gonna wield more power
01:39:50.520 | if you can control where it goes.
01:39:52.160 | Anonymous started doing their hacktivism stuff
01:39:55.560 | in 2010 or so.
01:39:57.880 | The word hack was in the media all the time then.
01:40:00.400 | And then right around then,
01:40:03.160 | there was a federal contractor named HBGary Federal.
01:40:06.360 | Their CEO was Aaron Barr.
01:40:08.880 | And Aaron Barr said he was gonna come out
01:40:10.640 | and de-anonymize Anonymous.
01:40:13.000 | He's gonna come out and talk at Black Hat or Defcon
01:40:15.440 | or one of those and say who they are.
01:40:17.320 | He figured it out based on when people were online,
01:40:22.320 | when people were in IRC, when tweets came out.
01:40:25.480 | There was no scientific proof behind it or anything.
01:40:28.360 | So he was just gonna falsely name people
01:40:30.040 | that were in Anonymous.
01:40:31.760 | So Anonymous went on the attack.
01:40:34.320 | They went and hacked in HBGary Federal
01:40:36.800 | and they turned his life upside down.
01:40:38.240 | They took over his Twitter account
01:40:39.280 | and all that stuff pretty quickly.
01:40:42.440 | - I have very mixed feelings about all of this.
01:40:45.000 | - Okay.
01:40:45.840 | - I get, like part of me
01:40:49.040 | admires the positive side of the hacktivism.
01:40:55.080 | - Okay.
01:40:56.440 | - Is there no room for admiration there
01:40:59.360 | of the fuck you to the man?
01:41:01.120 | - Not at the time.
01:41:02.080 | Again, it was a violation
01:41:03.840 | of 18 USC 1030, so it was my job.
01:41:06.560 | It's what I did, you know, so at the time, no.
01:41:08.160 | In retrospect, sure.
01:41:09.240 | - But what was the philosophy of the hacktivism?
01:41:13.360 | The philosophically, were they at least expressing it
01:41:17.600 | for the good of humanity or no?
01:41:19.920 | - They outwardly said that they were gonna go after people
01:41:22.600 | that they thought were corrupt.
01:41:24.000 | So they were judge and jury on corruption.
01:41:25.920 | They were gonna go after it.
01:41:27.520 | Once you get inside and realize what they were doing,
01:41:30.720 | they were going after people
01:41:33.000 | that they had an opportunity to go after.
01:41:36.040 | So maybe someone had a zero day
01:41:37.800 | and then they searched for servers vulnerable to that zero day.
01:41:40.920 | And then from there, let's find a target.
01:41:43.400 | I mean, one time they went after a toilet paper company.
01:41:46.640 | I still don't understand what that toilet paper company did,
01:41:49.320 | but it was an opportunity to make a splash.
01:41:51.760 | - Is some of it for the joke, for the lulz?
01:41:55.200 | - It developed into that.
01:41:56.480 | So I think the hacktivism and the anonymous stuff
01:41:58.840 | wasn't so much for the lulz,
01:42:01.120 | but from that HBGary Federal hack,
01:42:02.960 | then there were six guys that worked well together
01:42:04.880 | and they formed a crew, a hacking crew,
01:42:06.600 | and they kind of split off into their own private channels.
01:42:09.120 | And that was LulzSec; "laughing at your security"
01:42:11.800 | was their motto.
01:42:12.740 | - So that's L-U-L-Z-S-E-C, LulzSec.
01:42:18.120 | - Of course it is.
01:42:19.080 | - LulzSec.
01:42:19.920 | And who founded that organization?
01:42:26.440 | - So Kayla and Sabu were the hackers of the group.
01:42:30.720 | And so they really did all the work on HBGary.
01:42:33.080 | So they're-
01:42:33.920 | - These are code names.
01:42:35.120 | - Yeah, they're online names.
01:42:36.640 | They're nicks.
01:42:38.340 | And so, you know, that's all they knew each other as.
01:42:44.320 | You know, they talked as those names.
01:42:46.260 | And they worked well together.
01:42:48.440 | And so they formed a hacking crew
01:42:49.880 | and that's when they started the,
01:42:51.880 | at first they didn't name it this,
01:42:53.160 | but it was the 50 days of lulz,
01:42:54.680 | where they would just release major, major breaches.
01:42:58.800 | And it stirred up the media.
01:43:00.520 | I mean, it put hacking in the media every day.
01:43:03.780 | They had 400 or 500,000 Twitter followers.
01:43:07.240 | You know, and it was kind of interesting.
01:43:11.080 | But then they started swinging at the beehive
01:43:14.440 | and they took out some FBI affiliated sites.
01:43:18.120 | And then they started Fuck FBI Fridays,
01:43:21.480 | where every Friday they would release something.
01:43:23.440 | And we waited with bated breath.
01:43:25.120 | I mean, they had us hook, line, and sinker pissed.
01:43:28.040 | We were waiting to see what was gonna be dropped
01:43:29.640 | every Friday.
01:43:30.480 | It was, it's a little embarrassing looking back on it now.
01:43:33.280 | - And this is in the early 2010s.
01:43:35.320 | - Yeah, this was 2010, 2011, around there.
01:43:38.480 | - So actually linger on Anonymous.
01:43:42.640 | What, do we still understand what the heck is Anonymous?
01:43:46.280 | - It's just a place where you hang out.
01:43:47.480 | I mean, it's just, it started on 4chan, went to 8chan.
01:43:50.400 | It's really just anyone.
01:43:51.680 | You could be in Anonymous right now if you wanted to.
01:43:53.600 | Just you're in there hanging out in the channel.
01:43:55.120 | Now, you're probably not gonna get much cred
01:43:56.960 | until you work your way up and prove who you are
01:43:58.800 | or someone vouches for you.
01:44:00.680 | But anybody can be in Anonymous.
01:44:02.320 | Anybody can leave Anonymous.
01:44:03.520 | - What's the leadership of Anonymous?
01:44:04.960 | Do you have a sense that there is a leadership?
01:44:07.000 | - There's a power play.
01:44:08.560 | Now, is that someone that says this is what we're doing
01:44:11.520 | and all we're doing?
01:44:12.560 | - I love the philosophical and the technical aspect
01:44:17.560 | of all of this.
01:44:18.800 | But I think there is a slippery slope
01:44:20.880 | to where for the lulz,
01:44:22.080 | you can actually really hurt people.
01:44:26.040 | That's the terrifying thing.
01:44:28.080 | When you're attached, I'm actually really terrified
01:44:31.480 | of the power of the lulz.
01:44:33.440 | It's the fun thing somehow becomes a slippery slope.
01:44:38.080 | I haven't quite understood the dynamics of that.
01:44:40.540 | But even in myself, if you just have fun with a thing,
01:44:45.440 | you lose track of the ethical grounding of the thing.
01:44:48.600 | And so like, it feels like hacking for fun
01:44:50.780 | can just, like, literally lead to nuclear war.
01:44:54.360 | Like literally destabilize nations.
01:44:56.760 | - Yeah, yada, yada, yada, nuclear war.
01:44:58.960 | I could see it, yeah.
01:45:00.840 | - So I've been more careful with the lulz.
01:45:03.280 | Yeah, I've been more careful about that.
01:45:07.920 | And I wonder about it because in internet speak,
01:45:10.680 | somehow ethics can be put aside
01:45:12.800 | through the slippery slope of language.
01:45:16.980 | I don't know, everything becomes a joke.
01:45:19.480 | If everything's a joke, then everything's allowed,
01:45:21.340 | and if everything's allowed,
01:45:22.720 | then you don't have a sense of what is right and wrong.
01:45:24.640 | You lose sense of what is right and wrong.
01:45:26.220 | - You still have victims.
01:45:27.060 | I mean, you're laughing at someone.
01:45:28.440 | Someone's the butt of this joke.
01:45:30.180 | Whether it's major corporations or the individuals.
01:45:33.380 | I mean, some of the stuff they did was just
01:45:35.580 | releasing people's PII, their personally identifiable
01:45:38.000 | information, and stuff like that.
01:45:39.280 | I mean, is it a big deal?
01:45:41.080 | I don't know, maybe, maybe not.
01:45:42.640 | But if you could choose to not have your information
01:45:45.640 | put out there, probably wouldn't.
01:45:48.800 | - Do we have a sense of what Anonymous is today?
01:45:51.720 | Has it ever been one stable organization
01:45:54.520 | or is it a collection of hackers that kind of emerge
01:45:57.960 | for particular tasks, for particular,
01:46:01.820 | like, hacktivism tasks and that kind of stuff?
01:46:05.080 | - It's a collection of people that has some hackers in it.
01:46:08.140 | There's not a lot of big hackers in it.
01:46:11.120 | I mean, there's some that'll come bouncing in and bounce out.
01:46:14.160 | Even back then, there was probably just as many
01:46:17.520 | reporters in it, people in the media in it
01:46:20.280 | with the hackers at the time, just trying to get
01:46:23.600 | the inside scoop on things.
01:46:25.720 | Some giving the inside scoop.
01:46:27.360 | We arrested a reporter that gave over the username
01:46:30.520 | and password to his newspaper,
01:46:32.480 | just so he could break the story.
01:46:35.360 | He trusted him.
01:46:37.920 | - Speaking of trust, reporters, boy, there's good ones.
01:46:43.960 | There's good ones.
01:46:45.040 | - There are.
01:46:45.880 | - There are.
01:46:47.320 | But boy, do I have a complicated relationship with them.
01:46:49.800 | - How many stories about you are completely true?
01:46:53.000 | - You can just make stuff up on the internet.
01:46:55.520 | And one of the things that, I mean,
01:46:57.840 | there's so many fascinating psychological,
01:47:00.240 | sociological elements of the internet to me.
01:47:03.920 | One of them is that you can say that
01:47:07.000 | Lex is a lizard, right?
01:47:11.480 | And if it's not funny, so lizard is kind of funny,
01:47:14.720 | what should we say?
01:47:17.280 | Lex has admitted to being an agent of the FBI.
01:47:22.280 | You can just say that, right?
01:47:24.320 | And then the response that the internet will be like,
01:47:26.560 | oh, is that true?
01:47:27.960 | I didn't realize that.
01:47:29.320 | They won't go like, provide evidence, please.
01:47:33.920 | They'll just say like, oh, that's weird.
01:47:35.800 | I kind of thought he might be kind of weird.
01:47:38.640 | And then it piles on, it's like, hey, hey, hey, guys.
01:47:41.560 | Here's a random dude on the internet
01:47:44.520 | just said a random thing.
01:47:45.480 | You can't just like pile up as, and then--
01:47:47.840 | - Yeah, Johnny6969 is now a source that says.
01:47:51.160 | - And then like, the thing is I'm a tiny guy,
01:47:54.440 | but when it grows, if you have a big platform,
01:47:59.360 | I feel like newspapers will pick that up
01:48:01.960 | and then they'll start to build on a story.
01:48:04.440 | And you never know where that story really started.
01:48:06.640 | It's so cool.
01:48:07.480 | I mean, to me, actually, honestly, it's kind of cool
01:48:09.520 | that there's a viral nature of the internet
01:48:12.280 | that can just fabricate truth completely.
01:48:14.560 | I think we have to accept that new reality
01:48:17.080 | and try to deal with it somehow.
01:48:18.660 | You can't just like complain that Johnny69
01:48:21.000 | can start a random thing, but I think
01:48:24.600 | in the best possible world, it is the role of the journalist
01:48:28.280 | to be the adult in the room and put a stop to it
01:48:32.280 | versus look for the sexiest story
01:48:35.160 | so that there could be clickbait that can generate money.
01:48:38.600 | Journalism should be about sort of slowing things down,
01:48:43.880 | thinking deeply through what is true or not
01:48:46.280 | and showing that to the world.
01:48:47.360 | I think there's a lot of hunger for that.
01:48:49.320 | And I think that would actually get the most clicks
01:48:51.400 | in the end.
01:48:52.240 | - I mean, it's that same pressure I think we're talking about
01:48:53.900 | with the FBI and with the tech companies about controversy.
01:48:57.280 | I mean, the editors have to please and get those clicks.
01:49:00.160 | I mean, they're measured by those clicks.
01:49:02.680 | So, I'm sure the journalists, the true journalists,
01:49:06.360 | the good ones out there want that,
01:49:08.000 | but they wanna stay employed too.
01:49:10.520 | - Can I actually ask you, as another tangent,
01:49:13.160 | about Jared and others that are doing undercover work:
01:49:17.000 | in terms of the tools you have
01:49:20.000 | for catching cybersecurity criminals,
01:49:22.200 | how much of it is undercover?
01:49:23.880 | - Undercover is a high bar to jump over.
01:49:26.880 | You have to do a lot to start an undercover in the FBI.
01:49:30.280 | There's a lot of thresholds.
01:49:31.680 | So, it's not your first investigative step.
01:49:36.280 | You have to identify a problem
01:49:37.840 | and then show that the lower steps can't get you there.
01:49:42.760 | But I mean, I think we had an undercover
01:49:45.960 | going on in the squad at all times.
01:49:47.640 | When one was being shut down or taken down,
01:49:50.560 | we were spinning up another one.
01:49:52.920 | So, it's a good tool to have and utilize.
01:49:57.400 | They're a lot of work.
01:49:58.480 | I don't think, if you run one,
01:50:00.040 | you'll ever run another one in your life.
01:50:02.600 | - Oh, so it's like psychologically,
01:50:05.360 | there's a lot of work just technically,
01:50:07.520 | but also psychologically, like you have to really-
01:50:09.960 | - It's 24/7, you're inside that world.
01:50:11.760 | Like you have to know what's going on and what's happening.
01:50:14.120 | You're taking on, you have to remember who you are
01:50:17.920 | when you're, 'cause you're a criminal online.
01:50:20.920 | You have to go to a special school for it too.
01:50:22.800 | - Was that ever something compelling to you?
01:50:25.000 | - I went through the school,
01:50:26.120 | but I'm a pretty open and honest guy.
01:50:28.760 | And so, it's tough for me to build that wall of lies.
01:50:31.660 | Maybe I'm just not smart enough
01:50:34.000 | to keep all the lies straight.
01:50:35.400 | - Yeah, but a guy who's good at building up a wall of lies
01:50:37.720 | would say that exact same thing.
01:50:38.680 | - Exactly.
01:50:39.520 | - It's so annoying the way truth works
01:50:41.640 | in this world.
01:50:42.560 | It's like, people have told me,
01:50:45.120 | because I'm trying to be honest and transparent,
01:50:47.360 | that's exactly what an agent would do, right?
01:50:50.160 | But I feel like an agent would not wear a suit and tie.
01:50:55.040 | - I wore a suit and tie every day.
01:50:57.540 | I was a suit and tie guy.
01:50:58.600 | - You were?
01:50:59.440 | - Yeah, every day.
01:51:00.260 | I remember one time I wore shorts in and the SAC came in.
01:51:03.400 | And this was when I was a rockstar at the time in the Bureau
01:51:06.000 | and I had shorts in and I said,
01:51:08.400 | "Sorry, ma'am, I apologize for my attire."
01:51:10.440 | And she goes, "You can wear bike shorts in here,
01:51:12.280 | "I wouldn't care."
01:51:13.120 | I was like, "Oh, shit, that sounds nice."
01:51:15.080 | I never wore the bike shorts, but.
01:51:17.920 | - Yeah.
01:51:19.200 | But see, I don't see a suit and tie as constraining.
01:51:21.880 | I think it's liberating, in a sense.
01:51:24.080 | It's like, shows that you're taking the moment seriously.
01:51:26.720 | - Well, not just that, people wanted it.
01:51:28.160 | I mean, people expected that
01:51:30.120 | you are dressed like a perfect FBI agent.
01:51:32.600 | When someone knocks on their door,
01:51:33.960 | that's what they wanna see.
01:51:35.200 | They wanna see what Hollywood built up
01:51:36.700 | is what an FBI agent is.
01:51:38.160 | You show up like my friend, Il-Won.
01:51:39.880 | He was always dressed in t-shirts and shorts.
01:51:42.280 | People aren't gonna take him serious.
01:51:43.400 | They're not gonna give him what they want.
01:51:44.560 | - I wonder how many people could just show up
01:51:46.520 | and say I'm from the FBI and start interrogating them.
01:51:49.560 | Like at a bar.
01:51:50.400 | - Probably.
01:51:51.240 | - Like how--
01:51:52.060 | - Oh, definitely, if they've had a few drinks,
01:51:53.040 | you can definitely.
01:51:53.880 | Well, but people are gonna recognize you.
01:51:55.120 | That's the only problem.
01:51:56.440 | That's another thing.
01:51:57.280 | You start taking out big cases.
01:51:58.560 | You can't work cases anymore in the FBI.
01:52:00.440 | Your face gets out there.
01:52:01.920 | - Your name too.
01:52:02.960 | - Yeah, yeah.
01:52:03.800 | - Well, actually, let me ask you about that
01:52:07.560 | before we return to our friend, Sabu.
01:52:09.440 | - Okay.
01:52:10.280 | - You've tracked and worked on
01:52:15.760 | some of the most dangerous people in this world.
01:52:18.160 | Have you ever feared for your life?
01:52:22.080 | - So I had to make a really, really
01:52:24.400 | shitty phone call one time.
01:52:26.720 | I was sitting in the bureau,
01:52:28.120 | and this was right after Silk Road,
01:52:30.800 | and Jared called me.
01:52:32.560 | He was back in Chicago.
01:52:34.200 | And he called me and said,
01:52:35.200 | "Hey, your name and your kid's name
01:52:37.880 | are on a website for an assassination.
01:52:40.040 | They're paying to have you guys killed."
01:52:42.480 | Now, these things happen on the black market.
01:52:44.640 | They come up, you know,
01:52:46.040 | and people debate whether they're real or not.
01:52:49.400 | But we have to take it serious.
01:52:50.720 | Someone's paying to have me killed.
01:52:52.480 | So I had to call my wife, and we had a word,
01:52:55.400 | in that if I said this word,
01:52:57.840 | and we only said it one time to each other,
01:52:59.760 | if I said this, this is serious.
01:53:01.360 | Drop what you're doing and get to the kids.
01:53:03.600 | And so I had to drop the word to her.
01:53:07.880 | And I could feel the breath come out of her,
01:53:12.120 | 'cause she thought her kids were in danger,
01:53:14.520 | at the time they were.
01:53:15.560 | I wasn't in a state of mind to drive myself.
01:53:19.680 | So an agent on the squad, a girl named Evelina,
01:53:22.960 | she drove me, lights and sirens,
01:53:24.480 | all the way to my kid's school.
01:53:26.040 | And we had it locked down, I called the school.
01:53:29.800 | We were in a lockdown.
01:53:30.920 | Nobody should get in or out,
01:53:33.600 | especially someone with a gun.
01:53:35.720 | The first thing they did was let me
01:53:36.960 | in the building with a gun.
01:53:38.400 | So I was a little disappointed with that.
01:53:40.960 | My kids were, I think, kindergarten and fifth grade,
01:53:44.000 | or somewhere around there, maybe closer to second,
01:53:46.440 | I'm not sure.
01:53:47.440 | But all hell broke loose,
01:53:50.160 | and we had to, from there, go move into a safe house.
01:53:54.200 | I live in New York City.
01:53:55.520 | NYPD surrounded my house.
01:53:57.480 | The FBI put cameras outside my house.
01:53:59.360 | You couldn't drive in my neighborhood
01:54:00.600 | without your license plate being read.
01:54:03.880 | Hey, why is this person here?
01:54:04.840 | Why is that person there?
01:54:06.760 | I got to watch my house on an iPad while I sat at my desk.
01:54:09.640 | But again, I put my family through that,
01:54:13.600 | and it scared the shit out of 'em.
01:54:15.880 | And to be honest, I think it's sort of summed up by
01:54:18.800 | my mother-in-law's words, which were,
01:54:21.600 | "I thought you did cybercrime."
01:54:23.560 | (laughing)
01:54:25.960 | And because during Silk Road,
01:54:27.360 | I didn't tell my family what I was working on.
01:54:28.760 | I wouldn't talk about that.
01:54:30.480 | I wanna escape that.
01:54:31.320 | I don't wanna be there.
01:54:32.600 | I remember that, so when I was in the FBI,
01:54:35.080 | driving in, I used to go in at 4.30 every morning,
01:54:38.520 | 'cause I like to go to the gym before I go to the desk.
01:54:40.600 | So I'd be at the desk at seven,
01:54:41.800 | so in the gym at five, a couple hours, and then go.
01:54:46.000 | The best time I had was that drive in the morning
01:54:50.920 | where I could just be myself.
01:54:52.120 | I listened to a sports podcast out of DC.
01:54:54.920 | They talked about sports and the Nationals
01:54:59.720 | and whatever it was, the Capitals.
01:55:01.840 | It was great to not think about Silk Road for 10 minutes.
01:55:05.280 | But that was my best time, but yeah, again, so yeah.
01:55:08.720 | I've had that move into the safe house.
01:55:11.520 | I left my MP5 at home.
01:55:13.200 | That's the Bureau's machine gun.
01:55:15.440 | Showed my wife how to just pull and spray.
01:55:19.240 | - But how often did you live or work
01:55:24.120 | and live with fear in your heart?
01:55:26.400 | - It was only that time.
01:55:27.400 | I mean, for actual physical security,
01:55:30.200 | then, I mean, after the Anonymous stuff,
01:55:32.280 | I really tightened down my cybersecurity.
01:55:36.400 | I don't have social media.
01:55:39.240 | I don't have pictures of me and my kids online.
01:55:41.640 | I don't really, if I go to a wedding or something,
01:55:43.720 | I say, "I don't take my picture with my kids,"
01:55:46.160 | if you're gonna post it someplace or something like that.
01:55:48.560 | So that sort of security I have.
01:55:50.240 | But just like everybody, you start to relax a little bit
01:55:55.480 | and security breaks down 'cause it's not convenient.
01:55:59.600 | - But it's also part of your job,
01:56:01.000 | so you're much better at,
01:56:04.280 | like, I mean, your job now and your job before,
01:56:06.920 | so you're probably much better at taking care
01:56:08.760 | of the low-hanging fruit, at least.
01:56:10.520 | - I understand the threat,
01:56:12.840 | and I think that's what a lot of people don't understand,
01:56:14.640 | is understanding what the threat against them is.
01:56:17.800 | So I'm aware of that and what possibly,
01:56:20.720 | and I think about it, you know?
01:56:22.120 | I think about things.
01:56:23.320 | I do remember, so you tripped a memory in my mind.
01:56:28.280 | I remember a lot of times, and I had a gun on my hip,
01:56:30.800 | I still carry a gun to this day,
01:56:33.040 | opening my front door and being concerned
01:56:35.080 | what was on the other side,
01:56:37.240 | walking out of the house 'cause I couldn't see it.
01:56:39.800 | I remember those four o'clocks, heading to the car.
01:56:42.400 | I was literally scared.
01:56:46.160 | - Yeah.
01:56:47.000 | I mean, having seen some of the things you've seen,
01:56:50.960 | it makes you perhaps question
01:56:52.960 | how much evil there is out there in the world,
01:56:56.640 | how many dangerous people there are out there,
01:56:59.000 | crazy people even.
01:57:01.440 | - There's a lot of crazy, there's a lot of evil.
01:57:05.640 | Most people, I think, that get into cyber crime
01:57:08.320 | are just opportunistic, not necessarily evil.
01:57:11.760 | They don't really know, maybe think about the victim.
01:57:14.480 | They just do it as a crime of opportunity.
01:57:16.800 | I don't label that as evil.
01:57:20.040 | - And one of the things about America
01:57:22.320 | that I'm also very happy about
01:57:25.080 | is that rule of law, despite everything we talk about,
01:57:28.400 | it's tough to be a criminal in the United States.
01:57:32.000 | So if you walk outside your house,
01:57:36.320 | you're much safer than you are
01:57:38.000 | in most other places in the world.
01:57:40.200 | - You're safer and the system's tougher.
01:57:43.520 | I mean, LulzSec, six guys,
01:57:45.960 | one guy in the United States, five guys other places.
01:57:48.560 | Hector was facing 125 years.
01:57:52.000 | Those guys got slaps on the wrist
01:57:53.440 | and went back to college.
01:57:55.600 | You know, different laws, different places.
01:57:58.360 | - So who's Hector?
01:57:59.440 | Tell me the story of Hector.
01:58:01.880 | So this LulzSec organization was started.
01:58:03.840 | So before that, Hector
01:58:05.500 | was part of Anonymous.
01:58:08.840 | He was doing all kinds of hacking stuff,
01:58:10.840 | but then he launched LulzSec.
01:58:13.040 | - He's an old school hacker.
01:58:14.080 | I mean, he learned how to hack
01:58:16.280 | and I don't wanna tell his story,
01:58:17.520 | but he learned to hack
01:58:19.360 | because he grew up in the Lower East Side of New York
01:58:21.640 | and picked up some NYPD computers
01:58:25.560 | that were left on the sidewalk for trash.
01:58:28.440 | Taught himself how to--
01:58:29.280 | - He doesn't exactly look like a hacker.
01:58:30.640 | For people who don't know,
01:58:31.480 | he looks, I don't know exactly what he looks like,
01:58:33.800 | but not like a technical, not what you would imagine.
01:58:38.800 | But perhaps that's a Hollywood portrayal.
01:58:42.520 | - Yeah, I think you get in trouble these days
01:58:43.880 | saying what a hacker looks like.
01:58:47.520 | I don't know if they have a traditional look.
01:58:49.040 | Just like I said, Hollywood has an idea,
01:58:50.920 | an FBI looks like.
01:58:52.200 | I don't think you can do that anymore.
01:58:53.280 | I don't think you can say that anymore.
01:58:55.560 | - Well, he certainly has a big personality
01:58:57.840 | and charisma and all that kind of stuff.
01:58:59.980 | - That's Sabu.
01:59:01.080 | - I can see him selling me anything.
01:59:04.600 | - That's Sabu.
01:59:05.440 | - Convincing me of anything.
01:59:07.440 | - Two different people.
01:59:08.280 | There's Sabu and there's Hector.
01:59:09.240 | Hector is a sweet guy.
01:59:11.320 | He likes to have intellectual conversations
01:59:14.240 | and that's just his thing.
01:59:15.760 | He'd rather just sit there
01:59:18.040 | and have a one-on-one conversation with you.
01:59:19.880 | But Sabu, that's a ruthless motherfucker.
01:59:22.600 | - And you first met Sabu.
01:59:24.440 | - I was tracking Sabu.
01:59:25.960 | That's all I knew was Sabu.
01:59:27.840 | I didn't know Hector.
01:59:28.720 | - So when did your paths cross in terms of tracking?
01:59:32.360 | When did you first take on the case?
01:59:34.320 | - The spring of '11.
01:59:36.920 | - So it was through Anonymous.
01:59:38.440 | - Through Anonymous, and really kind of LulzSec.
01:59:40.880 | LulzSec was a big thing
01:59:44.640 | and it was pushed out to all the cyber,
01:59:47.120 | 56 field offices in the FBI.
01:59:49.600 | Most of them have cyber squads or cyber units.
01:59:52.840 | And so it was being pushed out there
01:59:55.160 | and it was in the news every day,
01:59:56.480 | but it really wasn't ours.
01:59:57.760 | So we didn't have a lot of victims
01:59:59.280 | in our AOR area of responsibility.
02:00:01.840 | And so we just kind of pay attention to it.
02:00:04.880 | Then I got a tip that a local hacker in New York
02:00:07.400 | had broken into AOL.
02:00:09.400 | And so Olivia Olsen and I,
02:00:12.840 | she's another agent who she's still in.
02:00:14.400 | She's a supervisor out in LA.
02:00:15.640 | She's a great agent.
02:00:17.480 | We went all around New York looking for this kid
02:00:20.080 | just to see what we can find
02:00:21.640 | and ended up out in Staten Island at his grandmother's house.
02:00:26.120 | She didn't know where he was, obviously, why would she?
02:00:29.360 | But I left my card.
02:00:30.400 | He gave me a call that night and started talking to me.
02:00:34.000 | And I said, "Let's just meet up tomorrow
02:00:36.320 | at the McDonald's across from 26th and."
02:00:38.840 | And he came in and three of us sat there and talked
02:00:42.480 | and gave me his stuff.
02:00:45.280 | He started telling me about all the felonies
02:00:46.920 | he was committing those days, including that break into AOL.
02:00:50.040 | And then he finally says, "I can give you Sabu."
02:00:54.520 | Sabu to us was the Keyser Söze of hacking.
02:00:56.880 | He was our guy.
02:00:58.480 | He was the guy that was in the news
02:00:59.600 | that was pissing us off.
02:01:00.920 | - So he was part of the FBI Fridays?
02:01:04.720 | - Sabu was, yeah.
02:01:05.600 | Oh, he led it.
02:01:06.760 | Yeah, he was the leader of Fuck FBI Fridays.
02:01:09.080 | So yeah.
02:01:10.120 | - What was one of the more memorable FFFs?
02:01:15.640 | (sighs)
02:01:17.640 | - I said, "How do you get,
02:01:19.400 | how and why do you go after the beehive?
02:01:22.760 | That's kind of intense."
02:01:24.040 | - It gets you on the news, it's the lulz.
02:01:27.080 | It's funnier to go after the big ones.
02:01:29.680 | You know, and they weren't getting like real FBI.
02:01:31.720 | They weren't breaking into FBI mainframes or anything,
02:01:33.560 | but they were hitting affiliate sites or anything related to it;
02:01:38.320 | a lot of law enforcement stuff was coming out.
02:01:42.000 | But, you know, we looked back
02:01:45.520 | and thought, if this kid knew Sabu,
02:01:47.640 | maybe there was a chance we could use him to lure Sabu out.
02:01:51.600 | But we also said,
02:01:52.440 | "Well, maybe this kid knows Sabu in real life."
02:01:55.680 | And so we went and looked through the IPs,
02:01:58.360 | and in 10 million IPs, we found one and it belonged to him.
02:02:03.360 | And so that day, someone had doxxed Sabu
02:02:03.360 | and we were a little afraid he was gonna be on the run.
02:02:07.960 | We had a surveillance team
02:02:09.760 | and FBI surveillance teams are awesome.
02:02:12.040 | Like you cannot even tell their FBI agents.
02:02:14.280 | They are really that good.
02:02:16.200 | I mean, there's baby strollers
02:02:17.760 | and all whatever you wouldn't expect an FBI agent to have.
02:02:20.480 | - So that's a little like the movies.
02:02:22.120 | - A little bit, yeah.
02:02:22.960 | I mean, it is true, but they fit into the area.
02:02:25.720 | So now they're on the Lower East Side,
02:02:27.120 | which is, you know, a baby stroller
02:02:28.800 | might not fit in there as well.
02:02:30.120 | You know, somebody just laying on the ground
02:02:31.600 | or something like that.
02:02:33.280 | They really get in, play the character and get into it.
02:02:36.080 | - So now I can never trust a baby stroller again.
02:02:39.160 | - Well, probably shouldn't.
02:02:40.640 | - Every baby, I'm just like, look at stare at them suspiciously.
02:02:43.720 | - Especially if the mom's wearing cargo pants
02:02:45.520 | while she pushes it.
02:02:46.360 | - Yeah, so if it's like a very stereotypical mom
02:02:49.800 | or stereotypical baby, I'm gonna be very suspicious.
02:02:52.760 | I'm gonna question the baby.
02:02:53.600 | - That baby's wired, be careful.
02:02:55.440 | You know, we raced out there
02:02:58.760 | and like our squad's not even full.
02:03:00.640 | There's only a few guys there.
02:03:01.720 | And like I said, I was a suit guy,
02:03:03.720 | but that day I had shorts and a t-shirt on.
02:03:05.400 | I had a white t-shirt on and I only bring it up
02:03:08.080 | 'cause Sabu makes fun of me to this day.
02:03:09.720 | So I had a bulletproof vest and a white t-shirt on
02:03:11.640 | and that was it.
02:03:12.520 | I had shorts too and all that, but raced over to there.
02:03:15.920 | We didn't have any equipment.
02:03:17.680 | We brought our boss's boss's boss.
02:03:20.320 | He stopped off at NYPD, got us like a ballistic shield
02:03:23.920 | and a battery and RAM if we needed it.
02:03:25.920 | And then we get to Hector's house, Sabu's house,
02:03:29.880 | and he's on the sixth floor.
02:03:31.760 | And so normally, you know, we're the cyber dork squad.
02:03:36.360 | We'd hop in the elevator; six floors is a long way
02:03:39.080 | to go up in a bulletproof vest with a ballistic shield.
02:03:40.880 | But we had been caught in an elevator before on a search.
02:03:44.600 | So we didn't, we took the stairs.
02:03:46.880 | We get to the top, a tad winded, but knocking the door
02:03:52.840 | and this big towering guy opens the door just slightly.
02:03:56.400 | And he sees the green vest with big yellow letters FBI
02:04:00.640 | and he steps outside.
02:04:02.280 | Can I help you?
02:04:04.240 | And tries to social engineer us.
02:04:06.560 | But eventually we get our way inside the house.
02:04:08.920 | You know, I noticed a few things that are kind of
02:04:12.600 | out of place.
02:04:13.800 | There's a laptop charger and a flashing modem.
02:04:17.360 | And I said, well, do you have a computer here?
02:04:19.040 | And he said, no, there's no computer here.
02:04:21.640 | So we knew the truths and then the half lies
02:04:25.040 | and all that sort of thing.
02:04:25.880 | So it took about another two hours and he finally gave up
02:04:28.680 | that he was Sabu, that he was the guy we were looking for.
02:04:31.960 | So we sat there and we kind of showed him
02:04:33.760 | sort of the evidence we had against him.
02:04:35.760 | And, you know, from his words, we sat there and talked,
02:04:39.600 | talked like two grown adults and, you know,
02:04:43.320 | I gave him the options and he said, well,
02:04:45.320 | let's talk about working together.
02:04:48.920 | - So he chose to become an informant.
02:04:52.760 | - I don't think he chose that night,
02:04:54.120 | but that's where it kind of went to.
02:04:56.360 | So then we brought him down to the FBI that night,
02:04:59.640 | which was, it was a funny trip
02:05:02.080 | 'cause I'm sitting in the back seat of the car with him.
02:05:04.840 | And I was getting calls from all over the US
02:05:08.440 | from different FBI agents saying
02:05:09.720 | that we arrested the wrong guy.
02:05:11.800 | I was like, I don't think so.
02:05:13.520 | And they're like, why do you think so?
02:05:14.360 | I was like, 'cause he says it's him.
02:05:16.640 | And they still said, no, it's the wrong guy.
02:05:18.520 | So I said, well, we'll see how it plays out.
02:05:20.920 | - That's so interesting 'cause it's such a strange world.
02:05:24.440 | Such a strange world 'cause it's tough to,
02:05:27.680 | 'cause you still have to prove it's the same guy, right?
02:05:30.640 | 'Cause the anonymity.
02:05:32.080 | - Yeah, I mean, we had his laptop by that point.
02:05:35.120 | - Yeah, I know, but-- - Him saying, that helped.
02:05:37.360 | I gave him a clue in my world.
02:05:38.860 | - Yeah, yeah.
02:05:42.200 | - But yeah, if he would have fought it,
02:05:43.280 | I mean, that definitely would have come in as evidence
02:05:45.160 | that other FBI agents are saying it's not him.
02:05:47.680 | You have to disclose that stuff.
02:05:49.080 | - So you had a lot of stuff on him.
02:05:50.840 | What was he facing if--
02:05:54.600 | - He was facing 125 years.
02:05:56.040 | - 125 years in prison.
02:05:58.680 | Now that's if you took every charge we had against him
02:06:00.880 | and ran them consecutively.
02:06:04.440 | No one ever gets charged like that,
02:06:05.900 | but yeah, essentially it would have been 125 years.
02:06:09.160 | Fast forward to the end,
02:06:10.200 | he got thanked by the judge for his service
02:06:12.400 | after nine months.
02:06:14.240 | And he walked out of the court a free man.
02:06:17.080 | - But that's while being an informant.
02:06:20.560 | - Yes.
02:06:21.400 | - Well, so the word informant here
02:06:25.960 | really isn't that good.
02:06:29.240 | It's not fitting, even though technically,
02:06:31.400 | I guess that's what he was,
02:06:33.120 | but he didn't know the other people.
02:06:35.720 | It was all anon, he knew nicks and all that.
02:06:38.600 | He really gave us the insight
02:06:40.480 | of what was happening in the hacker world.
02:06:43.000 | Like I said, he was an old school hacker.
02:06:44.600 | Back when hackers didn't work together with anonymous,
02:06:47.120 | he was down with Cult of the Dead Cow and those type guys,
02:06:49.760 | like way back.
02:06:51.600 | He was around for that.
02:06:52.480 | He's like an encyclopedia of hacking.
02:06:54.400 | But, you know, we just--
02:06:55.960 | - So I guess his prime was in the '90s.
02:06:58.400 | - For terror hack, but yeah,
02:06:59.800 | he kind of came back when Anonymous started going
02:07:02.560 | after MasterCard and PayPal and all that,
02:07:04.240 | during the WikiLeaks stuff.
02:07:06.160 | - But even that little interaction, being an informant,
02:07:09.640 | he probably made a lot of enemies.
02:07:11.720 | How do you protect a guy like that?
02:07:13.920 | - He made enemies after it was revealed?
02:07:15.800 | - Yeah.
02:07:17.360 | - How does the FBI protect him?
02:07:19.080 | Good luck.
02:07:22.080 | I mean, perhaps I'll talk to him one day,
02:07:25.080 | but is that guy afraid for his life?
02:07:28.720 | - I, again, I think--
02:07:29.880 | - He doesn't seem like it.
02:07:30.920 | - He has very good security for himself, cyber security.
02:07:34.700 | But, you know, yeah, he doesn't like
02:07:38.840 | the negative things said about him online.
02:07:40.720 | I don't think anybody does.
02:07:42.080 | But, you know, I think it's so many years
02:07:45.480 | of the internet kind of bitching at you and all that,
02:07:49.480 | you get calloused, it's just internet bitching.
02:07:52.600 | - And also the hacking world moves on very quickly.
02:07:55.840 | He is kind of, they have their own wars to fight now,
02:08:00.840 | and he's not part of those wars anymore.
02:08:03.800 | - There's still people out there
02:08:04.840 | that bitch and moan about him,
02:08:06.560 | but yeah, I think it's less.
02:08:08.680 | I think, you know, he has a good message out there
02:08:12.200 | of, you know, trying to keep kids
02:08:16.400 | from making the same mistakes he made.
02:08:18.560 | He tries to really preach that.
02:08:20.320 | - How do people get into this line of work?
02:08:26.120 | Are there all kinds of ways? Not your line of work,
02:08:30.040 | his line of work, just all the stories you've seen
02:08:33.320 | of people that are in Anonymous and LulzSec and Silk Road
02:08:38.080 | and all the cyber criminals you've interacted with.
02:08:40.780 | What's the profile of a cyber criminal?
02:08:43.880 | - I don't think there's a profile anymore.
02:08:45.560 | You know, I used to be able to say, you know,
02:08:47.140 | the kid in his mom's basement or something like that,
02:08:49.320 | but it's not true anymore.
02:08:51.160 | You know, like, it's wide.
02:08:53.240 | It's like, I've arrested people
02:08:57.000 | that you wouldn't expect would be cyber criminals.
02:08:59.500 | - And it's in the United States, it's international,
02:09:03.000 | it's everything?
02:09:03.840 | - Oh, it's international.
02:09:04.660 | I mean, we're seeing a lot of the big hackers now.
02:09:07.520 | The big arrests for hackers in England, surprisingly.
02:09:10.440 | You know, there's a lot of good hackers
02:09:14.440 | like down in Brazil,
02:09:15.400 | but I don't think Brazilian law enforcement
02:09:17.620 | is as good at hunting them down.
02:09:19.680 | So you're not gonna see the big arrests.
02:09:21.540 | - How many state-sponsored cyber attacks are there,
02:09:26.540 | do you think?
02:09:28.120 | - More than you can imagine.
02:09:29.520 | And what do you wanna call an attack?
02:09:32.740 | A successful attack or just a probing?
02:09:35.220 | - Probing for information, just like feeling, you know,
02:09:40.060 | testing where the attack vectors are,
02:09:43.460 | trying to collect all the possible attack vectors.
02:09:45.600 | - Put a Windows 7 machine on the internet, forward-facing,
02:09:47.800 | put a packet sniffer on there,
02:09:49.560 | and look at where the traffic comes from.
02:09:51.160 | I mean, in 24 hours, you're gonna fill up a hard drive
02:09:53.740 | with packets just coming at it.
02:09:55.200 | - Yeah.
02:09:56.040 | - I mean, it's not hard to know.
02:09:57.720 | I mean, it's just constantly probing
02:10:00.020 | for entry points into things, you know?
02:10:02.120 | You could go mad putting up honeypots
02:10:05.520 | that draw in intrusions, just to see what methodologies they use.
02:10:07.840 | - Just to see what's out there.
02:10:08.920 | - Yeah, and it doesn't go anywhere.
02:10:10.320 | It maybe has fake information and stuff like that.
02:10:13.460 | You know, it's kind of to see what's going on
02:10:16.020 | and judge what's happening on the internet.
02:10:18.340 | Get a, you know, lick your finger
02:10:20.280 | and test the wind of what's happening these days.
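As a rough sketch of the experiment he's describing, assuming the third-party scapy library (pip install scapy) and root privileges, counting where unsolicited traffic comes from might look like this:

```python
# Hypothetical sketch: put a box on a public IP and tally the sources
# of unsolicited packets hitting it. Requires scapy and root.
from collections import Counter
from scapy.all import IP, sniff

sources = Counter()

def record(pkt):
    # Count every source IP that sends us anything.
    if IP in pkt:
        sources[pkt[IP].src] += 1

# Sniff for 60 seconds without storing packets, then print top talkers.
sniff(prn=record, timeout=60, store=False)
for src, count in sources.most_common(10):
    print(f"{src}: {count} packets")
```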
02:10:22.460 | - The funny thing is, because I'm at MIT,
02:10:25.820 | that attracted even more attention,
02:10:28.180 | not for the lulz, but for the technical challenge.
02:10:31.020 | It seems like people enjoy hacking MIT.
02:10:33.340 | Just the amount of traffic MIT was getting for that,
02:10:36.820 | in terms of just the sheer number of attacks
02:10:38.820 | from different places is crazy.
02:10:40.340 | Yeah, like, just like that,
02:10:41.420 | putting up a machine, seeing what comes.
02:10:43.340 | - NASA used to be the golden ring.
02:10:44.600 | Now everybody got NASA.
02:10:45.640 | That was like the early '90s.
02:10:46.840 | If you could hack NASA, that was the,
02:10:48.440 | now, yeah, MIT is a big one.
02:10:50.300 | - Yeah, it's fun.
02:10:51.240 | It's fun to see. (laughs)
02:10:53.920 | Respect.
02:10:54.760 | 'Cause I think in that case,
02:10:56.840 | it comes from a somewhat good place,
02:10:58.880 | 'cause, you know, they're not getting any money from MIT.
02:11:01.360 | (both laugh)
02:11:02.440 | It's more for the challenge.
02:11:04.440 | Well, let me ask you about that,
02:11:05.680 | about this world of cybersecurity.
02:11:09.740 | How big of a threat are cyberattacks
02:11:11.460 | for companies and for individuals?
02:11:13.900 | Like, let's lay out, where are we in this world?
02:11:17.900 | What's out there?
02:11:19.140 | - It's the wild, wild west.
02:11:20.900 | And it's, I mean,
02:11:24.140 | people want the idea of security, but it's inconvenient,
02:11:29.380 | so they don't, they push back on it.
02:11:31.860 | And there are a lot of opportunistic nation state,
02:11:35.900 | financially motivated hackers, hackers for the lulz.
02:11:38.300 | You got three different tiers there.
02:11:40.420 | And they're on the prowl.
02:11:43.100 | They have tools.
02:11:44.060 | They have really good tools that are being used against us.
02:11:47.940 | - And at what scale?
02:11:49.500 | So when you're thinking of,
02:11:51.260 | I don't know what's, let's talk about companies first.
02:11:55.900 | So say you're talking to a mid-tier company.
02:12:00.540 | I wonder what the most interesting business is.
02:12:03.420 | So Google, we can look at large tech companies,
02:12:06.580 | or we can look at medium-sized tech companies.
02:12:10.420 | And like, you are sitting in a room with a CTO,
02:12:12.980 | with a CEO, and the question is, how fucked are we?
02:12:17.780 | And what should we do?
02:12:18.620 | What's the low-hanging fruit?
02:12:19.740 | What are the different strategies
02:12:21.700 | that those companies should consider?
02:12:24.380 | - I mean, the problem is they want a push button.
02:12:26.380 | They want an out-of-the-box solution that says,
02:12:29.060 | I'm secure, you know?
02:12:30.020 | They want to tell people they're secure, but--
02:12:32.340 | - And that's very challenging to have.
02:12:33.820 | - It's impossible.
02:12:34.780 | But if I could, if someone had it,
02:12:36.500 | they'd be a billionaire.
02:12:37.820 | They'd be beyond a billionaire,
02:12:40.940 | 'cause that's what everybody wants.
02:12:42.580 | So you can buy all the tools you want.
02:12:45.740 | It's configuring them the proper way.
02:12:47.580 | And if anyone's trying to tell you
02:12:49.940 | that there's one solution that fits all,
02:12:52.940 | they're snake oil salesmen.
02:12:53.860 | And there's a lot of people in cybersecurity
02:12:55.740 | that are snake oil salesmen.
02:12:57.220 | - Yeah, and I feel like there's tools,
02:12:58.620 | if they're not configured correctly,
02:13:01.060 | they don't increase security significantly,
02:13:04.100 | and they introduce a lot of pain for the people.
02:13:06.140 | They decrease the efficiency
02:13:08.460 | of the actual work you have to do.
02:13:11.740 | So like, I was at Google for a time,
02:13:15.520 | and I think mostly I want to give props
02:13:20.980 | to their security efforts.
02:13:23.340 | User data, so like data that belongs to users,
02:13:28.340 | is like the holy grail,
02:13:30.580 | like the amount of security they have around that
02:13:35.380 | is incredible.
02:13:36.220 | Almost any time I had to work with
02:13:40.380 | anything even resembling user data,
02:13:42.260 | and I never got a chance to work with actual user data,
02:13:44.620 | only things resembling that,
02:13:45.860 | first of all, you have no access to the internet.
02:13:48.020 | It's impossible to even come close
02:13:49.780 | to having access to the internet.
02:13:51.020 | And there's so much pain to actually
02:13:54.420 | interact with that data.
02:13:57.100 | I mean, it was extremely inefficient.
02:14:00.900 | In places where I thought
02:14:02.620 | it didn't have to be that inefficient,
02:14:04.220 | the security was too much.
02:14:05.820 | But I have to give respect to that,
02:14:07.660 | 'cause in that case,
02:14:08.780 | you want to err on the side of security.
02:14:10.580 | But that's Google.
02:14:11.700 | They were doing a good job of this.
02:14:13.580 | - The reputational harm, if it got out.
02:14:15.100 | I mean, Google, why is Google Drive free?
02:14:18.300 | Because they want your data.
02:14:20.500 | They want you to park your data there.
02:14:21.780 | So if they got hacked or leaked information,
02:14:26.400 | the reputational harm would be tremendous.
02:14:29.140 | - But for a company that's not Google,
02:14:32.300 | it's really hard to do that, right?
02:14:33.460 | A company that's not as big as Google
02:14:35.060 | or not as tech-savvy as Google
02:14:37.820 | might have a lot of trouble with doing that kind of stuff.
02:14:39.740 | Instead of increasing security,
02:14:41.780 | they'll just decrease the efficiency.
02:14:45.780 | - Well, yeah.
02:14:46.620 | So there's a big difference between IT and security.
02:14:49.340 | And unfortunately, these mid-size companies,
02:14:51.340 | they try to stack security into their IT department.
02:14:54.340 | Your IT department is about business continuity.
02:14:56.560 | They're about trying to move business forward.
02:14:58.500 | They want users to get the data they need
02:15:00.660 | to do their job so the company can grow.
02:15:02.660 | Security is not that.
02:15:04.180 | They don't want you to get the data.
02:15:06.340 | But there's fine-tuning you can do to ensure that.
02:15:11.100 | I mean, as simple as having good onboarding procedures
02:15:13.980 | for employees.
02:15:14.820 | Like, you come into my company,
02:15:16.860 | you don't need access to everything.
02:15:18.060 | Maybe you need access to something for one day.
02:15:20.380 | Turn the access on, don't leave it on.
02:15:22.140 | I mean, I was the victim of the OPM hack,
02:15:24.540 | the Office of Personnel Management,
02:15:26.300 | because old credentials from a third-party vendor
02:15:29.260 | were sitting there inactive.
02:15:30.780 | And the Chinese government found those credentials
02:15:32.660 | and were able to log in and steal all my information.
02:15:34.940 | - So a lot could be helped
02:15:36.780 | if you just control the credentials, the access,
02:15:39.020 | the access control, how long they last.
02:15:41.940 | And people who need access to a certain thing
02:15:45.580 | only get access to that thing and nothing else.
02:15:47.920 | And then it just gets refreshed like that.
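A toy sketch of the time-boxed access he's describing; the names and the in-memory store are illustrative, not any real IAM system, but the principle is that grants expire instead of sitting active forever (the failure mode behind the OPM breach):

```python
# Hypothetical sketch: grants carry an expiry, and access is denied
# once the window closes. Real systems would persist and audit this.
import time

grants = {}  # (user, resource) -> expiry timestamp

def grant(user, resource, ttl_seconds):
    grants[(user, resource)] = time.time() + ttl_seconds

def allowed(user, resource):
    expiry = grants.get((user, resource))
    return expiry is not None and time.time() < expiry

grant("new_hire", "payroll_db", ttl_seconds=8 * 3600)  # one workday
print(allowed("new_hire", "payroll_db"))    # True, inside the window
print(allowed("old_vendor", "payroll_db"))  # False, never granted
```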
02:15:50.140 | - Access control, yeah. Like we said,
02:15:51.660 | setting up people properly, and when they're leaving the company,
02:15:54.340 | get rid of their access, they don't need it.
02:15:56.260 | Two-factor authentication, that's a big thing.
02:16:00.340 | I mean, I sound like a broken record
02:16:02.060 | because this isn't anything new.
02:16:03.300 | This isn't rocket science.
02:16:04.360 | The problem is we're not implementing it.
02:16:06.060 | If we are, we're not doing it correctly
02:16:08.460 | because these guys are taking us.
02:16:10.700 | - Well, two-factor authentication is a good example
02:16:12.900 | of something that I just was annoyed by
02:16:16.620 | for the longest time.
02:16:18.060 | Because yes, it's very good,
02:16:19.740 | but it seems that it's pretty easy to implement horribly
02:16:24.620 | to where it's not convenient at all
02:16:27.060 | for the legitimate user to use.
02:16:29.300 | It should be trivial to do,
02:16:31.280 | like to authenticate yourself twice should be super easy.
02:16:35.720 | - If security is slightly inconvenient for you,
02:16:39.460 | think about how inconvenient it is for a hacker,
02:16:41.580 | and how they're just gonna move on to the next person.
02:16:43.100 | - Yes, yes, in theory, if we implemented it extremely well.
02:16:46.460 | But I just don't think we do.
02:16:50.180 | I think actually if it's inconvenient,
02:16:53.780 | it shows that the system hasn't been thought through a lot.
02:16:57.260 | - Do you know why we need two-factor authentication?
02:17:00.060 | People using the same password across different sites.
02:17:02.320 | So when one site is compromised,
02:17:04.020 | people just take that username and password,
02:17:05.900 | it's called credential stuffing
02:17:07.060 | and just stuff it across the internet.
02:17:08.540 | So if 10 years ago, when we told everybody,
02:17:11.020 | "Don't use the same fucking password across the internet,
02:17:13.460 | "across vulnerable sites," people had actually listened,
02:17:14.720 | maybe two-factor wouldn't be needed.
02:17:16.340 | - Yeah, so you wouldn't need two-factor
02:17:17.940 | if everyone did a good job with passwords.
02:17:20.700 | - Yeah.
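One defensive counter to credential stuffing is refusing passwords that already appear in breach dumps. A minimal sketch, assuming the requests library, against the Pwned Passwords k-anonymity API, where only the first five characters of the hash ever leave your machine:

```python
# Check how many times a password appears in known breach corpora.
import hashlib
import requests

def times_breached(password: str) -> int:
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # The API returns all suffixes sharing our 5-char prefix.
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}")
    resp.raise_for_status()
    for line in resp.text.splitlines():
        candidate, count = line.split(":")
        if candidate == suffix:
            return int(count)
    return 0

print(times_breached("password123"))  # huge count: reused everywhere
```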
02:17:22.020 | - Right, but I'm saying like the two-factor authentication,
02:17:26.220 | it should be super easy to authenticate myself
02:17:29.620 | with some other device really quickly.
02:17:33.100 | Like it should be frictionless.
02:17:35.980 | - Like you just hit OK?
02:17:37.060 | - OK, and anything that belongs to me, yeah.
02:17:40.580 | And it should, very importantly,
02:17:43.420 | be easy to set up what belongs to me.
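For reference, app-based second factors are typically TOTP (RFC 6238): a shared secret plus the current time, hashed into a short code, which is why hitting OK can be nearly frictionless. A minimal sketch with the pyotp library; the secret here is generated on the spot rather than provisioned by any real service:

```python
# Hypothetical sketch of time-based one-time passwords (TOTP).
import pyotp

secret = pyotp.random_base32()  # normally provisioned once via QR code
totp = pyotp.TOTP(secret)

code = totp.now()        # what the authenticator app would display
print(code)
print(totp.verify(code)) # server side: True within the time window
```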
02:17:46.180 | I don't know the full complexity
02:17:48.700 | of the cyber attacks these platforms are under.
02:17:51.620 | They're probably under insane amount of attacks.
02:17:54.660 | - Yeah, you've got it right there.
02:17:56.100 | People have no idea, these large companies,
02:17:58.700 | how often they're attacked, on a per second basis.
02:18:02.540 | And they have to fight all that off
02:18:03.620 | and pick out the good traffic in there.
02:18:05.980 | So yeah, there's no way I'd wanna run a large tech company.
02:18:10.980 | (Lex laughing)
02:18:12.420 | - Well, what about protecting individuals, for individuals?
02:18:15.580 | What's good advice to try to protect yourself
02:18:20.140 | from this increasingly dangerous world of cyber attacks?
02:18:23.420 | - Again, educate yourself that you understand
02:18:25.420 | there is a threat.
02:18:26.260 | First, you have to realize that.
02:18:27.700 | Then you're gonna step up
02:18:29.100 | and you're gonna do stuff a little bit more.
02:18:31.260 | Sometimes, I guess, I think I take that
02:18:32.580 | to a little bit of an extreme.
02:18:33.540 | I remember one time, my mom called me
02:18:36.900 | and she was screaming that,
02:18:39.100 | "I woke up this morning and I just clicked on a link
02:18:42.500 | "and now my phone is making weird noises."
02:18:44.780 | And I was like, "Throw your phone in a glass of water.
02:18:47.740 | "Just put it in a glass of water right now."
02:18:49.340 | And I made my mom cry.
02:18:51.180 | It was not a pleasant thing.
02:18:53.700 | So sometimes I go to a little extremes on those ones.
02:18:56.540 | But understanding there's a risk
02:18:59.060 | and making it a little bit more difficult
02:19:01.500 | to become a victim.
02:19:02.660 | I mean, just understanding certain things.
02:19:05.420 | Simple things like, as we add more Internet of Things devices
02:19:10.380 | to people's houses, I mean,
02:19:11.460 | how many wifi networks do people have?
02:19:13.100 | It's normally just one.
02:19:13.980 | And you're bumping your phones
02:19:15.260 | and giving your password to people who come to visit.
02:19:17.620 | Set up a guest network.
02:19:18.660 | Set up something you can change every 30 days.
02:19:20.540 | Simple little things like that.
02:19:23.100 | I hate to remind you, but change your passwords.
02:19:25.180 | I mean, I feel like I'm a broken record again.
02:19:27.140 | But just make it more difficult for others to victimize you.
02:19:31.060 | - And then don't use the same password everywhere.
02:19:33.820 | - That, yes.
02:19:36.300 | I mean-
02:19:37.140 | - I still know people that do that.
02:19:38.660 | - I mean, ask.fm got popped last week, two weeks ago.
02:19:42.260 | And that's 350 million username and passwords
02:19:44.980 | with connected Twitter accounts, Google accounts,
02:19:47.620 | all the different social media accounts.
02:19:50.780 | That is a treasure trove for the next two and a half,
02:19:53.660 | three years of just using those credentials everywhere.
02:19:57.980 | And even if it's not the right password,
02:20:00.700 | you'll learn people's password styles.
02:20:03.180 | Bad guys are making portfolios out of people.
02:20:06.220 | They're figuring out how people generate their passwords,
02:20:08.540 | and once you've figured that out,
02:20:09.820 | it's easier to crack their password.
02:20:12.860 | They're making a dossier on each person.
02:20:14.940 | It's 350 million dossiers just in that one hack.
02:20:17.100 | Yahoo, there was half a billion.
02:20:19.620 | - So the thing a hacker would do with that
02:20:22.260 | is try to find all the low-hanging fruit,
02:20:25.220 | like have some kind of program that,
02:20:27.140 | yeah, evaluates the strength of the passwords,
02:20:31.060 | and then finds the weak ones.
02:20:33.500 | That means that this person is probably the kind of person
02:20:35.620 | that would use the same password across multiple sites.
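A minimal sketch of that triage, assuming the zxcvbn strength estimator (pip install zxcvbn) and an obviously made-up leaked list:

```python
# Score a dump of passwords and flag the weak ones first.
from zxcvbn import zxcvbn

leaked = ["password1", "Tr0ub4dor&3", "correct horse battery staple"]

for pw in leaked:
    score = zxcvbn(pw)["score"]  # 0 (easily guessed) .. 4 (strong)
    if score <= 1:
        print(f"weak, likely reused: {pw!r}")
```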
02:20:38.340 | - Or even just write a program to do that.
02:20:39.900 | Remember the Ring hack a couple of years ago?
02:20:41.900 | That's all it was, it was credential stuffing.
02:20:43.380 | So Ring, the security system, by default
02:20:45.900 | had two-factor, but didn't turn it on.
02:20:47.780 | And they also had a setting to not allow unlimited tries
02:20:50.980 | to log into my account.
02:20:51.900 | You can lock it out after 10 tries, but by default it's not turned on,
02:20:54.980 | 'cause it's not convenient for people.
02:20:56.700 | Ring's attitude was, I want people
02:20:58.700 | to stick these little things up
02:21:00.020 | and have security in their house,
02:21:02.060 | but cybersecurity, don't make it inconvenient,
02:21:04.860 | then people won't buy our product.
02:21:06.560 | That's how they got hacked.
02:21:08.340 | You'd want to say that it was insecure
02:21:10.220 | and got hacked into,
02:21:11.060 | reputational harm right there for Ring, but it wasn't that.
02:21:12.780 | It was just credential stuffing.
02:21:14.500 | People bought usernames and passwords on the black market
02:21:17.980 | and just wrote a bot that went through Ring
02:21:21.580 | and used every one of them, at maybe a 1% hit rate,
02:21:24.120 | but that's a big hit to the number of Ring users.
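The control Ring shipped but left off by default is just an attempt counter. A toy in-memory version, illustrative only; a real system would persist state and add backoff:

```python
# Lock an account after too many consecutive failed logins.
failed = {}  # username -> consecutive failures
MAX_ATTEMPTS = 10

def try_login(user, password_ok):
    if failed.get(user, 0) >= MAX_ATTEMPTS:
        return "locked out"
    if password_ok:
        failed[user] = 0
        return "ok"
    failed[user] = failed.get(user, 0) + 1
    return "denied"

# A credential-stuffing bot dies on attempt 11.
for _ in range(12):
    print(try_login("victim", password_ok=False))
```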
02:21:27.380 | - You know, you can use also password managers
02:21:29.180 | to make the changing of the passwords easier.
02:21:32.200 | - And you can change the difficulty,
02:21:36.340 | the number of special characters,
02:21:37.780 | the length of it and all that.
02:21:39.280 | - My favorite thing is websites
02:21:43.140 | that yell at you for your password being too long
02:21:45.580 | or having too many special characters,
02:21:47.260 | or like, yeah, you're not allowed
02:21:50.700 | to have this special character or something.
02:21:52.620 | - You can only use these three special characters.
02:21:55.120 | Do you understand how password cracking works,
02:21:59.100 | if you specifically tell me
02:22:01.060 | which special characters I can use?
02:22:02.960 | - I honestly just want to have a one-on-one meeting,
02:22:07.580 | like late at night with the engineer that programmed that,
02:22:10.700 | 'cause that's like an intern.
02:22:12.860 | I just want to have a sit down meeting.
02:22:14.500 | - Yeah, I made my parents switch banks once
02:22:16.380 | because the security was so poor.
02:22:18.020 | I was like, you just can't have money here.
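Back-of-the-envelope arithmetic on why that policy helps crackers: the search space is charset_size ** length, so shrinking the allowed character set shrinks the space exponentially. The numbers below are illustrative:

```python
# Compare a full printable charset to a "three blessed specials" policy.
full = 26 + 26 + 10 + 32       # upper, lower, digits, all specials = 94
restricted = 26 + 26 + 10 + 3  # only three special characters = 65

for length in (8, 12):
    ratio = full**length / restricted**length
    print(f"length {length}: ~{ratio:.0f}x fewer guesses needed")
    # length 8: ~19x, length 12: ~84x easier for the cracker
```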
02:22:20.500 | - But then there's also the zero-day attacks.
02:22:22.500 | Like I mentioned before, the QNAP NAS that got hacked.
02:22:27.500 | Luckily I didn't have anything private on there,
02:22:30.780 | but it really woke me up to like, okay,
02:22:34.420 | you have to take everything extremely seriously.
02:22:38.000 | - Unfortunately for the end users,
02:22:39.260 | there's nothing you can do about a zero-day.
02:22:40.620 | It's, you have no control over that.
02:22:42.940 | I mean, the engineers that made the software
02:22:45.500 | don't even know about it.
02:22:46.740 | Now let's talk about one-days.
02:22:49.620 | So there's a patch now out there for the security hole.
02:22:51.380 | So if you're not updating your systems
02:22:53.180 | with these security patches... though sometimes it's just not on you:
02:22:55.780 | my father-in-law has such an old iPhone,
02:22:59.340 | you can't security patch it anymore.
02:23:01.260 | So I tell him, I said,
02:23:04.140 | this is what you're missing out on.
02:23:05.340 | This is what you're exposing yourself to.
02:23:06.820 | Because, you know, we talked about that powerful tool,
02:23:11.060 | how we found rossulbricht@gmail.com.
02:23:14.700 | Well, bad guys are using that too.
02:23:16.220 | It used to be called Google dorking.
02:23:18.940 | Now I think it's been renamed Google hacking
02:23:21.340 | by the community.
02:23:22.740 | You can go in, you know, and find a vulnerability,
02:23:26.040 | read about the white paper, what's wrong with that software.
02:23:28.980 | And then you can go on the internet
02:23:30.220 | and find all of the computers
02:23:32.060 | that are running that outdated software.
02:23:34.020 | And there's your list, there's your target list.
02:23:35.700 | - Yeah. - I know the vulnerabilities
02:23:37.100 | that are running.
02:23:37.920 | Again, not making a playbook here,
02:23:39.740 | but, you know, that's how easy it is to find your targets.
02:23:43.460 | And that's what the bad guys are doing.
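The defensive flip side is checking what your own machines advertise, since version banners are exactly what those dork-style searches index. A minimal sketch with the requests library; the URL is a placeholder:

```python
# See what version banner a web server exposes to the world.
import requests

resp = requests.head("https://example.com", timeout=5)
banner = resp.headers.get("Server", "(not disclosed)")
print(f"Server header: {banner}")
# A banner like "Apache/2.4.49" would land you on a target list
# (that version had the CVE-2021-41773 path traversal).
```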
02:23:45.700 | - Then the reverse is tough.
02:23:47.380 | It's much tougher, but it's still doable,
02:23:49.140 | which is to first pick the target.
02:23:51.580 | Like if you have a specific target,
02:23:53.100 | you know, to hack into a Twitter account, for example.
02:23:57.380 | - Much harder.
02:23:58.200 | - That's probably social engineering, right?
02:23:59.980 | That's probably the best way.
02:24:01.060 | - Probably, if you want something specific to that.
02:24:02.980 | I mean, if you really want to go far, you know,
02:24:06.380 | if you're targeting a specific person, you know,
02:24:08.900 | how hard is it to get into their office
02:24:11.140 | and put a, you know, a little device,
02:24:13.340 | a USB device, in line with their mouse?
02:24:15.140 | Who checks how their mouse is plugged in?
02:24:17.060 | And for 40 bucks on the black market,
02:24:18.740 | you can buy a key logger that's just USB,
02:24:21.300 | and the mouse plugs right into it.
02:24:22.500 | It looks like an extension on the mouse,
02:24:24.420 | if you can even find it.
02:24:25.260 | You can even buy the stuff built inside of a mouse
02:24:29.300 | and just plug it into somebody's computer.
02:24:30.820 | And there's a key logger that lives in there and calls home,
02:24:33.380 | sends everything you want.
02:24:34.340 | So, I mean, and it's cheap.
02:24:36.220 | - Yeah, in grad school,
02:24:37.780 | I programmed, built a bunch of key loggers,
02:24:41.540 | it was fascinating, and tracked the mouse too,
02:24:43.580 | as part of the research I was doing,
02:24:46.780 | to see if, by the dynamics of how you type
02:24:51.780 | and how you move the mouse, you can tell who the person is.
02:24:56.900 | - Oh, wow.
02:24:57.740 | - It's called active authentication.
02:25:01.940 | It's basically biometrics without using bio,
02:25:05.220 | to see how identifiable that is.
02:25:08.020 | So it's fascinating to study that,
02:25:09.180 | but it's also fascinating how damn easy it is
02:25:11.620 | to install key loggers.
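A minimal sketch of the typing-dynamics idea: compute inter-keystroke intervals from timestamped key events as a crude behavioral signature. The events here are hard-coded stand-ins for captured data:

```python
# Turn key timings into a feature vector for active authentication.
from statistics import mean, stdev

events = [("h", 0.00), ("e", 0.14), ("l", 0.25), ("l", 0.33), ("o", 0.52)]

# Flight times between consecutive keys.
intervals = [t2 - t1 for (_, t1), (_, t2) in zip(events, events[1:])]
signature = (mean(intervals), stdev(intervals))
print(signature)  # compare against a stored profile per user
```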
02:25:13.580 | So I think it's natural,
02:25:15.940 | what happens is you realize how many vulnerabilities
02:25:18.940 | there are in this world.
02:25:20.100 | It's like when you come to understand bacteria and viruses,
02:25:23.360 | the biological ones, I mean,
02:25:24.900 | you realize they're everywhere.
02:25:27.840 | And in the same way, you realize
02:25:30.300 | all the vulnerabilities that are out there.
02:25:31.140 | One of the things I've noticed quite a lot
02:25:32.540 | is how many people don't log out of their computers.
02:25:36.200 | Just how easy physical access to systems actually is.
02:25:40.860 | Like in a lot of places in this world,
02:25:44.540 | and I'm not talking about private homes,
02:25:46.740 | I'm talking about companies, especially large companies.
02:25:49.260 | It seems quite trivial in certain places
02:25:51.420 | that I've been to, to walk in
02:25:52.900 | and have physical access to a system.
02:25:54.940 | And that's depressing to me.
02:25:56.300 | - It is.
02:25:57.140 | It just, I laugh because one of my partners at Naxo,
02:26:00.460 | where I work now, he worked at a big company.
02:26:04.500 | You would know the name as soon as I told you,
02:26:05.780 | I'm not gonna say it.
02:26:07.020 | But the guy who owned the company,
02:26:08.540 | and the company has his name on it,
02:26:10.860 | didn't want to ever log into a computer.
02:26:13.700 | It just annoyed the shit out of him.
02:26:14.940 | So they hired a person that stands next to his computer
02:26:17.420 | when he's not there, and that's his physical security.
02:26:20.120 | - See, that's good.
02:26:22.020 | That's pretty good, actually.
02:26:23.100 | - Yeah, I mean, I guess if you could afford to do that.
02:26:25.700 | - At least you're taking your security seriously.
02:26:27.380 | I feel like there's a lot of people in that case
02:26:29.240 | would just not have a login.
02:26:30.820 | - Yeah.
02:26:32.260 | No, the security team there had to really work around it
02:26:35.100 | to make that work; it was non-compliant with the company policy.
02:26:39.380 | - But that's interesting.
02:26:40.740 | The key loggers, there's a lot of,
02:26:43.980 | there's just a lot of threats.
02:26:45.740 | - Yeah, I mean--
02:26:46.580 | - There's a lot of ways to get in.
02:26:47.400 | - Yeah, I mean, so you can't sit around
02:26:49.220 | and worry about someone physically gaining access
02:26:51.300 | to your computer with key logger and stuff like that.
02:26:53.940 | You know, if you're traveling to a foreign country
02:26:55.880 | and you work for the FBI, then yeah, you do.
02:26:57.920 | Sometimes, you know, in some countries,
02:27:00.260 | you would bring a fake laptop just to see
02:27:02.500 | if they stole it or accessed it.
02:27:04.260 | - I really want, especially in this modern day,
02:27:07.300 | to just create a lot of clones of myself
02:27:09.260 | that generate Lex sounding things
02:27:13.300 | and just put so much information out there.
02:27:16.020 | I actually dox myself all across the world.
02:27:20.140 | - And then you're not a target, I guess.
02:27:21.740 | Just put it out there.
02:27:22.740 | I've always said that, though.
02:27:24.140 | In the FBI, we'd do these searches of houses and stuff like that.
02:27:26.260 | If someone just got a box load of 10 terabyte drives
02:27:30.800 | and just encrypted them, oh my God,
02:27:33.240 | do you know how long the FBI would spin their wheels
02:27:34.920 | trying to get that data off there?
02:27:36.400 | It'd be insane.
02:27:37.240 | - Oh, so just give 'em--
02:27:40.160 | - You don't even know which one you're looking for.
02:27:42.240 | - Yeah.
02:27:43.080 | That's true, that's true.
02:27:45.240 | So it's like me printing a treasure map
02:27:48.540 | to a random location, just get people to go on goose chases.
02:27:53.960 | - Yeah, what about operating systems?
02:27:57.260 | What have you found, what's the most secure
02:28:00.460 | and what's the least secure operating system,
02:28:02.020 | Windows, Linux?
02:28:03.140 | Is there no universal?
02:28:05.780 | - There's no universal security.
02:28:07.240 | I mean, it changed.
02:28:08.620 | People used to think Macs were the most secure
02:28:10.460 | just 'cause they just weren't out there,
02:28:11.840 | but now kids have had access to them.
02:28:13.860 | So I know you're a Linux guy.
02:28:17.340 | I like Linux too, but it's tough to run a business on Linux.
02:28:23.180 | People wanna move more towards the Microsofts
02:28:25.220 | and the Googles just 'cause it's easier to communicate
02:28:28.280 | with other people that maybe aren't computer guys.
02:28:30.520 | So you have to just take what's best, what's easiest,
02:28:34.080 | and secure the shit out of it as much as you can
02:28:35.960 | and just think about it.
02:28:37.220 | - What are you doing these days at Naxo?
02:28:39.200 | - So we just started Naxo.
02:28:40.700 | So I left the government and went to a couple consultancies
02:28:45.660 | and I started working, and really all the people
02:28:48.200 | I worked well with in the government,
02:28:51.920 | I brought them out with me.
02:28:53.520 | And now--
02:28:54.360 | - You used to work for the man and now you're the man.
02:28:56.320 | - Exactly.
02:28:57.280 | But now we formed a partnership
02:28:58.580 | and it's a new cybersecurity firm.
02:29:01.520 | Our launch party is actually on Thursday,
02:29:03.320 | so it's gonna be exciting.
02:29:04.640 | - Do you wanna give more details about the party
02:29:06.680 | so that somebody can hack into it?
02:29:08.080 | - No, I don't think I can tell you where it is.
02:29:10.440 | You can come if you want, but don't bring the hackers.
02:29:13.440 | Hector will be there.
02:29:15.200 | - I can't believe you invited me
02:29:16.800 | 'cause you also say insider threat is the biggest threat.
02:29:22.160 | By the way, can you explain what the insider threat is?
02:29:24.160 | - The biggest insider threat in my life is my children.
02:29:27.240 | My son's big into Minecraft
02:29:29.800 | and will download executables mindlessly
02:29:31.840 | and just run them on the network.
02:29:33.360 | So he is--
02:29:34.200 | - Do you recommend against marriage and family and kids?
02:29:36.640 | - Nope, nope.
02:29:37.640 | - From a security perspective.
02:29:38.840 | - From a security perspective, absolutely.
02:29:40.440 | But no, I just, segmentation.
02:29:43.040 | I mean, we do it in all businesses for years.
02:29:45.800 | Started segmenting networks, different networks.
02:29:48.760 | I just do it at home.
02:29:49.600 | My kid's on his own network.
02:29:52.100 | It makes it a little bit easier
02:29:53.740 | to see what they're doing too.
02:29:55.220 | You can monitor traffic and then also throttle bandwidth
02:29:58.420 | if your Netflix isn't playing fast enough
02:30:00.700 | or buffers or something.
02:30:01.700 | So you can obviously change that a little too.
02:30:04.340 | - You know they're gonna listen to this, right?
02:30:06.500 | They're gonna get your tricks.
02:30:07.540 | - Yeah, they'll definitely will listen.
02:30:09.700 | But there's nothing more humbling than your family.
02:30:11.660 | You think you've done something big
02:30:13.060 | and you go on a big podcast and talk to Lex Fridman,
02:30:15.980 | they don't fucking care.
02:30:17.420 | - Unless you're on TikTok or shit.
02:30:20.980 | - Yeah, you'll show up on a YouTube feed
02:30:23.020 | or something like that.
02:30:23.860 | And they'll be like, oh yeah.
02:30:24.700 | - Whatever, this guy's boring.
02:30:26.620 | - My son does a podcast for his school
02:30:29.340 | and I still can't get him to tell.
02:30:32.620 | So Hector and I just started a podcast
02:30:35.340 | talking about cybersecurity.
02:30:36.540 | We do a podcast called Hacker and the Fed.
02:30:38.900 | It just came out yesterday.
02:30:39.940 | So first episode.
02:30:41.860 | So yeah, we got 1,300 downloads the first day.
02:30:45.660 | So pretty, we were at the top of Hacker News,
02:30:47.700 | which is a big website in our world.
02:30:49.780 | - So it's called Hacker and the Fed?
02:30:51.060 | - Hacker and the Fed is the name of it.
02:30:52.740 | - Go download and listen to Hacker and the Fed.
02:30:54.780 | I can't wait to see what,
02:30:56.300 | 'cause I don't think I've seen a video of you two together.
02:30:58.500 | So I can't wait to see what the chemistry is like.
02:31:01.860 | It's not weird that you guys used to be enemies
02:31:06.220 | and now you're friends?
02:31:07.780 | - So yeah, I mean, we just did a trailer and all that.
02:31:10.940 | And our producer, we have a great producer guy named Phineas
02:31:14.060 | and he kind of pulls things out of me.
02:31:15.500 | And I said, okay, I got one.
02:31:18.780 | My relationship with Hector,
02:31:21.060 | we're very close friends now.
02:31:22.700 | And I was like, oh, I arrested one of my closest friends.
02:31:27.180 | Which is a very strange relationship.
02:31:28.980 | - Yeah, it's weird.
02:31:30.900 | - But he says that I changed his life.
02:31:32.860 | I mean, he was going down a very dark path
02:31:34.500 | and I gave him an option that one night
02:31:36.020 | and he made the right choice.
02:31:37.620 | I mean, he now does penetration testing.
02:31:39.920 | He does a lot of good work
02:31:41.180 | and he's turned his life around.
02:31:43.060 | - Do you worry about cyber war in the 21st century?
02:31:48.620 | - Absolutely.
02:31:50.060 | If there is a global war, it'll start with cyber.
02:31:53.380 | If it's not already started.
02:31:54.860 | - Do you feel like there's a boiling,
02:32:00.740 | like the drums of war are beating?
02:32:04.140 | What's happening in Ukraine with Russia?
02:32:06.500 | It feels like the United States
02:32:08.780 | becoming more and more involved
02:32:11.000 | in the conflict in that part of the world.
02:32:13.060 | And China is watching very closely,
02:32:15.940 | is starting to get involved geopolitically
02:32:19.540 | and probably in terms of cyber.
02:32:22.100 | Do you worry about this kind of thing happening
02:32:25.940 | in the next decade or two, like where it really escalates?
02:32:29.220 | You know, people in the 1920s were completely terrible
02:32:34.220 | at predicting World War II.
02:32:37.000 | Do you think we're at the precipice of war, potentially?
02:32:41.700 | - I think we could be.
02:32:42.740 | I mean, I would hate to just be, you know,
02:32:46.500 | just fear mongering out there, you know,
02:32:48.980 | COVID's over, so the next big thing in the media
02:32:50.900 | is war and all that.
02:32:51.800 | But I mean, there's some flags going up
02:32:56.580 | that are very strange to me.
02:32:58.820 | - Is there ways to avoid this?
02:33:00.620 | - I hope so.
02:33:01.460 | I hope smarter people than I are figuring it out.
02:33:03.500 | I hope people are playing their parts
02:33:05.100 | and talking to the right people
02:33:06.780 | because war is the last thing I want.
02:33:10.620 | - Well, there's two things to be concerned about
02:33:12.220 | on the cyber side.
02:33:13.420 | One is the actual defense on the technical side of cyber.
02:33:17.660 | And the other one is the panic that might happen
02:33:20.860 | when something like some dramatic event happened
02:33:24.860 | because of cyber, some major hack that becomes public.
02:33:29.360 | I'm honestly more concerned about the panic
02:33:31.900 | because I feel like if people don't think about this stuff,
02:33:35.500 | the panic can hit harder.
02:33:37.540 | Like if they're not conscious about the fact
02:33:40.300 | that we're constantly under attack,
02:33:42.180 | I feel like it'll come like a much harder surprise.
02:33:45.820 | - Yeah, I think people will be really shocked on things.
02:33:49.140 | I mean, so we talked about LulzSec today,
02:33:50.740 | and LulzSec was 2011.
02:33:52.700 | They had access into the water supply system
02:33:57.060 | of a major US city.
02:33:58.620 | They didn't do anything with it.
02:33:59.700 | They were sitting on it in case someone got arrested
02:34:01.700 | and they were gonna maybe just expose that it's insecure.
02:34:05.840 | Maybe they were gonna do something to fuck with it.
02:34:07.500 | I don't know.
02:34:08.340 | But that's 2011.
02:34:11.660 | I don't think it's gotten a lot better since then.
02:34:14.300 | - And there's probably nation states or major organizations
02:34:21.000 | that are sitting secretly on hacks like this.
02:34:23.620 | - 100%, 100% they are sitting secretly
02:34:26.580 | waiting to expose things.
02:34:28.400 | I mean, again, I don't wanna scare the shit out of people,
02:34:32.620 | but people have to understand the cyber threat.
02:34:34.700 | I mean, there are thousands of nation state hackers
02:34:39.700 | in some countries.
02:34:41.780 | I mean, we have them too.
02:34:42.620 | We have offensive hackers.
02:34:43.820 | - You know, the terrorist attacks of 9/11,
02:34:46.020 | there's planes that actually hit actual buildings
02:34:50.080 | and it was visibly clear and you can trace the information.
02:34:54.620 | With cyber attacks, say something that would result
02:34:57.500 | in a major explosion in New York City,
02:35:00.640 | how the hell do you trace that?
02:35:03.980 | Like if it's well done,
02:35:06.300 | it's going to be extremely difficult.
02:35:08.460 | The problem is there's so many problems.
02:35:12.300 | One of which the US government in that case
02:35:14.660 | has complete freedom to blame anybody they want.
02:35:17.380 | - True.
02:35:18.380 | - And then to go start war with anybody,
02:35:21.800 | anybody that actually see,
02:35:25.120 | that's sorry, that's one cynical take on it, of course.
02:35:30.580 | - No, but you're going down the right path.
02:35:31.820 | I mean, the guys that flew planes in the buildings
02:35:33.620 | wanted attribution.
02:35:34.540 | They took credit for it.
02:35:35.780 | When we see the cyber attack,
02:35:37.580 | I doubt we're going to see attribution.
02:35:39.660 | Maybe the victim side, the US government on this side
02:35:42.580 | might come out and try to blame somebody.
02:35:44.620 | But you know, like you've brought up,
02:35:46.740 | they could blame anybody they want.
02:35:48.020 | There's not really a good way of verifying that.
02:35:51.100 | - Can I just ask for your advice?
02:35:53.060 | So in my personal case, am I being tracked?
02:35:57.140 | How do I know?
02:35:59.100 | How do I protect myself?
02:36:00.340 | Should I care?
02:36:02.100 | - You are being tracked.
02:36:04.260 | I wouldn't say you're being tracked by the government.
02:36:06.300 | You're definitely being tracked by big tech.
02:36:09.100 | - No, I mean, me personally, Lex, at an escalated level.
02:36:13.020 | So like,
02:36:13.860 | like you mentioned, there's an FBI file on people.
02:36:19.060 | - Sure.
02:36:20.300 | - I'd love to see what's in that file.
02:36:22.200 | (laughing)
02:36:25.500 | Who did I have the argument for?
02:36:26.940 | Oh, let me ask you, FBI.
02:36:28.540 | - Yeah.
02:36:29.700 | - How's the cafeteria food in FBI?
02:36:31.660 | - At the Academy, it's bad.
02:36:33.980 | - Yeah.
02:36:35.020 | What about like-
02:36:35.860 | - At headquarters?
02:36:36.820 | - Headquarters.
02:36:37.640 | - A little bit better, 'cause that's where the director,
02:36:39.220 | I mean, he eats up on the seventh floor.
02:36:41.180 | - Have you been like at Google?
02:36:42.340 | Have you been to Silicon Valley, those cafeteria,
02:36:44.980 | like those-
02:36:45.820 | - I've been to the Google in Silicon Valley.
02:36:47.620 | I've been to the Google in New York.
02:36:49.220 | - Yeah, the food is incredible.
02:36:50.460 | - It is great.
02:36:51.460 | - So FBI's worse.
02:36:53.100 | - Well, when you're going through the Academy,
02:36:54.860 | they don't let you outside of the building.
02:36:56.420 | So you have to eat it.
02:36:58.060 | And I think that's the only reason people eat it.
02:37:00.460 | - Okay.
02:37:01.300 | - It's pretty bad.
02:37:02.700 | - I got it.
02:37:04.820 | Okay, I don't know why I asked-
02:37:05.660 | - But there's also a bar inside the FBI Academy.
02:37:08.180 | People don't know that.
02:37:09.620 | - Alcohol bar?
02:37:10.460 | - Yes, alcohol bar.
02:37:11.940 | And as long as you've passed your PT and going well,
02:37:16.940 | you're allowed to go to the bar.
02:37:18.180 | - Nice.
02:37:19.380 | It feels like if I was a hacker,
02:37:22.100 | I would be going after like celebrities,
02:37:23.940 | 'cause they're a little bit easier,
02:37:25.020 | like celebrity celebrities, like Hollywood.
02:37:27.260 | - The Hollywood nudes were a big thing there
02:37:29.300 | for a long time.
02:37:30.620 | - But now, yeah, I guess nudes-
02:37:32.140 | - That's what they went after.
02:37:33.020 | I mean, all those guys, they social engineered.
02:37:34.980 | They social engineered Apple to get backups,
02:37:38.540 | to get the account recoveries for the backups.
02:37:40.220 | And then they just pulled all their nudes.
02:37:41.460 | And I mean, whole websites were dedicated to that.
02:37:44.260 | - Yeah, see that?
02:37:45.540 | See, I wouldn't do that kind of stuff.
02:37:46.740 | It's very creepy.
02:37:48.060 | I would go, if I was a hacker,
02:37:49.620 | I would go after like major, like powerful people
02:37:54.620 | and like tweet something from their account,
02:37:58.580 | something positive, like loving,
02:38:02.740 | but like, for the lulz, so obvious that it's a troll.
02:38:05.860 | - God, you get busted so quick.
02:38:07.980 | - By a bad hacker.
02:38:09.980 | - Really, but why?
02:38:11.300 | - Because hackers never put things out about love.
02:38:14.060 | - Oh, you mean like, this is clearly-
02:38:17.020 | - Yeah, this is clearly Lex.
02:38:18.620 | - What the fuck?
02:38:19.460 | - He talks about love in every podcast he does.
02:38:21.540 | - I would just be like, no, oh, goddammit,
02:38:23.580 | now somebody's gonna do it.
02:38:25.540 | You'll blame me.
02:38:26.860 | It wasn't me.
02:38:28.280 | - Looking back at your life, is there something you regret?
02:38:31.820 | - I'm only 44 years old, I'm already looking back.
02:38:34.020 | - Is there stuff that you regret?
02:38:38.740 | - Avunit.
02:38:40.760 | Got away.
02:38:42.660 | - It's always the one that got away.
02:38:45.060 | - Yeah, I mean, it took me a while
02:38:46.180 | into my law enforcement career
02:38:47.460 | to learn about like the compassionate side
02:38:49.820 | and it took Hector Monsegur to make me realize
02:38:53.620 | that criminals aren't really criminals, they're human beings.
02:38:57.620 | That really humanized the whole thing for me,
02:38:59.960 | sitting with him for nine months.
02:39:01.980 | I think that's maybe why I had a lot more compassion
02:39:05.740 | when I arrested Ross.
02:39:07.480 | Probably wouldn't have been so compassionate
02:39:09.460 | if it was before Hector, but yeah, he changed my life
02:39:12.020 | and showed me that humanity side of things.
02:39:15.220 | - So would it be fair to say that all the criminals,
02:39:19.140 | or most criminals are just people
02:39:22.560 | that took a wrong turn at some point?
02:39:24.500 | They all have the capacity for good
02:39:26.680 | and for evil in them?
02:39:27.900 | - I'd say 99% of the criminals that I've interacted with,
02:39:33.100 | yes, the people with the child exploitation,
02:39:36.020 | no, I don't have any place in my heart for them.
02:39:38.420 | - What advice would you give to people in college,
02:39:43.200 | people in high school, trying to figure out
02:39:45.100 | what they wanna do with their life?
02:39:46.780 | How to have a life they can be proud of,
02:39:48.660 | how to have a career they can be proud of,
02:39:51.140 | all that kind of stuff.
02:39:52.460 | - In the US budget that was just put forward,
02:39:55.620 | there's $18 billion for cybersecurity.
02:39:58.840 | We're about a million people short
02:40:00.360 | of where we really should be in the industry, if not more.
02:40:03.120 | If you have, want job security and want to work
02:40:06.100 | and see exciting stuff, head towards cybersecurity.
02:40:09.600 | It's a good career.
02:40:11.400 | And one thing I dislike about cybersecurity right now
02:40:17.280 | is they expect you to come out of college
02:40:19.280 | and have 10 years of experience in protecting
02:40:21.520 | and knowing every different Python script out there
02:40:24.200 | and everything available.
02:40:26.440 | The industry needs to change and let the junior people in,
02:40:28.840 | in order to broaden and get those million jobs filled.
02:40:33.560 | But as far as their personal security,
02:40:35.680 | just remember, it's all gonna follow you.
02:40:37.200 | I mean, there's laws out there now
02:40:40.400 | that you have to turn over your social media accounts
02:40:42.120 | in order to have certain things.
02:40:44.240 | They just changed that in New York state.
02:40:45.700 | If you wanna carry a gun,
02:40:46.600 | you have to turn over your social media
02:40:47.920 | so they can figure out if you're of good social character.
02:40:53.200 | So hopefully you didn't say something strange
02:40:56.640 | in the last few years and it's gonna follow you forever.
02:40:59.480 | I bet Ross Ulbricht would tell you the same thing,
02:41:02.200 | don't put rossulbricht@gmail.com on things,
02:41:04.880 | 'cause it's gonna last forever.
02:41:06.320 | - Yeah, people sometimes, for some reason,
02:41:08.700 | they interact on social media
02:41:10.040 | as if they're talking to a couple of buddies,
02:41:12.240 | like just shooting shit and mocking
02:41:17.280 | and like, what is that, busting each other's chops,
02:41:21.920 | like making fun of yourself, like being,
02:41:24.240 | especially gaming culture, like people who stream.
02:41:28.320 | - Thank God that's not recorded.
02:41:29.680 | Oh my God, the things people say on those streams.
02:41:32.680 | - Yeah, but a lot of them are recorded.
02:41:34.840 | There's a whole Twitch thing
02:41:36.840 | where people stream for many hours a day.
02:41:39.480 | And I mean, just outside of the very offensive things
02:41:44.480 | they say, they just swear a lot.
02:41:49.400 | They're not the kind of person that I would wanna hire
02:41:54.160 | or wanna work with.
02:41:55.480 | Now, I understand that some of us might be
02:41:59.360 | that way privately, I guess,
02:42:02.280 | when you're shooting shit with friends,
02:42:04.080 | like playing a video game and talking shit to each other,
02:42:07.200 | maybe, but like that's all out there.
02:42:09.820 | You have to be conscious of the fact
02:42:11.160 | that that's all out there.
02:42:12.520 | And it's just not a good look.
02:42:14.600 | It's complicated,
02:42:16.080 | 'cause I'm, like, against hiding
02:42:18.600 | who you are.
02:42:20.440 | - If you're an asshole, you should hide some of it.
02:42:23.160 | - Yeah, but like, I just feel like
02:42:25.060 | it's going to be misinterpreted.
02:42:27.240 | When you talk shit to your friends
02:42:28.620 | while you're playing video games,
02:42:30.780 | it doesn't mean you're an asshole.
02:42:32.400 | 'Cause you're an asshole to your friend,
02:42:34.800 | but that's how a lot of friends show love.
02:42:37.680 | - Yeah, an outside person can't judge
02:42:39.120 | how I'm friends with you.
02:42:40.320 | If I wanna be, this is our relationship.
02:42:42.920 | If that person can say that I'm an asshole to them,
02:42:46.280 | then that's fine, I'll take it.
02:42:47.360 | But you can't tell me I'm an asshole to them
02:42:49.200 | just because you saw my interaction.
02:42:50.480 | I agree with that.
02:42:51.320 | - That they'll take those words out of context,
02:42:52.960 | and that that's considered who you are, is dangerous.
02:42:57.600 | And people take that very nonchalantly.
02:42:59.400 | People treat their behavior on the internet
02:43:01.640 | very, very carelessly.
02:43:03.460 | That's definitely something that you need to learn
02:43:06.320 | and take extremely seriously.
02:43:07.840 | Also, I think that taking that seriously
02:43:10.200 | will help you figure out who you,
02:43:12.120 | what you really stand for.
02:43:13.720 | If you use your language carelessly,
02:43:17.720 | you'd never really ask, what do I stand for?
02:43:20.440 | I feel like it's a good opportunity when you're young
02:43:23.720 | to ask what are the things that are okay to say?
02:43:28.720 | What are the things, what are the ideas I stand behind?
02:43:31.800 | Especially if they're controversial
02:43:35.300 | and I'm willing to say them because I believe in them
02:43:38.100 | versus just saying random shit for the lols.
02:43:41.160 | 'Cause for the random shit for the lols,
02:43:42.560 | keep that off the internet.
02:43:44.600 | That said, man, I was an idiot for most of my life
02:43:47.480 | and I'm constantly learning and growing.
02:43:50.460 | I'd hate to be responsible for the kind of person
02:43:53.440 | I was in my teens, in my 20s.
02:43:59.360 | I didn't do anything offensive,
02:44:00.600 | but I just changed as a person.
02:44:02.960 | Like I used to, I guess I probably still do,
02:44:05.600 | but I used to read so much existential literature.
02:44:08.780 | That was a phase.
02:44:11.080 | There's like phases.
02:44:12.300 | - Yeah, you grow and evolve as a person
02:44:14.280 | that changes you in the future.
02:44:15.720 | Yeah, thank God there wasn't social media
02:44:17.440 | when I was in high school.
02:44:18.440 | Thank God.
02:44:19.280 | Oh my God, I would never have gotten the FBI.
02:44:22.680 | - Would you recommend that people consider a career
02:44:25.440 | at a place like the FBI?
02:44:26.860 | - I loved the FBI.
02:44:29.240 | I never thought I would go anyplace else,
02:44:31.500 | but the FBI, I thought I was gonna retire
02:44:33.240 | with the gold watch and everything from the FBI.
02:44:35.760 | That was my plan.
02:44:36.600 | - You get a gold watch?
02:44:37.420 | - No, but you know what it is, it's a,
02:44:38.800 | oh, it's an expression, a colloquialism.
02:44:42.040 | You get a gold badge, you actually get your badge
02:44:43.840 | in Lucite and your creds, they put it in Lucite
02:44:46.280 | and all that, so.
02:44:47.440 | - Does it, by the way, just on a tangent,
02:44:49.880 | since we like those, does it hurt you that the FBI
02:44:54.880 | is distrusted or even hated by certain people?
02:44:58.280 | - 100%, it kills me.
02:44:59.600 | Until recently, I'd never
02:45:03.160 | been embarrassed about the FBI, but now sometimes I am,
02:45:09.240 | which is really, really hard for me to say
02:45:11.320 | 'cause I love that place.
02:45:12.240 | I love the people in it.
02:45:13.440 | I love the brotherhood that you have
02:45:16.200 | with all the guys in your squad, guys and girls.
02:45:19.120 | I just use guys, you know.
02:45:20.480 | I developed a real drinking problem there
02:45:24.160 | because we were so social of going out after work
02:45:27.520 | and continuing on, it really was a family.
02:45:30.780 | So I do miss that.
02:45:34.100 | But yeah, I mean, if someone can become an FBI agent,
02:45:38.280 | I mean, it's pretty fucking cool, man.
02:45:40.880 | The day you graduate and walk out of the academy
02:45:43.040 | with a gun and a badge and the power to charge someone
02:45:46.880 | with a misdemeanor for flying
02:45:48.120 | a United States flag at night, that's awesome.
02:45:51.020 | - So there is a part of representing
02:45:54.480 | and loving your country,
02:45:55.880 | and especially if you're doing cyber security.
02:45:57.360 | So there's a lot of technical savvy
02:45:58.960 | in different places in the FBI.
02:46:02.080 | - Yeah, I mean, there's different pieces.
02:46:04.320 | Sometimes you'll see an older agent that's done
02:46:07.680 | something other than cyber crime come over to cyber crime at the end,
02:46:10.200 | so he can get a job once he goes out.
02:46:12.280 | But there's also some guys that come in.
02:46:14.320 | I won't name his name, but there was a guy,
02:46:17.000 | I think he was a hacker when he was a kid,
02:46:18.800 | and now he's been an agent.
02:46:19.800 | Now he's way up in management.
02:46:21.840 | Great guy, I love this guy.
02:46:23.400 | And he knows who he is if he's listening.
02:46:25.400 | He had some skills.
02:46:29.760 | But we also lost a bunch of guys that had some skills
02:46:31.680 | because we had one guy in the squad
02:46:34.560 | that he had to leave the FBI
02:46:35.920 | 'cause his wife became a doctor
02:46:37.440 | and she got a residency down in Houston
02:46:39.680 | and she couldn't move.
02:46:41.320 | He wasn't allowed to transfer,
02:46:42.840 | so he decided to keep his family versus the FBI.
02:46:45.920 | So there's some stringent rules in the FBI
02:46:48.080 | that need to be relaxed a little bit.
02:46:50.320 | - Yeah, I love hackers turned leaders.
02:46:54.240 | Like one of my quickly becoming good friends is Mudge.
02:46:58.480 | He was a big hacker in the '90s,
02:47:00.360 | and then recently was Twitter's chief security officer,
02:47:06.800 | CSO, but he had a bunch of different leadership positions,
02:47:10.200 | including being my boss at Google,
02:47:13.480 | but originally a hacker.
02:47:16.680 | It's cool to see hackers become leaders.
02:47:19.720 | - I just wonder what would cause him to stop doing it,
02:47:22.320 | why he would then take a managerial route
02:47:25.560 | at very high-tech companies versus--
02:47:27.120 | - I think a lot of those guys,
02:47:28.920 | so this is like the '90s,
02:47:30.240 | they really were about the freedom.
02:47:32.580 | There's a philosophy to it.
02:47:35.520 | And when I think the hacking culture evolved over the years,
02:47:39.720 | and I think when it leaves you behind,
02:47:41.960 | you start to realize like,
02:47:43.120 | oh, actually what I wanna do is I wanna help the world,
02:47:45.920 | and I can do that in legitimate routes and so on.
02:47:48.960 | But that's the story that,
02:47:49.960 | and yeah, I would love to talk to him one day,
02:47:53.400 | but I wonder how common that is too,
02:47:56.480 | like young hackers turn good.
02:47:59.520 | You're saying it like pulls you in.
02:48:01.520 | If you're not careful, it can really pull you in.
02:48:03.320 | - Yeah, you're good at it, you become powerful,
02:48:06.920 | you become, everyone's slapping you on the back
02:48:09.640 | and say what a good job and all that at a very young age.
02:48:12.800 | - Yeah.
02:48:13.640 | - Yeah, I would love to get into my buddy's mind
02:48:15.560 | on why he stopped hacking and moved on.
02:48:18.680 | That's gonna be a good conversation.
02:48:20.440 | - In his case, maybe it's always about a great woman
02:48:24.000 | involved, a family and so on that grounds you.
02:48:30.320 | Because there is a danger to hacking
02:48:34.400 | that once you're in a relationship,
02:48:36.280 | once you have family, maybe you're not willing to partake in.
02:48:39.280 | What's your story?
02:48:41.080 | What, from childhood, what are some fond memories you have?
02:48:45.080 | - Fond memories?
02:48:46.120 | - Where did you grow up?
02:48:47.680 | - Well, I don't give away that information.
02:48:50.360 | - In the United States?
02:48:51.200 | - Yeah, yeah, yeah, in Virginia.
02:48:53.280 | - In Virginia. - Yeah.
02:48:54.440 | - What are some rough moments,
02:48:55.760 | what are some beautiful moments that you remember?
02:48:59.760 | - I had a very good family growing up.
02:49:01.800 | The rough moment, and I'll tell you a story
02:49:06.640 | that just happened to me two days ago
02:49:07.760 | and it fucked me up, man, it really did.
02:49:09.200 | And you'll be the first, I've never told,
02:49:10.600 | I tried to tell my wife this two nights ago
02:49:12.280 | and I couldn't get it out.
02:49:13.720 | So my father, he's a disabled veteran,
02:49:18.280 | or he was a disabled veteran, he was in the army
02:49:21.080 | and got hurt and was in a wheelchair his whole life
02:49:24.360 | for all my growing up.
02:49:25.520 | He was my biggest fan.
02:49:29.400 | He just wanted to know everything about
02:49:31.800 | what was going on in the FBI, my stories.
02:49:34.480 | I was a local cop before the FBI
02:49:36.120 | and I got into a high-speed car chase,
02:49:38.040 | foot chase and all that, and kicking doors in.
02:49:41.760 | He wanted to hear all those stories.
02:49:43.920 | And at some points I was kind of too cool for school
02:49:46.840 | and, "Ah, dad, I just want a break," and all that,
02:49:48.600 | and things going on.
02:49:50.400 | We lost my dad during COVID, not because of COVID,
02:49:54.760 | but it was around that time,
02:49:55.840 | but it was right when COVID was kicking off.
02:49:57.880 | And so he died in the hospital by himself
02:50:00.280 | and I didn't get to see him then.
02:50:01.920 | And then my mom had some people visiting her
02:50:06.160 | the other night, Tom and Karen Roggeberg,
02:50:09.080 | and I'll say they're my second biggest fans,
02:50:10.960 | right behind my dad.
02:50:12.560 | They always asking about me and my career
02:50:15.840 | and they read the books and seen the movie.
02:50:18.120 | They'll even tell you that "Silk Road" movie was good.
02:50:20.440 | (laughing)
02:50:21.560 | They'll hide you on that.
02:50:22.840 | But, and so they came over and I helped them with something
02:50:27.680 | and my mom called me back a couple of days later
02:50:30.040 | and she said, "I appreciate you helping them.
02:50:31.720 | "I know fixing someone's Apple phone over the phone
02:50:34.400 | "really isn't what you do for a living.
02:50:36.480 | "It's kind of beneath you and all that,
02:50:38.640 | "but I appreciate it."
02:50:39.880 | And she said, "Oh, they loved hearing the stories
02:50:42.360 | "about 'Silk Road' and all those things."
02:50:45.400 | And she goes, "Your dad, he loved those stories.
02:50:49.360 | "I just wish he could have heard them."
02:50:50.920 | He even would tell me, he would say,
02:50:52.880 | "Maybe Chris will come home and I'll get him drunk
02:50:56.800 | "and he'll tell me the stories."
02:50:59.320 | But, and then she goes, "Maybe one day in heaven
02:51:02.240 | "you can tell him those stories."
02:51:03.760 | And I fucking lost it.
02:51:06.000 | I literally stood in my shower sobbing like a child.
02:51:10.920 | Like just thinking about like,
02:51:13.480 | all my dad wanted was those stories.
02:51:15.840 | - Yeah.
02:51:17.160 | - And now I'm on a fucking podcast
02:51:18.480 | telling the stories to the world and I did tell him.
02:51:21.520 | Yeah, so.
02:51:22.740 | - Did you ever have like a long heart to heart with him
02:51:26.640 | about like, about such stories?
02:51:30.880 | - He was in the hospital one time and I went through
02:51:32.800 | and I wanted to know about his history,
02:51:35.320 | like his life, what he did.
02:51:36.880 | And I think he may be sensationalized some of it,
02:51:40.280 | but that's what you want.
02:51:41.120 | Your dad's a hero, so you want to hear those things.
02:51:42.800 | - He's a good storyteller?
02:51:44.080 | - Yeah, again, I don't know what was true and not true,
02:51:47.120 | but you know, some of it was really good
02:51:50.720 | and it was just good to hear his life.
02:51:52.760 | But you know, we lost him and now those stories are gone.
02:51:57.600 | - You miss him?
02:51:58.440 | - Yeah.
02:51:59.260 | - What did he teach you about what it means to be a man?
02:52:05.160 | - So my dad, he was an engineer.
02:52:13.640 | And so part of his job, he worked for Vermont Power
02:52:18.800 | and Electric or whatever it was.
02:52:21.200 | I mean, when he first got married to my mom and all that,
02:52:24.280 | like he flew around in a helicopter,
02:52:27.080 | checking out like power lines and dams.
02:52:29.040 | He used to scuba dive inside dams
02:52:32.000 | to check to make sure they were functioning properly
02:52:34.040 | and all that.
02:52:34.880 | Pretty cool shit.
02:52:35.720 | And then he couldn't walk anymore.
02:52:39.640 | I probably would have killed myself
02:52:42.660 | if my life switched like that so bad.
02:52:44.680 | And my dad probably went through some dark points,
02:52:46.520 | but he hid that from me, maybe.
02:52:48.320 | And so to get through that struggle,
02:52:50.680 | to teach me like, you know, you press on,
02:52:53.320 | you have a family, people count on you,
02:52:55.200 | you do what you gotta do.
02:52:56.660 | That was big.
02:52:59.260 | Yeah.
02:53:01.520 | - I'm sure you make him proud, man.
02:53:03.040 | - I'm sure I do, but I don't think he knew that,
02:53:06.880 | that I knew that.
02:53:07.720 | - Well, you get to pass on that love to your kids now.
02:53:12.800 | - I try, I try, but I can't impress them
02:53:15.560 | as much as my dad impressed me.
02:53:18.240 | I can try all I want, but.
02:53:20.840 | - Well, what do you think is the role of love?
02:53:23.280 | 'Cause you gave me some grief,
02:53:27.520 | you busted my balls a little bit
02:53:28.920 | for talking about love a lot.
02:53:30.000 | What do you think is the role of love in the human condition?
02:53:32.080 | - I think it's the greatest thing.
02:53:33.400 | I think everyone should be searching for it.
02:53:35.080 | If you don't have it, find it, get it as soon as you can.
02:53:38.600 | I love my wife, I really do.
02:53:41.160 | I had no idea what love was until my kids were born.
02:53:44.640 | My son came out and, this is a funny story,
02:53:47.920 | he came out and I just wanted him to be safe
02:53:50.520 | and be healthy and all that.
02:53:51.760 | And I said to the doctor, I said,
02:53:53.600 | "10 and 10, doc, 10 fingers, 10 toes, everything good?"
02:53:56.560 | And he goes, "Eh, nine and nine."
02:53:59.200 | I was like, "What the fuck?"
02:54:00.480 | He's like, "Oh, this is gonna suck."
02:54:02.000 | Okay, we'll deal with it and all that.
02:54:03.960 | He was talking about the Apgar score
02:54:06.200 | or some score about breathing and color and all that.
02:54:08.560 | And I was like, "Oh, shit."
02:54:09.880 | But no one told me this.
02:54:12.120 | But so I'm just sobbing.
02:54:13.640 | I couldn't even cut the umbilical cord.
02:54:15.480 | Just fell in love with my kids when I saw them.
02:54:17.560 | And that to me really is what love is, just for them, man.
02:54:22.400 | - And I see that through your career
02:54:24.080 | that love developed, which is awesome.
02:54:26.000 | Being able to see the humanity in people.
02:54:31.640 | - I didn't when I was young, the foolishness of youth.
02:54:34.620 | I needed to learn that lesson hard.
02:54:37.920 | When I was young in my career,
02:54:40.120 | it was just about career goals
02:54:42.320 | and arresting people became stats.
02:54:44.200 | You arrest someone, you get a good stat,
02:54:45.720 | you get an atta boy, maybe the boss likes it
02:54:48.520 | and you get a better job or you move up the chain.
02:54:51.020 | It took a real change in my life to see that humanity.
02:54:55.960 | - And I can't wait to listen to you two talk,
02:54:59.920 | which is probably hilarious and insightful
02:55:03.520 | given the lives the two of you have lived
02:55:07.760 | and given how much you've changed each other's lives.
02:55:11.240 | I can't wait to listen, brother.
02:55:12.880 | Thank you so much.
02:55:13.720 | This is a huge honor.
02:55:14.800 | You're an amazing person with an amazing life.
02:55:16.880 | This was an awesome conversation.
02:55:18.360 | - Dude, huge fan.
02:55:19.280 | I love the podcast.
02:55:20.320 | Glad I could be here.
02:55:21.140 | Thanks for the invite.
02:55:22.680 | Some exercise in the brain, too.
02:55:25.040 | It was great.
02:55:25.880 | Great conversation.
02:55:27.080 | - And the heart too, right?
02:55:28.240 | - Oh, yeah, yeah.
02:55:29.080 | You got some tears there at the end.
02:55:30.800 | - Thanks for listening to this conversation
02:55:34.480 | with Chris Tarbell.
02:55:34.480 | To support this podcast,
02:55:35.640 | please check out our sponsors in the description.
02:55:38.240 | And now, let me leave you with some words
02:55:40.320 | from Benjamin Franklin.
02:55:42.160 | They who can give up essential liberty
02:55:44.640 | to obtain a little temporary safety
02:55:46.980 | deserve neither liberty nor safety.
02:55:49.920 | Thank you for listening and hope to see you next time.
02:55:53.680 | (upbeat music)