1008 - IRS Whistleblower
One of the things that we've got 00:00:32.120 |
to do is when you get something wrong, go back and correct the record. And today 00:00:36.020 |
I'm gonna do exactly that. Welcome to Radical Personal Finance, a show 00:00:38.680 |
dedicated to providing you with the knowledge, skills, insight, and 00:00:41.040 |
encouragement you need to live a rich and meaningful life now, while building a 00:00:44.640 |
plan for financial freedom in 10 years or less. My name is Joshua Sheats, I am 00:00:47.920 |
your host, and if you were to go back to June of 2021 and listen to episodes 794 00:00:55.640 |
and 797 of this podcast, you would find my discussion of the ProPublica articles 00:01:03.800 |
titled "The Secret IRS Files, Trove of Never-Before-Seen Records Reveal How the 00:01:08.920 |
Wealthiest Avoid Income Tax." This was a famous series of articles at the time, 00:01:13.840 |
I think it wound up being about a half dozen or at least that's what I read, 00:01:16.480 |
something like that. And these articles were all being written based upon 00:01:21.560 |
publicly available, excuse me, based upon leaked IRS tax returns for some of 00:01:28.760 |
the wealthy people in the United States. Now this also was following a period of 00:01:33.080 |
time in which President Trump's tax returns to the IRS, which he had famously 00:01:38.520 |
refused to disclose publicly as is or was the custom in the United States for 00:01:44.080 |
presidential candidates, his tax returns were leaked publicly and that made a big 00:01:48.840 |
stir in the political space as well. In those episodes, I said I doubted, I was 00:01:53.840 |
skeptical, that anyone would ever be brought to justice, that anyone would ever be 00:01:57.640 |
found. It just seemed as though it was just too neat of a political trick, there 00:02:02.640 |
was too much energy behind the value of the leaks for the political 00:02:08.520 |
cause of the moment, and I didn't think anyone would ever be caught, captured, or 00:02:13.680 |
prosecuted. I stand corrected. We go now to Justice.gov, the official press release 00:02:19.760 |
from the Office of Public Affairs, dated January 29, 2024. 00:02:25.680 |
Headline: "Former IRS Contractor Sentenced for Disclosing Tax Return Information to 00:02:32.000 |
News Organizations." A former IRS contractor was sentenced today to five 00:02:36.920 |
years in prison for disclosing thousands of tax returns without authorization. 00:02:41.960 |
Charles Littlejohn abused his position as a consultant at the Internal Revenue 00:02:46.520 |
Service by disclosing thousands of Americans' federal tax returns and other 00:02:51.040 |
private financial information to news organizations. "He violated his 00:02:55.240 |
responsibility to safeguard the sensitive information that was 00:02:58.040 |
entrusted to his care and now he is a convicted felon," said Acting Assistant 00:03:03.240 |
Attorney General Nicole M. Argentieri of the Justice Department's Criminal 00:03:07.960 |
Division. "Today's sentence sends a strong message that those who violate 00:03:12.960 |
laws intended to protect sensitive tax information will face significant 00:03:17.480 |
punishment." According to court documents, Charles Littlejohn, 38, of Washington, D.C., 00:03:23.800 |
while working at the IRS as a government contractor, stole tax return information 00:03:29.280 |
associated with a high-ranking government official, Public Official A. 00:03:33.120 |
Littlejohn accessed tax returns associated with Public Official A and 00:03:37.760 |
related individuals and entities on an IRS database after using broad search 00:03:43.720 |
parameters designed to conceal the true purpose of his queries. He then uploaded 00:03:49.240 |
the tax returns to a private website in order to avoid IRS protocols established 00:03:54.520 |
to detect and prevent large downloads or uploads from IRS devices or systems. 00:04:00.360 |
Littlejohn then saved the tax returns to multiple personal storage devices, 00:04:05.280 |
including an iPod, before contacting News Organization One. Between around August 00:04:11.680 |
2019 and October 2019, Littlejohn provided News Organization One with the 00:04:17.760 |
tax return information associated with Public Official A. Littlejohn 00:04:21.840 |
subsequently stole additional tax return information related to Public Official A 00:04:26.560 |
and provided it to News Organization One. Beginning in September 2020, News 00:04:32.240 |
Organization One published a series of articles about Public Official A's tax 00:04:36.200 |
returns using the tax return information obtained from Littlejohn. "This sentence 00:04:41.920 |
should serve as a warning to anyone who is considering emulating Mr. Littlejohn's 00:04:45.520 |
actions," said Acting Inspector General Heather Hill of the Treasury, Inspector 00:04:50.280 |
General for Tax Administration, TIGTA. TIGTA relentlessly investigates 00:04:56.120 |
individuals who illicitly access and disclose taxpayer information regardless 00:05:01.120 |
of their personal motivation. TIGTA appreciates the commitment of the 00:05:05.920 |
Criminal Division's Public Integrity Section and the U.S. Attorney's Office in 00:05:09.400 |
ensuring those who abuse their positions of public trust are held accountable for 00:05:14.080 |
their actions." In July and August 2020, Littlejohn separately stole tax return 00:05:19.600 |
information for thousands of the nation's wealthiest individuals. Littlejohn was 00:05:24.160 |
again able to evade detection by uploading the tax return information to 00:05:28.040 |
a private website. In November 2020, Littlejohn disclosed this tax return 00:05:32.560 |
information to News Organization Two, which published nearly 50 articles using 00:05:37.200 |
the stolen data. I guess it wasn't just a half dozen. Littlejohn then obstructed the 00:05:41.240 |
forthcoming investigation into his conduct by deleting and destroying 00:05:44.640 |
evidence of his disclosures. Littlejohn pleaded guilty in October 2023 to 00:05:50.440 |
unauthorized disclosure of tax returns and return information. Now if we go to 00:05:56.920 |
some of the news articles from the time, from CBS News, one of the things that's 00:06:01.840 |
interesting is his statement in court. Before sentencing Littlejohn on Monday 00:06:07.240 |
to the maximum penalty, Federal District Judge Ana Reyes called his actions 00:06:11.360 |
"an attack on our constitutional democracy. He targeted the sitting 00:06:15.840 |
President of the United States of America and that is exceptional by any 00:06:18.720 |
measure," Judge Reyes said. "It cannot be open season on our elected officials." 00:06:23.840 |
Littlejohn made a brief statement before the court acknowledging that "I 00:06:27.920 |
alone am responsible for this crime." He said he was driven by a desire for 00:06:32.480 |
transparency, but was also aware of the potential consequences of his actions. 00:06:36.680 |
"I made my decision with full knowledge that I would likely end up in a 00:06:41.320 |
courtroom to answer for my serious crime," he said. "I used my skills to 00:06:45.960 |
systematically violate the privacy of thousands of people." Littlejohn's 00:06:50.720 |
explanations did not appear to sway the court's sentencing decision. Reyes said 00:06:55.240 |
courts must be an "unbreakable bulwark for American democracy in the face of 00:07:00.000 |
increased threats." The court's job, the judge said, was to make sure that others 00:07:04.720 |
never viewed "this type of conduct as acceptable or justifiable or worth 00:07:09.720 |
the trade-off. We are a nation of laws." I don't know about you, but it doesn't sound to me 00:07:18.960 |
like he's particularly repentant of his crime. Sounds to me like he went into it 00:07:24.880 |
believing that this action that he took was necessary in order to advance his 00:07:31.560 |
cause, his agenda, and that he accepts the five years in prison as a 00:07:38.200 |
reasonable sentence. In a day and age in which people are self-immolating in 00:07:44.680 |
public places for their political beliefs and political concerns in the 00:07:48.760 |
United States, it seems to me that five years is not a particularly long 00:07:52.360 |
sentence, but I've never been to prison, maybe five years in the federal prison 00:07:56.840 |
feels like forever, I don't know. But it certainly is at least worth 00:08:02.120 |
acknowledging that the IRS did follow through and did catch this guy, did 00:08:07.520 |
convict him in a court of law, and he is now in prison. And so we should 00:08:12.760 |
acknowledge that and appreciate that. It's worth being 00:08:16.480 |
appreciative of. The challenge is that though we can acknowledge and appreciate them 00:08:21.920 |
for following through, the deed is done, the events are in the past, they have 00:08:28.400 |
happened, and this probably is something that will happen again in the future. Now, we 00:08:35.280 |
should talk for a moment about privacy. For example, do government officials or 00:08:39.920 |
do wealthy individuals have a right to privacy? I think they do, but not all 00:08:45.880 |
countries acknowledge that. And for example, with all of the hullabaloo 00:08:50.200 |
around President Trump's tax returns, if we want our political candidates to be 00:08:55.440 |
required to disclose their tax returns, then we should pass a law. Congress 00:09:01.120 |
should do its job, pass a law, and require that political candidates make public 00:09:04.920 |
their tax returns. We should not go about it with this unofficial pressure 00:09:09.840 |
campaign, with people going around and stealing the tax returns in order to 00:09:13.840 |
publicize them. That's neither healthy nor helpful. If we believe that 00:09:18.680 |
wealthy individuals should be required to publicly disclose their tax returns, 00:09:22.280 |
then we should pass a law. And Congress should pass a law requiring the 00:09:26.080 |
disclosure of tax returns, and the data can become public. There are some 00:09:31.080 |
countries around the world, at least one I know, where that is the law, and 00:09:35.840 |
that could be the law in the United States or whatever country that you are 00:09:39.960 |
in. But as always, individuals acting outside of the law is not a healthy and 00:09:46.200 |
helpful thing. But it is something that happens, and I was 00:09:51.320 |
interested to see if there was a history of this. So I went and I tried to find, is 00:09:55.280 |
there a history of other kinds of instances in which leaked or stolen tax 00:10:00.680 |
return data has become public? There is a precedent for this with regard to 00:10:09.120 |
US President Richard Nixon. Somebody stole his 1970 and 1971 tax 00:10:17.640 |
return information, and that information was leaked. It's my understanding that 00:10:23.080 |
whoever was involved was never publicly charged or prosecuted for that crime, 00:10:28.960 |
because the political sentiment was very much focused on, "Hey, these 00:10:35.240 |
tax returns are so scandalous," and so the person was never brought to justice. 00:10:39.160 |
There have been a few other times. There was one guy who tried to extort 00:10:43.420 |
presidential candidate Mitt Romney by claiming to have stolen his tax returns and offering to sell those 00:10:51.920 |
documents. As it turned out, it was a hoax, and the individual involved, a man named 00:10:57.480 |
Michael Mancil Brown, didn't actually have any documents, and he was convicted 00:11:01.520 |
for his extortion and fraud attempt. But whistleblowing, or 00:11:06.760 |
leaking information, is a time-honored tradition of individuals taking private 00:11:13.320 |
information and going public with it. And in general, what I have found is that my 00:11:17.920 |
friendliness towards whistleblowers depends much more on whether or not I 00:11:22.920 |
happen to agree with the whistleblower's politics or position than it does about 00:11:27.880 |
any kind of strong moral statement about whistleblowing in general. And I think 00:11:32.880 |
that your experience is probably similar to mine, that when a whistleblower is on 00:11:37.120 |
our side, we're glad to have the person come out and make private information 00:11:41.520 |
public, and we think, "Yay, good for you, good job," and when the 00:11:46.040 |
whistleblower doesn't agree with us, then we're upset about the travesty of 00:11:49.680 |
justice of an individual making private information public. It's just human 00:11:54.280 |
nature, right? What can you do and what can I do? Well, first of all, we can 00:11:58.200 |
acknowledge and appreciate that the IRS has done its job, or that the 00:12:03.980 |
Justice Department did its job: they tracked somebody down, 00:12:07.680 |
prosecuted the individual, and gave him the maximum sentence permitted under 00:12:13.000 |
federal sentencing guidelines. But if that rings hollow for you, as it does for 00:12:18.560 |
me, as a pretty insubstantial penalty for a pretty serious harm, you know, I don't know, 00:12:26.640 |
how would you even value it? It's hard to say. So maybe I'm getting too worked up in 00:12:32.520 |
saying it's an insubstantial penalty. It is what it is. But if it rings true that, 00:12:36.520 |
"Hey, you know what, this ought not to happen," then the question is, "What can you do 00:12:40.440 |
about it?" Because it's up to you what you do about it. And once things have 00:12:47.280 |
happened, once the information is out there, if the information is harmful to 00:12:51.760 |
you, you can't get it back. Once it's been printed on the internet and released, you 00:12:55.120 |
can't get it back. So first of all, I think you should be aware of the 00:13:01.240 |
risk. And many of us are living in a world in which we're simply not aware of 00:13:05.840 |
the risk. I had a funny thing that happened to me at a recent group event. 00:13:13.680 |
I went and I was buying some tickets for a little fair, and the person at the desk 00:13:18.240 |
wanted my name. I'm paying cash for these little tickets for rides and food for my 00:13:23.960 |
children, and the person wanted my name, and I just gave the person an alias. 00:13:27.640 |
And someone standing there, who knows me by another name, turned and looked at me 00:13:31.560 |
and said, "Huh? Like, what's going on?" And they said, "You know, what?" And I just 00:13:36.960 |
explained that anytime you put your name or information into a database, you need 00:13:40.800 |
to expect that that database is going to be published online and is 00:13:45.000 |
fully available for someone else to see. And in a sense it's silly, because 00:13:49.920 |
what's the actual risk to me of, you know, me going and buying tickets 00:13:55.600 |
for a fair for my children and my name being disclosed? Not a big deal. Any more 00:13:59.880 |
than it's a big deal for you to use your legal name when you go and order your 00:14:03.560 |
Starbucks latte. It's not a big deal. But it is a big deal, because you 00:14:09.160 |
need to be aware that any information that you publish is ultimately going to 00:14:12.680 |
wind up on the Internet. And once the information is out there, then there's 00:14:16.960 |
nothing you can do about it. And so at all times, you should be aware of the 00:14:21.000 |
fact that any list or registry that you enrolled in, any information that you 00:14:26.440 |
give to another person, that is all going to be publicly available on the Internet 00:14:31.480 |
in a data dump probably within a few years. Now, most information is not that 00:14:38.440 |
big of a deal. Is it a big deal if your name and your address 00:14:42.400 |
and so on are published online? Well, for some people, it is a big deal. For 00:14:47.000 |
most of us, though, we can go through our lifetime and it's not really a big 00:14:51.000 |
deal. But I recently had an experience in which I was thinking about how big of a 00:14:56.280 |
deal this actually can be. Many years ago, I had an unfortunate experience with one 00:15:05.480 |
of my dogs. One of my children had left the door of our house open. One of my 00:15:10.320 |
dogs had gone out in the neighborhood for a run and, you know, "Hey, I'm free. I'm 00:15:14.560 |
gonna escape and run around," had chased and attacked a stray cat in our 00:15:20.240 |
neighborhood, and the cat ended up dying after being found in the jaws of my dog. 00:15:26.440 |
My dog is not a vicious dog. I don't think that he intended to kill the cat, 00:15:31.480 |
but how do you know between dogs and cats? As it turns out, it was a stray cat, 00:15:35.680 |
which I was very grateful for. Still, it was my dog; my dog went out and killed 00:15:41.120 |
another creature. Thus, I'm responsible for it. And it was something that really 00:15:45.160 |
affected me at the time. There were a lot of lessons and a long story to it that 00:15:48.560 |
I'm gonna skip here, but it was something that really affected me at the time and 00:15:51.760 |
has stayed in my memory for a very long time. Primarily 00:15:59.480 |
because of the moral issue. I was so grateful when I was able to 00:16:05.240 |
ascertain that the cat was a stray cat, and while it was unfortunate, 00:16:10.360 |
and I was very sad, of course, for the loss of life of the cat, I was grateful that I 00:16:14.440 |
didn't have to go to somebody and share that, "Hey, my dog killed your cat," because 00:16:21.040 |
I've thought a lot about the morality of that. Like, what can you possibly do 00:16:24.960 |
about it? How could you make that situation right? If my dog were to get 00:16:30.720 |
out of my house and attack your dog, your family pet, or your cat, your family cat 00:16:36.800 |
who you all love, or your rabbit, or your hamster, or anything else 00:16:40.440 |
that you really love, and my dog kills your beloved animal, how could you 00:16:46.400 |
possibly make that right? How could you morally solve it? And it's one of those 00:16:50.640 |
things where all of the answers are very unsatisfying, and it's 00:16:57.720 |
one of those things where, at the end of the day, the potential results are so 00:17:03.800 |
significant that the only proper course of action is to make sure it never 00:17:07.160 |
happens by preventing it, because it probably can't be morally solved. There's 00:17:12.840 |
not an amount of money that we could figure out that could compensate you for 00:17:17.040 |
loss of life of your beloved family pet, something like that. So I was sharing 00:17:21.320 |
this experience with a friend of mine and sharing my moral question and my 00:17:28.120 |
musing on it, trying to figure out how would you make it right. And this friend 00:17:31.680 |
is a very serious cat lover, and the friend had recently lost a beloved cat. 00:17:38.680 |
And in the interaction, the friend showed me basically the way that he would have 00:17:47.240 |
responded if it had been his beloved cat whom my dog killed, which was flat-out 00:17:53.000 |
scary, because there was violence involved and there was harm being 00:17:57.600 |
threatened towards me and towards my family. And it was a rather shocking 00:18:04.280 |
event, because my friend was role-playing, but genuinely was pointing out how 00:18:09.280 |
serious and emotionally charged his reaction would have been. When I was 00:18:15.080 |
considering that event, I realized that this is just how fast life can change. 00:18:21.040 |
Life can change in an instant. And once it's changed, you can't go back and 00:18:26.960 |
change it. A number of years ago, somebody that I used to work with was murdered. She 00:18:31.120 |
and her husband were having a cookout in their garage in a tony neighborhood in 00:18:35.000 |
South Florida, and a guy walks up to them and murders them just flat-out in cold 00:18:41.040 |
blood. And that event really shook me. It was just a random young guy in his 20s 00:18:47.960 |
who had evidently had a psychotic breakdown, had done some drugs, and for unknown 00:18:55.480 |
reasons murdered these people that I knew. And it was an example of how quickly 00:19:03.200 |
things can change, how quickly people can snap, how quickly they can go crazy. He was 00:19:07.000 |
ultimately found to be criminally insane by the court and sent to a 00:19:13.840 |
mental institution instead of to prison for his crimes. My point simply is 00:19:20.320 |
to say that a lot of times privacy doesn't matter until it does. And then 00:19:25.440 |
when it matters, it really matters really quickly. And I thought if my dog had 00:19:30.960 |
gotten out, run out of the house, which is a totally understandable event, and 00:19:35.200 |
chased down a neighbor's cat, which again is a totally reasonable and 00:19:40.160 |
understandable event, and killed a cat, and it just so happens to be that this 00:19:44.320 |
particular neighbor is emotionally charged, to put it mildly, or flat-out 00:19:49.440 |
crazy, to say it plainly, I would have to disappear immediately, to 00:19:57.080 |
protect my family, to protect my wife. I would have to disappear instantly. And 00:20:00.920 |
it's an example of a situation in which I didn't do anything really wrong. 00:20:06.560 |
I certainly would bear responsibility for allowing the dog to run out of the 00:20:10.920 |
front door, but you know what it's like going in and out of the front door with 00:20:14.720 |
children and strollers and all the rest of the stuff. It's an understandable 00:20:17.680 |
event, and yet if it just so happens to be that you're up against a crazy person 00:20:22.120 |
that day, then things matter. So all data and all information should be viewed not 00:20:27.800 |
with extreme paranoia perhaps, but just suspicion. And what's crazy is that 00:20:33.920 |
we're living in a world in which our data is consistently published 00:20:39.600 |
everywhere, often without our even knowing it. In my state, where I'm from, in 00:20:45.480 |
Florida, not only do they publicly disclose your data online, for 00:20:50.680 |
example when you sign up to vote, when you get a driver's license, 00:20:56.240 |
they publish your voting data, all of it available online, but more 00:21:01.720 |
importantly they actually sell your data. So here's how the law works. The 00:21:05.960 |
government requires you to have a driver's license if you are going to 00:21:13.040 |
enjoy the privilege of driving. So you have to go and get a driver's license. 00:21:16.600 |
That driver's license, by law, has to reflect your physical address as well as 00:21:21.880 |
all of your characteristics, and has to record a current photo of you. Then what 00:21:27.640 |
the Florida state government does is turn around and sell that data in 00:21:31.560 |
order to profit off of the data that they have legally compelled you to 00:21:35.440 |
disclose to the government. Here's a paragraph or two from Action News 00:21:39.760 |
Jacksonville. "Action News Jacksonvestigates learned the state of 00:21:43.080 |
Florida is making hundreds of millions of dollars by selling your DMV 00:21:46.920 |
information. This includes your name, address, and date of birth. Action News 00:21:50.760 |
Jax investigator Ben Becker looked into why it's allowed and how it 00:21:53.720 |
potentially could compromise your financial safety. Becker discovered the 00:21:57.120 |
state of Florida sells your personal DMV information to dozens of private 00:22:00.840 |
companies, mainly data brokers, who can request it or pay for unlimited 00:22:05.920 |
electronic access. Your name, address, date of birth, and even your driver's 00:22:10.280 |
license number are available for as little as a penny per file to send you 00:22:13.600 |
junk mail." And it adds up. "From 2021 to 2023 the state made 263 million dollars 00:22:21.080 |
and you can't opt out. The biggest bulk buyers of your information are data 00:22:28.520 |
brokers like LexisNexis at more than 90 million dollars, followed by 53 million 00:22:32.880 |
dollars from Tessera Data, and 40 million dollars from Safety Holdings." It goes on 00:22:37.680 |
and talks about it. So actually we'll continue down. "In 2023 a Russian-linked 00:22:44.240 |
cyber attack targeting the Louisiana and Oregon DMVs leaked the sensitive data of 00:22:48.320 |
nearly 10 million drivers." It goes on and talks about other attacks and the 263 00:22:52.960 |
million dollars that the state of Florida made during those couple of 00:22:55.640 |
years. So this, I think, is pretty shocking, but it's standard 00:23:01.280 |
operating procedure; no one seems to care, and it doesn't seem to be a big deal for a lot 00:23:04.640 |
of people. But something as simple as your driver's license data is basically 00:23:08.680 |
public access, because anybody with really any connections whatsoever, or any 00:23:13.640 |
serious investigative capabilities, can access the information, or find it 00:23:19.360 |
and buy it. So this, I think, should be concerning. It's an example of how you're 00:23:24.720 |
compelled by the government to create data, and then that data 00:23:28.360 |
is often sold out from under you. So while you may or may not be able to opt 00:23:32.360 |
out of your driver's license data being sold, you can take steps to adjust the 00:23:40.280 |
data that you personally create and put online. And that's something that you 00:23:46.240 |
should be thoughtful about, as thoughtful as possible. So the first step is to limit 00:23:50.620 |
the data that you create and share to only what's necessary. Now another thing 00:23:55.320 |
though, that you can get involved in, is some form of political advocacy. I don't 00:24:00.080 |
see any particularly fruitful political advocacy opportunity, but you might in 00:24:05.520 |
the future. For example, if the residents of the state of Florida would rise up, 00:24:09.160 |
then possibly they could end this abusive practice of the government 00:24:12.680 |
selling data, and that may be something that you could be involved in in your 00:24:16.400 |
state. But I would say that another expression of this is something like the 00:24:21.280 |
tax system in the United States. The way that the tax system works is that you are 00:24:26.600 |
legally obligated to disclose to the US government, on your tax returns, all of 00:24:32.360 |
your personal and confidential information. And the penalties for 00:24:35.720 |
non-disclosure are so significant and severe, that most of us are gonna follow 00:24:41.640 |
the law and go ahead and disclose. I don't want to get into a situation where 00:24:45.560 |
I've got a judge saying, you know, you lied on this and so I'm tossing you in prison. 00:24:49.000 |
The pain of prison is too high, I'm gonna go ahead and disclose. But the 00:24:52.840 |
disclosures are enormous, enormous. And now as they've expanded to disclosures 00:24:58.680 |
of cryptocurrency holdings, things like that, it's become so much more 00:25:04.880 |
significant. And then what you see is that all it takes is one person, one 00:25:10.880 |
person with motivation and opportunity, and now the data of thousands of people 00:25:17.280 |
is compromised. And some of those people are very high-level people. This creates 00:25:22.920 |
not only moral hazard, not only political risk, 00:25:28.960 |
not only a real risk to, say, your reputation or your 00:25:34.400 |
business or your standing in the community, but a real 00:25:38.760 |
physical risk. One of the things that has not generally been present in the United 00:25:43.360 |
States is extortion: crimes of extortion based upon people's personal information 00:25:50.160 |
and personal data. But this is something that has happened in many places around 00:25:56.000 |
the world, and it's something that I think is likely to happen more in the 00:26:01.520 |
United States. And so even something like tax return information is a vector of 00:26:05.800 |
risk that can be quite significant. I'll tell you just one simple example. A number 00:26:11.360 |
of years ago I had a friend who was a taxi driver from Tampico, Mexico, 00:26:16.400 |
and we were chatting about this. And we were discussing, basically, the safety and 00:26:22.720 |
security of Mexico, going back and forth. I'm pretty, you know, pretty relaxed 00:26:28.880 |
about dangerous places, but I really want to know what's real and what's true. And 00:26:33.720 |
so he was explaining to me what's real and what's true on the ground. And he 00:26:37.080 |
said one of the things that you can never do, where he was from, is 00:26:40.440 |
keep your money in the bank. And the reason you can't keep your money 00:26:43.680 |
in the bank is that all of the criminal elements that are there in their society 00:26:49.200 |
have paid informants in the bank who will let them know if somebody has a 00:26:54.360 |
certain amount of money in his bank account. And so let's say that you have 00:26:58.920 |
$15,000 in your bank account. Well, the agent at the bank who has access to the 00:27:04.120 |
personal information of how much money you have in your bank account can go and 00:27:08.240 |
share that information with a local criminal who can then go and kidnap your 00:27:14.720 |
daughter from her school, or from off the street, or off the playground. And that 00:27:19.680 |
person knows exactly how much ransom to demand from you this weekend based upon 00:27:25.960 |
the inside information. So he requests a ransom of $15,000, and you basically have 00:27:30.880 |
no choice but to pay it. Who among us would say no, if we know we have the 00:27:36.480 |
money and by paying the money we might be able to save the life of a beloved 00:27:41.920 |
child? So you create a problem when you have access to data. And you 00:27:49.640 |
say, well, that's only in Mexico. No, it's not. Just imagine for a moment that this 00:27:54.240 |
IRS contractor, this leaker, imagine that he had not gone after public 00:27:59.360 |
figures. Imagine that instead of going after, you know, President Trump at the 00:28:04.680 |
time, and Peter Thiel, and whoever all the other billionaires were that were 00:28:08.280 |
talked about. Imagine that instead of doing that, he had just taken the 00:28:12.920 |
information related to high-income earners who have access to money, not 00:28:18.800 |
mega billionaires who are likely to have security, just guys like you and people 00:28:23.920 |
that you know. You know, normal people with good amounts of income. He goes 00:28:27.760 |
through your private tax return data, figures out who's got hundreds of 00:28:31.680 |
thousands or millions of dollars, and he doesn't go and publish the information 00:28:34.920 |
to the Internet, doesn't go and take it to a journalist, but instead just simply 00:28:38.520 |
passes the private information along to a criminal element in your place, in your 00:28:45.200 |
state, in your city. And now that same model that works so reliably in 00:28:49.880 |
Mexico has now been imported to Kansas City, Kansas or wherever you happen to 00:28:55.000 |
be from. Imagine how chilling that would be. And that's with tax return data, 00:29:01.040 |
which is probably much more highly guarded. Now take that same thinking into 00:29:05.760 |
your local financial institutions, take that information into your local bank, 00:29:09.640 |
your local insurance office, just imagine all the information that is there. It's a 00:29:18.800 |
lot. And there's basically no protection against it, except you being careful with 00:29:24.920 |
the information. So be careful with the information, that's step one. If you can 00:29:29.240 |
advocate for a lighter system, fewer government rules and requirements in 00:29:36.280 |
some way, then that would be great. To be clear, I don't see any opportunity for 00:29:41.280 |
this. It's not there, meaning it's not politically 00:29:45.800 |
feasible, but I could design a system in which it would be possible. For 00:29:51.720 |
example, I don't think taxes should be based upon income. Why not just figure 00:29:57.480 |
out how much tax each person needs to pay in order for us to run the 00:30:00.480 |
government, give each person a bill, and let you keep everything above it? There are, 00:30:03.720 |
around the world, there are governments and countries that offer these kinds of 00:30:08.320 |
tax programs. They're called lump-sum tax programs. And the most famous of them is 00:30:13.560 |
from Switzerland, where you individually go to the government and you negotiate 00:30:17.440 |
your tax rate privately based upon the amount of your consumption spending. And 00:30:22.000 |
so you can be a mega gazillionaire and if you negotiate with the Swiss 00:30:25.120 |
government that, "Hey, I'll pay you guys 300,000 Swiss francs every year," that's 00:30:28.760 |
your tax bill. It doesn't matter how many mega gazillions you make on top of that, 00:30:32.640 |
you just negotiate it with the government. This is widely available 00:30:36.320 |
right now in a couple countries in Europe. Famously, Italy has a 100,000 00:30:39.920 |
euro tax program. Go to Italy, you pay 100,000 euros, you're done. 00:30:43.720 |
Everything above that is yours to keep. Why shouldn't we have more of that? Why 00:30:47.800 |
not negotiate with the US government and say, "All right, my number is $300,000 a 00:30:52.920 |
year and if I pay $300,000 a year, that's it." And we all know why not. It's not 00:30:56.640 |
politically feasible, but it could be done. And it could all be done without 00:31:01.240 |
even requiring disclosure of individuals' income amounts. I'm 00:31:06.760 |
not here to solve it because it's silly to even spend time thinking about 00:31:10.400 |
it. It's not something that is in any way politically feasible at the moment. But 00:31:14.520 |
you need to be aware that these risks are real. And just because you have been 00:31:19.160 |
protected from them for much of your life doesn't make them any less real. 00:31:22.560 |
A lot of Americans see the violence happening around them. They see that 00:31:28.960 |
that violence could increase. You understand that in most of our cases the 00:31:34.160 |
local police departments are overtaxed. And while certainly they are likely to 00:31:38.200 |
give more priority to somebody who has potentially kidnapped your child or 00:31:45.240 |
is extorting you, the world that is possible is still pretty severe. 00:31:51.600 |
And we make it easy through the proliferation of data all around the 00:31:58.000 |
world. Consider it. A couple of practical things. Remember that if this is 00:32:04.520 |
something that you're interested in, if this is something that you have not 00:32:07.400 |
researched before and you don't even know what to do, remember that I teach a 00:32:11.320 |
course on basically this, how to protect yourself. It's called the Hack Proof course. 00:32:16.120 |
Go to hackproofcourse.com. That will give you an 00:32:22.720 |
up-to-date, cutting-edge, but basic and comprehensive plan. What I mean is that 00:32:29.360 |
it's a very simple program, but it's not unsophisticated. It's simple but 00:32:33.240 |
straightforward and really good at teaching you how to protect yourself against identity 00:32:37.600 |
theft and situations like this. And it's the kind of thing that, as with any good 00:32:42.520 |
insurance policy, you've got to do in advance. So if this is of interest to you, 00:32:45.960 |
go to hackproofcourse.com. Link in the show notes for today's show. 00:32:49.140 |
Hackproofcourse.com. Sign up. Buy my course. I think it'd be a great fit for you and be 00:32:54.080 |
enormously helpful to you to take good actions to protect yourself and to 00:32:57.880 |
protect your data. And we can appreciate that the Justice Department did get the 00:33:01.680 |
guy. I don't think, however, that it's likely to be the last. I think that we'll 00:33:08.040 |
see more and more of these kinds of crimes because they fit the new world. 00:33:13.000 |
What I mean is that since there's now much more data collected and available 00:33:19.880 |
through data breaches and data leaks, we can expect to see more sophisticated 00:33:26.000 |
extortion attempts imposed on people based upon the data. And these 00:33:32.280 |
extortion attempts are not exclusively going to be aimed at public figures. They can 00:33:36.800 |
target many other people, who are going to be extorted and blackmailed. So good for 00:33:42.680 |
the Justice Department for doing it. But you need to do everything you can to 00:33:47.040 |
protect yourself going forward as best you can. Hackproofcourse.com can be of 00:33:52.320 |
service to you and I'll be back with you very soon. 00:33:55.000 |