
2024-07-30_Errata_et_eorum_causa


Whisper Transcript

00:00:30.160 | Before I get into the main topic of this show, I want to correct an error which I made in the
00:00:35.440 | previous episode here, which was about the value of fully funding a Roth IRA. And I want to not
00:00:42.400 | only correct the error that I made, but I want to tell you why I made that error, because it's
00:00:48.720 | something that's embarrassing to me, but importantly, it may also help you to learn from my mistake.
00:00:54.800 | I got an email from a listener who pointed out two small errors that I had made. Number one,
00:00:59.760 | that the maximum contribution for the year 2024 to a Roth IRA is $7,000, not $6,500.
00:01:07.520 | Upon verification, the listener is absolutely correct. I misstated the 2024 contribution limit
00:01:14.160 | as being $6,500, when in reality, it is $7,000. In addition, the listener says that you can trade
00:01:21.040 | futures in a Roth IRA; he sells puts and trades futures, evidently, in his Roth at
00:01:26.960 | a platform called Tasty Trade. So I assume that I was in error in saying that you can't trade futures.
00:01:34.160 | So these are both important corrections. And the reason for my error has to do with my use of AI
00:01:43.360 | in preparing my show notes. Let me explain. When I do a show, most of the show is written in my
00:01:49.600 | head before I sit down and make notes. So I have most of the content, most of the ideas, basically,
00:01:56.080 | the basic theme of what I want to do and to create is there in my head before I ever open my mouth
00:02:04.560 | or even make any notes. I think about my topics constantly, and sometimes they're the product of
00:02:09.920 | a couple of hours of thinking, sometimes they're the product of years of thinking before I finally
00:02:13.680 | decide, "You know what? I'm going to go ahead and do it." So the show of "Max Out a Roth IRA"
00:02:18.960 | is something that's based upon years and years of thinking of simply, "This should be the
00:02:22.400 | fundamental step." Now, I learned early in the annals of Radical Personal Finance that in order
00:02:27.600 | to keep myself from rambling, it's really good for me to sit down and plop out a few notes that I'm
00:02:33.040 | going to work from because I don't want to go off onto rabbit trails. I don't want to be ineffective
00:02:36.720 | with my use of time. I really want to be focused. And I found that if I just created some basic
00:02:42.000 | notes to speak from, then I can be focused. In years past, I've done extensive notes,
00:02:48.640 | many, many pages of notes, and that allows me to create a very tight show with lots of good
00:02:55.040 | information, but it takes so long to create those notes that it doesn't seem like a good trade-off
00:03:01.440 | of time invested in creating the show versus the actual impact of any one particular podcast.
00:03:08.000 | And so I try to just spend a little time creating some basic notes or outline that I'm going to
00:03:13.040 | speak from. What I have found is that AI tools such as ChatGPT (the one I primarily use)
00:03:20.320 | simplify the process of creating notes because they think relatively structurally. So AI is
00:03:28.240 | really good at generating lists. My brain also works from lists. So having AI as a list maker
00:03:35.440 | for me basically streamlines everything. So when I went to create my most previous episode in this
00:03:42.720 | podcast series, the Roth IRA episode, I sat down and simply decided, okay, well, I know all the
00:03:49.920 | basic reasons I'm going to promote a Roth IRA, but I went to ChatGPT and I just said, give me
00:03:54.960 | a bullet point list of all of the basic features of the Roth IRA and all of the basic rules.
00:04:01.040 | And that created for me a very simple list of rules that I already know. I've already studied,
00:04:05.360 | I've already learned all these things, proven it, taken exams on all this stuff,
00:04:09.840 | but it gave me a current list of some of the things that have changed over the last couple
00:04:15.280 | of years. So I had it all sitting right in front of me and I could reference that while I was
00:04:18.800 | working. In preparation for that podcast, I also used ChatGPT to create a couple of the
00:04:24.320 | spreadsheets that I alluded to in terms of the total accumulation of the Roth IRA account.
00:04:31.360 | I can do all those calculations when they're static, really straightforwardly with just my
00:04:36.400 | simple financial calculator. If I want to take an annual contribution amount and project it forward
00:04:40.960 | to the future, that's simple to do. I can also build a spreadsheet to forecast what I want to
00:04:46.480 | do in the future. And that's what I would need to do generally to have an increasing annual
00:04:52.160 | investment amount with also an increase showing the return on investment. I would need to build
00:04:58.240 | a spreadsheet. That's not an easy thing to do with a financial calculator. But ChatGPT makes
00:05:04.480 | spreadsheets pretty beautifully for me. And I can give it pretty simple instructions, such as
00:05:09.920 | "show me the account value," and I give it the assumptions that I want it to use for present
00:05:16.960 | value and rates of increase. And I can say, project for me what this would look like
00:05:22.720 | with an increasing contribution amount and then do step up contributions for me at age 50.
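[Editor's note: the projection described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the host's actual spreadsheet; the contribution growth rate, return, and step-up amount are assumptions you would set yourself. The $1,000 catch-up figure matches the IRS IRA catch-up amount for 2024, but verify it for your year.]

```python
def project_roth(start_age, end_age, first_contribution,
                 contribution_growth, annual_return,
                 step_up_age=50, catch_up=1000.0):
    """Project a Roth IRA balance with contributions that grow each year
    and a step-up (catch-up) contribution starting at step_up_age."""
    balance = 0.0
    contribution = first_contribution
    for age in range(start_age, end_age):
        amount = contribution + (catch_up if age >= step_up_age else 0.0)
        # Contribute at the start of the year, then grow the whole balance.
        balance = (balance + amount) * (1 + annual_return)
        contribution *= 1 + contribution_growth

    return balance

# Example assumptions: $7,000 first contribution, growing 2% per year,
# 8% annual return, contributing from age 30 to 65.
value = project_roth(30, 65, 7000, 0.02, 0.08)
```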
00:05:28.480 | And so things like that are a very fast and easy task for AI tools like ChatGPT and much faster
00:05:35.920 | than sitting down and creating the spreadsheet myself. So these were the tools that I used to
00:05:40.080 | speak from. Now, where's the problem? Well, the problem comes in that I didn't myself verify
00:05:47.440 | the accuracy of every piece of information from ChatGPT. I know it was directionally correct;
00:05:55.120 | it didn't give me anything really wrong, but I didn't check every single
00:06:02.800 | one. And in fact, if you were to go back and listen to the podcast again, the place where you will hear
00:06:09.680 | me realize that I need to be really careful with what I'm speaking from is when I reviewed the
00:06:17.120 | income limits for Roth IRA contributions. Ordinarily, whenever I'm talking about numbers,
00:06:26.080 | I'll just go straight to irs.gov. I'll pull up the most recent IRS press release that has all of
00:06:32.160 | the 401(k) limits, the income limits, and the Roth limits in it, and I'll pull all of my current
00:06:39.600 | data from that news release. The irs.gov website is perfectly useful; it works fine. But in
00:06:45.360 | this case, ChatGPT had included the income ranges in its output. And as I was
00:06:52.800 | recording the show, alluding to those income ranges, I realized: wait a second, I didn't
00:07:00.000 | double verify this information. Is this number actually correct? Or is this a different year's
00:07:06.160 | number? And if you listen to my voice, you'll hear that I basically tried to generalize the number.
00:07:12.080 | I backed off the specific number and I said, oh, it's about $230,000 instead of using a precise
00:07:18.560 | number. I knew as I was speaking that I wanted to be accurate. And most of the thrust of my message
00:07:25.680 | was to be directionally accurate, big picture concepts. And so I needed to not create the sense
00:07:34.400 | of certainty with the specific number. So I tried to back off the numbers there. And that's why I
00:07:40.080 | kind of hemmed and hawed briefly and then tried to generalize. Because I was realizing in the
00:07:46.000 | moment as I was recording that, wait a second, I didn't verify before hitting record, I didn't
00:07:50.320 | pull up the 2024 numbers. I don't have them here from the IRS website. Is my data source correct?
00:07:57.840 | But I hadn't done the same thing with my use of the $6,500 number. By the way,
00:08:03.600 | both of the numbers that I used, both the phase-out amounts and the contribution limit,
00:08:08.000 | were the 2023 numbers, not the 2024 numbers. But ChatGPT didn't label them as 2024 numbers.
00:08:15.600 | I asked it for current numbers, but I didn't specifically inquire of the software whether or
00:08:21.120 | not it was 2023 or 2024 numbers. And so I point this out to say, number one, it's an error. It's
00:08:27.040 | an error that I committed. And I will be more careful in the future to make certain that I
00:08:32.080 | have the proper current data with me. It's hard for me to remember all the specifics.
00:08:37.040 | There's so many numbers, they change all the time. I don't follow them on a day-to-day basis,
00:08:41.200 | but none of that is an excuse. I'm a professional and I need to do better with it.
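[Editor's note: one simple guard against exactly this kind of year mix-up is to keep figures keyed explicitly by tax year rather than as a bare "current" number. A minimal sketch follows; the 2023 and 2024 figures below match the IRS announcements discussed in this episode, but verify them at irs.gov before relying on them.]

```python
# Roth IRA figures keyed by tax year, so the year can never be ambiguous.
# Amounts per IRS annual announcements; verify at irs.gov before using.
ROTH_IRA_LIMITS = {
    2023: {"contribution_limit": 6500, "mfj_phase_out_start": 218000},
    2024: {"contribution_limit": 7000, "mfj_phase_out_start": 230000},
}

def contribution_limit(tax_year):
    """Return the limit for a given year, failing loudly for a year
    we have no verified data for, instead of silently guessing."""
    if tax_year not in ROTH_IRA_LIMITS:
        raise KeyError(f"No verified data for tax year {tax_year}")
    return ROTH_IRA_LIMITS[tax_year]["contribution_limit"]
```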
00:08:45.280 | But more importantly, this, I think, is the big weakness of using these various tools
00:08:55.200 | like ChatGPT and other AI models. And I think we're going to see more and more of this.
00:08:55.200 | These tools are so good that they lull us into a false sense of security because we don't know
00:09:01.920 | what we don't know. Let me give just one other funny story from when I was younger. When I was
00:09:05.840 | in college, I was working for somebody, and this boss that I had told me to put
00:09:13.760 | together some information and create basically a quick presentation about the Cisco company. Now,
00:09:20.080 | where I'm from in Florida, we have a company called Cisco, spelled S-Y-S-C-O. And Cisco
00:09:28.960 | is a food service distributor, I think a pretty big one. I'm not sure whether they're regional,
00:09:32.080 | national, I don't know. But I was very familiar with that Cisco company. And so I
00:09:39.600 | ignorantly put together all the information on that Cisco food service company. Well, as it turns
00:09:45.120 | out, what my boss actually wanted was information on Cisco Systems, C-I-S-C-O, the big technology
00:09:53.040 | company. And both of them are called Cisco. It's just a difference of spelling. And I was too
00:09:58.320 | ignorant at the time to intuit, well, why would anybody care about this food service company?
00:10:03.200 | What we really needed was information on the electronics technology company. And it's kind of a similar
00:10:09.120 | error: the tool was good, and I did my best job, but there was a gap in my knowledge and I wasn't
00:10:17.440 | smart enough to catch it there. And the same thing is going to happen to us more and more with AI.
00:10:22.720 | AI is writing a lot of our news reports. AI is producing all kinds of content. And you need to
00:10:30.320 | be knowledgeable in a space in order to pick up on it. And once you pick up on it, then you can
00:10:35.760 | identify, wait a second, where is this coming from? Expertise is not less important in a world of
00:10:43.200 | modern AI systems. It's more important because only the real expert is increasingly able to spot
00:10:53.280 | the error in the actual content. It's really hard to articulate these things to non-experts sometimes.
00:11:03.520 | Let me give you one more example. Years ago, a family member had a book that was called
00:11:09.760 | something like The Thomas Jefferson Education. And it was a book that was written by this guy
00:11:15.280 | who was going on and on about how the best education is something referred to as a Thomas
00:11:21.040 | Jefferson education. And I read the book originally years ago, and I thought, wow, that's really cool.
00:11:27.440 | I really want a Thomas Jefferson education for my children. I really want that. And then I forgot
00:11:32.480 | about it. I didn't own the book, but I had seen it. I was really inspired by it, and I moved on.
00:11:36.960 | Then a number of years later, after I myself had been homeschooling my children, after I myself
00:11:42.400 | had read a couple dozen more books on education and really thought about the concepts of everything
00:11:47.440 | involved with it, later on, the family member that had had the book was getting rid of it. And I
00:11:53.920 | picked it up and took it home, and I read it again. And the second time I read through the book,
00:11:58.880 | I thought to myself, wait a second. This is all fake. The guy was speaking with such
00:12:05.040 | sweeping terms about the importance of a classical education, and wouldn't it be amazing if our
00:12:12.320 | children learned their geometry from Euclid? And that's far better than learning it from a textbook
00:12:17.760 | and many other things. I can't even cite at the moment all the specific things. And I thought,
00:12:21.920 | this is a fake. This is not real stuff. This is a guy trying to present ideas that are bigger than
00:12:27.680 | they should be. And while I appreciate the direction he's going, none of this is real.
00:12:31.840 | This is just made up stuff to sound cool, and basically is ripping off classical education,
00:12:37.760 | putting his own spins and terms on it. And this is fake. Well, as it turns out, while I never dug
00:12:43.120 | deeply into the controversy, sometime later, I came across some online controversy that was
00:12:47.520 | exposing the author and showing that the whole movement is basically a fraud. And I thought, well,
00:12:53.360 | I figured that out. But I never could have figured it out the first time around,
00:12:56.960 | because I was so engaged with the ideas. But the next time around, I just knew the whole feeling
00:13:02.720 | was off. So it's nice when mistakes are easily quantifiable, as with which year's contribution
00:13:12.320 | limits we're talking about. But the same thing holds for spotting errors and mistakes in ideas
00:13:17.360 | that are less easily identified and calculated. So be super careful. I think that all of us need
00:13:25.600 | to be using these tools in order to stay current. I use ChatGPT specifically;
00:13:33.120 | I'm trying not to, but that's the one I use all day, every day. But I find these tools are
00:13:39.840 | enormously valuable. And for an expert, for something that you really know about, they're
00:13:45.440 | enormously valuable. They're also, by the way, really valuable for a beginner. Because if you're
00:13:50.880 | trying to learn about something, it can be a great tutor and can really show you how to gain access
00:13:56.320 | to information. I find them wonderful. So many good things to say about them. But they're not
00:14:02.000 | perfect. And if we're not aware of it, and we don't double check them, we've got trouble. Real
00:14:09.680 | trouble. So if you're using these tools, as you almost certainly should be, make certain that
00:14:14.960 | you're verifying them. Don't be lazy like I was: knowing that I was going to be talking about
00:14:20.320 | 2024 numbers, I didn't go ahead and pull up the current IRS numbers, and I didn't double check
00:14:24.480 | everything. If something is really important and you use AI to create a spreadsheet, make certain
00:14:30.640 | that you manually recreate the spreadsheet and manually test it yourself, that you go through
00:14:35.280 | and you carefully check all of the assumptions. Don't be lazy, or you'll be caught with an error,
00:14:40.880 | as I was in yesterday's show. The show still stands. I'm not going to take it down or anything,
00:14:45.840 | because 95% of what I wanted to say was just big picture concepts. The actual specific numbers were
00:14:53.200 | not meaningful at all to the exact concept. And the fact that I mistakenly used 2023 numbers instead
00:14:59.600 | of the current 2024 numbers doesn't matter, and it's not going to matter in 2029 when someone's
00:15:04.320 | listening to this. But the lesson that I am reminded of, and am learning again, because I'm
00:15:10.480 | now embarrassed, and the lesson for you to hopefully learn before you're embarrassed publicly,
00:15:15.520 | is use the tools, but make certain that you fact-check them. Even as the tools themselves
00:15:22.960 | will show you, they make mistakes, and you've got to be good enough to catch the mistakes.
00:15:28.480 | [music]