
Everyday Educator - Is Being Biased Really A Bad Thing? with Kathy Gibbens



00:00:00.000 | Welcome, friends, to this episode of the Everyday Educator podcast.
00:00:09.540 | I'm your host, Lisa Bailey, and I'm excited to spend some time today with you as we encourage
00:00:15.760 | one another, learn together, and ponder the delights and challenges that make homeschooling
00:00:22.580 | the adventure of a lifetime.
00:00:24.140 | Whether you are just considering this homeschooling possibility or deep into the daily delight
00:00:31.800 | of family learning, I believe you'll enjoy thinking along with us.
00:00:36.720 | But don't forget, although this online community is awesome, you'll find even closer support
00:00:44.520 | in a local CC community.
00:00:46.900 | So, go to classicalconversations.com and find a community near you today.
00:00:54.500 | Welcome, ladies.
00:00:57.840 | Thanks so much, Lisa.
00:00:58.720 | Hi, Delise.
00:00:59.400 | Great to have you guys, or great to be here with you all today.
00:01:01.600 | Yes, we're so happy to have you.
00:01:04.660 | You guys, Kathy Gibbens is in the house today, and she is probably not new to you, but just
00:01:11.940 | in case, Kathy, will you introduce yourself to our audience?
00:01:14.680 | Yes, so I'm Kathy Gibbens.
00:01:16.420 | I am a long-time homeschool mom.
00:01:19.100 | In fact, we just graduated our daughter out of Challenge 4, which, oh my goodness, I feel
00:01:26.120 | like it was...
00:01:26.500 | Congratulations.
00:01:26.740 | Thank you.
00:01:27.860 | I feel like it was just yesterday we were entering the program.
00:01:31.220 | She was four years old and...
00:01:33.300 | Oh, my word.
00:01:34.100 | ...whispering the tiniest little presentations, you know, up in front of her class, hiding behind
00:01:40.640 | our legs.
00:01:41.320 | And then just like two weeks ago, she confidently defended her senior thesis in front of this
00:01:48.180 | panel of judges.
00:01:49.080 | And it was just like, oh, my goodness, just the journey of a lifetime.
00:01:53.460 | So anyways, that's me.
00:01:55.140 | I'm Kathy.
00:01:55.660 | That is so awesome.
00:01:57.580 | That is really awesome.
00:01:59.420 | And you did not tell us your other job, because you are sitting in a different seat
00:02:06.740 | than you're used to sitting in.
00:02:08.500 | You're in the guest seat today.
00:02:10.740 | But usually you're in the host seat.
00:02:12.480 | Tell us.
00:02:13.060 | Tell us about it.
00:02:13.820 | So I am.
00:02:14.720 | And I'll tell you how I got there.
00:02:15.700 | I am the host of the Filter It Through a Brain Cell podcast.
00:02:19.540 | I've been podcasting for the last couple of years.
00:02:21.680 | On this podcast, I teach critical thinking to parents, middle schoolers, high schoolers.
00:02:27.520 | It's really kind of very family friendly, but specifically those age groups in short, fun
00:02:32.880 | episodes.
00:02:33.220 | The whole podcast is designed so that parents who are running around, going to soccer practice,
00:02:38.320 | going to the grocery store, family vacation, whatever, they're just living their life.
00:02:42.220 | They can hit play in the car with their kids, learn about critical thinking, learn a really
00:02:48.020 | good skill, and then have a conversation about it as a family.
00:02:51.720 | And it's something that everybody can listen to, everybody can learn from.
00:02:54.960 | And how I got into this was actually through CC.
00:02:59.080 | I was the director when my daughter was in Challenge A.
00:03:03.240 | And one of the things that they learn in second semester Challenge A is they read a book called
00:03:09.800 | The Fallacy Detective.
00:03:10.960 | Well, these kids loved it.
00:03:13.260 | These seventh graders, they love The Fallacy Detective.
00:03:15.640 | We had so much fun.
00:03:16.660 | You know, it's such the perfect age for them to be learning critical thinking.
00:03:20.480 | They love to find mistakes in the world and what grown-up thinkers are doing wrong.
00:03:27.720 | They're perfect.
00:03:28.580 | Perfect age.
00:03:29.620 | Love it.
00:03:29.760 | I mean, all they want to do is argue anyway.
00:03:32.520 | So it's a perfect time.
00:03:34.020 | I'm like, OK, if they're going to argue, let's teach them how to do it well and how to do it
00:03:37.760 | right.
00:03:38.240 | So the year that my daughter went through Challenge A was the 2019-2020 school year.
00:03:43.660 | OK, so that was the year that COVID hit, shut down the world.
00:03:47.920 | So we're into COVID.
00:03:49.160 | We're into the internet, online fact checkers.
00:03:52.140 | We go into this crazy presidential election that happened.
00:03:57.340 | And the thing that was so interesting to me was watching these kids who had had just
00:04:03.220 | an introduction, just a smattering of critical thinking and logical
00:04:08.360 | fallacies.
00:04:08.820 | But watching them go through that time period as opposed to some of their peers who didn't
00:04:13.620 | have it.
00:04:14.680 | It was incredible.
00:04:15.820 | I was like, oh, my goodness.
00:04:17.480 | The thing that I realized is, number one, you can't fool these kids.
00:04:20.760 | Right.
00:04:21.280 | Watching a political debate with a bunch of kids who are like, oh, that's an ad hominem.
00:04:25.240 | That was a red herring.
00:04:26.860 | Right.
00:04:27.160 | Like, they called it out.
00:04:28.580 | Yeah.
00:04:29.120 | And I thought, you cannot fool these kids because they got it.
00:04:33.720 | They know how to think and they know how to recognize it.
00:04:36.420 | The other thing that it did for them is it drastically lowered the emotional rollercoaster
00:04:43.840 | that most people were riding because they were able to recognize just bad thinking.
00:04:49.540 | And they were-
00:04:50.360 | Right.
00:04:50.420 | That's not necessarily true.
00:04:52.360 | That's somebody else's opinion.
00:04:54.080 | That it may not be as bad as this person's painting it.
00:04:57.340 | Oh, they're just trying to get me emotional.
00:04:59.420 | That's an appeal to emotion.
00:05:00.900 | And so they didn't have to get all angry.
00:05:03.100 | They didn't have to feel sad and guilty about things.
00:05:06.000 | They were able to really look at it.
00:05:08.140 | And therefore, their mental and emotional health was so much better going through that
00:05:14.540 | time period.
00:05:15.000 | And I thought, oh, my goodness.
00:05:16.940 | If we can teach, I mean, not just this generation, but let's talk about this generation.
00:05:21.400 | If we can teach them the skills of thinking well and the value of finding truth, they will
00:05:28.600 | not be fooled by all this craziness that is all around us.
00:05:32.160 | So that's why I started the podcast.
00:05:35.040 | What a blessing, because it really enables a whole generation and more, right, whoever's
00:05:41.600 | listening with them, to not be reactionary.
00:05:45.020 | That's what I noticed.
00:05:46.420 | The kids who were in logic classes, because they do a whole year of formal logic in Challenge B, and
00:05:52.600 | then they keep doing logic.
00:05:54.760 | By the time they graduate, they get two more years of logic.
00:05:59.520 | And so you're right.
00:06:00.440 | It's really difficult to fool these students.
00:06:04.400 | And they are much less reactionary because they are much more about thinking the issue through
00:06:12.960 | and looking for fallacies or, dare I say, bias.
00:06:19.500 | Cognitive bias.
00:06:20.860 | Yeah, exactly.
00:06:22.340 | And I've loved listening to your show.
00:06:24.620 | It's so fascinating to me.
00:06:26.680 | And I didn't realize that season one was going to be one way, and then you were going to take
00:06:31.120 | a totally different turn and look at critical thinking through a completely different lens,
00:06:36.460 | which would be even more fascinating to me than season one.
00:06:39.660 | So in season two, you guys, and please go listen to it, she is beginning to look, just like
00:06:46.660 | Lisa said, at cognitive biases.
00:06:49.940 | And at first, when you said that, I said, oh, yeah, people need to work on their cognitive biases
00:06:56.660 | because, you know, they're really messed up.
00:06:58.640 | It's really bad.
00:06:59.360 | Everyone out here needs a reality check.
00:07:02.820 | And we'll get into it in a minute, but I realized after really pondering what you were presenting
00:07:09.980 | that I need to take a look at my cognitive biases as well.
00:07:13.520 | But I'm curious, what made you select that topic and really want to hone in on that specific
00:07:20.560 | aspect of logical thinking?
00:07:22.240 | Okay.
00:07:23.100 | Now, I feel like you're going to do this, but just for the benefit of the listeners out
00:07:28.840 | there who are thinking, yeah, I want to talk about cognitive bias, but first, I want to know
00:07:33.640 | what this is.
00:07:34.520 | So define it for us.
00:07:36.720 | We'll use one of our classical skills.
00:07:39.080 | Define it for us and then talk about it.
00:07:42.740 | I'm going to define a couple of terms, one that we've already
00:07:46.420 | used, because before I studied this in Challenge A, I didn't know what it was.
00:07:50.720 | So we've been talking about logical fallacies.
00:07:52.700 | What is that?
00:07:53.260 | A logical fallacy.
00:07:54.600 | So logic is just your thinking and a fallacy is an error.
00:07:58.540 | So a logical fallacy is just an error in thinking.
00:08:01.440 | And we all do it.
00:08:03.520 | And we all have been exposed to them.
00:08:06.500 | And we've kind of wondered, well, that doesn't really make sense, but I don't
00:08:09.720 | know why it doesn't make sense.
00:08:10.900 | Well, logical fallacies teach us what's wrong.
00:08:17.380 | And we get exposed to them in The Fallacy Detective.
00:08:19.780 | There are about 33 of them in the book.
00:08:22.300 | Did you know that there's over 300 named logical fallacies?
00:08:26.320 | Like there is no shortage of ways to think wrong.
00:08:29.200 | So that's one side.
00:08:30.880 | So that is helping us recognize bad arguments.
00:08:33.660 | Now, a cognitive bias is a limitation in our ability to see things objectively.
00:08:42.640 | So cognitive, again, having to do with thinking and then a bias is leaning toward one particular
00:08:48.080 | viewpoint or particular action.
00:08:50.140 | And here's the thing, because you're right, Delise.
00:08:53.200 | Like we all think, oh, we should be unbiased.
00:08:55.340 | I'm unbiased.
00:08:56.380 | The reality is there's no such thing as being unbiased.
00:08:59.080 | You can't be.
00:09:01.080 | You can't be.
00:09:02.040 | We all have biases.
00:09:03.980 | Biases.
00:09:04.660 | And we have them for a couple of reasons.
00:09:06.500 | So number one, we have biases just because of who we are, how we've been raised, and how
00:09:13.280 | we view the world.
00:09:13.920 | There's things about me that I can't change.
00:09:16.740 | I am a woman, I'm a wife, I'm a mother, I'm a Christian, I'm all these things, and these
00:09:22.720 | all affect how I view the world.
00:09:24.460 | Also, you know, where I grew up, the country that I grew up in, the experiences that
00:09:29.500 | I've had growing up.
00:09:30.200 | Yeah, how many siblings you have, yeah, what kind of vacations you took.
00:09:34.820 | Yes, the exposure you've had to the world, like all of these things are going to affect.
00:09:39.120 | The second reason why we have cognitive biases is the way that God designed
00:09:45.460 | our brain. Our brain really likes to be efficient.
00:09:49.360 | Our brain really likes to make sense of things.
00:09:52.940 | Our brain really likes to protect us and to keep us safe.
00:09:57.480 | Like there's all these things that our brain is designed to do.
00:10:00.800 | And so therefore, it makes shortcuts.
00:10:03.980 | Our brain likes to take shortcuts because it's faster, it's efficient, it keeps us safe, it
00:10:09.440 | helps us focus on other things that might be more important at that time.
00:10:12.760 | And each of those things actually does something very important for us, but it can
00:10:18.780 | limit us in our ability to think well about the thing that we're seeing.
00:10:22.360 | And that's where these biases come from.
00:10:24.900 | And that's why we all have them.
00:10:26.740 | They happen naturally, but they will affect our ability to think well and to find objective
00:10:32.540 | truth.
00:10:32.960 | And so really, the big thing is, OK, we just need to be aware of them and take some steps
00:10:38.420 | to kind of get past our biases a little bit.
00:10:40.960 | So it's not inherently wrong for people to have biases.
00:10:46.740 | But it is best for us to be aware that we have them, and maybe what they are, so that it keeps
00:10:56.780 | us on our guard and we're not pulled in a certain way without thinking about the full
00:11:04.300 | situation.
00:11:05.300 | And it just brings it back to one more reason why, you know, in Scripture, God tells us to
00:11:11.980 | take every thought captive.
00:11:13.800 | This is a very active thing that we're told to do.
00:11:17.480 | And this is one of the reasons why.
00:11:19.160 | Like, we have to control our thinking.
00:11:21.920 | We have to actively be on guard with what we're thinking and where our minds and our brains are
00:11:26.800 | going.
00:11:27.080 | And this is one of the reasons is because the brain left on its own will take some shortcuts
00:11:33.360 | that might not lead us to truth.
00:11:35.600 | And we want to be aware of that.
00:11:37.360 | So maybe what our brain does is sort things into piles that are mostly alike.
00:11:42.620 | And then our brain just sees them as alike.
00:11:45.300 | And if we aren't careful to differentiate, we could end up with the wrong conclusion because
00:11:52.600 | we didn't follow a path that was different.
00:11:56.120 | Yeah, gotcha.
00:11:57.140 | Yeah.
00:11:57.460 | Well said.
00:11:58.140 | I think an example of this, even today, just on the way to the studio, I was talking
00:12:04.360 | to my brother who lives in Japan right now.
00:12:07.740 | And I realized another one of my cognitive biases.
00:12:12.200 | And it was sitting somewhere that I just wasn't expecting.
00:12:16.600 | So it kind of blindsided me, to be honest.
00:12:19.000 | And it came at the intersection of what I believe about the world and my faith.
00:12:24.900 | And essentially, he was telling me that there is a lot of proof for the fact that these ancient
00:12:33.520 | Asian cultures had a deep understanding of the world as we would see it from an evangelical
00:12:42.020 | perspective.
00:12:43.400 | So I won't get into it because I'm going to say some of these things wrong.
00:12:47.300 | But in the way that their language is written, like in their kanji, the story of creation and
00:12:54.500 | just all kinds of different things are embedded there.
00:12:56.660 | And it is very old, much older than anything that America can touch or even Europe can touch.
00:13:02.560 | It is ancient.
00:13:03.460 | And I thought to myself, wow, I see that I am sitting on top of an enormous Western bias because
00:13:12.340 | I just assume, because of the way that I've been taught history, that we need to go and
00:13:20.020 | educate you about these things that you've, quote, never heard.
00:13:23.600 | But the truth about what I see on this paper or even hieroglyphics in your country would
00:13:29.340 | say that you heard it before me.
00:13:31.100 | And I'm just circling back around to something that you all have known and perhaps even know
00:13:37.780 | more about than anything I've ever experienced.
00:13:41.160 | And it just shook me.
00:13:42.760 | And I thought, wow, I'm not sure.
00:13:46.040 | I'm not sure I know what to do with this.
00:13:48.120 | I don't know what to do with that.
00:13:49.420 | Exactly.
00:13:50.600 | Have you had any big surprises in your research about cognitive biases, like personally or things
00:13:58.520 | that sort of just upset the way that you were thinking, Kathy?
00:14:01.580 | Oh my gosh, Delise, you just nailed it so well.
00:14:06.820 | And that is the experience of coming to realize, of having the openness and the awareness to
00:14:13.500 | realize, okay, maybe the way I see things isn't the whole picture here.
00:14:18.920 | So many people cannot do that.
00:14:21.820 | And we see the outcome of people's inability to do that when they're faced with something
00:14:27.480 | new or something that is different or that they just didn't know.
00:14:30.780 | We see this every day on social media.
00:14:32.540 | It's why there's so many arguments and fights and just all kinds of craziness and drama on
00:14:37.220 | social media because somebody runs into an idea that goes against what they knew, know,
00:14:42.560 | or think they know about whatever the topic is.
00:14:45.700 | And they can't even stop and be a little bit curious about it.
00:14:49.860 | So let me give you a couple.
00:14:52.200 | I have all kinds of examples.
00:14:53.980 | So here's one that I have committed.
00:14:56.900 | It's called the Dunning-Kruger effect and it is a cognitive bias.
00:15:02.280 | The Dunning-Kruger effect is just named after the people who kind of did the study and discovered it.
00:15:05.940 | So the name does not help us to understand it at all.
00:15:08.860 | So, but the Dunning-Kruger effect happens when a person, and this has been scientifically
00:15:14.080 | studied, when a person learns a little bit about something, they are full of confidence and
00:15:20.100 | they feel like they know everything about that topic.
00:15:23.080 | And they will go forth with all the confidence of somebody who has learned a little bit as
00:15:27.560 | if they know everything.
00:15:28.480 | However, for the people who study the thing and go on, the more that
00:15:35.160 | they learn, the more they realize they don't know.
00:15:41.160 | And so there is this overconfidence with people who only know
00:15:45.380 | a little and this deep humility with people who know a lot.
00:15:48.660 | I experienced this after I graduated from high school, I went to a two-year Bible college.
00:15:53.340 | And in this two-year Bible college, we essentially went through the whole Bible, right?
00:15:57.420 | Studied the whole Bible.
00:15:58.720 | I came out of that two years feeling like I knew everything there was to know about the
00:16:04.620 | Bible.
00:16:04.820 | Of course I did.
00:16:05.540 | I had studied it for two years, right?
00:16:07.440 | A lot of time.
00:16:08.820 | And oh my gosh, you know, at all of 20 years old, I, you know, was just so confident in
00:16:14.280 | my Bible knowledge.
00:16:15.040 | Well, the older I've gotten, the more I have realized, oh my goodness, I don't know.
00:16:21.220 | I barely scratched the surface.
00:16:23.320 | There are people who have specialized, like one specialty would be the ancient Near East,
00:16:30.340 | what they would have believed during Bible times.
00:16:33.200 | They've studied, like they have gone so deep into just this one little aspect of scripture
00:16:37.660 | and they have spent an entire career doing so.
00:16:41.280 | And it just made me realize there's not even enough time in a lifetime to learn everything
00:16:47.840 | there is to know about the Bible, right?
00:16:49.940 | And to have the humility to say, there's people who've studied this for years and years that
00:16:56.640 | I could learn from and I still, you know, and I still don't know, right?
00:17:00.080 | So there's just so many, there's so many things just like that, that if we can come with humility
00:17:06.620 | and curiosity, oh my goodness, the world can open up to us.
00:17:11.400 | Yeah, that is, that is so true.
00:17:14.540 | It takes a long time.
00:17:17.280 | And I think like you hit the nail on the head.
00:17:19.740 | It takes a great deal of humility to realize that we maybe are standing in this position,
00:17:27.040 | looking at something and we think we see it.
00:17:31.320 | We've looked at it for a long time.
00:17:33.220 | We're looking at it carefully.
00:17:34.820 | We're asking God to show us and we're convinced we see the whole thing,
00:17:39.600 | except we're not standing over here.
00:17:43.300 | So we're not seeing that whole side.
00:17:46.020 | And so the humility to say, everything I know may be true,
00:17:51.600 | but it may not be everything there is to know.
00:17:55.260 | And it is just so hard to recognize your own bias,
00:18:02.560 | but it's even harder when somebody else accuses you of being biased.
00:18:07.900 | We don't like to think that we're only seeing it from one perspective
00:18:12.200 | and that we might be guilty of bias.
00:18:15.620 | So, I mean, really the natural response to being accused of bias is to be defensive.
00:18:23.000 | That's what I do.
00:18:23.920 | When somebody says, well, you're just, you think that because you're biased.
00:18:29.160 | You don't know the whole story.
00:18:30.280 | I feel very defensive.
00:18:31.860 | But what's a better way to respond?
00:18:34.920 | Because, because I'm sure defensiveness is not it.
00:18:38.580 | And I think that's kind of the natural thing because in our, in our world,
00:18:42.820 | we see bias as being a negative thing, right?
00:18:45.540 | We've always kind of seen it that way.
00:18:47.440 | It'll keep you from seeing the truth.
00:18:49.300 | It'll keep you from being the better person.
00:18:51.340 | It'll keep you from being enlightened.
00:18:55.960 | And while those things are probably actually true,
00:18:58.240 | I think the reality is we all just need to come to accept that if somebody accuses us of being biased,
00:19:03.420 | we need to say, yeah, you're right.
00:19:05.000 | I totally do.
00:19:06.380 | Like we're all biased.
00:19:07.900 | We all have different biases and our brains do these things.
00:19:10.600 | Right.
00:19:11.020 | So the question is, you know, the response should be, yeah, you're totally right.
00:19:17.640 | I'm sure I'm biased.
00:19:18.500 | How do you see this?
00:19:19.620 | Tell me how you see it.
00:19:21.540 | Like if we could just do that one, if we could come with curiosity,
00:19:24.860 | that is the number one thing that could change so many things if both people.
00:19:29.840 | You're so right.
00:19:30.700 | Preferably if both people, but even if one person can come with the attitude of curiosity, oh my goodness.
00:19:37.220 | How do you see that?
00:19:38.040 | I'm super curious.
00:19:39.800 | That is so good.
00:19:40.980 | My sister-in-law is, and she may never hear our podcast, so she won't know that I'm saying all these nice things about her.
00:19:47.840 | Years and years ago, I heard her be confronted by somebody who was basically accusing her of bias.
00:19:59.660 | And I could see that it really bothered her, but she stopped herself and she said to them so kindly, with lots of humility and a great deal of curiosity, that's so interesting.
00:20:10.160 | Help me understand what you're saying.
00:20:15.480 | Help me understand.
00:20:17.260 | And I thought, that's it.
00:20:19.500 | I don't have to apologize for the way I see it.
00:20:23.740 | I could see this as an opportunity for us to share an understanding that will be totally new to both of us because we're coming at it from different sides.
00:20:34.000 | What if we could both see more of the sphere than we've ever seen before?
00:20:39.020 | That would be so awesome.
00:20:41.200 | This was a big one.
00:20:42.820 | And I think I talked about this in the second episode of season two.
00:20:47.020 | So coming out of this last election period, right?
00:20:50.840 | And this happens every election based on who wins and who doesn't win.
00:20:55.140 | You're going to have very dramatic responses on both sides.
00:20:58.960 | And then depending on who wins or loses the next time, it switches, right?
00:21:03.120 | And the people doing the dramatics, it's going to be one side or the other.
00:21:07.560 | And one of the things that I kept seeing is, I can't believe so-and-so, I can't believe the other side thinks this, or I can't believe they're having this reaction or this response.
00:21:18.620 | And I kept thinking, I can.
00:21:20.580 | I totally can understand why both sides are responding and reacting the way they are.
00:21:26.740 | Because if I had been seeing the things that they've been seeing, if I had been hearing the messages that they've been getting, I would think the same thing too.
00:21:36.980 | Right.
00:21:37.420 | If I had the same social media algorithm that they had, I would be thinking and I would be afraid.
00:21:43.160 | I would be excited, whatever it is.
00:21:45.280 | If all the people around me thought the same thing, I'm probably going to think that.
00:21:50.180 | I would think that same way too.
00:21:52.240 | And so when we can step back and realize, okay, people think what they think for a reason.
00:21:59.000 | Now let's step in with curiosity and let's step in with the powerful tool of asking good questions and now we can have a conversation, hopefully.
00:22:08.360 | Now, not everybody is emotionally able to engage in a conversation and that's not our job to get them there.
00:22:16.740 | But we can come with that curiosity and we can have that heart.
00:22:20.600 | Yeah.
00:22:21.500 | Well, and really what I feel like you're showcasing there is empathy, right?
00:22:28.100 | Like how can we be a more empathetic listener?
00:22:32.300 | Because I think there is a point to which once you teach a person to think logically, they can say, okay, well, that's just dumb because X, Y, Z.
00:22:42.420 | You know?
00:22:42.840 | And it might be.
00:22:43.520 | It might be silly.
00:22:44.680 | It might be.
00:22:45.240 | Like, that's okay.
00:22:46.320 | And if you do see the whole picture, it's okay to say, hey, you're having a cow about this, but there's a bigger issue here.
00:22:53.080 | And to acknowledge both sides, however, love would come at that with a perspective of empathy.
00:23:01.460 | And so I love what you're saying and you guys are making me think of the fact that, you know, even our eyes, like the human eye, has a limited spectrum of light and a limited ability to see color.
00:23:14.380 | And we had a whole debate in the office the other day about the color of someone's shirt.
00:23:18.900 | And the truth is that I'm sure it looks exactly as was described by all parties, but our eyes are different.
00:23:26.800 | You know, and so like, it doesn't matter if it's the computer distorting the image or if it's the person in the room there with you.
00:23:33.920 | My mother and I still disagree over one color.
00:23:35.880 | She calls it green and I call it blue.
00:23:37.820 | And I just have to get over it.
00:23:39.600 | Stephanie and I do the same thing.
00:23:41.320 | Yeah.
00:23:41.900 | We have a set of yellow glasses.
00:23:43.580 | She's declared them green since she was a child.
00:23:46.140 | There you go.
00:23:47.140 | But I think that what you're saying, Kathy, and what you're observing is the fact that the person's eyes actually see that way.
00:23:55.320 | And you can choose to accept that or not accept that or to learn more about that or not learn more about that.
00:24:01.180 | But it is still a fact that that's the way they're perceiving things.
00:24:04.640 | And I'd love to hear you share a little bit about for those who find it perhaps more natural now because they've worked out their logic muscles to identify a fallacy or to observe someone's cognitive bias.
00:24:23.100 | How can that person protect themselves and keep themselves in a position of humility and walking in love?
00:24:32.740 | Okay.
00:24:33.500 | I love this question.
00:24:34.800 | I love this question for a couple reasons.
00:24:37.880 | And I have two things that I want to say, so don't let me forget the second one.
00:24:41.720 | But I also, I want to just give a plug for the upper level of challenge to answer this question.
00:24:52.380 | Because it is not enough to just learn, oh, here's the biases and here's the logical fallacies.
00:24:59.420 | Fantastic.
00:25:00.220 | That's right.
00:25:00.640 | Information is great.
00:25:03.620 | But information can also lead to a harsh and a critical spirit because we can say, oh, well, they're just making fallacies.
00:25:11.320 | Okay.
00:25:12.200 | They're just too ignorant to see.
00:25:13.920 | They just don't never know.
00:25:15.200 | They're just dumb.
00:25:16.560 | And we just write people off or we just don't engage or we make fun or whatever.
00:25:20.240 | What we learned to do in Challenge 3 and especially, oh my gosh, I got to direct Challenge 4 this last year.
00:25:26.660 | And it was so amazing because what the kids learn to do, what the students learn to do is now take that truth and how do we engage the world with it.
00:25:38.580 | And it's not about beating people over the head.
00:25:41.880 | It's not about going out there and telling everybody they're idiots, right?
00:25:46.880 | Because, duh.
00:25:47.860 | It is about how can we take the truth and engage culture and how can we take the truth and really be somebody that somebody else will actually listen to and wants to listen to.
00:26:01.140 | And not that we have to mold ourselves and bend ourselves into culture to do that, but there is a way that we can stand for truth and that we can engage truth in a way that loves people.
00:26:13.560 | Because guess what?
00:26:14.740 | You may disagree with that other person, but they're still made in the image of God.
00:26:18.280 | And God loves them.
00:26:20.300 | And God sees them as a precious son or a precious daughter.
00:26:23.420 | Okay.
00:26:24.100 | So how do we engage them with truth as an image bearer of God?
00:26:28.100 | And that's a skill that can be learned and that we can cultivate.
00:26:32.240 | And one of the ways that we can do it is we have to show that as parents.
00:26:37.400 | When we get something new, okay, the Math Map.
00:26:40.340 | Let's just go there.
00:26:41.520 | The Math Map is new.
00:26:42.640 | Oh, I'll never forget practicum last year when it was getting rolled out.
00:26:46.780 | And for me, I'm one of these people that thought it was genius.
00:26:52.300 | I was like, are you kidding me?
00:26:53.920 | I wish I'd learned math that way.
00:26:56.320 | I wish my daughter had learned math that way.
00:26:59.680 | I think she would have done very well.
00:27:01.620 | I was like, oh, it just made my heart ache.
00:27:04.060 | I said, well, you can homeschool your kids and teach them this way.
00:27:06.020 | Do it with your kids and learn.
00:27:07.040 | And you'll get to learn it again.
00:27:08.180 | Redeem your own education.
00:27:09.260 | But I also think it's fascinating that Leigh Bortins looked at math, something that all of us, right?
00:27:15.860 | Delise, to your point, this is just how math is done.
00:27:17.980 | We all know this.
00:27:18.840 | There's only one way to do math.
00:27:20.040 | And she just took all the pieces and changed it all around and jumbled it up and presented it
00:27:25.760 | totally new.
00:27:26.540 | And so for me, I'm sitting there and I'm thinking, this is amazing.
00:27:30.620 | This is so great.
00:27:31.800 | I'm so interested.
00:27:33.240 | But not everybody at my table thought of it that way.
00:27:36.900 | Embraced it that way.
00:27:38.360 | There were these precious, beautiful homeschool mamas that were kind of mad about it because they didn't want to learn a new way of doing math.
00:27:46.740 | They didn't understand it.
00:27:48.280 | And when, you know, the presenter came around, I said, well, why don't we ask the questions?
00:27:52.600 | They didn't even want to ask the questions, right?
00:27:55.200 | And so I thought, okay, so if we can, you know, and it's hard to learn new things.
00:28:01.560 | It's hard to be open to a new way of thinking because, again, it's how our brains work.
00:28:06.980 | Our brains want to keep us safe.
00:28:08.400 | Our brains want to just go, no, we already know this.
00:28:10.560 | We got this.
00:28:11.140 | We know this way and we got there fine.
00:28:13.800 | Yes, we got there fine.
00:28:15.580 | So why mess it all up?
00:28:17.040 | You know, it's scary.
00:28:18.500 | It's new, blah, blah, blah.
00:28:19.540 | All the things, right?
00:28:20.620 | And I get it.
00:28:21.720 | I get it.
00:28:22.240 | This is how, this is what our brains do.
00:28:24.880 | But again, if we can come with the idea of being open, of asking questions, and just
00:28:32.680 | be curious, it doesn't ever hurt us to learn something new, right?
00:28:37.500 | It never hurts us to learn something new.
00:28:39.960 | And one of the things that I learned just from directing challenge and just life in general
00:28:46.680 | is sometimes asking the right question is far more important than getting the right answer.
00:28:53.780 | And when we can come with this spirit of, wow, let's just learn.
00:28:59.040 | Guess what?
00:28:59.800 | Learning something new doesn't mean you have to do it.
00:29:01.860 | Doesn't mean you have to agree with it.
00:29:03.200 | Doesn't mean you have to like it.
00:29:04.580 | But doesn't it help just to know about it?
00:29:07.500 | Why not know it?
00:29:09.040 | Learn about what you don't like or learn about this thing that you disagree with.
00:29:12.740 | It doesn't hurt you at all to do that.
00:29:15.420 | And so if we can just cultivate this spirit of just being curious, right?
00:29:19.960 | Let me just be curious about this thing and find out.
00:29:22.240 | And maybe I don't have to agree with it.
00:29:24.600 | Whatever, like insert any topic from society, from culture, from religion, insert any topic.
00:29:31.420 | Guess what?
00:29:32.540 | It's a good mind that can hold two ideas and look at them and say, okay, well, what are
00:29:37.960 | they?
00:29:38.300 | Let's just find out what they are.
00:29:39.340 | What's the truth?
00:29:40.400 | How do they compare?
00:29:41.880 | Like, what do we think about this thing?
00:29:44.460 | That's a beautiful thing.
00:29:46.140 | And so we can teach our kids to do that and we can learn, yes, even though our brains are
00:29:51.280 | set and all this stuff.
00:29:52.360 | No, they're not.
00:29:53.120 | We can learn these things too.
00:29:54.260 | I love what you're saying, Kathy.
00:29:56.700 | That is a way to be winsome, to be winsome as a person.
00:30:02.500 | And, you know, in CC, we spend all of these years saying that our mission is to know God
00:30:09.680 | and to make him known.
00:30:11.400 | And so we raise our children to see God in the word and to see God in the world.
00:30:18.080 | And we give them all the tools of communication that they need to go out and speak God's truth
00:30:24.940 | on his behalf.
00:30:25.920 | But I think that sometimes we forget that one of the ways to make God known is to be
00:30:34.260 | Jesus, to be like Jesus, and to draw people to that spirit of love and winsome curiosity
00:30:45.940 | and open-mindedness, not open-minded to take in what's not true, but open-minded to
00:30:55.720 | ask you what you believe and why you think that and to explore it together.
00:31:02.200 | That's what Jesus would do.
00:31:04.760 | And so that is one way to make him known.
00:31:08.180 | That's good.
00:31:10.460 | A hundred percent.
00:31:11.320 | I agree.
00:31:12.060 | Yeah.
00:31:12.900 | I'm curious about your perspective of technology.
00:31:18.960 | I've asked many people about their perspective of technology lately, just because I want to
00:31:23.060 | know we're at the cusp of a new era.
00:31:25.520 | And so when we're thinking about thinking critically, teaching our students to think critically,
00:31:32.220 | teaching them to think logically, identify their biases, and then we put AI in the mix,
00:31:38.160 | we get an interesting collection of scenarios.
00:31:44.160 | How do you think the use of AI is affecting or even empowering people?
00:31:54.860 | I don't want to, you know, I don't want to dictate how you're going to answer.
00:31:57.340 | Right.
00:31:57.660 | Or even empowering people to navigate issues.
00:32:02.860 | And what would it look like for us to set our children up for success knowing that the presence
00:32:10.420 | of AI is now permanent?
00:32:12.620 | It is not going away.
00:32:13.740 | Parents, it is not going away.
00:32:15.440 | It is only going to become ever more pervasive in society.
00:32:21.320 | And so, yes, this is something that's interesting to me.
00:32:23.980 | I think it's a great question.
00:32:25.200 | And this is one of the things that I encourage parents to have an ongoing conversation with
00:32:29.960 | their kids about.
00:32:30.480 | Like, we need to be talking about this.
00:32:31.880 | Yeah.
00:32:33.000 | And so, here's where I'm at today, right?
00:32:36.500 | My ideas about this may change the more I learn or the more I grow.
00:32:39.580 | But here's where I'm at today: AI is a tool that can be used rightly and can be used
00:32:46.240 | wrongly.
00:32:46.720 | This is how I see it today.
00:32:48.420 | I see that there are some incredible applications for AI in culture, in society, in life.
00:32:55.240 | And I see that there are applications that can absolutely destroy people.
00:32:59.580 | So, let me give a few examples.
00:33:02.780 | I have read articles about kids who have trouble, like they're getting into the high
00:33:08.640 | school and the college years, and they have trouble formulating ideas, having conversations
00:33:13.400 | because they have not been forced to think.
00:33:16.320 | They have been relying on AI to do their work and their thinking for them.
00:33:21.640 | To spit it out.
00:33:22.700 | To spit it out.
00:33:22.700 | To spit out the answer without processing the information.
00:33:26.160 | So, guess what?
00:33:27.380 | Their brain is atrophied.
00:33:28.500 | They are actually atrophying their brain.
00:33:31.360 | And the brain is the one organ that the more you use it, the more you're able to use it.
00:33:37.180 | And the less you use it, the less capacity you are going to have in your future to be able
00:33:42.380 | to use it.
00:33:42.860 | And so, in that way, I think it can be really dangerous.
00:33:45.900 | The other thing is, AI is sometimes wrong.
00:33:48.960 | And so, we have to teach our kids, look, just because you get an answer on AI does not mean
00:33:54.020 | it's right.
00:33:54.360 | It could be wrong.
00:33:55.060 | AI is only as good as what it's feeding from.
00:33:57.800 | And that can be biased.
00:34:00.380 | And so, guess what?
00:34:02.240 | Whatever tool you're using, it could be biased.
00:34:05.300 | And you have to be aware of it.
00:34:06.820 | Now, the other thing that I think we have to be very aware of with AI is it is now being
00:34:12.540 | developed to be a replacement for human connection and relationship.
00:34:17.160 | So, I got an ad in my social media just recently where it was advertising an AI tool that would
00:34:24.540 | be a replacement for a therapist or a counselor.
00:34:27.920 | There are even programs where you can get an AI boyfriend or a girlfriend.
00:34:35.000 | Oh, my word.
00:34:35.940 | I've heard about that.
00:34:37.060 | I thought it was a joke.
00:34:37.980 | Nope.
00:34:38.660 | It's not a joke.
00:34:39.280 | It's a real thing.
00:34:40.160 | It's totally real.
00:34:41.020 | And so, now somebody can replace what's meant to be this experience with another flawed but
00:34:49.220 | beautiful person with a machine that's only going to give you one type of feedback and
00:34:55.560 | like, oh, just the implications could be really detrimental and scary.
00:35:01.860 | But this is all our kids are going to know.
00:35:05.960 | So, we have to have this conversation with them and we got to stay up with what's going
00:35:11.240 | on and talk to them about this stuff.
00:35:13.360 | Right.
00:35:13.960 | We can't just say, oh, I don't want to think about that.
00:35:16.920 | That's not worth thinking about.
00:35:18.140 | Well, you got to think about it because you need to arm them with good thinking about it.
00:35:24.500 | That's really good.
00:35:25.560 | So, am I hearing you say, and you can say, no, you're not hearing me say that, Delise,
00:35:30.500 | but am I hearing you say that you think maybe an appropriate application of AI would be,
00:35:38.920 | as you said at the beginning, to use it as a tool and then to have your own tools to check it-
00:35:44.740 | Oh, 100%.
00:35:46.680 | And that's how you see it playing out even in a, quote, classical education.
00:35:52.540 | I think there is a place for it.
00:35:55.880 | I think in a classical education, I think there's less of a place for it because we're not necessarily
00:36:01.760 | relying on it that way.
00:36:04.060 | I'm talking about in society, I think there's, you know, there's going to be medical applications
00:36:08.880 | for it.
00:36:09.200 | There's going to be technical applications for it where it is going to be fantastic.
00:36:13.820 | But you can only apply a tool when you understand the tool.
00:36:19.040 | And so when we understand what the tool is and what it does, we can apply it well.
00:36:24.220 | But when we're starting to apply it where it's replacing our brain, where it's replacing other
00:36:28.880 | people, where it's replacing human connection, the way that we're created as human beings to connect,
00:36:35.000 | I think that's wrong.
00:36:36.000 | And so that's where, that's where we've got to be very careful.
00:36:41.100 | I have not seen anything with my daughter going from Foundations to Challenge 4 where she has
00:36:48.340 | needed AI.
00:36:48.860 | Like, I don't know that it's got a big place in a classical education, at least to the point
00:36:53.220 | that we're at.
00:36:53.920 | But that doesn't mean that I don't see applications for it in society in certain places.
00:36:58.240 | So yeah, sure.
00:36:59.600 | So that's, that is what you're hearing me say, Delise.
00:37:01.560 | Sure.
00:37:01.880 | Yeah.
00:37:02.460 | Classical education is so much about wrestling with big ideas in community, wrestling together
00:37:11.700 | to see, this is what I saw.
00:37:13.700 | This is what I heard from what we just read.
00:37:15.700 | Oh, well, I heard something different, or it made me think in a different way.
00:37:19.240 | And then you've got a third person saying, well, I saw something different than both of you,
00:37:23.620 | but maybe the truth is in the middle and, and that wrestling.
00:37:27.000 | And so that is more a relational community aspect of education than AI could produce.
00:37:35.620 | But I love it, Kathy, that you are thinking about that and that you know that you want
00:37:41.200 | to learn more.
00:37:42.000 | I really appreciated you giving the caveat of,
00:37:45.420 | this is what I think today, but I reserve the right to change as I learn, because you
00:37:51.220 | model the lifelong learner so beautifully.
00:37:55.200 | I mean, it's not that you have now arrived and you know enough about logic to have a podcast
00:38:01.200 | that everybody will listen to.
00:38:02.700 | You're still learning.
00:38:04.920 | And I think it's because you're such a curious person.
00:38:07.480 | Like you yourself have a lot of curiosity.
00:38:09.760 | What is it that feeds your curiosity?
00:38:13.060 | Well, I think that's a great question.
00:38:17.140 | Number one, it's a mindset that I can just learn whatever I need to know.
00:38:21.600 | I might not know it, but I can learn it.
00:38:23.340 | But you can learn it.
00:38:24.260 | We can learn it.
00:38:25.080 | We can learn anything.
00:38:25.880 | If there's anything I wanted to learn, I could learn it.
00:38:28.840 | But it's also giving myself the freedom to follow my interest and to say, well, I'm really
00:38:35.380 | interested in this.
00:38:36.740 | So it is with your kids.
00:38:39.060 | If you've got a kid who's into something, let them go all in.
00:38:42.120 | Like, there's so much learning that can happen around Legos, right?
00:38:46.440 | Dinosaurs or whatever the interest is that a child currently has, you know, and because
00:38:55.460 | I see this, right, even in Classical Conversations, about how, you know, parents, they get nervous.
00:39:00.420 | Well, I could never direct challenge.
00:39:02.000 | I could never.
00:39:02.660 | Oh, my word.
00:39:03.360 | I don't know this.
00:39:04.500 | I don't know.
00:39:05.200 | Trust me, going into Challenge 4, I didn't know anything about
00:39:09.040 | Greek literature.
00:39:09.700 | I'd never read the Odyssey before.
00:39:11.540 | Like, that's never been part of my education.
00:39:15.340 | But if we can just have the attitude of, OK, well, let's figure it out.
00:39:20.280 | I'm reading it at the same time that these kids are reading it.
00:39:24.280 | And that's fine.
00:39:26.300 | Like, even if I'm just one day ahead of the kids in my class, that's enough.
00:39:31.480 | That's enough.
00:39:32.440 | You know, and I think that is one of the most important things.
00:39:35.640 | It's that we model this mind that's just excited about learning something new, right?
00:39:40.600 | And you just be excited about learning something new.
00:39:42.620 | That's all you have to do.
00:39:43.620 | Or even just willing to learn something new.
00:39:46.840 | I mean, that's a great encouragement because you're right.
00:39:50.600 | There are a lot of parents who think, oh, I'm going to need to put them in school
00:39:55.580 | or we're going to have to do dual enrollment or something.
00:39:58.240 | Because when we get to upper level math or when we get to fourth year Latin or when we
00:40:03.920 | get to Greek literature, when we get to policy debate, I don't know anything about that.
00:40:08.620 | I'm just out of my depth.
00:40:09.860 | I love the encouragement that you have of learning alongside your kids.
00:40:16.020 | Really, some of the best lessons that you will teach as a lead learner in your home,
00:40:23.120 | you teach when you don't know what you're doing in that subject, because you teach your child to see
00:40:30.540 | that even people who know nothing can learn something.
00:40:36.380 | And that's a great encouragement.
00:40:39.640 | Yeah.
00:40:39.820 | And it's all about the power of learning to ask good questions.
00:40:42.540 | Yeah.
00:40:43.200 | And this is the trick to not knowing: asking the questions, because if
00:40:49.020 | you can ask the question, now you've got something to go off of.
00:40:53.340 | Right.
00:40:53.520 | Because the reason we get stuck is we're like, well, I just
00:40:57.640 | don't get it.
00:40:58.640 | OK, well, you're not helpless.
00:41:00.580 | Nobody is helpless here.
00:41:02.560 | So if you don't know, you don't know.
00:41:04.820 | Right.
00:41:05.160 | Right.
00:41:05.380 | Go back to what you did know.
00:41:06.840 | Yeah.
00:41:07.360 | Just ask the next question.
00:41:08.660 | That's all you have to do.
00:41:09.400 | Ask the next question.
00:41:10.780 | Be interested in learning what is the next thing.
00:41:13.460 | And you don't have to know all of it.
00:41:15.580 | Just find out the next thing.
00:41:17.240 | What's the next thing?
00:41:18.140 | That's all you got to do.
00:41:19.100 | I love that.
00:41:20.180 | The next thing.
00:41:21.260 | So that's right.
00:41:22.080 | All of us just need to take a deep breath and go learn the next thing.
00:41:27.060 | Yeah.
00:41:28.000 | Got it.
00:41:28.760 | Yeah.
00:41:29.120 | And going back to what you were saying, Lisa, a couple of minutes ago, I mean, that is
00:41:34.020 | really a Christ-like blueprint because he would just ask questions.
00:41:39.280 | Even though he knew the answers, which to me is just the most humble.
00:41:44.600 | Like, obviously, we're asking questions and we don't know the answer.
00:41:47.660 | So there goes our limitation.
00:41:49.180 | But he's asking questions and he does know the answer.
00:41:52.740 | And we're both trying to lead toward a new discovery.
00:41:55.740 | And so we can take that simple step.
00:42:01.840 | And I just think about, like, for example, the woman at the well and, like, the beautiful
00:42:07.180 | discoveries that she unpacked with a series of very simple questions.
00:42:12.120 | Where's your husband?
00:42:12.920 | Like, what's going on?
00:42:14.600 | Just tell me a little bit more about you.
00:42:16.480 | And she starts gushing.
00:42:17.680 | And you're just making me excited as two women who are further down the road in your homeschooling
00:42:23.180 | journeys than I am for the things that I'm sure I'm going to discover with my boys by just
00:42:29.680 | not having to have the answers.
00:42:31.500 | You know, that takes that pressure off.
00:42:33.060 | So, you know, Kathy, talking to you has been a joy.
00:42:37.180 | I knew it was going to be.
00:42:38.400 | I said so before we started recording because I just love hearing your thoughts.
00:42:42.120 | And I know that our listeners probably want to hear more of your thoughts from this conversation.
00:42:46.840 | So if they're saying, hey, where can I find her online?
00:42:50.740 | Where would you send them?
00:42:51.960 | Yeah, absolutely.
00:42:52.820 | So the podcast is called Filter It Through a Brain Cell.
00:42:55.640 | They can find it on pretty much any podcast app.
00:42:58.280 | That's a really great place to start.
00:42:59.920 | I recommend going back and starting with season one because it kind of builds on itself as you go.
00:43:04.800 | I do have a free quiz on my website.
00:43:07.920 | So people can go to filteritthroughabraincell.com/quiz to take it.
00:43:12.700 | You can have your middle schoolers or high schoolers take it, too.
00:43:15.620 | It's got 10 different questions.
00:43:16.520 | I love teaching through memes, headlines, articles, stuff that you see in real life.
00:43:21.340 | Right.
00:43:21.620 | And it's basically, can you figure out what the fallacy is there?
00:43:25.940 | And for all my Challenge A families or directors, I have a matchup that I created of podcast episodes that goes along with each chapter, each week, of The Fallacy Detective as you're going through that.
00:43:39.140 | So at filteritthroughabraincell.com/a you can download it for free.
00:43:43.720 | And it's just a fun way to go along with what they're learning in Challenge A. So anyways, yeah.
00:43:48.820 | Or they can find me on Instagram.
00:43:50.640 | I'm on social media.
00:43:51.640 | So either way.
00:43:52.560 | So cool.
00:43:53.660 | That's so cool.
00:43:54.920 | This has been great.
00:43:56.100 | It's been really nice to have you, Kathy.
00:43:58.340 | Well, thanks so much for having me on.
00:44:00.060 | I love partnering with, I mean, we love Classical Conversations.
00:44:02.480 | I love partnering with CC. And to me, I think that if we can teach our kids to not just know truth, because that's important.
00:44:15.060 | But I think the even more important tool and skill is to love truth.
00:44:19.780 | If we can teach our children to love the truth.
00:44:24.120 | That changes a generation.
00:44:26.320 | And that is my hope and that is my goal, that we can help this next generation. I have a lot of hope for this next generation.
00:44:32.560 | And I pray that they can learn to love the truth.
00:44:36.280 | That's awesome.
00:44:38.060 | That's awesome.
00:44:39.060 | So parents, we hope that you have been blessed, that you have been encouraged to go out and learn the next new thing.
00:44:46.780 | And that, generally, the Everyday Educator encourages you to keep learning on your own and to see what's new.
00:44:54.680 | You may want to see what's new with Classical Conversations books this summer.
00:44:59.320 | If you are looking for something to do this summer to expand your mind or to get ahead of your kids, take a look at what is coming out, what's new in the bookstore.
00:45:10.740 | We've got the Math Map Digits coming out.
00:45:14.320 | Kathy mentioned the Math Map, our new classical education approach to teaching math with your family.
00:45:20.620 | So we've got Math Map Digits, Integers, Fractions.
00:45:24.880 | We've got two sets of flashcards that offer some really simple classical ways to practice those essential math skills.
00:45:36.220 | Math Map Monomials is coming out for our Challenge B families.
00:45:40.680 | They will be using that this fall.
00:45:43.760 | There is a gorgeous new Reasoning Together philosophy textbook.
00:45:49.300 | It has got wonderful excerpts.
00:45:53.500 | It helps our students explore great thinkers' ideas through reading those works and then participating in Community Day in some Socratic discussions.
00:46:08.820 | And there's always new Copper Lodge Library books.
00:46:11.320 | We've got four new classics.
00:46:13.080 | Pilgrim's Progress is one of them. They have great illustrations and some really good insights.
00:46:19.080 | You could explore all the new products.
00:46:21.180 | And if you don't know which one you're the most interested in, here's what you do.
00:46:24.820 | Go to classicalconversations.com forward slash what's new and you'll find everything that's there.
00:46:33.880 | Okay?
00:46:34.560 | Well, guys, thank you for listening to our show today.
00:46:38.900 | Thank you again, Kathy, for being here.
00:46:40.900 | It has meant a lot.
00:46:42.660 | And we will see you over on social media at Everyday Educator Podcast or on YouTube if you want to watch, if you want to see Kathy's smiling face, we're over there on the podcast channel on YouTube as well.
00:46:54.860 | So we'll talk to you soon.
00:46:55.880 | Thanks.
00:46:56.340 | All right.
00:46:57.260 | Bye, guys.