Everyday Educator - Is Being Biased Really A Bad Thing? with Kathy Gibbens

00:00:00.000 |
Welcome, friends, to this episode of the Everyday Educator podcast. 00:00:09.540 |
I'm your host, Lisa Bailey, and I'm excited to spend some time today with you as we encourage 00:00:15.760 |
one another, learn together, and ponder the delights and challenges that make homeschooling 00:00:24.140 |
Whether you are just considering this homeschooling possibility or deep into the daily delight 00:00:31.800 |
of family learning, I believe you'll enjoy thinking along with us. 00:00:36.720 |
But don't forget, although this online community is awesome, you'll find even closer support in a local community. 00:00:46.900 |
So, go to classicalconversations.com and find a community near you today. 00:00:59.400 |
Great to have you guys, or great to be here with you all today. 00:01:04.660 |
You guys, Kathy Gibbens is in the house today, and she is probably not new to you, but just 00:01:11.940 |
in case, Kathy, will you introduce yourself to our audience? 00:01:19.100 |
In fact, we just graduated our daughter out of Challenge 4, which, oh my goodness, 00:01:27.860 |
I feel like it was just yesterday we were entering the program. 00:01:34.100 |
...whispering the tiniest little presentations, you know, up in front of her class, hiding behind 00:01:41.320 |
And then just like two weeks ago, she confidently defended her senior thesis in front of this 00:01:49.080 |
And it was just like, oh, my goodness, just the journey of a lifetime. 00:01:59.420 |
And you did not say, tell us your other job, because you are sitting in a different seat 00:02:15.700 |
I am the host of the Filter It Through a Brain Cell podcast. 00:02:19.540 |
I've been podcasting for the last couple of years. 00:02:21.680 |
On this podcast, I teach critical thinking to parents, middle schoolers, high schoolers. 00:02:27.520 |
It's really very family friendly, but specifically those age groups, in short, fun episodes. 00:02:33.220 |
The whole podcast is designed so that parents who are running around, going to soccer practice, 00:02:38.320 |
going to the grocery store, family vacation, whatever, they're just living their life. 00:02:42.220 |
They can hit play in the car with their kids, learn about critical thinking, learn a really 00:02:48.020 |
good skill, and then have a conversation about it as a family. 00:02:51.720 |
And it's something that everybody can listen to, everybody can learn from. 00:02:54.960 |
And how I got into this was actually through CC. 00:02:59.080 |
I was the director when my daughter was in Challenge A. 00:03:03.240 |
And one of the things that they learn in second semester Challenge A is they read a book called The Fallacy Detective. 00:03:13.260 |
These seventh graders, they love The Fallacy Detective. 00:03:16.660 |
You know, it's such the perfect age for them to be learning critical thinking. 00:03:20.480 |
They love to find mistakes in the world and what grown-up thinkers are doing wrong. 00:03:34.020 |
I'm like, OK, if they're going to argue, let's teach them how to do it well and how to do it right. 00:03:38.240 |
So the year that my daughter went through Challenge A was 2019-2020 school year. 00:03:43.660 |
OK, so that was the year that COVID hit, shut down the world. 00:03:49.160 |
We're into the internet, online fact checkers. 00:03:52.140 |
We go into this crazy presidential election that happened. 00:03:57.340 |
And the thing that was so interesting to me was watching these kids who had had only 00:04:03.220 |
just an introduction, just a smattering of an introduction, to critical thinking and logical fallacies. 00:04:08.820 |
But watching them go through that time period as opposed to some of their peers who didn't 00:04:17.480 |
The thing that I realized is, number one, you can't fool these kids. 00:04:21.280 |
Watching a political debate with a bunch of kids who are like, oh, that's an ad hominem. 00:04:29.120 |
And I thought, you cannot fool these kids because they got it. 00:04:33.720 |
They know how to think and they know how to recognize it. 00:04:36.420 |
The other thing that it did for them is it drastically lowered the emotional rollercoaster 00:04:43.840 |
that most people were riding because they were able to recognize just bad thinking. 00:04:54.080 |
That it may not be as bad as this person's painting it. 00:05:03.100 |
They didn't have to feel sad and guilty about things. 00:05:08.140 |
And therefore, their mental and emotional health was so much better going through that 00:05:16.940 |
If we can teach, I mean, not just this generation, but let's talk about this generation. 00:05:21.400 |
If we can teach them the skills of thinking well and the value of finding truth, they will 00:05:28.600 |
not be fooled by all this craziness that is all around us. 00:05:35.040 |
What a blessing, because it really enables a whole generation and more, right, whoever's 00:05:46.420 |
The kids who were in logic classes, because they do a whole year of formal logic in Challenge B, and 00:05:54.760 |
By the time they graduate, they get two more years of logic. 00:06:00.440 |
It's really difficult to fool these students. 00:06:04.400 |
And they are much less reactionary because they are much more about thinking the issue through 00:06:12.960 |
and looking for fallacies or, dare I say, bias. 00:06:26.680 |
And I didn't realize that season one was going to be one way, and then we're going to take 00:06:31.120 |
a totally different turn and look at critical thinking through a completely different lens, 00:06:36.460 |
which would be even more fascinating to me than season one. 00:06:39.660 |
So in season two, you guys, and please go listen to it, she is beginning to look, just like 00:06:49.940 |
And I, at first, when you said, I said, oh, yeah, people need to work on their cognitive biases 00:07:02.820 |
And we'll get into it in a minute, but I realized after really pondering what you were presenting 00:07:09.980 |
that I need to take a look at my cognitive biases as well. 00:07:13.520 |
But I'm curious, what made you select that topic and really want to hone in on that specific 00:07:23.100 |
Now, I feel like you're going to do this, but just for the benefit of the listeners out 00:07:28.840 |
there who are thinking, yeah, I want to talk about cognitive bias, but first, I want to know 00:07:42.740 |
I'm going to define a couple of terms, one that we've already 00:07:46.420 |
used because before I studied this in Challenge A, I didn't know what it was. 00:07:50.720 |
So we've been talking about logical fallacies. 00:07:54.600 |
So logic is just your thinking and a fallacy is an error. 00:07:58.540 |
So a logical fallacy is just an error in thinking. 00:08:06.500 |
We've all kind of wondered, well, that doesn't really make sense, but I don't know why. 00:08:10.900 |
Well, logical fallacies teach us what's wrong. 00:08:17.380 |
And we get exposed to them in The Fallacy Detective. 00:08:22.300 |
Did you know that there's over 300 named logical fallacies? 00:08:26.320 |
Like there is no shortage of ways to think wrong. 00:08:30.880 |
So that is helping us recognize bad arguments. 00:08:33.660 |
Now, a cognitive bias is a limitation in our ability to see things objectively. 00:08:42.640 |
So cognitive, again, having to do with thinking, and then a bias is leaning toward one particular side. 00:08:50.140 |
And here's the thing, because you're right, Delise. 00:08:53.200 |
Like we all think, oh, we should be unbiased. 00:08:56.380 |
The reality is there's no such thing as being unbiased. 00:09:06.500 |
So number one, we have biases just because of who we are, how we've been raised, and how 00:09:16.740 |
I am a woman, I'm a wife, I'm a mother, I'm a Christian, I'm all these things, and these 00:09:24.460 |
The second, you know, where I grew up, the country that I grew up in, the experiences that I've had. 00:09:30.200 |
Yeah, how many siblings you have, yeah, what kind of vacations you took. 00:09:34.820 |
Yes, the exposure you've had to the world, like all of these things are going to affect how you see it. 00:09:39.120 |
The second thing for why we have cognitive biases is because of the way that God designed 00:09:45.460 |
our brain, our brain really likes to be efficient. 00:09:49.360 |
Our brain really likes to make sense of things. 00:09:52.940 |
Our brain really likes to protect us and to keep us safe. 00:09:57.480 |
Like there's all these things that our brain is designed to do. 00:10:03.980 |
Our brain likes to take shortcuts because it's faster, it's efficient, it keeps us safe, it 00:10:09.440 |
helps us focus on other things that might be more important at that time. 00:10:12.760 |
And each of those things, actually, it does something very important for us, but it can 00:10:18.780 |
limit us in our ability to think well about the thing that we're seeing. 00:10:26.740 |
They happen naturally, but they will affect our ability to think well and to find objective 00:10:32.960 |
And so really, the big thing is, OK, we just need to be aware of them and take some steps 00:10:40.960 |
So it's not inherently wrong for people to have biases. 00:10:46.740 |
But it is best for us to be aware that we have them and maybe what they are so that it keeps 00:10:56.780 |
us on our guard that we're not pulled in a certain way without thinking about the full picture. 00:11:05.300 |
And it just brings it back to one more reason why, you know, in Scripture, God tells us to 00:11:13.800 |
This is a very active thing that we're told to do. 00:11:21.920 |
We have to actively be on guard with what we're thinking and where our minds and our brains are going. 00:11:27.080 |
And this is one of the reasons is because the brain left on its own will take some shortcuts 00:11:37.360 |
So maybe what our brain does is sort things into piles that are mostly alike. 00:11:45.300 |
And if we aren't careful to differentiate, we could end up with the wrong conclusion because 00:11:58.140 |
I think an example of this that I, even today, just on the way to the studio, I was talking 00:12:07.740 |
And I realized another one of my cognitive biases. 00:12:12.200 |
And it was sitting somewhere that I just wasn't expecting. 00:12:19.000 |
And it came at the intersection of what I believe about the world and my faith. 00:12:24.900 |
And essentially, he was telling me that there is a lot of proof for the fact that these ancient 00:12:33.520 |
Asian cultures had a deep understanding of the world as we would see it from an evangelical Christian perspective. 00:12:43.400 |
So I won't get into it because I'm going to say some of these things wrong. 00:12:47.300 |
But in the way that their language is written, like in their kanji, the story of creation and 00:12:54.500 |
just all kinds of different things are embedded there. 00:12:56.660 |
And it is very old, much older than anything that America can touch or even Europe can touch. 00:13:03.460 |
And I thought to myself, wow, I see that I am sitting on top of an enormous Western bias because 00:13:12.340 |
I just assume, because of the way that I've been taught history, that we need to go and 00:13:20.020 |
educate you about these things that you've, quote, never heard. 00:13:23.600 |
But the truth about what I see on this paper or even hieroglyphics in your country would 00:13:31.100 |
And I'm just circling back around to something that you all have known and perhaps even know 00:13:37.780 |
more about than anything I've ever experienced. 00:13:50.600 |
Have you had any big surprises in your research about cognitive biases, like personally or things 00:13:58.520 |
that sort of just upset the way that you were thinking, Kathy? 00:14:01.580 |
Oh my gosh, Delise, you just nailed it so well. 00:14:06.820 |
And that is the experience of coming to realize, of having the openness and the awareness to 00:14:13.500 |
realize, okay, maybe the way I see things isn't the whole picture here. 00:14:21.820 |
And we see the outcome of people's inability to do that when they're faced with something 00:14:27.480 |
new or something that is different or that they just didn't know. 00:14:32.540 |
It's why there's so many arguments and fights and just all kinds of craziness and drama on 00:14:37.220 |
social media because somebody runs into an idea that goes against what they knew, know, 00:14:42.560 |
or think they know about whatever the topic is. 00:14:45.700 |
And they can't even stop and be a little bit curious about it. 00:14:56.900 |
It's called the Dunning-Kruger effect and it is a cognitive bias. 00:15:02.280 |
The Dunning-Kruger effect is just named after the people who kind of did the study and discovered 00:15:05.940 |
So the name does not help us to understand it at all. 00:15:08.860 |
So, but the Dunning-Kruger effect happens when a person, and this has been scientifically 00:15:14.080 |
studied, when a person learns a little bit about something, they are full of confidence and 00:15:20.100 |
they feel like they know everything about that topic. 00:15:23.080 |
And they will go forth with all the confidence of somebody who has learned a little bit as 00:15:28.480 |
However, for the people who study the thing and go on, the more that 00:15:35.160 |
they learn, the more they realize they don't know. 00:15:41.160 |
And so there is this overconfidence with people who only know 00:15:45.380 |
a little and this deep humility with people who know a lot. 00:15:48.660 |
I experienced this after I graduated from high school, I went to a two-year Bible college. 00:15:53.340 |
And in this two-year Bible college, we essentially went through the whole Bible, right? 00:15:58.720 |
I came out of that two years feeling like I knew everything there was to know about the 00:16:08.820 |
And oh my gosh, you know, at all of 20 years old, I, you know, was just so confident in my knowledge. 00:16:15.040 |
Well, the older I've gotten, the more I have realized, oh my goodness, how much I don't know. 00:16:23.320 |
There are people who have specialized, like one specialty would be the ancient Near East and 00:16:30.340 |
what they would have believed during Bible times. 00:16:33.200 |
They've studied, like they have gone so deep into just this one little aspect of scripture 00:16:37.660 |
and they have spent an entire career doing so. 00:16:41.280 |
And it just made me realize there's not even enough time in a lifetime to learn everything 00:16:49.940 |
And to have the humility to say, there's people who've studied this for years and years that 00:16:56.640 |
I could learn from and I still, you know, and I still don't know, right? 00:17:00.080 |
So there's just so many, there's so many things just like that, that if we can come with humility 00:17:06.620 |
and curiosity, oh my goodness, the world can open up to us. 00:17:17.280 |
And I think like you hit the nail on the head. 00:17:19.740 |
It takes a great deal of humility to realize that we maybe are standing in this position, 00:17:34.820 |
We're asking God to show us and we're convinced we see the whole thing, 00:17:46.020 |
And so the humility to say, everything I know may be true, 00:17:51.600 |
but it may not be everything there is to know. 00:17:55.260 |
And so, it is just so hard to recognize your own bias, 00:18:02.560 |
but it's even harder to have somebody else accuse you of being biased. 00:18:07.900 |
We don't like to think that we're only seeing it from one perspective. 00:18:15.620 |
So, I mean, really the natural response to being accused of bias is to be defensive. 00:18:23.920 |
When somebody says, well, you're just, you think that because you're biased. 00:18:34.920 |
Because, because I'm sure defensiveness is not it. 00:18:38.580 |
And I think that's kind of the natural thing because in our, in our world, 00:18:42.820 |
we see bias as being a negative thing, right? 00:18:55.960 |
And while those things are probably actually true, 00:18:58.240 |
I think the reality is we all just need to come to accept that if somebody accuses us of being biased, they're right. 00:19:07.900 |
We all have different biases and our brains do these things. 00:19:11.020 |
So the question is, you know, the response should be, yeah, you're totally right. 00:19:21.540 |
Like if we could just do that one, if we could come with curiosity, 00:19:24.860 |
that is the number one thing that could change so many things if both people. 00:19:30.700 |
Preferably if both people, but even if one person can come with the attitude of curiosity, oh my gosh, goodness. 00:19:40.980 |
My sister-in-law, and she may never hear our podcast, so she won't know that I'm saying all these nice things about her. 00:19:47.840 |
Years and years ago, I heard her be confronted by somebody who was basically accusing her of bias. 00:19:59.660 |
And I could see that it really bothered her, but she stopped herself and she said to them so kindly, with lots of humility and a great deal of curiosity, that's so interesting. 00:20:19.500 |
I don't have to apologize for the way I see it. 00:20:23.740 |
I could see this as an opportunity for us to share an understanding that will be totally new to both of us because we're coming at it from different sides. 00:20:34.000 |
What if we could both see more of the sphere than we've ever seen before? 00:20:42.820 |
And I think I talked about this in the second episode of season two. 00:20:47.020 |
So coming out of this last election period, right? 00:20:50.840 |
And this happens every election based on who wins and who doesn't win. 00:20:55.140 |
You're going to have very dramatic responses on both sides. 00:20:58.960 |
And then depending on who wins or loses the next time, it switches, right? 00:21:03.120 |
And the people doing the dramatics, it's going to be one side or the other. 00:21:07.560 |
And here's one of the things that I kept seeing: I can't believe so-and-so, I can't believe the other side thinks this, or I can't believe they're having this reaction or this response. 00:21:20.580 |
I totally can understand why both sides are responding and reacting the way they are. 00:21:26.740 |
Because if I had been seeing the things that they've been seeing, if I had been hearing the messages that they've been getting, I would think the same thing too. 00:21:37.420 |
If I had the same social media algorithm that they had, I would be thinking and I would be afraid. 00:21:45.280 |
If all the people around me thought the same thing, I'm probably going to think that. 00:21:52.240 |
And so when we can step back and realize, okay, people think what they think for a reason. 00:21:59.000 |
Now let's step in with curiosity and let's step in with the powerful tool of asking good questions and now we can have a conversation, hopefully. 00:22:08.360 |
Now, not everybody is emotionally able to engage in a conversation and that's not our job to get them there. 00:22:16.740 |
But we can come with that curiosity and we can have that heart. 00:22:21.500 |
Well, and really what I feel like you're showcasing there is empathy, right? 00:22:28.100 |
Like how can we be a more empathetic listener? 00:22:32.300 |
Because I think there is a point to which once you teach a person to think logically, they can say, okay, well, that's just done because X, Y, Z. 00:22:46.320 |
And if you do see the whole picture, it's okay to say, hey, you're having a cow about this, but there's a bigger issue here. 00:22:53.080 |
And to acknowledge both sides, however, love would come at that with a perspective of empathy. 00:23:01.460 |
And so I love what you're saying and you guys are making me think of the fact that, you know, even our eyes, like the human eye, has a limited spectrum of light and a limited ability to see color. 00:23:14.380 |
And we had a whole debate in the office the other day about the color of someone's shirt. 00:23:18.900 |
And the truth is that I'm sure it looks exactly as was described by all parties, but our eyes are different. 00:23:26.800 |
You know, and so like, it doesn't matter if it's the computer distorting the image or if it's the person in the room there with you. 00:23:43.580 |
She's declared her green since she was a child. 00:23:47.140 |
But I think that what you're saying, Kathy, and what you're observing is the fact that the person's eyes actually see that way. 00:23:55.320 |
And you can choose to accept that or not accept that or to learn more about that or not learn more about that. 00:24:01.180 |
But it is still a fact that that's the way they're perceiving things. 00:24:04.640 |
And I'd love to hear you share a little bit about for those who find it perhaps more natural now because they've worked out their logic muscles to identify a fallacy or to observe someone's cognitive bias. 00:24:23.100 |
How can that person protect themselves and keep themselves in a position of humility and walking in love? 00:24:37.880 |
And I have two things that I want to say, so don't let me forget to the second one. 00:24:41.720 |
But I also want to just give a plug for the upper levels of Challenge to answer this question. 00:24:52.380 |
Because it is not enough to just learn, oh, here's the biases and here's the logical fallacies. 00:25:03.620 |
But information can also lead to a harsh and a critical spirit because we can say, oh, well, they're just making fallacies. 00:25:16.560 |
And we just write people off or we just don't engage or we make fun or whatever. 00:25:20.240 |
What we learned to do in Challenge 3 and especially, oh my gosh, I got to direct Challenge 4 this last year. 00:25:26.660 |
And it was so amazing because what the kids learn to do, what the students learn to do is now take that truth and how do we engage the world with it. 00:25:38.580 |
And it's not about beating people over the head. 00:25:41.880 |
It's not about going out there and telling everybody they're idiots, right? 00:25:47.860 |
It is about how can we take the truth and engage culture and how can we take the truth and really be somebody that somebody else will actually listen to and wants to listen to. 00:26:01.140 |
And not that we have to mold ourselves and bend ourselves to culture to do that, but there is a way that we can stand for truth and that we can engage truth in a way that loves people. 00:26:14.740 |
You may disagree with that other person, but they're still made in the image of God. 00:26:20.300 |
And God sees them as a precious son or a precious daughter. 00:26:24.100 |
So how do we engage them with truth as an image bearer of God? 00:26:28.100 |
And that's a skill that can be learned and that we can cultivate. 00:26:32.240 |
And one of the ways that we can do it is we have to show that as parents. 00:26:42.640 |
Oh, I'll never forget practicum last year when it was getting rolled out. 00:26:46.780 |
And for me, I'm one of those people: I thought it was genius. 00:27:01.620 |
I wish like I was like, oh, it just made my heart ache. 00:27:04.060 |
I said, well, you can homeschool your kids and teach them this way. 00:27:09.260 |
But I also think it's fascinating that Leigh Bortins looked at math, something that all of us know, right? 00:27:15.860 |
Delise, to your point, this is just how math is done. 00:27:20.040 |
And she just took all the pieces and changed it all around and jumbled it up and presented it. 00:27:26.540 |
And so for me, I'm sitting there and I'm thinking, this is amazing. 00:27:33.240 |
But not everybody at my table thought of it that way. 00:27:38.360 |
There were these precious, beautiful homeschool mamas that were kind of mad about it because they didn't want to learn a new way of doing math. 00:27:48.280 |
And when, you know, the presenter came around, I said, well, why don't we ask the questions? 00:27:52.600 |
They didn't even want to ask the questions, right? 00:27:55.200 |
And so I thought, okay, so if we can, you know, and it's hard to learn new things. 00:28:01.560 |
It's hard to be open to a new way of thinking because, again, it's how our brains work. 00:28:08.400 |
Our brains want to just go, no, we already know this. 00:28:24.880 |
But again, if we can come with the idea of being open, of asking questions, and just 00:28:32.680 |
be curious, it doesn't ever hurt us to learn something new, right? 00:28:39.960 |
And one of the things that I learned just from directing challenge and just life in general 00:28:46.680 |
is sometimes asking the right question is far more important than getting the right answer. 00:28:53.780 |
And when we can come with this spirit of, wow, let's just learn. 00:28:59.800 |
Learning something new doesn't mean you have to do it. 00:29:09.040 |
Learn about what you don't like or learn about this thing that you disagree with. 00:29:15.420 |
And so if we can just cultivate this spirit of just being curious, right? 00:29:19.960 |
Let me just be curious about this thing and find out. 00:29:22.240 |
And maybe I don't, I don't have to agree with it. 00:29:24.600 |
Whatever, like insert any topic from society, from culture, from religion, insert any topic. 00:29:32.540 |
It's a good mind that can hold two ideas and look at them and say, okay, well, what are 00:29:41.880 |
How do they, like, what do we think about this thing? 00:29:46.140 |
And so we can teach our kids to do that and we can learn, yes, even though our brains are 00:29:56.700 |
That is a way to be winsome, to be winsome as a person. 00:30:02.500 |
And, you know, in CC, we spend all of these years saying that our mission is to know God and to make Him known. 00:30:11.400 |
And so we raise our children to see God in the word and to see God in the world. 00:30:18.080 |
And we give them all the tools of communication that they need to go out and speak God's truth 00:30:25.920 |
But I think that sometimes we forget that one of the ways to make God known is to be 00:30:34.260 |
like Jesus, and to draw people to that spirit of love and winsome curiosity 00:30:45.940 |
and open-mindedness, not open-minded to take in what's not true, but open-minded to 00:30:55.720 |
asking you what you believe and why you think that and to explore it together. 00:31:12.900 |
I'm curious about your perspective of technology. 00:31:18.960 |
I've asked many people about their perspective of technology lately, just because I want to 00:31:25.520 |
And so when we're thinking about thinking critically, teaching our students to think critically, 00:31:32.220 |
teaching them to think logically, identify their biases, and then we put AI in the mix, 00:31:38.160 |
we get an interesting collection of scenarios. 00:31:44.160 |
How do you think the use of AI is affecting or even empowering people? 00:31:54.860 |
I don't want to, you know, I don't want to dictate how you're going to answer. 00:31:57.660 |
Or even empowering people to navigate issues. 00:32:02.860 |
And what would it look like for us to set our children up for success knowing that the presence of AI is not going away? 00:32:15.440 |
It is only going to become ever more pervasive in society. 00:32:21.320 |
And so, yes, this is something that's interesting to me. 00:32:25.200 |
And this is one of the things that I encourage parents to have an ongoing conversation with their kids about. 00:32:36.500 |
My ideas about this may change the more I learn or the more I grow. 00:32:39.580 |
But here's where I'm at today is, okay, AI is a tool that can be used rightly and can be used wrongly. 00:32:48.420 |
I see that there are some incredible applications for AI in culture, in society, in life. 00:32:55.240 |
And I see that there are applications that can absolutely destroy people. 00:33:02.780 |
I have read articles about kids who have trouble, like they're getting into the high 00:33:08.640 |
school and the college years, and they have trouble formulating ideas, having conversations 00:33:16.320 |
They have been relying on AI to do their work and their thinking for them. 00:33:22.700 |
To spit out the answer without processing the information. 00:33:31.360 |
And the brain is the one organ that the more you use it, the more you're able to use it. 00:33:37.180 |
And the less you use it, the less capacity you are going to have in your future to be able 00:33:42.860 |
And so, in that way, I think it can be really dangerous. 00:33:48.960 |
And so, we have to teach our kids, look, just because you get an answer on AI does not mean it's true. 00:33:55.060 |
AI is only as good as what it's feeding from. 00:34:02.240 |
Whatever tool you're using, it could be biased. 00:34:06.820 |
Now, the other thing that I think we have to be very aware of with AI is it is now being 00:34:12.540 |
developed to be a replacement for human connection and relationship. 00:34:17.160 |
So, I got an ad in my social media just recently where it was advertising an AI tool that would 00:34:24.540 |
be a replacement for a therapist or a counselor. 00:34:27.920 |
There are even programs where you can get an AI boyfriend or girlfriend. 00:34:41.020 |
And so, now somebody can have what's meant to be this experience with another flawed but 00:34:49.220 |
beautiful person with a machine that's only going to give you one type of feedback and 00:34:55.560 |
like, oh, just the implications could be really detrimental and scary. 00:35:01.860 |
So, but this is all our kids are going to know. 00:35:05.960 |
So, we have to have this conversation with them and we've got to stay up with what's going on. 00:35:13.960 |
We can't just say, oh, I don't want to think about that. 00:35:18.140 |
Well, you got to think about it because you need to arm them with good thinking about it. 00:35:25.560 |
So, am I hearing you say, and you can say, no, you're not hearing me say that, Lisa, 00:35:30.500 |
but am I hearing you say that you think maybe an appropriate application of AI would be, 00:35:38.920 |
as you said at the beginning, to use it as a tool and then to have your own tools to check it? 00:35:46.680 |
And that's how you see it playing out even in a, quote, classical education. 00:35:52.540 |
I think there is a place for it. 00:35:55.880 |
I think in a classical education, I think there's less of a place for it because we're not necessarily 00:36:04.060 |
I'm talking about in society, I think there's, you know, there's going to be medical applications 00:36:09.200 |
There's going to be technical applications for it where it is going to be fantastic. 00:36:13.820 |
But you can only apply a tool well when you understand the tool. 00:36:19.040 |
And so when we understand what the tool is and what it does, we can apply it well. 00:36:24.220 |
But when we're starting to apply it, where it's replacing our brain, where it's replacing other 00:36:28.880 |
people's, where it's replacing human connection, the way that we're created as human beings to do, 00:36:36.000 |
And so that's where we've got to be very careful. 00:36:41.100 |
I have not seen anything with my daughter going from Foundations to Challenge 4 where she has needed it. 00:36:48.860 |
Like, I don't know that it's got a big place in a classical education, at least to the point 00:36:53.920 |
But that doesn't mean that I don't see applications for it in society in certain places. 00:36:59.600 |
So that's, that is what you're hearing me say, Delise. 00:37:02.460 |
Classical education is so much about wrestling with big ideas in community, wrestling together 00:37:15.700 |
Oh, well, I heard something different, or it made me think in a different way. 00:37:19.240 |
And then you've got a third person saying, well, I saw something different than both of you, 00:37:23.620 |
but maybe the truth is in the middle, and that wrestling. 00:37:27.000 |
And so that is more a relational community aspect of education than AI could produce. 00:37:35.620 |
But I love it, Kathy, that you are thinking about that and that you know that you want to say, 00:37:45.420 |
God, this is what I think today, but I reserve the right to change as I learn because you 00:37:55.200 |
I mean, it's not that you have now arrived and you know enough about logic to have a podcast 00:38:04.920 |
And I think it's because you're such a curious person. 00:38:17.140 |
Number one, it's a mindset that I can just learn whatever I need to know. 00:38:25.880 |
If there's anything I wanted to learn, I could learn it. 00:38:28.840 |
But it's also giving myself the freedom to follow my interest and to say, well, I'm really 00:38:39.060 |
If you've got a kid who's into something, let them go all in. 00:38:42.120 |
Like, there's so much learning that can happen around Legos, right? 00:38:46.440 |
Dinosaurs or whatever the interest is that a child currently has, you know, and because 00:38:55.460 |
I see this, right, even in Classical Conversations, about how, you know, parents, they get nervous. 00:39:05.200 |
Trust me, going into Challenge 4, there was material I didn't know anything about. 00:39:11.540 |
Like, that's never been part of my education. 00:39:15.340 |
But if we can just have the attitude of, OK, well, let's figure it out. 00:39:20.280 |
I'm reading it at the same time that these kids are reading it. 00:39:26.300 |
Like, even if I'm just one day ahead of the kids in my class, that's enough. 00:39:32.440 |
You know, and I think that is one of the most important things. 00:39:35.640 |
It's if we model this mindset that's just excited about learning something new, right? 00:39:40.600 |
And you just be excited about learning something new. 00:39:46.840 |
I mean, that's a great encouragement, because you're right. 00:39:50.600 |
There are a lot of parents who think, oh, I'm going to need to put them in school 00:39:55.580 |
or we're going to have to do dual enrollment or something. 00:39:58.240 |
Because when we get to upper level math or when we get to fourth year Latin or when we 00:40:03.920 |
get to Greek literature, when we get to policy debate, I don't know anything about that. 00:40:09.860 |
I love the encouragement that you have of learning alongside your kids. 00:40:16.020 |
Really, some of the best lessons that you will teach as a lead learner in your home, 00:40:23.120 |
you teach when you don't know what you're doing in that subject, because you teach your child to see 00:40:30.540 |
that even people who know nothing can learn something. 00:40:39.820 |
And it's all about the power of learning to ask good questions. 00:40:43.200 |
And this is the trick to not knowing: asking questions. Because if 00:40:49.020 |
you can ask the question, now you've got something to go off of. 00:40:53.520 |
Because the reason we get stuck is we're like, well, I just... 00:41:10.780 |
Be interested in learning what is the next thing. 00:41:22.080 |
All of us just need to take a deep breath and go learn the next thing. 00:41:29.120 |
And going back to what you were saying, Lisa, a couple of minutes ago, I mean, that is 00:41:34.020 |
really a Christlike blueprint, because he would just ask questions. 00:41:39.280 |
Even though he knew the answers, which to me is just the most humble. 00:41:44.600 |
Like, obviously, we're asking questions and we don't know the answer. 00:41:49.180 |
But he's asking questions and he does know the answer. 00:41:52.740 |
And we're both trying to lead toward a new discovery. 00:42:01.840 |
And I just think about, like, for example, the woman at the well and, like, the beautiful 00:42:07.180 |
discoveries that she unpacked with a series of very simple questions. 00:42:17.680 |
And you're just making me excited, as two women who are further down the road in your homeschooling 00:42:23.180 |
journeys than I am for the things that I'm sure I'm going to discover with my boys by just 00:42:33.060 |
So, you know, Kathy, talking to you has been a joy. 00:42:38.400 |
I said so before we started recording because I just love hearing your thoughts. 00:42:42.120 |
And I know that our listeners probably want to hear more of your thoughts from this conversation. 00:42:46.840 |
So if they're saying, hey, where can I find her online? 00:42:52.820 |
So the podcast is called Filter It Through a Brain Cell. 00:42:55.640 |
They can find it on pretty much any podcast app. 00:42:59.920 |
I recommend go back and start with season one because it kind of builds on itself as you go. 00:43:07.920 |
So people can go to filteritthroughbraincell.com/quiz, and there's a quiz they can take. 00:43:12.700 |
You can have your middle schoolers or high schoolers take it, too. 00:43:16.520 |
I love teaching through memes, headlines, articles like stuff that you see in real life. 00:43:21.620 |
And it's basically, can you figure out what the fallacy is there? 00:43:25.940 |
And for all my Challenge A families or directors, I have a matchup that I created of podcast episodes that goes along with each chapter, each week, of The Fallacy Detective as you're going through that. 00:43:39.140 |
So at filteritthroughbraincell.com/a, you can download it for free. 00:43:43.720 |
And it's just a fun way to go along with what they're learning in Challenge A. So, anyways, yeah. 00:44:00.060 |
I love partnering with... I mean, we love Classical Conversations. 00:44:02.480 |
I love partnering with CC, and I think we should teach our kids to not just know truth, because knowing truth is important. 00:44:15.060 |
But I think the even more important tool and skill is to love truth. 00:44:19.780 |
If we can teach our children to love the truth. 00:44:26.320 |
And that is my hope and that is my goal. I have a lot of hope for this next generation. 00:44:32.560 |
And I pray that they can learn to love the truth. 00:44:39.060 |
So parents, we hope that you have been blessed, that you have been encouraged to go out and learn the next new thing. 00:44:46.780 |
And that, generally, the Everyday Educator encourages you to keep learning on your own and to see what's new. 00:44:54.680 |
You may want to see what's new with Classical Conversations books this summer. 00:44:59.320 |
If you are looking for something to do this summer to expand your mind or to get ahead of your kids, take a look at what is coming out, what's new in the bookstore. 00:45:14.320 |
Kathy mentioned the Math Map, our new classical education approach to teaching math with your family. 00:45:20.620 |
So we've got Math Map Digits, Integers, and Fractions. 00:45:24.880 |
We've got two sets of flashcards that offer some really simple classical ways to practice those essential math skills. 00:45:36.220 |
Math Map Monomials is coming out for our Challenge B families. 00:45:43.760 |
There is a gorgeous new Reasoning Together philosophy textbook. 00:45:53.500 |
It helps our students explore great thinkers' ideas through reading those works and then participating in Community Day in some Socratic discussions. 00:46:08.820 |
And there's always new Copper Lodge Library books. 00:46:13.080 |
Pilgrim's Progress is one of them; they have great illustrations and some really good insights. 00:46:21.180 |
And if you don't know which one you're the most interested in, here's what you do. 00:46:24.820 |
Go to classicalconversations.com forward slash what's new and you'll find everything that's there. 00:46:34.560 |
Well, guys, thank you for listening to our show today. 00:46:42.660 |
And we will see you over on social media at Everyday Educator Podcast, or on YouTube if you want to watch and see Kathy's smiling face; we're over there on the podcast channel on YouTube as well.