Confirmation Bias, Encountering Alternatives, and QUITTING Twitter
Chapters
0:00 Cal's Intro
0:09 Question regarding Confirmation Bias from Episode #143 of the Deep Questions Podcast
0:20 Cal's discussion about his math work
0:47 Cal's discussion about his writing work
1:11 Cal's Angle 1 about Confirmation Bias
2:44 Cal talks about pragmatic non-fiction books
3:09 Cal talks about caveats
4:50 An individual trying to understand the world
6:12 Listening to alternatives to your beliefs
7:40 Cal's additional thought on when people speak for you
9:20 Cal's final summary
00:00:11.500 | how do you combat confirmation bias within your line of work?"
00:00:16.300 | Well, Charlotte, when it comes to my main academic work,
00:00:20.800 | which up to this point has largely been theoretical computer science,
00:00:24.300 | I don't have to worry much about confirmation bias because
00:00:35.300 | Technically speaking, you can check that, look at the math.
00:00:40.800 | If I can't make a proof go through, it doesn't go through.
00:00:43.800 | So in my primary academic work, I haven't had to worry much
00:00:50.500 | of course, it's going to rear its head all the time,
00:00:53.300 | especially as a writer who makes cultural arguments,
00:00:56.800 | who does cultural critique. Confirmation bias is something that is quite relevant.
00:01:03.100 | Now, I have two different angles I want to follow here
00:01:14.800 | Now, what I mean by that is that we often,
00:01:17.800 | especially those of us who aspire to rationality,
00:01:23.800 | of the individual as the perfectly rational being,
00:01:27.000 | fully reflective and careful in their thought.
00:01:32.800 | we want to be sure to try to get every possible bias out of our thinking patterns.
00:01:39.800 | if you zoom out and look at a population of people
00:01:49.200 | Maybe what you want out there is this person is wildly advocating
00:01:53.200 | for this angle, and this person is wildly advocating for that,
00:01:55.600 | and this person has a nuance they're really advocating for.
00:01:57.900 | And when all of these different voices collide,
00:02:01.200 | what you get is better wisdom than if instead you had everyone
00:02:05.700 | trying to get every possible caveat, every possible angle,
00:02:09.400 | every possible objection, all worked into their own individual take on the issue.
00:02:20.900 | it often is moved forward by these clashes of people
00:02:25.900 | that you would say are suffering individually from confirmation bias.
00:02:29.200 | "Let me make the best argument I can for this,"
00:02:31.500 | "you make the best argument you can for that,"
00:02:33.300 | combined will have a better understanding of how the world actually functions.
00:02:38.200 | We really do see that that is the way that things often execute.
00:02:42.200 | One place this comes up, I think, in an interesting way
00:02:48.900 | So I write advice books, I've written advice books.
00:02:52.200 | Of course, they have other parts to them as well,
00:02:54.500 | but I'm partially part of the pragmatic nonfiction genre.
00:03:00.800 | and I've talked about this on the show before,
00:03:02.300 | but one of the things you learn as you become a professional advice book writer
00:03:14.200 | make a good pitch for the thing that you are trying to argue is important.
00:03:19.900 | Leave it to the reader, who is smart, to add their own caveats.
00:03:37.400 | When you talk to people who are not professional advice writers,
00:03:43.500 | Well, you don't have all the right caveats on here.
00:03:45.600 | You're not acknowledging all the different situations
00:03:49.700 | But the thing is, while that might be reasonable
00:03:52.100 | if you're in conversation with an individual,
00:03:54.300 | while it might make sense if I'm talking to you one-on-one
00:03:57.700 | that I should probably caveat the advice I'm giving you
00:04:05.500 | It's clunky and too self-referential and sluggish
00:04:16.000 | that, you know, if you're giving advice about jogging
00:04:22.800 | and they have a severely injured knee and they can't run,
00:04:33.800 | and they'll extract from it what they actually need to extract.
00:04:36.100 | So I think that's a good illustration of this bigger point
00:04:38.800 | that sometimes it's good for people just to take their swings
00:04:43.400 | When all these swings collide, we get more wisdom.
00:04:50.400 | to an individual trying to understand the world,
00:04:53.100 | so not you trying to push forward your vision for the world,
00:04:58.200 | where confirmation bias is not necessarily a bad thing,
00:05:00.200 | but where you're trying to understand how the world operates,
00:05:07.100 | This is where you need to be more wary about confirmation bias.
00:05:10.200 | And I would say our culture right now of intaking information
00:05:13.500 | is essentially entirely defined by confirmation bias.
00:05:26.200 | where I'll feel really good hearing people I agree with,
00:05:28.600 | hearing not only them say things I agree with,
00:05:31.200 | but hearing them trash the other people who disagree with us?
00:05:35.500 | Let's all have a kind of weirdly sadistic pep rally
00:05:40.400 | but discuss how much we hate the other team."
00:05:42.600 | That is basically our cultural conversation right now,
00:05:45.400 | online in particular, in many different areas.
00:05:55.100 | I feel like it's intellectual abdication to say,
00:06:04.900 | And what I have found, what I've argued before,
00:06:06.500 | is that the best thing to do here is to encounter
00:06:12.100 | that either disagree with or offer an alternative
00:06:17.200 | This is how you dilute individual confirmation bias
00:06:30.000 | to a really good alternative, a really good other thought,
00:06:33.000 | then maybe I will dilute my commitment to my team.
00:06:43.300 | when you encounter really good arguments against it
00:06:46.700 | that combination strengthens and nuances your belief.
00:06:51.500 | It gives you a stronger tie to the thing you believe in.
00:07:01.400 | and where they overlap and where it gets complicated.
00:07:23.000 | when it comes to taking in information about the world.
00:07:26.000 | And I say that's a good bias to try to get away from
00:07:32.800 | actually your bias towards whatever you believe in
00:07:43.600 | "I don't really want to encounter these thoughts
00:07:46.200 | because they might be, let's say, disagreeable to me."
00:07:51.100 | Be really wary when other people are saying that for you.
00:08:04.900 | So let me just tell you what you should look at."
00:08:11.200 | those other things, there could be some trouble.
00:08:13.400 | Lace up your running shoes and start running.
00:08:16.200 | That has never in the history of the world of ideas
00:08:25.100 | that if you are advocating for something publicly,
00:08:28.900 | make your case, let other people make their other cases.
00:08:33.500 | When you're intaking information about the world,
00:08:38.700 | You can be on a team, have a jersey, that's fine.
00:08:42.600 | But the way to really strengthen your understanding
00:08:58.000 | It's going to make your beliefs more nuanced.
00:09:12.200 | I think they can both be true at the same time.
00:09:14.800 | We could probably summarize them all, though.
00:09:18.200 | At least we could probably reduce 80% of the issues
00:09:21.400 | I just talked about, if you just did one simple thing,