Back to Index

Confirmation Bias, Encountering Alternatives, and QUITTING Twitter


Chapters

0:00 Cal's Intro
0:09 Question regarding Confirmation Bias from Episode #143 of the Deep Questions Podcast
0:20 Cal's discussion about his math work
0:47 Cal's discussion about his writing work
1:11 Cal's Angle 1 about Confirmation bias
2:44 Cal talks about pragmatic non-fiction books
3:09 Cal talks about caveats
4:50 An individual trying to understand the world
6:12 Listening to the alternative of your beliefs
7:40 Cal's additional thought on when people speak for you
9:20 Cal's final summary

Transcript

This one comes from Charlotte. Charlotte asks, "As an author and academic, how do you combat confirmation bias within your line of work?" Well, Charlotte, when it comes to my main academic work, which up to this point has largely been theoretical computer science, I don't have to worry much about confirmation bias because mathematics doesn't care about your bias.

I have to prove theorems. Either the theorem is true or it's not. Technically speaking, you can check that: look at the math. It either works or it doesn't. If I can't make a proof go through, it doesn't go through. And if I can, it does. So in my primary academic work, I haven't had to worry much about confirmation bias.

In my work as a writer, of course, it's going to rear its head all the time, especially as a writer who makes cultural arguments, who does cultural critique. Confirmation bias is something that is quite relevant. Now, I have two different angles I want to follow here on the issue of confirmation bias.

Angle number one is that it's not necessarily a bad thing. Now, what I mean by that is that we often, especially those of us who aspire to rationality, have the Socratic model in our head of the individual as a perfectly rational being, fully reflective and careful in their thought.

And when we aspire to that model, we want to be sure to try to get every possible bias out of our thinking patterns. However, if you zoom out, if you zoom out and look at a population of people who are thinking and sharing their thoughts, you see that confirmation bias is more a feature than a bug.

Maybe what you want out there is this person wildly advocating for this angle, and that person wildly advocating for another, and a third person really advocating for a nuance. And when all of these different voices collide, what you get is better wisdom than if instead you had everyone trying to work every possible caveat, every possible angle, every possible objection into their own individual take on the issue.

If you look at a lot of big cultural issues, if you look at a lot of scientific progress, they often are moved forward by these clashes between people who, you would say, are suffering individually from confirmation bias. "Let me make the best argument I can for this," "you make the best argument you can for that," and combined, we will have a better understanding of how the world actually functions.

We really do see that this is the way things often play out. One place this comes up, I think, in an interesting way is when you do pragmatic nonfiction writing. I've written advice books. Of course, they have other parts to them as well, but I'm at least partly in the pragmatic nonfiction genre.

And one of the things you learn, and I've talked about this on the show before, but one of the things you learn as you become a professional advice book writer is that caveats make the writing worse. If you are giving advice to someone, make a good pitch for the thing that you are trying to argue is important.

Leave it to the reader who is smart to add their own caveats. The reader is not an idiot. They know that, you know, in my situation, that advice doesn't completely apply. I, as the reader, can figure that out. And I know that. And I can modify the advice accordingly and extract from it what's relevant to me.

When you talk to people who are not professional advice writers, they get really worried: "Well, you don't have all the right caveats on here. You're not acknowledging all the different situations in which this advice might not apply." But the thing is, while that might be reasonable in a one-on-one conversation, where I probably should caveat the advice I'm giving you to fit your actual circumstances, when it comes to writing one-to-many, it makes your writing much worse.

It's clunky, too self-referential, and sluggish if you add all those caveats. It's also condescending to readers, who are smart enough to know, for example, that if you're giving advice about jogging and improving running times, and they have a severely injured knee and can't run, they know that.

And they don't need you to say, "And of course this advice doesn't apply if you have a severely injured knee." They know that's true, and they'll extract from it what they actually need to extract. So I think that's a good illustration of this bigger point that sometimes it's good for people just to take their swings as compellingly as they can.

When all these swings collide, we get more wisdom. Now, however, when we shift over to an individual trying to understand the world, the picture changes. This is no longer you pushing forward your vision for the world, where confirmation bias is not necessarily a bad thing; this is you trying to understand how the world operates by taking in other people's thoughts.

This is where you need to be more wary about confirmation bias. And I would say our current culture of taking in information is essentially entirely defined by confirmation bias. We figure out, "What is my tribe? What do we believe? I need to get to a pep rally. How do I find my way to the pep rally where I'll feel really good hearing people I agree with, hearing not only them say things I agree with, but hearing them trash the other people who disagree with us?

Those are the others. We hate them. Let's all have a kind of weirdly sadistic pep rally where we not only cheer on our team, but discuss how much we hate the other team." That is basically our cultural conversation right now, online in particular, in many different areas. This is a place where I try to resist that.

I do not like teams. I do not like tribes. I feel like it's intellectual abdication to say, "Tell me what I'm supposed to believe, and then I'll go get the pom-poms." I like a wide variety of thought. And what I have found, what I've argued before, is that the best thing to do here is to encounter the very best arguments you can that either disagree with or offer an alternative to things that you feel strongly about.

This is how you dilute individual confirmation bias about the things you believe in. Now, as I talk about on this show, people get worried about that. They say, "If I expose myself to a really good alternative, a really good other thought, then maybe I will dilute my commitment to my team.

Maybe my morality itself will be degraded." It's not what happens. If you believe strongly in something and it's a good thing you believe in, when you encounter really good arguments against it or really good alternatives, that combination strengthens and nuances your belief. It gives you a stronger tie to the thing you believe in.

It gives you a more sophisticated take on why you believe what you believe, what's wrong with the alternatives, where they overlap, and where it gets complicated. Encountering their best alternatives makes you a better defender of the things that are important to you, a better defender than you would be if you instead tried to avoid them, trying to stay within a bubble where you only hear from people who make you feel good.

So I have been arguing for that. We all have confirmation bias when it comes to taking in information about the world. And I say that's a good bias to try to get away from, because when you do try to get away from it, your conviction in whatever you believe tends to get stronger.

The only thing I will add to this is that while it is natural for individuals to say, "I don't really want to encounter these thoughts because they might be, let's say, disagreeable to me," be really wary when other people are saying that for you. If other people are saying, "I don't want you to be exposed to this because I don't think you are smart enough to be exposed to it. I won't be tricked, but you'll be tricked. So let me just tell you what you should look at. And in fact, if I find out you're looking at those other things, there could be some trouble," be wary of those people. Lace up your running shoes and start running.

That has never in the history of the world of ideas ever led to a good place. Be wary of people who tell you what you should or shouldn't encounter. And again, come back to this point that if you are advocating for something publicly, make your case, let other people make their other cases.

When they collide, wisdom grows. That's fine. When you're intaking information about the world, know what you believe in, believe in what you believe in. That's great. You can be on a team, have a jersey, that's fine. We all like to cheer for things. But the way to really strengthen your understanding is to encounter the alternatives, encounter the best faith, smartest arguments that go against what you say, that show different ways of seeing things.

It's not going to turn you into some sort of conspiratorial weirdo. It's going to actually make you smarter. It's going to make your beliefs more nuanced. It's going to give you some more empathy. It's going to make you a better human. And ultimately, it's going to make your understanding of the world better.

So, Charlotte, that's my take on this issue. I have those two angles on it, and I think they can both be true at the same time. We could probably summarize them both, though, or at least resolve 80% of the issues I just talked about, with one simple thing: quit Twitter and never, ever look at Twitter again.

That alone would actually solve most of the problems I'm talking about. So, you know, that's the short answer, and the long answer you just heard.