
Confirmation Bias, Encountering Alternatives, and QUITTING Twitter


Chapters

0:00 Cal's Intro
0:09 Question regarding Confirmation Bias from Episode #143 of the Deep Questions Podcast
0:20 Cal's discussion about his math work
0:47 Cal's discussion about his writing work
1:11 Cal's Angle 1 about Confirmation bias
2:44 Cal talks about pragmatic non-fiction books
3:09 Cal talks about caveats
4:50 An individual trying to understand the world
6:12 Listening to the alternative of your beliefs
7:40 Cal's additional thought on when people speak for you
9:20 Cal's final summary

Whisper Transcript

00:00:00.000 | [Music]
00:00:05.000 | This one comes from Charlotte.
00:00:07.000 | Charlotte asks, "As an author and academic,
00:00:11.500 | how do you combat confirmation bias within your line of work?"
00:00:16.300 | Well, Charlotte, when it comes to my main academic work,
00:00:20.800 | which up to this point has largely been theoretical computer science,
00:00:24.300 | I don't have to worry much about confirmation bias because
00:00:27.600 | mathematics doesn't care about your bias.
00:00:30.800 | I have to prove theorems.
00:00:32.300 | Either the theorem is true or it's not.
00:00:35.300 | Technically speaking, you can check that: look at the math.
00:00:39.300 | It either works or it doesn't.
00:00:40.800 | If I can't make a proof go through, it doesn't go through.
00:00:42.800 | And if I can, it can.
00:00:43.800 | So in my primary academic work, I haven't had to worry much
00:00:46.800 | about confirmation bias.
00:00:48.800 | In my work as a writer,
00:00:50.500 | of course, it's going to rear its head all the time,
00:00:53.300 | especially as a writer who makes cultural arguments,
00:00:56.800 | who does cultural critique. Confirmation bias is something that is quite relevant.
00:01:03.100 | Now, I have two different angles I want to follow here
00:01:07.800 | on the issue of confirmation bias.
00:01:10.600 | Angle number one
00:01:11.800 | is that it's not necessarily a bad thing.
00:01:14.800 | Now, what I mean by that is that we often,
00:01:17.800 | especially those of us who aspire to rationality,
00:01:20.800 | have the Socratic model in our head
00:01:23.800 | of the individual as the perfectly rational being,
00:01:27.000 | fully reflective and careful in their thought.
00:01:30.200 | And when we aspire to that model,
00:01:32.800 | we want to be sure to try to get every possible bias out of our thinking patterns.
00:01:38.000 | However, if you zoom out,
00:01:39.800 | if you zoom out and look at a population of people
00:01:43.000 | who are thinking and sharing their thoughts,
00:01:44.900 | you see that confirmation bias is
00:01:46.500 | more a feature than a bug.
00:01:49.200 | Maybe what you want out there is this person wildly advocating
00:01:53.200 | for this angle, and this person wildly advocating for that,
00:01:55.600 | and this person really advocating for some nuance.
00:01:57.900 | And when all of these different voices collide,
00:02:01.200 | what you get is better wisdom than if instead you had everyone
00:02:05.700 | trying to get every possible caveat, every possible angle,
00:02:09.400 | every possible objection, all worked into their own individual take on the issue.
00:02:15.000 | If you look at a lot of big cultural issues,
00:02:17.600 | if you look at a lot of scientific progress,
00:02:20.900 | it is often moved forward by these clashes between people
00:02:25.900 | who you would say are suffering individually from confirmation bias.
00:02:29.200 | "Let me make the best argument I can for this,"
00:02:31.500 | "you make the best argument you can for that,"
00:02:33.300 | combined, we'll have a better understanding of how the world actually functions.
00:02:38.200 | We really do see that this is the way things often play out.
00:02:42.200 | One place this comes up, I think, in an interesting way
00:02:45.800 | is when you do pragmatic nonfiction writing.
00:02:48.900 | So I write advice books, I've written advice books.
00:02:52.200 | Of course, they have other parts to them as well,
00:02:54.500 | but I'm partly in the pragmatic nonfiction genre.
00:02:59.200 | And one of the things you learn,
00:03:00.800 | and I've talked about this on the show before,
00:03:02.300 | but one of the things you learn as you become a professional advice book writer
00:03:07.200 | is that caveats make the writing worse.
00:03:11.000 | If you are giving advice to someone,
00:03:14.200 | make a good pitch for the thing that you are trying to argue is important.
00:03:19.900 | Leave it to the reader, who is smart, to add their own caveats.
00:03:24.300 | The reader is not an idiot.
00:03:25.500 | They know that, you know, in my situation,
00:03:27.400 | that advice doesn't completely apply.
00:03:29.200 | I, as the reader, can figure that out.
00:03:31.500 | And I know that.
00:03:33.100 | And I can modify the advice accordingly
00:03:35.500 | and extract from it what's relevant to me.
00:03:37.400 | When you talk to people who are not professional advice writers,
00:03:41.200 | they get really worried.
00:03:43.500 | Well, you don't have all the right caveats on here.
00:03:45.600 | You're not acknowledging all the different situations
00:03:47.800 | in which this advice might not apply.
00:03:49.700 | But the thing is, while that might be reasonable
00:03:52.100 | if you're in conversation with an individual,
00:03:54.300 | while it might make sense if I'm talking to you one-on-one
00:03:57.700 | that I should probably caveat the advice I'm giving you
00:04:00.700 | to actually fit your circumstances,
00:04:02.300 | when it comes to writing one-to-many,
00:04:04.100 | it makes your writing much worse.
00:04:05.500 | It's clunky and too self-referential and sluggish
00:04:09.600 | if you add all those caveats.
00:04:10.800 | It's also condescending to readers,
00:04:12.500 | who are smart enough to know, for example,
00:04:16.000 | that, you know, if you're giving advice about jogging
00:04:20.900 | and getting better at your running times,
00:04:22.800 | and they have a severely injured knee and they can't run,
00:04:26.700 | they know that.
00:04:28.100 | And they don't need you to say,
00:04:29.600 | "And of course this advice doesn't apply
00:04:31.100 | if you have a severely injured knee."
00:04:32.500 | They know that's true,
00:04:33.800 | and they'll extract from it what they actually need to extract.
00:04:36.100 | So I think that's a good illustration of this bigger point
00:04:38.800 | that sometimes it's good for people just to take their swings
00:04:41.600 | as compellingly as they can.
00:04:43.400 | When all these swings collide, we get more wisdom.
00:04:48.000 | Now when we shift over, however,
00:04:50.400 | to an individual trying to understand the world,
00:04:53.100 | so not you trying to push forward your vision for the world,
00:04:58.200 | where confirmation bias is not necessarily a bad thing,
00:05:00.200 | but you trying to understand how the world operates,
00:05:02.400 | taking in other people's thoughts.
00:05:07.100 | This is where you need to be more wary about confirmation bias.
00:05:10.200 | And I would say our culture right now of taking in information
00:05:13.500 | is essentially entirely defined by confirmation bias.
00:05:17.500 | We figure out, "What is my tribe?
00:05:20.100 | What do we believe?
00:05:21.600 | I need to get to a pep rally.
00:05:23.700 | How do I find my way to the pep rally
00:05:26.200 | where I'll feel really good hearing people I agree with,
00:05:28.600 | hearing not only them say things I agree with,
00:05:31.200 | but hearing them trash the other people who disagree with us?
00:05:33.800 | Those are the others. We hate them.
00:05:35.500 | Let's all have a kind of weirdly sadistic pep rally
00:05:38.900 | where we not only cheer on our team,
00:05:40.400 | but discuss how much we hate the other team."
00:05:42.600 | That is basically our cultural conversation right now,
00:05:45.400 | online in particular, in many different areas.
00:05:48.900 | This is a place where I try to resist that.
00:05:51.500 | I do not like teams. I do not like tribes.
00:05:55.100 | I feel like it's intellectual abdication to say,
00:05:58.700 | "Tell me what I'm supposed to believe,
00:06:00.300 | and then I'll go get the pom-poms."
00:06:02.100 | I like a wide variety of thought.
00:06:04.900 | And what I have found, what I've argued before,
00:06:06.500 | is that the best thing to do here is to encounter
00:06:09.200 | the very best arguments you can
00:06:12.100 | that either disagree with or offer an alternative
00:06:15.400 | to things that you feel strongly about.
00:06:17.200 | This is how you dilute individual confirmation bias
00:06:21.600 | about the things you believe in.
00:06:23.000 | Now, as I talk about on this show,
00:06:24.700 | people get worried about that.
00:06:26.300 | They say, "If I expose myself
00:06:30.000 | to a really good alternative, a really good other thought,
00:06:33.000 | then maybe I will dilute my commitment to my team.
00:06:35.500 | Maybe my morality itself will be degraded."
00:06:38.500 | That's not what happens.
00:06:39.900 | If you believe strongly in something
00:06:42.100 | and it's a good thing you believe in,
00:06:43.300 | when you encounter really good arguments against it
00:06:45.500 | or really good alternatives,
00:06:46.700 | that combination strengthens and nuances your belief.
00:06:51.500 | It gives you a stronger tie to the thing you believe in.
00:06:54.100 | It gives you a more sophisticated take on
00:06:57.300 | why you believe in what you believe in
00:06:59.200 | and what's wrong with the alternatives
00:07:01.400 | and where they overlap and where it gets complicated.
00:07:03.400 | It makes you a better defender of the things
00:07:05.500 | that are important to you
00:07:06.600 | to encounter their best alternatives.
00:07:09.000 | That makes you a better defender
00:07:10.100 | than to instead try to avoid them,
00:07:12.500 | to try to stay within a bubble
00:07:15.200 | where you're only hearing from people
00:07:16.700 | who make you feel good.
00:07:19.000 | So I have been arguing for that.
00:07:21.400 | We all have confirmation bias
00:07:23.000 | when it comes to taking in information about the world.
00:07:26.000 | And I say that's a good bias to try to get away from
00:07:31.000 | because when you do try to get away from it,
00:07:32.800 | actually your bias towards whatever you believe in
00:07:34.900 | tends to get stronger.
00:07:36.300 | The only thing I will add to this is:
00:07:40.500 | it is natural for individuals to say,
00:07:43.600 | "I don't really want to encounter these thoughts
00:07:46.200 | because they might be, let's say, disagreeable to me."
00:07:49.800 | That's completely natural.
00:07:51.100 | But be really wary when other people are saying that for you.
00:07:54.800 | So if other people are saying,
00:07:57.400 | "I don't want you to be exposed to this
00:07:59.100 | because I don't think you are smart enough
00:08:01.200 | to be exposed to it.
00:08:02.200 | I won't be tricked, but you'll be tricked.
00:08:04.900 | So let me just tell you what you should look at.
00:08:07.000 | Be wary of these other things.
00:08:08.500 | And in fact, if I find out you're looking at
00:08:11.200 | those other things, there could be some trouble."
00:08:13.400 | Lace up your running shoes and start running.
00:08:16.200 | That has never in the history of the world of ideas
00:08:19.600 | ever led to a good place.
00:08:21.200 | Be wary of people who tell you
00:08:22.500 | what you should or shouldn't encounter.
00:08:23.700 | And again, come back to this point
00:08:25.100 | that if you are advocating for something publicly,
00:08:28.900 | make your case, let other people make their other cases.
00:08:31.400 | When they collide, wisdom grows.
00:08:32.800 | That's fine.
00:08:33.500 | When you're taking in information about the world,
00:08:35.700 | know what you believe in,
00:08:36.900 | believe in what you believe in.
00:08:37.800 | That's great.
00:08:38.700 | You can be on a team, have a jersey, that's fine.
00:08:41.400 | We all like to cheer for things.
00:08:42.600 | But the way to really strengthen your understanding
00:08:45.300 | is to encounter the alternatives,
00:08:47.200 | encounter the best-faith, smartest arguments
00:08:49.600 | that go against what you say,
00:08:50.900 | that show different ways of seeing things.
00:08:52.800 | It's not going to turn you
00:08:54.500 | into some sort of conspiratorial weirdo.
00:08:56.600 | It's going to actually make you smarter.
00:08:58.000 | It's going to make your beliefs more nuanced.
00:08:59.400 | It's going to give you some more empathy.
00:09:00.600 | It's going to make you a better human.
00:09:01.900 | And ultimately, it's going to make
00:09:03.500 | your understanding of the world better.
00:09:05.000 | So, Charlotte, remember your name there.
00:09:09.200 | That's my take on this issue.
00:09:10.800 | I have those two angles on it.
00:09:12.200 | I think they can both be true at the same time.
00:09:14.800 | We could probably summarize them both, though.
00:09:18.200 | At least, we could probably resolve 80% of the issues
00:09:21.400 | I just talked about if you just did one simple thing,
00:09:23.600 | which is to quit Twitter
00:09:25.900 | and never, ever look at Twitter again.
00:09:28.300 | That alone would actually solve
00:09:29.400 | most of the problems I'm talking about.
00:09:30.700 | So, you know, that's the short answer
00:09:32.100 | and the long answer you just heard.
00:09:33.500 | [Music]