
Destiny: Politics, Free Speech, Controversy, Sex, War, and Relationships | Lex Fridman Podcast #337


Chapters

0:00 Introduction
3:06 Politics and debates
19:28 War in Ukraine
32:37 Trans athletics
35:46 AI
48:29 Lowest point in Destiny's life
68:40 Hate speech and language
105:02 Misogyny
120:13 Big government and institutions
159:57 Hasan and Vaush
176:06 Joe Biden
186:56 Donald Trump
192:56 Free speech
195:49 Melina joins the conversation
199:3 Melina and Destiny
209:34 Open relationship
216:56 Red pill community
227:10 Sex body count
239:25 Advice for young people


00:00:00.000 | If you have a democratic style of governance,
00:00:03.120 | you are entrusting people with one of the most awesome
00:00:06.720 | and radical of responsibilities.
00:00:08.360 | And that's saying that you're going to pick the people
00:00:10.400 | that are gonna make some of the hardest decisions
00:00:11.960 | in all of human history.
00:00:13.220 | If you're gonna trust people to vote correctly,
00:00:15.960 | you have to be able to trust them
00:00:17.200 | to have open and honest dialogue with each other.
00:00:19.020 | Whether that's Nazis or KKK people or whoever talking,
00:00:22.720 | you have to believe that your people
00:00:24.720 | are going to be able to rise above
00:00:26.080 | and make the correct determinations
00:00:27.580 | when they hear these types of speeches.
00:00:29.360 | And if you're so worried that somebody is gonna hear
00:00:32.040 | a certain political figure
00:00:33.120 | and they're gonna be completely radicalized instantly,
00:00:35.280 | then what that tells me
00:00:36.120 | is that you don't have enough faith in humans
00:00:38.120 | for democracy to be a viable institution, which is fine.
00:00:41.000 | You can be anti-democratic,
00:00:42.320 | but I don't think you can be pro-democracy
00:00:44.160 | and anti-free speech.
00:00:45.520 | - The following is a conversation with Steven Bonnell,
00:00:51.360 | also known online as Destiny.
00:00:53.560 | He's a video game streamer and political commentator,
00:00:56.720 | one of the early pioneers
00:00:58.040 | of both live streaming in general
00:00:59.680 | and live streamed political debate and discourse.
00:01:02.720 | Politically, he is a progressive,
00:01:05.220 | identifying as either left or far left,
00:01:08.040 | depending on your perspective.
00:01:10.020 | There are many reasons I wanted to talk to Steven.
00:01:12.880 | First, I just talked to Ben Shapiro
00:01:14.920 | and many people have told me
00:01:16.600 | that Steven is the Ben Shapiro of the left
00:01:20.360 | in terms of political perspective
00:01:22.040 | and exceptional debate skills.
00:01:24.920 | Second reason is he skillfully defends
00:01:28.000 | some nuanced non-standard views,
00:01:31.040 | at the same time being pro-establishment,
00:01:33.360 | pro-institutions and pro-Biden,
00:01:35.640 | while also being pro-capitalism and pro-free speech.
00:01:39.460 | Third reason is he has been there at the beginning
00:01:43.380 | and throughout the meteoric rise
00:01:45.560 | of the video game live streaming community.
00:01:48.080 | In some mainstream circles,
00:01:50.040 | this community is not taken seriously,
00:01:52.700 | perhaps because of its demographic distribution
00:01:55.720 | skewing young,
00:01:56.920 | or perhaps because of the sometimes harsh style
00:01:59.840 | of communication.
00:02:01.600 | But I think this community should be taken seriously
00:02:05.280 | and shown respect.
00:02:07.200 | Millions of young minds tune in
00:02:08.840 | to live streams like Destiny's to question
00:02:11.480 | and to try to understand what is going on with the world,
00:02:14.280 | often exploring challenging, even controversial ideas.
00:02:18.180 | The language is sometimes harsher
00:02:19.940 | and the humor sometimes meaner than I would prefer.
00:02:22.880 | But I, Grandpa Lex, put on my rain boots
00:02:27.380 | and went into the beautiful chaotic muck
00:02:29.540 | of online discourse,
00:02:31.160 | and have so far survived to tell the tale,
00:02:34.120 | with a smile and even more love in my heart than before.
00:02:37.960 | On top of all this,
00:02:40.640 | we were lucky to have Melina Goransson,
00:02:43.480 | a popular streamer and world traveler,
00:02:45.880 | join us at the end of the conversation.
00:02:48.040 | You can check out her channel on twitch.tv/melina,
00:02:52.100 | and you can check out Steven's channel
00:02:54.180 | on youtube.com/destiny.
00:02:57.100 | This is the Lex Fridman Podcast.
00:02:59.140 | To support it,
00:02:59.980 | please check out our sponsors in the description.
00:03:02.460 | And now, dear friends, here's Destiny.
00:03:05.200 | - I don't know if you watched me watching your
00:03:08.020 | Ye interview.
00:03:09.300 | - Yeah, thank you so much for-
00:03:10.820 | - I'm so curious,
00:03:11.700 | when you're navigating a conversation like that,
00:03:14.500 | are you, how intentional is the thought process
00:03:16.740 | between building rapport and pushing
00:03:19.520 | and giving a little and letting-
00:03:21.160 | - Zero, zero intention.
00:03:23.180 | I was watching, and thank you so much.
00:03:24.680 | It was very kind for you to review that conversation.
00:03:26.960 | It meant a lot that you were complimentary in parts,
00:03:30.280 | on the technical aspects of the conversation,
00:03:32.240 | but no, zero.
00:03:33.360 | And I'm actually deliberately trying to avoid,
00:03:37.560 | I think you've called it debate brain,
00:03:39.460 | which is just another flavor of thinking about
00:03:45.800 | like the meta conversation,
00:03:47.180 | trying to optimize how should this conversation go?
00:03:50.440 | Because I feel like the more you do that,
00:03:53.620 | the better you get at that,
00:03:55.620 | the less human connection you have.
00:03:57.780 | Like the less genuinely you're actually sitting there
00:03:59.980 | in the moment and listening to the person,
00:04:01.820 | you're more like calculating what's the right thing to say,
00:04:04.300 | versus like feeling what is that person feeling right now?
00:04:08.180 | What are they thinking?
00:04:09.260 | That's what I'm trying to do,
00:04:10.180 | is like putting myself in their mind and thinking,
00:04:13.780 | what does the world look like to them?
00:04:15.580 | What does the world feel like to them?
00:04:17.920 | And so from that, I truly try to listen.
00:04:20.440 | Now I'm also learning,
00:04:22.280 | especially 'cause Rogan and others have been giving me shit
00:04:25.200 | for not pushing back.
00:04:26.960 | It's good sometimes to say,
00:04:28.580 | from a place of care for the other human being,
00:04:32.600 | to say stop.
00:04:33.840 | What did you just say?
00:04:36.160 | I don't think that represents who you are,
00:04:39.400 | and what you really mean.
00:04:40.760 | Or maybe if it does at that time represents who they are,
00:04:45.460 | I can see a better world
00:04:48.560 | if they grow into a different direction,
00:04:50.120 | try to point that direction out to them.
00:04:52.160 | - There's a really complicated dance
00:04:53.640 | between letting somebody share their full story,
00:04:56.620 | versus letting somebody like,
00:04:58.080 | essentially I guess like proselytize your audience.
00:05:00.560 | And it's like, okay, hold on, let's rein him in here.
00:05:02.640 | But yeah, I used to be four or five years ago,
00:05:06.200 | it was attack, attack, attack, attack, attack,
00:05:08.040 | whatever you said.
00:05:09.080 | And now I'm leaning way more towards the like,
00:05:10.960 | okay, well, tell me how you feel about everything,
00:05:12.560 | and then we'll go from there.
00:05:13.400 | So a lot of people like my new approach,
00:05:15.360 | some older fans will watch and they're like,
00:05:16.760 | why are you letting this guy just ramble on?
00:05:18.140 | You know he said like five or six wrong things,
00:05:20.020 | and you're only gonna call him out on two of them.
00:05:21.340 | And it's like, it's just different styles of conversation.
00:05:23.720 | But yeah.
00:05:24.560 | - Do you do a lot of research beforehand too?
00:05:26.320 | - Depending on the conversation, yeah.
00:05:27.540 | So if we're gonna talk like vaccines and stuff,
00:05:29.120 | yeah, that's a ton of reading and stuff
00:05:30.940 | that I never thought I'd know going into it.
00:05:33.680 | If it's a more personal,
00:05:34.520 | like political philosophy conversation,
00:05:36.200 | there's not as much you can prepare for,
00:05:37.320 | just it truly depends on the conversation.
00:05:39.320 | - How much are you actually listening to the other person?
00:05:42.080 | - I'm always listening, you have to listen.
00:05:43.440 | 'Cause as soon as you stop listening,
00:05:44.440 | the quality of everything falls apart.
00:05:46.320 | The connection disappears,
00:05:47.200 | the quality of the conversation disappears.
00:05:49.080 | But my natural inclination is to just be way more aggressive
00:05:52.360 | than normal, so I have to constantly remind myself,
00:05:54.520 | I guess you would call it a meta conversation,
00:05:55.960 | where you're like, okay,
00:05:56.800 | he's probably saying this because of that,
00:05:58.160 | or we'll let him go here and then we'll stop later.
00:06:00.220 | But yeah, 'cause my preferred style of conversation
00:06:03.440 | is like, I'm gonna talk,
00:06:04.520 | and the second I say something you disagree with,
00:06:06.280 | then let's iron it out, right?
00:06:08.440 | I got like, I think in like syllogisms,
00:06:10.120 | like, okay, here's premise A, good, okay.
00:06:13.200 | Premise B, okay, and then conclusion.
00:06:15.320 | And then as long as we're both deductively sound,
00:06:17.300 | we're not crazy, no psychosis,
00:06:18.680 | then we're gonna agree on everything.
00:06:20.320 | Whereas other people like to,
00:06:21.440 | most people think in stories, like narratives,
00:06:23.480 | like a whole, there's a whole narrative,
00:06:25.060 | and the individual facts don't matter as much,
00:06:27.280 | 'cause they'll pick and choose what they want.
00:06:28.520 | And it's really hard, 'cause everybody thinks narratives
00:06:31.080 | have to function in that world.
00:06:32.680 | But it's frustrating for me sometimes.
00:06:34.800 | - Well, I've seen, you've had a lot of excellent debates.
00:06:37.720 | One of them I just recently, last night,
00:06:39.680 | watched is on systemic racism.
00:06:42.680 | And it's the first time I've seen you
00:06:44.920 | completely lose your shit.
00:06:46.360 | - Oh, shoot, who was that against?
00:06:47.800 | - I'm not sure exactly, but you were just very frustrated.
00:06:50.360 | Sorry, not lose your shit, but you were frustrated,
00:06:52.440 | constantly, because of the thing, let's lay out one, two,
00:06:55.000 | three, and every time you try to lay it out,
00:06:57.880 | it would falter.
00:06:59.300 | I think it had to do with sort of,
00:07:01.160 | can you use data to make an argument,
00:07:03.560 | or do you need to use a study
00:07:04.920 | that does an interpretation of that data?
00:07:07.320 | And then there's like this tension between,
00:07:09.200 | I think this is a behavioral economist
00:07:11.000 | that you were talking to.
00:07:12.360 | The point is, you do this kind of nice layout
00:07:14.800 | that the whole point of behavioral economics,
00:07:16.400 | it says there's more to it than just the data.
00:07:19.080 | You have to give a context and do the rich,
00:07:22.240 | rigorous interpretation in the context
00:07:24.520 | of the full human story.
00:07:26.120 | And then there was like a dance back and forth.
00:07:27.800 | Sometimes you use data, sometimes not,
00:07:29.440 | and you were getting really frustrated and shutting down.
00:07:32.480 | And so that felt like a failure mode.
00:07:34.600 | I've seen Sam Harris have similar sticking points.
00:07:37.560 | Like if we can't agree on the terminology, we can't go on.
00:07:40.880 | To me, I feel like,
00:07:43.000 | the Wittgenstein perspective is like,
00:07:47.760 | I think if you get stuck on any one thing,
00:07:51.240 | you're just not gonna make progress.
00:07:52.480 | You have to, part of the conversation has to be
00:07:55.840 | about doing a good dance together
00:08:00.520 | versus being dogmatically stuck on the path to truth.
00:08:05.840 | - I think the true challenge is identifying
00:08:08.560 | what of those sticking points are important
00:08:10.880 | versus what is not important.
00:08:14.840 | It's like if I'm having an argument with somebody
00:08:16.200 | about Jewish representation in media,
00:08:18.840 | it might be a big conversation
00:08:21.240 | and they might say a couple things.
00:08:23.000 | Like I think Jewish people,
00:08:24.360 | they tend to help their own or whatever.
00:08:25.640 | And it's like, "Yeah, okay."
00:08:26.800 | But for the purpose of the conversation,
00:08:28.280 | we can keep moving.
00:08:29.320 | But if they casually drop, "Yeah,
00:08:31.160 | "and I think that's why the Holocaust numbers
00:08:32.960 | "were blown up from 100,000 to 6 million."
00:08:35.080 | And it's like, "Okay, well, hold on, wait, wait.
00:08:36.280 | "If you think this, we have to stop here
00:08:37.920 | "because this is gonna be,
00:08:39.880 | "it's not just a language game in this part.
00:08:41.440 | "If you really believe this fact,
00:08:42.960 | "then the whole rest of the conversation
00:08:44.360 | "is gonna be informed by that belief."
00:08:46.800 | - And it has to be something
00:08:47.800 | that doesn't bother you personally.
00:08:50.640 | You have to step outside your own ego.
00:08:53.000 | So Holocaust denial is somebody
00:08:54.880 | that would bother a lot of people.
00:08:56.640 | And there's some things, just observing you,
00:08:59.900 | I feel like when you get really good at conversation,
00:09:02.300 | you can become a stickler to,
00:09:05.720 | you might have your favorite terms
00:09:07.560 | that really bothers you if people don't agree
00:09:09.760 | on those terms.
00:09:10.600 | - Begs the question, you mean raising the question.
00:09:12.800 | Yeah, I usually just want,
00:09:13.640 | if people say stuff, I just let it slide, yeah.
00:09:16.280 | 'Cause if you fight,
00:09:17.480 | when you're having a conversation with somebody
00:09:19.000 | and you're talking to their audience at the same time,
00:09:20.560 | 'cause that's really what's happening,
00:09:21.920 | you never wanna come off as overcombative
00:09:23.640 | or overaggressive because it puts people in like,
00:09:26.320 | there's like a trigger in your brain,
00:09:28.000 | and this is true of relationships or friendships
00:09:30.040 | of persuasive rhetoric or whatever,
00:09:31.560 | there's a trigger in the brain.
00:09:32.400 | And as soon as that defensive trigger gets like flipped on,
00:09:34.680 | everything is over, you've lost the ability to persuade
00:09:37.240 | because everything becomes a fight at that point, yeah.
00:09:39.600 | - Well, I wanted to talk to you
00:09:41.080 | 'cause I heard somewhere that you were referred to
00:09:44.280 | as the Ben Shapiro of the left,
00:09:45.520 | and since I'm talking with Ben as well,
00:09:48.440 | I wanted to sort of complete spiritually
00:09:51.360 | this platonic political philosophy puzzle in my head.
00:09:55.160 | You are a progressive,
00:09:57.080 | but a progressive with many nonstandard progressive views,
00:10:00.280 | and you had a heck of a fascinating journey
00:10:02.280 | through all of that.
00:10:03.120 | And like I said, I think you argue with passion sometimes
00:10:07.160 | with excessive amounts of passion, but almost--
00:10:09.400 | - That's a really polite way of saying that.
00:10:11.200 | - Almost always with good faith
00:10:13.920 | and with rigor, with seriousness.
00:10:16.240 | I asked on your subreddit, which is an excellent subreddit,
00:10:18.880 | shout out to the Destiny subreddit,
00:10:22.080 | so much, at least for that particular post.
00:10:25.040 | What I really loved is when I asked for questions for you,
00:10:27.880 | they were like, holy shit, there's adults in there,
00:10:29.680 | let's all behave.
00:10:30.880 | - Yeah. - Like, nobody say incest.
00:10:32.640 | I was like, what?
00:10:33.480 | What's going on here?
00:10:35.160 | But actually, the questions that rose to the top
00:10:37.960 | were really good.
00:10:38.800 | So somebody said that Destiny was,
00:10:40.880 | speaking of your journey,
00:10:43.200 | was a conservative in his early teens,
00:10:45.920 | then he became a libertarian,
00:10:48.120 | then he became a left-wing social justice warrior,
00:10:50.640 | then he flirted with socialism,
00:10:53.000 | and now he is a social democrat liberal.
00:10:55.800 | I've also heard you refer to yourself as a far-left person.
00:10:59.760 | So to the degree there's truth to that journey,
00:11:02.280 | can you take me through your evolution
00:11:04.880 | through the landscape of political ideologies
00:11:07.760 | that you went through?
00:11:08.600 | - So my dad comes from Kentucky,
00:11:11.440 | and my mom is a Cuban immigrant.
00:11:14.800 | Cubans are notorious for being very conservative
00:11:17.520 | in the United States,
00:11:18.920 | for historical reasons and for other reasons,
00:11:21.280 | but my upbringing was a very Republican one.
00:11:24.760 | I grew up listening to Rush Limbaugh, Glenn Beck,
00:11:28.160 | Michael Savage, on the radio, Billy Cunningham,
00:11:32.360 | I think Sean Hannity a little bit later on,
00:11:34.360 | that was my whole upbringing politically.
00:11:36.480 | I remember I was writing,
00:11:38.440 | I'd written articles for the school journal
00:11:40.240 | in favor of defending the war in Iraq
00:11:42.680 | and defending Bush from all the criticism, et cetera.
00:11:45.040 | So that was my upbringing.
00:11:47.600 | I think once I hit high school, college,
00:11:50.120 | I had my edgy libertarian-esque high school phase
00:11:53.340 | of reading Ayn Rand, of figuring out that,
00:11:57.160 | oh my God, nothing in life matters
00:11:58.600 | except for class and money,
00:11:59.680 | that's actually the answer to everything.
00:12:02.000 | I got to college, I became a Ron Paul fan,
00:12:04.600 | very big Ron Paul fan,
00:12:06.200 | and then from there, I kind of work, do life, life happens.
00:12:11.200 | At the kind of the lowest point of my life
00:12:13.560 | in terms of where I'm working,
00:12:14.560 | financially everything is kind of in ruin in my life,
00:12:16.400 | there's just a whole bunch of dumb stuff that's happened.
00:12:18.120 | Probably my most conservative point,
00:12:19.800 | I don't know what it is about being poor
00:12:21.880 | and thinking you can work your way out of it,
00:12:23.480 | you can do whatever,
00:12:24.320 | it's just my upbringing is always just like,
00:12:25.480 | if you're not having financial success,
00:12:28.160 | just work, work, work, work, work.
00:12:30.420 | And then I got into streaming, very, very lucky break,
00:12:32.560 | everything just lined up at the right time.
00:12:33.880 | And then as I've progressed through streaming,
00:12:35.680 | I would say through the years,
00:12:36.640 | I've gradually fallen more and more to the left,
00:12:38.920 | especially once my kid turned four, five, six years old,
00:12:42.280 | and I started to see how much different his life was
00:12:45.440 | just because of the financial opportunities
00:12:47.000 | that I was able to provide for him
00:12:48.240 | through no merit of his own,
00:12:49.720 | and that started to radically change
00:12:51.200 | how I viewed the world in a lot of ways.
00:12:52.600 | - So actually, let's linger on that low point.
00:12:55.840 | You worked at McDonald's, you worked at a casino,
00:12:59.920 | you did carpet cleaning, what was the lowest point?
00:13:03.960 | - Definitely the carpet cleaning.
00:13:05.560 | - Really? - Absolutely.
00:13:07.080 | - Why was it the lowest point?
00:13:08.680 | That's when you were just flirting with starting streaming?
00:13:11.960 | - My whole life has been a series of lucky breaks,
00:13:14.160 | really, truly.
00:13:15.280 | I grew up playing a lot of video games,
00:13:17.520 | but back in my day, our day, you had to read.
00:13:21.240 | There was a lot of text on the screen.
00:13:22.440 | - Back in my day, we used to play--
00:13:24.280 | - They didn't all talk to you.
00:13:25.360 | Yeah, 'cause nowadays everything's voice acted,
00:13:27.160 | but back then you had to read a lot.
00:13:28.700 | I was a really good reader,
00:13:29.840 | I had a really good vocabulary.
00:13:30.880 | - Yeah, I've heard you actually say that.
00:13:32.040 | What games are we talking about?
00:13:33.440 | What do you mean, just reading?
00:13:35.080 | You're talking about RPGs?
00:13:36.080 | - Yeah, JRPGs, so like Final Fantasy games,
00:13:38.000 | Phantasy Star, like all of these,
00:13:39.480 | like any RPG that would have been on the SNES,
00:13:41.760 | Sega, PlayStation, these are the things that I'm--
00:13:43.840 | - Let's pause on that.
00:13:44.840 | - Okay.
00:13:45.680 | - I just talked to Todd Howard,
00:13:47.200 | who's of the Elder Scrolls fame
00:13:50.840 | and the Fallout fame and beyond.
00:13:54.320 | What's your thoughts on Elder Scrolls?
00:13:56.520 | Why is Skyrim the greatest RPG of all time?
00:13:59.440 | - Man, I really don't like Skyrim or Fallout.
00:14:02.200 | - You don't love it? - Or those types of games.
00:14:03.160 | - Oh, really? - No, not at all.
00:14:04.480 | - Why do you hate Skyrim?
00:14:05.640 | - Yeah, so I really like characters
00:14:07.920 | and like compelling stories and narrators
00:14:09.440 | around those characters.
00:14:10.480 | And I like to see them kind of like grow and change,
00:14:12.280 | kind of like a movie or a story.
00:14:14.000 | So in your like Final Fantasy games, you've got characters.
00:14:17.400 | There are a lot of like classical tropes
00:14:20.600 | of like a character starts off kind of like edgy, angsty,
00:14:23.260 | all on their own.
00:14:24.100 | They develop relationships, friendships.
00:14:25.600 | They realize that the life is more about themselves
00:14:27.880 | and they do that.
00:14:28.720 | So I like that growth.
00:14:29.800 | That's kind of what you see
00:14:30.640 | in all of those old role-playing games.
00:14:32.560 | I didn't like the open world ones as much
00:14:34.200 | 'cause your main character is just like a blank slate,
00:14:35.880 | never talks.
00:14:36.720 | It's for you to like project onto,
00:14:38.200 | but there's not the same like linear narrative
00:14:40.880 | of like growth for the character.
00:14:41.720 | - Oh, that's fascinating.
00:14:42.560 | There's an actual story arc to the character
00:14:45.360 | that's more crafted in a beautiful way
00:14:47.640 | by the designers of the game.
00:14:48.800 | Yeah, that's okay. - I don't think one
00:14:49.640 | is better or worse.
00:14:50.480 | I tend towards like I wanna hear a compelling story
00:14:52.360 | around like a set of characters
00:14:53.600 | that like grow and change as the game goes on.
00:14:54.720 | - Oh, that's beautifully put then.
00:14:55.880 | Yeah, I just really loved being able to leave the town.
00:15:00.040 | You go outside the town and you look outside,
00:15:02.240 | it's nature and the world of possibilities is before you.
00:15:05.160 | You can do whatever the fuck you want.
00:15:07.040 | I mean, that immensity of just being lost in the world
00:15:10.440 | is really immersive for me.
00:15:11.500 | But yeah, you're right.
00:15:12.340 | Whatever attracts you about a world.
00:15:14.640 | So you were just starting to play video games.
00:15:19.120 | You grew up playing video games.
00:15:20.240 | That's one of your lucky breaks.
00:15:22.160 | - There's just like a lot of random skills you pick up
00:15:24.040 | depending on the type of game you play.
00:15:25.720 | I played a lot of text-based games on the computer.
00:15:27.880 | So I was a very fast typer.
00:15:29.600 | I'm still a very fast typer.
00:15:31.000 | Read a lot, learned weird kind of math stuff
00:15:34.600 | for some of the calculations, some of the games.
00:15:36.700 | I think I'm pretty good at getting information,
00:15:39.520 | figuring stuff out, learning patterns, all of that.
00:15:42.240 | And then that plus the reading and everything
00:15:44.400 | with the games meant that I,
00:15:45.960 | I don't wanna say I excelled in school
00:15:47.440 | 'cause my grades were pretty bad,
00:15:48.960 | but I was in like all honors, all AP classes or whatever.
00:15:51.560 | A lot of dual enrollment,
00:15:52.680 | a lot of AP credit going into college.
00:15:54.080 | So I did pretty well in school,
00:15:55.920 | probably better than I should have,
00:15:56.920 | but it was because I had the game stuff
00:15:58.220 | that was like really powering a lot of my brain there
00:16:00.400 | while I was trying to sleep through class.
00:16:02.400 | - So you're able to soak in information,
00:16:04.200 | integrate it, quickly take notes.
00:16:06.640 | - Generally, I think I'm pretty good at that, yeah.
00:16:08.480 | - What, you do this a lot when you stream,
00:16:10.280 | you're typing stuff.
00:16:11.200 | Is there a system in that note-taking?
00:16:13.480 | And what note, what do you use for note-taking?
00:16:16.480 | Does it matter?
00:16:17.320 | - I use a notepad.
00:16:19.240 | - Like notepad.exe notepad?
00:16:20.880 | - Yep, notepad.exe, not the plus plus, not.
00:16:23.440 | - Is there genius to the madness behind that,
00:16:26.320 | or you just don't give a shit?
00:16:27.640 | - No, I mean like, it's gonna depend
00:16:29.480 | on the style of conversation.
00:16:30.820 | If I'm with somebody that is very meticulously organized
00:16:33.400 | their thoughts and they're a,
00:16:35.640 | find a better word here for rambler,
00:16:36.920 | you can edit that in, better word for rambler.
00:16:38.560 | Somebody that talks a lot and a lot,
00:16:39.840 | I'll start like taking notes, bullet points,
00:16:42.280 | like this, this, this, this, this, this, this,
00:16:43.880 | because there's a style of conversation
00:16:45.960 | where I say seven or eight different things,
00:16:47.960 | and then when you go to respond to everything I said,
00:16:49.720 | I cut you off immediately and we argue that point.
00:16:51.960 | But if somebody's gonna do that,
00:16:52.800 | I usually say, "Hold on, you just said these eight things,
00:16:54.960 | "I'm gonna respond to every single one.
00:16:56.520 | "I've written them all down."
00:16:57.560 | And then you can go,
00:16:58.380 | if you wanna go point by point, we can,
00:16:59.700 | but you just said all this and I wrote it down,
00:17:00.960 | so now we're gonna go.
00:17:01.800 | - So what are you actually writing down,
00:17:02.640 | like a couple of words per point they left?
00:17:05.400 | - Honestly, like there are very few
00:17:07.040 | unique conversations in politics.
00:17:08.600 | Like a lot of them are kind of retreading old ground.
00:17:10.280 | So if we're having a debate on abortion,
00:17:12.780 | somebody might say like,
00:17:13.760 | "Oh, well, I believe this thing about viability,
00:17:15.700 | "and I believe this thing about, you know,
00:17:17.000 | "when they're a fetus versus a human."
00:17:18.600 | And I'll just write down like those points
00:17:19.920 | so that when I go to respond,
00:17:20.760 | I kind of have like a, like note cards,
00:17:22.240 | like a guiding thing there
00:17:23.160 | to keep me centered on my response.
00:17:25.440 | - Political discourse is a kind of tree
00:17:26.800 | you're walking down, I got it.
00:17:27.640 | And you're like taking--
00:17:29.120 | - Just to keep my focus guided,
00:17:31.200 | so I'm not like running off on a weird tangent
00:17:32.920 | or responding to something I didn't say or something.
00:17:34.520 | - What about like doing research?
00:17:36.320 | It's just, is there a system to your note taking?
00:17:38.960 | Because mentally you seem to be
00:17:40.960 | one of the most organized people I've listened to.
00:17:43.040 | So is there, is it in your mind,
00:17:45.500 | or is there a system that's on paper?
00:17:47.640 | - A little of both.
00:17:48.840 | I feel like the human mind is a beautiful thing
00:17:51.600 | if you have interest in an area.
00:17:53.720 | So like what I'll tell people is,
00:17:55.760 | let's say there's like a totally new topic
00:17:57.400 | that I'm researching, I don't know anything.
00:17:58.680 | And I'll do a couple of these on stream.
00:18:00.120 | I think they're boring, but people watch it.
00:18:01.760 | I might open a Wikipedia article and I'll read,
00:18:04.000 | and I hit something I don't know,
00:18:05.040 | and then I open the next Wikipedia article
00:18:06.640 | and I'll read it, and then I might have like seven tabs open
00:18:08.860 | and I'll read and I'll read and I'll read.
00:18:11.160 | And I'll read a ton of stuff,
00:18:12.000 | maybe for hour two, three, four hours of stuff.
00:18:14.240 | And then by the end, you know,
00:18:15.080 | someone in chat will ask me like,
00:18:16.040 | "Do you even remember like this particular thing?"
00:18:18.320 | And I'll say, "Not really, no, not too much."
00:18:21.080 | But what happens is, as long as you've seen it once,
00:18:23.380 | what will happen is like the next day, the day after,
00:18:25.160 | we'll read something else and we'll be like,
00:18:26.440 | "Oh, I remember that thing from this thing.
00:18:28.880 | I remember like vaguely that."
00:18:30.080 | And then if you see it like a third time, you're like,
00:18:31.480 | "Oh, this makes sense."
00:18:32.820 | Because especially when it comes to,
00:18:35.280 | oh, here's like a little trick on stuff.
00:18:37.120 | If you're ever reading any news
00:18:38.800 | and there's a place that pops up,
00:18:40.640 | always look at it on a map
00:18:41.920 | because so much of history is like on a map.
00:18:44.280 | It's so important to like know the geography.
00:18:45.920 | It makes things make so much more sense.
00:18:48.560 | But yeah, once I start to see stuff over and over again,
00:18:51.080 | just because I've like read it a few times,
00:18:52.760 | stuff will start to kind of connect to my mind.
00:18:54.400 | And I'm like, "Oh yeah, well, this makes sense.
00:18:55.560 | Of course, these people believe this because of this."
00:18:57.760 | Or of course, like this happened here.
00:18:59.060 | It's because, you know, that happened there.
00:19:01.160 | So yeah, it's a lot of that.
00:19:02.240 | If there's like a topic
00:19:03.080 | that I'm doing specific research for,
00:19:05.760 | so like vaccine-related stuff is a big one.
00:19:08.200 | The Ukrainian-Russian conflict is a big one.
00:19:10.240 | That I'll break out a note.
00:19:12.020 | I'll probably get like a Google doc
00:19:13.300 | and I'll just start like writing like an outline
00:19:14.900 | of kind of the rough points of everything
00:19:16.560 | just to organize my thoughts around different topics, yeah.
00:19:18.960 | - We're just gonna go on tangent upon a tangent
00:19:20.760 | upon a tangent.
00:19:21.600 | We'll return to the low point of your life at some point.
00:19:24.320 | Always returning from the philosophy to the psychology.
00:19:26.480 | So you did the Ukraine topic.
00:19:29.460 | One question is, what role does US play in this war?
00:19:35.360 | Could they have done something to avoid the war?
00:19:40.240 | Did they have a role to play
00:19:42.640 | in forcing Vladimir Putin's hand?
00:19:45.640 | Do they have a role to play
00:19:47.360 | in de-escalating the war towards a peace agreement
00:19:51.840 | and the opposite?
00:19:53.000 | If it does escalate towards something like
00:19:56.400 | the use of a tactical nuclear weapon,
00:19:58.520 | are they to blame?
00:19:59.480 | Are we to blame?
00:20:01.280 | - Oh man, somebody sent me an email a while ago
00:20:03.080 | with great words.
00:20:04.420 | There's a specific way to navigate a conversation
00:20:07.560 | where you can kind of like contribute to a negative event,
00:20:10.840 | but you're not really the one responsible for it.
00:20:13.640 | Like the classic example is,
00:20:15.040 | a woman goes out late at night,
00:20:16.240 | gets a little bit too drunk,
00:20:17.240 | and then something happens.
00:20:18.440 | And it's like, while there might've been steps
00:20:20.100 | she could have taken to mitigate the risk,
00:20:22.300 | it's not her fault what happened
00:20:24.060 | because the responsibility rests
00:20:26.440 | on the agent making the choice, right?
00:20:28.520 | There's a chooser at some point
00:20:30.260 | that is choosing to do wrong or evil.
00:20:32.840 | I don't believe in any of the arguments
00:20:35.200 | that say the United States has contributed
00:20:36.960 | to Russia's position on Ukraine
00:20:39.140 | or the actions that they've taken on Ukraine.
00:20:41.440 | There are several arguments that some people,
00:20:45.080 | some even political scholars are putting out there
00:20:47.440 | to say that the United States is to blame,
00:20:48.740 | but I find them completely unconvincing.
00:20:50.880 | I think that when you ask the question of like,
00:20:52.720 | what is the United States role or what has our role been?
00:20:55.680 | I think it's really important for us.
00:20:57.000 | I don't think we even agree as a country
00:20:58.120 | on what our role should be,
00:20:59.180 | which I think is a hard one
00:21:00.020 | because you've got this kind of,
00:21:01.340 | there's this growing populist movement in the United States.
00:21:04.400 | It might be the far left and the far right.
00:21:05.960 | And I think populists tend to have
00:21:07.240 | this kind of isolationist view of the world
00:21:09.920 | where the United States should just be our own thing.
00:21:11.920 | We shouldn't be telling anybody what to do.
00:21:13.920 | We shouldn't be the world police.
00:21:15.120 | And then kind of more in these
00:21:16.520 | like center left, center right positions.
00:21:17.960 | And then across a lot of Europe, you've got, well, okay.
00:21:20.280 | The United States is kind of like the big kid on the block.
00:21:22.840 | Like we're looking to them for guidance and leadership
00:21:25.040 | on situations like what's going on in Ukraine.
00:21:27.740 | So insofar as the original question is like,
00:21:31.000 | what is like the United States responsibility?
00:21:32.880 | I think we have a responsibility to ensure
00:21:35.140 | the relative like freedom,
00:21:36.960 | prosperity and stability across Europe.
00:21:39.200 | I think that defending Ukraine's sovereignty
00:21:41.360 | and right to their borders is a part of that.
00:21:43.600 | And I don't believe that prior to the invasion in 2022,
00:21:47.720 | I don't think the United States was contributing
00:21:49.520 | to Russia invading that country.
00:21:51.560 | I know there are arguments given that like
00:21:53.680 | the expansion of NATO,
00:21:55.320 | is something that's been threatening to Russia,
00:21:57.700 | but the Baltics joined
00:21:58.760 | and Russia didn't do anything about it.
00:22:00.660 | The invasion of Crimea was very clearly a response
00:22:02.940 | to the revolution in 2014.
00:22:04.800 | The invasion on the borders is clearly a response
00:22:07.200 | to Ukraine winning that civil war in the Southeast
00:22:10.760 | and the Donbass and Russia becoming more aggressive.
00:22:13.160 | I don't think that you can blame any of that
00:22:14.480 | on NATO expansion.
00:22:15.720 | There's no NATO countries that are threatening Russia
00:22:17.520 | or debating Russia.
00:22:18.780 | - Do you think there is a nuclear threat?
00:22:21.200 | Do you think about this?
00:22:23.160 | Do you worry about this,
00:22:24.080 | that there is a threat of a tactical nuclear weapon
00:22:26.640 | being dropped?
00:22:27.640 | - I think that possibility exists either way.
00:22:29.160 | And I think the responsibility for that is on Russia
00:22:31.640 | because it just can't be the case
00:22:33.640 | that if you have nukes,
00:22:34.520 | you're allowed to invade countries and take their land.
00:22:36.600 | Because if anything,
00:22:37.440 | I think that that down the road also increases
00:22:40.160 | the potential for nuclear problems in the future, right?
00:22:43.040 | Because at that point,
00:22:43.880 | either every single country has to acquire
00:22:45.560 | their own nuclear weapons,
00:22:46.520 | 'cause if you don't,
00:22:47.360 | Russia's gonna mess with you,
00:22:48.440 | or every single country has to join NATO.
00:22:50.480 | And now what?
00:22:51.320 | We're back at square zero, ground zero, square one,
00:22:53.160 | where people are like,
00:22:54.000 | "Oh, well look, all these countries joining NATO
00:22:54.960 | is aggressive towards Russia."
00:22:55.960 | Like, what are you gonna do?
00:22:57.720 | - Yeah, you've mentioned that there's a complicated calculus
00:23:02.120 | going on with the countries that have nuclear weapons.
00:23:06.200 | And what's our responsibility?
00:23:08.440 | Are you allowed to do anything you want
00:23:10.040 | to countries that don't have nuclear weapons?
00:23:13.160 | - That's a really tricky discussion.
00:23:15.360 | - For sure.
00:23:16.200 | - Because what is US supposed to do
00:23:17.640 | if Russia drops a tactical nuclear weapon?
00:23:20.360 | There's a set of options,
00:23:21.880 | none of which are good.
00:23:25.360 | And it's such a tricky moment right now
00:23:29.320 | because the things that Biden and other public figures say,
00:23:34.320 | I feel like has a significant impact
00:23:37.040 | on the way this game turns out.
00:23:38.360 | 'Cause I think mutually assured destruction
00:23:40.040 | is partially a game of words.
00:23:43.720 | - Yeah.
00:23:44.560 | - Like, I mean, I believe in the power of conversation,
00:23:47.560 | of leaders talking to each other.
00:23:50.480 | I feel like you have to have a balance
00:23:53.480 | between threat and compromise,
00:23:57.240 | and like empathy for the needs,
00:24:01.120 | the geopolitical, the economic needs of a nation,
00:24:05.320 | but also sort of respect and represent your own interests.
00:24:10.320 | So it's a tricky one.
00:24:11.880 | Like, how do you play the hand?
00:24:14.960 | - It reminds me of, I don't know if you've ever heard
00:24:17.000 | in like evolutionary psychology or evolutionary biology,
00:24:19.000 | there are things called tit for tat strategies.
00:24:21.440 | It kind of reminds me of that,
00:24:22.400 | where it's like if, like,
00:24:23.720 | there are a whole bunch of these little biological mechanisms
00:24:25.720 | where creatures will develop,
00:34:26.720 | like for socializing, like tit for tat.
00:24:28.160 | If you do something bad to me,
00:24:29.480 | I'm gonna do something bad for you.
00:24:30.920 | And then more complicated schemes will come out
00:24:32.880 | where it'll be like tit, tit for tat,
00:24:35.240 | where it's like, you can make one mistake,
00:24:36.880 | and then I'm gonna get you if you do a second one,
00:24:38.280 | or it could be tit, tit, tit for tat,
00:24:39.600 | or there could be tit for tat, tat for tit.
00:24:41.280 | There's like all these like back and forths
00:24:43.040 | where creatures kind of optimize themselves.
00:24:44.800 | And yeah, I think something the United States
00:24:47.760 | did really well in terms of that kind
00:24:49.240 | of conversational strategy,
00:24:50.440 | and I approved of this in the beginning,
00:24:52.280 | was Biden was very clear about setting out
00:24:54.640 | like the exact level of US involvement for the war.
00:24:57.560 | We're not gonna do a no fly zone.
00:24:59.120 | There's not gonna be US troops on the ground in Ukraine,
00:25:01.080 | but we are gonna send a whole bunch of money
00:25:02.640 | and a whole bunch of arms and a whole bunch of intel to them.
00:25:04.960 | And I thought he did a good job at laying out
00:25:06.360 | like the limitation of the US involvement
00:25:08.040 | while opening as much as we could
00:25:09.720 | in the ways that we could help.
00:25:11.160 | But yeah, that looming threat
00:25:13.160 | of some sort of tactical nuclear weapon,
00:25:15.200 | I think on the table right now is like,
00:25:16.480 | it's gonna be the annihilation
00:25:17.600 | of like Russian sea forces and everything.
00:25:19.360 | But you know, what happens if it continues to escalate?
00:25:22.720 | That's like a world that nobody wants to be in, yeah.
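
The tit-for-tat family of strategies mentioned above comes from iterated game theory (Axelrod's prisoner's-dilemma tournaments). A minimal sketch of the plain and forgiving ("tit for two tats") variants — the payoff values are the standard prisoner's-dilemma convention, and the function names are illustrative, not anything from the conversation:

```python
def tit_for_tat(my_hist, opp_hist):
    # Cooperate first, then mirror the opponent's last move.
    return "C" if not opp_hist else opp_hist[-1]

def tit_for_two_tats(my_hist, opp_hist):
    # Forgiving variant: tolerate one defection, retaliate after two in a row.
    if len(opp_hist) >= 2 and opp_hist[-1] == opp_hist[-2] == "D":
        return "D"
    return "C"

# Standard iterated prisoner's dilemma payoffs, from the row player's view:
# mutual cooperation 3, sucker's payoff 0, temptation 5, mutual defection 1.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def play(strat_a, strat_b, rounds=10):
    # Run an iterated game and return both players' total scores.
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Two tit-for-tat players lock into cooperation: 3 points per round each.
print(play(tit_for_tat, tit_for_tat))  # (30, 30)
```

The "optimizing back and forth" in the conversation corresponds to which of these strategies survives when matched against exploitative ones over many rounds.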
00:25:26.760 | - So we talked about difficult conversations.
00:25:29.040 | And again, thank you so much
00:25:30.080 | for reviewing the yay conversation.
00:25:32.040 | Let me ask you about Putin.
00:25:34.480 | - Mm-hmm.
00:25:35.320 | - Speaking of difficult conversations.
00:25:37.840 | So if you sit down,
00:25:38.960 | if I sit down with somebody like Vladimir Putin
00:25:41.640 | or Vladimir Zelensky,
00:25:43.800 | what's the right way to have that conversation?
00:25:45.520 | - Oh man. - We can talk about that one
00:25:46.960 | or we could talk about somebody more well understood
00:25:51.960 | through history, like Stalin or Hitler,
00:25:56.800 | something like that.
00:25:57.680 | Maybe that's an easier example to illustrate
00:26:00.520 | how to handle extremely difficult conversations.
00:26:04.200 | - Yeah, I mean, I can handle really difficult conversations
00:26:06.000 | between like two people. Leaders of countries, though,
00:26:09.440 | There's so much that you are representing
00:26:11.680 | in that conversation.
00:26:13.360 | I guess the thing that would be interesting to me
00:26:16.160 | would be like, what is Vladimir Putin's interest?
00:26:19.520 | Like what is the genuine interest
00:26:21.640 | that he has in the conflict?
00:26:23.640 | 'Cause I think finding out like, what is your buy-in
00:26:25.480 | or what is the driving force keeping you here
00:26:27.840 | is probably the most important thing.
00:26:29.840 | I think for Zelensky,
00:26:30.680 | I think it's quite a bit simpler
00:26:32.280 | 'cause he's on the defense.
00:26:34.160 | So it's defending his country and his people.
00:26:36.720 | For Putin, I've heard all sorts of things.
00:26:39.120 | Dugin has his writings on like the East versus the West,
00:26:42.520 | the collapse of the West in the face
00:26:44.000 | of like all of the liberalism
00:26:45.640 | and the weird LGBT stuff that they criticize.
00:26:48.080 | You've got the desire to like return
00:26:50.000 | to this like former Soviet Union-esque thing.
00:26:52.040 | You've got Putin's quote that the collapse of the Soviet Union
00:26:54.440 | was the biggest geopolitical disaster of the 20th century.
00:26:58.160 | And I guess figuring out like, what is Putin after?
00:27:00.280 | I'm not actually sure.
00:27:01.120 | I don't know the answer to that question.
00:27:01.960 | I know a lot of people write about it, but yeah.
00:27:03.160 | - Well, there's a lot of answers to that question.
00:27:04.880 | There's a lot of answers that he can give to that question.
00:27:07.040 | So say I sit down with him for three hours and talk about it.
00:27:10.960 | I think this is a really interesting distinction
00:27:14.120 | because you do do difficult conversations
00:27:17.140 | in the space of ideas, but also in your stream,
00:27:20.080 | you have, I mean, there's a bunch of drama going on.
00:27:22.040 | There's a human psychology is laid out
00:27:24.480 | in its full richness before you.
00:27:27.640 | So to me with leaders,
00:27:29.440 | I think a part of the conversation has to be
00:27:32.500 | about the human psychology.
00:27:34.240 | Not like a meta conversation,
00:27:36.680 | but like really understand what they feel,
00:27:40.720 | what they fear, who they are as a human being.
00:27:43.560 | Like as a family man, as a person proud of their country,
00:27:48.560 | as a person with an ego, as a person who's been affected,
00:27:53.360 | if not corrupted by powers, all of us can be and likely are.
00:27:58.360 | So all of that, that gives context to then
00:28:01.920 | the answers about what do you want in this war.
00:28:04.480 | It's that the answers about what you want in this war
00:28:07.800 | will be political answers.
00:28:10.040 | It's like a game that's being played again with words
00:28:13.120 | and politicians are incredibly good at playing that game.
00:28:16.160 | I think the deeper truth comes from understanding
00:28:19.540 | the human being from which those words come.
00:28:22.320 | And I think that's what you do.
00:28:24.420 | I don't know if you do those kinds of conversations where--
00:28:26.560 | - Never talked to any country leader, so.
00:28:28.480 | - No, not a country leader,
00:28:29.860 | but say a controversial figure or somebody
00:28:32.440 | that represents a certain idea.
00:28:34.140 | Don't just talk in the space of ideas or challenge the ideas
00:28:37.880 | but understand who is this person,
00:28:39.780 | how did you come to those ideas?
00:28:41.840 | - Oh yeah, when I've had, there've been a couple
00:28:43.480 | of very controversial right-leaning figures.
00:28:47.320 | So the two, obviously the mainstreamers for me
00:28:48.720 | are Lauren Southern and Nick Fuentes.
00:28:51.120 | And those types of conversations initially
00:28:54.260 | aren't very political at all.
00:28:55.600 | Yeah, it's more like, obviously we believe
00:28:57.880 | in very, very, very different things,
00:28:59.600 | but like beliefs don't happen accidentally.
00:29:01.280 | So how did you get to where you are?
00:29:02.860 | Those are way more personal conversations, that's true.
00:29:05.200 | - Is there things you regret about those conversations
00:29:08.480 | where you failed?
00:29:09.520 | Is there things you're proud of where you succeeded?
00:29:12.200 | - For things that I'm proud of,
00:29:13.880 | I feel like I'm really good at attempting
00:29:17.080 | to understand people without judgment.
00:29:19.280 | That I think a lot of people feel like
00:29:20.400 | they can have conversations with me
00:29:21.460 | where they can share a lot
00:29:22.760 | and I'm not gonna jump down their throat
00:29:24.180 | for them having a politically incorrect observation
00:29:27.440 | or for them being judgmental of somebody else
00:29:29.600 | or having like a feeling that's maybe not something
00:29:31.820 | they should have, something they're embarrassed about.
00:29:33.400 | So I think I do a really good job at that.
00:29:35.000 | And then by extension of that,
00:29:36.560 | I've gotten the ability to hear perspectives
00:29:38.360 | from so many different people
00:29:39.400 | that I think I can understand
00:29:40.300 | a lot of different perspectives.
00:29:42.680 | For failures of mine, I mean, it's always gonna be,
00:29:46.600 | on stream it'll be like, I didn't push back hard enough
00:29:48.920 | or I didn't know like a certain fact for a conversation.
00:29:51.900 | These are usually the, they're gonna be on these
00:29:53.640 | like very technical grounds generally.
00:29:55.720 | I'm pretty happy with like the direction
00:29:57.080 | my conversations have gone recently,
00:29:59.360 | especially over like the past six months.
00:30:01.000 | - So your goal is to de-radicalize
00:30:04.120 | the audience of those folks.
00:30:06.680 | - So that used to be my goal.
00:30:08.780 | My goal was de-radicalization.
00:30:10.800 | Now I'm kind of hoping that that's just the by-product.
00:30:13.200 | So the goal I think is to talk to somebody
00:30:15.240 | and to show they believe this because of these reasons.
00:30:18.400 | And if you wanna change people's beliefs,
00:30:19.980 | we have to talk about the underlying reasons
00:30:21.640 | for why they think the things they think.
00:30:23.580 | It's not enough to just say like that belief is bad
00:30:26.000 | 'cause it's like, well, they believe it
00:30:27.120 | for a whole bunch of things that are true and real
00:30:28.660 | to them at least.
00:30:29.760 | So you have to address all of the underlying things
00:30:32.680 | that they believe before you can change
00:30:33.760 | the overlying belief.
00:30:35.360 | So if I'm having a conversation with somebody,
00:30:37.060 | it'll be like, okay, why do you feel this about that,
00:30:39.240 | that and that?
00:30:40.160 | Okay, I understand that.
00:30:41.560 | Maybe like a better way to solve that would be like
00:30:43.840 | this or that instead of this thing.
00:30:45.800 | - So to what degree do you have to empathize
00:30:48.200 | with the person's worldview versus pushback?
00:30:52.000 | - That's always the hard one.
00:30:53.940 | When I'm talking to other people,
00:30:55.940 | it's almost always me stepping as much inside their bubble
00:30:59.260 | as I can.
00:31:00.100 | I have to like live and breathe their worldview
00:31:01.580 | and be able to speak their worldview
00:31:03.380 | in order to like navigate their thoughts
00:31:05.480 | because my worldview is,
00:31:07.780 | I'm not even using this as an insult.
00:31:09.620 | I don't know if I am a little bit autistic or something,
00:31:11.620 | but when I break apart things,
00:31:13.260 | I just wanna see like study, study, study, fact, fact, fact.
00:31:15.460 | That's how my mind works for everything.
00:31:17.140 | That's what I like to see.
00:31:18.180 | Like personal stories don't do much for me.
00:31:20.300 | Narratives don't do much for me.
00:31:21.140 | They show me like the data and the studies or whatever.
00:31:23.060 | But for other people,
00:31:24.100 | I think most brains are more human than that.
00:35:26.040 | And they tend to see things as more kind of like
00:31:28.680 | surreal pictures that are kind of painted
00:31:30.360 | and the brushstrokes are way broader.
00:31:32.080 | And they don't care about the itty bitty tiny fact.
00:31:35.240 | So if I'm talking to somebody else
00:31:36.360 | and I'm trying to get into their head
00:31:38.160 | and I'm trying to change their mind on things,
00:31:40.040 | I'm gonna be stepping into their world
00:31:42.720 | and I'm gonna try to be working through that framework.
00:31:45.000 | Really good example might be,
00:31:46.600 | we'll say like when it comes to trans issues for minors,
00:31:49.960 | okay, a 16- or 17-year-old needs to get on puberty blockers.
00:31:52.980 | The way that I want that debate to play out
00:31:54.940 | is let's look at all the data.
00:31:57.180 | Let's see what are the outcomes.
00:31:58.940 | Let's see what are the processes for getting a medication.
00:32:00.860 | And then we'll evaluate all of that
00:32:01.980 | and then we'll go in whatever direction points more favorably.
00:32:04.820 | But that's wholly unconvincing to most people, right?
00:32:07.020 | So as a parent,
00:32:07.920 | if I'm having that conversation with another parent,
00:32:09.800 | the easiest way for me to have that conversation is like,
00:32:11.580 | hey, we both have kids.
00:32:13.420 | Imagine how horrible it would be
00:32:14.660 | if we felt like our kids needed help
00:32:16.420 | and the government was trying to get between us
00:32:18.540 | and their doctor in that conversation.
00:32:20.560 | That might be how that talk plays out,
00:32:21.760 | which I don't even think that's a really good argument.
00:32:23.880 | 'Cause I think there probably are times
00:32:24.720 | when the government should get in between,
00:32:25.760 | but I'll have that conversation
00:32:26.760 | because now I'm in a world
00:32:28.320 | where they understand what I'm saying.
00:32:29.560 | I am resonating with the way that they feel about things
00:32:32.200 | and then I can make progress
00:32:33.480 | with the way that they're kind of viewing the world
00:32:35.200 | because I'm talking in a language they understand.
00:32:37.240 | - So on this particular topic of trans issues,
00:32:40.440 | is that the reason you were banned from Twitch?
00:32:42.600 | - I'm not sure, I don't know.
00:32:45.000 | They just said hate speech,
00:32:45.920 | but I don't use like slurs or anything.
00:32:47.280 | So it's hard to know exactly.
00:32:48.780 | - So I think you made the claim
00:32:50.360 | that trans women shouldn't compete with cis women
00:32:53.100 | in the women's athletics.
00:32:54.800 | Can you make this case
00:32:58.340 | and can you steel man the case against it?
00:33:00.500 | I think in your community,
00:33:01.740 | there's a lot of trans folks who love you
00:33:04.700 | and there's a lot who hate you.
00:33:06.460 | - Yeah.
00:33:07.300 | - And so if you can walk the tightrope of this conversation
00:33:11.260 | to try to steel man both sides.
00:33:13.220 | - One of the argumentative strategies I say
00:33:14.780 | is that like anytime you have a conversation,
00:33:16.340 | you should be able to argue both sides
00:33:17.520 | better than anybody else.
00:33:19.320 | So for my side, the genuine belief side,
00:33:22.860 | it feels like overwhelmingly,
00:33:26.200 | all of the data is showing that trans,
00:33:28.980 | mostly trans women, even after I think three years
00:33:32.640 | on some sort of like HRT or estrogen stuff,
00:33:37.360 | they're still maintaining these advantages
00:33:39.440 | from their male puberty over cisgender women.
00:33:42.700 | And if that is the case,
00:33:43.920 | if we are gonna draw these distinctions
00:33:45.780 | around our sports between women and men,
00:33:48.380 | it feels unfair to have a category inside the women's sports
00:33:51.840 | that are maintaining advantages
00:33:53.640 | that are coming from a male puberty,
00:33:55.840 | regardless of the amount of time they've spent
00:33:57.480 | on hormone replacement therapy.
00:34:00.280 | So that would be my argument on that side.
00:34:02.320 | - So it's unfair from a performance enhancement aspect.
00:34:06.460 | So the same way we ban performance enhancing drugs
00:34:11.180 | that involve increasing of testosterone
00:34:14.140 | in that same way would be unfair.
00:34:15.760 | - Essentially, yeah.
00:34:16.600 | - So what's the case against?
00:34:20.000 | - Yeah, so the case in favor of them competing together
00:34:22.680 | is that realistically,
00:34:24.400 | there's not gonna be a trans sports category.
00:34:27.940 | Realistically, trans women aren't gonna be competitive
00:34:30.380 | with cis men because they've gone through these huge,
00:34:33.080 | you know, like hormone changes
00:34:34.540 | by the medication they're taking.
00:34:36.640 | And that when we look at how sports are kind of done anyway,
00:34:40.360 | there's a whole bunch of biological differences
00:34:42.760 | between people within sports categories
00:34:44.860 | that are determining their placement
00:34:46.360 | in the professional world.
00:34:48.100 | So for instance, somebody like me
00:34:50.460 | is probably never gonna go far in the NBA
00:34:51.980 | because I'm not tall enough.
00:34:53.420 | I think the average height in the NBA--
00:34:54.740 | - Don't doubt yourself.
00:34:55.620 | - Don't doubt myself, yeah.
00:34:56.460 | I wanna say it's like six or something.
00:34:58.100 | They're huge people.
00:34:59.900 | Or, you know, you look at like Michael Phelps
00:35:02.660 | as a classic example of a guy whose torso is like so long,
00:35:05.620 | his body is built for swimming.
00:35:07.460 | And I think there are some trans people
00:35:09.100 | that will look at that
00:35:09.940 | or somebody advocating for this position,
00:35:11.380 | they'll look at that and they'll go,
00:35:12.220 | "Okay, realistically, the way that Michael Phelps' body
00:35:15.140 | processes lactic acid,
00:35:16.380 | the shape physiologically of his body
00:35:18.280 | is gonna put him in a level of competition
00:35:20.420 | that so many men are never gonna reach
00:35:22.060 | just because of biology.
00:35:23.660 | How is it fair that you can have these biological outliers
00:35:26.300 | competing in these categories,
00:35:27.780 | but then when we come to like sports categories
00:35:29.540 | with trans and cis women,
00:35:30.660 | you're gonna take trans women
00:35:31.740 | and say that they can't compete against cis women.
00:35:33.500 | Can't you also just say that they have
00:35:34.700 | some level of biological difference there?
00:35:37.060 | Like, is it really gonna be that great of a difference
00:35:38.880 | than what Michael Phelps has versus the average swimmer
00:35:41.240 | or an NBA player has versus like the average height male?
00:35:44.040 | - Yeah.
00:35:46.340 | Do you think we're gonna get into
00:35:47.460 | some tricky ethical territory
00:35:49.500 | as we start to be able to,
00:35:51.380 | through biology and genetics, modify the human body?
00:35:54.940 | - Absolutely.
00:35:56.380 | I feel like those things are coming sooner
00:35:58.620 | than we wanted them to.
00:36:00.780 | Oh man, have you seen the AI art?
00:36:03.780 | - Yes.
00:36:04.620 | - That's a--
00:36:05.440 | - Of course, I'm an AI person.
00:36:06.540 | - Oh, okay, then yeah, yeah.
00:36:07.980 | That's always been like,
00:36:09.980 | what's gonna happen when robots can do art
00:36:12.520 | better than humans, LOL?
00:36:13.920 | Like, well, we'll see in 20 years,
00:36:15.360 | in 20 years, in 20 years.
00:36:16.600 | And now you have AI art winning competitions.
00:36:19.360 | And it's funny because robots are essentially--
00:36:24.040 | - There's a robot behind you, by the way.
00:36:25.400 | - A robot behind me.
00:36:26.680 | Oh, nice.
00:36:27.520 | Robots are really good--
00:36:29.800 | - Careful what you say.
00:36:30.640 | - Yeah, oh God, I'll be careful.
00:36:32.360 | That's not like one of the Chinese ones
00:36:33.560 | with a gun on it, right?
00:36:34.400 | Oh, okay.
00:36:35.220 | Hopefully not.
00:36:37.680 | - We'll see, depending on what you say, yeah.
00:36:39.280 | - Okay.
00:36:40.120 | Robots are really good at showing the limitations
00:36:43.360 | of the human mind in categories
00:36:46.120 | that we didn't believe we were limited before.
00:36:48.560 | I think that humans have this idea intrinsically
00:36:51.820 | that we have some type of innovative, creative drive
00:36:56.820 | that is just outside of the bounds of physical understanding.
00:37:00.940 | And with a sophisticated enough program,
00:37:03.480 | we see that maybe that's not actually true.
00:37:05.520 | And that's a really scary thing,
00:37:07.460 | philosophically, to deal with.
00:37:08.900 | Because we feel like we're very special, right?
00:37:10.360 | We own the planet, we make computers,
00:37:12.680 | and the idea that you can start to get these robots
00:37:14.480 | that can do things that's like, okay, you can do math, fine.
00:37:17.280 | Okay, you can do calculations, fine.
00:37:19.240 | But you can't do art.
00:37:20.240 | That's the human stuff.
00:37:21.760 | And then when they start to do that, it's like, oh, shoot.
00:37:23.920 | - And that terrifies you a little bit?
00:37:25.440 | Like the human species losing control
00:37:28.800 | of our dominance over this earth?
00:37:31.040 | - I don't think it's necessarily losing control
00:37:32.200 | of our dominance.
00:37:33.040 | I mean, I guess like a Skynet thing
00:37:34.240 | could come in at some point.
00:37:35.760 | But I think it brings us to this really fundamental level
00:37:39.660 | of like, what does it mean to be human?
00:37:41.500 | What is it that we're good at?
00:37:43.460 | What should we be doing with technology?
00:37:45.100 | We never really ask that question in the Western world.
00:37:47.060 | It's always that technology is like normative,
00:37:50.000 | in that technology equals good
00:37:51.940 | and more technology equals better.
00:37:53.380 | That's been like the default assumption.
00:37:54.940 | In fact, if you ask a lot of people,
00:37:56.420 | how do you know if civilization has progressed
00:37:58.060 | over the past 100 or 200 years,
00:37:59.740 | they don't say we have better relationships,
00:38:01.740 | we have longer marriages, blah, blah, blah.
00:38:04.660 | They'll say technology has improved.
00:38:05.700 | We've got crazy phones, we've got crazy computers.
00:38:07.700 | And the idea that more technology might be bad
00:38:10.420 | has never even crossed somebody's mind,
00:38:11.840 | unless it's used for like a really bad thing.
00:38:13.980 | - Well, it's interesting.
00:38:15.080 | We kind of think as more and more automation is happening,
00:38:18.080 | we're going to get more and more meaning
00:38:20.000 | from things like being artists and doing creative pursuits.
00:38:23.740 | And here's like, oh shit, if the art,
00:38:27.220 | if the creative pursuits are also being automated,
00:38:29.460 | then what are we gonna gain meaning from?
00:38:31.300 | What are the activities from which you'll gain meaning?
00:38:33.540 | You know, my whole life I've been working
00:38:35.120 | on artificial intelligence systems.
00:38:36.700 | There's been different revolutions.
00:38:38.260 | One of them is the machine learning revolution.
00:38:40.980 | And it's interesting to build up intuition
00:38:44.220 | and destroy that intuition about what is
00:38:48.340 | and isn't solvable by machines.
00:38:50.580 | I think for the longest time, I grew up thinking
00:38:54.140 | Go is not, the game of Go is not solvable.
00:38:57.100 | Because my understanding of AI systems
00:39:01.180 | is ultimately this, is fundamentally a search mechanism
00:39:06.180 | that is fundamentally going to be brute force.
00:39:09.240 | There's no shortcuts.
00:39:10.400 | - Sure.
00:39:11.240 | Like if it can't solve the traveling salesman problem,
00:39:13.600 | it's not even gonna be able to give you an approximation.
00:39:15.760 | - So most interesting problems
00:39:17.240 | are giant traveling salesman problems.
00:39:19.200 | And then, so of course, it's not gonna be able to solve that.
00:39:21.640 | And then you, then the deep learning revolution
00:39:25.160 | made you realize, holy shit, these large neural networks
00:39:27.520 | with a giant number of knobs are able
00:39:29.760 | to actually somehow estimate functions
00:39:34.760 | that can do a pretty good job
00:39:38.160 | of understanding deep representation of a thing.
00:39:41.200 | Whether that's a game of Go,
00:39:42.940 | or whether it's the human natural language,
00:39:46.120 | or if it's images and video or audio,
00:39:50.120 | and even actions in different video games
00:39:52.320 | and actions of robotics and so on.
00:39:54.820 | And then you realize with diffusion models
00:39:57.260 | and different generative models,
00:39:59.840 | you start to realize, holy shit,
00:40:01.100 | it can actually generate not just interesting representations
00:40:06.100 | or interesting manifestations
00:40:10.920 | of the representations of forms,
00:40:12.720 | but it's able to do something
00:40:13.880 | that impresses humans in its creativity.
00:40:16.660 | It's beautiful in the way we think of art as beautiful.
00:40:19.640 | Like it surprises us and makes us chuckle,
00:40:21.640 | it makes us sit back in awe and all those kinds of things.
00:40:25.360 | And yet the thing that it seems to struggle with the most
00:40:28.140 | is the physical world currently.
00:40:30.380 | So that's counterintuitive.
00:40:31.780 | We humans think that it's pretty trivial,
00:40:35.260 | being able to pick up a cup,
00:40:38.100 | being able to write with a pen,
00:40:40.060 | like in the physical space, we think that's trivial.
00:40:42.940 | We give ourselves respect for being great artists
00:40:45.180 | and great mathematicians and all that kind of stuff,
00:40:47.860 | and that seems to be much easier than the physical space.
00:40:51.100 | - Bodies are really cool.
00:40:52.500 | - Yeah. - There is a, I don't know,
00:40:54.420 | it's probably Asimov or somebody,
00:40:55.440 | there was some science fiction writer
00:40:56.600 | that had a short story and it was like an alien
00:40:58.360 | that had landed on earth and it was describing our bodies
00:41:00.640 | from a totally alien perspective.
00:41:02.480 | And when you think about all the things we can do,
00:41:04.440 | it's pretty cool.
00:41:05.280 | We can climb through a whole multitude of environments,
00:41:08.880 | we can exist in a multitude of temperatures,
00:41:10.860 | we can manipulate things just with our hands
00:41:14.840 | and the way that we can interact with things around us.
00:41:17.160 | And yeah, we're very capable on like a physical level,
00:41:19.440 | even though, like you said, we think about ourselves like,
00:41:21.080 | oh, well, human beings have really big brains and we do,
00:41:23.160 | we're really intelligent as well,
00:41:24.160 | but yeah, our bodies are pretty cool too.
00:41:26.380 | - And it's a fascinating hierarchical biological system.
00:41:30.640 | Like we're made up of a bunch of different
00:41:34.080 | like living organisms that all don't know
00:41:38.000 | about the big picture of our body.
00:41:40.880 | And it's all functioning in its own little local world
00:41:42.920 | and it's doing its thing, but together
00:41:46.400 | it forms a super resilient system.
00:41:48.940 | All of that comes from a very
00:41:53.880 | compressed encoding of what makes a human.
00:41:57.200 | You start with the DNA and it builds up
00:41:59.120 | from a single cell to a giant organism.
00:42:01.700 | I mean, and because of the DNA,
00:42:05.600 | through the evolution process,
00:42:06.780 | you can constantly create new humans
00:42:09.240 | and new living organisms that adapt to the environment.
00:42:12.080 | Like that resilience to the physical world,
00:42:15.120 | it seems like running the whole earth over again,
00:42:19.060 | the whole evolutionary process over again
00:42:22.480 | might be the only way to do it.
00:42:25.240 | So to create a robot that actually adapts,
00:42:27.920 | is as resilient to the dynamic world,
00:42:32.060 | might be a really difficult problem.
00:42:34.360 | - Possibly.
00:42:35.400 | Well, I was gonna say like in a programming environment,
00:42:37.000 | you can do things on timescales
00:42:38.660 | that are impossible in the real world, right?
00:42:41.040 | Like the benefit to AI and computers is computationally,
00:42:43.440 | they can compute so much data so quickly.
00:42:45.960 | Whereas on human timetables, we have to wait.
00:42:48.280 | When you talk about evolution,
00:42:50.000 | you know, it's generation after generation after generation.
00:42:53.160 | Maybe in a virtual environment that could be simulated
00:42:55.720 | and then those changes could happen a lot quicker.
00:42:57.400 | - Well, that's on a human timescale,
00:42:58.720 | but you have to look at earth
00:43:00.960 | as a quantum mechanical system where
00:43:03.260 | the computation is happening super fast.
00:43:05.240 | This is a giant computer doing a giant simulation.
00:43:08.140 | So just 'cause for us humans, it's slow,
00:43:10.720 | there's like trillions of organisms involved in you,
00:43:13.720 | Destiny being you.
00:43:15.480 | - Sure, but the next iteration of like from human to human,
00:43:19.480 | even if on the quantum level, there's a lot of stuff
00:43:21.080 | going on, you talk about like changes in DNA,
00:43:23.560 | for instance, right?
00:43:24.680 | Like that's happening from a generation
00:43:26.760 | to generation timescale.
00:43:27.920 | Like in a virtual environment,
00:43:28.880 | that could theoretically happen.
00:43:30.280 | Well, it already is, there's like protein folding,
00:43:32.160 | like huge cloud computing, probably ML stuff
00:43:35.000 | that's like working on doing all of that stuff.
00:43:36.320 | And it'll run like trillions and trillions of simulations,
00:43:38.400 | you know, every second and stuff, maybe not every second,
00:43:40.440 | but-- - Still slower
00:43:41.960 | than the actual protein folding, much slower.
00:43:45.520 | That's for the problem of solving protein folding
00:43:49.680 | to estimate the 3D structure, but the actual body
00:43:52.360 | does the actual protein folding way faster.
00:43:54.960 | So like we're, the question is, can we shortcut
00:43:58.040 | the simulation of human evolution,
00:44:00.840 | try to figure out how to build up an organism
00:44:03.040 | without simulating all the details?
00:44:05.360 | 'Cause if we have to simulate all the details of biology,
00:44:07.960 | we're screwed, we don't have--
00:44:09.600 | - Oh, true, we'd have to put something in a pond
00:44:11.200 | and then watch it for a billion years.
00:44:12.280 | - That might be the most efficient way to do it.
00:44:14.040 | - Sure. - That's what the universe
00:44:15.760 | most likely is, is a kind of simulation created
00:44:18.720 | by a teenager in their basement to try to see what happens.
00:44:22.620 | It's a computer game.
00:44:24.480 | That might be the most efficient way
00:44:25.840 | to create interesting organisms.
00:44:28.200 | But within the system, it's perhaps possible
00:44:31.280 | to create other robots that will be of use
00:44:36.280 | and will entertain us in the way
00:44:38.800 | that other humans entertain us.
00:44:40.680 | And that's a really interesting, of course, problem.
00:44:42.920 | But it's surprising how difficult it has been
00:44:46.120 | to create systems that operate in the physical world
00:44:49.920 | and operate in that physical world
00:44:51.240 | in a way that's safe to humans and interesting to humans.
00:44:55.640 | 'Cause there's also the human factor,
00:44:57.160 | the human-robot interaction.
00:44:58.520 | To me, that's the most interesting problem,
00:45:00.680 | to figure out how to do that well.
00:45:02.680 | And so Elon Musk and others, Boston Dynamics
00:45:07.120 | have worked on legged robots,
00:45:08.320 | so I really care about legged robots.
00:45:11.120 | Those are super interesting.
00:45:12.800 | How to make them such that they're able
00:45:16.360 | to operate successfully in a dynamic environment.
00:45:18.840 | It's super tricky.
00:45:20.080 | They're like the dumbest of dogs,
00:45:23.120 | speaking of which, there's a dog barking outside.
00:45:25.520 | It's really tricky to create those kinds of organisms
00:45:29.080 | that live in the human world.
00:45:31.680 | Then again, if more and more of us move
00:45:35.720 | into the digital world, so you stream a lot,
00:45:39.880 | like part of who you are exists in the digital space.
00:45:44.880 | The fact that you have a physical representation also,
00:45:48.520 | may become less and less important.
00:45:51.580 | - I hope that's the case,
00:45:53.440 | 'cause I bought a lot of stock in Meta,
00:45:54.680 | and man, it's down a lot.
00:45:57.080 | - Meta the company?
00:45:59.000 | Is there some degree, can you look at yourself,
00:46:02.360 | like Steven, the physical meat vehicle,
00:46:05.000 | and then Destiny, this digital space,
00:46:07.820 | like digital avatar, do you sense that in a certain way
00:46:12.780 | you're the digital avatar?
00:46:14.740 | - I've always tried to keep my on-stream personality
00:46:16.740 | as genuine as possible, so they're one and the same to me.
00:46:19.440 | I don't really view them as two separate entities,
00:46:21.740 | but I mean, I always view myself as Steven,
00:46:23.780 | the real-life person.
00:46:25.260 | Destiny's my online name, but--
00:46:26.900 | - No, but because your social network
00:46:29.420 | is established in the digital space,
00:46:31.020 | so many people know you through the digital space.
00:46:33.460 | Can we swap out another person that looks like you?
00:46:37.180 | In like an AI system, and then that entity
00:46:40.100 | known as destiny will continue existing.
00:46:42.140 | - I mean, there must be some level of sophistication
00:46:45.460 | that could emulate a human brain, I would imagine, right?
00:46:48.260 | Probably the tech's not there yet, but--
00:46:50.820 | - Well, the question is, what's the level of sophistication
00:46:53.820 | of the audience that would recognize
00:46:55.580 | that something has changed?
00:46:56.940 | It's the Turing test.
00:47:00.340 | How hard is it to trick your audience,
00:47:03.500 | your large audience of fans that watch your streams,
00:47:07.460 | that when you swap out an AI that emulates you,
00:47:11.220 | that nothing has changed?
00:47:13.180 | And the question is, do you have to really simulate
00:47:16.180 | so much of the human brain for that?
00:47:18.020 | I don't think so.
00:47:19.180 | - Probably not.
00:47:20.940 | - So, I mean, like you said, a lot of political discourse
00:47:25.420 | is just walking down the tree together,
00:47:27.740 | so you can probably emulate a lot of that discussion.
00:47:30.380 | - Yeah, it would depend on if you're doing old data sets
00:47:32.620 | and you're training on that,
00:47:33.500 | and I'm having conversations about abortion
00:47:35.300 | and, you know, vaccines,
00:47:36.300 | I imagine it could do it for quite a while.
00:47:38.060 | The only thing that would be weird
00:47:38.900 | is when novel issues pop up.
00:47:40.420 | Then you probably need a more sophisticated resemblance
00:47:43.380 | of the inner brain, right?
00:47:44.220 | - You have to keep training on the internet,
00:47:46.340 | that's how the language models work,
00:47:47.500 | and that's the most incredible breakthrough,
00:47:49.140 | is just the language models.
00:47:50.380 | You just have to keep retraining the system on Reddit,
00:47:54.220 | which is actually what a lot of it is trained on,
00:47:56.260 | which is hilarious.
00:47:57.380 | - I do think it's really interesting
00:47:58.580 | that funny problems, like the trolley problem,
00:48:01.140 | that we can kind of work through our
00:48:02.540 | normative ethical systems on,
00:48:03.740 | are now like real questions.
00:48:05.820 | If you're driving a Tesla and it's on autopilot
00:48:07.540 | and you're gonna hit somebody,
00:48:08.380 | but it can swerve and hit somebody else,
00:48:10.100 | what ought the system do?
00:48:11.940 | We went very quickly from fun project in philosophy class
00:48:15.980 | to we need to solve this for insurance purposes
00:48:18.580 | as quickly as possible.
00:48:19.580 | It's kind of interesting to think about.
00:48:21.100 | - Well, I actually have,
00:48:22.700 | I'll bring up the trolley problem with you later.
00:48:25.100 | There's a fascinating version of it that I find hilarious.
00:48:28.260 | Okay, let's return to your low point.
00:48:30.940 | - Oh yeah.
00:48:31.780 | - You started playing video games.
00:48:34.220 | That was a lucky break.
00:48:36.100 | You did text-based ones.
00:48:37.300 | That was a lucky break
00:48:38.140 | 'cause you've gotten to be pretty good at learning.
00:48:40.380 | And then you started thinking about going to college
00:48:43.180 | and so on.
00:48:44.020 | What happened next?
00:48:44.860 | - I mean, I went to like a prep school.
00:48:46.260 | So you kind of have to go to college after.
00:48:47.940 | That's like the point, right?
00:48:48.780 | I was also a millennial.
00:48:50.020 | All of us had to go to college.
00:48:51.020 | That's always what they told us.
00:48:52.060 | So my life was kind of,
00:48:54.820 | it's hard to describe.
00:48:57.300 | I didn't really think much of the future.
00:48:58.780 | I was just kind of enjoying the day-to-day
00:49:00.340 | 'cause everything in my life was pretty weird.
00:49:02.340 | Both my parents had moved to Florida
00:49:04.260 | by the time I was 16, 17.
00:49:05.860 | I was living with my grandma.
00:49:06.700 | I was working.
00:49:07.820 | I had a girlfriend, moved out.
00:49:09.980 | We got a place, did college.
00:49:11.560 | By the time I got into college,
00:49:13.820 | I had transitioned from working at McDonald's
00:49:16.340 | to I was like working in a casino restaurant basically.
00:49:19.260 | And I was really good at that job.
00:49:21.020 | So high level of patience for drunk people and insane people.
00:49:24.700 | And I was doing music in school
00:49:26.700 | 'cause I'd really grown to love music.
00:49:28.140 | And my kind of thought process was
00:49:30.380 | I can do music as a hobby, I guess,
00:49:33.260 | unless I get really good
00:49:34.100 | and maybe I can make money with that.
00:49:35.260 | But otherwise I love music.
00:49:36.420 | I'm okay going to school for music, getting good at it.
00:49:38.220 | And then just doing that on the side.
00:49:39.580 | And then my main job would kind of be
00:49:40.580 | this career I was building at the casino.
00:49:43.420 | And basically trying to balance personal life
00:49:47.040 | plus graveyard shift, six to eight weeks at a casino,
00:49:49.540 | and then a full-time music degree was not possible for me.
00:49:52.340 | And eventually I had to drop school
00:49:53.840 | after I think it was like three years.
00:49:55.860 | And after I dropped school to maintain my casino job,
00:49:59.220 | after a few months, I got fired from my casino job.
00:50:02.220 | So I'd essentially just thrown away
00:50:03.260 | like the past like three or four years of my life.
00:50:05.900 | - Why'd you get fired from the casino job?
00:50:07.740 | I heard there's a story behind that.
00:50:09.180 | - Yeah, there's a story.
00:50:10.060 | Basically I was just really dumb
00:50:11.420 | when it came to understanding corporate politics.
00:50:13.180 | And this is funny 'cause the same attitude
00:50:14.700 | kind of followed me into the streaming world.
00:50:17.180 | My thought process has kind of always been
00:50:19.220 | that like as long as I'm really good at what I do,
00:50:21.760 | I should be untouchable.
00:50:22.860 | If I'm really good, you can't do anything to me.
00:50:24.420 | I don't have to play any dumb games or whatever.
00:50:26.660 | And at the casino, I think I was the youngest.
00:50:29.660 | It was originally a shift lead,
00:50:30.740 | then supervisor position at the casino.
00:50:33.340 | And when I started to get my own shifts,
00:50:35.900 | there were problems that I would run into on graveyard shift
00:50:38.160 | because of carryover from the swing shift.
00:50:40.500 | I remember one of these problems
00:50:41.400 | was underneath the soda machine,
00:50:42.700 | they weren't cleaning it properly
00:50:43.780 | and fruit flies were showing up.
00:50:45.500 | And the manager came in one morning and she was like,
00:50:48.700 | "Hey, what's going on with the machine?"
00:50:50.780 | And I told her, I was like,
00:50:51.620 | "I'm done, I can't take everything from swing shift
00:50:54.460 | and do everything at grave shift, I can't do this.
00:50:55.980 | They need to figure out their stuff better
00:50:57.340 | or I need more employees, it's not possible for me."
00:50:59.460 | And she's like, "Did you tell anybody else?"
00:51:00.740 | I was like, "Yeah, I complained to the supervisor
00:51:02.740 | on the swing shift all the time."
00:51:04.340 | And she told me, "If you're not getting the answer
00:51:06.840 | that you like, then it's your responsibility
00:51:08.900 | to email the next person up."
00:51:11.100 | And I was like, "Oh, okay, that's interesting."
00:51:13.220 | And some months went on and I ran into more problems
00:51:16.020 | because on graveyard, here's how,
00:51:18.080 | I don't know if it's everywhere,
00:51:19.020 | but morning shift is the easiest
00:51:20.820 | and that's when you're the most overstaffed
00:51:22.060 | because that's when all the VPs are in
00:51:23.420 | and that's when all the managers are there
00:51:24.660 | and everybody blah, blah, blah.
00:51:25.620 | Swing shift is the most challenging,
00:51:27.460 | that's where your highest flow of customers is.
00:51:29.620 | You're also decently staffed there,
00:51:30.800 | but there's a lot of stuff going on.
00:51:32.020 | And graveyard, nobody cares at all about you.
00:51:34.540 | They don't give you any employees.
00:51:35.900 | You might get swamped, you might not, who cares?
00:51:37.780 | Make sure it's clean for day shift,
00:51:38.860 | that's the only thing that matters.
00:51:39.780 | - A quick question, first of all, clarification.
00:51:41.500 | So this is 24 hour?
00:51:43.740 | - 24 hour diner, yeah, inside the casino, yeah.
00:51:45.420 | - So it's a diner in a casino.
00:51:47.020 | Oh, by the way, I had an amazing moment
00:51:48.820 | at a diner in a casino recently.
00:51:50.580 | It's a special place.
00:51:51.860 | A diner in a casino is a place of magic.
00:51:54.140 | - There's a lot of, I don't know if I'd say magic,
00:51:56.220 | but there's a lot of other worldly stuff going on.
00:51:57.860 | - There's characters, there's,
00:51:59.540 | and I had an interaction with a waitress
00:52:01.980 | that was the sweetest waitress in the world.
00:52:03.980 | And it was just like, I don't know,
00:52:05.580 | made me feel less alone in this cruel world of ours.
00:52:09.620 | So graveyard begins when?
00:52:11.580 | - For me, my shift was 10 p.m. to 6 a.m.
00:52:13.900 | Or sometimes I got called in early,
00:52:15.060 | so it'd be 8 p.m. to 6 a.m.
00:52:16.740 | - There's no love for that shift.
00:52:19.620 | - No, especially not trying to do school at the same time.
00:52:21.380 | Absolutely not.
00:52:22.260 | But yeah, basically, long story short,
00:52:24.600 | I ran into a problem with my,
00:52:25.700 | where I didn't have enough employees on my shift.
00:52:28.020 | VPs were coming in in the morning,
00:52:29.220 | and they're just like, "Hey, the diner's kind of dirty."
00:52:31.020 | And I'm like, "You've cut all my employees past 4 a.m."
00:52:33.100 | Like on some nights, I'm literally cooking
00:52:34.580 | and doing front of house, like all on my own.
00:52:36.500 | Like, I can't do this.
00:52:37.820 | And my manager, Pam, told me,
00:52:39.980 | "Well, you've got to figure it out."
00:52:41.340 | And so I remembered her advice.
00:52:43.260 | So I emailed the VP of food and beverage, and I CC'd her.
00:52:45.940 | And I said, "I'm not getting the help
00:52:46.980 | "I need on my restaurant."
00:52:48.020 | Now, I didn't know at the time
00:52:48.980 | that I was basically completely throwing her under the bus
00:52:51.380 | because of that email.
00:52:52.740 | But retroactively, when I look back on things,
00:52:55.900 | or retrospectively, I see that was the moment
00:52:58.140 | that I got marked for deletion.
00:53:00.140 | And I didn't really understand it,
00:53:01.500 | even though I'd heard terminology
00:53:02.980 | for papering somebody out the door.
00:53:04.340 | But after that point, I started to get written up
00:53:06.140 | for a lot of little, random things.
00:53:07.900 | Like, I'd missed one day of work
00:53:10.140 | in my three years at the casino.
00:53:11.940 | And I started to get written up
00:53:13.220 | for showing up one or two minutes late.
00:53:14.980 | That's kind of weird, I don't know, that's whatever.
00:53:16.900 | Or written up for random ways about filing paperwork.
00:53:19.140 | And then eventually there came a situation
00:53:20.860 | with another employee where they were,
00:53:23.340 | it's complicated, it has to do with call-out stuff.
00:53:24.820 | But basically, they wanted to call out,
00:53:27.460 | and I told them that if they called out,
00:53:28.620 | they were gonna get fired because they were at 10 points.
00:53:31.580 | They were at nine points, and 10 points
00:53:32.740 | is the firing point, blah, blah, blah.
00:53:33.740 | Pam told me, "You can tell her
00:53:35.220 | "that she's gonna get a point,
00:53:36.060 | "but you can't tell her she's gonna get fired."
00:53:37.460 | I don't know what that meant.
00:53:38.420 | And then I told her that if you call out,
00:53:39.980 | you're fucked, you're gonna get fired,
00:53:42.060 | or you're gonna be at 10 points.
00:53:43.460 | And then I got called in early, like three days later,
00:53:45.740 | and Pam was like, "You inappropriately communicated
00:53:47.900 | "with an employee because you said the F word
00:53:49.540 | "in a text message."
00:53:50.420 | And I'm like, "Really?
00:53:51.260 | "There's no shot."
00:53:52.260 | And she's like, "Well, you also tried to fire the employee."
00:53:53.980 | And I was like, "No, I told her
00:53:54.800 | "she was gonna get 10 points."
00:53:55.940 | She's like, "Well, you used the F word."
00:53:57.060 | I'm like, "This is insane."
00:53:58.580 | And I didn't, just 'cause I was such a high-performing
00:54:01.140 | employee, I was like, "There's no way I'm getting fired."
00:54:02.500 | And then I did, and I was like, "Yeah."
00:54:04.500 | Cashed out my 401(k) and moped for like three months
00:54:07.020 | 'cause I had thrown away school for this casino job.
00:54:09.780 | And then I got fired from this job that, yeah,
00:54:11.860 | nobody believed I got fired.
00:54:13.260 | It was just insane.
00:54:14.460 | - So if you look back, if you were allowed to not just
00:54:16.460 | to look back to your own memory, but actually watch yourself,
00:54:20.020 | like somebody recorded a video that whole time,
00:54:22.660 | do you think you would be surprised,
00:54:24.360 | you would notice some things, like potentially,
00:54:26.860 | of not having self-awareness, not having,
00:54:30.620 | like, the civility and social etiquette that's at play
00:54:33.340 | in human relations?
00:54:34.860 | - Yeah, absolutely.
00:54:35.860 | - So is that at the core of it, essentially?
00:54:39.340 | - Yeah, I think so.
00:54:40.180 | I mean, it follows me even to this day.
00:54:41.780 | There's a lot of, I don't know if you're recording or not,
00:54:44.380 | but when we spoke earlier about like meta conversations,
00:54:46.420 | I have to think a lot sometimes about meta conversations
00:54:48.620 | 'cause the way that I wanna drive a conversation
00:54:50.460 | will sometimes be way different than what is like
00:54:52.620 | the best way to have a conversation.
00:54:54.460 | Whereas I just wanna like go really hard
00:54:56.380 | on like some itty-bitty, like some idiosyncrasy,
00:54:59.020 | some factor, figure, whatever,
00:55:00.420 | but that's not like the human conversation I need to have.
00:55:03.260 | - So you got fired/left that job,
00:55:07.580 | and that took you to the job that would be the lowest point.
00:55:11.300 | - Yeah.
00:55:12.140 | (Lex laughing)
00:55:12.960 | 'Cause there was a huge downgrade in pay.
00:55:15.340 | I went from getting like, I think at the casino,
00:55:17.660 | 'cause I worked so much overtime,
00:55:18.860 | I was getting like 20 to 50 an hour on all my overtime.
00:55:21.140 | And this was back in 2008, 2009, as like a college student.
00:55:25.060 | Like it was amazing pay.
00:55:26.500 | The gig had benefits, like everything was good.
00:55:29.260 | And then with the carpet cleaning,
00:55:30.940 | my paycheck every other week
00:55:33.700 | was maybe 1,500 bucks or $1,000.
00:55:36.780 | And I'm working like 13 day stretches.
00:55:39.220 | Like I have every other Sunday off, and it's so many hours.
00:55:42.260 | Like I have to show up at the shop at like seven or six,
00:55:44.620 | and then I go home at like eight or nine,
00:55:46.700 | depending on when my jobs are throughout the day.
00:55:48.340 | - You doing businesses or residential,
00:55:50.620 | or what are you doing?
00:55:51.460 | - Everything.
00:55:52.280 | - Everything.
00:55:53.120 | Are you working for a company that does carpet cleaning?
00:55:55.060 | - Yeah.
00:55:56.100 | - Okay, and so like there's a schedule thing,
00:55:57.940 | you have to go to it and so on.
00:55:59.260 | - Yeah, but so like this is why the schedule would suck.
00:56:01.460 | 'Cause sometimes I'd show up at,
00:56:03.500 | I think we had to be in the shop at,
00:56:04.900 | I think it was 7 a.m.
00:56:06.020 | We show up at the shop at 7 a.m.
00:56:07.540 | First job might be at eight or nine,
00:56:09.140 | but that job might be like a one hour job.
00:56:10.800 | So I might show up at 7 a.m.
00:56:11.780 | and have a job from 8.30 to 9.30.
00:56:13.340 | Then my next job might not be from,
00:56:15.100 | until like say 11.
00:56:16.260 | So from 8.30 to 9.30 I'll do one job.
00:56:18.420 | And then I've got a job from like 11 to 12 or something.
00:56:21.740 | Then I might have like a decent job from like five to eight.
00:56:24.900 | But like my whole day is destroyed.
00:56:26.700 | And I'm doing like three smallest jobs.
00:56:30.340 | So I'm getting like 30 bucks maybe for being in the shop,
00:56:33.300 | or, you know, on the job for like 10 or 11 hours.
00:56:35.420 | It's just like horrible.
00:56:36.980 | - So you're somebody that seems to be extremely good
00:56:39.580 | at thinking and conversation.
00:56:41.460 | And so have a bit of an ego perhaps,
00:56:44.860 | in both the negative and the positive sense of that word.
00:56:48.300 | Was there some aspect of working at McDonald's
00:56:52.220 | and then working at the casino
00:56:53.900 | and then working as a carpet cleaner that was humbling?
00:56:57.580 | - No, never.
00:56:58.840 | I had a--
00:57:00.780 | - The ego burned bright through it all.
00:57:04.180 | Or no, you can push back on the ego.
00:57:06.020 | - Yeah, no, I understand.
00:57:06.900 | I totally get what you mean.
00:57:08.140 | I had a really close friend growing up whose name was Chris.
00:57:11.020 | And I think we probably met when he was,
00:57:12.460 | we were like four or five, I think.
00:57:13.900 | He lived behind me.
00:57:15.060 | And I grew up with him
00:57:16.260 | and I'd always been kind of an outsider
00:57:18.360 | to the world that I was in
00:57:20.580 | once I got to high school for sure.
00:57:22.540 | Because all of those kids were incredibly wealthy,
00:57:25.020 | you know, Corvettes and Mustangs when they turned 16.
00:57:27.560 | It was a prep school.
00:57:28.660 | And I was doing the,
00:57:29.980 | they had like a work study program there
00:57:31.380 | where you could stay after school from 2.30 to five
00:57:33.820 | every day to kind of like work to pay for your tuition.
00:57:36.340 | So I'd been working like throughout all of high school.
00:57:39.020 | I got another job at McDonald's when I was 18,
00:57:41.220 | worked at the casino.
00:57:42.060 | Like I'd always been doing that kind of work.
00:57:43.620 | I never really viewed it as like beneath me or anything.
00:57:45.580 | It's not like I don't have like a family of doctors
00:57:47.420 | or lawyers or anything.
00:57:48.460 | And then me and my other friend, Chris guy,
00:57:50.140 | we'd always make fun of everybody else
00:57:51.460 | for being kind of like, you know,
00:57:52.840 | like preppy kids and everything, so.
00:57:54.780 | - So there is a,
00:57:56.340 | there's some pride to that sort of hard work.
00:57:59.400 | - Yeah, I guess a little bit, yeah.
00:58:00.580 | 'Cause, you know, looking especially at my dad,
00:58:02.140 | like the solution to every problem
00:58:03.140 | was just throw more hours of work at it basically.
00:58:05.140 | So that was always my, yeah, go-to.
00:58:07.260 | And I never, yeah.
00:58:08.100 | - So what was psychologically the low point?
00:58:10.940 | - I think psychologically the low point was that
00:58:13.460 | as I'm doing this carpet cleaning job,
00:58:15.400 | driving around my city, there's like this feeling of,
00:58:20.120 | I guess for a lot of people it's probably college,
00:58:23.460 | but there's a feeling when you're in high school
00:58:25.820 | that everything is like so exciting
00:58:28.060 | and the whole world is kind of in front of you
00:58:30.460 | and there are a trillion, trillion
00:58:32.980 | different branching paths of possibilities.
00:58:35.320 | And, you know, even through high school,
00:58:36.660 | you're thinking like, am I gonna be a doctor or a lawyer
00:58:39.140 | or can I join the NBA or can I do this or that?
00:58:41.420 | There's all these things in front of you.
00:58:43.660 | And when I especially felt it
00:58:45.980 | when I was doing these carpet cleaning jobs
00:58:47.500 | and I think it was in the fall,
00:58:49.860 | I'd be outside some of these houses
00:58:51.160 | and I just kind of look around
00:58:52.340 | and I'd recognize a lot of these neighborhoods
00:58:53.780 | that I'd drive around with friends in
00:58:55.580 | or I'd, you know, be walking through.
00:58:56.860 | I did, I ran cross country.
00:58:57.940 | Some of them I'd be running through these neighborhoods
00:58:59.860 | and it was just kind of like this feeling of looking around
00:59:01.780 | and it was like, when I was here in the past,
00:59:04.620 | this was like kind of like a transitionary phase of my life
00:59:07.100 | where I'm doing this and it's so fun and exciting
00:59:09.380 | and then I'm gonna move on to something else
00:59:10.660 | and it's gonna be fun and exciting and awesome.
00:59:12.540 | And then like, you know, two years later,
00:59:14.980 | my whole life has collapsed.
00:59:16.940 | Like I'm in a house that I can't afford anymore.
00:59:18.840 | My ex that I hate is pregnant with my kid
00:59:22.100 | and I have no money.
00:59:24.220 | I've got no upward mobility.
00:59:25.700 | I failed college.
00:59:27.500 | My job is horrible.
00:59:28.900 | Like just every single, like this is like my,
00:59:31.900 | all of those, the way function had collapsed into one thing
00:59:35.060 | and that one thing was the worst thing
00:59:36.500 | that it could have possibly been at the time for me.
00:59:38.740 | Yeah, like everything was gone and horrible.
00:59:40.140 | So yeah, that was the feeling I had at the time.
00:59:41.820 | - Do you ever contemplate suicide?
00:59:43.740 | - I thought about thinking about it,
00:59:46.100 | but I've just never been that kind of person, so.
00:59:48.300 | - I mean, basically as a way to escape from the hardship.
00:59:50.900 | - Something that I'm so incredibly lucky,
00:59:53.340 | I don't know why or how,
00:59:54.580 | I'm just gonna chalk it up to biology.
00:59:56.140 | I've always had a really high mental baseline.
00:59:59.460 | Like depression and all of that,
01:00:01.300 | there've been a few short stints I've dealt with past 30
01:00:03.620 | because I did a lot of drugs.
01:00:04.940 | But other than that, my mental baseline is just so high.
01:00:08.260 | And even in the carpet cleaning days,
01:00:10.380 | like if you, man, the videos might still be there.
01:00:12.820 | I think on my old YouTube channel,
01:00:14.700 | where I'll be like playing StarCraft
01:00:15.900 | when I first started getting into streaming
01:00:17.340 | and I'll be calling up customers like,
01:00:18.580 | this is Steve from Guaranteed Clean.
01:00:19.780 | We had to move your job back one hour.
01:00:21.380 | Is it okay if I show up instead of 2.30?
01:00:22.700 | And then I hang up, it's like, all right guys,
01:00:23.940 | we've got three more games and it's like, let's go.
01:00:25.540 | Like stuff like that.
01:00:26.380 | So my baseline has always been like really high
01:00:28.060 | for mental function.
01:00:28.900 | - So even at the low point, you had strength.
01:00:31.260 | Is there anything you can give by way of advice
01:00:34.500 | for people for whom the wave function collapses,
01:00:37.900 | as it does for many of us?
01:00:39.820 | Like, holy fuck, the world is not full of opportunity
01:00:43.380 | and you're kind of a failure.
01:00:45.180 | And like, I've been there.
01:00:47.260 | - Yeah, I don't know.
01:00:48.100 | It's rough because like, I usually ask for compassion
01:00:51.580 | from people that have it better off.
01:00:53.660 | Because like, once you're down there,
01:00:55.180 | like the only reason, I say I got lucky,
01:00:57.260 | but it wasn't even really lucky.
01:00:59.980 | Or it was lucky, but it was more lucky.
01:01:01.340 | It wasn't just lucky that I got into streaming.
01:01:03.020 | It was lucky that I was into computers at an early age.
01:01:05.260 | It was lucky that I played video games at an early age.
01:01:07.100 | It was lucky that all the tech came up
01:01:08.540 | at exactly that right point in time.
01:01:10.420 | Like I was a pretty smart guy,
01:01:12.060 | but it was definitely preparation meets opportunity.
01:01:15.180 | And that opportunity was like
01:01:16.340 | at the exact precise moment of my life.
01:01:18.380 | If anything had gone differently,
01:01:19.620 | then I would just be cleaning carpets today, so.
01:01:22.140 | - So in the many worlds interpretation of quantum mechanics,
01:01:25.060 | this is like one out of like.
01:01:27.420 | - There's many, many Stevens
01:01:29.660 | that are just still carpet cleaning
01:01:32.220 | and they're full of pain and resentment.
01:01:34.580 | - Yeah, the one piece of advice that I give,
01:01:37.140 | I hate that I have to push back against
01:01:38.420 | all these crypto bros and everybody online.
01:01:40.180 | For decently intelligent people that are successful,
01:01:42.460 | I've never heard anybody give a contradiction to this.
01:01:45.020 | Maybe you will, you can tell me if you disagree.
01:01:47.540 | I always look at kids in high school
01:01:49.780 | and I'm like, just try a little bit harder.
01:01:51.580 | Like 30 minutes a night, if you don't study,
01:01:53.340 | just do 30 minutes, just do a little bit more.
01:01:56.020 | It is, you're laying the foundation for the rest of your life
01:01:58.660 | and you can't appreciate it in high school and college.
01:02:01.140 | But oh my God, when you get out,
01:02:03.220 | everything in your life is so much easier.
01:02:05.080 | You have probably more responsibility
01:02:07.300 | over the direction of your life
01:02:08.460 | when you're like 13, 14 years old
01:02:10.580 | than you ever will once you're like 25 and older.
01:02:12.860 | Because this is like when you're determining
01:02:14.700 | the foundations that everything's gonna be built on.
01:02:16.180 | - Yeah, 100%.
01:02:17.700 | So first of all, it does seem that
01:02:21.380 | the liberating aspect of being young
01:02:23.740 | is like anything you learn.
01:02:26.940 | So working hard at learning something will pay off
01:02:30.620 | in nonlinear ways, like you said with video games.
01:02:33.140 | I feel like, so people who are like, I hate school.
01:02:36.140 | All right, well, fine, but find something
01:02:39.640 | where you're challenging yourself,
01:02:41.020 | you're growing, you're learning, you're learning a skill,
01:02:43.380 | you're learning about a thing.
01:02:44.880 | Of course, you could push back and say,
01:02:47.860 | well, there's some trajectories that might not be productive
01:02:50.260 | if you spend the entirety of your teen years
01:02:52.940 | playing, I don't know, League of Legends,
01:02:54.780 | your game you have a love and hate relationship with.
01:02:57.980 | - No, just a hate and hate relationship.
01:02:59.100 | - Okay, well, we'll talk about,
01:03:01.020 | I think you have a love-hate relationship
01:03:02.980 | with hate in general.
01:03:03.900 | We'll just talk about it in love.
01:03:06.420 | We'll try to de-complexify that one.
01:03:08.580 | I think in general, just investing yourself
01:03:12.180 | fully with passion really does pay off.
01:03:15.380 | But that said, also school,
01:03:18.380 | I feel like doesn't get enough credit,
01:03:21.740 | like high school in particular,
01:03:23.260 | middle school and high school,
01:03:24.740 | because it's general education.
01:03:26.840 | I think if you're, especially if you're lucky
01:03:31.980 | to have good teachers, but honestly, I mostly haven't.
01:03:35.740 | The textbooks themselves with good teachers,
01:03:38.180 | it's a one chance in life you have to
01:03:40.740 | really explore a subject.
01:03:43.480 | Fuck grades, like getting good grades
01:03:46.860 | is in tension, I would say, with actual learning.
01:03:49.500 | That is true.
01:03:50.520 | But just get a biology textbook
01:03:53.320 | and explore ideas in biology,
01:03:55.340 | allowing yourself to be inspired
01:03:59.420 | by the beauty of it.
01:04:00.940 | Yeah, I don't know, I think that really, really,
01:04:04.700 | really pays off and you never get a chance
01:04:06.460 | to do that again.
01:04:07.300 | And maybe not even textbooks, like reading,
01:04:10.840 | straight up reading.
01:04:12.020 | I think if you read, this is a one time in life
01:04:14.900 | you get a chance to read.
01:04:16.700 | Really read, like read a book a day, read.
01:04:20.020 | You can really invest, you can really grow by reading.
01:04:24.740 | I mean, Elon Musk, all those guys talk about it.
01:04:27.500 | - It's very, very rare that you meet
01:04:28.820 | like a dumb person who reads a lot.
01:04:30.540 | I don't know if that's ever happened in my life.
01:04:32.620 | Yeah.
01:04:33.460 | - Dumb or not successful.
01:04:34.300 | And the cool thing is, it seems like the reading,
01:04:38.020 | it's like investment, the reading you do early on
01:04:40.880 | in high school pays off way more
01:04:42.900 | than the reading you do later.
01:04:45.100 | So like the really influential reading
01:04:48.700 | is during those high school years.
01:04:52.340 | Because you're basically learning from others
01:04:55.100 | the mistakes they've made, the solutions to problems.
01:04:59.020 | You're basically learning the shortcuts to life.
01:05:02.100 | Like whatever the hell you wanna do,
01:05:03.180 | music, read from the best people,
01:05:05.500 | the music theory, like learn music theory.
01:05:08.500 | Learn, read biographies about jazz musicians,
01:05:11.860 | blues musicians, see all the mistakes,
01:05:14.700 | see what they did, see the shortcuts.
01:05:16.380 | If you wanna do podcasting, read about other podcasts.
01:05:18.380 | If you wanna do streaming, read about other streamers,
01:05:21.140 | physicists and so on.
01:05:22.580 | And I feel like you figure out all the mistakes
01:05:24.740 | and you get to shortcut through life.
01:05:26.620 | Because most people show up to college
01:05:28.380 | without having done that.
01:05:29.820 | And now you get a chance to shortcut your way past them.
01:05:33.180 | Yeah, 100%.
01:05:35.340 | But nobody really teaches you that.
01:05:37.240 | They're like, go to school, from this time to that time.
01:05:42.500 | Shut up, this is just what you do, eat your broccoli.
01:05:46.020 | - I think there's two huge problems.
01:05:47.340 | One is now that I'm older,
01:05:50.100 | 'cause you don't know anything as a kid.
01:05:51.700 | You can't really criticize adults as a kid.
01:05:53.260 | 'Cause you're a kid.
01:05:54.100 | - You're ageist, if I may say so.
01:05:55.740 | - I am, I am super ageist.
01:05:56.580 | And as I get older, I get even more ageist.
01:05:59.940 | There are a lot of people where I argue with them,
01:06:00.780 | I was like, man, dude, you're really 22, aren't you?
01:06:02.980 | I can tell every word you say.
01:06:05.060 | There's seeps of, like, 22-year-old-ness.
01:06:07.220 | But that's okay, I love that for you.
01:06:08.940 | - I could just say, 'cause you mentioned this,
01:06:11.580 | your wife is a fellow streamer, Melina.
01:06:14.020 | You mentioned that this is a source of fights
01:06:16.500 | for the two of you that, and I could just feel that.
01:06:19.940 | There is truth to what you're saying, which is like,
01:06:22.700 | all right, you're saying that because you're 22.
01:06:25.220 | Just wait until you're 25,
01:06:26.860 | and you won't be saying that anymore.
01:06:28.560 | Now, that is the most annoying thing for people to hear.
01:06:31.460 | - Yeah, you can't ever say that, of course.
01:06:32.620 | - Because it's actually usually true.
01:06:35.100 | Because we do go through phases in life.
01:06:39.740 | And you can understand that most things are phases.
01:06:42.260 | So just in general, you can say, just wait, just wait.
01:06:46.500 | You won't feel this way again.
01:06:49.140 | I could say that to you, you could say that to yourself.
01:06:51.460 | Just wait, whatever you're feeling like, just wait.
01:06:53.980 | In five, 10 years, you'll be a different person,
01:06:56.140 | and you will laugh at the things you take seriously now
01:06:59.460 | that are causing you pain now, all that kind of stuff.
01:07:02.080 | But people hate hearing that.
01:07:03.860 | Anyway. - Absolutely.
01:07:05.340 | I think the joke that I always say is that like,
01:07:06.700 | if I could literally step into a time machine,
01:07:09.060 | and I could come back out and see myself as a 17-year-old,
01:07:12.060 | and I could say, "Hey, I am literally you from the future.
01:07:14.500 | "You see the time machine."
01:07:15.740 | And I would look at me, and I would see the time machine.
01:07:17.740 | And I would give myself the best advice in the world,
01:07:20.460 | to be the most successful person.
01:07:21.860 | I would ignore all of it, even knowing it came from myself.
01:07:24.260 | I'd be like, "This guy sold out.
01:07:25.280 | "This dude doesn't know what the fuck he's talking about.
01:07:26.620 | "Like, nah, I'll figure it out better.
01:07:27.780 | "Like, he must've made some mis..."
01:07:28.800 | That's what I would think as a 17-year-old.
01:07:29.640 | Even if I knew it was myself from the future,
01:07:30.980 | I would just 100% never believe it.
01:07:33.180 | And knowing that is very frustrating.
01:07:34.460 | But I keep that in mind when I deal with younger people.
01:07:36.020 | That's why I never, I always say on stream
01:07:37.740 | when I'm talking to, there's been stuff with Sneako.
01:07:40.500 | There's another girl on my stream called Lab.
01:07:41.860 | Like, when I see the way, I see the mistakes they're making.
01:07:44.260 | Oftentimes, because I've made all of these mistakes,
01:07:46.460 | sometimes in the most public and horrible fashion ever.
01:07:49.480 | But I'm never like a mentor.
01:07:50.500 | I'm not gonna sit there and tell you,
01:07:51.660 | oh, do this or that or that or that.
01:07:52.780 | 'Cause I don't know if you're gonna listen to me,
01:07:54.460 | and I don't wanna condescend to you.
01:07:55.780 | And you figure stuff out,
01:07:56.860 | and I'll be here if you wanna talk about it.
01:07:57.860 | But yeah.
01:07:58.940 | There was one of the stories,
01:08:00.380 | there was a company that didn't work with me,
01:08:02.580 | because I was very adamant on defending
01:08:05.380 | very radical notions about language
01:08:07.300 | and racial slurs and everything
01:08:08.220 | when I was like 22 or whatever.
01:08:09.940 | And there was a company, and they said,
01:08:10.940 | well, we don't wanna work with this guy for an event.
01:08:12.720 | And after they'd said that,
01:08:14.140 | I had written an article on my website
01:08:15.820 | called, the company was Gigabyte, they make motherboards.
01:08:18.000 | I said, fuck Gigabyte in the ass.
01:08:19.320 | That was the title to my article.
01:08:20.940 | And it was like, well, if they don't wanna work with me,
01:08:22.280 | I'm gonna blow them up
01:08:23.120 | and never do anything ever with them again.
01:08:25.180 | And it was just like, looking back on it now,
01:08:27.260 | obviously, as an older person, I'm like,
01:08:28.780 | hey, you need to pump the brakes and chill.
01:08:30.620 | You're destroying yourself.
01:08:32.000 | But yeah, as a young person, it's like,
01:08:33.500 | yeah, you're 22, of course you think
01:08:34.900 | that you can say whatever and do whatever.
01:08:36.420 | And as long as you're good at what you're doing,
01:08:37.620 | you've got the whole world behind you.
01:08:38.660 | And yeah, geez.
01:08:40.380 | - Well, let's go there.
01:08:41.280 | You have a history of using offensive language,
01:08:43.980 | like the R word, the N word,
01:08:47.180 | including the N word with a hard R,
01:08:49.580 | calling women bitches,
01:08:53.100 | talking about rape in a nonchalant way.
01:08:56.640 | What part of that do you regret?
01:09:01.460 | And what part of that do you not?
01:09:03.260 | - Language is very complicated.
01:09:04.860 | When it comes to stuff relating to slurs,
01:09:08.300 | there's been like a whole trajectory of feelings
01:09:11.980 | on everything related to language.
01:09:14.540 | So my--
01:09:15.380 | - For you personally and for the internet as a whole.
01:09:17.100 | - Yeah, I don't care about the internet,
01:09:18.100 | I'll answer for me personally.
01:09:19.500 | In my early 20s, I'll say like 22, 23,
01:09:22.900 | I think probably when I first started streaming,
01:09:24.660 | my feeling is that any word is just a word.
01:09:27.860 | And if it hurts you, that's your fault.
01:09:29.420 | Take responsibility for yourself.
01:09:31.580 | This probably came from my background
01:09:32.780 | of being like a really independent person.
01:09:34.140 | So that's just kind of like the mind
01:09:35.220 | that I had for everything.
01:09:36.780 | And there were basically,
01:09:38.620 | there were like a collection of experiences that I had,
01:09:41.000 | that as I grew, I started to realize like, okay,
01:09:43.820 | well, I feel differently about some of these words,
01:09:45.940 | depending on the context,
01:09:46.980 | and I can see how they can affect other people,
01:09:48.540 | depending on the context.
01:09:49.900 | So as I've kind of like grown,
01:09:51.740 | I think I've developed a more sophisticated understanding
01:09:54.020 | of how different words are used
01:09:55.900 | and how they affect people,
01:09:56.940 | whether they like it or not.
01:09:58.000 | And more importantly, whether I like it or not.
01:09:59.780 | And that words can, even if I don't want it to be,
01:10:02.780 | they can be a vehicle for emboldening certain types of ideas
01:10:05.900 | that I don't wanna embolden.
01:10:07.420 | And yeah, that's kind of been the whole like growth.
01:10:09.780 | I've been lucky that in the time
01:10:11.140 | that I came up on the internet,
01:10:12.260 | I was able to learn these lessons,
01:10:13.840 | because if I was trying to learn those same lessons today,
01:10:15.460 | I would have been completely destroyed
01:10:17.200 | 'cause I had insane views on language like 10 years ago.
01:10:20.120 | - We could talk about the past,
01:10:21.140 | we could talk about the present.
01:10:22.180 | Let's talk about the past first.
01:10:23.220 | So how do you deal with the fact
01:10:25.020 | that there's videos of you in the past saying the N word,
01:10:30.020 | including the N word with a hard R?
01:10:32.120 | - So generally-
01:10:32.960 | - And what's the context?
01:10:34.080 | Can you give me like,
01:10:35.000 | give me a memory, what would be the context usually?
01:10:37.560 | - When I lay out this defense,
01:10:39.560 | it's not because I wouldn't have used the N word.
01:10:42.080 | Generally, whenever I said the N word,
01:10:43.360 | it was usually in an example of like,
01:10:44.800 | this is something that like a racist person would say.
01:10:47.160 | I don't think I've ever on the internet,
01:10:49.200 | I don't think I've ever called anybody
01:10:50.440 | like the N word with a hard R.
01:10:52.360 | Not because I wouldn't have,
01:10:53.360 | but just because it wasn't in my vocabulary.
01:10:55.180 | I played RTS, real-time strategy,
01:10:57.900 | and we used the F slur for gay people.
01:11:00.500 | That's the one, and I use that one a ton.
01:11:01.980 | I've called people that a ton in the past.
01:11:04.420 | - So I should actually just as a small tangent.
01:11:06.780 | - Yeah, go for it.
01:11:07.660 | - And this is what I'd like to explore with you.
01:11:10.360 | There's a ruthlessness to the language in the gaming world.
01:11:16.460 | - Yeah.
01:11:17.340 | - And there's different communities,
01:11:18.740 | they have different flavors of language.
01:11:22.300 | - Of hate speech, yeah.
01:11:23.180 | - Of hate speech, essentially.
01:11:25.140 | And there's also a humor to it,
01:11:27.880 | which really bothers me in a dark way
01:11:34.200 | that I haven't been able to really think through,
01:11:37.280 | because humor seems to be a kind of catalyst for hate.
01:11:41.520 | It seems to normalize hate.
01:11:44.120 | Like you say, basically it's like Louis C.K.
01:11:48.080 | says a lot of edgy things,
01:11:50.120 | but you take something Louis C.K. says
01:11:52.500 | and do it in a non-funny way,
01:11:53.920 | and do it over and over and over,
01:11:55.120 | and keep increasing the hatefulness of it, the vitriol.
01:12:00.120 | And somehow you find yourself like Alice in Wonderland
01:12:04.360 | in a world full of hate, where there's no good and evil,
01:12:07.600 | it's all the same.
01:12:08.600 | In fact, the good is to be mocked,
01:12:11.480 | and the evil is to be celebrated for the humor of it.
01:12:14.800 | Basically not taking the ideas of evil seriously.
01:12:18.560 | And I don't know what it,
01:12:19.720 | it reveals something about human nature
01:12:21.440 | that you can let go.
01:12:22.860 | The moral relativism that can happen
01:12:25.820 | when you do that kind of stuff.
01:12:27.020 | At the same time, I'm a fan of dark humor, when done well.
01:12:32.020 | Anyway, for people who are not familiar,
01:12:34.380 | I just wanted to mention that some of the worst hate speech
01:12:39.380 | that ends in LOL happens in gaming communities.
01:12:44.120 | - Yeah.
01:12:45.080 | - And that's where you come from in a certain part.
01:12:47.320 | - So a lot of people don't remember this,
01:12:48.980 | or don't know this 'cause they're younger,
01:12:50.120 | but way back in the day, in the late 90s,
01:12:54.040 | early mid 2000s of the internet,
01:12:55.840 | the way that online kind of like shit talk worked
01:12:58.440 | was you were just trying to ramp up
01:13:00.320 | to the most insanely edgy, crazy stuff you could say
01:13:04.300 | to like provoke a reaction.
01:13:06.520 | Have you ever heard of something called the aristocrats?
01:13:09.280 | That it's like a joke, the joke?
01:13:11.080 | - Oh yeah, the joke, yeah, there's a movie on it, yeah.
01:13:13.240 | - Okay, basically every single like shit talk
01:13:16.040 | back in the internet was like that.
01:13:17.420 | Like what is the most increasingly depraved,
01:13:19.720 | and back then you didn't get banned for slurs or anything
01:13:22.440 | on any of these chat rooms,
01:13:23.400 | so it was just like an insane world to walk into.
01:13:25.880 | And I was fully 100% a part of, a product of,
01:13:29.640 | and a contributor to that world.
01:13:31.320 | - So that probably still goes on on the internet in some way,
01:13:36.200 | in a maybe more pacified way.
01:13:39.280 | - Only in darker parts of the internet.
01:13:40.840 | I'd say for the most part, most,
01:13:42.400 | well compared to back then, compared to 20 years ago,
01:13:44.600 | the internet is way cleaned up now.
01:13:46.160 | There are still gonna be boards you can go on
01:13:47.760 | or parts of the internet where you see that type of humor,
01:13:49.840 | but not, nowhere near as mainstream.
01:13:51.680 | Like back then you could open your mic on Xbox Live
01:13:54.400 | and hear some insane stuff when that first started,
01:13:56.720 | nowhere near what you hear today, although it's, yeah.
01:13:58.720 | - There's still elements of escalation that happen
01:14:01.280 | that just seems to be part of human nature on the internet.
01:14:04.860 | Because we don't get the feedback
01:14:06.040 | of actually hurting people directly.
01:14:08.240 | So the trolling, like for the lulz,
01:14:11.760 | you'll do like whatever, like you will still escalate.
01:14:15.280 | Within the bounds, you're just saying
01:14:17.080 | that there's more bounds now.
01:14:18.200 | On Reddit, there's more bounds and so on.
01:14:20.400 | So there's moderators that kind of yell at you,
01:14:23.800 | that ban you and so on if you cross those bounds.
01:14:26.520 | But overall, that basic human instinct to escalate,
01:14:29.900 | especially under the veil of anonymity is still there.
01:14:34.080 | I don't know, it's dark, it's dark.
01:14:35.480 | - Yeah, just there's a lot of different ways to look at it
01:14:37.560 | and there's different ways you can break that arc.
01:14:39.000 | Like for instance, like you mentioned dark humor
01:14:41.000 | and you say that like, sometimes dark humor is funny
01:14:44.920 | and sometimes it's not.
01:14:46.320 | I think that it's really important to dig into
01:14:48.080 | and figure out like why certain things are funny
01:14:50.320 | and why certain things-- - Can I give you an example?
01:14:51.320 | - Yeah, go. - It's from your subreddit.
01:14:52.860 | - Oh boy. (laughs)
01:14:53.920 | - No, that made me laugh and I felt wrong about it.
01:14:56.080 | - (laughs) Oh no.
01:14:58.280 | - So this is-- (laughs)
01:15:00.360 | - I already know what this is, yeah.
01:15:02.400 | - Yeah, so this is a trolley problem.
01:15:04.680 | To me, it connects because I think about the,
01:15:07.800 | it keeps, 'cause I worked on autonomous vehicles,
01:15:10.040 | the trolley problem, the philosophical thought experiment
01:15:12.840 | keeps getting brought up a lot.
01:15:14.360 | You know, when AI is part of making the decision,
01:15:16.800 | do I kill three people here or five people here
01:15:19.960 | and AI makes that decision, how do you do that calculus?
01:15:22.760 | And this particular, there's a deep,
01:15:27.320 | so it's satire that reveals some kind of flaw in society.
01:15:31.180 | I feel like that's why-- - That's what dark humor does?
01:15:33.600 | - Successful dark humor does.
01:15:35.200 | And I don't know if this-- - And it reveals a flaw.
01:15:36.960 | I feel like there's a certain brand of dark humor
01:15:39.840 | and I think the reason, I think the reason is why it's good
01:15:43.760 | or why it is good humor, I think it's because it,
01:15:46.000 | I don't think it necessarily reveals a flaw.
01:15:47.240 | Sometimes I feel like it reveals a kind of virtue, I think.
01:15:49.800 | Like, if you look at this particular thing--
01:15:52.520 | - Can I explain what we're-- - Yeah, go for it.
01:15:53.840 | Oh yeah, sure. - We're just listening.
01:15:55.880 | The title of the Reddit post is "You Know What to Pick"
01:15:59.440 | and it says, "Five people are going to die either way,
01:16:03.360 | "but if you flip the lever,
01:16:05.200 | "the trolley will do a sick fucking loop first."
01:16:08.940 | And also the top comment is a question saying,
01:16:13.940 | which I think is also part of the dark humor
01:16:16.500 | that's successful, "Can I get the gender
01:16:18.740 | "and ethnic backgrounds of the groups first?"
01:16:21.560 | And the top answer is, "Both groups are each comprised
01:16:24.700 | "of five white, orphaned, cis male, heavy meth users
01:16:29.700 | "who are consistently in and out of drug rehab,
01:16:34.360 | "all who identify as right-wing extremists."
01:16:37.060 | Humor is such a sophisticated thing that we engage in.
01:16:40.100 | Humor is really complicated.
01:16:42.380 | But I would argue that hopefully the humor here
01:16:45.080 | shows the virtue of this is obviously horrible,
01:16:48.660 | but that's kind of why it's funny.
01:16:50.020 | It's funny because it's such a horrible question to ask.
01:16:52.580 | Do we kill five people in a boring way
01:16:54.180 | or in a really entertaining way?
01:16:55.340 | And it's like, that's really, that's really,
01:16:57.180 | and then when you ask even more,
01:16:58.340 | what are the ethnic backgrounds?
01:16:59.500 | That's even worse to say that.
01:17:01.540 | So I feel like that's the type of,
01:17:03.260 | there's a way that you can engage with dark humor
01:17:04.900 | where it's like, oof.
01:17:06.380 | It's funny because it's so wrong and so taboo.
01:17:08.820 | And we all know that it's wrong and taboo,
01:17:10.220 | and that's kind of where the shared laugh comes from.
01:17:11.940 | - So for me, the question, asking the diversity question
01:17:16.580 | is a sophisticated way of revealing the absurdity
01:17:19.340 | of asking about diversity when it's talking about human life.
01:17:22.500 | - Oh, interesting, 'cause the way that I took that was,
01:17:25.120 | I think it reveals the absurdity
01:17:27.400 | of how people will weigh different ethnic backgrounds
01:17:29.540 | so differently when it comes to value of human life.
01:17:31.900 | Like, I'm actually thinking of that in terms of like
01:17:33.700 | an immigration-related question,
01:17:35.740 | where people are really keen and quick to dehumanize
01:17:38.300 | like black or brown people.
01:17:39.980 | So like the question is like,
01:17:40.820 | well, if five of them are brown and five are white,
01:17:42.460 | well, I know which one I'm gonna pull the lever for.
01:17:44.740 | That's how I read that.
01:17:45.580 | - But it's satirizing that aspect.
01:17:46.740 | - Yeah, exactly, yes, of course, yeah.
01:17:47.580 | - But that's what I mean, that that's the flaw.
01:17:50.060 | To me, at least, it showed that humanity
01:17:55.060 | or social networks that are easy to be outraged
01:17:59.100 | and love the outrage and the chaos,
01:18:02.820 | that Twitter and social networks will pull that lever.
01:18:06.200 | Like they would always try to maximize the fun.
01:18:10.420 | And there's a sick aspect to all the atrocities,
01:18:14.620 | all the tragedies that happen in the world
01:18:16.780 | that we kind of always lean towards
01:18:18.780 | the outrageous narrative weaved around it.
01:18:23.220 | Yeah, the one that leads to sort of the most clicks,
01:18:28.380 | to the most attention, to the most outrage,
01:18:32.140 | to all that kind of stuff.
01:18:33.700 | So that's almost like a satire of society.
01:18:37.740 | When they are faced with tragedy, they will maximize.
01:18:42.580 | I'm trying to think of a word that's not fun, but--
01:18:46.160 | - Entertainment.
01:18:47.060 | - Maximize the entertainment, yeah.
01:18:49.100 | - This is a big criticism I give,
01:18:50.420 | especially to conservative crowds.
01:18:52.420 | Left-leaning people, everybody does it.
01:18:54.660 | I don't like when people blame the media
01:18:56.380 | for the state of the media today.
01:18:58.020 | I very much believe that everything in society
01:19:00.300 | is a feedback loop, and that if you're really unhappy
01:19:02.740 | with the state of the media,
01:19:03.600 | I think that the media is a good reflection
01:19:05.060 | for what people wanna see.
01:19:06.220 | Because there is a room right now in the United States
01:19:08.380 | where somebody could start a company
01:19:09.500 | where all they do is completely factual reporting,
01:19:12.460 | they don't have a political slant,
01:19:13.740 | and they're not giving you these sensationalist narratives
01:19:15.700 | or stories, and that media company would fail in two weeks,
01:19:18.100 | because people don't wanna see that.
01:19:19.140 | Generally, people really wanna see the,
01:19:20.860 | show me the guy that really believes in what I say,
01:19:22.860 | that calls the other guy an idiot,
01:19:24.020 | the guy that are screaming on TV or on the radio,
01:19:25.820 | like this is what I really want.
01:19:27.660 | And people will engage in that,
01:19:29.060 | and that feedback loop will continue for generations,
01:19:32.140 | and then all of a sudden people are like,
01:19:33.140 | why is the media so biased?
01:19:34.580 | Why is the media driving so many narratives?
01:19:36.380 | And it's like, well, what do you mean?
01:19:37.200 | This is exactly what you wanna see.
01:19:39.060 | And that's frustrating for me.
01:19:40.780 | That's one of my big kind of when I defend establishments,
01:19:43.080 | or when I talk about the interplay between citizen
01:19:45.820 | and all these institutions we have,
01:19:48.660 | that the institutions are very much a reflection
01:19:51.180 | of the population, at least in democratic societies.
01:19:54.340 | And I think that people very much try to elude
01:19:56.300 | the personal responsibility
01:19:57.780 | or the country's responsibility
01:19:59.700 | to why some of them look the way that they do.
01:20:02.000 | - But that takes us back to the N word with a hard R.
01:20:05.900 | - Sure. - Why?
01:20:07.700 | - For the particular examples that I was given,
01:20:09.660 | or for the particular conversations that I was having,
01:20:11.900 | if you're gonna have challenging conversations
01:20:13.740 | around certain words,
01:20:14.900 | I think you should probably be able to say them,
01:20:16.540 | otherwise it feels really ridiculous to me.
01:20:18.580 | That's like my-- - Do you still believe that?
01:20:20.780 | - For, yes.
01:20:22.020 | And not like calling people those words,
01:20:23.800 | but in having conversations about those words,
01:20:25.620 | I would say that I still believe that, yeah.
01:20:27.100 | - But don't you think, as you said,
01:20:29.060 | that using those words
01:20:30.620 | actually gives motivation and strength
01:20:35.460 | to people who have hate in their hearts?
01:20:38.100 | - I think depending on the context of what's going on,
01:20:41.820 | I think that that's gonna be a big driver
01:20:43.420 | in terms of how people are going to perceive or take it.
01:20:46.300 | So in a conversation about the N word,
01:20:48.500 | I don't think I would normally say the N word,
01:20:49.900 | we would just talk about the word,
01:20:51.660 | much the same way that, say, like in a movie,
01:20:53.900 | like in "Django," people use the N word.
01:20:55.860 | Should that be censored in that movie?
01:20:57.460 | Or in the context of that movie,
01:20:58.860 | is it being employed in a way where these aren't good people,
01:21:01.100 | you're not supposed to like them,
01:21:02.020 | and that's what the audience walks away with.
01:21:04.340 | - Yeah, but that context is different in the conversation.
01:21:06.380 | It feels like in conversation,
01:21:08.720 | you using that word normalizes it.
01:21:11.900 | And normalizing that word is going to make it easier
01:21:15.340 | for people who use that word in a hateful way to use it.
01:21:19.380 | Same with the F word, the F slur.
01:21:23.460 | If you use that casually and normalize it
01:21:27.640 | in a way that's not hateful,
01:21:30.700 | you use it in a way that's not hateful,
01:21:32.780 | but the side effect is that it normalizes it,
01:21:35.580 | then people who do use it in a hateful way
01:21:38.460 | will be more likely to use it.
01:21:40.260 | Therefore, mathematically looking at the equation
01:21:43.660 | of the number of times the N word or the F word
01:21:46.900 | is used throughout the world,
01:21:48.580 | it increases the number of times it's used in a hateful way.
01:21:51.340 | - Yeah, I think that human beings--
01:21:52.180 | - And you're part of that problem, Steven.
01:21:54.460 | - I don't agree.
01:21:55.340 | I understand the thought process,
01:21:57.980 | but I don't know if using certain words
01:22:00.760 | within different contexts is going to necessarily normalize
01:22:04.780 | the hateful use of that word.
01:22:06.220 | That is an argument that I've heard people use.
01:22:09.700 | Somebody will say, "Okay, well, hold on.
01:22:11.060 | "That should never be used ever,"
01:22:12.500 | because by virtue of you normalizing it,
01:22:14.800 | even in an inoffensive environment,
01:22:16.740 | you increase the proclivity for people to use it
01:22:18.700 | in a potentially more offensive environment.
01:22:20.620 | And my argument is always that,
01:22:21.900 | "No, I don't think that crossover exists."
01:22:23.500 | But if you did wanna take that argument,
01:22:25.100 | and maybe you do feel this way,
01:22:26.460 | I think that it gets really problematic
01:22:27.820 | when you run into communities that do use certain words
01:22:30.040 | that people would say, "Well, they should be allowed
01:22:31.100 | "to do it."
01:22:31.940 | So, for instance, if you think that any utterance
01:22:33.500 | of the N word at all is highly problematic
01:22:35.260 | and might increase hatred,
01:22:36.540 | then the entire rap industry has to dramatically change
01:22:38.820 | the way that they engage with the N word.
01:22:40.340 | And obviously, a lot of people that criticize
01:22:42.120 | people's use of the N word are gonna turn to rappers
01:22:44.260 | and say, "Well, you guys can't say it either."
01:22:45.900 | - No, it's who uses the N word.
01:22:48.780 | So, it's not just the word.
01:22:50.980 | It's the, it is context dependent.
01:22:55.760 | But I would say that you, as a white person,
01:23:00.380 | having conversations, the context there is the kind
01:23:05.380 | that would lead to an increase in hate.
01:23:07.300 | - Do you think the N word should be censored
01:23:08.460 | in the dictionary?
01:23:09.460 | - No.
01:23:10.300 | And I believe there's a Wikipedia page on it,
01:23:12.540 | and it's not censored.
01:23:13.700 | Yeah, I think it should be in the dictionary.
01:23:16.300 | I think the context of casual conversation,
01:23:20.120 | like I said, I just believe that on the internet,
01:23:23.060 | having humor, having fun conversations as you have
01:23:28.300 | on your streams, that leads to the normalization
01:23:32.660 | of the word without any educational value.
01:23:35.180 | Without significant educational value.
01:23:36.020 | - Yeah, I would agree with that.
01:23:37.060 | I think I would agree with that.
01:23:37.900 | - No, sorry, so there's a difference between F slur
01:23:41.140 | and N word, and both, I think, should not be used
01:23:45.060 | in a fun way, but the F word was used in a fun way
01:23:48.300 | for the longest time.
01:23:49.180 | - For sure.
01:23:50.020 | - And I'll tell you something that bothers me
01:23:52.360 | about your streams, not your streams,
01:23:55.440 | your streams and basically every other stream,
01:23:58.440 | is the casual use of the R word.
01:24:00.920 | - Oh, the ableism, yeah.
01:24:02.580 | - I don't know if it's about the able,
01:24:04.160 | I don't even know, listen, it's complicated.
01:24:07.280 | I'm not like virtue signaling here.
01:24:08.840 | - No, ableism isn't virtue signaling.
01:24:10.520 | I mean, it's a legitimate, yeah.
01:24:12.320 | Like I get emails from fans that say like,
01:24:13.920 | "Hey, I deal with this particular issue.
01:24:16.660 | "Every time you use this word,
01:24:17.880 | "it kind of feels like you're attacking me."
01:24:19.160 | Like just like, so it's a valid concept, yeah.
01:24:21.980 | - It's just something cuts wrong for me.
01:24:24.180 | Like for example, I'm not bothered by,
01:24:26.660 | I am bothered by the excessive use of the word fuck.
01:24:29.540 | - Okay.
01:24:30.380 | - But not--
01:24:31.660 | - In the same way that--
01:24:32.500 | - Moderate use of the word fuck.
01:24:35.300 | - What is it I'm curious in?
01:24:36.180 | When somebody calls somebody an R word,
01:24:38.420 | what is it that, what is the feeling that you get
01:24:40.180 | that makes you feel bad about it?
01:24:42.440 | - It signals to me that you don't give a damn about,
01:24:47.440 | (sighs)
01:24:49.440 | people who are struggling in ways
01:24:54.480 | that you are not struggling.
01:24:56.080 | Like that signals to me, like about the experience of others.
01:25:01.080 | - Do you think that there are other words also
01:25:04.040 | that could convey like a similar feeling to you?
01:25:06.480 | 'Cause it feels like you've drawn
01:25:07.320 | a pretty special circle around,
01:25:09.360 | 'cause like I imagine, I go, "Oh, this guy's,
01:25:11.000 | "you're an uneducated dumb fuck," or, "You're a nitwit."
01:25:13.640 | Like, do those words--
01:25:15.440 | - Well, that circle keeps changing.
01:25:17.920 | - Which it can, which is fine, it does.
01:25:18.760 | - And I think that's the whole point with the culture.
01:25:21.320 | So I'm trying to feel, my feeling is a kind of,
01:25:25.280 | I'm a human being that exists in a social context
01:25:27.560 | that we're all evolving that language together
01:25:30.320 | and just feels wrong.
01:25:31.160 | Like, the word bitch, for example, it really,
01:25:34.840 | like I've heard on your streams and in general,
01:25:37.560 | calling a woman a stupid bitch really bothers me.
01:25:42.560 | But it's not just the word bitch, it's context.
01:25:45.240 | Like, for example, me personally,
01:25:47.440 | I'm speaking to me personally,
01:25:48.800 | like badass bitch is different than stupid bitch.
01:25:53.480 | - Sure, like a bad bitch or something is different than,
01:25:55.280 | yeah, of course. - Way different.
01:25:56.960 | I think it speaks to a bigger sense of civility
01:26:00.840 | and respect for human beings that are not like you.
01:26:04.680 | That's the feeling that I'm bothered.
01:26:07.720 | So I guess what I'm trying to say here is
01:26:12.520 | just because people speak in this kind of way
01:26:16.800 | in the gaming world and in streams
01:26:19.680 | doesn't mean that you, like a lot of people look up to you.
01:26:23.720 | It doesn't mean, young people especially,
01:26:26.920 | doesn't mean that you don't have the responsibility
01:26:28.860 | to sort of stand alone from the crowd.
01:26:31.080 | 'Cause you're somebody that values
01:26:32.640 | the power of effective discourse.
01:26:35.180 | And to be effective discourse,
01:26:40.240 | there's some level of civility.
01:26:41.840 | So you can be the sort of the beacon of civility
01:26:44.520 | in that world versus giving in to the derogatory words.
01:26:49.520 | 'Cause you have to lift people out of that world,
01:26:55.120 | out of the muck of, what I would say is like drama
01:27:00.120 | in effective discourse.
01:27:02.320 | I think that's one of your missions, right?
01:27:04.180 | Is like to inspire the world through conversation,
01:27:07.480 | through debate, through effective discourse.
01:27:09.960 | So I guess I'm just calling you out
01:27:11.640 | that I think using our word, for me personally,
01:27:14.600 | as a fan that believes in your mission,
01:27:18.600 | it just makes you look ineffective and bad
01:27:22.080 | and uninspiring to young people that look up to you.
01:27:24.960 | 'Cause those young people are going to use those words
01:27:27.440 | that you're using and they'll do it much less effectively.
01:27:30.760 | - Sure. - That's the problem.
01:27:32.280 | - Yeah, I guess the challenge is always
01:27:33.320 | just like finding the line.
01:27:34.720 | Like my vocabulary shifted dramatically from,
01:27:38.160 | even from like two or three years ago,
01:27:39.720 | I think my vocabulary shifted quite a bit
01:27:41.760 | as we've like, we've kind of gotten rid of some words
01:27:43.760 | and some things are kind of coming out.
01:27:46.280 | The R word is one that has kind of gone out and come back
01:27:48.400 | and gone out and come back.
01:27:49.760 | That one we've definitely gone back and forth on.
01:27:51.440 | I know there are different thoughts about it
01:27:52.640 | in different communities on the internet.
01:27:54.200 | - This is interesting.
01:27:55.040 | I mean, I'm just telling you, for me, it cuts,
01:27:59.440 | and I'm not a social justice warrior type,
01:28:01.920 | it cuts pretty hard.
01:28:03.760 | - What you're saying is I'm gonna lose a subscriber
01:28:05.440 | if I'm-- - No, it's not a subscriber.
01:28:08.120 | I know what you mean. - I actually have
01:28:09.880 | to empathize harder because I'm like,
01:28:12.280 | maybe this is not a very good person.
01:28:14.100 | That's what I feel.
01:28:15.280 | Like if you're so carelessly using that word,
01:28:18.240 | then maybe you're not actually thinking deeply
01:28:20.080 | about the suffering in the world.
01:28:22.600 | Like to be a student of human nature,
01:28:24.740 | you really have to think about other humans
01:28:26.360 | and other experiences that are unlike your own.
01:28:28.280 | - Yeah, of course.
01:28:29.120 | - And so that's the sense I get.
01:28:30.000 | But at the same time, you're also like the grandpa,
01:28:34.160 | I mean, ageist, who's trying to be cool with the young kids.
01:28:36.880 | A lot of the reason young kids look up to you
01:28:40.200 | is like you also know the language of the internet.
01:28:42.920 | - Yeah, but I mean, that's not an excuse
01:28:44.600 | to use words that we think shouldn't be used.
01:28:46.720 | I guess the question that I would have,
01:28:48.360 | 'cause it's always a struggle,
01:28:49.240 | and to some extent it's kind of happened,
01:28:50.960 | is let's say that like three years ago,
01:28:53.720 | I would have said I'm no longer saying the R word,
01:28:55.840 | that's just, I'm just gonna get rid of that in my vocabulary.
01:28:58.320 | Like is there a chance that today,
01:29:00.240 | we would be having a conversation about like,
01:29:01.840 | why do you call people dumb fucks?
01:29:04.040 | Like is that really appropriate?
01:29:05.160 | Like does this attack at the core of like somebody's like,
01:29:07.640 | level of intelligence, education, opportunities in life,
01:29:10.320 | like is that worthy?
01:29:11.640 | You don't think so?
01:29:12.720 | - I think that's a, as the kids say, cope.
01:29:15.520 | - You really think so?
01:29:16.360 | - I think that's-- - Because the words
01:29:17.560 | have definitely moved in a way where it's like,
01:29:19.600 | this was okay, now it's not, this is okay, now it's not.
01:29:21.440 | - So you're standing your ground by using,
01:29:23.160 | listen, you could, you could, but I think it's better
01:29:28.160 | to use those words, if you want to defend the ground
01:29:32.280 | those words stand on, to use them rarely and deliberately,
01:29:36.320 | versus how you currently use them,
01:29:38.760 | which is to express an emotion, like you,
01:29:43.760 | I'm going to be honest, you use R word,
01:29:45.920 | not when you're at your best.
01:29:47.640 | (laughing)
01:29:48.480 | - True.
01:29:49.320 | - And so that's not--
01:29:50.440 | - That's generally, that could be true
01:29:51.400 | for a lot of swearing too, but yeah, I know what you mean.
01:29:53.200 | - No, but like, you know that R word is offensive,
01:29:55.960 | you know, and there's part of it is like,
01:29:59.400 | you tell yourself that like, you're still kind of
01:30:03.600 | fighting political correctness by using it a little bit
01:30:06.640 | when you say it?
01:30:07.580 | - No, I don't think so, I think, I'm trying to think
01:30:09.680 | in terms of like, where is the virtue,
01:30:11.400 | where like, there's a whole bunch of arguments
01:30:12.840 | for why some words are okay, some words aren't okay,
01:30:14.720 | or whatever, and I try to like, think more along
01:30:16.560 | those lines rather than, but like, there's going to be
01:30:19.000 | like a lot of phrases where like, if the R word
01:30:21.680 | has come out, the conversation is over,
01:30:23.600 | like I know that, like things, my brain is shut down,
01:30:25.600 | the person I'm talking to is, but there's like,
01:30:27.240 | there's a lot of words also in terms of like,
01:30:29.440 | if you ever hear me say like, fucking moron in a debate,
01:30:32.000 | it's like, it's done, like this conversation is over,
01:30:33.980 | there's no way that anything productive
01:30:35.600 | is happening past that point.
01:30:37.000 | - I think fucking moron is not, I think it's ineffective,
01:30:40.180 | it's not civil, but it's not, it doesn't bother me
01:30:43.560 | in a way, it's basically when you speak in a way
01:30:47.720 | that I know there's a group that's going to be hurt by that,
01:30:51.360 | not only do I think about the hurt that group experiences,
01:30:54.000 | I think of you as a lesser intellectual,
01:30:57.800 | like as a lesser person who's thinking about the world.
01:31:00.560 | What bothers me the most is just what kind of mindset
01:31:05.560 | that inspires in young people, especially when you're
01:31:08.920 | a public figure and a lot of people look up to you.
01:31:11.440 | So I definitely don't think sort of this idea,
01:31:15.120 | the R word is not the battleground of expanding
01:31:19.100 | the Overton window of discourse, okay?
01:31:22.560 | Like I don't think it'll lead to dumb fuck being canceled
01:31:27.560 | two years later, unless that word is hurting
01:31:33.800 | people's experience, which I don't foresee that happening.
01:31:37.880 | I think legitimately, R word and F slur
01:31:42.000 | and calling women bitches, context matters here too,
01:31:47.000 | like of course, but just the way I've heard you use it,
01:31:50.760 | it is not, it's from emotion and it's from frustration
01:31:54.280 | and it ultimately is rooted in disrespect.
01:31:57.280 | I don't, I think it's ineffective.
01:31:59.440 | And of course, like who gets to say, I don't know,
01:32:02.880 | but I'm saying somebody who, like I admire
01:32:06.880 | effective conversations and I admire great humor,
01:32:11.280 | dark humor, wit.
01:32:13.140 | To me, oftentimes the use of the R word
01:32:18.200 | in the way you've used it and the way I see the community
01:32:20.240 | use it is none of those things.
01:32:22.200 | It contributes not at all to the humor and so on.
01:32:24.920 | Now I could see it might contribute to the camaraderie
01:32:28.920 | of that particular group, especially when they normalize
01:32:32.320 | the use of that word.
01:32:34.040 | You kind of take some of the edge off,
01:32:36.240 | but you forget that there's a large number of other people
01:32:40.960 | that don't have the chemistry, that don't hear the music
01:32:45.360 | of the friendship that you have, the relationship you have,
01:32:47.920 | and instead they hear the normalization of a hateful word
01:32:51.320 | and it ultimately has an impact that's hateful.
01:32:53.880 | And then people like me who show up,
01:32:56.000 | I haven't watched much of your stuff.
01:32:58.160 | It turns me off from, a couple of times your content
01:33:01.840 | came before me and I listened to it a little bit,
01:33:05.440 | it turned me off completely.
01:33:06.340 | I didn't understand how good your heart is.
01:33:09.340 | I didn't understand your mission of actually
01:33:14.480 | de-radicalizing people, helping people, and increasing
01:33:19.000 | the level of good faith discourse in the world.
01:33:21.440 | I didn't understand any of that,
01:33:22.720 | 'cause what I was hearing is pretty rough,
01:33:25.000 | the R word type of stuff.
01:33:27.080 | And I just feel like the benefit-cost analysis
01:33:29.440 | is heavy on the cost.
01:33:31.000 | - Gotcha.
01:33:31.840 | - So I just have to sort of call this out.
01:33:34.080 | And I straight up think it's wrong.
01:33:37.840 | - Why do you think it's wrong?
01:33:41.360 | - 'Cause it's hurting people without any benefit
01:33:43.760 | to you whatsoever.
01:33:45.040 | - When you say hurting people,
01:33:46.040 | do you mean the person I'm using it at,
01:33:47.120 | or do you think there's like the--
01:33:47.960 | - No, no, no, no, we're listening.
01:33:48.780 | - We're talking about the affected third group.
01:33:49.620 | - The third group.
01:33:51.240 | - It's good feedback, right?
01:33:52.120 | I always consider everything, especially,
01:33:53.600 | I respect you a lot, you're a really smart guy.
01:33:55.640 | Something that I always kind of like,
01:33:58.360 | fight over in terms of like language,
01:34:01.500 | or like who to attack, or what to attack, or what to do,
01:34:03.760 | is that it's very hard to draw like what boxes
01:34:07.840 | are okay to insult people on versus what aren't.
01:34:11.080 | So for instance, if I call somebody like a Nazi
01:34:13.800 | with a lot of vitriol, I'm okay with every single Nazi
01:34:18.120 | being negatively affected by that,
01:34:19.960 | because that category intrinsically calls upon it
01:34:23.960 | some level of more condemnation for me, right?
01:34:26.000 | Whereas like if I'm out there,
01:34:27.560 | I try not to do like image-related jokes, right?
01:34:29.440 | I don't wanna call you like,
01:34:30.280 | oh, you're a fat fucking loser,
01:34:31.520 | because there's a lot of people that are fat,
01:34:33.000 | that are overweight, where I don't want them to feel bad.
01:34:34.800 | I don't want them, I'm not trying to call you out
01:34:36.360 | or like insult you.
01:34:37.560 | So there's like a lot of, you say cost-benefit,
01:34:40.400 | I'd say a lot of collateral damage from a word like that,
01:34:42.720 | where there's no purpose in doing that.
01:34:45.200 | So certain words are easy to get rid of,
01:34:46.920 | they're off the table, right?
01:34:48.120 | F slur, N word, like these are not words you call people,
01:34:51.160 | because there's so much collateral, it's not worth it.
01:34:53.880 | We've got some words where it's like,
01:34:56.200 | if you have some form of like mental thing,
01:34:58.360 | it is a bad thing, you're not a bad person,
01:35:00.200 | but just using that word could feel like
01:35:01.840 | a collateral damage to those people.
01:35:04.480 | And then there's other categories of words.
01:35:05.880 | So like, if I say that like,
01:35:07.720 | this person is like, it's a stupid fucking Republican,
01:35:10.840 | right, there's probably some Republicans that aren't dumb,
01:35:14.360 | that I don't wanna feel called out by that.
01:35:16.100 | Like, are those types of phrases that you think
01:35:17.960 | should be completely removed as well?
01:35:19.240 | Or I'm kind of curious.
01:35:20.760 | - This completely removed, just so we're clear.
01:35:24.040 | - Yeah.
01:35:24.860 | - I'm not referring to censorship.
01:35:26.600 | - Oh no, I'm not even talking about,
01:35:27.440 | I'm just a person like emotionally like--
01:35:29.320 | - Removed is the wrong word though.
01:35:30.680 | Like I care about, like, I'm not trying to listen
01:35:35.040 | to people on the internet saying
01:35:36.240 | like you shouldn't say that word, that's not good.
01:35:38.280 | I mean, I'm trying to look to your mind and heart.
01:35:43.280 | The reason we're talking today
01:35:45.160 | is you're betraying your gift.
01:35:48.260 | You're better than this.
01:35:49.880 | - You think it's indicative of like
01:35:51.040 | a more flippant thought process,
01:35:52.480 | where it's like the only way you can say that word
01:35:54.080 | is if you're ignoring the hurt and suffering of those people.
01:35:56.680 | And if you're somebody that says--
01:35:57.520 | - Not even those people, you're ignoring
01:36:00.400 | the state of language.
01:36:03.380 | 'Cause I think you're getting to the point,
01:36:04.840 | 'cause it's not about a single word.
01:36:06.400 | It's about like, it's music.
01:36:08.960 | And I just feel like there is--
01:36:10.640 | - Very strong note.
01:36:13.280 | - It's a strong note that ruins the melody.
01:36:15.760 | - Gotcha.
01:36:16.600 | - And I don't think I can say, you know,
01:36:17.960 | you shouldn't use the R word or whatever.
01:36:19.920 | I'm just speaking to, I'm just listening to music
01:36:22.560 | and reviewing the final result.
01:36:24.780 | It's not necessarily, 'cause maybe one use of the R word
01:36:28.840 | strategically or part of an actual,
01:36:32.200 | like when you've built up a camaraderie
01:36:35.200 | that's sandwiched in like some love,
01:36:38.960 | but then you try to reveal their,
01:36:40.800 | 'cause you're talking about a lot of,
01:36:42.040 | there's a bunch of drama, you have friends
01:36:44.900 | with whom you're warring and stuff,
01:36:46.680 | and they're all a little bit beautifully insane.
01:36:50.800 | And you've said that you are becoming more and more insane.
01:36:53.320 | It's beautiful to watch,
01:36:54.200 | it's the human condition laid before us.
01:36:56.200 | Wonderful, and some of that is swearing and so on.
01:36:58.640 | So it's a tricky thing.
01:37:00.320 | But the whole skill of discourse,
01:37:04.840 | just like it is with dark humor, is walking that line.
01:37:07.200 | I just feel like it's overuse of the R word.
01:37:11.680 | And I don't wanna die in that ground,
01:37:13.200 | 'cause I don't think it's that representative.
01:37:15.360 | Like there's certain things like that,
01:37:17.920 | it feels like it ruins the music.
01:37:20.080 | - Gotcha.
01:37:20.920 | - And I don't, you know, it's the same,
01:37:21.740 | like a dumb Republican or a dumb Democrat, I don't,
01:37:24.880 | yeah, that ruins it too a little bit.
01:37:26.680 | Depends on how you use it.
01:37:28.760 | You can be lazy with that.
01:37:30.240 | - Yeah.
01:37:31.080 | - You know, like even overuse of the word,
01:37:32.440 | I think bots is what's used for people
01:37:34.360 | who don't think or something.
01:37:35.600 | I don't actually know the definition.
01:37:37.360 | I'm offended on behalf of robots.
01:37:39.540 | - That might be a compliment soon.
01:37:42.600 | (laughs)
01:37:43.440 | - Right, exactly.
01:37:44.280 | But I guess bot means you don't think.
01:37:46.280 | - Yeah, you're like an NPC, you just copy everybody else.
01:37:48.640 | - Again, I'm offended on behalf of NPCs,
01:37:50.520 | I count myself as one.
01:37:52.000 | But there's a sense if you say bots too much
01:37:55.880 | that you're just dismissing people.
01:37:58.120 | Like everything I say is right,
01:38:01.360 | and anyone that disagrees with me is a bot.
01:38:03.640 | That's lazy too.
01:38:05.720 | Sometimes it's funny, sometimes it's effective.
01:38:08.840 | Basically saying a lot of people in the mainstream media
01:38:11.240 | or something like that are bots.
01:38:12.180 | Okay, that's, a little bit of that is effective.
01:38:15.280 | But too much, it becomes ineffective.
01:38:18.240 | And I'm trying to speak to that.
01:38:19.800 | - Yeah, I understand.
01:38:20.640 | - And I'm just, the reason we're highlighting
01:38:22.160 | clear examples, like the N word.
01:38:26.440 | Joe Rogan had to contend with that.
01:38:28.240 | - Oh yeah, the, yeah.
01:38:29.800 | - I think it's ineffective.
01:38:30.960 | It makes you less effective at discourse.
01:38:33.000 | But like you've talked about many times,
01:38:35.760 | language is a tricky one.
01:38:37.640 | - It's always hard 'cause you talk about
01:38:38.840 | constructing a melody.
01:38:40.040 | There's not one melody that sounds good to everyone.
01:38:42.840 | But there are probably certain notes that like,
01:38:44.680 | if you got rid of them,
01:38:46.320 | everybody's still gonna like it about as much,
01:38:48.080 | and you don't really lose anything.
01:38:49.000 | There's a whole other part of an audience
01:38:50.280 | that might be more willing to listen, of course.
01:38:52.520 | - And it's not about losing the magic of that melody.
01:38:55.160 | You don't wanna be vanilla.
01:38:56.620 | I just feel like there's stuff
01:38:58.880 | that doesn't need to be there.
01:39:00.320 | - Yeah, for sure.
01:39:01.160 | - It's fat.
01:39:02.200 | But then again, you're,
01:39:03.440 | the other thing that people should understand
01:39:06.080 | that might be listening to this,
01:39:07.440 | you're streaming many hours a day for many years.
01:39:11.480 | I don't know, it's a--
01:39:12.320 | - 11 or 12, I think, yeah.
01:39:13.520 | Started in 2010.
01:39:14.360 | - And so one of the things that people can do
01:39:16.200 | is just clip out anything.
01:39:18.240 | You're going through the full human experience of emotion.
01:39:21.880 | Anger, fear, frustration, all of it.
01:39:25.640 | So of course, there's going to be moments
01:39:27.600 | when you're not the best version of yourself.
01:39:29.840 | Anything else to say about the language?
01:39:33.160 | - It's complicated.
01:39:34.520 | I'm still always trying to figure it out.
01:39:36.200 | There are opinions that I have
01:39:37.360 | that have changed throughout the years.
01:39:38.920 | It's possible that the R word has always been
01:39:40.880 | the next one on the chopping block
01:39:42.200 | that we're all kinda looking at,
01:39:43.080 | but people are always worried about that treadmill.
01:39:45.120 | But it's possible in a year or two,
01:39:46.080 | I'll have a different view on it,
01:39:47.120 | or I'll have changed away some of the words I use.
01:39:50.360 | Yeah, it's definitely like a,
01:39:51.400 | it's always like a work in progress.
01:39:52.520 | There's always like different communities
01:39:53.680 | that feel different ways about different words, yeah.
01:39:56.120 | - Yeah, but do you acknowledge that there's people out there
01:39:59.040 | that are never gonna talk to you?
01:40:00.680 | They're never gonna think of you as a good man
01:40:05.800 | because you used the N word with a hard R publicly.
01:40:09.200 | - In the past?
01:40:10.200 | I mean, yeah, those people exist,
01:40:11.560 | but I mean, there are some people that are beyond my reach,
01:40:16.160 | which I'm okay with.
01:40:17.260 | Like, there's gonna be some people
01:40:18.160 | because of things that have been involved,
01:40:19.080 | or even ideas that I have now
01:40:20.640 | that might make them beyond my reach.
01:40:23.140 | Something you said earlier was very true.
01:40:25.520 | I think the goal is to identify
01:40:27.160 | what are the elements that you can cut out
01:40:29.120 | that aren't integral to your message,
01:40:30.900 | but could be alienating to more people,
01:40:34.200 | and those are probably the things that you identify.
01:40:35.740 | But I think that you can get lost in yourself,
01:40:38.160 | or lost in the internet,
01:40:39.400 | or lost in the outside of yourself
01:40:42.360 | if you're trying to appeal to every single person.
01:40:44.200 | It's just never gonna be the case.
01:40:45.840 | And for, I actually,
01:40:47.480 | I like that I've had the journey
01:40:49.200 | that I've had on the internet,
01:40:50.200 | that you can find me saying and defending
01:40:52.080 | a lot of insane stuff 10 years ago,
01:40:53.960 | because I think it shows a level of progress.
01:40:55.760 | And I think I do get a lot of respect
01:40:57.240 | and buy-in to certain communities,
01:40:58.800 | where it's like, I'm not just some random dude
01:41:00.800 | telling you that, oh, you shouldn't say
01:41:02.700 | the F word or the N word.
01:41:03.920 | I'm a guy that's been there, that's done it,
01:41:05.480 | that's defended it, and you can see my whole past,
01:41:07.240 | my whole history is laid bare for you
01:41:08.840 | to watch every, thousands of hours of it.
01:41:12.100 | But I can show that there's growth,
01:41:13.360 | and evolution, and change that can happen in a person.
01:41:15.480 | - Yeah, and you're honest about that growth.
01:41:18.340 | It's a tricky thing, 'cause people just call,
01:41:21.080 | bring up stuff from your past.
01:41:23.160 | - For sure.
01:41:24.060 | - I hope we figure out, as a civilization,
01:41:27.000 | a mechanism to clearly say, this was me two years ago,
01:41:31.640 | this was me five years ago, I'm a different person.
01:41:34.780 | Because Twitter doesn't care about that.
01:41:38.640 | These social mechanisms that bring stuff up
01:41:40.680 | doesn't care about that.
01:41:42.200 | It's like, one stupid thing you say,
01:41:44.680 | it becomes like a scarlet letter.
01:41:46.760 | And I don't know how to fight that.
01:41:47.960 | It's tricky to fight that.
01:41:49.240 | - Have you ever seen Men in Black?
01:41:50.960 | - Yes.
01:42:51.800 | - When K is on the bench, and he says,
01:42:53.240 | a person is smart, but people are stupid,
01:42:55.520 | dumb, panicky animals, or whatever,
01:41:57.480 | there's something that changes for human dynamics
01:41:59.880 | when there is a group of people
01:42:01.880 | that make it so hard to control.
01:42:03.160 | Like, I think one-on-one, anybody can sit across
01:42:05.720 | from somebody and admit to some horrible stuff.
01:42:07.920 | I used to be, I abused my husband when I was 20,
01:42:11.960 | and now I'm 35, and I see it's wrong, or I did this,
01:42:14.000 | I was addicted to whatever, and I made these mistakes.
01:42:16.200 | One-on-one, it's always easy.
01:42:17.680 | But in group environments, that in-group, out-group,
01:42:20.880 | tribalistic thing of identifying one thing
01:42:22.960 | and then coming to destroy a person's life
01:42:24.560 | is such a huge impulse we have.
01:42:27.900 | And I think, probably when we were hunter-gatherers
01:42:30.400 | in the forest, probably good,
01:42:31.920 | 'cause you really wanna push weird people out
01:42:34.080 | or anything like that.
01:42:34.920 | But now on the internet, when we can hunt
01:42:36.760 | for any dissenting opinion, and just,
01:42:38.880 | with ruthless precision, destroy somebody's life over it,
01:42:41.560 | it's a pretty scary dynamic.
01:42:43.400 | - I think one of the mechanisms that could fix it
01:42:45.980 | is make it super easy for each individual person
01:42:49.280 | to analyze all the stupid shit they themselves have said
01:42:51.920 | in the past, like a full recording.
01:42:54.300 | Because I think people are just honestly,
01:42:58.440 | paint a very rosy picture to their own brain
01:43:00.720 | of who they have been in the past.
01:43:03.440 | - Yeah, of course.
01:43:04.280 | - If we can have empathy for the fact
01:43:05.920 | that we've said stupid shit, or we're drunk,
01:43:08.840 | that we've said stupid shit, or when we're drunk,
01:43:10.480 | the offensive things you might have said,
01:43:12.360 | the offensive things you might have done,
01:43:13.680 | I just feel like that would give us the ammunition
01:43:18.680 | to have empathy for others that are like,
01:43:21.320 | okay, yeah, this guy five years ago said this.
01:43:23.880 | Maybe that doesn't represent who they are
01:43:27.640 | any more than stuff I said five years ago
01:43:30.000 | represents who I am today.
01:43:32.420 | I feel like technology can actually enable that.
01:43:34.800 | - Maybe, although you're talking about more recording
01:43:37.400 | and more stuff, which people are already wary of.
01:43:39.440 | - It's a double-edged sword.
01:43:42.100 | I think there is going to be more and more recording.
01:43:44.480 | We have to figure out how to do that
01:43:45.680 | in the way that respects people's privacy
01:43:47.840 | and gives them ownership of their data and so on.
01:43:49.880 | I've looked at the search history I've done on Google,
01:43:52.840 | which for most people is available,
01:43:54.560 | like your Google search history,
01:43:56.360 | and it's fascinating to watch the evolution
01:43:57.840 | of a human being.
01:43:58.800 | It doesn't seem like the same person.
01:44:01.440 | It's like a different person.
01:44:02.440 | - For sure.
01:44:03.440 | - It's weird.
01:44:04.280 | - It's also hard too with the internet today.
01:44:05.880 | I'm gonna be ageist again,
01:44:07.280 | but now all of the people are thrown together,
01:44:09.920 | whereas I don't want a 27-year-old judging
01:44:13.340 | the image of a 15 or 16-year-old.
01:44:15.540 | Obviously, he's in high school.
01:44:16.980 | There was that story that came out of the,
01:44:18.500 | there was a kid that saved a recording of,
01:44:20.700 | I think it was some white girl.
01:44:22.140 | I think that she got her driver's license
01:44:23.820 | and she was like, "I can drive now, N-words,"
01:44:25.900 | with the A or whatever.
01:44:26.820 | It was dumb, she shouldn't have said it,
01:44:27.820 | but I think she was 15 or 16
01:44:29.220 | when she TikTok'd this or whatever.
01:44:31.020 | And he held onto that recording until she applied
01:44:33.020 | and got accepted to college three years later,
01:44:35.120 | and then he released it to get her kicked out of college.
01:44:37.180 | And I'm like, "Damn."
01:44:39.220 | If everything that I had ever said
01:44:40.840 | as a 15, 16-year-old was immortalized on the internet,
01:44:44.520 | my life wouldn't have even begun. (laughs)
01:44:47.140 | 'Cause those are insanely high standards to hold people to.
01:44:50.400 | Not that, obviously, you shouldn't be saying those,
01:44:53.480 | you shouldn't be saying certain words or whatever,
01:44:54.800 | but you have to be able to make mistakes in adolescence.
01:44:57.840 | Everybody does, we all did.
01:44:59.200 | Everybody did it growing up.
01:45:00.600 | - Why do you think there is so much misogyny
01:45:03.920 | in the streaming community?
01:45:06.120 | And how can you fight it?
01:45:07.680 | 'Cause you've shown a lot of interest in fighting it,
01:45:11.840 | trying to decrease or eliminate misogyny
01:45:14.560 | from your community.
01:45:15.960 | - I think it's really difficult.
01:45:17.640 | I think that eliminating racism
01:45:19.000 | is easier than eliminating misogyny.
01:45:21.520 | - On the internet, you mean?
01:45:22.720 | - On anywhere.
01:45:24.160 | 'Cause I think fundamentally,
01:45:25.400 | I don't think there's that much difference
01:45:26.320 | between white people and black people
01:45:27.960 | and brown people and Asian people or whatever.
01:45:30.280 | We have different cultures and stuff,
01:45:31.320 | but at the end of the day, we're all people.
01:45:33.400 | But I think there are differences between men and women,
01:45:36.360 | like throughout all of history and time,
01:45:38.920 | and then even today in every culture.
01:45:40.800 | And when real differences do exist,
01:45:42.960 | it's harder to account for them in a way
01:45:44.600 | where we can have conversations with each other
01:45:46.560 | without it becoming very gendered in a negative way, right?
01:45:49.680 | Negative way of gendering something
01:45:51.000 | would be like a misogynistic way of doing it.
01:45:52.920 | - Of course, it's unclear to me that it's so difficult
01:45:57.440 | to avoid the negative gendering versus the positive,
01:46:02.360 | 'cause there's a lot of positive to the tension,
01:46:04.640 | the dance between the different genders and so on.
01:46:06.640 | Maybe in this particular moment in history, it's not.
01:46:09.840 | But it's not trivial to me that racism
01:46:12.000 | is easier to eliminate.
01:46:13.080 | It's an interesting hypothesis,
01:46:14.960 | just because there's more biological difference
01:46:17.400 | between men and women.
01:46:18.680 | That means it's harder to eliminate, but.
01:46:20.760 | - I don't know if this is true.
01:46:21.600 | I hear this a lot.
01:46:22.420 | I feel like I read this somewhere,
01:46:23.260 | but I need to get a better source
01:46:24.080 | before I repeat it everywhere.
01:46:24.920 | But I've heard that in the US military, for instance,
01:46:27.160 | they've gotten exceedingly well,
01:46:29.220 | they do an exceedingly good job
01:46:30.640 | at getting different people of different races to integrate.
01:46:33.640 | And it's not a huge problem
01:46:35.200 | once you're through basic training,
01:46:36.800 | all the training, everything.
01:46:38.200 | But for different sexes,
01:46:39.440 | it still represents a significant problem
01:46:41.120 | that the military hasn't figured out.
01:46:42.640 | And I actually looked at like,
01:46:43.480 | well, what's the military doing?
01:46:44.400 | 'Cause if something was solvable,
01:46:45.800 | like can we sleep for four hours a night and be healthy?
01:46:47.800 | If we could, I bet the military would know.
01:46:49.440 | So I kind of look sometimes to them
01:46:50.440 | to see their integration.
01:46:51.280 | But it might be that there are other issues there
01:46:52.980 | that make it kind of--
01:46:53.820 | - Yeah, it feels like the military
01:46:54.880 | is a very particular kind of--
01:46:55.720 | - For sure, yeah, it could be.
01:46:56.640 | - The actual task at hand might bias
01:47:01.600 | the difficulty of the process.
01:47:03.280 | - Potentially, yeah.
01:47:04.440 | There's been a lot of interesting talk
01:47:06.840 | about like women integrating into male groups.
01:47:10.800 | And how do you do this in a way
01:47:12.720 | where everybody is happy with the outcome
01:47:15.240 | and there's not like issues.
01:47:18.080 | I think Jordan Peterson spoke about this a little bit.
01:47:22.320 | And then workplace culture speaks about this a bit.
01:47:22.320 | Would you happen to remember,
01:47:24.280 | I wanna say it was like five or 10 years ago,
01:47:26.240 | there was a big tech conference
01:47:27.920 | and there were two guys behind a woman
01:47:29.480 | and they made a joke about like a USB dongle,
01:47:32.200 | like dongle was a dick.
01:47:33.600 | And this woman turned around,
01:47:34.640 | she tweeted pictures of them,
01:47:35.960 | spoke about like misogyny.
01:47:37.240 | And then that blew up into a huge ordeal that like, yeah.
01:47:41.560 | There was this interesting phenomenon
01:47:43.700 | that in a less misogynistic
01:47:47.520 | and more inclusive workplace environment,
01:47:50.080 | some women might end up feeling worse
01:47:53.360 | because in a more misogynistic environment,
01:47:54.920 | you're thinking like, okay, that's a woman,
01:47:56.660 | she doesn't get our humor.
01:47:57.640 | I'm gonna treat her in a very indifferent,
01:48:00.360 | very dispassionate, cold way and whatever.
01:48:02.320 | And then I'm gonna have my boys over here.
01:48:04.160 | And then you've got like these environments
01:48:05.560 | where they're a little bit more warmer
01:48:06.560 | and it's like, oh, cool,
01:48:07.520 | we're gonna bring this woman into our environment
01:48:09.000 | and we're gonna make all the same types
01:48:10.200 | of like crass jokes we did before.
01:48:11.960 | And it's actually worse now.
01:48:13.080 | Now the woman feels even more othered
01:48:14.520 | 'cause like, oh my God, why do you talk like this?
01:48:16.600 | I think that internet communities,
01:48:18.000 | especially online ones that do like political debate
01:48:20.480 | and video games are very much like big boys clubs.
01:48:23.600 | So it's not enough to just say,
01:48:25.360 | you can't be misogynistic to get rid of misogyny.
01:48:27.860 | There's always gonna be an othering effect on women.
01:48:30.040 | There's a lot of like behaviors that are unintuitive
01:48:32.560 | that you have to account for
01:48:33.560 | and you've gotta try to like push that back.
01:48:35.520 | And that's just a very, very, very challenging thing to do.
01:48:38.760 | So I like to deal with concrete examples more.
01:48:40.880 | So here's a concrete example.
01:48:42.240 | And this is like a recent initiative in my community
01:48:44.080 | 'cause I'm trying to like be,
01:48:45.200 | 'cause misogyny hasn't been fixed anywhere on the internet.
01:48:47.000 | I'm curious whether there are ways
01:48:47.840 | that I can push my community to do this.
01:48:50.360 | I don't think you should almost ever make a comment
01:48:52.040 | on a woman's appearance ever,
01:48:53.440 | if they're appearing in like some political
01:48:55.040 | or professional manner.
01:48:56.240 | Even if it's a positive comment,
01:48:57.680 | I think it's equally bad to a negative comment.
01:48:59.600 | It's just never good to do.
01:49:01.160 | And that's kind of an unintuitive thing
01:49:02.440 | 'cause it's like, well, a woman appears,
01:49:04.000 | wow, she's really cute.
01:49:05.320 | It seems like a nice comment.
01:49:06.240 | You're being nice, you know, she looks cute or whatever,
01:49:07.720 | but it's like, it's not at all the point of why she's there.
01:49:09.680 | And just by saying that,
01:49:10.900 | you're kind of like otherizing her as like a person
01:49:12.760 | to like think she looks good
01:49:14.480 | rather than listening to anything she has to say, you know?
01:49:17.120 | - Well, there's a lot of stuff that you're saying
01:49:20.000 | that is a part of misogyny that's almost like obvious.
01:49:26.080 | Like any woman will tell you that.
01:49:27.680 | - Woman will, yeah, but they're not in these spaces
01:49:29.400 | and a lot of the guys don't know.
01:49:30.240 | - But I think what that requires is just empathy.
01:49:34.760 | You need to consider the female experience.
01:49:40.320 | That's it.
01:49:41.160 | Like you have to either read about or talk with women.
01:49:44.120 | You learn, like the low-hanging fruit is very easy to learn.
01:49:47.440 | It feels like just the level of social skill
01:49:49.640 | oftentimes in internet communities is quite low.
01:49:54.240 | - I disagree.
01:49:55.080 | I don't like to say, here's the problem with empathy
01:49:57.600 | is it's very hard to have empathy for experiences
01:50:00.160 | that are so outside of your own.
01:50:02.440 | Well, maybe some people,
01:50:03.500 | there might be some people that can do it, I can't.
01:50:05.260 | There's a lot of stuff that I had to learn.
01:50:07.080 | - Women are half the population.
01:50:08.800 | - But they're women.
01:50:09.800 | They're totally different.
01:50:12.000 | - They're totally different.
01:50:14.520 | We'll talk about it right though.
01:50:16.800 | They're not totally different.
01:50:18.360 | - So here's an example, okay?
01:50:19.640 | So especially for me,
01:50:22.360 | my archetype makes up a lot of the internet, white man.
01:50:27.040 | There's never been a point--
01:50:28.280 | - The name of a beautiful woman.
01:50:29.880 | - Who might be a dancer.
01:50:30.920 | - What's the backstory, from New Orleans or from--
01:50:33.240 | - I haven't thought that through yet.
01:50:35.160 | It's ambiguous, okay?
01:50:36.400 | - Like an open world--
01:50:37.600 | - Open world, I want you to project wherever you want
01:50:39.920 | Destiny the dancer to be from, that's in your mind, okay?
01:50:42.320 | - All right, I'll save that for later tonight.
01:50:44.160 | - Yeah, okay.
01:50:45.000 | As a white guy, I don't know if there's ever been a spot
01:50:49.540 | that I've been in where I've been made to feel like
01:50:51.840 | I don't belong there just by virtue of who I am.
01:50:54.400 | I actually don't, it's impossible for me to empathize with that
01:50:57.440 | 'cause I don't even have that experience.
01:50:59.840 | If you go back eight, nine years,
01:51:01.880 | one of the big issues that came up
01:51:03.320 | was harassment in gaming against women.
01:51:05.600 | And I was one of the big debaters against that,
01:51:08.220 | saying that like, sure, women might get harassment,
01:51:10.980 | but everybody gets harassment.
01:51:12.560 | If you're a woman and you're in gaming and you get harassed,
01:51:15.200 | congratulations, you're being treated like a man.
01:51:16.920 | What you're actually asking for
01:51:18.280 | is for us to actually treat you differently.
01:51:19.640 | You don't wanna be insulted.
01:51:21.040 | You don't wanna be treated like a man.
01:51:22.240 | And that's actually misogyny
01:51:23.520 | is women making that argument.
01:51:24.720 | - Do you still stand by that?
01:51:26.280 | - Is that a problem if I do?
01:51:27.980 | No, I'm just kidding.
01:51:29.280 | Okay, hold on.
01:51:30.640 | So a little while after--
01:51:31.480 | - I disagree with it.
01:51:32.600 | - Sure, okay, that's good, you should.
01:51:34.320 | A little while later, I had a friend, Jessica,
01:51:37.240 | super cool girl.
01:51:38.280 | We go to play games.
01:51:39.360 | She was between jobs and she's like,
01:51:40.600 | "I've got like two months and we're gonna grind CSGO."
01:51:42.800 | And I'm like, "Okay, this is awesome, let's do it."
01:51:45.040 | CSGO, Counter-Strike, Global Offensive,
01:51:47.200 | shooter game, FPS, microphones.
01:51:49.740 | First day we start playing, okay?
01:51:50.920 | Hop into our first game.
01:51:52.280 | Obviously she talks, everybody's making,
01:51:53.960 | "Is that a 12 year old boy?
01:51:55.000 | "Why aren't you making sandwiches?"
01:51:56.160 | Blah, blah, blah, yeah, okay, whatever.
01:51:57.520 | Play our first game, play our second game, same jokes.
01:52:00.040 | Third game, fourth game.
01:52:01.240 | By like the fourth or fifth game,
01:52:03.240 | I was actually starting to feel triggered.
01:52:04.720 | Like every time the game started, I was like,
01:52:06.100 | "Can you just like talk so we can get over
01:52:07.720 | "like the stupid fucking jokes, it's so fucking stupid."
01:52:10.560 | And you hear the same fucking joke every single time.
01:52:13.680 | And it took one day of that experience
01:52:16.120 | for me to realize it's not about being insulted.
01:52:19.020 | It's like this othering feeling that you don't belong.
01:52:21.400 | And I've never felt that because I'm a white guy.
01:52:23.760 | Not to be like virtue signaling,
01:52:24.960 | but like there's no places where it's like,
01:52:26.400 | "You're white, you don't belong here.
01:52:27.400 | "You're a guy, you don't belong here."
01:52:29.200 | Like I've never felt that non-inclusion.
01:52:31.040 | And playing with her, there's a different feeling
01:52:34.400 | when it's the same types of jokes
01:52:36.400 | coming from a group of people
01:52:37.480 | to make you feel like you don't belong there.
01:52:39.680 | Where I was like, "Damn, this actually feels really bad."
01:52:42.520 | And it feels bad in a different way
01:52:43.880 | where it's like if you call me like an F slur
01:52:46.480 | or any other type of swear word or insult,
01:52:49.040 | like yeah, you can call me that,
01:52:50.100 | but at the end of the day, we're all kind of the same.
01:52:51.920 | We're all white dudes and we call each other names.
01:52:53.680 | But like, this is a woman and this is not her place
01:52:56.200 | and she doesn't belong here.
01:52:58.160 | Kind of the analogy that I would make,
01:53:00.000 | 'cause after getting these experiences,
01:53:01.560 | I would learn this afterwards.
01:53:03.280 | If I tell you that there's another guy in a room
01:53:05.820 | and you need to think of the worst insults ever
01:53:07.720 | for that person without ever knowing anything about them
01:53:09.760 | or meeting them, if I tell you
01:53:11.280 | that it's like a white straight guy
01:53:12.760 | and you have to write insults, you're fucked.
01:53:15.160 | Maybe you can do like school shooter,
01:53:16.760 | but there's not really much you can say
01:53:18.360 | at the end of the day.
01:53:19.400 | But if I tell you it's a woman,
01:53:21.060 | we could, there are so many different jokes you can write.
01:53:23.040 | If it's a black person,
01:53:24.100 | so many different racist things we can say.
01:53:25.900 | - Are you sure?
01:53:26.740 | I can come up with a lot of stuff for a white guy.
01:53:28.860 | - In terms of stuff that is just intrinsic
01:53:30.420 | to him being a white guy?
01:53:31.780 | - Yeah, like there's--
01:53:33.420 | - Really?
01:53:34.260 | - Wait a minute, what are you talking about?
01:53:36.080 | There's a lot, the internet has sharpened that sword.
01:53:39.520 | - In terms of like jokes that are targeted at his sex--
01:53:42.620 | - Incel, virgin, weak.
01:53:46.620 | - Some of the incel, virgin, maybe.
01:53:48.340 | - Yeah, that's getting there, sure.
01:53:49.580 | That's for sure.
01:53:50.420 | That's recent though, sorry, I'm older on the internet.
01:53:52.980 | We didn't have those words way back then.
01:53:54.420 | - That wasn't--
01:53:55.260 | - When I was making these analogies,
01:53:56.140 | that incel and virgin--
01:53:56.980 | - Back in my day, we didn't have general--
01:53:58.460 | - There were no incels back then.
01:53:59.940 | None of us had sex, we just accepted it.
01:54:01.280 | We were all computer gamers.
01:54:02.420 | Nobody that played video games had sex back then, okay?
01:54:04.500 | People don't remember that.
01:54:05.340 | There wasn't the Big Bang Theory.
01:54:06.460 | You were just a loser that was stuck in--
01:54:07.820 | - You guys didn't even know sex existed
01:54:09.340 | so you could use it as an insult.
01:54:10.180 | - Exactly, yeah.
01:54:11.000 | We had to download sexual pictures and it took two minutes
01:54:12.780 | and you didn't even know if you were gonna get
01:54:13.780 | the right thing by the time it finished loading.
01:54:15.420 | But what I'm saying is that, okay,
01:54:17.180 | I think you agree that if somebody gives you a race,
01:54:19.780 | like a black person who's a woman,
01:54:21.420 | we can write very cutting, scathing insults
01:54:24.340 | for that person that are very otherizing.
01:54:26.220 | - Or words that would really hurt if they're spoken to.
01:54:28.180 | - Yeah, that are very cutting to the person.
01:54:30.100 | But for a white guy, it's kinda hard
01:54:33.120 | because that's the default.
01:54:34.500 | There's not as much otherizing of those people.
01:54:36.460 | Yeah, that's kind of the point.
01:54:37.300 | - So the insults you have from white guy to white guy,
01:54:40.060 | the insults are much harsher.
01:54:41.780 | So when you start to apply the same kind of harshness
01:54:44.060 | to other groups--
01:54:45.300 | - You can make them feel like they really don't belong.
01:54:47.580 | And that otherizing effect is something
01:54:49.460 | that's very hard for me.
01:54:50.420 | I can't really empathize with it 'cause I've never felt it.
01:54:52.380 | So I have to intellectualize it and then sympathize with it.
01:54:54.660 | It's like a whole process I have to go through.
01:54:56.260 | And then I try to walk other people through that
01:54:58.100 | 'cause if you're a white guy on the internet,
01:54:59.640 | which is a lot of the internet,
01:55:00.560 | you really don't know what that feels like.
01:55:01.700 | You've never felt like that before.
01:55:02.980 | - Well, so you're now in a leadership position,
01:55:05.620 | Grandpa Destiny, so that's, a lot of people look up to you
01:55:10.320 | for that, for that sort of pathway to empathy.
01:55:14.260 | - Yeah.
01:55:15.460 | - How not to otherize.
01:55:17.420 | I mean, you have felt otherizing.
01:55:18.940 | You mentioned high school people not being--
01:55:22.460 | - Yeah, but those were always for things that,
01:55:24.380 | it's different to insult somebody
01:55:25.820 | for a non-immutable characteristic.
01:55:27.700 | Like, okay, you think poorly about me
01:55:29.160 | because I'm not enough money or I don't have money,
01:55:31.220 | but I could get more money and I could change that.
01:55:33.020 | But it's different for somebody to really attack you
01:55:34.740 | for your gender or attack you for your race.
01:55:37.940 | - A lot of the attacks that hit the hardest
01:55:39.700 | is not about gender.
01:55:40.900 | I do think that they're,
01:55:44.880 | the way women are attacked on the internet,
01:55:47.920 | it's the same kind of attacks you would do
01:55:50.000 | towards other guys, but you go harsher.
01:55:52.560 | - I feel like they're fundamentally different.
01:55:54.520 | I feel like when we're attacking guys,
01:55:56.420 | I'm not usually attacking you
01:55:58.520 | on the virtue of you being a guy,
01:56:02.720 | but if it's a woman and she's typing something,
01:56:04.600 | like, oh, did your boyfriend type that for you?
01:56:06.400 | Or like, what are you even doing here?
01:56:07.880 | Shouldn't you be trying to find a husband?
01:56:09.700 | Or like, oh, you're like a stupid,
01:56:11.220 | go start an OnlyFans or whatever.
01:56:12.060 | - No, but the stupidity,
01:56:13.180 | the intelligence aspect is what's attacked.
01:56:15.140 | - Yeah, but it's so much different.
01:56:16.260 | Like, you can call a guy stupid,
01:56:17.740 | but that's because he's a guy that's being stupid.
01:56:19.520 | But when you call a woman stupid,
01:56:20.620 | she's stupid because she's a woman.
01:56:21.980 | - Yeah, but I honestly think that women
01:56:24.220 | are called stupid more than men on the internet.
01:56:27.820 | It has nothing to do, like, the attack is not gendered.
01:56:32.300 | It's that the gender inspires an increased level of attack.
01:56:37.300 | I feel like it is gendered.
01:56:38.820 | - I wish we had data on this.
01:56:40.740 | - Have you ever heard of the XKCD comics?
01:56:42.680 | - Yes.
01:56:43.520 | - It's a really good comic where,
01:56:44.560 | and this is something that I've dealt
01:56:45.860 | with a lot in my community, okay?
01:56:47.700 | There's a guy at a board and he fucks up a math equation,
01:56:51.060 | and it's like, wow, you suck at math.
01:56:52.940 | And then the next panelist, there's a girl that does it,
01:56:55.100 | and she fucks it up, and it's like, wow,
01:56:56.460 | women suck at math.
01:56:57.940 | And there's like that feeling that happens
01:56:59.540 | where when I bring on, I won't use names,
01:57:02.680 | but there are like YouTube people that I've brought on
01:57:04.160 | that have crazy opinions.
01:57:05.740 | And when they're men, that person is crazy.
01:57:08.180 | Oh my God, he said the crazy stuff.
01:57:09.840 | He's so dumb, he's so crazy, he's so stupid.
01:57:11.940 | But when it's a woman, it's like, oh my God,
01:57:14.100 | why do you always bring dumb women here?
01:57:15.400 | Why do so many women on the internet have crazy opinions?
01:57:17.540 | There's a different minority character
01:57:19.220 | that has to stand in and represent their whole group
01:57:23.300 | where white men don't typically have to.
01:57:25.420 | - Speaking of groups versus individuals, yes.
01:57:27.820 | But then what I feel happens is then another person
01:57:31.820 | from that group comes, another woman comes,
01:57:34.200 | and people before she says anything
01:57:37.120 | will already feel like they're ready with that attack.
01:57:41.100 | - For sure.
01:57:41.940 | But they're ready for the attack 'cause she's a woman.
01:57:43.740 | They're gonna call her, she's stupid because she's a woman,
01:57:45.420 | not because she says something,
01:57:46.260 | but just 'cause she's a woman.
01:57:48.420 | - So like the group in their brain
01:57:52.420 | accumulates all the negative characteristics
01:57:54.820 | of the individuals they've met.
01:57:56.420 | Not the positive, the negative.
01:57:57.820 | And it becomes like this ball of stickiness.
01:58:00.100 | And then that becomes the bias for their judgment
01:58:03.140 | of a new person that comes.
01:58:04.980 | With white men, there's more of a blank slate
01:58:07.420 | in terms of bias of how they analyze the person.
01:58:09.980 | With any of the minority group,
01:58:12.300 | they're basically make a judgment
01:58:14.780 | based on the negative characteristics
01:58:16.540 | of the individuals they've met in the past.
01:58:18.420 | That leads to a system where you're just harsher
01:58:22.100 | towards minority groups and towards women.
01:58:24.700 | How do you solve that?
01:58:26.060 | - The most important thing for any problem ever
01:58:27.840 | is step one is to be aware of it.
01:58:29.420 | If you're not aware of it,
01:58:30.640 | then you're hopelessly lost at sea.
01:58:33.220 | But yeah, the first thing I like to say,
01:58:34.460 | is like, be aware of it.
01:58:35.300 | Like I've had, there's a girl that I've had on recently
01:58:37.380 | and she says a lot of, in my opinion, kind of crazy things,
01:58:40.080 | but people will use her as like,
01:58:41.400 | this is why women shouldn't be here.
01:58:42.660 | This is like, she's crazy and she's a woman and blah, blah,
01:58:44.640 | blah.
01:58:45.480 | But I can bring on a guy who says similarly dumb things
01:58:47.120 | and he's evaluated on his own merits 'cause it's a guy.
01:58:50.760 | There's never, ever, ever been a case
01:58:52.380 | where I brought a stupid guy on stream
01:58:53.960 | and everybody's been like, this guy makes me hate men.
01:58:56.160 | This guy makes me hate white people.
01:58:57.400 | That has never happened.
01:58:58.640 | But then there's like other women that come on
01:59:00.240 | and it's like, now I know why incels exist
01:59:02.080 | or I totally understand where red pill ideology comes from.
01:59:04.680 | And even if the statements are kind of true,
01:59:06.760 | when you're making these observations over and over
01:59:08.440 | and over and over again,
01:59:09.280 | it damages your ability to individually perceive somebody.
01:59:12.800 | And then two people that make the same statements,
01:59:14.580 | one can be perceived more harshly
01:59:16.240 | just because of that like group bias you've got built up.
01:59:18.360 | - I think there's something about streaming
01:59:19.760 | that just brings that out of people.
01:59:21.120 | Like, 'cause you have to talk for like seven hours.
01:59:23.480 | So you're like, all right,
01:59:24.520 | well, whatever psychological issues and complexities I have,
01:59:28.040 | I'm going to explore them.
01:59:28.880 | - They're gonna be magnified. - Magnified.
01:59:31.240 | And then it's the, as you talked about,
01:59:33.400 | the mimetic theories or Girardian,
01:59:37.320 | like whatever the things that are very similar
01:59:40.360 | and you're going to magnify the conflicts that you have
01:59:43.200 | and you're going to explore all the different perspectives
01:59:46.080 | on those different conflicts.
01:59:47.440 | And I mean, I don't know if it's just anecdotal,
01:59:50.120 | but it's nice to have women on stream.
01:59:52.760 | I think the dynamic that you guys have is wonderful.
01:59:57.520 | It's really interesting.
01:59:58.920 | So it's just the female voice in general.
02:00:01.160 | I love having women on the podcast.
02:00:03.720 | The female voice, I feel like is under heard on the internet.
02:00:07.560 | - For sure.
02:00:08.400 | - And I would love the internet to be a place
02:00:10.360 | where women feel safe to speak.
02:00:12.960 | All right, given that you're,
02:00:15.040 | like we talked about a progressive
02:00:17.120 | with nonstandard progressive views.
02:00:19.760 | So you're very pro-free speech, pro-capitalism.
02:00:24.560 | So given that, it's very interesting
02:00:26.040 | that you're also pro-establishment and pro-institutions.
02:00:30.240 | So right now, if you look at the world,
02:00:32.840 | there's a significant distrust of institutions,
02:00:36.000 | at least in sort of public intellectual discourse.
02:00:39.400 | What is the nature of your support
02:00:41.200 | for government and institutions?
02:00:42.800 | Can you make the case for and against them?
02:00:45.300 | - Broadly speaking, there is a synergistic effect
02:00:51.440 | when two humans come together.
02:00:53.440 | If I can speak very broadly in terms of,
02:00:56.800 | we'll say utility, okay?
02:00:58.680 | My happiness with one person might be 10.
02:01:01.680 | The happiness with another person might be 10.
02:01:03.600 | When they come together, it's like 50
02:01:05.720 | between the two of them.
02:01:06.560 | There's like this synergistic effect
02:01:07.960 | when humans work together that the sum is greater
02:01:11.320 | than all the individual parts or whatever.
02:01:12.720 | There's like an emergent thing that happens there.
02:01:14.640 | - There's a capacity, there's a possibility of that.
02:01:17.200 | - Yeah, a possibility, sure.
02:01:18.320 | Things could go really wrong.
02:01:19.400 | There could be a cannibalistic tribe
02:01:20.600 | that all eats each other, sure.
02:01:21.960 | But for the purpose of this--
02:01:23.120 | - There's other failure modes, but yes.
02:01:25.120 | - Okay, sure, yeah.
02:01:26.680 | But I think broadly speaking,
02:01:28.720 | are you gonna be the well actually guy?
02:01:30.280 | Okay, if you wanna, okay.
02:01:31.320 | - Well actually.
02:01:32.160 | - Well actually, sometimes--
02:01:33.000 | - Sometimes cannibalism is actually good for both.
02:01:36.080 | - True, yeah, sometimes things do go wrong.
02:01:37.960 | But I think broadly speaking,
02:01:39.080 | the fact that you're sitting here in clothing
02:01:40.920 | that you didn't make and I'm sitting here
02:01:42.320 | on an airplane that I don't know how to fly or build.
02:01:45.000 | There's a lot of cool stuff that happens
02:01:46.440 | when people come together and they make civilizations.
02:01:49.800 | And part of that civilization building
02:01:52.620 | is the fact that we can specialize
02:01:54.960 | and it's the fact that we can offload
02:01:57.000 | a bunch of trust onto third parties
02:01:59.520 | that we delegate the power to make important decisions
02:02:02.680 | about our lives, right?
02:02:03.960 | I don't know anything about how to build
02:02:06.880 | like a combustion engine,
02:02:08.320 | but I know that when I push the button on my car,
02:02:09.800 | it's gonna drive around and the fumes aren't gonna kill me
02:02:12.180 | and I can park it in garages
02:02:13.440 | and the building's not gonna collapse.
02:02:14.720 | And the only reason all of this works
02:02:16.440 | is because I've offloaded a lot of trust
02:02:17.880 | onto these third party things.
02:02:19.600 | And I would say that the pillars
02:02:20.820 | of these third party things that society is built on
02:02:22.860 | are roughly speaking institutions.
02:02:25.360 | So that might be the institution of peer review
02:02:27.940 | for scientific articles.
02:02:29.740 | It might be the institution of voting for government, right?
02:02:33.700 | Or the ability for us to vote in that whole process.
02:02:37.020 | It might, yeah, the FDA,
02:02:38.420 | like all of these institutions are things
02:02:39.760 | that they need to exist because we don't have the time
02:02:42.880 | or the capability to individually sort
02:02:44.820 | through all of these things individually.
02:02:47.100 | We have to rely on some third party to do it.
02:02:49.660 | - Okay, so you believe at scale,
02:02:53.800 | when we're together, we're greater
02:02:55.800 | than the sum of our parts.
02:02:57.040 | That's the case for institutions.
02:02:58.680 | - Absolutely.
02:02:59.600 | - What about the inefficiencies of bureaucracy?
02:03:02.800 | Is there some aspect when at scale,
02:03:05.440 | different dynamics come into play
02:03:07.980 | than they do when there's two people together?
02:03:10.160 | Two people that love each other, the birds and the bees.
02:03:13.400 | Is there some aspect that leads more
02:03:15.520 | to cannibalism at scale?
02:03:17.240 | So like corruption, inefficiencies
02:03:19.200 | that due to bureaucracy and so on.
02:03:21.220 | - Bureaucracy, which is not,
02:03:23.500 | I hate it when people try to say bureaucracy is government
02:03:26.020 | because bureaucracy exists a ton
02:03:27.620 | in private environments as well, right?
02:03:29.220 | In businesses and everything.
02:03:30.660 | Bureaucracy introduces its own set of problems,
02:03:33.460 | but I mean, a bureaucracy is necessary
02:03:35.700 | because it's coordinating all of the underlying things
02:03:38.220 | in order to create something that's greater
02:03:40.300 | than the sum of its parts, right?
02:03:42.280 | Like all of the software developers in the world
02:03:44.620 | are useless without being paired with good designers
02:03:46.680 | in order to make their products usable by a person.
02:03:48.660 | And the coordination of those people
02:03:50.300 | and the coordination of increasingly more and more things
02:03:52.180 | necessitate some level of bureaucracy.
02:03:54.500 | I think we always say bureaucracy when it's like a bad,
02:03:57.100 | it's like a slur almost,
02:03:57.980 | like you're a bureaucrat, you're bureaucratic.
02:03:59.900 | The bureaucracy is slowing everything down.
02:04:01.140 | It's like, sure, the bureaucracy slows things down,
02:04:03.320 | but bureaucracy also gives us things like safe medicine
02:04:06.020 | and safe water to drink for most of the US
02:04:08.180 | or safe buildings to live in or safe cars to drive, so.
02:04:11.240 | - So the managers in an institution
02:04:13.260 | versus like the software developers and the designers,
02:04:16.100 | the managers are the bureaucracy.
02:04:18.140 | The reason bureaucracy is used as a slur
02:04:21.260 | is that something about human nature
02:04:23.980 | leads to bureaucracy often growing,
02:04:27.420 | growing indefinitely.
02:04:30.300 | - Sometimes. - And becoming
02:04:31.140 | less and less efficient.
02:04:32.340 | Without, I mean, this is where capitalism can come in,
02:04:35.620 | that capitalism puts a pressure on the bureaucracy
02:04:39.620 | not to grow too much because you want the bureaucracy
02:04:43.180 | to be useful, but not large.
02:04:47.180 | - Yeah, to be a certain size, yeah, of course.
02:04:49.300 | - To be the minimum size to get the job done.
02:04:53.100 | And so capitalism provides that mechanism.
02:04:55.840 | Government does not always.
02:05:00.900 | And so that's the criticism of government,
02:05:02.660 | of institutions where it can grow
02:05:04.980 | without a significant mechanism that says,
02:05:07.540 | there's a cost to bureaucracy
02:05:09.340 | that's not being accounted for here.
02:05:11.140 | We're just paying for the increasing size of government
02:05:15.020 | without the benefit.
02:05:16.860 | - Yeah, government is a special institution
02:05:18.700 | because it doesn't have to show itself
02:05:20.500 | to be financially viable.
02:05:22.140 | And we kind of live in a capitalist economy
02:05:23.700 | where that's generally the case.
02:05:25.020 | So government gets its powers from votes from the people,
02:05:27.780 | which introduces a whole new set of possible positives
02:05:30.660 | and possible negatives, right?
02:05:32.180 | Having something, for instance,
02:05:33.340 | that gives food or shelter to homeless people,
02:05:35.900 | maybe you don't want that to have to run at a profit.
02:05:38.740 | But giving an organization that can self-justify
02:05:41.460 | its budgets perpetually and indefinitely growing,
02:05:43.660 | maybe that's not the best thing.
02:05:45.220 | Yeah, we always have to figure out
02:05:46.460 | how to do the constraints there.
02:05:48.220 | - What about the corrupting nature of power?
02:05:52.420 | That comes with institutions as well.
02:05:54.460 | - Absolutely.
02:05:55.300 | So then you better pick your style
02:05:57.300 | of institution very carefully.
02:05:58.780 | I think that the democratic institution
02:06:00.780 | we have in the United States today,
02:06:02.100 | I think works very well.
02:06:03.820 | But I mean, there are other styles of government
02:06:05.100 | that have been tried in the past
02:06:05.940 | that I think lend themselves more to corruption.
02:06:08.140 | Not to say that, by the way,
02:06:08.980 | there's not corruption in the United States.
02:06:10.260 | Of course, there's gonna be varying levels of corruption
02:06:12.420 | at like all levels.
02:06:15.060 | But you run into this interesting problem
02:06:17.580 | where authoritarian regimes can act
02:06:20.100 | with ruthless precision and swiftness
02:06:22.580 | because they don't have to ask any questions.
02:06:24.060 | They just do, do, do, do, do, and that's it.
02:06:26.660 | But the problem is, it's an authoritarian regime.
02:06:28.820 | They're prone to missteps.
02:06:30.380 | They're slow to respond to changing or evolving needs.
02:06:33.420 | There was an interesting study that was put out a while ago
02:06:35.140 | that showed that every single famine
02:06:37.020 | that happened around the world,
02:06:38.100 | almost like 98% of them happened under authoritarian regimes
02:06:41.900 | where freedom of speech is very limited.
02:06:43.380 | It's very rare for a famine to happen under a democracy
02:06:46.300 | because press and everything
02:06:47.460 | makes the government more responsive
02:06:48.660 | to the needs of the people.
02:06:49.940 | Power can corrupt, there are levels of corruption,
02:06:51.880 | but you have to have like a system of checks and balances
02:06:53.900 | on all of those different levels
02:06:55.180 | to make sure it doesn't run off the rails, I guess,
02:06:57.060 | and do a sick loop-de-loop
02:06:58.180 | and half the population gets.
02:06:59.980 | - Nice callback.
02:07:01.820 | There's a lot of people that will listen to you
02:07:03.860 | say that the democracy in the United States
02:07:06.300 | is working pretty damn well
02:07:07.540 | and they will spit out the drink
02:07:09.540 | if they're drinking a drink and be very upset.
02:07:13.260 | Can you make the case that they're right and you're wrong?
02:07:16.220 | - Can I make the case-- - Can you steel man?
02:07:17.820 | - They're right, yeah.
02:07:18.660 | Well, the steel man for them
02:07:20.140 | is that people have a lot of problems on the day-to-day
02:07:22.660 | and when they look and they see what government is doing,
02:07:25.620 | they might see potholes outside their house,
02:07:29.020 | homeless people all over their downtown,
02:07:31.580 | and the United States just approved
02:07:33.120 | another X billion amount of dollars for Ukraine.
02:07:36.140 | Or they might be living in a city
02:07:37.980 | where half the factories are shut down,
02:07:39.960 | a lot of their people are out of work,
02:07:41.540 | but the president is on the TV talking about
02:07:43.780 | how to find jobs for immigrants coming in from Mexico.
02:07:46.460 | And for these people,
02:07:47.420 | they have problems that exist in their lives.
02:07:49.980 | Some of them are paying taxes to alleviate these problems.
02:07:52.700 | And then when they listen to the government talk,
02:07:55.340 | it feels like the government is not responding
02:07:56.860 | to the needs that they have.
02:07:58.260 | And then that's one problem.
02:07:59.540 | Then on top of that,
02:08:00.620 | you've got all of these people working in alternative media
02:08:02.580 | that can show you, well, look at this politician
02:08:04.180 | wasting this much money,
02:08:05.180 | or look at him double speaking here or there.
02:08:07.140 | Look at Hillary Clinton saying she's got a private position
02:08:09.300 | and a public position.
02:08:10.340 | Look at how all of these politicians have family members
02:08:12.720 | that are getting rich
02:08:13.560 | because of their relationships with people in Congress.
02:08:15.420 | Look at the revolving door
02:08:16.500 | between capitalist companies and the government.
02:08:19.420 | How can you look at all of that,
02:08:21.140 | take into account that the government's
02:08:22.140 | not responding to your needs,
02:08:23.380 | and then really feel like it's a government
02:08:25.000 | by the people and for the people?
02:08:26.260 | - Yeah, this was very good.
02:08:30.360 | Good steel man, and good question.
02:08:30.360 | How can you?
02:08:31.200 | How can you tell that they're not just politicians
02:08:34.060 | that care more about continuously winning the elections
02:08:37.540 | versus running government effectively?
02:08:40.660 | - They should care about winning the elections.
02:08:41.980 | That's the first misconception.
02:08:43.060 | A lot of people say,
02:08:43.900 | "This guy only cares about getting voted in.
02:08:45.580 | "This guy, he doesn't even believe in fracking or abortion.
02:08:49.220 | "He just changes his opinion to get voted in."
02:08:51.160 | Anytime somebody says that, you should say,
02:08:52.940 | "That's really good."
02:08:53.940 | You want them to change their opinion so they get voted in.
02:08:55.820 | That's the whole point of a democracy.
02:08:57.220 | You don't want them to remain obstinate.
02:08:58.660 | You don't want them to say,
02:08:59.500 | "I'm not changing my opinion no matter what the people want."
02:09:00.860 | You want them to evolve and adopt new opinions
02:09:02.940 | based on what the population,
02:09:04.280 | their constituents are voting for.
02:09:05.500 | - Yeah, but the cynical take is that they're,
02:09:08.260 | on the surface, they're changing their opinion,
02:09:10.220 | but that there's a boys club, or boys means the elite,
02:09:13.420 | that under, in the smoke-filled rooms,
02:09:17.320 | in secret, they actually have their own agenda
02:09:20.540 | and they're following that agenda,
02:09:21.820 | and they're just saying anything publicly
02:09:24.180 | to placate the public based on whatever the new trends are.
02:09:29.180 | So here's-- - It's a cynical take
02:09:31.420 | upon the question. - Yeah, I understand.
02:09:33.100 | Somebody asked me this question and it flipped,
02:09:35.860 | I 180'd completely.
02:09:37.900 | I was a Bernie Sanders supporter in 2016,
02:09:39.580 | and my single issue voting thing was lobbying.
02:09:44.040 | I thought that lobbying, the government was corrupt,
02:09:45.780 | they weren't responding to the needs of people,
02:09:46.700 | it completely destroyed my faith in government and everything
02:09:50.420 | and I had one question posed to me by a conservative
02:09:52.660 | that used to come on my stream and shout at me,
02:09:53.700 | and he asked me,
02:09:55.760 | "Can you think of any popular opinion
02:09:59.500 | "that the American public has
02:10:01.020 | "that the government is unresponsive to?
02:10:02.480 | "Is there some big piece of legislation
02:10:03.900 | "or policy or whatever that people want
02:10:05.660 | "that the government isn't doing?"
02:10:08.020 | And he asked me that, I couldn't think
02:10:09.460 | of a single good answer, and I'm like, oh, geez.
02:10:12.060 | - There's a good answer. - There's not.
02:10:13.260 | - Drugs. - There's not.
02:10:15.260 | - Legalization. - And that thing.
02:10:16.100 | - Legalization or drugs, hold on a sec.
02:10:17.740 | - Yeah, go for it. - All right, well--
02:10:18.700 | - Oh, shoot, you're doing the Joe Rogan thing,
02:10:20.140 | you're pushing back 'cause I brought up weed, go ahead.
02:10:23.180 | I'm sorry. - I have become meme.
02:10:25.600 | I don't even, I don't wanna interrupt your--
02:10:30.340 | - No, you're good. - 'Cause there's
02:10:31.180 | memes upon memes upon memes I can go with here.
02:10:33.240 | But no, 'cause people bring up, okay, there's no issues,
02:10:37.160 | there's no issues that the government
02:10:38.480 | is not representative of the public.
02:10:40.880 | - So here's the issue, so somebody will bring up,
02:10:42.480 | like, well, what about the legalization of drugs, okay?
02:10:45.440 | The first issue people have is, one,
02:10:46.960 | they look at national polling.
02:10:48.580 | Very few things are decided on a national level,
02:10:50.920 | so that's the first huge mistake.
02:10:52.760 | Arguably, a lot of BLM made mistakes in this arena
02:10:55.800 | where they're saying, like, why isn't the government
02:10:57.280 | doing anything about policing?
02:10:58.720 | Federal government can't do anything about policing,
02:11:00.040 | that's gonna be your, sometimes it's gonna be your state,
02:11:01.720 | sometimes it's your local city government.
02:11:03.080 | The people that like your chief of police,
02:11:05.140 | your police commissioner, that's coming from your mayor.
02:11:08.260 | So you've got people looking, one,
02:11:09.900 | at the wrong parts of the government
02:11:11.340 | to even figure out the solution to the problem.
02:11:12.760 | Two, oftentimes for polling, the questions are vague enough
02:11:16.980 | that you can poll very high, but when you get into
02:11:18.860 | the weeds on things, no pun intended,
02:11:21.140 | you start to realize, like, oh, shoot,
02:11:23.020 | this is more complicated than I thought.
02:11:24.540 | I don't know the numbers in particular
02:11:25.940 | for legalization of marijuana,
02:11:27.780 | but this is what I'm gonna guess is the case.
02:11:29.220 | If you poll and you say, should we legalize marijuana,
02:11:32.380 | that number might poll at like 65, 70%,
02:11:36.260 | but that's including people
02:11:37.100 | that are in favor of medical marijuana.
02:11:38.500 | If you were to poll, like, should we decriminalize
02:11:42.260 | recreational use of marijuana,
02:11:43.980 | that number might drop to like 52%.
02:11:46.180 | And then if you poll, like, should we completely legalize,
02:11:48.580 | not just decriminalize, but completely legalize
02:11:50.060 | recreational use of marijuana,
02:11:51.260 | that number might drop to like 40%.
02:11:53.020 | There's like all these different ways
02:11:54.140 | you can poll around issues where people are like,
02:11:55.300 | oh no, we broadly agree on this topic,
02:11:57.700 | but when you really figure out, well, do you,
02:11:59.040 | do we really agree, or is there just like broad consensus
02:12:01.260 | around a thing that's never gonna show up,
02:12:02.460 | like in a piece of legislation?
02:12:03.980 | A really good example, one example I do know
02:12:05.500 | was socialized healthcare.
02:12:07.240 | I think if you poll, there was a time a few years ago
02:12:10.020 | where if you poll America,
02:12:11.540 | do you think every American citizen should have access
02:12:14.060 | to like free healthcare?
02:12:15.840 | I think that answer polled like 74% yes.
02:12:19.420 | But when you asked, should the government
02:12:21.340 | be the sole provider of healthcare,
02:12:22.980 | it dropped to like 26%, dropped 50 points.
02:12:26.300 | And you could see it was both asking questions
02:12:28.060 | about single payer, but the way that it was asked
02:12:29.980 | was so different that even if you all,
02:12:31.680 | looks like there's consensus,
02:12:33.080 | there's not nearly as much consensus
02:12:34.480 | as people think around certain ideas.
02:12:36.660 | Yeah, go, we can argue any political--
02:12:37.500 | - You're right, you're right, you're right, you're right.
02:12:40.920 | That polls, the way you ask the polls really matters.
02:12:44.560 | When you ask, should the government be in charge of a thing,
02:12:47.440 | that also biases the answer, right?
02:12:50.360 | Like, because there's such a negative experience
02:12:54.040 | with government creating a .gov site that runs the thing.
02:12:57.160 | But sometimes.
02:12:58.500 | - Sometimes.
02:12:59.340 | I think if you dig in, if you have a one hour conversation
02:13:02.580 | with each individual citizen,
02:13:04.240 | - Sure.
02:13:05.200 | - I think you will understand that yes,
02:13:07.060 | there is support for socialized medicine.
02:13:09.360 | Like it's not--
02:13:10.460 | - The argument has to be made though, yeah.
02:13:12.660 | - What do you mean, the argument has to be made?
02:13:14.060 | - Like if you just ask a conservative,
02:13:15.520 | like what about single payer?
02:13:16.620 | They're gonna tell you no.
02:13:17.540 | You might be able to build up to an argument for it,
02:13:19.260 | but you're gonna have to make the case for it.
02:13:20.660 | - No, but I thought we're talking about
02:13:23.280 | the feeling deep inside your mind and heart,
02:13:26.380 | does the government represent that?
02:13:28.300 | - Oh.
02:13:29.140 | - So it's not like some shallow,
02:13:31.260 | surface layer public opinion.
02:13:34.300 | Does the government effectively represent
02:13:36.940 | what the people want?
02:13:38.760 | Not a shallow survey, but deeply what they want.
02:13:41.940 | I'm not actually that familiar
02:13:43.340 | with the debates over healthcare,
02:13:45.100 | but let's maybe look at an easier one.
02:13:47.300 | - Sure.
02:13:48.140 | - Maybe you'll say it's harder, war.
02:13:50.420 | - War is a really good example where the government
02:13:52.460 | was very responsive, I think, to the people.
02:13:55.060 | - You think so?
02:13:55.900 | - So Iraq, Afghanistan,
02:13:58.420 | the government didn't manipulate public opinion.
02:14:01.980 | - There's an argument to be made that they did
02:14:04.340 | in terms of like WMD and everything,
02:14:06.180 | but after 9/11, were you in the United States after 9/11?
02:14:09.500 | After 9/11, I legitimately--
02:14:11.180 | - That seems accusatory.
02:14:12.460 | Like where were you on 9/11?
02:14:13.860 | (laughing)
02:14:15.460 | - Just checking, okay?
02:14:16.300 | - All right, cool.
02:14:17.540 | I have evidence and witnesses.
02:14:20.220 | No, okay, all right.
02:14:21.060 | I'm very defensive right now.
02:14:22.820 | It's very strange.
02:14:23.780 | Look into it.
02:14:25.420 | - Alex Jones.
02:14:26.260 | - Yeah, but I think after 9/11,
02:14:27.540 | we could have gone to war with any country in the world.
02:14:30.980 | We were ready because all of America was like, oh my God,
02:14:33.900 | and they pointed to Iraq,
02:14:35.180 | and the reasons for the WMDs were kind of dumb,
02:14:37.800 | but I don't think we even needed WMDs to go to Iraq.
02:14:39.940 | We could have just said Saddam Hussein
02:14:41.700 | was giving medical aid to Taliban, Al-Qaeda, Iraq, let's go,
02:14:45.140 | and we would have gone for it.
02:14:46.620 | But post-Iraq, Iraq was for a while popular
02:14:50.140 | and then became obviously deeply unpopular,
02:14:52.060 | Iraq and Afghanistan,
02:14:53.300 | and I think you could see that influence
02:14:54.740 | in other foreign policy that the United States had.
02:14:56.700 | For instance, we opted more towards drone warfare
02:14:59.420 | than troops on the ground for places like Yemen.
02:15:01.900 | We opted more towards sending money and help
02:15:04.180 | instead of boots on the ground for places like Syria,
02:15:06.660 | and I think that a lot of that was kind of in response
02:15:08.860 | to how unpopular the Iraq stuff had become.
02:15:11.380 | And when you looked at a lot of elections afterwards,
02:15:13.220 | even for Obama,
02:15:14.820 | one of the defining characteristics of a lot of campaigns
02:15:17.200 | were like, I'm gonna close Guantanamo Bay,
02:15:19.080 | I'm gonna get us out of foreign wars, even up to Trump.
02:15:21.420 | I'm gonna stop
02:15:23.340 | doing all this weird stuff in the Middle East.
02:15:25.060 | - But they still didn't withdraw from Afghanistan.
02:15:27.060 | - They didn't withdraw, but they definitely tapered off
02:15:29.420 | and weren't as aggressively pushing those types of conflicts
02:15:31.940 | 'cause they knew it was unpopular.
02:15:33.020 | - But I think if you also consider perfect information
02:15:36.460 | or good information, if you ask a lot of people,
02:15:40.100 | are you okay spending this amount of money for this purpose,
02:15:45.980 | so a military conflict in Iraq and Afghanistan,
02:15:48.420 | I think almost from the very beginning, they would say no.
02:15:51.500 | - After 9/11, I feel like--
02:15:52.700 | - Maybe like a few days after 9/11.
02:15:54.700 | - I remember freedom fries.
02:15:56.900 | We were so mad at--
02:15:58.220 | - There's some memes and so on, yes.
02:15:59.820 | But the nature of the public support for the war,
02:16:02.960 | was there public support in 2003,
02:16:06.660 | which is when the invasion happened?
02:16:08.940 | - I feel like initially there was a lot.
02:16:10.300 | I remember seeing it on,
02:16:11.120 | but then I also lived in a Republican household
02:16:12.540 | and I was not very media savvy at the time.
02:16:14.100 | So my parents--
02:16:14.940 | - And I don't know if the nature of that public support
02:16:16.260 | had to do with WMDs or with 9/11.
02:16:19.400 | 'Cause the weird--
02:16:20.240 | - It became about WMDs.
02:16:21.660 | - But I wonder what is the,
02:16:23.060 | if you were to poll people and let's say hypothetically,
02:16:26.360 | there was above 50% support for the war,
02:16:29.300 | what would be the nature of that support?
02:16:31.700 | And to what degree is the government
02:16:34.460 | actually representing the will of the people
02:16:37.240 | versus some complex mechanism
02:16:41.020 | like the military industrial complex
02:16:43.080 | is manipulating the narrative
02:16:44.700 | that's controlling public opinion?
02:16:47.240 | And then there's the media that gets a lot of attention
02:16:51.380 | by being divided in how they're shaping the narrative
02:16:54.820 | through the mechanism of division.
02:16:56.900 | So what--
02:16:58.180 | - There's a lot of complicated things out there.
02:16:59.900 | It's not just like the people and then the government.
02:17:02.040 | And that's, yeah, for sure I agree
02:17:03.700 | that there are gonna be different elements at play.
02:17:05.580 | - And how much of those elements that lead us astray
02:17:10.420 | can be attributed to the largeness
02:17:12.700 | of the different systems and the different institutions,
02:17:16.660 | like the media institutions and government.
02:17:21.180 | The institutions that have a monopoly on violence,
02:17:23.700 | let's put it this way,
02:17:24.780 | which is one way to define government.
02:17:26.940 | - Sure, it's complicated.
02:17:28.860 | There's definitely gonna be different institutions at play.
02:17:30.980 | But I think that like,
02:17:32.580 | all I would say is like in reference to my original point,
02:17:34.660 | when there becomes like broad consensus around a thing,
02:17:37.300 | I think the government will usually follow.
02:17:38.780 | It's not gonna fight.
02:17:39.620 | It's gonna follow more often than not.
02:17:41.980 | But I think that a lot of times I think Americans
02:17:44.180 | think that there's more consensus around certain issues
02:17:46.660 | than there actually are.
02:17:48.100 | So like a really good example,
02:17:48.940 | we're on that war point too.
02:17:51.340 | What caused like the lowest dip in Biden's approval rating?
02:17:55.420 | I'm pretty sure it was right after
02:17:56.460 | we pulled out of Afghanistan,
02:17:58.060 | which I think if I would have asked people like a year before
02:18:01.580 | like, let's assume that we could pull out of Afghanistan.
02:18:04.780 | The government's probably gonna collapse after we leave
02:18:06.580 | because they just don't have the will to fight.
02:18:08.020 | They don't have the support, they don't, whatever.
02:18:09.220 | That's just not gonna work.
02:18:10.180 | But like no Americans are gonna die.
02:18:12.100 | There might be a couple of other people,
02:18:12.940 | but like no Americans are gonna die
02:18:13.780 | when we get out of Afghanistan.
02:18:14.620 | Would you support that?
02:18:15.460 | I think broadly speaking,
02:18:17.100 | I think like more than 60 or 70% of Americans are like,
02:18:19.900 | "Yeah, that would be fine."
02:18:21.260 | But then when it actually plays on TV,
02:18:22.780 | when we see the people hanging onto the planes,
02:18:24.260 | when we see like helicopters at embassies,
02:18:26.100 | some of the quotes from politicians,
02:18:27.460 | well now it's like, "Oh my God, this was horrible
02:18:29.180 | and it was so botched and it was so like,
02:18:30.860 | it could have gone so much better."
02:18:31.700 | It's like, "Well, could it have gone better?"
02:18:32.740 | Like maybe, maybe not.
02:18:34.100 | But I mean, it seems like you can have consensus
02:18:36.540 | around a certain opinion,
02:18:37.580 | but the way that things play out
02:18:39.580 | and the way that people actually feel,
02:18:41.060 | it's actually way, way, way more complicated
02:18:43.260 | and there's not usually this broad consensus opinion.
02:18:46.460 | All right, yeah, go ahead.
02:18:47.300 | - I'd like to believe that.
02:18:48.340 | I mean, just to lay my cards on the table,
02:18:52.100 | I have faith in the power of effective government.
02:18:58.900 | I just have a lot of concern about what happens
02:19:05.100 | as institutions grow in size.
02:19:07.540 | - For sure.
02:19:08.380 | - And I just have a lot of worry
02:19:09.420 | about the natural corrupting influence on the individuals
02:19:13.420 | and on the system as a whole,
02:19:15.660 | like the boys club nature of it.
02:19:17.820 | I don't know, there must be a better term,
02:19:19.420 | but basically they agree to the game
02:19:22.420 | and they play the game
02:19:23.460 | and there's a generational aspect, momentum to the game.
02:19:27.940 | And they more and more stop being responsive
02:19:30.300 | to the people that they represent.
02:19:32.860 | I just feel like there is that mechanism.
02:19:35.060 | And I think the nice thing is,
02:19:37.740 | democratic elections are a resistance
02:19:40.340 | to that natural human mechanism.
02:19:41.980 | Also the balances of power
02:19:44.220 | is a resistance to that mechanism.
02:19:46.020 | In some ways, the media that reveals the bullshit
02:19:48.740 | of politicians is also a resistance to that mechanism.
02:19:52.300 | It's hard to be full of shit as a politician
02:19:54.780 | 'cause people will try to catch you on it.
02:19:56.620 | So there's an honesty mechanism there that keeps you honest.
02:20:00.940 | It's to some degree, but it still feels like,
02:20:04.480 | it still feels like politicians are gonna politician.
02:20:10.380 | - Yeah, they definitely play their games.
02:20:12.020 | That is true.
02:20:13.300 | There's probably always gonna be that meta narrative
02:20:15.620 | over like governance that just develops
02:20:18.500 | as like you have to form relationships and play games
02:20:20.860 | to get legislation passed and everything.
02:20:22.980 | The only reason why I don't like it
02:20:24.860 | when people attack institutions is because one,
02:20:27.240 | institutions are incredibly important, arguably paramount.
02:20:29.860 | No, they are paramount to keeping society running.
02:20:32.020 | And two, I think sometimes when we shift the blame
02:20:34.420 | onto institutions too much,
02:20:36.240 | I think that we lose sight of what the real problems are.
02:20:38.540 | So for instance, in the United States today,
02:20:40.860 | people might be very critical of the government
02:20:42.540 | not getting much done,
02:20:44.060 | but then everybody turns their eyes to the government
02:20:45.780 | for being ineffective.
02:20:47.100 | But what I would argue is I would say the government
02:20:48.860 | is actually incredibly effective
02:20:51.060 | and it's showcasing the will of the American people
02:20:53.180 | really well right now,
02:20:54.140 | which is we are historically more divided
02:20:56.460 | than we have ever been.
02:20:57.900 | And if I were to just look at the people
02:20:59.360 | and I were to say, we have a historic divide
02:21:01.940 | that is getting like rapidly blown apart
02:21:04.220 | by things like the internet and the media, right?
02:21:06.180 | If that exists, well, what would I expect
02:21:07.580 | that government to look like?
02:21:08.700 | I wouldn't expect the government
02:21:09.540 | to be governing very effectively.
02:21:10.620 | I would expect that government to show
02:21:12.300 | that legitimate divide in people.
02:21:14.420 | - Do you think that divide,
02:21:15.740 | we have a perception of a large divide
02:21:18.140 | between left and right.
02:21:19.940 | Do you think that's a real divide that's in this country?
02:21:22.220 | - Narrow the language.
02:21:23.140 | What do you mean by real divide?
02:21:25.420 | - Do you think there is that divide in ideology,
02:21:28.420 | that there's a large number of people
02:21:29.500 | that believe a certain set of policies
02:21:30.980 | and the different set of policies?
02:21:32.940 | Or is it just the perception on Twitter?
02:21:35.660 | - No, I think there is a large divide in terms of belief.
02:21:37.660 | I don't think there's very much divide between any people
02:21:39.660 | in terms of like what they,
02:21:40.860 | like on the most fundamental levels want
02:21:43.140 | in terms of like human beings.
02:21:44.500 | But in terms of like Democrat versus Republican right now,
02:21:46.860 | I think there is a huge divide
02:21:48.460 | in terms of the direction they wanna see the country go
02:21:50.460 | and what they believe really
02:21:51.860 | and what they even believe is reality, right?
02:21:53.500 | Unfortunately, that's where we've gotten to.
02:21:55.940 | - Can I just speak about the mechanism
02:21:57.380 | of the left and right here,
02:21:58.980 | maybe on the mimetic rivalry aspect?
02:22:01.780 | Is there some aspect to the left
02:22:03.740 | on which you're a part of that attacks their own
02:22:06.180 | for ideological impurity more than the right does?
02:22:10.560 | Is it the narcissism of small differences?
02:22:12.780 | There's a concept where when you're near somebody
02:22:17.260 | who is very slightly different than you,
02:22:18.620 | you wanna destroy it.
02:22:19.620 | But when you're with somebody
02:22:20.460 | that's way different than you, you don't.
02:22:22.420 | I think the left does it,
02:22:24.460 | but I think the right does it too.
02:22:25.620 | I didn't realize it until I started dipping more
02:22:27.200 | into conservative communities, but oh my God,
02:22:29.420 | the people from the Daily Wire
02:22:30.660 | and the people from Turning Point and the America First,
02:22:33.620 | but all these different groups of people hate each other
02:22:36.100 | and they fight each other so much.
02:22:37.380 | They hire and fire sometimes employees.
02:22:39.540 | They talk smack about each other.
02:22:41.720 | I think there's a lot of political division
02:22:42.920 | between both sides.
02:22:43.760 | I think that the left just kind of gets highlighted more
02:22:45.840 | because it's like the internet
02:22:46.760 | and a lot of the internet spaces
02:22:47.920 | have a lot of left-leaning people.
02:22:48.960 | So you see like the crazy communists
02:22:50.640 | and the crazy progressives
02:22:52.080 | and the crazy center-left liberals
02:22:53.840 | and the crazy blah, blah, blah.
02:22:54.960 | Whereas like a lot of the right-leaning people
02:22:56.240 | have kind of been pushed off of the main areas
02:22:58.160 | of the internet now.
02:22:59.520 | - Interesting.
02:23:00.360 | My sense was that it's hard to exist on the center-left,
02:23:04.000 | but maybe because I just don't have the full spectrum view
02:23:07.600 | of the political divide.
02:23:08.600 | It felt like center-left is a difficult position
02:23:10.660 | to occupy.
02:23:11.500 | - Yeah, I would definitely say so, yeah.
02:23:12.820 | - I don't know if it's that difficult to be center-right.
02:23:15.420 | - It's very difficult to be center-right.
02:23:17.180 | I think actually maybe even more difficult
02:23:19.460 | because a center-right person might be somebody
02:23:21.180 | who's like conservative, but not a fan of Trump.
02:23:24.060 | And you're like over.
02:23:25.940 | Like look at like Liz Cheney, right?
02:23:27.220 | You've had politicians that are just like,
02:23:28.780 | they didn't back the Trump stuff and now they're gone.
02:23:30.540 | Or you might be like center-right,
02:23:32.020 | but like you don't think the election was stolen.
02:23:33.900 | And now you're like half the Republican Party
02:23:36.180 | is looking at you like you're crazy, you know?
02:23:38.380 | - That's true, that's true.
02:23:39.440 | I think there's Ben Shapiro, who I'm talking with,
02:23:42.920 | I think he publicly spoke against Trump, right?
02:23:46.360 | - He did initially, but I felt like he softened
02:23:47.920 | his language up on him pretty significantly.
02:23:49.760 | - So there's a significant pressure to kind of
02:23:51.960 | cop out to a certain kind of messaging.
02:23:53.880 | - Which the whole Republican Party is feeling right now.
02:23:55.960 | Geez, two years from now, that election is gonna be insane.
02:23:59.280 | - It's just hard, okay.
02:24:00.200 | So to generalize, it's hard to be in the center,
02:24:02.360 | it feels like.
02:24:03.200 | - For sure.
02:24:04.020 | - To center and then like do like a random walk
02:24:08.000 | among the policies around that.
02:24:10.400 | I don't know what that mechanism is.
02:24:12.120 | I mean, it makes people like me not feel good
02:24:15.880 | being in the center.
02:24:17.360 | It seems like people are just not nice
02:24:19.480 | to people in the center.
02:24:21.000 | Like the public, the Twitter machine is not nice
02:24:24.960 | to the people who are open-minded in the center.
02:24:27.760 | Is there some truth to that?
02:24:29.420 | - Two reasons for that.
02:24:30.560 | One is because I think a lot of people
02:24:32.080 | that market themselves as center
02:24:33.440 | are legitimately spineless cowards
02:24:35.420 | and deserve to be called out.
02:24:37.880 | I've never killed a man, but today might be my first.
02:24:40.720 | - Oh no.
02:24:41.560 | - Like I told you, I'll take over your stream.
02:24:45.520 | - With AI, we'll see.
02:24:47.280 | Is that guy gonna be streaming in the background?
02:24:48.960 | - Hey fellas.
02:24:49.800 | - Okay, gotcha, gotcha, gotcha, gotcha.
02:24:53.720 | Lots of gotchas, lots of gotchas.
02:24:55.160 | Okay, gotcha.
02:24:56.480 | - And decrease the amount,
02:24:57.720 | which already is a pretty low level of emotion.
02:25:01.200 | Just decrease it completely.
02:25:02.700 | When people are screaming at you and accusing stuff,
02:25:05.960 | just remain calm.
02:25:07.320 | - Absolutely.
02:25:08.160 | - Emotionless.
02:25:08.980 | - The gaslighter strategy.
02:25:10.000 | - Yeah.
02:25:10.840 | Okay, so what were we talking about?
02:25:12.400 | So I don't--
02:25:14.600 | - I don't even, I don't ever identify as center anything
02:25:16.720 | because it's got such a bad reputation because--
02:25:18.360 | - Fuck that, I stand center with a spine.
02:25:22.040 | It's called being open-minded.
02:25:23.320 | And it's not center left and right, those are just labels.
02:25:26.280 | - Here's a really good quote my mom said to me
02:25:28.240 | when I was really young.
02:25:29.080 | She said, "Stevie, don't ever let your mind be so open
02:25:31.560 | "that your brain falls out."
02:25:32.960 | And that's what I find that a lot of center people do.
02:25:34.800 | - That's not what she told me last night.
02:25:36.920 | Why are you like this?
02:25:38.080 | (laughing)
02:25:39.480 | - I'm sorry, man.
02:25:40.680 | - Okay, I'm glad I can bring that.
02:25:42.560 | I'm glad you feel like this is a safe space.
02:25:43.760 | Like I said, people, non-judgmental.
02:25:45.360 | If you wanna talk about fucking my mom,
02:25:46.640 | you know what, you're totally within your rights.
02:25:48.040 | - I didn't say that, you said that.
02:25:49.400 | - That's totally great, I support that.
02:25:50.960 | She's a beautiful woman.
02:25:52.240 | Her husband probably wouldn't be too happy about it,
02:25:53.580 | but you know.
02:25:54.560 | - I didn't say there was any sexual relations.
02:25:56.520 | It was just that I was having a conversation with her.
02:25:59.120 | You projected that, that says more about you than me.
02:26:02.040 | Anyway, go ahead about spineless center.
02:26:04.480 | - Spineless center.
02:26:05.720 | There is some aspect to that which is amorphous.
02:26:09.040 | To me, center means you think freely
02:26:13.080 | about each individual policy without being stuck to a--
02:26:16.320 | - Some ideal, yeah, but a lot of people don't do that.
02:26:18.240 | They call themselves centrist,
02:26:19.320 | but then they're anti-establishment,
02:26:21.020 | essentially, on everything.
02:26:22.320 | I don't know your position on the vaccines or anything,
02:26:25.520 | but I met a lot of free and open thinkers
02:26:27.400 | who were like, you know what, I am open to everything,
02:26:29.640 | and it's an experimental vaccine,
02:26:31.040 | and I'm gonna eat hydroxychloroquine and ivermectin
02:26:33.160 | because that's what the institutions
02:26:34.360 | are telling me not to take,
02:26:35.560 | and I think Fauci got too much money from that company,
02:26:37.960 | and these are, but I'm an open thinker,
02:26:39.560 | and what open thinker becomes--
02:26:41.160 | - I'm at MIT.
02:26:42.000 | What do you think my position on vaccines is exactly?
02:26:44.640 | - I hear a lot of crazy things from a lot of people, okay?
02:26:46.200 | You might be from MIT, but I know you from the internet,
02:26:48.240 | okay, and people from the internet are weird and crazy, so.
02:26:50.560 | - Yeah. (laughs)
02:26:51.880 | Well, I-- - Who knows?
02:26:53.640 | - I don't like arrogance,
02:26:55.000 | and I have criticized scientists during COVID,
02:26:58.480 | a lot of people, but scientists included,
02:27:00.200 | having arrogance. - Which is fair.
02:27:01.960 | - But-- - And I think there's
02:27:03.160 | a lot of good criticism to be made
02:27:05.400 | of different scientific and medical establishments
02:27:08.520 | over a lot of stuff,
02:27:10.460 | but nobody can make those good criticisms
02:27:12.560 | because they're too obsessed
02:27:14.240 | over just trying to have the anti-establishment answer,
02:27:17.360 | and that is what is upsetting me the most.
02:27:19.440 | I think there are good conversations to be had
02:27:22.160 | about a lot of stuff related to
02:27:24.280 | how we handled the coronavirus.
02:27:26.120 | Were lockdowns effective?
02:27:27.240 | Was there enough data to support the huge measures we took?
02:27:30.920 | Why didn't we have the option to show,
02:27:33.040 | I was infected a month ago,
02:27:34.440 | why do I need to be vaccinated?
02:27:35.720 | Why wasn't that option ever a thing in the United States?
02:27:37.320 | I don't think it was.
02:27:38.440 | There are really good questions to be asked there,
02:27:41.120 | but all the people asking the questions
02:27:42.800 | are also trying to tell you that ivermectin
02:27:44.320 | and monoclonal antibodies are the way to go for everything,
02:27:46.600 | and the vaccine is evil,
02:27:47.840 | and it's gonna turn you gay like the frogs,
02:27:49.600 | and it's like, Jesus,
02:27:51.400 | there's no place to reasonably criticize from
02:27:54.240 | because all of the people that are criticizing
02:27:55.940 | aren't doing it with an open mind,
02:27:57.360 | or they're not reading studies or doing anything.
02:27:59.800 | They're saying, "I do my own research,"
02:28:01.060 | which means they listened to
02:28:02.360 | whatever the last guy on Joe Rogan said,
02:28:03.640 | and now they are parroting that opinion 100%.
02:28:05.040 | - Easy now, easy now, bros.
02:28:06.680 | - The last guy on Joe Rogan, not Joe Rogan, okay.
02:28:08.200 | That Robert Malone guy on Joe Rogan
02:28:09.800 | got me real fired up. - That's one guest.
02:28:11.320 | - People see him as the father of mRNA technology.
02:28:14.320 | He published one paper, okay?
02:28:16.080 | - What do you mean people?
02:28:17.200 | Which people think that?
02:28:19.080 | - Joe Rogan fans.
02:28:19.920 | I run into these people.
02:28:21.400 | I start arguing with people,
02:28:22.220 | and they start saying to me, "Well, what about--"
02:28:23.060 | - I'm a Joe Rogan fan, and I appreciate the vaccine.
02:28:26.560 | - That's good, I'm glad you do.
02:28:27.760 | But there's definitely--
02:28:28.600 | - Sorry, but you said there's a type.
02:28:30.960 | What's the type of Joe Rogan fan?
02:28:33.120 | - Anti-establishment.
02:28:34.880 | - I think that's not Joe Rogan.
02:28:36.040 | That's a general public discourse.
02:28:37.960 | There's a default anti-establishment.
02:28:40.680 | On the right and the left,
02:28:42.440 | that's the default easy thing to go to.
02:28:45.160 | - I think Joe Rogan fans are definitely
02:28:46.360 | a certain type of anti-establishment, though.
02:28:48.280 | Like, I could guess the Joe Rogan fan.
02:28:50.280 | Like, if I were to do general population
02:28:52.240 | versus Joe Rogan fan,
02:28:53.080 | who do you think is more likely to be anti-vaccine?
02:28:55.640 | - Do you have data on this?
02:28:57.280 | Or are you just guessing? - You're gonna do that to me?
02:28:58.520 | - Yeah. - Just guessing.
02:28:59.480 | - Yeah, I think you are actually judging.
02:29:01.720 | - I am, yeah.
02:29:02.540 | - You're judging.
02:29:03.380 | 'Cause I think you're also,
02:29:04.920 | the beautiful thing about podcasting,
02:29:08.000 | this could be similar to streaming,
02:29:09.520 | is there's a large number of people that just listen.
02:29:13.000 | Like, what does it mean to be a Joe Rogan fan?
02:29:15.560 | - I don't think you just listen.
02:29:16.600 | I think people listen and absorb the information.
02:29:19.360 | - I would say that the Joe Rogan fan base
02:29:21.640 | is as divided in the vaccine as the general public.
02:29:24.600 | - Gotcha.
02:29:25.440 | Man, I'm gonna look for polling data on that.
02:29:26.800 | I'm sure somebody's gotta have done it out there, but--
02:29:28.760 | - No, but you're basically revealing
02:29:30.720 | the fact they have no data.
02:29:31.760 | You're using your own judgment.
02:29:33.200 | - For sure.
02:29:34.400 | Based on how he's had conversations
02:29:35.880 | about his experience with the coronavirus,
02:29:37.720 | and then based on the guests that have come on
02:29:39.760 | that have talked and echoed a lot of anti-vax talking points
02:29:42.200 | and been completely unchallenged,
02:29:43.720 | and then based on statements he's made
02:29:45.240 | about myocarditis and the vaccine and everything as well.
02:29:47.960 | - So it's the level of challenge or not that he's doing.
02:29:50.840 | - Well, yeah, and then what his true positions are,
02:29:52.560 | and then the types of guests he typically chooses
02:29:54.280 | to bring on to talk about the vaccines, yeah.
02:29:56.080 | - Okay, but that represents somehow
02:29:58.300 | a deep anti-establishment feeling versus--
02:30:01.720 | - Versus just the vaccine.
02:30:02.760 | I mean, I've seen the vaccine and other things
02:30:04.600 | being a thing that broke people.
02:30:07.640 | They seem to-- - I think all the coronavirus,
02:30:08.800 | that whole one or two years broke a lot of people.
02:30:12.080 | - 'Cause there's a lot of emotion,
02:30:13.080 | and that emotion quickly solidified into an opinion
02:30:17.120 | that almost had nothing to do with thinking through
02:30:20.880 | and updating your knowledge and so on.
02:30:22.800 | You just made up your mind.
02:30:24.480 | - Yeah, but I think a lot of it comes
02:30:25.640 | from that anti-establishment place.
02:30:27.200 | The vaccine represents the ultimate of establishment.
02:30:30.340 | It was a huge private company
02:30:33.080 | backed by a huge public government,
02:30:36.200 | and there's Fauci, and there's Biden, and there's Pfizer,
02:30:40.080 | and there's all these countries locking us up in our homes,
02:30:42.480 | telling us to do a thing.
02:30:43.560 | The vaccine was the ultimate submission tool
02:30:47.280 | to show you that the government owns you.
02:30:48.800 | Not only do you have to get injected once, it's a series,
02:30:51.200 | and then you gotta get boosters,
02:30:52.480 | and it's like they're trying to keep you under their thumb,
02:30:54.040 | and that's the control.
02:30:55.800 | I feel like that vaccine
02:30:56.760 | became the ultimate rallying cry between,
02:30:58.440 | do you support, are you a sellout
02:30:59.920 | that is gonna believe whatever the government
02:31:01.520 | tells the sheep to take,
02:31:02.720 | or are you gonna be like the guy
02:31:03.720 | that stands against the crowd and gets fired from his job
02:31:06.600 | and pulls his kids from school
02:31:07.760 | because they're not gonna let the evil Fauci medicine
02:31:09.960 | jab them in the arm?
02:31:11.160 | - And the funny thing is the crowd that stands against
02:31:14.600 | the institution is not larger than the crowd of sheep.
02:31:18.120 | There's like one sheep standing there.
02:31:19.760 | - Sure, yeah, or it feels that way sometimes.
02:31:21.600 | - One vaccinated sheep.
02:31:23.280 | Well, okay, what's the defense of institutions?
02:31:26.080 | How do you regain the trust of institutions?
02:31:28.080 | Like, first of all, do you think that there's ways
02:31:31.200 | in which WHO, CDC failed,
02:31:34.800 | and do you think there's criticism towards Pfizer
02:31:37.600 | and the big pharma companies that's deserved?
02:31:40.080 | - Damn, it's the pharma companies, I'm not sure.
02:31:42.520 | For CDC and WHO, so here's a criticism that I have
02:31:45.640 | of all of academia,
02:31:47.080 | and I feel it stronger and stronger every day.
02:31:49.600 | I don't think it's enough to be a researcher
02:31:52.720 | or to be correct about issues.
02:31:54.440 | Academia needs to increase its ability to communicate.
02:31:58.560 | It is just an unbelievable, unmitigated failure
02:32:03.400 | that academics are unwilling to wade
02:32:06.080 | into the complicated topics that exist today
02:32:09.200 | because other people are, you know?
02:32:10.800 | - First you call me spineless,
02:32:12.320 | and then you call me a bad communicator.
02:32:14.280 | - But no, look, you're here, you're doing it,
02:32:15.360 | so you get props for me, okay?
02:32:16.600 | Good job.
02:32:17.440 | - That motherfucker.
02:32:18.280 | - But there are like so many, but I'm sure you've,
02:32:19.520 | I'm sure that you must have heard another fellow academic,
02:32:24.520 | a fellow colleague express some amount of frustration
02:32:27.680 | about like in their specific discipline,
02:32:30.320 | they know something to be true,
02:32:31.440 | and they know that like a lot of the messaging
02:32:33.320 | is like wrong or bad in the public about it,
02:32:35.120 | but they're never gonna step out and say anything
02:32:36.520 | because either one, they're very measured
02:32:39.280 | and careful with their takes,
02:32:40.440 | which they feel are incompatible
02:32:40.440 | with what people wanna hear,
02:32:41.560 | or two, they're really worried
02:32:43.280 | that they might be incorrect,
02:32:44.240 | so they're gonna be cautious
02:32:45.200 | while everybody else is going out
02:32:46.400 | and like hard-courting their--
02:32:47.360 | - And they also don't have the support of institutions
02:32:49.320 | for them to go out on a limb.
02:32:50.560 | - Yeah, that too.
02:32:51.400 | - So like to take risks, for example,
02:32:52.760 | I've heard that with lab leak theory.
02:32:55.880 | I've had a lot of biologists, virologists, friends
02:32:58.120 | that are like, yeah, it's obviously leaked from a lab.
02:33:01.520 | Like early on.
02:33:02.920 | - Oh, maybe, okay.
02:33:04.040 | We can fight over this one, but sorry, go ahead.
02:33:05.400 | - Let's fight.
02:33:06.240 | - No, no, keep talking.
02:33:07.080 | - We can fight over this.
02:33:07.900 | Well, like they, okay, I should sort of backstep
02:33:11.120 | and say like that's like you talk about shooting the shit,
02:33:13.680 | you haven't really investigated, but it's your gut.
02:33:15.960 | Like this doesn't make any sense.
02:33:17.680 | They would never say that publicly.
02:33:18.960 | - Of course.
02:33:19.920 | - Mostly because you're saying like what they would all say
02:33:23.320 | is like, we want to see data.
02:33:24.960 | - Yeah, which would be good, which is fine.
02:33:26.640 | - So they're going with like,
02:33:27.800 | like this is too many coincidences in the same place.
02:33:30.160 | That's the logic, but they don't want to say anything
02:33:32.860 | because there's no data.
02:33:33.720 | You need to have evidence.
02:33:34.880 | You need to have actual evidence
02:33:36.040 | to say one way or the other.
02:33:37.480 | There's that, but there's also just like you said,
02:33:40.140 | I mean, effective communication.
02:33:42.880 | You're a fan of Sean Carroll.
02:33:44.360 | - Oh, yes.
02:33:45.520 | He's like one of the only people in this whole planet
02:33:47.880 | that I like besides you.
02:33:49.000 | I love Sean Carroll.
02:33:50.480 | - Anytime Sean Carroll is brought up as evidence,
02:33:53.640 | there's a smile that comes over your face.
02:33:56.120 | - I love it.
02:33:56.960 | - Of like joy of like a little kid
02:33:58.800 | thinking about Santa Claus.
02:34:00.320 | Okay, I love Sean Carroll too.
02:34:01.800 | People should listen to this podcast.
02:34:02.920 | - I love Sean Carroll because I hate this divide
02:34:06.160 | between like you're either STEM
02:34:08.200 | or you're like philosophy, arts, and all that other stuff.
02:34:10.200 | And the two worlds kind of cross.
02:34:11.200 | And I love that he was so good at physics,
02:34:13.180 | but like explores and pays attention
02:34:14.680 | to all of the like sociological stuff too.
02:34:16.520 | It's so rare to find that quality in a person.
02:34:18.480 | - He's legit one of the really, really, really
02:34:20.500 | special minds, but you don't have to be a Sean Carroll.
02:34:22.360 | You can be just a little better at educating.
02:34:24.720 | Another person in the medical and the health space
02:34:28.000 | is somebody named Andrew Huberman,
02:34:29.720 | a friend of mine from Stanford.
02:34:31.820 | He's an incredible educator.
02:34:33.920 | There's the kind of process in science
02:34:36.220 | they usually call like review or survey papers
02:34:38.720 | where you basically summarize all that's going on,
02:34:42.160 | integrate it, and like draw wisdom from it,
02:34:44.280 | and also project like where's the discipline headed.
02:34:47.080 | And Andrew does that basically on all these subcomponents
02:34:49.880 | of the different stuff going on in neuroscience,
02:34:52.520 | and biology, neurobiology, all of that.
02:34:54.440 | He's able to, he does a podcast called Huberman Lab
02:34:58.140 | where he just summarizes all and is able to explain
02:35:01.000 | like what does that actually mean for your life
02:35:04.240 | in terms of protocols of how to make your life better.
02:35:07.040 | I feel like people should be able to do that more and more.
02:35:09.360 | But with virology, and oh boy, that's a tricky one.
02:35:14.140 | That's a really tricky one.
02:35:15.600 | - I wish that people could have honest conversations.
02:35:18.080 | Like I attack a lot of people
02:35:19.400 | that do the lab leak theory stuff,
02:35:20.880 | but truly we should be able
02:35:22.560 | to have that conversation publicly.
02:35:24.280 | It just, it always feels like
02:35:25.120 | the people that are having the conversation
02:35:26.440 | don't ever really wanna have the conversation.
02:35:28.040 | They're not being honest.
02:35:28.880 | Like I'm a guy that like does his own research,
02:35:32.000 | and it's so boring reading studies,
02:35:34.360 | and a lot of it I can only do abstracts,
02:35:36.200 | and like it's so much work,
02:35:38.400 | but I'll never ever say that about myself.
02:35:40.280 | I'm a guy that does his own research
02:35:41.400 | because every time I hear somebody say that,
02:35:43.320 | they don't do any research.
02:35:44.460 | When they say they do their own research,
02:35:45.540 | what they mean is they've seen one podcast,
02:35:46.900 | and their opinion on it is completely--
02:35:47.740 | - What podcast is that?
02:35:48.900 | - Definitely not mine, 'cause if it was mine,
02:35:50.220 | I wouldn't be criticizing anything they say.
02:35:52.700 | But yeah, so like lab leak is another one,
02:35:54.180 | where it's like, well, how do you know it's lab leak?
02:35:56.100 | How do I know it's lab leak?
02:35:57.600 | Because Fauci lied, and Hunter Biden's laptop,
02:36:00.580 | and it's like, okay, come on,
02:36:01.420 | you haven't engaged with it at all.
02:36:02.300 | There's really interesting research
02:36:03.940 | that shows there's a really strong study
02:36:05.540 | that shows that there's like a high degree of certainty
02:36:08.180 | that it came from the wet markets.
02:36:10.060 | Very, very high degree of certainty.
02:36:11.820 | And there was an article that came out recently
02:36:13.320 | where it's like, Senate concludes that virus
02:36:15.600 | actually came from the Wuhan virology lab or whatever.
02:36:18.880 | And that whole article, if you actually read it,
02:36:20.800 | it never says that in the article.
02:36:21.960 | I don't know why they tweeted it with that headline.
02:36:24.480 | But yeah, to back up, I'm sorry.
02:36:26.480 | I think we should have good--
02:36:27.920 | - You should be sorry.
02:36:28.920 | - Yeah, I'm not sorry, actually.
02:36:31.000 | I get to ramble here, okay?
02:36:32.360 | I'm here for a long time.
02:36:33.420 | I rescind my apology, okay?
02:36:35.360 | I actually rescind my apology.
02:36:36.860 | We should be able to have
02:36:38.720 | challenging conversations about things,
02:36:39.600 | but you gotta, man, be well read on both sides.
02:36:41.440 | Not this like, I do my own research,
02:36:43.380 | so I don't believe anything that Fauci says.
02:36:44.860 | Like, come on, dude.
02:36:45.700 | Dude, you can do better than that.
02:36:46.700 | Not you personally, but.
02:36:48.060 | - Gotcha.
02:36:49.620 | How does that feel?
02:36:50.720 | - Feels great.
02:36:52.820 | - So for people who don't know--
02:36:53.640 | - Feel understood.
02:36:54.480 | - That's the catchphrase.
02:36:55.740 | Gotcha.
02:36:56.580 | Through all tragedy and triumph,
02:36:58.980 | through all the rollercoaster of life,
02:37:01.420 | your response to it is gotcha.
02:37:03.180 | It's, well, actually, let me jump to that
02:37:06.940 | before I continue with political discourse.
02:37:11.900 | Psychologically, you are in a lot of heated debates,
02:37:15.580 | and you're usually super calm under fire until you're not.
02:37:18.380 | Sometimes you lose your temper completely.
02:37:20.580 | - Very rarely, but.
02:37:21.860 | - That's like your opinion, man.
02:37:23.580 | Let me ask about your psychology.
02:37:24.980 | What are psychologically your strengths and weaknesses
02:37:27.460 | that you're self-aware about?
02:37:29.100 | - I think I'm very nonjudgmental,
02:37:30.440 | so I can entertain a lot of different thoughts
02:37:32.620 | without agreeing with them or condoning them.
02:37:34.740 | I think that's a really big benefit to me.
02:37:37.540 | For whatever reason, I seem to be pretty calm
02:37:40.960 | in dealing with annoying people.
02:37:43.440 | It's why I got promoted at the casino so fast.
02:37:45.080 | I could deal with drunks or whatever.
02:37:46.840 | It just didn't affect me that much.
02:37:48.360 | - What percent of the population is annoying?
02:37:51.080 | - Depends on how you're engaging with them.
02:37:53.700 | Most people aren't really annoying ever.
02:37:55.520 | - That's what I mean.
02:37:56.360 | - But if you're doing political debate,
02:37:57.960 | what percentage is annoying?
02:37:59.160 | I guess it depends on who I'm debating
02:38:00.280 | and what the topic is.
02:38:01.120 | - Well, I guess I'm trying to point out the fact
02:38:03.400 | that sometimes you can say that reveals something about you
02:38:08.020 | if you think a large percent of people are annoying.
02:38:10.260 | - Well, I would say working graveyard shift
02:38:11.780 | when alcohol is involved,
02:38:12.940 | that percentage of people goes very, very, very high.
02:38:15.580 | Or to be more fair, actually,
02:38:16.860 | it's not a high percentage, truly,
02:38:18.540 | but if you're a server,
02:38:19.820 | one bad customer can ruin the rest of your shift.
02:38:22.340 | So you only need one or two people acting in that manner
02:38:24.580 | to just totally throw you off.
02:38:26.260 | - And you're able to, at least these days,
02:38:29.380 | not allow that one customer to throw you off,
02:38:32.460 | quote, unquote.
02:38:33.300 | - It's pretty much like a,
02:38:34.240 | I noticed this especially after having a son,
02:38:36.440 | there's something about six-year-old kids or whatever
02:38:39.200 | where it's like, if they get mad,
02:38:40.720 | they're never gonna be mad for that long.
02:38:41.960 | They'll move on.
02:38:42.800 | That's my mentality.
02:38:43.640 | I'm a six-year-old kid.
02:38:44.840 | I might be mad about something,
02:38:45.680 | but I'll get over it in 30 minutes or an hour.
02:38:47.600 | I'm pretty good about not carrying that through.
02:38:50.240 | It's very rare that I'll hold a grudge against anybody
02:38:52.760 | or be angry about something
02:38:54.680 | or really disaffected by something over the long term.
02:38:56.800 | That almost never happens to me.
02:38:58.480 | - What are your weaknesses psychologically, could you say?
02:39:01.240 | - I still have a problem with projecting
02:39:04.420 | my mind onto others.
02:39:05.260 | I think we all probably do.
02:39:06.620 | It's like, if I understand this and I've said this,
02:39:08.620 | you should understand it.
02:39:09.500 | And if you're not, you're dumb.
02:39:11.080 | That's an issue that I,
02:39:12.540 | I still have that where I project too much.
02:39:14.660 | - What about holding grudges and stuff like that?
02:39:16.760 | - I never hold grudges.
02:39:17.860 | I'm the least grudgy person ever.
02:39:19.500 | It's kind of a meme in my community
02:39:20.540 | 'cause anybody can always come back
02:39:22.980 | as long as they're acting different.
02:39:24.100 | - What about the,
02:39:24.940 | (laughs)
02:39:25.860 | as long as they're acting different.
02:39:27.220 | - As long as they're acting different.
02:39:28.060 | - See, I mean, all right.
02:39:29.040 | - The reason why I say that is because,
02:39:30.460 | so for instance, nobody likes this,
02:39:31.980 | but I have a strong stance on apologies
02:39:33.740 | and that I hate them.
02:39:34.580 | I don't ever want to hear an apology.
02:39:35.620 | I don't care about them ever.
02:39:36.900 | They don't mean anything to me.
02:39:38.040 | If you did something bad,
02:39:39.340 | as long as you've fixed the behavior
02:39:40.880 | and you're not doing that thing,
02:39:41.780 | then we're generally chill.
02:39:42.740 | So there's been a lot of people
02:39:43.580 | that have been involved in weird stuff with me,
02:39:44.840 | but then they go off to do their thing
02:39:46.620 | and they come back and it's like, okay, cool.
02:39:48.180 | As long as you don't do it again, we're fine.
02:39:49.580 | It's all good.
02:39:50.420 | - I'm sorry you feel that way.
02:39:53.060 | It's not your fault, Steven.
02:39:55.380 | It's not your fault.
02:39:56.340 | - Okay, gotcha.
02:39:57.580 | - You've said plenty of negative stuff.
02:40:00.180 | Positive stuff and negative stuff about Hassan.
02:40:03.180 | This is my podcast.
02:40:04.340 | I get to force you to say positive things.
02:40:07.020 | What do you love?
02:40:08.620 | I'm all about love.
02:40:10.260 | - Let's go back to grilling me on the R word stuff.
02:40:12.140 | You're gonna make me compliment Hassan?
02:40:13.540 | This is gonna be a harder conversation than that.
02:40:15.340 | - All right.
02:40:16.180 | We're gonna get you to feel emotions.
02:40:19.960 | So he's, for people who don't know,
02:40:22.620 | he's another popular political streamer.
02:40:25.100 | I think you had, as the kids call it,
02:40:27.860 | a bridge burning over Bernie Sanders.
02:40:30.140 | I don't know.
02:40:30.980 | My research is very limited on this.
02:40:32.620 | But what do you respect and love most about Hassan?
02:40:37.300 | - He puts in a lot of work.
02:40:38.820 | When he was growing his stream
02:40:40.260 | from 2,000 concurrent viewers to 15,000,
02:40:42.420 | he was streaming, it was like 12 hours a day,
02:40:44.780 | like every single day.
02:40:46.860 | So that was admirable.
02:40:47.700 | He did a lot of work.
02:40:49.100 | He does seem to be pretty good at networking
02:40:51.100 | and socializing and making the correct friends
02:40:53.260 | and connections to continue to build his business.
02:40:55.020 | - What about him as a political thinker?
02:40:56.900 | I know you don't think highly of him on that regard,
02:40:59.780 | but I think that's unfair.
02:41:01.300 | - Oh, man.
02:41:02.140 | - I think that's unfair.
02:41:02.960 | I honestly wanna push back on that because--
02:41:04.940 | - Okay.
02:41:05.860 | I have zero respect for him as a political thinker.
02:41:08.740 | Or there's not gonna be almost anything.
02:41:10.420 | So you can't-- - Oh, I will say,
02:41:11.620 | I admire the fact that through no actual capability
02:41:16.300 | or ability of his own, he manages to wind up
02:41:18.260 | at some of the correct answers
02:41:19.380 | just 'cause he's toeing the line.
02:41:20.380 | So good job for him on that.
02:41:22.260 | He's got a lot of correct opinions,
02:41:23.460 | just he has no idea why, so.
02:41:25.140 | - I think that's undeserved.
02:41:26.220 | I think that's too harsh, man.
02:41:27.420 | - Okay.
02:41:28.260 | - The reason I bring that up is I feel like
02:41:29.420 | there is a deep grudge in there somehow.
02:41:33.020 | So you're the father now, so since you're so old,
02:41:36.660 | the grandfather of the political debate on stream,
02:41:41.220 | on livestream political debater.
02:41:43.280 | So there could be some grudge about that split
02:41:47.740 | that happened or not enough credit given
02:41:49.820 | or all that kind of stuff.
02:41:50.900 | I just think he's somebody that has a left-leaning ideology
02:41:56.340 | that's different than yours.
02:41:57.180 | He was a Bernie supporter, right?
02:41:58.740 | And I guess you were not.
02:42:00.620 | Can you explain to me what the division is?
02:42:02.980 | - He exemplifies everything that I absolutely hate
02:42:05.540 | about politics. - Which is what?
02:42:07.100 | - Which is shallow engagement,
02:42:09.700 | heavily ideologically driven.
02:42:11.820 | - And you're not ideologically different, right?
02:42:13.580 | - Absolutely not.
02:42:14.660 | Free of ideology. - That's what we're talking
02:42:16.420 | about, like the free thinker in the real meaning
02:42:18.700 | of that word.
02:42:19.540 | - Yeah, so the way, let me qualify--
02:42:20.860 | - It's your body, it's your thinking.
02:42:21.900 | - Let me qualify what I mean when I say that.
02:42:24.220 | I spent a lot of time, unfortunate time,
02:42:26.220 | delving into the boring world of philosophy.
02:42:28.100 | I spent a lot of time thinking about
02:42:30.020 | what are my ethical positions?
02:42:31.700 | How do I feel about myself, the people around me,
02:42:34.820 | and how that relates to the world around me?
02:42:36.300 | And then from all of these positions,
02:42:38.020 | I think you might have used the phrase
02:42:39.060 | first principles earlier.
02:42:40.860 | From these kind of first principles,
02:42:42.660 | out of that is where all of my political positions
02:42:45.180 | are built out of, like full stop.
02:42:47.300 | So if you ask me a question like,
02:42:48.780 | how do you feel about the right to own a firearm
02:42:50.940 | or how do you feel about social healthcare?
02:42:52.740 | We can walk through, well, this is how I feel about it
02:42:55.100 | as like a thing from the government.
02:42:56.660 | This is where the government gets its power.
02:42:58.360 | This is ethically how groups of people
02:43:00.140 | are supposed to function.
02:43:01.200 | This is morally how we relate to each other.
02:43:02.980 | And personally, this is how I feel about it.
02:43:04.180 | Like I'll be able to do every single
02:43:05.220 | political belief back there.
02:43:06.460 | It's not like I'm telling you,
02:43:07.860 | like if I were to ask Hasan,
02:43:10.180 | what do you feel about this political topic?
02:43:11.640 | He's gonna tell me what progressives are supposed to say.
02:43:13.700 | I don't know what he thinks about it.
02:43:14.580 | I don't know if he thinks about it.
02:43:15.420 | - Don't you think that's a cynical take?
02:43:16.460 | Why is he, just because his views coincide
02:43:19.740 | with the mainstream narrative,
02:43:21.580 | mainstream viewpoints of progressive thinkers,
02:43:24.220 | I mean, why does that mean he's not thinking?
02:43:27.500 | - Because his engagement with every subject
02:43:29.180 | is incredibly shallow, 100% predictable.
02:43:32.060 | Like I could write like a,
02:43:33.940 | I could probably program a script
02:43:35.420 | to like give you every single potential answer
02:43:37.260 | he could have to any single question you could give him.
02:43:39.260 | - Again, I think that's pretty cynical take.
02:43:41.220 | - Okay, it could be the case that his brain
02:43:42.740 | perfectly aligns with every single mainstream--
02:43:44.340 | - No, but I don't know if you know it perfectly aligns,
02:43:47.060 | 'cause I think you're just taking a very select,
02:43:48.900 | just like streamers do of each other,
02:43:51.020 | a very select slice that represents the perfect alignment
02:43:55.940 | as opposed to looking at a person struggling with ideas
02:43:58.340 | and thinking through ideas and then giving them a pass
02:44:01.060 | like a lot of people, like I give you a pass
02:44:03.820 | on just the fact that you say a lot of crazy shit
02:44:07.460 | on stream for drama, like which is--
02:44:09.780 | - I don't say things for drama.
02:44:11.140 | It might be dramatic, but.
02:44:13.100 | - I mean, you've evolved as a fish evolves legs.
02:44:17.020 | You've evolved a mechanism which creates controversy.
02:44:20.540 | - Sure.
02:44:21.380 | - That you could say it's not intention, but it happens.
02:44:24.100 | I think the extremists kind of learn that kind of thing.
02:44:27.420 | And so I'm sure Hasan does the same kind of stuff.
02:44:30.380 | And so like underneath it, there's still a thinking being
02:44:33.180 | that's contending with political ideas.
02:44:36.300 | You don't think so.
02:44:37.140 | - He does a really good job of hiding it.
02:44:38.620 | There are other political figures that I really don't like
02:44:41.180 | that I wouldn't say the same thing about.
02:44:42.300 | So like, I don't know if you have Vaush written in there.
02:44:44.060 | I'm like, okay, that's a person that--
02:44:45.660 | - That's Vaush.
02:44:46.500 | - That also split out of my community
02:44:48.020 | and grew up to something.
02:44:48.860 | And now he hates me and he has an anti-fan community
02:44:50.740 | and they all hate him.
02:44:51.580 | - Okay, tell me something you love about Vaush.
02:44:53.140 | - I can tell you a lot of things about Vaush.
02:44:54.620 | I think Vaush legitimately thinks through a lot
02:44:56.340 | of his political positions.
02:44:58.020 | I admire or did admire that he has like his own like
02:45:01.280 | positions that he would take sometimes contrary
02:45:02.700 | to people further left than him.
02:45:04.460 | He's got some positions that don't fit his ideology
02:45:06.660 | kind of at all.
02:45:07.500 | Like he's his own independent thinker.
02:45:08.620 | Rhetorically, he's very effective.
02:45:10.220 | He was willing to sit down and do research
02:45:12.300 | for like his debates and everything.
02:45:14.340 | He would spend a lot of time practicing
02:45:15.840 | like his rhetorical effectiveness
02:45:17.260 | and navigating conversations.
02:45:19.140 | He intentionally and purposefully built like a community
02:45:21.300 | that exemplified his values.
02:45:23.260 | Yeah, I've got a lot.
02:45:24.100 | I don't, we are completely split and hate each other now.
02:45:27.020 | But like I have a lot of--
02:45:27.860 | - Why, why, why?
02:45:28.900 | First of all, hate is a strong, why the hate?
02:45:31.580 | - Okay, I don't hate him, but he hates me
02:45:33.460 | because we had a couple of really big debates.
02:45:35.220 | - What happened?
02:45:36.060 | - Well, one had to do with whether or not
02:45:37.500 | you should live your values.
02:45:39.060 | - And can you give me the story
02:45:40.780 | that's a charitable interpretation?
02:45:42.420 | - I always give charitable interpretation.
02:45:43.660 | - You don't.
02:45:44.500 | - I absolutely do.
02:45:45.700 | - You don't.
02:45:46.540 | - Wait, name one time it happened.
02:45:47.640 | - Five minutes ago, you talking about Hasan.
02:45:49.540 | - Everything I said about Hasan is true.
02:45:50.740 | There is no steel man there, okay?
02:45:52.180 | - That's not charitable.
02:45:53.220 | (laughing)
02:45:54.380 | - I'm sorry, if you can prove me wrong,
02:45:56.500 | I would love for you to do it, okay?
02:45:57.860 | - I'm using my gut instinct.
02:45:59.540 | Usually when somebody feels strongly
02:46:02.340 | about another person in that way,
02:46:04.180 | it's not coming from a place of data and reason.
02:46:06.700 | It's coming from a place of emotion.
02:46:09.160 | It's coming from a place of resentment
02:46:11.580 | and grudge and all that kind of stuff.
02:46:13.140 | - Yeah, I understand.
02:46:14.140 | - There's emotions deep in there.
02:46:15.620 | So the gotcha is hiding,
02:46:17.380 | the gotcha is a surface of an iceberg
02:46:21.220 | and there's a deep ocean underneath
02:46:22.900 | that you yourself have not explored.
02:46:24.460 | - I disagree, but I understand what you're saying.
02:46:25.940 | - And love is actually a doorway,
02:46:27.740 | the young love is a doorway for you to explore
02:46:31.260 | the depths of your--
02:46:32.100 | - Yeah, get into that ocean to find my fish,
02:46:33.260 | my pre-evolved form.
02:46:34.460 | - Yeah.
02:46:35.420 | - I understand why you think the way you do,
02:46:36.780 | and you should, you shouldn't believe me.
02:46:38.260 | And I understand that,
02:46:39.520 | because if somebody told me the same thing,
02:46:40.780 | I'd think you probably just really don't like this person
02:46:42.340 | for a reason or two.
02:46:43.300 | I understand why you think that way, okay?
02:46:45.060 | The reality is though,
02:46:46.060 | for any political person that I disagree with,
02:46:47.780 | like I can give them a fair shake.
02:46:49.140 | It's one of the few things I think I do
02:46:50.820 | exceedingly well on my stream.
02:46:52.300 | Even with Hasan, there's been drama
02:46:53.580 | that he's been involved in,
02:46:54.660 | and I've like very, when I'm involved in drama,
02:46:56.780 | he'll always throw me under the bus.
02:46:57.980 | But when he's involved in stuff, I'll always say like,
02:46:59.460 | "Oh, like I think Hasan was right here,"
02:47:01.140 | or "I think that he meant this."
02:47:03.300 | There was a thing that came up once we're on,
02:47:05.740 | LivestreamFail, he was getting roasted
02:47:07.060 | because he referred to somebody,
02:47:08.820 | he used the expression "shit skin"
02:47:10.740 | to refer to somebody's, like the way they looked.
02:47:13.260 | And I have only ever heard that
02:47:15.820 | in the context of 4chan people
02:47:17.140 | talking about like Indians or like black people.
02:47:19.540 | Like it's a racial thing.
02:47:21.100 | But I could tell the context
02:47:23.300 | and everything that he was saying.
02:47:24.620 | He was insulting some guy,
02:47:25.900 | I think it was kind of like an incel virgin or whatever.
02:47:27.220 | He was going for like acne skin.
02:47:28.980 | I think that's what he meant when he said it.
02:47:30.300 | And there were a whole bunch of people
02:47:31.220 | that were insulting, like,
02:47:32.060 | "Oh my God, did he just say racist term?"
02:47:33.140 | I was like, "No, I don't think he was racist."
02:47:34.300 | I think he was like, he was just reaching for words,
02:47:35.860 | and that's what came out.
02:47:37.460 | So like that's an example of me being charitable.
02:47:39.820 | - Okay, but didn't you criticize him for something?
02:47:41.940 | I was trying to, I like Googled
02:47:43.820 | why the hell you guys split up,
02:47:45.140 | 'cause I thought your friends,
02:47:46.300 | you should be like-
02:47:47.140 | - It's a little bit of a Kamala Harris video,
02:47:48.620 | but go ahead, what were you-
02:47:49.460 | - Is that why, is that the,
02:47:51.620 | I feel like you criticized him over something.
02:47:54.620 | And I, okay, this is very vague memory,
02:47:56.700 | but you criticized him over something,
02:47:58.660 | and I felt that criticism wasn't charitable.
02:48:00.980 | - Was it Pete Buttigieg's stuff?
02:48:02.220 | - Yeah, Pete Buttigieg, yes, yes, yes, yes, yes.
02:48:03.940 | - Yeah, so I've said this a million times,
02:48:06.060 | but no amount of context or no amount of nuances
02:48:08.320 | is ever acceptable to people.
02:48:09.460 | I don't think Hasan is homophobic,
02:48:11.980 | but I think the comments made about Pete Buttigieg
02:48:14.580 | were really homophobic.
02:48:15.420 | - That's what he said, right?
02:48:16.380 | - Yeah, and there were a lot of people
02:48:18.260 | making a lot of comments that made me really uncomfortable
02:48:21.660 | about Pete Buttigieg, that was insane to me.
02:48:24.900 | - Spurred by the comments of Hasan?
02:48:27.580 | - No, but it was an environment of progressives,
02:48:30.340 | all the progressives were attacking Pete,
02:48:32.460 | and I felt like his gayness became like the subject
02:48:36.300 | of a lot of attack. - Yeah, but why,
02:48:37.140 | why throw Hasan under the bus for that?
02:48:38.940 | - 'Cause he was jumping along
02:48:39.980 | with all of those types of insults.
02:48:42.180 | - You don't think you've done the same kind of stuff?
02:48:44.020 | - If I do, call me on it,
02:48:44.860 | and I'll probably say I shouldn't have done it.
02:48:46.180 | - That's what the R word was about.
02:48:47.780 | - That's fine, that's a good call-out.
02:48:49.180 | - No, but your friend, you should privately tell him,
02:48:51.660 | right, like, hey-- - Well, no, by then
02:48:53.220 | we were sworn enemies, so.
02:48:54.740 | - So that wasn't the reason you--
02:48:55.860 | - No, no, no, it was over a Kamala Harris video.
02:48:57.380 | - Sworn enemies. - Yeah.
02:48:59.060 | - He hates me, what am I supposed, listen,
02:49:01.140 | for all of these people, I will accept them
02:49:02.820 | back into my life if they ever wanna come back in
02:49:04.540 | at any point in time, but usually they're the ones
02:49:06.020 | that are saying like-- - If they correct
02:49:07.060 | themselves, right?
02:49:08.740 | - No, I'm not expecting anybody to,
02:49:10.540 | so here's the deal with Vaush and Hasan,
02:49:11.860 | these are like the three, we're the three guys online,
02:49:14.420 | none of us will talk to each other.
02:49:16.220 | Hasan because he won't give clout to anybody,
02:49:17.660 | and Vaush because he thinks I'm bad faith,
02:49:20.380 | and then neither of them will talk to me
02:49:21.540 | because they both hate me.
02:49:22.740 | - You guys should go on a camping trip together.
02:49:25.100 | It's like Brokeback Mountain, but three-way,
02:49:27.340 | and just like rejoin, re-find--
02:49:29.420 | - Yeah, that could be a thing.
02:49:30.260 | - Re-find the patience for each other.
02:49:31.180 | Honestly, just from the internet perspective,
02:49:32.820 | for me as just stepping into this world,
02:49:35.720 | there's some aspect to which you have a responsibility,
02:49:40.440 | I hate that word.
02:49:41.720 | - Good word, don't run from it.
02:49:43.000 | - You have an opportunity, I wish you guys
02:49:45.240 | would kind of be the beacon of forgiveness
02:49:48.760 | and friendship and camaraderie and that kind of stuff.
02:49:50.600 | - Yeah, I agree, and even if we disagree,
02:49:52.280 | it would be really good content for us to argue.
02:49:53.880 | - Yeah, like shit talk, like friends shit talk,
02:49:56.200 | versus not, like the fact that you guys
02:49:58.120 | don't talk to each other.
02:50:00.780 | Like I would love for you to shit talk publicly,
02:50:03.480 | with a camaraderie always there.
02:50:04.840 | Like there's love in the beginning, love in the end,
02:50:07.400 | but you beat the shit out of each other in the middle,
02:50:09.200 | and that's what live streaming is for,
02:50:10.880 | the political discourse, that's great political discourse.
02:50:14.200 | Versus, I think what underlies it,
02:50:16.680 | some jealousy and so on, with this,
02:50:19.180 | you get this many followers.
02:50:20.680 | (laughing)
02:50:22.400 | - I just wanna make sure you're clear to your audience.
02:50:23.360 | - Everybody has flaws, to your audience.
02:50:26.200 | I'm sure you have flaws, and I'm just not,
02:50:28.720 | in this dynamic--
02:50:29.560 | - Hard to find, you know, 'cause I'm--
02:50:30.760 | - Your only flaw is you're too modest, yeah.
02:50:33.520 | - So why did you guys split up?
02:50:34.880 | Because I would love it, honestly,
02:50:36.520 | just let me just put that idea out there,
02:50:38.560 | for you guys to make up.
02:50:41.200 | - Yeah, it's out there, of course,
02:50:42.280 | as everybody talks, me, Vaush, and Hasan.
02:50:44.000 | It's crazy that like the three largest,
02:50:45.480 | like political debate, left leaning people online,
02:50:48.600 | like can't do any type of content or collaboration at all.
02:50:51.520 | It's so stupid.
02:50:52.600 | - Yeah, it's strange.
02:50:53.440 | What was the reason you guys split up, the Kamala Harris?
02:50:55.880 | - So, Hasan's entry into kind of like
02:50:58.240 | the Twitch political debate world was in, I think 2018.
02:51:01.800 | And I think he did a debate with Charlie Kirk.
02:51:03.960 | And he reached out to me to kind of like review that debate,
02:51:06.960 | to like go over it on stream.
02:51:08.920 | And he came on, we went over it,
02:51:10.280 | and then we kind of, a friendship developed.
02:51:11.880 | We hung out in real life.
02:51:12.800 | I think when I came to LA, I think I slept on his couch,
02:51:14.720 | we played with his dog, we were like kind of friends.
02:51:17.200 | And as time went on, I think he was a little bit more,
02:51:21.960 | he was farther left than he let on.
02:51:25.120 | So like I was a social Democrat, he was a social Democrat.
02:51:27.480 | But back in those days, like 2018,
02:51:29.520 | when people said they were a social Democrat,
02:51:30.880 | they really meant socialist,
02:51:31.760 | but they just didn't wanna say it.
02:51:32.760 | So he was farther left than me.
02:51:33.840 | And we had a lot of deep divides
02:51:36.280 | in our approach to politics.
02:51:38.440 | Whereas like I was very much like a first principles,
02:51:41.000 | this is my whole political position.
02:51:42.360 | And he was very much kind of like a,
02:51:44.200 | this is like the political ideology I'm involved in.
02:51:46.200 | And this is kind of like the field
02:51:47.200 | that I kind of like navigate in.
02:51:48.680 | So there were a couple instances where these divides
02:51:51.360 | would be laid very bare.
02:51:53.080 | One was when, it was either him or the Young Turks.
02:51:56.680 | I think it was him.
02:51:58.400 | There was a shooting in a neighborhood
02:52:00.080 | where very young black child gets killed by a white shooter.
02:52:03.640 | And they did a video about like hate crimes
02:52:06.000 | and how hate crimes are on the rise
02:52:07.720 | between races and white people are evil and blah, blah, blah.
02:52:09.960 | Not that, but like white people committing hate crimes
02:52:11.720 | against black people.
02:52:12.800 | And I remember saying to him, I was like,
02:52:14.760 | "Hey, we don't have all the data yet for this.
02:52:17.560 | "It feels really bad to make videos about this beforehand
02:52:20.560 | "'cause it's the same type of shit
02:52:21.600 | "that happens at airports.
02:52:23.620 | "Is there a thing going on?
02:52:24.640 | "Was it a brown person?
02:52:25.520 | "Are they Muslim?
02:52:26.360 | "It's Islamic extremism."
02:52:27.360 | We see this played out so many times.
02:52:29.160 | In recent history, probably not a good idea
02:52:31.080 | to jump to conclusions.
02:52:32.120 | And he's like, "Well, no, you don't understand.
02:52:33.560 | "It's not that big a deal, whatever."
02:52:34.920 | And obviously as the story goes,
02:52:36.400 | tale as old as time, the data comes out.
02:52:38.160 | It was just an errant shot.
02:52:39.640 | There was like gang violence,
02:52:40.800 | shot goes out of nowhere, hits a kid in the car.
02:52:42.480 | It wasn't like a hate crime.
02:52:43.400 | The guy wasn't trying to kill the kid.
02:52:44.520 | But yeah, we basically, we bump up against a few
02:52:46.920 | kind of political disagreements like this.
02:52:49.160 | And an annoying thing is happening in my community
02:52:51.320 | where Hasan is like the serious political figure
02:52:53.360 | 'cause he's from the Young Turks.
02:52:54.520 | And I'm just kind of like, I do politics, but I also game.
02:52:56.880 | And anytime I criticize Hasan, people like Destiny,
02:52:59.280 | you need to be more respectful.
02:53:00.240 | He does this full time.
02:53:01.360 | If you're gonna bring criticisms,
02:53:02.400 | you need to be like really well read and researched
02:53:04.000 | because he's got a more serious, whatever,
02:53:06.200 | which I thought was ridiculous.
02:53:07.040 | - So by the way, if people don't know,
02:53:08.580 | he worked at the Young Turks, which is--
02:53:11.240 | - Like the largest left-leaning YouTube channel probably.
02:53:13.080 | - At least at the time, yeah.
02:53:14.520 | - So finally, he did a video on,
02:53:18.320 | we skip ahead to some more minor disagreements.
02:53:19.920 | He does a video on Kamala Harris.
02:53:21.400 | He calls it "Copmala Harris."
02:53:22.880 | And it's like seven or eight horrible things
02:53:24.640 | about Kamala Harris.
02:53:25.480 | And I'm like, okay, I know at least one or two
02:53:27.240 | of these things are not fully accurate.
02:53:28.560 | So I'm gonna do all the research.
02:53:30.520 | I'm gonna have all the sources
02:53:31.560 | and we're gonna have a long conversation about it
02:53:32.920 | so that now when I provide criticism to him,
02:53:34.980 | it's not gonna be like this horrible,
02:53:37.080 | like just me saying something flippantly or whatever.
02:53:39.280 | It's gonna be like substantial criticism.
02:53:41.080 | So I was on a plane ride, JFK to Arlanda, whatever,
02:53:45.960 | flying to Sweden to visit my wife.
02:53:48.020 | And on the plane, I review all of the video,
02:53:50.920 | all the data, do all the research and I write everything.
02:53:52.680 | It's like, okay, I get to my wife's dad's house
02:53:57.680 | and I'm at the table.
02:53:59.560 | We're having a conversation like,
02:54:00.560 | hey, we should talk about the Kamala Harris stuff.
02:54:03.600 | And he's like, okay, well, let's do it.
02:54:05.000 | And we go over it and I'll leave to the audience
02:54:09.000 | to watch the video.
02:54:10.080 | Enough people have said this,
02:54:10.900 | I feel pretty confident in saying this.
02:54:12.080 | I was pretty reasonable, pretty measured,
02:54:13.920 | pretty calm the whole time.
02:54:15.040 | And I think he started to get increasingly irritated
02:54:17.180 | that I was levying like more and more serious criticisms
02:54:20.200 | at like the quality of work that he did.
02:54:22.440 | Probably because he felt a little bit intimidated,
02:54:24.260 | I think by my willingness to like dive
02:54:26.440 | through political stuff.
02:54:27.680 | There'd been a couple of awkward blowups where like on,
02:54:30.280 | there's like a show called the Rajj Royale
02:54:32.220 | where sometimes politics would come up
02:54:34.400 | and Hasan would kind of try to explain something.
02:54:36.240 | And there was another person one time on the show
02:54:37.720 | that made the joke.
02:54:38.560 | It was like, instead of Hasan taking 10 minutes
02:54:39.840 | to explain this, can Destiny just come here
02:54:41.160 | and explain it in 30 seconds?
02:54:42.320 | And he like exploded at that.
02:54:43.440 | He got so fucking mad at that.
02:54:46.040 | So yeah, I think that when I made that kind of call out
02:54:48.100 | or critique of him over the Kamala Harris stuff,
02:54:49.960 | he's probably feeling like increasingly irritated,
02:54:51.960 | threatened, agitated.
02:54:52.920 | And then that's kind of what began the huge split of ours.
02:54:56.000 | - So you don't think you were a dick at all?
02:54:58.080 | - I don't think so in that conversation.
02:55:00.000 | Especially given that like at that point,
02:55:02.240 | 'cause this is still 2018 or 20, this might be 2019.
02:55:06.000 | I'm still known at that point as being very aggressive
02:55:09.280 | towards conservatives or alt-righters.
02:55:11.240 | - Oh, gotcha.
02:55:12.080 | - Yeah, so, with lefties, as I call them,
02:55:14.000 | I think I'm being like very gentle.
02:55:15.680 | Like my conversation I'll start with conservatives
02:55:17.240 | like you're a fuck idiot, you're so dumb.
02:55:18.920 | Like that's how I'm like doing.
02:55:19.960 | So like with him, I'm like,
02:55:20.920 | well, don't you think that like,
02:55:22.000 | this is like a little bit of like an inconsistent
02:55:24.160 | presentation about like, I feel like I'm being nice.
02:55:26.000 | But I always leave to the audience,
02:55:27.200 | they can go and watch that Kamala video,
02:55:29.160 | Kamala Harris video, "Destiny Hasan,"
02:55:30.560 | if they think that I was being a dick.
02:55:31.920 | But a lot of people watching said
02:55:32.840 | I was being pretty gentle.
02:55:33.840 | - Well, let me say as a new fan of this space,
02:55:37.320 | I hope you guys make up and I hope you guys fight it out
02:55:41.080 | in the space of discourse and ideas.
02:55:43.040 | - Me too.
02:55:43.880 | - And also with empathy, understanding what the strength
02:55:45.680 | of the other person is, what their buttons are.
02:55:48.120 | And there's like an unspoken rule
02:55:51.240 | that you don't press the buttons that you need to,
02:55:54.720 | unless you're doing it mutually and it's fun,
02:55:57.160 | 'cause it's fun to piss each other off.
02:55:58.880 | So that's kind of like what friends do,
02:56:01.000 | you don't cross a certain line.
02:56:03.080 | But then other than that, you fight it out.
02:56:05.400 | Okay, let's step back.
02:56:06.840 | One other super interesting aspect of your worldview
02:56:10.240 | is you're a big supporter of Biden.
02:56:13.560 | Can you explain what you love about Biden?
02:56:15.560 | Do you love Biden more than Sean Carroll or less?
02:56:18.480 | - Sean Carroll is just like in another world of admiration.
02:56:22.640 | - I feel like I'm culturally appropriating you
02:56:24.960 | by saying gotcha now, but it's so convenient.
02:56:27.840 | - It's an easy word, you're just,
02:56:29.520 | we're on the same wavelength, okay?
02:56:30.800 | We're synchronizing, that's good.
02:56:32.280 | - I mean, it is really interesting
02:56:33.640 | because even the people that support Biden
02:56:35.400 | usually don't say they love,
02:56:37.560 | sort of they don't support it strongly.
02:56:39.720 | - Ideologically, philosophically,
02:56:41.920 | the reason why I like Biden is 'cause he's really committed
02:56:44.600 | to this bringing the left and right together,
02:56:46.640 | which is something we so desperately need in the country.
02:56:49.360 | And his statements over and over again of like,
02:56:52.480 | I'm not the Democrat president or the Republican president,
02:56:54.960 | I'm the president of the United States.
02:56:56.520 | His desire to bring Republicans together
02:56:58.920 | to work on things like the infrastructure bill,
02:57:01.440 | that's so incredibly needed.
02:57:03.560 | And I have a huge amount of respect and admiration for him
02:57:06.520 | for trying to push through on that message.
02:57:08.560 | - Do you think then it's unfortunate
02:57:10.400 | that he made that comment about MAGA?
02:57:12.400 | - MAGA Republicans?
02:57:13.240 | - Yeah, I mean, I forget what the comment was,
02:57:15.160 | but MAGA Republicans are not good people kind of thing.
02:57:18.440 | - I watched the full video and he's right.
02:57:21.680 | There is this toxic aspect and it's hard to call out
02:57:24.440 | because they're always gonna spin it like,
02:57:25.680 | oh, he hates all Republicans, he's not.
02:57:26.880 | If you watch the quote, he's very specifically calling out
02:57:29.400 | like this group of people
02:57:31.000 | that think that the election was fraudulent.
02:57:33.120 | - Is it clear that's what he meant by--
02:57:35.600 | - We can bring it up.
02:57:36.960 | - All right, this is--
02:57:37.800 | - Oh no, uh-oh.
02:57:39.200 | But I remember watching it on stream,
02:57:40.040 | it was like, if he said it, yeah, that's bad.
02:57:41.360 | You can probably like YouTube MAGA Republicans Biden.
02:57:43.920 | But like, it feels like it's pretty clear
02:57:44.920 | he's talking about the people that are like,
02:57:46.120 | election denying.
02:57:47.840 | - Too much of what's happening in our country today
02:57:50.400 | is not normal.
02:57:52.360 | Donald Trump and the MAGA Republicans
02:57:56.600 | represent an extremism that threatens
02:57:59.200 | the very foundations of our republic.
02:58:02.240 | Now I wanna be very clear.
02:58:05.400 | - Listen to this part. - Very clear up front.
02:58:08.960 | - Not every Republican, not even the majority of Republicans
02:58:11.760 | are MAGA Republicans.
02:58:13.200 | Not every Republican embraces their extreme ideology.
02:58:18.040 | I know, 'cause I've been able to work
02:58:20.640 | with these mainstream Republicans.
02:58:22.600 | But there's no question that the Republican Party today
02:58:27.000 | is dominated, driven, and intimidated
02:58:30.320 | by Donald Trump and the MAGA Republicans.
02:58:32.760 | And that is a threat to this country.
02:58:36.640 | - I disagree with that, man.
02:58:37.600 | He didn't clearly say extremist ideology.
02:58:41.680 | He didn't say the people that doubt
02:58:44.200 | the validity of the election.
02:58:45.560 | - I mean, that's Donald Trump.
02:58:47.040 | - No, but there's--
02:58:48.400 | - That's all the candidates that Donald Trump is supporting.
02:58:50.360 | How many, what is it, like 40, 50,
02:58:51.720 | how many candidates right now that are MAGA candidates
02:58:54.160 | are election deniers?
02:58:55.040 | - No, but there's 80 million or whatever people
02:58:58.120 | voted for Donald Trump.
02:58:59.560 | You could say that's the MAGA Republicans.
02:59:02.320 | So to me, it sounded like he was referring
02:59:06.320 | to not even the majority.
02:59:08.520 | I mean, that's one nice, helpful, clarifying statement.
02:59:12.840 | But it's basically there's the mainstream Republicans,
02:59:15.160 | and then there's those that voted for Donald Trump.
02:59:17.380 | That's the way I heard it.
02:59:18.640 | - Okay.
02:59:19.480 | - And it's like, so--
02:59:20.320 | - Maybe he should've done a better job at clarifying, but.
02:59:22.160 | - Yeah, I--
02:59:23.000 | - I feel like there's like a clear,
02:59:24.040 | there is a huge problem with this group of Americans
02:59:27.280 | that think that the election is stolen.
02:59:28.520 | I feel like that's what he's trying to call out.
02:59:29.360 | - No matter if that's what he meant,
02:59:33.760 | even flirting with that line is not a person
02:59:37.080 | who's bringing people together.
02:59:39.180 | - I feel like the extending a hand to the,
02:59:41.200 | like most, I've worked with Republicans in Congress,
02:59:43.560 | not even a majority of Republicans are like this.
02:59:46.800 | - No, but why say not the majority of Republicans
02:59:49.240 | are like this?
02:59:50.060 | Say like we are, like we're one country.
02:59:53.640 | We believe the same thing.
02:59:54.520 | So like focus on the uniting part versus saying--
02:59:57.280 | - He does before and after.
02:59:58.120 | That was 50 seconds, okay?
02:59:59.080 | - But that, you never,
03:00:00.800 | the point is you never say something like that.
03:00:02.600 | Listen, like that, you've spoken about the Bosnia speech,
03:00:05.560 | which is your favorite of his.
03:00:07.080 | - Yeah.
03:00:07.920 | - I went back to it and listened to it.
03:00:09.200 | - Before I move to that, just on this,
03:00:10.560 | it's really hard for him to call out that group
03:00:12.800 | of like election deniers, I think,
03:00:14.120 | without it always feeling like--
03:00:14.960 | - Well, why call them out?
03:00:16.640 | - Because it's arguably one of the most destructive forces
03:00:19.240 | that exist in this country today.
03:00:21.120 | - Did it destroy anything?
03:00:22.920 | - They were trying to.
03:00:24.280 | - Did it, though?
03:00:25.940 | It didn't, did it?
03:00:27.280 | - So does that mean we don't call it out?
03:00:28.600 | We wait till next time?
03:00:29.640 | - Because calling it out is giving fuel to the division.
03:00:34.200 | Like the people that doubted the validity of the election,
03:00:38.080 | that's anger, that's frustration with the other side.
03:00:40.880 | You heal that as opposed to saying all those people
03:00:44.160 | that believed that at any time are idiots.
03:00:47.640 | They're un-American.
03:00:49.140 | - I mean, they don't think the election was real.
03:00:50.720 | I don't know if Biden has the ears of these people at all.
03:00:52.940 | I don't know what he can do for--
03:00:55.020 | - There's people that believe the same thing in 2016
03:00:57.960 | with the Russian hacking, right?
03:00:59.640 | - Hold on.
03:01:01.360 | - Yes.
03:01:02.600 | - That is a super not fair comparison.
03:01:04.500 | There were definitely, the mainstream Democrat opinion
03:01:08.940 | was that Russian intrusion in terms of like social media
03:01:13.880 | and stuff happened, but there was never a claim
03:01:15.880 | that like the election was stolen.
03:01:17.480 | No main, or at least I don't know of any mainstream Democrat
03:01:19.880 | that supported that.
03:01:20.760 | Donald Trump is not just saying
03:01:22.000 | there was interference, blah, blah, blah.
03:01:23.200 | Donald Trump is literally saying
03:01:24.320 | the election was literally stolen.
03:01:26.360 | Vote boxes were, ballot boxes were hidden,
03:01:28.720 | that vote tallies were manipulated.
03:01:30.640 | I think the claim is there's a huge gulf
03:01:32.640 | of difference between the two.
03:01:33.960 | - So you can attack Donald Trump for that.
03:01:36.480 | - Yeah.
03:01:37.520 | - I believe it's not the words of a uniter
03:01:40.280 | to attack people that believe that.
03:01:43.940 | You could argue maybe it's okay,
03:01:47.200 | but especially not being super clear about that,
03:01:49.980 | about who you're referring to when you say MAGA Republicans.
03:01:52.920 | 'Cause MAGA is a hat and a slogan
03:01:57.920 | that refers to whatever the number is,
03:02:02.720 | 70 million people, whoever, that voted for Donald Trump.
03:02:05.840 | Like--
03:02:06.680 | - Of all the Republicans that consider themselves
03:02:07.520 | MAGA Republicans, what percentage of them do you think
03:02:09.940 | believe the election was stolen?
03:02:12.320 | I feel like that number is, I don't have the poll,
03:02:13.640 | but I feel like that number is like probably more than 70%.
03:02:17.080 | - What's a MAGA Republican?
03:02:18.680 | Maybe I'm not familiar--
03:02:19.520 | - Like a Trump supporting Republican, a MAGA Republican.
03:02:21.480 | They're there for Trump.
03:02:23.200 | - What's the difference between somebody
03:02:24.320 | that voted for Trump and a--
03:02:26.120 | - MAGA Republican?
03:02:26.960 | - And a MAGA Republican.
03:02:27.880 | - So my mom is a MAGA Republican.
03:02:30.680 | If Trump ran independently and DeSantis ran
03:02:33.480 | under the Republican ticket, my mom would vote for Trump.
03:02:36.080 | She'll follow him to the end of the earth.
03:02:37.520 | That's like a MAGA Republican.
03:02:38.880 | - I think it's easy to mistake that distinction
03:02:42.160 | in these kinds of political speeches.
03:02:44.080 | 'Cause to me, anybody who voted for Trump
03:02:46.700 | can easily in the context of the speech
03:02:49.140 | be interpreted as a MAGA Republican.
03:02:51.800 | - Gotcha.
03:02:52.640 | I understand what you're saying.
03:02:56.080 | Maybe he could have been more clear,
03:02:56.960 | but I think in listening to that,
03:02:58.560 | I think it's pretty obvious who he's talking about.
03:03:01.420 | But I guess if you have an emotional response to it,
03:03:03.360 | I can understand the emotional response.
03:03:04.440 | But there's a lot of people that--
03:03:05.280 | - I don't have an emotional response.
03:03:06.640 | I just don't like, I think I'm with, what is it?
03:03:09.880 | Michelle Obama, they go low, we go high.
03:03:13.320 | Meaning like, to me, a uniter doesn't participate
03:03:17.480 | in derision.
03:03:18.480 | - Sure, a uniter might not,
03:03:19.420 | but a leader has to be able to accurately assess
03:03:21.680 | the situation before him
03:03:23.080 | and make people aware of what's going on.
03:03:24.240 | - You mean all the impeachment trials,
03:03:26.120 | all the censoring from social media,
03:03:28.640 | all of that didn't do the job?
03:03:30.360 | - That's not his job.
03:03:31.480 | I don't know about censoring any of that.
03:03:32.320 | - No, but that mechanism,
03:03:33.480 | his job is to inspire a nation, to unite a nation.
03:03:36.760 | - How can he do that when half the people
03:03:37.840 | don't believe that he was even legitimately elected?
03:03:40.200 | Like, I think he's done a good job
03:03:41.080 | at working on legislation and doing stuff
03:03:42.800 | that hopefully benefits all Americans.
03:03:44.360 | But I think it's important to recognize
03:03:45.820 | that there is a contingent of Americans
03:03:47.680 | that don't even believe, like, this is really crazy.
03:03:49.760 | - There are plenty of people that recognize that
03:03:51.840 | and are fighting that and are constantly
03:03:53.360 | screaming that from the rooftops.
03:03:55.200 | His job is to be the inspiring figure
03:03:59.520 | that makes the majority of Americans be proud
03:04:02.440 | for him to be a president of the nation they love.
03:04:05.920 | And that's what the uniting aspect is,
03:04:08.680 | is you remind people that we are one
03:04:12.120 | and we love this country,
03:04:13.160 | we love the ideas that it represents.
03:04:14.840 | - He does that in other parts of that speech.
03:04:16.200 | It's like a 20 minute speech, isn't it?
03:04:17.480 | But that's a fuck up.
03:04:19.080 | You just don't participate in that division.
03:04:21.560 | Anyway, I understand, I understand.
03:04:24.460 | I just wanted to push back on the saying,
03:04:26.240 | one of his strengths is that he's uniting.
03:04:30.520 | But yes, that is an ideal, that is a goal, is a great one.
03:04:35.520 | And he is one that espoused that goal for a long time.
03:04:40.120 | Do you think, what else?
03:04:42.320 | So from a policy perspective and so on.
03:04:44.520 | - I thought the way he's handled Ukraine and everything
03:04:45.920 | thus far has been almost perfect.
03:04:47.600 | I think he did a really good job.
03:04:49.280 | And at the political maneuvering
03:04:51.080 | of bringing other countries into the fold,
03:04:52.640 | at establishing clearly what our mission was
03:04:54.720 | in relation to Ukraine, I thought he did a good job there.
03:04:57.240 | I admire him for pulling out of Afghanistan.
03:04:59.040 | Even if it was a little bit rough around the edges,
03:05:01.080 | we got out and we're gone, no American lives were lost.
03:05:03.880 | The domestic policy, he's passed more major legislation
03:05:08.000 | than I think anybody thought possible.
03:05:10.280 | The green energy stuff with the last bill,
03:05:11.880 | the infrastructure bill.
03:05:13.720 | A lot of the coronavirus relief I thought was really good,
03:05:15.640 | especially the expansion of the child tax credit.
03:05:18.160 | So from a policy perspective, foreign and domestic,
03:05:20.520 | I think he's been successful.
03:05:21.640 | Rhetorically, I think he's generally been above board
03:05:24.240 | in terms of not attacking people or being too divisive.
03:05:26.800 | He's trying to bring people together and work on them.
03:05:28.840 | - What do you think about the sort of popular in the media
03:05:32.240 | criticism of his mental decline?
03:05:34.340 | Do you think he's experiencing mental decline?
03:05:35.180 | - You know, he's an old guy.
03:05:37.360 | - But do you think, I mean, do you?
03:05:40.040 | - Yeah, maybe a little bit,
03:05:41.240 | but he's still doing a good job, so you know.
03:05:43.320 | - Not from a speech perspective,
03:05:44.520 | you mean from a policy perspective?
03:05:45.720 | - Yeah, I'm analyzing it as a job, yeah.
03:05:47.440 | From a speech perspective, maybe not the greatest,
03:05:49.400 | but yeah, I mean, he's definitely, what is he, like 80, 81?
03:05:51.800 | How old is he?
03:05:52.640 | - I lose track after so many years.
03:05:54.360 | - Yeah.
03:05:55.200 | - But you did say that he's probably going to run in 2024
03:05:59.600 | and he's probably going to win.
03:06:01.160 | - Did I say that, that he's probably gonna win?
03:06:02.680 | No way did I say that.
03:06:03.520 | - I heard that somewhere.
03:06:04.340 | - He's probably gonna run.
03:06:05.180 | - Okay.
03:06:06.020 | - Who knows who will win?
03:06:06.840 | But I think, I feel like the incumbent advantage
03:06:08.880 | is so strong.
03:06:09.720 | Are you really gonna throw that away?
03:06:10.540 | Like, there's been like one or two times in history
03:06:12.240 | in the US, right, where like the non-incumbent,
03:06:14.440 | the parties put somebody else up?
03:06:16.200 | - Yeah, I mean, the concern is like the,
03:06:18.960 | just the age and the mental decline,
03:06:21.980 | just the wear and tear of the campaign,
03:06:26.040 | all of that kind of stuff.
03:06:26.880 | All of the speech you have to make,
03:06:28.120 | the debates and all that kind of stuff.
03:06:29.600 | - Yeah, I guess we'll see what happens.
03:06:31.040 | (laughing)
03:06:32.480 | What?
03:06:33.320 | - The least excited.
03:06:34.920 | - I mean, two years from now is a long time.
03:06:36.560 | At his current mental state, he could run and do it.
03:06:39.920 | He could do a passable job.
03:06:40.840 | In two years, man, I don't know.
03:06:42.520 | I've seen videos of Bill Clinton recently.
03:06:43.880 | He's looking pretty rough.
03:06:45.180 | You know, if Biden is looking a lot more rough,
03:06:48.280 | worse for wear in two years,
03:06:49.360 | then maybe they actually do have to dig out another person
03:06:53.520 | for running, who knows?
03:06:54.680 | - What do you think about Trump?
03:06:58.920 | When he won in 2016, I think is when you came to fruition,
03:07:03.920 | politically speaking.
03:07:05.720 | So what do you think his winning
03:07:07.320 | the 2016 election represents?
03:07:11.320 | - So for me, the reason why I got into politics
03:07:16.160 | was Trump was like this new epistemic force
03:07:20.160 | in American politics that you kind of have to flirt
03:07:24.540 | with facts before, even if you wanted to be non-factual.
03:07:27.040 | He super didn't care.
03:07:28.880 | Lying was like a first language to him,
03:07:30.560 | just like in speaking in terms of like,
03:07:32.600 | the way that he used language to just say to you
03:07:35.420 | what he felt like you needed to hear to support him
03:07:37.920 | and not care at all about what is going on,
03:07:40.400 | about, yeah, that's what Trump represented to me
03:07:43.840 | in terms of like things that I cared about.
03:07:46.240 | He also represents a lot more, obviously,
03:07:47.680 | that there was this undercurrent of American opinion
03:07:49.760 | that a lot of people didn't know still existed,
03:07:51.720 | and it did, he got elected.
03:07:53.720 | That the Overton window was misidentified
03:07:55.420 | by even a large amount of the Republican Party.
03:07:57.480 | That populism was a lot more popular
03:07:59.280 | than a lot of people figured, you know?
03:08:01.280 | Yeah, there's a lot that I guess he represented.
03:08:03.080 | - Do you think Trump should have been banned from Twitter?
03:08:06.320 | Can you make the case for and against it?
03:08:08.040 | So you're a big supporter of free speech.
03:08:10.240 | - Yeah, so the case in favor of it.
03:08:13.080 | - Do you think he should be brought back as Elon tweeted?
03:08:16.800 | - Yeah, because if he gets brought back,
03:08:17.640 | there's a higher chance that I'll be brought back.
03:08:19.120 | So I'm supporting that all the way.
03:08:20.680 | Thank you, Elon, unban my account.
03:08:22.680 | - So because you called me weak spine,
03:08:26.640 | I'm gonna have to message Elon.
03:08:28.200 | - Okay, @OmniDestiny, it was a verified Twitter account.
03:08:30.760 | - OmniDestiny. - No, no, I'm just kidding.
03:08:32.840 | - Why'd you get banned from Twitter, Destiny?
03:08:34.560 | - I don't know. - I'll add that to Elon.
03:08:37.240 | - I saw that there was a screenshot of you
03:08:40.400 | referring to the rape of somebody.
03:08:43.960 | - Okay, that was on an older Twitter account,
03:08:45.640 | and that was a bad tweet.
03:08:46.680 | - You have multiple Twitter accounts,
03:08:48.520 | so you're trying to go around the bans
03:08:51.360 | that you keep getting.
03:08:52.360 | - Okay, hold on.
03:08:53.520 | You're slandering me a lot right now.
03:08:55.440 | Let's get the facts straight, okay?
03:08:57.000 | I don't even remember why my first account got banned,
03:08:59.000 | but it was a wild account.
03:09:00.280 | I tweeted some wildly inappropriate things.
03:09:02.400 | - You regret? - I don't like that word.
03:09:03.880 | I'm gonna give the answer that most people give.
03:09:05.080 | It's like, I don't regret it 'cause I learned a lot.
03:09:06.760 | So I'm glad I had the bad experiences that I did.
03:09:08.360 | - Why don't you like the word regret?
03:09:10.120 | - I think if we look at where we are,
03:09:13.720 | how do you feel about determinism?
03:09:15.640 | (Lex laughing)
03:09:18.080 | I believe in the hardest of determinism.
03:09:20.320 | That's who I am, okay?
03:09:21.520 | So who I am today is the culmination
03:09:24.120 | of everything that's occurred in the past.
03:09:25.400 | - But I believe you speaking, sorry to interrupt.
03:09:28.240 | I believe in you speaking about regret
03:09:31.360 | is a nice way to communicate
03:09:33.800 | that in this deterministic world,
03:09:35.780 | you've analyzed the acts of the past
03:09:38.160 | and you're no longer that person.
03:09:39.800 | - Yeah, of course, for sure.
03:09:40.880 | - That's what regret usually means.
03:09:43.120 | - Okay, thanks for giving me the human explanation.
03:09:44.800 | Okay, true.
03:09:45.640 | So in that sense, there's a lot of things I've done
03:09:46.880 | that I regret.
03:09:48.000 | - Oh, what are you?
03:09:48.920 | You're not human, you're a bot?
03:09:50.440 | - NPC is my preferred term. - Okay, all right.
03:09:54.040 | - I wish I would have been smart enough at the time
03:09:56.600 | to not have to have had made those mistakes.
03:09:58.840 | - Okay. - There you go.
03:09:59.680 | - Good job.
03:10:00.760 | - But yeah, obviously really dumb,
03:10:02.240 | really crazy off the wall tweets.
03:10:04.400 | But that account got banned.
03:10:05.240 | And then I made another account called,
03:10:08.840 | I can't believe I'm giving you a history
03:10:09.680 | of my Twitter accounts,
03:10:10.520 | but another account called Omni Destiny.
03:10:11.600 | - It's an honor.
03:10:12.440 | - And that was my, I got verified.
03:10:14.560 | I was cool, they let me have that account
03:10:16.280 | 'cause originally they banned it and I said appeal
03:10:17.760 | and I was like, oh, let me have one more.
03:10:18.960 | And back then Twitter was cool
03:10:20.240 | and they're like, okay, go for it.
03:10:21.540 | And that account lasted for a long time.
03:10:23.040 | And I don't actually know 100% why that account got banned.
03:10:27.580 | I believe that the tweet that showed up in the final,
03:10:31.400 | I got banned for hate speech.
03:10:33.360 | And it was because I was,
03:10:34.920 | there was a picture that I tweeted
03:10:36.480 | with three different alt-righters
03:10:38.140 | that are kind of like neo-Nazi people.
03:10:39.520 | And they were all like mixed race people.
03:10:42.020 | And I said like the new alt right looks
03:10:43.360 | like a Disney Channel original movie
03:10:45.200 | in terms of racial composition.
03:10:46.520 | And somehow they got flagged for instigating violence
03:10:50.200 | against minorities I think.
03:10:51.480 | And I think that's the tweet that got me banned
03:10:53.280 | 'cause I think that's what showed up in the final report.
03:10:54.880 | But I don't know, maybe there were other reasons
03:10:56.480 | 'cause nobody ever communicates.
03:10:57.460 | But ever since that account went under,
03:10:59.520 | it's just been ban evading ever since so.
03:11:02.160 | - Oh, ban evading ever since.
03:11:04.120 | - So all my new accounts that I've got banned
03:11:05.240 | just get banned 'cause they finally figured out it's me
03:11:06.800 | and then I ban evade.
03:11:07.640 | There's like one dude at Twitter HQ
03:11:08.920 | who's like constantly looking for my new accounts
03:11:12.160 | and they get me, yeah.
03:11:13.200 | - Yeah.
03:11:14.240 | - Anyway, yeah.
03:11:15.400 | - So post.
03:11:17.200 | - Post Trump world.
03:11:19.080 | - Do you think, okay, I mean this.
03:11:20.560 | - Oh, should he be banned?
03:11:21.400 | Oh, you asked me to make both cases.
03:11:23.120 | Should he be banned?
03:11:25.200 | I mean, damn dude, when you're tweeting out shit
03:11:26.920 | that's arguably leading to stuff like January 6th,
03:11:29.060 | I can understand why.
03:11:30.780 | Because it's like, what else is this wild dude
03:11:32.320 | gonna tweet out?
03:11:33.160 | Like is he gonna start instigating other violent events?
03:11:34.760 | So I'm sympathetic towards the like,
03:11:35.960 | okay, well he can't just be here saying stuff like this.
03:11:38.360 | That's insane, we're gonna ban him.
03:11:40.120 | I'm sympathetic.
03:11:40.960 | - Because it's instigating actual physical violence
03:11:44.320 | in the physical world.
03:11:45.560 | - Yeah, like if I were to tweet stuff like that,
03:11:46.960 | I would get banned probably.
03:11:48.720 | On the flip side, this is the President of the United States.
03:11:51.400 | It seems like he's like doing presidential decree
03:11:53.560 | by social media sometimes.
03:11:55.200 | Like is it really right that one public or private,
03:11:58.120 | I should say, one private company can like erase
03:12:01.620 | the President of the United States' words
03:12:03.220 | from the eyes of a lot of Americans
03:12:04.740 | that are using these social media feeds?
03:12:06.940 | - And one big one, which I for sure am against,
03:12:10.860 | is the permanent ban.
03:12:12.660 | - Yeah, I don't like that, I hate that.
03:12:13.960 | Even in my community, if somebody comes back
03:12:15.620 | after like a year, like I mean--
03:12:17.580 | - Did you just compare yourself
03:12:18.580 | to the President of the United States?
03:12:19.940 | - No, I compare myself to Twitter
03:12:20.820 | banning the President of the United States.
03:12:22.140 | Let me put it this way, if I ban Donald Trump
03:12:23.660 | in my chat room, I'd unban him in a year.
03:12:26.100 | - A year? - Yeah.
03:12:27.300 | - What's the process for unbanning Donald Trump?
03:12:29.520 | What would he have to do?
03:12:31.000 | - Usually people send me an email,
03:12:32.120 | and they're like, "Listen, I did this stuff.
03:12:33.420 | "I'm sorry I was dumb.
03:12:34.460 | "I'll give him another chance."
03:12:35.680 | - But a year, what if they send an email a month later?
03:12:38.620 | - Usually I'll unban him.
03:12:39.740 | That's usually my policy.
03:12:40.580 | I ban pretty quickly in my community,
03:12:41.620 | but if you ever ask me to come back--
03:12:42.460 | - You're a big softie.
03:12:43.460 | - Yeah, I usually let him back, yeah.
03:12:45.080 | Well, because I used to be the worst type
03:12:47.160 | of internet person, and I think I'm a little bit better
03:12:49.600 | than I used to be, so.
03:12:50.440 | - Now that you're older.
03:12:51.340 | - Yeah, now that I've matured, yeah, of course.
03:12:53.680 | Age bestows a wisdom that just can't be gotten any other way.
03:12:56.660 | - What's your sense in general?
03:12:57.860 | Is there something interesting you could say
03:12:59.460 | about your view on free speech?
03:13:01.240 | It seems like one of those terms
03:13:02.500 | that's also overused to mean a lot of different things.
03:13:05.780 | What does it mean to you?
03:13:06.860 | - If you have a democratic style of governance,
03:13:09.980 | you are entrusting people with one of the most awesome
03:13:13.540 | and radical of responsibilities,
03:13:15.180 | and that's saying that you're going to pick the people
03:13:17.220 | that are gonna make some of the hardest decisions
03:13:18.820 | in all of human history.
03:13:20.060 | If you're gonna trust people to vote correctly,
03:13:22.780 | you have to be able to trust them
03:13:24.040 | to have open and honest dialogue with each other.
03:13:25.860 | Whether that's Nazis or KKK people or whoever talking,
03:13:29.560 | you have to believe that your people
03:13:31.540 | are going to be able to rise above
03:13:32.920 | and make the correct determinations
03:13:34.420 | when they hear these types of speeches.
03:13:36.180 | And if you're so worried that somebody's gonna hear
03:13:38.840 | a certain political figure,
03:13:39.940 | and they're gonna be completely radicalized instantly,
03:13:42.100 | then what that tells me is that you don't have enough faith
03:13:44.200 | in humans for democracy to be a viable institution,
03:13:46.820 | which is fine.
03:13:47.800 | You can be anti-democratic,
03:13:49.140 | but I don't think you can be pro-democracy
03:13:50.980 | and anti-free speech.
03:13:52.720 | Within reason.
03:13:53.900 | - So what's the within reason?
03:13:55.420 | - So I mean, you can't post like child porn
03:13:57.300 | or something on Twitter
03:13:58.140 | or people try to get you on that stuff.
03:13:59.460 | Or like direct calls to violence are probably not,
03:14:01.540 | you shouldn't be tweeting out like,
03:14:02.380 | "We're gonna meet up tomorrow and go bomb, blah, blah, blah."
03:14:03.940 | Probably not.
03:14:04.780 | - So do you think it's okay to allow racism
03:14:06.300 | and antisemitism and hate speech?
03:14:09.700 | - Hate speech, yes,
03:14:12.660 | because that can be very broadly defined.
03:14:15.300 | I can understand there being some basic rules
03:14:17.700 | of like no slurs on like a platform
03:14:20.300 | that gets into like acceptable forms of moderation
03:14:22.580 | or like excessive harassment and bullying,
03:14:24.440 | I can understand.
03:14:25.860 | But past that, when the moderation becomes ideological,
03:14:29.940 | I get a little bit nervous
03:14:31.080 | because there's a whole other host.
03:14:33.900 | - Yeah, of course it's all a gray area,
03:14:36.080 | but when it feels like ideology
03:14:38.460 | has seeped into the censorship, not good.
03:14:40.880 | Yeah, which it's so fascinating to think,
03:14:43.880 | especially now that Elon bought Twitter,
03:14:45.640 | how do you engineer a system
03:14:48.600 | that prevents ideology from seeping in
03:14:52.200 | and nevertheless is able to create a platform
03:14:55.600 | that has healthy conversations?
03:14:57.480 | 'Cause if you have one guy
03:14:58.360 | who's just screaming nonsense nonstop,
03:15:00.680 | it has this effect where the quiet voices
03:15:04.340 | at the back of the room are silenced.
03:15:06.480 | So like, that's what you usually don't talk about.
03:15:09.040 | Like if you let one annoying, loud person in,
03:15:12.200 | that's actually censoring the voice of a lot of people
03:15:14.360 | that would like to speak, but they don't get a chance.
03:15:16.420 | - That's one of the things,
03:15:17.260 | especially around like trans discourse,
03:15:18.920 | I have to constantly do that like reminder for my audience
03:15:21.800 | is like when I'm dealing with these types of people
03:15:23.560 | on the internet, a lot of them might seem really crazy.
03:15:25.520 | A lot of these types of people might seem insane,
03:15:27.040 | but like in the real world,
03:15:28.160 | outside of like the crazy Twitter activist world,
03:15:30.400 | like the vast majority of people you're meeting
03:15:32.400 | from LGBT communities are like the coolest,
03:15:34.400 | normalest people.
03:15:35.280 | All they want is the like right to live their life
03:15:37.080 | in the way they want to and to be like unobstructed
03:15:39.040 | and like, yeah.
03:15:40.040 | But people will get this impression
03:15:41.440 | of like an online activist,
03:15:42.760 | like a vegan or LGBT person or whatever.
03:15:45.320 | And then they think that every single person
03:15:46.720 | in real life is like that
03:15:47.640 | and it's a really negative stereotype.
03:15:49.920 | And then even the other people in that group.
03:15:52.360 | - Oh, is Melina coming over?
03:15:53.720 | - Oh yeah, I asked her, I don't know if that's her.
03:15:55.400 | - Okay, Melina just joined us.
03:15:57.400 | What were we talking about?
03:15:58.760 | Was it interesting?
03:15:59.680 | - You were saying that you were gonna talk to Elon
03:16:02.400 | about getting @OmniDestiny,
03:16:04.080 | the verified Twitter account unbanned.
03:16:05.360 | I said, that's so-
03:16:06.180 | - That sounds like a lie.
03:16:07.020 | - That's so gracious of you.
03:16:07.840 | I can't even believe you would do that for me.
03:16:09.640 | - And then you admitted that you tried to evade the ban
03:16:12.120 | multiple times, which I'm sure would be very looked upon.
03:16:15.320 | - You know, I heard that in Norway,
03:16:16.600 | in their prison system,
03:16:17.520 | they don't actually punish you for trying to escape jail
03:16:19.320 | 'cause that's like the natural human thing to do.
03:16:20.880 | - They hug you?
03:16:21.720 | What do they do?
03:16:22.540 | - I don't know if they, but they don't punish you
03:16:23.380 | 'cause of course you're trying to be free.
03:16:24.480 | That's all I'm trying to be on Twitter.
03:16:25.680 | I'm just trying to be free.
03:16:26.520 | - Oh, that's the natural humanistic-
03:16:27.960 | - Yeah, that's the natural, of course, it's the banning of it.
03:16:30.120 | - You're not a destructive force, you're just-
03:16:31.560 | - No, I'm a force for good.
03:16:33.200 | That's why all my accounts only get banned
03:16:34.280 | for banning and evading.
03:16:35.120 | I don't get banned for doing bad things.
03:16:36.320 | And I'm a progressive show.
03:16:37.440 | I'm like far left.
03:16:38.740 | I love like progressive causes.
03:16:40.840 | - I thought this is what you criticized Hasan for being.
03:16:43.760 | - I show them from a place of first principles,
03:16:45.640 | not from a mindless AI echoing kind of thing.
03:16:48.640 | - Okay, so you're a free thinking bot.
03:16:50.840 | - Yeah, exactly.
03:16:51.920 | - All right, cool.
03:16:52.920 | Well, I'm sure we'll return to some politics.
03:16:54.560 | That was beautiful.
03:16:55.520 | Melina, can you tell us about yourself?
03:16:59.280 | You're also a fellow streamer.
03:17:00.760 | - Yes. - What's your story?
03:17:01.600 | - I stream and I started streaming
03:17:03.040 | because I met him basically, kind of,
03:17:05.400 | but I don't do the politics.
03:17:06.840 | I do like travels or talk about relationships,
03:17:10.000 | talk to my audience basically.
03:17:12.480 | - You're from that part of the world, right?
03:17:14.040 | Sweden? - Yeah, exactly.
03:17:14.880 | - So did you escape from prison and they didn't?
03:17:17.120 | - That was Norway.
03:17:17.960 | You just went to Norway.
03:17:19.160 | - It's different?
03:17:20.000 | - It's a different.
03:17:20.840 | - I actually really, I've been to Sweden a bunch of times.
03:17:22.840 | I love it.
03:17:23.680 | There's a tech sector there that's really like flourishing.
03:17:26.760 | - Where did you go?
03:17:27.680 | Which city?
03:17:28.520 | - I went to Stockholm.
03:17:29.440 | I think I gave a few lectures there.
03:17:30.860 | There's a vibrant tech sector, it was cool.
03:17:33.040 | And people are super nice.
03:17:34.920 | - Yeah, we're friendly.
03:17:36.640 | We're not like very deep.
03:17:37.960 | Like we don't really have much deep conversation.
03:17:39.720 | It's like a meta conversation.
03:17:40.720 | - Oh, there's not many intellectuals that come from Sweden?
03:17:43.480 | - We don't really speak very highly of ourselves.
03:17:45.120 | We're kind of like just chill all the time.
03:17:46.920 | We don't make a scene.
03:17:47.920 | We don't, we're just like, you know.
03:17:50.560 | - Do you know what the name for that is?
03:17:51.560 | There's a specific name for it.
03:17:52.880 | - Jantelagen.
03:17:53.840 | - Yeah, Jantelagen.
03:17:54.680 | - Yeah, Jantelagen, yeah.
03:17:55.840 | - Oh, there's a philosophy behind it.
03:17:57.320 | - When you're part of like Sweden or Norway,
03:17:58.920 | you don't talk too highly of yourself
03:18:00.720 | 'cause it's seen as kind of like rude.
03:18:01.960 | Like think of like America, except the exact opposite.
03:18:04.240 | - You don't even really wanna like,
03:18:05.160 | you don't wanna make yourself into a victim too much.
03:18:07.120 | You don't wanna be too much of anything.
03:18:08.440 | You're just like sticking to the group.
03:18:10.480 | Don't make big scene about yourself.
03:18:12.880 | - But that said, you came here and you were,
03:18:16.160 | you put yourself in front of a camera and became a streamer.
03:18:18.720 | - Yeah, do you understand how weird that is
03:18:19.880 | for my friends in Sweden?
03:18:21.840 | - Do you have anxiety?
03:18:22.680 | - I just didn't talk about myself
03:18:23.840 | and just like make a big deal about myself
03:18:25.640 | for hours every day.
03:18:26.480 | - Was that like terrifying?
03:18:27.560 | Did you have anxiety about that?
03:18:28.760 | - No, 'cause I don't see them,
03:18:30.000 | but then I come back and I'm like, ooh.
03:18:31.800 | (laughs)
03:18:32.640 | - Also, what do you feel like when you're actually streaming?
03:18:34.720 | You feel like you're just alone in a room?
03:18:37.080 | One-on-one type thing?
03:18:37.920 | - No, I see chat and I'm thinking,
03:18:40.040 | oh, they're like little fairies.
03:18:41.120 | They're not really real.
03:18:41.960 | They're just like out there.
03:18:42.780 | I don't know what they look like.
03:18:43.620 | I just see little names and they're just cute.
03:18:46.080 | Colors, you know?
03:18:46.920 | - You're talking to little fairies inside your head.
03:18:48.520 | - Yes, I do.
03:18:49.360 | - Is that how you feel about chat?
03:18:50.680 | - They're demons for me.
03:18:51.640 | - They're demons?
03:18:52.480 | Okay, mine are fairies.
03:18:54.080 | - Are they, so is chat a source of stress or happiness?
03:18:56.440 | Like, is there a--
03:18:57.280 | - No, for me, it's a source of happiness.
03:18:58.840 | I've been very intentional with like the construction
03:19:01.080 | of my community, so I'm really happy with where it's at.
03:19:03.360 | - How are you able to actually have deep political discourse
03:19:06.040 | while playing a video game at the same time?
03:19:08.240 | - I have a really good chat room
03:19:10.120 | in terms of like the way
03:19:10.960 | that people engage in conversations.
03:19:12.200 | Like, I was one of the earliest people
03:19:14.100 | to embrace the philosophy of like,
03:19:16.560 | I am in total control of what people watch me think,
03:19:19.620 | that like I have a high level of responsibility
03:19:22.120 | for how they conduct themselves,
03:19:23.520 | and that if I conduct myself in a certain way,
03:19:25.120 | I can expect a certain level of conduct from them.
03:19:26.880 | And for the most part, it's like worked pretty well
03:19:28.240 | for the past, you know, nine or 10 years, yeah.
03:19:30.920 | - What about the actual playing of the game?
03:19:32.760 | Like, you're able to parallelize the brain, like--
03:19:35.640 | - Oh.
03:19:36.480 | - Like it seems, like "Factorio" seems
03:19:37.840 | like a super complex game.
03:19:39.120 | - Yeah, I don't actually think that's possible.
03:19:40.560 | I don't think multitasking for a human brain is possible.
03:19:43.320 | If you see me playing a game,
03:19:44.680 | usually what's happening is the conversation is like,
03:19:47.020 | I've had it a million times, so I'm not thinking about it.
03:19:48.880 | I've automated that.
03:19:49.860 | Or if the conversation is very challenging,
03:19:52.360 | then if you watch me,
03:19:53.200 | if you really watch what's happening,
03:19:54.320 | I'm probably just running around in circles
03:19:55.360 | 'cause I have to think about the conversation.
03:19:56.800 | - Okay, because with "Factorio,"
03:19:58.320 | it looks like a lot of stuff is going on.
03:19:59.760 | - Sometimes, yeah.
03:20:00.600 | - So it's hard for a person who hasn't played the game
03:20:03.920 | to detect that you're not actually--
03:20:04.760 | - Does that come off as like,
03:20:05.680 | you're super intelligent and multitask?
03:20:07.360 | Or does it come off as like,
03:20:08.520 | he's not interested in this conversation at all?
03:20:10.520 | - Yeah.
03:20:11.360 | - Yeah, there's a coolness to it,
03:20:12.600 | like when you're not paying attention.
03:20:14.160 | Like, if you're looking elsewhere,
03:20:17.120 | like you're checking your phone,
03:20:18.180 | you're too cool for this conversation.
03:20:19.560 | There is a sense like that.
03:20:20.400 | - Yeah, the reality is though, is if you watch,
03:20:22.280 | it was easier to see in "Minecraft,"
03:20:23.760 | 'cause in "Minecraft,"
03:20:24.580 | when there was a challenging conversation,
03:20:25.640 | if you watch me play,
03:20:26.480 | I'm literally just running around and jumping in circles
03:20:28.240 | 'cause I have to think about the conversation 100%.
03:20:30.040 | I can't do a complicated task
03:20:31.280 | and think about the conversation.
03:20:32.780 | Or like, the people always joke in my chat,
03:20:34.420 | like, oh no, the notepad came out.
03:20:35.960 | If it's a really challenging conversation,
03:20:37.440 | I'll get rid of the game and I'll bring out a notepad
03:20:39.280 | and I'll start writing stuff down
03:20:40.200 | to keep track of what's going on, yeah.
03:20:42.160 | - So what kind of stuff do you stream?
03:20:44.080 | So advice, you talk about--
03:20:45.560 | - Yeah, like either I talk to chat
03:20:46.800 | or I travel around, basically.
03:20:48.920 | Like, have conversations or we like, go to countries.
03:20:52.960 | I've been to like Italy.
03:20:54.080 | I was in Italy for like one and a half months,
03:20:56.480 | just like traveling around alone,
03:20:58.320 | going to cities, like having like my camera with me
03:21:01.560 | and like streaming for hours.
03:21:03.740 | - Where's the coolest place you've been to?
03:21:06.080 | - Ever?
03:21:06.920 | It's probably New Zealand.
03:21:08.160 | - New Zealand?
03:21:09.000 | - I think so.
03:21:09.840 | After that, it's probably gonna be Italy, I think.
03:21:11.600 | Because I like history and yeah.
03:21:13.320 | - Oh, so both history, 'cause New Zealand is also beautiful.
03:21:16.640 | So it's both natural beauty and historical beauty.
03:21:19.760 | - Yeah, for sure.
03:21:21.120 | I think I just really like the Polynesian sort of culture.
03:21:23.840 | I think it's very interesting.
03:21:25.120 | Like the ocean people and it's just really beautiful.
03:21:27.720 | People are very relaxed, chill.
03:21:28.880 | They're very far away, which is interesting as well
03:21:31.800 | 'cause whenever they talk about politics
03:21:33.480 | or they talk about just like the world,
03:21:36.720 | it feels really far away.
03:21:38.560 | - So where's home for you?
03:21:39.680 | Is Austin home?
03:21:40.640 | - He's home for me.
03:21:43.160 | - So a human being is home?
03:21:45.080 | - Yeah.
03:21:45.920 | - We've lived in a lot of different places
03:21:47.000 | and traveled around a lot.
03:21:47.840 | So that's what you think of home is like humans?
03:21:50.160 | - I think so, yeah.
03:21:52.000 | I mean, if there's gonna be a place,
03:21:53.840 | it's probably gonna be like my childhood places probably.
03:21:57.200 | - Yeah.
03:21:58.040 | - Like my old country house or something like that.
03:21:59.560 | We don't have it anymore,
03:22:00.400 | but like that's like home for me, I guess.
03:22:03.240 | - So how'd you guys meet each other?
03:22:05.080 | You're currently married.
03:22:06.520 | - Yes.
03:22:07.520 | - To each other, yeah.
03:22:08.360 | - Yeah.
03:22:09.200 | - To each other.
03:22:10.020 | (laughing)
03:22:11.880 | - Just making sure we're on the same page.
03:22:13.560 | - All right, cool.
03:22:14.520 | How'd you guys meet?
03:22:15.880 | - I was watching his YouTube stuff like 2018, I think,
03:22:19.840 | like because it was the Swedish election around that time
03:22:21.940 | and I was interested in politics.
03:22:23.680 | And then I think he said in one of his videos
03:22:27.400 | that he had an Instagram
03:22:28.880 | and that he needed people to stop DMing him
03:22:31.360 | that wasn't PewDiePie.
03:22:32.640 | And then I messaged him and said,
03:22:34.080 | "Am I PewDiePie?"
03:22:35.360 | And then you replied in like two minutes.
03:22:38.080 | And then that's when I was in New Zealand.
03:22:41.360 | And I guess you wanted to escape America
03:22:44.440 | or like LA for a little bit and then flew to New Zealand.
03:22:48.240 | - Where were you mentally there?
03:22:50.680 | 'Cause we've talked through this timeline.
03:22:52.560 | Where's 2018?
03:22:54.120 | - Was it 18, 19?
03:22:55.800 | - Where was the low point?
03:22:57.520 | Or that was way earlier?
03:22:58.560 | - Low point, carpet cleaning, that was like 2010.
03:23:02.000 | - Oh, okay.
03:23:02.840 | - 2018 was probably your peak.
03:23:04.280 | - Every day now is my peak.
03:23:06.520 | What do you mean?
03:23:07.360 | That was my peak.
03:23:08.180 | Why would you say that?
03:23:09.020 | - You've been through that, okay?
03:23:09.840 | - Nobody ever admits being past their prime.
03:23:11.840 | Just so you know.
03:23:13.040 | - Well, I mean, my prime is still coming up.
03:23:15.000 | - It was probably around the time
03:23:16.240 | where you were getting a lot of lefties through your community
03:23:18.720 | and you were really like thinking about
03:23:20.240 | that they would go too far.
03:23:21.400 | - Maybe.
03:23:22.240 | I think that was still when Hasan and Vosh
03:23:23.480 | were both in my community.
03:23:24.440 | - Exactly.
03:23:25.280 | So I would say it feels like there was not really
03:23:27.360 | like much issues when it comes to your stuff
03:23:30.240 | or like your work stuff back then.
03:23:31.640 | - Oh, something we didn't talk about is that like,
03:23:33.520 | there were no politics on Twitch.
03:23:34.840 | I exclusively inhabited that place for like two years
03:23:37.600 | 'cause nobody else did it.
03:23:38.420 | 'Cause it was a really toxic environment for politics.
03:23:40.160 | So for a couple of years as it grew,
03:23:42.120 | like I kind of grew the whole space
03:23:43.800 | 'cause it wasn't, nobody was doing it yet.
03:23:44.960 | - What did that look like?
03:23:45.800 | You're having like political debates, political discourse.
03:23:49.840 | - Yeah, mainly like going into YouTube people
03:23:51.880 | to try to argue with them or just doing politics on stream,
03:23:54.080 | like reading stories, researching stuff,
03:23:55.560 | talking about stuff.
03:23:56.520 | But there's not like other people on Twitch
03:23:57.960 | to debate about politics 'cause there was no politics.
03:24:00.120 | It was, yeah.
03:24:01.000 | - Was there a debate in the space of communism,
03:24:04.000 | socialism, social Democrats, kind of like this?
03:24:07.360 | Are you trying to outline your own position
03:24:09.000 | during that time?
03:24:09.840 | - I think it was mainly me fighting against conservatives
03:24:11.960 | 'cause it was like Trump stuff.
03:24:13.200 | And then it was coming off the back of like,
03:24:14.860 | there was this movement called Gamergate
03:24:16.400 | and there was all this anti-SJW stuff on the internet.
03:24:18.760 | And I was like the SJW, like the progressive
03:24:21.000 | that was fighting on the progressive side of things.
03:24:22.920 | So I think that's what I was known for.
03:24:24.080 | But I was fighting with people off of Twitch
03:24:25.540 | 'cause on Twitch,
03:24:26.380 | there weren't very many political discussions happening.
03:24:28.240 | - So you were holding the SJW flag.
03:24:30.880 | - Yeah.
03:24:31.720 | - To what degree do you still hold it?
03:24:34.360 | Like what's the best, what's the steel man case for SJW?
03:24:37.240 | - I mean, like I'm still very much that SJW from 2018, 2019,
03:24:40.680 | but the positions have moved so much farther left
03:24:42.880 | that some people might not call me that anymore.
03:24:45.080 | I'm not sure.
03:24:45.900 | It depends on who I'm talking to.
03:24:46.740 | - So it's basically, what is social justice?
03:24:48.800 | Were you like being sensitive to the experience of others?
03:24:51.840 | - Yeah, being sensitive and empathetic
03:24:53.280 | towards the experience of others
03:24:54.220 | and then trying to build a better world
03:24:55.400 | that like suits as many different types of people
03:24:57.000 | as possible while being like aware of like their needs.
03:24:59.840 | - Okay.
03:25:00.680 | So you guys met, what's from your perspective?
03:25:04.200 | Is that, is she telling lies?
03:25:06.440 | Is it accurate?
03:25:07.280 | - No, it's pretty accurate.
03:25:08.880 | - Okay, when'd you guys actually meet?
03:25:11.080 | - I flew out in 2019.
03:25:14.000 | - 19, yeah, in like in February.
03:25:15.400 | - Yeah, basically there was like weird stuff happening in LA.
03:25:17.760 | I just come off of kind of a weird,
03:25:19.160 | not kind of sort of relationship.
03:25:21.080 | And I just wanted to like go away for a while.
03:25:22.960 | Another company reached out to me
03:25:24.360 | and they had like a fun streaming device.
03:25:25.820 | And they said they'd sponsor a trip if I went somewhere.
03:25:27.400 | And I was like, oh, well, I know this person.
03:25:28.600 | I know a couple of people in New Zealand.
03:25:30.120 | Melina's one of them.
03:25:30.960 | It's like, I'll go to New Zealand.
03:25:31.780 | New Zealand, it'll be fun.
03:25:33.000 | And yeah, I did that for two weeks.
03:25:34.600 | - Do you guys believe in love?
03:25:36.320 | I feel like you lack... the "gotcha" got us into this.
03:25:41.400 | I'm not sure to the degree to which you have human emotions.
03:25:44.000 | - I have quite a few.
03:25:45.200 | - Okay.
03:25:46.280 | From your perspective, when did you fall in love with Melina?
03:25:49.600 | - When did you fall in love with Meli Mel?
03:25:52.320 | - The minute I saw her.
03:25:53.600 | I don't know, our first two weeks together
03:25:57.120 | were a lot of fun.
03:25:57.960 | We had a lot of chemistry in person.
03:26:00.440 | - I was kind of shocked that I wasn't thinking about it.
03:26:02.560 | 'Cause it was like, we spent like a week together
03:26:04.440 | and you said, I really want to tell you something.
03:26:06.400 | And you were like stalling that for the longest time.
03:26:09.120 | - I think she was, oh, she said like, I love you?
03:26:12.200 | - No, he basically just said like,
03:26:14.400 | I really like you and it never really happens.
03:26:16.080 | That's what he said.
03:26:16.920 | And I was like, oh, and I thought, hey, I thought.
03:26:19.840 | - So let's still run, we said Trump getting banned
03:26:23.480 | from Twitter, is that what we were talking about before?
03:26:25.820 | - Oh yeah.
03:26:26.660 | - So you agreed to me coming on here.
03:26:28.060 | Of course I'm gonna be doing this to you.
03:26:30.400 | - So how long did that take, two weeks you said?
03:26:32.400 | - That took like a week.
03:26:33.240 | No, I don't know, I think it was just like.
03:26:34.700 | - Thing is my mind processes like information so quickly.
03:26:37.260 | Two weeks to somebody like you is actually like years for me.
03:26:40.260 | - Oh, like me, yeah.
03:26:41.660 | So there was like a lot of like Factorio-type
03:26:44.420 | of strategic thinking.
03:26:45.260 | - Yeah, going on.
03:26:46.260 | I was seeing like all the events,
03:26:48.220 | like Dr. Strange or whatever in the Avengers
03:26:50.460 | when he's like seeing into all the futures.
03:26:51.300 | - When you saw me, you just saw the future.
03:26:53.580 | - Yeah, I was looking at all of them, yeah.
03:26:55.300 | - So you're doing like some game theoretic simulation
03:26:57.660 | of all the possible outcomes.
03:26:58.940 | - Yeah, exactly.
03:26:59.980 | - Okay.
03:27:01.300 | - But no, yeah, it was probably pretty soon I realized
03:27:02.900 | that we had a lot of chemistry.
03:27:03.740 | I think before I left after my two weeks there,
03:27:05.900 | I was like, we need to make sure you get like a ticket
03:27:07.500 | to come visit me in the United States
03:27:08.700 | 'cause it'll be fun and everything.
03:27:09.540 | - And I kind of decided that last minute too.
03:27:11.860 | It was like really like five hours before your flight back.
03:27:14.620 | We kind of realized because it was kind of meant
03:27:17.100 | as just like a one time thing and then that was it.
03:27:19.540 | But we're like, oh no, this is a lot of fun.
03:27:20.860 | We should probably hang out again.
03:27:21.700 | - Oh, so you realized you would miss each other.
03:27:24.540 | - Yeah.
03:27:25.380 | - Yeah, yeah.
03:27:26.220 | - This was a one time thing.
03:27:27.040 | The melancholy side of love.
03:27:27.880 | Okay, when did you fall in love with Steven?
03:27:29.700 | - I thought he like hated me.
03:27:30.980 | I don't know, I thought, not hated me.
03:27:32.780 | - She still thinks I hate her.
03:27:33.620 | - But no, no, I remember like when he said
03:27:37.420 | that he really liked me, I was a little shocked about that
03:27:39.940 | 'cause I don't know, there was a lot of like random things
03:27:42.900 | happening in New Zealand.
03:27:43.740 | It was a lot of fun, but it was definitely
03:27:45.460 | like very interesting like things that happened
03:27:48.300 | because I was like around a lot of other people as well.
03:27:50.820 | So I thought he might've had like a really bad time.
03:27:53.500 | But when he said that, I was thinking about it more
03:27:55.300 | and then we spent like more time together
03:27:57.540 | like a week after that and then it felt like
03:27:59.460 | that was more like real.
03:28:00.780 | And I think when he was about to leave,
03:28:02.620 | I kind of realized like, no, I really like him.
03:28:04.660 | - Do you guys ever say love to each other?
03:28:07.460 | Like, I love you?
03:28:08.580 | - Yeah, of course, yeah.
03:28:09.420 | - Okay, all right.
03:28:10.660 | I wasn't sure.
03:28:11.500 | - Why would you ask that?
03:28:12.320 | What has he said before?
03:28:13.160 | - 'Cause I haven't, I don't think I've heard you speak.
03:28:16.140 | The only time I've heard Steven talk about love
03:28:18.220 | is when you're like criticizing the Red Pill community
03:28:21.460 | saying they don't ever talk about love in relationships.
03:28:24.220 | - Almost all the time I'm giving criticism to people.
03:28:26.260 | Like I said, I'm kind of stepping in.
03:28:27.580 | I'm very disconnected from my own emotional experience
03:28:29.460 | 'cause I'm trying to talk within there.
03:28:31.340 | So it's pretty rare that I'll talk about my--
03:28:32.700 | - What is your own emotional experience exactly?
03:28:35.300 | - Highly blunted, I guess.
03:28:36.620 | - There's a lot, okay.
03:28:37.460 | - What does that mean?
03:28:39.380 | - I mean, what's deep in there?
03:28:42.260 | Is this just who you are genetically
03:28:43.660 | or are you running from something?
03:28:45.540 | - I think I have a pretty good understanding of myself.
03:28:46.940 | A lot of people make that accusation to me,
03:28:48.460 | but I don't think I am.
03:28:49.620 | - Okay, this is just who you are?
03:28:51.220 | - It's just who I am, yeah.
03:28:52.300 | - Okay, this is not childhood stuff like trauma?
03:28:55.260 | - It's all sort of done.
03:28:57.420 | - You figured it all out?
03:28:58.460 | - Yeah.
03:28:59.300 | - In your old age?
03:29:00.300 | - As I grow every year, I figure out more and more.
03:29:02.220 | - He did mention, I think I heard this somewhere,
03:29:04.180 | that this is a source of fights
03:29:05.820 | for the two of you, the age thing.
03:29:07.540 | I felt the ageism throughout this whole conversation.
03:29:10.180 | - He's basically, he's saying that he gambles with time.
03:29:14.260 | He's just like, "I think she will be good later."
03:29:16.500 | And then just like--
03:29:17.340 | - It's like an investment, yeah.
03:29:18.180 | - Yeah, it's like what he's doing.
03:29:20.380 | When this treasury bond matures,
03:29:22.020 | I'm gonna be able to cash out for a good--
03:29:22.860 | - What do you think so far?
03:29:23.700 | Is the stocks going up or?
03:29:25.980 | - It's tumultuous.
03:29:27.020 | - What's that mean?
03:29:27.860 | - It's like Bitcoin?
03:29:28.700 | - A lot of up and down.
03:29:29.540 | - Oh my God.
03:29:30.380 | - Yeah, like Bitcoin, crypto mill.
03:29:31.900 | - All right, if you guys don't mind,
03:29:34.140 | one interesting aspect of your relationship
03:29:36.100 | is you're in an open relationship.
03:29:38.420 | What's that like?
03:29:39.700 | From a game theoretic simulation perspective,
03:29:41.900 | what went into that calculation?
03:29:43.980 | And how does that--
03:29:45.140 | - Like how that started or?
03:29:46.300 | - Yeah, how did that start, sure.
03:29:47.820 | - The only relationships I've ever done
03:29:49.340 | has been open relationships since I was in high school.
03:29:52.260 | 'Cause I didn't really understand
03:29:53.940 | why wouldn't you be able to do other things
03:29:56.700 | with other people,
03:29:57.540 | but then just have your main partner basically.
03:29:59.580 | - So what is an open relationship, generally speaking?
03:30:02.460 | That means you have one main partner?
03:30:04.140 | - Not a monogamous relationship.
03:30:05.700 | You're somehow allowed in different ways.
03:30:09.100 | You can see other people sexually.
03:30:11.060 | - Sexually, but there's one main station.
03:30:14.580 | - It doesn't have to be there for some people,
03:30:16.140 | but I think it's probably easier
03:30:18.180 | when we probably don't really have time
03:30:19.620 | or the energy for more than one person
03:30:23.180 | to really focus on.
03:30:24.660 | - What about emotional?
03:30:26.820 | - It's really complicated.
03:30:27.660 | There's a lot of complicated stuff going on
03:30:29.260 | under the hood there.
03:30:30.300 | I think broadly speaking,
03:30:33.020 | you've got polyamorous relationships
03:30:35.100 | and you've got open relationships,
03:30:36.540 | where polyamorous is like,
03:30:38.060 | oh, I've got three different girlfriends
03:30:40.020 | and we all hang out or sometimes even live together
03:30:41.900 | or three boyfriends, whatever.
03:30:43.060 | And then you've got open relationships,
03:30:44.340 | which is like, oh, you can basically hook up
03:30:46.220 | with other people.
03:30:47.060 | And then you've got your main relationship and that's it.
03:30:49.340 | I think ours is probably somewhere in the middle of that,
03:30:52.140 | to where we've got long-term friends,
03:30:53.740 | some of them we hook up with,
03:30:54.860 | and that's kind of how we, yeah.
03:30:56.420 | It's a delicate dance that explodes
03:30:59.420 | every six months on itself.
03:31:00.260 | - So it does explode, you guys fight over it?
03:31:02.380 | - We fight over some things, yeah.
03:31:04.380 | Things happen, yeah.
03:31:05.220 | - I think it's mostly because a lot of people can't handle it
03:31:08.180 | and they agree to something
03:31:09.620 | and then they realize that we're way too cool.
03:31:11.420 | And then they get really obsessed
03:31:12.980 | and they think that they can get in there
03:31:14.900 | and then it gets really dramatic.
03:31:16.500 | - Have you figured it out?
03:31:17.860 | - I feel like we figure out things more and more
03:31:22.180 | when it comes to what's a good person for us to hang out
03:31:24.580 | and what's not a good person for us to hang out with.
03:31:27.980 | I probably have more opinions on who he hangs out with
03:31:30.540 | because he likes the fucking psychos.
03:31:32.340 | (laughs)
03:31:33.300 | - Yeah, so you like the surrounding?
03:31:35.060 | - He likes the crazy ones, the baby trap sort of women.
03:31:38.740 | That's the ones.
03:31:39.940 | And I don't like that 'cause that affects me.
03:31:42.140 | (laughs)
03:31:43.660 | - That affects your game theoretical relation.
03:31:45.380 | - Yeah, obviously, yeah.
03:31:46.220 | - Okay, you like to surround yourself,
03:31:48.460 | like in general, you've talked about with crazy people.
03:31:51.700 | - I say crazy and I really shouldn't.
03:31:53.180 | - It's a humorous way.
03:31:54.020 | - It's like, yeah.
03:31:54.860 | - They're very unstable.
03:31:55.780 | - Very, can be unstable, but people that are very unique.
03:31:58.820 | Like when I meet this person, that's like--
03:32:00.780 | - Not boring.
03:32:01.620 | - Yeah, not boring, yeah.
03:32:03.180 | - And you said that you're progressively becoming
03:32:05.940 | not boring yourself.
03:32:07.460 | - No, I think I'm pretty stable.
03:32:08.660 | I don't let them affect me much, but.
03:32:10.220 | - So you don't think they affect your--
03:32:11.780 | - No, if I've said that, I said it jokingly.
03:32:13.460 | I think I've got my stuff really well figured out.
03:32:15.780 | It's what allows me to engage with people like this
03:32:17.540 | so easily because I can engage,
03:32:19.180 | I can make them feel seen and heard.
03:32:20.380 | And then if it gets insane, I can cut off
03:32:22.060 | and I can be chill.
03:32:23.060 | Like very few things affect me in the longterm.
03:32:25.060 | - Do you guys experience jealousy?
03:32:27.540 | - Usually, like whenever I feel like he's not spending
03:32:30.180 | the like the amount of time that I'm asking for
03:32:32.140 | and he spends it on his video games or his stream
03:32:35.100 | or like he sees someone else like more than he sees me
03:32:38.220 | or something like that, that would like not be good.
03:32:40.300 | 'Cause then it affects like our relationship.
03:32:42.540 | - Do you have a good sense of like,
03:32:44.260 | is it literally time or is it the energy put into the--
03:32:49.180 | - It's probably like if he's with me,
03:32:52.340 | that like the attention in the time,
03:32:53.780 | like when he hangs out with me
03:32:54.780 | and then there's also probably the time.
03:32:56.180 | So if I feel like something else is distracting too much,
03:32:58.620 | like it could be work or it could be a friend
03:33:00.380 | or it could be anything.
03:33:01.660 | Like if I feel like it starts to take away from like me,
03:33:04.860 | then I'm having an issue with it.
03:33:06.500 | I don't think he really cares much.
03:33:08.140 | I guess the only jealousy you experience
03:33:09.580 | is probably when you feel like,
03:33:12.140 | like if I get upset about him seeing someone too much
03:33:15.780 | and then I go see someone more and then he's like,
03:33:17.980 | why can't I go see my friend more, like as much as you?
03:33:20.540 | So like, that's the sort of like thing
03:33:22.340 | that we're trying to navigate on, I guess.
03:33:24.440 | - I think we are like diametrically opposed sometimes
03:33:28.700 | in terms of how we view like engagement with people
03:33:30.620 | or engagement with the world sometimes.
03:33:32.300 | So like on her end of the spectrum,
03:33:34.060 | like a perfect week for her might be like being in a cabin,
03:33:39.820 | watching like fireflies at night,
03:33:41.620 | going hiking every morning, going swimming at the beach,
03:33:44.080 | because it's like, you're taking in like
03:33:45.520 | the grandeur of nature.
03:33:46.660 | You're like connected with yourself.
03:33:47.980 | You're like very at peace.
03:33:48.980 | Everything is like chill and cool.
03:33:50.220 | There's the wind, the feeling of nature, everything.
03:33:51.900 | That's like her peak living experience.
03:33:53.620 | - I like being present.
03:33:54.620 | - Yeah, and like my peak experiences are like
03:33:57.060 | people trying to destroy my life,
03:33:58.420 | like the challenge of like navigating
03:33:59.820 | really complicated discussion,
03:34:01.700 | like several different dramatic events unfolding
03:34:04.220 | that might end my career.
03:34:05.060 | Like these things are like very,
03:34:05.940 | I like the stress and the action and the entertainment
03:34:08.260 | and everything's like very cool for me.
03:34:09.900 | So when we're together,
03:34:11.020 | she generally wants me to be like more chill.
03:34:12.980 | But if I don't feel like I'm being like stimulated a lot,
03:34:15.300 | then it's easy for like my mind to wander.
03:34:16.780 | - To wander somewhere else.
03:34:18.420 | - That's kind of the issue.
03:34:19.260 | We have a very different way of like engaging with the world.
03:34:20.740 | - So how can you find happiness in the stillness?
03:34:24.100 | - I feel like if we're just like aware of it
03:34:26.300 | and we're trying our best,
03:34:27.460 | like whenever we like we're supposed to do this one thing.
03:34:29.380 | So let's say that we wanna go to New York
03:34:31.020 | and I'm like, we should just like go out
03:34:32.500 | and do this one specific thing.
03:34:33.780 | We try to find something that he enjoys doing.
03:34:35.860 | Like now that we're in Texas,
03:34:37.700 | we can go shooting or do something fun that he enjoys
03:34:40.260 | then we can do it.
03:34:41.180 | And then I think like,
03:34:44.100 | just like for me also to be aware
03:34:46.540 | that like when he spends a lot of time on crazy people,
03:34:48.620 | it's not because he like loves them or wants to be with them.
03:34:50.820 | It's just because he likes being like,
03:34:52.380 | having his life destroyed.
03:34:54.500 | Like you said, which I don't really do.
03:34:56.380 | It's just a completely different thing.
03:34:57.580 | So like for me to like understand more like how he's thinking
03:35:00.700 | because it's so different from mine
03:35:02.300 | and for him to understand how I'm thinking about things
03:35:04.500 | and like what I prioritize in my life.
03:35:06.700 | I think that's like how we navigate.
03:35:08.300 | But I think it's good.
03:35:09.140 | I think the differences can be good.
03:35:10.660 | Like when we're finding a way, yeah.
03:35:12.420 | - Well, I think you're relatable.
03:35:15.700 | (laughing)
03:35:17.220 | - More of a human, you're an AI.
03:35:18.060 | - No, I'm definitely very difficult to get along with.
03:35:19.740 | Like I always tell people that,
03:35:20.580 | that like if you're dating me for like more than a few years,
03:35:22.580 | like you get like an award for that.
03:35:23.420 | - It's like a war zone that you've survived.
03:35:25.580 | - Did you say that? - Absolutely.
03:35:26.420 | - That you're like a veteran, you get medals and stuff.
03:35:29.300 | - And it's always like,
03:35:30.140 | I think there's probably been like six different,
03:35:31.540 | I don't think she says it anymore,
03:35:32.740 | but there were like six different times in our relationship
03:35:34.180 | where she's like, is it always like this?
03:35:35.700 | Is this actually right?
03:35:36.660 | - Yeah. - And like every next year--
03:35:37.500 | - You lied in the beginning of it.
03:35:39.380 | Like you were lying about that.
03:35:40.700 | - Well, it got worse.
03:35:42.300 | - You were like, no, it's just like right now,
03:35:43.940 | I'm having a huge argument online
03:35:46.140 | about saying the N-word in private.
03:35:47.580 | It's just gonna be like this
03:35:48.540 | and I'm gonna be streaming 24 hours a day.
03:35:51.140 | And I'm like, when are you gonna go to bed?
03:35:52.940 | It's been a week.
03:35:53.780 | - Did playing league come into this?
03:35:58.380 | - A little bit, but I'm clean.
03:35:59.580 | I'm clean of league like six months right now.
03:36:01.140 | - Yeah, what do you hate about league?
03:36:03.380 | - I never got-- - The humans.
03:36:04.980 | (laughing)
03:36:06.300 | - Well, speaking of which,
03:36:08.180 | my participation in league was on the robot side.
03:36:10.820 | - Good.
03:36:11.660 | - 'Cause there's--
03:36:12.540 | - That's an improvement.
03:36:13.940 | - 'Cause both with Starcraft II and League of Legends,
03:36:18.180 | 'cause OpenAI and DeepMind both participate
03:36:21.220 | in creating bots in those.
03:36:22.380 | - I was a professional Starcraft II player,
03:36:23.780 | so I remember when the AI started to play.
03:36:25.940 | It's interesting the types of restrictions
03:36:27.740 | that you would have to put on like a gaming robot
03:36:29.580 | to make it like functional
03:36:30.740 | and not totally unfair to the other side.
03:36:33.340 | - Yeah, to make it human-like.
03:36:34.660 | Yeah, was that interesting to you,
03:36:35.940 | to see AI be able to play those video games?
03:36:38.740 | - I think in some ways,
03:36:39.580 | people think things are more complicated
03:36:40.700 | than they actually are.
03:36:41.860 | And I think video games is one of those things
03:36:43.380 | where we're like, oh my God,
03:36:44.220 | there's like a million possibilities at every second
03:36:46.660 | and who knows?
03:36:47.500 | And it's like, no, there's like three or four things
03:36:48.620 | going on at any point in time.
03:36:49.580 | And I'm willing to bet that like an AI
03:36:50.860 | could probably solve some of these games like pretty easily,
03:36:53.740 | especially if there are no constraints
03:36:55.100 | on how they can learn, yeah.
03:36:56.820 | - Can I talk to you about relationships?
03:36:59.060 | - Yeah. - Yeah, we already have.
03:37:00.300 | - Yeah, I know, but more generally speaking,
03:37:02.180 | we didn't get a chance to talk
03:37:03.860 | about the Red Pill community.
03:37:05.820 | - Oh, sure.
03:37:06.660 | - Well, first of all, what is the Red Pill community,
03:37:08.380 | the manosphere in general?
03:37:09.780 | I'd love to get both of your opinions on this.
03:37:11.700 | - Sure.
03:37:12.980 | - I know you're probably not as opinionated on that whole--
03:37:17.380 | - I'd say, do you think I am then?
03:37:19.060 | Like probably not as much as you,
03:37:21.060 | but I do have opinions.
03:37:22.500 | - You do, okay.
03:37:23.340 | - I usually don't like speak out too much on it
03:37:25.420 | because I feel like there's like a language barrier.
03:37:28.020 | That's why I don't really do politics
03:37:29.380 | because this is my second language, yeah.
03:37:31.220 | - That's right, you have to know the--
03:37:33.300 | - A little bit like that, yeah.
03:37:34.140 | - You know how to use derogatory terms every other sentence
03:37:38.700 | so they understand you, right?
03:37:40.020 | - Exactly.
03:37:40.860 | - I don't know anything about that.
03:37:41.700 | - It's the only thing I've talked about.
03:37:42.540 | - You need to be able to speak really well
03:37:45.780 | for people to take you seriously, I think.
03:37:48.300 | And that's the thing, if I don't have the words
03:37:51.260 | and I can't pronounce things correctly,
03:37:54.060 | then people are not gonna say--
03:37:54.900 | - The ESL person searching for words looks stupid,
03:37:56.740 | essentially, that's how people view it, yeah.
03:37:58.300 | - Tell me about it, I have a podcast
03:38:00.260 | that a bunch of people listen to and I mumble and they,
03:38:03.500 | yeah.
03:38:04.340 | - Wait, what's your first language?
03:38:06.220 | - Russian.
03:38:07.060 | - Oh, okay.
03:38:07.880 | - But I speak both languages horribly.
03:38:09.540 | I'm just not, I'm not like,
03:38:12.300 | there is definitely a big disconnect
03:38:14.380 | between my brain and my mouth module.
03:38:17.220 | Like, I'm not able to generate the thoughts efficiently.
03:38:20.740 | Like the things you're able to do,
03:38:21.940 | like the da-da-da-da-da, like speak like that, I'm not.
03:38:25.180 | It's very, very tough.
03:38:26.140 | Plus there's a huge amount of anxiety
03:38:27.900 | and social interaction that I have,
03:38:29.480 | which makes speaking even harder.
03:38:31.340 | - Gotcha.
03:38:32.180 | - Yeah, yeah, yeah, it's tough.
03:38:34.140 | - I understand.
03:38:34.980 | - Gotcha.
03:38:35.800 | - Makes sense.
03:38:37.340 | - Yeah, the gotcha is both a symbol of compassion
03:38:42.340 | and derision at once.
03:38:44.100 | - I'm just letting you know, I understand what you're saying.
03:38:45.540 | I'm just gonna sit there and stare at you in silence.
03:38:46.820 | - No, you can just say like, yeah, I get it.
03:38:48.780 | Like, yeah.
03:38:49.620 | - I get it, gotcha.
03:38:50.820 | - No, no, gotcha sounds, no, it's so short.
03:38:55.060 | It's like, say a longer sentence,
03:38:57.900 | but that means the same thing.
03:38:59.260 | - I understand you.
03:39:00.600 | - Yeah, good, that's good.
03:39:01.600 | That's like, not chills, you know?
03:39:04.200 | You get chills, so you understand me.
03:39:06.360 | - Yeah, it feels good.
03:39:07.240 | - Yeah.
03:39:08.080 | - I hear you.
03:39:08.900 | - I hear you.
03:39:09.740 | - And like, if you just like hold the other person's hand,
03:39:11.320 | that's even better.
03:39:12.160 | - You gotta put in some emotion there, okay?
03:39:13.520 | Show that you have some.
03:39:14.720 | - I understand.
03:39:15.560 | What do you think about, gotcha.
03:39:18.360 | What do you think about Red Pill?
03:39:20.320 | What, sorry, what is it, first of all,
03:39:22.360 | for people who don't know?
03:39:23.440 | - Yeah, the Red Pill community,
03:39:25.000 | obviously it's the Matrix reference.
03:39:26.480 | The Red Pill that you take is when you realize
03:39:28.640 | what dating standards and norms really are in the world.
03:39:31.440 | That men are providers and have to become some great thing
03:39:34.620 | to hunt and attract the woman who are just kind of there
03:39:37.640 | floating around looking for people
03:39:38.860 | to give them the most resources.
03:39:40.160 | And it's like coming to a realization
03:39:42.260 | of what the world of dating really is,
03:39:43.760 | broken away from the Hollywood standards
03:39:45.560 | and the romantic stuff that they try to sell you
03:39:47.160 | in the stores.
03:39:48.220 | - So there was kind of,
03:39:49.800 | maybe you can kind of educate me on this,
03:39:51.880 | but Red Pill used to be associated with just
03:39:57.240 | maybe anti-establishment views, I don't know.
03:39:59.320 | Maybe Republican conservative viewpoints,
03:40:01.400 | maybe alt-right. - People use,
03:40:02.340 | yeah, they use Red Pill a lot in different communities.
03:40:04.800 | When you say the Red Pill community--
03:40:06.480 | - Yeah, that usually means dating.
03:40:07.960 | - The dating thing.
03:40:08.800 | But a lot of people say, "Oh, Trump voters,
03:40:09.640 | "they're Red Pilled."
03:40:10.460 | Are you Red Pilled on politics or whatever?
03:40:11.800 | People will say stuff like that, yeah.
03:40:13.120 | - Okay. - Cool.
03:40:13.960 | - And then there's the Manosphere,
03:40:15.760 | all the similar type of stuff.
03:40:17.160 | And Andrew Tate is somebody that represents
03:40:19.440 | kind of the figurehead.
03:40:21.560 | - Of the Manosphere, of the Red Pill stuff,
03:40:23.320 | yeah, I would say so.
03:40:24.160 | I'm pretty sure, yeah.
03:40:25.000 | - Okay.
03:40:25.820 | All right, cool.
03:40:26.660 | Well, what are some ideas that they represent
03:40:29.560 | and what do you think about them?
03:40:30.680 | - I think they do a good job at speaking
03:40:32.720 | to disaffected young men who feel like
03:40:34.880 | the rest of the world has kind of left them behind
03:40:36.600 | or isn't willing to speak to them.
03:40:38.360 | And they do identify some true and real problems.
03:40:41.520 | Feels like on the left, we have a really hard time
03:40:43.420 | doing self-improvement or telling people
03:40:45.560 | how to better themselves.
03:40:46.460 | We focus too much on structural or systemic issues
03:40:48.880 | rather than what can an individual do
03:40:50.400 | to uplift or empower themselves.
03:40:52.240 | And it also feels like they do a good job
03:40:54.000 | at speaking to some of the positive aspects
03:40:56.220 | of masculinity, that it's okay to be strong
03:40:58.920 | and brave and a soldier and a warrior
03:41:01.440 | and provide for your family and blah, blah, blah.
03:41:03.840 | So I would say those are positive messages,
03:41:05.480 | like self-improvement and everything
03:41:06.760 | that come from the Red Pill community.
03:41:08.640 | - What's the negative?
03:41:09.800 | - I think the analysis on how men and women interact
03:41:13.960 | is way too transactional.
03:41:16.040 | All of the romanticism and love and chemistry
03:41:18.300 | is totally sucked out of it.
03:41:19.740 | Everything is very sex-based,
03:41:22.040 | like how do you basically have sex
03:41:23.440 | with the most amount of women possible
03:41:24.840 | and that's gonna make you happy?
03:41:26.340 | And then I think people's motivations sometimes
03:41:28.660 | are just spoken about in such a shallow derogatory way
03:41:31.620 | that I don't think is always reflective of reality.
03:41:33.620 | Like a woman only wants you
03:41:34.820 | because you make six figures and you're tall
03:41:36.620 | and a guy only wants you
03:41:37.460 | 'cause he wants to have sex with you and blah, blah.
03:41:38.860 | Like it feels like there's a lot of that going on a lot.
03:41:41.420 | - Yeah, and that misses some fundamental aspects
03:41:43.980 | about relationships, about meaningful relationships
03:41:46.420 | and so on.
03:41:47.260 | - I don't think I've never heard Red Pill people
03:41:48.660 | ever, ever talk about meaningful relationships.
03:41:50.980 | It's always just how to get in one
03:41:52.060 | or how to have sex, really.
03:41:53.880 | - Mel, what bothers you about some of that philosophy?
03:41:57.260 | - I feel like the people that are like the Red Pill people,
03:41:59.620 | I feel like their solution is something
03:42:01.580 | that doesn't actually work out.
03:42:03.520 | Or it works out for some people,
03:42:06.700 | people that makes a lot of money
03:42:07.780 | and is really successful in that sort of way,
03:42:10.100 | but it's not gonna help most men out there.
03:42:12.980 | So I feel like it's just a pointless speech
03:42:15.700 | to give to these really lost guys.
03:42:18.060 | And they really do believe that they can become successful,
03:42:21.060 | they can get money and when they get all these things,
03:42:24.680 | they can get girls,
03:42:25.600 | but most of them is not gonna achieve that ever.
03:42:28.440 | - To get the money part or become successful.
03:42:30.440 | - Just become a billionaire, you know?
03:42:31.840 | And you will get all the girls, which is true,
03:42:33.640 | but not everyone can do that.
03:42:34.680 | So I feel like when these guys are speaking to these men
03:42:38.000 | and they're just like,
03:42:38.820 | "We just care about these men out there.
03:42:40.520 | They need to hear this."
03:42:41.720 | It doesn't really help a lot of them.
03:42:43.520 | - And it doesn't inspire them to develop compassion
03:42:48.520 | towards the opposite sex,
03:42:49.920 | which is probably something required
03:42:51.460 | to have a meaningful relationship.
03:42:52.920 | - And also they seem to complain a lot about women,
03:42:56.640 | like only wanting men that have money and that's tall
03:42:59.380 | and that's muscular or whatever, all those things.
03:43:01.880 | But they complain about that,
03:43:04.660 | but that's also kind of what they're trying to make the men
03:43:08.740 | try to do for themselves.
03:43:10.260 | So they kind of fall into the same sort of behavior.
03:43:12.900 | And it seems like they're kind of unaware of that as well.
03:43:16.060 | They're just playing a part of the game
03:43:17.380 | instead of trying to find a woman
03:43:19.280 | that doesn't look for those things
03:43:20.940 | and that are looking for not those things.
03:43:24.260 | - I actually would love to have straight up data
03:43:27.700 | on people in that world versus not in that world,
03:43:31.900 | how often they get laid.
03:43:33.220 | Like literally, so I think for sure people in that world
03:43:39.420 | have fewer meaningful long-term relationships
03:43:44.460 | that are fulfilling,
03:43:45.460 | that actually helped them succeed in life,
03:43:47.540 | that helped them be happy and content
03:43:49.020 | and all that kind of stuff.
03:43:50.300 | But just even the straight up,
03:43:51.620 | the shallow goal of getting laid, I wonder.
03:43:56.540 | Because it's very possible that like just the roughness
03:43:59.460 | with which they treat intellectually women,
03:44:03.420 | that might lead to lower success, not higher success.
03:44:06.660 | - It's very adversarial,
03:44:08.100 | which I think is always disappointing.
03:44:09.220 | Anything that talks about men and women,
03:44:10.500 | I think it's good to acknowledge differences,
03:44:12.340 | but when it becomes like adversarial,
03:44:13.980 | especially when you talk about sex,
03:44:15.340 | sex is something that men are getting
03:44:16.820 | and it's something that women are giving
03:44:18.180 | and that type of like trade off
03:44:19.340 | and the way they talk about it is like,
03:44:20.620 | yeah, it sets people against each other
03:44:22.260 | in a really toxic way, I think.
03:44:23.820 | - How do you talk to people from that world,
03:44:27.180 | from the red pill world?
03:44:28.500 | Like would you ever talk to like somebody like Andrew Tate?
03:44:31.020 | - Oh yeah, if I had the chance to.
03:44:32.220 | I've been on the Fresh and Fit podcast a few times
03:44:34.260 | and then I've got a friend, Sneeko,
03:44:35.580 | who's like very red pill, that stuff.
03:44:37.700 | If I'm trying to talk to them,
03:44:40.460 | usually it's kind of like approaching it like a scared cat.
03:44:44.060 | The first thing you have to do is like be very gentle
03:44:45.820 | and say like, I understand your issues,
03:44:47.180 | I understand your complaints,
03:44:48.060 | I know that I'm scary because you think I'm gonna say
03:44:49.980 | like toxic masculinity and feminism
03:44:51.820 | and all these scary words at you.
03:44:54.020 | So the first thing is always to recognize it.
03:44:55.580 | Like a lot of what they talk about,
03:44:56.980 | there are like true aspects to what they're talking about
03:44:59.340 | that people on the left won't recognize.
03:45:01.140 | So I think it's good to acknowledge those things
03:45:03.900 | that like men and women are kind of different.
03:45:05.820 | We do look for different things in general
03:45:07.220 | when it comes to relationships.
03:45:08.060 | It's okay to say that, there's nothing bad there.
03:45:10.500 | And then it'll usually be like,
03:45:11.500 | once I've got your trust and I'm in your bubble,
03:45:13.700 | like let's talk about the things that you want
03:45:15.780 | and maybe some of the strategies that you're employing
03:45:17.980 | aren't necessarily gonna get you
03:45:19.140 | some of the things that you want.
03:45:20.340 | So for instance, if you're really worried
03:45:21.580 | about like shallow girls ruining your life,
03:45:23.820 | like Melina said, it's probably not best
03:45:26.100 | to build your entire worldview
03:45:27.140 | around trying to get shallow girls
03:45:28.340 | that are gonna ruin your life.
03:45:29.540 | Like if your way of attracting a girl
03:45:30.820 | is to go to the gym, get a whole bunch of money
03:45:32.500 | and try to like flaunt your wealth as much as possible,
03:45:34.740 | you're gonna be attracting the very same type of women
03:45:36.380 | that you're here like decrying on your stream.
03:45:38.660 | - Yeah, I think we talked about that on the podcast.
03:45:40.100 | Like you probably wanna have a woman
03:45:41.660 | that's gonna be there if you lose your job,
03:45:43.420 | it's still there.
03:45:44.260 | Like that cares about the things that's not just your job.
03:45:47.020 | - Yeah.
03:45:47.980 | - It's more stable.
03:45:48.940 | - And also it helps you become a great man
03:45:52.020 | or like grow.
03:45:54.260 | Like I feel like a great friendship and a partnership,
03:45:57.100 | like it helps you make you a better person.
03:45:58.900 | Some of the most successful people I know,
03:46:00.420 | I mean, they have families
03:46:01.500 | and there's clearly a dynamic there
03:46:03.140 | that's like that makes them,
03:46:05.020 | they wouldn't be that without.
03:46:06.460 | - They're not an island, yeah.
03:46:07.500 | - Yeah, and the kids actually a big part of that too.
03:46:10.180 | Like for most people, if you're like a good parent,
03:46:12.860 | they make you step up somehow in life.
03:46:15.260 | You have to take responsibility
03:46:16.500 | for getting your shit together
03:46:18.580 | and excelling in ways that I guess the philosophy
03:46:23.580 | of the Red Pill does not quite get to.
03:46:25.140 | - That's always an interesting,
03:46:26.100 | I think I've asked that a couple of times
03:46:27.220 | where it's like, would you let your daughter
03:46:30.460 | date Andrew Tate?
03:46:31.820 | And it's always funny to watch them
03:46:32.860 | kind of like squirm around those answers sometimes.
03:46:35.260 | - But see if they don't have a daughter,
03:46:37.260 | like I don't have a daughter.
03:46:39.340 | I think your whole philosophy changes
03:46:41.340 | once you have a daughter.
03:46:42.380 | - Sure.
03:46:43.220 | - Like you can feel a solid--
03:46:44.060 | - Well, but even at that,
03:46:44.880 | like they know that what they're answering,
03:46:45.720 | they feel a little bit weird about it.
03:46:46.820 | It's funny to watch them.
03:46:47.660 | Like even they know, it's like, ah, fuck, you know.
03:46:49.900 | - Well, they might say like,
03:46:51.140 | I want my daughter to date like a high value male
03:46:54.540 | to the degree that he's a high value male, yes.
03:46:57.260 | But like, I don't think you'll feel that way.
03:46:59.120 | The definition of high value changes completely.
03:47:01.500 | - For sure.
03:47:02.340 | - Certainly the stereotypical measures of value
03:47:05.220 | contribute to the calculation,
03:47:06.560 | but it's so much more than that.
03:47:07.860 | I think the chemistry of the whole thing is bigger.
03:47:11.140 | You've also mentioned about body count.
03:47:13.780 | You guys both have a high body count.
03:47:16.700 | Does body count matter?
03:47:18.060 | Or it depends, like you said,
03:47:19.780 | it's low in some people's eyes,
03:47:21.620 | it's high in other people's eyes.
03:47:23.160 | Does body count matter in relationships?
03:47:24.900 | Does the past matter?
03:47:26.660 | - Well, the past matters.
03:47:27.900 | I don't think body count, not to me, I don't really care.
03:47:29.900 | - Not just as it is, no.
03:47:31.220 | - I mean, it could be.
03:47:33.220 | - What the past does.
03:47:34.040 | - Yeah.
03:47:34.880 | - Well, the past is who you are, right?
03:47:36.100 | But if somebody tells me they have a 200 body count
03:47:38.220 | and they're 16,
03:47:39.300 | something's probably going on there that's not good.
03:47:41.260 | - I was thinking about that too.
03:47:42.100 | 'Cause it could be really young people
03:47:43.900 | that are having some sort of mental problems going on.
03:47:47.460 | - Or somebody's 45 and they've never had sex before.
03:47:50.340 | There's probably something going on, right?
03:47:51.620 | So it could be indicative.
03:47:52.460 | But if somebody's in their 20s and they've had sex
03:47:54.260 | with 100 people or 50 people, whatever, it's whatever.
03:47:57.140 | - It's more experience, which can be good.
03:47:59.420 | - Sure.
03:48:00.260 | - Okay, so that just represents you're sexually open
03:48:03.420 | and so it doesn't really necessarily mean any kind of,
03:48:06.460 | not necessarily, it could though.
03:48:07.940 | The number alone doesn't mean anything.
03:48:09.940 | - Yeah.
03:48:10.780 | - Okay.
03:48:11.600 | - Well, you could meet a guy that's like,
03:48:12.860 | I just really, really want to fuck a lot of people
03:48:14.900 | because it makes me so cool.
03:48:16.820 | You can meet someone like that, which is like, maybe.
03:48:18.940 | - So the body count doesn't matter,
03:48:19.860 | but where it comes from.
03:48:21.140 | - Yeah, like why have you slept with the people
03:48:23.500 | you slept with?
03:48:24.580 | - Does it hurt the romantic aspect of the relationship,
03:48:26.900 | knowing that there's a lot of people in the past?
03:48:29.380 | - I don't know.
03:48:30.220 | - Not for us, no.
03:48:31.180 | - Is the part of the relationship fundamentally romantic?
03:48:34.060 | - For us, yeah, I'd say so, yeah.
03:48:34.900 | - For us, yeah, pretty soon.
03:48:36.180 | - Okay.
03:48:37.020 | (laughing)
03:48:37.860 | - What?
03:48:39.260 | - You come off as such a cold person.
03:48:40.900 | - No, I was just in my head thinking,
03:48:42.140 | I wanted to just say gotcha right there.
03:48:44.260 | - Oh, nice.
03:48:45.100 | (laughing)
03:48:45.940 | - It's so judgmental.
03:48:47.100 | - I think when it comes to the sex thing,
03:48:48.440 | there's always like, the way that I explain it is,
03:48:50.040 | and I understand, like, I have to say this
03:48:52.460 | 'cause I don't advocate for what I do for everybody
03:48:54.760 | or what she does for everybody,
03:48:55.840 | 'cause obviously there's a whole bunch of natural feelings
03:48:58.160 | of jealousy that pop up for a lot of people.
03:49:00.340 | But when people ask me, it's always like,
03:49:01.880 | oh, isn't this horrible that you guys are doing this
03:49:04.240 | and you don't love each other?
03:49:05.520 | From my perspective, I can have sex with any person
03:49:08.600 | and it can be sex.
03:49:09.480 | That's not a special thing between two people, in my eyes.
03:49:12.040 | It's like anybody can have sex.
03:49:13.440 | But there are certain activities and ways
03:49:15.960 | you can spend time with each other
03:49:17.440 | where you're carving out these precious little moments
03:49:19.440 | in time with a certain person that can do things
03:49:21.280 | that are special to that person.
03:49:22.760 | And those are the kind of events
03:49:24.320 | that I remember more than anything else.
03:49:26.480 | So the idea of like, oh, wow, I had sex with a person
03:49:29.680 | that was so special doesn't mean as much as like,
03:49:31.940 | you know, us traveling to like New Zealand
03:49:34.300 | or sharing some special moment,
03:49:35.500 | doing like some really fun activity or event or whatever.
03:49:37.820 | That's usually how I like it, yeah.
03:49:39.540 | - So a shared intimate moment.
03:49:41.300 | - Yeah.
03:49:42.140 | - I kind of agree,
03:49:42.960 | but I can definitely connect the romance with sex.
03:49:44.900 | - Boring.
03:49:45.740 | (laughing)
03:49:46.580 | - I'm curious why you can't do that.
03:49:47.400 | - That's 'cause she's a woman.
03:49:48.240 | See, that's where the red pill's right.
03:49:49.060 | That's exactly what you can't.
03:49:49.900 | - We also talked about misogyny,
03:49:51.900 | which is clearly the embodiment of that.
03:49:54.380 | What were you saying?
03:49:55.580 | So there's a connection between romance and sex?
03:49:58.020 | - Yeah, I think it is.
03:49:59.380 | Because I think sex could be a lot of things, right?
03:50:02.740 | It's some sort of bonding, I'd say, in some way.
03:50:06.740 | Let's say that you really like BDSM.
03:50:08.900 | You kind of like, you become submissive to someone
03:50:11.360 | or you take control over someone.
03:50:12.620 | It's like a very bonding, like intimate moment, I'd say.
03:50:16.340 | - And that's romantic.
03:50:17.180 | The intimacy is romantic.
03:50:18.020 | - I think it is.
03:50:19.060 | If you can show yourself as really submissive or like weak,
03:50:21.820 | or you have like absolutely no control over yourself
03:50:23.920 | and you let someone else do it,
03:50:25.380 | or you are the one being that,
03:50:27.000 | like you are the dominant force of someone.
03:50:28.620 | I think that's a really like intimate thing.
03:50:30.420 | 'Cause you show like the weakest part of yourself, kind of.
03:50:34.060 | - I just feel like I personally,
03:50:35.740 | to me, some component of romantic,
03:50:39.340 | but to me, this is not judging to others.
03:50:40.960 | To me, maybe it's how I was brought up,
03:50:43.060 | the romance increases if the number of intimate interactions
03:50:49.460 | are limited to one person.
03:50:51.580 | For me, for some reason, spreading it out
03:50:57.700 | decreases exponentially the feeling of romance you feel.
03:51:02.700 | That could be just like sort of having grown up
03:51:07.860 | in the Soviet Union, there's the fairy tale stories
03:51:11.620 | and you're kind of maybe living through them.
03:51:14.660 | - Yeah, I mean, I think what you're saying
03:51:15.500 | is like really normal.
03:51:16.460 | Like most people probably feel that way, yeah.
03:51:18.700 | - But because you guys are able to successfully not do that,
03:51:21.220 | I just wanna question my own understanding of it.
03:51:24.320 | - Like why is that?
03:51:26.420 | - Why is that?
03:51:27.260 | - Like am I being very jealous for no reason?
03:51:30.820 | Maybe you can maximize the number of intimate experiences
03:51:33.820 | if you just open up and let go of the jealousy, essentially.
03:51:37.100 | - I feel like in Sweden, like in Scandinavia,
03:51:39.100 | we're extremely just sexually open, in general.
03:51:42.060 | We're not super religious either.
03:51:44.180 | We're very relaxed.
03:51:45.780 | We don't feel bad about ourselves.
03:51:48.740 | It's just like a different sort of thing.
03:51:49.940 | And I would say we're more progressive
03:51:51.980 | when it comes to feminism and stuff.
03:51:53.700 | So it's more common that you will meet women
03:51:55.980 | with a higher body count than like,
03:51:57.660 | when I meet like American girls,
03:51:59.020 | all of them have like vaginismus,
03:52:00.540 | like super suppressed sexually.
03:52:02.220 | And they have like-
03:52:03.060 | - What did you just use?
03:52:04.340 | - They have like issues to like,
03:52:06.360 | they can't relax during sex, which just hurts for them.
03:52:08.700 | - Vaginismus. - So they get really nervous.
03:52:09.860 | Vaginismus, is that what it's called?
03:52:10.940 | Yeah.
03:52:12.380 | So like I meet so many girls
03:52:13.820 | that are having like a lot of issues with sex
03:52:16.660 | and they have like a very low body count
03:52:18.220 | because they just can't relax.
03:52:19.980 | Yeah, and usually they come
03:52:22.620 | from like a very religious background.
03:52:24.720 | So they have just been told like, you cannot wear that.
03:52:26.820 | You cannot be like that.
03:52:27.900 | You can't like, you know.
03:52:29.340 | And like where I grew up, it wasn't like that at all.
03:52:32.540 | We just see it as more like a casual thing.
03:52:35.100 | - So then you can just maximize
03:52:36.980 | the awesomeness of the experience.
03:52:39.900 | - Yeah, I guess.
03:52:40.740 | I don't like how I go. - You don't have to
03:52:41.560 | trauma over it. - Exactly, yeah.
03:52:42.380 | - I think the important thing I think
03:52:43.740 | for everybody to realize
03:52:44.700 | is there's always pros and cons to everything.
03:52:46.740 | Like my lifestyle,
03:52:48.380 | like obviously I get to have a lot of fun experiences.
03:52:50.180 | That's like a huge pro and that's super cool.
03:52:52.180 | And if you're like a more monogamy brain person,
03:52:54.300 | you're not gonna get those experiences.
03:52:55.680 | But if you're a monogamy brain person,
03:52:57.280 | like when you're sharing that special moment
03:52:58.760 | in time with somebody else,
03:52:59.760 | like that moment can be really, really, really special
03:53:01.960 | because now it's the thing that you're showing yourself
03:53:03.920 | and open yourself up to another person
03:53:05.220 | and they're only trusting you to do that.
03:53:06.760 | And that's like a really special thing
03:53:08.000 | that only the two of you are sharing with each other.
03:53:09.980 | So, I mean like there's always like pros and cons
03:53:11.400 | to everything.
03:53:12.240 | - Like I think we both would say like,
03:53:13.600 | like doing an open relationship is probably not,
03:53:15.600 | like we would not recommend it.
03:53:17.600 | - Yeah, no, of course not. - I don't think we would, no.
03:53:20.080 | - Yeah, I recently fasted for three days
03:53:22.240 | and I ate a chicken breast at the end of that.
03:53:24.420 | And it was like the most delicious food I've ever eaten.
03:53:27.180 | - True.
03:53:28.020 | - So like there's some aspect of fasting and scarcity
03:53:30.340 | and so on that like,
03:53:31.180 | and you have to figure out what for your own psyche,
03:53:33.620 | what works the best.
03:53:34.460 | - It's good to be a little bored or like not do something
03:53:36.780 | or like work because you can just enjoy the time
03:53:39.500 | when you're doing something really fun.
03:53:41.020 | It's more fun, otherwise you're just gonna get numb
03:53:42.980 | in general with everything, yeah.
03:53:44.820 | - Yeah, I personally just never get bored.
03:53:46.720 | Like I guess the boring thing is exciting to me.
03:53:49.820 | Though it's just like--
03:53:50.660 | - Are you like me?
03:53:51.700 | 'Cause everything I like is boring.
03:53:54.940 | - I gotta ask you, we talked about misogyny
03:53:56.980 | and he's trying to battle it out on the internet.
03:53:59.740 | What's your sense as a woman about the level of misogyny
03:54:04.740 | and the internet and the streaming community
03:54:07.180 | and how to fight it?
03:54:08.820 | - For me, 'cause I guess I get it every single day somehow.
03:54:12.860 | Like, because I have an online,
03:54:14.260 | like I have a chat that's live, right?
03:54:16.380 | But I have like mods moderating that all the time
03:54:19.100 | so I don't really need to see much of it.
03:54:20.540 | I think it's just pretty annoying
03:54:22.100 | because you get to like see it all the time.
03:54:25.080 | - So it's become like background noise?
03:54:28.380 | - Yeah, a little bit and it's like the same comments
03:54:30.700 | over and over again.
03:54:31.900 | But it's usually for me, I don't personally care that much.
03:54:34.980 | I understand that other people do,
03:54:36.180 | especially when it comes into like,
03:54:37.900 | when there's like a lot of sexism and stuff
03:54:40.100 | and when there's a lot of like men
03:54:42.340 | not taking women seriously.
03:54:43.420 | Like I definitely get that and I used to get that even more
03:54:45.700 | like a few years ago with my accent and everything
03:54:48.340 | and like I used to be blonde as well like a few months ago.
03:54:50.500 | So I feel like people wouldn't take me seriously
03:54:52.220 | because of that.
03:54:53.460 | That's a bit annoying, but I feel like it's pretty easy
03:54:57.180 | to like see through when someone acts that way.
03:54:59.020 | And for me personally, I don't really care.
03:55:00.620 | But it's a bit annoying, like being online
03:55:03.700 | and like getting stuff every single day.
03:55:05.700 | I would say like probably the worst thing is
03:55:08.560 | when you feel like you put in a lot of effort
03:55:10.500 | into some sort of work, everyone is just gonna say,
03:55:13.500 | you just got that because you're a woman
03:55:15.460 | and you're attractive.
03:55:16.380 | And that's probably like the worst thing.
03:55:18.740 | - Is there a way to fight that you think?
03:55:20.500 | - Yeah, I don't think you can.
03:55:21.460 | I think it just comes up all the time.
03:55:22.820 | It's just like, it is what it is, I guess.
03:55:24.900 | You just gotta keep doing whatever you do
03:55:26.300 | and like not let it like emotionally control you somehow.
03:55:28.980 | - I think having more women in those spaces is always good.
03:55:31.060 | - It's probably good, yeah.
03:55:31.900 | - Like a lot of the guys you can tell online
03:55:33.220 | that they don't ever--
03:55:34.060 | - Don't bring on the worst ones then, Stephen.
03:55:35.860 | - See, she just did it.
03:55:37.740 | She did the misogyny thing.
03:55:39.700 | By having some bad women on, she's saying all women, see?
03:55:42.380 | - Well, you know it's true, right?
03:55:46.060 | - See, I disagree with you and I'm older than both of you.
03:55:48.700 | And therefore wiser, right?
03:55:51.180 | - Well, combined, we're older than you.
03:55:52.020 | - I think we're--
03:55:52.860 | - So, careful.
03:55:53.700 | - We're one only metronomy.
03:55:55.020 | Yeah, we've got combined age.
03:55:56.460 | - Or it could be the same thing as like,
03:55:59.420 | also the age thing and like the woman thing.
03:56:01.300 | A lot of people think that I'm just copying
03:56:02.700 | every single thing that he says,
03:56:04.180 | which I think is a bit annoying as well.
03:56:05.420 | So I can never really like--
03:56:06.580 | - Hassan accused her of that one time.
03:56:07.660 | - Yeah, which is a bit annoying.
03:56:08.980 | - I don't really like so much, you know?
03:56:10.580 | - It was about the defunding police.
03:56:11.980 | Like my dad is a cop.
03:56:12.820 | - I like friendship, camaraderie, and love and respect,
03:56:16.260 | which you both have had for a time and have lost.
03:56:20.100 | And I would like you to regain it.
03:56:22.060 | Let's try to increase, not decrease,
03:56:23.820 | the amount of love in this space.
03:56:25.140 | What do you think about some of the harshness
03:56:26.980 | of his language, which we talked about?
03:56:29.420 | R-word in the past, when you used N-word,
03:56:33.860 | all of that kind of stuff.
03:56:34.700 | - When he, what he used to do?
03:56:36.100 | I mean, I just--
03:56:37.020 | - No, like, what do you think about it?
03:56:37.860 | Like, do you give him advice?
03:56:39.300 | - To not speak a certain way?
03:56:41.220 | - No, like a little more civility?
03:56:43.100 | I was just trying to get a second opinion on this.
03:56:44.980 | - Second opinion about--
03:56:45.820 | - Normie people, non-internet people,
03:56:47.140 | are way more extreme than--
03:56:48.820 | - Yeah, yeah.
03:56:49.660 | - She's way more extreme.
03:56:50.660 | - No, that's not true.
03:56:52.100 | Okay, so here's the thing for me, okay?
03:56:54.500 | I was not online until three years ago, at all.
03:56:57.740 | Like, I would watch YouTube.
03:56:58.980 | That's pretty much all I would do.
03:57:00.580 | I wouldn't do anything else, really.
03:57:02.020 | I didn't grow up playing video games or anything.
03:57:04.740 | So I'm like extremely new to everything.
03:57:06.740 | So when I came into this world
03:57:10.340 | and I started seeing clips from him in the past,
03:57:12.860 | I don't think I really had much of an opinion
03:57:14.420 | because it just sounded like
03:57:15.460 | it was just a different words that we're using,
03:57:17.300 | but it didn't mean anything.
03:57:18.700 | That's what it feels like.
03:57:19.540 | It was just like, if you're saying the R word,
03:57:21.660 | it's because you just want to call someone stupid,
03:57:23.220 | but you want to do it a little bit more.
03:57:25.300 | But it's not like, it didn't feel like it was a,
03:57:27.860 | it's not like racist more like--
03:57:29.060 | - Yeah, but I'm not getting an agreement on this side here.
03:57:32.780 | - So like, if he was saying the F word
03:57:34.500 | because it was just like a word to insult someone
03:57:37.780 | and he wasn't, like, I don't think he was ever,
03:57:39.660 | I don't think you were ever homophobic back in the day
03:57:41.580 | or anything like that.
03:57:42.420 | I think it was just like a way to express yourself
03:57:45.300 | maybe back then.
03:57:46.140 | I don't know, I didn't do it.
03:57:47.020 | There's no videos of me or anything
03:57:48.260 | 'cause I wasn't even online back then.
03:57:49.540 | - Yeah, so my case was,
03:57:51.180 | I definitely don't think Stephen is homophobic,
03:57:54.460 | or racist, or any of those, obviously.
03:57:55.740 | So there's a good heart there and a good mind.
03:57:59.300 | I was just saying--
03:58:00.140 | - He just likes being mean, I think.
03:58:02.220 | - Well, there is some, you lose yourself
03:58:04.300 | and forget the bigger picture
03:58:05.540 | that he's pushing for more effective discourse
03:58:09.180 | on the internet.
03:58:10.020 | He's like an inspiration to a lot of people,
03:58:11.420 | especially now, of how you can use effective conversation
03:58:14.860 | to make for a better world, to deradicalize people,
03:58:18.260 | and so on.
03:58:19.080 | And then you lose some of that power
03:58:21.020 | by losing yourself in the language,
03:58:24.940 | just more language of emotion
03:58:26.500 | versus effective communication.
03:58:28.460 | - I would say-- - But it's a gray area.
03:58:30.460 | - I would say something that has probably happened recently
03:58:32.420 | in that case, because he's been joking about women a lot.
03:58:35.540 | Like, it's women's fault, they're bad.
03:58:37.820 | It's been a lot of jokes when it comes to misogyny,
03:58:40.380 | I guess, in your community.
03:58:41.700 | And I think it's actually turned people
03:58:43.340 | a little bit that way.
03:58:44.180 | - That's why we've done the recent misogyny.
03:58:45.580 | - Yes, so I guess that's actually true,
03:58:47.980 | because I don't think it was clear enough.
03:58:50.940 | I don't think it actually was.
03:58:51.940 | I think you made that mistake.
03:58:53.580 | But I think back then I was even saying,
03:58:56.100 | like, hey, you should probably not,
03:58:57.300 | like, you probably should not do that.
03:59:00.260 | 'Cause it actually is pretty hard for me,
03:59:01.620 | 'cause whenever I come into his community, like his chat,
03:59:04.200 | people are just gonna spam, it's like a woman moment.
03:59:06.020 | It's a woman moment whenever I say something,
03:59:07.580 | and it's kind of like, yeah, it's getting pretty annoying,
03:59:10.020 | as I said, it's just annoying
03:59:11.180 | when you see it every single day.
03:59:12.820 | - There you go, wisdom from somebody younger than you.
03:59:17.060 | - Wisdom can come from all kinds of people.
03:59:19.260 | Yeah, of course, just sometimes in very limited quantities,
03:59:21.580 | depends on the age.
03:59:22.420 | - Oh boy.
03:59:23.260 | - You can learn something from anybody.
03:59:25.220 | - What advice would you give to young people,
03:59:27.540 | the both of you, that you have both audiences
03:59:31.580 | where young people look up to you?
03:59:33.420 | In general, if you were to give advice
03:59:35.260 | to somebody in high school, like how to create a life
03:59:39.560 | they can be proud of, what would you say?
03:59:41.560 | - The most important thing that I've learned
03:59:44.860 | is to view people as different and not better or worse.
03:59:48.620 | And when you view people as different
03:59:49.780 | instead of better or worse,
03:59:50.940 | you learn that there's almost always something
03:59:52.900 | that you can learn from anybody.
03:59:54.780 | Like be open and empathetic
03:59:55.900 | towards other people's experiences.
03:59:57.440 | Nobody does anything by random choice.
04:00:00.060 | Like there's always reasons why people act the way they do.
04:00:02.340 | And as long as you're willing to kind of like be open
04:00:04.600 | and receptive to the lived experiences of other people,
04:00:07.280 | you're gonna be able to gather information
04:00:09.020 | and create like a more cohesive and better view of the world
04:00:11.460 | than any of your peers will.
04:00:13.660 | - Do you have any kind of advice
04:00:15.300 | you can give to young folks?
04:00:16.940 | - I feel like something that I see,
04:00:18.200 | especially in America a lot,
04:00:19.540 | is that a lot of people kind of get told what to do
04:00:23.020 | early on, like in high school.
04:00:24.660 | They're supposed to become this thing,
04:00:26.980 | like education-wise, like they're supposed to like
04:00:28.700 | become a doctor or this thing or whatever.
04:00:30.620 | And then they kind of just like give up on things
04:00:32.020 | that they're actually passionate about.
04:00:33.480 | So I think a lot of teenagers get really confused.
04:00:35.880 | They get an education and then they get that job
04:00:38.380 | and they hate everything.
04:00:39.300 | And they think that when they're reaching the job,
04:00:42.260 | when they're reaching like the journey,
04:00:43.340 | they're gonna get happy.
04:00:44.180 | That's like where the happiness is gonna be.
04:00:45.620 | But then when they get to there,
04:00:47.500 | they just hate everything.
04:00:48.660 | And then they become really depressed.
04:00:50.420 | And I've seen this so much.
04:00:51.960 | Like I've seen this all the time.
04:00:53.500 | And it's pretty sad to me to see so many people
04:00:57.040 | that are just wasting time.
04:00:58.260 | And then they just get really confused.
04:01:00.020 | And I don't know, it's the same thing with relationships too.
04:01:03.300 | No one really knows what they want anymore.
04:01:05.060 | I feel like everyone is just kind of doing whatever
04:01:07.100 | like society is saying or the parents are saying
04:01:08.820 | or their friends are saying.
04:01:10.220 | And they're never really doing anything
04:01:12.980 | that's super meaningful anymore.
04:01:14.460 | And like they don't, so what I would say is like,
04:01:17.100 | try to find something that is important to you.
04:01:19.860 | It could be anything really, like some sort of passion,
04:01:22.540 | maybe like your friends, maybe like what matters to you.
04:01:25.940 | Like figuring those things out,
04:01:27.420 | I think is really important.
04:01:28.260 | - And that comes from being able to listen
04:01:30.060 | to like some inner voice.
04:01:31.420 | So it's not gonna come from elsewhere.
04:01:32.260 | - Yeah, I guess.
04:01:33.100 | It's really hard because you're living the life
04:01:34.940 | and like there's things happening around you
04:01:36.460 | and people tell you what to do and what not to do.
04:01:39.060 | No one really has like their own opinions.
04:01:40.660 | Everyone is just kind of like listening
04:01:42.820 | to the cooler thing or, you know.
04:01:45.340 | - Except Steven, he seems to stand on his own.
04:01:47.500 | - True. - Yeah, I guess.
04:01:48.340 | - Free thinking. - Yeah, I'd say so.
04:01:49.340 | - High hashtag.
04:01:50.260 | - Like something I realized too,
04:01:51.700 | 'cause we just went to TwitchCon
04:01:52.940 | and we were talking to a lot of streamers.
04:01:54.700 | - How was that?
04:01:56.100 | - It was interesting.
04:01:56.940 | I thought it was interesting because the few people
04:02:00.020 | that I feel like that seem really cool
04:02:01.900 | and that I look up to like in the streaming world,
04:02:04.300 | all of them want to quit streaming.
04:02:06.420 | All of them want to.
04:02:07.300 | No one wants, like no one likes it.
04:02:08.780 | And they're so successful.
04:02:10.620 | Like they are around successful people.
04:02:13.220 | They're working every single day.
04:02:15.300 | They're working hard.
04:02:16.140 | They're making so much money.
04:02:17.460 | And everyone is just complaining.
04:02:19.540 | And like, they're complaining about like not being able
04:02:23.140 | to see their partner or, you know,
04:02:25.780 | because they need to live somewhere else.
04:02:27.260 | Because I see these things and they seem extremely unhappy.
04:02:31.220 | But it's so hard for them to just like cut
04:02:33.060 | all this successful stuff off
04:02:35.060 | because that's like what you, you know, learn to do.
04:02:38.500 | And that's like supposed to be like your happiness,
04:02:40.500 | but it isn't.
04:02:41.340 | Everyone is really unhappy.
04:02:42.380 | - Yeah, there's something about,
04:02:43.540 | maybe streaming is different,
04:02:44.980 | but YouTube folks too, I've interacted with a few.
04:02:47.620 | And even in podcasting space,
04:02:50.980 | people become obsessed about the views and numbers
04:02:53.660 | and subscribers and stuff like that.
04:02:55.460 | So I turn, I never talk about that.
04:02:59.020 | I don't pay attention to that.
04:03:02.060 | I feel like that's a drug that destroys your mind.
04:03:06.900 | Your mind as an artist,
04:03:08.460 | ability to create truly unique things.
04:03:12.020 | Also your mind in terms of the anxiety,
04:03:14.020 | the ups and downs of the attention mechanism.
04:03:18.060 | And then also being just,
04:03:20.940 | if something that you make is not popular,
04:03:23.540 | but it meant a lot to you,
04:03:24.940 | you will think of it less because it's not popular.
04:03:28.380 | That's a really dangerous thing.
04:03:29.620 | And because everyone around you is reinforcing,
04:03:32.740 | like I'll get messages like,
04:03:34.700 | "Wow, this thing got this many views or something.
04:03:38.500 | Great job."
04:03:39.660 | It's like, no, you don't get it.
04:03:41.980 | Like that's not, that's not,
04:03:44.140 | everyone is enforcing this,
04:03:45.540 | like this language of views and likes and so on.
04:03:48.980 | And it's correlated of course,
04:03:51.100 | 'cause truly impactful things
04:03:52.340 | will get a lot of attention often,
04:03:55.020 | but it's not on the individual local scale,
04:03:59.300 | like temporarily is,
04:04:00.780 | it can really fuck with your mind.
04:04:03.740 | And I see that in the creators,
04:04:05.620 | they become addicts to the algorithm.
04:04:09.580 | - Lost in chasing views.
04:04:10.900 | Like we know friends that,
04:04:12.140 | we know cool people and then they start streaming
04:04:13.820 | and eventually they're like chasing the dragon of like.
04:04:16.900 | - And they change, yeah.
04:04:18.900 | It's like hard to engage with them anymore.
04:04:19.740 | - They change who they are completely.
04:04:20.580 | And there's like, this is something I've always said
04:04:22.100 | that like one of the biggest blessings
04:04:23.980 | and biggest curses of humanity
04:04:25.340 | is we are very good at acclimating.
04:04:27.060 | Like you can become paralyzed,
04:04:28.340 | you can have all sorts of horrible things happen to you
04:04:29.740 | and you'll get used to it and you'll be okay.
04:04:31.100 | You're gonna have like a good baseline,
04:04:32.500 | but it works the other way too.
04:04:33.500 | And that you can get more and more and more and more
04:04:35.260 | and you acclimate to it almost immediately.
04:04:37.260 | There's like, this is a phenomenon that I bet it happens
04:04:39.540 | in the YouTube world,
04:04:40.380 | but I know what happens in the streaming world
04:04:41.260 | where you're streaming 1000 viewers every day,
04:04:44.100 | huge event happens and you blow up
04:04:45.820 | and you got like 15,000 viewers for a day or two.
04:04:48.420 | And then it starts to go down and down and down
04:04:50.140 | and down and down.
04:04:50.980 | And then after all the drama has died off,
04:04:52.500 | you're at like 3000 concurrent viewers.
04:04:54.900 | Now in the macro, you went from 1000 to 3000.
04:04:57.660 | That feels awesome.
04:04:58.740 | But you actually feel like shit the whole time
04:05:00.380 | 'cause you're remembering when you had 10 or 15,000
04:05:02.220 | and now everything feels horrible.
04:05:03.780 | And you'll see people climb over time.
04:05:05.420 | They're like, fuck, like, but whatever
04:05:06.420 | that one huge stream I had,
04:05:07.340 | like I've never been able to,
04:05:08.700 | and it's like, dude, you're doing great.
04:05:10.300 | Yeah, that happens a lot.
04:05:11.660 | - There's so many people that we know
04:05:12.740 | that we find super, super cool.
04:05:14.300 | They're passionate about things.
04:05:15.620 | They have so much interest
04:05:17.260 | and then they just get like so addicted to these numbers.
04:05:19.500 | And like all the, everything is just ruined.
04:05:21.900 | Like all the cool things about them is ruined
04:05:23.420 | because they stopped doing the things
04:05:24.580 | that they actually like to do something else
04:05:27.260 | that gives them more viewers and more money.
04:05:28.820 | And it's really sad to see.
04:05:30.660 | - Yeah, that temporary sacrifice that seems temporary
04:05:35.020 | is that it actually destroys you.
04:05:37.620 | Like for one time making a choice,
04:05:39.500 | 'cause I come across those choices often.
04:05:41.420 | Like I can do this.
04:05:42.580 | You can kind of know what's gonna be popular and not.
04:05:46.060 | And you have to ask yourself that question.
04:05:47.740 | Like, is this gonna sacrifice?
04:05:49.860 | - 'Cause if people are sacrificing
04:05:51.460 | like intimate relationships,
04:05:53.340 | they're sacrificing time with their family.
04:05:55.220 | They're sacrificing time with the things
04:05:57.460 | that they feel good about and that they like.
04:06:00.180 | And that's something I kind of realized last year
04:06:03.180 | 'cause I was working so much
04:06:04.460 | and I was just grinding, grinding, grinding
04:06:05.980 | because it was kind of new for me.
04:06:07.340 | And then new years came by and I was like,
04:06:09.580 | wait, what did I even do like the entire year?
04:06:11.980 | Like I traveled to a bunch of places,
04:06:13.820 | but nothing actually really meant anything to me
04:06:15.580 | because I felt like I was just working the entire time.
04:06:17.620 | I felt like I was just numb through the entire year.
04:06:19.780 | And it was really scary.
04:06:23.100 | I rented a super pretty house for a week
04:06:26.380 | with my dad and my sister
04:06:27.740 | 'cause I wanted to spend time with them.
04:06:29.580 | But the entire time I was just streaming
04:06:32.540 | and I actually didn't ever like calm down
04:06:34.780 | and just like chill with them.
04:06:36.220 | And that's like time I'll never get back.
04:06:38.820 | I don't give a shit about the money that I made that week,
04:06:41.100 | but I lost the time.
04:06:43.100 | And like, that is really important to me.
04:06:45.460 | And yeah, and a lot of people are doing that.
04:06:47.860 | And I feel like, as you said,
04:06:49.220 | you can definitely see that in like artists for sure.
04:06:51.860 | I feel like if you look at like artists
04:06:53.500 | like back in the 60s or 70s,
04:06:55.500 | I feel like things were just so much better back then.
04:06:57.740 | And it feels like they were actually making music
04:07:00.220 | that meant something to them.
04:07:01.700 | They were actually making art.
04:07:03.460 | And I feel like today everything is just kind of like
04:07:05.740 | whatever is cool, whatever sells,
04:07:07.340 | whatever sounds in a certain way,
04:07:09.060 | everything is kind of the same thing.
04:07:10.900 | And everything that is very artistic and very cool
04:07:13.460 | is actually not that popular at all.
04:07:16.180 | And that's kind of sad, I think.
04:07:18.100 | - Yeah, of course there's now bigger mechanisms
04:07:20.180 | and platforms to spread stuff, music.
04:07:23.820 | So as long as you can be content with not being popular,
04:07:27.300 | I think you can still create art.
04:07:28.140 | - Yeah, but not like when people get a little popular,
04:07:30.660 | they get addicted to that so fast.
04:07:32.140 | - Yeah, it's weird.
04:07:33.700 | You have been somewhat good,
04:07:35.820 | at least from my outsider perspective,
04:07:37.820 | because I think you, I can at least imagine
04:07:41.060 | you making choices that could make you more popular
04:07:43.620 | and you don't seem to make those choices.
04:07:45.860 | - Like having an orbital problem.
04:07:49.620 | But it is very intentional, like you said.
04:07:51.580 | And I made that choice at every single stage of my life.
04:07:54.420 | One is because from the perspective
04:07:56.060 | of being a carpet cleaner,
04:07:56.900 | my life is way better than that was or ever would have been.
04:07:59.980 | So I'm already doing way better than I ever thought
04:08:01.620 | somebody like me ever could be.
04:08:03.360 | But then two, I super love my job.
04:08:06.400 | Every time I wake up,
04:08:07.300 | every time I fly to a place to do a podcast,
04:08:08.940 | every time I get to talk to really cool people,
04:08:10.180 | every single part about my job, I super like.
04:08:12.060 | If there's something I don't like,
04:08:13.060 | I just cut it off because I don't care.
04:08:14.260 | 'Cause I'm already making plenty of money doing what I do.
04:08:16.060 | And why would I ever wake up and not like what I'm doing
04:08:18.260 | when I can like what I'm doing?
04:08:19.540 | - How do you guys find through that,
04:08:21.300 | given that you love it
04:08:23.580 | and you sometimes maybe lose yourself in the drug of it,
04:08:26.020 | how do you find like work-life balance together
04:08:29.180 | inside a relationship?
04:08:30.940 | Like time for each other?
04:08:32.660 | - I don't at all, so I'm not a good person to ask.
04:03:34.940 | - What do you love more, Mel or Factorio?
04:08:38.180 | (laughing)
04:03:39.060 | - Factorio is a really good game.
04:08:40.520 | That's like not a fair comparison, okay?
04:08:41.980 | You're talking about one of the best, cleanest games,
04:08:44.060 | best supported ever made, cleanest code base.
04:08:46.900 | - I think he's just saying all this for other reasons.
04:08:48.820 | - Yeah.
04:08:49.660 | - It's more Factorio time for me.
04:08:50.700 | (laughing)
04:08:52.020 | - Starting to understand where the misogyny comes from.
04:08:53.900 | By the way, is Factorio legit a really good game?
04:08:57.220 | - Yeah, of course, yeah.
04:08:58.060 | - Okay.
04:08:58.900 | - If you're like, do you enjoy programming?
04:09:00.860 | - Of course, that's all I do.
04:09:01.820 | That's all.
04:09:02.660 | - Okay.
04:09:03.500 | - To me, programming is the game in itself
04:09:06.180 | that I enjoy probably more than anything else, but yeah.
04:09:08.340 | - It's very much a game like that.
04:09:09.700 | If you're into stuff like that,
04:09:10.860 | like you can lose hundreds of hours very quickly
04:09:13.180 | to like you have a problem and then you think of a solution
04:09:16.260 | and then you iterate on that over and over and over again
04:09:18.180 | in larger, larger schemes.
04:09:19.260 | Sometimes you gotta redesign stuff.
04:09:20.420 | Sometimes you got like, it's a very much like that kind of game.
04:09:22.140 | - So you're essentially building a factory, like what,
04:09:23.980 | on a foreign planet or something like that?
04:09:26.140 | - Basically, it's like a bunch of,
04:09:27.140 | you're trying to automate different problems
04:09:28.820 | so that you can build bigger things,
04:09:30.260 | so that you can automate bigger problems,
04:09:31.660 | so you can build bigger things
04:09:32.500 | and automate bigger problems, yeah.
04:09:33.340 | - So it's more complicated than like a city building game,
04:09:36.260 | like some city type of thing?
04:09:38.300 | - I wouldn't say it's more complicated.
04:09:39.620 | It's more, like, Factorio is like a game of like logic,
04:09:43.120 | like strictly like logic.
04:09:44.620 | - Oh, so you're almost like building a circuit or something.
04:09:46.740 | - Yes, yeah, there's like, there's circuitry
04:09:48.380 | and you've got your N-ORGs, ORG-ATs,
04:09:49.820 | like there's stuff like that.
04:09:50.820 | It's very much like that, like problems.
04:09:51.660 | - What are the enemies in the game?
04:09:53.220 | Like what, what's something attacking?
04:09:54.060 | - They're like bugs that try to bite you
04:09:56.020 | and you can get guns and shoot and kill them, but it's like-
04:09:57.860 | - 'Cause I saw there's like shooting going on.
04:09:59.540 | - Yeah, but that's like a minor,
04:10:00.500 | it's just like another problem to solve
04:10:01.500 | in the game, basically, yeah.
04:10:02.460 | - Okay, all right.
04:10:03.780 | So you see what we did there?
04:10:04.820 | We just started talking about the game as we're trying.
04:10:07.580 | Oh my God, that's like a-
04:10:08.420 | - That's a really good game, okay.
04:10:09.740 | - That's horrible.
04:10:10.620 | Anyway, is there, is that basically the struggle,
04:10:13.500 | not a struggle, how to get human, like intimate human time?
04:10:18.500 | - I feel like it was like that a little bit more
04:10:21.100 | in the past.
04:10:21.940 | I feel like it's been better lately,
04:10:24.800 | but I think it's because when we started dating,
04:10:27.580 | I wasn't streaming and I kind of just like gave up
04:10:30.420 | like my trip in New Zealand.
04:10:32.220 | I gave up, like, like I left Sweden.
04:10:35.540 | So I was just like in LA, which I hate, I hate LA.
04:10:38.860 | I don't like LA at all.
04:10:40.580 | It's hard to make friends that are like real,
04:10:43.580 | that are into the same stuff as you.
04:10:45.460 | It was just really hard for me to connect with anyone,
04:10:47.660 | especially also like being a European
04:10:49.780 | and like being around Americans was very strange.
04:10:52.180 | So the only thing I had when I came here was him.
04:10:56.140 | And I didn't expect-
04:10:58.380 | - Bad situation.
04:10:59.220 | - Because yeah, because we had like two weeks
04:11:01.940 | of hanging out and like he would be on his computer
04:11:03.640 | sometimes and like do emails and stuff,
04:11:05.280 | but I wasn't thinking that he would stream
04:11:07.220 | like 12 hours a day.
04:11:08.060 | And it was pretty, like it was pretty intense,
04:11:09.600 | like in the beginning of it as well.
04:11:10.780 | And I realized it was really hard to like get attention
04:11:13.740 | or get time because his like love meter would be like full
04:11:18.020 | if I was just in the house.
04:11:19.260 | And that's just kind of like the way he is.
04:11:20.860 | And for me back then when I didn't have anything else to do,
04:11:23.860 | was kind of like a, it was kind of crazy for me.
04:11:28.100 | I feel like right now, because I do work as well
04:11:30.460 | and I have things going for me
04:11:32.900 | and I have other friends now that I made,
04:11:35.880 | I feel like it's a lot easier.
04:11:37.280 | 'Cause, and I can definitely like enjoy
04:11:39.120 | just like being in separate rooms
04:11:40.680 | and just like hearing him in the background is really nice.
04:11:42.520 | So I can sit and paint and like in my room
04:11:45.320 | and I will do that for hours while I'm just like hearing
04:11:47.160 | him stream in the background.
04:11:48.000 | It's kind of like comforting that he's just there.
04:11:50.600 | It feels nice.
04:11:51.420 | I like it.
04:11:52.260 | - 'Cause to you that's the sound of happiness.
04:11:53.800 | - Yes, because I know he's right there.
04:11:55.240 | Yeah, it's nice.
04:11:56.080 | And they'll come in and check on me sometimes.
04:11:57.400 | And it's kind of like, it's actually very comforting.
04:11:59.360 | It's very nice.
04:12:00.200 | I like it.
04:12:01.240 | - Yeah, I think that's kind of what a relationship is.
04:12:03.640 | Like you do fun things together
04:12:04.960 | and you share moments together,
04:12:06.760 | but also just like having someone like around you
04:12:08.880 | is really, really nice.
04:12:10.400 | And I think that's probably, maybe it's me growing up.
04:12:13.780 | Maybe that's what it is.
04:12:15.840 | And like, I start liking like the kind of,
04:12:17.820 | I feel like we're like an old couple,
04:12:19.240 | like we're like 80 and we're just like around.
04:12:21.400 | We don't really have to talk much.
04:12:22.840 | It's nice to just do that.
04:12:23.680 | - And that fills your love meter?
04:12:25.880 | I like this terminology, love meter.
04:12:28.680 | - I need both.
04:12:29.880 | - Okay, you're making it sound like
04:12:32.440 | that I'm like craving like crazy time.
04:12:34.600 | - I'm not saying anything.
04:12:35.440 | I haven't said a single thing at all.
04:12:36.280 | - You're making faces right now.
04:12:37.360 | I know exactly what you're thinking.
04:12:38.200 | - There's so much judging going on.
04:12:40.000 | - There's no judging at all.
04:12:41.120 | - No, but like, I think whenever we do plan something out,
04:12:44.640 | like if we go on a trip like every other month
04:12:47.040 | or once a month, I feel like usually like that's enough
04:12:51.120 | as long as he's not playing Factorio the entire time.
04:12:54.040 | Like if I feel like he's going on these trips for me
04:12:56.560 | and he's not like doing things for me
04:12:58.920 | or he's not interested in like spending time
04:13:01.200 | or like being present with me,
04:13:02.680 | then I'll feel like,
04:13:03.520 | I just feel like I'm just wasting time right now
04:13:05.040 | and then I get kind of disappointed.
04:13:06.640 | But otherwise I think this, like, this is fun.
04:13:08.920 | I think this is like spending time together
04:13:10.760 | 'cause we're like doing something together.
04:13:11.960 | Yeah, it's fun, yeah.
04:13:14.120 | - My love meter is full.
04:13:15.880 | - Your love meter is full.
04:13:17.400 | - It's my social life.
04:13:18.400 | Otherwise it's fine.
04:13:19.240 | - We like to think about it that way,
04:13:20.680 | that like I need like a little bit more of like
04:13:22.400 | this one thing, like quality time.
04:13:24.280 | And he needs like almost zero quality time.
04:13:27.000 | But like, let's say that we took away like physical touch,
04:13:29.360 | you would probably not be very happy.
04:13:31.680 | - So you need physical touch.
04:13:32.880 | So it's not just Factorio, huh?
04:13:34.160 | - No, I'm a very cuddly person.
04:13:35.280 | - Yeah, like cuddly.
04:13:36.520 | And then you're like, I guess like acts of service.
04:13:38.440 | Like if I do something for you, you get really happy.
04:13:40.280 | - Like hot chocolate.
04:13:41.120 | - Yeah, like if I give him hot chocolate in the morning,
04:13:43.280 | he gets really happy, so.
04:13:44.640 | - The actual, it's not the hot chocolate,
04:13:46.720 | it's the giving of the hot chocolate.
04:13:48.080 | - No, it's just the hot chocolate.
04:13:49.720 | But if she gives it to me,
04:13:50.640 | I didn't have to get it myself.
04:13:51.480 | - It's just physical touch that you like.
04:13:52.480 | - That's really nice.
04:13:54.320 | - All right, well, if you have to choose
04:13:55.560 | between Factorio and the drama or political discourse.
04:14:00.120 | - Probably political discourse, probably my calling.
04:14:02.400 | But I am a good Factorio player.
04:14:04.200 | - But what role exactly does Factorio play
04:14:07.840 | in your streaming life?
04:14:10.000 | - Oh, well, right now it's just,
04:14:11.120 | usually there are like these games
04:14:12.360 | that I play in the background as I have conversations,
04:14:14.040 | 'cause it's hard for me to just sit on the computer
04:14:15.320 | and just talk and not like be playing a game
04:14:17.560 | at the same time.
04:14:18.400 | So it's just something to keep me kind of like occupied.
04:09:19.920 | You know, like being able to play with
04:09:20.760 | like little like widget things, I guess.
04:14:22.000 | - Yeah, that's what yours is.
04:14:22.840 | - Yeah, basically, yeah.
04:09:23.680 | - So it's just like Minecraft or Factorio for me.
04:14:26.000 | - All right, well, my love meter is full from this.
04:14:30.400 | Mel, thank you so much for joining us.
04:14:32.320 | - Thank you for having me.
04:14:33.240 | - This was really fun.
04:14:34.440 | You guys are fascinating human beings.
04:14:36.880 | Thank you for existing.
04:14:38.560 | I'm glad to live in a world where you exist.
04:14:40.680 | I can't wait to see what kind of beautiful thing
04:14:44.160 | you create next and the crazy kind of art that you create
04:14:48.560 | through the different people you interact with.
04:14:50.640 | Destiny, Steven, you're an amazing human.
04:14:52.440 | Thank you so much for talking to me.
04:14:53.680 | It's an honor.
04:14:54.520 | Hope to talk with you again.
04:14:55.840 | I'm talking to Ben Shapiro.
04:14:57.000 | You've given me a lot of inspiration.
04:14:59.080 | It's an honor to talk to the Ben Shapiro of the left.
04:15:01.760 | - Yeah, well, thanks a lot for having me.
04:15:03.040 | I appreciate it.
04:15:03.880 | - Thank you, guys. - It's been fun.
04:15:04.720 | - Thank you.
04:15:06.040 | - Thanks for listening to this conversation with Destiny.
04:15:08.720 | To support this podcast,
04:15:09.880 | please check out our sponsors in the description.
04:15:12.600 | And now let me leave you with some words from Lewis Carroll.
04:15:16.000 | "It's no use going back to yesterday
04:15:18.800 | "because I was a different person then."
04:15:22.000 | Thank you for listening and hope to see you next time.
04:15:25.400 | (upbeat music)
04:15:28.000 | (upbeat music)