
Whitney Cummings: Comedy, Robotics, Neurology, and Love | Lex Fridman Podcast #55


Chapters

0:00 Introduction
51:26 Definition of Codependency
56:54 Prevent Social Media from Destroying Your Mental Health
57:19 Addiction to Social Media
68:00 What Is Love
74:39 Terror Management Theory

Transcript

00:00:00.000 | The following is a conversation with Whitney Cummings.
00:00:03.640 | She's a stand-up comedian, actor, producer, writer, director,
00:00:07.240 | and recently, finally, the host of her very own podcast
00:00:11.240 | called Good For You.
00:00:12.920 | Her most recent Netflix special called Can I Touch It?
00:00:15.920 | features in part a robot she affectionately
00:00:18.680 | named Bearclaw that is designed to be visually a replica of
00:00:22.240 | Whitney.
00:00:23.400 | It's exciting for me to see one of my favorite comedians
00:00:26.000 | explore the social aspects of robotics and AI in our society.
00:00:30.720 | She also has some fascinating ideas
00:00:32.920 | about human behavior, psychology, and neurology,
00:00:36.000 | some of which she explores in her book called
00:00:38.120 | I'm Fine and Other Lies.
00:00:41.240 | It was truly a pleasure to meet Whitney
00:00:43.360 | and have this conversation with her,
00:00:45.200 | and even to continue it through text afterwards.
00:00:47.960 | Every once in a while, late at night,
00:00:50.160 | I'll be programming over a cup of coffee
00:00:52.360 | and will get a text from Whitney saying something hilarious.
00:00:55.760 | Or weirder yet, sending a video of Brian Callen
00:00:58.960 | saying something hilarious.
00:01:00.960 | That's when I know the universe has a sense of humor,
00:01:03.880 | and it gifted me with one hell of an amazing journey.
00:01:07.400 | Then I put the phone down and go back
00:01:09.200 | to programming with a stupid, joyful smile on my face.
00:01:13.400 | If you enjoy this conversation, listen to Whitney's podcast,
00:01:16.320 | Good For You, and follow her on Twitter and Instagram.
00:01:19.840 | This is the Artificial Intelligence Podcast.
00:01:22.640 | If you enjoy it, subscribe on YouTube,
00:01:24.840 | give it five stars on Apple Podcasts,
00:01:26.800 | support on Patreon, or simply connect with me on Twitter,
00:01:30.160 | @LexFridman, spelled F-R-I-D-M-A-N.
00:01:34.040 | This show is presented by Cash App,
00:01:35.840 | the number one finance app in the App Store.
00:01:38.280 | They regularly support
00:01:39.160 | Whitney's Good For You podcast as well.
00:01:41.440 | I personally use Cash App to send money to friends,
00:01:43.920 | but you can also use it to buy, sell,
00:01:45.680 | and deposit Bitcoin in just seconds.
00:01:47.960 | Cash App also has a new investing feature.
00:01:50.660 | You can buy fractions of a stock, say $1 worth,
00:01:53.480 | no matter what the stock price is.
00:01:55.760 | Brokerage services are provided by Cash App Investing,
00:01:58.560 | a subsidiary of Square, and member SIPC.
00:02:02.020 | I'm excited to be working with Cash App
00:02:04.120 | to support one of my favorite organizations called FIRST,
00:02:07.180 | best known for their FIRST Robotics and LEGO competitions.
00:02:10.400 | They educate and inspire hundreds of thousands of students
00:02:13.800 | in over 110 countries,
00:02:16.000 | and have a perfect rating on Charity Navigator,
00:02:18.680 | which means the donated money
00:02:19.960 | is used to maximum effectiveness.
00:02:22.380 | When you get Cash App from the App Store or Google Play
00:02:25.240 | and use code LEXPODCAST, you'll get $10,
00:02:28.960 | and Cash App will also donate $10 to FIRST,
00:02:32.100 | which again is an organization that I've personally seen
00:02:35.280 | inspire girls and boys to dream
00:02:37.400 | of engineering a better world.
00:02:39.180 | This podcast is supported by ZipRecruiter.
00:02:43.080 | Hiring great people is hard,
00:02:45.240 | and to me is the most important element
00:02:47.300 | of a successful mission-driven team.
00:02:50.100 | I've been fortunate to be a part of
00:02:52.120 | and to lead several great engineering teams.
00:02:54.720 | The hiring I've done in the past
00:02:56.680 | was mostly through tools that we built ourselves,
00:02:59.420 | but reinventing the wheel was painful.
00:03:02.400 | ZipRecruiter is a tool that's already available for you.
00:03:05.220 | It seeks to make hiring simple, fast, and smart.
00:03:09.000 | For example, Codeverse co-founder Gretchen Huebner
00:03:11.960 | used ZipRecruiter to find a new game artist
00:03:14.400 | to join her education tech company.
00:03:16.700 | By using ZipRecruiter screening questions
00:03:18.840 | to filter candidates,
00:03:20.280 | Gretchen found it easier to focus on the best candidates,
00:03:23.080 | and finally hired the perfect person for the role
00:03:26.440 | in less than two weeks from start to finish.
00:03:29.440 | ZipRecruiter, the smartest way to hire.
00:03:32.520 | See why ZipRecruiter is effective
00:03:34.160 | for businesses of all sizes by signing up,
00:03:36.660 | as I did, for free at ziprecruiter.com/lexpod.
00:03:41.560 | That's ziprecruiter.com/lexpod.
00:03:46.520 | And now, here's my conversation with Whitney Cummings.
00:03:50.640 | I have trouble making eye contact, as you can tell.
00:03:53.760 | - Me too.
00:03:54.600 | Did you know that I had to work on making eye contact
00:03:56.960 | 'cause I used to look here?
00:03:58.880 | Do you see what I'm doing? - That helps, yeah.
00:04:00.360 | - Do you want me to do that?
00:04:01.800 | Well, I'll do this way, I'll cheat the camera.
00:04:03.600 | But I used to do this, and finally people,
00:04:05.760 | like I'd be on dates, and guys would be like,
00:04:07.320 | "Are you looking at my hair?"
00:04:08.920 | It would make people really insecure
00:04:10.900 | because I didn't really get a lot of eye contact as a kid.
00:04:13.200 | It's one to three years.
00:04:14.760 | Did you not get a lot of eye contact as a kid?
00:04:16.440 | - I don't know.
00:04:17.280 | I haven't done the soul searching.
00:04:19.560 | - Right.
00:04:20.400 | - But there's definitely some psychological issues.
00:04:24.200 | - Makes you uncomfortable.
00:04:25.520 | - Yeah, for some reason, when I connect eyes,
00:04:27.880 | I start to think, I assume that you're judging me.
00:04:31.800 | - Oh, well, I am.
00:04:33.320 | That's why you assume that.
00:04:34.800 | We all are.
00:04:36.020 | - All right. - This is perfect.
00:04:36.860 | The podcast would be me and you both
00:04:38.160 | staring at the table the whole time.
00:04:40.200 | (both laughing)
00:04:42.400 | - Do you think robots of the future,
00:04:44.200 | ones with human level intelligence,
00:04:46.000 | will be female, male, genderless,
00:04:49.480 | or another gender we have not yet created as a society?
00:04:53.240 | - You're the expert at this.
00:04:55.240 | - Well, I'm gonna ask you-- - You know the answer.
00:04:57.320 | - I'm gonna ask you questions
00:04:58.680 | that maybe nobody knows the answer to.
00:05:01.000 | - Okay.
00:05:01.960 | - And then I just want you to hypothesize
00:05:04.080 | as an imaginative author, director, comedian,
00:05:09.080 | and just intellectual. - Can we just be very clear
00:05:12.120 | that you know a ton about this
00:05:14.200 | and I know nothing about this,
00:05:15.800 | but I have thought a lot about
00:05:19.600 | what I think robots can fix in our society.
00:05:22.860 | And I mean, I'm a comedian.
00:05:24.400 | It's my job to study human nature,
00:05:27.520 | to make jokes about human nature,
00:05:28.860 | and to sometimes play devil's advocate.
00:05:31.080 | And I just see such a tremendous negativity around robots,
00:05:35.160 | or at least the idea of robots,
00:05:37.000 | that it was like, oh, I'm just gonna take the opposite side
00:05:40.120 | for fun, for jokes, and then I was like,
00:05:43.400 | oh no, I really agree with this devil's advocate argument.
00:05:45.920 | So please correct me when I'm wrong about this stuff.
00:05:49.400 | - So first of all, there's no right and wrong
00:05:51.760 | because we're all, I think most of the people
00:05:54.880 | working on robotics are really not actually even thinking
00:05:57.560 | about some of the big picture things
00:06:00.040 | that you've been exploring.
00:06:01.280 | In fact, your robot, what's her name, by the way?
00:06:04.600 | - Bear Claw. - We'll go with Bear Claw.
00:06:06.200 | (laughing)
00:06:09.840 | What's the genesis of that name, by the way?
00:06:11.680 | - Bear Claw was, I, God, I don't even remember the joke
00:06:15.000 | 'cause I black out after I shoot specials,
00:06:16.640 | but I was writing something about the pet names
00:06:19.160 | that men call women, like Cupcake, Sweetie, Honey,
00:06:22.920 | you know, like, we're always named after desserts
00:06:26.500 | or something, and I was just writing a joke
00:06:29.360 | about if you wanna call us a dessert,
00:06:30.960 | at least pick a cool dessert, you know,
00:06:33.160 | like Bear Claw, like something cool.
00:06:35.640 | So I ended up calling her Bear Claw.
00:06:36.480 | - And she stuck.
00:06:38.240 | So do you think future robots
00:06:42.280 | of greater and greater intelligence
00:06:44.440 | would like to make them female, male?
00:06:46.520 | Would we like to assign them gender?
00:06:48.560 | Or would we like to move away from gender
00:06:50.800 | and say something more ambiguous?
00:06:54.000 | - I think it depends on their purpose, you know?
00:06:56.360 | I feel like if it's a sex robot,
00:06:59.920 | people prefer certain genders, you know?
00:07:01.900 | And I also, you know, when I went down
00:07:03.960 | and explored the robot factory,
00:07:06.300 | I was asking about the type of people
00:07:07.680 | that bought sex robots, and I was very surprised
00:07:11.520 | at the answer, because of course,
00:07:13.400 | the stereotype was it's gonna be a bunch of perverts.
00:07:15.320 | It ended up being a lot of people that were handicapped,
00:07:18.580 | a lot of people with erectile dysfunction,
00:07:20.520 | and a lot of people that were exploring their sexuality.
00:07:23.880 | A lot of people that thought they were gay,
00:07:25.920 | but weren't sure, but didn't wanna take the risk
00:07:28.120 | of trying on someone that could reject them
00:07:31.660 | and being embarrassed, or they were closeted,
00:07:33.880 | or in a city where maybe that's, you know,
00:07:36.340 | taboo and stigmatized, you know?
00:07:37.940 | So I think that a gendered sex robot,
00:07:40.560 | that would serve an important purpose
00:07:42.340 | for someone trying to explore their sexuality.
00:07:44.160 | Am I into men?
00:07:45.000 | Let me try on this thing first.
00:07:46.160 | Am I into women?
00:07:47.000 | Let me try on this thing first.
00:07:48.220 | So I think gendered robots would be important for that,
00:07:51.200 | but I think genderless robots,
00:07:52.460 | in terms of emotional support robots, babysitters,
00:07:56.520 | I'm fine for a genderless babysitter
00:07:58.720 | with my husband in the house.
00:08:00.120 | You know, there are places that I think
00:08:02.080 | that genderless makes a lot of sense,
00:08:04.640 | but obviously not in the sex area.
00:08:07.540 | - What do you mean with your husband in the house?
00:08:09.920 | What's that have to do with the gender of the robot?
00:08:11.820 | - Right, I mean, I don't have a husband,
00:08:13.040 | but hypothetically speaking,
00:08:14.340 | I think every woman's worst nightmare
00:08:15.740 | is like the hot babysitter.
00:08:17.340 | (laughing)
00:08:18.220 | You know what I mean?
00:08:19.260 | So I think that there is a time and place,
00:08:21.700 | I think, for genderless, you know, teachers, doctors,
00:08:25.380 | all that kind of, it would be very awkward
00:08:27.260 | if the first robotic doctor was a guy,
00:08:29.700 | or the first robotic nurse was a woman.
00:08:32.580 | You know, it's sort of, that stuff is so loaded.
00:08:36.120 | I think that genderless could just take
00:08:38.480 | the unnecessary drama out of it,
00:08:43.480 | and the possibility to sexualize them,
00:08:46.040 | or be triggered by any of that stuff.
00:08:49.560 | - So there's two components to this, to Bearclaw.
00:08:52.840 | So one is the voice and the talking, and so on,
00:08:55.040 | and then there's the visual appearance.
00:08:56.400 | So on the topic of gender and genderless,
00:08:59.620 | in your experience, what has been the value
00:09:03.200 | of the physical appearance?
00:09:04.920 | So has it added much to the depth of the interaction?
00:09:09.000 | - I mean, mine's kind of an extenuating circumstance,
00:09:11.240 | 'cause she's supposed to look exactly like me.
00:09:13.560 | I mean, I spent six months getting my face molded,
00:09:16.000 | and having, you know, the idea was,
00:09:18.160 | I was exploring the concept of, can robots replace us?
00:09:21.240 | Because that's the big fear,
00:09:22.500 | but also the big dream in a lot of ways,
00:09:24.280 | and I wanted to dig into that area,
00:09:26.780 | because, you know, for a lot of people,
00:09:29.720 | it's like, they're gonna take our jobs,
00:09:30.840 | and they're gonna replace us, legitimate fear,
00:09:33.000 | but then a lot of women I know are like,
00:09:34.360 | I would love for a robot to replace me every now and then,
00:09:37.000 | so it can go to baby showers for me,
00:09:38.920 | and it can pick up my kids at school,
00:09:40.200 | and it can cook dinner, and whatever.
00:09:42.500 | So I just think that was an interesting place to explore,
00:09:45.240 | so her looking like me was a big part of it.
00:09:47.080 | Now her looking like me just adds
00:09:49.120 | an unnecessary level of insecurity,
00:09:51.400 | 'cause I got her a year ago,
00:09:53.560 | and she already looks younger than me,
00:09:54.720 | so that's a weird problem,
00:09:57.660 | but I think that her looking human was the idea,
00:10:00.700 | and I think that where we are now,
00:10:03.100 | please correct me if I'm wrong,
00:10:04.820 | a human robot resembling an actual human you know
00:10:09.820 | is going to feel more realistic than some generic face.
00:10:13.780 | - Well, you're saying that robots that have some familiarity
00:10:18.140 | like look similar to somebody that you actually know,
00:10:22.540 | you'll be able to form a deeper connection with?
00:10:24.580 | - That was the question? - I think so on some level.
00:10:26.020 | - That's an open question, I don't, you know,
00:10:28.820 | it's an interesting--
00:10:30.200 | - Or the opposite, 'cause then you know me,
00:10:32.100 | and you're like, well I know this isn't real,
00:10:33.380 | 'cause you're right here, so maybe it does the opposite.
00:10:36.260 | - We have a very keen eye for human faces,
00:10:39.260 | and they're able to detect strangeness,
00:10:41.860 | especially when it has to do with people
00:10:44.820 | whose faces we've seen a lot of,
00:10:46.420 | so I tend to be a bigger fan
00:10:48.900 | of moving away completely from faces.
00:10:52.820 | - Of recognizable faces?
00:10:54.220 | No, just human faces at all.
00:10:56.020 | - In general, 'cause I think that's where things get dicey,
00:10:58.340 | and one thing I will say is I think my robot
00:11:01.340 | is more realistic than other robots,
00:11:03.060 | not necessarily because you have seen me,
00:11:05.180 | and then you see her, and you go, oh, they're so similar,
00:11:07.620 | but also because human faces are flawed and asymmetrical,
00:11:11.100 | and sometimes we forget when we're making things
00:11:13.460 | that are supposed to look human,
00:11:14.400 | we make them too symmetrical,
00:11:16.020 | and that's what makes them stop looking human,
00:11:17.960 | so because they molded my asymmetrical face,
00:11:20.540 | she just, even if someone didn't know who I was,
00:11:22.860 | I think she'd look more realistic than most generic ones
00:11:26.580 | that didn't have some kind of flaws.
00:11:28.900 | - Got it, yeah.
00:11:29.740 | - 'Cause they start looking creepy
00:11:30.740 | when they're too symmetrical, 'cause human beings aren't.
00:11:33.260 | - Yeah, the flaws is what it means to be human,
00:11:35.740 | so visually as well, but I'm just a fan of the idea
00:11:39.420 | of letting humans use a little bit more imagination,
00:11:43.300 | so just hearing the voice is enough for us humans
00:11:47.580 | to then start imagining the visual appearance
00:11:50.300 | that goes along with that voice,
00:11:52.020 | and you don't necessarily need to work too hard
00:11:54.500 | on creating the actual visual appearance,
00:11:56.900 | so there's some value to that.
00:11:59.140 | When you step into this territory
00:12:00.620 | of actually building a robot that looks like Bear Claw,
00:12:04.300 | it's such a long road of facial expressions,
00:12:07.680 | of sort of making everything smiling, winking,
00:12:12.280 | rolling of the eyes, all that kind of stuff,
00:12:14.900 | it gets really, really tricky.
00:12:16.540 | - It gets tricky, and I think, again, I'm a comedian,
00:12:19.140 | like I'm obsessed with what makes us human,
00:12:21.820 | and our human nature, and the nasty side of human nature
00:12:25.460 | tends to be where I've ended up exploring
00:12:28.060 | over and over again, and I was just mostly fascinated
00:12:31.580 | by people's reactions, so it's my job
00:12:33.340 | to get the biggest reaction from a group of strangers,
00:12:35.640 | the loudest possible reaction, and I just had this instinct,
00:12:39.860 | just when I started building her, and people going,
00:12:42.180 | (gasps)
00:12:43.020 | and people scream, and I mean, I would bring her out
00:12:45.620 | on stage, and people would scream, and I just,
00:12:49.380 | to me, that was the next level of entertainment,
00:12:51.540 | getting a laugh, I've done that, I know how to do that,
00:12:53.540 | I think comedians were always trying to figure out
00:12:54.940 | what the next level is, and comedy's evolving so much,
00:12:57.280 | and Jordan Peele had just done these genius
00:13:00.540 | comedy horror movies, which feel like the next level
00:13:03.060 | of comedy to me, and this sort of funny horror
00:13:08.060 | of a robot was fascinating to me,
00:13:11.660 | but I think the thing that I got the most obsessed with
00:13:15.520 | was people being freaked out and scared of her,
00:13:18.220 | and I started digging around with pathogen avoidance,
00:13:21.660 | and the idea that we've essentially evolved
00:13:24.060 | to be repelled by anything that looks human,
00:13:27.140 | but is off a little bit, anything that could be sick,
00:13:30.340 | or diseased, or dead, essentially,
00:13:32.580 | is our reptilian brain's way to get us
00:13:34.740 | to not try to have sex with it, basically,
00:13:38.340 | you know, so I got really fascinated
00:13:39.940 | by how freaked out and scared, I mean,
00:13:42.140 | I would see grown men get upset,
00:13:44.380 | they're like, get that thing away from me,
00:13:45.220 | like, oh, like, people get angry,
00:13:47.900 | and it was like, you know what this is, you know,
00:13:50.860 | but the sort of like, you know,
00:13:53.340 | amygdala getting activated by something
00:13:55.980 | that to me is just a fun toy said a lot
00:13:59.340 | about our history as a species,
00:14:02.100 | and what got us into trouble thousands of years ago.
00:14:04.780 | - So it's that, it's the deep down stuff
00:14:07.340 | that's in our genetics, but also is it just,
00:14:10.100 | are people freaked out by the fact that there's a robot?
00:14:13.140 | It's not just the appearance,
00:14:14.860 | but that there's an artificial human.
00:14:17.860 | - Anything people I think, and I'm just also fascinated
00:14:21.260 | by the blind spots humans have,
00:14:23.060 | so the idea that you're afraid of that,
00:14:24.740 | I mean, how many robots have killed people?
00:14:27.220 | How many humans have died at the hands of other humans?
00:14:29.780 | - Yeah, a few more. - Millions?
00:14:31.380 | Hundreds of millions?
00:14:32.900 | Yet we're scared of that?
00:14:34.580 | And we'll go to the grocery store
00:14:36.060 | and be around a bunch of humans who statistically,
00:14:38.500 | the chances are much higher
00:14:39.500 | that you're gonna get killed by humans,
00:14:40.700 | so I'm just fascinated by, without judgment,
00:14:43.580 | how irrational we are as a species.
00:14:47.820 | - The worry is the exponential,
00:14:49.260 | so it's, you can say the same thing about nuclear weapons
00:14:52.700 | before we dropped them on Hiroshima and Nagasaki,
00:14:55.740 | so the worry that people have is the exponential growth.
00:14:59.420 | So it's like, oh, it's fun and games right now,
00:15:03.700 | but overnight, especially if a robot
00:15:08.700 | provides value to society, we'll put one in every home,
00:15:11.660 | and then all of a sudden,
00:15:12.780 | lose track of the actual large-scale impact
00:15:16.180 | it has on society, and then all of a sudden,
00:15:18.100 | gain greater and greater control
00:15:20.260 | to where we'll all be, affect our political system
00:15:23.900 | and then affect our decisions.
00:15:25.580 | - Didn't robots already ruin our political system?
00:15:27.700 | Didn't that just already happen?
00:15:28.700 | - Which ones?
00:15:29.540 | Oh, Russia hacking. - No offense.
00:15:32.540 | But hasn't that already happened?
00:15:34.980 | I mean, that was like an algorithm
00:15:36.460 | of negative things being clicked on more.
00:15:39.340 | - We'd like to tell stories
00:15:40.780 | and like to demonize certain people.
00:15:43.660 | I think nobody understands our current political system,
00:15:46.860 | our discourse on Twitter, the Twitter mobs.
00:15:49.700 | Nobody has a sense, not Twitter, not Facebook,
00:15:52.580 | the people running it,
00:15:53.420 | nobody understands the impact of these algorithms.
00:15:55.420 | They're trying their best.
00:15:56.900 | Despite what people think,
00:15:57.980 | they're not like a bunch of lefties
00:16:00.260 | trying to make sure that Hillary Clinton gets elected.
00:16:03.260 | It's more that it's an incredibly complex system
00:16:06.860 | that we don't, and that's the worry.
00:16:08.900 | It's so complex and moves so fast
00:16:11.460 | that nobody will be able to stop it once it happens.
00:16:15.780 | - And let me ask a question.
00:16:16.940 | This is a very savage question,
00:16:19.460 | which is, is this just the next stage of evolution?
00:16:23.860 | As humans, some people will die.
00:16:25.660 | Yes, I mean, that's always happened.
00:16:27.460 | This is just taking emotion out of it.
00:16:30.340 | Is this basically the next stage of survival of the fittest?
00:16:34.980 | - Yeah, you have to think of organisms.
00:16:37.780 | You know, what does it mean to be a living organism?
00:16:41.380 | Like, is a smartphone part of your living organism?
00:16:45.660 | - We're in relationships with our phones.
00:16:49.740 | - Yeah, but-- - We have sex through them,
00:16:52.460 | with them, what's the difference
00:16:53.420 | between with them and through them?
00:16:54.460 | - But it also expands your cognitive abilities,
00:16:57.140 | expands your memory, knowledge, and so on,
00:16:59.100 | so you're a much smarter person
00:17:00.660 | because you have a smartphone in your hand.
00:17:02.660 | - But as soon as it's out of my hand,
00:17:04.820 | we've got big problems,
00:17:06.140 | 'cause we become sort of so morphed with them.
00:17:08.420 | - Well, there's a symbiotic relationship,
00:17:09.980 | and that's what, so Elon Musk, the neural link,
00:17:12.580 | is working on trying to increase the bandwidth
00:17:16.700 | of communication between computers and your brain,
00:17:19.340 | and so further and further expand our ability
00:17:22.860 | as human beings to sort of leverage machines,
00:17:26.340 | and maybe that's the future, the evolution,
00:17:29.380 | next evolutionary step.
00:17:30.500 | It could be also that, yes, we'll give birth,
00:17:33.940 | just like we give birth to human children right now,
00:17:36.580 | we'll give birth to AI and they'll replace us.
00:17:38.980 | I think it's a really interesting possibility.
00:17:42.220 | - I'm gonna play devil's advocate.
00:17:44.140 | I just think that the fear of robots is wildly classist,
00:17:48.340 | because, I mean, Facebook, like, it's easy for us to say,
00:17:51.180 | they're taking their data.
00:17:52.020 | Okay, a lot of people that get employment off of Facebook,
00:17:55.740 | they are able to get income off of Facebook.
00:17:58.260 | They don't care if you take their phone numbers
00:17:59.860 | and their emails and their data, as long as it's free.
00:18:01.980 | They don't wanna have to pay $5 a month for Facebook.
00:18:03.980 | Facebook is a wildly democratic thing.
00:18:05.860 | Forget about the election and all that kind of stuff.
00:18:08.300 | A lot of technology making people's lives easier,
00:18:12.540 | I find that most elite people are more scared
00:18:17.180 | than lower-income people, and women, for the most part.
00:18:21.260 | So the idea of something that's stronger than us
00:18:24.020 | and that might eventually kill us,
00:18:25.260 | women are used to that.
00:18:26.460 | Like, that's not, I see a lot of really rich men
00:18:29.940 | being like, "The robots are gonna kill us."
00:18:31.180 | We're like, "What's another thing that's gonna kill us?"
00:18:33.700 | I tend to see, like, "Oh, something can walk me
00:18:36.180 | to my car at night.
00:18:37.180 | Something can help me cook dinner."
00:18:38.780 | For people in underprivileged countries
00:18:42.940 | who can't afford eye surgery, like, in a robot,
00:18:45.260 | can we send a robot to underprivileged places
00:18:48.740 | to do surgery where they can't?
00:18:50.580 | I work with this organization called Operation Smile
00:18:53.460 | where they do cleft palate surgeries,
00:18:55.580 | and there's a lot of places that can't do
00:18:56.900 | a very simple surgery because they can't afford doctors
00:19:00.340 | and medical care and such.
00:19:01.380 | So I just see, and this can be completely naive
00:19:04.740 | and should be completely wrong,
00:19:05.740 | but I feel like a lot of people are going like,
00:19:08.620 | "The robots are gonna destroy us."
00:19:09.860 | Humans, we're destroying ourselves.
00:19:11.540 | We're self-destructing.
00:19:12.740 | Robots, to me, are the only hope to clean up
00:19:14.620 | all the messes that we've created.
00:19:15.860 | Even when we go try to clean up pollution in the ocean,
00:19:18.140 | we make it worse because of the oil that the tankers use.
00:19:21.620 | It's like, to me, robots are the only solution.
00:19:25.420 | Firefighters are heroes, but they're limited
00:19:27.900 | in how many times they can run into a fire.
00:19:30.180 | So there's just something interesting to me.
00:19:32.340 | I'm not hearing a lot of lower-income,
00:19:36.140 | more vulnerable populations talking about robots.
00:19:39.940 | - Maybe you can speak to it a little bit more.
00:19:42.020 | There's an idea, I think you've expressed it,
00:19:44.100 | I've heard actually a few female writers and roboticists
00:19:49.100 | have talked to express this idea that exactly you just said,
00:19:54.100 | which is, it just seems that being afraid
00:19:59.500 | of existential threats of artificial intelligence
00:20:03.100 | is a male issue.
00:20:06.260 | - Yeah.
00:20:07.100 | - And I wonder what that is.
00:20:09.580 | Because men in certain positions, like you said,
00:20:14.060 | it's also a classist issue.
00:20:15.620 | They haven't been humbled by life,
00:20:17.420 | and so you always look for the biggest problems
00:20:20.660 | to take on around you.
00:20:22.380 | - It's a champagne problem to be afraid of robots.
00:20:24.220 | Most people don't have health insurance,
00:20:26.460 | they're afraid they're not gonna be able to feed their kids,
00:20:28.220 | they can't afford a tutor for their kids.
00:20:30.020 | I mean, I just think of the way I grew up,
00:20:32.380 | and I had a mother who worked two jobs, had kids.
00:20:36.180 | We couldn't afford an SAT tutor.
00:20:37.820 | The idea of a robot coming in,
00:20:40.060 | being able to tutor your kids,
00:20:41.060 | being able to provide childcare for your kids,
00:20:43.540 | being able to come in with cameras for eyes
00:20:45.540 | and make sure surveillance.
00:20:48.340 | I'm very pro-surveillance,
00:20:49.820 | because I've had security problems,
00:20:52.300 | and I've been, we're generally in a little more danger
00:20:55.780 | than you guys are.
00:20:56.620 | So I think that robots are a little less scary to us,
00:20:58.620 | 'cause we can see them maybe as free assistance,
00:21:01.260 | help, and protection.
00:21:03.420 | And then there's sort of another element for me personally,
00:21:06.860 | which is maybe more of a female problem.
00:21:08.820 | I don't know, I'm just gonna make a generalization.
00:21:11.660 | Happy to be wrong.
00:21:13.100 | But the emotional sort of component of robots
00:21:18.100 | and what they can provide in terms of,
00:21:21.220 | I think there's a lot of people
00:21:23.740 | that don't have microphones
00:21:25.700 | that I just recently kind of stumbled upon
00:21:28.620 | in doing all my research on the sex robots
00:21:30.980 | for my standup special,
00:21:31.980 | which is there's a lot of very shy people
00:21:34.900 | that aren't good at dating.
00:21:35.900 | There's a lot of people who are scared of human beings,
00:21:37.940 | who have personality disorders,
00:21:40.580 | or grew up in alcoholic homes,
00:21:42.140 | or struggle with addiction, or whatever it is
00:21:44.380 | where a robot can solve an emotional problem.
00:21:47.020 | And so we're largely having this conversation
00:21:49.900 | about rich guys that are emotionally healthy
00:21:53.700 | and how scared of robots they are.
00:21:55.660 | We're forgetting about a huge part of the population
00:21:58.620 | who maybe isn't as charming and effervescent
00:22:01.660 | and solvent as people like you and Elon Musk,
00:22:05.500 | who these robots could solve very real problems
00:22:09.300 | in their life, emotional or financial.
00:22:11.300 | - Well, that's in general a really interesting idea
00:22:13.500 | that most people in the world don't have a voice.
00:22:16.260 | You've talked about it,
00:22:18.620 | even the people on Twitter
00:22:19.980 | who are driving the conversation.
00:22:22.820 | You said comments, people who leave comments
00:22:25.420 | represent a very tiny percent of the population.
00:22:28.260 | And they're the ones,
00:22:29.580 | we tend to think they speak for the population,
00:22:33.340 | but it's very possible on many topics they don't at all.
00:22:37.300 | And look, and I'm sure there's gotta be some kind of
00:22:40.660 | legal sort of structure in place
00:22:43.980 | for when the robots happen.
00:22:45.300 | You know way more about this than I do,
00:22:46.660 | but for me to just go, the robots are bad,
00:22:49.780 | that's a wild generalization
00:22:51.260 | that I feel like is really inhumane in some way.
00:22:54.460 | Just after the research I've done,
00:22:56.700 | you're gonna tell me that a man whose wife died suddenly
00:23:00.100 | and he feels guilty moving on with a human woman
00:23:02.980 | or can't get over the grief,
00:23:04.340 | he can't have a sex robot in his own house?
00:23:06.980 | Why not?
00:23:07.860 | Who cares?
00:23:08.820 | Why do you care?
00:23:09.980 | - Well, there's an interesting aspect of human nature.
00:23:12.700 | So, you know, we tend to as a civilization
00:23:16.820 | to create a group that's the other in all kinds of ways.
00:23:19.940 | - Right.
00:23:20.780 | - And so you work with animals too.
00:23:23.500 | You're especially sensitive to the suffering of animals.
00:23:26.540 | Let me kind of ask,
00:23:27.660 | what's your,
00:23:28.540 | do you think we'll abuse robots in the future?
00:23:34.260 | Do you think some of the darker aspects of human nature
00:23:36.620 | will come out?
00:23:37.980 | - I think some people will,
00:23:39.220 | but if we design them properly,
00:23:41.820 | the people that do it,
00:23:43.020 | we can put it on a record and they can,
00:23:46.060 | we can put them in jail.
00:23:46.940 | We can find sociopaths more easily.
00:23:48.900 | You know, like--
00:23:49.740 | - But why is that a sociopathic thing to harm a robot?
00:23:53.220 | - I think, look, I don't know enough
00:23:55.260 | about the consciousness and stuff as you do.
00:23:57.940 | I guess it would have to be when they're conscious,
00:23:59.900 | but it is, you know, the part of the brain
00:24:02.900 | that is responsible for compassion,
00:24:04.460 | the frontal lobe or whatever,
00:24:05.300 | like people that abuse animals also abuse humans
00:24:08.220 | and commit other kinds of crimes.
00:24:09.460 | Like that's, it's all the same part of the brain.
00:24:11.140 | No one abuses animals and then is like awesome
00:24:14.060 | to women and children and awesome to underprivileged,
00:24:17.420 | you know, minorities.
00:24:18.660 | Like it's all, so, you know,
00:24:20.540 | we've been working really hard to put a database together
00:24:23.060 | of all the people that have abused animals.
00:24:24.820 | So when they commit another crime,
00:24:25.980 | you go, okay, this is, you know, it's all the same stuff.
00:24:29.380 | And I think people probably think I'm nuts
00:24:32.420 | for the, a lot of the animal work I do,
00:24:34.820 | but because when animal abuse is present,
00:24:37.100 | another crime is always present,
00:24:38.900 | but the animal abuse is the most socially acceptable.
00:24:40.900 | You can kick a dog and there's nothing people can do,
00:24:43.940 | but then what they're doing behind closed doors,
00:24:46.620 | you can't see.
00:24:47.440 | So there's always something else going on,
00:24:48.900 | which is why I never feel compunction about it.
00:24:50.720 | But I do think we'll start seeing the same thing
00:24:52.420 | with robots.
00:24:53.380 | The person that kicks the,
00:24:55.540 | I felt compassion when the kicking the dog robot
00:24:59.760 | really pissed me off.
00:25:00.860 | I know that they're just trying to get the stability right
00:25:04.140 | and all that,
00:25:05.220 | but I do think there will come a time
00:25:07.380 | where that will be a great way to be able to figure out
00:25:10.740 | if somebody has like, you know, anti-social behaviors.
00:25:15.500 | - You kind of mentioned surveillance.
00:25:18.100 | It's also a really interesting idea of yours
00:25:20.020 | that you just said, you know,
00:25:21.540 | a lot of people seem to be really uncomfortable
00:25:23.420 | with surveillance.
00:25:24.300 | - Yeah.
00:25:25.180 | - And you just said that, you know what, for me,
00:25:28.540 | you know, there's positives for surveillance.
00:25:31.220 | - I think people behave better
00:25:32.200 | when they know they're being watched.
00:25:33.300 | And I know this is a very unpopular opinion.
00:25:36.020 | I'm talking about it on stage right now.
00:25:38.140 | We behave better when we know we're being watched.
00:25:40.380 | You and I had a very different conversation
00:25:41.980 | before we were recording.
00:25:43.220 | (laughing)
00:25:44.580 | We behave different.
00:25:45.460 | You sit up and you are in your best behavior
00:25:47.540 | and I'm trying to sound eloquent
00:25:49.380 | and I'm trying to not hurt anyone's feelings.
00:25:51.180 | And I mean, I have a camera right there.
00:25:52.900 | I'm behaving totally different
00:25:54.680 | than when we first started talking, you know?
00:25:56.980 | When you know there's a camera, you behave differently.
00:25:59.420 | I mean, there's cameras all over LA at stoplights
00:26:02.740 | so that people don't run stoplights,
00:26:04.060 | but there's not even film in it.
00:26:05.860 | They don't even use them anymore, but it works.
00:26:08.020 | - It works.
00:26:08.860 | - Right?
00:26:09.700 | And I'm, you know, working on this thing
00:26:10.980 | in stand-up about surveillance.
00:26:12.000 | It's like, that's why we invented Santa Claus.
00:26:14.260 | You know, Santa Claus is the first surveillance, basically.
00:26:17.820 | All we had to say to kids is he's making a list
00:26:20.420 | and he's watching you and they behave better.
00:26:22.940 | - That's brilliant.
00:26:23.780 | - You know, so I do think that there are benefits
00:26:26.140 | to surveillance.
00:26:27.420 | You know, I think we all do sketchy things in private
00:26:30.940 | and we all have watched weird porn or Googled weird things
00:26:34.500 | and we don't want people to know about it,
00:26:37.060 | our secret lives.
00:26:37.940 | So I do think that obviously,
00:26:40.220 | we should be able to have a modicum of privacy,
00:26:42.860 | but I tend to think that people that are the most negative
00:26:46.660 | about surveillance have the most secrets.
00:26:48.300 | - The most to hide.
00:26:49.140 | (laughing)
00:26:50.540 | Well, you should, you're saying you're doing bits on it now?
00:26:54.580 | - Well, I'm just talking in general about, you know,
00:26:57.020 | privacy and surveillance and how paranoid
00:26:59.340 | we're kind of becoming and how, you know,
00:27:02.500 | I mean, it's just wild to me that people are like,
00:27:04.940 | our emails are gonna leak
00:27:05.940 | and they're taking our phone numbers.
00:27:07.220 | Like there used to be a book full of phone numbers
00:27:11.460 | and addresses that were, they just throw it at your door.
00:27:15.580 | And we all had a book of everyone's numbers, you know,
00:27:18.420 | this is a very new thing.
00:27:20.380 | And, you know, I know our amygdala is designed
00:27:22.380 | to compound sort of threats and, you know,
00:27:25.380 | there's stories about, and I think we all just glom on
00:27:29.300 | in a very, you know, tribal way of,
00:27:31.420 | yeah, they're taking our data.
00:27:32.420 | Like, we don't even know what that means,
00:27:33.740 | but we're like, well, yeah, they, they, you know?
00:27:37.100 | So I just think that sometimes it's like, okay, well,
00:27:39.740 | so what, they're gonna sell your data?
00:27:41.340 | Who cares?
00:27:42.180 | Why do you care?
00:27:43.220 | - First of all, that bit will kill in China.
00:27:46.240 | So, and I say this sort of only a little bit joking
00:27:51.100 | because a lot of people in China, including the citizens,
00:27:55.220 | despite what people in the West think of as abuse,
00:27:59.660 | are actually in support of the idea of surveillance.
00:28:02.580 | Sort of, they're not in support of the abuse of surveillance
00:28:06.500 | but they're, they like, I mean, the idea of surveillance
00:28:09.420 | is kind of like the idea of government.
00:28:13.540 | Like you said, we behave differently.
00:28:15.940 | And in a way, it's almost like why we like sports.
00:28:18.520 | There's rules and within the constraints of the rules,
00:28:22.380 | this is a more stable society.
00:28:25.040 | And they make good arguments about success,
00:28:28.140 | being able to build successful companies,
00:28:30.460 | being able to build successful social lives
00:28:32.780 | around a fabric that's more stable.
00:28:34.540 | When you have a surveillance, it keeps the criminals away,
00:28:37.060 | keeps abusive animals, whatever the values of the society,
00:28:41.860 | with surveillance, you can enforce those values better.
00:28:44.780 | - And here's what I will say.
00:28:45.900 | There's a lot of unethical things
00:28:47.300 | happening with surveillance.
00:28:48.580 | Like I feel the need to really make that very clear.
00:28:52.100 | I mean, the fact that Google is like collecting
00:28:54.100 | if people's hands start moving on the mouse
00:28:55.980 | to find out if they're getting Parkinson's
00:28:58.460 | and then their insurance goes up,
00:29:00.080 | like that is completely unethical and wrong.
00:29:02.180 | And I think stuff like that,
00:29:03.340 | we have to really be careful around.
00:29:05.860 | So the idea of using our data to raise our insurance rates
00:29:08.660 | or, you know, I heard that they're looking,
00:29:10.820 | they can sort of predict if you're gonna have depression
00:29:13.300 | based on your selfies by detecting micro muscles
00:29:16.100 | in your face, you know, all that kind of stuff.
00:29:18.260 | That is a nightmare, not okay.
00:29:20.020 | But I think, you know, we have to delineate
00:29:22.380 | what's a real threat and what's getting spam
00:29:25.180 | in your email box, that's not what to spend
00:29:27.420 | your time and energy on.
00:29:28.620 | Focus on the fact that every time you buy cigarettes,
00:29:31.100 | your insurance is going up without you knowing about it.
00:29:35.260 | - On the topic of animals too,
00:29:36.980 | can we just linger on it a little bit?
00:29:38.380 | Like, what do you think,
00:29:40.340 | what does it say about our society,
00:29:43.400 | of the society wide abuse of animals
00:29:45.700 | that we see in general?
00:29:47.220 | Sort of factory farming, just in general,
00:29:49.420 | just the way we treat animals of different categories.
00:29:53.780 | Like, what do you think of that?
00:29:56.980 | What does a better world look like?
00:29:59.820 | What should people think about it in general?
00:30:03.660 | - I think the most interesting thing
00:30:06.460 | I can probably say around this,
00:30:07.860 | that's the least emotional,
00:30:09.500 | 'cause I'm actually a very non-emotional animal person
00:30:11.860 | because it's, I think everyone's an animal person.
00:30:14.060 | It's just a matter of if it's yours
00:30:15.880 | or if you've, you know, been conditioned to go numb.
00:30:18.480 | You know, I think it's really a testament
00:30:20.820 | to what as a species we are able to be in denial about,
00:30:24.580 | mass denial and mass delusion,
00:30:26.300 | and how we're able to dehumanize and debase groups,
00:30:30.620 | you know, World War II,
00:30:33.420 | in a way in order to conform
00:30:36.780 | and find protection in the conforming.
00:30:38.860 | So we are also a species who used to go to coliseums
00:30:43.860 | and watch elephants and tigers fight to the death.
00:30:47.540 | We used to watch human beings be pulled apart
00:30:50.300 | in the, that wasn't that long ago.
00:30:53.060 | We're also a species who had slaves,
00:30:56.860 | and it was socially acceptable by a lot of people.
00:30:59.020 | People didn't see anything wrong with it.
00:31:00.160 | So we're a species that is able to go numb
00:31:02.660 | and that is able to dehumanize very quickly
00:31:05.940 | and make it the norm.
00:31:08.120 | Child labor wasn't that long ago.
00:31:10.420 | Like the idea that now we look back and go,
00:31:12.620 | "Oh yeah, kids were losing fingers in factories,
00:31:15.900 | "making shoes."
00:31:17.180 | Like someone had to come in and make that, you know,
00:31:20.160 | so I think it just says a lot about the fact that,
00:31:23.180 | you know, we are animals and we are self-serving
00:31:25.300 | and one of the most successful,
00:31:27.260 | the most successful species
00:31:29.180 | because we are able to debase and degrade
00:31:33.140 | and essentially exploit anything that benefits us.
00:31:36.840 | I think the pendulum's gonna swing, as it has been lately.
00:31:39.980 | - Which way?
00:31:40.820 | - Like I think we're Rome now, kind of.
00:31:42.580 | Like I think we're on the verge of collapse
00:31:44.980 | because we are dopamine receptors.
00:31:47.240 | Like we are just, I think we're all kind of addicts
00:31:49.560 | when it comes to this stuff.
00:31:50.520 | Like we don't know when to stop.
00:31:53.380 | It's always the buffet.
00:31:54.480 | Like we're, the thing that used to keep us alive,
00:31:56.600 | which is killing animals and eating them,
00:31:58.380 | now killing animals and eating them
00:31:59.740 | is what's killing us in a way.
00:32:01.220 | So it's like we just can't, we don't know when to call it
00:32:04.220 | and we don't, moderation is not really something
00:32:06.560 | that humans have evolved to have yet.
00:32:10.040 | So I think it's really just a flaw in our wiring.
00:32:13.620 | - Do you think we'll look back at this time
00:32:15.240 | as our society's being deeply unethical?
00:32:19.380 | - Yeah, yeah.
00:32:20.520 | I think we'll be embarrassed.
00:32:22.240 | - Which are the worst parts right now going on?
00:32:24.860 | Is it-- - In terms of animal?
00:32:26.120 | Well, I think-- - No, in terms of anything.
00:32:27.780 | What's the unethical thing?
00:32:29.800 | It's very hard to take a step out of it,
00:32:32.020 | but you just said we used to watch, you know,
00:32:36.180 | there's been a lot of cruelty throughout history.
00:32:40.400 | What's the cruelty going on now?
00:32:42.080 | - I think it's gonna be pigs.
00:32:44.180 | I think it's gonna be, I mean,
00:32:45.480 | pigs are one of the most emotionally intelligent animals.
00:32:48.660 | And they have the intelligence of like a three-year-old.
00:32:51.640 | And I think we'll look back and be really,
00:32:54.280 | they use tools.
00:32:55.120 | I mean, I think we have this narrative
00:32:58.400 | that they're pigs and they're pigs
00:32:59.720 | and they're disgusting and they're dirty
00:33:01.840 | and their bacon is so good.
00:33:02.840 | I think that we'll look back one day
00:33:04.200 | and be really embarrassed about that.
00:33:06.620 | - Is this for just, what's it called, the factory farming?
00:33:10.300 | So basically mass-- - 'Cause we don't see it.
00:33:12.520 | If you saw, I mean, we do have,
00:33:14.520 | I mean, this is probably an evolutionary advantage.
00:33:17.560 | We do have the ability to completely
00:33:20.360 | pretend something's not,
00:33:21.480 | something that is so horrific that it overwhelms us.
00:33:24.000 | And we're able to essentially deny that it's happening.
00:33:27.520 | I think if people were to see
00:33:28.640 | what goes on in factory farming,
00:33:30.440 | and also were really to take in how bad it is for us,
00:33:35.320 | you know, we're hurting ourselves first and foremost
00:33:37.120 | with what we eat.
00:33:38.400 | But that's also a very elitist argument, you know?
00:33:41.240 | It's a luxury to be able to complain about meat.
00:33:44.580 | It's a luxury to be able to not eat meat.
00:33:46.640 | You know, there's very few people because of, you know,
00:33:49.960 | how the corporations have set up meat being cheap.
00:33:53.320 | You know, it's $2 to buy a Big Mac,
00:33:55.280 | it's $10 to buy a healthy meal.
00:33:57.620 | You know, that's, I think a lot of people
00:34:00.000 | don't have the luxury to even think that way.
00:34:02.260 | But I do think that animals in captivity,
00:34:04.200 | I think we're gonna look back
00:34:05.040 | and be pretty grossed out about.
00:34:06.480 | Mammals in captivity, whales, dolphins.
00:34:08.760 | I mean, that's already starting to dismantle.
00:34:10.160 | Circuses, we're gonna be pretty embarrassed about.
00:34:13.980 | But I think it's really more a testament
00:34:15.720 | to, you know, there's just such a ability to go like,
00:34:20.720 | that thing is different than me and we're better.
00:34:25.520 | It's the ego.
00:34:26.360 | I mean, it's just, we have the species
00:34:27.560 | with the biggest ego, ultimately.
00:34:29.160 | - Well, that's what I think, that's my hope for robots
00:34:31.840 | is they'll, you mentioned consciousness before.
00:34:34.200 | Nobody knows what consciousness is.
00:34:37.600 | But I'm hoping robots will help us empathize
00:34:42.200 | and understand that there's other creatures
00:34:47.200 | out besides ourselves that can suffer,
00:34:50.360 | that can experience the world
00:34:54.840 | and that we can torture by our actions.
00:34:57.720 | And robots can explicitly teach us that,
00:34:59.920 | I think, better than animals can.
00:35:01.520 | - I have never seen such compassion
00:35:06.520 | from a lot of people in my life
00:35:11.080 | toward any human, animal, child,
00:35:13.880 | as I have a lot of people
00:35:15.240 | in the way they interact with the robot.
00:35:16.840 | 'Cause I think there's-- - Compassion, for sure.
00:35:18.480 | - I think there's something of,
00:35:20.000 | I mean, I was on the robot owners chat boards
00:35:23.760 | for a good eight months.
00:35:26.160 | And the main emotional benefit is
00:35:28.360 | she's never gonna cheat on you.
00:35:30.600 | She's never gonna hurt you.
00:35:32.160 | She's never gonna lie to you.
00:35:33.360 | She doesn't judge you.
00:35:35.020 | You know, I think that robots help people,
00:35:38.740 | and this is part of the work I do with animals.
00:35:40.880 | Like I do equine therapy and train dogs and stuff
00:35:43.040 | because there is this safe space to be authentic.
00:35:46.280 | You're with this being that doesn't care
00:35:48.240 | what you do for a living,
00:35:49.080 | doesn't care how much money you have,
00:35:50.400 | doesn't care who you're dating,
00:35:51.540 | doesn't care what you look like,
00:35:52.440 | doesn't care if you have cellulite, whatever.
00:35:54.560 | You feel safe to be able to truly be present
00:35:57.960 | without being defensive and worrying about eye contact
00:36:00.160 | and being triggered by needing to be perfect
00:36:02.920 | and fear of judgment and all that.
00:36:04.840 | And robots really can't judge you yet,
00:36:07.320 | but they can't judge you.
00:36:09.320 | But I think it really puts people at ease
00:36:13.480 | and at their most authentic.
00:36:15.320 | - Do you think you can have a deep connection
00:36:18.720 | with a robot that's not judging?
00:36:22.040 | Do you think you can really have a relationship
00:36:25.440 | with a robot or a human being that's a safe space?
00:36:30.000 | Or is attention, mystery, danger necessary
00:36:34.320 | for a deep connection?
00:36:36.000 | - I'm gonna speak for myself and say that
00:36:38.640 | I grew up in an alcoholic home.
00:36:40.120 | I identify as a codependent,
00:36:41.440 | talked about this stuff before,
00:36:43.280 | but for me, it's very hard to be in a relationship
00:36:45.360 | with a human being without feeling like
00:36:47.600 | I need to perform in some way or deliver in some way.
00:36:50.760 | And I don't know if that's just the people
00:36:51.920 | I've been in a relationship with or me or my brokenness,
00:36:56.520 | but I do think this is gonna sound really negative
00:37:01.320 | and pessimistic, but I do think a lot of our relationships
00:37:06.400 | are a projection and a lot of our relationships
00:37:08.360 | are performance, and I don't think I really understood that
00:37:12.280 | until I worked with horses.
00:37:15.280 | And most communications with human is nonverbal, right?
00:37:18.080 | I can say, "I love you," but you don't think I love you.
00:37:22.000 | Whereas with animals, it's very direct.
00:37:24.280 | It's all physical, it's all energy.
00:37:26.840 | I feel like that with robots, too.
00:37:28.520 | It feels very...
00:37:29.960 | How I say something doesn't matter.
00:37:35.280 | My inflection doesn't really matter.
00:37:36.920 | And you thinking that my tone is disrespectful,
00:37:40.280 | like you're not filtering it through all
00:37:42.160 | of the bad relationships you've been in.
00:37:43.800 | You're not filtering it through
00:37:44.840 | the way your mom talked to you.
00:37:45.880 | You're not getting triggered.
00:37:47.760 | I find that for the most part,
00:37:49.400 | people don't always receive things
00:37:51.000 | the way that you intend them to or the way intended,
00:37:53.640 | and that makes relationships really murky.
00:37:56.120 | - So the relationships with animals
00:37:57.440 | and relationship with the robots as they are now,
00:38:00.680 | you kind of implied that that's more healthy.
00:38:05.240 | Can you have a healthy relationship with other humans?
00:38:08.080 | Or not healthy, I don't like that word,
00:38:10.120 | but shouldn't it be, you've talked about codependency.
00:38:14.440 | Maybe you can talk about what is codependency,
00:38:16.640 | but is the challenges of that,
00:38:21.640 | the complexity of that necessary for passion,
00:38:24.640 | for love between humans?
00:38:27.360 | - That's right, you love passion.
00:38:29.560 | (laughing)
00:38:31.240 | That's a good thing.
00:38:32.080 | - I thought this would be a safe space.
00:38:34.000 | (laughing)
00:38:36.240 | I got trolled by Rogan for hours on this.
00:38:40.080 | - Look, I am not anti-passion.
00:38:42.680 | I think that I've just maybe been around long enough
00:38:45.360 | to know that sometimes it's ephemeral
00:38:48.320 | and that passion is a mixture
00:38:53.040 | of a lot of different things.
00:38:55.440 | Adrenaline, which turns into dopamine, cortisol.
00:38:57.720 | It's a lot of neurochemicals.
00:38:59.200 | It's a lot of projection.
00:39:01.240 | It's a lot of what we've seen in movies.
00:39:03.280 | It's a lot of, I identify as an addict.
00:39:06.240 | So for me, sometimes passion is like,
00:39:08.640 | uh-oh, this could be bad.
00:39:10.200 | And I think we've been so conditioned to believe
00:39:11.600 | that passion means your soulmates.
00:39:13.160 | And I mean, how many times have you had
00:39:14.400 | a passionate connection with someone
00:39:15.680 | and then it was a total train wreck?
00:39:17.840 | - The train wreck is interesting.
00:39:19.840 | - How many times exactly?
00:39:21.160 | - Exactly.
00:39:22.000 | What's a train wreck?
00:39:22.840 | - You just did a lot of math in your head
00:39:24.480 | in that little moment.
00:39:25.520 | - Counting.
00:39:26.560 | I mean, what's a train wreck?
00:39:28.640 | What's a, why is obsession,
00:39:31.560 | so you describe this codependency
00:39:33.680 | and sort of the idea of attachment,
00:39:37.480 | over attachment to people who don't deserve
00:39:40.360 | that kind of attachment as somehow a bad thing.
00:39:44.600 | I think our society says it's a bad thing.
00:39:47.800 | It probably is a bad thing.
00:39:49.680 | Like a delicious burger is a bad thing.
00:39:52.640 | I don't know.
00:39:53.480 | - Right, oh, that's a good point.
00:39:54.360 | I think that you're pointing out something
00:39:55.640 | really fascinating, which is like passion,
00:39:57.280 | if you go into it knowing this is like pizza
00:40:00.280 | where it's gonna be delicious for two hours
00:40:01.920 | and then I don't have to have it again for three.
00:40:03.480 | If you can have a choice in the passion,
00:40:06.440 | I define passion as something
00:40:07.600 | that is relatively unmanageable
00:40:09.600 | and something you can't control
00:40:10.880 | or stop and start with your own volition.
00:40:13.760 | So maybe we're operating under different definitions.
00:40:16.360 | If passion is something that like,
00:40:17.960 | you know, ruins your marriages
00:40:21.000 | and screws up your professional life
00:40:23.240 | and becomes this thing that you're not in control of
00:40:27.360 | and becomes addictive, I think that's the difference
00:40:29.920 | is is it a choice or is it not a choice?
00:40:32.640 | And if it is a choice, then passion's great.
00:40:35.160 | But if it's something that like consumes you
00:40:37.400 | and makes you start making bad decisions
00:40:39.400 | and clouds your frontal lobe
00:40:41.200 | and is just all about dopamine
00:40:44.080 | and not really about the person
00:40:46.240 | and more about the neurochemical,
00:40:47.840 | we call it sort of the drug, the internal drug cabinet.
00:40:50.800 | If it's all just you're on drugs, that's different,
00:40:53.000 | you know, 'cause sometimes you're just on drugs.
00:40:55.040 | - Okay, so there's a philosophical question here.
00:40:58.520 | So would you rather, and it's interesting for a comedian,
00:41:03.440 | brilliant comedian to speak so eloquently
00:41:07.640 | about a balanced life.
00:41:09.560 | I kind of argue against this point.
00:41:12.100 | There's such an obsession
00:41:13.080 | of creating this healthy lifestyle now,
00:41:15.580 | psychologically speaking.
00:41:18.160 | You know, I'm a fan of the idea
00:41:19.760 | that you sort of fly high and you crash and die at 27
00:41:24.760 | as also a possible life,
00:41:26.520 | and it's not one we should judge
00:41:28.000 | because I think there's moments of greatness.
00:41:30.680 | I've talked to Olympic athletes
00:41:32.160 | where some of their greatest moments
00:41:34.320 | are achieved in their early 20s,
00:41:36.600 | and the rest of their life is in a kind of fog
00:41:39.880 | of almost a depression because they can never--
00:41:42.320 | - Because it was based on their physical prowess, right?
00:41:44.280 | - Physical prowess, and they'll never,
00:41:46.440 | so that, so they're watching their physical prowess fade,
00:41:50.260 | and they'll never achieve the kind of height,
00:41:54.720 | not just physical, of just emotion,
00:41:57.220 | of-- - Well, the max number
00:41:59.720 | of neurochemicals, and you also put your money
00:42:03.200 | on the wrong horse.
00:42:04.760 | That's where I would just go like,
00:42:06.520 | oh, yeah, if you're doing a job where you peak at 22,
00:42:10.260 | the rest of your life is gonna be hard.
00:42:12.440 | - That idea is considering the notion
00:42:15.280 | that you wanna optimize some kind of,
00:42:17.620 | but we're all gonna die soon.
00:42:19.440 | - What?
00:42:20.280 | (laughing)
00:42:22.000 | Now you tell me.
00:42:23.400 | I've immortalized myself, so I'm gonna be fine.
00:42:26.920 | - See, you're almost like, how many Oscar-winning movies
00:42:30.440 | can I direct by the time I'm 100?
00:42:34.160 | How many this and that?
00:42:35.880 | But there's, life is short, relatively speaking.
00:42:40.880 | - I know, but it can also come in different,
00:42:42.680 | you go, life is short, play hard,
00:42:45.200 | fall in love as much as you can, run into walls.
00:42:47.720 | I would also go, life is short, don't deplete yourself
00:42:51.360 | on things that aren't sustainable and that you can't keep.
00:42:55.800 | - Yeah.
00:42:56.640 | - So I think everyone gets dopamine from different places,
00:42:59.840 | everyone has meaning from different places.
00:43:01.840 | I look at the fleeting, passionate relationships
00:43:04.640 | I've had in the past, and I don't have pride in them.
00:43:08.000 | I think that you have to decide
00:43:09.120 | what helps you sleep at night.
00:43:11.240 | For me, it's pride and feeling like I behave
00:43:13.600 | with grace and integrity, that's just me personally.
00:43:16.160 | Everyone can go like, yeah, I slept with all the hot chicks
00:43:19.760 | in Italy, I could, and I did all the whatever,
00:43:23.560 | like whatever you value.
00:43:25.160 | We're allowed to value different things.
00:43:26.720 | - We're talking about Brian Callan.
00:43:28.080 | (laughing)
00:43:29.800 | - Brian Callan has lived his life to the fullest,
00:43:32.640 | to say the least, but I think that it's just
00:43:34.600 | for me personally, I, and this could be like my workaholism
00:43:38.920 | or my achievementism, if I don't have something
00:43:43.680 | to show for something, I feel like it's a waste of time
00:43:47.960 | or some kind of loss.
00:43:50.360 | I'm in a 12-step program, and the third step would say,
00:43:52.680 | there's no such thing as waste of time
00:43:54.200 | and everything happens exactly as it should
00:43:56.880 | and whatever, that's a way to just sort of keep us sane
00:43:59.560 | so we don't grieve too much and beat ourselves up
00:44:01.880 | over past mistakes, there's no such thing as mistakes,
00:44:04.720 | da-da-da, but I think passion is,
00:44:08.400 | I think it's so life-affirming and one of the few things
00:44:11.760 | that, maybe for people like us, makes us feel awake and seen
00:44:14.920 | and we just have such a high threshold for adrenaline.
00:44:19.920 | You know, I mean, you are a fighter, right?
00:44:22.800 | Yeah, okay, so yeah, so you have a very high tolerance
00:44:27.520 | for adrenaline, and I think that Olympic athletes,
00:44:30.440 | the amount of adrenaline they get from performing,
00:44:33.680 | it's very hard to follow that, it's like when guys come back
00:44:35.720 | from the military and they have depression,
00:44:38.160 | it's like, do you miss bullets flying at you?
00:44:40.720 | Yeah, kind of, because of that adrenaline
00:44:42.880 | which turned into dopamine and the camaraderie,
00:44:45.160 | I mean, there's people that speak much better
00:44:46.600 | about this than I do, but I just, I'm obsessed
00:44:50.120 | with neurology and I'm just obsessed with sort of
00:44:52.240 | the lies we tell ourselves in order
00:44:54.480 | to justify getting neurochemicals.
00:44:57.120 | - You've actually done a lot of thinking
00:45:00.400 | and talking about neurology, kind of looking
00:45:03.480 | at human behavior through the lens
00:45:06.960 | of how, chemically, our brain works.
00:45:09.160 | So what, first of all, why did you connect with that idea
00:45:13.960 | and what have you, how has your view of the world changed
00:45:17.600 | by considering the brain is just a machine?
00:45:22.480 | - You know, I know it probably sounds really nihilistic,
00:45:24.640 | but for me it's very liberating to know a lot
00:45:27.600 | about neurochemicals because you don't have to,
00:45:30.120 | it's like the same thing with like critics,
00:45:32.560 | like critical reviews, if you believe the good,
00:45:34.840 | you have to believe the bad kind of thing.
00:45:36.200 | Like, you know, if you believe that your bad choices
00:45:39.000 | were because of your moral integrity or whatever,
00:45:43.840 | you have to believe your good ones,
00:45:44.840 | I just think there's something really liberating
00:45:46.400 | and going like, oh, that was just adrenaline,
00:45:48.160 | I just said that thing 'cause I was adrenalized
00:45:49.800 | and I was scared and my amygdala was activated
00:45:52.160 | and that's why I said you're an asshole and get out.
00:45:54.480 | And that's, you know, I just think it's important
00:45:56.800 | to delineate what's nature and what's nurture,
00:45:58.880 | what is your choice and what is just your brain
00:46:01.120 | trying to keep you safe.
00:46:02.120 | I think we forget that even though we have security systems
00:46:04.640 | and homes and locks on our doors, that our brain,
00:46:07.000 | for the most part, is just trying to keep us safe
00:46:08.800 | all the time, it's why we hold grudges,
00:46:10.440 | it's why we get angry, it's why we get road rage,
00:46:13.000 | it's why we do a lot of things.
00:46:14.840 | And it's also, when I started learning about neurology,
00:46:17.360 | I started having so much more compassion for other people.
00:46:19.760 | You know, if someone yelled at me,
00:46:21.280 | being like, fuck you, on the road,
00:46:22.640 | I'd be like, okay, he's producing adrenaline right now
00:46:24.680 | because we're all going 65 miles an hour
00:46:27.760 | and our brains aren't really designed
00:46:30.400 | for this type of stress and he's scared.
00:46:33.360 | He was scared, you know, so that really helped me
00:46:35.400 | to have more love for people in my everyday life
00:46:38.720 | instead of being in fight or flight mode.
00:46:41.080 | But the, I think, more interesting answer to your question
00:46:44.240 | is that I've had migraines my whole life,
00:46:45.800 | like I've suffered with really intense migraines,
00:46:49.120 | ocular migraines, ones where my arm would go numb,
00:46:52.440 | and I just started having to go to so many doctors
00:46:55.080 | to learn about it, and I started, you know,
00:46:58.400 | learning that we don't really know that much.
00:47:00.720 | We know a lot, but it's wild to go into one
00:47:03.680 | of the best neurologists in the world
00:47:04.920 | who's like, yeah, we don't know.
00:47:05.960 | - We don't know. - We don't know.
00:47:07.400 | And that fascinated me.
00:47:08.720 | - It's like one of the worst pains you can probably have,
00:47:10.760 | all that stuff, and we don't know the source.
00:47:13.360 | - We don't know the source, and there is something
00:47:15.520 | really fascinating about when your left arm
00:47:18.640 | starts going numb, and you start not being able to see
00:47:21.080 | out of the left side of both your eyes,
00:47:22.880 | and I remember when the migraines get really bad,
00:47:25.400 | it's like a mini stroke almost,
00:47:26.920 | and you're able to see words on a page,
00:47:29.920 | but I can't read them.
00:47:31.280 | They just look like symbols to me.
00:47:33.120 | So there's something just really fascinating to me
00:47:35.040 | about your brain just being able to stop functioning,
00:47:38.320 | and so I just wanted to learn about it, study about it.
00:47:41.680 | I did all these weird alternative treatments.
00:47:43.400 | I got this piercing in here that actually works.
00:47:45.920 | I've tried everything, and then both my parents
00:47:48.160 | had strokes.
00:47:49.240 | So when both of my parents had strokes,
00:47:51.080 | I became sort of the person who had to decide
00:47:54.200 | what was gonna happen with their recovery,
00:47:56.720 | which is just a wild thing to have to deal with
00:47:59.160 | at, you know, 28 years old, when it happened,
00:48:02.160 | and I started spending basically all day every day
00:48:05.120 | in ICUs with neurologists learning about what happened
00:48:08.240 | in my dad's brain, and why he can't move his left arm,
00:48:11.200 | but he can move his right leg, but he can't see out of the,
00:48:13.600 | you know, and then my mom had another stroke
00:48:16.000 | in a different part of the brain.
00:48:18.120 | So I started having to learn what parts of the brain
00:48:20.560 | did what, and so that I wouldn't take their behavior
00:48:23.200 | so personally, and so that I would be able to manage
00:48:25.160 | my expectations in terms of their recovery.
00:48:27.440 | So my mom, because it affected a lot of her frontal lobe,
00:48:31.280 | changed a lot as a person.
00:48:33.120 | She was way more emotional, she was way more micromanaged,
00:48:35.800 | she was forgetting certain things,
00:48:37.000 | so it broke my heart less when I was able to know,
00:48:40.320 | oh yeah, well, the stroke hit this part of the brain,
00:48:42.080 | and that's the one that's responsible for short-term memory
00:48:44.240 | and that's responsible for long-term memory, da-da-da,
00:48:46.880 | and then my brother just got something
00:48:48.720 | called viral encephalitis,
00:48:50.640 | which is an infection inside the brain.
00:48:53.280 | So it was kind of wild that I was able to go,
00:48:56.320 | oh, I know exactly what's happening here, and I know,
00:48:58.240 | you know, so.
00:48:59.800 | - So that allows you to have some more compassion
00:49:02.480 | for the struggles that people have,
00:49:04.480 | but does it take away some of the magic
00:49:06.640 | from some of the more positive experiences
00:49:10.680 | of life? - Sometimes.
00:49:12.000 | Sometimes, and I don't, I'm such a control addict
00:49:15.360 | that, you know, I think our biggest,
00:49:18.120 | someone like me, my biggest dream
00:49:19.920 | is to know why someone's doing what they're doing, that's what stand-up is.
00:49:22.240 | It's just trying to figure out why,
00:49:23.440 | or that's what writing is, that's what acting is,
00:49:25.040 | that's what performing is, it's trying to figure out
00:49:26.400 | why someone would do something.
00:49:27.440 | As an actor, you get a piece of, you know, material,
00:49:30.040 | and you go, this person, why would he say that?
00:49:32.120 | Why would she pick up that cup?
00:49:33.760 | Why would she walk over here?
00:49:35.040 | It's really why, why, why, why.
00:49:36.560 | So I think neurology is, if you're trying to figure out
00:49:39.600 | human motives and why people do what they do,
00:49:41.520 | it'd be crazy not to understand
00:49:44.000 | how neurochemicals motivate us.
00:49:46.080 | I also have a lot of addiction in my family,
00:49:48.080 | and hardcore drug addiction and mental illness,
00:49:51.480 | and in order to cope with it,
00:49:53.720 | you really have to understand it,
00:49:54.760 | borderline personality disorder,
00:49:56.000 | schizophrenia, and drug addiction.
00:49:58.340 | So I have a lot of people I love
00:50:00.640 | that suffer from drug addiction and alcoholism,
00:50:02.840 | and the first thing they started teaching you
00:50:04.760 | is it's not a choice.
00:50:05.880 | These people's dopamine receptors
00:50:07.360 | don't hold dopamine the same ways yours do.
00:50:09.640 | Their frontal lobe is underdeveloped.
00:50:11.600 | Like, you know, and that really helped me
00:50:14.600 | to navigate loving people
00:50:17.920 | that were addicted to substances.
00:50:20.240 | - I wanna be careful with this question,
00:50:22.600 | but how much--
00:50:24.240 | - Money do you have?
00:50:25.320 | - How much-- (laughing)
00:50:27.480 | Can I borrow $10?
00:50:28.520 | (laughing)
00:50:30.920 | Okay.
00:50:31.760 | No, is how much control,
00:50:36.520 | how much, despite the chemical imbalances
00:50:39.760 | or the biological limitations
00:50:42.920 | that each of our individual brains have,
00:50:44.480 | how much mind over matter is there?
00:50:47.080 | So through things,
00:50:49.440 | and I've known people with clinical depression,
00:50:53.200 | and so it's always a touchy subject
00:50:55.560 | to say how much they can really help it.
00:50:57.680 | - Very.
00:50:58.520 | - What can you, yeah, what can you,
00:51:01.680 | 'cause you've talked about codependency,
00:51:03.560 | you've talked about issues that you struggle through,
00:51:07.400 | and nevertheless you choose to take a journey
00:51:09.880 | of healing and so on.
00:51:11.200 | So that's your choice, that's your actions.
00:51:14.240 | So how much can you do to help fight the limitations
00:51:17.720 | of the neurochemicals in your brain?
00:51:20.000 | - That's such an interesting question,
00:51:21.800 | and I don't think I'm at all qualified to answer,
00:51:23.440 | but I'll say what I do know.
00:51:25.560 | And really quick, just the definition of codependency,
00:51:28.200 | I think a lot of people think of codependency
00:51:29.880 | as like two people that can't stop hanging out,
00:51:32.680 | you know, or like, you know,
00:51:35.200 | that's not totally off,
00:51:36.640 | but I think for the most part,
00:51:38.280 | my favorite definition of codependency
00:51:39.960 | is the inability to tolerate the discomfort of others.
00:51:42.920 | You grow up in an alcoholic home,
00:51:44.040 | you grow up around mental illness,
00:51:45.200 | you grow up in chaos,
00:51:46.480 | you have a parent that's a narcissist,
00:51:48.280 | you basically are wired to just
00:51:50.520 | people-please, worry about others,
00:51:52.760 | be perfect, walk on eggshells,
00:51:54.720 | shape shift to accommodate other people.
00:51:56.680 | So codependence is a very active wiring issue
00:52:01.680 | that, you know, doesn't just affect
00:52:04.480 | your romantic relationships,
00:52:05.680 | it affects you being a boss,
00:52:07.040 | it affects you in the world online,
00:52:10.520 | you know, you get one negative comment
00:52:12.040 | and it throws you for two weeks,
00:52:14.160 | you know, it also is linked to eating disorders
00:52:16.200 | and other kinds of addiction.
00:52:17.160 | So it's a very big thing.
00:52:20.240 | And I think a lot of people sometimes only think
00:52:22.000 | that it's in romantic relationships,
00:52:23.520 | so I always feel the need to say that.
00:52:25.960 | And also one of the reasons I love the idea of robots
00:52:28.080 | so much because you don't have to walk
00:52:29.880 | on eggshells around them,
00:52:30.880 | you don't have to worry they're gonna get mad at you yet,
00:52:33.320 | but there's no, codependents are hypersensitive
00:52:36.920 | to the needs and moods of others,
00:52:39.560 | and it's very exhausting, it's depleting.
00:52:42.160 | Just, well, one conversation about where we're gonna
00:52:44.720 | go to dinner is like, do you wanna go get Chinese food?
00:52:47.280 | We just had Chinese food.
00:52:48.440 | Well, wait, are you mad?
00:52:50.160 | Well, no, I didn't mean to, and it's just like,
00:52:52.200 | that codependents live in this,
00:52:54.960 | everything means something,
00:52:56.640 | and humans can be very emotionally exhausting.
00:53:00.160 | Why did you look at me that way?
00:53:01.200 | What are you thinking about?
00:53:02.040 | What was that, why'd you check your phone?
00:53:03.200 | It's just, it's a hypersensitivity
00:53:05.040 | that can be incredibly time-consuming,
00:53:07.920 | which is why I love the idea of robots just subbing in.
00:53:10.760 | Even, I've had a hard time running TV shows and stuff
00:53:13.840 | because even asking someone to do something,
00:53:15.360 | I don't wanna come off like a bitch.
00:53:16.520 | I'm very concerned about what other people think of me,
00:53:18.680 | how I'm perceived, which is why I think robots
00:53:21.640 | will be very beneficial for codependents.
00:53:23.880 | - By the way, just a real quick tangent,
00:53:25.600 | that skill or flaw, whatever you wanna call it,
00:53:29.160 | is actually really useful for,
00:53:30.840 | if you ever do start your own podcast for interviewing,
00:53:34.640 | because you're now kind of obsessed
00:53:36.480 | about the mindset of others,
00:53:39.240 | and it makes you a good sort of listener and someone to talk with.
00:53:43.560 | So I think, what's her name from NPR?
00:53:48.560 | - Terry Gross.
00:53:49.480 | - Terry Gross talked about having that, so.
00:53:51.720 | - I don't feel like she has that at all.
00:53:53.680 | (laughing)
00:53:56.320 | What?
00:53:57.160 | She worries about other people's feelings.
00:54:00.760 | - Yeah, absolutely.
00:54:01.800 | - Oh, I don't get that at all.
00:54:03.680 | - I mean, you have to put yourself in the mind
00:54:05.240 | of the person you're speaking with.
00:54:07.120 | - Yes, oh, I see, just in terms of,
00:54:08.880 | yeah, I am starting a podcast,
00:54:10.200 | and the reason I haven't is because I'm codependent
00:54:12.440 | and I'm too worried it's not gonna be perfect.
00:54:14.360 | So a big codependent adage is,
00:54:17.520 | perfectionism leads to procrastination,
00:54:19.160 | which leads to paralysis.
00:54:20.320 | - So how do you, sorry to take a million tangents,
00:54:22.280 | how do you survive on social media?
00:54:23.600 | 'Cause you're exceptionally active.
00:54:25.160 | - But by the way, I took you on a tangent
00:54:26.560 | and didn't answer your last question
00:54:27.800 | about how much we can control.
00:54:29.960 | - How much, yeah, we'll return it, or maybe not.
00:54:32.960 | The answer is we can't.
00:54:33.800 | - Now as a codependent, I'm worried, okay, good.
00:54:36.240 | We can, but one of the things that I'm fascinated by
00:54:39.640 | is the first thing you learn
00:54:40.960 | when you go into 12-step programs
00:54:42.640 | or addiction recovery or any of this
00:54:44.200 | is genetics loads the gun, environment pulls the trigger.
00:54:47.840 | And there's certain parts of your genetics
00:54:50.540 | you cannot control.
00:54:51.560 | I come from a lot of alcoholism,
00:54:54.240 | I come from a lot of mental illness,
00:54:59.920 | there's certain things I cannot control
00:55:01.680 | and a lot of things that maybe we don't even know yet
00:55:04.000 | what we can and can't
00:55:04.840 | 'cause of how little we actually know about the brain.
00:55:06.720 | But we also talk about the warrior spirit
00:55:08.600 | and there are some people that have that warrior spirit
00:55:12.080 | and we don't necessarily know what that engine is,
00:55:15.280 | whether it's you get dopamine from succeeding
00:55:17.960 | or achieving or martyring yourself
00:55:21.160 | or that tension you get from growing.
00:55:24.880 | So a lot of people are like,
00:55:25.720 | "Oh, well, this person can edify themselves and overcome,
00:55:29.040 | "but if you're getting attention from improving yourself,
00:55:32.260 | "you're gonna keep wanting to do that."
00:55:34.560 | So that is something that helps a lot of
00:55:37.260 | in terms of changing your brain.
00:55:38.600 | If you talk about changing your brain to people
00:55:40.440 | and talk about what you're doing
00:55:41.640 | to overcome said obstacles,
00:55:42.920 | you're gonna get more attention from them,
00:55:44.600 | which is gonna fire off your reward system
00:55:46.840 | and then you're gonna keep doing it.
00:55:48.400 | - Yeah, so you can leverage that momentum.
00:55:50.320 | - So this is why in any 12-step program,
00:55:52.720 | you go into a room and you talk about your progress
00:55:55.160 | 'cause then everyone claps for you
00:55:57.120 | and then you're more motivated to keep going.
00:55:58.800 | So that's why we say you're only as sick
00:56:00.200 | as the secrets you keep,
00:56:01.240 | because if you keep things secret,
00:56:03.700 | there's no one guiding you to go in a certain direction.
00:56:06.120 | It's based on, right?
00:56:07.120 | We're sort of designed to get approval from the tribe
00:56:10.400 | or from a group of people 'cause our brain
00:56:12.840 | translates it to safety.
00:56:14.600 | So, yeah.
00:56:15.440 | - And in that case, the tribe is a positive one
00:56:17.680 | that helps you go in a positive direction.
00:56:19.520 | - So that's why it's so important to go into a room
00:56:21.240 | and also say, "Hey, I wanted to use drugs today."
00:56:25.080 | And people go, "Mm."
00:56:26.440 | They go, "Me too."
00:56:27.720 | You feel less alone and you feel less like you're,
00:56:30.040 | you know, have been castigated from the pack or whatever.
00:56:32.720 | And then you say, "And I didn't have any,"
00:56:34.200 | you get a chip when you haven't drank for 30 days
00:56:36.400 | or 60 days or whatever.
00:56:37.560 | You get little rewards.
00:56:38.600 | - So talking about a pack that's not at all healthy or good,
00:56:43.200 | but in fact is often toxic, social media.
00:56:46.240 | So you're one of my favorite people on Twitter and Instagram
00:56:49.760 | to sort of just both the comedy and the insight and just fun.
00:56:54.480 | How do you prevent social media
00:56:55.760 | from destroying your mental health?
00:56:57.240 | - I haven't.
00:56:58.080 | I haven't.
00:57:00.400 | It's the next big epidemic, isn't it?
00:57:03.160 | I don't think I have.
00:57:06.600 | I don't think--
00:57:08.680 | - Is moderation the answer?
00:57:10.760 | - Maybe, but you can do a lot of damage in a moderate way.
00:57:14.320 | I mean, I guess, again, it depends on your goals, you know?
00:57:17.120 | And I think for me,
00:57:19.600 | the way that my addiction to social media,
00:57:21.800 | I'm happy to call it an addiction.
00:57:23.080 | I mean, and I define it as an addiction
00:57:24.960 | because it stops being a choice.
00:57:26.280 | There are times I just reach over and I'm like,
00:57:28.120 | that was--
00:57:29.240 | - Yeah, that was weird.
00:57:30.080 | - That was weird.
00:57:31.320 | I'll be driving sometimes and I'll be like, oh my God,
00:57:33.720 | my arm just went to my phone, you know?
00:57:36.800 | I can put it down.
00:57:37.800 | I can take time away from it, but when I do, I get antsy.
00:57:41.360 | I get restless, irritable, and discontent.
00:57:43.400 | I mean, that's kind of the definition, isn't it?
00:57:45.840 | So I think by no means
00:57:48.680 | do I have a healthy relationship with social media.
00:57:50.480 | I'm sure there's a way to,
00:57:51.560 | but I think I'm especially a weirdo in this space
00:57:54.800 | because it's easy to conflate, is this work?
00:57:58.040 | Is this not?
00:57:58.880 | I can always say that it's for work.
00:58:00.840 | - Right. - You know?
00:58:01.960 | - But I mean, don't you get the same kind of thing
00:58:04.160 | as you get from when a room full of people laugh at your jokes
00:58:08.080 | 'cause I mean, I see, especially the way you do Twitter,
00:58:11.200 | it's an extension of your comedy in a way.
00:58:13.680 | - I took a big break from Twitter though,
00:58:15.960 | a really big break.
00:58:16.800 | I took like six months off or something for a while
00:58:19.160 | because it was just like,
00:58:20.480 | it seemed like it was all kind of politics
00:58:22.240 | and it was just a little bit,
00:58:23.280 | it wasn't giving me dopamine
00:58:25.000 | because there was like this weird, a lot of feedback.
00:58:28.280 | So I had to take a break from it and then go back to it
00:58:30.720 | 'cause I felt like I didn't have a healthy relationship.
00:58:33.480 | - Have you ever tried the, I don't know if I believe him,
00:58:36.120 | but Joe Rogan seems to not read comments.
00:58:39.440 | Have you, and he's one of the only people at the scale,
00:58:42.540 | like at your level, who at least claims not to read.
00:58:48.760 | 'Cause you and him swim in this space of tense ideas
00:58:53.760 | that get the toxic folks riled up.
00:58:58.720 | - I think Rogan, I don't know.
00:59:01.160 | I think he probably looks at YouTube, like the likes,
00:59:06.600 | and I think if something's, if he doesn't know,
00:59:09.640 | I don't know, I'm sure he would tell the truth.
00:59:12.700 | I'm sure he's got people that look at them
00:59:15.840 | and is like, "This guest did great,"
00:59:17.280 | or, "I don't," I'm sure he gets it.
00:59:20.640 | I can't picture him in the weeds on--
00:59:23.240 | - No, for sure.
00:59:24.320 | He's honest actually saying that.
00:59:26.020 | - Feedback, we're addicted to feedback.
00:59:29.680 | Yeah, we're addicted to feedback.
00:59:30.600 | I mean, look, I think that our brain is designed
00:59:34.280 | to get intel on how we're perceived
00:59:37.760 | so that we know where we stand, right?
00:59:39.900 | That's our whole deal, right?
00:59:41.320 | As humans, we wanna know where we stand.
00:59:43.080 | We walk into a room and we go,
00:59:44.160 | "Who's the most powerful person in here?
00:59:45.480 | "I gotta talk to 'em and get in their good graces."
00:59:47.520 | It's just we're designed to rank ourselves, right?
00:59:49.960 | And constantly know our rank.
00:59:51.800 | And social media, because you can't figure out your rank
00:59:55.900 | with 500 million people, it's impossible.
00:59:59.600 | So our brain is like, "What's my rank?
01:00:00.720 | "What's my," and especially if we're following people.
01:00:03.000 | I think the big, the interesting thing I think
01:00:05.440 | I may be able to say about this,
01:00:07.840 | besides my speech impediment,
01:00:09.440 | is that I did start muting people
01:00:13.120 | that rank wildly higher than me
01:00:16.160 | because it is just stressful on the brain
01:00:19.040 | to constantly look at people that are incredibly successful
01:00:23.080 | so you keep feeling bad about yourself.
01:00:25.320 | I think that that is cutting to a certain extent.
01:00:28.640 | Just like, look at me looking at all these people
01:00:30.960 | that have so much more money than me
01:00:32.080 | and so much more success than me.
01:00:33.800 | It's making me feel like a failure,
01:00:35.880 | even though I don't think I'm a failure,
01:00:37.600 | but it's easy to frame it so that I can feel that way.
01:00:41.920 | - Yeah, that's really interesting,
01:00:43.320 | especially if they're close to,
01:00:45.120 | like if they're other comedians or something like that.
01:00:46.920 | - That's right. - Or whatever.
01:00:48.920 | It's really disappointing to me.
01:00:50.480 | I do the same thing as well.
01:00:51.760 | So other successful people that are really close
01:00:53.580 | to what I do, I don't know.
01:00:56.400 | I wish I could just admire.
01:00:58.240 | - Yeah.
01:00:59.080 | - And for it not to be a distraction.
01:01:01.160 | - But that's why you are where you are
01:01:02.480 | 'cause you don't just admire, you're competitive
01:01:04.360 | and you wanna win.
01:01:05.280 | So it's also the same thing that bums you out
01:01:07.520 | when you look at this is the same reason
01:01:08.840 | you are where you are.
01:01:09.680 | So that's why I think it's so important
01:01:11.520 | to learn about neurology and addiction
01:01:12.840 | 'cause you're able to go like,
01:01:14.120 | oh, this same instinct.
01:01:15.720 | So I'm very sensitive and I sometimes don't like that
01:01:18.840 | about myself, but I'm like, well, that's the reason
01:01:20.480 | I'm able to write good standup.
01:01:22.440 | And that's the reason I'm able to be sensitive to feedback
01:01:25.680 | and go, that joke should have been better.
01:01:26.920 | I can make that better.
01:01:28.080 | So it's the kind of thing where it's like,
01:01:29.480 | you have to be really sensitive in your work
01:01:31.280 | and the second you leave,
01:01:32.360 | you gotta be able to turn it off.
01:01:33.580 | It's about developing the muscle,
01:01:34.840 | being able to know when to let it be a superpower
01:01:38.360 | and when it's gonna hold you back and be an obstacle.
01:01:41.200 | So I try to not be in that black and white
01:01:43.080 | of like, being competitive is bad
01:01:45.760 | or being jealous of someone just to go like,
01:01:47.680 | oh, there's that thing that makes me really successful
01:01:50.200 | in a lot of other ways,
01:01:51.400 | but right now it's making me feel bad.
01:01:53.240 | - Well, I'm kind of looking to you
01:01:54.920 | 'cause you're basically a celebrity,
01:01:58.000 | a famous, sort of world-class comedian.
01:02:01.200 | And so I feel like you're the right person
01:02:03.080 | to be one of the key people to define
01:02:06.080 | what's the healthy path forward with social media.
01:02:08.680 | 'Cause we're all trying to figure it out now
01:02:12.800 | and I'm curious to see where it evolves.
01:02:16.200 | I think you're at the center of that.
01:02:17.960 | So like, there's trying to leave Twitter
01:02:21.640 | and then come back and say,
01:02:22.800 | can I do this in a healthy way?
01:02:24.080 | I mean, you have to keep trying, exploring and thinking.
01:02:25.920 | - You have to know because it's being,
01:02:28.120 | I have a couple answers.
01:02:29.720 | I think, I hire a company
01:02:31.600 | to do some of my social media for me.
01:02:33.920 | So it's also being able to go,
01:02:35.960 | okay, I make a certain amount of money by doing this,
01:02:38.360 | but now let me be a good business person
01:02:40.360 | and say, I'm gonna pay you this amount to run this for me.
01:02:42.960 | So I'm not 24/7 in the weeds, hashtagging and responding.
01:02:45.920 | And just, it's a lot to take on.
01:02:47.280 | It's a lot of energy to take on.
01:02:48.800 | But at the same time,
01:02:49.640 | part of what I think makes me successful
01:02:52.280 | on social media if I am,
01:02:53.480 | is that people know I'm actually doing it
01:02:55.280 | and that I am engaging and I'm responding
01:02:57.200 | and developing a personal relationship
01:02:59.600 | with complete strangers.
01:03:01.080 | So I think, figuring out that balance
01:03:04.000 | and really approaching it as a business,
01:03:06.200 | that's what I try to do.
01:03:07.320 | It's not dating, it's not,
01:03:09.520 | I try to just be really objective about,
01:03:11.240 | okay, here's what's working, here's what's not working.
01:03:13.440 | And in terms of taking the break from Twitter,
01:03:15.920 | this is a really savage take,
01:03:17.680 | but because I don't talk about my politics publicly,
01:03:21.760 | being on Twitter right after the last election
01:03:26.080 | was not gonna be beneficial
01:03:27.960 | because there was gonna be, you had to take a side.
01:03:30.320 | You had to be political
01:03:31.600 | in order to get any kind of retweets or likes.
01:03:34.440 | And I just wasn't interested in doing that
01:03:37.280 | 'cause you were gonna lose as many people
01:03:38.720 | as you were gonna gain
01:03:39.560 | and it was gonna all come clean in the wash.
01:03:40.840 | So I was just like,
01:03:41.840 | the best thing I can do for me business-wise
01:03:44.360 | is to just abstain.
01:03:46.680 | And the robot, I joke about her replacing me,
01:03:52.240 | but she does do half of my social media.
01:03:54.360 | 'Cause I don't want people to get sick of me.
01:03:57.960 | I don't want to be redundant.
01:03:59.840 | There are times when I don't have the time or the energy
01:04:02.440 | to make a funny video,
01:04:03.360 | but I know she's gonna be compelling and interesting
01:04:06.160 | and that's something that you can't see every day.
01:04:08.520 | - Of course, the humor comes from your,
01:04:11.920 | I mean, the cleverness, the wit,
01:04:13.400 | the humor comes from you when you film the robot.
01:04:16.400 | That's kind of the trick of it.
01:04:17.840 | I mean, the robot is not quite there
01:04:21.000 | to do anything funny.
01:04:23.440 | The absurdity is revealed through the filmmaker in that case
01:04:26.680 | or whoever is interacting,
01:04:27.840 | not through the actual robot being who she is.
01:04:32.840 | Let me sort of, love, okay.
01:04:37.440 | (laughing)
01:04:38.960 | How difficult-- - What is it?
01:04:40.800 | - What is it?
01:04:41.640 | Well, first an engineering question.
01:04:45.080 | I know, I know, you're not an engineer,
01:04:48.080 | but how difficult do you think is it to build an AI system
01:04:52.160 | that you can have a deep,
01:04:53.760 | fulfilling, monogamous relationship with?
01:04:56.480 | Sort of replace the human-to-human relationships
01:04:59.880 | that we value?
01:05:00.800 | - I think anyone can fall in love with anything.
01:05:04.280 | You know, like how often have you looked back at someone,
01:05:08.340 | like I ran into someone the other day
01:05:10.480 | that I was in love with and I was like,
01:05:12.640 | "Hey," it was like, there was nothing there.
01:05:16.320 | There was nothing there.
01:05:17.880 | Like, you know, like where you're able to go like,
01:05:19.720 | "Oh, that was weird.
01:05:20.880 | "Oh, right."
01:05:22.480 | You know, I were able--
01:05:25.080 | - You mean from a distant past
01:05:26.320 | or something like that? - Yeah.
01:05:27.880 | When you're able to go like,
01:05:28.720 | I can't believe we had an incredible connection
01:05:31.440 | and now it's just,
01:05:32.960 | I do think that people will be in love with robots,
01:05:37.320 | probably even more deeply than with humans,
01:05:39.820 | because it's like when people mourn their animals,
01:05:42.520 | when their animals die,
01:05:44.040 | they're always, it's sometimes harder than mourning a human
01:05:47.760 | because you can't go, "Well, he was kind of an asshole."
01:05:50.360 | But like, "He didn't pick me up from school."
01:05:52.000 | You know, it's like you're able to get out of your grief
01:05:53.880 | a little bit.
01:05:54.720 | You're able to kind of be,
01:05:56.200 | "Oh, he was kind of judgmental."
01:05:57.440 | Or, "She was kind of," you know.
01:05:59.440 | With a robot, there's something so pure about,
01:06:02.080 | and innocent, and impish, and childlike about it
01:06:05.560 | that I think it probably will be much more conducive
01:06:09.660 | to a narcissistic love, for sure, at that.
01:06:12.520 | But it's not like, "Well, he cheated.
01:06:15.240 | "She can't cheat.
01:06:16.080 | "She can't leave you.
01:06:16.900 | "She can't," you know.
01:06:17.960 | - Well, if Bearclaw leaves your life
01:06:21.560 | and maybe a new version
01:06:23.640 | or somebody else will enter,
01:06:25.680 | will you miss Bearclaw?
01:06:27.960 | - For guys that have these sex robots,
01:06:30.680 | they're building a nursing home for the bodies
01:06:34.400 | that are now rusting
01:06:36.360 | 'cause they don't wanna part with the bodies
01:06:37.960 | 'cause they have such an intense emotional connection to it.
01:06:40.880 | I mean, it's kind of like a car club, a little bit.
01:06:42.880 | You know, like it's, you know.
01:06:45.000 | But I'm not saying this is right.
01:06:47.400 | I'm not saying it's cool, it's weird, it's creepy,
01:06:50.040 | but we do anthropomorphize things with faces
01:06:53.840 | and we do develop emotional connections to things.
01:06:56.680 | I mean, there's certain,
01:06:58.080 | have you ever tried to like throw away,
01:06:59.360 | I can't even throw away my teddy bear from when I was a kid.
01:07:01.840 | It's a piece of trash and it's upstairs.
01:07:04.360 | Like, it's just like, why can't I throw that away?
01:07:06.700 | It's bizarre.
01:07:07.980 | You know, and there's something kind of beautiful about that.
01:07:10.160 | There's something, it gives me hope in humans
01:07:13.160 | 'cause I see humans do such horrific things all the time
01:07:15.760 | and maybe I'm too, I see too much of it, frankly,
01:07:18.380 | but there's something kind of beautiful
01:07:20.280 | about the way we're able to have emotional connections
01:07:24.400 | to objects, which, you know, a lot of,
01:07:29.240 | I mean, it's kind of specifically, I think, Western, right?
01:07:32.200 | That we don't see objects as having souls.
01:07:34.920 | Like, that's kind of specifically us.
01:07:36.840 | But I don't think it's so much
01:07:39.760 | that we're objectifying humans with these sex robots.
01:07:43.460 | We're kind of humanizing objects, right?
01:07:45.700 | So there's something kind of fascinating
01:07:47.120 | in our ability to do that.
01:07:48.180 | 'Cause a lot of us don't humanize humans.
01:07:50.080 | So it's just a weird little place to play in.
01:07:52.880 | And I think a lot of people, I mean,
01:07:54.980 | a lot of people will be marrying these things is my guess.
01:07:57.760 | - So you've asked the question.
01:07:59.640 | Let me ask it of you.
01:08:00.640 | So what is love?
01:08:01.900 | You have a bit of a brilliant definition of love
01:08:05.720 | as being willing to die for someone
01:08:07.860 | who you yourself want to kill.
01:08:10.600 | So that's kind of fun.
01:08:12.240 | First of all, that's brilliant.
01:08:13.960 | That's a really good definition.
01:08:16.480 | I think it'll stick with me for a long time.
01:08:18.080 | - This is how little of a romantic I am.
01:08:19.880 | A plane went by when you said that.
01:08:21.440 | And my brain is like, you're gonna need to rerecord that.
01:08:24.960 | I don't want you to get into post
01:08:26.440 | and then not be able to use it.
01:08:28.040 | (laughing)
01:08:30.300 | - And I'm a romantic 'cause I--
01:08:32.000 | - Don't mean to ruin the moment.
01:08:33.680 | - Actually, I was conscious of the fact
01:08:35.680 | that I heard the plane and it made me feel like
01:08:38.280 | how amazing it is that we live in a world with planes.
01:08:41.160 | (laughing)
01:08:43.360 | - And I just went, why haven't we fucking evolved
01:08:46.160 | past planes and why can't they make them quieter?
01:08:49.120 | - Yeah.
01:08:49.960 | (laughing)
01:08:50.780 | - But yes.
01:08:51.620 | - This--
01:08:52.460 | - My definition of love?
01:08:54.480 | - What, yeah, what's your--
01:08:56.760 | - Consistently producing dopamine for a long time.
01:08:59.960 | (laughing)
01:09:01.440 | Consistent output of oxytocin with the same person.
01:09:05.140 | - Dopamine is a positive thing.
01:09:08.280 | What about the negative?
01:09:09.560 | What about the fear and the insecurity?
01:09:12.000 | The longing, anger, all that kind of stuff?
01:09:16.520 | - I think that's part of love.
01:09:17.840 | I think that love brings out the best in you
01:09:22.000 | but it also, if you don't get angry and upset,
01:09:24.040 | it's, I don't know, I think that that's part of it.
01:09:26.880 | I think we have this idea that love has to be really
01:09:29.680 | placid or something.
01:09:31.060 | I only saw stormy relationships growing up
01:09:34.160 | so I don't have a judgment on how a relationship
01:09:38.120 | should look but I do think that this idea
01:09:42.360 | that love has to be eternal is really destructive,
01:09:47.360 | is really destructive and self-defeating
01:09:50.780 | and a big source of stress for people.
01:09:53.680 | I mean, I'm still figuring out love.
01:09:55.640 | I think we all kind of are but I do kind of
01:09:58.040 | stand by that definition.
01:09:59.360 | And I think that, I think for me, love is like
01:10:04.160 | just being able to be authentic with somebody.
01:10:06.260 | It's very simple, I know, but I think for me
01:10:08.560 | it's about not feeling pressure to have to perform
01:10:11.060 | or impress somebody, just feeling truly like
01:10:14.800 | accepted unconditionally by someone.
01:10:16.540 | Although I do believe love should be conditional.
01:10:19.160 | That might be a hot take.
01:10:22.860 | I think everything should be conditional.
01:10:24.280 | I think if someone's behavior, I don't think love
01:10:28.060 | should just be like, I'm in love with you,
01:10:29.520 | now behave however you want forever.
01:10:30.960 | This is unconditional.
01:10:31.840 | I think love is a daily action.
01:10:35.320 | It's not something you just like get tenure on
01:10:38.120 | and then get to behave however you want
01:10:40.020 | 'cause we said I love you 10 years ago.
01:10:41.900 | It's a daily, it's a verb.
01:10:44.580 | - Well, there's some things that are,
01:10:46.140 | you see, if you explicitly make it clear
01:10:49.080 | that it's conditional, it takes away
01:10:51.460 | some of the magic of it.
01:10:52.520 | So there's certain stories we tell ourselves
01:10:55.360 | that we don't wanna make explicit about love.
01:10:57.200 | I don't know, maybe that's the wrong way to think of it.
01:10:59.160 | Maybe you wanna be explicit in relationships.
01:11:02.680 | - I also think love is a business decision.
01:11:04.640 | Like I do in a good way.
01:11:07.980 | Like I think that love is not just
01:11:11.160 | when you're across from somebody.
01:11:12.580 | It's when I go to work, can I focus?
01:11:15.360 | Do I, am I worried about you?
01:11:16.200 | Am I stressed out about you?
01:11:17.440 | Am I, you're not responding to me, you're not reliable.
01:11:20.420 | Like I think that being in a relationship,
01:11:23.180 | the kind of love that I would want
01:11:24.280 | is the kind of relationship where when we're not together,
01:11:26.920 | it's not draining me, causing me stress, making me worry,
01:11:30.320 | you know, and sometimes passion, that word.
01:11:33.400 | You know, we get murky about it,
01:11:36.480 | but I think it's also like I can be the best version
01:11:38.200 | of myself when the person's not around
01:11:40.080 | and I don't have to feel abandoned or scared
01:11:42.680 | or any of these kind of other things.
01:11:43.960 | So it's like love, you know, for me, I think is,
01:11:47.040 | I think it's a Flaubert quote and I'm gonna butcher it,
01:11:49.800 | but I think it's like be, you know,
01:11:51.920 | boring in your personal life so you can be violent
01:11:54.240 | and take risks in your professional life.
01:11:55.780 | Is that it?
01:11:56.620 | I got it wrong.
01:11:57.480 | Something like that, but I do think that
01:12:00.000 | it's being able to align values in a way
01:12:01.680 | to where you can also thrive outside of the relationship.
01:12:04.680 | - Some of the most successful people I know
01:12:06.200 | are those sort of happily married and have kids and so on.
01:12:10.120 | It's always funny--
01:12:10.960 | - It can be boring.
01:12:11.880 | Boring's okay.
01:12:13.200 | Boring is serenity.
01:12:14.480 | - And it's funny how those elements
01:12:16.500 | actually make you much more productive.
01:12:18.380 | I don't understand the--
01:12:19.700 | - I don't think relationships should drain you
01:12:21.160 | and take away energy that you could be using
01:12:23.440 | to create things that generate pride.
01:12:25.800 | - Okay.
01:12:26.640 | - Did you say your definition of love yet?
01:12:28.240 | - Huh?
01:12:29.080 | - Have you said your definition of love?
01:12:31.400 | - My definition of love?
01:12:32.640 | No, I did not say it.
01:12:34.760 | (laughing)
01:12:35.600 | We're out of time.
01:12:36.760 | - No!
01:12:37.600 | - When you have a podcast, maybe you can invite me on.
01:12:41.800 | - Oh no, I already did.
01:12:42.640 | You're doing it.
01:12:44.040 | We've already talked about this.
01:12:46.400 | - And because I also have codependency, I had to say yes.
01:12:49.480 | - No, yeah, yeah, no, no, I'm trapping you.
01:12:52.120 | You owe me now.
01:12:53.080 | - Actually, I wondered whether when I asked
01:12:58.280 | if we could talk today, after sort of doing more research
01:13:01.720 | and reading some of your book, I started to wonder,
01:13:04.640 | did you just feel pressured to say yes?
01:13:07.040 | - Yes, of course.
01:13:09.320 | - Good.
01:13:10.160 | - But I'm a fan of yours too.
01:13:11.000 | - Okay, awesome.
01:13:11.820 | - No, I actually, because I am codependent,
01:13:13.400 | but I'm in recovery for codependence,
01:13:14.920 | so I actually don't do anything I don't wanna do.
01:13:17.640 | - You really, you go out of your way to say no.
01:13:20.400 | - What's that?
01:13:21.240 | I say no all the time.
01:13:22.720 | - Good, I'm trying to learn that as well.
01:13:23.560 | - I moved this, remember, I moved it from one to two.
01:13:26.040 | - Yeah, yeah.
01:13:26.880 | - I--
01:13:27.720 | - Yeah, just to let you know how recovered I am.
01:13:30.320 | I'm not codependent, but I don't do anything
01:13:33.960 | I don't wanna do.
01:13:34.800 | - Yeah, you're ahead of me on that, okay.
01:13:36.960 | So do you--
01:13:37.800 | - You're like, I don't even wanna be here.
01:13:38.880 | (laughing)
01:13:40.600 | - Do you think about your mortality?
01:13:43.480 | - Yes, it is a big part of how I was able
01:13:47.000 | to sort of kickstart my codependence recovery.
01:13:49.120 | My dad passed a couple years ago,
01:13:50.480 | and when you have someone close to you in your life die,
01:13:53.120 | everything gets real clear in terms of
01:13:56.540 | how we're a speck of dust who's only here
01:13:58.360 | for a certain amount of time.
01:14:00.880 | - What do you think is the meaning of it all?
01:14:02.360 | Like what, the speck of dust,
01:14:05.200 | what's maybe in your own life,
01:14:08.080 | what's the goal, the purpose of your existence?
01:14:13.080 | - Is there one?
01:14:15.280 | - Well, you're exceptionally ambitious,
01:14:17.320 | you've created some incredible things
01:14:19.120 | in different disciplines.
01:14:21.680 | - Yeah, it's we're all just managing our terror
01:14:23.560 | 'cause we know we're gonna die.
01:14:24.560 | So we create and build all these things
01:14:26.360 | and rituals and religions and robots
01:14:29.460 | and whatever we need to do to just distract ourselves
01:14:31.800 | from imminent rotting.
01:14:35.240 | We're rotting, we're all dying.
01:14:37.140 | I got very into terror management theory
01:14:42.540 | when my dad died and it resonated, it helped me,
01:14:45.120 | and everyone's got their own religion
01:14:46.580 | or sense of purpose or thing that distracts them
01:14:50.280 | from the horrors of being human.
01:14:53.360 | - What's the terror management theory?
01:14:56.080 | - Terror management is basically the idea
01:14:57.360 | that since we're the only animal
01:14:58.680 | that knows they're gonna die,
01:15:00.360 | we have to basically distract ourselves
01:15:03.520 | with awards and achievements and games and whatever
01:15:08.520 | just in order to distract ourselves
01:15:11.960 | from the terror we would feel if we really processed
01:15:14.600 | the fact that we could not only, we are gonna die,
01:15:16.960 | but also could die at any minute
01:15:18.480 | because we're only superficially
01:15:19.820 | at the top of the food chain.
01:15:22.760 | And technically we're at the top of the food chain
01:15:26.160 | if we have houses and guns and stuff, machines,
01:15:29.320 | but if me and a lion are in the woods together,
01:15:32.520 | most things could kill us.
01:15:33.800 | I mean, a bee can kill some people.
01:15:35.720 | Something this big can kill a lot of humans.
01:15:38.680 | So it's basically just to manage the terror
01:15:41.480 | that we all would feel if we were able to really be awake.
01:15:45.200 | 'Cause we're mostly zombies, right?
01:15:46.860 | Job, school, religion, go to sleep, drink,
01:15:50.400 | football, relationship, dopamine, love, you know,
01:15:54.480 | we're kind of just like trudging along
01:15:57.000 | like zombies for the most part.
01:15:58.480 | And then I think--
01:15:59.800 | - That fear of death adds some motivation.
01:16:02.360 | - Yes.
01:16:03.460 | - Well, I think I speak for a lot of people
01:16:06.080 | in saying that I can't wait to see what your terror creates
01:16:10.440 | (laughing)
01:16:11.880 | in the next few years.
01:16:13.640 | I'm a huge fan.
01:16:14.840 | Whitney, thank you so much for talking today.
01:16:16.560 | - Thanks.
01:16:18.840 | - Thanks for listening to this conversation
01:16:20.520 | with Whitney Cummings.
01:16:21.880 | And thank you to our presenting sponsor, Cash App.
01:16:24.680 | Download it and use code LEXPODCAST.
01:16:27.400 | You'll get $10 and $10 will go to FIRST,
01:16:30.160 | a STEM education nonprofit that inspires hundreds
01:16:33.120 | of thousands of young minds to learn
01:16:35.440 | and to dream of engineering our future.
01:16:38.000 | If you enjoy this podcast, subscribe on YouTube,
01:16:40.800 | give it five stars on Apple Podcast,
01:16:42.720 | support on Patreon, or connect with me on Twitter.
01:16:46.080 | Thank you for listening and hope to see you next time.
01:16:49.200 | (upbeat music)
01:16:51.780 | (upbeat music)