Whitney Cummings: Comedy, Robotics, Neurology, and Love | Lex Fridman Podcast #55
Chapters
0:00 Introduction
51:26 Definition of Codependency
56:54 Prevent Social Media from Destroying Your Mental Health
57:19 Addiction to Social Media
68:00 What Is Love
74:39 Terror Management Theory
00:00:00.000 |
The following is a conversation with Whitney Cummings. 00:00:03.640 |
She's a stand-up comedian, actor, producer, writer, director, 00:00:07.240 |
and recently, finally, the host of her very own podcast called Good For You. 00:00:12.920 |
Her most recent Netflix special, called Can I Touch It?, features in part a robot she affectionately 00:00:18.680 |
named Bearclaw that is designed to be visually a replica of Whitney. 00:00:23.400 |
It's exciting for me to see one of my favorite comedians 00:00:26.000 |
explore the social aspects of robotics and AI in our society. 00:00:32.920 |
about human behavior, psychology, and neurology, 00:00:36.000 |
some of which she explores in her book called 00:00:45.200 |
and even to continue it through text afterwards. 00:00:52.360 |
and will get a text from Whitney saying something hilarious. 00:00:55.760 |
Or weirder yet, sending a video of Brian Callen 00:01:00.960 |
That's when I know the universe has a sense of humor, 00:01:03.880 |
and it gifted me with one hell of an amazing journey. 00:01:09.200 |
to programming with a stupid, joyful smile on my face. 00:01:13.400 |
If you enjoy this conversation, listen to Whitney's podcast, 00:01:16.320 |
Good For You, and follow her on Twitter and Instagram. 00:01:26.800 |
support on Patreon, or simply connect with me on Twitter, 00:01:41.440 |
I personally use Cash App to send money to friends, 00:01:50.660 |
You can buy fractions of a stock, say $1 worth, 00:01:55.760 |
Brokerage services are provided by Cash App Investing, 00:02:04.120 |
to support one of my favorite organizations called FIRST, 00:02:07.180 |
best known for their FIRST Robotics and LEGO competitions. 00:02:10.400 |
They educate and inspire hundreds of thousands of students 00:02:16.000 |
and have a perfect rating on Charity Navigator, 00:02:22.380 |
When you get Cash App from the App Store or Google Play 00:02:32.100 |
which again is an organization that I've personally seen 00:02:56.680 |
was mostly through tools that we built ourselves, 00:03:02.400 |
ZipRecruiter is a tool that's already available for you. 00:03:05.220 |
It seeks to make hiring simple, fast, and smart. 00:03:09.000 |
For example, Codable co-founder Gretchen Huebner 00:03:20.280 |
Gretchen found it easier to focus on the best candidates, 00:03:23.080 |
and finally hiring the perfect person for the role 00:03:36.660 |
as I did, for free at ziprecruiter.com/lexpod. 00:03:46.520 |
And now, here's my conversation with Whitney Cummings. 00:03:50.640 |
I have trouble making eye contact, as you can tell. 00:03:54.600 |
Did you know that I had to work on making eye contact 00:03:58.880 |
Do you see what I'm doing? - That helps, yeah. 00:04:01.800 |
Well, I'll do this way, I'll cheat the camera. 00:04:05.760 |
like I'd be on dates, and guys would be like, 00:04:10.900 |
because I didn't really get a lot of eye contact as a kid. 00:04:14.760 |
Did you not get a lot of eye contact as a kid? 00:04:20.400 |
- But there's definitely some psychological issues. 00:04:25.520 |
- Yeah, for some reason, when I connect eyes, 00:04:27.880 |
I start to think, I assume that you're judging me. 00:04:49.480 |
or another gender we have not yet created as a society? 00:04:55.240 |
- Well, I'm gonna ask you-- - You know the answer. 00:05:09.080 |
and just intellectual. - Can we just be very clear 00:05:31.080 |
And I just see such a tremendous negativity around robots, 00:05:37.000 |
that it was like, oh, I'm just gonna take the opposite side 00:05:43.400 |
oh no, I really agree with this devil's advocate argument. 00:05:45.920 |
So please correct me when I'm wrong about this stuff. 00:05:49.400 |
- So first of all, there's no right and wrong 00:05:51.760 |
because we're all, I think most of the people 00:05:54.880 |
working on robotics are really not actually even thinking 00:06:01.280 |
In fact, your robot, what's her name, by the way? 00:06:11.680 |
- Bear Claw was, I, God, I don't even remember the joke 00:06:16.640 |
but I was writing something about the pet names 00:06:19.160 |
that men call women, like Cupcake, Sweetie, Honey, 00:06:22.920 |
you know, like, we're always named after desserts 00:06:54.000 |
- I think it depends on their purpose, you know? 00:07:07.680 |
that bought sex robots, and I was very surprised 00:07:13.400 |
the stereotype was it's gonna be a bunch of perverts. 00:07:15.320 |
It ended up being a lot of people that were handicapped, 00:07:20.520 |
and a lot of people that were exploring their sexuality. 00:07:25.920 |
but weren't sure, but didn't wanna take the risk 00:07:31.660 |
and being embarrassed, or they were closeted, 00:07:42.340 |
for someone trying to explore their sexuality. 00:07:48.220 |
So I think gendered robots would be important for that, 00:07:52.460 |
in terms of emotional support robots, babysitters, 00:08:07.540 |
- What do you mean with your husband in the house? 00:08:09.920 |
What's that have to do with the gender of the robot? 00:08:21.700 |
I think, for genderless, you know, teachers, doctors, 00:08:32.580 |
You know, it's sort of, that stuff is so loaded. 00:08:49.560 |
- So there's two components to this, to Bearclaw. 00:08:52.840 |
So one is the voice and the talking, and so on, 00:09:04.920 |
So has it added much to the depth of the interaction? 00:09:09.000 |
- I mean, mine's kind of an extenuating circumstance, 00:09:11.240 |
'cause she's supposed to look exactly like me. 00:09:13.560 |
I mean, I spent six months getting my face molded, 00:09:18.160 |
I was exploring the concept of, can robots replace us? 00:09:30.840 |
and they're gonna replace us, legitimate fear, 00:09:34.360 |
I would love for a robot to replace me every now and then, 00:09:42.500 |
So I just think that was an interesting place to explore, 00:09:57.660 |
but I think that her looking human was the idea, 00:10:04.820 |
a human robot resembling an actual human, you know, 00:10:04.820 |
is going to feel more realistic than some generic face. 00:10:13.780 |
- Well, you're saying that robots that have some familiarity 00:10:18.140 |
like look similar to somebody that you actually know, 00:10:22.540 |
you'll be able to form a deeper connection with? 00:10:24.580 |
- That was the question? - I think so on some level. 00:10:26.020 |
- That's an open question, I don't, you know, 00:10:32.100 |
and you're like, well I know this isn't real, 00:10:33.380 |
'cause you're right here, so maybe it does the opposite. 00:10:56.020 |
- In general, 'cause I think that's where things get dicey, 00:11:05.180 |
and then you see her, and you go, oh, they're so similar, 00:11:07.620 |
but also because human faces are flawed and asymmetrical, 00:11:11.100 |
and sometimes we forget when we're making things 00:11:16.020 |
and that's what makes them stop looking human, 00:11:20.540 |
she just, even if someone didn't know who I was, 00:11:22.860 |
I think she'd look more realistic than most generic ones 00:11:30.740 |
when they're too symmetrical, 'cause human beings aren't. 00:11:33.260 |
- Yeah, the flaws are what it means to be human, 00:11:35.740 |
so visually as well, but I'm just a fan of the idea 00:11:39.420 |
of letting humans use a little bit more imagination, 00:11:43.300 |
so just hearing the voice is enough for us humans 00:11:47.580 |
to then start imagining the visual appearance 00:11:52.020 |
and you don't necessarily need to work too hard 00:12:00.620 |
of actually building a robot that looks like Bear Claw, 00:12:07.680 |
of sort of making everything smiling, winking, 00:12:16.540 |
- It gets tricky, and I think, again, I'm a comedian, 00:12:21.820 |
and our human nature, and the nasty side of human nature 00:12:28.060 |
over and over again, and I was just mostly fascinated 00:12:33.340 |
to get the biggest reaction from a group of strangers, 00:12:35.640 |
the loudest possible reaction, and I just had this instinct, 00:12:39.860 |
just when I started building her, and people going, 00:12:43.020 |
and people scream, and I mean, I would bring her out 00:12:45.620 |
on stage, and people would scream, and I just, 00:12:49.380 |
to me, that was the next level of entertainment, 00:12:51.540 |
getting a laugh, I've done that, I know how to do that, 00:12:53.540 |
I think comedians were always trying to figure out 00:12:54.940 |
what the next level is, and comedy's evolving so much, 00:13:00.540 |
comedy horror movies, which feel like the next level 00:13:03.060 |
of comedy to me, and this sort of funny horror 00:13:11.660 |
but I think the thing that I got the most obsessed with 00:13:15.520 |
was people being freaked out and scared of her, 00:13:18.220 |
and I started digging around with pathogen avoidance, 00:13:27.140 |
but is off a little bit, anything that could be sick, 00:13:47.900 |
and it was like, you know what this is, you know, 00:14:02.100 |
and what got us into trouble thousands of years ago. 00:14:10.100 |
are people freaked out by the fact that there's a robot? 00:14:17.860 |
- Anything people I think, and I'm just also fascinated 00:14:27.220 |
How many humans have died at the hands of other humans? 00:14:36.060 |
and be around a bunch of humans who statistically, 00:14:49.260 |
so it's, you can say the same thing about nuclear weapons 00:14:55.740 |
so the worry that people have is the exponential growth. 00:14:59.420 |
So it's like, oh, it's fun and games right now, 00:15:08.700 |
provides value to society, we'll put one in every home, 00:15:20.260 |
to where we'll all be, affect our political system 00:15:25.580 |
- Didn't robots already ruin our political system? 00:15:43.660 |
I think nobody understands our current political system, 00:15:49.700 |
Nobody has a sense, not Twitter, not Facebook, 00:15:53.420 |
nobody understands the impact of these algorithms. 00:16:00.260 |
trying to make sure that Hillary Clinton gets elected. 00:16:03.260 |
It's more that it's an incredibly complex system 00:16:11.460 |
that nobody will be able to stop it once it happens. 00:16:19.460 |
which is, is this just the next stage of evolution? 00:16:30.340 |
Is this basically the next stage of survival of the fittest? 00:16:37.780 |
You know, what does it mean to be a living organism? 00:16:41.380 |
Like, is a smartphone part of your living organism? 00:16:54.460 |
- But it also expands your cognitive abilities, 00:17:06.140 |
'cause we become sort of so morphed with them. 00:17:09.980 |
and that's what, so Elon Musk, with Neuralink, 00:17:12.580 |
is working on trying to increase the bandwidth 00:17:16.700 |
of communication between computers and your brain, 00:17:19.340 |
and so further and further expand our ability 00:17:22.860 |
as human beings to sort of leverage machines, 00:17:30.500 |
It could be also that, yes, we'll give birth, 00:17:33.940 |
just like we give birth to human children right now, 00:17:36.580 |
we'll give birth to AI and they'll replace us. 00:17:38.980 |
I think it's a really interesting possibility. 00:17:44.140 |
I just think that the fear of robots is wildly classist, 00:17:48.340 |
because, I mean, Facebook, like, it's easy for us to say, 00:17:52.020 |
Okay, a lot of people that get employment off of Facebook, 00:17:58.260 |
They don't care if you take their phone numbers 00:17:59.860 |
and their emails and their data, as long as it's free. 00:18:01.980 |
They don't wanna have to pay $5 a month for Facebook. 00:18:05.860 |
Forget about the election and all that kind of stuff. 00:18:08.300 |
A lot of technology making people's lives easier, 00:18:12.540 |
I find that most elite people are more scared 00:18:17.180 |
than lower-income people, and women, for the most part. 00:18:21.260 |
So the idea of something that's stronger than us 00:18:26.460 |
Like, that's not, I see a lot of really rich men 00:18:31.180 |
We're like, "What's another thing that's gonna kill us?" 00:18:33.700 |
I tend to see, like, "Oh, something can walk me 00:18:42.940 |
who can't afford eye surgery, like, in a robot, 00:18:45.260 |
can we send a robot to underprivileged places 00:18:50.580 |
I work with this organization called Operation Smile 00:18:56.900 |
a very simple surgery because they can't afford doctors 00:19:01.380 |
So I just see, and this can be completely naive 00:19:05.740 |
but I feel like a lot of people are going like, 00:19:15.860 |
Even when we go try to clean up pollution in the ocean, 00:19:18.140 |
we make it worse because of the oil that the tankers use. 00:19:21.620 |
It's like, to me, robots are the only solution. 00:19:36.140 |
more vulnerable populations talking about robots. 00:19:39.940 |
- Maybe you can speak to it a little bit more. 00:19:42.020 |
There's an idea, I think you've expressed it, 00:19:44.100 |
I've heard actually a few female writers and roboticists 00:19:49.100 |
have talked about, expressed this idea, exactly what you just said, 00:19:59.500 |
of existential threats of artificial intelligence 00:20:09.580 |
Because men in certain positions, like you said, 00:20:17.420 |
and so you always look for the biggest problems 00:20:22.380 |
- It's a champagne problem to be afraid of robots. 00:20:26.460 |
they're afraid they're not gonna be able to feed their kids, 00:20:32.380 |
and I had a mother who worked two jobs, had kids. 00:20:41.060 |
being able to provide childcare for your kids, 00:20:52.300 |
and I've been, we're generally in a little more danger 00:20:56.620 |
So I think that robots are a little less scary to us, 00:20:58.620 |
'cause we can see them maybe as free assistance, 00:21:03.420 |
And then there's sort of another element for me personally, 00:21:08.820 |
I don't know, I'm just gonna make a generalization. 00:21:13.100 |
But the emotional sort of component of robots 00:21:35.900 |
There's a lot of people who are scared of human beings, 00:21:42.140 |
or struggle with addiction, or whatever it is 00:21:44.380 |
where a robot can solve an emotional problem. 00:21:47.020 |
And so we're largely having this conversation 00:21:55.660 |
We're forgetting about a huge part of the population 00:22:01.660 |
and solvent as people like you and Elon Musk, 00:22:05.500 |
who these robots could solve very real problems 00:22:11.300 |
- Well, that's in general a really interesting idea 00:22:13.500 |
that most people in the world don't have a voice. 00:22:25.420 |
represent a very tiny percent of the population. 00:22:29.580 |
we tend to think they speak for the population, 00:22:33.340 |
but it's very possible on many topics they don't at all. 00:22:37.300 |
And look, and I'm sure there's gotta be some kind of 00:22:51.260 |
that I feel like is really inhumane in some way. 00:22:56.700 |
you're gonna tell me that a man whose wife died suddenly 00:23:00.100 |
and he feels guilty moving on with a human woman 00:23:09.980 |
- Well, there's an interesting aspect of human nature. 00:23:16.820 |
to create a group that's the other in all kinds of ways. 00:23:23.500 |
You're especially sensitive to the suffering of animals. 00:23:28.540 |
do you think we'll abuse robots in the future? 00:23:34.260 |
Do you think some of the darker aspects of human nature 00:23:49.740 |
- But why is that a sociopathic thing to harm a robot? 00:23:57.940 |
I guess it would have to be when they're conscious, 00:24:05.300 |
like people that abuse animals also abuse humans 00:24:09.460 |
Like that's, it's all the same part of the brain. 00:24:11.140 |
No one abuses animals and then is like awesome 00:24:14.060 |
to women and children and awesome to underprivileged, 00:24:20.540 |
we've been working really hard to put a database together 00:24:25.980 |
you go, okay, this is, you know, it's all the same stuff. 00:24:38.900 |
but the animal abuse is the most socially acceptable. 00:24:40.900 |
You can kick a dog and there's nothing people can do, 00:24:43.940 |
but then what they're doing behind closed doors, 00:24:48.900 |
which is why I never feel compunction about it. 00:24:50.720 |
But I do think we'll start seeing the same thing 00:24:55.540 |
I felt compassion when the kicking the dog robot 00:25:00.860 |
I know that they're just trying to get the stability right 00:25:07.380 |
where that will be a great way to be able to figure out 00:25:10.740 |
if somebody has like, you know, anti-social behaviors. 00:25:21.540 |
a lot of people seem to be really uncomfortable 00:25:25.180 |
- And you just said that, you know what, for me, 00:25:28.540 |
you know, there's positives for surveillance. 00:25:38.140 |
We behave better when we know we're being watched. 00:25:49.380 |
and I'm trying to not hurt anyone's feelings. 00:25:54.680 |
than when we first started talking, you know? 00:25:56.980 |
When you know there's a camera, you behave differently. 00:25:59.420 |
I mean, there's cameras all over LA at stoplights 00:26:05.860 |
They don't even use them anymore, but it works. 00:26:12.000 |
It's like, that's why we invented Santa Claus. 00:26:14.260 |
You know, Santa Claus is the first surveillance, basically. 00:26:17.820 |
All we had to say to kids is he's making a list 00:26:20.420 |
and he's watching you and they behave better. 00:26:23.780 |
- You know, so I do think that there are benefits 00:26:27.420 |
You know, I think we all do sketchy things in private 00:26:30.940 |
and we all have watched weird porn or Googled weird things 00:26:40.220 |
we should be able to have a modicum of privacy, 00:26:42.860 |
but I tend to think that people that are the most negative 00:26:50.540 |
Well, you should, you're saying you're doing bits on it now? 00:26:54.580 |
- Well, I'm just talking in general about, you know, 00:27:02.500 |
I mean, it's just wild to me that people are like, 00:27:07.220 |
Like there used to be a book full of phone numbers 00:27:11.460 |
and addresses that were, they just throw it at your door. 00:27:15.580 |
And we all had a book of everyone's numbers, you know, 00:27:20.380 |
And, you know, I know our amygdala is designed 00:27:25.380 |
there's stories about, and I think we all just glom on 00:27:33.740 |
but we're like, well, yeah, they, they, you know? 00:27:37.100 |
So I just think that sometimes it's like, okay, well, 00:27:46.240 |
So, and I say this sort of only a little bit joking 00:27:51.100 |
because a lot of people in China, including the citizens, 00:27:55.220 |
despite what people in the West think of as abuse, 00:27:59.660 |
are actually in support of the idea of surveillance. 00:28:02.580 |
Sort of, they're not in support of the abuse of surveillance 00:28:06.500 |
but they're, they like, I mean, the idea of surveillance 00:28:15.940 |
And in a way, it's almost like why we like sports. 00:28:18.520 |
There's rules and within the constraints of the rules, 00:28:34.540 |
When you have a surveillance, it keeps the criminals away, 00:28:37.060 |
keeps abusive animals, whatever the values of the society, 00:28:41.860 |
with surveillance, you can enforce those values better. 00:28:48.580 |
Like I feel the need to really make that very clear. 00:28:52.100 |
I mean, the fact that Google is like collecting 00:29:05.860 |
So the idea of using our data to raise our insurance rates 00:29:10.820 |
they can sort of predict if you're gonna have depression 00:29:13.300 |
based on your selfies by detecting micro muscles 00:29:16.100 |
in your face, you know, all that kind of stuff. 00:29:28.620 |
Focus on the fact that every time you buy cigarettes, 00:29:31.100 |
your insurance is going up without you knowing about it. 00:29:49.420 |
just the way we treat animals of different categories. 00:29:59.820 |
What should people think about it in general? 00:30:09.500 |
'cause I'm actually a very non-emotional animal person 00:30:11.860 |
because it's, I think everyone's an animal person. 00:30:15.880 |
or if you've, you know, been conditioned to go numb. 00:30:20.820 |
to what as a species we are able to be in denial about, 00:30:26.300 |
and how we're able to dehumanize and debase groups, 00:30:38.860 |
So we are also a species who used to go to coliseums 00:30:43.860 |
and watch elephants and tigers fight to the death. 00:30:47.540 |
We used to watch human beings be pulled apart 00:30:56.860 |
and it was socially acceptable by a lot of people. 00:31:12.620 |
"Oh yeah, kids were losing fingers in factories, 00:31:17.180 |
Like someone had to come in and make that, you know, 00:31:20.160 |
so I think it just says a lot about the fact that, 00:31:23.180 |
you know, we are animals and we are self-serving 00:31:33.140 |
and essentially exploit anything that benefits us. 00:31:36.840 |
I think the pendulum's gonna swing, as it has been lately. 00:31:47.240 |
Like we are just, I think we're all kind of addicts 00:31:54.480 |
Like we're, the thing that used to keep us alive, 00:32:01.220 |
So it's like we just can't, we don't know when to call it 00:32:04.220 |
and we don't, moderation is not really something 00:32:10.040 |
So I think it's really just a flaw in our wiring. 00:32:22.240 |
- Which are the worst parts right now going on? 00:32:32.020 |
but you just said we used to watch, you know, 00:32:36.180 |
there's been a lot of cruelty throughout history. 00:32:45.480 |
pigs are one of the most emotionally intelligent animals. 00:32:48.660 |
And they have the intelligence of like a three-year-old. 00:33:06.620 |
- Is this for just, what's it called, the factory farming? 00:33:10.300 |
So basically mass-- - 'Cause we don't see it. 00:33:14.520 |
I mean, this is probably an evolutionary advantage. 00:33:21.480 |
something that is so horrific that it overwhelms us. 00:33:24.000 |
And we're able to essentially deny that it's happening. 00:33:30.440 |
and also we're really to take in how bad it is for us, 00:33:35.320 |
you know, we're hurting ourselves first and foremost 00:33:38.400 |
But that's also a very elitist argument, you know? 00:33:41.240 |
It's a luxury to be able to complain about meat. 00:33:46.640 |
You know, there's very few people because of, you know, 00:33:49.960 |
how the corporations have set up meat being cheap. 00:34:00.000 |
don't have the luxury to even think that way. 00:34:08.760 |
I mean, that's already starting to dismantle. 00:34:10.160 |
Circuses, we're gonna be pretty embarrassed about. 00:34:15.720 |
to, you know, there's just such an ability to go like, 00:34:15.720 |
that thing is different than me and we're better. 00:34:29.160 |
- Well, that's what I think, that's my hope for robots 00:34:31.840 |
is they'll, you mentioned consciousness before. 00:35:16.840 |
'Cause I think there's-- - Compassion, for sure. 00:35:20.000 |
I mean, I was on the robot owners chat boards 00:35:38.740 |
and this is part of the work I do with animals. 00:35:40.880 |
Like I do equine therapy and train dogs and stuff 00:35:43.040 |
because there is this safe space to be authentic. 00:35:52.440 |
doesn't care if you have cellulite, whatever. 00:35:57.960 |
without being defensive and worrying about eye contact 00:36:15.320 |
- Do you think you can have a deep connection 00:36:22.040 |
Do you think you can really have a relationship 00:36:25.440 |
with a robot or a human being that's a safe space? 00:36:43.280 |
but for me, it's very hard to be in a relationship 00:36:47.600 |
I need to perform in some way or deliver in some way. 00:36:51.920 |
I've been in a relationship with, or me, or my brokenness, 00:36:56.520 |
but I do think this is gonna sound really negative 00:37:01.320 |
and pessimistic, but I do think a lot of our relationships 00:37:06.400 |
are a projection and a lot of our relationships 00:37:08.360 |
are performance, and I don't think I really understood that 00:37:15.280 |
And most communications with human is nonverbal, right? 00:37:18.080 |
I can say, "I love you," but you don't think I love you. 00:37:36.920 |
And you thinking that my tone is disrespectful, 00:37:51.000 |
the way that you intend them to or the way intended, 00:37:57.440 |
and relationship with the robots as they are now, 00:38:00.680 |
you kind of implied that that's more healthy. 00:38:05.240 |
Can you have a healthy relationship with other humans? 00:38:10.120 |
but shouldn't it be, you've talked about codependency. 00:38:14.440 |
Maybe you can talk about what is codependency, 00:38:21.640 |
the complexity of that necessary for passion, 00:38:42.680 |
I think that I've just maybe been around long enough 00:38:55.440 |
Adrenaline, which turns into dopamine, cortisol. 00:39:10.200 |
And I think we've been so conditioned to believe 00:39:40.360 |
that kind of attachment as somehow a bad thing. 00:40:01.920 |
and then I don't have to have it again for three. 00:40:13.760 |
So maybe we're operating under different definitions. 00:40:23.240 |
and becomes this thing that you're not in control of 00:40:27.360 |
and becomes addictive, I think that's the difference 00:40:47.840 |
we call it sort of the drug, the internal drug cabinet. 00:40:50.800 |
If it's all just you're on drugs, that's different, 00:40:53.000 |
you know, 'cause sometimes you're just on drugs. 00:40:55.040 |
- Okay, so there's a philosophical question here. 00:40:58.520 |
So would you rather, and it's interesting for a comedian, 00:41:19.760 |
that you sort of fly high and you crash and die at 27 00:41:28.000 |
because I think there's moments of greatness. 00:41:36.600 |
and the rest of their life is in a kind of fog 00:41:39.880 |
of almost of a depression because they can never-- 00:41:42.320 |
- Because it was based on their physical prowess, right? 00:41:46.440 |
so that, so they're watching their physical prowess fade, 00:41:50.260 |
and they'll never achieve the kind of height, 00:41:59.720 |
of neurochemicals, and you also put your money 00:42:06.520 |
oh, yeah, if you're doing a job where you peak at 22, 00:42:23.400 |
I've immortalized myself, so I'm gonna be fine. 00:42:26.920 |
- See, you're almost like, how many Oscar-winning movies 00:42:35.880 |
But there's, life is short, relatively speaking. 00:42:45.200 |
fall in love as much as you can, run into walls. 00:42:47.720 |
I would also go, life is short, don't deplete yourself 00:42:51.360 |
on things that aren't sustainable and that you can't keep. 00:42:56.640 |
- So I think everyone gets dopamine from different places, 00:43:01.840 |
I look at the fleeting, passionate relationships 00:43:04.640 |
I've had in the past, and I don't have pride in them. 00:43:13.600 |
with grace and integrity, that's just me personally. 00:43:16.160 |
Everyone can go like, yeah, I slept with all the hot chicks 00:43:19.760 |
in Italy, I could, and I did all the whatever, 00:43:29.800 |
- Brian Callan has lived his life to the fullest, 00:43:34.600 |
for me personally, I, and this could be like my workaholism 00:43:38.920 |
or my achievementism, if I don't have something 00:43:43.680 |
to show for something, I feel like it's a waste of time 00:43:50.360 |
I'm in a 12-step program, and the third step would say, 00:43:56.880 |
and whatever, that's a way to just sort of keep us sane 00:43:59.560 |
so we don't grieve too much and beat ourselves up 00:44:01.880 |
over past mistakes, there's no such thing as mistakes, 00:44:08.400 |
I think it's so life-affirming and one of the few things 00:44:11.760 |
that, maybe for people like us, makes us feel awake and seen 00:44:14.920 |
and we just have such a high threshold for adrenaline. 00:44:22.800 |
Yeah, okay, so yeah, so you have a very high tolerance 00:44:27.520 |
for adrenaline, and I think that Olympic athletes, 00:44:30.440 |
the amount of adrenaline they get from performing, 00:44:33.680 |
it's very hard to follow that, it's like when guys come back 00:44:38.160 |
it's like, do you miss bullets flying at you? 00:44:42.880 |
which turned into dopamine and the camaraderie, 00:44:45.160 |
I mean, there's people that speak much better 00:44:46.600 |
about this than I do, but I just, I'm obsessed 00:44:50.120 |
with neurology and I'm just obsessed with sort of 00:44:57.120 |
- You've done actually quite, done a lot of thinking 00:45:00.400 |
and talking about neurology, just kind of look 00:45:03.480 |
at human behavior through the lens of looking 00:45:06.960 |
at how our actually, chemically, our brain works. 00:45:09.160 |
So what, first of all, why did you connect with that idea 00:45:13.960 |
and what have you, how has your view of the world changed 00:45:22.480 |
- You know, I know it probably sounds really nihilistic, 00:45:24.640 |
but for me it's very liberating to know a lot 00:45:27.600 |
about neurochemicals because you don't have to, 00:45:32.560 |
like critical reviews, if you believe the good, 00:45:36.200 |
Like, you know, if you believe that your bad choices 00:45:39.000 |
were because of your moral integrity or whatever, 00:45:44.840 |
I just think there's something really liberating 00:45:46.400 |
and going like, oh, that was just adrenaline, 00:45:48.160 |
I just said that thing 'cause I was adrenalized 00:45:49.800 |
and I was scared and my amygdala was activated 00:45:52.160 |
and that's why I said you're an asshole and get out. 00:45:54.480 |
And that's, you know, I just think it's important 00:45:56.800 |
to delineate what's nature and what's nurture, 00:45:58.880 |
what is your choice and what is just your brain 00:46:02.120 |
I think we forget that even though we have security systems 00:46:04.640 |
and homes and locks on our doors, that our brain, 00:46:07.000 |
for the most part, is just trying to keep us safe 00:46:10.440 |
it's why we get angry, it's why we get road rage, 00:46:14.840 |
And it's also, when I started learning about neurology, 00:46:17.360 |
I started having so much more compassion for other people. 00:46:22.640 |
I'd be like, okay, he's producing adrenaline right now 00:46:33.360 |
He was scared, you know, so that really helped me 00:46:35.400 |
to have more love for people in my everyday life 00:46:41.080 |
But the, I think, more interesting answer to your question 00:46:45.800 |
like I've suffered with really intense migraines, 00:46:49.120 |
ocular migraines, ones where my arm would go numb, 00:46:52.440 |
and I just started having to go to so many doctors 00:46:58.400 |
learning that we don't really know that much. 00:47:08.720 |
- It's like one of the worst pains you can probably have, 00:47:10.760 |
all that stuff, and we don't know the source. 00:47:13.360 |
- We don't know the source, and there is something 00:47:18.640 |
starts going numb, and you start not being able to see 00:47:22.880 |
and I remember when the migraines get really bad, 00:47:33.120 |
So there's something just really fascinating to me 00:47:35.040 |
about your brain just being able to stop functioning, 00:47:38.320 |
and so I just wanted to learn about it, study about it. 00:47:41.680 |
I did all these weird alternative treatments. 00:47:43.400 |
I got this piercing in here that actually works. 00:47:45.920 |
I've tried everything, and then both my parents had strokes, 00:47:51.080 |
I became sort of the person who had to decide 00:47:56.720 |
which is just a wild thing to have to deal with, 00:48:02.160 |
and I started spending basically all day every day 00:48:05.120 |
in ICUs with neurologists learning about what happened 00:48:08.240 |
in my dad's brain, and why he can't move his left arm, 00:48:11.200 |
but he can move his right leg, but he can't see out of the, 00:48:18.120 |
So I started having to learn what parts of the brain 00:48:20.560 |
did what, and so that I wouldn't take their behavior 00:48:23.200 |
so personally, and so that I would be able to manage 00:48:27.440 |
So my mom, because it affected a lot of her frontal lobe, 00:48:33.120 |
She was way more emotional, she was way more micromanaged, 00:48:37.000 |
so it broke my heart less when I was able to know, 00:48:40.320 |
oh yeah, well, the stroke hit this part of the brain, 00:48:42.080 |
and that's the one that's responsible for short-term memory 00:48:44.240 |
and that's responsible for long-term memory, da-da-da, 00:48:53.280 |
So it was kind of wild that I was able to go, 00:48:56.320 |
oh, I know exactly what's happening here, and I know, 00:48:59.800 |
- So that allows you to have some more compassion 00:49:06.640 |
for some of the, from some of the more positive experiences 00:49:12.000 |
Sometimes, and I don't, I'm such a control addict 00:49:19.920 |
is to know why someone's doing, that's what stand-up is. 00:49:23.440 |
or that's what writing is, that's what acting is, 00:49:25.040 |
that's what performing is, it's trying to figure out 00:49:27.440 |
As an actor, you get a piece of, you know, material, 00:49:30.040 |
and you go, this person, why would he say that? 00:49:36.560 |
So I think neurology is, if you're trying to figure out 00:49:39.600 |
human motives and why people do what they do, 00:49:48.080 |
and hardcore drug addiction and mental illness, 00:50:00.640 |
that suffer from drug addiction and alcoholism, 00:50:02.840 |
and the first thing they started teaching you 00:50:49.440 |
and I've known people with clinical depression, 00:51:03.560 |
you've talked about issues that you struggle through, 00:51:07.400 |
and nevertheless you choose to take a journey 00:51:14.240 |
So how much can you do to help fight the limitations 00:51:21.800 |
and I don't think I'm at all qualified to answer, 00:51:25.560 |
And really quick, just the definition of codependency, 00:51:28.200 |
I think a lot of people think of codependency 00:51:29.880 |
as like two people that can't stop hanging out, 00:51:39.960 |
is the inability to tolerate the discomfort of others. 00:51:56.680 |
So codependence is a very active wiring issue 00:52:14.160 |
you know, it also is linked to eating disorders 00:52:20.240 |
And I think a lot of people sometimes only think 00:52:25.960 |
And also one of the reasons I love the idea of robots 00:52:30.880 |
you don't have to worry they're gonna get mad at you yet, 00:52:33.320 |
but there's no, codependents are hypersensitive 00:52:42.160 |
Just, well, one conversation about where we're gonna 00:52:44.720 |
go to dinner is like, do you wanna go get Chinese food? 00:52:50.160 |
Well, no, I didn't mean to, and it's just like, 00:52:56.640 |
and humans can be very emotionally exhausting. 00:53:07.920 |
which is why I love the idea of robots just subbing in. 00:53:10.760 |
Even, I've had a hard time running TV shows and stuff 00:53:16.520 |
I'm very concerned about what other people think of me, 00:53:18.680 |
how I'm perceived, which is why I think robots 00:53:25.600 |
that skill or flaw, whatever you wanna call it, 00:53:30.840 |
if you ever do start your own podcast for interviewing, 00:53:39.240 |
and it makes you a good sort of listener and talker with. 00:54:03.680 |
- I mean, you have to put yourself in the mind 00:54:10.200 |
and the reason I haven't is because I'm codependent 00:54:12.440 |
and I'm too worried it's not gonna be perfect. 00:54:20.320 |
- So how do you, sorry to take a million tangents, 00:54:29.960 |
- How much, yeah, we'll return it, or maybe not. 00:54:33.800 |
- Now as a codependent, I'm worried, okay, good. 00:54:36.240 |
We can, but one of the things that I'm fascinated by 00:54:44.200 |
is genetics loads the gun, environment pulls the trigger. 00:55:01.680 |
and a lot of things that maybe we don't even know yet 00:55:04.840 |
'cause of how little we actually know about the brain. 00:55:08.600 |
and there are some people that have that warrior spirit 00:55:12.080 |
and we don't necessarily know what that engine is, 00:55:15.280 |
whether it's you get dopamine from succeeding 00:55:25.720 |
"Oh, well, this person can edify themselves and overcome, 00:55:29.040 |
"but if you're getting attention from improving yourself, 00:55:38.600 |
If you talk about changing your brain to people 00:55:52.720 |
you go into a room and you talk about your progress 00:55:57.120 |
and then you're more motivated to keep going. 00:56:03.700 |
there's no one guiding you to go in a certain direction. 00:56:07.120 |
We're sort of designed to get approval from the tribe 00:56:15.440 |
- And in that case, the tribe is a positive one 00:56:19.520 |
- So that's why it's so important to go into a room 00:56:21.240 |
and also say, "Hey, I wanted to use drugs today." 00:56:27.720 |
You feel less alone and you feel less like you're, 00:56:30.040 |
you know, have been castigated from the pack or whatever. 00:56:34.200 |
you get a chip when you haven't drank for 30 days 00:56:38.600 |
- So talking about a pack that's not at all healthy or good, 00:56:46.240 |
So you're one of my favorite people on Twitter and Instagram 00:56:49.760 |
to sort of just follow, both for the comedy and the insight and just fun. 00:57:10.760 |
- Maybe, but you can do a lot of damage in a moderate way. 00:57:14.320 |
I mean, I guess, again, it depends on your goals, you know? 00:57:26.280 |
There are times I just reach over and I'm like, 00:57:31.320 |
I'll be driving sometimes and I'll be like, oh my God, 00:57:37.800 |
I can take time away from it, but when I do, I get antsy. 00:57:43.400 |
I mean, that's kind of the definition, isn't it? 00:57:48.680 |
do I have a healthy relationship with social media. 00:57:51.560 |
but I think I'm especially a weirdo in this space 00:58:01.960 |
- But I mean, don't you get the same kind of thing 00:58:04.160 |
as you get from when a room full of people laugh at your jokes 00:58:08.080 |
'cause I mean, I see, especially the way you do Twitter, 00:58:16.800 |
I took like six months off or something for a while 00:58:25.000 |
because there was like this weird, a lot of feedback. 00:58:28.280 |
So I had to take a break from it and then go back to it 00:58:30.720 |
'cause I felt like I didn't have a healthy relationship. 00:58:33.480 |
- Have you ever tried the, I don't know if I believe him, 00:58:39.440 |
Have you, and he's one of the only people at the scale, 00:58:42.540 |
like at your level, who at least claims not to read. 00:58:48.760 |
'Cause you and him swim in this space of tense ideas 00:59:01.160 |
I think he probably looks at YouTube, like the likes, 00:59:06.600 |
and I think if something's, if he doesn't know, 00:59:09.640 |
I don't know, I'm sure he would tell the truth. 00:59:30.600 |
I mean, look, I think that our brain is designed 00:59:45.480 |
"I gotta talk to 'em and get in their good graces." 00:59:47.520 |
It's just we're designed to rank ourselves, right? 00:59:51.800 |
And social media, because you can't figure out your rank 01:00:00.720 |
"What's my," and especially if we're following people. 01:00:03.000 |
I think the big, the interesting thing I think 01:00:19.040 |
to constantly look at people that are incredibly successful 01:00:25.320 |
I think that that is cutting to a certain extent. 01:00:28.640 |
Just like, look at me looking at all these people 01:00:37.600 |
but it's easy to frame it so that I can feel that way. 01:00:45.120 |
like if they're other comedians or something like that. 01:00:51.760 |
So other successful people that are really close 01:01:02.480 |
'cause you don't just admire your competitive 01:01:05.280 |
So it's also the same thing that bums you out 01:01:15.720 |
So I'm very sensitive and I sometimes don't like that 01:01:18.840 |
about myself, but I'm like, well, that's the reason 01:01:22.440 |
And that's the reason I'm able to be sensitive to feedback 01:01:34.840 |
being able to know when to let it be a superpower 01:01:38.360 |
and when it's gonna hold you back and be an obstacle. 01:01:47.680 |
oh, there's that thing that makes me really successful 01:02:06.080 |
what's the healthy path forward with social media. 01:02:24.080 |
I mean, you have to keep trying, exploring and thinking. 01:02:35.960 |
okay, I make a certain amount of money by doing this, 01:02:40.360 |
and say, I'm gonna pay you this amount to run this for me. 01:02:42.960 |
So I'm not 24/7 in the weeds, hashtagging and responding. 01:03:11.240 |
okay, here's what's working, here's what's not working. 01:03:13.440 |
And in terms of taking the break from Twitter, 01:03:17.680 |
but because I don't talk about my politics publicly, 01:03:21.760 |
being on Twitter right after the last election 01:03:27.960 |
because there was gonna be, you had to take a side. 01:03:31.600 |
in order to get any kind of retweets or likes. 01:03:46.680 |
And the robot, I joke about her replacing me, 01:03:54.360 |
'Cause I don't want people to get sick of me. 01:03:59.840 |
There are times when I don't have the time or the energy 01:04:03.360 |
but I know she's gonna be compelling and interesting 01:04:06.160 |
and that's something that you can't see every day. 01:04:13.400 |
the humor comes from you when you film the robot. 01:04:23.440 |
The absurdity is revealed through the filmmaker in that case 01:04:27.840 |
not through the actual robot being who she is. 01:04:48.080 |
but how difficult do you think it is to build an AI system 01:04:56.480 |
Sort of replace the human-to-human relationships 01:05:00.800 |
- I think anyone can fall in love with anything. 01:05:04.280 |
You know, like how often have you looked back at someone, 01:05:17.880 |
Like, you know, like where you're able to go like, 01:05:28.720 |
I can't believe we had an incredible connection 01:05:32.960 |
I do think that people will be in love with robots, 01:05:39.820 |
because it's like when people mourn their animals, 01:05:44.040 |
they're always, it's sometimes harder than mourning a human 01:05:47.760 |
because you can't go, "Well, he was kind of an asshole." 01:05:50.360 |
But like, "He didn't pick me up from school." 01:05:52.000 |
You know, it's like you're able to get out of your grief 01:05:59.440 |
With a robot, there's something so pure about, 01:06:02.080 |
and innocent, and impish, and childlike about it 01:06:05.560 |
that I think it probably will be much more conducive 01:06:30.680 |
they're building a nursing home for the bodies 01:06:37.960 |
'cause they have such an intense emotional connection to it. 01:06:40.880 |
I mean, it's kind of like a car club, a little bit. 01:06:47.400 |
I'm not saying it's cool, it's weird, it's creepy, 01:06:53.840 |
and we do develop emotional connections to things. 01:06:59.360 |
I can't even throw away my teddy bear from when I was a kid. 01:07:04.360 |
Like, it's just like, why can't I throw that away? 01:07:07.980 |
You know, and there's something kind of beautiful about that. 01:07:10.160 |
There's something, it gives me hope in humans 01:07:13.160 |
'cause I see humans do such horrific things all the time 01:07:15.760 |
and maybe I'm too, I see too much of it, frankly, 01:07:20.280 |
about the way we're able to have emotional connections 01:07:29.240 |
I mean, it's kind of specifically, I think, Western, right? 01:07:39.760 |
that we're objectifying humans with these sex robots. 01:07:50.080 |
So it's just a weird little place to play in. 01:07:54.980 |
a lot of people will be marrying these things is my guess. 01:08:01.900 |
You have a bit of a brilliant definition of love 01:08:16.480 |
I don't think it'll stick with me for a long time. 01:08:21.440 |
And my brain is like, you're gonna need to rerecord that. 01:08:33.680 |
- Actually, I can not be conscious of the fact 01:08:35.680 |
that I heard the plane and it made me feel like 01:08:38.280 |
how amazing it is that we live in a world with planes. 01:08:43.360 |
- And I just went, why haven't we fucking evolved 01:08:46.160 |
past planes and why can't they make them quieter? 01:08:56.760 |
- Consistently producing dopamine for a long time. 01:09:01.440 |
Consistent output of oxytocin with the same person. 01:09:22.000 |
but it also, if you don't get angry and upset, 01:09:24.040 |
it's, I don't know, I think that that's part of it. 01:09:26.880 |
I think we have this idea that love has to be really 01:09:34.160 |
so I don't have a judgment on how a relationship 01:09:42.360 |
that love has to be eternal is really destructive, 01:09:59.360 |
And I think that, I think for me, love is like 01:10:04.160 |
just being able to be authentic with somebody. 01:10:08.560 |
it's about not feeling pressure to have to perform 01:10:16.540 |
Although I do believe love should be conditional. 01:10:24.280 |
I think if someone's behavior, I don't think love 01:10:35.320 |
It's not something you just like get tenure on 01:10:55.360 |
that we don't wanna make explicit about love. 01:10:57.200 |
I don't know, maybe that's the wrong way to think of it. 01:10:59.160 |
Maybe you wanna be explicit in relationships. 01:11:17.440 |
Am I, you're not responding to me, you're not reliable. 01:11:24.280 |
is the kind of relationship where when we're not together, 01:11:26.920 |
it's not draining me, causing me stress, making me worry, 01:11:36.480 |
but I think it's also like I can be the best version 01:11:43.960 |
So it's like love, you know, for me, I think is, 01:11:47.040 |
I think it's a Flaubert quote and I'm gonna butcher it, 01:11:51.920 |
boring in your personal life so you can be violent 01:12:01.680 |
to where you can also thrive outside of the relationship. 01:12:06.200 |
are those sort of happily married and have kids and so on. 01:12:19.700 |
- I don't think relationships should drain you 01:12:37.600 |
- When you have a podcast, maybe you can invite me on. 01:12:46.400 |
- And because I also have codependency, I had to say yes. 01:12:58.280 |
if we could talk today, after sort of doing more research 01:13:01.720 |
and reading some of your book, I started to wonder, 01:13:14.920 |
so I actually don't do anything I don't wanna do. 01:13:17.640 |
- You really, you go out of your way to say no. 01:13:23.560 |
- I moved this, remember, I moved it from one to two. 01:13:27.720 |
- Yeah, just to let you know how recovered I am. 01:13:47.000 |
to sort of kickstart my codependence recovery. 01:13:50.480 |
and when you have someone close to you in your life die, 01:14:00.880 |
- What do you think is the meaning of it all? 01:14:08.080 |
what's the goal, the purpose of your existence? 01:14:21.680 |
- Yeah, it's we're all just managing our terror 01:14:29.460 |
and whatever we need to do to just distract ourselves 01:14:42.540 |
when my dad died and it resonated, it helped me, 01:14:46.580 |
or sense of purpose or thing that distracts them 01:15:03.520 |
with awards and achievements and games and whatever 01:15:11.960 |
from the terror we would feel if we really processed 01:15:14.600 |
the fact that we could not only, we are gonna die, 01:15:22.760 |
And technically we're at the top of the food chain 01:15:26.160 |
if we have houses and guns and stuff, machines, 01:15:29.320 |
but if me and a lion are in the woods together, 01:15:41.480 |
that we all would feel if we were able to really be awake. 01:15:50.400 |
football, relationship, dopamine, love, you know, 01:16:06.080 |
in saying that I can't wait to see what your terror creates 01:16:14.840 |
Whitney, thank you so much for talking today. 01:16:21.880 |
And thank you to our presenting sponsor, Cash App. 01:16:30.160 |
a STEM education nonprofit that inspires hundreds 01:16:38.000 |
If you enjoy this podcast, subscribe on YouTube, 01:16:42.720 |
support on Patreon, or connect with me on Twitter. 01:16:46.080 |
Thank you for listening and hope to see you next time.