Whitney Cummings: Comedy, Robotics, Neurology, and Love | Lex Fridman Podcast #55


Chapters

0:00 Introduction
51:26 Definition of Codependency
56:54 Prevent Social Media from Destroying Your Mental Health
57:19 Addiction to Social Media
68:00 What Is Love
74:39 Terror Management Theory

Transcript

The following is a conversation with Whitney Cummings. She's a stand-up comedian, actor, producer, writer, director, and recently, finally, the host of her very own podcast called Good For You. Her most recent Netflix special, called Can I Touch It?, features in part a robot she affectionately named Bearclaw that is designed to be visually a replica of Whitney.

It's exciting for me to see one of my favorite comedians explore the social aspects of robotics and AI in our society. She also has some fascinating ideas about human behavior, psychology, and neurology, some of which she explores in her book called I'm Fine and Other Lies. It was truly a pleasure to meet Whitney and have this conversation with her, and even to continue it through text afterwards.

Every once in a while, late at night, I'll be programming over a cup of coffee and will get a text from Whitney saying something hilarious. Or weirder yet, sending a video of Brian Callen saying something hilarious. That's when I know the universe has a sense of humor, and it gifted me with one hell of an amazing journey.

Then I put the phone down and go back to programming with a stupid, joyful smile on my face. If you enjoy this conversation, listen to Whitney's podcast, Good For You, and follow her on Twitter and Instagram. This is the Artificial Intelligence Podcast. If you enjoy it, subscribe on YouTube, give it five stars on Apple Podcasts, support on Patreon, or simply connect with me on Twitter, @lexfridman, spelled F-R-I-D-M-A-N.

This show is presented by Cash App, the number one finance app in the App Store. They regularly support Whitney's Good For You podcast as well. I personally use Cash App to send money to friends, but you can also use it to buy, sell, and deposit Bitcoin in just seconds.

Cash App also has a new investing feature. You can buy fractions of a stock, say $1 worth, no matter what the stock price is. Brokerage services are provided by Cash App Investing, a subsidiary of Square, and member SIPC. I'm excited to be working with Cash App to support one of my favorite organizations called FIRST, best known for their FIRST Robotics and LEGO competitions.

They educate and inspire hundreds of thousands of students in over 110 countries, and have a perfect rating on Charity Navigator, which means the donated money is used to maximum effectiveness. When you get Cash App from the App Store or Google Play and use code LEXPODCAST, you'll get $10, and Cash App will also donate $10 to FIRST, which again is an organization that I've personally seen inspire girls and boys to dream of engineering a better world.

This podcast is supported by ZipRecruiter. Hiring great people is hard, and to me is the most important element of a successful mission-driven team. I've been fortunate to be a part of and to lead several great engineering teams. The hiring I've done in the past was mostly through tools that we built ourselves, but reinventing the wheel was painful.

ZipRecruiter is a tool that's already available for you. It seeks to make hiring simple, fast, and smart. For example, Codable co-founder Gretchen Huebner used ZipRecruiter to find a new game artist to join her education tech company. By using ZipRecruiter screening questions to filter candidates, Gretchen found it easier to focus on the best candidates, and ultimately hired the perfect person for the role in less than two weeks from start to finish.

ZipRecruiter, the smartest way to hire. See why ZipRecruiter is effective for businesses of all sizes by signing up, as I did, for free at ziprecruiter.com/lexpod. That's ziprecruiter.com/lexpod. And now, here's my conversation with Whitney Cummings. I have trouble making eye contact, as you can tell. - Me too. Did you know that I had to work on making eye contact 'cause I used to look here?

Do you see what I'm doing? - That helps, yeah. - Do you want me to do that? Well, I'll do this way, I'll cheat the camera. But I used to do this, and finally people, like I'd be on dates, and guys would be like, "Are you looking at my hair?" It would make people really insecure because I didn't really get a lot of eye contact as a kid.

It's one to three years. Did you not get a lot of eye contact as a kid? - I don't know. I haven't done the soul searching. - Right. - But there's definitely some psychological issues. - Makes you uncomfortable. - Yeah, for some reason, when I connect eyes, I start to think, I assume that you're judging me.

- Oh, well, I am. That's why you assume that. We all are. - All right. - This is perfect. The podcast would be me and you both staring at the table the whole time. (both laughing) - Do you think robots of the future, ones with human level intelligence, will be female, male, genderless, or another gender we have not yet created as a society?

- You're the expert at this. - Well, I'm gonna ask you-- - You know the answer. - I'm gonna ask you questions that maybe nobody knows the answer to. - Okay. - And then I just want you to hypothesize as an imaginative author, director, comedian, and just intellectual. - Can we just be very clear that you know a ton about this and I know nothing about this, but I have thought a lot about what I think robots can fix in our society.

And I mean, I'm a comedian. It's my job to study human nature, to make jokes about human nature, and to sometimes play devil's advocate. And I just see such a tremendous negativity around robots, or at least the idea of robots, that it was like, oh, I'm just gonna take the opposite side for fun, for jokes, and then I was like, oh no, I really agree with this devil's advocate argument.

So please correct me when I'm wrong about this stuff. - So first of all, there's no right and wrong because we're all, I think most of the people working on robotics are really not actually even thinking about some of the big picture things that you've been exploring. In fact, your robot, what's her name, by the way?

- Bear Claw. - We'll go with Bear Claw. (laughing) What's the genesis of that name, by the way? - Bear Claw was, I, God, I don't even remember the joke 'cause I black out after I shoot specials, but I was writing something about the pet names that men call women, like Cupcake, Sweetie, Honey, you know, like, we're always named after desserts or something, and I was just writing a joke about if you wanna call us a dessert, at least pick a cool dessert, you know, like Bear Claw, like something cool.

So I ended up calling her Bear Claw. - And it stuck. So do you think, for future robots of greater and greater intelligence, would we like to make them female, male? Would we like to assign them gender? Or would we like to move away from gender and say something more ambiguous?

- I think it depends on their purpose, you know? I feel like if it's a sex robot, people prefer certain genders, you know? And I also, you know, when I went down and explored the robot factory, I was asking about the type of people that bought sex robots, and I was very surprised at the answer, because of course, the stereotype was it's gonna be a bunch of perverts.

It ended up being a lot of people that were handicapped, a lot of people with erectile dysfunction, and a lot of people that were exploring their sexuality. A lot of people that thought they were gay, but weren't sure, but didn't wanna take the risk of trying on someone that could reject them and being embarrassed, or they were closeted, or in a city where maybe that's, you know, taboo and stigmatized, you know?

So I think that a gendered sex robot, that would serve an important purpose for someone trying to explore their sexuality. Am I into men? Let me try on this thing first. Am I into women? Let me try on this thing first. So I think gendered robots would be important for that, but I think genderless robots, in terms of emotional support robots, babysitters, I'm fine for a genderless babysitter with my husband in the house.

You know, there are places that I think that genderless makes a lot of sense, but obviously not in the sex area. - What do you mean with your husband in the house? What's that have to do with the gender of the robot? - Right, I mean, I don't have a husband, but hypothetically speaking, I think every woman's worst nightmare is like the hot babysitter.

(laughing) You know what I mean? So I think that there is a time and place, I think, for genderless, you know, teachers, doctors, all that kind of, it would be very awkward if the first robotic doctor was a guy, or the first robotic nurse was a woman. You know, it's sort of, that stuff is so loaded.

I think that genderless could just take the unnecessary drama out of it, and the possibility of sexualizing them, or being triggered by any of that stuff. - So there's two components to this, to Bearclaw. So one is the voice and the talking, and so on, and then there's the visual appearance.

So on the topic of gender and genderless, in your experience, what has been the value of the physical appearance? So has it added much to the depth of the interaction? - I mean, mine's kind of an extenuating circumstance, 'cause she's supposed to look exactly like me. I mean, I spent six months getting my face molded, and having, you know, the idea was, I was exploring the concept of, can robots replace us?

Because that's the big fear, but also the big dream in a lot of ways, and I wanted to dig into that area, because, you know, for a lot of people, it's like, they're gonna take our jobs, and they're gonna replace us, legitimate fear, but then a lot of women I know are like, I would love for a robot to replace me every now and then, so it can go to baby showers for me, and it can pick up my kids at school, and it can cook dinner, and whatever.

So I just think that was an interesting place to explore, so her looking like me was a big part of it. Now her looking like me just adds an unnecessary level of insecurity, 'cause I got her a year ago, and she already looks younger than me, so that's a weird problem, but I think that her looking human was the idea, and I think that where we are now, please correct me if I'm wrong, a human robot resembling an actual human you know is going to feel more realistic than some generic face.

- Well, you're saying that robots that have some familiarity like look similar to somebody that you actually know, you'll be able to form a deeper connection with? - That was the question? - I think so on some level. - That's an open question, I don't, you know, it's an interesting-- - Or the opposite, 'cause then you know me, and you're like, well I know this isn't real, 'cause you're right here, so maybe it does the opposite.

- We have a very keen eye for human faces, and we're able to detect strangeness, especially when it has to do with people whose faces we've seen a lot of, so I tend to be a bigger fan of moving away completely from faces. - Of recognizable faces? - No, just human faces at all.

- In general, 'cause I think that's where things get dicey, and one thing I will say is I think my robot is more realistic than other robots, not necessarily because you have seen me, and then you see her, and you go, oh, they're so similar, but also because human faces are flawed and asymmetrical, and sometimes we forget when we're making things that are supposed to look human, we make them too symmetrical, and that's what makes them stop looking human, so because they molded my asymmetrical face, she just, even if someone didn't know who I was, I think she'd look more realistic than most generic ones that didn't have some kind of flaws.

- Got it, yeah. - 'Cause they start looking creepy when they're too symmetrical, 'cause human beings aren't. - Yeah, the flaws are what it means to be human, so visually as well, but I'm just a fan of the idea of letting humans use a little bit more imagination, so just hearing the voice is enough for us humans to then start imagining the visual appearance that goes along with that voice, and you don't necessarily need to work too hard on creating the actual visual appearance, so there's some value to that.

When you step into this territory of actually building a robot that looks like Bear Claw, it's such a long road of facial expressions, of sort of making everything smiling, winking, rolling of the eyes, all that kind of stuff, it gets really, really tricky. - It gets tricky, and I think, again, I'm a comedian, like I'm obsessed with what makes us human, and our human nature, and the nasty side of human nature tends to be where I've ended up exploring over and over again. And I was just mostly fascinated by people's reactions. It's my job to get the biggest reaction from a group of strangers, the loudest possible reaction, and I just had this instinct, just when I started building her, people going, (gasps) and people screaming. I mean, I would bring her out on stage, and people would scream, and to me, that was the next level of entertainment. Getting a laugh, I've done that, I know how to do that. I think comedians are always trying to figure out what the next level is, and comedy's evolving so much, and Jordan Peele had just done these genius comedy horror movies, which feel like the next level of comedy to me, and this sort of funny horror of a robot was fascinating to me. But I think the thing that I got the most obsessed with was people being freaked out and scared of her. I started digging around with pathogen avoidance, and the idea that we've essentially evolved to be repelled by anything that looks human but is off a little bit, anything that could be sick, or diseased, or dead. Essentially, it's our reptilian brain's way to get us to not try to have sex with it, basically. So I got really fascinated by how freaked out and scared people got. I mean, I would see grown men get upset, they're like, get that thing away from me, people get angry, and it was like, you know what this is. But the sort of amygdala getting activated by something that to me is just a fun toy said a lot about our history as a species, and what got us into trouble thousands of years ago.

- So it's that, it's the deep down stuff that's in our genetics, but also is it just, are people freaked out by the fact that there's a robot? It's not just the appearance, but that there's an artificial human. - Anything people I think, and I'm just also fascinated by the blind spots humans have, so the idea that you're afraid of that, I mean, how many robots have killed people?

How many humans have died at the hands of other humans? - Yeah, a few more. - Millions? Hundreds of millions? Yet we're scared of that? And we'll go to the grocery store and be around a bunch of humans who statistically, the chances are much higher that you're gonna get killed by humans, so I'm just fascinated by, without judgment, how irrational we are as a species.

- The worry is the exponential. You can say the same thing about nuclear weapons before we dropped them on Hiroshima and Nagasaki, so the worry that people have is the exponential growth. So it's like, oh, it's fun and games right now, but overnight, especially if a robot provides value to society, we'll put one in every home, and then all of a sudden lose track of the actual large-scale impact it has on society, and then all of a sudden they gain greater and greater control to where they affect our political system and then affect our decisions.

- Didn't robots already ruin our political system? Didn't that just already happen? - Which ones? Oh, Russia hacking. - No offense. But hasn't that already happened? I mean, that was like an algorithm of negative things being clicked on more. - We like to tell stories and we like to demonize certain people.

I think nobody understands our current political system, our discourse on Twitter, the Twitter mobs. Nobody has a sense, not Twitter, not Facebook, the people running it, nobody understands the impact of these algorithms. They're trying their best. Despite what people think, they're not like a bunch of lefties trying to make sure that Hillary Clinton gets elected.

It's more that it's an incredibly complex system that we don't, and that's the worry. It's so complex and moves so fast that nobody will be able to stop it once it happens. - And let me ask a question. This is a very savage question, which is, is this just the next stage of evolution?

As humans, some people will die. Yes, I mean, that's always happened. This is just taking emotion out of it. Is this basically the next stage of survival of the fittest? - Yeah, you have to think of organisms. You know, what does it mean to be a living organism? Like, is a smartphone part of your living organism?

- We're in relationships with our phones. - Yeah, but-- - We have sex through them, with them, what's the difference between with them and through them? - But it also expands your cognitive abilities, expands your memory, knowledge, and so on, so you're a much smarter person because you have a smartphone in your hand.

- But as soon as it's out of my hand, we've got big problems, 'cause we become sort of so morphed with them. - Well, there's a symbiotic relationship, and that's what, so Elon Musk, the neural link, is working on trying to increase the bandwidth of communication between computers and your brain, and so further and further expand our ability as human beings to sort of leverage machines, and maybe that's the future, the evolution, next evolutionary step.

It could be also that, yes, we'll give birth, just like we give birth to human children right now, we'll give birth to AI and they'll replace us. I think it's a really interesting possibility. - I'm gonna play devil's advocate. I just think that the fear of robots is wildly classist, because, I mean, Facebook, like, it's easy for us to say, they're taking their data.

Okay, a lot of people that get employment off of Facebook, they are able to get income off of Facebook. They don't care if you take their phone numbers and their emails and their data, as long as it's free. They don't wanna have to pay $5 a month for Facebook.

Facebook is a wildly democratic thing. Forget about the election and all that kind of stuff. A lot of technology making people's lives easier, I find that most elite people are more scared than lower-income people, and women, for the most part. So the idea of something that's stronger than us and that might eventually kill us, women are used to that.

Like, that's not, I see a lot of really rich men being like, "The robots are gonna kill us." We're like, "What's another thing that's gonna kill us?" I tend to see, like, "Oh, something can walk me to my car at night. Something can help me cook dinner." For people in underprivileged countries who can't afford eye surgery, can we send a robot to underprivileged places to do surgery where they can't?

I work with this organization called Operation Smile where they do cleft palate surgeries, and there's a lot of places that can't do a very simple surgery because they can't afford doctors and medical care and such. So I just see, and this can be completely naive and should be completely wrong, but I feel like a lot of people are going like, "The robots are gonna destroy us." Humans, we're destroying ourselves.

We're self-destructing. Robots, to me, are the only hope to clean up all the messes that we've created. Even when we go try to clean up pollution in the ocean, we make it worse because of the oil that the tankers use. It's like, to me, robots are the only solution.

Firefighters are heroes, but they're limited in how many times they can run into a fire. So there's just something interesting to me. I'm not hearing a lot of lower-income, more vulnerable populations talking about robots. - Maybe you can speak to it a little bit more. There's an idea, I think you've expressed it, and I've actually heard a few female writers and roboticists express this idea, exactly what you just said, which is, it just seems that being afraid of existential threats of artificial intelligence is a male issue.

- Yeah. - And I wonder what that is. Because men in certain positions, like you said, it's also a classist issue. They haven't been humbled by life, and so you always look for the biggest problems to take on around you. - It's a champagne problem to be afraid of robots.

Most people don't have health insurance, they're afraid they're not gonna be able to feed their kids, they can't afford a tutor for their kids. I mean, I just think of the way I grew up, and I had a mother who worked two jobs, had kids. We couldn't afford an SAT tutor.

The idea of a robot coming in, being able to tutor your kids, being able to provide childcare for your kids, being able to come in with cameras for eyes and provide surveillance. I'm very pro-surveillance, because I've had security problems, and we're generally in a little more danger than you guys are.

So I think that robots are a little less scary to us, 'cause we can see them maybe as free assistance, help, and protection. And then there's sort of another element for me personally, which is maybe more of a female problem. I don't know, I'm just gonna make a generalization.

Happy to be wrong. But the emotional sort of component of robots and what they can provide in terms of, I think there's a lot of people that don't have microphones that I just recently kind of stumbled upon in doing all my research on the sex robots for my standup special, which is there's a lot of very shy people that aren't good at dating.

There's a lot of people who are scared of human beings, who have personality disorders, or grew up in alcoholic homes, or struggle with addiction, or whatever it is where a robot can solve an emotional problem. And so we're largely having this conversation about rich guys that are emotionally healthy and how scared of robots they are.

We're forgetting about a huge part of the population who maybe isn't as charming and effervescent and solvent as people like you and Elon Musk, who these robots could solve very real problems in their life, emotional or financial. - Well, that's in general a really interesting idea that most people in the world don't have a voice.

You've talked about it, even the people on Twitter who are driving the conversation. You said comments, people who leave comments represent a very tiny percent of the population. And they're the ones, we tend to think they speak for the population, but it's very possible on many topics they don't at all.

And look, and I'm sure there's gotta be some kind of legal sort of structure in place for when the robots happen. You know way more about this than I do, but for me to just go, the robots are bad, that's a wild generalization that I feel like is really inhumane in some way.

Just after the research I've done, you're gonna tell me that a man whose wife died suddenly and he feels guilty moving on with a human woman or can't get over the grief, he can't have a sex robot in his own house? Why not? Who cares? Why do you care?

- Well, there's an interesting aspect of human nature. So, you know, we tend, as a civilization, to create a group that's the other in all kinds of ways. - Right. - And so you work with animals too. You're especially sensitive to the suffering of animals. Let me kind of ask, what's your, do you think we'll abuse robots in the future?

Do you think some of the darker aspects of human nature will come out? - I think some people will, but if we design them properly, the people that do it, we can put it on a record and they can, we can put them in jail. We can find sociopaths more easily.

You know, like-- - But why is that a sociopathic thing to harm a robot? - I think, look, I don't know enough about the consciousness and stuff as you do. I guess it would have to be when they're conscious, but it is, you know, the part of the brain that is responsible for compassion, the frontal lobe or whatever, like people that abuse animals also abuse humans and commit other kinds of crimes.

Like that's, it's all the same part of the brain. No one abuses animals and then is like awesome to women and children and awesome to underprivileged, you know, minorities. Like it's all, so, you know, we've been working really hard to put a database together of all the people that have abused animals.

So when they commit another crime, you go, okay, this is, you know, it's all the same stuff. And I think people probably think I'm nuts for the, a lot of the animal work I do, but because when animal abuse is present, another crime is always present, but the animal abuse is the most socially acceptable.

You can kick a dog and there's nothing people can do, but then what they're doing behind closed doors, you can't see. So there's always something else going on, which is why I never feel compunction about it. But I do think we'll start seeing the same thing with robots. The person that kicks the robot, I felt compassion. The kicking of the dog robot really pissed me off.

I know that they're just trying to get the stability right and all that, but I do think there will come a time where that will be a great way to be able to figure out if somebody has like, you know, anti-social behaviors. - You kind of mentioned surveillance. It's also a really interesting idea of yours that you just said, you know, a lot of people seem to be really uncomfortable with surveillance.

- Yeah. - And you just said that, you know what, for me, you know, there's positives for surveillance. - I think people behave better when they know they're being watched. And I know this is a very unpopular opinion. I'm talking about it on stage right now. We behave better when we know we're being watched.

You and I had a very different conversation before we were recording. (laughing) We behave differently. You sit up and you are on your best behavior and I'm trying to sound eloquent and I'm trying to not hurt anyone's feelings. And I mean, I have a camera right there. I'm behaving totally different than when we first started talking, you know?

When you know there's a camera, you behave differently. I mean, there's cameras all over LA at stoplights so that people don't run stoplights, but there's not even film in them. They don't even use them anymore, but it works. - It works. - Right? And I'm, you know, working on this thing in stand-up about surveillance.

It's like, that's why we invented Santa Claus. You know, Santa Claus is the first surveillance, basically. All we had to say to kids is he's making a list and he's watching you and they behave better. - That's brilliant. - You know, so I do think that there are benefits to surveillance.

You know, I think we all do sketchy things in private and we all have watched weird porn or Googled weird things and we don't want people to know about it, our secret lives. So I do think that obviously, we should be able to have a modicum of privacy, but I tend to think that people that are the most negative about surveillance have the most secrets.

- The most to hide. (laughing) Well, you should, you're saying you're doing bits on it now?

Like there used to be a book full of phone numbers and addresses that they just threw at your door. And we all had a book of everyone's numbers, you know, this is a very new thing. And, you know, I know our amygdala is designed to compound sort of threats and, you know, there's stories about, and I think we all just glom on in a very, you know, tribal way of, yeah, they're taking our data.

Like, we don't even know what that means, but we're like, well, yeah, they, they, you know? So I just think that sometimes it's like, okay, well, so what, they're gonna sell your data? Who cares? Why do you care? - First of all, that bit will kill in China. So, and I say this sort of only a little bit joking because a lot of people in China, including the citizens, despite what people in the West think of as abuse, are actually in support of the idea of surveillance.

Sort of, they're not in support of the abuse of surveillance but they're, they like, I mean, the idea of surveillance is kind of like the idea of government. Like you said, we behave differently. And in a way, it's almost like why we like sports. There's rules and within the constraints of the rules, this is a more stable society.

And they make good arguments about success, being able to build successful companies, being able to build successful social lives around a fabric that's more stable. When you have surveillance, it keeps the criminals away, keeps people from abusing animals, whatever the values of the society are, with surveillance, you can enforce those values better.

- And here's what I will say. There's a lot of unethical things happening with surveillance. Like I feel the need to really make that very clear. I mean, the fact that Google is like collecting if people's hands start moving on the mouse to find out if they're getting Parkinson's and then their insurance goes up, like that is completely unethical and wrong.

And I think stuff like that, we have to really be careful around. So the idea of using our data to raise our insurance rates or, you know, I heard that they're looking, they can sort of predict if you're gonna have depression based on your selfies by detecting micro muscles in your face, you know, all that kind of stuff.

That is a nightmare, not okay. But I think, you know, we have to delineate what's a real threat and what's getting spam in your email box, that's not what to spend your time and energy on. Focus on the fact that every time you buy cigarettes, your insurance is going up without you knowing about it.

- On the topic of animals too, can we just linger on it a little bit? Like, what do you think, what does it say about our society, the society-wide abuse of animals that we see in general? Sort of factory farming, just in general, just the way we treat animals of different categories.

Like, what do you think of that? What does a better world look like? What should people think about it in general? - I think the most interesting thing I can probably say around this, that's the least emotional, 'cause I'm actually a very non-emotional animal person because it's, I think everyone's an animal person.

It's just a matter of if it's yours or if you've, you know, been conditioned to go numb. You know, I think it's really a testament to what as a species we are able to be in denial about, mass denial and mass delusion, and how we're able to dehumanize and debase groups, you know, World War II, in a way in order to conform and find protection in the conforming.

So we are also a species who used to go to coliseums and watch elephants and tigers fight to the death. We used to watch human beings be pulled apart in the, that wasn't that long ago. We're also a species who had slaves, and it was socially acceptable by a lot of people.

People didn't see anything wrong with it. So we're a species that is able to go numb and that is able to dehumanize very quickly and make it the norm. Child labor wasn't that long ago. Like the idea that now we look back and go, "Oh yeah, kids were losing fingers in factories, making shoes." Like someone had to come in and make that, you know, so I think it just says a lot about the fact that, you know, we are animals and we are self-serving and one of the most successful, the most successful species because we are able to debase and degrade and essentially exploit anything that benefits us.

I think the pendulum's gonna swing, as it has been lately. - Which way? - Like I think we're Rome now, kind of. Like I think we're on the verge of collapse because of our dopamine receptors. Like we are just, I think we're all kind of addicts when it comes to this stuff.

Like we don't know when to stop. It's always the buffet. Like we're, the thing that used to keep us alive, which is killing animals and eating them, now killing animals and eating them is what's killing us in a way. So it's like we just can't, we don't know when to call it and we don't, moderation is not really something that humans have evolved to have yet.

So I think it's really just a flaw in our wiring. - Do you think we'll look back at this time as our society's being deeply unethical? - Yeah, yeah. I think we'll be embarrassed. - Which are the worst parts right now going on? Is it-- - In terms of animal?

Well, I think-- - No, in terms of anything. What's the unethical thing? It's very hard to take a step out of it, but you just said we used to watch, you know, there's been a lot of cruelty throughout history. What's the cruelty going on now? - I think it's gonna be pigs.

I think it's gonna be, I mean, pigs are one of the most emotionally intelligent animals. And they have the intelligence of like a three-year-old. And I think we'll look back and be really, they use tools. I mean, I think we have this narrative that they're pigs, they're disgusting, they're dirty, and their bacon is so good.

I think that we'll look back one day and be really embarrassed about that. - Is this for just, what's it called, the factory farming? So basically mass-- - 'Cause we don't see it. If you saw, I mean, we do have, I mean, this is probably an evolutionary advantage. We do have the ability to completely pretend something's not, something that is so horrific that it overwhelms us.

And we're able to essentially deny that it's happening. I think if people were to see what goes on in factory farming, and also we're really to take in how bad it is for us, you know, we're hurting ourselves first and foremost with what we eat. But that's also a very elitist argument, you know?

It's a luxury to be able to complain about meat. It's a luxury to be able to not eat meat. You know, there's very few people because of, you know, how the corporations have set up meat being cheap. You know, it's $2 to buy a Big Mac, it's $10 to buy a healthy meal.

You know, that's, I think a lot of people don't have the luxury to even think that way. But I do think that animals in captivity, I think we're gonna look back and be pretty grossed out about. Mammals in captivity, whales, dolphins. I mean, that's already starting to dismantle. Circuses, we're gonna be pretty embarrassed about.

But I think it's really more a testament to, you know, there's just such an ability to go like, that thing is different than me and we're better. It's the ego. I mean, it's just, we have the species with the biggest ego, ultimately. - Well, that's what I think, that's my hope for robots is they'll, you mentioned consciousness before.

Nobody knows what consciousness is. But I'm hoping robots will help us empathize and understand that there are other creatures out there besides ourselves that can suffer, that can experience the world and that we can torture by our actions. And robots can explicitly teach us that, I think, better than animals can.

- I have never seen such compassion from a lot of people in my life toward any human, animal, child, as I have from a lot of people in the way they interact with the robot. 'Cause I think there's-- - Compassion, for sure. - I think there's something of, I mean, I was on the robot owners chat boards for a good eight months.

And the main emotional benefit is she's never gonna cheat on you. She's never gonna hurt you. She's never gonna lie to you. She doesn't judge you. You know, I think that robots help people, and this is part of the work I do with animals. Like I do equine therapy and train dogs and stuff because there is this safe space to be authentic.

You're with this being that doesn't care what you do for a living, doesn't care how much money you have, doesn't care who you're dating, doesn't care what you look like, doesn't care if you have cellulite, whatever. You feel safe to be able to truly be present without being defensive and worrying about eye contact and being triggered by needing to be perfect and fear of judgment and all that.

And robots really can't judge you yet, but they can't judge you. But I think it really puts people at ease and at their most authentic. - Do you think you can have a deep connection with a robot that's not judging? Do you think you can really have a relationship with a robot or a human being that's a safe space?

Or is attention, mystery, danger necessary for a deep connection? - I'm gonna speak for myself and say that I grew up in Alcoa, Colombo. I identify as a codependent, talked about this stuff before, but for me, it's very hard to be in a relationship with a human being without feeling like I need to perform in some way or deliver in some way.

And I don't know if that's just the people I've been in a relationship with or me or my brokenness, but I do think this is gonna sound really negative and pessimistic, but I do think a lot of our relationships are a projection and a lot of our relationships are performance, and I don't think I really understood that until I worked with horses.

And most communication with humans is nonverbal, right? I can say, "I love you," but you don't think I love you. Whereas with animals, it's very direct. It's all physical, it's all energy. I feel like that with robots, too. It feels very... How I say something doesn't matter. My inflection doesn't really matter.

And you thinking that my tone is disrespectful, like you're not filtering it through all of the bad relationships you've been in. You're not filtering it through the way your mom talked to you. You're not getting triggered. I find that for the most part, people don't always receive things the way that you intend them to or the way intended, and that makes relationships really murky.

- So the relationships with animals and relationships with robots as they are now, you kind of implied that that's more healthy. Can you have a healthy relationship with other humans? Or not healthy, I don't like that word, but shouldn't it be, you've talked about codependency. Maybe you can talk about what is codependency, but are the challenges of that, the complexity of that, necessary for passion, for love between humans?

- That's right, you love passion. (laughing) That's a good thing. - I thought this would be a safe space. (laughing) I got trolled by Rogan for hours on this. - Look, I am not anti-passion. I think that I've just maybe been around long enough to know that sometimes it's ephemeral and that passion is a mixture of a lot of different things.

Adrenaline, which turns into dopamine, cortisol. It's a lot of neurochemicals. It's a lot of projection. It's a lot of what we've seen in movies. It's a lot of, I identify as an addict. So for me, sometimes passion is like, uh-oh, this could be bad. And I think we've been so conditioned to believe that passion means you're soulmates.

And I mean, how many times have you had a passionate connection with someone and then it was a total train wreck? - The train wreck is interesting. - How many times exactly? - Exactly. What's a train wreck? - You just did a lot of math in your head in that little moment.

- Counting. I mean, what's a train wreck? What's a, why is obsession, so you describe this codependency and sort of the idea of attachment, over attachment to people who don't deserve that kind of attachment as somehow a bad thing. I think our society says it's a bad thing. It probably is a bad thing.

Like a delicious burger is a bad thing. I don't know. - Right, oh, that's a good point. I think that you're pointing out something really fascinating, which is like passion, if you go into it knowing this is like pizza where it's gonna be delicious for two hours and then I don't have to have it again for three.

If you can have a choice in the passion, I define passion as something that is relatively unmanageable and something you can't control or stop and start with your own volition. So maybe we're operating under different definitions. If passion is something that like, you know, ruins your marriages and screws up your professional life and becomes this thing that you're not in control of and becomes addictive, I think that's the difference is is it a choice or is it not a choice?

And if it is a choice, then passion's great. But if it's something that like consumes you and makes you start making bad decisions and clouds your frontal lobe and is just all about dopamine and not really about the person and more about the neurochemical, we call it sort of the drug, the internal drug cabinet.

If it's all just you're on drugs, that's different, you know, 'cause sometimes you're just on drugs. - Okay, so there's a philosophical question here. So would you rather, and it's interesting for a comedian, brilliant comedian to speak so eloquently about a balanced life. I kind of argue against this point.

There's such an obsession of creating this healthy lifestyle now, psychologically speaking. You know, I'm a fan of the idea that you sort of fly high and you crash and die at 27 as also a possible life, and it's not one we should judge because I think there's moments of greatness.

I've talked to Olympic athletes where some of their greatest moments are achieved in their early 20s, and the rest of their life is in a kind of fog, almost a depression, because they can never-- - Because it was based on their physical prowess, right? - Physical prowess, and they'll never, so that, so they're watching their physical prowess fade, and they'll never achieve the kind of height, not just physical, of just emotion, of-- - Well, the max number of neurochemicals, and you also put your money on the wrong horse.

That's where I would just go like, oh, yeah, if you're doing a job where you peak at 22, the rest of your life is gonna be hard. - That idea is considering the notion that you wanna optimize some kind of, but we're all gonna die soon. - What? (laughing) Now you tell me.

I've immortalized myself, so I'm gonna be fine. - See, you're almost like, how many Oscar-winning movies can I direct by the time I'm 100? How many this and that? But there's, life is short, relatively speaking. - I know, but it can also come in different, you go, life is short, play hard, fall in love as much as you can, run into walls.

I would also go, life is short, don't deplete yourself on things that aren't sustainable and that you can't keep. - Yeah. - So I think everyone gets dopamine from different places, everyone has meaning from different places. I look at the fleeting, passionate relationships I've had in the past, and I don't have pride in them.

I think that you have to decide what helps you sleep at night. For me, it's pride and feeling like I behave with grace and integrity, that's just me personally. Everyone can go like, yeah, I slept with all the hot chicks in Italy, I could, and I did all the whatever, like whatever you value.

We're allowed to value different things. - We're talking about Brian Callan. (laughing) - Brian Callan has lived his life to the fullest, to say the least, but I think that it's just for me personally, I, and this could be like my workaholism or my achievementism, if I don't have something to show for something, I feel like it's a waste of time or some kind of loss.

I'm in a 12-step program, and the third step would say, there's no such thing as waste of time and everything happens exactly as it should and whatever, that's a way to just sort of keep us sane so we don't grieve too much and beat ourselves up over past mistakes, there's no such thing as mistakes, da-da-da, but I think passion is, I think it's so life-affirming and one of the few things that maybe people like us, makes us feel awake and seen and we just have such a high threshold for adrenaline.

You know, I mean, you are a fighter, right? Yeah, okay, so yeah, so you have a very high tolerance for adrenaline, and I think that Olympic athletes, the amount of adrenaline they get from performing, it's very hard to follow that, it's like when guys come back from the military and they have depression, it's like, do you miss bullets flying at you?

Yeah, kind of, because of that adrenaline which turned into dopamine and the camaraderie, I mean, there's people that speak much better about this than I do, but I just, I'm obsessed with neurology and I'm just obsessed with sort of the lies we tell ourselves in order to justify getting neurochemicals.

- You've actually done a lot of thinking and talking about neurology, kind of looking at human behavior through the lens of how our brain actually, chemically, works. So first of all, why did you connect with that idea, and how has your view of the world changed by considering that the brain is just a machine?

- You know, I know it probably sounds really nihilistic, but for me it's very liberating to know a lot about neurochemicals because you don't have to, it's like the same thing with like critics, like critical reviews, if you believe the good, you have to believe the bad kind of thing.

Like, you know, if you believe that your bad choices were because of your moral integrity or whatever, you have to believe your good ones, I just think there's something really liberating and going like, oh, that was just adrenaline, I just said that thing 'cause I was adrenalized and I was scared and my amygdala was activated and that's why I said you're an asshole and get out.

And that's, you know, I just think it's important to delineate what's nature and what's nurture, what is your choice and what is just your brain trying to keep you safe. I think we forget that even though we have security systems in our homes and locks on our doors, our brain, for the most part, is just trying to keep us safe all the time, it's why we hold grudges, it's why we get angry, it's why we get road rage, it's why we do a lot of things.

And it's also, when I started learning about neurology, I started having so much more compassion for other people. You know, if someone yelled at me, being like, fuck you, on the road, I'd be like, okay, he's producing adrenaline right now because we're all going 65 miles an hour and our brains aren't really designed for this type of stress and he's scared.

He was scared, you know, so that really helped me to have more love for people in my everyday life instead of being in fight or flight mode. But the, I think, more interesting answer to your question is that I've had migraines my whole life, like I've suffered with really intense migraines, ocular migraines, ones where my arm would go numb, and I just started having to go to so many doctors to learn about it, and I started, you know, learning that we don't really know that much.

We know a lot, but it's wild to go into one of the best neurologists in the world who's like, yeah, we don't know. - We don't know. - We don't know. And that fascinated me. - It's like one of the worst pains you can probably have, all that stuff, and we don't know the source.

- We don't know the source, and there is something really fascinating about when your left arm starts going numb, and you start not being able to see out of the left side of both your eyes, and I remember when the migraines get really bad, it's like a mini stroke almost, and you're able to see words on a page, but I can't read them.

They just look like symbols to me. So there's something just really fascinating to me about your brain just being able to stop functioning, and so I just wanted to learn about it, study about it. I did all these weird alternative treatments. I got this piercing in here that actually works.

I've tried everything, and then both my parents had strokes. So when both of my parents had strokes, I became sort of the person who had to decide what was gonna happen with their recovery, which is just a wild thing to have to deal with at, you know, 28 years old, when it happened, and I started spending basically all day every day in ICUs with neurologists learning about what happened in my dad's brain, and why he can't move his left arm, but he can move his right leg, but he can't see out of the, you know, and then my mom had another stroke in a different part of the brain.

So I started having to learn what parts of the brain did what, and so that I wouldn't take their behavior so personally, and so that I would be able to manage my expectations in terms of their recovery. So my mom, because it affected a lot of her frontal lobe, changed a lot as a person.

She was way more emotional, she micromanaged way more, she was forgetting certain things, so it broke my heart less when I was able to know, oh yeah, well, the stroke hit this part of the brain, and that's the one that's responsible for short-term memory and that's responsible for long-term memory, da-da-da, and then my brother just got something called viral encephalitis, which is an infection inside the brain.

So it was kind of wild that I was able to go, oh, I know exactly what's happening here, and I know, you know, so. - So that allows you to have some more compassion for the struggles that people have, but does it take away some of the magic from some of the more positive experiences of life?

- Sometimes. Sometimes, and I don't, I'm such a control addict that, you know, I think our biggest, someone like me, my biggest dream is to know why someone's doing something, that's what stand-up is. It's just trying to figure out why, or that's what writing is, that's what acting is, that's what performing is, it's trying to figure out why someone would do something.

As an actor, you get a piece of, you know, material, and you go, this person, why would he say that? Why would she pick up that cup? Why would she walk over here? It's really why, why, why, why. So I think neurology is, if you're trying to figure out human motives and why people do what they do, it'd be crazy not to understand how neurochemicals motivate us.

I also have a lot of addiction in my family, and hardcore drug addiction and mental illness, and in order to cope with it, you really have to understand it, borderline personality disorder, schizophrenia, and drug addiction. So I have a lot of people I love that suffer from drug addiction and alcoholism, and the first thing they start teaching you is that it's not a choice.

These people's dopamine receptors don't hold dopamine the same way yours do. Their frontal lobe is underdeveloped. Like, you know, and that really helped me to navigate loving people that were addicted to substances. - I wanna be careful with this question, but how much-- - Money do you have? - How much-- (laughing) Can I borrow $10?

(laughing) Okay. No, it's how much control, how much, despite the chemical imbalances or the biological limitations that each of our individual brains have, how much mind over matter is there? So through things, and I've known people with clinical depression, and so it's always a touchy subject to say how much they can really help it.

- Very. - What can you, yeah, what can you, 'cause you've talked about codependency, you've talked about issues that you struggle through, and nevertheless you choose to take a journey of healing and so on. So that's your choice, that's your actions. So how much can you do to help fight the limitations of the neurochemicals in your brain?

- That's such an interesting question, and I don't think I'm at all qualified to answer, but I'll say what I do know. And really quick, just the definition of codependency, I think a lot of people think of codependency as like two people that can't stop hanging out, you know, or like, you know, that's not totally off, but I think for the most part, my favorite definition of codependency is the inability to tolerate the discomfort of others.

You grow up in an alcoholic home, you grow up around mental illness, you grow up in chaos, you have a parent that's a narcissist, you basically are wired to just, people, please, worry about others, be perfect, walk on eggshells, shape shift to accommodate other people. So codependence is a very active wiring issue that, you know, doesn't just affect your romantic relationships, it affects you being a boss, it affects you in the world online, you know, you get one negative comment and it throws you for two weeks, you know, it also is linked to eating disorders and other kinds of addiction.

So it's a very big thing. And I think a lot of people sometimes only think that it's in romantic relationships, so I always feel the need to say that. And also, one of the reasons I love the idea of robots so much is because you don't have to walk on eggshells around them, you don't have to worry they're gonna get mad at you yet, but there's no, codependents are hypersensitive to the needs and moods of others, and it's very exhausting, it's depleting.

Just, well, one conversation about where we're gonna go to dinner is like, do you wanna go get Chinese food? We just had Chinese food. Well, wait, are you mad? Well, no, I didn't mean to, and it's just like, codependents live in this, everything means something, and humans can be very emotionally exhausting.

Why did you look at me that way? What are you thinking about? What was that, why'd you check your phone? It's just, it's a hypersensitivity that can be incredibly time-consuming, which is why I love the idea of robots just subbing in. Even, I've had a hard time running TV shows and stuff because even asking someone to do something, I don't wanna come off like a bitch.

I'm very concerned about what other people think of me, how I'm perceived, which is why I think robots will be very beneficial for codependents. - By the way, just a real quick tangent, that skill or flaw, whatever you wanna call it, is actually really useful for, if you ever do start your own podcast, for interviewing, because you're now kind of obsessed about the mindset of others, and it makes you a good sort of listener and talker with.

So I think, what's her name from NPR? - Terry Gross. - Terry Gross talked about having that, so. - I don't feel like she has that at all. (laughing) What? She worries about other people's feelings. - Yeah, absolutely. - Oh, I don't get that at all. - I mean, you have to put yourself in the mind of the person you're speaking with.

- Yes, oh, I see, just in terms of, yeah, I am starting a podcast, and the reason I haven't is because I'm codependent and I'm too worried it's not gonna be perfect. So a big codependent adage is, perfectionism leads to procrastination, which leads to paralysis. - So how do you, sorry to take a million tangents, how do you survive on social media?

'Cause you're exceptionally active. - But by the way, I took you on a tangent and didn't answer your last question about how much we can control. - How much, yeah, we'll return it, or maybe not. The answer is we can't. - Now as a codependent, I'm worried, okay, good.

We can, but one of the things that I'm fascinated by is the first thing you learn when you go into 12-step programs or addiction recovery or any of this is genetics loads the gun, environment pulls the trigger. And there's certain parts of your genetics you cannot control. I come from a lot of alcoholism, I come from a lot of mental illness, there's certain things I cannot control and a lot of things that maybe we don't even know yet what we can and can't 'cause of how little we actually know about the brain.

But we also talk about the warrior spirit and there are some people that have that warrior spirit and we don't necessarily know what that engine is, whether it's you get dopamine from succeeding or achieving or martyring yourself or the attention you get from growing. So a lot of people are like, "Oh, well, this person can edify themselves and overcome," but if you're getting attention from improving yourself, you're gonna keep wanting to do that. So that is something that helps a lot in terms of changing your brain.

If you talk about changing your brain to people and talk about what you're doing to overcome certain obstacles, you're gonna get more attention from them, which is gonna fire off your reward system, and then you're gonna keep doing it. - Yeah, so you can leverage that momentum. - So this is why in any 12-step program, you go into a room and you talk about your progress, 'cause then everyone claps for you, and then you're more motivated to keep going.

So that's why we say you're only as sick as the secrets you keep, because if you keep things secret, there's no one guiding you to go in a certain direction. It's built into us, right? We're sort of designed to get approval from the tribe or from a group of people, 'cause our brain translates it to safety.

So, yeah. - And in that case, the tribe is a positive one that helps you go in a positive direction. - So that's why it's so important to go into a room and also say, "Hey, I wanted to use drugs today." And people go, "Mm." They go, "Me too." You feel less alone, and you feel less like you've, you know, been castigated from the pack or whatever.

And then you say, "And I didn't have any," you get a chip when you haven't drank for 30 days or 60 days or whatever. You get little rewards. - So talking about a pack that's not at all healthy or good, but in fact is often toxic, social media. So you're one of my favorite people on Twitter and Instagram to sort of just both the comedy and the insight and just fun.

How do you prevent social media from destroying your mental health? - I haven't. I haven't. It's the next big epidemic, isn't it? I don't think I have. I don't think-- - Is moderation the answer? - Maybe, but you can do a lot of damage in a moderate way. I mean, I guess, again, it depends on your goals, you know?

And I think, for me, with my addiction to social media, I'm happy to call it an addiction. I mean, I define it as an addiction because it stops being a choice. There are times I just reach over and I'm like, that was-- - Yeah, that was weird.

- That was weird. I'll be driving sometimes and I'll be like, oh my God, my arm just went to my phone, you know? I can put it down. I can take time away from it, but when I do, I get antsy. I get restless, irritable, and discontent. I mean, that's kind of the definition, isn't it?

So I think by no means do I have a healthy relationship with social media. I'm sure there's a way to, but I think I'm especially a weirdo in this space because it's easy to conflate, is this work? Is this not? I can always say that it's for work. - Right.

- You know? - But I mean, don't you get the same kind of thing from it as you get when a room full of people laughs at your jokes? 'Cause, I mean, especially the way you do Twitter, it's an extension of your comedy in a way. - I took a big break from Twitter though, a really big break.

I took like six months off or something, because it seemed like it was all kind of politics, and it wasn't giving me dopamine; there was a lot of this weird feedback. So I had to take a break from it and then go back to it, 'cause I felt like I didn't have a healthy relationship with it.

- Have you ever tried the... I don't know if I believe him, but Joe Rogan seems to not read comments. He's one of the only people at that scale, like at your level, who at least claims not to read them. 'Cause you and he swim in this space of tense ideas that get the toxic folks riled up.

- I think Rogan, I don't know. I think he probably looks at YouTube, like the likes, and if something's... I don't know, I'm sure he would tell the truth. I'm sure he's got people that look at them and are like, "This guest did great," or, "I don't..." I'm sure he gets it.

I can't picture him in the weeds on-- - No, for sure. He's being honest, actually, saying that. - Feedback. We're addicted to feedback. Yeah, we're addicted to feedback. I mean, look, I think that our brain is designed to get intel on how we're perceived so that we know where we stand, right?

That's our whole deal, right? As humans, we wanna know where we stand. We walk into a room and we go, "Who's the most powerful person in here? "I gotta talk to 'em and get in their good graces." It's just we're designed to rank ourselves, right? And constantly know our rank.

And with social media, you can't figure out your rank among 500 million people; it's impossible. So our brain is like, "What's my rank? What's my..." and especially if we're following people. I think the interesting thing I'm maybe able to say about this, besides my speech impediment, is that I did start muting people that rank wildly higher than me, because it is just stressful on the brain to constantly look at people that are incredibly successful, so you keep feeling bad about yourself.

I think that that is cheating to a certain extent. Just like, look at me looking at all these people that have so much more money than me and so much more success than me. It's making me feel like a failure, even though I don't think I'm a failure, but it's easy to frame it so that I can feel that way.

- Yeah, that's really interesting, especially if they're close to what you do, like if they're other comedians or something like that. - That's right. - Or whatever. It's really disappointing to me; I do the same thing as well, with other successful people that are really close to what I do. I don't know.

I wish I could just admire them. - Yeah. - And for it not to be a distraction. - But that's why you are where you are, 'cause you don't just admire; you're competitive and you wanna win. So the same thing that bums you out when you look at this is the same reason you are where you are.

So that's why I think it's so important to learn about neurology and addiction, 'cause you're able to go, oh, it's this same instinct. I'm very sensitive, and I sometimes don't like that about myself, but I'm like, well, that's the reason I'm able to write good stand-up. And that's the reason I'm able to be sensitive to feedback and go, that joke should have been better.

I can make that better. So it's the kind of thing where it's like, you have to be really sensitive in your work and the second you leave, you gotta be able to turn it off. It's about developing the muscle, being able to know when to let it be a superpower and when it's gonna hold you back and be an obstacle.

So I try to not be in that black and white of, like, being competitive is bad or being jealous of someone is bad, and instead go, oh, there's that thing that makes me really successful in a lot of other ways, but right now it's making me feel bad. - Well, I'm kind of looking to you, 'cause you're basically a celebrity, a famous, sort of world-class comedian.

And so I feel like you're the right person to be one of the key people to define what's the healthy path forward with social media. 'Cause we're all trying to figure it out now and I'm curious to see where it evolves. I think you're at the center of that.

So, like, there's trying to leave Twitter and then coming back and saying, can I do this in a healthy way? I mean, you have to keep trying, exploring and thinking. - You have to. I have a couple of answers. I hire a company to do some of my social media for me.

So it's also being able to go, okay, I make a certain amount of money by doing this, but now let me be a good business person and say, I'm gonna pay you this amount to run this for me. So I'm not 24/7 in the weeds, hashtagging and responding. And just, it's a lot to take on.

It's a lot of energy to take on. But at the same time, part of what I think makes me successful on social media, if I am, is that people know I'm actually doing it, that I am engaging and responding and developing a personal relationship with complete strangers.

So I think figuring out that balance and really approaching it as a business, that's what I try to do. It's not dating; I try to just be really objective about, okay, here's what's working, here's what's not working. And in terms of taking the break from Twitter, this is a really savage take, but because I don't talk about my politics publicly, being on Twitter right after the last election was not gonna be beneficial, because you had to take a side.

You had to be political in order to get any kind of retweets or likes, and I just wasn't interested in doing that, 'cause you were gonna lose as many people as you were gonna gain, and it was all gonna come out in the wash. So I was just like, the best thing I can do for myself business-wise is to just abstain.

And the robot, I joke about her replacing me, but she does do half of my social media. 'Cause I don't want people to get sick of me. I don't want to be redundant. There are times when I don't have the time or the energy to make a funny video, but I know she's gonna be compelling and interesting and that's something that you can't see every day.

- Of course, the humor, I mean, the cleverness, the wit, comes from you when you film the robot. That's kind of the trick of it. I mean, the robot is not quite there yet to do anything funny on her own. The absurdity is revealed through the filmmaker in that case, or whoever is interacting, not through the actual robot being who she is.

Let me sort of... love, okay. (Bridget laughs) How difficult-- - What is it? - What is it? Well, first an engineering question. I know, I know, you're not an engineer, but how difficult do you think it is to build an AI system that you can have a deep, fulfilling, monogamous relationship with?

Sort of replace the human-to-human relationships that we value? - I think anyone can fall in love with anything. You know, like how often have you looked back at someone, like I ran into someone the other day that I was in love with and I was like, "Hey," it was like, there was nothing there.

There was nothing there. Like, you know, where you're able to go, "Oh, that was weird. Oh, right." You know, I was able-- - You mean from a distant past or something like that? - Yeah. When you're able to go, I can't believe we had an incredible connection and now it's just nothing. I do think that people will be in love with robots, probably even more deeply than with humans, because it's like when people mourn their animals, when their animals die: it's sometimes harder than mourning a human, because you can't go, "Well, he was kind of an asshole," or, "He didn't pick me up from school." You know, it's like you're able to get out of your grief a little bit.

You're able to kind of go, "Oh, he was kind of judgmental," or, "She was kind of..." you know. With a robot, there's something so pure, and innocent, and impish, and childlike about it that I think it probably will be much more conducive to a narcissistic love, for sure.

But it's not like, "Well, he cheated. "She can't cheat. "She can't leave you. "She can't," you know. - Well, if a bear claw leaves your life and maybe a new version or somebody else will enter, will you miss a bear claw? - For guys that have these sex robots, they're building a nursing home for the bodies that are now rusting 'cause they don't wanna part with the bodies 'cause they have such an intense emotional connection to it.

I mean, it's kind of like a car club, a little bit, you know. But I'm not saying this is right. I'm not saying it's cool; it's weird, it's creepy. But we do anthropomorphize things with faces, and we do develop emotional connections to things. I mean, have you ever tried to, like, throw away... I can't even throw away my teddy bear from when I was a kid.

It's a piece of trash, and it's upstairs. Like, why can't I throw that away? It's bizarre. You know, and there's something kind of beautiful about that. It gives me hope in humans, 'cause I see humans do such horrific things all the time, and maybe I see too much of it, frankly, but there's something kind of beautiful about the way we're able to have emotional connections to objects, which, you know, I mean, is kind of specifically, I think, Western, right?

That we don't see objects as having souls. Like, that's kind of specifically us. But I don't think it's so much that we're objectifying humans with these sex robots. We're kind of humanizing objects, right? So there's something kind of fascinating in our ability to do that. 'Cause a lot of us don't humanize humans.

So it's just a weird little place to play in. And I think a lot of people, I mean, a lot of people will be marrying these things is my guess. - So you've asked the question. Let me ask it of you. So what is love? You have a bit of a brilliant definition of love as being willing to die for someone who you yourself want to kill.

So that's kind of fun. First of all, that's brilliant. That's a really good definition. I think it'll stick with me for a long time. - This is how little of a romantic I am: a plane went by when you said that, and my brain was like, you're gonna need to rerecord that.

I don't want you to get into post and then not be able to use it. (laughing) - And I'm a romantic, 'cause I-- - Don't mean to ruin the moment. - Actually, I was conscious of the fact that I heard the plane, and it made me feel how amazing it is that we live in a world with planes.

(laughing) - And I just went, why haven't we fucking evolved past planes and why can't they make them quieter? - Yeah. (laughing) - But yes. - This-- - My definition of love? - What, yeah, what's your-- - Consistently producing dopamine for a long time. (laughing) Consistent output of oxytocin with the same person.

- Dopamine is a positive thing. What about the negative? What about the fear and the insecurity? The longing, anger, all that kind of stuff? - I think that's part of love. I think that love brings out the best in you, but also, if you don't get angry and upset... I don't know, I think that that's part of it.

I think we have this idea that love has to be really placid or something. I only saw stormy relationships growing up, so I don't have a judgment on how a relationship should look, but I do think that this idea that love has to be eternal is really destructive and self-defeating, and a big source of stress for people.

I mean, I'm still figuring out love. I think we all kind of are, but I do kind of stand by that definition. And I think, for me, love is just being able to be authentic with somebody. It's very simple, I know, but I think for me it's about not feeling pressure to have to perform or impress somebody, just feeling truly, like, accepted unconditionally by someone.

Although I do believe love should be conditional. That might be a hot take. I think everything should be conditional. I think it depends on someone's behavior; I don't think love should just be like, I'm in love with you, now behave however you want forever, this is unconditional. I think love is a daily action.

It's not something you just, like, get tenure on and then get to behave however you want 'cause we said I love you 10 years ago. It's a daily thing; it's a verb. - Well, there are some things where, you see, if you explicitly make it clear that it's conditional, it takes away some of the magic of it.

So there are certain stories we tell ourselves about love that we don't wanna make explicit. I don't know, maybe that's the wrong way to think of it. Maybe you do wanna be explicit in relationships. - I also think love is a business decision. I do, in a good way. I think that love is not just when you're across from somebody.

It's, when I go to work, can I focus? Am I worried about you? Am I stressed out about you? Are you not responding to me, are you not reliable? I think that being in a relationship, the kind of love that I would want, is the kind of relationship where, when we're not together, it's not draining me, causing me stress, making me worry. You know, and sometimes passion, that word.

You know, we get murky about it, but I think it's also that I can be the best version of myself when the person's not around, and I don't have to feel abandoned or scared or any of these kinds of other things. So love, you know, for me... I think it's a Flaubert quote, and I'm gonna butcher it, but it's something like, be boring in your personal life so you can be violent and take risks in your professional life.

Is that it? I got it wrong. Something like that, but I do think that it's being able to align values in a way where you can also thrive outside of the relationship. - Some of the most successful people I know are the ones who are happily married and have kids and so on.

It's always funny-- - It can be boring. Boring's okay. Boring is serenity. - And it's funny how those elements actually make you much more productive. I don't understand the-- - I don't think relationships should drain you and take away energy that you could be using to create things that generate pride.

- Okay. - Did you say your definition of love yet? - Huh? - Have you said your definition of love? - My definition of love? No, I did not say it. (Bridget laughs) We're out of time. - No! - When you have a podcast, maybe you can invite me on.

- Oh no, I already did. You're doing it. We've already talked about this. - And because I also have codependency, I had to say yes. - No, yeah, yeah, no, no, I'm trapping you. You owe me now. - Actually, when I asked if we could talk today, after sort of doing more research and reading some of your book, I started to wonder: did you just feel pressured to say yes?

- Yes, of course. - Good. - But I'm a fan of yours too. - Okay, awesome. - No, actually, because I am codependent, but I'm in recovery for codependence, I actually don't do anything I don't wanna do. - You really, you go out of your way to say no.

- What's that? I say no all the time. - Good, I'm trying to learn that as well. - I moved this, remember, I moved it from one to two. - Yeah, yeah. - I-- - Yeah, just to let you know how recovered I am. I'm not codependent, but I don't do anything I don't wanna do.

- Yeah, you're ahead of me on that, okay. So do you-- - You're like, I don't even wanna be here. (laughing) - Do you think about your mortality? - Yes, it is a big part of how I was able to sort of kickstart my codependence recovery. My dad passed a couple years ago, and when you have someone close to you in your life die, everything gets real clear in terms of how we're a speck of dust who's only here for a certain amount of time.

- What do you think is the meaning of it all? Like, for this speck of dust, maybe in your own life, what's the goal, the purpose of your existence? - Is there one? - Well, you're exceptionally ambitious; you've created some incredible things in different disciplines. - Yeah, it's... we're all just managing our terror, 'cause we know we're gonna die.

So we create and build all these things and rituals and religions and robots and whatever we need to do to just distract ourselves from imminent rotting. We're rotting, we're all dying. I got very into terror management theory when my dad died and it resonated, it helped me, and everyone's got their own religion or sense of purpose or thing that distracts them from the horrors of being human.

- What's terror management theory? - Terror management is basically the idea that since we're the only animal that knows it's gonna die, we have to distract ourselves with awards and achievements and games and whatever, just in order to avoid the terror we would feel if we really processed the fact that not only are we gonna die, but we could also die at any minute, because we're only superficially at the top of the food chain.

And technically we're at the top of the food chain if we have houses and guns and machines and stuff, but if it's just me and a lion in the woods together, most things could kill us. I mean, a bee can kill some people. Something this big can kill a lot of humans.

So it's basically just to manage the terror that we all would feel if we were able to really be awake. 'Cause we're mostly zombies, right? Job, school, religion, go to sleep, drink, football, relationship, dopamine, love, you know, we're kind of just like trudging along like zombies for the most part.

And then I think-- - That fear of death adds some motivation. - Yes. - Well, I think I speak for a lot of people in saying that I can't wait to see what your terror creates (laughing) in the next few years. I'm a huge fan. Whitney, thank you so much for talking today.

- Thanks. - Thanks for listening to this conversation with Whitney Cummings. And thank you to our presenting sponsor, Cash App. Download it and use code LEXPODCAST. You'll get $10 and $10 will go to FIRST, a STEM education nonprofit that inspires hundreds of thousands of young minds to learn and to dream of engineering our future.

If you enjoy this podcast, subscribe on YouTube, give it five stars on Apple Podcasts, support on Patreon, or connect with me on Twitter. Thank you for listening, and hope to see you next time. (upbeat music)