In conversation with Sheryl Sandberg, plus open-source AI gene editing explained
Chapters
0:00 Welcoming Sheryl Sandberg and remembering Dave Goldberg
11:10 What led Sheryl to get involved with "Screams Before Silence," reaction to sexual violence on and after 10/7
28:18 Paths forward, documentary decisions, involvement of women in protests
53:03 Post-interview debrief
59:45 Science Corner: Open-source AI gene editing with OpenCRISPR-1
00:00:00.000 |
David Sacks had a last-minute board meeting, so he will not be joining us. 00:00:11.120 |
Jason, do you know what I'm about to do? I'm so excited. I'm so excited. 00:00:21.280 |
Oh, you're going to reveal who Fake Chamath is? 00:00:29.360 |
I would like to get that handle and give it to someone to be. 00:00:34.480 |
Well, trust me, there's a lot of people who would love to have the Fake Chamath handle. 00:00:37.600 |
Well, how do I get it? Can I ask Linda at Twitter for it? 00:00:42.340 |
We might know somebody at Twitter who can reset the password. 00:00:44.720 |
Maybe you can help me. I am so credible as the person who deserves that password. 00:00:49.200 |
Of all the people who've suffered spending time with Chamath, you're at the top of that list. 00:00:58.400 |
I mean, you've had to watch his growth over 20 years. You've had to suffer. 00:01:04.960 |
I raised him. I raised Chamath and he raised me right back. 00:01:08.560 |
All right. Welcome back to the program, everybody. 00:01:28.000 |
One of the guests we've always dreamed about having on the show is considered one of the 00:01:31.040 |
great business operators of all time in Silicon Valley. For the past 20 years, 00:01:34.400 |
Sheryl Sandberg was a key, some might say the key, piece in building the two largest 00:01:40.080 |
advertising and technology companies in the world, Google and Facebook. 00:01:43.760 |
Paradoxically, they don't go by those names anymore, Alphabet and Meta. 00:01:48.720 |
When she joined Google in 2001, it had $20 million in revenue. They were private. 00:01:52.720 |
And when she left in 2008, they had $22 billion in revenue. 00:01:56.960 |
When she joined Facebook in 2008, it was at $270 million in revenue. 00:02:01.520 |
When she left, it was at $117 billion. Market caps of those two companies 00:02:06.160 |
have grown $100 billion and $950 billion during her tenures. 00:02:10.240 |
And today, both are worth over $3 trillion combined and are the number four and number 00:02:15.280 |
seven market cap companies in the world. However, to our crew, she will always be 00:02:20.320 |
Bestie Dave Goldberg's dream girl, as he once described it to me. 00:02:24.960 |
He told me he pursued her relentlessly until she finally gave in, 00:02:28.640 |
dated and then married him and started a beautiful family together. 00:02:32.880 |
Dave Goldberg passed away nine years ago this week in 2015. In an alternate universe, 00:02:39.600 |
on a different timeline, Goldie would have been one of the four people on this panel, 00:02:42.640 |
because he was the wisest, funniest supermensch of the entire 10-person core poker group, 00:02:49.360 |
the original poker group. In fact, he was twice the man of any of us, 00:02:53.840 |
which, given the low benchmark we've set, isn't that difficult. We can get at least three shows 00:02:59.840 |
worth of wisdom from our current guest. But that's not why she's joining us today. 00:03:03.920 |
She made a documentary, and we're here to talk about that. And we'll have some time for business 00:03:08.000 |
talk at the end, which is going to be a very hard pivot, given the nature of the doc. 00:03:12.320 |
The doc she co-produced is called Screams Before Silence. I watched it on the flight back from New 00:03:17.600 |
York. I had to take three breaks, and it took a lot of tissues, if I'm being honest. It is one of 00:03:23.520 |
the most difficult hours of viewing I've ever had in my life. It is focused on the sexual violence 00:03:30.480 |
committed by Hamas during and after the October 7th attacks, which, tragically, in all 00:03:36.640 |
likelihood, continues today with the hostages who are still somewhere in Gaza. The documentary also 00:03:42.800 |
takes on claims in our polluted journalistic, conspiracy-filled media landscape that 00:03:48.400 |
none of this happened. She traveled to Israel to conduct interviews for it, 00:03:52.240 |
and outside of comforting the victims, she spends less than 90 seconds speaking in it herself. The 00:03:56.560 |
stories, of course, speak for themselves. Now, this isn't a disclaimer, but some context about 00:04:01.760 |
this podcast, for those of you who are here for the first time, might be helpful. We realize we're 00:04:06.640 |
wading into a conflict that is thousands of years old, and it's shrouded in pain and suffering, 00:04:11.280 |
with a foundation on the most deeply held religious beliefs humanity has ever formed. 00:04:16.080 |
When we do podcasts like this and have guests, we'll be championed by one side and derided by 00:04:20.960 |
the other. But as you know, we don't shy away from the hard discussions on this podcast. 00:04:25.120 |
We go all in on them. Equal time will always be given, and we welcome all sides on these 00:04:29.680 |
difficult discussions. It goes without saying that we're not here to be your expert or final 00:04:34.480 |
authority. We're here to have a first-principles discussion and to personally learn alongside each 00:04:38.160 |
other in good faith. In good faith, this is a really important concept, because it's hard to 00:04:44.400 |
have these discussions in good faith today. So, with that, I'll welcome to the all-in podcast, 00:04:49.040 |
our bestie, Sheryl Sandberg. Well, two things you just said: 00:04:53.440 |
that Dave would have been on this podcast, I've thought that, actually. And calling me a bestie, 00:04:58.320 |
because I've been friends with all of you for so long, means a lot to me. 00:05:01.680 |
Yeah. Jason, you dedicated your book to Dave. That 00:05:05.120 |
meant everything to me. David Friedberg and I have been traveling around together to conferences, 00:05:09.280 |
sitting in the backseat of cars. And Chamath, it's a really special moment to be here with you. 00:05:15.040 |
We lost Dave nine years ago, yesterday. We were at our dear friend, Phil Joich's 50th birthday 00:05:21.840 |
party. It happened suddenly. I was in shock. Everyone was in shock. Chamath sprang into action, 00:05:28.240 |
took care of every logistical thing you could have possibly needed. But then he did something, 00:05:33.200 |
you did something, Chamath, even more important, which is you showed up for my children, 00:05:36.560 |
not just for the days and weeks, but for the months and years afterwards. 00:05:40.880 |
And one of the many things you did is you taught them to play poker. Because what you said is, 00:05:45.200 |
if Dave were alive, he would have taught them to play poker. And last night, on the ninth 00:05:48.720 |
anniversary of his loss, my kids were in that room playing poker. And that is very much to 00:05:54.320 |
your credit, Chamath. And I will always, always be grateful for that and grateful, Jason, to David, 00:06:00.800 |
and all of us for Dave. So the world lost something really big when we lost Dave. 00:06:08.800 |
And I think a lot of people know a lot of the things we lost. I lost an amazing husband and 00:06:13.120 |
father to my children. You all lost a best friend. The world lost a lot of wisdom. But 00:06:18.960 |
there's actually one thing that the world also lost that we've never shared. And I'm prepared 00:06:24.640 |
to reveal right now, right here, right now. Because last night, Rob Goldberg, Dave's brother, 00:06:30.240 |
and I decided, we decided it was time to share. People may have known there was a fake Chamath 00:06:38.640 |
Twitter handle. I built Facebook, rocked the angel world, and now I'm the Warriors. My motto: 00:06:42.960 |
don't be a D-bag. That's my job. And people have questioned who this was. I mean, some people think 00:06:48.560 |
it was Jason Calacanis. Some people think it was Friedberg. All right. I got to say, 00:06:53.120 |
some people think it was Chamath himself. But you know what? The number one choice. Yeah. 00:06:57.520 |
The number one choice. Yeah. Dave was fake Chamath. Now, he didn't write all the tweets 00:07:02.880 |
himself. I know all of you helped him, but he wrote a bunch of them. And he used to 00:07:06.000 |
literally lie in bed next to me, write something, and just bellow. Remember Dave's big laugh? 00:07:12.800 |
He would laugh out loud. And there are so many things the world lost. But can you guys imagine 00:07:17.680 |
the field day Dave, a.k.a. fake Chamath, would be having with this podcast? Field day. Field day. 00:07:25.360 |
All he'd have to do is just take excerpts from the show. 00:07:31.200 |
I mean, it's one of the great things about the great challenge. I remember 00:07:35.360 |
workshopping some tweets here with Dave, with Goldie. And the big laugh we would have and 00:07:43.200 |
David Lee from the Warriors was involved in this. I mean, we just had like, a whole group who lived 00:07:48.000 |
to write these tweets. And sometimes Chamath's tweets were so insane and deranged that we 00:07:56.480 |
couldn't top them. Like this one from fake Chamath. This is a great one. Pinterest is a 00:08:06.240 |
new hot company in the valley. I don't understand why a site for girls with cats is worth 300 00:08:11.600 |
million. Now that's something that would be a benign tweet. Here's a great one from October 00:08:19.760 |
29 2011. A lot of demand for me to appear in commercials like others, but I am holding out 00:08:25.040 |
for Cartier. Mercedes is beneath me. I mean, this predated Loro Piana. Friedberg, you got this next 00:08:33.200 |
one. Give us this next one. There is a Loro Piana one. Yeah, reason number 756 to go to Vegas. No 00:08:39.760 |
sales tax on Loro Piana. This is in 2012. By the way, very precious. If you dress like me, I won't 00:08:45.600 |
initially think you are a D-bag. There's no way Dave knew what Loro Piana was. There's no way 00:08:51.120 |
Dave wrote this. Someone else wrote this one for sure. Absolutely. Who was ahead of his time on 00:08:54.880 |
Loro Piana at that time? Very good. I mean, it's just incredible. People think the Loro Piana 00:08:59.680 |
thing is like recent history. It was 12 years ago. I mean, this is when... I mean, Sheryl, 00:09:04.480 |
before we get started here. Wait, Sheryl should read that last one. That's really good. All right, 00:09:08.880 |
Sheryl, you get the last one. My newest investment is so good. Jet time. You can random video chat 00:09:15.360 |
with other people who are also on their private jet. G55 to Hawker. Yeah, Dave loved this group 00:09:22.000 |
of friends and he loved being fake Chamath. Yeah, loved it. Anyway, the secret's out. 00:09:27.520 |
My guess is that Twitter handle's about to get popular again. It's going to get pretty popular 00:09:33.760 |
and I will just say, as much as Dave loved being fake Chamath, it's like half the amount Chamath 00:09:40.960 |
loves being at Chamath. So let's just keep that in mind here, folks. Oh my God, I haven't cried 00:09:48.560 |
and laughed so hard in five minutes as I did just now. I mean, actually, in some ways, Sheryl, 00:09:54.960 |
you're our fifth bestie as well. You're always welcome to come on the pod. And I just also, 00:10:00.720 |
for a little bit of housekeeping here, when guests come on this podcast, we don't pre-vet questions. 00:10:04.960 |
No questions are off limits and nobody gets to strike or do anything nonsensical with the product. 00:10:11.360 |
Everybody comes here. We're not journalists. We're not. We're not journalists. We're not 00:10:14.800 |
traditional journalists. We're friends talking, trying to understand stuff. And just to be clear, 00:10:19.440 |
I know a lot of commentary comes back. Well, why didn't you say this or ask this? And, you know, 00:10:23.520 |
I think we're just when we have guests on, we just want to talk with them like we would 00:10:26.880 |
in a living room and have a conversation. So right, which means no gotcha journalism. 00:10:31.920 |
Although I'll ask a tough question once in a while that may get me in a little bit of trouble. But 00:10:37.040 |
David Friedberg, you set this all up. And I know you and Sheryl have been talking about these 00:10:42.400 |
important issues. And of course, we're going to have all sides on so you don't have to email me 00:10:47.920 |
and say, what about this side? What about that side? All sides are welcome to come on the pod. 00:10:51.840 |
But Friedberg, why don't you kick us off here? We're going to talk about this important film 00:10:55.520 |
and a lot of the debates going on about this horrific attack on October 7th and then what's 00:11:01.920 |
going on in Gaza today. But then we also make that hard pivot to business and get some of Sheryl's 00:11:08.240 |
insights on what's happening in the world of business today. So Friedberg, why don't you kick us 00:11:11.760 |
off? Well, I just want to zoom out because I think, Sheryl, we had, I believe, a couple of 00:11:18.240 |
conversations after October 7. Amongst other folks, I've heard that there's been a lot of 00:11:28.080 |
disappointment that institutions, organizations, ideologies that have been supported by folks 00:11:35.440 |
like yourself, or maybe you can speak, I don't want to put words in your mouth, 00:11:37.920 |
suddenly turned out to be something quite different when threads of anti-Semitism started to emerge. 00:11:46.320 |
And folks began to deny certain things based on their ideology about the oppressor-oppressed 00:11:57.200 |
concept being applied to Israel and Palestine. And maybe you can tell us a little bit about 00:12:04.800 |
the surprise and journey that you've been through since October 7th with respect to some of the 00:12:10.000 |
groups that you've supported that suddenly seemed quite different than what maybe we all thought 00:12:13.280 |
they were prior. Look, it's a great question because I mean, I'm sorry. And that's the 00:12:20.240 |
conversation Sheryl and I have been having that led to saying, hey, why don't you come on the 00:12:23.760 |
show this week and let's talk about this and other topics, particularly given the timing 00:12:27.840 |
with the release of the film. It's a great question. I mean, if you had told me on October 6th, 00:12:34.720 |
the following is going to happen. Terrorists are going to parachute into Israel. They are 00:12:40.640 |
going to kill 1200 people. They are going to sexually brutalize and rape multiple 00:12:47.600 |
women and men. I would have said, you're crazy. Then if you had told me that the reports 00:12:54.640 |
were going to start coming out, people were going to say, I'm a 00:12:58.320 |
first responder. I saw naked bodies. I saw women bloodied, legs spread, but then people were going 00:13:04.720 |
to deny that this happened. I would have said you were crazy. And then if you had told me that what 00:13:09.280 |
we would be doing on college campuses is not protesting sexual violence as a tool of war 00:13:16.000 |
at the hands of Hamas, misogynistic, homophobic terrorists who are right now 00:13:23.200 |
holding not just Israelis, but Americans hostage. Yet we would be protesting and college kids would 00:13:28.800 |
be screaming, we are Hamas. I would have said you were crazy. And that's hit me hard. And for me, 00:13:35.440 |
as a woman, as a very outspoken feminist, it's all hard. But the part that has hit me the hardest 00:13:40.560 |
is the denial of the sexual violence. That has just been horrible. And so as the reports were coming 00:13:47.840 |
out in November, I wrote an op-ed. And what my op-ed said was, no matter what you believe should 00:13:52.560 |
happen in the Middle East, I believe in a two state solution. No matter what flag you're flying, 00:13:56.960 |
march you're going to, you can all be united on one thing, which is sexual violence should never 00:14:02.640 |
be used as a tool of war. Then I did a video that went pretty viral, but people are denying it and 00:14:09.840 |
they're attacking articles and attacking reports. And so I went to Israel and I sat down myself 00:14:16.480 |
with a video crew. This was generously financed by this great philanthropist, Joey Low, and his wife. 00:16:24.240 |
And I sat down there and I asked people, what did you see with your own eyes? 00:14:30.240 |
We sat down with a released hostage who told her story. And this is because people are actually 00:14:38.400 |
denying or ignoring this. And that is a horrible place for us to be and truly shocking, truly 00:14:47.760 |
shocking. Let's double click into that word denial. So it's a very heightened moment. Everybody is 00:14:55.360 |
taking sides. Everybody's trying to interpret what they think is the right point of view, 00:15:00.240 |
whether it's in that moment or historically in the arc of how 00:15:04.880 |
Israel and Palestine have been in conflict. Where does that aspect of denial come from? Have you 00:15:11.440 |
spent time trying to unpack, like, how do you start to get to a place where you say, clearly 00:15:17.360 |
people were killed, but then when it goes into war crimes and sexual violence, we're actually 00:15:24.400 |
going to stop it there because it basically pulls our cause back. So we can't agree that that 00:15:30.400 |
actually happened. How does that happen? Why is that happening? I mean, you're framing it exactly 00:15:35.600 |
right. That's exactly what happened. So I mean, you all talk about this a lot, but there's huge 00:15:40.160 |
polarization. What does that mean? Polarization means I have a view that is so firmly entrenched 00:15:45.520 |
that I see the world as black and white. Everything has to fit into my view and my narrative. 00:15:49.680 |
And when it doesn't fit, I don't know what to do. So I reject it. And that's, I think, what's 00:15:53.840 |
happening: there are people out there who believe that October 7th was resistance. 00:15:58.960 |
I want to be clear. I'm not that person. I do not believe that. I'm horrified by what's happening 00:16:03.920 |
in Gaza. Every life lost is too much. I want two states living peacefully beside each other. 00:16:09.680 |
I really want that. But let's say you think October 7th was resistance. 00:16:15.280 |
Then all of a sudden you're like, wait a second. Mass rape. 00:16:22.640 |
Genital mutilation of women and men. Women tied to trees, naked, bloodied, 00:16:30.160 |
legs spread. That doesn't fit your narrative. So what can you do? You can now think maybe the world 00:16:34.720 |
isn't so black and white. Maybe I have to rethink my narrative. Or you can say this didn't happen. 00:16:40.400 |
And I think it is a travesty and a tragedy that anyone could say that. And I want to be clear, 00:16:46.880 |
Jason, you started this by saying you always have positions. You always give people room for two 00:16:52.160 |
sides. And that's fantastic. I think there are not just two sides, multiple sides to the Middle 00:16:57.120 |
East story, multiple sides to the history, multiple sides to what's been going on. There 00:17:01.680 |
are not two sides on this. This is sexual violence. There is one side, one side, and we are against it. 00:17:08.160 |
And that's relatively new in the world. To take you back, quick history lesson, which you all 00:17:12.480 |
know, but I'd love for all your viewers to know. For most of the history of mankind, women's 00:17:18.560 |
bodies were part of war. You got the village, you got the gold, you got the women. And it was only 00:17:23.760 |
30 years ago, after the mass rapes in the DRC, Bosnia, and the former Yugoslavia, that people said, 00:17:29.680 |
no, rape is not a tool of war. We will prosecute it as a war crime and a crime against humanity. 00:17:37.120 |
And the feminist groups were the ones who made that happen. The civil rights groups, 00:17:40.640 |
the human rights groups, they've held that line since then. In this moment, if our politics drive us to give 00:17:46.560 |
that up, think about what we give up. Because as we're doing this podcast right now, there are 00:17:52.000 |
hostages in Gaza that we know are being sexually assaulted. There are women in Ukraine, Sudan, 00:17:58.640 |
Ethiopia, around the world who are being sexually assaulted right now, right now. 00:18:03.760 |
And we can't let that go. This is the one place we need to be united. 00:18:10.080 |
Why are the feminist groups finding themselves aligning more with Hamas than they are with this 00:18:17.840 |
core, what seems to be and should be a core ideology? So look, we can't paint them all with 00:18:23.520 |
one brush. There are feminist groups that have spoken out on this, that have said, you know, 00:18:28.080 |
NOW did it, the new NARAL did it. They said, we are against the sexual violence. CARE did it. 00:18:35.440 |
There are groups that have done it, no matter what else they're working on. 00:18:38.880 |
A bunch of them have said to me privately, I know you're right. Of course, sexual violence 00:18:43.360 |
isn't okay. And of course this happened, but I can't speak out because all my employees are 00:18:47.360 |
going to get upset. I can't speak out because the young people and that makes me really sad. 00:18:52.000 |
But explain that. What does that mean? You know, people will be upset to know 00:18:56.080 |
that both things happened. You've got to be able to hold two thoughts at the same time. 00:19:02.560 |
Again, not my thought, but if you believe October 7th was resistance, you can still believe sexual 00:19:07.440 |
violence happened. The fact that a group of feminists, none I'm particularly close to, 00:19:13.520 |
have actually signed letters saying this didn't happen is crazy. Absolutely crazy. I mean, look, 00:19:20.880 |
I'm going to read this. The UN special representative on sexual violence, Pramila 00:19:24.560 |
Patten, traveled to Israel and here is what she wrote. She said, what I witnessed in Israel were scenes 00:19:30.000 |
of unspeakable violence perpetrated with shocking brutality, a catalog of the most extreme and inhumane 00:19:35.920 |
forms of killing, torture, and other horrors, including sexual violence. That's the UN. 00:19:41.200 |
They're not exactly a pro-Israel group. Sheryl, let me ask, because I think it's important to 00:19:47.360 |
note some people will counter and say, look at this article from Grayzone. Grayzone said Western 00:19:53.120 |
media concocts evidence that the UN report on October 7th sex crimes failed to deliver, dated March 00:19:58.720 |
7th. They said Western media promoted a UN report as proof Hamas sexually assaulted Israelis, 00:20:04.240 |
yet the report's authors admitted they couldn't locate a single victim, suggested Israeli officials 00:20:09.200 |
staged a rape scene and denounced inaccurate forensic interpretations. I just want to give 00:20:13.040 |
you an opportunity to respond to Grayzone's article, because I think a lot of folks have 00:20:17.760 |
pointed to that article and the articles that that organization has put out as being representative 00:20:24.240 |
of an alternative view that the sexual violence maybe didn't happen as evidenced in your film. 00:20:29.360 |
Maybe you can address it, give you a chance to do that. Yeah, well, the key thing you said there is, 00:20:33.920 |
they're asking, where are the victims? Well, let me tell you where the victims 00:20:37.440 |
are. They're dead. They're dead. That is why we call this film, sorry, Screams Before Silence. 00:20:45.120 |
I have a story in this film, this woman, Tali. I went with her to the trailer where she hid. She 00:20:51.920 |
was at the Nova Film Festival. She's a nurse. She hid in a trailer. I walked in with her to that 00:20:57.040 |
trailer the first time she'd been in there and you could see her body like shake and she, we didn't, 00:21:01.680 |
this didn't make the final cut of the film, but she picked up a black sweater and I think she 00:21:05.440 |
might have been wearing that sweater. I was afraid to ask her, but she was like shaking. 00:21:08.640 |
She hid in that trailer for, I don't know, five, six, seven hours and she heard, 00:21:12.800 |
sometimes she would hear like a little scream, like, ah, someone's pointing a gun at you and 00:21:17.040 |
a shot. But sometimes she would hear screams over and over and over: stop, stop. 00:21:22.080 |
And then for like a long period, like 15 minutes and then a shot. 00:21:28.640 |
And then when she got out of that trailer, there were naked bodies where she heard those screams. 00:21:34.960 |
The victims are dead. Most of them are dead. There is exactly one person who is an escaped, 00:21:42.800 |
released hostage. Her name is Amit Soussana. She gave a video interview. You all saw it. 00:21:47.040 |
We have the only video interview in this documentary. And she tells her story very 00:21:51.360 |
clearly. She was held hostage for months. She was chained to a bed. And as she has said, 00:21:58.080 |
her captor forced her to commit a sexual act. This woman is so brave. And she told 00:22:05.120 |
me she's speaking out because there are still hostages there, but she is the only living 00:22:09.920 |
witness to speak out. We think there are a few more who are in deep trauma, but there were 1200 00:22:16.240 |
people killed, and at least dozens of them were sexually brutalized and assaulted. And that is why 00:22:23.760 |
they're not speaking out. Just as a follow-up, what is the social and political motivation of a group 00:22:28.480 |
like Grayzone and other self-appointed deniers? What are they trying to accomplish by denying? 00:22:34.240 |
They're trying to accomplish their narrative that October 7th was justified resistance 00:22:38.960 |
because even they understand that the sexual violence taints it in a way, 00:22:46.640 |
right? As opposed to just being soldiers killing soldiers, the sexual violence aspect of it taints 00:22:52.480 |
the valor of the resistance. Is that a fair way to summarize it? 00:22:55.840 |
Yes. Even they don't believe, and it's interesting, Hamas has been proudly talking 00:23:02.400 |
about who they killed, but even they deny the sexual violence. That wouldn't happen. It's 00:23:05.840 |
against our religion. The sexual violence doesn't fit the narrative, but I want to be clear. The 00:23:12.080 |
sexual violence was multiple locations, systematic, meets the definition of a war crime, a crime 00:23:19.120 |
against humanity and was part of the plan. If there was no sexual violence, 00:23:23.760 |
would it be fair to call it a resistance? I would not call it a resistance. 00:23:27.600 |
One of the things that happened after the Holocaust was there was still a small cohort 00:23:34.960 |
of people that denied that it ever happened. And I think that there was to use the word 00:23:40.160 |
systematic again, a systematic effort to document, right? There's pictures, there's museums, 00:23:48.480 |
there's memorials. You can think what you want of World War II or Jews in general, 00:23:54.240 |
but you can't deny that that happened. And the documentation of it is pretty unambiguous or 00:24:00.720 |
completely unambiguous. When you spent time there, is there an effort to start doing this? And here's 00:24:06.800 |
where I'm getting to, which is kind of a morbid question, but there was a moment in this documentary 00:24:10.800 |
where this woman who was the doctor in the morgue, I guess, is talking about all of these bodies. 00:24:18.160 |
And unfortunately, where my mind went to, but I think it's kind of the right thought, is I hope 00:24:24.480 |
that there were rape kits done, even if it's posthumously, because that's the trail of 00:24:32.320 |
evidence that allows one to know squarely inside of a box. This is the totality of what happened 00:24:39.600 |
as a learning lesson for everybody, including not just the people that disagree, but the people that 00:24:44.800 |
agree. And then to reinforce some of these basic rights that we thought we've all signed up for. 00:24:49.120 |
I mean, it is such an important question. There were not rape kits done. 1200 people killed in 00:24:56.320 |
one day. I don't think anyone could have; their bodies were burned, people were trying to identify them. 00:25:00.880 |
I've actually looked into this a bunch. And in a lot of sexual violence in war situations, 00:25:05.200 |
there are no rape kits. Sometimes they're used, but often, in the chaos, 00:25:09.440 |
there are none. There are very few pictures. There are some, and I saw them in this documentary. And 00:25:15.600 |
they are, sorry, they are naked women with nails in their groins. Like, I'm sorry, I saw these pictures. 00:25:27.520 |
But what's interesting about it is the people who are the first responders are taught not to take 00:25:32.720 |
pictures, particularly of gruesome things, to protect the victims', 00:25:38.800 |
you know, rights. But the man I interviewed, he said, 24 hours in, he thought to himself, 00:25:46.080 |
no one's going to believe this, I got to take pictures. And against the training he had, 00:25:49.600 |
he took the pictures and he showed me on his phone, he was like, I took this. And another 00:25:54.560 |
guy from Zaka, they're a first responder group that goes in. This is an unheard of situation. 00:26:01.520 |
I mean, I said to him, you've been processing, sorry, maybe that's not the right word, you've been, 00:26:06.000 |
I guess, processing dead bodies all over the world. How many times in your experience are 00:26:11.280 |
they naked? And he just looked at me and said, never, they're never naked. And what meets the 00:26:17.840 |
legal criteria for proving crimes against humanity are witnesses, eyewitnesses. And what's important 00:26:24.480 |
about the documentary that we did, but also important about the efforts Israel is doing, 00:26:29.840 |
Israel is doing that documentation, not Israel the country, a woman in Israel named Cochav Elkayam-Levy, 00:26:35.360 |
who's fantastic. She is at a private university with private funding, doing that documentation, 00:26:42.240 |
which at this point consists mostly of interviews. But there are hundreds of them. 00:26:48.080 |
And look, I hope people watch it. In the documentary, I go into a field with this guy, Rami, I'm sure 00:26:53.840 |
you guys remember this. Because he's huge, right? Yeah, huge. He stands like this tall 00:26:59.600 |
over me. A private citizen, this guy is the biggest hero I've ever met in my life. Sirens go off, he gets 00:27:05.200 |
into his car, takes his gun, and drives to where the attack is. Incredible bravery. He rescued hundreds of people 00:27:13.280 |
himself, himself. But he got to a field and I stood in those trees. And he said, these trees, 00:27:21.520 |
he thinks about 30 women were there and raped or sexually brutalized when he saw them. They were 00:27:26.880 |
naked, tied to trees, legs spread, bloodied, like bloodied in the regions you would be bloodied in if 00:27:32.640 |
you were raped. And what he said in the film is, I got there, I covered their bodies, so no one 00:27:38.480 |
else would see. He didn't take pictures, I wish he had. But well, I guess I don't know, do I wish 00:27:44.480 |
he had? I don't know. But you understand why? But he said, I saw this with my own eyes. And what 00:27:50.080 |
you saw in the film is this huge man, who so bravely fought terrorists himself, crying because 00:27:56.240 |
he didn't get there early enough to save those women. But the good news is, while the victims 00:28:00.800 |
were killed, the good news is the first responders are alive and their testimony, which is eyewitness 00:28:07.120 |
testimony meets the criteria of any international or global court. Absolutely crimes can be proven 00:28:15.120 |
by eyewitnesses, for sure. Sure. What is the response in Israel? 00:28:21.360 |
How do you judge what Netanyahu is doing, both in reaction to the events, 00:28:28.480 |
but then in reaction to these specific aspects of the events? What are they doing 00:28:33.920 |
that's different? Or what would you wish they were doing differently? Or 00:28:38.480 |
can you just give us a sense of how people are processing this aspect? 00:28:44.320 |
I mean, look, we need peace. We need two sides and two leaders that are committed to 00:28:51.520 |
peace, like long term peace. And there's a lot going wrong, you know, but on this aspect, 00:28:57.840 |
someone said it in the film: you violate a woman, you violate a country. 00:29:03.040 |
There's a reason sexual violence is used as a war crime. There's a reason it was used in the 00:29:08.800 |
DRC and Bosnia. And it's being used in Ukraine today, and I can see it in your reaction. 00:29:13.680 |
I mean, it's to humiliate people, right? It's to humiliate a country, right? Look at 00:29:19.840 |
the three of you. Y'all don't cry a lot. Like, this is traumatic, because you all have 00:29:26.320 |
mothers and daughters, like you can feel what happens to a country. And that's why this was 00:29:33.120 |
done. This was not an accident. This was on purpose. And unfortunately, it works, 00:29:40.720 |
sexual violence. I think, like, for us to have a path towards peace, 00:29:47.040 |
there has to be, despite the pain being felt, a degree of empathy for the other side's 00:29:55.840 |
desires, the other side's pain, the other side's feeling that they were enacting a resistance 00:30:03.680 |
against an oppression. How does one side embrace that aspect having gone through this? How do we 00:30:10.160 |
get to a point that a people can say, I have empathy for the resistance, after feeling this 00:30:18.160 |
sort of pain? And this is the age-old story of war. An eye for an eye never ends; it always goes on. 00:30:24.240 |
What's the right path here, to hear the other side, to hear the kids on campuses, to hear the 00:30:30.880 |
people in Palestine, to hear the world saying, free Palestine, after going through this? 00:30:40.240 |
I can tell you what I believe. I believe we need peace. I believe we need two states. I believe 00:30:45.920 |
those states need to be run by peaceful leaders who want prosperity for the other side. 00:30:50.560 |
Look, I believe we should be able to look at anyone anywhere in the world, but 00:30:55.200 |
certainly the Palestinian people living in Gaza and say any death is too much. One death is too 00:31:02.080 |
much. No innocent lives should be killed. No women, no children, no innocent lives should be killed. 00:31:07.200 |
But I think also as part of that path to peace, there needs to be forgiveness, 00:31:11.920 |
but there needs to be a clear, clear articulation of what is not acceptable ever. And the sexual 00:31:21.440 |
violence is not acceptable ever. If you were Netanyahu, what would 00:31:26.800 |
you do differently? I'm sorry for cutting you off. No, no. I mean, I don't have an answer to peace in 00:31:32.160 |
the Middle East. I don't. I mean, I wish I did, but I do have a very strong view that we are not 00:31:38.320 |
going to get to peace when we are apologizing for or denying crimes against humanity and the 00:31:43.520 |
mass rape of women. Well, that is not the path to peace. The path to peace is not saying this 00:31:48.720 |
didn't happen. The path to peace is saying this happened. No matter what side of the fence you're 00:31:53.760 |
on, no matter what side of the world you're on, if you're the far right, the far left, anywhere 00:31:57.440 |
in the world, we're not going to let this happen again, and we're going to get to peace to make 00:32:00.800 |
sure of it. Denial is not going to get us there. Why has the other side captivated so much of the youth 00:32:06.880 |
in the United States? You're very close to Harvard. Maybe tell us what's gone on at Harvard over the 00:32:11.360 |
last few years. How did we end up in this place where so much of the youth is so sympathetic 00:32:17.040 |
to the Palestinian cause and not as moved as you are by the trauma experienced on the other side? 00:32:24.080 |
I mean, I would throw that question right back to you. I know you've talked about, 00:32:28.960 |
you know, narratives and oppressor and oppressed, and again, polarization is where 00:32:34.800 |
you can only have one view, and you cannot tolerate anything that doesn't fit one view, 00:32:40.880 |
and I don't know of anything that's that clear and that simple. I mean, I'll throw that right 00:32:45.440 |
back to you. You all have been articulate on this, and I think have a lot to say. 00:32:48.720 |
Well, I mean, you said it earlier, Sheryl, this tolerance for ambiguity, this ability, 00:32:54.640 |
the cognitive dissonance to be able to hold in your head that the people of Gaza are suffering. 00:33:00.080 |
Perhaps, I guess the other side would say, you know, they would start down this whataboutism. 00:33:04.800 |
It's not my position, but what about what Netanyahu is doing? What about aid to people 00:33:10.000 |
suffering Gaza? You've addressed that. You don't believe anybody should suffer, 00:33:13.520 |
but I just want to talk a little bit about this conspiracy theory that it didn't happen. 00:33:18.560 |
Also, in the documentary, you chose not to show the graphic 00:33:28.240 |
photos of the savagery that you saw and that you're clearly traumatized by, and a lot of us New Yorkers 00:33:36.320 |
had a similar experience with 9/11 and watching that up close. It is what terrorists do. 00:33:41.200 |
Terrorists do these things to cause massive trauma, to make it impossible to deescalate. 00:33:45.840 |
That is the sadism, that is the pure evil of this brand of terrorism, is to make it impossible for 00:33:53.600 |
the good people of the world to unwind or deescalate. I think part of the process 00:34:00.640 |
is accepting what happened and coming to some truth. The truth can be there are people dying 00:34:07.440 |
unnecessarily in Gaza. There are people starving in Gaza. There are children who are not getting 00:34:12.000 |
food and water. All of those can be true. This horrific sexual sadism and violence that occurred 00:34:18.480 |
is also true. That was awesome. I couldn't have said that better. That was exactly right, Jason. 00:34:27.440 |
That is exactly the point and the path. Sorry, please continue. 00:34:32.000 |
I'm trying to make sense of this. I come to it with humility. This podcast hits certain notes 00:34:39.760 |
with people and, "Oh, how can people in Silicon Valley or whatever discuss these topics?" Listen, 00:34:44.320 |
we're all discussing them. We're all trying to make sense of a very confusing world. 00:34:48.480 |
But you made two choices in the documentary. One was to leave yourself out of it largely. 00:34:53.360 |
Your role in the documentary is to hug people and to cry alongside them and to witness this stuff. 00:34:59.360 |
You talked for, I think, 90 seconds in the whole documentary. I think this was an important 00:35:03.360 |
decision you made. Then you made a decision which I'm not sure if I agree with, which is to not show 00:35:09.040 |
the photos. I am of the belief that people should see what happened on 9/11 as a New Yorker who 00:35:15.360 |
witnessed it and my brothers in the fire department. I had PTSD from it. I think people have to see 00:35:20.560 |
these things. You chose not to out of respect for the family. You should put a note at the end. 00:35:25.920 |
Explain this choice because I know you must have struggled with it. There are photos that you've 00:35:30.480 |
seen of women with their breasts cut off. I don't want to say these things. I know it's very 00:35:35.840 |
traumatic. But I believe people have to understand what's in these photos that you saw. Nails in 00:35:41.360 |
women's private parts. Breasts that have been cut off. This is undeniable. If you want to deny the 00:35:47.040 |
rapes happened or whatever, you cannot deny the photos that you saw. You chose not to put them in. 00:35:52.800 |
I understand that decision. Respect for the family. Take us into that decision. 00:35:57.520 |
Maybe you need to. The woman who chose to do the interview with you, she's so brave. 00:36:06.880 |
She said, "I had to do this because I wanted to combat the denialism." I don't know who The Grayzone 00:36:11.440 |
is. I don't know why people are giving it a ton of attention. The first line of the Wikipedia 00:36:18.480 |
page says it's a fringe website. I'll just leave it at that. I don't know if it is or if it isn't, 00:36:21.440 |
but that's the first line of the Wikipedia page. Is there not a case to be made for making a second 00:36:27.680 |
version of the documentary that shows exactly these things so people can stop denying it? 00:36:33.040 |
Because then you would have to come to the place that the people who are one-sided created fake 00:36:40.640 |
images. Is that what we're getting to in this conspiracy-filled world that the dozens of people 00:36:48.800 |
you interviewed are part of a grand conspiracy and the photos are doctored? 00:36:52.960 |
Just talk about that decision. You must have had an important meeting about that. 00:36:57.840 |
Look, we didn't really have a choice. I agree with you. I think the world seeing this 00:37:05.440 |
would probably be necessary at some point. I do think the deniers will deny. They'll say, "Oh, 00:37:11.120 |
you can doctor any photo, so you're going to have to believe the person who took them anyway." We 00:37:16.880 |
didn't have that choice. These photos are held by people who have taken a vow as part of their work 00:37:24.080 |
as first responders of processing and getting bodies ready for burial that they won't show them. 00:37:29.760 |
We've made this freely available on YouTube, so anyone can watch it. No firewalls. Anyone can 00:37:37.120 |
watch this thing if you're over 18. It wouldn't meet YouTube standards, so that would be taken 00:37:42.800 |
down. We can't show them right now for those two reasons, but I think over time, the world may have 00:37:48.960 |
to see some of them, but I also want to go back to what Chamath said because there are photos. 00:37:53.600 |
There are clear photos and there are clear witnesses, but Rami's story, he took no photos, 00:37:59.040 |
and he will tell you why he took no photos. He covered those bodies so no one would see. 00:38:05.040 |
And so, it's traumatic, and that's why Israel is documenting this, or not Israel, 00:38:10.640 |
actually. I shouldn't say it. Someone in Israel is documenting this, but again, no matter what 00:38:16.160 |
else you believe, I love the way you said it, Jason. You can absolutely believe, I absolutely 00:38:21.680 |
believe that every single person, particularly the private citizens, not the terrorists, 00:38:27.520 |
in Gaza should live in peace and harmony. They should, of course, get aid, but they 00:38:31.760 |
shouldn't need aid because they should have a thriving economy and a state that's their own. 00:38:38.320 |
That doesn't mean sexual violence didn't happen because it is clear it did, 00:38:42.400 |
and the denial is crazy. I was in France. I took some of the witnesses to different 00:38:48.080 |
parliaments, including in the French parliament, and Maurice Levy hosted this beautiful lunch for 00:38:54.640 |
us, and there were all the people who work in civil society, and this woman stood up at this 00:38:59.040 |
lunch, and she stood up and she said, "I'm French. I'm not Jewish. I run a non-profit that works on 00:39:05.920 |
sexual violence and conflict. I've done this work for 30 years. No one's ever questioned my work 00:39:13.040 |
ever until now," and she said, "I think it's anti-Semitism." You look at that New York Times 00:39:19.600 |
article, and I know there's different views of the New York Times. I'm not defending the paper, 00:39:23.600 |
but that article written by Jeff Gettleman and others, he has covered sexual violence 00:39:29.280 |
for decades. He won a Pulitzer for his coverage of this in Somalia, a Pulitzer. 00:39:35.280 |
I did a search. No one's ever questioned it before. Something is going on here, 00:39:40.240 |
and it is a combination of narratives and polarization and anti-Semitism, 00:39:44.640 |
which is getting us to a place where we lose. Yeah, sorry. 00:39:49.120 |
Let's explore that for a second. So, when you see the videos, what you see 00:39:52.560 |
are young people, but you see a lot of young women, and many of the leaders of these 00:40:02.640 |
movements on campuses now, the spokespeople are women. The leadership seems mostly to be women. 00:40:09.680 |
Do you have a reaction to that? Do you have a thought on that when you see 00:40:13.920 |
these folks, that they should maybe be closer to this realization than a man, who could theoretically 00:40:22.960 |
overlook it or try to block it out? But actually the leadership of these organizations 00:40:28.800 |
tends to be mostly women, and they're basically like, "Let's keep going, and it's about this 00:40:32.960 |
resistance." How do you react to that when you see that? It really depends what I see. When I 00:40:38.000 |
see someone peacefully protesting and saying, "Free Palestine," that's good. I want free Palestine. 00:40:46.000 |
When I see people protesting and saying, "We need peace on all sides. We need a ceasefire. 00:40:52.640 |
Of course we need one. We need a permanent ceasefire. I'm for that." Right? When I see people saying, 00:40:58.480 |
"The rapes didn't happen. That's unacceptable." You saw a student at Columbia. I saw it on video. 00:41:04.400 |
I'm sure you did too, screaming at a Jewish kid, "Go back to Poland," or, 00:41:09.760 |
"October 7th is going to happen to you over and over." That's not okay. It really depends what 00:41:16.000 |
they're saying, but again, I'm hoping people watch this documentary so they can see it for 00:41:20.560 |
their own eyes. I'm hoping people wake up and realize that they are capable of holding two 00:41:26.080 |
thoughts at the same time. They just are. What's going to happen at Harvard? What's 00:41:29.840 |
going to happen at the Ivy Leagues? I don't know what's going to happen at any of these schools, 00:41:33.760 |
but I'll tell you, I'm a parent of college-age kids. I've got a kid who was in college for a 00:41:40.480 |
year. I've got a kid going off this year in the fall. Colleges have a responsibility to keep our 00:41:45.680 |
kids safe, full stop, and protect them from hate, full stop. They have the ability to do this. 00:41:54.240 |
They have the ability to do this. It's up to them. Do you think Columbia has done a good job? 00:41:59.120 |
Well, if you were president of Harvard, what would you have done differently, Sheryl? 00:42:01.760 |
I'm not close enough. It's all merging together in my mind. I don't know exactly which protests 00:42:06.800 |
that happen at which schools, but here's what I would do. Yeah. I would have very clear rules, 00:42:12.240 |
which by the way, all the schools have. It's a question of enforcing them. Right. The schools 00:42:16.240 |
that are letting this happen are not enforcing their own rules. Look, I think 00:42:20.720 |
free and open dialogue is important. College is the place you should go to talk 00:42:25.440 |
about the issues from all sides, to have thoughtful conversations, to have deep conversations, 00:42:30.800 |
even maybe to have angry conversations, but not violent. 00:42:35.360 |
I went to Berkeley. It wasn't lunch without a protest. I mean, that's like the daily thing 00:42:39.360 |
you do there. You go grab a sandwich, and you go protest, and you go back to class. 00:42:42.960 |
Great. And I bet you had thoughtful conversations because that's what's made you you. Look at your 00:42:49.600 |
views, David, you're able to articulate multiple complex views. And I bet some of that was from 00:42:54.160 |
Berkeley, where you probably sat with your fellow students and talked to them, right? 00:42:58.400 |
That's not what's happening. I think things are very different. Yeah, I think things are 00:43:03.280 |
very different. These colleges have rules. Some of the colleges there, most colleges have a rule 00:43:08.800 |
that you can't protest in the president's office. There are colleges where the faculty and 00:43:13.360 |
administration, when people are protesting in the president's office, are serving them food. 00:43:18.080 |
There are colleges that say you're not allowed to protest here, go outside. 00:43:21.360 |
But if you feel deep down in your heart that it's a matter of life or death, 00:43:25.360 |
don't you feel justified in having an encampment, setting up a tent, 00:43:29.440 |
living there, showing that degree of conviction, because you're saving 00:43:34.880 |
lives? Versus, hey, I think something is a good idea. Let me go protest for an hour during lunch, 00:43:39.120 |
and then I'll leave. It's never going to move the needle. The question I'm asking as a young 00:43:43.120 |
person is how do I move the needle? And there's not a lot of ways that people feel empowered to 00:43:47.120 |
move the needle. So it seems rational to me to some degree that they want to go into these 00:43:52.480 |
encampments and they want to do something strong and show their conviction. But again, 00:43:57.840 |
I think that there's a question of how much truth anyone is willing to see, how much folks are willing 00:44:02.800 |
to embrace the other side? How much are they willing to listen? I see very little listening, 00:44:06.720 |
very little dialogue going on. Because then you put up a list of demands that are unmeetable. 00:44:11.680 |
And, you know, you refuse to have a conversation and you refuse to listen to the 00:44:16.400 |
other side. And you take this hardened view that doesn't allow for progress. And I think it's the 00:44:22.000 |
hardened views on all sides that's limiting progress entirely. Unfortunately, the youth 00:44:26.400 |
have been subsumed by this. And it's really frustrating to see because I worry about what 00:44:31.600 |
that is. I think it's not necessarily the youth, meaning, I think you're seeing it in some very 00:44:37.760 |
specific places that cater to a very specific kind of youth. You see them at Columbia, Harvard, 00:44:44.080 |
Berkeley, UCLA, these specific places that are bastions of privileged kids. For the most part, 00:44:50.000 |
these are extremely elite institutions that typically allow in kids that have been coached 00:44:57.280 |
their entire lives to get into those schools. And I think that they're coming there with a lack of 00:45:02.240 |
fulfillment. And it reminds me at some level of how people reacted to Occupy Wall Street. 00:45:10.480 |
Meaning there were a whole bunch of young people there that probably didn't even know what the 00:45:15.920 |
whole Occupy Wall Street movement was about. Well, they showed up- 00:45:19.280 |
They said themselves, Chamath, it was a platform for whatever your grievance was. That was their 00:45:23.200 |
stated mission. Yeah. And I think what they found a decade and a half ago or so was community in 00:45:29.280 |
this weird way. Yes. Right? The physical interaction of other people where you had this intimacy around 00:45:34.320 |
a thing. I'm not condoning Occupy Wall Street, just like I'm not condoning what's happening on 00:45:37.920 |
campuses. But I think psychologically, what kids are looking for is that level of attachment. 00:45:42.960 |
And to your point, David, something that they can feel strongly about. And I think they end up 00:45:49.040 |
getting to the age of 18 and 19, not having felt strongly about anything, because they were working 00:45:53.600 |
on playing nine sports and 14 instruments and all this other bullshit to go to these schools. 00:45:59.120 |
And then they get there and they feel a little empty. And sometimes negative things can fill the 00:46:04.240 |
void. What I was going to say is I think the answer lies in what we're saying. David, you 00:46:09.600 |
started out by saying, how are we going to get to progress? Well, screaming at each other is not 00:46:12.880 |
going to get to progress. I don't have an answer for peace in the Middle East, but universities 00:46:17.840 |
play roles in getting there. Thoughtful, hard conversations. Let's look at the real history. 00:46:24.400 |
Let's look at who the leadership could be. Let's look at what kind of leaders we need on both sides. 00:46:29.280 |
Let's look at what the international community could be doing. Those answers could come out of 00:46:34.240 |
universities. Some of those college students, if they weren't reading five things they don't 00:46:38.720 |
understand, could help us get there. And I think these protests are getting in the way 00:46:43.040 |
of the thoughtful dialogue. And I honestly think part of what happens with cancel culture, 00:46:49.680 |
is: I don't want to listen to another view. On all sides. Really? Why don't you say, 00:46:55.200 |
I want to listen. My friend Adam Grant wrote a great book called Think Again. I wish everyone 00:47:01.440 |
in the whole world would read that damn book. Think Again. Think Again means you might 00:47:05.520 |
not be right about everything. Think Again means you need to listen to the other side. We're 00:47:11.840 |
never going to get there without that kind of thoughtful dialogue. 00:47:14.320 |
Well, yeah, and that's exactly where I wanted to go with it, Sheryl, which is if you steelman, 00:47:19.120 |
if you look at their perspective, and you look at the beauty of feminism and femininity, 00:47:25.920 |
and you wrote a book, Lean In, and you are an expert on this, 00:47:29.840 |
having compassion for people who are suffering is absolutely beautiful. It is the best of humanity, 00:47:37.280 |
and I think it's the best of femininity and women, is that they have this incredible gift of empathy 00:47:43.600 |
that as men, maybe we are far behind on. And so it does not surprise me that women lead these 00:47:49.120 |
organizations when they see suffering. And if you see children suffering, women are in a unique 00:47:54.000 |
position in their life experience to understand the value of children, of family, and of suffering. 00:47:59.840 |
And, you know, I can understand an impressionable young person seeing the videos coming out of Gaza 00:48:06.560 |
of a baby dying in a bombing and collateral damage, and being devastated and saying, you know what, 00:48:10.960 |
I have to fight for these poor children. It is completely noble in their mind. In fact, 00:48:17.440 |
it might be noble, I mean, to fight for peace. And so, you know, I can understand their positions, 00:48:26.000 |
and I don't actually disagree with them. But then you start looking at the reality 00:48:31.200 |
of getting the hostages back. And if this was an American situation, and we actually have a 00:48:38.480 |
corollary in 9/11, we didn't go to Afghanistan to get hostages, we went there to get retribution. 00:48:45.920 |
So if America went there to eliminate this threat, and we also took out another country, 00:48:55.600 |
just for good measure, that wasn't even involved in it, you know, it's such a complex issue. And 00:49:00.800 |
we're in the fog of war, I think everybody pausing for a second here, and just remember 00:49:04.960 |
how confusing it was after 9/11, how confusing it was. And we had to figure out, wait a second, 00:49:10.560 |
these were Saudis. These were this radical group, the splinter group, like, it takes a while to 00:49:15.680 |
figure out what's going on here. And I do think on these campuses, they should allow them to protest, 00:49:22.000 |
but there's outside agitators. That seems to me to be completely unacceptable, to have 40-, 00:49:26.000 |
50-, 60-year-old lifetime agitators on these campuses. Allow these kids to protest, but to 00:49:34.160 |
chase Jewish kids around the campus and then surround them and threaten them in 2024. I mean, 00:49:41.120 |
I can't understand what's happening and how could an administration, Sheryl, allow students to 00:49:49.360 |
threaten other students and not immediately make a snap decision. It's a decision. You just said it. 00:49:56.000 |
It's a decision. It is a decision. Absolutely, expelled. If a set of Jewish students 00:50:01.760 |
surrounded a Palestinian student, an Islamic student, a Muslim, and chanted at them about 00:50:09.120 |
what happened October 7th and made them feel threatened, expel them as well. 00:50:12.640 |
There's just some basic, basic rules of the game they're not enforcing. It's absolutely infuriating, 00:50:18.240 |
but I just want to make sure I stay on that other side. And you did, you know, I think very 00:50:22.400 |
eloquently say you also agree that the suffering in Palestine needs to end. I know we're out of 00:50:29.360 |
time. Can I say one thing? I really want to say this. You can say anything, Sheryl, as much as 00:50:32.640 |
you like. You have time. I really want to thank you all for this because two things happened in 00:50:36.960 |
the last hour with you. One is that you were really passionate against the sexual assault and 00:50:44.720 |
really clear. And as much as we need women to believe this, we need men and male leaders. 00:50:50.160 |
And so, your voices, like I could feel the passion on this and I'm really grateful because that gives 00:50:56.720 |
me hope. Like I am, it's such a dark moment. It's such a dark moment for democracy. It's 00:51:01.360 |
such a dark moment for Jews. It's such a dark moment for women, but this really gave me hope. 00:51:06.400 |
And the second is your tears for Dave. Thanks. It's been nine years. In a lot of ways, we've moved on. 00:51:16.080 |
You have friends. I have a wonderful life that I'm so grateful for, but the world still lost 00:51:24.560 |
a really, really, really special person. And I can see how much that means. I knew this, but-- 00:51:31.520 |
There are very few things, as you grow older, that you realize in life that matter, and-- 00:51:41.520 |
You know, at the end of the day, Chamath, we have-- 00:51:43.440 |
You have family and your friends. That's all you have. 00:51:45.280 |
I think about Dave frequently. And I just think at the end of your life, what you have 00:51:54.560 |
is but a collection of memories. And the memories we have with Dave, the laughter, the joy, the-- 00:52:02.480 |
Fake Chamath, yeah. His wit, his insights. You know, we would be sitting at that poker table, 00:52:08.640 |
and it was like we're all like 15, 16-year-olds, and we got this big brother who's 20. 00:52:14.240 |
And, you know, we'd be bickering and laughing, whatever. He'd come in and say, "Hey, guys, 00:52:19.040 |
how about this and this?" So, Chamath and I would be jawing at each other. He'd say, 00:52:23.280 |
"Hey, guys, let's calm it down a little bit and let's have a good time," whatever. 00:52:27.040 |
Thank you guys so much. I have to go to my board meeting. 00:52:29.440 |
This was as deeply meaningful as it could have been. Seriously, thank you. 00:52:36.560 |
We didn't get to talk about anything business. 00:52:40.720 |
I beg you to come back to the-- either come to the summit and be long or short Nvidia. 00:52:52.800 |
If you were to change any paragraph of the Lean In book. 00:53:00.720 |
Wow. I need to take a deep breath here, Chamath, Freeberg. This was super emotional for me. 00:53:09.120 |
I didn't know if I could do it. I'll be honest. I have so many emotions, Chamath, 00:53:13.840 |
about Dave. I have so many feelings about this situation. 00:53:16.960 |
I watched the documentary. I thought the most important thing is there are these, 00:53:22.800 |
you said it, Jason, in the fog of war, there are things that happen that are just wholly 00:53:28.080 |
unacceptable. I remember when I was getting older and I was curious, why did my family 00:53:36.720 |
not go back to Sri Lanka? And the Tamils, which is a small Hindu minority in 00:53:44.720 |
the majority Buddhist population, why did they feel so out of sorts? And we were part of the 00:53:49.600 |
Buddhist majority. And when you insert yourself into that struggle and understand where they're 00:53:55.440 |
coming from, it's jarring because you have to really re-underwrite, okay, what are we fighting 00:54:02.320 |
for? What are they fighting for? And the most important thing that I got to is what is allowed? 00:54:06.720 |
Because then you would see things. And the unfortunate part of Sri Lanka's history was 00:54:13.840 |
in the final parts of the war that ended it, there were some incredible atrocities that were committed. 00:54:19.840 |
And the United Nations and international court system tried to find justice for the Tamil 00:54:27.840 |
minority population in what happened in those final hours of that war. I don't think that they 00:54:33.440 |
did for the most part. But it's just to show you that these things leave deep wounds that, frankly, 00:54:42.320 |
can be reopened in a moment. So it's very important that I think these things are, 00:54:48.800 |
and I hate to say it so unemotionally, but documented. >> Absolutely. Absolutely. 00:54:53.840 |
>> For those that don't even understand the Holocaust, if you go to the Holocaust Museum, 00:54:58.560 |
if you're lucky enough to do it in Israel, I would encourage you to do it, but even in Washington, 00:55:02.240 |
you know the totality of what happened. There's certain places that document these important 00:55:08.960 |
moments in history. And if this is one of those moments to the Israeli people, 00:55:13.200 |
I just encourage them, please make sure that you minimize the mis- and disinformation. 00:55:19.200 |
As complicated as that may be to do, it is incredibly important so that you can create-- 00:55:24.960 |
>> And doing so does not dissolve empathy for the other side's cause or for the other side's 00:55:33.520 |
motivations or objectives. Having empathy for the circumstances that happened here is the equivalent 00:55:38.880 |
of having empathy for the plight of the Palestinian people and what they're dealing with today 00:55:43.920 |
following October 7th. And I think that we need to recognize that both things can be true. 00:55:49.040 |
We can have empathy for both sides. >> Yeah. And by the way, humans have 00:55:54.080 |
a way of making decisions which I think is pretty predictable, which is once you have a point of 00:56:00.320 |
view, there are things that you believe are facts, and then there's all this other stuff 00:56:05.920 |
that you have degrees in which you believe that are essentially conjecture. The most important 00:56:12.320 |
thing in really important debates is to move something from that gray zone into the box of facts. 00:56:19.280 |
>> And that is the only way that causes people to re-underwrite their principled views. It doesn't 00:56:25.040 |
matter what topic we're talking about. So the more that we're able to document and actually 00:56:30.720 |
make these things unambiguous, I think it actually has a really important role to play in how these 00:56:36.000 |
young people view what it is that they're a part of. I'm totally pro-protesting. I'm totally in 00:56:42.800 |
support of standing up for the things that you believe in. I'm not in support of overlooking-- 00:56:51.920 |
Chamath. And the response, I can tell you, to this episode and the response I got for just tweeting, 00:57:00.080 |
"Hey, this is an important documentary to watch," is the whataboutism, the other side, 00:57:06.160 |
and documenting what's happening in Gaza. And we have this search for truth right now, 00:57:10.720 |
which is very difficult because institutions have a lot of self-inflicted wounds. We live in an age 00:57:19.200 |
of conspiracy theory, and there are reports of crisis actors in Gaza creating fake deaths and 00:57:30.640 |
fake videos. So now you have one side saying, "Oh, the people of Palestine or Hamas, the numbers 00:57:38.560 |
aren't correct on the number of people who died. The suffering's not correct. These images aren't 00:57:42.000 |
correct." The fog of war is going to be thick for a while here, folks, and it's going to take us a 00:57:46.080 |
while. And Chamath's exactly right. You got to document this. You got to get to some ground 00:57:49.120 |
truth. You got to get to some common facts so we can all objectively look at those common facts. 00:57:55.120 |
And, you know, listen, it's a shame David Sachs couldn't make it today, but he's really missed 00:57:59.520 |
here because, you know, we have that same thing with the war in Ukraine. It's very hard for 00:58:05.760 |
us in this current media landscape where we're quoting from news sources and anonymous Twitter 00:58:13.200 |
accounts, fake videos. It's going to get worse with AI. It's going to be harder and harder to 00:58:18.320 |
find the truth. And this is where your own personal morality, ethics, and I'm not sure who 00:58:23.120 |
brought this up during our talk because I'm emotionally spent, I got to be honest. It's a 00:58:27.920 |
little hard for me to collect myself here. But, man, you know, you have to have some basic moral 00:58:34.880 |
principles here. Children, women, rape, sexual assault, we all can agree on this. You said this 00:58:44.160 |
in the week after October 7th, Freeberg, you had a very powerful moment on the show, that you don't 00:58:50.400 |
want to have to decide between October 7th being horrific and children dying in Gaza being horrific. 00:58:59.440 |
And you don't want to have to be painted with one side or the other. You want to 00:59:03.520 |
believe as a moral person, that all suffering needs to end. And we collectively as a species, 00:59:11.040 |
in 2024 on this planet, can work together to just agree that certain things should never happen. 00:59:20.080 |
And to try to resolve these horrible conflicts. I'm so spent right now. And it was just very 00:59:26.240 |
difficult for me to watch that documentary. I don't know where we go from here, gentlemen. 00:59:29.680 |
I'm fine ending the show here. We're taking a 10 minute break and then maybe doing one 00:59:33.600 |
or two new stories. Take a break. We'll come back. Let's take 5-10 minutes. 00:59:37.360 |
All right, everybody. Welcome back to the program. Yep, it's not easy to do a pivot here. 00:59:52.160 |
But we collected ourselves, took a deep breath. And you've all been asking for a science corner. 00:59:57.840 |
And so, there's a really important story that Freeberg has been educating us about on the 01:00:03.920 |
group chat. There's a startup that just open sourced an AI gene editor. Yes, you heard that 01:00:07.680 |
right. Open source gene editor, powered by AI. It's called ProFluent Bio, am I correct? ProFluent? 01:00:15.040 |
Yeah, Berkeley based startup ProFluent Bio. Great. Have we talked about CRISPR and gene 01:00:19.440 |
editing before on the show or no? I think we have mentioned it. It would 01:00:22.400 |
be good as a primer for you to just explain from first principles, what is CRISPR? Why 01:00:26.880 |
it's important, and then get into this. So, there's debate around who discovered 01:00:31.520 |
CRISPR-Cas systems first and found their application. But generally- 01:00:34.800 |
Whose side are you on? The Jennifer Doudna side or the MIT side? 01:00:39.520 |
I'm an open source guy, which is why I'm excited about this topic today. Because I don't give a 01:00:43.840 |
shit. I think things that are in nature are in nature. And I don't think you should be 01:00:46.960 |
able to patent stuff that you discover in nature. So, let's step back for a second, 01:00:51.600 |
Freeberg. Explain what CRISPR is to somebody who's heard the term but doesn't actually know 01:00:55.360 |
in your unique ability to explain science. Yeah. So, CRISPR-Cas, C-A-S, Cas proteins, 01:01:04.240 |
C-A-S proteins, are proteins that can go into a cell. And they have what's called a guide RNA, 01:01:13.440 |
little piece of RNA attached to them, that allows that protein to find its way to a specific point 01:01:20.720 |
in DNA in that cell, in the nucleus of that cell. And when that protein hits that specific location, 01:01:26.800 |
it cuts it like scissors. And so, the protein finds the part of the DNA it's looking to cut, 01:01:31.600 |
attaches itself, cuts the DNA, and a cut is made. And so, this capability was discovered actually 01:01:38.400 |
in bacteria. And it was an evolved system that bacteria developed to actually protect themselves 01:01:45.360 |
from viruses. So, the CRISPR-Cas complex emerged through evolution, where bacteria started to 01:01:52.880 |
figure out that they could cut up viral DNA. So, they made these proteins, these proteins would 01:01:57.760 |
attach to viral DNA and destroy the viruses that came into the bacteria cells. So, scientists, 01:02:05.120 |
arguably from Harvard, from Berkeley, and from other places around the world, in the early 2010s, 01:02:11.040 |
started to do research and identified ways that we could leverage these proteins that we were 01:02:15.760 |
discovering in nature to do targeted DNA editing in human cells and plant cells and other cells. 01:02:22.720 |
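Freeberg's guide-RNA-as-key picture can be sketched in a few lines of Python. This is a toy illustration, not real bioinformatics: the `find_cut_site` and `cut` helpers and the sequences are invented for the example. The two real details it borrows are that Cas9 requires an "NGG" PAM motif immediately downstream of the target and cuts roughly 3 base pairs upstream of that motif.

```python
# Toy sketch of CRISPR-Cas targeting: a 20-letter "guide" steers the
# molecular scissors to one exact spot in a DNA string.
# Illustrative only -- function names and sequences are invented here.
# Borrowed real details: Cas9 needs an "NGG" PAM motif just downstream
# of the target, and cuts ~3 base pairs upstream of it.

def find_cut_site(dna: str, guide: str, pam: str = "GG") -> int:
    """Return the index where the cut lands, or -1 if there's no target."""
    i = dna.find(guide)
    while i != -1:
        pam_start = i + len(guide) + 1   # skip the "N" of the "NGG" motif
        if dna[pam_start:pam_start + len(pam)] == pam:
            return i + len(guide) - 3    # cut ~3 bp upstream of the PAM
        i = dna.find(guide, i + 1)       # keep scanning for a valid site
    return -1

def cut(dna: str, guide: str) -> tuple[str, str]:
    """Simulate the double-strand break: split the DNA at the cut site."""
    site = find_cut_site(dna, guide)
    if site == -1:
        return (dna, "")                 # no valid target: DNA left intact
    return (dna[:site], dna[site:])

genome = "ATG" + "GCCATTGTAATGGGCCGCTG" + "AGG" + "TGCCCGATAG"
left, right = cut(genome, "GCCATTGTAATGGGCCGCTG")
print(left, "|", right)   # the cut lands 3 letters inside the guide match
```

Real targeting also tolerates some mismatches between guide and DNA, which is exactly the off-target problem discussed later; this sketch only matches exact sites.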
So, rather than them just being used as a defense mechanism by bacteria, that we could harness these 01:02:28.160 |
proteins and make them useful to go in and do specific gene editing. Now, why would we want to 01:02:33.200 |
do gene editing? Gene editing, if done precisely enough and efficiently enough, would allow us to 01:02:41.440 |
go in and fix genetic diseases in humans, for example. It would allow us to take T cells and 01:02:47.600 |
reprogram them to go and attack cancer cells back in the human body. It would allow us, in the case 01:02:53.600 |
of agriculture, which I'm very close to and what I work on every day, to figure out ways to make 01:02:58.560 |
specific changes to the genes of a plant to make that plant grow in higher yield or change itself 01:03:04.880 |
to be disease resistant or drought resistant or other features that might be helpful to agriculture 01:03:09.920 |
and to humanity. So, gene editing became this amazing toolkit that emerged around 2012, 2013, 01:03:16.240 |
and just blew up on the market. And the main original foundational patents, which are now 01:03:21.920 |
mostly held after a lot of litigation by the Broad Institute, which is, you know, there's this kind 01:03:27.280 |
of joint patent arrangement with the Broad and MIT and Harvard, are being used in medical 01:03:33.520 |
applications, are being used in agriculture applications, they're being used in all these 01:03:36.640 |
different tools, but they're patented, there's royalties, there's fees, all this stuff. And in 01:03:41.360 |
the years that followed, many other Cas proteins started to get discovered, all these different 01:03:46.080 |
types of proteins were discovered. And the reason you want to use different proteins is you want to 01:03:50.240 |
improve the efficiency. So how frequently or how good are these proteins at editing the cell 01:03:55.280 |
and eliminating off-target effects, meaning the protein isn't making cuts or making changes to 01:04:01.040 |
other parts of the DNA that you don't want it to. So there's been this search underway for the last 01:04:05.520 |
decade for new Cas proteins and developing new Cas proteins, and dozens have been discovered, 01:04:10.640 |
people are trying to patent them, people are trying to make them do special things, 01:04:15.040 |
like ones that can only change one letter; all these different tools are emerging. So we went from 01:04:20.000 |
having absolutely no ability to do gene editing just over a decade ago, to suddenly having all 01:04:25.040 |
of these different tools that could do gene editing really efficiently, really cheaply, 01:04:30.160 |
really affordably, really scalably, and more precisely. So this company ProFluent, they 01:04:35.280 |
actually used an AI model, what they call a protein language model, to create and train 01:04:41.440 |
an entirely new library of Cas proteins that do not exist in nature today. So they basically took 01:04:49.360 |
26 terabases, that is 26 trillion letters, of assembled genomes and metagenomes, this is from 01:04:57.760 |
various species, and started to simulate new Cas proteins that could be useful to replace the ones 01:05:04.480 |
that are on the market today or improvements on what's in the market today. And they found one 01:05:09.280 |
that they called OpenCRISPR-1, and they made it publicly available under an open source license. 01:05:15.680 |
So any startup, any research lab, any individual, any scientist can use this particular 01:05:21.360 |
Cas protein to go in and make edits without having to deal with patents and IP and claims on 01:05:29.200 |
who owns what that they found in nature. And this particular protein that they identified 01:05:35.600 |
is 400 mutations away from anything that they've seen in nature. So basically, 01:05:42.080 |
the AI model started to learn what sequence of DNA generated what structure of protein 01:05:49.120 |
that was really good at being a gene editor. And they started to discover and iterate on building 01:05:54.080 |
new ones. And the AI started to predict, hey, this would be a good gene editor, this would be a good 01:05:58.560 |
gene editor. And they came up with dozens of new gene editing molecules that don't exist in nature 01:06:03.360 |
today. They identified one that they then sequenced, they created it, they put it in a lab, 01:06:08.880 |
they tested it, and it turned out to be much better than Cas9. 01:06:14.880 |
So they used an AI model to find a new Cas protein and a new guide RNA. So the guide RNA is 01:06:24.320 |
just RNA. That's like the key. Think about it: a CRISPR-Cas system has two components. One 01:06:31.120 |
is the Cas protein. That's the giant protein that goes in and cuts DNA. And attached to it is what's 01:06:38.160 |
called a guide RNA. The guide RNA is the specific letters. And those specific letters are like a 01:06:43.280 |
key and a lock; they attach to a particular part of the DNA. And then that giant protein cuts 01:06:47.520 |
in that exact spot. And so what everyone's been working on is new proteins. And 01:06:51.760 |
they've been trying to find new Cas proteins that aren't going to go do off-target cutting, 01:06:55.840 |
that aren't going to make mistakes, that are going to be perfect at making the exact cut you want to 01:07:00.080 |
make. So everyone's always trying to improve the efficiency and reduce the off-target effects of 01:07:04.320 |
these systems. And so what they did is they tried to create a new protein that doesn't exist by 01:07:09.360 |
learning from all of these other Cas proteins that exist in nature today, and identifying the 01:07:14.240 |
three-dimensional structure of them, and allowing the model to predict a Cas protein that might 01:07:18.320 |
actually be better than anything that's been found. So it works around every single existing patent. 01:07:23.200 |
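The generate-then-filter workflow described here, train on natural sequences, propose novel proteins, and keep only candidates that score well and sit far from anything found in nature, can be sketched as a loop. Everything in this sketch is a stand-in: the mutate-and-score functions are hypothetical placeholders for Profluent's actual protein language model, which is not reproduced here; only the shape of the search mirrors the description (including the "N mutations away from nature" filter).

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def mutation_distance(a: str, b: str) -> int:
    """Count position-wise differences between two sequences
    (a crude stand-in for 'N mutations away from nature')."""
    return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))

def propose_variant(seq: str, n_mutations: int, rng: random.Random) -> str:
    """Sample a candidate by mutating random positions of a natural sequence.
    A real protein language model would instead generate residues
    from learned sequence statistics, not uniformly at random."""
    s = list(seq)
    for pos in rng.sample(range(len(s)), n_mutations):
        s[pos] = rng.choice(AMINO_ACIDS)
    return "".join(s)

def score(seq: str, rng: random.Random) -> float:
    """Placeholder for the model's predicted editing quality in [0, 1).
    In the real pipeline, the trained model's judgment lives here."""
    return rng.random()

def search(natural: str, n_candidates: int, min_distance: int, seed: int = 0) -> list[str]:
    """Generate candidates, keep ones that are high-scoring AND at least
    `min_distance` mutations away from the natural protein."""
    rng = random.Random(seed)
    keep = []
    for _ in range(n_candidates):
        cand = propose_variant(natural, n_mutations=min_distance + 10, rng=rng)
        if score(cand, rng) > 0.9 and mutation_distance(cand, natural) >= min_distance:
            keep.append(cand)
    return keep

# A made-up 60-residue "natural Cas-like" sequence, purely illustrative.
_r = random.Random(1)
natural_cas = "".join(_r.choice(AMINO_ACIDS) for _ in range(60))
hits = search(natural_cas, n_candidates=1000, min_distance=20)
print(f"kept {len(hits)} of 1000 candidates")
```

The design point is the filter: a candidate must both look good to the model and be genuinely novel relative to natural sequences, which is how a result like "400 mutations away from anything seen in nature" becomes a selection criterion rather than an accident.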
Well, that's going to be tested in the courts later, I'm sure, but they open sourced it. So 01:07:27.840 |
they're not claiming any IP on it. They're not making any claims on it with the patent office. 01:07:32.000 |
And they're saying, "Look, it's free and available." I'm on the same page as you on this: 01:07:35.920 |
this was occurring billions of years ago. And it just took us billions of years to actually 01:07:41.600 |
observe it occurring naturally in nature. It's absolutely ridiculous that a patent would be 01:07:45.200 |
granted on that. Now the implementation of that in a commercial use case, that's fine. 01:07:50.080 |
Be able to know this is happening, I guess, in the psychedelic space with psilocybin, MDMA. 01:07:58.960 |
No, no, I'm just saying, there are drug companies now that are really realizing the efficacy at 01:08:03.520 |
Johns Hopkins, Stanford, where they're doing these, and then they're trying to figure out 01:08:06.800 |
how do we take something that's occurring naturally, psilocybin in mushrooms, referred 01:08:11.680 |
to colloquially as magic mushrooms. And then how do we bear hug this so that we 01:08:15.840 |
can patent it? How can we own the implementation of it, as you're saying? And so it's fascinating 01:08:21.840 |
to me, taking nature and trying to patent nature. 01:08:24.880 |
Right. There's a simple truth to all this, which is every single life sciences lab on Earth is 01:08:31.360 |
using this technology today. It has absolutely revolutionized life sciences. It has changed 01:08:37.280 |
everything. It has reset the trajectory of human health, of agriculture, and of industrial 01:08:42.720 |
biotechnology. Those are the three major markets where gene editing is useful. It is changing 01:08:47.840 |
everything. And so it is already a ubiquitous tool. We basically created a software engineering 01:08:53.760 |
capability for DNA, for life. And so this system that these guys just published on, I think, 01:08:58.880 |
is a really wonderful manifestation of how AI is allowing us to open source and create improved 01:09:05.600 |
tools. And it's really important for humanity. And so it's just great to see happen. 01:09:12.880 |
Yeah. Okay. Two questions for my baby brother with the science brain. Number one, 01:09:17.440 |
the LLMs here, how, maybe you speak to the efficacy of LLMs when applied to this vertical, 01:09:25.840 |
because it's a constrained data set, I believe. So it feels to me analogous to 01:09:30.800 |
code, whereas like, human language, images, videos, you know, building an LLM around them, 01:09:36.800 |
you have, like, a pretty large corpus. It seems to me code is constrained, video games are constrained. 01:09:42.640 |
And of course, maybe gene editing is constrained. I'll let you answer that. And 01:09:46.400 |
the LLM component here. And then maybe you could speak to what this will do to the startup community, 01:09:52.480 |
being able to leverage this open source tool. Have startups started to pop up around this yet? 01:09:58.480 |
Is there a .org/.com kind of equivalent here? We have wordpress.org, the open source version of WordPress, 01:10:04.880 |
and wordpress.com, the hosted paid version. Are we gonna see a bunch of .com versions of this and 01:10:10.080 |
different startups? Take the two questions whichever way you like. I'm not super familiar. 01:10:15.680 |
I've met the ProFluent guys a couple times. I'm not super familiar with what their business model's 01:10:20.320 |
going to be. But I hate that startups are worried or feel encumbered by the patent landscape 01:10:34.560 |
associated with CRISPR-Cas systems, and that they can't build novel products and move humanity 01:10:41.120 |
forward. Okay, so it's a blocker for humanity. I'm hopeful that we do see more of 01:10:45.600 |
these open source tools become available and ubiquitous. It's almost the equivalent of having 01:10:51.120 |
Linux, you know, as an operating system that everyone can use, or HTML being, you know, 01:10:57.760 |
standard code. I don't know if you remember, before HTML5, a lot of people were using 01:11:04.480 |
Macromedia Flash. Oh, that was a huge blocker. Yeah, it was a huge blocker. So you had to pay 01:11:10.320 |
the license fee to create Flash content. And then, I don't know if they sold consumer 01:11:15.120 |
plugins. Well, yeah, they were rug pulled, right? They could change their mind. And they were trying 01:11:19.440 |
to make money on both sides. And Microsoft did ActiveX to try to be a blocker 01:11:24.480 |
and own the standard. So, yeah, in order to put multimedia on the internet, 01:11:28.720 |
you used to have to pay license fees. And then HTML5 basically created multimedia 01:11:34.800 |
capabilities native to HTML, which is open source. And so everyone could do it. And I think 01:11:39.600 |
that it's really important that we see that happening with gene editing. I think all the 01:11:42.800 |
applications of gene editing should be patentable and protectable. But the core tools are so powerful 01:11:47.920 |
and important that I think it's hard to see how we are accelerating humanity's 01:11:54.720 |
progress by keeping these things at bay. And it's really great to see open source tools like 01:12:00.160 |
this hit the market. I think it's really important, and I think it's really amazing. 01:12:04.800 |
Yeah, tell me about the LLM side here, the large language model being built around this data, 01:12:10.960 |
how, what's the efficacy of that going to be like, and is my analogy of a constrained data set, 01:12:17.600 |
meaning it will be able to perform at a higher level, like we see with code, 01:12:21.440 |
and then copilots for code? Well, they created their own LLM; they call it a protein 01:12:28.880 |
language model. So they took all of this genome data that is generally very publicly available, 01:12:34.480 |
there's a lot of this stuff published in open genome databases, you can download it, 01:12:39.040 |
ingest it and use it for whatever purposes you want as a life sciences researcher. So they took 01:12:44.480 |
26 trillion base pairs of data and basically used that to train their model. And then using that 01:12:52.880 |
trained model, they then started to run inference on it, to say, come up with Cas systems that are 01:12:58.000 |
novel, that could theoretically have efficacy greater than what we see in nature with the 01:13:03.440 |
natural Cas systems. And then the model started to output all of these novel proteins, and then they 01:13:07.520 |
started to test them. And they found that this one worked really, really well after testing in 01:13:12.640 |
the lab. So actually, here's a great image. So here you can see, basically in training the model, 01:13:18.160 |
so it's a little bit technically complicated what they did in the steps to generate this system. 01:13:23.040 |
But ultimately, the system yielded something that they could then create, put in a lab environment, 01:13:28.240 |
and in the lab environment, test how well it worked. And what they showed was that it actually 01:13:32.000 |
worked better than Cas9, which is the primary gene editing protein used today. So, you know, 01:13:37.760 |
pretty powerful set of steps and all unlocked by, again, freely available data and building their 01:13:44.320 |
own model and now ultimately open sourcing the best output of it. Okay, well, amazing job. If 01:13:52.000 |
you missed any of those graphics because you're listening to the podcast, go to YouTube and search for 01:13:57.280 |
the All-In Podcast if you want to see those graphics. All right, gentlemen, this has been another 01:14:01.120 |
amazing episode of the All-In Podcast. For your Sultan of Science, David Freeberg, the Chairman 01:14:07.360 |
Dictator, and David Sachs, who couldn't make it today, I am the world's greatest moderator. 01:14:13.760 |
Rest in power, Dave Goldie. We love you. We miss you. And we'll see you all next time 01:14:31.200 |
And it said we open sourced it to the fans and they've just gone crazy 01:14:52.800 |
We should all just get a room and just have one big huge orgy because they're all just useless. 01:14:58.560 |
It's like this like sexual tension that they just need to release somehow. 01:15:02.720 |
What, you're the bee? We need to get merch.