E157: Epic legal win, OpenAI's news deal, FCC targets Elon, the limits of free speech & more
Chapters
0:00 Bestie intros: Mullets!
2:56 Recapping Friedberg's holiday party
9:29 Jury rules in favor of Epic Games over Google: How to handle the app store duopoly?
23:21 OpenAI inks deal with Axel Springer
35:02 FCC cancels Starlink subsidy, dissenting FCC Commissioner says federal agencies are targeting Elon Musk on Biden's orders
58:25 Alex Jones reinstated on X
82:59 Sacks receives an unlikely apology
87:32 Besties take two questions from the audience
00:00:00.000 |
We're going mullet this week in your honor. You're the closest to a 00:00:05.680 |
mullet right now, Sacks. I see you trying to tuck the lettuce 00:00:07.880 |
in. It's not gonna work. We see it back there. I need the 00:00:10.480 |
ponytail. Wow, you went full knot. Where's the secret camera 00:00:16.080 |
in his room? What is this? No, my kids took it. Are you really 00:00:20.700 |
doing a douche knot? One of my daughters is playing with my 00:00:23.280 |
hair. She wants to see if she can make a ponytail with it. So 00:00:25.840 |
she made a ponytail and took a photo. So we sent it into Chat 00:00:29.200 |
GPT to ask who it looked like. And it said Thomas Jefferson. 00:00:32.280 |
Just serious question. Did you do a fit check with Tucker 00:00:35.000 |
on that? Did you send him that and say fit check? 00:00:37.040 |
You know, not everything has to do with Tucker. J. Cal the 00:00:40.220 |
If you guys don't know what a fit check is, as your daughters 00:00:43.960 |
away. What's a fit check? A fit check, you take a 00:00:47.880 |
picture of yourself, you send it to your friends, you say fit 00:00:49.800 |
check, and then they tell you if you look good for the day. Oh, 00:00:52.360 |
okay. It's kind of like it's kind of like a wellness check. 00:00:57.480 |
Friedberg sends you, I'm having anxiety about this. And then we 00:01:00.680 |
do a wellness check on Friedberg. See who's going to 00:01:03.240 |
Or we do a wellness check on you and Alex Jones comes back to 00:01:07.000 |
I'm putting it out there right now. And Alex Jones is on the 00:01:09.520 |
back half of the show just to tease it. Just like I'm going to 00:01:12.520 |
tease these photos. I pulled the archives and here I am in 1984 00:01:17.480 |
with my mullet. That's a J. Cal mullet from 1984 in Staten 00:01:21.840 |
Island on the way to a Boy Scout trip, but I thought we'd have a 00:01:24.080 |
little fun. Sax actually has been working. He's got his 00:01:28.720 |
hairstylist. Yeah, well, the great Lord of the Rings 00:01:33.600 |
character, like an elven warrior. No, he looks like an 00:01:37.600 |
arrow. Exactly. So Nick will punch that up. But surprisingly, 00:01:42.560 |
surprisingly, I can't look. It's horrible. Friedberg, you're not 00:01:48.080 |
getting away. Looks like I think we should all go this way. He 00:01:54.880 |
looks like Orlando Bloom in Lord of the Rings, right? 00:01:57.040 |
Actually, he looks more like that vampire movie. He does 00:01:59.920 |
look like Interview with the Vampire. Or, yeah, he 00:02:03.400 |
looks like he's teaching gender studies at Berkeley, and he's 00:02:07.360 |
non-binary. So if you want to take Understanding 00:02:11.080 |
Non-Binary Gender Studies 101, it's coming this spring. Friedberg 00:02:16.320 |
should be teaching at his alma mater. Well done. I'll send it 00:02:19.640 |
to you as my I did have a ponytail. I sported it from 00:02:22.240 |
about age 16 to 19. When I smoked cigarettes, too. And I've 00:02:25.560 |
got photos with the ponytail. I'm smoking cigs. So it's the 00:02:30.040 |
Alright, so in the spirit of mullets, let's go business 00:02:33.520 |
first. And we will go to the party at the end. Let's get 00:02:44.800 |
With me again today here on the All-In Podcast, the king. We've 00:03:00.720 |
retired Queen of Quinoa, because David Friedberg is CEO of a 00:03:13.440 |
Wait, wait, it's classified what crop you're working on? 00:03:17.200 |
Absolutely. It's a SaaS company, like a SaaS company wants to 00:03:19.960 |
keep it on the DL, which vertical they're going after 00:03:22.280 |
finance or sales, whatever. He's got to keep those. It could be 00:03:25.560 |
carrots. You never know. You could be Captain carrots. You 00:03:28.560 |
Just not beets. Okay, Friedberg. We don't need more beets in the 00:03:31.840 |
Yeah, I'm gonna say no fennel. Dude. No, delicious. Beets are 00:03:36.560 |
No more beets. Beets with some feta cheese. Delicious. 00:03:41.840 |
Brussels sprouts can be very good. You know, saute them. 00:03:47.360 |
Yes, I'm with you. Very good. Let's make those cheaper 00:03:50.880 |
Yeah, bigger and cheaper and more tasty. You know, you know 00:03:54.040 |
what people people people dunk on but it's a great vegetable is 00:03:57.400 |
cabbage. I like cabbage. You like shred cabbage into a salad 00:04:02.360 |
and you put a little oil, little lemon, little salt, 00:04:05.440 |
Chinese salad, like chin chin in LA. That's nice. Exactly. Yeah. 00:04:09.720 |
Sacks, did you like Friedberg's Christmas party last week? How 00:04:13.840 |
Oh, great. Oh, right. Sacks didn't show up. I missed the 00:04:17.200 |
party. Next week, who does a Christmas party first? Before 00:04:21.680 |
you tell your story. A very kind gentleman rings the door to my 00:04:25.520 |
gate. I opened the gate. The guy drives in he gets out. He's like 00:04:28.520 |
a big guy, kind of says, Mr. Friedberg? And I'm like, what, am 00:04:31.600 |
I getting served? And he goes to the backseat. I'm like, Oh my 00:04:34.080 |
god, I'm gonna get shot. He reaches in. And he pulls out a 00:04:37.320 |
beautifully wrapped gift. He's like, Mr. Sacks. Mr. Friedberg, 00:04:41.080 |
Happy Holidays. And then he gave me like a touch on the shoulder 00:04:44.760 |
or like he does this thing where he goes like this little bow. He 00:04:49.400 |
It was like when you go to an Aman, he did the Aman thing. 00:04:53.480 |
That wasn't his. That was his valet. That's his valet. He stole 00:04:59.200 |
It was the most thoughtful no show I've ever had. I will say 00:05:03.040 |
Sacks. Okay, well, I'm glad to hear that. I'm glad to hear 00:05:05.680 |
that. Well, what's funny is, I just checked my inbox and my 00:05:09.240 |
invitation from you was sitting in there. Because I've been 00:05:12.360 |
meaning to go, but I didn't realize it was so soon. So I'm 00:05:15.960 |
just looking at the date. It was December 9, I didn't realize it 00:05:20.440 |
Someone in your household did. So we appreciate it. Someone 00:05:23.040 |
knew. I went but this is my this is my second or third year 00:05:26.080 |
going. So I pre-gamed it. I went and I got protein. I got 00:05:31.800 |
meat, a bunch of steak and a burger. And then I went and sure 00:05:35.920 |
enough. I talked to the staff. And the staff said, after much 00:05:40.880 |
debate and haranguing with Friedberg, he allowed cheese this 00:05:45.200 |
year. He allowed sushi with a whole sushi platter thing. No, 00:05:48.040 |
stop. He didn't have sushi. Okay. To tell the truth. Go 00:05:51.760 |
ahead. Tell the truth. I get in the car to drive 90 minutes. 90 00:05:55.520 |
minutes to Friedberg's. Okay, that's the poker trip for me, by the 00:05:58.960 |
way. But yeah, Nat texts Allison and David, we're on our 00:06:05.000 |
way. Chamath wants to know, will there be meat? Because I'm 00:06:07.960 |
driving so I can't. And Alli says, oh, yeah, don't worry, 00:06:12.920 |
there'll be sushi. And then Nat texts back, okay, we just got off 00:06:16.960 |
the highway, we're going home, we'll see you later. Because if we 00:06:18.680 |
don't say there's gonna be fish... Fine. I get there, haven't eaten 00:06:22.480 |
a thing. I am starving. Ravenous. I start to work my way 00:06:27.040 |
through the appetizers. There's like, sliced green peppers and 00:06:30.920 |
red peppers. There's like some falafels. Then there was a 00:06:33.760 |
Spanish omelet felt pretty decent. Okay. And I'm like, 00:06:36.760 |
where's the sushi? And so Phil Deutsch says 1000 bucks, there's 00:06:40.760 |
no sushi. And I said, no, there is. Friedberg texted me. He 00:06:43.960 |
said, I've asked all the people working here. They say there's 00:06:47.000 |
no sushi. So I bet him $1,000. I go outside to where the sushi 00:06:52.200 |
is. And you know how we used to make fun of Sky for having the 00:06:56.760 |
filler fruit at the poker game, like cantaloupe and honeydew. 00:07:00.960 |
And that's all there was. Yeah, there was no sushi, but there 00:07:03.960 |
was two or three rolls with four pieces of salmon strategically 00:07:09.960 |
cut just laying over. I won the bet. I was so hungry. I was 00:07:14.880 |
like, What do I eat? I'm starving. So I keep eating the 00:07:17.920 |
omelet. I keep eating the vegetarian food. And then I see 00:07:21.800 |
these brownies. And I'm like, I'll just have a brownie. Then I 00:07:25.920 |
had two brownies. And then I was like, Okay, I need to stop. I 00:07:28.120 |
can't I can't have this. I'm still so hungry. I walk outside 00:07:31.360 |
J. Cal where I saw you. We're having a cup of coffee. And you 00:07:34.200 |
know what I did? I ate five baklava. And then I was like, 00:07:38.680 |
this is disgusting. I've had no protein. I've had no 00:07:41.440 |
carbohydrates. I've had no fiber. I've had a fucking 3000 00:07:44.880 |
calories of sugar. I grabbed Nat and we charged out. I was so 00:07:48.400 |
done. We did the Irish exit. I didn't even say goodbye. Irish 00:07:51.120 |
goodbye. Freeberg and I talked about it yesterday. We talked 00:07:53.120 |
it out. But I must have had 6,000 calories at his 00:07:58.480 |
Christmas party. I'm coming over to Chamath's tonight. Yeah, it's only 00:08:01.040 |
gonna be tri-tip the whole night, I'm sure. Not a simple 00:08:04.280 |
That sounds like J. Cal preloaded the meat. He did. He 00:08:08.400 |
ate on the way. No, seriously. After last year, last year, I 00:08:11.680 |
left that party. And I just typed in, I just said, 00:08:15.200 |
go to the closest In-N-Out Burger. And I literally got a 00:08:18.960 |
double double. And then I just had a second one. On the way up 00:08:22.840 |
this time, I literally had a steak for dinner. And then I 00:08:25.120 |
went and then I too had some pop. Have you guys ever had 00:08:27.640 |
Popeye's? Of course. I had never had it. My son asked to get it 00:08:32.800 |
last week. Last week. I'd never had it. Can you believe it? Here 00:08:35.880 |
we go. It is the most incredible thing I've ever tasted. 00:08:39.120 |
Wow. Wait for the out of touch YouTube comments. 00:08:42.720 |
Problem is not the eating of it. It's how you feel two hours 00:08:47.920 |
But I mean, I've had chick fil a you know, I've done the chicken 00:08:50.840 |
sandwiches at other places never. Popeye's is incredible. 00:08:55.080 |
Incredible. So then I had Nat almost on the edge of convincing 00:09:01.600 |
her to go to Popeye's on the way home. But we missed the closing 00:09:05.960 |
You gotta go to Starbird or Bonchon. Those are two other 00:09:09.480 |
No, bro. Not not I'm not talking fancy chicken. I've had like the 00:09:12.760 |
fancy chicken sandwiches. I'm saying Popeye's is incredible. 00:09:16.160 |
I like it. I like it. Austerity, Chamath? Here we go. Have you had 00:09:19.960 |
a Filet-O-Fish? All right, let's get the show started here. Of 00:09:22.800 |
course, with me again: the king of beep, the dictator, and the 00:09:26.760 |
Rain Man. Let's get to work. All right, everybody. Epic, the 00:09:30.000 |
makers of Fortnite, just won a huge case against Google over the 00:09:35.040 |
Play app store on Android phones. For those of you who don't 00:09:37.320 |
know, app stores have become an absolutely huge business for 00:09:41.600 |
Apple and Google. Google's app store generates $50 billion of 00:09:45.800 |
revenue a year; that's about 17% of Google's total revenue. 00:09:50.520 |
Apple's app store and services generate $85 billion in annual revenue. 00:09:53.920 |
These are on top of their franchises of hardware and 00:09:57.880 |
search. But a jury in San Francisco unanimously found 00:10:01.600 |
that Google violated federal and California antitrust 00:10:04.080 |
laws through sweetheart deals and annoying workarounds that 00:10:08.520 |
stifled competition. For example, Google got spooked that 00:10:12.200 |
other game developers would follow Epic's lead and launch 00:10:15.800 |
their own app stores or route people directly to their 00:10:19.000 |
websites to avoid the 30% take rate. Some might call it a tax. 00:10:23.280 |
Google calculated they would lose 2 billion to 3.5 billion in 00:10:28.080 |
revenue annually if the other major game developers followed 00:10:32.000 |
Epic, so they created a program codenamed Project Hug, where they 00:10:36.480 |
basically paid out bribes or incentives to discourage large 00:10:40.280 |
developers from building their own competitive app stores. They 00:10:42.960 |
also gave Spotify a sweetheart deal of 0%. And Google paid 00:10:47.600 |
Activision $360 million to keep them in the Play Store. And the 00:10:54.360 |
discovery in this case was absolutely wild. According to 00:10:57.120 |
testimony in the trial, Google had deleted some employees' chat 00:11:00.040 |
logs. And the judge told the jury to assume that the deleted 00:11:04.320 |
information wouldn't have been favorable to Google. The jury only 00:11:07.880 |
deliberated a few hours, and Google plans to appeal the 00:11:10.960 |
verdict. Obviously. Epic isn't seeking damages. They just want 00:11:15.760 |
Google to change their practices. They want to 00:11:18.000 |
basically let people plug in their own billing system to 00:11:20.680 |
avoid the 30% tax. We'll see what happens next. Freeberg. 00:11:24.320 |
These stores clearly have monopolistic characteristics, 00:11:28.320 |
but and Google actually allows for third party app stores. 00:11:32.640 |
Maybe you can explain why you think Apple won their case 00:11:37.200 |
These are pretty different cases. The Apple case was a 00:11:42.000 |
judge. This one was a jury of citizens in federal court. I 00:11:48.760 |
think it's worth just backing up a minute and talking about the 00:11:51.000 |
history of like apps on phones and how Android came to be. 00:11:53.840 |
Prior to Google acquiring Android, you guys may remember, 00:11:57.520 |
there were a few companies that were the dominant OS providers, 00:12:00.640 |
operating system providers to mobile phones. There was Nokia, 00:12:04.080 |
there was Microsoft, there was Apple, and there was also 00:12:08.160 |
BlackBerry. And at the time, a lot of the telcos, the Verizons 00:12:13.680 |
and AT&Ts of the world, prior to this, were trying to make money 00:12:16.880 |
by charging for people to install apps on phones. So that 00:12:19.440 |
was the first business model in the mobile internet was the 00:12:22.560 |
telco would make money. And everyone fought against it. All 00:12:26.160 |
the open internet providers said this is ridiculous. And it was 00:12:28.640 |
clear that that was not going to be allowed. So ultimately, these 00:12:31.880 |
operating systems became the play, and which operating system 00:12:35.160 |
was on which mobile phone, and what did that operating system 00:12:38.040 |
then allow to control what apps were allowed, and so on. So the 00:12:41.160 |
reason Google bought Android is they wanted to make an open 00:12:43.840 |
source alternative to all of these closed app and closed 00:12:47.440 |
systems. So Google bought Android in 2005, made a huge 00:12:51.280 |
investment in growing the team, and allowed anyone to use the 00:12:54.920 |
Android OS, fork it, make their own versions of it, install it 00:12:58.480 |
on their own hardware, run it however they want it. Meanwhile, 00:13:01.720 |
Google made an internal version of Android that could be used on 00:13:05.160 |
any mobile handset company's phone as a pre-installed OS. Now, 00:13:09.200 |
why did Google want to do this, they wanted to do this number 00:13:11.520 |
one to make sure that the internet was still open, and it 00:13:14.160 |
wasn't going to end up being closed from a user's perspective. 00:13:16.960 |
And number two, so that anyone can install any app they wanted. 00:13:20.280 |
And the commercial interest for Google, which is number three is 00:13:23.360 |
so that Google could make search Google search the default search 00:13:26.160 |
engine on that phone, and have YouTube installed and all these 00:13:29.260 |
other tools that Google makes money on, including their own app 00:13:32.960 |
store. Now in Android, anyone can install any app they want on 00:13:36.960 |
the phone. And so there's no restrictions, unlike in Apple in 00:13:41.520 |
the iOS, if you try and download an app off the internet, install 00:13:44.760 |
it, it has to go through the app store, it has to be Apple 00:13:46.560 |
verified, in order to be allowed on the phone. So the whole point 00:13:49.600 |
of Android was that it could be open, anyone can install 00:13:51.520 |
anything. What Epic claimed in this case was that Google's 00:13:55.120 |
Android OS gave people security warnings. So if you ever have 00:13:59.040 |
tried this, you download an app from a website on Android, it 00:14:02.040 |
says warning warning, this may cause a virus on your phone, are 00:14:05.480 |
you sure you want to do this, this app hasn't been verified by 00:14:08.200 |
Google, etc, etc. So it gives these warnings that scare 00:14:11.240 |
consumers off of doing that. So you can install 00:14:14.160 |
Fortnite directly on your Android phone today. And you can do it 00:14:17.200 |
by downloading it from Epic's website, you don't have to go 00:14:19.320 |
through the Google Play Store. And you can enter your credit 00:14:21.400 |
card, you can pay for stuff. So it is it is an open system that 00:14:24.640 |
allows that what these guys are claiming is that because Google 00:14:28.280 |
can default the Google Play Store on the phone, it's 00:14:30.880 |
basically what most consumers are going to use anyway. And so 00:14:33.360 |
they're saying it's not fair. And because they also have 00:14:35.600 |
influence over the OS, and they're putting these security 00:14:37.920 |
warnings, it's inappropriate, because now it's scaring people 00:14:40.400 |
from downloading stuff off the internet. So that's the big 00:14:42.520 |
claim Epics making. So Google has already said they're going 00:14:44.760 |
to appeal this case. Because fundamentally, again, if this 00:14:47.960 |
were really true, and there really was deep antitrust issues 00:14:50.360 |
with this, you would likely have seen a federal agency come 00:14:53.760 |
after Google, not a private company suing them in a civil 00:14:57.080 |
case, this would have been a much more significant action if 00:14:59.840 |
there really was antitrust behavior. But it's a lot easier 00:15:02.840 |
to win a jury trial, party to party where Epic can go to a 00:15:05.520 |
court and say, Hey, let's go after Google, they're awful, we 00:15:08.000 |
make Fortnite and all this sort of stuff. So they do have a bias 00:15:10.680 |
in that sense of being able to do this, Google is going to 00:15:12.720 |
appeal, they feel very strongly they'll win on appeal. And the 00:15:15.760 |
markets obviously did a, you know, voted with the fact that 00:15:19.680 |
Google stock didn't really move anywhere. And the market said, 00:15:22.680 |
hey, this is a nothing burger. Google's $40 00:15:25.360 |
billion in annual Play Store revenue, worst case scenario, 00:15:28.080 |
like you said, if it gets impacted by $2 billion, that's 00:15:30.640 |
$2 billion out of $300 billion overall. Doesn't really matter. And 00:15:34.040 |
likely, they're going to win on appeal anyway. So you know, I 00:15:36.520 |
think the saga will continue. But I think Google's got a 00:15:39.480 |
pretty strong case on appeal. And it seems like, you know, 00:15:42.480 |
that's going to be very hard to kind of see a massive change in 00:15:45.160 |
app store behavior as a result of this case, even though it's 00:15:47.200 |
been hyped up to be that. That's my take on it. 00:15:49.360 |
Yeah, great take. Chamath, what do you think about this jury 00:15:53.040 |
shopping, and maybe the fact that this isn't Lina Khan's 00:15:56.320 |
FTC, you know, it's company versus company. Do you think the 00:15:59.760 |
claims here were valid? Do you think the jury shopping impacted 00:16:05.240 |
Probably. I guess the simple thought exercise is, what do we 00:16:08.160 |
think the outcome would have been had this trial happened in 00:16:11.080 |
Dallas, Texas? Probably different. And so I think 00:16:16.600 |
Friedberg's right. What does it materially prove? Nothing with 00:16:19.880 |
respect to the body of law, it just goes to show that if you 00:16:23.760 |
pick the right place to convene these trials, in the right 00:16:27.480 |
format, you can give yourself a slightly better probability of 00:16:31.600 |
winning. But the question is, what will you win? It's not 00:16:35.440 |
clear to me what happens now? Is there going to be a damages 00:16:38.760 |
portion now of this trial? Is that what happens next? 00:16:41.120 |
They're not seeking damages, they want changes to how they operate. They're 00:16:45.480 |
trying to get a settlement. And they want Google to settle out 00:16:48.520 |
with changes to the app store policies. That's what they're 00:16:51.440 |
And then what about the epic versus Apple lawsuit? Is it 00:16:56.880 |
They lost and they appealed and there's one element that's being 00:16:59.360 |
appealed to the Supreme Court now. But basically, they lost 00:17:02.360 |
And that was that convened in California in a jury trial as 00:17:07.560 |
No, that that was not a jury trial. It was a judge. 00:17:11.040 |
And it was California. It was a bench trial in California. Yeah, 00:17:15.280 |
it was also in Northern California. That's right. Yeah. 00:17:18.720 |
So Saks, let me bring you in on this. Do you think that these 00:17:22.040 |
stores are monopolies? And do you think if they change their 00:17:25.440 |
behavior, especially Apple, you know, allow other third party 00:17:28.280 |
stores, what impact that would that have on the startup 00:17:30.560 |
ecosystem, because the 30% tax is significant. And we see that 00:17:33.960 |
every day with our startups. I mean, if you have to give away 00:17:36.760 |
30% of your revenue to Google and Apple, it's brutal. And then 00:17:40.680 |
you're advertising on Apple and Google and Facebook. That's 00:17:44.720 |
another 30% of your revenue or 50% of your revenue. 00:17:47.840 |
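The stacked take rates described here work out roughly like this. A back-of-the-envelope sketch; all figures are illustrative assumptions, not any real startup's numbers:

```python
# Back-of-the-envelope: how platform take rates stack for an app startup.
# All figures below are illustrative assumptions, not real company data.

gross_revenue = 1_000_000  # $1M of in-app revenue for the year

app_store_cut = 0.30 * gross_revenue  # 30% app store take rate
ad_spend = 0.30 * gross_revenue       # ads bought back on the same platforms

net = gross_revenue - app_store_cut - ad_spend
print(f"Kept after store fees and ads: ${net:,.0f} "
      f"({net / gross_revenue:.0%} of gross)")
```

At a 30% store cut plus 30% of gross spent on platform ads, the startup keeps 40 cents of every dollar before any of its own costs, which is the squeeze being described.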
Yeah, no, I agree with that. So first of all, these app stores 00:17:51.680 |
are absolutely monopolies within their ecosystem. And Apple and 00:17:56.520 |
Google Android are absolutely a duopoly within the mobile space. 00:18:00.520 |
My experience with these types of monopolies or gatekeepers is 00:18:04.920 |
that they exercise more and more control and extract more and 00:18:08.320 |
more of the value over time. It's an iterative process, in 00:18:12.040 |
which they just keep, you know, extracting, keep taxing, keep, 00:18:15.360 |
keep imposing more rules on the ecosystem for their benefit, 00:18:19.120 |
and to the detriment of innovators. And so I do think 00:18:21.760 |
they have to be controlled. And I think Epic is doing the 00:18:23.600 |
ecosystem a favor. For example, on this 30% rake that you're 00:18:27.760 |
talking about, J. Cal, that level of rake might have been 00:18:31.840 |
appropriate for certain types of apps like a hobbyist app where 00:18:35.600 |
it's literally 100% margin, okay, you pay 30% to the app 00:18:38.840 |
store. It doesn't work for SaaS companies. I mean, I can tell 00:18:41.640 |
you that. I mean, this would be like half of their gross margin 00:18:44.680 |
or something like that. It doesn't work for a lot of 00:18:48.200 |
companies that spend a lot of money on content creation, like 00:18:51.040 |
Epic, which spends a lot of money in R&D to create a game 00:18:54.040 |
like Fortnite, or Spotify; their models break immediately. Or 00:18:57.560 |
Amazon with Kindle. And so what happened is it used to be the 00:19:01.360 |
case that Amazon could have a link in their app, at least 00:19:05.600 |
directing the user to go to the amazon.com website, and you 00:19:08.680 |
could buy the book there and you could circumvent the rake in 00:19:12.440 |
the app, and it was inconvenient for the user, but at least it 00:19:14.600 |
was a way around it. Then Apple banned those links, then they 00:19:18.600 |
banned the ability for the app to even message to the user what 00:19:22.560 |
was happening. So if for example, if you use the Kindle 00:19:25.160 |
app on iOS, which I do all the time, you can't buy a book in 00:19:30.040 |
it. And the reason why is because Amazon doesn't want to 00:19:33.220 |
pay the 30% rake, but they can't even tell you that it just looks 00:19:36.240 |
like it's broken functionality. Right? So those of us who know 00:19:39.240 |
go to amazon.com through the browser, we buy the book there 00:19:41.760 |
and then it magically appears in the Kindle app. We've all had 00:19:44.760 |
that experience. So I just think that these duopolies have to be 00:19:47.840 |
controlled. I think that it'd be good if the government could 00:19:51.360 |
figure out better ways to do it. I don't think M&A is the right 00:19:54.760 |
way to do it. We've talked about this before. I think that 00:19:57.240 |
restricting anti competitive tactics is really the way to 00:20:00.560 |
stop it. And like I said, I can't speak to the details of 00:20:04.560 |
epics case, but I do think they're doing the ecosystem a 00:20:06.680 |
favor here by pushing back on these monopolies and helping to 00:20:11.680 |
100% agree with you, Sacks, on your take. And I think actually 00:20:14.880 |
other people should join them. And the industry should really 00:20:17.960 |
force this issue because you are absolutely correct that they're 00:20:20.680 |
boiling the frog. Now they did make some cuts under a million. 00:20:23.280 |
I think they charged 15% on the first million. So they try to be 00:20:27.680 |
Pull it up. No, that's not it, it's actually the larger ones. Pull 00:20:30.240 |
up the link I just sent. So you'll see here, Google charges 00:20:34.520 |
through the Play Store. If you want to have 00:20:36.840 |
distribution, I mean, think about the Play Store as being 00:20:38.560 |
like a retailer, you make clothing, you need to have a 00:20:41.320 |
retail store that someone can go to and buy stuff, the retail 00:20:43.600 |
store has to make money, you're not gonna have a retail store 00:20:46.040 |
that's free. So how does the retail store make money? Well, 00:20:49.120 |
they charge. 98% of apps, as you can see here, are free, because, 00:20:52.320 |
you know, they don't make any money on those. But then 00:20:54.440 |
if you start to charge subscriptions, it's 15% take on 00:20:57.680 |
automatically renewing subscriptions, where it's easier 00:21:00.160 |
second year. Yeah. Yeah, it's easier. No, each year, look at 00:21:03.680 |
the second. No, no, no, it's for renewing. So I know, that's 00:21:07.120 |
because of calm. So in the first year, it's for renewing 00:21:10.280 |
subscriptions, subscription products. So all subscriptions 00:21:13.520 |
that have an automatic renewal feature to them are instantly at 00:21:16.640 |
15%. And as a result, you know, you can think about the what is 00:21:20.480 |
it worth to get a user to not have to enter their credit card 00:21:22.560 |
info, you know, plus the credit card fees, it's like 15% is not 00:21:26.360 |
too crazy. Honestly, I'm just, you know, I'm not trying to be a 00:21:29.200 |
super Google advocate. But I'm just saying like, I don't think 00:21:31.040 |
that's too crazy. And then they've got this like negotiated 00:21:33.600 |
tier, where if you are a very large app developer, and you 00:21:36.600 |
want to go and negotiate with Google, they have a biz dev team 00:21:38.800 |
like Spotify and others get where they'll negotiate fees 00:21:41.440 |
down. And you can actually go and like argue for better 00:21:44.040 |
economics. So they've tried to be commercial, which I'm 00:21:46.640 |
guessing is probably why Lina Khan and others haven't gone 00:21:50.240 |
after them for antitrust and monopolistic behavior, because 00:21:53.120 |
they've tried to find the comfortable place where it's not 00:21:55.440 |
going to be too crazy. At least that's my read on what's going 00:21:58.080 |
on. Because otherwise, I mean, obviously, folks would be all 00:22:00.080 |
over them, you know, if it really was the boiling of the frog. 00:22:02.920 |
The issue for me is, then they want to charge you 00:22:05.440 |
now for placement in the App Store and get revenue from you 00:22:09.200 |
Oh, yeah, I mean, that's Amazon too. Amazon's got that. Like, 00:22:12.000 |
everyone's gotten squeezed. Every DTC company in the last five 00:22:14.560 |
years has gotten obliterated; their unit economics are upside 00:22:17.440 |
down now. And we've talked about this, both Google's taken out 00:22:20.080 |
the margin, but Amazon forces you to buy ads in order to get 00:22:22.880 |
your placement. Yeah. And then they force you to pay all the 00:22:25.360 |
extra fees for inventory. Amazon squeezed everyone way more than 00:22:30.760 |
This is the perfect place for Lina Khan to get active, I think, 00:22:33.400 |
and the settlement super easy. The entire industry should come 00:22:36.680 |
at them in unison, tons of lawsuits, group lawsuits, until 00:22:41.760 |
they allow when you turn on your Apple phone, the ability to load 00:22:45.240 |
Amazon's app store, Epic's app store, whoever else wants to 00:22:48.720 |
have an App Store, that should be your right. If you buy a 00:22:50.720 |
hardware device, it should be your right to load these and 00:22:52.920 |
they shouldn't be ankled in any way. And that's the other thing 00:22:56.320 |
Android does all kinds of ankling with those 00:22:58.680 |
pop-ups: hey, this isn't safe, etc. They should have a 00:23:01.600 |
verified app store program. Amazon's app store, Epic's app 00:23:05.280 |
store, they should be verified or something. And maybe they pay 00:23:07.680 |
5% to have a verified app store. But yeah, this is 00:23:12.760 |
going to be an ongoing issue. And we'll see more of it, I 00:23:16.080 |
think. So let's go on. Anybody else have thoughts on it? No. 00:23:20.320 |
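The tiered schedule Friedberg walks through above (15% on a developer's first $1M per year, 30% beyond that, and a flat 15% on auto-renewing subscriptions) can be sketched as a quick model. The function name and the simplifications are mine; real store fee programs carry many more conditions:

```python
def store_fee(annual_revenue: float, subscription: bool = False) -> float:
    """Illustrative model of the tiered app store fee discussed above.

    Simplified: auto-renewing subscriptions pay a flat 15%; other paid
    revenue pays 15% on the first $1M per year and 30% above that.
    """
    if subscription:
        return 0.15 * annual_revenue
    tier1 = min(annual_revenue, 1_000_000)      # first $1M, billed at 15%
    tier2 = max(annual_revenue - 1_000_000, 0)  # remainder, billed at 30%
    return 0.15 * tier1 + 0.30 * tier2

print(store_fee(500_000))                       # small developer
print(store_fee(3_000_000))                     # larger developer
print(store_fee(3_000_000, subscription=True))  # subscription app
```

Under these assumed tiers, a $3M-a-year game pays roughly 25% blended, while the same revenue in auto-renewing subscriptions pays 15%, which is why the subscription carve-out matters so much to developers.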
Okay. So in other news, OpenAI is starting to cut licensing 00:23:25.600 |
deals. If you remember, we had a big debate about this back on 00:23:28.760 |
Episode 115 in February. And I was saying, Hey, this content is 00:23:34.800 |
owned and the opportunity to create LLM or derivative 00:23:38.760 |
products, you know, is the right of the people who make that 00:23:41.640 |
content. Sacks, you told me I was gonna get rolled over. But here 00:23:45.920 |
I wouldn't say I said you're going to get rolled over. What I 00:23:47.720 |
said is the ecosystem is going to figure this out. 00:23:49.720 |
Okay, let's play the tape. If ChatGPT takes a Yelp review 00:23:54.720 |
and, you know, a Condé Nast Traveler review, and they 00:23:58.080 |
represent it based on the best content that's out there that 00:24:01.920 |
they've already ranked because they have that algorithm with 00:24:04.040 |
PageRank or Bing's ranking engine, and then they republish 00:24:06.920 |
it. And then that jeopardizes those businesses that is 00:24:09.600 |
profoundly unfair and not what we want for society. And they 00:24:12.640 |
are interfering with their ability to leverage their own 00:24:15.680 |
content is profoundly unfair. And those magazines and 00:24:21.400 |
You're gonna get steamrolled. Okay, the hair. There it is. Man, it was 00:24:27.800 |
cut back then. I think you were like post-COVID back then. Yeah, 00:24:33.320 |
Like a toupee. Can you can you just show us a picture of that? 00:24:37.560 |
It looks like an era that was like a like an early ad. This is 00:24:42.480 |
like the common era now. You know, that's a toupee. That's a 00:24:45.960 |
toupee. It does look like a toupee. It looks like a raccoon. Okay, 00:24:50.040 |
J. Cal, I'll get some comments on this, because I think your 00:24:52.960 |
coif has too much fluff on the upper parts, and unfluffed 00:24:57.480 |
the bottom parts, which I think I mean, listen, don't criticize 00:25:00.040 |
him when he was in an in between phase. We all go through an in 00:25:02.240 |
between phase with our hair. It's it's part of the process. 00:25:04.960 |
Now I know why we're talking about this topic of J. Cal's 00:25:06.960 |
because you think it's a total non story. Or it's, I 00:25:10.200 |
shouldn't say it's a non story. Let me finish it off. Okay. So 00:25:13.240 |
just so we know what's going on here. Open AI announced a 00:25:15.640 |
licensing deal with Axel Springer to bring real time 00:25:18.840 |
news from Politico and the fake news from Business Insider to 00:25:22.200 |
ChatGPT. You literally sound like Alex Jones. I mean, thank 00:25:27.400 |
you complain all your news is right about that. I got one 00:25:34.000 |
thing right. As part of the deal, Axel Springer can use 00:25:37.000 |
ChatGPT to improve their products; the deal includes other 00:25:39.480 |
European sites. This is on top of the deal that OpenAI did with 00:25:43.480 |
the Associated Press. But most importantly, let me just most 00:25:46.440 |
importantly, I'm going to throw it in a second. Most importantly, 00:25:48.560 |
when chat GPT relies on these sources, it'll include a 00:25:51.680 |
summary and a link back. Other examples of licensing are 00:25:55.400 |
happening all over the industry. Adobe is using stock images for 00:25:58.240 |
theirs, and Stable Diffusion, as you know, brazenly used 00:26:02.800 |
Getty's images and is being sued. So Freeberg, you thought this was 00:26:08.840 |
No, I don't agree with your framing. And I think I think 00:26:12.360 |
it's unrealistic for you to frame this as validating or 00:26:16.040 |
justifying the fact that these companies won't be able to 00:26:19.080 |
access and utilize open data under fair use to train models. 00:26:23.880 |
So that's what's going on historically, right. So the open 00:26:26.640 |
web, you know, we talked a little bit about where folks can 00:26:29.480 |
get content from the open web, you can browse the internet, you 00:26:32.160 |
can download all this content, it's all freely available, it's 00:26:34.920 |
readily available, it's in, it's in the open domain. And then you 00:26:38.040 |
can train models, and then the models can ultimately make stuff 00:26:41.120 |
based on all that training data. What this deal is, is it's 00:26:44.520 |
actually a content integration deal. And I'll read this with 00:26:47.680 |
the partnership chat GPT users around the world will receive 00:26:51.400 |
summaries of selected global news content from Axel Springer 00:26:55.080 |
media brands, including yada, yada, including otherwise paid 00:26:58.000 |
content. So what ChatGPT is doing is they're accessing content 00:27:01.360 |
behind a paywall. And instead of training 00:27:04.600 |
models on it, they're able to fetch that data as a retrieval 00:27:08.160 |
aspect of the chat GPT service. So now you as a user want an 00:27:12.080 |
update on, hey, what's going on with Donald Trump, it can search; 00:27:15.440 |
it can not just use its training data, but it can 00:27:18.040 |
recognize that, hey, there's a current event news question 00:27:21.440 |
embedded in this query. And I can go fetch that current event 00:27:24.800 |
news answer from this content that I've now paid for. So it's 00:27:28.560 |
not a training data set that that's now being unlocked, which 00:27:31.240 |
is what the complaint was before that all the open web data was 00:27:34.000 |
being used for training. But it's behind paywall data that 00:27:36.840 |
can now be fetched and integrated. And I think it's 00:27:38.880 |
more interesting because it really speaks to a new model for 00:27:42.400 |
how the internet will work, which we've talked about, which 00:27:44.760 |
is that there may be the sort of new chat interfaces that cannot 00:27:48.240 |
just send you to another page and link you over somewhere, but 00:27:51.280 |
can fetch data for you and present it to you in an 00:27:53.800 |
integrated way in the response it's providing. And these 00:27:56.680 |
services have to pay for access to that. So, you know, OpenAI, 00:27:59.320 |
it's a three year deal, they're paying tens of millions 00:28:01.880 |
of dollars to Axel Springer to access their closed content and 00:28:05.000 |
present it to the user. So I think it's quite a bit different 00:28:07.360 |
than, you know, using training data, which is what, you know, 00:28:10.080 |
the complaint was the first time around. And it's more of like a 00:28:12.360 |
really interesting front end feature for what chat GPT is 00:28:17.640 |
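Freeberg's distinction here, training on open data versus fetching licensed content at query time, is essentially retrieval-augmented generation. A minimal sketch of that flow, assuming a toy keyword classifier and a stand-in licensed archive (all names and logic below are hypothetical illustrations, not OpenAI's or Axel Springer's actual integration):

```python
# Sketch of the retrieval flow described above: instead of answering
# purely from training data, the assistant detects a current-events
# query and fetches paid, licensed content at request time.

CURRENT_EVENT_KEYWORDS = {"today", "latest", "news", "current", "update"}

# Stand-in for a paywalled, licensed archive (e.g. an Axel Springer feed).
LICENSED_ARTICLES = {
    "donald trump": "Politico: Trump campaign files new motion... (licensed)",
}

def is_current_event(query: str) -> bool:
    """Crude classifier: does the query ask about current events?"""
    words = set(query.lower().split())
    return bool(words & CURRENT_EVENT_KEYWORDS)

def fetch_licensed(query: str):
    """Retrieve paid content the model did not have to train on."""
    for topic, article in LICENSED_ARTICLES.items():
        if topic in query.lower():
            return article
    return None

def answer(query: str) -> str:
    if is_current_event(query):
        article = fetch_licensed(query)
        if article:
            # Summarize and attribute, since the deal requires a link back.
            return f"Summary of licensed source: {article}"
    return "Answer from training data alone."

print(answer("What is the latest update on Donald Trump?"))
print(answer("Who wrote Hamlet?"))
```

The design point is that the licensed content never enters the training set; it is fetched, summarized, and attributed per request, which is why this is a content integration deal rather than a training-data deal.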
And I don't really have a lot to add to that. I think Freeberg 00:28:21.520 |
did a great job explaining that issue. I mean, look, I think 00:28:24.880 |
J. Cal, you've had a little bit of an obsession with this. 00:28:27.520 |
And protecting rights holders, I do believe in, but 00:28:31.840 |
Freeberg makes a really great point, which is there's a 00:28:35.000 |
difference between copying somebody's copyrighted work, 00:28:38.960 |
which would be a violation of copyright, and using content 00:28:43.080 |
that's available on the open web to train a model to create 00:28:46.520 |
entirely new content. And I do think that AI models should be 00:28:52.120 |
able to use the content that's available on the web 00:28:54.720 |
under a fair use doctrine to train their models for the 00:28:59.080 |
benefit of consumers. And I don't see a reason to try and 00:29:03.720 |
tie that up in a bunch of copyright lawsuits. If chat GPT 00:29:06.760 |
is producing a plagiarized result, then you may have 00:29:11.520 |
So what I would say is, you know, when you look at that 00:29:14.800 |
fair use doctrine, I've got a lot of experience with it. 00:29:17.080 |
Having done this in blogs and other content companies, you 00:29:20.560 |
know, the fourth factor test, I'm sure you're well aware of 00:29:22.560 |
this is the effect of the use on the potential market and the 00:29:25.200 |
value of the work. And if you look at the lawsuits that are 00:29:29.280 |
starting to emerge, it is Getty's right to then make 00:29:33.200 |
derivative products based on their images, I think we would 00:29:35.680 |
all agree. Stable Diffusion, when they used these open web images: 00:29:39.560 |
that is no excuse to use an open web crawler to avoid getting a 00:29:43.960 |
license from the original owner of that just because you can 00:29:46.280 |
technically do it doesn't mean you're allowed to do it. In 00:29:48.200 |
fact, the open web projects that provide these say explicitly, we 00:29:53.000 |
do not give you the right to use this, you have to then go read 00:29:57.200 |
the copyright laws on each of those websites. And on top of 00:30:00.560 |
that, if somebody were to steal the copyrights of other people, 00:30:03.880 |
put it on the open web, which is happening all day long, you 00:30:06.860 |
still if you're building a derivative work like this, you 00:30:09.680 |
still need to go get it. So it's no excuse that I took some site 00:30:13.120 |
in Russia, that did a bunch of copyright violation, and then I 00:30:16.240 |
indexed them for my training model. So I think this is going 00:30:19.720 |
to result in... Freeberg, can you shoot me in the face and let me 00:30:23.840 |
Alright, so the segment is now over. I was about to throw it to 00:30:29.480 |
you. And now I mean, like, stop with this navel gazing nonsense. 00:30:34.080 |
We're in inning one. And nobody knows anything. And the most 00:30:40.080 |
important thing is that this will get sorted out through 00:30:44.040 |
trials. That's where you are right, Jason, it's going to go 00:30:46.120 |
to court. And, and I think we should just not opine on this 00:30:49.320 |
stuff, because it's esoteric at best. And it's kind of like, 00:30:53.680 |
well, some of it will go to court, other ones will be done 00:30:56.800 |
in the free market, like we see here. And you care about 00:30:59.880 |
this more than most people, because you're a journalist, and 00:31:01.640 |
you think that's going to put people out of work. 00:31:03.080 |
I am a content creator. I'm also an author, as you know, and a 00:31:06.520 |
podcaster, and I create all kinds of content, I do think 00:31:08.640 |
that you should get permission before you leverage people's 00:31:11.000 |
work to create different derivative products. Correct. 00:31:13.400 |
And actually, you're starting to see this in DALL-E in ChatGPT, which 00:31:17.480 |
seems to be getting ahead of this because of all the incoming 00:31:20.000 |
lawsuits. Check this out. I started asking DALL-E to make me 00:31:24.600 |
Star Wars characters out of Bulldogs. And I said, make a 00:31:26.960 |
Jedi Bulldog, it did that no problem. Then I asked it to make 00:31:30.480 |
a version of this as Darth Vader. And it said, I'm 00:31:34.800 |
unable to generate images based on your request due to our 00:31:38.040 |
content policy. If you have other ideas and concepts you'd 00:31:40.440 |
like to explore, feel free to share. So I said, make me a 00:31:43.240 |
Sith Lord cat. And it basically made me Darth Vader. And so it's 00:31:46.880 |
very clear that the team over at OpenAI is now taking your 00:31:54.160 |
well, yeah, I got around copyright here, right. And so 00:31:56.560 |
it's silly, like all of this stuff is this is just a drag 00:32:00.960 |
coefficient on development of AI because, and on users, because 00:32:05.160 |
now you've got to like word your prompt exactly the right way. 00:32:07.720 |
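The workaround J. Cal describes, where a renamed prompt slips past the refusal, is the classic failure mode of a blocklist-style content filter. A toy sketch of that behavior (the term list and function are hypothetical illustrations, not OpenAI's actual policy implementation):

```python
# Toy blocklist content filter: it rejects prompts that name protected
# characters directly, but a paraphrase of the same request sails through.

BLOCKED_TERMS = {"darth vader", "jedi", "star wars"}  # hypothetical list

def passes_policy(prompt: str) -> bool:
    """Return False if the prompt names a blocked term directly."""
    p = prompt.lower()
    return not any(term in p for term in BLOCKED_TERMS)

# The direct request is refused...
print(passes_policy("Make me a Darth Vader bulldog"))
# ...but a paraphrase describing the same character is not.
print(passes_policy("Make me a cat as a black-armored Sith lord"))
```

This is the "drag coefficient" Sacks complains about: users end up rewording prompts to route around a keyword match rather than an actual judgment about the content.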
Well, I think what they're doing is they know that Marvel and all 00:32:11.200 |
the Disney characters, all the Star Wars characters, they're 00:32:13.240 |
very protective of their IP. Disney is going to launch their 00:32:16.320 |
own DALL-E-type Stable Diffusion product where you can do this: 00:32:19.040 |
put yourself in Star Wars, make a Star Wars character. Is Disney any good 00:32:21.960 |
at this? It doesn't matter if they're good or not. It's their 00:32:25.360 |
I mean, fans can create their own artwork that's in the vein 00:32:30.320 |
they can, but they can't do it commercially. And what ChatGPT 00:32:33.960 |
and OpenAI do here is commercial, because I pay $20 a month to 00:32:36.600 |
ChatGPT. So that's what you're missing. A fan, of course, can 00:32:40.760 |
All right. Well, all I know is you played this, like, wayback clip 00:32:48.720 |
from episode 115, as if you got us. And it turns... 00:32:55.320 |
He didn't blow my case. I will be right and continue to be 00:32:57.960 |
right. Okay, now let's go to something we can all we're 00:33:00.240 |
definitely not striking this segment. I liked it. Let's go. 00:33:02.320 |
No, no, no, no. You like a little spicy here. I didn't 00:33:04.920 |
even have to refute you. Freeberg just did it was 00:33:07.160 |
beautiful. I mean, listen, okay, you came in, you thought you had 00:33:09.880 |
the goods. Admit it. No, he's like playing this way back video 00:33:13.640 |
from like 45 episodes ago. I finally got him. I finally got 00:33:18.040 |
Sacks. I was finally right about something. Had you not claimed you 00:33:20.680 |
never used the word steamroll, I would have never played the 00:33:22.920 |
clip. Oh my god. I have some live footage. I just want to go 00:33:25.560 |
through it. I'm just talking like it. Anyways. Yeah. So guys, 00:33:30.480 |
there's this copyright thing. I want to Okay, hold on. Let me 00:33:33.760 |
let me just reframe. Let me just start that again. Okay, you 00:33:36.960 |
know, copyright issues. Okay, hold on. Well, listen, there's 00:33:41.720 |
this thing I want to talk about copyright, which 00:33:51.640 |
something and you want to get Okay, hold on. Everybody that 00:33:58.000 |
writes gets a chance. Okay, hold on. I mean, if I could get up 00:34:07.280 |
Come on, Jay Cal, you teed up the way back clip of yourself. 00:34:15.840 |
You teed up the way back clip of yourself. So you brought 00:34:21.840 |
Yeah, but you try to dunk and it didn't work. Okay, okay. Okay. 00:34:26.560 |
Every bit has to take the piss out of me. I'm seeing a trend. 00:34:30.080 |
Unless you have a clean dunk. All right, here we go. Don't 00:34:49.200 |
You know, who actually deserves credit for admitting? Well, he 00:34:55.920 |
Oh, we'll get to we'll get your flowers. Okay. 00:34:59.080 |
It's coming. Here we go. Great transition. Great transition. 00:35:03.720 |
Here we go. Elon versus the FCC. Another government agency is now 00:35:07.840 |
targeting Elon. This is a little bit complicated. But let me 00:35:11.440 |
explain. On Tuesday, the FCC rejected Starlink's application 00:35:14.560 |
for $900 million in subsidies for rural broadband. Starlink 00:35:19.920 |
originally won these back in 2020, when they agreed to 00:35:22.880 |
provide high speed internet to 640,000 rural homes across 35 00:35:27.200 |
states. Funding would have come from RDOF, the Rural Digital 00:35:31.760 |
Opportunity Fund, I guess, where the government is paying for 00:35:34.400 |
expanding broadband services in rural areas. And Starlink 00:35:38.440 |
obviously is perfect for that. It's actually the only solution 00:35:40.880 |
for this, really; you can't run fiber to these locations. So the 00:35:45.400 |
FCC found that Starlink, quote, had failed to meet its burden to 00:35:49.160 |
be entitled to those funds. And here's the quote, FCC has a 00:35:54.120 |
responsibility to be a good steward of limited public funds 00:35:57.040 |
meant to expand access to rural broadband not fund applicants 00:36:01.040 |
that fail to meet basic program requirements. Brendan Carr, one 00:36:05.440 |
of the FCC commissioners, dissented from the agency's 00:36:08.680 |
decision. And he did not hold back. Here's the quote: Last year, 00:36:12.120 |
after Elon acquired Twitter, President Biden gave federal 00:36:15.240 |
agencies a green light to go after him. Today's decision 00:36:17.720 |
certainly fits the Biden administration's pattern of 00:36:20.360 |
regulatory harassment. This is a decision that cannot be 00:36:24.160 |
explained by any objective application of law facts or 00:36:26.920 |
policy. Carr went on to explain how this decision was made and 00:36:31.320 |
why it's unprecedented: instead of applying the traditional FCC 00:36:33.920 |
standards to the record evidence, which would have 00:36:37.240 |
compelled the agency to confirm Starlink's $885 million award, 00:36:41.120 |
the FCC denied it on the grounds that Starlink is not providing 00:36:44.840 |
high speed internet service to all these locations today. As 00:36:48.320 |
noted, the FCC's milestone does not kick in until 2025. Let me 00:36:52.320 |
go to you, Sacks. Thoughts on the Biden hit squad going after 00:36:56.520 |
I mean, I can't remember anything quite like this. This is 00:37:00.360 |
absolutely extraordinary. I mean, you have a sitting member 00:37:03.440 |
of the FCC telling us that the FCC is engaging in political 00:37:07.520 |
retaliation. He sits on the board of the five commissioners 00:37:09.880 |
of the FCC. They just canceled an $885 million contract to 00:37:13.840 |
Starlink. What was that contract for? To provide rural internet 00:37:17.360 |
service. Starlink is the only company that has that capability 00:37:20.680 |
today. It's the only one that has that capability. If you look 00:37:23.960 |
forward a few years, it is by far the best at providing 00:37:27.040 |
broadband from space, which is the best way to get into these 00:37:29.560 |
rural areas. So what did the commission do? Well, they 00:37:33.400 |
cherry-picked. They took speed test snapshots from two 00:37:37.680 |
cherry-picked moments in time. And so even that probably was not an 00:37:40.960 |
accurate reflection of where Starlink is today. But they then 00:37:44.240 |
said based on those snapshots that Starlink would not be able 00:37:47.120 |
to meet the standards in three years. So remember the 00:37:50.520 |
requirements that they're saying that starlink violated don't 00:37:54.800 |
even have to be met for three years. So somehow they're saying 00:37:58.120 |
that Starlink will not get there in three years. They're preemptively 00:38:00.520 |
judging the service against a standard that it is not even 00:38:04.640 |
required to meet today. And nobody else is even close to 00:38:07.840 |
meeting the standard. So Elon's response to this was, guys, okay, 00:38:12.120 |
if you're going to cancel the contract for us, like just save 00:38:15.360 |
the money because the competitors that you're giving it 00:38:18.280 |
to have even less of a service than we do. Yeah, so just like 00:38:22.240 |
save the taxpayer the money, but they're not doing that. So this 00:38:26.600 |
is really remarkable. And what Carr said here is that the Biden 00:38:31.880 |
administration's decision prioritizes political and 00:38:34.000 |
ideological goals at the expense of connecting Americans; we can 00:38:37.040 |
and should reverse course. This is now part of a pattern of the 00:38:41.360 |
federal government harassing Elon and his companies. And it 00:38:44.640 |
all stems from Biden at that press conference saying, we got 00:38:47.940 |
to look at these guys. You know, like Tony Soprano, yeah, we got 00:38:51.040 |
to look at these guys, you know, whatever. I mean, it was. And so 00:38:55.840 |
that's a nice restaurant, it would be terrible if 00:38:58.120 |
anything ever happened to it. Yeah, Jake, I can't do it. In 00:39:02.080 |
any event. So Biden says this press conference, we got to look 00:39:05.400 |
at these guys. And since then, they've investigated Tesla for 00:39:08.560 |
supposedly building a glass house, which I didn't know was a 00:39:11.320 |
crime. That's amazing. Yeah. SpaceX, which is partially a 00:39:16.920 |
defense contractor was sued by the DOJ because they were hiring 00:39:20.320 |
too many Americans and didn't, they weren't hiring enough 00:39:23.760 |
refugees into sensitive national security roles, which they would 00:39:27.800 |
surely be sued for doing if they had. It's confounding. Yeah. And now 00:39:31.520 |
they've canceled a contract for SpaceX, which has the 00:39:34.880 |
best service in the space, for somehow missing a goal that 00:39:38.580 |
they're not required to meet for three years. This is harassment. 00:39:42.120 |
It's transparent. Yeah. And the question I have is, do we want 00:39:45.480 |
to live in a country where the government can engage in this 00:39:48.880 |
kind of naked political retaliation against its critics? 00:39:52.920 |
And I have to say, you know, there was a there was a time in 00:39:55.840 |
America where, you know, Nixon was roundly attacked for having 00:40:00.040 |
this quote, unquote, enemies list, where supposedly, you 00:40:03.160 |
know, he had made a list of all his enemies and the IRS was 00:40:05.640 |
auditing them. Okay, we are so far beyond that point. And the 00:40:09.880 |
media isn't interested at all. And no one's really interested 00:40:13.960 |
unless you like what Elon's doing. But if you don't, you 00:40:17.360 |
know, if you're on the opposite side of the political spectrum, 00:40:19.280 |
as Elon, you don't care. And there's nobody who's willing to 00:40:23.120 |
say in a neutral way that political retaliation should not 00:40:26.840 |
I mean, we have a presidential candidate running specifically 00:40:31.160 |
saying I am your retribution. I mean, this is something that 00:40:33.360 |
has to stop across all of politics, nobody should be using 00:40:37.080 |
their political power to do any retribution against anybody; 00:40:40.160 |
they should be operating the government efficiently, and the 00:40:45.200 |
Okay, so maybe I don't like that rhetoric from Trump. I don't 00:40:47.920 |
think it's helpful. But what did Trump ever do? That's in this 00:40:50.880 |
league. I mean, everything they accused Trump of doing the 00:40:54.640 |
fascism, the retribution, and all that kind of stuff. Seems to 00:40:59.320 |
Yeah, well, I mean, he says he's gonna do it. He says the first 00:41:02.040 |
thing he's gonna do is go after journalists. And 00:41:03.840 |
do you think Trump did no retribution when he was in 00:41:06.480 |
I mean, we have to look through every single issue. One doesn't 00:41:14.120 |
come off the top of my head. I'm trying to remember if he ever 00:41:16.400 |
said, I'm going to go after this person or that person. I don't 00:41:20.360 |
he said, lock her up. That was it, he never did it. Trump 00:41:24.040 |
was all talk in this respect. He didn't actually do it. 00:41:26.760 |
Yes, this is what Peter Thiel said, like, you know, his great 00:41:29.480 |
quote about him, like, just look at his actions, not what he 00:41:31.760 |
says. He saber-rattles, and he says he's going to do 00:41:34.720 |
retribution against everybody. But you know, then he doesn't. 00:41:37.260 |
To me, even talking about Trump in this context is a 00:41:39.440 |
deflection to discount the action being taken by the Biden 00:41:42.040 |
administration. They've now weaponized multiple federal 00:41:44.320 |
agencies to go after Elon on these cases that seem 00:41:47.280 |
transparently trumped up: a glass house, not hiring enough 00:41:51.480 |
refugees into national security roles, and obviously canceling a 00:41:54.720 |
contract for Starlink, which is by far the best rural 00:41:57.800 |
internet service. How do you even justify these cases on 00:42:01.560 |
their merits? I'm not just, I'm in 100% agreement. You think 00:42:06.080 |
this is politically motivated harassment of Elon by 00:42:09.400 |
the Biden administration? 100%. He said it. He said it, and he 00:42:12.600 |
didn't invite him to the EV summit. So you just take Biden at his 00:42:15.440 |
actions. If you don't invite Elon to 00:42:18.440 |
the EV summit, it's obvious that he's got it in for this guy. And 00:42:22.280 |
now it's obvious he's told people to, you know, 00:42:25.200 |
investigate him and harass him. It's obvious. 00:42:27.360 |
So why do you think they don't like him? Why do you think 00:42:31.600 |
Why doesn't Biden like him? Because he's non union. It's 00:42:34.640 |
obvious. That's the that's the beginning and end of it. I mean, 00:42:37.280 |
I'm sure the freedom of speech things and, you know, Twitter 00:42:40.080 |
doesn't help. But this predates, Biden is a union guy. And he 00:42:44.760 |
will not have non union people. He will not support non union 00:42:47.960 |
people. He is bought and sold by the unions. That's 100% anyone 00:42:51.200 |
That may be how it started. But I think you're underrating the 00:42:55.480 |
Oh, I said it could have to do with that. But it's definitely 00:43:00.080 |
More importantly, you're saying enabling dissenting voices 00:43:03.840 |
strongly dissent. Absolutely. I think I think that from the get 00:43:07.320 |
go, they have sought to exercise control over the Twitter 00:43:12.000 |
formerly Twitter now x platform, because it is the town square 00:43:17.240 |
Oh, they succeeded. They succeeded. The FBI, everybody had 00:43:20.000 |
total control of it until Elon somehow bought the company, which 00:43:23.160 |
was not in their plans. Frankly, that was just a fluke. I mean, 00:43:25.920 |
that was something that Elon did out of the blue, because he 00:43:28.480 |
cares a lot about free speech. And he opened up the Twitter 00:43:31.520 |
jails and, you know, stop the censorship and open up the 00:43:35.040 |
Twitter files, we found out that this was not just a private 00:43:37.480 |
company acting on its own. It was being directed or 00:43:44.600 |
they were giving them a list of tweets saying, Hey, these tweets 00:43:49.160 |
Yeah, you know, and the FBI acting as the quote unquote 00:43:53.760 |
belly button of the whole federal government, directing 00:43:56.160 |
all these takedown requests, totally unamerican. 00:43:58.040 |
I think that the pattern of actions more than anything, 00:44:01.280 |
mandates that Biden and his team actually have to address 00:44:05.840 |
publicly why it is not retribution. The absence of 00:44:09.520 |
doing that, at this point, is going to 00:44:12.520 |
be more damaging to them than just letting things go on and 00:44:15.640 |
claiming down the road. Hey, this is all part of the normal 00:44:20.920 |
what why would they address it? Why do they have to address the 00:44:23.640 |
media doesn't hold them accountable. The media doesn't 00:44:25.160 |
report it. The media pretended like the Twitter files never 00:44:27.040 |
even happened. Remember that zero mainstream media coverage 00:44:30.040 |
of the Twitter files? Zero mainstream media coverage of 00:44:34.440 |
these retaliatory lawsuits? Why would the administration need to 00:44:37.200 |
explain itself? Why would they even talk about it? The fix is 00:44:39.960 |
in. Well, I mean, now the question is being... I think the 00:44:43.120 |
mainstream media are stenographers for one side of 00:44:45.840 |
the political spectrum, which is precisely the reason did you 00:44:48.680 |
guys? It's precisely the reason why they're so upset with Elon 00:44:52.400 |
with opening up free speech on Twitter, because they had total 00:44:56.040 |
control of the public discourse until he did that. 00:44:58.920 |
Did you see that thing this morning where somebody called 00:45:01.960 |
out the New York Times for selectively editing what Hunter 00:45:05.400 |
Biden said to make it more broad. Did you see that? 00:45:10.360 |
Yeah, he basically said my father has not financially been 00:45:14.200 |
involved in my businesses. And the New York Times took out the 00:45:17.320 |
word financially, to make it more broad to say he's not been 00:45:20.200 |
involved in my business. He was involved in his businesses. 00:45:23.000 |
Really? Yeah, yeah. And then there's a clip where they show 00:45:26.080 |
the article. And they show and then they show his interview. 00:45:29.360 |
And the interview is very clear. He says it, but the New York 00:45:32.200 |
Times headline omits the word, and doesn't doesn't put it in a 00:45:35.640 |
bracket so that it shows that it was edited. It shows that that 00:45:38.240 |
was the quote. Can you pull it up? Do you have that 00:45:40.720 |
thing? Yeah, crazy. Let me find it. That's crazy. 00:45:44.640 |
I mean, I would like to think there's a possibility this was a 00:45:47.400 |
mistake. But man, it's pretty bad. I mean, this is really bad 00:45:50.280 |
by the New York Times. And they got to figure out who actually 00:45:53.400 |
took that word out. Because it's pretty clear Biden was very 00:45:56.960 |
involved in hunters businesses. And that's why he put that word 00:46:01.080 |
financially in there. Because he was on, I think he's on all the 00:46:05.600 |
documents as being part of it. And he's in the emails. So he 00:46:10.920 |
There it is. Thank you, Nick, you're very good at finding 00:46:13.920 |
these things. Okay. So if you look on the left, that's the 00:46:16.680 |
article in the New York Times. And it's clear that that was the 00:46:21.360 |
quote. And then if you play it on the right, it's actually what 00:46:24.400 |
let me state as clearly as I can. My father was not 00:46:31.240 |
Wow. Yeah. And then quotes as clear as I can. My father was 00:46:36.320 |
not involved in my business. Now, in journalism, you could 00:46:40.520 |
put an ellipsis after involved, you know, three dots. But why 00:46:45.840 |
would you do that? This is breaking very basic journalistic 00:46:48.560 |
standards. My point, there's like a format, right? When you 00:46:50.960 |
edit out a word like that? Yeah, you would only edit out a word 00:46:54.080 |
if it was superfluous and you wanted to have a tighter quote; 00:46:57.440 |
if the person said, ah, you can take that out. And if 00:47:00.640 |
you were taking out a long quote, you put three dots, and 00:47:03.280 |
then you would show that you cut the quote, there was something 00:47:06.120 |
in between. And then you went to that. In the case of something 00:47:08.520 |
this important, you would never take out a keyword like that. 00:47:12.480 |
That was this is just journalism one on one. So I mean, if this 00:47:15.840 |
happens, man, it is, it's the keyword in the sentence, by the 00:47:20.240 |
way, it's the keyword in the sentence, it is the keyword in 00:47:22.600 |
the sentence. So if the person took it out, man, who took it 00:47:25.640 |
out. And this is the problem with the New York Times is they 00:47:28.600 |
bury their corrections, they need to, and this is back to 00:47:32.280 |
accountability, you're saying Freeberg, the Biden 00:47:34.120 |
administration has to explain why they excluded Elon from the 00:47:37.240 |
EV summits. And the New York Times needs to explain why they 00:47:40.080 |
did this, or else, you know, the mind wanders that there's some 00:47:43.480 |
conspiracy going on here or targeting. And I wonder if they 00:47:46.780 |
changed. Well, here's the live story. Is it changed in the 00:47:48.720 |
story? Did they change it yet? Luke Broadwater, I mean, Luke 00:47:52.680 |
Broadwater, is that a name from central casting? That's not a 00:47:55.920 |
I don't know. Let's say they post a correction. 00:47:57.960 |
Oh, here we go. Correction: An earlier version of this article 00:48:00.280 |
misquoted Hunter Biden. He said, my father was not financially 00:48:02.840 |
involved, not, my father was not involved. So the only room I have here is if the 00:48:06.960 |
person was live transcribing it, maybe, and they left it out. 00:48:10.320 |
But this is too important to not have a fact checker go through 00:48:12.880 |
it. And to have to get called out on it to fix it. He just 00:48:17.000 |
J. Cal, it's like putting on the front page, I did not kill that 00:48:22.040 |
person, and then a day later, it's buried on page 812. I actually 00:48:27.640 |
I mean, the correction at the bottom of the story is good, but 00:48:31.880 |
everybody sees the front page. Very, very few people, you would 00:48:34.920 |
agree with me, very, very few people see the correction. In the 00:48:37.240 |
old days, they would put the correction on, like, A2 or A3 00:48:40.280 |
of the paper, and it would be small and be at the bottom in 00:48:43.160 |
the digital age, you put the correction at the bottom of the 00:48:45.280 |
article. So it's a little bit better. But the truth is, most 00:48:47.960 |
people don't go to the bottom of the article. So there's an 00:48:50.680 |
argument to put corrections at the top of the article. But 00:48:53.440 |
journalists don't want to admit when they're wrong. I think that 00:48:56.280 |
I saw a list of all of the organizations investigating 00:49:00.520 |
Elon. And what was surprising was how broad some of these 00:49:05.800 |
organizations felt that they had a mandate to look into him. So 00:49:10.760 |
there was like, I want to say, Nick, maybe you can find this on 00:49:13.520 |
Twitter, but they had a list of them. And it was like, the Bureau 00:49:16.280 |
of Land Management investigation. I mean, it just 00:49:20.280 |
makes no sense. Like, it just does not smell right. 00:49:24.360 |
In fairness, Elon is involved in many, many very important 00:49:28.560 |
projects. So there would be a lot of agencies that speaks to 00:49:31.080 |
overregulation. And then you have to drill down and say, 00:49:33.400 |
okay, when are they actually targeting him? And so that's 00:49:36.440 |
going to be a lot of parsing. Fish and Wildlife, you know, Health and 00:49:40.160 |
Human Services, Housing. I mean, the list goes on. You might have a 00:49:45.560 |
No, I think my understanding of it is, you know, at Starbase, 00:49:50.600 |
there's some estuaries or something. And there was a lot 00:49:54.680 |
of estuary. I mean, yes, we protect animals and whatever. 00:49:59.680 |
This is something that happens all over the country, where 00:50:02.880 |
California is actually probably the leader in this, but I think 00:50:06.240 |
some crabs might have got burned, not in a barbecue, but 00:50:09.760 |
by the rocket. I mean, literally that. And this 00:50:13.160 |
speaks to what risk are we willing to take to make 00:50:15.920 |
progress as humanity. Friedberg, remember we had this discussion 00:50:18.720 |
about self driving cars? Like, if getting to Mars and being 00:50:23.040 |
multi planetary kills some crabs, I think we should be okay 00:50:26.240 |
with that. In fact, if it decimated, I mean, it's not 00:50:29.880 |
decimated, but let's just say 100 square miles got decimated 00:50:32.800 |
by getting to Mars on planet Earth. Well, you'd make that 00:50:36.160 |
Yeah, I mean, this is the same standard I think I feel 00:50:40.440 |
around: if there's a mouse infestation in my house, I'm not 00:50:44.400 |
going to let the mice live in my house, even though I'm completely 00:50:48.000 |
ethically against killing animals. Killing animals to eat 00:50:52.040 |
when I have other options, I'm against. And on animal 00:50:54.840 |
testing in medical applications, I have a totally 00:50:57.520 |
different standard than I think what is standard in the market 00:50:59.680 |
today. So for me, it's like a pretty sensitive topic, because 00:51:02.920 |
my ethics are don't kill animals unless absolutely necessary. And 00:51:06.160 |
the question is, what is necessary? What is the 00:51:07.760 |
definition of necessary? And so these sorts of points that 00:51:10.360 |
you're making about, you know, if it gets all humans to Mars, 00:51:13.680 |
that might be a trade off worth making for some crabs. I don't 00:51:17.520 |
probably hard to understand the analogy here. Are you saying 00:51:20.520 |
that he wants the mouse? The rocket may have killed the mouse 00:51:25.320 |
the regulators on the house. Wait, what's the who's the mouse 00:51:28.280 |
and who's the house? The house is the rocket ship. Clearly, 00:51:32.080 |
the mouse is the mouse in the house, right? Yeah, yeah. Let's 00:51:35.480 |
pull up this quote from Brendan Carr, the FCC Commissioner. I 00:51:38.360 |
thought this is amazing. This list is incredible. This is the 00:51:41.920 |
FCC Commissioner. He said the DOJ, FAA, FTC, NLRB, 00:51:47.880 |
SDNY, and FWS, I guess that's Fish and Wildlife, have all taken 00:51:52.040 |
action. The FCC now joins them. Man, that's incredible. Yeah, 00:51:56.000 |
it's a little bit nuts. Look at that Biden quote, where I didn't 00:51:59.320 |
actually know about the second part. I knew about the first 00:52:01.120 |
part where he says we got to take a look at this guy. But 00:52:03.680 |
then he was asked how, and this is how President Biden responded. 00:52:07.200 |
There's a lot of ways a lot. You know, you know, there's a lot 00:52:09.640 |
of ways. There's a lot of ways to get to somebody. I can get to 00:52:13.320 |
you, you might be able to get to me, I might be able to get to 00:52:18.760 |
You know what else Biden said? There's a lot of ways is when he 00:52:24.320 |
was talking about the Nord Stream Pipeline. And he said, 00:52:28.600 |
that pipeline is not going to move forward. And then the 00:52:32.120 |
press said to him, Yeah, but that's like a 00:52:35.160 |
German Russian project. Like how? What's your involvement? 00:52:38.600 |
He said, we got ways, a lot of ways. We got ways. Wow. 00:52:43.680 |
Ouch. Okay, so on the counterpoint, obviously, Elon 00:52:46.520 |
has several pretty sprawling businesses. He has self 00:52:49.560 |
driving cars, right? And they push that, right, they push the 00:52:52.240 |
envelope on, you know, where there's an existing regulatory 00:52:56.240 |
framework, same with going to Mars, right, same with 00:52:58.760 |
transmitting internet services, wireless communications, like, 00:53:01.840 |
you know, there is a regulatory framework for all of these 00:53:03.920 |
businesses. And he's on the bleeding edge, and typically 00:53:06.400 |
beyond the framework to some degree. So I think it's like 00:53:08.840 |
worth acknowledging, at least, that there's a necessity of 00:53:13.080 |
scrutiny and involvement from these agencies, given that they 00:53:15.240 |
do have regulatory authority and responsibility over these various 00:53:19.560 |
businesses. And he's well beyond where anyone else is in each of 00:53:22.480 |
them. So I just want to acknowledge that. Hold on, let me 00:53:24.400 |
respond to that. Yeah. He's well beyond where other people are in 00:53:28.080 |
his industry in terms of innovation. He's the first to 00:53:30.600 |
acknowledge because I've heard him say this many times that 00:53:32.640 |
he's in highly regulated industries. And they've got, you 00:53:36.160 |
know, massive compliance programs at Tesla and SpaceX and 00:53:39.920 |
all these different companies. What we're judging these 00:53:43.480 |
regulatory agencies on is not that there's a need to regulate 00:53:47.880 |
Elon's companies within the framework of their industries, 00:53:51.080 |
but rather the specific actions that are being brought. Remember, 00:53:56.560 |
DOJ suing SpaceX for not hiring enough refugees, right, Tesla 00:54:01.840 |
being sued on this glass house business, whatever that is, 00:54:04.720 |
right. Now, those are voluntary actions. And the FCC cancelling a 00:54:08.400 |
contract. Yeah, three years early speaks volumes. Yeah, exactly, 00:54:12.840 |
three years before they even need to judge that contract. 00:54:17.920 |
100%. One of the things that happened this week, which I think is 00:54:20.960 |
important along this vein is that the IRS is in charge of 00:54:24.640 |
making sure that you can claim the $7500 EV tax credit for cars. 00:54:28.440 |
And a lot of us that have been looking at this issue, the way 00:54:32.440 |
that they break the EV tax credit in half. And part of 00:54:36.320 |
it is about where the material is sourced. And part of it has 00:54:39.680 |
to do with the total sum of certain components of the car 00:54:43.240 |
and how much of those are made in the US, etc. Okay. And it was 00:54:48.880 |
presumed, just based on the trend that Tesla would lose half 00:54:52.800 |
the credit, keep half the credit. And in a bit of a 00:54:55.680 |
surprising move, the IRS came out and said the whole 00:54:59.720 |
thing is not going to be acknowledged anymore. So Tesla 00:55:02.200 |
had to go and put on the website that the credit ends as of 00:55:06.280 |
December 31. So I would add the IRS to this list as 00:55:11.040 |
well. That's so crazy. So it's got to be investigated. I like 00:55:13.680 |
Friedberg's suggestion that they prove to us that they're not doing this 00:55:18.800 |
at this point. Because it's pretty clear that it is 00:55:24.360 |
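The two-part credit structure described here can be sketched in code. This is an illustrative sketch only: the function name and the boolean inputs are assumptions, and the actual IRS rules involve detailed sourcing-percentage thresholds rather than simple yes/no tests, but the $7,500 credit splitting into two $3,750 halves (one for critical-minerals sourcing, one for battery components) is the structure being discussed.

```python
# Illustrative sketch of the two-part clean-vehicle credit discussed above.
# The real IRS rules use detailed sourcing-percentage thresholds; the
# boolean inputs here are simplifying assumptions for illustration.

def ev_tax_credit(meets_critical_minerals_rule: bool,
                  meets_battery_components_rule: bool) -> int:
    """The $7,500 credit splits into two $3,750 halves, one per rule."""
    credit = 0
    if meets_critical_minerals_rule:
        credit += 3750  # half tied to critical-minerals sourcing
    if meets_battery_components_rule:
        credit += 3750  # half tied to battery-component manufacturing
    return credit

# A car meeting only one requirement keeps half the credit;
# a car meeting neither gets zero.
print(ev_tax_credit(True, False))   # 3750
print(ev_tax_credit(False, False))  # 0
```

This mirrors the "lose half, keep half" outcome that was presumed, versus the full loss that was announced.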
Good luck getting them to do that. Just on the on that EV 00:55:28.360 |
subsidy, you know, one of the perverse things about this is 00:55:31.600 |
that the administration is putting the thumb on the scales 00:55:35.120 |
against Elon in favor of these less innovative competitors who 00:55:39.960 |
have worse products. So like Elon said, if you want to 00:55:43.600 |
cancel our contract for Starlink to provide this rural 00:55:47.160 |
broadband, that's fine to save taxpayers the money. But by all 00:55:50.680 |
means don't then give the money to these other services that can't 00:55:53.440 |
deliver. What's the point of that? And same thing on the 00:55:55.720 |
electric cars. I mean, the subsidy is going to these other 00:56:00.560 |
Yeah, totally. This is the key point. The fact is, all of 00:56:04.160 |
those people who just had their Starlink cancelled through the 00:56:06.440 |
government, I guarantee you, they will buy Starlink, because 00:56:11.200 |
it's the best product. So irony of ironies, they're just gonna 00:56:14.720 |
go spend 60, 70, 80 bucks a month to put their own Starlinks in 00:56:17.760 |
like everybody else around the world who lives rurally. I have 00:56:20.800 |
Starlink. One more thing to add: under pressure from 00:56:24.480 |
regulators, they announced a recall on Tuesday of 2 million 00:56:28.360 |
cars to fix some of the Autopilot software. 00:56:31.640 |
Yeah, but that's an over the air update. So the press went crazy 00:56:35.000 |
No, no, no. What I'm saying is, if you read that, if you read 00:56:37.680 |
the article, that over the air update is specifically because 00:56:41.120 |
of, again, how it's written, regulatory pressure to change 00:56:46.040 |
how the software behaves some tuning and some edge cases. My 00:56:49.160 |
point is that one could guess that there is an attempt here to 00:56:53.960 |
kind of do the death by 1000 cuts approach, right? So the 00:56:57.480 |
drip drip water torture of just like, a thing over here, a thing 00:57:00.800 |
over here, a thing over here, a thing over here, eventually 00:57:03.040 |
companies can get distracted and misfire. And so the question I 00:57:09.840 |
do think is like, you know, does it make us better off if all of 00:57:13.320 |
these little ticky tacky foot faults are enforced by the 00:57:18.080 |
government? I think I think we all know what the answer is. No, 00:57:21.240 |
we have an example that Microsoft got so distracted by 00:57:24.360 |
their court cases that the company went sideways for a 00:57:26.840 |
Well, but I mean, I think there was a lot of good basis for that 00:57:31.280 |
particular antitrust case, and Microsoft 00:57:33.440 |
clearly had a monopoly. Just on the recall thing, I did see that 00:57:36.120 |
story at the top of Drudge or whatever last week, where it 00:57:39.240 |
said, every Tesla has to be recalled. When I see the word 00:57:43.760 |
recall, yes, I think that means you got to bring it to the 00:57:46.240 |
dealership and get like some part swapped out. But that's not 00:57:50.160 |
No, well, so what's so interesting is they refuse in 00:57:52.320 |
these articles, I can show you the New York Times version of 00:57:54.440 |
it. They refuse to write that it's actually an OTA update. 00:57:57.800 |
Breaking news: I don't know if you guys saw this, but a billion 00:58:00.480 |
five iPhones were just recalled for the 17.2 update. So 00:58:04.560 |
everybody's gonna have to bring their iPhones in. 1.5 billion 00:58:08.040 |
iPhones recalled, bring them to the store. Yeah, you got to bring it 00:58:10.840 |
to the store. And then they give you this new journal app. I 00:58:13.160 |
don't know if you got it in the latest update. But Apple made a 00:58:15.920 |
journaling app so that you can have more anonymity. Yeah, so 00:58:18.960 |
that's the recall. Yeah, it's total recall. All right, listen, 00:58:21.960 |
now we keep the red meat going. Sacks is cooking with oil. Alex 00:58:25.320 |
Jones, the controversial conspiracy commentator of InfoWars 00:58:29.840 |
fame, is back on Twitter after Elon did a poll. He got 2 00:58:33.760 |
million people to respond asking if he should be reinstated. 70% 00:58:37.280 |
said yes. Of course, Jones encouraged fans to vote in this 00:58:40.080 |
poll. So I'm not sure how scientific it is. For 00:58:41.960 |
background, Twitter permanently banned Jones in 2018 after 00:58:45.440 |
accusing him of posting direct threats of violence and hate 00:58:47.960 |
speech. He'd already received bans from Apple, Facebook, and 00:58:51.280 |
YouTube. He's pretty much the number one person to be 00:58:55.200 |
deplatformed. As you know, Jones was ordered to pay $1.5 billion to 00:59:00.320 |
the families of eight Sandy Hook victims. This is across two 00:59:05.840 |
cases in Texas and Connecticut. And here is Jones in his own words. 00:59:11.400 |
Sandy Hook, it's got inside job written all over it. Sandy Hook 00:59:16.760 |
is a synthetic, completely fake with actors. In my view, 00:59:22.560 |
manufactured. I couldn't believe it at first. The Newtown kids, 00:59:25.680 |
they take them, put them in our face, tell us their names, who 00:59:29.800 |
they were. I heard an ad this morning on the radio Bloomberg 00:59:33.560 |
paid for locally going. I dropped Billy off and watched 00:59:38.160 |
him go around the corner. And he never came back all because of 00:59:41.800 |
the guns. Won't you just turn your guns in for my son? Why'd 00:59:46.400 |
you do it to him gun owners? Forgive my language, but that 00:59:50.000 |
guy. Okay, there it is. We hate that guy, but let's not censor him. 00:59:56.680 |
Unfortunately, I absolutely cannot stand that. That is just 01:00:01.040 |
like heart wrenching, like evil, awful spewing out of his mouth. 01:00:06.640 |
And he still, you know, should have a right to speak. But that 01:00:11.040 |
guy, I was never a big crier. Part of it was just my defense 01:00:16.080 |
mechanism. And I remember Sandy Hook, because I had just become 01:00:21.760 |
a parent. I had, I think, two kids by that point. And I was 01:00:28.320 |
uncontrollably crying when that happened. And it was the first 01:00:31.640 |
time I realized how you change as a parent and you just develop 01:00:35.720 |
this empathy. And then you realize how precious kids lives 01:00:40.520 |
are. And I become more and more of a crier as my kids have grown 01:00:46.040 |
older. And I really appreciate that what my kids have done for 01:00:50.640 |
me. So when I hear him talk like that, I guess he has a right to 01:00:56.840 |
say what he wants, but he is a complete piece. 01:00:59.360 |
Yeah. Okay, so let's get into this very difficult question. 01:01:03.960 |
In fact, I don't want to force you to defend, you know, one of 01:01:07.640 |
those horrible humans. I think we can all agree. 01:01:10.240 |
Well, my position is pretty similar to what the other guys 01:01:12.160 |
said, which is what he said was odious. However, that doesn't 01:01:15.760 |
necessarily mean he should be censored. We have standards. 01:01:21.680 |
So first of all, let me back up. I mean, I didn't even really know 01:01:23.880 |
who Alex Jones was. I mean, I only knew him because of the 01:01:27.120 |
controversy. I've never actually listened to his show. I'm not 01:01:29.320 |
really interested in what he has to say. I do think that if 01:01:34.280 |
you're going to play this clip of his mistake going back many 01:01:37.360 |
years, you should supplement it by playing a clip of what he 01:01:41.120 |
says now. And what he says now is, he's apologized, he's 01:01:44.520 |
admitted he made a mistake. He basically bought into a 01:01:47.600 |
conspiracy theory. But it wasn't just him saying it. Apparently 01:01:51.040 |
he had some people on the show, who I don't know if they were 01:01:54.120 |
purported experts or what, but they were making a case that the 01:01:57.720 |
whole Sandy Hook thing was a hoax. And it was being done to 01:02:02.080 |
basically, you know, get people's guns. I mean, look, 01:02:04.880 |
it's nutty stuff. I'm not defending it in any way. But he 01:02:07.600 |
explained that he bought into that, into that theory or hoax 01:02:11.160 |
or whatever. And he thinks it's a terrible mistake, and he's 01:02:14.640 |
apologized for it. And the question is, are you going to 01:02:17.840 |
have a lifetime ban on somebody for saying things that were 01:02:22.440 |
wrong and odious, when they have now apologized? And for me, it's 01:02:27.400 |
not about Alex Jones, it's about censorship. Remember, when this 01:02:33.480 |
case happened way back in 2018, it was really hard to defend 01:02:37.600 |
keeping this guy on the platform in light of what he had said and 01:02:40.920 |
done, because everyone's reacting very emotionally to 01:02:43.680 |
it. And it was defenders of free speech, like 01:02:47.120 |
Glenn Greenwald, who said that, listen, if you take Alex Jones 01:02:51.040 |
out now, if you have a permanent ban, it will basically be a 01:02:54.560 |
slippery slope, and it will create a precedent, and other 01:02:57.720 |
people will get banned. And sure enough, just two years later, 01:03:01.200 |
Twitter was banning people like Jay Bhattacharya, the Stanford 01:03:04.880 |
doctor, for saying dissident things about COVID that turned 01:03:08.560 |
out to be completely correct. He co-authored the Great 01:03:11.000 |
Barrington Declaration, talked about how lockdowns wouldn't 01:03:13.760 |
work, and so on. And so even within two years of this 01:03:18.280 |
decision around Alex Jones, the censorship was totally out of 01:03:21.840 |
control. And so I think the people who warned us that Alex 01:03:25.520 |
Jones would become a slippery slope ended up being completely 01:03:27.560 |
correct. To me, that's the symbolism of the restoration of 01:03:31.440 |
Alex Jones's account. It's not endorsing what he did. It's not 01:03:34.840 |
saying that what he said wasn't odious. I mean, look, again, I 01:03:40.040 |
have zero interest in even listening to the guy. But the 01:03:43.240 |
point is that free speech does require us to put up with people 01:03:48.160 |
who are wrong, people who are even hateful sometimes, 01:03:53.320 |
people who put out misinformation. 01:03:56.960 |
That's what free speech requires us to do. And if you want a 01:04:01.920 |
different standard, it's going to become a precedent for a lot 01:04:05.680 |
I agree with Sacks. The only place where I disagree with Sacks 01:04:08.520 |
is on Twitter not having a right to do this as a private 01:04:13.560 |
enterprise. I think Twitter had a decision to make on what kind 01:04:16.120 |
of editorial position they wanted to take with the content on 01:04:19.120 |
their platform, on their product. And they made a choice. 01:04:22.120 |
I think it was the wrong choice, personally. And 01:04:25.280 |
we've talked about this in the past. I think, you know, it's 01:04:27.640 |
great that Elon's making a different choice and 01:04:31.240 |
catering to a, you know, a different audience, perhaps, 01:04:33.760 |
with a different product that has more open speech. But that's 01:04:37.920 |
not, you know, a government free speech mandate. That's a private 01:04:42.200 |
enterprise mandate. And I do believe in the right to free 01:04:45.440 |
speech, I think it's a little bit ironic to say that it's 01:04:50.880 |
inappropriate when someone says something that is 01:04:52.960 |
misinformation, because it's incorrect or unprovable. When we 01:04:57.840 |
have an entire group of people that believe in something called 01:05:02.000 |
religion, and much of religion is based on this concept of 01:05:04.960 |
faith and belief without necessarily hard proof or 01:05:08.960 |
evidence. And we allow religious speech, you know, in 01:05:13.760 |
many forums, without saying, hey, that's misinformation, or 01:05:17.000 |
hey, it's not true, or hey, it doesn't meet the standards of X 01:05:20.040 |
or Y or Z, scientific assessment or understanding. And so I think 01:05:24.280 |
it's just worth acknowledging that this whole concept that 01:05:26.560 |
someone has to ultimately be the police of the truth, and the 01:05:29.360 |
police of fact and the police of information is going to lead to 01:05:32.000 |
a bad place. And I'd rather have more free speech with people 01:05:34.520 |
saying misinformation and saying awful putrid things than one 01:05:38.160 |
where a few people get to decide what everyone gets to hear. So 01:05:42.960 |
you may be right that Twitter as a private company had the right 01:05:48.360 |
as our laws currently exist to decide who they were going to 01:05:50.760 |
suspend and ban from the site. However, once that censorship 01:05:54.320 |
power was created, it attracted powerful entities from our 01:06:00.000 |
government who wanted to co opt and use that power. That's what 01:06:02.600 |
we saw in the Twitter Files, right, with FBI agents 01:06:04.960 |
sending takedown requests. That's what happens: when you 01:06:08.320 |
create the censorship power, people will abuse it. 01:06:11.440 |
But more to the point, it's such a tempting 01:06:16.880 |
power to use by people in authority, right? It's like the 01:06:21.320 |
ring of power, those tools that Twitter created, it's like they 01:06:26.040 |
released a pheromone or something that attracted all 01:06:28.920 |
these powerful shadowy actors from the federal government in 01:06:32.280 |
the FBI and all these agencies. And so that that is why I think 01:06:36.360 |
it's just very dangerous for even private companies to create 01:06:38.680 |
these censorship regimes is that they can be co opted and abused 01:06:42.720 |
being co opted and abused is the issue. I don't think that the 01:06:46.000 |
issue is their choice in what kind of content they want to put 01:06:49.760 |
out. You can go to the kids version of Netflix, Netflix 01:06:53.280 |
Kids, where they control what content is shown, and they provide 01:06:56.160 |
a different version than what they provide to adults. And I 01:06:58.600 |
think like editorializing the content platform that you're 01:07:01.040 |
making available, whether it's user generated, or paid for or 01:07:04.320 |
whatever, is a totally reasonable, like approach to 01:07:07.200 |
running a business, a content business, the point you're 01:07:09.800 |
making is the right one, which is the point at which you allow 01:07:12.840 |
government agencies to intervene and have control and 01:07:16.160 |
manipulation over private citizens user generated content 01:07:19.080 |
is where I think it crossed the line. So I don't disagree with 01:07:23.000 |
May I ask two clarifying questions here, because I'm 01:07:25.400 |
curious how you would handle this. If you were the CEO of X, 01:07:28.920 |
formerly known as Twitter, would you have reinstated Alex Jones? 01:07:33.040 |
Yes or no. And then number two, if Alex Jones, then as a new 01:07:37.920 |
member of the community who's been reinstated and forgiven, 01:07:40.360 |
because he apologized, and then he did this again, this exact 01:07:44.360 |
same thing again, with another school shooting with parents, 01:07:50.280 |
I don't know that these are yes or no questions. What I would 01:07:53.360 |
say is that I've written what I think should be a speech policy 01:07:56.800 |
for social media platforms, in a blog post I did several years 01:08:00.000 |
ago. And what I said is that I would take First Amendment case 01:08:04.440 |
law and operationalize it for social media platforms. There 01:08:10.320 |
are nine categories of speech that the Supreme Court has said 01:08:13.200 |
are not protected speech because they're dangerous in some way. 01:08:17.880 |
So for example, incitement to violence is one of them, you 01:08:20.440 |
know, harassment is one of them. So I would use 01:08:24.760 |
Well, people, his fans went and knocked on the door. His fans. 01:08:30.320 |
So as I understand the whole Sandy Hook thing, what happened 01:08:33.600 |
is, he said the whole thing was a hoax. That obviously wasn't 01:08:36.080 |
true. He paid a huge price for that. His fans, then some of his 01:08:40.840 |
crazy fans went and harassed the parents, which obviously is not 01:08:44.440 |
right. But according to him, he didn't. And I don't know that 01:08:48.680 |
anyone's shown that he did that. I don't think he encouraged 01:08:54.000 |
that. Well, of course he does. It's a conspiracy show. So 01:08:58.360 |
knowing it's a conspiracy show, knowing that incitement to 01:09:01.640 |
violence is one of your criteria. If his fans after him 01:09:05.280 |
saying it's fake, then went to the house, knocked on the door 01:09:09.280 |
and asked the parent to see little Susie because you know 01:09:12.640 |
she's alive. Would you kick him off the platform? 01:09:17.120 |
The Great Barrington Declaration was 01:09:22.000 |
declared to be a conspiracy theory. The idea that COVID 01:09:25.120 |
originated in a lab was considered a conspiracy theory. I 01:09:27.920 |
don't think you can prejudge in advance that a show is, quote 01:09:33.200 |
unquote, dangerous, factually wrong conspiracies. As I 01:09:37.800 |
understand it again, I haven't watched the show, but I did 01:09:39.560 |
watch a clip by Joe Rogan, who provided something of a 01:09:42.280 |
character reference for Alex Jones. I don't know if Nick can 01:09:44.840 |
find that and play it; it was actually quite good. What Rogan 01:09:48.360 |
said is look, I've known Alex Jones for like 30 years. He's 01:09:51.800 |
had problems with alcohol abuse, substance abuse, whatever. He's 01:09:55.440 |
had mental health issues that he's acknowledged. And sometimes 01:09:58.320 |
he goes off the rails. At the same time, he's also been way 01:10:02.440 |
ahead of the curve on certain things. For example, he told me 01:10:05.200 |
about Epstein Island, like 10 years before the story broke. 01:10:09.000 |
I don't know how he figured that out. But somehow he did. Now 01:10:12.040 |
that was a conspiracy theory until it was proven true. And it 01:10:16.120 |
probably would have been a good thing for the public if that 01:10:17.960 |
story had come out a lot sooner, so that it could have been shut 01:10:20.680 |
down a lot sooner. So I don't think you can just judge in 01:10:23.640 |
advance that somebody is a conspiracy theorist and basically 01:10:26.080 |
blackball them from the internet. One other data point 01:10:29.000 |
I want to bring up is something that Elon mentioned: 01:10:32.880 |
he looked at the Twitter admin tools 01:10:38.200 |
to look at Alex Jones's account, and the third strike he 01:10:40.960 |
received, the one that caused him to be banned from the Twitter 01:10:45.520 |
platform by the former management, was that he actually 01:10:48.240 |
insulted a reporter, which was a very borderline case. So, you 01:10:53.720 |
know, the things that you're saying that he was banned for 01:10:57.960 |
Yeah, no, that's true. I think. Yeah, so that's why I was 01:11:01.680 |
framing it to you as, you know, and this is the issue that I 01:11:05.480 |
think, and maybe what some people are missing here, a 01:11:08.680 |
mentally ill person like himself, if he's admitted to 01:11:10.880 |
mental illness and substance abuse, when they go on these 01:11:13.600 |
tirades, or they go off their 01:11:15.800 |
meds, or whatever it is, or they're just evil, and they do 01:11:18.440 |
this for ratings to make money, it starts to cause real world 01:11:21.960 |
harm, people start showing up on these people's doorsteps. And so 01:11:24.760 |
then, are you going to wait 10 years for the courts to do this 01:11:28.800 |
$1.5 billion judgment, and then make the decision while real 01:11:32.320 |
world harm is occurring. And if you own the platform, my belief 01:11:35.800 |
is you have a higher standard, obviously believe in freedom of 01:11:38.120 |
speech, he can make his own website. But if you own the 01:11:40.520 |
platform, and the platform enables him to reach a large 01:11:43.000 |
number of people, and those people are being harmed, and 01:11:45.280 |
parents' doors are being knocked on by people demanding to see 01:11:47.280 |
their children, because Alex Jones said that child is still 01:11:49.360 |
alive, and they're trying to take our guns. And he knows his 01:11:52.440 |
fans are crazy. There's responsibility that comes from 01:11:55.080 |
it. And there's responsibility that comes with owning a 01:11:57.000 |
platform like this. I know he wants to go full freedom of 01:11:59.680 |
speech, but I would be very careful about this. 01:12:01.920 |
Steve Scalise, the House Republican whip was shot by a 01:12:05.360 |
crazy Bernie Sanders supporter. Does that blame go to Bernie 01:12:08.440 |
Sanders? I think we have to separate. Hold on, 01:12:12.400 |
there is a legal standard for incitement. Okay, there's a 01:12:16.960 |
legal standard for judging that you're saying that these crazy 01:12:20.120 |
people were incited, but there is actually a legal way of 01:12:22.760 |
determining that. I don't think that's been proven. 01:12:24.840 |
I would do a common sense one, which is, do we see real world 01:12:28.800 |
harm happening? I would just use common sense. Do we see real 01:12:31.000 |
world harm happening? Okay, real world harm is happening. We own the 01:12:34.520 |
platform, we need to stop this, which is what happened. That's a 01:12:37.720 |
judgment standard, right? Yeah, I would make the 01:12:39.920 |
judgment. If I was the CEO, I'd make the judgment. And I would 01:12:42.520 |
make the judgment based on, you know, the courts are going to 01:12:45.120 |
take years to adjudicate this. And it's my platform. I don't 01:12:49.320 |
I would operationalize a content moderation policy based on First 01:12:53.480 |
Amendment case law. You're right that you can't always wait for 01:12:55.480 |
the courts to adjudicate it; there are going to be judgment calls. 01:12:57.640 |
I would have been fine, I think, with the suspension of Alex 01:13:01.880 |
Jones in that context, because it does seem pretty egregious, 01:13:04.680 |
and he's apologized for it. The question is whether there should 01:13:06.840 |
be lifetime bans. And I'm pretty much I think I'm against 01:13:11.920 |
lifetime bans. I'm okay with timeouts. I'm okay with 01:13:14.080 |
suspensions for egregious behavior. When somebody has 01:13:19.320 |
apologized, they've, I mean, had to pay. I mean, I think he's 01:13:22.920 |
been bankrupted. He's had to pay all these fines. I think he's 01:13:26.840 |
paid his price to society, so to speak. And he's admitted he was 01:13:31.320 |
wrong. The question is, do you still have the lifetime ban? It 01:13:34.360 |
seems to me he's acknowledged his mistake. If he does 01:13:37.400 |
this again, then you can suspend him. Maybe you do the ban. But I 01:13:41.640 |
do believe in giving people second chances. And I'm just 01:13:44.880 |
sort of viscerally against lifetime bans on people. 01:13:47.960 |
I don't like the standard of what can be deemed dangerous 01:13:52.000 |
speech, because I think that, as Sacks said, there's a clear way 01:13:56.320 |
to measure whether someone's inciting violence or inciting 01:13:59.000 |
harm, versus saying speech that can be deemed dangerous in some 01:14:03.920 |
contexts, and then not be deemed dangerous after the fact. COVID 01:14:08.120 |
vaccine conversations are the perfect example. Telling people 01:14:11.600 |
that there's health risks associated with taking a vaccine 01:14:14.360 |
in the period when everyone was worried about a pandemic killing 01:14:17.920 |
us all was deemed too dangerous to allow. And after the fact, it 01:14:22.320 |
wasn't dangerous, because there was suddenly clear evidence that 01:14:25.400 |
there may be some costs and benefits associated with the 01:14:27.800 |
vaccines. And so I really don't like this standard of dangerous 01:14:31.160 |
speech. In fact, I think that the biggest changes that are 01:14:34.040 |
necessary in society initially start as dangerous speech, and 01:14:37.720 |
then they eventually become true. And then they become a 01:14:40.160 |
standard and then things change. My repeated calls for 01:14:44.920 |
reduction in fiscal spending at the federal level, and lack of 01:14:48.640 |
accountability and fiscal spending at the federal level, 01:14:50.960 |
by some measure could be deemed dangerous speech, and an 01:14:53.800 |
incitement against the government. But really, my point 01:14:56.440 |
is to call out the importance of this, like issue. And after the 01:15:00.720 |
fact, I may be right, I may be wrong. And I need to be able to 01:15:04.520 |
say that. I think it's critically important to say those 01:15:07.160 |
sorts of things. And I think that other people in their own 01:15:08.920 |
domains will find other things that are critically important to 01:15:11.520 |
say, and that would be deemed by some standard to be dangerous at 01:15:14.640 |
the time. So as much as I have great disdain for, you know, 01:15:19.520 |
certain people and certain things that they may say, I do 01:15:22.520 |
think that what might be deemed dangerous speech is a critical 01:15:25.000 |
element of the kind of progressivism that's allowed the 01:15:30.320 |
I think it was dangerous speech, to promulgate the false 01:15:33.680 |
conspiracy theory that Trump was an agent of Putin. I mean, that 01:15:37.480 |
was in the Steele dossier, they basically said that Putin had 01:15:40.360 |
kompromat on Trump, and Trump was basically working for the 01:15:44.320 |
Kremlin. I mean, he was a traitor. I mean, what if there 01:15:47.880 |
were people out there who tried to assassinate the President on 01:15:50.640 |
the grounds that we can't have a traitor in the White House, 01:15:54.240 |
that was a private document, right? That wasn't like a 01:15:58.960 |
You make my point. Exactly. It's leaked, right. So there's like a 01:16:02.920 |
But then it was published by BuzzFeed. And then once it was in the echo 01:16:06.680 |
chamber was endlessly repeated by the mainstream media. So the 01:16:09.560 |
idea that like only people like Alex Jones, promote conspiracy 01:16:15.400 |
theories, the mainstream media promotes a lot of conspiracy 01:16:17.360 |
theories. And some of those theories, if acted upon by crazy 01:16:20.640 |
people, will be just as dangerous as the things that 01:16:24.880 |
That's a hypothetical. But actually, I think you and I are 01:16:27.840 |
not too far apart. You wanting to take these harms and 01:16:31.280 |
operationalize them is sort of what I'm saying. And in each of 01:16:34.200 |
these cases, it's a judgment call. And this is where I think, 01:16:36.920 |
you know, I, in many ways, I'm proud of what Elon is doing 01:16:40.800 |
in saying, like, freedom of speech is an absolute thing, and 01:16:43.120 |
that's what the platform is going to be. It's his right to do it. 01:16:45.480 |
It's his platform. And so I'm fine with that. I would 01:16:49.000 |
do something different if it was my platform. Everybody's 01:16:51.320 |
different, you know, everybody can take their stance. I 01:16:54.000 |
would have some basic humanity as my stance. And I'd be willing 01:16:58.520 |
to give up a little freedom of speech in my restaurant in my 01:17:01.440 |
cafe. In order to have it be more delightful for everybody 01:17:04.760 |
there. I wouldn't go as far as banning people talking about 01:17:07.280 |
COVID. But yeah, if somebody was trying to claim that parents of 01:17:11.240 |
murdered children were liars and actors, that would be fine for 01:17:14.560 |
me to say, yeah, no good. And then, of course, there's Kanye. 01:17:17.680 |
So you know, Elon banned Kanye, that was under his realm. And 01:17:20.680 |
this is what Kanye said. And I think this falls into hate 01:17:23.160 |
speech and real-world harm. I'm a bit sleepy tonight. But when I 01:17:25.800 |
wake up, I'm going death con 3 on Jewish people. The 01:17:30.120 |
funny thing is, I actually can't be anti semitic, because black 01:17:32.840 |
people are actually Jew. Also, you guys have played with me and 01:17:35.920 |
tried to blackball anyone whoever opposes your agenda. So 01:17:38.760 |
when you see this tweet, would you have banned him, Sacks? Or is 01:17:41.720 |
I might have given him a timeout. I think his family 01:17:44.920 |
wanted him to get a timeout because he was having an 01:17:47.440 |
episode. I certainly wouldn't give him a lifetime ban. Give me 01:17:50.080 |
a year or something. Look, I don't really know this Alex 01:17:53.160 |
Jones guy. I certainly don't know him in person. I don't even 01:17:55.120 |
listen to him. It's not a show I'm interested in. Even now. I 01:17:58.640 |
only know he's a conspiracy theorist. I like knowing the 01:18:02.200 |
truth. I like hearing facts. And I don't believe that factual 01:18:06.640 |
information, like the lab leak theory should be censored by 01:18:11.400 |
labeling it a conspiracy theory. Yeah, for example, but what I 01:18:16.040 |
would say about Alex Jones is there is some humanity and 01:18:19.200 |
allowing him a forum to apologize for what he did and 01:18:23.960 |
acknowledge his mistake and explain why he thought what he 01:18:28.120 |
did and why he was wrong. And that's what he did on x. 01:18:31.560 |
And I went on the Twitter spaces and I asked him a follow up 01:18:34.040 |
question, which he wouldn't answer. I said, How have you 01:18:36.040 |
changed your behavior? Yeah, but honestly, Jake, he wouldn't 01:18:39.080 |
answer it. You came bounding in in the last five minutes. I 01:18:42.360 |
didn't ask him a bunch of questions about what he did because 01:18:44.800 |
it had already been covered at the top of the pod. And I listened 01:18:47.560 |
to it. And he had not answered the question. How has your 01:18:49.720 |
behavior changed? And so he doesn't want to talk about that. It 01:18:52.720 |
might be for legal reasons. The first half hour was relitigating 01:18:55.240 |
Sandy Hook and you weren't aware of that. It was the first 10 01:18:58.360 |
minutes, but he won't answer questions on how he would change 01:19:01.320 |
his behavior. And so I think that's one of the things I would 01:19:03.800 |
want to see from him. How have you changed your behavior and 01:19:05.840 |
how you, you know, do shows? And I don't think he's answered that 01:19:09.600 |
question. Anyway, we're going to disagree on this one. Any final 01:19:12.100 |
thoughts, Chamath, as we wrap here on this issue. 01:19:14.840 |
The free speech litmus test is very simple. It's this exact 01:19:20.280 |
thing. It's when the person that you dislike says the thing that 01:19:24.280 |
you find very displeasing. What do you do? And I am a free 01:19:29.060 |
speech absolutist on this. I just think it's a very slippery 01:19:31.920 |
slope. And I don't think we're very capable of making these 01:19:35.740 |
delineations. And so I agree, the right solution is timeouts. 01:19:39.640 |
But lifetime bans, I think, again, go down this path where 01:19:44.120 |
human judgment gets involved, and then it's about the person 01:19:46.520 |
in charge, and then it becomes a power play, and then it 01:19:48.560 |
eventually always gets corrupted. So I can hold two 01:19:51.920 |
thoughts in my head. One, Alex Jones should be able to say what 01:19:54.880 |
he thinks, and two, it was disgusting, and he should be 01:20:00.080 |
You know, the bans doubling every time is probably a good 01:20:04.960 |
precedent as well. So I mean, we should do it like we do at our 01:20:07.080 |
poker game that fines go up. Yeah, phone penalty, penalty 01:20:10.360 |
doubles, exponential backoff. I mean, you're gonna 01:20:14.600 |
put your phone down and play the goddamn game if it gets to 800 01:20:17.400 |
or 1600. Because that stings a little bit. It totally 01:20:20.880 |
does. But maybe that's the right solution, Jason: you have 01:20:23.440 |
a fining mechanism somehow, and it just, like, increases. And 01:20:26.360 |
so there's a financial penalty, if nothing else, as well as a 01:20:28.680 |
timeout. When you violate these laws, at least that's a scalable 01:20:32.320 |
way to solve the problem in a way that's hard to corrupt and 01:20:37.320 |
game. But if it comes down to an individual or a 01:20:41.360 |
group of people's judgments, as we saw with the previous 01:20:43.840 |
management of Twitter, I think it's going to be a very 01:20:46.880 |
difficult problem. I don't think that those were bad people. But 01:20:50.720 |
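The doubling-fine scheme tossed around here ("exponential backoff", as in the besties' poker game) can be sketched in a few lines. The $400 base fine is purely an assumption for illustration, not any platform's or poker game's actual policy:

```python
def penalty(base, violations):
    """Doubling ("exponential backoff") fine: each successive
    violation costs twice the previous one."""
    if violations < 1:
        return 0
    return base * (2 ** (violations - 1))

# A hypothetical $400 base fine doubles to $800, then $1600,
# matching the numbers tossed out on the pod.
print([penalty(400, n) for n in range(1, 4)])  # [400, 800, 1600]
```

The appeal of a schedule like this is exactly what's argued above: it's mechanical, so it doesn't depend on any individual's judgment call.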
Yeah, I mean, Nick, do you have that clip from Rogan? Because 01:20:54.080 |
when I listened to this clip from Rogan, it did have an 01:21:00.640 |
impact. It does show, like, 20 years; it does show the human complexity. 01:21:04.200 |
And again, you're judging him based on the worst thing he ever 01:21:07.520 |
did. And Rogan presents a more balanced viewpoint about this 01:21:10.560 |
guy. Again, I have no dog in this hunt. I don't really care 01:21:14.760 |
about Alex Jones. But I'm just saying that if we're gonna sit 01:21:18.800 |
in judgment of people, I think maybe we should have a more 01:21:21.800 |
balanced view. Because, I mean, it does bias the conversation to 01:21:24.840 |
play the clip of the worst thing you ever did. Let's play 01:21:27.360 |
Rogan for a second. I think it's interesting. 01:21:28.800 |
Look at the way people look at Alex Jones now. Because Alex 01:21:31.720 |
Jones has been on my podcast a few times. The people that have 01:21:34.400 |
watched those podcasts think he's hilarious. And they think 01:21:37.840 |
that he definitely f***ed up with that whole Sandy Hook 01:21:41.080 |
thing. But he's right more than he's wrong. And he's not an evil 01:21:46.800 |
guy. He's just a guy who's had some psychotic breaks in his 01:21:50.920 |
life. He's had some genuine mental health issues that he's 01:21:53.800 |
addressed. He's had some serious bouts of alcoholism, some 01:21:57.520 |
serious bouts of, you know, substance abuse. And they've 01:22:01.320 |
contributed to some very poor thinking. But if you know the 01:22:05.160 |
guy, if you get to know him, like I have, I've known him for 01:22:08.400 |
more than 20 years. And if you know him on podcasts, you 01:22:11.520 |
realize, like, he is genuinely trying to unearth some things 01:22:17.280 |
that are genuinely disturbing for most people. Like, this is a 01:22:21.400 |
guy that was telling me about Epstein's island a decade ago. 01:22:28.880 |
Yeah. I mean, this platforming of mentally ill people during an 01:22:33.800 |
episode is a whole nother can of worms. I mean, I told this to 01:22:36.960 |
Lex Fridman when he had Kanye on during that episode. I said, I 01:22:39.880 |
think it's a very bad idea to spend two hours with somebody 01:22:42.080 |
who's having an episode. And sure enough, what did he do? 01:22:44.680 |
More anti semitic insanity on his podcast. And I just told 01:22:49.000 |
Lex, like, leave the guy alone. It's not worth it. Let him get 01:22:53.120 |
Look, I think I think you have a point there. But, you know, that 01:22:56.240 |
that would be an argument for a temporary suspension, not a ban, 01:22:58.920 |
Yeah. All right, Sacks, in other news. Something insane has 01:23:03.600 |
happened on the internet. It's never happened before. But 01:23:07.200 |
somebody has apologized for getting something wrong. This is 01:23:10.760 |
breaking news. We're in year 34 of the internet. And somebody 01:23:14.000 |
admits they were wrong. Is it my wife? Did she? She's never gotten 01:23:18.600 |
anything wrong. I've listened. I've been there for this whole 01:23:20.520 |
relationship. She has been 100 out of 100 times correct. 01:23:23.880 |
If she makes a mistake, we'll know, but it hasn't happened yet. 01:23:30.240 |
Actually, paradoxically, same with my wife. She's been right 01:23:33.640 |
No, my wife has been wrong four times. And I've gotten three on 01:23:38.560 |
voice memo, I taped them, I pull out the phone. And I'm like, 01:23:40.640 |
Hold on, I need you to say it again. Yeah. So I've gotten three 01:23:44.000 |
on voice. But that's because it's been only three. 01:23:46.120 |
Did those three have something to do with deciding to marry you 01:23:49.320 |
for her to move in with you and make children and start a life 01:23:53.080 |
together? It's so frustrating. How can one person be so wrong? 01:23:56.200 |
I eat me all the time. Well, I mean, at least you're self 01:24:00.040 |
aware. Everybody loves self aware. Chuck Chobot. It is. God 01:24:03.720 |
is a new trend in the world. But yes, Nassim Taleb publicly 01:24:08.040 |
admitted. We're gonna pull it up here, sacks. Here we go. He 01:24:11.280 |
publicly admitted that techno watermelon. That's your name, 01:24:14.520 |
sacks. That was his insult. I don't really understand the 01:24:17.080 |
insult. Either. I think he's saying your head is the size of 01:24:20.360 |
a watermelon. And that you're involved in technology. That's 01:24:23.160 |
my was my interpretation. I don't think you have a melon 01:24:25.160 |
head. Does it mean like I'm green on the outside and red on 01:24:28.520 |
the inside or something like somehow I'm supporting 01:24:30.040 |
communism, or I don't I don't really understand it. But that 01:24:32.800 |
has come up. Well, listen, it's better than mine. I'm a 01:24:35.240 |
psychotic ignoramus. So he hasn't retracted that for some reason. 01:24:39.320 |
Not yet. Not yet. But at some point, I'm sure he will. But 01:24:43.440 |
here it is. Well, he just said, I can see that David sacks is 01:24:46.360 |
correct about the relative strength of the parties in the 01:24:48.680 |
Ukraine war and I was wrong. All caps. Russia is not as weak as 01:24:53.200 |
it seemed, and it has staying power. This means a settlement is the 01:24:55.880 |
likely outcome. Anyway, it's so rare on the internet for anyone 01:25:00.440 |
to admit they were wrong. And what they usually do is just 01:25:07.160 |
memory hole, which is why you know, I always like to produce 01:25:10.320 |
receipts. I only do that for the people who strongly denounced me 01:25:12.920 |
about something. And then I end up being right. They never 01:25:16.080 |
concede. It's not just about the fact that I was right. It's about the 01:25:18.200 |
fact that they attacked me personally. And they never come 01:25:22.160 |
back and apologize or correct the record. And Taleb did 01:25:26.000 |
that. So kudos to him. I mean, I admit when I read this, I was 01:25:29.240 |
like, you know, like, yeah, I'm like, Okay, what's the gotcha? 01:25:34.280 |
I thought a trapdoor is gonna open under my feet. I thought a 01:25:40.520 |
cartoon piano was gonna fall on my head. I just I thought this 01:25:44.800 |
can't be it. And but no, that was it. So that was it. Well, 01:25:48.480 |
there it is. kudos. By the way, just I mean, the reason why I 01:25:54.320 |
understood what was going to happen in this, you know, 01:25:59.600 |
counteroffensive and why the war is not going to go as well as 01:26:02.400 |
people thought. It's not because I purport to be some sort of 01:26:05.600 |
Ukraine expert or foreign affairs expert, I mainly just 01:26:08.760 |
spent the time to figure out who the real experts were. And the 01:26:12.560 |
real experts are never the people who the media tells you, 01:26:15.160 |
you actually have to spend the time to look at people's track 01:26:18.480 |
records, what they said in the past, did it come true or not? 01:26:21.920 |
You know, it's basically a falsifiability standard, look 01:26:25.600 |
at what they predicted, look at what actually happened. And you 01:26:28.240 |
can figure out who the real experts are. And that's what I 01:26:31.200 |
did in the case of Ukraine, it was possible to figure out who 01:26:34.080 |
are the foreign policy scholars who got this right? Who are the 01:26:36.960 |
military bloggers who are accurately reporting 01:26:39.640 |
information? And who are the ones who are basically putting 01:26:41.880 |
out propaganda? If you spend the time to do that on virtually any 01:26:48.320 |
It's a great point. That's a really good point. Yeah. 01:26:50.400 |
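The "falsifiability standard" described above (compare what a commentator predicted with what actually happened, and score their track record) can be sketched mechanically. The predictions below are invented placeholders for illustration, not quotes from any real analyst:

```python
# Invented placeholder predictions: (claim, predicted, actually_happened).
predictions = [
    ("the counteroffensive succeeds", True, False),
    ("the war ends quickly", True, False),
    ("the conflict becomes a war of attrition", True, True),
]

def track_record(preds):
    """Fraction of predictions that matched what actually happened."""
    hits = sum(1 for _, predicted, actual in preds if predicted == actual)
    return hits / len(preds)

print(round(track_record(predictions), 2))  # 0.33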
You have to you have to find your own process of finding the 01:26:53.600 |
truth today, because I don't think you can trust the media. 01:26:56.840 |
100%. And by the way, as a VC, people say, well, what does a VC 01:27:01.280 |
think? You know, in a way, like what VCs do is when you get 01:27:04.520 |
interested in a topic, you kind of go deep, you're trying to 01:27:07.000 |
assimilate a lot of information, you try to figure out who the 01:27:08.800 |
real experts in the space are, so those are the people I 01:27:11.720 |
should listen to. And then you develop a take. It's not the 01:27:14.080 |
worst skill set in the world for doing a pod or tweeting out hot 01:27:18.760 |
takes on on Twitter. Again, I'm not saying I'm an expert. I'm 01:27:22.040 |
just somebody who is independent minded enough to get to the 01:27:24.960 |
bottom of an issue without just relying on what I'm supposed to 01:27:29.080 |
believe. And I just try to figure out who the real experts 01:27:32.040 |
All right, producer Nick, are you there? You did a tweet about 01:27:34.640 |
questions that people might want to have to ask the besties here 01:27:38.320 |
as we wrap this up. So we'll take your two favorite. 01:27:41.320 |
Thoughts on the Harvard board standing behind President Gay? 01:27:44.720 |
Okay, that's a good question. That's a great question. That's 01:27:47.040 |
good one for tomorrow. Here's what I'll say. I think that the 01:27:49.920 |
Harvard board was probably in a really difficult position in the 01:27:55.160 |
following way. You know, when you hire somebody, and you 01:28:00.720 |
realize that that person has some faults, you have three 01:28:06.320 |
choices, right? One is to fire them. Two is to be unequivocal 01:28:15.120 |
in their support. And three is to basically give a milquetoast, 01:28:20.600 |
CYA kind of a statement to give yourself time. The reason why I 01:28:28.600 |
think that President Gay wasn't fired was probably because the 01:28:32.280 |
board for whatever reason didn't want to seem like they were 01:28:34.920 |
kowtowing to Bill Ackman and all the other people that were 01:28:38.000 |
asking for her resignation. But what they didn't do is equally 01:28:42.400 |
important. They may not have fired her. But what they also 01:28:45.040 |
didn't do was come out with an unequivocal statement of 01:28:47.320 |
support. I think it was kind of a little bit wishy-washy in 01:28:51.440 |
acknowledging her mistakes, which seems to be a setup to 01:28:56.080 |
allow her to basically make a couple more mistakes so that 01:28:58.920 |
then they can fire her and they can all seem like they did the 01:29:02.400 |
right thing. So I suspect that that's what happens. She 01:29:05.880 |
probably won't be in that job in a year from now. Or, you know, 01:29:09.520 |
she kind of muddles along and in two or three years, she quote 01:29:11.800 |
unquote, retires to spend more time with her family. Anyone 01:29:16.800 |
Yeah, I mean, I think this university presidents debate's 01:29:18.960 |
been a little bit of a Rorschach test. And I've seen people that 01:29:23.760 |
I generally agree with fall into one of two camps. Some see it as 01:29:27.160 |
a free speech issue. Other people see it as a kind of woke 01:29:30.480 |
double standards or DEI issue. I think for those who see it as a 01:29:34.640 |
free speech issue, they're emphasizing the motivations of 01:29:38.840 |
people like Elise Stefanik, the person who asked the university 01:29:41.520 |
presidents the question. Basically, the 01:29:46.040 |
question was, does your code of conduct allow calls for genocide 01:29:49.480 |
of Jews? And their argument is that's a loaded question, 01:29:53.920 |
because there is not an epidemic on campus of people calling for 01:29:58.200 |
genocide of Jews. And so this is basically all kind of an 01:30:01.720 |
invented hysteria. And the purpose of it is to suppress 01:30:05.360 |
debate about this Israel Hamas war in Gaza, and it's designed 01:30:11.200 |
to expand campus speech codes, so that it's harder for 01:30:16.040 |
Palestinian supporters to protest in favor of their cause. 01:30:20.760 |
So that's one way of looking at it. My view on that is, if it 01:30:26.560 |
ends up being the case that campus speech codes get expanded 01:30:31.280 |
in that way, that'd be a bad thing. I don't think we need to 01:30:33.960 |
restrict speech on campus. So I agree with them on that point. 01:30:37.280 |
However, there is a different way of looking at this, which is 01:30:40.120 |
the motivations of the university presidents and 01:30:43.040 |
answering that question. And yes, it was a loaded question, 01:30:45.960 |
but they flubbed the answer. And the question is why, because as 01:30:49.320 |
we talked about last week, if they were asked about calls for 01:30:52.800 |
the murder of any other group, a racial group, or trans people, 01:30:58.440 |
or something like that, Asian, again, I don't think their 01:31:00.760 |
answer would have been the same. And I do think that that comes 01:31:03.040 |
back to the fact that they have a preconceived notion of which 01:31:07.000 |
groups deserve protection, and which ones don't. And that is a 01:31:09.840 |
double standard. And I think anything we can do to get rid of 01:31:13.560 |
that poisonous ideology, that wants to treat people 01:31:17.640 |
differently, on campus, I think is a good thing. And so I 01:31:21.640 |
support what Bill Ackman is doing on that basis. But if Bill 01:31:25.200 |
Ackman goes too far and demands restrictions on the ability of 01:31:28.240 |
students to protest, then I think it would be a bad thing. 01:31:30.640 |
And that's going too far. So this would be a great thing to 01:31:32.960 |
ask him, like what his motivations are, that you know, 01:31:38.560 |
the thing that's crazy is that just the crazy hypocrisy, like 01:31:41.560 |
these are the same people who were firing people or not 01:31:45.000 |
letting them speak on campus, if they had, you know, a 01:31:48.760 |
microaggression, where they misgendered 01:31:52.280 |
somebody, or they used a different pronoun, or they had a 01:31:54.360 |
different feeling about what defines a woman versus a man or, 01:31:57.880 |
you know, gender differences, whatever their massive 01:32:00.520 |
intolerance and the crazy hypocrisy, which you're alluding 01:32:02.640 |
to there, Sacks, is the thing that I think has broken everybody's 01:32:05.120 |
brain, like, this is bizarre. And the DEI stuff is a road to 01:32:08.560 |
nowhere. I tweeted today about the absolute grift that was 01:32:12.800 |
going on in tech not long ago, which was call out a company, 01:32:17.000 |
venture firm, whatever it is, for their DEI stats, then 01:32:20.920 |
quietly contact them after you've done this brigading of 01:32:24.640 |
them and say, Hey, we can solve your problem, hire us as 01:32:27.760 |
consultants and speakers to come in and fix your DEI and tell you 01:32:31.480 |
what you've done wrong, and then publicly come out. And I saw 01:32:34.800 |
this happen, publicly come out, and then tell the same group, 01:32:38.400 |
hey, this person is now an ally. Rinse and repeat. It was a crazy 01:32:41.960 |
grift. And it's all coming out now. And there's a 01:32:43.520 |
Yeah, that that Bill Ackman thing, J Cal, that you 01:32:47.920 |
Yeah. So we could just go down this rabbit hole forever. But I 01:32:51.840 |
think you call it a cul de sac. I call it a road to nowhere or 01:32:55.200 |
dead end. Identity politics and DEI, it's just a dead end when you 01:32:58.520 |
start judging people based on, you know, any criteria other 01:33:02.120 |
than their character and performance in the world. Do we 01:33:04.760 |
know if there was a response by MIT? Is this one MIT? Yeah, I 01:33:11.040 |
don't, I don't think yet there has been. So anyway, we'll 01:33:15.120 |
And what Bill Ackman is doing is brave, because he is taking on 01:33:19.480 |
DEI. And that is, historically, that's been one of the most 01:33:23.000 |
dangerous things you can do. I mean, that is what people get 01:33:25.320 |
canceled for. Now I know there are people who I am 01:33:30.840 |
fans of, like Glenn Greenwald, who's been very 01:33:33.120 |
critical of Ackman, because he thinks that Ackman is trying to 01:33:35.680 |
restrict free speech, and prevent, again, the pro 01:33:39.400 |
Palestinian cause from protesting or saying its piece. 01:33:43.840 |
And I guess Bill can clarify that. But I think this issue is 01:33:48.200 |
less about foreign policy and more about domestic policy, 01:33:52.000 |
these DEI policies. And finally, we have someone who's willing to 01:33:55.320 |
take it on and challenge it challenge it at an ideological 01:33:59.360 |
level, and then challenge it at like a just grift level. 01:34:02.400 |
Yeah, shout out to Brian Armstrong. He got this right. 01:34:05.400 |
And he went right up the hill and took the arrows for it. And 01:34:08.120 |
I think we've turned a corner. And if you the tweet I did 01:34:11.720 |
today, I would not have done two years ago, because I just didn't 01:34:14.240 |
want to, you know, risk my firm or the companies I work with, 01:34:18.160 |
you know, to kind of expose that grift, because it could blow 01:34:22.360 |
back on people. So but now I feel totally comfortable doing 01:34:27.320 |
Well, that also puts me in. That's for sure. He's got a few 01:34:34.280 |
Ackman's got a lot of firepower. You know, that bond call was 01:34:37.520 |
totally right. Now the 10-year's sub 4%. I will say. Yeah, I 01:34:42.320 |
will say that does. Your bank account does give you the 01:34:45.120 |
ability to Okay, final question, please. We must we must wrap, we 01:34:48.480 |
must have a final question. And please throw it to Friedberg. 01:34:51.440 |
First, I got Friedberg involved early on offense today. It's 01:34:55.720 |
from mom cooks fast and slow on X. For Friedberg, I guess: what 01:35:00.280 |
is the correct way to hire kids out of school now that an elite 01:35:02.840 |
university degree tells you very little about the applicant? And 01:35:05.840 |
will you follow this path in your own companies? 01:35:12.000 |
Please, let me. Go to the schools with co-op. Why? Because it 01:35:16.560 |
allows you to evaluate these kids in situ on a real time 01:35:20.840 |
basis without an obligation to hire. You can find the ones that 01:35:28.200 |
I went to a co op school, University of Waterloo in 01:35:31.520 |
Canada, the way that it works there, not not everywhere, but 01:35:34.760 |
there at least is you go to school for the first eight 01:35:38.600 |
months. And then you never get a break. You're either in working 01:35:42.560 |
for four months, or you're in school for four months. And you 01:35:45.360 |
go back and forth until you graduate. So instead of 01:35:48.080 |
graduating in four years, you graduate in five. But you 01:35:51.920 |
graduate with with basically two full years of work experience. 01:35:55.280 |
And depending on the employers that you work at, you typically 01:36:00.200 |
get two to three job offers from those folks if you do a good 01:36:03.400 |
job. When I was helping to build Facebook, we went there. We had 01:36:09.440 |
never hired an intern before and we started to hire people. And I 01:36:12.320 |
think now it happened at Microsoft, it happened at 01:36:15.760 |
Google, I think it's happened at Facebook. If you look at the 01:36:17.640 |
number of kids that work at those companies now from co-op 01:36:22.200 |
schools, it's higher than from any other school. There's a bunch of 01:36:25.440 |
schools in the United States that have co op. But I would go 01:36:31.960 |
Freeberg, any thoughts? You're now running a company, you're 01:36:35.360 |
back in the saddle? How are you going to hire people? And tell 01:36:39.600 |
everybody the name of the company again. It's called 01:36:41.360 |
Ohalo. Ohalo, not Mahalo. Ohalo. And Mahalo was 01:36:47.840 |
taken? Who owns that domain name now? I still have it. 01:36:50.280 |
Yeah. If somebody wants to have it, I have Mahalo and I have 01:36:53.840 |
Kokua. Happy to sell it to somebody if they want to. But 01:36:57.400 |
tell us here, what's your hiring criteria? And how do you 01:36:59.360 |
think about this now? I'd like to start an enterprise software 01:37:01.760 |
business called Kokua. It means to help or to guide in Hawaiian. 01:37:06.440 |
So probably the third or fourth most important word. Yeah. Go 01:37:09.960 |
ahead, Freeberg, please. I have criteria around raw horsepower, 01:37:14.000 |
skills or experience, and then motivation. And I have systems 01:37:18.440 |
for how we try and assess those and then matching our 01:37:20.600 |
principles. It's kind of the fourth bucket of things. 01:37:23.680 |
Horsepower you can test; skills are based on experience and fit 01:37:28.360 |
with the role and the need. But motivation is one that there's a 01:37:31.320 |
lot of question marks around. Has this person 01:37:36.880 |
demonstrated not just a desire, but an action 01:37:41.640 |
that they've taken that has pushed them beyond the limits of 01:37:44.360 |
the systems that they've operated in? And that's what I 01:37:47.280 |
would typically look for, regardless of the schooling 01:37:49.880 |
background, the education background is some demonstration 01:37:53.200 |
of that, because that's necessary in business 01:37:55.400 |
building. So that's my framework for 01:37:57.960 |
hiring. We call it, you know, smarts or horsepower skills, 01:38:01.560 |
motivation and principles. And we score each one of those and 01:38:05.960 |
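The four-bucket scoring Friedberg describes (horsepower, skills, motivation, principles, each scored) could look something like the sketch below. The weights and the 1-to-5 scale are invented for illustration, not his actual system:

```python
# Assumed weights and a 1-5 rating scale: purely illustrative.
WEIGHTS = {"horsepower": 0.3, "skills": 0.25, "motivation": 0.25, "principles": 0.2}

def candidate_score(ratings):
    """Weighted average of 1-5 ratings across the four buckets."""
    return sum(WEIGHTS[bucket] * ratings[bucket] for bucket in WEIGHTS)

example = {"horsepower": 5, "skills": 3, "motivation": 4, "principles": 4}
print(round(candidate_score(example), 2))
```

A rubric like this makes the hire/no-hire comparison explicit, though the motivation bucket, as he notes, is the hardest one to rate reliably.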
You started to do that with the CEO candidates. I got that email 01:38:09.200 |
from you. And I was very intrigued. So I like that. All 01:38:11.840 |
right, everybody, this has been another amazing episode of the 01:38:15.080 |
All-In Podcast. For the king of beef, the dictator, and the 01:38:21.560 |
Rain Man himself, I'm your boy, J Cal. We'll see you next 01:38:31.120 |
Sack. And it said we open source it to the fans and they've 01:38:37.120 |
just gone crazy with it. Love you besties. I'm the Queen of Quinoa. 01:38:48.640 |
That is my dog taking a notice in your driveway. 01:38:52.160 |
We should all just get a room and just have one big huge orgy 01:39:00.520 |
because they're all like this, like sexual tension, but they