E170: Tech's Vibe Shift, TikTok ban debate, Vertical AI boom, Florida bans lab-grown meat & more
Chapters
0:00 Bestie Intros!
1:02 Friedberg's newest family members
7:13 Tech's vibe shift: More candidness, less PR-speak from top CEOs
22:47 OpenAI CTO slips up on training data: did OpenAI train Sora on YouTube videos?
28:58 Vertical AI startups flourishing: Cognition launches Devin, what will this do to startups?
40:38 TikTok debate: Is the new bill to ban or force a sale of TikTok fair or potentially overreaching due to its vagueness?
82:10 Florida on the verge of banning lab-grown meat
00:00:00.000 |
Why do you always go Christopher Walken whenever you do these 00:00:04.000 |
Thanks for listening to the All-In podcast. Wow. David Sacks' 00:00:09.520 |
poignant points about regulatory capture. Freeberg loves mock 00:00:16.600 |
meats. Not for me. Everyone loves a great day. See you next 00:00:26.680 |
time. Love you boys. So you're better. Everyone's 00:00:38.280 |
got their superpower. That's your nasty, nasty Jake. I'm 00:00:43.260 |
coming on. I'm coming on. Rain Man David Sacks. Hey, Freeberg, 00:01:03.620 |
you want to tell them about your new family members? Oh my god. 00:01:07.100 |
Did he get more dogs? Is he trying to do a Portnoy thing 00:01:09.940 |
again? Oh, Miss Peachy. So I'm like working all day Friday. I'm 00:01:15.020 |
like wiped out. I've been in I can't remember. I was in Santa 00:01:18.460 |
Cruz. I made it all the way back up through the traffic. I get up 00:01:22.220 |
to the house. I've been texting and calling all afternoon. No 00:01:25.860 |
response. I'm like, what the fuck is going on? Normally, 00:01:28.740 |
she'll text me like just walk through the door, open the door 00:01:31.340 |
to my car, every little thing. So for her to not be calling me 00:01:33.900 |
back, something's up. I walk in the house. The kids are 00:01:36.700 |
there. They're jumping up and down. Daddy, Daddy, we got two 00:01:39.300 |
new dogs. And I'm like, what the fuck are you talking about? We 00:01:41.940 |
didn't get two new dogs. They say, Mommy's got two new 00:01:44.740 |
dogs. And then Mommy took them to the vet. She'll be back in a 00:01:48.300 |
few minutes. I'm like, no way. And I had some friends coming 00:01:51.180 |
over for dinner. They walk in the house. The kids are jumping 00:01:53.820 |
up and down. Two minutes later, she walks in the house with these two 00:01:56.740 |
dogs that someone found in a parking lot in San Jose, and 00:02:00.860 |
they couldn't find a home for these dogs. And they were like, 00:02:03.060 |
the dogs don't have a home. She decided, I'll take them into my 00:02:06.420 |
home. And I'm like, oh, this is why you didn't call me. You 00:02:09.220 |
didn't text me. I walked out the room. I'm like, it's over. It's 00:02:13.180 |
been nice knowing you. The kids are screaming, Daddy, you can't 00:02:15.860 |
get rid of the dogs. They're our dogs. Now these are the best 00:02:17.980 |
dogs. So now we have four dogs. And then there was the vet bill. And 00:02:21.580 |
then I come down that night, huge mess in the dining room, like 00:02:24.460 |
multiple diarrhea plops all over the floor. I walked 00:02:28.620 |
downstairs, the whole house was smelling of poop. My house has 00:02:32.060 |
become like, you know, those carnival trains that used to go 00:02:34.420 |
from city to city back in the 19th century. If one of those 00:02:37.260 |
trains like fell over and spilled open, that's basically 00:02:39.980 |
what my house has become. It just smells like poo and hay. And 00:02:44.740 |
there's clowns and children running around. And I live in 00:02:48.260 |
it. Did I ever tell you the story of Chuck Norris the Chihuahua? No, 00:02:53.220 |
no, no. So I'm going to a wedding. I'm in like Arizona. 00:02:57.300 |
I'm driving like on one of these giant, like Arizona streets and 00:03:01.740 |
a chihuahua runs across this like eight lane boulevard, 00:03:06.300 |
whatever. And I'm like, Oh my god, I'm like dodging around it 00:03:10.140 |
and it goes into the other side of traffic and I see a car. And 00:03:13.660 |
the dog just ducks, he misses the car. I'm like, Oh, thank 00:03:16.380 |
God. My wife is screaming her head off. The dog gets whacked 00:03:21.220 |
by another car and it goes rolling down the highway. I make 00:03:24.220 |
a U-turn, I block it. The dog's knocked out on the side of the 00:03:27.500 |
road. I run up to the dog, I pick it up. I'm like, this dog's 00:03:30.980 |
gonna die. I get in the car. I said, Let's just take it 00:03:36.700 |
to a vet or whatever. And I'm saying goodbye to the dog. The 00:03:40.100 |
dog's like looking up at me. It's in bad shape. We go to a 00:03:44.220 |
vet. I gave it to the vet. We go to the wedding. My wife, who 00:03:48.500 |
has this big heart, decides she's going to stay with the 00:03:53.140 |
dog. So I go to the wedding. I go to the you know, opening 00:03:55.620 |
night party. She's at the 24-hour vet, sitting with 00:03:59.260 |
this dog waiting for it to die in the hospital. Oh my god, the dog 00:04:03.540 |
doesn't die. It survives. Now it's Monday. And I get the alert on 00:04:10.380 |
my credit card. $12,000. Yeah, yeah. Well, yeah, $12,000 on this 00:04:18.740 |
dead dog. I'm like, the dog survives. I bring the dog back 00:04:25.060 |
to the Bay Area. The dog's fine, I can't believe it. But you know, 00:04:28.980 |
it's got like a broken leg, all this stuff. But you know, 00:04:31.100 |
generally the dog's alive. So I put him on social media. 00:04:34.860 |
I'm like, anybody want Chuck Norris, the dog that cannot die? 00:04:37.820 |
This is like the toughest chihuahua you've ever seen. Some of 00:04:41.540 |
these rich people in San Francisco living on like a 00:04:44.700 |
certain street, Broadway or something like that. Turns out 00:04:48.860 |
they're like heirs to something famous. They live in 00:04:55.620 |
like three states, they got a private jet, all this nonsense. 00:04:58.740 |
They say we'll take the dog. They're friends with a couple of 00:05:00.980 |
our mutuals. So I'm like, this is great. I'm going to get the 00:05:05.140 |
12 grand from them. Because they got a private jet, they got 00:05:15.180 |
three houses. So they come down to pick up the dog with their 00:05:18.500 |
driver and everything. And we're delivering the dog. And I tell 00:05:21.980 |
Jade, hey, can we let them know about the $12,000 bill? Maybe 00:05:26.420 |
they'll pick it up. Like a Larry David. And my wife is like, if 00:05:31.940 |
you bring that up with them, I'm divorcing you. We have to pay 00:05:34.420 |
the 12,000. So I pay the 12,000. They proceed to then send us 00:05:38.580 |
pictures of Chuck Norris on private jets. You know, for 00:05:42.380 |
years to this day, Chuck Norris living this life, and I paid the 12 grand. 00:05:48.380 |
Freeberg told me the story. I found it so hilarious. But I 00:05:52.340 |
told him, you know, after we got Joker, Jason, when you came 00:05:56.500 |
to visit us in Porto de Madre, Joker had, like, Giardia. So he was 00:06:00.820 |
just going everywhere. And you know, we got through it. And then three 00:06:06.460 |
months ago, when you guys were at my house for poker, Sean made 00:06:11.500 |
octopus and left some raw octopus in a garbage bag 00:06:14.060 |
outside, and Joker ate it. And he got such terrible food 00:06:18.660 |
poisoning. We had to take him to the emergency animal 00:06:22.060 |
hospital where he stayed for like a week. That was by the 00:06:24.340 |
way, 17k. But ever since he came back, he's been completely 00:06:30.100 |
incontinent. So I was done. After that, Nat and I now just 00:06:33.780 |
wake up half an hour early. And what we do now from 6:30 to 00:06:37.340 |
usually 6:45 is we're cleaning up some form of feces that he's 00:06:41.180 |
left somewhere. Go and find it, where did he take it? Let's go 00:06:46.100 |
clean it up. And the worst one was he once pooped through one 00:06:50.060 |
of the grates with this really bad poo and I had to go and just 00:06:54.580 |
fetch it all out. It was just right in the grates. Fantastic. 00:06:57.580 |
Right? Yeah. So there you have it, folks. Go adopt a pet. 00:07:00.420 |
There's your endorsement for the $17,000 in veterinary bills and 00:07:07.180 |
cleaning up all day long and replacing carpets. Listen, we 00:07:11.660 |
got a big docket today. I know it's a bit early in the year, 00:07:15.580 |
but I am going to add a new category. We're proposing a new 00:07:18.700 |
category. We'd love to hear your feedback on it for the 2024 00:07:22.140 |
Besties: Most Based CEO. Lots of options to choose from right 00:07:27.060 |
now, which we'll get into in a minute. But there seems to be a 00:07:29.660 |
bit of a vibe shift happening in tech. During peak ZIRP and cancel 00:07:33.220 |
culture, the 2019-to-2021 era, it seemed like CEOs were a little 00:07:39.860 |
vigilant about what they would say, you know, the Tim Cooks, 00:07:42.220 |
the Sundars. But something has clearly changed. Tech CEOs have 00:07:45.060 |
gotten radically candid and fired their comms group. Two 00:07:48.940 |
great examples this week that we were talking about. Jensen 00:07:52.700 |
Huang, the CEO of Nvidia, had this awesome clip when speaking at Stanford. 00:07:58.340 |
I think one of my great advantages is that I have very 00:08:01.580 |
low expectations. And I mean that. Most of the Stanford 00:08:04.740 |
graduates have very high expectations. People with very 00:08:08.100 |
high expectations have very low resilience. And unfortunately, 00:08:12.340 |
resilience matters in success. I don't know how to teach it to 00:08:15.580 |
you, except for I hope suffering happens to you. To this day, I 00:08:18.220 |
use the phrase "pain and suffering" inside our company 00:08:21.140 |
with great glee, boy, this is going to cause a lot of pain and 00:08:23.740 |
suffering. And I mean that in a happy way. Because you want to 00:08:26.220 |
train, you want to refine, the character of your company. You 00:08:29.140 |
want greatness out of them. And greatness is not intelligence. 00:08:31.900 |
Greatness comes from character, and character isn't formed out of 00:08:34.940 |
smart people; it's formed out of people who suffered. 00:08:37.380 |
And then next up, Palantir CEO Alex Karp called out the coked-up short sellers. 00:08:45.940 |
I love burning the short sellers like almost nothing makes a 00:08:49.460 |
human happier than taking the lines of cocaine away from 00:08:54.460 |
these short sellers who like are going short on a truly great 00:08:58.540 |
American company, not just ours; they just love pulling down 00:09:01.540 |
great American companies so that they can pay for their coke. And 00:09:04.500 |
the best thing that could happen to them is we will provide, we 00:09:08.100 |
will lead, their coke dealers to their homes after they can't pay. 00:09:15.540 |
Yeah, well, you know, go ahead and do your thing. We'll do our thing. 00:09:26.540 |
Good for you. At some New York Times conference, always candid. 00:09:32.620 |
And even Zuckerberg, he's been getting a little based. He did a 00:09:36.220 |
whole video about how the Apple Vision Pro fared when compared to 00:09:40.620 |
Meta's Quest 3. He's getting a little frisky on the socials. 00:09:48.740 |
It seems like a lot of people are less worried about cancel 00:09:51.460 |
culture than they were three years ago. So I don't know if it's 00:09:55.820 |
just in Silicon Valley; broadly in media, and broadly, 00:10:00.260 |
culturally, there seems to be a move away from cancel culture 00:10:03.620 |
mentality and people are speaking their mind. Which is 00:10:08.660 |
yeah, I think obviously positive and refreshing. 00:10:10.860 |
Sacks, you're a big fan of freedom of speech. Your 00:10:15.900 |
thoughts on this vibe shift? Is this related to cancel 00:10:20.020 |
culture kind of ending and journalists just not being able 00:10:22.980 |
to cancel people because they misspoke or were a little spicy? 00:10:27.980 |
Well, I like the fact that the CEOs are all being colorful in 00:10:32.780 |
their remarks and candid and interesting. And that that's 00:10:35.620 |
always a good thing. In each of these cases, I kind of like what 00:10:39.660 |
they had to say. But I think that you might, or we 00:10:44.060 |
collectively might be reading a little bit too much into this. I 00:10:47.780 |
mean, at the end of the day, what sacred cows are they 00:10:50.620 |
really challenging? What real dangerous truths are they 00:10:54.940 |
speaking? What real risks are they taking? I just don't put 00:10:58.420 |
any of the things that they're saying or doing in the same 00:11:01.060 |
category, as say, what Elon has been doing in terms of taking on 00:11:05.020 |
the powers that be, in terms of rolling back censorship, and 00:11:09.380 |
promoting free speech on X. I mean, Elon, I think, has taken 00:11:13.900 |
some real risks in doing that. And you see that he's paying the 00:11:17.700 |
price with all these government investigations and the voiding 00:11:21.380 |
of his compensation package. That I think is just in a 00:11:25.700 |
slightly different category of true risk taking by speaking 00:11:29.980 |
truth to power, or allowing the masses to speak truth to power, 00:11:34.740 |
compared to what these other guys are doing. And I'm not 00:11:36.940 |
disparaging any of them. But, you know, look at them one by 00:11:39.780 |
one. I mean, Alex Karp made a colorful joke at the expense of 00:11:42.700 |
short traders. I agree. But it's not really a risky remark. 00:11:47.660 |
Jensen Huang is giving some tough love to Stanford students. He's 00:11:52.900 |
giving them, I think, a good lesson of: stop being so entitled, 00:11:56.860 |
go get some real life experience. Be resilient. Okay, 00:12:01.220 |
great message. I saw the Zuckerberg clip, liked it as 00:12:04.380 |
well. He's basically speaking from a place of passion about 00:12:07.940 |
his own product, and comparing it to Apple. Okay, great. That's 00:12:11.380 |
what he's supposed to do. I don't see any of those CEOs, 00:12:14.620 |
again, if you want to compare it to Elon, taking a really 00:12:18.860 |
dangerous political stance. In fact, remember when Zuckerberg 00:12:22.060 |
got dragged to Capitol Hill and gave that testimony, and they 00:12:25.700 |
demanded that he give that apology. He did it. I mean, 00:12:29.300 |
he genuflected. I thought that word was banned on this 00:12:32.380 |
program. Anybody else would have genuflected. I mean, I thought he was just 00:12:39.740 |
showing some humanity, which is, you know, kind of paradoxical. 00:12:42.820 |
Look, I get it in that moment. But I'm just saying that, like, 00:12:46.020 |
if you want to put it in the same category as Elon, the thing 00:12:48.380 |
to do would have been to punch back. And I think in that moment, 00:12:52.100 |
it was, who was it, Josh Hawley, the thing to do would have been 00:12:54.420 |
to say no, it's you who are exploiting the misery and 00:12:57.780 |
suffering of all of these people by trying to score political 00:13:00.660 |
points. So anyway, that would have been super based. Yeah, yeah, that 00:13:04.660 |
would have been super based. So all I'm saying is that it's one 00:13:07.460 |
thing for these founders and CEOs to be colorful and candid 00:13:13.100 |
or whatever, that's all great. But have they really taken 00:13:21.500 |
Well, I think Jason, the takeaway, I think you're almost 00:13:26.380 |
on the right point, but not quite. I don't think this is 00:13:28.820 |
based or not based. I think the point is, everybody is exhausted 00:13:33.060 |
with the multiple layers of word scramble gymnastics that people 00:13:39.140 |
have had to play. And they're like, enough's enough. I think 00:13:43.060 |
the more interesting way to look at this is that there was a few 00:13:46.980 |
year period where a lot of companies were under a lot of 00:13:50.580 |
pressure. And folks felt that they couldn't say what was on 00:13:55.460 |
their mind to really fix the problems that they saw. Now you 00:13:59.540 |
have, you know, both of the companies and the CEOs you 00:14:02.180 |
pointed out are firing on all cylinders, as far as anybody can 00:14:05.220 |
tell from the outside. And there's a certain level of 00:14:08.140 |
political capital that comes with that. And they're choosing 00:14:10.980 |
to spend it. And I think that's the interesting takeaway, which 00:14:13.660 |
is, as these companies become successful, again, as tech 00:14:17.980 |
reemerges, again, from this multi year malaise, are folks 00:14:22.700 |
going to find their voice or not? And I think that's the big 00:14:28.220 |
point, which is that it seems at least that we can see this next 00:14:31.580 |
generation of winning companies seems to have CEOs that will 00:14:35.940 |
take a different path; they'll be maybe closer to Elon than not. 00:14:41.500 |
Yeah, I loved Huang's comments just about his secret 00:14:46.860 |
weapon: he has incredibly low expectations. If you had 00:14:50.220 |
said that comment a few years ago, the whole pain and 00:14:53.580 |
suffering thing, somebody would have said, I felt 00:14:56.380 |
triggered, it touched my childhood trauma. You know, I 00:14:59.660 |
had x, y, and z happened to me. And it's not that those people 00:15:03.180 |
don't have valid claims, but they would have aired it in a 00:15:06.300 |
way that tried to get him canceled effectively. 00:15:09.620 |
Yeah, he would be out of touch. He would be talking down to 00:15:12.820 |
people, a billionaire telling people, like, to suck it up. 00:15:16.660 |
In fairness, I bet you there are people that felt that way 00:15:20.140 |
even today when listening to his clip. The thing that is 00:15:22.460 |
different, though, was the rest of us who also have had pain and 00:15:25.740 |
suffering in our lives, retweeted it and were just like, 00:15:29.820 |
Yeah, I mean, finally, candid advice. Elon used to always say, 00:15:36.580 |
this was his happiness formula, that happiness 00:15:40.860 |
equals reality minus expectations. And so if your 00:15:48.420 |
expectations are really high, and reality doesn't hit it, 00:15:50.500 |
you're going to be sad. And if you keep your expectations low 00:15:52.620 |
and reality is, you know, okay, or good, yeah, you could be 00:15:55.660 |
happy in life. But Freeberg, I just loved those quotes, maybe 00:16:01.500 |
some thoughts about suffering and how hard it is. You're back 00:16:05.260 |
in the CEO seat. How's that been going for you? Generally 00:16:09.660 |
speaking, how hard is it compared to being a capital 00:16:12.220 |
allocator? Curious. I was going to talk to you about that 00:16:14.620 |
offline. Anyway, I probably have an unhealthy affinity for 00:16:20.460 |
suffering. I think that if you come from certain backgrounds, 00:16:29.620 |
you're sort of trained that that's the place that your 00:16:33.780 |
unconscious tends to want to be. And I think that that also some 00:16:39.740 |
people call it chips on shoulders. Some people call it 00:16:43.020 |
motivation. I mean, look at your friend, Elon, how much suffering 00:16:47.540 |
he puts himself through. I think it's a requisite to 00:16:52.660 |
greatness: you have to really find ways to sacrifice. Now, 00:16:56.180 |
I've said this a lot, there's a reason a lot of people that have 00:16:59.740 |
had success in their career, don't end up being great 00:17:02.740 |
entrepreneurs. Because as soon as you're faced with failure for 00:17:06.300 |
the first time, it doesn't pattern match to what's happened 00:17:09.420 |
to you. Historically, I go to a good school, I get good grades, 00:17:12.980 |
I go to Stanford, I get a degree, everything about every 00:17:16.020 |
step you do, you're told if you do x, you will get y. And then 00:17:19.540 |
you do x and you get y and you repeat. And at some point, 00:17:23.220 |
you're considered successful in your education in your career, 00:17:26.340 |
and so on. If you then decide that entrepreneurship is the 00:17:29.780 |
path for you, you realize that entrepreneurship is that there 00:17:33.180 |
is no if x, then y, there is if x, maybe y, maybe z, maybe 100 00:17:39.580 |
other things, it'll smack you in the face. And that experience is 00:17:42.820 |
shockingly different for people that have historically followed 00:17:45.820 |
a path of success of what's defined as success culturally, 00:17:48.740 |
socially. And I think that that's really what he's speaking 00:17:51.460 |
to. If you've grown up where all of your expectations have not 00:17:54.820 |
been met, or many of the expectations have not been met, 00:17:56.980 |
you realize that persistence, grit, perseverance, 00:18:00.020 |
relentlessness, these are the necessary traits to be 00:18:03.420 |
successful in entrepreneurship. And I think that I find myself 00:18:06.980 |
much happier in that condition than in any other condition. And 00:18:10.020 |
it's why I'm actually very happy in the work I'm doing right now. 00:18:13.260 |
Yeah, I think that resonates a lot with me and all the 00:18:18.020 |
entrepreneurs that I back. They all have that chip. And you got 00:18:22.420 |
to be careful not to get caught up in the trappings and really 00:18:24.780 |
focus on solving problems. If you think about what a CEO does 00:18:27.820 |
all day, you hire the smartest team, you know, you give them 00:18:31.500 |
the biggest challenges, as much autonomy as you can. And then 00:18:35.140 |
they return back to you with all the problems, the smartest 00:18:37.980 |
people you could find to join your team can't fix. So then 00:18:40.820 |
your life becomes, essentially, you know, what's left over, that 00:18:45.420 |
is the most brutal to solve. And, you know, there's only a 00:18:49.380 |
certain percentage of people who can do that day in and day out, 00:18:53.180 |
just relentlessly. How do you reconcile 00:18:56.420 |
that statement, Jason, with the proposition that people should 00:19:01.620 |
become entrepreneurs, generally? I think that this 00:19:05.860 |
really challenges me whenever people say I'm thinking about 00:19:08.420 |
starting a company, my first response is no, don't, like, you 00:19:12.180 |
have to be told over and over again to not start a company, to 00:19:16.220 |
test whether you actually have the resilience and grit necessary 00:19:18.780 |
just to take the first step of starting the company. This idea 00:19:21.380 |
that everyone should be encouraged to start a company, 00:19:23.980 |
and that entrepreneurism is a career choice, I think is a 00:19:27.100 |
false notion. I think that most people are not psychologically 00:19:30.740 |
equipped for being successful in entrepreneurship. 00:19:32.940 |
Yeah, you're 100% right. And I've gotten, in my later years, 00:19:36.580 |
as we run, like, Founder University, these programs, when 00:19:39.060 |
people are applying, and we only accept 10% into the programs, 00:19:42.060 |
and then only invest in 10% of those, so net-net, like less 00:19:45.420 |
than 1% get funded. You know, I'll ask them, they're like, Oh, 00:19:49.140 |
can you give us money, so I can get a co founder. And I'm like, 00:19:53.420 |
you know what, you failed the first test of being an 00:19:55.220 |
entrepreneur. You know, the first test of being an 00:19:58.060 |
entrepreneur is can you convince two or three people to go on 00:20:00.380 |
this crazy journey with you, because it's important, and 00:20:03.420 |
without money. And, you know, people expect, oh, you're going 00:20:06.460 |
to just give them the money, because they have an interesting 00:20:08.460 |
idea. And then I ask them, what's your skill? What do you 00:20:10.540 |
do at the startup? Do you sell the product? Do you build the 00:20:12.900 |
product. And a lot of people do not have the wherewithal to add 00:20:18.140 |
a skill that the world needs, being a developer, a UX 00:20:22.100 |
designer, a salesperson, whatever it is, right, they 00:20:24.380 |
don't have any skill. And then they also have no ability to 00:20:28.540 |
convince somebody else with skills to start a company. If 00:20:32.380 |
you can't have marketable, important skills 00:20:35.660 |
yourself, that you taught yourself on your own, through 00:20:40.020 |
sheer force of motivation and will and opening up YouTube, 00:20:43.340 |
which really is not hard, people. Like, anybody can learn to code, 00:20:47.300 |
to be a UX designer, to be a salesperson. All this is on some 00:20:51.340 |
online course, some book, some YouTube video, some podcast, or 00:20:54.620 |
just learn some marketable skills that a startup needs. And 00:20:57.860 |
if you can't do that, you shouldn't be a founder. 00:20:59.980 |
I think the point is that everybody's capable of being a 00:21:03.620 |
founder. And anybody can and should start a company, it's 00:21:06.500 |
just that very few can finish a company. And that is the 00:21:10.300 |
resilience part where there's just so many ups and downs. And 00:21:15.180 |
you have to be able to roll and survive. And you have to just 00:21:18.700 |
problem solve constantly. And yeah, there are very few people 00:21:22.540 |
You have to be comfortable with failure, you have to be 00:21:25.380 |
comfortable with expectations not being met. That's the 00:21:28.180 |
important point that he's making. I can't tell you how 00:21:30.820 |
important it was early in my career, I had several cold 00:21:33.780 |
calling jobs where I cold-called college kids, I cold-called 00:21:36.100 |
alumni, I cold-called CEOs, I had three different cold 00:21:39.540 |
calling jobs, and getting rejection after rejection, 00:21:42.940 |
failing in my life, I didn't get into any schools except for 00:21:45.700 |
Berkeley, because Berkeley didn't take teacher 00:21:48.020 |
recommendations. To be clear, I did not do well in school. And 00:21:51.700 |
then playing poker taught me a lot. Because playing poker, you 00:21:54.900 |
lose hands, you lose hands, you lose hands, you lose hands, you 00:21:57.780 |
have to just make sure that you're making the right 00:21:59.380 |
decisions. And over time, the money will come to you, the 00:22:01.220 |
positive EV will be there. But failure and persisting through 00:22:04.380 |
failure, I think was one of the most important traits I had to 00:22:07.700 |
develop before I was even ready to start a company. Because all 00:22:12.700 |
When I had my first magazine, I didn't understand distribution. I had just 00:22:15.500 |
printed photocopies up and I had like 2000 copies of this 00:22:18.260 |
magazine, Silicon Valley reporter. And I didn't know how 00:22:21.220 |
to get them in people's hands. So I just got a luggage cart. 00:22:24.540 |
Like a literal luggage cart, I put them in the luggage cart and 00:22:28.220 |
I walked around lower Manhattan. And I just dropped them off at 00:22:32.380 |
internet companies. And I dropped them off at cafes. And 00:22:35.020 |
then I went to all the village voice boxes, and I just slotted 00:22:38.420 |
them in, in between the village voice. And miraculously, people 00:22:41.620 |
found it, and then they would subscribe. And you know, 00:22:44.300 |
whatever it took was the approach. All right, everybody, 00:22:47.580 |
let's get to our second topic, a mini topic. But did OpenAI 00:22:51.380 |
just get caught with their hand in the cookie jar, or the 00:22:53.740 |
training-data cookie jar? OpenAI's CTO was interviewed by the 00:22:59.100 |
Wall Street Journal this week. During the interview, you 00:23:01.300 |
probably saw this if you're on X, it trended. She was asked what 00:23:04.140 |
data did OpenAI use to train Sora. If you don't know what 00:23:06.940 |
Sora is, we talked about it here. It's that incredible 00:23:10.860 |
video-creation model: you know, type in a text prompt, get a 00:23:13.740 |
video back. Let's watch this clip and then discuss. 00:23:19.340 |
What data was used to train Sora? We used publicly available 00:23:24.860 |
data and licensed data. So videos on YouTube? I'm actually not 00:23:32.980 |
sure about that. Okay. Videos from Facebook, Instagram? You 00:23:39.980 |
know, if they were publicly available, publicly 00:23:44.620 |
available to use. There might be the data, but I'm not sure. 00:23:52.700 |
Let me start with you, Chamath. What are your thoughts here? And 00:23:56.020 |
just her not being super prepared there to answer a 00:24:00.460 |
question? Or is this a cookie jar situation? What do you 00:24:03.340 |
think? I mean, you got to think the CTO of an organization 00:24:07.740 |
whose job it is to build models based on training data knows. 00:24:13.380 |
My interpretation of that answer is that she is hesitating 00:24:18.100 |
because she doesn't want to make a statement against interest. 00:24:20.220 |
Right? I want to show you guys something. I had a friend of 00:24:23.420 |
mine, somebody we all know, who's deep in the heart of this 00:24:29.340 |
AI stuff. He was, he came to my office, and he showed me this 00:24:34.060 |
one really interesting thing, which is when you launch 00:24:37.220 |
ChatGPT. And if you go into the microphone but say nothing, 00:24:42.100 |
so just wait a few seconds, turn on the microphone, wait a few 00:24:44.380 |
seconds and turn it off. I'll just do it right now just to 00:24:46.620 |
show you. It comes back and it says thank you for watching, 00:24:49.380 |
which is typically what you see at the end of a bunch 00:24:54.340 |
of YouTube videos. Now, why is that interesting? Well, if you 00:24:59.700 |
say nothing, right, the model clearly is guessing 00:25:03.780 |
that whenever it hears silence, probabilistically 00:25:08.500 |
the next best thing to translate that into is thank you for 00:25:11.380 |
watching, which would mean that the training happened on a bunch 00:25:15.220 |
of content where thank you for watching was the next obvious 00:25:18.220 |
thing after silence. And the most obvious 00:25:22.020 |
place where that happens is on YouTube. I don't know, I thought 00:25:24.780 |
that was an interesting little thing that he pointed out to me. 00:25:26.860 |
I don't know if anybody's actually explored this. But if 00:25:30.540 |
it is true, and Google decides they have an issue with it, 00:25:34.980 |
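If you want to try the silence experiment Chamath describes, here is a minimal sketch. It assumes the open-source openai-whisper package (the speech-to-text model ChatGPT's voice input reportedly uses) and a locally generated clip of pure silence; the "Thank you for watching" completion is a widely reported artifact on silent audio, not guaranteed output.

    # Minimal sketch of the silence experiment, assuming the open-source
    # openai-whisper package (pip install openai-whisper).
    # silence.wav is assumed to be a few seconds of pure silence, e.g.:
    #   ffmpeg -f lavfi -i anullsrc=r=16000:cl=mono -t 5 silence.wav
    import whisper

    model = whisper.load_model("base")        # small general-purpose checkpoint
    result = model.transcribe("silence.wav")  # transcribe the silent clip
    print(result["text"])                     # often hallucinates a YouTube-style
                                              # sign-off like "Thank you for watching."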
Freeberg, your thoughts? I think it's fine. I don't know why this is a big deal. 00:25:40.900 |
Well, I mean, it's obvious she's kind of lying on camera. 00:25:44.180 |
She definitely knows where the training data came from. And it... 00:25:47.860 |
I mean, what? YouTube is public. So what's wrong with watching it? 00:25:55.300 |
Well, you'd have to get a license to make a derivative 00:25:58.020 |
work. As we've seen, OpenAI has been doing that. They're in a 00:26:00.820 |
lawsuit right now with the New York Times. And they failed to get a license. 00:26:05.060 |
So the assumption is they're making a derivative work. 00:26:08.740 |
Yeah, I mean, they've licensed other people's data, right? 00:26:11.460 |
Yeah, to get access to it that's not publicly available. You can 00:26:15.300 |
go to a YouTube downloader and download all this data. You know 00:26:17.860 |
what's crazy? I heard from someone at Google that under the 00:26:21.020 |
terms of service, Google's not allowed to train on YouTube 00:26:23.660 |
data. Remember, I made this point a couple weeks ago about 00:26:26.260 |
how important YouTube data is. I think there's this really 00:26:29.580 |
ironic handicap that Google's done to itself, where anyone 00:26:33.060 |
else can access and download and watch YouTube data to train 00:26:36.100 |
models, but Google cannot. I don't know if there's any 00:26:39.180 |
clarity on that. But it's a pretty crazy fact pattern. But 00:26:41.580 |
yeah, YouTube's on the internet. And I feel like anything that's 00:26:44.100 |
on the open internet should be watchable by these models. I 00:26:46.300 |
don't consider training models to be generation of derivative 00:26:50.740 |
And OpenAI is taking the other side 00:26:54.740 |
of that position because they're going around licensing data. 00:26:56.700 |
Sacks, any other thoughts here about this kerfuffle? 00:26:59.340 |
Well, look, I kind of agree with both you and Freeberg. In terms 00:27:03.660 |
of the part that I agree with Freeberg, I think that this 00:27:05.700 |
issue is kind of more of the same of what we've been talking 00:27:08.500 |
about for a while, which is clearly OpenAI trained its model 00:27:12.260 |
on publicly available data that was available on the on the 00:27:15.180 |
internet. I agree with Freeberg that fair use doctrine should be 00:27:19.380 |
applied to that. I know that you have a different point of view 00:27:21.740 |
on that, Jake. I don't know if we need to rehash that. I 00:27:24.460 |
understand that you don't think fair use should apply. In any 00:27:27.380 |
event, I think it's pretty clear that OpenAI trained Sora using, 00:27:33.460 |
you know, available data on the internet, and that probably 00:27:36.260 |
included YouTube. The part where I agree with you, J-Cal, is 00:27:40.420 |
that, I don't know if I would say that she's lying per se, but 00:27:44.900 |
I think she's probably concerned that if she comes right out and 00:27:50.540 |
answers the question as directly as she could, that it could be a 00:27:54.620 |
problem for them and all these lawsuits that they're now 00:27:56.500 |
facing. You know, again, I think that I would take the side in 00:28:00.860 |
those lawsuits that fair use should be allowed. But I think 00:28:03.860 |
that probably she is being careful here, because they are 00:28:07.660 |
facing so many lawsuits about this training data. 00:28:09.740 |
It's gonna be pretty clear, Chamath, I think what the courts 00:28:12.140 |
will decide here, which is whose opportunity is it to make a 00:28:16.340 |
Sora in the world? If Disney owns a massive collection of IP, 00:28:20.340 |
who should be able to use that IP to make derivative 00:28:23.700 |
works? Should it be Sora? Or should it be Disney itself? And 00:28:27.220 |
so I think the journalists know full well, what they're doing. 00:28:32.260 |
And now the journalists have their hooks into this for a very 00:28:35.060 |
real reason. The journalists are also content creators. So now 00:28:38.900 |
we're gonna have this, you know, two sides forming. I 00:28:42.460 |
worry that folks are not technical enough to make this 00:28:45.260 |
decision. And you know, when it goes into a court, how is a 00:28:49.140 |
judge really going to understand the nuances of this to 00:28:52.220 |
make a call on this, or the jury? It's going to be a big 00:28:55.500 |
education process. Yeah, yeah, or the jury. Yeah. Speaking of 00:28:58.980 |
AI, vertical AI startups are starting to make some noise. We 00:29:03.060 |
all know about large language models. We've talked about them 00:29:05.940 |
here. If you listen to this program, you know about OpenAI, 00:29:07.980 |
Google's Gemini, previously known as Bard, Anthropic's Claude, 00:29:12.260 |
all this stuff. They're general purpose; they've been trained on 00:29:15.420 |
the open internet, as we were just discussing, so they can 00:29:18.380 |
answer questions about almost anything. And yeah, sometimes 00:29:21.940 |
it's correct. Sometimes they're incorrect, but it's showing 00:29:24.700 |
promise. There is another school of thought here that's emerging 00:29:28.420 |
in startups, vertical AI. These companies are kind of taking a 00:29:32.300 |
job title, a role in society, and they are building vertical 00:29:37.660 |
apps. Harvey is AI for lawyers, Abridge is doing an AI 00:29:41.820 |
notetaker for doctors, saves them hours a day, according to 00:29:45.820 |
them. TaxGPT is an AI tax assistant, and Sierra's AI for 00:29:50.380 |
customer support. That's Bret Taylor's new startup. This week, 00:29:54.260 |
a startup called Cognition debuted a tool called Devin. 00:29:57.020 |
They're calling it an AI software engineer. The demos 00:29:59.460 |
went viral on X, you've probably seen them all over the place and 00:30:02.340 |
in the news. If you watch it, you can see Devin fixing bugs in 00:30:06.300 |
real time, fine-tuning an AI model, building apps. And 00:30:10.380 |
people are speculating Devin was built on GPT-4 from OpenAI. 00:30:14.660 |
That's not confirmed. But according to the CEO, Devon was 00:30:18.980 |
built by tweaking reasoning and long term planning into an 00:30:22.500 |
existing LLM. Here's how it ranks against other major models 00:30:27.220 |
on coding benchmarks. They're building all these benchmarks to 00:30:29.820 |
test each language model. And as you can see, according to 00:30:33.980 |
this chart and according to their data, it's doing much better 00:30:37.780 |
than just a generic language model, which kind of makes sense. 00:30:41.100 |
Chamath, did you see these demos this week? I think I saw 00:30:43.940 |
you on the group chat talking about it. And what was your take? 00:30:50.940 |
Oh, I think this is so powerful. I mean, it's incredible, because 00:30:54.980 |
we're measuring this progress in like, what, week over week? 00:30:59.860 |
I think the point that you should take away is that 00:31:03.620 |
most of these very difficult and impenetrable job types, for the 00:31:11.860 |
average person, if you said to them, hey, become a 00:31:14.260 |
developer, that's like a complicated journey, right? It's 00:31:19.300 |
just going to be now like a command line interface where you 00:31:24.820 |
just kind of describe in English what you want to do. And all of 00:31:28.140 |
this stuff will just happen behind the scenes, and it'll be 00:31:30.380 |
totally automated. So that'll grow the number of people that 00:31:34.980 |
can use these tools. At the same time, it'll make the developers, 00:31:39.420 |
I think even more valued, because you're going to need 00:31:42.460 |
people in the guts of these models, and in the code that it 00:31:46.220 |
generates, because it's not always going to work perfectly, 00:31:48.500 |
there's always going to be some kind of hallucination, some 00:31:50.780 |
stuff is not going to compile. Now, the demos that they did, 00:31:53.820 |
though, were incredible, they were able to find errors, they 00:31:55.940 |
were able to remediate errors in code. I mean, I just, I think 00:32:01.700 |
So you've been on copilots for the past year talking about 00:32:05.940 |
that. This is slightly different. We're moving from, 00:32:08.740 |
hey, here's a copilot, somebody helping a developer to, hey, 00:32:12.460 |
here's a developer working, and now they have a supervisor. So 00:32:16.580 |
what do you think of these sort of role based agents? And how 00:32:19.700 |
quickly we went from year one copiloting to, okay, now they're 00:32:24.460 |
the pilot, and we're sitting in the copilot seat watching them 00:32:27.900 |
Yeah, well, look, first of all, everyone's working on autonomous 00:32:31.740 |
coding, or working towards that this is like one of the core, 00:32:34.940 |
most obvious use cases of LLMs, because code is text. And it can 00:32:40.580 |
also be run through a compiler to debug it. So, 00:32:44.420 |
in theory, you can get to high levels of accuracy. 00:32:47.100 |
Although, in the example that you gave, Jason, this new 00:32:50.660 |
product was only at 13%. So there's still a long way to go. 00:32:53.900 |
But the potential is clearly there. So a lot of companies are 00:32:59.100 |
working on some variation of this idea. Devin is, I guess you'd 00:33:04.380 |
call it, an agent-first approach. And I think that's very cool for 00:33:08.900 |
generating new software projects. But where I think this 00:33:13.740 |
gets much trickier, and is much more difficult, is when you're 00:33:17.220 |
working in existing code bases. And just to talk my own book for 00:33:20.780 |
a second, we're an investor in a company called Sourcegraph, 00:33:23.740 |
they have a product called Cody, and their whole approach is 00:33:26.700 |
context first, as opposed to agent first, it's all about 00:33:29.020 |
getting the copilot to work inside of existing code bases. 00:33:33.100 |
So different companies are coming at this from different 00:33:35.460 |
approaches. GitHub Copilot, I think, is kind of more like Cody, 00:33:38.700 |
where it's all about making an existing code base more useful. 00:33:43.820 |
Whereas Devin, again, is starting with, I think, net new 00:33:46.340 |
code bases, but that's going to demo really well. And so that's 00:33:49.580 |
what you're seeing is like these really cool demos. In any event, 00:33:52.700 |
the larger picture here is that we are going to get better and 00:33:56.140 |
better at coding autonomously, I guess you could say. And I don't 00:34:01.620 |
know if it ever gets to 100%, where you don't need 00:34:03.540 |
coders anymore. But it's going to make coders much more 00:34:07.540 |
productive over time, you're gonna get this huge multiplier 00:34:10.460 |
effect on the ability to write code. And that's really 00:34:15.740 |
For a while, we've been tracking this evolution from, you know, 00:34:19.940 |
Gmail, guess the next word, guess the next phrase, guess the 00:34:23.420 |
next sentence to copilots. Now we have these role based agent 00:34:27.740 |
based solutions that startups are pursuing. What's next? If we 00:34:35.260 |
follow this thread, what would the next evolution be here? Well, 00:34:38.540 |
the big push has been for this notion of AGI to replace a 00:34:42.300 |
human. And I think what we're seeing is software that 00:34:49.820 |
replaces a specific human doing a specific thing, like being a 00:34:56.060 |
lawyer, being an accountant, being an art director, if you 00:35:00.060 |
think about the internet, which was like 00:35:03.300 |
networking software, and the capabilities that arose from the 00:35:06.220 |
connection of all these computers, during the internet 00:35:08.460 |
era, the innovation was everyone trying to create a business model 00:35:12.340 |
which was how do you take an existing vertical business and 00:35:14.900 |
put it on the internet? I think what we're seeing in this era is 00:35:18.340 |
everyone's taking a vertical human and creating a vertical 00:35:21.580 |
version of a human in the AI era. And so I think like the 00:35:27.700 |
success will probably accrue to one company that replaces one 00:35:33.820 |
set of core human services, like being a lawyer, being an 00:35:36.700 |
accountant, you know, being an artist in whatever way. And that 00:35:42.100 |
that ends up being the specific vertical tool that people will 00:35:46.900 |
use to automate and scale up their ability to do that task in 00:35:51.420 |
an automated way. Because I think that there's like a great 00:35:54.020 |
deal of capability that emerges in the fine tuning and the 00:35:58.380 |
unique data that certain people may have to make that one tool 00:36:01.820 |
better than the rest. And therefore, everyone will end up 00:36:04.340 |
using this one lawyer service, or this one accounting service or 00:36:08.020 |
what have you. So I definitely think that's kind of what we're 00:36:10.260 |
seeing. Yeah, I think it's pretty obvious where this is 00:36:12.460 |
going. You got copilots assisting a developer, or a 00:36:16.700 |
lawyer, or a writer. Then you get the next 00:36:19.780 |
phase, okay, you've got a peer, so you're doing peer programming 00:36:22.860 |
or somebody's kind of working alongside you, you're checking 00:36:25.100 |
their work, and maybe they're even checking your work, seeing 00:36:27.380 |
if you have bugs, where this is going to be next year is there's 00:36:30.300 |
going to be a conductor, there's going to be somebody who has a 00:36:33.660 |
role or a piece of software has a role, where you say, Hey, 00:36:36.980 |
you're a CEO of a company, you're a founder, a product 00:36:39.660 |
manager, here's your lawyer, here's your accountant, here's 00:36:42.700 |
your developers, here's your designer. And now you will 00:36:45.780 |
coordinate those five people. Now imagine how that changes 00:36:49.620 |
startups when you as an individual have a conductor 00:36:53.260 |
working with you and says, You know what, I don't know if I 00:36:55.780 |
agree with this legal advice that's coming in in relation to 00:36:58.260 |
the tax advice. And maybe we should not even add this feature 00:37:01.660 |
to the program. Let's talk to the product manager, the agent 00:37:05.420 |
product manager about taking that feature out. So we don't 00:37:07.700 |
have these downstream legal issues. And we don't even have 00:37:09.820 |
to file taxes in this area, it's gonna get really interesting 00:37:14.860 |
The other way it may go, Jason, is you have a lawyer that has 50 00:37:20.140 |
associates working for them through the AI. So you don't 00:37:23.260 |
replace the lawyer, you don't replace the software engineer, 00:37:26.260 |
the software engineer levels up. And now the software engineer 00:37:29.100 |
has 50 engineers available, 50 agents running, doing tasks for 00:37:32.540 |
them. You do still need humans with domain 00:37:35.900 |
expertise, and creativity to think through architecture to 00:37:39.100 |
think through process and to make sure that the AI agents are 00:37:42.820 |
doing their job. So I think what it creates is extraordinary 00:37:45.700 |
leverage for people and organizations, which is why 00:37:48.260 |
generally economic productivity goes up, people don't lose jobs, 00:37:51.940 |
In this phase, the opex of companies will probably be cut 00:37:56.500 |
in half. At the limit, I think Jason is actually absolutely 00:38:02.300 |
right. I think you find that there'll be millions of 00:38:05.220 |
companies with one person, and then a whole layer of software 00:38:08.300 |
and conductors and agents and bots. That's the future. So you 00:38:12.060 |
won't have these engineering people, that person should be 00:38:15.020 |
running their own company. And so you'll just have millions and 00:38:18.100 |
millions and millions and maybe billions of companies. And I 00:38:20.540 |
think that that's really exciting. Not all of them will 00:38:23.660 |
work, many of them will fail, and a few of them will be 00:38:26.340 |
ginormous. And it'll be up to the person who can navigate and 00:38:31.100 |
be a conductor, as you said, you know, yeah, really interesting. 00:38:34.860 |
The solo entrepreneur movement of the last couple years: there were 00:38:37.620 |
all these kind of like independent hackers, building 00:38:40.460 |
one item, like Phil Kaplan did with DistroKid. He said, like, 00:38:44.220 |
two or three people working on that, and it got very big. You know, I 00:38:47.660 |
was telling you guys about that Slopes app I showed you, I 00:38:50.460 |
reached out to the founder of that. I was like, Hey, tell me 00:38:51.980 |
about the business, like, it's enough of a business to support 00:38:54.220 |
one person or two people, like, there will be a lot of these 00:38:57.100 |
apps or services, one conductor. And yeah, it makes, what, half a 00:39:01.500 |
million a year, 3 million a year, whatever, it's enough to 00:39:03.580 |
support one, two, three people working on it. But previously, you know, 00:39:07.620 |
you'd be going to the venture community like, Oh, what did it 00:39:10.180 |
take a modern app company, Sacks, to kind of build an Android and 00:39:15.060 |
an iOS app just, you know, 5, 10 years ago? If we were 00:39:18.980 |
funding one 10 years ago, what would the footprint look like? 00:39:25.740 |
if you're going to go all the way back to like the late 90s 00:39:29.140 |
during the dot-com era, I remember that with PayPal, just to 00:39:33.580 |
launch, really, what was an MVP, we had, I'd say a dozen 00:39:38.380 |
developers, and it was pretty expensive. And we had to set up 00:39:40.940 |
our own colo. There's all this infrastructure that all got 00:39:44.020 |
abstracted away with AWS, then you move to the mobile era. And 00:39:48.380 |
the app stores provide. There's just a lot more developer tools, 00:39:52.740 |
more APIs, as well as distribution. But it's 00:39:56.220 |
far easier to code these apps. So definitely things have 00:39:59.100 |
gotten easier and easier. That's the trend. If that's the 00:40:01.620 |
point you're trying to make. It's certainly never been easier 00:40:04.980 |
to get started in creating something if you're a solo 00:40:08.220 |
developer. Yeah. That being said, I think that depending on 00:40:11.980 |
what you're trying to do, it's still usually the case that if 00:40:16.860 |
you're trying to do something interesting and profound, you're 00:40:19.260 |
going to need a small team of developers and a couple million 00:40:24.220 |
Yeah, it used to be a rule of thumb, I think, 12 people for an 00:40:27.580 |
app company: you get two or three working on each platform, 00:40:29.740 |
a couple designers, QA testing and design, UX. You get 00:40:35.460 |
to 10, 12 people to run a modern one. Alright, everybody, next 00:40:39.900 |
issue, the House just passed a bill that would either ban TikTok 00:40:44.340 |
or force a sale. We talked about this bill being proposed 00:40:48.220 |
last week, and things had moved really slowly on the TikTok 00:40:52.140 |
ban. Now they're moving really fast. On Wednesday, the House 00:40:54.540 |
passed the bill with a bipartisan vote of 352 to 65. 00:40:59.420 |
Making this one of the few subjects that members of 00:41:01.940 |
Congress can agree on. Biden has also signaled his intent to sign 00:41:05.980 |
the bill into law should it pass the Senate. Passing the Senate, 00:41:08.620 |
that could be an obstacle: Democratic majority leader Chuck 00:41:11.140 |
Schumer has signaled a lack of interest in the subject and 00:41:14.340 |
said he'll review the bill with committee chairs before deciding 00:41:16.940 |
on the path forward. Arguments for and against the bill have 00:41:19.820 |
centered around a few main points. Reciprocity: we talked 00:41:22.660 |
about that here. You can't use Instagram, X, or any of our 00:41:26.500 |
domestic social networks in China. And if they won't allow 00:41:30.100 |
us in their country, why should we give them unrestricted access 00:41:32.700 |
to ours? Stifling debate: progressives fear this isn't 00:41:36.580 |
really about national security. Their position is mainstream 00:41:41.380 |
politicians are hoping to shut down political discourse, 00:41:43.860 |
particularly among the youth, who are mostly on TikTok, 00:41:48.140 |
particularly pro-Palestinian and anti-Israel discourse, 00:41:51.500 |
which seems to flourish on TikTok versus other platforms. 00:41:55.140 |
Coincidentally, Joe Biden launched a TikTok account 00:41:58.940 |
last month, and his comments were instantly flooded with pro- 00:42:02.580 |
Palestinian remarks, some calling him "Genocide Joe." Third 00:42:08.460 |
argument: overreach. Sacks, I think you've pointed out that 00:42:12.060 |
the language in this law is a bit vague; it needs to be 00:42:14.940 |
tightened up. Maybe the President could go after 00:42:19.220 |
companies supposedly aligned with foreign interests who 00:42:21.540 |
aren't. Our esteemed patriot and friend Keith Rabois argued 00:42:25.220 |
with you, Sacks, on this on X. And then there are guys like Trump 00:42:29.140 |
and Vivek, who I believe are flip flopping based on securing 00:42:32.580 |
bags. Vivek called TikTok "digital fentanyl." And Trump 00:42:37.100 |
issued an executive order calling for ByteDance to divest 00:42:40.500 |
TikTok in 2020. Now, they're both opposing the ban. And 00:42:44.980 |
interestingly, both Trump and Vivek have ties to the 00:42:48.340 |
Republican mega-donor Jeff Yass, who is a major shareholder 00:42:53.020 |
in ByteDance with a reported $15 to $30 billion stake. He gave 00:42:57.820 |
Vivek 5 million bucks, who knows what Trump's gotten. But they 00:43:02.540 |
said they're back in love. Sacks, you had this big back and 00:43:05.500 |
forth with Keith Rabois on X. Are you in support of the 00:43:10.740 |
divestiture or not? I haven't been able to track exactly where you stand. 00:43:15.100 |
Well, I think my take on this, and I'm gonna have to revise and 00:43:19.780 |
extend my remarks from last week, because I didn't know as 00:43:21.940 |
much about the bill, I hadn't actually read the language yet. 00:43:24.740 |
And now I have. And my take on this is that the bill poses 00:43:29.740 |
significant risk of being Patriot Act 2.0. So in other 00:43:33.780 |
words, you know, a threat to the security of the United 00:43:37.580 |
States is basically hyped up, some part of it may be real, 00:43:41.140 |
some of it may be threat inflation. And then we give the 00:43:44.420 |
intelligence community and the government new powers, which can 00:43:48.220 |
then be abused. And that's exactly what happened with the 00:43:50.780 |
Patriot Act, they ended up spying on Americans. Now, what 00:43:53.780 |
is the potential abuse here? This is the thing that I've 00:43:56.340 |
debated with Keith. And there's also other people who I respect 00:44:01.820 |
a lot. And Keith, by the way, is a very talented lawyer, in 00:44:04.180 |
addition to other things, being a successful founder and 00:44:09.820 |
investor. And then, you know, I saw that Saagar Enjeti, whose 00:44:13.900 |
show I was just on, thinks that the bill is just fine. So look, 00:44:17.220 |
there are people, very legit people who disagree with me 00:44:20.860 |
about this. But I went last night, I read this bill, like 00:44:23.380 |
four times to try and like parse the language. And I've just come 00:44:27.660 |
away concluding there's no, the way I see it, there's no way to 00:44:32.220 |
argue that this language isn't vague and couldn't invite abuse. 00:44:36.060 |
And you really have to dig into it. But let me just kind of walk 00:44:39.740 |
you through, I think a key part of it. So first of all, this 00:44:44.600 |
bill doesn't just ban or force the divestiture of TikTok, it 00:44:49.460 |
goes after what it calls foreign adversary controlled 00:44:53.980 |
applications. Now, what is a foreign adversary? There are 00:44:57.220 |
four countries that are defined as foreign adversaries, it's 00:45:00.780 |
basically, Russia, China, Iran, and North Korea, I'm not super 00:45:05.500 |
worried about that list of foreign adversaries growing, 00:45:08.620 |
because that does take a bunch of procedural hoops to go 00:45:11.940 |
through. What I am concerned about is when the bill talks 00:45:15.600 |
about what makes a company controlled by a foreign 00:45:20.260 |
adversary. And if you go to that language, which is in the 00:45:23.780 |
definitions (what frequently happens with these 00:45:26.820 |
bills is that a lot of the meat is actually in the definitions), 00:45:29.300 |
you have to look at this very carefully. It says here, the 00:45:32.540 |
term "controlled by a foreign adversary" means, with respect to 00:45:34.820 |
a covered company, that such company or entity is, and then 00:45:38.340 |
there's three categories. The first category is a foreign 00:45:41.420 |
person, which can also mean a foreign corporation that's 00:45:44.100 |
domiciled in or is headquartered or has its principal place of 00:45:46.980 |
business, or is organized under the laws of a foreign adversary 00:45:49.900 |
country. So that would be like ByteDance. Okay, ByteDance is a 00:45:53.380 |
Chinese company, okay, then it says, or it can be an entity, 00:45:57.740 |
where 20% of the ownership group is in that foreign person 00:46:02.940 |
category. So that would be like you, let's say you had a US 00:46:06.060 |
company, but 20% of the company was owned by, I don't know, 00:46:10.940 |
Chinese VCs or by ByteDance, okay, that would also be 00:46:16.100 |
controlled by a foreign adversary. Then you get to the 00:46:19.040 |
provision that I think is the most problematic, which is, it's 00:46:22.740 |
a person subject to the direction or control of a foreign 00:46:27.380 |
person or entity. The novel language here is where it says 00:46:30.900 |
subject to the direction of, okay, it's not just saying under 00:46:35.700 |
the control of, it's saying subject to the direction of. In 00:46:39.460 |
my view, that's very vague. And a creative prosecutor, a creative 00:46:45.260 |
Attorney General could try to say, well, wait a second, if 00:46:48.500 |
Elon has a major Tesla factory, in Shanghai, is he subject to 00:46:55.700 |
the direction of the Chinese Communist Party, because they 00:46:58.460 |
could influence him, they could leverage him. If Donald Trump is 00:47:03.180 |
accused on virtually a daily basis of being a Russian asset, 00:47:07.300 |
is he subject to the direction of Vladimir Putin? David Frum 00:47:11.820 |
just tweeted very recently that not only Trump, but 00:47:15.100 |
the entire Republican Party that works for Trump, is under the 00:47:19.660 |
Yeah, but that's a partisan hack, though; that's not, like, 00:47:23.140 |
factual in a court, right? So this would have to be proven 00:47:26.220 |
factual in a court. Okay, do you think, or no? 00:47:29.380 |
Well, the ag could open an investigation, based on the 00:47:34.260 |
theory that, for example, Trump owns Truth Social, and Trump is 00:47:38.820 |
under the direction of a foreign adversary, i.e., Putin, because the 00:47:42.860 |
mainstream media continues to normalize this idea and spread 00:47:46.980 |
this idea on virtually a daily basis. So the point is that 00:47:50.340 |
look, first, the AG would open an investigation. Then think 00:47:53.860 |
about the sledgehammer type of remedy here, that an Elon or a 00:48:00.500 |
Trump, or, take Rumble, for example, which is also accused 00:48:04.700 |
of being a Russian agent, they could be forced to divest the 00:48:08.620 |
company or to have it be banned. So it gives huge, I think, new 00:48:13.900 |
powers to the executive branch to pursue political opponents 00:48:19.660 |
and political enemies. Whether they actually win in a court of 00:48:22.860 |
law down the road is kind of secondary, because they can vex 00:48:27.140 |
and harass their political enemies using this power. So my 00:48:31.100 |
point is that at a minimum, I think this language needs to get 00:48:33.660 |
cleaned up. I think it is way too vague. And you still have 00:48:38.780 |
the issue of whether it's a good idea or not, to force the 00:48:42.340 |
banning or divestiture of TikTok. But this bill goes way 00:48:45.540 |
beyond it. Again, it creates this novel category of foreign 00:48:50.020 |
adversary controlled applications, which also, like I 00:48:52.580 |
said, includes websites. And that means not just foreign 00:48:56.460 |
companies, but domestic companies as well that are said 00:48:59.980 |
to be under the direction of a foreign actor. So again, I 00:49:03.980 |
don't know how anyone can look at this language and not say it's vague. 00:49:08.020 |
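To make the three-prong test concrete, here is a minimal sketch in Python, assuming illustrative names, fields, and simplified criteria rather than the statute's exact text, of how the "controlled by a foreign adversary" definition described above might be evaluated:

# A minimal sketch of the bill's three-prong "controlled by a foreign
# adversary" test as described in the discussion. The class, field names,
# and the boolean third prong are illustrative assumptions, not statutory text.
from dataclasses import dataclass

FOREIGN_ADVERSARIES = {"China", "Russia", "Iran", "North Korea"}

@dataclass
class Company:
    name: str
    domicile: str               # where it is domiciled, headquartered, or organized
    adversary_ownership: float  # fraction owned by foreign-adversary persons, 0.0-1.0
    subject_to_direction: bool  # the vague "subject to the direction of" prong

def controlled_by_foreign_adversary(c: Company) -> bool:
    # Prong 1: a foreign person or corporation domiciled, headquartered,
    # or organized in a foreign adversary country (e.g., ByteDance).
    if c.domicile in FOREIGN_ADVERSARIES:
        return True
    # Prong 2: an entity with 20% or more of its ownership in that
    # foreign-person category (e.g., a US company 20% owned by Chinese VCs).
    if c.adversary_ownership >= 0.20:
        return True
    # Prong 3: a person "subject to the direction or control of" such a
    # foreign person or entity, the open-ended criterion flagged as vague.
    return c.subject_to_direction

print(controlled_by_foreign_adversary(Company("ByteDance", "China", 1.0, False)))  # True (prong 1)
print(controlled_by_foreign_adversary(Company("USCo", "USA", 0.20, False)))        # True (prong 2)

The point of the sketch is just that the first two prongs are mechanical checks, while the third is a judgment call, which is exactly the vagueness being debated here.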
Yeah, it's easy enough. Tighten it up. And if it was tightened 00:49:10.900 |
up, you'd be in favor of the TikTok ban, I take it, Sacks, or 00:49:14.500 |
divestiture, I should say, I keep saying ban. You would still 00:49:17.060 |
want the Chinese government to not have control of this or, or 00:49:20.860 |
I still have very mixed feelings about the idea of just the 00:49:25.260 |
TikTok ban. Because look, you're talking about an app that 100 00:49:28.060 |
million Americans use for something like 90 minutes a day. 00:49:32.940 |
So people obviously get a lot of value out of this. I've yet to 00:49:35.820 |
see the hard proof that this app is under the control of the 00:49:41.300 |
CCP. I mean, I know that allegation is made. I can 00:49:44.380 |
understand why journalists are ready. Yeah. But it's very 00:49:49.180 |
unclear to me that they have the goods on that. And I do think 00:49:53.440 |
that depriving Americans of the right to use this app that they 00:49:57.260 |
clearly enjoy and love, based on a threat that has yet to be 00:50:02.860 |
proven. That makes me deeply ambivalent about it. 00:50:06.580 |
Chamath, if we were to look at, say, media channels, newspapers, 00:50:11.020 |
there's been rules about foreign ownership of those do you 00:50:14.580 |
believe TikTok kind of falls into that? As Sacks said, 00:50:17.060 |
Americans are enjoying this in a massive way, 100 million plus, 90 00:50:21.740 |
minutes a day, the statistics are crazy. That actually, I 00:50:24.540 |
think argues for, you know, not having a foreign adversary own 00:50:29.060 |
this or have access to it, we wouldn't allow them to own CNN, 00:50:31.940 |
Fox, New York Times, Washington Post, etc. We have rules against 00:50:36.060 |
that already. So are you in favor of this TikTok 00:50:39.260 |
divestiture? Yes or no? Yeah. Okay. Yeah. Freeberg? 00:50:53.020 |
I'll tell you before I tell you why. I'll tell you a quick 00:50:55.900 |
story. Two or three months ago, after Joker was sick, we started 00:51:00.760 |
to make his own dog food, right. So to have like a super bland 00:51:04.700 |
diet, we had the service that was sending us food and we got 00:51:08.100 |
rid of it and we would just make it ourselves. And part of the 00:51:12.580 |
meal was like some raw apples and carrots that I 00:51:17.540 |
would cut up. And I would always complain to my wife, I hate peeling 00:51:22.060 |
these apples or whatever, slicing them and taking the core 00:51:25.300 |
out. I just said that. For the next month and a half, all I 00:51:30.540 |
got was apple-coring utensils on TikTok. And she was like, Oh, 00:51:35.580 |
you know, we should get rid of XYZ food service. And she would 00:51:38.900 |
just get plastered with these ads. And it was just a reminder 00:51:42.860 |
to me that these apps are constantly listening. Now that's 00:51:45.500 |
a benign example. But my phone is on my desk when I'm talking 00:51:49.540 |
about some really important stuff. Again, important related 00:51:54.300 |
to me both personal and professional, where there's lots 00:51:57.180 |
of money on the line. There's moments where for certain parts 00:52:01.220 |
of my business, like with crypto, we have like 19 layers 00:52:04.860 |
of people that have to you have passwords upon passwords upon 00:52:09.340 |
passwords to do stuff. The phone is always there. It was just a 00:52:13.900 |
reminder to me. So I deleted TikTok, it's gone, which sucks 00:52:17.740 |
because I would relax with that app at night. Like, you know, I 00:52:21.220 |
would have 15, 20 minutes where I would decompress. It's super fun. 00:52:24.660 |
It's a great app. I have to be honest with you. I love it. But 00:52:29.340 |
as a consumer, that was the decision I made as a business 00:52:33.860 |
person. What I'll say is it is inconceivable to me that our 00:52:41.940 |
voice signatures aren't being mapped, and there isn't an 00:52:50.660 |
understanding of what we're all saying. And it's then further 00:52:55.260 |
inconceivable to me that there isn't a service that's 00:52:58.860 |
identifying that this is probably David Sacks, and this 00:53:02.140 |
is Chamath, and this is Freeberg. And we're not really all 00:53:05.580 |
that important. But let's take a better example. This is Elon 00:53:08.500 |
Musk. This is the President of the United States. Or 00:53:11.740 |
the waiter that works for the president, who has his phone in 00:53:14.540 |
his pocket while he's sitting inside, you know, the 00:53:18.900 |
residences of the White House. So I think it's happening. I 00:53:23.820 |
think they're not the only one though. I think there's 00:53:26.100 |
American companies that are doing it too. And so I think 00:53:30.340 |
that we need to start somewhere: until we can have our apps in 00:53:35.940 |
their market, they shouldn't have their apps here. And I think 00:53:41.740 |
Completely reasonable. I think I'm gonna disagree. A 00:53:45.140 |
couple points. I don't believe in the notion of, like, 00:53:50.340 |
reciprocity for reciprocity's sake. China blocks access to US 00:53:55.100 |
content. Does that mean the US government should block access 00:53:58.180 |
to international content? I think the answer is no, because 00:54:01.100 |
this country is different than China. We have afforded 00:54:04.980 |
ourselves freedoms and liberties that don't exist in 00:54:07.700 |
other countries, freedom of choice, freedom of speech, 00:54:11.020 |
freedom of the press, and so on. So I don't think that the 00:54:13.580 |
government should be restricting access to content. Because 00:54:16.940 |
another country restricts access to our content, I think that we 00:54:20.260 |
should make decisions based on what's right for the citizens, 00:54:23.580 |
what's right for the country, what's right for national 00:54:25.300 |
security. What about spying apps? So that's 00:54:28.540 |
my next point. So I do think that if there is a case to be 00:54:31.900 |
made, that there is spying or data acquisition that's going on 00:54:35.700 |
through these apps, we're not talking about rice noodles over 00:54:38.260 |
here, you know. And so I think if that's true, 00:54:42.220 |
then I would imagine that there are multiple paths to alleviate 00:54:45.900 |
that, like move all the servers to the US and separate the 00:54:48.580 |
entity, force a sale, you know. Like, I don't think that 00:54:52.100 |
it's necessarily appropriate to say that there aren't other 00:54:56.020 |
courses or other options available to try and prohibit 00:54:59.820 |
what should generally be prohibited, which is spying on 00:55:03.300 |
American citizens, and capturing data that people, you know, 00:55:07.780 |
haven't opted into being captured, I do believe that 00:55:11.140 |
citizens and people should have the right to decide if they want 00:55:13.900 |
to have their data used to be able to access an app, I 00:55:16.740 |
actually am not a big believer that we should be paternalistic 00:55:20.100 |
in the government sense and saying, having the government 00:55:22.620 |
come in and say, here's an app, and we have determined that it 00:55:26.460 |
is not good for you, because this data is being used in a 00:55:28.860 |
manipulative way against you. I think that citizens should be 00:55:31.620 |
afforded transparency and make a decision about whether or not 00:55:35.500 |
And I don't think citizens have the 00:55:37.220 |
sophistication to understand what foreign adversaries (again, 00:55:40.980 |
we only have four, so they were named there for a reason) are 00:55:44.140 |
doing with our data. And I don't think that they should have to 00:55:48.340 |
be forced to choose. I think that, you know, this is not 00:55:50.660 |
dissimilar to how the FDA says, you're not qualified to decide 00:55:54.540 |
whether this drug is good, we will tell you, and people are 00:55:57.180 |
okay with that. Because you are saying this is part of the 00:56:00.140 |
government infrastructure that's full of experts who know the 00:56:03.180 |
totality of a subject. And so the problem is, in order for 00:56:06.340 |
anybody to make a reasonable decision, you'd have to share so 00:56:08.740 |
much as to just completely blow up a bunch of national security, 00:56:12.980 |
Well, look, I think we can have audit rights, and we can have 00:56:15.260 |
rights that protect our citizens. I think that that's 00:56:17.220 |
appropriate. I'll also say my final point on this, I think 00:56:20.940 |
that whether it's intentional or not, this sort of action leads 00:56:26.300 |
to inevitable cronyism and regulatory capture. Who does 00:56:29.340 |
this benefit? If TikTok gets banned in the US, it's going to 00:56:32.060 |
benefit meta, it's going to benefit Instagram, it's going to 00:56:35.380 |
benefit a few other, you know, social networks going to benefit 00:56:38.020 |
Elon at Twitter, there's a few folks that are going to benefit 00:56:41.020 |
pretty significantly if TikTok is banned, because all 00:56:43.860 |
those users are going to migrate. I will also say by the 00:56:46.180 |
way that many thousands of people make their living on TikTok, 00:56:48.740 |
like it or not, it's a big income stream for a lot of 00:56:51.300 |
people in the US, and a really important part of their 00:56:54.060 |
And just to be clear here, this is not about banning TikTok. 00:56:57.020 |
This is about divestiture. Then if it's not divested, then it 00:57:01.100 |
gets banned. So if you just think from first principles 00:57:03.220 |
here, why wouldn't they divest? Why would you keep this? The 00:57:09.340 |
reason is, this is an incredibly powerful tool. Under no 00:57:13.380 |
circumstances, would the Chinese government ever allow us to have 00:57:17.860 |
this kind of reach into their populace. And if you want to 00:57:20.140 |
judge a person, you can just look at their behavior. Citizens 00:57:23.740 |
in China live either in a complete police state where 00:57:27.780 |
everything they do is tracked every transaction, facial 00:57:31.060 |
recognition, every step they take, every purchase they make 00:57:34.340 |
is tracked, or you're in a literal concentration camp. 00:57:37.740 |
That's how the people of China live today. We are their 00:57:41.700 |
adversary, they have noted that we're their adversary. Under 00:57:45.860 |
what circumstances would you think they would treat us any 00:57:48.780 |
differently than they treat their own citizens? It is 00:57:51.540 |
absolutely insane that anybody would take the side of the CCP 00:57:56.220 |
on this, you are asking them to divest, you're not asking them 00:57:59.860 |
to shut it down. That is a partisan sort of angle on this. 00:58:04.260 |
And there's a lot of partisanship going on, on the 00:58:07.140 |
edges on this. But what we have to realize is they would never 00:58:10.140 |
allow us to do this to their citizens, and how they treat 00:58:14.740 |
As an example, we've tried, you know, to export sugar 00:58:17.980 |
via McDonald's and Coke, and actually, they have domestic 00:58:21.300 |
brands that they were able to support and prop up. So they 00:58:23.460 |
have more control of that, too. So even our downstream 00:58:26.820 |
attempts to actually send products that, theoretically, over 00:58:30.100 |
long periods of time aren't necessarily beneficial, they've 00:58:35.180 |
Yeah, and they don't let their kids play video games or use 00:58:37.820 |
Well, even video games, they said video games are bad. It's not 00:58:40.900 |
like they're like, Oh, Activision, come on in, the doors 00:58:43.020 |
are open. They're like, you can use it an hour. No, I think it's 00:58:45.660 |
like three hours a week, and certain days. And then the 00:58:48.540 |
weekends, and for certain kids it's totally banned. They have an 00:58:51.140 |
ability to make strategic decisions that benefit their 00:58:53.740 |
population. And I think that we have never attempted to I'm not 00:58:58.300 |
saying that this law is going to be great. But you'd have to be 00:59:01.820 |
extremely naive to assume that there is nothing bad going on 00:59:05.820 |
here with this app. It doesn't mean that there's nothing bad 00:59:08.420 |
going on with all the other apps. I am assuming that all 00:59:11.660 |
these apps have foreign actors that have infiltrated them. I 00:59:14.780 |
think that's the right posture to have. I don't have Instagram 00:59:18.380 |
on my phone. I don't have Facebook on my phone. I don't 00:59:21.060 |
have Tick Tock anymore on my phone, the only thing that's 00:59:23.340 |
left is X. And at some point, if those apps can prove that there's 00:59:30.060 |
a chain of custody, so for example, you know, one thing 00:59:32.580 |
that I thought of you could do is put some kind of 00:59:35.860 |
ITAR-like regulations around this, so that only US citizens 00:59:38.980 |
could work on these apps, any app that is ambiently observing 00:59:43.620 |
you. For the overwhelming majority of people, there's 00:59:47.620 |
very benign implications. But the whole point is that if you 00:59:51.140 |
have 300 million people that are being observed, you're going to 00:59:55.380 |
also get the 1,000 or 10,000 that are super important. And you're 01:00:00.660 |
going to catch stuff that you're not allowed to know. And that's 01:00:03.860 |
They're already doing this. Pull up the article, Nick. TikTok has 01:00:08.140 |
already admitted to and been caught with their hand in the 01:00:11.540 |
cookie jar spying on journalists. We know that 01:00:14.980 |
they're doing this already. And we know their track record for 01:00:18.260 |
spying on their own people. This is the no-brainer of no-brainer 01:00:21.900 |
decisions, just divest. And if they won't divest, it tells you 01:00:25.580 |
everything you need to know, this is super valuable to them. 01:00:28.980 |
And that algorithm is so valuable that they know they can 01:00:34.380 |
I think the other problem is, I'm not sure that the people 01:00:37.500 |
with the technical sophistication will have the 01:00:40.420 |
time to then actually go and audit the code base to really 01:00:43.660 |
know that there aren't any Easter eggs in here. And there 01:00:45.860 |
aren't real backdoors. And they won't. So this is my point. I 01:00:49.100 |
think divestiture should not be allowed. I actually just think 01:00:51.820 |
unfortunately, this app should just be shut down. And the 01:00:54.420 |
people will migrate to Instagram, and the people will migrate 01:00:57.180 |
to YouTube, and/or to new products. And the 01:01:01.660 |
thing there is that at least these are American controlled. 01:01:04.860 |
And are there data leaks happening in those apps? 01:01:07.300 |
Probably. But it's at least not so brazenly obvious. 01:01:13.260 |
I'm a little confused here. Are you guys saying that the Chinese 01:01:16.700 |
government uses TikTok to spy on its own population? Or they 01:01:21.660 |
No, they ban TikTok there because it's harmful. No, they have 01:01:24.980 |
complete control over TikTok. Yes. And they control that in 01:01:27.980 |
their country. They use cameras, Sacks. And you know this full 01:01:31.820 |
well, on every single corner, with facial recognition, and every 01:01:35.900 |
one of their phones is tracked. You know this full well, Sacks, 01:01:39.140 |
they are spying, whether it's in their pockets or not. No, they have banned 01:01:42.940 |
TikTok. They have a different version, 01:01:45.100 |
called Douyin. And in that version, by the way, 01:01:49.500 |
Douyin's algorithms are very different. They push educational 01:01:52.780 |
content. So basically, like, on the ByteDance 01:01:56.260 |
board, what happened with this whole thing is the CEO and 01:02:00.140 |
the founder had to step down effectively, right, they brought 01:02:04.020 |
in effectively new management. And in that, the CCP has a 01:02:08.220 |
board director and has what's called the golden vote. 01:02:11.820 |
So they effectively decide. And so what you saw were the 01:02:15.060 |
algorithms morph over time from pure entertainment, to things 01:02:19.020 |
that pushed a lot of educational content. They also then imposed, 01:02:23.540 |
broadly speaking, limits on content that they didn't like. 01:02:27.700 |
They would only allow, in other apps, experiences where they 01:02:32.340 |
also have a board seat now and a golden vote. As Jason said, 01:02:35.940 |
games can only be played in certain apps for certain amounts 01:02:39.380 |
of time. So they've made their own decisions about what's good 01:02:42.500 |
for their own population. The flavor of the app that's here, 01:02:46.060 |
my biggest point is that, again, you have to pick the 01:02:49.820 |
lesser evil. I think it's pretty reasonable to assume that all 01:02:53.380 |
these apps are infiltrated. But this one is more brazen, because 01:02:58.380 |
it is really the only one that's completely and obviously foreign 01:03:01.100 |
controlled. And I don't think that if you tried to divest it, you 01:03:07.420 |
will get any assurance that that code is reliable. And so you'd 01:03:10.460 |
have to build the whole thing from scratch all over again, for 01:03:12.740 |
what it's worth, the company has said that they're willing to 01:03:14.860 |
move all of their hosting in the US to Oracle data centers in the 01:03:21.900 |
David, you and I know what that means. Could you imagine a 01:03:24.780 |
migration like that? Who is going to look at every single 01:03:27.540 |
endpoint, every single line of code? That is a joke. That's 01:03:34.740 |
All right. It seems like we're, I don't know if we're 2v2 here. 01:03:38.540 |
But look, you may or may not be on board, Sacks, with the language as is. 01:03:43.580 |
I'm ambivalent about it, because I'm open to the idea of banning 01:03:48.900 |
a foreign app in this way. But the reason why I'm a little bit 01:03:55.340 |
ambivalent about it is because, A, I think that what they're doing 01:03:59.780 |
has to be proven, as opposed to just a bunch of hand-waving over 01:04:03.020 |
it. And when you start talking about what's happening to the 01:04:05.500 |
Uyghurs, the concentration camps, and spying, that has nothing to 01:04:08.980 |
do with this particular app. And in fact, you've revealed this 01:04:11.780 |
app doesn't even function in China. I think there's just a 01:04:14.540 |
lot of hand-waving going on, not a lot of hard proof. And I don't 01:04:18.180 |
think that one little story in the New York Times, I would 01:04:23.220 |
Yeah. So that's number one is I'm open to this idea. But I 01:04:29.060 |
I don't think you're gonna get it. I think the NSA and the CIA 01:04:32.020 |
will read in the Intelligence Committee. And there's a sub 01:04:35.980 |
committee that's probably been read in. But I can guarantee 01:04:38.260 |
you, the American public will not get read in on what 01:04:40.140 |
Mnuchin said on CNBC today, by the way, that, you know, he has, 01:04:44.340 |
based on his information, he believes strongly that they 01:04:47.260 |
should divest, whether it's to his group or another group. 01:04:49.580 |
These guys have lied so many times about so many things. I 01:04:52.700 |
mean, I just think the public has a right to know, look, if 100 01:04:55.660 |
million Americans are going to be deprived of using an app they 01:04:57.860 |
love, why can't we get a little bit more information? Well, 01:05:00.180 |
they're not exactly, what are they saying is done wrong? 01:05:03.140 |
They're saying it, they're saying it's under control, it's 01:05:05.860 |
being controlled by a foreign adversary. They are 01:05:07.900 |
saying it, the bill says it. What is debatable? They're 01:05:12.460 |
telling you what it means: all that means is that the app is owned by 01:05:15.820 |
ByteDance, which is incorporated in China. 01:05:18.940 |
Here's what's 01:05:21.300 |
unique about this bill. It got out of committee 50 to zero. It 01:05:26.780 |
was overwhelmingly approved by both sides of the house. And it 01:05:30.220 |
looks like in the Senate, there may be some refinements that 01:05:32.860 |
happened, but it's going to get largely overwhelmingly approved 01:05:35.580 |
there as well. It's like the Patriot Act. Hold on. It's a 01:05:38.580 |
very unique moment in time, where you see in today's 01:05:42.340 |
political landscape, such uniformity. And again, I'm not 01:05:48.300 |
going to go and defend these folks. But except to say that 01:05:51.340 |
where there's smoke, there's probably fire. And I think that 01:05:55.860 |
in this specific narrow case, again, we're not talking about 01:05:59.460 |
foodstuffs or lumber, right, we're talking about a 01:06:02.780 |
technological application that is observing a lot of people on 01:06:06.780 |
a continuous 24 by seven basis. I suspect this, this law would 01:06:11.580 |
not have gotten out of committee 50 to zero, had they not been 01:06:14.900 |
read in by the NSA and the CIA. And that will never see the 01:06:18.740 |
light of day because we will never be disclosed that 01:06:20.780 |
information. Okay, so it could have been 49 to one, David, you 01:06:25.740 |
I want to respond to that. If it's ideological, I could see it 01:06:29.420 |
being 40 to 10. If it's ideological, I could see 01:06:36.740 |
So was the Patriot Act. Look, there's an old saying in 01:06:39.900 |
Washington that the worst ideas are bipartisan. These guys were 01:06:43.860 |
all being stampeded into this act, they brought it up and 01:06:46.980 |
passed it with hardly any debate. I've already shown you 01:06:50.140 |
how the language of the bill is overly broad. It's not just a 01:06:53.620 |
TikTok ban. It says that any app or website, did you know 01:06:58.500 |
that, any app or website domestically that's 01:07:02.580 |
subject to the direction of a foreigner, one of these 01:07:09.820 |
I said a foreigner from one of those countries. Yeah, no, I'm 01:07:12.220 |
trying to be clear for the audience. All you got to do is 01:07:14.300 |
make that argument. If you're an AG who wants to go after one of 01:07:18.540 |
our domestic platforms, that's all you got to do to bring them 01:07:21.220 |
under your thumb, what would be the proper route? If a US 01:07:25.500 |
platform was in fact compromised by the Chinese, North Koreans, 01:07:29.100 |
or Iran? What would be the proper path in your mind? 01:07:33.060 |
If an app is sharing US data in a way that it shouldn't, then 01:07:40.060 |
Yeah, which is what this is saying, right? The AGs would 01:07:42.780 |
go? It's not even, it's banning TikTok, okay, 01:07:46.140 |
subject to maybe divestiture happens. But there's a lot of 01:07:48.940 |
people who think that the six months they're giving for the 01:07:51.140 |
business is not going to work. And the Chinese government may 01:07:53.500 |
not go for the idea of allowing tik tok to be bought by an 01:07:57.540 |
American company, because they may not like the precedent that 01:08:01.700 |
I think you're answering your own question. I think you're 01:08:03.460 |
making a very important and good point, which is, there's so much 01:08:07.100 |
body of business law here. And there's so many state AGs who 01:08:11.300 |
would want to make their bones. And you see the state AGs 01:08:14.780 |
trying to build their credibility, going after 01:08:17.300 |
Facebook for sexual exploitation, and all of this 01:08:19.780 |
stuff. There, to your point, David, there is such a body of 01:08:23.740 |
lawsuits and law and precedent that TikTok could have been 01:08:27.540 |
prosecuted under, it's very odd and rare that there's been a 01:08:31.220 |
complete detente, where this thing came down from up above, and 01:08:35.420 |
everybody just got on board. And all I'm saying is, in these very 01:08:39.140 |
rare and unique instances where somebody shelved their own 01:08:48.980 |
Alarm bells go off when DC acts with this kind of unanimity, 01:08:52.140 |
because the only time they ever do that, when they become a 01:08:55.300 |
uniparty, is when the national security state wants some new 01:08:58.020 |
power. I agree with this is how we got into the 01:09:00.540 |
almost unanimous. This is how we got into the Patriot Act. I'm 01:09:06.780 |
every dumb idea that gives the federal government more power 01:09:12.940 |
You're absolutely right. And I'm agreeing with you. They were 01:09:16.060 |
read in at a national security level that got them to become a 01:09:19.340 |
uniparty on this topic. We are never going to get, and I'm just, 01:09:22.900 |
you know, responding to Freeberg, we, the people, are never going to 01:09:25.940 |
get it, because of this, like, phony concern, about, you know, 01:09:29.940 |
they always use classification when they don't want the public 01:09:31.780 |
to know something. Now, by the way, this should be secret. But 01:09:34.660 |
by the way, give us the proof. Why can't they give us a proof? 01:09:37.260 |
I don't think it needs to be proof. It's the potential damage 01:09:40.140 |
here that they're saying is the problem, Sacks. They could wake 01:09:43.780 |
up this algorithm at any point in time. And they could steer 01:09:47.220 |
people's thinking. They have already steered it. Your party 01:09:51.060 |
has talked incessantly about how TikTok has steered cultural 01:09:56.140 |
issues in a certain direction, thus Libs of TikTok, all this 01:09:59.580 |
kind of stuff. There is ample evidence that these social 01:10:04.300 |
networks and these videos do influence young people. So if 01:10:08.140 |
young people are addicted to this, which they clearly are, 01:10:10.380 |
and if young people are influenced by this, which they 01:10:13.060 |
clearly are, those are just facts that are indisputable. The 01:10:16.900 |
Chinese could wake this up at any time, and steer political 01:10:20.180 |
issues left, right center cause chaos. It is the potential harm 01:10:24.140 |
here sacks. That is the concern. It's not just that they're doing 01:10:28.060 |
it now. It's that they could do it in the future. And if you 01:10:31.660 |
were to take this and you were to put, let's say this was North 01:10:34.540 |
Korea or Iran, would you allow North Korea or Iran 01:10:41.340 |
Of course, he would. I think my point would be the same, which 01:10:44.740 |
is I want to see some proof. Hold on, let me make a couple of 01:10:47.340 |
points. Number one, I want to see some proof that this is 01:10:50.620 |
actually spyware. That, to me, has been alleged. And if it's 01:10:54.660 |
such a slam dunk case, as you say, why is there no proof of 01:10:58.140 |
this? So that's point number one. And by the way, if the 01:11:02.960 |
phone is just sitting there passively listening to you, and 01:11:05.100 |
that's the spyware, Apple should really address that, because 01:11:07.940 |
apps shouldn't be able to do that. Right? So that's like 01:11:11.140 |
Apple's problem. With respect to the data, supposedly being 01:11:15.060 |
leaked to the Communist Party. Again, I want to see what data 01:11:18.900 |
we're talking about. Like, what TikTok videos I liked? What's 01:11:23.460 |
important about that data? There are bundles of data 01:11:27.700 |
that are available on the open market that they can buy about 01:11:31.180 |
I don't think that's what it is. These microphones are on 24/7, 01:11:34.860 |
these apps are allowed to passively listen in the 01:11:37.400 |
background. Though, I would just tell Apple not to allow that. 01:11:40.020 |
I think that that could be a partial solve. What about, apps 01:11:44.300 |
should not be passively listening to you, you should have to, 01:11:46.100 |
like, stick it in your face and talk to it if you want to activate it. 01:11:49.940 |
The problem is when you download these apps, you create these 01:11:52.520 |
settings. And so you have to try to convince 100 01:11:54.920 |
million people that, hey, you shouldn't do that. Maybe you 01:11:58.760 |
should only do it when you use the app in a specific way. And 01:12:01.380 |
it kills the usability. And so people get lazy. And they're 01:12:05.500 |
look, what I'm saying here is that the way the microphone works 01:12:08.800 |
on Apple products, I think it does work that way. 01:12:14.560 |
Yeah, it shouldn't work that way. I don't know if that's 01:12:18.120 |
okay. Listening is like the key issue here. Well, I mean, who 01:12:22.540 |
knows what's possible in terms of hacking and spyware. You know, 01:12:29.500 |
Okay, so one by one, we're kind of knocking out 01:12:32.620 |
all these issues. It's not passively listening. The data 01:12:35.920 |
that we don't know, we don't know, it doesn't seem that 01:12:38.320 |
important. With respect to steering, I agree. I guess 01:12:41.800 |
that's a theoretical problem. But you could say that about any 01:12:44.520 |
of these apps, and you're on a slippery slope to basically a 01:12:47.800 |
First Amendment problem, because they're not letting users get 01:12:53.880 |
and then and then look, and then one final point about the 01:12:57.800 |
reciprocity argument is, if you want to make this about 01:13:01.640 |
reciprocity, then put it in a trade bill. That's where you 01:13:06.460 |
deal with reciprocity is you say, okay, these products from 01:13:10.500 |
China get to play in US markets, our products from the 01:13:13.580 |
United States get to play in Chinese markets, you do it 01:13:15.860 |
through a trade bill. This is not a trade bill. This gives a 01:13:18.380 |
new power to the government to define foreign adversary 01:13:23.840 |
Freeberg, I want to give you a chance to answer the two 01:13:25.500 |
questions. Are you in favor of divestiture? If it was Iran who 01:13:31.960 |
I don't know about divestiture. To Sacks's point, I just 01:13:38.960 |
haven't heard a clear point of view or a clear, like, piece of 01:13:42.720 |
information about what this app uniquely does that other apps 01:13:47.000 |
don't do. That creates a threat. It's just not clear to me. So in 01:13:52.320 |
general, I'm not a fan of, like, forcing companies to divest and 01:13:56.240 |
forcing things to be shut down and taking away human consumer 01:13:59.600 |
choice. I just don't think those are good things generally. Okay, 01:14:02.540 |
and, to Sacks's point, I'm just not compelled that 01:14:05.180 |
there's something here based on what I've seen. And then like if 01:14:07.880 |
Iran owned TikTok, I don't care who owns it, based on my prior 01:14:12.440 |
statement, I think people should have choice to use what they 01:14:15.620 |
want to use, provided that there is no spying. If there's 01:14:18.680 |
spying, I want to know that there's spying and how it's 01:14:20.360 |
happening. We need to technically fix that if someone 01:14:22.880 |
can passively turn on a microphone on an app and listen 01:14:25.800 |
to people in a room. Yeah, pretty smart scientists, we've 01:14:28.920 |
proven that it's possible, and they can also, we should fix 01:14:31.160 |
that. That's a bigger problem, then you need to shut down all 01:14:34.840 |
the iPhones we have. There's an Israeli software company, we 01:14:37.920 |
all know what it's called. We all can go and license the 01:14:40.280 |
software from them. Okay, they will help you do this. So 01:14:44.760 |
yeah, I'm sorry, guys. But that's, Apple needs to patch 01:14:49.160 |
it. That's exactly right. And I don't want a domestic app 01:14:52.880 |
to be able to do that to me. Yeah, exactly. It is happening 01:14:56.400 |
and it hasn't been fixed. And it's not that it hasn't been 01:14:58.960 |
fixed. The point is, when you have a large technical surface 01:15:01.800 |
area, you guys know this, there are bugs all the time, like we 01:15:05.160 |
deal with, for example, threat actors all the time. And like, 01:15:09.000 |
Microsoft has dealt with these for 50 years, right? Every new 01:15:12.400 |
update of Windows, every little thing has all of these little 01:15:15.720 |
backdoors and little threads that weren't cleaned up 01:15:18.000 |
properly. So this is an ongoing whack-a-mole problem. And the 01:15:21.680 |
reality is that a very small technical team has always 01:15:25.760 |
been able to stay a few steps ahead of Apple, and many of 01:15:30.680 |
these apps. And all I'm saying is, you should just not be so 01:15:33.280 |
naive as to assume that this is 01:15:36.160 |
something Apple can fix. I think that a small talented team of 01:15:39.400 |
hackers, whoever they work for, will always be able to find 01:15:43.320 |
these backdoors in every single new release of every single 01:15:45.920 |
operating system. And they will be a few steps ahead. So 01:15:49.040 |
that we're screwed anyway, because all the data is already 01:15:51.640 |
out there. So I think the point is, so knowing that, if you want 01:15:56.200 |
to maybe minimize the surface area of how bad this data leaks 01:16:00.240 |
to just the foreign actors that have infiltrated our American 01:16:03.520 |
companies, you can ban TikTok. Okay. If you want to just 01:16:03.520 |
open the door to it, keep it around. Look, I like I said at 01:16:11.040 |
the beginning, I'm ambivalent about this, because if you can 01:16:12.960 |
prove to me that there's really a national security threat, and 01:16:16.280 |
that your remedy for dealing with it is narrowly tailored, 01:16:19.720 |
then I'm potentially on board with this. But I don't believe 01:16:22.560 |
the threat has been proven. And I certainly don't think the 01:16:24.640 |
remedy is narrowly tailored. In fact, it's expansive, and it's 01:16:28.040 |
going to lead to weaponization. And I don't trust the government 01:16:30.640 |
to define new categories of foreign controlled applications 01:16:37.080 |
I think you're probably right, which is the unfortunate part. 01:16:39.600 |
But I do think that there's enough of these folks that have 01:16:43.640 |
been read into something that we're not being told. 01:16:48.160 |
Yeah, I'm not gonna say that it's accurate or not. But 01:16:52.480 |
I'm saying, I suspect that some body of work from a three letter 01:16:58.080 |
agency has made its way to enough of these people under 01:17:05.240 |
the national security state has a secret reason for wanting to 01:17:08.400 |
expand its powers. That just doesn't fill me with any 01:17:11.600 |
confidence. Again, Patriot Act 2.0. That's where I started with 01:17:15.040 |
this. I don't think this is the Patriot Act, because the Patriot 01:17:17.080 |
Act was like, I'm going to have now a broad license to probe 01:17:20.240 |
anything I want. This is about saying, this thing that is here 01:17:24.480 |
So it's not focused on TikTok. It's focused on, again, 01:17:29.800 |
first, what I'm saying is, applications, websites, 01:17:33.920 |
the Patriot Act was about the government being able to go 01:17:37.800 |
proactively wherever they wanted when they weren't allowed. That 01:17:42.040 |
is not what this bill says. This bill is more reactive, in 01:17:44.600 |
saying, okay, you're out, out of the sandbox. And all I'm saying 01:17:49.160 |
is you're right that that could be abused. But that's very 01:17:51.440 |
different than the Patriot Act, which was very aggressive. 01:17:55.160 |
All right. So there you have it, folks: two people, looks like, agree with 01:18:01.720 |
the divestiture. Two people have a difference of opinion. 01:18:05.240 |
Well, I don't agree with the divestiture. I think it should 01:18:07.560 |
be shut down. Okay, shut down. Yeah. So shut down divestiture. 01:18:10.560 |
Looks like lots of opinions. That's good. Lots of opinions. 01:18:12.920 |
That's good. Okay. I think we just had more debate over this 01:18:17.040 |
Perhaps. I mean, I think that's why people listen to this 01:18:20.120 |
program is because we're literally getting into the 01:18:22.520 |
absolute finest details and asking really probing questions 01:18:26.560 |
Sacks, that you love. 100 million people are soon gonna be asking, 01:18:29.960 |
when TikTok gets shut down, 100 million people who used it 01:18:33.160 |
every day are gonna be like, why did this happen? 01:18:35.320 |
I'm gonna say something about that. One of my kids and I'm not 01:18:39.240 |
gonna say who it is. We recently had them tested for ADHD. They 01:18:43.720 |
weren't doing particularly great in school. The response by some 01:18:48.360 |
of the folks in the school was Oh, there's meds for that. We're 01:18:52.360 |
like no meds. And what we did was we took away their iPad. And 01:18:59.280 |
we completely deprived them of all these apps and video games. 01:19:05.040 |
I cannot describe to you the magnitude of the turnaround in 01:19:09.280 |
this kid. Grades? Incredible. Where they were getting 60% and 01:19:16.960 |
70%, now getting 90%. Totally engaged, interesting, charming 01:19:23.200 |
kid that had lost a little bit. And I think that there was a 01:19:28.000 |
little bit of a daze. And I just want to say that in general as a 01:19:31.160 |
parent, whether the app is banned or not, who cares my lived 01:19:35.200 |
experience right now is that video games and these apps are 01:19:41.680 |
This is a known fact actually, Chamath, I looked into this as 01:19:45.960 |
well. It turns out students who used iPads and digital media 01:19:50.920 |
many times a day, they show the exact signs of ADHD: inattention 01:19:55.200 |
and hyperactivity and impulsivity. So whether you 01:19:59.800 |
think it's a real diagnosis or not, it's inducing those 01:20:04.000 |
I'm willing to say that my child probably had a very light form 01:20:07.200 |
of it. Maybe it was exacerbated by these video games and the 01:20:10.240 |
iPad. And these apps and you know, they were getting all the 01:20:14.560 |
TikTok content, but they were getting it on YouTube Shorts, 01:20:20.200 |
all this. Anyways, it went to zero. Not like you can have a little 01:20:23.880 |
bit here and there, zero. And the transformation in this last 01:20:27.000 |
month and a half, two months has been incredible. I don't know if 01:20:30.480 |
other parents are dealing with this. But what I'll tell you is 01:20:32.920 |
these apps are not good. So this whole consumer choice thing, I 01:20:37.160 |
think my view is a little tainted, because I get agitated 01:20:39.840 |
as a responsible parent, I'm like, just get this app out of 01:20:42.520 |
here. And I already know that I'm addicted to it. And it's not 01:20:44.760 |
good for me. I have no idea what it's doing to my kids. And I 01:20:49.800 |
If you look at the correlation between all of these 01:20:54.000 |
symptoms showing up in kids, it's basically the introduction 01:20:56.720 |
of the iPhone. And so people are kind of reaching consensus that 01:21:01.080 |
the iPhone and distraction, ADHD, spiking, depression, all 01:21:05.040 |
this stuff, anxiety, and kids is correlating with too much 01:21:07.600 |
screen time. You can do your own research, folks. 01:21:10.040 |
Maybe it correlates with prescription of SSRIs. Maybe it 01:21:14.840 |
well, no, that is part of it with schools going totally woke 01:21:22.800 |
COVID exacerbated it, when you 01:21:26.920 |
look at the trend lines, for sure. And the SSRIs and those 01:21:31.400 |
kind of things being prescribed is probably as a result of these 01:21:37.440 |
ADHD symptoms coming from screen time. And then it's probably 01:21:41.680 |
I think people just got a lot more prescription happy, like 01:21:45.200 |
they're just definitely over-prescribing. They're 01:21:47.040 |
willing to put kids on drugs for I mean, you talk about ADHD for 01:21:51.600 |
young boys, that's kind of like this normal behavior. 01:21:54.280 |
Yeah, no, absolutely. Like, kids running around outside and 01:21:58.480 |
having energy. If I had to pick an age, I would say 16, 17, 18 years 01:22:01.920 |
old for social media. And for phones, I would say 15 or 16. 01:22:05.360 |
You know, with a very controlled phone. So as this is one 01:22:09.160 |
person's opinion. All right, Florida's on the verge of 01:22:11.680 |
banning lab-grown meat. Freeberg, can you tee this up? 01:22:14.560 |
Yeah, so Florida has been debating a bill in their state 01:22:19.120 |
legislature since November. And it just passed the House and the 01:22:23.080 |
Senate, a vote of 86-27 in the House and 28-20 in the Senate, to prohibit 01:22:28.440 |
the manufacturing, sale, holding, or distribution of cultivated 01:22:32.400 |
meat. And it basically makes the sale or ownership of cultivated 01:22:36.440 |
meat a second-degree misdemeanor. This is lab-grown 01:22:38.760 |
meat. And this bill is now on to DeSantis's desk for signing. You 01:22:46.960 |
know, I'll kind of highlight a little bit of the motivation and 01:22:49.280 |
and some of the technical background and my point of view 01:22:51.080 |
on it. If that's okay, but do you do you see him eating the 01:22:53.960 |
meat while you're talking about it? Did you see the beach? I'm 01:22:57.600 |
paying attention. Unlike you guys. He's got this like 01:23:00.640 |
Flintstone size ham hock. We're all this and good. It feels like 01:23:04.680 |
a like a non issue to folks who eat meat and don't care. But I 01:23:07.960 |
just want to point out how much of this is generally a challenge 01:23:11.400 |
to enabling choice in new innovative technology, which you 01:23:17.000 |
know, we've seen attempts at this in the past. But what is 01:23:20.200 |
the reason with you? What is the reason to ban lab grown meat? 01:23:24.080 |
They love to ban in Florida. The motivation is that Florida 01:23:27.920 |
ranchers felt, the guy who has been advocating for banning 01:23:32.320 |
TikTok this whole show is now accusing Florida of being into 01:23:35.920 |
banning. Your boy loves to bitch. I'm the only person on this pod 01:23:40.400 |
who actually, Freeberg too, who are skeptical of, like, this 01:23:44.080 |
knee-jerk banning of everything. Ban everything? I would not ban 01:23:47.360 |
this. I didn't say to ban everything. Where's that? 01:23:50.240 |
We're just joking. We're joking. Don't label me with your J cal 01:23:57.040 |
It's just definitely becoming a minority position to be against 01:24:01.880 |
government intervention in free decision making and commerce by 01:24:06.640 |
individual citizens. Florida ranchers felt that their 01:24:10.240 |
livelihood was threatened. They have a billion-dollar-a-year 01:24:12.640 |
ranching business in Florida. Good for them. They should go ahead 01:24:15.880 |
and compete with whatever new technology is emerging. I would 01:24:19.000 |
say try a little role reversal. Imagine if you know, governments 01:24:22.520 |
and states tried to ban the use of the tractor for fear of 01:24:25.560 |
putting agricultural workers out of business, or you know, 01:24:29.000 |
software companies that did accounting software got banned, 01:24:32.120 |
because it could put accountants out of business, or electric car 01:24:35.680 |
production and use got banned, because it could put traditional 01:24:38.480 |
automotive manufacturers out of business, you could go down the 01:24:41.360 |
list, and you could create this position on nearly any new or 01:24:44.160 |
emerging technology that feels threatening to an incumbent 01:24:46.960 |
industry. And ultimately, it really only yields regulatory 01:24:51.240 |
capture, and a lack of choice and opportunity for new 01:24:55.480 |
innovation and for consumers to make decisions about what they 01:24:58.080 |
want. And the irony here is that so much of what's being consumed 01:25:02.120 |
in the market space today, part of their rationalization 01:25:05.720 |
is, oh, well, it's new technology. We don't know if 01:25:07.400 |
it's good for you. We don't know if it's going to work. The truth 01:25:09.840 |
is, there are federal regulatory bodies that have oversight on 01:25:12.320 |
this sort of thing. 20 years ago, almost all the cheese that 01:25:15.880 |
we ate in the United States was made from rennet. Rennet is an 01:25:19.280 |
enzyme that converts the protein in milk into cheese. We got 01:25:22.560 |
rennet from the stomachs of calves; we would scrape it out 01:25:25.800 |
and sell rennet, and it would be used to make cheese. Then came 01:25:29.040 |
recombinant engineering, where we could get bacteria 01:25:33.040 |
or yeast cells to make proteins. This technology unlocked the 01:25:37.200 |
opportunity to make rennet more affordably. So rather than go 01:25:40.360 |
and slaughter calves and get the rennet out of their stomach, we 01:25:43.120 |
engineer the bacteria or the yeast cell to make the exact 01:25:45.640 |
same protein. And that is now the entirety of the rennet 01:25:48.640 |
industry is recombinantly produced rennet. And the 01:25:51.400 |
entirety of cheese that we all consume is made using this 01:25:54.720 |
genetically modified yeast that makes this enzyme that converts 01:25:57.720 |
milk into cheese. The same is true across other industries. We 01:26:01.080 |
used to use animal fat for laundry detergent. Turned out it 01:26:04.240 |
was a lot cheaper to make enzymes using the same process I 01:26:07.600 |
just described instead of making animal fat. And now all of our 01:26:10.200 |
laundry detergent is recombinant enzymes. So I think like this 01:26:13.280 |
notion that we're going to ban this stuff is a regulatory 01:26:17.240 |
capture incumbency moment. It's totally wrong. It denies 01:26:20.560 |
consumers choice. And frankly, it flies in the face of what has 01:26:24.160 |
historically been a real economic opportunity to bring 01:26:27.280 |
costs down and to bring new innovation to market. And to try 01:26:30.120 |
and stall out innovation is going to leave the state, or this 01:26:33.120 |
country, you know, in a real kind of challenge compared to 01:26:35.400 |
other countries. I just think it's wrong. I think it's really 01:26:37.480 |
on this topic. You are 100% right. You're right. This is a 01:26:42.280 |
dumb thing to legislate. And it is meaningless and unimpactful. 01:26:46.680 |
And people should just decide based on what tastes better. 01:26:50.680 |
Or cheaper. That's different than a listening device. 01:26:55.000 |
Freeberg, is the criticism of this that it's regulatory 01:26:59.720 |
capture, or is it part of, like, this anti-woke kind of vibe? 01:27:04.520 |
there's a conservative movement in Florida, which has taken 01:27:06.960 |
hold, which I think this is key to. Now, in some ways, I would 01:27:11.760 |
argue that conservative movement has really important sociological 01:27:16.520 |
points of view. In other cases, I think it denies necessary 01:27:21.920 |
innovation, it denies necessary advancement to move industry 01:27:25.200 |
forward versus social change. And I think the ability to 01:27:29.000 |
kind of conflate the two, that, oh, you know, transgenderism in 01:27:32.520 |
elementary schools is the same as the woke leftists 01:27:36.280 |
from California, making, you know, lab grown meat, and they 01:27:39.280 |
all get kind of jammed together as one big tribal group. And 01:27:42.760 |
therefore, we should ban it all. What will likely end up 01:27:44.760 |
happening here is this will find its path to federal preemption. 01:27:47.320 |
Historically, when we've seen states try to impose these 01:27:51.800 |
sorts of bans, the companies that are ultimately affected the 01:27:55.120 |
innovators that are affected, go to the federal government, and 01:27:58.120 |
they try and legislate for a bill that says this stuff is 01:28:00.680 |
legal and should be broadly available. That federal 01:28:03.000 |
preemption then stops states from having a ban in place. 01:28:06.680 |
And so it's very likely that we'll end up seeing some 01:28:09.040 |
legislation here over the next couple of years. If this 01:28:11.400 |
technology is ultimately beneficial. The problem is now 01:28:13.720 |
that Florida has done this, I guarantee you're going to see 01:28:15.840 |
Texas, which is a huge ranching state, and many other states 01:28:19.040 |
step up to do it, it creates a really bad precedent for all 01:28:21.600 |
other disruptive kind of industries to be blocked by 01:28:23.920 |
their local economies that believe that they're under 01:28:27.020 |
threat. And, you know, it just creates like a lot of 01:28:31.240 |
Sacks, what should the sentences be with fake meat? 01:28:34.000 |
I don't know. I don't know. But let me state clearly what 01:28:40.640 |
I think, which is I think it's really terrible. When a bunch of 01:28:44.320 |
incumbents in an industry get together, and try to shut out 01:28:48.360 |
the innovative solution by inventing some unproven threat 01:28:59.200 |
I wanted to talk about this, because I think the two are so 01:29:02.240 |
similar. And they're not similar at all. Well, they're not 01:29:06.360 |
just go ahead. So, regulatory capture is regulatory capture. But 01:29:09.600 |
that's just the term of art. We're just you know, fair enough. 01:29:12.920 |
Yeah, we disagree on Yeah, that's fair. I think you guys 01:29:15.680 |
have to have the intellectual honesty to say these are 01:29:17.560 |
different issues. One is lab grown meat versus ranchers in 01:29:20.640 |
America. This is typical corruption and cronyism. Fine, 01:29:25.720 |
Oh, yeah. But you don't think that the 01:29:28.960 |
other tech companies that compete with TikTok are 01:29:31.360 |
secretly banding together to basically gin up this bill? 01:29:35.320 |
I don't think they're secretly banding together. I 01:29:37.840 |
think that they're overtly organizing. 01:29:42.960 |
But again, that typically still always gets cut on partisan 01:29:46.840 |
lines. And what I'm saying is this is totally different than 01:29:55.960 |
I think I just want to make a point here. I think the 01:29:58.400 |
intellectual discourse on this program is second to none. I 01:30:01.880 |
mean, we are getting into the most refined details of this 01:30:04.960 |
issue. And you just don't hear this anywhere else. So I just 01:30:07.400 |
want to give myself and the rest of the crew a pat on the back. 01:30:10.320 |
You just want to ask a bit. I just want to Yeah, no, I mean, 01:30:14.040 |
we really did get pretty in the weeds here. 01:30:16.480 |
J Cal, do you agree that the Florida bill is ridiculous? And 01:30:20.920 |
I didn't read it. But I would think that's exactly there's 01:30:26.920 |
nothing special about it. I can't imagine why somebody would 01:30:30.240 |
want to ban mock meat other than crony capitalism, of course. And 01:30:34.960 |
my question to you is when is this going to taste good? When 01:30:38.240 |
Well, by the way, this bill makes no sense. Because the 01:30:40.440 |
only reason they would do it is if they were afraid that the lab 01:30:43.600 |
grown meat was just so much cheaper. And frankly, so much 01:30:48.160 |
tastier than what they make. And Freeberg, you've said 01:30:51.000 |
this many times, like, we are so like orders of magnitude away 01:30:57.760 |
but by the way, why try and stop it from competing? Let the VCs 01:31:01.240 |
that want to throw money at it, throw money at it, let the 01:31:03.200 |
scientists pursue it. Let the thing that's made 01:31:06.640 |
America so successful be successful. By the way, 01:31:10.560 |
Elon got Starship up today. You guys saw it, this thing went 01:31:13.800 |
into orbit. It was incredible. Imagine if Boeing and a bunch of 01:31:18.360 |
defense contractors got together 10 years ago and said we got to 01:31:21.080 |
stop SpaceX. These guys are trying to do stuff in an unsafe 01:31:24.200 |
way. And they tried, and they tried, and they will 01:31:27.040 |
keep trying. And the freedom that we afford people and 01:31:30.680 |
businesses in the United States of America is what allows us to 01:31:33.720 |
have our unique edge and our unique ability to create 01:31:37.000 |
progress that you don't see anywhere else on planet Earth. 01:31:39.760 |
And this is the sort of thing that takes us backwards. I just heard 01:31:42.480 |
the key word: we, us, America. America, back to the TikTok 01:31:46.600 |
thing, that's not an us thing, that's a Chinese thing. I get 01:31:49.760 |
the point. I'm not, like, I'm not pro-CCP. But yeah, I get the 01:31:53.360 |
point. Anything American? God bless us. Let's go American 01:31:57.080 |
exceptionalism. No, I saw the headlines. I think CNBC reported 01:32:02.920 |
I mean, they didn't say that, did they? I saw one headline 01:32:08.480 |
where they had to put a little nag in the end of the sentence 01:32:10.720 |
or a Starlink video for like an hour. The video 01:32:15.240 |
was so crystal clear. It was like HD of this rocket coming 01:32:19.360 |
back down to Earth. It was bonkers. So congrats to Elon. 01:32:21.880 |
Can I show you guys the most incredible tweet related to 01:32:28.560 |
Friendly reminder that Google's annual catering budget is $72 01:32:32.560 |
million, about twice the cost of Starship. Today, today, Elon 01:32:39.160 |
made the entire human race multi-planetary. Yeah. And it cost 01:32:44.680 |
half as much as what Google spends on food. That's 01:32:49.560 |
incredible. It does seem like a lot of money to spend on food. I 01:32:53.720 |
don't know. It does seem like these rockets take two or three. 01:32:56.160 |
Each time there's a new platform, it seems to take 01:32:59.440 |
Elon exactly, like, two or three to nail it. And here we are 01:33:02.360 |
again, you know, you know, congrats to our boy. Great job. 01:33:06.080 |
This is going back full circle, the persistence and the 01:33:08.880 |
resilience that was needed to get this thing to happen. It's 01:33:12.000 |
just like 20 years of step-by-step iteration. Nothing about 01:33:16.680 |
the initial kind of instantiation or concept of, you 01:33:21.040 |
know, the path SpaceX needed to walk to get here was right; everything 01:33:24.440 |
had to change along the way Starlink came along, etc, etc. 01:33:27.920 |
And here he is at the end of this period of time, with this 01:33:31.120 |
incredible craft, the largest object to ever fly in outer 01:33:34.440 |
space, it really is like an incredible moment in human 01:33:37.480 |
history. And it took a degree of persistence and resilience that, 01:33:40.280 |
I think, marks what makes American entrepreneurship so 01:33:43.520 |
powerful. It really is an incredible day. It was an 01:33:45.920 |
amazing thing to watch. I don't know if any of you guys have 01:33:48.240 |
ever gone and watched any of the old Saturn launches on 01:33:50.880 |
YouTube. But I think this is a new era. It's really incredible. 01:33:54.160 |
I think this was like five or six years ago. I got cleared to 01:33:58.360 |
basically fly in and hug Vandenberg when they were doing a launch. 01:34:05.160 |
And so I was able to see it as the thing kind of 01:34:08.800 |
tried to approach Max Q. It's one of the most incredible things 01:34:12.920 |
I've ever done. So I think we made some progress. If you look 01:34:15.960 |
at the Google News links, a lot of praise for this launch. 01:34:21.000 |
Yeah, looks like people are kind of getting it. Yeah, just 01:34:26.080 |
incredible. I mean, this thing is so huge. I don't know if 01:34:29.240 |
you've ever been inside of one of these things, but the 01:34:33.240 |
Starships. Yeah, I've been inside. I was inside the first 01:34:36.160 |
one they built. It is so large. The capacity of this thing is 01:34:39.960 |
like, I think you could fit like three, four, or 500 human beings. 01:34:44.280 |
Maybe when we do our Austin poker night, we can stop over 01:34:47.600 |
and see it. It would be really bonkers. And when you see it stacked, the 01:34:51.460 |
height is like, it almost feels like you're looking at CGI, not 01:34:57.380 |
something real in the real world. It is incredibly tall, like incredibly 01:35:01.600 |
tall. All right, listen, this is your favorite podcast in the 01:35:04.180 |
world. Episode 170 of the all in podcast. We never asked for 01:35:08.180 |
this. But if you have a chance, we do this show on video. So if 01:35:11.260 |
you're listening to audio, just type in all in podcast in 01:35:14.140 |
YouTube. And you can see the four of us. I don't know if 01:35:17.300 |
that's a bonus feature or not. But we do all the graphs and the 01:35:20.660 |
charts and everything here. Alright, for David Sacks, the 01:35:24.900 |
Rain Man, David Freeberg, your Sultan of Science, Chamath 01:35:30.340 |
Palihapitiya, the Chairman Dictator, I am the world's greatest 01:35:36.180 |
moderator, as David Sacks would attest, with all those incredibly 01:35:39.340 |
sharp questions. Why do you always go Christopher Walken whenever you do these? 01:35:42.940 |
Thanks for listening to the all in podcast. Wow. David Sacks, 01:35:50.300 |
poignant points about regulatory capture. Freeberg loves mock 01:35:57.060 |
meats. Not for me. Everyone loves a great dictator. See you next time. 01:36:20.260 |
Nasty, nasty, J Cal. But I'm coming on. I'm coming on. Sax, 01:36:30.580 |
I got to work on it. I turned the Trump off for three years. Give 01:36:33.380 |
me a break. I'm gonna do a job. Can you do a job? 01:36:38.020 |
I feel like you do a good job. I do. I do. I'll bring you a Joe 01:36:43.500 |
Pesci next. I take requests. I'll do a job. Sure. Why not? 01:36:46.140 |
He only does the Pesci involuntarily when he gets 01:36:48.900 |
in the chopper. Everybody got a short time. I'm running for 01:36:52.860 |
president. You got to change the laws. 01:36:55.300 |
You have to change it. So a German can be president. Austrian! 01:37:02.340 |
Austrian, Austrian. Yeah. All right. Oh, yeah, Rain Man David 01:37:17.380 |
Sax. And it said we open sourced it to the fans and they've just 01:37:38.340 |
We should all just get a room and just have one big huge orgy 01:37:46.740 |
because it's like this sexual tension that they just