E99: Cheating scandals, Twitter updates, rapid AI advancements, Biden's pardon, Section 230 & more
Chapters
0:00 Bestie intros!
1:34 Breaking down major cheating scandals: chess, poker, fishing
16:13 Twitter deal updates
30:05 AI making rapid advancements: Tesla AI Day, Meta's text-to-video tool, where this all leads
49:45 Biden pardons all prior federal offenses of marijuana possession
59:24 SCOTUS will hear cases regarding Section 230, common carrier, algorithmic recommendations
00:00:00.000 |
We're seven minutes in and we've produced absolutely nothing that will go in the show. 00:00:03.520 |
Here comes Sax waking up with his commentary. 00:00:08.000 |
When Friedberg is criticizing you for being too negative, you're in a dark place, Sax. 00:00:12.320 |
I'm actually angry at Sax for not publishing my AMA from the other night. 00:00:20.320 |
We had over 2000 people in the room for like four hours. 00:00:26.560 |
Everyone I know that was trying to get in was texting saying they couldn't get in. 00:00:36.160 |
Weren't you the same guy who was responsible for scaling PayPal? 00:00:43.760 |
We had huge scalability challenges at PayPal too. 00:00:48.240 |
Yeah, the theme is when you have an app that's breaking out, you hit scalability challenges. 00:00:59.280 |
2000 people participating in the conversation is a challenge. 00:01:05.280 |
When you get to 1000 people coming to the room, everybody else is in passive mode. 00:01:34.640 |
So there have been three cheating scandals across poker, chess, and even competitive fishing. 00:01:40.320 |
I don't know if you guys saw the fishing one, but they found weights and fillets during a fish 00:01:45.200 |
weigh-in, and then everybody wants us to check in on the chess and the poker scandals. 00:01:49.360 |
Chess.com just released their report that this grandmaster has been suspended. 00:01:55.840 |
They have evidence he cheated, basically, in a bunch of online games. 00:02:04.960 |
He denied that he had done that, but he had previously cheated as a kid. 00:02:07.840 |
They now have the statistical proof that he was playing essentially perfect chess. 00:02:13.520 |
And they've outlined this in like hundreds of pages in a report. 00:02:18.000 |
Sax, what are your thoughts on this scandal in chess? 00:02:20.960 |
Magnus Carlsen finally came out and explained why he thought Hans Niemann was cheating. 00:02:26.000 |
Basically, he got the strong perception during the game that Hans wasn't really 00:02:30.080 |
putting in a lot of effort, that he wasn't under a lot of stress. 00:02:32.880 |
It's his experience that when he's playing the top players, they're intensely concentrating. 00:02:39.520 |
And Hans Niemann didn't seem to be exerting himself at all. 00:02:42.320 |
So his hackles were raised and he got suspicious. 00:02:45.760 |
And then he has had this meteoric rise, the fastest rise in classical chess rating ever. 00:02:51.920 |
And I guess he had gotten suspended from chess.com in the past for cheating. 00:02:57.600 |
So on this basis, and maybe other things that Magnus isn't telling us, 00:03:00.560 |
Magnus basically said that this guy is cheating. 00:03:02.880 |
I think that maybe the interesting part of this is that there's been a lot of analysis now of Hans Niemann's games. 00:03:07.680 |
And I just think the methodology is kind of interesting. 00:03:10.480 |
So what they do is they run all of his games through a computer, and they compare his moves 00:03:19.120 |
to the engine's choices. And they basically assign a percentage, a correlation, for how often Hans matches the computer. 00:03:25.520 |
And what they found is there were a handful of games where it was literally 100%. 00:03:28.640 |
That's basically impossible without cheating. 00:03:31.280 |
I mean, you look at the top players who, through an entire career, have never had a 100% game. 00:03:37.280 |
Chess is so subtle that the computer can now see so many moves into the future, 00:03:41.760 |
that nailing the best move every single time for 40, 50, 100 moves is just not humanly possible. 00:03:45.600 |
And the thing in chess that a human really can't do well is that there are positional sacrifices 00:03:52.160 |
that you will make in short lines that pay off much, much later in the future, 00:03:56.000 |
which is impossible for a human to calculate. 00:03:58.000 |
And you saw this, by the way, with what I think was the Google AI, AlphaZero, and its positional sacrifices. 00:04:05.920 |
So the idea that this guy could play absolutely perfectly according to those lines is just not believable. 00:04:14.720 |
So there were a handful of games at 100%, and then there were tournaments where his percentages were in the 70s, 80s, and 90s. 00:04:21.680 |
And so just to give you some basic comparison, Bobby Fischer, during his legendary 20-game winning streak: 00:04:28.720 |
he only matched the computer for best move 72% of the time. 00:04:40.480 |
And then the, you know, the super GMs category are typically in the 64% to 68% range. 00:04:47.920 |
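For anyone curious what this engine-correlation methodology looks like mechanically, here is a minimal sketch in Python. It assumes the open-source python-chess library and a local Stockfish binary on PATH; chess.com has not published its exact tooling, so the fixed depth and exact-match scoring here are illustrative simplifications.

```python
import chess
import chess.engine
import chess.pgn

def engine_match_rate(pgn_path: str, player: str, depth: int = 20) -> float:
    """Fraction of a player's moves that exactly match the engine's top choice."""
    engine = chess.engine.SimpleEngine.popen_uci("stockfish")  # assumes stockfish on PATH
    matches, total = 0, 0
    with open(pgn_path) as f:
        while (game := chess.pgn.read_game(f)) is not None:
            board = game.board()
            for move in game.mainline_moves():
                # Score only the suspect player's moves.
                mover = game.headers["White"] if board.turn else game.headers["Black"]
                if mover == player:
                    best = engine.play(board, chess.engine.Limit(depth=depth)).move
                    matches += best == move
                    total += 1
                board.push(move)
    engine.quit()
    return matches / total if total else 0.0
```

Real analyses are more careful than this sketch: they compare against the engine's top few lines, discard forced and book moves, and weight critical positions. But the headline numbers quoted here, 100% games versus a 64% to 72% range for the all-time greats, are exactly this kind of match rate.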
So I think it's really interesting, actually, how you can now quantify this by comparing the moves to the computer's. 00:04:57.520 |
It provides a way to assess who the greatest player ever is. 00:05:00.880 |
I actually thought that it was Magnus, but now maybe there's a basis for believing it 00:05:04.800 |
was Bobby Fischer, because he was at 72% and Magnus was only at 70%. 00:05:08.320 |
However, look, the idea that Hans Niemann is in the 70s, 80s, or 90s during tournaments 00:05:13.920 |
would be, you know, just an off-the-charts level of play. 00:05:17.280 |
And if he's not cheating, then we should expect over the next couple of years that 00:05:24.000 |
he should rapidly become the world's number one player over the board, you know, now that 00:05:29.280 |
they have all this anti-cheating stuff, right? 00:05:30.960 |
So it'll be interesting to see what happens in his career now that they've really cracked 00:05:35.280 |
down, you know, with anti-cheating technology. 00:05:39.600 |
I have a general observation, which is these people are complete fucking losers. 00:05:44.480 |
The people that cheat in any of these games don't understand this basic, simple idea, 00:05:51.040 |
which is that trying is a huge part of the human experience. 00:05:54.480 |
The whole point is to be out there in the field of play, trying. 00:05:58.720 |
And it's basically taking the wins and the losses and getting better that is the path. 00:06:04.240 |
Once you actually win, it's actually not that much fun, because then you have this pressure to stay on top. 00:06:10.880 |
That's a lot less enjoyable than the path to getting there. 00:06:14.560 |
And so the fact that these people don't understand that makes them slightly broken, in my opinion. 00:06:20.240 |
And then the other thing is, like, why is it that we have this strain of people now that 00:06:25.280 |
are just so devoid of any personal responsibility that they'll just so brazenly take advantage of the system? 00:06:36.960 |
This is really interesting, how they caught him, running his games against the computer. 00:06:41.040 |
Here's a chart of his scores in these tournaments. 00:06:45.120 |
Oh, here is this first chart is how quickly he advanced, which was off the charts. 00:06:50.640 |
And then the second chart that's really interesting is his chess.com strengths. 00:06:55.040 |
If you don't know chess.com, it has become like a juggernaut in the chess world, especially during the pandemic. 00:07:04.160 |
And man, you look at the chess strength score there. 00:07:09.040 |
And then the number of games he likely cheated in, you can see the last two columns. 00:07:19.120 |
And he said he didn't cheat in any of the games where they were live streaming, but the report disputes that. 00:07:30.800 |
And I don't want to overly judge until they have hard proof that he was cheating. 00:07:38.080 |
It's just that the computer evidence, you know, seems pretty damning. 00:07:45.520 |
I don't know how they prove that he was cheating over the board without actually catching him in the act. 00:07:49.520 |
And I still don't think anyone really has a good theory in terms of how he could have done it over the board. 00:07:55.440 |
Look, look, the fishing thing, Jason, which was crazy. 00:08:00.000 |
This guy was in a fishing competition and they basically caught these fish and then 00:08:03.360 |
they put these big weighted pellets inside the fish's body. 00:08:07.120 |
They even put like a, you know, chicken breast and chicken fillets inside of the thing. 00:08:17.520 |
And in poker, in poker, everybody's afraid that there are ways in which you can read 00:08:21.680 |
the RFID in some of the cards in some of these, you know, televised situations and 00:08:26.320 |
front-run what the playing situation is, so that you know whether you're winning or losing. 00:08:30.640 |
And again, I just ask the question: are things really that bad? 00:08:38.480 |
The idea that we would play against somebody that would take that edge. 00:08:48.180 |
One observation might be that across all three, because I'm trying to find some common thread 00:08:53.840 |
across these, but it could be that there was a lot of cheating going on for a long time. 00:08:58.240 |
And maybe the fact that we do have so much digital imagery that's live on these things 00:09:06.400 |
now and so much coverage and everyone's got a cell phone that suddenly our perception 00:09:13.200 |
of the cheating in competitive events is becoming more tuned. 00:09:18.080 |
Whereas maybe there's been a lot of cheating for a long time and it's just kind of coming to light now. 00:09:23.120 |
I mean, we didn't have a lot of live streaming in poker. 00:09:27.200 |
I mean, we could probably ask Phil this, or Keating, but like, for how many years was there cheating that went undetected? 00:09:33.060 |
Remember, people were using these software programs that would track hand histories 00:09:46.260 |
to help you assess whether the person might be bluffing in that particular situation. 00:09:52.480 |
So I don't know if you guys, I don't know if you guys watch Twitch, like video games, 00:09:55.440 |
like Fortnite or whatever, but there are like players that have been accused of using the 00:10:00.640 |
screen overlay systems that basically more accurately show you the target and drive the mouse to it. 00:10:09.040 |
And so there's software overlays that make you a better player. 00:10:11.680 |
You know, in competitive video gaming. I'll tell you what the through line is. 00:10:19.200 |
So now what's interesting is now there's eye tracking software that people are using on 00:10:24.160 |
Twitch screens to see if the individual is actually spotting the target when they shoot 00:10:36.240 |
And I think what's interesting is just that there's so much, you know, insight now, so much visibility. 00:10:42.160 |
I mean, think about all those guys at that competition. 00:10:44.960 |
With their cell phones, and they all videoed this thing happening. 00:10:47.440 |
Yeah, I think 10 years ago, that wouldn't have been the case. 00:10:49.520 |
And there wouldn't have been a big story about it. 00:10:51.520 |
And Sax, you said there was a theme you wanted to get to? I think the theme is pretty 00:10:56.720 |
obvious, which is that there's been an absolute decay of personal responsibility. 00:11:02.160 |
People don't feel like there's any downside to cheating anymore. 00:11:07.120 |
And they're not willing to take it upon themselves to take a journey of wins and losses to get 00:11:11.440 |
better at something. They want the easy solution. 00:11:14.000 |
The easy solve, the quick answer, you know, that gets them to some sort of finish line 00:11:20.480 |
that they have imagined for themselves will solve all their problems. 00:11:24.240 |
The problem is, it doesn't solve any problems. 00:11:26.560 |
And it just makes them a wholly corrupt individual. 00:11:29.600 |
Yeah, so let's talk about this Hustler Casino live cash game play. 00:11:34.640 |
There's this woman, Robbie, who is a new player. 00:11:38.000 |
Apparently, she's being staked in a very high stakes game. 00:11:40.560 |
She's playing against a guy, Garrett, who is a very, very well-known winning cash game player. 00:11:46.080 |
And it was a very strange hand; on the turn, all the money gets in. 00:11:52.320 |
She says she has a bluff catcher, then she claims she thought she misread 00:11:55.840 |
her hand. Now people are saying the poker world seems to be 70/30 that she cheated. 00:12:04.800 |
There was a lot of weird word salad. She said that she had a bluff catcher, which made no sense. 00:12:12.000 |
Then she said she thought she had a pair of threes. 00:12:14.000 |
And then she immediately said afterwards that he was giving her too much credit. 00:12:17.760 |
They confronted her in the hallway; she gave the money back because she supposedly loves the game. 00:12:24.880 |
One side says, Okay, well, this is happening because she's a new player. 00:12:28.240 |
The other side is saying somebody was signaling her that she was good and giving her just that information. 00:12:34.880 |
Because if you are going to cheat, cheating with Jack high in a situation where you just 00:12:38.640 |
put all in for a two and a quarter million dollar pot seems very suspect. 00:12:42.480 |
I don't know if you guys watch the hand breakdown. 00:12:45.920 |
Where does everybody stand on a percentage basis, I guess, if they think she was cheating 00:12:50.640 |
or not? Because this is not definitive, obviously; it's not like they cut the fish open and found the weights. 00:12:55.200 |
It's not so obvious in that situation. 00:12:57.600 |
But I think the way that that line played made no sense. 00:13:04.480 |
And I guess in her previous hand, she had a jack three, and there was a three on the board. 00:13:10.080 |
So if she misread her hand on that 10-10-9-3 board... No, but you would have... 00:13:19.920 |
But I'm just trying to find a logical explanation. 00:13:22.320 |
And that jack three explanation, somebody kind of fed that to her, and then she changed her story. 00:13:28.240 |
So this changing of the story is the thing I was sort of keyed in on, Friedberg: why does her story keep changing? 00:13:34.720 |
Maybe she's had a couple of beverages or whatever. 00:13:37.280 |
She's just a new player and she's embarrassed by her play and can't explain it. 00:13:43.840 |
All of the things you're saying are probable. 00:13:45.520 |
Yeah, I don't think there's any data for us to have a strongly held point of view. 00:13:51.840 |
I'm just looking forward to us all playing live. 00:13:55.120 |
Yeah, HCL Poker Live, October 21, minus David Sacks, unfortunately. Chamath, J Cal... 00:14:03.840 |
We're going to be playing on the same stream. 00:14:05.840 |
We're going to be playing on the same screen, same table. 00:14:08.000 |
I figured out how to hack into the video stream for the cards. 00:14:15.520 |
I'm going to take your money and I'm going to buy my kids some nice clothes. 00:14:19.440 |
For my 40th birthday, Sky organized poker in Tahoe. 00:14:27.600 |
And they taped it as if it was being broadcast, with hole cards and commentators. 00:14:36.240 |
It was... it's one of the greatest things that anybody's ever given me. 00:14:40.320 |
There was a one hour block where somebody at the table said, okay, guys, how about we play a hand face up? 00:14:48.400 |
Yes, where you could look at each other's cards and, you know, you could sort of help each other. 00:14:55.120 |
In that one hour, our beautiful home game of friendship became Lord of the Flies. 00:15:00.480 |
I have never seen so much hatred, angling, mean behavior. 00:15:11.600 |
So I hope that we never see cheating in our game. 00:15:16.160 |
Yeah, we'll see how it goes on October 21 at HCL poker live. 00:15:23.280 |
Oh, and we're not having any official episode 100 stuff. 00:15:27.120 |
But the fans, some of the fans who were at the All-In Summit 2022, are doing their own 00:15:35.360 |
episode 100 meetups on October 15, I think, at all in meetups.io. 00:15:42.000 |
So there are fan meetups happening in Zurich and a bunch of other places. 00:15:45.120 |
I'm going to FaceTime into some of them and just say hi to the fans. 00:15:47.360 |
You know, it might be like 10 people in a bar somewhere. 00:15:49.600 |
I think the largest ones, like Miami or San Francisco, are going to be like 50 people or more. 00:15:54.480 |
We should all go; I think that would actually be kind of fun. 00:15:56.960 |
Basically, I told them to send me an invite and I'll FaceTime in any time. 00:16:07.440 |
It's a Saturday, the Saturday after the 100th episode, people are doing these all in meetups.io. 00:16:12.880 |
Earlier this week, it was reported that Elon contacted Twitter's board and suggested that 00:16:19.200 |
they move forward with closing the transaction at the original terms and the original purchase 00:16:23.280 |
price of $54.20 a share. In the couple of days since then, 00:16:27.840 |
And even as of right now, with some news reports coming out here on Thursday morning, it appears 00:16:33.520 |
that there are still some question marks around whether or not the deal is actually going to 00:16:37.760 |
move forward at $54.20 a share because Elon, as of right now, the reports say, is still 00:16:42.800 |
asking for a financing contingency in order to close. 00:16:46.000 |
And there's a lot of back and forth on what the terms are. 00:16:48.400 |
Meanwhile, the court case in Delaware is continuing forward on whether or not Elon breached his 00:16:54.560 |
terms of the original agreement to close and buy Twitter at $54.20. 00:16:58.400 |
As we know, leading up to the signed deal or post signing the deal, Elon put together 00:17:05.600 |
a financing syndicate, a combination of debt investors as well as equity co investors with 00:17:10.640 |
him to do the purchase of Twitter at $54.20 a share. 00:17:15.280 |
So the 40 some odd billion dollars of capital that's needed was committed by a set of investors 00:17:22.960 |
And there's a big question mark now on whether or not those investors want to or would still 00:17:27.760 |
consummate the transaction with Elon given how the markets have turned and given how 00:17:32.000 |
debt markets are trading and equity markets are trading. 00:17:34.320 |
So, Chamath, I'd love to hear your point of view on what hurdles Elon still has to clear. 00:17:41.760 |
And is there still a financing syndicate that's standing behind him at the original purchase price? 00:17:48.560 |
Maybe the best way to start is: Nick, do you want to queue up what I said on August 25? 00:17:54.000 |
The lawsuit really boils down to one very specific clause, which is the pinnacle 00:18:00.240 |
question at hand: there is a specific performance clause that Elon signed up to, 00:18:07.680 |
right, which, you know, his lawyers could have struck out and either chose not to or, 00:18:14.240 |
you know, couldn't get the deal done without. 00:18:16.480 |
And that specific performance clause says that Twitter can force him to close at $54.20 a share. 00:18:23.200 |
And I think the issue at hand at the Delaware Business Court is going to be that because 00:18:29.360 |
Twitter is going to point to all of these, you know, gotchas and disclaimers that they 00:18:34.240 |
have around this bot issue as their cover story. 00:18:39.200 |
And I think that really, you know, this kind of again, builds more and more momentum in 00:18:45.840 |
my mind that the most likely outcome here is a settlement where he has to pay the 00:18:53.680 |
economic difference between where the stock is now and $54.20, which is more than a billion 00:18:59.040 |
dollars, or he closes at some number below $54 and 20 cents a share. 00:19:05.360 |
And I think that that is, like, you know, if you had to be a betting person, that's probably the outcome. 00:19:10.160 |
And if you look at the way the stock has traded, and if you also look at the way the options 00:19:15.280 |
market trades, that's what people are assuming: that there's a $7 to $10 billion swing. 00:19:21.200 |
And if you impute that into the stock price, you kind of get into the $51 a share kind of range. 00:19:27.280 |
Again, I'm not saying that that is right or should be right. 00:19:31.760 |
Yeah, so it turns out that that kind of guesstimate turned out 00:19:37.600 |
to be pretty accurate, because the stock today is at $51 a share. 00:19:40.800 |
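To make the imputation Chamath describes concrete, here is the textbook merger-arbitrage arithmetic. The $35 standalone value comes from the fair-value range he cites a bit later in this conversation; the formula is the generic one, not anything deal-specific.

```python
# Merger-arb back-of-the-envelope: the market price is roughly a
# probability-weighted blend of the deal price and the standalone value.
#   market = p * deal_price + (1 - p) * standalone
deal_price = 54.20   # agreed price per share
standalone = 35.00   # rough unaffected value, per the discussion
market = 51.00       # where the stock was trading

# Solve for p, the implied probability the deal closes at terms.
p_close = (market - standalone) / (deal_price - standalone)
print(f"Implied probability of closing at $54.20: {p_close:.0%}")  # ~83%
```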
So I think that the specific performance thing is exactly what this thing has always hinged on. 00:19:46.720 |
And I think that there was a realization that there were very few outs around how that clause was written. 00:19:56.880 |
And that out, I think, is April. 00:19:59.440 |
If the deal doesn't get done by April, then the banks can walk away from their commitment. 00:20:06.880 |
And if the banks walk away, then Elon does have a financing contingency that allows him to walk, too. 00:20:13.440 |
So the actual set of events that have to happen is those two things: specifically, get to April, 00:20:19.040 |
so the banks can pass and say, we've changed our mind, market conditions are different. 00:20:23.520 |
And then Elon is able to say, oh, you know, the banks just walked away. 00:20:27.200 |
Right now, the banks, if you look at all of the debt that they've committed to, they 00:20:32.320 |
committed at a point in time when the debt markets were much better than they are today. 00:20:37.120 |
In the last, you know, six or seven months since they agreed to do this, the debt markets 00:20:42.640 |
have been clobbered, and specifically junk bonds; on a bunch of junk bond debt, the yields have spiked. 00:20:48.640 |
So the price to get that kind of debt has skyrocketed. 00:20:51.440 |
So roughly back of the envelope math would tell me that right now the banks are offside 00:20:57.120 |
between one and two billion dollars, because they're not going to be able to sell this debt at par. 00:21:02.240 |
So I think the banks obviously want a way out. 00:21:04.000 |
The problem is their only way out is to run the shot clock off until April. 00:21:08.560 |
So I think that's the dance that they're in right now. 00:21:12.160 |
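One hedged way to sanity-check the "offside between one and two billion" estimate is a standard duration approximation on the committed debt. The notional is the widely reported ballpark for the deal's debt package; the duration and yield move below are pure assumptions for illustration, not disclosed figures.

```python
# Mark-to-market sketch using the bond duration approximation:
#   price change ≈ -duration * yield change
notional = 13e9      # ~$13B committed debt (reported ballpark)
duration = 4.0       # assumed effective duration, in years
yield_move = 0.03    # assumed ~300bp widening in junk-bond yields

paper_loss = notional * duration * yield_move
print(f"Approximate mark-to-market loss: ${paper_loss / 1e9:.1f}B")  # ~$1.6B
```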
Elon's trying to find a way to solve, you know, for the merger. 00:21:15.280 |
I think Twitter is going to say, we're not going to give you a financing contingency. 00:21:19.280 |
You have to bring the banks in and close right now and then we will not go to court. 00:21:25.280 |
And so I think it's a very delicate predicament that they're all in. 00:21:29.360 |
But my estimate is that the equity is probably 20 percent offside. 00:21:35.760 |
He can make that up because he can create equity value like nobody's business. 00:21:39.600 |
The debt is way offside by a couple billion dollars, which is hard to make back. 00:21:44.160 |
But I think in the end, you know, given enough time, they can probably make that back. 00:21:48.240 |
The best off in all of this are the Twitter shareholders. 00:21:51.200 |
They're getting an enormous premium to what that company is worth today in the open market. 00:21:59.040 |
It's probably going to close in the next few weeks. 00:22:00.560 |
And had you bought Twitter when we were talking about it in August, you would have made 25 00:22:06.400 |
percent in six weeks and you know, if the deal closes at 54, you would have made a third of 00:22:10.480 |
your money in eight weeks, which is very hard to do in a market. 00:22:14.320 |
Say you're a GP at one of the funds, like Andreessen or Sequoia, 00:22:18.480 |
and you had made this commitment to Elon, or even Larry Ellison, a couple of months ago. 00:22:31.120 |
I mean, what do you do, given that the premium is so much higher than where the market would price it? 00:22:35.520 |
Some people are saying the stock should be at like 20 bucks a share or something. 00:22:38.000 |
The average premium in an M&A transaction in the public markets is about 30 percent. 00:22:42.320 |
So and I think the fair value of Twitter is around 32 to 35 bucks a share. 00:22:49.200 |
So, you know, it's not like he is massively, massively overpaying. 00:22:53.600 |
And so, you know, I would just sort of keep that in the realm of the possible. 00:22:58.880 |
So like if you take thirty-five dollars as the midpoint, a 30 percent premium would put a normal deal at about forty-five dollars a share. 00:23:04.880 |
So, yeah, he paid 20 percent more than he should have, but he didn't pay 100 percent more. 00:23:08.640 |
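The "paid 20 percent more" line follows directly from the two numbers just given; a quick worked version, using the $35 midpoint and the roughly 30 percent average premium as inputs:

```python
fair_value = 35.00        # assumed standalone fair value per share
typical_premium = 0.30    # average premium in public M&A
deal_price = 54.20

normal_deal_price = fair_value * (1 + typical_premium)  # ≈ $45.50
overpay = deal_price / normal_deal_price - 1            # ≈ 19%
print(f"Normal deal price ≈ ${normal_deal_price:.2f}, overpay ≈ {overpay:.0%}")
```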
So it's not as if you can't make that equity back as a private company, particularly because 00:23:14.240 |
there's probably ten dollars of fat in the stock if you think about just 00:23:18.240 |
OPEX in terms of all the buildings they have. 00:23:24.960 |
You know, one thing is, when I looked at doing an activist play at Twitter, I think I mentioned this before. 00:23:30.720 |
One of the things that I found was at that time Twitter was running their own data centers. 00:23:35.360 |
And, you know, the most obvious thing for me at that time was like, we're going to move this to the cloud. 00:23:39.360 |
Now, I don't know if that happened, but I'm sure that if it hasn't, just bidding that 00:23:44.320 |
out to Azure, GCP and AWS can raise, you know, three or four billion dollars because I'm 00:23:48.960 |
sure those companies would want this kind of an app on their cloud. 00:23:52.800 |
So there's all kinds of things that I think Elon can do as a private company to make back the premium. 00:24:00.080 |
And then he can get to the core job of rebuilding this company, this product, to be usable. 00:24:06.960 |
It has been decaying at a very, very rapid clip. 00:24:11.840 |
And I think that his trepidation in closing the merger, even though he hasn't 00:24:17.680 |
said it, in part has to do with the quality of the experience. 00:24:21.600 |
It's not as fun to use as it was during the pandemic or even before the pandemic. 00:24:27.760 |
So something is happening inside that app that needs to get fixed. 00:24:31.760 |
And if he does it, he'll make a ton of money. 00:24:33.520 |
Sort of like what happened with Friendster and MySpace and any social networking app 00:24:39.520 |
If it's not growing, it's shrinking. 00:24:43.520 |
And also, if product hygiene isn't enforced in code, and product hygiene in this case means 00:24:49.600 |
the, you know, spam bots, the trolling, it can really take away from the experience. 00:24:57.440 |
I mean, interestingly, like if you think back to the starting days, the original days 00:25:01.920 |
of Twitter, I don't know if you guys remember, you would send in an SMS to do your tweet, 00:25:06.160 |
and then it would post up and other people would get the SMS notification, and it would spread that way. 00:25:13.520 |
And the app was notoriously crashing. 00:25:18.880 |
And some people have argued that Twitter has had a cultural technical incompetence from the beginning. 00:25:27.200 |
So I do think look, Twitter was known for what's called a fail whale. 00:25:30.320 |
You know, they used to have these fail whales constantly. 00:25:32.640 |
And they did hire people that attempted to try to fix it. 00:25:37.200 |
I remember the funniest part of when I went in there and said, hey, here's my plan, 00:25:41.360 |
here's what I want to do: literally a day or two later, the head of engineering was gone. 00:25:46.160 |
I can't remember what his name was, but he was just out the door. 00:25:50.720 |
But I think it is a team that has tried its best. 00:25:56.720 |
That probably at the edges definitely made some technical miscalculations. 00:26:02.000 |
Like I said, at that time, the idea that any app of that scale would use their own 00:26:06.640 |
data centers made no technical sense whatsoever. 00:26:12.240 |
It made it more prone to downtime, to your point. 00:26:14.160 |
But that being said, I would be shocked if they haven't made meaningful improvements, 00:26:18.960 |
because the stack of the internet has gotten so much better over the last seven years. 00:26:23.440 |
And so to your point, David, if they didn't take advantage of all these new abstractions 00:26:27.840 |
and mechanisms to rebuild the app, or to rebuild search, or to rebuild 00:26:32.800 |
how all these infrastructure elements of the app work, I would be really 00:26:37.520 |
surprised, because then what are they doing over there? 00:26:39.840 |
Yeah, well, look, I mean, to the point earlier, besides the product points, there was a really 00:26:44.960 |
good tweet I liked that said, for what it's worth, I think Elon will show us just how 00:26:52.480 |
lean the Silicon Valley advertising companies can be run. 00:26:55.440 |
At the very least, it'll be an interesting thought experiment for spectators. 00:26:59.120 |
Because if he does go in and actually does significantly reduce op ex and headcount, 00:27:03.840 |
and the company does turn profitable and he can grow it, well, it'll really, by the way, 00:27:09.120 |
it'll really be a beacon for other big companies. 00:27:12.160 |
Yeah, from a financial perspective, there is $10 a share in op ex cuts that he should 00:27:16.560 |
make right away just so that he is economically break even. 00:27:20.080 |
And he looks like every other M&A transaction: you know, you paid a 30% premium and you bought the company. 00:27:25.680 |
There's a lot of margin of safety there if Elon does that. 00:27:30.400 |
And there probably needs to be a meaningful RIF at Twitter. 00:27:33.520 |
I'm not saying it's, you know, and I feel for the people that may go through it. 00:27:36.480 |
But from a financial perspective, the math makes sense for him to do that, because then 00:27:41.600 |
he is a break-even proposition on a going-in M&A basis. 00:27:45.920 |
And I think that there's a lot of intelligent financial sense in that, so that all the 00:27:50.720 |
debt holders feel like he's doing the right thing. 00:27:53.040 |
And all the equity holders, particularly, see a chance for them to make a decent return 00:27:59.360 |
This is a great conversation between Chamath Palihapitiya and Dave Friedberg about the Twitter deal. 00:28:16.640 |
I have a nice cold brew here, a nice iced cold brew and a nice drip coffee. 00:28:22.400 |
I'd love to talk about topics I'm not being subpoenaed or deposed about. 00:28:26.480 |
We will have a lot to say in the coming weeks. 00:28:28.480 |
I love to talk about topics that my lawyers have advised me not to talk about. 00:28:42.000 |
Speaking of Elon, Tesla AI Day was last week. 00:28:50.800 |
Phil Hellmuth and I went, and I drove Phil Hellmuth home. 00:28:57.920 |
And it is essentially a giant recruiting event. 00:29:05.680 |
Can we just talk about Phil Hellmuth's non sequitur in the group chat about Ken Griffin? 00:29:11.200 |
Oh, yeah, where he's just like, I made a joke about his net worth and he responded-- 00:29:18.080 |
We were talking about the most serious of topics and he just comes-- 00:29:25.280 |
By the way, I was texting with Daniel Negreanu. 00:29:31.840 |
If you guys haven't listened to it, the Daniel Negreanu episode with Lex Fridman is great. 00:29:38.640 |
But I was joking with Daniel that there's a section where he's talking about the greatest 00:29:43.760 |
And if you look in the bar of YouTube, it shows where the most viewership was. 00:29:48.240 |
And it was exactly the 30 seconds he talks about Hellmuth. 00:29:51.920 |
And I said to Daniel, this must have been Phil re-watching it over and over. 00:30:08.480 |
Elon only spoke when he showed the Optimus, the new robot. 00:30:14.080 |
He's building a general purpose robot that will work in the factories. 00:30:21.280 |
And he said he thinks they could get it down to $20,000. 00:30:26.880 |
And obviously, the factories already have a ton of robots. 00:30:29.600 |
But this is more of a robot that will benefit from the general-- 00:30:35.520 |
or the computer vision and the AI, the narrow AI being pursued by the self-driving team. 00:30:41.840 |
This is like two and a half hours of really intense presentations. 00:30:44.480 |
The most interesting part for me was they're building their own supercomputer and their chips. 00:30:50.160 |
And the Dojo supercomputer was really impressive in how many scenarios they can get through. 00:30:58.400 |
So they're building every scenario of every self-driving situation. 00:31:01.280 |
I actually have the full self-driving beta on my car. 00:31:07.760 |
If you haven't used it yet, I feel like AI is moving at a pretty rapid clip 00:31:14.720 |
this past year. If you haven't also seen it, Meta announced a text-to-video generator. 00:31:23.040 |
DALL-E, you put in a couple of words, Friedberg, and you get a painting or whatever. 00:31:27.360 |
This is put in a couple of words, and you get a short video. 00:31:29.920 |
So they had one of a teddy bear painting a teddy bear. 00:31:35.520 |
So eventually you could essentially create a whole movie by just talking to a computer. 00:31:42.480 |
Where do you think we are, Friedberg, in terms of 00:31:44.880 |
the compounding nature of these narrow AI efforts? 00:31:49.520 |
Obviously, we saw poker, chess, Go, DALL-E, GPT-3, self-driving. 00:31:55.440 |
It feels like this is all compounding at a faster rate. 00:32:03.600 |
When people saw the first computer playing chess, they said the same thing. 00:32:06.720 |
I think any time that you see progress with a computer that 00:32:10.080 |
starts to mimic the predictive capabilities of a human, it's impressive. 00:32:16.960 |
But I will argue, and I'll say a few words on this. 00:32:20.000 |
I think this is part of a 60-year cycle that we've been going through. 00:32:25.280 |
Fundamentally, what humans and human brains do is we can sense our external environment, 00:32:32.560 |
then we generate knowledge from that sensing, and then our brains build a model that predicts 00:32:38.960 |
And then that predicted outcome is what drives our actions and our behavior. 00:32:42.640 |
We observe the sun rise every morning, then we observe that it sets. 00:32:45.920 |
And you see that enough times, and you build a predictive model from that data that's been 00:32:49.280 |
generated in your brain: I predict that the sun will rise again. 00:32:56.640 |
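Friedberg's sunrise example has a classical formalization, Laplace's rule of succession, which is a reasonable toy model of "observe it enough times and predict it" (the formalization is an editorial illustration, not something cited in the conversation):

```python
# Laplace's rule of succession: after observing n sunrises in n days,
# the estimated probability of a sunrise tomorrow is (n + 1) / (n + 2).
def p_sunrise_tomorrow(n_observed: int) -> float:
    return (n_observed + 1) / (n_observed + 2)

for n in (1, 10, 10_000):
    print(n, round(p_sunrise_tomorrow(n), 4))
# 1 -> 0.6667, 10 -> 0.9167, 10000 -> 0.9999: the more observations
# accumulate, the closer the prediction gets to certainty, which is
# the sensing -> data -> prediction loop described above.
```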
And I think that the computing approach is very similar. 00:32:58.800 |
It's all about sensing or generating data, then creating a predictive model, and then acting on it. 00:33:04.480 |
And initially, the first approach was just basic algorithms. 00:33:08.960 |
And these are deterministic models that are built. 00:33:11.520 |
It's a piece of code that says, here's an input, here's an output. 00:33:17.200 |
And a human designed that algorithmic model and said, this is what the predictive potential is. 00:33:24.640 |
Then there was this term called data science. 00:33:26.800 |
So as data generation began to proliferate, meaning there were far more sensors in the 00:33:31.200 |
world, it was really cheap to create digital data from the physical world, really cheap 00:33:35.840 |
to transmit it, really cheap to store it, really cheap to compute with it. 00:33:38.960 |
Data science became the hot term in Silicon Valley for a while. 00:33:42.080 |
And these models were not just a basic algorithm written by a human, but it became an algorithm 00:33:47.760 |
that was a similar deterministic model that had parameters, and the parameters were ultimately 00:33:54.320 |
resolved by the data that was being generated. 00:33:56.960 |
And so these models became much more complex and much more predictive, finer granularities, 00:34:02.000 |
finer range. Then we used this term, machine learning. 00:34:04.960 |
And in the data science era, it was still like, hey, there's a model. 00:34:09.040 |
And you would solve it statically: you would get a bunch of data, you would statically 00:34:13.360 |
solve for the parameters, and that would be your model, and it would run. 00:34:16.960 |
Machine learning then allowed those parameters to become dynamic. 00:34:22.160 |
But generally speaking, the parameters that drove the model became dynamic as more data 00:34:27.840 |
came into the system, and they were dynamically updated. 00:34:31.120 |
And then this era of AI came about, and that's the new catchword. 00:34:34.240 |
And what AI is realizing is that there's so much data that rather than just resolve the 00:34:39.200 |
parameters of the model, you can actually resolve a model itself, the algorithm can 00:34:43.520 |
be written by the data, the algorithm can be written by the software. 00:34:47.520 |
And so, with an AI example, like poker: playing against an adaptive model, you're playing 00:34:54.240 |
poker and the software begins to recognize behavior, and it builds a predictive model. 00:35:00.240 |
And then over time, it actually changes not just the parameters of the model, but the model itself. 00:35:05.920 |
And so, with AI, it eventually gets to a point where the algorithm is so much more 00:35:09.880 |
complex that a human would have never written it. 00:35:12.480 |
And suddenly, the AI has built its own intelligence, its own ability to be predictive in a way 00:35:16.720 |
that a human algorithmic programmer would have never done. 00:35:22.760 |
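A compact way to see the progression Friedberg describes, from a hand-written rule, to statically fitted parameters, to parameters that update as data streams in. This is a toy sketch in Python with NumPy; the stage labels are editorial, not his words.

```python
import numpy as np

# Stage 1: a hand-written deterministic rule. A human chose the form
# and the numbers.
def rule(x):
    return 2.0 * x + 1.0

# Stage 2: the "data science" era. A human still chooses the form
# (linear), but the parameters are solved statically from data.
X = np.random.rand(1000)
y = 2.0 * X + 1.0 + 0.1 * np.random.randn(1000)
w, b = np.polyfit(X, y, 1)  # least-squares slope and intercept

# Stage 3: machine learning. The same parameters, now updated
# dynamically as each new observation arrives (online SGD).
w, b, lr = 0.0, 0.0, 0.1
for x_i, y_i in zip(X, y):
    err = (w * x_i + b) - y_i
    w -= lr * err * x_i
    b -= lr * err

# Stage 4, in this telling, is when the structure of the model itself,
# not just w and b, is learned from data, e.g. a neural network whose
# complexity no human would have written by hand.
```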
So none of this is new science per se; there are new techniques, but they all rest on the same underlying statistics. 00:35:30.320 |
And then there's these techniques that allow us to build these new systems of model development. 00:35:35.680 |
And those statistics build those neural nets, they solve those parameters, and so on. 00:35:39.600 |
But fundamentally, there is a geometric increase in data and a geometric decline in the cost 00:35:46.520 |
to generate data from sensors, because the cost of sensors is coming down with Moore's 00:35:50.000 |
Law, transmit that data, because the cost of moving data has come down with broadband 00:35:54.280 |
communications; and the cost of storing data, because the cost of DRAM and solid-state hard drives has come down. 00:36:01.720 |
And now the ability to actually have enough data to do this AI-driven prediction, 00:36:08.000 |
which is part of a spectrum of things that have been going on for 60 years, 00:36:11.480 |
is really being realized in a bunch of areas that we 00:36:16.280 |
would have historically been really challenged and surprised to see. 00:36:19.320 |
And so my argument is, at this point, data played a big role. 00:36:23.600 |
Yeah, over the last decade, we've reached this tipping point in terms of data 00:36:27.280 |
generation, storage, and computation that's allowed these statistical models to be resolved. 00:36:32.920 |
And as a result, they are far more predictive. 00:36:36.280 |
And as a result, we see far more human-like behavior in the predictive systems, both physical and digital. 00:36:41.280 |
You know, a robot today is mechanically the same as one that existed 20 years ago. 00:36:46.400 |
But the way that it's run is using software that is driven by this dynamic model. 00:36:56.800 |
But the first one is a total non sequitur. 00:37:00.360 |
Do you know where the term data scientist came from? 00:37:04.520 |
As classically used in Silicon Valley, it came from Facebook, and it came from my team 00:37:12.320 |
In 2008, I was trying to build the growth team. 00:37:15.760 |
This is the team that became very famous for getting to a billion users and, you know, 00:37:20.160 |
building a lot of these algorithmic insights. 00:37:22.480 |
And I was trying to recruit a person from Google. 00:37:27.080 |
And he was like a PhD in some crazy thing like astrophysics or particle physics or something. 00:37:32.720 |
And we gave him an offer as a data analyst, because this is what I needed at the time. 00:37:37.160 |
This is what I thought I needed: an analyst, you know, to analyze data. 00:37:45.080 |
And I remember talking to my HR, you know, business process partner, and I asked her what was going on. 00:37:52.720 |
And she said, he fashions himself a scientist. 00:37:55.700 |
And I said, Well, then call him a data scientist. 00:37:57.880 |
So we wrote in the offer for the first time, data scientist. 00:38:01.220 |
And at the time, people internally were like, this is a dumb title. 00:38:09.440 |
And that title just took off internally. 00:38:11.520 |
It's funny, because in parallel, we started Climate Corp in 2006. 00:38:15.200 |
And the first guy I hired was a buddy of mine who was a 4.0 in applied math. 00:38:20.160 |
And then everyone we hired on with him, we called them the math team. 00:38:24.040 |
And they were all applied math and statistics PhDs. 00:38:28.360 |
And it was really cool to be part of the math team. 00:38:30.320 |
But then we switched the team name to data scientist. 00:38:33.160 |
And then it obviously created this much more kind of impressive role, impressive title, 00:38:41.000 |
That was more than just a math person or data analyst, as I think it may have been classically 00:38:45.400 |
treated because they really were building the algorithms that drove the models that 00:38:50.880 |
Peter Thiel has a very funny observation, not funny, but you know, observation, which 00:38:54.880 |
is, you should always be wary of any science that actually has science in the name, political 00:39:00.160 |
science, social science, I guess maybe data science, you know, because the real sciences 00:39:04.920 |
don't need to qualify themselves physics, chemistry, biology. 00:39:08.000 |
Anyways. So here's what I wanted to talk about with respect to AI. 00:39:12.640 |
Two very important observations that I think are useful for people to know. 00:39:16.760 |
The first one, Nick, if you throw it up here is just a baselining of, you know, when we 00:39:20.920 |
have thought about intelligence and compute capability, we've always talked about Moore's 00:39:24.800 |
law and Moore's law, essentially this idea that there is a fixed amount of time where 00:39:30.240 |
the density of transistors inside of a chip would double. 00:39:34.140 |
And roughly that period for many, many years was around two years. 00:39:40.080 |
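Moore's law as stated here is just a fixed doubling period, which makes the compounding easy to quantify; a one-liner using the two-year period from the conversation:

```python
# Transistor density under Moore's law: doubling every `period` years.
def density(initial: float, years: float, period: float = 2.0) -> float:
    return initial * 2 ** (years / period)

# Example: 20 years at a 2-year doubling period is 10 doublings,
# a ~1024x increase in density.
print(density(1.0, 20))  # 1024.0
```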
And we used to equate this to intelligence, meaning the more density there was in a chip, 00:39:45.440 |
the more things could be learned and understood. 00:39:48.840 |
And we used to think about that as the progression of how computing intelligence would grow and 00:39:54.800 |
eventually AI and artificial intelligence would get to mass market. 00:39:59.040 |
Well, what we are now at is a place where many people have said Moore's law has broken. 00:40:06.600 |
It's because we cannot cram any more transistors into a fixed amount of area. 00:40:14.700 |
And so people think, well, does that mean that our ability to compute will essentially stop improving? 00:40:22.160 |
And that's what's demonstrated on this next chart, just to make it simple, which is that 00:40:26.840 |
what you really see is that if you think about, you know, supercomputing power, the ability 00:40:33.060 |
to get to an answer, that has actually continued unabated. 00:40:38.080 |
And if you look at this chart, the reason why this is possible is entirely because we've 00:40:43.280 |
shifted from CPUs to these things called GPUs. 00:40:46.920 |
So you may have heard of companies like Nvidia. Why have companies like Nvidia done so well? 00:40:51.560 |
It's because they raised their hand and said, we can take on the work. 00:40:56.100 |
And by taking on the work away from a traditional CPU, you're able to do a lot of what Friedberg 00:41:01.460 |
said is get into these very complicated models. 00:41:04.380 |
So this is just an observation that I think that we are continuing to compound knowledge 00:41:11.520 |
and intelligence effectively at the same rate as Moore's law. 00:41:16.520 |
And we will continue to be able to do that, because this makes it a problem of power and money. 00:41:24.880 |
So as long as you can buy enough GPUs from Nvidia or build your own, and as long as you 00:41:30.280 |
can get access to enough power to run those computers, there really aren't many problems you can't solve. 00:41:37.620 |
And that's what's so fascinating and interesting. 00:41:39.760 |
And this is what companies like OpenAI are really proving, you know, when they raised 00:41:44.160 |
a billion dollars, what they did was they attacked this problem, because they realized 00:41:48.760 |
that by shifting the problem to GPUs, it left all these amazing opportunities for them to 00:41:55.040 |
uncover, and that's effectively what they have done. 00:41:57.600 |
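Chamath's point that compute is now "a problem of power and money" can be restated as simple arithmetic: aggregate throughput scales with how many accelerators you can buy and power. A toy estimate, where every figure is an assumption for illustration rather than a vendor spec:

```python
# Toy compute-scaling estimate: time to spend a fixed training compute
# budget as a function of how many GPUs you can afford to run.
compute_budget_flops = 1e23   # assumed total training compute
gpus = 10_000                 # accelerators you can buy and power
flops_per_gpu = 1e14          # assumed sustained FLOPS per device
utilization = 0.4             # assumed realistic cluster efficiency

seconds = compute_budget_flops / (gpus * flops_per_gpu * utilization)
print(f"Training time ≈ {seconds / 86400:.1f} days")  # ≈ 2.9 days
```

Double the GPU count (and the power bill) and the time halves, which is why the bottleneck shifts from transistor density to capital and electricity.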
The second thing that I'll say very quickly is that it's been really hard for us as a 00:42:03.200 |
society to build intelligence in a multimodal way like our brain works. 00:42:13.480 |
We can process imagery, we can process words and sounds, we can process all of these different 00:42:20.000 |
modes, text, into one system, and then intuit some intelligence from it and make a decision, 00:42:29.360 |
So, you know, we could be watching this YouTube video, there's going to be transcription, 00:42:32.640 |
there's video, voice, audio, everything all at once. 00:42:36.240 |
And we are moving to a place very quickly where computers will have that same ability 00:42:41.800 |
Today, we go to very specific models and kind of balkanized silos to solve different kinds 00:42:48.220 |
But those are now quickly merging, again, because of what I just said about GPUs. 00:42:53.040 |
So I think what's really important about AI for everybody to understand is the marginal cost of intelligence trending toward zero. 00:43:02.120 |
And this is where I'm just going to put out another prediction of my own. 00:43:06.280 |
When that happens, it's going to be incredibly important for humans to differentiate themselves 00:43:13.880 |
And I think the best way for humans to differentiate ourselves is to be more human. 00:43:21.720 |
It's to be more empathetic, it's to be more emotional, not less emotional, because those 00:43:26.800 |
differentiators are very difficult for brute force compute to solve. 00:43:31.000 |
Be careful, the replicants on this call are getting a little nervous here. 00:43:38.840 |
Well, to your point, during this AI day, they were showing in self driving, as you're talking 00:43:45.320 |
about this balkanization, and trying to make decisions across many different decision trees, 00:43:51.880 |
you know, they're looking at lane changes, they're looking at other cars and pedestrians, 00:43:56.200 |
they're looking at road conditions like fog and rain. 00:43:59.640 |
And then they're using all this big data to your point, Friedberg to run tons of different 00:44:05.840 |
So they're building like this virtual world on Market Street, and then they will throw 00:44:11.440 |
people, dogs, cars, people who have weird behaviors into the simulation. 00:44:23.360 |
So clearly, there's some auditory expression of risk, right? 00:44:28.640 |
And now you have to scan your visual field, you have to probabilistically decide what 00:44:33.400 |
it could be, what the evasive maneuver, if anything, should be. 00:44:37.720 |
So that's a multimodal set of intelligence that today isn't really available. 00:44:43.800 |
But we have to get there if we're going to have real full self driving. 00:44:46.440 |
So that's a perfect example, Jason, a real world example of how hard the problem is, 00:44:50.640 |
but it'll get solved because we can brute force it now with, with chips and with compute. 00:44:54.760 |
I think that's going to be the very interesting thing with the robots as well as all of, you 00:44:58.600 |
know, these decisions they're making, moving cars through roads, all of a sudden, we're 00:45:04.160 |
going to see that with VTOLs, vertical takeoff and landing aircraft, and we're going to see it everywhere. 00:45:11.800 |
And everybody wanted to ask Elon a lot about general AI, you know, the Terminator kind of scenarios. 00:45:16.560 |
And his position is, I think, if we solve enough of these problems, Friedberg, it'll 00:45:20.560 |
be an emergent behavior or an emergent phenomenon, I guess would be a better word, based on each 00:45:27.520 |
of these silos crumbling, you know, each of these tasks getting solved by groups of engineers. 00:45:32.440 |
You have any thoughts as we wrap up here on the discussion about general AI and the timeline 00:45:36.920 |
for that, because obviously, we're going to solve every vertical AI problem in short order. 00:45:41.320 |
I spoke about this a little bit on the AMA on Callin on Tuesday night, once Sax gets it published. 00:45:50.360 |
But I really have this strong belief that... servers crash, there's no AI team at Callin. 00:45:58.520 |
Yeah, you guys can try to download the app, but it might crash. 00:46:04.520 |
The problem is, Friedberg, that you are 10 times more popular than J Cal. 00:46:10.240 |
Well, you did have an account with 11,000 followers. 00:46:12.920 |
I mean, you're right, Jake, I will put you on that account next time. 00:46:22.000 |
So my core thesis is, I think humans transitioned from being, let's call it, you know, passive participants to being laborers. 00:46:33.040 |
And then we transition from being laborers to being creators. 00:46:36.040 |
And I think our next transition with AI is to transition from being creators to being 00:46:41.160 |
And what I mean by that is, as as we started to do work on Earth and engineer the world 00:46:45.960 |
around us, we did labor to do that: we literally plowed the fields, we walked distances, we carried things. 00:46:53.280 |
And over time, we built machines that automated a lot of that labor. 00:46:58.120 |
You know, everything from a plow, to a tractor, to Caterpillar equipment, to a microwave 00:47:03.280 |
that cooks for us; labor became less central, and we became less dependent on our labor abilities. 00:47:08.840 |
And then we got to switch our time and spend our time as creators, as knowledge workers. 00:47:13.120 |
And a vast majority of the developed world now primarily spends their time as knowledge 00:47:19.440 |
And we create stuff on computers, we do stuff on computers, but we're not doing physical labor. 00:47:24.160 |
As a lot of the knowledge work gets supplanted by what's being termed AI now, but is really predictive software, 00:47:32.000 |
The role of the human I think transitions to being one of a narrator, where instead 00:47:36.000 |
of having to create the blueprint for a house, you narrate the house you want, and the software creates the blueprint. 00:47:44.120 |
And instead of creating the movie, instead of spending $100 million producing 00:47:48.080 |
a movie, you dictate or you narrate the movie you want to see, and you iterate with the 00:47:52.720 |
computer and the computer renders the entire film for you. 00:47:55.760 |
Because those films are shown digitally anyway, so you can have a computer render it. 00:47:59.340 |
Instead of creating a new piece of content, you narrate the content you want to experience, 00:48:04.920 |
you create your own video game, you create your own movie experience. 00:48:08.080 |
And I think that there's a whole evolution that happens. 00:48:10.040 |
And if you look at Steve Pinker's book, Enlightenment Now, it has a great statistic, a set of statistics 00:48:14.520 |
on this, but the amount of time that humans are spending on leisure activities per week 00:48:19.480 |
has climbed extraordinarily over the past couple of decades. 00:48:22.900 |
We spend more time enjoying ourselves and exploring our creative interests than we ever 00:48:27.000 |
did in the past in human history, where before we were burdened by all the labor and all 00:48:31.040 |
the creative and knowledge work we had to do. 00:48:33.160 |
And now things are much more accessible to us. 00:48:36.060 |
And I think that AI allows us to transition into an era that we never really thought possible 00:48:39.880 |
or realized, where the limits are really our imagination of what we can do with the world 00:48:45.360 |
And the software, the automation, resolves to make those things possible. 00:48:50.560 |
And that's a really exciting kind of vision for the future that I think AI enables. 00:48:54.160 |
Star Trek had this right, people didn't have to work and they could pursue things in the 00:48:57.800 |
holodeck or whatever that they felt was rewarding to them. 00:49:02.160 |
But speaking of jobs, the job reports for August came in; we talked about this, we were 00:49:10.080 |
wondering if the other shoe would drop. Oh boy, did it drop: over a million job openings gone. 00:49:16.400 |
So without getting into the macro talk, it does feel like what the Fed is doing, and 00:49:20.920 |
companies doing hiring freezes and cuts is finally finally having an impact. 00:49:26.320 |
If we start losing a million jobs, as we predicted could happen here on the show... 00:49:32.360 |
People might actually go back to work, and Lyft and Uber are reporting that the driver shortage is over. 00:49:38.080 |
They no longer have to pay people spiffs and stuff like that to get people to come back 00:49:42.080 |
So at least here in America feels like we're turning a corner. 00:49:50.080 |
I was gonna say we got a couple of things we really want to get to here. 00:49:58.760 |
While we're recording the show, President Biden says, and I'm just going to quote here. 00:50:04.520 |
First, I'm pardoning all prior federal offenses of simple marijuana possession. 00:50:10.480 |
There are thousands of people who were previously convicted of simple possession, who may be 00:50:14.580 |
denied employment, housing or educational opportunities. 00:50:16.600 |
So my pardon will remove this burden. This is big news. 00:50:19.560 |
Second, I'm calling on governors to pardon simple state marijuana possession offenses, 00:50:25.580 |
just as no one should be in a federal prison solely for possessing marijuana. 00:50:28.880 |
Nobody should be in a local jail or state prison for that reason, either. 00:50:34.640 |
Third, and this is an important one, we classify marijuana at the same level as heroin. 00:50:42.200 |
I'm asking Secretary Becerra and the Attorney General to initiate the process of 00:50:48.080 |
reviewing how marijuana is scheduled under federal law. 00:50:52.080 |
I'd also like to note that as federal and state regulations change, we still need important 00:50:57.080 |
limitations on trafficking, marketing, and underage sales of marijuana. 00:51:00.960 |
So I'm asking for your thoughts on this breaking news. 00:51:03.240 |
Is this... I guess the timing on this is kind of midterm-related. 00:51:09.280 |
Is this a politically popular decision to make? 00:51:19.200 |
I mean, I thought that we should decriminalize marijuana for a long time, or specifically, 00:51:25.800 |
you know, I agree with this idea of de-scheduling it; it does not make sense to treat marijuana 00:51:30.920 |
the same as heroin, as a Schedule I narcotic. 00:51:35.120 |
It should be regulated separately and differently. 00:51:39.000 |
Obviously you want to keep it out of the hands of minors, but no one should be going to jail for possession. 00:51:45.520 |
And I think the thing they need to do, I don't see it mentioned here is they should pass 00:51:50.040 |
a federal law that would allow for the normalization of, let's call it, legal cannabis businesses. 00:52:01.000 |
So companies that are allowed to operate under state laws, like in California, should 00:52:06.840 |
have access to the banking system should have access to payment rails. 00:52:09.960 |
Because right now, the reason why the legal cannabis industry isn't working at all in 00:52:15.160 |
California is because they can't bank, they can't take payments. 00:52:18.640 |
So it's this weird all cash business that makes no sense. 00:52:21.400 |
So listen, if we're not going to criminalize it as a drug like heroin, if we're going to 00:52:26.480 |
allow states to make it legal, then allow it to be a more normal business where the 00:52:32.480 |
state can tax it, and it can operate in a more above board way. 00:52:37.680 |
So a federal mandate is what you're saying? A federal mandate... I think it could still be state by state. 00:52:43.600 |
But I think you need the feds to bless the idea that banks and payment companies can 00:52:49.200 |
take on those clients, which states have already said, are legally operating companies. 00:52:57.600 |
So maybe that's the one thing I would add to this. 00:52:59.880 |
But I don't have any complaints about this right now, based on what we know from this 00:53:04.240 |
And I would say, this, by the way, was an about-face. 00:53:08.760 |
Yeah, do you know what the polling data says? 00:53:11.520 |
I'm assuming there's big support among kind of the independents in the middle. 00:53:20.160 |
So look, this, to me, is the kind of thing that Biden should be doing with the 00:53:24.360 |
50/50 Senate: finding these sorts of bipartisan compromises. 00:53:32.600 |
Has this come up in the past? Like, what's been the political reason that other presidents, 00:53:36.920 |
even Obama, who had a similar ideology, didn't do this? 00:53:45.400 |
Does anyone know why this hasn't been done in the past? 00:53:47.120 |
There were rumors he was going to do it in a second term. 00:53:54.040 |
Yeah, the pardon doesn't require political capital. 00:53:57.280 |
I think it's probably the perception that this is soft on crime in some way, or there was some political cost. 00:54:04.160 |
As David said, I mean, I think the United States population has moved pretty meaningfully on this issue. 00:54:20.800 |
So when people saw the states doing it, and they saw absolutely no problem, you know, 00:54:25.360 |
in every state, I think what people will see next... that's a Gallup poll. 00:54:30.000 |
You're seeing support has increased dramatically. 00:54:32.040 |
MDMA, psilocybin, and some of these other plant based medicines, ayahuasca are next, 00:54:38.280 |
Now, I don't want to take away from how important this is for all the people for whom this will matter. 00:54:44.520 |
I just want to talk about the schedule change for marijuana. 00:54:48.200 |
As a parent, one of the things that I'm really, really concerned about is that through this 00:54:53.000 |
process of legalization, getting access to marijuana has frankly become too easy, particularly for kids. 00:55:01.120 |
At the same time, I saw a lot of really alarming evidence that the intensity of these marijuana- 00:55:08.000 |
based products has gone up; I think it's like five or six times more intense than it used to be. 00:55:16.000 |
So it's no longer, you know, this kind of do-no-harm drug that it 00:55:22.160 |
was 20 years ago. The way that it's productized today, David, this could actually be 00:55:29.280 |
as bad as some of these other, you know, narcotics. 00:55:33.040 |
So in June of this year, the Biden administration basically made this press release that said 00:55:40.320 |
the FDA is going to come out with regulations that would cap the amount of nicotine in cigarettes. 00:55:46.120 |
And I think that was a really smart move, because it basically set the stage to taper 00:55:50.300 |
nicotine out of cigarettes, which would essentially, you know, decapitate it as an addictive product. 00:55:59.540 |
And thinking about how it's dealt with, what I really hope the administration 00:56:06.360 |
does is empower the FDA: if you're going to legalize it, you need to have expectations 00:56:13.160 |
around what the intensity of these drugs are. 00:56:16.280 |
Because if you're delivering drugs OTC, now any kid can go in at 18 years old and 00:56:20.520 |
buy them, which means that 18-year-olds are going to buy them for 16-year-olds, and 16-year- 00:56:25.400 |
olds are going to get fake IDs to buy them for themselves. 00:56:29.280 |
So you're helping parents do our job. 00:56:37.040 |
If alcohol is 21, then of course, yeah, fine. 00:56:39.400 |
But even with alcohol, David, you know, we know what the intensity of 00:56:42.720 |
these is; there are labels, and there are warnings. 00:56:44.840 |
And you know the difference between beer versus wine versus hard alcohol. 00:56:48.880 |
But let me just give you some statistics here. 00:56:50.360 |
Chamath, if you think about cannabis in the 90s and prior to that (there have been 00:56:56.480 |
a ton of studies on this in Colorado), the THC content was less than 2%. 00:57:02.400 |
And then by 2017, we were talking about, you know, things going up to 17 to 28% for specific strains. 00:57:12.440 |
So they have been building strains like Girl Scout Cookies, etc., that have just increased in potency. 00:57:16.800 |
And then there are things like shards, and obviously with edibles you can create whatever dose you want. 00:57:22.000 |
So you have this incredible variation: you could have an edible that's 00:57:27.360 |
got one milligram of THC, you could have one that has 100, or you could have a whole pack. 00:57:32.440 |
And you see this happen in the news all the time: some kid gets their parents' pack. 00:57:38.280 |
And this dabbing phenomenon (dabbing is the shards, this really concentrated stuff) 00:57:45.560 |
combined with the edibles is really the issue, along with the labeling of them. 00:57:48.340 |
So you got to be incredibly careful with this. 00:57:50.640 |
It's not good for kids, it screws up their brains. 00:57:55.400 |
I have a zero tolerance policy on this stuff. 00:57:58.800 |
Like I don't want my kids touching any of this stuff until... 00:58:01.800 |
Yeah, but you also shouldn't be waiting until they're 35 or 40. 00:58:09.600 |
And I'm sure I'm not the only parent that's asking: you can't have this stuff be available, 00:58:13.320 |
effectively sold like in a convenience store, 00:58:15.720 |
where there isn't even labeling. No, no, that's not going to happen. At least with cigarettes, 00:58:20.440 |
it's very clear how bad this stuff is for you. 00:58:22.760 |
Or do you guys have any feedback on the jobs report or anything? 00:58:25.360 |
They're all going away when the AI wins? 00:58:29.080 |
Well, that's why I brought it up: we're now going to see a potential displacement, you know. 00:58:35.200 |
And a lot of the stuff, like even developers, right? 00:58:37.480 |
Don't you think, Freeberg, developers are going to start automating development tasks? 00:58:40.760 |
No, I think design tasks are going to be AI-aided. 00:58:45.560 |
I think what happens, particularly in things like developer tools, is the developer can do a lot more. 00:58:53.680 |
And so the overall productivity goes up, not down. 00:58:59.120 |
And remember, like we were talking about on the AMA the other night, Adobe Photoshop did the same thing for photography. 00:59:05.200 |
So you didn't have to take the perfect photograph and then print it. 00:59:08.040 |
You could, you know, you could use the software to improve the quality of your photograph. 00:59:12.360 |
And I think that that's what we see happening with all software. 00:59:15.360 |
In the creative process is it helps people do more than they realize they could do before. 00:59:20.480 |
And it opens up all these new avenues of interest and things we're not even imagining today. 00:59:24.240 |
Alright, so SCOTUS is going to hear two cases regarding Section 230. 00:59:28.940 |
The family of Nohemi Gonzalez, a 23-year-old American college student who was killed in 00:59:33.520 |
an ISIS terrorist attack in Paris back in 2015. 00:59:36.780 |
Remember those horrible attacks? The family is claiming that YouTube aided and abetted ISIS. 00:59:42.940 |
The family's argument is that YouTube's algorithm was recommending videos, which 00:59:47.500 |
makes it a publisher of content; you know, the Section 230, common carrier question. 00:59:51.260 |
If you make editorial decisions, if you promote certain content, you lose your 230 protections. 00:59:57.700 |
In court papers filed in 2016, they said the company, quote, knowingly permitted ISIS to post 01:00:01.660 |
on YouTube hundreds of radicalizing videos inciting violence, which helped the group 01:00:06.540 |
recruit, including some who were actually involved in the terrorist attack. 01:00:11.340 |
Well, look, let's be honest, we can put a pin in this thing. 01:00:14.940 |
Because I think it would be shocking to me if this current SCOTUS all of a sudden found it 01:00:21.500 |
in the cockles of their heart to protect big tech. 01:00:23.940 |
I mean, they've dismantled a lot of other stuff that I think is a lot more controversial. 01:00:34.860 |
And so, you know, we've basically looked at gun laws, we've looked at affirmative action. 01:00:42.900 |
So, I mean, as we've said, I think we all know how that die has been cast, unfortunately. 01:00:50.360 |
So to me, it just seems like this could be an interesting case where it's actually nine- 01:00:54.140 |
zero in favor, but for completely different sets of reasons. 01:00:58.940 |
I mean, if you think of the liberal left part of the court, they have their own reasons 01:01:02.940 |
for saying that there shouldn't be 230 protections for big tech. 01:01:05.760 |
And if you look at the far right, or the right-leaning members of SCOTUS, 01:01:10.700 |
they have another set of reasons. You think? Were you going to make a point? 01:01:14.340 |
No, but even in their politics, they actually end up in the same place. 01:01:18.060 |
They both don't want the protections, but for different reasons. 01:01:21.820 |
So there is a reasonable outcome here where, you know, Roberts is going to have a 01:01:25.900 |
really interesting time trying to pick who writes the majority opinion. 01:01:29.020 |
There was a related case in the Fifth Circuit in Texas. Did you guys see this Fifth 01:01:33.900 |
Circuit decision? Texas passed a law imposing common carrier restrictions on social media companies. 01:01:42.340 |
The idea being that social media companies need to operate like phone companies, and 01:01:46.660 |
they can't just arbitrarily deny you service or deny you access to the platform. 01:01:51.820 |
And the argument for why that had previously been viewed as unconstitutional was this 01:01:58.340 |
idea of compelled speech: that you can't compel a corporation to support speech that it 01:02:03.860 |
doesn't want to, because that would be a violation of its own First Amendment rights. 01:02:07.820 |
And what the Fifth Circuit said is no, that doesn't make any sense. 01:02:11.060 |
Facebook or Twitter can still advocate for whatever speech they want as a corporation. 01:02:15.700 |
But as a platform, if Texas requires them to not discriminate against people on 01:02:21.820 |
the basis of viewpoint, then Texas has the right to impose that. 01:02:27.100 |
Their quote was that it does not chill speech; if anything, it chills censorship. 01:02:31.620 |
So what's the right legal decision here in your mind, putting aside politics, if you 01:02:35.140 |
can, for a moment, putting on your legal hat, what is the right thing for society? 01:02:39.940 |
What is the right legal answer around Section 230? 01:02:43.260 |
Specifically in the YouTube case, and just generally: should we look at YouTube, 01:02:46.780 |
should we look at a blogging platform like Medium or Blogger, or Twitter? Should we look 01:02:51.660 |
at those as common carriers that are not responsible for what you publish on them? 01:02:57.580 |
Obviously, they have to take stuff down if it breaks their terms of service, etc. 01:03:01.820 |
I've made the case before that I do think that common carrier requirements should apply 01:03:05.940 |
on some level of the stack to protect the rights of ordinary Americans to have their 01:03:11.100 |
speech in the face of these giant monopolies, which could otherwise deplatform them for their viewpoints. 01:03:17.100 |
Just to, you know, explain this a little bit: 01:03:19.860 |
historically, there was always a debate between so-called positive rights and negative rights. 01:03:27.300 |
So where the United States started off as a country was with this idea of negative rights: 01:03:31.460 |
that what a right meant is that you'd be protected from the government taking some action against you. 01:03:37.540 |
And if you look at the Bill of Rights, you know, the original rights are all about protecting 01:03:41.660 |
the citizen against intrusion on their liberty by a state or by the federal government. 01:03:46.780 |
In other words, Congress shall make no law, it was always a restriction. 01:03:50.540 |
So the right was negative, it wasn't sort of positively enforced. 01:03:53.140 |
And then with the progressive era, you started seeing, you know, more positive rights: 01:03:58.340 |
like, for example, that American citizens should have the right to healthcare, right? 01:04:03.180 |
That's not protecting you from the government. 01:04:05.140 |
That's saying that the government can be used to give you a right that you didn't otherwise have. 01:04:10.380 |
And so that was sort of the big progressive revolution. 01:04:12.880 |
My take on it is, I actually think that the problem we have in our society right now is 01:04:20.940 |
that free speech is only a negative right, and I think it actually needs to be a positive right. 01:04:23.300 |
I'm embracing a more progressive version of rights, but on behalf of this original right to free speech. 01:04:30.780 |
So, and the reason is because the town square got privatized, right? 01:04:34.620 |
I mean, you used to be able to go anywhere in this country, there'd be a multiplicity 01:04:37.860 |
of town squares; anyone could pull out their soapbox, draw a crowd, and people could listen. 01:04:46.660 |
The way that speech, political speech especially, occurs today is in these giant social networks 01:04:51.620 |
that have giant network effects and are basically monopolies. 01:04:55.340 |
So if you don't protect the right to free speech in a positive way, it no longer exists. 01:05:00.700 |
So you not only believe that YouTube should keep its Section 230 protections, you believe that YouTube 01:05:06.740 |
shouldn't be able to deplatform, as a private company, you know, Alex Jones, as but one example. 01:05:13.900 |
They should have their free speech rights, and we should lean on the side of forcing 01:05:17.300 |
YouTube to put Alex Jones, or Twitter to put Trump, back on the platform. 01:05:23.780 |
I'm not saying that the Constitution requires YouTube to do anything. 01:05:28.100 |
What I'm saying is that if a state like Texas or if the federal government wants to pass 01:05:33.700 |
a law saying that YouTube, if you are, say, of a certain size, a social network 01:05:39.020 |
of a certain size with monopoly network effects (I wouldn't necessarily apply this to smaller companies)... 01:05:43.800 |
But for those big monopolies, we know who they are, if the federal government or a state 01:05:49.600 |
wanted to say that they are required to be a common carrier, and they cannot discriminate 01:05:54.440 |
against certain viewpoints, I think the government should be allowed to do that because it furthers free speech. 01:06:00.480 |
Historically, they've not been able to do that because of this idea of compelled speech, 01:06:05.960 |
meaning that it would infringe on YouTube's speech rights. 01:06:09.840 |
I mean, Google and YouTube can advocate for whatever positions they want. 01:06:15.840 |
But the point, and I think Section 230 kind of makes this point as well, is that 01:06:19.840 |
they are platforms, they're distribution platforms, they're not publishers. 01:06:23.080 |
So, especially if they want Section 230 protection, they should not be discriminating on the basis of viewpoint. 01:06:31.680 |
David, the explanation that you just gave was so excellent 01:06:37.400 |
that it allows me to understand it even more clearly. 01:06:40.760 |
So, Chamath, do you think the algorithm is an act of editorialization? 01:06:49.520 |
Look guys, at the end of the day, let me break down an algorithm for you. 01:06:54.080 |
Effectively, it is a mathematical equation of variables and weights. 01:06:59.360 |
An editor 50 years ago was somebody who had that equation of variables and weights in their head. 01:07:09.280 |
And so all we did was we translated again, this multimodal model that was in somebody's 01:07:14.560 |
brain into a model that's mathematical, that sits in code. 01:07:20.680 |
You're talking about the front page editor of the New York Times. 01:07:24.000 |
And I think it's a fig leaf to say that because there is not an individual person who writes 01:07:27.840 |
0.2 in front of this one variable and 0.8 in front of the other, all 01:07:33.200 |
of a sudden this isn't editorial decision making. That argument is wrong. 01:07:36.400 |
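To put Chamath's point in concrete terms, here is a minimal sketch of an "algorithm" as exactly what he describes: an editor's judgment written down as variables and weights. The feature names and weight values below are hypothetical, purely for illustration.

    # A recommender reduced to its essence: an editor's judgment
    # expressed as weights. Feature names and weights are hypothetical.
    WEIGHTS = {
        "predicted_watch_time": 0.8,  # the "0.8 in front of one variable"
        "topical_relevance": 0.2,     # the "0.2 in front of the other"
    }

    def score(video: dict) -> float:
        # Weighted sum over features: the call a front-page editor
        # once made in their head, now sitting in code.
        return sum(w * video.get(f, 0.0) for f, w in WEIGHTS.items())

    def rank(videos: list[dict]) -> list[dict]:
        # Ranking a feed is just sorting by that editorial equation.
        return sorted(videos, key=score, reverse=True)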
We need to understand the current moment in which we live, which is that these computers 01:07:45.780 |
are providing this, you know, computationally intensive decision making and reasoning. 01:07:53.700 |
And I think it's pretty ridiculous to assume that that isn't true. 01:07:58.200 |
That's why when you go to Google and you search for, you know, Michael Jordan, we know what 01:08:03.960 |
the right Michael Jordan is because it's reasoned. 01:08:09.000 |
It's making an editorial decision around what the right answer is. 01:08:15.920 |
And so I think we need to acknowledge that because I think it allows us at least to be 01:08:19.880 |
in a position to rewrite these laws through the lens of the 21st century. 01:08:25.320 |
And we need to update our understanding for how the world works today. 01:08:30.520 |
And you know, Chamath, there's such an easy way to do this. 01:08:32.640 |
If you're TikTok, if you're YouTube, if you want Section 230, if you want to have 01:08:37.280 |
common carrier status and not be responsible for the content, then when a user signs up, you should give 01:08:41.720 |
them the option: would you like to turn on an algorithm? 01:08:44.920 |
Here are a series of algorithms which you could turn on, you could bring your own algorithm, 01:08:49.140 |
you could write your own algorithm with a bunch of sliders, or here are ones that other 01:08:52.980 |
users and services provide like an app store. 01:08:56.160 |
So you, Chamath, could pick one for your family, your kids. It would be: I want one that's 01:09:00.680 |
leaning towards education, takes out conspiracy theories, takes out cannabis content, takes out whatever else you choose. 01:09:05.960 |
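A minimal sketch of that sign-up flow, assuming the platform tags its content and lets the user pick a filtering policy at registration; the policy names, tags, and sample items are all hypothetical:

    # Hypothetical "choose your algorithm" flow. Policy names and
    # content tags are invented for illustration.
    POLICIES = {
        "education_leaning": {"conspiracy", "cannabis"},  # tags to exclude
        "unfiltered": set(),
    }

    def make_filter(policy: str):
        excluded = POLICIES[policy]
        def allowed(item: dict) -> bool:
            # Keep the item only if it carries no excluded tag.
            return not excluded & set(item.get("tags", ()))
        return allowed

    feed = [
        {"title": "Intro to algebra", "tags": ["education"]},
        {"title": "Moon landing was faked", "tags": ["conspiracy"]},
    ]
    family_feed = list(filter(make_filter("education_leaning"), feed))
    # Only the education video survives the family policy.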
What you're saying is so wonderful. 01:09:08.000 |
Because, for example, there's this organization, Common Sense Media. 01:09:12.400 |
Every time we put on a movie, I let Common Sense Media decide if we should watch it. 01:09:15.840 |
And I use it a lot for apps, because they're pretty good at just telling you which ones are appropriate. 01:09:22.520 |
But you know, if Common Sense Media could raise a little bit more money and create an 01:09:26.240 |
algorithm that would help filter stories in TikTok for my kids, I'd be more likely to let them use it. 01:09:35.360 |
Right now, I know that they're going to sneak it by going to YouTube and looking at YouTube 01:09:39.160 |
shorts and all these other things because I cannot control that algorithm. 01:09:43.320 |
And it does worry me what kind of content that they're getting access to. 01:09:47.640 |
And you could do this, by the way, Chamath, at the operating system level or at the router 01:09:51.560 |
level in your house. You could say: I want the Common Sense algorithm, I will pay $25 01:09:56.160 |
a month, $100 a year for it, put it on your router, and then any IP traffic that goes through it gets filtered: 01:10:02.840 |
I want less violence, I want less sex, you know, whatever we decide as a society. 01:10:13.180 |
And so I think we do need to have the right observation of the current state of play. 01:10:26.320 |
Yeah, I don't fully agree with Sax on the monopolistic assumption. 01:10:32.400 |
I think that there are other places to access content. 01:10:37.000 |
And I think that there is still a free market to compete. 01:10:42.560 |
I think that we saw this happen with TikTok, we saw it happen with Instagram, we 01:10:46.440 |
saw it happen with YouTube, competing against Google Video and Microsoft Video. 01:10:51.240 |
Compared to that, there has been a very significant battle for the attention of consumers. 01:10:59.580 |
And we have seen Spotify compete, and we're seeing Spotify continue to be challenged by new entrants. 01:11:06.080 |
So I don't buy the assumption that these are built-in monopolies, and that it therefore allows 01:11:12.780 |
some regulatory process to come in and say, hey, free speech needs to be actively enforced. 01:11:18.600 |
This isn't like when utilities laid power lines, and sewer lines, and trains across 01:11:24.720 |
the country, and they had a physical monopoly on being able to access and move goods and people. 01:11:30.520 |
The internet is still, thank God, knock on wood, open. 01:11:33.480 |
And the ability for anyone to build a competing service is still possible. 01:11:37.300 |
And there is a lot of money that would love to disrupt these businesses, and that is actively trying. 01:11:42.140 |
And I think every day... look at how big TikTok has gotten; it is bigger than YouTube now. 01:11:50.560 |
And because of that competition, I think that the market will ultimately choose where 01:11:55.440 |
they want to get their content from and how they want to consume it. 01:11:58.200 |
And I don't think that the government should play a role. 01:12:02.200 |
Well, so not all these companies are monopolies, but I think they act in a monopolistic way 01:12:07.400 |
with respect to restricting free speech, which is that they act as a cartel: they all share 01:12:12.880 |
best practices with each other on how to restrict speech. 01:12:16.120 |
And the watershed here was when Trump was thrown off. 01:12:22.280 |
You know, Jack... I don't know if it was Jack, but basically the company decided. 01:12:26.680 |
Actually, he said it was the woman who was running it, specifically. 01:12:34.440 |
And then all the other companies followed suit. 01:12:36.120 |
I mean, even Pinterest and Okta and Snapchat officially passed a policy. 01:12:49.400 |
And Jack actually said that in his comments where he said it was a mistake. 01:12:52.680 |
He said he didn't realize the way in which Twitter's action would actually cascade. 01:12:59.040 |
He said that he thought originally that the action was okay, because it was just Twitter 01:13:03.760 |
decided to take away Trump's right to free speech. 01:13:06.720 |
But he could still go to all these other companies. 01:13:08.280 |
And then all these other companies, basically, they are all subject to the same political 01:13:15.000 |
The leadership of these companies are all sort of, they all drink from the same monocultural 01:13:22.280 |
So the problem, Freeberg, is, yeah, I agree, a bunch of these companies aren't quite monopolies. 01:13:27.280 |
I hate to say it, but I agree with you, Sax. 01:13:29.280 |
But the collective effect is that of a speech cartel. 01:13:33.320 |
So the question is, how do you protect the rights of Americans to free speech in the 01:13:37.200 |
face of a speech cartel that wants to basically block them? 01:13:42.600 |
My argument is that these are not public service providers, they're private service providers. 01:13:48.400 |
The market is saying... I think that the pressure that was felt by these folks 01:13:53.480 |
was that so many consumers were pissed off that they were letting Trump rail on, or they 01:13:58.240 |
were pissed off about Jan 6, they were pissed off about whatever the current fad 01:14:03.160 |
or trend is. They respond to the market. 01:14:06.080 |
And they say, you know what, this has crossed the line. 01:14:08.520 |
And this was the case on public television: when nudity came out, they're like, okay, 01:14:12.400 |
you know what, we need to take that off the TV, because the market is telling us to. 01:14:17.560 |
And I think that there's a market pressure here that we're ignoring, that is actually 01:14:20.960 |
pretty, pretty relevant, that as a private service provider, if they're going to lose 01:14:24.840 |
half their audience because people are pissed about one or two pieces of content showing 01:14:29.080 |
up, then they're acting in the best interest of their shareholders and in the best interest 01:14:33.360 |
of their platform; they're not acting as a public service. 01:14:36.040 |
Look, I love market forces as much as the next libertarian. 01:14:40.160 |
But I just think that fundamentally, that's just not what's going on here. 01:14:43.040 |
This has nothing to do with market forces; it has everything to do with political forces. 01:14:47.640 |
Look, do you think the average consumer, the average user of PayPal is demanding that they 01:14:53.040 |
engage in all these restrictive policies, throwing off all these accounts who have the wrong viewpoints? 01:14:58.480 |
No, that has nothing to do with it; it has to do with the vocal minority. 01:15:00.920 |
Yeah, it's a small number of people who are political activists who work at these companies. 01:15:08.400 |
It's also the people from outside, the activists who create these boycott campaigns and pressure them. 01:15:14.540 |
And then it's basically people on Capitol Hill who have the same ideology, who basically apply the same pressure. 01:15:19.720 |
So these companies are under enormous pressure from above, below, and sideways, and it's not about market forces. 01:15:29.560 |
I think it's about maximizing political outcomes. 01:15:34.160 |
And that is what the American people need to be protected from. 01:15:36.840 |
Now I will add one nuance to my theory though, which is I'm not sure what level of the stack it should apply at. 01:15:46.580 |
So in other words, you may be right, actually, that at the level of YouTube or Twitter or 01:15:52.560 |
Facebook, maybe we shouldn't make them common carriers, and I'll tell you why, just 01:15:56.720 |
to take the other side of the argument for a second, which is if you don't... 01:16:00.920 |
Because those companies do have legitimate reasons to take down some content. 01:16:05.120 |
I don't like the way they do it, but I do not want to see bots on there. 01:16:08.600 |
I do not want to see fake accounts, and I actually don't want to see truly hateful speech or the like. 01:16:15.720 |
And the problem is I do worry that if you subject them to common carrier, they won't 01:16:20.020 |
actually be able to engage in, let's say, legitimate curation of their social networks. 01:16:25.720 |
However, so there's a real debate to be had there and it's going to be messy, but I think 01:16:31.160 |
there's one level of the stack below that, which is at the level of pipes, like an AWS, 01:16:35.600 |
like a CloudFlare, like a PayPal, like the ISPs, like the banks. 01:16:40.320 |
They are not doing any content moderation, and they have no legitimate reason to be doing any. 01:16:45.200 |
None of those companies should be allowed to engage in viewpoint discrimination. 01:16:48.280 |
We have a problem right now where American citizens are being denied access to payment 01:16:52.800 |
rails and to the banking system because of their viewpoints. 01:16:55.920 |
So wait, hold on, you're saying AWS shouldn't be able to deny service to the Ku Klux Klan? 01:17:00.560 |
I think that they should be under the same requirements the phone company is under. 01:17:05.480 |
Yeah, I mean, this is very complicated, J-Cal. 01:17:06.480 |
So when you frame it that way, you know, the question is... look, I could frame it another way: 01:17:12.800 |
should such-and-such horrible group be able to get a phone account, right? 01:17:18.680 |
And you'd say, no, they shouldn't get anything. But they have that right. 01:17:21.960 |
That has been litigated and that's been pretty much protected by the Supreme Court. 01:17:25.720 |
You know, even if it's a government conferred monopoly, the Supreme Court has said, okay, 01:17:30.520 |
listen, it's not violating one's constitutional rights. 01:17:33.360 |
For example, your water service can't get terminated without you getting due process, and the inverse. 01:17:40.680 |
So whether we like it or not, Jason, that issue has been litigated. 01:17:45.600 |
I think, for me, again, just practically speaking, for the functioning 01:17:52.160 |
of civil society, it's very important for us to now introduce this idea of algorithmic choice. 01:17:59.940 |
And I don't think that will happen in the absence of us rewriting Section 230 in a modern way. 01:18:07.040 |
I don't know whether this specific case creates enough standing for us to do all of that. 01:18:14.760 |
But I think it's an important thing that we have to revisit as a society. Because, Jason, 01:18:20.040 |
what you described, having a breadth of algorithmic choices over time... 01:18:26.880 |
could you imagine? That's not a job or a company that the four of us would ever have imagined before. 01:18:33.420 |
But maybe there should be an economy of algorithms, and there are these really great algorithms 01:18:39.040 |
that one would want to pay a subscription for, because one believes in the quality of that algorithm. 01:18:45.760 |
And I think it's an important set of choices that would actually allow YouTube, as an example, 01:18:50.640 |
to operate more safely as a platform, because it can say: listen, I've created this set 01:18:55.280 |
of abstractions, you can plug in all sorts of algorithms, there's a default algorithm 01:18:59.520 |
that works, but then there's a marketplace of algorithms, just like there's a marketplace of apps. 01:19:08.160 |
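A minimal sketch of the abstraction being described, assuming a platform that ships a default ranking function and lets third-party "marketplace" algorithms register behind the same interface; every name and signature here is hypothetical:

    from typing import Callable, Dict, List

    # A ranking algorithm: candidate items in, ordered feed out.
    RankingAlgorithm = Callable[[List[dict]], List[dict]]

    MARKETPLACE: Dict[str, RankingAlgorithm] = {}

    def register(name: str, algo: RankingAlgorithm) -> None:
        # Third parties (a Common Sense-style vendor, say) publish here.
        MARKETPLACE[name] = algo

    # The platform's built-in default: most engagement first.
    register("default", lambda vs: sorted(
        vs, key=lambda v: v.get("engagement", 0), reverse=True))

    # A marketplace alternative: strict reverse-chronological.
    register("chronological", lambda vs: sorted(
        vs, key=lambda v: v.get("posted_at", 0), reverse=True))

    def build_feed(videos: List[dict], choice: str = "default") -> List[dict]:
        # The user's chosen algorithm is swapped in behind one stable interface.
        return MARKETPLACE.get(choice, MARKETPLACE["default"])(videos)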
It's a model where, if all the video content was 01:19:12.760 |
uploaded to a public blockchain and then distributed on a distributed computing system, 01:19:17.680 |
then your ability to search and use that media would be a function of a service provider 01:19:23.160 |
you're willing to pay for that provides the best service experience. 01:19:25.640 |
And by the way, this is also why I think, over time... Sax and I are kind of both 01:19:30.400 |
arguing both sides a little bit here. But I don't think that 01:19:35.120 |
the government should come in and regulate these guys and tell them that they can't take stuff down. 01:19:39.440 |
I really don't like the precedent it sets, period. 01:19:42.280 |
I also think that it's a terrible idea for YouTube and Twitter to take stuff down. 01:19:48.480 |
And I think that there's an incredibly difficult balance that they're going to have to find 01:19:52.840 |
because if they do this, as we're seeing right now, the quality of the experience for a set 01:19:57.000 |
of users declines, and they will find somewhere else; a market will develop for something better. 01:20:03.120 |
And so that's why I don't like the government intervening, because I want to see a better 01:20:06.840 |
product emerge when the big company makes some stupid mistake and does a bad job. 01:20:11.240 |
And then the market will find a better outcome. 01:20:15.720 |
And as soon as you do government intervention on these things and tell them what they can 01:20:19.680 |
and can't take down, I really do think that over time, you will limit the user experience 01:20:24.240 |
to less than what is possible if you allow the free market to work. 01:20:27.320 |
And this is where the industry needs to police itself. 01:20:30.280 |
If you look at the movie industry with the MPAA, and Indiana Jones and the Temple of 01:20:34.960 |
Doom, they came out with the PG-13 rating specifically for things that were a little too intense for PG. 01:20:40.320 |
This is where our industry could get ahead of this: they could give algorithmic choice. 01:20:47.080 |
And if you look at the original sin, it was these lifetime bans. Like, Trump should not 01:20:50.880 |
have been given a lifetime ban; they should have given him a one-year ban, they should 01:20:54.160 |
have had a process. If they hadn't overreached, we wouldn't be in this position. 01:20:58.400 |
But Jason, when you talk about having an industry consortium, like 01:21:03.120 |
the MPAA, what you're doing is formalizing the communication that's already taking place 01:21:09.120 |
And what is the result of that communication? 01:21:10.840 |
They all standardized on overly restrictive policies, because they all share the same ideology. 01:21:15.480 |
No, but if they did it correctly... it's all in the execution, Sax; it has to be executed 01:21:19.480 |
properly, like the movie industry. It doesn't matter, you'll end up with the same problem. 01:21:24.560 |
If you have the government intervene, or a private body intervene, any sort of set-standard 01:21:28.600 |
intervention that prevents the market from functioning... I disagree with you; I think you 01:21:32.760 |
can create more competition if the government says: okay, folks, you can have your standard 01:21:37.600 |
algorithm, but you need to make a simple, abstracted way for somebody else to write 01:21:43.400 |
some other filtering mechanism, basically so that users, the power users, can pick. 01:21:49.760 |
That's what the MPAA did. I don't understand why you don't like it; why isn't that more choice? 01:21:54.080 |
Because as a product person, as a product company, I don't want to be told how to make my product. 01:21:58.000 |
If I'm YouTube, I have an algo, and you're now saying that there is this distinction. 01:22:04.140 |
And my choice might be to create different content libraries. 01:22:06.800 |
For example, YouTube has YouTube Kids, and it's a different content library. 01:22:13.280 |
And you're trying to create an abstraction that may not necessarily be natural for the 01:22:16.720 |
evolution of the product set of that company. 01:22:20.560 |
That's not a good argument. Again, if you were not a monopoly, I would be more sympathetic. 01:22:25.600 |
But because somebody's feelings would get hurt, a product manager's feelings 01:22:29.720 |
would get hurt inside of Google, that's not a reason to not protect free speech. 01:22:33.800 |
I think you're unnaturally disrupting the product evolution. 01:22:37.520 |
That's what that's what happens when you're worth $2 trillion. 01:22:40.200 |
And when you impact a billion people on the planet, when you start having massive impact 01:22:44.280 |
in society, you have to take some responsibility. 01:22:46.840 |
And those companies are not taking responsibility. 01:22:49.240 |
If you're not super, super, super successful, this is not going to affect you. 01:22:53.180 |
So you're not going to worry about it. You'll see apps offshore, and you'll see 01:22:56.800 |
TikTok and other things compete, because they'll have a better product experience. 01:23:00.080 |
No, no, no. No one was going to create a new Google because they're down- 01:23:05.120 |
ranking one to 10% of the search results for political reasons. 01:23:10.520 |
There has to be some accountability. Hold on: in an ideal world, companies like Google and so forth would not 01:23:15.000 |
take sides in political debates; they'd be politically neutral. But they're not. 01:23:18.280 |
You look at all the data around the political leanings of the people running these companies, 01:23:21.920 |
and then you look at the actual actions of these companies, and they have become fully 01:23:25.600 |
political and they've waded into all these political debates with the result that the 01:23:29.280 |
American people's rights to speech and to earn have been reduced. 01:23:33.640 |
You have companies like PayPal, which are just engaging in retaliation, basically financial 01:23:38.000 |
retaliation, purely based on what political viewpoints people have. 01:23:44.280 |
It's not like PayPal needs to be in that business. 01:23:49.000 |
We're not going to stop calling it the Callin AMA. 01:23:51.200 |
Well, if they can get some servers... I don't know. Maybe you've got to raise some 01:23:55.560 |
money, Sax, for this app and get some more servers. 01:23:57.920 |
All right, listen, for the dictator who needs to hit the loo to do a number two: 01:24:23.040 |
get it while it lasts, enjoy it while it lasts, because we're wrapping it up here. 01:24:31.480 |
We open-sourced it to the fans, and they've just gone crazy with it. 01:25:01.600 |
We should all just get a room and just have like one big huge orgy 01:25:05.640 |
It's like this sexual tension that they just need to release somehow.