
E10: Twitter & Facebook botch censorship (again), the publisher vs. distributor debate & more


Chapters

0:00 The besties catch up on the news
1:29 NY Post Hunter Biden story & censorship by Twitter/Facebook
7:27 What is section 230 & how does it play into the publisher vs. distributor debate
13:23 Distinguishing between publishers & distributors
28:30 Why Twitter & Facebook's actions with the NY Post were a huge blunder & crossed a line, should the laws be rewritten?
37:21 Trump beats COVID, what that means for better treatment options, dueling town halls
46:14 Sacks explains his stance on Prop 13 & Zuckerberg's pro-Prop 15 lobbying
54:34 Thoughts on Amy Coney Barrett & Biden's large lead in the polls

Whisper Transcript

00:00:00.000 | Hey everybody. Hey everybody. Welcome. Besties are back. Besties are back. It's another all-in
00:00:04.480 | podcast. Dropping it to you unexpectedly because there's just so much news.
00:00:10.760 | Surprise Bestie Pod. We're dropping a bestie. It's not a code 13. We're not dropping
00:00:18.460 | any Snickers bars today. Just dropping a bestie. Oh no, he's got a megaphone.
00:00:24.040 | Oh no, he's got a megaphone. He's got two.
00:00:28.160 | This is a special censorship edition. Warning, warning.
00:00:38.040 | We hit a new low in terms of people needing to be heard. By the way, Chamath, Sacks' agent
00:00:48.900 | and his chief of staff called me. He felt like he only got 62% of the minutes in the last two
00:00:55.520 | podcasts versus the rest of us.
00:00:57.920 | And so I'm dealing with his agent a little bit. It's like the debates where they count
00:01:02.820 | the number of minutes.
00:01:03.560 | Who, Daniel? Is Daniel grinding you for more minutes?
00:01:06.440 | Daniel's grinding me for more minutes on the back channel.
00:01:07.520 | No, I go for quality over quantity.
00:01:09.780 | Absolutely. Okay. Well, this week's going to be, I mean, what a complete disaster of a week.
00:01:16.300 | Is there no other way to explain what is happening right now?
00:01:21.940 | Every day is a dumpster fire.
00:01:24.100 | It's a huge dumpster fire.
00:01:26.800 | So here we are, we're three weeks out from the election and somebody's emails have,
00:01:35.120 | Democrats' emails have been leaked again, potentially. But last time we had an investigation
00:01:44.060 | by the FBI and then that might have impacted the election. This time we have a whole different
00:01:53.320 | brouhaha. Apparently Hunter Biden, who
00:01:55.920 | loves to smoke crack and has a serious drug problem.
00:02:00.240 | This is, you know, he's a seriously, obviously troubled individual.
00:02:04.800 | But he brought three laptops to get them fixed and never picked them up.
00:02:09.440 | According to this story in the New York Post.
00:02:12.400 | So the New York Post runs a story with an author who is kind of unknown.
00:02:18.480 | And these laptops, somehow the hard drives... he never picked them up.
00:02:25.200 | That's the hard drive.
00:02:25.920 | That's a little suspicious.
00:02:27.040 | The hard drives wind up with Rudy Giuliani and the FBI.
00:02:30.320 | And anyway, what they say is that Hunter Biden, who we kind of know is a grifter
00:02:35.680 | who traded on his last name to get big consulting deals.
00:02:40.400 | I don't know what board anybody here has been on that pays $50,000 a month,
00:02:43.920 | but it's obviously gnarly stuff.
00:02:46.160 | But the fallout from it was the big story.
00:02:49.040 | I went to tweet the story and it wouldn't let me tweet the story.
00:02:52.640 | So the literal New York Post was banned
00:02:55.840 | by Twitter at the same time Facebook put a warning on it.
00:02:59.680 | So let's just put it out there.
00:03:01.440 | You know, Sacks, your guy's losing pretty badly in this election.
00:03:06.080 | And so we'll go to our token GOP here.
00:03:08.000 | What do you think of this?
00:03:09.840 | Let's take this in two parts.
00:03:12.160 | One, what do you think the chances are that this is fake news or real news or something
00:03:17.680 | in between?
00:03:18.160 | And then let's get into Twitter's insane decision to block the URL.
00:03:22.320 | Yeah, I mean, so first of all, I think this is a whole different story.
00:03:25.760 | I think this whole thing is a tragedy of errors on the part of sort of everyone involved.
00:03:32.480 | I think the New York Post story stinks.
00:03:35.040 | I don't think it meets sort of standards of journalistic integrity.
00:03:40.400 | We can talk about that.
00:03:41.440 | But then I think, you know, Twitter and Facebook overreacted.
00:03:45.520 | And I think that the story was well in the process of being debunked by the Internet.
00:03:50.720 | And it was like Twitter and Facebook didn't trust that process to happen.
00:03:55.120 | And so they intervened.
00:03:55.680 | And now I think there's going to be a third mistake, which is that conservatives are looking
00:04:01.120 | to repeal Section 230.
00:04:03.200 | We should talk about that.
00:04:04.240 | And so there's been a cascade of disasters that have led to this dumpster
00:04:10.000 | fire.
00:04:10.320 | But starting with the story, it is very suspicious.
00:04:16.080 | First of all, these disclosures about Hunter Biden's personal life, they didn't have to
00:04:20.560 | go there. It was completely gratuitous to the article.
00:04:22.800 | It was sleazy.
00:04:23.600 | And then, of course, this story
00:04:25.600 | about how the hard drive ends up with the reporters makes no sense.
00:04:29.200 | Even today, Giuliani was making up new explanations for how it got there.
00:04:34.320 | It's now being widely speculated that the content came from the result
00:04:39.280 | of a hack, maybe involving foreign actors, rather than this whole idea that it came from this
00:04:45.120 | sort of hard drive that he left at a repair shop and forgot to pick up.
00:04:48.560 | I mean, so that's now... you know, I think that would have been the story today if it
00:04:54.720 | weren't for
00:04:55.520 | Facebook and Twitter making censorship the story.
00:04:58.320 | And then the final thing is, you know, this story wasn't a smoking gun to begin with.
00:05:03.360 | I mean, the worst thing it showed was that there was a single email between a Burisma
00:05:08.800 | executive and Joe Biden.
00:05:10.160 | And the Biden campaign has denied it, saying that Joe Biden never met with this guy.
00:05:15.520 | And so it wasn't ever the smoking gun.
00:05:18.960 | And that makes it all the more apparent why Facebook and Twitter
00:05:25.440 | sort of overreacted.
00:05:27.120 | It was almost like they were trying to overprotect their candidate.
00:05:30.400 | That's the thing that obviously looks crazy.
00:05:34.560 | Like they now have given the GOP, the extreme right, the belief that the
00:05:42.560 | technology companies are now on the side of the left, whereas last time they were on the
00:05:48.000 | side of the right.
00:05:48.960 | I think Facebook was supposed to be on the side of the right last time.
00:05:51.600 | So, Chamath, you worked at Facebook famously for many years.
00:05:54.320 | What are your thoughts?
00:05:55.120 | Well, Jack came out last night and basically said that the reason
00:05:59.360 | they shut down distribution was that it came from hacking and doxing, or some...
00:06:04.160 | I think that was basically the combination.
00:06:05.920 | Yes, a combination.
00:06:07.120 | And then Facebook today came out and said, you know, before we could take it down, it had been
00:06:11.840 | distributed or read 300,000 times.
00:06:15.440 | I mean, look, if we just take a step back and think about what's happening here, there are
00:06:22.640 | more and more examples that are
00:06:25.040 | telling, I think, all of us what we kind of already knew, which is that this fig leaf that
00:06:30.080 | the online Internet companies have used to shield themselves from any responsibility,
00:06:38.160 | those days are probably numbered because now, exactly as David said, what you have is the left
00:06:43.520 | and the right looking to repeal Section 230.
00:06:46.720 | And so and by the way, two days ago, I think it was Clarence Thomas basically put out the
00:06:51.440 | entire roadmap of how to repeal it.
00:06:53.120 | And if you assume that
00:06:54.800 | Amy Coney Barrett gets put onto the high court in a matter of days or whatever, it's only a matter of
00:07:02.480 | time until the right case is thoughtfully prepared along those guardrails that Clarence Thomas
00:07:08.640 | defined, and it'll get fast-tracked through to the Supreme Court.
00:07:14.480 | But if I was a betting man, which I am, I think that Section 230's days are numbered, and
00:07:20.400 | Facebook, Twitter, Google, all these companies are going to have to look more
00:07:24.720 | like newspapers and television stations.
00:07:26.640 | Okay, so before we go to you, Friedberg,
00:07:28.800 | I'm just going to read what Section 230 is.
00:07:31.600 | This is part of a law basically designed to protect common carriers and web hosts from legal
00:07:40.400 | claims that come from hosting third-party information.
00:07:44.560 | Here's what it reads.
00:07:46.080 | No provider or user of an interactive computer service shall be treated as the publisher
00:07:50.320 | or speaker of any information provided by another information content provider.
00:07:54.480 | So what this basically means is if you put a blog post up and people comment on it,
00:07:58.880 | you're not responsible for their comments.
00:08:01.040 | Or if you're Medium and you host the blog, you're not responsible for the comments of that person.
00:08:04.880 | It's that person's. It makes complete logical sense.
00:08:07.920 | The entire internet was based off of this, that platforms are not responsible for what people
00:08:14.000 | contribute to those platforms.
00:08:15.680 | That's how publishing works.
00:08:17.040 | Now look at the internet as paper.
00:08:18.400 | But again, let's build on this one.
00:08:20.320 | That's the only way to do it.
00:08:23.440 | Right.
00:08:24.240 | When that law was originally written, we had no conception of social distribution and
00:08:29.600 | algorithmic feeds that basically pumped content and increased the volume on those things.
00:08:34.880 | So what you have now is really no different than if you created a show on Netflix or HBO
00:08:42.480 | or CBS and put it out there.
00:08:44.720 | If that stuff contained something that was really offensive, those companies are on the
00:08:49.840 | hook.
00:08:50.080 | Did they make it?
00:08:51.040 | Did they distribute it?
00:08:53.760 | But here's the difference.
00:08:54.560 | It's not just the Netflix comparison, it's the active act of distributing it.
00:08:58.720 | You cannot look at these companies and say they are basically holding their hands back.
00:09:03.360 | They have written active code and there are technical procedures that they are in control
00:09:08.320 | of that are both the amplifier and the kill switch.
00:09:11.200 | But isn't this a bad analogy, Netflix?
00:09:14.640 | Shouldn't the analogy be the person who makes film stock or the person who makes the camera
00:09:19.360 | or the person who develops the film, not the person who distributes it?
00:09:22.640 | No, because there's
00:09:23.200 | a limited amount of shows on Netflix.
00:09:25.440 | You can police all of them.
00:09:26.320 | You can't police everything written.
00:09:28.080 | Netflix is making editorial decisions about which shows to publish, just like a magazine
00:09:36.000 | makes editorial decisions about which articles to publish.
00:09:39.120 | They are clearly publishers.
00:09:40.560 | But the Communications Decency Act Section 230, the original distinction, if you want
00:09:46.880 | to think about it in offline terms for a second, you've got this idea of publishers and distributors.
00:09:52.720 | Right?
00:09:52.880 | That's a fundamental dichotomy.
00:09:54.160 | A magazine would be a publisher.
00:09:56.640 | The newsstand on which it appears is a distributor.
00:10:00.000 | It shouldn't be liable.
00:10:00.960 | If there's a libelous article contained in that magazine, you shouldn't be able to sue
00:10:07.360 | every single newsstand in the country that made that magazine available for sale.
00:10:10.960 | That was the original offline law that was then kind of ported over into Section 230.
00:10:17.840 | It made a lot of sense.
00:10:18.800 | Without this, I mean, I think it was a really visionary provision.
00:10:21.760 | It was.
00:10:22.080 | It was passed in 1996.
00:10:24.320 | Without that, every time somebody sent an email that potentially created a legal
00:10:32.320 | issue, Gmail could have been liable.
00:10:35.120 | Friedberg, is it...
00:10:37.200 | What's the right analogy?
00:10:38.320 | When people post to the internet, is the analogy paper or film stock?
00:10:43.280 | Is it the newsstand or is it the publisher?
00:10:45.440 | So remember, what Sacks is pointing out is this was passed in 1996.
00:10:50.080 | So think back to 1996.
00:10:51.440 | When you would create some content, and the term around that time was user generated content.
00:10:58.160 | You guys remember this, like the early days.
00:10:59.920 | It was like the big sweeping trend: "Oh my God, all this content is being
00:11:05.840 | created by the users.
00:11:07.280 | We don't have to go find content creators to create a reason for other consumers to want
00:11:12.800 | to come to our websites."
00:11:14.080 | So users could create content.
00:11:15.840 | Blogger was an early kind of user generated content service.
00:11:19.200 | You could create a blog post.
00:11:20.320 | You could post it.
00:11:20.960 | And people would show up.
00:11:21.920 | The problem with Blogger, or the challenge, was distribution or syndication.
00:11:27.440 | Now I've posted my content.
00:11:30.080 | How do I, as that content creator, get people to read my content?
00:11:33.280 | And you'd have to send people like a link to a website, a link to a web page.
00:11:36.560 | And you click on that link and then you could read it.
00:11:38.640 | What Chamath is pointing out is that today, Twitter and Facebook
00:11:44.800 | and YouTube make a choice about what content to show.
00:11:47.440 | And so I think the analogy in the offline sense,
00:11:50.480 | Via the algorithm is what you're saying to be clear.
00:11:52.880 | Via the algorithm.
00:11:54.080 | And YouTube realized that if they showed you videos that they think that you'll click on,
00:11:58.320 | they'll keep you on YouTube longer and make more money from ads.
00:12:00.960 | So it keeps the cycle going.
00:12:02.480 | And so they optimize content.
00:12:04.080 | And it turns out that the content that you need to optimize for to get people to keep
00:12:07.120 | clicking is content that is somewhat activating to the amygdala in your brain.
00:12:11.600 | It's like stuff that makes you angry or makes you super pleasured, not just boring, ordinary
00:12:16.880 | stuff.
00:12:17.360 | And so this sort of content, which the New York Post sells a lot of,
00:12:20.000 | is the sort of stuff that rises to the top of those algorithms naturally because
00:12:25.680 | of the way they operate.
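To make the distinction being debated here concrete, below is a minimal sketch, in Python, of a reverse-chronological feed versus an engagement-optimized one. The field names, signals, and weights are hypothetical, invented purely for illustration; this is not any platform's actual ranking code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float          # seconds since epoch
    predicted_clicks: float   # model's guess at how likely you are to click (0..1)
    ad_value: float           # how monetizable the impression is (0..1)

def reverse_chron_feed(posts: list[Post]) -> list[Post]:
    # The "rev chron" feed discussed later in the episode: newest first,
    # no judgment calls by the platform.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts: list[Post], revenue_weight: float = 0.5) -> list[Post]:
    # The algorithmic feed Friedberg and Chamath describe: someone chose
    # these signals and these weights, and that choice decides what rises
    # to the top of the feed.
    def score(p: Post) -> float:
        return (1 - revenue_weight) * p.predicted_clicks + revenue_weight * p.ad_value
    return sorted(posts, key=score, reverse=True)
```

The point of the contrast: the first function is mechanical, while the second encodes editorial-style choices in code, which is the crux of the publisher-versus-distributor argument that follows.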
00:12:26.800 | Now, if a magazine stand were to put those newspapers using the offline analogy on the
00:12:31.360 | front of their magazine stand and told people walking down the street, hey, you guys should
00:12:34.480 | check these out.
00:12:35.520 | You know, top of the news is Hunter Biden smoking crack with a hooker.
00:12:39.200 | People would probably stop.
00:12:40.720 | But I think the question is, should they be liable?
00:12:43.280 | Now, in 1998, the Digital Millennium Copyright Act was passed, and that act
00:12:49.520 | basically created a process for folks to make claims related to copyright.
00:12:54.800 | But I think the analogy is similar.
00:12:56.720 | If you thought that your content was copyrighted and was being put up falsely or put up without
00:13:01.600 | your permission, you could make a claim to one of those platforms to get your content
00:13:06.000 | pulled down.
00:13:07.040 | And I think the question is, is there some sort of analogy around libelous content or false
00:13:12.560 | or misleading content that maybe this evolves into law where there's a process by which
00:13:17.840 | platforms can kind of be challenged
00:13:19.040 | on what they're showing, much like they are with the DMCA takedown notices.
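As a rough illustration of the DMCA-style challenge process Friedberg is gesturing at, here is a hypothetical sketch of an on-the-record dispute queue. Every name and structure here is invented for illustration; it is not how any platform actually implements takedowns.

```python
from dataclasses import dataclass, field
from enum import Enum

class ClaimStatus(Enum):
    PENDING = "pending"
    UPHELD = "upheld"      # content comes down
    REJECTED = "rejected"  # content stays up

@dataclass
class Claim:
    content_id: str
    claimant: str
    reason: str            # e.g. "libelous", "misleading", "copyright"
    status: ClaimStatus = ClaimStatus.PENDING

@dataclass
class DisputeQueue:
    claims: list[Claim] = field(default_factory=list)

    def file(self, content_id: str, claimant: str, reason: str) -> Claim:
        # Anyone who feels a right was violated can file; the claim is on the record.
        claim = Claim(content_id, claimant, reason)
        self.claims.append(claim)
        return claim

    def resolve(self, claim: Claim, upheld: bool) -> None:
        # The platform (or an arbiter) must resolve each claim one way or the other.
        claim.status = ClaimStatus.UPHELD if upheld else ClaimStatus.REJECTED
```

The design point is the audit trail: every challenge and every resolution is recorded, which is the "mechanism to disambiguate" Chamath asks for later in the discussion.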
00:13:23.760 | So the problem comes back to the code.
00:13:26.800 | If you explicitly write code that fundamentally makes it murky, whether you are the publisher
00:13:35.360 | or the distributor, I think that you have to basically take the approach that you are
00:13:40.720 | both and then you should be subject to the laws of both.
00:13:44.160 | If, for example, Twitter did not have any algorithmic redistribution or
00:13:48.560 | amplification, where the only way you could get content was in a real-time feed
00:13:53.840 | that was everything that your friends posted, and they stayed silent.
00:13:58.640 | You could make a very credible claim that they are a publisher and not a distributor,
00:14:03.040 | which, by the way, is the way it originally worked.
00:14:05.760 | And it was why they were falling behind Facebook, as you well know, because you worked on it.
00:14:18.080 | You can't claim that you're not a distributor when you literally have a bunch of people
00:14:20.080 | that sit beside you writing code that decides what is important and what is not.
00:14:22.960 | You can debate which signals they decide to use, but it is their choice.
00:14:27.840 | Well, but if the signals are the user's own clicks, then I would argue that's still just
00:14:34.000 | user generated content.
00:14:35.200 | No, no, it is a signal, David, but that's not the only signal.
00:14:38.800 | For example, I can tell you very clearly that we would choose a priori stuff that we knew
00:14:43.680 | you would click on.
00:14:44.480 | It wasn't necessarily the most heavily clicked.
00:14:46.400 | We could make things that were lightly clicked, more clicked.
00:14:47.600 | We could make things that were more clicked, less clicked.
00:14:50.320 | But my point is there are people inside the bowels of these companies that are deciding
00:14:55.600 | what you and your children see.
00:14:57.040 | And to the extent that that's okay, that's okay.
00:15:00.400 | Wait, wait, maybe we've actually solved this problem, Sacks, in that if we said, if you deploy
00:15:06.720 | an algorithm that is not disclosing how it works, then you are, ergo, a publisher.
00:15:12.800 | And if you are just showing it reverse chronological, are you saying that you're not a publisher?
00:15:17.280 | Rev chron, as we used to call it back in the day, with the newest thing up top, that would be just
00:15:22.960 | that. So maybe we should not be getting rid of 230.
00:15:26.240 | We should be talking to these politicians about algorithms equal publisher.
00:15:30.400 | So the publisher at the New York Post is the same as the algorithm.
00:15:34.320 | I like this as a better framework.
00:15:36.400 | Well, yeah.
00:15:37.600 | So Senator Tom Cotton, who's a Republican, he tweeted in response to the New York Post
00:15:42.880 | censorship, look, if you guys are going to act like publishers, we're going to treat
00:15:46.080 | you like publishers.
00:15:47.600 | So that's not modifying Section 230.
00:15:49.840 | That's just saying you're not going to qualify for Section 230 protection anymore
00:15:52.640 | if you're going to make all these editorial decisions. And I would argue that the decisions
00:15:57.600 | they're making about censoring specific articles are editorial decisions.
00:15:59.920 | And by the way, it's a total double standard because, you know, when Trump's tax returns
00:16:04.640 | came out a week or two ago, where was the censorship of that?
00:16:07.840 | That was, wasn't that hacked material?
00:16:09.600 | I mean, that was material that found its way to the New York Times without Trump's consent.
00:16:13.840 | By the way, so were the Pentagon papers.
00:16:15.760 | I mean, you cannot apply.
00:16:16.960 | The standard, this idea that we're going to prohibit links to articles.
00:16:21.040 | But you're proving the point.
00:16:22.480 | These people are publishers.
00:16:23.680 | No, I don't.
00:16:24.160 | Well, hold on.
00:16:26.160 | I'm saying if they make editorial decisions, they're publishers.
00:16:30.480 | I think there's a way for them to employ speech neutral rules and remain distributors.
00:16:36.320 | So I would have a little bit of an issue with you.
00:16:41.360 | I would say the reason why they're going to fall into this trap of becoming publishers
00:16:44.720 | is because of their own desires.
00:16:46.880 | They're not going to be able to censor their own biases.
00:16:48.880 | They can't.
00:16:49.280 | I don't think that's what it is.
00:16:50.080 | I think it's purely market cap driven.
00:16:51.840 | If you go from an algorithmic feed to a reverse chronological feed only, I can tell you what
00:16:58.880 | will happen in my opinion, which is that the revenue monetization on a per page per impression
00:17:04.880 | basis will fall off by 90%.
00:17:06.640 | 90% for sure.
00:17:07.920 | People wouldn't.
00:17:08.400 | Right.
00:17:09.120 | People wouldn't.
00:17:09.600 | That is the only reason why these guys won't switch because they know that for every billion
00:17:14.240 | dollars they make today, it would go to a hundred million.
00:17:16.800 | In a reverse chronological feed, because you would not be able to place ads in any coherent,
00:17:21.280 | valuable way.
00:17:22.320 | There'll be zero click throughs and the ads would be just worthless.
00:17:25.200 | Otherwise they should do it now.
00:17:28.720 | If you could keep all the revenue and you could be reverse chronological, right?
00:17:32.800 | And have the same market cap, just do it and be under safe harbor so that you're not attacked
00:17:37.360 | every day.
00:17:38.640 | How fun is it to be sitting there and being attacked every single day?
00:17:42.400 | By both sides.
00:17:43.280 | Well...
00:17:45.200 | And by all the libertarians in the middle.
00:17:46.720 | The reason they don't do it is because of money.
00:17:49.360 | Let's just be honest.
00:17:50.320 | That's the only reason they don't do it.
00:17:52.000 | They're all market cap driven.
00:17:53.440 | Maybe they should go back to this kind of straight reverse chron feed, and maybe you're right
00:17:58.000 | that the algorithm, I mean, I think you probably are right that the algorithms make the
00:18:02.480 | situation worse because they kind of trap people in these bubbles of like reinforcement and they
00:18:07.520 | just keep being fed more ideological purity, and it definitely is fueling the polarization
00:18:12.640 | of our society.
00:18:13.280 | So I'm not trying to defend, I mean,
00:18:16.640 | I think maybe you have a point that we should get rid of these algorithms,
00:18:19.600 | but just to think about like the publisher aspect of it,
00:18:22.560 | going back to the newsstand example,
00:18:24.080 | let's say that the guy who works at the newsstand knows his customers and pulls aside every month,
00:18:30.240 | the magazines that he knows that his clientele wants.
00:18:34.080 | And in fact, sometimes he even makes recommendations knowing that, oh, okay.
00:18:38.480 | You know, Tamath likes, you know, these three magazines, here's a new one.
00:18:41.600 | Maybe he'll like this and he pulls it aside for you.
00:18:44.080 | That would not subject him to publisher liability, even
00:18:46.560 | though he's doing some curation, he's not involved in the content creation.
00:18:51.200 | I would argue that if the algorithms proceed in a speech neutral way, which is just to say,
00:18:57.760 | they're going to look at your clicks and then based on your own revealed preferences,
00:19:01.520 | suggest other things for you to look at.
00:19:03.120 | I don't think that makes you a publisher necessarily.
00:19:05.760 | And I think if it was...
00:19:06.960 | But if you do put your finger, if these engineers are putting their
00:19:11.280 | thumb on the scale and pushing the algorithm towards certain specific kinds of
00:19:16.480 | content, that may cross over.
00:19:18.080 | No, no, no, no, no.
00:19:18.640 | You're being too specific, and it's not that extreme.
00:19:21.920 | And it's not as simple as you're saying.
00:19:23.680 | The reality is there are incredibly intricate models on a per person basis that these companies
00:19:30.080 | use to figure out what you're likely going to click on, not what you should, not what
00:19:35.360 | is exposed to you, not what you shouldn't, but what you likely will.
00:19:39.600 | And that's part of a much broader maximization function that includes revenue as a huge driver.
00:19:44.800 | Yeah.
00:19:45.360 | So the reality
00:19:46.400 | is that these guys are making publishing decisions.
00:19:49.440 | And you are right, David, that, you know, the law back in the day,
00:19:53.760 | it didn't scale to the newspaper owner. But you know what, in 1787, you know,
00:19:58.160 | enslaved people were counted as three-fifths of a person, and we figured out a way to change the law.
00:20:02.000 | So I'm pretty sure we can change the law here too.
00:20:04.720 | And I think what's going to happen is you should be allowed to be algorithmic,
00:20:09.520 | but then you should live and die by the same rules as everybody else.
00:20:12.960 | Otherwise that is what's really anti-competitive.
00:20:16.320 | So what's the point of that?
00:20:17.280 | I think the point of that is to essentially lie your way to a market advantage that isn't real.
00:20:22.400 | Just because people don't understand what an algorithm is, that's not sufficient to me.
00:20:26.240 | But they're not actually in the content creation business, right? And so what's the definition of the
00:20:32.880 | term publisher in that context? Because in all other cases, publishers pay for and guide and
00:20:39.360 | direct the editorial creation of the content versus being a kind of discriminatory function of that
00:20:45.040 | content.
00:20:45.440 | So I think that's the point. So let's take for example, Instagram Reels. Can you manipulate
00:20:52.160 | content through Reels? Yes. Now as the person that provides that tool to create content that
00:21:00.800 | theoretically could be violating other people's copyright or, you know, offensive or wrong or
00:21:06.400 | whatever and then you yourself distribute it to other people knowingly, the reality is that the
00:21:13.360 | laws need to address in a mature way.
00:21:16.240 | The reality of what is happening today versus trying to harken back to the 1860s and the 1930s
00:21:22.960 | because things are just different. And we're smart enough as humans to figure out these nuances and
00:21:28.720 | that sometimes we start with good intentions and the laws just need to change.
00:21:32.880 | Well, ironically, Chamath, you're making a point that Clarence Thomas made,
00:21:38.000 | Justice Thomas made in his recent filing, where he said that if you are acting as both
00:21:44.560 | a publisher and a distributor,
00:21:45.920 | you need to be subject to publisher liability, which means peeling back Section 230. And
00:21:50.560 | moreover, you may not even be the primary creator of the content. If you're merely a secondary
00:21:56.080 | creator, if you're someone who has a hand in the content, then you are a creator, you're a
00:22:02.400 | publisher, and therefore you should lose Section 230 protection. That is basically what he said.
00:22:06.720 | If your argument is that the algorithms make you a content creator effectively,
00:22:12.400 | Add the tools. Algorithms and the tools.
00:22:15.200 | And tools.
00:22:16.400 | The other thing is, you know, you have the algorithm...
00:22:18.720 | Because David's...
00:22:19.200 | You have the tools, but you also have monetization, guys.
00:22:22.320 | Right.
00:22:22.560 | There's monetization involved in the YouTube example.
00:22:24.720 | They are helping you make
00:22:26.720 | We're having a serious conversation, Jason. Let's not let's not go off on that. No, I'm just kidding.
00:22:30.800 | No, but Chamath, I mean, this goes back to the politics makes strange bedfellows point. I mean,
00:22:36.720 | I think a lot of the conservatives are actually making the point you're making, which is that
00:22:40.480 | these social media sites are involved in publishing.
00:22:43.760 | I don't know.
00:22:45.120 | I don't want these guys involved in any of this shit, because I don't trust them to be neutral
00:22:51.840 | over long periods of time.
00:22:53.600 | So do you trust their decision to pull down QAnon groups and
00:22:57.760 | Zero.
00:22:58.240 | What they call hate groups?
00:22:59.680 | Just like it took years for us to figure out that Holocaust denial was wrong.
00:23:05.600 | Anti-vax was marginal. QAnon was crazy.
00:23:08.320 | Like wearing masks was a good idea, right?
00:23:10.320 | I don't want these people in charge of any of this stuff. And to the extent that they
00:23:15.040 | are, I want them to be liable and culpable to defend their decisions.
00:23:19.360 | So, Chamath, your ideal nonprofit social media service would be a chronological feed
00:23:26.240 | of any content anyone wants to publish that anyone can browse.
00:23:29.360 | That's not what I'm saying, David. What I'm saying is that you have to be able to
00:23:33.120 | live with the risk that comes with, you know, playing in the big league and wanting to be a
00:23:40.560 | 500 plus billion dollar company. There is a liability that comes with that. And you
00:23:44.960 | need to own it and live up to the responsibility of what it means. Otherwise, you don't get the
00:23:49.840 | free option.
00:23:50.880 | What if they didn't take a hand in it and they followed the Digg, the Reddit model,
00:23:54.640 | and it's just upvoting that decides what content rises to the top?
00:23:57.600 | I suspect that Reddit has just a different problem, which is a sort of like,
00:24:02.160 | you know, a decency problem and a different class of law.
00:24:06.080 | Who are we to judge decency, right? I mean, like in the vein of like editorialism,
00:24:10.160 | like they're taking no hand in what content rises to the top.
00:24:13.040 | Well, they did ban certain topics.
00:24:14.880 | So they did recently, but like, assume they didn't, right? And it was just
00:24:18.160 | purely, like, upvoted by consumers and not algorithmic.
00:24:20.800 | Then I think it's
00:24:25.760 | very hard to pin a Section 230 claim on Reddit as easily as it is on YouTube, Facebook, and Twitter.
00:24:33.040 | And so if YouTube reverted to just, hey, what people are watching right now rises to the top,
00:24:36.960 | and that was the only thing that drove the algorithm, you would feel more comfortable
00:24:39.760 | with YouTube not being... It's not comfortable, it's what I know. This is what I'm saying. All I want
00:24:44.800 | to know is what am I getting when I go here? And if what I'm getting is a subjective function
00:24:52.080 | where they are maximizing revenue, which means that I can't necessarily trust the content I get,
00:24:59.200 | as long as I know that and as long as there's recourse for me,
00:25:02.480 | I'm very fine to use YouTube and Twitter and Facebook. What I think is unfair is to not know
00:25:08.960 | that there's a subjective function, confuse it with an objective function, go on with your
00:25:14.720 | life, and end up in the state that we're in now, where nobody is happy, everybody is throwing barbs,
00:25:20.560 | and you have no solution. Maybe I just want to be stimulated. Like, I remember the day when I would
00:25:24.720 | go to Facebook and Twitter and it was boring as hell. It's like just fucking random shit that
00:25:28.400 | people, like, here's a picture of my... Show me the best stuff. You know, like, now I go to
00:25:33.120 | Facebook and I'm like fucking addicted because it's showing me this and there's like shit that
00:25:36.080 | I've been buying online and the ads keep popping up and I'm like, Oh, this is awesome. And I keep
00:25:40.080 | buying more stuff. Well, I think all of that is good, but
00:25:43.520 | it all should be done eyes wide open, where in these corner cases, the people that feel
00:25:49.520 | like some sort of right or privilege has been violated, or some overstepping has occurred,
00:25:56.960 | should have some legal recourse, and there should be, on the record, a mechanism to
00:26:01.360 | disambiguate all that. Wait, hold on. Let me just ask this one question, David. Would this be
00:26:06.000 | alleviated if the algorithm was less of a black box? If we could just say, hey...
00:26:13.440 | No, we need these algorithms to be... so that's not a solution.
00:26:17.680 | And then also labeling, because Facebook labeled stuff:
00:26:23.840 | hey, this is disputed by a third party. That feels to me like that would have been a better
00:26:28.400 | solution in Twitter's case. All right, let me get in here. So I half agree with
00:26:32.240 | Chamath. Okay, so the half I agree with is I don't want any of these people meaning the
00:26:36.160 | social media sites, making editorial decisions about what I see, censoring what I can look at.
00:26:43.360 | I don't want that kind of power residing in really two people's hands: Mark
00:26:49.360 | Zuckerberg and Jack Dorsey. I don't trust them. And I don't want them to have that kind of
00:26:53.120 | power. But where I disagree is: if you repeal Section 230, you're going to make the situation
00:26:58.480 | infinitely worse. Because what is the response of these companies going to be?
00:27:03.760 | Corporate risk aversion is going to cause them to want to hire hundreds of low-level employees,
00:27:09.120 | basically millennials, to sit there making judgments about what
00:27:13.280 | content might be defamatory, might cause a lawsuit. They're going to be taking down
00:27:17.680 | content all over the place. And you know what will happen? That's gonna be a worse world. No,
00:27:21.760 | you know what will happen? Those companies will lose users, lose engagement, and new things will
00:27:26.640 | spring up in their place around these laws that work. How will they lose audience? I
00:27:32.480 | mean, I think what will happen is you have a torrent of lawsuits anytime somebody has a
00:27:38.240 | potential lawsuit based on, you know... I don't like trying to police
00:27:43.200 | speech at a scale that never existed.
00:27:48.960 | I don't think the goal is to work backwards from how do we preserve a trillion dollars of
00:27:53.120 | market cap? So what if that's what happens? That's what we're doing. So for me, I'm trying
00:27:58.880 | to work back from how do we preserve the open internet. But I think this is exactly what I'm
00:28:02.880 | saying, which is, here's a clear delineation in 2020, knowing what we know. You know, a
00:28:08.160 | person, an entrepreneur, who goes to Y Combinator or to LAUNCH to build the next great company: here are
00:28:13.120 | these rules, pick your poison. And some will choose to be just a publisher, some will probably
00:28:18.640 | create forms of distribution, we can't even think of some will choose to straddle the line, they'll
00:28:24.000 | have different risk spectrums that they live on. And that's exactly how the free markets work today.
00:28:28.640 | There's nothing wrong with that. Maybe the only like disagreement here
00:28:33.840 | is that I think that code can be written and algorithms can be written in a speech neutral way,
00:28:39.680 | so that the distributors don't cross over the line to becoming publishers.
00:28:43.040 | I fully agree with you that these sites should not be publishers.
00:28:47.360 | They should be platforms, and they crossed the line. I would say that with this
00:28:53.040 | New York Post story, the reason why people are up in arms about it is because what Twitter and
00:28:59.200 | Facebook have done is basically said they're going to sit in judgment of the media industry.
00:29:04.720 | And if a publisher like the New York Post puts out a story that doesn't meet the standards
00:29:10.080 | of Twitter and Facebook, they're going to censor them. That is
00:29:12.960 | a sweeping assertion of power. They're picking and choosing who they don't want to give distribution
00:29:17.920 | to. Yeah, we all agree on that piece. They should not be the arbiter of that.
00:29:23.440 | But that is what is triggering the conservatives in particular, but everybody,
00:29:27.920 | especially conservatives, to say they want to repeal Section 230. My point is nobody is safe.
00:29:33.600 | And it's less about... I actually think that there's a nuanced point to this, which is it's less about
00:29:40.000 | what they think is legit
00:29:42.880 | or not, as much as what they think is important or not. They chose to make this an important
00:29:47.840 | article. They chose to kind of intervene in this particular case, when every day there are going to
00:29:52.720 | be hundreds of other articles that are going to be actively shared on these platforms that are by
00:29:57.040 | those same standards, false with some degree of equivalency, or shouldn't be on the platform.
00:30:01.760 | Absolutely. And it is the simple choice that they chose an article to exclude, regardless of the
00:30:07.600 | reason in the background, because there are many articles like it that aren't being excluded.
00:30:10.800 | And that alone speaks to the hole
00:30:12.800 | in the system, as Sacks kind of said.
00:30:14.880 | Well, it's because they have too much power and they're unaware of their own biases.
00:30:20.400 | They can't see this action for what it so clearly was. It was a knee-jerk reaction on the part of
00:30:26.240 | employees at Twitter and Facebook to protect the Biden campaign from a story that they didn't
00:30:32.000 | like. I mean, because if they were to apply these standards evenly, they would have blocked the Trump
00:30:37.200 | tax returns for the exact same reason. By the way, just so you know,
00:30:39.360 | Cal's about to block you so he can keep the Biden campaign strong.
00:30:42.720 | And not have your.
00:30:43.280 | I would say I've been red pilled. Actually, the last 24 hours have been red pilling for me.
00:30:48.800 | I got to say, David, I agree with you, because like, I thought that both
00:30:53.280 | things were crossing the line, like meaning either you publish them both or you censor them both.
00:31:00.720 | And there are very legitimate reasons where you could be on either side. But to choose one and not
00:31:06.080 | do the other, it just again, it creates for me uncertainty. And I don't like uncertainty. And I
00:31:11.200 | really don't like the idea that.
00:31:12.560 | Right.
00:31:12.640 | That some nameless, faceless person in one of these organizations is all of a sudden
00:31:16.640 | going to decide for me knowledge. Yeah.
00:31:19.680 | And information. That to me is just unacceptable.
00:31:22.080 | The journalistic standard becomes a slippery slope to nowhere. Right. Like at that point,
00:31:25.840 | like what is true, what is not true? What is opinion? What is not opinion? What is what?
00:31:31.040 | You know, how do I validate whether this fucking laptop came from this guy or this guy or this guy?
00:31:35.360 | It's a slippery slope. How are you ever going to resolve that across billions of articles a day?
00:31:39.200 | Standards would be the answer.
00:31:42.560 | The New York Post has lower standards.
00:31:44.240 | Right. And so let's look at how slippery the slope has become. Just a week ago, I mean,
00:31:48.560 | literally a week ago, Mark Zuckerberg put out a statement explaining why Facebook was going to
00:31:54.640 | censor Holocaust denial.
00:31:56.400 | Wow, he really went out on a limb, huh, David?
00:31:58.400 | Well, it's I think.
00:32:00.880 | My point is, he actually put out a multi-paragraph, well-reasoned statement.
00:32:09.920 | Multi-paragraph! Your three paragraphs about "the Holocaust is bad." Wow. Congrats, Zuck.
00:32:14.880 | No, no, no. What I'm trying to... you're not listening to my point. My point is that he took
00:32:20.640 | it seriously that he was going to censor something. And I think people can come down either way. You could be like
00:32:25.520 | a Skokie ACLU liberal and oppose it. Or, you know, you could say, look, common sense dictates that
00:32:30.800 | you would censor this. But he felt the need to justify it with, you know, like a long
00:32:35.760 | post. And then one week later, we're already down the slippery slope to the point where, you know,
00:32:40.400 | Facebook's justification for censoring this article was a tweet by Andy Stone. You know, like that was
00:32:46.960 | it. It was a tweet. That was the only explanation they gave. By the way, one of the reporters
00:32:51.200 | pointed out that if you were going to announce a new policy, you probably wouldn't want it done by
00:32:55.920 | a guy who's been a lifelong Democratic operative. You know, this was just so and so it just shows
00:33:00.720 | that once you start down the slope of censoring things, it becomes so easy to keep doing it more
00:33:07.360 | and more. And, and this is why I think these guys are really in hot water, whatever, whatever,
00:33:12.960 | you know, whatever controversy there was about Section 230 before, and there was already a lot
00:33:18.800 | of rumblings in DC, about modifying this, they have made things 10 times worse. I mean, as someone
00:33:24.880 | who's actually a defender of Section 230, I wish Dorsey and Zuckerberg weren't making these blunder
00:33:30.640 | is because I think they're going to ruin the open internet for everyone.
00:33:33.360 | Super blunder. And I'll tell you what was an even bigger blunder for an equal blunder for me
00:33:37.360 | last night. I don't know if you guys had this experience. But I was trying to figure out what
00:33:41.040 | the consensus view on the Biden Hunter Biden story was. And I went to Rachel Maddow, and the last
00:33:48.800 | word and Anderson Cooper, and there was a media blackout last night. I couldn't find one left
00:33:56.640 | leaning or CNN if that is even in the center. I don't think they're the center anymore.
00:34:00.560 | Any more than the left. I couldn't find one person talking about Biden. I was like,
00:34:04.160 | all right, let me just see if I tune into Fox News. And Fox News was only discussing the
00:34:08.080 | Biden story. And so this now felt like, wow, not only if you were one of these, you know,
00:34:15.120 | folks on the left, who's in their filter bubble on Twitter and Facebook, they're not going to see
00:34:19.600 | that story. And then if they tuned into Rachel Maddow, or to Anderson Cooper, or you go to the
00:34:25.120 | New York Times, it's not there either. And then Drudge didn't have it for a day. You're bringing
00:34:30.480 | up something so important. So think about what you're really talking about, Jason, there was a
00:34:35.520 | first order reaction that was misplaced, and not rooted in anything that was really scalable or
00:34:41.520 | justifiable. Then everybody has to deal with the second and third order reactions. The left leaning
00:34:48.480 | media outlets circle the wagons, the right leaning media outlets are up in arms, nobody is happy,
00:34:55.440 | both look like they're misleading. And then now if you're a person in the middle, for example, what
00:35:00.400 | was frustrating for me yesterday was, it took me five or six clicks and hunting and pecking
00:35:05.280 | to find out what the hell is actually going on here. Why is everybody going crazy? But that
00:35:10.000 | bothered me. You know? And so I just think like, again, it used to be very simple to define what
00:35:18.160 | a publisher was and what a distributor was in a world without code, without machine learning,
00:35:23.440 | without AI, without all of these things. I think those lines are blurred. We have to rewrite the laws. I
00:35:30.320 | think you should be able to choose. And then I think if you're trying to do both, by the way, the
00:35:34.720 | businesses that successfully do both will have the best market caps. But if you're trying to do both,
00:35:39.360 | you have to live and die by the sword. Yeah. It would be interesting also, if I don't know
00:35:44.960 | if you guys have done this, but I switched my Twitter to being reverse chronological,
00:35:49.520 | which you can do in the top right hand corner of the app or on your desktop,
00:35:52.560 | because I just like to see the most recent stuff first. But then sometimes I do miss something
00:35:55.920 | that's trending, whatever. But I just prefer that because I have a smaller follower list now.
00:36:00.240 | But Friedberg, to your point, you kind of like the algorithm telling you what to watch. So
00:36:07.360 | a potential solution here might be... I'm not saying I like it rationally, by the way. I'm
00:36:10.720 | just saying like as a human, humans like it. I like it. I like to be stimulated with titillating
00:36:17.040 | information and interesting things that for whatever reason I'm going to...
00:36:22.320 | Click on again. You like that experience of jumping down the rabbit hole.
00:36:25.920 | My point is all humans are activated and the algorithms, the way they're written, they're
00:36:30.160 | designed to activate you and keep you engaged. And activation naturally leads to these dynamic
00:36:36.160 | feedback loops where I'm going to get the same sort of stuff over and over again that it identifies,
00:36:40.240 | activates me because I clicked on it. And therefore, I'm going to continue to firm up my
00:36:44.480 | opinions and my beliefs in that area. But I think showing me stuff that I don't believe,
00:36:49.200 | showing me stuff that's anti-science, because I'm a science guy, showing me stuff that's
00:36:52.240 | anti-science, showing me stuff that's bullshit that I consider bullshit,
00:36:55.440 | I'm not going to read it anymore. So if I'm reading just random blurtings by random people in
00:37:00.080 | reverse chronological order, it is a completely uncompelling platform to me and I will stop using
00:37:04.560 | it. And that leads back to kind of the, you know, Chamath's point, which is that the ultimate
00:37:08.400 | incentive, the mechanism by which these platforms stay alive is the capitalist incentive, which is,
00:37:13.440 | you know, how do you drive revenue and therefore how do you drive engagement? And that's to give
00:37:18.560 | consumers what they want. That's what consumers want. All right. Let's give Sacks his victory lap.
00:37:23.520 | He predicted last time that there was a possibility that Trump would come out of this like Superman.
00:37:30.000 | And would do a huge victory lap. And sure enough, he considered putting a Superman outfit on under
00:37:36.240 | his suit. And he did a victory lap literally around the hospital, putting the Secret Service
00:37:43.120 | at risk, I guess. And then did a Mussolini-like salute from everybody from the top of the White
00:37:51.280 | House. I mean, you nailed it, Sacks. He came out. It was very Il Duce.
00:37:55.920 | Il Duce. He did Il Duce.
00:37:57.440 | It was very Il Duce.
00:37:58.400 | Il Duce?
00:37:59.200 | No, but he was very Il Duce.
00:37:59.920 | It was very predictable. The media was making it sound like Trump was on his deathbed,
00:38:06.400 | you know, because the presumption is always that the administration's hiding something.
00:38:10.720 | He must be much sicker than he's letting on. If he says he's not that sick, it must be really bad.
00:38:15.040 | And so for days and days, they were talking about how Trump was, you know, potentially
00:38:19.120 | had this fatal condition. And by the way, he deserved it. You know, it was a moral failing.
00:38:24.080 | He was negligent. And so it's not unlike really what the right was doing, constantly accusing
00:38:29.840 | Biden of senility, you know, and then Biden went into that debate and then blew away expectations.
00:38:35.040 | And so the same thing here, you know, the media set up Trump to kind of
00:38:40.400 | exceed expectations. But I do think, you know, it is noteworthy that Trump was
00:38:47.680 | cured so quickly with the use of these, you know, clonal antibodies that we talked about last time.
00:38:53.600 | I think we talked about it on the show two weeks ago. And it was a combination, I guess, of Regeneron and Remedios,
00:38:55.760 | and then the use of these, you know, clonal antibodies that we talked about last time. And we talked about it on the show two weeks ago.
00:38:55.920 | And it was a combination, I guess, of Regeneron and Remdesivir. And the guy was out of there in like
00:39:01.840 | a couple of days. So, you know, it's like the media doesn't want to admit anything
00:39:07.680 | that is potentially helpful to Trump. But you have to say that at this point, we have very
00:39:14.160 | effective treatments for COVID. They may not be completely distributed yet. Trump obviously had
00:39:20.480 | access to them that the rest of us don't have. But it feels to me like we are really winding down on
00:39:25.840 | the whole COVID thing.
00:39:27.440 | Can I ask a question? Have they published the blow-by-blow tick-tock of exactly what he got when...
00:39:34.880 | No, they haven't, right? I would love to have that because I think all Americans deserve to have that.
00:39:41.120 | They did roughly, yeah. They know what his dosage was. I mean, they said what day he got the Remdesivir.
00:39:45.520 | He got several doses. It said what days he got the antibody treatment.
00:39:49.600 | I just want to print that out and keep it as a folder in my pocket just in case.
00:39:55.760 | We know what to take now. We know what to take if we get sick, right?
00:39:58.560 | Yeah. Well, the question is, can we get it?
00:40:00.640 | But even independent of that, right? Like, I think people love anecdotes. It's very hard for people to
00:40:08.480 | find emotion and find belief in statistics. And, you know, if you look at the statistics on COVID,
00:40:14.880 | you know, you go into the hospital, 80% chance you're coming out.
00:40:18.000 | And, you know, the average stay for someone that goes in, a lot of people are going to
00:40:22.320 | the ER and they're getting pushed back out because they're not severe enough.
00:40:25.680 | And I think the anecdote is everyone that gets COVID dies. The statistics show that that's not
00:40:31.280 | true. And, you know, whether or not Trump got exceptional treatment, he certainly did.
00:40:35.760 | It's very hard to Sachs' point for the storytelling that has kind of been used to keep people at home
00:40:43.360 | and manage kind of and create this expectation of severity of this crisis, etc. It's very hard for
00:40:49.520 | people to kind of then say, hey, like, you know, he's got a 97% chance of making it through this,
00:40:54.960 | and he'll be at 97% chance of making it through this.
00:40:55.600 | I mean, you know, he's got a 97% chance of making it through this. He's got a 99% chance of being out
00:40:56.720 | of the hospital in three days, when it happened. It was a shocking moment. And it really hit that
00:41:01.680 | narrative upside down. Right? Like, it was just like, well, can we show that there was a tweet
00:41:06.080 | recently providing the statistics on what the real infection fatality rate was for COVID?
00:41:11.680 | Yeah, I saw it. It's about half a percent point four. And that's across,
00:41:17.120 | you know, the whole spectrum. But like in anyone under 75 years old,
00:41:20.960 | you've got the number of strikes facts. Right, but it's here. Let me pull it up. It's on.
00:41:25.520 | We tweet, I think Bill Gurley first tweeted it. And then I read it.
00:41:28.320 | Yeah, I thought the IFR was like point one, if you're young,
00:41:32.160 | and it goes all the way up to like point four, if you're above 75.
00:41:35.680 | It's way less than point one.
00:41:37.040 | Yeah, it's it was, um, I thought the IFR was a lot less severe than that.
00:41:43.280 | That IFR is also distorted, you know, based on the zero prevalence study that was just published,
00:41:48.160 | you can take that number that's published and divided by about three, three to five.
00:41:54.480 | To get the true audience.
00:41:55.440 | Yeah, because not everyone that's had COVID has is registering as a positive infection,
00:41:59.600 | because they had COVID and got over it. So there was a paper published in in
00:42:03.360 | JAMA a few weeks ago, where they took dialysis patients, and they measured that they get blood
00:42:10.480 | from these dialysis patients. And they measured COVID antibodies in these patients. And they
00:42:14.880 | showed that in the Northeast, 30% of people, it's 27 point something percent of people
00:42:19.120 | have already had COVID. It's an incredible fact. Wow. And in the West, the number is
00:42:25.360 | close in Western states, they've kind of got it all written up in this paper. And they did a great
00:42:28.880 | job with the paper, it's about 3%. But in aggregate across the United States, it's a this was a few
00:42:34.640 | weeks ago. So nowadays, it was 10.5%, I think, so it's probably closer to 12%. Now people have
00:42:39.920 | already had COVID. And so then if you assume that number, right, I mean, that's 30 million people.
00:42:45.040 | And now you look at how many people have died, we haven't gotten the deaths wrong, right? Because
00:42:49.120 | everyone that's died from COVID, we've recorded that death. We know that numbers could be a little
00:42:53.520 | inflated, right? People who died with COVID. Right. So it's a lot of people who died with COVID.
00:42:55.280 | Exactly. But be conservative and assume that it's right. Right. I mean, if I look in the United
00:43:00.800 | States, 217,000 cases, but the real cases is 30 million, 30 million. And that's where you that's
00:43:07.760 | where you end up with this, like, you know, adjusted IFR true IFR of one. Yeah, like very,
00:43:14.880 | very point 1% point. Oh, 7% or point 7%. Sorry. By the way, my my tweets aren't loading, right?
00:43:25.200 | Right now. So I think Trump just took the tick tock decree, and he just crossed out tick tock and put
00:43:31.840 | Twitter and he just shut Twitter down. What was what is the tick tock thing done?
00:43:35.920 | Yeah, who knows? That was like three weeks ago. It doesn't matter. Or tomorrow.
00:43:40.960 | Was there a second debate?
00:43:44.160 | Tonight there's going to be two town halls. Trump refused to do a Zoom
00:43:55.120 | debate. Talk about the power of Zoom. A virtual debate he wouldn't do,
00:43:59.360 | ostensibly because he's not good when he's not interrupting somebody, would be my take on it.
00:44:05.360 | So then he went to NBC, from which he made $400 million, I guess, from The Apprentice, and NBC let him take
00:44:12.160 | a time slot directly opposite Biden tonight to do his own town hall. They didn't even stagger it.
00:44:17.920 | And NBC, which is responsible for saving Trump, is getting absolutely demolished by their own
00:44:25.040 | actors and showrunners on Twitter. So I think NBC is going to come out swinging tonight in this town
00:44:29.840 | hall to try to take down Trump as maybe their penance. That's my prediction. But
00:44:35.760 | how do you watch Biden if Biden is up against Trump? That's like watching paint dry
00:44:42.640 | versus watching some maniac running down Market Street with a samurai sword on meth.
00:44:49.040 | I won't be watching either. I cannot wait for this election to be over. How many
00:44:54.960 | days until November 3? We're at, like, 18 and a wake-up. 18 days, my God. Yeah,
00:45:01.520 | let's just get this over with. Yeah, 18. I know, we're all sick of it.
00:45:06.000 | I do feel like, I mean, the polls are now showing that Biden is up by as much as 17.
00:45:11.680 | Things really continue to break his way. I think, to your point, Jason, about Trump
00:45:18.800 | being more watchable, that's sort of Trump's problem: he just can't help making himself
00:45:24.880 | the center of the news cycle every single day. And to the extent the election is a referendum on
00:45:30.960 | Trump, I think he's going to get repudiated. If the election were more of a contest, and people would
00:45:37.520 | weigh Biden's positions as well, I think Trump would have a better shot, because
00:45:43.040 | Biden does have some weaknesses. But the whole reason why Biden's basement strategy
00:45:48.240 | has been working so far is because Trump just eats up all the oxygen and he's making it a referendum on
00:45:53.280 | him. I think he's going to get repudiated, and I think he's going to
00:45:54.800 | lose if he keeps doing it that way. You know what they say,
00:45:57.920 | Sacks: what got you here will not get you there. What got him into office was the ability to
00:46:02.640 | take up the entire media channel during the Republican primary and just demolish
00:46:07.840 | everybody. It was entertaining. Now it's exhausting. I want to
00:46:12.560 | change topics. I would like to ask David to explain his tweet related to Prop 13. Or 15. Yeah, yeah.
00:46:24.720 | So I saw that Mark Zuckerberg had contributed $11 million to try and convince
00:46:31.440 | the people of California to vote for this Prop 15, which is the largest property tax increase
00:46:36.640 | in California history. What it does is chip away at Prop 13 by moving commercial property out
00:46:44.720 | of Prop 13. And it would then tax it on what's called fair market value, as opposed to
00:46:51.760 | the cost basis of the property. It would have a lot of
00:46:54.640 | unfair consequences for property owners who've owned their commercial property for a long
00:47:01.840 | time. You know, if you're a small business and you've owned your store for 20, 30
00:47:08.080 | years, all of a sudden your taxes are going to get reassessed at the new fair
00:47:12.400 | market value. But the larger prize, though, is that the California
00:47:21.680 | unions, the government workers' unions,
00:47:24.560 | want to chip away at Prop 13. This is the first salvo: first, we're going to strip out
00:47:28.640 | commercial property; eventually, they want to basically repeal all of Prop 13.
00:47:33.040 | And I just think it's so misguided for billionaires to be using their wealth in this way,
00:47:38.640 | because Prop 13 is really the shield of the middle class in California.
00:47:43.280 | And it's kind of no wonder, frankly, that tech wealth is so
00:47:48.160 | increasingly despised in this country, because tech billionaires are funding such stupid causes.
00:47:54.480 | To explain this to people who don't know: in California, if you bought your house in 1970
00:47:59.880 | for fifty thousand dollars, the one percent tax you pay on it is five hundred dollars a year.
00:48:03.980 | That house might be worth five million today if it's in Atherton, and so what would be
00:48:09.440 | a fifty-thousand-dollar tax bill today is still basically a five-hundred-dollar tax bill. So they're
00:48:14.280 | starting with commercial spaces. And Jason, sorry, we'll go backwards. And you can pass it off to
00:48:18.700 | your kids at that cost basis. Yeah, so this is why you have two old people living in a five-bedroom house.
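As a rough illustration of the arithmetic in that example, here's a minimal sketch. The purchase price, market value, and 1% rate are the round numbers quoted above; the 2% figure is Prop 13's cap on annual assessed-value growth (the speakers' simpler version ignores the compounding and just keeps the bill at $500).

```python
# Rough illustration of the Prop 13 example above.
# Figures are the speakers' round numbers, not a real assessment.

TAX_RATE = 0.01      # Prop 13 caps the base property tax at 1% of assessed value
GROWTH_CAP = 0.02    # assessed value can grow at most 2% per year

purchase_price = 50_000     # house bought in 1970
market_value = 5_000_000    # what it might be worth today in Atherton
years_held = 50

# Under Prop 13, the assessment starts at the purchase price and compounds
# at no more than 2% a year, regardless of what the market does.
assessed_value = purchase_price * (1 + GROWTH_CAP) ** years_held

tax_under_prop13 = assessed_value * TAX_RATE   # ~$1,350/year
tax_if_reassessed = market_value * TAX_RATE    # $50,000/year

print(f"Prop 13 tax bill:    ${tax_under_prop13:,.0f}")
print(f"Reassessed tax bill: ${tax_if_reassessed:,.0f}")
```

Even with the 2% cap compounding for 50 years, the bill stays on the order of a thousand dollars a year, versus fifty thousand if the house were reassessed at fair market value; that gap is the whole fight.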
00:48:25.060 | Right, and it caps how much the assessment can increase every year, at 2%.
00:48:31.020 | Hold on, let me just explain to people: if
00:48:37.660 | you didn't have Prop 13, anybody who's owned their house for, say, 20 years would have a
00:48:43.400 | massive tax bill all of a sudden and probably would have to sell their house. Just about anybody
00:48:48.280 | who's middle
00:48:48.680 | class who's been in California for more than a decade or two probably could no longer afford to
00:48:54.860 | live in their house. But the reality is people are mortgaging that asset, Sacks, to access capital that
00:49:00.680 | they're using and investing in different things, and that's fueling the economy,
00:49:04.460 | right? So the libertarian point of view might be that less taxes is good, because in this
00:49:09.500 | particular case that building can still be used by that resident to buy stuff. They can take
00:49:15.740 | a mortgage out and go spend that money, versus having that money eaten
00:49:18.660 | up by property taxes. Well, yeah, so I understand that if you were to design
00:49:25.260 | the perfect tax policy, it wouldn't look like Prop 13. And maybe
00:49:32.400 | Prop 15 in a vacuum, if you're just a policy wonk trying to design the ideal tax policy,
00:49:37.800 | might look more like that. But the real problem in California is we're not an undertaxed state. It's
00:49:43.020 | a massively taxed state, and there's never enough. The beast always wants more.
00:49:48.640 | And so what I would say is, look, if you want to reform Prop 13, do it as part of a grand bargain
00:49:54.280 | that creates real structural reform in the state of California. What do I mean by structural reform?
00:49:59.560 | You've got to look at who controls the system, and it's really the government
00:50:02.500 | employee unions who block all structural reform and who keep eating up a bigger and bigger portion
00:50:08.620 | of the state budget. We've talked about this on previous pods: the police unions block any
00:50:14.140 | kind of police reform, the prison unions block prison reform, you've got
00:50:18.620 | the teachers' unions blocking education reform and school choice. If you want to talk about systemic
00:50:23.840 | problems in California, look at who runs the system. It's these gigantic unions, and
00:50:29.120 | a bigger and bigger portion of the budget keeps going to them every year. They're breaking the
00:50:33.500 | bank. And by the way, it doesn't get us more cops on the beat, it doesn't get us more teachers in
00:50:37.940 | the classroom. What it's buying is lots and lots more administration, along with a bunch of
00:50:42.680 | pension fraud. And so what I would do is I would say, look, we need some structural reforms here: we
00:50:48.600 | need some reforms on the rate of growth in spending, we need some pension reforms. In exchange for that,
00:50:54.960 | as part of a grand bargain, you might get some reforms to Prop 13. But to just give away one of
00:51:00.600 | the only cards we have in negotiating with these powerful special interests, for no reason, I just
00:51:06.180 | think it's dumb. Do you think that Zuck was tricked, or what do you think?
00:51:11.820 | Look, I don't really know, and I've defended him on this
00:51:16.080 | podcast a lot, basically on the story. But I
00:51:18.580 | think what it is: he's
00:51:21.100 | got some foundation, and he's got some pointy-headed policy wonks sitting there trying to analyze what
00:51:26.500 | the perfect tax policy is, and it probably looks more like fair market value than like cost basis,
00:51:31.960 | and they're not thinking about the larger political ramifications, which is that the
00:51:39.520 | private sector is being squeezed more and more by these public employee unions, and we do need
00:51:45.760 | structural reform, and we can't just give up
00:51:48.560 | one of the only cards we have, which would be trading reform on Prop 13. And does Zuck even own
00:51:54.620 | commercial real estate? Yeah, well, even if so. I would venture to guess that maybe Sacks does, I don't
00:52:01.520 | know. Hold on, I do, but let me explain why this doesn't affect me: my
00:52:06.440 | cost basis is fresh. All the commercial real estate that I've bought in California has
00:52:11.420 | been in the last few years and is probably underwater. It's certainly not above my cost basis,
00:52:15.440 | so it doesn't affect me. It affects
00:52:18.540 | the little guy. It affects the small business who's owned their property for 10 or 20 years.
00:52:23.460 | And again, I'm not arguing that we couldn't come up with a better tax system, but what I'm saying
00:52:28.440 | is the bigger, more pressing need is structural reform. Totally, no, I totally agree. The
00:52:33.360 | bloated monster of socialism is coming for us, and it starts with the unions. And it's
00:52:37.860 | just... the average salary. I don't know if you saw this go viral in the last couple weeks on Twitter:
00:52:42.120 | the average salary in San Francisco, 170 thousand dollars.
00:52:48.520 | A lot of tech workers. City employees. Yeah, I'm saying employees. I saw that, like, 170
00:52:53.020 | thousand was the average salary, and I was like, oh wow, tech people are doing good. It's like, no, no, no,
00:52:57.100 | that's the city employees: 19,000 administrative employees in the city of San Francisco, a city of
00:53:01.780 | 800,000 people, with a 14 billion dollar budget. The state of California is converting
00:53:10.000 | the entire middle class into government workers, because if you're a small business owner, you're
00:53:13.780 | getting squeezed by more and more taxes, you're getting driven out of the state. People leaving the
00:53:18.340 | state now exceed people immigrating into the state, so the private-sector middle class is leaving,
00:53:23.260 | and this sort of public-sector middle class of government workers is being
00:53:28.300 | created. And like I mentioned, it's not getting us more cops on the beat, it's not getting us more
00:53:32.500 | teachers in the classroom. What it's getting is a giant number of overpaid administrators and
00:53:37.120 | bureaucrats. That is the big structural problem. Private-sector unions are very different,
00:53:42.760 | you see: when a private-sector union goes to negotiate, they negotiate against ownership or
00:53:47.920 | management, and there's someone to oppose their unreasonable demands, not all their demands,
00:53:51.880 | just the most unreasonable ones. But with the public-sector unions, they're negotiating
00:53:56.560 | against the politicians, and they are the largest contributors to those politicians, so there's
00:54:00.880 | no one to oppose them. And the politicians need them for their votes, right? They're going to deliver
00:54:04.000 | whatever number of teachers, police officers. Exactly. The unions feed the politicians, the
00:54:09.220 | politicians feed the unions. That is a structural problem, and these unions
00:54:15.040 | will never be appeased.
00:54:17.740 | You can never buy them off. It's why democracy always ends up in a state like this; it's just
00:54:23.020 | an inevitable outcome. I had no idea about any of this. I'm glad I asked you about
00:54:29.440 | that tweet. I actually learned a lot just in that last little bit. I have one
00:54:35.080 | other thing I want to ask you guys about, which is the Amy Coney Barrett confirmation hearings: whether
00:54:39.760 | you guys have watched them and what you think. I don't know whether these are just cherry-
00:54:47.140 | picked clips or whether she's playing dumb. I really don't want to judge, because I want to know
00:54:54.040 | more, but I just want to know what you guys think going into this. I'll say
00:54:59.920 | something about climate change, because, look, I spent a lot of time looking at data and
00:55:06.040 | research on climate change, and I certainly feel strongly that there's a human-caused component
00:55:12.160 | of global warming that we're actively experiencing.
00:55:16.960 | But I think everyone kind of assumes you have to take that as truth. I think one of the key
00:55:23.140 | points of science is you have to recognize your ignorance, and you have to recognize that science
00:55:28.000 | is kind of an evolving process of discovery and understanding. She's
00:55:33.880 | getting a lot of heat for what she said, the "I'm not a scientist, I don't know how to opine on
00:55:38.860 | climate change," and when I heard that, it actually gave me a bit of pause, because this is exactly
00:55:45.400 | what I would expect someone who's thoughtful to say, not someone trying to act
00:55:50.200 | ignorant and play to the right. She didn't say, "I don't think climate change is being caused by
00:55:55.180 | humans," and I think everyone kind of wants to jump on her. It's, like, become religion. I
00:55:59.320 | just want to point out that climate change has become as politicized and as dogmatic as all these
00:56:04.300 | other topics we've talked about, and we all kind of assume that if you do or don't believe in climate
00:56:08.260 | change, you're left or right, you're evil or you're good. And I think it's very easy to kind
00:56:13.840 | of just go into those
00:56:15.220 | hearings and assume that, but I wouldn't say that her answer necessarily made me think that she is
00:56:20.140 | ignoring facts and ignoring the truth. I think she's kind of pointing out that this is a
00:56:24.280 | process of science and there's a lot of discovery underway. So I don't know, that was one
00:56:28.780 | controversial point that I thought I should make, because I am a believer: I do
00:56:34.120 | think that climate change is real, I do think the data and science support it. But I do appreciate
00:56:38.080 | that someone recognizes they may not have the skills to opine, rather than just assuming what the media
00:56:42.880 | tells us to believe. Yeah, the few
00:56:45.040 | clips that I saw of the confirmation hearing, my takeaway was basically that
00:56:49.300 | any candidate on the left or the right comes in extremely well coached, and they're taught
00:56:55.000 | basically how to evade, meaning there's a go-to answer. Amy Coney Barrett's go-to answer was,
00:56:59.800 | "Listen, as a judge, I'd have to hear that case on the record, I can't opine
00:57:04.660 | on something hypothetically." She had this very well-rehearsed answer, and a lot of
00:57:09.700 | the answers to the questions from the left were that, and the questions from
00:57:14.860 | the right were more softballish. So I couldn't really get a sense of it. Now, the thing that I
00:57:21.280 | take a lot of comfort in is that when we saw John Roberts get confirmed to the court,
00:57:27.280 | it was supposed to be 5-4 conservative with John Roberts, and basically what we've learned is
00:57:34.180 | that John Roberts, in some critical decisions, is willing to
00:57:39.400 | make sure that things don't change that much. Including Obamacare. Yeah, exactly.
00:57:44.680 | You don't know exactly how they're going to vote on these issues, you really
00:57:48.400 | don't. Roberts was the deciding vote in upholding Obamacare. Gorsuch extended gay rights well
00:57:56.080 | beyond anything Anthony Kennedy ever did; that was a big surprise. So we don't really know exactly
00:58:00.880 | how she's going to vote. The reason why Amy Coney Barrett rocketed to the top of Trump's list, quite
00:58:07.300 | frankly, is because of how Dianne Feinstein treated her three years ago in her last confirmation hearings,
00:58:12.220 | where Feinstein attacked her Catholicism. It was so ham-handed, it
00:58:18.580 | was so poorly done, that it made Barrett a hero instantly on the right, and it rocketed her to the
00:58:23.800 | top of this list. But we don't know how she's going to vote based on her Catholicism.
00:58:28.720 | Which is a feature, isn't it, David? Because the lifetime appointment means they, like tenure,
00:58:35.320 | can go with what they think is right. So that is kind of a good feature of the Supreme Court. Do you
00:58:40.480 | think there should be, like, a term limit?
00:58:42.040 | Well, I think it's a little crazy that decisions as important as, you know, the right to
00:58:50.380 | choose, or something like that, hang on whether an 89-year-old cancer victim can hold on for
00:58:58.360 | three more months. It seems very arbitrary to me, and therefore these Supreme Court battles
00:59:03.820 | become very heated and toxic. And there's been a recent proposal by Democrats
00:59:11.740 | that I would support, which basically says: listen, we should have an 18-year term for Supreme
00:59:16.120 | Court Justices, that's long enough, and each president should get two nominees, one in the first year
00:59:21.820 | and then one in the third year. So you basically have one Justice rolling off every two
00:59:26.020 | years and one coming on, and you have nine Justices, because nine seats times two years adds up to 18
00:59:31.240 | years. That proposal makes a ton of sense to me. And you know that when you
00:59:36.880 | vote for a president, they're going to get two Supreme Court picks. That feels less chaotic
00:59:41.440 | than this. That would be a much better system. That's a great idea. Yeah, that's
00:59:46.600 | a great idea. I think it's a fabulous idea.
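To spell out the arithmetic behind that proposal, here's a minimal sketch; the start year and the schedule are purely illustrative assumptions, not part of the actual proposal text.

```python
# Sketch of the 18-year-term proposal: one new justice seated every two
# years, each serving 18 years. The specific years are illustrative only.

TERM_YEARS = 18
SEAT_INTERVAL = 2  # one appointment every two years

for start in range(2021, 2041, SEAT_INTERVAL):
    print(f"Justice seated {start}, term ends {start + TERM_YEARS}")

# At any moment the bench holds the last 18 / 2 = 9 appointees, and a
# president serving 2021-2025 makes exactly the 2021 and 2023 picks.
print(f"Bench size: {TERM_YEARS // SEAT_INTERVAL} justices")
```

The invariant is just nine justices times two years per seat equals 18-year terms, which is why every president reliably gets two picks per four-year term.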
00:59:52.600 | I took solace in the fact that when they asked her what's protected under the First Amendment, she couldn't name all five things, which I could.
00:59:56.380 | I was like, what about protest, did you miss that one? And I thought that was, I mean, it's a
01:00:01.240 | gotcha moment, obviously, and it's not easy to be under that kind of scrutiny.
01:00:05.620 | Justice J. Cal! Wow. I just thought that was also, like, pretty interesting.
01:00:11.140 | I think they invented the word unconfirmable for
01:00:19.660 | J. Cal. You got a right to have your own pistola, but you shouldn't have a shotgun, boys. Friedberg
01:00:25.960 | has a hard stop at three. The fact that she left out protest... I do think,
01:00:33.400 | let's just end on the election and our little handicapping of what's going to happen and
01:00:40.540 | getting out of this mess. I do think
01:00:47.800 | one of the stories coming out of this is going to be female voters. I have the sense, and I know
01:00:55.180 | it's anecdotal, that Trump has just alienated and pissed off so many women, and that the threat of
01:01:07.240 | the Supreme Court thing, with RBG dying, has made women feel so underappreciated and
01:01:10.240 | attacked, especially with Trump,
01:01:17.620 | in terms of how he treats women and the things he says about women. And then you had the
01:01:24.580 | constant interruption by Pence of the moderator and Kamala. I think all of this is going to
01:01:30.580 | add up, and when we do the post-mortem on this, losing all these women as voters, as
01:01:37.540 | well as the Black vote and people of color, is going to be a big part of it. So I think
01:01:43.660 | Trump's going to lose, and it's going to be a landslide. What a roundabout way to say the same
01:01:50.800 | thing you've been saying for four months. Oh my God, he disrespected women. I don't know. Listen, I don't know.
01:01:57.400 | I think Biden is on the path to an enormous victory right now. Well, that's what the
01:02:01.480 | polls certainly say. It looks like a Biden landslide.
01:02:03.460 | And I guess that makes sense. I think Trump's running out of time to change the
01:02:06.480 | polls. Every day that goes by, he's basically got, like, 18 outs, with 18 days left,
01:02:12.360 | and every day that goes by where he isn't able to move the poll number, he loses an out,
01:02:16.040 | right? And so as we get closer to election day, he's only gonna have, like, three
01:02:19.840 | outs or something. So yeah, look, obviously, I understand the polls. I still
01:02:24.960 | somehow think, I know it sounds kind of weird, but I'm just not sure Americans are ready for
01:02:30.080 | this reality show to end. I mean, we know it's jumped the shark. Okay. But the Kardashians
01:02:35.560 | lasted for 19 seasons. I just don't know if America is ready for the Trump
01:02:40.780 | reality show. I think part of the appeal of Trump last time around was the message of change,
01:02:46.880 | and he's not delivering a message of change anymore. And I think that's where he's kind of
01:02:51.220 | lost the narrative. The excitement of building a wall and changing everything and draining the
01:02:56.880 | swamp, it's just, keep draining the swamp, or keep building the wall, and people don't
01:03:02.800 | love that.
01:03:04.640 | He also, I think, is coming across as weak by not being willing to be
01:03:10.540 | challenged, and that came across clearly in that debate. Last time around, he got on stage
01:03:15.160 | and just knocked everyone down. But by not letting Biden talk, by not engaging on
01:03:20.140 | any of the topics, he looks like he just doesn't want to take a shot at it. And
01:03:25.880 | it just comes across as bad. So I don't know, these are all contributing factors, I think,
01:03:29.380 | to what's going on.
01:03:30.080 | Chances of a pardon by Pence,
01:03:33.760 | or he resigns and pardons himself? Pence: zero, zero, zero.
01:03:38.680 | Well, we wouldn't see that unless he lost the election. If he loses, during
01:03:47.920 | the lame-duck period? Maybe 20%. 20%. Yeah, because at that point, he's got nothing to lose,
01:03:53.640 | right? Right.
01:03:55.300 | I think it's, like, 50/50 he just goes for the full family pardon.
01:04:02.840 | All right. Love you guys. I gotta go.
01:04:04.040 | Love you guys. And hopefully we'll have a bestie poker soon.
01:04:08.880 | Yay. Talk to you guys later. Bye.