
E107: The Twitter Files Parts 1-2: shadow banning, story suppression, interference & more


Chapters

0:00 Bestie gut health!
2:17 Twitter Files Part 2: shadow banning and blacklisting entire accounts and topics; how content moderation was handled at other tech giants
33:09 Twitter Files Part 1: Suppressing the New York Post's Hunter Biden laptop story
47:30 China effectively ends Zero-Covid policies; Iran, China, and Japan demographics
58:27 Kevin O'Leary defends FTX on CNBC, was paid $15M as a spokesperson; Sinema flips to Independent

Whisper Transcript

00:00:00.000 | You were bloated last night.
00:00:01.640 | What else is new?
00:00:02.440 | I said not bloated.
00:00:04.360 | My God, you really are, though.
00:00:06.560 | You look bloated.
00:00:07.520 | Listen, that's coming from you.
00:00:08.880 | You started to look like Bert,
00:00:10.240 | and now you're back to Ernie.
00:00:11.640 | Your face is getting round again.
00:00:13.240 | All I have to say is,
00:00:14.680 | hold on a second, guys.
00:00:15.600 | I gotta get a drink.
00:00:16.320 | Is it OK?
00:00:16.720 | You guys got a minute for me to get a drink?
00:00:17.960 | Yeah, yeah, I definitely do.
00:00:19.280 | I definitely do.
00:00:19.840 | Go ahead.
00:00:20.160 | Hold on a second.
00:00:20.760 | You gonna get a beer?
00:00:21.480 | No, no.
00:00:22.520 | I'm actually, you know,
00:00:23.520 | I've been working on my weight,
00:00:24.600 | so I'm just gonna pick here.
00:00:25.680 | I think I have the mocha latte
00:00:27.240 | from Supergut,
00:00:28.040 | and I also have the chocolate shake.
00:00:29.880 | Do you have a recommendation here
00:00:31.160 | for me, Friedberg?
00:00:31.920 | Because I'm going to put it in my coffee.
00:00:33.040 | Is mocha on a mocha?
00:00:35.160 | You can't go wrong.
00:00:36.200 | You can't go wrong.
00:00:37.280 | Thank you.
00:00:37.680 | Double mocha's a win.
00:00:38.840 | Just on a completely unrelated topic,
00:00:40.800 | did you happen to invest
00:00:42.200 | in Supergut, J. Cal?
00:00:43.360 | No, no, no.
00:00:43.920 | I haven't invested yet,
00:00:44.760 | but use the promo code.
00:00:45.800 | Oh, OK.
00:00:48.400 | It's been a big part
00:00:49.120 | of my weight loss journey.
00:00:50.080 | It's also been a big part of me
00:00:51.800 | and Friedberg becoming besties
00:00:54.080 | and creating a unified block
00:00:56.360 | for All In Summit 2023.
00:00:58.600 | So I've got two solid votes.
00:01:00.880 | I'll be very honest with you.
00:01:01.760 | You guys give me a credible plan
00:01:03.600 | where we can maintain the integrity.
00:01:06.000 | I was joking.
00:01:07.240 | Hold on, hold on, keep going.
00:01:08.640 | I was joking.
00:01:09.360 | Hold on.
00:01:09.880 | Maintain credibility.
00:01:11.680 | Continue.
00:01:12.200 | Listen to me.
00:01:12.720 | Listen to me.
00:01:13.240 | Listen to me.
00:01:13.760 | I'm listening.
00:01:14.640 | If you two idiots.
00:01:16.320 | I'm not involved.
00:01:17.520 | Yes, you are.
00:01:18.280 | You clearly are involved
00:01:19.720 | with this fucking grift.
00:01:20.800 | You're an important vote.
00:01:21.480 | Hold on.
00:01:22.240 | Continue, Chamath.
00:01:22.920 | I'm writing this in.
00:01:23.600 | I'm writing it down.
00:01:24.160 | If you two idiots,
00:01:25.400 | the two of you have to do this together
00:01:26.800 | because otherwise I'm with David
00:01:28.200 | and there's absolutely got it.
00:01:29.920 | You two idiots
00:01:31.400 | need to come up with a plan
00:01:33.400 | where we can each make.
00:01:37.160 | Make four million bucks each net,
00:01:41.000 | then I'll do it.
00:01:42.120 | Four million net.
00:01:43.400 | OK, great.
00:01:44.560 | Look at J. Cal writing that down
00:01:46.040 | as if he respects a contract.
00:01:48.280 | OK, got this.
00:01:49.280 | Got this.
00:01:49.640 | I signed the fucking contract.
00:01:50.760 | I signed the contract for J. Cal.
00:01:52.600 | The negotiation begins
00:01:53.960 | at the point
00:01:54.520 | where there's a signed contract.
00:01:55.960 | Yeah, exactly.
00:01:57.800 | So, OK, now negotiate with you.
00:02:01.280 | All right, everybody,
00:02:18.440 | the show has started.
00:02:19.800 | The four of us are still here
00:02:21.880 | by some miracle.
00:02:23.200 | We're still going after 107 episodes
00:02:25.080 | and it's better than ever.
00:02:25.840 | Last week we were number 12.
00:02:26.960 | So mainstream media.
00:02:28.520 | We'll see you in the top 10.
00:02:31.520 | Mainstream media.
00:02:32.560 | Here we go.
00:02:33.520 | Twitter files, part one and part two.
00:02:35.040 | We're not on strike.
00:02:35.680 | Despite your oppressive conditions.
00:02:37.560 | Yes, J. Cal.
00:02:38.560 | We're not on strike.
00:02:39.480 | What oppressive conditions
00:02:41.040 | of making you show up on time.
00:02:42.920 | Yeah.
00:02:43.680 | If I was getting paid five bucks for this,
00:02:45.920 | I'd be on strike right now.
00:02:48.000 | Guys, not only are you getting five bucks,
00:02:49.360 | you're getting a bill for the production.
00:02:50.720 | OK, here we go.
00:02:51.920 | By the way, how beautiful is it
00:02:53.960 | that the same reporters
00:02:55.240 | who couldn't stop writing
00:02:57.440 | about the oppressive working conditions
00:02:59.200 | that Elon Musk was supposedly creating
00:03:01.160 | because he simply wanted the employees
00:03:02.600 | to go back to the office and work hard.
00:03:05.200 | And if they did it,
00:03:05.880 | he'd give them a generous
00:03:06.640 | three months severance package.
00:03:08.200 | Yeah, those same reporters
00:03:09.400 | are now on strike
00:03:10.760 | because the Sulzbergers
00:03:11.800 | are running a clickbait farm over there
00:03:13.560 | with oppressive working conditions.
00:03:15.000 | The intellectual dishonesty
00:03:16.760 | has never been higher in the world.
00:03:18.240 | Yeah, I would like to ask.
00:03:20.480 | Honesty.
00:03:21.080 | Yes. Will the publisher
00:03:22.360 | of The New York Times agree
00:03:24.440 | that anybody who isn't happy
00:03:26.760 | there can have a voluntary
00:03:28.480 | three month severance package?
00:03:29.600 | Yeah. Click this link.
00:03:31.000 | And do you want to work hard
00:03:32.680 | or do you want three months severance?
00:03:34.160 | If The New York Times publisher did that,
00:03:35.680 | you know what would happen?
00:03:36.960 | 800 of 1200 people
00:03:38.280 | would take the severance.
00:03:40.080 | Of course.
00:03:41.320 | All right, here we go.
00:03:42.160 | Twitter files have dropped.
00:03:43.680 | Part one dropped with the legendary
00:03:46.400 | award winning, highly respected journalist
00:03:49.200 | Matt Taibbi.
00:03:50.040 | If you don't know who he is,
00:03:51.440 | he is a left leaning journalist
00:03:53.200 | who worked at Rolling Stone
00:03:55.720 | and did the best coverage, hands down,
00:03:58.280 | of the financial crisis
00:03:59.520 | and the shenanigans.
00:04:00.440 | And he held truth to power to that group.
00:04:02.080 | This is important to note.
00:04:03.920 | The second drop was given to Bari Weiss,
00:04:06.920 | who is a right leaning
00:04:09.320 | independent journalist.
00:04:10.560 | These are both independent journalists.
00:04:12.920 | She previously worked at The New York
00:04:15.520 | Times itself.
00:04:16.680 | Now, I think we should work backwards.
00:04:20.000 | From two to one.
00:04:21.000 | Do you agree?
00:04:21.800 | Yes, for sure.
00:04:23.280 | Let's start with the drop
00:04:24.080 | that just happened last night.
00:04:25.080 | Yes, so last night a drop happened.
00:04:26.920 | So here's what happens
00:04:29.040 | in Twitter files part two.
00:04:31.000 | I'm going to give a basic summary
00:04:31.960 | and then I'm going to give it to Sax
00:04:33.040 | because he's chomping at the bit.
00:04:34.280 | We now have confirmation
00:04:35.840 | that what the right thought
00:04:37.880 | was happening all along,
00:04:39.000 | which is a secret silencing system
00:04:42.360 | built into the software of blacklists
00:04:46.000 | was tagging right wing conservative voices
00:04:49.440 | in the system.
00:04:51.160 | And these included people like Dan
00:04:54.800 | Bongino, is that how you pronounce it?
00:04:56.880 | He was tagged with being
00:04:58.680 | on a search blacklist.
00:05:00.000 | What that means is you're a fan of Dan's
00:05:02.880 | who is a former Secret Service agent
00:05:05.440 | who is now a right wing conservative.
00:05:07.920 | I could just say conservative
00:05:09.000 | instead of wing.
00:05:09.800 | A conservative radio host, podcast host.
00:05:12.880 | He was not allowed to be found
00:05:14.920 | in search engines for some reason.
00:05:17.600 | Charlie Kirk, who is a conservative
00:05:19.240 | commentator, he was tagged with
00:05:21.160 | do not amplify.
00:05:22.360 | I guess that means you can't trend
00:05:24.440 | into people's feeds,
00:05:25.480 | even if they follow you.
00:05:27.400 | And then there were people
00:05:29.600 | who were put on the trends
00:05:31.160 | blacklist, including a Stanford
00:05:33.120 | professor, Jay Bhattacharya.
00:05:38.040 | Did I get it right?
00:05:38.800 | Yes, Jay Bhattacharya.
00:05:40.520 | OK, I got it right.
00:05:41.200 | Doctor of Stanford School of Medicine.
00:05:43.400 | And he was put
00:05:46.360 | on the trends blacklist
00:05:48.360 | because he had a dissenting opinion.
00:05:51.280 | A Stanford professor
00:05:53.280 | had a dissenting opinion.
00:05:54.480 | A dissenting opinion on COVID
00:05:55.000 | that's turned out to be true.
00:05:56.520 | And this is where the danger comes in,
00:05:58.280 | because all of these actions
00:06:00.440 | were taken without any transparency.
00:06:04.600 | And they were taken
00:06:06.840 | on one side of the aisle
00:06:08.760 | by people inside of Twitter,
00:06:11.360 | essentially covertly.
00:06:12.760 | No ownership of who did it.
00:06:14.560 | And they never told the people.
00:06:17.040 | They gaslit them.
00:06:18.880 | They could see their own tweets.
00:06:22.120 | They could use the service,
00:06:23.960 | but they couldn't be seen
00:06:25.600 | even by their own fans
00:06:27.120 | in many cases here.
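
A rough illustrative sketch, in Python with hypothetical names (not Twitter's actual code), of how per-account visibility flags like a search blacklist, a trends blacklist, and do-not-amplify could hide an account from search, trending, and recommendations while the account itself keeps working, which is the shadow-banning pattern described above:

from dataclasses import dataclass, field

@dataclass
class Account:
    handle: str
    flags: set = field(default_factory=set)  # e.g. {"search_blacklist"}

def visible_in_search(account: Account) -> bool:
    # The account still exists and can tweet, but never surfaces in search results.
    return "search_blacklist" not in account.flags

def eligible_to_trend(account: Account) -> bool:
    # The account's tweets are excluded from trending topics.
    return "trends_blacklist" not in account.flags

def amplified_to_non_followers(account: Account) -> bool:
    # The account's tweets are not recommended into non-followers' feeds.
    return "do_not_amplify" not in account.flags

if __name__ == "__main__":
    acct = Account("example_commentator", {"search_blacklist", "do_not_amplify"})
    print(visible_in_search(acct))           # False: hidden from search
    print(eligible_to_trend(acct))           # True
    print(amplified_to_non_followers(acct))  # False

The point of the sketch is that none of these flags delete content; they only change where it can be seen, which is why an affected account could keep posting without knowing anything had been applied.
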
00:06:28.880 | Sax, when you look at that,
00:06:30.400 | let's just start with that first piece,
00:06:32.720 | the shadow banning, as it's called
00:06:35.080 | in our industry,
00:06:36.640 | where you can participate
00:06:37.960 | in a community,
00:06:39.080 | but you can't be seen.
00:06:42.280 | Is there any circumstance
00:06:43.600 | under which this tool
00:06:45.320 | would make sense for you
00:06:46.320 | to deploy?
00:06:47.320 | And then what's your general take
00:06:48.320 | on what has been discovered
00:06:49.680 | last night?
00:06:50.680 | Okay, look, what was--
00:06:51.680 | Two-part question.
00:06:52.680 | Yes, let me start with
00:06:53.680 | what's been discovered here.
00:06:54.800 | Let me boil it down for you.
00:06:56.480 | This is an FTX-level fraud,
00:07:00.040 | except that what was stolen here
00:07:01.880 | was not customer funds.
00:07:03.200 | It was their free speech rights,
00:07:04.960 | not just the rights of people
00:07:06.200 | like Jay Bhattacharya
00:07:07.200 | and Dan Bongino to speak,
00:07:09.120 | but the right of the public
00:07:10.520 | to hear them in the way
00:07:12.280 | that they expected, okay?
00:07:14.680 | And you had statement
00:07:15.920 | after statement
00:07:16.920 | by Twitter executives
00:07:18.120 | like Jack Dorsey,
00:07:19.680 | like Vijaya Gadde,
00:07:22.040 | like, you know, Yoel,
00:07:24.440 | and others saying,
00:07:25.560 | "We do not shadow ban."
00:07:26.880 | And then they also said,
00:07:28.080 | "We certainly,"
00:07:29.080 | this is their emphasis,
00:07:30.480 | "do not shadow ban
00:07:31.480 | on the basis of political viewpoint."
00:07:33.440 | And what the Twitter files show
00:07:35.000 | is that is exactly
00:07:36.160 | what they were doing.
00:07:37.160 | They, in the same way
00:07:38.360 | that SBF was using FTX
00:07:41.360 | and customer funds
00:07:42.360 | as a personal piggy bank,
00:07:43.960 | they were using Twitter
00:07:45.160 | as their personal
00:07:46.560 | ideological piggy bank.
00:07:47.880 | They were going in to the tools
00:07:50.200 | and using the content moderation system,
00:07:52.240 | these big brother-like tools
00:07:53.760 | that were designed
00:07:55.160 | to basically put their thumb
00:07:56.440 | on the scale of American democracy
00:07:58.280 | and suppress viewpoints
00:07:59.360 | that they did not agree with
00:08:00.280 | and they did not like,
00:08:01.360 | even when, even when,
00:08:03.880 | they could not justify
00:08:06.600 | removing content
00:08:07.920 | based on their own rules.
00:08:08.920 | So there are conversations
00:08:10.480 | in the Slack that Barry Weiss exposed,
00:08:12.880 | where, for example,
00:08:13.880 | Libs of TikTok,
00:08:15.200 | they admit in the Slack
00:08:16.560 | that we can't suppress
00:08:18.880 | Libs of TikTok
00:08:19.880 | based on our hate policy.
00:08:21.680 | Libs of TikTok hasn't violated it.
00:08:23.680 | We're going to suppress
00:08:24.680 | that account anyway.
00:08:25.680 | Now, it's important to note
00:08:26.680 | what Libs of TikTok does.
00:08:27.680 | This is a great talking point.
00:08:29.880 | Libs of TikTok
00:08:31.160 | finds people who are trans,
00:08:34.560 | people who are, you know,
00:08:36.320 | maybe not LGBTQ,
00:08:39.160 | and they feature their TikToks
00:08:42.400 | and they mock them on Twitter.
00:08:44.400 | Now, this certainly is free speech.
00:08:47.080 | And the argument from the safety team
00:08:49.000 | was by putting all of these together,
00:08:51.800 | you're inciting violence
00:08:52.960 | towards those people.
00:08:54.840 | And they said they haven't broken a rule,
00:08:56.400 | but collectively,
00:08:58.160 | they could be, in some way,
00:09:01.240 | targeting those people.
00:09:02.240 | Is there anything fair,
00:09:03.760 | Friedberg, to that statement?
00:09:07.480 | That they targeted them?
00:09:09.320 | By collecting their,
00:09:11.480 | let's say, views that are,
00:09:12.960 | I'm asking this question
00:09:13.960 | for discussion purposes.
00:09:14.960 | I'm not giving my opinion.
00:09:15.960 | Jake, help.
00:09:16.960 | Hold on.
00:09:17.960 | I want Friedberg to answer one.
00:09:18.960 | Why can't I finish?
00:09:19.960 | I'm going to go back to you.
00:09:20.960 | You spoke for two minutes.
00:09:21.960 | That's why.
00:09:22.960 | Friedberg.
00:09:23.960 | You turned down moderating today, Sax.
00:09:24.960 | You could have had the opportunity
00:09:25.960 | to decide who speaks.
00:09:26.960 | Everybody else gets to speak
00:09:27.960 | as long as they want
00:09:28.960 | and I get interrupted.
00:09:29.960 | You got two minutes.
00:09:30.960 | Let Friedberg talk.
00:09:31.960 | Let me just finish the SBF analogy, okay?
00:09:32.960 | Oh, my God.
00:09:33.960 | The filibuster continues.
00:09:34.960 | Go ahead.
00:09:35.960 | Then you can both sides this issue.
00:09:36.960 | Don't worry, Sax.
00:09:37.960 | While you're speaking,
00:09:38.960 | I'm going to ask you to take a minute
00:09:39.960 | to think about what you said
00:09:40.960 | and then I'll come back to you.
00:13:19.960 | I think, I think, I think Sax has
00:13:23.960 | articulated a vision for the product
00:13:27.960 | he wanted Twitter to be
00:13:29.960 | but I don't think that's necessarily the product
00:13:32.960 | that they wanted to create.
00:13:34.960 | It's not that Twitter set out at the time
00:13:36.960 | or stated clearly that they were going to be
00:13:38.960 | the harbinger of truth and the free speech platform for all.
00:13:42.960 | I think they were really clear
00:13:43.960 | and they have been in their behavior
00:13:46.960 | and as you know demonstrated through this stuff
00:13:48.960 | that came out which to me feels a lot like a
00:13:50.960 | we already knew all this stuff.
00:13:52.960 | This is a bit of a nothing burger
00:13:54.960 | that they were curating
00:13:56.960 | and they were editing and they were editorializing
00:13:59.960 | other people's content and the ranking of content
00:14:02.960 | in the same way that many other internet platforms do
00:14:06.960 | to create what they believe to be
00:14:08.960 | their best user experience
00:14:10.960 | for the users that they want to appeal to.
00:14:12.960 | And I'll say like there's been this long debate
00:14:15.960 | and it goes back 20 years at this point
00:14:18.960 | on how Google does ranking.
00:14:20.960 | You guys may remember Jeremy Stoppelman went to
00:14:22.960 | DC and he complained about how Google was using his content
00:14:26.960 | and he wasn't being ranked high enough
00:14:28.960 | as Google's own content that was being
00:14:30.960 | shoved in the wrong place.
00:14:31.960 | And there's a guy who ran, he was a spokesperson
00:14:34.960 | for the SEO, the search engine optimization
00:14:37.960 | rules at Google.
00:14:39.960 | And it was always the secret at Google
00:14:41.960 | how did the search results get ranked.
00:14:43.960 | And I can tell you it's not just a pure algorithm
00:14:46.960 | that there was a lot of manual intervention,
00:14:48.960 | a lot of manual work.
00:14:49.960 | In fact, the manual work gets to the point
00:14:51.960 | that they said there's so much stuff that we know
00:14:53.960 | is the best content and the best
00:14:55.960 | form of content for the user experience
00:14:58.960 | that they ranked it all the way at the top
00:15:00.960 | and they called it the one box.
00:15:01.960 | It's the stuff that sits above
00:15:02.960 | the primary search results.
00:15:04.960 | And that editorialization ultimately led to a product
00:15:07.960 | that they intended to make because they believed
00:15:09.960 | it was a better user experience
00:15:10.960 | for the users that they wanted to service.
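
A minimal sketch, assuming hypothetical data and names (not Google's actual ranking code), of the pattern Friedberg describes: an editorially curated "one box" result pinned above the algorithmically ranked organic results.

def rank_results(query: str, documents: list[dict], curated: dict[str, dict]) -> list[dict]:
    # Organic ranking: sort by an algorithmic relevance score.
    organic = sorted(documents, key=lambda d: d["score"], reverse=True)
    # Manual intervention: if editors curated a result for this query,
    # it sits above every organic result, like the one box.
    pinned = curated.get(query)
    return ([pinned] + organic) if pinned else organic

if __name__ == "__main__":
    docs = [{"title": "Third-party review site", "score": 0.8},
            {"title": "Personal blog post", "score": 0.6}]
    onebox = {"best pizza": {"title": "Editor-curated local listings", "score": None}}
    for result in rank_results("best pizza", docs, onebox):
        print(result["title"])

The editorial layer sits on top of, and overrides, the algorithm, which is the same distinction being drawn about Twitter: what users saw was partly a product decision, not purely a neutral ranking.
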
00:15:13.960 | And I don't think that this is any different
00:15:15.960 | than what's happened at Twitter.
00:15:17.960 | Twitter is not a government agency.
00:15:19.960 | They're not a free speech, they're not the internet.
00:15:21.960 | They're a product.
00:15:22.960 | And the product managers and the people
00:15:24.960 | that run that product team
00:15:26.960 | ultimately made some editorial decisions
00:15:28.960 | that said, this is the content we do want to show
00:15:30.960 | and this is the content we don't want to show.
00:15:32.960 | And they certainly did wrap up
00:15:34.960 | a bunch of rules that had a lot of leeway
00:15:37.960 | for what they could or couldn't do.
00:15:39.960 | Or they gave themselves a lot of different excuses
00:15:41.960 | on how to do it.
00:15:42.960 | I don't agree with it.
00:15:43.960 | It's not the product I want.
00:15:44.960 | It's not the product I think should exist.
00:15:46.960 | I think Elon also saw that.
00:15:48.960 | And clearly he stepped in and said,
00:15:50.960 | I want to make a product that is a different product
00:15:53.960 | than what is being created today.
00:15:54.960 | So none of this feels to me like these guys
00:15:56.960 | were the guardians of the internet
00:15:58.960 | and they came along and they were distrustful.
00:16:00.960 | They did exactly what a lot of other companies have done
00:16:04.960 | and exactly what they set out to do.
00:16:05.960 | And they editorialized the product
00:16:07.960 | for a certain user group.
00:16:08.960 | And by the way, they never blocked,
00:16:09.960 | they never edited people's tweets.
00:16:11.960 | They changed how people's results
00:16:13.960 | were showing up in rankings.
00:16:14.960 | They showed how viral they would get in the trend box.
00:16:17.960 | Those were in-app features and in-app services.
00:16:20.960 | This was not about taking someone's tweet and changing it.
00:16:23.960 | And people may feel shamed and they may feel
00:16:26.960 | upset about the fact that they were deranked
00:16:30.960 | or they were kind of quote, shadow banned.
00:16:33.960 | But ultimately, that's the product they chose to make
00:16:36.960 | and people have the choice and the option of going elsewhere.
00:16:39.960 | And I don't agree with it.
00:16:40.960 | And it's not the product I want
00:16:41.960 | and it's not a product I want to use.
00:16:43.960 | And I certainly don't feel happy seeing it.
00:16:45.960 | So you want to see,
00:16:47.960 | Freeberg, to summarize it,
00:16:48.960 | you want to see the free market do its job.
00:16:50.960 | Chamath, you worked at Facebook.
00:16:53.960 | Facebook seems to have done, I would say,
00:16:55.960 | an excellent job with content moderation.
00:16:58.960 | I think in large part, correct me if I'm wrong,
00:16:59.960 | because of the real names policy.
00:17:02.960 | But tell us what you think,
00:17:04.960 | when you look at this and the 15 year history
00:17:07.960 | of social media and moderation.
00:17:09.960 | I think moderation is incredibly difficult.
00:17:12.960 | And typically what happens is,
00:17:14.960 | early on in a company's life cycle,
00:17:16.960 | and I'm going to guess that Twitter and YouTube
00:17:20.960 | were very similar to what we did at Facebook.
00:17:23.960 | And it's very similar to probably what TikTok
00:17:25.960 | had to do in the early days, which is,
00:17:27.960 | you have this massive tidal wave of usage.
00:17:31.960 | And so you're always on a little bit of a hamster wheel.
00:17:34.960 | And so you build these very basic tools,
00:17:37.960 | and you uncover problems along the way.
00:17:40.960 | And so I think it's important to humanize
00:17:44.960 | the people that are at Twitter,
00:17:46.960 | because I'm not sure that they're these
00:17:47.960 | super nefarious actors per se.
00:17:49.960 | I do think that they were conflicted.
00:17:51.960 | I do think that they made some very corrupting decisions,
00:17:54.960 | but I don't think that they were these evil actors.
00:17:58.960 | I think that they were folks who,
00:18:01.960 | against the tidal wave of usage,
00:18:03.960 | built some brittle tools, built on top of them,
00:18:05.960 | built on top of it some more,
00:18:07.960 | and tried to find a way of coping.
00:18:09.960 | And as scale increased,
00:18:12.960 | they didn't have an opportunity
00:18:14.960 | to take a step back and reset.
00:18:15.960 | And I think that that's true for all of these companies.
00:18:18.960 | And so you're just seeing it out in the light,
00:18:20.960 | what's happening at Twitter.
00:18:22.960 | But don't for a second think that
00:18:24.960 | any other company behaved any differently.
00:18:26.960 | Google, Facebook, Twitter, ByteDance and TikTok,
00:18:29.960 | they're all the same.
00:18:30.960 | They're all dealing with this problem,
00:18:31.960 | and they're all probably trying to do
00:18:34.960 | a decent job of it, as best as they know how.
00:18:36.960 | So what do we do from here is the question.
00:18:40.960 | The reason somebody needs to do something about this
00:18:43.960 | is summarized really elegantly
00:18:45.960 | in this Jay Bhattacharya tweet.
00:18:48.960 | So please, Nick, just throw it up here
00:18:50.960 | so that we can just talk about this.
00:18:52.960 | This is why I think that this issue is important.
00:18:56.960 | Critically. This is a perfect tweet.
00:18:58.960 | Still trying to process my emotions
00:19:00.960 | on learning that Twitter blacklisted me.
00:19:02.960 | Okay, who cares about that?
00:19:03.960 | Here's what matters.
00:19:04.960 | The thought that will keep me up tonight.
00:19:07.960 | Censorship of scientific discussion
00:19:10.960 | permitted policies like school closures
00:19:13.960 | and a generation of children were hurt.
00:19:16.960 | Now, just think about that in a nutshell.
00:19:19.960 | What was Jay Bhattacharya to do?
00:19:21.960 | Maybe he was supposed to go on TikTok
00:19:23.960 | and try to sound the alarm bells through a TikTok.
00:19:25.960 | Maybe he was supposed to go on YouTube
00:19:27.960 | and create a video.
00:19:28.960 | Maybe he was supposed to go on Facebook
00:19:30.960 | and post into a Facebook group
00:19:33.960 | or do a newsfeed post.
00:19:35.960 | The problem is that,
00:19:37.960 | and the odds are reasonably likely,
00:19:39.960 | that a lot of these companies
00:19:41.960 | had very similar policies.
00:19:42.960 | In this example, around COVID misinformation,
00:19:45.960 | because it was the CDC
00:19:47.960 | and governmental organizations
00:19:49.960 | directing information and rules,
00:19:53.960 | reaching out to all of these companies.
00:19:55.960 | We're just seeing an insight into Twitter,
00:19:57.960 | but the point is it happened everywhere.
00:19:59.960 | The implication of suppressing information like this
00:20:02.960 | is that a credible individual like that
00:20:05.960 | can't spark a public debate.
00:20:08.960 | In not being able to spark the debate,
00:20:12.960 | you have this building up of errors in the system.
00:20:16.960 | Then who gets hurt?
00:20:18.960 | In this example, which is true,
00:20:20.960 | you couldn't even talk about school closures
00:20:22.960 | and masking up front and early in the system.
00:20:26.960 | If you had scientists actually debate it,
00:20:29.960 | maybe what would have happened
00:20:30.960 | is we would have kept the schools open
00:20:31.960 | and you would have had less learning loss
00:20:33.960 | and you'd have less depression
00:20:34.960 | and less overprescription of Ritalin and Adderall,
00:20:38.960 | because those are all factual things
00:20:39.960 | we can measure today.
00:20:41.960 | I think the important thing to take away
00:20:43.960 | from all of this is
00:20:44.960 | we've got confirmatory evidence
00:20:47.960 | that these folks under a tidal wave of pressure
00:20:51.960 | made some really bad decisions.
00:20:54.960 | The implications are pretty broad-reaching.
00:20:57.960 | Now I do think governments have to step in
00:21:00.960 | and create better guardrails
00:21:02.960 | so this kind of stuff doesn't happen.
00:21:04.960 | I don't buy the whole,
00:21:05.960 | "It's a private company.
00:21:07.960 | They can do what they want."
00:21:08.960 | I think that that is too naive of an expectation
00:21:12.960 | for how important these three companies literally are
00:21:16.960 | to how Americans consume and process information
00:21:20.960 | to make decisions.
00:21:21.960 | Incredibly well said. Sax,
00:21:22.960 | your reaction to your besties?
00:21:24.960 | I largely agree with what Chamath said,
00:21:25.960 | but let's go back to what Freeberg said.
00:21:27.960 | I think what Freeberg's point of view is
00:21:28.960 | is really what you're hearing now
00:21:30.960 | from the mainstream media today,
00:21:31.960 | which is, "Oh, nothing to see here.
00:21:34.960 | You know, that they told us all along
00:21:36.960 | what was happening.
00:21:37.960 | This was just content moderation.
00:21:39.960 | They had the right to do this.
00:21:40.960 | You're making a big deal over nothing."
00:21:42.960 | No, that's not true.
00:21:43.960 | Go back and look at the media coverage
00:21:45.960 | starting in 2018.
00:21:46.960 | Article after article said
00:21:48.960 | that this idea of shadow banning
00:21:50.960 | was a right-wing conspiracy theory.
00:21:52.960 | That's what they said.
00:21:53.960 | Furthermore, Jack Dorsey denied
00:21:55.960 | that shadow banning was happening,
00:21:57.960 | including at a congressional hearing,
00:21:58.960 | I believe, under oath.
00:22:00.960 | So either he lied,
00:22:01.960 | or he was lied to by his subordinates.
00:22:04.960 | I actually believe that the latter is possible.
00:22:06.960 | I don't think it's true with SBF.
00:22:08.960 | It might be true with Jack
00:22:09.960 | because he was so checked out.
00:22:10.960 | Furthermore, you had people, again,
00:22:12.960 | like Vijaya Gadde,
00:22:14.960 | again, tweeting and repeatedly stating,
00:22:17.960 | "We do not shadow ban
00:22:18.960 | and we certainly don't shadow ban
00:22:19.960 | on the basis of political viewpoint."
00:22:21.960 | So, these people were denying
00:22:23.960 | exactly what their critics were saying.
00:22:26.960 | They were accusing their critics
00:22:27.960 | of being conspiracy theorists.
00:22:29.960 | Now that the thing is proven,
00:22:31.960 | the mountain of evidence has dropped,
00:22:33.960 | they're saying, "Oh, well, this is old news.
00:22:35.960 | This was known a long time ago."
00:22:36.960 | No, it was not known a long time ago.
00:22:38.960 | It was disputed by you.
00:22:40.960 | And now, finally, it's proven.
00:22:42.960 | And you're trying to say it's not a big deal.
00:22:44.960 | It is a big deal.
00:22:45.960 | It's a violation of the public trust.
00:22:47.960 | And if you are so proud
00:22:49.960 | of your content moderation policies,
00:22:51.960 | why didn't you admit
00:22:52.960 | what you were doing in the first place?
00:22:54.960 | That's what I said.
00:22:55.960 | Don't you feel good
00:22:56.960 | that Elon's running this business now?
00:22:58.960 | I mean, like,
00:22:59.960 | the things that you're concerned about
00:23:00.960 | as a user,
00:23:01.960 | as someone who cares about
00:23:03.960 | the public's access
00:23:05.960 | to knowledge,
00:23:06.960 | to opinions,
00:23:07.960 | to free speech,
00:23:09.960 | this has got to be a good change, right?
00:23:11.960 | Like, this has come to light.
00:23:12.960 | It's clearly going to get resolved.
00:23:14.960 | Everyone's going to move forward.
00:23:15.960 | I mean, do you think that there's penalties
00:23:16.960 | needed for the people that work there?
00:23:17.960 | Or, like,
00:23:18.960 | what's the anger?
00:23:20.960 | Because you won.
00:23:21.960 | Look, I think we got,
00:23:22.960 | I think we basically got extremely lucky
00:23:25.960 | that Elon Musk happened to care
00:23:27.960 | about free speech
00:23:28.960 | and decided to do something about it
00:23:30.960 | and actually had the means
00:23:31.960 | to do something about it.
00:23:32.960 | He's just about the only
00:23:34.960 | billionaire
00:23:35.960 | who has that level of means
00:23:37.960 | who actually cared enough
00:23:38.960 | to take on this battle.
00:23:39.960 | But are you saying that this is a
00:23:40.960 | hard measure for other platforms?
00:23:41.960 | I think he deserves praise for that.
00:23:43.960 | But, I mean,
00:23:45.960 | unless Elon can buy
00:23:46.960 | every single tech company,
00:23:47.960 | which he clearly can't,
00:23:49.960 | I think you guys are right.
00:23:50.960 | This is happening to a lot of other
00:23:51.960 | tech companies.
00:23:52.960 | We're about to rewrite
00:23:53.960 | the government,
00:23:54.960 | the United States government
00:23:55.960 | is going to make an attempt
00:23:56.960 | to rewrite Section 230.
00:23:58.960 | I think that what this does
00:24:00.960 | is put a very fine point
00:24:02.960 | on a comment that Elon actually tweeted out.
00:24:05.960 | And, Nick, if you could find that, please.
00:24:06.960 | That's a very good tweet,
00:24:07.960 | where he said,
00:24:08.960 | "Going forward,
00:24:09.960 | you will be able to see
00:24:10.960 | if you were shadow banned,
00:24:12.960 | you will be able to see
00:24:13.960 | if you were de-boosted.
00:24:16.960 | And be able to appeal."
00:24:17.960 | And I think that that concept,
00:24:19.960 | to be very honest with you,
00:24:21.960 | should be enshrined in law.
00:24:23.960 | And I think that should be part
00:24:25.960 | of the Section 230 rewrite.
00:24:27.960 | And all of these media companies
00:24:29.960 | and all of these social media companies
00:24:31.960 | should be subject to it.
00:24:33.960 | And the reason is because
00:24:34.960 | it ties a lot of these concepts together
00:24:36.960 | and says, "Look,
00:24:38.960 | you can build a service,
00:24:40.960 | you're a private company,
00:24:41.960 | make as much money as you want,
00:24:43.960 | but we're going to have
00:24:44.960 | some connective tissue
00:24:46.960 | back to the fundamental underpinnings
00:24:47.960 | of the Constitution,
00:24:49.960 | which is the framework
00:24:50.960 | under which we all live,
00:24:52.960 | and we're going to transparently
00:24:53.960 | allow you to understand it."
00:24:54.960 | And I think that's really reasonable.
00:24:55.960 | Make that a legal expectation
00:24:57.960 | of all these organizations.
00:24:59.960 | And by the way,
00:25:00.960 | the companies will love it
00:25:02.960 | because I think it's super hard
00:25:04.960 | for you to be in these companies,
00:25:06.960 | and they probably are like,
00:25:07.960 | "Take this responsibility off my plate."
00:25:09.960 | It's very simple.
00:25:10.960 | This is a...
00:25:11.960 | There's really four problems
00:25:12.960 | that occurred here.
00:25:13.960 | Number one,
00:25:14.960 | there was no transparency.
00:25:16.960 | The people who were shadow banned,
00:25:19.960 | taken out of search, etc.,
00:25:21.960 | they did not know.
00:25:22.960 | If they were told,
00:25:24.960 | and it was clear to users,
00:25:26.960 | we could have a discussion about
00:25:27.960 | was that a fair judgment or not.
00:25:29.960 | In the cases we've seen so far
00:25:31.960 | from Bari Weiss's reporting
00:25:33.960 | in the Twitter files part,
00:25:34.960 | it's very clear that these
00:25:36.960 | were not justifiable.
00:25:38.960 | Number two,
00:25:39.960 | these were not evenly enforced.
00:25:41.960 | It's very clear that one side...
00:25:44.960 | 'Cause we don't have one example
00:25:46.960 | of a person on the left
00:25:48.960 | being censored.
00:25:50.960 | If we do,
00:25:51.960 | then we could put
00:25:52.960 | balls and strikes together,
00:25:54.960 | and we could say how many people
00:25:55.960 | on one side versus how many people
00:25:56.960 | on the other.
00:25:57.960 | It's pretty clear what happened here.
00:25:58.960 | Because these all occurred
00:26:00.960 | with a group of people
00:26:01.960 | working at Twitter,
00:26:02.960 | which is 96% or 97%
00:26:05.960 | left-leaning.
00:26:06.960 | The statistics are clear.
00:26:08.960 | Number three,
00:26:09.960 | the shadow banning
00:26:11.960 | and the search banning,
00:26:12.960 | and I think this is something
00:26:13.960 | we talked about previously,
00:26:14.960 | Chamath,
00:26:15.960 | it feels very underhanded.
00:26:16.960 | This was your point.
00:26:17.960 | If we're gonna block people,
00:26:19.960 | they should be blocked,
00:26:21.960 | and they should know why.
00:26:23.960 | The fourth piece of this,
00:26:24.960 | which is absolutely infuriating,
00:26:26.960 | and this is a discussion
00:26:27.960 | that myself, Sax, and Elon
00:26:30.960 | have had many times
00:26:31.960 | about this moderation,
00:26:32.960 | and I'm not speaking out of school now
00:26:34.960 | because he's now very public
00:26:35.960 | with his position,
00:26:36.960 | and his position he came to
00:26:38.960 | on his own.
00:26:39.960 | It's not like this is Sax and I
00:26:42.960 | coming up with these positions.
00:26:43.960 | This is why Elon bought the business.
00:26:46.960 | If you really want to intellectually test
00:26:49.960 | your thinking on this,
00:26:51.960 | and I am a moderate who's left-leaning,
00:26:53.960 | I can tell you there's a simple way
00:26:55.960 | for anybody who is debating
00:26:57.960 | the validity of the concerns here.
00:27:00.960 | Imagine Rachel Maddow or Ezra Klein,
00:27:02.960 | or whoever your favorite left-leaning pundit is,
00:27:05.960 | was shadow-banned
00:27:07.960 | by a group of right-wing moderators
00:27:09.960 | who were acting covertly
00:27:11.960 | and without any transparency.
00:27:13.960 | How would you feel
00:27:15.960 | if Maddow reporting on, you know,
00:27:18.960 | all the Russian coordination
00:27:21.960 | with Trump's campaign did this,
00:27:23.960 | or Ezra Klein with whatever topics he covers,
00:27:26.960 | and you will very quickly find yourself infuriated,
00:27:29.960 | and you should then intellectually,
00:27:31.960 | as we say on this program,
00:27:33.960 | steelmanning,
00:27:34.960 | if you argue the other side,
00:27:35.960 | it's infuriating for either side
00:27:37.960 | to experience this,
00:27:38.960 | and that is what the 230 change needs to be, Chamath.
00:27:41.960 | You're exactly correct.
00:27:43.960 | If you make an action,
00:27:44.960 | it should be listed on the person's profile page
00:27:47.960 | and on the tweet,
00:27:48.960 | and if you click on the question mark,
00:27:50.960 | you should see when the action was taken,
00:27:52.960 | by who,
00:27:53.960 | you know, which department,
00:27:54.960 | maybe not the person,
00:27:55.960 | so they get personally attacked,
00:27:57.960 | and then what the resolution to it is.
00:27:59.960 | This has been banned
00:28:00.960 | because it's targeted harassment.
00:28:02.960 | This can be resolved in this way,
00:28:04.960 | then everybody's behavior would steer
00:28:06.960 | towards whatever the stated purpose
00:28:08.960 | of that social network is.
00:28:10.960 | You can get better behavior
00:28:12.960 | by making the rules clear.
00:28:14.960 | By making the rules unclear
00:28:16.960 | and making it unfair,
00:28:18.960 | you create this insane situation.
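
A minimal sketch, with hypothetical field names, of the transparency record being proposed here: every moderation action carries when it was taken, by which team, under which stated rule, and how it can be appealed, so the affected user and anyone viewing the profile can see it.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ModerationAction:
    account: str
    action: str          # e.g. "do_not_amplify"
    rule: str            # the stated policy the action was taken under
    team: str            # a department, not an individual, to avoid personal attacks
    taken_at: datetime
    resolution: str      # how the user can get the action lifted

    def public_notice(self) -> str:
        # What would be shown on the profile page or next to the tweet.
        return (f"@{self.account}: {self.action} applied {self.taken_at:%Y-%m-%d} "
                f"by {self.team} under '{self.rule}'. Appeal: {self.resolution}")

if __name__ == "__main__":
    action = ModerationAction(
        account="example_user",
        action="do_not_amplify",
        rule="targeted harassment",
        team="Trust & Safety",
        taken_at=datetime.now(timezone.utc),
        resolution="remove the offending tweets and request review",
    )
    print(action.public_notice())
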
00:28:20.960 | Go ahead, Chamath,
00:28:22.960 | and that's why I'm infuriated about it.
00:28:24.960 | I think you have to take it one step further
00:28:26.960 | to really do justice
00:28:27.960 | to why this should be important to everybody,
00:28:30.960 | and I do think this school example,
00:28:32.960 | it really matters to me.
00:28:33.960 | Like, we have, like, I don't know now,
00:28:36.960 | we know what the counterfactual is,
00:28:38.960 | which is that we have,
00:28:40.960 | I mean, we've relegated our children
00:28:42.960 | to a bunch of years
00:28:44.960 | of really complicated relearning
00:28:46.960 | and learning that they never had to go through
00:28:48.960 | because of all the learning loss they gave them,
00:28:50.960 | but what if Jay Bhattacharya,
00:28:53.960 | who's, I mean, like,
00:28:54.960 | you can't be, you know,
00:28:55.960 | have a higher sort of role in society
00:28:57.960 | in terms of, you know, population.
00:28:59.960 | - Pretty good credentials.
00:29:00.960 | - I mean, imagine if, you know,
00:29:02.960 | there was a platform
00:29:03.960 | where he could have actually said this
00:29:04.960 | and then, you know, people would have clamored
00:29:06.960 | and said, "You know what?
00:29:07.960 | "You and Fauci need to get to the bottom of this,"
00:29:09.960 | or where legislators could have seen it
00:29:11.960 | and said, "You know what?
00:29:12.960 | "Before we make a decision like this,
00:29:15.960 | "maybe, hey, Fauci, go talk to Jay
00:29:17.960 | "because he's a Stanford prof.
00:29:18.960 | "He's probably not an idiot.
00:29:20.960 | "Why does he think that?"
00:29:21.960 | Or maybe let's convene, you know,
00:29:23.960 | an actual group of 20 or 30 scientists
00:29:26.960 | and the fact that this one version
00:29:29.960 | of thinking about things
00:29:30.960 | was deemed so heterodoxical,
00:29:32.960 | it is just such a good example
00:29:34.960 | because you-- - They shut down
00:29:35.960 | an important conversation.
00:29:36.960 | - You know that the decision was so wrong
00:29:39.960 | and the damage was so severe.
00:29:41.960 | - Yes. - So we know what happened
00:29:43.960 | by suppressing that speech.
00:29:44.960 | - And that's one example.
00:29:46.960 | - Well, it's, in my estimation,
00:29:48.960 | it is the silver bullet example
00:29:49.960 | that cuts through all of this other stuff
00:29:51.960 | because, you know,
00:29:52.960 | I don't really care if it's Rachel Maddow or Ezra Klein.
00:29:54.960 | Who the hell cares?
00:29:55.960 | This is important stuff
00:29:56.960 | because it affects everybody
00:29:57.960 | irrespective of your political persuasion
00:29:59.960 | and what editorial you want to read.
00:30:01.960 | - Chamath, what if the investigation
00:30:03.960 | into the Catholic Church
00:30:04.960 | and the abuses that occurred there,
00:30:06.960 | somebody said, "Oh, this person,
00:30:08.960 | "it needs to be shut down,"
00:30:09.960 | and then children are molested for another decade?
00:30:12.960 | By the way, we have an example of that.
00:30:13.960 | Sinead O'Connor came out on SNL.
00:30:15.960 | You can look it up for if you're under 40 years old
00:30:17.960 | and said, "Fight the real enemy."
00:30:18.960 | She ripped up a picture of the Pope
00:30:20.960 | because of the scandals there.
00:30:21.960 | She was excommunicated.
00:30:23.960 | She was canceled at that time,
00:30:24.960 | one of the first people to be canceled
00:30:26.960 | because she spoke truth to power.
00:30:27.960 | What if somebody,
00:30:28.960 | an investigative journalist at the New York Times,
00:30:30.960 | or the Boston Globe, as in the movie Spotlight.
00:30:32.960 | Those are the people who broke the story
00:30:34.960 | of the Catholic Church.
00:30:35.960 | If somebody came in
00:30:36.960 | and the Catholic Church put pressure
00:30:37.960 | on a social network and said,
00:30:38.960 | "Hey, you can't put this stuff up here.
00:30:40.960 | "You can't have this discussion of abuse."
00:30:42.960 | - Here's another example.
00:30:43.960 | - It's infuriating.
00:30:44.960 | Why are we shutting down discussions in America?
00:30:45.960 | - Remember the Vietnam Papers?
00:30:46.960 | - Well, because, J. Cal,
00:30:47.960 | the media does not value transparency anymore.
00:30:50.960 | If you go back and look
00:30:51.960 | at the way the media portrays itself,
00:30:53.960 | like in the movie, "The Post,"
00:30:55.960 | which is about the revelations
00:30:56.960 | about the Catholic Church,
00:30:58.960 | where you go back to all the president's men,
00:31:00.960 | what the media prized
00:31:02.960 | and what they congratulated themselves on
00:31:04.960 | was, first of all, transparency
00:31:07.960 | and exposing the lies of powerful people.
00:31:09.960 | Well, that is exactly what has happened here.
00:31:12.960 | The lies of the powerful group of people
00:31:14.960 | who were running Twitter policy
00:31:16.960 | and suppressing one side of the debate
00:31:19.960 | has been exposed,
00:31:21.960 | and the media is treating it with a yawn,
00:31:23.960 | like there's nothing to see here.
00:31:25.960 | Because they were complicit in this.
00:31:26.960 | They were complicit in suppressing the views
00:31:29.960 | of people like Jay Bhattacharya.
00:31:31.960 | They were complicit in choosing the views
00:31:33.960 | of Fauci and the elite on COVID.
00:31:36.960 | And so they have no interest now
00:31:37.960 | in bringing and making
00:31:40.960 | what's happened here at Twitter fully transparent.
00:31:42.960 | - They have to own it.
00:31:43.960 | I think, by the way,
00:31:44.960 | just a quick correction there.
00:31:45.960 | I think, Sax, when you said "The Post,"
00:31:47.960 | Washington Post, Watergate, Spotlight, exactly.
00:31:49.960 | - Oh, I'm not even thinking about Spotlight, sorry.
00:31:51.960 | It may have been Spotlight.
00:31:52.960 | - Yeah, that's what I was going to say.
00:31:53.960 | - Okay.
00:31:54.960 | - But "The Post" is another example.
00:31:55.960 | That movie was about another event like this,
00:31:58.960 | which could have been easily suppressed
00:31:59.960 | in today's world, much harder there,
00:32:02.960 | which was the Pentagon Papers.
00:32:03.960 | And in that world, you know,
00:32:05.960 | there was an immense amount of pressure
00:32:07.960 | that the government put on "The Washington Post,"
00:32:09.960 | but then they said, "You know what?
00:32:10.960 | We're going with it," and they still published it.
00:32:12.960 | And it created a groundswell of support
00:32:14.960 | to really reexamine the Vietnam War,
00:32:15.960 | and it had a huge impact.
00:32:17.960 | But could you imagine this time around,
00:32:18.960 | which is like, "Hey, guys,
00:32:19.960 | there's going to be some kind of misinformation.
00:32:21.960 | You know, these Pentagon Papers are not real.
00:32:23.960 | It's coming from the Russians.
00:32:24.960 | Suppress it."
00:32:26.960 | And nobody could--
00:32:27.960 | It's so much easier now to run this play.
00:32:29.960 | - Right.
00:32:30.960 | - What journalists need to realize
00:32:32.960 | is that today's conspiracy theories
00:32:34.960 | are tomorrow's Pulitzer Prizes.
00:32:36.960 | On to you, Sax.
00:32:37.960 | - Not in the current media environment.
00:32:39.960 | They work for these corporations,
00:32:41.960 | and they don't get rewarded for telling the truth.
00:32:44.960 | - Oh, no, they're going for Pulitzers.
00:32:46.960 | Trust me, they are.
00:32:47.960 | And what they need to do is stop thinking short-term
00:32:50.960 | and think long-term.
00:32:51.960 | Anytime there's a conspiracy theory,
00:32:53.960 | you must give it some validity and say,
00:32:56.960 | "Is there any truth here?"
00:32:57.960 | Because it could, in fact, be a scandal
00:32:59.960 | that's being covered up.
00:33:00.960 | And if you take the approach of shutting down everyone--
00:33:02.960 | - They're involved in the cover-up right now.
00:33:04.960 | They're involved in the cover-up right now.
00:33:05.960 | - This is a cover-up. I agree.
00:33:07.960 | I'm in agreement with you.
00:33:08.960 | - Let's bring the first batch of Twitter files
00:33:10.960 | into the conversation,
00:33:11.960 | the one that Matt Taibbi exposed.
00:33:13.960 | What he did was confirm that a completely true story
00:33:17.960 | by the New York Post about Hunter Biden
00:33:19.960 | that came out a month before the election
00:33:21.960 | was suppressed by Twitter executives,
00:33:23.960 | including at the behest of, you know,
00:33:26.960 | of FBI agents and former security state officials.
00:33:31.960 | So this has now been exposed.
00:33:33.960 | There was no legitimate basis for suppressing that story.
00:33:37.960 | It was true.
00:33:38.960 | It was a respected publication.
00:33:39.960 | They did it anyway.
00:33:40.960 | This is election interference.
00:33:42.960 | You know, the same people who pride themselves
00:33:45.960 | on strengthening democracy
00:33:47.960 | are engaged in this wide-scale censorship
00:33:50.960 | of one side of the political debate,
00:33:52.960 | including of true stories before an election,
00:33:54.960 | and then they puff out their chest and say,
00:33:56.960 | "We're protecting democracy."
00:33:57.960 | They're not protecting democracy.
00:33:59.960 | They're interfering with democracy.
00:34:00.960 | They're interfering with the public's right to know.
00:34:03.960 | And then we look at a country like China,
00:34:05.960 | and we say, "Well, we're so much better than them,"
00:34:07.960 | because they've got this problem over there
00:34:09.960 | where the state and big tech are colluding
00:34:12.960 | to create a Big Brother-like system.
00:34:14.960 | Well, what is this?
00:34:16.960 | What are these tools that have been exposed?
00:34:18.960 | This is a Big Brother-like system.
00:34:20.960 | Okay, yeah, but just you have to--
00:34:22.960 | I know you want to make it like an equivalency.
00:34:24.960 | It's less than a 1% equivalency
00:34:26.960 | because in our society, we can have moments like this,
00:34:29.960 | and we can have investigations.
00:34:30.960 | So just to put it in perspective--
00:34:32.960 | Yeah, look, Jake, I don't think we're equivalent,
00:34:34.960 | but what I'm saying is that this is
00:34:37.960 | very much like a Big Brother social credit system
00:34:40.960 | that was being perpetrated--
00:34:41.960 | Yes, alarm bells should be going off.
00:34:43.960 | There should be an alarm bell going off.
00:34:45.960 | And if Elon didn't decide--
00:34:47.960 | Just we had this one idiosyncratic billionaire
00:34:50.960 | who believes in free speech.
00:34:51.960 | If he didn't decide to take this on,
00:34:53.960 | we would never have known this stuff.
00:34:55.960 | Okay, tell me what happened in between these two things.
00:34:57.960 | There is an attorney at Twitter,
00:35:01.960 | and I don't know the details of this.
00:35:03.960 | Right, okay, so this is interesting.
00:35:04.960 | I do not work for the Twitter Corporation.
00:35:06.960 | I do not speak for the Twitter Corporation.
00:35:07.960 | Saks does not work for the Twitter Corporation
00:35:09.960 | and does not speak for it.
00:35:10.960 | But there was, in between these two drops,
00:35:12.960 | something that happened.
00:35:13.960 | Yes, so basically, what was discovered--
00:35:16.960 | and this is all just publicly reported--
00:35:18.960 | is that a former FBI lawyer named Jim Baker
00:35:21.960 | had now become Deputy General Counsel at Twitter.
00:35:26.960 | And this guy, Jim Baker, is like the zealot
00:35:28.960 | of the whole Russian collusion hoax.
00:35:30.960 | He was involved in the FISA warrants
00:35:34.960 | that the FBI applied to the FISA courts
00:35:37.960 | that had all the errors and omissions.
00:35:39.960 | He was involved in the Alphabank hoax.
00:35:41.960 | He was the guy that that Perkins Coie lawyer,
00:35:45.960 | Sussman, was feeding this phony scam to.
00:35:51.960 | And I don't think he was officially sanctioned,
00:35:54.960 | but basically he was asked to leave the FBI.
00:35:56.960 | And then, lo and behold, where does he land?
00:35:59.960 | At Twitter.
00:36:01.960 | And he is involved in their content moderation policies.
00:36:04.960 | I think what it shows is how deeply intertwined
00:36:07.960 | our big tech companies have become
00:36:10.960 | with the security state.
00:36:12.960 | Now, how did this get exposed?
00:36:14.960 | Well, Barry Weiss was basically putting forward
00:36:18.960 | document requests for the latest batch of Twitter files,
00:36:22.960 | and she wasn't getting anything back.
00:36:24.960 | And she's like, "What's going on here?"
00:36:26.960 | And the guy who's giving her the files, his name is Jim.
00:36:29.960 | And she's like, "Well, wait, wait, Jim who?"
00:36:32.960 | And she finds out, "Wait, Jim Baker?
00:36:34.960 | Wait, that Jim Baker?"
00:36:36.960 | The New York Post had a long story about this guy.
00:36:39.960 | And so it was discovered that the guy who was curating
00:36:43.960 | the Twitter files was this former operative
00:36:47.960 | of the FBI who was involved in the Russian collusion hoax
00:36:50.960 | and then was involved in their blacklist decisions.
00:36:56.960 | So in any event, once this came out,
00:37:00.960 | Twitter fired him.
00:37:02.960 | And then, you know, Barry apparently received
00:37:05.960 | all these files that are now the second batch
00:37:07.960 | of the Twitter files.
00:37:09.960 | And just to be clear, that's not James Baker
00:37:11.960 | if you're, you know, thinking it's the former Reagan
00:37:15.960 | cabinet member, not James Baker.
00:37:17.960 | This is Jim Baker, who's a different person.
00:37:19.960 | Right, but a lot of people are wondering,
00:37:21.960 | "Well, how could this have been missed?"
00:37:23.960 | He's an ex-FBI agent.
00:37:24.960 | These guys don't want to be found.
00:37:26.960 | I mean, this is, some people call it, you know,
00:37:29.960 | the permanent Washington establishment.
00:37:31.960 | Some people call it the deep state.
00:37:33.960 | The administrations come and go.
00:37:35.960 | The people who work in Washington stay there forever.
00:37:37.960 | And they can simply effectuate policy
00:37:39.960 | by outlasting everybody else and clandestinely
00:37:42.960 | implementing what they believe.
00:37:44.960 | And they've become a constituency of their own
00:37:47.960 | that exercises power like a Praetorian Guard
00:37:50.960 | in Washington.
00:37:51.960 | So in any event, this guy is an expert at
00:37:54.960 | boll-weeviling himself into the bureaucracy.
00:37:56.960 | Great. Praetorian Guard boll-wee-
00:37:59.960 | You're on fire right now.
00:38:00.960 | Hold on a second.
00:38:01.960 | So, hold on.
00:38:02.960 | Say the two words again.
00:38:03.960 | Hold on a second.
00:38:04.960 | When they finally rooted, when they finally rooted
00:38:07.960 | this mole out of the FBI, he boll-weevils himself
00:38:11.960 | into another powerful bureaucracy.
00:38:13.960 | What is that word? Boll weevil?
00:38:15.960 | Boll weevil.
00:38:16.960 | Boll weevil?
00:38:17.960 | Like burrows.
00:38:18.960 | Like burrows.
00:38:20.960 | Like burrows. Like that.
00:38:22.960 | So, he digs his way into the Twitter bureaucracy
00:38:27.960 | to the point where he isn't even found.
00:38:29.960 | And then somehow, he has put himself in the position
00:38:32.960 | to be intermediating the Twitter files.
00:38:35.960 | Can you believe this?
00:38:38.960 | Praetorian Guard.
00:38:39.960 | So once it was discovered,
00:38:40.960 | a unit of the Imperial Roman Army
00:38:42.960 | that served as personal bodyguards and intelligence agents.
00:38:45.960 | The Praetorian Guard. Okay. Got it.
00:38:46.960 | Well, you understand what happened is that
00:38:48.960 | the Praetorian Guard originated because they were
00:38:50.960 | to defend the life of the Emperor.
00:38:52.960 | And then what happened...
00:38:53.960 | That's SPQR.
00:38:54.960 | Then they became so powerful that
00:38:56.960 | whoever bribed the Praetorians would become Emperor.
00:39:00.960 | And then finally, the last step is that
00:39:02.960 | the Praetorians themselves would pick the Emperor.
00:39:04.960 | And whoever basically led the Praetorian Guard
00:39:07.960 | would be the next Emperor.
00:39:08.960 | In any event, I mean, we're not at that point yet.
00:39:12.960 | But the point is that these security state officials
00:39:16.960 | have power that they should not have.
00:39:19.960 | Okay. That's the bottom line.
00:39:20.960 | They should not be involved in our elections in this way.
00:39:25.960 | They should be completely nonpartisan and nonpolitical.
00:39:29.960 | They should just do their jobs as law enforcement officials.
00:39:31.960 | But we know from the Hunter Biden story
00:39:34.960 | that a very important piece of this was the pre-bunking
00:39:37.960 | that the FBI went to Facebook and Twitter and social networks
00:39:40.960 | and said, "Be on the lookout for a story about Hunter Biden.
00:39:44.960 | It is Russian disinformation."
00:39:46.960 | And they primed these social networks
00:39:48.960 | to suppress that story when it came out.
00:39:50.960 | That was something they never should have done.
00:39:52.960 | And they knew. They knew the story was not fake.
00:39:54.960 | They knew it was not Russian disinformation
00:39:56.960 | because they had the laptop in their possession since 2019.
00:39:59.960 | Well, okay. That has not...
00:40:00.960 | The provenance of the laptop is still being reviewed, in fairness.
00:40:04.960 | No, sir. You're wrong.
00:40:05.960 | Hold on. And there is an investigation going on of Hunter Biden.
00:40:08.960 | You also have to put the context in here.
00:40:10.960 | And please let me finish.
00:40:12.960 | There is a context here of there was massive election interference going on.
00:40:16.960 | Both sides of the aisle, Republicans, Democrats,
00:40:18.960 | all wanted to see the Russian interference
00:40:21.960 | and the Ukrainian interference
00:40:23.960 | and Trump's encouraging the Ukraine and the Russians
00:40:27.960 | to interfere in elections.
00:40:28.960 | Everybody was on high alert.
00:40:30.960 | And that happened to drop, like it was announced 30 days before,
00:40:34.960 | and it dropped 10 days before the election.
00:40:35.960 | So everybody was on high alert.
00:40:36.960 | And I agree it was not done properly.
00:40:38.960 | That's why it was the perfect excuse.
00:40:40.960 | It should have been done properly.
00:40:43.960 | They should have said, they should have come out public and say,
00:40:45.960 | "We don't know the providence of this.
00:40:47.960 | It could be hacked. It might not be hacked."
00:40:49.960 | Jason, they knew.
00:40:51.960 | Let's wait and see. We have to reserve judgment.
00:40:53.960 | No, listen. Let me tell you what happened.
00:40:54.960 | Let me just tell you what happened, okay?
00:40:56.960 | And make sure you source this.
00:40:58.960 | I will.
00:40:59.960 | So, look, it's all in the New York Post, okay?
00:41:01.960 | They've done a great...
00:41:02.960 | Oh, great.
00:41:03.960 | No, nobody has refuted it.
00:41:05.960 | Nobody has refuted it.
00:41:06.960 | It's a superfluous paper.
00:41:07.960 | Let me just tell you what happened.
00:41:08.960 | Let me just get this on the record here.
00:41:10.960 | From the Post.
00:41:12.960 | The FBI was given the laptop in 2019 by the laptop store owner.
00:41:17.960 | Those guys have forensics.
00:41:19.960 | They have cyber experts.
00:41:20.960 | They knew the laptop was real.
00:41:22.960 | We know it's real now.
00:41:23.960 | Nobody questions that.
00:41:24.960 | In fact, the FBI has admitted that the laptop was real
00:41:27.960 | and that the Hunter Biden files are real.
00:41:29.960 | Nobody disputes that, okay?
00:41:31.960 | But what they did before the election is
00:41:33.960 | they used this excuse of Russian disinformation
00:41:36.960 | to discredit the story before it even came out.
00:41:38.960 | But they had no business getting involved in the story that way.
00:41:41.960 | They simply didn't.
00:41:42.960 | They should have stayed out of it completely.
00:41:44.960 | I don't understand how you can possibly justify that.
00:41:47.960 | Yeah, I mean, I think we do have to look at the context
00:41:49.960 | of that time period when Hillary's emails were hacked
00:41:52.960 | and we had a president...
00:41:53.960 | That's why it's a perfect excuse.
00:41:54.960 | Well, I didn't finish the sentence.
00:41:55.960 | And we had a president, which you will agree,
00:41:58.960 | our presidents and presidential candidates
00:41:59.960 | should not be encouraging foreign powers
00:42:01.960 | to hack their adversaries.
00:42:04.960 | Do you agree with that?
00:42:05.960 | This is the election.
00:42:06.960 | Do you agree with that? Answer my question.
00:42:07.960 | Do you agree that presidential...
00:42:08.960 | You're still wrapped up in...
00:42:10.960 | Answer the question.
00:42:11.960 | Why do you have to divert?
00:42:12.960 | No, Jake, I know what you're doing.
00:42:13.960 | You're going to personally attack me.
00:42:14.960 | No, I'm not.
00:42:15.960 | Don't personally attack me. Just answer the question.
00:42:16.960 | Should presidential candidates...
00:42:17.960 | This is your election denial for 2016.
00:42:18.960 | You're still wrapped up on this.
00:42:19.960 | You can't let it go.
00:42:20.960 | Again, you personally attack me,
00:42:21.960 | you don't answer the question.
00:42:22.960 | That's fine. We'll move on.
00:42:23.960 | You can't be intellectually honest?
00:42:24.960 | That's fine.
00:42:25.960 | The audience knows you're not being intellectually honest.
00:42:26.960 | Let's move on.
00:42:27.960 | You don't even know what you're talking about.
00:42:29.960 | If you could answer the simple question,
00:42:30.960 | should presidents encourage foreign powers
00:42:33.960 | to hack their adversaries,
00:42:34.960 | then you would be being intellectually honest.
00:42:36.960 | I am absolutely disappointed
00:42:38.960 | that you will not answer that simple question.
00:42:40.960 | It's an obvious yes.
00:42:41.960 | It's an obvious yes.
00:42:42.960 | We don't want people doing it.
00:42:43.960 | Of course, but I don't really believe that happened.
00:42:45.960 | Then say it.
00:42:46.960 | You won't say it because you know Trump's going to win the primary.
00:42:49.960 | Let's keep going.
00:42:50.960 | China has...
00:42:51.960 | Honestly, I don't...
00:42:52.960 | Wait, wait, wait, wait, wait, wait.
00:42:53.960 | Listen, I've said so many times on this show
00:42:55.960 | that he's not my candidate.
00:42:57.960 | I don't know what you're talking about.
00:42:58.960 | You're going all the way back.
00:42:59.960 | He'll wind up being your candidate when he wins.
00:43:01.960 | What you're doing right now is delusional.
00:43:03.960 | You're going back to some throwaway comment
00:43:05.960 | he made at a rally in 2016.
00:43:06.960 | It's got nothing, absolutely nothing to do with the story.
00:43:09.960 | And the fact you're even bringing it up is pure TDS.
00:43:12.960 | And I don't want to waste any time talking about it.
00:43:14.960 | Here you go, name calling instead of answering a question.
00:43:16.960 | That's your technique.
00:43:17.960 | Your technique is to call me names
00:43:19.960 | instead of answering the question.
00:43:21.960 | I want to un-muddy the waters.
00:43:22.960 | I want to make one more point about this.
00:43:23.960 | Un-muddy the waters.
00:43:24.960 | Another technique that I'm muddying the waters.
00:43:25.960 | I'm not muddying the waters.
00:43:26.960 | You are.
00:43:27.960 | I don't know what this has to do.
00:43:28.960 | Let's move on.
00:43:29.960 | Let's move on.
00:43:30.960 | I want to make one final point.
00:43:31.960 | Okay, I'll make a final point.
00:43:32.960 | There was a letter.
00:43:33.960 | Listen, there was a letter with this Hunter Biden thing.
00:43:36.960 | This is 2020 election Jason.
00:43:38.960 | We're not going back two elections ago.
00:43:40.960 | I want to talk about the most recent one.
00:43:41.960 | Okay, fine.
00:43:42.960 | You had Clapper, you had Comey,
00:43:43.960 | you had 50 of these security state officials.
00:43:45.960 | They write a letter saying that the Hunter Biden story
00:43:48.960 | has all the hallmarks of Russian disinformation.
00:43:52.960 | They claim that it was Russian disinformation.
00:43:54.960 | When it wasn't, they knew it wasn't.
00:43:56.960 | And it was the same story that the FBI was telling Twitter.
00:44:00.960 | And it was the same story that these Twitter executives
00:44:02.960 | were indulging in.
00:44:03.960 | Even though they all knew or had reason to know,
00:44:06.960 | it wasn't true.
00:44:07.960 | And they suppressed the New York Post story anyway.
00:44:09.960 | I don't know why you're bringing up this Trump stuff.
00:44:12.960 | It has nothing to do with the real issue here.
00:44:15.960 | Hold on a second.
00:44:16.960 | The real issue is this.
00:44:17.960 | Does social media have the right to suppress true stories
00:44:21.960 | put out by our media the month before an election?
00:44:25.960 | Yes or no?
00:44:26.960 | How do you defend that?
00:44:28.960 | I will answer your question yes or no,
00:44:30.960 | and you will not answer mine
00:44:31.960 | because you're being intellectually dishonest.
00:44:33.960 | Yes, we should.
00:44:34.960 | No, we should not suppress news stories.
00:44:36.960 | If it was, and I will argue both sides,
00:44:38.960 | if it was Snowden, if it was the Pentagon Papers,
00:44:41.960 | if it's Hunter Biden's laptop taking out the sex stuff,
00:44:44.960 | which we both agree on,
00:44:45.960 | or if it is Russia and Ukraine,
00:44:49.960 | where your presidential candidate at the time,
00:44:52.960 | Trump, asked Zelensky to find dirt on Biden
00:44:58.960 | before the election,
00:44:59.960 | and he asked the Russians to hack Hillary's email,
00:45:01.960 | and they did that,
00:45:02.960 | and they released it 10 days before the election.
00:45:04.960 | That is facts that happened,
00:45:05.960 | and that is the context.
00:45:06.960 | No, wait a second.
00:45:07.960 | That's not what this was.
00:45:08.960 | That's not what this was.
00:45:09.960 | You said you would let me speak,
00:45:10.960 | and you will let me speak.
00:45:11.960 | You're muddying the waters.
00:45:12.960 | No, stop interrupting me,
00:45:14.960 | and stop insulting me.
00:45:15.960 | I will say my part, you said yours,
00:45:17.960 | and then we will move on.
00:45:19.960 | The fact is,
00:45:21.960 | Trump encouraged hacking of other candidates,
00:45:25.960 | and he did it twice in a four-year period,
00:45:27.960 | back-to-back elections.
00:45:29.960 | We need to be on high alert
00:45:30.960 | when you have a Republican candidate,
00:45:33.960 | Trump, doing something so absolutely treasonous.
00:45:39.960 | Period. I'm done.
00:45:40.960 | This is why it was the perfect cover story.
00:45:41.960 | This is why it was the perfect cover story,
00:45:43.960 | is because people like you are on--
00:45:44.960 | But you would address the treasonous behavior.
00:45:46.960 | Let's move on.
00:45:47.960 | Listen, I don't think it was a perfect phone call.
00:45:49.960 | I think it was--
00:45:50.960 | Treasonous.
00:45:51.960 | There were lots of shenanigans.
00:45:52.960 | There were lots of shenanigans.
00:45:53.960 | It's called treason.
00:45:54.960 | Hold on. I'm not defending anything Trump did, okay?
00:45:56.960 | I don't feel the need, okay?
00:45:57.960 | I never defended it.
00:45:59.960 | But the deal is that you're letting your TDS justify--
00:46:03.960 | I don't have TDS.
00:46:04.960 | He's treasonous.
00:46:05.960 | You're allowing this Russian disinformation
00:46:08.960 | to be a cover story for what they all did.
00:46:10.960 | No, I said I don't think the Post should have been blocked.
00:46:13.960 | You're misconstruing what I'm saying.
00:46:14.960 | Then why are you even bringing this up?
00:46:15.960 | It was not Russian disinformation.
00:46:17.960 | Because the context under which--
00:46:18.960 | The reason I'm bringing--
00:46:19.960 | I agree that the Post shouldn't have been blocked.
00:46:21.960 | The context made it a great cover story.
00:46:22.960 | That's your interpretation.
00:46:24.960 | The context also is everybody was on high alert
00:46:27.960 | waiting for a hack to drop,
00:46:29.960 | and in fact, a hack dropped 10 days before.
00:46:31.960 | You have to--
00:46:32.960 | That was not a hack.
00:46:33.960 | Okay, we found out subsequently it was not a hack.
00:46:35.960 | That's why there needed to be time.
00:46:36.960 | They knew at the time is my point.
00:46:37.960 | They knew at the point.
00:46:38.960 | Twitter and Facebook did not know.
00:46:40.960 | Twitter and Facebook didn't know.
00:46:41.960 | That's the point.
00:46:42.960 | They don't know. They're not in the FBI.
00:46:43.960 | Hold on a second. No, no, no, no, no, no, no.
00:46:45.960 | Taibbi, go back to the Twitter files.
00:46:47.960 | The first drop.
00:46:48.960 | Jim Baker, hold on a second.
00:46:50.960 | Jim Baker and Vijaya Gadde said, okay,
00:46:53.960 | that there were a lot of internal questions
00:46:56.960 | about whether that Hunter Biden story
00:46:58.960 | could be justified under the hacked policy.
00:47:02.960 | Okay?
00:47:03.960 | And there were many legitimate questions
00:47:04.960 | raised internally about whether
00:47:06.960 | they could maintain that party line.
00:47:09.960 | And the emerging view was that
00:47:10.960 | they could no longer maintain that line
00:47:12.960 | and still Gadde and Jim Baker said, no,
00:47:14.960 | we will maintain the idea that this was
00:47:16.960 | hacked information until proven otherwise.
00:47:19.960 | Even though it was not hacked,
00:47:20.960 | it was a New York Post story.
00:47:22.960 | Okay, let's move on.
00:47:23.960 | We'll agree to disagree.
00:47:24.960 | Let's move on.
00:47:25.960 | Why are you bringing up all this
00:47:26.960 | like irrelevant stuff?
00:47:27.960 | I think the audience and the other besties
00:47:28.960 | want us to move on.
00:47:29.960 | So let's move on.
00:47:30.960 | China ends most zero COVID rules
00:47:31.960 | and Iran might be abolishing its morality police.
00:47:34.960 | News broke in the past week.
00:47:37.960 | On Wednesday, China's health authorities
00:47:38.960 | overhauled its zero COVID policy
00:47:40.960 | and announced a 10 point national plan
00:47:42.960 | that scrapped most health code tracking.
00:47:44.960 | And also they're rolling back
00:47:47.960 | their mass testing.
00:47:48.960 | And this allowed many positive cases
00:47:51.960 | to just simply quarantine at home
00:47:52.960 | like we were doing, I guess a year ago now.
00:47:55.960 | And they're limiting some of these lockdowns.
00:47:59.960 | This all comes from a Foxconn letter,
00:48:02.960 | which, we don't know the causation here.
00:48:06.960 | Does it?
00:48:07.960 | Well, we don't know.
00:48:08.960 | That's why I just said,
00:48:09.960 | we don't know cause and correlation here.
00:48:11.960 | Give us some perspective here, Chama.
00:48:13.960 | Well, I just think it's kind of ridiculous
00:48:17.960 | to assume that the second largest economy
00:48:20.960 | in the world pivots based on one letter
00:48:24.960 | from one CEO.
00:48:25.960 | So I know that that's how the Western media-
00:48:27.960 | Describe the letter, please.
00:48:28.960 | Well, apparently what happened was Terry Gou,
00:48:31.960 | who's colloquially known as Uncle Terry,
00:48:33.960 | who's the CEO of Foxconn,
00:48:35.960 | wrote a letter that essentially said,
00:48:37.960 | if we don't figure out a way
00:48:38.960 | to get out of this lockdown process,
00:48:41.960 | we're going to lose our leadership
00:48:44.960 | in the global supply chain.
00:48:46.960 | And apparently that jolted
00:48:48.960 | the Central Planning Commission
00:48:49.960 | to realize that they needed to get out
00:48:52.960 | of these lockdowns.
00:48:53.960 | I think it's something different,
00:48:54.960 | which is I think this has been part and parcel
00:48:57.960 | of a very focused and dedicated plan by Xi.
00:49:02.960 | Phase one was to consolidate power.
00:49:05.960 | Phase two was to get through November
00:49:08.960 | and to basically get reappointed for life
00:49:11.960 | and dispel any other, you know,
00:49:13.960 | rivals that he actually had.
00:49:16.960 | And now phase three is just to reopen
00:49:18.960 | the economy again so this guy can basically
00:49:20.960 | sit on top of the second largest economy
00:49:24.960 | in the world.
00:49:25.960 | So I think this is sort of a natural
00:49:27.960 | flow of things.
00:49:29.960 | The other part of it,
00:49:30.960 | which I think is being underreported,
00:49:32.960 | is I think that the way in which they did it
00:49:35.960 | was less responsive in my opinion
00:49:37.960 | to a letter from Uncle Terry,
00:49:39.960 | but was more responsive to the fact
00:49:40.960 | that there are people on the ground.
00:49:41.960 | And I think that these guys are getting
00:49:43.960 | very sophisticated in understanding
00:49:46.960 | how to give the Chinese people
00:49:47.960 | some part of what they want
00:49:50.960 | so that they're roughly happy enough
00:49:52.960 | to keep moving forward.
00:49:53.960 | And I'm not going to morally judge
00:49:54.960 | whether it's right or wrong,
00:49:55.960 | but it's just a comment on
00:49:57.960 | what the gameplay and the game theory
00:49:58.960 | seems to be coming from
00:50:00.960 | the leadership of China.
00:50:02.960 | So I think this is,
00:50:04.960 | it's good for the Chinese people
00:50:07.960 | and the real question is
00:50:08.960 | what will it mean for the US economy
00:50:10.960 | if these guys get their economy going again.
00:50:14.960 | We talked about this previously,
00:50:16.960 | but this is a good example
00:50:18.960 | of the autocrat
00:50:21.960 | not necessarily being absolute
00:50:24.960 | in their authority.
00:50:27.960 | And the sense that I think we get
00:50:30.960 | at this point coming out of China
00:50:33.960 | is that there was enough dissent
00:50:35.960 | from the populace on the lockdown
00:50:37.960 | and the experience of the lockdowns
00:50:38.960 | and we can all go online
00:50:39.960 | and see the videos of
00:50:41.960 | steel bars being put on doors
00:50:43.960 | to keep people in their apartment buildings
00:50:44.960 | and people screaming
00:50:45.960 | and buildings being on fire,
00:50:47.960 | people can't escape the buildings.
00:50:48.960 | How much of that was true or not
00:50:50.960 | and riots in the street
00:50:51.960 | and people fighting with the COVID testers.
00:50:53.960 | How much of it is true or not
00:50:55.960 | we don't really know.
00:50:57.960 | But it certainly seems to indicate
00:50:58.960 | that there was enough dissent
00:51:00.960 | and enough unrest
00:51:01.960 | that in order to stay in power
00:51:03.960 | the CCP had to take action
00:51:05.960 | and they had to shift their position
00:51:07.960 | and shift their tone.
00:51:08.960 | And I think it's a really important moment
00:51:10.960 | to observe
00:51:12.960 | that sometimes the CCP
00:51:14.960 | and perhaps even we can extend this
00:51:18.960 | into other autocratic regimes
00:51:20.960 | that we think are absolute
00:51:22.960 | in their authority and their power
00:51:24.960 | perhaps are necessarily influenced
00:51:27.960 | by the people that they are there to govern
00:51:29.960 | and that they are
00:51:31.960 | you know, ruling over.
00:51:33.960 | And that
00:51:35.960 | while we don't think about these places
00:51:37.960 | as democracies
00:51:38.960 | perhaps they're not entirely
00:51:40.960 | the traditionally defined autocracy.
00:51:43.960 | That there is an influence
00:51:45.960 | that the people can have.
00:51:46.960 | And maybe we see the same change
00:51:47.960 | happening in Iran
00:51:49.960 | with young people
00:51:51.960 | and a population that's more modern
00:51:53.960 | that's growing and swelling in size
00:51:55.960 | that doesn't want to accept some of the
00:51:57.960 | traditional norms and the traditional
00:51:59.960 | laws.
00:52:01.960 | And maybe that will start to resonate
00:52:03.960 | around the world
00:52:04.960 | that the internet
00:52:05.960 | is starting to do what everyone hoped
00:52:07.960 | and wanted it to do
00:52:08.960 | which is the democratization of information
00:52:10.960 | the democratization
00:52:11.960 | of seeing other people's conditions
00:52:13.960 | and seeing what the rest of the world is
00:52:15.960 | and is like
00:52:16.960 | gives the populace the ability
00:52:18.960 | to rise up
00:52:20.960 | and to say this is what we want.
00:52:22.960 | We hope that there are better things out there.
00:52:24.960 | And these autocratic regimes have to start
00:52:26.960 | to shift slightly.
00:52:27.960 | And over time maybe that has a real impact.
00:52:29.960 | Here's a specific statistic and chart for everybody.
00:52:31.960 | The demographics of
00:52:33.960 | Iran are
00:52:35.960 | incredibly
00:52:37.960 | notable. If you look at this chart
00:52:39.960 | for those of you listening
00:52:41.960 | it just shows people
00:52:43.960 | by age and how many, what percentage of the
00:52:45.960 | population they are
00:52:46.960 | or actually the raw numbers of the population.
00:52:48.960 | As you can see, it's basically like a pear.
00:52:50.960 | You have very few old people and you have
00:52:52.960 | a lot of people in their 20s
00:52:54.960 | and younger. And so young
00:52:56.960 | people... No, no, no, Jason. It's really
00:52:58.960 | 40s and 30s.
00:53:00.960 | It's really... Yeah, okay. So 40s, 30s.
00:53:02.960 | You don't have the geriatric
00:53:04.960 | population that you see in other
00:53:06.960 | countries like Japan.
00:53:08.960 | And so the demographics of Iran are
00:53:10.960 | extremely weighted
00:53:12.960 | towards younger people.
00:53:14.960 | Millennials, Gen Xers
00:53:16.960 | and younger. And they have
00:53:18.960 | VPNs, virtual private networks.
00:53:20.960 | They can see everything happening
00:53:22.960 | in the free world
00:53:24.960 | versus, let's say
00:53:26.960 | closed societies. And so I think that's what
00:53:28.960 | gives me a lot of hope is that these countries
00:53:30.960 | are going to have to evolve because young people
00:53:32.960 | are seeing how the rest of the world lives.
00:53:34.960 | And I think that's a
00:53:36.960 | big part of the change. Chamath, what are your thoughts?
00:53:38.960 | About Iran
00:53:40.960 | specifically? I think
00:53:42.960 | demographic change and then China and
00:53:44.960 | demographic change slash
00:53:46.960 | the protests. I've said this
00:53:48.960 | before and I've been tweeting about this for years, but
00:53:50.960 | people so poorly understand demographics.
00:53:52.960 | Everybody thinks that we have a surplus of
00:53:54.960 | people and we don't.
00:53:56.960 | And we need to have a positive birth rate
00:53:58.960 | in order to kind of continue to
00:54:00.960 | support the expansion of the world and GDP.
00:54:02.960 | And we need that. And right now, we're
00:54:04.960 | not in that situation. If you
00:54:06.960 | look at a country by country basis, a lot of these
00:54:08.960 | countries are facing
00:54:10.960 | that in a pretty
00:54:12.960 | cataclysmic way. The most
00:54:14.960 | sensitive country
00:54:16.960 | to this is China. I mean, their
00:54:18.960 | population, at current course and speed,
00:54:20.960 | I think the last number is, it's going to halve
00:54:22.960 | by 2100. There'll be about
00:54:24.960 | 600 million people in China,
00:54:26.960 | which is unbelievably
00:54:28.960 | disruptive in a very negative way for them.
00:54:30.960 | Because you will have
00:54:32.960 | a lot of people who are entering
00:54:34.960 | the workforce having to support
00:54:36.960 | an entire cohort of people
00:54:38.960 | above them in terms of age, who are retired,
00:54:40.960 | etc. So the state's
00:54:42.960 | going to have to get much, much more actively involved
00:54:44.960 | over the next 50 years in China.
00:54:46.960 | And then you look at other countries like
00:54:48.960 | Nigeria or India, who are
00:54:50.960 | in, you know,
00:54:52.960 | at the beginning of what could be a multi-decade
00:54:54.960 | boom. Because you have
00:54:56.960 | 20-year-olds that will be entering the workforce,
00:54:58.960 | you know, they'll effectively work
00:55:00.960 | for less than their older counterparts, right?
00:55:02.960 | So then there'll be an incentive then
00:55:04.960 | to bring work on shore
00:55:06.960 | into those countries.
00:55:08.960 | And so it's going to have
00:55:10.960 | huge impacts because then you have
00:55:12.960 | rising GDP, you'll have rising expectations of
00:55:14.960 | living quality, you'll have rising expectations of
00:55:16.960 | how governments treat those people.
00:55:18.960 | So it's all kind of positive in general,
00:55:20.960 | but the world needs more people.
00:55:22.960 | Let's just be clear.
00:55:24.960 | Especially in Western countries,
00:55:26.960 | we are going to be, we're not
00:55:28.960 | as badly off as China, but we're not far
00:55:30.960 | behind. Yeah, here's a quick view of China
00:55:32.960 | and Japan. Yeah, these same
00:55:34.960 | kind of, I don't know what they exactly call these
00:55:36.960 | charts, they're kind of like vertical histograms,
00:55:38.960 | and again, you know,
00:55:40.960 | data's hard to come by in some countries, but
00:55:42.960 | you know, China's starting to get top-heavy
00:55:44.960 | when compared to Iran,
00:55:46.960 | and then if you look at Japan,
00:55:48.960 | quite stunning. There's just no young people
00:55:50.960 | left, and they live
00:55:52.960 | to very much older ages
00:55:54.960 | in Japan. This longevity is
00:55:56.960 | one of their great strengths as a population,
00:55:58.960 | as a country, and so these demographics
00:56:00.960 | can't be fought. You're going to have
00:56:02.960 | a constricting economy in Japan,
00:56:04.960 | and their place in the
00:56:06.960 | world is going to be very, very
00:56:08.960 | different.
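The "vertical histograms" mentioned above are usually called population pyramids: one horizontal bar per age bracket, with the bar's length showing that bracket's share of the population (young-skewed countries look pear-shaped, aging ones look top-heavy). Here is a minimal matplotlib sketch with made-up numbers; the brackets and percentages are purely illustrative, not the actual Iran, China, or Japan figures.

```python
import matplotlib.pyplot as plt

# Hypothetical age-bracket shares (percent of population), youngest first.
brackets = ["0-19", "20-39", "40-59", "60-79", "80+"]
young_skewed = [32, 34, 20, 11, 3]   # pear-shaped, Iran-like profile
old_skewed = [15, 20, 26, 28, 11]    # top-heavy, Japan-like profile

fig, axes = plt.subplots(1, 2, sharey=True)
for ax, shares, title in zip(axes, [young_skewed, old_skewed],
                             ["Young-skewed", "Old-skewed"]):
    ax.barh(brackets, shares)        # one horizontal bar per age bracket
    ax.set_xlabel("% of population")
    ax.set_title(title)
plt.tight_layout()
plt.show()
```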
00:56:10.960 | Okay, where do we want to go to next?
00:56:12.960 | You never asked my opinion on
00:56:14.960 | these protests in China. Oh, I was going to, usually you
00:56:16.960 | just talk, so go ahead. I didn't want to,
00:56:18.960 | I didn't know if I could throw it to you. No, I just talk. No, I usually
00:56:20.960 | have to fight to give my opinion. Oh, here
00:56:22.960 | we go. Listen, have your agent
00:56:24.960 | call my agent, we'll talk about it. Okay.
00:56:26.960 | We'll talk about it on the All In, Nearly Did It.
00:56:28.960 | I have a slightly different view of what's
00:56:30.960 | happening in China, Jason, which
00:56:32.960 | is, you know, I think that the people
00:56:34.960 | there need to stop harassing the
00:56:36.960 | CCP. You see, the Chinese
00:56:38.960 | Communist Party, they're the elites,
00:56:40.960 | they've set things up for the benefit of the
00:56:42.960 | people. They're not engaged in shadow banning,
00:56:44.960 | they're just, you know, they have a system there
00:56:46.960 | to, you know, to engage in censorship,
00:56:48.960 | to prevent abuse
00:56:50.960 | and harm. Yeah. Right?
00:56:52.960 | That's the system. Continue. They've set up,
00:56:54.960 | right? And the people just need to
00:56:56.960 | understand that, that when they say things
00:56:58.960 | like, you know, when they oppose
00:57:00.960 | things like COVID lockdowns, like
00:57:02.960 | Jay Bhattacharya did, they need to understand
00:57:04.960 | that that is engaging in abuse
00:57:06.960 | and harm. You see? Exactly.
00:57:08.960 | Yes. And you know what? They've
00:57:10.960 | provided re-education camps
00:57:12.960 | for citizens who need,
00:57:14.960 | you know, to maybe rethink
00:57:16.960 | their positions on freedom and
00:57:18.960 | their wages, the hours they work,
00:57:20.960 | and their social
00:57:22.960 | conditions. You're absolutely correct. China
00:57:24.960 | really has built a perfect model
00:57:26.960 | for our society. Well said,
00:57:28.960 | Sax. Great. Now we can move
00:57:30.960 | forward. Let's go. Now we can move forward.
00:57:32.960 | We are finally, we're in agreement. By the way, you
00:57:34.960 | know that's going to get clipped out and go viral.
00:57:36.960 | You understand, right? No, no, no. It's a good thing.
00:57:38.960 | According to our elites,
00:57:40.960 | according to our elites, like
00:57:42.960 | Yoel Roth or Taylor
00:57:44.960 | Lorenz, to criticize them
00:57:46.960 | is a form of harassment.
00:57:48.960 | You understand that, right?
00:57:50.960 | So, therefore, what the people in
00:57:52.960 | China are doing, specifically by opposing
00:57:54.960 | lockdowns,
00:57:56.960 | you know, they're taking the Jay Bhattacharya point
00:57:58.960 | of view, they're engaging in
00:58:00.960 | harm and abuse and
00:58:02.960 | harassment of their
00:58:04.960 | betters, of their elites. I mean, the disagreement
00:58:06.960 | Why won't they just submit
00:58:08.960 | to the social credit system that
00:58:10.960 | has been set up for them
00:58:12.960 | for their benefit?
00:58:14.960 | It's for their benefit. Why question it? Yeah, just accept.
00:58:16.960 | Accept your fate and work hard
00:58:18.960 | for the good of the people. Great, great
00:58:20.960 | points. Let's move forward. Should we talk about
00:58:22.960 | sales? No, I think it's actually a pretty, it's
00:58:24.960 | a pretty good satire. I agree.
00:58:26.960 | I think we have to talk about FTX.
00:58:28.960 | I don't know if you saw
00:58:30.960 | and I,
00:58:32.960 | the people covering for
00:58:34.960 | SBF, it continues
00:58:36.960 | to be an absolute joke.
00:58:38.960 | The number of interviews that SBF
00:58:40.960 | is doing is absurd.
00:58:42.960 | But the people carrying water for him
00:58:44.960 | is even more
00:58:46.960 | offensive. I mean, if you're a criminal trying
00:58:48.960 | to cover up your crime, okay, we get it.
00:58:50.960 | You're trying to cover up and stay out of jail.
00:58:52.960 | But Kevin O'Leary,
00:58:56.960 | calls himself Mr. Wonderful,
00:58:58.960 | was on CNBC trying to
00:59:00.960 | defend the fact that
00:59:02.960 | he was given,
00:59:04.960 | this is stunning by the way,
00:59:06.960 | $15 fucking million
00:59:08.960 | to be a spokesperson for FTX.
00:59:10.960 | So the grift
00:59:12.960 | not only went to the press,
00:59:14.960 | politicians,
00:59:16.960 | but now commentators on CNBC,
00:59:18.960 | $15 million.
00:59:20.960 | To put that in context,
00:59:22.960 | I mean, you're talking what an elite
00:59:24.960 | NBA player gets from Nike.
00:59:26.960 | This does not exist
00:59:28.960 | in the world.
00:59:30.960 | Kevin O'Leary might get $50
00:59:32.960 | to $200K for speaking gigs.
00:59:34.960 | But nobody gets $15 million
00:59:36.960 | to show. Here's a
00:59:38.960 | 75 second clip that
00:59:40.960 | I don't know if you all have seen,
00:59:42.960 | but is unbelievably stunning.
00:59:44.960 | See you on the other side of 75 seconds.
00:59:46.960 | If you're a defense attorney
00:59:48.960 | that represents someone that you know
00:59:50.960 | is guilty, you gotta say, yeah, well,
00:59:52.960 | they're innocent. But you may know they're guilty.
00:59:54.960 | You may know they're guilty. If you find someone,
00:59:56.960 | if you watch someone kill someone,
00:59:58.960 | yeah, they're innocent. Don't prove them guilty.
01:00:00.960 | There's only the murder of my money in this case, okay?
01:00:02.960 | It's murder of FTX's
01:00:04.960 | money, in my view.
01:00:06.960 | Look, Joe, if you
01:00:08.960 | made the decision.
01:00:10.960 | I don't think you should be singing the blues right now
01:00:12.960 | at all. Oh, yes, I'm singing the blues.
01:00:14.960 | Why? Because your $15 million didn't pan out?
01:00:16.960 | That's a lot of money to be a
01:00:18.960 | paid spokesperson. It's a lot of money.
01:00:20.960 | You didn't have to do much for that. That's found
01:00:22.960 | money, Kevin. That's a different decision. That's a different discussion.
01:00:24.960 | You can make that decision
01:00:26.960 | on your own, but I'm going to this point
01:00:28.960 | that if you want to say he's guilty
01:00:30.960 | before he's tried, I just don't understand it.
01:00:32.960 | But it may end up costing you
01:00:34.960 | $15 million in reputation on everything
01:00:36.960 | else. That's the problem. That's why I stay on
01:00:38.960 | this pursuit. I'm very transparent about it.
01:00:40.960 | I've disclosed everything I know about it. I will find out
01:00:42.960 | more information. If I make the creditors' committee,
01:00:44.960 | I will act as a fiduciary for everybody involved.
01:00:46.960 | I will testify. I am an
01:00:48.960 | advocate for this industry,
01:00:50.960 | and this changes nothing.
01:00:52.960 | Just look at the numbers that came out of Circle today. I'm an investor
01:00:54.960 | there, too. You've got the "I lost it
01:00:56.960 | all" on FTX, and we have a fantastic
01:00:58.960 | print on Circle. The
01:01:00.960 | promise of crypto remains. This will
01:01:02.960 | not change it. Pretty crazy.
01:01:04.960 | $15 million. Any thoughts
01:01:06.960 | on the continuing SBF
01:01:08.960 | saga, Sax?
01:01:10.960 | Well, I don't know why we should care so much about him.
01:01:12.960 | I mean, Kevin Leary.
01:01:14.960 | It's indicative, right?
01:01:16.960 | It's indicative of all these guys that got money from
01:01:18.960 | this guy. Who is he?
01:01:20.960 | He's on Shark Tank.
01:01:22.960 | He's on Shark Tank, and he's a contributor
01:01:24.960 | to CNBC who's on multiple
01:01:26.960 | times a week. The point is,
01:01:28.960 | you've got the grift.
01:01:30.960 | I'm just trying to point out, $15
01:01:32.960 | million to a CNBC commentator
01:01:34.960 | is just an
01:01:36.960 | extraordinary payoff.
01:01:38.960 | I've never heard of anything like that. I don't think
01:01:40.960 | it's fair to pick on Kevin O'Leary
01:01:42.960 | per se, because
01:01:44.960 | there's a bunch of those guys that took money
01:01:46.960 | from him. A bunch of athletes
01:01:48.960 | did. Probably a bunch of movie stars.
01:01:50.960 | Pats. Republicans.
01:01:52.960 | Democrats.
01:01:54.960 | Everybody got paid by this guy.
01:01:56.960 | Democrats.
01:01:58.960 | Just like in the Twitter example,
01:02:00.960 | I think it's important in this case to generalize
01:02:02.960 | because the generalized
01:02:04.960 | thing is the real problem. Look, if you
01:02:06.960 | want to focus on the crux of this,
01:02:08.960 | you have a concept in law
01:02:10.960 | that Saks knows better than the rest of us called
01:02:12.960 | fraudulent conveyance. We have
01:02:14.960 | example after example
01:02:16.960 | where it does not matter whether
01:02:18.960 | it was in the Bernie Madoff example, or, for
01:02:20.960 | example, Jason, we talked about it, the guy
01:02:22.960 | in LA that lost all the money, client
01:02:24.960 | funds playing poker.
01:02:26.960 | You have to give the money back,
01:02:28.960 | especially if it was fraudulently
01:02:30.960 | conveyed to you. Can you explain
01:02:32.960 | this in detail for a second so the audience understands?
01:02:34.960 | Well, my understanding, which is very
01:02:36.960 | basic, and I think David can probably do a much
01:02:38.960 | better job, is the following, which is
01:02:40.960 | if you get money
01:02:42.960 | some way,
01:02:44.960 | but it comes from
01:02:46.960 | somebody who fraudulently
01:02:48.960 | acquired that money, you have to give
01:02:50.960 | the money back. So in this example,
01:02:52.960 | what it would mean is if
01:02:54.960 | that they can show that that $15 million
01:02:56.960 | that this guy got
01:02:58.960 | came from
01:03:00.960 | SBF basically raiding the piggy bank
01:03:02.960 | of user
01:03:04.960 | accounts,
01:03:06.960 | he's going to have to pay the money back.
01:03:08.960 | Just like, for example, in the Madoff
01:03:10.960 | fraud,
01:03:12.960 | the folks
01:03:14.960 | that went to find the money were able to go
01:03:16.960 | back to folks that actually redeemed,
01:03:18.960 | even the beginning early ones, and said, "I understand
01:03:20.960 | that you didn't know any better, but this
01:03:22.960 | was fraudulently conveyed to you, so we need
01:03:24.960 | the money back." And they got the money back.
01:03:26.960 | In that case, if they had put a million in,
01:03:28.960 | and it grew to $3 million, they got their million
01:03:30.960 | principal back, but the $2 million in gains, which
01:03:32.960 | were ill gotten, had to be returned.
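A minimal sketch of the clawback arithmetic described above, using hypothetical numbers; the function name and the $1 million / $3 million figures mirror the illustrative Madoff example in the discussion, not any actual estate data, and this is not how a trustee formally computes claims.

```python
def clawback_split(principal: float, redeemed: float) -> dict:
    """Split a redemption into the principal the investor keeps and the
    gains a bankruptcy trustee could claw back, per the net-equity idea
    described above (illustrative only, not legal advice)."""
    gains = max(redeemed - principal, 0.0)  # ill-gotten gains to be returned
    kept = redeemed - gains                 # investor keeps at most their principal
    return {"returned_to_estate": gains, "kept_by_investor": kept}

# The example from the discussion: $1M in, $3M redeemed ->
# the investor keeps the $1M principal and returns the $2M in gains.
print(clawback_split(1_000_000, 3_000_000))
```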
01:03:34.960 | So as I understand it,
01:03:36.960 | based on just what I've read, that there's
01:03:38.960 | a 90-day rule around contributions,
01:03:40.960 | meaning that if,
01:03:42.960 | I think this has to do with the bankruptcy, that
01:03:44.960 | if he donated money within
01:03:46.960 | 90 days, then
01:03:48.960 | that can be unwound.
01:03:50.960 | So, yeah,
01:03:52.960 | but I do think it creates potentially a powerful
01:03:54.960 | incentive here
01:03:56.960 | by politicians and various
01:03:58.960 | political groups for him
01:04:00.960 | not to be convicted of fraud, for him to
01:04:02.960 | be able to plead this out into some sort of
01:04:04.960 | negligence, because they don't have to
01:04:06.960 | give the money back. They keep the bag!
01:04:08.960 | What an incredible insight.
01:04:10.960 | Well, this is what I think is so interesting about the Kevin O'Leary
01:04:12.960 | thing. It's not about Kevin O'Leary,
01:04:14.960 | but it's about the fact that the money was spread
01:04:16.960 | around so widely,
01:04:18.960 | and into such deep trenches
01:04:20.960 | of the regulatory world,
01:04:22.960 | society,
01:04:24.960 | influencers,
01:04:26.960 | and basically, I think
01:04:28.960 | the guy cemented
01:04:30.960 | this, he thought that
01:04:32.960 | which I think, by the way, is a
01:04:34.960 | really interesting product of
01:04:36.960 | the crypto ecosystem and the
01:04:38.960 | model that so many kind of crypto
01:04:40.960 | businesses have engaged in over the years, which is
01:04:42.960 | if you can fester the belief,
01:04:44.960 | then there is a business. If you cannot
01:04:46.960 | fester the belief, there is no business.
01:04:48.960 | That there isn't a fundamental productivity driver.
01:04:50.960 | It's about building a belief system.
01:04:52.960 | And you can buy a belief system
01:04:54.960 | if you can take money that people have given you,
01:04:56.960 | you can embed it in influencers
01:04:58.960 | and celebrities and
01:05:00.960 | politicians and regulators, and if you
01:05:02.960 | give it to enough of these people,
01:05:04.960 | and you give enough of it to them, maybe
01:05:06.960 | that belief system solidifies and your
01:05:08.960 | thing becomes real. Which is a
01:05:10.960 | classic grift technique, by the way, in the grifters.
01:05:12.960 | Oh, tell us all about it, Jacob.
01:05:14.960 | Yeah, what you do is you have this
01:05:16.960 | No, no, it's the patina,
01:05:18.960 | it's this, you know,
01:05:20.960 | you look like you're incredibly rich,
01:05:22.960 | you're going to fancy restaurants, you're wearing an expensive
01:05:24.960 | suit, you're getting in a sports car, and
01:05:26.960 | then you own some palazzo, whatever,
01:05:28.960 | and then some other rich person
01:05:30.960 | comes and you get them to invest in something
01:05:32.960 | and then you abscond with the money.
01:05:34.960 | But they see all the accoutrements, you
01:05:36.960 | check all the boxes, your parents were Stanford,
01:05:38.960 | you went to MIT, and
01:05:40.960 | you are donating large sums of money,
01:05:42.960 | and you got this big table at the club,
01:05:44.960 | and you got a penthouse, everybody starts to
01:05:46.960 | feel, well, might is right, you got the
01:05:48.960 | wealth, there might be some agony. How would you guys feel?
01:05:50.960 | Like, how would you guys feel about, honestly,
01:05:52.960 | honestly, no, backing a CEO
01:05:54.960 | of a growth stage company
01:05:56.960 | that you put your firm's money into
01:05:58.960 | who lives in a $130 million
01:06:00.960 | house and has not yet exited the
01:06:02.960 | business? Yeah, absolute alarm bells
01:06:04.960 | everywhere. Never done it. And this is why I'm not a fan
01:06:06.960 | of secondary sales. Let me ask you guys
01:06:08.960 | a question. Or huge secondary sales.
01:06:10.960 | Yeah, let me ask you guys a question. Do you
01:06:12.960 | think that a billion dollars of
01:06:14.960 | dark money could stop a red wave?
01:06:16.960 | Just asking for a friend.
01:06:18.960 | A billion dollars
01:06:20.960 | in dark money. Do you think it was over-weighted to Democrats?
01:06:22.960 | No, honestly, do you think it's over-weighted?
01:06:24.960 | Yes, his mother was a huge Democratic bundler.
01:06:26.960 | Yeah. And moreover,
01:06:28.960 | the specific politicians he needed
01:06:30.960 | to influence, yes, there were some
01:06:32.960 | Republicans, but by and large,
01:06:34.960 | it was the SEC. So you're the first
01:06:36.960 | person to make this claim. I want to say,
01:06:38.960 | did you hear it here first on the
01:06:40.960 | All-In Pod, David Sachs
01:06:42.960 | making the declaration that
01:06:44.960 | the red wave was stopped
01:06:46.960 | because of... Well, let me ask you a follow-up
01:06:48.960 | question. What do you think would have more impact
01:06:50.960 | on our election?
01:06:52.960 | Enormous amounts of
01:06:54.960 | dark money going to Democrats
01:06:56.960 | or extensive
01:06:58.960 | shadow banning of conservative
01:07:00.960 | influencers?
01:07:02.960 | Which one do you think would have a bigger impact?
01:07:04.960 | And hold on a second, in a 50-50
01:07:06.960 | country where, I mean, the
01:07:08.960 | scales are like balanced, where these elections
01:07:10.960 | are just a few thousand votes.
01:07:12.960 | What do you think the result is going to be
01:07:14.960 | if we actually have a level playing field
01:07:16.960 | to get rid of this swindler's dark money?
01:07:18.960 | Yeah, that's an interesting question. Let me
01:07:20.960 | add a thing to that.
01:07:22.960 | What would have a bigger impact?
01:07:24.960 | This subversion of conservative voices...
01:07:26.960 | I think this is great, except for when you guys
01:07:28.960 | in your fight, like... Or
01:07:30.960 | taking away a woman's right to choose after 50
01:07:32.960 | years of giving it to them. Which would have a bigger impact
01:07:34.960 | on the red wave? That did have a big impact, but I think
01:07:36.960 | we're going to move past that. I think we're going to move past that.
01:07:38.960 | Yeah, all right. Great, yeah.
01:07:40.960 | Great, great strategic move. Sachs, what do
01:07:42.960 | you think about this Sinema,
01:07:44.960 | Kyrsten Sinema flipping to independent?
01:07:46.960 | Do you think that's a big deal? I think
01:07:48.960 | it's really interesting. I think it's actually a very shrewd move
01:07:50.960 | on her part. Purple.
01:07:52.960 | She's gone purple. So first of all, I think she's great.
01:07:54.960 | You know... Yeah, just tell us about her, Sachs.
01:07:56.960 | No, he...
01:07:58.960 | Well, she's great. She's... She is
01:08:00.960 | the senator from Arizona,
01:08:02.960 | formerly Democrat, now independent,
01:08:04.960 | who is in the mold
01:08:06.960 | of, you know, John McCain, who was a former
01:08:08.960 | senator from Arizona. Sort of this maverick
01:08:10.960 | independent, and she does not kowtow
01:08:12.960 | to her party orthodoxy.
01:08:14.960 | And when Biden
01:08:16.960 | wanted to pass three
01:08:18.960 | and a half trillion of Build Back
01:08:20.960 | Better spending, she, along
01:08:22.960 | with Manchin, opposed it. And I think that saved
01:08:24.960 | the administration from this gigantic
01:08:26.960 | boondoggle that would have made inflation much, much
01:08:28.960 | worse. Now, Manchin got all the credit,
01:08:30.960 | but she was equally responsible for
01:08:32.960 | putting a hold on that, and then as
01:08:34.960 | a result, they only did the 750
01:08:36.960 | billion Inflation Reduction Act. So, she's
01:08:38.960 | willing to buck her own party. Now, as a result
01:08:40.960 | of that, I think
01:08:42.960 | they were planning on... She was going to get primaried.
01:08:44.960 | That the progressive wing
01:08:46.960 | of the party was planning on primarying
01:08:48.960 | her. And by moving
01:08:50.960 | to an independent, in a sense,
01:08:52.960 | she preempts that. Because
01:08:54.960 | what she's now saying is... She's now
01:08:56.960 | sort of like, you know, Bernie
01:08:58.960 | Sanders is an independent, or this guy
01:09:00.960 | Angus King from Maine.
01:09:02.960 | They still caucus with the
01:09:04.960 | Democrats, but they're independents,
01:09:06.960 | and the Democrats don't run
01:09:08.960 | candidates against
01:09:10.960 | them. Because they know that
01:09:12.960 | if they do, you'll have
01:09:14.960 | a Republican, a Democrat, an independent,
01:09:16.960 | and the Democrats and the independents will split
01:09:18.960 | the vote, and the Republican will win. So,
01:09:20.960 | basically, she's now daring
01:09:22.960 | the Democrats, "Hey, if you want
01:09:24.960 | to run a candidate against me, I'm not going to
01:09:26.960 | sit around and get primaried by
01:09:28.960 | them. You go ahead and run somebody, but
01:09:30.960 | then we're both going to lose to the
01:09:32.960 | Republican." That's what's smart about
01:09:34.960 | it, is I think she's daring Schumer
01:09:36.960 | to run somebody against her.
01:09:38.960 | It's also interesting, she's
01:09:40.960 | the only member of
01:09:42.960 | Congress I've read that's non-theist,
01:09:44.960 | which is kind of like atheist. She doesn't
01:09:46.960 | talk about God or doesn't believe in God,
01:09:48.960 | and I think she's the first openly bisexual
01:09:50.960 | member of Congress. She's a
01:09:52.960 | maverick.
01:09:54.960 | Certainly. Saks, do you think she held up on
01:09:56.960 | making this decision until after that Georgia
01:09:58.960 | Senate runoff election
01:10:00.960 | finished, and do you think that it influenced the
01:10:02.960 | decision? I don't know, but I think
01:10:04.960 | that the key consideration here for her is...
01:10:06.960 | Well, imagine if she
01:10:08.960 | doesn't make this move now, okay?
01:10:10.960 | And then in two years,
01:10:12.960 | well, I guess really next year, she gets
01:10:14.960 | primaried, okay? And then what if
01:10:16.960 | she loses the primary? It's going to be very hard
01:10:18.960 | for her to run as an independent. At
01:10:20.960 | that point, it'll look like sour grapes
01:10:22.960 | or loser, right? But
01:10:24.960 | if she goes independent now,
01:10:26.960 | she's saying, "Listen,
01:10:28.960 | I'm running as an independent no matter what. The
01:10:30.960 | question you have to make,
01:10:32.960 | the Democratic Party, is whether
01:10:34.960 | to support me or basically
01:10:36.960 | tank this election and throw it to the Republicans." Will we see more
01:10:38.960 | of this purple approach? Well,
01:10:40.960 | I was just going to ask you, what does this mean for Joe
01:10:42.960 | Manchin?
01:10:44.960 | Well, I don't think Joe Manchin has this problem, and I'll tell
01:10:46.960 | you why. Because West
01:10:48.960 | Virginia, unlike Arizona, is like a plus 22
01:10:50.960 | red state. Joe
01:10:52.960 | Manchin is the only politician
01:10:54.960 | in that state who could
01:10:56.960 | win that seat for the Democrats.
01:10:58.960 | When Joe Manchin retires, that seat
01:11:00.960 | is going Republican. And Schumer
01:11:02.960 | knows this, the Democrats know this, they thank
01:11:04.960 | their lucky stars every day that
01:11:06.960 | they got Joe Manchin, because otherwise
01:11:08.960 | that would be a Republican seat. And so,
01:11:10.960 | look, all this stuff about how the progressives
01:11:12.960 | were upset with Manchin and all that
01:11:14.960 | papapalooza he got, that may be
01:11:16.960 | the sort of progressive wing is going to say
01:11:18.960 | that publicly, but the smart Democrats
01:11:20.960 | know that they're very lucky
01:11:22.960 | to have a politician like Joe Manchin
01:11:24.960 | on their side of the aisle. I've got to ask a question
01:11:26.960 | to you, Chamath. Why
01:11:28.960 | do Democrats,
01:11:30.960 | why are they,
01:11:32.960 | it seems
01:11:34.960 | to be so anti-moderate Democrats,
01:11:36.960 | why are they so resistant
01:11:38.960 | to the concept
01:11:40.960 | of a moderate Democrat when obviously
01:11:42.960 | moderate Democrats seem to have an advantage
01:11:44.960 | in these elections? Well, no, I think David
01:11:46.960 | described it well, which is that in
01:11:48.960 | many of the seats, this is both
01:11:50.960 | true for Republicans and for Democrats,
01:11:52.960 | you're not really competing in a general election,
01:11:54.960 | you're competing in a primary, and if you win
01:11:56.960 | a primary, you're probably going to win. So, like,
01:11:58.960 | you know, if you're in Mississippi, for example,
01:12:00.960 | you just have to win the Republican primary. Nothing
01:12:02.960 | else matters. And then you're just going to skate
01:12:04.960 | to victory. And so the real question
01:12:06.960 | is who votes in those are
01:12:08.960 | different oftentimes than who votes in the general.
01:12:10.960 | And this is why you get this
01:12:12.960 | dispersion that's happening where
01:12:14.960 | folks seem to be
01:12:16.960 | getting more and more extreme.
01:12:18.960 | It's reflecting the soundbites
01:12:20.960 | that those primary voters want to hear.
01:12:22.960 | And this is the big problem that we
01:12:24.960 | have. And that's why, like, if you have
01:12:26.960 | a bunch of this, you know, ranked-choice
01:12:28.960 | voting, or, you know,
01:12:30.960 | these other kinds of methods, it starts to clean
01:12:32.960 | that up so that you move people
01:12:34.960 | more into the moderate middle.
01:12:36.960 | But that's why you
01:12:38.960 | have this crazy stuff happening. All right, everybody, this
01:12:40.960 | has been another amazing episode of the all in podcast
01:12:42.960 | for the dictator,
01:12:44.960 | the Sultan of science and
01:12:46.960 | David Sachs. I am
01:12:48.960 | J-Cal. We'll see you next time. Bye-bye.
01:12:51.960 | Let your winners ride.
01:12:53.960 | Rain Man David Sachs
01:12:55.960 | And instead, we open sourced it to the fans
01:13:00.960 | and they've just gone crazy with it.
01:13:02.960 | Love you. Queen of
01:13:04.960 | Quinoa.
01:13:06.960 | Besties are gone.
01:13:13.960 | My dog taking a notice in your driveway.
01:13:18.960 | Oh, my
01:13:20.960 | natural media.
01:13:22.960 | Just get a room and just have one big
01:13:24.960 | huge orgy because they're all just like
01:13:26.960 | this like sexual tension that they just need to release
01:13:28.960 | somehow.
01:13:30.960 | You're a
01:13:34.960 | We need to get
01:13:36.960 | merch.
01:13:38.960 | I'm going.
01:13:46.960 | (Hairdryer sound)