
"Founder Mode," DOJ alleges Russian podcast op, Kamala flips proposals, Tech loses Section 230?


Chapters

0:0 Bestie intros! Jason goes "Founder Mode"
1:3 All-In Summit lineup announcement
9:1 Understanding "Founder Mode"
32:52 Bolt is back in the news as Ryan Breslow goes "Founder Mode"
52:28 Tech's Section 230 protections might be in danger after new ruling
68:45 DOJ charges two Russians with infiltrating US media company
82:42 Kamala's economic pivot

Whisper Transcript

00:00:00.000 | - Hold on guys, I gotta get into founder mode.
00:00:03.160 | - Oh wow, founder mode.
00:00:04.460 | - If he founder modes, I'm in a founder mode.
00:00:06.640 | - I got a founder mode too.
00:00:08.000 | Where's J-Cal?
00:00:10.360 | Oh, shit.
00:00:11.200 | (laughing)
00:00:13.580 | - Let's go, Friedberg, let's get this show started, baby.
00:00:17.280 | Come on podcast, number one podcast in the world.
00:00:19.820 | Here we go.
00:00:20.980 | Okay guys, let's start the show.
00:00:22.360 | - You are founder moding.
00:00:23.680 | - J-Cal, if you founder mode anymore,
00:00:25.120 | you're gonna get pneumonia.
00:00:27.060 | - All right, Sax, it's like the '90s again.
00:00:29.660 | We're back at Stanford.
00:00:30.860 | (laughing)
00:00:32.800 | - This is the funniest cold open we've ever done.
00:00:35.040 | - I'm sorry, I got Belgian waffle mix everywhere.
00:00:39.200 | - You got, it's right here, J-Cal.
00:00:40.400 | You got a bunch of founder mode right there.
00:00:42.140 | (laughing)
00:00:43.340 | - I got a little founder mode there.
00:00:44.980 | Thanks for looking out.
00:00:46.280 | (upbeat music)
00:00:49.020 | (indistinct)
00:00:51.440 | - Founder mode, baby.
00:01:04.620 | Let's go.
00:01:05.460 | All right, everybody.
00:01:06.280 | Welcome back to the number one podcast in the world
00:01:08.960 | with me again, the Chairman Dictator Chamath Palihapitiya.
00:01:13.400 | How are you doing, buddy?
00:01:14.240 | Have you acclimated?
00:01:15.060 | Now you're 15 days back on American soil.
00:01:17.900 | How are you adjusting?
00:01:19.140 | - I'm doing well.
00:01:21.680 | I turned 48 two days ago.
00:01:24.260 | - Oh, I know the big 50th is coming.
00:01:25.860 | I've already called dibs on chairing your 50th.
00:01:29.660 | So we will be certainly getting arrested in 24 months.
00:01:33.660 | Save up your bail money, boys.
00:01:35.740 | Friedberg, you look like you have had a record number
00:01:40.740 | of panic attacks in the last 48 hours.
00:01:43.100 | How is summit planning going?
00:01:45.200 | Are you okay, buddy?
00:01:46.620 | You took this responsibility on.
00:01:48.540 | Are you okay?
00:01:49.380 | - I'm hanging in there.
00:01:50.200 | I'm just waiting for the shoe to drop,
00:01:52.020 | J-Cal, what you're gonna do to blow (beep) up.
00:01:54.220 | But I think we're almost there.
00:01:55.220 | So we're super excited.
00:01:57.120 | We have speaker names going out today.
00:01:59.620 | So we'll talk a little bit about who the speakers are.
00:02:01.500 | You guys know your bestie, Elon, will be there.
00:02:04.760 | - Oh, third year in a row, okay.
00:02:06.520 | - Third year in a row.
00:02:07.420 | He's the only repeat this year.
00:02:09.220 | We've got Marc Benioff joining us
00:02:11.960 | for a conversation about the future of enterprise.
00:02:13.940 | We've got Bari Weiss from the Free Press.
00:02:16.900 | We've got Arizona Senator Kyrsten Sinema.
00:02:20.200 | We have John Mearsheimer and Jeffrey Sachs
00:02:23.540 | to talk about geopolitics with David Sacks, Sacks on Sachs.
00:02:26.980 | - Nikesh Arora.
00:02:28.180 | - Nikesh Arora.
00:02:29.180 | - What about Peter Thiel, my guy?
00:02:31.240 | I haven't interviewed Peter Thiel in a decade.
00:02:33.660 | He's amazing.
00:02:34.580 | - We've got the legend from LA, Michael Ovitz, joining us.
00:02:39.260 | - Love that book.
00:02:40.100 | - We're gonna get into some really cool tech.
00:02:42.020 | We have-
00:02:42.860 | - Have you guys read Michael Ovitz's book?
00:02:44.980 | - Oh, it's incredible.
00:02:45.820 | - Oh, yeah.
00:02:46.660 | - It's incredible.
00:02:47.480 | It's really incredible.
00:02:48.660 | It's a great book.
00:02:49.820 | - Man.
00:02:50.660 | - Michael's incredible.
00:02:51.480 | It's a great book.
00:02:52.420 | - What a playbook.
00:02:53.700 | - It's a really interesting mix.
00:02:54.700 | We've also got some cool panels on tech and robotics.
00:02:57.620 | So we're doing all three of the big eVTOL companies,
00:03:01.020 | Joby, Archer, and Wisk,
00:03:03.220 | and their CEOs are gonna be there.
00:03:05.180 | Gecko Robotics.
00:03:06.680 | So we've got Jake coming from Gecko
00:03:08.260 | to actually do a really cool product demo.
00:03:10.820 | We have the CEO of Waymo, Tekedra Mawakana, joining us
00:03:15.460 | to talk about autonomous driving.
00:03:17.180 | Juan Carlos Belmonte, talking about age reversal
00:03:19.940 | from Altos Labs,
00:03:22.460 | probably the most funded private company in history.
00:03:25.100 | And also Ingo's coming out from Europe,
00:03:28.060 | from Adyen, CEO of Adyen.
00:03:30.220 | You guys may not know this,
00:03:31.220 | but Ingo has never done a US conference before.
00:03:34.080 | So this is his first public speaking event
00:03:36.460 | at a US conference.
00:03:37.860 | - He may be the first.
00:03:38.900 | I mean, Adyen, I think, is the first
00:03:40.780 | and really only company that's implemented AI
00:03:44.340 | in production at scale.
00:03:46.480 | It'll be really interesting to understand
00:03:48.220 | where that's gone in the last year.
00:03:49.700 | - Great topic to talk about.
00:03:50.940 | And huge shout out to my boy, Woody Hoburg, US astronaut,
00:03:55.940 | joining us to tell us about his months aboard the ISS,
00:03:59.860 | his experience captaining--
00:04:00.700 | - Is he gonna talk about his experience at Uranus?
00:04:03.700 | - Hey.
00:04:04.900 | And captaining the Crew Dragon capsule back to Earth.
00:04:08.900 | - From Uranus.
00:04:09.740 | - And the future of space flight and Uranus, yes.
00:04:11.900 | Thank you, Chamath. - To Uranus.
00:04:12.900 | - To Uranus.
00:04:13.740 | Thank you, everyone.
00:04:14.560 | Anyway.
00:04:15.400 | - I love how he's like my guy, the astronaut,
00:04:17.060 | as if like Friedberg and an astronaut
00:04:19.620 | have the same life experience, you know?
00:04:21.900 | - You know what happened, J-Cal?
00:04:23.180 | - It's like me and my guy, Draymond, you know, champions.
00:04:25.100 | - Did I tell you, you know how I met Woody?
00:04:27.220 | He's a huge fan of the pod.
00:04:29.180 | And then the NASA Astronauts account,
00:04:31.060 | which I follow on Twitter, DMed me.
00:04:33.100 | And they're like, "Woody would like to meet."
00:04:34.780 | And I'm like, "What?"
00:04:35.620 | And then I get this message from Woody.
00:04:37.860 | I'm like, "Is this real?"
00:04:38.820 | He's like, "I'm on the ISS.
00:04:39.780 | "I listen to the pod every week.
00:04:40.940 | "I love the show.
00:04:42.440 | "And you guys, you know,
00:04:43.740 | "keep me entertained up here in space."
00:04:45.500 | And I'm like, "No way."
00:04:46.700 | So then he's like, "Can you do a Zoom?"
00:04:48.180 | And I'm like, "Sure."
00:04:49.060 | So we hop on Zoom.
00:04:50.580 | And I, you guys saw this thing.
00:04:51.860 | I put it out on Twitter or something.
00:04:53.820 | But I did the Zoom with Woody, got to know him,
00:04:55.820 | hung out with him since he's been back.
00:04:57.500 | Great guy, so.
00:04:58.540 | - You left out the number one host
00:05:01.940 | on the internet in broadcast or news on YouTube.
00:05:06.620 | Megyn Kelly will be joining us.
00:05:07.820 | - Megyn Kelly.
00:05:09.300 | - The phenomenal Megyn Kelly.
00:05:10.820 | Didn't she call you a prick, J-Cal?
00:05:12.820 | (laughing)
00:05:13.660 | - Did she call me a prick?
00:05:14.500 | Why should she be any different?
00:05:16.460 | - Yeah.
00:05:17.300 | (laughing)
00:05:20.060 | - Oh, I can't wait for this fireworks.
00:05:20.900 | - What do tall, beautiful, intelligent women
00:05:23.500 | call me historically?
00:05:25.500 | - What do they call you?
00:05:26.340 | They call you, is that your mom?
00:05:27.340 | - They don't.
00:05:28.180 | (laughing)
00:05:29.020 | - Exactly.
00:05:29.840 | They're like, "Get out of the way.
00:05:30.680 | "Where's your mom?"
00:05:31.500 | They're like, "Can you park my car?"
00:05:32.540 | - That's what they normally say.
00:05:33.380 | - And they hand me their keys.
00:05:34.200 | - And then really excited to have Thomas Laffont
00:05:36.460 | come and give us a state of the union
00:05:38.300 | on public market and private market tech investing.
00:05:41.420 | He's got a really interesting presentation
00:05:43.460 | he's gonna share.
00:05:44.300 | So a lot of the content from the summit,
00:05:45.780 | we're gonna be pushing out--
00:05:46.620 | - Co-founder of Coatue.
00:05:48.020 | - Oh, sorry, co-founder of Coatue, yep.
00:05:49.600 | And so a lot of the content we're gonna be pushing out
00:05:51.400 | on YouTube as quickly as we can.
00:05:54.020 | Obviously, some of the more timely,
00:05:55.260 | newsy stuff we'll get out first.
00:05:57.380 | And then as we did in the last two years,
00:05:58.920 | material will roll out in the days and weeks that follow.
00:06:01.780 | We do not do sponsors here on the All In podcast,
00:06:04.360 | but the All In Summit does have sponsors.
00:06:06.560 | So a huge shout out--
00:06:07.760 | - Partners.
00:06:08.600 | - Which we would never otherwise do,
00:06:09.420 | partners to Accel Events, Hexclad,
00:06:12.600 | merch.com, Athletic Brewing, The Ridge,
00:06:15.640 | the greatest law firm on earth, Cooley,
00:06:17.960 | Velocity Black, and our big enterprise software
00:06:21.640 | partnership sponsor, Google Cloud.
00:06:24.120 | Big shout out and big thanks to Google Cloud
00:06:26.220 | for stepping it up in a big way.
00:06:28.040 | They have an incredible set of AI tools
00:06:31.000 | that they're making available to startups.
00:06:32.840 | Startup founders that are attending the summit
00:06:34.960 | get 350,000 bucks in Google Cloud credits.
00:06:38.280 | - Wait, what? - So thanks, yeah.
00:06:39.560 | - Each startup founder? - Each startup, yeah.
00:06:42.480 | So pretty awesome.
00:06:43.520 | - I would've sent all my companies to the summit.
00:06:45.360 | - Exactly.
00:06:46.200 | - Hey, $7,500 for a ticket, you get $350,000 in credits.
00:06:49.160 | - $350,000 of credit, that's incredible.
00:06:51.440 | - For Google Cloud, it is up to $350,000
00:06:53.480 | in Google Cloud credits for eligible AI startups,
00:06:56.680 | which they can use over two years.
00:06:58.360 | Comes with a bunch of other benefits.
00:06:59.440 | So yeah, it's pretty awesome.
00:07:00.840 | Really glad they're doing this,
00:07:01.680 | and they're gonna have a big setup.
00:07:02.520 | - You're the most lukewarm ad read I've ever heard.
00:07:04.720 | Do you want me to do that for you and get people hyped?
00:07:06.320 | - Yeah, go for it.
00:07:07.160 | - Yeah, yeah, yeah.
00:07:07.980 | - Can you guys Theo Von it?
00:07:09.200 | How would you guys Theo Von this?
00:07:10.520 | - Go ahead.
00:07:11.560 | - And Google Cloud is giving an amazing offer
00:07:15.440 | to everybody who comes to the summit.
00:07:17.200 | Wait 'til you hear this, Sax: $350,000 in credits for AI startups.
00:07:22.200 | And that's real money that you would've spent otherwise.
00:07:25.520 | So a great gift from our friends at Google Cloud.
00:07:28.420 | - Wait, $350,000 just for startups
00:07:31.560 | whose founders come to our summit?
00:07:33.080 | - Yeah, for AI startup founders,
00:07:36.240 | they can sign up for Google Cloud.
00:07:37.640 | They get 350,000 credits that they can use over two years.
00:07:40.360 | - Total, across all the founders.
00:07:42.400 | - No, no, no, no, per business.
00:07:45.080 | - Per business, oh my gosh.
00:07:46.880 | - Yeah, so if you're spending 10K a month.
00:07:48.720 | - I have seven friends and family tickets left.
00:07:50.760 | I'm starting seven companies in the next half hour.
00:07:53.240 | They're all going to the summit.
00:07:54.960 | I'm going to clear $2.3 million of credits
00:07:57.960 | and I'm going to sell them on eBay for 500K.
00:08:00.400 | - You can't do that.
00:08:01.240 | It's against the terms of service.
00:08:02.060 | - Oh, you can't do that?
00:08:02.900 | - Thank you to our friends at Google.
00:08:03.960 | - We did get a funny note from J-Cal this morning
00:08:05.860 | asking if his remaining friends and family tickets
00:08:08.200 | could be sold off so he could grift himself some money
00:08:10.840 | off of our free collection.
00:08:12.040 | - I was like, hey, what do these go for on the,
00:08:13.640 | I'm just thinking if it's sold out,
00:08:15.880 | these aren't worth 7,500 each,
00:08:17.720 | what's it worth if you can't get in?
00:08:19.840 | - Yeah, I'm sure you'd find that out pretty quickly.
00:08:21.920 | - Yeah, I mean, who's going to stop me?
00:08:24.760 | - J-Cal will be standing on the corner
00:08:26.560 | trying to scalp the remaining tickets.
00:08:28.560 | - I know, I'm going to give them to my brother, Josh.
00:08:30.760 | If you want to get into the summit for 10K cash,
00:08:33.440 | Josh will be at the loading dock.
00:08:35.000 | - We're going to see a couple of Calacanis brothers
00:08:36.800 | on the sidewalk outside the conference holding a sign.
00:08:40.840 | You know, tickets available.
00:08:41.840 | (laughing)
00:08:44.000 | - Down at the Mongolian barbecue joint in Westwood.
00:08:46.880 | (laughing)
00:08:47.720 | - Listen, if you've ever been to Madison Square Garden,
00:08:51.160 | you just hold up the number one
00:08:52.720 | and then somebody will come talk to you.
00:08:54.360 | Somebody will come talk to you.
00:08:55.200 | All right, we've gotten through all the great housekeeping
00:09:00.000 | and it's time to talk about founder mode, founder mode.
00:09:04.160 | On Sunday, the Y Combinator founder, Paul Graham,
00:09:08.280 | published an essay titled "Founder Mode."
00:09:11.140 | It was based on a talk that Airbnb CEO, Brian Chesky,
00:09:14.400 | gave to a group of YC founders last week,
00:09:16.640 | where he said, "Hey, the advice I got
00:09:18.320 | "on running a large company was to hire good people
00:09:21.160 | "and give them room to do their jobs."
00:09:23.400 | He says he took that advice
00:09:24.920 | and the results were a disaster.
00:09:26.520 | So he studied how Steve Jobs ran Apple
00:09:29.880 | and according to PG, Paul Graham that is,
00:09:33.160 | a bunch of very successful founders in the audience
00:09:35.640 | kind of nodded their head and said,
00:09:36.880 | "Hey, that is similar to my experience."
00:09:40.760 | PG defines the world now in two ways,
00:09:44.280 | two philosophies for running a company,
00:09:46.100 | manager mode versus founder mode.
00:09:47.960 | In manager mode, that's just the conventional way
00:09:50.440 | of doing this, what you teach in business school,
00:09:52.040 | you hire good people, you give them room to do their jobs.
00:09:55.000 | You tell your direct reports what to do,
00:09:56.800 | they figure it out, you don't micromanage.
00:09:58.880 | PG says, he doesn't give too many specifics
00:10:01.680 | about founder mode, he kind of says,
00:10:02.840 | "Hey, we gotta figure out what this thing is."
00:10:05.360 | So in some ways, this blog post was a way
00:10:08.080 | of starting the conversation.
00:10:09.920 | But he said things like less delegation and more hands-on
00:10:14.480 | and if you haven't heard of it, skip-level meetings.
00:10:17.000 | This is where the CEO will meet,
00:10:19.680 | not with their direct report,
00:10:21.520 | but with the group of people who report to that manager,
00:10:24.560 | without the manager in the room.
00:10:25.840 | It's a notable and well-known management technique
00:10:29.600 | to kind of get information to the CEO a little bit faster.
00:10:33.280 | And what do you think?
00:10:35.800 | Chamath, this got a lot of attention on a Sunday.
00:10:40.920 | Did you have any takeaways from it?
00:10:42.320 | - I was confused when I read it
00:10:45.120 | because everybody was breathlessly panting
00:10:49.640 | about how incredibly insightful it was.
00:10:52.280 | And when I read it, my first thought was,
00:10:54.080 | where's the second half that actually explains what this is
00:10:56.880 | so that you can have an opinion?
00:10:58.880 | And then the next thing I thought was,
00:11:00.440 | I don't really understand what any of this is all about.
00:11:03.280 | And I tweeted this.
00:11:04.160 | In a quarter century now in Silicon Valley,
00:11:06.160 | I think I've encountered two different kinds of people.
00:11:10.840 | One are the groups of people that can go right
00:11:13.520 | to the heart of issues because they can break things down
00:11:16.440 | and look at it from first principles
00:11:18.080 | and just ruthlessly attack what's not working
00:11:22.720 | with absolutely no nostalgia for people
00:11:25.120 | or sunk cost or tech debt.
00:11:27.960 | And then there's everybody else.
00:11:30.840 | And I think that that is an archetype
00:11:33.140 | that actually drives successful companies.
00:11:35.600 | It's not the case that those are only exhibited in founders.
00:11:41.240 | I think it's a psychological makeup of a person.
00:11:44.820 | And the people that have it tend to build good companies.
00:11:47.760 | We'll have such a person, for example, at the All in Summit.
00:11:53.560 | If you look at what Nikesh Arora was able to do
00:11:55.800 | at Palo Alto Networks, it's incredible.
00:11:58.240 | I mean, in eight years,
00:11:59.560 | he's created $80 billion of value.
00:12:01.760 | How do you do that?
00:12:02.600 | I think it's important to understand how that happens.
00:12:05.700 | If you look at a whole bunch of other people
00:12:10.140 | that are managers, Shantanu Narayen,
00:12:12.800 | how has he built Adobe?
00:12:14.480 | Satya Nadella, how did he build Microsoft?
00:12:17.720 | These are all just like a handful of examples
00:12:19.440 | of people that have just tacked on collectively
00:12:23.700 | trillions of dollars of market cap,
00:12:25.360 | way more than all founders added together
00:12:27.520 | in most companies, but for one or two.
00:12:30.320 | So I think the takeaway is instead of looking for some label,
00:12:34.940 | I think you're going to have to do the really hard work
00:12:39.140 | if you want a successful business,
00:12:40.800 | which is when things are working,
00:12:42.620 | have the courage to change the few things
00:12:44.520 | that still need changing.
00:12:45.880 | And when things are not working,
00:12:47.200 | break it down to the studs.
00:12:48.760 | And most people don't have the courage
00:12:51.920 | to go through the glass eating that is required
00:12:55.800 | to get to the other side of that process.
00:12:59.080 | - Sax, did you read the piece?
00:13:00.920 | You clearly saw all the memes and, you know,
00:13:04.400 | I guess, kudos for the piece on X this past weekend.
00:13:11.000 | What are your thoughts on it, generally speaking?
00:13:12.560 | Anything that you took from it that was notable
00:13:15.380 | or is this just some obvious stuff?
00:13:17.220 | - Yeah, I mean, my reaction to it was that
00:13:19.740 | this is hardly new.
00:13:21.260 | I mean, I remember way back in the PayPal days
00:13:23.860 | over 20 years ago, we had a rule that
00:13:26.340 | we wouldn't hire MBAs.
00:13:27.980 | We had a no MBA hiring rule.
00:13:29.820 | And the reason was 'cause we figured out pretty early on
00:13:32.300 | that MBAs were bringing more
00:13:34.620 | of a traditional management toolkit
00:13:36.580 | that seemed less applicable at a startup.
00:13:39.820 | Subsequent to that, obviously Elon has taken
00:13:43.720 | a very hands-on approach at his companies
00:13:46.240 | that Walter Isaacson described in his book as demon mode.
00:13:50.880 | And so that's been described.
00:13:52.600 | Ben Horowitz did a very interesting series of blog posts
00:13:56.560 | about management and addressed the topic
00:13:58.960 | of micromanagement years ago.
00:14:01.240 | And so he's written extensively about it
00:14:03.560 | and put a lot of substance behind it.
00:14:05.360 | Ben drew attention to a book by Andy Grove
00:14:09.160 | that came out 40 years ago called "High Output Management"
00:14:12.440 | that also addressed this problem in significant detail.
00:14:15.960 | And just to boil it down into a nutshell for you,
00:14:18.680 | the way that Grove defines this problem
00:14:22.800 | is he says that the output of a manager
00:14:25.280 | is the output of that person's team.
00:14:29.460 | And whether that manager is the CEO or a team lead
00:14:33.260 | or a VP or what have you,
00:14:35.520 | the way you, again, measure their output
00:14:38.960 | is just to look at the output of their team.
00:14:40.460 | So therefore, you ask what's gonna maximize
00:14:43.720 | the output of that team or that org.
00:14:46.460 | And obviously if the CEO tries to do everything
00:14:50.560 | and make every decision,
00:14:52.640 | probably that's not gonna result in the maximum output.
00:14:55.540 | Conversely, if you delegate everything,
00:14:57.800 | I call it the problem of infinite delegation
00:14:59.500 | where CEO delegates to VP, VP delegates to director,
00:15:03.080 | director delegates to manager,
00:15:04.640 | manager delegates to the summer intern.
00:15:07.520 | - Yeah.
00:15:08.360 | - If all the most important work in the company
00:15:10.040 | gets delegated all the way down
00:15:11.720 | to the newest, most inexperienced people,
00:15:14.040 | that's not gonna work either.
00:15:15.440 | - Yeah.
00:15:16.280 | - So again, you're gonna have to find the right balance.
00:15:18.680 | And again, the way that you figure out
00:15:20.960 | what the balance is is to apply Grove's principle
00:15:24.480 | of maximizing the output of the team or the organization.
00:15:27.660 | - And we've seen in some of the big companies, Sax,
00:15:29.640 | that they've cut out middle management
00:15:32.360 | because so much of the work was just playing telephone
00:15:36.080 | and handing it to the next person
00:15:37.320 | that when you took out that swath of people
00:15:39.000 | at Facebook or Google or Microsoft.
00:15:40.740 | - Right.
00:15:41.580 | - The company seemed to work better.
00:15:42.720 | - Yeah, a great point.
00:15:43.560 | - 'Cause you're taking out a layer of overpaid people.
00:15:44.400 | - But it just took out a huge amount of middle management
00:15:46.760 | and it didn't seem to harm the performance,
00:15:49.200 | performance got better.
00:15:50.540 | And again, you can apply Grove's principle,
00:15:52.200 | look at what happens to the aggregate output.
00:15:54.720 | So this topic has been addressed at length
00:15:58.840 | and I think has been understood for decades.
00:16:01.400 | - Yeah.
00:16:02.240 | - Maybe the only new thing here is a little bit of branding
00:16:05.200 | around founder mode versus manager mode.
00:16:08.760 | The problem with that branding is,
00:16:10.760 | I think it's an overly simplistic
00:16:13.000 | and Manichaean view of the world
00:16:15.480 | where it kind of fits into really all of the PG essays
00:16:20.480 | and the YC model, which is founders are always right.
00:16:26.680 | And everybody else in the ecosystem,
00:16:29.280 | especially traditional managers,
00:16:30.720 | they're either liars or fakers.
00:16:32.880 | And that basically has been the Manichaean model
00:16:36.240 | that Paul Graham has been pumping out for decades.
00:16:38.640 | - Is there something wrong with that model
00:16:40.560 | you're kind of looking at
00:16:42.000 | and I'm kind of hearing a little bit of a tone there,
00:16:44.040 | maybe I'm reading into it, that that doesn't match reality.
00:16:48.560 | - Well, sure.
00:16:49.400 | I mean, look, the whole ecosystem at Silicon Valley exists
00:16:52.360 | to help make founders successful.
00:16:54.000 | I mean, I was a founder myself, now I'm an investor.
00:16:56.880 | There's also talent who don't found companies themselves
00:16:58.920 | but wanna join them.
00:17:00.760 | The whole thing is organized to make founders successful.
00:17:04.240 | And the whole idea that people are out to get the founder,
00:17:07.120 | I mean, this is a really antiquated idea.
00:17:08.840 | There was some truth to it in the 1990s.
00:17:11.960 | By 2002 or 2003, when Peter Thiel created Founders Fund,
00:17:16.640 | he called it that to emphasize
00:17:18.720 | that founders should be in charge.
00:17:21.080 | I'd say by the mid to late 2000s,
00:17:23.120 | no one really disagreed with that anymore.
00:17:25.360 | So this whole idea that people are out to get founders
00:17:28.240 | rather than make them successful
00:17:30.040 | is antiquated by at least 15 years, I would say.
00:17:32.520 | - Part of Y Combinator's marketing, wouldn't you say,
00:17:34.400 | is like to kind of put themselves next to the founder
00:17:36.840 | and say, "Hey, we're the only ones
00:17:38.440 | "who really care about founders.
00:17:39.880 | "Everybody else is out to get them."
00:17:42.040 | - Yeah, I mean, look, there was some truth to it,
00:17:44.680 | I would say, in the 1990s.
00:17:46.480 | There was a prevailing view in Silicon Valley 30 years ago
00:17:50.220 | that once the company got to a certain level of size,
00:17:52.320 | you hire a professional manager.
00:17:54.040 | Subsequent to that, people figured out
00:17:55.760 | that the best performing companies
00:17:57.160 | are indeed the founder-led companies,
00:17:59.680 | the ones that keep the founder engaged
00:18:01.160 | for a long period of time.
00:18:02.520 | They tend to build the most value.
00:18:04.520 | Everybody understands that.
00:18:05.480 | Everyone wants the founder to be successful.
00:18:07.560 | The question is, how do you make this founder successful?
00:18:09.760 | And I think there is a perverse dynamic
00:18:11.580 | where if you teach the founder that,
00:18:13.280 | "Hey, you're always right,
00:18:15.080 | "and everybody else is a liar and a faker,"
00:18:17.640 | then that can create a distortion effect
00:18:21.200 | where founders don't feel the need to level up.
00:18:24.300 | We all want the founder to level up.
00:18:25.960 | I mean, I would always rather invest in the founder
00:18:28.920 | who has vision over some professional manager, right?
00:18:32.760 | You want the visionary to succeed,
00:18:34.440 | but you need them to level up and learn
00:18:37.080 | just basic management over time.
00:18:38.920 | And if you're teaching them that,
00:18:40.720 | "Hey, founder's always right,"
00:18:42.200 | there's less incentive to do that.
00:18:44.280 | - I think it's a wonderful Rorschach test
00:18:46.160 | for basic intelligence.
00:18:47.440 | I mean, if you just read something,
00:18:49.560 | and all of a sudden, like, "That's my problem,"
00:18:52.200 | it's probably not your problem.
00:18:55.120 | And so it's a great way to actually weed out
00:18:57.960 | the people that were just meant to fail anyways
00:18:59.880 | and just needed a way to externalize their frustration
00:19:03.960 | and have an excuse.
00:19:05.480 | I have never seen a person build a successful company
00:19:08.360 | unless they understood intimately
00:19:11.240 | the core amount of value that their company created
00:19:14.120 | and knew how to balance getting into the weeds
00:19:17.760 | with scaling through other people.
00:19:19.840 | I've never seen a good example of a great CEO
00:19:22.320 | that didn't do that well.
00:19:24.000 | - Yeah, everyone has to figure this out.
00:19:25.640 | - And there is no word salad that explains this.
00:19:29.680 | It's incredibly unique to every single company.
00:19:32.880 | And so there is no panacea here.
00:19:34.880 | - Yeah, I mean, sometimes the founder's job
00:19:36.360 | is to get in the weeds
00:19:37.560 | of something that's broken in a company.
00:19:39.320 | - For sure.
00:19:40.160 | - And get down and micromanage it.
00:19:41.000 | - But wasn't this already understood?
00:19:41.840 | - Other times, you want to step back
00:19:43.280 | and let the sales team cook 'cause they're crushing it.
00:19:46.000 | - I would say it's their job
00:19:47.680 | to have the strategic wherewithal
00:19:49.400 | to understand what it takes to win
00:19:52.080 | and then win at all costs.
00:19:53.720 | That's their job.
00:19:54.640 | And if you cannot win
00:19:56.480 | and then you externalize your inability to win
00:19:59.280 | to an essay or something else,
00:20:02.680 | you're probably going to fail
00:20:03.960 | and your company's probably going to fail.
00:20:05.840 | - Friedberg, you're back in the founder's seat
00:20:07.680 | after a brief sojourn as an investor.
00:20:12.440 | Any of this ring true to you or obvious?
00:20:15.040 | Did you read the essay?
00:20:16.080 | Any thoughts on it, the reaction to it?
00:20:18.040 | - I mentioned this when we had our conversation in 2020
00:20:23.600 | around governance and why governments
00:20:26.480 | seemed to not be able to synthesize different data
00:20:29.560 | to make decisions.
00:20:30.400 | They were being told what to do by their subordinates.
00:20:34.800 | The difference for me is leading versus managing.
00:20:39.800 | That a traditional manager,
00:20:42.640 | and I've seen this in a lot of companies.
00:20:44.160 | I even saw this a lot at Monsanto.
00:20:47.200 | The manager says to the people that report to them,
00:20:52.200 | what are you guys going to do?
00:20:54.760 | And then the people,
00:20:56.600 | they go down to the people that report to them
00:20:58.240 | and they say, what are you guys going to do?
00:21:00.280 | And so you end up net net developing
00:21:02.400 | this kind of bottoms up model for the organization,
00:21:05.880 | which is effectively driven
00:21:08.720 | with a diffusion of responsibility.
00:21:13.920 | And as a result, the lack of vision.
00:21:16.020 | The leader says, here's what we are going to do.
00:21:20.680 | And here's how we are going to do it.
00:21:22.880 | And then they can allocate responsibility
00:21:25.160 | for each of the pieces necessary.
00:21:27.560 | And the leader that's most successful
00:21:29.760 | is the one that can synthesize the input
00:21:32.240 | from the subordinates and take that synthesis
00:21:34.920 | to come up with a decision or a new direction
00:21:37.680 | rather than be told what's the answer by the subordinates.
00:21:41.200 | So leaders, I think fundamentally are able to number one,
00:21:44.480 | understand the different points of view
00:21:47.120 | of the people that report to them.
00:21:48.960 | Number two, set a direction or set a vision,
00:21:51.680 | really clearly say this is where we are going.
00:21:53.680 | And then number three,
00:21:54.800 | figure out how to allocate responsibility
00:21:56.960 | to the people that report to them to achieve that objective.
00:21:59.240 | Whereas a manager is typically being told
00:22:02.080 | what's going to happen in the organization,
00:22:03.600 | like a giant Ouija board with 10,000 employees' hands
00:22:07.240 | on the writing thing that go around
00:22:08.700 | and try and write the sentences.
00:22:09.880 | And ultimately you just get a bunch of gobbledygook.
00:22:12.000 | As companies scale and they bring in these quote,
00:22:13.800 | professional managers,
00:22:15.260 | they're typically kind of looking down and saying,
00:22:17.000 | hey, what are we going to do?
00:22:17.840 | What's going to happen next?
00:22:19.240 | And they're not actually setting a direction.
00:22:20.720 | Whereas someone who's a founder typically feels
00:22:23.180 | the authority to be able to set the direction.
00:22:25.660 | But to Chamath's point,
00:22:27.240 | Satya Nadella, Sundar, Nikesh.
00:22:30.280 | - Tim Cook.
00:22:32.280 | - You can kind of, Tim Cook, a great example.
00:22:34.960 | There are some really great leaders
00:22:36.640 | that have run organizations that are already scaled
00:22:39.580 | and then taken them to an order of magnitude
00:22:42.580 | or two orders of magnitude beyond where they were
00:22:44.520 | when they step in.
00:22:46.200 | So I don't think that there's necessarily a founder.
00:22:48.720 | - And they each did it uniquely.
00:22:50.000 | I don't think there's a single way
00:22:51.440 | in which any one of those companies is run
00:22:54.040 | that necessarily you could have cut and pasted it
00:22:56.240 | to the other companies and have it worked.
00:22:58.240 | So it requires a smart individual
00:23:01.480 | who understands their company
00:23:04.000 | and their human capital intimately well
00:23:06.280 | and their business model.
00:23:07.560 | - By the way, what's important
00:23:08.400 | about what you just said, Chamath,
00:23:09.600 | is that that is effectively,
00:23:12.120 | I think the definition
00:23:13.520 | of being a successful founder,
00:23:16.360 | understanding from first principles.
00:23:17.960 | So these are leaders that can think from first principles,
00:23:20.440 | not necessarily from comparables.
00:23:22.740 | Most managers are taught in business school,
00:23:25.040 | here's how a business is run,
00:23:26.420 | here's how this person did it,
00:23:27.480 | here's how this company did it.
00:23:28.880 | And then they go and they try and cookie cutter repeat that.
00:23:31.400 | That doesn't really work
00:23:32.720 | because every business, if it's going to be successful,
00:23:34.840 | it has to be unique.
00:23:36.120 | And so the ability to actually think uniquely
00:23:38.320 | and identify from first principles
00:23:40.200 | the means necessary to achieve the mission
00:23:42.240 | of the organization, I think is a critical,
00:23:45.120 | critical leadership trait.
00:23:47.080 | - Maybe.
00:23:48.320 | - Yeah, that leader can come from a founder or not,
00:23:50.440 | but this ability to kind of, yeah, ignore convention,
00:23:53.500 | ignore comparables and think for yourself.
00:23:55.900 | - I don't see most founders starting a business
00:23:59.160 | because they've done some first principles analysis.
00:24:01.720 | I see most founders starting companies
00:24:03.640 | because they see a gap that they're pretty sure exists.
00:24:07.720 | But I think the biggest thing that founders have
00:24:11.920 | is this intersection of fearlessness and naivety.
00:24:15.260 | And so if they knew too much, they would never do it.
00:24:19.800 | That I don't think necessarily means
00:24:22.000 | that the entire set of all of those people
00:24:23.960 | are actually great first principles thinkers.
00:24:26.460 | The first principles is required after product market fit.
00:24:29.840 | What's required before product market fit
00:24:31.880 | is risk-taking, curiosity, relentless iteration.
00:24:36.460 | And those are a very different set of characteristics.
00:24:39.560 | - Is it true that some founders
00:24:40.960 | can then change their toolkit?
00:24:42.760 | Absolutely.
00:24:43.880 | But is it true that, quote unquote,
00:24:45.440 | because you have a title, you have those things?
00:24:47.340 | Absolutely not.
00:24:48.680 | Otherwise, we would have a much higher success rate
00:24:51.200 | in Silicon Valley than we do.
00:24:52.440 | There is a reason why 95% of these companies go to zero.
00:24:56.160 | It's because the ability to do the first thing is rare,
00:24:59.660 | and then the ability to transition is even more rare.
00:25:02.620 | - Well, building a unique business.
00:25:04.000 | And I think if you look at Airbnb,
00:25:06.440 | and I'll give Brian Chesky incredible credit for this,
00:25:09.640 | you look at Uber, you look at how Steve Jobs built Apple,
00:25:12.400 | what every one of these businesses have in common
00:25:14.960 | is that they were all built in a unique way.
00:25:17.440 | And I think that's what makes great businesses
00:25:19.480 | is that they identify their own path,
00:25:21.680 | their own unique path for how to operate a business
00:25:24.440 | to scale to achieve the mission of the organization.
00:25:27.040 | And I think that that's what's typically lacking
00:25:29.000 | in trained managers who use comparables and biases
00:25:32.240 | from their prior experience
00:25:33.360 | and what they've been trained and taught
00:25:35.060 | to use as a cookie cutter type model,
00:25:37.080 | which doesn't create unique value.
00:25:38.600 | It makes your business look like the other,
00:25:40.680 | and ultimately it commoditizes some aspect of the business
00:25:43.460 | to the point that the value of the business goes down.
00:25:46.000 | But to build something unique
00:25:47.220 | where you're constantly finding the unique path
00:25:49.040 | that gives you an advantage
00:25:50.440 | is what all of these companies have in common.
00:25:52.560 | Whether that unique path was made
00:25:54.400 | by an experienced hired person or a first time founder,
00:25:58.240 | it's this ability to kind of build a unique competency
00:26:01.020 | that I think makes them all distinct as a group.
00:26:03.320 | - I think that's a really good point.
00:26:04.740 | Like if you look at some technology businesses,
00:26:06.780 | they start off looking like technology businesses
00:26:09.200 | and what they really are
00:26:10.240 | are tech enabled consumer businesses.
00:26:13.780 | Other businesses are pure technology companies.
00:26:16.140 | And why that distinction is important in this context
00:26:19.320 | is that there are certain kinds of challenges
00:26:21.320 | you have in the former
00:26:22.420 | that you just don't have in the latter and vice versa.
00:26:25.160 | So if you're a Google or a Facebook
00:26:26.880 | and you run into some problem,
00:26:29.240 | typically the solution is you can engineer around it
00:26:32.080 | because ultimately your product is free
00:26:33.960 | and there's a different way to scale and grow
00:26:35.680 | and create feature value
00:26:37.760 | because those are also incrementally free.
00:26:40.160 | If you look at a company like Airbnb
00:26:42.080 | and if you look at the last six months in the stock,
00:26:45.360 | what is it telling you?
00:26:46.320 | It's telling you that the people that own it
00:26:48.200 | have realized that it's less of a technology business
00:26:51.120 | and it's more of a cyclical business
00:26:54.880 | that ebbs and flows with the ability to spend money
00:26:58.080 | on behalf of the consumer.
00:26:59.760 | So now all of a sudden you're put into a different bucket
00:27:02.800 | and you grow as people's belief ebbs and flows
00:27:06.400 | about the amount of spending that consumers have.
00:27:09.500 | That is independent of your product quality.
00:27:11.920 | And what that means is that there's no amount of investment
00:27:14.240 | in technology that you can really make to change that.
00:27:18.120 | People need to have excess capital
00:27:20.800 | in order to spend on vacations
00:27:22.280 | and then you can capture your fair share of that.
00:27:24.960 | But if we're in a recession,
00:27:26.600 | that company will be under pressure
00:27:28.520 | in a way that has nothing to do
00:27:29.640 | with the internal product quality that they exhibit.
00:27:32.140 | So these are just dimensionality.
00:27:34.320 | It's just different kinds of problems
00:27:35.880 | that every single company faces
00:27:37.560 | that are totally unique to its own circumstances.
00:27:40.680 | So if you are trying to point to some label
00:27:45.160 | as the reason why your company is failing,
00:27:47.720 | I would just encourage you to not do that.
00:27:50.000 | - J. Cal, you've interviewed thousands of founders, CEOs,
00:27:53.240 | you've seen successful, unsuccessful.
00:27:55.840 | Do you have a point of view on whether this concept applies
00:27:59.080 | and if you have a general kind of heuristic
00:28:02.420 | for the difference between success and not success?
00:28:05.660 | - It's very different in the place
00:28:08.260 | where Paul Graham and I invest,
00:28:09.760 | year zero, year one of a company.
00:28:12.100 | And Chamath, you alluded to this.
00:28:13.940 | In that time period, going from zero to one,
00:28:17.060 | from no customers to one customer,
00:28:19.040 | it's about really having a team of builders,
00:28:21.540 | people who can actually build a product
00:28:23.500 | and have what's called product velocity in the industry,
00:28:25.620 | the ability to iterate,
00:28:26.760 | and then how much time do they spend
00:28:28.220 | with customers understanding
00:28:29.580 | what those customers' problems are?
00:28:31.500 | And so what we'll see is when the professional managers
00:28:34.240 | try to do that, they try to get a product to market
00:28:37.380 | and achieve what's called product market fit,
00:28:39.580 | and then eventually maybe get product or get market pull,
00:28:43.420 | as Andy Rachleff defines it,
00:28:45.000 | which is the market is seeking your product out
00:28:47.900 | because of word of mouth and because they need it,
00:28:49.940 | which Airbnb and Uber would fall into now.
00:28:52.960 | The difference between those two moments in time
00:28:54.840 | and who's good at it is pretty different,
00:28:56.760 | to your point, Chamath.
00:28:58.120 | So to be able to win both those bets is hard.
00:29:00.620 | That's like a parlay in a way.
00:29:03.120 | Getting the product market fit
00:29:05.540 | requires this sort of relentlessness in innovation
00:29:09.380 | and trying things.
00:29:10.240 | If you have experience in that vertical,
00:29:12.260 | it works against you.
00:29:13.860 | So we were looking at a company today in our firm
00:29:17.980 | and it's people who are from the industry
00:29:19.620 | who have an okay idea,
00:29:23.960 | but they can't get the product built
00:29:25.780 | because they're senior managers
00:29:26.980 | who have 20 years experience in this particular vertical.
00:29:30.540 | You would much rather see neophytes come at the vertical.
00:29:33.020 | So if you look at the companies mentioned here,
00:29:35.140 | Airbnb, Uber, and then SpaceX and Tesla,
00:29:39.880 | you just look at those four companies.
00:29:41.800 | They had no experience, zero, in space,
00:29:46.340 | in travel, in transportation, hospitality,
00:29:50.820 | or electric cars, any of the areas where they found success.
00:29:55.060 | But to actually scale those companies,
00:29:56.900 | that does require a lot of expertise,
00:29:58.820 | a lot of tactics,
00:30:00.060 | and people who can focus in a way on very narrow things.
00:30:05.060 | And that's sometimes founders who are in that first group,
00:30:09.520 | they can't have that focus on but one tiny little thing.
00:30:13.780 | They just get bored with it.
00:30:14.860 | And so they have to learn to have really great people,
00:30:18.340 | recruit really great people,
00:30:19.880 | and actually, yes, delegate to them to run a department
00:30:22.500 | and say, "Hey, the goal of this organization,"
00:30:25.840 | and you saw this, Chamath, when you were at Facebook,
00:30:27.800 | I think, you've told me so many stories,
00:30:29.740 | about you're focused on one thing, the sign-up flow,
00:30:34.140 | getting people to add that second or third friend,
00:30:36.140 | getting them to fill out their profile.
00:30:38.240 | There were key moments you determined would equal success,
00:30:42.460 | and then Zuckerberg, yourself, and that early crew
00:30:45.940 | were able to obsess over those.
00:30:47.380 | So maybe you could talk about Zuckerberg's ride
00:30:50.700 | to getting the first whatever,
00:30:53.140 | 10 million people on the product,
00:30:54.220 | but then 10 million to a billion.
00:30:56.580 | - Well, I mean, Zuck, I think,
00:30:57.660 | did an incredible thing for me,
00:30:59.020 | which is he let my team cook.
00:31:00.640 | I don't remember having skip-level meetings,
00:31:04.340 | this, that, none of this other nonsense.
00:31:06.540 | But at the same time, what he did was
00:31:07.980 | he created air cover for us because, let's be honest,
00:31:10.660 | I was an immature executive at the time.
00:31:12.580 | So I was still learning how to be part of a fabric
00:31:17.100 | of a group of people.
00:31:18.020 | I did not know how to do that.
00:31:19.260 | So I was a little bit of a lone wolf operator with my team,
00:31:22.620 | and I operated that way.
00:31:24.020 | I would, "Hey, guys, we get to 100 million people.
00:31:26.180 | "We're all going to Vegas."
00:31:27.540 | And the rest of the company would be upset.
00:31:29.180 | And Mark and Cheryl's job would be
00:31:31.300 | to clean up the broken glass of everybody else saying,
00:31:34.160 | "There's these haves and have-nots at Facebook."
00:31:36.460 | And in hindsight, who cares, 'cause it all worked.
00:31:38.780 | In the moment, I could see why everybody else
00:31:41.160 | was a little bit upset with that.
00:31:42.220 | So they did a wonderful job of letting this wonderful team
00:31:46.020 | of very curious iterators do their job,
00:31:48.980 | and they basically got out of the way.
00:31:51.340 | And I think that that's a wonderful thing.
00:31:53.040 | But is that repeatable?
00:31:54.600 | Not in any other company, because that context
00:31:57.660 | and timing and moment was very unique.
00:32:00.460 | And again, my approach to the job was unique.
00:32:03.820 | Not necessarily sustainable,
00:32:05.180 | not necessarily good nor bad, different.
00:32:08.100 | And we adapted to those boundary conditions
00:32:10.280 | to maximize how we could deliver.
00:32:12.520 | But it doesn't mean anything
00:32:14.700 | to any other company, in my opinion.
00:32:16.940 | - Yeah, I think it's well said.
00:32:19.400 | There's another good book,
00:32:20.620 | before we go on to our next subject.
00:32:22.200 | Patrick Lencioni, I'm sure some of you
00:32:24.540 | have read "The Five Dysfunctions of a Team,"
00:32:26.780 | but he kind of got into early this sort of ability
00:32:30.860 | to be unliked inside of a company.
00:32:33.200 | And I think I highly recommend the book,
00:32:36.580 | because it really talks about this sort of fear of conflict
00:32:39.700 | and avoidance of accountability and inattention to results,
00:32:43.140 | which is kind of the underpinnings of, I think,
00:32:45.780 | what Paul was getting to with founder mode.
00:32:49.700 | Okay, great discussion, everybody.
00:32:51.740 | Let's, speaking of founder mode,
00:32:54.780 | I mean, this story keeps coming back.
00:32:56.640 | Ryan Breslow is back in the news.
00:33:00.400 | We talked about this company before.
00:33:02.180 | Breslow is the founder of a company called Bolt.
00:33:03.880 | That's a payment startup.
00:33:05.560 | And I have to set the story up in a couple of acts here.
00:33:08.960 | Sax and I have been following it for a while.
00:33:11.160 | Bolt was a payment startup.
00:33:13.640 | They did one-click checkout products.
00:33:15.480 | If you don't know the one-click,
00:33:17.040 | Amazon had a patent for this for a while.
00:33:20.280 | It came out of patent protection.
00:33:23.920 | Everybody sort of jumped on it.
00:33:25.200 | There's Shop Pay by Shopify.
00:33:26.760 | You probably have experience.
00:33:28.480 | You buy something at one store,
00:33:30.520 | and then you log into another store with your phone number.
00:33:33.360 | It has all your credentials.
00:33:34.400 | You've been cookied.
00:33:35.240 | And that basically takes out the friction
00:33:37.320 | and allows you to make a purchase at another vendor
00:33:39.620 | without having to put in all your credentials
00:33:42.680 | and credit card and everything.
00:33:44.480 | So Bolt makes that through an API
00:33:47.120 | for different shops online.
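
For readers who want to see how a one-click checkout network like the one described above fits together, here is a minimal sketch. Everything in it is invented for illustration, including the endpoint paths, field names, and merchant key; this is not Bolt's (or any vendor's) actual API, just the general shape of the flow J-Cal describes: recognize a returning shopper by phone number, then charge their saved, tokenized payment details.

```python
# Hypothetical sketch of a one-click checkout flow (not Bolt's real API).
import requests

API_BASE = "https://api.example-checkout.com/v1"  # invented endpoint
MERCHANT_KEY = "sk_test_example"                  # invented credential


def one_click_checkout(phone_number: str, amount_cents: int) -> dict:
    headers = {"Authorization": f"Bearer {MERCHANT_KEY}"}

    # 1. Look up the shopper. If they've checked out anywhere on the
    #    network before, their profile (address, tokenized card) exists.
    shopper = requests.post(
        f"{API_BASE}/shoppers/lookup",
        json={"phone": phone_number},
        headers=headers,
        timeout=10,
    ).json()

    # 2. Charge the saved payment method in a single call -- the
    #    "friction removed" step: no re-entering card details or
    #    shipping credentials at the new store.
    charge = requests.post(
        f"{API_BASE}/charges",
        json={
            "shopper_id": shopper["id"],
            "amount_cents": amount_cents,
            "payment_method": shopper["default_payment_method"],
        },
        headers=headers,
        timeout=10,
    ).json()
    return charge
```
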
00:33:51.120 | They generated 27 million in revenue
00:33:53.480 | on a $300 million loss last year.
00:33:56.140 | They get two to 3% commission when they sell something.
00:33:59.400 | Anyway, we talked about Bolt back in January, 2022,
00:34:02.560 | because they had raised at an extraordinary valuation,
00:34:05.080 | 355 million at an $11 billion valuation,
00:34:08.720 | 366 times revenue, Sax.
00:34:12.160 | I guess we missed those.
00:34:14.120 | Then Breslow made waves on Twitter
00:34:16.640 | by calling Stripe and YC the mob bosses of Silicon Valley.
00:34:20.520 | He alleged they were kind of acting in cahoots
00:34:24.360 | to keep people from using Bolt.
00:34:26.660 | And he claimed YC was skewing rankings on Hacker News,
00:34:30.440 | all this other stuff.
00:34:31.720 | Anyway, it's been two and a half years
00:34:34.800 | since we talked about this craziness at Bolt.
00:34:37.720 | And Breslow stepped down as CEO after posting that thread.
00:34:40.460 | He was accused of overstating Bolt's customers
00:34:42.720 | and technical capabilities to investors
00:34:45.280 | while raising money, which got probed by the SEC.
00:34:48.600 | They didn't take any action.
00:34:50.560 | And then their valuation was slashed 97%
00:34:53.480 | to 300 million earlier this year.
00:34:55.600 | That's where the story had ended until just last week
00:34:59.160 | when an insane story came out
00:35:00.800 | that Bolt's interim CEO, not Breslow,
00:35:03.320 | had emailed investors informing them out of the blue
00:35:07.240 | that they were going to raise 450 million
00:35:10.020 | at a $14 billion valuation.
00:35:13.640 | This came as a surprise to investors.
00:35:16.040 | They didn't know this was happening according to reports
00:35:18.840 | and that this deal would put Breslow back in charge as CEO.
00:35:22.320 | It was confusing to all the investors
00:35:24.040 | and they've all started to lawyer up
00:35:25.720 | and try to figure this out.
00:35:27.480 | Here's what we know so far.
00:35:29.240 | Bolt is on pace to generate 28 million in revenue this year,
00:35:31.760 | roughly the same as 2021.
00:35:33.940 | And so a $14 billion valuation
00:35:36.800 | would be 500 times revenue, Chamath.
00:35:41.240 | I'm not sure who would pay for that deal,
00:35:44.000 | but this deal is unique in that it's called pay to play.
00:35:47.440 | If you don't know that term,
00:35:48.440 | it basically means if you're an existing investor,
00:35:51.080 | if you don't invest in this round,
00:35:53.120 | you are going to lose your existing shares
00:35:55.080 | or have them diluted massively.
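
To put rough numbers on the two mechanics just described, here is a quick back-of-the-envelope check. Only the $14 billion valuation and the roughly $28 million revenue pace come from the discussion; the cap-table share counts are invented purely to illustrate how sitting out a pay-to-play round erodes a non-participating investor's stake.

```python
# Back-of-the-envelope math for the deal terms discussed above.

# Revenue multiple: $14B valuation on ~$28M of revenue.
valuation = 14_000_000_000
revenue = 28_000_000
print(f"Implied multiple: {valuation / revenue:.0f}x revenue")  # 500x

# Pay-to-play in miniature (all share counts invented for illustration).
# A non-participating investor is diluted by the new issuance; in harsher
# pay-to-play structures they can also be converted or forfeit shares.
investor_shares = 1_000_000        # hypothetical existing stake
shares_outstanding = 100_000_000   # hypothetical cap table
new_shares = 50_000_000            # hypothetical shares issued in round

before = investor_shares / shares_outstanding
after = investor_shares / (shares_outstanding + new_shares)
print(f"Stake before the round: {before:.2%}")   # 1.00%
print(f"Stake if they sit out:  {after:.2%}")    # 0.67%
```
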
00:35:57.400 | So some investment bank out of the Seychelles,
00:36:01.840 | which is that tiny island in Africa
00:36:04.480 | that rich people go on vacation,
00:36:05.820 | was supposed to put in 200 million.
00:36:07.880 | That firm was called SilverBear.
00:36:10.120 | The other 250 million would come from a firm
00:36:12.480 | called The London Fund, in marketing capital
00:36:15.840 | and credits on a venture investing platform.
00:36:18.920 | So anyway, there's a ton of carve-outs here and other stuff.
00:36:22.520 | I'm going to stop there at the notes and just, Sax,
00:36:25.080 | this is the craziest story.
00:36:27.120 | Sorry it took so long to get here.
00:36:28.560 | What are your thoughts on what we're seeing here
00:36:31.260 | in this drama and this absurd valuation?
00:36:33.900 | - Yeah, look, Ryan has been right before.
00:36:38.200 | He's been right in the past,
00:36:39.320 | but in this case, it looks like he's clearly over his skis.
00:36:42.720 | And if the reporting is correct,
00:36:45.400 | it looks like he's trying to drum up interest
00:36:47.360 | in a financing round by representing things
00:36:50.880 | that aren't true or haven't happened yet,
00:36:52.960 | saying that certain investors are in,
00:36:54.560 | and those investors are coming out
00:36:56.000 | and saying that they're not.
00:36:57.560 | The question I think that's relevant,
00:37:00.880 | and that relates back to our previous topic,
00:37:03.000 | is: is this founder mode?
00:37:05.000 | I mean, what makes this not founder mode?
00:37:07.560 | And I think that this is where it'd be helpful
00:37:09.760 | to have a little bit more substance
00:37:11.560 | and not just a branding exercise.
00:37:14.240 | Let me give you a more mundane example.
00:37:16.600 | I was in a board meeting the other day
00:37:18.540 | and I commented to the team
00:37:19.720 | on how they had gotten their burn down substantially
00:37:22.120 | since the last board meeting.
00:37:24.040 | And somebody joked, one of the founders said founder mode.
00:37:26.320 | It was kind of a joke.
00:37:27.240 | Okay, great.
00:37:28.080 | But then it got me thinking,
00:37:29.600 | what if in this board meeting it had gone the opposite way
00:37:32.680 | and the founders had taken the position,
00:37:34.640 | you know what, we're not going to try and cut burn.
00:37:37.240 | We're going to put the pedal to the metal
00:37:39.000 | and accelerate the business and spend a lot more
00:37:41.880 | because somehow it's going to get us
00:37:43.160 | to the next funding round faster.
00:37:44.700 | Founder mode.
00:37:45.540 | Founder mode.
00:37:46.360 | Founder mode too.
00:37:48.320 | Hold on, let me find out.
00:37:50.480 | Oh, that's founder mode, baby, let's go.
00:37:53.320 | Let's double burn.
00:37:55.120 | So when you start branding these concepts
00:37:57.760 | without putting any substance behind them,
00:38:00.440 | and quite frankly, you've never been an operator yourself,
00:38:02.920 | you've never actually created a unicorn company,
00:38:05.680 | but you're representing yourself as a unicorn whisperer,
00:38:08.600 | like you're a guru
00:38:09.520 | in something you've never really done before,
00:38:11.800 | then it just allows people who want to justify bad behavior
00:38:16.040 | to basically get away with doing whatever they want.
00:38:18.680 | And that's what you're seeing with founder mode now, Jake.
00:38:20.680 | The reason why that joke is funny,
00:38:22.840 | you're doing your Tony Montana bit,
00:38:24.680 | is because you can justify any bad behavior as founder mode.
00:38:28.280 | And I think that's what all the memes are about right now
00:38:31.800 | is that founder mode's become a joke
00:38:34.120 | because there's no substance to it
00:38:35.400 | and it allows you to justify anything a founder wants to do.
00:38:39.280 | And I think a more honest way to approach this
00:38:41.520 | would be to define here are the actual behaviors
00:38:46.360 | that make a founder successful,
00:38:48.400 | including when a founder is wrong.
00:38:51.040 | - How about slightly differently?
00:38:52.160 | There are stupid outcomes and smart outcomes.
00:38:55.920 | And there are all kinds of different people
00:38:57.320 | that can get to stupid outcomes and smart outcomes.
00:38:59.700 | And so you have to have the courage to say,
00:39:01.520 | that was stupid, don't ever do that again.
00:39:03.440 | And that was smart, do more of that.
00:39:05.280 | And if you can't distinguish the two
00:39:06.880 | or you can't get to the latter, you're gonna fail.
00:39:09.500 | - Instead of founder mode,
00:39:10.400 | why don't we call it founder responsibility?
00:39:12.480 | Like, what are the responsibilities?
00:39:14.920 | - I mean, founder authority is good.
00:39:15.760 | - Founder authority, founder responsibility.
00:39:17.160 | - Steph Curry is not the founder of the Warriors,
00:39:18.840 | but he's the linchpin.
00:39:20.080 | Okay, my point is teams need winning players.
00:39:24.640 | And then you need to put together
00:39:26.280 | your own game plan for winning,
00:39:27.720 | but the goal should be winning.
00:39:28.860 | If you are not winning, you are a loser.
00:39:31.200 | - And what do winning teams have?
00:39:32.560 | - Really talented, really driven people
00:39:35.180 | in different positions who understand their position
00:39:37.960 | and are held accountable for their positions.
00:39:41.120 | - Winning, winning.
00:39:41.960 | If you're not winning, you're losing.
00:39:43.400 | - Yeah, to be frank,
00:39:44.440 | there's a certain kind of con man
00:39:47.160 | who represents themselves as a guru,
00:39:50.100 | even though they've never done the thing
00:39:51.440 | that they pretend to have done.
00:39:53.880 | And yeah, and in this case,
00:39:56.400 | frankly, you've got someone promoting concepts
00:39:59.200 | about operating when they've never done that before.
00:40:02.640 | They've never scaled a unicorn company,
00:40:04.160 | yet they're pretending to be a unicorn whisperer
00:40:06.720 | who's telling founders how to behave.
00:40:08.680 | And the question you need to ask is this,
00:40:10.160 | at the end of the day, helpful or not?
00:40:12.360 | I mean, we all want to exalt and celebrate founders.
00:40:14.960 | - I want to win.
00:40:16.720 | - Exactly, but we all want to win.
00:40:18.200 | And James Madison said that men are not angels.
00:40:22.040 | What do you do about those cases?
00:40:23.800 | Rare, I think they're pretty rare
00:40:25.640 | where the founder is just wrong about something,
00:40:28.280 | where they're representing that investors
00:40:30.200 | are gonna do something that they're not gonna do,
00:40:32.240 | or Elizabeth Holmes represents that she's got a product
00:40:35.160 | that she doesn't have,
00:40:36.840 | or you have a founder breaking a regulatory requirement
00:40:39.880 | that actually is necessary, that they comply with.
00:40:42.720 | What do you do in all those cases?
00:40:44.480 | You actually need to have some objective standard
00:40:47.080 | of behavior that's not just, oh, founders always right,
00:40:51.720 | traditional managers are always wrong,
00:40:53.960 | because that's just not nuanced enough
00:40:56.000 | to account for what it's actually gonna take to win.
00:40:58.920 | - Well, I mean, if Adam Neumann had listened
00:41:01.440 | to a lot of the smart people he had hired at WeWork,
00:41:04.560 | he might not have signed leases that were so high priced
00:41:08.760 | and gotten away from the playbook that worked
00:41:10.600 | when they started that business,
00:41:11.560 | which was find the cheapest real estate,
00:41:13.560 | charge the highest price you can for it
00:41:15.240 | by making it a community and really nice,
00:41:19.360 | as opposed to getting the class A space,
00:41:21.400 | and then charging a B price.
00:41:24.960 | - To your point, a senior real estate executive
00:41:27.440 | would have put together a forecast of cash flows,
00:41:30.600 | and it would have been a very boring meeting,
00:41:33.080 | but if you didn't have the intellectual wherewithal
00:41:35.400 | to realize that that was a critical meeting
00:41:37.000 | for a real estate business, and then listen to that,
00:41:40.520 | and then manage your burn appropriately,
00:41:42.040 | that was a huge mistake.
00:41:43.280 | So again, it always comes down to,
00:41:47.440 | can you set all the labels and all of the
00:41:51.800 | navel-gazing aside, and can you make good, smart decisions
00:41:55.360 | when you're asked to run the play?
00:41:58.680 | That's your job at every level of a company,
00:42:01.520 | and a CEO that knows how to do that
00:42:03.480 | tends to helm winning companies.
00:42:06.680 | And I don't think that that is exhibited
00:42:08.440 | in whether you are employee zero or employee 20.
00:42:11.880 | It's an intellectual and psychological archetype.
00:42:16.080 | - Yeah, I mean, in my experience,
00:42:17.400 | the founders that are highly successful
00:42:19.720 | are the ones that are hands-on,
00:42:21.880 | they've got a strong vision, and they pursue it,
00:42:23.880 | and they're constantly learning and leveling up.
00:42:27.120 | And they're gonna learn wherever they can.
00:42:29.040 | They are gonna learn from other founders,
00:42:31.040 | they're gonna learn from executives they may have hired
00:42:33.800 | who maybe do have more of a traditional background.
00:42:35.720 | They're gonna figure out what works for them,
00:42:37.120 | and they're gonna discard the parts
00:42:38.840 | that don't work for them,
00:42:39.680 | and they're gonna double down on the parts that do,
00:42:41.840 | and the end result is gonna be a style that works for them.
00:42:45.080 | And there's not a one-size-fits-all to that,
00:42:47.160 | as we've seen.
00:42:48.000 | - That's actually the most important thing
00:42:50.240 | that I've heard in this whole discussion,
00:42:51.480 | that we don't talk enough about.
00:42:52.600 | The people that win are deeply intellectually promiscuous.
00:42:57.400 | They're learning about many, many things.
00:42:59.760 | They're adapting things to their own playbook.
00:43:02.560 | The next day, they may throw that piece away
00:43:04.360 | and take something else because it's something they learned.
00:43:07.140 | We use the word re-underwrite a lot, right?
00:43:09.360 | But it's this idea of constantly re-underwriting
00:43:12.640 | what the conditions on the field are,
00:43:14.120 | so you know what you need to do in that moment
00:43:16.160 | to maximize your chances of success.
00:43:17.600 | - That's the perfect example of this.
00:43:19.280 | When I was running Weblogs Inc.,
00:43:20.800 | we were gonna raise a round of funding.
00:43:22.600 | Mark Cuban had given us the first 300,
00:43:24.560 | and Mark Andreessen was gonna give us 500K
00:43:28.680 | for our blogging company that did Engadget, et cetera.
00:43:30.760 | And I had met Jeff Bezos,
00:43:32.640 | and I told him, "I'm gonna be in Seattle next week
00:43:40.300 | "for the whole week.
00:43:41.140 | "Would love to catch up with you."
00:43:42.000 | I wasn't planning on being in Seattle.
00:43:43.440 | I made that story up just to make it convenient for him.
00:43:46.400 | And he said, "Sure, come by on Tuesday or whatever."
00:43:49.000 | I come by on Tuesday.
00:43:49.840 | I sit with Jeff Bezos, my partner Brian Alvey.
00:43:52.680 | And the intellectual curiosity he brought
00:43:55.120 | to picking apart Engadget and blogging
00:43:58.280 | as this new medium that was very disruptive at the time.
00:44:00.800 | How do you pick the name of the blog?
00:44:02.480 | Tell me about the CMS.
00:44:04.320 | What is the publishing strategy?
00:44:05.760 | How do you hire people?
00:44:07.000 | And I would say, "Oh, you know, we hire people.
00:44:08.600 | "The best people are the great commentators.
00:44:10.280 | "So we look at the great comments, and then we hire them."
00:44:13.320 | And then I watched as Amazon started hiring
00:44:16.600 | the people writing great reviews for their review team.
00:44:18.920 | And I was like, "Oh wow, he really is taking notes on this."
00:44:22.120 | And to your point, Sax,
00:44:23.800 | or maybe Chamath, you made this point.
00:44:25.920 | In terms of hiring, one of the great techniques Bezos had
00:44:29.320 | is a concept called bar raisers.
00:44:31.840 | And a bar raiser at Amazon is somebody you hire
00:44:34.920 | for your team who is so good
00:44:37.400 | that they raise the bar for the entire team.
00:44:40.340 | So when they would say,
00:44:41.300 | "Hey, we're gonna add somebody to this team,
00:44:42.680 | "they're gonna be the 10th person."
00:44:44.200 | You'd say, "Are they better at whatever dimensions
00:44:47.120 | "than the rest of the team?
00:44:47.960 | "And will they raise the bar for everybody here?"
00:44:51.120 | And then you think about some managers,
00:44:52.320 | they hire somebody who fits in.
00:44:54.120 | They hire somebody who doesn't rock the boat.
00:44:55.920 | And there are techniques here in very specific ways
00:44:58.520 | to attract talent to your startup that will become--
00:45:02.520 | - You know, do you guys remember part of the reason
00:45:04.880 | why YC and Founders Fund and Andreessen took this path
00:45:08.960 | was to try to counter Sequoia?
00:45:11.280 | Because Sequoia had this reputation
00:45:13.280 | in the '90s and early 2000s
00:45:15.600 | as somebody that would boot the founder.
00:45:19.000 | But when you look at Sequoia's returns,
00:45:22.240 | what their returns say is these guys are just winners.
00:45:25.200 | They're consummate, incredible winners.
00:45:28.040 | And so what Sequoia had the ability to do,
00:45:30.240 | clearly looking backwards,
00:45:32.160 | is figure out which companies had people
00:45:34.040 | that were adapting themselves and scaling,
00:45:36.640 | and people who were a little bit in a cul-de-sac
00:45:39.560 | and just needed to get replaced.
00:45:40.840 | And again, they were operating
00:45:43.840 | from a first principles perspective, in my opinion,
00:45:46.320 | and they were being pretty ruthless
00:45:49.120 | and acting with zero nostalgia.
00:45:51.260 | So again, it's just a reminder,
00:45:54.960 | there is no easy answer here.
00:45:56.560 | It is so hard.
00:45:57.880 | That is why 95-plus percent of our companies
00:46:01.480 | and our efforts end up with nothing
00:46:04.400 | to show for it at the end of it.
00:46:05.240 | - There was, Freeberg, to bring you in on this discussion,
00:46:07.200 | there was a moment in time, actually,
00:46:09.760 | where this actually crossed over
00:46:11.920 | and you got to witness it,
00:46:12.880 | which was when Larry and Sergei were taking Google public.
00:46:16.600 | And before that, they said,
00:46:18.240 | "I don't know if the markets will let these two PhDs,
00:46:21.480 | "as smart as they are, as driven as they are,
00:46:23.280 | "with such a great product,
00:46:24.800 | "I don't know if they're gonna buy the stock.
00:46:26.100 | "Can we get an adult in the room who's done this before?"
00:46:28.520 | And they found Eric Schmidt, who had run Novell,
00:46:30.880 | and they brought him in as this third part of the triangle,
00:46:35.880 | and then eventually he wound up leaving, et cetera.
00:46:38.800 | What were your thoughts on Silicon Valley
00:46:41.280 | and that transition time and Eric Schmidt's role at Google?
00:46:44.360 | - Well, Eric Schmidt came into Google
00:46:45.720 | through his relationship with John Doerr,
00:46:49.120 | and John Doerr, as a mentor to Larry and Sergei
00:46:51.760 | and investor in Google,
00:46:53.440 | had suggested that they bring in a professional manager
00:46:57.420 | that can help them successfully scale the business.
00:47:00.340 | And John had this relationship with Eric,
00:47:02.680 | and Larry and Sergei started to socialize with Eric,
00:47:05.280 | and it was a very long process,
00:47:07.720 | and they ultimately respected Eric
00:47:09.280 | because of his technical capabilities
00:47:11.400 | and technical background,
00:47:13.080 | which was quite distinct from the other candidates
00:47:15.160 | that they had met during this process.
00:47:17.280 | But it was necessary.
00:47:18.120 | They were first-time founders; they'd never worked anywhere.
00:47:21.160 | They'd never had a job.
00:47:22.160 | They had never had experience.
00:47:23.760 | I think similarly, Sheryl Sandberg, as a partner to Zuck,
00:47:27.480 | ultimately helped him level up,
00:47:29.080 | and obviously Chamath and others prior to that.
00:47:32.520 | But these first-time founders
00:47:33.880 | that haven't had any sort of work experience
00:47:35.960 | and have no concept of organizational dynamics
00:47:39.280 | and the challenges that will arise
00:47:40.600 | as you try and build and scale a team to execute a mission,
00:47:43.600 | typically need to have some degree
00:47:45.000 | of counseling, mentorship, and support.
00:47:47.280 | And so bringing in that degree of experience
00:47:49.680 | with someone who's ready and willing
00:47:50.960 | to partner with the founders,
00:47:53.060 | meaning not necessarily direct them,
00:47:55.240 | but partner with them and help them learn,
00:47:57.800 | as Eric did,
00:47:59.600 | and then ultimately handed the reins back to Larry,
00:48:02.240 | and as Sheryl did,
00:48:03.960 | and ultimately Zuck continued to kind of lead the company,
00:48:07.320 | have been really powerful enabling forces.
00:48:09.160 | But Chamath is right.
00:48:10.320 | The early days of Silicon Valley Venture Capital
00:48:13.040 | were really framed around this concept
00:48:14.720 | where you find some smart technical founding team,
00:48:17.200 | and then you bring in a professional manager.
00:48:19.160 | But that's because so much of the origin of technology
00:48:21.640 | in Silicon Valley was about selling technology
00:48:24.200 | into an enterprise.
00:48:25.040 | So there was kind of a bit of a tried and true
00:48:27.120 | business model and business structure that made sense.
00:48:32.240 | The new era, since the internet, has been quite different.
00:48:35.360 | Every business model imaginable
00:48:36.860 | has been reinvented in Silicon Valley.
00:48:39.120 | And so success, I think, has largely arisen
00:48:41.240 | in reinventing businesses, reinventing industries,
00:48:44.160 | by kind of novel independent thinking leaders,
00:48:46.960 | not necessarily bringing in experienced managers
00:48:50.640 | to scale up a known business model.
00:48:52.840 | - Right, and just, I think the turning point
00:48:55.440 | was when Peter started Founders Fund,
00:48:57.640 | but that was over 20 years ago.
00:48:59.880 | In other words, Peter realized
00:49:01.800 | that that old approach in the '90s
00:49:03.720 | of bringing in the professional management
00:49:05.600 | had run its course and that we needed
00:49:08.520 | to help founders level up and stay in the seat
00:49:11.520 | for as long as possible.
00:49:13.080 | That was 22 years ago that he created that firm?
00:49:16.240 | - Yeah.
00:49:17.080 | - So we're like decades into this,
00:49:18.040 | and that's what kind of feels anachronistic
00:49:19.800 | about this whole discussion,
00:49:21.680 | is it's almost like we're pretending
00:49:24.240 | like we're still in a world
00:49:25.440 | in which founders aren't celebrated and exalted.
00:49:28.560 | It's quite the contrary.
00:49:29.500 | They can do almost anything.
00:49:31.440 | And the question is, how do you help them level up?
00:49:36.160 | And at some point, if you're constantly just saying
00:49:39.160 | that, well, founders know everything,
00:49:40.280 | founders always right, is that actually helping them,
00:49:43.000 | or is it actually reinforcing a mentality
00:49:45.960 | that, oh, I don't really have to learn anything?
00:49:48.200 | 'Cause all the great founders have been learning machines,
00:49:52.160 | but if they had been told throughout
00:49:53.640 | that you're always right, don't worry about it,
00:49:55.720 | then they may not have learned the same way.
00:49:58.280 | - Did you say you met with Eric Schmidt recently, Chamath?
00:50:01.080 | - Well, I had a meeting with Eric Schmidt a few weeks ago,
00:50:04.440 | which blew my mind.
00:50:05.440 | And I think I called Freeberg right afterwards,
00:50:07.680 | but this is just an example of like,
00:50:09.840 | there are just people that know how to win.
00:50:12.580 | He is one of those people.
00:50:14.000 | So as part of 8090,
00:50:15.680 | one of the things that we're doing
00:50:16.760 | is we're building a transpiler.
00:50:19.040 | Right now, everybody is very much fixated on PyTorch,
00:50:22.920 | and the problem with PyTorch and building to NVIDIA
00:50:27.040 | is you have this thing called CUDA in the middle of it,
00:50:29.320 | which is owned by NVIDIA.
00:50:30.320 | And I think that over time, that's problematic.
00:50:32.520 | So the long-term solution
00:50:34.800 | is you need a functional transpiler
00:50:36.560 | that can take CUDA-littered code,
00:50:38.240 | and compile it to any hardware
00:50:39.800 | without losing performance.
00:50:41.920 | So Amazon hardware, Google TPU, et cetera.
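
A rough sketch of the transpiler idea, assuming a toy string-rewriting approach; the portable names (pr_alloc, pr_copy, pr_free) are invented for illustration and not a real API, and a real transpiler would parse the code and lower it to an actual backend rather than renaming calls:

```python
import re

# Invented portable runtime names for this sketch; a real transpiler would
# lower to a concrete backend (a TPU or Trainium runtime), not rename calls.
CUDA_TO_PORTABLE = {
    r"\bcudaMalloc\b": "pr_alloc",
    r"\bcudaMemcpy\b": "pr_copy",
    r"\bcudaFree\b": "pr_free",
}

def transpile(source: str) -> str:
    """Rewrite CUDA runtime calls in host code into neutral equivalents."""
    for cuda_name, portable_name in CUDA_TO_PORTABLE.items():
        source = re.sub(cuda_name, portable_name, source)
    return source

host_code = "cudaMalloc(&d_x, n); cudaMemcpy(d_x, h_x, n, kind); cudaFree(d_x);"
print(transpile(host_code))
# pr_alloc(&d_x, n); pr_copy(d_x, h_x, n, kind); pr_free(d_x);
```
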
00:50:44.920 | So Eric and I sat down,
00:50:46.000 | we were together for a couple of days
00:50:48.160 | when we were in Europe,
00:50:49.000 | and we had like an hour, 90-minute meeting.
00:50:52.600 | The level of technical detail and mastery that he had
00:50:56.320 | blew me away.
00:50:58.280 | And he asked hundreds of very, very, very specific questions,
00:51:03.040 | some of which I knew the answers to,
00:51:04.400 | some of which I didn't.
00:51:06.080 | I was able to ask him what he thought,
00:51:08.560 | he was able to go into the weeds
00:51:10.720 | in an enormous amount of detail.
00:51:12.240 | And what I thought was,
00:51:13.400 | where is he finding the time
00:51:17.080 | to know as much as he does about compilers
00:51:21.000 | and the specifics of AI at this level of detail?
00:51:25.200 | And the point of telling you this example
00:51:28.800 | is just that that is a kind of person
00:51:31.520 | that is just rare and unique.
00:51:33.360 | You can't put a label on that person.
00:51:35.640 | And you just want that person near you
00:51:37.600 | because he has the ability to help you in a way
00:51:40.160 | that most other people just do not,
00:51:42.960 | irrespective of whatever their title is.
00:51:45.160 | So it just goes to show,
00:51:47.240 | you gotta focus on the actual meat of the problem.
00:51:50.920 | And in that specific case,
00:51:52.680 | he gave us two technical directions
00:51:54.960 | that my team and I are exploring now
00:51:57.160 | with this goal that if one of them pans out,
00:51:59.280 | I'll go back to Eric and say,
00:52:00.400 | "Look, we tried these things, this is working,
00:52:03.000 | "this is not working, what do you think?"
00:52:04.760 | And it was an incredibly helpful meeting to me.
00:52:07.120 | And he just offered that time to me.
00:52:09.000 | And so my point is, people like this exist.
00:52:12.560 | And I think that the whole goal, if you're trying to win,
00:52:15.680 | seek those people out independent of what they're doing,
00:52:18.080 | what their title is. - Get them on your team.
00:52:19.720 | - Learn from them as much as you can,
00:52:21.520 | and hopefully you get one step closer to winning.
00:52:24.080 | - And get them on your team if possible.
00:52:25.880 | - Yeah. - Ideally.
00:52:26.960 | - All right, here we go.
00:52:28.720 | A panel of California judges has ruled
00:52:31.160 | that Section 230 does not protect TikTok's algorithm
00:52:34.520 | in the case of the death, tragically, of a 10-year-old girl.
00:52:38.240 | We've talked about Section 230 and legislation
00:52:41.480 | to protect algorithms or to make algorithms more editorial.
00:52:45.760 | We'll get to that in a second,
00:52:46.800 | but let me just cue up the story,
00:52:48.280 | and then we'll play a clip from episode 99 of "All In" podcast.
00:52:51.880 | In late 2021, a 10-year-old Pennsylvania girl
00:52:54.960 | accidentally killed herself
00:52:56.360 | while participating in a blackout challenge
00:52:59.080 | she saw on TikTok.
00:53:00.280 | That is a challenge that encourages viewers
00:53:02.160 | to choke themselves with objects like belts
00:53:04.360 | until they pass out.
00:53:05.400 | According to Bloomberg, this challenge has been linked
00:53:08.560 | with the deaths of 15 young people.
00:53:10.520 | - Oh my God. - Yeah, this is terrible.
00:53:12.800 | - What? - The child's mother
00:53:14.040 | sued TikTok, arguing that their algorithm
00:53:16.880 | served blackout challenge videos to her child,
00:53:19.600 | thus making them responsible.
00:53:21.480 | In the past, algorithmic decisions,
00:53:23.360 | as we've talked about here,
00:53:24.520 | were protected under Section 230
00:53:26.120 | of the Communications Decency Act.
00:53:28.840 | Just to break this down very simply,
00:53:30.920 | if you're Section 230, that grants internet platforms
00:53:33.800 | featuring user-generated content immunity
00:53:36.120 | from being sued over content
00:53:38.040 | published by those users on their platforms.
00:53:40.000 | So YouTube, TikTok, Twitter,
00:53:43.120 | a blogging platform, et cetera.
00:53:45.360 | So because of 230, you're technically not supposed
00:53:49.000 | to be able to go after something like TikTok
00:53:51.920 | because a random user posted crazy videos like this.
00:53:55.000 | But an appeals court, Chamath, has reversed that ruling
00:53:58.240 | with the judge arguing that TikTok's algorithm
00:54:00.200 | represents a unique "expressive product"
00:54:04.360 | which communicates to users
00:54:05.960 | through a curated stream of videos.
00:54:07.800 | The judge claims TikTok's algorithms
00:54:10.440 | reflect editorial judgment.
00:54:12.240 | So here's the interesting legal detail.
00:54:15.320 | The new ruling specifically cites the Supreme Court's
00:54:17.840 | recent decision, Moody v. NetChoice,
00:54:19.760 | in which the court ruled unanimously
00:54:21.920 | to vacate that Florida law we talked about here
00:54:25.000 | that banned tech companies
00:54:26.080 | from deplatforming political officials.
00:54:28.880 | So that was viewed as a big win
00:54:32.840 | for speech protections, Sax,
00:54:32.840 | and moderation rights in Big Tech,
00:54:35.080 | but this is all super ironic
00:54:36.840 | because the same ruling that affirmed Big Tech's
00:54:38.880 | First Amendment protections in content moderation
00:54:41.480 | may also have nullified the Section 230 immunity.
00:54:45.040 | So let's play this quick clip here, Chamath.
00:54:48.560 | This is a discussion you and I had
00:54:50.360 | about should algorithms be part of Section 230
00:54:54.000 | back in 2022.
00:54:55.240 | - Let me break down an algorithm for you.
00:54:57.400 | Okay, effectively, it is a mathematical equation
00:55:00.120 | of variables and weights.
00:55:02.160 | An editor 50 years ago
00:55:04.480 | was somebody who had that equation
00:55:07.600 | of variables and weights in his or her mind.
00:55:10.520 | And so all we did was we translated,
00:55:13.280 | again, this multimodal model
00:55:15.120 | that was in somebody's brain
00:55:16.760 | into a model that's mathematical,
00:55:19.360 | that sits in code.
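
A minimal sketch of that "equation of variables and weights," with the feature names and weights made up for illustration:

```python
# The "editor in code": a fixed set of variables and the weights in front of
# them. All names and numbers here are invented to illustrate the point.
WEIGHTS = {"recency": 0.2, "follows_author": 0.3, "predicted_watch_time": 0.5}

def score(item: dict) -> float:
    """One number per piece of content; the front page is just a sort on it."""
    return sum(weight * item[feature] for feature, weight in WEIGHTS.items())

story = {"recency": 0.9, "follows_author": 0.1, "predicted_watch_time": 0.7}
print(score(story))  # 0.2*0.9 + 0.3*0.1 + 0.5*0.7 = 0.56
```
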
00:55:21.120 | - You're talking about the front page editor
00:55:22.640 | of the New York Times.
00:55:23.680 | - Yeah, and I think it's a fig leaf
00:55:25.680 | to say that because there is not an individual person
00:55:28.000 | who writes 0.2 in front of this one variable
00:55:31.080 | and 0.8 in front of the other,
00:55:33.480 | that all of a sudden
00:55:34.320 | this isn't editorial decision-making.
00:55:36.840 | We need to understand the current moment in which we live,
00:55:41.240 | which is that these computers
00:55:42.800 | are thinking actively for us.
00:55:46.120 | And that is just true.
00:55:47.600 | And so I think we need to acknowledge that
00:55:49.440 | because I think it allows us
00:55:51.200 | at least to be in a position
00:55:53.080 | to rewrite these laws
00:55:54.520 | through the lens of the 21st century.
00:55:56.680 | - There's such an easy way to do this.
00:55:58.120 | If you're TikTok, if you're YouTube,
00:56:00.480 | if you want Section 230,
00:56:01.920 | if you want to have common carrier
00:56:03.800 | and not be responsible for what's there,
00:56:05.560 | when a user signs up,
00:56:06.880 | it should give them the option.
00:56:08.240 | Would you like to turn on an algorithm?
00:56:10.320 | Here are a series of algorithms which you could turn on.
00:56:12.840 | You could bring your own algorithm,
00:56:14.600 | you could write your own algorithm
00:56:16.000 | with a bunch of sliders,
00:56:17.320 | or here are ones that other users
00:56:19.440 | and services provide, like an app store.
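
A rough sketch of that "app store of algorithms" idea, with all algorithm names and item fields hypothetical:

```python
from typing import Callable, Dict, List

# Hypothetical algorithm store: the platform ships a few rankers and the
# user picks one at signup, or none for a plain chronological feed.
Ranker = Callable[[dict], float]

ALGORITHM_STORE: Dict[str, Ranker] = {
    "engagement": lambda item: 2.0 * item["shares"] + item["watch_time"],
    "education": lambda item: item["watch_time"] if item["topic"] == "education" else 0.0,
}

def build_feed(items: List[dict], choice: str | None) -> List[dict]:
    """Apply the user's chosen ranker, or fall back to newest-first."""
    if choice is None:
        return sorted(items, key=lambda item: item["posted_at"], reverse=True)
    return sorted(items, key=ALGORITHM_STORE[choice], reverse=True)
```
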
00:56:21.480 | - So, Chamath, that was your take on it.
00:56:24.240 | And here we are.
00:56:25.280 | What are your thoughts
00:56:26.120 | on Section 230 and algorithms today?
00:56:29.840 | Should, if you use an algorithm,
00:56:32.680 | should that nullify, void your Section 230 protection?
00:56:36.200 | - I think it is a fig leaf to say that
00:56:38.880 | there's a level of abstraction
00:56:40.600 | that should give us immunity.
00:56:42.120 | I think that these algorithms are taught
00:56:45.040 | to iteratively become better and better
00:56:47.720 | at hooking people on whatever is trending in that moment.
00:56:51.840 | And I think it's probably known to these companies
00:56:55.480 | at any given time, what's trending.
00:56:58.080 | So even though they're not literally going
00:57:00.400 | and changing something in the moment,
00:57:03.160 | doesn't mean that they don't support it
00:57:05.160 | and doesn't mean that they're not aware of it.
00:57:06.800 | And it doesn't mean that they haven't created a net
00:57:11.800 | to sort of catch these lightning-in-a-bottle moments
00:57:14.600 | and spread them around.
00:57:16.440 | And so I think we just need
00:57:17.760 | to have a more intelligent discussion
00:57:21.200 | about how responsibility should be shared
00:57:25.840 | in a world where Section 230, yeah, you're right.
00:57:29.200 | I'll just say it again.
00:57:30.040 | We're not writing deterministic code anymore.
00:57:31.920 | It doesn't say if this, then that,
00:57:33.800 | show them the blackout video.
00:57:36.400 | There is no piece of code anywhere,
00:57:38.320 | but it's kind of a fig leaf to say
00:57:40.600 | that that isn't the intention of the way
00:57:42.480 | that the new form of software is written.
00:57:44.760 | - Yes, correct.
00:57:45.880 | Sax, your thoughts here on balancing 230
00:57:48.840 | with the fact that I think we all agree
00:57:51.320 | that these algorithms are the new modern day editors.
00:57:55.360 | - I don't agree.
00:57:56.200 | I have a very different opinion on this
00:57:57.040 | than you guys do, and I always have.
00:57:59.600 | Well, first of all,
00:58:00.440 | let me just speak to the legal precedent here.
00:58:02.920 | Last year, there were two Supreme Court decisions
00:58:05.640 | addressing this very issue
00:58:07.040 | of whether algorithms basically obviated 230 protection,
00:58:12.040 | and the court found that they didn't.
00:58:14.120 | There were a couple of cases involving users
00:58:18.440 | who basically went down a rabbit hole
00:58:21.520 | of terrorist videos and got recruited by ISIS.
00:58:24.720 | Remember this? - Yes.
00:58:25.920 | - And they were sued by the families of victims,
00:58:28.480 | or at least the social networks were,
00:58:29.840 | and the argument was that they basically
00:58:32.000 | had been recruited into ISIS and committed the terrorism
00:58:35.520 | because of the algorithms
00:58:37.880 | and the social networks were liable for that.
00:58:40.240 | Supreme Court found that they weren't,
00:58:42.000 | which is just to say that algorithms
00:58:44.200 | were not treated differently
00:58:46.160 | than if the content had been on a regular feed,
00:58:50.200 | you know, just a chronological feed.
00:58:52.280 | And what Section 230 says
00:58:54.160 | is that you can only hold the user liable.
00:58:57.120 | We're gonna treat social networks as distributors,
00:58:59.640 | not publishers, for the purpose of user-generated content.
00:59:03.920 | I've long held the view
00:59:06.000 | that if you wanna make online platforms liable as publishers
00:59:10.460 | for every piece of user-generated content,
00:59:12.840 | then you're gonna have very little free speech left,
00:59:15.480 | 'cause I just think that corporate risk aversion
00:59:18.000 | is gonna force these guys to become even more censorious
00:59:21.360 | than they already are.
00:59:22.840 | Every piece of content that could potentially lead
00:59:25.400 | to a lawsuit is gonna get shut down,
00:59:28.640 | and I think that the net effect of that
00:59:30.560 | will be much more negative.
00:59:32.800 | Now, to the argument about aren't algorithms
00:59:35.360 | just the new editors, I don't think so.
00:59:38.040 | I think there's a fundamental difference
00:59:39.240 | between what an editor does and what an algorithm does.
00:59:42.280 | If you look at an editorial page
00:59:44.200 | of The New York Times, The Wall Street Journal,
00:59:46.400 | it has a very specific point of view that it is promoting.
00:59:49.960 | In fact, sometimes you can't tell the op-ed page
00:59:52.160 | from the news page,
00:59:53.760 | because it seems like that publication is so biased
00:59:56.760 | in favor of one point of view or another.
00:59:59.440 | An algorithm isn't supposed to do that.
01:00:01.440 | An algorithm is supposed to give you more of what you want,
01:00:04.520 | and so therefore, if you are pro-Trump,
01:00:06.800 | you'll see more pro-Trump content.
01:00:08.840 | If you're pro-Kamala Harris, you'll see more Harris content.
01:00:12.240 | In other words, X, or whatever the platform is,
01:00:15.160 | isn't supposed to be taking an editorial position
01:00:18.200 | on whether it supports Trump or Harris,
01:00:20.560 | but rather is giving you more of what you want.
01:00:23.280 | Elon recently spoke to this.
01:00:25.160 | He had a tweet recently where he talked about,
01:00:27.760 | "Hey, people are coming to him saying,
01:00:29.720 | "Hey, I'm seeing all these offensive things in my feed.
01:00:33.120 | "Why is this?"
01:00:33.960 | Well, it turns out that they had been sharing
01:00:37.040 | those posts that were offensive to them,
01:00:39.000 | and so they were seeing more of them,
01:00:41.120 | and that's a really good example
01:00:42.560 | of how the algorithm just gives you more
01:00:45.280 | of what you're interacting with,
01:00:47.040 | so don't interact with outrage porn.
01:00:49.180 | If you don't want to be outraged,
01:00:50.440 | stop interacting with outrage porn,
01:00:52.160 | and you'll see less of it.
01:00:53.800 | I just think that is fundamentally different
01:00:55.760 | than having an editorial point of view.
01:00:57.760 | X does not have an editorial point of view
01:01:00.280 | that it wants you to see more of that outrage.
01:01:02.560 | It's simply that the user is making clicks
01:01:05.520 | and hitting the share button in a way
01:01:08.340 | that the algorithm interprets as saying,
01:01:10.520 | "Oh, this user wants more of that content."
01:01:12.740 | - Yes.
01:01:13.580 | - And so this is why I don't think
01:01:15.340 | that all of a sudden Section 230 protections
01:01:17.360 | should be voided.
01:01:18.980 | I just think it's a fundamentally different thing
01:01:21.560 | than what a publisher does.
01:01:23.480 | - I have a third view,
01:01:24.400 | and I'll bring you in on this discussion here, Friedberg,
01:01:27.000 | is that these algorithms have become so powerful,
01:01:30.960 | they are even better than editors
01:01:33.640 | at catering to users' needs
01:01:35.560 | because they're obviously one-to-one casting, right?
01:01:38.600 | And so this is a perfect opportunity
01:01:41.060 | not to get rid of 230, but to evolve it.
01:01:44.100 | And as I said in that previous clip,
01:01:46.520 | there is a big issue as to who's making these algorithms,
01:01:50.260 | in the case of TikTok, the Chinese government's influence,
01:01:53.240 | and what is their intention?
01:01:55.780 | And are they being thoughtful about it
01:01:57.440 | 'cause they are powerful?
01:01:58.560 | And what responsibility do they take?
01:02:00.200 | Is the responsibility stopping at increasing the session?
01:02:03.760 | Because if you want to increase a session,
01:02:05.720 | all you have to do is keep showing more rage
01:02:08.840 | and getting people more upset,
01:02:10.120 | and that leads to division in society
01:02:11.760 | and all kinds of weird things that can occur.
01:02:14.140 | And so empowering users and having more transparency
01:02:17.040 | would be a great way for the government,
01:02:19.840 | for individuals who are frustrated with this,
01:02:22.640 | and the platforms to find some common ground and evolve.
01:02:25.480 | - What do you think of this possibility, Friedberg,
01:02:28.560 | of maybe you come to YouTube
01:02:31.320 | and instead of shutting down 230 and causing chaos,
01:02:33.600 | to your point, Sax, which I agree with that,
01:02:35.400 | it would cause chaos and more censoring,
01:02:37.840 | just saying to people, "Hey, welcome to YouTube.
01:02:41.320 | Here are three ways you can view your content by default.
01:02:44.080 | You can pick our algorithm,
01:02:45.600 | you can pick an education-leaning algorithm,
01:02:48.080 | you can pick a music-leaning algorithm,
01:02:50.280 | or you could have no algorithm at all
01:02:52.840 | and you'll just be faced with a directory."
01:02:55.320 | What do you think of that as maybe a middle ground here?
01:02:58.280 | - I'm not sure that giving people a choice of an algorithm
01:03:02.080 | is gonna work.
01:03:02.920 | Consumers just wanna have stuff that incites emotion.
01:03:07.920 | There's a reason horror movies do well,
01:03:11.820 | and also romantic comedies, and also adventure films.
01:03:14.480 | Like, they're all emotive.
01:03:15.840 | At the end of the day, the more emotive something is,
01:03:18.400 | the more emotionally inciting it is,
01:03:20.360 | the more likely you are to kind of
01:03:22.160 | wanna have something like that again.
01:03:24.640 | And so, that's simply the nature of how, you know,
01:03:28.760 | humans interact with the world around them.
01:03:32.040 | When something is emotionally inciting,
01:03:33.600 | you want more of it and you get more of it,
01:03:35.200 | and that creates the dopamine and that drives the behavior.
01:03:38.340 | The crazy thing about social media,
01:03:40.320 | or digital media, firstly,
01:03:42.200 | is that the feedback cycle's much faster.
01:03:45.360 | It used to be that you put out a book,
01:03:46.980 | you wouldn't get the results on how the book sold
01:03:48.880 | and how many people liked it
01:03:49.960 | and how many people read it 'til a year later.
01:03:52.200 | And then with movies, you get results in a weekend.
01:03:54.760 | But with social media, you get results instantly
01:03:56.780 | when a piece of content comes out online.
01:03:58.680 | You get an instant result.
01:04:00.360 | And then, with the concept of a feed,
01:04:02.520 | where the next content is dynamically selected for you
01:04:05.180 | based on your reaction to the content prior to it,
01:04:08.600 | you kind of get this immediate feedback loop
01:04:10.280 | that then tunes and lines up the next thing,
01:04:12.080 | and now you're getting hundreds of iterations an hour.
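
A minimal sketch of that feedback loop, with the update rule, rate, and topic names invented for illustration:

```python
# Each reaction immediately re-weights what gets served next; run this a few
# hundred times an hour and you have the loop described above. All numbers
# and topic names are made up.
affinity = {"mountain_biking": 1.0, "surfing": 1.0, "news": 1.0}

def record_reaction(topic: str, engaged: bool, rate: float = 0.3) -> None:
    """Nudge affinity up on engagement, decay it otherwise."""
    affinity[topic] *= (1 + rate) if engaged else (1 - rate)

def next_item_topic() -> str:
    """Serve whatever currently has the most momentum."""
    return max(affinity, key=affinity.get)

for _ in range(10):              # ten rapid iterations of the loop
    record_reaction("mountain_biking", engaged=True)
print(next_item_topic())         # mountain_biking
```
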
01:04:14.320 | So, I'm not sure that there's really this, like,
01:04:16.540 | simple solve where let me show you something
01:04:18.520 | that's, like, less inciting.
01:04:19.960 | At the end of the day, if it's something that's horrific
01:04:21.760 | or something that's romantic or something that's inspiring,
01:04:26.340 | if it's emotionally kind of activating enough,
01:04:28.960 | you're gonna be happy watching the next one,
01:04:30.800 | you're gonna keep watching it,
01:04:31.640 | and you're gonna keep watching it.
01:04:32.520 | - I think there is a relatively easy solve,
01:04:35.080 | which is you could simply open-source the algorithm
01:04:38.260 | like X has done.
01:04:39.520 | So, we can see, is it messing with your mind or not?
01:04:42.400 | Is it actually biased?
01:04:44.160 | Are they actually inserting editorial opinions
01:04:47.160 | into the algorithm?
01:04:49.320 | And so, if you open-source it,
01:04:50.560 | people can see what it's doing.
01:04:52.120 | I actually, I hear this point of view a lot,
01:04:54.280 | that social media is only preying on negative emotions.
01:04:57.740 | That argument is made a lot by people
01:04:59.520 | who want to regulate it more,
01:05:00.880 | and they want more censorship,
01:05:02.160 | and that's why they're making that kind of argument.
01:05:05.000 | I don't think that the algorithm is only reacting
01:05:07.680 | to outrage or incitement.
01:05:08.720 | I think quite the contrary.
01:05:10.080 | What I've seen on my X feed
01:05:11.800 | is that it's showing me more stuff that I like,
01:05:14.000 | and I'm-- - That's right.
01:05:15.120 | - I mean, the algorithm's gotten so good.
01:05:17.360 | - Yeah, like, if you go on YouTube, I got,
01:05:19.680 | I love watching videos of mountain biking,
01:05:22.560 | and I see all these crazy mountain biking videos now,
01:05:25.360 | and I love watching them.
01:05:26.440 | I'm like, dude, this is awesome.
01:05:27.440 | It's exciting for me. - Right, so I don't think
01:05:28.280 | it's about making me angry.
01:05:29.720 | - I hear what you're saying,
01:05:30.640 | but I would offer something slightly different,
01:05:33.320 | which is, I think that these algorithms focus on momentum.
01:05:36.820 | So, meaning, if Freeberg spends the next 15 minutes
01:05:40.680 | looking at mountain biking videos,
01:05:42.120 | the momentum shifts to mountain biking.
01:05:43.800 | If he then goes to surfing, the algorithm goes to that.
01:05:46.640 | - There's a decay function, that's right, yeah.
01:05:48.720 | And the reason is that the way that these,
01:05:51.440 | if you go back to the actual models themselves,
01:05:54.840 | the way that they're architected, right,
01:05:56.560 | like, if you look inside of a transformer, what is it?
01:05:58.680 | There's a neural network part,
01:06:00.160 | and then there's a self-attention part.
01:06:01.880 | What is the self-attention thing trying to do?
01:06:03.920 | It's trying to figure out the momentum
01:06:05.600 | and the importance of a given input.
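
A minimal single-head version of that self-attention step, simplified (one head, no masking, random weights), just to ground the idea that the attention weights assign an importance to each input:

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Single-head attention: the softmax weights are the importance scores."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])       # pairwise relevance
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)            # softmax over inputs
    return w @ v                                  # importance-weighted mix

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                       # 4 inputs, 8 dims each
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)        # (4, 8)
```
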
01:06:08.080 | So, the core structure of the way that, like,
01:06:11.160 | we've solved the problem today
01:06:13.960 | is erring towards a thing where,
01:06:16.640 | if actors in the system wanted to create momentum
01:06:21.640 | in one direction versus the other,
01:06:25.040 | they can probably do it before they get caught,
01:06:27.400 | because the algorithms will amplify that.
01:06:30.840 | And I think this is the whole point,
01:06:32.240 | where these blackout videos,
01:06:33.560 | it's not as if somebody wanted to consume that, per se,
01:06:37.120 | as much as there was a moment in time
01:06:39.520 | where the momentum was towards those videos,
01:06:43.760 | a large swath of people got it,
01:06:45.980 | and then the unfortunate tragedy
01:06:48.200 | is the very small percentage of people
01:06:50.000 | acted on it and were killed.
01:06:51.880 | This can happen over and over and over again
01:06:53.800 | on all kinds of different things.
01:06:56.040 | And I think that's what needs to be addressed.
01:06:57.760 | - And maybe 10-year-olds shouldn't be allowed
01:06:59.600 | to use TikTok without any supervision.
01:07:01.440 | Maybe that's a better answer.
01:07:02.760 | - That could be a better answer, too.
01:07:04.080 | - I think, based on some of our discussions,
01:07:06.600 | yeah, having kids not have these phones at school
01:07:09.200 | and putting them into the pouches
01:07:10.720 | and no social media until you're 16 or 17,
01:07:13.320 | because one of the other things that's happening here
01:07:15.560 | is these algorithms are so good
01:07:18.680 | that they are now causing a dopamine deficiency in children.
01:07:22.840 | It's causing depression because you can get desensitized,
01:07:27.920 | and every time you swipe up, you get a dopamine hit,
01:07:30.680 | and then your brain gets reset,
01:07:32.820 | and you have less ability to get joy
01:07:35.820 | from having dinner with your family
01:07:38.400 | or playing cards with your friends
01:07:40.000 | or doing any other things,
01:07:41.520 | and that's what doomscrolling does.
01:07:43.000 | That's why we lost you, Sax, to the poker game.
01:07:45.040 | You're too busy doomscrolling,
01:07:46.240 | and you got addicted to it,
01:07:47.160 | and you don't come to the poker anymore.
01:07:49.320 | But this is what we need to do.
01:07:50.800 | As a society, these phones are destroying everybody's lives.
01:07:53.820 | They're killing friendships.
01:07:55.000 | They're killing these poor kids
01:07:57.040 | who are sitting there like zombies.
01:07:59.360 | The algorithm is just too good.
01:08:01.840 | We have now, the snake has eaten its tail.
01:08:03.800 | - I will agree with that to some degree,
01:08:05.220 | that I waste way too much time using X,
01:08:08.240 | but I just do it to stay up on current events
01:08:10.920 | so I can do this pod.
01:08:11.960 | - Yes, that's my excuse, too.
01:08:14.320 | If I didn't have this pod,
01:08:15.480 | maybe I could just stop using it.
01:08:16.880 | That'd be great.
01:08:17.720 | I'd love to have that time back and just read more books.
01:08:20.320 | - You know what?
01:08:21.160 | I'm gonna do it with you.
01:08:22.160 | We're gonna go on a little social media diet,
01:08:24.160 | you and I.
01:08:25.000 | From now on, we're gonna go for sushi,
01:08:26.320 | maybe some sake. - But you've been acting
01:08:27.140 | insane.
01:08:27.980 | You've been acting totally insane.
01:08:28.820 | - You have lost your mind.
01:08:29.640 | It's all you.
01:08:30.480 | - No, you've lost your mind.
01:08:31.300 | - You need a detox.
01:08:32.240 | I'm gonna-- - No, you need a detox.
01:08:34.000 | - No, you need a detox. - Anyone who disagrees
01:08:35.240 | with you is like on Putin's payroll now.
01:08:37.600 | (laughing)
What's wrong with you? - Okay, dasvidaniya,
01:08:39.680 | my friend, Comrade Sachs.
01:08:41.360 | - You're like the reincarnation of Joe McCarthy.
01:08:43.640 | - Dah, dah, dah. - On X.
01:08:45.040 | - Yes.
01:08:45.880 | Since you wanna go into Russia, here we go.
01:08:49.000 | It's 50 days before the election.
01:08:49.840 | - You're the one with Russia on the brain.
01:08:51.760 | - And as predicted, we're 50 days before the election,
01:08:55.720 | and who shows up?
01:08:57.240 | Putin.
01:08:58.080 | This is the greatest story ever.
01:09:01.360 | For those of you listening, Sachs just rolled his eyes.
01:09:03.760 | It's so funny.
01:09:05.360 | This is just great.
01:09:06.760 | Okay, the DOJ-- - Who would've predicted
01:09:08.360 | there's gonna be another Russiagate hoax
01:09:10.080 | just in time for the election? - You did, you did,
01:09:10.920 | and here we are.
01:09:11.760 | And by the way, there's breaking news.
01:09:13.440 | More Russia stuff has dropped while we're on the pod.
01:09:16.760 | - There's a lot of people who started
01:09:18.320 | with Trump derangement syndrome,
01:09:19.880 | and then they held Putin responsible
01:09:22.120 | for Trump getting elected.
01:09:23.080 | So the TDS became PDS,
01:09:25.720 | and they got Putin derangement syndrome.
01:09:27.720 | - Let's get to the story here, Comrade.
01:09:29.960 | (laughing)
01:09:31.680 | - I think this whole story is such a waste of time.
01:09:33.640 | We shouldn't even waste time on it.
01:09:34.960 | - Well, we're gonna do it anyway.
01:09:35.880 | So here we go.
01:09:36.720 | The DOJ charged two Russian media operatives
01:09:39.280 | with infiltrating podcasts, yes,
01:09:42.640 | to push pro-Kremlin talking points.
01:09:45.120 | This was a wild drop on Wednesday, Chamath.
01:09:47.280 | The DOJ charged two Russian media operatives
01:09:49.840 | as part of an alleged, wait for it,
01:09:51.880 | propaganda and misinformation scheme.
01:09:54.160 | According to the charges, these two employees
01:09:56.360 | of the state-run Russia Today outlet, RT,
01:09:59.960 | funneled $10 million into a Tennessee-based media company
01:10:04.320 | to influence public opinion and sow social divisions.
01:10:07.920 | The wire transfers happened between October of last year
01:10:10.960 | and as recently as last month.
01:10:14.120 | This included placing blame on Ukraine, Ukraine, Ukraine,
01:10:17.840 | for its conflict with Russia.
01:10:19.320 | Interesting.
01:10:20.280 | Both Russians are charged with conspiracy
01:10:22.240 | to violate the Foreign Agents Registration Act
01:10:25.240 | and conspiracy to commit money laundering.
01:10:27.680 | They're looking at 20 years in jail.
01:10:29.400 | The indictment did not mention the company by name,
01:10:31.800 | but people quickly figured out who it was
01:10:33.480 | because of their location.
01:10:35.160 | The company's called Tenet Media,
01:10:37.320 | founded in 2023 by Lauren Chen, a conservative commentator,
01:10:41.400 | and their personalities included Tim Pool, Benny Johnson,
01:10:45.200 | Dave Rubin, and Lauren Southern.
01:10:47.520 | These are right-leaning podcasters.
01:10:50.320 | In the last year, Tenet has posted 2,000 videos,
01:10:52.880 | which collectively had 16 million YouTube views.
01:10:55.520 | According to the DOJ charges,
01:10:58.160 | looks like the Russian operatives
01:10:59.560 | were coordinating with the founders.
01:11:01.840 | - Jason, let's say the quiet part out loud, okay?
01:11:03.960 | We've heard this from Bobby Kennedy.
01:11:06.840 | We've heard this from a whole host of other people.
01:11:09.800 | We've now heard this from the DOJ.
01:11:12.440 | There is an inextricable link between the media
01:11:16.560 | and the people that pay for the media
01:11:18.400 | that want to influence what the media says.
01:11:21.400 | We know that to be true.
01:11:22.760 | Now, in this specific case,
01:11:24.640 | I hope the DOJ runs us to ground
01:11:26.920 | and gets justice served.
01:11:30.520 | What I would say generally is
01:11:32.560 | the most incredible decision we ever made
01:11:35.800 | is to not have ads.
01:11:37.080 | It has been the single biggest clarifying function
01:11:43.000 | for our ability to maintain our own opinions
01:11:46.640 | and be credible.
01:11:49.120 | It's allowed us to change our minds.
01:11:51.160 | It's allowed us to evolve.
01:11:52.560 | If we were sort of characters in a movie,
01:11:54.760 | we would have gone through different transitions, frankly,
01:11:58.320 | as these last four years have gone by.
01:12:00.440 | I think that that would have been much harder to do
01:12:03.240 | if we had people paying us money.
01:12:05.640 | So this is just yet another example: with your media diet,
01:12:10.640 | the more it can come from people
01:12:12.640 | who don't need to make money from this,
01:12:15.440 | the better off you will be
01:12:16.400 | because you will get earnest opinions.
01:12:19.280 | And what will happen is,
01:12:20.960 | whether it's large multinational corporations
01:12:24.680 | or foreign state actors,
01:12:26.360 | they are going to find unwitting people
01:12:30.320 | to take this money and blather on some set of talking points
01:12:34.520 | on a whole host of topics.
01:12:36.280 | - Okay, and just to be clear-
01:12:38.640 | - I mean, do we all agree with that or not?
01:12:40.480 | - I mean, I agree.
01:12:41.320 | It was a great move to not have advertising, sure.
01:12:44.360 | 'Cause yeah, there's zero pressure from anybody calling us
01:12:46.800 | saying we're gonna pull our ad spots
01:12:48.280 | and also it makes the pod easy to listen to.
01:12:50.640 | I mean, none of us need the money.
01:12:52.600 | These personalities, Tim Pool, Benny Johnson,
01:12:54.560 | Dave Rubin and Lauren Southern have,
01:12:57.280 | I think most of them have come out and said
01:12:58.720 | they didn't know.
01:12:59.680 | The indictment suggests
01:13:01.040 | that these podcasters didn't know,
01:13:02.720 | and the indictment also says
01:13:03.880 | they were paid huge sums of money,
01:13:05.840 | upwards of $100,000 per episode to do this.
01:13:09.320 | And it's unclear what they were asked to do
01:13:11.840 | in the indictment,
01:13:12.720 | but there are some pressure techniques to
01:13:15.680 | produce tweets around certain things
01:13:20.320 | that the Russians wanted propagated.
01:13:22.520 | - What pressure techniques?
01:13:24.480 | - They said in the indictment
01:13:26.480 | that they wanted more traffic
01:13:29.240 | 'cause they weren't hitting their traffic numbers
01:13:31.160 | and to please amplify different videos and tweets
01:13:34.160 | that were pro-Russian.
01:13:36.960 | So that's in the indictment.
01:13:38.560 | So I guess the question then, Sax, for you is-
01:13:41.000 | - You're totally mischaracterizing this.
01:13:43.120 | - I mean, what was mischaracterized?
01:13:44.760 | I would never mischaracterize this.
01:13:46.120 | And I literally would never,
01:13:47.320 | I have no horse in the race here.
01:13:49.240 | There's no reason for me to mischaracterize it.
01:13:50.800 | So what am I getting wrong?
01:13:52.880 | - You do have a horse in the race.
01:13:54.040 | You believe in the Russia gate hoax for many years
01:13:56.160 | and you wanna basically try and restart that whole thing.
01:13:59.120 | - That's not true.
01:13:59.960 | You can ask me.
01:14:00.800 | I think Russia, I've told this to you many times,
01:14:03.240 | Russia's sole goal is to sow division
01:14:07.040 | between people in this country.
01:14:09.280 | I refuse to allow it to do so on this podcast
01:14:13.400 | or between you and I.
01:14:14.560 | The Russians just wanna produce chaos.
01:14:16.520 | They do it on all sides.
01:14:17.720 | They get Green Party people. - You've changed your story.
01:14:19.160 | - They get Democratic people.
01:14:20.000 | That's been my story from the beginning.
01:14:21.040 | - You've changed your story a little bit.
01:14:22.160 | Okay.
01:14:23.000 | - It's been my, I have said since the beginning
01:14:25.400 | that the KGB's technique in America is to sow division.
01:14:28.880 | That's their sole goal.
01:14:29.960 | They don't care who wins.
01:14:30.960 | They just want us to not pay attention
01:14:32.680 | to what they're doing in Ukraine
01:14:34.320 | or what's happening in Russia.
01:14:35.920 | That's Putin's goal.
01:14:37.280 | - So I think the place to start is with asking the question,
01:14:41.000 | who is Lauren Chen and what is Tenet Media
01:14:43.280 | and what has been their objective or what's their agenda?
01:14:47.360 | And their agenda now for months
01:14:49.480 | has been what's called division grifting
01:14:51.840 | inside the conservative movement.
01:14:53.880 | Lauren Chen's been putting out a lot of tweets
01:14:56.000 | explaining to people that they should vote against Trump.
01:14:59.840 | She's been saying that he has gone soft
01:15:02.280 | on issues like abortion.
01:15:03.720 | She's been pushing a very tough pro-life message.
01:15:07.520 | She's been promoting a, I'd say,
01:15:10.840 | strange and almost fanatical idea
01:15:13.200 | that we should repeal the 19th Amendment,
01:15:16.040 | which gave women the right to vote.
01:15:18.360 | No one in the conservative movement seriously thinks this.
01:15:21.560 | And because of these positions,
01:15:23.160 | she was already being called out
01:15:25.240 | by people like Ashley St. Clair and Mike Cernovich,
01:15:28.480 | who said, "Something's not right here.
01:15:29.800 | "This seems like an op.
01:15:31.640 | "These positions don't really make any sense.
01:15:34.480 | "It seems like she's just causing mischief."
01:15:37.600 | So the place to start here is to recognize
01:15:40.600 | that if the Russians were paying Tenet Media
01:15:44.680 | to put out content, that content was actually anti-Trump.
01:15:49.280 | It was trying to get people to vote against Trump.
01:15:52.080 | So I think the place you have to start
01:15:53.440 | is by asking the question,
01:15:54.600 | why would Vladimir Putin wanna get Kamala Harris elected?
01:15:57.960 | In fact, Putin just came out today
01:16:00.240 | and announced his endorsement of Kamala Harris.
01:16:03.600 | So perhaps the Russians have an agenda
01:16:06.280 | to get Harris elected.
01:16:08.400 | I could see why.
01:16:09.720 | She is the much weaker candidate.
01:16:11.440 | She's afraid to take questions on the tarmac.
01:16:13.520 | Trump is strong.
01:16:15.520 | He's authoritative.
01:16:16.880 | And I could see why they might prefer
01:16:18.800 | to have a President Harris than a President Trump.
01:16:21.840 | - Let me just stop there
01:16:22.840 | and get your reaction to that, J-Cal.
01:16:25.160 | - Yeah, I mean, I think
01:16:28.320 | Cernovich's point that they just wanna sow division
01:16:31.200 | and cause chaos is the key point.
01:16:33.600 | I don't think that Putin cares who wins.
01:16:36.160 | I think he just wants chaos here and he wants distraction.
01:16:39.000 | So whether it's Kamala, whether it's Hillary,
01:16:43.240 | whether it's Trump, I think he just wants chaos.
01:16:45.880 | And that's what the KGB does.
01:16:47.560 | That's what the KGB has always done
01:16:49.440 | is try to get people to not believe reality.
01:16:52.400 | And that is their playbook for all history.
01:16:54.880 | They want to demoralize a population
01:16:57.520 | to not believe in their institutions,
01:16:59.680 | to not believe facts.
01:17:01.080 | And it seems like it's working pretty well
01:17:04.720 | here in the United States.
01:17:05.640 | - I think that this is all wildly overstated,
01:17:08.560 | but to the extent we're gonna deal with it at all,
01:17:10.200 | I think it's really interesting
01:17:12.080 | how when the Russian operation benefits Kamala Harris,
01:17:17.000 | nobody accuses her of collusion
01:17:19.640 | or being a puppet of Russia.
01:17:21.160 | They just say it's about sowing division.
01:17:23.720 | But somehow when the op- - When did that happen?
01:17:26.440 | - You just did that right now.
01:17:27.720 | - No, no, what op happened with Kamala?
01:17:29.280 | I'm not aware.
01:17:30.440 | Was there an op on the left? - I'm just saying that
01:17:31.560 | with Tenet Media, it's benefiting her.
01:17:33.480 | I just explained how.
01:17:34.680 | - Oh yeah, the four people are super anti-Kamala.
01:17:36.280 | - What I'm saying is there's a double standard.
01:17:38.080 | There's a double standard in the coverage here
01:17:39.960 | and how it's interpreted.
01:17:41.440 | When the op, when the alleged op benefits Harris,
01:17:45.840 | it has nothing to do with Harris.
01:17:46.960 | When the alleged op benefits Trump,
01:17:49.080 | Trump must be a Russian agent,
01:17:50.640 | must be a Russian puppet.
01:17:52.400 | There must've been collusion.
01:17:53.820 | And you were at the forefront of people
01:17:55.520 | saying that Trump himself was compromised.
01:17:59.280 | Now- - Oh, I never said Trump.
01:18:00.280 | No, no, I have to correct you there.
01:18:01.240 | I've never said Trump was compromised.
01:18:02.360 | I said the people around him were compromised
01:18:04.120 | because they kept taking meetings
01:18:05.320 | and money from the Russians.
01:18:06.560 | And this is the Russian playbook.
01:18:07.400 | - Do you believe in Russiagate?
01:18:08.720 | - No, I believe- - You never believed
01:18:10.520 | in Russiagate? - Putting the words aside
01:18:12.680 | and claims like this and you're telling me what I think,
01:18:15.040 | you can ask me what I think.
01:18:16.400 | And I can tell you very clearly, again,
01:18:18.560 | that I think their plan is to find simple-minded people
01:18:21.680 | to give them money and to cause chaos
01:18:24.200 | and division in our country.
01:18:25.480 | I don't think that they got to Trump.
01:18:28.180 | I think they got to almost everybody around him.
01:18:30.720 | And if you look at those people who show up
01:18:32.920 | at Putin dinners and you look at who takes money,
01:18:35.700 | it's all around Trump.
01:18:37.120 | And the Russians are experts at this.
01:18:38.760 | And they did it to the Green Party.
01:18:40.440 | They just sprinkle money around people
01:18:42.780 | to cause this type of chaos.
01:18:44.240 | That's the goal sense.
01:18:45.440 | How many times do I got to say this to you clearly?
01:18:47.320 | I don't have a horse in this race.
01:18:48.760 | I think all Americans, I'll say it again
01:18:50.920 | so that you can understand it
01:18:52.200 | and the audience can understand my position very clearly.
01:18:55.240 | Their goal is to get us to fight with each other
01:18:57.680 | so we don't look at their invasion of Ukraine
01:19:00.180 | or what's happening with the suffering
01:19:01.760 | of the Russian people.
01:19:03.400 | And we as Americans should not be partisan on this issue.
01:19:06.440 | We as Americans should all come together,
01:19:08.320 | you, Sax, me, and everybody else,
01:19:10.400 | and say we don't want the Russians interfering
01:19:13.360 | in our elections or causing division,
01:19:15.480 | and we're not going to make this political.
01:19:19.100 | - Look, Jake, this is the third straight election
01:19:21.880 | in America in which we've been led to believe
01:19:25.060 | that there is essentially massive Russian interference
01:19:29.480 | in our election.
01:19:30.800 | But I think we should ask, what did we learn
01:19:33.400 | since the last two elections?
01:19:35.040 | In 2016, okay, we had the Steele dossier,
01:19:38.960 | which hatched the whole Russiagate hoax.
01:19:41.800 | We subsequently found out that Hillary's campaign
01:19:45.120 | created the Steele dossier,
01:19:46.800 | and then that was taken up by the deep state.
01:19:48.760 | They basically did an investigation of the Trump campaign.
01:19:52.080 | They went to the FISA court.
01:19:53.400 | They lied to the FISA court,
01:19:55.120 | and we had a two-year independent counsel investigation
01:19:57.880 | based on a piece of opposition research
01:19:59.760 | that turned out to be completely phony.
01:20:01.840 | But we heard every day on cable news for years
01:20:05.000 | that the Russians had somehow interfered in that election,
01:20:07.160 | never proven.
01:20:08.360 | In 2020, we heard that the Russians again were interfering
01:20:12.240 | because you had those 51 security state operatives
01:20:15.240 | say that Hunter Biden's laptop was Russian disinformation.
01:20:18.660 | That turned out to be a total lie.
01:20:20.160 | The laptop was authentic.
01:20:21.300 | That was just a made-up story
01:20:22.920 | that the Russians were involved.
01:20:24.520 | After that, let's not forget,
01:20:26.280 | there was NewsGuard and Hamilton 68,
01:20:28.980 | which were two bogus media watchdog groups
01:20:31.560 | that were deep state ops that were designed
01:20:34.480 | to get conservative content censored
01:20:36.040 | as Russian disinformation.
01:20:37.640 | So now, after all of these ops
01:20:40.400 | in which it was alleged
01:20:42.280 | that the Russians interfered in our elections
01:20:43.800 | and those stories turned out to be completely bogus,
01:20:45.800 | we now have this new one about Tenet Media,
01:20:49.480 | and I don't know the truth of this story.
01:20:51.220 | I can't say one way or another whether it's accurate or not,
01:20:54.040 | but what I've learned
01:20:54.880 | is not to take these things at face value.
01:20:56.920 | And if we are to take it at face value,
01:20:59.000 | Tenet Media was working against Trump's interests.
01:21:01.520 | So apparently, the Russians don't want Trump
01:21:04.520 | to win the presidential election.
01:21:06.320 | They want Kamala Harris.
01:21:07.880 | - Yeah, and I agree that Trump was not compromised
01:21:11.000 | by the Russians, just the people around him.
01:21:13.120 | And to recap for people, since maybe they don't remember,
01:21:15.800 | Michael Flynn, who was the national security advisor,
01:21:19.240 | pled guilty to lying to the FBI in 2017
01:21:21.800 | about his contacts with the Russians.
01:21:24.960 | Paul Manafort, who was the campaign chairman for Trump,
01:21:28.920 | had ties to pro-Russian figures in the Ukraine
01:21:31.280 | and shared polling data with Russian associates
01:21:33.880 | during the campaign.
01:21:35.280 | He was convicted of tax and bank fraud in 2018,
01:21:39.200 | later pardoned by Trump, like Michael Flynn was pardoned.
01:21:41.840 | Rick Gates, who was the deputy campaign chairman,
01:21:44.920 | he worked closely with Paul Manafort.
01:21:46.600 | He pled guilty to conspiracy and lying to investigators.
01:21:50.360 | George Papadopoulos, I won't bring up
01:21:52.160 | because he was kind of a joke.
01:21:54.040 | And then Roger Stone was investigated
01:21:56.160 | for all his contacts with WikiLeaks.
01:21:59.320 | He was convicted of obstruction, lying to Congress,
01:22:02.160 | and witness tampering in 2019.
01:22:04.040 | So we have all of these folks being convicted.
01:22:09.120 | That's four people convicted.
01:22:10.800 | I don't know what the point of bringing up
01:22:12.040 | all these investigations is when
01:22:13.320 | you said the most important thing, which
01:22:14.880 | is you do not believe that Trump was a Russian agent,
01:22:18.280 | despite the fact that for years, that's
01:22:20.000 | what Democrats maintained.
01:22:21.680 | What I'm maintaining, once again, for the fourth time,
01:22:24.200 | is that the Russians are trying to cause division in America,
01:22:27.280 | and we shouldn't all fall for it.
01:22:28.680 | Hey, Friedberg--
01:22:29.720 | I think we shouldn't fall for Hamilton 68.
01:22:31.560 | I think we shouldn't have fallen for NewsGuard.
01:22:33.240 | I think we shouldn't have fallen for the Steele dossier.
01:22:35.880 | I think we shouldn't fall for the Russiagate hoax, J. Cal.
01:22:38.640 | Stop falling for all these Russian hoaxes.
01:22:40.240 | I agree.
01:22:40.640 | That's why we should be united.
01:22:41.920 | Let's be united.
01:22:42.720 | Friedberg, you've got some passionate thoughts
01:22:44.880 | on Kamala Harris pivoting a little bit
01:22:47.160 | around her economic policies in the last week or two.
01:22:51.520 | Let's wrap with that.
01:22:52.520 | What are your thoughts?
01:22:53.400 | No passionate thoughts, J. Cal.
01:22:54.720 | Did any of you guys see that video
01:22:56.120 | I sent you of Kamala's campaign speech in New Hampshire
01:22:59.720 | yesterday?
01:23:00.480 | So basically--
01:23:00.960 | Summarize it for the audience, yeah.
01:23:02.480 | --she fundamentally pivoted from a lot of the statements
01:23:05.200 | she made a few weeks ago that I think
01:23:08.240 | caused a resounding amount of commentary
01:23:12.040 | about being anti-business, anti-economic prosperity.
01:23:16.360 | A lot of people call these comments socialist in nature.
01:23:19.600 | And apparently, she heard the feedback
01:23:22.320 | and is now pivoting the message and pivoting the policy.
01:23:24.920 | So I think the one question to understand
01:23:28.160 | is, how real is the pivot?
01:23:29.280 | And how much is this about getting elected?
01:23:31.600 | She made a couple of key announcements
01:23:34.280 | at this campaign speech yesterday.
01:23:35.800 | The first one is that she started out
01:23:37.920 | saying that small businesses are the lifeblood of our economy.
01:23:41.160 | She's proposing that we increase the startup tax
01:23:44.240 | deduction from $5,000 to $50,000 if you start up a new business
01:23:48.880 | and you're generating profitable income.
01:23:51.000 | You can deduct up to $50,000 in the year
01:23:53.840 | that you start the company up.
01:23:55.640 | She made an emphasis on getting rid of red tape
01:23:58.280 | and deregulating a lot of aspects
01:24:00.680 | that make it hard to start companies.
01:24:02.280 | She talked a lot about the importance of venture capital
01:24:04.640 | and venture dollars and that there's
01:24:06.480 | a goal to spur 25 million new SBA loan applications
01:24:12.120 | by the end of her term, so in the next four years, which
01:24:15.240 | would be a massive increase in government
01:24:17.000 | lending to small businesses.
01:24:18.840 | She also stepped back the capital gains tax proposal
01:24:22.600 | that was made under the Biden administration
01:24:24.840 | that she previously said was her policy as well
01:24:29.160 | and is now proposing a 28% capital gains tax instead
01:24:33.400 | of the 40-something percent that I think
01:24:35.160 | was the previous proposal.
01:24:36.560 | So a 28% capital gains tax only applied
01:24:39.120 | to households with a net worth greater than $1 million.
01:24:42.960 | But towards the end of her speech,
01:24:44.800 | she still talked a lot and got a lot of cheers around the idea
01:24:48.000 | that billionaires and big corporations
01:24:50.320 | need to pay their fair share and that there
01:24:52.560 | needs to be methods of increasing
01:24:54.120 | the taxation of billionaires and big corporations
01:24:57.360 | as they're kind of classified by her
01:24:59.920 | and some of the other Democrats.
01:25:01.680 | So I don't know, is the pivot real?
01:25:03.800 | Is she becoming more pro-business?
01:25:05.360 | Is she responding to the critical feedback
01:25:07.360 | she's received over the last few weeks?
01:25:10.200 | Or is this just about getting elected?
01:25:13.160 | Is this going to persist?
01:25:14.320 | Is this a change in policy, Chamath?
01:25:16.400 | I hand it over to you.
01:25:19.080 | I think that this is all the tactics leading to the debate
01:25:22.840 | on September the 10th.
01:25:24.200 | I think that a lot of this polling,
01:25:27.480 | you're seeing the sugar high fade on both sides.
01:25:31.720 | And I think what we can acknowledge
01:25:35.040 | is that the setup going into this debate
01:25:37.720 | is eerily reminiscent of the setup going
01:25:40.680 | into the first debate between Donald Trump and Hillary
01:25:42.960 | Clinton, where there was a very profound sensation
01:25:47.280 | that there was a popular vote advantage
01:25:49.720 | to the Democratic candidate, Hillary then, Kamala now.
01:25:53.600 | But what everybody got wrong last time,
01:25:58.160 | and now what people, none other than Nate Silver,
01:26:01.440 | are doing a much more refined job of this time around
01:26:04.760 | is taking a more nuanced look
01:26:07.560 | at the electoral college probability, which
01:26:09.520 | is in the opposite direction, favoring Donald Trump
01:26:11.840 | and not Kamala.
01:26:12.840 | So I think that the Democrats have realized
01:26:15.320 | that the sugar high isn't going to win the election.
01:26:18.080 | They need to be specific in ways that get moderates.
01:26:21.960 | And the only way to do that is to tack to the middle,
01:26:25.360 | because a lot of these other trial balloons
01:26:27.240 | were kind of nutty.
01:26:28.640 | And those people are not going to win them
01:26:30.480 | the election in the key states that matter.
01:26:32.240 | Again, it's what we've said before.
01:26:34.240 | Five things in five states, right?
01:26:36.760 | That's what it's going to come down to.
01:26:38.360 | It's also-- I think you've made this point, Chamath, as well,
01:26:41.320 | which is when you're running in the primaries--
01:26:43.520 | and I think Dean Phillips, when he was on the program,
01:26:45.840 | talked about this--
01:26:46.960 | you've got one agenda.
01:26:48.280 | And that tends to be a bit extreme,
01:26:50.040 | because you're trying to get those extreme people in primaries
01:26:52.440 | to come out.
01:26:53.440 | Then, when you get to the general election,
01:26:55.560 | you kind of come towards the middle.
01:26:57.280 | And we said on this very podcast--
01:26:58.800 | and I gave a couple of disclaimers
01:27:00.280 | as the moderator here--
01:27:01.560 | hey, we don't know these are Kamala's policies,
01:27:03.640 | because A, she's not talking to anybody.
01:27:05.560 | B, she hasn't released any.
01:27:07.280 | So maybe she's taking feedback.
01:27:10.960 | I have some inside information on this,
01:27:13.480 | which is there are people in our circles
01:27:16.040 | who are sitting with her saying, these things are crazy,
01:27:20.760 | and maybe pointing to our podcast
01:27:22.400 | and other reactions online.
01:27:24.080 | And maybe we should really think,
01:27:27.200 | what do you want to do, potentially, Madam President,
01:27:31.600 | when you're president, if you do win?
01:27:33.800 | And I think she's starting to think, from first principles,
01:27:36.800 | hey, what is actually good for the country?
01:27:38.640 | That would be a very honest way of looking at this.
01:27:41.480 | She's not thinking from first principles.
01:27:42.840 | Are you kidding?
01:27:43.480 | She's reading from a teleprompter.
01:27:45.320 | No, no, I'm talking about in her policies.
01:27:47.120 | You can make whatever digs you want at her style,
01:27:49.200 | and that's valid.
01:27:50.080 | You can make whatever digs you want at Trump's style.
01:27:52.040 | What I'm talking about is--
01:27:53.000 | You said she's thinking from first principles.
01:27:54.760 | Yeah, well, from her first principles, as to what--
01:27:56.880 | Her writers are giving her new talking points.
01:27:58.680 | OK, listen, can you stop with the personal attacks on people
01:28:01.320 | and just maybe have an intelligent--
01:28:02.880 | It's not a personal attack.
01:28:03.120 | She's a candidate for the presidency of the United
01:28:04.800 | States.
01:28:05.400 | I'm allowed to point this out.
01:28:05.640 | He's actually answering your question.
01:28:06.800 | He's answering your question, J. Cal.
01:28:08.000 | He's saying that they're not real policy.
01:28:09.840 | They're not policy points, but that they're talking about--
01:28:10.640 | Well, let me finish my thought, and then you
01:28:12.200 | can deride her style.
01:28:13.840 | I'm trying to say something about substance.
01:28:15.680 | It's not a style point, Jason.
01:28:16.120 | It's a substance point.
01:28:16.960 | OK, let me finish my point, and then you
01:28:18.640 | can talk about substance or style.
01:28:20.440 | I think what happened here is she
01:28:22.000 | was put into the position as president
01:28:24.680 | and had to, in a very short period of time,
01:28:26.680 | come up with what her positions would be.
01:28:28.840 | Because everybody knows, when you're the vice president,
01:28:30.880 | your positions are what the president thinks.
01:28:32.800 | We've made that point about J.D. Vance,
01:28:34.440 | that his position on wanting a national abortion ban
01:28:37.560 | has nothing to do with Trump's, and Trump's position wins
01:28:39.520 | in that situation.
01:28:40.800 | So here we are.
01:28:41.560 | I think it's a similar situation.
01:28:42.960 | Biden might have been captured and wanted
01:28:45.240 | to do this crazy wealth tax and this seizure of people's
01:28:49.440 | assets, whereas maybe she actually is much more moderate.
01:28:52.000 | And that's what people around her have always said,
01:28:54.080 | is that she's more moderate, and that she's coming to actually
01:28:57.480 | form her own platform, and she just needed time to do that.
01:29:00.640 | I'm giving what I'm hearing from the left.
01:29:03.240 | This isn't necessarily my position, Sax.
01:29:05.280 | This is what I'm hearing from the people around Kamala Harris.
01:29:09.200 | OK, let me tell you what's really going on here.
01:29:11.160 | OK, you can tell us all.
01:29:12.160 | Look, OK, here we go.
01:29:14.920 | She's hilarious.
01:29:16.080 | The longtime Democratic pundit, Ruy Teixeira,
01:29:19.160 | just had a piece out called "Vince Lombardi Democrats,"
01:29:22.440 | where he quoted Vince Lombardi saying,
01:29:24.800 | "Winning isn't everything.
01:29:26.240 | It's the only thing."
01:29:28.000 | And so, obviously, what's happening now
01:29:30.360 | is that Kamala Harris will say whatever
01:29:32.480 | she needs in order to win.
01:30:34.280 | But the question that voters should be asking is,
01:29:36.680 | who is this person, really?
01:29:38.320 | Because, obviously, what she's going
01:29:40.120 | to do once you give her the keys to the government
01:29:43.200 | is going to be different than whatever she's saying right now.
01:29:46.000 | And we do have a lot of background on this person.
01:29:48.080 | She came up through the San Francisco political machine.
01:29:50.480 | She was a product of the progressive Pelosi machine
01:29:54.960 | in San Francisco.
01:29:56.480 | She then rose to the Senate.
01:29:58.680 | She was rated by GovTrack as the most liberal member
01:30:00.880 | of the Senate.
01:30:01.840 | And just a few weeks ago, when she first
01:30:03.520 | announced her economic proposals,
01:30:06.000 | they were all this super far left stuff,
01:30:09.160 | like the unrealized gains tax and a 44% cap gains rate.
01:30:13.920 | So that's where her instincts were.
01:30:16.040 | That was what was in the Biden-Harris budget.
01:30:18.200 | That was in her campaign platform.
01:30:20.040 | And that was in her economic policy speeches.
01:30:22.320 | And then what happened is there was a huge negative reaction
01:30:25.200 | to all of that.
01:30:26.040 | So now she's changing her talking points.
01:30:27.960 | But it's obvious what's going on.
01:30:29.720 | She's going to say whatever is necessary to win the election.
01:30:33.160 | But you're a fool and kidding yourself
01:30:36.600 | if you think you're going to get something really different
01:30:39.000 | after the election.
01:30:40.120 | You're going to get a continuation of the Biden-Harris,
01:30:44.680 | Elizabeth Warren economic program
01:30:46.960 | that we've had for the last four years.
01:30:48.600 | How would you say Trump's position on abortion
01:30:51.520 | relates to that?
01:30:53.280 | Don't people adapt their position based on winning?
01:30:55.640 | I mean, I think he's done a masterclass
01:30:57.280 | in changing his position.
01:30:58.400 | He's never changed his position.
01:30:59.880 | You don't seem to understand how that issue works.
01:31:04.680 | He was in favor of returning it to the states.
01:31:07.560 | He's been completely consistent on that.
01:31:09.360 | And I don't know why you keep bringing it back to that always.
01:31:12.240 | Whenever Kamala Harris isn't doing well,
01:31:14.200 | it's always about abortion for you.
01:31:15.840 | - Well, no, I think it's a great example
01:31:17.640 | of a politician moving to the center on the issue.
01:31:20.720 | He just said this week
01:31:21.560 | that he wants more than six weeks, right?
01:31:23.080 | And so he's moving towards a maybe--
01:31:24.560 | - He didn't just say that.
01:31:25.600 | It's always been his position.
01:31:26.800 | He's always been a moderate on the issue, J. Cal.
01:31:28.840 | - Yeah.
01:31:29.920 | - He's against the national abortion ban.
01:31:31.360 | He's always been in favor of returning it to the states.
01:31:33.800 | I don't even understand how this is like,
01:31:35.640 | you're still talking about this.
01:31:37.040 | - Well, the reason I talk about it
01:31:38.200 | is because there's so many states,
01:31:39.280 | including the one I'm living in
01:31:40.800 | where women can't get an abortion.
01:31:42.680 | - Okay, they're gonna have to cross the state line.
01:31:44.240 | You're right.
01:31:45.080 | That's an issue.
01:31:45.920 | - So, I mean, I think women feel differently about it
01:31:47.560 | than you brushing it aside.
01:31:49.080 | They feel that Trump--
01:31:49.920 | - I'm not brushing it aside.
01:31:50.760 | I'm just stating what the truth of the issue is.
01:31:52.400 | - Yeah, no, no.
01:31:53.240 | I'm just saying the truth for women that I talk to--
01:31:54.080 | - Hold on a second.
01:31:55.000 | If you wanna vote on that issue, that's your right.
01:31:57.280 | Go for it.
01:31:58.280 | All I ask is that you correctly state Trump's position
01:32:01.160 | on that issue, which he stated on our own podcast.
01:32:03.560 | - Yeah, no, and that's the point I'm bringing up
01:32:05.280 | is I think people-- - It's up to the states.
01:32:06.560 | - The evangelicals are really upset about him
01:32:09.680 | in their perception, changing his mind,
01:32:11.400 | and women, their view of it is he took away their right
01:32:14.400 | in the states where, like the one I live in,
01:32:16.320 | you can't get an abortion.
01:32:17.320 | Okay, everybody, this has been an amazing episode
01:32:19.560 | of the "All In" podcast.
01:32:20.840 | We got to business up front, great discussion,
01:32:22.880 | politics that got super toxic at the end.
01:32:25.440 | It's the mullet that you love, business,
01:33:27.440 | and then the political party at the back
01:32:29.480 | for the sultan of science who did an amazing job
01:32:33.800 | setting up all these amazing speakers
01:32:35.720 | and doing all the work on the "All In" summit
01:32:38.560 | taking place next week.
01:32:39.840 | We thank you for your efforts.
01:32:42.320 | Sultan of science, David Friedberg.
01:32:44.800 | You excited, you okay, buddy? - I'm here for the team, J-Cal.
01:32:46.840 | It's a team sport. - I love it.
01:32:48.000 | It's a team sport, that's right.
01:32:50.080 | For the chairman and dictator, look at that.
01:32:52.800 | Look at that silver fox.
01:32:54.720 | Look at him with that great hair.
01:32:57.120 | When is it gonna stop?
01:32:58.760 | You gonna cut it for the summit or are you just...
01:33:01.640 | Wow, he's going big.
01:33:03.520 | And the architect. - Welcome to 48.
01:33:06.560 | Welcome to 48.
01:33:07.600 | - The architect, David Sachs, my bestie.
01:33:10.640 | - Welcome to 48. - I'll see you on Twitter.
01:33:11.960 | - Welcome to 48. - I'll see you on Twitter,
01:33:13.440 | Sachsie-poo. - Welcome to 48.
01:33:15.080 | - Welcome to 48.
01:33:16.240 | - I think you should stop making a fool of yourself
01:33:17.680 | on Twitter.
01:33:18.520 | It's getting old. - I think, yeah.
01:33:20.680 | Yes, and you should do another 30 tweets a day
01:33:23.480 | about Ukraine.
01:33:24.520 | We'll see you all next time, everybody.
01:33:25.880 | Bye-bye.
01:33:26.720 | (upbeat music)
01:33:28.160 | - We'll let your winners ride.
01:33:30.800 | Rain Man David Sachs.
01:33:32.360 | ♪ I'm going all in ♪
01:33:35.120 | - And instead, we open-sourced it to the fans
01:33:37.360 | and they've just gone crazy with it.
01:33:39.240 | - Love you, besties.
01:33:40.080 | - The queen of quinoa.
01:33:41.440 | ♪ I'm going all in ♪
01:33:42.720 | ♪ Let your winners ride ♪
01:33:44.160 | ♪ Let your winners ride ♪
01:33:46.400 | ♪ I'm going all in ♪
01:33:48.040 | - Besties are gone.
01:33:49.320 | (laughing)
01:33:50.720 | - That is my dog taking a notice in your driveway, Sachs.
01:33:53.400 | (laughing)
01:33:55.680 | - Oh, man.
01:33:56.520 | - My avatars will meet me at Flintstone.
01:33:58.480 | - We should all just get a room
01:33:59.560 | and just have one big huge orgy
01:34:01.160 | 'cause they're all just useless.
01:34:02.400 | It's like this like sexual tension
01:34:03.880 | that they just need to release somehow.
01:34:06.480 | ♪ When you're the bee ♪
01:34:08.320 | ♪ When you're the bee ♪
01:34:09.160 | ♪ When you're a bee ♪
01:34:10.240 | ♪ Bee ♪
01:34:11.080 | ♪ When you're a bee ♪
01:34:11.920 | ♪ We need to get merch ♪
01:34:12.760 | ♪ Besties are gone ♪
01:34:13.600 | ♪ I'm going all in ♪
01:34:18.600 | ♪ I'm going all in ♪
01:34:24.000 | - And now, the plugs.
01:34:26.440 | You can subscribe to this show on YouTube.
01:34:28.400 | Yes, watch all the videos.
01:34:29.560 | Our YouTube channel has passed 500,000 subscribers.
01:34:32.880 | Shout out to Donny from Queens.
01:34:34.600 | Follow the show, x.com/theallinpod.
01:34:37.840 | TikTok, theallinpod.
01:34:39.440 | Instagram, theallinpod.
01:34:41.040 | LinkedIn, search for All In Podcast.
01:34:42.920 | And to follow Chamath, he's x.com/chamath.
01:34:45.960 | Sign up for his weekly email.
01:34:48.160 | What I read this week at chamath.substack.com
01:34:51.200 | and sign up for a developer account at console.groq.com
01:34:55.120 | and see what all the excitement is about.
01:34:57.320 | Follow Sax at x.com/davidsacks
01:34:59.880 | and sign up for Glue at glue.ai.
01:35:01.960 | Follow Friedberg, x.com/friedberg.
01:35:04.720 | And Ohalo is hiring.
01:35:06.200 | Click on the careers page at ohalogenetics.com.
01:35:09.760 | I am the world's greatest moderator, Jason Calacanis.
01:35:12.200 | If you are a founder and you want to come
01:35:14.760 | to my accelerators and my programs,
01:35:16.880 | founder.university, launch.co/apply
01:35:19.280 | to apply for funding from your boy J-Cal for your startup.
01:35:22.000 | And check out athenawow.com.
01:35:24.240 | This is the company I am most excited about at the moment.
01:35:27.160 | Athenawow.com to get a virtual assistant
01:35:29.840 | for about $3,000 a month.
01:35:31.680 | I have two of them.
01:35:32.840 | Thanks for tuning in to the world's number one podcast.
01:35:35.040 | You can help by telling just two friends.
01:35:37.320 | That's all I ask.
01:35:38.520 | Forward this podcast to two friends and say,
01:35:41.560 | this is the world's greatest podcast.
01:35:43.080 | You got to check it out.
01:35:43.920 | It'll make you smarter and make you laugh,
01:35:45.200 | laugh while learning.
01:35:46.360 | We'll see you all next time.