
Meta's scorched earth approach to AI, Tesla's future, TikTok bill, FTC bans noncompetes, wealth tax


Chapters

0:00 Bestie Intros: Reservation Tips!
5:20 Meta goes scorched earth in AI, why the stock was down despite beating on earnings
22:20 Tesla's roadmap, ranking the company's highest-upside bets outside of cars for the next 10 years
47:25 FTC bans noncompetes: impact on startups and company formation
60:33 Besties reminisce on their encounters with Steve Jobs
70:25 TikTok "divest-or-ban" bill is signed into law: Will China comply? What's it worth without the algorithm? Will a deal get done?
87:22 Biden's proposed capital gains hikes and a 25% wealth tax for those with $100M+ in assets


00:00:00.000 | What did you think about that reservation article thing?
00:00:03.040 | Oh, that's crazy. Well, this is a great topic. There is a guy,
00:00:07.200 | a pretty industrious kid, it turns out what this kid's been doing is he's been getting
00:00:10.800 | Carbone reservations, other top-tier reservations. And by the way, everybody's had this as an app
00:00:14.880 | idea. Nobody's really executed it that well. I don't want to give any plugs for any apps right
00:00:18.960 | now. But he has made $70,000 flipping restaurant reservations, selling them for upwards
00:00:25.040 | of a thousand bucks. Chamath, you're very industrious and you're an absurd tipper,
00:00:30.720 | 100% tipper minimum. I know this because when I paid for dinner one time at Carbone,
00:00:37.280 | you took me and Phil Hellmuth's cards and you said, "You guys pay and I'm going to put the tip in."
00:00:42.400 | And then I had to sell Uber shares to cover it. That was gross. That was gross.
00:00:48.560 | I've never had a problem getting a reservation at any restaurant in New York.
00:00:51.920 | Yeah, they see you coming. You know what they say? 10 dimes worth of wine. I have taught many
00:00:56.320 | of my friends how to get reservations. I have a series of tips that I could give,
00:01:02.000 | but I don't want to give too many of them away. But nobody gives tips to maitre d's anymore.
00:01:07.520 | Saks, you tip the maitre d with cash?
00:01:09.280 | All the time. Yeah, sure.
00:01:10.960 | Of course, right? See, this is why Saks and I get along. Saks is a legit old school guy.
00:01:14.720 | Yeah, Saks is elite. He's got the $100 bill pressed into his palm, shakes the guy's hand.
00:01:21.360 | I'll tip the bartender to keep the ice cubes cold.
00:01:24.160 | The bartender got $100 just for keeping the ice cubes cold.
00:01:26.960 | There you go. Oh.
00:01:27.680 | C-note for you, C-note for you.
00:01:49.600 | This is my tip. It has worked. I'm going to give it out here. And the reason I'm giving this out
00:01:54.320 | is because y'all are cheap f***s. Not you, but just a lot of y'all out there. And you don't
00:01:59.040 | know the art. In Brooklyn, everybody gets tipped. We tip the phone guy when he comes to fix your
00:02:03.760 | phone at your house. Everybody gets tipped. So, here's what I would do. And you can do this with
00:02:07.760 | a 20. You can do it with a 50. You know, if you're going to some great place, you can do it with a
00:02:12.400 | hundy. You fold it twice, as Chamath says. You put it in your palm. You walk up to the maitre d stand.
00:02:19.520 | If there's anybody in line, you just cut it. And you go to the side of the table.
00:02:25.120 | You put your hand on the maitre d stand. You push the hand over slightly. Here's the maitre d stand.
00:02:30.560 | You just slide your hand over and you reveal the 50. And you say, "I am so sorry. I was supposed
00:02:36.000 | to make a reservation. I don't know if my assistant did it or not. She probably didn't.
00:02:39.760 | However, I'm a huge fan of the restaurant. If there's anything you could do for me,
00:02:43.680 | I made a stupid mistake to not make a reservation. If somebody cancels, I'll be at the bar. If
00:02:47.680 | there's any way you could accommodate me, I truly appreciate only in town for one night."
00:02:51.280 | When I say 99 times out of 100, it works. The one time it didn't work...
00:02:57.760 | How much are you tipping? A hundred?
00:02:59.520 | It used to be 20 when I lived in New York. Now, I regularly do 50.
00:03:03.200 | And the reason I do it is I don't like giving money to charity. I feel like these charities
00:03:07.840 | are all a giant scam. And I like to give it to service people.
00:03:10.240 | Like, is it awkward when you tip $50 in ones like that? Like, doesn't it become really thick?
00:03:15.600 | It's a little harder to hold in your hand. I used to use a roll of quarters, actually. That
00:03:19.120 | was awkward. The roll of quarters was a little bit hard. The 50 is easy.
00:03:23.520 | I'm just asking everybody here, if you hear my voice, please go tip your maitre d 20 or 50
00:03:30.960 | and report back if my technique works. Because what you're doing in the framing is you're saying
00:03:35.040 | you made a mistake, and then you make it so the person doesn't feel like you're bribing them or
00:03:39.680 | paying them off. They're helping you solve a problem. So, anyway, it's kind of like when
00:03:44.480 | I troll David and I'm like, "Hey, Freeberg, with the Summit and whatever." And he's like,
00:03:48.480 | "I'll do all the work. Forget about it." You just got to reframe it. All right,
00:03:51.600 | let's get started here.
00:03:52.400 | But it is a bribe.
00:03:53.360 | That is a bribe, yeah.
00:03:54.960 | It is, in fact, a bribe.
00:03:55.920 | Yeah. I consider it... It's interesting to bring this up.
00:04:00.880 | You're creating plausible deniability, but it's still a bribe.
00:04:03.760 | I consider it a tip in advance of services.
00:04:06.240 | No, you're not tipping him for good service. You're tipping him to basically break in line.
00:04:10.480 | Yeah.
00:04:11.600 | You didn't make a reservation, and you're taking away somebody else's spot.
00:04:15.520 | Yeah.
00:04:15.760 | And you're paying him to break the rules.
00:04:17.360 | You know what? Your shift to man of the people, it's the most brilliant move you've ever made.
00:04:23.760 | No, but it's a bribe.
00:04:24.800 | You're a populist.
00:04:26.320 | You're not even really doing him a favor. You're doing yourself a favor.
00:04:30.080 | All right, let's unpack that.
00:04:32.080 | You're such a populist. That's so brilliant. I hate you.
00:04:34.640 | He just figures out a way to get those populist votes.
00:04:38.240 | No, here's what it is. Because I had this argument with my...
00:04:42.800 | You gave him the 50 just for doing a good job, even if you had the reservation is what you're saying.
00:04:47.600 | I am so confident in how good the service is going to be. I'm giving a pre-tip.
00:04:51.520 | I consider it a pre-tip.
00:04:52.880 | Oh, my God.
00:04:54.080 | A pre-tip if he gives you a reservation when you don't have one.
00:04:57.920 | I am happy to not get the money back if they can't accommodate me.
00:05:01.200 | One time it happened, the woman at Asia de Cuba in New York in the 90s said,
00:05:05.680 | "Here's your 20 back."
00:05:06.560 | And she said, "I'm really sorry. There's nothing I can do. I would love...
00:05:10.000 | It's very lovely of you to give me the tip."
00:05:11.360 | And she handed it back to me.
00:05:12.640 | And I said, "Keep it."
00:05:13.440 | And she said, "No, but the bartender is going to take care of you at the bar."
00:05:17.920 | So that ends well.
00:05:20.400 | All right, let's get started, everybody.
00:05:22.000 | We have a full docket for you today.
00:05:24.480 | And story of the week as voted by besties.
00:05:28.240 | We have a new system here.
00:05:29.120 | Instead of me sorting the docket, the besties give a thumbs up, thumbs down.
00:05:33.600 | This one, I think, had two of three people voted for it.
00:05:36.960 | One person didn't vote this week.
00:05:38.640 | And that is Zuck's Scorched Earth AI Strategy.
00:05:41.440 | As you probably know, if you're in tech,
00:05:43.920 | Meta was the first big tech company to fully embrace open source AI.
00:05:49.280 | Llama leaked last year.
00:05:51.040 | Folks said that it wasn't leaked.
00:05:53.840 | It was released covertly, on the sly, on the DL, by the folks at Meta.
00:05:58.480 | But they just released Llama 3.
00:06:01.040 | And I talked to Sandeep about this a whole bunch last week.
00:06:04.960 | It just came out a few days ago.
00:06:06.160 | And it's already in the top two models on HuggingFace's leaderboard.
00:06:09.360 | You can take a look at HuggingFace's leaderboard.
00:06:10.960 | It's basically where it's the most trusted place, I think,
00:06:13.760 | where people benchmark these things.
00:06:16.240 | Developers are saying it's faster.
00:06:17.760 | It's less preachy than GPT-4, was an observation in our chat,
00:06:21.200 | despite being slightly lower quality.
00:06:24.320 | They also have some small models that are performing really well.
00:06:26.720 | They also announced, very interestingly,
00:06:29.840 | open sourcing the Quest operating system, now called Meta Horizon OS.
00:06:34.800 | The OS powers the mixed reality headsets, VR, all that kind of stuff, AR.
00:06:38.720 | But here's the big news.
00:06:40.320 | This one is amazing.
00:06:41.680 | Instagram, Facebook, and WhatsApp are integrating a search box
00:06:44.720 | or their AI assistant chat box into all their apps.
00:06:47.760 | Worth noting, Meta came out with their results just the other day.
00:06:52.400 | And we're taping this on Thursday.
00:06:53.520 | As you know, it comes out Friday.
00:06:54.880 | Their stock is down as much as 16% after reporting its Q1 earnings.
00:06:58.560 | They beat estimates.
00:07:00.400 | But Zuck is saying he's going to spend a bucket load of cash on infrastructure.
00:07:05.360 | Chamath, it's your alma mater.
00:07:07.440 | He worked side-by-side with Zuck,
00:07:09.520 | growing Facebook from, what, 10, 20 million to 400 million MAUs.
00:07:15.280 | What do you think of his strategy?
00:07:16.240 | 800 million when I left.
00:07:17.920 | Okay.
00:07:18.480 | What are your thoughts on Gold Chain Zuck and his new strategy to pivot away from VR to AR,
00:07:25.120 | AI, and going full all-in, so to speak, on open source?
00:07:30.720 | They're taking a play that we talked about, which makes a ton of sense,
00:07:34.720 | which is it's not a core market, but it's very important.
00:07:39.600 | The core money-making market for them is monetizing their family of apps.
00:07:44.960 | And so they're scorching the earth for these new markets so that the economics are not viable,
00:07:51.760 | which will, I think, allow a more robust ecosystem where they'll have more of a say.
00:07:58.560 | Because I think up until this point,
00:08:00.080 | you could have said that they and Google were sort of lagging behind open AI,
00:08:04.080 | but I don't think that's the case.
00:08:06.000 | And by the way, this is the strategy that we talked about 16 months ago
00:08:10.800 | that these big companies should take.
00:08:12.640 | I just think it's worth listening to what we said,
00:08:14.880 | and then basically what Zuck said just right now.
00:08:18.720 | Here we go.
00:08:19.280 | This is a black and white flashback.
00:08:20.800 | Chamath in a turtleneck.
00:08:23.280 | I think that Google will open source their models because
00:08:27.040 | the most important thing that Google can do is reinforce the value of search.
00:08:32.560 | And the best way to do that is to scorch the earth with these models,
00:08:36.320 | which is to make them widely available and as free as possible.
00:08:40.320 | That will cause Microsoft to have to catch up,
00:08:42.480 | and that will cause Facebook to have to really look in the mirror and decide
00:08:46.560 | whether they're going to cap the betting that they've made on AR/VR
00:08:51.360 | and reallocate very aggressively to AI.
00:08:53.600 | That should be what Facebook does.
00:08:55.360 | And the reason is, if Facebook and Google and Microsoft
00:08:58.880 | have roughly the same capability and the same model,
00:09:01.600 | there's an element of machine learning called reinforcement learning.
00:09:05.280 | And specifically, it's reinforcement learning from human feedback.
00:09:08.320 | Facebook has an enormous amount of reinforcement learning inside of Facebook.
00:09:12.960 | Every click, every comment, every like, every share,
00:09:17.520 | the huge companies will create the substrates.
00:09:20.560 | And I think they'll be forced to scorch the earth and give it away for free.
00:09:25.360 | Now you can see what Zuck said last week.
00:09:27.840 | Look at this little victory lap here.
00:09:29.360 | It's probably also pretty bad for one institution
00:09:36.000 | to have an AI that is way more powerful than everyone else's AI.
00:09:40.000 | And I kind of think that a world where AI is very widely deployed
00:09:45.520 | in a way where it's gotten hardened progressively over time
00:09:50.160 | and is one where all the different systems will be in check
00:09:54.400 | in a way that seems like it is fundamentally more healthy to me
00:09:58.080 | than one where this is more concentrated.
00:09:59.920 | - That is, I think, the definition of scorched earth.
00:10:02.880 | So he's running a really perfect play so far.
00:10:05.600 | He's open-sourced the headset OS.
00:10:08.960 | He's open-sourced these models.
00:10:10.800 | He stopped training, I think, for LLAMA 3.5, was it?
00:10:15.360 | And now he's moved directly to LLAMA 4
00:10:17.760 | just to try to head off GPT-5 at the pass.
00:10:20.240 | Well, so what are we seeing?
00:10:22.240 | We're seeing the economic value getting disintegrated.
00:10:24.880 | There is no value in foundational models economically.
00:10:27.840 | So then the question is, who can build on top of them the fastest?
00:10:31.360 | And Jason, to your point, Llama 3 was announced last Thursday.
00:10:35.520 | 14 hours later, Groq actually had that model deployed
00:10:38.960 | in GroqCloud so that 100,000-plus developers
00:10:42.080 | could start building on it.
00:10:43.280 | That's why that model is so popular.
00:10:45.760 | So it puts the closed models on their heels
00:10:49.280 | because if you can't both train and deploy iteratively
00:10:53.680 | and quickly enough, these open-source alternatives will win.
00:10:57.440 | And as a result, the economic potential
00:10:59.280 | that you have to monetize those models will not be there.
00:11:02.080 | And what that does is reinforce the existing economic moat
00:11:06.240 | that Facebook already has in monetizing their apps.
00:11:08.480 | Now, why the stock went down,
00:11:10.480 | I'd like to talk in contrast to Tesla a little bit later
00:11:13.440 | because this has nothing to do with that
00:11:15.200 | and it's more about squares versus sharps in the stock market.
00:11:19.280 | Freeberg, your thoughts on meta-embracing going all in on AI.
00:11:24.240 | You were at Google, your alma mater,
00:11:28.160 | and you watched them be proprietary and close
00:11:33.440 | when it came to search and the search algorithm,
00:11:36.080 | but open-source when they were behind.
00:11:38.240 | And that's, I guess, the phrase we use here in Silicon Valley.
00:11:41.120 | When you're behind, you use open-source to catch up.
00:11:43.440 | And when you're ahead, you close everything down.
00:11:46.720 | Facebook famously shut down all access to their APIs.
00:11:50.560 | You can't get into the social graph,
00:11:52.880 | but they're going open here.
00:11:55.200 | Why? Because they're behind.
00:11:56.640 | Your thoughts on this from, say, a Google perspective,
00:11:59.440 | and then maybe how Google might react,
00:12:01.360 | and then just broadly, do you think open-source wins the day,
00:12:05.920 | getting to AGI?
00:12:07.600 | I would say the analogy for Google is Android,
00:12:12.000 | where Google open-sourced the Android operating system
00:12:15.760 | because the handset manufacturers
00:12:18.800 | and some of the big software companies,
00:12:20.160 | so Nokia and Microsoft in particular,
00:12:23.360 | BlackBerry, had these closed proprietary operating systems.
00:12:28.320 | And then the telcos put their apps on and made money
00:12:30.880 | and basically had control over whether or not
00:12:32.800 | people could access Google and Google services
00:12:35.280 | through their phone.
00:12:36.320 | So the intention with open-sourcing Android
00:12:38.640 | was to make sure that Google was not disadvantaged
00:12:41.200 | in their core business over time.
00:12:42.720 | I think that there's a similar analogy here
00:12:45.440 | that by open-sourcing these models,
00:12:47.520 | they limit competition because VCs are no longer gonna plow
00:12:50.720 | half a billion dollars into a foundational
00:12:53.200 | model development company.
00:12:54.480 | So you limit the commercial interest
00:12:56.480 | and the commercial value of foundational models.
00:12:59.520 | Now, the CapEx question is a really hard one to diagnose
00:13:02.800 | because we don't know how they're spending the money,
00:13:04.960 | where they're spending the money,
00:13:05.920 | what they're doing in there.
00:13:07.360 | I think that it's a really important sign
00:13:09.520 | that founder-led companies with these voting rights
00:13:12.320 | have the ability to make these sorts of hard decisions
00:13:19.040 | that it might be very difficult
00:13:20.640 | for a management by committee group to make.
00:13:22.640 | - Got it.
00:13:23.120 | - Look at how Mark is making these decisions
00:13:26.080 | and Elon is making big, tough decisions
00:13:28.880 | that it would very likely be pushed back on
00:13:31.440 | if they didn't have the voting rights,
00:13:32.720 | if they didn't have the control
00:13:33.840 | over the company that they had.
00:13:34.880 | - It's a great point about founder authority.
00:13:37.520 | I think that's a really salient point.
00:13:39.680 | Shout out to Zuck.
00:13:40.880 | Would really be great to get him at the All in Summit.
00:13:43.200 | Sax, we have some breaking news here.
00:13:44.720 | It turns out Lama 3, they just tested it in the benchmarks.
00:13:47.760 | It says here you can make the founding fathers
00:13:49.920 | any race you like.
00:13:50.960 | It's a really interesting feature.
00:13:52.160 | Any thoughts on Facebook's strategy here?
00:13:56.000 | You are a master strategist, David Sax.
00:13:58.720 | Give us your master strategy here,
00:14:00.880 | 1700 chess rating, Sax.
00:14:03.760 | - Well, we kind of glossed over what the real news was,
00:14:06.880 | which was that Meta released Llama 3,
00:14:10.400 | which they had spent billions of dollars creating
00:14:12.800 | in a completely open source way.
00:14:14.720 | And the testing on Llama 3 is that it's comparable to GPT-4.
00:14:18.880 | And I think this is what Chamath means by scorched earth
00:14:22.080 | is that we now have a free model
00:14:24.560 | that's as good as GPT-4. - Best in class.
00:14:26.320 | - Best in class, or at least tied for best in class.
00:14:28.640 | There are some slight differences
00:14:31.520 | and we've been playing around with it.
00:14:32.960 | It is faster and cheaper than using ChatGPT-4,
00:14:39.520 | but the context window is smaller.
00:14:41.600 | I think it has only an 8,000 token context window.
00:14:44.880 | So that's something that the open source community
00:14:46.880 | is gonna need to work on is rolling out foundation models
00:14:50.320 | of a bigger context window.
00:14:52.080 | Nonetheless, the point is that at least for now,
00:14:57.280 | until OpenAI comes out with GPT-5,
00:14:59.680 | which is supposed to happen soon,
00:15:02.480 | the open source community has kind of caught up
00:15:04.400 | with the top closed source model, which is OpenAI.
00:15:08.800 | - Will they blow past the closed source model, Sax?
00:15:12.400 | - Well, I still think that open AI is ahead
00:15:16.320 | because the scuttlebutt about ChatGPT-5
00:15:19.920 | is that it's amazing and it's a big improvement
00:15:21.920 | over GPT-4 and supposedly it's gonna come out
00:15:24.880 | any day, week, or month.
00:15:26.720 | So if GPT-5 comes out and knocks our socks off
00:15:30.720 | in a few weeks, then we're gonna see that,
00:15:34.240 | oh, they're actually a cycle ahead
00:15:36.080 | of the open source community.
00:15:37.840 | But as of this moment,
00:15:39.520 | in terms of what's been publicly released,
00:15:41.760 | I think it's fair to say that open source
00:15:43.520 | is largely caught up with OpenAI.
00:15:46.800 | And this is why, bring up this tweet by Naveen Rao,
00:15:50.400 | who is a founder who created the company MosaicML,
00:15:53.440 | which was sold to Databricks, a unicorn outcome.
00:15:56.560 | And he says, "I don't think everyone has comprehended
00:15:58.560 | "the massive disruption and distortion
00:16:00.880 | "that's gonna happen in the gen AI market
00:16:03.040 | "due to LLAMA-3.
00:16:04.080 | "Boats will be destroyed and investments will go to zero.
00:16:06.880 | "Just like everything in gen AI,
00:16:08.400 | "this will all happen fast."
00:16:10.160 | So it's a similar reaction to what Chamath is saying,
00:16:13.040 | which is we have an open source model now
00:16:15.760 | that erases billions of dollars of private investment.
00:16:21.280 | - And by the way,
00:16:21.920 | you just said something really interesting before,
00:16:24.320 | which is, okay, there's a limitation, right?
00:16:26.320 | You mentioned the context window, which was a problem.
00:16:28.960 | I just found this thing today.
00:16:31.760 | They solved it.
00:16:33.440 | Now they're at 96K context.
00:16:35.680 | - Oh, that's amazing.
00:16:36.800 | - So to your point, Sax, this market is moving so fast
00:16:39.840 | because you cannot compete with open source.
00:16:42.080 | So all these closed models,
00:16:43.280 | which means OpenAI and a bunch of these other folks,
00:16:46.480 | especially the ones that are sitting inside
00:16:48.160 | of these smaller companies, right?
00:16:50.320 | Snowflake has a model.
00:16:52.080 | I think Databricks has a model.
00:16:53.520 | There's an important question that has to be asked
00:16:56.720 | around the economic viability of these models
00:16:59.120 | in a world where open source is not only better
00:17:02.000 | and better funded, but they're iterating faster
00:17:05.280 | and the feature set is catching up to your needs.
00:17:07.600 | So the minute that Sax says he needs a longer context window,
00:17:11.040 | within a week, it's there.
00:17:12.160 | That's pretty intense.
00:17:13.840 | - Yeah, I think this is like one of the--
00:17:15.360 | - What do you think, Jay Cal?
00:17:16.480 | - Well, I've got a totally different take on this.
00:17:20.800 | Obviously, I agree with all you've said here
00:17:23.040 | in terms of the open source and the dramatic effect
00:17:25.680 | it's gonna have on pricing,
00:17:26.960 | but there's something people are missing.
00:17:28.560 | If you go to meta.ai for a second, Nick,
00:17:30.720 | and you pull it up,
00:17:32.560 | what you'll see is they've dedicated the meta.ai URL
00:17:37.280 | to a search-like experience.
00:17:38.800 | And then they've put a search box at the top
00:17:42.400 | of Instagram, WhatsApp, and Facebook.
00:17:46.560 | And so they are going to put search engine,
00:17:51.680 | essentially, their modern search engine,
00:17:53.520 | which is starting from zero,
00:17:54.800 | in front of 3 billion people
00:17:56.960 | using Meta's collection of services.
00:18:00.000 | And so just like Apple and Firefox
00:18:01.920 | were able to intercept search traffic,
00:18:04.400 | I think, and let's make a prediction here,
00:18:06.400 | that Meta's gonna get 10 points of the search market.
00:18:08.800 | Now, each point of the search market is worth,
00:18:12.400 | you know, what is Google's worth?
00:18:13.680 | About 2 trillion.
00:18:14.560 | If you take out 500 billion
00:18:16.400 | for YouTube and their other services,
00:18:17.760 | you get 1.5 trillion,
00:18:18.960 | which means 10 points is worth 150 billion in market cap.
00:18:22.240 | Now, as you well know, Chamath,
00:18:25.120 | and you well know, David,
00:18:26.560 | these two ad networks,
00:18:28.800 | Meta started with psychographic data,
00:18:31.360 | the person who they know.
00:18:32.800 | And then, of course,
00:18:34.320 | Google had the greatest advertising product ever,
00:18:36.880 | intent.
00:18:37.440 | You type in Volvo, you type in Tesla,
00:18:39.760 | type in Tesla Santa Monica,
00:18:41.280 | used, you know,
00:18:42.000 | you don't have to guess what the person is interested in
00:18:44.880 | at that moment.
00:18:45.680 | Well, for the first time,
00:18:47.920 | you know, one of these ad networks
00:18:49.280 | has content data and psychographic,
00:18:51.600 | and they're gonna be able to put it together.
00:18:53.520 | And I think they take 10 points of search,
00:18:55.200 | but more importantly,
00:18:56.240 | they're gonna have data on individuals
00:18:58.720 | that will be unrivaled in the advertising space.
00:19:02.480 | Now, Google tried to do this themselves.
00:19:05.120 | They did something called Google+.
00:19:07.040 | I don't know if you were there,
00:19:08.000 | Freeberg during the Google+ era,
00:19:09.520 | they spent billions of dollars on a social network.
00:19:11.760 | It failed, they shut it down
00:19:13.120 | because they wanted to get that.
00:19:14.720 | So looking at this,
00:19:16.080 | I think, man, there's so much going on on open source,
00:19:19.120 | but now Google,
00:19:21.440 | we haven't talked about Google's role in all of this
00:19:23.440 | and Sundar being a hired gun,
00:19:25.440 | not having founder authority to your point, Freeberg.
00:19:28.640 | You know, I think they're going after Google search business,
00:19:31.040 | as well as, you know,
00:19:33.280 | taking away and commoditizing all of the open source.
00:19:36.800 | So we might be sitting here in three or four years
00:19:39.200 | watching Meta have, I don't know, 10?
00:19:42.240 | Nobody has to learn search,
00:19:44.480 | it's just that Google can't afford to lose anything.
00:19:46.560 | Yeah.
00:19:47.760 | Everybody takes a point here and a point there,
00:19:49.360 | and all of a sudden,
00:19:50.080 | Google could be in the high 70s of search,
00:19:52.560 | and that would be economically disastrous for their stock.
00:19:55.520 | Yeah, you're gonna have to make cuts.
00:19:57.280 | Why do you guys think the stock is down?
00:19:59.280 | Why is the stock down 16%?
00:20:00.720 | I have a theory as to why.
00:20:02.480 | Yeah, tell us your theory, go for it.
00:20:05.200 | I think that there's a few times a quarter
00:20:10.000 | where you can really see the dispersion in the stock market
00:20:12.880 | between what I would call the smart money and everybody else.
00:20:15.440 | So, you know, using betting language,
00:20:17.360 | the sharps and the squares.
00:20:20.240 | And I think Meta was an example of the sharps
00:20:27.040 | taking a line, which I think is very accurate.
00:20:29.280 | And Tesla was the other great example this quarter.
00:20:32.240 | And so if you look inside of Meta,
00:20:35.840 | and this, Freiburg mentioned this earlier,
00:20:37.600 | I think, Freiburg, what people really reacted negatively to
00:20:41.120 | was the total quantum of spending
00:20:44.080 | and the idea that it's misallocated,
00:20:47.440 | not to AI, but specifically to NVIDIA.
00:20:49.760 | And the reason is that,
00:20:52.320 | and we've mentioned this before,
00:20:54.080 | I've tried to talk a lot about this
00:20:55.280 | with Jonathan Ross from Grok,
00:20:57.200 | but AI is really two markets, training and inference.
00:21:00.320 | And inference is going to be 100 times bigger than training.
00:21:05.440 | And NVIDIA is really good at training
00:21:08.560 | and very miscast at inference.
00:21:10.640 | The problem is that right now,
00:21:13.120 | we need to see a CapEx build cycle for inference.
00:21:17.360 | And there are so many cheap and effective solutions,
00:21:20.480 | Groq being one of them, but there are many others.
00:21:24.320 | And I think why the market reacted very negatively was
00:21:27.440 | it did not seem that Facebook understood that distinction,
00:21:30.000 | that they were way overspending
00:21:33.280 | and trying to allocate a bunch of GPU capacity
00:21:35.760 | towards inference that didn't make sense.
00:21:38.000 | And so I think what people were saying is,
00:21:40.160 | hold on a second, so far, your plays are perfect.
00:21:43.760 | It's everything we want you to do.
00:21:45.040 | We want you to scorch the earth.
00:21:46.320 | We want you to open source the headset,
00:21:49.040 | but we also want you to understand the difference
00:21:51.200 | between training and inference
00:21:52.320 | in a little bit more of a nuanced way.
00:21:54.400 | Build up the inference capacity,
00:21:55.840 | but spend a lot less money
00:21:56.960 | because you don't need to spend it on NVIDIA.
00:21:58.560 | And the reason is the sharps know
00:22:01.200 | that NVIDIA cannot do inference.
00:22:03.280 | And so I think that's why the stock is down this much.
00:22:05.840 | And I think it's important for people who care about AI,
00:22:08.720 | but also may traffic in NVIDIA
00:22:11.040 | to understand why the sharps think that.
00:22:13.200 | And I think several of us have tried to explain it now
00:22:16.400 | for the last couple of weeks, but it is miscast.
00:22:18.720 | And so you're overspending.
00:22:20.560 | Up next, Tesla earnings.
00:22:22.080 | Elon's master plan part two is going well as planned.
00:22:24.960 | Again, this was voted up by our panel here
00:22:28.800 | in our group chat.
00:22:30.000 | Last time we covered Tesla was episode 164,
00:22:32.720 | early February after a Delaware judge
00:22:35.120 | voided Elon's pay package.
00:22:36.640 | Since then, Tesla has made some giant moves.
00:22:39.040 | They launched FSD-12.
00:22:40.480 | I've been using it.
00:22:41.280 | It's pretty great.
00:22:42.640 | Also, they cut the price of FSD by one third
00:22:45.600 | from 12K to 8K.
00:22:47.040 | I think you can also get it for a hundred bucks a month.
00:22:48.880 | And they announced a 10% riff,
00:22:51.600 | going to cut 14,000 employees.
00:22:53.600 | Always painful to do that.
00:22:56.000 | They're looking to reincorporate in Texas,
00:22:58.160 | leaving Delaware for obvious reasons.
00:23:00.000 | Earlier this week, Tesla shares were down
00:23:02.400 | more than 40% year to date,
00:23:03.760 | mostly due to a decrease in demand for EVs.
00:23:07.040 | Market cap dropped from 750 billion to 460 billion.
00:23:10.880 | Shares jumped 12%
00:23:11.520 | when they reported Q1 earnings on Tuesday.
00:23:15.280 | Investors got excited after Elon announced
00:23:17.520 | that Tesla's new line of models
00:23:18.720 | could start production at the end of the year,
00:23:20.560 | or maybe even early 2025.
00:23:24.080 | He shared some thoughts on the robo taxi or cyber cab,
00:23:28.960 | which he talked about maybe five or six years ago.
00:23:31.600 | He has had in the master plan for a long time.
00:23:34.240 | Chamath, you made some great calls on Tesla
00:23:38.080 | and made some great trades there back in the day.
00:23:40.240 | Your thoughts on Tesla in 2024.
00:23:45.920 | Sharps love it.
00:23:48.000 | This is another one.
00:23:48.960 | Sharps versus the squares.
00:23:50.240 | Why did the sharps love it?
00:23:51.360 | Why did it go up and everybody was so confused?
00:23:53.520 | And I think the answer is that
00:23:54.720 | he's actually executing exactly to plan.
00:23:58.160 | And so if you're investing in a stock,
00:23:59.920 | what you really want is a CEO
00:24:01.600 | to kind of stick to a plan that is well known.
00:24:04.720 | So we don't have to actually guess what the plan is
00:24:08.320 | because he puts it out on the website.
00:24:09.760 | Look at the Master Plan, Part Deux.
00:24:11.440 | And if you start reading down everything
00:24:14.880 | that he said he's going to do starting in 2016
00:24:17.440 | is basically what it is.
00:24:18.480 | So let's take the first part, right?
00:24:20.240 | What did he say in the first part?
00:24:21.440 | He said, hey, listen, guys, this is 2016.
00:24:24.160 | We're going to build a Model 3.
00:24:25.600 | We're going to build a compact SUV, the Model Y.
00:24:28.880 | We're going to build a new kind of pickup truck.
00:24:30.560 | Okay, check, check and check.
00:24:32.000 | Then he said, oh, by the way,
00:24:33.440 | we're also probably going to have to do a heavy duty truck
00:24:35.680 | and a high passenger density urban transport vehicle.
00:24:39.200 | So that's the robo taxi thing
00:24:40.480 | that he's going to announce in August.
00:24:41.760 | Okay, check and check.
00:24:43.200 | And then he talks about the software
00:24:45.040 | and the investment in FSD
00:24:46.560 | and trying to get to a place where, you know,
00:24:48.800 | you can just have a much higher probability
00:24:51.440 | that it will save you from an accident
00:24:53.040 | and it'll keep you safe when you're driving your car,
00:24:55.440 | which will be a large driver
00:24:56.720 | of why people want to buy the cars themselves.
00:24:59.760 | And you can see like these FSD miles.
00:25:01.360 | So if you're a smart capital allocator,
00:25:04.640 | what you actually saw was a dislocated stock price.
00:25:08.240 | What you saw was a plan that by and large,
00:25:11.920 | he's been executing on at a strategic level.
00:25:16.000 | Underneath, there's the vicissitudes.
00:25:18.160 | What are the vicissitudes?
00:25:19.120 | Sometimes you overhire, you need to trim some fat.
00:25:21.760 | Sometimes you overspend on CapEx.
00:25:24.640 | Now he spent about a billion of CapEx this quarter
00:25:28.160 | on AI infrastructure, so H100s for training.
00:25:31.520 | But the market saw through it
00:25:32.800 | because that was a reasonable amount of money
00:25:35.040 | to spend on training for FSD, right?
00:25:38.080 | So you can start to see the tale of these two reactions.
00:25:41.280 | The sharps looked at the capital allocation and said,
00:25:44.480 | okay, you missed by two and a half billion this quarter,
00:25:46.960 | but that's going to create a buying opportunity
00:25:49.600 | because we think it's roughly mispriced.
00:25:51.280 | And we think it's mispriced
00:25:52.880 | because going back to this plan for 2016,
00:25:55.040 | this is all the things that we've been underwriting
00:25:57.120 | from $40 billion of market cap to 750.
00:26:00.720 | And now we're getting a 40% discount to buy back in.
00:26:03.600 | So I think it's a really interesting moment
00:26:07.040 | to just contrast and compare what sharps look at
00:26:12.480 | and then what the media breathlessly exaggerates.
00:26:15.920 | And I think what they wanted was to lionize
00:26:20.080 | the meta earning story, but the sharps rejected it
00:26:24.240 | and they wanted to dump on Elon
00:26:26.000 | and the sharps rejected that in size in both cases.
00:26:29.600 | And it's just a reminder to all of us,
00:26:31.440 | be very careful what you're reading
00:26:32.960 | because if you just took the headlines,
00:26:35.200 | you would have expected the two reactions
00:26:36.880 | to be exactly opposite.
00:26:38.080 | But when you vote with money,
00:26:40.480 | it's very clear and unambiguous
00:26:42.160 | because you actually move in the direction of accuracy.
00:26:45.280 | And that's what the stock market allows you to see.
00:26:46.880 | And I think this is a really interesting contrast
00:26:48.960 | to compare sharps versus squares here.
00:26:50.560 | - Yeah, just well recapped.
00:26:53.280 | For those who don't know, sharps in gambling
00:26:55.120 | are people like Haralabob, friend of the pod,
00:26:57.120 | who are just the sharpest bettors in gambling.
00:27:01.680 | Vicissitude, that's an unpleasant change in circumstances
00:27:04.320 | for those of you who don't know the word.
00:27:06.000 | Sachs, I don't know if you want to comment.
00:27:09.280 | I mean, you and I are biased in this,
00:27:11.920 | with the relationship with Elon.
00:27:13.200 | Any thoughts on Chamath's take?
00:27:15.200 | - Well, I'm not going to comment on the stock per se.
00:27:18.400 | I mean, Tesla's products are amazing.
00:27:21.680 | They seem to be really executing well there.
00:27:23.920 | And I think that Elon foreshadowed
00:27:26.480 | on that earnings call what's coming in the future.
00:27:28.560 | I'm personally really curious about the Optimus robot.
00:27:32.720 | I mean, that has the potential
00:27:33.680 | to create an entirely new market
00:27:36.240 | and bring about something
00:27:38.240 | that we've only seen in science fiction.
00:27:40.400 | So to me, that's very interesting.
00:27:42.400 | I think in terms of the company's problems,
00:27:44.640 | I think they're mostly dealing with some macro forces.
00:27:48.720 | So first of all, the GDP growth rate
00:27:51.680 | has slowed to 1.6% in Q1, so this just came out.
00:27:56.400 | And that was a notable slowdown relative to expectations.
00:28:01.280 | So first of all, you're dealing
00:28:02.160 | with a slowing economy, it seems like.
00:28:03.840 | Second, high interest rates mean
00:28:06.560 | that car payments are higher.
00:28:07.760 | If you want to finance the purchase of a car,
00:28:10.640 | your car payments is going to be a lot higher
00:28:12.720 | when interest rates are above 5%.
00:28:13.920 | And I think that has been a pretty big headwind
00:28:16.800 | for Tesla and really all the car companies
00:28:20.080 | over the last year or so.
00:28:21.920 | But I would say that I think both
00:28:23.200 | of those things are ultimately cyclical
00:28:24.800 | and what matters is Tesla's products.
00:28:26.480 | I would say the only issue they have
00:28:28.880 | that's a real long-term issue
00:28:30.960 | is just the Chinese competition.
00:28:32.880 | Companies like BYD are ramping up production
00:28:37.360 | of knockoff products at low prices.
00:28:40.400 | And so managing the competition
00:28:43.200 | with China is probably the only one of these issues
00:28:46.240 | that's I think probably a long-term issue
00:28:48.000 | for them to deal with.
00:28:48.800 | - Let me ask a question here.
00:28:50.320 | How do you rank these businesses?
00:28:54.480 | Energy, Optimus, trucking, ride sharing.
00:29:00.320 | Energy, Optimus, the robot, trucking, ride hailing.
00:29:05.840 | Rank those, your number one or two
00:29:08.000 | in terms of potential for Tesla, Chamath.
00:29:10.640 | Round the horn.
00:29:11.200 | - That's a good question.
00:29:12.080 | - Ride sharing.
00:29:14.160 | - Is number one.
00:29:15.280 | - And the absolute probably by an order of magnitude.
00:29:18.800 | - Oh, interesting.
00:29:19.840 | - And the reason I say that is in order for ride sharing
00:29:22.560 | and ride hailing to really work
00:29:24.000 | the way that they envision it,
00:29:25.760 | you will have level five autonomy
00:29:31.200 | supported in jurisdictions
00:29:35.440 | where, again, if you just look at Pardue,
00:29:38.080 | the whole point is you can summon a car from your app.
00:29:41.200 | And if you don't have access to a mobile phone,
00:29:43.760 | you can go to these bus stop equivalents
00:29:45.600 | and just press a button and the car comes to you.
00:29:47.600 | And so in that world, there's just so much value add
00:29:51.920 | in terms of passenger safety and sustainability
00:29:55.360 | for the environment and less traffic.
00:29:57.680 | It's a gargantuan game changer.
00:30:00.640 | Meanwhile, what you see--
00:30:03.200 | - What's your number two?
00:30:03.760 | - Sorry, just to contrast, Uber and Lyft
00:30:07.600 | seemingly are profitable
00:30:09.840 | because they find pricing power to raise prices.
00:30:12.640 | So if Tesla enters in as a third player in that field
00:30:15.680 | and just guts pricing,
00:30:16.800 | which I think would make sense for them to do,
00:30:18.480 | I think this thing is just gangbusters nuclear.
00:30:21.680 | - That's a really interesting point.
00:30:24.240 | - What do you got, Sax?
00:30:25.520 | You wanna run the horn here?
00:30:26.240 | - Well, I already said optimist,
00:30:27.440 | but that's based on a long-term view.
00:30:28.960 | - Yeah, this is a long-term question.
00:30:30.800 | Based on the long-term, one and two.
00:30:32.240 | - In the long-term, in terms of option value,
00:30:35.760 | I'm the most intrigued by the robot play.
00:30:40.560 | And Elon has pointed out that once you have robots
00:30:44.000 | that are capable of doing lots of different jobs,
00:30:46.320 | pretty much any job you point them at,
00:30:48.080 | that we can have more robots on earth than humans.
00:30:52.160 | So it's a big future market.
00:30:56.400 | - So you got optimist number one, and who's number two?
00:30:58.720 | - I agree with Tomas in the short term
00:31:00.320 | that ride-sharing could really reach
00:31:02.960 | its logical conclusion with robo-taxis.
00:31:08.960 | Remember when Uber first gained steam
00:31:12.880 | and people were speculating about
00:31:14.240 | how big the market could be,
00:31:15.280 | there was a lot of giddy talk
00:31:17.520 | about how car ownership would change
00:31:20.400 | and you wouldn't need to buy a car anymore
00:31:23.360 | because you would just use Uber all the time.
00:31:25.920 | - We called it going full Uber.
00:31:27.840 | - Yeah, and actually I went full Uber
00:31:30.240 | or I shouldn't say full, I went mostly Uber
00:31:33.200 | like within a couple of years of discovering that product.
00:31:36.640 | - Yeah, you were one of the first.
00:31:37.040 | - Yeah, so I thought the potential was there
00:31:39.200 | but the problem is it was just too expensive.
00:31:41.200 | I mean, Uber is still quite expensive
00:31:43.840 | and the most expensive piece of it
00:31:45.440 | is the human driving the car.
00:31:47.440 | It's all that labor cost.
00:31:48.800 | So if you're able to charge for the ride
00:31:51.600 | purely based on fractionalizing the CapEx
00:31:56.000 | and that CapEx isn't even that high
00:31:58.480 | because they're making these robo-taxis
00:32:00.480 | pretty cheaply now
00:32:01.440 | and you're able to get so much more
00:32:04.320 | utilization out of a car.
00:32:05.760 | - Exactly.
00:32:06.640 | - Because I mean, how many hours a day
00:32:08.320 | do you drive your own car?
00:32:09.440 | Like an hour, maybe?
00:32:10.480 | - Less.
00:32:11.120 | - But a robo-taxi--
00:32:11.920 | - 90% I think is the number on average
00:32:14.640 | that they're not used.
00:32:15.840 | Freeberg, who's your one and two?
00:32:17.280 | - I think the Optimus has the highest alpha
00:32:22.720 | but the highest beta.
00:32:23.920 | So more upside than anything else,
00:32:27.760 | just unclear how you get there,
00:32:31.440 | what the path is
00:32:32.320 | and the capital requirements.
00:32:33.760 | And I think there's going to be
00:32:35.840 | a lot of commoditization in this space.
00:32:37.760 | But the ride sharing is built in.
00:32:39.760 | It's low beta and significant alpha, so.
00:32:42.000 | - So you're one and two Optimus ride share?
00:32:44.640 | - Yeah, Optimus ride share.
00:32:46.320 | - Chamath, what was your number two again?
00:32:47.840 | Just so I can recap here.
00:32:48.720 | - I would actually probably pick energy.
00:32:50.400 | - And why would you pick energy
00:32:51.520 | as your number two after ride hailing?
00:32:52.960 | - I think it's a pretty obvious disruption
00:32:56.080 | if, so today in America,
00:32:58.320 | if you look at the energy infrastructure,
00:33:00.240 | the utilities keep raising rates.
00:33:02.720 | And they keep raising rates independent
00:33:06.160 | of what it costs to generate energy.
00:33:08.400 | There are 1,700 utilities in America.
00:33:12.480 | They are by law obligated to spend,
00:33:16.480 | I think in the next 10 years,
00:33:18.480 | about $2 trillion on CapEx.
00:33:20.560 | They will pass all of those costs
00:33:23.760 | plus a margin to consumers.
00:33:25.600 | Even though in these next 10 years,
00:33:28.240 | the cost of generating an incremental unit of energy
00:33:30.560 | will be essentially free.
00:33:32.000 | So I think the very disruptive play
00:33:34.800 | is to take 100 million homes
00:33:37.760 | and make them mini utilities.
00:33:40.080 | - Hmm, say more.
00:33:41.600 | - And.
00:33:42.240 | - What does it mean, make them utilities?
00:33:43.680 | - Yeah, you go to a website.
00:33:46.400 | Again, the biggest investment I've made
00:33:48.160 | is in a company called Palmetto.
00:33:49.600 | They are close to now the largest
00:33:52.880 | residential solar business in the United States,
00:33:55.600 | if not the largest already.
00:33:56.880 | Now, what do they do?
00:33:58.000 | You're a consumer, you can go and lease or buy,
00:34:01.040 | it doesn't really matter.
00:34:01.840 | And we give you a super cheap system
00:34:03.840 | that goes on top of your roof,
00:34:05.120 | plus a battery system.
00:34:06.720 | And all of a sudden you're totally resilient
00:34:08.880 | and independent of the utilities.
00:34:10.880 | And we help change the laws
00:34:13.680 | so that every time you have excess energy,
00:34:15.280 | you can put it back into the grid and get paid,
00:34:17.200 | which is called net metering.
00:34:18.720 | So if you have 100 million of these mini utilities
00:34:21.760 | competing against 1700 incumbents
00:34:24.400 | that are forced to make huge investments
00:34:26.320 | and are mispriced,
00:34:28.560 | that's an upheaval of an enormous amount of market cap,
00:34:33.120 | but also an enormous amount of debt.
00:34:34.880 | There's like trillions of dollars of debt.
00:34:36.640 | So I've always thought a ginormous directional bet to make
00:34:40.560 | would be to go along the disruptor.
00:34:42.240 | The person that helps arm the rebels in this case
00:34:45.120 | would be the person that arms 100 million homes.
00:34:47.760 | And then eventually at some point,
00:34:49.040 | just short the debt of these utilities,
00:34:53.040 | because they won't be able to service them.
00:34:54.480 | If enough people quit and don't need PG&E and everybody else,
00:34:57.280 | it'll be just massively, massively disruptive.
00:35:00.960 | So that's why I think that that market could be big.
00:35:04.400 | I like the optimist market.
00:35:07.680 | The only thing that I'll say is in my experience,
00:35:09.440 | when I've looked at these generalized robots,
00:35:12.720 | the conclusion that I came to,
00:35:13.920 | and I could be totally wrong,
00:35:15.680 | is I suspect instead of having a generalized robot,
00:35:19.040 | you have very specific use case robots that look differently.
00:35:22.880 | So for example, there's an example, Nick,
00:35:24.400 | you can throw a picture,
00:35:25.280 | but there's something called intuitive surgical,
00:35:27.600 | which is essentially a robot that operates on you.
00:35:29.840 | And if you look at that,
00:35:31.520 | it doesn't look anything like what you would think
00:35:34.160 | a robot would look like.
00:35:36.080 | It just looks like a ginormous machine.
00:35:38.880 | And so I suspect that there will be people
00:35:42.080 | that become experts in these vertical use cases.
00:35:45.360 | And that will limit the TAM for a generalized approach.
00:35:48.960 | I could be totally wrong,
00:35:50.720 | but that's sort of why I ranked them the way I ranked them.
00:35:52.720 | But I'm pretty convinced that robot taxis are number one.
00:35:55.360 | And then this energy thing that upends
00:35:58.000 | the 1700 utilities in America is number two.
00:36:00.480 | - All right, Jay Cal, you are the third investor in Uber.
00:36:04.320 | Are you worried about your investment?
00:36:06.400 | As I understand that you still haven't sold, have you?
00:36:08.240 | - I've sold slightly more than the majority of my holdings
00:36:12.960 | over the years to private transactions.
00:36:15.280 | And then I held everything post going public.
00:36:17.440 | And I'm still long Uber.
00:36:19.760 | I give number one to Optimus by far and away,
00:36:24.240 | because I think the build of materials on Optimus
00:36:28.240 | could be 10K or less.
00:36:31.440 | I looked at some of these other startups
00:36:34.880 | that are in the space,
00:36:35.680 | and I've been studying it a little bit
00:36:37.120 | just out of curiosity.
00:36:38.240 | And so at a 10K bomb,
00:36:40.880 | I think they could rent you one of these
00:36:42.640 | for $300 a month or something like that,
00:36:45.520 | a subscription.
00:36:46.320 | I think every human on planet Earth
00:36:49.360 | eventually will have a robot.
00:36:51.600 | So I know that's crazy,
00:36:52.560 | but I think Optimus will dwarf the entirety
00:36:56.240 | of Tesla's other businesses.
00:36:58.560 | Number two is energy, clearly,
00:37:00.400 | I think, and number three is ride hailing.
00:37:02.240 | Number two, energy, fairly obvious what's happening here.
00:37:05.040 | If you look at what's happening with data centers,
00:37:08.240 | it's spiked, I think, from like three or 4% of energy now
00:37:11.440 | to like, is it 15% Chamath?
00:37:13.280 | I think we had a discussion-
00:37:14.320 | - Yeah, it's almost, I think it's 18%,
00:37:17.200 | it's almost 20%, it's a huge number now.
00:37:19.040 | - Okay, so let's tie the two stories together.
00:37:22.080 | These H100s are beasts, cooling beasts.
00:37:25.600 | This is like, you can't even,
00:37:27.840 | Phil Deutsch was telling us in the group chat,
00:37:29.760 | like you can't even cool some of these places.
00:37:32.080 | And what that means is
00:37:33.840 | there's gonna be demand for energy,
00:37:36.240 | to Chamath's point,
00:37:37.520 | if each home becomes a little provider,
00:37:40.640 | you know, it's gonna create some regulation change.
00:37:43.280 | I believe, I'm no expert on the regulations,
00:37:44.960 | we should definitely have Deutsch on the pod
00:37:46.480 | at some point to talk about this.
00:37:47.920 | But imagine these, like in Texas,
00:37:50.960 | they keep having the grids go down.
00:37:52.400 | And in Australia, they have the grids go down.
00:37:54.880 | It's gonna be so annoying for consumers
00:37:58.080 | to have the grids go down,
00:37:59.200 | while demand spikes for data centers,
00:38:01.680 | all the regulations are gonna get opened up
00:38:04.640 | to put batteries and solar everywhere.
00:38:07.120 | - It's also dangerous.
00:38:08.640 | I mean, people die when these blackouts
00:38:10.720 | and outages occur.
00:38:11.600 | And also, like we saw in San Carlos,
00:38:14.720 | here in the Bay Area,
00:38:16.400 | Nat gas lines explode,
00:38:18.240 | PG&E is responsible for forest fires.
00:38:20.640 | I mean, if you can do your part
00:38:23.120 | to basically get off these grids,
00:38:25.040 | use somebody like Palmetto or somebody else
00:38:28.960 | and just get solar, Tesla, and be done with it.
00:38:31.440 | And at least you're not contributing
00:38:33.120 | to some of these downstream errors.
00:38:34.640 | - Yeah, and so I agree 100%.
00:38:38.400 | And by the way, if you can make money
00:38:41.280 | from your solar and your batteries
00:38:44.080 | and somebody else picks up the cost of installation,
00:38:46.800 | I think there's some arbitrage here, Chamath, to be had.
00:38:49.440 | And I think that's like what you discovered
00:38:51.120 | when you did your first principle approach to this.
00:38:55.440 | So I think those two businesses will be far bigger
00:38:58.080 | than anything else they're doing.
00:38:59.760 | Now on ride hailing, I'm using FSD-12.
00:39:04.800 | I have been using Autopilot.
00:39:06.640 | I was one of the first users of Autopilot
00:39:08.800 | from the beginning, like literally.
00:39:12.320 | And FSD-12 is a significant improvement.
00:39:15.520 | That being said, all of these systems work really well
00:39:19.600 | in a controlled grid,
00:39:21.120 | like we've seen with Waymo in Arizona
00:39:23.680 | and in limited parts of Los Angeles and San Francisco.
00:39:26.720 | I think that vision is five, six, seven years out
00:39:30.000 | from having low single digit percentages.
00:39:31.840 | And you may have seen Uber has incorporated Waymo
00:39:35.920 | and the taxis in London into their system.
00:39:39.360 | I think Uber is ultimately going to be the place people go
00:39:42.480 | to call up one of 20 different self-driving.
00:39:46.880 | I think a lot of people are going to get there
00:39:48.240 | at the same time.
00:39:48.800 | Wouldn't be surprised if Elon gets there first,
00:39:50.880 | but I think that's five or 10 years out,
00:39:53.280 | if I'm being totally honest,
00:39:54.480 | you know, in terms of getting to scale.
00:39:56.480 | So I would never, ever bet against Elon
00:39:59.840 | seeing what he's done, watching-
00:40:02.080 | - Wait a second, when are you saying
00:40:03.680 | that self-driving will be good enough
00:40:06.000 | to power just one robo-taxi?
00:40:08.080 | Let's just forget about the scale.
00:40:09.520 | Just like, when will it be good enough
00:40:11.600 | to have like a steering wheel free car?
00:40:13.680 | - Yeah, so if you can just look at cruise and Waymo
00:40:18.240 | and you can have your answer,
00:40:20.160 | which is today in a contained environment
00:40:23.120 | with remote drivers intervening
00:40:25.360 | on some regular basis, it works.
00:42:27.440 | So people are taking Waymo in San Francisco and LA
00:42:30.560 | on a pretty regular basis. Cruise,
00:42:31.840 | I think, got kicked out of San Francisco,
00:42:33.520 | but in the Cruise data,
00:42:34.800 | there were a lot of interventions going on.
00:40:36.480 | I suspect Waymo's doing a lot of interventions.
00:40:38.400 | So I think-
00:40:40.400 | - I don't want to be a proponent of this by the way,
00:40:42.640 | but bleep out his name,
00:40:45.680 | was saying that Jason with FSD,
00:40:48.880 | they're very strict that you have to be holding
00:40:51.440 | the steering wheel and stuff.
00:40:52.480 | - Yes.
00:40:52.800 | - But you can go to Alibaba
00:40:54.560 | and buy something that hacks that, right?
00:40:56.400 | - It's a weight.
00:40:57.600 | A lot of people are doing that.
00:40:58.720 | - I'm not promoting this.
00:40:58.800 | - No, no, no, no, no, no, no.
00:41:00.320 | - I'm not promoting this.
00:41:01.200 | I'm just saying that it doesn't exist.
00:41:01.920 | - So anyways, to answer your question, Saks,
00:41:03.600 | I think today,
00:41:04.320 | or let's just say by the end of this year,
00:41:07.440 | an FSD-12 could be operating in a constrained space
00:41:10.560 | just like Waymo is.
00:41:11.440 | Now to get it to all regions,
00:41:15.280 | back roads,
00:41:16.720 | you know, weather,
00:41:18.560 | there's a lot of edge cases here.
00:41:20.160 | So I think the more important question is
00:41:22.800 | when can automated be 5% of rides
00:41:24.960 | or 10% of rides?
00:41:25.920 | And I think that's five or 10 years from now.
00:41:27.680 | I do think a couple of people
00:41:28.800 | will reach that moment at the same time.
00:41:30.480 | - Well, let me show you a video.
00:41:32.000 | This is one of my coworkers at Kraft took this.
00:41:34.720 | This wasn't me,
00:41:35.680 | but he was riding in the back
00:41:37.280 | of one of these robo taxis.
00:41:38.880 | I think this is San Francisco or New York.
00:41:41.280 | I'm not sure which one.
00:41:42.240 | But there is no one in the driver's seat.
00:41:47.760 | - Yeah, but there's a safety driver
00:41:50.160 | watching this remote, I believe,
00:41:51.520 | which is what Cruz was doing.
00:41:53.520 | And then when they get to a point
00:41:54.880 | where they can't figure it out,
00:41:56.160 | the human comes in, intervenes.
00:41:59.280 | - I don't see a safety driver.
00:42:00.320 | Where's the safety driver?
00:42:01.280 | - No, no, a remote safety driver.
00:42:02.960 | - Remote, remote.
00:42:04.320 | But I have to say watching this video,
00:42:06.960 | that is the coolest bloody thing.
00:42:08.560 | I mean, this is amazing.
00:42:10.880 | I'll be honest.
00:42:11.520 | - I think it's underrated
00:42:13.200 | in terms of how significant this is going to be.
00:42:14.960 | - Me too.
00:42:15.280 | - And look, if this is a Jaguar,
00:42:17.360 | look at that, it's a Jaguar.
00:42:18.800 | So you're telling me that Tesla
00:42:20.800 | is not going to have something
00:42:22.720 | at least as good as this?
00:42:23.840 | - Better.
00:42:24.480 | - Soon?
00:42:25.440 | - Yeah, no, I think they have it already.
00:42:28.720 | If you drive this FSD-12,
00:42:30.400 | I believe FSD-12 probably could be working today
00:42:33.760 | in San Francisco under a controlled grid
00:42:36.800 | with the remote drivers like the other people have.
00:42:38.720 | If somebody knows about this, by the way,
00:42:40.720 | can they send us information?
00:42:42.080 | I'm jason@calacanis.com, for life.
00:42:43.840 | Just send me a picture.
00:42:45.360 | I'll keep it on the DL.
00:42:47.600 | I won't reveal sources. I want to know who these remote drivers are,
00:42:50.800 | what their setup looks like.
00:42:52.160 | I'm very curious to understand
00:42:54.080 | how the remote drivers work for Waymo and Cruise.
00:42:56.080 | - Can they park the car?
00:42:57.360 | - They can take over the car with a joystick,
00:42:59.200 | is my understanding,
00:43:00.160 | and move it around a garbage can
00:43:01.520 | in the middle of the street, Chamath.
00:43:02.480 | - Wow.
00:43:03.120 | - And that might be the future of this.
00:43:05.040 | Like, let's say 10 or 15%.
00:43:06.240 | Let's say 10% of the time it needs an intervention.
00:43:10.480 | One an hour, two an hour.
00:43:13.120 | Having some remote person in Manila, wherever,
00:43:15.760 | watching 10 videos concurrently
00:43:18.960 | and looking for moments to do this.
00:43:21.280 | - It's incredible.
00:43:21.760 | - Cruise workers intervened to help the company's cars
00:43:24.960 | every 2.5 to five miles,
00:43:27.520 | according to a New York Times article.
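To put rough numbers on the idea floated above of one remote operator in Manila watching ten feeds, here is a minimal back-of-the-envelope sketch. Every input (fleet size, average speed, miles between interventions, minutes per intervention, feeds per operator) is an illustrative assumption, not a figure from the episode, Waymo, Cruise, or the New York Times report.

```python
# Back-of-the-envelope sketch of the remote-operator math discussed above.
# Every number here is an illustrative assumption; tweak them to test the idea.

def operators_needed(fleet_size: int,
                     avg_speed_mph: float,
                     miles_per_intervention: float,
                     minutes_per_intervention: float,
                     feeds_per_operator: int) -> dict:
    """Estimate remote staffing for a robo-taxi fleet."""
    # Interventions each car generates per hour of driving.
    interventions_per_car_hour = avg_speed_mph / miles_per_intervention
    # Total operator-minutes of hands-on work per hour, fleet-wide.
    busy_minutes_per_hour = (fleet_size * interventions_per_car_hour
                             * minutes_per_intervention)
    operators_for_interventions = busy_minutes_per_hour / 60
    # You also need enough people simply watching the feeds.
    operators_for_monitoring = fleet_size / feeds_per_operator
    return {
        "interventions_per_car_hour": round(interventions_per_car_hour, 2),
        "operators_for_interventions": round(operators_for_interventions, 1),
        "operators_for_monitoring": round(operators_for_monitoring, 1),
    }

# Example: 300 cars, ~12 mph urban average, an intervention every 4 miles,
# 2 minutes of remote help each time, one operator watching 10 feeds.
print(operators_needed(300, 12, 4, 2, 10))
# -> roughly 3 interventions per car-hour, ~30 operators busy on
#    interventions and ~30 just monitoring, under these assumptions.
```

Under those assumptions, intervention handling and passive monitoring each come out to a few dozen operators per few hundred cars, which is the kind of ratio that decides whether the remote-assist model pencils out.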
00:43:29.680 | All right.
00:43:30.000 | And just if we score everybody's rating here,
00:43:32.480 | I give two points for first place,
00:43:33.680 | one point for second place.
00:43:34.960 | Optimus got six points.
00:43:36.560 | Ridehailing, four.
00:43:37.520 | Energy, two.
00:43:38.080 | Trucking, zero.
00:43:39.520 | There you have it, folks.
00:43:40.160 | There's your scores from the besties.
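For anyone who wants to see how that two-points-for-first, one-point-for-second tally works, here is a minimal sketch. The individual ballots below are hypothetical (the episode only announces the totals), but they are chosen so the arithmetic reproduces the stated score of Optimus 6, Ridehailing 4, Energy 2, Trucking 0.

```python
# Minimal sketch of the 2-points-for-first, 1-for-second tally described above.
from collections import Counter

POINTS = {1: 2, 2: 1}  # first place = 2 points, second place = 1 point

# One hypothetical (first, second) ranking per bestie, chosen so the
# totals match the announced score.
ballots = [
    ("Optimus", "Ridehailing"),
    ("Optimus", "Ridehailing"),
    ("Optimus", "Energy"),
    ("Ridehailing", "Energy"),
]

tally = Counter()
for first, second in ballots:
    tally[first] += POINTS[1]
    tally[second] += POINTS[2]

print(tally)
# Counter({'Optimus': 6, 'Ridehailing': 4, 'Energy': 2})  (Trucking: 0)
```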
00:43:42.160 | - What is Waymo?
00:43:43.360 | What is that?
00:43:43.840 | Is that like Uber?
00:43:44.400 | - That's Google self-driving.
00:43:45.520 | They spun it out.
00:43:46.240 | - But what is it?
00:43:48.640 | Is it like a service like--
00:43:49.600 | - LiDAR, $150,000 cars.
00:43:52.320 | - Uh-huh.
00:43:54.240 | But it competes with Cruise?
00:43:55.760 | - Yeah.
00:43:57.040 | They're doing what Cruise did.
00:43:58.160 | - It was the original self-driving car company
00:44:00.240 | that Google set up.
00:44:01.280 | And then they set up Google X around it.
00:44:03.280 | - Just explain to me the market, guys.
00:44:05.360 | I'm not sure I get it.
00:44:06.000 | So there's Waymo and Cruise.
00:44:07.280 | - It's Uber.
00:44:07.840 | - But there's Waymo and Cruise,
00:44:09.760 | which are like, there's no drivers.
00:44:11.280 | And then there's Uber and Lyft,
00:44:12.560 | where there are drivers.
00:44:13.600 | - Yeah.
00:44:14.320 | - That's the basic, right?
00:44:16.320 | - Yeah.
00:44:16.880 | - That's right?
00:44:17.360 | - Yeah.
00:44:17.680 | - Yeah.
00:44:18.320 | - And there's also Aurora.
00:44:19.520 | I don't know if Uber owns a percentage of that
00:44:23.280 | or not, but I think you know about Aurora.
00:44:25.200 | - Aurora was set up by Chris Urmson.
00:44:27.520 | He was the original engineering lead on Waymo
00:44:30.240 | inside of Google.
00:44:31.200 | When they did a management changeover,
00:44:32.720 | there was a big falling out.
00:44:33.680 | A lot of people ended up leaving the Waymo program.
00:44:36.080 | Chris left, took a bunch of people with him
00:44:38.400 | and started Aurora,
00:44:39.200 | which was basically software to do autonomous driving
00:44:42.560 | that they were then gonna license into OEMs.
00:44:45.520 | So that company got backed by a bunch of VCs
00:44:47.920 | and ended up going public via SPAC.
00:44:50.000 | And it's publicly traded today.
00:44:51.440 | - Ah, I see.
00:44:52.240 | - Uber owns 26% of Aurora.
00:44:54.640 | Or actually now it's down to 20%.
00:44:56.240 | There's also Zoox.
00:44:57.040 | - They made an investment in it, yeah.
00:44:58.880 | - Yeah.
00:44:59.440 | There's also Zoox, which is part of Amazon.
00:45:02.080 | So I think there's about 10 people
00:45:04.240 | who are gonna get to the finish line at the same time.
00:45:05.840 | I think it's gonna be a very crowded space.
00:45:07.040 | - But again, some of these guys
00:45:08.240 | are trying to make software for OEMs
00:45:10.320 | so that the OEMs don't have to build
00:45:11.920 | their own autonomous platform.
00:45:13.360 | And some of them are trying to make
00:45:15.680 | a driving service like Waymo.
00:45:20.960 | And then some of them are like Tesla, which has a self-driving car.
00:45:24.000 | They actually sell cars, so different models.
00:45:26.000 | - Yeah.
00:45:26.240 | - So they're all gonna converge, yeah.
00:45:27.840 | - I actually disagree that there's gonna be
00:45:30.080 | 20 or 30 of these, because I actually think
00:45:32.800 | that self-driving is really, really hard.
00:45:34.560 | And we know that Tesla has been spending years
00:45:38.800 | getting to this point.
00:45:41.280 | And then Google as well.
00:45:42.640 | So I'm not sure why there'd be more
00:45:44.320 | than two or three companies that can do self-driving.
00:45:46.880 | - Well, there are 10 or more pursuing it.
00:45:50.160 | I do agree it goes down to four or five.
00:45:52.880 | And then it's a question of does any,
00:45:54.240 | is it a winner-take-all, winner-take-most?
00:45:56.240 | And that's what remains to be seen.
00:45:58.320 | - But I mean, you need a ton of data to make it work.
00:46:01.280 | - Yeah.
00:46:01.360 | - And you need a lot of data around edge cases.
00:46:03.760 | And so the advantage Tesla has is,
00:46:06.160 | they actually have real--
00:46:07.200 | - They have so many cars on the road.
00:46:08.480 | - They have real cars on the road
00:46:10.000 | with real driver minutes,
00:46:11.440 | and with drivers using the previous version
00:46:14.640 | of FSD and making interventions.
00:46:16.400 | So they're constantly getting better.
00:46:19.280 | I think Google seems to be putting enough resources
00:46:22.320 | into this to get somewhere.
00:46:24.320 | But I don't think a lot of companies
00:46:26.160 | are gonna figure this out.
00:46:27.120 | I think it's gonna be somewhere between
00:46:28.480 | one and three companies figure this out.
00:46:30.880 | - Well, Waymo has the most on the road right now.
00:46:33.280 | I think Cruise is second to them in terms of what's in market.
00:46:36.080 | So the question is, how quickly can Elon get this out?
00:46:39.200 | Have you guys, any of you guys
00:46:40.640 | taken one of these robo-taxis?
00:46:42.240 | - I haven't taken one.
00:46:44.240 | - No, that's one of my coworkers.
00:46:45.760 | - Yeah, we've gotta do that.
00:46:47.200 | - Freeberg, have you taken one?
00:46:48.080 | - Yeah, they're cool.
00:46:48.800 | They just move around.
00:46:49.920 | - You did take one.
00:46:50.880 | - Yeah, they're all over the city.
00:46:53.040 | You can take them in San Francisco now.
00:46:54.400 | - Yeah.
00:46:54.720 | - You have to download the Waymo app.
00:46:56.640 | - Yeah, and there's like a wait list,
00:46:58.320 | and then you get on it,
00:46:59.120 | and they're letting people on,
00:47:01.120 | and then you just get on it and go for a ride.
00:47:02.880 | Pretty cool.
00:47:03.920 | - By the way, someone tweeted something kind of funny.
00:47:06.160 | They said, "I took Uber years and years
00:47:09.520 | "and like a trillion dollars
00:47:10.720 | "to finally get to profitability,
00:47:12.240 | "and now they're about to get disrupted by robo-taxis."
00:47:14.800 | This is brutal.
00:47:15.600 | - Unlikely, I think unlikely.
00:47:18.480 | I think it's gonna be a very slow rollout,
00:47:20.960 | you know, 1% or 2% a year, so.
00:47:23.360 | But you know, it compounds, and so we'll see.
00:47:25.600 | All right, everybody, is this a big win for Lina Khan?
00:47:28.240 | The FTC just banned non-competes.
00:47:30.720 | Three of three people voted this up
00:47:32.160 | of the three besties who decided to vote on the docket.
00:47:35.280 | Federal Trade Commission this week voted
00:47:38.240 | by a three to two margin to ban non-competes
00:47:40.240 | to give you a sense of the impact here.
00:47:42.080 | Estimated of 18% of the total US workforce
00:47:44.560 | are covered in some way by non-competes.
00:47:47.120 | 30 million people.
00:47:48.240 | The ruling bans agreements entirely
00:47:51.920 | and requires companies to let their staff know
00:47:53.760 | that non-competes are non-enforceable.
00:47:55.600 | There are a few exceptions like existing non-competes
00:47:57.920 | for senior level executives.
00:47:59.200 | I'm guessing when you buy a company,
00:48:03.680 | you can still get somebody to sign a non-compete
00:48:04.880 | as part of buying their company.
00:48:04.880 | The Democratic FTC Commissioner, Rebecca Slaughter,
00:48:08.080 | called non-competes unfree and unfair.
00:48:11.360 | One of the dissenting commissioners,
00:48:13.200 | Republican Andrew Ferguson,
00:48:14.960 | said the FTC lacked authority from Congress
00:48:17.600 | to make this ruling without specifying
00:48:19.680 | whether or not he agreed with it.
00:48:20.800 | Theoretically, the new rule takes effect in 120 days.
00:48:23.520 | Wow, we're on a fast timeline.
00:48:25.120 | After which the vast majority of existing non-competes
00:48:27.840 | will be unenforceable.
00:48:29.600 | Here in California, they're non-enforceable as we know,
00:48:31.680 | except under certain circumstances.
00:48:33.760 | So this is really an East Coast thing.
00:48:35.200 | People in Boston, New York,
00:48:37.360 | they're really affected by this
00:48:38.640 | and people will actually enforce these
00:48:41.520 | for things as silly as hairdressers.
00:48:43.920 | There was a big lawsuit by hairdressers
00:48:45.520 | who had to sign like 50-mile non-competes
00:48:50.160 | with their beauty salons.
00:48:51.520 | The U.S. Chamber of Commerce says
00:48:54.560 | it plans to file a suit over the ruling.
00:48:56.000 | FTC says the rule will create 8,500 new startups a year
00:49:00.320 | because people can leave and compete.
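As a quick sanity check, the 18% and 30 million figures quoted above are consistent with each other if you take the US labor force to be roughly 165 to 170 million people; that labor-force number is an outside assumption, not something cited in the episode.

```python
# Quick consistency check on the figures above. The labor-force size is an
# outside, approximate assumption, not a number from the episode.
labor_force = 167_000_000          # approximate US labor force
share_with_noncompetes = 0.18      # 18% cited above

covered = labor_force * share_with_noncompetes
print(f"{covered/1e6:.0f} million workers")   # ~30 million, matching the claim
```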
00:49:02.640 | Friedberg, your thoughts.
00:49:04.320 | In California, non-competes are not enforceable,
00:49:07.280 | but outside of California, they are.
00:49:09.280 | And they're often used like in the agriculture industry,
00:49:13.440 | in the ag inputs industry,
00:49:14.720 | where folks are developing new technology,
00:49:16.960 | mostly at companies based in the Midwest and the East Coast.
00:49:20.640 | As you guys know, in financial services as well,
00:49:23.360 | they're often used so that if you work for one company
00:49:26.560 | where you have a lot of trade secrets,
00:49:27.920 | you can't carry them over to others.
00:49:29.120 | But in Silicon Valley, we're very used to engineers,
00:49:31.440 | product managers, executives,
00:49:32.880 | taking all of the knowledge and capabilities
00:49:35.440 | that they've developed at one business
00:49:36.720 | and leveraging that into a new role.
00:49:39.200 | It creates a dynamically competitive marketplace.
00:49:42.960 | It also inflates, obviously, costs
00:49:45.120 | because you're willing to pay more for someone
00:49:47.040 | who's worked directly at a competitor
00:49:48.560 | and has deep knowledge.
00:49:49.520 | But they are bringing IP often,
00:49:51.680 | and there is a real risk that they take something
00:49:54.240 | from your company that you've invested in them to understand,
00:49:57.680 | and they take it to your competitor.
00:49:59.520 | So I see both sides of this.
00:50:01.440 | I think non-competes give companies the ability
00:50:06.160 | to invest in employees and feel confident
00:50:09.440 | that the knowledge and information and support
00:50:11.760 | they're providing to those employees
00:50:12.960 | doesn't ultimately find its way over to a competitor.
00:50:17.360 | And it gives you a rationale for paying more
00:50:19.920 | to your employees.
00:50:20.640 | It actually inflates salaries.
00:50:22.400 | If I'm fearful that someone might leave
00:50:24.880 | and take knowledge with them,
00:50:26.240 | I'm going to share less with the employee.
00:50:27.840 | I'm going to pay them less.
00:50:29.360 | So I could see this going both ways.
00:50:31.120 | I'm really fairly torn on this issue, to be honest.
00:50:33.200 | I don't know what the right answer is,
00:50:35.200 | or if there really is a strong kind of ethical case
00:50:38.960 | one way or the other.
00:50:40.960 | I think that non-competes can both benefit
00:50:43.680 | and hurt employees and benefit and hurt companies.
00:50:46.320 | Yes, there are unintended consequences.
00:50:48.640 | Chamath, where do you stand on this?
00:50:49.760 | Is this a huge win for Lina Khan, neutral, or a loss?
00:50:53.280 | I don't see non-competes having any value in any market
00:50:58.640 | except East Coast financial markets.
00:51:02.240 | I've heard of folks getting something called beach leave,
00:51:06.400 | typically in finance.
00:51:08.320 | These are Wall Street jobs, hedge fund jobs.
00:51:10.640 | So I know it exists there.
00:51:13.280 | What does it mean, beach leave?
00:51:14.400 | If you leave hedge fund A and you're going to go to hedge fund B,
00:51:18.080 | they make you sit on ice for six months or nine months,
00:51:21.600 | garden leave, I think maybe they call it.
00:51:23.520 | And the whole idea is that you don't take any
00:51:25.840 | active market thoughts to your next job.
00:51:29.680 | For what Freeberg said,
00:51:31.600 | I just think the whole thing doesn't make much sense.
00:51:33.520 | Intellectual property is the least valuable it's ever been.
00:51:40.240 | Right?
00:51:40.640 | More people develop things now via trade secret than patents.
00:51:45.680 | The whole patent system has become a little bit of a game.
00:51:48.800 | We just talked about open source, right?
00:51:50.880 | They have less and less value.
00:51:52.400 | So innovation is happening in plain sight.
00:51:55.040 | And so I don't think companies are actually thinking about this issue at all anymore.
00:51:59.200 | They're paying people because they need to pay more to get the best and brightest.
00:52:04.480 | So I think if you're hoping to rely on one of these agreements, if you're an employer,
00:52:11.600 | I think that you should just forget about them and just hire the best people,
00:52:16.240 | pay them the best and if they suck, fire them.
00:52:18.400 | Did you ever see that movie, Spanish Prisoner, starring Steve Martin?
00:52:22.480 | Great film.
00:52:24.400 | God, that's a deep pull.
00:52:25.760 | Steve Martin plays the con man that goes to this guy.
00:52:30.000 | And the guy is like a chemical engineer.
00:52:32.160 | Campbell Scott is the guy's name, the actor.
00:52:34.080 | And he has like a process for making a chemical and it's worth so much money.
00:52:40.080 | And the whole movie is Steve Martin conning him into telling him the chemical process.
00:52:44.720 | And eventually he took it, he was going to give him all this money, hire him.
00:52:47.840 | And then he disappeared once he got the knowledge.
00:52:49.360 | And that was the end of it.
00:52:50.000 | It's a great, great film.
00:52:51.360 | Shout out David Mamet.
00:52:52.720 | David Mamet, exactly.
00:52:53.760 | But I think there's a lot of investment that goes into intellectual property businesses that
00:53:02.320 | include things like chemical engineering, obviously financial services, trading companies,
00:53:07.680 | algos in hedge funds, and a lot in software that if you can access that intellectual property
00:53:14.000 | without stealing it, but actually hiring an employee that has it in their head,
00:53:18.320 | you gain quite a lot.
00:53:20.960 | But you can do that now.
00:53:22.000 | I don't see it actually doing anything bad.
00:53:24.000 | So I have had a lot of folks that I know in other industries outside of tech or software,
00:53:28.560 | where California non-competes are not enforceable,
00:53:30.960 | that aren't allowed to go work at a competitor for one year, two years, three years.
00:53:34.480 | No, no, no, I'm not disputing that it exists.
00:53:36.560 | I'm saying where do you see it actually having negative repercussions in California?
00:53:40.560 | The fact that it doesn't exist here also seemingly shows that it doesn't mean anything.
00:53:45.680 | Yeah, but the rate of change here is crazy.
00:53:48.320 | Like, the rate at which software changes, I think is quite distinct from the rate at which,
00:53:54.400 | for example, a chemical engineering process might change.
00:53:57.440 | Where you learn that process, you spend $200 million building a system,
00:54:00.880 | and then that that employee can go and tell it to someone else
00:54:03.440 | and suddenly changes everything about that other business.
00:54:05.840 | So I don't know, I feel like there's something about California software
00:54:09.360 | that gives us this, this point of view that maybe it doesn't matter.
00:54:12.800 | But in other industries, it might.
00:54:14.160 | In that example, you'd still need to have come up
00:54:16.160 | with hundreds of millions of capex to implement it, whatever, you know.
00:54:19.200 | Yeah.
00:54:19.360 | So Saks, any thoughts?
00:54:20.560 | I just don't think these things mean anything.
00:54:22.400 | Yeah. Do you care about non-competes, Saks?
00:54:24.560 | I think one of the reasons why the tech industry moves so fast
00:54:26.720 | and innovates so well is because of the happy coincidence
00:54:30.400 | that California doesn't allow non-competes.
00:54:33.520 | And so the industry's never had them.
00:54:35.120 | And as a result of that, like you're saying,
00:54:38.000 | the talent just flows freely to wherever the opportunity is.
00:54:40.960 | And you have this dynamic in tech where the VCs and the talent
00:54:47.760 | all swarm the biggest and best opportunities
00:54:51.520 | and the newest platforms in a completely decentralized way.
00:54:54.880 | And there's little or no friction in doing that.
00:54:57.840 | If you were to impose non-competes on all of the talent,
00:55:02.320 | there's going to be enormous delays and friction
00:55:04.800 | in them moving around to pursue opportunities.
00:55:07.440 | And so I think it's been very beneficial to the industry as a whole
00:55:10.880 | that we don't have that.
00:55:11.840 | Now, it's hard for me to speak to other industries,
00:55:14.400 | but I think that if you're looking to maximize the pace of innovation,
00:55:18.160 | you wouldn't have these employee non-competes.
00:55:20.800 | So on that level, I kind of agree with what Lina Khan's done.
00:55:24.800 | That being said, I do think the Republicans' point was well taken
00:55:28.880 | in the sense that this was very aggressive rulemaking.
00:55:32.080 | I mean, I think you could legitimately argue
00:55:35.440 | that a change this big should be made by Congress,
00:55:38.640 | but I tend to think this is probably a positive change.
00:55:42.480 | - I think this is great, what Lina Khan's doing.
00:55:44.960 | And this is two for two.
00:55:45.920 | I think her Apple action was great.
00:55:47.840 | I think this is great.
00:55:48.960 | Love to have her at the summit.
00:55:50.160 | That would be an amazing fireside chat.
00:55:51.920 | So if anybody knows her, invite her to the summit.
00:55:55.520 | There really is three way.
00:55:56.560 | Oh, and by the way, Chamath,
00:55:57.840 | the media business on the East Coast also does this.
00:56:01.520 | And that's why friends of the pod Tucker Carlson and Don Lemon
00:56:07.280 | both started shows on the internet, not on other networks,
00:56:11.920 | because they have very tight non-competes and it's pay to play.
00:56:15.040 | They got paid out their full contracts according to reports.
00:56:18.000 | And so there is something I think in the middle ground here, Chamath,
00:56:22.480 | where if you do want somebody to not compete,
00:56:26.000 | you got to pay to play.
00:56:27.040 | In other words, they're getting paid.
00:56:28.400 | - This is what I'm saying.
00:56:29.120 | It just tends to be examples, Jason, over and over of markets
00:56:32.720 | that don't really matter that much relying on these things
00:56:36.320 | and markets that are dynamic and add value to humanity,
00:56:39.200 | just moving past them.
00:56:41.280 | So it probably makes sense for all of us to just go move past them.
00:56:44.560 | - Absolutely, 100%.
00:56:45.760 | That's going to be great for the economy.
00:56:47.040 | Sach, you were going to say something?
00:56:48.000 | - Well, just because these agreements are no longer enforceable
00:56:52.160 | doesn't mean that businesses don't have moats.
00:56:54.800 | It just means you have to look in other places.
00:56:56.960 | I mean, you have to find network effects.
00:56:59.520 | You have to find data scale effects.
00:57:00.960 | There are other ways of building a moat besides just trade secrets.
00:57:06.320 | Patents are another category where, in the tech industry,
00:57:11.440 | they have never really mattered that much.
00:57:12.960 | And the only people who are seeking to litigate patents
00:57:15.040 | are basically patent trolls.
00:57:16.800 | - Losers.
00:57:17.760 | - So you want to find real durable moats,
00:57:21.680 | not these like legal arrangements that try to protect your business
00:57:24.720 | through these types of contracts.
00:57:27.520 | - That's really well said.
00:57:29.120 | That's exactly what I was trying to say.
00:57:30.320 | That's well said.
00:57:30.880 | - Well said.
00:57:31.760 | Team, product, brand.
00:57:34.480 | These are the ways to build a great business,
00:57:36.800 | not legal shenanigans.
00:57:37.600 | - And by the way, one of the reasons why the industry moves so fast
00:57:42.080 | is because best practices get shared very quickly.
00:57:45.440 | And one of the ways that happens is that
00:57:47.440 | everybody is moving around to different companies.
00:57:50.720 | I mean, what's the average term of employment?
00:57:52.720 | - 18 to 36 months.
00:57:54.800 | - Right.
00:57:55.120 | And if you've got
00:57:56.800 | some knowledge in your head,
00:57:58.960 | you don't have to worry that if you use it at some new company,
00:58:01.840 | you're going to violate some trade secret.
00:58:03.920 | But the one thing everyone knows not to do is take anything with them
00:58:08.400 | in terms of, you know, IP.
00:58:10.640 | - Not everybody.
00:58:11.360 | Not everyone.
00:58:13.200 | Levandowski, was that his name?
00:58:13.760 | Levandowski.
00:58:13.760 | - Well, that's why it's really stupid.
00:58:14.880 | - What an idiot.
00:58:15.600 | Oh my God.
00:58:16.160 | - Yeah.
00:58:16.320 | I mean, sometimes people do break the rules.
00:58:18.080 | I mean, the rule is you don't take any work product with you.
00:58:22.000 | You don't take any documents.
00:58:23.360 | - No thumb drives, dipshits.
00:58:25.360 | - Don't bring your old computer to the new job.
00:58:28.720 | Don't bring any old code.
00:58:30.320 | - No docs.
00:58:31.040 | - And there are people who violate those rules,
00:58:33.120 | and that is definitely breaking the rules.
00:58:35.440 | But you're allowed to take with you anything in your head.
00:58:38.160 | And it is one of the ways that best practices sort of become more common.
00:58:43.520 | - You guys know the story, right?
00:58:44.960 | Like Steve Chen, before he started YouTube, was an engineer at Facebook.
00:58:48.960 | And there was like, at one point, I remember being in some meeting where I was like,
00:58:54.400 | "Oh, I think we found an early version of the YouTube code on an old laptop."
00:58:59.920 | That was a Facebook laptop that Steve owned.
00:59:02.000 | So I remember like the GC was like, "I think we technically own YouTube."
00:59:06.240 | - That's a good general counsel.
00:59:09.440 | - "I think we own YouTube."
00:59:12.480 | Could you imagine if Facebook sued Google to get YouTube?
00:59:16.000 | - Speaking of startups, this is critically important for founders listening.
00:59:19.760 | If you're working at a company, the best practice is,
00:59:22.400 | if you're going to do a side hustle, is to get permission and a waiver
00:59:25.520 | from your boss that you're working on that.
00:59:27.760 | - And get a clean machine.
00:59:29.280 | - And you got to go on a clean machine.
00:59:31.280 | Your machine is being monitored.
00:59:32.720 | - Cannot use your work computer.
00:59:34.080 | - Your work machine, I'm telling you right now,
00:59:36.320 | is being monitored every step of the way.
00:59:38.480 | - Well, it's not just monitoring.
00:59:40.960 | - It's their property.
00:59:42.800 | It's their property.
00:59:43.760 | And if you type on it, especially in this work from home era,
00:59:47.280 | and you know, anyway, I had a situation, I wouldn't say at which company,
00:59:52.640 | where a sales executive left the company,
00:59:56.640 | went into the CRM system, downloaded it, and he was so dumb.
01:00:02.800 | He had written plans for his next employer on his computer.
01:00:09.040 | And then in his corporate email, emailed the database
01:00:12.720 | and his plans for the new employer to his Yahoo or Outlook email.
01:00:17.600 | - Oh, my God.
01:00:18.880 | - They busted him.
01:00:20.160 | - Of course.
01:00:20.880 | - They dropped a legal letter on him.
01:00:22.880 | They put a legal letter on the person who's hiring him.
01:00:24.800 | The person who hires him rescinds the offer.
01:00:27.280 | Obviously, they don't want some toxic nonsense.
01:00:29.200 | And yeah, the person, you know, I have no idea what happened.
01:00:33.360 | - You know, there's that very famous email that Steve Jobs sent to Bruce Chizen,
01:00:38.320 | which goes something along the lines of,
01:00:40.080 | "Dear Bruce, it's come to my attention that Apple never recruits from Adobe,
01:00:45.280 | but it seems like Adobe is actively recruiting Apple employees.
01:00:49.280 | One of our practices has to change."
01:00:51.680 | - That Steve Jobs is a gangster.
01:00:55.280 | - You know, "Please let me know who," or something like that.
01:00:57.680 | It's just a phenomenal email.
01:00:59.200 | - Yeah, and this actually resulted in a bunch of legal action, a settlement,
01:01:04.240 | because I think Eric Schmidt and Steve Jobs,
01:01:07.200 | correct me if I'm wrong, producers,
01:01:08.400 | or there was something between Google and Apple,
01:01:10.560 | where they agreed, and I think Sergey or somebody was--
01:01:14.160 | - This is true, right?
01:01:14.960 | - Yeah, Sergey got in on it and says, "I don't know if this is legal,
01:01:18.160 | but they didn't want to piss Steve off
01:01:21.040 | because it's not the kind of guy you want to piss off
01:01:23.600 | because he can execute."
01:01:25.360 | And Eric Schmidt was on the board of Apple at the time.
01:01:29.120 | There's a very famous picture of them at the Sand Hill Road,
01:01:33.680 | or Stanford Mall, having coffee, and Steve is lit.
01:01:37.680 | He's not happy about this.
01:01:38.960 | And I think they agreed to not poach a certain level,
01:01:43.920 | or something like that.
01:01:44.560 | Oh, there's the famous picture where they were breaking bread,
01:01:46.480 | and then Eric Schmidt did Android.
01:01:48.800 | Steve felt very betrayed by this,
01:01:50.400 | and Eric Schmidt left the board of Apple shortly thereafter,
01:01:54.720 | because Steve reportedly felt like he was double-crossed.
01:01:59.440 | But, you know, all that is speculation,
01:02:01.440 | so check your own facts.
01:02:03.520 | - Do you have any other details on that, Freeberg?
01:02:05.760 | Do you remember that case?
01:02:07.120 | It was after your time, right?
01:02:08.240 | - I'll tell you a funny story.
01:02:10.240 | We had a meeting with the exec team from Apple
01:02:13.920 | when I was at Google,
01:02:14.880 | and Marissa Mayer, and Sergey, and myself,
01:02:20.880 | and one other person went to the meeting,
01:02:22.560 | and Marissa's idea was, "Why don't we pitch Apple
01:02:26.720 | that we could put a cache of the internet in 200 megabytes,
01:02:30.480 | like the most important 200 megabytes of the internet,
01:02:33.360 | cache it, and put it on the iPod,
01:02:34.720 | so you could access the internet on the iPod?"
01:02:36.720 | - Surf the web.
01:02:37.440 | - Right.
01:02:37.760 | - Wow!
01:02:38.080 | - So basically, search a cached version of the web.
01:02:41.680 | And that was the pitch.
01:02:42.560 | And they kind of were like radio silent,
01:02:44.000 | they didn't respond to the pitch,
01:02:45.200 | and then they kind of blank-faced stared at us.
01:02:47.600 | Two years later, the iPhone came out.
01:02:50.320 | - Hmm.
01:02:51.040 | I have three amazing Steve Jobs stories.
01:02:53.200 | I'll tell you one of them right now.
01:02:54.400 | I was the co-founder of a website called Engadget,
01:02:58.000 | which was one of the largest blogs in the world.
01:02:59.520 | We wound up selling the collection of those blogs to AOL.
01:03:02.720 | And I was at a conference, Steve Jobs was there,
01:03:04.960 | he sees my badge, says, "Engadget."
01:03:07.440 | He goes, "Oh, Jason, Engadget,
01:03:10.000 | my favorite tech blog in the world,
01:03:13.040 | I read it every day, yada, yada."
01:03:14.400 | And I said, "Oh, that's great."
01:03:16.800 | And you know, dah, dah, dah.
01:03:17.520 | And they said, "Hey, can I ask you a question?"
01:03:19.040 | I said, "Yeah, sure, anything."
01:03:20.000 | And we're out there, beautiful view of the Pacific Ocean.
01:03:23.200 | And I said, "You know, Steve, I have the iPod."
01:03:25.520 | I pulled it out of my pocket, and it had a color screen.
01:03:28.800 | And I said to him, I said, "You know, it seems to me
01:03:31.760 | that I could download short video clips,
01:03:34.800 | like of the Chappelle Show."
01:03:35.680 | And I use that as an example.
01:03:36.800 | And they could sell them to me, like you're selling MP3s,
01:03:39.600 | and you could just buy like a Chappelle Show episode,
01:03:42.800 | it's a pretty obvious idea.
01:03:44.000 | And then I could play it when I'm on the subway or something,
01:03:46.400 | or got some downtime in a cafe,
01:03:48.720 | I could watch the Chappelle Show clips.
01:03:50.240 | But could you make a video version of this,
01:03:55.360 | since it's already got the storage and the screen?"
01:03:55.360 | And he looked at me, he said, "Jason,
01:03:57.520 | do you want to watch a thumbnail size video?"
01:03:59.600 | I said, "Yeah, totally fine."
01:04:01.920 | He said, "All of our research says
01:04:06.080 | nobody wants to watch a tiny video like that.
01:04:08.640 | It's the worst idea ever."
01:04:10.160 | And I had this moment of like self-doubt,
01:04:13.280 | 'cause he had that reality distortion field.
01:04:15.520 | I go back, I tell my guys what he said,
01:04:17.200 | and, "Hey, should we write a story about it?"
01:04:18.800 | And 30 days later, he releases the video version,
01:04:23.600 | and the ability to watch videos on it.
01:04:26.480 | He had the greatest poker face ever.
01:04:28.640 | It was just an incredible moment.
01:04:30.800 | Man, I really miss that guy.
01:04:32.480 | What an incredible, what an incredible--
01:04:36.080 | Are we doing a walk down memory lane of our Steve Jobs stories?
01:04:39.120 | Did you have one?
01:04:39.680 | I have two, but I--
01:04:41.680 | Give us one.
01:04:42.240 | Come on, these are great.
01:04:43.280 | People loved, by the way, the segment when we went on--
01:04:45.360 | When I worked at Winam, my boss,
01:04:49.040 | when it was merged into AOL to create AOL Music,
01:04:55.280 | was this guy, Kevin Conroy, great guy.
01:04:57.360 | I remember Kevin.
01:04:58.080 | And Kevin basically got us runway
01:05:03.760 | to launch a 99-cent music download store.
01:05:07.360 | And we were able to use AOL's cards on file,
01:05:13.040 | so there was like 25 or 30 million credit cards.
01:05:15.280 | And we were able to use Warner Brothers' music library,
01:05:19.920 | because AOL had merged with Time Warner.
01:05:22.960 | And so we launched this 99-cent music store,
01:05:25.600 | and it performed phenomenally.
01:05:28.720 | And I went and I did demos for Walt Mossberg and Kara Swisher.
01:05:33.680 | I went to Apple.
01:05:35.520 | I showed James Higa, Eddie Cue, Steve Jobs, all these guys.
01:05:39.600 | And AOL just couldn't do anything with the data.
01:05:43.360 | We kept pushing.
01:05:44.160 | Kevin kept pushing.
01:05:45.440 | But they were just so tied up in their own political nonsense,
01:05:49.120 | they wouldn't greenlight it becoming more than a beta.
01:05:51.920 | And then nine months later,
01:05:53.120 | the 99-cent download store launched on iTunes.
01:05:57.040 | And I was just like, "Oh my God, this is execution."
01:06:00.000 | And they had all the labels,
01:06:01.280 | and it was a thousand times better experience than we created.
01:06:04.800 | But it was the first time where I stumbled
01:06:08.320 | into something relatively accidentally
01:06:09.920 | and just got, frankly, owned by an organization
01:06:13.200 | that was just a thousand times better.
01:06:14.480 | - Saks, did you ever meet Steve Jobs in person?
01:06:16.880 | - No. - Freeberg?
01:06:18.720 | - Yeah, I met him at Macworld in 2003.
01:06:24.560 | He was walking the floor.
01:06:26.800 | It was at Moscone.
01:06:27.840 | I remember it because I've been a Mac user since 1984.
01:06:32.000 | First Mac original.
01:06:32.880 | I still have it in my office.
01:06:33.920 | Diehard, diehard Mac fan.
01:06:36.000 | I read every edition of Macworld, Macuser, Macweek.
01:06:40.320 | And so to meet him on the floor, walking around,
01:06:43.120 | I was just like a 22-year-old kid.
01:06:46.000 | I was freaking out.
01:06:47.040 | It was awesome.
01:06:47.920 | Only time I met him.
01:06:48.720 | - He was the greatest.
01:06:50.480 | He was the absolute greatest.
01:06:52.240 | - I have one other story, but I'm not going to share it.
01:06:54.320 | - You know what segment people loved last week?
01:06:57.680 | - I interviewed with him for a job.
01:06:59.440 | - What?
01:06:59.920 | Oh, come on.
01:07:00.480 | You can't not tell that story.
01:07:01.680 | - Right when I left Facebook.
01:07:02.560 | - People loved the story last week
01:07:04.320 | when we did our childhood businesses.
01:07:06.400 | - I want to hear the story, Chamath.
01:07:07.680 | - Come on, Chamath.
01:07:09.200 | You give us one more.
01:07:10.080 | I'll give you one that's recorded on video.
01:07:11.760 | - It's like so littered with my failures
01:07:14.320 | and my insecurities, but okay.
01:07:15.520 | - Good, that's what we want to hear.
01:07:16.480 | - Yeah, that's what we want to hear.
01:07:17.360 | - Get it out, get it out.
01:07:18.320 | - Be self-deprecating.
01:07:19.680 | - Share with your brothers.
01:07:20.640 | - There was basically somebody
01:07:22.960 | at a well-known recruiting firm
01:07:24.880 | who pinged me and said,
01:07:25.680 | "Hey, there could be a consult."
01:07:27.600 | This is right when I was leaving Facebook.
01:07:29.280 | And I had a pretty large project
01:07:34.080 | that never saw the light of day,
01:07:35.200 | which was this Facebook phone.
01:07:36.640 | Although people in the ecosystem
01:07:38.240 | knew that we were working on it.
01:07:39.520 | So, you know, Samsung and Intel were our partners,
01:07:42.000 | AT&T and all this stuff.
01:07:43.520 | Anyways, long story short,
01:07:44.880 | there was a rumor that they wanted
01:07:47.920 | to consolidate iPhone into one business unit, right?
01:07:51.360 | And at the time it was kind of like
01:07:52.640 | very balkanized or whatever.
01:07:54.800 | And I got approached by a recruiter saying,
01:07:57.520 | "They're thinking of creating a role,
01:07:59.520 | "which is kind of like head of iPhone."
01:08:01.360 | And at that point I was like,
01:08:02.640 | "This is in 2011."
01:08:04.480 | And I was adamant that I would never work for anybody
01:08:08.480 | until I heard it was Steve Jobs.
01:08:09.920 | And I was like, "I will do whatever he says."
01:08:12.240 | (laughs)
01:08:12.880 | Whatever, Jobs.
01:08:14.000 | - Wow.
01:08:14.960 | - I feel that he's asking me.
01:08:16.160 | - You're ready to bend the knee.
01:08:17.600 | - Dude, I would have driven his car.
01:08:19.280 | I would have, "Yes, sir."
01:08:20.400 | I would have driven that car.
01:08:21.920 | (laughs)
01:08:22.720 | I was like-
01:08:23.200 | - You'd be driving Miss Daisy?
01:08:24.800 | "Oh, Steve Jobs, you're my best friend."
01:08:26.800 | (laughs)
01:08:27.520 | - And I went through a couple of interviews.
01:08:29.520 | I'm not gonna get into those details.
01:08:30.880 | And at the end of it, it was like,
01:08:33.040 | "Well, he's not gonna become,
01:08:34.960 | "he's not gonna be the CEO.
01:08:36.160 | "You know, he's gonna become executive chairman.
01:08:38.560 | "And so you'll be reporting to Tim Cook."
01:08:42.000 | And that's where that process died.
01:08:43.920 | (laughs)
01:08:45.280 | They said I wasn't a good cultural fit for Apple
01:08:47.440 | in the new era.
01:08:48.400 | Now, and then it turned out.
01:08:49.440 | What happened?
01:08:49.920 | - Yeah, you would-
01:08:51.520 | - Scott Forstall left.
01:08:52.400 | I was not a good cultural fit.
01:08:52.400 | I would not have been a good cultural fit for that.
01:08:53.760 | - No, he wanted radicals
01:08:55.200 | and Tim Cook wanted, you know, optimizers.
01:08:58.000 | - So then I went and I started Social Capital.
01:08:59.680 | But that was my only job interview that I had
01:09:01.680 | when I was leaving Facebook.
01:09:04.720 | And it was, for me,
01:09:06.720 | I had the same reaction as Freeberg.
01:09:09.360 | I was like a kid meeting his,
01:09:13.200 | I don't even want to call it idol.
01:09:15.120 | Like, I think I was like a believer meeting Jesus.
01:09:19.520 | - Hmm.
01:09:20.320 | - That's how it felt.
01:09:21.680 | - I will save my, just to get this,
01:09:26.160 | we got to get through two more stories in the doc
01:09:27.760 | and I'm gonna leave my other Steve Jobs stories
01:09:29.440 | for another time.
01:09:30.880 | We got time.
01:09:31.760 | - Well, you have that famous clip of you asking him
01:09:33.840 | that question where he basically was looking at you like,
01:09:36.000 | "That's a very famous one."
01:09:37.120 | Because you really got the better of him there,
01:09:39.200 | I thought, which is pretty good.
01:09:40.080 | - All right, all right.
01:09:40.640 | - Will you help companies like ours sell podcasts,
01:09:44.080 | you know, be an audible?
01:09:45.120 | So if we wanted to sell a podcast through your service,
01:09:47.600 | would you help us do the fulfillment?
01:09:49.920 | - You know, we're planning on having all the podcasts
01:09:53.040 | be free at first,
01:09:53.840 | but zing me an email with what you've got in mind
01:09:56.560 | and we're open to anything.
01:09:57.760 | - Same email I always send it to you?
01:09:58.960 | - Yep.
01:09:59.360 | - Okay, you got it.
01:09:59.800 | (laughing)
01:10:02.480 | - That's so strong.
01:10:03.680 | I mean, I got to give it to you, Jake,
01:10:05.040 | all you have is (beep) balls.
01:10:06.240 | - The truth is I traded emails with Steve many times.
01:10:11.040 | With the press, he was full contact, as you know.
01:10:15.520 | All right, it's time.
01:10:16.400 | - Full contact.
01:10:16.960 | - It's time for Saks Red Meat
01:10:21.280 | and we've got a double serving for Saks.
01:10:23.200 | Saks stans are going to be really happy right now.
01:10:25.440 | TikTok's divest-or-ban bill has been signed into law.
01:10:28.400 | Senate passed the bill 79 to 18.
01:10:30.160 | We've talked about this here over and over and over again.
01:10:34.000 | President signed it, this thing's on the fast track.
01:10:37.360 | Divest or shut down.
01:10:39.920 | TikTok bill was packaged, bundled,
01:10:42.800 | with $95 billion in foreign aid.
01:10:46.080 | 61 billion for the Ukraine, Ukraine, Ukraine.
01:10:49.920 | 26.4 billion for Israel and humanitarian aid
01:10:53.920 | for civilians in Gaza.
01:10:55.120 | And about 8 billion for the Indo-Pacific region,
01:10:58.960 | AKA Taiwan.
01:10:59.920 | House added a provision to the bill
01:11:02.160 | that would require the president to seek
01:11:03.920 | a $10 billion repayment from Ukraine's government.
01:11:08.080 | Finally, they're talking about the lease back.
01:11:10.720 | We'll see if that ever happens.
01:11:12.400 | I'm curious what Saks thinks there.
01:11:14.160 | Last year, CNBC reported that ByteDance
01:11:16.560 | was buying back 5 billion worth of stock.
01:11:18.160 | And early Thursday morning, it was reported
01:11:20.880 | that ByteDance is exploring selling a majority stake
01:11:24.080 | in TikTok's U.S. business.
01:11:25.360 | Looks like the gun to the head is working:
01:11:27.920 | the reported sale would be to a non-tech company, without the algorithm.
01:11:30.640 | In other words, maybe they sell it to a Walmart,
01:11:33.440 | somebody who they don't consider super competitive.
01:11:35.920 | I'm not sure who the non-tech company is here.
01:11:37.760 | Could be Disney would come to mind as well.
01:11:40.000 | And remember, Disney did look at buying Twitter,
01:11:42.080 | but they didn't want the toxicity of an open platform.
01:11:44.400 | TikTok is obviously heavily scrubbed
01:11:47.760 | of anything that is aggressive.
01:11:49.440 | You can't show a movie clip
01:11:51.200 | with somebody getting attacked violently.
01:11:53.600 | It's very PG-13.
01:11:54.880 | But this is the key part, Saks.
01:11:58.160 | No algorithm in the package deal.
01:12:00.480 | Your thoughts on this,
01:12:02.800 | and you can take it either way you like,
01:12:04.560 | because I know your fans want to hear about everything.
01:12:07.040 | Do you want to talk the budget bundling
01:12:09.760 | or the TikTok ban and the divestiture,
01:12:12.640 | which seems to be happening?
01:12:13.760 | Well, the overall theme is just
01:12:15.120 | that the national security state got everything it wanted.
01:12:18.320 | It got 100 billion to support forever wars.
01:12:22.480 | It got a TikTok ban.
01:12:24.480 | And this divestiture thing is a total fig leaf
01:12:27.200 | because it's not clear that China's going to allow
01:12:30.000 | TikTok to divest because it would set a horrible precedent
01:12:33.440 | where the U.S. could just pass a law
01:12:34.880 | and then essentially steal the value of a Chinese company.
01:12:38.640 | So I don't think they're going to be able to divest.
01:12:40.160 | I think they're just going to get shut down.
01:12:41.840 | So the security state's getting its wish there.
01:12:44.160 | And then another bill that you didn't mention,
01:12:46.000 | which they just passed,
01:12:46.960 | is they approved the warrantless spying on Americans.
01:12:50.800 | So the federal government can now spy on you
01:12:53.280 | and your communications without even getting a warrant.
01:12:58.080 | - Ridiculous, disgusting.
01:13:00.160 | - The national security state just seems to get whatever it wants.
01:13:03.120 | And there were large bipartisan majorities behind this.
01:13:05.840 | And the way they do it is they conjure all these fears
01:13:09.920 | to try and scare us into it:
01:13:12.160 | well, if we don't agree to warrantless spying,
01:13:16.240 | then you can get a terrorist attack.
01:13:17.760 | Well, when has a warrant requirement ever gotten in the way
01:13:22.960 | of actually doing what we need to do to stop terrorism?
01:13:26.160 | Never.
01:13:26.720 | But that fear was enough to get Congress to authorize that legislation.
01:13:31.120 | They're keeping us involved in this forever war in Ukraine
01:13:34.960 | by this fear that Putin's somehow going to conquer all of Europe,
01:13:38.000 | which I think is total threat inflation.
01:13:39.840 | And this TikTok fear that somehow what videos we like
01:13:45.840 | is like precious data that's being shared with the CCP.
01:13:48.800 | Look, I'm willing to believe it's possible,
01:13:51.040 | but they never proved that.
01:13:52.320 | - Oh, no, they proved that data.
01:13:54.480 | It's been proved that they spied on us.
01:13:55.200 | - No, you only showed us, we debated this, we debated this.
01:13:58.240 | - Yeah.
01:13:58.480 | - You showed one article from the New York Times about how,
01:14:01.440 | yeah, that's not proof, J-Cal.
01:14:03.920 | Those were two rogue employees who shared some data with certain New York Times reporters.
01:14:09.680 | - I spoke to multiple TikTok employees myself personally,
01:14:14.640 | and they told me that the Chinese representatives
01:14:17.200 | are all over the company and inside of it.
01:14:19.520 | Freeberg, what are your thoughts?
01:14:21.120 | - But what was the judicial process or legislative process to prove that?
01:14:25.040 | Did they have hearings, did they prove that?
01:14:27.120 | I understand that, you know, your hearsay and your view is good enough to ban it,
01:14:32.640 | but I'd actually like to see some proof.
01:14:33.840 | - No, my position on banning, thanks for asking,
01:14:36.000 | is based on reciprocity and the potential damage it could do,
01:14:39.520 | and based on how influential and powerful the product is,
01:14:42.480 | and how they could change sentiment and censorship.
01:14:45.440 | And the Chinese government is fantastic at censorship,
01:14:48.560 | and they're doing massive censorship inside the app, yeah.
01:14:51.140 | - Handle it in a trade bill then,
01:14:52.800 | because what we've authorized here is not a TikTok ban,
01:14:55.920 | it is a sweeping new federal power to ban foreign adversary-controlled applications.
01:15:03.600 | That's the new category.
01:15:05.360 | So we have a new category of foreign-controlled apps, whatever that means,
01:15:09.040 | and the government can now use it to shut down apps they don't like.
01:15:12.880 | And I guarantee you, I think here's where this goes next.
01:15:15.760 | I think Telegram is next on the hit list.
01:15:17.680 | You have a Russian founder, okay?
01:15:21.920 | You have rumors for years, I'd say unproven,
01:15:25.200 | that somehow Telegram's been backdoored by the Russian government.
01:15:30.000 | You know, we've all heard those rumors.
01:15:32.320 | It's kind of like the CCP.
01:15:33.160 | - Where is his base, though?
01:15:34.240 | Isn't he out of there?
01:15:35.760 | He's in Monaco or something?
01:15:36.880 | He doesn't live there.
01:15:37.680 | - Who cares?
01:15:38.640 | - No, he lives in Dubai.
01:15:39.120 | - He lives in Dubai, yeah.
01:15:40.400 | - Here's what you're gonna see.
01:15:42.160 | You're gonna see, at some point, you'll see a media campaign
01:15:46.720 | that'll be promoting the idea that Telegram is used by Hamas,
01:15:51.200 | by terrorists, by groups that the U.S. doesn't like,
01:15:54.240 | and that it's backdoored by Russia,
01:15:57.280 | and that it's got some shady investors on its cap table.
01:15:59.680 | And no politician's gonna want to stand up and defend Telegram.
01:16:05.360 | And all the industry money that flows into Washington
01:16:09.920 | will be promoting this idea that we have to ban it,
01:16:12.000 | because think about the market share gains
01:16:14.640 | that all of Telegram's competitors will make,
01:16:16.640 | and they're the ones wired into D.C.
01:16:18.320 | So this is a foregone conclusion, I think.
01:16:20.000 | - But you also said on Twitter,
01:16:21.120 | you thought this would come to X and to Rumble.
01:16:23.840 | You actually think they'll take this to American companies?
01:16:26.400 | - I think that--
01:16:27.440 | - That seems like a stretch.
01:16:28.400 | - Not really.
01:16:29.920 | I think that the first step is you go get Telegram,
01:16:32.800 | because that's easy, you know?
01:16:34.400 | This guy is an agent of Putin, obviously.
01:16:36.640 | And anyone who says differently
01:16:38.320 | is themselves an agent of the Kremlin.
01:16:39.840 | - How does it jump to X and Rumble?
01:16:40.800 | - That's how this rhetoric works.
01:16:42.640 | So first of all, they'll whack Telegram.
01:16:45.680 | And by the way, we all know
01:16:46.800 | that companies are gonna benefit enormously
01:16:48.720 | by slurping up that market share.
01:16:50.160 | And then if Biden wins a second term,
01:16:52.960 | they will eventually set their gun sights on X.
01:16:56.240 | But look, that's a battle,
01:16:57.680 | because in X, you have a billionaire who has resources,
01:17:01.280 | who's willing to stand on his hind legs and fight.
01:17:04.880 | - Yeah, I mean--
01:17:05.520 | - And so they're not gonna do that before the election.
01:17:07.520 | But look, if we continue to see
01:17:09.600 | more weaponization and more censorship,
01:17:13.360 | I believe that eventually what they will do
01:17:15.600 | is try and push Elon to divest X.
01:17:18.800 | And look, all they gotta do, listen, J-Cal,
01:17:21.120 | this act empowers the Attorney General
01:17:23.760 | to open an investigation.
01:17:25.520 | So I think that in a second Biden term,
01:17:28.160 | they will open an investigation
01:17:29.920 | and just start ratcheting up the pressure on Elon
01:17:33.040 | to get rid of X, because that's clearly what they want.
01:17:35.040 | - Yeah, and the TikTok guy's bought off Trump already,
01:17:37.280 | so he's gonna fight for it.
01:17:39.120 | Freeberg, your thoughts?
01:17:40.720 | - I don't agree that the Chinese government
01:17:45.600 | will shut it down.
01:17:47.280 | I certainly don't have any insight
01:17:49.360 | into what the Chinese government is discussing
01:17:51.360 | or thinking about doing,
01:17:52.400 | but there's a lot of money to be made here.
01:17:55.040 | So if I'm ByteDance, I'm very likely gonna hire
01:17:59.360 | a bunch of bankers, run an auction process,
01:18:01.680 | and sell this thing, and I have a year to do it.
01:18:03.600 | This is a business that did $14 billion in revenue last year,
01:18:07.760 | according to a published report.
01:18:09.440 | 170 million Americans are active users of TikTok.
01:18:14.240 | Our revenue grew 40% plus last year.
01:18:16.960 | So I don't know how they would just say,
01:18:18.720 | "Hey, let's write this thing off and shut it down,"
01:18:21.360 | when we could probably generate--
01:18:23.840 | - I'm basing that statement on published reports
01:18:27.440 | that China has said they're opposed
01:18:30.160 | to the forced sale of TikTok in the US.
01:18:32.720 | - Yeah, maybe.
01:18:33.120 | - And look, I agree with you that it might change,
01:18:36.240 | but think about it, if you're the Chinese government,
01:18:38.240 | you're adamantly opposed to this,
01:18:39.840 | you don't like the precedent,
01:18:41.280 | and you don't really care about tech entrepreneurs
01:18:45.520 | getting rich.
01:18:46.000 | - So let's assume--
01:18:46.960 | - I mean, the founder/CEO of TikTok lives in Singapore.
01:18:50.480 | - Right.
01:18:51.120 | - So why would they give a shit?
01:18:52.800 | - But now let's assume that they let it go through.
01:18:55.120 | Think about the implications.
01:18:57.520 | Morgan Stanley and Goldman Sachs get hired
01:18:59.520 | to run a joint process to auction this thing off, right?
01:19:02.800 | They're gonna run this process for a period of months.
01:19:05.360 | There's only a certain number of folks
01:19:06.720 | that could actually make a bid to buy this thing.
01:19:08.880 | Maybe some of the big guys in private equity,
01:19:11.360 | Silver Lake and others,
01:19:12.240 | try and pull some capital together to buy it,
01:19:13.920 | but I think it's more likely a big company tries to buy it.
01:19:16.640 | CFIUS and antitrust will probably not be
01:19:20.640 | as relevant here as they normally would be.
01:19:24.080 | Certainly CFIUS wouldn't be,
01:19:25.120 | 'cause it's being completely divested.
01:19:26.560 | Antitrust, there's a real question
01:19:29.600 | on whether they're gonna be given some leeway.
01:19:31.360 | But even if Antitrust does apply,
01:19:34.640 | who could buy TikTok in the US?
01:19:36.880 | There's only a handful of companies that could or would.
01:19:39.920 | Oracle, maybe Microsoft?
01:19:44.160 | They said it's not a tech, is the report.
01:19:46.320 | That it's not a what?
01:19:48.400 | Not a tech company.
01:19:49.680 | So that would be Walmart, Disney,
01:19:51.360 | you start to get to over--
01:19:52.480 | Maybe a media company, right?
01:19:53.920 | Yeah, so--
01:19:54.320 | Absolutely.
01:19:55.040 | I do think that there's a really interesting--
01:19:57.440 | Netflix.
01:19:58.000 | Maybe Netflix.
01:19:59.920 | So I think that there's a really interesting rewrite
01:20:02.160 | of the landscape a little bit here,
01:20:04.080 | where some of the traditional big tech companies,
01:20:07.200 | Meta, Google, have really had total control
01:20:10.960 | over the consumer media consumption on the internet
01:20:13.200 | that there's gonna be a real difference here
01:20:14.640 | that might be triggered
01:20:18.160 | in how the industry kind of is sorted
01:20:22.000 | based on the TikTok auction,
01:20:23.680 | if it does happen.
01:20:24.720 | And I think it's probably--
01:20:25.760 | Look, we could argue all day
01:20:26.640 | about what the Chinese are gonna do.
01:20:28.160 | But if it does go through,
01:20:29.440 | the next story you're gonna see
01:20:30.960 | in the Wall Street Journal
01:20:31.760 | is how much money the bankers are gonna make
01:20:33.120 | on fees running the auction here.
01:20:34.400 | And then the next story you'll see after that
01:20:37.360 | is gonna be about how the tech and media landscape
01:20:40.160 | has been reshuffled and rewritten
01:20:41.760 | by the TikTok deal.
01:20:43.520 | Chamath Palihapitiya, our Chairman Dictator,
01:20:47.120 | can't wait to see you at poker tonight.
01:20:48.800 | What are your thoughts on this, if any?
01:20:50.400 | I think the thing to keep in mind
01:20:51.840 | is that the reason why the government
01:20:54.160 | is banning TikTok,
01:20:55.120 | and I'm totally speculating and guessing,
01:20:57.360 | has nothing to do with these silly little videos.
01:20:59.680 | But it is that the overwhelming majority
01:21:02.720 | of the people that download TikTok
01:21:04.480 | keep the microphone on,
01:21:05.600 | and it is an ambient passive surveillance device.
01:21:09.360 | And I think to the extent that
01:21:12.480 | a foreign government has access
01:21:14.160 | to whatever you hear ambiently and randomly
01:21:17.120 | while the app is not even being used
01:21:18.800 | is the real problem here.
01:21:20.640 | And so I think that that's effectively
01:21:23.840 | what this is, is a listening device
01:21:25.840 | into enough Americans.
01:21:27.440 | And I think that that's pretty scary to folks.
01:21:29.280 | The second thing is with respect
01:21:31.200 | to the actual product itself,
01:21:33.040 | in the absence of that algorithm,
01:21:35.280 | which I think is just incredible,
01:21:36.960 | as somebody who was a voracious user of that app
01:21:40.000 | until I deleted it,
01:21:41.520 | I don't think the product is much of anything.
01:21:43.600 | It's probably no better or no worse
01:21:45.760 | than shorts or reels
01:21:46.880 | or some of these other alternatives.
01:21:48.640 | So I don't see the economic justification
01:21:52.480 | for anybody who already has one of these assets
01:21:54.720 | to buy these things
01:21:55.440 | because the content is roughly the same now.
01:21:58.000 | The thing that makes that app is the algo.
01:22:00.800 | But the videos are everywhere.
01:22:03.280 | They're on YouTube.
01:22:04.240 | They're on Instagram.
01:22:06.160 | They're everywhere.
01:22:08.000 | They're everywhere.
01:22:08.800 | So I don't see why anybody
01:22:11.440 | would pay a lot of money for this,
01:22:12.720 | especially in the absence of the algorithm.
01:22:15.040 | This is my honest thought.
01:22:16.240 | What do you think, Jake?
01:22:17.280 | I think this is a security risk.
01:22:20.400 | I've made that clear here before.
01:22:22.080 | And I don't think we would ever allow Iran,
01:22:25.920 | North Korea, Russia
01:22:28.320 | to run any of these companies inside of America,
01:22:31.520 | nor would they allow us to run Facebook,
01:22:33.760 | X, et cetera, in their countries.
01:22:35.760 | So I think reciprocity is the key.
01:22:37.920 | But I am running a little test right now.
01:22:40.560 | I posted this video to my TikTok today.
01:22:43.440 | And it's under review.
01:22:45.840 | They won't let it publish.
01:22:47.120 | And the video is very simple.
01:22:49.440 | It quotes a study from Reuters.
01:22:52.000 | You don't have to play it.
01:22:52.720 | I'll explain what I said in it.
01:22:55.040 | But I just took a video of the Reuters study.
01:22:58.160 | What this Reuters study shows you,
01:22:59.440 | pull up the chart there, Nick,
01:23:00.640 | is that in this Reuters study,
01:23:02.400 | the CCP is censoring sensitive topics related to China.
01:23:05.840 | And so if you want to understand why the CEO,
01:23:09.200 | who I think you may have worked with before,
01:23:11.120 | Chamath, at Facebook,
01:23:12.880 | the CEO is saying it's a ban
01:23:15.440 | when in fact it's a divestiture.
01:23:16.880 | If you want to understand why they care about this,
01:23:20.560 | it's because they want to be able
01:23:23.200 | to influence things in America.
01:23:24.800 | They want to have data on Americans.
01:23:26.800 | It's spyware.
01:23:27.760 | That's my position.
01:23:28.720 | And it's the most censored platform,
01:23:31.760 | I think, in America,
01:23:33.440 | which I know you feel passionately about, Sax.
01:23:35.760 | So I did this video,
01:23:38.080 | and just like COVID-19 was blocked on X,
01:23:43.200 | here in America, in the US,
01:23:45.040 | if you talked about Wuhan,
01:23:46.160 | if you talked about any of these topics,
01:23:47.920 | our government banned it.
01:23:49.920 | Well, if you talk about Hong Kong,
01:23:51.760 | if you talk about Uyghurs,
01:23:53.040 | if you talk about Tiananmen Square,
01:23:55.920 | your video will not be posted.
01:23:58.240 | In fact, my video is held right now.
01:24:00.480 | So, if you're hearing my voice right now,
01:24:02.640 | I'm doing a little experiment.
01:24:04.400 | I want everybody to post that chart
01:24:07.680 | and just read the Wikipedia page,
01:24:09.280 | maybe talk about Tank Man,
01:24:11.360 | talk about the Hong Kong riots,
01:24:12.800 | and then use the hashtag #DontBanTikTok
01:24:15.280 | as a little bit of a cheeky way
01:24:19.760 | for us all to track each other's posts.
01:24:19.760 | And I want to see if 100 or 1,000 all-in fans
01:24:22.400 | post one of these,
01:24:24.320 | if they all get banned,
01:24:25.200 | or if we can see any of them
01:24:26.080 | so we can do a live censorship test.
01:24:28.240 | Here in America,
01:24:29.040 | are Americans allowed to talk
01:24:30.560 | about Tiananmen Square,
01:24:31.520 | the Hong Kong riots,
01:24:32.720 | Uyghurs, or any of those things?
01:24:34.240 | And you can also,
01:24:36.800 | if you want,
01:24:38.720 | at-mention me.
01:24:40.080 | I'm @JasonCalacanis on TikTok.
01:24:42.080 | So, that's my position on it.
01:24:43.600 | I want to see this thing banned immediately
01:24:45.680 | or divested.
01:24:46.240 | I would prefer it be divested,
01:24:47.280 | because I think people love it.
01:24:49.600 | One quick thing.
01:24:50.720 | I think what Sacks said is really interesting
01:24:52.480 | about them then going after Telegram.
01:24:54.400 | The big issue that I think Telegram
01:24:58.480 | has always had
01:24:59.440 | is that it is encrypted,
01:25:02.480 | but it has its own form of encryption
01:25:05.520 | that they rolled themselves, right?
01:25:07.120 | Like, typically, I think WhatsApp
01:25:08.640 | and a lot of these other folks
01:25:10.080 | just use the pretty standard Signal protocol.
01:25:11.440 | I think they as well use SHA-256 encryption,
01:25:13.760 | which is like pretty standard.
01:25:14.800 | But there is always a fear
01:25:15.920 | that the US government
01:25:16.880 | actually can unencrypt that
01:25:20.640 | and has some kind of a backdoor.
01:25:22.080 | That was always the fear of that SHA-256.
01:25:24.240 | And so, Pavel Durov and his team
01:25:26.080 | basically rolled his own,
01:25:27.680 | which is also 256-bit symmetric encryption,
01:25:30.080 | but different standard altogether.
01:25:31.600 | And what people would say is,
01:25:33.760 | "Hold on, he has the backdoor,
01:25:35.200 | disaccess point."
01:25:35.920 | So that was always the claim,
01:25:37.200 | counterclaim between
01:25:38.240 | these things of why Telegram
01:25:41.200 | was always painted
01:25:42.720 | in more of a sketchy corner.
01:25:44.080 | I'm not saying that it's true.
01:25:45.040 | And I think that if they go after it,
01:25:48.400 | I think it's because
01:25:49.280 | this underlying encryption model
01:25:50.880 | is something that we can't get access to.
01:25:52.800 | And so they'd rather just eliminate it
01:25:54.240 | to your point, Sacks.
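[Editor's note: for readers curious what "256-bit symmetric encryption" looks like in practice, here is a minimal sketch using AES-256-GCM from Python's widely used cryptography package. The standard 256-bit symmetric cipher in protocols like Signal's is AES-256; SHA-256 is, strictly speaking, a hash function rather than a cipher. This is not Telegram's MTProto or WhatsApp's actual protocol, just an illustration of the primitive being discussed.

```python
# Minimal sketch of 256-bit symmetric encryption (AES-256-GCM) via the
# `cryptography` package: an illustration of the primitive discussed above,
# not Telegram's MTProto or the Signal protocol.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit symmetric key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # must be unique per message

ciphertext = aesgcm.encrypt(nonce, b"hello besties", None)  # encrypt + authenticate
plaintext = aesgcm.decrypt(nonce, ciphertext, None)         # raises if tampered with
assert plaintext == b"hello besties"
```

The "rolled their own" concern in the discussion is generally about the protocol built on top of a primitive like this, i.e. key exchange and what the server can see, not the cipher itself.]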
01:25:55.280 | All right.
01:25:56.800 | I think Telegram can stick its
01:25:58.640 | head between its legs
01:26:00.320 | and kiss its ass goodbye,
01:26:01.520 | because they're next on the hit list.
01:26:04.240 | All right, we're going to see.
01:26:06.160 | What do you think of my
01:26:06.960 | "Don't ban TikTok" challenge?
01:26:07.520 | The national security state has used fears,
01:26:10.080 | like the one that Jake helped explain,
01:26:12.480 | to get a new power.
01:26:14.880 | And it's not a power to ban TikTok.
01:26:16.800 | It's a power to ban any foreign-controlled app.
01:26:19.200 | So goodbye, Telegram.
01:26:21.200 | What do you think of my
01:26:22.640 | "Don't ban TikTok" challenge, Saks?
01:26:24.560 | The censorship challenge.
01:26:27.200 | Yeah, I mean, fine, whatever.
01:26:30.160 | You're so passionate about censorship.
01:26:32.080 | It's done. It's over. It's over.
01:26:34.000 | Yeah.
01:26:34.240 | Look, the national security state
01:26:35.600 | gets whatever it wants.
01:26:36.560 | It's pretty clear.
01:26:37.520 | Not if we stay vigilant.
01:26:38.800 | I mean, if we keep talking about this.
01:26:40.000 | Well, how did this
01:26:40.560 | warrantless spying thing happen?
01:26:41.760 | I mean, look,
01:26:42.160 | I think the warrantless spying
01:26:43.280 | is actually worse than the TikTok ban.
01:26:45.040 | Me too. A hundred times worse.
01:26:47.280 | How are they not required
01:26:48.320 | to go to court to get a warrant?
01:26:50.080 | That's so easy, by the way.
01:26:51.440 | Have we eliminated
01:26:52.800 | the physical courts entirely?
01:26:53.840 | We don't need those anymore either?
01:26:55.440 | They can just do whatever they want?
01:26:56.640 | Well, I think we still have FISA,
01:26:58.400 | but you don't have to go
01:26:58.960 | to the court to get a warrant.
01:27:00.080 | They can just do whatever they want.
01:27:01.360 | We need to —
01:27:02.320 | let's take a deep dive on that
01:27:03.600 | in a future episode,
01:27:04.720 | because we're running out of time.
01:27:05.520 | We really should.
01:27:05.840 | One more story to get to.
01:27:06.880 | Yeah, let's deep dive it.
01:27:08.160 | Listen, there's a lot of topics
01:27:09.360 | people want us to talk about,
01:27:10.640 | and this one, I think,
01:27:11.440 | people are talking about a whole bunch.
01:27:14.480 | It's kind of breaking.
01:27:15.440 | I haven't —
01:27:16.400 | couldn't find a Wall Street Journal
01:27:18.000 | or a Washington Post covering this,
01:27:19.520 | but a lot of people were
01:27:20.640 | talking about it on X.
01:27:22.080 | Biden's 2025 budget
01:27:24.240 | includes some serious capital gains hikes.
01:27:26.880 | There are three proposals
01:27:29.040 | in the 2025 budget —
01:27:30.720 | we'll put all the links in the show notes —
01:27:32.800 | to increase cap gains rates,
01:27:34.320 | as opposed to income tax rates.
01:27:36.640 | If all three are passed, big if,
01:27:39.440 | it would more than double
01:27:40.480 | the long-term capital gains tax
01:27:42.080 | to almost 45 percent.
01:27:45.040 | Important to note,
01:27:45.680 | this only covers those making
01:27:47.360 | one million or more a year,
01:27:48.880 | which is like,
01:27:49.920 | you know, less than 1% of the country,
01:27:51.600 | way less.
01:27:52.320 | And so this is definitely
01:27:54.320 | a tax-the-rich idea here.
01:27:55.760 | Currently, the highest
01:27:58.160 | long-term capital gains rate is 20%,
01:28:00.400 | which is really about 24%
01:28:01.840 | if you earn more than 200k per year,
01:28:03.520 | because you have to pay an extra small tax.
01:28:05.120 | If these proposals become law, big if,
01:28:08.880 | it would create the highest cap gains rate
01:28:12.240 | in 100 years.
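[Editor's note: a quick back-of-the-envelope check on those figures, using the commonly reported components rather than quoting the budget text: the current 20% top rate plus the 3.8% net investment income tax gives roughly 24%, and the proposed 39.6% top rate plus a proposed 5% investment income tax gives the "almost 45 percent" cited.

```latex
% Rough check of the rates mentioned above (assumed component figures)
\[
\underbrace{20\%}_{\text{current top rate}} + \underbrace{3.8\%}_{\text{NIIT}} = 23.8\% \approx 24\%,
\qquad
\underbrace{39.6\%}_{\text{proposed top rate}} + \underbrace{5\%}_{\text{proposed NIIT}} = 44.6\% \approx 45\%.
\]
```
]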
01:28:14.000 | Here's the chart.
01:28:14.720 | This comes from Americans for Tax Reform,
01:28:17.760 | which is a Reagan-era
01:28:19.440 | NGO, I think,
01:28:20.880 | that was formed by a former
01:28:22.880 | Reagan administration official.
01:28:24.320 | They basically make politicians sign a pledge
01:28:26.320 | that they won't increase taxes.
01:28:27.520 | The budget also proposes
01:28:29.520 | a 25% unrealized cap gain tax.
01:28:32.000 | This would be a tax on total income,
01:28:33.520 | including unrealized cap gains
01:28:34.720 | for all taxpayers worth over $100 million.
01:28:36.640 | Okay, Sachs,
01:28:38.080 | I know this is your red meat.
01:28:40.400 | There's a lot of pieces here,
01:28:41.440 | but gosh, this is crushing
01:28:43.520 | for those folks
01:28:44.960 | who want to vote for Biden
01:28:46.240 | but are moderates,
01:28:47.120 | because this is like
01:28:48.640 | the number one thing you can do
01:28:49.840 | to stop innovation
01:28:51.520 | and investment in the country,
01:28:52.640 | which we desperately need.
01:28:53.600 | This is a head scratcher.
01:28:54.560 | - Biden is playing a game of chicken
01:28:56.480 | with the tech industry
01:28:57.840 | and with, I'd say,
01:28:58.720 | suburban voters in general.
01:29:00.720 | I mean, is this what you want?
01:29:02.320 | At some point, you're going to have to
01:29:03.440 | say that this is not okay.
01:29:05.840 | I mean, first of all,
01:29:08.080 | you've got this 25% unrealized gains tax,
01:29:11.520 | which is a wealth tax.
01:29:13.280 | If you're somebody
01:29:14.000 | who's created a small business
01:29:15.680 | or a family farm,
01:29:17.040 | or you're a tech founder,
01:29:22.240 | if you're over the qualifying amount,
01:29:22.240 | then you've got to pay 25%.
01:29:23.760 | And in order to raise the money
01:29:26.320 | for that 25%,
01:29:27.040 | you're going to get taxed on,
01:29:29.760 | let's say you try to sell
01:29:30.800 | 25% of the company,
01:29:31.920 | you're now going to get taxed
01:29:33.760 | 45% of that,
01:29:36.240 | plus 13.3% California.
01:29:38.560 | So really, you're going to have to sell
01:29:39.680 | more like 40% of the company
01:29:42.160 | just to pay off this unrealized gains tax.
01:29:44.320 | And by the way,
01:29:45.040 | you haven't made a dime yet.
01:29:46.320 | You haven't put a dollar
01:29:47.360 | in your bank account.
01:29:48.160 | And that's year one's confiscation.
01:29:50.160 | That's year one.
01:29:51.120 | What about what happens
01:29:52.000 | the following year?
01:29:52.800 | It's the exact same thing.
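[Editor's note: a minimal sketch of the sell-to-cover arithmetic described above, under illustrative assumptions: a levy equal to 25% of the stake, with the shares sold to cover it taxed at a combined federal-plus-California rate. The exact fraction depends on which rates you assume apply to the covering sale; the point is the shape of the formula, fraction sold = liability / (1 - combined rate).

```python
# Back-of-the-envelope sell-to-cover arithmetic (illustrative assumptions only).
def fraction_to_sell(liability_share: float, combined_tax_rate: float) -> float:
    """Fraction of the stake to sell so the after-tax proceeds cover a
    liability equal to `liability_share` of the stake's value."""
    return liability_share / (1.0 - combined_tax_rate)

wealth_tax = 0.25         # assumed 25% levy on the position
current_federal = 0.238   # today's 20% top rate + 3.8% NIIT
proposed_federal = 0.446  # proposed top federal long-term rate (approx.)
california = 0.133        # top California rate

print(fraction_to_sell(wealth_tax, current_federal + california))   # ~0.40
print(fraction_to_sell(wealth_tax, proposed_federal + california))  # ~0.59
```

Under today's combined rate the answer lands right around the "more like 40%" mentioned above; under the proposed rates it climbs toward 60%.]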
01:29:55.120 | This is ridiculous.
01:29:56.400 | These things will never pass.
01:29:57.520 | I mean, let me tell you,
01:29:58.480 | I mean, so,
01:29:59.040 | so this is in Biden's budget
01:30:01.200 | for next year.
01:30:01.920 | And this idea,
01:30:03.280 | this is not the first time
01:30:04.080 | we've seen this.
01:30:04.880 | It was in his original
01:30:06.160 | Build Back Better proposal.
01:30:07.920 | And the only reason that failed
01:30:09.360 | is because Manchin and Sinema
01:30:11.280 | voted against it.
01:30:12.160 | So Manchin and Sinema
01:30:14.080 | are not going to be
01:30:14.640 | in the Senate next year.
01:30:16.480 | Okay, they're not running
01:30:17.600 | for reelection.
01:30:18.240 | So if the Democrats
01:30:19.440 | have the trifecta,
01:30:21.040 | if Biden wins reelection
01:30:22.720 | and holds on to the Senate
01:30:24.160 | and House,
01:30:24.800 | but without Manchin and Sinema,
01:30:26.560 | I think you have to price in
01:30:28.240 | the strong possibility
01:30:29.840 | that this passes.
01:30:30.640 | I don't think
01:30:31.600 | you can just shrug it off.
01:30:32.720 | - Freeberg, your thoughts?
01:30:34.800 | Well, we're talking about it here.
01:30:35.680 | We're not shrugging it off.
01:30:36.560 | This is like a five-alarm fire here on X.
01:30:39.200 | Everybody's talking about it.
01:30:40.160 | Freeberg, your thoughts?
01:30:41.520 | - There was a poll published yesterday.
01:30:45.360 | Bloomberg News and Morning Consult
01:30:47.200 | surveyed 4,900 registered voters
01:30:50.000 | in seven swing states.
01:30:51.280 | And the poll showed
01:30:54.160 | that 77% of registered voters
01:30:57.600 | in those states
01:30:58.480 | support basically an asset tax
01:31:01.920 | on ultra high net worth people,
01:31:04.640 | people worth over $100 million
01:31:06.000 | to keep social security intact.
01:31:09.440 | So I think that's a strong indication
01:31:13.040 | of where things are headed generally.
01:31:15.280 | We can debate the tactics
01:31:16.800 | and the nuance of this election cycle,
01:31:18.560 | but as we all have talked about and know,
01:31:20.560 | social security becomes de facto bankrupt
01:31:25.040 | by 2033, perhaps earlier.
01:31:27.760 | So we have a few years to figure out
01:31:30.320 | whether we cut social security benefits
01:31:32.240 | in this country
01:31:33.440 | or find alternative ways to fund it.
01:31:35.760 | And it's pretty evident from this poll
01:31:38.000 | that Americans are not in support
01:31:39.920 | of raising the minimum age from 67 to 69,
01:31:43.280 | which was one of the questions in here.
01:31:44.880 | Only 25% of Americans
01:31:48.240 | support raising the minimum age
01:31:50.800 | for social security from 67 to 69.
01:31:53.840 | Meanwhile, 77% strongly support
01:31:56.880 | a tax on ultra high net worth
01:31:59.280 | to fund the gap and the funding need.
01:32:01.600 | Well, yeah, I mean, that's a loaded question
01:32:04.640 | because you're basically positioning this tax,
01:32:09.120 | which is really complicated in the details
01:32:12.400 | and the person getting asked the question
01:32:14.000 | doesn't understand it,
01:32:15.440 | compared to the most popular program
01:32:17.760 | that the federal government has,
01:32:19.520 | which is social security.
01:32:20.880 | So obviously it's gonna pull that way.
01:32:23.200 | I mean, look, you can ask a question
01:32:25.600 | in a way on a poll
01:32:27.200 | to get whatever answer you wanna get to.
01:32:29.040 | I can basically prove to you
01:32:30.560 | that a majority of the American public
01:32:32.560 | is either for Ukraine funding
01:32:34.640 | or against Ukraine funding,
01:32:36.240 | depending on how you ask the question.
01:32:37.840 | And look, yeah, obviously taxing people
01:32:41.600 | with $100 million of paper wealth
01:32:43.680 | is gonna be more popular
01:32:46.800 | than sacrificing your social security, obviously,
01:32:49.600 | but that doesn't mean
01:32:50.480 | that this is a good proposal economically at all.
01:32:52.640 | - No, it doesn't.
01:32:54.480 | And I don't think that that's what really matters.
01:32:56.640 | I think most folks are voting for themselves,
01:32:58.720 | their particular needs,
01:33:00.720 | and the majority of people need more support.
01:33:03.520 | - This is not even gonna save social security.
01:33:06.400 | I mean, what is this tax gonna get us?
01:33:09.760 | It's gonna destroy the startup ecosystem.
01:33:12.240 | It's not gonna balance the budget.
01:33:13.600 | We're still spending way too much for that.
01:33:15.680 | It doesn't pay down the debt.
01:33:17.840 | It doesn't save social security.
01:33:19.360 | It's just more and more taxes.
01:33:21.120 | - Yeah, I think that's the inevitable trend.
01:33:23.440 | I'm just saying, I don't know how it's gonna manifest,
01:33:25.600 | but I think it's inevitable
01:33:26.720 | that we're gonna raise federal revenue
01:33:28.800 | through some form of taxation
01:33:30.320 | that's gonna feel deeply uncomfortable
01:33:32.080 | and inappropriate,
01:33:33.760 | and it's gonna have negative economic consequences.
01:33:36.640 | And this is where the economic spiraling
01:33:39.120 | that Milei has talked about.
01:33:41.120 | - But Balaji actually had a brilliant blog post
01:33:44.720 | about this called, "All It Takes Is All You Got."
01:33:47.600 | And what he pointed out is,
01:33:50.640 | he was talking about the federal government's
01:33:53.040 | runaway deficits and debt and borrowing,
01:33:56.080 | and somebody responded to him with this chart
01:34:00.240 | that shows assets versus liabilities.
01:34:02.480 | This guy, Brent, who I guess is a foil on X for Balaji,
01:34:07.120 | sometimes, said, "Oops, you only showed
01:34:09.360 | "one side of the balance sheet.
01:34:10.480 | "Common mistake, though."
01:34:11.520 | Basically saying that Balaji didn't know
01:34:13.280 | what he was talking about
01:34:14.160 | because Balaji was only showing the red,
01:34:16.400 | which is the government liabilities,
01:34:18.560 | the government debt.
01:34:19.760 | And he showed that, "Well, no,
01:34:21.200 | "you have to include all the green bars."
01:34:22.880 | But what are the green bars?
01:34:24.240 | Those are private assets.
01:34:25.760 | And Balaji pointed out that,
01:34:28.960 | "No, actually, you're proving my point."
01:34:30.560 | Because people like you just see
01:34:32.400 | all of the private assets of citizens of the United States
01:34:36.400 | as belonging to the government.
01:34:38.400 | And if you actually extend the red bars
01:34:41.520 | to the present value of all the long-term liabilities
01:34:44.720 | that this government has,
01:34:46.000 | it's more like $175 trillion.
01:34:48.160 | So people like this see the $160 trillion
01:34:52.640 | of private wealth as being on the balance sheet
01:34:56.960 | of the federal government
01:34:59.440 | and being used to offset the $175 trillion of liabilities.
01:35:04.240 | In other words, all it takes is all you got.
01:35:08.240 | That's the trajectory we're on.
01:35:09.760 | - Mm-hmm.
01:35:10.600 | - Is we have a-
01:35:11.420 | - Yeah, your 401k is the government's,
01:35:13.520 | and we're gonna seize it.
01:35:14.360 | - They're gonna absolutely go
01:35:15.440 | after your retirement accounts,
01:35:17.120 | because that's the only way
01:35:18.080 | they're gonna pay off these liabilities.
01:35:19.600 | - It's gross.
01:35:20.440 | - Unless you wanna stop it now.
01:35:21.280 | - Yeah, 'cause another idea, cut spending.
01:35:23.200 | Cut spending, yeah.
01:35:24.040 | - Well, there's another great tweet
01:35:25.120 | by a guy whose handle is "Laptop Mercenary."
01:35:27.680 | I don't know this account,
01:35:28.640 | but he said something funny.
01:35:30.400 | He said, "Imagine being a California high earner,
01:35:33.120 | "options, either vote for your own financial liquidation
01:35:36.320 | "or vote for the orange man."
01:35:38.400 | - Yeah, it's brutal.
01:35:39.240 | - Checkmate, dude, checkmate.
01:35:40.800 | You guys just gotta put on the red MAGA hat.
01:35:42.720 | It's happening.
01:35:44.280 | - Here's a, I mean, I read that you're all in, so to speak.
01:35:49.280 | - You don't have to wear the red MAGA hat.
01:35:51.280 | When you go into that polling booth,
01:35:52.960 | nobody knows the button you're pushing.
01:35:54.880 | - Yeah, okay, there it is.
01:35:55.920 | I mean, these are the two worst candidates in the history.
01:36:00.320 | - Jason, Biden has had four years
01:36:04.080 | to prove that he's a moderate.
01:36:05.840 | He's had four years to prove that he represents
01:36:08.320 | the normalcy that he promised to us
01:36:10.960 | when he first got elected.
01:36:12.640 | That's been refuted.
01:36:13.760 | I mean, how much more does it take for you to understand
01:36:16.320 | that his policy is radical?
01:36:19.440 | - Yeah, I mean.
01:36:20.280 | - He combines the foreign policy of Dick Cheney
01:36:22.880 | with the economic policy of Elizabeth Warren.
01:36:25.600 | How much more does it take?
01:36:26.960 | Are you gonna vote for your own financial liquidation?
01:36:29.680 | Because that's what we're talking about.
01:36:31.520 | - It's the worst two candidates,
01:36:32.800 | and I think I'm voting for the third.
01:36:34.160 | - Ooh, an RFK endorsement.
01:36:36.400 | Wow, I didn't see that.
01:36:38.320 | - I think I just have to put in the protest vote of RFK.
01:36:41.920 | - I love RFK, I think he's a great vote.
01:36:44.320 | - Yeah.
01:36:44.800 | - For Democrats.
01:36:45.760 | - I mean, crazy.
01:36:48.080 | I think maybe I go for The Rock.
01:36:49.840 | But yeah, congratulations on Biden
01:36:51.920 | on making this easy for Trump
01:36:53.360 | with his Build Back Broke plan.
01:36:57.040 | So dumb.
01:36:58.720 | All right, everybody, this is the World's Greatest Podcast.
01:37:01.440 | By the way, Jason, if you do decide
01:37:03.280 | in the end to vote for Biden,
01:37:04.720 | I'll send you some luxury tents
01:37:06.960 | that you and your kids could sleep in.
01:37:08.400 | You can tell them stories about how you voted for him.
01:37:13.520 | - It's so brutal.
01:37:15.360 | I mean, the worst two candidates of our lifetime.
01:37:18.560 | - J. Cal, here are your choices, okay?
01:37:20.800 | You can vote for Trump,
01:37:21.840 | or you can give up the ski lodge.
01:37:24.480 | What's it gonna be?
01:37:26.720 | - I mean, at this point, I think I'm RFK all the way.
01:37:30.480 | I think I got to go for,
01:37:31.680 | just try to support independence.
01:37:33.120 | I just think we got to balance the budget.
01:37:35.040 | Shout out David Friedberg.
01:37:36.160 | For the Rain Man himself, David Sacks.
01:37:39.280 | Yeah, David Friedberg, the Sultan of Science.
01:37:42.800 | We didn't get to science corner this week.
01:37:44.080 | We're gonna start with it next week.
01:37:45.920 | And the chairman dictator, Chamath Palihapitiya.
01:37:49.760 | I am the world's greatest moderator
01:37:53.120 | of the number one podcast in the world.
01:37:56.320 | I'm manifesting.
01:37:57.120 | Can we get to Spielberg?
01:37:58.160 | I'm manifesting tracking here.
01:37:59.680 | And take us out Spielberg.
01:38:01.120 | Young Spielberg, baby, coming at you.
01:38:02.720 | - Love your voice, bye-bye.
01:38:05.040 | - Love you besties.
01:38:06.080 | - Welcome everybody to episode 175.
01:38:09.040 | - That's right.
01:38:09.520 | It's episode 177 of your favorite podcast.
01:38:12.960 | And the largest and most listened to podcast in the world.
01:38:16.560 | Officially, episode 177 of the podcast starts right now.
01:38:19.840 | - Is that the largest most listened to podcast in the world?
01:38:26.000 | - I'm manifesting.
01:38:27.120 | (upbeat music)