
Ep4. Tesla FSD 12, Imitation AI Models, Open vs Closed AI Models, Delaware vs Elon, & Market Update


Chapters

0:00 Intro + Phase Shifts
3:42 Tesla FSD 12 & Imitation Learning
28:32 AI Model Improvements | Open vs Closed Models
49:10 Elon Musk Delaware Court Case
58:30 Macro Market Outlook

Whisper Transcript

00:00:00.000 | I would make the argument that every company in Delaware
00:00:04.580 | has to move to a different domicile
00:00:07.540 | because they could be sued in a future derivative law
00:00:11.460 | suit for the risk they've taken by staying in Delaware.
00:00:14.040 | Oh, my god.
00:00:15.020 | Oh, my god, you're so right.
00:00:17.020 | You are so right.
00:00:19.220 | Mic drop on that.
00:00:20.540 | [MUSIC PLAYING]
00:00:23.880 | [MUSIC PLAYING]
00:00:27.300 | Hey, Bill, great to see you.
00:00:35.000 | Good to see you, man.
00:00:35.880 | People loved when you were here last week in person,
00:00:38.140 | so we've got to make that happen again.
00:00:39.320 | But now, where are you?
00:00:40.480 | It looks like you're in Texas somewhere.
00:00:42.160 | I'm back in Texas, yes.
00:00:43.600 | All right.
00:00:44.520 | All right.
00:00:45.840 | So what's on your mind?
00:00:47.560 | It's been a lot of action the last couple of weeks.
00:00:50.040 | What's going on?
00:00:51.560 | One thing that I reflect on quite a bit
00:00:54.200 | is just kind of how lucky we are to be
00:00:57.320 | a part of the venture capital industry in the startup world
00:00:59.680 | simply because things change so fast.
00:01:02.840 | And if you're a curious person, if you're
00:01:04.800 | someone that likes constant learning, it's really amazing.
00:01:08.480 | Like, the stuff we're talking about,
00:01:10.400 | the stuff I'm listening to podcasts on every day,
00:01:13.440 | you know, two years ago didn't exist.
00:01:15.560 | And now, it's 80% or 90%--
00:01:18.440 | 80% or 90% of the dialogue.
00:01:21.320 | And that's just pretty wild.
00:01:23.440 | Yeah, I know.
00:01:25.600 | Our brains really aren't programmed
00:01:27.360 | to work in kind of these exponentials, right?
00:01:30.400 | I mean, you and I both know every sell-side model
00:01:32.920 | on Wall Street has linear deceleration and growth rates.
00:01:36.080 | Like, we think really--
00:01:38.200 | you know, we're really good at thinking
00:01:39.800 | in kind of these linear ways.
00:01:41.960 | You know, I had that thought this morning
00:01:44.320 | that the biggest investment opportunities really
00:01:48.000 | do occur around these phase shift moments.
00:01:50.040 | I mean, Satya talks about all the value capture
00:01:53.360 | occurs in the two- to three-year period around phase shifts.
00:01:56.280 | But it's hard to forecast in those moments, right?
00:01:59.120 | I mean, that's when you see these massive deltas,
00:02:02.160 | you know, in these forecasts.
00:02:03.960 | And I just went back and looked at, for example,
00:02:06.240 | at the start of last year, the consensus
00:02:09.400 | estimate of the smartest people covering NVIDIA day-to-day
00:02:13.480 | was that the data center revenue was going
00:02:15.760 | to be $22 billion for the year, right?
00:02:18.320 | Guess what it ended up being?
00:02:20.160 | $96 billion.
00:02:22.200 | OK, they were off by almost a factor of 3 or 4, right?
00:02:27.760 | The EPS at the beginning of last year, the earnings per share,
00:02:30.800 | was expected to be $5.70.
00:02:33.720 | And now it looks like it's going to be $25, right?
00:02:37.080 | Like, over the course of your career,
00:02:38.920 | have you ever seen sell-side estimates
00:02:41.520 | off by that much on a large-cap stock?
00:02:43.240 | I mean, just like, you know, very, very rare.
00:02:46.560 | Like, you know, once a decade, maybe, you know,
00:02:49.760 | that something like this happens.
00:02:51.400 | Yeah, it's amazing.
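
As a quick sanity check on the size of those misses, using the figures quoted above:

    # Actual results vs. start-of-year consensus, as quoted in the conversation.
    revenue_ratio = 96 / 22    # data center revenue: ~4.4x the consensus estimate
    eps_ratio = 25 / 5.70      # earnings per share: ~4.4x the consensus estimate
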
00:02:52.480 | So, you know, and I've had investors say to me
00:02:55.320 | when the stock was at 200, hell, you and I talked about this.
00:02:58.320 | You know, should we sell it all at 200?
00:02:59.960 | Sell it all at 300?
00:03:00.960 | Sell it at 400?
00:03:01.920 | And now, you know, those investors
00:03:03.640 | are calling me every day saying, have you sold it yet?
00:03:07.160 | Our general view is that if the numbers are going up,
00:03:10.120 | so if our numbers are higher than the street's number
00:03:12.520 | for whatever variant perception that we have, right,
00:03:15.400 | then the stock is going to continue to go higher.
00:03:18.080 | At some point, the street will get ahead of itself
00:03:21.080 | and its numbers will now be higher
00:03:23.280 | or at the same level as ours.
00:03:24.720 | And at that point, I think it becomes
00:03:26.800 | more of a market performer.
00:03:29.480 | But of course, some things will be wildly overestimated
00:03:33.400 | and some things will be wildly underestimated,
00:03:35.760 | but that sort of discontinuity really occurs
00:03:39.800 | around these moments of big phase shifts.
00:03:42.040 | So speaking of a big phase shift, right,
00:03:44.480 | we teased on the pod, I think at the start last time,
00:03:48.760 | that I had taken a test ride in Tesla's new FSD-12.
00:03:53.760 | And I said, you know, it kind of felt like
00:03:56.360 | a little bit of a chat GPT moment,
00:03:58.840 | but I think we left the audience hanging.
00:04:00.400 | We got a lot of feedback.
00:04:01.440 | Hey, you know, dig in more to that.
00:04:03.480 | So you and I spent some time on this,
00:04:05.400 | both together and with some folks on the Tesla team.
00:04:10.400 | So roughly the setup here, background,
00:04:12.640 | I want to get your reaction to it,
00:04:14.800 | is about 12 months ago,
00:04:17.680 | the team pretty dramatically forked
00:04:19.880 | their self-driving model, right?
00:04:21.920 | Moving it from this really C++ deterministic model
00:04:26.920 | to what they refer to as an end-to-end model
00:04:31.000 | that's really driven by imitation learning, right?
00:04:33.840 | So we think of this new model,
00:04:35.240 | it's really video in and control out.
00:04:38.920 | It's faster, it's more accurate, you know,
00:04:41.480 | but after 11 different versions of FSD,
00:04:44.360 | I think there's a lot of skepticism in the world.
00:04:46.680 | Like, is this going to be, you know, something different?
00:04:50.040 | You sent me a video and I have tons of these videos,
00:04:54.360 | you know, floating around at the moment, you know,
00:04:56.960 | that really kind of shows, you know,
00:04:59.840 | how this acts more like a human
00:05:01.840 | than prior models out there.
00:05:04.040 | So Bill, kind of just react, you know,
00:05:06.840 | you've watched this video,
00:05:08.120 | react to this video and give us your thoughts.
00:05:10.720 | You know, I think you've been a longtime observer
00:05:13.200 | of self-driving.
00:05:14.680 | I might even describe you as a bit of a critic of,
00:05:17.800 | you know, or a skeptic when it comes to full self-driving.
00:05:22.080 | So is this a big moment?
00:05:23.720 | Did I overstate it?
00:05:25.000 | Kind of, what are your thoughts here?
00:05:27.640 | Yeah, so, you know, one of the critiques
00:05:31.280 | and concerns people had about self-driving
00:05:36.080 | is they would say that, yeah, we're 98% or 99% of the way
00:05:39.880 | there, but the last 1% is gonna take as long
00:05:44.200 | as the first 99.
00:05:45.840 | And one of the reasons for that is,
00:05:47.920 | it's nearly impossible to code for all of the corner cases.
00:05:53.800 | And the corner cases are where you have problems,
00:05:56.200 | that's where you end up in wrecks, right?
00:05:57.920 | And so the approach Tesla had been taking up
00:06:01.640 | until this point in time was one where you would
00:06:05.640 | literally try and code every object,
00:06:08.680 | every circumstance, every case in like a piece of software.
00:06:13.320 | This X happens, then Y, right?
00:06:15.920 | And that ends up being a patchwork kind of a,
00:06:19.920 | just a big nasty, you know, rat's nest of code.
00:06:24.920 | And it builds up and builds up and builds up
00:06:28.320 | and maybe even steps on itself.
00:06:30.120 | And it's not very elegant.
00:06:32.280 | What we learned this week is that they've completely
00:06:35.360 | tossed all of that out and gone with a neural network model
00:06:40.360 | where they're uploading videos from their best drivers.
00:06:44.720 | And literally the videos are the input and the output
00:06:48.480 | is the steering wheel, the brake and the gas pedal.
00:06:52.360 | And, you know, there's this principle known
00:06:56.360 | as Occam's razor, which has been around forever in science.
00:07:00.680 | But the simplified version of it is a simpler approach
00:07:05.680 | is much more likely to be the optimal approach, right?
00:07:09.360 | And when I fully understood what they had done here,
00:07:13.640 | it seems to me this approach has a much better chance
00:07:18.400 | of going all the way and of being successful.
00:07:21.640 | And certainly of being maintainable and reasonable.
00:07:26.640 | It's way more elegant.
00:07:29.240 | It requires them to upload a hell of a lot of video,
00:07:32.080 | which we can talk about.
00:07:33.540 | But, and the other thing that's just so damn impressive
00:07:39.000 | is that this company, which is very large,
00:07:42.720 | hundreds of thousands of employees,
00:07:45.040 | made a decision so radical to kind of throw out
00:07:48.740 | the whole thing and start afresh.
00:07:51.320 | And it sounds like the genesis of that may have been,
00:07:55.200 | you know, three or four years ago,
00:07:56.640 | but they got to the point where they're like,
00:07:59.580 | this is gonna be way better and threw the whole thing out.
00:08:02.840 | And I think about four months after they made the change,
00:08:07.720 | Elon did a drive where he uploaded
00:08:09.800 | and kind of streamed the drive.
00:08:11.520 | So we can put that in the notes and people can watch it.
00:08:14.480 | But it's way, way different.
00:08:15.840 | It's way, way different.
00:08:17.080 | And in my mind, you know,
00:08:18.640 | basically with this Occam's razor notion,
00:08:22.040 | it's got a much higher chance of being wildly successful.
00:08:25.840 | Yeah, let's dig in a little bit
00:08:27.580 | into how it's different, right?
00:08:29.720 | So, and you referenced a little of this.
00:08:33.400 | So, you know, like for example,
00:08:35.440 | this model does not have a deterministic view
00:08:38.320 | of a stoplight, right?
00:08:40.160 | I mean, Karpathy has talked about this before,
00:08:43.360 | you know, before you have to label a stoplight, right?
00:08:46.440 | So you would basically take the data from the car.
00:08:49.440 | That would be your perception data.
00:08:51.600 | You would draw a box around a stoplight.
00:08:53.840 | You would say, this is a, you know, this is a stoplight.
00:08:56.600 | So that your first job on the car
00:08:58.640 | would have to be to identify that you're at a stoplight.
00:09:02.400 | Then the second thing is you would write all of this C++
00:09:06.280 | that would deterministically say,
00:09:08.080 | when you are at a stoplight,
00:09:09.920 | here's what the controls should do, right?
00:09:12.840 | And so for all of that second half of the model,
00:09:18.200 | you know, the heuristics, the planning and the execution,
00:09:22.280 | that was all driven by this patchwork
00:09:24.080 | that you're talking about.
00:09:25.080 | And that was like, you would just chase, you know,
00:09:27.680 | every one of these corner cases
00:09:30.000 | and you could never solve them all.
00:09:31.520 | Now in this new model, it's pixels in.
00:09:35.480 | So the model itself has no code.
00:09:37.840 | It doesn't know this is a stoplight per se.
00:09:41.480 | In fact, they just watched the driver's behavior.
00:09:44.320 | So the driver's behavior is actually the label.
00:09:48.320 | It says, when we see pixels like this on the screen,
00:09:51.080 | here's how the model should behave,
00:09:53.600 | which I thought is just an extraordinary break.
00:09:56.120 | And I don't think there's a deep appreciation
00:09:58.760 | for the fact that, you know, again,
00:10:00.560 | because we've had 11 versions of what came before it,
00:10:03.720 | those were just slightly better patchwork models.
00:10:07.640 | In fact, I think what, you know,
00:10:09.800 | we learned was that the rate of improvement of this
00:10:14.320 | is order of magnitude five to 10X better per month
00:10:17.960 | as a model versus the rate of improvement
00:10:20.520 | of those prior systems.
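
To make the "pixels in, controls out" idea concrete, here is a minimal behavioral-cloning sketch in Python (PyTorch). It is illustrative only: the layer sizes, camera count, and three control outputs are assumptions, not Tesla's actual FSD architecture. The key point is that the recorded human driver's controls act as the training label, so no hand-written rules or object labels are needed.

    import torch
    import torch.nn as nn

    class DrivingPolicy(nn.Module):
        """Toy end-to-end policy: stacked camera frames in, control commands out."""
        def __init__(self, n_cameras=8):
            super().__init__()
            self.encoder = nn.Sequential(            # shared convolutional encoder
                nn.Conv2d(3 * n_cameras, 32, 5, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
                nn.Conv2d(64, 128, 3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
            # Head maps visual features to steering, throttle, brake.
            self.head = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 3))

        def forward(self, frames):                   # frames: (batch, 3*n_cameras, H, W)
            return self.head(self.encoder(frames))

    policy = DrivingPolicy()
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

    # One behavioral-cloning step: what the human driver actually did is the label.
    frames = torch.randn(4, 24, 128, 128)            # stand-in for real video clips
    driver_controls = torch.randn(4, 3)              # recorded steering/throttle/brake
    loss = nn.functional.mse_loss(policy(frames), driver_controls)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
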
00:10:21.760 | - And once again, the audacity to throw out
00:10:24.040 | the whole old thing and put a new thing in is just crazy.
00:10:28.120 | One thing for the listeners,
00:10:29.360 | well, actually two things I would mention.
00:10:31.520 | One, in terms of just how they got this going,
00:10:35.640 | you know, a lot of people I fear equate AI with LLMs
00:10:40.640 | because it was really the arrival of ChatGPT and the LLM
00:10:45.680 | that I think introduced what AI was capable of
00:10:49.480 | to most people, but those are language models.
00:10:51.960 | That's what one of the L stands for.
00:10:54.640 | And these AI models that Tesla's used for FSD-12
00:10:59.640 | are these generic open source AI models
00:11:03.600 | that you can find on Hugging Face, you know,
00:11:06.080 | and they obviously customized them.
00:11:08.040 | So there's some proprietary code there at Tesla,
00:11:11.720 | but, you know, AI has been evolving for a very long time.
00:11:15.320 | And this notion of neural networks was around
00:11:18.000 | before the LLMs popped out, which is why, you know,
00:11:20.800 | they had started on this four years ago or whatever, right?
00:11:24.560 | But the foundational elements, you know, are there.
00:11:27.320 | And by the way, they use the hardware
00:11:30.600 | that we're talking about, right?
00:11:31.920 | They use the big NVIDIA clusters to do the training.
00:11:35.880 | They need some type of GPU or TPU
00:11:38.520 | to do the inference at runtime.
00:11:41.920 | So it's the same hardware the LLMs use,
00:11:44.400 | but it's not the same type of code.
00:11:47.240 | I just thought that was worth mentioning.
00:11:49.720 | Yeah, no, it's a, to me, if we dig in a little bit
00:11:54.720 | to, you know, the model itself, you know, the transformers,
00:12:00.200 | the diffusion architecture, the convolution neural nets,
00:12:03.440 | those are all like these modular
00:12:05.400 | open source building blocks, right?
00:12:07.280 | Like the thing that's extraordinary to me,
00:12:09.360 | and we're gonna get later in the pod
00:12:11.160 | to this open versus closed debate,
00:12:13.520 | but like, this is just this great example, you know,
00:12:15.960 | you talk about ideas having sex.
00:12:17.520 | I mean, these open source module, you know,
00:12:21.120 | kind of modular components,
00:12:22.960 | those have been worked on for the last decade.
00:12:25.480 | And now they're bringing those components together,
00:12:28.280 | and now all of their energy,
00:12:30.400 | and I wanna dig into this a little bit,
00:12:32.000 | that is really going,
00:12:34.000 | they're taking all these engineers
00:12:35.320 | who were writing the C++,
00:12:36.880 | these deterministic, you know, patches effectively,
00:12:40.120 | and now they're focusing them on how do we make sure
00:12:43.280 | that our data infrastructure,
00:12:45.320 | that the data that we're pulling off of the edge
00:12:48.080 | comes in and makes these models better.
00:12:50.200 | So all of a sudden it becomes about the data,
00:12:52.880 | because the model itself is just digesting this data,
00:12:56.720 | brute forcing it with a lot of this, you know,
00:12:59.200 | NVIDIA hardware and outputting better models.
00:13:02.360 | - You know, it's such a classic
00:13:04.560 | Silicon Valley startup thing
00:13:08.120 | where you need all the pieces to line up.
00:13:10.200 | If you go back and watch,
00:13:11.480 | if you haven't watched,
00:13:12.640 | if anyone's watched the General Magic video,
00:13:15.600 | which is fantastic, it's on the internet,
00:13:19.400 | about why General Magic didn't work.
00:13:21.160 | And Tony Fadell, who ended up building the iPod
00:13:24.280 | and ran engineering for the iPhone,
00:13:26.400 | talks about how the pieces just weren't there.
00:13:28.920 | So they were having to do all the pieces, right?
00:13:31.920 | The network and the chips,
00:13:33.760 | and it just wasn't there yet.
00:13:35.040 | And so these models have been around,
00:13:38.160 | maybe ahead of the hardware,
00:13:40.000 | and now NVIDIA's bringing the hardware,
00:13:42.080 | and these pieces start to come together.
00:13:44.080 | And then the data,
00:13:45.560 | and I think one of the most fascinating things
00:13:48.360 | about this story of Tesla and FSD-12
00:13:52.080 | is when you understand where they get the data.
00:13:55.240 | So they are tracking their best drivers with five cameras,
00:14:00.240 | and the drivers know it.
00:14:01.320 | They've opted into the program,
00:14:03.360 | and they upload the video overnight.
00:14:07.320 | And so, you know, talk about the pieces coming together.
00:14:11.080 | We've found Reddit forums and stuff we can put links to
00:14:15.040 | in the notes where Tesla drivers are saying
00:14:19.800 | they're uploading 10 gigabytes a night.
00:14:22.120 | And so, you know, you had to have the Wi-Fi infrastructure,
00:14:26.440 | like, how would it be possible to upload that much?
00:14:31.440 | Here's someone whose Tesla uploaded 115 gigabytes in a month,
00:14:36.680 | right?
00:14:37.880 | And so these are massive numbers,
00:14:39.760 | and the infrastructure, five years ago,
00:14:42.800 | your car couldn't have done this.
00:14:44.320 | And, you know, I think we'll talk about competition
00:14:46.760 | in a minute, but like, you know,
00:14:48.360 | who else has the capacity to do this, right?
00:14:51.000 | It's unbelievable to, like, the footprint of cars they have.
00:14:55.360 | And then the notion that, oh yeah,
00:14:57.120 | we could just go upload this data,
00:14:59.120 | and it is a buttload of data that's coming.
00:15:02.960 | Right, and even with this architecture,
00:15:06.040 | so you just do the math.
00:15:07.320 | Five million cars, 30 miles a day,
00:15:09.840 | I think eight cameras on the car, five megapixels each,
00:15:12.960 | and then the data going back 10 years, right?
00:15:15.160 | This amount of shadow data,
00:15:16.560 | you could combine the clusters
00:15:18.600 | of every hyperscaler in the world,
00:15:21.120 | and you couldn't possibly store all of this data, right?
00:15:24.320 | That's the size of the challenge.
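
A rough back-of-envelope makes the point. The fleet size, camera count, and resolution are the figures from the conversation; the frame rate, daily drive time, and bytes per pixel are assumptions added purely for scale:

    cars = 5_000_000
    cameras = 8
    megapixels = 5
    fps = 36                 # assumed camera frame rate
    hours_per_day = 1        # assumed driving time to cover ~30 miles
    bytes_per_pixel = 1      # assumed, before any video compression

    bytes_per_car_day = cameras * megapixels * 1e6 * fps * 3600 * hours_per_day * bytes_per_pixel
    fleet_exabytes_per_day = cars * bytes_per_car_day / 1e18
    print(f"~{fleet_exabytes_per_day:.0f} exabytes of raw pixels per day across the fleet")
    # On these assumptions: roughly 5 TB of raw video per car per day, and on the
    # order of 25 exabytes per day across the fleet -- which is why nearly all of it
    # has to be filtered and discarded on the car rather than uploaded.
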
00:15:26.160 | So what they've had to do is process this data on the edge.
00:15:30.320 | And in fact, I think 99% of the data that a car collects
00:15:34.600 | never makes it back to Tesla.
00:15:36.520 | So, you know, they're using video compression,
00:15:38.880 | these remote send filters,
00:15:40.800 | they're running, you know, neural nets and software
00:15:43.000 | on the car itself.
00:15:44.360 | So basically they, you know, for example,
00:15:47.320 | if 80% of your driving is the highway
00:15:49.320 | and there's nothing interesting that happens on the highway,
00:15:51.720 | then you can just throw out all that data.
00:15:53.640 | So what they're really looking for is, you know,
00:15:56.640 | what is the data that is a long way away
00:16:00.280 | from the mean data, right?
00:16:02.040 | So what are these outlier moments?
00:16:03.920 | And then can we find tens or hundreds
00:16:06.440 | or thousands of those moments to train the model?
00:16:09.280 | So they're literally pulling this compressed,
00:16:12.160 | filtered data every single night off of these cars.
00:16:16.320 | They've built an autonomous system.
00:16:18.280 | So before they would have engineers look at that data
00:16:21.040 | and say, okay, what have we perceived here now?
00:16:22.760 | How do we write, you know, this patchwork code?
00:16:25.040 | Instead, this is simply going into the model itself.
00:16:28.600 | It's fine tuning the model.
00:16:30.160 | And they're constantly running this autonomous process
00:16:33.080 | of fine tuning these models.
00:16:34.680 | And then they're re-uploading those models back to the car.
00:16:38.320 | Okay, this is why you get these exponential moments
00:16:41.600 | of improvement, right, that we're seeing now,
00:16:45.080 | which then brings us back to build this question.
00:16:47.800 | You know, Tesla has 5 million cars on the road.
00:16:50.280 | They have all this infrastructure.
00:16:51.840 | They are collecting this data.
00:16:54.240 | We know they're a couple of years ahead.
00:16:56.000 | Think about Waymo, for example.
00:16:57.960 | They're still using the old architecture.
00:17:00.480 | It's geo-fenced.
00:17:01.600 | I don't know, they have 30 or 40 cars on a road,
00:17:04.320 | and they're only running the--
00:17:05.680 | so do they have any chance?
00:17:07.360 | Does Waymo have any chance of competing or even
00:17:10.000 | adopting this architecture?
00:17:11.880 | It'd be-- it's such an interesting question.
00:17:14.240 | And by the way, just one quick comment
00:17:17.000 | on the previous thing you said.
00:17:18.480 | It's genius, actually, that they are--
00:17:21.640 | they've taught the car what moments it should record.
00:17:24.800 | Exactly.
00:17:25.300 | And so they mentioned to us an example
00:17:28.360 | of any time there's--
00:17:31.720 | well, obviously, a disengagement.
00:17:33.360 | So a disengagement becomes a moment
00:17:35.360 | where they want the video before and the video after.
00:17:38.000 | The other thing would be any abrupt movement.
00:17:40.600 | So if the gas goes fast, or if the brake is hit quickly,
00:17:44.760 | or if the steering wheel jerks, that
00:17:46.920 | becomes a recordable moment.
00:17:48.400 | And the part I didn't know, which they told us,
00:17:51.200 | which is just fascinating, people with LLMs
00:17:54.080 | have heard about reinforcement learning from human feedback.
00:17:58.200 | RLHF.
00:17:59.040 | And they've talked about how that could make it--
00:18:01.200 | even with Gemini, they said maybe
00:18:02.680 | that was what caused that.
00:18:04.760 | What we were told is that those moments, these moments
00:18:08.120 | where the car jerks or whatever, if it is super relevant,
00:18:12.400 | they can put that in the model with extra weight.
00:18:16.720 | And so it tells the model, if this circumstance arises,
00:18:21.560 | this is something that's more important
00:18:23.600 | and you have to pay extra attention to.
00:18:25.600 | And so if you think about these corner case scenarios, which
00:18:30.960 | we all know are the biggest problems in self-driving,
00:18:34.800 | now they have a way to only capture
00:18:37.440 | the things that are most likely to be those things
00:18:40.520 | and to learn on them.
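
A sketch of what that on-car filtering might look like. The trigger types come from the discussion above (disengagements, abrupt pedal or steering inputs, a few seconds of context on either side, extra weight for the most relevant moments); the thresholds, field names, and weight values are assumptions for illustration, not Tesla's actual pipeline:

    from dataclasses import dataclass

    @dataclass
    class Sample:
        t: float                 # seconds since the drive started
        steering_delta: float    # change in steering angle since last sample
        brake_delta: float       # how hard the brake input changed
        throttle_delta: float    # how hard the throttle input changed
        disengaged: bool         # driver took over from the system

    def recordable(s, steer_thresh=0.3, pedal_thresh=0.5):
        """Return (is_event, training_weight) for one telemetry sample."""
        if s.disengaged:
            return True, 5.0                         # disengagements get extra weight
        abrupt = (abs(s.steering_delta) > steer_thresh
                  or s.brake_delta > pedal_thresh
                  or s.throttle_delta > pedal_thresh)
        return abrupt, 2.0 if abrupt else 1.0

    def select_clips(samples, pre=5.0, post=5.0):
        """Keep a few seconds of video around each triggered moment; drop the rest."""
        clips = []
        for s in samples:
            is_event, weight = recordable(s)
            if is_event:
                clips.append({"start": s.t - pre, "end": s.t + post, "weight": weight})
        return clips                                 # queued for compressed overnight upload
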
00:18:42.560 | So the amount of data they needed to get started
00:18:46.400 | was this impossible amount of data
00:18:48.640 | with the millions of cars.
00:18:50.120 | And now the way that plays to their advantage
00:18:53.240 | is they're much more likely to capture
00:18:56.600 | these more severe, less frequent moments because
00:19:01.960 | of the bigger footprint.
00:19:03.440 | And so you ask the question, I don't know who could compete.
00:19:07.320 | It certainly couldn't-- let's make an assertion.
00:19:11.640 | If this type of neural network approach is the right answer,
00:19:15.880 | and once again, Occam's razor seems that way to me,
00:19:20.600 | then who could compete?
00:19:22.040 | And several of the companies who would be least likely
00:19:27.280 | would be Cruise and Waymo and these things
00:19:29.800 | because they just don't have that many cars.
00:19:31.920 | And their cars cost $150,000.
00:19:35.200 | So if they wanted to have--
00:19:37.600 | the math just doesn't work.
00:19:38.760 | You can't build the footprint.
00:19:41.080 | And so who could?
00:19:41.960 | I don't know.
00:19:43.680 | I don't know.
00:19:45.760 | What would it cost to build a five-camera device
00:19:47.960 | to put on top of every Uber?
00:19:49.520 | I don't know.
00:19:50.440 | A lot, it would be weird.
00:19:52.760 | They're not going to do it.
00:19:55.080 | And that, to me, is--
00:19:57.760 | when you look at these alternative models,
00:20:00.480 | if this really is about data-- and remember,
00:20:02.440 | Bill just said an important point, which
00:20:04.480 | is it's not just about quantity of data.
00:20:06.760 | Something magic happens around a million cars.
00:20:09.760 | Yes, you've got to get all that quantity of data.
00:20:12.000 | But to get the long-tail events--
00:20:14.680 | these are events that occur tens or just hundreds of times.
00:20:18.720 | That's where you really need millions of cars.
00:20:20.880 | Otherwise, you don't have a statistically relevant pool
00:20:24.120 | of these long-tail instances.
00:20:26.160 | And what they're uploading from the edge, Bill, he said,
00:20:29.800 | each instance is a few seconds long of video,
00:20:35.120 | plus some additional vehicle-driving metadata.
00:20:37.720 | And it's those events.
00:20:38.880 | If you only have hundreds of cars or thousands of cars,
00:20:41.560 | you can get a lot of data quickly.
00:20:43.680 | It's not about quantum of data.
00:20:45.840 | 100 cars can produce a huge quantum of data
00:20:48.800 | driving 1,000 miles.
00:20:50.760 | It's about the quality of the data, those adverse events.
00:20:55.200 | Yes, and I guess the other type of company that maybe
00:20:58.760 | could take a swing at it would be like Mobileye or something.
00:21:01.720 | The problem they have is they don't control
00:21:05.000 | the whole design of the car.
00:21:06.880 | And so this part where Tesla has the car in the garage at night
00:21:11.920 | and uploads gigabytes and puts it right into the model,
00:21:15.400 | are they going to be able to get that done working
00:21:18.360 | with other OEMs?
00:21:20.840 | Are they going to be able to organize all that?
00:21:23.360 | Do they have the piece on the car that says when to record
00:21:28.200 | and when not to record?
00:21:31.240 | It's just a massive infrastructure question.
00:21:33.400 | I would probably, if I had to handicap anybody,
00:21:36.760 | it would probably be BYD or one of the Chinese manufacturers.
00:21:41.480 | And if you think about it, they have a lot of miles driven.
00:21:45.000 | In China, much less so outside of China,
00:21:49.080 | I imagine you're going to have some
00:21:50.560 | of this nationalistic stuff that emerges on both ends of this.
00:21:56.000 | But one of the things I asked our analyst, Bill,
00:21:58.280 | is if we just step back, I think these guys have network
00:22:01.600 | advantage.
00:22:02.120 | They have data advantage.
00:22:03.160 | They're clearly in the lead.
00:22:04.360 | They have bigger H100 clusters than the people
00:22:07.200 | they're competing against.
00:22:08.320 | I mean, they have all sorts of things
00:22:10.080 | that have come together here.
00:22:11.680 | But if you think about what's the so what to Tesla?
00:22:15.680 | And just in the first instance, and we'll
00:22:17.800 | pull up this slide that Frieda on our team made,
00:22:21.320 | if you look at the unit economics of a Tesla,
00:22:24.720 | with no FSD, they're making about $2,500 on a vehicle.
00:22:29.640 | If you look at it today, they have about 7% penetration
00:22:32.640 | of FSD.
00:22:33.920 | That was, let's call it, through FSD-11.
00:22:36.240 | And those people paid $12,000 incrementally for that FSD.
00:22:41.080 | And as we know, you can go read about it on Twitter.
00:22:43.600 | People are like, yeah, it's good,
00:22:44.960 | but it's not as good as I thought it would be.
00:22:46.880 | So now we have this big moment of what
00:22:49.440 | feels like kind of a step function,
00:22:51.800 | the model getting better at a much faster rate.
00:22:54.960 | So I asked the question, what if we reduce the price on this
00:22:59.240 | by half?
00:23:01.120 | What if Tesla said, this is such a good product,
00:23:03.880 | we think we want to drive penetration,
00:23:05.920 | so let's make it $500 a month, not $1,000 a month?
00:23:09.840 | So if you assume that you have penetration go from 7% to 20%,
00:23:17.040 | give it to everybody for free, they drive around for a month,
00:23:19.680 | they're like, wow, this really does feel like a human driver.
00:23:22.560 | I'm happy to pay $500 a month.
00:23:25.160 | If you get to 20% penetration, then your contribution margin
00:23:30.640 | at Tesla is about the same, even though you're
00:23:34.040 | charging half as much.
00:23:35.880 | Now, if you get to 50% penetration, all of a sudden
00:23:38.320 | you're creating billions of dollars
00:23:39.840 | in incremental EBITDA.
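
The arithmetic behind that claim, using the $1,000-a-month price and 7% take rate from above, and treating FSD revenue as roughly all margin (an assumption for illustration):

    old_price, old_take = 1_000, 0.07            # $/month and today's penetration
    new_price = old_price / 2                    # the hypothetical $500/month price

    fsd_per_car_old = old_price * old_take       # ~$70/month of FSD revenue per average car
    breakeven_take = fsd_per_car_old / new_price # 14% penetration matches today's economics
    fsd_per_car_20 = new_price * 0.20            # $100/month -- ahead of today at 20%
    fsd_per_car_50 = new_price * 0.50            # $250/month -- several times today at 50%
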
00:23:41.160 | Now, think about this from a Tesla perspective.
00:23:43.880 | Why do they want to drive even more adoption of FSD?
00:23:46.800 | Well, you get a lot more information and data
00:23:49.680 | about disengagements and all these other things.
00:23:52.440 | So that data then continues to turn the flywheel.
00:23:55.680 | So my guess is that Tesla, seeing
00:23:59.120 | this meaningful improvement, is going to focus on penetration.
00:24:03.600 | My guess is that they want to get a lot more people trying
00:24:06.840 | the product, and they're going to play around with price.
00:24:09.720 | Why not?
00:24:11.040 | Maybe $100 a month is the right intersection between adoption
00:24:16.120 | or penetration and price.
00:24:18.800 | But again, I think that all of these things
00:24:21.560 | are occurring at an accelerating rate at Tesla.
00:24:24.840 | And when I look around, I still hear
00:24:27.160 | people saying Waymo's worth $50 or $60 billion.
00:24:30.400 | But you could be in a situation on that business
00:24:33.560 | where it just gets passed really quickly,
00:24:37.600 | and they have a hard time structurally of catching up.
00:24:41.880 | People have said that--
00:24:43.360 | and if someone has data, once again,
00:24:45.640 | that they want to correct this, I'd
00:24:47.400 | be glad to restate and correct the data.
00:24:49.400 | But we've been told they have a head count similar to Cruise.
00:24:54.600 | And the Cruise financials came out, and they were horrific.
00:24:57.520 | And so I don't have any reason to believe
00:25:00.800 | that the Waymo financials are any different than the Cruise
00:25:04.080 | ones.
00:25:04.600 | And I've always thought this model,
00:25:08.080 | that we're going to build this incredible car,
00:25:10.840 | and our business model is going to be to run a service.
00:25:13.800 | Like the CapEx, if you just build a 10-year model,
00:25:16.760 | the CapEx you need, they would have to go raise $100 billion.
00:25:20.720 | And there's another element that's super interesting.
00:25:23.240 | The team at Tesla feels very strongly
00:25:26.840 | that LIDAR does not need to be a component of this thing.
00:25:31.600 | And so the Waymo, Cruise, all those approaches, and Mobileye
00:25:36.080 | are LIDAR-dependent, which is a very costly piece of material
00:25:43.080 | in those designs.
00:25:44.280 | And so if this is all true, if this is how it plays out,
00:25:48.280 | it's a pretty radical new discovery.
00:25:53.600 | So one of the things I also want to talk about,
00:25:55.840 | because one of the reasons I started going down this path
00:25:59.640 | is our team's been spending a lot of time
00:26:01.760 | with the robotics companies, new robotics companies.
00:26:04.400 | So we have Optimus at Tesla.
00:26:06.680 | Figure.ai just raised some money from OpenAI and Microsoft,
00:26:12.360 | and we met with those guys.
00:26:13.480 | And they're all doing really interesting things.
00:26:16.040 | But again, they're shifting their models.
00:26:19.200 | The robotics companies also were using
00:26:22.080 | these deterministic models to teach the robot maybe
00:26:27.720 | how to pour a cup of coffee or something.
00:26:30.800 | And now they're moving to these imitation models.
00:26:32.840 | So I was searching around the other day,
00:26:35.560 | and I came across this video by a PhD student
00:26:38.000 | at Stanford, Cheng Chi.
00:26:40.800 | And he showed how this robotic arm was basically just
00:26:44.680 | collecting data very quickly using a little camera
00:26:48.000 | on a handheld device.
00:26:49.680 | And then they literally take the SD card out of the camera.
00:26:53.240 | They plug it into the computer.
00:26:55.000 | It uploads this data to the computer.
00:26:57.480 | It refreshes the model.
00:26:59.520 | And just based on two minutes of training data,
00:27:03.560 | now video in, control out, this robotic arm
00:27:07.680 | knows how to manipulate this coffee cup in all
00:27:11.440 | of these different situations.
00:27:13.320 | So I think we're going to see the application
00:27:16.440 | of these models, end-to-end learning models,
00:27:18.480 | imitation learning models, impact not just cars.
00:27:21.680 | I mean, 5 million cars on the road,
00:27:23.560 | that's probably the best robot we could possibly
00:27:25.680 | imagine for data collection.
00:27:27.240 | The challenge, of course, in robotics
00:27:29.000 | is going to be data collection.
00:27:30.360 | But then I saw this video, and I said,
00:27:32.200 | well, maybe that's a manageable challenge, particularly
00:27:35.000 | for a discrete set of events.
00:27:36.920 | Yeah, and the other great thing about that video,
00:27:39.360 | if people take the time to watch it,
00:27:41.640 | it actually explains pretty simply how the Tesla stuff's
00:27:45.680 | working, right?
00:27:46.720 | I mean, it's just a different scale, obviously,
00:27:48.680 | but that's the exact same thing, just at a very reduced scale.
00:27:53.080 | Right, and you can imagine when that's just
00:27:55.200 | this autonomous flywheel without a lot of human intervention.
00:27:57.840 | And that's the direction that Tesla still
00:28:00.960 | has some engineering intervention along the way.
00:28:03.080 | But I think the engineering team working on this at Tesla
00:28:06.560 | is about 1/10 the size of the teams at Cruise.
00:28:10.720 | Well, I mean, that gets back to this simplicity point, right?
00:28:15.480 | Like, this approach removes so much complexity
00:28:21.600 | that you should be able to do it with less people.
00:28:24.480 | And the fact that you can have something better
00:28:27.520 | with less people is really powerful.
00:28:31.080 | So we talked a little bit about how models,
00:28:35.040 | these open source models, are driving a lot
00:28:37.720 | of the improvements at Tesla.
00:28:39.880 | We seem to get model improvements and model updates
00:28:43.560 | every day, Bill.
00:28:45.040 | Maybe I just go through a few of the recent ones.
00:28:48.360 | And I want to explore this open versus closed.
00:28:52.400 | But last week, we heard about Gemini 1.5.
00:28:55.600 | It has a huge expanded context window.
00:28:58.880 | And Gemini 1.5 is at about a GPT-4 level.
00:29:03.640 | Then yesterday, we get the Claude 3 announcements.
00:29:06.440 | Their best model, Opus, is just a little bit better
00:29:09.560 | than GPT-4.
00:29:10.720 | But I think the significant thing there--
00:29:12.760 | and we have a slide on this--
00:29:15.240 | is just really about the cost breakthrough,
00:29:18.120 | that their Sonnet-level model can do workloads
00:29:21.440 | at a fraction of the price of GPT-4,
00:29:25.000 | even though it's performing at or near that quality.
00:29:29.560 | And then we have-- those models were trained on a mixture,
00:29:34.320 | I think, of H100 and prior version of NVIDIA chips.
00:29:37.720 | The first H100-only trained models, I think,
00:29:41.760 | will be Llama 3 and GPT-5.
00:29:44.800 | So we're hearing rumors that both of those models
00:29:47.560 | are going to come out in the May-July time frame.
00:29:50.480 | With respect to Llama 3, which was trained on Meta's H100
00:29:55.240 | cluster, rumors are that it has Claude 3-like performance,
00:30:00.000 | which is pretty extraordinary if you're thinking about a fully
00:30:04.120 | open-sourced model.
00:30:06.040 | And then GPT-5, which we hear is done.
00:30:11.080 | And they're simply in kind of their post-training safety
00:30:15.480 | guardrails, their normal post-training work.
00:30:19.160 | We hear that's going to launch sometime in May or June.
00:30:21.560 | And because that one was trained on H100s,
00:30:24.400 | we hear it is like a 2x improvement versus GPT-4.
00:30:30.000 | But then we hear all the rest of the Frontier models
00:30:32.480 | are kind of in this holding pattern
00:30:34.800 | because they're waiting for the B100s
00:30:36.680 | to get launched this Q3 or Q4 out of NVIDIA, which probably
00:30:40.040 | means the next iteration of the Frontier models
00:30:43.120 | will come out in Q2 of next year, Q2 of '25.
00:30:46.320 | That's after GPT-5.
00:30:48.760 | So Bill, if you go through this bedrock page on AWS,
00:30:53.840 | if you just scroll through, you see
00:30:55.720 | that Amazon is offering all these different models.
00:30:58.400 | I mean, you can run your workloads
00:31:00.240 | on Llama, on Mistral, on Claude, et cetera.
00:31:03.480 | Snowflake today just announced a deal with Mistral.
00:31:07.600 | And they're going to have Llama as well.
00:31:09.240 | I imagine Databricks will.
00:31:10.960 | Microsoft, you can use Llama, or you can use Mistral, or OpenAI.
00:31:16.600 | So where do you think all of this
00:31:19.000 | goes in terms of the models that will actually
00:31:21.320 | get used by enterprises and consumers in practice?
00:31:24.920 | Yeah, so I have a lot of different thoughts.
00:31:27.240 | My first one, when this new Anthropic thing came out
00:31:30.640 | and they list all the different math tests, and science tests,
00:31:33.880 | and PhD, and they're all listing the same thing,
00:31:37.080 | I wonder if they're racing up a hill.
00:31:41.400 | But they're all racing up the same hill.
00:31:43.760 | Yeah, there's the thing.
00:31:45.560 | Because they're all running these same comparative tests,
00:31:48.840 | and they're all releasing this data.
00:31:50.760 | And what I don't know, if any of them
00:31:53.800 | are creating the type of differentiation that's
00:31:56.880 | going to lead to one of them becoming the wholesale
00:31:59.560 | winner versus the other.
00:32:01.560 | And is this type of micro-optimization
00:32:06.560 | in a way that's going to matter to people or to the users?
00:32:11.840 | And it's not clear to me.
00:32:13.360 | I mean, I see some developers get way more excited
00:32:16.480 | about the pricing at the low end of those three choices
00:32:19.320 | than they do about the performance of the top end.
00:32:22.480 | So that's one thing.
00:32:23.440 | The second thing on my mind, I don't have a lot of logic
00:32:28.520 | to put around this.
00:32:29.320 | It's more of an intuition.
00:32:30.800 | I wonder if these companies can simultaneously
00:32:33.640 | try and compete with Google to be this consumer app
00:32:37.480 | that you're going to rely on to get you information.
00:32:40.600 | So you could call that Wikipedia on steroids,
00:32:44.240 | Google search redefined, whatever market
00:32:46.440 | you want to call that.
00:32:48.280 | And simultaneously be great at enterprise models.
00:32:51.960 | And I just don't know if they can do both.
00:32:55.520 | I really don't.
00:32:56.480 | And maybe that'll get to the third thing, which
00:32:58.680 | is more the essence of your question.
00:33:00.600 | What am I hearing about and seeing
00:33:02.320 | about when it comes to companies that are actually
00:33:04.880 | utilizing these things?
00:33:07.280 | The Tesla example was interesting
00:33:08.840 | because they start with these bedrock components that
00:33:11.680 | are open source.
00:33:12.800 | And one thing that happened in the past 20 years--
00:33:16.160 | it happened very slowly, but we definitely got there--
00:33:19.000 | CIOs at large companies, they used
00:33:21.440 | to be an IBM shop or an Oracle shop or a Microsoft shop.
00:33:26.160 | That was their platform.
00:33:27.720 | They slowly got to the place where most of the best CIOs
00:33:31.720 | were open source first.
00:33:33.280 | And so for any new project they start,
00:33:36.280 | they used to be skeptical of open source.
00:33:38.080 | And it flipped completely the other way.
00:33:39.880 | Like, oh, is there an open source choice we can use?
00:33:42.500 | And the reason is they don't--
00:33:44.000 | one, there's more competition.
00:33:45.280 | And two, they don't want to get stuck on anything.
00:33:47.680 | And so when I look at what I see going on in the startup world,
00:33:51.880 | they might start with one of these really well-known service
00:33:55.800 | models that's proprietary.
00:33:57.520 | But the minute they start thinking about production,
00:34:00.000 | they become very cost-focused and on the inference side.
00:34:03.080 | And they'll just play these things off of one another,
00:34:05.720 | and they'll run a whole bunch of different ones.
00:34:07.720 | I saw one startup that moved between four
00:34:10.000 | different platforms.
00:34:11.400 | And I just think that that competition
00:34:14.700 | is very different than the competition
00:34:17.320 | to compete with Google on this consumer thing.
00:34:19.720 | And I'll give you another example.
00:34:21.280 | Like, I was talking to somebody.
00:34:23.640 | If you had a legal application you wanted to use,
00:34:28.080 | you'd be better off with a smaller model that
00:34:31.200 | had been trained on a bunch of legal data.
00:34:33.640 | It wouldn't need some of the training of this overall LLM.
00:34:38.120 | And it might be way cheaper to have something
00:34:40.480 | that's very proprietary-- or not proprietary,
00:34:42.640 | but very focused from a vertical standpoint.
00:34:44.960 | And you could imagine that in a whole bunch
00:34:46.760 | of different verticals.
00:34:47.720 | So it just strikes me that on the B2B side,
00:34:52.560 | this stuff's getting cut up and into a bunch
00:34:55.560 | of different pieces where a bunch of different parties
00:34:58.720 | could be more competitive, and where those components are
00:35:02.800 | most likely to be open source first.
00:35:05.880 | Yes, yes.
00:35:06.880 | I mean, you're causing me to think
00:35:08.440 | a couple of different things.
00:35:09.600 | One, I've said in the past, if I was Sam Altman running OpenAI,
00:35:13.800 | I think I might rename the company ChatGPT
00:35:17.040 | and just focus on the multi-trillion dollar
00:35:20.240 | opportunity to replace Google.
00:35:22.960 | Because I think winning at both--
00:35:24.920 | beating Google at consumer and beating Microsoft
00:35:28.160 | at enterprise--
00:35:30.080 | and he wants to beat NVIDIA at building chips--
00:35:32.880 | those are three big battlefronts.
00:35:35.120 | And if I think about the road to AGI--
00:35:38.840 | building memory, building all this stuff that's
00:35:41.320 | going to differentiate you in the consumer competition--
00:35:46.800 | that just seems best aligned with who they are,
00:35:48.920 | what they're doing.
00:35:49.680 | I mean, ChatGPT has become the verb in the age of AI.
00:35:53.200 | They replaced Google at the start.
00:35:54.960 | Nobody's saying we're Bard-ing something.
00:35:56.720 | They're saying we're ChatGPTing something.
00:35:58.560 | So I think that they have a leg up there.
00:36:01.840 | When I look at the competition in enterprise--
00:36:05.240 | I think Anthropic was up at the Morgan Stanley
00:36:07.320 | conference this morning, and they said they're hiring--
00:36:11.040 | their sales force went from two people last year
00:36:13.920 | to 25 this year.
00:36:15.920 | Think of the tens of thousands of salespeople
00:36:19.400 | at Microsoft, at Amazon, et cetera,
00:36:21.920 | that you got to go compete with.
00:36:23.280 | Now, of course, they're also partnering with Amazon.
00:36:26.480 | But when you think about that, these guys--
00:36:29.080 | there's going to be all this margin stacking, Bill.
00:36:31.720 | So Amazon's got to get paid.
00:36:33.600 | Anthropic's got to get paid.
00:36:35.480 | NVIDIA's got to get paid.
00:36:37.240 | Now, if you use an open source model,
00:36:39.400 | you can pull one of those pieces of the margin stacking out.
00:36:43.440 | So now, this is just Microsoft getting paid using Llama 3
00:36:47.520 | or Llama 2.
00:36:48.160 | They don't have to pay for the use of that model.
00:36:50.320 | And NVIDIA gets paid.
00:36:51.360 | So I think in the competitive dynamics of an open marketplace
00:36:56.360 | that that enterprise game is going
00:36:58.240 | to be tough for two different reasons for these model
00:37:00.520 | businesses.
00:37:01.440 | Number one, Zuckerberg is going to drive down the price.
00:37:05.000 | He's going to give away Frontier-esque models
00:37:08.280 | on the cheap.
00:37:10.640 | And that's going to be highly disruptive to your ability
00:37:13.320 | to stack margin.
00:37:14.680 | If I'm a CIO of JP Morgan or some other large institution,
00:37:19.840 | do I really want to pay a lot for that model?
00:37:22.040 | I'd rather have the benefit of open,
00:37:24.680 | because then I can move my data around a little bit more
00:37:28.760 | fluidly.
00:37:29.320 | I get the safety benefits of an open source model.
00:37:33.360 | And I'm not sending my data to OpenAI.
00:37:35.760 | I'm not sending my data to some of these places.
00:37:38.200 | Huge point you just made that is in addition to everything
00:37:42.280 | we said, which is a lot of the big companies
00:37:45.360 | have concerns about their data being
00:37:47.600 | commingled or uploaded, even at all,
00:37:50.440 | into these proprietary models.
00:37:52.400 | And so it's not just--
00:37:55.200 | I think the challenge for them in enterprise
00:37:57.240 | is not just how do I build an enterprise fleet
00:37:59.400 | to go compete with the largest hyperscaler in the world who
00:38:02.560 | are great enterprise businesses, and you've
00:38:04.280 | got to compete with Databricks and Snowflake, et cetera.
00:38:07.080 | But I think the second thing is just there
00:38:09.760 | is this bias, this tendency that you
00:38:12.120 | say has evolved over a couple of decades
00:38:14.440 | of open versus closed, which then brings me
00:38:18.040 | a little bit to this--
00:38:19.720 | But wait, there's one more element
00:38:22.280 | that I think that's important for everyone to understand.
00:38:25.560 | One of the reasons open source is so powerful
00:38:27.960 | is because it can be replicated for free,
00:38:31.480 | you end up with just so much more experimentation.
00:38:34.800 | So it turns out right now there are multiple startups who
00:38:39.480 | believe they have an opportunity hosting open source models.
00:38:43.480 | So they're propping up Llama 3 or Mistral
00:38:47.040 | as a service provider competing with Amazon,
00:38:49.680 | but they're going to tune it a little different way.
00:38:52.120 | They're going to play with it a little different way.
00:38:54.320 | So the number of places you can go
00:38:57.160 | buy one of these open source models delivered as a service
00:39:01.160 | is you have multiple choices.
00:39:02.800 | It's been proliferated.
00:39:03.920 | And that creates optionality.
00:39:05.680 | There's just so much more experimentation
00:39:08.400 | that's going to happen.
00:39:09.720 | On top of the data privacy problem, the pricing stuff
00:39:12.800 | you talked about.
00:39:13.600 | So there's a lot of different elements
00:39:16.000 | that make me think that the open source component
00:39:19.200 | models are going to be way more successful in the enterprise.
00:39:22.400 | And it's a really tough thing to compete with.
00:39:24.600 | Now, go ahead.
00:39:25.800 | Well, it kind of brings into stark relief a big debate
00:39:32.240 | that erupted this week, certainly on the Twitters,
00:39:36.720 | with Elon's lawsuit that he filed.
00:39:41.240 | And part of that was about this not-for-profit to for-profit
00:39:44.360 | conversion.
00:39:46.440 | That's, to me, a little bit less interesting.
00:39:48.640 | Don't want to talk a lot about that.
00:39:50.400 | But it blew the doors wide open on this open versus closed
00:39:55.000 | debate, right?
00:39:56.040 | And the potential that exists here for regulatory capture.
00:40:00.000 | Nobody's more thoughtful about this topic than you.
00:40:04.360 | I think I saw somebody tweet this 2x2 matrix.
00:40:08.720 | It says dividing every conversation up
00:40:12.840 | between Mark and Vinod and Elon and Sam.
00:40:17.240 | But we saw a lot of very sharp opinions expressed.
00:40:24.480 | So help us think about the risk of regulatory capture
00:40:29.840 | and why this moment is so important.
00:40:32.440 | Yeah, and I happened to mention this
00:40:35.400 | when I did my regulatory capture speech at the all-in
00:40:39.400 | conference.
00:40:39.960 | I mentioned very briefly when I showed a picture of Sam Altman
00:40:44.000 | that I was worried that they were attempting
00:40:47.800 | to use fear-mongering about doomerism and AI
00:40:51.320 | to build regulation that would be particularly beneficial
00:40:57.120 | to the proprietary models.
00:40:58.600 | And then after that, there were rumors
00:41:02.760 | that people at some of the big model companies
00:41:05.760 | were going around saying we should kill open source,
00:41:09.680 | or we should make it illegal, or we should get the government
00:41:12.360 | to block it.
00:41:13.440 | And then Vinod started basically saying that, literally,
00:41:16.880 | like, yes, we should block open source.
00:41:19.840 | And that became very concerning to me.
00:41:21.520 | I think it obviously became concerning to Mark Andreessen
00:41:24.600 | as well.
00:41:25.440 | And for me, the biggest reason that it's concerning
00:41:29.360 | is because I think it could become a precedent where
00:41:31.600 | all companies would try and eliminate open source.
00:41:34.960 | And there's a good reason why.
00:41:36.360 | I mean, we just talked about it's
00:41:37.880 | a hell of a fucking competitor.
00:41:39.400 | Like, I wouldn't want to go up against it.
00:41:43.160 | But it's also really amazing for the world.
00:41:45.480 | It's great for startups.
00:41:46.600 | It's amazing for innovation.
00:41:48.320 | It's great for worldwide prosperity.
00:41:50.320 | Think about Tesla.
00:41:51.400 | We just talked about all this open source that they're using.
00:41:54.040 | Yeah, yeah.
00:41:54.640 | So it's the last thing I would want to see happen.
00:41:58.920 | But we do live in this world where these pieces exist.
00:42:03.200 | And I would urge people to read--
00:42:05.840 | we'll put a link in--
00:42:06.960 | a Politico article that shows the amount of lobbying
00:42:11.120 | that has been done on behalf of the large proprietary models.
00:42:15.120 | And I don't think you'll find--
00:42:17.840 | literally, the only thing that comes close, perhaps,
00:42:20.000 | and people will think I'm being outlandish,
00:42:21.800 | but is SBF, who was also lobbying at this kind of level.
00:42:25.640 | But this Politico article shows they have three or four
00:42:28.640 | different super PACs.
00:42:30.080 | They're putting people--
00:42:31.720 | they're literally inserting people
00:42:33.320 | onto the staffs of the different congressmen and senators
00:42:36.520 | to try and influence the outcome here.
00:42:39.600 | I think we maybe escaped this.
00:42:42.760 | I think the open source models are so prolific right now
00:42:46.720 | that maybe we've gotten past it.
00:42:48.360 | And I also think their competitiveness
00:42:51.200 | has shown that there's a reason why
00:42:55.320 | they would want to stop them.
00:42:56.920 | I mean, I think at the time they started,
00:42:59.520 | maybe that wasn't clear.
00:43:00.640 | But I think it's remarkably clear right now.
00:43:03.960 | I also don't believe in the doomerism scenario.
00:43:07.560 | Someone who I admire quite a bit, Steve Pinker,
00:43:10.520 | posted a link to this article by Michael Totten
00:43:13.440 | where he goes through, I think in a very sophisticated way,
00:43:17.320 | the different arguments.
00:43:19.040 | And I would urge people maybe to read that on their own.
00:43:21.640 | But yeah, I don't--
00:43:24.320 | for me, if you want to spread the doomerism,
00:43:27.840 | let's get people to tell that story that
00:43:31.400 | aren't running billion-dollar companies that
00:43:33.360 | are taking hundreds of millions out
00:43:35.120 | and giving it to their employees.
00:43:36.500 | I mean, there's a level of bias that's obvious here.
00:43:43.120 | And so I'd rather listen to a doomerism argument
00:43:46.480 | from someone who's not standing to gain from regulation.
00:43:50.520 | Yeah, I mean, I think you saw this tweet from Martin Casado
00:43:55.560 | that was in response to Vinod comparing open source--
00:43:59.880 | would you use open source for the Manhattan Project, which
00:44:04.920 | really kind of opened up this box even more.
00:44:09.200 | What's your-- weigh in a little bit here.
00:44:12.920 | Just if you're in Washington and you're
00:44:15.720 | hearing these things like, we can't allow these types
00:44:19.760 | of models to be used on things like this.
00:44:24.080 | We saw India is now requiring approval to release models.
00:44:29.360 | That also was, I think, a scary development for people
00:44:33.080 | in the open source community.
00:44:35.600 | But again, just reinforce, why should we not
00:44:38.840 | be worried about open source AI models?
00:44:43.480 | How do they send us to a better place?
00:44:46.200 | In the Totten article, Pinker uses an analogy
00:44:49.880 | that I just love, which he says, you
00:44:53.040 | could spread a doomerism argument that a self-driving car would
00:44:57.720 | just go 200 miles an hour and run over everybody.
00:45:00.680 | But he says, if you look at the evolution of self-driving cars,
00:45:04.720 | they're getting safer and safer and safer.
00:45:08.160 | We don't program the AI to give them this singular purpose that
00:45:13.400 | overrides all the other things they've been taught,
00:45:16.200 | and then they go crazy.
00:45:17.680 | That's not what's happening.
00:45:19.440 | That's not how the technology works.
00:45:21.120 | That's not how we use the technology.
00:45:23.800 | And so I think the whole article is great, but I think--
00:45:28.440 | and look, I also think Pinker is a really smart human.
00:45:33.080 | He's also one of the biggest outspoken proponents
00:45:35.480 | of nuclear, which is another topic that I think
00:45:37.920 | has been wildly misconstrued.
00:45:40.720 | And so anyway, I'm more of an optimist about technology.
00:45:46.080 | These kinds of doomerism things go way back to the Luddites,
00:45:50.560 | hence the definition of the word, and ever since then.
00:45:54.680 | And someone else tweeted it'd be like telling the farmer,
00:45:59.560 | look out for the tractor.
00:46:01.040 | It's going to ruin you.
00:46:03.240 | It's just not how our world evolves.
00:46:05.880 | Well, the reason I think this is so important
00:46:08.400 | is because the competition that's
00:46:11.880 | going to come from these models, all the evidence
00:46:14.000 | suggests that it moves us to a better place, not
00:46:17.360 | a worse place.
00:46:18.160 | However, during these moments where you do have a new thing,
00:46:23.800 | and it does sound scary, and then you
00:46:25.960 | have all these people coming to Washington saying, hey,
00:46:28.360 | we can't allow all this experimentation.
00:46:31.000 | We can't allow these open source models.
00:46:32.840 | What I worry about is that that can actually
00:46:35.680 | win the day like it has in India.
00:46:38.360 | But I was in Washington last week
00:46:40.240 | talking to leadership in both the House and the Senate
00:46:42.680 | about a program near and dear to me called Invest America.
00:46:46.520 | But the conversation about AI came up
00:46:49.320 | with many senators and many senior leadership
00:46:52.080 | folks in the House.
00:46:53.280 | And when one of them asked me about AI,
00:46:58.080 | I said I was worried about Washington getting persuaded
00:47:01.520 | into excessive government oversight, particularly as it
00:47:03.280 | relates to open source models.
00:47:05.640 | And he said, don't worry.
00:47:08.160 | He said, we had Sam Altman out here,
00:47:11.400 | and we know what he's up to.
00:47:13.120 | Oh, that's great.
00:47:14.000 | And I thought that was--
00:47:15.720 | and he ended by saying, we need competition.
00:47:18.320 | Like, the way we stay ahead of China is we need competition.
00:47:21.520 | So that was highly encouraging to me
00:47:24.600 | from a senior member of the House.
00:47:26.000 | By the way, it's interesting.
00:47:26.600 | That's so great to hear.
00:47:27.640 | And I think this China thing comes up all the time.
00:47:31.360 | The one thing that would cause us to get way behind China
00:47:34.440 | is if we played without open source and they had it.
00:47:38.960 | And the other thing I would just say
00:47:41.080 | is many academics I talk to are like,
00:47:45.000 | I have way more trust in open source
00:47:47.000 | where I can get in and see and analyze what's going on.
00:47:50.080 | And the other side of this, because we
00:47:53.520 | talked about LLMs, or AI competing both in the B2B side
00:47:57.480 | and the B2C side, on the consumer side,
00:48:00.920 | the Gemini release from Google, I think,
00:48:02.880 | is proof of the type of--
00:48:07.320 | the Google Gemini model was much more similar to something
00:48:11.160 | autocratic that you might equate with a communist society.
00:48:16.080 | Like, it's intentionally limiting the information
00:48:19.440 | you can have and painting it in a very specific way.
00:48:23.680 | And so, yeah, I'm more afraid of the proprietary.
00:48:27.840 | Yeah, they're effectively imposing a worldview
00:48:30.200 | by massaging the kernel here in ways
00:48:32.360 | that we don't understand.
00:48:33.400 | It's a black box influencing our opinions.
00:48:35.840 | And I just find it ironic in this moment in time
00:48:38.920 | that the person putting the most dollars up
00:48:42.160 | behind open source is somebody
00:48:44.400 | Washington was pretty critical of a couple
00:48:47.560 | of years ago, which is Zuckerberg.
00:48:49.360 | And the fact of the matter is you
00:48:51.200 | need to have a million H100s.
00:48:53.240 | He's going to have hundreds of thousands of B100s.
00:48:56.720 | You need somebody who has a business model that
00:48:58.840 | can fund this level of frontier magic
00:49:01.920 | on these open source models.
00:49:03.360 | And the good news, it appears we have it.
00:49:06.080 | Yeah.
00:49:07.120 | That's awesome.
00:49:07.760 | I'm thrilled you heard that.
00:49:09.800 | You know, there was another interesting case
00:49:13.880 | over the course of the last couple of weeks
00:49:15.400 | that I know you and I--
00:49:16.880 | By the way, actually, one last thing on this
00:49:19.360 | because I just recalled a conversation I
00:49:21.400 | was having with a senator.
00:49:23.320 | Let's assume that doomerism's right
00:49:29.920 | and you have to be worried about this.
00:49:32.360 | What are the odds that our government could put together
00:49:37.360 | a piece of effective legislation that would actually
00:49:40.200 | solve the problem?
00:49:41.480 | Right.
00:49:43.160 | Right.
00:49:44.400 | Well, I mean, I think the cost to society
00:49:48.600 | is certainly greater when you look
00:49:51.640 | at kind of the tail risk of it.
00:49:53.480 | But again, with how Vinod frames it, here's what I get worried about.
00:49:58.280 | I have no problem with him having an active defense
00:50:03.720 | and wanting to do everything in OpenAI's best interest.
00:50:07.000 | I just don't want to see us attack technological progress,
00:50:11.640 | which open source obviously contributes
00:50:14.600 | to en route to that.
00:50:16.200 | Just compete against them heads up and win heads up,
00:50:18.760 | like that's fine.
00:50:19.800 | But let's not try to cap the other guys
00:50:23.320 | by taking their knees out before they even get started.
00:50:26.440 | So back to what I was saying, speaking of government's role
00:50:30.680 | in business, a couple of weeks ago, the state of Delaware,
00:50:33.720 | the chancery court, this judge, Kathleen McCormick,
00:50:37.400 | she pretty shockingly struck down Elon's 2018 pay package.
00:50:42.840 | Remember, the company was on the verge of bankruptcy.
00:50:45.520 | They basically cut a pay package with him
00:50:47.520 | where he took nothing if the company didn't improve.
00:50:50.200 | But if the company hit certain targets,
00:50:51.880 | he would get paid out 1% tranches of options,
00:50:55.360 | I think over 12 tranches. And because the company had
00:50:59.600 | this extraordinary turnaround, he achieved those goals.
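For a concrete sense of the mechanics being described, here is a minimal sketch of an all-or-nothing, milestone-based option grant; the milestone levels, share count, and strike price below are illustrative placeholders, not the actual terms of the 2018 plan.

```python
# Minimal sketch of a milestone-based option grant (illustrative numbers only,
# not the actual terms of the 2018 plan).

def package_value(market_cap_high, milestones, shares_outstanding,
                  tranche_pct=0.01, strike_price=23.0):
    """Intrinsic value of the vested tranches.

    Each tranche is roughly 1% of shares outstanding and vests only if its
    market-cap milestone is hit. If no milestone is reached, the package is
    worth zero: no cash, no guarantee.
    """
    vested = sum(1 for m in milestones if market_cap_high >= m)
    price = market_cap_high / shares_outstanding
    per_share_gain = max(price - strike_price, 0.0)
    return vested * tranche_pct * shares_outstanding * per_share_gain

# Hypothetical milestones: roughly a doubling of market cap for the first
# tranche, then 11 more steps above that (12 tranches, as in the real plan).
milestones = [100e9 + 50e9 * i for i in range(12)]
shares = 1e9  # placeholder share count

print(package_value(55e9, milestones, shares))    # no milestone hit -> 0.0
print(package_value(650e9, milestones, shares))   # all 12 tranches vest
```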
00:51:03.600 | So now she's kind of Monday morning quarterbacking.
00:51:05.920 | She's looking back.
00:51:07.200 | And she says his pay package is unfathomable.
00:51:10.120 | And she said the board never asked the $55 billion
00:51:14.680 | question, Bill, was it even necessary to pay him this
00:51:18.520 | to retain him and to achieve the company's goals?
00:51:22.480 | So of course, this can be appealed to the Delaware Supreme
00:51:25.160 | Court, and it will be.
00:51:26.800 | But in response to this, Elon and I think many others
00:51:31.440 | just said, hold on a second here.
00:51:33.320 | What the hell just happened?
00:51:35.600 | The state of Delaware has had this historical advantage
00:51:39.240 | in corporate law because of its predictability.
00:51:42.160 | And its predictability wasn't because of the code,
00:51:44.800 | but because of the judiciary, right?
00:51:46.800 | There was a lot of precedent in the state of Delaware.
00:51:49.920 | And this seemed to turn that totally on its head.
00:51:52.560 | He said he was going to move incorporation
00:51:54.840 | to the state of Texas.
00:51:56.520 | We're starting to see other companies follow suit
00:51:59.280 | and other people talking about this.
00:52:01.120 | So what was your reaction seeing something
00:52:05.600 | that was, I think most of us thought,
00:52:07.200 | was highly unlikely and pretty shocking?
00:52:10.400 | Yeah, well, first of all, I think
00:52:11.960 | it's super important for everyone
00:52:13.480 | to pay attention to this.
00:52:14.560 | I don't actually think it's just an outlier event.
00:52:17.920 | I think it's so unprecedented in Delaware's history
00:52:22.040 | that it really marks a moment for everyone to pay attention.
00:52:25.480 | And there's a couple of things I would pay attention to.
00:52:27.800 | One data point you left out, which came up recently,
00:52:31.440 | is the lawyers that pursued this case
00:52:34.960 | are asking for $5 or $6 billion in payment.
00:52:38.680 | And it turns out when you bring a derivative suit in Delaware,
00:52:45.360 | there have been cases where people ask for a percent
00:52:48.040 | and the judge gets to kind of decide that.
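As a rough back-of-the-envelope illustration of why a percentage-of-recovery fee gets so large here, the sketch below uses hypothetical fee percentages applied to the headline value of the rescinded grant; it is not the actual fee request.

```python
# Back-of-the-envelope, purely illustrative: a percentage-of-recovery fee
# applied to a package valued in the tens of billions gets very large fast.
package_value = 55e9                  # roughly the headline value of the rescinded grant
for fee_pct in (0.05, 0.10, 0.15):    # hypothetical fee percentages
    print(f"{fee_pct:.0%} fee -> ${package_value * fee_pct / 1e9:.1f}B")
# 5% -> $2.8B, 10% -> $5.5B, 15% -> $8.2B
```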
00:52:50.200 | And if you step back and look, this is a victimless crime.
00:52:57.120 | And I think that's the thing that makes Delaware look
00:53:00.280 | like a kangaroo court here.
00:53:02.520 | Everyone knows the lawyer grabbed someone
00:53:04.400 | that only had nine shares.
00:53:06.520 | And those nine shares went way up,
00:53:08.800 | but it's kind of silly because it's so small anyway.
00:53:12.120 | So how could a client with nine shares
00:53:16.880 | lead to a multi-billion dollar award to a lawyer?
00:53:20.160 | And that's only true if you've created a bounty hunter system,
00:53:27.720 | a bureaucratic bounty hunter system.
00:53:29.640 | There's something in California called PAGA that's
00:53:31.760 | evolved this way.
00:53:33.640 | And if that's the new norm in Delaware,
00:53:37.040 | that's really, really concerning.
00:53:39.600 | The other thing that's different here
00:53:41.520 | is the stock went way, way up.
00:53:43.400 | So I think we've all become accustomed to when stock price
00:53:46.840 | is going down, these litigators grab a handful of shareholders
00:53:51.640 | and bring a shareholder lawsuit.
00:53:53.080 | And we're like, oh, yeah.
00:53:55.120 | Unfortunately, that's become a way of life.
00:53:57.800 | But to attack companies that go way up, I would--
00:54:04.080 | two things.
00:54:04.680 | One, I would offer this pay package--
00:54:07.480 | I looked at it in detail-- to any CEO I work with,
00:54:10.360 | and I think they would all turn it down.
00:54:12.720 | Because there's no cash, no guarantee.
00:54:15.600 | And the first tranche required a 2x of the stock.
00:54:19.400 | So that's fantastic.
00:54:21.880 | I think the biggest problem with compensation packages--
00:54:24.760 | and you may tackle that some other day--
00:54:27.080 | is a misalignment with shareholders,
00:54:29.160 | where people are getting paid when the stock doesn't move.
00:54:32.680 | That's how RSUs work.
00:54:34.040 | And here--
00:54:34.920 | By the way, that's the standard in corporate America.
00:54:38.120 | We have this grift where people make a ton of money,
00:54:40.680 | and the stock doesn't do anything.
00:54:42.380 | Look at the pay package for Mary Barra at GM.
00:54:45.760 | So the first tranche here was if the stock doubled.
00:54:48.880 | And I would offer that to anyone.
00:54:50.680 | I would also say, if any other CEO took a package like this,
00:54:55.800 | in a public company, I would be very encouraged
00:54:58.240 | to consider buying a lot of it.
00:54:59.880 | Totally.
00:55:01.320 | And so it may be one of the most shareholder-aligned incentive
00:55:06.600 | packages ever, which is exactly what you would think Delaware
00:55:10.720 | courts would be looking after.
00:55:13.160 | And ISS as well, which is a whole other subject.
00:55:17.360 | So I think it's just really bad.
00:55:20.840 | And it does show a new side of Delaware, one
00:55:27.120 | that they haven't shown before.
00:55:29.560 | And so I think everyone has to pay attention.
00:55:32.000 | Right, no, I mean, it's shocking.
00:55:34.240 | And if you--
00:55:35.520 | I was a corporate lawyer in my first life.
00:55:38.580 | As you know, if you actually go and look
00:55:42.040 | at the actual corporate law code in the state of Delaware,
00:55:48.080 | it's almost word for word the same as Texas,
00:55:51.520 | the same as California, and so on.
00:55:54.000 | The point here is it's not that Delaware
00:55:55.760 | has a legal code around corporations
00:55:59.680 | that's so much different than every other state.
00:56:01.920 | What has set it apart is it has way more legal precedent, way
00:56:06.800 | more trials that have occurred, and judges
00:56:09.880 | who have interpreted that in a way that
00:56:11.880 | is very shareholder-aligned, shareholder-friendly.
00:56:15.040 | So the big--
00:56:16.160 | And letter-- and they're known for letter of the law.
00:56:18.560 | So the construction of charters.
00:56:19.880 | Correct.
00:56:20.380 | And so here we have a moment.
00:56:22.680 | And the reason it's so shocking is
00:56:24.360 | because it's at odds with all of the precedent
00:56:27.280 | that people had come to expect.
00:56:29.280 | So I think there are going to be--
00:56:31.720 | We left out they had 70% shareholder approval.
00:56:34.520 | I mean, and it was a low-probability event
00:56:38.760 | that happened to happen.
00:56:40.000 | And you can't look at that after the fact
00:56:42.480 | and say, oh, it was obvious this was going to happen.
00:56:45.520 | You know, you can't do that.
00:56:46.840 | Right, I think that if this stands--
00:56:50.240 | so I imagine corporations right now are in holding patterns.
00:56:54.640 | Elon is moving, reincorporating in Texas.
00:56:57.560 | I think a lot of other corporations
00:56:59.100 | will stay pending the Delaware Supreme Court appeals ruling.
00:57:04.480 | If they overturn this judge's ruling,
00:57:08.880 | then I think you may be back to the status quo
00:57:11.160 | in the state of Delaware.
00:57:12.320 | But if they uphold the ruling and deny--
00:57:15.960 | I mean, I think Elon said, despite all the goodness that's
00:57:19.040 | occurred, saving the company from bankruptcy,
00:57:21.320 | this means he effectively gets paid
00:57:23.160 | zero for the last five years.
00:57:25.320 | I mean, it's such an outlandish outcome.
00:57:27.600 | So if it gets upheld, I expect you're
00:57:31.000 | going to see significant flight from the state of Delaware
00:57:34.480 | by people reincorporating in these other states
00:57:36.920 | that, frankly, are pretty friendly as well.
00:57:39.360 | Brad, I just thought of something.
00:57:41.480 | So if it's upheld, and if these lawyers are paid anything
00:57:47.440 | as a percentage, anything other than maybe just
00:57:49.680 | their hourly fee, so if those two things happen,
00:57:53.140 | I would make the argument that every company in Delaware
00:57:57.680 | has to move to a different domicile
00:58:00.640 | because they could be sued in a future derivative
00:58:04.600 | lawsuit for the risk they've taken by staying in Delaware.
00:58:08.120 | Oh my god, you're so right.
00:58:10.120 | You are so right.
00:58:11.800 | Oh, mic drop on that.
00:58:15.520 | So now on the boards that I sit on,
00:58:18.200 | I have to warn them that if they stay in the state of Delaware,
00:58:21.920 | then they're knowingly and negligently taking
00:58:24.880 | on this incremental risk.
00:58:26.200 | Absolutely.
00:58:27.240 | Oh, wow.
00:58:30.160 | Let's just wrap with this, a quick market check.
00:58:34.600 | One of the things I like to do is be responsive
00:58:36.680 | to the feedback we get.
00:58:38.520 | A lot of people loved some of the charts
00:58:42.400 | we had put up on the market check on the last show.
00:58:47.280 | So we get asked about this all the time.
00:58:50.120 | We said on the prior pod, prices have run a lot this year,
00:58:55.120 | and the background noise around macro has not improved.
00:58:59.080 | Arguably, it's getting a little worse.
00:59:00.600 | Inflation is running a little hotter.
00:59:02.720 | Rates are not expected to come down as much.
00:59:06.400 | So just a quick check on the multiples of companies
00:59:10.120 | that we really care about--
00:59:11.720 | Microsoft, Amazon, Apple, Meta, Google, and NVIDIA.
00:59:14.600 | And I just want to walk through this really quick.
00:59:16.800 | So this is a chart that just shows
00:59:18.240 | the multiples between March of '21 and March of '24.
00:59:23.000 | And so if we look at--
00:59:24.480 | let's start with Meta.
00:59:27.160 | You can look at that time.
00:59:28.240 | Their multiple has gone from about 20 times earnings
00:59:30.840 | to about 23 times earnings.
00:59:32.960 | So it's a little bit higher.
00:59:35.960 | Take a look at Google.
00:59:37.600 | Its multiple has gone from about 25 times earnings
00:59:41.160 | to now down to just below 20 times earnings.
00:59:44.080 | Now, this is to be expected.
00:59:45.680 | We've been having this debate about whether or not
00:59:48.840 | Google's search share is going to go down and the impact
00:59:51.800 | that that will have.
00:59:52.880 | And so this is just the market's voting machine at a moment
00:59:56.160 | in time saying, hey, we hear that debate,
00:59:58.080 | and we're a little bit more worried
00:59:59.680 | about those future cash flows than we
01:00:01.680 | were in March of '21, which makes a lot of sense to me.
01:00:04.680 | If you look at Apple, it too is--
01:00:06.920 | And by the way, on that one, I mean, the Gemini release,
01:00:09.800 | the world's looking at you with this lens,
01:00:13.520 | and then you release this thing, and then you trip.
01:00:18.480 | I mean, they basically tripped, right?
01:00:20.400 | And we know they tripped because they've
01:00:22.960 | apologized for tripping.
01:00:24.560 | And so it's just not good.
01:00:28.080 | It's not confidence inspiring.
01:00:29.800 | Well, and now you're seeing the drumbeat starting.
01:00:33.520 | You and I are getting the text, the emails, the drumbeats
01:00:36.600 | around whether Sundar is going to make it
01:00:39.920 | past this moment in time.
01:00:41.040 | I mean, listen, I think boards have one job--
01:00:43.560 | hire, fire the CEO who leads the company forward.
01:00:47.560 | Can they execute against the plan?
01:00:49.280 | And I think that if I was on the board of Google,
01:00:51.320 | that's the question I'd be asking at this moment in time.
01:00:53.280 | Not is he a good human being, not is he a smart product guy,
01:00:56.040 | not is he a good technologist, not
01:00:57.760 | what's happened over the course of the last 10 years.
01:01:00.600 | But at this moment in time, do we
01:01:02.880 | have any risk of innovator's dilemma?
01:01:05.000 | And is this the team?
01:01:06.280 | Is this the CEO who can lead us through what is
01:01:10.000 | likely to be a tricky moment?
01:01:12.000 | Just to finish it off, Apple's multiple is a little bit lower.
01:01:15.400 | That also makes sense to me.
01:01:17.280 | You see what's happening in China.
01:01:19.640 | Some concerns about their--
01:01:21.920 | they get $20 billion a year from Google.
01:01:24.960 | Like, what happens to that?
01:01:27.040 | In the case of Microsoft, their multiple is a little higher.
01:01:29.680 | But again, these multiples are all in the range.
01:01:32.440 | And then the final two, Amazon's multiple
01:01:37.040 | is actually quite a bit lower here.
01:01:40.680 | And so that's interesting to me.
01:01:42.240 | I actually think the retail business is doing better.
01:01:44.800 | I actually think the cloud business is doing better.
01:01:47.280 | And now that stock looks cheaper to me.
01:01:49.920 | And then NVIDIA, of course, is the one
01:01:51.600 | that everybody's talking about.
01:01:52.880 | And this goes back to where we started the show.
01:01:54.960 | I mean, if you look at NVIDIA's multiple
01:01:56.920 | to start the year, Bill, so hover there right above December '23,
01:02:02.400 | its multiple was at a 5- to 10-year low, right?
01:02:08.440 | Because earnings exploded last year from $5 to $25.
01:02:13.360 | Its multiple has obviously come up here a little bit
01:02:15.920 | at the start of the year.
01:02:16.960 | But you can see it's well below some
01:02:19.600 | of its historical really frothy multiples.
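To make the multiple math concrete, here is a minimal sketch using rounded, hypothetical numbers in the spirit of the figures quoted here: when earnings grow faster than the price, the P/E compresses even though the stock went up.

```python
# Illustrative only: multiple compression when EPS grows faster than price.
def pe(price, eps):
    return price / eps

# Hypothetical round numbers: stock roughly triples while EPS goes from ~$5 to ~$25.
price_start, eps_start = 150.0, 5.0
price_end, eps_end = 450.0, 25.0

print(f"P/E at start: {pe(price_start, eps_start):.0f}x")  # 30x
print(f"P/E at end:   {pe(price_end, eps_end):.0f}x")      # 18x
```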
01:02:22.360 | But I think the question in my mind--
01:02:24.160 | we're big NVIDIA shareholders-- like in other people's minds
01:02:27.920 | is, is this earnings train durable for NVIDIA, right?
01:02:33.520 | Are these revenues durable?
01:02:37.320 | Have we pulled forward this training demand?
01:02:37.320 | We showed that chart a couple of weeks
01:02:39.280 | ago that we think the future build out
01:02:42.360 | of compute and supercompute, of B100s and everything,
01:02:45.600 | is longer and wider than people think.
01:02:47.920 | And then the interesting thing, when
01:02:49.560 | you see that note out of Klarna last week, Bill,
01:02:52.760 | and what they were able to achieve,
01:02:54.600 | this is really the question.
01:02:56.680 | At the end of the day, are companies and consumers
01:03:00.080 | getting massive benefits out of the models and inference
01:03:03.960 | that's running on these chips?
01:03:06.360 | If the answer is no, then all of these stocks are going lower.
01:03:10.040 | If the answer is yes, then they probably
01:03:12.240 | have a lot of room to run.
01:03:14.440 | But that's the quick--
01:03:15.760 | maybe we'll do this at the end of each of them,
01:03:18.200 | do a quick market check.
01:03:20.360 | But why don't we leave it there?
01:03:21.680 | It's good seeing you.
01:03:22.560 | Next time, get back out here.
01:03:24.280 | Let's do this together again.
01:03:25.800 | All right.
01:03:26.280 | Take care.
01:03:27.040 | Take it easy.
01:03:27.640 | [MUSIC PLAYING]
01:03:31.000 | As a reminder to everybody, just our opinions, not
01:03:41.640 | investment advice.