
Ep9. GPT-4o, Astra, Multi Modal, China Tariffs, Tech Earnings | BG2 with Bill Gurley & Brad Gerstner


Chapters

0:00 Intro
0:40 OpenAI Launches GPT-4o
10:29 ChatGPT-4o Benchmarks
14:11 Voice as the New GUI
20:01 Breakout Power of ChatGPT
22:09 Consumer AI vs Enterprise AI Landscape
27:51 AI Disruption
35:45 Regulation & Industrial Policy
43:47 China Tariffs
53:15 Tech Market Check

Whisper Transcript

00:00:00.000 | This is classic freshman economics.
00:00:03.300 | If someone's better at doing something than us, we should let them do it.
00:00:06.940 | We should buy it from them and we should send them what we're good at.
00:00:09.660 | Like this is just, you know, de-globalization, highly inflationary,
00:00:15.500 | and we'll make our companies less competitive, not more.
00:00:18.680 | Hey, man, great to see you.
00:00:32.260 | That was fun to take some of your money in Vegas this past weekend.
00:00:35.680 | I can't believe you're bringing that up.
00:00:38.020 | So last night I'm, uh, I'm sitting on the back porch.
00:00:43.340 | I'm helping Lincoln study for, uh, his macro econ test today.
00:00:48.480 | Hopefully he did pretty well on it.
00:00:49.640 | I mean, I have to say as a dad, this is one of my like favorite things, just
00:00:53.600 | watching kids gain that agency, like really get hungry and curious.
00:00:58.020 | And, and so he was asking questions that, you know, about the IMF and the world
00:01:02.320 | bank, and I didn't know the precise answer to him.
00:01:05.160 | So I just opened up ChatGPT, and I just found myself, ChatGPT literally became,
00:01:10.640 | you know, almost like a teacher who was sitting at the table, non-intrusive, uh,
00:01:15.280 | answering questions for us, keeping the conversation flowing, you know, and I
00:01:19.320 | found myself, we never opened Google a single time, Bill, um, and it really
00:01:23.920 | unlocked a lot of learning and led to this 45 minute conversation back and
00:01:28.440 | forth that we had, um, and you know, what doing that and then watching some of
00:01:34.200 | the things this week, it really reminded me that unlocking education, right.
00:01:39.280 | Both for, you know, kids like my kid, but also, uh, kids around the world
00:01:43.840 | who all basically have equal access to this incredible tutor, uh, on these
00:01:48.640 | mobile devices, I think it's hard to imagine the amount of human potential
00:01:53.320 | and kind of global economic impact that we'll have over time, knowing we're
00:01:56.940 | in the very early innings of this.
00:01:58.400 | I think it's a really good point.
00:02:00.320 | We, we talk about what can AI do and what can it not do.
00:02:04.240 | And there's a lot of, I think there's a lot of fair
00:02:06.600 | conversations on both sides.
00:02:08.480 | But one thing that seems glaringly obvious is this, is this teacher element.
00:02:13.440 | And Sal Khan, who was present in some of the demos this week, um, has
00:02:18.840 | been leaning into this heavily.
00:02:20.240 | He's someone we should definitely pay attention to.
00:02:22.520 | Um, but they look real.
00:02:24.280 | And we've talked about the models are infinitely patient,
00:02:27.080 | which humans struggle with.
00:02:28.680 | Um, and I even tweeted when I was watching one of the demos that reminded
00:02:34.240 | me of, there's a great Neal Stephenson book called The Diamond Age, and
00:02:39.040 | the, the subtitle of the book is A Young Lady's Illustrated Primer, and the
00:02:43.720 | heroine of this book, um, gets a little tablet, and the
00:02:48.920 | tablet teaches her throughout the course of the book.
00:02:52.200 | And it's, it's some really elegant writing.
00:02:54.480 | And it's very prophetic for, for what we saw this week.
00:02:57.400 | Well, let's talk a little bit about what we saw this week, because
00:03:00.920 | it was all about consumer.
00:03:03.040 | And, you know, in our first eight episodes, Bill, uh, you must have
00:03:06.400 | brought up the movie Her, uh, you know, five or 10 times, and its relationship to AI.
00:03:12.160 | So you must've felt vindicated this week when Sam Altman, uh, you know,
00:03:16.440 | after the demos began, tweeted "her." Not to be upstaged, then Google yesterday,
00:03:21.960 | you know, at their I/O event showed off Astra, their version of her, albeit
00:03:26.160 | pre-recorded and maybe a little bit clunkier, um, so I made a little
00:03:30.880 | mashup for you, um, yeah, maybe just as a, a bit of a bit of a demo to get us
00:03:36.960 | started here, uh, so maybe Benny will, we'll, we'll fire it up for us.
00:03:41.380 | Good morning, Theodore morning.
00:03:44.080 | You have a meeting in five minutes.
00:03:46.000 | You want to try getting out of bed?
00:03:47.760 | Theodore, I saw in your emails that you'd gone through a breakup recently.
00:03:53.880 | You're kind of nosy.
00:03:55.520 | Am I?
00:03:56.280 | You'll get used to it.
00:03:57.600 | I feel really close to her.
00:03:59.640 | Like when I talked to her, I feel like she's with me.
00:04:02.120 | Learn everything about everything.
00:04:06.600 | I'm going to discover myself.
00:04:08.200 | I want that for you too.
00:04:09.720 | Hey, how are you doing?
00:04:13.880 | I'm doing fantastic.
00:04:15.840 | Thanks for asking.
00:04:16.800 | How about you?
00:04:17.800 | Pretty good.
00:04:18.560 | What's up?
00:04:19.040 | So my friend, uh, Barrett here, he's been having trouble sleeping lately.
00:04:22.400 | And, uh, I want you to tell him a bedtime story about robots and love.
00:04:26.720 | Oh, a bedtime story about robots and love.
00:04:30.440 | I got you covered.
00:04:32.040 | Gather round Barrett.
00:04:33.960 | Once upon a time in a world not too different from ours, there was a robot
00:04:40.200 | named Byte.
00:04:41.600 | What does that part of the code do?
00:04:44.440 | This code defines encryption and decryption functions.
00:04:51.200 | It seems to use AES-CBC encryption to encode and decode data based on a key
00:04:56.760 | and an initialization vector (IV).
00:04:59.120 | Happy birthday.
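For context on what the demo's assistant is describing: in CBC mode, each plaintext block is XORed with the previous ciphertext block (the IV seeds the first) before being run through the block cipher. A minimal sketch of that chaining structure, with a toy XOR-with-key stand-in for AES (illustration only, not real or secure encryption):

```python
# CBC chaining sketch: ciphertext_i = E(plaintext_i XOR ciphertext_{i-1}),
# seeded by the IV. The "block cipher" E here is a toy XOR-with-key
# stand-in, NOT real AES -- this only illustrates the chaining structure.
BLOCK = 16

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt_cbc(plaintext: bytes, key: bytes, iv: bytes) -> bytes:
    assert len(plaintext) % BLOCK == 0  # real code would pad (e.g. PKCS#7)
    out, prev = [], iv
    for i in range(0, len(plaintext), BLOCK):
        block = xor(xor(plaintext[i:i + BLOCK], prev), key)  # "E" = XOR key
        out.append(block)
        prev = block
    return b"".join(out)

def decrypt_cbc(ciphertext: bytes, key: bytes, iv: bytes) -> bytes:
    out, prev = [], iv
    for i in range(0, len(ciphertext), BLOCK):
        block = ciphertext[i:i + BLOCK]
        out.append(xor(xor(block, key), prev))  # undo E, then un-chain
        prev = block
    return b"".join(out)

key, iv = b"0123456789abcdef", b"fedcba9876543210"
msg = b"attack at dawn!!" * 2  # two identical 16-byte blocks
assert decrypt_cbc(encrypt_cbc(msg, key, iv), key, iv) == msg
```

Note the CBC property the chaining buys you: identical plaintext blocks produce different ciphertext blocks, because each block is mixed with the previous ciphertext.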
00:05:01.880 | So it only took us, you know, 11 years from Her to have kind of our Her moment.
00:05:11.960 | Um, and I actually asked ChatGPT how long it usually takes from science fiction to
00:05:17.280 | it showing up kind of in its first iteration.
00:05:19.680 | And it turns out they, they estimate, you know, 15 to 40 years.
00:05:23.360 | So that seems pretty good.
00:05:24.640 | Um, you know, in, in, in terms of the timeline there, but let's talk about
00:05:29.280 | what GPT-4o and Google Astra gave us this week.
00:05:34.560 | I mean, I think it's mostly these consumer facing small assistants.
00:05:38.840 | The models are smaller, faster, but most importantly, it's like
00:05:42.960 | this end to end audio model, right?
00:05:45.400 | It can listen, it can think, it can speak in one model.
00:05:49.440 | The voice seemed to have a lot more empathy.
00:05:51.880 | Uh, it could interrupt.
00:05:53.400 | It was much, it was reacting a lot faster.
00:05:55.920 | Still didn't have memory, still didn't have actions, but the tone
00:06:00.240 | and the cadence of interruption, it all felt a lot more real.
00:06:04.320 | Yeah.
00:06:05.000 | What were your reactions?
00:06:06.240 | So I agree.
00:06:08.680 | I found myself, um, feeling like both OpenAI and Google were
00:06:15.320 | tilting more towards the consumer.
00:06:17.240 | There's been a lot of talk.
00:06:18.280 | Is this about the API and enterprises?
00:06:20.240 | Is this about the consumer?
00:06:21.480 | Where are they going to make money?
00:06:22.800 | Where are they going to focus?
00:06:23.880 | These felt like, we're, you know, I think OpenAI said something about
00:06:28.240 | the human-computer connectivity, which sounds like something Bill
00:06:31.920 | Gates would have said, you know, and, and very consumer focused.
00:06:35.960 | I thought the winner, if you will, of the week was OpenAI's voice recognition.
00:06:42.040 | I will tell you that playing with it on my own phone, I didn't get the
00:06:46.800 | same reaction time you mentioned.
00:06:48.840 | I didn't get the same quickness; when I tried to interrupt, it hung.
00:06:52.840 | So I don't know if they had a specially fast version or whatever, but, but
00:06:57.720 | I didn't have the same experience.
00:06:59.240 | I hope, I hope I will.
00:07:00.480 | Um, I think it's really killer as I've talked about when you think about voice
00:07:04.560 | and if that's, you know, where they're winning, you gotta think about what
00:07:08.240 | apps are voice in voice out, you know, and, um, not all apps are that way.
00:07:13.760 | If you want to see four different hotel choices in Taiwan, I don't
00:07:20.840 | think you want that read to you.
00:07:22.400 | Um, I think that will be tedious versus looking at it on a screen.
00:07:26.240 | Now combinations are possible.
00:07:28.120 | So there's all kinds of things we can do.
00:07:29.880 | Um, I, I, I wondered whose voice they train these things on because the, uh,
00:07:37.280 | overt, excessive inflection was a bit cringey, as Elon Musk said. I don't
00:07:44.480 | think real people talk that way.
00:07:46.400 | I was hoping maybe they'll put a smarmy slider in so I can
00:07:51.440 | take that down a bit when I use it.
00:07:53.720 | Well, I think, I think they actually, they, they actually demoed that at
00:07:57.520 | OpenAI, where you could turn the dial up and down, I think, on inflection,
00:08:01.920 | or I imagine it's going to have, uh, the sexy dial it's going to have the
00:08:06.120 | serious dial, it's going to have the teacher voice, you're going to
00:08:09.240 | be able to get whatever you want.
00:08:10.640 | But let's, let's, let's click on this, on this voice thing, Bill, because
00:08:14.640 | I think it's, it's really interesting.
00:08:16.640 | You and I've talked a bunch of times, um, about kind of voices,
00:08:21.000 | the new graphical interface, right?
00:08:23.280 | And is this really the moment where now that we have a voice, uh, a model
00:08:30.400 | trained on voice that really is multimodal and brings these things
00:08:34.320 | together, you know, that we get to a level of latency and to get to a level
00:08:38.600 | of interaction that feels human enough that it really does start to
00:08:42.200 | replace typing and, and, and the mouse.
00:08:44.720 | And I'll stipulate to you that it may in fact be that it reads me something
00:08:50.640 | that says, Hey, check out your phone, you know, for the four options of
00:08:53.720 | the hotel, right, where there's an interaction that's more blended.
00:08:57.560 | There are definitely going to be things that we're going to want to see,
00:09:00.200 | you know, but think of this, you know, design for the human world, but
00:09:03.400 | also AI and, you know, enabled.
00:09:05.720 | So we're not going to throw out all the apps that we currently have
00:09:08.920 | like DoorDash and booking.com, but where your AI will be smart enough
00:09:13.320 | to navigate those and then maybe take screenshots and push them
00:09:16.480 | to you or things like that.
00:09:18.280 | Well, look, or put them on the heads-up display of your, your Meta Ray-Bans.
00:09:22.880 | Um, I, I think, I think it's a really important point in time.
00:09:29.000 | And if they achieved what you said, I think it's like the first door that
00:09:34.200 | opens for us to go figure all those things out.
00:09:37.000 | Um, it, it might be useful for someone to, you know, we got all
00:09:41.240 | these scores, uh, to measure AI.
00:09:43.520 | It might be useful for someone to come out with a score for, uh, voice recognition.
00:09:48.880 | I mean, there's all kinds of different dialects people have
00:09:51.640 | and different types of voices.
00:09:53.760 | It'd be killer if there was a, a way to kind of gauge.
00:09:57.200 | It feels like from the demo, this is way better than Siri.
00:10:01.200 | Um, I, like I said, my experience wasn't, wasn't quite
00:10:05.240 | up to what we saw on the TV, but it might be useful to know that.
00:10:08.560 | And if it is achieved, if there is this, this voice recognition
00:10:13.040 | that doesn't miss a beat, like in the movie, her, then I think, I think
00:10:16.960 | that's a, I think that's a really important point of demarcation in
00:10:22.240 | time, um, that that's been achieved.
00:10:24.480 | And I think it will allow all types of experimentation, um, into the future.
00:10:29.440 | Let's touch on that.
00:10:31.080 | You mentioned the benchmark.
00:10:32.480 | So in terms of the benchmarks, this wasn't actually a huge leap forward.
00:10:37.440 | You know, maybe that's why they didn't call it ChatGPT-5.
00:10:41.040 | Um, maybe because on certain benchmarks, you know, Llama
00:10:45.120 | 400B is already better, but it was accelerating on other dimensions.
00:10:49.280 | So I, we have a couple of charts here.
00:10:50.800 | One is we just took the SemiAnalysis chart, which is kind of this quality
00:10:54.860 | versus inference API pricing, and we plotted GPT-4o on this chart.
00:11:02.120 | And you can see how it barely improved in terms of the HumanEval
00:11:08.560 | score, but it dramatically improved in terms of pricing, um, you know,
00:11:15.000 | in terms of inference pricing.
00:11:16.360 | And then if you go to the, the next chart, this is just Elo
00:11:19.820 | performance versus release date.
00:11:21.760 | Again, you know, GPT-4o is an improvement, um, but it
00:11:27.100 | does appear on those measures.
00:11:29.120 | Bill, we have some maybe plateauing going on, but, but I think the point
00:11:35.020 | you made is, is, is the more important one.
00:11:37.540 | And this is one, you know, Karpathy tweeted about, which I think is
00:11:42.040 | interesting, he said, this is the first time that we really have a model that
00:11:48.140 | reasons across voice, text, and vision in a single model, you know, it's processing
00:11:53.820 | all three modalities in a single neural net, right?
00:11:57.540 | And if I just think about how humans operate, we're processing vision.
00:12:02.440 | We're processing audio.
00:12:03.840 | We're processing text and voice all in the same neural net.
00:12:08.180 | And so it feels to me like this, these, the, the path to reasoning, the path
00:12:13.580 | to feeling more human, like much like the GPT moment we had with FSD 12 and
00:12:18.780 | self-driving that you kind of had to throw away, um, you know, how we got
00:12:23.560 | here and you have to focus more on, you know, these, you know, bringing
00:12:28.200 | these modalities together.
00:12:29.760 | So it may be that the benchmarks plateauing, Bill, um, are, are consistent
00:12:35.640 | with, you know, having to attack other ways to get model improvements,
00:12:40.380 | to get performance enhancements.
00:12:42.040 | You may be right.
00:12:43.660 | And just to clarify, I think, you know, this, but just to make sure the audience
00:12:46.940 | does, I was implying, maybe we need a new benchmark to measure, uh, voice,
00:12:51.900 | voice interpretation quality.
00:12:54.360 | So that would be different than these, um, which is obviously
00:12:58.260 | an area that, where they improve.
00:13:00.100 | So you could be right.
00:13:01.060 | I mean, the, the notion of multimodal or mixture of experts or any of
00:13:05.380 | these things that, that have multiple models, um, might, might unlock
00:13:10.240 | different areas of improvement.
00:13:12.100 | Um, it, it does strike me that the, the models and the scores kind of send
00:13:17.480 | everyone running in the same direction.
00:13:19.540 | And, you know, we've got in both of these announcements, you've got
00:13:23.040 | CEOs or, or someone putting up on a screen, you know, context window,
00:13:27.820 | you know, and, uh, 2 million tokens.
00:13:30.640 | Like, like, like, like the general public has any idea what that means.
00:13:34.820 | Um, so, so I, I, it's useful if things move in a different direction,
00:13:41.000 | I think, and things splinter.
00:13:42.340 | Um, one, one big takeaway I had from this is if they're all going at
00:13:48.120 | consumer and then we had this, this announcement that, that Mike Krieger
00:13:52.800 | from Instagram fame is joining Anthropic, like if they're going, and
00:13:57.380 | then some people rumored that some of the executive changes at Amazon or
00:14:01.640 | to get them more engaged on AI, we're just going to have six companies
00:14:06.160 | all running at the same prize.
00:14:08.300 | Um, and it's interesting.
00:14:11.980 | We ended up talking about AI every week, but I don't know what else there
00:14:15.160 | is to talk about, because that's all anyone's talking about.
00:14:18.040 | And so it, it, we're, we're gearing up for a battle royale.
00:14:22.400 | That's, that's what it feels like.
00:14:24.900 | Let's talk about this on, on a couple of dimensions, because first, um, if
00:14:29.700 | it's all about voice and all about assistance and all about consumer.
00:14:32.900 | Then it seems like it all, it has to be all about Apple, right?
00:14:37.080 | I mean, that's what we're all carrying around in our pockets.
00:14:40.380 | You know, it brings us back to the rumors that they're going to announce
00:14:44.140 | a deal with OpenAI at, at their Worldwide Developers Conference in June.
00:14:48.480 | Um, but as I've said many times, like, I, I think a decision to sell the
00:14:53.820 | generative AI search button, um, you know, to OpenAI, in the way that
00:14:59.440 | they're doing with Google search is a far cry from outsourcing, right.
00:15:04.920 | There, their mission to build their own version of her, which I don't
00:15:09.500 | think that Apple can ever outsource.
00:15:11.640 | I think it's too existential to them.
00:15:13.660 | Um, but that, that, that brings us back to kind of like, what
00:15:17.680 | will really make that special.
00:15:19.200 | And it feels to me like what was left out this week, right.
00:15:22.440 | What her lacked was memory and actions, right?
00:15:27.020 | This ideal that you move, remove the final wave of latency on these
00:15:31.260 | devices, um, by putting it on the device, a smaller model, uh, that
00:15:36.500 | can remember things and have actions.
00:15:38.340 | Now, I think the question that Apple's probably struggling with, Bill,
00:15:42.060 | is, in order to do that, even a 7B model, you probably have to have,
00:15:46.460 | um, you know, 75 gigs of, of DRAM on your phone, right.
00:15:52.380 | The device is, you know, and it's got to run everything else on the phone.
00:15:56.280 | And so I think that we're going to see a lot more memory
00:16:00.640 | that gets pushed to the edge.
00:16:02.040 | You're going to see these devices, you know, these models continue to get
00:16:05.520 | smaller, but a question that I have is, can you package the multimodality
00:16:10.620 | that we just saw with, uh, with GPT-4o, can you package
00:16:15.620 | that into a small enough model, a Llama 3 7B, that can actually
00:16:21.180 | sit resident on these devices?
00:16:23.020 | Yeah, I'm sure that's, I'm sure that's a challenge, and
00:16:28.840 | one that will be solved in time.
00:16:30.540 | Like that's not something that's impossible as time moves forward,
00:16:34.620 | but it might be impossible now.
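As rough context for the on-device math being discussed: a model's weight footprint is approximately parameter count times bytes per weight. A quick illustrative sketch (these are back-of-envelope figures, not the hosts' numbers; weights only, so KV cache, activations, and the rest of the phone's working set come on top):

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Rough footprint of the weights alone: params x bytes per weight.

    1e9 params * (bits/8) bytes / 1e9 bytes-per-GB simplifies to
    params_billion * bits / 8.
    """
    return params_billion * bits_per_weight / 8

# A 7B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"7B @ {bits}-bit: {weight_memory_gb(7, bits):.1f} GB")
```

So a 7B model is about 14 GB at fp16 and about 3.5 GB quantized to 4 bits, which is why quantization is central to fitting these models alongside everything else a phone runs.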
00:16:36.180 | Um, it's going to be super interesting to watch them try. One of my
00:16:40.180 | big takeaways from Google I/O was, and we had talked about this a few episodes
00:16:45.380 | back, that Google just has a lot of assets. You know, one of the
00:16:50.500 | problems with Anthropic and OpenAI is they don't have a lot of assets, but
00:16:54.380 | Google has a ton of assets and it really showed up in, in their demos.
00:16:58.900 | They had, you know, they were touching YouTube.
00:17:01.100 | They were touching Gmail.
00:17:02.220 | They were touching docs and sheets.
00:17:03.940 | It was interesting to see docs and sheets come back to the forefront,
00:17:07.540 | which, you know, there is a world.
00:17:10.300 | And I, I won't suggest that this will happen, but there's a world where they
00:17:15.060 | get AI so right that people move to Gmail, move to docs, move to sheets,
00:17:21.500 | move to the Google stack, move to Android because the integration is so amazing.
00:17:27.420 | They're, they're really the only one that has all of that, you know?
00:17:31.180 | There's no doubt, there's no doubt about it.
00:17:33.940 | And, and, and, you know, I was watching Liz Reid.
00:17:36.940 | She was, you know, running, uh, the, the search demo for
00:17:41.860 | the new Gemini-powered search.
00:17:43.220 | I think she runs all search now at Google.
00:17:45.700 | It was doing exactly what we expect it would do, right.
00:17:49.180 | Predictively doing this AI embedded search.
00:17:51.660 | Didn't get rid of the 10 blue links.
00:17:53.500 | Um, you know, if you look at the sell side notes out of banks this morning,
00:17:57.580 | Morgan Stanley and others, they're defending that this will be good
00:18:01.060 | for Google, that they've figured out how to monetize AI search.
00:18:04.420 | Um, and so I, I, you know, clearly Google has woken up.
00:18:08.820 | They begun this, this path towards structural changes.
00:18:12.660 | They talk, talk about it and getting fit.
00:18:15.300 | You know, as I've said, I've gotten, you know, one toe on the bandwagon
00:18:19.340 | there, um, because I do think that they have all these assets
00:18:22.820 | that you're talking about.
00:18:23.900 | Um, but I have to say the company with no assets, as you describe it, uh, you
00:18:29.540 | know, in the case of OpenAI, I would say they have more than none, but
00:18:32.820 | their assets are their people, and they're small and they're
00:18:35.580 | moving fast and they got the funding from Microsoft to do it.
00:18:38.900 | Um, and they obviously have Jensen, you know, delivering them the latest chips.
00:18:43.460 | So they got all the raw ingredients, but they're really pushing Google.
00:18:47.180 | Um, and I thought that Google's version of its assistant that they showed off
00:18:53.180 | this week definitely lagged OpenAI's.
00:18:56.420 | Um, but I also stipulate, I think they're going to be multiple companies.
00:19:00.580 | Like you said, I think Meta is going to get there.
00:19:02.300 | I think Apple's going to get there.
00:19:03.460 | I think open AI is going to get there.
00:19:05.420 | I think that Google will get there.
00:19:07.340 | Yeah.
00:19:08.660 | And obviously, you know, this, but by assets, I meant users, apps, data.
00:19:13.620 | I didn't mean the, the, the, the potential of the individuals.
00:19:17.660 | And there was one thing in the, in the open AI demo that I thought was super
00:19:22.700 | interesting that to me will create tension going forward, they showed off
00:19:27.740 | a desktop version of the app, which I don't think is out yet, but hopefully
00:19:33.020 | it'll be out soon. A Mac app, oddly, a Mac app, since Microsoft put $10 billion in them,
00:19:37.620 | but there might be some interesting reasons why that's true.
00:19:40.980 | Um, but it looked at the other screens you had resident, and
00:19:45.180 | I don't know how it did that.
00:19:46.420 | I don't know if it's just taking a screenshot and it has to be at the
00:19:50.420 | forefront or if it literally could look at any active app you have.
00:19:54.940 | I do not know a lot of questions on that front about how you do that
00:19:58.940 | and what you're allowed to do.
00:20:00.900 | It's interesting, like Google was built and meta was built, you know, in, in a,
00:20:07.140 | in a Microsoft dominated world, you know, then Apple came up, but they were able
00:20:12.740 | to insert themselves in the browser and then create enough power to rise up and
00:20:18.460 | do other things and create moats Google with the Chrome browser, that kind of thing.
00:20:22.660 | Right.
00:20:23.140 | Um, it, it'll be interesting to watch open AI, try and do that.
00:20:27.780 | Um, cause people are aware of what, what the, the possibilities
00:20:32.700 | are, if you get to that place.
00:20:34.180 | If you get to enough breakout power, um, you know, Google got to breakout power,
00:20:39.340 | launched a mobile OS, launched a browser, like, like fought back.
00:20:44.580 | So letting someone, letting someone in bed, you know, as, as a
00:20:49.580 | bit of a parasite is a risk.
00:20:51.940 | Yeah.
00:20:52.900 | And I think what you're saying is if Apple were to allow ChatGPT to embed,
00:20:57.540 | you know, in any way within the iPhone, it's a little bit like AOL, who was
00:21:02.660 | the king at the time, allowing Google to embed, you know, uh, you know, and, and
00:21:07.820 | pull away a lot of search activity and kind of legitimize them, it's then very
00:21:12.140 | hard for AOL to kind of pull back search, you know, when, when they build it for
00:21:16.180 | themselves, is that kind of, or like, or the examples I mentioned, I mean, you
00:21:21.420 | remember when, when iOS took off and really started succeeding and then, and
00:21:27.180 | then Meta came public. Meta's stock went from 40 to 18, it IPO'd at 40, went to
00:21:33.540 | 18, and everyone was worried that Apple just controlled too much of iOS.
00:21:42.180 | And they'd be able to undermine what Meta wanted to do.
00:21:47.020 | And I mean, that's another example.
00:21:48.700 | Now it didn't happen, Meta got their, their stuff together.
00:21:51.460 | Mark says it was because he went public and woke him up and they
00:21:55.020 | got their mobile game together.
00:21:56.300 | And, and, and then they had to fight that war again with, with the ad network
00:22:00.380 | changes and they've worked their way around it, but there is tension between
00:22:05.020 | a very ambitious app on a platform, you know, and that's why I think when you
00:22:12.380 | look at kind of the business model, you need to have to survive, right?
00:22:17.420 | Because the inputs are so costly.
00:22:19.420 | It seems to me that everybody's going to fight for this consumer
00:22:24.380 | landscape and the only right.
00:22:26.620 | We, we have perplexity out there with, I think last we saw on, on Twitter,
00:22:31.460 | something like 20 million in revenues and the, the people who love them
00:22:35.300 | really love perplexity, but the fact of the matter is chat GPT is the
00:22:40.620 | Google verb of the moment of AI.
00:22:43.820 | It does seem to me that they need to, and are focusing more on consumer.
00:22:49.260 | I feel like it's a better path to AGI.
00:22:52.220 | And, you know, maybe just look at this cost per query
00:22:55.420 | of OpenAI's flagship model.
00:22:57.540 | You know, you talk about these pricing cliffs, you know, in the enterprise.
00:23:01.500 | But the price is plummeting in the enterprise.
00:23:05.180 | In fact, I was talking to a large data company this week, who's in the
00:23:08.620 | business of serving models, and they were lamenting the fact that the pricing
00:23:12.740 | umbrella was set by OpenAI and there is no price, right, so that it's a much
00:23:18.100 | lower margin business than the traditional software business.
00:23:21.540 | So in enterprise, it's really kind of hard seeing anybody other than the
00:23:25.860 | big data platforms, right, kind of being in that game.
00:23:30.180 | So if you're going to be Anthropic or OpenAI or Mistral or anybody else, it's
00:23:36.020 | hard to see how you build a business, you know, with payback on the tens of
00:23:41.260 | billions that you're going to be required to invest unless you're in the consumer
00:23:45.740 | game and the only brand right now that's really broken out in the consumer game
00:23:49.580 | is ChatGPT, other than the brands that already have big consumer distribution.
00:23:56.140 | So a couple of reactions to what you just said.
00:23:58.500 | One, you know, this, I'm fascinated by the price, what I call the price cliff,
00:24:04.660 | which is just, you know, if you back off half a version from one of these released models, you
00:24:10.260 | save 90% or 95% of the money; it's like a 20x differential.
00:24:15.700 | And when I talk to our entrepreneurs that are using these models, they might design
00:24:23.380 | with the cutting edge model, but they all back off to the affordable models on
00:24:28.500 | implementation, inference and runtime.
00:24:30.540 | They all do.
00:24:31.260 | There's no one, you know, that hangs around up there.
00:24:34.140 | And so it is interesting that that dynamic happens.
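The arithmetic behind that cliff, with made-up placeholder prices rather than any vendor's actual rates: a roughly 20x price differential and "saving 95% of the money" are the same statement.

```python
# Hypothetical per-token prices, for illustration only -- not real rates.
frontier = 10.00  # $ per 1M tokens for the frontier model
cheaper = 0.50    # $ per 1M tokens one tier down

differential = frontier / cheaper  # 20.0 -> "a 20x differential"
savings = 1 - cheaper / frontier   # 0.95 -> "save 95% of the money"
print(f"{differential:.0f}x price gap = {savings:.0%} saved by backing off")
```

Which is why, as described here, teams prototype on the frontier model but ship inference on the cheaper tier.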
00:24:38.660 | It does raise questions about just how price-competitive, uh, running
00:24:43.780 | AI models as a service is.
00:24:45.740 | I would love one day for you and I to have a episode dedicated to whether,
00:24:50.860 | whether, I don't know whether cloud hosting is a good business or not.
00:24:54.700 | I'd like to find out.
00:24:55.780 | I'd like to talk to some experts, but, but it may be to your point.
00:25:00.580 | It may be that the enterprise side's just, just too competitive, too tough.
00:25:06.460 | You know, we'll see.
00:25:07.580 | I, I, I think OpenAI believes that if they get voice right, if they get data
00:25:12.900 | right, memory right, that people will want to develop apps with all of those
00:25:17.660 | in there and they will get more locked into their platform, that's TBD.
00:25:22.380 | Well, let's see what happens.
00:25:23.980 | Well, there's no doubt that if you build, you know, um, and I think
00:25:29.260 | that's the bet they're making.
00:25:30.300 | And frankly, um, you know, listen, this is like, the reason we talk
00:25:34.860 | about this every week is only three or four weeks ago, Bill, that Llama was
00:25:38.500 | out with its new releases and it broke the internet talking about, you know,
00:25:42.900 | the, the breakthroughs with Llama 7B and, um, and the integration across
00:25:47.700 | the family of apps at Meta, et cetera.
00:25:49.740 | And it seems like this is punch counterpunch, but unlike prior eras,
00:25:54.980 | it's not punch counterpunch with just small, you know, underfunded startups.
00:25:59.180 | This is the largest companies on the planet who are all in, uh, you
00:26:03.500 | know, in this game and, you know, one of the things that I saw a lot of
00:26:08.900 | people talking about this week when Omni and, and, and Gemini upgrades
00:26:13.020 | came out was, who does it kill?
00:26:14.660 | Right.
00:26:15.700 | Like now we're talking about who wins.
00:26:17.780 | Um, but there's a, there's been record funding into AI by venture capital
00:26:23.540 | over the course of the last two years.
00:26:25.660 | We've talked on this pod and, and others about how Inflection, who was building,
00:26:30.540 | Right.
00:26:30.860 | Remember Pi, which stood for personal intelligence, um, was
00:26:35.020 | trying to build a consumer app, built one of the largest H100 clusters
00:26:39.020 | in, in the world, raised a lot of money.
00:26:41.460 | You know, that Mustafa, uh, you know, said this is going to be too hard,
00:26:46.220 | sells the business, uh, to Microsoft.
00:26:49.060 | So Microsoft, you know, he's now running consumer AI at Microsoft, you know, by
00:26:53.860 | the way, I'm not really sure yet what consumer AI at Microsoft is, because
00:26:58.260 | I'm not sure what the brand is, um, you know, is it going to be Inflection?
00:27:01.980 | Is it going to be Bing, is it going to be this Copilot? But they need, you
00:27:06.220 | know, they need to sort that out.
00:27:07.660 | I'm sure Mustafa, uh, you know, is going to work hard on that, but you know, so
00:27:12.380 | Inflection's out of the game. Character, which we heard a lot about, raised a lot
00:27:15.820 | of money. You know, I, I was watching OpenAI's demo this week and I said to
00:27:21.020 | myself, that's kind of what I thought Character was going to be, right?
00:27:25.100 | They raised a lot of money.
00:27:26.300 | So where is Character in all of this?
00:27:28.860 | And then, you know, Sam said in this Stanford interview a couple of weeks
00:27:32.460 | ago, Bill, that if you're a GPT wrapper, right, then, um, you better go find a
00:27:38.060 | new line of work, because they're going to continue expanding the
00:27:41.580 | concentric rings, and all the wrappers, you know, even if they continue to
00:27:45.180 | survive, the value they'll be able to extract out of the ecosystem will
00:27:49.100 | get smaller and smaller.
00:27:50.380 | Um, so as you think about, you know, what you saw this week, I saw somebody
00:27:55.220 | tweet, uh, I'll leave it here.
00:27:56.700 | He said, startups dead: sentiment analysis, live meeting assistants,
00:28:00.820 | translation apps, language learning apps, music/singing generation,
00:28:05.460 | tutoring apps, et cetera.
00:28:06.860 | Um, uh, is that what this all comes down to?
00:28:10.580 | This is going to be winner take most, and they're going to eat a lot
00:28:13.500 | of that app ecosystem as well.
00:28:15.220 | Uh, it's, it's a big question.
00:28:17.300 | You know, I, we, we talked about education. You know, uh, Chegg, which is
00:28:23.100 | a company that was, or is, it still exists, um, heavily focused on helping
00:28:28.940 | college students make their way through their homework. You know, it got hit
00:28:33.180 | early, right when these models started coming in.
00:28:36.340 | And so people do ask the question, what's next?
00:28:39.420 | A lot of the demos that OpenAI did were language-related.
00:28:42.940 | They had two of them that, that had, you know, one of them was a live
00:28:47.140 | translation where you just put the phone out there: I'll speak my language,
00:28:51.220 | you speak yours, and it interpreted the voice,
00:28:55.940 | recognized the language, knew the language of the other person,
00:28:59.180 | and, and just was a translator.
00:29:00.900 | And I, I took the phone out that night, and, you know, thinking about Her and
00:29:07.100 | thinking about language tutoring and companies like Duolingo, I just said,
00:29:11.340 | Hey, I'm a beginner Spanish student.
00:29:14.740 | Could you, could we do some lessons?
00:29:16.860 | And it just jumped right in, just jumped right in.
00:29:19.700 | Like, wow, sure.
00:29:20.940 | And, and we did the first lesson, and it said, is that level easier or too
00:29:27.300 | hard?
00:29:27.660 | I said, let's go up a level, and it just went up a level. And I
00:29:32.020 | don't, I don't think anyone coded the language tutor into this thing, right?
00:29:37.820 | It was, that was all emergent.
00:29:39.420 | So it's pretty powerful.
00:29:40.620 | I mean, I think the day of the demo, Duolingo stock was down,
00:29:47.020 | you know, four or five percent. You know, so again, I think, you know, one of the things you
00:29:53.300 | see, and we'll get to this later when we talk about kind of markets and multiples
00:29:57.780 | and valuations, but a valuation is a discounted, you know, cash flow. It's the
00:30:06.020 | expectation about future cash flows.
00:30:07.900 | Right.
00:30:08.260 | And so the one thing that happens when you have this level of disruption at this scale
00:30:12.940 | for so many businesses, the multiple you apply to those future cash flows just
00:30:17.780 | needs to be lower.
00:30:18.780 | Your margin of safety needs to be higher.
00:30:21.420 | So stocks come down, not because people are like, oh my God, this is going to take
00:30:25.860 | it out tomorrow.
00:30:26.860 | But the fact of the matter is Duolingo is valued on its next 10 years of cash flows.
00:30:31.940 | And people are simply saying, I have a little less visibility.
00:30:35.180 | This makes me a little bit more scared that those subscription
00:30:39.660 | revenues that are going to Duolingo aren't going to be there, or at least the
00:30:45.020 | level of confidence I had yesterday, right, is lower today, based on these things
00:30:49.980 | that I'm seeing.
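Brad's repricing logic can be made concrete with a toy discounted cash flow. All numbers below are illustrative assumptions, not Duolingo's actuals: the projected cash flows are identical in both cases, and only the discount rate, the margin of safety, changes.

```python
# Illustrative DCF sketch (hypothetical numbers, no real company's):
# identical projected cash flows, but less confidence in them means a
# higher discount rate, and therefore a lower present value and multiple.

def present_value(cash_flows, discount_rate):
    """Discount a list of annual cash flows back to today."""
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

# Ten years of cash flows starting at 100, growing 10% a year (assumed).
flows = [100 * 1.10 ** t for t in range(10)]

confident = present_value(flows, 0.08)  # high visibility into the future
worried = present_value(flows, 0.14)    # same flows, bigger margin of safety

print(round(confident), round(worried))  # the 14% case is worth much less
```

The cash flows never change in this sketch; only the confidence in them does, which is exactly the mechanism described here for stocks repricing before any revenue is actually lost.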
00:30:50.700 | And that reminds me of two, two things.
00:30:52.980 | So one, um, I've talked in the past about my good friend, Michael Mauboussin.
00:30:56.780 | He wrote a paper 25 or 30 years ago about what he calls CAP, competitive
00:31:02.100 | advantage period.
00:31:03.180 | And that competitive advantage period is how many years of cash flows into the
00:31:09.500 | future are, are embedded in this company's stock price.
00:31:13.180 | And it's a measure of, of how durable Wall Street as a whole
00:31:21.500 | believes your business model is.
00:31:23.020 | It's almost a moat measurement.
00:31:24.340 | And so you're absolutely right.
00:31:26.540 | If, if there's a question about your strategic, um, value, your multiple can
00:31:32.980 | come in big time.
00:31:33.900 | And that reminds me of the second thing I was going to say, which was a famous quote
00:31:37.180 | from a poker game long ago, uh, multiple compression is a, it's not fun.
00:31:43.940 | Uh, no, no, no doubt about it.
00:31:46.620 | I, you know, to say it in another way and to give an example, people often ask me,
00:31:51.500 | they're like, you know, Coca-Cola may have low single digit growth and trade at a
00:31:56.820 | higher multiple than a tech company
00:31:59.780 | that's also profitable and has a high rate of growth.
00:32:02.740 | And they're like, how could that possibly be?
00:32:05.060 | You know, doesn't the market, isn't the market mispricing those?
00:32:07.660 | I'm like, no, because, you know, back to CAP, the market is saying that Coca-Cola is
00:32:12.580 | going to have those free cash flows, you know, forever.
00:32:15.500 | And what they're saying about this tech company is it may have them
00:32:19.180 | forever, or it may not.
00:32:20.780 | I'm just applying a much higher discount to that probability.
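The Coca-Cola versus tech-company comparison can be expressed with the CAP idea in a few lines. The growth rates, discount rate, and credited years below are invented for illustration: cap how many years of cash flow the market is willing to credit, and a slow grower with a long CAP can out-value a fast grower with a short one.

```python
# Toy CAP (competitive advantage period) sketch with invented numbers:
# the market only credits `cap_years` of future cash flows.

def value_with_cap(cash_flow, growth, rate, cap_years):
    """Present value of growing cash flows, credited for cap_years only."""
    return sum(cash_flow * (1 + growth) ** t / (1 + rate) ** (t + 1)
               for t in range(cap_years))

# Durable low-growth business: 2% growth, 30 years of credited flows.
durable = value_with_cap(100, 0.02, 0.08, 30)
# Faster grower whose durability is doubted: 15% growth, only 8 years.
fragile = value_with_cap(100, 0.15, 0.08, 8)

print(round(durable), round(fragile))  # the durable business is worth more
```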
00:32:24.580 | And you and I have lived through this before.
00:32:26.940 | I remember vertical search engines when, you know, Google started pushing deeper
00:32:31.820 | into the verticals, right?
00:32:33.700 | All these companies that people love, like TripAdvisor, that absolutely got
00:32:37.940 | eviscerated because one day they showed up and Google was showing basically the
00:32:43.260 | content that previously like they had dominated.
00:32:45.940 | And so as a firm, as an investment firm, as you know, that on the public side, you
00:32:51.540 | know, we look at who's going to be disrupted.
00:32:53.900 | I think, you know, this is the list of companies that we have that are
00:32:58.580 | potentially disrupted is as long as it's ever been, you know, we're coming out of
00:33:03.060 | this period of stasis, um, you know, where things didn't really change that
00:33:07.420 | much in consumer land, right?
00:33:09.300 | And so you, you know, not since mobile, not since mobile.
00:33:13.060 | Yeah.
00:33:13.340 | And that was, you know, 10 years ago.
00:33:15.260 | Um, one, one, one little piece of advice I'd give people that are thinking about
00:33:19.660 | this, you know, I've said from the beginning, like LLM stands for large
00:33:24.580 | language models, so things that involve language, um, are, are what this thing
00:33:30.300 | was designed for and what it's really great at.
00:33:33.180 | Uh, really great.
00:33:34.060 | And, and also reiterating something we said before that code is a subset of
00:33:39.820 | language that's actually more structured and therefore it's even better at that.
00:33:43.940 | I thought my, my very favorite demo from, from both of these things was the one you
00:33:49.100 | showed in the video where someone just has their camera on a piece of code and
00:33:54.140 | ask for analysis of what's going on there.
00:33:56.740 | That is powerful.
00:33:58.820 | Like that is useful.
00:34:00.380 | That is high utility stuff.
00:34:02.740 | Um, and we'll help people be more productive for sure.
00:34:06.220 | Well, one of the, one of the areas, you know, so I think what you were saying is
00:34:10.300 | if you're a lang, if you're in the business of language, like Duolingo is,
00:34:13.420 | then you kind of go, you, you move from the confident category to the less
00:34:17.420 | confident category, um, because people start wondering about this disruption.
00:34:21.540 | I'll tell you a whole nother category of companies, Bill, um, that have moved into
00:34:25.740 | that bucket and it's all these BPO companies, business process, outsourcing
00:34:29.700 | businesses, right?
00:34:31.140 | So, you know, because remember, it's not just about language now, right?
00:34:35.660 | BPO companies woke up yesterday and they're like, Oh my God, this is about
00:34:39.420 | voice, this is about call centers.
00:34:41.220 | This is about video in, this is about, you know, and so all of those modalities
00:34:46.260 | now call into question, uh, the things that BPO companies do very well.
00:34:50.940 | And so that's the type of disruption that I think is going to be rolling thunder.
00:34:54.860 | You, you, you, you can't just look at where we are today.
00:34:58.220 | You have to look at the rate of innovation.
00:35:00.340 | The rate of innovation is the only thing that matters here.
00:35:02.900 | And I've never seen rate of innovation at this pace.
00:35:05.980 | And so I think that, um, the surface area of disruption, the companies that are
00:35:11.100 | going to get disrupted potentially by this and the market will discount it
00:35:14.380 | before it actually happens.
00:35:15.700 | Um, but there's a long, long list of those companies.
00:35:18.580 | And I, I would just qualify one thing you just said.
00:35:22.020 | I think with language and code, this is already, you know, 10 out of 10 high alert.
00:35:29.780 | I think on business process automation, we think it's coming.
00:35:34.060 | I don't, I don't think it's at 10. You know, maybe the warning signal's at six.
00:35:38.300 | Like, it hasn't been proven
00:35:40.140 | it's going to run it over yet.
00:35:41.380 | And there's still some experimentation.
00:35:44.460 | Um, one other thing I want to hit on this topic, um, you know, coming out of this
00:35:49.380 | week on AI before, before we move on is all about Washington, right.
00:35:53.820 | And there are two, two things.
00:35:55.660 | I think that we have additional information on as it pertains to Washington.
00:35:59.300 | One is on the regulation side.
00:36:00.780 | The other one is on the investment side on the regulation side.
00:36:04.140 | Um, I heard a good discussion with, with, with our friends over at All-In and, and
00:36:08.260 | Sam on, you know, this idea we should regulate outputs instead of regulating
00:36:13.460 | inputs, right?
00:36:15.140 | The, the idea that don't, they shouldn't be in there mucking around
00:36:18.740 | with weights and everything else.
00:36:20.220 | But like a plane, you know, if you have a frontier level model, you
00:36:24.380 | should have to show it to them.
00:36:25.420 | They get to run it and they get to decide, does this meet whatever
00:36:28.700 | safety standards. Um, when I was listening to that, and they seemed to
00:36:32.380 | kind of galvanize around this regulate-outputs regime, I was thinking about you.
00:36:37.540 | Does that really cross the line for you?
00:36:40.340 | Like is, is regulate outputs a good enough standard for you?
00:36:43.860 | And then how do you really implement that?
00:36:46.540 | Yeah.
00:36:48.020 | I mean, it, it kind of depends, right.
00:36:50.260 | And how it's implemented.
00:36:51.220 | I think, you know, David Sacks had said there are laws already.
00:36:54.660 | So if you commit bank fraud with an AI model, you've committed bank fraud.
00:36:59.780 | You don't need an AI bank fraud law.
00:37:01.860 | The tool doesn't matter if you use a gun or an AI model.
00:37:06.660 | Right.
00:37:06.940 | Um, and so I, I do think that's a, the type of attitude that can cause
00:37:12.340 | everyone to take a deep breath, because I think the odds that, that any
00:37:17.100 | government in any country could get in this early and start messing around
00:37:20.980 | with inputs and be effective is near zero.
00:37:24.060 | And, you know, I think Yann also said, you know, these models have been out
00:37:28.460 | here for a couple of years. Like, where are all the crimes? You know, and it's
00:37:33.340 | another, just take a deep breath moment.
00:37:35.260 | Um, it, it turns out early this morning, there was a large 20-page document
00:37:41.340 | dropped by Schumer in the Senate, which I guess some people were waiting on.
00:37:45.180 | And I read through it.
00:37:47.420 | Um, it wasn't that long.
00:37:48.900 | And if you, if, if you're up for it, I'll tell you what's in there.
00:37:53.140 | Um, no, I think that to me is really interesting because that's less about
00:37:58.220 | regulation and I think more about, you know, uh, you know, government now
00:38:03.340 | deciding they want to invest a bunch of money in AI.
00:38:05.460 | So why don't you give us the breakdown?
00:38:06.940 | Yeah.
00:38:07.180 | I mean, that was my big headline surprise.
00:38:09.860 | So, well, actually, I'll just read you that. It was broken into pieces:
00:38:14.940 | supporting U.S. innovation, AI in the workforce, high-impact uses of AI,
00:38:20.300 | elections and democracy, which was only two paragraphs, privacy and liability,
00:38:25.100 | transparency, explainability, and intellectual property and safeguarding
00:38:29.860 | against AI risk and national security.
00:38:32.260 | So those were the things it was very high level.
00:38:34.780 | It didn't recommend any kind of legislation at this point.
00:38:39.580 | The big shocker for me was it recommended spending $32 billion a year.
00:38:44.860 | And I didn't, I didn't even know that was on the table.
00:38:47.860 | I thought we were trying to manage risk.
00:38:50.180 | I didn't know that, uh, that money needed to be handed out certainly in my lifetime.
00:38:55.420 | And I think prior to my lifetime, there's never been a venture capital
00:38:58.700 | category that's attracted more money.
00:39:01.300 | So the, the, the irony that we would even need even more
00:39:05.380 | money is, uh, is, is laughable to me.
00:39:08.860 | Um, now it turns out if you dig into it, some of the money's more DARPA
00:39:13.100 | like, like they want it to drip into the DOD.
00:39:16.580 | They want it to drip into the military.
00:39:18.300 | They want it to drip into academia.
00:39:20.220 | But even with that said, let me put this in perspective.
00:39:24.180 | The NIH is 40 billion, a little over 40 billion.
00:39:27.300 | They'd like it to be 50.
00:39:28.460 | The National Science Foundation is nine, down from 10.
00:39:32.580 | So that's 49 combined. You're going to start the AI foundation or, or, or federation or
00:39:40.180 | whatever institute it is at 32 billion a year. You're going to put it right
00:39:44.940 | below the Institutes of Health and, and 4x the National Science Foundation.
00:39:50.300 | It just seems a little, a little overkill.
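The comparison Bill is making is simple arithmetic on the round figures cited in the conversation (billions of dollars per year, as stated here, not official budget data):

```python
# Round figures as cited in the conversation, in $B per year.
nih = 40          # National Institutes of Health
nsf = 9           # National Science Foundation
ai_proposal = 32  # proposed AI spend in the Schumer document

print(ai_proposal / nsf)  # roughly 3.6x the NSF budget
print(nih - ai_proposal)  # only 8 below the NIH
print(nih + nsf)          # NIH plus NSF combined is 49
```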
00:39:52.380 | You said that they were wanting to drip money into these things.
00:39:55.460 | Let me just remind you, Washington never drips anything.
00:39:59.700 | It is a flood.
00:40:01.340 | It is a fire hose.
00:40:02.700 | And I just think at a point where we're $38 trillion in debt, where we, you
00:40:07.700 | know, now we're moving toward a trillion dollars in interest payments.
00:40:11.100 | You know, again, when we think about trade offs, there are certain places
00:40:14.780 | we actually need money and there are other places that we don't.
00:40:17.460 | Um, but it's like right now, um, it certainly feels like, you know, we're
00:40:23.460 | presented with a long menu of, of things that you can potentially order at the
00:40:27.780 | restaurant and they're just choosing all of them and, um, yeah, yeah, yeah.
00:40:32.620 | And, and take like the military, like I've never seen more venture capital
00:40:37.100 | excitement about the military ever.
00:40:40.580 | Like, so I just wish they'd take a deep breath and gauge what's
00:40:45.460 | going on in the private markets.
00:40:47.100 | We're awash in innovation and speculation and, and funding.
00:40:52.180 | And yeah, I don't think that's needed.
00:40:54.500 | Let me hit on three other things that are in there.
00:40:56.700 | I'm kind of interested.
00:40:57.620 | They, they referred to some AI systems as, quote, black boxes, and said this may
00:41:03.860 | raise questions about whether companies with such systems are appropriately
00:41:07.660 | abiding by the laws.
00:41:09.060 | This is a bit of a turnabout because, you know, it's my belief based on
00:41:13.380 | everything I've read and seen that the, that the, the proprietary model companies
00:41:17.740 | were in there begging for regulation and urging the open source to be cut off at
00:41:23.860 | the knees, this is, this looks like it rebounded in their faces and they're the
00:41:28.500 | ones that, and, and the, the academicians I've talked to have made this point that
00:41:33.060 | the, the open source ones are visible and transparent and these others aren't.
00:41:37.180 | So that was kind of interesting to see that in there.
00:41:39.140 | The second thing they talked about using AI to improve efficiency in government.
00:41:44.460 | The reason I chuckle is I think AI could be amazing at improving
00:41:51.380 | efficiency of government.
00:41:52.580 | I put a very low probability on them actually leaning into
00:41:57.300 | that, you know, Malay style.
00:41:59.100 | Perhaps, you know, 15% of the U S workers, 45 million are government workers.
00:42:04.180 | There is unbelievable amount of opportunity.
00:42:06.860 | But I just, I really doubt they'll lean into it.
00:42:11.020 | And then, and then the last thing, and I think this is a huge positive, although
00:42:15.700 | I don't want to overstate it.
00:42:16.860 | There was a quote that related to improving immigration
00:42:20.860 | for high skilled STEM workers.
00:42:22.300 | Now, all it said was the relative committees to consider legislation to improve.
00:42:28.260 | So, but it's nice to see that in there.
00:42:30.780 | That is something I think everyone in Silicon Valley has been rooting for.
00:42:35.060 | No doubt here, here.
00:42:36.460 | And I, and I will say, I was out there a couple of weeks ago.
00:42:39.980 | An event you know, great event called the Hill Valley event.
00:42:45.300 | And there's never been more engagement between Silicon Valley and Washington, DC.
00:42:50.980 | And, you know, I think it's proactive engagement on
00:42:54.340 | the issues that matter most.
00:42:55.820 | I hope it doesn't result in a flood of money.
00:42:58.940 | Because I think that the private markets are better at allocating those dollars
00:43:03.340 | and a new shift into industrial policy by, by the United States
00:43:08.500 | where government is picking winners and losers is a bad idea.
00:43:11.100 | I don't think that's where we're headed.
00:43:12.700 | I think Schumer and, and his colleagues on both sides of
00:43:18.100 | the aisle are thinking smart about this.
00:43:20.980 | Jay Obernolte, who's leading the task force in the house,
00:43:24.580 | thinking smart about this.
00:43:25.980 | So I left very optimistic about how I'm feeling about Washington.
00:43:31.860 | How they're seeing it.
00:43:32.980 | I don't think we're going to get a lot of regulation.
00:43:34.940 | I would be surprised if we see a lot of incremental dollars go into this.
00:43:39.220 | And, you know, from your mouth to God's ears on immigration
00:43:43.420 | and, and government efficiency.
00:43:44.860 | But while we're talking about Washington, I saw another tweet out of you this
00:43:49.100 | week that, that, that caught my eye.
00:43:50.860 | So I want to talk about it because there was some other news.
00:43:53.940 | And, you know, Biden just a couple of days, days ago tweeted, "I just
00:43:58.580 | imposed a series of tariffs on goods made in China, 25% on steel and aluminum,
00:44:03.980 | 50% on semiconductors, 100% on EVs, 50% on solar panels," and, you know,
00:44:11.740 | quickly Twitter blew up because people posted his tweet from the then Senator
00:44:16.260 | Biden, when Trump issued, you know, tariffs on China, where Biden apparently
00:44:22.220 | tweeted, "Trump doesn't get the basics.
00:44:24.540 | He thinks the tariffs are being paid by China.
00:44:26.860 | Any freshman econ student like my son could tell you that the American
00:44:30.860 | people are paying for the tariffs.
00:44:32.940 | The cashiers at Target see what is going on.
00:44:36.740 | They know more about economics than Trump."
00:44:39.460 | So has something changed here, Bill?
00:44:42.820 | Like, are the, do we have valid national security reasons today that we didn't
00:44:47.540 | have, you know, five or six years ago when it comes to imposing these obvious
00:44:52.460 | economic costs on American consumers?
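The economics in the Senator-Biden quote can be sketched in a couple of lines. This assumes full pass-through of the tariff to the buyer, which is a simplification (in practice the incidence is shared between exporter and importer), and the export price is hypothetical:

```python
# Simplified tariff incidence: assume full pass-through to the U.S. buyer.
# Export price is hypothetical; real-world incidence is shared in practice.

def landed_price(export_price, tariff_rate):
    """Price the importer pays once the ad valorem tariff is applied."""
    return export_price * (1 + tariff_rate)

ev_export_price = 30_000  # hypothetical Chinese EV export price, USD

print(landed_price(ev_export_price, 1.00))  # 100% EV tariff doubles the price
print(landed_price(ev_export_price, 0.50))  # 50% tariff adds half again
```

Under this assumption, the tariff shows up in the price the American buyer pays, not in a bill sent to the exporter, which is the "freshman econ" point both speakers return to.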
00:44:56.460 | Well, look, I mean, I totally agree with Senator Joe Biden, who apparently is a
00:45:02.260 | different human than President Biden.
00:45:04.460 | And, and I don't know, I really don't know how the media doesn't contrast
00:45:10.260 | those two things and make a big deal about it.
00:45:12.460 | You know, my, my tweet was just highlighting that he said, well, he said
00:45:18.620 | two things, but he said that, that I'm determined to ensure America leads the
00:45:24.620 | world in these categories, giving someone a, a headstart doesn't make them faster
00:45:31.020 | at the a hundred yard dash.
00:45:32.660 | It makes them slower.
00:45:33.980 | It makes them less competitive.
00:45:35.620 | Someone, um, replied to my tweet and said that Germany was over in
00:45:42.940 | China and used the word fitness, that it makes them more fit to have to be
00:45:48.540 | engaged in the competition on the field.
00:45:51.100 | And I mean, not only, not only is there that element, like you want to be
00:45:56.940 | competitive, right?
00:45:57.980 | But, but the other element is, you know, this is classic, you know, freshman
00:46:03.220 | economic, if someone's better at doing something than us, we should let them do
00:46:07.380 | it and we should buy it from them and we should send them what we're good at.
00:46:10.220 | Like, this is just, you know, de-globalization, highly inflationary,
00:46:16.100 | and we'll make our companies less competitive, not more.
00:46:19.540 | And, you know, one thing I'd encourage
00:46:23.140 | everyone who's interested in this topic to do: just go on YouTube and look at
00:46:27.820 | videos that were made at the recent China auto show.
00:46:31.220 | And these companies are creating more competitive products, not just on AV, not
00:46:39.180 | just on landed price point, but even on features consumers care about: much bigger
00:46:44.460 | screens, more interesting features you've never seen before.
00:46:48.140 | It's, it's abundantly clear that innovation is most alive and most well in
00:46:55.300 | the China auto industry, and putting up the, these tariffs to help protect
00:47:02.780 | our companies won't make them stronger.
00:47:05.260 | It'll make them weaker by not exposing them to the reality on the field.
00:47:08.660 | It also didn't surprise me that, uh, that Biden was surrounded by a bunch of
00:47:13.900 | people in union church as he, as he gave this, as he announced this, by the way,
00:47:19.140 | one little, sorry to go on and on one little thing that I hate about the Biden
00:47:24.860 | tweets, and I don't think he's doing them.
00:47:27.220 | I think someone else is, he uses the word I all the time.
00:47:31.620 | Like what, what grade in school are you told?
00:47:35.620 | If you work with a team, you should say we, instead of I like it's so easy.
00:47:41.180 | Like, this is just amateurish to use that, that, that pronoun there.
00:47:45.740 | Well, I shouldn't be doing that.
00:47:47.140 | What, what, what, what, one of the things that you and I talked about before we
00:47:50.220 | launched the pod, we're not going to spend a lot of time on politics, but the
00:47:53.220 | intersection, uh, the intersection of the political stuff with U.S. free trade
00:47:58.860 | policy, U.S. industrial policy, with AI regulation, you know, and innovation goes
00:48:04.740 | to the very heart of what makes us competitive economically.
00:48:08.700 | And we are focused on what makes us competitive economically.
00:48:12.580 | And, and when I look at this, what I worry about, Bill, is for 20 years.
00:48:17.260 | I mean, I think a lot of people forget the late seventies where we had a lot of
00:48:20.940 | protectionism in the United States, but we also had double digit interest rates
00:48:25.940 | and double digit inflation.
00:48:27.380 | The U.S. economy was losing its way.
00:48:29.940 | Um, you know, uh, the Japanese automakers were ascendant.
00:48:33.940 | The European economies were stronger than the U.S. economy.
00:48:37.940 | And then we had basically 30 years of an unabated, uh, move toward free trade
00:48:43.700 | around the world, and the U.S. led the way for global free trade.
00:48:48.020 | Right.
00:48:48.620 | And of course, one of the consequences we know of free trade is
00:48:52.860 | that it leads to dislocation.
00:48:54.340 | There are, there are people in the United States who get dislocated because it is
00:48:59.940 | cheaper to produce, you know, things.
00:49:01.900 | And so we had NAFTA and we had, you know, these trade wars, you know, that we
00:49:06.380 | resolved over those years, but we always seem to resolve them in the favor of free
00:49:10.620 | trade.
00:49:11.140 | And I just wonder if we're entering this new era, you know, a new era of the end
00:49:17.020 | of free trade, more de-globalization, the rise of U.S. industrial policy,
00:49:21.740 | where we're picking winners.
00:49:23.380 | Uh, what scares me is that seems to have supporters, not only on the
00:49:28.140 | democratic side of the aisle, but it seems to have a bunch of supporters on
00:49:31.900 | the Republican side of the aisle, including, uh, you know, president Trump
00:49:35.820 | and including a lot of people in Silicon Valley who on all other issues seem to
00:49:40.180 | be free marketers.
00:49:41.340 | But when it comes to these two issues, particularly as it pertains to China,
00:49:45.740 | they're quick to jump on the bandwagon.
00:49:47.980 | They say level the playing field with China, but it doesn't feel like that to
00:49:52.060 | me. It feels like we're the ones who are kind of leading the way on, uh, on
00:49:56.460 | de-globalization.
00:49:57.740 | Any, any big picture thoughts on that?
00:49:59.700 | I mean, it feels to me like it will make us less productive.
00:50:03.060 | I'm glad you brought that up.
00:50:04.940 | I mean, that is that, that is the primary reason that this stuff matters to me.
00:50:09.140 | And I, you know, in my, in the speech I did on regulatory capture, I
00:50:13.300 | mentioned these two Matt Ridley books, How Innovation Works and The Rational
00:50:17.060 | Optimist. And, and The Rational Optimist walks through history and shows that
00:50:22.860 | rises in, in prosperity and standard of living are always tied to free trade and
00:50:30.060 | the free exchange of ideas.
00:50:31.580 | And if we start untangling those and moving away from those, and other
00:50:36.740 | countries have done this in the past, China once led as a global nation, pulled
00:50:41.860 | up the walls and fell precipitously, you know. And yeah, I, I think the whole
00:50:47.940 | vilification of China thing is misplaced, personally.
00:50:50.940 | Um, but I also, you know, when I think about, um, human prosperity, I think
00:50:57.420 | we're, I think the world is small enough now where you have to think about that
00:51:01.020 | on a global basis and it's unclear to me why, you know, a worker in Iowa needs
00:51:08.340 | $40 an hour if there's a worker in Mexico that's willing to bust his ass for 15.
00:51:15.540 | Like, I don't necessarily understand that being a, uh, a higher moral ground.
00:51:21.580 | Well, it certainly seems to me, uh, politically, I understand why it exists.
00:51:26.420 | Um, and again, like these disruptions are hard to deal with, and we're going to
00:51:31.420 | have a lot of disruptions that come from AI.
00:51:33.540 | You're going to have a lot of people looking for more protectionism.
00:51:36.380 | A lot of people looking, uh, for, uh, an end to free trade.
00:51:40.740 | Um, I think what's made us great as a country is we've resisted those things
00:51:44.700 | for the last three decades.
00:51:45.940 | We need to resist them into the future.
00:51:47.660 | Back to what you said, the greatest path forward is to be the
00:51:51.500 | world leaders in innovation.
00:51:54.100 | And that requires fitness with competing against the best, starting at the same
00:51:58.340 | place on the starting line, not having unfair government imposed advantages.
00:52:02.540 | And so, and by the way, like, like one thing that people, I don't know, I think
00:52:10.140 | everybody that listens to this podcast probably is aware of this, but you know,
00:52:13.860 | there've been quotes from, I think Mike Moritz at Sequoia, where he's like, I've
00:52:17.940 | been hanging out in China and these people are amazing.
00:52:21.820 | They're hardworking.
00:52:22.860 | They're smart.
00:52:23.620 | And I, and I would say that's true.
00:52:25.340 | Like it, like the time I've spent over there, I've adored, um, the people are
00:52:30.020 | wonderful.
00:52:30.740 | They're certainly as smart and entrepreneurial as anyone I've met here.
00:52:35.180 | And so there's not, you know, I think hanging our hat on the fact that we're
00:52:40.420 | losing because their government's supporting it, man, it, it, maybe it's
00:52:46.540 | because they're really good.
00:52:47.660 | You know, they're really, really good at what they do.
00:52:50.500 | Well, it just, it just seems to me that we've won with China over the last two
00:52:55.020 | decades.
00:52:55.580 | Okay.
00:52:56.340 | I'm the first to say, if they're engaged in nefarious tactics, we ought to meet
00:53:00.140 | them where they are.
00:53:01.060 | Um, but I think the middle ground here, we may have swung too far and the middle
00:53:06.140 | ground, uh, should be the continuation of free trade and finding ways to work
00:53:10.420 | together.
00:53:10.820 | That's how you prevent wars and that's how you move humanity forward.
00:53:14.260 | Let, let's, let's wrap up with a, with, with a public market check.
00:53:19.100 | What's been on your mind lately, Brad, since we talked last.
00:53:22.780 | Well, um, you know, maybe we start just with the news out this morning because
00:53:27.220 | it's a segue, uh, or a bridge back to where we talked about before, which was
00:53:31.780 | what's happening with inflation and rates.
00:53:34.060 | So we had a CPI print out this morning, Bill. CNBC said, you know, the CPI print
00:53:39.060 | came in quite dovish: 20 basis points of
00:53:41.660 | sequential cooling, uh, in the rate of change, the largest in several
00:53:46.980 | months.
00:53:47.300 | Retail sales came in pretty weak; sales ex-auto and gas came in pretty weak.
00:53:52.060 | Um, so you still have inflation, uh, adjusted rents coming in pretty hot.
00:53:57.820 | Um, so CPI gave us a little bit of a break.
00:54:00.580 | I think the 10-year is down to 4.35 this morning.
00:54:03.980 | In fact, you can look at this chart here that we have that shows the
00:54:07.860 | restrictiveness.
00:54:09.140 | Um, so where are interest rates compared to inflation?
00:54:12.500 | And, you know, we're really, you know, crossing this line right now.
00:54:16.500 | So the yellow line is the proxy fed funds rate.
00:54:19.300 | So this is like what the San Francisco Fed, um, calculates as the actual interest
00:54:25.500 | rate paid by, uh, Americans.
00:54:27.860 | The black line is the 10-year.
00:54:29.780 | So you can see that the proxy fed funds rate went above inflation earlier this
00:54:35.940 | year, and the actual interest rate is going to go above the rate of inflation.
00:54:39.420 | The forecast we have in here is simply, uh, the consensus Morgan Stanley
00:54:43.660 | Goldman Sachs forecast, uh, for the balance of the year, which
00:54:46.540 | has been pretty accurate.
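The "restrictiveness" framing described here, that policy only bites once the interest rate sits above the rate of inflation, can be sketched with a couple of toy helpers. The function names and the figures below are illustrative, not the chart's actual data or methodology:

```python
# Toy sketch of the restrictiveness idea: policy is "restrictive"
# once the nominal rate sits above the inflation rate, i.e. the
# real rate (nominal minus inflation) turns positive.
# All figures here are illustrative, not the actual chart data.

def real_rate(nominal_pct: float, inflation_pct: float) -> float:
    """Real rate as a simple spread: nominal rate minus inflation rate."""
    return nominal_pct - inflation_pct

def is_restrictive(nominal_pct: float, inflation_pct: float) -> bool:
    """True once the real rate crosses above zero."""
    return real_rate(nominal_pct, inflation_pct) > 0

# Illustrative: a 5.4% short rate against roughly 3.4% CPI inflation
print(round(real_rate(5.4, 3.4), 2))   # 2.0 points of real yield
print(is_restrictive(5.4, 3.4))        # True
```

This is the crossing the chart is tracking: as inflation falls while nominal rates hold, the real rate rises and policy gets more restrictive without the Fed moving at all.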
00:54:47.700 | So what does this, you know, tell me? The backdrop around inflation
00:54:53.180 | continues to be constructive.
00:54:55.340 | We had a couple months in there where I think people got scared.
00:54:57.980 | Larry Summers said, maybe the next rate move is up.
00:55:01.500 | Why does that matter so much bill?
00:55:03.740 | Because if people can earn, you know, just listen to the Berkshire
00:55:07.140 | Hathaway annual meeting, you know, Warren Buffett says, listen, I'm
00:55:10.900 | collecting my, my interest payments every week.
00:55:13.620 | If I can earn 5.4% taking no risk, why would I take risks?
00:55:17.300 | So, you know, I still am in the camp that, um, you know, we're on a glide path.
00:55:22.500 | It's going to take a little bit longer.
00:55:24.260 | Um, but I do suspect that rates are going to come down.
00:55:27.500 | Jay Powell said yesterday, we're on hold for a couple more months.
00:55:30.580 | I think you're going to get a rate cut before the election, but I don't think
00:55:34.220 | the market actually needs a rate cut bill.
00:55:36.340 | What they need to know is that inflation is coming down and the fed can give
00:55:40.660 | us a rate cut if they want to, right.
00:55:42.860 | That's the important point.
00:55:44.180 | And so I think that backdrop was important this morning, but I
00:55:47.220 | juxtapose that, Bill, with, um, what we've seen in this earnings season.
00:55:52.620 | So we're almost through the earnings season.
00:55:55.380 | We knew coming into the earnings season that GDP had come in lighter, uh, than
00:56:01.020 | we had expected, uh, for the quarter.
00:56:03.380 | And now here's the update on earnings.
00:56:05.540 | So this first chart, Bill, this is the S&P earnings on a quarterly basis.
00:56:11.940 | The growth, if you back out the mag seven.
00:56:15.260 | So we know, obviously we've got this AI tailwind.
00:56:18.100 | So in the quarter, if you back out, uh, you know, the mag seven, we're coming in,
00:56:24.140 | uh, below expectations, right?
00:56:26.180 | You're actually seeing earnings shrink in the quarter.
00:56:29.780 | That's quarter over quarter.
00:56:32.700 | Okay.
00:56:33.940 | And then if you look at the next chart, which is our quarterly performance of
00:56:39.780 | software companies versus consensus estimates.
00:56:43.220 | So there's a chart that Jamin, one of my teammates, made.
00:56:46.380 | Um, you can see in the quarter, 98% of software
00:56:51.340 | companies hit their guidance, but remember they had reduced their
00:56:54.420 | guidance in previous quarters.
00:56:55.820 | So maybe the bar was just easier to get over.
00:56:58.380 | I think the prospective view is this next chart, which is
00:57:03.340 | guidance versus consensus estimates.
00:57:05.380 | So what are they saying about the future, Bill, versus what consensus thought they
00:57:09.860 | were going to say about the future? And 54% of software companies disappointed
00:57:14.540 | in the quarter, and that's the worst quarter,
00:57:17.300 | I think, we've had since 2022 in terms of their view as to the future.
00:57:22.780 | And so I was, uh, I did a little stint on CNBC last week.
00:57:27.180 | I said, you know, we've turned down our exposures a little bit, taken down a
00:57:30.860 | little bit of our net exposure.
00:57:32.620 | So added to some shorts, taken some of our longs down a little bit.
00:57:36.180 | And people said, well, why did you do that?
00:57:37.820 | And I said, well, you know, look at the backdrop here.
00:57:40.700 | The NASDAQ this morning hit, uh, an all time high.
00:57:45.220 | Right.
00:57:45.500 | And I always say to my team, all-time highs, that's the
00:57:49.620 | highest in a long, long time.
00:57:51.220 | Right.
00:57:52.020 | And the backdrop is that the economy is slowing, earnings are coming in lighter
00:57:56.820 | than people expected, and AI is creating more disruption, creating
00:58:00.780 | more uncertainty than we expected.
00:58:02.500 | So, you know, this is a moment in time where I think that if you're an
00:58:05.940 | investor, you know, owning some of the companies that you think are real,
00:58:09.780 | you're really confident are going to continue to compound, um, and benefit
00:58:13.940 | from AI, the Microsofts of the world, you know, the NVIDIA's of the world
00:58:17.700 | seems to me like a relatively safe place.
00:58:20.100 | Those earnings multiples don't seem to me to be too onerous, right?
00:58:24.500 | Between 20 and 30 times earnings, I think 21 times for the mag six
00:58:30.500 | anyway. Um, but I think you do have to look at whether you own retail names
00:58:37.780 | or a lot of other names that are not exposed to those trends.
00:58:42.220 | Um, so long as we're going to have interest rates above the rate of
00:58:45.340 | inflation, be in this restrictive territory, the trend is down in terms
00:58:49.380 | of, you know, economic growth, which, you know, Jay Powell
00:58:53.580 | needed to manufacture: slowing growth in order to get inflation down.
00:58:57.620 | Um, but I think that's the backdrop.
00:59:00.020 | I don't think there's any, you know, falling off a cliff here.
00:59:03.620 | Um, but I do think that we're in this situation, Bill, um, where, uh, you
00:59:10.020 | know, the Fed is probably, at some point in time, going to need to
00:59:13.940 | give us some relief on rates in order to keep economic growth in this, you know,
00:59:18.980 | kind of 1.5% to 2% GDP range.
00:59:21.140 | Yeah, I hear you.
00:59:22.220 | And look, there's two other things that would cause me to share your opinion
00:59:26.620 | about being conservative at this moment in time, especially related to those things.
00:59:31.340 | One, Washington does not appear to be concerned about inflation, which means
00:59:36.460 | they're spending on Ukraine, Israel, this new AI thing. Like they don't seem
00:59:41.500 | to be concerned about expanding the budget, which isn't
00:59:44.860 | helpful, obviously, on inflation.
00:59:46.220 | And then second, like it's a chaotic time.
00:59:48.820 | I mean, with colleges moving to the summer, we're losing kind of the
00:59:53.620 | craziness that was happening on campuses, but we do have this election coming up.
00:59:57.500 | We have one candidate in court potentially going to jail.
01:00:01.420 | Like there could be quite a bit of social unrest, um, in the next, you know, six
01:00:06.660 | months.
01:00:06.980 | And I think that causes you to want to be cautious also.
01:00:11.300 | Yeah, there's no doubt about it.
01:00:12.940 | And this is really just a question of units of risk that you want to have on
01:00:16.260 | the table at a given moment in time.
01:00:17.620 | So you have the NASDAQ at an all time high.
01:00:19.540 | You have all those concerns that you lay out and we were just in Vegas and it's
01:00:23.340 | like, we got a couple more cards turned over.
01:00:25.300 | And the fact of the matter is all the odds got a little worse for us.
01:00:29.260 | It's not to say that we're necessarily folding the hand and getting out of it
01:00:32.260 | altogether, but we're not pushing all in either.
01:00:34.740 | I mean, this is just a time, I think, to have, you know, a portion of your stack
01:00:38.660 | working.
01:00:39.180 | And if the market happens to go against you because we have, you know, a bad event
01:00:44.020 | happen at the DNC convention in Chicago, or because inflation bumps back up
01:00:49.460 | because of all of this, uh, you know, fiscal spending that you point out, uh,
01:00:53.820 | then you're in a position to do something about it.
01:00:56.580 | And so this is, you know, I always talk to our team just about less versus more.
01:01:00.780 | And we're in a, we're in a moment in time where maybe a little bit less makes
01:01:03.660 | sense.
01:01:03.980 | It's great to see you.
01:01:05.140 | A great conversation as always.
01:01:06.700 | Um, thanks for, thanks for making it happen.
01:01:08.860 | Take it easy.
01:01:09.460 | As a reminder to everybody, just our opinions, not investment advice.