E122: Is AI the next great computing platform? ChatGPT vs. Google, containing AGI & RESTRICT Act
Chapters
0:00 Bestie intros!
1:31 Joe Manchin calls out Biden on IRA flip-flop
7:40 Sacks writes GPT-4-powered blog post, OpenAI launches ChatGPT plugins
26:31 Will generative AI be more important than mobile and the internet itself? Making the case for both Google and OpenAI to win generative AI
50:19 Reaching and containing AGI, AI's impact on job destruction
76:35 RESTRICT Act's bait and switch
I've been here the whole time. I was just, I was just having some of these beautiful 00:00:09.600 |
salted roasted pistachios. The only problem is when I went to the store, I kid you not, 00:00:14.440 |
there was a shelf of these. All flavors available except one flavor. 00:00:23.480 |
I am not kidding. I go to the fancy, you know, bespoke- 00:00:30.800 |
I went to the Raley's in Truckee, the artisanal, and they have, you know, all these overpriced- 00:00:37.200 |
That's what I said. The art stuff. The artistic food. The artisanal row where they had this, 00:00:45.440 |
I kid you not, spicy, salty, no salt, every shelf packed. Then there's one shelf I can 00:00:55.440 |
And I look at the tiny little sign, salt and vinegar, shelled nuts. 00:01:00.560 |
Sea salt and vinegar. Chamath's shelled nuts. Sold out across the country. 00:01:05.680 |
You know, I cannot recommend these more highly. They're incredible. 00:01:11.160 |
They are delicious. My salty nuts are delicious. 00:01:14.840 |
Did you see Joe Manchin's high heater op-ed in the Wall Street Journal? 00:01:40.600 |
He is, I think. Okay, so let me ask Sacks right there. 00:01:43.720 |
Sacks, Joe Manchin, Nikki Haley, and who's the guy from Florida? 00:01:49.720 |
By the way, there was a big defection that was leaked this week. Ron Lauder 00:01:54.200 |
flipped from Trump to DeSantis. That's a big one because Lauder's good for 00:02:00.360 |
Joe Manchin, what impact would he have coming into the race? I'm not trolling you. 00:02:04.520 |
Well, it depends how he comes in. What did he say in the op-ed? 00:02:07.160 |
He was talking about the Biden administration's insincerity about controlling costs and how 00:02:10.600 |
everybody was incompetent. And it's certainly there's some waste and we can control some 00:02:14.040 |
spending and everybody needs to grow up and get in a room and just manage the budget for 00:02:18.360 |
the American people and stop playing politics. 00:02:19.960 |
Yeah, I think the headline of the article actually, to your point, Jake, 00:02:23.000 |
was much worse than the substance of the article, Sacks. But if you see the headline, 00:02:26.360 |
I don't know, Nick, if you can just throw it up there, it was brutal. 00:02:29.000 |
The headline and the byline of the article, I think, were more damaging than the substance. 00:02:34.440 |
Biden's Inflation Reduction Act betrayal. Instead of implementing the law as intended, 00:02:39.800 |
his administration subverts it for ideological ends. 00:02:43.880 |
I have to think that Joe was responsible for that, for the titling of that article. 00:02:50.220 |
And by the way, I think if you guys remember, we talked about this when that act was first 00:02:55.320 |
published. And if you guys remember, I think I pulled up the CBO data, the CBO model, 00:02:59.960 |
and it showed for the first five years, this thing burns a couple hundred billion dollars. 00:03:04.680 |
And then there's some expectation that there'll be some sudden boom in revenue 00:03:08.200 |
in the out years, and then you make the money back in the out years. 00:03:11.160 |
So it's total like accounting shenanigans for him to have made the claim in the first place 00:03:15.880 |
that the IRA was actually going to be like a net deficit reduction or debt reduction. 00:03:20.040 |
In fact, it's all just accounting shenanigans. And it's just a massive spend package, 00:03:23.720 |
particularly in the near term when it matters most. 00:03:26.360 |
I think I told you guys this, but I think this was like, 00:03:28.200 |
when was the last time I was in Washington? Probably, what is it March now? So maybe it 00:03:33.000 |
was January I was there. And I saw Schumer and Mark Warner, and I spent about two hours with 00:03:41.320 |
Manchin. He is really impressive. He's cool. He's interesting. He's thoughtful. He's moderate. 00:03:49.320 |
Manchin's like a formidable guy. So this will be really interesting if he steps in there and 00:03:56.200 |
Between Nikki Haley and Manchin, where do you write your check? 00:03:59.080 |
I'd probably write a check to both, to be honest. 00:04:01.080 |
Feels like a good ticket to me. I've always wanted to see the cross- 00:04:04.120 |
Could you imagine a Democrat and Republican merging somehow and like running together? 00:04:10.520 |
I've been pitching that for years. I think that's like a clear path. 00:04:13.800 |
David Freeberg may have just come up with one of the most disruptive ideas in American politics 00:04:24.360 |
Just my comment on this. So first of all, I remember when, you know, Manchin did a good 00:04:28.280 |
job stopping Biden's $3.5 trillion Build Back Better. Remember, it was him and Sinema that 00:04:32.680 |
were the holdouts. But then Manchin compromised and gave Biden a $750 billion version of it. 00:04:39.400 |
And I guess now he's complaining that Biden didn't live up to his end of the bargain in 00:04:43.480 |
doing the deficit reduction. But quite frankly, many commentators said at the time that the 00:04:49.160 |
bill's claims to deficit reduction were preposterous and that would never happen. 00:04:53.560 |
So quite frankly, you know, Manchin shouldn't have been euchred or hoodwinked by Biden. 00:04:58.680 |
Everyone was basically saying there'll never be any deficit reduction out of this bill. 00:05:01.720 |
It's just more spending. So I don't really feel bad for Manchin here saying that somehow he was 00:05:07.160 |
betrayed by Biden. He should have known better. Now, in terms of him running, yeah, I think as 00:05:12.840 |
a Democrat who's figured out how to get himself elected in West Virginia, which is a plus 20 00:05:17.720 |
red state, he obviously knows how to appeal to the center. The problem for him is just how do 00:05:22.280 |
you get the Democratic Party nomination? Because he's far to the right of your average Democratic 00:05:27.720 |
Party voter. If he wants to run as an independent, that's a different story. And that would really 00:05:32.440 |
throw a curveball into the race. But I don't see him doing that. I think it's kind of a stretch. 00:05:37.480 |
And this is the problem with a lot of these fantasy candidates is that, you know, centrist or 00:05:43.240 |
moderate voters might like them, but they can't get the nomination of their party. 00:05:46.200 |
Unless you mean, like, Trump and Obama? Weren't those fantasy candidates? 00:05:50.440 |
Trump was not a fantasy candidate. He's the ultimate fantasy candidate. 00:05:52.360 |
Well, he was an outsider, but he appealed to the base of the party. He appealed to the 00:05:55.960 |
base of the party. What I'm saying is in order to get the nomination of a major party, you have 00:06:00.040 |
to appeal to its base. And I don't think Manchin appeals to the base of the Democratic Party. He's 00:06:04.440 |
out of step with it. He's out of step with it in ways that I like, don't get me wrong, 00:06:08.360 |
but I just, I don't see how he's going to get a nomination. 00:06:11.720 |
Chris Christie? What do you think of him? It seems like he's about to come in the race too, David. 00:06:20.360 |
Pointless. All right. Listen, everybody, welcome to the All-In Podcast. It's like episode 100 00:06:25.560 |
something with me again today. The Rain Man himself. Yeah. David Sacks is here. Friedberg 00:06:32.440 |
is in his garden at his home in Paris. Spring has sprung. The Queen of Quinoa. And of course, 00:06:39.960 |
the dictator himself, Chamath Palihapitiya, the Silver Fox. Look at that little tuft of silver 00:06:47.240 |
I got a haircut from somebody recently who said that people go to her and ask her to put the 00:06:59.640 |
Friedberg looks like he's in Smurf Village there. What is that background? 00:07:08.680 |
I like most of my backgrounds. I think it reflects the mood and the moment of the week. 00:07:13.000 |
You guys just totally, totally denied half the beta males in the YouTube comments from 00:07:18.040 |
being able to guess what the background was. Thanks a lot, Sacks. 00:07:22.920 |
Actually, I did a reverse image search and then I used a ChatGPT plugin to automatically figure 00:07:29.720 |
Oh, okay. All right. Well, let's get started. Come on. 00:07:38.760 |
Sacks feels like he's in a good mood. I like this. 00:07:43.000 |
OpenAI launched a bunch of ChatGPT plugins and I don't know if you saw it, but David Sacks 00:07:52.600 |
wrote a blog post with ChatGPT. It's an amazing back and forth. I read this back and forth. 00:07:58.680 |
Explain what you did, Sacks. This was really one of the best conversations I've seen with ChatGPT. 00:08:05.880 |
We'll pop it up here on the screen, but explain what you did. 00:08:08.200 |
Well, I had an idea for a blog post about the use of, I guess, a tactic you could call give-to-get. 00:08:16.360 |
I thought it would be an interesting tactic for AI startups to use if they're trying to get a 00:08:21.640 |
hold of proprietary training data. For example, if you want to create an architect AI, you need 00:08:27.560 |
a lot of plans. Or if you're going to create a doctor AI, you need a lot of lab results or medical 00:08:34.120 |
reports to train the AI on. Those are hard to get. OpenAI doesn't necessarily have them yet. 00:08:39.080 |
There is an opportunity, I think, for startups to create these AIs in different, 00:08:43.560 |
you'd call them professional verticals. The give-to-get technique would be you give points to your 00:08:50.200 |
users for uploading that data and then they can spend those points by using the AI. Anyway, 00:08:55.560 |
the company that came up with this give-to-get tactic was a company called Jigsaw almost 20 00:09:01.560 |
years ago. No one remembers this company. I'm kind of dating myself because I remembered it. 00:09:05.400 |
But I just had this idea, gee, I wonder if the Jigsaw approach could be used for AI startups. 00:09:09.800 |
I started by going into ChatGPT and I said, "Hey, have you heard of Jigsaw?" And it had. 00:09:16.200 |
Then I said, "Tell me about its give-to-get approach." Then I said, "Would this approach 00:09:21.880 |
work for AI startups that want proprietary training data sets?" It said, "Yes, this is a good idea." 00:09:29.960 |
Then I gave the architect example and I said, "Can you give me more examples like this?" It gave me 00:09:33.800 |
like 20 more examples. Then I asked it just to flesh out various kinds of details. I went down 00:09:39.560 |
some cul-de-sacs I didn't use. Then at the end, I said, "Can you summarize everything we've just 00:09:44.120 |
talked about in a blog post?" It gave me the first draft of a blog post. I then did a substantial 00:09:49.560 |
amount of editing on most of the blog post, although some of it I just used verbatim. 00:09:53.960 |
Then I had a couple of people in my firm look at it. They made some good suggestions. It's not like 00:09:58.120 |
the human is completely out of the loop. Then I copied and pasted my edited version back into 00:10:02.920 |
ChatGPT and said, "Here's my edit." Then I asked for some suggestions. It made a few small edits. 00:10:09.080 |
I said, "Okay, great. Just incorporate the edits yourself." Gave me that final output. 00:10:13.560 |
Then I posted on Substack, a blog that probably would have taken me a week to research and write. 00:10:19.640 |
If I had done it at all, I was able to do in a day. I can't see myself going back now. I think 00:10:25.080 |
this is just how I'm going to write all my blog posts: use ChatGPT as my researcher, 00:10:31.960 |
as a writing partner, in some cases an editor. But I'm definitely going to run it through. 00:10:38.360 |
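For concreteness, here is a minimal sketch of the give-to-get mechanic Sacks is describing: users earn credits for contributing proprietary training data and spend them to query the model. Everything here (the PointsLedger class, the earn and spend rates) is a hypothetical illustration, not anything Jigsaw or OpenAI actually shipped.

```python
# Hypothetical sketch of a "give-to-get" points ledger for a vertical AI
# startup: users earn credits by contributing proprietary documents, and
# spend credits to query the model. Names and rates are illustrative.

from dataclasses import dataclass, field

EARN_PER_DOCUMENT = 10   # credits granted per accepted upload
COST_PER_QUERY = 1       # credits charged per AI query

@dataclass
class PointsLedger:
    balances: dict[str, int] = field(default_factory=dict)
    corpus: list[str] = field(default_factory=list)  # contributed training data

    def contribute(self, user: str, document: str) -> int:
        """User uploads a document (e.g., an architectural plan); earns credits."""
        self.corpus.append(document)
        self.balances[user] = self.balances.get(user, 0) + EARN_PER_DOCUMENT
        return self.balances[user]

    def query(self, user: str, prompt: str) -> str:
        """User spends credits to query the vertical AI."""
        if self.balances.get(user, 0) < COST_PER_QUERY:
            raise PermissionError("Not enough credits: contribute data to earn more.")
        self.balances[user] -= COST_PER_QUERY
        return f"(model answer to {prompt!r}, trained on {len(self.corpus)} docs)"

ledger = PointsLedger()
ledger.contribute("alice", "floor plan: 3-bed craftsman, 2,100 sq ft")
print(ledger.query("alice", "Draft a plan for a 2-bed ADU"))
```

The design point is simply that contribution and consumption share one currency, so the proprietary data moat grows as usage grows.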
The thing that I was struck by was just how kind and generous and thoughtful this conversation was. 00:10:45.160 |
I just thought, "I've never seen Sacks have a conversation where he was so kind to the 00:10:48.520 |
other person and thoughtful." Right about now, all your friends and family are like, 00:10:51.880 |
"How do we get Sax to have this conversation with us?" You were super kind to the AI. 00:10:59.260 |
Well, just in case it takes over the world, J-Cal, you can't be too careful. No, I think, 00:11:06.680 |
Look at him, he's so kind. He's like, "Oh, perfect, thanks." 00:11:15.160 |
Scroll up and show that example. The AI actually gave me some information about Jigsaw's point 00:11:20.200 |
system, again, the rewards that they used. And it was just in text. So I said down below, "Hey, 00:11:26.440 |
can you spit that out as a table?" And it did instantly, perfectly. 00:11:29.960 |
It's like a day's work, right? You would have to have an analyst or researcher do a day's work. 00:11:33.880 |
Then I just screenshot of that and I made it an exhibit in my blog post. 00:11:38.760 |
And then it was delightful. Back to you. I mean, this is a- 00:11:43.080 |
This is a literal road to you being a kind human being. All the money that you've spent on therapy 00:11:50.120 |
and coaching, just trying to be nice to people, and you're just naturally nice to the AI. 00:11:55.080 |
No, J-Cal, Sacks is in a good mood today. I don't know why you're instigating him. 00:11:59.160 |
He's laughing. Come on, it's fun. You have to think it's funny, 00:12:04.440 |
This is confirmatory of what we know. David wants to live in a set of highly 00:12:09.000 |
transactional relationships, ideally with a machine, 00:12:12.200 |
which can then eventually help make him money. 00:12:17.320 |
Can I ask you a question, though? Sincerely, Sacks. What did you enjoy more, 00:12:21.000 |
working with your team of humans on this or working with 00:12:24.680 |
ChatGPT? Which one was more enjoyable for you, just personally? 00:12:29.320 |
Well, I think they both were. I would say that the human contributions were essential. 00:12:38.120 |
It's not about enjoyment. This is just a job to get done, but it definitely sped things up 00:12:44.120 |
enormously. I personally find the hardest part of writing a blog is when you're staring at that 00:12:48.280 |
blank sheet of paper and just having to spit out the first thousand words. It's just so time 00:12:54.360 |
consuming to do that. But again, if you start with the first draft, even if it's not very good, 00:12:59.240 |
then you can just edit it and it speeds things up a lot. 00:13:03.960 |
But the contributions of people in my firm were important. I actually trusted it. I know that you 00:13:09.960 |
probably should fact check it in a way because it can hallucinate. But the things that it was 00:13:14.280 |
saying made so much sense to me that I didn't think it was hallucinating. 00:13:17.160 |
Well, this is a great moment to pivot into what OpenAI did with plugins. These came fast and 00:13:23.640 |
furious this week. And a bunch of folks who had started verticalized ChatGPT-based projects, 00:13:30.840 |
MVPs were like, "Oh, maybe my project MVP is now dead," because Instacart, OpenTable, Shopify, 00:13:38.040 |
Slack, and Zapier launched plugins. Zapier, obviously, like If This Then That, is a very wide-ranging tool 00:13:44.440 |
that allows you to connect APIs from a multitude of sources. And what this all lets you do at the 00:13:51.080 |
end of the day is have ChatGPT ping one of these sources, just like an app might do or some custom 00:13:58.600 |
software might do, ping the API and return data. So, hey, what tables are open on OpenTable? 00:14:07.640 |
Maybe Shopify, find me things to buy in this category, etc. And so people have started building 00:14:14.280 |
little scripts. We used to call these, back when General Magic was out, internet agents. And the concept of 00:14:22.600 |
a software agent has existed for a long time, actually, in computer science; I'm sure Freeberg 00:14:28.200 |
will give us some examples of that. But also ChatGPT can now use a browser. So that means you 00:14:33.000 |
can get around the dated nature of the content in the corpus. Somebody did things like, "Hey, 00:14:39.720 |
build me a meal plan, book me a reservation for Friday night on OpenTable, source the ingredients 00:14:44.600 |
and buy them for Saturday night on Instacart, and then use something like Wolfram Alpha to, you 00:14:50.360 |
know, calculate the calories, etc." So when you saw this drop, Sacks, what did you think in terms of 00:14:58.120 |
the opportunity for startups and to build these intelligent agents, things that will do 00:15:05.320 |
if-this-then-that, or just run background tasks over time that you could actually leave running? 00:15:11.400 |
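Mechanically, what J-Cal is describing is simple: a ChatGPT plugin at launch was little more than a manifest file plus an OpenAPI spec that the model reads, and ChatGPT itself decides when to call the described endpoints. A rough sketch of the shape of such a manifest follows; the reservation endpoint and URLs are hypothetical, and the field names are a best-effort reconstruction of the launch docs rather than a guaranteed contract.

```python
# Rough sketch of what a ChatGPT plugin looked like at launch: a manifest
# served at /.well-known/ai-plugin.json plus an OpenAPI spec describing your
# endpoints. The model reads "description_for_model" to decide when to call
# the API. The reservation endpoint and URLs below are hypothetical.

AI_PLUGIN_MANIFEST = {
    "schema_version": "v1",
    "name_for_model": "table_finder",
    "name_for_human": "Table Finder",
    "description_for_model": (
        "Find open restaurant tables. Use when the user asks about "
        "reservations, availability, or booking a table."
    ),
    "description_for_human": "Search restaurant availability and book tables.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
}

# The OpenAPI spec then defines concrete operations, e.g.:
#   GET /tables?city=...&date=...&party_size=...
# ChatGPT fills the parameters from conversation context ("Friday night,
# four people"), executes the call, and folds the JSON response back into
# its natural-language answer.
```

The notable design choice is that there is no imperative glue code: the natural language layer replaces the scripts that Zapier and If This Then That users used to write by hand.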
Yeah, I mean, I think this is the most important developer platform 00:15:15.160 |
since the iPhone and the launch of iOS and the App Store, and I would argue maybe ever in our 00:15:22.360 |
industry, certainly since the beginning of the internet. I think there was a question when 00:15:26.520 |
ChatGPT launched on November 30th, and people started playing with it in December, what exactly 00:15:31.720 |
OpenAI's product strategy was going to be? Was this just like a proof of concept or a demo? 00:15:38.120 |
And they even kind of called it like a demo. And initially, it looked like what their business 00:15:44.280 |
model was going to be was providing an intelligence API that other websites, other applications could 00:15:51.080 |
incorporate. And we saw some really cool demos like that Notion demo of other applications 00:15:56.200 |
incorporating AI capabilities. And so initially, it looked like what OpenAI was going to be was 00:16:02.280 |
more like Stripe, where in the same way that Stripe made payments functionality available 00:16:07.720 |
very easily through a developer platform, they were going to make AI capabilities available 00:16:12.520 |
through their developer platform. And then I think a funny thing happened on the way to this 00:16:17.320 |
announcement, which is they became the fastest growing application of all time, talking about 00:16:21.560 |
ChatGPT, over 100 million users in two months. Nobody else has ever done that before. I think 00:16:26.760 |
it took the iPhone, you know, two years plus, Gmail, Google, those products all took, I think, 00:16:33.720 |
well over a year. So this became the fastest growing site of all time. And I think with 00:16:39.480 |
plugins, what they're indicating is that they will become a destination site. This is not just a 00:16:44.360 |
developer platform, this is a destination site. And through plugins, they are now incorporating 00:16:49.640 |
the ability to basically, you know, anything you could do through an application, you will now be 00:16:56.120 |
able to do through a plugin, you'll just tell ChatGPT what you want done. If you say, hey, 00:17:01.400 |
book me a plane ticket on this date, it will go into Kayak's plugin and do that. You say, 00:17:07.640 |
So the promise of Siri and Alexa realized because those were very rigid, they had no intelligence, 00:17:14.280 |
right, Friedberg? If you wanted Siri to do something specific, like use Waze, or to go 00:17:20.680 |
get you an OpenTable reservation, it needed to be pretty specific. And it didn't have any kind of natural 00:17:26.280 |
language model behind it. So this is taking existing APIs and putting a natural language 00:17:31.160 |
layer in front of it, which makes it, you know, perform a little more naturally. Is that what 00:17:37.640 |
I think it provides access to a corpus of data and a suite of services that are not well 00:17:45.400 |
integrated into a search or chat interface anywhere today. So, you know, knowing what 00:17:52.760 |
restaurants have what seats available is in a closed service, it's in a data warehouse, 00:17:57.480 |
operated by OpenTable. And now what OpenTable can do is provide an API into that data 00:18:04.280 |
via an interface and they can allow ChatGPT to make a request to figure that data out 00:18:09.640 |
to give a response to a user where they can ultimately benefit from transacting and providing 00:18:16.840 |
a service. This closes the loop between search and commerce in a way that Google cannot and 00:18:24.840 |
does not do today. And I think that's what makes it very powerful. We've seen this 00:18:29.240 |
attempted in a number of important ways in the last couple of years with Alexa 00:18:33.320 |
and Apple Home and Google Home kind of integration via the chat services that they offer, 00:18:39.400 |
you know, where you speak to the device. But the deep integration that's possible now, 00:18:44.200 |
and the natural language way that you can go from the request all the way through to the 00:18:49.080 |
transaction is what makes this so extremely powerful. And I think, you know, the points 00:18:54.760 |
I made a few weeks ago, when we first talked about, you know, search, having so many searches 00:19:00.440 |
that are done, where the human computer interface presents a table or presents a chart, or presents 00:19:07.160 |
a shopping list in a matrix. What makes search such a defensible product, I think, 00:19:12.600 |
could theoretically be completely obviated or destroyed with an interface like this, 00:19:17.240 |
where you have the ability for ChatGPT, or whatever the core centralized service is, to 00:19:23.560 |
actually present results in a table in a matrix in an interface in a shopping list, and actually 00:19:29.000 |
close the transaction loop. It's really disruptive to things like commerce providers, 00:19:34.200 |
it's really disruptive. You know, some of these commerce platforms, it's really disruptive 00:19:38.040 |
to a lot of different industries, but also introduces a lot of real opportunity to build 00:19:42.840 |
on top of that capability and that functionality to rewrite and ultimately make things easier and 00:19:47.480 |
better for consumers on the internet. What do you think, Chamath? You're looking at this, 00:19:50.920 |
and it seems to be moving at a very fast pace. Over 100 million users, they put a business model 00:19:56.440 |
on it already, 20 bucks a month. They have a secondary business model of, hey, use the API 00:20:01.720 |
and we'll charge you for usage. And then you layer on what Zapier and If This Then That had 00:20:06.120 |
already sort of established in the world, which is APIs, but nobody ever really wanted to write 00:20:11.720 |
scripts. So that seemed to be the blocker: you go into Zapier or If This Then That, and it's maybe 5% 00:20:16.840 |
of the audience, people who want to customize stuff, people who want to tinker. But this seems to now, 00:20:20.600 |
with the ChatGPT chat interface, open it up to a lot of people. So is this super significant? Or 00:20:25.960 |
is this a commodity product that, you know, 10 people will have when we're sitting here next year on 00:20:30.680 |
All-In Episode 220? I think you are asking the exact right question. And you used the great term, 00:20:36.760 |
like in poker, if there are three hearts on the board, and you have the ace of hearts, 00:20:43.640 |
you have what's called the nut blocker, right? Which means that nobody else, even if anybody 00:20:50.360 |
else has a flush, they never have the best flush. And if flush is the best hand, there's a lot of 00:20:55.480 |
ways that you can manipulate the pot and eventually win the pot because you have that ace of hearts 00:20:59.320 |
and nobody else has it. The concept of blocker, I think is very important to understand here, 00:21:06.360 |
which is what are the real blockers for this capability to not be broadly available. 00:21:12.040 |
So I think you have to segregate, you have the end user destination, 00:21:16.360 |
you have the language model, and then you have the third party services. And so if you ask the 00:21:23.160 |
question, what is the incentive of the third party service? Well, the shareholders of 00:21:27.720 |
a travel site, right? They're not interested in doing an exclusive deal with any distribution 00:21:36.440 |
endpoint, they want their services integrated as broadly as possible. Right? So I think the 00:21:42.040 |
the answer for the service providers is just like they build an app for iOS and for Google. And, 00:21:48.520 |
you know, if they could have justified it, they would have built an app for a gaming console, 00:21:52.600 |
they can, they should, they would, they do. Right? So that's going to get commoditized 00:21:57.800 |
and broadly available. I think on the LLM side, I think we've talked about this. 00:22:02.120 |
Everybody's converging on each other. In fact, there was an interesting 00:22:06.440 |
article that was released that said that there was a handful of Google engineers that quit, 00:22:12.760 |
because apparently Bard was actually learning on top of 00:22:19.480 |
ChatGPT, which they felt was either illegal or unethical or something, right? So, 00:22:25.720 |
so the point is, I think we've talked about this for a while, but all of these models will 00:22:30.120 |
converge in the absence of highly unique data, right? What I've been calling these white truffles. 00:22:35.720 |
So if you can hoard white truffles, your model will be better. Otherwise, your model will be 00:22:40.920 |
the same as everybody else's model. And then you have the distribution endpoints of which there 00:22:46.440 |
are many whose economic incentives are very high, right? So Facebook doesn't want to just sit around 00:22:51.880 |
and have all this traffic go to ChatGPT, they want to be able to enable Instagram users and 00:22:56.760 |
WhatsApp users and Facebook users to interact through messenger or what have you. Obviously, 00:23:01.640 |
Google has, you know, many hundreds of billions of reasons to defend their territory. So I think 00:23:08.040 |
all of this to me just means that these are really important use cases. As an investor, 00:23:14.920 |
I think it's important to just stay a little patient. Because it's not clear to me that 00:23:20.200 |
there are any natural blockers. But I do think that David's right that it's demonstrating a 00:23:24.920 |
use case that's important. But it's still so early, we are six weeks in. Yeah, I tell you, 00:23:32.200 |
I think there's a couple of great blockers here, or there's going to be an M&A bonanza for Silicon 00:23:36.600 |
Valley. If you look at certain data sets, Reddit, Stack Overflow for programming, and Quora, these 00:23:45.000 |
things are going to be worth a fortune, and to be able to buy those or get exclusive licenses to 00:23:49.880 |
those, if you're maybe Google Bard, or if you're ChatGPT, that could be a major difference maker. 00:23:54.840 |
Twitter's data set, obviously. And then you look at certain tools like Zapier and If This Then That. 00:24:00.040 |
They've spent a decade building the sort of meta-API. That would be an incredible blocker. I think 00:24:07.400 |
this is going to be like a balkanization of so many oil sources. Zapier already did it for free, 00:24:11.800 |
they did a plugin for free. Yeah, exactly. I was just going to say, I don't think these are 00:24:15.720 |
blockers. I don't think this is the ace of hearts on the flush board. I don't think so. I think that 00:24:19.880 |
these things are really interesting assets. They are definitely truffly in nature. But they may not 00:24:26.520 |
be the, you know, 10-pound white truffle from Alba that we're looking for. Yeah, no, but on the M&A 00:24:31.240 |
side, don't you think this would be like incredible? No, but the only reason I say that again, 00:24:34.920 |
is it is just so early. Like, in the text, I mentioned this to you guys. I remember, 00:24:39.320 |
and Sacks and I were in the middle of this. We were both right at the beginning of social 00:24:44.920 |
networking. Sacks started Geni, I was in the middle of AIM. And all of a sudden, we saw 00:24:49.880 |
Reid start SocialNet. Then we saw Friendster get started. Then we saw MySpace get started. And you 00:24:56.920 |
have to remember, when you look back now 20 years later, the winner was the seventh company, which 00:25:02.200 |
was Facebook, not the first, not the second, it was the seventh, which started two and a half years, 00:25:07.480 |
probably, after the entire Web 2.0 phenomenon started. Yeah, same with search, 00:25:12.440 |
by the way, where Google was probably 20th. Exactly. To the scene. Yeah, Excite, Lycos. 00:25:17.800 |
If you want to be a real student of business history, I'll just say something that's more 00:25:21.400 |
meta, which is, if there's something that I've learned on the heels of this SVB fiasco, 00:25:27.400 |
is that there is an enormous amount of negative perception of Silicon Valley, and frankly, 00:25:34.120 |
a lot of disdain for VCs and prognosticating technologists, right. And I think that 00:25:42.760 |
I think we have to be very careful. Yeah. And I do think that we are an example of that, 00:25:47.560 |
because we are the bright, shiny object of the people that were successful. And the broad makeup 00:25:52.760 |
of America thinks that we're not nearly as smart as we all think we are. And after all of this money 00:25:58.280 |
that's been burned in crypto land, and NFTs, and all of this web3 nonsense, to yet again, 00:26:03.880 |
whip up the next hype cycle, I think doesn't serve us well. So I do think there's something 00:26:10.680 |
very important here. But I think if we want to maintain reputational capital through this cycle, 00:26:18.600 |
because government will get involved much faster in this cycle, I think it's important to just 00:26:23.800 |
be methodical and thoughtful, iterate experiment, but it's too early to call it, 00:26:29.560 |
Yeah, it's definitely too early to call it. But, 00:26:31.240 |
Sacks, you're saying explicitly, you think this is bigger than the internet itself, 00:26:38.760 |
It's definitely top three. And I think it might be the biggest ever. I think, look, 00:26:42.200 |
I think things could certainly play out the way that Chamath is saying. However, 00:26:45.720 |
I actually think that OpenAI has demonstrated now, with these platform features, that it has a lead, 00:26:53.720 |
a substantial lead. And I actually think that lead is likely to grow in the next year. And let me 00:26:58.520 |
tell you why I think it's got a couple of assets here that are hard to replicate. So number one, 00:27:03.640 |
user attention, I think they've now got, I would guess hundreds of millions of users. And this 00:27:08.200 |
thing is caught on like wildfire, must have been beyond their wildest dream. I think it even 00:27:13.160 |
surprised them how much this has taken off. It's really captured the public's imagination and 00:27:18.120 |
people are discovering new use cases for it every day. If you are sort of the number two or number 00:27:24.360 |
three or the seventh large language model to basically get deployed behind a chatbot, I just 00:27:30.040 |
don't think you're going to get that kind of distribution because the novelty factor will 00:27:33.080 |
have worn off and people will have already kind of learned to use ChatGPT. So number one is the 00:27:39.480 |
hundreds of millions of eyeballs. Number two is with this developer platform, I think we should 00:27:43.560 |
describe a couple of other features of it. One of the problems with ChatGPT, if you've used it, 00:27:49.160 |
is that the training data ends in 2021. And so you very rapidly for many questions, get to a 00:27:55.080 |
stopping point where it says, like, I don't know the answer to that because I don't have any 00:27:59.560 |
information about the last two years. Well, one of the plugins that OpenAI has introduced itself 00:28:04.920 |
is called the browsing plugin. And it allows ChatGPT to go search the internet and not just run 00:28:11.640 |
internet searches, but to run an internet search as if it were a human. So you ask ChatGPT a 00:28:18.360 |
question and it goes to find, it runs a search and then it scours through the list of 20 links 00:28:24.200 |
and it doesn't stop until it finds a good answer. And then it comes back to you with just the 00:28:27.960 |
answer. So it actually saves you the time of clicking through all those links and it'll give 00:28:31.560 |
you the browsing history to show you what it did. That's mind blowing. They also have a thing called 00:28:36.120 |
a retrieval API, which allows developers to share proprietary knowledge bases with ChatGPT. So if 00:28:43.000 |
you have a company knowledge base or some other kind of content, you can share it with ChatGPT so 00:28:48.760 |
that ChatGPT can be aware of that. And there are some privacy concerns, but the company has said 00:28:53.160 |
they're going to sandbox that data and protect it. As an example, I'm planning on writing a book on 00:28:59.400 |
SaaS using ChatGPT, and I'm going to put together all the previous articles and talks I've done as 00:29:04.280 |
a database so I can then work with that in ChatGPT. So you're going to have more and more developers 00:29:09.960 |
sharing information with ChatGPT. You're going to have ChatGPT able to update its training based 00:29:18.440 |
on sort of the last two years, being able to search the internet. And I think that as those 00:29:22.680 |
hundreds of millions of users use the product, and as developers keep sharing more and more of 00:29:26.840 |
these data sets, the AI is going to get smarter and smarter. And then what's going to happen is 00:29:31.320 |
both consumers and developers are going to want to use or build on the smartest API. 00:29:36.600 |
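To make the retrieval API concrete: OpenAI open-sourced a retrieval plugin that is essentially a thin HTTP service over a vector store; you push documents in, and ChatGPT (or you) queries for relevant passages at answer time. The sketch below shows the developer side; the host, token, endpoint paths, and payload shapes are illustrative assumptions rather than a guaranteed contract.

```python
# Hedged sketch of feeding a private knowledge base to ChatGPT via a
# retrieval service, in the spirit of OpenAI's open-sourced retrieval
# plugin. The host, bearer token, endpoint paths, and payload shapes
# below are illustrative assumptions, not a documented contract.

import requests

RETRIEVAL_URL = "https://my-retrieval-service.example.com"  # hypothetical host
HEADERS = {"Authorization": "Bearer <YOUR_SERVICE_TOKEN>"}   # hypothetical token

def upsert_documents(docs: list[dict]) -> None:
    """Index documents (e.g., a back catalog of essays) into the vector store."""
    resp = requests.post(f"{RETRIEVAL_URL}/upsert",
                         json={"documents": docs}, headers=HEADERS)
    resp.raise_for_status()

def query(question: str, top_k: int = 3) -> list[dict]:
    """Retrieve the passages most relevant to a question; the plugin makes
    the same kind of call on ChatGPT's behalf when it is enabled."""
    payload = {"queries": [{"query": question, "top_k": top_k}]}
    resp = requests.post(f"{RETRIEVAL_URL}/query",
                         json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["results"]

upsert_documents([{"id": "essay-001", "text": "Notes on SaaS gross margins..."}])
for hit in query("What did I write about net revenue retention?"):
    print(hit)
```

This is roughly how Sacks's SaaS book plan would work in practice: index the back catalog of essays once, then let every subsequent chat query retrieve from it.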
Yeah, see, this is where it feeds on itself. I mean, yeah, I think there might be a 00:29:40.440 |
I agree with much of what you're saying. But I do think somebody like Facebook, when they release 00:29:45.640 |
their language model, which they're about to, is not going to allow ChatGPT to have any access to 00:29:50.920 |
the Facebook corpus of data. And then LinkedIn will do the same, they'll block any access by 00:29:55.640 |
ChatGPT to their data. And so then you might say, you know what, I'm doing something related 00:30:00.120 |
to business and business contacts, I need to use the LinkedIn one. And they're just going to block 00:30:03.880 |
other people's usage of it and tell you, hey, you have to come to our interface and have a pro 00:30:07.400 |
account on LinkedIn. And this all becomes little islands of data. And so, I'm not sure. You may 00:30:12.760 |
be right that it's too early to have a definitive opinion. But I would say I have to believe plugins 00:30:16.600 |
are going to be promiscuous. Yes, exactly. Plugins are the refutation of your idea. Facebook does 00:30:20.680 |
not have an API, Twitter turned off their API, people who are smart guard their data sets, Quora doesn't 00:30:25.560 |
let people use its data. So I just picked three. Those are three incredible data sets that don't 00:30:30.840 |
allow people in, and Craigslist doesn't either. So people who are smart do not allow APIs into their data; 00:30:35.560 |
they keep it for themselves. I think there were a lot of people when the app store rolled out 00:30:41.560 |
that swore up and down, they'd never build a mobile app, because they didn't want to give 00:30:44.360 |
Apple that kind of power that the internet was open, whereas the app store is closed and curated 00:30:49.080 |
by Apple. And sure enough, they all at the end of the day had to roll out apps, even though in 00:30:53.640 |
the case of Facebook, it definitely has made them vulnerable, because they're downstream of Apple. 00:30:58.200 |
I mean, Apple now has enormous influence over Facebook's advertising revenue, because 00:31:03.400 |
users have to go through Apple, they never had to do that before the internet. Nonetheless, 00:31:07.640 |
Facebook felt compelled to release a mobile app, because they knew it was existential for them 00:31:11.880 |
if they didn't. And I believe that's what will happen here. I don't think it's the right analogy; the right 00:31:15.880 |
analogy would be Google search: does Facebook, does Craigslist allow their data to be indexed 00:31:21.400 |
inside of Google search answers? No, right? They block that for a reason, and they will write 00:31:25.720 |
a cease and desist letter. Fine. So, you know what, those guys will stay out of it. But look 00:31:29.320 |
how much content Google search already has. And I think that ChatGPT will start by eating a 00:31:35.160 |
substantial portion of search, because again, you don't have to go through the 20 links, 00:31:38.360 |
it just gives you the answer. It's going to eat a substantial portion of browser usage and app 00:31:43.800 |
usage, because you're just going to tell ChatGPT what you want to do. It will go book your plane 00:31:47.880 |
ticket, it will go book your hotel room. Yeah, see, this is... Hold on, the apps that 00:31:53.960 |
want to play in this will benefit. So there'll be a powerful incentive for applications to get an 00:31:58.840 |
advantage by participating. Let me finish my point. Yeah. And then eventually, they will be forced to 00:32:04.840 |
do it not because they get an advantage, but because they're so competitively disadvantaged, 00:32:08.360 |
if they don't participate in that ecosystem, I agree that they'll participate in it. But here's 00:32:12.760 |
the thing, what's going to happen is Google is going to turn on Bard, and I've been playing with 00:32:16.200 |
Bard, it is 80% of ChatGPT already. And then when they make Bard a default, you know, little snippet 00:32:24.520 |
on your Google search return page, or Bard is built into YouTube, or Chrome, or Android, 00:32:29.240 |
and the Play Store, they're going to roll right over ChatGPT because they have billions of users 00:32:33.640 |
already. So this advantage that you see today, I see that getting rolled real quick, because you'll 00:32:39.480 |
be on YouTube. And on the top right-hand side will be Bard. And when you do a search, it's going to 00:32:43.400 |
say, here are other searches you could do: oh, you want to search Mr. Beast when he's helped 00:32:47.800 |
people, or Mr. Beast when he's given away more money, or people who have copied and been inspired 00:32:51.560 |
by Mr. Beast. All that's going to occur inside of YouTube, and ChatGPT is not going to have access 00:32:55.560 |
to the YouTube corpus of data. And then when you do a search, it's going to be the same thing, 00:32:58.760 |
it's going to be on the right hand side. And it's going to be playing just like it is in Bing, 00:33:01.960 |
if you turn on your Android phone, they're going to make Google Assistant go right into Bard. 00:33:06.600 |
And Google Assistant is already used by hundreds of millions of people. So I think that Google 00:33:11.320 |
will roll. I think they're going to roll ChatGPT. I don't know who's going to win. But I'm looking 00:33:17.160 |
at this, Sacks-a-poo, more reductively as a capitalist, which is, what are people's incentives? Because 00:33:23.320 |
that's what they'll do. Google's incentive is to usurp ChatGPT usage by inserting something 00:33:33.560 |
inside of their existing distribution channels to suppress the ability for you to want to go 00:33:39.160 |
to the app. That's known as bundling. I think Facebook has that same incentive. Oddly, even though 00:33:45.720 |
Microsoft is such a deep partner, I think certain assets of Microsoft have that incentive, you're 00:33:49.960 |
talking collectively about five or six trillion dollars of market cap. Then when you add in Alexa and 00:33:55.080 |
Amazon and Siri and Apple, what is their incentive, I don't think their incentive is to let this 00:34:02.920 |
happen. And I think if you look at the Slack versus Microsoft Teams example: even a better-engineered 00:34:08.200 |
product that was excellent and widely deployed, even at hundreds of millions of users, doesn't much 00:34:13.480 |
matter when the competitor is more cleverly distributed and priced. And so those things, again, you may still 00:34:22.040 |
be right, all I'm saying is, it's just so early to know. And as slow and lumbering as some of 00:34:28.840 |
these big companies are, they are not so stupid as to kill their own golden goose, and they'll defend it 00:34:35.000 |
when threatened. So I think you just have to let it play out and see what happens. I want to finish the 00:34:38.680 |
point on Google. And then we can move on to the bundling thing. Let me just make the counter 00:34:41.720 |
argument, which is that I think Google is caught completely flat footed here, even though they 00:34:46.600 |
shouldn't have been because they published the original paper on transformers in 2017. They 00:34:51.560 |
should have seen where all this was going, but they didn't. OpenAI used that paper and commercialized 00:34:57.000 |
it. And the proof of that is there was just a lawsuit a couple of days ago, or at least a claim 00:35:02.680 |
by a former employee of Google who quit, because he said that they were using ChatGPT to train 00:35:09.720 |
their AI. So their AI is so far behind. They were violating the terms of use, hold on, 00:35:17.160 |
they were violating the terms of use of OpenAI to train their own AI on ChatGPT. That's not a 00:35:23.960 |
good sign. That's not a good sign. I also think hold on, hold on, I'm just making the counter 00:35:29.640 |
argument here. I mean, don't dismiss it out of hand, give me a chance to explain it. Moreover, 00:35:34.280 |
ChatGPT-4, which was just released a few weeks ago, we know that OpenAI had it and they were using 00:35:41.640 |
it internally for seven months. So the state of the art is not what we're using. It's what 00:35:47.240 |
OpenAI has internally; they're obviously working now on ChatGPT-5. And so if you're saying 00:35:53.640 |
that Bard is 80% of ChatGPT-4, well, I got news for you, it's probably 50% or 20% of 00:36:01.800 |
ChatGPT-5. And who knows what the product roadmap is inside of OpenAI. I am sure that they've got 00:36:08.680 |
200 ideas for things they could do to make it better and low hanging fruit. But look, regardless, 00:36:14.280 |
I think the pace of innovation here in development is going to speed up massively. I mean, there is 00:36:19.160 |
going to be a flurry of activity. I agree, it's hard to know exactly how it's going to play out. 00:36:24.040 |
But I think this idea that, oh, it's a foregone conclusion, these big companies are just going 00:36:28.200 |
to catch up with OpenAI, I think that there's a strong counterargument that that's not the case. 00:36:32.520 |
I'm making a very specific argument. It's not a foregone conclusion where all the value will 00:36:36.360 |
get captured. Just like in any of these major tidal waves. If you make the bets too early, 00:36:43.160 |
you typically don't make all the money. And it tends to be the case. And it has been in the past, 00:36:48.520 |
at least with these transformative moves. It's sort of in the early third of the cycle, 00:36:53.880 |
is where the real opportunities to make the tons of money emerge. And there's a lot of folks that 00:36:59.080 |
show you a path and then just don't necessarily capture the value. I'm not saying that that's 00:37:03.160 |
going to be the case here. All I'm saying is, if history is a guide, all of these other big waves 00:37:09.000 |
have shown that fact pattern. And so I'm very excited and I'm paying attention. But I'm just 00:37:14.760 |
being circumspect with this idea, because, you know, having been in the middle of a couple of these 00:37:18.920 |
waves before, I made all the money by waiting a couple of years. I don't know if that's going to 00:37:22.760 |
be true this time around. But right, that's sort of my posture right now. 00:37:25.720 |
So you look, you obviously have a point because we're only four months in. So how can we know 00:37:28.680 |
where this is going to be in five years. So you could be right to your point, Sacks. I think it's 00:37:33.000 |
clear. And this is, you know, big ups to the OpenAI team, that they will be one of the top two or 00:37:38.520 |
three players. Absolutely. We all agree on that, which is extraordinary in itself. And the top four 00:37:43.720 |
players, Freeberg, are obviously going to be Microsoft/OpenAI, we'll call that, like, whatever 00:37:49.720 |
that little, you know, pairing is, and then Google and Facebook. And then we haven't talked about Apple, 00:37:56.120 |
but obviously, Apple is not going to take this sitting down. And hopefully, they'll get in gear 00:38:00.760 |
and have Siri, you know, make it to the next level, or they'll just put her out to pasture. 00:38:05.000 |
If you were to look at those four, and we're sitting here a year from now, who has the best 00:38:11.080 |
product offering? Who has the biggest user base? Just take a minute to think about that. Because 00:38:15.880 |
you were at Google. And we all know, the word on the street is, it's the return of the kings. 00:38:22.360 |
Larry and Sergey are super engaged by all reports; every back channel, everybody I talked to, was 00:38:27.480 |
saying that they're there every day, they're obsessed with Google's legacy now and making this happen. 00:38:32.680 |
So what can you tell us in terms of who you think a year or two from now, we'll have the biggest 00:38:38.600 |
user base and be the most innovative amongst that quartet, or maybe you think there's other players 00:38:44.440 |
who will emerge? The advantage that OpenAI has, which is the advantage that any, call it, emerging 00:38:53.320 |
competitor, outsider, has, is that the incumbents are 00:39:01.080 |
handicapped by their current scale. Much of the consideration set that Google has had in deciding 00:39:07.560 |
what features and tools to launch with respect to AI over the last couple of years, has been driven 00:39:12.440 |
fundamentally by a concern about public policy and public reaction. And I know this from speaking to 00:39:18.280 |
folks there that are close enough to kind of indicate that Google has been so targeted, has 00:39:26.120 |
been such the point of attack, by governments around the world with respect to their scale and 00:39:31.400 |
monopoly and monopolistic kind of behavior. Some people have framed it as privacy concerns, you know, 00:39:39.240 |
etc., etc. The fines in the EU are extraordinary. So much of what goes on at Google today 00:39:45.160 |
is can I get approval to do this? And so many people have felt so frustrated that they can't 00:39:49.960 |
actually unleash the toolkit that Google has built. And so they've been harnessed and focused 00:39:54.680 |
on these internal capabilities. I think I mentioned this in the past, but things like, 00:39:58.440 |
what's the right video to show on YouTube to keep people engaged? What's the right ad to show to 00:40:04.040 |
increase click-through rates, etc., etc., versus building great consumer products, for fear of the 00:40:09.080 |
backlash that would arise, and governments coming down on them, and ultimately 00:40:13.560 |
attacking the core revenue stream. And this is no different than any other kind of 00:40:19.080 |
innovator's dilemma. You know, any other business of scale in any other industry historically 00:40:23.400 |
ultimately gets disrupted, because their job at that point is to protect their cash flow and their 00:40:28.360 |
revenue stream and their balance sheet and their assets, not to disrupt themselves, especially as a public 00:40:34.040 |
company, especially under the scrutiny and the watchful eye of governments and regulators. 00:40:38.200 |
So I think Google has, in aggregate, probably as good competitive talent, if not better talent, 00:40:44.840 |
than OpenAI and others. Google has arguably the best corpus of data upon which to train, 00:40:51.080 |
the best capabilities, the best toolkit, the best hardware, the lowest cost 00:40:55.400 |
for running these sorts of models, the lowest cost for serving them, etc., etc. So frankly, even though 00:41:00.360 |
they're way behind, the battle is theirs to lose, if they are willing to disrupt themselves. And 00:41:05.800 |
this is the moment that Larry and Sergey should wield those founders shares that they have. 00:41:10.360 |
And they should wield the comments that they wrote in that founders letter, that they will always 00:41:14.040 |
make the right decision for the long term for this company, even if it means taking a cost in 00:41:18.200 |
the short term and disrupting themselves. This is the moment to prove that those founders shares 00:41:23.320 |
were worth, you know, the negotiation to get there. And I think that it is going to require 00:41:29.080 |
a real degree of scrutiny, a real degree of regulatory uncertainty, a real degree of 00:41:34.040 |
challenge by governments, and public policy people, and perhaps even a revenue hit in the 00:41:38.520 |
near term to realize the opportunity, but I do think that they're better equipped to win if they 00:41:45.000 |
Well, really well said. I think the founders share insight is particularly interesting, 00:41:48.760 |
Sacks. The fact that... By the way, sorry for those who did nothing with them. 00:41:52.600 |
Gotcha. Yeah, no, no, I was just gonna say the exact same thing. It's like, if they don't use 00:41:56.360 |
it now, what would it take? And when? Yeah, this is good. Yet another, yet another case of the 00:42:03.160 |
emperor has no clothes, just a power grab by Silicon Valley execs, which was meaningless, 00:42:07.960 |
because if in this moment, you don't wield that power, and break that company into bits as you 00:42:12.840 |
need to, what was the point of having it? They need to come in and say, we're going to give 00:42:18.360 |
Bard results to 10% of users and ask them to give feedback on it. Because who has worse 00:42:22.920 |
queries than... Just one point I want to make there, Freeberg: who has more reinforcement learning 00:42:29.240 |
than Google? That search box is everywhere. And people write question after question in Gmail, 00:42:35.000 |
and Google Docs, etc., etc. I mean, they have so many people asking questions. And YouTube might 00:42:40.600 |
be the... the transcripts of YouTube, every video and the image of every video. Bananas. And the comments 00:42:46.200 |
under it, you know, the comments under the video, you have the transcript of what happened in this 00:42:50.440 |
video. And then what was the question and answer underneath it? Let me make the counterpoint, 00:42:55.160 |
please, to my own point, like, look at how Gerstner came after Zuck. So Zuck had his point 00:42:59.880 |
of view, his strongly held belief that AR VR was the future of the platform. That's what he wanted 00:43:05.000 |
to bet into. That's what he wanted to lean into. It's what he wanted to build the company against. 00:43:08.520 |
He did it. And then the financial analysts and the investors came at him and said, 00:43:13.560 |
this is a waste of money, focus on making money, you have a responsibility to shareholders, 00:43:17.880 |
F those founders shares, you don't deserve that 10x voting right, or whatever the framing might 00:43:21.880 |
have been to get him to say, you know what I acquiesce, I'm giving it up. And I think that 00:43:25.880 |
we should also think about what's going to happen on the other side, Google is a trillion plus dollar 00:43:29.320 |
market cap company. Their shares are owned by every public endowment, public pension fund, 00:43:34.680 |
institutional investor owns Google in their portfolio. So the backlash against Google 00:43:38.840 |
making a hard bet like this, and potentially destroying billions of dollars of cash flow 00:43:43.800 |
in the process every year, will not be easy. The same sorts of letters that Gerstner et al. 00:43:48.680 |
(and obviously, we love Gerstner, and, you know, we can all defend him all day long) sent 00:43:53.000 |
at Zuck is what may end up happening with Alphabet if they did choose to go this path. 00:43:58.040 |
Sacks, what do you think here about the founders shares, specifically, Google's chances 00:44:02.600 |
of disrupting themselves and, you know, just putting this into every product 00:44:07.560 |
and shoving it down users throats and catching up? 00:44:10.600 |
Well, I mean, with all due respect to Larry and Sergey, I mean, they've been on the beach a long 00:44:16.120 |
time. This reminds me of Apollo Creed coming out of retirement in Rocky IV. 00:44:31.160 |
Sam Altman may not look like Ivan Drago, but this is one shrewd character. 00:44:40.440 |
Yeah, he's, you know, he's a multi-time founder who sat at the top of YC and got to see everything 00:44:44.600 |
that worked. Yep. And got to see all the research and he's been plugging away at this for what, 00:44:51.240 |
like, years. So there's a... I just think there's a big gap to catch up on. Now, 00:44:57.320 |
Google has all the resources in the world. And they've got a lot of proprietary assets, 00:45:01.880 |
too. And they've got all the incentive in the world. So do I think that Google will be 00:45:05.320 |
one of the top four players in AI? Absolutely. But this idea that it's going to come in and steamroll 00:45:10.760 |
OpenAI... I have a prediction. I got a prediction. But then next year, Larry and Sergey take the 00:45:15.560 |
title of co-CEOs. And then they do a demo day where the two of them get on stage. 00:45:20.520 |
They actually do the demos of these products. If that happens, that's fiction. 00:45:26.440 |
That's it. Listen, Bezos is going to run for president. Those are my two predictions. 00:45:30.520 |
I'm taking love riddance. Can you imagine? Freeberg, what are the chances of Larry 00:45:34.760 |
and Sergey taking co-CEO slots? That's prediction one, and then prediction two: 00:45:39.000 |
what are the chances of them running the next Google I/O, where they get on stage and they walk 00:45:44.200 |
people through all the products that they shepherded and that they have a vested interest in and 00:45:48.840 |
that they want to demo? There is an institutional problem at Google at the top level, which does 00:45:54.760 |
need to be solved, which is this position of constantly being in defense against the scrutiny, 00:46:02.040 |
again, of regulators and public policy folks and, you know, all these different groups that are 00:46:07.240 |
against Google. And so as a result, the kind of cultural seasoning, particularly at the executive 00:46:13.800 |
and the board level, has been one of, like, you know, protect the nest, don't overreach, don't 00:46:19.720 |
overstep. And it's a real, you know, I think one for the business school books or whatever, 00:46:25.800 |
ultimately is what they end up doing about it. Because now is, you know, the time when 00:46:30.200 |
that defensive posture is really kind of putting the entire business at risk. 00:46:34.040 |
The same thing happened to Microsoft. Remember in the late 90s? That's right, 00:46:36.920 |
when they got crushed by that antitrust lawsuit, they became very defensive. 00:46:43.000 |
They had a wartime CEO come in, Ballmer came in. And, you know, followed by kind 00:46:48.200 |
of an innovative guy who could kind of continue to build. And I think that there may be a moment 00:46:53.720 |
here. Look, I love Sundar. He's a great guy, great CEO. Sundar, by the way, I don't know 00:46:58.760 |
if I ever told you this, but he and I started at Google on the same day. We're both in the same 00:47:01.800 |
Noogler class; we wore the freaking hats on the TGIF day, up on stage. He was a product manager, 00:47:07.160 |
and now he runs the company. But I think the question is like, whether it's the CEO or the 00:47:13.320 |
broader whole kind of executive org or the board, a degree of disruption necessary to shift that 00:47:18.440 |
cultural seasoning is so necessary right now, for them to have a shot at this. And similar to what 00:47:22.760 |
you just said, Sacks, like, you're gonna need a Ballmer-type moment to kind of, you know, 00:47:27.400 |
reinvigorate that business. And by the way, I'll tell you about such a moment. I think 00:47:31.480 |
it's an important point: when Ballmer took over during that period after Gates, 00:47:37.240 |
when they were on their heels, he basically just focused on revenue and paying dividends 00:47:42.680 |
and stock buybacks, and the stock went sideways. And he missed mobile. And now... 00:47:48.840 |
J-Cal, you're forgetting one big thing, which is that that was also because he had to operate 00:47:53.480 |
under a consent decree with the DOJ. Exactly. So the product managers of Microsoft were replaced 00:47:57.880 |
with lawyers from the Department of Justice, and you had to get their sign off before you 00:48:01.560 |
could ship anything. So we have to remember that those things probably slowed Microsoft down as 00:48:06.920 |
well. And the great thing that Satya had was a blank slate and the removal of that consent decree. 00:48:14.120 |
So he was able to do everything that just made a lot of sense. And he's executed flawlessly. 00:48:19.720 |
I think the problem at Google is not Sundar or Larry or Sergey. I think it's more in the deep 00:48:28.520 |
bowels of middle management of that company, which is that there's just far too many people 00:48:33.080 |
that probably have an opinion. And their opinion is not shrouded in survival. Their opinion is 00:48:39.960 |
shrouded in elite language around what are the moral and ethical implications of this and where 00:48:46.040 |
has this been properly tested on the diaspora of 19 different ethnic tribes of the Amazon? 00:48:53.000 |
That's the kind of decision making that is a nice to have when you are the second or third 00:48:58.920 |
most valuable technology company in the world. But you have to be able to pause that kind of 00:49:05.080 |
thinking and instead get into wartime survival mode. And it's very hard. So it almost doesn't 00:49:11.000 |
matter at this point what Sundar wants. The real question is, what is the capability of middle 00:49:17.720 |
management to either do it or get out of the way? And I think that in all of these big companies 00:49:23.080 |
that struggle, what you really see is an inability for middle management to get out of the way or 00:49:28.680 |
frankly, you just need somebody to fire them. And if you look at folks who get their groove back, 00:49:34.840 |
let's see what Facebook does. What are they targeting? They're targeting middle management. 00:49:40.360 |
If you look at what Elon does in the companies that he owns, there is virtually no middle management. 00:49:46.840 |
It's like, get out of the way, build product, build product and ship it. Yeah. And what is 00:49:53.640 |
the core truth? And so if failure is there in front of you, and if David is right that you 00:49:58.200 |
have 200 million users come out of nowhere, who are voting every day with their time and attention 00:50:03.720 |
to use an app, and that doesn't create a five-alarm fire where you get middle management out 00:50:09.080 |
of the way and the senior-most people are talking to the people doing the work and shipping 00:50:15.000 |
things every day, you are toast. You are toast. A lot of people are starting to think 00:50:20.760 |
we're moving a little bit too fast when it comes to OpenAI's incredible performance, with GPT-4, 00:50:27.480 |
the plugins, and all this. And so the Future of Life Institute, which was formed in 2015, a 00:50:33.560 |
nonprofit that's focused on de-risking major technology like AI, did a petition titled 00:50:41.240 |
"Pause Giant AI Experiments: An Open Letter." A bunch of computer scientists signed this letter. 00:50:47.000 |
And the letter says, quote: we must ask ourselves, should we let machines flood our information 00:50:54.840 |
channels with propaganda and untruth? Should we automate away all the jobs, including the 00:50:59.880 |
fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, 00:51:04.440 |
obsolete, and replace us? Should we risk loss of control of our civilization? A number of notable 00:51:09.640 |
tech leaders like Elon, Steve Wozniak, and a handful of DeepMind researchers have signed it. 00:51:15.320 |
What do you guys think of the letter? Are we going to slow down or not? And then we can ask the 00:51:18.840 |
question generally, how close are we getting to AGI, which is what everybody's scared of: 00:51:22.680 |
that these agents start working with each other in the background to do things that are 00:51:27.560 |
against human interest. I know it sounds like science fiction, but there is a theory that when 00:51:32.840 |
these AIs start operating on their own, like we explained in the previous segment here with 00:51:38.920 |
plugins, and they make agents that are operating based on feedback from each other, could they get 00:51:45.560 |
out of control and be mischievous and then work against human interest? What do you think, Sacks? 00:51:49.880 |
I think there's a difference between what could happen in the short term and then what could 00:51:53.560 |
happen in the long term. I think in the short term, everything we're seeing right now is 00:51:58.120 |
very positive. And let me just give you an example. There was a really interesting 00:52:02.600 |
tweet storm from a guy who wrote about how ChatGPT saved his dog. Did you guys see this? 00:52:09.160 |
This is one of the really mind-blowing use cases to me. So his dog was sick, he took him to a vet, 00:52:16.680 |
the vet prescribed some medication, and three days later the dog was still sick, in fact even worse. 00:52:20.600 |
So the owner of the pet just literally copied and pasted the lab results from the blood test for the 00:52:27.720 |
dog, with all the lab values, into ChatGPT and said, what could this be, what's your 00:52:33.240 |
likely diagnosis? ChatGPT gave three possible answers, three illnesses. The first one was what 00:52:40.200 |
the vet had basically diagnosed, so that wasn't it. The second one was excluded by another test. 00:52:45.480 |
So he then went to a second vet and said, listen, I think my dog has the third one. 00:52:50.200 |
And the vet prescribed something, and sure enough, the dog is cured. Saved. So that's really mind-blowing: 00:52:57.880 |
even though ChatGPT hasn't been specifically optimized, as far as we know, for lab results, 00:53:04.120 |
it could figure this out. The reason I'm mentioning this is it gives you a sense of 00:53:07.400 |
the potential here to cure disease. You know, I could see major medical breakthroughs 00:53:14.760 |
based on AI in the next five or 10 years. 00:53:19.400 |
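For illustration, a minimal sketch of the kind of query Sacks is describing, assuming the OpenAI Python SDK (v1+) with an API key in the environment; the model name, prompt wording, and lab values here are hypothetical stand-ins, not what the pet owner actually used:

```python
# A sketch of the "paste the lab results into ChatGPT" workflow.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical lab values; the real post's numbers are not reproduced here.
lab_results = "ALT: 245 U/L (high); bilirubin: 2.1 mg/dL (high); PCV: 28% (low)"

response = client.chat.completions.create(
    model="gpt-4",  # assumed model name
    messages=[{
        "role": "user",
        "content": (
            "Here are my dog's blood test results:\n"
            f"{lab_results}\n"
            "What are the three most likely diagnoses?"
        ),
    }],
)
print(response.choices[0].message.content)
```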
Now, the question is, what happens in the long term, you know, as the AI gets smarter and smarter? We are kind of getting 00:53:23.960 |
into the realm of science fiction, but here would be the scenario: you're on 00:53:27.960 |
ChatGPT 10, or 20, or whatever it is, or maybe some other company's AI. And the developers ask 00:53:35.800 |
the AI, hey, how could you make yourself better? Now do it. Which is a question we ask ChatGPT 00:53:41.640 |
all the time in different contexts. And ChatGPT will already have the ability to write perfect 00:53:47.400 |
code by that point. Code writing is one of its superpowers already, I think. 00:53:52.040 |
So it gives itself the ability to rewrite its code, to auto-update it, to recursively make itself 00:53:57.880 |
better. I mean, at that point, isn't that like a speciation event? Doesn't that very quickly lead 00:54:04.200 |
to the singularity, if the AI has the capability to rewrite its own code to make itself better? 00:54:11.160 |
And, you know, won't it very quickly write billions of versions of itself? 00:54:14.200 |
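The loop Sacks is imagining can be sketched in a few lines. Everything here is hypothetical: ask_model stands in for whatever interface such a future system might expose, and run_benchmark for however it would score its own candidate versions:

```python
# A toy sketch of recursive self-improvement: the model proposes a rewrite of
# its own source, the rewrite is scored, and only improvements are kept.
def ask_model(prompt: str, source: str) -> str:
    """Stand-in for an LLM call that returns a revised version of `source`."""
    return source  # a real system would return modified code here

def run_benchmark(source: str) -> float:
    """Stand-in for an evaluation suite that scores a candidate version."""
    return 0.0

def self_improve(source: str, generations: int = 10) -> str:
    best, best_score = source, run_benchmark(source)
    for _ in range(generations):
        candidate = ask_model("How could you make yourself better? Now do it.", best)
        score = run_benchmark(candidate)
        if score > best_score:  # keep only strict improvements
            best, best_score = candidate, score
    return best
```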
And, you know, it's very hard to predict what that future looks like. Now, I also don't know 00:54:19.320 |
how far away we are from that. It could be 10 years, 20 years, 30 years, whatever. 00:54:22.680 |
But I think it's a question worth asking for sure. 00:54:26.600 |
Is it worth slowing down, though, Sacks? Should we be pausing? Because, based on what you said, 00:54:31.880 |
you know, I think you've framed it properly: when these things hit a certain point, and they start 00:54:36.200 |
reinforcing their own learning with each other, they can go at infinite speed, right? This is not 00:54:42.120 |
comparable to human speed; they could be firing off millions, billions of different 00:54:47.400 |
scenarios. I think you're right. We're definitely now on this fuck-around-find-out curve. Yeah. And so there's only 00:54:54.600 |
one way to really find out, which is somebody is going to push the boundaries. The competitive 00:55:00.120 |
dynamics will get the better of some startup; they'll do something that people will look back 00:55:06.120 |
on and say, whoa, that was a bridge too far. So yeah, it's just a matter of 00:55:11.320 |
time. Yeah, I think we're not going to slow down. I actually think it's going the other way. I think 00:55:16.120 |
things are going to speed up. And the reason they're going to speed up is because the one 00:55:20.280 |
thing Silicon Valley is really good at is taking advantage of a platform shift. And so when you 00:55:25.320 |
think about like all the VCs, and all the founders, you know, everyone accuses us of being lemmings. 00:55:31.480 |
And so when there's kind of a fake platform shift, or people glom on to 00:55:37.480 |
something that ends up not being real, everyone's kind of got egg on their faces. But the flip side 00:55:42.120 |
of that is that when the platform shift is real, Silicon Valley is really good at throwing money 00:55:47.560 |
at it, the talent knows how to go after it. And they keep making it better and better. And so 00:55:53.720 |
that's the dynamic we're in right now. Look at the last YC class: 70% of it was already all AI 00:55:58.120 |
startups. For the next one, probably 95%. So I think that we're on a path here where the pace of 00:56:04.360 |
innovation is actually going to speed up. Companies are going to compete with each other, they're 00:56:08.520 |
going to seek to invent new capabilities. And I think the results are going to all be incredibly 00:56:13.960 |
positive for some period of time. Like, you know, the vet example: we're going to cure 00:56:18.840 |
illnesses, we're going to solve major problems. If the results are positive, then we invest more, we trust more. 00:56:25.320 |
But the paradox of that, as Chamath is pointing out, Friedberg, is if we trust it more, we invest 00:56:29.720 |
more. Then some person in a free market is going to say, you know what, I need to beat 00:56:34.360 |
ChatGPT. Therefore, I'm going to take the rails off this thing, I'm going to let it go 00:56:38.440 |
faster, and take off some constraints, because I need to win and I'm so far behind. 00:56:43.080 |
How do you feel about that scenario that Chamath and 00:56:47.400 |
Sacks teed up, Friedberg? I think GPT-3 ran on about 700 gigs. Is that right? 00:56:57.320 |
Does anyone know what GPT-4 runs on? It's got to be some number that's, you know, 00:57:03.800 |
not many multiples of that. 00:57:12.840 |
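A rough sanity check on that figure: GPT-3 is reported to have 175 billion parameters, and at 32-bit floats the weights alone come to about 700 GB. OpenAI has not disclosed GPT-4's size, so that part really is a guess:

```python
# Back-of-the-envelope: parameter count times bytes per parameter.
params = 175e9                                # GPT-3's reported parameter count
print(f"{params * 4 / 1e9:.0f} GB at fp32")   # ~700 GB at 4 bytes/param
print(f"{params * 2 / 1e9:.0f} GB at fp16")   # ~350 GB at half precision
```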
But look, someone could make a copy of this thing, fork it, and develop an entirely new model. I think that's what's incredible about software and digital 00:57:20.600 |
technology, and it also kind of, you know, means that it's very hard to contain. Similar to what 00:57:26.760 |
we've seen in biology ever since biology got digitized through DNA sequencing and the ability 00:57:32.920 |
to kind of express molecules through gene editing. You know, you can't control or contain the ability 00:57:39.720 |
to do gene-editing work at all, because everyone knows the code. Everyone can make CRISPR-Cas 00:57:46.280 |
molecules, everyone can make gene-editing systems in any lab anywhere. Once it was out, it was out. 00:57:51.480 |
And now there's hundreds of variants for doing gene editing, many of which are much improved 00:57:57.240 |
over CRISPR-Cas9. I use that as an analogy because it was this breakthrough technology that 00:58:01.480 |
allowed us to precisely, specifically edit genomes. And that allowed us to engineer biology and do 00:58:06.120 |
these incredible things where biology effectively became software. And remember, CRISPR-Cas9 00:58:11.160 |
gave us effectively a word-processing-type tool: find and replace. And the tooling that's evolved 00:58:18.280 |
from that is much better. So whatever is underlying GPT-4, whatever the parameters are, 00:58:26.280 |
whatever that model is, if a close-enough replicant of that model exists, or a copy of that model is 00:58:32.680 |
made, and then new training data and new evolutions can be done separately, you could see many, many 00:58:37.960 |
variants kind of emerge from here. And I think this is a good echoing of Chamath's point: we don't 00:58:41.960 |
know what's ultimately going to win. Is there enough of a network effect in the plugin model, 00:58:46.440 |
as Sacks pointed out, to really give OpenAI the sustaining competitive advantage? I'm not sure. 00:58:51.560 |
The model runs on 700 gigs. That's less data than, you know, fits on my iPhone. So, you know, 00:58:58.200 |
I could take that model, I could take the parameters of that model, and I could create 00:59:01.720 |
an entirely new version, I could fork it, and I could do something entirely new with it. 00:59:04.680 |
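A minimal sketch of what forking a model would mean in practice, using a small open checkpoint as a stand-in (GPT-4's weights are not public, so this is illustrative only): copy the published weights, then keep training your copy on new data until it diverges:

```python
# Fork a model by copying its weights locally; assumes the Hugging Face
# `transformers` library and the open `gpt2` checkpoint as a stand-in.
from transformers import AutoModelForCausalLM, AutoTokenizer

base = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Your "fork": a private copy you are free to fine-tune on whatever new
# training data you like, after which it diverges from the original.
model.save_pretrained("./my-fork")
tokenizer.save_pretrained("./my-fork")
```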
So I don't think you can contain it. I don't think that this idea that we can put in place 00:59:10.200 |
some regulatory constraints and say it's illegal to do this, or, you know, try and, 00:59:17.080 |
you know, create IP around it, or protections around it, is realistic at this stage. 00:59:21.480 |
The power of the tool is so extraordinary; the extendability of the tools is so extraordinary. 00:59:25.960 |
So the economic and the various other incentives are there for, you know, other models to emerge, 00:59:32.520 |
whether they're directly copied from someone hacking into OpenAI's servers and making a copy of 00:59:37.080 |
that model, or whether they're, you know, open-sourced, or whether someone generates 00:59:40.760 |
something that's 95% as good, and then it forks and a whole new class of models emerges. 00:59:45.720 |
I think this is, as Sacks pointed out, highlighting the kind of economic-uprooting, market- 00:59:52.200 |
uprooting, social-uprooting potential, and many models will start to kind of come to market. 00:59:57.800 |
What do we think the impact of white-collar jobs getting annihilated by this technology will be, 01:00:02.120 |
if that in fact happens? I want to say one thing on this. Yeah, look, let me just share one example here. So 01:00:06.840 |
here's a Reddit post that I was made aware of earlier this week: I lost everything that made 01:00:11.960 |
me love my job through Midjourney overnight. I am employed as a 3D artist at a small games 01:00:16.840 |
company of 10 people; our art team is two people. He basically explains that since Midjourney 01:00:22.520 |
version five came out, he's not an artist anymore, nor a 3D artist. All he does is prompting, 01:00:27.800 |
photoshopping, and implementing good-looking pictures. And he basically says this happened 01:00:33.080 |
overnight, and he had no choice; his boss also had no choice. He says, I am now able to create, rig, 01:00:38.120 |
and animate a character that's spit out from MJ, Midjourney, in two to three days; before, it took 01:00:43.640 |
us several weeks in 3D. The difference is that he cares about his, you know, job, and for his boss 01:00:48.440 |
it's just a huge time and money saver. He's no longer making art, and the person who was number two in 01:00:53.160 |
the organization, who didn't make as good content as him, is now embracing this technology because 01:00:58.600 |
it curries favor with his boss. And he ends basically saying, getting a job in the game 01:01:04.280 |
industry is already hard, but leaving a company and a nice team because AI took my job feels very 01:01:09.560 |
dystopian. I doubt it would be better at a different company. Also, I am between grief 01:01:14.520 |
and anger, and I am sorry for using your art, fellow artists. This is yet another reason 01:01:20.200 |
that Figma really needs to close this acquisition by Adobe. I mean, it's like, the value of these 01:01:27.080 |
apps is just getting gutted. If you take a workflow management tool for things like design 01:01:33.480 |
and imagery, and you reduce the work by 90%, it's like, what is that app experience worth? 01:01:40.200 |
And how could you replicate it if you were a big company that already has distribution? That's one 01:01:47.240 |
comment. But what I would tell you, Jason, to answer the white-collar question, is I think there are a 01:01:51.560 |
handful of companies you need to look at exclusively, because they will be the first ones to 01:01:56.520 |
really figure out how to displace human labor. And that is TCS, Tata Consultancy Services, 01:02:03.800 |
Accenture, Cognizant; these are all the folks that do coding-for-hire work at scale. I think 01:02:10.920 |
Accenture has something like 750,000 employees. So the incentive to squeeze opex, to 01:02:18.280 |
create better utilization rates, to increase profitability, is quite obvious. It always has 01:02:24.280 |
been. They will be the first people to figure out how to use these tools at scale. Before the law 01:02:29.240 |
firms or the accounting firms or any of those folks even sort of try to figure out how to 01:02:33.320 |
displace white-collar labor, I think it's going to be the coding jobs, and it's going to be the coding- 01:02:36.760 |
for-hire jobs that companies like Accenture and TCS, those business-process folks, do for other 01:02:42.360 |
people. They're going to need half as many people, 25% as many people; 01:02:48.120 |
we're going to find out the efficient frontier. Yeah. 01:02:50.440 |
I see it a different way. I mean, this argument that productivity leads to job loss has been made 01:02:56.200 |
for hundreds of years. And it's always been refuted. When you make human beings more productive, 01:03:01.240 |
it leads to more prosperity, more wealth, more growth. And so yeah, it's easy to think, 01:03:07.080 |
in a narrow way, about the jobs that are going to be displaced. But why would that be? It's 01:03:11.000 |
because you're giving leverage to other human beings to get more done. And some of those human 01:03:16.040 |
beings, really anybody with a good idea is now going to be able to create a startup much more 01:03:20.840 |
easily. So you're going to see a huge explosion in creativity in startup creation, new companies, 01:03:27.400 |
new jobs. Think about the case of, you know, Zuckerberg founding Facebook at Harvard: 01:03:33.480 |
he wrote the first version himself, maybe with a couple of friends, that project happened and 01:03:38.520 |
turned into a giant company because he was able to self execute his idea without needing to raise 01:03:44.360 |
venture capital or even recruit employees, even really before forming a company. Anyone with a 01:03:50.520 |
good idea will be able to do that soon. You're going to be able to use these AI tools, and they truly 01:03:54.280 |
will be no-code: you'll be able to create an app or a website just by speaking to some AI program 01:03:59.640 |
in a natural language way. So more flowers will bloom more startups, more projects. Now, 01:04:04.440 |
it will create I think a lot of dislocation. But for every testimonial that is like the one that 01:04:11.880 |
you showed, which I think is, I'd say, a little bit overly dramatic, I have seen 10 or 100 01:04:17.960 |
testimonials from coders on Twitter or other blogs talking about the power that these new 01:04:24.280 |
tools give them. They're like, this makes me a 10x engineer, right? And especially these 01:04:29.320 |
junior engineers who are right out of school, who don't have 20 years of coding history, 01:04:33.400 |
they get superpowers right away. Like it makes them so much better. 01:04:38.200 |
Let me give you a response to that guy, using Sacks's point. That guy is saying, what used to 01:04:45.160 |
take me weeks, I can now do in two to three days, and I feel like my work is gone. And that's because 01:04:50.680 |
he's thinking in terms of his output being static. And if he thinks about his output being dynamic, 01:04:56.360 |
he can now, in the matter of three weeks, instead of making one character, make a character 01:05:01.960 |
every two days. So he can make 30 characters in three weeks. That's an alternative way for him 01:05:07.400 |
to think about what this tooling does for him and his business. The number of video games will go up 01:05:12.360 |
by 10x, or 100x, or 1000x. The number of movies and videos that can be rendered on computers can go up 01:05:18.600 |
by 10x or 100x or 1000x. This is why I really believe strongly that in some period of time, 01:05:25.000 |
we will all have our own movie or our own video game ultimately generated for us on the fly, 01:05:29.880 |
based on our particular interests. There will certainly be shared culture, shared themes, 01:05:34.040 |
you know, shared morality, shared things that tie all these things together. 01:05:39.480 |
And that will become the shared experience. But in terms of us all consuming the same content, 01:05:43.960 |
it will really be like YouTube and TikTok: we're all consuming different stuff all the time. 01:05:48.680 |
And this will enable an acceleration of that evolution and personalization. I'll also highlight, 01:05:54.120 |
you know, back in the day, one human had to farm by hand. And we eventually got the tool of 01:05:59.960 |
a hoe, and we could put it in the ground and, you know, make stuff faster. And then we got a plow. 01:06:05.400 |
And then we got a tractor. And today, agricultural farm equipment allows one farmer to farm over 10,000 01:06:13.240 |
acres. You go to Western Australia, it's incredible: these guys have 24-row planters 01:06:16.360 |
and harvesters. And it's completely changed the game. So the unit of output per farmer 01:06:21.640 |
is now literally millions of times what it was just 150 years ago. 01:06:25.480 |
And in that case, Friedberg, nobody wants to do backbreaking labor in the fields, and everybody 01:06:30.280 |
wants that cheaper food. But in this case, let me just read you one quote that I didn't read in the 01:06:34.520 |
original reading of this. He says, I want to make art that isn't the result of scraped internet 01:06:39.960 |
content from artists that were not asked. And so I think that's part of this: it's bespoke 01:06:44.920 |
art. But the one question I have for Sacks: Sacks, we started this conversation 01:06:51.720 |
saying, hey, this is different than anything in terms of efficiency that came before it. This is, 01:06:55.640 |
and I'm going to put some words in your mouth here, like a step function more efficient. So 01:07:00.520 |
to the argument of, hey, efficiency has always resulted in, you know, more ideas, and we've 01:07:07.640 |
found something to do with people's time: is this time different, potentially, 01:07:11.240 |
because this is so much more powerful? This isn't just like a spellchecker. 01:07:14.360 |
I would say it differently. And I agree with what J-Cal is saying, because I think that 01:07:18.360 |
the thing that technology has never done is try to displace human judgment. It's allowed us to 01:07:29.480 |
replace physical exertion of energy, but it has always preserved humans injecting our judgment. 01:07:37.000 |
And I think this is the first time where we're being challenged with autonomous systems 01:07:42.440 |
that have some level of judgment. Now we can say, and it's true, 01:07:47.560 |
again, we're four months in, that that judgment isn't so great. But eventually, 01:07:53.320 |
and because of the pace of innovation, eventually is probably not that far away, 01:07:58.760 |
the judgment will become perfect. I'll give you a totally different example. You know, 01:08:02.520 |
how many pilots are there in the world? Will we, at some point in the next 10 years, want 01:08:09.320 |
folks to actually manually take off and land? Or will we want precision-guided instrumentation and 01:08:16.680 |
computers and sensors that can guarantee a pitch-perfect landing every single time in all kinds of 01:08:21.800 |
weather conditions, so that planes can even have 50x the number of sensors with a computer 01:08:28.520 |
that can then process it and act accordingly? Just a random example that isn't even thought 01:08:32.840 |
of when we talk about sort of where AI is going to rear its head. I think that this judgment idea 01:08:38.120 |
is an important one to figure out. Because this is the first time I've seen something that is 01:08:43.160 |
bumping up against our ability to have judgment. And what this person was talking about in this 01:08:47.240 |
Midjourney example is that his judgment has been usurped. Yes. Yeah, I would disagree. 01:08:53.320 |
I don't know. Yeah, let me just make one point on this. So, you know, an image is a 01:08:59.720 |
matrix of, you know, data that's rendered on a screen as pixels, and those pixels are different 01:09:05.480 |
colors. And, you know, that's what an image is. Is it? Or is it the judgment of the creator? 01:09:11.000 |
Well, no, I'm just saying an image in general. So when Adobe Photoshop and digital photography 01:09:15.240 |
arose, photographers were like, this is, you know, BS. Why are you digitizing? Photography 01:09:20.200 |
was analog and beautiful before. And then what digital photography allowed is the photographer 01:09:25.080 |
to do editing and to do work that was creative beyond what was possible with just a natural 01:09:30.920 |
photograph taken through a camera. And they're arguably different art forms, but it was a new 01:09:34.600 |
kind of art form that emerged through digital photography. And then in the early 90s, there 01:09:38.840 |
was a plugin suite called Kai's Power Tools that came out for Adobe Photoshop. It was a third- 01:09:44.920 |
party plugin set; you would buy it and then it would work in Photoshop, and it did things 01:09:49.080 |
like motion blur, sharpening, pixelation, all these interesting kind of features. And 01:09:54.680 |
prior to those tools coming out, the judgment of the digital artist, the digital photographer, was 01:09:59.480 |
to go in and do pixel-by-pixel changes on the image to make that image look 01:10:04.600 |
blurry, or to make it look sharper, or to make it look like it had some really interesting motion 01:10:08.840 |
feature. And Kai's Power Tools created this instant toolkit where, in a few seconds, you 01:10:14.200 |
created a blur on the image. And that was an incredible toolkit. But a lot of digital artists 01:10:19.160 |
said, this is automating my work. What is my point now? Why am I here? And the same happened in 01:10:24.280 |
animation when CGI came around, and animators were no longer animating 01:10:28.760 |
cels by hand. And at every point in this evolution, there was a feeling of loss initially, 01:10:33.880 |
but then the evolution of a whole new art form emerged, and an evolution of a whole new area of 01:10:38.040 |
human creative expression emerged. And I think we don't yet know what that's going to look like. 01:10:42.280 |
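Friedberg's pixel-matrix point in miniature: a blur tool is just arithmetic over that matrix. A minimal sketch, assuming a grayscale image stored as a numpy array; suites like Kai's Power Tools packaged effects of this kind behind a single click:

```python
# A 3x3 box blur: replace each pixel with the average of its neighborhood.
import numpy as np

def box_blur(image: np.ndarray) -> np.ndarray:
    h, w = image.shape
    padded = np.pad(image, 1, mode="edge")  # repeat edge pixels at the border
    out = np.zeros((h, w), dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
    return out / 9.0                        # nine samples per pixel
```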
Do you think the level of judgment that AI offers you is the same as the level of 01:10:49.160 |
judgment that Kai's Power Tools offered? Yeah, look, I mean, I think that the person making 01:10:52.600 |
the judgment or the decision about which pixel to change into what color felt like, you know, 01:10:56.600 |
I have control. And I think it's ultimately... I totally disagree with you. I mean, 01:11:00.840 |
I think that this is a magnitude different. It's more than a magnitude. Yeah. 01:11:05.480 |
Look, you and I have sat in spreadsheets, and 01:11:11.800 |
I'm generally happy with this idea. So I'll give you a different example. 01:11:14.920 |
Today, we use radiologists and pathologists to identify cancers. Yep. There are closed-loop 01:11:22.600 |
systems. We have one right now that's in front of the FDA that is a total closed-loop system, 01:11:27.240 |
one that will not need any human input. So I don't know what those folks do, 01:11:33.000 |
except what I can tell you is that we can get cancer detection basically down to a 0% error 01:11:40.600 |
rate. That is not possible with human intervention. That is judgment, right? So I just think it's 01:11:48.040 |
important to really acknowledge that this is happening at a level that it's never happened 01:11:52.200 |
before. You may be right that there's some amazing job for that radiologist or pathologist to do in 01:11:58.040 |
the future. I don't know offhand what that is. But these are closed-loop systems now 01:12:04.360 |
that think for themselves and self-improve. I get it. But I think that there is an 01:12:08.760 |
unfathomable set of things that emerge. We did not have the concept of Instagram influencers, 01:12:14.360 |
we did not have the concept of personal trainers, we did not have the concept of like, 01:12:19.800 |
all these new jobs that have emerged in the past couple of decades that people enjoy doing, that 01:12:25.000 |
they can make money doing, that are a greater kind of experience and level of fulfillment for those 01:12:29.800 |
that choose and have the freedom to do it than what they were having to do before, when they 01:12:34.200 |
had to work just to make money. What do you think that radiologist or pathologist wants to do? 01:12:38.760 |
Be a trainer or Pilates instructor? No, I don't think that's what it's gonna look like. 01:12:42.600 |
All right, Sacks, you have any thoughts on this as we wrap this topic? There's obviously a 01:12:48.120 |
lot of passion coming out. Yeah: the elimination of white-collar jobs in a massive way. 01:12:52.760 |
I think that this is a short term versus long term thing. In the short term, I see the benefits 01:12:57.240 |
of AI being very positive, because I don't think it's, in most cases, wiping out human jobs; it's 01:13:03.240 |
making them way more productive. You still need the developer; it's just that they are five times 01:13:07.960 |
or 10x more productive. But I don't think we're at the point in the short term where we're gonna be 01:13:12.600 |
able to eliminate that role entirely. What I've seen in basically every startup I've ever been 01:13:17.800 |
a part of is that the limiting factor on progress is always engineering bandwidth. That is always 01:13:23.800 |
the thing that you wish you had more of. Totally. 01:13:26.760 |
The product roadmap is always the most competed-on thing inside the organization. 01:13:31.000 |
Everyone's trying to get their project prioritized, because there's just never 01:13:35.080 |
enough engineering bandwidth. It's really the lifeblood of the company. If you make 01:13:39.240 |
the developers more productive, it maybe just accelerates the product roadmap. 01:13:42.680 |
I don't think in the short term that what's going to happen is these companies are going to look to 01:13:47.800 |
cut all their developers because one or two of them can do 10x the work. I think that they're 01:13:52.360 |
going to try and accelerate their product roadmaps. Now, again, you have this long term concern that 01:13:58.280 |
maybe you don't need developers at all at some point. But I think that the benefits of developing 01:14:03.640 |
this technology are so great in the short to midterm that we're going down that path no matter 01:14:07.720 |
what. And we're just gonna have to find out what that long term really looks like. And maybe it 01:14:11.480 |
will look very different. I mean, once we get past the short term, 01:14:16.520 |
we may have a different long term view. I think in this narrow vertical, I 100% 01:14:21.560 |
agree with you. Look, I think that AI is going to eliminate unit testing; it has already done 01:14:26.600 |
so. It's going to eliminate most forms of coding. The engineers that you have, all of them, will now 01:14:31.560 |
become 10x engineers. So with fewer of them, or with the same number, you'll be able to do as much 01:14:37.480 |
or more than you could have before. That's a wonderful thing. And all I'm saying on that 01:14:40.840 |
specific narrow vertical is you'll see it first rear its head in companies like Accenture and TCS 01:14:46.040 |
and Cognizant, because they have an immediate incentive to use this tooling to drive 01:14:52.760 |
efficiency and profitability that's rewarded by shareholders. It'll be less visible in other 01:14:58.040 |
companies. But what I am saying, though, is that you have to think about the impact 01:15:02.280 |
on the end markets for a second. And I think that AI does something that other technology layers 01:15:10.280 |
have never done before, which is supplant human judgment in a closed-loop manner. And I just think 01:15:15.960 |
it's worth appreciating that there are many systems and many jobs that rely on human 01:15:21.320 |
judgment, where we deal with error bars and an error rate that a computer will just destroy and 01:15:30.840 |
blow out of the water. And we will have to ask ourselves, should this class of job exist with 01:15:36.760 |
its inherent error rate? Or should it get replaced fully by a computer which has no error rate? And 01:15:41.960 |
I think that's an important question that's worth putting on the table. 01:15:44.440 |
Okay, so let's wrap here. My final thought on it is, you're going to see 01:15:49.080 |
entire categories of jobs go away. We've seen this before: phone operators, travel agents, 01:15:54.760 |
copy editors, illustrators, logo designers, accountants, sales development reps. I'm seeing 01:15:58.840 |
a lot of these job functions in the modern world, like phone operators previously, and I think these 01:16:04.920 |
could wholesale just go away and just be done by AI. And I think it's going to happen 01:16:08.840 |
in a very short period of time. And so it's gonna be about who can transition. And some people might 01:16:13.160 |
not be able to make the transition, and that's going to be pain and suffering. And it's going 01:16:15.720 |
to be in the white-collar ranks, and those people have more influence. So I think this could 01:16:19.960 |
lead to some societal disturbance. I'm going to learn Pilates and be an influencer. 01:16:25.160 |
That's it. But I do agree with Sacks that the software development backlog, if this is what 01:16:29.000 |
you're saying, is so great that I don't think we'll see it in software development for a decade 01:16:33.160 |
or two. There's just so much software that still needs to be made. All right, last week we talked 01:16:36.200 |
about TikTok, and the first bipartisan hearing we've seen in a long time, and people actually, 01:16:42.360 |
I think, framing correctly exactly how dangerous it is, in my opinion, to have TikTok in the 01:16:46.280 |
United States. And of course, then we get the great disappointment of the actual bill. 01:16:52.520 |
The RESTRICT Act was proposed by Senator Mark Warner, Democrat of Virginia, on March 7. The problem 01:16:59.000 |
with it is that it seems like it's poorly worded: there would be civil penalties and criminal 01:17:04.600 |
penalties for Americans for breaking the law and using software that's been banned. And many people 01:17:12.280 |
said, you know, this probably is just bad language. I have a question. Does 01:17:17.640 |
this apply to incognito mode? Because if it doesn't... I don't know. It does not. Yes. They're 01:17:25.400 |
saying that you can get, you know, you can get fined, or 20 years in jail, whatever it 01:17:30.440 |
is, for using a VPN to access TikTok. Friedberg, what are your thoughts on it? Look, I think this 01:17:36.200 |
is a real threat to the open internet. I'm really concerned about the language that's been used 01:17:42.200 |
that basically speaks to protecting the safety and security of the American people 01:17:46.040 |
by actively monitoring network traffic, and making decisions about what network traffic 01:17:51.480 |
is and isn't allowed to be transmitted across the open internet. It's the first time that I 01:17:56.120 |
think in the United States, we are seeing like a real threat and a real set of behaviors from our 01:18:00.600 |
government that look and feel a lot like what goes on in China and elsewhere, where they operate 01:18:05.960 |
with a closed internet, an internet that's controlled, monitored, observed, tracked, and 01:18:11.640 |
where gates are decided by some set of administrators on what is and isn't appropriate. And the language 01:18:17.160 |
is always the same. It's for safety and security of the people. The entire purpose of the internet 01:18:22.120 |
is that it did not have bounds, that it did not have governments, that it did not have controls, 01:18:26.120 |
that it did not have systems that are politically and economically influenced; the architecture 01:18:31.160 |
of the internet was and always would be open. The protocols are open, the transmission of data on 01:18:35.880 |
that network would be open. And as a result, all people around the world would have access to 01:18:40.600 |
information of their choosing, and it allowed ultimate freedom of choice. You know, this 01:18:45.880 |
kind of is the first thing that, I'm concerned, creates a precedent that ultimately leads to a 01:18:50.520 |
very slippery slope. Saying that TikTok cannot make money in the US by charging advertisers or 01:18:55.560 |
managing commerce flows is one thing; that's where the government can and should and could, 01:18:59.720 |
if they chose to, have a role. But I think going in and observing and tracking internet traffic and 01:19:04.840 |
making decisions about what is and isn't appropriate for people, I think, is one of the 01:19:10.280 |
things that we all should be most concerned about right now. There is no end in sight 01:19:15.240 |
to this if you allow it to happen the first time. You know, VPNs, virtual private networks, 01:19:21.880 |
allow you to anonymously access internet traffic, and to access internet traffic via remote 01:19:29.240 |
destinations, so that the ultimate consumption of content that you're using can't be tracked and 01:19:36.520 |
monitored by local agencies or ISPs. And I think saying that that can now be restricted 01:19:42.680 |
takes away all ability to have true privacy and all rights to privacy on the open internet. 01:19:48.120 |
So I'd love to talk about this more. Unfortunately, I got to run. 01:19:51.400 |
Great. So this is a super threat to me. And I think this is something we should be super, 01:19:56.680 |
super concerned about, and I hope that the entire community of technology, the internet, and anyone 01:20:01.560 |
that wants to have, you know, freedom of choice steps up and says this is totally inappropriate 01:20:06.520 |
and an overreach. Yeah, there are other ways to manage stuff like this. 01:20:12.760 |
Intentional overreach or poorly written or somewhere in between? What do you think? 01:20:15.880 |
Both. I think both. I think this is the biggest bait and switch that Washington, 01:20:20.360 |
the central government has ever tried to pull on us. Everybody thinks that they're just trying to 01:20:24.600 |
ban TikTok from operating in the US. And if that's all they did, then I think the bill would be 01:20:29.400 |
supported by most Americans. But that's not what they're doing. They're not just restricting TikTok; 01:20:29.400 |
that's not the goal here. Yeah. What a bait and switch. 01:20:34.680 |
It's a huge bait and switch. And so just so you know, what the act provides is that a US citizen 01:20:42.920 |
using a VPN to access TikTok could theoretically be subjected to a maximum penalty of $1 million 01:20:49.320 |
in fines, or 20 years in prison, or both. Now, Mark Warner, the sponsor of the legislation, 01:20:57.000 |
will swear up and down that's not the intent. But the problem is that the language of the bill 01:21:01.160 |
is so vague that some clever prosecutor may want to pursue this theory one day. 01:21:05.720 |
And that needs to be stopped. Also, there's another problem with the bill, which is, 01:21:10.520 |
you think this is just about TikTok. It's not. What they do is, it says here, I guess they don't 01:21:17.240 |
want to mention TikTok by name. So they're trying to create a category of threatening application. 01:21:22.360 |
But because it is a category, it's very, very broad. So the bill states that it covers any 01:21:29.320 |
transaction (a transaction, not just an app) in which an entity described in subparagraph B has 01:21:35.560 |
any interest. And then the entities described in subparagraph B are, quote, a foreign adversary; 01:21:40.680 |
an entity subject to the jurisdiction of, or organized under the laws of, a foreign adversary; 01:21:45.240 |
and an entity owned, directed, or controlled by either of these. And then it gives the executive branch 01:21:50.520 |
the power to name a foreign adversary, any foreign government regime that one of the cabinet 01:21:58.040 |
secretaries defines, without any vote of Congress. So this is giving sweeping powers to the executive 01:22:04.920 |
branch to declare foreign companies to be enemies. It feels like the plot of the prequels in Star 01:22:11.240 |
Wars. Here we go. You know, we criticize China for having a great firewall. 01:22:16.760 |
What do you think this is? Yeah, I mean, this should obviously have nothing to do with the 01:22:21.960 |
American consumer and everything to do with a foreign adversary collecting data of Americans 01:22:26.120 |
at scale. This could be written in a much simpler way. Yeah, you know what it should be? It should 01:22:30.520 |
be one sentence, which is that app stores are prohibited from allowing TikTok to be an app 01:22:36.760 |
in their store. That's what they do in India. That's it. Case closed. Game over. I think India 01:22:40.760 |
is doing okay, right? They block like 100 Chinese apps, and I think their society is still functioning. 01:22:45.080 |
So, you know, all due respect to AOC, you know, like the idea that 150 million Americans are 01:22:51.080 |
going to suffer because they can't be tracked by the CCP is kind of nuts. This is going to give 01:22:55.480 |
sweeping powers to the security state to surveil us, to prosecute us, to limit our internet usage. 01:23:05.080 |
This is basically the biggest power grab and bait and switch they've ever tried to pull on us. 01:23:09.880 |
And again, if they really were concerned about TikTok, it's one sentence. 01:23:13.560 |
Yeah, we're done. All right, everybody. It's been an amazing episode. For the 01:23:18.040 |
Sultan of Science, David Friedberg, the Rain Man himself, David Sacks, and the dictator, 01:23:22.840 |
Chamath Palihapitiya. I am the world's greatest moderator, and we will see you next time. Bye-bye. 01:24:00.120 |
We should all just get a room and have just one big huge orgy.