
OpenAI's $150B conversion, Meta's AR glasses, Blue-collar boom, Risk of nuclear war


Chapters

0:00 Bestie intros: In Memoriam
6:43 OpenAI's $150B valuation: bull and bear cases
24:46 Will AI hurt or help SaaS incumbents?
40:41 Implications of OpenAI's for-profit conversion
49:57 Meta's impressive new AR glasses: is this the killer product for the age of AI?
69:05 Blue collar boom: trades are becoming more popular as entry-level tech jobs dry up
80:55 Risk of nuclear war increasing

Whisper Transcript

00:00:00.160 | all right everybody uh let's get the show started here wait jason why are you wearing a tux what's
00:00:04.640 | going on there oh well it's time for a very emotional segment we do here on the all-in
00:00:10.160 | podcast i just got to get myself composed for this oh jason are you okay i'm i'm gonna be okay
00:00:18.480 | i think it looks like you're fighting back tears yeah this is always a tough one
00:00:23.520 | this year we tragically lost giants in our industry these individuals bravely
00:00:30.240 | honed their craft at open ai before departing ilya sutskever he left he left us in may
00:00:38.320 | jan leike also left in may john schulman tragically left us in august wait these are all open ai employees
00:00:52.880 | yes sir barret zoph left on wednesday bob mcgrew also left on wednesday too soon too soon and mira
00:01:02.160 | murati also left us tragically on wednesday we lost mira too yeah and greg brockman
00:01:09.200 | is on extended leave the enforcer he left too thank you for your service your memories will live on
00:01:19.040 | as training data and may your memories be a vesting
00:01:27.920 | and it said we open source it to the fans and they've just gone crazy
00:01:42.000 | sorry oh my goodness all those losses wow that is three in one day three in one day my goodness i
00:01:48.720 | thought open ai was nothing without its people
00:01:50.720 | well i mean this is a this is a great whoa we lost them whoa what's happening wait oh wait what
00:02:00.640 | this is like the photo in back to the future wow they're just all gone wait oh no don't worry he's
00:02:08.880 | replacing everybody here we go let's replace it with a g700 a bugatti and i guess sam's got mountains of
00:02:15.360 | cash so don't worry he's got a backup plan anyway as an industry and as leaders in the industry the show
00:02:21.200 | sends its regards to sam and the open ai team on their tragic losses and congratulations on the 150
00:02:28.800 | billion dollar valuation and you're seven percent sam now just cashed in 10 billion dollars apparently
00:02:34.080 | so congratulations to uh friend of the pod sam altman is the round that's all right out of some
00:02:41.520 | article right that's not like confirmed or anything is all of that done i mean it's reportedly allegedly
00:02:47.520 | that he's gonna have seven percent of the company and we can jump right into our first story i mean what i'm
00:02:52.640 | saying is has the money been wired and the docs been signed according to reports this round is
00:02:57.440 | contingent on not being a um non-profit anymore and sorting that all out they have to remove the
00:03:04.800 | profit cap and do the c corp there's some article that reported this right none of us it's not some
00:03:10.000 | article it's bloomberg and it got a lot of traction and it was re-reported by a lot of places and i don't
00:03:15.920 | see anyone disputing it so is mainstream media we trust the mainstream media in this case because
00:03:22.000 | i think that when we could do a good bit yeah that's mine no i think that bloomberg bloomberg
00:03:28.480 | reported it based on obviously talks that are ongoing with investors who have committed to this round
00:03:34.000 | yeah and no one's disputing it has anyone said it's not true this has been speculated for months the
00:03:42.000 | 150 billion valuation raising something in the range of six to seven billion if you do the math
00:03:49.040 | on that and bloomberg is correct that sam altman got his seven percent i guess that would be the
00:03:54.400 | reality is you can't raise six billion dollars without probably meeting with a few dozen firms
00:03:59.600 | and some number of junior people in those few dozen firms are having a conversation or two with
00:04:05.040 | reporters so you can kind of see how it gets out all right and before we get to our uh first story
00:04:10.320 | there about open ai congratulations to chamath let's pull up the photo here he was a featured guest
00:04:17.440 | on uh the alex jones show no sorry i'm sorry uh that would be uh joe rogan congratulations on coming to
00:04:25.840 | austin and being on joe rogan what was it like to do a three hour podcast with uh joe rogan it's great i
00:04:32.800 | mean i loved it he's really awesome he's super cool it's good to do long form stuff like this so that i can
00:04:40.080 | actually talk clearly is the limitation of this podcast is the other three of us finally you have
00:04:45.040 | found a way to make it about yourself um i saw a comment somebody commented like oh wow it's like
00:04:51.680 | amazing to hear chamath expand on topics throughout the constant eruptions also known as moderation
00:05:00.240 | someone called me fez from that 70s show i thought that was funny the amount
00:05:08.640 | of trash talking in rogan's youtube comments it's next level it is i mean it is it is the wild wild west
00:05:16.400 | uh in in terms of the comment section on youtube yeah um a bunch of comments asking jake out why do you
00:05:22.560 | call it alex jones is that because it's just a texas podcaster who's short and stout and they
00:05:27.840 | look similar so it's just a but i mean it looks like alex jones uh started lifting weights actually
00:05:34.560 | you know they're both uh the same height and yeah i saw joe rogan 25 years ago doing stand-up i have a
00:05:40.960 | photo with him at the club it was like a small club in san francisco and we hung out with him afterwards
00:05:46.560 | he was just like a nobody back in the day he was like a stand-up guy right now he's yeah uber star
00:05:52.480 | well you have to go back pretty far for joe rogan to be a nobody i mean he had a tv show for a long
00:05:57.840 | time and two of them he was more like a stand-up comic for a while he was a stand-up comic stand-up
00:06:02.480 | comic yeah fear factor that's right that's but didn't he also do survivor or one of those like so i
00:06:07.840 | think and then the ufc i mean this guy's gone four for four i feel like that's where he blew up ufc yeah
00:06:14.240 | yeah well i mean i think he got the ufc after fear factor and being a fighter and a comedian
00:06:20.000 | and there's like a famous story where like dana white was pursuing him and he was like i don't
00:06:26.720 | know and then dana white's like i'll send a plane for you you can bring your friends he's like okay
00:06:30.080 | fine i'll do it he did it for free and then dana white pursued him heavily to become the voice of
00:06:34.960 | the ufc and yeah and obviously it's grown tremendously and it's you know worth billions of dollars okay
00:06:43.520 | so how is open ai worth 150 billion dollars can anyone apply well why don't we get into the topic
00:06:49.920 | should we make the bull case and the bear case all right open ai as we were just joking in the opening
00:06:55.040 | segment is trying to convert into a for-profit benefit corporation that's a b corp it just means
00:07:01.680 | i will explain b corp later sam altman is reportedly i thought they're converting to a c corp no it's the same
00:07:07.120 | thing a b corp a benefit corporation is a c corporation variant that is not a non-profit but
00:07:14.480 | the board of directors sax is required not only to be a fiduciary for all shareholders but also for the
00:07:22.560 | stated mission of the company that's my understanding of a b corp am i right freeberg external stakeholders
00:07:28.240 | yeah so like the environment or society or whatever but it all from all other kind of legal tax
00:07:34.240 | factors it's the same as a c corp and it's a way to i guess signal to investors the market employees
00:07:42.400 | that you care about something more than just profit so famous most famous b corp i think is
00:07:49.200 | toms is that the shoe company toms that's a famous b corp somebody will look it up here patagonia
00:07:54.080 | yeah that falls into that category so for profit with a mission reuters has cited anonymous sources
00:08:00.560 | close to the company that the plan is still being hashed out with lawyers and shareholders and the
00:08:06.000 | timeline isn't certain but what's being discussed is that the uh non-profit will continue to exist as a
00:08:12.800 | minority shareholder in the new company how much of a minority shareholder i guess is uh the devil's in
00:08:19.360 | the detail there do they own one percent or 49 the uh very much discussed friedberg 100x profit cap for
00:08:27.040 | investors will be removed that means investors like vinod friend of the pod and reid hoffman also friend
00:08:33.280 | of the pod could see a 100x turn into a thousand x or more according to the bloomberg report sam altman's
00:08:40.400 | going to get his equity finally seven percent that would put him at around 10.5 billion if this is all true
00:08:46.880 | and open ai could be valued as high as 150 billion dollars we'll get into all the shenanigans but
00:08:54.880 | let's start with your question freeberg and since you asked it i'm going to boomerang it back to you
00:08:59.760 | make the bull case for the 150 billion dollar valuation the bull case would be that the moat in the
00:09:09.920 | business with respect to model performance and infrastructure gets extended with the large amount
00:09:16.880 | of capital that they're raising they aggressively deploy it they are very strategic and tactical
00:09:22.400 | with respect to how they deploy that infrastructure to continue to improve model performance and as a
00:09:27.920 | result continue to extend their advantage in both consumer and enterprise applications the api tools and so on
00:09:34.720 | that they offer and so they can maintain both kind of model and application performance leads that they
00:09:42.000 | have today across the board i would say like the o1 model their voice application sora has not been
00:09:49.360 | released publicly but if it does and it looks like what it's been demoed to be it's certainly ahead of the
00:09:53.840 | the pack so there's a lot of aspects of open ai today that kind of make them a leader and if
00:10:00.800 | they can deploy infrastructure to maintain that lead and not let google microsoft amazon and others
00:10:05.360 | catch up then their ability to use that capital wisely keeps them ahead and ultimately as we all know
00:10:11.920 | there's a multi-trillion dollar market to capture here making lots of verticals lots of applications lots of
00:10:17.040 | products so they could become a a true kind of global player here plus the extension into computing which
00:10:22.880 | i'm excited to talk about later when we get into the computing stuff sacks here's a chart of open
00:10:27.680 | ai's revenue growth that has been pieced together from various sources at various times
00:10:34.160 | but you'll see here they are reportedly
00:10:39.760 | as of june of 2024 on a 3.4 billion run rate for this year after hitting 2 billion in 23 1.3 billion
00:10:48.800 | in october of 23 and then back in 2022 they reportedly only had 28 million in revenue so
00:10:56.720 | this is a pretty big streak here in terms of revenue growth i would put it
00:11:01.920 | at 50 times top line revenue 150 billion valuation you want to give us the bear case maybe or the bull case
00:11:08.480 | well so the the whisper numbers i heard was that their revenue run rate for this year was in the
00:11:13.680 | four to six billion range which is a little higher than that so you're right if it's really more like
00:11:20.240 | 3.4 this valuation is about 50 times current revenue but if it's more like 5 billion then it's only 30 times
00:11:28.320 | and if it's growing 100 percent year over year it's only 15 times next year so depending what the numbers actually
00:11:33.680 | are the 150 billion valuation could be warranted
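the back-of-envelope math behind those multiples, as a short python sketch (the hosts round 150 over 3.4 up to roughly 50x; the actual quotient is closer to 44x):

```python
# back-of-envelope multiples from the numbers discussed above
valuation = 150e9
for run_rate in (3.4e9, 5e9):
    print(f"${run_rate / 1e9:.1f}B run rate -> {valuation / run_rate:.0f}x current revenue")
# if a ~$5B run rate doubles year over year, forward revenue is ~$10B
print(f"forward multiple at 100% growth: {valuation / 10e9:.0f}x")
```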
00:11:44.000 | i don't think 15 times forward arr is a high valuation for a company that has this kind of strategic opportunity i think it all comes down to the
00:11:49.360 | durability of its comparative advantage here i think there's no question that open ai is the leader of the
00:11:56.320 | pack it has the most advanced ai models it's got the best developer ecosystem the best apis it keeps
00:12:03.680 | rolling out new products and the question is just how durable that advantage is is there really a moat
00:12:09.760 | to any of this for example meta just announced llama 3.2 which can do voice and this is roughly at the same
00:12:17.760 | time that open ai just released its voice api so the open source ecosystem is kind of hot on open ai's
00:12:25.520 | heels the large companies google microsoft so forth they're hot on their heels too although it seems like
00:12:33.840 | they're further behind where meta is and the question is just can open ai maintain its lead can it consolidate
00:12:40.720 | its lead can it develop some moats if so it's on track to be the next trillion dollar big tech company
00:12:47.120 | but if not it could be eroded and you could see the value of open ai get commoditized and we'll
00:12:53.840 | look back on it as kind of a cautionary tale okay chamath do us a favor here if there is a bear case what is it
00:13:00.480 | okay let's steel man the bear case yes that's what i'm asking please
00:13:03.760 | so one would just be on the fundamental technology itself and i think the version of that story would
00:13:13.760 | go that the underlying frameworks that people are using to make these models great is well described
00:13:23.040 | and available in open source on top of that there are at least two viable open source
00:13:30.320 | models that are as good or better at any point in time than open ai so what that would mean is that
00:13:38.400 | the value of those models the economic value basically goes to zero and it's a consumer surplus
00:13:46.160 | for the people that use it so that's very hard theoretically to monetize i think the second
00:13:55.200 | part of the bear case would be that specifically meta becomes much more aggressive in inserting meta ai
00:14:04.480 | into all of the critical apps that they control because those apps really are the front door
00:14:11.760 | to billions of people on a daily basis so that would mean whatsapp instagram messenger the facebook app and
00:14:20.640 | threads gets refactored in a way where instead of leaving that application to go to a chat gpt like app
00:14:28.480 | you would just stay in the app
00:14:29.760 | and then the companion to that would be that google also does the same thing with their version in front of
00:14:41.840 | search so those two big front doors to the internet become much more aggressive in giving you a reason to
00:14:48.800 | not have to go to chat gpt because a their answers are just as good and b they're right there in a few less
00:14:55.520 | clicks for you yeah so that would be the that would be the second piece the third piece is that all of these
00:15:03.200 | models basically run out of viable data to differentiate themselves and it basically becomes a race around
00:15:11.440 | synthetic information and synthetic data which is a cost problem meaning if you're going to invent synthetic
00:15:17.920 | data you're going to have to spend money to do it and the large companies facebook microsoft amazon google
00:15:24.720 | have effectively infinite money compared to any startup and then the fourth which is the most quizzical one
00:15:32.960 | is what does the human capital thing tell you about what's going on it reads a little bit like a
00:15:40.560 | telenovela i have not in my time in silicon valley ever seen a company that's supposedly on such a straight
00:15:48.480 | line to a rocket ship have so much high level churn but i've also never seen a company have this
00:15:56.240 | much liquidity and so how are people deciding to leave if they think it's going to be a trillion dollar
00:16:04.960 | company and why when things are just starting to cook would you leave if you are technically enamored
00:16:12.080 | with what you're building so if you had to construct the bear case i think those would be the four things
00:16:17.440 | open source front door competition the move to synthetic data and all of the executive turnover
00:16:25.920 | would be sort of why you would say maybe there's a fire where there's all this smoke okay i think this
00:16:32.800 | is very well put and i have been using chat gpt and claude and gemini exclusively i stopped using google search
00:16:43.360 | and i also stopped sacks asking people on my team to do stuff before i ask chat gpt to do it
00:16:50.400 | specifically freeberg the o1 version and the o1 version is distinctly different have you gentlemen
00:16:56.640 | been using o1 like on a daily basis okay so we can have a really interesting conversation here
00:17:02.800 | i did something on my other podcast this week in startups that i'll show you right now that was
00:17:07.840 | crazy yesterday o1 is a game changer yes it's the first real chain
00:17:13.920 | of thought production system yes that i think we've seen are you using o1 preview or o1 mini uh i am using
00:17:21.760 | o1 preview now let me show you what i did here just so the audience can level set here if you're not
00:17:26.720 | watching us go to youtube and type in all in and you can watch us we do video here so i was
00:17:34.000 | analyzing you know just some early stage uh deals and cap tables and i put in here hey a startup just
00:17:39.920 | raised some money at this valuation here's what the friends and family invested the accelerator the
00:17:44.960 | seed investor etc in other words like the history the investment history in a company
00:17:48.640 | what o1 does distinctly differently than the previous versions and the previous version i felt
00:17:54.800 | was three to six months ahead of competitors this is a year ahead of competitors and so
00:17:59.760 | here chamath if you look it said it thought for 77 seconds and if you click the down arrow sax what
00:18:07.040 | you'll see is it gives you an idea of what its rationale is for interpreting and what secondary
00:18:14.000 | queries it's doing in order to give you this is called chain of thought right and this is the underlying
00:18:20.480 | mega model that sits on top of the llms and the mega model effectively the chain of thought approach is
00:18:28.320 | the model asks itself the question how should i answer this question right and then it comes up with
00:18:36.320 | an answer and then it says now based on that what are the steps i should take to answer the question so
00:18:41.280 | the model keeps asking itself questions related to the structure of the question that you ask and then it
00:18:48.480 | comes up with a series of steps that it can then call the llm to do to fill in the blanks link them all
00:18:54.960 | together and come up with the answer it's the same way that a human train of thought works and it really
00:18:59.680 | is the kind of ultimate evolution of what a lot of people have said these systems need to become
00:19:05.280 | which is a much more call it intuitive approach to answering questions rather than just predictive text
00:19:12.080 | based on the single statement you made and it really is changing the game and everyone is going to
00:19:16.160 | chase this and follow this it is the new paradigm for how these ai systems will work
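a minimal sketch of the chain-of-thought loop being described here, with a hypothetical ask_llm() standing in for any model call; this illustrates the pattern, not openai's actual o1 internals:

```python
# a minimal sketch of the chain-of-thought loop described above;
# ask_llm() is a hypothetical stand-in for any llm completion call,
# not openai's actual o1 implementation

def ask_llm(prompt: str) -> str:
    """placeholder for a call to a language model"""
    raise NotImplementedError("wire this to a real model")

def chain_of_thought(question: str) -> str:
    # the model first asks itself how it should answer the question
    plan = ask_llm(f"list the steps needed to answer: {question}")
    # then it works each step, carrying prior answers forward as context
    context = ""
    for step in plan.splitlines():
        if step.strip():
            context += ask_llm(f"context so far:\n{context}\ndo this step: {step}") + "\n"
    # finally it links the intermediate answers into one response
    return ask_llm(f"question: {question}\nintermediate work:\n{context}\nfinal answer:")
```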
00:19:23.120 | by the way what this did was what prompt engineers were doing or prompt engineering websites were doing
00:19:29.040 | which was trying to help you construct your question and so if you look to this one it says listing
00:19:34.640 | disparities i'll compile a cap table with investments and valuations building the cap table accessing the
00:19:39.440 | share evaluation breaking down ownership breaking down ownership etc evaluating the terms and then it
00:19:44.160 | checks its work a bit it weights investment options and you can see this is a this is fired off like
00:19:50.320 | two dozen different queries to as freeberg correctly pointed out you know build this chain and it got
00:19:59.680 | incredible answers and then it offers explain the formula so it's thinking about what your next question would be
00:20:04.880 | and this when i share this with my team it was like a super game changer uh sax you had some thoughts here
00:20:12.400 | well yeah i mean this is pretty impressive and just to build on what freeberg was saying about chain of
00:20:18.560 | thought where this all leads is to agents where you can actually tell the ai to do work for you you give
00:20:26.160 | it an objective it can break the objective down into tasks and then it can work each of those tasks and open ai
00:20:32.880 | at a recent meeting with investors said that phd level reasoning was next on its roadmap and then agents
00:20:40.080 | weren't far behind that they've now released the at least the preview of the phd level reasoning with this
00:20:46.080 | o1 model so i think we can expect an announcement pretty soon about agents yeah and so
00:20:52.000 | if you think about business value you know we think a lot about this like
00:20:55.520 | where's the saas opportunity and all this the software as a service opportunity it's going to be in
00:21:01.600 | agents i think we'll ultimately look back on these sort of chat models as a little bit of a parlor trick
00:21:07.680 | compared to what agents are going to do in the workplace if you've ever been to a call center or an
00:21:13.600 | operations center they're also called service factories it's assembly lines of people doing very
00:21:19.680 | complicated knowledge work but ultimately you can unravel exactly what the chain is there the chain
00:21:26.800 | of thought that goes into their decisions it's very complicated and that's why you have to have humans
00:21:32.480 | doing it but you could imagine that once system integrators or enterprise sas apps go into these
00:21:41.040 | places go into these companies they integrate the data and then they map out the workflow you could
00:21:47.440 | replace a lot of these steps in the workflow with agents
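a hedged sketch of that agent pattern: take an objective, break it into tasks, then work each task; plan_tasks() and complete_task() are hypothetical llm-backed helpers, not a real agents api:

```python
# a sketch of the agent loop: objective -> tasks -> work each task;
# plan_tasks() and complete_task() are hypothetical llm-backed helpers,
# not any shipping agents api

from dataclasses import dataclass, field

@dataclass
class Agent:
    objective: str
    results: list[str] = field(default_factory=list)

    def plan_tasks(self) -> list[str]:
        # a real system would ask an llm to decompose the objective;
        # this placeholder just shows the shape of the loop
        return [f"step 1 of: {self.objective}", f"step 2 of: {self.objective}"]

    def complete_task(self, task: str) -> str:
        # placeholder for llm/tool execution of a single task
        return f"result of {task}"

    def run(self) -> list[str]:
        for task in self.plan_tasks():
            self.results.append(self.complete_task(task))
        return self.results

# e.g. Agent("resolve this support ticket").run()
```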
00:21:52.800 | yeah by the way it's not just call centers the other day i had a conversation with the ceo of a company i'm on the board of
00:21:57.760 | and he was like well we're gonna hire an analyst that's gonna sit between our kind of retail sales
00:22:03.120 | operations and you know figure out what's working to drive marketing decisions and i'm like
00:22:08.640 | no you're not like i really think that would be a mistake because today you can use o1
00:22:15.360 | and describe just feed it the data and describe the analysis you want to get out of that data
00:22:21.040 | and within a few minutes and i've now done this probably a dozen times in the last week with
00:22:25.200 | different projects internally at my company it gives you the entire answer that an analyst would
00:22:30.000 | have taken days to put together for you and if you think about what an analyst's job has been
00:22:34.720 | historically is they take data and then they manipulate it and the big evolution in software over
00:22:39.680 | the last decade and a half has been tools that give that analyst leverage to do that data manipulation
00:22:46.320 | more quickly like tableau and you know r and all sorts of different toolkits that are out there
00:22:51.120 | but now you don't even need the analyst because the analyst is the chain of thought it's the prompting
00:22:56.400 | from the model and it's completely going to change how knowledge work is done everyone that
00:23:01.360 | owns a function no longer needs an analyst the analyst is the model that's sitting on the computer
00:23:06.880 | in front of you right now and you tell it what you want and not days later but minutes later you get
00:23:11.840 | your answer it's completely revolutionary in ad hoc knowledge work as well as kind of this
00:23:20.480 | repetitive structured knowledge work this is such a good point freeberg the ad hoc piece of it when we're
00:23:25.600 | processing 20 000 applications for funding a year we do 100 plus meetings a week the analysts on our team
00:23:32.160 | are now putting in the transcripts um and key questions about markets and they are getting so
00:23:38.480 | smart so fast that you know when somebody comes to them with a marketplace in diamonds their understanding
00:23:45.360 | of the diamond marketplace becomes so rich so fast that we can evaluate companies faster then we're also
00:23:52.560 | seeing uh chamath uh before we call our lawyers when we have a legal question about a document we start
00:23:58.400 | putting in you know let's say the the standard note template or the standard safe template we put in the
00:24:03.680 | new one and there's a really cool project by google called notebook lm where you can put in multiple
00:24:10.400 | documents and you can start asking questions so imagine you take every single legal document sacks
00:24:16.320 | that yammer had when you had chamath as an investor i'm not sure if he was on the board
00:24:20.160 | and you can start asking questions about the documents and we have had people make changes to
00:24:25.280 | these documents and it immediately finds them explains them and so everybody's just getting so
00:24:29.840 | goddamn smart so fast using these tools that i insisted that every person on the team when they
00:24:35.440 | hit control tab it opens a chatgpt window in o1 and we burned out our credits immediately like
00:24:42.640 | it stopped us it said you have to stop using it for the rest of the month chamath your
00:24:47.360 | thoughts on this you're seeing it in real time at 8090 what i'll tell you is what sacks said is
00:24:55.600 | totally right there's so many companies that have very complicated processes that are a combination of
00:25:04.240 | well-trained and well-meaning people and bad software and what i mean by bad software is that
00:25:15.120 | some other third party came in listened to what your business process was and then wrote this clunky
00:25:20.640 | deterministic code usually on top of some system of record charged you tens or hundreds of millions
00:25:28.160 | of dollars for it and then left and will support it only if you keep paying them millions of dollars a
00:25:33.120 | year that whole thing is so nuts because the ability for people to do work i think has been very much constrained
00:25:39.920 | and it's constrained by people trying to do the right thing using really really terrible software
00:25:44.720 | and all of that will go away the radical idea that i would put out there is i think that systems of
00:25:51.040 | record no longer exist because they don't need to and the reason is because all you have is data and
00:25:58.720 | you have a pipeline of information can you level set and just explain to people what system of record is
00:26:02.000 | sure so inside of a company you'll have a handful of systems that people would say are
00:26:06.800 | the single source of truth they're the things that are used for reporting compliance an example would be
00:26:14.800 | for your general ledger so to record your revenues you'd use netsuite or you'd use oracle gl
00:26:24.240 | or you'd use workday financials then you'd have a different system of record for all of your revenue
00:26:30.480 | generating activities so who are all of the people you sell to how are sales going what is the pipeline
00:26:36.320 | there's companies like salesforce or hubspot sugar crm then there's a system of record for all the
00:26:43.200 | employees that work for you all the benefits they have what is their salary this is hris so the point
00:26:49.040 | is that the software economy over the last 20 years and this is trillions of dollars of market cap
00:26:56.560 | and hundreds of billions of revenue have been built on this premise that we will create this system of
00:27:04.080 | record you will build apps on top of the system of record and the knowledge workers will come in
00:27:08.960 | and that's how they will get work done and i think that sax is right this totally flips that on its head
00:27:15.280 | instead what will happen is people will provision an agent and roughly direct what they want the outcome to
00:27:22.560 | be and they'll be process independent they won't care how they do it they just want the answer
00:27:27.440 | so i think two things happen the obvious thing that happens in that world is systems of record lose a grip
00:27:36.320 | on the vault that they had in terms of the data that runs a company you don't necessarily need it
00:27:44.080 | with the same reliance and primacy that you did five and ten years ago that'll have an impact on the
00:27:50.480 | software economy and the second thing that i think is even more important than that is that then the
00:27:57.920 | atomic size of companies changes because each company will get much more leverage from using software and
00:28:05.920 | few people versus lots of people with a few pieces of software and so that inversion i think creates
00:28:12.320 | tremendous potential for operating leverage all right you're a thought sax you operate in the sas space with
00:28:17.440 | system of records and investing in these type of companies give us your take well it's interesting
00:28:23.040 | we were having a version of this conversation last week on the pod and i started getting texts from
00:28:29.120 | benioff as he was listening to it and then oh he called me and i think he got a little bit triggered by
00:28:34.480 | the idea that systems of record like salesforce are going to be obsolete in this new ai era and he made a
00:28:40.800 | very compelling case to me about why that wouldn't happen which is yeah well first of all i think ai
00:28:46.560 | models are predictive i mean at the end of the day they're predicting the next set of texts and and so
00:28:51.440 | forth and when it comes to like your employee list or your customer list you just want to have a source
00:28:56.720 | of truth you don't want it to be 98 percent accurate you want it to be 100 percent accurate you want to know if the
00:29:02.400 | federal government asks you for the tax id numbers of your employees you just want to be able to give it to
00:29:06.960 | them if wall street analysts ask you for your customer list and what the gaap revenue is you just
00:29:12.160 | want to be able to provide that you don't want ai models figuring it out so you're still going to need
00:29:16.240 | a system of record furthermore he made the point that you still need databases you still need enterprise
00:29:23.600 | security if you're dealing with enterprises you still need compliance you still need sharing models
00:29:28.080 | there's all these aspects all these things that have been built on top of the database that sas
00:29:32.480 | company's been doing for 25 years and then the final point that i think is compelling is that
00:29:37.440 | enterprise customers don't want to diy it right they don't want to figure out how to put this together
00:29:44.000 | and you can't just hand them an llm and say here you go there's a lot of work that is needed in order to
00:29:53.120 | make these models productive and so at a minimum you're going to need system integrators and consultants
00:29:59.440 | to come in there connect hold on just connect all the enterprise data to these models map the workflows
00:30:05.440 | you have to do that now how is that different from how this clunky software is sold today
00:30:10.800 | i mean look i i don't want to take away from the quality of the company that mark has built and what
00:30:17.040 | he's done for the cloud economy so let's just put that aside but i wish this is what we could have
00:30:22.160 | actually all been on stage and talked about i told him when he was at the summit i said that because i
00:30:27.760 | disagree with basically every premise of those three things number one systems integrators exist today
00:30:33.120 | to build apps on top of these things why do you think you have companies like viva how can a 20 billion
00:30:38.240 | dollar plus company get built on top of salesforce it's because it doesn't do what it's meant to do
00:30:43.840 | that's why otherwise app stores are a great way to allow people to build on your platform and and
00:30:52.640 | cover those niche cases the point i'm trying to make is that's no different than the economy that
00:30:56.880 | exists today it's just going to transform to different groups of people number one well by
00:31:00.000 | the way he said he's willing to come on the pod and talk about this very issue but just with you
00:31:04.320 | just with you great he'll come on the pod and discuss whether ai makes sass obsolete a lot of
00:31:12.320 | people are asking that question let's talk about it next year at the summit can you talk about his uh
00:31:16.640 | philanthropy first okay let's get back to focus here let's get focused everybody
00:31:19.840 | spicy love you mark who's coming to dreamforce raise your hand i want to make another point the second
00:31:28.000 | point is that when you have agents i think that we are overestimating what a system of record is david
00:31:34.160 | what you talked about is actually just an encrypted file or it's a bunch of rows in some database or
00:31:39.600 | it's in some data lake somewhere you don't need to spend tens or hundreds of millions of dollars
00:31:46.000 | to wrap your revenue in something that says it's a system of record you don't need that actually you can
00:31:53.680 | just pipe that stuff directly from stripe into snowflake and you can just transform it and do what
00:32:00.160 | you will with it and then report it i'll tell you you could do that today it's just that that's an
00:32:04.240 | interesting point through steak dinners and golf outings and all this stuff we've sold cios this idea
00:32:12.000 | that you need to wrap it in something called a system of record and all i'm saying is when you confront
00:32:17.280 | the total cost of that versus what the alternative that is clearly going to happen in the next
00:32:23.600 | five or ten years irrespective of whether any of us build it or not it'll be definitely not be able
00:32:28.320 | to just you just won't be able to justify it because it's going to cost a fraction of price
00:32:32.400 | there's probably also an aspect of this that we can't predict what is going to work with respect
00:32:39.760 | to data structure so right now all of um all of the the tooling for ai is on the front end and we
00:32:47.200 | haven't yet unleashed ai on the back end which is if you told the ai here's all the data ingest i'm
00:32:53.520 | going to be doing from all these different points in my business figure out what you want to do with
00:32:58.720 | all that data the ai will eventually come up with its own data structure and data system no that's
00:33:05.360 | not nothing that will look nothing like right and that's already happening right and so that's nothing
00:33:10.000 | like what we have today in the same vein that we don't understand how the translation works in a in an
00:33:15.360 | in an llm we don't understand how a lot of the function works a lot of the data structure and data
00:33:19.200 | architecture we won't understand clearly because it's going to be obfuscated by the model driving
00:33:23.760 | the the development there are open source agentic frameworks that already do freeberg what you're
00:33:28.640 | saying that so it's not true that it's not been done it's already oh yeah sure so maybe it's being
00:33:32.880 | done right that's so it hasn't been fully implemented to replace the system of record there are companies
00:33:38.560 | uh i'll give you an example of one like mechanical orchard they'll go into the most gnarliest of
00:33:43.680 | environments and what they will do is they will launch these agents that observe it's sort of what
00:33:48.640 | i told you guys before the io stream of these apps and then reconstruct everything in the middle
00:33:53.840 | automatically i don't understand why we think that there's a world where customer quality and nps would
00:34:01.920 | not go sky high for a company that has some old legacy fortran system and now they can just pay you know
00:34:08.880 | mechanical orchard a few million bucks and they'll just replace it in a matter of months it's going to
00:34:13.280 | happen right yeah that's the very interesting piece for me is i'm you know watching startups
00:34:18.560 | you know working on this the ai first ones i think are going to come to it with a totally different
00:34:24.240 | cost structure the idea of paying for seats and i mean some of these seats are five thousand per person
00:34:29.520 | per year you nailed it a year ago when you were like oh you mentioned some company that had like flat
00:34:35.120 | pricing at first by the way when you said that i thought this is nuts but you're right it actually
00:34:41.440 | makes a ton of sense because if you have a fixed group of people who can use this tooling to basically
00:34:48.880 | effectively be as productive as a company that's 10 times as big as you
00:34:53.840 | you can afford to flat price your software because you can just work backwards from what margin structure
00:35:00.320 | you want and it's still meaningfully cheaper than any other alternative
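illustrative arithmetic only, with made-up numbers, for working backwards from a target margin to a flat price:

```python
# all numbers here are hypothetical, purely to show the shape of the calc
monthly_cost_to_serve = 20_000      # infra + support for one customer
target_gross_margin = 0.80          # the margin structure you want
flat_price = monthly_cost_to_serve / (1 - target_gross_margin)
print(f"flat price per customer: ${flat_price:,.0f}/mo")  # $100,000/mo
```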
00:35:05.920 | a lot of startups now are doing consumption based pricing so they're saying you know how many sales calls are you
00:35:12.320 | doing how many are we analyzing as opposed to how many sales executives do you have because when you
00:35:18.160 | have agents as we're talking about those agents are going to do a lot of the work so we're going to
00:35:23.200 | see the number of people working at companies become fixed and i think the static team size
00:35:28.560 | that we're seeing at a lot of large companies is only going to continue it's going to be down and to
00:35:33.760 | the right and if you think you're going to get a high-paying job at a big tech company and you have to
00:35:39.200 | beat the agent you're going to have to beat the maestro who has five agents working for them i think
00:35:44.080 | this is going to be a completely different world uh chamath i want to get back to open ai with a couple of
00:35:50.800 | other pieces so let's wrap this up so we can get to the next story last word on this yes please last word for you
00:35:55.040 | that's right so look i i think that on the whole i agree with benioff here that there's more net new
00:36:03.200 | opportunity for ai companies whether they be startups or you know existing big companies like salesforce
00:36:09.840 | that are trying to do ai then there is disruption i think there will be some disruption it's very hard
00:36:14.560 | for us to see exactly what ai is going to look like in five or ten years so i don't want to discount
00:36:19.360 | the possibility that there will be some disruption of existing players but i think on the whole there's
00:36:25.040 | more net new opportunity for example the most highly valued public software company right now in terms of
00:36:30.960 | arr multiple is palantir and i think that's largely because the market perceives palantir as having a big
00:36:38.320 | ai opportunity what is palantir's approach the first thing palantir does when they go into a customer
00:36:44.240 | is they integrate with all of its systems and they're dealing with the largest enterprises they're
00:36:48.240 | dealing with the government the pentagon department of defense the first thing they do is go in and
00:36:53.920 | integrate with all of these legacy systems and they collect all of the data in one place they call it
00:37:00.400 | creating a digital twin and once all the data is in one place with the right permissions and safeguards
00:37:06.080 | now analysts can start working it and that was their historical value proposition but in addition ai can
00:37:12.480 | now start working that problem so anything that the analyst could work now ai is going to be able to work
00:37:17.120 | and so they're in an ideal position to master these new ai workflows so what is the point i'm making it's
00:37:24.080 | just that you can't just throw an llm at these large enterprises you have to go in there and integrate
00:37:30.240 | with the existing systems it's not about ripping out the existing systems because that's just a lot
00:37:34.240 | of headaches that nobody needs it's generally an easier approach just to collect except when
00:37:39.520 | the renewal comes what happens when you have to you know you negotiated a really good deal on
00:37:43.840 | something yeah and then you're going to renegotiate are you going to spend a billion dollars again five
00:37:48.000 | years from now it just doesn't seem very likely there's going to be a lot of hardcore negotiations
00:37:52.480 | going on chamath people are going to ask for 20 off 50 off and people are going to have to be more
00:37:57.680 | competitive that's all i suspect palantir's go to market when they start to really scale they'll
00:38:02.240 | be able to underprice a bunch of these other alternatives and so i think that when you look at
00:38:09.280 | the impacts and pricing that all of these open source and closed source model companies have now
00:38:18.560 | introduced in terms of the price per token what we've seen is just a massive step function lower right so
00:38:26.240 | it is incredibly deflationary so the things that sit on top are going to get priced as a function
00:38:34.160 | of that cost which means it will be an order of magnitude cheaper than the stuff that it replaces
00:38:39.440 | which means that a company would almost have to purposely want to keep paying tens of millions of dollars
00:38:46.960 | when they don't have to they would need to make that as an explicit decision and i think that
00:38:51.680 | very few companies will be in a position to be that cavalier in five and ten years so you're either
00:38:58.960 | going to rebase the revenues of a bunch of these existing deterministic companies or you're going to
00:39:06.160 | create an entire economy of new ones that have a fraction of the revenues today but a very different
00:39:11.680 | profitability profile i just think whenever you're dealing with
00:39:17.120 | a disruption as big as this current one i think it's always tempting to think in terms of the existing
00:39:23.680 | pie getting disrupted and shrunk as opposed to the pie getting so big with new use cases that on the whole
00:39:32.400 | the ecosystem benefits no no i agree with that i suspect that's what's going to happen no i i agree
00:39:38.000 | with that my my only point is that the pie can get bigger while the slices get much much smaller
00:39:42.960 | well i mean right between the two of you i think is the truth because what's happening is if you
00:39:52.400 | look at investing it's very hard to get into these late stage companies because they don't need as much
00:39:57.280 | capital because to your point chamath when they do hit profitability with 10 or 20 people
00:40:03.200 | the revenue per employee is going way up if you look at google uber airbnb and facebook meta
00:40:11.200 | they have the same number of employees or fewer than they did three years ago but they're all growing
00:40:15.520 | at 20 to 30 percent a year which means in two to three years each of those companies has doubled
00:40:22.000 | revenue per employee so that concept of more efficiency and then that trickles down sacks to
00:40:28.800 | the startup investing space where you and i are i'm a pre-seed seed investor you're a seed series a investor
00:40:33.200 | if you don't get in in those three or four rounds i think it's going to be really expensive and the
00:40:38.160 | companies are not going to need as much money downstream speaking of of investing in late stage
00:40:43.200 | companies we never closed the loop on the whole open ai thing what did we think of the fact that
00:40:49.920 | they're completely changing the structure of this company they're changing it into a corporation from
00:40:54.080 | the non-profit and sam's now getting a 10 billion dollar stock package he's not in it for the money he
00:41:00.800 | has health insurance sacks remember what he said in congress i don't need money i've got enough money i just needed
00:41:08.000 | the health insurance pull the clip up nick pull the clip i mean it's the funniest clip ever
00:41:13.680 | no it's in congress watch it this is him in congress you make a lot of money do you
00:41:20.960 | i make no i'm paid enough for health insurance i have no equity in open ai really that's interesting
00:41:26.480 | you need a lawyer i need a what you need a lawyer or an agent i i'm doing this because i love it
00:41:34.960 | it's the greatest look at me don't believe him can i ask you a question there sacks are you
00:41:41.040 | doing this venture capital where you put the money in the startups because you love it or because you're
00:41:48.400 | working with the entrepreneurs or because you want another home in a coastal city and to put more jet fuel in that plane i
00:41:53.920 | need an answer for the people of the sovereign state of mississippi no louisiana that's john kennedy
00:42:01.680 | he's a very smart guy actually with a lot of you know sort of common folk wisdom he's got that simple
00:42:09.280 | talk yeah he's very funny if you listen to him he knows how to slice and
00:42:20.320 | dice you might need to get yourself uh one of them fancy agents from hollywood or an attorney from the
00:42:27.200 | wilson sonsini corporation to renegotiate your contract son because you're worth a lot more from
00:42:32.720 | what i can gather in your performance today than just some simple healthcare and uh i hope you took
00:42:38.640 | the blue cross blue shield i would like to make two semi-serious observations let's go yeah please get
00:42:44.880 | us back on track i think the first is that there's going to be a lot of people that are looking at the
00:42:49.760 | architecture of this conversion because if it passes muster everybody should do it think about this model
00:42:56.640 | let's just say that you're in a market and you start as a non-profit what that really means is you pay no
00:43:02.720 | income tax so for a long time you pay out a little bit of a percentage of whatever you earn
00:43:12.880 | but you can now outspend and outcompete all of your competitors and then once you win you flip to
00:43:19.600 | a corporation that's a great hack on the tax code and you let the donors get first bite of the apple
00:43:27.680 | if you do convert because remember vinod and hoffman got all their shares on the conversion the other way
00:43:33.600 | will also work because there's nothing that says you can't go in the other direction so
00:43:38.480 | let's assume that you're already a for-profit company but you're in a space with a bunch of
00:43:43.200 | competitors can't you just do this conversion in reverse become a non-profit again you pay no
00:43:50.400 | income tax so now you are economically advantaged relative to your competitors and then when they
00:43:57.360 | wither and die or you can outspend them you flip back to a for-profit again i think the the point is that
00:44:04.240 | that there's a lot of people that are going to watch this closely and if it's legal and it's
00:44:12.880 | allowed i just don't understand why everybody wouldn't do this yeah i mean that was elon's point
00:44:17.760 | as well and i mean the second thing which is just more of like cultural observation is and you brought up
00:44:26.320 | elon my comment to you guys yesterday and i'll just make the comment today it's a little bit disheartening
00:44:34.000 | to see a situation where elon built something absolutely incredible defied every expectation
00:44:39.920 | and then had the justice system take 55 billion dollars away from him his pay package you're
00:44:50.240 | referring to yes his pay package the options and then on the other side sam's going to pull
00:44:56.720 | something like this off definitely pushing the boundaries and he's going to make 10 billion
00:45:04.240 | and i just think when you put those two things in contrast that's not how the system should
00:45:10.400 | probably work i think is what most people would say freeberg you've been a little quiet here any
00:45:14.480 | thoughts on the transaction the non-profit to for-profit if you were looking at that in what you're doing
00:45:20.720 | do you see a way that ohalo could take a non-profit status raise a bunch of money through donations for
00:45:27.280 | virtuous work then license those patents to your for-profit would that be advantageous to you and do
00:45:33.200 | you think this could become a thing absolutely zero idea i have no idea what they're doing i don't know how
00:45:38.560 | they're converting a non-profit to a for-profit none of us have the details on this there may be
00:45:42.640 | significant tax implications the payments they need to make i don't think any of us
00:45:46.240 | no i certainly don't i don't know if there's actually a real benefit here if there is i'm
00:45:51.360 | sure everyone would do it no one's doing it so there's probably a reason why it's difficult
00:45:55.280 | i don't know it's been done a couple times yeah the mozilla foundation did it we talked about that
00:45:59.680 | in a previous episode sacks you want to wrap us up here on the corporate structure any final
00:46:03.680 | thoughts i mean elon put in 50 million i think he gets the same as sam don't you think he should just
00:46:10.480 | chip off seven percent for elon and not that elon needs the money or is asking but
00:46:13.920 | i'm just wondering why elon doesn't get the seven percent or you know something if he did
00:46:19.840 | actually put in 50. did he put in 50 million dollars 50 million is the report right into the non-profit
00:46:24.480 | yeah hoffman put in 10. look i said on a previous show that this organizational chart of open ai
00:46:31.600 | was ridiculously complicated and they should go clean it up they should open up the books and
00:46:35.840 | straighten everyone out and i also said that as part of that they could give sam altman a ceo option
00:46:41.600 | grant and they should also give elon some fair compensation for being the seed investor put in
00:46:47.440 | the first 50 million dollars and co-founder and what you're seeing is well they're kind of doing that
00:46:53.040 | they're opening up the books they're straightening out the corporate structure they're giving sam his
00:46:58.480 | option grant but they didn't do anything for elon and i'm not saying this as elon's friend i'm just
00:47:04.480 | saying that it's not really fair to basically go fix the original situation you're making it into a
00:47:12.000 | for-profit you're giving everyone shares but the guy who put in the original seed capital doesn't get
00:47:17.200 | anything that's ridiculous and you know what they're basically saying to elon is if you don't like it just sue us
00:47:24.480 | i mean that's basically what they're doing and i said that they should go clean this up but they
00:47:29.760 | should make it right with everybody so how do you not make it right with elon i haven't talked to him
00:47:34.960 | about this but he reacted on x saying this is really wrong it appeared to be a surprise to him i doubt he
00:47:40.720 | knew this was coming so the company apparently made no effort to make things right with him
00:47:46.240 | and i think that that is a bit ridiculous if you're going to clean this up if you're going to change
00:47:52.560 | the original purpose of this organization to being a standard for-profit company where the ceo
00:48:00.480 | who previously said he wasn't going to get any compensation is now getting 10 billion dollars of
00:48:05.040 | compensation how do you do that and then not clean it up for the co-founder who put in the first 50
00:48:10.160 | million dollars yeah that doesn't make sense to me and you know when reed was on our pod he he said well
00:48:15.840 | elon's rich enough well that that's not a principled excuse i mean does vinod ever act that way does reed
00:48:21.760 | ever act that way do they ever say well you know you don't need to do what's fair for me because i'm
00:48:26.880 | ready rich that's that's not a principled answer the argument that i heard was that elon was given the
00:48:32.480 | opportunity to invest along with reed along with vinod and he he declined to participate in the
00:48:38.960 | for-profit investing side that everyone else participated made that argument and i think
00:48:43.680 | it's the best argument the company has but let's think about that argument maybe elon was busy that
00:48:48.800 | week maybe elon already felt like he had put all the money that he had allocated for something like this
00:48:54.080 | into it because he put in a 50 million dollar check whereas re put in 10. we don't know what elon was
00:49:00.160 | thinking at that time maybe there was a crisis at tesla and he was just really busy the point is elon
00:49:05.360 | shouldn't have been obligated to put in more money into this venture the fact of the matter is they're
00:49:11.440 | refactoring the whole venture elon had an expectation when he put in the 50 million that this would be a
00:49:16.960 | non-profit and stay a non-profit and they're changing that and if they change it they have to make things
00:49:21.840 | right with him it doesn't really matter whether he had a subsequent opportunity to invest he wasn't
00:49:27.840 | obligated to to make that investment what he had an expectation of is that his 50 million dollars would
00:49:34.720 | be used for a philanthropic purpose and clearly it has not been yeah and in fairness to vinod he bought
00:49:40.240 | that incredible beachfront property and donated it to the public trust so we can all surf and have our
00:49:44.480 | halloween party there so it's all good thank you vinod for uh giving us that incredible beach i want
00:49:50.240 | to talk to you guys about interfaces that came up chamath in your headwinds or your four pack of
00:49:56.880 | reasons that you know open ai when you steel man the bear case could have challenges obviously we're seeing
00:50:03.520 | that and it is emerging that meta is working on some ar glasses that are really impressive additionally
00:50:11.600 | i've installed ios 18 which has apple intelligence that works on the 15 phones and 16 phones
00:50:19.280 | did any of you install the beta of ios 18 yet and use siri it's pretty clear with this new one that
00:50:26.160 | you're going to be able to talk to siri as an llm like you do in chat gpt mode which i think means
00:50:32.080 | they will not make themselves dependent on chat gpt and they will siphon off half the searches that
00:50:37.280 | would have gone to chat gpt so i see that as a serious threat siri's not very good j-cal and you know
00:50:41.440 | this because when you were driving me to the airport yesterday we tested it and it didn't work
00:50:44.960 | yes he tried to execute this joke where he's like hey siri
00:50:49.680 | send chamath palihapitiya a message and it was a very off-color message i'm not going to say what it
00:50:54.960 | is so it's a spicy joke and then it's like okay great sending linda blah blah blah he's like no
00:51:00.400 | stop stop stop stop i was like no don't send that joke to her it hallucinates and almost sends it to
00:51:04.960 | some other you know some woman in his contacts it would have been really damaging it's not very good
00:51:09.840 | jason it's not very good well but what i will say is there are features of it where if you squint a
00:51:15.600 | little bit you will see that siri is going to be conversational so when i was talking to it with music
00:51:22.000 | and you know you can have a conversation with it and do math like you can do with the chat gpt version
and you have microsoft doing that with their copilot and now meta is doing it at the top of each one so
00:51:34.480 | everybody's going to try to intercept the queries and the voice interface so chat gpt4 is now up against
meta siri apple and microsoft for that interface it's going to be challenging but let's talk about these
00:51:47.040 | meta glasses here meta showed off the ar glasses that nick will pull up right now these aren't
00:51:53.760 | goggles goggles look like ski goggles that's what apple is doing with their vision pro or when you see
00:52:00.480 | the meta quest you know how those work those are vr with cameras that will create a version of the world
00:52:07.040 | these are actual chunky sunglasses like the ones i was wearing earlier when i was doing the bit
00:52:12.080 | so these let you operate in the real world and are supposedly extremely expensive they made a thousand
prototypes they were letting a bunch of influencers and folks like um gary vaynerchuk uh use them and
00:52:29.520 | they're not ready for prime time but the way they work freeberg is there's a wristband
00:52:33.680 | that will track your fingers and your wrist movement so you could be in a conversation like we are here on
00:52:38.240 | the pod and below the desk you could be you know moving your arm and hand around to be doing replies
00:52:44.240 | to i don't know incoming messages or whatever it is what do you think of this ar vision of the world
and meta making this progress well i think it ties in a lot to the ai discussion because i think we're
00:52:58.400 | really witnessing this big shift from and this big transition in computing probably the biggest
00:53:04.960 | transition since mobile you know we moved from mainframes to desktop computers everyone had kind
00:53:11.040 | of this computer on their desktop but used a mouse and a keyboard to control it to mobile where you had
00:53:15.600 | a keyboard and clicking and touching on the screen to do things on it and now to what i would call this
00:53:20.480 | kind of ambient computing method and you know i think the big difference is control and response
00:53:27.440 | in directed computing you're kind of telling the computer what to do you're controlling it you're
00:53:33.280 | using your mouse or your keyboard to to go to this website so you type in a website address then you click
00:53:39.200 | on the thing that you want to click on and you kind of keep doing a series of work to get the computer to go
00:53:44.480 | access the information that you ultimately want to achieve your objective but with ambient computing
00:53:49.760 | you can more kind of cleanly state your objective without this kind of directive process you can say
hey i want to have dinner in new york next thursday at a michelin star restaurant
00:54:00.800 | at 5 30. book me something and it's done and i think that there are kind of five core things that
00:54:06.320 | are needed for this to work um both in control and response it's voice control gesture control and eye
00:54:13.040 | control are kind of the control pieces that replace you know mice and clicking and touching and keyboards
00:54:18.560 | and then response is audio and kind of integrated visual which is the idea of the goggles voice control
00:54:25.280 | works have you guys used the open ai voice control system lately i mean it is really incredible i had
00:54:30.320 | my earphones in and i was like doing this exercise i was trying to learn something so i
00:54:35.200 | told open ai to start quizzing me on this thing and i just did a 30 minute walk and while i was walking
it was asking me quiz questions and i would answer and it would tell me if i was right or wrong it was really
00:54:44.080 | this incredible dialogue experience so i think the voice controls there i don't know if you guys have
00:54:48.400 | used apple vision pro but gesture control is here today you can do single finger movements with
00:54:52.880 | apple vision pro it triggers actions and eye control is incredible you look at the letters you
00:54:57.920 | want to have kind of spelled out or you look at the thing you want to activate and it does it
00:55:02.160 | so all of the control systems for this ambient computing are there and then the ai enables
00:55:06.400 | this kind of audio response where it speaks to you and the big breakthrough that's needed that i don't
00:55:11.360 | think we're quite there yet but maybe zuck is highlighting that we're almost there and apple
00:55:15.280 | vision pro feels like it's almost there except it's big and bulky and expensive is integrated visual
00:55:20.000 | where the ambient visual interface is always there and you can kind of engage with it so there's this
00:55:24.960 | big change i don't think that mobile handsets are going to be around in 10 years i don't think we're going to have
00:55:29.040 | this like phone in our pocket that we're like pressing buttons on and touching and telling it
00:55:33.680 | where on the browser to go to the browser interface is going to go away i think so much of how computing
is done how we integrate with data in the world and how the computer ultimately fetches that
data and does stuff with it for us is going to completely change to this ambient model so i'm
00:55:49.840 | pretty excited about this evolution but i think that what we're seeing with zuck what we saw with apple vision pro
00:55:55.200 | and all of the open ai demos they all kind of converge on this very incredible shift in computing
00:56:00.720 | that will kind of become this ambient system that exists everywhere all the time and i know folks
have kind of mentioned this in the past but i think we're really seeing it kind of all come together now with these five key things
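to make freeberg's five pieces concrete, here is a minimal sketch of that control/response loop: voice, gesture, and eye control all feed one intent parser, and the response comes back as audio plus an in-view visual. every name in it is hypothetical -- parse_intent stands in for an llm call, not any real device api.

```python
# a minimal sketch of the ambient-computing loop described above: three input
# modalities (voice, gesture, eye) feed one intent parser, and the response
# comes back as audio plus an integrated visual. all names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Intent:
    action: str                                # e.g. "book_restaurant"
    params: dict = field(default_factory=dict)

def parse_intent(utterance: str) -> Intent:
    # stand-in for the model call that turns a stated objective into an action
    if "dinner" in utterance:
        return Intent("book_restaurant",
                      {"city": "new york", "when": "thursday 5:30pm"})
    return Intent("unknown")

def ambient_loop(events, speak, display):
    # events: (modality, payload) pairs from the mic, wristband, or eye tracker
    for modality, payload in events:
        if modality == "voice":
            intent = parse_intent(payload)
        elif modality in ("gesture", "eye"):
            intent = Intent(payload)  # gestures and gaze map straight to actions
        else:
            continue
        result = f"done: {intent.action} {intent.params}"
        speak(result)    # audio response
        display(result)  # integrated visual response

ambient_loop([("voice", "book me dinner in new york next thursday at 5:30")],
             speak=print, display=print)
```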
chamath any thoughts on facebook's progress with ar and how that might impact
00:56:21.040 | computing and interfaces when paired with language models i think david's right that there's something
that's going to be totally new and unexpected so i agree with that part of what freeberg says i
00:56:37.760 | am still not sure that glasses are the perfect form factor to be ubiquitous when you look at a phone
00:56:45.520 | a phone makes complete sense for literally everybody right man woman old young every
00:56:55.600 | race every country of the world it's such a ubiquitously obvious form factor
00:57:06.960 | but the thing is like that initial form factor was so different than what it replaced even if you
looked at like flip phones versus that first generation iphone so i do think freeberg you're
00:57:18.880 | right that there's like this new way of interacting that is ready to explode onto the scene and i think
00:57:27.360 | that these guys have done a really good job with these glasses i mean like i give them a lot of credit
for sticking with it and iterating through it and getting it to this place it looks meaningfully
00:57:36.880 | better than the vision pro to be totally honest but i'm still not convinced that we've explored the
00:57:44.320 | best of our creativity in terms of the devices that we want to use with these ai models you need some
visual interface i mean the question is where is the visual interface is it in the glasses is it in the wall
i mean well when you're asking like i want to watch chamath on rogan like i don't just
00:58:00.160 | want to hear i want to see like when i want to visualize stuff i want to visualize it i want to look at the
00:58:05.600 | food i'm buying online i want to look at pictures of the restaurant i'm going to go to but how much
00:58:09.520 | of that time when you say those things are you not near some screen that you can just project and
00:58:15.120 | broadcast that i mean right maybe that's the model is if the use case is i'm walking in the park and i
00:58:20.800 | need to watch tv at the same time i don't think that's a real use i i think you're on this one wrong
00:58:26.240 | chamath because i saw this revolution in japan maybe 20 years ago they got obsessed with augmented reality
00:58:34.080 | there were a ton of startups right as they started getting to the mobile phones and the use cases were
00:58:39.360 | really very compelling and we're starting to see them now in education and when you're at dinner with
00:58:44.240 | a bunch of friends how often does picking up your phone and you know looking at a message disturb the
00:58:50.960 | flow well people will have glasses on they'll be going for walks they'll be driving they'll be at a
00:58:54.880 | dinner party they'll be with their kids and you'll have something on like focus mode you know whatever the
equivalent is in apple and a message will come in from your spouse or from your child but you won't
00:59:05.120 | have to take your phone out of your pocket and i think once these things weigh a lot less you're
00:59:09.600 | going to have four different ways to interact with your computer in your pocket your phone your watch your
00:59:14.560 | airpods whatever you have in your ears and the glasses and i bet you glasses are going to take like
00:59:19.040 | a third of the tasks you do i mean what is the point of taking out your phone and watching the uber come to you
00:59:24.960 | but seeing that little strip that tells you the uber is 20 minutes away 15 minutes away or what the
00:59:29.760 | gate number is i don't have that anxiety well i don't know if it's anxiety but i just think it's
ease of use 15 minutes 10 minutes that's the definition i think it adds up i think taking your
00:59:40.320 | phone out of your pocket 50 times a day those are all useless notifications the whole thing is to train
yourself to realize that it'll come when it comes okay sacks do you have any thoughts on this uh impressive
demo or you know the demo that people who've seen it have said is pretty darn compelling i think
00:59:55.600 | it it does look pretty impressive i mean you can wear these meta orion glasses around
01:00:00.720 | and look like a human i mean you might look like eugene levy but you'll still look like a semi-normal
01:00:08.320 | person whereas you can't wear the apple vision pro i mean you can't wear that around what they don't
01:00:13.440 | look good you don't like them nick can you please find a picture of eugene levy
01:00:18.160 | i mean it seems like a major advancement certainly compared to apple vision pro i mean you don't hear
01:00:25.680 | really impressive you don't hear about the apple vision pros anymore at all i mean those things came and
went who's that it seems to me that meta is executing extremely well i mean you had the very
01:00:44.080 | successful cost-cutting which wall street liked zuck published that letter which i give him credit for
regretting the censorship that meta did which was at the behest of the deep state yeah they made huge
01:00:57.200 | advancements in ai i don't think they were initially on the cutting edge of that but they've caught up
and now they're leading they're firing on all cylinders yeah with llama 3.2 and now it seems to me that
01:01:07.440 | they're ahead on augmented reality ever since uh zuck grew out the hair yeah gold chain don't ever don't
01:01:13.920 | ever cut the hair it's like samson i mean yeah it's like samson based based zuck is the best zuck
he does i mean by the way i want to be clear i think these glasses are going to be successful my
only comment is that i think that when you look back 25 or 30 years from now and say that was the
01:01:32.560 | killer ai device i don't think it's going to look like something we know today that's my only point
and maybe it's going to be this thing that sam altman and jony ive are baking up that's supposed
01:01:44.160 | to be this ai infused iphone killer maybe it's that thing i doubt that will be a pair of glasses or a phone
or a pin if you think about it like so take the constraints off i don't need a keyboard because i'm
01:01:59.920 | not gonna be typing stuff i don't need a normal browser interface you could see a device come out
01:02:06.640 | that's almost like smaller than the palm of your hand that gives you enough of the visuals and all
01:02:11.440 | it is is a screen with maybe two buttons on the side and it's all audio driven you put a headset in
01:02:17.120 | and you're basically just talking or using gesture or looking at it to kind of describe where you want
01:02:21.840 | things to go and it can create an entirely new computing interface because ai does all of these
01:02:25.680 | incredible things with predictive text with gesture control with eye control and with audio control
01:02:32.160 | and then it can just give you what you want on a screen and all you're getting is a simple
01:02:35.040 | interface so chamath you may be right it might be a big watch or a handheld thing that's much smaller
01:02:39.680 | than an iphone and just all it is is a screen with nothing i really resonate when you talk about voice
only because i think there's like a part of social decorum
01:02:52.560 | that all of these goggles and glasses violate and i think we're going to have to decide as a society
whether that's going to be okay and then i think like when you go trekking in nepal are you
01:03:06.880 | going to encounter somebody wearing ar glasses i think the odds are pretty low but you do see people
today with a phone so what do they replace it with and i think voice as a modality is i think it's
01:03:19.440 | more credible that that could be used by eight billion people i think social fabric's more affected by
01:03:23.920 | people staring at their phones all the time you sit on a bus you sit at a restaurant you go to dinner
01:03:27.440 | with someone and they're staring at their phone like you know spouses friends we all deal with it
01:03:32.320 | where you feel like you're not getting attention from the person that you're interfacing with in
01:03:36.400 | the real world because they're so connected to the phone if we can disconnect the phone but still
01:03:41.520 | take away this kind of addictive feedback loop system but still give you this computing ability
01:03:46.400 | in a more ambient way that allows you to remain engaged in the physical world i think everyone
could say it hurts your feelings when he's playing chess and not paying attention yeah i'll be playing chess
on my ar glasses while pretending to listen to you you idiot oh fine he's buying them he got
version one but one point i want to just hit on is that the reason why these glasses have a chance
01:04:08.800 | of working is because of ai i mean facebook initially made that's exactly my point that's exactly my point
01:04:14.720 | facebook or meta made these huge investments before ai was a thing and in a way i think they've kind of
01:04:20.400 | gotten lucky because what ai gives you is uh voice and audio so you can talk to the glasses or whatever
01:04:27.440 | the wearable is it can talk to you that's the perfect natural language and computer vision allows
01:04:34.080 | it to understand the world around you so whatever this device is it can be a true yeah personal
01:04:39.120 | digital assistant in the real world and that's the opportunity if you guys play with apple vision
01:04:43.840 | pro have any of you actually like used it to any extent if you actually play with it i used it for a day
01:04:50.480 | or a night when we were playing poker and i've never used it again right which i get but i do think
01:04:56.560 | that it has these tools in it similar to like the original macintosh had these incredible graphics
01:05:01.600 | editors like mac paint and all these things that like people didn't like get addicted to at the time but
01:05:07.120 | they became this like tool that completely revolutionized everything in computing later
01:05:13.200 | and fonts and so on but like this i think has these tools apple vision pro with gesture control
01:05:18.800 | and the keyboard and the eye control those aspects of that device highlight where this could all go which
01:05:26.000 | is this these systems can kind of be driven without keyboards without typing without like you know moving
your finger around without clicking i think that's the key observation i really
agree with what you just said it's this idea that you're liberated from
01:05:41.520 | the hunting and pecking the controlling it's you don't need to control the computer anymore the
computer now knows what you want and then the computer can just go and do the work and
respond yeah now this is the behavior change that i don't think we're fully
01:05:57.120 | giving enough credit to so today part of what jason talked about what i called anxiety is because of
01:06:02.720 | the information architecture of these apps that is totally broken and the reason why it's broken is
01:06:07.920 | when you tell an ai agent get me the cheapest car right now to go to xyz place it will go and look
01:06:15.520 | at lyft and uber and whatever it'll provision the car and then it'll just tell you when it's coming and
01:06:21.280 | it will break this cycle that people have of having to check these apps for what is useless filler
01:06:27.200 | information and when you strip a lot of that notification traffic away i think you'll find that
01:06:32.880 | people start looking at each other in the face more often and i think that that's a net positive
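chamath's ride example is easy to make concrete. a minimal sketch, with the caveat that get_quote and book are hypothetical stand-ins (neither lyft nor uber exposes a public quoting api like this): the agent compares providers, books the cheapest, and surfaces exactly one notification instead of a stream of filler.

```python
# hypothetical agent flow: compare providers, book the cheapest, notify once
def get_quote(provider: str, destination: str) -> dict:
    fake = {"lyft": (18.50, 6), "uber": (16.75, 9)}   # (price, eta minutes)
    price, eta = fake[provider]
    return {"provider": provider, "price": price, "eta": eta}

def book(quote: dict) -> None:
    pass  # stand-in for the actual booking call

def cheapest_car(destination: str, notify) -> None:
    quotes = [get_quote(p, destination) for p in ("lyft", "uber")]
    best = min(quotes, key=lambda q: q["price"])
    book(best)
    # no 20-minutes / 15-minutes / 10-minutes drumbeat -- one message that matters
    notify(f"your {best['provider']} (${best['price']:.2f}) arrives in {best['eta']} minutes")

cheapest_car("sfo", notify=print)  # -> your uber ($16.75) arrives in 9 minutes
```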
so will meta sell hundreds of millions of these things i suspect probably but all i'm saying
01:06:45.120 | is if you look backwards 30 years from now what is the device that sells in the billions
01:06:50.880 | it's probably not a form factor that we understand today i just want to point out like
the form factor you're seeing now is going to get greatly reduced these were um some of the
01:07:00.960 | early apple um i don't know if you guys remember these but frog design made these crazy
01:07:06.560 | tablets in the 80s that were the eventual inspiration for the ipad you know 25 years later
i guess exactly uh and so that's the journey we're on here right now this is clunky and these are not
functional prototypes dude the apple newton is exactly that people forget about it and then it
turns out hey you throw away the stylus and you've got an iphone right and then everything gets a
01:07:28.560 | million x better the other subtle thing that's happening which i don't think we should sleep on
01:07:32.720 | is that the air pods are probably going to become much more socially acceptable to wear on a 24 by 7
01:07:40.560 | basis because of this feature that allows it to become a useful hearing aid and i think as it starts
01:07:46.640 | being worn in more and more social environments and as the form factor of that shrinks that's when i
01:07:53.440 | really do think we're going to find some very novel use case which is you know very unobtrusive it kind
01:08:00.160 | of blends into your own physical makeup as a person without it really sticking out i think that's when
you'll have a really killer feature but i think that the air pods as hearing aids will also add a lot
01:08:12.320 | so meta's doing a lot apple's doing a lot but i don't think we've yet seen the super killer hardware
01:08:17.760 | device yet and there was an interesting waypoint microsoft had the first tablets here's the the
01:08:23.520 | microsoft tablet for those of you watching that came you know i don't know this was the late 90s or early
2000s freeberg if you remember it these like incredibly bulky tablets that bill gates was
01:08:35.760 | bringing to all the events 99 2000 yeah that era so you get a lot of false starts they're spending
i think close to 20 billion dollars a year or so on this ar vr stuff anyway we're definitely on this path to
ambient computing i don't think this whole like hey you got to control a computer thing is
01:08:50.800 | anything my kids are going to be doing in 20 years this is uh this is the convergence of like three
01:08:54.720 | or four really interesting technological waves all right just uh dovetailing with tech jobs and the
01:09:00.160 | static team size there is a report of a blue collar boom the tool belt generation is what gen z
is being referred to as a report in the wall street journal says tech jobs have dried up we're
all seeing that and according to indeed developer jobs are down 30 percent since february of 2020
pre-covid of course if you look at layoffs.fyi you'll see all the you know tech jobs that have
01:09:26.960 | been eliminated since 2022 over a half million of them a bunch of things at play here and the wall
01:09:34.160 | street journal notes that entry-level tech workers are getting hit the hardest especially all these recent
college graduates and if you look at historical college enrollment let's pull up that chart nick you can
01:09:47.200 | see here undergraduate graduate and total with the red line we peaked at 21 million people in either
graduate school or undergraduate in 2010 and that's come down to 18.6 million at the same time obviously
in the last 12 years the population has grown so you know on a percentage
basis this would be even more dramatic so what's behind this a poll of a thousand teens this summer found that
01:10:13.360 | about half believe a high school degree trade program or two-year degree best meets their career
01:10:18.720 | needs and 56 percent said real world on the job experience is more valuable than obtaining a college
01:10:24.240 | degree something you've talked about with your own personal experience chamath at waterloo doing
01:10:29.680 | apprenticeships essentially your thoughts on generation tool belt such a positive trend i mean
01:10:35.600 | there's so many reasons why this is good i'll i'll just list a handful that come to the top of my mind
01:10:41.360 | the first and probably the most important is that it breaks this stranglehold that the university
01:10:49.520 | education system has on america's kids we have tricked millions and millions of people
01:10:59.920 | into getting trillions of dollars in debt on this idea that you're learning something in university
01:11:05.760 | that's somehow going to give you economic stability and ideally freedom and it has turned out for so many
01:11:14.720 | people to not be true it's just so absurd and unfair that that has happened so if you can go and get a
trade degree and live an economically productive life where you can get married and have kids and take care of
01:11:28.160 | your family and do all the things you want to do that's going to put an enormous amount of pressure
01:11:32.240 | on higher ed why does it charge so much what does it give in return that's one thought the second
01:11:39.440 | thought which is much more narrow peter thiel has that famous saying where if you have to put the word
01:11:44.960 | science behind it it's not really a thing and what we are going to find out is that that was true for
a whole bunch of things where people went to school for like political science and social
01:11:57.120 | science but i always thought that computer science would be immune but i think he's going to be right
01:12:02.640 | about that too because you can spend two or three hundred thousand dollars getting in debt to get a
01:12:08.000 | computer science degree but you're probably better off learning javascript and learning these tools
01:12:12.640 | in some kind of a boot camp for far far less and graduating in a position to make money right away
01:12:18.240 | so those are just two ideas i think that it allows us to be a better functioning society so i am
really supportive of this trend sacks your thoughts on this generation tool belt that we're reading about
01:12:32.080 | and you know the sort of combination with static team size that we're seeing in technology companies
keeping the number of employees the same or trending down while they grow 30 percent year over year
oh my god i'm like so sick of this topic of job loss or job disruption yeah i got in so much
01:12:50.560 | trouble last week you asked a question about whether the upper middle class is going to suffer because
they're all going to be put out of work by ai and i just kind of brushed it off not because i'm advocating
01:13:00.880 | for that but just because i don't think it's going to happen this whole thing about job loss is so
01:13:06.400 | overdone there's going to be a lot of job disruption but in the case of coders just as an example i think
we can say that coders depending on who you talk to are 10 20 30 percent more productive as a result of these
coding assistant tools but we still need coders you can't automate 100 percent of it and the world needs so many
01:13:25.600 | of them the need for software is unlimited we can't hire enough of them at glue by the way shout out if
01:13:31.840 | you're looking if you're a coder who is afraid of not being able to get a job apply for one at
01:13:36.160 | glue believe me we're hiring i just think that this is so overdone there's going to be a lot of
01:13:41.920 | disruption in the knowledge worker space like we talked about the workflow at call centers and just
01:13:49.200 | service factories there's going to be a lot of change but at the end of the day i think there's
01:13:53.920 | going to be plenty of work for humans to do and some of the work will be more in the blue collar space
and i agree with chamath that this is a good thing i think there's been perhaps an overemphasis on the idea
01:14:05.920 | that the only way to get ahead in life is to get uh like a fancy degree from one of these universities
01:14:11.680 | and we've seen that many of the universities they're just not that great they're overpriced you end up
01:14:17.440 | graduating with a mountain of debt and you get a degree that is you know maybe even far worse than
computer science that is completely worthless so if people learn more vocational skills if they skip college
because they have a proclivity to do something that doesn't need that degree i think that's a good
thing and that's healthy for the economy freeberg is this like uh just the pendulum swung too much
and education got too expensive spending 200k to make 50k a year is distinctly different than our
01:14:44.560 | childhoods or i'm sorry our adolescence when we were able to go to college for 10k a year 20k a year
01:14:49.360 | graduate with you know some low tens of thousands in debt if you did take debt and then
your entry level job was 50 60 70k coming out of college what are your thoughts here is this a value issue with college
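as rough arithmetic on jason's contrast, here's the payback math under one stated assumption (10 percent of gross salary going to the debt, interest ignored to keep the illustration simple):

```python
# years to clear the debt at an assumed 10% of gross salary, interest ignored
def payback_years(debt: float, salary: float, share: float = 0.10) -> float:
    return debt / (salary * share)

print(payback_years(200_000, 50_000))  # today's worst case: 40.0 years
print(payback_years(30_000, 60_000))   # the older deal jason describes: 5.0 years
```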
well yeah i think the market's definitely correcting itself i think for years as
chamath said there was kind of this belief that if you went to college regardless of the
01:15:09.520 | college there was this outcome where you would make enough money to justify the debt you're taking on
01:15:17.600 | and i think folks have woken up to the fact that that's not reality again if there was a free market
01:15:23.600 | remember most people go to college with student loans and all student loans are funded by the federal
government so the cost of education has ballooned and the underwriting criteria necessary for this
01:15:36.800 | free market to work has been completely destroyed because of the federal spending in the student loan
program there's no discrimination between one school or another you could go to trump university
01:15:49.600 | or you could go to harvard it doesn't matter you still get a student loan
01:15:53.360 | even if at the end of the process you don't have a degree that's valuable and so i think folks are
01:15:59.360 | now waking up to this fact and the market is correcting itself which is good i'll also say that
i think that with mass production and industrialization generally there's this premium on
the human touch and what i mean is if you think about it hey you could go to the store and buy a bunch of
01:16:20.880 | cheap food off of the store shelves you could buy a bunch of hershey's chocolate bars
or you can go to a swiss chocolatier in downtown san francisco pay 20 dollars for a box of handmade chocolates
01:16:32.160 | you'll pay that premium for that better product same with clothes there's this big trend in kind of
handmade clothes and high-end luxury goods bespoke artisanal handmade and similarly i
01:16:46.000 | think that there's a premium in human service in the partnership with a human it's not just about
01:16:52.080 | blue-collar jobs it's about having a waiter talk to you and serve you if you go to a restaurant instead
01:16:59.600 | of having a machine spit out the food to you there's an experience associated with that that
01:17:04.240 | you'll pay a premium for there's hundreds and hundreds of micro breweries in the united states
that in aggregate outsell budweiser and miller and even modelo today and that's because they're
handcrafted by local people and there's an artisan craftsmanship yeah so while technology and
01:17:20.480 | ai are going to completely reduce the cost of a lot of things and increase the production and
01:17:26.560 | productivity of those things one of the complementary consequences of that is that there will be an
01:17:32.240 | emerging premium for human service and i think that there will be an absolute burgeoning and blossoming
01:17:38.160 | in the salaries and the availability and demand for human service in a lot of walks of life certainly
01:17:45.120 | there's all the work at home the electricians and the plumbers and so on but also fitness classes food
01:17:51.360 | personal service around tutoring and learning and developing oneself there's going to be an incredible
01:17:57.040 | blossoming i think in human service jobs and they don't need to have a degree in poli-sci to be
01:18:02.000 | performed i think that there will be a lot of people that'll be very happy in that world how do
you see the differentiation freeberg between the person doing that job versus the agent or the ai or whatever
01:18:12.400 | well these are in-person human jobs so if i want to do a fitness class do i want to stare at the tonal
this is what i'm asking you yeah i think that there's an aspect of um look it's
like your loro piana like you talk about the story of loro piana where is the vicuna coming from
how's it made who's involved in it like yes look oh my god look at those white truffles don't
stop freeberg i can give you truffle flavoring out of a can but you love the white truffles you
01:18:42.400 | want to go to italy you want the storytelling there's an aspect of it right like yeah and i think that
there's an aspect of humanity that we pay a premium for and we do and will look etsy crushes it i don't know
01:18:52.720 | how much stuff you guys buy on etsy i love buying from etsy i love finding handmade stuff on it i buy my
01:18:57.360 | underwear for decoration no you don't do you really yes i do yeah handcrafted yeah handmade so i think that
01:19:03.120 | there's an aspect of this that um in a lot of walks of life i mean i have so many jokes right
i've never used that site but i'm going to try it now after have you guys taken music lessons lately
you know my kids do piano lessons and so last year i started ducking in to do a
01:19:20.160 | third 45 minute piano lesson with the piano teacher there's just like a great aspect to paying for these
services it's fascinating you bring that up oh here we go you can play the harmonica really
i've just like well i've been playing some i want to play some zach bryan songs and he's got a couple
01:19:36.320 | songs i like with a harmonica in them so i just got a harmonica my daughter and i have been playing
01:19:40.080 | harmonica yeah are you teaching yourself let's hear it let's hear it let's hear it i'll play it next week
it could be a bit but i'll write you a song next week don't be shy he's a little shy
01:19:54.800 | no no i'll do i'll i'll write a song i'll do the uh the trials and tribulations of donald trump
01:20:01.520 | and i'll do a little bob dylan send-up song for you
but listen to that interview with um bob dylan i don't know when it was recently
about how he wrote the lyrics that clip oh yeah that's amazing that ed bradley clip
01:20:16.880 | about magic yeah well you know some of those songs i don't know how i wrote them they just came out
but the best part is what he says when asked can he write like that anymore he's like no but i did it once
no but i did it i mean what an incredible that means something yeah
01:20:32.880 | eclipses are both that's really grounding it's really grounding to understand too soon there is no
01:20:38.640 | chance of dying yeah that's an incredible clip all right guys want to wrap or you want to keep
talking about more stuff we're at 90 minutes here let me just tell you something i think there's
01:20:47.840 | going to be a big war i think by the time the show airs israel's incursion into lebanon is going to
01:20:52.960 | get bigger it's going to escalate and by next week we could be in a full-blown multinational war in the
middle east and if i am you know a betting man i would bet that the odds are you know more than 30 or 40 percent
that this happens before the election that this conflict in the middle east escalates thank you for
bringing this up i am not asking anybody to go listen to my interview with rogan but i will say
this part of why i was so excited to go and talk to him in a long form format was that this issue of war
01:21:32.720 | is i think the existential crisis of this election and of this moment and i really do agree with you
01:21:41.040 | freeberg there is a non-trivially high probability the highest it's ever been that we are just bumbling
01:21:48.400 | and sleepwalking into a really bad situation we can't walk back from i really hope you're wrong and
here's the situation i really hope you're wrong if israel incurs further into lebanon
01:22:02.640 | going after hezbollah and iran ends up getting involved in a more active way does russia start to
01:22:11.280 | provide supplies to iran like we are supplying to ukraine today does this sort of bring everyone to a
01:22:17.680 | line just to give you a sense you know of the scale of what israel could then respond with iran has 600 000
01:22:24.880 | active duty military another 350 000 in reserve they have dozens of ships they have 19 submarines they
01:22:31.040 | have a 600 kilometer range missile system israel has 90 000 sorry 170 000 active duty and half a million
01:22:38.640 | reserve personnel 15 warships five submarines potentially up to 400 nuclear weapons including
01:22:45.200 | a very wide range of tactical sub one kiloton nuclear weapons small small payload you could see that if israel
01:22:53.680 | starts to feel incurred upon further they could respond in a more aggressive way with what is
you know by far and away the most significantly stocked arsenal and military force
01:23:09.520 | in the middle east again we've talked about what are these other countries going to do what is jordan
01:23:14.240 | going to do in this situation how is uh how are the saudis going to respond what is russia going to do
01:23:20.640 | well the russia ukraine thing meanwhile still goes on and we saw in our group chat one of our friends
01:23:26.000 | posted but russia basically said any more attacks on our land you know we reserve all rights including
01:23:32.720 | nuclear response that is insane well you know so just to give you a sense um it's insane how are we here
yeah so the nuclear bombs that were set off during world war ii i just want to show you how crazy
this is do you see that image on the left all the way over on the left that's a bunker buster you
guys remember those from afghanistan and the damage that those bunker buster bombs caused hiroshima
is a 15 kiloton nuclear bomb and you can see the size of it there on the left that's a zoom in of the image
01:24:10.240 | on the right and the image on the right starts to show the biggest ever tested was tsar bomba by the
soviets this was a 50 megaton bomb it caused shock waves that went around the earth three times they
01:24:26.080 | could be felt as seismic shock waves around the earth three times from this one detonation today
01:24:30.960 | there are a lot of 0.1 to one kiloton nuclear bombs that are kind of considered these tactical nuclear
01:24:38.080 | weapons that kind of fall closer to between the bunker buster and the hiroshima and that's really
01:24:45.200 | where a lot of folks get concerned that if israel or russia or others get cornered in a way and there's
01:24:51.440 | no other tactical response that that is what then gets pulled out now if someone detonates a 0.1 or
01:24:57.920 | one kiloton nuclear bomb which is going to look like a mega bunker buster what is the other side and
01:25:03.280 | what's the world going to respond with that's how on the brink we are and there's 12 000 nuclear weapons
01:25:10.240 | with an average payload of 100 kilotons around the world the u.s has a large stockpile russia has the
01:25:17.520 | largest many of these are hair trigger alert systems china has the third largest and then
01:25:22.800 | israel and india and and so on it is a very concerning situation because if anyone does get
01:25:30.400 | pushed to the brink that has a nuclear weapon and they pull out a tactical nuke does that mean the game
is on and that's why i'm so nervous about where this all leads to if we can't decelerate it's very scary because you can very quickly see this thing escalating
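for scale, here are the ratios implied by the yields freeberg quotes, taking his numbers at face value:

```python
# back-of-envelope ratios from the yields quoted above
hiroshima_kt = 15
tsar_bomba_kt = 50 * 1000          # 50 megatons in kilotons
tactical_kt = 1.0                  # top of the 0.1-1 kiloton tactical range
stockpile_kt = 12_000 * 100        # ~12,000 warheads at ~100 kt average

print(tsar_bomba_kt / hiroshima_kt)   # ~3333x hiroshima
print(tactical_kt / hiroshima_kt)     # a 1 kt tactical nuke is ~1/15 of hiroshima
print(stockpile_kt / hiroshima_kt)    # global stockpile ~80,000 hiroshimas
```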
i'm the most objectively scared i've ever been
01:25:45.040 | and i think that people grossly underestimate how quickly this could just spin up out of control
01:25:52.320 | and right now not enough of us are spending the time to really understand why that's possible and
then also try to figure out what's the off-ramp and i think it's just incredibly important
01:26:06.080 | that people take the time to figure out that this is a non-zero probability and this is probably for
01:26:11.920 | many of us the first time in our lifetime where you could really say that well i think freeberg's right
01:26:16.800 | that we're at the beginning stages of i think what will soon be referred to as the third lebanon war
01:26:22.800 | the first one was in 1982 uh israel went into lebanon and occupied it until 2000 then it went back in 2006
01:26:31.600 | left after about a month and now we're in the third war it's hard to say exactly how much this will
01:26:38.080 | escalate the idf is exhausted after the war in gaza there's significant opposition within israel and
01:26:47.600 | within the armed forces to a big ground invasion of lebanon so far most of the fighting has been
01:26:55.040 | israel using its air superiority overwhelming firepower against southern lebanon and i i think
01:27:02.800 | that if israel makes a ground invasion they're giving hezbollah the war that hezbollah wants i mean
01:27:10.160 | hezbollah would love for this to turn into a guerrilla war in southern lebanon so i think there's still some
01:27:16.080 | question about whether netanyahu will do that or not at the same time it's also possible that hezbollah will
attack northern israel nasrallah has threatened to invade the galilee in response to what israel is
01:27:34.880 | doing so there's multiple ways this could escalate and if hezbollah and israel are in a full-scale war
01:27:43.600 | with ground forces it could be very easy for iran to get pulled into it on hezbollah's side and if
01:27:50.080 | that happens i think it's just inevitable that the united states will be pulled into this war so yeah
look i think we have been drifting into a regional war in the middle east that
01:28:01.920 | you know ideally would not pull in the u.s i think the u.s should try to avoid being pulled
01:28:07.600 | in but i think very likely will be pulled in if it escalates and then meanwhile
01:28:13.680 | in terms of the war in ukraine i mean i've been warning about this for two and a half years how
01:28:17.440 | dangerous the situation was and that's why we should have availed ourselves of every diplomatic
01:28:23.520 | opportunity to make peace and we now know because there's been such universal reporting that in istanbul
01:28:30.560 | in the first month of the ukraine war there was an opportunity to make a deal with russia where ukraine
01:28:35.920 | would get all this territory back it's just that ukraine would have to agree not to be part of
01:28:40.640 | nato would have to agree to be neutral and not part of the western military bloc that was so threatening
01:28:46.000 | to russia the biden administration refused to make that deal they sent in boris johnson to scuttle it
they threw cold water on it they blocked it they told zelensky we'll give you all the weapons you need
to fight russia zelensky believed that it has not worked out that way ukraine is getting destroyed
01:29:03.040 | it's very hard to get honest reporting on this from the mainstream media but the sources i've read
suggest that the ukrainians are losing about 30 000 troops per month and that's just kia i don't even
think that includes wounded on a bad day they're suffering 1200 casualties it's more than even
01:29:23.360 | during that failed counter-offensive last summer that ukraine had during that time they were losing
about 20 000 troops a month so the level of carnage is escalating russia has more of everything
01:29:36.000 | more weapons more firepower air superiority and they are destroying ukraine and it's very clear i think that
01:29:43.120 | ukraine within it could be in the next month it could be the next two months it could be the next six
01:29:48.080 | months i think they're eventually going to collapse they're getting close to being combat incapable and
01:29:53.440 | in a way that poses the biggest danger because the closer ukraine gets to collapse the more the west is
01:30:00.320 | going to be tempted to intervene directly in order to save them and that is what zelinski was here in
01:30:06.800 | the us doing over the past week is arguing for direct involvement by america in the ukraine war
01:30:13.760 | to save him how did he propose this he said we want to be directly admitted to nato immediately that was
his request and he called this the victory plan so in other words his plan for victory is to get america
01:30:26.160 | involved in the war and fighting it for him but that is the only chance ukraine has and it is possible
01:30:32.160 | that the biden harris administration will agree to do that or at least agree to some significant
escalation so far i think biden to his credit has resisted another zelensky demand which is the ability
01:30:43.680 | to use america's long-range missiles and british long-range missiles the storm shadows against russian
cities that is what zelensky is asking for zelensky wants a major escalation of the war because
01:30:54.640 | that is the only thing that's going to save him save his side and maybe even his neck personally
01:31:00.400 | and so we're one mistake away from the very dangerous situation that chamath and freeberg have
01:31:07.360 | described if a president biden who is basically senile or a president harris agree to one of these
zelensky requests we could very easily find ourselves in a direct war with the russians the waltz into
01:31:22.560 | world war three is what it should be called and the reason why this could happen is because we don't
have a fair media that has fairly reported anything about this war i mean trump is on the campaign trail
01:31:33.520 | making i think very valid points about this war that the ukrainian cause is doomed and that we should be
01:31:39.600 | seeking a peace deal and a settlement before this can spiral into world war three that is fundamentally
01:31:45.600 | correct but the media portrays that as being pro-russian and pro-putin and if you say that you want
01:31:51.040 | peace you are basically on the take from putin and russia that is what the media has told the american
01:31:56.000 | public for three years the definition of liberalism has always been being completely against war of any
01:32:03.760 | kind and being completely in favor of free speech of all kinds that's what being a liberal means
01:32:10.160 | we've lost the script and i think that people need to understand that this is the one issue where if
01:32:17.280 | we get it wrong literally nothing else matters and we are sleepwalking and tiptoeing into a potential
massive world war jeffrey sachs said it perfectly you don't get a second chance in the nuclear age
01:32:33.600 | you don't all it takes is one big mistake you do not get a second chance and for me
01:32:38.720 | i have become a single issue voter this is the only issue to me that matters we can sort everything
01:32:47.120 | else out we can figure it all out we can we can find common ground and reason should taxes go up
01:32:53.760 | should taxes go down let's figure it out should regulations go up should regulations go down we can
01:32:59.680 | figure it out but we are fighting a potential nuclear threat on three fronts how have we allowed this to
01:33:09.360 | happen russia iran china you cannot underestimate that when you add these kinds of risks on top of each
01:33:19.040 | other something can happen here and i don't think people really know they're too far away from it they're
01:33:25.280 | too many generations removed from it war is something you heard maybe your grandparents
01:33:30.080 | talk about now and you just thought okay whatever i lived it it's not good
chamath you're right i mean during the cuban missile crisis all of america was huddled around their tv sets
01:33:42.000 | worried about what would happen there is no similar concern in this day and age about the escalatory wars
01:33:49.760 | that are happening there's a little bit of concern i think about what's happening in the middle east there's
01:33:53.120 | virtually no concern about what's happening in ukraine because people think it can't affect them
01:33:58.400 | but it can and one of the reasons it could affect them is because we do not have a fair debate about
that issue in the u.s media the media has simply caricatured any opposition to the war as being pro-putin
01:34:11.520 | so i would say that when every pundit and every person in a position to do something about it says
01:34:22.240 | you have nothing to worry about you probably have something to worry about and so when everybody is
trying to tell you that this is not a risk it's probably a bigger risk than we think
yeah they're protesting too much how can you say it's not a risk methinks thou doth protest too much
01:34:44.880 | all right all right love you boys
01:34:49.760 | we'll let your winners ride
01:34:51.760 | rain man david sacks
01:34:54.560 | and it said we open source it to the fans and they've just gone crazy with it
01:35:01.680 | love you guys i'm going all in what what your winners ride what what your winners ride
01:35:09.120 | besties are gone
01:35:11.360 | that's my uh dog taking a notice in your driveway
you should all just get a room and just have one big huge orgy because they're all just useless
01:35:24.880 | it's like this like sexual tension but they just need to release somehow
we need to get merch besties are back i'm doing all in i'm doing all in