Did OpenAI Just Kill Social Media? | Cal Newport

Chapters
0:00 Did OpenAI Just Kill Social Media?
46:12 Will Sora 2 kill creativity?
55:55 Is Substack better than social media?
66:36 What are your thoughts on a “family smartphone”?
68:00 What should I do when others disregard the structures that are meant to reduce the digital back and forth?
75:45 A Digital Declutter
79:59 Children with dumbphones or smartphones
87:47 Sam Altman Proposes AI Erotica
00:00:00.240 |
if you've spent enough time online this past month you've probably noticed a sudden uptick in 00:00:05.280 |
unsettlingly realistic ai generated videos maybe it's body cam footage of super mario in his kart 00:00:12.880 |
being pulled over by police and then yelling i can't get another ticket before peeling out 00:00:17.120 |
or a horse on skis competing in a freestyle skiing competition or bob ross wrestling 00:00:24.000 |
abraham lincoln these were all generated by open ai's blockbuster video generation model sora 2 00:00:31.120 |
which allows you to create videos simply by describing them in words open ai however 00:00:37.120 |
has gone one step further and bundled up access to this model in a new app that they call simply sora 00:00:44.080 |
that operates like tick tock you can easily create videos and share them you can view other people's 00:00:49.680 |
creations and there's an algorithmically curated feed just like you would see on tick tock so you 00:00:55.280 |
can have a steady incoming feed of grade a high octane slop the online world is usually pretty 00:01:02.480 |
excited about new ai products but the reaction to sora has been more interesting people online seem 00:01:08.080 |
to put it simply unsettled i want to play a clip here here's casey neistat reacting to the new sora app 00:01:23.040 |
fucked uh and now here is vlogger hank green venting some of his own frustrations 00:01:32.720 |
boy humanity if you're the kind of person who will create sloptok you are not the kind of person who 00:01:40.080 |
should be in charge of open ai all right the new york times for its part was really worried about what 00:01:45.680 |
this means for even just truth they quoted a founder of a tech non-profit saying the following 00:01:50.960 |
nobody will be willing to accept videos as proof of anything anymore all right but for all of these 00:01:58.320 |
particular fears we're hearing now about sora there is one fear in particular that i think is really 00:02:04.480 |
interesting but is not getting discussed that much yet and that is the question of what will the impact be 00:02:12.160 |
of apps like sora on the existing major social media platforms think about it it's an app on your phone 00:02:18.480 |
in which anyone can basically create any visual content they can think of why would i watch a tick 00:02:24.640 |
tock dance routine when using sora i can create a video of michael jackson dancing on mount everest 00:02:31.120 |
with martin luther king jr that's actually a real video i found jesse so by introducing sora has open ai 00:02:36.960 |
just radically changed the entire social media landscape this is the question we're going to get 00:02:41.520 |
into today as always i'm cal newport and this is deep questions 00:02:55.200 |
today's episode did open ai just kill social media 00:03:03.920 |
all right so we gotta be careful in our approach here we're going to go very systematically 00:03:12.560 |
to investigate this question and what we think the answer is so if we really want to understand the 00:03:18.400 |
danger that the existing social media platforms currently face i want to take you back in time to 00:03:25.520 |
roughly the summer of 2022 back then there was a new entrant in town called tick tock it was just taking 00:03:33.200 |
off and having this sort of phenomenally fast user growth and the entrenched social media platforms 00:03:38.960 |
they were horrified by the speed with which tick tock was growing and the fear that it was 00:03:44.480 |
pulling their users over into its own ecosystem so the existing social media incumbents responded the 00:03:52.080 |
way you might expect they tried to make their services more like tick tock this means they 00:03:58.160 |
began to ignore the way they classically operated with friends and followers and favorites and retweets 00:04:03.760 |
and instead tried to offer up a stream of content selected exclusively by an algorithm to be as low friction 00:04:11.360 |
as possible and as engaging as possible i noticed this shift with some interest back when it was happening 00:04:17.680 |
and i wrote an article about this for the new yorker back in the summer of 2022 it was called tick tock and 00:04:23.760 |
the fall of the social media giants and that article argued that this move to be more like tick tock was 00:04:29.840 |
something that they would come to regret so why don't i load this article on the screen here 00:04:33.120 |
for those who are watching instead of just listening all right so here's the article pretty cool graphic 00:04:38.480 |
i don't know if i ever really understood it jesse but it seems to be 00:04:42.400 |
a lot of people in cages but the doors are open and they're circular cages and they're dancing 00:04:47.120 |
the cool graphic i never quite understood it all right so let me start with the following thing i 00:04:51.360 |
wrote in this article when uh after introducing the idea that the other social media platforms the 00:04:56.400 |
existing platforms were chasing tick tock i wrote this shift is not surprising given tick tock's phenomenal 00:05:02.640 |
popularity but it's also short-sighted now to unpack that let's start by asking a fundamental question 00:05:11.520 |
what was it that made the existing social media giants so unassailable before the era of tick tock 00:05:17.040 |
began i want to go through three of them one by one let's look at them carefully we'll start with facebook 00:05:23.120 |
when you're facebook what made you hard to compete with well you had two things going on 00:05:29.120 |
one was the specific people who used it because facebook was so early to the social media 00:05:34.240 |
space they got lots and lots of regular people to sign up to use it that meant for each individual new 00:05:39.920 |
user facebook had a really good pitch the people you know whether they're your friends your family 00:05:47.600 |
or somebody new from high school your college roommate they're probably on this platform and you 00:05:50.880 |
can connect to them they're not on other platforms but they're on our platform so the people 00:05:54.480 |
you know are on this platform the other advantage that facebook built up is that early on as all 00:05:59.120 |
of these regular people joined it they put in a lot of free labor to select people who were their friends 00:06:06.000 |
and to manually tell facebook that's a friend that's a friend that's a friend which created in the 00:06:10.880 |
end this sort of like dense social graph that captured who knew who that allowed it then to 00:06:17.520 |
deliver a pretty compelling content profile to its users i can tell you what people you know are up to 00:06:24.080 |
if there's changes in their life what they're reading what they're interested in that's a pretty 00:06:29.440 |
compelling package it was hard for later entrants into the social media space to compete with because 00:06:34.960 |
they weren't going to get a billion people necessarily to sign up and to spend all their time 00:06:39.200 |
saying who their friends are if you wanted to know what people you knew were up to be on the same 00:06:42.480 |
network with people you knew facebook was the only game in town that was a great competitive advantage 00:06:47.680 |
all right well let's look at instagram instagram had a slightly different set of advantages 00:06:53.920 |
they were early as well and it also mattered who signed up for their platform but it was 00:07:00.160 |
less from the point of view of the user it was less about people you personally knew being on this 00:07:06.640 |
platform but instead the fact that there was lots of interesting and compelling expert type 00:07:10.960 |
people on this platform creating visually interesting content so uh fitness influencers writers people 00:07:18.800 |
who lived in ghost towns people who walked around in white linen dresses and collected flowers while 00:07:24.640 |
soft music plays there were interesting people we'll put like experts in scare quotes but like people who 00:07:29.680 |
were uh good at what they did or known for things that were on there producing visually interesting 00:07:34.720 |
content so it's a place you could go to follow people like cooks chefs everything right bakers 00:07:40.720 |
bread makers that did things that you thought was interesting or compelling you wanted to do 00:07:44.240 |
and they're producing visually interesting content they had a lot of those people who 00:07:47.360 |
had invested in being on instagram and once again users took the time to go through and painstakingly 00:07:55.040 |
favorite and follow these different influencers so in the end what could instagram offer its users 00:07:59.280 |
a steady curated stream of content that was about stuff that the user cared about from people they felt 00:08:06.400 |
like were doing interesting things and it was hard for competitors to come along because it's hard to get all those same 00:08:10.560 |
writers and bakers and people who walk around in white linen dresses it's hard to get them to sign up 00:08:15.040 |
for your new service and if i'm a new user of a new service and i haven't gone through all the effort of 00:08:19.440 |
clicking on all the people that i like and want to follow then it's hard to figure out what to show me so 00:08:22.880 |
that was a good advantage that instagram had what about twitter well in twitter's case again what they 00:08:29.680 |
got early on from a user-based perspective that was useful is they got the people who were 00:08:34.640 |
interesting from a cultural zeitgeist perspective the politically adjacent people the reporters 00:08:41.280 |
the comedians you had people who could make interesting or funny or smart or astute or off 00:08:48.240 |
kilter observations about what was happening in the world all these interesting people had gone to twitter 00:08:53.760 |
when it began to take off even more importantly they got all of their users to again go through 00:09:00.960 |
this painstaking process of creating these follower relationships i follow this person this person this 00:09:06.000 |
person that person follows this person this person that person if you study the resulting graph 00:09:11.680 |
it has a very high expansion factor so if i go to my followers and then my followers' followers and my 00:09:18.000 |
followers' followers' followers the number of people being reached increases very quickly it increases 00:09:23.680 |
along typically like an exponential curve now what this meant was twitter had whether they meant to or not 00:09:32.320 |
created a fantastic basically human driven distributed curation machine you have lots of people who know 00:09:39.200 |
lots of things and are interesting and have good quips and if someone makes like a really good relevant 00:09:44.640 |
observation or picks out a piece of news that like is going to be really interesting to the moment 00:09:49.200 |
and they tweet it to their followers the cascade of retweets that follows can get that content 00:09:56.480 |
to a huge percentage of the hundreds of millions of users using twitter very quickly 00:10:00.640 |
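To make the expansion factor concrete, here is a minimal sketch in python (an added illustration, not something from the episode) that counts how many new users a retweet cascade reaches at each hop on a toy follower graph; the network size and fan-out are made-up numbers:

    import random
    from collections import deque

    def reach_by_hops(followers, start, max_hops):
        # breadth-first search from `start`; followers[u] lists the users who
        # follow u, i.e. the users who will see u's retweet in their feeds
        seen = {start}
        frontier = deque([(start, 0)])
        counts = [0] * (max_hops + 1)
        while frontier:
            user, hops = frontier.popleft()
            counts[hops] += 1
            if hops == max_hops:
                continue
            for f in followers[user]:
                if f not in seen:
                    seen.add(f)
                    frontier.append((f, hops + 1))
        return counts  # counts[k] = users first reached at exactly k hops

    random.seed(0)
    n = 100_000                                                  # toy network size
    followers = [random.sample(range(n), 15) for _ in range(n)]  # fan-out of 15
    print(reach_by_hops(followers, start=0, max_hops=3))
    # prints something like [1, 15, ~220, ~3300]: each hop multiplies the
    # audience by roughly the fan-out, which is the exponential growth described
    # above, and why one well-placed retweet can saturate the network so fast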
so as a result they created this engine for keeping the finger on the pulse of the online zeitgeist 00:10:06.400 |
in a way that would be like very hard to replicate from scratch so if i come along to be a twitter 00:10:10.320 |
clone and a lot of people tried this the problem is if i don't have enough of those interesting people 00:10:16.000 |
and we don't have those people spending enough time making these intricate follower relationships so that 00:10:19.840 |
we have these exponential cascades i just end up getting like a relatively boring feed directly of 00:10:25.600 |
people who aren't that interesting it's just not that interesting to me so again twitter had a really good 00:10:30.560 |
business model a really good competitive advantage in all three cases we can generalize 00:10:35.520 |
what had made up until this point we're getting to now what had made these services so powerful is 00:10:40.320 |
that they had the right users and these painstakingly constructed social graphs and it was hard for 00:10:46.640 |
anyone new to come along get both of those things so they were in a good position but then tick tock came 00:10:53.040 |
along and the thing about tick tock is that it doesn't really care about who its users are and tick 00:11:00.640 |
tock did not really care about who knew who follower graphs favorites interests they didn't really care about 00:11:07.600 |
that either they basically bypassed the elements that made the social media giants hard to compete with 00:11:14.640 |
i want to bring up on the screen here uh this is the way i explained it in my article so why don't why 00:11:19.520 |
don't i just read this because i think it captures it well the effectiveness of the tick 00:11:24.240 |
tock experience is found in what it doesn't require unlike twitter tick tock doesn't need a critical 00:11:30.560 |
mass of famous or influential people to use it for its content to prove engaging the short video format grabs 00:11:36.800 |
users attention at a more primal level relying on visual novelty or a clever interplay of music and 00:11:41.680 |
action or direct emotional expression to generate its appeal and unlike facebook tick tock doesn't 00:11:48.080 |
require that your friends already use the service for you to find it useful though there are some 00:11:52.720 |
social features built into tick tock they're not the main draw of the app tick tock also doesn't 00:11:57.280 |
rely on its users to manually share content with friends or follow followers to surface compelling 00:12:01.920 |
offerings it assigns this responsibility to its scary good recommendation algorithm all right so tick tock 00:12:09.040 |
bypassed the competitive advantages that protected the social media giants by just saying look we're 00:12:13.600 |
just going to take a whole bunch of content we don't care who creates it and we'll use a very good 00:12:18.640 |
probably like two tower style recommendation algorithm to just show you a constant stream of stuff we 00:12:22.720 |
think you'll like it will adjust what it shows you based on your behavior and it will just be right 00:12:26.080 |
to the brain stem interesting 00:12:31.440 |
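Since the "two tower" style gets name-checked here, this is a minimal sketch of that retrieval idea (illustrative only; tiktok's actual system is not public, and every shape and feature here is invented): one small network embeds the user, another embeds each video, and a dot product scores the match.

    import numpy as np

    rng = np.random.default_rng(0)
    DIM = 32

    def tower(features, weights):
        # one "tower": a tiny one-layer network mapping raw features to an embedding
        return np.tanh(features @ weights)

    # hypothetical feature sizes: 10 user features (watch-history stats, etc.)
    # and 20 item features (length, topic tags, audio features, etc.)
    user_weights = rng.normal(size=(10, DIM))
    item_weights = rng.normal(size=(20, DIM))

    user_vec = tower(rng.normal(size=10), user_weights)              # embed one user
    item_vecs = tower(rng.normal(size=(100_000, 20)), item_weights)  # embed the inventory

    scores = item_vecs @ user_vec           # dot product = predicted engagement
    feed = np.argsort(scores)[-10:][::-1]   # serve the highest-scoring videos
    print(feed)
    # the feedback loop: watch time on served videos becomes the training signal
    # that updates the tower weights, so the feed adapts to your behavior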
when facebook instagram and twitter noticed that and chased after tick tock they ignored or walked away from their competitive advantages when they started adding features that 00:12:39.040 |
just served up content selected by algorithms independent of who created it or who you like or 00:12:43.760 |
follow as a user they were basically walking away or walking beyond their own proverbial castle walls and out 00:12:50.640 |
there in the plains beyond they were vulnerable so i ended that article in 2022 with a warning which i'm 00:12:57.600 |
going to read right here all right this all points to a possible future in which social media giants 00:13:04.560 |
like facebook may soon be past their long stretch of dominance they'll continue to chase new engagement 00:13:10.320 |
models leaving behind the protection of their social graphs and in doing so eventually succumb to the 00:13:14.400 |
new competitive pressures this introduces tick tock of course is subject to the same pressure so in this 00:13:20.080 |
future it too will eventually fade the app's energetic embrace of shallowness makes it more likely in the 00:13:26.080 |
long term to become the answer to a trivia question than a sustained cultural force in the wake churned by 00:13:32.320 |
these sinkings will arise new entertainments and new models for distractions but also innovative new apps 00:13:36.800 |
and methods for expression and interaction so basically what i'm saying here is all of social media was 00:13:41.600 |
collapsing around the tick tock model which was just pure engagement and my argument here is if all you 00:13:46.560 |
are offering to people is pure engagement you are now competing with every other possible source of 00:13:52.400 |
engagement that exists out there when facebook is saying your friends are here you can see what your 00:13:57.200 |
friends are up to i'm not competing with netflix i'm not competing with podcasts or slot machines or 00:14:02.880 |
pornography or anything else because i'm offering you a very specific thing that only i can do but when i'm 00:14:07.440 |
instead just giving you like a long uh infinite feed of videos from people you don't know or really 00:14:12.800 |
care about that are just trying to capture your attention then that's just purified engagement and 00:14:16.880 |
you're competing with anything else in that moment that might grab my attention and so i ended that warning 00:14:21.840 |
by saying new things will come along that are even more engaging than you you don't have a competitive 00:14:27.760 |
advantage you can't stay on the pedestal for much longer all right this brings us back to the main argument for 00:14:33.200 |
today we can think of the sora app as one of the first major sort of new distractions or entertainments 00:14:41.280 |
that i warned about in that article once all these social media platforms said we're just going to tickle 00:14:46.560 |
your brain stem open ai came along and said well we can do that even better but you have to actually 00:14:53.120 |
get people to create these videos and they're confined by their environment and the people they have 00:14:58.960 |
to film and they're not professional actors they don't have good lighting and like really interesting 00:15:03.360 |
stuff is dangerous and hard to do and only a small number of video creators actually have enough 00:15:07.120 |
money to do that like we can just do that all in ai we can make if you if all you're looking at is 00:15:10.640 |
like i don't care what this video is i'm just swiping swiping swiping is that interesting 00:15:13.920 |
we can give you like i saw when i was doing research for this a video where it said this is true 00:15:20.960 |
jesse a swimming competition there's a lot of stephen hawking content out there it's a swimming 00:15:25.360 |
competition stephen hawking's on the block they fire the gun and he just falls in the water and drowns 00:15:30.160 |
that would be like it's in terrible taste really hard to actually stage and film 00:15:35.680 |
in sora you just type in that like in a sentence and you have that video so yeah they are in trouble 00:15:43.520 |
because i guess i could say they ignored me when you're in the world of pure novelty that's going to 00:15:51.680 |
be a constantly churning novelty it's hard to be the king of just pure novelty for that long so this 00:15:56.400 |
is all bad news for the existing social media giants but are they doomed do they have a chance here maybe 00:16:04.720 |
maybe they have a chance of breaking free out of the spiral that could be dangerous to them 00:16:10.240 |
so when we return i want to talk through what exactly facebook instagram and twitter 00:16:15.120 |
could do right now if they wanted to to protect themselves from the threat of ai generated distraction 00:16:20.960 |
but first i want to take a real quick break to hear a word from our sponsors 00:16:25.440 |
so going online without expressvpn is like not having a case for your phone most of the time you'll 00:16:34.160 |
probably be fine but all it takes is one drop and you'll wish you spent those few extra dollars 00:16:38.960 |
now i speak from experience here jesse i don't know if you know this but my phone screen has a big 00:16:42.480 |
crack across it and i'm not gonna do anything about that so what does a vpn do well when you connect 00:16:48.160 |
to the internet your traffic is contained in little bundles called packets and the data in these packets 00:16:53.040 |
that's protected right so like the actual information you typed into a form before you 00:16:57.600 |
click submit that's encrypted and people can't read it but the packet has an address on it that says 00:17:03.200 |
what site or service you're talking to that's out in the open that's plain text anyone can read that 00:17:08.640 |
right so the guy next to you with uh his radio in packet sniffing mode on his laptop 00:17:14.560 |
knows exactly what sites and services you're using as does your internet service provider 00:17:18.720 |
or whoever owns that public access point you happen to be connected to now just like with your 00:17:24.560 |
phone case maybe a lot of times no one's listening to what you're up to but it's just the one time that 00:17:28.400 |
they are that could really cause the problem that's why you need a vpn because a vpn protects you 00:17:33.360 |
when you use a vpn it talks to sites and services on your behalf 00:17:39.680 |
and all people learn is that you're using a vpn you actually encrypt your whole packet you send 00:17:43.600 |
it to the vpn server it decrypts it and then talks to the site and service on your behalf encrypts 00:17:47.440 |
the response and sends it back so now anyone who's saying hey who are you talking to all they know is 00:17:52.240 |
that you're talking to a vpn server privacy regained 00:17:58.320 |
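Here is a minimal conceptual sketch of that wrapping step (illustrative only: the dict-based "packets", the hostnames, and the use of the cryptography package's fernet cipher are all stand-ins; real vpns use protocols like wireguard or openvpn):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # in reality, negotiated between you and the vpn server
    f = Fernet(key)

    # without a vpn: the payload is encrypted (tls) but the destination is readable
    packet = {"dst": "example-bank.com", "payload": b"<tls-encrypted form data>"}
    print("observer sees:", packet["dst"])    # -> example-bank.com

    # with a vpn: the entire original packet is encrypted and wrapped in a new
    # packet whose only visible address is the vpn server itself
    tunneled = {"dst": "vpn.example.net", "payload": f.encrypt(str(packet).encode())}
    print("observer sees:", tunneled["dst"])  # -> vpn.example.net, nothing else

    # the vpn server decrypts the wrapper and talks to the real site on your behalf
    print("vpn server forwards:", f.decrypt(tunneled["payload"]).decode())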
if you're going to use a vpn i suggest today's sponsor expressvpn it's easy to use it works on all of your devices and it's rated number one by top tech reviewers 00:18:03.440 |
like cnet and the verge if you really want to try something cool they now offer a dedicated 00:18:09.600 |
ip service that uses a technology called zero knowledge proofs that means not even expressvpn 00:18:15.600 |
itself knows which users are talking to which sites uh i know about this really nerdy stuff jesse i 00:18:21.200 |
actually know about zero knowledge proofs when i was at mit i think we were hearing about the early 00:18:26.320 |
work on this when i was taking like complexity theory at georgetown we have at least one person 00:18:30.720 |
on the faculty who works on this technology it's really cool stuff and very impressive that expressvpn 00:18:35.280 |
is starting to work with it in their product so anyways what are you waiting for secure your online data today 00:18:40.400 |
by visiting expressvpn.com slash deep that's expressvpn e-x-p-r-e-s-s-v-p-n dot com slash deep 00:18:51.120 |
to find out how you can get up to four extra months that's expressvpn.com 00:18:55.840 |
slash deep i also want to talk about our friends at better help because this podcast is sponsored 00:19:04.240 |
by better help october 10th not long ago october 10th was world mental health day this year i think 00:19:11.840 |
we should take a moment to say thank you therapists there are a lot of elements to cultivating a deep 00:19:17.360 |
life one of the most important is having a healthy relationship with your own mind 00:19:22.000 |
if you're struggling with relentless anxiety or draining rumination you don't have to put up with 00:19:26.880 |
that and a therapist can help you repair that relationship with yourself to help you get more 00:19:32.960 |
depth out of your life trust me i've had my own ups and downs with my own mind uh it's always worth 00:19:39.040 |
getting help uh when you're not happy with the way your mind is working if you're worried about the 00:19:44.960 |
complexity of finding a therapist then i recommend checking out better help better help has been around 00:19:48.960 |
for over 12 years and has helped over 5 million people worldwide on their mental health journeys 00:19:53.920 |
that's millions of stories millions of journeys behind everyone is a therapist who showed up 00:19:58.480 |
listened and helped someone take a step forward here are a few things to keep in mind about better 00:20:03.920 |
help one their therapists work according to a strict code of conduct and they are fully licensed in the 00:20:08.960 |
u.s two better help does the initial matching for you so you can focus on your therapy goals a short 00:20:17.520 |
questionnaire helps identify your needs and preferences and their 10 plus years of experience and industry 00:20:22.560 |
leading match fulfillment rate means they typically get it right the first time if you aren't happy with 00:20:28.000 |
your match switch to a different therapist at any time using their tailored recommendations so this 00:20:35.040 |
world mental health day we're celebrating the therapists who've helped millions of people take 00:20:38.960 |
a step forward if you're ready to find the right therapist for you better help can help you start 00:20:44.080 |
that journey our listeners get 10% off their first month at betterhelp.com slash deep questions that's better 00:20:51.520 |
h-e-l-p dot com slash deep questions all right jesse let's get back to our show 00:20:59.040 |
all right so as we covered in the first part of this deep dive the new sora app revealed the 00:21:02.960 |
dangers of an online business model that's based purely on offering unanchored distraction there will 00:21:08.880 |
always be new technology boasting an even more direct way to tap into its users' brain stem so it's a very 00:21:15.120 |
sort of unsettled place to be trying to build a business so here's the question i want to answer next 00:21:21.280 |
what can existing social media giants do to protect themselves from this ai future i'm going to go through 00:21:27.680 |
this platform by platform and give you my game plan for how you would ai proof yourself against this 00:21:33.680 |
particular incursion i'll say jesse as an aside it is a little bit weird that me mr quit social media 00:21:39.920 |
is trying to help these companies out but um honestly i i don't think uh i don't think there can be like 00:21:46.240 |
kind of a worse app than what we see with sora so to me this is kind of like an enemy of my enemy is 00:21:52.080 |
my friend type of situation i mean i guess i wouldn't be that unhappy if all these services 00:21:55.760 |
went out of business but you know i would like them to be better than sora so i'm going to do this 00:22:02.000 |
i'm going over the dark side and i'm going to give advice to the major social media companies all right 00:22:05.840 |
let's start with twitter here's my advice for twitter to fight off the ai threat one forget trying to do 00:22:12.080 |
that video autoplay tick tock type thing that you've been doing where you know someone clicks on a video 00:22:17.360 |
that they really want to see and then you try to get them into an algorithmically curated feed that 00:22:21.840 |
that is the exact type of thing that makes you vulnerable to sora instead you have to work i would 00:22:27.040 |
say on the competitive advantage that always made you strong be a place where interesting people go to 00:22:32.080 |
say interesting things and that you have this like really good distributed curation model of capturing 00:22:38.160 |
like what is interesting to the zeitgeist on any given day to do this i would say twitter probably 00:22:43.840 |
has to find a better middle ground on content moderation right before elon musk took over 00:22:49.600 |
the service was becoming like a lot of institutions beholden to the far left and this was having 00:22:54.320 |
a sort of stultifying effect and it was really also sort of like annoying and kind of 00:22:58.720 |
censorious after musk it has gone too far in the other direction so he kind of went too far the other way 00:23:03.680 |
uh you look you scroll twitter for five minutes you get the jew hatred which to me is 00:23:10.560 |
always like the bellwether of like okay things aren't going great here we also have 00:23:16.000 |
a lot of arbitrariness from musk himself where he'll just sort of like someone or not like someone and 00:23:19.840 |
put his finger on the scale none of that is good if you're trying to create a town square that you want 00:23:24.400 |
a lot of people to go on to to see what's interesting in the day so find some sort of middle ground like 00:23:28.000 |
build a council like we have a council who's in charge of all content moderation policies and by 00:23:33.920 |
definition it's going to be two people left-leaning two people right-leaning and like two centrists and 00:23:39.440 |
they vote on all the policies and it's transparent and we apply them clearly and just be reasonable 00:23:44.560 |
people are okay with reasonable if you do that and then to keep focusing on feature improvements 00:23:49.760 |
making the experience better like get the friction out of here make it easier to do this or that 00:23:53.680 |
uh i think you could still do well in that model i think twitter can be a very sustainable three to 00:23:58.800 |
four billion dollar annual revenue business and still have an impact on the culture 00:24:02.240 |
yeah you're not gonna be meta that's pretty good for a private company and you're pretty ai proof 00:24:07.840 |
because this is all about like real people saying things that are interesting in the moment that are 00:24:12.080 |
being selected in a way that's distributed intelligence in a way that like an algorithm can't quite replicate 00:24:17.040 |
all right what can instagram do i think instagram needs to return to its bread and butter of 00:24:22.320 |
interesting influencers producing visually interesting content and allowing users to specify oh i like this 00:24:31.200 |
person i want to see what they're doing so retreating from their push towards algorithmically 00:24:36.720 |
curated sort of reel after reel of just like we'll just show you stuff that might be interesting what 00:24:43.200 |
the instagram algorithm should refocus on doing is just suggesting new people that you might want to 00:24:49.200 |
follow or look at hey you like these type of authors you might like this one and you go check them out 00:24:52.960 |
and like yeah i like what they're doing and you add them to the stable of people that you're listening 00:24:56.880 |
to but again instagram should be a place where they're sort of like experts doing stuff you find 00:25:01.120 |
interesting producing content that's visually interesting and you have a good feed of that from 00:25:04.640 |
people that you selected that again is going to differentiate you from a sort of the the slop style 00:25:11.120 |
world that you'll see something like on something on sora because it's not just random engagement 00:25:15.920 |
it's engaging content on specific things you find interesting it builds on like human expertise and 00:25:22.640 |
human connections ai proofs you what about facebook you got to go back to the friend graph that's my 00:25:28.240 |
argument there that facebook should be a place where you go where you want to see what people that 00:25:32.480 |
you know or organizations that you know about and specifically chose to follow what are they up to and 00:25:37.760 |
maybe what it is that they're consuming that like one degree of indirection content sharing is 00:25:42.800 |
fine here is articles that my friends are reading or suggesting right now that's fine or i follow a 00:25:50.400 |
magazine and so in my sort of news feed will be various things that they publish that sort of 00:25:56.080 |
in between era facebook where it was still very friend and follower graph friend graph focused 00:26:02.640 |
but you had a news feed that pulled from the different people you knew but not going so far that like we're 00:26:09.520 |
just going to make this feed all the way algorithmic and just try to like keep you engaged that's a 00:26:13.680 |
really good competitive advantage the other thing you should lean into if you're facebook is attention 00:26:18.560 |
to the users this was facebook's original business model i wrote about this in my book deep work we 00:26:24.720 |
forget this now in our current age of engagement and tick tock but the original pitch for facebook 00:26:30.320 |
was not just your friends are here but that they will pay attention to what you're doing before facebook 00:26:36.160 |
came out we had had this web 2.0 revolution where we had things like blogs that made it easier to 00:26:41.520 |
publish content online you didn't have to directly edit html files you could like type into a text box 00:26:46.080 |
and click publish we take that for granted but it was a big deal in the earlier 2000s the problem 00:26:51.600 |
of the early web 2 era is that when people tried publishing their own things and starting their own like 00:26:58.000 |
blogs on blogger.com no one came because the vast vast majority of what people have to say or write 00:27:05.440 |
is really pretty boring and that was really painful and then facebook came along and said hey here's a 00:27:11.280 |
place where you can publish stuff and people pay attention because we have this social compact 00:27:14.720 |
if you friend these people on facebook you will like or comment on the stuff they do and they'll like 00:27:20.960 |
and comment on the stuff you do that is a really powerful feeling that feeling if i put something out 00:27:25.680 |
there and some people reacted to it professional writers like myself sort of take this for granted 00:27:30.720 |
but it was a really powerful pull that made people really like facebook especially when they moved to 00:27:36.720 |
mobile and put the like button on there that really got their engagement numbers up because people loved 00:27:41.440 |
it people love seeing other people paying attention to stuff they do go back to that model facebook 00:27:49.440 |
that's a good model the people you know and the specific organizations you like are here 00:27:54.560 |
you can see what they're up to and what they're publishing and you can get some attention for the 00:27:59.440 |
stuff you put up there put up an article you find then a few other people you know will comment on it 00:28:02.960 |
that is really powerful stuff and it's stuff that you can't replicate with ai the sora tiktok experience 00:28:09.280 |
is still a very different thing that's just pure engagement this is something that's going to be hard to 00:28:12.640 |
do if i don't have all the users on there and all the friend graphs so that's what i would do if i was facebook 00:28:16.240 |
again yes none of these are strategies that maximize user engagement minutes but they're 00:28:22.240 |
also strategies that can keep you pretty profitable while hoping that you avoid going completely out of 00:28:26.320 |
business if you chase after the pure engagement model what about tiktok themselves are they completely 00:28:31.840 |
in trouble here i mean their whole model is built on just a stream of engagement and no profundity to 00:28:39.120 |
it no it's just whatever you look at longer gets on the screen they really really are in danger 00:28:46.000 |
from an app like sora because sora can create stuff that tiktok creators themselves cannot so can tiktok 00:28:51.040 |
survive the uh age of sora well i think they could survive a little bit longer potentially 00:28:59.040 |
and here's why i'm not hearing this talked about as much yet either i think it's an important point 00:29:03.680 |
it is expensive to produce sora videos using the sora 2 video generation model it is computationally very 00:29:13.840 |
expensive you have expensive gpu chips that are fired up for a long amount of time it's a huge amount of 00:29:18.800 |
electricity and a huge amount of usage to produce that 30 seconds of bob ross you know wrestling with 00:29:26.880 |
abraham lincoln so the only way for this to be economically feasible of course is that open ai has 00:29:32.800 |
to charge for the creation of these videos so right now you can download the sora app for free but if 00:29:39.840 |
you want to create a video you have to have at the very least a chat gpt plus membership this is at least 00:29:44.160 |
my understanding which is $20 a month if you have that $20 a month membership you can produce up to 50 videos a 00:29:50.880 |
month but they have to be at 480p resolution which is low and it's actually gonna be kind of hard to 00:29:55.040 |
compete with existing videos that are out there and 50 might sound like a lot but you also have to keep 00:30:00.160 |
in mind the way these videos are created is you just type in some words and hit generate 00:30:04.560 |
you can imagine it might take a lot of iterations on your descriptions to get just one video that you like 00:30:10.880 |
so if you really want to be producing a reasonable number of sora videos that you can iterate on at a 00:30:17.600 |
reasonable resolution that could actually look good on a lot of these like high resolution devices 00:30:22.000 |
you have to have the chat gpt pro subscription which is $200 a month that's a really high price point 00:30:28.640 |
when you compare it to tick tock's price point which is free for all users because the cost of operation 00:30:34.480 |
here is so much less the actual computation that goes into like capturing and rendering and editing the 00:30:41.760 |
videos is distributed to the phones that the users themselves own and all that tick tock itself has to do 00:30:48.000 |
is uh take these highly compressed videos from the app and store those i mean it's not 00:30:53.440 |
computationally free a company like twitter is still going to be more efficient to run because 00:30:58.080 |
you're dealing with text and not video but still it is so much cheaper if you don't have to generate 00:31:04.080 |
video from scratch it is so much cheaper right i mean like we know this jesse like anyone can watch a 00:31:10.000 |
youtube video on their computer but we had to get a beast of a computer to do video editing because anytime 00:31:15.200 |
you're doing anything with like editing or creating videos you need a lot of computational power 00:31:19.040 |
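Putting the figures from a moment ago together, here is a back-of-the-envelope sketch (the prices and quota are as stated in this episode and may have changed; the iteration count is a pure assumption):

    # chatgpt plus as described above: $20/month, up to 50 videos at 480p
    plus_price, plus_quota = 20.0, 50
    # assumption: it takes ~5 prompt iterations to get one video you actually keep
    iterations_per_keeper = 5

    keepers_per_month = plus_quota / iterations_per_keeper
    print(f"plus tier: ${plus_price / keepers_per_month:.2f} per usable 480p video")
    # -> $2.00 per usable low-res video, before you even touch the $200/month
    # tier, versus ~$0 marginal cost for a tiktok clip shot on the user's own phone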
so tick tock might be able to kind of wait out sora i'm assuming open ai is losing money on this 00:31:24.880 |
their hope is they can eat money and grow it big enough that when they sell ads 00:31:30.640 |
they'll be able to somehow kind of like make it back and make this economically viable if they have 00:31:34.080 |
enough people i mean maybe we'll get into this more in the final segment but if i'm tick tock i'm like 00:31:38.800 |
you know what people don't want to pay $200 a month just to have a shot to produce these videos so you're 00:31:43.680 |
going to have many fewer people producing these videos and what makes slop sort of work like what 00:31:48.880 |
makes tick tock work is it's so easy to create videos that they can get a huge corpus of potential 00:31:55.440 |
videos to show each person and it just gives them more opportunity to get it right when you're doing 00:32:01.920 |
recommendation algorithms it's kind of like a core idea you need inventory the more things you have 00:32:07.600 |
that you can potentially suggest to someone the closer you can come to maximizing the expected reward for the 00:32:13.040 |
individual users and the better the experience becomes so that's going to be the achilles 00:32:18.080 |
heel of sora potentially is that it's so expensive there's just not enough inventory for it to have the 00:32:22.400 |
same sort of like wow you're really matching my interests tightly that like tick tock is able to get 00:32:27.360 |
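Here is a minimal simulation of that inventory point (illustrative, not any platform's actual system): if every candidate video has some match quality for a given user, the feed can only be as good as the best item on hand, and the expected best climbs with corpus size.

    import random

    random.seed(0)

    def expected_best_match(inventory_size, trials=1000):
        # average quality of the best item among `inventory_size` random candidates
        total = 0.0
        for _ in range(trials):
            total += max(random.random() for _ in range(inventory_size))
        return total / trials

    for n in (10, 100, 10_000):
        print(f"inventory {n:>6}: expected best match ~{expected_best_match(n):.4f}")
    # -> roughly 0.91, 0.99, 0.9999: a bigger, cheaper-to-fill corpus lets the
    # recommender sit much closer to a perfect match for each individual user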
and so what's going to happen is the relatively small number of people creating the sora videos 00:32:31.760 |
are just going to jailbreak them out of the app and upload them on tick tock anyways where they can 00:32:35.760 |
just be part of the big inventory that tick tock is using so tick tock might survive sora because i 00:32:39.360 |
don't think the economic model necessarily makes sense but they're still in trouble because their model 00:32:45.440 |
it's like a horde of barbarians coming at their roman gates they might fight this one off 00:32:50.480 |
but there's going to be wave after wave after wave when your model is just pure engagement 00:32:56.560 |
there's going to be so many other things that come along that offer that engagement better maybe you 00:33:01.360 |
fight off this one but like at some point there's going to be another app where it's going to be 00:33:05.520 |
like sports betting with topless women that like sends you random you know jackpot uh slot machine 00:33:14.560 |
rewards or something while misting you with fentanyl mist i mean you're never going to win this game 00:33:19.680 |
forever the engagement game there always will be new things that figure out how to be more engaging if all 00:33:24.480 |
you're trying to do is just win this game of just like slack jaw how much drool can you get out of 00:33:28.880 |
your mouth as you're staring at the screen all right so tick tock you can survive a lot longer but 00:33:33.280 |
the other social media companies go back to your old social media model i don't love that model by 00:33:37.520 |
the way i don't think the internet should be consolidated into four companies i have huge 00:33:40.640 |
issues with those companies operating the way they used to operate but i would rather have them built on 00:33:46.640 |
their carefully constructed user bases and social graphs each of them focused on their own unique sort of 00:33:51.760 |
competitive value that makes them somewhat unassailable i would rather have a world with 00:33:55.520 |
them than a world where it's just like one sora after another as a pretender to the distraction 00:34:01.920 |
throne of who can make your eyes glaze over uh faster by the way i just uh saw the announcement jesse 00:34:08.560 |
tick tock has just um upped their rev share for their creators so they see this threat coming 00:34:13.760 |
and they're trying they're like yeah they can just make it cost $200 to use sora we're gonna not only is it free 00:34:20.240 |
we're paying you more than we used to pay you before to create content here so they're really 00:34:23.840 |
uh they're putting up a defense and i think they actually have a good shot at uh surviving this 00:34:28.000 |
particular wave all right so uh let's move on to some takeaways 00:34:46.960 |
all right so as i mentioned i'd be happy if all the social media platforms maintain their tick tock 00:34:53.520 |
clone strategy and they all falter under the pressure of the apps like sora or whatever brain stem slop 00:35:00.800 |
follows in its wake and then maybe as they all sort of collapse under the impossibility of being a dominant 00:35:07.440 |
force doing such a simple thing that people will finally be convinced to move on from this type of content 00:35:14.080 |
and go to more rewarding indie media style content like podcasts like newsletters like other high 00:35:18.480 |
quality independent media but it's not inevitable that this has to happen and if the social media 00:35:23.600 |
giants are smart they can avoid it the properties that made social media so powerful for so long are 00:35:29.120 |
still there for the taking they still have these social graphs that no one ever again will replicate 00:35:33.920 |
they still have these fantastically large and varied and interesting user groups it'll be hard for any 00:35:39.680 |
other new platform to ever replicate so if the social media giants retrench around their social 00:35:44.080 |
graphs and curated user bases they can survive they can have their advantages again and i guess if the 00:35:50.560 |
alternative is watching videos of abraham lincoln wrestling bob ross then i'd rather have a world of 00:35:55.600 |
instagram influencers and twitter takes last just a little bit longer all right so there we go jesse 00:36:03.040 |
sora have you did you hear about this it's like kind of everywhere but no yeah i mean i'm actually i 00:36:08.400 |
have a lot of questions but i'm only gonna probably ask you a couple just to save some time any idea 00:36:13.200 |
what the user base is um i wasn't able to find that yet uh it's all kind of new because it's being 00:36:20.960 |
rolled out in sort of a way that i find a little bit confusing because sora 2 is the 00:36:25.120 |
model sora is the app that you can use to access sora 2. there might be other ways to access sora 2. 00:36:31.680 |
i think a lot of people are downloading it to look at the slop so it's probably pretty big i don't know 00:36:36.080 |
how many people are generating though and then do you think a lot of people know that facebook and 00:36:40.320 |
instagram are under the same umbrella company oh i don't know what people know yeah it's a good question 00:36:45.200 |
i mean facebook bought instagram in part because as i'm talking about it is hard to build 00:36:51.440 |
user bases and social graphs from scratch instagram uh had built up this like user base and more 00:36:58.160 |
influencer expert types who are doing visually interesting content and facebook's like i don't 00:37:01.840 |
know if we can get them all to come over here they're already there it's already existing the other thing 00:37:06.000 |
that instagram did this might have been more important is they were way more native mobile because 00:37:11.600 |
you have to use the camera to use instagram it was conceived as a mobile app from scratch because 00:37:15.520 |
you need a smartphone camera for it to make sense and so their mobile app was just better than 00:37:19.760 |
what facebook was doing like oh great we'll just hire them uh acqui-hire them they spent a billion 00:37:26.000 |
dollars there's like 12 people who worked for this company it was crazy it was like the the most money 00:37:29.680 |
ever spent for so few people like we'll just get that app and then we don't have to try to build it 00:37:34.080 |
ourselves but yeah i don't know if most people know in part because as i just went through each of 00:37:38.480 |
these major platforms has such a unique identity like they're not that's why they all survived like i don't 00:37:44.640 |
go on facebook to hear from authors i would go on instagram to hear from authors but i don't go 00:37:49.600 |
on instagram really i mean i don't use any of these i'm using the sort of royal you here right you 00:37:54.000 |
wouldn't go on instagram necessarily anymore to know what your like nephews are up to though because like 00:37:58.960 |
you'll be uh group texting pictures or whatever right that's not good for that and like twitter if you 00:38:04.160 |
want like a political take or controversy or what's going on like reactions to something that's 00:38:08.560 |
happening in the news that's what it's doing you wouldn't go on instagram for that and 00:38:12.240 |
so they've all kind of uh they've differentiated into their advantages that was a pretty stable setup 00:38:17.600 |
until tick tock came along and they felt like they had to follow it so they could keep getting growth but 00:38:23.680 |
i really think that was a problem yeah um i'm sure like the people that have the sora accounts for $200 00:38:30.400 |
a month and post to instagram will become like beast users before anybody like realizes that you know 00:38:36.480 |
i i got to imagine that's going to happen so because a lot of people aren't going to catch 00:38:39.680 |
on to sora 2 for at least a year right at least the majority of the people yeah and sora puts on 00:38:44.080 |
a watermark right because they're worried about uh deep fakes because you can just put like anyone in 00:38:49.760 |
here but people have already figured out how to strip it off so if you can strip off the the mark you can 00:38:55.120 |
strip off probably whatever protections you need to put it into i mean of course you can right it doesn't 00:39:00.160 |
matter how many protections you put on your video generation because at the very least i can just 00:39:06.320 |
on my computer be playing my sora account and i can just uh screen capture it into its own video 00:39:13.120 |
right there's nothing you can do to prevent me from from if i want to capture that information the other 00:39:18.480 |
thing open ai did which i think is well desperate to be honest when they turned on this app and that 00:39:26.080 |
model they put very few protections on it so you could generate any ip you could work with any ip that was 00:39:34.560 |
out there right so there's all of these videos there's a lot of like um dash cam footage of like 00:39:40.800 |
police officers pulling people over and then peeling away and it's all ip it's mario it's spongebob square 00:39:47.120 |
pants and like uh it's in a drug arrest right um it's like whatever character they uh they didn't put any ip 00:39:54.960 |
protection um also any historical figures there's like a lot of hitler stuff on there there's a lot of like 00:40:00.320 |
queen elizabeth stuff on there just existing people who are alive now uh just full deep fake videos 00:40:07.920 |
there's there's um videos with with mark cuban uh on shark tank promoting this like made up like really 00:40:15.360 |
terrible products or whatever just him looking straight to the camera and they know they're under pressure 00:40:19.760 |
and everyone's like yeah you can't use our you think like nintendo and disney are okay with you making 00:40:24.960 |
like fake drug bust videos with their ip of course not right so now open ai is like oh we are uh turning 00:40:29.840 |
on like whatever whatever we'll turn on protections or whatever but their idea was you know i think 00:40:35.840 |
it's so cynical let's let people make like really offensive ip violating slop at least for a few weeks 00:40:42.000 |
to try to get a lot of attention and then we'll be like oh my goodness i never thought that people would 00:40:47.040 |
think to put protected ip in their videos we're going to try to turn it off now so it's just like 00:40:54.080 |
really cynical thing they were doing but i think they're going to get their um they're going 00:40:58.000 |
to get sued left and right yeah i mean of course they knew people were going to do that um so there you 00:41:03.040 |
go well they have a valuation of over $500 billion so they might be able to afford some lawyers i guess 00:41:10.000 |
valuation is not money i know and good for them they have that valuation as we'll see we'll see 00:41:17.360 |
what happens more on this a little bit later all right so we have a lot more to discuss about ai and 00:41:21.760 |
social media in the following including some questions from you my listeners we got a call 00:41:26.800 |
and a third segment where we're going to return to a recent tweet from sam altman which i think is bad 00:41:31.120 |
news for open ai but we'll get into it it's not the most uh heartening thing to hear from your 00:41:35.200 |
loyal founder um first let's do a little bit of housekeeping before we move on to the rest of the 00:41:40.160 |
show uh jesse i got to give you a halloween update let's hear it well here's the thing i don't want to 00:41:47.440 |
dox myself so i can't i don't want to like fully explain my halloween setup but i just want to briefly 00:41:52.960 |
explain because i think it's important more important than the future of social media briefly explain the 00:41:57.440 |
thing i built because it was really a pain to build so i wanted this sounds easy but it's hard i wanted 00:42:03.280 |
synchronized lights and sound so i was doing like effects using programmable leds like carefully 00:42:10.000 |
timed effects that i wanted sound to be synchronized to that is really hard to do in especially in like 00:42:16.800 |
a waterproof container and i i got it installed and so far it's worked and it's survived for a week 00:42:22.640 |
without you know fritzing out or something like that so that i'm happy about i had to use a micro 00:42:26.800 |
controller that allows me to touch each individual bulb in a programmable led strand and i could write 00:42:32.400 |
custom code for any of my animations writing it from scratch in c so i made all the 00:42:36.800 |
animations and then i had to wire that up solder that up to another circuit board that has a sound 00:42:41.920 |
controller on it and now i can sort of trigger sounds from the same microcontroller and it's all 00:42:47.680 |
just low level timing loops i wrote from scratch so i can exactly trigger exactly when sounds happen 00:42:52.320 |
exactly when you know lights are happening and i could hook up that sound board to my microcontroller 00:42:56.880 |
by just putting low voltage on different pins at certain like times or whatever then you have to have 00:43:01.520 |
like a whole power supply situation like all these things need different voltages you need a lot of 00:43:07.520 |
voltage to run 200 programmable led lights right so you need a big voltage supply but then i need voltage 00:43:14.240 |
to run the speakers because i got to amplify the sound and i need voltage to run uh the microcontroller i 00:43:20.800 |
need voltage to run the sound chip so i actually just wired the voltage from the microcontroller 00:43:26.400 |
to the sound chip and then everything has to have like properly shared ground because 00:43:30.160 |
we have to synchronize communication signals over communication buses and then you got to have this 00:43:34.400 |
full thing somehow fit into some sort of waterproof container in which lots of wires can come in and out 00:43:38.320 |
and i got that all to work at least temporarily the funny part is imagine if you got it all set up 00:43:43.120 |
it's working great and you walk back to your neighbors and it's like he's got something 10 times better 00:43:46.960 |
i mean it's not gonna happen but an animatronic yeah walking across the like whatever 00:43:55.120 |
i would i would get there i would get there so that's what's going on halloween update 00:43:59.440 |
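For the curious, this is the rough shape of that synchronized timing loop (his build is hand-written c on a microcontroller; this sketch is the same pattern in micropython, and the pin numbers, strand length, and cue list are all made up):

    import time
    from machine import Pin
    import neopixel

    NUM_LEDS = 200
    strand = neopixel.NeoPixel(Pin(4), NUM_LEDS)  # individually addressable bulbs
    sound_trigger = Pin(5, Pin.OUT)               # pulsing this pin fires the sound board

    def fill(color):
        # touch each individual bulb, then push the whole frame out at once
        for i in range(NUM_LEDS):
            strand[i] = color
        strand.write()

    # each cue: (offset in milliseconds from the start of the show, action to run)
    cues = [
        (0,    lambda: fill((0, 0, 0))),
        (500,  lambda: fill((255, 40, 0))),      # lights flare...
        (500,  lambda: sound_trigger.value(1)),  # ...and the sound fires on the same tick
        (550,  lambda: sound_trigger.value(0)),
        (2000, lambda: fill((0, 0, 0))),
    ]

    start = time.ticks_ms()
    for offset, action in cues:
        # low-level timing loop: spin until this cue's exact moment, then run it
        while time.ticks_diff(time.ticks_ms(), start) < offset:
            pass
        action()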
uh i'm gonna be at the new yorker festival oh when is that that'll be the weekend 00:44:04.000 |
after this comes out it's all sold out i think but that'll be fun but if fans are there they can say 00:44:07.520 |
hi to you they can say hi if they're there i'm doing a panel on saturday with um charles 00:44:13.920 |
duhigg and anna wiener and it's on like ai and the future and stuff like that yeah we're 00:44:19.600 |
competing with the other event happening at the same time which is zadie smith being interviewed by george saunders 00:44:23.920 |
so we're not the a event but i think it'll still be i think it'll still be interesting so you know 00:44:29.600 |
new yorker festival is fun it's always good to like visit the city and i'm gonna you know have 00:44:33.200 |
dinner with my publishing team and meet my you know hang out with my speaking agent and hopefully 00:44:37.840 |
you can get some good gear they sell like coffee cups and stuff can you get them for free um yes i guess 00:44:43.360 |
that's true that's where your mind goes i'm thinking about like the celebrity writers i can meet 00:44:47.760 |
just like get a coffee cup you have coffee cup a sweatshirt a new yorker hat it's spread out over 00:44:53.920 |
like lots of venues in the city i think but anyway so that's fun uh some other exciting stuff coming up 00:44:58.400 |
but i guess i can't really talk about it yet because some of this stuff is secret but anyways i think 00:45:02.800 |
that's all the housekeeping what else do we have going on anything else jesse people need to know 00:45:05.440 |
sign up for the newsletter does brad have a book available oh yeah yeah a friend of the show brad 00:45:11.360 |
stulberg his new book the way of excellence which is excellent i blurbed it uh it's available for 00:45:16.160 |
pre-order so you can just google way of excellence um if you go to his newsletter go to the link to his 00:45:23.840 |
newsletter there's like bonuses if you want to pre-order it anyways he got a killer blurb by the way 00:45:28.080 |
by you or somebody else better than me i mean obviously i'm the killerest of blurbs the blurb on the 00:45:33.280 |
cover of the uh book is steve kerr oh really yeah are they buddies uh they know each other 00:45:39.440 |
but he just likes the book yeah he says it's an absolutely beautiful book so people who 00:45:43.440 |
don't know steve kerr is the coach of the golden state warriors and he himself played with jordan 00:45:49.840 |
i watched him play with jordan back in the 90s yeah so you know jordan punched him in the face once in 00:45:53.600 |
practice remember that no oh you don't remember that it was like big news kerr is kind of a small 00:45:58.080 |
guy he's like our size isn't he he's probably six three but jordan's six six yeah so there 00:46:02.960 |
you go that's all the things that are happening all right let's move on to some questions 00:46:07.360 |
hi first question is from corey the videos produced by sora 2 look pretty good to me 00:46:15.040 |
if this technology continues to advance will this be the end of things like movies and television 00:46:19.280 |
shows as we currently know them this seems to be a concern that people are having so when i was doing 00:46:24.640 |
my research for this episode there's lots of different concerns about sora we talked about 00:46:29.440 |
one on today's episode which is will it kill social media but there's lots of other concerns disinformation 00:46:34.320 |
etc one of the bigger concerns was is this the death of creativity um this was actually i think the main 00:46:42.720 |
argument in casey neistat's video we played a clip from it back in the 00:46:47.600 |
introduction but his main point what he works to in this video is to argue that the sora 00:46:55.840 |
app running sora 2 is going to be basically the end of like the creative industry as we know it 00:47:00.960 |
i think jesse we got a clip right all right let's listen to that then i'll react to it 00:47:05.120 |
times a day if all you have to do today is type those words into an app and click a button and it gives 00:47:12.000 |
you a tick tock video in one then like youtube can't be far behind where you just type in the 00:47:16.640 |
youtube video you want to see you push the button and boom okay quit your job for 100 grand last 00:47:20.080 |
person off the island keeps it buy every seat on a plane giveaway post come on come on good next 00:47:23.840 |
one build the biggest vending machine that gives away cars and how many years away from netflix you 00:47:27.760 |
just type in like jason bourne moon landing and go generating so he's arguing this technology is going 00:47:36.640 |
to take down visual mediums one after another actually that part there in the middle 00:47:40.160 |
with mr beast was actually pretty funny he used sora to produce a video this is like the type of 00:47:44.960 |
thing you can do right now without the protections you can just make a video with someone in it who 00:47:48.160 |
has no idea you're doing it he made a video of mr beast very meta using sora at a computer and 00:47:53.840 |
just frantically typing in like descriptions of videos so that like mr beast you know instead 00:47:57.840 |
of making the videos is just like i'll do this i want to do that it's actually a pretty good use 00:48:00.560 |
of sora all right so he argues it's going to take things one by one by one until it's netflix movies 00:48:05.120 |
and like why would we do anything if we could just type in and get whatever we want 00:48:09.200 |
i don't really think that is an issue and the reason why i don't think that is an issue 00:48:13.040 |
is we have had over the generations all sorts of different quality levels of content and often 00:48:21.680 |
the lower quality stuff is very very engaging in the moment it could be like 00:48:26.880 |
oh i gotta really watch this and yet we have not up to this point really seen a lot of examples of the 00:48:33.920 |
existence of a lower quality high engagement stratum of content leading to the collapse of the upper stratum 00:48:40.720 |
right because like in the age of tick tock we still got chris nolan's oppenheimer 00:48:47.200 |
tick tock videos are like easy to make there's a huge number of them they're incredibly compelling when 00:48:53.280 |
you're looking at them they're hard to look away oppenheimer's on the other end of the spectrum this is a 00:48:58.080 |
movie that cost 250 million dollars to make and it took all these years and it's very artistic or 00:49:02.400 |
whatever people were happy for both of these to exist it made a billion dollars at the box office 00:49:06.800 |
people wanted to see that stratum of quality and they'll also separately look at the 00:49:12.000 |
other stratum of quality that is something like uh a tick tock video this is why for example you can have 00:49:19.120 |
during the like early 2000s on your tv the same device you have the real housewives which is 00:49:26.720 |
like super engaging to watch and there's a lot of different series of them you can just turn it right 00:49:30.960 |
on and it doesn't require much you're just in it and it's like primal and people are fighting and 00:49:36.080 |
throwing drinks in each other's faces at the same time you have the sopranos on hbo 00:49:39.200 |
both stratums of quality exist and i would say the lower one is more 00:49:45.600 |
engaging right like it's much easier to jump into a real housewives episode than it is to get into 00:49:50.640 |
like a difficult sopranos episode it's much easier to watch a tick tock video than it is to try to like 00:49:54.800 |
understand what like the time bending is that's going on in like dunkirk or something like that 00:49:59.120 |
they both exist i mean we often tell ourselves that the introduction of a lower quality 00:50:06.000 |
higher engagement stratum of content is going to destabilize the foundations and the things on top 00:50:10.080 |
fall but we really have not seen that historically and so you know i would just extrapolate from what 00:50:15.920 |
we've seen so far so yeah we can introduce you know uh this new type of super engaging content 00:50:25.120 |
you know it has famous people in it well not really they're going to turn 00:50:29.040 |
that off because of ip stuff but it'll be visually very arresting in a way that tick tock is not because 00:50:34.480 |
you don't actually have to go to a place to make the things and it'll have its own sort of stratum there 00:50:39.200 |
but i don't think it's going to collapse to the levels above it the only way this would happen is 00:50:44.720 |
if neistat was arguing that like somehow eventually these models will produce things that 00:50:50.240 |
are indistinguishable from like uh top movies that's not exactly how that works it's like really really 00:50:56.240 |
hard to make a movie and like almost no ideas ever get made into a movie and it takes like auteur 00:51:00.560 |
directors to just pull all these pieces together in their vision maybe they'll use more ai in their 00:51:05.120 |
effects or something like that but it's not going to be i type the description of a movie and 00:51:08.960 |
then it's made so i'm not super worried about that i just think slop is slop it is going to 00:51:08.960 |
take over adjacent slop ai videos are at the same stratum as tick tock 00:51:14.400 |
and i again it's going to be trouble once the economics get worked out i think that's trouble 00:51:23.360 |
for tick tock but i don't think it's trouble for like uh you know the next paul thomas anderson movie 00:51:28.480 |
but maybe i'm just being naive but it hasn't happened yet hasn't happened yet all right who do we have next 00:51:33.600 |
next up is dave i'm an it security engineer ai coding tools have allowed me to write scripts 00:51:40.000 |
that save me weeks of time would my time be better spent writing code from scratch or am i doing 00:51:45.600 |
the best thing by focusing on the problems and using all the tools that i have available to make solutions 00:51:50.240 |
quickly can i tell you what was programmed from scratch no ai tools your halloween 00:51:57.200 |
lights damn straight they were timed down to the millisecond level all built within a central timing loop 00:52:04.240 |
elegantly so at each iteration of the loop you're just in the iteration of a loop and you have 00:52:09.920 |
to keep track of what lights are on and not on and how do i need to move that and where is it i got 00:52:14.080 |
to randomize this thing and have an explosion have you just always known how to do that stuff 00:52:19.280 |
yeah i've known how to do that type of coding like so i'm not like a full stack engineer anymore 00:52:23.760 |
like i don't really know like the latest systems if you tell me to build like a commercial piece of 00:52:26.960 |
software i'd be like i've been out i'm a theorist i don't know but i've been coding since i was like 00:52:32.080 |
seven but like that type of thing give me a blank slate like here's a microcontroller like build 00:52:37.760 |
whatever that was like 20 minutes i sat in the office like let's just roll you know and i went for it 00:52:42.240 |
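a minimal sketch of that central timing loop pattern, in python just for illustration since the real version runs in a c/c++ variant on a microcontroller; the effect names and numbers here are hypothetical:

```python
# one loop owns the clock; every effect (the chase, the randomized explosion)
# is just state recomputed from elapsed time on each iteration
import random
import time

NUM_LIGHTS = 200
next_explosion = 0.0                    # when the next random burst should fire
start = time.monotonic()

while (now := time.monotonic() - start) < 10:   # run the show for 10 seconds
    # chase effect: one lit pixel sweeps the whole strip every 2 seconds
    pos = int(now / 2 * NUM_LIGHTS) % NUM_LIGHTS
    lights = [i == pos for i in range(NUM_LIGHTS)]
    # randomized explosion: light a burst of 20 pixels, then reschedule it
    if now >= next_explosion:
        for i in random.sample(range(NUM_LIGHTS), 20):
            lights[i] = True
        next_explosion = now + random.uniform(1.0, 5.0)
    # on real hardware this is where the frame would get pushed to the led strip
    print(f"t={now:6.3f}s {sum(lights)} lights on")
    time.sleep(0.001)                   # tick roughly once per millisecond
```

the key design choice is that nothing sleeps for an effect's whole duration; every effect is re-derived from the one shared clock each tick, which is what keeps everything synchronized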
i actually had a job for a while in college where uh there was a company in my town 00:52:48.960 |
and they built a product it was not that interesting but it was like a piece of um it was 00:52:54.000 |
for measuring the optical properties of film and it would like rotate things and shoot lasers through 00:53:00.320 |
prisms and make measurements or whatever and the whole thing was run from a computer and they had 00:53:04.640 |
like a special control card you would stick onto the motherboard of the computer that would allow 00:53:08.720 |
you to like precisely control this thing but then the technology for using those old control 00:53:13.200 |
boards was getting outdated and so they're like what we really need to do is like control this whole 00:53:17.120 |
thing by a microcontroller that's on the device itself and the computer should just like send it a 00:53:21.600 |
command like oh do a measurement and then like the microcontroller can like precisely you know control 00:53:26.000 |
everything and then just send the data back and my job was to do all that like here's the microcontroller 00:53:31.200 |
here's the thing and i programmed it from scratch right so like i got really good and that was in 00:53:35.040 |
assembler nowadays like my halloween thing i can program in like a c or c plus plus variant 00:53:40.480 |
you know but back then it was assembler like you were in it and you were counting 00:53:46.160 |
the number of clock cycles required per instruction you could count 00:53:52.400 |
those up to get timing like okay each clock cycle is this many microseconds and the add is going to 00:53:56.560 |
take three clock cycles but yeah that type of low level stuff i love and can still do 00:54:02.080 |
even though i don't build systems as a computer scientist anymore 00:54:07.680 |
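a back-of-the-envelope version of that cycle counting; the clock speed and per-instruction costs below are made up for illustration, except the three-cycle add he mentions:

```python
# hypothetical chip: pick a clock, count each instruction's cycles, and the
# loop timing falls out exactly -- no timer library needed
CLOCK_HZ = 4_000_000                          # assume a 4 mhz clock: 0.25 us per cycle
CYCLES = {"load": 2, "add": 3, "store": 2, "branch": 4}   # invented costs, add per the episode

loop_body = ["load", "add", "store", "branch"]            # one pass of a timing loop
total = sum(CYCLES[op] for op in loop_body)
print(f"{total} cycles per pass = {total * 1e6 / CLOCK_HZ:.2f} microseconds")  # 11 -> 2.75
```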
um anyways what i'm trying to say dave is you're not a real man if you're not programming assembler from scratch um okay here's 00:54:15.600 |
what i have to say about that actually more seriously about it if you're in an engineering position 00:54:20.240 |
and you're building a system where like there are some stakes it doesn't even have to be something 00:54:25.760 |
that other people are using i'm not talking about like you're building a product that you're 00:54:28.480 |
trying to release but like you're building a system that does things that's going to affect 00:54:32.480 |
other people it's helping you like update password files or review like what's going on or clean out 00:54:37.200 |
tickets or this or that you don't want to be purely vibe coding by purely vibe coding meaning like you're 00:54:43.600 |
just asking an ai model to produce code and your only way of measuring it is you just run it and see if 00:54:48.080 |
it works or not and if it doesn't you like tell it to fix some things you need to understand the code 00:54:54.080 |
that's running whatever these scripts are that might affect other people you can use ai tools 00:54:59.280 |
to help you produce that code or to help you debug that code or to give you like hey can you show me 00:55:04.480 |
how i would do this thing but you have to know enough about whatever language and system you're 00:55:08.720 |
using to be able to verify look at that code and see what it's doing to be able to audit it yourself for 00:55:14.080 |
a simple script and be like okay oh i see i didn't know that command ai helped me get that so you can't 00:55:19.760 |
be vibe coding if there's any stakes like this where it could affect like outcomes for other people 00:55:24.480 |
you got to know what's going on it's fine i'm happy if you want to use ai to speed up the process of 00:55:28.720 |
generating or checking or debugging code but you got to actually understand it at least that's my computer 00:55:34.800 |
science ethic you have to know enough about that programming language to know what's going on so ai 00:55:39.520 |
should help you speed up something you already know how to do in this context not do something for 00:55:45.840 |
you that you don't know how to do i think that's where we are especially in an engineering position like that 00:55:50.160 |
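a hypothetical example of the kind of small admin script dave might get out of an ai tool, with comments marking what you would audit before trusting it; the file name, fields, and retention rule are all invented:

```python
# ai-generated-style ticket cleanup script -- short enough to audit line by line
import datetime
import json

TICKETS_FILE = "tickets.json"          # hypothetical ticket queue export
MAX_AGE_DAYS = 90                      # audit point: is this the retention policy you want?

with open(TICKETS_FILE) as f:
    tickets = json.load(f)

cutoff = datetime.date.today() - datetime.timedelta(days=MAX_AGE_DAYS)
# audit point: the one line that decides what survives -- open tickets are kept
# unconditionally, closed ones only if they closed on or after the cutoff
kept = [t for t in tickets
        if t["status"] == "open"
        or datetime.date.fromisoformat(t["closed_on"]) >= cutoff]

with open(TICKETS_FILE, "w") as f:     # audit point: this overwrites the file in place
    json.dump(kept, f, indent=2)
print(f"kept {len(kept)} of {len(tickets)} tickets")
```

if you can read that and say exactly what it keeps and drops, you're using ai to speed up work you understand; if you can't, you're vibe coding with other people's tickets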
all right who do we have next next up is pastor alex i don't use social media what are your 00:55:56.960 |
thoughts on substack is this okay for me to use or are there similar negative consequences of what you've 00:56:02.000 |
recently discussed with social media you know what i do pastor alex is i program my own substack code 00:56:09.200 |
in assembler and it runs on a microcontroller i'm a man and then you post it at calnewport.com i don't 00:56:15.840 |
think this like i'm a man thing works when we're talking about programming languages on microcontrollers 00:56:20.160 |
i think i need actually something more manly for that tag to actually work um okay in general email 00:56:26.080 |
newsletters are something that i am uh bullish on because i think they're a great example of indie media 00:56:33.200 |
it's taking advantage of the internet for distribution to spread more ideas than would 00:56:38.400 |
otherwise be spreadable it's controlled by the person who writes it it's your content it's long 00:56:43.360 |
form it's idiosyncratic so i'm a big fan of newsletters substack look there's some issues with 00:56:49.280 |
it that catch my attention they want to be more like a social platform they want you to move out of your 00:56:54.400 |
inbox into their app they want it to be like all of these newsletters like on tick tock are somehow 00:57:01.040 |
just um nuggets of content and don't worry we'll help you find them we'll see what you 00:57:09.200 |
like reading and we'll find other things you'll just kind of have this like stream of things you 00:57:12.960 |
might be reading i think their model their aspiration is to be like 00:57:18.560 |
an algorithmically curated magazine i don't love that with email newsletters 00:57:23.760 |
i have a reason why i want to hear what this person is writing right because i'm a big believer 00:57:30.480 |
in distributed curation through webs of trust you don't have a machine just tell you hey here's 00:57:35.200 |
something you might find interesting you actually find your way to someone through real people this 00:57:39.440 |
person i already know and have a relationship with we have a link in the web of trust i know this person 00:57:43.200 |
i trust this person either because i know him personally or just through their reputation 00:57:46.240 |
they're now recommending this person this person is a good writer i can now add a link in 00:57:50.640 |
that web of trust to this other person because if this person trusts them and i trust them then 00:57:56.160 |
i trust this other person as well so maybe i'll check out what they're doing i'm willing to possibly 00:58:00.000 |
consume what they're doing and then maybe they're constantly talking about some other writer well that 00:58:05.280 |
puts another link in my web of trust that's now someone that through this chain of trust i can trust 00:58:11.360 |
like a legitimate good actor good faith it's interesting it's stuff worth listening to that's 00:58:16.080 |
the way we used to find writers the way we used to find books just the way we used to find like albums 00:58:20.320 |
and stuff like that and i think that's a fantastic mode of curation the internet is good for distribution 00:58:25.520 |
in this model i can now see what this person has to say without having to like send away for something 00:58:30.960 |
or go find a book in a store that's fantastic but i don't necessarily need the internet to help me find the 00:58:34.880 |
person distributed webs of trust are an incredibly effective tool in my opinion for curation they handle 00:58:43.680 |
like so many issues that we actually have about the internet today about you know actors that are acting 00:58:48.320 |
in bad faith or they're actually trying to deceive you or they're like undermining values in ways or 00:58:53.600 |
they're like actually agents of like another power this or that a lot of that goes away 00:58:58.880 |
when you use webs of trust because when you're building up these chains of trust it's very hard to 00:59:02.480 |
build your linkage over to nonsense and it's very different for example than when you have a fully 00:59:07.280 |
homogenized algorithmically curated type of setup like tiktok every tiktok video is the same everyone's 00:59:11.840 |
treated the same i don't know where this video is coming from is this really just like someone over 00:59:17.200 |
in you know like arkansas who has an opinion on this world event or is this like a bot that's coming out 00:59:21.360 |
of like you know uh pakistan i don't know or like twitter where everything looks the exact same and stuff just 00:59:27.440 |
kind of spreads to me that i see or don't see i like a world where i have to go from person to 00:59:32.240 |
person making a real chain and connection each way to get to someone else because i'm not going to get the nonsense that way 00:59:35.600 |
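a toy sketch of that web of trust idea: treat each endorsement as a directed link, and the writers worth reading are exactly the ones reachable through a chain of real people; the names are made up:

```python
# breadth-first walk over endorsement links: everyone reached has a
# human-vouched chain back to you, unlike an algorithmically curated feed
from collections import deque

endorses = {
    "me":       ["writer_a", "writer_b"],   # people i know or trust directly
    "writer_a": ["writer_c"],               # writer_a keeps recommending writer_c
    "writer_c": ["writer_d"],
    "spam_bot": ["spam_bot_2"],             # no chain from me ever reaches these
}

def trusted_from(start):
    seen, queue = {start}, deque([start])
    while queue:
        for peer in endorses.get(queue.popleft(), []):
            if peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen - {start}

print(trusted_from("me"))   # {'writer_a', 'writer_b', 'writer_c', 'writer_d'}
```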
so i don't like that substack is trying to move away 00:59:40.480 |
from that i guess it's an engagement model uh there's just a constant stream of people you might 00:59:45.360 |
want to hear from that being said put that aside it is a boon for writers substack is a boon 00:59:52.560 |
for writers because they take out a lot of the complexity and expense of taking advantage of that 00:59:57.120 |
distribution model and i don't want to undercut that i think that is actually like a really really big 01:00:01.680 |
contribution to the world of ideas i do want to give substack credit for that i'm fully 01:00:07.840 |
independent with my newsletter that's not easy it's also expensive i don't know what we 01:00:14.240 |
pay jesse but i think it's like you know we have a hundred something thousand subscribers it's thirty 01:00:19.920 |
thousand dollars a year easy just to host that newsletter right so these are non-trivial 01:00:25.520 |
expenses now i make money because i also have a podcast that is synergistic with the newsletter 01:00:31.760 |
and we work with a really good ad agency and we've been at this for a while and then we have an income 01:00:36.240 |
stream through ads that subsidize the other things that's a really hard thing we do um and if i was just 01:00:42.320 |
starting from scratch with my newsletter getting popular like i don't have that type of money right 01:00:45.680 |
so i think that's a really good thing substack does it's also complicated the software is complicated it's not 01:00:49.040 |
too hard we use a really good provider but you know you got to learn how this works uh and substack 01:00:55.200 |
makes it easy substack also has recommendations like they'll recommend as i talked about 01:01:00.400 |
this thing that i'm a little bit wary about there's like algorithmic recommendations it does help people's 01:01:04.960 |
lists grow though i'm kind of a believer in just like old-fashioned list growth it takes a lot longer 01:01:09.280 |
but the people you gain are like bigger on your list um so i won't cast shade on substack 01:01:14.640 |
because it allows writers to just focus on writing and not this like expense and complexity i mean like 01:01:20.240 |
we have someone who basically just like handles the newsletter from a technological perspective you know 01:01:25.200 |
substack writers don't need to worry about that so um so yeah use it that's fine i think it's fine 01:01:31.440 |
to use substack just be wary when you consume of like 01:01:35.120 |
the algorithmic curation and in general use manual chains of trust to find newsletters but i'm fine 01:01:41.440 |
with substack um i like that they make it cheap for people to actually do what they do i don't think 01:01:46.800 |
there's a podcast equivalent of that or is there like if i wanted to host i guess there must 01:01:52.800 |
be like let's say i wanted to host a podcast and i didn't want to just like pay money for it 01:01:56.880 |
i don't know if that exists there probably is somewhere but for some reason podcasting is way cheaper 01:02:03.360 |
which never made sense to me hosting a podcast is way cheaper than an email newsletter like we don't pay 01:02:11.200 |
that much money to host our podcast i don't know if people know how podcast technology works but like 01:02:15.520 |
basically uh when you have a podcast you have a server somewhere you're paying just to put the 01:02:21.760 |
sound files the mp3 files just get uploaded to a server somewhere right and then you also maintain 01:02:27.760 |
an rss feed which is like a machine readable you know xml file it's 01:02:33.040 |
like a big text file and every time you post a new episode you add all the information to this file 01:02:37.760 |
and it has like the title the description and a link like this is where the file is to download 01:02:41.920 |
and all a podcast player is like when you subscribe to a podcast all you're saying is hey keep 01:02:47.520 |
checking on these rss feeds a few times a day it loads them up and reads the text file if there's a 01:02:51.680 |
new episode it puts the information in the player and it downloads the file from that server so the only 01:02:55.280 |
thing you really need from a hosting perspective if you're a podcaster is a place to store the files and 01:03:00.160 |
that's like really cheap because i guess storing files and bandwidth for people downloading files 01:03:04.960 |
is uh you know it's like a commodity it's pretty cheap even though we'll have hundreds of thousands 01:03:11.520 |
of these files downloaded it's somehow like way cheaper than having that many newsletter subscribers 01:03:16.320 |
so i don't know something about sending emails is expensive something about downloading podcasts is cheap 01:03:21.440 |
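a minimal sketch of what a podcast player does with one of those rss feeds, per the description above; the feed url is hypothetical, and this ignores the xml namespaces real feeds also carry:

```python
# fetch the feed (a machine readable xml text file), read each episode's
# title and enclosure url -- downloading is just fetching that mp3
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast/feed.xml"   # hypothetical feed address

with urllib.request.urlopen(FEED_URL) as resp:
    root = ET.fromstring(resp.read())

for item in root.iter("item"):              # each <item> is one episode
    title = item.findtext("title")
    enclosure = item.find("enclosure")      # <enclosure url="..."/> points at the mp3
    if enclosure is not None:
        print(title, "->", enclosure.get("url"))
        # a player that spots a new episode would now just download that file:
        # urllib.request.urlretrieve(enclosure.get("url"), "episode.mp3")
```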
so there you go a lesson in economics all right we have some more questions to go about ai and social 01:03:26.640 |
media plus a bonus question we added about an unrelated topic that i just like talking about 01:03:30.560 |
we also have a case study coming up that uh will be good and a tweet from sam altman that uh 01:03:38.080 |
i'm gonna have to get into so all this is still to come but first we need to take another quick 01:03:42.720 |
break to hear from some more of this episode's sponsors here's the thing about being an aging 01:03:49.280 |
guy when you're young you don't think about your skin you spend time in the sun only occasionally you 01:03:53.920 |
know cleaning the grime off your face with a brillo pad and you still end up looking like leonardo 01:03:58.400 |
dicaprio in growing pains but then one day you wake up and you realize like wait a second i look like an old 01:04:04.400 |
grizzled pirate captain why didn't anyone tell me that i was supposed to take care of my skin 01:04:10.160 |
i want you to avoid that fate and you could do so with caldera labs their high performance 01:04:15.280 |
skincare is designed specifically for men it's simple effective and backed by science 01:04:18.640 |
their products include the good which is an award-winning serum packed with 27 active botanicals 01:04:24.400 |
and 3.4 million antioxidant units per drop that's true i checked their eye serum which helps reduce 01:04:29.680 |
the appearance of tired eyes dark circles and puffiness and their base layer which is a nutrient 01:04:34.080 |
rich moisturizer infused with plant stem cells and snow mushroom extract you don't have to know what all 01:04:38.800 |
that means you just get the products from caldera labs and you use them it really does work in a 01:04:44.800 |
consumer study 100 percent of men said their skin looks smoother and healthier all right so men if you 01:04:50.560 |
want to look more like growing pains era leonardo dicaprio and less like a grizzled pirate captain 01:04:56.000 |
you need to try caldera labs skin care doesn't have to be complicated but it should be good 01:05:01.440 |
upgrade your routine with caldera lab and see the difference for yourself go to caldera lab dot com slash 01:05:06.960 |
deep and use code deep at checkout and you will get 20 off your first order i also want to talk 01:05:13.840 |
about our friends at shopify if you run a small business you know there's nothing small about it 01:05:17.680 |
every day there's a new decision to make and even the smallest decisions feel massive 01:05:22.240 |
so when you find the decision that's a no-brainer you should take it when it comes to selling things 01:05:26.720 |
using shopify is exactly one of those no-brainers shopify's point-of-sale system is a unified command 01:05:33.760 |
center for your retail business it brings together in-store and online operations across up to 1 000 01:05:39.440 |
locations it has very impressive features like endless aisle ship to customer and buy online pickup in 01:05:46.480 |
store and with shopify pos you get personalized experiences to help shoppers come back and they will 01:05:52.480 |
based on a report from ey businesses on shopify pos see real results like 22 percent better total cost of 01:05:58.800 |
ownership and benefits equivalent to an 8.9 percent uplift in sales on average relative to the markets studied 01:06:05.840 |
so get all the big stuff for your small business right with shopify sign up for your one dollar per 01:06:11.680 |
month trial and start selling today at shopify.com deep go to shopify.com deep that's shopify.com deep 01:06:22.080 |
all right jesse let's get back to doing some more questions 01:06:26.960 |
all right who we got next next up we have patricia i think it's realistic for most people to 01:06:37.680 |
give up their smartphone however if something is available only through a mobile app what are your 01:06:42.960 |
thoughts on a family smartphone it's not tied to anybody in particular and it's there to purely 01:06:48.720 |
cover specific needs um i think it's a great idea i think a family smartphone is great you have those 01:06:54.800 |
type of apps on it you need for just like operating logistically in various parts of life and you grab it 01:06:59.280 |
when you need it get a cheap phone keep it forever keep it charged in the kitchen i actually just heard an 01:07:04.560 |
interview with werner herzog on conan o'brien's podcast where he was actually talking about this 01:07:09.520 |
werner herzog like famously is one of these people like quentin tarantino or like aziz ansari who do 01:07:16.000 |
not use smartphones he's never really used a smartphone he's too busy making like 30 movies a year 01:07:19.840 |
but he told conan he had to buy one recently why because the parking lots in a lot of european cities 01:07:27.840 |
need a smartphone app for you to use them you have to use the app to actually like 01:07:31.920 |
register your car so it doesn't get towed so he has some smartphone that he barely knows how to 01:07:36.160 |
use and all it has on it is a parking app i think that's great and so if you're uh if you're worried 01:07:41.200 |
about you know having a fully serviced smartphone as being an unavoidable distraction then yeah have 01:07:46.080 |
a family smartphone for when you need it that is a great idea patricia i uh i recommend it all right 01:07:51.040 |
jesse we have our bonus question here right it's on a topic outside of today's episode but one that i 01:07:54.800 |
like to talk about yes we do all right who we got it's from tim we write monthly articles for our 01:08:00.400 |
company's internal weekly newsletter to minimize back and forth email communications i set up a clear 01:08:05.440 |
system and structure this past month a member didn't use the structure at all but kept moving the 01:08:10.720 |
conversation back to emails even when i tried to move things into the system multiple times i just love this 01:08:16.640 |
this is like pure a world without email if you haven't read that book oh man i really get into 01:08:22.720 |
like what happens with email why it's destroyed work and what it looks like to not be beholden to email 01:08:27.360 |
so tim i haven't talked about that book in a while so you gave me an excuse to do so i have three different 01:08:32.720 |
things to mention we can decide which of these three things are relevant to you all right so when it 01:08:38.480 |
comes to putting in place systems or protocols that are an alternative to just sending emails back and 01:08:43.680 |
forth you have to differentiate between uh a new protocol that a team is going to use like internally 01:08:53.760 |
versus a protocol that you're exposing to the outside world like to other teams for internal changes 01:09:02.080 |
so when you say like oh we do an internal weekly newsletter if there's like a team of writers who 01:09:06.960 |
work on this together and that's what you're talking about internally email alternatives have to be 01:09:12.720 |
bottom up with buy-in if you just come in and say technically i'm in charge of this team 01:09:17.840 |
here is how we are communicating now it creates resentment and friction and it really 01:09:23.680 |
rarely works people just don't like it they just don't even if it is like on paper a better idea 01:09:29.680 |
they are going to have like a cal newport dartboard up that they're going to be you know 01:09:33.600 |
throwing those darts at and they're going to be aiming for the groin if you know what i mean right 01:09:36.720 |
it's not good you got to have buy-in you have to be like hey let us talk about what's going on 01:09:40.880 |
here is why i'm worried about us just doing all this with email here are the problems like i read 01:09:44.880 |
this cal newport book a world without email like all this context shifting like i think this is 01:09:48.720 |
making us miserable it's making us less productive we need some sort of protocol that's maybe a 01:09:53.920 |
little bit more structured that will reduce the main thing that you want to reduce which is not 01:09:57.600 |
messages but unscheduled messages that require responses because you're trying to 01:10:01.120 |
prevent having to switch your context as much as possible all right what should we do then as a 01:10:07.040 |
team you come up with this together you have buy-in and you say great what's our check-in rhythm once a 01:10:11.040 |
month all right we'll check back in again in a month this isn't going to completely work we'll fix 01:10:15.920 |
the parts that aren't we'll try new things we'll try to see what parts are working that's the way you 01:10:20.080 |
change email with protocols within a team if it's external so if what you're talking about here for 01:10:26.880 |
example is you run the internal editorial team but then people from all over the company can send you 01:10:32.720 |
stuff like be like hey i want you to put this in the newsletter or something like that then it's okay to 01:10:38.160 |
expose a protocol interface and just say this is what we use i think it's a really good idea i talk 01:10:44.640 |
about in the book a world without email this is how they organized the 01:10:49.840 |
apollo program because there are so many different teams and you know between academia and through 01:10:54.160 |
the government and through defense contractors they eventually essentially had to publish like 01:10:58.720 |
communication protocols for communicating between the teams here is where the information comes in 01:11:03.840 |
what format who signs it off and when you do it because otherwise it was just impossible there was too much 01:11:07.920 |
information or whatever like that so i think that's fine if you're exposing a communication 01:11:13.520 |
interface to the outside world or outside your team then you can just say this is the way we communicate 01:11:17.760 |
and there i think it is uh often sufficient to just pretend like stuff that's not in that protocol 01:11:24.880 |
doesn't exist or just say you know you send back here are the instructions for like submitting to the 01:11:30.080 |
newsletter and they're like oh okay but hey what about this could you and you just say 01:11:35.280 |
these are the instructions for submitting to the newsletter that's just it like you 01:11:38.560 |
just sort of hold the line on that and a few people will kind of like stop 01:11:42.640 |
participating most people will do it third thing i want to say if you do have one of these external 01:11:48.000 |
interfaces and you're getting a lot of this pushback you also have to keep in mind that maybe your 01:11:54.480 |
interface is bad and often the issue is you have too much friction people are busy you're not their most 01:12:01.360 |
important thing the thing that they're communicating to you with through your interface is probably 01:12:05.920 |
something that's low down on their to-do list they just want to get it off there you do have 01:12:09.200 |
to respect that about people but there's common friction points that really can make these 01:12:14.560 |
external facing protocol interfaces fail and you want to make sure you're not suffering from one of 01:12:19.280 |
these the biggest one is like making people have to switch systems that's a really big one 01:12:23.440 |
it seems simple on paper but it really is a problem for people if you're like look you gotta 01:12:28.400 |
you're gonna have to like have a login and log into some system and click on some drop down 01:12:32.880 |
menus and there's some ambiguity about like what you're trying to do doesn't quite fit into these 01:12:36.960 |
forms and if it's too much like i'm making you do a lot of my work for me you get a lot of this like 01:12:42.960 |
internal bureaucracies like you got to put this in and the code here and then click from this drop down 01:12:47.040 |
like what the things are like that type of friction like i have to switch to another system or i have to 01:12:53.040 |
fit what i'm doing into like an interface where what i'm doing might not cleanly fit that really annoys 01:12:58.000 |
people so what you want to look for is like low friction interfaces let me give you an example 01:13:02.160 |
at our department for example at georgetown like we have a weekly newsletter you know for the department 01:13:08.240 |
that goes out and uh people come up with like things they might want to submit or this or that 01:13:14.480 |
there's like a really nice protocol we have in place and it's really my department not me that 01:13:18.080 |
figured this out i just admire it and it uses email but it doesn't use email in an ad hoc way of 01:13:25.120 |
like let's just start talking to individual people there's a particular address you can use it's not 01:13:29.120 |
associated with a single name but actually with a whole admin team and the people in charge of 01:13:33.680 |
communication you can just email something that you want to be included to this address so it's very low 01:13:40.720 |
friction for you as like the user interfacing with this newsletter i can just like forward because often 01:13:46.320 |
like you'll get something from a colleague like oh we should put this in the newsletter i can just hit 01:13:49.440 |
forward and send that thing over but because the address is not associated with an individual person 01:13:53.600 |
i don't expect a response i don't expect to be able to have like a back and forth interaction 01:13:57.360 |
about it and then on the other end there is i believe in this case we have a student you know 01:14:04.640 |
we have like not interns we have like student employees in the department who you know we pay 01:14:09.440 |
hourly to do a bunch of different stuff and one of the things they do is they can monitor this address 01:14:15.200 |
and they can take these things and i don't know the whole back end but they like put them all in a big 01:14:19.040 |
google doc file and then there's like one time each week where the people who build the newsletter the 01:14:24.000 |
members of the admin staff who build the newsletter they take what's in there and they format and send it out 01:14:28.800 |
before that there is a weekly meeting that already exists of like the various directors like myself 01:14:34.080 |
and we can just like look through that and be like oh that's not important that's not important yeah 01:14:38.240 |
include this include this now include that right and so we're already meeting that just adds like 01:14:42.480 |
30 seconds to our time all of this produces a newsletter where it's like very easy for people 01:14:47.200 |
to submit things it's still curated gets formatted nicely and how many unscheduled emails get sent that 01:14:52.800 |
have to be responded to zero and that is what you're often looking for all right so those are the three 01:14:58.640 |
things internally if there's an internal protocol you need buy-in you can't just impose it if it's an 01:15:03.440 |
external protocol you can hold the line dot dot dot but if it has too much friction in it then maybe you 01:15:11.280 |
need to make this a lower friction protocol that still accomplishes the goal which 01:15:16.000 |
is not message reduction but reducing the messages that require responses there we go a little world 01:15:21.440 |
without email i can't resist sometimes jesse there's a lot of technology stuff going around we have to talk 01:15:25.600 |
about right now email sounds old-fashioned but it's at the core of so many people's like subjective 01:15:31.360 |
experience of work day-to-day and their subjective well-being so i can't help talking about it sometimes 01:15:36.480 |
all right we have a case study right yes now let's get some case study music 01:15:41.840 |
all right case studies where people send in their own accounts of applying the type of advice we talked 01:15:55.920 |
about on this show in their own life today's case study comes from mike all right here's what mike says 01:16:02.800 |
as a creative being on social media felt like a necessity but often also a chore so i finally did 01:16:09.520 |
a 30-day digital declutter as you describe in your book digital minimalism i don't know why i waited so 01:16:15.120 |
long here are my takeaways just as other people have described i was very antsy the first couple days 01:16:21.200 |
however it quickly subsided and i got used to the slower less anxious life it felt like i had more time in 01:16:27.440 |
my day because i did i realized how little value i actually get from being on social media now i get 01:16:32.720 |
everything i need in real life conversations are more robust as i don't need to see them on social 01:16:37.920 |
media an unexpected benefit was i started making decisions faster and with more certainty it was like 01:16:44.880 |
i got back to what i actually wanted for my life and not what someone on instagram was telling me to do 01:16:50.400 |
i'm now at the stage of implementing rules for how to use the apps after my declutter ends i definitely 01:16:55.600 |
think everyone should try this and enjoy the benefits all right mike i appreciate that for 01:16:59.760 |
those who don't know in my book digital minimalism which seems to be having a kind of like a new 01:17:04.480 |
renaissance by the way jesse i think people are like looking for responses to the negative relationship 01:17:11.920 |
they have with technology like well i've got one for you it's in this nice book you can read it 01:17:15.840 |
in that book i recommend you take a 30 day break from all your technology so you can rediscover like what you 01:17:15.840 |
actually care about through experimentation reflection figure out your values what you really want to do 01:17:25.120 |
in your life and then only add back technologies that directly support those values and when you 01:17:30.160 |
do put clear rules around them for when and how you use them so instead of like having to have a real 01:17:35.840 |
reason not to use a technology you have to have a real reason to use it and once you know why you're 01:17:39.840 |
using it you can put clear rules around it because if my value is x then why am i looking at my phone 01:17:44.160 |
when i'm in the bathroom that's unrelated to that so digital declutter is a good way to sort of 01:17:49.120 |
jumpstart a new relationship with all your tools so i'm glad things went well with mike 01:17:52.320 |
yeah i mean that's one of the big things i hear from people is they're surprised by how much brain 01:17:58.960 |
space social media was taking up and by brain space i mean like emotional space psychological space like 01:18:03.920 |
energy being burnt in this sort of like simmering fire in the background or whatever it's like 01:18:11.040 |
what baseball fans feel uh you know a friend of mine is a big tigers fan i'm just thinking about how much 01:18:17.360 |
mental energy the last couple of weeks burnt up for him right because it's just you know it's tough 01:18:22.880 |
your team it's suspenseful there are lows and then finally you don't win and then they're immediately 01:18:22.880 |
talking about it you know their pitcher mad dog was so upset that he only pitched six innings 01:18:28.080 |
in the final game all right trust me he's like they took him out and i was on a long text thread about 01:18:32.720 |
this yeah i know what you're talking about and what i thought about it was like okay this of course is 01:18:40.880 |
valid like it's your team you're a long time tigers fan this is going to eat up a lot of mental space 01:18:45.280 |
that's most people all the time because of social media like you have that same level of like 01:18:50.640 |
smoldering background like i have to talk about it or whatever it makes sense when it's like your 01:18:54.720 |
baseball team and it's two weeks that's the joys of october but for a lot of people it is they're 01:18:59.440 |
always living in a constant state of being like a detroit fan after skubal was taken 01:19:06.480 |
out in the sixth that's like their every day that is exhausting and i don't think people realize that 01:19:11.680 |
until they actually step away from it so yeah read that book digital minimalism it's often a 01:19:16.480 |
weird thing about these books jesse it's like when you're writing them you feel like 01:19:20.240 |
this is way too late but you're almost always too early yeah you know i was like oh it's 01:19:25.600 |
too late people already know like they have a problem with their phones it's like now that people 01:19:30.640 |
are getting around to it or it's like when jonathan haidt wrote the anxious generation last year 01:19:35.760 |
experts were like yeah we've been talking about this since 2017 like it's too late no people 01:19:40.400 |
weren't ready till then so yeah digital minimalism it came out in 2019 but it's more relevant now than 01:19:46.080 |
it ever has been all right do we have a call this week yes all right oh it's also about phones and all 01:19:51.760 |
that type of stuff all right let's hear this call hey cal and jesse this is antonio from los angeles 01:19:57.920 |
i followed your advice years ago and i gave my son a dumb phone when he turned 16 i gave him a 01:20:03.840 |
smartphone he has yet to download any social media uh so i think that was kind of spectacular kind of um 01:20:11.200 |
your advice in uh in action my daughter now is 13 and i gave her the same dumb phone uh however she's 01:20:19.440 |
sort of using the text threads it sort of is a de facto social media so she's on two or three different 01:20:27.200 |
text threads she'll you know wake up and there'll be 7,487 messages um and i don't quite know how 01:20:36.160 |
to regulate that or how to help her make different choices i'm wondering if you have any advice about 01:20:44.800 |
that uh thank you so much for what you've been doing um and uh i do love the newsletter that's a 01:20:50.160 |
new new thing for me all right antonio thanks for that question i have strong thoughts about all this 01:20:55.120 |
i'm actually giving a talk the day you're hearing this i'm giving uh another talk about my kids school 01:21:00.960 |
about like phones and actually the title of the talk i'm giving today the day you're hearing this is 01:21:05.680 |
when should i give my kid a smartphone i'm giving it to elementary school age parents so 01:21:09.680 |
i think a lot about these things um okay so first of all i like the story about your son 01:21:15.280 |
that's partially why 16 is considered like a good age to wait till for social media because 01:21:20.800 |
you're done with puberty you have well-established social groups and your social identity is in place 01:21:27.760 |
so you're a lot less malleable at that point and then when you bring it in it's sort of more like 01:21:34.000 |
when an adult gets into it you can still end up using it too much but it doesn't have that same 01:21:37.600 |
impact that if you bring those tools into your life when you're still forming your social identity 01:21:41.680 |
when you're still prepubescent or going through puberty and you have all those chemicals and 01:21:44.560 |
reconfigurations of your brain and you're still trying to like you know cleave from the parental 01:21:44.560 |
uh community tribal dyad and sort of prove yourself into a tribe of your own like all these things 01:21:49.920 |
happen around that time where you really don't want that technology you know in your face that being 01:22:00.640 |
said when it comes to social uh digital sociality there is a big gender divide and you're seeing that 01:22:05.200 |
you know these are all just bell curves and there's people on either ends of these bell 01:22:09.120 |
curves but in general young women and girls are more keyed in to the social technologies than young 01:22:16.240 |
men or boys are so this is why your son probably had an easier time with it you need to be wary 01:22:22.160 |
about digital sociality that's the terminology for sort of any socializing that's happening largely 01:22:27.120 |
through the digital exchange of text or images so what you would do on text threads or whatsapp 01:22:31.600 |
or what you would do on something like snapchat you got to be wary about it for a few reasons 01:22:37.280 |
one is uh when you're communicating especially linguistically so through text there are a lot of 01:22:44.640 |
the parts of your brain that are part of normal social interaction that we evolved over a couple 01:22:49.520 |
hundred thousand years guard rails to like keep you a reasonable human being 01:22:55.120 |
and to maintain relationships in a way that's going to allow you to have a long-term relationship in 01:22:58.800 |
your tribe a lot of those guard rails are not activated when the communication is happening 01:23:03.520 |
with words because they don't know that you're communicating with someone yeah your frontal cortex 01:23:07.840 |
does like rationally you have some newer nerve bundles that recognize like this text is going to someone 01:23:12.960 |
else but like these deep type of social guard rails they evolved in a time when there was no written 01:23:16.640 |
language they don't know about texting so they're tuned down so people get nasty people say things they 01:23:24.400 |
wouldn't say in person with girls it's less direct when i give these talks 01:23:29.360 |
middle school girls will tell me it's often less about like i'm going to say something mean to you 01:23:33.920 |
it's way more subtle knife digging than that it's like okay here's what i'm going to do you weren't 01:23:39.680 |
invited to this thing and so i'm going to find a way to mention it in like a very happy way like isn't 01:23:45.760 |
that really funny and lol and put a lot of emojis or whatever but really the whole point of that 01:23:49.680 |
was to make you feel bad you probably wouldn't have done that in person right so guard rails 01:23:55.040 |
are down so it's more sharp right you have more opportunities for these sort of negative 01:24:00.320 |
interactions two it's exhausting it uses a lot of mental energy to to be in the middle of a social 01:24:06.720 |
interaction it's like when you come back from a party you're exhausted or whatever and that's fine it 01:24:10.720 |
should be mentally expensive especially when you really care about it like you know when you're 01:24:14.320 |
a teenager or whatever but when you have those conversations and 01:24:19.280 |
ubiquitous access to a phone you never get a break from it you wake up to them when you go to bed 01:24:25.600 |
you're still working on them when you're at home you never get a break where i am just safe from having 01:24:30.160 |
to navigate this complicated overly sharp guardrails turned down sociality i never get a break from it 01:24:35.840 |
it follows me everywhere and that is really exhausting and it is a burden that you don't want to place 01:24:40.720 |
upon your kids so digital sociality is something that we really worry about so if your kid is using 01:24:46.960 |
the dumb phone to basically be in a world of sort of constant digital sociality i would change that 01:24:50.960 |
and there's a couple things you could do there the big one i would do is i do not believe in the model 01:24:56.160 |
of here is your dumb phone no you don't have a phone i have a phone i bought a phone i'm paying for 01:25:02.800 |
this by the month the family owns this phone if you're going somewhere where for logistical reasons you 01:25:09.840 |
need to be able to like call us to pick you up or whatever you can take one of the family phones 01:25:14.720 |
with you that's what we do in our family we have a couple different dumb phones they're both terrible 01:25:18.400 |
to use they're not owned by any of our kids they're there if like okay my oldest needs to take the city 01:25:24.720 |
bus to baseball practice take one of the family phones with you today so that you can let us know 01:25:29.120 |
if there's like a problem with the bus ride when you get home you put it back so he doesn't have 01:25:34.240 |
a sense of ownership of a communication technology i think that is really important when you have 01:25:38.240 |
excessive digital sociality then you might say well they can't be cut off completely there's 01:25:43.120 |
all this texting going on well they could be cut off a lot and what you can offer them is we will set 01:25:48.240 |
up a family ipad that has imessage on it and you can be a part of groups on that that's fine but your 01:25:56.080 |
time to do that is going to be like your time to watch tv there will be like certain times like you can 01:26:00.000 |
go do some group messaging now if you want so you can still sort of be involved in these things a half hour 01:26:06.640 |
here like after dinner or before dinner but that's it and downstairs where we can see you you don't get 01:26:11.440 |
an ipad in your room i don't get this where people are like i would never let my kid have a 01:26:16.080 |
phone but man they're up in their room a lot you know with the door locked with their ipad an ipad is 01:26:20.560 |
just a bad phone guys like you have access to all the same stuff so you can go group text in the living 01:26:24.800 |
room now and if you want to do that instead of watching a show for 20 minutes and then this ipad comes 01:26:29.200 |
back to me it's not yours you don't own it that is giving relief to your kid they're like okay there's 01:26:35.760 |
no way i could be communicating right now i don't have to worry about it and what happens is that 01:26:40.080 |
their friends adapt to it oh okay so and so has like a dad that listens to cal newport they all know and 01:26:45.440 |
curse my name like my name kind of gets out there more and more i'm kind of like a boogeyman right 01:26:49.840 |
or it's sort of like in the mindset of like the american teenager the krampus like evil figure 01:26:55.120 |
that comes in is jonathan haidt and i'm kind of like his evil sidekick like this less 01:27:00.800 |
known but sort of i come in and help him like steal you know your toys that's kind of the way they see 01:27:05.600 |
us that's okay like oh my dad's like a cal newport guy so like i can only be on these sometimes the 01:27:10.960 |
people adjust and the groups you don't want to be a part of the ones that are really mean well 01:27:16.560 |
you don't want to be in those groups anyways and your real friends that you're doing other stuff 01:27:19.120 |
with are like yeah we just know that's just the way you are so i feel strongly about digital 01:27:23.280 |
sociality we have to be careful about it so that phone is not hers that phone is yours she just 01:27:28.720 |
needs it if she needs it for logistics group chat is like watching tv there's a few times we'll let you 01:27:33.120 |
do it uh she will survive till she's 16 and she'll thank you for it in the end all right uh i believe we 01:27:40.640 |
are ready for our final segment all right i want to react i want to kind of close up our discussion today 01:27:48.800 |
of social media versus ai by checking in with our friend sam altman right so they released this sora 01:27:56.000 |
app as hank green called it slop tok to try to make money by producing a lot of these videos what 01:28:03.280 |
else have they been up to recently i want to bring a tweet a tweet up here on the screen this was from 01:28:07.920 |
last week here's sam altman uh he starts by saying we made chat gpt pretty restrictive to make sure that we 01:28:16.560 |
were being careful with mental health issues we realized this made it less useful and enjoyable 01:28:16.560 |
to many users who had no mental health problems but given the seriousness of the issue we wanted to get this 01:28:21.280 |
right now that we have been able to mitigate the serious mental health issues and have new tools 01:28:28.160 |
we are going to be able to safely relax the restrictions in most cases in a few weeks we 01:28:32.400 |
plan to put out a new version of chat gpt that allows people to have a personality that behaves more 01:28:36.800 |
like what people liked about 4o if you want your chat gpt to respond in a very human way or use a ton of 01:28:42.160 |
emoji or act like a friend chat gpt should do it in december as we roll out age 01:28:48.080 |
gating more fully and as part of our treat adult users like adults principle we will allow even more 01:28:53.120 |
like erotica for verified adults the bad news is he goes on to clarify all of the ai generated erotica 01:29:01.920 |
will feature uh him so i don't know it could put his face on everyone so it's kind of like 01:29:07.360 |
there's good and there's bad but i guess if you own the company you can do a lot um all right let me tell 01:29:12.560 |
you let me tell you if you're an open ai monitor what this all means things aren't going well over 01:29:20.720 |
at open ai that is the only way i can think to interpret sora followed by this announcement of 01:29:28.000 |
we're going to turn on erotica in chat gpt let me explain why it wasn't that long ago that if you 01:29:37.440 |
listen to sam altman or you listen to dario amodei we were still talking in terms of the world was about 01:29:44.160 |
to be transformed it was what four months ago that sam altman was on theo von's podcast comparing himself 01:29:52.400 |
to oppenheimer and the pending release of gpt5 to the trinity atomic bomb test it was july that 01:29:59.760 |
dario amodei was going around saying look we're going to automate half of white collar 01:30:05.520 |
jobs it's going to be a bloodbath i don't even know who's going to be employed anymore 01:30:10.880 |
and we had studies coming out trying to identify who would not be affected by ai 01:30:17.360 |
there's literally a study that i read that very helpfully identified that the safest job in the 01:30:23.200 |
near future is going to be dredge operator so i guess if you're operating a dredge barge you're 01:30:30.160 |
less likely to be automated so we were in this place until very recently where the reason why there 01:30:36.800 |
are so many hundreds of billions of dollars being invested in these technologies is because they were 01:30:40.640 |
going to automate huge parts of the economy remember sam altman in 2021 put out that 01:30:47.040 |
essay moore's law for everything where he said this technology is going to take over everything 01:30:52.640 |
we have to have a new tax system in place that basically taxes equity wealth because 01:30:57.920 |
there are going to be just like three companies that do everything with their software and 01:31:02.000 |
they're going to have all the money so you're just going to have to each 01:31:04.880 |
year take 20 percent of the money and distribute it to people so there aren't riots in the street 01:31:08.080 |
like that was the path they were on now what are we seeing they're saying maybe we can sell ads 01:31:16.720 |
against ai generated videos of you know stephen hawking being pushed into a pool maybe we can sell ads 01:31:24.080 |
among chat gpt feeds of erotica this means that something has shifted at open ai they've realized 01:31:31.760 |
the change-the-world impact they were on a get-there-or-die trajectory toward is not going to 01:31:40.480 |
happen because gpt5 wasn't a massive leap over gpt4 it wasn't a massive leap over 4.5 they couldn't 01:31:48.400 |
scale their way to artificial general intelligence most of their improvements on the tech side for 01:31:53.200 |
example were coming from fine tuning what was basically a gpt4 / 4.5 class model 01:31:58.720 |
to do well on very particular tasks or benchmarks that happened to be well suited to synthetic data set 01:32:03.280 |
tuning it wasn't becoming generally aware they were seeing that even though 2025 was 01:32:09.440 |
supposed to be the year of the agents trying to run automated processes off of llm queries just 01:32:14.400 |
doesn't work they're just not reliable enough they make things up they don't understand what you're 01:32:18.560 |
saying and if you just automatically query an llm and then take action based on what it says 01:32:22.960 |
it's a disaster that's why no one has agents no one's automating anything the use cases that are emerging 01:32:29.840 |
and there are good ones they're very useful because it's very powerful technology 01:32:33.920 |
but they are not the use cases that are enough to support a trillion dollar company or a 500 01:32:38.800 |
billion dollar company like it is now this is why there are also these shenanigans going on 01:32:43.680 |
like the nvidia deal with open ai where i'm going to pay you 100 billion dollars for this 01:32:49.200 |
and you're going to pay me 100 billion dollars back and then we're going to kind of hope that 01:32:52.080 |
some stock market inflation and that arbitrage somehow make up the difference there's a great article 01:32:58.240 |
about this in the journal recently you know when the last time those types of deals were going on was it 01:33:01.680 |
was in the enron era with all those fiber companies right before they all went out of business in the 01:33:05.760 |
late 90s early 2000s so i think this shift towards attention economy slop that we've 01:33:13.120 |
seen in the last month from open ai is a signal that they're in trouble because it was super 01:33:19.440 |
expensive to build these models and though they're cool they're not going to take over the economy and 01:33:23.120 |
make them the only company left and be the last invention that anyone ever needs to make 01:33:27.200 |
and now they're asking how do we make money they're making some they do pretty well selling chat gpt 01:33:32.160 |
subscriptions but they need to 10x that to really cover their capex so they're 01:33:38.320 |
now starting to flail that's the way i see these two stories we started with them building a tick tock 01:33:45.200 |
clone and we ended with them saying let's just try to make the chat feed as brainstem 01:33:52.720 |
engaging as we can and by the way they're going to sell ads on all this of course they hired a major 01:33:56.320 |
ad expert onto their executive team earlier in the summer this is what they're going to 01:34:01.920 |
do try to sell ads on pure brainstem stimulation i see this as the fall of a once super ambitious 01:34:08.800 |
exciting company with visions for changing the world now they're hoping they can just sort of slop their 01:34:13.760 |
way towards profitability maybe other breakthroughs are coming and i'll be wrong but then again i'm not a stock 01:34:17.840 |
market analyst i'm just a technology guy i'm just saying i think these two things together 01:34:22.320 |
are not a good sign if you were really on team we're a year away from open ai changing 01:34:27.840 |
everything so there we go not good news jesse 01:34:31.040 |
but we'll see all right that's all the time we have thank you for listening to the show we will be back 01:34:38.560 |
next week with another episode and until then as always stay deep if you liked today's discussion of 01:34:44.960 |
sora and its competition with existing social media platforms you might like episode 372 in which i 01:34:51.920 |
decode exactly how the tick tock algorithm works if you want to understand how the slop is being 01:34:58.160 |
served you need to listen to that episode i think you'll like it check it out the big news in the world of social media 01:35:03.920 |
recently is the announcement made last week that u.s. interests would be taking over operation of tick tock in america