The Case for Quitting Social Media | Cal Newport

Chapters
0:00 Is it Finally Time to Leave Social Media?
38:59 How do I find everything I should be blocking on my kid’s phone?
43:39 Is it okay that I make YouTube Videos?
49:13 Shouldn’t we focus on using social media better?
58:00 A musician and social media
70:42 US House panel asks internet CEOs to testify after Charlie Kirk assassination
Two days after the assassination of Charlie Kirk, I sent a short essay to my newsletter. 00:00:05.840 |
I wrote it in about 15 minutes, but it was about as raw or as pointed as I ever get. 00:00:14.540 |
Those of us who study online culture like to use the phrase, Twitter is not real life. 00:00:20.440 |
But as we saw yet again this week, when the digital discourses fostered on services like Twitter or Blue Sky or TikTok do intersect with the real world, 00:00:28.920 |
whether they originate from the left or the right, the results are often horrific. 00:00:33.140 |
This should tell us all we need to know about these platforms. 00:00:38.400 |
They are responsible as much as any other force for the unraveling of civil society that seems to be accelerating. 00:00:44.780 |
All right, then later in the essay, I reached the following conclusion. 00:00:50.100 |
After troubling national events, there's often a public conversation about the appropriate way to respond. 00:01:01.440 |
Find other ways to keep up with the news or spread ideas or be entertained. 00:01:05.400 |
Be a responsible grown-up who does useful things, someone who serves real people in the real world. 00:01:11.660 |
In the over two decades or so that I've been writing that newsletter, I don't think I've ever received as large or as positive of a response as I did for that post. 00:01:20.780 |
I'm at something like 200 email replies and counting and nearly 100 comments. 00:01:25.080 |
The vast majority of this feedback is saying the same thing. 00:01:29.360 |
This was the final push I needed to finally quit these apps. 00:01:35.540 |
Our biggest problem with social media right now is not convincing people that it's bad. Most people already believe that. 00:01:47.280 |
Instead, they're stuck in a sort of purgatory where they know it's bad, but they're not sure exactly why, or whether it's something that maybe could be fixed, or maybe something that's worse for other people than it is for them. 00:01:57.260 |
So they stay where they are, in the status quo, waiting for someone to push them into a different pattern. 00:02:07.160 |
I'm going to put on my technocritic professor hat and provide a precise and detailed explanation of how exactly the worst of these platforms impact us and why these problems are fundamental to these technologies and not something that we can easily fix. 00:02:21.240 |
I want to give you the ironclad case for why the most reasonable response for most people is to do what I recommend in my essay, to quit for good. 00:02:30.620 |
As always, I'm Cal Newport, and this is Deep Questions. 00:02:48.520 |
Today's episode, The Case for Quitting Social Media. 00:02:52.660 |
All right, so we need to start our conversation here by defining some terms. 00:03:00.520 |
The phrase social media can mean a lot of things, and then that creates confusion because you might have one thing in mind and someone else has something else. 00:03:10.240 |
Today, I'm going to be referring to a specific class of social apps that I call curated conversation platforms. 00:03:19.520 |
All right, so these are platforms in which you have a lot of people joining in conversation, be it through text, images, or video, using curation algorithms based on optimizing engagement to help select what each individual user sees. 00:03:35.780 |
So some obvious examples of curated conversation platforms include Twitter and Blue Sky and Threads and Facebook. 00:03:42.260 |
There's other things that people call social media that don't fit into this particular category. 00:03:47.060 |
So a service like Pinterest probably doesn't fall into this category. 00:03:50.420 |
It's not a conversation platform so much as a place where you're looking at things that people posted. 00:03:56.180 |
Other services are conversation platforms, but they're not algorithmically curated. 00:04:04.980 |
And others host conversation, but not at massive-platform scale. 00:04:08.140 |
And then there's some platforms where, depending on how you use them, could fall into this category or not. 00:04:14.740 |
So I think things like Instagram or YouTube, depending on how you use them, could fall into the category of what we're talking about today or not. 00:04:22.640 |
Now, I want to focus on curated conversation platforms in particular because I think they lead to uniquely awful harms. 00:04:29.480 |
I also think they are the social platform most responsible for the rise in violence that we've been seeing recently, be it political or otherwise. 00:04:41.200 |
So let's talk about these curated conversation platforms. 00:04:44.640 |
If you ask people, hey, what's wrong with Twitter, Blue Sky, or Threads? 00:04:50.160 |
They'll eventually probably list some combination of the following three things. 00:04:54.260 |
I'll put these up on the screen for those who are watching instead of just listening. 00:04:58.320 |
They're probably going to say, okay, distraction. 00:05:02.920 |
This is the one I talk about a lot in my books on this show. 00:05:05.960 |
It's the idea that there's an addictive element to these platforms so that we use them more than is healthy and to the detriment of other things that are important in our lives. 00:05:14.640 |
Another harm people will often mention if you talk about these platforms is demoderation. 00:05:19.000 |
This is where you begin to lose any sense of moderation around a given issue. 00:05:23.060 |
It's where you become increasingly strident, see less room for good faith or even basic humanity on the other side of the issue in question. 00:05:29.260 |
There's a lot of tribalism when it comes to demoderation. 00:05:32.840 |
You become part of a tribe that's fighting some sort of epic fight. 00:05:35.820 |
Because of these stakes, there's a lot of sort of furious policing of your own tribe. 00:05:40.880 |
The third harm we often hear about these things is what I call disassociation. 00:05:47.400 |
This is where you begin to make a complete break from normal associations with human community. 00:05:52.040 |
You lose connection to normative ethical constraints. 00:05:55.660 |
This is the state in which violence to others or to yourself becomes a possibility. 00:06:00.240 |
Or where you become numbed to seeing such violence being perpetrated against others. 00:06:05.380 |
There are really two major flavors of disassociation. 00:06:08.520 |
There's one that's based around overwhelming rage. 00:06:11.240 |
Well, I am so mad I have to take action regardless of the consequences. 00:06:17.240 |
The other flavor of disassociation is nihilistic withdrawal. 00:06:21.000 |
The idea that the world is so dark and unredeemable and meaningless that you might as well cause some chaos on the way out. 00:06:27.100 |
This nihilistic variation of disassociation is one that we've really been feeling the consequences of recently. 00:06:32.540 |
Earlier this year, the FBI began using a new term when talking about mass violence incidents. 00:06:43.280 |
It has become so common to have violence caused by a sense of nihilism that there's a new term for it that the FBI uses. 00:06:50.420 |
Part of the issue with nihilistic withdrawal is that if it is online platforms that have driven you to disassociation, you begin to think, oh, all that matters is attention and respect on online platforms. 00:07:08.220 |
Here's Derek Thompson writing in an Atlantic piece that he wrote a couple of years ago. 00:07:12.040 |
For many socially isolated men in particular, for whom reality consists primarily of glowing screens and empty rooms, a vote for destruction is a politics of last resort, a way to leave one's mark on a world where collective progress or collective support of any kind feels impossible. 00:07:25.860 |
So online disassociation can lead to violence because you're like, why not? 00:07:30.280 |
And the only thing that really matters is views. 00:07:36.300 |
So these are the harms that people would normally talk about. 00:07:39.580 |
If you then say to someone, all right, given these harms of curated conversation platforms, why are you still using them? 00:07:49.420 |
Well, this is because if you ask people how these harms affect them, they're going to give an answer that looks something like this. 00:07:55.700 |
And Jesse, we can take this off the screen for now. 00:07:57.120 |
They'll say something like, I may suffer a bit from distraction. 00:08:03.120 |
Yeah, I use these more than is useful, but I could probably get that under control with, you know, some better habits or I can break my phone or something like that. 00:08:10.720 |
Then they'll say typically, yeah, the demoderation thing, that is a problem. 00:08:14.200 |
Not really for me, but for the other tribes. 00:08:16.840 |
It's like whatever groups I'm affiliated with, like we're just being reasonable. 00:08:20.080 |
We're reasonably upset and we're kind of right. 00:08:23.780 |
Other people on the other side, they have problems with demoderation. 00:08:28.060 |
Probably the fix for that depends on what side of the political spectrum I'm on. 00:08:32.320 |
Maybe it's something about more aggressive content moderation. 00:08:34.860 |
So if they don't see the crazy stuff, they won't act so demoderated. 00:08:39.580 |
Like, well, if we could just correct things that are false, then people would be a little bit more reasonable. 00:08:46.620 |
And on disassociation, people would say, yeah, that's scary for sure. 00:08:52.100 |
But that's not even really happening on the platforms I use. 00:08:55.520 |
Like, whenever I hear about these mass shooters, you always hear about them being on 00:09:01.520 |
Rumble or Kick or on some sort of Groyper server on Discord. 00:09:05.960 |
These sort of weird, semi-dark web type platforms. 00:09:13.580 |
That's sort of how people respond to these harms. 00:09:17.160 |
And I think that response goes a long way to explaining why we're sort of stuck. 00:09:22.000 |
Why more people aren't making major changes about the role of these technologies in their lives. 00:09:33.600 |
What I want to do here is outline a different way of understanding how these curated conversation platforms hurt you. 00:09:41.200 |
And it's going to be a way of understanding them that is not going to let you off the hook so easily. 00:09:46.140 |
All right, I'm going to do some drawing here. 00:09:49.260 |
If you're listening, you might want to turn on the YouTube version to see my brilliant art. 00:09:55.100 |
So what I have up here on the screen is the three harms we talked about. 00:09:58.340 |
Distraction, demoderation, and disassociation. 00:10:02.800 |
The right way to think about how we interact with these is to imagine that there is a slope. 00:10:16.680 |
Starting with distraction, we start sloping downhill. 00:10:32.720 |
And I'm going to draw here a sort of dividing line and, you know, label this here. 00:10:50.980 |
So let me explain this beautiful picture I just drew. 00:11:00.860 |
Then the slope goes through the demoderation zone. 00:11:03.420 |
And then finally, the slope goes through the disassociation zone. 00:11:06.780 |
I've drawn a ground line across this slope diagram. 00:11:10.020 |
So right around the place we go from demoderation to disassociation, the slope goes underground. 00:11:16.500 |
Below ground, now we're in the world sort of of the dark web. 00:11:23.640 |
When you first start using these types of curated conversation platforms, it's pretty easy to fall in. 00:11:34.040 |
So you sort of fall onto the top of the slope where you're like, you know what? 00:11:39.460 |
Whatever my platform poison of choice is, I'm kind of on this more than is healthy, right? 00:11:51.040 |
And if you're in this distraction zone long enough, 00:11:54.120 |
Gravity is going to pull you down into the demoderation portion of the slope. 00:11:59.560 |
And now you're not just using these services too much. 00:12:03.940 |
You're losing your ability to see the other side in any sort of good faith. 00:12:08.180 |
You find yourself falling into dehumanizing tendencies. 00:12:12.340 |
Anyone who supports this person? I just can't even imagine that, right? 00:12:21.400 |
Don't you realize we're at war here? 00:12:27.320 |
And it gets worse and worse as you slide down this slope. 00:12:29.720 |
If you continue down the slope, with gravity still pulling, 00:12:32.380 |
you'll end up in the disassociation zone where you say, you know what? 00:12:39.760 |
And once you're down here, that's where people bounce off of the primary mainstream platforms. 00:12:47.940 |
You begin to bounce off those into, like, whatever mixture of things like Gab or Kick or Rumble. 00:12:55.140 |
Like, now you kind of end up roaming around these barely regulated, sort of Wild West platforms. 00:13:01.380 |
And that's where, like, the really bad stuff happens. 00:13:03.020 |
This is, I call this the slope of terribleness. 00:13:06.860 |
This is actually the right way to understand these harms. 00:13:10.700 |
You start with one, but gravity is pulling you down towards the next and then towards the next. 00:13:16.200 |
Now, your question might be, can we fix this? 00:13:19.220 |
Maybe if we just fix these services, there's a way that we can maybe put up some roadblocks 00:13:23.760 |
on the slope of terribleness and prevent this from happening. 00:13:25.700 |
But I want to argue that actually this is not the side effect of some mistakes in the way 00:13:30.100 |
these services were implemented that can be fixed. 00:13:32.220 |
But actually, this slope of terribleness and the tendency to slide down it is fundamental to how these platforms work. 00:13:42.480 |
You start using a curated conversation platform. 00:13:46.540 |
Part of the problem is that curation means the algorithm is looking for engaging material. 00:13:54.960 |
Whatever that is, it will get pretty good at finding it. 00:13:57.400 |
It represents your preferences as sort of weighted data points in a multidimensional space, and it figures 00:14:01.320 |
out what type of content is engaging for you, because that's how they make their money. 00:14:05.140 |
Now, the issue with this is the stuff you're seeing on the phone is very compelling. 00:14:10.400 |
This will pretty soon begin to overwhelm your short-term motivation centers in your brain. 00:14:15.260 |
So we have these circuits in our brain that are making predictions. 00:14:18.560 |
They're looking at activities that are available to you in the moment, and they're making predictions 00:14:25.140 |
What are the different outcomes going to be of these various short-term activities? 00:14:30.040 |
And if the brain has learned, oh, there's a really positive outcome that's going to happen 00:14:35.440 |
from a particular activity, you get a flood of dopamine in that motivation circuit, which 00:14:39.340 |
activates it and then makes that activity seem irresistible, right? 00:14:43.840 |
This is something we often get wrong about dopamine. 00:14:45.680 |
I noticed this in the governor of Utah, Spencer Cox's comments in the aftermath of the Kirk assassination. 00:14:52.660 |
A lot of people think of dopamine as a pleasure chemical: like, oh, once you do something, 00:14:56.400 |
you get a splash of dopamine and it feels good. But it's really more of an anticipation chemical; it fires up motivation before you act. 00:15:00.480 |
So these short-term motivation circuits have learned through past experience. 00:15:04.640 |
For example, when I eat that cookie, there's all of these things I feel that are good. 00:15:09.720 |
And the circuits learn from that and they say, great, when I see a cookie again, dopamine will 00:15:15.160 |
activate that circuit, because that circuit recognizes the cookie, and give me a strong sense of motivation to go get it. 00:15:20.200 |
Well, algorithmically curated content really plays to those circuits very well. 00:15:27.000 |
And if you're using the phone all the time, you build a very strong prediction circuit that 00:15:32.020 |
when it sees a phone is nearby: dopamine, dopamine, dopamine in that circuit saying, pick up that phone. 00:15:37.460 |
We're going to see something that's going to cause a reaction that's interesting, be it pleasure 00:15:41.380 |
or outrage, but either way it's going to be an interesting reaction. 00:15:47.040 |
And the problem is the phone is always with you. 00:15:50.480 |
So that circuit is just constantly being bathed in dopamine. 00:15:57.060 |
You can resist it sometimes, but it becomes very hard to consistently resist it. 00:16:01.900 |
So you go back to using those services again and again and again. 00:16:04.580 |
That's just the basic sort of feedback reinforcement loop. 00:16:08.320 |
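To make the anticipation-versus-pleasure point concrete, here's a minimal sketch, in TypeScript, of the kind of reward-prediction learning being described. It's a toy illustration, not anything referenced in the episode: it assumes a standard Rescorla-Wagner-style value update, and the learning rate and reward numbers are arbitrary.

```typescript
// Toy model of a short-term motivation circuit: the predicted value of a cue
// ("phone nearby") is updated toward the reward actually experienced, and the
// pull you feel when the cue appears is proportional to that prediction.

const learningRate = 0.2;   // arbitrary; how fast the prediction updates
let predictedValue = 0;     // what the circuit currently expects from the cue

// One "check the phone, get something engaging" experience.
function experience(reward: number): void {
  // Rescorla-Wagner-style update: nudge the prediction toward the observed reward.
  predictedValue += learningRate * (reward - predictedValue);
}

// The anticipatory pull when the phone is merely nearby, before any reward arrives.
function pullWhenPhoneIsNearby(): number {
  return predictedValue;
}

// Fifty scrolling sessions that each pay off.
for (let session = 0; session < 50; session++) {
  experience(1.0);
}

console.log(pullWhenPhoneIsNearby().toFixed(3)); // ~1.000: the cue alone now triggers a strong pull
```

The pleasure isn't the point in this sketch; the learned prediction is, which is why the pull shows up before you've even unlocked the screen.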
You're like, look, I don't want to smoke, but I'm going to carry a pack of cigarettes in my hand at all times. 00:16:14.080 |
That's a very hard way to cut back on smoking because those circuits are constantly being 00:16:18.660 |
bathed in dopamine saying that will give us a reward. 00:16:22.820 |
So just from fundamental brain chemistry, if you have algorithmically curated content accessible in your pocket at all times, 00:16:28.700 |
You're going to use that more than you want to. 00:16:31.560 |
So that's how you begin sliding down that distraction slope, 00:16:34.460 |
as soon as you begin using one of these curated platforms in any sort of regular way. 00:16:43.580 |
Well, two things happen with these platforms. 00:16:46.000 |
Once you start using them all the time, another side effect of the algorithmic curation is the echo chamber. 00:16:53.600 |
And we say this sort of loosely, like, oh, don't be in an echo chamber, but I'm saying it 00:16:57.520 |
in a sort of techno-determinist way. 00:16:59.620 |
It is an unavoidable side effect of an algorithmically curated information experience. 00:17:08.620 |
And I don't want to be too technical, but basically they have these huge multi-dimensional spaces. 00:17:14.880 |
So just large vectors of numbers: it's an embedding of content into a big space. Think of it this way: 00:17:22.040 |
I have a thousand categories I can use when I'm trying to describe something, and I take 00:17:27.200 |
a thing and I go through and give it a number in each of these categories. 00:17:30.420 |
And that combination of numbers describes it really well. 00:17:33.000 |
And what the algorithm does is it learns the multi-dimensional spaces in which these preference vectors exist. 00:17:39.360 |
It sort of learns these regions where content here does really well with this person. 00:17:47.580 |
I mean, no one's sitting there programming, create echo chamber, execute, or flipping an echo 00:17:52.980 |
chamber switch. It is just an unavoidable consequence of this type of: I'm going to observe your engagement 00:17:59.340 |
and then use that to try to figure out where in the space of possible content things create engagement for you. 00:18:03.440 |
So you're going to start seeing a lot of stuff that's similar. 00:18:05.880 |
So if there's something you lean towards on an issue, you're going to be seeing a lot more of it. 00:18:13.300 |
And you're going to see stuff that's more and more engaging within that space, which typically 00:18:16.600 |
would be the stuff that's a little bit more strident. 00:18:20.640 |
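To make the echo chamber mechanics concrete, here's a minimal TypeScript sketch of engagement-driven curation under the embedding framing just described. It's a toy illustration, not any platform's actual recommender: the two dimensions, the update rule, and all the numbers are made up. The point is only that ranking by similarity plus nudging preferences toward whatever gets engaged with narrows and escalates the feed on its own, with no "create echo chamber" step anywhere.

```typescript
// Toy sketch of engagement-driven curation. Content and user preferences share an
// embedding space; the feed ranks items by similarity to the user's vector, and every
// engagement pulls that vector toward the item engaged with.

type Vec = number[];

const dot = (a: Vec, b: Vec): number =>
  a.reduce((sum, v, i) => sum + v * b[i], 0);

// Rank candidate items by predicted engagement (similarity to the user's vector).
function rankFeed(user: Vec, items: Vec[]): Vec[] {
  return [...items].sort((a, b) => dot(user, b) - dot(user, a));
}

// After an engagement, move the user's vector a step toward the engaged item.
function updatePreferences(user: Vec, engaged: Vec, rate = 0.3): Vec {
  return user.map((v, i) => v + rate * (engaged[i] - v));
}

// Two made-up dimensions for illustration: [topic lean, stridency].
let user: Vec = [0.1, 0.1]; // a slight lean, nothing dramatic
const catalog: Vec[] = [
  [0.2, 0.1],  // mild take, same lean
  [0.8, 0.6],  // stronger take, same lean
  [0.9, 0.9],  // very strident, same lean
  [-0.5, 0.2], // the other side
];

for (let round = 0; round < 10; round++) {
  const top = rankFeed(user, catalog)[0]; // assume the user engages with the top-ranked item
  user = updatePreferences(user, top);
}

console.log(user); // drifts toward [0.9, 0.9]: the most strident content in the region you already lean toward
```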
So that's the first issue: we're hearing kind of the same stuff, an algorithmic echo chamber. 00:18:27.800 |
The other issue: remember I mentioned that it's important that this is people conversing, not just you consuming content. 00:18:33.420 |
That is going to fire up in your mind the tribal community circuits. 00:18:39.980 |
These are people that we're talking back and forth with. 00:18:44.160 |
So you have echo chambers plus tribal community circuits fired up. 00:18:48.140 |
And these things come together and they create in your mind this virtual tribe that has an epic fight on its hands. 00:18:59.760 |
It's not a sort of knob that can be turned in the algorithms that run these social networks. 00:19:04.260 |
It is an inescapable consequence of how these platforms function. 00:19:08.560 |
Those tribal community circuits are incredibly powerful, because throughout most of our 00:19:14.340 |
history as modern Homo sapiens, so let's go back 200,000 to 300,000 years, this is how we survived. 00:19:21.180 |
Other tribes, we had to be very, very suspicious of them. 00:19:28.480 |
If you're not in my tribe, I worry that you're going to kill me or steal our food. 00:19:33.960 |
There's nothing more compelling than those tribal community circuits. 00:19:38.560 |
You can think of the entire history of human civilization 00:19:42.780 |
(Yuval Harari is good on how we used abstractions to do this) as an effort to try to overcome 00:19:49.020 |
the tribal community circuits that are so deep in our minds. 00:19:52.800 |
And this is basically the point of the book Sapiens, if you want to sort of go down a rabbit 00:19:56.420 |
hole here, is that humans' abilities to organize around abstract causes in tribes basically hijacked 00:20:02.580 |
our tribal circuits and allowed us to expand the scope to which they applied. 00:20:06.600 |
So now they could apply to like our entire country or everyone who shares a religion. 00:20:10.880 |
And it's what allowed humans to work together in large numbers, which was key to our success. 00:20:16.240 |
But the point is, our success came from figuring out how do we actually like hijack, overwhelm, 00:20:21.460 |
or otherwise redirect these tribal circuits because they are so strong. 00:20:24.400 |
Curated conversation platforms just say, back to the Paleolithic, let's go, right? 00:20:30.160 |
And that's why that demoderation is so strong. 00:20:32.960 |
And it's not, oh, that's a bug in the system 00:20:37.540 |
that we need to turn off so I can get the real value out of these platforms. 00:20:44.280 |
That tribal activation is the thing that's feeding into the addiction loop. 00:20:48.040 |
So it's very difficult to avoid demoderation if you're using these things a lot. 00:20:54.520 |
And the more you use them, the more demoderated you get; look in the mirror, you know you are. 00:21:01.800 |
These tribes are, by the way, almost random in terms of what you end up echo-chambered into. 00:21:05.260 |
It's like your slight preferences plus whatever the algorithm pushed you toward. 00:21:08.540 |
Like, it can be pretty arbitrary, but just think about it right now: whether it's political 00:21:13.920 |
and you're on the left and thinking about MAGA people, or you're on the right and you're 00:21:18.300 |
thinking about, like, the blue-hair people, or, you know, you're a super health-focused mom who's 00:21:23.660 |
thinking, man, these pharmaceutical companies are poisoning my kids. 00:21:30.200 |
Think about the push to dehumanize that other group: like, man, they're just terrible. 00:21:40.360 |
When you're in that mindset, you keep sliding down the slope of terribleness. 00:21:45.960 |
That is what eventually gets you to disassociation because you get to a point where you say, 00:21:50.680 |
the other side, the side I don't like, is so terrible, and yet nothing is happening about it. 00:21:56.080 |
We're not, you know, cutting off half the country or whatever it is. 00:22:00.020 |
Like they're not seeing that I'm just right and changing their views to align with mine 00:22:03.440 |
and no one seems to care, uh, or whatever the issue is. 00:22:06.780 |
And then either nihilism or rage follows. 00:22:09.640 |
Like, I just can't fathom this, and it sort of breaks you. To make matters worse, especially 00:22:14.760 |
if you're younger, a side effect of a conversation platform is that it's taking the place of actual in-person socializing. 00:22:22.260 |
So you're also going to become more actually socially isolated. 00:22:25.220 |
So huge demoderation plus more social isolation, whether you're young (this also happens a lot 00:22:30.400 |
with older people as well, who are retired and later in life), man, that gets you 00:22:34.660 |
to disassociation. Before you know it, you're showing up at Comet Pizza because, you 00:22:41.380 |
know, someone's got to free the kids from the secret caves underneath, or your nihilistic withdrawal takes over. 00:22:47.880 |
One thing leads to another and it is built in to the technology. 00:22:55.780 |
This is not fixable; there's no version of Twitter that doesn't have this, because the 00:23:01.280 |
things that drive it are the facts that it's people coming together in virtual interaction 00:23:09.120 |
and that what they see is algorithmically curated. You can't have a version of those services without conversation and algorithmic curation. 00:23:15.920 |
That's like saying to Pizza Hut, look, we don't want to put you out of business, but 00:23:19.240 |
is there a way you can exist without selling pizza? 00:23:23.660 |
This is what these conversational algorithmic platforms do. 00:23:25.940 |
You have a huge number of people come together onto a platform to have conversation. 00:23:29.760 |
You have to have algorithmic curation because what else are you going to do? 00:23:32.320 |
You have 500 million people submitting tweets a day. 00:23:35.980 |
And if you're going to curate it, you're going to curate it on like what people want to see. 00:23:45.380 |
It is an unavoidable side effect of these technologies. 00:23:50.980 |
So you're probably saying yes, but I'm not going to end up at the bottom of this slope. 00:23:59.500 |
A lot of people use these services, but they don't end up on, you know, Rumble or 00:24:04.860 |
Groyper-type Discord servers. 00:24:09.660 |
You can resist the pull of gravity, but let's think about what that costs. 00:24:14.520 |
I'm going to bring up this picture one more time on here. 00:24:20.100 |
So yes, I'm going to draw a little picture here. 00:24:22.520 |
You can, you know, at some point, hold on, I don't know, that looked like a mountain climber 00:24:33.360 |
So you, you can arrest your fall at some point, but here's the issue. 00:24:38.440 |
Number one, imagine over here, we have a scale. 00:24:43.300 |
Think about this as like a scale of flourishing, right? 00:24:48.160 |
The higher up you are on the scale, the more you're flourishing as a human. 00:25:00.160 |
You stop yourself somewhere on the slope of terribleness, but the farther you've gone down, the lower your flourishing. 00:25:07.020 |
Maybe you arrest your fall somewhere here in the demoderation area, but you've arrested your life in a place that's worse. 00:25:11.880 |
You're flourishing less than if you weren't on the slope at all. 00:25:14.900 |
To make matters worse, resisting further slide down this slope requires a lot of mental energy. 00:25:21.740 |
To not be demoderated when you're using the services all the time requires like a real application of will. 00:25:28.740 |
When you're demoderated, then not to make that final slide towards disassociation requires a lot of will. 00:25:33.880 |
To be fully disassociated and resist the call to come join other disassociated people who understand your disaffection on these sort of dark web type services, that takes a lot of mental energy. 00:25:43.600 |
So, yes, you can maybe prevent yourself, and most people do, of course, from getting to the bottom of the slope of terribleness, but they get stuck in a place where they have reduced flourishing and it's eating up a lot of mental energy that you could otherwise be spending on stuff that matters to you more in your life. 00:25:57.560 |
So why is it worth it to be less happy and to waste the huge amounts of mental energy needed to prevent yourself from descending into what is essentially a sort of living hell? 00:26:08.680 |
Once you realize the real dynamics at play here, it becomes hard to justify why you're still spending time with these curated conversational platforms. 00:26:22.960 |
You want to hear information that you think is otherwise being suppressed? 00:26:26.740 |
All right, subscribe to the newsletter, subscribe to the podcast, do the old-fashioned work of here is a real individual, here is why I think they are a good person to listen to, here is their credentials, here is how they convinced me, and I will listen to them writing out a thing once a week and there's no curation and it's not conversation. 00:26:47.860 |
You don't have to be on the curated conversation platforms. 00:26:50.860 |
Being on those platforms just tricks you into thinking you're a part of something. 00:26:54.860 |
You imagine often if you're engaged in combat on these platforms, you imagine you're at the front of the crowd marching on that bridge from Selma. 00:27:03.860 |
In reality, you're in an attention factory, punching your time clock while your billionaire overseers laugh as their net worth goes up due to your efforts. 00:27:12.860 |
You want to be a part of something, actually go be a part of something, not on a screen. 00:27:16.860 |
And maybe you're using it because you're bored. 00:27:19.860 |
Even on your phone, there's better high-quality entertainment. 00:27:28.860 |
You can listen to podcasts, read newsletters, read books, watch movies. 00:27:33.860 |
You have literally something like half a trillion dollars' worth of CapEx spent on entertainment available at your fingertips for like $30 a month. 00:27:42.860 |
Like there's plenty of other things you can do to not be bored that doesn't put you on the slope of terribleness. 00:27:50.860 |
So what we'll do here, like we always do, is move on to our takeaways section. 00:27:59.860 |
All right, so let's get to my takeaways here. 00:28:13.860 |
When it comes to these curated conversation platforms, we've been telling ourselves a reassuring story. 00:28:20.860 |
Yeah, they're not great, but most of the worst harms don't impact me. 00:28:28.860 |
But in reality, these harms are all connected to a common slope of terribleness. 00:28:33.860 |
You use them and you'll start sliding down the slope. 00:28:35.860 |
It'll make your life worse as you move further. 00:28:38.860 |
You'll likely stop yourself eventually before you get to the hellish bottom. 00:28:41.860 |
But this resistance requires energy, mental energy 00:28:43.860 |
that you could be spending on things that matter or people you love. 00:28:49.860 |
I want to end this deep dive by reading from the conclusion of the essay that I published on my newsletter a couple of weeks ago. 00:28:58.860 |
To save civil society, we need to end our decade long experiment with curated conversation platforms. 00:29:17.860 |
I'm going to talk to Jesse about his reactions to this deep dive. 00:29:21.860 |
And then we have a collection of questions from my listeners about exactly this issue of social media in their life. 00:29:26.860 |
So we'll get into how people are grappling with this. 00:29:29.860 |
But first, we just need to take a quick break to talk about one of the show's sponsors. 00:29:35.860 |
So here's the thing about being a guy and aging. 00:29:39.860 |
When you're young, you don't think about your skin. 00:29:42.860 |
You spend time in the sun, like maybe only occasionally cleaning the grime off your face with a Brillo pad. 00:29:48.860 |
And you still end up looking like Leonardo DiCaprio in Growing Pains. 00:29:51.860 |
And then one day you wake up and you're like, wait a second. 00:29:56.860 |
Why didn't anyone tell me I'm supposed to take care of my skin? 00:30:05.860 |
That's where Caldera Lab comes in: their high-performance skin care is designed specifically for men. 00:30:08.860 |
It is simple, effective, and backed by science. 00:30:11.860 |
Their products include The Good, an award-winning serum packed with 27 active botanicals and 3.4 million antioxidant units per drop. 00:30:21.860 |
I thought it was like 3-3 million, but I counted. 00:30:24.860 |
The eye serum, which helps reduce the appearance of tired eyes, dark circles, and puffiness. 00:30:29.860 |
And the base layer, a nutrient-rich moisturizer that's infused with plant stem cells and snow mushroom extract. 00:30:37.860 |
In a consumer study, 100% of men said their skin looks smoother and healthier. 00:30:41.860 |
So men, if you want to look more like Growing Pains-era Leonardo DiCaprio and less like a grizzled pirate captain, you need to try Caldera Lab. 00:30:50.860 |
Skin care doesn't have to be complicated, but it should be good. 00:30:53.860 |
Upgrade your routine with Caldera Lab and see the difference for yourself. 00:30:58.860 |
Go to calderalab.com/deep and use deep at checkout for 20% off your first order. 00:31:05.860 |
I also want to talk about our friends at Miro. 00:31:07.860 |
Let me tell you, my team here at my media company has been enjoying using Miro. 00:31:14.860 |
Miro is a shared web-based workspace that teams can use to work together more effectively. 00:31:19.860 |
We use it here to keep track of our upcoming podcast episodes and newsletter topics. 00:31:27.860 |
And the cool part is we can brainstorm right there in the shared document. 00:31:31.860 |
So we can sort ideas and put things in the sticky notes. 00:31:36.860 |
So like the script for an episode will just show up right in the planning document where we're talking about it. 00:31:41.860 |
But partially why I'm excited today is that Miro has been integrating AI features into this product in a way that I think makes a big difference. 00:31:51.860 |
For example, you can now use Miro AI to turn unstructured data like sticky notes or screenshots into diagrams or product briefs or data tables and maybe make prototypes of an idea in minutes. 00:32:03.860 |
There's no need to toggle over to another AI tool and write out long prompts. 00:32:06.860 |
The AI is integrated right into the Miro canvas where you're already working. 00:32:31.860 |
I'm a great artist is what I think the, what it is there. 00:32:36.860 |
I mean, what I was trying to do there, Jesse, is, you know, this is what I noticed in the reaction to my emails. 00:32:42.860 |
With the email I sent out, people are like, yeah, I mean, yeah, it's bad, but they needed a push. 00:32:47.860 |
And it's one of these things where just people knowing, like, I think this thing is bad doesn't always lead to action. 00:32:54.860 |
And in this case, this is what I think is going on. 00:32:56.860 |
This is why it's bad is because you're constantly being pulled down the slope. 00:33:04.860 |
And to stop from going to hell, you have to like expend all of this energy. 00:33:16.860 |
So, I mean, what I was saying earlier was, you know, it depends on how you use these things. 00:33:19.860 |
The things that make the slope inevitable are that it's conversational and that it's algorithmically curated. 00:33:24.860 |
That's why I said Pinterest is probably not in there: it's not really conversational, right? 00:33:29.860 |
Like, you're looking at a board that someone else made, but you don't feel like you're part of a conversation. 00:33:33.860 |
So it doesn't activate those community circuits necessarily. 00:33:36.860 |
YouTube or Instagram, I think it depends on how you use it. 00:33:39.860 |
So Instagram, and I guess LinkedIn would be similar. 00:33:43.860 |
One way people use it, it tends to have a higher concentration of a sort of expert community. 00:33:48.860 |
Like, Instagram is where I go because there's, like, a comedian 00:33:51.860 |
and I want to see their funny videos, or a really fit guy 00:33:54.860 |
and I want to see fitness advice, or, like, authors. 00:34:00.860 |
I want to see, you know, clips from Ryan Holiday or whatever. 00:34:04.860 |
So there's more of that non-interactive element to the way a lot of people use Instagram. 00:34:08.860 |
So you can get the compelling aspect, which is a problem, but without that sense that we're 00:34:13.860 |
all kind of conversing about things or giving our takes. 00:34:19.860 |
Now, for some people, if you're using Reels and getting into certain corners of Instagram, it can pull you into that curated-conversation dynamic. 00:34:25.860 |
YouTube is similar; like, a lot of people use it the way I do: I listen to podcasts on there. 00:34:29.860 |
There are certain creators I like, and I use it like a cable channel. 00:34:32.860 |
Like, Hey, I want to see like the, the latest Cal Newport podcast or something. 00:34:37.860 |
But if you're like on YouTube streams and you're in the comments, like there's ways 00:34:42.860 |
to use it where you can get some of the slope of terribleness effects. 00:34:44.860 |
But it's really the curated conversation platforms that drive people all the way down the slope 00:34:44.860 |
into the place where, like, nihilism- or rage-based violence comes out. 00:34:49.860 |
And I think those are probably the worst offenders. 00:34:57.860 |
You've kind of been mentioning this theme for multiple years. 00:35:03.860 |
Did you used to get a ton of hate mail about it? 00:35:06.860 |
Yeah. I mean, it's weird how this, how this changed, right? 00:35:08.860 |
Like, it used to be, because I've been on this beat for a long time, asking: 00:35:13.860 |
why do we all feel like we have to use social media? 00:35:19.860 |
Things got worse, but I'd never bought this idea that these were ubiquitous technologies 00:35:24.860 |
that we all have to use because I was a big early internet booster. 00:35:30.860 |
I didn't like the idea that three companies were going to take over the internet. 00:35:33.860 |
And the idea that to use the internet meant we had to use an app from one of a small number 00:35:37.860 |
of companies, uh, to, you know, enrich a very small number of people, that was sort of anti-internet. 00:35:42.860 |
So a lot of early internet people were very suspicious of social media. 00:35:45.860 |
I used to get yelled at. People would say, no, no, no. 00:35:54.860 |
It's, you know, whatever it is, it's how you're going to meet people. 00:36:00.860 |
It's going to bring democracy to the Middle East. 00:36:02.860 |
And then as things got worse, really around 2016, 2017, people stopped defending them. 00:36:09.860 |
They're like, well, yeah, there's some problems here, but I still get some benefits, but, you know. 00:36:16.860 |
And now I think with the political violence, it's again, clearly being driven by these. 00:36:20.860 |
Now people are like, no, I've had it. Let's go. 00:36:29.860 |
And I think the reason why they're stuck is this analysis of, like, there's all of these separate harms. 00:36:36.860 |
They can probably be fixed, but the things that affect me are more minor. 00:36:44.860 |
Because when you see these as isolated harms, and I'm only affected by the distraction harm, it's easy to stay put. 00:36:51.860 |
But when you see them connected as a slope, you're like, no, no, no, no. 00:36:53.860 |
That distraction is pulling you down a slope and it's going to keep pulling you down there. 00:36:57.860 |
And it's going to get worse and worse until you have to expend a huge amount of energy to stop it. 00:37:05.860 |
So it went from, one, you are dangerous for saying social media is in any way bad, 00:37:10.860 |
to, two, I completely agree with you, but, like, I need a push to actually take action on it. 00:37:15.860 |
So I've been talking about this for a long time, but what I've been focusing on has really shifted. 00:37:20.860 |
My early Facebook articles were just me responding to what people thought were, like, airtight arguments for why you had to use Facebook. 00:37:29.860 |
I had these early essays where I would just go through their arguments and just be like, I don't think that's true. 00:37:35.860 |
Or like, that's not, there's other ways to do that. 00:37:38.860 |
I mean, it was basically a lot of me saying, I don't think you need to use Facebook, and people saying, you are crazy. 00:37:44.860 |
And I would say, explain why I need to use it. 00:37:46.860 |
And they're like, well, this or that or this. 00:37:48.860 |
And I would just very calmly be like, I mean, that's not a great argument. 00:37:52.860 |
You know, that doesn't make me want to be on there. It's weird. 00:37:55.860 |
It's this, what I'm cooking, and relationship statuses. 00:38:01.860 |
So that's how it started: pictures of dogs, poking, a lot of poking going on on early Facebook. 00:38:12.860 |
Why are we poking each other on a computer screen? 00:38:15.860 |
People are like, I gotta use these things because. 00:38:18.660 |
I mean, it took like six minutes for people to be like all hiring and business happens on social media. 00:38:25.660 |
And there's no other way that people will ever discover you, hire you, or you'll ever find a client again. 00:38:29.660 |
It took like six minutes before people were convinced. 00:38:32.660 |
If you do not have videos on Instagram, there is no way your dental practice can succeed. 00:38:39.660 |
I was like, well, how did we ever have commerce before we had social media? 00:38:45.660 |
So I want to hear what you, the audience, think as well. 00:38:47.660 |
So Jesse, we pulled some questions here from the audience about this topic. 00:38:51.660 |
Let's hear what you, my listeners wanted to know or are wondering about this. 00:38:59.660 |
I told him we'd be using access controls on it until he goes to high school. 00:39:04.660 |
I know the obvious sites and apps to block, but I worry that I'm missing some of the more fringe options that young men seem to get lost on. 00:39:11.660 |
Is there some up to date list somewhere on what I should be blocking? 00:39:16.660 |
I think you skipped the word and said middle-aged son, but you know what? 00:39:22.660 |
My middle-aged son is on social media too much. 00:39:28.660 |
I, but I could see an older mom being like, yeah, my middle-aged son is in the basement and he's like the captain of his, I don't know video games. 00:39:40.660 |
What's the one where they, uh, command gun, whatever. 00:39:46.660 |
He's just on Call of Duty and doing social media all the time. 00:39:50.660 |
Uh, this is actually a good question, right? 00:39:52.660 |
When you get to the bottom of that slope of terribleness, where the transition into violence happens, you sort of ski off the bottom into these much less regulated platforms that just don't care. 00:40:05.660 |
They're usually funded by things like gambling advertisements and whatever. 00:40:09.660 |
Like, they're funded in sort of shady ways and they get shut down a lot. 00:40:17.660 |
So the question is like, okay, if I'm the one running the blocks, like, here are the sites you can't use, 00:40:22.660 |
How am I supposed to know about like the latest sites where like disaffected young men are finding themselves? 00:40:28.660 |
One, I don't know, your model for phone moderation with your kid 00:40:35.660 |
is not actually the model that I would typically recommend. 00:40:37.660 |
So just to cover what I would typically recommend would be no phone in middle school, no smartphone. 00:40:45.660 |
Unless you need to take the bus or something and you need a phone in case of emergencies. 00:40:48.660 |
But the model I would normally push is no smartphone until high school. 00:40:52.660 |
That smartphone is going to be really locked down until 16. 00:40:56.660 |
And then we'll, we'll take a lot of that off. 00:41:05.660 |
And now we're going to start giving you sort of more free rein. 00:41:09.660 |
So during that period where you have a smartphone, but it's highly restricted, how do you find these sort of sites that are at the bottom of the slope of terribleness? 00:41:15.660 |
If you understand the slope of terribleness, I think it changes the way you think about this because no one starts at those sites at the bottom. 00:41:26.660 |
No one is like, oh, I don't, I got a smartphone. 00:41:31.660 |
I can't wait to go to some weird discord server. 00:41:34.660 |
where, like, Nick Fuentes is going to be talking about the Groyper movement, something like that. 00:41:38.660 |
You get there by sliding down the slope of terribleness. 00:41:44.660 |
She was talking about, you know, all these memes, you know, about you're kind of like clued into sort of fun internet culture. 00:41:54.660 |
And now you begin to get really caught up in some particular issue or another. 00:41:58.660 |
And because you're young, you're like, yeah, man, social groups are everything. 00:42:05.660 |
This is firing up all of my developing, sort of early-puberty brain networks. 00:42:10.660 |
And then you get disassociated eventually, because you're so worked up, and you're isolated because you're on here all the time. 00:42:20.660 |
And that's when you hear someone in one of the mainstream communities say, Hey, Hey, Hey, Hey, the real conversation about this is over here. 00:42:27.660 |
Like, they're the man, and you can fill in so many other things for what that means. 00:42:31.660 |
They're going to censor us, but this is where the real conversation is. 00:42:34.660 |
And then that's when you make your way over to the, the really dark places. 00:42:37.660 |
So if you are blocking the mainstream curated conversation platforms for your 14 year old, they're not going to enter the slope. 00:42:47.660 |
So you don't have to worry as much about these sorts of weird, weird places. 00:42:51.660 |
If you wander there fresh, it's like Harry Potter and Knockturn Alley. 00:43:03.660 |
The witches have on the weird hats and they're selling, like, bad spells, and Malfoy's doing meth. 00:43:14.660 |
It's like, Oh, stranger danger, uncomfortable. 00:43:17.660 |
It's not Disneyland; you only get there by going down the slope. 00:43:19.660 |
So anyways, what I'm trying to say there is if you are preventing until your kid is ready for it, access to the curated conversation platforms, they're not going to probably make their way to the weird things. 00:43:30.660 |
You have to kind of, the slope has to pull you there. 00:43:32.660 |
It's not a place where people, people like to go. 00:43:38.660 |
I have several different jobs, but the one I'm struggling with big time is my job of making YouTube videos. 00:43:47.660 |
I'm wondering, given your critiques, whether it's okay to be doing this at all. 00:43:50.660 |
And if it is okay, how to organize the deep work required to come up with an idea and script, filming it, then editing and publishing it. 00:43:59.660 |
So yeah, YouTube, because again, we said YouTube is in this sort of interesting liminal space. 00:44:03.660 |
So question number one, should you be making YouTube videos at all? 00:44:07.660 |
As all of my YouTube watchers know, I think it's a terrible thing to do. 00:44:12.660 |
I guess it depends on, I guess it depends on the channel. 00:44:14.660 |
Maybe if your channel was, like, you have a football field with one of those long measuring wheels with the roller, 00:44:26.660 |
so you can mark off yardage to see how far someone throws a ball, 00:44:29.660 |
and what you were doing was punting puppies from the end zone and seeing who could punt the puppy farther. 00:44:37.660 |
Yeah, you probably shouldn't be doing that YouTube thing. 00:44:39.660 |
But if you're doing something useful on YouTube, like a deep dive on Harry Potter's Knockturn Alley. 00:44:48.660 |
Um, so here's how I would think about it: YouTube video alone is not bad. 00:44:51.660 |
I think of it as independent media in video form, versus audio or text. 00:44:57.660 |
I think independent media itself is something that is valuable, and there's three main forms of independent media. 00:45:05.660 |
There's the written form, which we see in things like blogs or newsletters. 00:45:07.660 |
There's audio, which you see primarily in podcasts. 00:45:12.660 |
And there's video. Now, YouTube is the easiest platform to use to sort of get your video seen. 00:45:21.660 |
But what you need to be aware of, if you're creating independent video on YouTube, you have to be very wary of capture. 00:45:32.660 |
And you see this as a creator in those view numbers and those subscriber numbers. 00:45:38.660 |
And the thing you have to be so careful about is capture, which says, I am going to just follow the slope of how do I make those numbers higher and higher and higher. 00:45:52.660 |
Now, again, I, YouTube tends not to be as dangerous for people as these other, these other platforms. 00:45:57.660 |
It can be, but it tends not to be. But capture can put you in a place where now you're chasing the algorithm. 00:46:02.660 |
It puts you in this sort of weird engagement-farming type of land. 00:46:11.660 |
I mean, this is what like Mr. Beast did is he just straight up followed what makes more people watch. 00:46:18.660 |
And then he kept ratcheting that up and ratcheting that up, and where he ended up, as the most-subscribed channel on YouTube, is these, like, multimillion-dollar videos where there's no block of more than seven seconds in which something compelling isn't being promised to you or that promise being delivered on. 00:46:36.660 |
Like, do, do, do, do: visually amazing, stakes are high. 00:46:40.660 |
So, like, you know, I don't know, it's not terrible, but it's kind of weird where you end up if you follow the algorithm. Or you end up like a lot of people in political waters: okay, I'm going to get more strident on this issue because more people view it. 00:46:53.660 |
You know, you go from, I'm Mr. Health Advice, to, if you look at a vaccine, your child's hair is going to fall out, because you get more and more attention for it. 00:47:05.660 |
So you've got to be careful about capture, but if you, you're producing something you're proud of and you want it to do well, but you're not willing to completely follow the algorithm, then, you know, I think it's okay. 00:47:15.660 |
In terms of how you actually schedule doing YouTube creation: you'll attest to this, I mean, it's just time consuming, right? 00:47:21.660 |
Don't underestimate that. The principle that's at play here is the principle from Slow Productivity of working at a natural pace. 00:47:29.660 |
Don't pretend a video doesn't take real time to make just because a YouTube video is quick to watch. 00:47:35.660 |
Like, yeah, I'm going to put aside an hour before work and then after work on Friday. 00:47:39.660 |
And that's how I'm going to run my YouTube channel. 00:47:48.660 |
It's like woodworking: I've got to plane it and shape it and glue this. 00:47:53.660 |
There's a lot of skill involved, and it takes time to come back to it. 00:47:54.660 |
It's going to require a lot of time for me to do. 00:47:56.660 |
So, so that's the other thing to keep in mind to do this. 00:48:02.660 |
It's not, you know, like you're just going to stumble in front of a camera and everyone rushes to watch you. 00:48:10.660 |
It's like impossible to get people to watch things. 00:48:12.660 |
You know, you really got to do a good job. 00:48:15.660 |
So no, I'm not mad at you for making YouTube videos. 00:48:17.660 |
Don't get algorithmically captured and be realistic about how hard it is. 00:48:23.660 |
And if you don't have that time, don't pretend like you can squeeze it in. 00:48:26.660 |
I always tell my kids, you don't want to become a YouTuber. 00:48:39.660 |
There's so many easier ways to make the amount of money that like most full-time YouTubers make. 00:48:43.660 |
But they have this idea that it's easy and that everyone's making millions of dollars. 00:48:48.660 |
I'm like, six people are making millions of dollars, and it's not going to be you. 00:48:51.660 |
But like you said, if you're not using it to make money, 00:48:54.660 |
you can take the slow productivity approach and occasionally put out videos. 00:49:05.660 |
If you think you're gonna become the next Mr. Beast, it's a problem. 00:49:12.660 |
I can't help feeling like it's too late to put the social media genie back in the bottle. 00:49:18.660 |
Don't we need to focus instead on how to use it better? 00:49:21.660 |
I get this question a lot when I give talks about social media, especially to young people 00:49:25.660 |
or the parents of young people. It's the, like, come on. 00:49:29.660 |
What we need is to learn how to use it, whatever it is, you know, responsibly. 00:49:37.660 |
And here's, here's what I think the fundamental mixup is. 00:49:41.660 |
The social media companies completely hijacked the web 2.0 revolution. 00:49:48.660 |
So, Web 2.0, for those who don't follow, like, internet trajectories: 00:49:54.660 |
Web 2.0 was about the worldwide web, making it easier for information to both be published and consumed. 00:50:02.660 |
But mainly making it easier to publish information. 00:50:04.660 |
So instead of having to go and manually edit a website, or have a sort of complicated database-driven setup, like, I'm going to pull out the Amazon listing and put it on the screen... 00:50:15.660 |
Basically, unless you had a serious website or a serious web developer, it was hard to update or put information on the web. 00:50:22.660 |
And the web 2.0 was like a collection of different advancements. 00:50:25.660 |
Typically it was a lot of stuff that was actually advancements in JavaScript, running alongside newer HTML, that made it easier to do things like: I can type something into a form and it shows up and everyone can see it. 00:50:36.660 |
So blogs were like one of the first early web 2.0 things. 00:50:41.660 |
It was like, yeah, you can publish information on the web without having to code, without having to hire like a web developer. 00:50:48.660 |
But we got things like blogs and these kinds of early social-network-type things like Friendster, where different people could post things. 00:50:54.660 |
We had places where you could build websites online using an online interface without code. 00:51:00.660 |
And so that was the Web 2.0 revolution. Basically, Web 1.0 said, here's a place to post information so that anyone can consume it. 00:51:10.660 |
All you need is a browser, and anything that's been posted, 00:51:13.660 |
you can consume. Web 2.0 is like, now anyone can produce stuff to be consumed as well. 00:51:21.660 |
One of the more important revolutions of the last hundred years. 00:51:26.660 |
Then the social media companies came along and said, Oh, people publishing information is interesting because that could be a lot of information. 00:51:36.660 |
If you have a lot of people publishing stuff, you can create amounts of information that you would never be able to come anywhere near by hiring people. 00:51:43.660 |
If you want to hire people to create content like newspaper writers or whatever, it's expensive and it's like pretty limited how much stuff you produce. 00:51:52.660 |
You had like Nick Denton and Gawker and people saying like, Hey, can we hire like 20 people and have them write 10 blog articles a day? 00:51:59.660 |
But the social media companies say that's amateur hour. 00:52:01.660 |
You're hiring like 10 people at Gawker to publish 20 things a day. 00:52:05.660 |
And you have 200 things published a day that you're selling ads on. 00:52:09.660 |
Why don't we get 200,000 people for free to write information for us? 00:52:16.660 |
So they took over Web 2 by building these beautiful websites, right? 00:52:19.660 |
Facebook's triumph early on was in part because Mark and the original programmers he worked with really understood these cutting-edge new Web 2.0-type capabilities. 00:52:33.660 |
They understood the new technology, things like Ajax, which allowed for, you know, asynchrony in JavaScript. 00:52:41.660 |
So you could have a page already loaded and pull in new information to display without having to reload the whole web page. 00:52:46.660 |
They were on top of that technology; the early Facebook website was, like, a beautiful Web 2.0 product. 00:52:50.660 |
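For anyone curious what that "pull in information without reloading the page" pattern looks like, here's a minimal TypeScript sketch using the modern fetch API rather than the original XMLHttpRequest. It's an illustration, not Facebook's code; the endpoint and element ID are invented.

```typescript
// Minimal sketch of the Ajax idea: the already-loaded page asks the server for new
// data in the background and updates one element, with no full page reload.
// The endpoint and element ID below are hypothetical.

interface Notification {
  text: string;
}

async function refreshNotifications(): Promise<void> {
  // Fetch fresh data asynchronously (modern replacement for XMLHttpRequest).
  const response = await fetch("/api/notifications");
  const items: Notification[] = await response.json();

  // Update just one part of the page that's already rendered.
  const list = document.getElementById("notifications");
  if (list) {
    list.innerHTML = items.map((item) => `<li>${item.text}</li>`).join("");
  }
}

// Poll every five seconds; the rest of the page never reloads.
setInterval(refreshNotifications, 5000);
```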
And that was like very, very easy for you to publish stuff for people to see. 00:52:57.660 |
Forget blogs, forget like interactive creative websites, just come to our gardens, our social media gardens. 00:53:04.660 |
And it's easy there to publish your own material. 00:53:07.660 |
Then they figured out another thing, which was people get bored publishing their own material. 00:53:13.660 |
Because what happens is that anyone who had a blog in like 2005 knows this. 00:53:17.660 |
Well, what happened is you would publish stuff and no one cared. 00:53:21.660 |
It was the equivalent of like putting stuff on the bulletin board at the coffee shop. 00:53:29.660 |
So you'd think, I'm going to go, you know, play a video game, right? 00:53:34.660 |
So the second innovation the social media companies had was we'll offer you attention. 00:53:44.660 |
Facebook was like, we'll have this sort of collectivist type of social compact that everyone will post nonsense. 00:53:50.660 |
But everyone will also look at the nonsense posted by the people they know. 00:53:53.660 |
In fact, we'll bring it to you because we organize it on these walls and you friend people and we can tell you like, hey, so and so just said something. 00:53:59.660 |
And then you go over and do this performative thing where you're like, give a thumbs up or like you leave a comment like that's great. 00:54:04.660 |
And later they come back and they're like, oh, I like that, or whatever. 00:54:06.660 |
And now what you're offering people is it's easy to publish content and you'll feel like people are paying attention to you. 00:54:17.660 |
Anyone can publish and we'll make you all feel like you're important. 00:54:23.660 |
And now you have massive amounts of content that you can monetize with advertisements, right? 00:54:30.660 |
I have 200 million people publishing all day long, because their friends will like it, and we can put ads in the middle of all of that. 00:54:38.660 |
So that was how social media took over Web 2. 00:54:41.660 |
And then social media had its own revolution where, after everyone was using these services, they said, we're no longer going to use this model of people paying attention to you. 00:54:50.660 |
Now we're going to bring in algorithmic curation and say, well, once you have this app here, we're just going to pull from the billion people who are using this. 00:54:57.660 |
We'll pull stuff that's going to press your buttons. 00:54:59.660 |
And now what we're selling you is like, eh, it's not your friend liking something. 00:55:03.660 |
You're kind of bored with that, but we'll just give you something every time you open this that's really interesting. 00:55:07.660 |
Once we already captured you, we'll switch you over to algorithmic engagement because now you'll use it 10 to 20 X more per day than when you were just looking for likes because you only look for likes after you publish something. 00:55:23.660 |
That's a very specific, successful, but very specific business model. 00:55:29.660 |
And in exchange, algorithms show you the most compelling stuff that other people publish. 00:55:39.660 |
If you shut down the five major social media companies tomorrow, the internet would still be there. 00:55:49.660 |
Apps and sites where you can gather with like-minded people would still be there. 00:56:00.660 |
The underlying protocols that let anyone on any of the networks that make up the internet talk to anyone else, and read anything that's accessible from those networks, would still be there. 00:56:12.660 |
Social media is just one particular use of all that, 00:56:15.660 |
and one that happens to create a slope of terribleness that we all hate. So no, I don't think social media is fundamental, 00:56:20.660 |
or that we have to just accept this particular use of it. 00:56:24.660 |
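As a small illustration of that point, here's a sketch of the open web with no platform in the middle: an ordinary HTTP request to a site's RSS feed, printing the headlines. The feed URL is a placeholder, and the regex parsing is a deliberate simplification of what a real feed reader's XML parser would do.

```typescript
// Sketch of reading the open web directly: a plain HTTP(S) request to a site's
// RSS feed, printing the headlines. The URL is a placeholder, and the regex is
// a simplification of proper XML parsing.

async function listHeadlines(feedUrl: string): Promise<string[]> {
  const response = await fetch(feedUrl); // plain HTTP request, no login, no algorithm
  const xml = await response.text();     // RSS is just XML text

  // Crude extraction of <title> elements; the first one is the feed's own name.
  const titles = [...xml.matchAll(/<title>([^<]*)<\/title>/g)].map((m) => m[1]);
  return titles.slice(1);
}

// Example usage with a placeholder feed URL.
listHeadlines("https://example.com/feed.xml").then((headlines) => {
  headlines.forEach((h) => console.log(h));
});
```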
The other thing I would argue is, okay, let's say you did believe that you needed to use social media to be successful in the world. 00:56:32.660 |
These are low friction consumer facing products. 00:56:36.660 |
They're made to be something you can pick up immediately. 00:56:38.660 |
You can learn how TikTok works in, like, nine seconds. 00:56:43.660 |
I signed up for TikTok and documented like what the process is. 00:56:46.660 |
So it's not something you need to teach people about. 00:56:49.660 |
And in terms of, like, oh, let's teach you how to use this well, 00:56:52.660 |
you're not going to compete with the slope of terribleness. 00:56:55.660 |
It's just more compelling than any advice to not use this too much. 00:57:03.660 |
You can say, you know, don't like be too strident. 00:57:07.660 |
You know, listen to people from the other side. 00:57:10.660 |
You're talking about a hundred thousand year old paleolithic circuits that are being pushed on again and again. 00:57:16.660 |
And again, and again, that advice ain't going to stick. 00:57:22.660 |
Like, this idea that we just have to use it, or we need to practice how to use it, 00:57:26.660 |
or that if we just give people literacy, they're going to use these things better. 00:57:34.660 |
Case studies are where people write in to talk about their experience using the type of stuff we talk about on the show. 00:57:38.660 |
We found a case study that was about someone's experience leaving social media. 00:57:43.660 |
Get some case study music to get in the mood here, Jesse. 00:58:00.660 |
Scott said, I stumbled upon your excellent work after purchasing Deep Work on a whim and wanted to congratulate you on both defining and providing workable solutions for one of the biggest challenges of our always-connected modern age. 00:58:13.660 |
I was one of the first musicians to use social media to build a large following in the early 2010s as a pianist and founder of a popular music group. 00:58:22.660 |
For a long time, I was a champion of social media tools, in particular, the way they granted artists the ability to build a worldwide audience while bypassing record labels and other traditional industry gatekeepers. 00:58:35.660 |
Fast forward to now, and it's clear to me that social media has actually had a net deleterious effect on the arts. 00:58:42.660 |
Just as we see in knowledge work, engagement-based algorithms prioritize the cheap dopamine hit of shallow content rather than the kind of deep, thoughtful, artistic work that nourishes the soul. 00:58:53.660 |
Some of the true masters of their craft today are reduced to performing inane TikTok dance challenges or parroting trendy ideological talking points instead of creating the brilliant work that they are uniquely suited to create. 00:59:05.660 |
I could go on and on, but I'm certain I'm preaching to the choir here. 00:59:08.660 |
I deleted my own personal socials, with the exception of Substack, in 2021, and I think it has greatly improved my creative output at the expense of my own cultural relevance. 00:59:19.660 |
I think that it is the trade-off that artists and academics and everyone else will have to be incentivized to accept if we were to reverse what the last decade or so of McLuhan's Global Village on steroids has wrought. 00:59:33.660 |
All right, so I think that's a great case study. 00:59:37.660 |
For this musician, social media was everything. 00:59:42.660 |
Worldwide audience, get past the gatekeepers, go straight, you know, get music straight to people. 00:59:47.660 |
Great Kevin Kelly arguments, a thousand true fans. 00:59:51.660 |
Then he's like, it got worse, and the net effect was negative. 01:00:03.660 |
And I mentioned this in my last answer, but just to emphasize it one more time, what social media is has changed. 01:00:08.660 |
A lot of the things that we really liked about social media, the access to new people, the access to new ideas, right? 01:00:18.660 |
That got worse when the companies made what I call the algorithmic turn. 01:00:23.660 |
So again, as I talked about before, the early social media model was not so heavy on algorithmic curation. 01:00:33.660 |
It was about, it's just interesting enough to talk to interesting people, to follow interesting people, to friend interesting people, but it wasn't so much about algorithms. 01:00:42.660 |
And that's because the original goal, if you were an early-stage social media company, was user growth. 01:00:50.660 |
Like you need to be something that feels like it's super culturally relevant and offers a good experience. 01:01:03.660 |
It was, you know, less important that you used it all day long, because they weren't even really selling ads yet. 01:01:08.660 |
They just wanted everyone to have to use a service like Facebook. 01:01:10.660 |
So it used to be at first, again, let's do the evolution real quick. 01:01:20.660 |
There's no like button, but we comment on each other's stuff. 01:01:24.660 |
And we feel like there's a little bit of attention. 01:01:26.660 |
Like, I'm publishing something and people care about it, and that feels good. 01:01:30.660 |
Then they went towards, let's get famous people on here and interesting people on here. 01:01:37.660 |
And then it was like, yeah, it's not just about your friends. 01:01:41.660 |
You can also follow or friend really interesting, like creative people or thinkers or artists. 01:01:49.660 |
So like now I see what my friends are up to, but also I see what this like political thinker is saying. 01:02:01.660 |
Here are all of the things that have been tweeted by people that you follow in order of when they tweeted with the most recent thing first. 01:02:10.660 |
Like, here is your feed. At first you had to go to other people's walls. 01:02:13.660 |
You'd go to their sites. Then they gave you this feed, but the feed at first was just strict: 01:02:19.660 |
this person just did something, this friend just did something, that friend just did something. 01:02:22.660 |
And some of these friends could be interesting people. 01:02:24.660 |
That was sort of the sweet spot era where it was non algorithmic, but you had a lot of interesting people on there. 01:02:33.660 |
I could just follow on Twitter or on Facebook. 01:02:36.660 |
Someone who was like an author I admired or a musician or something, and kind of see what they were up to. 01:02:43.660 |
Like this is when people were like pretty happy about this stuff. 01:02:48.660 |
The algorithmic turn began around 2012, right around 2012 to 2015. 01:03:00.660 |
They hired the grownups, you know, Zuckerberg brought on Sandberg, et cetera. 01:03:04.660 |
And they said, look, we have to actually make money. 01:03:07.660 |
And to make money, we have to care about how much these users are using this, right? 01:03:17.660 |
2012 was this tipping point where we had more than half of the adult population using smartphones in the U.S. 01:03:27.660 |
They now have constant access to these services. 01:03:34.660 |
We are going to select with algorithms or with the assistance of algorithms what you see so that you get more consistent brain rewards. 01:03:44.660 |
And those short term reward circuits are going to get more exposures to positive learning. 01:03:49.660 |
It's going to be this virtuous loop that makes us a lot of money. 01:03:53.660 |
And we've been in the algorithmic turn for the last decade, increasingly pushed towards algorithmically curated content. 01:03:58.660 |
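To make the contrast concrete, here's a toy sketch of the two feed models being described: the same set of posts ordered reverse-chronologically versus ordered by predicted engagement. The field names and numbers are invented for illustration; no platform's real ranking model is this simple or public.

```typescript
// Toy illustration of the algorithmic turn: the same posts, two orderings.
// The fields and scores below are invented for illustration only.

interface Post {
  author: string;
  postedAt: number;            // Unix timestamp (seconds)
  predictedEngagement: number; // a model's guess at likes, comments, watch time
}

// Pre-2012 model: a strict reverse-chronological feed of the people you follow.
function chronologicalFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => b.postedAt - a.postedAt);
}

// Post-algorithmic-turn model: rank by whatever is predicted to keep you engaged,
// regardless of who posted it or when.
function engagementFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => b.predictedEngagement - a.predictedEngagement);
}

const posts: Post[] = [
  { author: "your friend", postedAt: 1_700_000_300, predictedEngagement: 0.2 },
  { author: "stranger posting outrage bait", postedAt: 1_699_990_000, predictedEngagement: 0.9 },
];

console.log(chronologicalFeed(posts)[0].author); // "your friend"
console.log(engagementFeed(posts)[0].author);    // "stranger posting outrage bait"
```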
You need that piece for the slope of terribleness to be there. 01:04:03.660 |
Without it, these companies would disappear, because you can't make that kind of money on 2010 Facebook, right? 01:04:08.660 |
It's not going to be worth a trillion dollars anymore. 01:04:11.660 |
But all the terribleness goes away without the algorithmic curation. 01:04:17.660 |
And sometimes there's nothing interesting on there, but sometimes someone has published something cool. 01:04:21.660 |
And then I'd learn about an early drop of a Taylor Swift song or something. 01:04:28.660 |
I wasn't going to look at it a thousand times a day, but it was a cool hang. 01:04:30.660 |
As soon as we made the algorithmic turn, that's when social media turned. 01:04:40.660 |
Once we began that shift, that's when these, that's why we have this feeling of like, why do these things make me feel dark? 01:04:47.660 |
Because we are at the top of that slope of terribleness. 01:04:51.660 |
And then at some point we looked up, like, whoa, I'm sliding, and that wasn't happening before. 01:05:02.660 |
So now you know what I think about this issue. In our third and final segment, we can ask, well, what are regulators going to do? 01:05:13.660 |
Well, there's some recent news I want to get into about Congress taking on some of these issues of social media. 01:05:22.660 |
But first, let's take another brief break to hear from a sponsor. 01:05:30.660 |
In the knowledge economy, the ability to communicate clearly is everything. 01:05:36.660 |
If you do it well, it helps you get promoted. 01:05:38.660 |
It is like the number one skill in the modern knowledge economy. 01:05:41.660 |
So you really should care about your communication, but it's something people struggle with. 01:05:45.660 |
Writing is hard and they don't know how to get better at it. 01:05:51.660 |
Grammarly is an essential AI communication assistant that boosts both the productivity and quality of your writing. 01:05:57.660 |
I don't think people fully realize how powerful these Grammarly writing tools have gotten. 01:06:01.660 |
Let me tell you about a particular feature that I've been messing around with in Grammarly, and I'm really enjoying it. 01:06:13.660 |
Say I've got an email I'm going to send to, you know, God, I'm in charge of so many things, 01:06:16.660 |
so there's some group of people I'm sending an email to. 01:06:18.660 |
I type out like, yeah, here's what I want to say. 01:06:21.660 |
But before sending it, I click the button on the sidebar there, because I have Grammarly plugged in, and say, hey, Grammarly, proofreading agent, 01:06:29.660 |
can you take a look? And you can select what type of things you want it to look for. 01:06:33.660 |
So you can say, first of all, let's just do like a straight check. 01:06:41.660 |
But then you can click something like sharpen opening point. 01:06:45.660 |
That's an actual option in the proofreading agent, sharpen opening point. 01:07:03.660 |
So this is the type of thing you can do with Grammarly. 01:07:06.660 |
It's like having an editor sitting over your shoulder. 01:07:09.660 |
You don't always have time to pore over everything you write. 01:07:12.660 |
So you click on these things, like the proofreading agent, 01:07:14.660 |
and it just goes in there and makes the writing sharper, fixes your mistakes. 01:07:21.660 |
It can help you even brainstorm titles or ideas. 01:07:24.660 |
Like, it can do some of this Gen AI text production as well. 01:07:32.660 |
So download Grammarly for free at grammarly.com/podcast. 01:07:41.660 |
I also want to talk about our friends at one password. 01:07:44.660 |
Back in the old days, password management at companies used to be simple. 01:07:50.660 |
You just wanted to make sure that, like, everyone had a password for their computer. 01:07:55.660 |
Uh, that's basically how we run the IT system here at Deep Questions. 01:07:59.660 |
Our whole operation, we run it on a mainframe computer that costs $75,000 a month in electricity bills. 01:08:07.660 |
And Jesse walks around with a white lab coat. 01:08:10.660 |
Why, in old footage of old computers, do people wear lab coats, right? 01:08:15.660 |
I mean, a lab coat is for, I'm doing experiments, but if I'm just walking around a room 01:08:19.660 |
with, like, magnetic tape, they always have lab coats on and, uh, clipboards. 01:08:26.660 |
My mom used to program series sevens mainframes back in the eighties. 01:08:29.660 |
I'm going to ask her; I don't think they wore lab coats. Which is a little bit far afield from one password. 01:08:38.660 |
Um, anyways, the point is this is not the way people run businesses today. 01:08:42.660 |
We don't have one big computer with people with lab coats. 01:08:44.660 |
We have service after service that people use as part of their job. 01:08:48.660 |
Some of them, you know about some of them, you don't, they just signed up for, and they have all these different passwords. 01:08:55.660 |
This is where Trellica by one password can help. 01:08:58.660 |
Trellica by one password inventories every app in use at your company. 01:09:04.660 |
Then it uses pre-populated app profiles to assess the security risk of the different applications your employees are actually using. 01:09:12.660 |
And it allows you to easily manage access, optimize your spend, and enforce security best practices across every app your employees use. 01:09:18.660 |
So again, they're using tons of stuff they signed up for on the web. 01:09:22.660 |
They're not just logging into the mainframe that Jesse maintains. 01:09:25.660 |
Trellica by one password helps you monitor and be on top of all of that. 01:09:30.660 |
You can manage shadow IT securely onboard and off-board employees and meet compliance goals. 01:09:35.660 |
Trellica by one password provides a complete solution for SaaS access governance. 01:09:39.660 |
And it's just one of the ways that extended access management helps teams strengthen compliance and security. 01:09:44.660 |
One password's award-winning password manager is trusted by millions of users and over 150,000 businesses from IBM to Slack. 01:09:50.660 |
And now they're securing more than just passwords with one password extended access management. 01:09:55.660 |
Plus one password is ISO 27001 certified with regular third-party audits and the industry's largest bug bounty. 01:10:02.660 |
One password exceeds the standards set by various authorities and is a leader in security. 01:10:07.660 |
So when we move away from our 1960s era mainframe and put away our lab coats, one password would be exactly the type of thing we'd use here at my media company. 01:10:15.660 |
So take the first step to better security for your team by securing credentials and protecting every application, even unmanaged shadow IT. 01:10:26.660 |
That's 1password.com/deep, the number 1, the word password, dot com slash deep, and type that in all lowercase. 01:10:33.660 |
All right, Jesse, let's move on to our final segment. 01:10:43.660 |
What is the government going to do about this? 01:10:52.660 |
And we know the house of representatives is really good at technology. 01:10:58.660 |
I want to play you a little bit of audio here from a couple of years ago when the house of representatives brought in the leaders of TikTok to interview them. 01:11:10.660 |
Mr. Chu, does TikTok access the home Wi-Fi network? 01:11:18.660 |
I'm sorry, I may not understand the question. 01:11:20.660 |
So if I have a TikTok app on my phone and my phone is on my home Wi-Fi network, does TikTok access that network? 01:11:26.660 |
You will have to access the network to get connections to the Internet, if that's the question. 01:11:34.660 |
Basically, what I'm saying is we're kind of screwed. 01:11:38.660 |
The US House of Representatives does not know a lot about technology, but they are trying because here's the news that we just heard about. 01:11:45.660 |
I'll put this on the screen here for people who are watching. 01:11:47.660 |
The US House panel asked online forum CEOs to testify after Charlie Kirk assassination. 01:11:57.660 |
But let's think, I just want to look at this for a second. 01:12:01.660 |
I'm going to read from the press release here. 01:12:04.660 |
A US House committee on Wednesday asked the CEOs of online platforms Discord, Stream and Twitch and Reddit to testify at a hearing following the assassination of Charlie Kirk, citing the radicalization of online forum users. 01:12:20.660 |
First of all, as indicated by that TikTok clip, not exactly a bunch of technical wizards. 01:12:26.660 |
And we're sort of seeing why this is a problem already. 01:12:33.660 |
It's somehow, like, a metaphor that the website for this article is so covered in ads. 01:12:41.660 |
But we're seeing here the issue, not just that Congress necessarily doesn't know a lot about these technologies. 01:12:47.660 |
When you understand things like the slope of terribleness, they're talking to the wrong CEOs. 01:12:52.660 |
They're talking to the CEOs of companies whose services are being picked up by disassociated youth at the bottom of the slope of terribleness. 01:13:05.660 |
We should talk to the existing companies and have them turn off this behavior. 01:13:11.660 |
Because what you need, these sort of, I think of them as like loosely regulated digital conversation spaces. 01:13:18.660 |
These are the end game of falling down the slope of terribleness, distraction, demoderation and disassociation. 01:13:24.660 |
And then finally, when you're disassociated, you want to be around other disassociated people and you egg each other on. 01:13:32.660 |
They often get shut down and then other ones come along. 01:13:35.660 |
And it's not hard technologically to have a place where people can do, like, voice chat, 01:13:41.660 |
like a discord server that anyone can spool up on their own machine, or you have some sort of forum. 01:13:48.660 |
Well, that got shut down, but then they started 8chan and then that got shut down. 01:13:54.660 |
There was, like, the Nazi website, the Daily Stormer, and that got shut down. 01:13:58.660 |
And often the way these get shut down eventually is they become such a cesspool. 01:14:05.660 |
I mean, I think what took 8chan down was Cloudflare, which is, you know, you need that to prevent denial of service attacks. 01:14:11.660 |
Like you need it to keep in general, your website up and running. 01:14:14.660 |
And they eventually like, oh, we don't want to be in business with you. 01:14:17.660 |
And so with no one to protect them, the site became unusable, right? 01:14:22.660 |
I mean, for the existing forum companies, keep the pressure on. 01:14:32.660 |
You should be responsible for what people are talking about. 01:14:35.660 |
If you have to be accused of being censorious, be accused of being censorious. 01:14:45.660 |
And so I think that's the wrong place to look. 01:14:47.660 |
The right place to look is what is tumbling people down into these below ground depths of the loosely regulated web. 01:14:57.660 |
And it's almost not even worth bringing in CEOs of these companies, which they do. 01:15:05.660 |
I mean, the current administration won't bring him in, but you could. 01:15:12.660 |
Again, it's like bringing in the head of Domino's and being like, look, you know, pizza's bad. 01:15:19.660 |
And they're like, this is our business is pizza. 01:15:24.660 |
The business of curated conversation platforms is that they use algorithms plus large groups of people in conversation to generate lots of eyeballs that they monetize. 01:15:33.660 |
So if you really want to step in here, I don't even know what the exact solution is, but there's a couple of things to catch my attention. 01:15:39.660 |
I'm definitely interested in Australia's model of a nationwide ban on social media, writ large, be expansive instead of contractive here, for those 16 or younger. 01:15:50.660 |
If we know the slope of terribleness is inevitable, let's keep people off it until at the very least their brain has developed more because you know what makes that slope super steep? 01:16:04.660 |
When you were a teenager, did anything matter more to your brain than social inclusion, social grouping, what group you're in or out of, what was going on with your friends, pleasing your community, being a part of a tribe? 01:16:16.660 |
Those parts of your brain get supercharged during those years of your life. 01:16:20.660 |
You combine that brain with algorithmic conversation platforms. 01:16:25.660 |
I mean, what was like a blue trail becomes a black diamond. 01:16:36.660 |
We do this with lots of other things with kids. 01:16:40.660 |
It's super addictive in, like, a 12-year-old's brain. 01:16:44.660 |
They cannot consent to the lifetime addiction. 01:16:53.660 |
So no, you have to be older to smoke cigarettes. 01:16:56.660 |
Let's wait till your brain develops before we're going to allow you to, you know, drink alcohol. 01:17:03.660 |
So we don't want you behind the wheel of a car until you've gone through training and you're older. 01:17:08.660 |
And sure for all of these things, you can argue, but no, you're holding people back. 01:17:13.660 |
You're restricting the freedom of movement and you can't restrict the freedom of movement of people in America, but you can for kids. 01:17:26.660 |
Why can't they go to an R rated movie by themselves when they're 15? 01:17:28.660 |
Because we think kids aren't ready for this material or they're not going to make a good decision on their own. 01:17:35.660 |
I think we should do something similar for social media. 01:17:41.660 |
There's not a simple answer here, but Section 230 of the Communications Decency Act has the effect of basically giving legal cover to social media companies: you are not liable for the material your users publish. 01:17:54.660 |
And this is very different than, say, a newspaper. 01:18:00.660 |
If they say something that defames someone, they can be sued, right? 01:18:04.660 |
If they incite some sort of action that is highly negative for an individual or company, that individual or company can sue them and say, you know, you were wrong. 01:18:17.660 |
The Washington Post had this issue a few years ago. 01:18:20.660 |
Remember the Covington kids at the Mall? 01:18:23.660 |
It was like a group of kids, you know, all these high schools visit D.C. 01:18:29.660 |
And so kids from some school were visiting D.C. 01:18:31.660 |
And they had bought MAGA hats because they were selling MAGA hats, you know, the street vendors or whatever. 01:18:36.660 |
And there was like some they got into some incident with someone that was there. 01:18:41.660 |
I think it was, like, a Native American activist doing some sort of protest. 01:18:45.660 |
And anyways, a picture got captured of like one of the Covington kids with a MAGA hat sort of in the face of like this Native American person. 01:18:56.660 |
The Post, this was, like, the Washington Post at peak, you know, um, peak woke time. 01:19:02.660 |
And the story was, like, this kid is the kid from, uh, The Omen, basically, like, it's Satan's spawn and, you know, whatever. 01:19:12.660 |
And he just, like, went with whatever felt right. 01:19:16.660 |
Seeing this photo, he went and tracked down this person because they were marginalized and blah, blah, blah, blah. 01:19:20.660 |
The problem was that's not really what happened. 01:19:23.660 |
Actually, like, the kids were just taking a photo and this other person was, like, bothering them and coming over to them, 01:19:28.660 |
and that person was confronting them, and the photo caught them at whatever moment. 01:19:34.660 |
And the kid was like, well, you can't just say that stuff. 01:19:38.660 |
And the Post had to give them millions of dollars. 01:19:40.660 |
So for most publishers, you have to actually care a little bit about what you put out. 01:19:48.660 |
But the social media companies say, we're not responsible for what's published on here. 01:19:50.660 |
And that's Section 230, more or less, giving them that protection. 01:19:53.660 |
So there have been arguments for reforming it, which would basically say no one has those protections 01:19:57.660 |
if you want to use free content, that is, get a lot of people to create content on your platform, have a billion users creating content that you then algorithmically sort and use to make money. 01:20:12.660 |
Now, of course, the result of this is that the business model for these massive, algorithmically curated social media platforms disappears. 01:20:22.660 |
You might not be able to exist under such a world. 01:20:24.660 |
I don't think that's necessarily a bad thing. 01:20:26.660 |
I know it's more complicated than that, but that might be the closest hammer you have from a regulatory perspective to say: if you have these massive conversation platforms that are algorithmically curated, you're going to create things like the slope of terribleness. 01:20:45.660 |
This is just really bad for people and you can't control it. 01:20:48.660 |
And if you are responsible for everything that's published, then you can't have those type of platforms. 01:20:54.660 |
If you want people to publish, it has to be more moderated and curated and it matters what's on there. 01:21:05.660 |
We don't necessarily have to have like a billion people on the same platform to see the, the benefit of the internet. 01:21:11.660 |
I put an asterisk there because, look, I work on digital ethics in an academic context. 01:21:18.660 |
So I'm oversimplifying it there, but you know, that's the type of thing that's worth looking at. 01:21:21.660 |
The most important thing we could do though is cultural. 01:21:36.660 |
Someone who's, like, spending a lot of time swiping this piece of glass and typing in these things, other people should be like, what are you doing? 01:21:47.660 |
When we see people that are, like, really engaged online in the social media platforms or whatever, we should see it as if they wore, like, a straight-up Starfleet pin to work. 01:21:59.660 |
And we're just like, um, so Starfleet knows what's going on? 01:22:05.660 |
You have a Starfleet pin on at your job at, you know, Kinko's or whatever. 01:22:10.660 |
I'm going to kind of make fun of you a little bit. 01:22:12.660 |
Like this is not something a grown man should be doing. 01:22:14.660 |
That's the way we probably need to think about social media. 01:22:18.660 |
That's why I'm out here doing things like the slope of terribleness and saying, like, this is not innocent. 01:22:24.660 |
I mean, unless you have either a lot of stock in one of these companies or you own some of the land left in Hawaii that Mark Zuckerberg hasn't bought yet. 01:22:33.660 |
And you think you're going to get a good price for it. 01:22:34.660 |
If he gets a little richer, like if you're not in those situations, come on, why are we on these things? 01:22:40.660 |
Seeing what we saw a couple of weeks ago on all sides, the role of social media in creating the assassination of a 31-year-old dad, and the reaction on all the sides. 01:22:56.660 |
The instinct on the far left of, like, I don't care about this at all, 01:23:01.660 |
and, you know, I'm celebrating it. 01:23:05.660 |
The reaction on the right, you know, not just the censorious stuff, the far right. 01:23:13.660 |
Guess who they're blaming for Charlie Kirk's death. 01:23:21.660 |
Somehow like the Jews in Israel are involved. 01:23:24.660 |
We need to marginalize people who are spending a lot of time on these platforms by getting off of them ourselves first. 01:23:34.660 |
Maybe I'm biased because I've been on this for a long time and people yelled at me, and I'm happy that people are listening now. 01:23:48.660 |
But mainly, I think there's a cultural fix here. 01:23:52.660 |
It's time to treat the people who are still, like, really on there and be like, that doesn't make you some sort of fierce investigator or warrior or protester. 01:23:59.660 |
It's the guy with the Captain Kirk pin who, you know, just showed up at the company photo day. 01:24:09.660 |
And I can tell you about life outside of those services. 01:24:16.660 |
As the governor of Utah said, you can touch some grass. 01:24:27.660 |
Hey, if you liked today's episode about why you should quit social media, you should check 01:24:31.660 |
out episode 369, which gives you one more reason why you should quit. 01:24:42.660 |
Today, I want to talk about this so-called reverse Flynn effect. 01:24:46.660 |
There are some convincing arguments for why this is happening. 01:24:50.660 |
But my main purpose is to then introduce an additional factor that I think is getting overlooked. 01:24:55.660 |
But once we recognize this additional factor, it will give you some compelling opportunities 01:25:01.660 |
for getting ahead in an increasingly stupid world.