
Quit Social Media (Revisiting my Viral TED Talk, 8 Million Views Later) | Deep Questions Podcast


Chapters

0:00 Cal's intro
1:12 Cal talks about the context of his original talk
4:21 How TEDx changed the title
7:00 New York Times commissioned response op-ed
10:00 How Social Media companies shifted in 2016
12:55 Social Media universalism
18:00 Cambridge Analytica


00:00:00.000 | (upbeat music)
00:00:02.580 | All right, so I wanna talk about today,
00:00:07.160 | today's deep dive,
00:00:09.120 | Quit Social Media.
00:00:11.600 | And the reason why I'm using that title
00:00:15.040 | is because that was the title of the single video
00:00:18.700 | I've ever produced that has been the most watched.
00:00:21.800 | My TEDx talk, I believe it was from 2017,
00:00:25.880 | Jessica confirmed that,
00:00:26.960 | that was titled Quit Social Media.
00:00:29.320 | And that went on to get 8 million-something views since then.
00:00:32.220 | And I wanted to revisit it.
00:00:35.400 | It's been five years since I gave that talk,
00:00:37.800 | the most viral thing I've ever done online.
00:00:39.440 | I thought this would be a good time to revisit
00:00:41.080 | my Quit Social Media TEDx talk.
00:00:45.360 | All right, so to start,
00:00:47.560 | I thought it might be interesting to give the backstory
00:00:50.440 | of how that talk came to be.
00:00:53.120 | Then I wanna talk about what happened after it came out,
00:00:56.780 | and then finally reflect on how do I think today,
00:01:00.420 | five years later, about the points I was making then.
00:01:03.020 | Am I happy about how things turned out?
00:01:04.440 | Am I upset about how they turned out?
00:01:06.100 | Are they no longer relevant?
00:01:08.020 | That's my goal here.
00:01:08.860 | So let me go back to the context.
00:01:11.220 | So I had published the book Deep Work
00:01:14.620 | very early in 2016, right?
00:01:18.400 | So after I had published that book,
00:01:21.180 | we were looking for different topics
00:01:23.880 | I could bring out into the public sphere
00:01:25.940 | to try to generate attention for the book.
00:01:27.620 | So like, for example,
00:01:28.900 | I did an article for the Harvard Business Review
00:01:32.260 | that was titled "A Modest Proposal: Eliminate Email,"
00:01:37.040 | which was the seed of the idea, of course,
00:01:38.680 | that I later elaborated in my book, "A World Without Email,"
00:01:41.280 | because that was like one of the ideas,
00:01:42.480 | email is distracting.
00:01:44.020 | And one of the other things that I talked about
00:01:45.860 | in Deep Work was social media.
00:01:47.480 | There was a chapter on social media being distracting,
00:01:50.720 | and we're putting too much emphasis on it
00:01:52.620 | when there's deeper skills that are probably more valuable.
00:01:55.460 | So we thought, okay, we should find a way
00:01:57.500 | to talk about this angle somewhere.
00:02:00.500 | Now, at this point, going up to 2016,
00:02:03.300 | I'm a bit of an odd guy.
00:02:04.940 | I had spent my entire adult life up to this point
00:02:08.220 | doing professional speaking.
00:02:09.820 | I mean, I started professional speaking
00:02:11.380 | when I was like 22 or 23 years old.
00:02:14.900 | I would travel around the country talking at colleges,
00:02:17.680 | then eventually at large conferences and corporate events.
00:02:22.260 | I had spoken in front of a thousand people
00:02:24.580 | at Lincoln Center and at the World Domination Summit,
00:02:27.340 | these type of places, as well as corporate gigs,
00:02:29.440 | the Canadian Olympic Committee,
00:02:30.620 | like all sorts of different places.
00:02:31.460 | I just gave a lot of talks.
00:02:33.200 | So I was familiar with talking.
00:02:35.020 | And when you're a talker who's been around for a while,
00:02:37.280 | I used to get lots of invitations for TEDx conferences.
00:02:40.860 | There was a time back then, 2014, 2015, 2016,
00:02:45.180 | where TEDx conferences were popping up everywhere
00:02:47.220 | because TED was very popular.
00:02:48.780 | And I would get those invitations all the time,
00:02:50.540 | and I was like, "No, I'm not gonna go to a TEDx conference.
00:02:53.460 | "I have enough sort of big, high-paying gigs
00:02:55.960 | "or giant audience gigs to do."
00:02:58.140 | But when we were trying to promote deep work,
00:03:00.440 | and we're thinking about where can I go
00:03:02.900 | to introduce this idea of not using social media,
00:03:05.960 | I had this idea that you know who does video best
00:03:11.440 | when it comes to talks is these TEDx conferences.
00:03:14.400 | And so one thing they do very well
00:03:15.840 | is they have very good cameras,
00:03:17.180 | and they have the distribution power of TED
00:03:20.040 | behind the video.
00:03:20.880 | So I said, "Okay, the next reasonable TEDx invitation I get,
00:03:23.920 | "I'm gonna say yes, I'm gonna use that as a venue
00:03:26.020 | "to give a fire-breathing talk about social media."
00:03:29.060 | And a TEDx conference that was being held in Virginia,
00:03:32.800 | so a 45-minute drive from me,
00:03:34.260 | came along and said, "Do you wanna speak?"
00:03:35.620 | And I said, "Yes, I don't really care what your theme is.
00:03:38.020 | "I'm gonna come talk about quitting social media."
00:03:39.980 | So that's how it came to be.
00:03:41.100 | And so I wrote this talk, and it's a TED talk,
00:03:43.980 | so I memorized it, it's 15 minutes long,
00:03:46.940 | where I make the argument that more people
00:03:49.340 | should quit social media.
00:03:50.980 | It's not something that everyone has to use,
00:03:52.980 | and its damages are bigger than people might think.
00:03:55.260 | And I came to this TEDx talk, it was a small room,
00:03:58.220 | it was like a classroom, there must have been,
00:03:59.900 | I don't know, at most 40 people there.
00:04:02.140 | And I gave my talk.
00:04:04.020 | The whole idea behind the talk was that
00:04:06.740 | I wanna bring attention to that topic.
00:04:09.180 | Almost immediately, it made the organizers uncomfortable.
00:04:12.900 | This idea that you should quit social media,
00:04:16.100 | to say it with a straight face, without a ton of caveats,
00:04:18.500 | was very counter-cultural and eccentric.
00:04:20.860 | So the first thing they did was change the title of my talk
00:04:23.260 | before they posted it online.
00:04:24.820 | They changed the title to
00:04:26.180 | Working Deeply in a Distracted World.
00:04:28.460 | I barely mentioned deep work in this talk.
00:04:30.700 | This is straight up,
00:04:31.980 | this is the problem with social media,
00:04:34.180 | you should stop using it,
00:04:35.820 | here's what it's like to not use social media,
00:04:37.700 | it's great, I was being purposefully provocative
00:04:40.420 | and clear about it, and they changed the title,
00:04:42.580 | because they were worried, this is eccentric.
00:04:45.140 | Let's make it Working Deeply in a Distracted World.
00:04:47.420 | And I called them up, it's like, no,
00:04:48.700 | this is the whole reason I came here,
00:04:50.820 | is so that talk could be called Quit Social Media.
00:04:53.660 | A talk called Quit Social Media will be posted by TED.
00:04:57.540 | That's what I want.
00:04:58.380 | So they went back and they changed the talk back.
00:05:00.660 | And it sat there for a little while,
00:05:02.460 | and then it picked up steam,
00:05:04.060 | and then views really began to pile up,
00:05:06.180 | and it ended up,
00:05:07.020 | it's not like it's the most popular TEDx talk of all time,
00:05:09.580 | but I think it's easily in the top 100 or top 50.
00:05:13.180 | All right, so that's what happened.
00:05:15.500 | And the question is, why did that happen?
00:05:16.940 | And one of the contentions I wanna make
00:05:18.300 | is that if I had given that same talk two years earlier,
00:05:21.940 | no one would have cared.
00:05:23.140 | That the timing of that talk was just perfect,
00:05:28.340 | that sort of early in 2017 when I gave it,
00:05:31.340 | was just perfect for this topic.
00:05:34.020 | And to understand that there's a little bit of history
00:05:35.900 | I think people don't always understand
00:05:38.100 | all of the details of.
00:05:39.040 | But what you have to understand
00:05:40.380 | is in the lead up to that point,
00:05:41.860 | so 2012 up through 2016,
00:05:45.300 | there was a lot of exuberance around social media.
00:05:50.300 | The general sense about social media
00:05:52.860 | from any sort of elite discourse
00:05:54.820 | was this is a revolutionary technology
00:05:58.420 | that is a progressive force of good for the world,
00:06:01.740 | it's instigated the Arab Spring,
00:06:04.460 | it is supporting the expression of groups
00:06:06.460 | that otherwise could not express themselves,
00:06:07.740 | that this is a great technology.
00:06:10.260 | That was the general sense.
00:06:13.120 | During that period is when I began to build up a skepticism,
00:06:17.100 | not about the internet,
00:06:18.280 | but about what I used to call social media universalism.
00:06:20.940 | The idea that we all needed to be using social media,
00:06:23.060 | I thought for most people, this is not that great.
00:06:25.340 | Most people are not toppling governments,
00:06:27.320 | most people are not bringing interesting new voices
00:06:29.860 | into the marketplace of ideas,
00:06:31.140 | most people are looking at their phone three hours a day
00:06:33.380 | instead of doing things that are useful.
00:06:34.660 | These are purposefully addictive
00:06:35.920 | and the idea that we all have to be on where it's weird
00:06:38.340 | is a problem for society.
00:06:39.660 | And I began to make that critique
00:06:40.900 | and people thought I was insane.
00:06:44.560 | So the TEDx organizers tried to change the name of my talk.
00:06:47.520 | A couple of years before, for example,
00:06:49.020 | I'd written an op-ed for the New York Times
00:06:51.700 | that argued that social media was not as important
00:06:54.140 | for young people's careers as they thought
00:06:56.700 | and it created a fear.
00:06:58.320 | The New York Times actually had to commission
00:07:01.960 | a response op-ed the next week,
00:07:04.360 | addressing my op-ed and saying,
00:07:07.300 | this is wrong, don't listen to it.
00:07:09.140 | People got mad about it.
00:07:10.780 | I went on a major national radio show in Canada
00:07:13.220 | to talk about it and they ambushed me
00:07:14.780 | 10 minutes into the interview.
00:07:15.860 | Here is a social media expert and an artist
00:07:18.140 | who uses social media to promote her work
00:07:19.860 | here to tell you that social media is important, right?
00:07:22.380 | because it so infuriated people that anyone would say that.
00:07:25.180 | There was a professor in this area here in DC
00:07:27.620 | who was frantically trying to get me to come debate him.
00:07:31.500 | Like he couldn't take that in a national publication
00:07:34.580 | someone had said social media is not important.
00:07:36.060 | So that's what things were like,
00:07:38.140 | and I was seen as eccentric
00:07:39.140 | and that's where things were when that talk came out.
00:07:42.260 | All of that changed.
00:07:43.900 | All of that changed right around the time
00:07:47.620 | that that quit social media talk was released
00:07:51.380 | which is why that timing was so good.
00:07:52.900 | And the reason it changed was politics.
00:07:55.180 | And it was the 2016 presidential election here in America.
00:08:01.300 | And that election had a very unfortunate outcome
00:08:05.820 | for social media companies
00:08:07.780 | because they managed to upset
00:08:11.660 | all sides of the political spectrum.
00:08:13.740 | Now I got the first inklings of this
00:08:15.220 | actually when I was promoting Deep Work in early 2016.
00:08:19.220 | When I would go on conservative radio shows
00:08:22.580 | or conservative podcasts,
00:08:24.300 | everyone was asking me about censorship on social media.
00:08:28.420 | And this was not something that was in my normal orbit.
00:08:32.060 | It's not something I'd heard about or encountered.
00:08:33.820 | It was not something that was being covered
00:08:34.980 | in sort of standard techno journalism.
00:08:37.180 | So I would get caught off guard by these questions.
00:08:38.780 | Like I don't know what you're talking about
00:08:40.220 | but already in the lead up to that election,
00:08:42.700 | the right was starting to get upset
00:08:45.020 | where they said when we look at like what gets taken down
00:08:47.340 | and what doesn't, it all comes from like a standard set
00:08:50.820 | of relatively far to the left political perspective.
00:08:55.420 | And I don't think this is surprising.
00:08:56.700 | These companies are based in Northern California.
00:08:59.100 | This is an area where
00:09:00.140 | the political left is way more dominant.
00:09:03.540 | But so the right started to get upset
00:09:06.060 | or skeptical about social media.
00:09:08.460 | And then the election happened and Donald Trump won.
00:09:11.500 | And it took a little while, about a half a year or so
00:09:15.580 | but the left then began to get real upset about social media
00:09:20.140 | because of their role in helping Donald Trump win.
00:09:23.380 | So now you had the left start to get upset
00:09:26.060 | about social media as well.
00:09:28.380 | Now that's kind of a complicated story.
00:09:30.580 | If you really look at that story closely
00:09:32.500 | about what happened on the left,
00:09:33.700 | I think it's often portrayed as they saw specific harms
00:09:38.660 | that like Facebook was doing and that's why they were upset.
00:09:41.060 | There's Cambridge Analytica,
00:09:43.060 | there was Russian misinformation.
00:09:45.500 | But really if you watch closer,
00:09:46.900 | what really happened with the left and social media
00:09:49.460 | is that most of, I would say,
00:09:51.420 | the mainstream sort of political and cultural voices
00:09:55.980 | entered a resistance mode after Donald Trump was elected.
00:09:58.980 | Like our goal, the point of our paper and what we're doing
00:10:01.660 | is there's an existential threat to our country.
00:10:03.540 | It's Donald Trump.
00:10:04.540 | And this is what we're trying to do
00:10:06.300 | is we're in resistance to that.
00:10:07.660 | And the social media companies,
00:10:09.540 | though politically they're to the left,
00:10:12.180 | didn't join that resistance mode.
00:10:14.140 | Zuckerberg, a couple of years later, came to Georgetown,
00:10:17.740 | gave a speech about free speech online.
00:10:20.660 | They were trying to, they weren't supporting Donald Trump
00:10:24.940 | but they were not ready to go full resistance mode.
00:10:27.220 | They weren't like, "We're fully on board.
00:10:28.500 | He's off the platforms.
00:10:29.820 | We're going to do what we need to do
00:10:32.980 | to make sure that we are helping preserve
00:10:34.780 | this vision of democracy."
00:10:35.820 | They didn't do it.
00:10:36.660 | They tried to go down the center.
00:10:37.780 | It was like during the French Revolution
00:10:39.340 | where you were the shopkeeper.
00:10:41.100 | And you're like, "Look, I'm no fan of the king,
00:10:43.860 | but I also am not going to go to the Bastille."
00:10:46.740 | And your days are numbered at that point.
00:10:49.900 | Like the revolutionaries are going to see you
00:10:51.100 | as with like a sort of bourgeois suspicion, right?
00:10:53.620 | And I think this was a big thing that happened.
00:10:56.740 | So there was sort of a tribal traitorousness
00:10:59.060 | that was going on here where the left was like,
00:11:01.660 | "You guys aren't fully on our team."
00:11:03.620 | And so a lot of those mechanisms turned against it.
00:11:06.860 | So now the left was really mad
00:11:08.300 | at these social media companies
00:11:09.900 | and you had like the delete Facebook movement
00:11:11.980 | and Zuckerberg became the devil.
00:11:14.820 | And now they had upset all sides,
00:11:17.900 | all sides of the political spectrum.
00:11:20.260 | And what that meant was for everyone else
00:11:23.460 | who maybe was not looking at these technologies
00:11:27.260 | purely through a political lens,
00:11:29.060 | this had had the effect of dislodging
00:11:33.100 | how these technologies were categorized
00:11:35.780 | in the cultural hive mind.
00:11:37.420 | It had dislodged them from exuberant cool new technologies
00:11:41.180 | to something that there's probably issues with,
00:11:43.420 | like these political issues.
00:11:44.460 | And once it was in a category where we acknowledged
00:11:46.700 | there could be issues with it,
00:11:47.700 | people that were far away from political concerns
00:11:49.700 | saw other issues with it.
00:11:51.540 | And my kids are using this too much,
00:11:52.940 | I'm on this too much, I don't really like this.
00:11:54.860 | It opened the floodgates to critique.
00:11:58.220 | So it took this particular political disruption
00:12:00.700 | to change our cultural categorization of social media.
00:12:03.460 | But once we changed it to something
00:12:04.900 | that was worthy of critique,
00:12:06.940 | we found a lot of critiques.
00:12:08.780 | And that was exactly the environment
00:12:10.500 | in which that video dropped.
00:12:12.340 | I think that's why it found an audience.
00:12:14.740 | It was perfectly timed to a cultural awakening
00:12:17.140 | where people said,
00:12:17.980 | let's start looking closer at social media.
00:12:19.740 | And it doesn't mean I was convincing them,
00:12:21.260 | but they were ready to hear an argument from my side.
00:12:23.660 | They were ready to watch a video
00:12:26.340 | that was titled "Quit Social Media"
00:12:29.060 | because that suddenly became something
00:12:31.300 | that was at very least comprehensible
00:12:34.340 | in a way that it wouldn't have been in 2015,
00:12:37.700 | the way it wouldn't have been in 2016.
00:12:39.620 | So I think that's the story behind that video.
00:12:42.140 | Now, reflecting on it today,
00:12:44.380 | how do I feel about the issues I talked about in that video?
00:12:48.620 | I would say I'm pretty optimistic.
00:12:50.540 | I'm pretty optimistic because again,
00:12:54.620 | the foundation for my critique
00:12:56.540 | was my wariness of social media universalism.
00:13:01.420 | This idea that everyone had to use
00:13:03.980 | the same small number of platforms,
00:13:06.380 | and these were giant platform monopolies
00:13:08.420 | that were engineering highly addictive experiences.
00:13:10.900 | And I did not think it was good for the body politic.
00:13:13.540 | I don't think it was good for our culture
00:13:14.780 | that everyone had to be on Facebook and Twitter
00:13:18.020 | and Instagram, and we were all on there
00:13:20.020 | because I think for most people,
00:13:21.460 | it probably causes more harm than good.
00:13:23.740 | Not that these were evil tools, but for most people,
00:13:25.900 | it's the time it takes, the emotional labor it creates
00:13:30.460 | is not worth the minor distracting benefits.
00:13:32.700 | And the line I used to use back then
00:13:35.140 | is that I think social media,
00:13:36.340 | I don't think it should be banned.
00:13:37.700 | I think it should be like "Game of Thrones."
00:13:40.380 | Something that's pretty popular
00:13:41.500 | and has a pretty loyal following,
00:13:43.580 | but like most people don't care.
00:13:45.940 | There's also a lot of people that say,
00:13:47.980 | "I'm not gonna want something with dragons."
00:13:51.020 | And I think we are actually starting
00:13:52.900 | to move towards that.
00:13:54.060 | I think in recent years, after this political disruption,
00:13:56.500 | recategorized social media writ large,
00:13:59.620 | we are seeing a fragmentation of the social media universe.
00:14:04.500 | Facebook fell from favor.
00:14:08.020 | I mean, it was just relentlessly attacked
00:14:09.980 | from the left and the right.
00:14:12.260 | Twitter has never been super mainstream.
00:14:14.580 | It's incredibly influential,
00:14:15.780 | but most people aren't tweeting.
00:14:17.540 | Most people don't really care what's going on on Twitter.
00:14:17.540 | There were things like Snapchat that rose and are gone.
00:14:22.620 | We're in a moment now where TikTok has become really good,
00:14:24.860 | but we're not anymore in a moment
00:14:27.180 | where there's any one platform
00:14:28.900 | where it would be considered weird to not use.
00:14:32.820 | I was labeled a heretic for saying, "I don't use Facebook."
00:14:36.500 | There is no such platform today, no matter how popular,
00:14:38.900 | that people would think it's weird at all
00:14:40.180 | if I say I don't use it.
00:14:41.380 | TikTok is very popular.
00:14:44.740 | No one would bat an eye if you say, "I don't use TikTok."
00:14:47.860 | But yeah, it's like "Game of Thrones."
00:14:49.060 | Like some people love it,
00:14:50.020 | and some people think dragons are kind of stupid.
00:14:52.260 | We get it. It's not a big deal.
00:14:55.420 | If you say, "Oh, I don't use Twitter,"
00:14:56.900 | people are like, "Yeah, I get it.
00:14:58.340 | "It's kind of toxic on there, anxiety-producing.
00:15:01.180 | "Like, good for you."
00:15:02.220 | And I think that is a good thing.
00:15:04.380 | So what's gonna come next in the world of social media?
00:15:06.220 | I just think more fragmentation.
00:15:07.900 | I mean, we have seen social media move away
00:15:10.180 | from being a tool to connect people
00:15:12.660 | into a tool of infinite scroll distraction,
00:15:15.140 | algorithmically optimized.
00:15:16.660 | TikTok, for example, is just owning that
00:15:18.380 | better than anyone else.
00:15:19.980 | Forget anything else other than just,
00:15:21.620 | "Let's touch your brainstem in 30-second burst."
00:15:26.460 | You're like, "Ooh, ah, ooh."
00:15:27.940 | Like it's getting straight to the chase.
00:15:29.620 | Like it's not about connecting.
00:15:30.700 | It's not about creativity.
00:15:31.660 | It's not about finding interesting people.
00:15:32.900 | Let's just like touch your brainstem.
00:15:34.380 | So once it's in a world of distraction,
00:15:35.980 | there are many sources of distraction.
00:15:38.060 | There's various social media platforms.
00:15:39.980 | There's podcasts.
00:15:40.980 | There's all these streaming services
00:15:42.540 | that are spending hundreds of billions of dollars
00:15:44.340 | to produce really good stuff.
00:15:46.260 | There's articles.
00:15:47.340 | There's newspapers.
00:15:48.340 | There's endless books.
00:15:49.500 | Like there's endless things trying to provide
00:15:51.380 | distraction and you can choose which ones are best for you.
00:15:54.300 | And maybe it's TikTok or maybe it's podcast
00:15:56.300 | or maybe it's books.
00:15:57.140 | And like, we're getting to a point where that's all fine.
00:15:59.060 | And that's what I wanted to see.
00:16:00.780 | A world in which there was a diversity
00:16:02.860 | of different technological tools and innovations
00:16:05.500 | coming and going, some getting popular, some falling,
00:16:08.220 | trends come, trends don't go,
00:16:09.940 | but we'd never had this sense of universalism.
00:16:12.340 | You have to use this one.
00:16:14.420 | We all have to be on that one.
00:16:16.020 | That was the world I was hoping for in that talk.
00:16:19.580 | And I think we're getting closer.
00:16:21.420 | We're closer to being there.
00:16:23.500 | We're closer to being there.
00:16:24.380 | So we are in a better place in 2022
00:16:26.820 | than we were in 2017, early 2017,
00:16:29.900 | when I was giving that talk.
00:16:31.300 | At that point, I was making people nervous
00:16:35.660 | with the way I talked about social media.
00:16:37.540 | Today, I think I'm probably seen
00:16:39.380 | as being too centrist on social media
00:16:41.220 | because I'm not joining in.
00:16:42.620 | Both the right and the left are out for blood.
00:16:44.620 | You are our tribal enemy and we will destroy you.
00:16:47.420 | And I'm not pitchforking.
00:16:48.940 | I'm just saying like, hey, maybe don't use Facebook.
00:16:51.300 | And I think that is probably a good state of affairs.
00:16:55.100 | So quit social media.
00:16:56.940 | I don't care if you do or you don't,
00:16:58.860 | but I'm just glad that if you do,
00:17:00.620 | very few people are going to care.
00:17:03.340 | That's my reflection on that talk.
00:17:08.460 | So there you go, Jesse.
00:17:11.580 | Looking back at it.
00:17:16.540 | - The internet says that it was September 19th, 2016.
00:17:20.060 | - Oh, okay. - In the Tysons.
00:17:21.220 | - Oh, so '16, that makes sense.
00:17:22.500 | Summer of 2016, probably.
00:17:24.300 | - Yeah, September 19th, so right after.
00:17:26.740 | - All right, 2016, that makes more sense.
00:17:29.500 | So it was like, it took a little while
00:17:31.180 | for all this to sink in.
00:17:32.700 | Like that op-ed I wrote for the Times
00:17:36.100 | that generated all that furor
00:17:37.580 | was the week after the presidential election.
00:17:39.420 | So like at that point, it was still like,
00:17:40.780 | what do you mean social media is awesome?
00:17:42.020 | It really took another year
00:17:44.220 | before Cambridge Analytica, I think, helped do this.
00:17:47.660 | There was like this general upswelling of, wait a second,
00:17:52.140 | the Russian misinformation.
00:17:53.500 | Like all of that took a while to get going.
00:17:55.780 | So even when I gave that talk in 2016,
00:17:58.300 | it was still eccentric.
00:17:59.140 | By 2017, people were like, yeah,
00:18:02.020 | we don't like social media anymore.
00:18:03.220 | So it's funny how that shifted.
00:18:04.820 | I mean, even, do you remember Cambridge Analytica?
00:18:07.300 | - Yeah.
00:18:08.140 | - Right, this is like an interesting example.
00:18:12.140 | Like the way that was pitched,
00:18:14.260 | like the way that was pitched to people
00:18:16.660 | was basically that there was like a Bond villain
00:18:19.900 | in a hollowed out volcano
00:18:22.580 | that was on like a secret laser phone to Donald Trump,
00:18:27.060 | perpetuating like this giant heist.
00:18:29.140 | But the reality was,
00:18:30.380 | what they were doing at Cambridge Analytica
00:18:31.940 | was the business model of Facebook.
00:18:34.420 | It was what like everyone was doing, political or not.
00:18:37.140 | Like that's how Facebook actually worked.
00:18:39.980 | And like they had just started changing,
00:18:41.980 | like there's no real crime committed there
00:18:43.700 | other than they maybe had just started
00:18:45.620 | to change their terms of service
00:18:46.700 | like a few months before or something like that.
00:18:48.100 | But like that was not some unusual use of Facebook.
00:18:50.740 | Like that's what Facebook,
00:18:51.820 | that's why it was so profitable.
00:18:53.460 | You could go in and scrape all this information
00:18:55.700 | and target people or whatever.
00:18:57.300 | And my contention is that like Facebook saw the danger
00:19:02.060 | of people recognizing like, this is what we do.
00:19:07.620 | You do the personality test
00:19:07.620 | and we steal your whole contact network
00:19:10.820 | and use that to like target ads to everyone.
00:19:13.140 | Like we don't want people realizing that.
00:19:15.620 | So they leaned in heavy that like,
00:19:17.260 | oh, this was some sort of unusual
00:19:18.940 | or exceptional crime that occurred.
00:19:21.100 | Like, oh, this Cambridge Analytica
00:19:22.780 | was some mastermind bond criminal,
00:19:25.540 | like broke into the data safe and was doing things.
00:19:27.660 | And so they were trying, desperately
00:19:30.020 | I believe, to avoid the story being out there
00:19:32.780 | that was like, Cambridge Analytica reveals the extent
00:19:36.380 | to which this is what Facebook is.
00:19:38.300 | It's stealing all this data.
00:19:39.620 | And so they were very successful, I think at the time
00:19:42.100 | and making it seem like it was an exceptional case.
00:19:44.380 | And the only real thing exceptional about it was like
00:19:46.220 | its scale was very large.
00:19:48.460 | So it was a large number of people,
00:19:50.700 | but that was a standard academic research study play
00:19:54.420 | of like personality tests to scrape data,
00:19:56.340 | to target people with ads.
00:19:57.260 | I mean, it was like the whole business model
00:19:58.740 | around Facebook.
00:19:59.580 | So I really think they leaned into trying to make it
00:20:00.940 | about like people doing something unusual
00:20:03.980 | or exceptionally bad.
00:20:05.460 | And I think, because again, I think that was something
00:20:07.980 | that Facebook felt like we can play on that ground
00:20:11.620 | to be okay.
00:20:12.780 | We can say like Cambridge Analytica
00:20:14.180 | was an exceptional instance.
00:20:15.420 | We're working on privacy.
00:20:16.620 | So that can't happen again and distract people
00:20:18.620 | from the fact that like,
00:20:19.460 | that's what their business model was,
00:20:21.060 | but it didn't work because of the political anger.
00:20:22.980 | So even though they were like, yeah, we agree.
00:20:24.580 | Cambridge Analytica is bad.
00:20:25.620 | And we're gonna like change our privacy laws,
00:20:27.580 | the sort of "you were not enough on our side" post-Trump sentiment,
00:20:31.540 | I think, still numbered their days.
00:20:34.180 | Their days were still numbered.
00:20:35.300 | The media was gonna be done with them.
00:20:36.780 | So like they tried to create a villain
00:20:39.340 | that they could be like, yeah, we're on your side.
00:20:40.940 | We gotta go after those people.
00:20:42.100 | We don't know what they were doing.
00:20:42.940 | We definitely did not encourage exactly that behavior
00:20:45.180 | for like hundreds of clients.
00:20:46.900 | They tried that and it didn't work
00:20:48.300 | because the damage had been done.
00:20:50.980 | So there you go.
00:20:53.820 | Zuckerberg's on the podcast tour.
00:20:56.780 | - Yeah, I listened to his Ferris and Lex.
00:20:59.860 | - Lex, Ferris, yeah.
00:21:01.180 | Has he answered our invitations yet?
00:21:03.860 | - He wants to talk to you directly.
00:21:06.180 | He wants to play golf with you.
00:21:07.300 | - Yeah, can you imagine Mark Zuckerberg
00:21:09.060 | coming to the Deep Work HQ?
00:21:10.460 | Like, don't worry, Mark.
00:21:13.580 | That smell is the grease trap of the kitchen below us.
00:21:17.140 | It's not me.
00:21:18.460 | [MUSIC PLAYING]