
Jimmy Wales: Wikipedia | Lex Fridman Podcast #385


Chapters

0:00 Introduction
0:47 Origin story of Wikipedia
6:51 Design of Wikipedia
13:44 Number of articles on Wikipedia
19:55 Wikipedia pages for living persons
40:48 ChatGPT
54:19 Wikipedia's political bias
1:00:23 Conspiracy theories
1:13:28 Facebook
1:21:46 Twitter
1:42:22 Building Wikipedia
1:56:55 Wikipedia funding
2:08:15 ChatGPT vs Wikipedia
2:12:56 Larry Sanger
2:18:28 Twitter files
2:21:20 Government and censorship
2:35:44 Adolf Hitler's Wikipedia page
2:47:26 Future of Wikipedia
2:59:29 Advice for young people
3:06:50 Meaning of life

Whisper Transcript

00:00:00.000 | We've never bowed down to government pressure
00:00:03.160 | anywhere in the world and we never will.
00:00:05.720 | We understand that we're hardcore
00:00:07.400 | and actually there is a bit of nuance
00:00:09.000 | about how different companies respond to this,
00:00:12.120 | but our response has always been just to say no.
00:00:15.680 | And if they threaten to block, well knock yourself out,
00:00:18.320 | you're gonna lose Wikipedia.
00:00:19.720 | - The following is a conversation with Jimmy Wales,
00:00:24.600 | co-founder of Wikipedia,
00:00:27.080 | one of, if not the most impactful websites ever,
00:00:31.880 | expanding the collective knowledge, intelligence,
00:00:34.320 | and wisdom of human civilization.
00:00:37.280 | This is the "Lex Fridman Podcast."
00:00:39.720 | To support it, please check out our sponsors
00:00:41.800 | in the description.
00:00:43.080 | And now, dear friends, here's Jimmy Wales.
00:00:46.680 | Let's start at the beginning.
00:00:49.200 | What is the origin story of Wikipedia?
00:00:51.840 | - The origin story of Wikipedia.
00:00:53.560 | Well, so I was watching the growth
00:00:55.640 | of the free software movement, open source software,
00:01:00.040 | and seeing programmers coming together
00:01:02.000 | to collaborate in new ways, sharing code,
00:01:06.120 | doing that under a free license,
00:01:08.320 | which is really interesting
00:01:09.960 | because it empowers an ability to work together
00:01:13.280 | that's really hard to do if the code is still proprietary
00:01:15.840 | because then if I chip in and help,
00:01:18.160 | we sort of have to figure out how I'm gonna be rewarded
00:01:21.000 | and what that is,
00:01:21.840 | the idea that everyone can copy it
00:01:23.360 | and it just is part of the commons
00:01:26.000 | really empowered a huge wave
00:01:28.240 | of creative software production.
00:01:32.400 | And I realized that that kind of collaboration
00:01:34.280 | could extend beyond just software
00:01:35.800 | to all kinds of cultural works.
00:01:37.520 | And the first thing that I thought of was an encyclopedia.
00:01:41.040 | I thought, oh, that seems obvious,
00:01:42.680 | that an encyclopedia, you can collaborate on it.
00:01:45.520 | There's a few reasons why.
00:01:46.680 | One, we all pretty much know
00:01:49.280 | what an encyclopedia entry on, say,
00:01:51.960 | the Eiffel Tower should be like.
00:01:54.120 | You know, you should see a picture, a few pictures, maybe,
00:01:57.000 | history, location, something about the architect,
00:02:01.040 | et cetera, et cetera.
00:02:02.360 | So we have a shared understanding
00:02:03.960 | of what it is we're trying to do,
00:02:05.160 | and then we can collaborate
00:02:06.320 | and different people can chip in
00:02:07.520 | and find sources and so on and so forth.
00:02:10.200 | So we first set up Nupedia,
00:02:13.280 | which was about two years before Wikipedia.
00:02:17.080 | And with Nupedia, we had this idea
00:02:22.080 | that in order to be respected,
00:02:24.520 | we had to be even more academic
00:02:26.280 | than a traditional encyclopedia,
00:02:29.040 | because a bunch of volunteers on the internet
00:02:31.160 | getting together to write an encyclopedia,
00:02:32.920 | you know, you could be made fun of
00:02:34.080 | if it's just every random person.
00:02:36.320 | So we had implemented this seven-stage review process
00:02:39.880 | to get anything published.
00:02:41.200 | And two things came of that.
00:02:44.280 | So one thing, one of the earliest entries
00:02:47.560 | that we published after this rigorous process,
00:02:51.000 | a few days later, we had to pull it,
00:02:52.600 | because as soon as it hit the web
00:02:54.200 | and the broader community took a look at it,
00:02:57.200 | people noticed plagiarism
00:02:58.520 | and realized that it wasn't actually that good,
00:03:00.960 | even though it had been reviewed by academics and so on.
00:03:03.640 | So we had to pull it.
00:03:04.480 | So it's like, oh, okay, well,
00:03:05.400 | so much for a seven-stage review process.
00:03:07.880 | But also I decided that I wanted to try,
00:03:10.600 | I was frustrated, why is this taking so long?
00:03:12.680 | Why is it so hard?
00:03:14.480 | So I thought, oh, okay.
00:03:16.200 | I saw that Robert Merton had won a Nobel Prize in economics
00:03:20.680 | for his work on option pricing theory.
00:03:23.240 | And when I was in academia, that's what I worked on,
00:03:25.280 | was option pricing theory, and I'd published papers.
00:03:27.720 | So I'd worked through all of his academic papers
00:03:30.040 | and I knew his work quite well.
00:03:32.080 | I thought, oh, I'll just,
00:03:33.160 | I'll write a short biography of Merton.
00:03:35.960 | And when I started to do it, I'd been out of academia,
00:03:39.240 | hadn't been a grad student for a few years then.
00:03:41.840 | I felt this huge intimidation
00:03:43.960 | because they were gonna take my draft
00:03:46.520 | and send it to the most prestigious finance professors
00:03:49.280 | that we could find to give me feedback for revisions.
00:03:53.080 | And it felt like being back in grad school.
00:03:54.880 | It's like this really oppressive sort of like,
00:03:56.760 | you're gonna submit it for review
00:03:58.040 | and you're gonna get critiques.
00:03:59.640 | - A little bit of the bad part of grad school.
00:04:00.960 | - Yeah, yeah, the bad part of grad school, right?
00:04:03.280 | And so I was like, oh, this isn't intellectually fun.
00:04:05.280 | This is like the bad part of grad school.
00:04:07.520 | It's intimidating and there's a lot of
00:04:11.360 | potential embarrassment if I screw something up
00:04:13.280 | and so forth.
00:04:14.480 | And so that was when I realized,
00:04:15.800 | okay, look, this is never gonna work.
00:04:17.280 | This is not something that people
00:04:18.480 | are really gonna wanna do.
00:04:20.200 | So Jeremy Rosenfeld, one of my employees,
00:04:22.600 | had brought and showed me the wiki concept in December
00:04:25.520 | and then Larry Sanger brought in the same,
00:04:29.160 | said, "Hey, what about this wiki idea?"
00:04:31.440 | And so in January, we decided to launch Wikipedia,
00:04:36.440 | but we weren't sure.
00:04:37.480 | So the original project was called Nupedia.
00:04:40.440 | And even though it wasn't successful,
00:04:41.800 | we did have quite a group of academics
00:04:43.800 | and like really serious people.
00:04:45.480 | And we were concerned that, oh, maybe these academics
00:04:48.680 | are gonna really hate this idea
00:04:50.280 | and we shouldn't just convert the project immediately.
00:04:53.080 | We should launch this as a side project,
00:04:54.760 | the idea of, here's a wiki
00:04:55.880 | where we can start playing around.
00:04:58.240 | But actually we got more work done in two weeks
00:05:00.800 | than we had in almost two years
00:05:02.360 | because people were able to just jump on
00:05:04.000 | and start doing stuff.
00:05:05.480 | And it was actually a very exciting time.
00:05:07.760 | You know, you could, back then,
00:05:08.840 | you could be the first person who typed
00:05:11.000 | Africa is a continent and hit save,
00:05:13.280 | which isn't much of an encyclopedia entry,
00:05:15.360 | but it's true and it's a start and it's kind of fun.
00:05:18.160 | Like, you put your name down.
00:05:20.120 | Actually, a funny story was several years later,
00:05:23.400 | I just happened to be online and I saw when,
00:05:26.320 | I think his name is Robert Aumann,
00:05:27.800 | won the Nobel Prize in economics.
00:05:29.920 | And we didn't have an entry on him at all,
00:05:32.400 | which was surprising, but it wasn't that surprising.
00:05:34.200 | This was still early days, you know?
00:05:36.920 | And so I got to be the first person to type,
00:05:39.560 | Robert Aumann won the Nobel Prize in economics and hit save,
00:05:42.000 | which again, wasn't a very good article.
00:05:44.440 | But then I came back two days later
00:05:45.560 | and people had improved it and so forth.
00:05:47.760 | So that second half of the experience
00:05:50.520 | where with Robert Merton, I never succeeded
00:05:52.480 | because it was just too intimidating.
00:05:53.680 | It was like, oh no, I was able to chip in and help.
00:05:55.840 | Other people jumped in.
00:05:57.040 | Everybody was interested in the topic
00:05:58.560 | because it's all in the news at the moment.
00:06:00.480 | And so it's just a completely different model,
00:06:02.240 | which worked much, much better.
00:06:03.520 | - What is it that made that so accessible, so fun,
00:06:06.000 | so natural to just add something?
00:06:09.320 | - Well, I think it's, you know,
00:06:10.720 | especially in the early days,
00:06:12.000 | and this, by the way, has gotten much harder
00:06:13.760 | because there are fewer topics
00:06:15.080 | that are just greenfield, you know, available.
00:06:18.340 | But you know, you could say, oh, well, you know,
00:06:22.480 | I know a little bit about this and I can get it started.
00:06:26.560 | But then it is fun to come back then
00:06:28.480 | and see other people have added and improved
00:06:30.520 | and so on and so forth.
00:06:31.480 | And that idea of collaborating, you know,
00:06:34.160 | where people can, much like open source software,
00:06:36.740 | you know, you put your code out
00:06:39.320 | and then people suggest revisions and they change it
00:06:41.400 | and it modifies and it grows beyond the original creator.
00:06:45.880 | It's just a kind of a fun, wonderful, quite geeky hobby,
00:06:49.540 | but people enjoy it.
00:06:51.440 | - How much debate was there over the interface,
00:06:53.680 | over the details of how to make that
00:06:55.920 | seamless and frictionless?
00:06:57.200 | - Yeah, I mean, not as much as there probably
00:06:59.180 | should have been, in a way.
00:07:01.040 | During those two years of the failure of Nupedia,
00:07:04.400 | where very little work got done,
00:07:06.600 | what was actually productive was
00:07:08.600 | there was a huge, long discussion, email discussion,
00:07:11.540 | very clever people talking about things like neutrality,
00:07:15.200 | talking about what is an encyclopedia,
00:07:17.260 | but also talking about more technical ideas, you know,
00:07:19.440 | things back then XML was kind of all the rage
00:07:22.520 | and thinking about, ah, could we, you know,
00:07:25.200 | shouldn't you have certain data
00:07:28.500 | that might be in multiple articles
00:07:30.080 | that gets updated automatically?
00:07:31.320 | So, for example, you know, the population of New York City,
00:07:35.300 | every 10 years there's a new official census,
00:07:37.920 | couldn't you just update that bit of data in one place
00:07:41.240 | and it would update across all languages?
00:07:42.840 | That is a reality today, but back then it was just like,
00:07:45.240 | hmm, how do we do that, how do we think about that?
00:07:47.120 | - So that is a reality today where it's,
00:07:49.320 | there's some universal variables, Wikidata.
00:07:52.160 | - Yeah, Wikidata, you can link, you know,
00:07:56.200 | from a Wikipedia entry, you can link to that piece of data
00:07:59.000 | in Wikidata, I mean, it's a pretty advanced thing,
00:08:01.220 | but there are advanced users who are doing that.
00:08:03.000 | And then when that gets updated,
00:08:04.680 | it updates in all the languages where you've done that.
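To make that Wikidata mechanism concrete, here is a minimal Python sketch of reading one shared value, the population of New York City, through the public wbgetclaims API. The identifiers Q60 (New York City) and P1082 (population) and the claim-parsing details are my assumptions about the standard Wikidata layout, not anything specified in the conversation.

```python
# Minimal sketch: read the "population" statements for New York City from
# Wikidata via the public wbgetclaims API. Q60 / P1082 are assumed to be the
# item and property IDs for New York City and population; verify before use.
import requests

def wikidata_population(entity_id: str = "Q60", prop: str = "P1082"):
    resp = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={"action": "wbgetclaims", "entity": entity_id,
                "property": prop, "format": "json"},
        timeout=10,
    )
    resp.raise_for_status()
    claims = resp.json().get("claims", {}).get(prop, [])
    # Quantity values come back as strings such as "+8804190".
    return [c["mainsnak"]["datavalue"]["value"]["amount"]
            for c in claims if "datavalue" in c["mainsnak"]]

if __name__ == "__main__":
    print(wikidata_population())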
00:08:06.760 | - I mean, that's really interesting.
00:08:07.920 | There was this chain of emails in the early days
00:08:10.000 | of discussing the details of what is,
00:08:13.220 | so there's the interface, there's the--
00:08:14.920 | - Yeah, so the interface, so an example,
00:08:16.600 | there was some software called UseModWiki,
00:08:18.760 | which we started with, it's quite amusing actually,
00:08:22.560 | because the main reason we launched with UseModWiki
00:08:25.520 | is that it was a single Perl script.
00:08:28.160 | So it was really easy for me to install it on the server
00:08:30.420 | and just get running.
00:08:32.040 | But it was, you know, some guy's hobby project,
00:08:35.400 | it was cool, but it was just a hobby project.
00:08:37.760 | And all the data was stored in flat text files.
00:08:42.760 | So there was no real database behind it.
00:08:45.440 | So to search the site, you basically used grep,
00:08:49.320 | which is just a basic Unix utility
00:08:52.000 | to like look through all the files.
00:08:54.240 | So that clearly was never going to scale.
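As a toy illustration of that flat-file setup, the sketch below does roughly what the early grep-based search did: scan every page file in a directory for a query string. The directory name and file extension are hypothetical.

```python
# Toy sketch of "search = grep over flat files": scan every page file in a
# directory for the query, roughly like `grep -ril "query" wiki_pages/`.
# The wiki_pages directory and .txt extension are illustrative assumptions.
from pathlib import Path

def search_pages(query: str, page_dir: str = "wiki_pages"):
    query = query.lower()
    return [p.stem for p in Path(page_dir).glob("*.txt")
            if query in p.read_text(encoding="utf-8", errors="ignore").lower()]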
00:08:55.640 | But also, in the early days, it didn't have real logins.
00:09:00.120 | So you could set your username, but there were no passwords.
00:09:03.160 | So, you know, I might say Bob Smith,
00:09:06.880 | and then someone else comes along and says,
00:09:08.080 | oh, I'm Bob Smith, and they're both at it.
00:09:09.800 | Now that never really happened,
00:09:10.720 | we didn't have a problem with it,
00:09:11.560 | but it was kind of obvious, like you can't grow
00:09:13.160 | a big website where everybody can pretend to be everybody.
00:09:15.280 | That's not gonna be good for trust
00:09:17.720 | and reputation and so forth.
00:09:19.440 | So quickly, I had to write a little, you know,
00:09:21.800 | login, store people's passwords and things like that,
00:09:24.480 | so you can have unique identities.
00:09:26.400 | And then another example of something, you know,
00:09:28.400 | quite you would have never thought
00:09:30.040 | would have been a good idea,
00:09:31.080 | and it turned out to not be a problem,
00:09:32.400 | but to make a link in Wikipedia, in the early days,
00:09:35.980 | you would make a link to a page that may or may not exist
00:09:39.720 | by just using camel case, meaning it's like uppercase,
00:09:43.400 | lowercase, and you smash the words together.
00:09:45.120 | So maybe New York City, you might type N-E-W,
00:09:50.000 | no space, capital Y, York City,
00:09:53.520 | and that would make a link.
00:09:54.360 | But that was ugly, that was clearly not right.
00:09:56.280 | And so I was like, okay, well,
00:09:58.120 | that's just not gonna look nice.
00:10:00.320 | Let's just use square brackets,
00:10:01.680 | two square brackets makes a link.
00:10:04.200 | That may have been an option in the software,
00:10:05.560 | I'm not sure I thought up square brackets,
00:10:07.000 | but anyway, we just did that, which worked really well.
00:10:10.280 | It makes nice links, and you know,
00:10:11.600 | you can see in its red links or blue links,
00:10:13.720 | depending on if the page exists or not.
00:10:15.680 | But the thing that didn't occur to me even think about
00:10:18.400 | is that, for example, on the German language
00:10:21.240 | standard keyboard, there is no square bracket.
00:10:24.200 | So for German Wikipedia to succeed,
00:10:26.280 | people had to learn to do some alt codes
00:10:28.280 | to get the square bracket, or a lot of users
00:10:30.800 | cut and paste a square bracket when they could find one,
00:10:33.720 | they would just cut and paste one in.
00:10:35.480 | And yet, German Wikipedia has been a massive success,
00:10:37.520 | so somehow that didn't slow people down.
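Here is a rough sketch of the double-square-bracket syntax being described: turn [[Target]] or [[Target|label]] into a blue link when the page exists and a red link when it does not. The CSS class names and the page_exists lookup are illustrative stand-ins, not MediaWiki's actual implementation.

```python
# Rough sketch of [[double square bracket]] links: render [[Target]] or
# [[Target|label]] as a blue link if the page exists and a red link if not.
# The CSS class names and page_exists lookup are illustrative stand-ins.
import re

WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|([^\]]+))?\]\]")

def render_links(wikitext: str, page_exists) -> str:
    def replace(m: re.Match) -> str:
        target = m.group(1).strip()
        label = (m.group(2) or target).strip()
        css = "bluelink" if page_exists(target) else "redlink"
        href = "/wiki/" + target.replace(" ", "_")
        return f'<a class="{css}" href="{href}">{label}</a>'
    return WIKILINK.sub(replace, wikitext)

# Example: only "Africa" exists in this toy page table.
print(render_links("[[Africa]] is a continent near [[Atlantis]].",
                   lambda title: title == "Africa"))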
00:10:40.120 | - How is it that the German keyboards
00:10:42.400 | don't have a square bracket?
00:10:43.560 | How do you do programming?
00:10:44.720 | How do you live life to its fullest without square brackets?
00:10:48.880 | - Very good question, I'm not really sure.
00:10:50.880 | I mean, maybe it does now, because keyboard standards
00:10:53.240 | have drifted over time and becomes useful
00:10:56.840 | to have a certain character.
00:10:57.680 | I mean, it's the same thing, like there's not really
00:10:59.640 | a W character in Italian.
00:11:01.600 | And it wasn't on keyboards, or I think it is now,
00:11:04.760 | but in general, W is not a letter in Italian language,
00:11:08.040 | but it appears in enough international words
00:11:10.600 | that it's crept into Italian.
00:11:12.560 | - And all of these things are probably
00:11:14.480 | Wikipedia articles in themselves.
00:11:17.160 | - Oh yeah, oh yeah.
00:11:18.000 | - The discussion of square brackets in German.
00:11:18.840 | - The whole discussion, I'm sure.
00:11:20.200 | - On both the English and the German Wikipedia,
00:11:23.280 | and the difference between those two
00:11:24.960 | might be very interesting.
00:11:28.000 | So Wikidata's fascinating, but even the broader discussion
00:11:31.800 | of what is an encyclopedia, can you go to that
00:11:35.480 | sort of philosophical question of what is an encyclopedia?
00:11:39.640 | - What is an encyclopedia?
00:11:40.480 | So the way I would put it is an encyclopedia,
00:11:44.640 | what our goal is, is the sum of all human knowledge,
00:11:48.700 | but sum meaning summary.
00:11:50.880 | So, and this was an early debate,
00:11:53.740 | I mean, somebody started uploading the full text
00:11:57.160 | of Hamlet, for example, and we said,
00:11:59.160 | hmm, wait, hold on a second,
00:12:00.880 | that's not an encyclopedia article, but why not?
00:12:03.280 | So hence was born Wikisource,
00:12:06.400 | which is where you put original texts
00:12:08.760 | and things like that out of copyright text,
00:12:11.200 | because they said, no, an encyclopedia article
00:12:13.320 | about Hamlet, that's a perfectly valid thing,
00:12:15.080 | but the actual text of the play
00:12:16.360 | is not an encyclopedia article.
00:12:18.520 | So most of it's fairly obvious,
00:12:20.420 | but there are some interesting quirks and differences.
00:12:23.080 | So for example, as I understand it,
00:12:26.200 | in French language encyclopedias,
00:12:29.700 | traditionally, it would be quite common to have recipes,
00:12:33.120 | which in English language, that would be unusual.
00:12:34.840 | You wouldn't find a recipe for chocolate cake in Britannica.
00:12:38.940 | And so I actually don't know the current state,
00:12:41.980 | haven't even thought about that in many, many years now.
00:12:44.620 | - State of cake recipes in Wikipedia, in English Wikipedia?
00:12:47.680 | - I wouldn't say there's chocolate cake recipes.
00:12:49.560 | I mean, you might find a sample recipe somewhere.
00:12:52.220 | I'm not saying there are none, but in general, no.
00:12:54.160 | Like we wouldn't have recipes.
00:12:55.440 | - I told myself I would not get outraged
00:12:57.000 | in this conversation, but now I'm outraged.
00:12:59.500 | I'm deeply upset.
00:13:00.380 | - It's actually very complicated.
00:13:02.240 | I love to cook.
00:13:03.180 | I'm actually quite a good cook.
00:13:06.460 | What's interesting is it's very hard
00:13:09.320 | to have a neutral recipe because--
00:13:12.000 | - Like a canonical recipe for cake, chocolate cake.
00:13:13.600 | - A canonical recipe is kind of difficult to come by,
00:13:17.000 | because there's so many variants
00:13:18.360 | and it's all debatable and interesting.
00:13:20.800 | For something like chocolate cake, you could probably say,
00:13:23.200 | here's one of the earliest recipes,
00:13:24.640 | or here's one of the most common recipes.
00:13:26.440 | But for many, many things, the variants are as interesting
00:13:31.440 | as somebody said to me recently,
00:13:36.020 | 10 Spaniards, 12 paella recipes.
00:13:39.720 | So these are all matters of open discussion.
00:13:44.240 | - Well, just to throw some numbers, as of May 27th, 2023,
00:13:49.240 | there are 6.66 million articles in the English Wikipedia
00:13:54.760 | containing over 4.3 billion words. Including all pages,
00:14:01.000 | the total number of pages is 58 million.
00:14:06.000 | Does that blow your mind?
00:14:07.760 | - I mean, yes, it does.
00:14:09.120 | I mean, it doesn't because I know those numbers
00:14:11.200 | and see them from time to time.
00:14:13.240 | But in another sense, a deeper sense, yeah, it does.
00:14:16.120 | I mean, it's really remarkable.
00:14:19.040 | I remember when English Wikipedia passed 100,000 articles
00:14:24.040 | and when German Wikipedia passed 100,000,
00:14:26.200 | 'cause I happened to be in Germany
00:14:27.740 | with a bunch of Wikipedians that night.
00:14:29.360 | And then it seemed quite big.
00:14:32.640 | I mean, we knew at that time
00:14:34.020 | that it was nowhere near complete.
00:14:36.800 | I remember at Wikimania in Harvard
00:14:41.240 | when we did our annual conference there in Boston,
00:14:44.240 | someone who had come to the conference from Poland
00:14:49.800 | had brought along with him a small encyclopedia,
00:14:54.800 | a single volume encyclopedia of biographies.
00:14:59.160 | So short biographies, normally a paragraph or so
00:15:02.000 | about famous people in Poland.
00:15:03.920 | And there were some 22,000 entries.
00:15:07.240 | And he pointed out that even then, 2006,
00:15:10.040 | Wikipedia felt quite big.
00:15:12.040 | And he said, "In English Wikipedia,
00:15:13.520 | there's only a handful of these,
00:15:15.960 | less than 10%, I think he said."
00:15:18.280 | And so then you realize, yeah, actually,
00:15:21.040 | who was the mayor of Warsaw in 1873?
00:15:26.040 | Don't know, probably not in English Wikipedia,
00:15:28.400 | but it probably might be today.
00:15:30.140 | But there's so much out there.
00:15:33.120 | And of course, what we get into
00:15:34.400 | when we're talking about how many entries there are
00:15:36.960 | and how many could there be,
00:15:39.940 | is this very deep philosophical issue of notability,
00:15:43.160 | which is the question of, well,
00:15:45.600 | how do you draw the limit?
00:15:47.880 | How do you draw what is there?
00:15:51.040 | So sometimes people say, oh, there should be no limit,
00:15:54.280 | but I think that doesn't stand up to much scrutiny
00:15:56.000 | if you really pause and think about it.
00:15:57.400 | So I see in your hand there, you've got a BIC pen,
00:16:01.800 | pretty standard, everybody's seen
00:16:04.000 | billions of those in life.
00:16:05.000 | - Classic though.
00:16:05.840 | - It's a classic, clear BIC pen.
00:16:08.080 | So could we have an entry about that BIC pen?
00:16:09.980 | Well, I bet we do, that type of BIC pen,
00:16:13.100 | because it's classic, everybody knows it,
00:16:14.700 | and it's got a history.
00:16:15.620 | And actually, there's something interesting
00:16:17.940 | about the BIC company.
00:16:19.500 | They make pens, they also make kayaks,
00:16:22.940 | and there's something else there for them.
00:16:23.940 | So basically, they're sort of a definition
00:16:27.060 | by non-essentials company.
00:16:28.740 | Anything that's long and plastic, that's what they make.
00:16:32.460 | - Wow.
00:16:33.300 | (Lex laughs)
00:16:34.140 | - So if you wanna find the common ground.
00:16:36.420 | - The platonic form of a BIC.
00:16:37.860 | - But could we have an article about that very BIC pen
00:16:41.480 | in your hand?
00:16:42.760 | So Lex Fridman's BIC pen as of this week.
00:16:44.760 | - Oh, the very, this instance.
00:16:45.600 | - The very specific instance.
00:16:47.760 | And the answer is no, there's not much known about it,
00:16:50.540 | I dare say, unless it's very special to you
00:16:53.960 | and your great-grandmother gave it to you or something,
00:16:55.720 | you probably know very little about it.
00:16:56.920 | It's a pen, it's just here in the office.
00:16:59.720 | So that's just to show there is a limit.
00:17:03.400 | I mean, in German Wikipedia, they used to talk about
00:17:06.120 | the rear nut of the wheel of Uli Fuchs' bicycle,
00:17:10.840 | Uli Fuchs, the well-known Wikipedian of the time,
00:17:14.320 | to sort of illustrate, like you can't have an article
00:17:15.960 | about literally everything.
00:17:17.140 | And so then it raises the question,
00:17:18.900 | what can you have an article about, what can't you?
00:17:21.280 | And that can vary depending on the subject matter.
00:17:24.060 | One of the areas where we try to be much more careful
00:17:29.520 | would be biographies.
00:17:31.520 | The reason is a biography of a living person,
00:17:34.720 | if you get it wrong, it can actually be quite hurtful,
00:17:37.160 | quite damaging.
00:17:38.000 | And so if someone is a private person
00:17:41.200 | and somebody tries to create a Wikipedia entry,
00:17:44.480 | there's no way to update it, there's not much known.
00:17:46.020 | So for example, an encyclopedia article about my mother,
00:17:49.660 | my mother, school teacher, later a pharmacist,
00:17:52.280 | wonderful woman, but never been in the news.
00:17:55.600 | I mean, other than me talking about
00:17:56.760 | why there shouldn't be a Wikipedia entry,
00:17:57.960 | that's probably made it in somewhere, standard example.
00:18:00.640 | But there's not enough known.
00:18:02.960 | And you could sort of imagine a database of genealogy,
00:18:07.960 | having date of birth, date of death,
00:18:09.760 | and certain elements like that of private people,
00:18:12.980 | but you couldn't really write a biography.
00:18:15.360 | One of the areas this comes up quite often
00:18:17.240 | is what we call BLP1A, we've got lots of acronyms.
00:18:21.920 | Biography of a living person
00:18:23.440 | who's notable for only one event.
00:18:25.520 | There's a real sort of danger zone.
00:18:28.140 | And the type of example would be a victim of a crime.
00:18:31.640 | So someone who's a victim of a famous serial killer,
00:18:35.120 | but about whom, like really not much is known.
00:18:37.320 | They weren't a public person,
00:18:38.320 | they're just a victim of a crime.
00:18:39.920 | We really shouldn't have an article about that person.
00:18:42.040 | They'll be mentioned, of course,
00:18:43.320 | and maybe the specific crime might have an article.
00:18:46.760 | But for that person, no, not really.
00:18:49.560 | That's not really something that makes any sense,
00:18:52.140 | because how can you write a biography
00:18:54.200 | about someone you don't know much about?
00:18:56.400 | And this is, it varies from field to field.
00:18:59.420 | So for example, for many academics,
00:19:01.780 | we will have an entry that we might not have
00:19:05.160 | in a different context, because for an academic,
00:19:08.440 | it's important to have sort of their career,
00:19:11.760 | what papers they've published, things like that.
00:19:13.840 | You may not know anything about their personal life,
00:19:15.600 | but that's actually not encyclopedically relevant
00:19:18.040 | in the same way that it is for a member of a royal family,
00:19:21.240 | where it's basically all about the family.
00:19:23.400 | So we're fairly nuanced about notability
00:19:27.340 | and where it comes in.
00:19:28.800 | And I've always thought that the term notability,
00:19:32.980 | I think, is a little problematic.
00:19:34.380 | I mean, we struggle about how to talk about it.
00:19:38.840 | The problem with notability is it can feel insulting
00:19:42.620 | to say, oh no, you're not noteworthy.
00:19:45.080 | Well, my mother's noteworthy.
00:19:46.080 | She's a really important person in my life, right?
00:19:48.080 | So that's not right.
00:19:49.220 | But it's more like verifiability.
00:19:51.100 | Is there a way to get information
00:19:53.300 | that actually makes an encyclopedia entry?
00:19:56.120 | - It so happens that there's a Wikipedia page about me,
00:19:59.660 | as I've learned recently.
00:20:01.660 | And the first thought I had when I saw that was,
00:20:06.040 | surely I am not notable enough.
00:20:10.320 | So I was very surprised and grateful
00:20:12.860 | that such a page could exist.
00:20:14.240 | And actually just allow me to say thank you
00:20:16.580 | to all the incredible people that are part of creating
00:20:20.360 | and maintaining Wikipedia.
00:20:22.820 | It's my favorite website on the internet,
00:20:24.620 | the collection of articles that Wikipedia has created.
00:20:27.040 | It's just incredible.
00:20:28.140 | We'll talk about the various details of that.
00:20:31.480 | But the love and care that goes into creating pages
00:20:36.480 | for individuals, for a big pen,
00:20:40.420 | for all this kind of stuff, is just really incredible.
00:20:43.060 | So I just felt the love when I saw that page.
00:20:46.300 | But I also felt, just 'cause I do this podcast,
00:20:48.820 | and I just through this podcast gotten to know
00:20:51.600 | a few individuals that are quite controversial.
00:20:54.740 | I've gotten to be on the receiving end of something quite,
00:20:58.060 | to me as a person who loves other human beings,
00:21:02.440 | I've gotten to be at the receiving end of some kind of attacks
00:21:07.080 | through the Wikipedia form.
00:21:09.160 | Like you said, when you look at living individuals,
00:21:12.120 | it can be quite hurtful.
00:21:13.600 | The little details of information.
00:21:15.540 | And because I've become friends with Elon Musk,
00:21:20.220 | and I've interviewed him,
00:21:22.600 | but I've also interviewed people on the left, far left,
00:21:26.920 | people on the right, some people would say far right.
00:21:30.140 | And so now you take a step,
00:21:32.860 | you put your toe into the cold pool of politics,
00:21:37.240 | and the shark emerges from the depths
00:21:39.680 | and pulls you right in.
00:21:40.840 | - Boiling hot pool of politics.
00:21:43.840 | - I guess it's hot.
00:21:45.280 | And so I got to experience some of that.
00:21:48.760 | I think what you also realize is there has to be
00:21:53.760 | for Wikipedia kind of credible sources, verifiable sources.
00:21:58.960 | And there's a dance there because some of the sources
00:22:02.400 | are pieces of journalism.
00:22:04.960 | And of course, journalism operates
00:22:06.560 | under its own complicated incentives,
00:22:09.920 | such that people can write articles that are not factual,
00:22:13.520 | or are cherry picking all the flaws
00:22:16.600 | that you can have in a journalistic article.
00:22:18.640 | And those can be used as sources.
00:22:22.440 | It's like they dance hand in hand.
00:22:24.960 | And so for me, sadly enough,
00:22:28.320 | there was a really kind of concerted attack
00:22:31.240 | to say that I was never at MIT.
00:22:33.400 | I never did anything at MIT.
00:22:35.220 | Just to clarify, I am a research scientist at MIT.
00:22:39.160 | I have been there since 2015.
00:22:41.640 | I'm there today.
00:22:42.960 | I'm at a prestigious, amazing laboratory called LIDS.
00:22:47.640 | And I hope to be there for a long time.
00:22:49.600 | I work on AI, robotics, machine learning.
00:22:51.440 | There's a lot of incredible people there.
00:22:53.880 | And by the way, MIT has been very kind to defend me.
00:22:57.480 | Unlike Wikipedia says, it is not an unpaid position.
00:23:01.840 | There was no controversy.
00:23:03.660 | It was all very calm and happy and almost boring research
00:23:08.660 | that I've been doing there.
00:23:10.520 | And the other thing, because I am half Ukrainian,
00:23:13.880 | half Russian, and I've traveled to Ukraine,
00:23:16.800 | and I will travel to Ukraine again,
00:23:18.720 | and I will travel to Russia
00:23:21.320 | for some very difficult conversations.
00:23:23.400 | My heart's been broken by this war.
00:23:25.200 | I have family in both places.
00:23:26.560 | It's been a really difficult time.
00:23:28.600 | But the little battle about the biography there
00:23:32.400 | also starts becoming important for the first time for me.
00:23:36.380 | I also wanna clarify sort of personally,
00:23:39.280 | I use this opportunity of some inaccuracies there.
00:23:42.960 | My father was not born in Chkalovsk, Russia.
00:23:47.040 | He was born in Kiev, Ukraine.
00:23:49.100 | I was born in Chkalovsk, which is a town not in Russia.
00:23:53.820 | There is a town called that in Russia,
00:23:55.640 | but there's another town in Tajikistan,
00:23:58.040 | which is a former Republic of the Soviet Union.
00:24:00.760 | That town is now called B-U-S-T-O-N,
00:24:05.280 | Buston, which is funny 'cause we're now in Austin,
00:24:09.200 | and I also am in Boston.
00:24:10.840 | It seems like my whole life is surrounded
00:24:12.480 | by these kinds of towns.
00:24:13.840 | So I was born in Tajikistan,
00:24:15.560 | and the rest of the biography is interesting,
00:24:17.480 | but my family is very evenly distributed
00:24:21.640 | between their origins and where they grew up,
00:24:23.640 | between Ukraine and Russia,
00:24:24.840 | which adds a whole beautiful complexity
00:24:27.400 | to this whole thing.
00:24:28.240 | So I wanna just correct that.
00:24:30.920 | It's like the fascinating thing about Wikipedia
00:24:34.640 | is in some sense, those little details don't matter.
00:24:38.200 | But in another sense, what I felt
00:24:40.000 | when I saw a Wikipedia page about me
00:24:41.680 | or anybody I know, there's this beautiful
00:24:44.640 | kind of saving that this person existed,
00:24:48.660 | like a community that notices you.
00:24:52.320 | It says, "Huh," like a little,
00:24:54.800 | you see like a butterfly that floats,
00:24:57.000 | and you're like, "Huh."
00:24:58.440 | It's not just any butterfly, it's that one.
00:25:00.520 | I like that one.
00:25:01.440 | Or you see a puppy or something,
00:25:02.720 | or it's this BIC pen.
00:25:05.280 | This one, I remember this one, it has this scratch,
00:25:07.800 | and you get noticed in that way.
00:25:09.920 | I don't know, it's a beautiful thing
00:25:11.580 | and it's, I mean, maybe it's very silly of me and naive,
00:25:16.580 | but I feel like Wikipedia, in terms of individuals,
00:25:20.680 | is an opportunity to celebrate people,
00:25:24.880 | to celebrate ideas. - For sure, for sure.
00:25:26.760 | - And not a battleground of attacks,
00:25:29.580 | of the kind of stuff we might see on Twitter,
00:25:32.000 | like the mockery, the derision, this kind of stuff.
00:25:35.340 | - For sure.
00:25:36.180 | - And of course, you don't wanna cherry pick,
00:25:37.800 | all of us have flaws and so on,
00:25:39.860 | but it just feels like to highlight a controversy
00:25:44.800 | of some sort, one that doesn't at all represent
00:25:47.640 | the entirety of the human, in most cases, is sad.
00:25:50.800 | - Yeah, yeah, yeah.
00:25:51.720 | So there's a few things to unpack in all that.
00:25:55.200 | So first, one of the things I find really,
00:25:58.240 | always find very interesting is your status with MIT.
00:26:03.240 | Okay, that's upsetting and it's an argument
00:26:06.560 | and can be sorted out.
00:26:08.660 | But then what's interesting is you gave as much time
00:26:11.140 | to that, which is actually important and relevant
00:26:13.360 | to your career and so on, to also where your father
00:26:16.800 | was born, which most people would hardly notice,
00:26:19.080 | but is really meaningful to you.
00:26:21.040 | And I find that a lot when I talk to people
00:26:22.760 | who have a biography in Wikipedia,
00:26:24.960 | is they're often as annoyed by a tiny error
00:26:28.520 | that no one's gonna notice, like this town in Tajikistan
00:26:32.640 | has got a new name and so on.
00:26:34.240 | Nobody even knows what that means or whatever,
00:26:35.840 | but it can be super important.
00:26:38.120 | And so that's one of the reasons for biographies,
00:26:42.600 | we say like human dignity really matters.
00:26:46.120 | And so some of the things have to do with,
00:26:49.520 | and this is a common debate that goes on in Wikipedia,
00:26:53.080 | is what we call undue weight.
00:26:55.480 | So I'll give an example.
00:26:57.240 | There was a article I stumbled across many years ago
00:27:02.120 | about the mayor, or no, he wasn't a mayor,
00:27:05.880 | he was a city council member of,
00:27:08.640 | I think it was Peoria, Illinois,
00:27:10.360 | but some small town in the Midwest.
00:27:13.360 | And the entry, he's been on the city council
00:27:15.860 | for 30 years or whatever, he's pretty,
00:27:17.960 | I mean, frankly, pretty boring guy
00:27:19.440 | and seems like a good local city politician.
00:27:22.400 | But in this very short biography,
00:27:24.040 | there was a whole paragraph, a long paragraph,
00:27:26.680 | about his son being arrested for DUI.
00:27:31.360 | And it was clearly undue weight.
00:27:33.080 | It's like, what has this got to do with this guy
00:27:34.840 | if it even deserves a mention?
00:27:36.160 | It wasn't even clear, had he done anything hypocritical?
00:27:40.440 | Had he done himself anything wrong?
00:27:42.840 | Even was his son, his son got a DUI, that's never great,
00:27:45.440 | but it happens to people and it doesn't seem
00:27:47.240 | like a massive scandal for your dad.
00:27:49.540 | So of course, I just took that out immediately.
00:27:51.480 | This is a long, long time ago.
00:27:53.480 | And that's the sort of thing where, you know,
00:27:56.600 | we have to really think about in a biography
00:27:59.560 | and about controversies to say,
00:28:01.880 | is this a real controversy?
00:28:03.200 | So in general, like one of the things we tend to say
00:28:07.440 | is like any section, so if there's a biography
00:28:11.360 | and there's a section called controversies,
00:28:14.000 | that's actually poor practice
00:28:16.040 | because it just invites people to say,
00:28:18.880 | oh, I wanna work on this entry.
00:28:20.840 | See, there's seven sections.
00:28:21.920 | Oh, this one's quite short, can I add something?
00:28:24.040 | Go out and find some more controversies.
00:28:25.360 | No, that's nonsense, right?
00:28:26.480 | And in general, putting it separate from everything else
00:28:29.840 | kind of makes it seem worse
00:28:30.920 | and also doesn't put it in the right context.
00:28:33.080 | Whereas if it's sort of a life flow
00:28:34.320 | and there is a controversy,
00:28:35.240 | there's always potential controversy for anyone,
00:28:38.560 | it should just be sort of worked into the overall article
00:28:41.560 | because then it doesn't become a temptation.
00:28:43.120 | You can contextualize appropriately and so forth.
00:28:46.640 | So that's, you know,
00:28:47.860 | that's part of the whole process.
00:28:53.680 | But I think for me, one of the most important things
00:28:56.880 | is what I call community health.
00:28:59.920 | So yeah, are we gonna get it wrong sometimes?
00:29:02.080 | Yeah, of course.
00:29:03.080 | We're humans and doing good quality,
00:29:05.880 | you know, sort of reference material is hard.
00:29:08.800 | The real question is how do people react,
00:29:11.800 | you know, to a criticism or a complaint or a concern?
00:29:15.640 | And if the reaction is defensiveness or combativeness back,
00:29:20.640 | or if someone's really sort of in there being aggressive
00:29:24.680 | and in the wrong, like, no, no, no, hold on,
00:29:27.680 | we've gotta do this the right way.
00:29:29.240 | You gotta say, okay, hold on, you know,
00:29:30.800 | are there good sources?
00:29:32.280 | Is this contextualized appropriately?
00:29:33.840 | Is it even important enough to mention?
00:29:35.800 | What does it mean?
00:29:37.920 | And sometimes one of the areas where I do think
00:29:43.000 | there is a very complicated flaw,
00:29:45.600 | and you've alluded to it a little bit,
00:29:47.640 | but it's like, we know the media is deeply flawed.
00:29:51.600 | We know that journalism can go wrong.
00:29:54.480 | And I would say, particularly in the last,
00:29:56.320 | whatever, 15 years,
00:29:58.000 | we've seen a real decimation of local media,
00:30:01.360 | local newspapers.
00:30:03.240 | We've seen a real rise in clickbait headlines
00:30:06.160 | and sort of eager focus on anything
00:30:08.840 | that might be controversial.
00:30:10.080 | We've always had that with us, of course.
00:30:11.320 | There's always been tabloid newspapers.
00:30:13.640 | But that makes it a little bit more challenging to say,
00:30:16.120 | okay, how do we sort things out
00:30:19.440 | when we have a pretty good sense
00:30:20.880 | that not every source is valid?
00:30:24.600 | So as an example,
00:30:27.640 | a few years ago, it's been quite a while now,
00:30:31.000 | we deprecated the Mail Online as a source.
00:30:36.000 | And the Mail Online, the digital arm of the Daily Mail,
00:30:40.640 | it's a tabloid.
00:30:41.680 | It's not completely, it's not fake news,
00:30:46.680 | but it does tend to run very hyped up stories.
00:30:49.840 | They really love to attack people
00:30:51.920 | and go on the attack for political reasons and so on.
00:30:54.760 | And it just isn't great.
00:30:55.840 | And so by saying deprecated,
00:30:57.960 | and I think some people say,
00:30:59.320 | oh, you banned the Daily Mail.
00:31:00.400 | No, we didn't ban it as a source.
00:31:01.800 | We just said, look, it's probably not a great source.
00:31:04.800 | You should probably look for a better source.
00:31:06.120 | So certainly, if the Daily Mail runs a headline saying,
00:31:09.320 | new cure for cancer, it's like,
00:31:13.800 | probably there's more serious sources
00:31:16.600 | than a tabloid newspaper.
00:31:17.760 | So in an article about lung cancer,
00:31:21.360 | you probably wouldn't cite the Daily Mail.
00:31:23.160 | That's kind of ridiculous.
00:31:24.600 | But also for celebrities and so forth,
00:31:27.160 | to sort of know, well,
00:31:28.080 | they do cover celebrity gossip a lot,
00:31:30.280 | but they also tend to have vendettas and so forth.
00:31:32.520 | And you really have to step back and go,
00:31:34.640 | is this really encyclopedic
00:31:36.200 | or is this just the Daily Mail going on a rant?
00:31:38.840 | - And some of that requires a great community health.
00:31:41.520 | - It requires massive community health.
00:31:43.200 | - Even for me, for stuff I've seen that's kind of,
00:31:45.880 | if actually iffy about people I know,
00:31:47.960 | things I know about myself,
00:31:50.600 | I still feel like a love for knowledge
00:31:55.600 | emanating from the article.
00:31:58.800 | I feel the community health.
00:32:01.560 | So I will take all slight inaccuracies.
00:32:05.600 | I love it because that means there's people,
00:32:09.920 | for the most part, I feel a respect and love
00:32:13.360 | in the search for knowledge.
00:32:14.840 | Like sometimes, 'cause I also love Stack Overflow,
00:32:17.920 | Stack Exchange for programming-related things.
00:32:20.120 | And they can get a little cranky sometimes
00:32:22.400 | to a degree where it's like,
00:32:23.920 | it's not as, like you can feel the dynamics
00:32:29.440 | of the health of the particular community
00:32:32.040 | and sub-communities too,
00:32:33.680 | like a particular C# or Java or Python or whatever.
00:32:37.800 | There's little communities that emerge.
00:32:40.200 | You can feel the levels of toxicity
00:32:42.640 | 'cause a little bit of strictness is good,
00:32:45.520 | but a little too much is bad because of the defensiveness.
00:32:48.920 | 'Cause when somebody writes an answer
00:32:51.120 | and then somebody else kind of says,
00:32:52.520 | well, modify it and they get defensive
00:32:54.280 | and there's this tension that's not conducive
00:32:57.400 | to improving towards a more truthful depiction
00:33:01.040 | of that topic.
00:33:02.480 | - Yeah, a great example that I really loved
00:33:05.440 | this morning that I saw,
00:33:07.480 | someone left a note on my user talk page
00:33:10.000 | in English Wikipedia saying,
00:33:12.000 | it was quite a dramatic headline,
00:33:14.160 | racist hook on front page.
00:33:16.280 | So we have on the front page of Wikipedia,
00:33:17.520 | we have a little section called, did you know?
00:33:19.880 | And it's just little tidbits and facts,
00:33:21.480 | just things people find interesting.
00:33:22.640 | And there's a whole process for how things get there.
00:33:25.160 | And the one that somebody was raising a question about was,
00:33:29.280 | it was comparing a very well-known US football player, black.
00:33:34.280 | There was a quote from another famous sport person
00:33:39.440 | comparing him to a Lamborghini, clearly a compliment.
00:33:43.800 | And so somebody said, actually, here's a study,
00:33:47.400 | here's some interesting information
00:33:48.600 | about how black sports people are far more often compared
00:33:53.600 | to inanimate objects and given that kind of analogy.
00:33:57.040 | And I think it's demeaning to compare a person to a car,
00:34:00.680 | et cetera, et cetera.
00:34:01.600 | But they said, I'm not deleting it, I'm not removing it,
00:34:05.480 | I just wanna raise the question.
00:34:07.080 | And then there's this really interesting conversation
00:34:09.160 | that goes on where I think the general consensus was,
00:34:13.040 | you know what, this isn't like the alarming headline,
00:34:17.040 | racist thing on the front page of Wikipedia,
00:34:18.920 | that sounds, holy moly, that sounds bad.
00:34:20.960 | But it's sort of like, actually, yeah,
00:34:22.320 | this probably isn't the sort of analogy
00:34:24.480 | that we think is great.
00:34:26.040 | And so we should probably think about
00:34:27.480 | how to improve our language and not compare sports people
00:34:30.120 | to inanimate objects and particularly be aware
00:34:32.840 | of certain racial sensitivities that there might be
00:34:36.400 | around that sort of thing if there is a disparity
00:34:38.640 | in the media of how people are called.
00:34:40.520 | And I just thought, you know what,
00:34:41.960 | nothing for me to weigh in on here,
00:34:43.360 | this is a good conversation.
00:34:44.800 | Like, nobody's saying people should be banned
00:34:48.480 | if they refer to, what was his name,
00:34:51.400 | the fridge, Refrigerator Perry,
00:34:53.040 | the very famous comparison to an inanimate object
00:34:56.840 | of a Chicago Bears player many years ago.
00:34:59.480 | But they're just saying, hey, let's be careful about
00:35:01.640 | analogies that we just pick up from the media.
00:35:03.920 | I said, yeah, you know, that's good.
00:35:06.480 | - On the sort of deprecation of news sources,
00:35:09.000 | really interesting because I think what you're saying
00:35:11.320 | is ultimately you wanna make a article by article decision.
00:35:15.160 | Kind of use your own judgment.
00:35:17.480 | And it's such a subtle thing because
00:35:19.600 | there's just a lot of hit pieces written about
00:35:27.320 | individuals like myself, for example,
00:35:29.360 | that masquerade as kind of an objective,
00:35:33.280 | thorough exploration of a human being.
00:35:35.440 | It's fascinating to watch because controversy
00:35:38.440 | and hit pieces just get more clicks.
00:35:41.160 | - Oh yeah, for sure.
00:35:42.280 | I guess as a Wikipedia contributor,
00:35:45.360 | you start to deeply become aware of that
00:35:49.960 | and start to have a sense,
00:35:51.120 | like a radar of clickbait versus truth.
00:35:54.200 | Like to pick out the truth
00:35:55.840 | from the clickbaity type language.
00:35:58.160 | - Oh yeah, I mean, it's really important.
00:36:00.640 | And we talk a lot about weasel words.
00:36:03.800 | Actually, I'm sure we'll end up talking about AI
00:36:09.920 | and Chat GPT, but just to quickly mention in this area,
00:36:13.720 | I think one of the potentially powerful tools,
00:36:16.620 | because it is quite good at this,
00:36:20.400 | I've played around with and practiced it quite a lot,
00:36:23.720 | but Chat GPT-4 is really quite able to take a passage
00:36:28.720 | and point out potentially biased terms,
00:36:33.960 | to rewrite it to be more neutral.
00:36:37.200 | Now, it is a bit anodyne and it's a bit cliched.
00:36:41.800 | So sometimes it just takes the spirit out of something
00:36:44.440 | that's actually not bad.
00:36:45.500 | It's just like poetic language and you're like,
00:36:48.520 | well, okay, that's not actually helping.
00:36:50.280 | But in many cases,
00:36:51.760 | I think that sort of thing is quite interesting.
00:36:53.120 | And I'm also interested in,
00:36:54.520 | can you imagine where you feed in a Wikipedia entry
00:37:02.360 | and all the sources and you say,
00:37:06.400 | help me find anything in the article
00:37:08.040 | that is not accurately reflecting what's in the sources.
00:37:12.520 | And that doesn't have to be perfect.
00:37:14.400 | It only has to be good enough to be useful to community.
00:37:17.680 | So if it scans an article and all the sources and you say,
00:37:21.720 | oh, it came back with 10 suggestions
00:37:25.280 | and seven of them were decent
00:37:26.480 | and three of them it just didn't understand,
00:37:28.520 | well, actually that's probably worth my time to do.
00:37:30.880 | And it can help us really more quickly
00:37:36.720 | get good people to sort of review obscure entries
00:37:40.720 | and things like that.
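The review aid imagined here, feeding a passage plus its cited source to a model and asking it to flag unsupported claims, might look roughly like the sketch below. The model name, prompt wording, and treating the output purely as suggestions for a human editor are my assumptions; no such Wikipedia tool is described in the conversation.

```python
# Hedged sketch of the reviewing aid described above: ask an LLM which claims
# in an article passage are not supported by the quoted source text. The model
# name and prompt are assumptions; output is only a hint for human editors.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def flag_unsupported_claims(passage: str, source_text: str,
                            model: str = "gpt-4o") -> str:
    prompt = (
        "Compare the article passage with the source text. List any claims "
        "in the passage that the source does not support, or reply 'none "
        "found'. Do not add facts of your own.\n\n"
        f"ARTICLE PASSAGE:\n{passage}\n\nSOURCE TEXT:\n{source_text}"
    )
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content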
00:37:41.880 | - So just as a small aside on that,
00:37:44.040 | and we'll probably talk about language models a little bit
00:37:46.640 | or a lot more, but one of the articles,
00:37:50.160 | one of the hit pieces about me,
00:37:52.400 | the journalist actually was very straightforward
00:37:55.320 | and honest about having used GPT
00:37:58.120 | to write part of the article.
00:37:59.440 | - Oh, interesting.
00:38:00.280 | - And then finding that it made an error
00:38:01.840 | and apologize for the error that GPT-4 generated,
00:38:04.920 | which has this kind of interesting loop,
00:38:07.880 | which the articles are used to write Wikipedia pages.
00:38:12.000 | GPT is trained on Wikipedia.
00:38:13.960 | And there's like this interesting loop
00:38:18.280 | where the weasel words and the nuances can get lost
00:38:23.200 | or can propagate even though they're not grounded in reality.
00:38:27.880 | Somehow in the generation of the language model,
00:38:31.560 | new truths can be created and kind of linger.
00:38:35.280 | - Yeah, there's a famous web comic
00:38:38.480 | that's titled "Citogenesis,"
00:38:40.760 | which is about how something, an error's in Wikipedia,
00:38:45.760 | and there's no source for it,
00:38:47.280 | but then a lazy journalist reads it and writes the source.
00:38:50.840 | And then some helpful Wikipedian spots
00:38:53.040 | that it has no source, finds the source,
00:38:54.520 | and adds it to Wikipedia.
00:38:56.080 | And voila, magic.
00:38:57.200 | This happened to me once.
00:38:59.720 | Well, it nearly happened.
00:39:01.020 | There was this, I mean, it was really brief.
00:39:04.400 | I went back and researched it.
00:39:05.360 | I'm like, "This is really odd."
00:39:06.760 | So "Biography" magazine,
00:39:08.280 | which is a magazine published by the Biography TV channel,
00:39:11.460 | had a profile of me, and it said,
00:39:15.440 | "In his spare time," I'm not quoting exactly.
00:39:19.120 | I've spent many years, but,
00:39:20.000 | "In his spare time, he enjoys playing chess with friends."
00:39:22.960 | I thought, "Wow, that sounds great.
00:39:24.400 | Like, I would like to be that guy, but actually,
00:39:26.920 | I mean, I play chess with my kids sometimes,
00:39:28.720 | but no, it's not a hobby of mine."
00:39:31.360 | And I was like, "Where did they get that?"
00:39:36.360 | And I contacted the magazine.
00:39:37.720 | I said, "Where'd that come from?"
00:39:38.560 | They said, "Oh, it was in Wikipedia."
00:39:39.840 | I looked in the history.
00:39:41.000 | There had been vandalism of Wikipedia,
00:39:42.400 | which was not, you know, it's not damaging.
00:39:44.440 | It's just false.
00:39:46.480 | So, and it had already been removed.
00:39:48.880 | But then I thought, "Oh, gosh, well,
00:39:49.960 | I better mention this to people,
00:39:51.160 | because otherwise, somebody's gonna read that,
00:39:53.000 | and they're gonna add it to the entry,
00:39:54.000 | and it's gonna take on a life of its own."
00:39:56.200 | And then sometimes I wonder if it has,
00:39:58.000 | because I've been, I was invited a few years ago
00:40:00.560 | to do the ceremonial first move
00:40:02.280 | in the World Chess Championship,
00:40:03.680 | and I thought, "I wonder if they think
00:40:05.160 | I'm a really big chess enthusiast,"
00:40:07.000 | because they read this biography magazine article.
00:40:09.720 | So, but that problem,
00:40:13.120 | when we think about large language models
00:40:14.640 | and the ability to quickly generate
00:40:16.800 | very plausible but not true content,
00:40:20.080 | I think is something that there's gonna be
00:40:21.880 | a lot of shakeout, a lot of implications of that.
00:40:25.120 | - What would be hilarious is,
00:40:26.840 | because of the social pressure of Wikipedia
00:40:29.560 | and the momentum, you would actually start
00:40:31.200 | playing a lot more chess.
00:40:32.840 | Just to, not only the articles are written
00:40:36.400 | based on Wikipedia, but your own life trajectory changes
00:40:40.360 | because of Wikipedia, just to make it more convenient.
00:40:43.560 | - Yeah. - Aspire to.
00:40:44.920 | - Aspire to, yeah, aspirational.
00:40:47.160 | - What, if we could just talk about that
00:40:50.600 | before we jump back to some other
00:40:53.160 | interesting topics on Wikipedia.
00:40:54.760 | Let's talk about GPT-4 and large language models.
00:40:58.200 | So they are, in part, trained on Wikipedia content.
00:41:01.560 | - Yeah.
00:41:02.400 | - What are the pros and cons of these language models?
00:41:06.640 | What are your thoughts?
00:41:07.480 | - Yeah, so, I mean, there's a lot of stuff going on.
00:41:10.040 | Obviously, the technology's moved very quickly
00:41:12.400 | in the last six months and looks poised to do so
00:41:15.200 | for some time to come.
00:41:16.520 | So, first things first, I mean,
00:41:20.320 | part of our philosophy is the open licensing,
00:41:24.600 | the free licensing, the idea that, you know,
00:41:27.480 | this is what we're here for.
00:41:28.640 | We are a volunteer community and we write this encyclopedia.
00:41:33.000 | We give it to the world to do what you like with.
00:41:37.080 | You can modify it, redistribute it,
00:41:39.400 | redistribute modified versions,
00:41:41.280 | commercially, non-commercially, this is the licensing.
00:41:44.320 | So in that sense, of course, it's completely fine.
00:41:46.440 | Now, we do worry a bit about attribution
00:41:48.400 | because it is a Creative Commons
00:41:51.920 | attribution share-alike license.
00:41:54.240 | So, attribution is important,
00:41:56.080 | not just because of our licensing model
00:41:57.680 | and things like that, but it's just,
00:41:59.440 | proper attribution is just good intellectual practice.
00:42:02.440 | And that's a really hard, complicated question.
00:42:07.120 | You know, if I were to write something
00:42:13.000 | about my visit here, I might say in a blog post,
00:42:16.440 | you know, I was in Austin, which is a city in Texas.
00:42:21.440 | I'm not gonna put a source for Austin as a city in Texas.
00:42:23.840 | That's just general knowledge.
00:42:24.920 | I learned it somewhere, I can't tell you where.
00:42:26.920 | So you don't have to cite and reference every single thing.
00:42:30.440 | But you know, if I actually did research
00:42:32.320 | and I used something very heavily,
00:42:33.720 | it's just morally proper to give your sources.
00:42:37.720 | So we would like to see that.
00:42:39.520 | And obviously, you know, they call it grounding.
00:42:43.760 | So particularly people at Google are really keen
00:42:46.040 | on figuring out grounding.
00:42:48.920 | - Such a cool term.
00:42:50.600 | Any text that's generated, trying to ground it
00:42:53.840 | to the Wikipedia quality source.
00:42:58.120 | I mean, like the same kind of standard
00:42:59.960 | of what a source means that Wikipedia uses,
00:43:02.600 | the same kind of source will be generated.
00:43:05.320 | - The same kind of thing.
00:43:06.160 | And of course, one of the biggest flaws
00:43:08.240 | in ChatGPT right now is that it just literally
00:43:12.840 | will make things up just to be amiable, I think.
00:43:17.000 | It's programmed to be very helpful and amiable.
00:43:19.400 | It doesn't really know or care about the truth.
00:43:21.600 | - And get bullied into, it can kind of be convinced into.
00:43:25.480 | - Well, but like this morning,
00:43:26.960 | the story I was telling earlier
00:43:29.320 | about comparing a football player to a Lamborghini,
00:43:32.680 | and I thought, "Is that really racial?
00:43:34.400 | "I don't know."
00:43:35.480 | But I'm just, I'm mulling it over.
00:43:36.640 | And I thought, "I'm gonna go to ChatGPT."
00:43:39.200 | So I sent to ChatGPT-4, I said,
00:43:41.140 | "You know, this happened in Wikipedia.
00:43:44.480 | "Can you think of examples where a white athlete
00:43:48.080 | "has been compared to a fast car inanimate object?"
00:43:53.000 | And it comes back with a very plausible essay
00:43:55.560 | where it tells why these analogies are common in sport.
00:43:58.640 | I said, "No, no, I really,
00:43:59.800 | "could you give me some specific examples?"
00:44:02.000 | So it gives me three specific examples, very plausible,
00:44:05.000 | correct names of athletes and contemporaries
00:44:07.520 | and all of that, could have been true.
00:44:09.160 | Googled every single quote and none of them existed.
00:44:12.000 | And so I'm like, "Well, that's really not good."
00:44:14.120 | Like, I wanted to explore a thought process
00:44:17.680 | I was in, I thought, first I thought,
00:44:19.680 | "How do I Google?"
00:44:20.560 | And it's like, "Well, it's kind of a hard thing to Google
00:44:21.960 | "'cause unless somebody's written about this specific topic,
00:44:24.120 | "it's, you know, it's a large language model,
00:44:26.660 | "it's processed all this data,
00:44:28.020 | "it can probably piece that together for me."
00:44:29.760 | But it just can't yet.
00:44:30.680 | So I think, I hope that ChatGPT-5, 6, 7,
00:44:35.680 | you know, three to five years,
00:44:39.280 | I'm hoping we'll see a much higher level of accuracy
00:44:45.880 | where when you ask a question like that,
00:44:48.240 | I think instead of being quite so eager to please
00:44:51.640 | by giving you a plausible sounding answer,
00:44:53.360 | it's just like, "Don't know."
00:44:55.040 | - Or maybe display how much bullshit
00:45:00.040 | might be in this generated text.
00:45:02.200 | Like, "I really would like to make you happy right now,
00:45:05.440 | "but I'm really stretched thin with this generation."
00:45:07.480 | - Well, it's one of the things I've said for a long time.
00:45:10.320 | So in Wikipedia, one of the great things we do
00:45:13.200 | may not be great for our reputation
00:45:14.880 | except in a deeper sense for the longterm, I think it is.
00:45:17.840 | But you know, we'll have a notice that says,
00:45:20.440 | "The neutrality of this section has been disputed,"
00:45:22.600 | or, "The following section doesn't cite any sources."
00:45:25.680 | And I always joke, you know,
00:45:28.200 | sometimes I wish the New York Times would run a banner
00:45:30.680 | saying, "The neutrality of this has been disputed."
00:45:32.600 | They could give us a note,
00:45:33.480 | we had a big fight in the newsroom
00:45:34.720 | as to whether to run this or not,
00:45:36.840 | but we thought it's important enough to bring it to you.
00:45:38.520 | But just be aware that not all the journalists
00:45:40.680 | are on board with it. "Oh, that's actually interesting,
00:45:42.280 | "and that's fine."
00:45:43.280 | I would trust them more for that level of transparency.
00:45:45.880 | So yeah, similarly, ChatGPT should say,
00:45:48.040 | "Yeah, 87% bullshit."
00:45:50.120 | - Well, the neutrality one is really interesting
00:45:52.800 | 'cause that's basically a summary of the discussions
00:45:56.600 | that are going on underneath.
00:45:58.320 | It would be amazing if, I should be honest,
00:46:01.200 | I don't look at the talk page often.
00:46:03.320 | It would be nice somehow if there was kind of a summary
00:46:07.800 | in this banner way of like,
00:46:10.040 | this, lots of wars have been fought on this here land
00:46:14.000 | for this here paragraph.
00:46:16.360 | - It's really interesting, yeah.
00:46:17.720 | I hadn't thought of that.
00:46:18.960 | Because one of the things I do spend a lot of time
00:46:20.840 | thinking about these days, and people have found it,
00:46:23.080 | we're moving slowly, but we are moving,
00:46:26.760 | thinking about, okay, these tools exist.
00:46:29.960 | Are there ways that this stuff can be useful
00:46:32.520 | to our community?
00:46:33.960 | Because a part of it is we do approach things
00:46:36.080 | in a non-commercial way, in a really deep sense.
00:46:39.280 | It's like, it's been great that Wikipedia
00:46:41.920 | has become very popular, but really,
00:46:43.320 | we're a community whose hobby is writing an encyclopedia.
00:46:47.080 | That's first, and if it's popular, great.
00:46:48.920 | If it's not, okay, we might have trouble paying
00:46:50.720 | for more servers, but it'll be fine.
00:46:53.760 | And so how do we help the community use these tools?
00:46:57.080 | What are the ways that these tools can support people?
00:46:59.120 | One example I never thought about,
00:47:01.480 | I'm gonna start playing with it, is feed in the article
00:47:04.360 | and feed in the talk page and say,
00:47:06.280 | "Can you suggest some warnings in the article
00:47:08.880 | "based on the conversations in the talk page?"
00:47:11.120 | I think it might be good at that.
00:47:12.720 | It might get it wrong sometimes, but again,
00:47:15.680 | if it's reasonably successful at doing that,
00:47:18.280 | and you can say, "Oh, actually, yeah, it does suggest
00:47:21.360 | "the neutrality of this has been disputed
00:47:24.640 | "on a section that has a seven-page discussion in the back,"
00:47:27.880 | that might be useful.
00:47:28.840 | I don't know.
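
One rough way to prototype the experiment Jimmy describes, feeding an article and its talk page to a model and asking it to propose maintenance banners; `chat` is a placeholder for whatever LLM client is used, and the banner wordings are just familiar Wikipedia-style notices used as labels:

```python
# Sketch: suggest Wikipedia-style maintenance banners for an article based on
# the disputes visible on its talk page. `chat` is a placeholder LLM callable.

BANNERS = [
    "The neutrality of this section has been disputed.",
    "This section does not cite any sources.",
    "This section may give undue weight to certain ideas.",
]

def suggest_banners(article_text, talk_page_text, chat):
    banner_list = "\n".join(f"- {b}" for b in BANNERS)
    prompt = (
        "You are assisting Wikipedia editors. Given the article and its talk "
        "page below, suggest which of these banners, if any, belong on which "
        "section, quoting the talk-page discussion that justifies each one. "
        "If the discussion is routine, answer 'no banner needed'.\n\n"
        f"Possible banners:\n{banner_list}\n\n"
        f"ARTICLE:\n{article_text}\n\n"
        f"TALK PAGE:\n{talk_page_text}"
    )
    return chat(prompt)
```
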
00:47:29.680 | We're playing with it. - Yeah, I mean,
00:47:30.800 | some more color to the, not neutrality,
00:47:35.520 | but also the amount of emotion laid
00:47:40.000 | in the exploration of this particular part of the topic.
00:47:44.360 | It might actually help you look at more controversial pages,
00:47:49.080 | like a page on war in Ukraine
00:47:52.080 | or a page on Israel and Palestine.
00:47:54.340 | There could be parts that everyone agrees on,
00:47:56.480 | and there's parts that are just like--
00:47:58.360 | - Tough. - Tough.
00:47:59.360 | - The hard parts, yeah.
00:48:00.640 | - It would be nice to, when looking at those beautiful,
00:48:03.240 | long articles, to know, like, all right,
00:48:06.400 | let me just take in some stuff where everybody agrees on.
00:48:09.160 | - I can give an example that I haven't looked at
00:48:11.520 | in a long time, but I was really pleased
00:48:13.520 | with what I saw at the time.
00:48:15.480 | So the discussion was that they're building something
00:48:20.160 | in Israel, and for their own political reasons,
00:48:24.300 | one side calls it a wall,
00:48:27.520 | hearkening back to Berlin Wall, apartheid.
00:48:30.920 | The other calls it a security fence.
00:48:32.920 | So we can understand quite quickly,
00:48:34.920 | if we give it a moment's thought, like, okay,
00:48:36.640 | I understand why people would have this grappling
00:48:40.200 | over the language, like, okay, you wanna highlight
00:48:43.460 | the negative aspects of this,
00:48:45.020 | and you wanna highlight the positive aspects,
00:48:46.440 | so you're gonna try and choose a different name.
00:48:48.760 | And so there was this really fantastic Wikipedia discussion
00:48:52.600 | on the talk page, how do we word that paragraph
00:48:55.280 | to talk about the different naming?
00:48:57.360 | It's called this by Israelis, it's called this
00:48:59.080 | by Palestinians, and how you explain that to people
00:49:04.000 | could be quite charged, right?
00:49:05.520 | You could easily explain, oh, there's this difference,
00:49:09.480 | and it's because this side's good and this side's bad,
00:49:11.320 | and that's why there's a difference.
00:49:12.320 | Or you could say, actually, let's just,
00:49:15.660 | let's try and really stay as neutral as we can
00:49:17.800 | and try to explain the reason.
00:49:18.880 | So you may come away from it with a concept.
00:49:22.920 | Oh, okay, I understand what this debate is about now.
00:49:26.400 | And just the term Israel-Palestine conflict
00:49:31.400 | is still the title of a page of Wikipedia,
00:49:36.780 | but the word conflict is something that is a charged word.
00:49:41.780 | - Of course, yeah.
00:49:42.720 | - Because from the Palestinian side or from certain sides,
00:49:47.520 | the word conflict doesn't accurately describe the situation
00:49:50.180 | because if you see it as a genocide,
00:49:52.780 | one-way genocide is not a conflict,
00:49:54.800 | because to people that discuss,
00:49:58.220 | that challenge the word conflict,
00:50:01.680 | they see conflict is when there's
00:50:03.480 | two equally powerful sides fighting.
00:50:06.000 | - Yeah, yeah, no, it's hard.
00:50:07.760 | And in a number of cases, so this actually speaks
00:50:11.720 | to a slightly broader phenomenon,
00:50:14.580 | which is there are a number of cases
00:50:16.540 | where there is no one word that can get consensus.
00:50:20.280 | And in the body of an article, that's usually okay,
00:50:24.500 | because we can explain the whole thing.
00:50:26.600 | You can come away with an understanding
00:50:28.040 | of why each side wants to use a certain word,
00:50:30.520 | but there are some aspects, like the page has to have a title.
00:50:33.860 | So, there's that.
00:50:35.360 | Same thing with certain things like photos.
00:50:40.160 | It's like, well, there's different photos, which one's best?
00:50:43.200 | A lot of different views on that,
00:50:44.560 | but at the end of the day, you need the lead photo,
00:50:47.000 | 'cause there's one slot for a lead photo.
00:50:49.100 | Categories is another one.
00:50:52.320 | So, at one point, I have no idea if it's in there today,
00:50:56.540 | but I don't think so, I was listed in,
00:51:01.540 | American entrepreneurs, fine, American atheists.
00:51:04.940 | And I said, hmm, that doesn't feel right to me.
00:51:08.580 | Just personally, it's true.
00:51:10.420 | I mean, I wouldn't disagree with the objective fact of it,
00:51:14.980 | but when you click the category and you see
00:51:17.020 | a lot of people who are, you might say,
00:51:20.700 | American atheist activist, 'cause that's their big issue.
00:51:23.760 | So, Madalyn Murray O'Hair, or various famous people,
00:51:27.540 | Richard Dawkins, who make it a big part
00:51:29.480 | of their public argument and persona,
00:51:31.920 | but that's not true of me.
00:51:32.760 | It's just like my private personal belief.
00:51:34.480 | It doesn't really, it's not something I campaign about.
00:51:37.040 | So, it felt weird to put me in the category,
00:51:39.040 | but what category would you put?
00:51:41.760 | And do you need that guy?
00:51:42.960 | In this case, I argued, it doesn't need that guy.
00:51:45.960 | Like, that's not, I don't speak about it publicly,
00:51:48.240 | except incidentally from time to time.
00:51:50.480 | I don't campaign about it, so it's weird to put me
00:51:52.500 | with this group of people.
00:51:54.260 | And that argument carried the day.
00:51:55.660 | I hope not just because it was me,
00:51:57.300 | but categories can be like that,
00:52:00.300 | where you're either in the category or you're not.
00:52:03.580 | And sometimes it's a lot more complicated than that.
00:52:06.220 | And is it, again, we go back to, is it undue weight?
00:52:09.300 | If someone who is now prominent in public life
00:52:15.140 | and generally considered to be a good person
00:52:20.340 | was convicted of something, let's say DUI,
00:52:25.340 | when they were young, we normally,
00:52:27.720 | in normal sort of discourse, we don't think,
00:52:30.040 | oh, this person should be in the category
00:52:31.280 | of American criminals.
00:52:32.960 | Because you think, oh, a criminal, technically speaking,
00:52:35.600 | it's against the law to drive under the influence of alcohol
00:52:39.000 | and you were arrested and you spent a month in prison
00:52:42.520 | or whatever, but it's odd to say that's a criminal.
00:52:45.280 | So, just as an example in this area is,
00:52:50.080 | Mark Wahlberg, Marky Mark,
00:52:51.500 | that's what I always think of him as,
00:52:52.340 | 'cause that was his first sort of famous name,
00:52:54.860 | who I wouldn't think should be listed
00:52:57.500 | as in the category American criminal,
00:53:00.300 | even though he was convicted of quite a bad crime
00:53:03.780 | when he was a young person,
00:53:04.740 | but we don't think of him as a criminal.
00:53:07.060 | Should the entry talk about that?
00:53:08.420 | Yeah, that's actually an important part of his life story,
00:53:11.820 | that he had a very rough youth
00:53:13.100 | and he could have gone down a really dark path
00:53:15.820 | and he turned his life around, that's actually interesting.
00:53:18.260 | So, categories are tricky.
00:53:20.740 | - Especially with people,
00:53:22.140 | because we like to assign labels to people
00:53:27.000 | and to ideas somehow, and those labels stick.
00:53:30.160 | - Yeah. - And there's certain words
00:53:31.200 | that have a lot of power, like criminal,
00:53:34.260 | like political, left, right, center,
00:53:36.900 | anarchist, objectivist,
00:53:41.460 | what other philosophies are there?
00:53:44.280 | Marxist, communist, social democrat,
00:53:47.160 | democratic socialist, socialist.
00:53:50.480 | And if you add that as a category,
00:53:52.600 | all of a sudden it's like, oh boy, you're that guy now.
00:53:55.880 | - Yeah. - And I don't know
00:53:56.720 | if you wanna be that guy. - Well, there's definitely
00:53:58.080 | some really charged ones, like alt-right,
00:54:01.160 | I think is quite a complicated and tough label.
00:54:04.240 | I mean, it's not a completely meaningless label,
00:54:06.880 | but boy, I think you really have to pause
00:54:08.600 | before you actually put that label on someone,
00:54:11.360 | partly because now you're putting them in a group of people,
00:54:14.200 | some of whom you wouldn't wanna be grouped with.
00:54:18.080 | So it's, yeah.
00:54:19.960 | - Let's go into some, you mentioned the hot water
00:54:22.460 | of the pool that we're both tipping a toe in.
00:54:25.860 | Do you think Wikipedia has a left-leaning political bias,
00:54:29.320 | which is something it is sometimes accused of?
00:54:31.440 | - Yeah, so I don't think so, not broadly.
00:54:34.220 | And I think you can always point to specific entries
00:54:40.480 | and talk about specific biases,
00:54:42.300 | but that's part of the process of Wikipedia.
00:54:44.840 | Anyone can come and challenge and to go on about that.
00:54:48.440 | But I see fairly often on Twitter
00:54:52.240 | quite extreme accusations of bias.
00:54:57.600 | And I think, actually, I just, I don't see it.
00:55:00.840 | I don't buy that.
00:55:01.660 | And if you ask people for an example,
00:55:03.920 | they normally struggle,
00:55:05.220 | and depending on who they are and what it's about.
00:55:10.040 | So it's certainly true that some people
00:55:13.460 | who have quite fringe viewpoints,
00:55:16.020 | and who knows, the full rush of history in 500 years,
00:55:22.340 | they might be considered to be path-breaking geniuses,
00:55:24.920 | but at the moment, quite fringe views,
00:55:28.320 | and they're just unhappy that Wikipedia doesn't report
00:55:30.500 | on their fringe views as being mainstream.
00:55:33.060 | And that, by the way, goes across all kinds of fields.
00:55:36.040 | I mean, I was once accosted on the street
00:55:39.600 | outside the TED conference in Vancouver
00:55:42.800 | by a guy who's a homeopath, who was very upset
00:55:46.160 | that Wikipedia's entry on homeopathy
00:55:48.640 | basically says it's pseudoscience.
00:55:52.000 | And he felt that was biased.
00:55:53.240 | And I said, "Well, I can't really help you
00:55:54.920 | "because we cite good quality sources
00:55:58.560 | "to talk about the scientific status,
00:56:00.720 | "and it's not very good."
00:56:02.700 | So it depends.
00:56:04.160 | And I think it's something
00:56:06.240 | that we should always be vigilant about.
00:56:08.920 | But it's, in general, I think we're pretty good.
00:56:13.040 | And I think any time you go
00:56:14.680 | to any serious political controversy,
00:56:19.680 | we should have a pretty balanced perspective
00:56:22.120 | on who's saying what, what the views are, and so forth.
00:56:26.280 | I would actually argue that the areas
00:56:30.000 | where we are more likely to have bias
00:56:32.880 | that persists for a long period of time
00:56:34.560 | are actually fairly obscure things,
00:56:37.280 | or maybe fairly non-political things.
00:56:39.600 | So I always give, it's kind of a humorous example,
00:56:41.560 | but it's meaningful.
00:56:43.580 | If you read our entries about Japanese anime,
00:56:49.220 | they tend to be very, very positive and very favorable
00:56:52.720 | because almost no one knows about Japanese anime
00:56:54.880 | except for fans.
00:56:56.320 | And so the people who come and spend their days
00:56:58.520 | writing Japanese anime articles, they love it.
00:57:01.200 | They kind of have an inherent love for the whole area.
00:57:03.920 | Now, they'll, of course, being human beings,
00:57:05.840 | they'll have their internal debates and disputes
00:57:07.520 | about what's better or not.
00:57:09.520 | But in general, they're quite positive
00:57:11.600 | because nobody actually cares.
00:57:13.240 | On anything that people are quite passionate about,
00:57:16.280 | then hopefully there's quite a lot of interesting stuff.
00:57:20.240 | So I'll give an example, a contemporary example,
00:57:22.720 | where I think we've done a good job
00:57:24.520 | as of my most recent sort of look at it.
00:57:27.600 | And that is the question about the efficacy of masks
00:57:31.960 | during the COVID pandemic.
00:57:34.040 | And that's an area where I would say the public authorities
00:57:38.640 | really kind of jerked us all around a bit.
00:57:41.480 | In the very first days, they said,
00:57:42.600 | "Whatever you do, don't rush on and buy masks."
00:57:44.940 | And their concern was shortages in hospitals.
00:57:50.400 | Okay, fair enough.
00:57:52.000 | Later, it's like,
00:57:52.840 | now everybody's gotta wear a mask everywhere.
00:57:55.280 | It really works really well.
00:57:56.640 | Then now I think it's the evidence is mixed.
00:58:01.040 | Masks seem to help, in my personal view,
00:58:03.120 | masks seem to help.
00:58:04.320 | They're no huge burden.
00:58:06.280 | You might as well wear a mask in any environment
00:58:08.600 | where you're with a giant crowd of people and so forth.
00:58:11.700 | But it's very politicized, that one.
00:58:15.240 | It's very politicized where certainly in the US,
00:58:19.840 | you know, much more so, I mean, I live in the UK,
00:58:22.520 | I live in London, I've never seen kind of on the streets
00:58:25.600 | sort of the kind of thing that there's a lot of reports
00:58:29.000 | of people actively angry because someone else
00:58:31.600 | is wearing a mask, that sort of thing in public.
00:58:35.320 | And so because it became very politicized,
00:58:38.960 | then clearly if Wikipedia,
00:58:41.720 | no, so anyway, if you go to Wikipedia
00:58:43.160 | and you research this topic,
00:58:44.240 | I think you'll find more or less what I've just said.
00:58:46.520 | Actually, after it's all, you know,
00:58:49.160 | to this point in history, it's mixed evidence.
00:58:51.920 | Like masks seem to help,
00:58:53.080 | but maybe not as much as some of the authorities said,
00:58:54.960 | and here we are.
00:58:56.880 | And that's kind of an example where I think,
00:58:58.840 | "Okay, we've done a good job,
00:59:00.000 | "but I suspect there are people on both sides
00:59:02.720 | "of that very emotional debate
00:59:04.320 | "who think this is ridiculous."
00:59:06.240 | Hopefully we've got quality sources,
00:59:08.520 | so then hopefully those people who read this can say,
00:59:10.680 | "Oh, actually, you know, it is complicated."
00:59:13.500 | If you can get to the point of saying,
00:59:16.000 | "Okay, I have my view, but I understand other views,
00:59:19.900 | "and I do think it's a complicated question,"
00:59:22.240 | great, now we're a little bit more mature as a society.
00:59:24.840 | - Well, that one is an interesting one
00:59:26.000 | because I feel like I hope that that article
00:59:29.200 | also contains the meta conversation
00:59:31.600 | about the politicization of that topic.
00:59:34.360 | To me, it's almost more interesting
00:59:35.800 | than whether masks work or not,
00:59:37.880 | at least at this point,
00:59:39.000 | is like why it became, masks became a symbol
00:59:43.320 | of the oppression of a centralized government,
00:59:46.000 | if you wear them.
00:59:47.020 | You're a sheep that follows the mass control,
00:59:52.560 | the mass hysteria of an authoritarian regime,
00:59:56.160 | and if you don't wear a mask,
00:59:57.960 | then you are a science denier, anti-vaxxer,
01:00:01.920 | alt-right, probably a Nazi.
01:00:05.460 | - Exactly, and that whole politicization of society
01:00:11.600 | is just so damaging,
01:00:13.720 | and I don't know, in the broader world,
01:00:18.920 | how do we start to fix that?
01:00:20.080 | That's a really hard question.
01:00:21.800 | - Well, at every moment,
01:00:23.320 | 'cause you mentioned mainstream and fringe,
01:00:26.080 | there seems to be a tension here,
01:00:27.600 | and I wonder what your philosophy is on it
01:00:31.000 | because there's mainstream ideas and there's fringe ideas.
01:00:33.960 | You look at lab leak theory for this virus.
01:00:37.960 | There could be other things we can discuss
01:00:40.880 | where there's a mainstream narrative
01:00:45.120 | where if you just look at the percent of the population
01:00:50.440 | or the population with platforms, what they say,
01:00:53.560 | and then what is a small percentage in opposition to that,
01:00:58.560 | and what is Wikipedia's responsibility
01:01:01.840 | to accurately represent both the mainstream
01:01:04.160 | and the fringe, do you think?
01:01:05.320 | - Well, I mean, I think we have to try to do our best
01:01:08.240 | to recognize both, but also to appropriately contextualize,
01:01:12.760 | and so this can be quite hard,
01:01:14.720 | particularly when emotions are high.
01:01:16.120 | That's just a fact about human beings.
01:01:19.720 | I'll give a simpler example
01:01:21.120 | because there's not a lot of emotion around it.
01:01:23.240 | Our entry on the moon doesn't say,
01:01:27.240 | some say the moon's made of rocks, some say cheese.
01:01:30.160 | Who knows?
01:01:31.000 | That kind of false neutrality is not what we wanna get to.
01:01:35.280 | That doesn't make any sense, but that one's easy.
01:01:37.880 | We all understand.
01:01:38.940 | I think there is a Wikipedia entry
01:01:41.880 | called something like the moon is made of cheese
01:01:44.120 | where it talks about this is a common sort of joke
01:01:47.240 | or thing that children say
01:01:49.640 | or that people tell to children or whatever.
01:01:52.400 | It's just a thing.
01:01:53.440 | Everybody's heard moon's made of cheese,
01:01:55.600 | but nobody thinks, wow, Wikipedia's so one-sided,
01:02:01.680 | it doesn't even acknowledge the cheese theory.
01:02:04.040 | I'd say the same thing about flat Earth, again.
01:02:08.240 | - That's exactly what I'm looking up right now.
01:02:09.800 | - Very little controversy.
01:02:12.920 | We will have an entry about flat Earth theorizing,
01:02:17.200 | flat Earth people.
01:02:18.400 | My personal view is most of the people
01:02:22.200 | who claim to be flat Earthers are just having a laugh,
01:02:24.560 | trolling, and more power to them.
01:02:26.680 | Have some fun, but let's not be ridiculous.
01:02:31.440 | - Of course, for most of human history,
01:02:33.000 | people believe that the Earth is flat,
01:02:34.440 | so the article I'm looking at
01:02:36.160 | is actually kind of focusing on this history.
01:02:38.160 | Flat Earth is an archaic and scientifically disproven
01:02:40.480 | conception of the Earth's shape as a plane or disc.
01:02:43.280 | Many ancient cultures subscribed to a flat Earth cosmography
01:02:47.720 | with pretty cool pictures
01:02:48.720 | of what a flat Earth would look like.
01:02:50.800 | With dragon, is that a dragon?
01:02:52.120 | No, angels on the edge.
01:02:54.640 | There's a lot of controversy about that.
01:02:56.000 | What is on the edge?
01:02:56.840 | Is it the wall?
01:02:57.660 | Is it angels?
01:02:58.500 | Is it dragons?
01:02:59.440 | Is there a dome?
01:03:00.280 | - And how can you fly from South Africa to Perth?
01:03:04.800 | Because on a flat Earth view,
01:03:07.280 | that's really too far for any plane to make it.
01:03:09.480 | - But I wanna know-- - It's all spread out.
01:03:11.800 | - What I wanna know is what's on the other side, Jimmy?
01:03:14.400 | What's on the other side?
01:03:16.200 | That's what all of us want to know.
01:03:18.000 | So there's some, I presume there's probably a small section
01:03:23.120 | about the conspiracy theory of flat Earth,
01:03:25.960 | 'cause I think there's a sizable percent of the population
01:03:28.460 | who at least will say they believe in a flat Earth.
01:03:31.960 | I think it is a movement that just says
01:03:36.600 | that the mainstream narrative,
01:03:38.760 | to have distrust and skepticism
01:03:40.680 | about the mainstream narrative,
01:03:41.820 | which to a very small degree
01:03:43.200 | is probably a very productive thing to do
01:03:45.040 | as part of the scientific process,
01:03:47.180 | but you can get a little silly and ridiculous with it.
01:03:49.720 | - Yeah, I mean, yeah, it's exactly right.
01:03:53.280 | And so I think I find on many, many cases,
01:03:58.280 | and of course I, like anybody else,
01:04:00.380 | might quibble about this or that in any Wikipedia article,
01:04:03.080 | but in general, I think there is a pretty good
01:04:05.860 | sort of willingness and indeed eagerness to say,
01:04:10.460 | oh, let's fairly represent
01:04:12.940 | all of the meaningfully important sides.
01:04:15.820 | So there's still a lot to unpack in that, right?
01:04:18.480 | So meaningfully important.
01:04:20.620 | So people who are raising questions
01:04:25.620 | about the efficacy of masks,
01:04:30.540 | okay, that's actually a reasonable thing
01:04:32.700 | to have a discussion about,
01:04:33.660 | and hopefully we should treat that
01:04:34.980 | as a fair conversation to have
01:04:38.180 | and actually address which authorities have said what
01:04:40.340 | and so on and so forth.
01:04:41.500 | And then there are other cases
01:04:44.540 | where it's not meaningful opposition.
01:04:48.460 | Like you just wouldn't say,
01:04:49.740 | I mean, I doubt if the main article, "Moon,"
01:04:54.700 | it may mention, geez, probably not even,
01:04:59.220 | because it's not credible
01:05:00.860 | and it's not even meant to be serious by anyone,
01:05:03.340 | or the article on the Earth
01:05:05.940 | certainly won't have a paragraph that says,
01:05:07.960 | well, most scientists think it's round,
01:05:09.780 | but certain people think flat.
01:05:11.900 | Like that's just a silly thing to put in that article.
01:05:14.120 | You would wanna sort of address,
01:05:16.420 | that's an interesting cultural phenomenon.
01:05:18.460 | You wanna put it somewhere.
01:05:19.820 | So this goes into all kinds of things about politics.
01:05:26.100 | You wanna be really careful, really thoughtful
01:05:29.380 | about not getting caught up in the anger of our times
01:05:34.380 | and really recognize.
01:05:38.020 | You know, I always thought,
01:05:39.760 | I remember being really kind of proud of the US
01:05:44.340 | at the time when McCain was running against Obama,
01:05:48.220 | because I thought, oh, I've got plenty of disagreements
01:05:49.940 | with both of them,
01:05:51.100 | but they both seem like thoughtful and interesting people
01:05:53.340 | who I would have different disagreements with,
01:05:54.820 | but I always felt like, yeah, that's good.
01:05:57.540 | Now we can have a debate.
01:05:58.580 | Now we can have an interesting debate,
01:05:59.900 | and it isn't just sort of people slamming each other,
01:06:02.540 | personal attacks and so forth.
01:06:04.040 | - And you're saying Wikipedia also represented that?
01:06:09.180 | - I hope so, yeah, and I think so, in the main.
01:06:12.140 | Obviously, you can always find a debate
01:06:14.820 | that went horribly wrong, 'cause there's humans involved.
01:06:18.020 | - But speaking of those humans,
01:06:19.780 | I would venture to guess, I don't know the data,
01:06:24.500 | maybe you can let me know,
01:06:27.780 | but the personal political leaning
01:06:30.540 | of the group of people who edit Wikipedia
01:06:34.340 | probably leans left, I would guess.
01:06:37.820 | So to me, the question there is,
01:06:39.340 | I mean, the same is true for Silicon Valley.
01:06:41.340 | The task for Silicon Valley is to create platforms
01:06:44.180 | that are not politically biased,
01:06:45.860 | even though there is a bias for the engineers who create it.
01:06:50.460 | And I think, I believe it's possible to do that.
01:06:53.860 | You know, there's kind of conspiracy theories
01:06:55.540 | that it somehow is impossible,
01:06:57.140 | and there's this whole conspiracy
01:06:58.500 | where the left is controlling it and so on.
01:07:00.420 | I think engineers, for the most part,
01:07:03.140 | want to create platforms that are open and unbiased,
01:07:05.940 | that create all kinds of perspective,
01:07:08.460 | 'cause that's super exciting
01:07:10.220 | to have all kinds of perspectives battle it out.
01:07:12.220 | But still, is there a degree
01:07:16.860 | to which the personal political bias
01:07:19.100 | of the editors might seep in,
01:07:20.980 | in silly ways and in big ways?
01:07:22.660 | Silly ways could be, I think,
01:07:25.660 | hopefully I'm correct in saying this,
01:07:27.580 | but the right will call it the Democrat Party,
01:07:32.540 | and the left will call it the Democratic Party.
01:07:34.940 | Like subtle, it always hits my ear weird.
01:07:39.380 | Like, are we children here?
01:07:41.620 | That we're literally taking words
01:07:43.820 | and just jabbing at each other.
01:07:45.660 | Like, I could capitalize a thing in a certain way,
01:07:49.260 | or I can just take a word and mess with them.
01:07:52.820 | That's a small way of how you use words.
01:07:54.980 | But you can also have a bigger way
01:07:58.060 | about beliefs, about various perspectives
01:08:02.220 | on political events, on Hunter Biden's laptop,
01:08:05.260 | on how big of a story that is or not,
01:08:07.100 | how big the censorship of that story is or not.
01:08:09.940 | And then there's these camps that take very strong points,
01:08:12.700 | and they construct big narratives around that.
01:08:15.820 | I mean, it's very sizable percent of the population
01:08:18.060 | believes the two narratives that compete with each other.
01:08:21.140 | - Yeah, I mean, it's really interesting.
01:08:25.100 | And it feels, it's hard to judge, you know,
01:08:29.180 | the sweep of history within your own lifetime.
01:08:32.700 | But it feels like it's gotten much worse,
01:08:35.540 | that this idea of two parallel universes
01:08:38.700 | where people can agree on certain basic facts
01:08:41.820 | feels worse than it used to be.
01:08:45.260 | And I'm not sure if that's true,
01:08:47.660 | or if it just feels that way,
01:08:48.700 | but I also, I'm not sure what the causes are.
01:08:51.700 | I think I would lay a lot of the blame in recent years
01:08:56.700 | on social media algorithms,
01:09:01.420 | which reward clickbait headlines,
01:09:05.140 | which reward tweets that go viral,
01:09:09.220 | and they go viral because they're cute and clever.
01:09:12.060 | I mean, my most successful tweet ever,
01:09:16.220 | by a fairly wide margin,
01:09:19.260 | some reporter tweeted at Elon Musk,
01:09:22.460 | 'cause he was complaining about Wikipedia or something,
01:09:26.060 | "You should buy Wikipedia."
01:09:27.340 | And I just wrote, "Not for sale."
01:09:29.940 | And, you know, 90 zillion retweets,
01:09:33.700 | and people liked it, and it was all very good.
01:09:36.740 | But I'm like, you know what?
01:09:38.420 | It's a cute line, right?
01:09:39.980 | And it's a good mic drop and all that.
01:09:41.940 | And I was pleased with myself.
01:09:43.660 | Like, it's not really discourse, right?
01:09:45.420 | It's not really sort of what I like to do,
01:09:49.620 | but it's what social media really rewards,
01:09:51.420 | which is kind of lets you and him have a fight, right?
01:09:55.300 | And that's more interesting.
01:09:56.140 | I mean, it's funny, because at the time,
01:09:57.140 | I was texting with Elon, who was very pleasant to me,
01:10:00.220 | and all of that.
01:10:01.940 | - He might've been a little bit shitty.
01:10:03.420 | The reporter might've been a little bit shitty,
01:10:05.020 | but you fed into the shitty
01:10:06.220 | with a snarky, funny response, "Not for sale."
01:10:09.340 | And like, where do you, like, what?
01:10:12.220 | So that's a funny little exchange,
01:10:13.700 | and you could probably, after that, laugh it off,
01:10:15.500 | and it's fun.
01:10:16.780 | But like, that kind of mechanism that rewards the snark
01:10:20.500 | can go into viciousness.
01:10:22.780 | - Yeah, yeah.
01:10:23.620 | Well, and we certainly see it online.
01:10:26.380 | You know, like, a series of tweets, you know,
01:10:30.940 | sort of a tweet thread of 15 tweets
01:10:35.060 | that assesses the quality of the evidence for masks,
01:10:38.460 | pros and cons, and sort of where this,
01:10:40.740 | that's not gonna go viral, you know?
01:10:43.260 | But, you know, a smackdown for a famous politician
01:10:48.100 | who was famously in favor of masks,
01:10:49.740 | who also went to a dinner and didn't wear a mask,
01:10:52.380 | that's gonna go viral.
01:10:53.540 | And, you know, that's partly human nature.
01:10:57.740 | You know, people love to call out hypocrisy and all of that,
01:11:00.300 | but it's partly what these systems elevate automatically.
01:11:04.420 | I talk about this with respect to Facebook, for example.
01:11:08.460 | So I think Facebook has done a pretty good job,
01:11:11.680 | although it's taken longer than it should in some cases.
01:11:13.800 | But, you know, if you have a very large following
01:11:17.960 | and you're really spouting hatred
01:11:19.760 | or misinformation, disinformation,
01:11:23.040 | they've kicked people off.
01:11:24.400 | They've done, you know, some reasonable things there.
01:11:27.420 | But actually the deeper issue is of this,
01:11:31.560 | the anger we're talking about,
01:11:33.840 | of the contentiousness of everything.
01:11:36.960 | I make a family example with two great stereotypes.
01:11:41.960 | So one, the crackpot racist uncle,
01:11:46.840 | and one, the sweet grandma.
01:11:49.080 | And I always wanna point out,
01:11:50.920 | all of my uncles in my family were wonderful people,
01:11:53.380 | so I didn't have a crackpot racist uncle,
01:11:55.200 | but everybody knows the stereotype.
01:11:57.140 | Well, so grandma, she just posts like sweet comments
01:12:00.520 | on the kids' pictures and congratulates people
01:12:02.760 | on their wedding anniversary.
01:12:04.680 | And crackpot uncle's posting his nonsense.
01:12:07.680 | And normally, it's sort of at Christmas dinner,
01:12:10.360 | everybody rolls their eyes,
01:12:11.440 | oh yeah, Uncle Frank's here,
01:12:12.520 | he's probably gonna say some racist comment
01:12:14.400 | and we're gonna tell him to shut up
01:12:16.000 | or, you know, maybe let's not invite him this year,
01:12:17.960 | you know, normal human drama.
01:12:20.880 | He's got his three mates down at the pub
01:12:22.600 | who listen to him and all of that.
01:12:25.060 | But now, grandma's got, you know, 54 followers on Facebook,
01:12:29.440 | which is the intimate family,
01:12:30.740 | and racist uncle has 714.
01:12:33.360 | He's not a massive influence or whatever,
01:12:35.360 | but how did that happen?
01:12:36.180 | It's because the algorithm notices,
01:12:38.560 | oh, when she posts, nothing happens,
01:12:41.240 | he posts and then everybody jumps in to go,
01:12:43.640 | gosh, shut up, Uncle Frank, you know,
01:12:45.240 | like that's outrageous.
01:12:47.000 | And it's like, oh, there's engagement,
01:12:48.420 | there's page views, there's ads, right?
01:12:50.120 | And those algorithms,
01:12:52.320 | I think they're working to improve that,
01:12:54.100 | but it's really hard for them.
01:12:56.000 | It's hard to improve that if that actually is working.
01:12:59.200 | If the people who are saying things that get engagement,
01:13:03.500 | if it's not too awful, but it's just, you know,
01:13:06.020 | like maybe it's not a racist uncle,
01:13:07.900 | but maybe it's an uncle who posts a lot
01:13:09.340 | about what an idiot Biden is, right?
01:13:12.120 | Which isn't necessarily an offensive
01:13:13.860 | or blockable or bannable thing, and it shouldn't be.
01:13:16.740 | But if that's the discourse that gets elevated
01:13:19.100 | because it gets a rise out of people,
01:13:21.080 | then suddenly in a society, it's like,
01:13:23.100 | oh, this is, we get more of what we reward.
01:13:25.540 | So I think that's a piece of what's gone on.
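
The incentive Jimmy describes can be made concrete with a toy ranking function; all of the weights and field names below are invented purely for illustration and do not describe Facebook's actual system:

```python
# Toy illustration of the "Uncle Frank" dynamic: a ranker that only counts
# engagement amplifies whatever provokes replies, while a blended ranker that
# also weighs explicit "this was worthwhile" feedback and reports dampens it.
# All weights and fields are made up for illustration.

def engagement_score(post):
    return post["comments"] + 2 * post["shares"] + 0.5 * post["likes"]

def blended_score(post):
    return (
        0.3 * engagement_score(post)
        + 5.0 * post["worthwhile_votes"]   # explicit quality feedback
        - 4.0 * post["reports"]            # people flagging the post
    )

uncle_frank = {"comments": 40, "shares": 3, "likes": 5,
               "worthwhile_votes": 1, "reports": 6}
grandma = {"comments": 2, "shares": 0, "likes": 12,
           "worthwhile_votes": 9, "reports": 0}

# Engagement-only ranking puts Uncle Frank on top (48.5 vs 8.0);
# the blended ranking reverses it (-4.45 vs 47.4).
for post in (uncle_frank, grandma):
    print(engagement_score(post), blended_score(post))
```
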
01:13:28.700 | - Well, if we could just take that tangent,
01:13:32.500 | I'm having a conversation with Mark Zuckerberg a second time.
01:13:36.900 | Is there something you can comment on
01:13:38.340 | how to decrease toxicity
01:13:39.660 | on that particular platform, Facebook?
01:13:41.700 | You also have worked on creating a social network
01:13:44.720 | that is less toxic yourself.
01:13:46.580 | So can we just talk about the different ideas
01:13:49.760 | that these already big social networks can do
01:13:52.860 | and what you have been trying to do?
01:13:54.660 | - So a piece of it is, it's hard.
01:13:58.500 | So I don't, the problem with making a recommendation
01:14:01.900 | to Facebook is that I actually believe
01:14:04.500 | their business model makes it really hard for them.
01:14:07.460 | And I'm not anti-capitalism, I'm not, you know, great.
01:14:11.300 | Somebody's got business, they're making money.
01:14:13.080 | That's not where I come from.
01:14:15.800 | But certain business models mean you are gonna prioritize
01:14:19.620 | things that maybe aren't that long-term healthful.
01:14:22.660 | And so that's a big piece of it.
01:14:24.620 | So certainly for Facebook, you could say,
01:14:27.280 | you know, with vast resources,
01:14:31.440 | start to prioritize content that's higher quality,
01:14:35.040 | that's healing, that's kind.
01:14:36.880 | Try not to prioritize content
01:14:39.680 | that seems to be just getting a rise out of people.
01:14:42.160 | Now those are vague human descriptions, right?
01:14:45.040 | But I do believe good machine learning algorithms,
01:14:47.720 | you can optimize in slightly different ways.
01:14:50.120 | But to do that, you may have to say,
01:14:52.760 | actually, we're not necessarily gonna increase page views
01:14:57.240 | to the maximum extent right now.
01:14:59.160 | And I've said this to people at Facebook.
01:15:01.320 | It's like, you know, if your actions are, you know,
01:15:06.320 | convincing people that you're breaking Western civilization,
01:15:10.840 | that's really bad for business in the long run.
01:15:13.200 | Certainly these days, I'll say,
01:15:17.420 | Twitter is the thing that's on people's minds
01:15:20.280 | as being more upsetting at the moment.
01:15:22.720 | But I think it's true.
01:15:23.960 | And so one of the things that's really interesting
01:15:28.720 | about Facebook compared to a lot of companies
01:15:32.440 | is that Mark has a pretty unprecedented amount of power.
01:15:36.160 | His ability to name members of the board,
01:15:38.440 | his control of the company is pretty hard to break.
01:15:42.700 | Even if financial results aren't as good as they could be,
01:15:46.220 | because he's taken a step back from
01:15:49.060 | the perfect optimization to say,
01:15:50.960 | actually, for the long-term health in the next 50 years
01:15:54.240 | of this organization, we need to rein in some of the things
01:15:57.520 | that are working for us and making money
01:15:59.080 | because they're actually giving us a bad reputation.
01:16:02.380 | So one of the recommendations I would say is,
01:16:05.000 | and this is not to do with the algorithms and all that,
01:16:07.040 | but you know, how about just a moratorium
01:16:09.120 | on all political advertising?
01:16:11.320 | I don't think it's their most profitable segment,
01:16:13.880 | but it's given rise to a lot of deep, hard questions
01:16:16.620 | about dark money, about ads that are run
01:16:21.620 | by questionable people that push false narratives,
01:16:25.500 | or the classic kind of thing is you run,
01:16:28.700 | I saw a study about Brexit in the UK
01:16:33.820 | where people were talking about there were ads run
01:16:36.420 | to animal rights activists saying,
01:16:43.300 | finally, when we're out from under Europe,
01:16:45.100 | the UK can pass proper animal rights legislation.
01:16:48.260 | We're not constrained by the European process.
01:16:51.580 | Similarly, for people who are advocates of fox hunting
01:16:55.540 | to say, finally, when we're out of Europe,
01:16:57.160 | we can re-implement.
01:16:58.900 | So you're telling people what they wanna hear.
01:17:01.100 | And in some cases, it's really hard
01:17:04.280 | for journalists to see that.
01:17:06.780 | So it used to be that for political advertising,
01:17:09.880 | you really needed to find some kind of mainstream narrative,
01:17:12.620 | and this is still true to an extent.
01:17:14.480 | Mainstream narrative that 60% of people can say,
01:17:18.460 | oh, I can buy into that,
01:17:19.440 | which meant it pushed you to the center.
01:17:20.900 | It pushed you to sort of try and find some nuanced balance.
01:17:24.480 | But if your main method of recruiting people
01:17:26.820 | is a tiny little one-on-one conversation with them
01:17:31.060 | because you're able to target using targeted advertising,
01:17:34.900 | suddenly you don't need consistency.
01:17:37.100 | You just need a really good targeting operation,
01:17:42.100 | really good Cambridge Analytica-style
01:17:44.680 | machine learning, algorithms, and data to convince people.
01:17:47.480 | And that just feels really problematic.
01:17:49.000 | So I mean, until they can think about
01:17:50.680 | how to solve that problem, I would just say,
01:17:52.000 | you know what, it's gonna cost us X amount,
01:17:54.520 | but it's gonna be worth it to kind of say,
01:17:57.880 | you know what, we actually think
01:17:58.960 | our political advertising policy hasn't really helped
01:18:02.920 | contribute to discourse and dialogue
01:18:05.800 | and finding reasoned middle ground and compromise solutions.
01:18:10.000 | So let's just not do that for a while
01:18:11.960 | until we figure that out.
01:18:12.880 | So that's maybe a piece of advice.
01:18:14.660 | - And coupled with, as you were saying,
01:18:17.500 | recommender systems for the newsfeed and other contexts
01:18:21.800 | that don't always optimize engagement,
01:18:24.060 | but optimize the long-term mental wellbeing
01:18:28.300 | and balance and growth of a human being.
01:18:30.500 | - Yeah.
01:18:31.340 | - That's a very difficult problem.
01:18:32.580 | - It's a difficult problem, yeah.
01:18:34.780 | And you know, so with WT Social, WikiTribune Social,
01:18:40.440 | we're launching in a few months' time,
01:18:43.460 | a completely new system, new domain name,
01:18:46.340 | new lots of things.
01:18:47.780 | But the idea is to say, let's focus on trust.
01:18:51.280 | People can rate each other as trustworthy,
01:18:54.900 | rate content as trustworthy.
01:18:56.100 | You have to start from somewhere.
01:18:57.200 | So we'll start with a core base of our tiny community
01:19:00.180 | who I think are sensible, thoughtful people.
01:19:03.180 | We wanna recruit more, but to say, you know what,
01:19:05.020 | actually let's have that as a pretty strong element
01:19:07.780 | to say, let's not optimize based on
01:19:10.920 | what gets the most page views in this session.
01:19:13.520 | Let's optimize on, sort of, the feedback from people:
01:19:18.520 | "This is meaningfully enhancing my life."
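
A very rough sketch of the trust-weighted ranking Jimmy is describing; the scoring scheme below is entirely hypothetical and is not WT Social's actual algorithm:

```python
# Hypothetical trust-based feed ranking, not WT Social's real implementation.
# Instead of ranking by raw engagement, weight each post by how trustworthy
# the community rates both the post itself and its author.

def trust_rank(posts, author_trust, default_trust=0.5):
    """Sort posts by a blend of per-post trust ratings and author trust."""
    def score(post):
        ratings = post.get("trust_ratings", [])                    # values in 0..1
        post_trust = sum(ratings) / len(ratings) if ratings else default_trust
        a_trust = author_trust.get(post["author"], default_trust)  # value in 0..1
        return 0.6 * post_trust + 0.4 * a_trust
    return sorted(posts, key=score, reverse=True)

feed = trust_rank(
    posts=[
        {"author": "alice", "trust_ratings": [0.9, 0.8]},
        {"author": "bob", "trust_ratings": [0.2]},
    ],
    author_trust={"alice": 0.85, "bob": 0.4},
)
```
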
01:19:21.760 | And so part of that is,
01:19:22.960 | and it's probably not a good business model,
01:19:25.280 | but part of that is say, okay,
01:19:26.360 | we're not gonna pursue an advertising business model,
01:19:28.480 | but a membership model where you can,
01:19:33.280 | you don't have to be a member,
01:19:34.280 | but you can pay to be a member.
01:19:36.120 | You maybe get some benefit from that,
01:19:37.600 | but in general to say, actually the problem with,
01:19:41.860 | and actually the division I would say is,
01:19:44.620 | and the analogy I would give is,
01:19:46.480 | broadcast television funded by advertising
01:19:52.360 | gives you a different result than paying for HBO,
01:19:57.360 | paying for Netflix, paying for whatever.
01:20:01.380 | And the reason is, you know, if you think about it,
01:20:04.580 | what is your incentive as a TV producer,
01:20:08.980 | you're gonna make a comedy for ABC network in the US,
01:20:13.980 | you basically say, I want something that
01:20:16.020 | almost everybody will like and listen to.
01:20:18.020 | So it tends to be a little blander,
01:20:20.140 | you know, family friendly, whatever.
01:20:22.780 | Whereas if you say, oh, actually,
01:20:25.060 | I'm gonna use the HBO example and an old example.
01:20:28.260 | You say, you know what?
01:20:29.260 | Sopranos isn't for everybody.
01:20:31.260 | Sex and the City isn't for everybody.
01:20:32.940 | But between the two shows,
01:20:34.900 | we've got something for everybody
01:20:36.100 | that they're willing to pay for.
01:20:37.700 | So you can get edgier, higher quality in my view,
01:20:40.740 | content rather than saying,
01:20:42.260 | it's gotta not offend anybody in the world.
01:20:44.500 | It's gotta be for everybody, which is really hard.
01:20:46.860 | So same thing, you know, here in a social network,
01:20:49.860 | if your business model is advertising,
01:20:51.560 | it's gonna drive you in one direction.
01:20:53.580 | If your business model is membership,
01:20:55.220 | I think it drives you in a different direction.
01:20:56.780 | I actually, and I've said this to Elon about Twitter Blue,
01:21:00.860 | which I think wasn't rolled out well.
01:21:02.840 | And so forth, but it's like, hmm,
01:21:04.660 | the piece of that that I like is to say,
01:21:06.980 | look, actually, if there's a model
01:21:10.020 | where your revenue is coming from people
01:21:12.260 | who are willing to pay for the service,
01:21:14.420 | even if it's only part of your revenue,
01:21:16.140 | if it's a substantial part,
01:21:17.860 | that does change your broader incentives to say,
01:21:21.100 | actually, are people gonna be willing to pay for something
01:21:23.600 | that's actually just toxicity in their lives?
01:21:25.900 | Now, I'm not sure it's been rolled out well.
01:21:28.860 | I'm not sure how it's going.
01:21:31.160 | And maybe I'm wrong about that as a plausible business model.
01:21:35.340 | But I do think it's interesting to think about
01:21:38.100 | just in broad terms, business model drives outcomes
01:21:43.100 | in sometimes surprising ways,
01:21:44.740 | unless you really pause to think about it.
01:21:46.620 | - So if we can just link on Twitter and Elon,
01:21:49.780 | before I would love to talk to you
01:21:52.700 | about the underlying business model, Wikipedia,
01:21:54.940 | which is this brilliant, bold move at the very beginning.
01:21:57.820 | But since you mentioned Twitter, what do you think works?
01:22:00.740 | What do you think is broken about Twitter?
01:22:03.060 | - Oof, I mean, it's a long conversation,
01:22:05.380 | but to start with, one of the things that I always say is,
01:22:09.660 | it's a really hard problem.
01:22:11.080 | So I can see that right up front.
01:22:12.780 | I said this about the old ownership of Twitter
01:22:16.220 | and the new ownership of Twitter,
01:22:18.620 | because unlike Wikipedia,
01:22:20.820 | and this is true actually for all social media,
01:22:23.620 | there's a box and the box basically says,
01:22:26.180 | what do you think?
01:22:27.000 | What's on your mind?
01:22:27.980 | You can write whatever the hell you want, right?
01:22:30.180 | This is true, by the way, even for YouTube.
01:22:32.620 | I mean, the box is to upload a video,
01:22:33.980 | but again, it's just like an open-ended invitation
01:22:36.420 | to express yourself.
01:22:38.020 | And what makes that hard is some people have really toxic,
01:22:40.700 | really bad, you know, some people are very aggressive.
01:22:43.140 | They're actually stalking, they're actually, you know,
01:22:45.620 | abusive, and suddenly you deal with a lot of problems.
01:22:49.860 | Whereas at Wikipedia, there is no box that says,
01:22:52.220 | what's on your mind?
01:22:53.740 | There's a box that says,
01:22:55.700 | this is an entry about the moon.
01:22:59.340 | Please be neutral, please cite your facts.
01:23:00.980 | Then there's a talk page,
01:23:02.060 | which is not "come and rant about Donald Trump."
01:23:06.260 | If you go on the talk page of the Donald Trump entry
01:23:08.220 | and you just start ranting about Donald Trump,
01:23:10.260 | people would say, what are you doing?
01:23:11.260 | Like, stop doing that.
01:23:12.140 | Like, we're not here to discuss,
01:23:13.860 | like there's a whole world of the internet out there
01:23:15.740 | for you to go and rant about Donald Trump.
01:23:17.220 | - It's just not fun to do on Wikipedia.
01:23:18.940 | Somehow it's fun on Twitter.
01:23:20.660 | - Well, also on Wikipedia, people are gonna say, stop.
01:23:23.940 | And actually, are you here to tell us,
01:23:26.660 | like, how can we improve the article?
01:23:28.140 | Or are you just here to rant about Trump?
01:23:29.460 | 'Cause that's not actually interesting.
01:23:31.140 | So because the goal is different.
01:23:33.340 | So that's just admitting and saying upfront,
01:23:36.020 | this is a hard problem.
01:23:37.740 | Certainly, I'm writing a book on trust.
01:23:42.260 | So the idea is, in the last 20 years,
01:23:46.100 | we've lost trust in all kinds of institutions and politics.
01:23:51.100 | The Edelman Trust Barometer Survey
01:23:54.380 | has been done for a long time.
01:23:56.500 | And trust in politicians, trust in journalism,
01:23:58.700 | it's declined substantially.
01:24:00.660 | And I think in many cases, deservedly.
01:24:03.140 | So how do we restore trust?
01:24:05.500 | And how do we think about that?
01:24:07.460 | - And does that also include trust in the idea of truth?
01:24:12.460 | - Trust in the idea of truth.
01:24:14.980 | Even the concept of facts and truth
01:24:17.300 | is really, really important.
01:24:18.540 | And the idea of uncomfortable truths is really important.
01:24:23.060 | Now, so when we look at Twitter, right?
01:24:28.060 | And we can see, okay, this is really hard.
01:24:30.580 | So here's my story about Twitter.
01:24:35.780 | It's a two-part story.
01:24:37.240 | And it's all pre-Elon Musk ownership.
01:24:41.700 | So many years back,
01:24:43.260 | somebody accused me of horrible crimes on Twitter.
01:24:47.700 | And I, like anybody would, I was like,
01:24:51.580 | I'm in the public eye.
01:24:52.420 | People say bad things.
01:24:53.700 | I don't really, you know, I brush it off, whatever.
01:24:55.700 | But I'm like, this is actually really bad.
01:24:57.260 | Like, accusing me of pedophilia, like, that's just not okay.
01:25:01.860 | So I thought, I'm gonna report this.
01:25:03.300 | So I click report, and I report the tweet.
01:25:05.780 | And there's five others, and I go through the process.
01:25:08.740 | And then I get an email that says, you know, whatever,
01:25:11.500 | a couple hours later, saying, thank you for your report.
01:25:13.740 | We're looking into this.
01:25:14.700 | Great, okay, good.
01:25:16.100 | Then several hours further, I get an email back saying,
01:25:18.980 | sorry, we don't see anything here
01:25:20.020 | to violate our terms of use.
01:25:22.140 | And I'm like, okay.
01:25:23.020 | So I email Jack, and I say, Jack, come on,
01:25:25.700 | like, this is ridiculous.
01:25:26.780 | And he emails back roughly saying, yeah, sorry, Jimmy.
01:25:31.100 | Don't worry, we'll sort this out.
01:25:33.260 | And I just thought to myself, you know what?
01:25:36.100 | That's not the point, right?
01:25:37.380 | I'm Jimmy Wales.
01:25:38.700 | I know Jack Dorsey.
01:25:39.660 | I can email Jack Dorsey.
01:25:40.900 | He'll listen to me 'cause he's got an email from me,
01:25:43.460 | and sorts it out for me.
01:25:44.460 | What about the teenager who's being bullied
01:25:48.140 | and is getting abuse, right,
01:25:50.660 | and getting accusations that aren't true?
01:25:52.660 | Are they getting the same kind of like really poor result
01:25:55.500 | in that case?
01:25:56.820 | So fast forward a few years, same thing happens.
01:26:00.980 | The exact quote I'll use: "Please help me.
01:26:05.620 | "I'm only 10 years old, and Jimmy Wales raped me last week."
01:26:08.740 | So I'm like, come on, fuck off.
01:26:09.780 | Like, that's ridiculous.
01:26:10.660 | So I report, I'm like, this time I'm reporting,
01:26:12.660 | but I'm thinking, well, we'll see what happens.
01:26:15.020 | This one gets even worse because then I get a same result,
01:26:19.860 | email back saying, sorry, we don't see any problems.
01:26:21.980 | So I raise it with other members of the board who I know,
01:26:24.220 | and Jack, and like, this is really ridiculous.
01:26:26.900 | Like, this is outrageous.
01:26:29.020 | And some of the board members, friends of mine,
01:26:32.100 | sympathetic, and so good for them,
01:26:34.660 | but I actually got an email back then
01:26:36.940 | from the general counsel, head of trust and safety saying,
01:26:41.380 | actually, there's nothing in this tweet
01:26:42.700 | that violates our terms of service.
01:26:44.100 | We don't regard, and gave reference to the Me Too movement.
01:26:48.100 | If we didn't allow accusations,
01:26:50.060 | the Me Too movement, it's an important thing.
01:26:52.340 | And I was like, you know what?
01:26:53.660 | Actually, if someone says I'm 10 years old
01:26:56.620 | and someone raped me last week,
01:26:57.940 | I think the advice should be,
01:26:59.340 | here's the phone number of the police.
01:27:01.180 | Like, you need to get the police involved.
01:27:02.660 | Twitter's not the place for that accusation.
01:27:05.140 | So even back then, by the way, they did delete those tweets,
01:27:08.500 | but I mean, the rationale they gave is spammy behavior.
01:27:11.860 | So completely separate from abusing me,
01:27:13.780 | it was just like, oh, well, they were retweeting too often.
01:27:16.220 | Okay, whatever.
01:27:17.660 | So like, that's just broken.
01:27:19.500 | Like, that's a system that it's not working
01:27:21.940 | for people in the public eye.
01:27:23.500 | I'm sure it's not working for private people who get abuse.
01:27:26.820 | Really horrible abuse can happen.
01:27:28.620 | So how is that today?
01:27:31.380 | Well, it hasn't happened to me since Elon took over,
01:27:34.380 | but I don't see why it couldn't.
01:27:35.820 | And I suspect now if I send a report and email someone,
01:27:38.620 | there's no one there to email me back
01:27:40.820 | 'cause he's gotten rid of a lot of the trust and safety staff.
01:27:43.980 | So I suspect that problem is still really hard.
01:27:46.820 | - Just content moderation at huge scales.
01:27:49.940 | - At huge scales is really something.
01:27:52.540 | And I don't know the full answer to this.
01:27:54.380 | I mean, a piece of it could be,
01:27:56.860 | to say actually making specific allegations of crimes,
01:28:02.940 | this isn't the place to do that.
01:28:06.180 | We've got a huge database.
01:28:07.540 | If you've got an accusation of crime,
01:28:09.620 | here's who you should call, the police, the FBI,
01:28:12.460 | whatever it is.
01:28:13.580 | It's not to be done in public.
01:28:15.340 | And then you do face really complicated questions
01:28:17.660 | about Me Too movement and people coming forward in public
01:28:20.740 | and all of that.
01:28:21.580 | But again, it's like,
01:28:23.060 | probably you should talk to a journalist, right?
01:28:24.980 | Probably there are better avenues than just tweeting
01:28:27.940 | from an account that was created 10 days ago,
01:28:31.020 | obviously set up to abuse someone.
01:28:33.180 | So I think they could do a lot better,
01:28:36.260 | but I also admit it's a hard problem.
01:28:38.140 | - And there's also ways to indirectly or more humorously
01:28:41.100 | or a more mocking way to make the same kinds of accusations.
01:28:44.380 | In fact, the accusations you mentioned,
01:28:46.300 | if I were to guess, don't go that viral
01:28:48.300 | 'cause they're not funny enough or cutting enough.
01:28:50.780 | But if you make it witty and cutting and meme it somehow,
01:28:55.540 | sometimes actually indirectly making the accusation
01:28:58.820 | versus directly making the accusation,
01:29:00.420 | that can go viral and that can destroy reputations.
01:29:03.300 | And you get to watch yourself,
01:29:05.500 | just all kinds of narratives take hold.
01:29:09.740 | - No, I mean, I remember another case that didn't bother me
01:29:13.140 | 'cause it wasn't of that nature,
01:29:16.260 | but somebody was saying,
01:29:17.860 | "I'm sure you're making millions off of Wikipedia."
01:29:22.820 | I'm like, "No, actually, I don't even work there.
01:29:25.340 | "I have no salary."
01:29:26.800 | And they're like, "You're lying.
01:29:29.340 | "I'm gonna check your 990 form,"
01:29:31.140 | which is the US form for tax reporting for charities.
01:29:35.220 | I was like, "Yeah, I'm not gonna, here's the link.
01:29:37.700 | "Go read it and you'll see I'm listed as a board member
01:29:40.400 | "and my salary is listed as zero."
01:29:41.900 | So, you know, things like that, it's like,
01:29:45.820 | "Okay, that one, that feels like you're wrong,
01:29:49.180 | "but I can take that
01:29:50.320 | "and we can have that debate quite quickly."
01:29:52.580 | And again, it didn't go viral because it was kind of silly.
01:29:54.840 | And if anything would have gone viral, it was me responding.
01:29:58.260 | But that's one where it's like,
01:29:59.140 | "Actually, I'm happy to respond
01:30:00.340 | "because a lot of people don't know that I don't work there
01:30:03.400 | "and that I don't make millions and I'm not a billionaire."
01:30:06.300 | Well, they must know that
01:30:07.340 | 'cause it's in most news media about me.
01:30:10.420 | But the other one I didn't respond to publicly
01:30:13.700 | because it's like Barbra Streisand effect.
01:30:16.420 | You know, it's like sometimes calling attention
01:30:18.780 | to someone who's abusing you
01:30:19.700 | who basically has no followers and so on is just a waste.
01:30:23.940 | - And everything you're describing now
01:30:25.100 | is just something that all of us have to kind of learn
01:30:28.060 | 'cause everybody's in the public eye.
01:30:29.460 | I think when you have just two followers
01:30:32.060 | and you get bullied by one of the followers,
01:30:33.620 | it hurts just as much as when you have a large number.
01:30:35.860 | So it's not just you; your situation, I think,
01:30:38.420 | is echoed in the situations of millions of others,
01:30:41.060 | especially teenagers and kids and so on.
01:30:43.000 | - Yeah, I mean, it's actually an example.
01:30:46.000 | So we don't generally use my picture
01:30:52.740 | in the banners anymore on Wikipedia, but we did.
01:30:56.140 | And then we did an experiment one year
01:30:58.000 | where we tried other people's pictures,
01:30:59.980 | so one of our developers.
01:31:01.860 | And, you know, one guy, lovely, very sweet guy,
01:31:05.820 | and he doesn't look like your immediate thought
01:31:09.460 | of a nerdy Silicon Valley developer.
01:31:12.060 | He looks like a heavy metal dude 'cause he's cool.
01:31:15.060 | And so suddenly here he is with long hair and tattoos
01:31:18.500 | and there's him sort of saying,
01:31:20.660 | here's what your money goes for,
01:31:22.140 | here's my letter asking for support.
01:31:24.700 | And he got massive abuse from being on Wikipedia,
01:31:27.100 | like calling him creepy and, you know, like really massive.
01:31:30.420 | And this was being shown to 80 million people a day.
01:31:33.300 | His picture, not the abuse, right?
01:31:35.140 | The abuse was elsewhere on the internet.
01:31:37.420 | And he was bothered by it.
01:31:39.140 | And I thought, you know what, there is a difference.
01:31:40.940 | I actually am in the public eye.
01:31:43.340 | I get huge benefits from being in the public eye.
01:31:45.740 | I go around and make public speeches.
01:31:47.220 | If any random thing I think of,
01:31:48.820 | I can write and get it published in the New York Times
01:31:51.460 | and, you know, have this interesting life.
01:31:53.540 | He's not a public figure.
01:31:54.860 | And so actually he wasn't mad at us.
01:31:58.460 | He wasn't mad, you know, it was just like,
01:32:00.180 | yeah, actually suddenly being thrust in the public eye
01:32:03.100 | and you get suddenly lots of abuse,
01:32:05.340 | which normally, you know, if you're a teenager
01:32:07.980 | and somebody in your class is abusing you,
01:32:09.620 | it's not gonna go viral.
01:32:11.200 | So you're only gonna, it's gonna be hurtful
01:32:12.780 | because it's local and it's your classmates or whatever.
01:32:16.320 | But when sort of ordinary people go viral
01:32:19.700 | in some abusive way, it's really, really quite tragic.
01:32:24.020 | - I don't know, even at a small scale, it feels viral.
01:32:27.220 | When five people-- - I suppose you're right, yeah.
01:32:28.900 | - Five people at your school and there's a rumor
01:32:31.180 | and there's this feeling like you're surrounded
01:32:33.620 | and nobody, and the feeling of loneliness, I think,
01:32:36.640 | which you're speaking to when you don't have a plat,
01:32:39.860 | when you at least feel like you don't have a platform
01:32:42.500 | to defend yourself.
01:32:43.900 | And then this powerlessness that I think
01:32:46.140 | a lot of teenagers definitely feel and a lot of people.
01:32:49.340 | - I think you're right, yeah.
01:32:50.260 | - And that, I think even when just like two people
01:32:53.740 | make up stuff about you or lie about you
01:32:55.980 | or say mean things about you or bully you,
01:32:58.340 | that can feel like a crowd.
01:33:00.420 | - Yeah, yeah, no, it's true.
01:33:03.180 | - I mean, whatever that is in our genetics,
01:33:05.020 | in our biology, in the way our brain works,
01:33:07.700 | it just can be a terrifying experience.
01:33:09.620 | And somehow to correct that, I mean, I think,
01:33:14.580 | because everybody feels the pain of that,
01:33:16.180 | everybody suffers the pain of that,
01:33:17.380 | I think we'll be forced to fix that as a society
01:33:20.500 | to figure out a way around that.
01:33:21.900 | - I think it's really hard to fix
01:33:23.260 | because that problem isn't necessarily new.
01:33:28.780 | Someone in high school who writes graffiti
01:33:33.780 | that says Becky is a slut and spreads a rumor
01:33:39.180 | about what Becky did last weekend,
01:33:41.420 | that's always been damaging, it's always been hurtful.
01:33:43.500 | And that's really hard.
01:33:45.020 | - Those kinds of attacks are as old as time itself.
01:33:47.660 | They precede the internet.
01:33:49.060 | Now, what do you think about this technology
01:33:50.540 | that feels Wikipedia-like,
01:33:53.460 | which is community notes on Twitter?
01:33:55.740 | Do you like it?
01:33:57.500 | - Yeah. - Pros and cons?
01:33:58.940 | Do you think it's scalable?
01:34:00.060 | - I do like it.
01:34:00.900 | I don't know enough about specifically how it's implemented
01:34:03.900 | to really have a very deep view,
01:34:06.820 | but I do think it's quite,
01:34:08.260 | the uses I've seen of it I've found quite good.
01:34:11.700 | And in some cases, changed my mind.
01:34:16.060 | It's like I see something,
01:34:18.780 | and of course, the sort of human tendency
01:34:21.140 | is to retweet something that you,
01:34:26.540 | hope is true or that you are afraid is true.
01:34:29.980 | Or it's like that kind of quick mental action.
01:34:34.740 | And then I saw something that I liked and agreed with,
01:34:37.420 | and then a community note under it that made me think,
01:34:39.780 | oh, actually, this is a more nuanced issue.
01:34:42.740 | So I like that.
01:34:44.180 | I think that's really important.
01:34:46.060 | Now, how is it specifically implemented?
01:34:47.580 | Is it scalable?
01:34:48.420 | I don't really know how they've done it,
01:34:49.740 | so I can't really comment on that.
01:34:51.740 | But in general, I do think it's,
01:34:55.740 | when your only mechanisms on Twitter,
01:34:59.780 | and you're a big Twitter user,
01:35:01.100 | we know the platform,
01:35:02.180 | and you've got plenty of followers and all of that.
01:35:05.180 | The only mechanisms are retweeting,
01:35:08.260 | replying, blocking.
01:35:11.940 | It's a pretty limited scope,
01:35:15.460 | and it's kind of good if there's a way
01:35:17.180 | to elevate a specific thoughtful response.
01:35:21.300 | And it kind of goes to, again,
01:35:23.100 | does the algorithm just pick the retweet or the,
01:35:27.060 | I mean, retweeting,
01:35:28.060 | it's not even the algorithm that makes it viral.
01:35:30.660 | Like, if Paulo Coelho, very famous author,
01:35:35.620 | I think he's got like, I don't know,
01:35:36.980 | I haven't looked lately.
01:35:37.820 | He used to have 8 million Twitter followers.
01:35:39.100 | I think I looked, he's got 16 million now or whatever.
01:35:41.540 | Well, if he retweets something,
01:35:42.540 | it's gonna get seen a lot.
01:35:44.460 | Or Elon Musk, if he retweets something,
01:35:46.140 | it's gonna get seen a lot.
01:35:47.220 | That's not an algorithm,
01:35:48.220 | that's just the way the platform works.
01:35:50.460 | So it is kind of nice if you have something else,
01:35:53.900 | and how that something else is designed,
01:35:55.300 | that's obviously a complicated question.
01:35:57.980 | - Well, there's an interesting thing
01:35:59.460 | that I think Twitter is doing,
01:36:01.220 | but I know Facebook is doing for sure,
01:36:03.300 | which is really interesting.
01:36:06.460 | So you have, what are the signals
01:36:08.540 | that a human can provide at scale?
01:36:10.540 | Like, in Twitter, it's retweet.
01:36:13.460 | - Yep.
01:36:14.300 | - In Facebook, I think you can share.
01:36:16.420 | I forget, but there's basic interactions.
01:36:18.020 | You can have comment and so on.
01:36:19.500 | - Yeah.
01:36:20.340 | - But also in Facebook, and YouTube has this too,
01:36:23.060 | is would you like to see more of this,
01:36:26.380 | or would you like to see less of this?
01:36:28.740 | They post that sometimes.
01:36:30.180 | And the thing that the neural net
01:36:33.260 | that's learning from that has to figure out
01:36:35.100 | is the intent behind you saying,
01:36:37.700 | I wanna see less of this.
01:36:39.180 | Did you see too much of this content already?
01:36:42.100 | You like it, but you don't wanna see so much of it.
01:36:45.380 | You already figured it out, great.
01:36:47.620 | Or does this content not make you feel good?
01:36:50.540 | There's so many interpretations
01:36:51.940 | to how I'd like to see less of this.
01:36:53.380 | But if you get that kind of signal,
01:36:55.940 | this actually can create a really powerfully curated
01:37:00.940 | list of content that is fed to you every day.
01:37:05.420 | That doesn't create an echo chamber or a silo.
01:37:08.620 | It actually just makes you feel good in the good way,
01:37:12.860 | which is like it challenges you,
01:37:14.520 | but it doesn't exhaust you
01:37:16.300 | and make you kind of this weird animal.
01:37:20.180 | - I've been saying for a long time,
01:37:21.660 | if I went on Facebook one morning
01:37:23.340 | and they said, oh, we're testing a new option,
01:37:26.680 | rather than showing you things
01:37:29.180 | we think you're going to like,
01:37:31.020 | we wanna show you some things
01:37:32.080 | that we think you will disagree with,
01:37:34.500 | but which we have some signals that suggest it's of quality.
01:37:38.780 | Like, now that sounds interesting.
01:37:40.140 | - Yeah, that sounds really interesting.
01:37:41.260 | - I wanna see something where,
01:37:43.340 | you know, like, oh, I don't agree with.
01:37:45.300 | So Larry Lessig is a good friend of mine,
01:37:48.940 | founder of Creative Commons,
01:37:50.060 | and he's moved on to doing stuff
01:37:51.260 | about corruption in politics and so on.
01:37:53.340 | And I don't always agree with Larry,
01:37:55.380 | but I always grapple with Larry
01:37:57.700 | because he's so interesting and he's so thoughtful
01:37:59.900 | that even when we don't agree,
01:38:01.920 | I'm like, actually, I wanna hear him out, right?
01:38:04.780 | Because I'm gonna learn from it.
01:38:06.660 | And that doesn't mean I always come around
01:38:08.140 | to agreeing with him,
01:38:08.960 | but I'm gonna understand a perspective of it.
01:38:10.340 | And that's really a great feeling.
01:38:12.620 | - Yeah, there's this interesting thing on social media
01:38:14.420 | where people kind of accuse others of saying,
01:38:17.780 | well, you don't wanna hear opinions
01:38:19.180 | that you disagree with or ideas you disagree with.
01:38:21.500 | I think this is something that's thrown at me all the time.
01:38:24.940 | The reality is there's literally
01:38:27.020 | almost nothing I enjoy more.
01:38:29.700 | - It's an odd thing to accuse you of
01:38:31.140 | 'cause you have quite a wide range of long conversations
01:38:34.180 | with a very diverse bunch of people.
01:38:35.820 | - But there is a very,
01:38:37.480 | there is like a very harsh drop-off
01:38:41.780 | because what I like is high quality disagreement
01:38:44.460 | that really makes me think.
01:38:45.900 | And at a certain point, there's a threshold,
01:38:47.780 | it's a kind of a gray area
01:38:49.020 | when the quality of the disagreement,
01:38:50.900 | it just sounds like mocking
01:38:52.340 | and you're not really interested
01:38:54.480 | in a deep understanding of the topic
01:38:56.460 | or you yourself don't seem to carry
01:38:57.940 | deep understanding of the topic.
01:38:59.300 | Like there's something called Intelligence Squared Debates.
01:39:03.660 | The main one is the British version.
01:39:05.860 | With the British accent, everything always sounds better.
01:39:08.400 | And the Brits seem to argue more intensely,
01:39:11.800 | like they're invigorated, they're energized by the debate.
01:39:15.400 | Those people, I often disagree
01:39:17.880 | with basically everybody involved and it's so fun.
01:39:20.820 | I learned something, that's high quality.
01:39:23.280 | If we could do that,
01:39:24.440 | if there's some way for me to click a button that says,
01:39:27.580 | filter out lower quality just today.
01:39:31.960 | Sometimes show it to me 'cause I wanna be able to,
01:39:34.720 | but today, I'm just not in the mood for the mockery.
01:39:38.240 | Just high quality stuff, even flat Earth.
01:39:42.000 | I wanna get high quality arguments for the flat Earth.
01:39:45.280 | It would make me feel good
01:39:46.800 | because I would see, oh, that's really interesting.
01:39:49.520 | Like I never really thought in my mind
01:39:52.720 | to challenge the mainstream narrative
01:39:55.800 | of general relativity, right?
01:40:00.000 | Of a perception of physics.
01:40:01.920 | Maybe all of reality,
01:40:03.120 | maybe all of space time is an illusion.
01:40:06.380 | That's really interesting.
01:40:07.480 | I never really thought about, let me consider that fully.
01:40:10.120 | Okay, what's the evidence?
01:40:11.040 | How do you test that?
01:40:12.280 | What are the alternatives?
01:40:14.160 | How would you be able to have such consistent perception
01:40:18.200 | of a physical reality if all of it is an illusion?
01:40:21.440 | All of us seem to share the same kind
01:40:23.080 | of perception of reality.
01:40:24.520 | That's the kind of stuff I love,
01:40:27.060 | but not like the mockery of it.
01:40:28.760 | It seems that social media can kind of inspire.
01:40:34.760 | - Yeah, I talk sometimes about how people assume
01:40:38.680 | that like the big debates in Wikipedia
01:40:41.560 | or the sort of arguments are between the party of the left
01:40:46.560 | and the party of the right.
01:40:47.880 | And I always say, no, it's actually the party
01:40:49.760 | of the kind and thoughtful and the party of the jerks
01:40:52.600 | is really it.
01:40:53.960 | I mean, left and right, like, yeah,
01:40:56.120 | bring me somebody I disagree with politically
01:40:57.840 | as long as they're thoughtful, kind,
01:40:59.840 | we're gonna have a real discussion.
01:41:01.840 | I give an example of our article on abortion.
01:41:06.840 | So, if you can bring together a kind
01:41:12.120 | and thoughtful Catholic priest
01:41:13.640 | and a kind and thoughtful Planned Parenthood activist,
01:41:16.680 | and they're gonna work together on the article on abortion,
01:41:19.840 | that can be a really great thing.
01:41:22.840 | If they're both kind and thoughtful,
01:41:24.040 | like that's the important part.
01:41:25.560 | They're never gonna agree on the topic,
01:41:27.420 | but they will understand, okay,
01:41:28.760 | like Wikipedia is not gonna take a side,
01:41:31.400 | but Wikipedia is gonna explain what the debate is about.
01:41:33.680 | And we're gonna try to characterize it fairly.
01:41:36.800 | And it turns out like you're kind and thoughtful people,
01:41:39.520 | even if they're quite ideological,
01:41:41.280 | like a Catholic priest is generally gonna be
01:41:42.920 | quite ideological on the subject of abortion,
01:41:45.720 | but they can grapple with ideas and they can discuss,
01:41:49.640 | and they may feel very proud of the entry
01:41:51.520 | at the end of the day,
01:41:52.680 | not because they suppress the other side's views,
01:41:55.260 | but because they think the case has been stated very well,
01:41:58.140 | that other people can come to understand it.
01:42:00.320 | And if you're highly ideological,
01:42:01.760 | you assume, I think naturally,
01:42:04.160 | if people understood as much about this as I do,
01:42:06.420 | they'll probably agree with me.
01:42:07.560 | You may be wrong about that, but that's often the case.
01:42:10.640 | So that's where, you know,
01:42:12.240 | that's what I think we need to encourage more of
01:42:14.740 | in society generally is grappling with ideas
01:42:17.840 | in a really, you know, thoughtful way.
01:42:21.920 | - So is it possible if the majority of volunteers,
01:42:25.880 | editors of Wikipedia really dislike Donald Trump?
01:42:29.980 | - Mm.
01:42:31.460 | - Are they still able to write an article
01:42:34.440 | that empathizes with the perspective of,
01:42:38.520 | for a time at least,
01:42:39.700 | a very large percentage of the United States
01:42:41.920 | that were supporters of Donald Trump,
01:42:43.880 | and to have a full, broad representation
01:42:47.280 | of him as a human being, him as a political leader,
01:42:50.440 | him as a set of policies promised
01:42:53.320 | and implemented, all that kind of stuff?
01:42:55.620 | - Yeah, I think so.
01:42:57.120 | And I think if you read the article, it's pretty good.
01:43:00.280 | And I think a piece of that is within our community,
01:43:05.760 | if people have the self-awareness to understand,
01:43:11.120 | so I personally wouldn't go
01:43:13.980 | and edit the entry on Donald Trump.
01:43:15.600 | I get emotional about it, and I'm like,
01:43:17.600 | I'm not good at this.
01:43:19.240 | And if I tried to do it, I would fail.
01:43:21.720 | I wouldn't be a good Wikipedian.
01:43:23.580 | So it's better if I just step back
01:43:25.180 | and let people who are more dispassionate
01:43:27.480 | on this topic edit it.
01:43:29.600 | Whereas there are other topics
01:43:30.920 | that are incredibly emotional to some people,
01:43:33.900 | where I can actually do quite well.
01:43:36.300 | Like, I'm gonna be okay.
01:43:38.020 | Maybe, we were discussing earlier, the efficacy of masks.
01:43:42.560 | I'm like, oh, I think that's an interesting problem,
01:43:44.440 | and I don't know the answer,
01:43:45.800 | but I can help kind of catalog what's the best evidence,
01:43:48.400 | and so on, and I'm not gonna get upset.
01:43:50.680 | I'm not gonna get angry.
01:43:51.920 | I'm able to be a good Wikipedian.
01:43:54.300 | So I think that's important.
01:43:55.600 | And I do think, though, in a related framework,
01:44:00.600 | that the composition of the community is really important.
01:44:07.140 | Not because Wikipedia is or should be a battleground,
01:44:11.200 | but because of blind spots.
01:44:13.040 | Like, maybe I don't even realize what's biased,
01:44:15.480 | if I'm particularly of a certain point of view,
01:44:18.240 | and I've never thought much about it.
01:44:20.080 | So one of the things we focus on a lot,
01:44:23.080 | the Wikipedia volunteers are,
01:44:26.640 | we don't know the exact number,
01:44:27.720 | but let's say 80% plus male.
01:44:31.320 | And they're of a certain demographic.
01:44:33.000 | They tend to be college-educated,
01:44:35.960 | heavier on tech geeks than not, et cetera, et cetera.
01:44:39.900 | So there is a demographic to the community,
01:44:42.160 | and that's pretty much global.
01:44:43.220 | I mean, somebody said to me once,
01:44:44.800 | "Why is it only white men who edit Wikipedia?"
01:44:47.280 | And I said, "You've obviously not met
01:44:49.080 | "the Japanese Wikipedia community."
01:44:51.440 | It's kind of a joke, because the broader principle
01:44:54.000 | still stands, who edits Japanese Wikipedia?
01:44:56.280 | A bunch of geeky men, right?
01:44:59.520 | And women as well.
01:45:00.760 | So we do have women in the community,
01:45:01.960 | and that's very important.
01:45:03.240 | But we do think, okay, you know what?
01:45:04.720 | That does lead to some problems.
01:45:06.800 | It leads to some content issues,
01:45:10.120 | simply because people write more about what they know
01:45:13.280 | and what they're interested in.
01:45:14.840 | They'll tend to be dismissive of things as being unimportant,
01:45:18.720 | if it's not something that they personally
01:45:20.680 | have an interest in.
01:45:21.720 | I like the example, as a parent, I would say,
01:45:27.680 | our entries on early childhood development
01:45:29.760 | probably aren't as good as they should be,
01:45:31.640 | because a lot of the Wikipedia volunteers,
01:45:33.680 | actually, we're getting older, the Wikipedians,
01:45:35.800 | so the demographic has changed a bit.
01:45:38.360 | But it's like, if you've got a bunch of 25-year-old
01:45:42.320 | tech geek dudes who don't have kids,
01:45:46.800 | they're just not gonna be interested
01:45:48.200 | in early childhood development.
01:45:49.680 | And if they tried to write about it,
01:45:50.800 | they probably wouldn't do a good job,
01:45:51.880 | 'cause they don't know anything about it.
01:45:53.320 | And somebody did a look at our entries on novelists
01:45:58.320 | who've won a major literary prize.
01:46:00.600 | And they looked at the male novelist versus the female.
01:46:03.520 | And the male novelist had longer and higher quality entries.
01:46:07.200 | And why is that?
01:46:08.040 | Well, it's not because, 'cause I know hundreds
01:46:11.200 | of Wikipedians, it's not because these are a bunch
01:46:13.720 | of biased, sexist men who are like,
01:46:17.600 | books by women are not important.
01:46:20.160 | It's like, no, actually, there is a gender
01:46:25.160 | kind of breakdown of readership.
01:46:27.240 | There are genres, like hard science fiction
01:46:31.280 | is a classic example.
01:46:32.120 | Hard science fiction is mostly read by men.
01:46:35.680 | Other types of novels, more read by women.
01:46:39.200 | And if we don't have women in the community,
01:46:41.000 | then these award-winning, clearly important novelists
01:46:44.880 | may have less coverage.
01:46:45.960 | And not because anybody consciously thinks,
01:46:48.440 | oh, we don't like what, a book by Maya Angelou,
01:46:51.680 | like who cares, she's a poet, that's not interesting.
01:46:55.080 | No, but just because, well, people write what they know,
01:46:57.320 | they write what they're interested in.
01:46:58.680 | So we do think diversity in the community
01:47:00.620 | is really important.
01:47:01.760 | And that's one area where I do think it's really clear.
01:47:05.080 | But I can also say, you know what, actually,
01:47:07.600 | that also applies in the political sphere.
01:47:10.720 | Like, to say, actually, we do want kind
01:47:13.800 | and thoughtful Catholic priests,
01:47:16.560 | kind and thoughtful conservatives,
01:47:18.040 | kind and thoughtful libertarians,
01:47:19.920 | kind and thoughtful Marxists to come in.
01:47:23.460 | But the key is the kind and thoughtful piece.
01:47:25.440 | So when people sometimes come to Wikipedia,
01:47:28.360 | outraged by some dramatic thing that's happened on Twitter,
01:47:33.000 | they come to Wikipedia with a chip on their shoulder,
01:47:35.000 | ready to do battle, and it just doesn't work out very well.
01:47:38.760 | - And there's tribes in general where,
01:47:41.320 | I think there's a responsibility on the larger group
01:47:45.360 | to be even kinder and more welcoming to the smaller group.
01:47:48.840 | - Yeah, we think that's really important.
01:47:50.840 | And so, you know, oftentimes people come in,
01:47:53.640 | and you know, there's a lot,
01:47:55.840 | when I talk about community health,
01:47:57.560 | one of the aspects of that that we do think about a lot,
01:48:00.880 | that I think about a lot, is not about politics.
01:48:05.440 | It's just like, how are we treating newcomers
01:48:08.880 | to the community?
01:48:10.440 | And so I can tell you what our ideals are,
01:48:13.480 | what our philosophy is, but do we live up to that?
01:48:17.160 | So, you know, the ideal is you come to Wikipedia,
01:48:19.320 | you know, we have rules,
01:48:21.680 | like one of our fundamental rules is ignore all rules,
01:48:24.760 | which is partly written that way
01:48:26.440 | because it kind of piques people's attention,
01:48:29.080 | like, what the hell kind of rule is that, you know?
01:48:32.080 | But basically says, look, don't get nervous and depressed
01:48:35.680 | about a bunch of, you know,
01:48:37.480 | what's the formatting of your footnote, right?
01:48:39.680 | So you shouldn't come to Wikipedia,
01:48:41.840 | add a link and then get banned or yelled at
01:48:43.840 | because it's not the right format.
01:48:45.560 | Instead, somebody should go, oh, hey,
01:48:48.920 | yeah, thanks for helping,
01:48:50.360 | but, you know, here's the link to how to format,
01:48:55.000 | you know, if you want to keep going,
01:48:56.320 | you might want to learn how to format a footnote.
01:48:59.480 | And to be friendly and to be open and to say,
01:49:01.600 | oh, right, oh, you're new,
01:49:02.960 | and you clearly don't know everything about Wikipedia.
01:49:06.280 | And, you know, sometimes in any community,
01:49:08.440 | that can be quite hard.
01:49:09.280 | So people come in and they've got a great big idea
01:49:12.840 | and they're going to propose this to the Wikipedia community
01:49:14.760 | and they have no idea.
01:49:16.360 | That's basically a perennial discussion
01:49:18.000 | we've had 7,000 times before.
01:49:20.880 | And so then ideally you would say to the person,
01:49:23.560 | oh, yeah, great, thanks.
01:49:25.320 | Like a lot of people have, and here's where we got to,
01:49:27.840 | and here's the nuanced conversation we've had about that
01:49:30.720 | in the past that I think you'll find interesting.
01:49:32.960 | And sometimes people are just like, oh God, another one,
01:49:34.920 | you know, who's come in with this idea, which doesn't work.
01:49:37.520 | And they don't understand why.
01:49:39.040 | - You can lose patience, but you shouldn't.
01:49:40.720 | - And that's kind of human, you know.
01:49:42.640 | But I think it just does require really thinking,
01:49:45.640 | you know, in a self-aware manner of like,
01:49:50.640 | oh, I was once a newbie.
01:49:52.880 | Actually, we have a great,
01:49:54.240 | I just did an interview with Emily Temple-Wood,
01:49:58.200 | who was Wikipedian of the Year.
01:50:00.280 | She's just like a great, well-known Wikipedian.
01:50:03.160 | And I interviewed her for my book
01:50:04.640 | and she told me something I never knew.
01:50:06.840 | Apparently it's not secret.
01:50:07.960 | Like, she didn't reveal it to me,
01:50:09.360 | but is that when she started at Wikipedia, she was a vandal.
01:50:13.560 | She came in and vandalized Wikipedia.
01:50:15.800 | And then basically what happened was
01:50:17.560 | she'd done some sort of vandalism on a couple of articles
01:50:21.320 | and then somebody popped up on her talk page and said,
01:50:23.840 | "Hey, like, why are you doing this?
01:50:25.240 | "Like, we're trying to make an encyclopedia here."
01:50:27.480 | And this wasn't very kind.
01:50:29.160 | And she felt so bad.
01:50:31.000 | She's like, "Oh, right, I didn't really think of it that way."
01:50:33.560 | She just was coming in as, she was like 13 years old,
01:50:36.560 | combative and, you know, like having fun and trolling a bit.
01:50:39.680 | And then she's like, "Oh, actually, oh, I see your point."
01:50:42.360 | And became a great Wikipedian.
01:50:43.600 | So that's the ideal really,
01:50:45.600 | is that you don't just go, troll, block, fuck off.
01:50:48.720 | You go, "Hey, you know, like, what gives?"
01:50:51.320 | Which is, I think the way we tend to treat things
01:50:56.040 | in real life, you know, if you've got somebody
01:50:58.800 | who's doing something obnoxious in your friend group,
01:51:01.840 | you probably go, "Hey, like, really,
01:51:05.320 | "I don't know if you've noticed,
01:51:06.920 | "but I think this person is actually quite hurt
01:51:09.520 | "that you keep making that joke about them."
01:51:11.600 | And then they usually go, "Oh, you know what?
01:51:13.800 | "I didn't, I thought that was okay, I didn't."
01:51:15.800 | And then they stop.
01:51:17.000 | Or they keep it up and then everybody goes,
01:51:19.200 | "Well, you're the asshole."
01:51:20.600 | - Well, yeah, I mean, that's just an example
01:51:23.680 | that gives me faith in humanity,
01:51:25.600 | that we're all capable and wanting to be kind to each other.
01:51:30.600 | And in general, the fact that there's a small group
01:51:33.920 | of volunteers that are able to contribute so much
01:51:38.400 | to the organization, the collection,
01:51:40.320 | the discussion of all of human knowledge,
01:51:45.640 | it's so, it makes me so grateful
01:51:47.680 | to be part of this whole human project.
01:51:50.320 | That's one of the reasons I love Wikipedia,
01:51:52.480 | is it gives me faith in humanity.
01:51:54.360 | - No, I once was at Wikimania, our annual conference,
01:51:59.360 | and people come from all around the world,
01:52:01.760 | like really active volunteers.
01:52:04.520 | I was at the dinner, we were in Egypt,
01:52:06.720 | at Wikimania in Alexandria,
01:52:08.360 | at the sort of closing dinner or whatever.
01:52:11.040 | And a friend of mine came and sat at the table,
01:52:12.720 | and she's sort of been in the movement more broadly,
01:52:16.000 | Creative Commons, she's not really a Wikipedian,
01:52:17.800 | she'd come to the conference
01:52:18.640 | 'cause she's into Creative Commons and all that.
01:52:21.280 | So we have dinner and it just turned out,
01:52:22.840 | I sat down at the table with most of the members
01:52:25.280 | of the English Language Arbitration Committee.
01:52:27.600 | And they're a bunch of very sweet, geeky Wikipedians.
01:52:31.320 | And as we left the table, I said to her,
01:52:33.720 | "It's really like, I still find this
01:52:36.960 | "kind of sense of amazement.
01:52:38.960 | "Like we just had dinner with some of the most powerful
01:52:41.340 | "people in English language media."
01:52:43.760 | 'Cause they're the people who are like
01:52:44.880 | the final court of appeal in English Wikipedia.
01:52:48.000 | And thank goodness they're not media moguls, right?
01:52:50.880 | They're just a bunch of geeks,
01:52:52.480 | who are just like well-liked in the community
01:52:54.920 | 'cause they're kind and they're thoughtful
01:52:56.200 | and they really sort of think about things.
01:52:59.160 | I was like, "This is great, love Wikipedia."
01:53:01.840 | - It's like, the degree to which geeks run the best aspects
01:53:06.840 | of human civilization brings me joy in all aspects.
01:53:10.840 | And this is true in programming, like Linux,
01:53:14.120 | like programmers, like people that kind of specialize
01:53:18.040 | in a thing and they don't really get caught up
01:53:21.980 | into the mess of the bickering of society.
01:53:25.680 | They just kind of do their thing
01:53:27.120 | and they value the craftsmanship of it,
01:53:29.720 | the competence of it.
01:53:30.560 | - Well, if you've never heard of this or looked into it,
01:53:33.480 | you'll enjoy it.
01:53:34.720 | I read something recently that I didn't even know about,
01:53:36.760 | but like the fundamental time zones
01:53:41.760 | and they change from time to time.
01:53:44.440 | Sometimes a country will pass daylight savings
01:53:46.700 | or move it by a week, whatever.
01:53:49.080 | There's a file that's done all sort of UNIX-based computers
01:53:54.080 | and basically all computers end up using this file.
01:53:56.880 | It's the official time zone file, but why is it official?
01:53:59.800 | It's just this one guy.
01:54:01.640 | It's like this guy and a group, a community around him.
01:54:04.600 | And basically something weird happened
01:54:07.680 | and it broke something because he was on vacation.
01:54:11.040 | And I'm just like, isn't that wild, right?
01:54:13.520 | That you would think, I mean, first of all,
01:54:15.320 | most people never even think about like,
01:54:16.560 | how do computers know about time zones?
01:54:18.940 | Well, they know 'cause they just use this file,
01:54:22.040 | which tells all the time zones
01:54:23.320 | and which dates they change and all of that.
01:54:25.760 | But there's this one guy and he doesn't get paid for it.
01:54:28.240 | It's just, he's like, with all the billions of people
01:54:31.760 | on the planet, he sort of put his hand up and goes,
01:54:33.840 | yo, I'll take care of the time zones.
01:54:36.080 | - And there's a lot, a lot, a lot of programmers
01:54:38.920 | listening to this right now with PTSD about time zones.
01:54:43.520 | And then there, I mean, there's on top of this one guy,
01:54:46.680 | there's other libraries,
01:54:48.480 | the different programming languages
01:54:49.840 | that help manage the time zones for you,
01:54:51.460 | but still there's just within those,
01:54:55.280 | it's amazing just the packages, the libraries,
01:54:58.080 | how few people build them out of their own love
01:55:02.000 | for building, for creating, for community and all of that.
01:55:05.160 | It's, I almost like don't want to interfere
01:55:08.080 | with the natural habitat of the geek, right?
01:55:10.480 | Like when you spot 'em in the wild,
01:55:11.960 | you just wanna be like, whoa, careful.
01:55:14.160 | That thing, that thing needs to be treasured.
01:55:17.000 | - I met a guy many years ago, lovely, really sweet guy.
01:55:20.720 | And he was running a bot on English Wikipedia
01:55:25.440 | that I thought, wow, that's actually super clever.
01:55:27.640 | And what he had done is, his bot was like spell checking,
01:55:31.280 | but rather than simple spell checking,
01:55:33.280 | what he had done is create a database
01:55:35.720 | of words that are commonly mistaken for other words.
01:55:39.240 | They're spelled wrong, so I can't even give an example.
01:55:43.040 | And so the word is, people often spell it wrong,
01:55:47.000 | but no spell checker catches it
01:55:49.520 | because it is another word.
01:55:52.200 | And so what he did is, he wrote a bot
01:55:54.840 | that looks for these words and then checks the sentence
01:55:58.400 | around it for certain keywords.
01:56:00.480 | So in some context, this isn't correct,
01:56:04.920 | but buoy and boy, people sometimes type B-O-Y
01:56:09.920 | when they mean B-U-O-Y.
01:56:11.760 | So if he sees the word boy, B-O-Y in an article,
01:56:14.560 | he would look in the context and see,
01:56:15.960 | is this a nautical reference?
01:56:17.520 | And if it was, he didn't auto correct,
01:56:19.840 | he just would flag it up to himself to go,
01:56:21.960 | oh, check this one out.
01:56:23.120 | And that's not a great example,
01:56:24.440 | but he had thousands of examples.
01:56:26.640 | I was like, that's amazing.
01:56:28.200 | Like I would have never thought to do that.
01:56:30.240 | And I'm glad that somebody did.
01:56:31.640 | And that's also part of the openness of the system.
01:56:34.800 | And also I think being a charity,
01:56:37.200 | being this idea of like, actually,
01:56:40.320 | this is a gift to the world that makes someone go,
01:56:44.040 | oh, oh, well, I'll put my hand up.
01:56:46.080 | Like I see a little piece of things that I can make better
01:56:48.880 | 'cause I'm a good programmer
01:56:49.880 | and I can write this script to do this thing
01:56:51.480 | and I'll find it fun.
01:56:52.600 | Amazing.
01:56:54.600 | - Well, I gotta ask about this big, bold decision
01:56:58.520 | at the very beginning to not do advertisements
01:57:00.400 | on the website and just in general,
01:57:02.880 | the philosophy of the business model, Wikipedia,
01:57:05.000 | what went behind that?
01:57:06.440 | - Yeah, so I think most people know this,
01:57:09.920 | but we're a charity.
01:57:11.080 | So in the US, you know, registered as a charity.
01:57:15.800 | And we don't have any ads on the site.
01:57:19.120 | And the vast majority of the money is from donations,
01:57:24.040 | but the vast majority from small donors.
01:57:26.680 | So people giving 25 bucks or whatever.
01:57:29.200 | - If you're listening to this, go donate.
01:57:31.000 | - Go donate. - Donate now.
01:57:32.960 | I've donated so many times.
01:57:34.480 | - And we have, you know, millions of donors every year,
01:57:36.680 | but it's like a small percentage of people.
01:57:38.640 | I would say in the early days,
01:57:39.880 | a big part of it was aesthetic almost
01:57:42.880 | as much as anything else.
01:57:43.960 | It was just like, I just think,
01:57:46.080 | I don't really want ads in Wikipedia.
01:57:48.360 | Like, I just think it would be,
01:57:50.200 | there's a lot of reasons why it might not be good.
01:57:52.080 | And even back then, I didn't think as much as I have since
01:57:57.080 | about how a business model can tend to drive you
01:58:01.880 | in a certain place.
01:58:03.320 | And really thinking that through in advance
01:58:06.720 | is really important because you might say,
01:58:08.880 | yeah, we're really, really keen on community control
01:58:13.360 | and neutrality, but if we had an advertising-based
01:58:16.400 | business model, probably that would begin to erode.
01:58:19.880 | Even if I believe in it very strongly,
01:58:22.120 | organizations tend to follow the money
01:58:24.320 | in their DNA in the long run.
01:58:25.720 | And so things like, I mean, it's easy to think about
01:58:29.960 | some of the immediate problems.
01:58:31.120 | So like if you go to read about, I don't know,
01:58:36.120 | Nissan car company.
01:58:42.480 | And if you saw an ad for the new Nissan
01:58:44.960 | at the top of the page, you might be like,
01:58:46.880 | did they pay for this?
01:58:47.880 | Or like, do the advertisers have influence over the content?
01:58:51.040 | Because you kind of wonder about that
01:58:52.280 | for all kinds of media.
01:58:53.680 | - And that undermines trust.
01:58:55.480 | - Undermines trust, right?
01:58:57.000 | But also things like, we don't have clickbait headlines
01:59:01.600 | in Wikipedia.
01:59:02.440 | You've never seen Wikipedia entries
01:59:05.040 | with all this kind of listicles,
01:59:07.720 | sort of the 10 funniest cat pictures,
01:59:11.320 | number seven make you cry.
01:59:13.120 | None of that kind of stuff, 'cause there's no incentive,
01:59:14.880 | no reason to do that.
01:59:16.480 | Also, there's no reason to have an algorithm to say,
01:59:21.440 | actually, we're gonna use our algorithm to drive you
01:59:25.040 | to stay on the website longer.
01:59:26.560 | We're gonna use the algorithm to drive you to,
01:59:29.160 | it's like, oh, you're reading about Queen Victoria.
01:59:32.680 | There's nothing to sell you when you're reading
01:59:34.120 | about Queen Victoria.
01:59:34.960 | Let's move you on to Las Vegas,
01:59:36.160 | 'cause actually the ad revenue around hotels
01:59:38.440 | in Las Vegas is quite good.
01:59:40.040 | So we don't have that sort of,
01:59:42.320 | there's no incentive for the organization to go,
01:59:44.320 | oh, let's move people around to things
01:59:46.480 | that have better ad revenue.
01:59:47.960 | Instead, it's just like, oh, well,
01:59:50.040 | what's most interesting to the community,
01:59:51.880 | just to make those links.
01:59:53.040 | So that decision just seemed obvious to me,
01:59:58.040 | but as I say, it was less of a business decision
02:00:03.120 | and more of an aesthetic.
02:00:04.600 | It's like, oh, this is how I,
02:00:06.560 | I like Wikipedia, it doesn't have ads.
02:00:08.320 | Don't really want, in these early days,
02:00:11.160 | like a lot of the ads, that was well before the era
02:00:14.720 | of really quality ad targeting and all that.
02:00:16.920 | So you get a lot of--
02:00:18.280 | - Banners.
02:00:19.120 | - Banners, punch the monkey ads
02:00:20.680 | and all that kind of nonsense.
02:00:22.560 | And so, but there was no guarantee.
02:00:27.560 | There was no, it was not really clear
02:00:30.160 | how could we fund this?
02:00:33.240 | Like it was pretty cheap,
02:00:34.600 | and it still is quite cheap compared to
02:00:37.960 | most; we don't have 100,000 employees and all of that,
02:00:42.280 | but would we be able to raise money through donations?
02:00:45.800 | And so I remember the first time that we did,
02:00:50.200 | like really did a donation campaign
02:00:54.040 | was on a Christmas day in 2003, I think it was.
02:00:59.040 | There was, we had three servers,
02:01:04.720 | a database server and two front end servers,
02:01:06.520 | and they were all the same size
02:01:07.840 | or whatever, and two of them crashed.
02:01:10.960 | They broke, like, I don't even
02:01:12.800 | remember now, like the hard drive or something,
02:01:14.440 | and it's Christmas day.
02:01:16.880 | So I scrambled on Christmas day
02:01:18.480 | to sort of go onto the database server,
02:01:21.200 | which fortunately survived,
02:01:22.400 | and have it become a front end server as well.
02:01:24.800 | And then the site was really slow
02:01:26.800 | and it wasn't working very well.
02:01:28.200 | And I was like, okay, it's time,
02:01:29.360 | we need to do a fundraiser.
02:01:31.400 | And so I was hoping to raise $20,000 in a month's time,
02:01:37.160 | but we raised nearly 30,000 within two, three weeks time.
02:01:41.400 | So that was the first proof point of like,
02:01:43.160 | oh, like we put a banner up and people will donate.
02:01:46.280 | Like we just explained we need the money
02:01:47.800 | and people are like, already,
02:01:49.320 | we were very small back then,
02:01:50.520 | and people were like, oh yeah,
02:01:51.800 | like, I love this, I wanna contribute.
02:01:54.200 | Then over the years,
02:01:55.120 | we've become more sophisticated
02:01:57.600 | about the fundraising campaigns,
02:01:59.440 | and we've tested a lot of different messaging and so forth.
02:02:03.080 | What we used to think,
02:02:05.280 | I remember one year we really went heavy with,
02:02:08.320 | we have great ambitions to,
02:02:10.440 | the idea of Wikipedia is a free encyclopedia
02:02:14.200 | for every single person on the planet.
02:02:16.560 | So what about the languages of Sub-Saharan Africa?
02:02:20.760 | So I thought, okay, we're trying to raise money,
02:02:22.600 | we need to talk about that,
02:02:24.280 | 'cause it's really important and near and dear to my heart.
02:02:26.720 | And just instinctively,
02:02:28.560 | knowing nothing about charity fundraising,
02:02:30.600 | you see it all around, it's like,
02:02:31.680 | oh, charities always mention
02:02:34.400 | the poor people they're helping,
02:02:35.800 | so let's talk about that. It didn't really work as well.
02:02:38.880 | The pitch that, this is very vague and very sort of broad,
02:02:42.520 | but the pitch that works better than any other in general
02:02:45.760 | is a fairness pitch of,
02:02:49.480 | you use it all the time, you should probably chip in.
02:02:52.280 | And most people are like, yeah, you know what?
02:02:54.240 | My life would suck without Wikipedia,
02:02:56.000 | I use it constantly, and whatever, I should chip in.
02:02:59.360 | It just seems like the right thing to do.
02:03:02.400 | And there's many variants on that, obviously.
02:03:04.520 | And that's really, it works,
02:03:06.840 | and people are like, oh yeah, Wikipedia, I love Wikipedia,
02:03:09.600 | and I shouldn't.
02:03:11.520 | And so sometimes people say,
02:03:13.440 | why are you always begging for money on the website?
02:03:17.600 | And it's not that often, it's not that much,
02:03:20.400 | but it does happen.
02:03:22.280 | They're like, why don't you just get Google
02:03:24.320 | and Facebook and Microsoft, why don't they pay for it?
02:03:29.520 | And I'm like, I don't think that's really the right answer.
02:03:33.960 | - Influence starts to creep in.
02:03:35.480 | - Influence starts to creep in,
02:03:36.840 | and questions start to creep in.
02:03:38.720 | Like the best funding for Wikipedia is the small donors.
02:03:42.360 | We also have major donors, right?
02:03:43.920 | We have high net worth people who donate.
02:03:46.400 | But we always are very careful about that sort of thing,
02:03:49.320 | to say, wow, that's really great and really important,
02:03:52.640 | but we can't let that become influence,
02:03:56.200 | because that would just be really quite, yeah.
02:03:59.800 | Not good for Wikipedia.
02:04:01.000 | - I would love to know how many times I've visited Wikipedia
02:04:03.720 | and how much time I've spent on it,
02:04:05.640 | because I have a general sense
02:04:07.640 | that it's the most useful site I've ever used,
02:04:10.280 | competing maybe with Google search,
02:04:12.080 | which ultimately lands on Wikipedia.
02:04:15.640 | - Yeah, yeah, yeah.
02:04:16.600 | - But if I were just reminded of,
02:04:19.520 | hey, remember all those times your life was made better
02:04:22.160 | because of this site?
02:04:23.400 | I think I would be much more like, yeah.
02:04:25.840 | Why did I waste money on site XYZ,
02:04:29.960 | when I could be like, I should be giving a lot here?
02:04:33.080 | - Well, you know, the Guardian newspaper
02:04:36.040 | has a similar model, which is, they have ads,
02:04:38.600 | but they also, there's no paywall,
02:04:40.600 | but they just encourage people to donate.
02:04:42.840 | And they do that.
02:04:44.120 | Like I've sometimes seen a banner saying,
02:04:46.660 | oh, this is your 134th article you've read this year.
02:04:51.880 | Would you like to donate?
02:04:52.960 | And I think that's, I think it's effective.
02:04:54.600 | I mean, they're testing.
02:04:56.080 | But also I wonder, for some people,
02:04:58.240 | if they just don't feel like guilty and then think,
02:05:01.360 | well, I shouldn't bother them so much.
02:05:03.280 | I don't know.
02:05:04.360 | It's a good question.
02:05:05.200 | I don't know the answer.
02:05:06.360 | - I guess that's the thing I could also turn on,
02:05:08.000 | 'cause that would make me happy.
02:05:09.360 | I feel like legitimately there's some sites,
02:05:11.720 | and this speaks to our social media discussion,
02:05:14.960 | Wikipedia unquestionably makes me feel better about myself
02:05:19.720 | if I spend time on it.
02:05:20.840 | Like there's some websites where I'm like,
02:05:22.900 | if I spend time on Twitter, sometimes I'm like, I regret.
02:05:27.240 | There's, I think Elon talks about this,
02:05:29.080 | minimize the number of regretted minutes.
02:05:31.240 | - Yeah.
02:05:32.080 | - My number of regretted minutes on Wikipedia is like zero.
02:05:36.520 | Like I don't remember a time.
02:05:38.600 | I've just discovered this,
02:05:41.620 | I started following on Instagram, a page,
02:05:46.720 | Depths of Wikipedia.
02:05:46.720 | - Oh yeah.
02:05:47.560 | - There's like crazy Wikipedia pages.
02:05:49.320 | There's no Wikipedia page that--
02:05:51.160 | - Yeah, I gave her a media contributor of the year award
02:05:54.460 | this year, 'cause she's so great.
02:05:55.820 | - Yeah, she's amazing.
02:05:56.660 | - Depths of Wikipedia is so fun.
02:05:58.420 | - So I, yeah, so that's the kind of interesting point
02:06:02.180 | that I don't even know if there's a competitor.
02:06:05.660 | There may be this sort of programming Stack Overflow
02:06:08.420 | type of websites, but everything else,
02:06:10.260 | there's always a trade off.
02:06:11.660 | It's probably because of the ad driven model,
02:06:14.460 | because there's an incentive to pull you into clickbait,
02:06:17.780 | and Wikipedia has no clickbait.
02:06:19.420 | It's all about the quality of the knowledge
02:06:21.560 | and the wisdom and so on.
02:06:22.400 | - Yeah, that's right.
02:06:23.340 | And I also like Stack Overflow, although I wonder,
02:06:25.980 | I wonder what you think of this.
02:06:27.580 | So I only program for fun as a hobby,
02:06:31.700 | and I don't have enough time to do it, but I do.
02:06:34.100 | And I'm not very good at it, so therefore I end up
02:06:37.360 | on Stack Overflow quite a lot,
02:06:39.140 | trying to figure out what's gone wrong.
02:06:41.100 | And I have really transitioned to using
02:06:44.220 | ChatGPT much more for that,
02:06:47.540 | because I can often find the answer clearly explained,
02:06:51.280 | and it just, it works better than sifting through threads.
02:06:54.740 | And I kind of feel bad about that,
02:06:56.060 | because I do love Stack Overflow and their community.
02:06:58.500 | I mean, I'm assuming, I haven't read anything
02:07:00.380 | in the news about it.
02:07:01.220 | I'm assuming they are keenly aware of this,
02:07:03.620 | and they're thinking about how can we sort of use
02:07:07.040 | this chunk of knowledge that we've got here
02:07:10.300 | and provide a new type of interface
02:07:12.100 | where you can query it with a question
02:07:14.480 | and actually get an answer that's based on the answers
02:07:18.280 | that we've had, I don't know.
02:07:20.040 | - And I think stack overflow currently
02:07:22.960 | has policies against using GPT.
02:07:26.080 | Like there's a contentious kind of tension.
02:07:28.120 | - Of course, yeah, yeah, yeah.
02:07:28.960 | - But they're trying to figure that out.
02:07:31.280 | - And so we are similar in that regard.
02:07:33.520 | Like obviously all the things we've talked about,
02:07:35.820 | like ChatGPT makes stuff up,
02:07:37.160 | and it makes up references.
02:07:38.720 | So our community has already put into place
02:07:41.160 | some policies about it, but roughly speaking,
02:07:43.720 | there's always more nuance, but roughly speaking,
02:07:45.840 | it's sort of like you, the human, are responsible
02:07:49.380 | for what you put into Wikipedia.
02:07:51.280 | So if you use ChatGPT, you better check it.
02:07:54.920 | 'Cause there's a lot of great use cases of,
02:07:56.400 | you know, like, oh, well, I'm not a native speaker
02:08:00.200 | of German, but I kind of am pretty good.
02:08:02.960 | I'm not talking about myself,
02:08:03.800 | a hypothetical me that's pretty good.
02:08:05.640 | And I kind of just wanna run my edit through ChatGPT
02:08:10.240 | in German to go make sure my grammar's okay.
02:08:13.640 | That's actually cool.
02:08:14.680 | - Does it make you sad that people might use,
02:08:20.440 | increasingly use, ChatGPT for something
02:08:24.060 | where they would previously use Wikipedia?
02:08:25.640 | So basically use it to answer basic questions
02:08:29.640 | about the Eiffel Tower, and where the answer
02:08:33.820 | really comes at the source of it from Wikipedia,
02:08:36.880 | but they're using this as an interface.
02:08:38.720 | - Yeah, no, no, that's completely fine.
02:08:40.600 | I mean, part of it is our ethos has always been,
02:08:43.520 | here's our gift to the world, make something.
02:08:45.520 | So if the knowledge is more accessible to people,
02:08:49.140 | even if they're not coming through us, that's fine.
02:08:52.720 | Now, obviously we do have certain business model concerns,
02:08:55.640 | right, like if, and we've talked,
02:08:57.640 | where we've had more conversation about this,
02:08:59.340 | this whole GPT thing is new, things like,
02:09:02.880 | if you ask Alexa, you know, what is the Eiffel Tower,
02:09:08.440 | and she reads you the first two sentences from Wikipedia
02:09:11.000 | and doesn't say it's from Wikipedia,
02:09:13.160 | and they've recently started citing Wikipedia,
02:09:15.460 | then we worry like, oh, if people don't know
02:09:17.440 | they're getting the knowledge from us,
02:09:19.200 | are they gonna donate money, or are they just saying,
02:09:21.040 | oh, what's Wikipedia for, I can just ask Alexa.
02:09:23.400 | It's like, well, Alexa only knows anything
02:09:25.520 | 'cause she read Wikipedia.
02:09:26.960 | So we do think about that, but it doesn't bother me
02:09:29.320 | in the sense of like, oh, I want people
02:09:31.520 | to always come to Wikipedia first.
02:09:33.680 | But we're also, you know, had a great demo,
02:09:36.640 | like literally just hacked together over a weekend
02:09:38.960 | by our head of machine learning,
02:09:41.100 | where he did this little thing to say,
02:09:44.040 | you could ask any question,
02:09:47.080 | and he was just knocking it together,
02:09:48.520 | so he used OpenAI's API just to make a demo,
02:09:53.520 | ask a question, why do ducks fly south for winter?
02:09:58.360 | Just the kind of thing you think,
02:09:59.620 | oh, I might just Google for that,
02:10:03.080 | or I might start looking in Wikipedia, I don't know.
02:10:05.800 | And so what he does, he asks ChatGPT,
02:10:08.240 | what are some Wikipedia entries that might answer this?
02:10:10.880 | Then he grabbed those Wikipedia entries,
02:10:13.600 | said, here's some Wikipedia entries,
02:10:15.200 | answer this question based only on the information in this.
02:10:18.440 | And he had pretty good results,
02:10:19.760 | and it kind of prevented it from making stuff up.
02:10:21.840 | It's just he hacked together over the weekend,
02:10:23.880 | but what it made me think about was,
02:10:27.240 | oh, okay, so now we've got this huge body of knowledge
02:10:30.720 | that in many cases, you're like, oh, I'm really,
02:10:34.920 | I wanna know about Queen Victoria,
02:10:37.000 | I'm just gonna go read the Wikipedia entry,
02:10:38.520 | and it's gonna take me through her life and so forth.
02:10:43.520 | But other times you've got a specific question,
02:10:45.640 | and maybe we could have a better search experience
02:10:48.280 | where you can come to Wikipedia,
02:10:50.280 | ask your specific question,
02:10:51.760 | get your specific answer that's from Wikipedia,
02:10:54.200 | including links to the articles you might wanna read next.
02:10:57.800 | And that's just a step forward,
02:10:59.040 | like that's just using a new type of technology
02:11:01.560 | to make the extraction of information from this body of text
02:11:05.880 | into my brain faster and easier.
02:11:08.760 | So I think that's kind of cool.
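A minimal sketch of the grounding pattern described here: ask the model which Wikipedia entries are relevant, fetch them, then answer only from that text. It assumes the `openai` Python SDK and the third-party `wikipedia` package; the model name, prompts, and helper function are illustrative assumptions, not details of the actual weekend demo.

```python
# Sketch of "answer only from Wikipedia" grounding (assumptions: openai SDK >= 1.0,
# the third-party `wikipedia` package, an illustrative model name).
import wikipedia
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def grounded_answer(question: str, n_articles: int = 3) -> str:
    # Step 1: ask the model which Wikipedia entries might answer the question.
    titles_resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative
        messages=[{"role": "user",
                   "content": f"List {n_articles} Wikipedia article titles, one per line, "
                              f"that would help answer: {question}"}],
    )
    titles = [t.strip("-• ").strip()
              for t in titles_resp.choices[0].message.content.splitlines()
              if t.strip()]

    # Step 2: fetch those entries (truncated here to stay within the context window).
    sources = []
    for title in titles[:n_articles]:
        try:
            sources.append(wikipedia.page(title, auto_suggest=False).content[:4000])
        except Exception:
            continue  # skip pages that can't be fetched (missing, disambiguation, etc.)

    # Step 3: answer using ONLY the fetched text, which limits made-up facts.
    context = "\n\n---\n\n".join(sources)
    answer_resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "system",
                   "content": "Answer only from the provided Wikipedia excerpts. "
                              "If they don't contain the answer, say so."},
                  {"role": "user",
                   "content": f"Excerpts:\n{context}\n\nQuestion: {question}"}],
    )
    return answer_resp.choices[0].message.content

# Example: print(grounded_answer("Why do ducks fly south for the winter?"))
```

Restricting the second call to the fetched excerpts is what keeps the model from inventing facts, and returning the article titles alongside the answer would give the "links to the articles you might wanna read next" experience described above.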
02:11:10.720 | - I would love to see a ChatGPT grounding
02:11:15.140 | into websites like Wikipedia,
02:11:17.440 | and the other comparable website to me
02:11:19.880 | will be like Wolfram Alpha
02:11:21.920 | for more mathematical knowledge, that kind of stuff.
02:11:24.480 | So grounding, like taking you to a page
02:11:27.320 | that is really crafted, as opposed to,
02:11:30.160 | like the moment you start actually taking you
02:11:32.320 | to like journalist websites, like news websites,
02:11:36.640 | starts getting a little iffy.
02:11:38.080 | - Yeah, yeah, yeah. - It's getting a little--
02:11:39.240 | - Yeah, yeah, yeah.
02:11:40.520 | - 'Cause they have, you're now in a land
02:11:42.600 | that has a wrong incentive.
02:11:44.240 | - Right, yeah. - You pulled in--
02:11:45.720 | - And you need somebody to have filtered through that
02:11:48.080 | and sort of tried to knock off the rough edges, yeah.
02:11:50.640 | No, it's very, I think that's exactly right.
02:11:53.480 | And I think, you know, I think that kind of grounding
02:12:00.120 | is, I think they're working really hard on it.
02:12:02.480 | I think that's really important.
02:12:03.640 | And that actually, when I, so if you ask me to step back
02:12:06.800 | and be like very business-like about our business model
02:12:09.680 | and where's it gonna go for us,
02:12:11.560 | and are we gonna lose half our donations
02:12:13.760 | 'cause everybody's just gonna stop coming to Wikipedia
02:12:15.400 | and go to ChatGPT, I think grounding will help a lot
02:12:18.800 | because frankly, most questions people have,
02:12:22.160 | if they provide proper links,
02:12:23.840 | we're gonna be at the top of that
02:12:24.840 | just like we are in Google.
02:12:26.320 | So we're still gonna get tons of recognition
02:12:28.080 | and tons of traffic just from,
02:12:30.240 | even if it's just the moral properness of saying,
02:12:35.000 | here's my source.
02:12:36.800 | So I think we're gonna be all right in that.
02:12:39.160 | - Yeah, and the close partnership of,
02:12:40.640 | if the model is fine-tuned, is constantly retrained,
02:12:43.760 | then Wikipedia is one of the primary places
02:12:45.640 | where if you want to change what the model knows,
02:12:48.600 | one of the things you should do is contribute to Wikipedia
02:12:52.000 | or clarify Wikipedia. - Yeah, yeah, yeah.
02:12:54.120 | - Or elaborate, expand, all that kind of stuff.
02:12:57.240 | - You mentioned all of us have controversies,
02:12:58.880 | I have to ask.
02:12:59.880 | Do you find the controversy
02:13:02.320 | of whether you are the sole founder
02:13:04.080 | or the co-founder of Wikipedia ironic,
02:13:07.960 | absurd, interesting, important?
02:13:10.540 | What are your comments?
02:13:13.080 | - I would say unimportant, not that interesting.
02:13:16.560 | I mean, one of the things that people
02:13:19.840 | are sometimes surprised to hear me say
02:13:21.520 | is I actually think Larry Sanger doesn't get enough credit
02:13:25.120 | for his early work in Wikipedia
02:13:27.120 | even though I think co-founder's
02:13:28.280 | not the right title for that.
02:13:30.240 | So, you know, like he had a lot of impact
02:13:33.480 | and a lot of great work,
02:13:35.120 | and I disagree with him about a lot of things since
02:13:37.080 | and all that, and that's fine.
02:13:38.600 | So yeah, no, to me, that's like,
02:13:41.200 | it's one of these things that the media love
02:13:44.640 | a falling out story,
02:13:46.720 | so they wanna make a big deal out of it,
02:13:48.440 | and I'm just like, yeah, no.
02:13:51.040 | - So there's a lot of interesting engineering contributions
02:13:53.520 | in the early days, like you were saying,
02:13:54.920 | there's debates about how to structure it,
02:13:57.000 | what the heck is this thing that we're doing,
02:13:59.360 | and there's important people that contributed to that.
02:14:02.200 | - Yeah, definitely.
02:14:03.440 | - So he also, you said you had some disagreements,
02:14:06.440 | Larry Sanger said that nobody should trust Wikipedia,
02:14:09.880 | and that Wikipedia seems to assume
02:14:11.600 | that there's only one legitimate defensible version
02:14:13.800 | of the truth on any controversial question.
02:14:16.480 | That's not how Wikipedia used to be.
02:14:18.800 | I presume you disagree with that analysis.
02:14:21.160 | - I mean, just straight up, I disagree.
02:14:22.800 | Like, go and read any Wikipedia entry
02:14:24.840 | on a controversial topic,
02:14:26.000 | and what you'll see is a really diligent effort
02:14:28.920 | to explain all the relevant sides,
02:14:30.720 | so yeah, just disagree.
02:14:32.280 | - So on controversial questions,
02:14:33.920 | you think perspectives are generally represented?
02:14:36.240 | I mean, it has to do with the kind of the tension
02:14:39.240 | between the mainstream and the non-mainstream
02:14:42.400 | that we were talking about.
02:14:43.520 | - Yeah, no, I mean, for sure.
02:14:45.240 | Like, to take this area of discussion seriously
02:14:51.120 | is to say, yeah, you know what,
02:14:54.120 | actually, that is a big part
02:14:55.800 | of what Wikipedians spend their time grappling with,
02:14:58.800 | is to say, you know, how do we figure out
02:15:03.640 | whether a less popular view is pseudoscience?
02:15:09.980 | Is it just a less popular view
02:15:13.560 | that's gaining acceptance in the mainstream?
02:15:16.080 | Is it fringe versus crackpot, et cetera, et cetera?
02:15:19.640 | And that debate is what you've gotta do.
02:15:22.220 | There's no choice about having that debate,
02:15:24.160 | of grappling with something, and I think we do,
02:15:27.800 | and I think that's really important,
02:15:28.960 | and I think if anybody said to the Wikipedia community,
02:15:33.520 | gee, you should stop, you know,
02:15:35.040 | sort of covering minority viewpoints on this issue,
02:15:39.280 | I think they would say,
02:15:40.400 | I don't even understand why you would say that.
02:15:42.360 | Like, we have to sort of grapple with minority viewpoints
02:15:45.400 | in science and politics and so on,
02:15:49.580 | and this is one of the reasons why, you know,
02:15:52.780 | there is no magic simple answer to all these things.
02:15:56.340 | It's really contextual.
02:16:00.860 | It's case by case.
02:16:01.780 | It's like, you know, you've gotta really say,
02:16:03.460 | okay, what is the context here?
02:16:05.260 | How do you do it?
02:16:06.100 | And you've always gotta be open to correction
02:16:08.300 | and to change and to sort of challenge
02:16:10.860 | and always be sort of serious about that.
02:16:13.100 | - I think what happens, again, with social media
02:16:15.140 | is when there is that grappling process in Wikipedia
02:16:19.080 | and a decision is made to remove a paragraph
02:16:21.680 | or to remove a thing or to say a thing,
02:16:24.320 | you're gonna notice the one direction
02:16:27.640 | of the oscillation of the grappling and not the correction,
02:16:30.840 | and you've gotta highlight that and say,
02:16:32.200 | how come this person, I don't know,
02:16:36.520 | maybe legitimacy of elections,
02:16:38.760 | that's the thing that comes up.
02:16:40.160 | Donald Trump, maybe previous stuff.
02:16:42.640 | - I can give a really good example,
02:16:44.120 | which is there was this sort of dust up
02:16:47.720 | about the definition of recession in Wikipedia.
02:16:52.720 | So the accusation was,
02:16:55.200 | and the accusation was often quite ridiculous and extreme,
02:16:58.240 | which is under pressure from the Biden administration,
02:17:01.440 | Wikipedia changed the definition of recession
02:17:04.160 | to make Biden look good.
02:17:05.720 | Or we did it not under pressure,
02:17:07.360 | but because we're a bunch of lunatic leftists and so on.
02:17:11.400 | And then, you know, when I see something like that
02:17:13.400 | in the press, I'm like, oh dear, what's happened here?
02:17:15.560 | How did we do that?
02:17:16.460 | 'Cause I always just accept things for five seconds first.
02:17:19.480 | And then I go and I look and I'm like, you know what?
02:17:20.880 | That's literally completely not what happened.
02:17:23.880 | What happened was one editor
02:17:26.880 | thought the article needed restructuring.
02:17:28.680 | So the article has always said,
02:17:29.620 | so the traditional kind of loose definition of recession
02:17:33.040 | is two quarters of negative growth.
02:17:35.900 | But there's always been, within economics,
02:17:38.600 | within important agencies in different countries
02:17:41.160 | around the world, a lot of nuance around that.
02:17:43.280 | And there's other like factors that go into it and so forth.
02:17:45.880 | And it's just an interesting, complicated topic.
02:17:49.040 | And so the article has always had
02:17:51.120 | the definition of two quarters.
02:17:53.120 | And the only thing that really changed
02:17:54.600 | was moving that from the lead,
02:17:56.960 | from the top paragraph to further down.
02:17:59.720 | And then news stories appeared saying,
02:18:02.400 | Wikipedia has changed the definition of recession.
02:18:04.840 | And then we got a huge rush of trolls coming in.
02:18:07.040 | So the article was temporarily protected,
02:18:08.840 | I think only semi-protected,
02:18:10.720 | and people were told go to the talk page to discuss.
02:18:13.120 | - So it was a dust up that was,
02:18:15.400 | when you look at it as a Wikipedia,
02:18:16.720 | and you're like, oh,
02:18:17.560 | like this is a really routine kind of editorial debate.
02:18:21.520 | Another example, which unfortunately,
02:18:23.280 | our friend Elon fell for, I would say,
02:18:25.880 | is the Twitter files.
02:18:28.160 | So there was an article called "The Twitter Files,"
02:18:30.760 | which is about these files that were released
02:18:32.520 | once Elon took control of Twitter
02:18:34.440 | and he released internal documents.
02:18:36.560 | And what happened was somebody nominated it for deletion.
02:18:42.200 | But even the nomination said,
02:18:44.400 | this is actually,
02:18:46.040 | this is mainly about the Hunter Biden laptop controversy.
02:18:49.440 | Shouldn't this information be there instead?
02:18:52.040 | So anyone can, like,
02:18:53.440 | it takes exactly one human being anywhere on the planet
02:18:55.880 | to propose something for deletion.
02:18:57.800 | And that triggers a process where people discuss it,
02:19:00.200 | which within a few hours,
02:19:01.880 | it was what we call snowball closed,
02:19:03.840 | i.e. this doesn't have a snowball's chance
02:19:06.120 | in hell of passing.
02:19:08.240 | So an admin goes, yeah, wrong.
02:19:11.200 | And closed the debate and that was it.
02:19:12.880 | That was the whole thing that happened.
02:19:15.600 | And so nobody proposed suppressing the information.
02:19:18.400 | Nobody proposed it wasn't important.
02:19:19.920 | It was just like editorially boring internal questions.
02:19:23.960 | And, you know, so sometimes people read stuff like that
02:19:26.680 | and they're like, oh, you see?
02:19:28.280 | Look at these leftists.
02:19:29.200 | They're trying to suppress the truth again.
02:19:31.120 | It's like, well, slow down a second and come and look.
02:19:33.000 | Like, literally, it's not what happened.
02:19:35.000 | - Yeah, so I think the right is more sensitive to censorship.
02:19:40.880 | And so they will more likely highlight,
02:19:44.960 | there's more virality to highlighting something
02:19:49.720 | that looks like censorship in any walks of life.
02:19:52.240 | And this moving a paragraph from one place to another
02:19:54.640 | or removing it and so on as part of the regular grappling
02:19:57.520 | of Wikipedia can make a hell of a good article
02:20:00.680 | or YouTube video.
02:20:01.520 | - Yeah, yeah, yeah.
02:20:02.360 | No, it sounds really enticing and intriguing
02:20:05.600 | and surprising to most people because they're like,
02:20:08.040 | oh, no, I'm reading Wikipedia.
02:20:10.080 | It doesn't seem like a crackpot leftist website.
02:20:12.560 | It seems pretty kind of dull, really, in its own geeky way.
02:20:16.880 | - Well, that's how I--
02:20:17.720 | - So that makes a good story.
02:20:18.680 | It's like, oh, am I being misled
02:20:20.240 | because there's a shadowy cabal of Jimmy Wales?
02:20:24.760 | - You know, I generally, I read political stuff.
02:20:27.120 | I mentioned to you that I'm traveling
02:20:29.840 | to have some very difficult conversation
02:20:33.760 | with high profile figures, both in the war in Ukraine
02:20:36.560 | and in Israel and Palestine.
02:20:39.000 | And I read the Wikipedia articles around that.
02:20:43.080 | And I also read books on the conflict
02:20:46.400 | and the history of the different regions.
02:20:48.480 | And I find the Wikipedia articles to be very balanced
02:20:51.680 | and there's many perspectives being represented.
02:20:53.960 | But then I ask myself, well,
02:20:55.180 | am I one of them leftist crackpots?
02:20:58.400 | They can't see the truth.
02:20:59.640 | I mean, it's something I ask myself all the time.
02:21:02.080 | Forget the leftists, just crackpot.
02:21:03.960 | Am I just being a sheep in accepting it?
02:21:08.960 | - I think that's an important question to always ask,
02:21:10.720 | but not too much.
02:21:12.120 | - Yeah, no, I agree completely.
02:21:13.440 | - A little bit, but not too much.
02:21:15.080 | - No, I think we always have to challenge ourselves
02:21:17.400 | of like, what do I potentially have wrong?
02:21:20.640 | - Well, you mentioned pressure from government.
02:21:23.040 | You've criticized Twitter for allowing,
02:21:28.680 | giving in to Turkey's government censorship.
02:21:33.100 | There's also conspiracy theories or accusations
02:21:36.340 | of Wikipedia being open to pressure
02:21:41.040 | from government or government organizations,
02:21:43.080 | FBI and all this kind of stuff.
02:21:45.560 | What is the philosophy about pressure
02:21:48.640 | from government and censorship?
02:21:50.320 | - So we're super hardcore on this.
02:21:52.920 | We've never bowed down to government pressure
02:21:56.080 | anywhere in the world and we never will.
02:21:58.080 | And we understand that we're hardcore.
02:22:02.080 | And actually there is a bit of nuance
02:22:03.680 | about how different companies respond to this,
02:22:06.800 | but our response has always been just to say no.
02:22:10.320 | And if they threaten to block, well, knock yourself out,
02:22:13.000 | you're gonna lose Wikipedia.
02:22:14.400 | And that's been very successful for us as a strategy
02:22:17.160 | because governments know they can't just casually threaten
02:22:22.120 | to block Wikipedia or block us for two days
02:22:24.280 | and we're gonna cave in immediately
02:22:25.600 | to get back into the market.
02:22:27.680 | And that's what a lot of companies have done.
02:22:29.440 | And I don't think that's good.
02:22:31.400 | We can go one level deeper and say,
02:22:34.280 | I'm actually quite sympathetic.
02:22:35.720 | If you have staff members in a certain country
02:22:39.320 | and they are at physical risk,
02:22:41.320 | you've gotta put that into your equation.
02:22:43.180 | So I understand that.
02:22:44.180 | Like if Elon said, actually I've got 100 staff members
02:22:47.760 | on the ground in such and such a country,
02:22:50.200 | and if we don't comply, somebody's gonna get arrested
02:22:53.440 | and it could be quite serious.
02:22:54.940 | Okay, that's a tough one, right?
02:22:57.120 | That's actually really hard.
02:23:00.920 | But yeah, no, and then the FBI one, no, no.
02:23:04.400 | We, like the criticism I saw,
02:23:06.640 | I kind of prepared for this 'cause I saw people responding
02:23:09.240 | to your requests for questions and I was like,
02:23:12.340 | somebody's like, oh, well, don't you think it was really bad
02:23:14.580 | that you da, da, da, da, da?
02:23:15.920 | And I said, I actually reached out to staff saying,
02:23:17.800 | can you just make sure I've got my facts right?
02:23:20.240 | And the answer is we received zero requests of any kind
02:23:25.240 | from the FBI or any of the other government agencies
02:23:28.400 | for any changes to content in Wikipedia.
02:23:31.280 | And had we received those requests
02:23:33.320 | at the level of the Wikimedia Foundation,
02:23:35.040 | we would have said, it's not our,
02:23:37.320 | like we can't do anything
02:23:38.360 | because Wikipedia is written by the community.
02:23:40.800 | And so the Wikimedia Foundation
02:23:42.840 | can't change the content of Wikipedia without causing,
02:23:45.520 | I mean, God, that would be a massive controversy
02:23:47.360 | you can't even imagine.
02:23:48.560 | What we did do, and this is what I've done,
02:23:51.720 | I've been to China and met with the Minister of Propaganda.
02:23:56.440 | We've had discussions with governments all around the world,
02:24:00.000 | not because we wanna do their bidding,
02:24:02.480 | but because we don't wanna do their bidding,
02:24:04.040 | but we also don't wanna be blocked.
02:24:05.400 | And we think actually having these conversations
02:24:07.400 | are really important.
02:24:08.240 | Now, there's no threat of being blocked in the US,
02:24:10.240 | like that's just never gonna happen.
02:24:11.340 | There is the First Amendment.
02:24:12.840 | But in other countries around the world, it's like, okay,
02:24:15.680 | what are you upset about?
02:24:16.760 | Let's have the conversation, like let's understand,
02:24:19.400 | and let's have a dialogue about it
02:24:21.320 | so that you can understand where we come from
02:24:24.120 | and what we're doing and why.
02:24:26.440 | And then sometimes it's like, gee,
02:24:30.560 | if somebody complains that something's bad in Wikipedia,
02:24:34.280 | whoever they are, don't care who they are.
02:24:36.800 | Could be you, could be the government, could be the Pope,
02:24:40.620 | I don't care who they are.
02:24:41.840 | It's like, oh, okay, well, our responsibility as Wikipedia
02:24:43.920 | is to go, oh, hold on, let's check, right?
02:24:46.240 | Is that right or wrong?
02:24:47.120 | Is there something that we've got wrong in Wikipedia?
02:24:49.460 | Not because you're threatening to block us,
02:24:50.920 | but because we want Wikipedia to be correct.
02:24:53.760 | So we do have these dialogues with people.
02:24:55.920 | And a big part of what was going on with,
02:25:00.920 | you might call it pressure on social media companies
02:25:06.280 | or dialogue with, depending on, as we talked earlier,
02:25:09.280 | grapple with the language, depending on what your view is.
02:25:12.180 | In our case, it was really just about, oh, okay, right,
02:25:17.200 | they wanna have a dialogue about COVID information,
02:25:21.040 | misinformation.
02:25:22.720 | We're this enormous source of information
02:25:25.400 | which the world depends on.
02:25:27.600 | We're gonna have that conversation, right?
02:25:29.440 | We're happy to say, here's, if they say,
02:25:32.280 | how do you know that Wikipedia is not gonna be pushing
02:25:36.360 | some crazy anti-vax narrative?
02:25:38.740 | First, I mean, I think it's somewhat inappropriate
02:25:42.320 | for a government to be asking pointed questions
02:25:45.340 | in a way that implies possible penalties.
02:25:48.340 | I'm not sure that ever happened because we would just go,
02:25:51.200 | I don't know, the Chinese blocked us and so it goes, right?
02:25:56.200 | We're not gonna cave in to any kind of government pressure,
02:25:58.680 | but whatever the appropriateness of what they were doing,
02:26:03.480 | I think there is a role for government in just saying,
02:26:06.320 | let's understand the information ecosystem.
02:26:09.040 | Let's think about the problem of misinformation,
02:26:11.400 | disinformation in society,
02:26:13.240 | particularly around election security,
02:26:15.540 | all these kinds of things.
02:26:17.920 | So, I think it would be irresponsible of us
02:26:21.720 | to get a call from a government agency and say,
02:26:25.560 | yeah, why don't you just fuck off, you're the government.
02:26:28.800 | But it would also be irresponsible to go,
02:26:30.360 | oh dear, government agent's not happy,
02:26:32.540 | let's fix Wikipedia so the FBI loves us.
02:26:35.560 | - When you say you wanna have discussions
02:26:37.680 | with the Chinese government or with organizations
02:26:40.520 | like CDC and WHO, it's to thoroughly understand
02:26:44.040 | what the mainstream narrative is
02:26:46.000 | so that it can be properly represented
02:26:48.080 | but not drive what the articles are.
02:26:50.760 | - Well, it's actually important to say,
02:26:53.520 | like whatever the Wikimedia Foundation thinks
02:26:57.040 | has no impact on what's in Wikipedia.
02:27:00.040 | So, it's more about saying to them,
02:27:03.360 | right, we understand you're the World Health Organization
02:27:06.400 | or you're whoever and part of your job
02:27:08.640 | is to sort of public health, it's about communications.
02:27:13.120 | You wanna understand the world.
02:27:14.600 | So, it's more about, oh, well,
02:27:16.120 | let's explain how Wikipedia works.
02:27:17.960 | - So, it's more about explaining how Wikipedia works
02:27:19.840 | and like, hey, it's the volunteers.
02:27:22.320 | - Yeah, yeah, exactly.
02:27:23.560 | - It's a battle of ideas and here's how the sources are used.
02:27:28.560 | - Yeah, exactly.
02:27:30.320 | - What are legitimate sources
02:27:31.160 | and what are not legitimate sources.
02:27:32.480 | - Yeah, exactly.
02:27:33.320 | - I mean, I suppose there's some battle
02:27:35.000 | about what is a legitimate source.
02:27:36.920 | There could be statements made that CDC,
02:37:39.680 | I mean, like government organizations
02:27:44.680 | in general have sold themselves to be the place
02:27:48.640 | where you go for expertise.
02:27:50.840 | And some of that has been to a small degree
02:27:55.160 | raised in question over the response to the pandemic.
02:27:57.680 | - Well, I think in many cases,
02:28:00.360 | and this goes back to my topic of trust.
02:28:03.920 | So, there were definitely cases of public officials,
02:28:08.560 | public organizations, where I felt like
02:28:12.400 | they lost the trust of the public
02:28:14.400 | because they didn't trust the public.
02:28:16.920 | And so, the idea is like, we really need people
02:28:19.960 | to take this seriously and take actions.
02:28:21.920 | Therefore, we're gonna put out some overblown claims
02:28:26.600 | because it's gonna scare people into behaving correctly.
02:28:29.400 | You know what, that might work for a little while,
02:28:32.600 | but it doesn't work in the long run
02:28:34.120 | because suddenly people go from a default stance
02:28:37.080 | of like the Center for Disease Control,
02:28:40.400 | very well-respected scientific organization,
02:28:43.480 | sort of, I don't know, they've got a vault in Atlanta
02:28:47.240 | with the last vial of smallpox or whatever it is
02:28:49.680 | that people think about them, and to go,
02:28:52.440 | oh, right, these are scientists
02:28:53.600 | we should actually take seriously and listen to,
02:28:55.520 | and they're not politicized.
02:28:57.640 | And it's like, okay, and if you put out statements,
02:29:01.240 | I don't know if the CDC did,
02:29:02.360 | but health organization, whoever,
02:29:04.680 | that are provably false, and also provably,
02:29:08.760 | you kind of knew they were false,
02:29:09.920 | but you did it to scare people
02:29:11.120 | 'cause you wanted them to do the right thing.
02:29:13.280 | It's like, no, you know what,
02:29:14.640 | that's not gonna work in the long run.
02:29:16.320 | Like, you're gonna lose people,
02:29:17.560 | and now you've got a bigger problem,
02:29:19.400 | which is a lack of trust in science,
02:29:21.320 | a lack of trust in authorities, who are, you know,
02:29:24.640 | by and large, they're like quite boring
02:29:27.360 | government bureaucrat scientists
02:29:28.920 | who just are trying to help the world.
02:29:31.280 | - Well, I've been criticized.
02:29:34.560 | And I've been torn on this.
02:29:36.400 | I've been criticized for criticizing Anthony Fauci too hard.
02:29:39.960 | The degree to which I criticized him
02:29:42.560 | is because he's a leader,
02:29:44.840 | and I'm just observing the effect
02:29:48.720 | in the loss of trust in institutions like the NIH,
02:29:53.080 | that where I personally know
02:29:54.240 | there's a lot of incredible scientists
02:29:55.840 | doing incredible work.
02:29:57.400 | And I have to blame the leaders
02:29:58.760 | for the effects on the distrust
02:30:00.440 | in the scientific work that they're doing
02:30:02.920 | because of what I perceive as basic human flaws
02:30:07.920 | of communication, of arrogance, of ego, of politics,
02:30:12.320 | all those kinds of things.
02:30:13.240 | Now, you could say you're being too harsh, possible,
02:30:16.120 | but I think that's the whole point of free speech
02:30:18.400 | is you can criticize the people who lead.
02:30:21.560 | Leaders, unfortunately or fortunately,
02:30:24.240 | are responsible for the effects on society.
02:30:28.240 | To me, Anthony Fauci,
02:30:29.880 | or whoever in the scientific position around the pandemic,
02:30:33.080 | had an opportunity to have an FDR moment
02:30:37.480 | or to get everybody together
02:30:39.240 | and inspire about the power of science
02:30:42.200 | to rapidly develop a vaccine
02:30:44.280 | that saves us from this pandemic and future pandemic
02:30:46.960 | that can threaten the wellbeing of human civilization.
02:30:49.880 | This was epic and awesome and sexy.
02:30:53.000 | And to me, when I talk to people about science,
02:30:56.000 | it's anything but sexy
02:30:57.200 | in terms of the virology and biology development
02:30:59.840 | because it's been politicized, it's icky,
02:31:03.440 | and people just don't wanna,
02:31:04.880 | don't talk to me about the vaccine.
02:31:06.560 | I understand, I understand.
02:31:07.720 | I got vaccinated.
02:31:08.880 | There's just, let's switch topics.
02:31:10.720 | - Right, yeah, yeah, yeah. - Quick.
02:31:12.400 | - Yeah.
02:31:13.240 | Well, it's interesting 'cause as I said,
02:31:14.920 | I live in the UK and I think it's,
02:31:17.280 | all these things are a little less politicized there.
02:31:20.040 | And I haven't paid close enough attention to Fauci
02:31:24.880 | to have a really strong view.
02:31:27.120 | I'm sure I would disagree with some things.
02:31:29.000 | I definitely, you know,
02:31:30.160 | I remember hearing at the beginning of the pandemic
02:31:34.280 | as I'm unwrapping my Amazon package with the masks I bought
02:31:38.280 | 'cause I heard there's a pandemic
02:31:39.480 | and I just was like, I want some N95 mask, please.
02:31:42.720 | And they were saying, don't buy masks.
02:31:45.920 | And the motivation was
02:31:47.400 | because they didn't want there to be shortages
02:31:48.680 | in hospitals, fine.
02:31:53.840 | But there were also statements that masks won't,
02:31:53.840 | they're not effective and they won't help you.
02:31:55.160 | And then the complete about face to,
02:31:58.200 | you're ridiculous if you're not wearing them.
02:32:00.440 | You know, it's just like, no,
02:32:01.680 | like that about face just lost people from day one.
02:32:06.200 | - The distrust in the intelligence of the public
02:32:09.160 | to deal with nuance, to deal with the uncertainty.
02:32:11.480 | - Yeah, this is exactly what, you know,
02:32:13.600 | I think this is where the Wikipedia neutral point of view
02:32:17.560 | is and should be an ideal.
02:32:20.520 | And obviously every article and everything we could,
02:32:22.880 | you know me now and you know how I am about these things,
02:32:25.480 | but like ideally is to say, look,
02:32:27.920 | we're happy to show you all the perspectives.
02:32:29.800 | This is Planned Parenthood's view
02:32:32.000 | and this is Catholic Church view.
02:32:33.880 | And we're gonna explain that
02:32:35.040 | and we're gonna try to be thoughtful
02:32:36.240 | and put in the best arguments from all sides
02:32:40.840 | 'cause I trust you.
02:32:41.760 | Like you read that and you're gonna be more educated
02:32:44.880 | and you're gonna begin to make a decision.
02:32:46.560 | I mean, I can just talk in the UK,
02:32:49.080 | the government, da, da, da, da.
02:32:51.960 | When we found out in the UK
02:32:53.680 | that very high level government officials
02:32:56.440 | were not following the rules they had put on everyone else,
02:33:00.160 | I moved from, I had just become a UK citizen
02:33:03.160 | just a little while before the pandemic.
02:33:05.360 | And you know, it's kind of emotional.
02:33:06.880 | Like you get a passport in a new country
02:33:08.400 | and you feel quite good.
02:33:09.280 | And I did my oath to the queen
02:33:10.960 | and then they dragged the poor old lady out
02:33:13.120 | to tell us all to be good.
02:33:14.280 | And I was like, we're British
02:33:15.800 | and we're gonna do the right things.
02:33:17.160 | And you know, it's gonna be tough, but we're gonna, you know.
02:33:20.200 | So you have that kind of Dunkirk spirit moment
02:33:22.640 | and you're like following the rules to a T
02:33:25.960 | and then suddenly it's like,
02:33:26.920 | well, they're not following the rules.
02:33:28.720 | And so suddenly I shifted personally from,
02:33:31.880 | I'm gonna follow the rules
02:33:32.760 | even if I don't completely agree with them,
02:33:35.400 | I'll still follow 'cause I think
02:33:36.600 | we've all gotta chip in together, to like, you know what?
02:33:38.720 | I'm gonna make wise and thoughtful decisions
02:33:40.920 | for myself and my family.
02:33:43.000 | And that generally is gonna mean following the rules,
02:33:45.160 | but it's basically, you know, when they're,
02:33:48.520 | you know, at certain moments in time,
02:33:50.360 | like you're not allowed to be in an outside space
02:33:52.600 | unless you're exercising.
02:33:54.440 | I'm like, I think I can sit in a park and read a book.
02:33:58.240 | Like, it's gonna be fine.
02:33:59.600 | Like that's irrational rule,
02:34:01.280 | which I would have been following just personally
02:34:03.640 | of like, I'm just gonna do the right thing, yeah.
02:34:06.320 | - And the loss of trust, I think at scale
02:34:09.080 | was probably harmful to science.
02:34:10.760 | And to me, the scientific method
02:34:13.720 | and the scientific communities
02:34:15.360 | is one of the biggest hopes, at least to me,
02:34:19.080 | for the survival and the thriving of human civilization.
02:34:22.560 | - Absolutely.
02:34:23.400 | And I, you know, I think you see
02:34:25.920 | some of the ramifications of this.
02:34:28.160 | There's always been like pretty anti-science,
02:34:31.640 | anti-vax people, okay?
02:34:33.720 | That's always been a thing,
02:34:34.600 | but I feel like it's bigger now
02:34:36.160 | simply because of that lowering of trust.
02:34:41.120 | So a lot of people, yeah, maybe it's like you say,
02:34:43.880 | a lot of people are like, yeah, I got vaccinated
02:34:45.680 | and I really don't wanna talk about this
02:34:47.000 | because it's so toxic, you know?
02:34:49.320 | And that's unfortunate 'cause I think people should say,
02:34:52.240 | what an amazing thing.
02:34:53.920 | And you know, there's also a whole range of discourse
02:34:58.880 | around if this were a disease that were primarily,
02:35:03.880 | that was primarily killing babies,
02:35:07.560 | I think people's emotions about it
02:35:09.080 | would have been very different, right or wrong,
02:35:11.920 | than the fact that when you really looked at
02:35:13.320 | the sort of death rate of getting COVID,
02:35:18.320 | wow, it's really dramatically different.
02:35:21.760 | If you're late in life, this was really dangerous.
02:35:26.760 | And if you're 23 years old, yeah, well, it's not great.
02:35:31.400 | Like, and long COVID's a thing and all of that.
02:35:33.360 | But, and I think some of the public communications,
02:35:36.920 | again, were failing to properly contextualize it.
02:35:41.320 | Not all of it, you know, it's a complicated matter,
02:35:43.760 | but yeah.
02:35:45.440 | - Let me read you a Reddit comment that received two likes.
02:35:48.920 | - Oh.
02:35:49.760 | (Lex laughing)
02:35:50.920 | Two whole people liked it.
02:35:52.320 | - Yeah, two people liked it.
02:35:53.720 | And I don't know, maybe you can comment on
02:35:57.320 | whether there's truth to it,
02:35:58.280 | but I just found it interesting
02:35:59.720 | because I've been doing a lot of research
02:36:01.040 | on World War II recently.
02:36:02.440 | So this is about Hitler.
02:36:04.160 | - Oh, okay.
02:36:05.000 | - Here's, it's a long statement.
02:36:07.400 | I was there when a big push was made
02:36:09.360 | to fight bias at Wikipedia.
02:36:11.280 | Our target became getting the Hitler article
02:36:13.560 | to be Wiki's featured article.
02:36:15.680 | The idea was that the voting body only wanted articles
02:36:18.440 | that were good PR and especially articles
02:36:20.720 | about socially liberal topics.
02:36:23.000 | So the Hitler article had to be two to three times better
02:36:26.880 | and more academically researched to beat the competition.
02:36:29.680 | This bias seems to hold today.
02:36:31.440 | For example, the current list of political featured articles
02:36:34.560 | at a glance seems to have only two books,
02:36:37.200 | one on anarchism and one on Karl Marx.
02:36:40.680 | Surely we're not going to say
02:36:43.280 | there have only ever been two articles
02:36:45.120 | about political non-biography books worth being featured,
02:36:48.760 | especially compared to 200 plus video games.
02:36:51.760 | Or that the only topics with good books
02:36:54.960 | are socialism and anarchy.
02:36:56.520 | Do you have any interesting comments on this kind of,
02:36:59.880 | so featured, how the featured are selected,
02:37:02.920 | maybe Hitler because he's a special figure, you know.
02:37:07.920 | - I love that, I love that.
02:37:09.800 | No, I love the comparison to how many video games have been.
02:37:13.840 | And that definitely speaks to my earlier as like,
02:37:17.000 | if you've got a lot of young geeky men
02:37:19.800 | who really like video games,
02:37:21.720 | that doesn't necessarily get you the right place
02:37:24.760 | in every respect.
02:37:25.860 | Certainly, yeah, so here's a funny story.
02:37:32.120 | I woke up one morning to a bunch of journalists in Germany
02:37:37.960 | trying to get in touch with me
02:37:39.720 | because German language Wikipedia chose to have
02:37:42.200 | as the featured article of the day, swastika.
02:37:46.520 | And people were going crazy about it.
02:37:48.440 | And some people were saying it's illegal,
02:37:51.880 | has German Wikipedia been taken over
02:37:53.820 | by Nazi sympathizers and so on.
02:37:56.680 | And it turned out it's not illegal,
02:37:58.800 | like discussing the swastika,
02:38:00.840 | using the swastika as a political campaign
02:38:04.400 | and using it in certain ways is illegal in Germany
02:38:07.160 | in a way that it wouldn't be in the US
02:38:08.640 | because of First Amendment.
02:38:10.520 | But in this case, it was like,
02:38:11.800 | actually, part of the point is the swastika symbol
02:38:15.000 | is from other cultures as well.
02:38:17.080 | And they just thought it was interesting.
02:38:18.920 | And I did joke to the community, I'm like,
02:38:20.800 | please don't put the swastika on the front page
02:38:23.200 | without warning me 'cause I'm gonna get a lot of,
02:38:25.360 | now it wouldn't be me, it's the foundation.
02:38:27.040 | I'm not that much on the front lines.
02:38:29.520 | And so I would say that to put Hitler
02:38:32.040 | on the front page of Wikipedia,
02:38:34.040 | it is a special topic and you would wanna say,
02:38:36.320 | yeah, let's be really careful that it's really, really good
02:38:39.440 | before we do that because if we put it on the front page
02:38:42.200 | and it's not good enough, that could be a problem.
02:38:45.680 | There's no inherent reason, like clearly World War II
02:38:49.800 | is a very popular topic in Wikipedia.
02:38:53.920 | It's like, they're on the History Channel.
02:38:56.000 | Like people, it's a fascinating period of history
02:38:58.360 | that people are very interested in.
02:39:00.400 | And then on the other piece, like anarchism
02:39:03.520 | and Karl Marx, yeah, I mean, that's interesting.
02:39:08.280 | I'm surprised to hear that not more political books
02:39:12.600 | or topics have made it to the front page.
02:39:15.120 | - Now we're taking this Reddit comment.
02:39:16.840 | - I mean, as if it's completely,
02:39:19.400 | but I'm trusting, so I think that probably is right.
02:39:21.640 | They probably did have the list up.
02:39:23.280 | No, I think that piece, the piece about how many
02:39:26.800 | of those featured articles have been video games
02:39:29.640 | and if it's disproportionate, I think we should,
02:39:32.520 | the community should go, actually, what's gone on,
02:39:34.840 | like that doesn't seem quite right.
02:39:37.040 | You know, I mean, you can imagine
02:39:41.240 | that because you're looking for an article
02:39:45.760 | to be on the front page of Wikipedia,
02:39:47.840 | you wanna have a bit of diversity in it.
02:39:51.680 | You want it to be not always something
02:39:54.680 | that's really popular that week.
02:39:56.200 | So like, I don't know, the last couple of weeks,
02:39:57.840 | maybe Succession, a big finale of Succession
02:40:00.080 | might lead you to think, oh, let's put Succession
02:40:02.000 | on the front page, that's gonna be popular.
02:40:04.040 | In other cases, you kinda want,
02:40:06.400 | pick something super obscure and quirky
02:40:08.440 | because people also find that interesting and fun.
02:40:11.200 | So yeah, don't know, but you don't want it
02:40:12.720 | to be video games most of the time.
02:40:15.400 | That sounds quite bad.
02:40:16.640 | - Well, let me ask you just for,
02:40:22.480 | as somebody who's seen the whole thing,
02:40:25.400 | the development of the millions of articles,
02:40:30.120 | big impossible question, what's your favorite article?
02:40:33.480 | - My favorite article?
02:40:34.920 | Well, I've got an amusing answer,
02:40:38.040 | which is possibly also true.
02:40:40.960 | There's an article in Wikipedia
02:40:44.440 | called "Inherently Funny Words."
02:40:47.200 | And one of the reasons I love it is
02:40:49.320 | when it was created early in the history of Wikipedia,
02:40:54.280 | it kinda became like a dumping ground.
02:40:56.160 | People would just come by and write in any word
02:40:57.920 | that they thought sounded funny.
02:40:59.920 | And then it was nominated for deletion
02:41:01.640 | 'cause somebody's like, this is just a dumping ground,
02:41:03.720 | like people are putting all kinds of nonsense in.
02:41:06.320 | And in that deletion debate,
02:41:07.640 | somebody came forward and said, essentially,
02:41:10.200 | wait a second, hold on, this is actually a legitimate concept
02:41:14.040 | in the theory of humor and comedy.
02:41:16.040 | And a lot of famous comedians and humorists
02:41:17.840 | have written about it.
02:41:19.720 | And it's actually a legitimate topic.
02:41:23.640 | So then they went through and they meticulously referenced
02:41:26.320 | every word that was in there
02:41:27.560 | and threw out a bunch that weren't.
02:41:29.480 | And so it becomes this really interesting.
02:41:30.960 | Now, my biggest disappointment,
02:41:32.760 | and it's the right decision to make
02:41:35.400 | because there was no source,
02:41:38.160 | but it was a picture of a cow,
02:41:41.920 | but there was a rope around its head
02:41:46.360 | tying on some horns onto the cow.
02:41:50.200 | So it was kind of a funny looking picture.
02:41:52.800 | It looked like a bull with horns,
02:41:55.500 | but it's just like a normal milk cow.
02:41:58.680 | And below it, the caption said,
02:42:00.560 | according to some, cow is an inherently funny word,
02:42:04.320 | which is just hilarious to me,
02:42:05.840 | partly because the according to some
02:42:07.520 | sounds a lot like Wikipedia,
02:42:09.320 | but there was no source.
02:42:10.160 | So it went away and I feel very sad about that.
02:42:12.640 | But I've always liked that.
02:42:14.640 | And actually, the reason depths of Wikipedia
02:42:16.840 | amuses me so greatly is because it does highlight
02:42:21.480 | really interesting, obscure stuff.
02:42:24.200 | And you're like, wow, I can't believe somebody wrote
02:42:27.000 | about that in Wikipedia.
02:42:28.480 | It's quite amusing.
02:42:29.320 | And sometimes there's a bit of wry humor in Wikipedia.
02:42:32.200 | There's always a struggle.
02:42:33.640 | You're not trying to be funny,
02:42:36.400 | but occasionally a little inside humor
02:42:38.680 | can be quite healthy.
02:42:40.480 | - Apparently words with the letter K are funny.
02:42:43.800 | There's a lot of really well researched stuff on this page.
02:42:46.800 | - Yeah.
02:42:47.640 | - It's actually exciting.
02:42:49.480 | And I should mention for depths of Wikipedia,
02:42:51.680 | it's run by Annie Rauwerda.
02:42:56.560 | - That's right, Annie.
02:42:57.600 | - And let me just read off some of the pages.
02:43:01.400 | Octopolis and Octlantis.
02:43:04.720 | - Oh yeah, that was.
02:43:05.640 | - Are two separate non-human underwater settlements
02:43:08.200 | built by the gloomy octopuses in Jervis Bay, East Australia.
02:43:12.320 | The first settlement named Octopolis by biologists
02:43:15.800 | was found in 2009.
02:43:17.680 | The individual structures in Octopolis
02:43:19.360 | consist of burrows around a piece of human detritus
02:43:23.240 | believed to be scrap metal.
02:43:25.320 | And it goes on in this way.
02:43:26.760 | Satiric misspelling, least-concern species.
02:43:33.640 | Humans were formally assessed
02:43:35.160 | as a species of least concern in 2008.
02:43:40.160 | I think Hitchhiker's Guide to the Galaxy
02:43:43.480 | would slightly disagree.
02:43:45.680 | And last one, let me just say.
02:43:46.800 | Friendship paradox is the phenomena
02:43:48.680 | first observed by the sociologist Scott Feld in 1991
02:43:53.120 | that on average an individual's friends
02:43:55.480 | have more friends than that individual.
02:43:58.120 | - Oh, that's really interesting.
02:43:58.960 | - That's very lonely.
02:43:59.800 | - Isn't that, that's the kind of thing
02:44:01.560 | that makes you wanna, like it sounds implausible at first
02:44:04.840 | 'cause shouldn't everybody have on average
02:44:06.960 | about the same number of friends as all their friends?
02:44:09.360 | So you really wanna dig into the math of that
02:44:11.320 | and really think, oh, why would that be true?
02:44:13.640 | - And it's one way to feel more lonely
02:44:17.160 | in a mathematically rigorous way.
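A quick way to see why the friendship paradox holds: popular people show up on many friend lists, so they are over-counted when you average over "your friends' friend counts." A toy sketch under an assumed star-shaped example network (one well-connected hub, four others), purely for illustration:

```python
# Friendship paradox on a toy network: one popular "hub" and four others.
# The hub appears in everyone else's friend list, so it dominates the
# "friends of friends" average.
from statistics import mean

friends = {
    "hub": ["a", "b", "c", "d"],
    "a": ["hub"], "b": ["hub"], "c": ["hub"], "d": ["hub"],
}

# Average number of friends per person: (4 + 1 + 1 + 1 + 1) / 5 = 1.6
avg_friends = mean(len(f) for f in friends.values())

# For each person, the average friend count among their friends,
# then averaged over all people: (1 + 4 + 4 + 4 + 4) / 5 = 3.4
avg_friends_of_friends = mean(
    mean(len(friends[f]) for f in flist) for flist in friends.values()
)

print(f"average friends you have:          {avg_friends}")              # 1.6
print(f"average friends your friends have: {avg_friends_of_friends}")   # 3.4
```

On average everyone in this toy network has 1.6 friends, but their friends average 3.4, which is Feld's observation in miniature.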
02:44:19.880 | Somebody also on Reddit asks,
02:44:22.600 | I would love to hear some war stories from behind the scenes.
02:44:25.480 | Is there something that we haven't mentioned
02:44:26.960 | that was particularly difficult
02:44:28.200 | in this entire journey you're on with Wikipedia?
02:44:32.120 | - I mean, it's hard to say.
02:44:33.960 | I mean, so part of what I always say about myself
02:44:38.160 | is that I'm a pathological optimist.
02:44:41.080 | So I always think everything is fine.
02:44:42.960 | And so things that other people might find a struggle,
02:44:46.600 | I'm just like, oh, well, this is the thing we're doing today.
02:44:48.560 | So that's kind of about me.
02:44:50.100 | And it's actually, I'm aware of this about myself.
02:44:52.720 | So I do like to have a few pessimistic people around me
02:44:55.520 | to keep me a bit on balance.
02:44:57.740 | Yeah, I mean, I would say some of the hard things,
02:45:01.920 | I mean, there were hard moments
02:45:03.360 | like when two out of three servers crashed on Christmas day
02:45:06.240 | and then we needed to do a fundraiser
02:45:09.400 | and no idea what was gonna happen.
02:45:12.040 | I would say as well, like in that early period of time,
02:45:19.360 | the growth of the website and the traffic to the website
02:45:23.800 | was phenomenally great.
02:45:25.080 | The growth of the community,
02:45:26.360 | and in fact, the healthy growth of the community was fine.
02:45:29.240 | And then the Wikimedia Foundation,
02:45:31.500 | the nonprofit I set up to own and operate Wikipedia,
02:45:34.340 | as a small organization, it had a lot of growing pains.
02:45:37.800 | And that was the piece that's just like many companies
02:45:44.020 | or many organizations that are in a fast growth.
02:45:46.740 | It's like, you've hired the wrong people
02:45:48.960 | or there's this conflict that's arisen
02:45:50.600 | and nobody's got experience to do this and all that.
02:45:53.480 | So no specific stories to tell,
02:45:55.640 | but I would say growing the organization
02:45:58.320 | was harder than growing the community
02:46:00.480 | and growing the website, which is interesting.
02:46:02.760 | - Well, yeah, it's kind of miraculous and inspiring
02:46:05.400 | that a community can emerge and be stable
02:46:08.680 | and that has so much kind of productive, positive output.
02:46:13.680 | Kind of makes you think.
02:46:16.920 | I mean, I don't, it's one of those things
02:46:18.600 | you don't wanna analyze too much
02:46:20.200 | 'cause you don't wanna mess with a beautiful thing,
02:46:24.520 | but it gives me faith in communities.
02:46:26.360 | - Yeah, yeah.
02:46:27.200 | - That they can spring up in other domains as well.
02:46:29.560 | - Yeah, I think that's exactly right.
02:46:30.880 | And at Fandom, my for-profit wiki company,
02:46:35.600 | where it's like all these communities about pop culture,
02:46:39.280 | mainly sort of entertainment, gaming, and so on,
02:46:43.880 | there's a lot of small communities.
02:46:46.240 | And so I went last year to our Community Connect Conference
02:46:50.240 | and just met some of these people.
02:46:51.960 | And like, here's one of the leaders of the Star Wars wiki,
02:46:55.600 | which is called Wookieepedia, which I think is great.
02:46:58.360 | And he's telling me about his community and all that.
02:47:01.440 | And I'm like, oh, right, yeah, I love this.
02:47:03.800 | So it's not the same purpose as Wikipedia
02:47:07.480 | of a neutral, high-quality encyclopedia,
02:47:11.040 | but a lot of the same values are there of like,
02:47:13.280 | oh, people should be nice to each other.
02:47:15.520 | It's like, when people get upset, it's like,
02:47:17.560 | just remember, we're working on a Star Wars wiki together.
02:47:19.600 | Like, there's no reason to get too outraged.
02:47:22.200 | And just kind people, just like geeky people with a hobby.
02:47:26.080 | - Where do you see Wikipedia in 10 years, 100 years,
02:47:32.240 | and 1,000 years?
02:47:34.120 | - Dun, dun, dun.
02:47:35.840 | Right, so 10 years, I would say pretty much the same.
02:47:41.840 | Like, we're not gonna become TikTok, you know,
02:47:46.160 | with entertainment, scroll-by video,
02:47:50.200 | humor and blah, blah, blah. We're an encyclopedia.
02:47:54.480 | I think in 10 years, we probably will have
02:47:57.120 | a lot more AI-supporting tools, like I've talked about.
02:48:02.320 | And probably your search experience will be,
02:48:05.080 | you can ask a question and get the answer
02:48:06.920 | rather than, you know, from our body of work.
02:48:09.400 | - So search and discovery, a little bit improved.
02:48:11.720 | - Yeah, yeah, all that.
02:48:14.000 | I always say one of the things that people,
02:48:16.000 | most people won't notice,
02:48:18.000 | because already they don't notice it,
02:48:20.760 | is the growth of Wikipedia
02:48:23.200 | in the languages of the developing world.
02:48:25.520 | So you probably don't speak Swahili,
02:48:29.320 | so you're probably not checking out
02:48:30.960 | that Swahili Wikipedia is doing very well.
02:48:33.160 | And it is doing very well.
02:48:36.200 | And I think that kind of growth
02:48:37.400 | is actually super important and super interesting.
02:48:40.480 | But most people won't notice that.
02:48:41.800 | - If we can just link on that, if we could.
02:48:44.040 | Do you think, there's so much incredible translation work
02:48:47.880 | is being done with AI, with language models.
02:48:50.240 | Do you think that can accelerate Wikipedia?
02:48:55.240 | So you start with the basic draft
02:48:56.800 | of the translation of articles and then build on top of that.
02:48:59.560 | - So what I used to say is,
02:49:03.840 | like machine translation for many years
02:49:05.800 | wasn't much use to the community,
02:49:07.360 | 'cause it just wasn't good enough.
02:49:09.040 | As it's gotten better, it's tended to be a lot better
02:49:13.280 | in what we might call economically important languages.
02:49:17.120 | That's because the corpus that they train on
02:49:19.080 | and all of that.
02:49:20.120 | So to translate from English to Spanish,
02:49:24.040 | if you've tried Google Translate recently,
02:49:26.280 | Spanish to English is what I would do.
02:49:28.400 | It's pretty good.
02:49:29.280 | Like it's actually not bad.
02:49:30.280 | It used to be half a joke,
02:49:31.480 | and then for a while it was kind of like,
02:49:32.760 | well, you can get the gist of something.
02:49:33.960 | And now it's like, actually it's pretty good.
02:49:36.480 | However, we've got a huge Spanish community
02:49:38.280 | who write in native Spanish.
02:49:40.080 | So they're able to use it and they find it useful,
02:49:42.320 | but they're writing.
02:49:44.000 | But if you tried to do English to Zulu,
02:49:47.400 | where there's not that much investment,
02:49:50.720 | like there's loads of reasons to invest in English to Spanish
02:49:53.080 | 'cause they're both huge economically important languages,
02:49:55.840 | Zulu not so much.
02:49:56.680 | So for those smaller languages, it was just still terrible.
02:50:00.200 | My understanding is it's improved dramatically
02:50:03.360 | and also because the new methods of training
02:50:05.880 | don't necessarily involve identical corpuses
02:50:10.880 | to try to match things up,
02:50:13.160 | but rather reading and understanding with tokens
02:50:16.760 | and large language models,
02:50:18.400 | and then reading and understanding,
02:50:19.640 | and then you get a much richer.
02:50:22.040 | Anyway, apparently it's quite improved.
02:50:23.480 | So I think that now it is quite possible
02:50:26.760 | that these smaller language communities are gonna say,
02:50:29.600 | oh, well, finally I can put something in English
02:50:32.840 | and I can get out Zulu that I feel comfortable sharing
02:50:36.480 | with my community because it's actually good enough,
02:50:38.880 | or I can edit it a bit here and there.
02:50:40.680 | So I think that's huge.
02:50:41.720 | So I do think that's gonna happen a lot.
02:50:43.280 | And that's gonna accelerate, again,
02:50:44.960 | what will remain to most people an invisible trend,
02:50:47.120 | but that's the growth in all these other languages.
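A minimal sketch of the "machine draft, human review" workflow described here, reusing the same `openai` SDK assumed above; the target language, model name, and prompt are illustrative assumptions, not a description of any actual Wikimedia tooling:

```python
# Draft-translation helper: produce a machine translation that a native-speaking
# editor still reviews and corrects before sharing it with their community.
from openai import OpenAI

client = OpenAI()

def draft_translation(english_text: str, target_language: str = "Zulu") -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user",
                   "content": f"Translate the following encyclopedia text into "
                              f"{target_language}, keeping a neutral, factual tone:\n\n"
                              f"{english_text}"}],
    )
    # The result is only a starting point for a human editor, not publishable as-is.
    return resp.choices[0].message.content

# Example:
# print(draft_translation("The Eiffel Tower is a wrought-iron lattice tower in Paris."))
```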
02:50:50.080 | So then move on to 100 years.
02:50:52.080 | - Oh, it's starting to get scary.
02:50:54.880 | - Well, the only thing I say about 100 years is like,
02:50:58.360 | we've built the Wikimedia Foundation,
02:51:02.960 | and we run it in a quite cautious
02:51:06.640 | and financially conservative and careful way.
02:51:10.080 | So every year we build our reserves,
02:51:12.920 | every year we put aside a little bit more money.
02:51:15.440 | We also have the endowment fund,
02:51:17.200 | which we just passed 100 million.
02:51:18.920 | That's a completely separate fund with a separate board
02:51:22.640 | so that it's not just like a big fat bank account
02:51:24.800 | for some future profligate CEO to blow through.
02:51:28.040 | The foundation will have to get the approval
02:51:30.640 | of a second order board to be able to access that money.
02:51:34.200 | And that board can make other grants
02:51:36.240 | through the community and things like that.
02:51:38.480 | So the point of all that is I hope and believe
02:51:41.920 | that we're building in a financially stable way,
02:51:46.840 | that we can weather various storms along the way
02:51:49.560 | so that hopefully we're not taking the kind of risks.
02:51:54.560 | And by the way, we're not taking too few risks either.
02:51:57.280 | That's always hard.
02:51:58.280 | I think the Wikimedia Foundation and Wikipedia
02:52:01.040 | will exist in 100 years,
02:52:02.680 | if anybody exists in 100 years, we'll be there.
02:52:05.960 | - Do you think the internet just looks
02:52:08.000 | unpredictably different than just the web?
02:52:11.000 | - I do, I do.
02:52:11.920 | I mean, I think right now,
02:52:14.560 | this sort of enormous step forward we've seen
02:52:19.200 | has become public in the last year
02:52:20.800 | of the large language models,
02:52:22.320 | really is something else, right?
02:52:26.000 | It's really interesting.
02:52:26.840 | You and I have both talked today
02:52:28.600 | about the flaws and the limitations,
02:52:30.800 | but still it's, as someone who's been around technology
02:52:33.440 | for a long time, it's sort of that feeling
02:52:36.320 | of the first time I saw a web browser,
02:52:38.680 | the first time I saw the iPhone,
02:52:40.640 | like the first time the internet was like really usable
02:52:42.920 | on a phone and it's like, wow,
02:52:44.240 | that's a step change difference.
02:52:46.960 | There's a few other, you know.
02:52:48.600 | - Maybe a Google search.
02:52:49.920 | - Google search was actually one.
02:52:50.760 | - I remember the first search.
02:52:51.760 | - Because I remember AltaVista was kind of cool for a while,
02:52:54.160 | then it just got more and more useless
02:52:55.600 | 'cause the algorithm wasn't good.
02:52:57.240 | And it's like, oh, Google search,
02:52:58.320 | now the internet works again.
02:52:59.800 | And so large language model, it feels like that to me,
02:53:03.960 | like, oh wow, this is something new
02:53:06.400 | and really pretty remarkable.
02:53:08.480 | And it's gonna have some downsides,
02:53:09.520 | like the negative use case.
02:53:14.520 | People in the area who are experts,
02:53:16.080 | they're giving a lot of warnings
02:53:17.680 | and I don't know enough to, I'm not that worried,
02:53:19.760 | but I'm a pathological optimist.
02:53:22.240 | But I do see some like really low hanging fruit,
02:53:25.200 | bad things that can happen.
02:53:26.440 | So my example is, how about some highly customized spam
02:53:31.440 | where the email that you receive isn't just like
02:53:38.560 | misspelled words and like trying to get through filters,
02:53:40.840 | but actually is a targeted email to you
02:53:42.840 | that knows something about you
02:53:44.760 | by reading your LinkedIn profile
02:53:46.840 | and writes a plausible email
02:53:48.320 | that will get through the filters.
02:53:49.640 | And it's like suddenly, oh, that's a new problem,
02:53:52.880 | that's gonna be interesting.
02:53:55.040 | - Is there a, just on the Wikipedia editing side,
02:53:58.440 | does it make the job of the volunteer,
02:54:00.320 | of the editor more difficult if in a world
02:54:03.720 | where larger and larger percentage of the internet
02:54:07.000 | is written by an LLM?
02:54:08.520 | - So one of my predictions, and we'll see,
02:54:11.840 | ask me again in five years how this panned out,
02:54:14.480 | is that in a way this will strengthen
02:54:19.280 | the value and importance of some traditional brands.
02:54:24.080 | So if I see a news story and it's from
02:54:29.080 | the Wall Street Journal, from the New York Times,
02:54:33.200 | from Fox News, I know what I'm getting
02:54:36.680 | and I trust it to whatever extent I might have,
02:54:40.040 | you know, trust or distrust in any of those.
02:54:42.960 | And if I see a brand new website that looks plausible,
02:54:46.760 | but I've never heard of it,
02:54:48.040 | and it could be machine-generated content
02:54:50.360 | that may be full of errors, I think I'll be more cautious.
02:54:53.000 | I think I'm more interested.
02:54:54.200 | And we can also talk about this around
02:54:56.320 | photographic evidence.
02:54:58.640 | So obviously there will be scandals
02:55:00.240 | where major media organizations get fooled by a fake photo.
02:55:04.120 | However, if I see a photo of, the recent one
02:55:08.560 | was the Pope wearing an expensive puffer jacket,
02:55:12.280 | I'm gonna go, yeah, that's amazing
02:55:14.380 | that a fake like that could be generated,
02:55:16.400 | but my immediate thought is not,
02:55:18.280 | oh, so the Pope's dipping into the money, eh?
02:55:21.320 | Probably 'cause this particular Pope
02:55:23.080 | doesn't seem like he'd be the type.
02:55:25.180 | - My favorite is extensive pictures
02:55:27.680 | of Joe Biden and Donald Trump
02:55:29.560 | hanging out and having fun together.
02:55:31.000 | - Yeah, brilliant.
02:55:33.160 | So I think people will care about the provenance of a photo.
02:55:37.920 | And if you show me a photo and you say,
02:55:40.360 | yeah, this photo is from Fox News,
02:55:45.520 | even though I don't necessarily think that's the highest,
02:55:47.880 | but I'm like, well, it's a news organization
02:55:50.080 | and they're gonna have journalists in it,
02:55:51.120 | they're gonna make sure the photo is what it purports to be,
02:55:54.520 | that's very different from a photo
02:55:56.720 | randomly circulating on Twitter.
02:55:58.040 | Whereas I would say 15 years ago,
02:56:00.240 | a photo randomly circulating on Twitter,
02:56:02.400 | in most cases, the worst you could do,
02:56:06.000 | and this did happen, is misrepresent the battlefield.
02:56:10.000 | So like, oh, here's a bunch of injured children.
02:56:13.280 | Look what Israel's done,
02:56:14.360 | but actually it wasn't Israel,
02:56:15.680 | it was another case 10 years ago.
02:56:17.920 | That has happened, that has always been around.
02:56:20.960 | But now we can have much more specifically constructed,
02:56:25.160 | plausible looking photos,
02:56:26.720 | that if I just see them circulating on Twitter,
02:56:28.360 | I'm gonna go, just don't know, not sure.
02:56:30.680 | Like I can make that in five minutes.
02:56:32.760 | - Well, I also hope that it's kind of like
02:56:35.240 | what you're writing about in your book,
02:56:36.360 | that we could also have citizen journalists
02:56:38.760 | that have a stable, verifiable trust that builds up.
02:56:43.840 | So it doesn't have to be the New York Times,
02:56:45.880 | this organization; it could be an organization of one,
02:56:49.000 | as long as it's stable and carries through time
02:56:50.920 | and the trust builds up.
02:56:52.240 | - No, I agree, but the one thing I've said in the past,
02:56:56.040 | and this depends on who that person is
02:56:57.760 | and what they're doing, but it's like,
02:56:59.280 | I think my credibility, my general credibility in the world
02:57:02.320 | should be the equal of a New York Times reporter.
02:57:05.400 | So if something happens and I witness it
02:57:08.720 | and I write about it, people are gonna go,
02:57:10.440 | well, Jimmy Wales said it.
02:57:12.360 | That's just like if a New York Times reporter said it,
02:57:15.200 | I'm gonna tend to think he didn't just make it up.
02:57:17.980 | Truth is, nothing interesting ever happens around me.
02:57:20.280 | I don't go to war zones, I don't go to big press conferences,
02:57:23.040 | I don't interview Putin and Zelensky, right?
02:57:27.960 | So just to an extent, yes,
02:57:30.400 | whereas I do think for other people,
02:57:32.100 | those traditional models of credibility
02:57:36.360 | are really, really important.
02:57:39.280 | And then there is this sort of citizen journalism,
02:57:41.600 | I don't know if you think of what you do as journalism,
02:57:44.880 | I kind of think it is, but you do interviews,
02:57:46.840 | you do long form interviews.
02:57:48.920 | And I think people, like if you come and you say,
02:57:51.800 | right, here's my tape, but you wouldn't hand out a tape,
02:57:55.160 | like I just gestured at you as if I'm handing you a cassette
02:57:57.320 | tape, but if you put it into your podcast,
02:58:00.040 | here's my interview with Zelensky,
02:58:03.120 | and people aren't gonna go, yeah, how do we know?
02:58:06.840 | That could be a deep fake.
02:58:08.160 | Like you could have faked that,
02:58:09.320 | 'cause people are like, well, no,
02:58:11.320 | like you're a well-known podcaster
02:58:12.920 | and you do interview interesting people,
02:58:14.540 | and yeah, like you wouldn't think that.
02:58:17.120 | So that your brand becomes really important,
02:58:19.080 | whereas if suddenly, and I've seen this already,
02:58:21.760 | I've seen sort of video with subtitles in English,
02:58:26.720 | and apparently the Ukrainian was the same,
02:58:29.600 | and Zelensky saying something really outrageous.
02:58:32.840 | And I'm like, yeah, I don't believe that.
02:58:34.200 | Like, I don't think he said that in a meeting with,
02:58:37.080 | you know, whatever.
02:58:38.840 | I think that's Russian propaganda, or probably just trolls.
02:58:42.280 | - Yeah, and then building platforms and mechanisms
02:58:44.600 | of how that trust can be verified.
02:58:46.640 | Whether, you know, if something appears
02:58:47.800 | on a Wikipedia page, that means something.
02:58:50.000 | If something appears on, like say,
02:58:52.780 | my Twitter account, that means something.
02:58:54.360 | That means I, this particular human, have signed off on it.
02:58:57.960 | - Yeah, exactly.
02:58:58.800 | - And then the trust you have in this particular human
02:59:02.660 | transfers to the piece of content.
02:59:04.720 | And then each, hopefully there's millions of people
02:59:07.800 | with different metrics of trust.
02:59:10.120 | And then you could see that there's a certain kind of bias
02:59:12.440 | in the set of conversations you're having.
02:59:14.400 | So maybe, okay, I trust this person to have this kind
02:59:16.600 | of bias, and I'll go to this other person
02:59:18.160 | with this other kind of bias, and I can integrate them
02:59:20.360 | in this kind of way, just like you said with Fox News
02:59:22.520 | and whatever else.
02:59:23.360 | - Yeah, and the Wall Street Journal, New York Times,
02:59:24.760 | like they've all got their place, where they sit.
02:59:27.220 | Yeah.
02:59:29.480 | - So you have built, I would say,
02:59:34.400 | one of, if not the most impactful website
02:59:37.960 | in the history of human civilization.
02:59:39.920 | So let me ask for you to give advice to young people
02:59:44.240 | how to have impact in this world.
02:59:46.120 | High schoolers, college students, wanting to have
02:59:48.940 | a big positive impact in the world.
02:59:50.360 | - Yeah, great.
02:59:51.240 | If you wanna be successful, do something you're really
02:59:54.040 | passionate about, rather than some kind of cold calculation
02:59:56.960 | of what can make you the most money.
02:59:58.680 | 'Cause if you go and try to do something,
03:00:00.760 | and you're like, I'm not that interested,
03:00:02.080 | but I'm gonna make a lot of money doing it,
03:00:03.680 | you're probably not gonna be that good at it.
03:00:05.560 | And so that is a big piece of it.
03:00:07.760 | I also like to give this advice for startups,
03:00:14.440 | and really for a career, a startup,
03:00:17.280 | any kind of young person just starting out:
03:00:19.920 | be persistent, right?
03:00:24.840 | There'll be moments when it's not working out,
03:00:27.320 | and you can't just give up too easily.
03:00:29.360 | You've gotta persist through some hard times.
03:00:31.080 | Maybe two servers crash on a Sunday,
03:00:33.080 | and you've gotta sort of scramble to figure it out,
03:00:35.320 | but persist through that.
03:00:36.580 | And then also, be prepared to pivot.
03:00:41.200 | That's a newer word, new for me,
03:00:43.120 | but when I pivoted from Newpedia to Wikipedia,
03:00:48.120 | it's like, this isn't working,
03:00:49.360 | I've gotta completely change.
03:00:51.180 | So be willing to completely change direction
03:00:53.360 | when something's not working.
03:00:54.440 | Now the problem with these two wonderful pieces of advice
03:00:58.440 | is which situation am I in today, right?
03:01:01.200 | Is this a moment when I need to just power through
03:01:03.560 | and persist because I'm gonna find a way to make this work,
03:01:06.520 | or is this a moment where I need to go,
03:01:07.960 | actually, this is totally not working,
03:01:09.440 | and I need to change direction?
03:01:11.560 | But also, I think for me, that always gives me a framework
03:01:13.880 | of like, okay, here's a problem.
03:01:17.900 | Do we need to change direction,
03:01:19.600 | or do we need to kind of power through it?
03:01:21.280 | And just knowing those are the choices,
03:01:24.800 | and they're not always the only choices,
03:01:26.560 | but those are choices, I think can be helpful to say,
03:01:29.560 | okay, am I chickening out,
03:01:34.560 | 'cause I'm having a little bump,
03:01:36.040 | and I'm feeling emotional,
03:01:37.000 | and I'm just gonna give up too soon?
03:01:38.520 | Okay, ask yourself that question.
03:01:40.720 | And also, it's like, am I being pigheaded
03:01:42.920 | and trying to do something that actually doesn't make sense?
03:01:45.240 | Okay, ask yourself that question too,
03:01:46.360 | even though they're contradictory questions.
03:01:48.560 | Sometimes it'll be one, sometimes it'll be the other,
03:01:51.600 | and you gotta really think it through.
03:01:53.520 | - I think persisting with the business model
03:01:55.520 | behind Wikipedia is such an inspiring story,
03:01:59.960 | because we live in a capitalist world.
03:02:03.120 | We live in a scary world, I think,
03:02:07.040 | for an internet business.
03:02:08.460 | And so to do things differently
03:02:12.440 | than a lot of websites are doing,
03:02:14.200 | like Wikipedia has lived through the successive explosion
03:02:18.120 | of many websites that are basically ad-driven.
03:02:20.880 | Google is ad-driven.
03:02:23.760 | Facebook, Twitter, all of these websites are ad-driven.
03:02:26.840 | And to see them succeed, become these incredibly rich,
03:02:31.840 | powerful companies that if I could just have that money,
03:02:37.040 | you would think as somebody running Wikipedia,
03:02:39.240 | I could do so much positive stuff, right?
03:02:41.120 | And so to persist through that is,
03:02:43.780 | I think is, from my perspective now,
03:02:46.980 | Monday morning quarterback or whatever,
03:02:50.400 | is the right decision.
03:02:53.440 | - But boy, is that a tough decision.
03:02:55.640 | - It seemed easy at the time, so.
03:02:58.400 | - And then you just kinda stay with it, stick with it.
03:02:59.920 | - Yeah, you just stay with it, it's working.
03:03:01.800 | - So on that one, you chose persistent.
03:03:03.400 | - Yeah, well, yeah.
03:03:07.360 | I mean, I always like to give an example of MySpace,
03:03:11.400 | 'cause I just think it's an amusing story.
03:03:13.360 | So MySpace was poised, I would say, to be Facebook, right?
03:03:17.640 | It was huge, it was viral, it was lots of things.
03:03:21.000 | Kind of foreshadowed a bit of maybe even TikTok,
03:03:23.400 | because it was like a lot of entertainment content, casual.
03:03:26.400 | And then Rupert Murdoch bought it,
03:03:30.920 | and it collapsed within a few years.
03:03:33.520 | And part of that, I think, was because they were really,
03:03:36.280 | really heavy on ads and less heavy
03:03:39.280 | on the customer experience.
03:03:40.800 | So I remember to accept a friend request
03:03:43.120 | was like three clicks where you saw three ads.
03:03:45.880 | And on Facebook, you accept the friend request,
03:03:47.880 | you didn't even leave the page,
03:03:49.000 | it just, like, that just accepted.
03:03:51.400 | But what is interesting, so I used to give this example
03:03:53.680 | of like, yeah, well, Rupert Murdoch really screwed that one
03:03:56.840 | up, in a sense, maybe he did, but somebody said,
03:03:58.640 | you know what, actually, he bought it for,
03:04:01.040 | and I don't remember the numbers,
03:04:02.040 | he bought it for 800 million,
03:04:03.040 | and it was very profitable through its decline.
03:04:06.920 | He actually made his money back and more.
03:04:09.000 | So it wasn't, like, from a financial point of view,
03:04:11.160 | it was a bad investment in the sense of,
03:04:12.680 | you could have been Facebook,
03:04:14.400 | but on sort of more mundane metrics, it's like,
03:04:17.000 | actually it worked out okay for him.
03:04:18.120 | It all matters how you define success.
03:04:20.120 | - It does, and that is also advice to young people.
03:04:23.880 | One of the things I would say,
03:04:27.280 | like, when we have our mental models of success,
03:04:31.600 | as an entrepreneur, for example,
03:04:33.640 | and your examples in your mind are Bill Gates,
03:04:37.560 | Mark Zuckerberg, so people who, at a very young age,
03:04:42.560 | had one really great idea that just went straight
03:04:45.480 | to the moon and became one of the richest people
03:04:47.000 | in the world, that is really unusual,
03:04:49.600 | like, really, really rare.
03:04:52.040 | And for most entrepreneurs,
03:04:53.520 | that is not the life path you're gonna take.
03:04:55.560 | You're gonna fail, you're gonna reboot,
03:04:57.640 | you're gonna learn from what you failed at,
03:04:59.400 | you're gonna try something different.
03:05:01.320 | And that is really important,
03:05:02.760 | because if your standard of success is,
03:05:05.880 | well, I feel sad because I'm not as rich as Elon Musk,
03:05:09.280 | it's like, well, so should almost everyone,
03:05:13.000 | possibly everyone, because everyone except Elon Musk
03:05:14.760 | is not as rich as Elon Musk.
03:05:16.040 | And so that, like, realistically,
03:05:18.760 | you can set a standard of success,
03:05:20.720 | even in a really narrow sense, which I don't recommend,
03:05:24.160 | of thinking about your financial success.
03:05:27.640 | It's like, if you measure your financial success
03:05:30.040 | by thinking about billionaires, like, that's heavy,
03:05:35.040 | like, that's probably not good.
03:05:37.280 | I don't recommend it.
03:05:38.540 | Whereas, like, I personally, like, for me,
03:05:42.800 | when people, when journalists say,
03:05:44.240 | oh, how does it feel to not be a billionaire,
03:05:45.760 | I usually say, I don't know, how's it feel to you?
03:05:48.480 | 'Cause they're not.
03:05:49.480 | But also, I'm like, I live in London.
03:05:53.920 | The number of bankers that no one's ever heard of
03:05:57.000 | who live in London, who make far more money than I ever will,
03:05:59.640 | is quite a large number.
03:06:01.480 | And I wouldn't trade my life for theirs at all, right?
03:06:05.000 | Because mine is so interesting.
03:06:06.760 | Like, oh, right, Jimmy, we need you to go
03:06:11.120 | and meet the Chinese propaganda minister.
03:06:13.840 | Oh, okay, that's super interesting.
03:06:15.600 | Like, yeah, Jimmy, you know, like, here's the situation.
03:06:18.600 | Like, you can go to this country,
03:06:20.320 | and while you're there, the president has asked to see you.
03:06:24.040 | It's like, God, that's super interesting.
03:06:26.120 | Jimmy, you're going to this place,
03:06:28.600 | and there's a local Wikipedian who said,
03:06:30.600 | do you wanna stay with me and my family?
03:06:32.840 | And I'm like, yeah, like, that's really cool.
03:06:35.000 | Like, I would like to do that.
03:06:36.120 | That's really interesting.
03:06:38.240 | I don't do that all the time, but I've done it,
03:06:39.560 | and it's great.
03:06:40.400 | So, like, for me, that's like arranging your life
03:06:43.360 | so that you have interesting experiences is just great.
03:06:48.360 | - Well, this is more to the question
03:06:51.560 | of what Wikipedia looks like in 1,000 years.
03:06:54.120 | What do you think is the meaning of this whole thing?
03:06:56.440 | Why are we here, human civilization?
03:06:58.660 | What's the meaning of life?
03:07:00.280 | - I don't think there is an external answer to that question.
03:07:05.280 | - And I should mention that there's a very good
03:07:08.560 | Wikipedia page on the different philosophies
03:07:10.920 | of the meaning of life.
03:07:11.760 | - Oh, interesting.
03:07:12.580 | - So, I'll have to read that and see what I think.
03:07:14.320 | Hopefully, it's neutral and gives a wide frame.
03:07:16.680 | - Oh, it's a really good reference
03:07:18.440 | to a lot of different philosophies about meaning.
03:07:21.360 | The 20th century philosophers in general,
03:07:23.840 | from Nietzsche to the existentialists
03:07:26.440 | to all the others,
03:07:29.360 | all of them have an idea of meaning.
03:07:30.920 | They really struggle with it systematically, rigorously,
03:07:33.400 | and that's what the page covers,
03:07:34.840 | and obviously, a shout out to "The Hitchhiker's Guide"
03:07:37.080 | and all that kind of stuff.
03:07:37.920 | - Yeah, yeah, yeah.
03:07:38.960 | No, I think there's no external answer to that.
03:07:41.800 | I think it's internal.
03:07:43.000 | I think we decide what meaning we will have in our lives
03:07:48.000 | and what we're gonna do with ourselves.
03:07:50.300 | And so, when I think, you know,
03:07:53.920 | if we're talking about thousand years, millions of years,
03:07:57.320 | Yuri Milner wrote a book.
03:08:01.860 | He's a big internet investor guy.
03:08:03.920 | He wrote a book advocating quite strongly
03:08:08.680 | for humans exploring the universe
03:08:12.760 | and getting off the planet.
03:08:14.360 | And he funds projects to, like,
03:08:16.600 | use lasers to send little cameras,
03:08:19.200 | and interesting stuff.
03:08:20.660 | And he talks a lot in the book about meaning.
03:08:24.160 | It's like his view is that the purpose of the human species
03:08:27.560 | is to broadly survive and get off the planet.
03:08:31.360 | Well, I don't agree with everything he has to say
03:08:33.600 | 'cause I think that's not a meaning
03:08:35.440 | that can motivate most people in their own lives.
03:08:38.000 | It's like, okay, great.
03:08:40.080 | You know, like the distances of space
03:08:42.080 | are absolutely enormous.
03:08:43.320 | So, I don't know what,
03:08:45.640 | should we build generation ships to start flying places?
03:08:48.160 | Well, I can't do that.
03:08:49.400 | And I'm not, even if I could, even if I'm Elon Musk
03:08:51.720 | and I could devote all my wealth to build,
03:08:53.880 | I'll be dead on the ship on the way.
03:08:55.700 | So, is that really meaning?
03:08:57.760 | But I think it's really interesting to think about.
03:09:00.800 | And reading his little book,
03:09:02.480 | it's quite a short little book, reading his book,
03:09:04.480 | it made me, it did make me think about,
03:09:06.880 | wow, like this is big.
03:09:07.880 | Like, this is not what you think about
03:09:09.120 | in your day-to-day life.
03:09:10.440 | It's like, where is the human species going to be
03:09:13.920 | in 10 million years?
03:09:15.680 | And it does make you sort of turn back to Earth and say,
03:09:19.560 | gee, let's not destroy the planet.
03:09:23.240 | Like, we're stuck here for at least a while.
03:09:26.360 | And therefore, we should really think about sustainability.
03:09:32.600 | And I mean, one million year sustainability.
03:09:37.000 | And we don't have all the answers.
03:09:38.240 | We have nothing close to the answers.
03:09:40.040 | I'm actually excited about AI in this regard,
03:09:43.320 | while also bracketing.
03:09:44.660 | Yeah, I understand there's also risks
03:09:45.960 | and people are terrified of AI.
03:09:48.360 | But I actually think it is quite interesting,
03:09:50.320 | this moment in time that we may have in the next 50 years
03:09:54.200 | to really, really solve some really long-term human problems,
03:10:00.080 | for example, in health.
03:10:02.120 | Like, the progress that's being made in cancer treatment,
03:10:06.640 | because we are able to, at scale,
03:10:08.960 | model molecules and genetics and things like this,
03:10:15.200 | it gets huge, it's really exciting.
03:10:16.960 | So if we can hang on for a little while,
03:10:22.640 | then certain problems that seem completely intractable today,
03:10:27.280 | like climate change, may end up being actually not that hard.
03:10:30.200 | And we just might be able to alleviate
03:10:33.360 | the full diversity of human suffering.
03:10:35.760 | - For sure, yeah.
03:10:36.900 | - And in so doing, help increase the chance
03:10:41.480 | that we can propagate the flame of human consciousness
03:10:44.360 | out towards the stars.
03:10:46.640 | And I think another important one, if we fail to do that,
03:10:51.240 | for me, is propagating and maintaining
03:10:54.400 | the full diversity and richness and complexity
03:10:58.760 | and expansiveness of human knowledge.
03:11:02.240 | So if we destroy ourselves,
03:11:03.960 | it would make me feel a little bit okay.
03:11:07.440 | - Yeah, you just--
03:11:08.520 | - If the human knowledge--
03:11:09.360 | - Just triggered me to say something really interesting,
03:11:12.520 | which is, when we talked earlier about translating
03:11:16.860 | and using machines to translate,
03:11:18.920 | we mostly talked about small languages
03:11:21.480 | and translating into English.
03:11:22.760 | But I always like to tell this story
03:11:24.320 | of something inconsequential, really.
03:11:26.840 | But there's, I was in Norway, in Bergen, Norway,
03:11:30.920 | where every year they've got this annual festival
03:11:32.880 | called Buekorps, which is young groups drumming,
03:11:37.600 | and they have a drumming competition.
03:11:39.080 | It's the 17 sectors of the city,
03:11:40.640 | and they've been doing it for a couple hundred years
03:11:42.280 | or whatever.
03:11:43.560 | They wrote about it in the three languages of Norway.
03:11:48.040 | And then from there, it was translated into English,
03:11:50.720 | into German, et cetera, et cetera.
03:11:53.040 | And so what I love about that story
03:11:55.320 | is what it reminds me of:
03:11:57.040 | this machine translation goes both ways.
03:12:00.800 | And when you talk about the richness
03:12:02.800 | and broadness of human culture,
03:12:04.900 | we're already seeing some really great pieces of this.
03:12:08.320 | So, like Korean soap operas, really popular,
03:12:12.120 | not with me, but with people.
03:12:13.880 | And the ability to, you know,
03:12:17.280 | imagine taking a very famous, very popular,
03:12:20.480 | very well-known Korean drama,
03:12:23.840 | and now, I mean, and I literally mean now,
03:12:26.240 | we're just about there technologically,
03:12:29.080 | where we use a machine to re-dub it in English
03:12:33.400 | in an automated way, including digitally editing the faces
03:12:37.040 | so it doesn't look dubbed.
03:12:38.920 | And so suddenly you say, oh, wow, like here's a piece of,
03:12:43.640 | you know, it's the Korean equivalent of,
03:12:46.500 | maybe it's "Friends" as a comedy,
03:12:48.160 | or maybe it's "Succession" just to be very contemporary.
03:12:50.760 | It's something that really impacted a lot of people,
03:12:52.680 | and they really loved it.
03:12:53.520 | And we have literally no idea what it's about.
03:12:56.120 | And suddenly it's like, wow, you know, like music,
03:13:00.760 | street music from wherever in the world
03:13:03.680 | can suddenly become accessible to us all in new ways.
03:13:08.200 | It's so cool.
03:13:09.680 | - It's really exciting to get access
03:13:11.520 | to the richness of culture in China,
03:13:14.680 | in the many different subcultures of Africa, South America.
03:13:18.960 | - One of my unsuccessful arguments
03:13:21.320 | with the Chinese government is by blocking Wikipedia,
03:13:25.680 | right, you aren't just stopping people in China
03:13:28.480 | from reading Chinese Wikipedia
03:13:31.040 | and other language versions of Wikipedia,
03:13:32.440 | you're also preventing the Chinese people
03:13:34.880 | from telling their story.
03:13:36.880 | So is there a small festival in a small town in China
03:13:40.560 | like Buekorps?
03:13:42.360 | I don't know, but by the way,
03:13:44.020 | the people who live in that village,
03:13:45.880 | that small town of 50,000,
03:13:48.360 | they can't put that in Wikipedia
03:13:49.720 | and get it translated into other places.
03:13:51.480 | They can't share their culture and their knowledge.
03:13:54.280 | And I think for China,
03:13:55.400 | this should be a somewhat influential argument
03:13:57.600 | because China does feel misunderstood in the world.
03:14:00.600 | It's like, okay, well, there's one way,
03:14:02.040 | if you wanna help people understand, put it in Wikipedia.
03:14:06.040 | That's what people go to when they wanna understand.
03:14:08.360 | - And give the amazing, incredible people of China a voice.
03:14:13.260 | - Exactly.
03:14:14.700 | - Jimmy, thank you so much.
03:14:16.040 | I'm such a huge fan of everything you've done.
03:14:17.920 | - Oh, thank you.
03:14:18.760 | - Yeah, it's really great.
03:14:19.800 | - I'm deeply, deeply, deeply, deeply grateful for Wikipedia.
03:14:23.040 | I love it, it brings me joy.
03:14:24.800 | I donate all the time, you should donate too.
03:14:27.640 | It's a huge honor to finally talk with you.
03:14:29.120 | It's just amazing.
03:14:30.500 | Thank you so much for today.
03:14:31.520 | - Thanks for having me.
03:14:32.680 | - Thanks for listening to this conversation
03:14:35.080 | with Jimmy Wales.
03:14:36.280 | To support this podcast,
03:14:37.400 | please check out our sponsors in the description.
03:14:40.440 | And now, let me leave you with some words
03:14:43.000 | from the world historian Daniel Boorstin.
03:14:47.520 | The greatest enemy of knowledge is not ignorance.
03:14:50.360 | It is the illusion of knowledge.
03:14:53.160 | Thank you for listening, and hope to see you next time.
03:14:57.720 | (upbeat music)
03:15:00.300 | (upbeat music)
03:15:02.880 | [BLANK_AUDIO]