Jimmy Wales: Wikipedia | Lex Fridman Podcast #385
Chapters
0:00 Introduction
0:47 Origin story of Wikipedia
6:51 Design of Wikipedia
13:44 Number of articles on Wikipedia
19:55 Wikipedia pages for living persons
40:48 ChatGPT
54:19 Wikipedia's political bias
1:00:23 Conspiracy theories
1:13:28 Facebook
1:21:46 Twitter
1:42:22 Building Wikipedia
1:56:55 Wikipedia funding
2:08:15 ChatGPT vs Wikipedia
2:12:56 Larry Sanger
2:18:28 Twitter files
2:21:20 Government and censorship
2:35:44 Adolf Hitler's Wikipedia page
2:47:26 Future of Wikipedia
2:59:29 Advice for young people
3:06:50 Meaning of life
00:00:00.000 |
We've never bowed down to government pressure 00:00:09.000 |
about how different companies respond to this, 00:00:12.120 |
but our response has always been just to say no. 00:00:15.680 |
And if they threaten to block, well knock yourself out, 00:00:19.720 |
- The following is a conversation with Jimmy Wales, 00:00:27.080 |
one of, if not the most impactful websites ever, 00:00:31.880 |
expanding the collective knowledge, intelligence, 00:00:55.640 |
of the free software movement, open source software, 00:01:09.960 |
because it empowers an ability to work together 00:01:13.280 |
that's really hard to do if the code is still proprietary 00:01:18.160 |
we sort of have to figure out how I'm gonna be rewarded 00:01:32.400 |
And I realized that that kind of collaboration 00:01:37.520 |
And the first thing that I thought of was an encyclopedia. 00:01:42.680 |
that an encyclopedia, you can collaborate on it. 00:01:54.120 |
You know, you should see a picture, a few pictures, maybe, 00:01:57.000 |
history, location, something about the architect, 00:02:29.040 |
because a bunch of volunteers on the internet 00:02:36.320 |
So we had implemented this seven-stage review process 00:02:47.560 |
that we published after this rigorous process, 00:02:58.520 |
and realized that it wasn't actually that good, 00:03:00.960 |
even though it had been reviewed by academics and so on. 00:03:10.600 |
I was frustrated, why is this taking so long? 00:03:16.200 |
I saw that Robert Merton had won a Nobel Prize in economics 00:03:23.240 |
And when I was in academia, that's what I worked on, 00:03:25.280 |
was option pricing theory, so I decided to write that article myself. 00:03:27.720 |
So I'd worked through all of his academic papers 00:03:35.960 |
And when I started to do it, I'd been out of academia, 00:03:39.240 |
hadn't been a grad student for a few years then. 00:03:46.520 |
and send it to the most prestigious finance professors 00:03:49.280 |
that we could find to give me feedback for revisions. 00:03:54.880 |
It's like this really oppressive sort of like, 00:03:59.640 |
- A little bit of the bad part of grad school. 00:04:00.960 |
- Yeah, yeah, the bad part of grad school, right? 00:04:03.280 |
And so I was like, oh, this isn't intellectually fun. 00:04:11.360 |
potential embarrassment if I screw something up 00:04:22.600 |
had brought and showed me the wiki concept in December 00:04:31.440 |
And so in January, we decided to launch Wikipedia, 00:04:45.480 |
And we were concerned that, oh, maybe these academics 00:04:50.280 |
and we shouldn't just convert the project immediately. 00:04:58.240 |
But actually we got more work done in two weeks 00:05:15.360 |
but it's true and it's a start and it's kind of fun. 00:05:20.120 |
Actually, a funny story was several years later, 00:05:32.400 |
which was surprising, but it wasn't that surprising. 00:05:39.560 |
Robert Aumann won the Nobel Prize in economics and hit save, 00:05:39.560 |
It was like, oh no, I was able to chip in and help. 00:06:00.480 |
And so it's just a completely different model, 00:06:03.520 |
- What is it that made that so accessible, so fun, 00:06:15.080 |
that are just greenfield, you know, available. 00:06:18.340 |
But you know, you could say, oh, well, you know, 00:06:22.480 |
I know a little bit about this and I can get it started. 00:06:34.160 |
where people can, much like open source software, 00:06:39.320 |
and then people suggest revisions and they change it 00:06:41.400 |
and it modifies and it grows beyond the original creator. 00:06:45.880 |
It's just a kind of a fun, wonderful, quite geeky hobby, 00:06:51.440 |
- How much debate was there over the interface, 00:06:57.200 |
- Yeah, I mean, not as much as there probably 00:07:01.040 |
During those two years of the failure of Nupedia, 00:07:08.600 |
there was a huge, long discussion, email discussion, 00:07:11.540 |
very clever people talking about things like neutrality, 00:07:17.260 |
but also talking about more technical ideas, you know, 00:07:19.440 |
things like, back then, XML was kind of all the rage 00:07:31.320 |
So, for example, you know, the population of New York City, 00:07:35.300 |
every 10 years there's a new official census, 00:07:37.920 |
couldn't you just update that bit of data in one place 00:07:42.840 |
That is a reality today, but back then it was just like, 00:07:45.240 |
hmm, how do we do that, how do we think about that? 00:07:56.200 |
from a Wikipedia entry, you can link to that piece of data 00:07:59.000 |
in Wikidata, I mean, it's a pretty advanced thing, 00:08:01.220 |
but there are advanced users who are doing that. 00:08:04.680 |
it updates in all the languages where you've done that. 00:08:07.920 |
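To make that Wikidata mechanism concrete, here is a minimal sketch of reading one canonical fact, the population of New York City, from Wikidata's public entity API. This is an illustrative assumption about how you might fetch the value programmatically, not how Wikipedia's own templates are wired up; Q60 and P1082 are the Wikidata identifiers for New York City and the population property, and the exact JSON layout should be checked against the live API.

```python
# Illustrative sketch: pull one shared data point (New York City's population)
# from Wikidata, the single place the conversation above says the fact can live.
# Q60 = New York City, P1082 = population. The JSON structure is assumed and may
# need adjusting against the real response.
import requests

def wikidata_population(item_id: str = "Q60") -> str:
    url = f"https://www.wikidata.org/wiki/Special:EntityData/{item_id}.json"
    entity = requests.get(url, timeout=10).json()["entities"][item_id]
    statements = entity["claims"]["P1082"]        # all population statements
    newest = statements[-1]                       # naively take the last statement
    amount = newest["mainsnak"]["datavalue"]["value"]["amount"]
    return amount.lstrip("+")                     # amounts are stored like "+8804190"

if __name__ == "__main__":
    print("Population of New York City (per Wikidata):", wikidata_population())
```

Update the figure once in Wikidata and every article that pulls it, in any language, reflects the change, which is the behavior described above.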
There was this chain of emails in the early days 00:08:18.760 |
which we started with, it's quite amusing actually, 00:08:22.560 |
because the main reason we launched with UseModWiki 00:08:28.160 |
So it was really easy for me to install it on the server 00:08:32.040 |
But it was, you know, some guy's hobby project, 00:08:35.400 |
it was cool, but it was just a hobby project. 00:08:37.760 |
And all the data was stored in flat text files. 00:08:45.440 |
So to search the site, you basically used grep, 00:08:55.640 |
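As a rough picture of what search meant in that flat-file era, here is a small sketch that does what grep did: walk a directory of page files and print every line containing a term. The wiki_pages directory name is hypothetical.

```python
# Sketch of grep-style search over flat text files, roughly how search worked
# before a real database existed. "wiki_pages" is a hypothetical directory.
from pathlib import Path

def grep_pages(data_dir: str, term: str):
    """Yield (page_name, matching_line) for every line containing the term."""
    for page_file in sorted(Path(data_dir).rglob("*.txt")):
        for line in page_file.read_text(errors="ignore").splitlines():
            if term.lower() in line.lower():
                yield page_file.stem, line.strip()

for page, line in grep_pages("wiki_pages", "encyclopedia"):
    print(f"{page}: {line}")
```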
But also, in the early days, it didn't have real logins. 00:09:00.120 |
So you could set your username, but there were no passwords. 00:09:11.560 |
but it was kind of obvious, like you can't grow 00:09:13.160 |
a big website where everybody can pretend to be everybody. 00:09:19.440 |
So quickly, I had to write a little, you know, 00:09:21.800 |
login, store people's passwords and things like that, 00:09:26.400 |
And then another example of something, you know, 00:09:32.400 |
but to make a link in Wikipedia, in the early days, 00:09:35.980 |
you would make a link to a page that may or may not exist 00:09:39.720 |
by just using camel case, meaning it's like uppercase, 00:09:45.120 |
So maybe New York City, you might type N-E-W, 00:09:54.360 |
But that was ugly, that was clearly not right. 00:10:04.200 |
That may have been an option in the software, 00:10:07.000 |
but anyway, we just did that, which worked really well. 00:10:15.680 |
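For readers who never used an early wiki, this small sketch contrasts the two link styles under discussion: automatic CamelCase links versus the explicit double-square-bracket links Wikipedia adopted. The regexes are toy approximations for illustration, not the real parser.

```python
# Toy illustration of the two wiki link conventions: CamelCase auto-links
# (e.g. NewYorkCity) versus explicit [[bracketed]] links. Not the real parser.
import re

CAMEL_CASE = re.compile(r"\b(?:[A-Z][a-z]+){2,}\b")        # matches NewYorkCity
BRACKETED = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]+)?\]\]")  # matches [[New York City|NYC]]

text = "Early wikis linked NewYorkCity automatically; Wikipedia writes [[New York City|NYC]]."

print("CamelCase link targets:", CAMEL_CASE.findall(text))  # ['NewYorkCity']
print("Bracketed link targets:", BRACKETED.findall(text))   # ['New York City']
```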
But the thing that didn't occur to me to even think about 00:10:21.240 |
was that on the standard German keyboard, there is no square bracket. 00:10:30.800 |
cut and paste a square bracket when they could find one, 00:10:35.480 |
And yet, German Wikipedia has been a massive success, 00:10:44.720 |
How do you live life to its fullest without square brackets? 00:10:50.880 |
I mean, maybe it does now, because keyboard standards 00:10:57.680 |
I mean, it's the same thing, like there's not really 00:11:01.600 |
And it wasn't on keyboards, or I think it is now, 00:11:04.760 |
but in general, W is not a letter in Italian language, 00:11:18.000 |
- The discussion of square brackets in German. 00:11:20.200 |
- On both the English and the German Wikipedia, 00:11:28.000 |
So Wikidata's fascinating, but even the broader discussion 00:11:31.800 |
of what is an encyclopedia, can you go to that 00:11:35.480 |
sort of philosophical question of what is an encyclopedia? 00:11:40.480 |
So the way I would put it is an encyclopedia, 00:11:44.640 |
what our goal is, is the sum of all human knowledge, 00:11:53.740 |
I mean, somebody started uploading the full text 00:12:00.880 |
that's not an encyclopedia article, but why not? 00:12:11.200 |
because they said, no, an encyclopedia article 00:12:13.320 |
about Hamlet, that's a perfectly valid thing, 00:12:20.420 |
but there are some interesting quirks and differences. 00:12:29.700 |
traditionally, it would be quite common to have recipes, 00:12:33.120 |
which in English language, that would be unusual. 00:12:34.840 |
You wouldn't find a recipe for chocolate cake in Britannica. 00:12:38.940 |
And so I actually don't know the current state, 00:12:41.980 |
haven't even thought about that in many, many years now. 00:12:44.620 |
- State of cake recipes in Wikipedia, in English Wikipedia? 00:12:47.680 |
- I wouldn't say there's chocolate cake recipes. 00:12:49.560 |
I mean, you might find a sample recipe somewhere. 00:12:52.220 |
I'm not saying there are none, but in general, no. 00:13:12.000 |
- Like a canonical recipe for cake, chocolate cake. 00:13:13.600 |
- A canonical recipe is kind of difficult to come by, 00:13:20.800 |
For something like chocolate cake, you could probably say, 00:13:26.440 |
But for many, many things, the variants are as interesting 00:13:44.240 |
- Well, just to throw some numbers, as of May 27th, 2023, 00:13:49.240 |
there are 6.66 million articles in the English Wikipedia 00:13:54.760 |
containing over 4.3 billion words. 00:14:09.120 |
I mean, it doesn't because I know those numbers 00:14:13.240 |
But in another sense, a deeper sense, yeah, it does. 00:14:19.040 |
I remember when English Wikipedia passed 100,000 articles 00:14:41.240 |
when we did our annual conference there in Boston, 00:14:44.240 |
someone who had come to the conference from Poland 00:14:49.800 |
had brought along with him a small encyclopedia, 00:14:59.160 |
So short biographies, normally a paragraph or so 00:15:26.040 |
Don't know, probably not in English Wikipedia, 00:15:34.400 |
when we're talking about how many entries there are 00:15:39.940 |
is this very deep philosophical issue of notability, 00:15:51.040 |
So sometimes people say, oh, there should be no limit, 00:15:54.280 |
but I think that doesn't stand up to much scrutiny 00:15:57.400 |
So I see in your hand there, you've got a BIC pen, 00:16:08.080 |
So could we have an entry about that BIC pen? 00:16:28.740 |
Anything that's long and plastic, that's what they make. 00:16:37.860 |
- But could we have an article about that very BIC pen 00:16:47.760 |
And the answer is no, there's not much known about it, 00:16:53.960 |
and your great-grandmother gave it to you or something, 00:17:03.400 |
I mean, in German Wikipedia, they used to talk about 00:17:06.120 |
the rear nut of the wheel of Uli Fuchs' bicycle, 00:17:10.840 |
Uli Fuchs, the well-known Wikipedian of the time, 00:17:14.320 |
to sort of illustrate, like you can't have an article 00:17:18.900 |
what can you have an article about, what can't you? 00:17:21.280 |
And that can vary depending on the subject matter. 00:17:24.060 |
One of the areas where we try to be much more careful 00:17:31.520 |
The reason is a biography of a living person, 00:17:34.720 |
if you get it wrong, it can actually be quite hurtful, 00:17:41.200 |
and somebody tries to create a Wikipedia entry, 00:17:44.480 |
there's no way to update it, there's not much known. 00:17:46.020 |
So for example, an encyclopedia article about my mother, 00:17:49.660 |
my mother, school teacher, later a pharmacist, 00:17:57.960 |
that's probably made it in somewhere, standard example. 00:18:02.960 |
And you could sort of imagine a database of genealogy, 00:18:09.760 |
and certain elements like that of private people, 00:18:17.240 |
is what we call BLP1E, we've got lots of acronyms. 00:18:17.240 |
And the type of example would be a victim of a crime. 00:18:31.640 |
So someone who's a victim of a famous serial killer, 00:18:35.120 |
but about whom, like really not much is known. 00:18:39.920 |
We really shouldn't have an article about that person. 00:18:43.320 |
and maybe the specific crime might have an article. 00:18:49.560 |
That's not really something that makes any sense, 00:19:05.160 |
in a different context, because for an academic, 00:19:11.760 |
what papers they've published, things like that. 00:19:13.840 |
You may not know anything about their personal life, 00:19:15.600 |
but that's actually not encyclopedically relevant 00:19:18.040 |
in the same way that it is for a member of a royal family, 00:19:28.800 |
And I've always thought that the term notability, 00:19:34.380 |
I mean, we struggle about how to talk about it. 00:19:38.840 |
The problem with notability is it can feel insulting 00:19:46.080 |
She's a really important person in my life, right? 00:19:56.120 |
- It so happens that there's a Wikipedia page about me, 00:20:01.660 |
And the first thought I had when I saw that was, 00:20:16.580 |
to all the incredible people that are part of creating 00:20:24.620 |
the collection of articles that Wikipedia has created. 00:20:28.140 |
We'll talk about the various details of that. 00:20:31.480 |
But the love and care that goes into creating pages 00:20:40.420 |
for all this kind of stuff, is just really incredible. 00:20:43.060 |
So I just felt the love when I saw that page. 00:20:46.300 |
But I also felt, just 'cause I do this podcast, 00:20:48.820 |
and I just through this podcast gotten to know 00:20:51.600 |
a few individuals that are quite controversial. 00:20:54.740 |
I've gotten to be on the receiving end of something quite, 00:20:58.060 |
to me as a person who loves other human beings, 00:21:02.440 |
I've gotten to be on the receiving end of some kind of attacks 00:21:09.160 |
Like you said, when you look at living individuals, 00:21:15.540 |
And because I've become friends with Elon Musk, 00:21:22.600 |
but I've also interviewed people on the left, far left, 00:21:26.920 |
people on the right, some people would say far right. 00:21:32.860 |
you put your toe into the cold pool of politics, 00:21:48.760 |
I think what you also realize is there has to be 00:21:53.760 |
for Wikipedia kind of credible sources, verifiable sources. 00:21:58.960 |
And there's a dance there because some of the sources 00:22:09.920 |
such that people can write articles that are not factual, 00:22:35.220 |
Just to clarify, I am a research scientist at MIT. 00:22:42.960 |
I'm at a prestigious, amazing laboratory called LIDS. 00:22:53.880 |
And by the way, MIT has been very kind to defend me. 00:22:57.480 |
Contrary to what Wikipedia says, it is not an unpaid position. 00:23:03.660 |
It was all very calm and happy and almost boring research 00:23:10.520 |
And the other thing, because I am half Ukrainian, 00:23:28.600 |
But the little battle about the biography there 00:23:32.400 |
also starts becoming important for the first time for me. 00:23:39.280 |
I use this opportunity of some inaccuracies there. 00:23:49.100 |
I was born in Chkalovsk, which is a town in Tajikistan, not in Russia, 00:23:58.040 |
which is a former Republic of the Soviet Union. 00:24:05.280 |
It's now called Buston, which is funny 'cause we're now in Austin, 00:24:15.560 |
and the rest of the biography is interesting, 00:24:21.640 |
between their origins and where they grew up, 00:24:30.920 |
It's like the fascinating thing about Wikipedia 00:24:34.640 |
is in some sense, those little details don't matter. 00:25:05.280 |
This one, I remember this one, it has this scratch, 00:25:11.580 |
and it's, I mean, maybe it's very silly of me and naive, 00:25:16.580 |
but I feel like Wikipedia, in terms of individuals, 00:25:29.580 |
of the kind of stuff we might see on Twitter, 00:25:32.000 |
like the mockery, the derision, this kind of stuff. 00:25:36.180 |
- And of course, you don't wanna cherry pick, 00:25:39.860 |
but it just feels like to highlight a controversy 00:25:44.800 |
of some sort, one that doesn't at all represent 00:25:47.640 |
the entirety of the human, in most cases, is sad. 00:25:51.720 |
So there's a few things to unpack in all that. 00:25:58.240 |
always find very interesting is your status with MIT. 00:26:08.660 |
But then what's interesting is you gave as much time 00:26:11.140 |
to that, which is actually important and relevant 00:26:13.360 |
to your career and so on, to also where your father 00:26:16.800 |
was born, which most people would hardly notice, 00:26:28.520 |
that no one's gonna notice, like this town in Tajikistan 00:26:34.240 |
Nobody even knows what that means or whatever, 00:26:38.120 |
And so that's one of the reasons for biographies, 00:26:49.520 |
and this is a common debate that goes on in Wikipedia, 00:26:57.240 |
There was an article I stumbled across many years ago 00:26:57.240 |
there was a whole paragraph, a long paragraph, 00:27:33.080 |
It's like, what has this got to do with this guy 00:27:36.160 |
It wasn't even clear, had he done anything hypocritical? 00:27:42.840 |
It wasn't even him, it was his son; his son got a DUI, that's never great, 00:27:49.540 |
So of course, I just took that out immediately. 00:27:53.480 |
And that's the sort of thing where, you know, 00:28:03.200 |
So in general, like one of the things we tend to say 00:28:07.440 |
is like any section, so if there's a biography 00:28:21.920 |
Oh, this one's quite short, can I add something? 00:28:26.480 |
And in general, putting it separate from everything else 00:28:30.920 |
and also doesn't put it in the right context. 00:28:35.240 |
there's always potential controversy for anyone, 00:28:38.560 |
it should just be sort of worked into the overall article 00:28:43.120 |
You can contextualize appropriately and so forth. 00:28:53.680 |
But I think for me, one of the most important things 00:28:59.920 |
So yeah, are we gonna get it wrong sometimes? 00:29:05.880 |
you know, sort of reference material is hard. 00:29:11.800 |
you know, to a criticism or a complaint or a concern? 00:29:15.640 |
And if the reaction is defensiveness or combativeness back, 00:29:20.640 |
or if someone's really sort of in there being aggressive 00:29:37.920 |
And sometimes one of the areas where I do think 00:29:47.640 |
but it's like, we know the media is deeply flawed. 00:30:03.240 |
We've seen a real rise in clickbait headlines 00:30:13.640 |
But that makes it a little bit more challenging to say, 00:30:27.640 |
a few years ago, it's been quite a while now, 00:30:36.000 |
And the Mail Online, the digital arm of the Daily Mail, 00:30:46.680 |
but it does tend to run very hyped up stories. 00:30:51.920 |
and go on the attack for political reasons and so on. 00:31:01.800 |
We just said, look, it's probably not a great source. 00:31:04.800 |
You should probably look for a better source. 00:31:06.120 |
So certainly, if the Daily Mail runs a headline saying, 00:31:30.280 |
but they also tend to have vendettas and so forth. 00:31:36.200 |
or is this just the Daily Mail going on a rant? 00:31:38.840 |
- And some of that requires a great community health. 00:31:43.200 |
- Even for me, for stuff I've seen that's kind of, 00:32:14.840 |
Like sometimes, 'cause I also love Stack Overflow, 00:32:17.920 |
Stack Exchange for programming-related things. 00:32:33.680 |
like a particular C# or Java or Python or whatever. 00:32:45.520 |
but a little too much is bad because of the defensiveness. 00:32:54.280 |
and there's this tension that's not conducive 00:32:57.400 |
to improving towards a more truthful depiction 00:33:17.520 |
we have a little section called, did you know? 00:33:22.640 |
And there's a whole process for how things get there. 00:33:25.160 |
And the one that somebody was raising a question about was, 00:33:29.280 |
it was comparing a very well-known US football player, who is black. 00:33:34.280 |
There was a quote from another famous sport person 00:33:39.440 |
comparing him to a Lamborghini, clearly a compliment. 00:33:43.800 |
And so somebody said, actually, here's a study, 00:33:48.600 |
about how black sports people are far more often compared 00:33:53.600 |
to inanimate objects and given that kind of analogy. 00:33:57.040 |
And I think it's demeaning to compare a person to a car, 00:34:01.600 |
But they said, I'm not deleting it, I'm not removing it, 00:34:07.080 |
And then there's this really interesting conversation 00:34:09.160 |
that goes on where I think the general consensus was, 00:34:13.040 |
you know what, this isn't like the alarming headline, 00:34:27.480 |
how to improve our language and not compare sports people 00:34:30.120 |
to inanimate objects and particularly be aware 00:34:32.840 |
of certain racial sensitivities that there might be 00:34:36.400 |
around that sort of thing if there is a disparity 00:34:44.800 |
Like, nobody's saying people should be banned 00:34:53.040 |
the very famous comparison to an inanimate object 00:34:59.480 |
But they're just saying, hey, let's be careful about 00:35:01.640 |
analogies that we just pick up from the media. 00:35:06.480 |
- On the sort of deprecation of news sources, 00:35:09.000 |
really interesting because I think what you're saying 00:35:11.320 |
is ultimately you wanna make an article-by-article decision. 00:35:19.600 |
there's just a lot of hit pieces written about 00:35:35.440 |
It's fascinating to watch because controversy 00:36:03.800 |
Actually, I'm sure we'll end up talking about AI 00:36:09.920 |
and ChatGPT, but just to quickly mention in this area, 00:36:13.720 |
I think one of the potentially powerful tools, 00:36:20.400 |
I've played around with and practiced it quite a lot, 00:36:23.720 |
but ChatGPT-4 is really quite able to take a passage 00:36:37.200 |
Now, it is a bit anodyne and it's a bit cliched. 00:36:41.800 |
So sometimes it just takes the spirit out of something 00:36:45.500 |
It's just like poetic language and you're like, 00:36:51.760 |
I think that sort of thing is quite interesting. 00:36:54.520 |
can you imagine where you feed in a Wikipedia entry 00:37:08.040 |
that is not accurately reflecting what's in the sources. 00:37:14.400 |
It only has to be good enough to be useful to the community. 00:37:17.680 |
So if it scans an article and all the sources and you say, 00:37:28.520 |
well, actually that's probably worth my time to do. 00:37:36.720 |
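As a concrete version of that idea, here is a hedged prototype: hand a model one article passage together with the text of its cited source and ask it to flag claims the source does not support. The model name, prompt, and function are illustrative assumptions, not existing Wikipedia tooling; a human editor still reviews whatever gets flagged.

```python
# Hedged prototype of "scan a passage against its cited source and flag claims
# the source doesn't support." Assumes an OPENAI_API_KEY in the environment;
# the model name and prompt wording are illustrative assumptions only.
from openai import OpenAI

client = OpenAI()

def flag_unsupported_claims(passage: str, source_text: str) -> str:
    prompt = (
        "You are helping volunteer encyclopedia editors check sourcing.\n\n"
        f"Article passage:\n{passage}\n\n"
        f"Full text of the cited source:\n{source_text}\n\n"
        "List any claims in the passage that the source does not support, "
        "or reply SUPPORTED if everything is backed by the source."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```

The output only has to be good enough to be worth an editor's time to review, which is exactly the bar described here.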
get good people to sort of review obscure entries 00:37:44.040 |
and we'll probably talk about language models a little bit 00:37:52.400 |
the journalist actually was very straightforward 00:38:01.840 |
and apologize for the error that GPT-4 generated, 00:38:07.880 |
which the articles are used to write Wikipedia pages. 00:38:18.280 |
where the weasel words and the nuances can get lost 00:38:23.200 |
or can propagate even though they're not grounded in reality. 00:38:27.880 |
Somehow in the generation of the language model, 00:38:31.560 |
new truths can be created and kind of linger. 00:38:40.760 |
which is about how an error gets into Wikipedia, 00:38:47.280 |
but then a lazy journalist reads it, writes it up, and that piece becomes the source. 00:39:08.280 |
which is a magazine published by the Biography TV channel, 00:39:15.440 |
"In his spare time," I'm not quoting exactly. 00:39:20.000 |
"In his spare time, he enjoys playing chess with friends." 00:39:24.400 |
Like, I would like to be that guy, but actually, 00:39:51.160 |
because otherwise, somebody's gonna read that, 00:39:58.000 |
because I've been, I was invited a few years ago 00:40:07.000 |
because they read this biography magazine article. 00:40:21.880 |
a lot of shakeout, a lot of implications of that. 00:40:36.400 |
based on Wikipedia, but your own life trajectory changes 00:40:40.360 |
because of Wikipedia, just to make it more convenient. 00:40:54.760 |
Let's talk about GPT-4 and large language models. 00:40:58.200 |
So they are, in part, trained on Wikipedia content. 00:41:02.400 |
- What are the pros and cons of these language models? 00:41:07.480 |
- Yeah, so, I mean, there's a lot of stuff going on. 00:41:10.040 |
Obviously, the technology's moved very quickly 00:41:12.400 |
in the last six months and looks poised to do so 00:41:20.320 |
part of our philosophy is the open licensing, 00:41:28.640 |
We are a volunteer community and we write this encyclopedia. 00:41:33.000 |
We give it to the world to do what you like with. 00:41:41.280 |
commercially, non-commercially, this is the licensing. 00:41:44.320 |
So in that sense, of course, it's completely fine. 00:41:59.440 |
proper attribution is just good intellectual practice. 00:42:02.440 |
And that's a really hard, complicated question. 00:42:13.000 |
about my visit here, I might say in a blog post, 00:42:16.440 |
you know, I was in Austin, which is a city in Texas. 00:42:21.440 |
I'm not gonna put a source for Austin as a city in Texas. 00:42:24.920 |
I learned it somewhere, I can't tell you where. 00:42:26.920 |
So you don't have to cite and reference every single thing. 00:42:33.720 |
it's just morally proper to give your sources. 00:42:39.520 |
And obviously, you know, they call it grounding. 00:42:43.760 |
So particularly people at Google are really keen 00:42:50.600 |
Any text that's generated, trying to ground it 00:43:08.240 |
in ChatGPT right now is that it just literally 00:43:12.840 |
will make things up just to be amiable, I think. 00:43:17.000 |
It's programmed to be very helpful and amiable. 00:43:19.400 |
It doesn't really know or care about the truth. 00:43:21.600 |
- And get bullied into, it can kind of be convinced into. 00:43:29.320 |
about comparing a football player to a Lamborghini, 00:43:44.480 |
"Can you think of examples where a white athlete 00:43:48.080 |
"has been compared to a fast car inanimate object?" 00:43:53.000 |
And it comes back with a very plausible essay 00:43:55.560 |
where it tells why these analogies are common in sport. 00:44:02.000 |
So it gives me three specific examples, very plausible, 00:44:09.160 |
Googled every single quote and none of them existed. 00:44:12.000 |
And so I'm like, "Well, that's really not good." 00:44:20.560 |
And it's like, "Well, it's kind of a hard thing to Google 00:44:21.960 |
"'cause unless somebody's written about this specific topic, 00:44:24.120 |
"it's, you know, it's a large language model, 00:44:28.020 |
"it can probably piece that together for me." 00:44:30.680 |
So I think, I hope that ChatGPT 5, 6, 7, 00:44:39.280 |
I'm hoping we'll see a much higher level of accuracy 00:44:48.240 |
I think instead of being quite so eager to please 00:45:02.200 |
Like, "I really would like to make you happy right now, 00:45:05.440 |
"but I'm really stretched thin with this generation." 00:45:07.480 |
- Well, it's one of the things I've said for a long time. 00:45:10.320 |
So in Wikipedia, one of the great things we do 00:45:14.880 |
except in a deeper sense for the longterm, I think it is. 00:45:17.840 |
But you know, we'll put up a notice that says, 00:45:20.440 |
"The neutrality of this section has been disputed," 00:45:22.600 |
or, "The following section doesn't cite any sources." 00:45:28.200 |
sometimes I wish the New York Times would run a banner 00:45:30.680 |
saying, "The neutrality of this has been disputed." 00:45:36.840 |
but we thought it's important enough to bring it to you. 00:45:38.520 |
But just be aware that not all the journalists 00:45:40.680 |
are on board with, "Oh, that's actually interesting, 00:45:43.280 |
I would trust them more for that level of transparency. 00:45:45.880 |
So yeah, similarly, ChatGPT should say, 00:45:50.120 |
- Well, the neutrality one is really interesting 00:45:52.800 |
'cause that's basically a summary of the discussions 00:46:03.320 |
It would be nice somehow if there was kind of a summary 00:46:10.040 |
this, lots of wars have been fought on this here land 00:46:18.960 |
Because one of the things I do spend a lot of time 00:46:20.840 |
thinking about these days, and people have found it, 00:46:33.960 |
Because a part of it is we do approach things 00:46:36.080 |
in a non-commercial way, in a really deep sense. 00:46:43.320 |
we're a community whose hobby is writing an encyclopedia. 00:46:48.920 |
If it's not, okay, we might have trouble paying 00:46:53.760 |
And so how do we help the community use these tools? 00:46:57.080 |
What are the ways that these tools can support people? 00:47:01.480 |
I'm gonna start playing with it, is feed in the article 00:47:06.280 |
"Can you suggest some warnings in the article 00:47:08.880 |
"based on the conversations in the talk page?" 00:47:18.280 |
and you can say, "Oh, actually, yeah, it does suggest 00:47:24.640 |
"on a section that has a seven-page discussion in the back," 00:47:40.000 |
in the exploration of this particular part of the topic. 00:47:44.360 |
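A companion sketch of the experiment described here: feed the article and its talk page to a model and ask which sections might deserve a dispute banner. Again, the prompt and model name are assumptions; the result would only ever be a suggestion for human editors to accept or ignore.

```python
# Companion sketch: ask a model which sections of an article might warrant a
# "neutrality disputed" style warning, based only on the talk page. The prompt
# and model name are illustrative assumptions; editors decide what gets tagged.
from openai import OpenAI

client = OpenAI()

def suggest_warnings(article_text: str, talk_page_text: str) -> str:
    prompt = (
        "Here is an encyclopedia article followed by its talk page.\n\n"
        f"ARTICLE:\n{article_text}\n\n"
        f"TALK PAGE:\n{talk_page_text}\n\n"
        "Based only on the talk page discussions, list article sections whose "
        "neutrality or sourcing appears disputed, with one sentence of reasoning each."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```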
It might actually help you look at more controversial pages, 00:47:54.340 |
There could be parts that everyone agrees on, 00:48:00.640 |
- It would be nice to, when looking at those beautiful, 00:48:06.400 |
let me just take in some stuff where everybody agrees on. 00:48:09.160 |
- I can give an example that I haven't looked at 00:48:15.480 |
So the discussion was that they're building something 00:48:20.160 |
in Israel, and for their own political reasons, 00:48:34.920 |
if we give it a moment's thought, like, okay, 00:48:36.640 |
I understand why people would have this grappling 00:48:40.200 |
over the language, like, okay, you wanna highlight 00:48:45.020 |
and you wanna highlight the positive aspects, 00:48:46.440 |
so you're gonna try and choose a different name. 00:48:48.760 |
And so there was this really fantastic Wikipedia discussion 00:48:52.600 |
on the talk page, how do we word that paragraph 00:48:57.360 |
It's called this by Israelis, it's called this 00:48:59.080 |
by Palestinians, and how you explain that to people 00:49:05.520 |
You could easily explain, oh, there's this difference, 00:49:09.480 |
and it's because this side's good and this side's bad, 00:49:15.660 |
let's try and really stay as neutral as we can 00:49:22.920 |
Oh, okay, I understand what this debate is about now. 00:49:36.780 |
but the word conflict is something that is a charged word. 00:49:42.720 |
- Because from the Palestinian side or from certain sides, 00:49:47.520 |
the word conflict doesn't accurately describe the situation 00:50:07.760 |
And in a number of cases, so this actually speaks 00:50:16.540 |
where there is no one word that can get consensus. 00:50:20.280 |
And in the body of an article, that's usually okay, 00:50:28.040 |
of why each side wants to use a certain word, 00:50:30.520 |
but there are some aspects, like the page has to have a title. 00:50:40.160 |
It's like, well, there's different photos, which one's best? 00:50:44.560 |
but at the end of the day, you need the lead photo, 00:50:52.320 |
So, at one point, I have no idea if it's in there today, 00:51:01.540 |
American entrepreneurs, fine, American atheists. 00:51:04.940 |
And I said, hmm, that doesn't feel right to me. 00:51:10.420 |
I mean, I wouldn't disagree with the objective fact of it, 00:51:20.700 |
American atheist activist, 'cause that's their big issue. 00:51:23.760 |
So, Madalyn Murray O'Hair, or various famous people, 00:51:34.480 |
It doesn't really, it's not something I campaign about. 00:51:42.960 |
In this case, I argued, it doesn't need that guy. 00:51:45.960 |
Like, that's not, I don't speak about it publicly, 00:51:50.480 |
I don't campaign about it, so it's weird to put me 00:52:00.300 |
where you're either in the category or you're not. 00:52:03.580 |
And sometimes it's a lot more complicated than that. 00:52:06.220 |
And is it, again, we go back to, is it undue weight? 00:52:09.300 |
If someone who is now prominent in public life 00:52:32.960 |
Because you think, oh, a criminal, technically speaking, 00:52:35.600 |
it's against the law to drive under the influence of alcohol 00:52:39.000 |
and you were arrested and you spent a month in prison 00:52:42.520 |
or whatever, but it's odd to say that's a criminal. 00:52:52.340 |
'cause that was his first sort of famous name, 00:53:00.300 |
even though he was convicted of quite a bad crime 00:53:08.420 |
Yeah, that's actually an important part of his life story, 00:53:13.100 |
and he could have gone down a really dark path 00:53:15.820 |
and he turned his life around, that's actually interesting. 00:53:27.000 |
and to ideas somehow, and those labels stick. 00:53:52.600 |
all of a sudden it's like, oh boy, you're that guy now. 00:53:56.720 |
if you wanna be that guy. - Well, there's definitely 00:54:01.160 |
I think is quite a complicated and tough label. 00:54:04.240 |
I mean, it's not a completely meaningless label, 00:54:08.600 |
before you actually put that label on someone, 00:54:11.360 |
partly because now you're putting them in a group of people, 00:54:14.200 |
some of whom you wouldn't wanna be grouped with. 00:54:19.960 |
- Let's go into some, you mentioned the hot water 00:54:22.460 |
of the pool that we're both tipping a toe in. 00:54:25.860 |
Do you think Wikipedia has a left-leaning political bias, 00:54:29.320 |
which is something it is sometimes accused of? 00:54:34.220 |
And I think you can always point to specific entries 00:54:44.840 |
Anyone can come and challenge and to go on about that. 00:54:57.600 |
And I think, actually, I just, I don't see it. 00:55:05.220 |
and depending on who they are and what it's about. 00:55:16.020 |
and who knows, the full rush of history in 500 years, 00:55:22.340 |
they might be considered to be path-breaking geniuses, 00:55:28.320 |
and they're just unhappy that Wikipedia doesn't report 00:55:33.060 |
And that, by the way, goes across all kinds of fields. 00:55:42.800 |
by a guy who's a homeopath, who was very upset 00:56:08.920 |
But it's, in general, I think we're pretty good. 00:56:22.120 |
on who's saying what, what the views are, and so forth. 00:56:39.600 |
So I always give, it's kind of a humorous example, 00:56:43.580 |
If you read our entries about Japanese anime, 00:56:49.220 |
they tend to be very, very positive and very favorable 00:56:52.720 |
because almost no one knows about Japanese anime 00:56:56.320 |
And so the people who come and spend their days 00:56:58.520 |
writing Japanese anime articles, they love it. 00:57:01.200 |
They kind of have an inherent love for the whole area. 00:57:05.840 |
they'll have their internal debates and disputes 00:57:13.240 |
On anything that people are quite passionate about, 00:57:16.280 |
then hopefully there's quite a lot of interesting stuff. 00:57:20.240 |
So I'll give an example, a contemporary example, 00:57:27.600 |
And that is the question about the efficacy of masks 00:57:34.040 |
And that's an area where I would say the public authorities 00:57:42.600 |
"Whatever you do, don't rush on and buy masks." 00:57:44.940 |
And their concern was shortages in hospitals. 00:57:52.840 |
now everybody's gotta wear a mask everywhere. 00:58:06.280 |
You might as well wear a mask in any environment 00:58:08.600 |
where you're with a giant crowd of people and so forth. 00:58:15.240 |
It's very politicized where certainly in the US, 00:58:19.840 |
you know, much more so, I mean, I live in the UK, 00:58:22.520 |
I live in London, I've never seen kind of on the streets 00:58:25.600 |
sort of the kind of thing that there's a lot of reports 00:58:29.000 |
of people actively angry because someone else 00:58:31.600 |
is wearing a mask, that sort of thing in public. 00:58:44.240 |
I think you'll find more or less what I've just said. 00:58:49.160 |
to this point in history, it's mixed evidence. 00:58:53.080 |
but maybe not as much as some of the authorities said, 00:59:00.000 |
"but I suspect there are people on both sides 00:59:08.520 |
so then hopefully those people who read this can say, 00:59:16.000 |
"Okay, I have my view, but I understand other views, 00:59:19.900 |
"and I do think it's a complicated question," 00:59:22.240 |
great, now we're a little bit more mature as a society. 00:59:43.320 |
of the oppression of a centralized government, 00:59:47.020 |
You're a sheep that follows the mass control, 00:59:52.560 |
the mass hysteria of an authoritarian regime, 01:00:05.460 |
- Exactly, and that whole politicization of society 01:00:31.000 |
because there's mainstream ideas and there's fringe ideas. 01:00:45.120 |
where if you just look at the percent of the population 01:00:50.440 |
or the population with platforms, what they say, 01:00:53.560 |
and then what is a small percentage in opposition to that, 01:01:05.320 |
- Well, I mean, I think we have to try to do our best 01:01:08.240 |
to recognize both, but also to appropriately contextualize, 01:01:21.120 |
because there's not a lot of emotion around it. 01:01:27.240 |
some say the moon's made of rocks, some say cheese. 01:01:31.000 |
That kind of false neutrality is not what we wanna get to. 01:01:35.280 |
That doesn't make any sense, but that one's easy. 01:01:41.880 |
called something like the moon is made of cheese 01:01:44.120 |
where it talks about this is a common sort of joke 01:01:55.600 |
but nobody thinks, wow, Wikipedia's so one-sided, 01:02:01.680 |
it doesn't even acknowledge the cheese theory. 01:02:04.040 |
I'd say the same thing about flat Earth, again. 01:02:08.240 |
- That's exactly what I'm looking up right now. 01:02:12.920 |
We will have an entry about flat Earth theorizing, 01:02:22.200 |
who claim to be flat Earthers are just having a laugh, 01:02:36.160 |
is actually kind of focusing on this history. 01:02:38.160 |
Flat Earth is an archaic and scientifically disproven 01:02:40.480 |
conception of the Earth's shape as a plane or disc. 01:02:43.280 |
Many ancient cultures subscribed to a flat Earth cosmography 01:03:00.280 |
- And how can you fly from South Africa to Perth? 01:03:07.280 |
that's really too far for any plane to make it. 01:03:11.800 |
- What I wanna know is what's on the other side, Jimmy? 01:03:18.000 |
So there's some, I presume there's probably a small section 01:03:25.960 |
'cause I think there's a sizable percent of the population 01:03:28.460 |
who at least will say they believe in a flat Earth. 01:03:47.180 |
but you can get a little silly and ridiculous with it. 01:04:00.380 |
might quibble about this or that in any Wikipedia article, 01:04:03.080 |
but in general, I think there is a pretty good 01:04:05.860 |
sort of willingness and indeed eagerness to say, 01:04:15.820 |
So there's still a lot to unpack in that, right? 01:04:38.180 |
and actually address which authorities have said what 01:05:00.860 |
and it's not even meant to be serious by anyone, 01:05:11.900 |
Like that's just a silly thing to put in that article. 01:05:19.820 |
So this goes into all kinds of things about politics. 01:05:26.100 |
You wanna be really careful, really thoughtful 01:05:29.380 |
about not getting caught up in the anger of our times 01:05:39.760 |
I remember being really kind of proud of the US 01:05:44.340 |
at the time when McCain was running against Obama, 01:05:48.220 |
because I thought, oh, I've got plenty of disagreements 01:05:51.100 |
but they both seem like thoughtful and interesting people 01:05:53.340 |
who I would have different disagreements with, 01:05:59.900 |
and it isn't just sort of people slamming each other, 01:06:04.040 |
- And you're saying Wikipedia also represented that? 01:06:09.180 |
- I hope so, yeah, and I think so, in the main. 01:06:14.820 |
that went horribly wrong, 'cause there's humans involved. 01:06:19.780 |
I would venture to guess, I don't know the data, 01:06:41.340 |
The task for Silicon Valley is to create platforms 01:06:45.860 |
even though there is a bias among the engineers who create it. 01:06:45.860 |
And I think, I believe it's possible to do that. 01:06:53.860 |
You know, there's kind of conspiracy theories 01:07:03.140 |
want to create platforms that are open and unbiased, 01:07:10.220 |
to have all kinds of perspectives battle it out. 01:07:27.580 |
but the right will call it the Democrat Party, 01:07:32.540 |
and the left will call it the Democratic Party. 01:07:45.660 |
Like, I could capitalize a thing in a certain way, 01:07:49.260 |
or I can just take a word and mess with them. 01:08:02.220 |
on political events, on Hunter Biden's laptop, 01:08:07.100 |
how big the censorship of that story is or not. 01:08:09.940 |
And then there's these camps that take very strong points, 01:08:12.700 |
and they construct big narratives around that. 01:08:15.820 |
I mean, a very sizable percent of the population 01:08:18.060 |
believes the two narratives that compete with each other. 01:08:29.180 |
the sweep of history within your own lifetime. 01:08:38.700 |
where people can agree on certain basic facts 01:08:48.700 |
but I also, I'm not sure what the causes are. 01:08:51.700 |
I think I would lay a lot of the blame in recent years 01:09:09.220 |
and they go viral because they're cute and clever. 01:09:22.460 |
'cause he was complaining about Wikipedia or something, 01:09:33.700 |
and people liked it, and it was all very good. 01:09:51.420 |
which is kind of a "let's you and him have a fight," right? 01:09:57.140 |
I was texting with Elon, who was very pleasant to me, 01:10:03.420 |
The reporter might've been a little bit shitty, 01:10:06.220 |
with a snarky, funny response, "Not for sale." 01:10:13.700 |
and you could probably, after that, laugh it off, 01:10:16.780 |
But like, that kind of mechanism that rewards the snark 01:10:26.380 |
You know, like, a series of tweets, you know, 01:10:35.060 |
that assesses the quality of the evidence for masks, 01:10:43.260 |
But, you know, a smackdown for a famous politician 01:10:49.740 |
who also went to a dinner and didn't wear a mask, 01:10:57.740 |
You know, people love to call out hypocrisy and all of that, 01:11:00.300 |
but it's partly what these systems elevate automatically. 01:11:04.420 |
I talk about this with respect to Facebook, for example. 01:11:08.460 |
So I think Facebook has done a pretty good job, 01:11:11.680 |
although it's taken longer than it should in some cases. 01:11:13.800 |
But, you know, if you have a very large following 01:11:24.400 |
They've done, you know, some reasonable things there. 01:11:36.960 |
I make a family example with two great stereotypes. 01:11:50.920 |
all of my uncles in my family were wonderful people, 01:11:57.140 |
Well, so grandma, she just posts like sweet comments 01:12:00.520 |
on the kids' pictures and congratulates people 01:12:07.680 |
And normally, it's sort of at Christmas dinner, 01:12:16.000 |
or, you know, maybe let's not invite him this year, 01:12:25.060 |
But now, grandma's got, you know, 54 followers on Facebook, 01:12:56.000 |
It's hard to improve that if that actually is working. 01:12:59.200 |
If the people who are saying things that get engagement, 01:13:03.500 |
if it's not too awful, but it's just, you know, 01:13:13.860 |
or blockable or banable thing, and it shouldn't be. 01:13:16.740 |
But if that's the discourse that gets elevated 01:13:32.500 |
I'm having a conversation with Mark Zuckerberg a second time. 01:13:41.700 |
You also have worked on creating a social network 01:13:46.580 |
So can we just talk about the different ideas 01:13:49.760 |
that these already big social networks can do 01:13:58.500 |
So I don't, the problem with making a recommendation 01:14:04.500 |
their business model makes it really hard for them. 01:14:07.460 |
And I'm not anti-capitalism, I'm not, you know, great. 01:14:11.300 |
Somebody's got business, they're making money. 01:14:15.800 |
But certain business models mean you are gonna prioritize 01:14:19.620 |
things that maybe aren't that long-term healthful. 01:14:31.440 |
start to prioritize content that's higher quality, 01:14:39.680 |
that seems to be just getting a rise out of people. 01:14:42.160 |
Now those are vague human descriptions, right? 01:14:45.040 |
But I do believe good machine learning algorithms, 01:14:52.760 |
actually, we're not necessarily gonna increase page views 01:15:01.320 |
It's like, you know, if your actions are, you know, 01:15:06.320 |
convincing people that you're breaking Western civilization, 01:15:10.840 |
that's really bad for business in the long run. 01:15:17.420 |
Twitter is the thing that's on people's minds 01:15:23.960 |
And so one of the things that's really interesting 01:15:28.720 |
about Facebook compared to a lot of companies 01:15:32.440 |
is that Mark has a pretty unprecedented amount of power. 01:15:38.440 |
his control of the company is pretty hard to break. 01:15:42.700 |
Even if financial results aren't as good as they could be, 01:15:50.960 |
actually, for the long-term health in the next 50 years 01:15:54.240 |
of this organization, we need to rein in some of the things 01:15:59.080 |
because they're actually giving us a bad reputation. 01:16:02.380 |
So one of the recommendations I would say is, 01:16:05.000 |
and this is not to do with the algorithms and all that, 01:16:11.320 |
I don't think it's their most profitable segment, 01:16:13.880 |
but it's given rise to a lot of deep, hard questions 01:16:21.620 |
by questionable people that push false narratives, 01:16:33.820 |
where people were talking about there were ads run 01:16:45.100 |
the UK can pass proper animal rights legislation. 01:16:48.260 |
We're not constrained by the European process. 01:16:51.580 |
Similarly, for people who are advocates of fox hunting 01:16:58.900 |
So you're telling people what they wanna hear. 01:17:06.780 |
So it used to be that for political advertising, 01:17:09.880 |
you really needed to find some kind of mainstream narrative, 01:17:14.480 |
Mainstream narrative that 60% of people can say, 01:17:20.900 |
It pushed you to sort of try and find some nuanced balance. 01:17:26.820 |
is a tiny little one-on-one conversation with them 01:17:31.060 |
because you're able to target using targeted advertising, 01:17:37.100 |
You just need a really good targeting operation, 01:17:44.680 |
machine learning algorithm data to convince people. 01:17:58.960 |
our political advertising policy hasn't really helped 01:18:05.800 |
and finding reasoned middle ground and compromise solutions. 01:18:17.500 |
recommender systems for the newsfeed and other contexts 01:18:34.780 |
And you know, so with WT.Social, WikiTribune Social, 01:18:47.780 |
But the idea is to say, let's focus on trust. 01:18:57.200 |
So we'll start with a core base of our tiny community 01:19:03.180 |
We wanna recruit more, but to say, you know what, 01:19:05.020 |
actually let's have that as a pretty strong element 01:19:10.920 |
what gets the most page views in this session. 01:19:13.520 |
Let's optimize on what sort of the feedback from people is. 01:19:26.360 |
we're not gonna pursue an advertising business model, 01:19:37.600 |
but in general to say, actually the problem with, 01:19:52.360 |
gives you a different result than paying for HBO, 01:20:01.380 |
And the reason is, you know, if you think about it, 01:20:08.980 |
you're gonna make a comedy for ABC network in the US, 01:20:25.060 |
I'm gonna use the HBO example and an old example. 01:20:37.700 |
So you can get edgier, higher quality in my view, 01:20:44.500 |
It's gotta be for everybody, which is really hard. 01:20:46.860 |
So same thing, you know, here in a social network, 01:20:55.220 |
I think it drives you in a different direction. 01:20:56.780 |
I actually, and I've said this to Elon about Twitter Blue, 01:21:17.860 |
that does change your broader incentives to say, 01:21:21.100 |
actually, are people gonna be willing to pay for something 01:21:23.600 |
that's actually just toxicity in their lives? 01:21:31.160 |
And maybe I'm wrong about that as a plausible business model. 01:21:35.340 |
But I do think it's interesting to think about 01:21:38.100 |
just in broad terms, business model drives outcomes 01:21:46.620 |
- So if we can just link on Twitter and Elon, 01:21:52.700 |
about the underlying business model, Wikipedia, 01:21:54.940 |
which is this brilliant, bold move at the very beginning. 01:21:57.820 |
But since you mentioned Twitter, what do you think works? 01:22:05.380 |
but to start with, one of the things that I always say is, 01:22:12.780 |
I said this about the old ownership of Twitter 01:22:20.820 |
and this is true actually for all social media, 01:22:27.980 |
You can write whatever the hell you want, right? 01:22:33.980 |
but again, it's just like an open-ended invitation 01:22:38.020 |
And what makes that hard is some people have really toxic, 01:22:40.700 |
really bad, you know, some people are very aggressive. 01:22:43.140 |
They're actually stalking, they're actually, you know, 01:22:45.620 |
abusive, and suddenly you deal with a lot of problems. 01:22:49.860 |
Whereas at Wikipedia, there is no box that says, 01:23:06.260 |
If you go on the talk page of the Donald Trump entry 01:23:08.220 |
and you just start ranting about Donald Trump, 01:23:13.860 |
like there's a whole world of the internet out there 01:23:20.660 |
- Well, also on Wikipedia, people are gonna say, stop. 01:23:46.100 |
we've lost trust in all kinds of institutions and politics. 01:23:56.500 |
And trust in politicians, trust in journalism, 01:24:07.460 |
- And does that also include trust in the idea of truth? 01:24:18.540 |
And the idea of uncomfortable truths is really important. 01:24:43.260 |
somebody accused me of horrible crimes on Twitter. 01:24:53.700 |
I don't really, you know, I brush it off, whatever. 01:24:57.260 |
Like, accusing me of pedophilia, like, that's just not okay. 01:25:05.780 |
And there's five others, and I go through the process. 01:25:08.740 |
And then I get an email that says, you know, whatever, 01:25:11.500 |
a couple hours later, saying, thank you for your report. 01:25:16.100 |
Then several hours further, I get an email back saying, 01:25:26.780 |
And he emails back roughly saying, yeah, sorry, Jimmy. 01:25:40.900 |
He'll listen to me 'cause he's got an email from me, 01:25:52.660 |
Are they getting the same kind of like really poor result 01:25:56.820 |
So fast forward a few years, same thing happens. 01:26:05.620 |
I'm only 10 years old, and Jimmy Wales raped me last week. 01:26:10.660 |
So I report, I'm like, this time I'm reporting, 01:26:12.660 |
but I'm thinking, well, we'll see what happens. 01:26:15.020 |
This one gets even worse because then I get the same result, 01:26:19.860 |
an email back saying, sorry, we don't see any problems. 01:26:21.980 |
So I raise it with other members of the board who I know, 01:26:24.220 |
and Jack, and like, this is really ridiculous. 01:26:29.020 |
And some of the board members, friends of mine, 01:26:36.940 |
from the general counsel, head of trust and safety saying, 01:26:44.100 |
We don't regard, and gave reference to the Me Too movement. 01:26:50.060 |
the Me Too movement, it's an important thing. 01:27:05.140 |
So even back then, by the way, they did delete those tweets, 01:27:08.500 |
but I mean, the rationale they gave is spammy behavior. 01:27:13.780 |
it was just like, oh, well, they were retweeting too often. 01:27:23.500 |
I'm sure it's not working for private people who get abuse. 01:27:31.380 |
Well, it hasn't happened to me since Elon took over, 01:27:35.820 |
And I suspect now if I send a report and email someone, 01:27:40.820 |
'cause he's gotten rid of a lot of the trust and safety staff. 01:27:43.980 |
So I suspect that problem is still really hard. 01:27:56.860 |
to say actually making specific allegations of crimes, 01:28:09.620 |
here's who you should call, the police, the FBI, 01:28:15.340 |
And then you do face really complicated questions 01:28:17.660 |
about Me Too movement and people coming forward in public 01:28:23.060 |
probably you should talk to a journalist, right? 01:28:24.980 |
Probably there are better avenues than just tweeting 01:28:27.940 |
from an account that was created 10 days ago, 01:28:38.140 |
- And there's also ways to indirectly or more humorously 01:28:41.100 |
or a more mocking way to make the same kinds of accusations. 01:28:48.300 |
'cause they're not funny enough or cutting enough. 01:28:50.780 |
But if you make it witty and cutting and meme it somehow, 01:28:55.540 |
sometimes actually indirectly making the accusation 01:29:00.420 |
that can go viral and that can destroy reputations. 01:29:09.740 |
- No, I mean, I remember another case that didn't bother me 01:29:17.860 |
"I'm sure you're making millions off of Wikipedia." 01:29:22.820 |
I'm like, "No, actually, I don't even work there. 01:29:31.140 |
which is the US form for tax reporting for charities. 01:29:35.220 |
I was like, "Yeah, I'm not gonna, here's the link. 01:29:37.700 |
"Go read it and you'll see I'm listed as a board member 01:29:45.820 |
"Okay, that one, that feels like you're wrong, 01:29:52.580 |
And again, it didn't go viral because it was kind of silly. 01:29:54.840 |
And if anything would have gone viral, it was me responding. 01:30:00.340 |
"because a lot of people don't know that I don't work there 01:30:03.400 |
"and that I don't make millions and I'm not a billionaire." 01:30:10.420 |
But the other one I didn't respond to publicly 01:30:16.420 |
You know, it's like sometimes calling attention 01:30:19.700 |
who basically has no followers and so on is just a waste. 01:30:25.100 |
is just something that all of us have to kind of learn 01:30:33.620 |
it hurts just as much as when you have a large number. 01:30:38.420 |
is echoed in the situations of millions of other, 01:30:52.740 |
in the banners anymore on Wikipedia, but we did. 01:31:01.860 |
And, you know, one guy, lovely, very sweet guy, 01:31:05.820 |
and he doesn't look like your immediate thought 01:31:12.060 |
He looks like a heavy metal dude 'cause he's cool. 01:31:15.060 |
And so suddenly here he is with long hair and tattoos 01:31:27.100 |
like calling him creepy and, you know, like really massive. 01:31:30.420 |
And this was being shown to 80 million people a day. 01:31:39.140 |
And I thought, you know what, there is a difference. 01:31:43.340 |
I get huge benefits from being in the public eye. 01:31:48.820 |
I can write and get it published in the New York Times 01:32:00.180 |
yeah, actually suddenly being thrust in the public eye 01:32:05.340 |
which normally, you know, if you're a teenager 01:32:12.780 |
because it's local and it's your classmates or whatever. 01:32:19.700 |
in some abusive way, it's really, really quite tragic. 01:32:24.020 |
- I don't know, even at a small scale, it feels viral. 01:32:27.220 |
When five people-- - I suppose you're right, yeah. 01:32:28.900 |
- Five people at your school and there's a rumor 01:32:31.180 |
and there's this feeling like you're surrounded 01:32:33.620 |
and nobody, and the feeling of loneliness, I think, 01:32:36.640 |
which you're speaking to when you don't have a plat, 01:32:39.860 |
when you at least feel like you don't have a platform 01:32:46.140 |
a lot of teenagers definitely feel and a lot of people. 01:32:50.260 |
- And that, I think even when just like two people 01:33:09.620 |
And somehow to correct that, I mean, I think, 01:33:17.380 |
I think we'll be forced to fix that as a society 01:33:23.260 |
because I don't think that problem is necessarily new. 01:33:33.780 |
that says Becky is a slut and spreads a rumor 01:33:41.420 |
that's always been damaging, it's always been hurtful. 01:33:45.020 |
- Those kinds of attacks are as old as time itself. 01:34:00.900 |
I don't know enough about specifically how it's implemented 01:34:08.260 |
the uses I've seen of it I've found quite good. 01:34:29.980 |
Or it's like that kind of quick mental action. 01:34:34.740 |
And then I saw something that I liked and agreed with, 01:34:37.420 |
and then a community note under it that made me think, 01:35:02.180 |
and you've got plenty of followers and all of that. 01:35:23.100 |
does the algorithm just pick the retweet or the, 01:35:28.060 |
it's not even the algorithm that makes it viral. 01:35:39.100 |
I think I looked, he's got 16 million now or whatever. 01:35:50.460 |
So it is kind of nice if you have something else, 01:36:20.340 |
- But also in Facebook, and YouTube has this too, 01:36:39.180 |
Did you see too much of this content already? 01:36:42.100 |
You like it, but you don't wanna see so much of it. 01:36:55.940 |
this actually can create a really powerfully curated 01:37:00.940 |
list of content that is fed to you every day. 01:37:05.420 |
That doesn't create an echo chamber or a silo. 01:37:08.620 |
It actually just makes you feel good in the good way, 01:37:23.340 |
and they said, oh, we're testing a new option, 01:37:34.500 |
but which we have some signals that suggest it's of quality. 01:37:57.700 |
because he's so interesting and he's so thoughtful 01:38:01.920 |
I'm like, actually, I wanna hear him out, right? 01:38:08.960 |
but I'm gonna understand a perspective of it. 01:38:12.620 |
- Yeah, there's this interesting thing on social media 01:38:14.420 |
where people kind of accuse others of saying, 01:38:19.180 |
that you disagree with or ideas you disagree with. 01:38:21.500 |
I think this is something that's thrown at me all the time. 01:38:31.140 |
'cause you have quite a wide range of long conversations 01:38:41.780 |
because what I like is high quality disagreement 01:38:59.300 |
Like there's something called Intelligence Squared Debates. 01:39:05.860 |
With the British accent, everything always sounds better. 01:39:11.800 |
like they're invigorated, they're energized by the debate. 01:39:17.880 |
with basically everybody involved and it's so fun. 01:39:24.440 |
if there's some way for me to click a button that says, 01:39:31.960 |
Sometimes show it to me 'cause I wanna be able to, 01:39:34.720 |
but today, I'm just not in the mood for the mockery. 01:39:42.000 |
I wanna get high quality arguments for the flat Earth. 01:39:46.800 |
because I would see, oh, that's really interesting. 01:40:07.480 |
I never really thought about, let me consider that fully. 01:40:14.160 |
How would you be able to have such consistent perception 01:40:18.200 |
of a physical reality if all of it is an illusion? 01:40:28.760 |
It seems that social media can kind of inspire. 01:40:34.760 |
- Yeah, I talk sometimes about how people assume 01:40:41.560 |
or the sort of arguments are between the party of the left 01:40:47.880 |
And I always say, no, it's actually the party 01:40:49.760 |
of the kind and thoughtful and the party of the jerks 01:40:56.120 |
bring me somebody I disagree with politically 01:41:01.840 |
I give an example of our article on abortion. 01:41:13.640 |
and a kind and thoughtful Planned Parenthood activist, 01:41:16.680 |
and they're gonna work together on the article on abortion, 01:41:31.400 |
but Wikipedia is gonna explain what the debate is about. 01:41:33.680 |
And we're gonna try to characterize it fairly. 01:41:36.800 |
And it turns out like you're kind and thoughtful people, 01:41:42.920 |
quite ideological on the subject of abortion, 01:41:45.720 |
but they can grapple with ideas and they can discuss, 01:41:52.680 |
not because they suppress the other side's views, 01:41:55.260 |
but because they think the case has been stated very well, 01:42:04.160 |
if people understood as much about this as I do, 01:42:07.560 |
You may be wrong about that, but that's often the case. 01:42:12.240 |
that's what I think we need to encourage more of 01:42:21.920 |
- So is it possible if the majority of volunteers, 01:42:25.880 |
editors of Wikipedia really dislike Donald Trump? 01:42:47.280 |
of him as a human being, him as a political leader, 01:42:57.120 |
And I think if you read the article, it's pretty good. 01:43:00.280 |
And I think a piece of that is within our community, 01:43:05.760 |
if people have the self-awareness to understand, 01:43:30.920 |
that are incredibly emotional to some people, 01:43:38.020 |
Maybe, we were discussing earlier, the efficacy of masks. 01:43:42.560 |
I'm like, oh, I think that's an interesting problem, 01:43:45.800 |
but I can help kind of catalog what's the best evidence, 01:43:55.600 |
And I do think, though, in a related framework, 01:44:00.600 |
that the composition of the community is really important. 01:44:07.140 |
Not because Wikipedia is or should be a battleground, 01:44:13.040 |
Like, maybe I don't even realize what's biased, 01:44:15.480 |
if I'm particularly of a certain point of view, 01:44:35.960 |
heavier on tech geeks than not, et cetera, et cetera. 01:44:44.800 |
"Why is it only white men who edit Wikipedia?" 01:44:51.440 |
It's kind of a joke, because the broader principle 01:45:10.120 |
simply because people write more about what they know 01:45:14.840 |
They'll tend to be dismissive of things as being unimportant, 01:45:21.720 |
I like the example, as a parent, I would say, 01:45:33.680 |
actually, we're getting older, the Wikipedians, 01:45:38.360 |
But it's like, if you've got a bunch of 25-year-old 01:45:53.320 |
And somebody did a look at our entries on novelists 01:46:00.600 |
And they looked at the male novelist versus the female. 01:46:03.520 |
And the male novelist had longer and higher quality entries. 01:46:08.040 |
Well, it's not because, 'cause I know hundreds 01:46:11.200 |
of Wikipedians, it's not because these are a bunch 01:46:41.000 |
then these award-winning, clearly important novelists 01:46:48.440 |
oh, we don't like what, a book by Maya Angelou, 01:46:51.680 |
like who cares, she's a poet, that's not interesting. 01:46:55.080 |
No, but just because, well, people write what they know, 01:47:01.760 |
And that's one area where I do think it's really clear. 01:47:23.460 |
But the key is the kind and thoughtful piece. 01:47:28.360 |
outraged by some dramatic thing that's happened on Twitter, 01:47:33.000 |
they come to Wikipedia with a chip on their shoulder, 01:47:35.000 |
ready to do battle, and it just doesn't work out very well. 01:47:41.320 |
I think there's a responsibility on the larger group 01:47:45.360 |
to be even kinder and more welcoming to the smaller group. 01:47:57.560 |
one of the aspects of that that we do think about a lot, 01:48:00.880 |
that I think about a lot, is not about politics. 01:48:05.440 |
It's just like, how are we treating newcomers 01:48:13.480 |
what our philosophy is, but do we live up to that? 01:48:17.160 |
So, you know, the ideal is you come to Wikipedia, 01:48:21.680 |
like one of our fundamental rules is ignore all rules, 01:48:26.440 |
because it kind of piques people's attention, 01:48:29.080 |
like, what the hell kind of rule is that, you know? 01:48:32.080 |
But basically says, look, don't get nervous and depressed 01:48:37.480 |
what's the formatting of your footnote, right? 01:48:50.360 |
but, you know, here's the link to how to format, 01:48:56.320 |
you might want to learn how to format a footnote. 01:48:59.480 |
And to be friendly and to be open and to say, 01:49:02.960 |
and you clearly don't know everything about Wikipedia. 01:49:09.280 |
So people come in and they've got a great big idea 01:49:12.840 |
and they're going to propose this to the Wikipedia community 01:49:20.880 |
And so then ideally you would say to the person, 01:49:25.320 |
Like a lot of people have, and here's where we got to, 01:49:27.840 |
and here's the nuanced conversation we've had about that 01:49:30.720 |
in the past that I think you'll find interesting. 01:49:32.960 |
And sometimes people are just like, oh God, another one, 01:49:34.920 |
you know, who's come in with this idea, which doesn't work. 01:49:42.640 |
But I think it just does require really thinking, 01:49:54.240 |
I just did an interview with Emily Temple-Wood, 01:49:54.240 |
She's just like a great, well-known Wikipedian. 01:50:09.360 |
but is that when she started at Wikipedia, she was a vandal. 01:50:17.560 |
she'd done some sort of vandalized a couple of articles 01:50:21.320 |
and then somebody popped up on her talk page and said, 01:50:25.240 |
"Like, we're trying to make an encyclopedia here." 01:50:31.000 |
She's like, "Oh, right, I didn't really think of it that way." 01:50:33.560 |
She just was coming in as, she was like 13 years old, 01:50:36.560 |
combative and, you know, like having fun and trolling a bit. 01:50:39.680 |
And then she's like, "Oh, actually, oh, I see your point." 01:50:45.600 |
is that you don't just go, troll, block, fuck off. 01:50:51.320 |
Which is, I think the way we tend to treat things 01:50:56.040 |
in real life, you know, if you've got somebody 01:50:58.800 |
who's doing something obnoxious in your friend group, 01:51:06.920 |
"but I think this person is actually quite hurt 01:51:11.600 |
And then they usually go, "Oh, you know what? 01:51:13.800 |
"I didn't, I thought that was okay, I didn't." 01:51:25.600 |
that we're all capable and wanting to be kind to each other. 01:51:30.600 |
And in general, the fact that there's a small group 01:51:33.920 |
of volunteers that are able to contribute so much 01:51:54.360 |
- No, I once was at Wikimania, our annual conference, 01:52:11.040 |
And a friend of mine came and sat at the table, 01:52:12.720 |
and she's sort of been in the movement more broadly, 01:52:16.000 |
Creative Commons, she's not really a Wikipedian, 01:52:18.640 |
'cause she's into Creative Commons and all that. 01:52:22.840 |
I sat down at the table with most of the members 01:52:25.280 |
of the English Language Arbitration Committee. 01:52:27.600 |
And they're a bunch of very sweet, geeky Wikipedians. 01:52:38.960 |
"Like we just had dinner with some of the most powerful 01:52:44.880 |
the final court of appeal in English Wikipedia. 01:52:48.000 |
And thank goodness they're not media moguls, right? 01:52:52.480 |
who are just like well-liked in the community 01:53:01.840 |
- It's like to the degree that geeks run the best aspect 01:53:06.840 |
of human civilization brings me joy in all aspects. 01:53:14.120 |
like programmers, like people that kind of specialize 01:53:18.040 |
in a thing and they don't really get caught up 01:53:30.560 |
- Well, if you've never heard of this or looked into it, 01:53:34.720 |
I read something recently that I didn't even know about, 01:53:44.440 |
Sometimes a country will pass daylight savings 01:53:49.080 |
There's a file that's on all sorts of UNIX-based computers 01:53:49.080 |
and basically all computers end up using this file. 01:53:56.880 |
It's the official time zone file, but why is it official? 01:54:01.640 |
It's like this guy and a group, a community around him. 01:54:07.680 |
and it broke something because he was on vacation. 01:54:18.940 |
Well, they know 'cause they just use this file, 01:54:25.760 |
But there's this one guy and he doesn't get paid for it. 01:54:28.240 |
It's just, he's like, with all the billions of people 01:54:31.760 |
on the planet, he sort of put his hand up and goes, 01:54:36.080 |
- And there's a lot, a lot, a lot of programmers 01:54:38.920 |
listening to this right now with PTSD about time zones. 01:54:43.520 |
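(As a minimal illustration of how that one file gets used in practice: the sketch below assumes Python 3.9+ with the standard zoneinfo module, which on most Unix-like systems reads the IANA tz database being described here. The specific zone names are just examples.)

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; backed by the IANA tz database on most Unix-like systems

# One UTC instant rendered in two zones whose rules come from that single tz data file.
now_utc = datetime.now(ZoneInfo("UTC"))
print(now_utc.astimezone(ZoneInfo("America/New_York")))  # daylight-saving rules applied automatically
print(now_utc.astimezone(ZoneInfo("Asia/Kathmandu")))    # an unusual UTC+5:45 offset, straight from the tz data
```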
And then there, I mean, there's on top of this one guy, 01:54:55.280 |
it's amazing just the packages, the libraries, 01:54:58.080 |
how few people build them out of their own love 01:55:02.000 |
for building, for creating, for community and all of that. 01:55:14.160 |
That thing, that thing needs to be treasured. 01:55:17.000 |
- I met a guy many years ago, lovely, really sweet guy. 01:55:20.720 |
And he was running a bot on English Wikipedia 01:55:25.440 |
that I thought, wow, that's actually super clever. 01:55:27.640 |
And what he had done is, his bot was like spell checking, 01:55:35.720 |
of words that are commonly mistaken for other words. 01:55:39.240 |
They're spelled wrong, so I can't even give an example. 01:55:43.040 |
And so the word is, people often spell it wrong, 01:55:54.840 |
that looks for these words and then checks the sentence 01:56:04.920 |
but buoy and boy, people sometimes type B-O-Y 01:56:11.760 |
So if he sees the word boy, B-O-Y, in an article, the bot checks the sentence for context to see whether it probably should have been buoy. 01:56:31.640 |
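(A rough sketch of that kind of context-checking bot is below. The word list, the regular-expression context clue, and the function name are all hypothetical, since the conversation doesn't spell out how the real bot was built or whether it flagged or fixed the words; this sketch just flags them.)

```python
import re

# Hypothetical list of commonly confused pairs: (word as typed, likely intended word, context clue)
CONFUSABLES = [
    ("boy", "buoy", re.compile(r"\b(water|sea|harbou?r|channel|moor|float)", re.IGNORECASE)),
]

def flag_confusables(sentence: str):
    """Return (typed, suggested) pairs where the surrounding context hints at a mix-up."""
    hits = []
    for typed, suggested, clue in CONFUSABLES:
        if re.search(rf"\b{typed}\b", sentence, re.IGNORECASE) and clue.search(sentence):
            hits.append((typed, suggested))
    return hits

print(flag_confusables("The boy was anchored in the harbour to mark the channel."))
# [('boy', 'buoy')]  -> a human editor would then review the sentence
```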
And that's also part of the openness of the system. 01:56:40.320 |
this is a gift to the world that makes someone go, 01:56:46.080 |
Like I see a little piece of things that I can make better 01:56:54.600 |
- Well, I gotta ask about this big, bold decision 01:56:58.520 |
at the very beginning to not do advertisements 01:57:02.880 |
the philosophy of the business model, Wikipedia, 01:57:11.080 |
So in the US, you know, registered as a charity. 01:57:19.120 |
And the vast majority of the money is from donations, 01:57:34.480 |
- And we have, you know, millions of donors every year, 01:57:50.200 |
there's a lot of reasons why it might not be good. 01:57:52.080 |
And even back then, I didn't think as much as I have since 01:58:08.880 |
yeah, we're really, really keen on community control 01:58:13.360 |
and neutrality, but if we had an advertising-based 01:58:16.400 |
business model, probably that would begin to erode. 01:58:25.720 |
And so things like, I mean, it's easy to think about 01:58:31.120 |
So like if you go to read about, I don't know, 01:58:47.880 |
Or like, do the advertisers have influence over the content? 01:58:57.000 |
But also things like, we don't have clickbait headlines 01:59:13.120 |
None of that kind of stuff, 'cause there's no incentive, 01:59:16.480 |
Also, there's no reason to have an algorithm to say, 01:59:21.440 |
actually, we're gonna use our algorithm to drive you 01:59:26.560 |
We're gonna use the algorithm to drive you to, 01:59:29.160 |
it's like, oh, you're reading about Queen Victoria. 01:59:32.680 |
There's nothing to sell you when you're reading 01:59:42.320 |
there's no incentive for the organization to go, 01:59:58.040 |
but as I say, it was less of a business decision 02:00:11.160 |
like a lot of the ads, that was well before the era 02:00:37.960 |
most, we don't have 100,000 employees and all of that, 02:00:42.280 |
but would we be able to raise money through donations? 02:00:45.800 |
And so I remember the first time that we did, 02:00:54.040 |
was on a Christmas day in 2003, I think it was. 02:01:22.400 |
and have it become a front end server as well. 02:01:31.400 |
And so I was hoping to raise $20,000 in a month's time, 02:01:37.160 |
but we raised nearly 30,000 within two, three weeks time. 02:01:43.160 |
oh, like we put a banner up and people will donate. 02:01:59.440 |
and we've tested a lot of different messaging and so forth. 02:02:05.280 |
I remember one year we really went heavy with, 02:02:16.560 |
So what about the languages of Sub-Saharan Africa? 02:02:20.760 |
So I thought, okay, we're trying to raise money, 02:02:24.280 |
'cause it's really important and near and dear to my heart. 02:02:35.800 |
so let's talk about that, didn't really work as well. 02:02:38.880 |
The pitch that, this is very vague and very sort of broad, 02:02:42.520 |
but the pitch that works better than any other in general 02:02:49.480 |
you use it all the time, you should probably chip in. 02:02:52.280 |
And most people are like, yeah, you know what? 02:02:56.000 |
I use it constantly, and whatever, I should chip in. 02:03:02.400 |
And there's many variants on that, obviously. 02:03:06.840 |
and people are like, oh yeah, Wikipedia, I love Wikipedia, 02:03:13.440 |
why are you always begging for money on the website? 02:03:24.320 |
and Facebook and Microsoft, why don't they pay for it? 02:03:29.520 |
And I'm like, I don't think that's really the right answer. 02:03:38.720 |
Like the best funding for Wikipedia is the small donors. 02:03:46.400 |
But we always are very careful about that sort of thing, 02:03:49.320 |
to say, wow, that's really great and really important, 02:03:56.200 |
because that would just be really quite, yeah. 02:04:01.000 |
- I would love to know how many times I've visited Wikipedia 02:04:07.640 |
that it's the most useful site I've ever used, 02:04:19.520 |
hey, remember all those times your life was made better 02:04:29.960 |
when I could be like, I should be giving a lot here? 02:04:36.040 |
has a similar model, which is, they have ads, 02:04:46.660 |
oh, this is your 134th article you've read this year. 02:04:58.240 |
if they just don't feel like guilty and then think, 02:05:06.360 |
- I guess that's the thing I could also turn on, 02:05:11.720 |
and this speaks to our social media discussion, 02:05:14.960 |
Wikipedia unquestionably makes me feel better about myself 02:05:22.900 |
if I spend time on Twitter, sometimes I'm like, I regret. 02:05:32.080 |
- My number of regretted minutes on Wikipedia is like zero. 02:05:51.160 |
- Yeah, I gave her a Media Contributor of the Year award 02:05:51.160 |
- So I, yeah, so that's the kind of interesting point 02:06:02.180 |
that I don't even know if there's a competitor. 02:06:05.660 |
There may be this sort of programming Stack Overflow 02:06:05.660 |
It's probably because of the ad driven model, 02:06:14.460 |
because there's an incentive to pull you into clickbait, 02:06:23.340 |
And I also like Stack Overflow, although I wonder, 02:06:23.340 |
and I don't have enough time to do it, but I do. 02:06:34.100 |
And I'm not very good at it, so therefore I end up 02:06:47.540 |
because I can often find the answer clearly explained, 02:06:51.280 |
and it just, it works better than sifting through threads. 02:06:56.060 |
because I do love Stack Overflow and their community. 02:06:56.060 |
I mean, I'm assuming, I haven't read anything 02:07:03.620 |
and they're thinking about how can we sort of use 02:07:14.480 |
and actually get an answer that's based on the answers 02:07:33.520 |
Like obviously all the things we've talked about, 02:07:41.160 |
some policies about it, but roughly speaking, 02:07:43.720 |
there's always more nuance, but roughly speaking, 02:07:45.840 |
it's sort of like you, the human, are responsible 02:07:51.280 |
So if you use ChatGPT, you better check it. 02:07:51.280 |
you know, like, oh, well, I'm not a native speaker 02:08:05.640 |
And I kind of just wanna run my edit through ChatGPT 02:08:05.640 |
- Does it make you sad that people might use, 02:08:25.640 |
So basically use it to answer basic questions 02:08:33.820 |
really comes at the source of it from Wikipedia, 02:08:40.600 |
I mean, part of it is our ethos has always been, 02:08:43.520 |
here's our gift to the world, make something. 02:08:45.520 |
So if the knowledge is more accessible to people, 02:08:49.140 |
even if they're not coming through us, that's fine. 02:08:52.720 |
Now, obviously we do have certain business model concerns, 02:08:57.640 |
where we've had more conversation about this, 02:09:02.880 |
if you ask Alexa, you know, what is the Eiffel Tower, 02:09:08.440 |
and she reads you the first two sentences from Wikipedia 02:09:13.160 |
and they've recently started citing Wikipedia, 02:09:19.200 |
are they gonna donate money, or are they just saying, 02:09:21.040 |
oh, what's Wikipedia for, I can just ask Alexa. 02:09:26.960 |
So we do think about that, but it doesn't bother me 02:09:36.640 |
like literally just hacked together over a weekend 02:09:48.520 |
so he used the OpenAI's API just to make a demo, 02:09:53.520 |
ask a question, why do ducks fly south for winter? 02:10:03.080 |
or I might start looking in Wikipedia, I don't know. 02:10:08.240 |
what are some Wikipedia entries that might answer this? 02:10:15.200 |
answer this question based only on the information in this. 02:10:19.760 |
and it kind of prevented them making stuff up. 02:10:21.840 |
It's just he hacked together over the weekend, 02:10:27.240 |
oh, okay, so now we've got this huge body of knowledge 02:10:30.720 |
that in many cases, you're like, oh, I'm really, 02:10:38.520 |
and it's gonna take me through her life and so forth. 02:10:43.520 |
But other times you've got a specific question, 02:10:45.640 |
and maybe we could have a better search experience 02:10:51.760 |
get your specific answer that's from Wikipedia, 02:10:54.200 |
including links to the articles you might wanna read next. 02:10:59.040 |
like that's just using a new type of technology 02:11:01.560 |
to make the extraction of information from this body of text more effective. 02:11:21.920 |
for more mathematical knowledge, that kind of stuff. 02:11:30.160 |
like the moment you start actually taking you 02:11:32.320 |
to like journalist websites, like news websites, 02:11:38.080 |
- Yeah, yeah, yeah. - It's getting a little-- 02:11:45.720 |
- And you need somebody to have filtered through that 02:11:48.080 |
and sort of tried to knock off the rough edges, yeah. 02:11:53.480 |
And I think, you know, I think that kind of grounding 02:12:00.120 |
is, I think they're working really hard on it. 02:12:03.640 |
And that actually, when I, so if you ask me to step back 02:12:06.800 |
and be like very business-like about our business model 02:12:13.760 |
'cause everybody's just gonna stop coming to Wikipedia 02:12:15.400 |
and go to ChatGPT, I think grounding will help a lot 02:12:15.400 |
even if it's just the moral properness of saying, 02:12:40.640 |
if the model is fine-tuned, is constantly retrained, 02:12:45.640 |
where if you want to change what the model knows, 02:12:48.600 |
one of the things you should do is contribute to Wikipedia 02:12:54.120 |
- Or elaborate, expand, all that kind of stuff. 02:12:57.240 |
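(To make that weekend-demo idea concrete: a minimal sketch of grounded question answering over Wikipedia, assuming the public MediaWiki search API and the OpenAI Python client. The model name, prompt wording, and function name are illustrative, not what the actual demo used.)

```python
import requests
from openai import OpenAI  # assumes the openai package is installed and OPENAI_API_KEY is set

def ask_grounded(question: str) -> str:
    # 1. Find Wikipedia pages that might answer the question.
    search = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={"action": "query", "list": "search", "srsearch": question,
                "format": "json", "srlimit": 2},
    ).json()
    titles = [hit["title"] for hit in search["query"]["search"]]

    # 2. Pull plain-text intro extracts of those pages.
    extracts = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={"action": "query", "prop": "extracts", "explaintext": 1, "exintro": 1,
                "titles": "|".join(titles), "format": "json"},
    ).json()
    context = "\n\n".join(p.get("extract", "") for p in extracts["query"]["pages"].values())

    # 3. Ask the model to answer using only that retrieved text, to discourage making things up.
    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user",
                   "content": f"Answer the question using only this text:\n{context}\n\nQuestion: {question}"}],
    )
    return reply.choices[0].message.content

print(ask_grounded("Why do ducks fly south for the winter?"))
```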
- You mentioned all of us have controversies, 02:13:13.080 |
- I would say unimportant, not that interesting. 02:13:21.520 |
is I actually think Larry Sanger doesn't get enough credit 02:13:35.120 |
and I disagree with him about a lot of things since 02:13:51.040 |
- So there's a lot of interesting engineering contributions 02:13:57.000 |
what the heck is this thing that we're doing, 02:13:59.360 |
and there's important people that contributed to that. 02:14:03.440 |
- So he also, you said you had some disagreements, 02:14:06.440 |
Larry Sanger said that nobody should trust Wikipedia, 02:14:11.600 |
that there's only one legitimate defensible version 02:14:26.000 |
and what you'll see is a really diligent effort 02:14:33.920 |
you think perspectives are generally represented? 02:14:36.240 |
I mean, it has to do with the kind of the tension 02:14:39.240 |
between the mainstream and the non-mainstream 02:14:45.240 |
Like, to take this area of discussion seriously 02:14:55.800 |
of what Wikipedians spend their time grappling with, 02:15:03.640 |
whether a less popular view is pseudoscience? 02:15:16.080 |
Is it fringe versus crackpot, et cetera, et cetera? 02:15:24.160 |
of grappling with something, and I think we do, 02:15:28.960 |
and I think if anybody said to the Wikipedia community, 02:15:35.040 |
sort of covering minority viewpoints on this issue, 02:15:40.400 |
I don't even understand why you would say that. 02:15:42.360 |
Like, we have to sort of grapple with minority viewpoints 02:15:49.580 |
and this is one of the reasons why, you know, 02:15:52.780 |
there is no magic simple answer to all these things. 02:16:01.780 |
It's like, you know, you've gotta really say, 02:16:06.100 |
And you've always gotta be open to correction 02:16:13.100 |
- I think what happens, again, with social media 02:16:15.140 |
is when there is that grappling process in Wikipedia 02:16:27.640 |
of the oscillation of the grappling and not the correction, 02:16:47.720 |
about the definition of recession in Wikipedia. 02:16:55.200 |
and the accusation was often quite ridiculous and extreme, 02:16:58.240 |
which is under pressure from the Biden administration, 02:17:01.440 |
Wikipedia changed the definition of recession 02:17:07.360 |
but because we're a bunch of lunatic leftists and so on. 02:17:11.400 |
And then, you know, when I see something like that 02:17:13.400 |
in the press, I'm like, oh dear, what's happened here? 02:17:16.460 |
'Cause I always just accept things for five seconds first. 02:17:19.480 |
And then I go and I look and I'm like, you know what? 02:17:20.880 |
That's literally completely not what happened. 02:17:29.620 |
so the traditional kind of loose definition of recession 02:17:38.600 |
within important agencies in different countries 02:17:41.160 |
around the world, a lot of nuance around that. 02:17:43.280 |
And there's other like factors that go into it and so forth. 02:17:45.880 |
And it's just an interesting, complicated topic. 02:18:02.400 |
Wikipedia has changed the definition of recession. 02:18:04.840 |
And then we got a huge rush of trolls coming in. 02:18:10.720 |
and people were told go to the talk page to discuss. 02:18:17.560 |
like this is a really routine kind of editorial debate. 02:18:28.160 |
So there was an article called "The Twitter Files," 02:18:30.760 |
which is about these files that were released 02:18:36.560 |
And what happened was somebody nominated it for deletion. 02:18:46.040 |
this is mainly about the Hunter Biden laptop controversy. 02:18:53.440 |
it takes exactly one human being anywhere on the planet 02:18:57.800 |
And that triggers a process where people discuss it, 02:19:15.600 |
And so nobody proposed suppressing the information. 02:19:19.920 |
It was just like editorially boring internal questions. 02:19:23.960 |
And, you know, so sometimes people read stuff like that 02:19:31.120 |
It's like, well, slow down a second and come and look. 02:19:35.000 |
- Yeah, so I think the right is more sensitive to censorship. 02:19:44.960 |
there's more virality to highlighting something 02:19:49.720 |
that looks like censorship in any walks of life. 02:19:52.240 |
And this moving a paragraph from one place to another 02:19:54.640 |
or removing it and so on as part of the regular grappling 02:19:57.520 |
of Wikipedia can make a hell of a good article 02:20:05.600 |
and surprising to most people because they're like, 02:20:10.080 |
It doesn't seem like a crackpot leftist website. 02:20:12.560 |
It seems pretty kind of dull, really, in its own geeky way. 02:20:20.240 |
because there's a shadowy cabal of Jimmy Wales? 02:20:24.760 |
- You know, I generally, I read political stuff. 02:20:33.760 |
with high profile figures, both in the war in Ukraine 02:20:39.000 |
And I read the Wikipedia articles around that. 02:20:48.480 |
And I find the Wikipedia articles to be very balanced 02:20:51.680 |
and there's many perspectives being represented. 02:20:59.640 |
I mean, it's something I ask myself all the time. 02:21:08.960 |
- I think that's an important question to always ask, 02:21:15.080 |
- No, I think we always have to challenge ourselves 02:21:20.640 |
- Well, you mentioned pressure from government. 02:21:33.100 |
There's also conspiracy theories or accusations 02:21:52.920 |
We've never bowed down to government pressure 02:22:03.680 |
about how different companies respond to this, 02:22:06.800 |
but our response has always been just to say no. 02:22:10.320 |
And if they threaten to block, well, knock yourself out, 02:22:14.400 |
And that's been very successful for us as a strategy 02:22:17.160 |
because governments know they can't just casually threaten 02:22:27.680 |
And that's what a lot of companies have done. 02:22:35.720 |
If you have staff members in a certain country 02:22:44.180 |
Like if Elon said, actually I've got 100 staff members 02:22:50.200 |
and if we don't comply, somebody's gonna get arrested 02:23:06.640 |
I kind of prepared for this 'cause I saw people responding 02:23:09.240 |
to your requests for questions and I was like, 02:23:12.340 |
somebody's like, oh, well, don't you think it was really bad 02:23:15.920 |
And I said, I actually reached out to staff saying, 02:23:17.800 |
can you just make sure I've got my facts right? 02:23:20.240 |
And the answer is we received zero requests of any kind 02:23:25.240 |
from the FBI or any of the other government agencies 02:23:38.360 |
because Wikipedia is written by the community. 02:23:42.840 |
can't change the content of Wikipedia without causing, 02:23:45.520 |
I mean, God, that would be a massive controversy 02:23:51.720 |
I've been to China and met with the Minister of Propaganda. 02:23:56.440 |
We've had discussions with governments all around the world, 02:24:05.400 |
And we think actually having these conversations 02:24:08.240 |
Now, there's no threat of being blocked in the US, 02:24:12.840 |
But in other countries around the world, it's like, okay, 02:24:16.760 |
Let's have the conversation, like let's understand, 02:24:21.320 |
so that you can understand where we come from 02:24:30.560 |
if somebody complains that something's bad in Wikipedia, 02:24:36.800 |
Could be you, could be the government, could be the Pope, 02:24:41.840 |
It's like, oh, okay, well, our responsibility as Wikipedia 02:24:47.120 |
Is there something that we've got wrong in Wikipedia? 02:25:00.920 |
you might call it pressure on social media companies 02:25:06.280 |
or dialogue with, depending on, as we talked earlier, 02:25:09.280 |
grapple with the language, depending on what your view is. 02:25:12.180 |
In our case, it was really just about, oh, okay, right, 02:25:17.200 |
they wanna have a dialogue about COVID information, 02:25:32.280 |
how do you know that Wikipedia is not gonna be pushing 02:25:38.740 |
First, I mean, I think it's somewhat inappropriate 02:25:42.320 |
for a government to be asking pointed questions 02:25:48.340 |
I'm not sure that ever happened because we would just go, 02:25:51.200 |
I don't know, the Chinese blocked us and so it goes, right? 02:25:56.200 |
We're not gonna cave in to any kind of government pressure, 02:25:58.680 |
but whatever the appropriateness of what they were doing, 02:26:03.480 |
I think there is a role for government in just saying, 02:26:09.040 |
Let's think about the problem of misinformation, 02:26:21.720 |
to get a call from a government agency and say, 02:26:25.560 |
yeah, why don't you just fuck off, you're the government. 02:26:37.680 |
with the Chinese government or with organizations 02:26:40.520 |
like CDC and WHO, it's to thoroughly understand 02:26:53.520 |
like whatever the Wikimedia Foundation thinks 02:27:03.360 |
right, we understand you're the World Health Organization 02:27:08.640 |
is to sort of public health, it's about communications. 02:27:17.960 |
- So, it's more about explaining how Wikipedia works 02:27:23.560 |
- It's a battle of ideas and here's how the sources are used. 02:27:39.680 |
I mean, government organizations 02:27:39.680 |
in general have sold themselves to be the place 02:27:44.680 |
you go to for the truth, and that's been called into question over the response to the pandemic. 02:27:55.160 |
So, there were definitely cases of public officials, 02:28:16.920 |
And so, the idea is like, we really need people 02:28:21.920 |
Therefore, we're gonna put out some overblown claims 02:28:26.600 |
because it's gonna scare people into behaving correctly. 02:28:29.400 |
You know what, that might work for a little while, 02:28:34.120 |
because suddenly people go from a default stance 02:28:43.480 |
sort of, I don't know, they've got a vault in Atlanta 02:28:47.240 |
with the last vial of smallpox or whatever it is 02:28:53.600 |
we should actually take seriously and listen to, 02:28:57.640 |
And it's like, okay, and if you put out statements, 02:29:11.120 |
'cause you wanted them to do the right thing. 02:29:21.320 |
a lack of trust in authorities, who are, you know, 02:29:36.400 |
I've been criticized for criticizing Anthony Fauci too hard. 02:29:48.720 |
in the loss of trust in institutions like the NIH, 02:30:02.920 |
because of what I perceive as basic human flaws 02:30:07.920 |
of communication, of arrogance, of ego, of politics, 02:30:13.240 |
Now, you could say you're being too harsh, possible, 02:30:16.120 |
but I think that's the whole point of free speech 02:30:29.880 |
or whoever in the scientific position around the pandemic, 02:30:44.280 |
that saves us from this pandemic and future pandemic 02:30:46.960 |
that can threaten the wellbeing of human civilization. 02:30:53.000 |
And to me, when I talk to people about science, 02:30:57.200 |
in terms of the virology and biology development 02:31:17.280 |
all these things are a little less politicized there. 02:31:20.040 |
And I haven't paid close enough attention to Fauci 02:31:30.160 |
I remember hearing at the beginning of the pandemic 02:31:34.280 |
as I'm unwrapping my Amazon package with the masks I bought 02:31:39.480 |
and I just was like, I want some N95 mask, please. 02:31:47.400 |
because they didn't want there to be shortages 02:31:50.720 |
But there were also statements that masks won't work, 02:31:53.840 |
they're not effective and they won't help you. 02:31:58.200 |
And then later, you're ridiculous if you're not wearing them. 02:32:01.680 |
Like that about-face just lost people from day one. 02:32:06.200 |
- The distrust in the intelligence of the public 02:32:09.160 |
to deal with nuance, to deal with the uncertainty. 02:32:13.600 |
I think this is where the Wikipedia neutral point of view 02:32:20.520 |
And obviously every article and everything we could, 02:32:22.880 |
you know me now and you know how I am about these things, 02:32:27.920 |
we're happy to show you all the perspectives. 02:32:41.760 |
Like you read that and you're gonna be more educated 02:32:56.440 |
were not following the rules they had put on everyone else, 02:33:17.160 |
And you know, it's gonna be tough, but we're gonna, you know. 02:33:20.200 |
So you have that kind of Dunkirk spirit moment 02:33:36.600 |
we've got all chipping together to like, you know what? 02:33:43.000 |
And that generally is gonna mean following the rules, 02:33:50.360 |
like you're not allowed to be in an outside space 02:33:54.440 |
I'm like, I think I can sit in a park and read a book. 02:34:01.280 |
which I would have been following just personally 02:34:03.640 |
of like, I'm just gonna do the right thing, yeah. 02:34:19.080 |
for the survival and the thriving of human civilization. 02:34:28.160 |
There's always been like pretty anti-science, 02:34:41.120 |
So a lot of people, yeah, maybe it's like you say, 02:34:43.880 |
a lot of people are like, yeah, I got vaccinated 02:34:49.320 |
And that's unfortunate 'cause I think people should say, 02:34:53.920 |
And you know, there's also a whole range of discourse 02:34:58.880 |
around if this were a disease that were primarily, 02:35:09.080 |
would have been very different, right or wrong, 02:35:21.760 |
If you're late in life, this was really dangerous. 02:35:26.760 |
And if you're 23 years old, yeah, well, it's not great. 02:35:31.400 |
Like, and long COVID's a thing and all of that. 02:35:33.360 |
But, and I think some of the public communications, 02:35:36.920 |
again, were failing to properly contextualize it. 02:35:41.320 |
Not all of it, you know, it's a complicated matter, 02:35:45.440 |
- Let me read you a Reddit comment that received two likes. 02:36:15.680 |
The idea was that the voting body only wanted articles 02:36:23.000 |
So the Hitler article had to be two to three times better 02:36:26.880 |
and more academically researched to beat the competition. 02:36:31.440 |
For example, the current list of political featured articles 02:36:45.120 |
about political non-biography books worth being featured, 02:36:56.520 |
Do you have any interesting comments on this kind of, 02:37:02.920 |
maybe Hitler because he's a special figure, you know. 02:37:09.800 |
No, I love the comparison to how many video games have been. 02:37:13.840 |
And that definitely speaks to my earlier as like, 02:37:21.720 |
that doesn't necessarily get you the right place 02:37:32.120 |
I woke up one morning to a bunch of journalists in Germany 02:37:39.720 |
because German language Wikipedia chose to have 02:37:42.200 |
as the featured article of the day, swastika. 02:38:04.400 |
and using it in certain ways is illegal in Germany 02:38:11.800 |
actually, part of the point is the swastika symbol 02:38:20.800 |
please don't put the swastika on the front page 02:38:23.200 |
without warning me 'cause I'm gonna get a lot of, 02:38:34.040 |
it is a special topic and you would wanna say, 02:38:36.320 |
yeah, let's be really careful that it's really, really good 02:38:39.440 |
before we do that because if we put it on the front page 02:38:42.200 |
and it's not good enough, that could be a problem. 02:38:45.680 |
There's no inherent reason, like clearly World War II 02:38:56.000 |
Like people, it's a fascinating period of history 02:39:03.520 |
and Karl Marx, yeah, I mean, that's interesting. 02:39:08.280 |
I'm surprised to hear that not more political books 02:39:19.400 |
but I'm trusting, so I think that's probably is right. 02:39:23.280 |
No, I think that piece, the piece about how many 02:39:26.800 |
of those featured articles have been video games 02:39:29.640 |
and if it's disproportionate, I think we should, 02:39:32.520 |
the community should go, actually what's gone, 02:39:56.200 |
So like, I don't know, the last couple of weeks, 02:40:00.080 |
might lead you to think, oh, let's put Succession 02:40:08.440 |
because people also find that interesting and fun. 02:40:30.120 |
big impossible question, what's your favorite article? 02:40:49.320 |
when it was created early in the history of Wikipedia, 02:40:56.160 |
People would just come by and write in any word 02:41:01.640 |
'cause somebody's like, this is just a dumping ground, 02:41:03.720 |
like people are putting all kinds of nonsense in. 02:41:10.200 |
wait a second, hold on, this is actually a legitimate concept 02:41:23.640 |
So then they went through and they meticulously referenced 02:42:00.560 |
according to some, cow is an inherently funny word, 02:42:10.160 |
So it went away and I feel very sad about that. 02:42:16.840 |
amuses me so greatly is because it does highlight 02:42:24.200 |
And you're like, wow, I can't believe somebody wrote 02:42:29.320 |
And sometimes there's a bit of wry humor in Wikipedia. 02:42:40.480 |
- Apparently words with the letter K are funny. 02:42:43.800 |
There's a lot of really well researched stuff on this page. 02:42:49.480 |
And I should mention for depths of Wikipedia, 02:42:57.600 |
- And let me just read off some of the pages. 02:43:05.640 |
- Are two separate non-human underwater settlements 02:43:08.200 |
built by the gloomy octopuses in Jervis Bay, East Australia. 02:43:12.320 |
The first settlement named Octopolis by biologists 02:43:19.360 |
consists of burrows around a piece of human detritus 02:43:19.360 |
Satiric misspelling, least-concern species. 02:43:26.760 |
first observed by the sociologist Scott Feld in 1991 02:44:01.560 |
that makes you wanna look twice, because it sounds implausible at first: 02:44:01.560 |
wouldn't people have about the same number of friends as all their friends? 02:44:06.960 |
So you really wanna dig into the math of that 02:44:11.320 |
and really think, oh, why would that be true? 02:44:22.600 |
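(For anyone who does want to dig into the math, here is the standard short argument, in notation that isn't from the conversation itself: sample a person and everyone is weighted equally, but sample a friend, that is, an endpoint of a random friendship, and people are weighted in proportion to how many friends they have.)

```latex
% d_i = number of friends of person i, over n people
\text{average friends of a random person: } \bar{d} = \frac{1}{n}\sum_i d_i
% picking a random friendship and one of its endpoints weights person i by d_i:
\text{average friends of a random friend: } \frac{\sum_i d_i^2}{\sum_i d_i}
  = \bar{d} + \frac{\operatorname{Var}(d)}{\bar{d}} \;\ge\; \bar{d}
% equality holds only if everyone has exactly the same number of friends
```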
I would love to hear some war stories from behind the scenes. 02:44:28.200 |
in this entire journey you're on with Wikipedia? 02:44:33.960 |
I mean, so part of what I always say about myself 02:44:42.960 |
And so things that other people might find a struggle, 02:44:46.600 |
I'm just like, oh, well, this is the thing we're doing today. 02:44:50.100 |
And it's actually, I'm aware of this about myself. 02:44:52.720 |
So I do like to have a few pessimistic people around me 02:44:57.740 |
Yeah, I mean, I would say some of the hard things, 02:45:03.360 |
like when two out of three servers crashed on Christmas day 02:45:12.040 |
I would say as well, like in that early period of time, 02:45:19.360 |
the growth of the website and the traffic to the website 02:45:26.360 |
and in fact, the healthy growth of the community was fine. 02:45:31.500 |
the nonprofit I set up to own and operate Wikipedia, 02:45:34.340 |
as a small organization, it had a lot of growing pains. 02:45:37.800 |
And that was the piece that's just like many companies 02:45:44.020 |
or many organizations that are in a fast growth. 02:45:50.600 |
and nobody's got experience to do this and all that. 02:46:00.480 |
and growing the website, which is interesting. 02:46:02.760 |
- Well, yeah, it's kind of miraculous and inspiring 02:46:08.680 |
and that has so much kind of productive, positive output. 02:46:20.200 |
'cause you don't wanna mess with a beautiful thing, 02:46:27.200 |
- That they can spring up in other domains as well. 02:46:35.600 |
where it's like all these communities about pop culture, 02:46:39.280 |
mainly sort of entertainment, gaming, and so on, 02:46:46.240 |
And so I went last year to our Community Connect Conference 02:46:51.960 |
And like, here's one of the leaders of the Star Wars wiki, 02:46:55.600 |
which is called Wookieepedia, which I think is great. 02:46:58.360 |
And he's telling me about his community and all that. 02:47:11.040 |
but a lot of the same values are there of like, 02:47:17.560 |
just remember, we're working on a Star Wars wiki together. 02:47:22.200 |
And just kind people, just like geeky people with a hobby. 02:47:26.080 |
- Where do you see Wikipedia in 10 years, 100 years, 02:47:35.840 |
Right, so 10 years, I would say pretty much the same. 02:47:41.840 |
Like, we're not gonna become TikTok, you know, 02:47:57.120 |
a lot more AI-supporting tools, like I've talked about. 02:48:06.920 |
rather than, you know, from our body of work. 02:48:09.400 |
- So search and discovery, a little bit improved. 02:48:37.400 |
is actually super important and super interesting. 02:48:44.040 |
Do you think, there's so much incredible translation work 02:48:56.800 |
of the translation of articles and then build on top of that. 02:49:09.040 |
As it's gotten better, it's tended to be a lot better 02:49:13.280 |
in what we might call economically important languages. 02:49:33.960 |
And now it's like, actually it's pretty good. 02:49:40.080 |
So they're able to use it and they find it useful, 02:49:50.720 |
like there's loads of reasons to invest in English to Spanish 02:49:53.080 |
'cause they're both huge economically important languages, 02:49:56.680 |
So for those smaller languages, it was just still terrible. 02:50:00.200 |
My understanding is it's improved dramatically 02:50:13.160 |
but rather reading and understanding with tokens 02:50:26.760 |
that these smaller language communities are gonna say, 02:50:29.600 |
oh, well, finally I can put something in English 02:50:32.840 |
and I can get out Zulu that I feel comfortable sharing 02:50:36.480 |
with my community because it's actually good enough, 02:50:44.960 |
what will remain to most people an invisible trend, 02:50:47.120 |
but that's the growth in all these other languages. 02:50:54.880 |
- Well, the only thing I say about 100 years is like, 02:51:06.640 |
and financially conservative and careful way. 02:51:12.920 |
every year we put aside a little bit more money. 02:51:18.920 |
That's a completely separate fund with a separate board 02:51:22.640 |
so that it's not just like a big fat bank account 02:51:24.800 |
for some future profligate CEO to blow through. 02:51:30.640 |
of a second order board to be able to access that money. 02:51:38.480 |
So the point of all that is I hope and believe 02:51:41.920 |
that we're building in a financially stable way, 02:51:46.840 |
that we can weather various storms along the way 02:51:49.560 |
so that hopefully we're not taking the kind of risks. 02:51:54.560 |
And by the way, we're not taking too few risks either. 02:51:58.280 |
I think the Wikimedia Foundation and Wikipedia 02:52:02.680 |
if anybody exists in 100 years, we'll be there. 02:52:14.560 |
this sort of enormous step forward we've seen 02:52:30.800 |
but still it's, as someone who's been around technology 02:52:40.640 |
like the first time the internet was like really usable 02:52:51.760 |
- Because I remember AltaVista was kind of cool for a while, 02:52:59.800 |
And so large language model, it feels like that to me, 02:53:17.680 |
and I don't know enough to, I'm not that worried, 02:53:22.240 |
But I do see some like really low hanging fruit, 02:53:26.440 |
So my example is, how about some highly customized spam 02:53:31.440 |
where the email that you receive isn't just like 02:53:38.560 |
misspelled words and like trying to get through filters, 02:53:49.640 |
And it's like suddenly, oh, that's a new problem, 02:53:55.040 |
- Is there a, just on the Wikipedia editing side, 02:54:03.720 |
where larger and larger percentage of the internet 02:54:11.840 |
ask me again in five years how this panned out, 02:54:19.280 |
the value and importance of some traditional brands. 02:54:29.080 |
the Wall Street Journal, from the New York Times, 02:54:36.680 |
and I trust it to whatever extent I might have, 02:54:42.960 |
And if I see a brand new website that looks plausible, 02:54:50.360 |
that may be full of errors, I think I'll be more cautious. 02:55:00.240 |
where major media organizations get fooled by a fake photo. 02:55:08.560 |
was the Pope wearing an expensive puffer jacket, 02:55:18.280 |
oh, so the Pope's dipping into the money, eh? 02:55:33.160 |
So I think people will care about the provenance of a photo. 02:55:45.520 |
even though I don't necessarily think that's the highest, 02:55:51.120 |
they're gonna make sure the photo is what it purports to be, 02:56:06.000 |
and this did happen, is misrepresent the battlefield. 02:56:10.000 |
So like, oh, here's a bunch of injured children. 02:56:17.920 |
That has happened, that has always been around. 02:56:20.960 |
But now we can have much more specifically constructed, 02:56:26.720 |
that if I just see them circulating on Twitter, 02:56:38.760 |
that have a stable, verifiable trust that builds up. 02:56:45.880 |
this organization, that it could be an organization of one, 02:56:49.000 |
as long as it's stable and carries through time 02:56:52.240 |
- No, I agree, but the one thing I've said in the past, 02:56:59.280 |
I think my credibility, my general credibility in the world 02:57:02.320 |
should be the equal of a New York Times reporter. 02:57:12.360 |
That's just like if a New York Times reporter said it, 02:57:15.200 |
I'm gonna tend to think he didn't just make it up. 02:57:17.980 |
Truth is, nothing interesting ever happens around me. 02:57:20.280 |
I don't go to war zones, I don't go to big press conferences, 02:57:39.280 |
And then there is this sort of citizen journalism, 02:57:41.600 |
I don't know if you think of what you do as journalism, 02:57:44.880 |
I kind of think it is, but you do interviews, 02:57:48.920 |
And I think people, like if you come and you say, 02:57:51.800 |
right, here's my tape, but you wouldn't hand out a tape, 02:57:55.160 |
like I just gestured you as if I'm handing you a cassette 02:58:03.120 |
and people aren't gonna go, yeah, how do we know? 02:58:19.080 |
whereas if suddenly, and I've seen this already, 02:58:21.760 |
I've seen sort of video with subtitles in English, 02:58:29.600 |
and Zelensky saying something really outrageous. 02:58:34.200 |
Like, I don't think he said that in a meeting with, 02:58:38.840 |
I think that's Russian propaganda, or probably just trolls. 02:58:42.280 |
- Yeah, and then building platforms and mechanisms 02:58:54.360 |
That means I, this particular human, have signed off on it. 02:58:58.800 |
- And then the trust you have in this particular human 02:59:04.720 |
And then each, hopefully there's millions of people 02:59:10.120 |
And then you could see that there's a certain kind of bias 02:59:14.400 |
So maybe, okay, I trust this person to have this kind 02:59:18.160 |
with this other kind of bias, and I can integrate them 02:59:20.360 |
in this kind of way, just like you said with Fox News 02:59:23.360 |
- Yeah, and the Boston Journal, New York Times, 02:59:24.760 |
like they've all got their, like where they sit. 02:59:39.920 |
So let me ask for you to give advice to young people 02:59:46.120 |
High schoolers, college students, wanting to have 02:59:51.240 |
If you wanna be successful, do something you're really 02:59:54.040 |
passionate about, rather than some kind of cold calculation 03:00:03.680 |
you're probably not gonna be that good at it. 03:00:07.760 |
I also like, so for startups, I give this advice. 03:00:24.840 |
There'll be moments when it's not working out, 03:00:29.360 |
You've gotta persist through some hard times. 03:00:33.080 |
and you've gotta sort of scramble to figure it out, 03:00:43.120 |
but when I pivoted from Nupedia to Wikipedia, 03:00:43.120 |
Now the problem with these two wonderful pieces of advice 03:01:01.200 |
Is this a moment when I need to just power through 03:01:03.560 |
and persist because I'm gonna find a way to make this work, 03:01:11.560 |
But also, I think for me, that always gives me a framework 03:01:26.560 |
but those are choices, I think can be helpful to say, 03:01:42.920 |
and trying to do something that actually doesn't make sense? 03:01:48.560 |
Sometimes it'll be one, sometimes it'll be the other, 03:02:14.200 |
like Wikipedia has lived through the successive explosion 03:02:18.120 |
of many websites that are basically ad-driven. 03:02:23.760 |
Facebook, Twitter, all of these websites are ad-driven. 03:02:26.840 |
And to see them succeed, become these incredibly rich, 03:02:31.840 |
powerful companies that if I could just have that money, 03:02:37.040 |
you would think as somebody running Wikipedia, 03:02:58.400 |
- And then you just kinda stay with it, stick with it. 03:03:07.360 |
I mean, I always like to give an example of MySpace, 03:03:13.360 |
So MySpace was poised, I would say, to be Facebook, right? 03:03:17.640 |
It was huge, it was viral, it was lots of things. 03:03:21.000 |
Kind of foreshadowed a bit of maybe even TikTok, 03:03:23.400 |
because it was like a lot of entertainment content, casual. 03:03:33.520 |
And part of that, I think, was because they were really, 03:03:43.120 |
was like three clicks where you saw three ads. 03:03:45.880 |
And on Facebook, you accept the friend request, 03:03:51.400 |
But what is interesting, so I used to give this example 03:03:53.680 |
of like, yeah, well, Rupert Murdoch really screwed that one 03:03:56.840 |
up, in a sense, maybe he did, but somebody said, 03:04:03.040 |
and it was very profitable through its decline. 03:04:09.000 |
So it wasn't, like, from a financial point of view, 03:04:14.400 |
but on sort of more mundane metrics, it's like, 03:04:20.120 |
- It does, and that is also advice to young people. 03:04:27.280 |
like, when we have our mental models of success, 03:04:33.640 |
and your examples in your mind are Bill Gates, 03:04:37.560 |
Mark Zuckerberg, so people who, at a very young age, 03:04:42.560 |
had one really great idea that just went straight 03:04:45.480 |
to the moon and became one of the richest people 03:05:05.880 |
well, I feel sad because I'm not as rich as Elon Musk, 03:05:20.720 |
even in a really narrow sense, which I don't recommend, 03:05:27.640 |
It's like, if you measure your financial success 03:05:30.040 |
by thinking about billionaires, like, that's heavy, 03:05:44.240 |
oh, how does it feel to not be a billionaire, 03:05:45.760 |
I usually say, I don't know, how's it feel to you? 03:05:53.920 |
The number of bankers that no one's ever heard of 03:05:57.000 |
who live in London, who make far more money than I ever will, 03:06:01.480 |
And I wouldn't trade my life for theirs at all, right? 03:06:15.600 |
Like, yeah, Jimmy, you know, like, here's the situation. 03:06:20.320 |
and while you're there, the president has asked to see you. 03:06:32.840 |
And I'm like, yeah, like, that's really cool. 03:06:38.240 |
I don't do that all the time, but I've done it, 03:06:40.400 |
So, like, for me, that's like arranging your life 03:06:43.360 |
so that you have interesting experiences is just great. 03:06:54.120 |
What do you think is the meaning of this whole thing? 03:07:00.280 |
- I don't think there is an external answer to that question. 03:07:05.280 |
- And I should mention that there's a very good 03:07:12.580 |
- So, I'll have to read that and see what I think. 03:07:14.320 |
Hopefully, it's neutral and gives a wide frame. 03:07:18.440 |
to a lot of different philosophies about meaning. 03:07:30.920 |
They really struggle systematically, rigorously, 03:07:34.840 |
and obviously, a shout out to "The Hitchhiker's Guide" 03:07:38.960 |
No, I think there's no external answer to that. 03:07:43.000 |
I think we decide what meaning we will have in our lives 03:07:53.920 |
if we're talking about thousand years, millions of years, 03:08:16.600 |
send like using lasers to send little cameras 03:08:20.660 |
And he talks a lot in the book about meaning. 03:08:24.160 |
It's like his view is that the purpose of the human species 03:08:27.560 |
is to broadly survive and get off the planet. 03:08:31.360 |
Well, I don't agree with everything he has to say 03:08:35.440 |
that can motivate most people in their own lives. 03:08:45.640 |
should we build generation ships to start flying places? 03:08:49.400 |
And I'm not, even if I could, even if I'm Elon Musk 03:08:57.760 |
But I think it's really interesting to think about. 03:09:02.480 |
it's quite a short little book, reading his book, 03:09:10.440 |
It's like, where is the human species going to be 03:09:15.680 |
And it does make you sort of turn back to Earth and say, 03:09:26.360 |
And therefore, we should really think about sustainability. 03:09:40.040 |
I'm actually excited about AI in this regard, 03:09:48.360 |
But I actually think it is quite interesting, 03:09:50.320 |
this moment in time that we may have in the next 50 years 03:09:54.200 |
to really, really solve some really long-term human problems, 03:10:02.120 |
Like, the progress that's being made in cancer treatment, 03:10:08.960 |
model molecules and genetics and things like this, 03:10:22.640 |
and certain problems that seem completely intractable today, 03:10:27.280 |
like climate change, may end up being actually not that hard. 03:10:41.480 |
that we can propagate the flame of human consciousness 03:10:46.640 |
And I think another important one, if we fail to do that, 03:10:54.400 |
the full diversity and richness and complexity 03:11:09.360 |
- Just triggered me to say something really interesting, 03:11:12.520 |
which is, when we talked earlier about translating 03:11:26.840 |
But there's, I was in Norway, in Bergen, Norway, 03:11:30.920 |
where every year they've got this annual festival 03:11:32.880 |
called Buekorps, which is young groups drumming, 03:11:32.880 |
and they've been doing it for a couple hundred years 03:11:43.560 |
They wrote about it in the three languages of Norway. 03:11:48.040 |
And then from there, it was translated into English, 03:12:04.900 |
we're already seeing some really great pieces of this. 03:12:29.080 |
where we use a machine to re-dub it in English 03:12:33.400 |
in an automated way, including digitally editing the faces 03:12:38.920 |
And so suddenly you say, oh, wow, like here's a piece of, 03:12:48.160 |
or maybe it's "Succession" just to be very contemporary. 03:12:50.760 |
It's something that really impacted a lot of people, 03:12:53.520 |
And we have literally no idea what it's about. 03:12:56.120 |
And suddenly it's like, wow, you know, like music, 03:13:03.680 |
can suddenly become accessible to us all in new ways. 03:13:14.680 |
in the many different subcultures of Africa, South America. 03:13:21.320 |
One of the arguments I've made with the Chinese government is, by blocking Wikipedia, 03:13:21.320 |
right, you aren't just stopping people in China 03:13:36.880 |
So is there a small festival in a small town in China 03:13:51.480 |
They can't share their culture and their knowledge. 03:13:55.400 |
this should be a somewhat influential argument 03:13:57.600 |
because China does feel misunderstood in the world. 03:14:02.040 |
if you wanna help people understand, put it in Wikipedia. 03:14:06.040 |
That's what people go to when they wanna understand. 03:14:08.360 |
- And give the amazing, incredible people of China a voice. 03:14:16.040 |
I'm such a huge fan of everything you've done. 03:14:19.800 |
- I'm deeply, deeply, deeply, deeply grateful for Wikipedia. 03:14:24.800 |
I donate all the time, you should donate too. 03:14:37.400 |
please check out our sponsors in the description. 03:14:47.520 |
The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge. 03:14:53.160 |
Thank you for listening, and hope to see you next time.