E147: TED goes woke, Canada's Nazi blunder, AI adds vision, plus: who owns OpenAI?
Chapters
0:00 Bestie intros with Coleman Hughes
1:12 Coleman's experience with TED, Understanding TED's ideological shift
15:11 Focusing on class instead of race when enacting policies, reaction to Coleman's talk, institutional takeovers
44:01 "When Virtue Signalling Goes Wrong": Canadian parliament cheers for a Nazi
64:21 OpenAI's big week, informed speculation on Sam Altman's actual ownership of OpenAI
72:39 The next evolution of AI: multimodal and consumer hardware
My girlfriend introduced me to the show like two years ago, and I've been a fan 00:00:12.640 |
Apparently, like many women, she has like, she has a legit concerning 00:00:37.160 |
I need to just psychologically explore this before we get into the real 00:00:43.360 |
By the way, I think you guys missed the second half of my statement. 00:00:48.440 |
Let's get to the, let's get to the meat of the issue. 00:01:19.440 |
I thought we'd start with something pretty crazy. 00:01:26.440 |
So we decided to have him on to talk about the experience. 00:01:31.440 |
They did it to Sarah Silverman for doing comedy at TED. 00:01:34.440 |
Because people at TED are a bunch of virtue signaling lunatics, including 00:01:40.440 |
But Coleman Hughes, if you don't know him, is a writer and podcaster. 00:01:45.440 |
He has a pretty popular podcast called Conversations with Coleman. 00:01:49.040 |
And he did a talk, which I encourage everybody to watch, at TED. 00:02:07.040 |
And maybe you could just share with the audience how you wound up speaking at TED. 00:02:18.040 |
And then the bizarre reaction when they tried to ban and kill your talk after you gave it. 00:02:25.040 |
Yeah, so first, really glad to be on, guys. I'm a fan of the pod. 00:02:28.040 |
So I'll give the short version here. If you want the long version, you can go to The Free Press, 00:02:32.040 |
where I wrote a big summary of what happened there. 00:02:35.040 |
Basically what happened is Chris Anderson invited me to give a TED Talk. 00:02:40.040 |
And I chose the subject of my upcoming book, which is coming out in February. 00:02:47.040 |
And the argument is just, essentially, color blindness. 00:02:51.040 |
This is the idea that you want to treat people without regard to race, 00:02:54.040 |
both in your personal lives and in our public policy. 00:02:57.040 |
And wherever we have policies that are meant to collect and help the most disadvantaged, 00:03:03.040 |
we should preferentially use class as a variable, rather than race. 00:03:19.040 |
For 95% of the people in the audience, it was quite well-received. 00:03:23.040 |
Whether or not they agreed with every point, it was well within the bounds of acceptable discourse. 00:03:29.040 |
There was a very small minority on stage, I could see, that was physically upset by my talk. 00:03:37.040 |
I could see this on stage, yeah, in the moment. 00:03:39.040 |
But, I mean, I'm talking five people in a crowd of almost 2,000. 00:03:43.040 |
So, I expected that, because color blindness is not in vogue today on the left, amongst progressives. 00:03:55.040 |
So I was expecting to field some pushback, and I talked to some critics and so forth. 00:04:01.040 |
But what happened is, what began as just a few people upset, 00:04:06.040 |
began to spiral into a kind of internal staff meltdown at TED. 00:04:13.040 |
So this group called Black at TED asked to speak with me. 00:04:16.040 |
I agreed, and then they said, "Actually, we don't want to talk to you." 00:04:22.040 |
After the conference, Chris emailed me and said, 00:04:25.040 |
"Look, I'm getting a lot of blowback here internally. 00:04:28.040 |
There are people saying we shouldn't release your talk at all." 00:04:31.040 |
And then, over the course of the next month, they came up with a variety of sort of creative solutions 00:04:37.040 |
about how to release my talk in a way that would appease the woke staffers 00:04:41.040 |
that really didn't want it to be released at all. 00:04:44.040 |
And at this point, I had to start kind of sticking up for myself. 00:04:46.040 |
So first, they wanted to attach a debate to the end of my talk and release it as one video, 00:04:53.040 |
which I felt would really send the wrong message. 00:04:55.040 |
It would send the message that this idea can't be heard without the opposing perspective. 00:05:00.040 |
Did they tell you what was problematic about your talk? 00:05:08.040 |
Well, there were no factual problems. It passed the fact-checking team. 00:05:11.040 |
There were no substantive issues with the talk. 00:05:17.040 |
It upset the staff. That was the language that was used. 00:05:26.040 |
You know, I tried to actually have face-to-face conversations with some of these folks. 00:05:35.040 |
So, presumably, many of them were Black, but possibly not all. 00:05:39.040 |
Okay. What do you perceive was the problem with your talk? 00:05:44.040 |
Or what do they perceive the problem with your talk to be? 00:05:46.040 |
So, the last day of the TED conference, they have a town hall. 00:05:50.040 |
People from the audience come and give feedback. 00:05:53.040 |
The town hall opened with two people denouncing my talk back-to-back. 00:05:57.040 |
The first said that it was racist and dangerous and irresponsible. 00:06:02.040 |
And the second guy, who's actually a guy I knew, 00:06:05.040 |
he said that I was willing to have us slide back into the days of separate but equal. 00:06:13.040 |
And I implore anyone to just go online and watch it. 00:06:18.040 |
Decide for yourself whether these criticisms bear any resemblance to reality. 00:06:26.040 |
That, you know, I'm some kind of pro-Jim Crow person. 00:06:30.040 |
They're really, really deranged kinds of criticisms. 00:06:34.040 |
Your talk is up on TED's website and on YouTube, right? 00:06:38.040 |
But part of the controversy was that the number of views seemed to be pretty suppressed. 00:06:45.040 |
Was that discussed with Chris when you talked with him? 00:06:48.040 |
Or do you have a point of view on the suppression of the promotion of the video, 00:06:53.040 |
and how that's affected, you know, how widely the video has been made available to folks? 00:06:58.040 |
Yeah, so in my final call with Chris, he sort of presented this idea about how to release it. 00:07:04.040 |
And he sold it to me as a way to amplify my talk, which I think was kind of some spin. 00:07:09.040 |
He was in a tough position, caught between me and his employees. 00:07:12.040 |
We ultimately decided they would release the talk. 00:07:15.040 |
And then two weeks later, they'd release a debate between myself and this guy, Jamelle Bouie. 00:07:27.040 |
And I kind of mentally had forgotten about the whole situation until Tim Urban, 00:07:32.040 |
who is a popular blogger. 00:07:39.040 |
He's also given the most viewed TED Talk of all time on YouTube. 00:07:43.040 |
Tim noticed that my talk just had a really absurdly low view count, 00:07:48.040 |
like an implausibly low view count on TED's website. 00:07:51.040 |
In mid-August, he tweeted this, saying he believed they were intentionally underpromoting my talk. 00:07:59.040 |
I checked, and all of the five talks surrounding mine, 00:08:03.040 |
they all had between, you know, 450,000 views and 800,000 views. 00:08:13.040 |
So mine had 16% of the low end of the range of all the talks released around mine. 00:08:18.040 |
So when that happened, I felt that TED had kind of reneged on its end of our bargain, 00:08:23.040 |
and that's when Bari Weiss got wind of it, and I went public. 00:08:28.040 |
Just to be clear, you're saying that the condition for releasing your TED Talk, 00:08:34.040 |
the bargain you struck with Chris, was that you would do a debate with someone in a separate video, 00:08:41.040 |
and that you had to do the debate in order to have your TED Talk released? 00:08:47.040 |
So yeah, that was the end of the negotiation. 00:08:49.040 |
The beginning of the negotiation was trying to get me to release those things as one video. 00:08:55.040 |
And then next, it was, "We're going to release them as separate videos on the same day." 00:09:00.040 |
And then we agreed on a two-week separation between the two. 00:09:04.040 |
In your experience with TED and your conversations around this matter, 00:09:08.040 |
are you aware of other videos that TED has refused to put out that were a live TED Talk 00:09:15.040 |
at the TED Conference, and they were deemed to be too controversial to be released publicly? 00:09:22.040 |
I don't know the whole history of TED, but nothing like that this year, for sure. 00:09:26.040 |
We can go one of two ways with this, Friedberg. 00:09:28.040 |
Do you want to talk about the substance of the talk, or maybe dig into the culture of TED? 00:09:33.040 |
I want to talk about the substance of the talk in a minute, 00:09:35.040 |
but I think it's worth just sharing my experience with you. 00:09:38.040 |
I started going to TED as an attendee around, I believe, 2007. 00:09:48.040 |
It was an incredible week of my life every year. 00:09:52.040 |
In the early days, I would go there, and I saw new perspectives on technology, 00:09:58.040 |
on the environment, on social change, on all these topics that were not in my day-to-day, 00:10:05.040 |
that I thought were really exciting and awe-inspiring. 00:10:08.040 |
That really was kind of the ethos of TED back in the day, before Chris Anderson took it over: 00:10:12.040 |
to kind of inspire people with new ideas. 00:10:15.040 |
Over the years that I attended TED, I began to observe that many of the talks, 00:10:19.040 |
and I spoke about this very briefly last week as part of my motivation and interest 00:10:24.040 |
but that over time, many of the talks began to take a bit of a social justice turn 00:10:29.040 |
in the sense that there was almost a lecturing happening 00:10:37.040 |
When Donald Trump was elected president in 2016, 00:10:41.040 |
needless to say, most of the audience of TED was not on that side of the voting bloc. 00:10:48.040 |
And what disturbed me the most was that in the three years after he was elected, 00:10:54.040 |
every TED conference had plenty of subjects, plenty of talks, and plenty of conversations 00:11:00.040 |
about why society is falling apart, why Donald Trump is a key root cause of that, 00:11:05.040 |
why so much of him and what he stands for and the people behind him 00:11:11.040 |
There wasn't a single talk that provided a perspective of why anyone voted for him. 00:11:16.040 |
There was no one that shared a point of view about why this person had come 00:11:21.040 |
to gather half, or more than half, of the votes in the country. 00:11:24.040 |
And I thought that was such an important topic to better understand 00:11:28.040 |
that I was so shocked that it was never part of the discourse at TED. 00:11:31.040 |
I'm not a Republican, I'm not a conservative, and I'm not against social justice issues, 00:11:36.040 |
but I saw TED over time get overtaken with this kind of very one-sided, 00:11:42.040 |
almost bullying type of approach to this is the narrative we want to sell society on 00:11:47.040 |
rather than have a true discourse about the matter. 00:11:50.040 |
I sent a survey response in 2019 after I went to TED and I said, 00:11:56.040 |
"I'm never coming back again. This year did it for me. I'm over it." 00:11:59.040 |
And there was such a lack of diversity of points of view at this conference, 00:12:03.040 |
and so much of this has veered away from inspiring topics and inspiring talks, 00:12:07.040 |
and it became all about fear of technology, it became about social injustice, 00:12:11.040 |
caused by one side of the political spectrum, 00:12:14.040 |
and it really angered and upset me that everyone had become so close-minded at TED. 00:12:18.040 |
And I sent this note and Chris Anderson reached out to me and said, 00:12:21.040 |
"Well, you have a conversation." I went on a Zoom call with him, 00:12:24.040 |
and I spoke with him for an hour and I shared all this, and I said, 00:12:26.040 |
"He's missing so much of what's happening that's optimistic about the world, 00:12:29.040 |
that's optimistic about technology, that's different ways of looking at things." 00:12:33.040 |
And he's kind of created this very narrow-minded view on the topics 00:12:37.040 |
that they want to address and how they want to address them. 00:12:44.040 |
To me, it's almost like the ultimate endgame of this process 00:12:48.040 |
that I've been observing at TED personally for the last 13 years. 00:12:51.040 |
And I just wanted to, you know, the last 15 years, I guess, 00:12:54.040 |
share that story with you and speak publicly about it. 00:12:57.040 |
I very much respect the intention of the people at TED. 00:13:01.040 |
The TED Talks changed my life many times along the way. 00:13:08.040 |
I know plenty of people that have worked there. 00:13:12.040 |
But I think it's such a microcosm and a reflection 00:13:14.040 |
of what's broadly been going on, which is: you either share my opinion or you don't, 00:13:17.040 |
and everyone coalesces around people with the same opinion. 00:13:19.040 |
And then you magnify it and you concentrate it, 00:13:23.040 |
And TED used to be a place for discourse, and it's lost that, 00:13:26.040 |
as have so many other forums for conversation in society and country today. 00:13:31.040 |
Coleman, what's your take on the TED organization, 00:13:34.040 |
you know, pre and post having had this experience? I'm curious. 00:13:38.040 |
Yeah, what you just said, David, I've heard echoed from at least a dozen people 00:13:43.040 |
that have gone to TED or been, you know, in the TED community for 10 years or more. 00:13:50.040 |
They've noticed the exact change that you noticed. 00:13:56.040 |
Is it actually coming top-down from the leadership? 00:14:04.040 |
You know, with Chris Anderson, I would say no. 00:14:06.040 |
I agree. So, like, all my private communications with Chris suggest to me 00:14:10.040 |
that he is just as alive to this problem of ideological capture of institutions as anyone. 00:14:17.040 |
But when it comes to, you know, his own staff, who have really strong feelings, 00:14:23.040 |
who are not pro-free speech, who are not pro-heterodox beliefs and open discourse, 00:14:29.040 |
who literally just don't share that value, you know, it's a very tricky thing with leadership. 00:14:36.040 |
Sometimes you have to simply be the bad guy and say, "I'm sorry. 00:14:39.040 |
These are the values of the institution, and if you're not on board, this is not right for you." 00:14:44.040 |
And my perception is that TED has been captured kind of from the bottom up, 00:14:48.040 |
like many institutions, just from the seeping in of staff that don't share those values 00:14:53.040 |
and the inability of the leadership to actually hold the line for those values. 00:14:58.040 |
Did they tell you that you made them feel unsafe? 00:15:01.040 |
Yes, actually. Actually, yes. People said they felt they were attacked in the audience. 00:15:06.040 |
And, you know, my talk was, again, just look it up on YouTube. It's quite mild. 00:15:13.040 |
Yeah, let's go into the substance. What was your take on it, Chamath? 00:15:15.040 |
I'll just make a statement, which is I think that your talk was superb. 00:15:21.040 |
And just to give you my journey, as a kid that grew up as a refugee on welfare, 00:15:30.040 |
and then to get through every single sort of strata of society, 00:15:34.040 |
I think when I look back, the biggest thing that I struggled with was always confusing. 00:15:41.040 |
When I felt mistreated, I would always direct it at racism. 00:15:47.040 |
It would be my sort of safety blanket, and I would always look at other people as doing that. 00:15:53.040 |
And it was only after I met my wife and spent years and years talking about it 00:16:00.040 |
where I was able to disarm this and see that out of 100 interactions, 00:16:04.040 |
a lot of the time just people are having a bad day. 00:16:07.040 |
Some other percentage of the time, people are actually just being very classist. 00:16:11.040 |
Because racism, it turns out, is like a pretty severe perversion, 00:16:15.040 |
and it's really crazy when you actually see it play out. 00:16:20.040 |
And for me, had I had a framework, if I had your talk when I was in my 20s and 30s, 00:16:26.040 |
I would have spared myself a lot of self-sabotage. 00:16:29.040 |
Because what that does is when you feel these things and you don't have a framework to interpret it, 00:16:33.040 |
or to tolerate the anxiety, I would internalize that anxiety, 00:16:40.040 |
And so if the goal was for me on behalf of my family or on behalf of people like me to make it, 00:16:48.040 |
I would have gotten there much faster had I not gotten in my own way. 00:16:53.040 |
And when I watched your talk, it was incredibly validating for the work that I had done. 00:16:58.040 |
And I had thought to myself, "Man, if I had had him, if he had made that for me when I was 20 years old, 00:17:10.040 |
Because when I think about some of the mistakes I made, 00:17:12.040 |
they were rooted in this specific issue that you touched on. 00:17:17.040 |
And I also want to say that to the extent other people are interested and feel like that, 00:17:21.040 |
they should really listen to what you had to say, because I thought it was eloquently addressed. 00:17:25.040 |
I was a huge, huge, huge fan of what you had to say. 00:17:31.040 |
And especially for someone as young as you, I thought it was just amazing. 00:17:35.040 |
Coleman, let me ask you, what was the reaction from people of color, 00:17:39.040 |
people who've experienced racism, perhaps, to your talk? 00:17:42.040 |
Because you must have gotten a tremendous amount. 00:17:47.040 |
So what was the reaction from people like Chamath or yourself, people of color, 00:17:53.040 |
who maybe have experienced racism on some regular basis, 00:17:56.040 |
and this idea of having colorblindness when we're operating as a society in that goal? 00:18:02.040 |
Which, I'll just point out when I listen to your talk, seems to be exactly what Martin Luther King said. 00:18:10.040 |
So, the stereotype of the reaction is that white people like my talk and people of color don't. 00:18:18.040 |
So that's the stereotype that my critics would like to believe is the reality, 00:18:22.040 |
because then they don't have to confront my arguments. 00:18:26.040 |
The reality is that even at the TED conference, which is a progressive space, 00:18:31.040 |
many, many people of color, black people, South Asian people, came up to me saying, 00:18:37.040 |
"That was an excellent talk for this, that, and the third reason." 00:18:41.040 |
And I think probably for reasons similar to what you were saying, Chamath, 00:18:49.040 |
I've found that oftentimes immigrants of color really resonate with my message. 00:18:56.040 |
I have many, for instance, Jamaican friends that, you know, they view themselves as Jamaican, 00:19:02.040 |
they come to America, and our conversation about race doesn't make very much sense to them. 00:19:10.040 |
It doesn't make sense, for instance, to strongly feel that your racial identity is an aspect of your core inner self, 00:19:19.040 |
that you ought to judge people on the basis of their racial identity, 00:19:24.040 |
that if you're a white person, that you don't have a valid perspective to bear on a conversation, 00:19:31.040 |
or you have to preface every belief by saying, "Well, I'm a dumb white guy, what do I know?" 00:19:36.040 |
This kind of routine that we've gotten into in spaces, rather than just confronting each other as, 00:19:41.040 |
"Hey, I'm Coleman, you're Chamath, you're David, etc. 00:19:45.040 |
Let's all talk about this from the point of view of epistemic equals and have conversations. 00:19:52.040 |
And yeah, you're going to know about stuff I haven't known because of your individual life story. 00:19:56.040 |
I'm going to have experienced stuff that you haven't." 00:19:59.040 |
We may have even experienced racial discrimination. 00:20:01.040 |
We may have stories to tell, but we are starting out fundamentally from the framework of all being human beings 00:20:08.040 |
that can talk to each other, and, you know, we don't have to sort of play act these racial roles 00:20:14.040 |
that have become increasingly in vogue in woke spaces. 00:20:21.040 |
And what's more, you've gotten this thing on the left, you've gotten media institutions that have been taken in by this. 00:20:29.040 |
So, you see New York Times op-eds, like one from, I think, five years ago, titled, 00:20:33.040 |
"Can My Children Be Friends With White People?" 00:20:36.040 |
You've got Robin DiAngelo in her book saying things like, 00:20:39.040 |
"A white person shouldn't cry around a black person because it triggers us." 00:20:44.040 |
It's like this is so the opposite of what it actually feels like to hang out with an interracial and tight-knit group of friends. 00:20:52.040 |
Your racial identity recedes in importance the more you get to know people. 00:20:57.040 |
And I think people in interracial relationships know this. 00:21:03.040 |
So, my message actually resonates with people of all colors. 00:21:07.040 |
That, I think, was one of the most poignant parts of it. 00:21:09.040 |
Sax, you got to watch the talk as well, I believe. 00:21:12.040 |
So, your thoughts on maybe institutions rotting from the inside and maybe even one that's supposed to support ideas, ideas that matter. 00:21:25.040 |
I just want to not use the term "rotting" because I think your point is that it's not good. 00:21:31.040 |
I don't think that's necessarily the case because the point is there's institutional capture that's happened. 00:21:38.040 |
And that institutional capture is almost like a democratic process that we're seeing at companies, that we're seeing at government agencies, 00:21:46.040 |
and that we're seeing in private and non-profit institutions that the individuals that are employed are capturing the organization's ideals. 00:22:00.040 |
It was such a storied institution, you know, in terms of it was a brave institution under Richard Saul Wurman, you know. 00:22:05.040 |
I get it, but I think "rotting" is such a derogatory term in the sense that some of these institutions evolved to be different. 00:22:12.040 |
And that's the only thing I just… I don't want to make it… yeah. 00:22:14.040 |
So, rotting or is it being taken over from the inside out from the bottom up? What are your thoughts? 00:22:20.040 |
I think "captured" is a pretty good word to use. Freeberg used that word. 00:22:24.040 |
Just remember, TED's original mission represented in their tagline was "ideas worth spreading." 00:22:30.040 |
So, it's supposed to be a forum for interesting, worthy ideas that they're going to spread. 00:22:39.040 |
They're basically sandbagging the views, and they didn't want to publish it at all. 00:22:43.040 |
And then when they did agree to publish it, they basically subjected that to a new requirement of putting a rebuttal right by it. 00:22:52.040 |
So, this is not living up to the original mission. 00:22:56.040 |
I want to go to Chris Anderson's response here. 00:22:59.040 |
He wrote this long post on X, which is too long to read here. 00:23:03.040 |
It's a really sort of weaselly, mealy-mouthed defense of what they did. 00:23:12.040 |
I think there's really only one or two sentences that are relevant in terms of explaining this whole thing. 00:23:19.040 |
What he says is that many people have been genuinely hurt and offended by what they heard you say. 00:23:29.040 |
This is not what we dream of when we post our talk. 00:23:31.040 |
So, I think this is really the key intellectual mistake that Chris Anderson's making, is that he believes that people can be genuinely hurt by encountering well-reasoned ideas they disagree with. 00:23:44.040 |
I think the way that the marketplace of ideas is supposed to work is that when you encounter an idea you disagree with, you formulate an equally well-thought-out response. 00:23:55.040 |
And you engage in intellectual discourse and debate. 00:24:00.040 |
But, you know, I think these words are really significant because he's saying not just that the objectors here were offended. 00:24:07.040 |
He was saying that they were hurt, genuinely hurt. 00:24:10.040 |
So, he's buying into this idea that hearing ideas you disagree with is somehow a threat to your safety. 00:24:17.040 |
And as soon as you do that, as soon as you concede that there can be some sort of physical harm from engaging with ideas, you give the equivalent of a heckler's veto to the people who don't like these ideas. 00:24:36.040 |
So, there's no way you can function as a marketplace of ideas and certainly a platform for ideas worth spreading if you're going to give a veto to people who can claim that their subjective emotional reaction to well-thought ideas should trump the right of the speaker to put out that idea. 00:24:59.040 |
Right, exactly. And I think that's where we've ended up. 00:25:02.040 |
Coleman, can I ask your point of view on institutional capture? 00:25:06.040 |
Obviously, this is different than the topics you've spoken about. 00:25:11.040 |
But as you've gone through this experience with TED and as you think more broadly about what's going on, do you have a point of view on the capture of institutions from the bottom up that's happened? 00:25:22.040 |
And how that's affected some of these topics like free speech, sharing of ideas, open discourse, all these foundations that made kind of a free and open society work effectively for so long. 00:25:33.040 |
Yeah, well it's a very difficult problem because it's easy for me from the outside not being the leader of a major institution to say, "Well, this is just what you have to do." 00:25:44.040 |
Obviously, it's more psychologically difficult to go to your own staff that you have to metaphorically live with every day and really shake things up. 00:25:55.040 |
Someone like Barry Weiss who used to be at the New York Times, her point of view on it is, "Look, you just got to start your own institutions. 00:26:04.040 |
You have to start your own institutions with the right ethos from day one." 00:26:09.040 |
And that's what she's tried to do with the free press rather than try to reform institutions that have a lot of unhealthy inertia. 00:26:17.040 |
Chris could have stopped this very easily. I mean, this is a failure of leadership. 00:26:21.040 |
What he needed to tell these employees is, "Look, our mission is to be a platform for spreading interesting ideas. 00:26:28.040 |
And we can't treat this speech differently than any other speech just because you disagree with it." 00:26:35.040 |
And by the way, just because an idea may be offensive does not mean that it should not be spread. 00:26:41.040 |
I think, have you read Jonathan Haidt's book, "The Coddling of the American Mind," Coleman? 00:26:46.040 |
And I think that speaks, and that was the book I gave away in our gift bag at the All-In Summit this year, 00:26:51.040 |
because I thought it was such like an important and kind of prescient point of view on what's going on right now. 00:26:57.040 |
We assume that if something is deemed offensive by some group, could be a large group or a small group, it needs to be suppressed. 00:27:05.040 |
And obviously, as you extend that concept to its extreme, you end up losing many ideas that challenge, you know, the current kind of main concept that everyone believes. 00:27:18.040 |
So, Coleman, just maybe if you can just guess, why when somebody watches this talk, could they feel genuinely hurt? 00:27:29.040 |
Like, if we had to steel man them, let's step in their shoes, like, what's the cycle that's going on there that gets them to, "Oh my God, this is an intolerable point of view"? 00:27:44.040 |
Yeah, I mean, I think there has to be something with, if you're a person that has, you know, staked your life or your career out on the concept of sort of race-based diversity, equity, and inclusion, 00:27:59.040 |
explicitly taking race into account in policies, and, you know, you're someone that's been working in that domain for 30 years, 00:28:08.040 |
and you see someone like me come up there and just argue against that whole approach, 00:28:14.040 |
there may be some severe threat mechanism that comes on board where you actually don't have a rational argument that easily debunks what I'm saying, because what I'm saying is very reasonable. 00:28:28.040 |
And so, in the absence of a great rational argument, when the stakes are high, all the, you know, primal animal emotions sort of come out, your whole limbic system, 00:28:40.040 |
and you feel like you're kind of in a fight-or-flight situation, and you feel incredibly emotional. That's my only guess. 00:28:48.040 |
Yeah, they're hurt, and it's scary to think, what if you win the argument? And if you win the argument, it means certain things might go away. 00:28:56.040 |
I think the two examples they gave you, Chris Anderson came on stage and said, "Oh, you know, when conductors are looking for a new violinist, they put them behind a shade, and they do a colorblind selection process." 00:29:10.040 |
I think Malcolm Gladwell talked about that in Blink. And your response, and then they said, "Well, wouldn't it be better if we could have some representation in that group, so then we would inspire people to get to that group?" Your response to that was? 00:29:24.040 |
Yeah, my response to that was, what you really want to do is, if there are reasons why, say, Black kids aren't getting access to violins at a young age, because schools are underfunded, or band programs are horrible in inner cities, that's where you want to intervene. 00:29:41.040 |
You don't want to intervene at the meritocratic end line, racially rigging the very bar that you would use to measure progress on those deeper dimensions. 00:29:52.040 |
Have you read this book called Losing Ground by Charles Murray? 00:29:58.040 |
I mean, it's a very provocative book. I have always thought, and maybe I'll just leave this with you, because if you were willing to do it, I for one would love to support you in any way that I could to do it. 00:30:09.040 |
But we don't have a full accounting of what really happened starting in the late 1960s with LBJ's war on poverty. 00:30:18.040 |
And I think when you look at racism through the American lived experience, a lot of it goes back to a bunch of economic incentives that were set up to try to do what theoretically seemed at the time the right thing. 00:30:30.040 |
We can debate whether that's where LBJ came from or not. 00:30:33.040 |
But you compound and cascade a bunch of decisions forward and to your point, now we're sort of trying to deal with the symptoms without really addressing the root cause. 00:30:41.040 |
And I think if America wants to really heal and deal with this, what we also need to do is give all those people that have that fight or flight response, the better toolkit to understand what kind of got us here. 00:30:51.040 |
Because right now we have a very charged way of viewing these things without actually looking at some of the practical, quantifiable details. 00:30:59.040 |
Thomas Sowell has talked about it, Charles Murray talks about it. 00:31:02.040 |
But these are unfortunately such heterodox ideas that they just don't get enough mainstream discussion. 00:31:08.040 |
And if you then compound that with this institutional capture, they get buried. 00:31:13.040 |
And so the answer may actually be sitting right in front of our face, where it was the welfare reform system that we implemented in the late 1960s on down the line. 00:31:23.040 |
Because those are structural ways where we can solve it, which ultimately will get to your point, which is great. 00:31:28.040 |
Fund more music in the schools in that example. 00:31:31.040 |
And right now we're so caught up in all of the labels and the fear mongering that we never get to that. 00:31:36.040 |
And so I just wanted to put that out there that I think that there needs to be smart, brilliant people like yourself, young people who can do a full accounting of like the last 50 or 60 years in a much more structural way that these gentlemen tried to do. 00:31:49.040 |
But the ideas were just too heterodox at the time. 00:31:51.040 |
But because of formats like podcasts and like the free press and other things, I think there's a chance that you can actually get these ideas out. 00:31:58.040 |
And I think it's important because I think folks like me or the people that approached you, there's not enough of us that came from this background that are open minded or at a point where we can tolerate the anxiety to listen to your ideas. 00:32:12.040 |
There's a lot of people that may just viscerally react. 00:32:14.040 |
But the more that we can shift those people away from viscerally reacting to actually tolerating and then thinking and then evolving their point of view, you can do some enormous good in the world. 00:32:25.040 |
Just why I just wanted to put that out there. 00:32:28.040 |
Yeah, no, I mean, that's a huge topic and an understudied topic. 00:32:31.040 |
What was the effect of the welfare reforms of the 60s and 70s? 00:32:35.040 |
I know my mother used to say, she grew up in the South Bronx. 00:32:40.040 |
And she used to say, she used to just have stories of, you know, when the welfare auditors would come around and people would hide their boyfriends, hide their husbands, etc. 00:32:51.040 |
And in the book Black Power by Stokely Carmichael, aka Kwame Ture, which is, you know, the manifesto of the Black Power Movement, hardly a right wing source. 00:33:02.040 |
They made the same point about welfare reform. 00:33:04.040 |
So there definitely is something to be investigated there. 00:33:10.040 |
I know Glenn Loury is someone who has really dug into that sort of research, but there's definitely a lot of room for study there. 00:33:19.040 |
Coleman, let me ask you a question about our industry. 00:33:21.040 |
We've had a lot of hand wringing and debates about diversity in funding of startups, capital allocators, venture capital firms. 00:33:32.040 |
And we have limited partners who have a mission to have more diverse general partners, the people at venture firms who invest in startups, invest in more female-led startups, etc. 00:33:44.040 |
Because the numbers, frankly, you know, have not been very diverse historically in venture. 00:33:53.040 |
And we recently had a Black female venture firm, I think it's called the Fearless Fund, get sued. 00:34:00.040 |
I'm not sure if you're aware of that lawsuit. 00:34:03.040 |
Should there be venture firms specifically designed to change the ratio? 00:34:12.040 |
And should, you know, people with large endowments of capital be backing, you know, black venture capitalists to see more of them or female black venture capitalists, Hispanic, etc. 00:34:23.040 |
Or how would you look at that issue, which has been a pretty sticky issue and hasn't changed for a long time? 00:34:29.040 |
So prescriptively, I don't want to say much because I don't like to tell people how to run their funds or run their businesses, right? 00:34:35.040 |
If you're a Christian and you want to hire only Christian people, or if you're a Muslim and you want to hire only Muslims, 00:34:40.040 |
I think you should frankly be allowed to do that if those are your personal values. 00:34:44.040 |
Now, personally, I will tell you, with respect to the people that I would hire to say work on my podcast, I want every single hire to know that I am not hiring them as a result of their skin color or gender or any other contingent feature of their identity. 00:35:04.040 |
I want them to know that I'm hiring them for what they really bring to the table. 00:35:07.040 |
Now, I have a very small team. Maybe there's something about how certain optics are required for a larger firm. 00:35:16.040 |
But I think the problems begin when you sort of bless this idea that race is a super deep feature of who you are right from the start. 00:35:28.040 |
When you bless that idea right from the start, it sends the signal that what people bring to the table is their racial identity, is their gender. 00:35:37.040 |
Now, when you fast forward two years down the line when a company is having some meltdown over a race or a gender issue, you have to understand that it's possible you made this bed by signaling from the very beginning that what's important about the people you're bringing in is their race, is their gender, 00:35:53.040 |
and that you are vulnerable to the kinds of appeals that can be made purely on the basis of what are ultimately superficial features of our identity. 00:36:05.040 |
What would your advice be to institutional leaders that are past that point of no return, the CEOs of big companies and big institutions that are now captive by these ideologies where they are effectively, as you say, ultra sensitive to issues around race and gender and other sort of superficial identities 00:36:27.040 |
and are challenged often to make decisions or driven to make decisions that their employees and teams demand of them? Do you have advice on how they can rethink their roles as leaders and how to reframe this? 00:36:40.040 |
I mean, in a word, no, because by that point, it's an intractable problem. I've talked to CEOs that ask this question to me over and over again, like, "What do I do once I'm past the point where I have so many staff and the system is so sprawling that it's no longer under my control? 00:37:01.040 |
I have so many people with values that I don't share that I frankly think privately are insane, but I cannot say so publicly because I have higher order commitments to the shareholders, to the board, to steer the ship, right, such as it is, and the ship cannot be changed at this point." I don't have good advice. I'm not going to pretend that I do. 00:37:21.040 |
Do you think that same problem is inherent in political parties in the United States, states, state governments, and other larger kind of social systems that we use to organize ourselves and are now also captive and in kind of a point of no return? 00:37:37.040 |
I think definitely in the Democratic Party, there has been a problem with mistaking the Twitter commentariat and the journalistic elite for real life. The truth is the vast majority of even Democrat voters find my arguments around colorblindness totally uncontroversial, whether they have some disagreements or not. 00:38:00.040 |
But if you ask the elite, there's a meltdown, right? There's just this huge discrepancy. And it can never be hammered home enough the extent to which people in politics are operating in a bubble, mistaking the elite and the Twittersphere for the wider population. 00:38:19.040 |
I mean, this feels to me like why Donald Trump got elected, but that's another topic. 00:38:24.040 |
This has been amazing. Everybody take a moment, search for Coleman Hughes, subscribe to his YouTube channel, type in Coleman Hughes TED. Coleman, you do a podcast? 00:38:33.040 |
Yeah, I do a podcast, Conversations with Coleman. Actually, David Sacks was on the podcast about a year ago. 00:38:45.040 |
Was it to talk about Ukraine? 00:38:58.040 |
I saw you had the Dilbert guy on and I thought that was a pretty engaging, interesting conversation. 00:39:06.040 |
Yeah, Scott Adams, who is, you know, really controversial. And I thought you handled that one really well, too. 00:39:10.040 |
Yeah, thanks. He's an interesting one. It's like he has a lot of brilliant things to say, but also he said recently on Twitter that he maybe thinks the CIA is going to kill him. It's a mixed bag. 00:39:19.040 |
It's a mixed bag would be where I would go with it. All right, listen, this has been amazing. The TED Talk is extraordinary. Everybody should watch it. And yeah, ideas worth spreading, unless maybe you don't agree with them. Go to the TED channel and watch it. 00:39:36.040 |
Sorry, I mean, I don't want to give TED too much more time. But they tried to get me to pay $50,000 a year, $25,000 a year for like a five-year package to go to the event. And I was like, yeah, 00:39:49.040 |
Regular tickets used to be $7,500. Then I think they went up to $10K. And then you can do like donor tickets and you get different features and so on. 00:40:01.040 |
Remember, it is set up as a nonprofit, and there is philanthropic work that's done. And so, you know, the organization is, again, not a profiteering media company; it became a big media company because of the success of the efforts and the quality of the content that was produced over time. But, you know, as we talked about, a lot of media companies and a lot of institutions get captured and 00:40:25.040 |
you know, the original kind of mission gets fulfilled. 00:40:39.040 |
What other ideas, what other talks have been canned before they even got to the stage? You have to wonder. 00:41:20.040 |
So, the hypocrisy is just so crazy with the TED people. And a lot of my friends still go. They had Sarah Silverman come; these people have laughed at Sarah Silverman a million times, they've watched Dave Chappelle, they've seen any number of comedians, you know, make them laugh with edgy humor. But then when they're in that, you know, TED audience, they're feeling super precious and that they're very important because they donate 50 grand a year, or whatever Friedberg gave them, I don't know, to get in there from the side door. 00:41:49.040 |
Then they were super offended. So, you know, they're hypocrites, and I don't know how to say it any more clearly. You could literally pull up Chris Anderson apologizing, not just once but again and again. I really hope this is a learning experience for everyone. I hope that this is a turning point for leadership and institutions like this to take a look at what happened, how it happened. And then hopefully to right the course, because organizations like TED, I thought, 00:42:18.040 |
were very important and should be in the world and should be successful. And I hope that they kind of return to the original values. And I hope that this is a moment that there's a learning experience, that we don't just shit on them and say they're awful, they're failed, it's over. Hopefully something comes of this. 00:42:31.040 |
I do think there is one other potential remedy here, which besides just starting a new TED, and kind of the Barry Weiss point of view, which is just write it off and start over. Remember what Brian Armstrong did at Coinbase, he basically just said, Listen, we have a mission here. It's around crypto, we're going to focus 100% on this mission. And if you're not on board with this mission, or want to capture this institution to promote other missions, this is not the place for you. Go do those missions somewhere else. 00:42:58.040 |
And it worked. He took a hit. The New York Times wrote their obligatory hit piece. But a year later, they were free to focus on this mission. 00:43:06.040 |
I would say if Chris has good mentors, as well as a good sounding board, that is the threshold question that should be debated right now: do I walk in the door and just give this simple litmus test and have people sign up or not? And it's quite easy, because to your point, it's not like he's inventing something new. He's saying this is where we started, and this is where we're going to stay, and this is what it means. 00:43:33.040 |
And if he doesn't do that, then he's spoken with his actions. And it is what it is. What is meant to happen will then happen, exactly. And it's a moment for looking at the internal compass. It's a wholesale leadership reset moment, an opportunity. See if it happens or not. 00:43:50.040 |
I really appreciate your being public about all this and talking about it. It's been a great conversation. 00:43:57.040 |
All right, thank you. We'll see you soon. Cheers now. 00:44:00.040 |
All right. Listen, it's a new segment we have here: When Virtue Signaling Goes Wrong. If you missed it, the Canadian Parliament gave a standing ovation to a Nazi, not like a neo-Nazi or a Nazi sympathizer, but one of the few actual Nazis still alive. Here we see just the crowd going wild. 00:44:20.040 |
Last Friday, Ukrainian President Zelensky gave a speech at the Canadian House of Commons, and Canadian House Speaker Anthony Rota introduced 98-year-old Yaroslav Hunka as a Ukrainian war hero. And then the Canadian Parliament proceeded to give him a standing ovation. 00:44:42.040 |
And it turns out that this person fought for the First Ukrainian Division in World War Two. That unit was also known as the Waffen-SS Galicia Division, if I'm pronouncing that correctly, which was a voluntary unit under Nazi command. 00:45:05.040 |
So the Canadian Parliament apparently gave a standing ovation to Nazis. They have apologized for this and said it was a mistake. Chamath, I don't know if you got to see this. You're Canadian. So your thoughts on what we've seen here. 00:45:19.040 |
I mean, I'll give you my feedback as somebody who, when I was, you know, in Canada, was a pretty ardent Liberal. I grew up in a Liberal household; my father canvassed religiously for the Liberals. 00:45:37.040 |
And I think that at some point after I moved to the United States, they took wokeism, which, look, I think at some level was rooted in something very important, which was: how do you get marginalized folks to be seen? 00:45:54.040 |
But unfortunately, along the way, just got perverted by folks that just use it as a cudgel to censor people to make other people feel guilty to judge people. And so I think we all would agree that it's kind of become this virus. 00:46:10.040 |
The thing is, it masks all of these other really bad things that come along with it. And one of them in Canada, which Justin Trudeau is case zero of, is when nepotism goes bad. 00:46:23.040 |
His father was an incredible, exemplary Prime Minister of Canada. He set the benchmark on all dimensions, was just incredible, cool, composed, moved the country forward, brought the country together. 00:46:37.040 |
And then fast forward 25 or 30 years, and in a vacuum of leadership, what basically happened is we picked this guy, who was up until that point a substitute teacher, and whose other claim to fame was appearing twice in brownface. 00:46:50.040 |
Okay, so making fun of people like me, and elected him Prime Minister. 00:46:56.040 |
And what happened was he became the sort of, like, virtue-signaler-in-chief of this very important country. 00:47:06.040 |
And it was all kind of bumbling along. And in the absence of anybody else that was able to step up and offer an alternative, he got reelected barely, but he did. 00:47:16.040 |
And so many of these things happened in the last year. And when you look through that prism, you can see what happens if a country doesn't draw a line and finally take a stand. 00:47:26.040 |
So we had this guy who was ill qualified and way over his head, who shouldn't have been in this role as Prime Minister get put in that position. 00:47:35.040 |
And finally, a group of people in Canada pushed back, in this case the truckers, and he and the entire government explicitly labeled them as Nazis, right, and said these people need to be put down and completely dismantled. 00:47:51.040 |
It didn't seem like it was right. We called that out. We all talked about it. And we said, this doesn't smell right on the surface. These really seemed like good, earnest people that are just trying to make a point and are not being heard. 00:48:03.040 |
And we had this thing two or three weeks ago, where he actually gave a speech in front of the entire Parliament where he accused the largest democracy in the world, India in this case, of coming onto Canadian soil and assassinating a Canadian citizen, which is an enormous allegation to levy. 00:48:24.040 |
And what was important to know about that allegation was that it was made without the explicit vocal support of either Britain or the United States, which would be the two most natural allies that Canada would present that information to. And instead of doing it behind closed doors with Modi, he did it on a live stage, like it was some theatrical performance. 00:48:46.040 |
Then India follows up and says this guy's kind of known to be a little bit of a drug addict and was on a two-day bender, and the Indian drug dog smelled a bunch of cocaine on the plane. Then they have this thing with Volodymyr Zelensky, where everybody was there to sort of, like, virtue signal this war, and then they actually invited a Nazi and gave him a standing ovation. 00:49:09.040 |
So when you put it all together, I think what it shows is just the lack of professionalism, which also betrays just the lack of experience and capability. And so I think it shows, like, isn't this enough? Like, have we not seen enough of these examples that you can actually start to ask yourself, why can't we just get really good, competent people to do these jobs? 00:49:33.040 |
Why can't we actually embrace free speech and all of what it means and explore that? Why can't we have people that don't need to theatrically perform on stage, because eventually, you're going to make these mistakes, and you're going to embarrass your entire country. And then you're going to imperil relations with some really important allies. And I think this is a moment in time where all of those things need to be questioned and put on the table. 00:49:54.040 |
And you're clearly questioning his competence here, because to not have the care to check who is going to speak in front of Parliament is crazy. And just to make it super clear, the Speaker that invited Hunka, that was Anthony Rota, resigned on Tuesday. And Trudeau says Rota, the person who invited the Nazi, is solely responsible. 00:50:18.040 |
And then he blamed Russian misinformation on top of that. But Jason, the Prime Minister, who is the most important politician in the country, doesn't show up someplace unless the office knows who else is going to be there. He knew that Zelensky was going to be there. He would have known who was on the guest list. 00:50:32.040 |
Yeah, this is really him trying to cover it up. But the bigger issue, just to be clear: you're not saying that they invited a Nazi on purpose and cheered for a Nazi on purpose. Nobody's saying that. You're saying there's a lack of care here, and it's a lack of competence. It's a lack of competence, just so we're clear. Yeah. Okay. 00:50:51.040 |
So I agree with all of that. I think there's also two other dimensions to this backstory, if you will. I think first in terms of how does a mistake like this happen? I think it was Orwell who said that he who controls the present controls the past and he who controls the past controls the future. The present is Ukraine, it is the current thing, everybody has to cheer for Ukraine and for the killing of Russians. 00:51:17.040 |
The reason why Hunka was cheered with a standing ovation is because they said that he fought Russians. He was a war hero who fought Russians. All you had to do was a little bit of math to realize the guy's 98 years old. When was there a war against Russia? Who could he possibly have been fighting for? But to the extent people did that math, they sort of airbrushed or whitewashed history. 00:51:38.040 |
So the present controls the past, to ensure a vision of the future, which Trudeau laid out in this speech he gave recently, where he became so ardent in his support for Ukraine, he was almost yelling at the podium, saying that Canada had to make all these economic sacrifices to win the war. So that's point number one: I think that the woke mind virus almost requires this whitewashing of the past, but it's done for a specific purpose, which is to control the present. 00:52:08.040 |
But they're not whitewashing the past if it was a mistake. That intellectually doesn't make sense. 00:52:12.040 |
No, what they did is, what they're saying is... 00:52:16.040 |
The present is that we hate Russia so much that we're going to cheer for anybody who killed Russians. 00:52:24.040 |
Okay, I understand your point. But you're agreeing that they did not knowingly put a Nazi on there, so it was a mistake. Got it. 00:52:29.040 |
I don't think they knowingly did it. It was a huge debacle and embarrassing spectacle. 00:52:34.040 |
But I think that nobody asked any questions about the past because the present overrides it. 00:52:40.040 |
The present need to support the current thing overrides like any sort of examination of what has happened historically. 00:52:48.040 |
There's one other way in which I think this wasn't an accident, Jason, which is that if you look at US policy towards Ukraine, we have made common cause with a number of these far-right, ultra-nationalist groups. 00:53:08.040 |
First of all, if you go back to World War II, the father of Ukrainian nationalism is a guy named Stepan Bandera. 00:53:15.040 |
And today in Ukraine, he is seen as some sort of hero. 00:53:20.040 |
And there are streets named after some of his co-conspirators who collaborated with Nazis. 00:53:25.040 |
If you fast forward to the more recent past, to 2014, when we had this Maidan coup in Kiev that was backed by Victoria Nuland, 00:53:36.040 |
one of the key figures in that coup was a guy named Oleh Tyahnybok, who is the founder of the Svoboda Party, which was originally the Social-National Party. 00:53:48.040 |
Which if you know what Nazi stands for, it's national socialist. 00:53:54.040 |
And the original logo of the Svoboda Party was the Wolfsangel, which was a Nazi insignia. 00:54:00.040 |
This was a far-right party infused with the racial ideology of Stepan Bandera, who was, again, a Nazi. 00:54:10.040 |
And they brought this guy in and his party as the muscle in this coup. 00:54:14.040 |
If you look at the Victoria Nuland phone call, the infamous phone call where she is picking the new Ukrainian government, the "Yats is our guy" phone call, 00:54:23.040 |
she says that Klitsch, meaning Klitschko, and Tyahnybok need to remain on the outside, but Yats needs to be talking to Tyahnybok four times a week. 00:54:32.040 |
Okay. He was part of the chess pieces that they were moving around. 00:54:37.040 |
After the coup, a civil war breaks out in the Donbass because the ethnic Russians there are opposed to this new government, 00:54:44.040 |
and the fact that Yanukovych, who they voted for, was deposed in an insurrection. 00:54:49.040 |
What happens then is a war breaks out where far-right paramilitary organizations like Right Sector and like the infamous Azov Battalion start killing these ethnic Russian separatists. 00:55:01.040 |
And a full-blown civil war breaks out, thousands of people get killed. 00:55:05.040 |
Does the Kiev government suppress these neo-Nazi groups? No, they bring them under the formal command structure of the Ukrainian military. 00:55:15.040 |
Azov Battalion becomes a division of the Ukrainian military. It's shocking. 00:55:25.040 |
So you're saying the Ukrainian army, just to be clear here, has Nazis in it, Nazi supporters? 00:55:30.040 |
There's no question about that. And there were many people who were concerned about this in the 2015 to 2020 time frame. 00:55:37.040 |
There were many articles written about it. The Nation had an article about it. 00:55:41.040 |
There were efforts in Congress at various points to try and ensure that the aid that we were giving to the Ukrainian government did not go to the Azov Battalion. 00:55:51.040 |
Do you think Zelensky is a Nazi or a Nazi sympathizer? 00:55:57.040 |
No, I don't think he's a Nazi. And to be clear, I don't think most Ukrainians are Nazis, and I don't even think that most Ukrainian nationalists are Nazis. 00:56:04.040 |
What I'm saying is that there is a Nazi element in Ukraine that people have whitewashed over. 00:56:10.040 |
Well, here's the thing about it. I don't think it's a huge percentage, but I think they have outsized influence due to their willingness to use violence, due to their extremism. 00:56:19.040 |
Do you think it's any different than the Nazi percentage in, say, whatever you want to say, white supremacists in the United States or in Germany or anywhere else? 00:56:29.040 |
I do. I think it's different in the sense that in the United States, for sure, we have neo-Nazi groups. 00:56:35.040 |
They're not brought into the military. We don't have streets named after their patriarchs. 00:56:40.040 |
Furthermore, we don't have members of our military with Nazi insignia on them. 00:56:45.040 |
There was a New York Times article just a few months ago talking about the fact that embarrassingly, a lot of these Ukrainian soldiers are being photographed with Nazi insignia on their uniforms. 00:56:57.040 |
Now, the New York Times is framing this as a problem because it was a propaganda coup for Putin. 00:57:02.040 |
Presumably it was, but I think it's a problem because it's a problem, not because of just the PR optics of it. 00:57:10.040 |
At various points, I think this is in the New York Times article as well, Western media has had to airbrush these photos to hide this fact. 00:57:18.040 |
Oh, the New York Times has airbrushed photos of Nazi uniforms? 00:57:22.040 |
I don't think the New York Times has, but I think they talk about this thorny problem of not wanting to show these photos. 00:57:29.040 |
With respect to Zelensky being Jewish, what I'd say about that is that Zelensky only came on the scene quite recently. 00:57:40.040 |
Again, I don't think the majority of people in Ukraine are Nazis, so I'm not saying that. 00:57:46.040 |
But just because Zelensky came on the scene in 2019 and was elected president doesn't mean there isn't a long, and I would say disturbing, history and association between Ukrainian ultranationalism and neo-Nazi groups. 00:57:59.040 |
And I think that part of the woke thing and part of this Orwellian desire where control of the present gives you the ability to rewrite the past is that there's been a deliberate effort to cover up this problem and to pretend it doesn't exist and to turn a blind eye to it. 00:58:14.040 |
Well, my point is that US policy has been to do this. 00:58:22.040 |
The US State Department and presumably CIA made common cause with these far-right groups because we thought it was beneficial to be aligned with them. 00:58:34.040 |
From 2015 to 2021, we could have gone along with efforts under the Minsk Accords to resolve this conflict in the Donbass peacefully, but we never did that. 00:58:45.040 |
And instead, we gave support to the Kiev regime's attempt to violently suppress these Russian separatists. 00:58:50.040 |
And again, the suppression was being done by these right-wing groups. 00:58:54.040 |
Look, does that make our State Department Nazis? 00:58:57.040 |
No. Does that make the Canadian Parliament Nazis? 00:59:00.040 |
No. What I'm saying is that in both cases, a blind eye was turned to the disturbing ideology and past and associations of these people because it's politically in our interest to do business with them. 00:59:15.040 |
So I don't think in that sense this was just sort of an accident. 00:59:18.040 |
This is the backstory that explains how something like this can happen. 00:59:24.040 |
Jason, you have any reactions to Trudeau doing this and what it means or does it mean nothing? 00:59:28.040 |
Does the backstory I provided give you context on how something like this can happen that's not just like an accident? 00:59:36.040 |
Well, I don't think any of us know exactly what happened here and it's probably going to be some sort of investigation, but I don't think they knowingly put a Nazi up there. 00:59:43.040 |
I think they are pro the war, and could that have blinded them from doing deeper research? Probably. 00:59:51.040 |
People are political, politicians most of all, and people probably take facts or, you know, anything they can use to make their case stronger. 01:00:07.040 |
Zelensky was pumping his fist and cheering. Don't you think he knew? He can't not know the history. 01:00:17.040 |
That somebody was fighting the Russians in World War II... 01:00:21.040 |
If he did, then you would be saying... If he did know and he was pumping his fist, then you'd be saying that he was pro-Nazi. He was cheering for a Nazi knowingly. 01:00:32.040 |
You know, what I'm saying is, look, the fact that you've got some Jewish ancestry is not, in my view, a get out of jail free card for you making political decisions to align... 01:00:41.040 |
I never even brought up the Jewish part. I just said, are you saying he knowingly cheered for a Nazi? 01:00:45.040 |
You know, one of the big backers of the Azov battalion is a Ukrainian oligarch named Igor Kolomoisky. Kolomoisky is Jewish. 01:00:53.040 |
No, you didn't answer my question. You asked me my opinion. I'm just saying, do you think he knowingly cheered for a Nazi? Is that what you're insinuating? 01:00:59.040 |
I think he knowingly cheered knowing that this Ukrainian nationalist who fought in World War II must have been on the German side because there was only one side that was fighting the Russians. 01:01:07.040 |
Okay. I'm just clarifying here. I don't actually have an opinion. Thanks for querying me, John. 01:01:12.040 |
I'm not saying that he cheered for Nazism. What I'm saying is he cheered for Ukrainian nationalism and he knows that Ukrainian nationalism is bound up and tied up with this disturbing history, which he is willing to ignore. 01:01:24.040 |
Let me finish my point about the Azov battalion. The Azov battalion is undeniably a neo-Nazi group. It was funded by Igor Kolomoisky, who's a Ukrainian oligarch, who is Jewish, who lives in Israel. 01:01:35.040 |
Why would Kolomoisky do that? Because the Azov battalion believes that every inch of Ukraine, including Crimea and Donbass, which has enormous energy reserves, belongs to Ukraine. 01:01:45.040 |
So it served the business interests of the energy magnates in Ukraine to support these people. And that, look, politics makes for strange bedfellows. 01:01:55.040 |
Yeah. That's what I was going to say, actually. 01:01:57.040 |
So I'm not saying that Zelensky or Kolomoisky or anybody else is a Nazi because they aligned with these people. I'm saying they found it politically expedient and useful to align with these groups, just like the US State Department did, quite frankly. 01:02:10.040 |
I don't think we should do that. If you want to go around the world, Jason, saying that we're the champions of freedom and democracy and having this moralistic, almost virtue-signaling foreign policy, I don't think we should be in business or aligned with these neo-Nazi groups, wherever the hell they are. 01:02:25.040 |
I think it's... When you say "you," do you mean me or do you mean the United States? 01:02:30.040 |
I'm saying if you want to have a highly moralistic foreign policy. Let's say if one wants to have... 01:02:35.040 |
I would use the word "principled." Yeah, if you're going to be principled, you need to not support Nazis. We're in agreement. 01:02:41.040 |
What do you, Jason and Freeberg, what do you guys think of just like the breadcrumbs in Canada? I'm just curious whether you guys care about this whole vein of just like competent leadership, nepotism, if you have a view, or it's like that is just what it is, whatever. 01:02:57.040 |
I don't know enough about Canadian politics, really, but Trudeau does not seem to be super qualified. 01:03:07.040 |
So, just in terms of the Canadian part of this, there's a writer named Jeet Heer, who's a left-wing writer, but he posted something very interesting here where he explained that in the late 1940s and 1950s, Canada took in a large number of former Nazis, many of whom were SS veterans, so people like Hunka, because they were good anti-communists. 01:03:24.040 |
And then these Nazis proceeded to terrorize anti-Nazi Ukrainian Canadians. A Ukrainian hall was bombed in 1950. So, Canada has a weird history of bringing in some of these people after World War II. 01:03:40.040 |
Yeah, exactly. Look, there's no way that any semi-intelligent person who knows the history of World War II, especially the Ukrainian involvement in World War II, wouldn't know that Ukraine was on the German side in World War II. And Hunka volunteered for the SS. He was a volunteer for the SS Galicia Division. 01:04:01.040 |
So, look, did the Speaker of the House know? Probably not. I think wokeness makes people stupid where they just think about the current thing and don't ask too many questions about the past. But there's a lot more to it than just like this innocent mistake. 01:04:15.040 |
And this has been your update on this week in Ukraine and wokeness. All right, there's a bunch of news about OpenAI this week. Just very quickly, OpenAI is in advanced talks, according to the Financial Times, with Jony Ive of iPhone fame, Steve Jobs's longtime collaborator, and Masayoshi Son of SoftBank, to raise more than $1 billion to build the iPhone of AI. 01:04:44.040 |
And so, the idea would be Jony Ive's got a design firm called LoveFrom, and they would help OpenAI design their first consumer device, per the FT's sources, the Financial Times, that is. 01:04:58.040 |
Altman and Ive have been having brainstorming sessions in Ive's San Francisco studio about what a consumer product centered around OpenAI would look like. It's very early stages. And Son has pitched a role for Arm, his chip company that he recently took public, in the development. 01:05:15.040 |
They also discussed Masa and Altman creating a company that would draw on talent and tech from their three groups with SoftBank, putting in a billion dollars in seed. And then also OpenAI is discussing a secondary share sale that would value the company at between 80 billion and 90 billion. This would be 3x the most recent valuation. 01:05:37.040 |
Reportedly, though, to their credit, they are on track to generate $1 billion in revenue in 2023. I'm not sure how much of that is the $20 a month subscription. You know, that'd be pretty extraordinary if that were all personal subscriptions. This would be a massive gain on paper for Microsoft. 01:05:52.040 |
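As a quick back-of-the-envelope check on that: if, hypothetically, the entire $1 billion came from $20-a-month subscriptions, that would be roughly 4.2 million subscribers. A minimal sketch of the arithmetic, where both inputs are the figures floated above rather than confirmed numbers:

```python
# Back-of-envelope: how many $20/month subscribers would it take
# to generate $1B a year? Both inputs are the figures floated in
# the discussion, not confirmed numbers.
annual_revenue = 1_000_000_000   # $1B reported annualized revenue
price_per_month = 20             # ChatGPT Plus price

subscribers = annual_revenue / (price_per_month * 12)
print(f"{subscribers:,.0f} subscribers")  # ~4,166,667
```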
OpenAI is 49% owned by Microsoft. And Sam Altman has personally stated multiple times now that he has no equity, so he would be getting $0 of this. And of course, we know that OpenAI started as a nonprofit before switching. And our friend Vinod Khosla told us very clearly that those are just details. What happened there, Chamath? 01:06:18.040 |
Those are just details. But Vinod is the GOAT. 01:06:23.040 |
Sam is the closest thing that we have to an emergent mogul in tech. And the reason is because if everything sits on this substrate, you're going to need to get a license, you're going to want to get access to whatever developer program, whatever beta that OpenAI has. And so as a result, he'll be... 01:06:48.040 |
Well, I was just going to say, so he'll be in the catbird seat. So even if he doesn't have any equity in OpenAI, he'll just put his money into the best startups that... it's like Y Combinator on steroids. 01:06:59.040 |
By the way, I have a take on that whole claim that Sam doesn't own any part of OpenAI. 01:07:04.040 |
All right, let's hear it. Go ahead, Columbo. Explain to us the details. 01:07:09.040 |
Just one more thing there, ma'am. You said you don't own any shares in OpenAI, but you started OpenAI. 01:07:18.040 |
What I think is really interesting about what OpenAI has done in its fundraising rounds is that each round has been a capped return model. So... 01:07:30.040 |
Well, I think some of the very early people got capped at 100x. I think maybe the $30 billion round was capped at 10x. So I think the $30 billion round is capped at a $300 billion valuation, meaning if you're an investor, your shares go up in value till the company hits a market cap of $300 billion, and then basically you're effectively cashed out. It's like you bought a share, but sold a call back to the company at a $300 billion valuation. 01:07:54.040 |
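To make those capped-return mechanics concrete, here is a toy payoff calculator; the deal terms (a 10x cap on a $30 billion round) are the speculation above, not disclosed terms:

```python
# Toy illustration of a capped-return deal (not OpenAI's actual
# terms): the investor participates in upside only until the
# company's value reaches cap_multiple times the entry valuation,
# as if they had sold a call back to the company at the cap.
def capped_payoff(investment, entry_valuation, exit_valuation, cap_multiple):
    """Investor proceeds under a capped-return structure."""
    ownership = investment / entry_valuation
    cap_valuation = entry_valuation * cap_multiple
    # Upside is clipped at the cap; everything above it reverts
    # to the company (here, the nonprofit that owns the residual).
    return ownership * min(exit_valuation, cap_valuation)

# $1B into a $30B round with a 10x cap: a $1T exit pays the investor
# exactly the same as a $300B exit.
print(capped_payoff(1e9, 30e9, 300e9, 10))  # 10000000000.0 ($10B)
print(capped_payoff(1e9, 30e9, 1e12, 10))   # 10000000000.0 (still $10B)
```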
The movie industry works this way, right? You invest in a film, they tell you you can make 3x, and then it's over, right? Something like that. I've seen that in the independent film business. 01:08:01.040 |
Yeah, so in any event, I think people who invested at, like, the $2 billion valuation were capped at like $100 billion. I heard that employees who were getting stock options are capped at $100 billion, or they were way back when they started granting these things. 01:08:14.040 |
So my point is that if OpenAI turns into one of these companies, like a Google, and ends up in the trillion-dollar club, then nobody's gonna own anything, because they will have already been cashed out long ago. 01:08:27.040 |
I think what's really going on here is somebody has to own the residual value of the company, call it the far out-of-the-money call. That's how they get around the IRS problem of it being non-equity; that's how they can say that it's not equity in a private corporation. 01:08:56.040 |
Yeah, but I think what's so brilliant about it is, okay, so look, Sam set up this foundation. It's a nonprofit. But he controls that effectively, right? 01:09:06.040 |
So yes, he technically is not an owner of the shares; the foundation is. But what can't you do with the foundation that you could do with personal ownership, other than maybe buying a personal residence? I mean, you can buy a plane, I think. Look at the Church of Scientology; they own a lot of real estate. 01:09:22.040 |
So my point is, not only do I think that Sam really owns OpenAI through the fig leaf of this foundation, I think he owns 100% of it in the event that the call option is struck, meaning it ends up being a trillion-dollar company. 01:09:36.040 |
Are you saying Sam is our L. Ron Hubbard in this example? 01:09:42.040 |
Oh, it's just details. Right. As Vinod said, details. 01:09:46.040 |
I am speculating, but I think it's informed speculation. If you wanted to become the world's first trillionaire, and you were extremely premeditated about it, clever and premeditated about it, what would you do? Number one, you would want to choose a moonshot type area that was a world changing technology. AI certainly qualifies. 01:10:06.040 |
So does fusion. Maybe crypto does. As I understand it, Sam has bets in all three of those areas. Number two, you would want to figure out a way to own as much of it as you could, really 100% if you could. And that's a very hard thing to do when you're running a capital-intensive startup. But investors tend to underestimate the power law and the value of the far out-of-the-money call option. So maybe you can get them to sell that back to you really cheaply. 01:10:33.040 |
And third, if you're really farsighted, you would want to insulate yourself against populist anger from being the world's first trillionaire. So you would basically put your shares in a nonprofit foundation where you're not really sacrificing that much of control, or the ability to control the asset, but it gives you tremendous defense. 01:10:54.040 |
Where did you come up with this? This is genius. This is genius. Did you and Peter Thiel talk about this over chess or something? How did you construct this? And you're saying this is informed speculation. A lot of financial conspiracy corner here. I think this is science corner. Let's get the tinfoil hats out. It's really freaking Freeberg out that we're even doing this; it's diametrically opposite to science corner. 01:11:17.040 |
I think if you are even 1% right, the combination of lawyers and accountants that would leak this, and the number of people that were part of the origination of the foundation that would want to sue, will be very high. That's just the natural state of things in these kinds of things. 01:11:38.040 |
But what have I said, other than the fact that it was sort of premeditated? And that's not the right word; premeditated sounds too nefarious. 01:11:45.040 |
No, no, no, I'm just saying whenever, whenever... I'm just saying whenever money is made at this quantum and at that scale, everybody wants a piece, because they know that that's their one shot. So I just think that it'll amplify the pressure for actors inside of those organizations to take their shot. And that's just going to be financially the right thing to do for a lot of people, if what you're saying is true. 01:12:11.040 |
We know the investments have been made under a capped-return model. I think that's fact. 01:12:16.040 |
We know the nonprofit foundation owns the shares. That's fact. 01:12:19.040 |
And then just to put the 800 pound gorilla on the table, like, what's Elon thinking? Because he was the one that really got this thing off the ground, because that critical investment made the whole thing come to life. He could have done this on his own. 01:12:34.040 |
I mean, but after a lawsuit, how much does he own? I don't know. I'm just speculating. 01:12:42.040 |
Here we go. So OpenAI released some new ChatGPT features. The key point here is they're doing what's called multimodal. Multimodal is the big innovation. What does that mean? That means the input could be voice, the input could be code, the input could be data. It could be a picture. Here's a picture. If you're watching along on the YouTube channel, do a search for All-In Podcast on YouTube, hit subscribe, hit the bell. And it's a classic picture of one of those no parking signs where there's four different ones. You take a picture of that; that's the input. 01:13:09.040 |
And you say, it's Wednesday at 4pm, can I park in this spot right now? Tell me in one line. It comes back and says, yes, you can park for up to one hour starting at 4pm. What this means is the output or the input could be in any of those modalities (fancy word for an image, a video, etc.). So you're going to be able to say, hey, give me the poster for the All-In conference, "Bestie Runner," and here are the pictures of the boys, and then make it, and go back and forth and back and forth. 01:13:36.040 |
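For readers who want to see what that looks like in code, here is a minimal sketch of a multimodal request against OpenAI's vision-capable chat API; the model name and image URL are illustrative placeholders, and the exact message shape may vary by SDK version:

```python
# A minimal sketch of a multimodal (text + image) request, assuming
# OpenAI's vision-capable chat endpoint. Model name and image URL
# are placeholders, not claims about the demo shown on the pod.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any vision-capable chat model
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "It's Wednesday at 4pm. Can I park here right now? Answer in one line."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/parking-signs.jpg"}},  # hypothetical photo
        ],
    }],
)
print(response.choices[0].message.content)
```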
And this is really groundbreaking. At the same time, last week, Sunny Madra and I played with Google Bard on This Week in Startups. Google Flights, Google Docs, Gmail, and a number of the other core Google services are now in Bard. So that's not multimodal exactly. But you could do things like ask Google Flights, hey, what is the best nonstop, you know, between New York City 01:14:05.040 |
and Dubai, or from an East Coast destination, that has lie-flat seats, etc. And it really does. It's starting to work. So this idea that Google is going to be displaced, or they're moving slow, that might be antiquated information. So those are the two big, big monumental announcements just in the last 10 days. Freeberg, when you look at these two, which one is the more important announcement? And what do you think about the pace? Because here we are, we're about to hit the one-year anniversary of ChatGPT 3.5. 01:14:34.040 |
I've been using a lot of different tools the last couple of months. And I'm kind of getting to the point where I feel that much of what's happening is underhyped rather than overhyped. There's some really incredible potential emerging. I'll give a couple of examples, and then I'll talk about the mobile phone. First, Andrej Karpathy, as you guys see in the tweet that I just posted in the chat, made a point today that LLMs are 01:15:04.040 |
emerging not just as a chatbot, but as a kernel process, meaning a new type of operating system that can do input and output across different modalities, can interpret code, can access the internet and information, and then can render things in a visual way or in an audio way that the user wants to consume. So as a result, LLMs become the core driver of a new type of computing interface. There was a paper published, and I'll share the link to this paper here as well, and we can put it in the notes; it's not worth pulling up on the 01:15:32.040 |
screen, that showed that using LLMs in autonomous driving can actually significantly improve the performance of the neural nets that the autonomous cars are trained on. So the autonomous car is typically trained on a bunch of sensor data that comes in. And then that sensor data determines what sort of action to take with the car. And what this team showed is that if you actually put in a communication layer that thinks and talks like a human in between the sensor data and the action data, it can actually improve the performance of the autonomous car. 01:15:57.040 |
It can do really wide-ranging interpretations of the data that otherwise would not be apparent from the data set it was trained on. So for example, it can see a person down the road, and you can ask it, what do you think that person is going to do next? And the LLM, because it's trained on a much larger corpus of data than just sensor data from cars, can make a really good, human-like interpretation, feed 01:16:22.040 |
that decision back into the control system of the 01:16:25.640 |
car, and have the car do something more intelligently 01:16:28.360 |
than it otherwise would have been able to do. So these LLMs are 01:16:31.200 |
becoming a lot more like a software operating system. 01:16:35.040 |
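As a rough sketch of the architecture described in that paper, the language layer sits between perception and control; every function below is a hypothetical stand-in with canned outputs so the sketch runs, not the paper's actual code:

```python
# Sketch: an LLM reasoning layer between a car's perception stack
# and its control system. All functions are hypothetical stand-ins.
def describe_scene(sensor_frame: dict) -> str:
    # Assumption: perception output can be summarized as text.
    return f"pedestrian on right curb, {sensor_frame['distance_m']}m ahead, facing the road"

def llm_predict(scene_text: str) -> str:
    # Stand-in for asking an LLM a human-style question; its broad
    # training lets it guess intent the driving net never saw.
    _prompt = f"Scene: {scene_text}. What will the pedestrian likely do next?"
    return "likely to step into the road; slow down and yield"  # canned reply

def plan_action(scene_text: str, prediction: str) -> str:
    # The LLM's interpretation is fed back into the control system
    # alongside the raw perception output.
    return "reduce speed and yield" if "slow down" in prediction else "maintain speed"

frame = {"distance_m": 20}
scene = describe_scene(frame)
print(plan_action(scene, llm_predict(scene)))  # -> reduce speed and yield
```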
And you can kind of extend that into mobile phones. Mobile phones 01:16:37.680 |
originally were just voice. And then they were single lines of 01:16:40.480 |
text in the form of SMS, then you were able to browse the web, 01:16:43.800 |
and then the app revolution came about where all of this 01:16:46.160 |
information emerged through apps. What LLMs now allow, perhaps, 01:16:49.920 |
is that the entire operating system of the phone can run and 01:16:53.280 |
render any sort of application or any sort of service or 01:16:55.920 |
product you might want to use on the fly, in stream. So the input 01:17:00.880 |
to the phone can be voiced, it can be visual, it can be video, 01:17:03.960 |
and the output can be rendered by perhaps a bunch of what might 01:17:06.760 |
otherwise be called apps, but call it third party developers 01:17:09.520 |
that build in-stream into that chat, which no longer looks like 01:17:13.040 |
a chat interface like we see on ChatGPT, but can be rendered 01:17:16.000 |
visually, can be rendered with audio, can be rendered a bunch of 01:17:18.760 |
different ways. So if mobile really is the dominant tech 01:17:21.840 |
hardware platform that humans are using for computing today, 01:17:26.000 |
LLMs and these sorts of tools can become the dominant operating 01:17:29.160 |
system on that hardware. And you can totally rethink the modality 01:17:32.320 |
of how you use computing through applications. Today, we have an 01:17:35.400 |
app store, and we download apps and use them. And that all 01:17:37.560 |
becomes in-stream in an LLM or chat-type interface that can be 01:17:41.480 |
accessed in a bunch of different ways. So for me, there's a 01:17:44.240 |
much bigger thing that's happening. It's not just about 01:17:46.480 |
making smarter tools and increasing productivity, but a 01:17:49.840 |
real revolution in computing itself. That seems to be 01:17:52.920 |
emergent. And I think Karpathy's tweet this morning, some 01:17:55.760 |
of the stuff I've been playing with some of the papers I've 01:17:57.600 |
been reading, and some of the speculation around mobile 01:17:59.960 |
hardware start to support that thesis. And I think it's going 01:18:02.440 |
to be really significant. It's a wholesale rewriting of computing, 01:18:05.600 |
computing interfaces, human computer interaction, that's 01:18:08.560 |
going to rethink everything. And it seems to be pretty 01:18:11.480 |
substantial. And just using a bunch of tools myself, I'm blown away. 01:18:16.880 |
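One way to picture the "apps become in-stream services" idea: a natural-language request gets routed to a registered tool instead of a tapped app. A toy sketch; real systems would use LLM function calling, and the keyword matching here is just a stand-in for the model's routing decision:

```python
# Toy router: in-stream "apps" register themselves as tools, and a
# request is dispatched to one of them. The keyword match stands in
# for an LLM's function-calling decision.
from typing import Callable

TOOLS: dict[str, Callable[[str], str]] = {}

def tool(keyword: str):
    """Register a function as an in-stream service."""
    def register(fn: Callable[[str], str]) -> Callable[[str], str]:
        TOOLS[keyword] = fn
        return fn
    return register

@tool("ride")
def call_ride(request: str) -> str:
    return "Booking a car to your location..."   # hypothetical service

@tool("food")
def order_food(request: str) -> str:
    return "Ordering from your usual spot..."    # hypothetical service

def route(request: str) -> str:
    for keyword, fn in TOOLS.items():
        if keyword in request.lower():
            return fn(request)
    return "No tool matched; answering in chat instead."

print(route("Call me a ride to the airport"))  # -> Booking a car...
```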
Yeah, I mean, right now, I would strongly agree, 01:18:20.120 |
because this was Magic Link's vision for the 01:18:25.320 |
future, which is you would talk to agents, as they call them. 01:18:28.240 |
This was a company that existed in the 90s. Before smartphones 01:18:31.640 |
existed, it was a physical device, Sony made the device. 01:18:34.160 |
And the operating system, the concept was you would say, I'm 01:18:36.760 |
looking for a flight to go to this place, the agent would go 01:18:39.680 |
out, it would do a bunch of work and then come back to you with 01:18:41.640 |
the options. So not just a Google search coming back with 01:18:45.520 |
10 blue links, but actually just solving your problem. And if the 01:18:48.880 |
interface is from General Magic, right? General Magic, right. 01:18:52.080 |
Yeah, right. And there's a movie, General Magic, the 01:18:54.680 |
movie; you can look at the Wikipedia page for the company, but this was 01:18:58.040 |
a lot of like the early work in this area. And I think this is 01:19:02.200 |
going to become the interface, and LLMs talking to each other. 01:19:04.280 |
So then the question becomes, who owns this? How many of these 01:19:07.760 |
are there? Are they verticalized? So what do you think? 01:19:12.520 |
Well, I think this is super interesting. I don't know if 01:19:16.640 |
this qualifies as a science corner. But this is the most 01:19:21.280 |
I'm trying to make a science corner into an intersecting 01:19:26.360 |
I don't know how we crowbar a Uranus joke into this. But let's 01:19:31.560 |
Okay, so on the phone, I think what's interesting there, just 01:19:35.720 |
to boil it down is you're talking about replacing the 01:19:38.840 |
main interface, which is currently a wall of apps, right? 01:19:42.400 |
And you tap an app to go into the app, and then you 01:19:46.080 |
interact with it. You're talking about replacing all that with 01:19:48.360 |
basically voice. So imagine a visual, yeah, more visual if you 01:19:52.680 |
connect like glasses to it or something. So rather than 01:19:57.240 |
double click on an app, the app developers as they're called 01:20:00.960 |
today, are basically building in-stream utilities that are part 01:20:04.800 |
of the chat interface that is the phone itself. And that's 01:20:08.080 |
what's going to be so compelling. It's like we used 01:20:10.640 |
to write websites, and then we wrote apps. And now we're going to 01:20:12.600 |
write these kinds of in-stream services, these plugins. Alexa... 01:20:17.720 |
Well, Alexa or Siri kind of sucks. It just doesn't work that 01:20:21.560 |
well. And but imagine if the phone perfectly understood what 01:20:25.520 |
you were saying, then you would just say call me an Uber, order 01:20:28.480 |
me food, whatever. And precisely just instruct it. It's like in 01:20:32.520 |
that movie. Was it her? Her? The Joaquin Phoenix movie? 01:20:36.640 |
God, that should have been my background today. What am I 01:20:39.000 |
You've disappointed all the science corner fans. 01:20:41.400 |
It's a Spike Jonze movie. He did a really good job with that. 01:20:44.040 |
Man, that movie is looking more and more great. Like it's gonna 01:20:47.320 |
We got to do a rewatchable on that. Yeah, we should rewatch it. 01:20:49.840 |
You won't even really need the pane of glass if you can just 01:20:53.200 |
talk to it with an earpiece. Now I think you're right that the 01:20:55.680 |
phone needs to know what you're looking at. Or it can do so much 01:21:02.040 |
That's part of the multimodal demo that OpenAI showed 01:21:05.440 |
this week: it has video and it has camera integration. 01:21:09.280 |
And remember, in human computer interaction, it's often a lot 01:21:13.960 |
easier for a human to interact with a visual representation of 01:21:17.160 |
stuff on a screen than to hear stuff in audio. So we will still 01:21:22.640 |
need some sort of visual display, whether it's a screen, 01:21:25.040 |
or an eyeglass or something that shows us a bunch of information 01:21:29.120 |
Talk about the ecosystem Sam is trying to create. 01:21:33.560 |
Sam apparently invested in a company that was hardware plus 01:21:37.400 |
software for like journaling, like you would hang like a 01:21:40.800 |
necklace around your neck, a camera-type device. A wearable. 01:21:44.040 |
A wearable, okay. And it would record everything. And it 01:21:47.920 |
would be like your memory backup, and you'd be able to 01:21:52.320 |
that was William Gibson's plotline in one of his books 01:21:55.840 |
where he had a little Zeppelin that would follow people around 01:21:59.080 |
and record everything and then you'd have a DVR of your entire 01:22:02.400 |
life and that would be completely indexed and then you 01:22:04.520 |
could, the AI would know your entire life and be able to 01:22:07.840 |
advise you. Do you guys use the feature on your AirPods where, if 01:22:11.440 |
you leave them in, it will read you the messages from your 01:22:13.800 |
Signal or your incoming notifications, where it reads 01:22:16.960 |
Obviously you don't. So there's a new feature in the 01:22:21.520 |
AirPods, you leave them in. And if you're working, you're 01:22:24.000 |
walking around the house, you're walking around Manhattan like I 01:22:26.120 |
am these last couple days, it will stop the podcast I'm 01:22:29.320 |
listening to and just say, you know, oh, poker group says this, 01:22:32.760 |
oh, you know, your wife just texted you this, and it reads it 01:22:36.920 |
to you. And then you can say reply. So eventually, if Siri 01:22:40.480 |
works, and then you have those Apple goggles on, I think that 01:22:44.160 |
that is going to be the eventual interface, which is, you'll hear 01:22:47.200 |
certain things, you'll see certain things, some things will 01:22:51.600 |
didn't Facebook announce a new pair of glasses today. 01:22:54.760 |
There. Those are like their Spectacles kind of things. These 01:22:57.800 |
are the light AR glasses where you can take pictures. I just meant 01:23:01.200 |
to say everything's converging a lot faster than we all think. Yeah, 01:23:03.760 |
it is. So I started using a new note-taking app called Reflect. 01:23:07.560 |
Have you guys heard of this? It's reflecting on things. Whoa, 01:23:11.840 |
I'm just starting to play with it. But what it does is you keep 01:23:16.480 |
like a daily log of who you've met with and what meetings were 01:23:19.880 |
about. So it's basically a note-taking app, but it does back 01:23:22.000 |
links, so that it starts to link together the people and concepts 01:23:26.080 |
or whatever. And so like the use case that I think it's quite 01:23:28.760 |
useful for once you've been using it for a while is, okay, I'm 01:23:32.920 |
meeting with this person, when's the last time I saw them? What did we 01:23:35.480 |
talk about then? So it gives you like context. Yeah. Yeah, 01:23:39.960 |
that's awesome. I really like this. It's external memory, 01:23:42.640 |
right? Because like, I'm deluged with so much 01:23:44.960 |
stuff now. I can't even... I forget people's names sometimes 01:23:48.120 |
if I've only met them once or twice. His name is Freeberg. 01:23:54.160 |
No, you're also getting old, for sure. I mean, it's a 01:23:58.080 |
function of how much input is coming at you. There's just so 01:24:02.280 |
But just having a short log of who I've met with and briefly 01:24:05.880 |
what the meeting was about. So I can go back and check it. And at 01:24:09.640 |
some point in the future, I can search against it. But the 01:24:11.880 |
only problem with it is I do have to like take the time to 01:24:14.240 |
enter all this stuff. And it's kind of a pain. It'll just be done 01:24:17.280 |
automatically. If I could get a true external hard drive to my brain, 01:24:21.840 |
then that would be very powerful. Authenticate with 01:24:24.240 |
Slack and Gmail and do that automatically. And then you'll, 01:24:29.120 |
It already connects with Google. I don't want my Slack in my 01:24:33.080 |
Reflect. What I want is my meetings, which they do; they 01:24:36.480 |
integrate with Google Calendar. Great. And really, that's it. 01:24:39.120 |
Like the main thing I want is, if I could just know everyone I 01:24:42.840 |
talked to, and I don't need a transcript, I just need the 01:24:45.600 |
logline. Just so I can remember, because I just need the prompt 01:24:49.240 |
six months from now, I just need a prompt that I met with this person. 01:24:54.080 |
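The backlink idea behind an app like Reflect can be pictured as a tiny index from people to log entries; this is an illustration of the concept, not Reflect's actual implementation:

```python
# Sketch of a backlinked meeting log: each entry is indexed by the
# person it mentions, so "when did I last meet X, and what about?"
# is a single lookup. Illustration only, not Reflect's code.
from collections import defaultdict

log: list[tuple[str, str, str]] = []        # (date, person, summary)
backlinks = defaultdict(list)               # person -> entry indices

def add_entry(date: str, person: str, summary: str) -> None:
    log.append((date, person, summary))
    backlinks[person].append(len(log) - 1)  # backlink person -> entry

def last_meeting(person: str):
    entries = backlinks.get(person)
    return log[entries[-1]] if entries else None

add_entry("2023-04-01", "Freeberg", "science corner planning")   # hypothetical entries
add_entry("2023-09-28", "Freeberg", "multimodal AI, LLM as phone OS")
print(last_meeting("Freeberg"))  # ('2023-09-28', 'Freeberg', ...)
```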
Sacks, have you, have you Clinton-ized your 01:24:56.840 |
greetings now? It's great to see you. Great to see you. That's 01:25:01.160 |
the great thing. Like, it's great to see 01:25:03.000 |
you so that you know, you preserve optionality for the 01:25:06.440 |
It's great to see you. We've never met, but I 01:25:14.120 |
It's such a... When I met Clinton, I was at a Hillary Clinton 01:25:20.280 |
fundraiser when she was a senator here in New York. And 01:25:23.120 |
they sent you up an elevator to this fundraiser. And you get off 01:25:26.160 |
the elevator. And Bill Clinton is standing there. And he walks up 01:25:30.720 |
to me like three steps. Oh, J Cal. It's great to see you, and 01:25:34.120 |
grabs your elbow. He shakes your hand. I am so happy for what you 01:25:37.120 |
did to help Hillary win. And, you know, Jason, we're so 01:25:40.800 |
appreciative. And then you know, you walk into the room. And I'm 01:25:43.560 |
like, Oh, my God, Bill Clinton knows my name. Totally. Then I 01:25:48.600 |
look behind me, and I see the next person, I see a woman come 01:25:51.320 |
out with a clipboard, whispering who's here, then the next person's name 01:25:54.520 |
coming out of the elevator. He's waiting. That person disappears. 01:25:58.480 |
Oh, David Sacks. It's so great to meet you. I really appreciate 01:26:04.240 |
You know, that role of whispering the name of a person 01:26:08.760 |
in the politician's ear goes all the way back to Roman times. It 01:26:18.920 |
How often do you think about the Roman Empire? Just broadly 01:26:25.280 |
I thought it was a reference. I thought that was 01:26:27.600 |
pretty great. It's pretty great. I'm just glad that the rest of 01:26:30.560 |
the world is catching up to our obsession with Gladiator. 01:26:33.760 |
Alright, listen, this has been an amazing episode for the 01:26:37.800 |
dictator himself, Chamath Palihapitiya, and Rain Man, yeah, 01:26:42.400 |
definitely, burn baby, David Sacks, and the Sultan of Science, the 01:26:46.680 |
Queen of Quinoa, the Prince of Panic Attacks, and the heir to 01:26:50.760 |
the TED throne, the creator of the world's greatest conference 01:26:55.080 |
David Freeberg. I am the world's greatest moderator. We'll see 01:26:57.840 |
you next time. Love you boys all in. Bye bye. 01:27:00.400 |
Love you. Ted's dead. Ted's dead. Ted's dead, baby. Ted's 01:27:12.440 |
We open source it to the fans and they've just gone crazy with 01:27:29.000 |
That's my dog taking a notice in your driveway. 01:27:32.560 |
We should all just get a room and just have one big huge orgy 01:27:40.920 |
because they're all like this like sexual tension that they 01:27:46.080 |
What? You're a bee. You're a bee. We need to get merch.