Back to Index

Richard Haier: IQ Tests, Human Intelligence, and Group Differences | Lex Fridman Podcast #302


Chapters

0:00 Introduction
0:43 Measuring human intelligence
15:11 IQ tests
37:59 College entrance exams
46:36 Genetics
52:35 Enhancing intelligence
60:04 The Bell Curve
72:35 Race differences
91:48 Bell curve criticisms
100:57 Intelligence and life success
110:34 Flynn effect
115:26 Nature vs nurture
142:19 Testing artificial intelligence
154:23 Advice
158:30 Mortality

Transcript

Let me ask you this question, whether it's the bell curve or any research on race differences, can that be used to increase the amount of racism in the world, can that be used to increase the amount of hate in the world? - My sense is there are such enormous reservoirs of hate and racism that have nothing to do with scientific knowledge or the data that speak against that.

That, no, I don't wanna give racist groups a veto power over what scientists study. - The following is a conversation with Richard Haier on the science of human intelligence. This is a highly controversial topic, but a critically important one for understanding the human mind. I hope you will join me in not shying away from difficult topics like this, and instead, let us try to navigate it with empathy, rigor, and grace.

If you're watching this on video now, I should mention that I'm recording this introduction in an undisclosed location somewhere in the world. I'm safe and happy, and life is beautiful. This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here's Richard Haier.

What are the measures of human intelligence, and how do we measure it? - Everybody has an idea of what they mean by intelligence. In the vernacular, what I mean by intelligence is just being smart, how well you reason, how well you figure things out, what you do when you don't know what to do.

Those are just kinda everyday common sense definitions of how people use the word intelligence. If you wanna do research on intelligence, measuring something that you can study scientifically is a little trickier. And what almost all researchers who study intelligence use is the concept called the G factor, general intelligence.

And that is what is common. That is a mental ability that is common to virtually all tests of mental abilities. - What's the origin of the term G factor, by the way? It's such a funny word for such a fundamental human thing. - The general factor really started with Charles Spearman.

And he noticed, this is like, boy, more than 100 years ago. He noticed that when you tested people with different tests, all the tests were correlated positively. And so he was looking at student exams and things. And he invented the correlation coefficient, essentially. And when he used it to look at student performance on various topics, he found all the scores were correlated with each other and they were all positive correlations.

So he inferred from this that there must be some common factor that was irrespective of the content of the test. - And positive correlation means if you do well on the first test, you're likely to do well on the second test. And presumably that holds for tests across even disciplines.

So not within subject, but across subjects. So that's where the general comes in. Something about general intelligence. So when you were talking about measuring intelligence and trying to figure out something difficult about this world and how to solve the puzzles of this world, that means generally speaking, not some specific test, but across all tests.
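
What Spearman noticed can be reproduced in a few lines. Here is a minimal sketch, not from the episode, with invented subject names and made-up numbers: scores on several exams are simulated as one shared ability plus independent noise, and the resulting correlation matrix comes out all positive, which is the "positive manifold" being described here.

import numpy as np

rng = np.random.default_rng(0)
n = 1000                       # simulated students
g = rng.normal(size=n)         # one shared ability, purely illustrative

# Each "exam" is partly the shared ability, partly its own independent noise.
exams = {
    "classics": 0.7 * g + rng.normal(scale=0.7, size=n),
    "french":   0.7 * g + rng.normal(scale=0.7, size=n),
    "math":     0.6 * g + rng.normal(scale=0.8, size=n),
    "music":    0.5 * g + rng.normal(scale=0.9, size=n),
}

scores = np.column_stack(list(exams.values()))
corr = np.corrcoef(scores, rowvar=False)
print(np.round(corr, 2))       # every off-diagonal correlation is positive

With loadings like these the off-diagonal correlations land around 0.3 to 0.5, all positive, which is the pattern the correlation coefficient revealed in the original exam data.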

- Absolutely right. And people get hung up on this because they say, well, what about the ability to do X? Isn't that independent? And they said, I know somebody who's very good at this, but not so good at this, this other thing. And so there are a lot of examples like that, but it's a general tendency.

So exceptions really don't disprove it. You know, your everyday experience is not the same as what the data actually show. And your everyday experience, when you say, oh, I know someone who's good at X, but not so good at Y, that doesn't contradict the statement, because he may be not so good at Y, but he's not the opposite.

It's not a negative correlation. - Okay, so we're not, our anecdotal data, I know a guy who's really good at solving some kind of visual thing. That's not sufficient for us to understand actually the depths of that person's intelligence. So how this idea of G factor, how much evidence is there?

How strong, you know, given across the decades that this idea has been around, how much has it been held up that there's a universal sort of horsepower of intelligence that's underneath all of it? All the different tests we do to try to get to this thing in the depths of the human mind, that's a universal stable measure of a person's intelligence.

- You used a couple of words in there, stable and-- - Are we gonna have to be precise with words? I was hoping we can get away with being poetic. - We can. There's a lot about research in general, not just intelligence research, that is poetic. Science has a poetic aspect to it, and good scientists are very intuitive.

They're not just, hey, these are the numbers. You have to kind of step back and see the big picture. When it comes to intelligence research, you asked how well has this general concept held up? And I think I can say, without fear of being empirically contradicted, that it is the most replicated finding in all of psychology.

Now, some cynics may say, well, big deal, psychology. We all know there's a replication crisis in psychology, and a lot of this stuff doesn't replicate. That's all true. There is no replication crisis when it comes to studying the existence of this general factor. Let me tell you some things about it.

It looks like it's universal, that you find it in all cultures. The way you find it, step back one step, the way you find it is to give a battery of mental tests. What battery? You choose. Take a battery of any mental tests you want, give it to a large number of diverse people, and you will be able to extract statistically the commonality among all those tests.

It's done by a technique called factor analysis. People think that this may be a statistical artifact of some kind. It is not a statistical artifact. - What is factor analysis? - Factor analysis is a way of looking at a big set of data, looking at the correlations among the different test scores, and then finding empirically the clusters of scores that go together.

And there are different factors. So if you have a bunch of mental tests, there may be a verbal factor, there may be a numerical factor, there may be a visual spatial factor, but those factors have variants in common with each other. And that is the common, that's what's common among all the tests, and that's what gets labeled the G factor.

So if you give a diverse battery of mental tests and you extract a G factor from it, that factor usually accounts for around half of the variance. It's the single biggest factor, but it's not the only factor, but it is the most reliable, it is the most stable, and it seems to be very much influenced by genetics.
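
As a rough illustration of that extraction step, here is a sketch with simulated data (not anything from a real battery), using an eigendecomposition of the correlation matrix as a simplified stand-in for a full factor analysis: the first factor loads positively on every test and carries roughly the largest share of the variance, on the order of half, which is the pattern being described.

import numpy as np

rng = np.random.default_rng(1)
n = 2000
g = rng.normal(size=n)                          # simulated general ability

# A small battery: each test loads on g, the rest of its variance is test-specific.
loads = np.array([0.8, 0.7, 0.7, 0.6, 0.5, 0.5])
scores = g[:, None] * loads + rng.normal(size=(n, loads.size)) * np.sqrt(1 - loads**2)

corr = np.corrcoef(scores, rowvar=False)
vals, vecs = np.linalg.eigh(corr)               # eigh returns eigenvalues in ascending order
vals, vecs = vals[::-1], vecs[:, ::-1]

first = vecs[:, 0] * np.sqrt(vals[0])           # loadings of each test on the first factor
first = first * np.sign(first.sum())            # eigenvector sign is arbitrary; make loadings positive
print("share of total variance on the first factor:", round(vals[0] / vals.sum(), 2))
print("loadings on that factor:", np.round(first, 2))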

It's very hard to change the G factor with training or drugs or anything else. We don't know how to increase the G factor. - Okay, you said a lot of really interesting things there. So first, just to get people used to it in case they're not familiar with this idea, G factor is what we mean.

So often there's this term used, IQ, which is the way IQ is used, they really mean G factor in regular conversation. 'Cause what we mean by IQ, we mean intelligence, and what we mean by intelligence, we mean general intelligence, and general intelligence in the human mind from a psychology, from a serious, rigorous scientific perspective actually means G factor.

So G factor equals intelligence, just in this conversation to define terms. Okay, so there's this stable thing called G factor. You said it, now, factor, you said factor many times, means a measure that potentially could be reduced to a single number across the different factors you mentioned, and what you said, it accounts for half, half-ish.

Accounts for half-ish of what? Of variance across the different set of tests. So if you do for some reason well on some set of tests, what does that mean? So that means there's some unique capabilities outside of the G factor that might account for that, and what are those?

What else is there besides the raw horsepower, the engine inside your mind that generates intelligence? - There are test-taking skills, there are specific abilities. Someone might be particularly good at mathematical things, mathematical concepts, even simple arithmetic. Some people are much better than others. You might know people who can, and short-term memory is another component of this.

Short-term memory is one of the cognitive processes that's most highly correlated with the G factor. So-- - So all those things like memory, test-taking skills account for variability across the test performances. But you say you can run, but you can't hide from the thing that God gave you, the genetics.

So that G factor, science says that G factor's there. Each one of us have-- - Each one of us has a G factor. - Oh boy. Some have more than others. - I'm getting uncomfortable already. - Well, IQ is a score. An IQ score is a very good estimate of the G factor.

You can't measure G directly, there's no direct measure. You estimate it from these statistical techniques. But an IQ score is a good estimate, why? Because a standard IQ test is a battery of different mental abilities. You combine it into one score, and that score is highly correlated with the G factor, even if you get better scores on some subtests than others.

Because again, it's what's common to all these mental abilities. - So a good IQ test, and I'll ask you about that, but a good IQ test tries to compress down that battery of tests, like tries to get a nice battery, a nice selection of variable tests into one test.

And so in that way, it sneaks up to the G factor. And that's another interesting thing about G factor. Now you give, first of all, you have a great book on the neuroscience of intelligence. You have a great course, which is where I first learned about this. You're a great teacher, let me just say.

- Thank you. - Your course at The Teaching Company, I hope I'm saying that correctly. - The Intelligent Brain. - The Intelligent Brain is when I first heard about this G factor, this mysterious thing that lurks in the darkness that we cannot quite shine a light on, we're trying to sneak up on.

So the fact that there's this measure, stable measure of intelligence, we can't measure directly. But we can come up with a battery test or one test that includes a battery of variable type of questions that can, reliably or attempt to estimate in a stable way that G factor, that's a fascinating idea.

So for me as an AI person, it's fascinating. It's fascinating there's something stable like that about the human mind, especially if it's grounded in genetics, it's both fascinating that as a researcher of the human mind and all the human psychological, sociological, ethical questions that start arising, it makes me uncomfortable.

But truth can be uncomfortable. - I get that a lot about being uncomfortable talking about this. Let me go back and just say one more empirical thing. It doesn't matter which battery of tests you use. So there are countless tests. You can take any 12 of them at random, extract a G factor and another 12 at random and extract a G factor.

And those G factors will be highly correlated like over 0.9 with each other. So it is a ubiquitous, it doesn't depend on the content of the test is what I'm trying to say. It is general among all those tests of mental ability. And tests of mental abilities include things like, geez, playing poker.

Your skill at poker is not unrelated to G. Your skill at anything that requires reasoning and thinking, anything, spelling, arithmetic, more complex things, this concept is ubiquitous. And when you do batteries of tests in different cultures, you get the same thing. - So this says something interesting about the human mind, that as a computer it is designed to be general.

So that means you can, so it's not easily made specialized. Meaning if you're going to be good at one thing, Miyamoto Musashi has this quote, he's an ancient warrior famous for the Book of Five Rings in the martial arts world. And the quote goes, "If you know the way broadly, you will see it in everything." Meaning if you do one thing, it's going to generalize to everything.

And that's an interesting thing about the human mind. So that's what the G factor reveals. Okay, so what's the difference, if you can elaborate a little bit further, between IQ and G factor, just because it's a source of confusion for people? - An IQ is a score. People use the word IQ to mean intelligence, but IQ has a more technical meaning for people who work in the field.

And it's an IQ score, a score on a test that estimates the G factor. And the G factor is what's common among all these tests of mental ability. So if you think about, it's not a Venn diagram, but I guess you could make a Venn diagram out of it, but the G factor would be really at the core.

What's common to everything. And what IQ scores do, is they allow a rank order of people on the score. And this is what makes people uncomfortable. This is where there's a lot of controversy about whether IQ tests are biased toward any one group or another. And a lot of the answers to these questions are very clear, but they also have a technical aspect of it that's not so easy to explain.

- Well, we'll talk about the fascinating and the difficult things about all of this. So by the way, when you say rank order, that means you get a number, and that means one person, you can now compare. Like you could say that this other person is more intelligent than me.

- Well, what you can say is IQ scores are interpreted really as percentiles. So that if you have an IQ of 140 and somebody else has 70, the metric is such that you cannot say the person with an IQ of 140 is twice as smart as a person with an IQ of 70.

That would require a ratio scale with an absolute zero. Now you may think you know people with zero intelligence, but in fact, there is no absolute zero on an IQ scale. It's relative to other people. So relative to other people, somebody with an IQ score of 140 is in the upper less than 1%, whereas somebody with an IQ of 70 is two standard deviations below the mean.
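
To make the percentile point concrete: IQ scores are conventionally scaled to a mean of 100 and a standard deviation of 15, so a score converts to a percentile through the normal curve. Here is a small worked sketch using only the standard library; the scaling convention is the standard one, and the specific scores are just the examples from the conversation.

from math import erf, sqrt

def iq_percentile(iq, mean=100.0, sd=15.0):
    """Percentile rank implied by the usual normal-curve scaling of IQ."""
    z = (iq - mean) / sd
    return 100.0 * 0.5 * (1.0 + erf(z / sqrt(2.0)))

for iq in (70, 100, 140):
    print(iq, "->", round(iq_percentile(iq), 1), "percentile")

# 140 lands in roughly the top half percent and 70 near the 2nd percentile,
# but nothing here supports calling one person "twice as smart" as another:
# the scale has no true zero, so ratios of scores are meaningless.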

That's a different percentile. - So it's similar to like in chess, you have an Elo rating that's designed to rank order people. So you can't say it's twice one person. If your Elo rating's twice another person, I don't think you're twice as good at chess. It's not stable in that way, because it's very difficult to do these kinds of comparisons.

So what can we say about the number itself? Is that stable across tests and so on or no? - There are a number of statistical properties of any test. They're called psychometric properties. You have validity, you have reliability. Reliability, there are many different kinds of reliability. They all essentially measure stability.

And IQ tests are stable within an individual. There are some longitudinal studies where children were measured at age 11, and again when they were 70 years old, and the two IQ scores are highly correlated with each other. This comes from a fascinating study from Scotland: in the 1930s, researchers decided to get an IQ test on every single child age 11 in the whole country.

And they did. And those records were discovered in an old storeroom at the University of Edinburgh by a friend of mine, Ian Deary, who found the records, digitized them, and has done a lot of research on the people who are still alive today from that original study, including brain imaging research, by the way.

Really, it's a fascinating group of people who are studied. Not to get ahead of the story, but one of the most interesting things they found is a very strong relationship between IQ measured at age 11 and mortality. So that, you know, 70 years later, they looked at the survival rates and they could get death records from everybody.

And Scotland has universal healthcare for everybody. And it turned out if you divide people by their age 11 IQ score into quartiles, and then look at how many people are alive 70 years later, I know this is in the book, I have the graph in the book, but there are essentially twice as many people alive in the highest IQ quartile than in the lowest IQ quartile.

- Interesting. - It's true in men and women. - Interesting. - So it makes a big difference. Now, why this is the case is not so clear since everyone had access to healthcare. - Well, there's a lot, and we'll talk about it, just the sentences you used now could be explained by nature or nurture.

We don't know. Now, there's a lot of science that starts to then dig in and investigate that question. But let me linger on the IQ test. How are the tests designed, IQ test design, how do they work? Maybe some examples for people who are not aware. What makes a good IQ test question that sneaks up on this G factor?

- Well, your question is interesting because you want me to give examples of items that make good items. And what makes a good item is not so much its content, but its empirical relationship to the total score that turns out to be valid by other means. So for example, let me give you an odd example from personality testing.

- Nice. - So there's a personality test called the Minnesota Multiphasic Personality Inventory, MMPI. Been around for decades. - I've heard about this test recently because of the Johnny Depp and Amber Heard trial. I don't know if you've been paying attention to that. But they had psychologists-- - I have not been paying attention to it.

- They had psychologists on the stand, and they were talking, apparently those psychologists did, again, I'm learning so much from this trial. They did different, a battery of tests to diagnose personality disorders. Apparently there's that systematic way of doing so, and the Minnesota one is one of the ones that there's the most science on.

There's a lot of great papers, which were all continually cited on the stand, which is fascinating to watch. Sorry, a little bit of a tangent. - It's okay, I mean, this is interesting because you're right, it's been around for decades. There's a lot of scientific research on the psychometric properties of the test, including what it predicts with respect to different categories of personality disorder.

But what I wanna mention is the content of the items on that test. All of the items are essentially true/false items. True or false, I prefer a shower to a bath. True or false, I think Lincoln was a better president than Washington. What have all these, what does that have to do?

And the point is the content in these items, nobody knows why these items in aggregate predict anything, but empirically they do. It's a technique of choosing items for a test that is called dust bowl empiricism, that the content doesn't matter, but for some reason, when you get a criterion group of people with this disorder and you compare them to people without that disorder, these are the items that distinguish.

Irrespective of content, it's a hard concept to grasp. - Well, first of all, it's fascinating. 'Cause I consider myself part psychologist 'cause I love human-robot interaction, and that's a problem, half of that problem is a psychology problem 'cause there's a human. So designing these tests to get at the questions is the fascinating part.

What does dust bowl empiricism refer to? Does it refer to the final result? Yeah, so it's the test is dust bowl empiricism, but how do you arrive at the battery of questions? I presume one of the things, now again, I'm going to the excellent testimony in that trial, 'cause they also explain the tests, that a bunch of the questions are kind of, make you forget that you're taking a test.

Like, it makes it very difficult for you to somehow figure out what you're supposed to answer. - Yes, it's called social desirability. But we're getting a little far afield 'cause I only wanted to give that example of dust bowl empiricism. When we talk about the items on an IQ test, many of those items in the dust bowl empiricism method have no face validity.

In other words, they don't look like they measure anything. - Yes. - Whereas most intelligence tests, the items actually look like they're measuring some mental ability. So here's one of the-- - Oh, so you were bringing that up as an example as what it is not. - Yes. - Got it.

- Okay, so I don't want to go too far afield on it. - Too far afield is actually one of the names of this podcast, so I should mention that. - Far afield, yeah. - Far afield. Yeah, so anyway, sorry. So they feel the questions look like they passed the face validity test.

- And some more than others. So for example, let me give you a couple of things here. One of the subtests on a standard IQ test is general information. Let me just think a little bit 'cause I don't want to give you the actual item. But if I said, how far is it between Washington DC and Miami, Florida within 500 miles, plus or minus?

Well, it's not a fact most people memorize, but you know something about geography. You say, well, I flew there once. I know planes fly 500 miles. You can kind of make an estimate. But it also seems like it would be very cultural. So there's that kind of general information.

Then there's vocabulary test. What does regatta mean? And I choose that word because that word was removed from the IQ test because people complained that disadvantaged people would not know that word just from their everyday life. Here's another example from a different kind of subtest. - What's regatta, by the way?

- Regatta is a-- - I think I'm disadvantaged. - A sailing competition, a competition with boats. Not necessarily sailing, but a competition with boats. - Yep, yep, I'm probably disadvantaged in that way. Okay, excellent, so that was removed. Anyway, what you were saying. - Okay, so here's another subtest.

I'm gonna repeat a string of numbers, and when I'm done, I want you to repeat them back to me. Ready? Seven, four, two, eight, one, six. - That's way too many. Seven, four, two, eight, one, six. - You get the idea. Now, the actual test starts with a smaller number, you know, like two numbers, and then as people get it right, you keep going, adding to the string of numbers until they can't do it anymore.

Okay, but now try this. I'm gonna say some numbers, and when I'm done, I want you to repeat them to me backwards. - I quit. - Okay, now, so I gave you some examples of the kind of items on an IQ test. - Yes. - General information. I can't even remember all of it.
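
For readers who want to see the shape of that subtest rather than play along, here is a minimal interactive sketch of a digit-span task. The starting length, the stop-at-first-miss rule, and the scoring are simplifying assumptions for illustration, not the exact protocol of any published IQ test.

import random

def digit_span(backward=False, start_len=2, max_len=9):
    """Present ever-longer digit strings; stop at the first miss; return the longest length recalled."""
    best = 0
    for length in range(start_len, max_len + 1):
        digits = [random.randint(0, 9) for _ in range(length)]
        prompt = "repeat backwards:" if backward else "repeat forwards:"
        print(prompt, " ".join(str(d) for d in digits))
        response = input("> ").split()
        target = list(reversed(digits)) if backward else digits
        if response == [str(d) for d in target]:
            best = length            # got this length right; try a longer string
        else:
            break                    # first miss ends the subtest
    return best

print("forward span:", digit_span())
print("backward span:", digit_span(backward=True))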

General information, vocabulary, digit span forward and digit span backward. - Well, you said I can't even remember them. That's a good question for me. What does memory have to do with your function? - Let's hold on. - Okay, all right. - Let's just talk about these examples. Now, some of those items seem very cultural, and others seem less cultural.

Which ones do you think, scores on which subtests are most highly correlated with the G factor? - Well, the two that are less cultural. - Well, it turns out vocabulary is highly correlated, and it turns out that digit span backwards is highly correlated. - How do you figure? - Now you have decades of research to answer the question, how do you figure?

- Right, so now there's good research that gives you intuition about what kind of questions get added. Just like there's something I've done, I've actually used for research in semi-autonomous vehicles, like whether humans are paying attention, there's a body of literature that does n-back tests, for example, where you have to put workload on the brain to do recall, memory recall, and that helps you kind of put some work onto the brain while the person is doing some other task, and there's some interesting research with that.

But that's loading the memory, and so there's like research around stably what that means about the human mind, and here you're saying recall backwards is a good predictor. - It's a transformation. - Yeah, so you have to do some, like you have to load that into your brain, and not just remember it, but do something with it.

- Right, now here's another example of a different kind of test, called the Hick paradigm, and it's not verbal at all. It's a little box, and there are a series of lights arranged in a semi-circle at the top of the box, and then there's a home button that you press, and when one of the lights goes on, there's a button next to each of those lights, you take your finger off the home button, and you just press the button next to the light that goes on, and so it's a very simple reaction time.

Light goes on, as quick as you can, you press the button, and you get a reaction time. From the moment you lift your finger off the button, when you press the button where the light is, that reaction time doesn't really correlate with IQ very much, but if you change the instructions, and you say three lights are gonna come on simultaneously, I want you to press the button next to the light that's furthest from the other two.

So maybe lights one and two go on, and light six goes on simultaneously. You take your finger off, and you would press the button by light six. That's, that reaction time to a more complex task, it's not really hard, almost everybody gets it all right, but your reaction time to that is highly correlated with the G factor.
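
The decision rule in that odd-man-out version of the task is trivial to state, which is part of the point: the computation is easy, yet the speed of carrying it out tracks the G factor. A sketch with made-up light positions (the positions and the example are purely illustrative):

def odd_man_out(positions):
    """Return the lit position farthest from the other lit positions,
    measured by its distance to the nearest other light."""
    best_light, best_gap = None, -1.0
    for i, p in enumerate(positions):
        nearest = min(abs(p - q) for j, q in enumerate(positions) if j != i)
        if nearest > best_gap:
            best_light, best_gap = p, nearest
    return best_light

# Lights 1 and 2 come on together with light 6 across the arc:
print(odd_man_out([1, 2, 6]))   # -> 6, the light farthest from the other two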

- This is fascinating, so reaction time, so there's a temporal aspect to this. So what role does time-- - Speed of processing. It's the speed of processing. - Is this also true for ones that take longer, like five, 10, 30 seconds? Is time part of the measure with some of these ideas?

- Yes, and that is why some of the best IQ tests have a time limit, because if you have no time limit, people can do better, but it doesn't distinguish among people that well. So that adding the time element is important. So speed of information processing, and reaction time is a measure of speed of information processing, turns out to be related to the G factor.

- But the G factor only accounts for maybe half or some amount of the test performance. For example, I get pretty bad test anxiety. Like I was never, I mean, I just don't enjoy tests. I enjoy going back into my cave and working. Like I've always enjoyed homework way more than tests, no matter how hard the homework is, 'cause I can go back to the cave and hide away and think deeply.

There's something about being watched and having a time limit that really makes me anxious, and I could just see the mind not operating optimally at all, but you're saying underneath there, there's still a G factor, there's still-- - No question, no question. - Boy. - And if you get anxious taking the test, many people say, oh, I didn't do well 'cause I'm anxious.

I hear that a lot. Well, fine, if you're really anxious during the test, the score will be a bad estimate of your G factor. It doesn't mean the G factor isn't there. And by the way, standardized tests like the SAT, they're essentially intelligence tests. They are highly G loaded.

Now, the people who make the SAT don't wanna mention that. They have enough trouble justifying standardized testing, but to call it an intelligence test is really beyond the pale. But in fact, it's so highly correlated because it's a reasoning test. The SAT is a reasoning test, a verbal reasoning, mathematical reasoning.

And if it's a reasoning test, it has to be related to G. But if people go in and take a standardized test, whether it's an IQ test or the SAT, and they happen to be sick that day with 102 fever, the score is not going to be a good estimate of their G.

If they retake the test when they're not anxious or less anxious or don't have a fever, the score will go up and that will be a better estimate. But you can't say their G factor increased between the two tests. - Well, it's interesting. So the question is how wide of a battery of tests is required to estimate the G factor well?

Because I'll give you as my personal example, I took the SAT and, I think it was called the ACT where I was, too. Also, I took the SAT many times. Every single time I got a perfect on math. And verbal, the time limit on the verbal made me very anxious.

I did not, I mean, part of it, I didn't speak English very well, but honestly, it was like, you're supposed to remember stuff. And like, I was so anxious. And like, as I'm reading, I'm sweating. I can't, you know that like, that feeling you have when you're reading a book and you just read a page and you know nothing about what you've read because you zoned out?

That's the same feeling of like, I can't, I have to, you're like, nope. Read and understand and that anxiety is like, and you start seeing like the typography versus the content of the words. Like that was, I don't, it's interesting because I know that what they're measuring, I could see being correlated with something.

But that anxiety or some aspect of the performance sure plays a factor. And I wonder how you sneak up in a stable way. I mean, this is a broader discussion about, that's like standardized testing, how you sneak up, how you get at the fact that I'm super anxious and still nevertheless measure some aspect of my intelligence.

I wonder, I don't know if you can say to that, that time limit sure is a pain. - Well, let me say this. There are two ways to approach the very real problem that you say that some people just get anxious or not good test takers. By the way, part of testing is, you know the answer, you can figure out the answer or you can't.

If you don't know the answer, there are many reasons you don't know the answer at that particular moment. You may have learned it once and forgotten it. You may, it may be on the tip of your tongue and you just can't get it because you're anxious about the time limit.

You may never have learned it. You may never, you may have been exposed to it, but it was too complicated and you couldn't learn it. I mean, there are all kinds of reasons here, but for an individual to interpret your scores as an individual, whoever is interpreting the score has to take into account various things that would affect your individual score.

And that's why decisions about college admission or anything else where tests are used are hardly ever the only criterion to make a decision. - And I think people are, college admissions letting go of that very much. - Oh yes, yeah. - But what does that even mean? Because is it possible to design standardized tests that do get, that are useful to college admissions?

- Well, they already exist. The SAT is highly correlated with many aspects of success at college. - Here's the problem. So maybe you could speak to this. The correlation across the population versus individuals. So, our criminal justice system is designed to make sure, well, it's still, there's tragic cases where innocent people go to jail, but you try to avoid that.

In the same way with testing, it just, it would suck for an SAT to miss genius. - Yes, and it's possible, but it's statistically unlikely. So it really comes down to, do which piece of information maximizes your decision-making ability? So, if you just use high school grades, it's okay, but you will miss some people who just don't do well in high school, but who are actually pretty smart, smart enough to be bored silly in high school, and they don't care, and their high school GPA isn't that good.

So you will miss them, in the same sense that somebody who could be very able and ready for college just doesn't do well on their SAT. This is why you make decisions by taking in a variety of information. The other thing I wanted to say, I talked about when you make a decision for an individual.

Statistically, for groups, there are many people who have a disparity between their math score and their verbal score. That disparity, or the other way around, that disparity is called tilt. The score is tilted one way or the other, and that tilt has been studied empirically to see what that predicts.

And in fact, you can't make predictions about college success based on tilt. And mathematics is a good example. There are many people, especially non-native speakers of English, who come to this country, take the SATs, do very well on the math and not so well on the verbal. Well, if they're applying to a math program, the professors there who are making the decision, or the admissions officers, don't weight the score on verbal so much, especially if it's a non-native speaker.

- Well, so yeah, you have to try to, in the admission process, bring in the context. But non-native isn't really the problem. I mean, that was part of the problem for me. But it was the anxiety, which is interesting. It's interesting. Oh boy, reducing yourself down to numbers. But it's still true.

It's still the truth. - Well-- - It's a painful truth. That same anxiety that led me to struggle with the SAT verbal tests is still within me in all ways of life. So maybe that's not anxiety. Maybe that's something, you know, like personality is also pretty stable.

- Personality is stable. Personality does impact the way you navigate life. - Yeah. - There's no question. - Yeah, and we should say that the G factor in intelligence is not just about some kind of number on a paper. It also has to do with how you navigate life, how easy life is for you in this very complicated world.

So personality's all tied into that in some deep fundamental way. - But now you've hit the key point about why we even want to study intelligence. And personality, I think, to a lesser extent. But that's my interest is more on intelligence. I went to graduate school and wanted to study personality, but that's kind of another story how I got kind of shifted from personality research over to intelligence research.

Because it's not just a number. Intelligence is not just an IQ score. It's not just an SAT score. It's what those numbers reflect about your ability to navigate everyday life. It has been said that life is one long intelligence test. (laughing) And who can't relate to that? And if you doubt, see, another problem here is a lot of critics of intelligence research, intelligence testing, tend to be academics who, by and large, are pretty smart people.

And pretty smart people, by and large, have enormous difficulty understanding what the world is like for people with IQs of 80 or 75. It is a completely different everyday experience. Even IQ scores of 85, 90, there's a popular television program, Judge Judy, where Judge Judy deals with everyday people with everyday problems.

And you can see the full range of problem-solving ability demonstrated there. And sometimes she does it for laughs, but it really isn't funny, because there are people who are very limited in their life navigation, let alone success, by not having good reasoning skills, which cannot be taught.

We know this, by the way, because there are many efforts. You know, the United States military, which excels at training people. I mean, I don't know that there's a better organization in the world for training diverse people. And they won't take people with IQs under, I think, 83 is the cutoff.

Because they have found, they are unable to train people with lower IQs to do jobs in the military. - So one of the things that G-Factor has to do with is learning. - Absolutely. Some people learn faster than others. Some people learn more than others. Now, faster, by the way, is not necessarily better, as long as you get to the same place eventually.

But, you know, there are professional schools that want students who can learn the fastest because they can learn more, or learn deeper, or all kinds of ideas about why you select people with the highest scores. And there's nothing funnier, by the way, to listen to a bunch of academics complain about the concept of intelligence and intelligence testing.

And then you go to a faculty meeting where they're discussing who to hire among the applicants. And all they talk about is how smart the person is. - We'll get to that. We'll sneak up to that in different ways. But there's something about reducing a person to a number that in part is grounded to the person's genetics that makes people very uncomfortable.

- But nobody does that. Nobody in the field actually does that. That is a worry that is a worry like, well, I don't wanna call it a conspiracy theory. I mean, it's a legitimate worry. But it just doesn't happen. Now, I had a professor in graduate school who was the only person I ever knew who considered the students only by their test scores.

- Yes. - And later in his life, he kind of backed off that. But-- - Let me ask you this. So we'll jump around. I'll come back to it. I tend to, I've had political discussions with people. And actually, my friend Michael Malice, he's an anarchist. I disagree with him on basically everything except the fact that love is a beautiful thing in this world.

And he has this test about left versus right, whatever, it doesn't matter what the test is. But he believes, the question is, do you believe that some people are better than others? The question is ambiguous. Do you believe some people are better than others? And to me, sort of the immediate answer is no.

It's a poetic question. It's an ambiguous question, right? Like, people wanna maybe, the temptation to ask better at what? Better at like sports, so on. No, to me, I stand with the sort of the founding documents of this country, which is all men are created equal. There's a basic humanity.

And there's something about tests of intelligence. Just knowing that some people are different, like the science of intelligence that shows that some people are genetically in some stable way across a lifetime, have a greater intelligence than others, makes people feel like some people are better than others. And that makes them very uncomfortable.

And maybe you can speak to that. The fact that some people are more intelligent than others in a way that's, cannot be compensated through education, through anything you do in life. What do we do with that? - Okay, there's a lot there. We haven't really talked about the genetics of it yet, but you are correct in that it is my interpretation of the data that genetics has a very important influence on the G factor.

And this is controversial. We can talk about it. But if you think that genetics, that genes are deterministic, are always deterministic, that leads to kind of the worry that you expressed. But we know now in the 21st century that many genes are not deterministic, they are probabilistic, meaning their gene expression can be influenced.

Now, whether they're influenced only by other biological variables or other genetic variables or environmental or cultural variables, that's where the controversy comes in. And we can discuss that in more detail if you like. But to go to the question about better, are people better, there's zero evidence that smart people are better with respect to important aspects of life, like honesty, even likability.

I'm sure you know many very intelligent people who are not terribly likable or terribly kind or terribly honest. - Is there something to be said? So one of the things I've recently reread for the second time, I guess that's what the word reread means, is The Rise and Fall of the Third Reich, which is, I think, the best telling of the rise and fall of Hitler.

And one of the interesting things about the people that, how should I say it, justified or maybe propped up the ideas that Hitler put forward is the fact that they were extremely intelligent. They were the intellectual class. They were, it was obvious that they thought very deeply and rationally about the world.

So what I would like to say is, one of the things that shows to me is some of the worst atrocities in the history of humanity have been committed by very intelligent people. So that means that intelligence doesn't make you a good person. I wonder if, you know, there's a G factor for intelligence.

I wonder if there's a G factor for goodness. You know, a G factor in good and evil. Of course, that's probably harder to measure 'cause that's such a subjective thing, what it means to be good. And even the idea of evil is a deeply uncomfortable thing 'cause how do we know?

- But it's independent, whatever it is, it's independent of intelligence. So I agree with you about that. But let me say this. I have also asserted my belief that more intelligence is better than less. That doesn't mean more intelligent people are better people, but all things being equal, would you like to be smarter or less smart?

So if I had a pill, I have two pills, I said, this one will make you smarter, this one will make you dumber. Which one would you like? Are there any circumstances under which you would choose to be dumber? - Well, let me ask you this. That's a very nuanced and interesting question.

There's been books written about this, right? Now we'll return to the hard questions, the interesting questions, but let me ask about human happiness. Does intelligence lead to happiness? - No. - So, okay, so back to the pill then. So why, when would you take the pill? So you said IQ 80.

90, 100, 110, you start going through the quartiles and is it obvious, isn't there diminishing returns and then it starts becoming negative? - This is an empirical question. And so I have advocated in many forums more research on enhancing the G factor. Right now there have been many claims about enhancing intelligence with, you mentioned the n-back training, that was a big deal a few years ago, it doesn't work.

Data's very clear, it does not work. - Or doing like memory tests, like training and so on. - Yeah, it may give you a better memory in the short run, but it doesn't impact your G factor. It was very popular a couple of decades ago that the idea that listening to Mozart could make you more intelligent.

There was a paper published on this; somebody I knew published this paper. Intelligence researchers never believed it for a second. There have been hundreds of studies, all the meta-analyses, all the summaries and so on. So there's nothing to it, nothing to it at all. (Lex laughs) But wouldn't it be something, wouldn't it be world shaking if you could take the normal distribution of intelligence, which we haven't really talked about yet, but IQ scores and the G factor is thought to be a normal distribution, and shift it to the right so that everybody is smarter.

Even a half a standard deviation would be world shaking because there are many social problems, many, many social problems that are exacerbated by people with lower ability to reason stuff out and navigate everyday life. - So I wonder if there's a threshold. So maybe I would push back and say universal shifting of the normal distribution may not be the optimal way of shifting.

Maybe it's better to, whatever the asymmetric kind of distribution is, is like really pushing the lower up versus trying to make the people at the average more intelligent. - So you're saying that if in fact there was some way to increase G, let's just call it metaphorically a pill, an IQ pill, we should only give it to people at the lower end?

- No, it's just intuitively I can see that life becomes easier at the lower end if it's increased. It becomes less and less, it is an empirical scientific question, but it becomes less and less obvious to me that more intelligence is better. - At the high end, not because it would make life easier, but it would make whatever problems you're working on more solvable.

And if you are working on artificial intelligence, there's a tremendous potential for that to improve society. - I understand. So at the whatever problems you're working on, yes, but there's also the problem of the human condition. There's love, there's fear, and all of those beautiful things that sometimes if you're good at solving problems, you're going to create more problems for yourself.

I'm not exactly sure. So ignorance is bliss, is a thing. So there might be a place, there might be a sweet spot of intelligence given your environment, given your personality, all of those kinds of things, and that becomes less beautifully complicated the more and more intelligent you become. But that's a question for literature, not for science, perhaps.

- Well, imagine this. Imagine there was an IQ pill, and it was developed by a private company, and they are willing to sell it to you. And whatever price they put on it, you are willing to pay it because you would like to be smarter. But just before they give you a pill, they give you a disclaimer form to sign.

Don't hold us, you understand that this pill has no guarantee that your life is going to be better, and in fact, it could be worse. - Well, yes, that's how lawyers work, but I would love for science to answer the question, to try to predict if your life is going to be better or worse when you become more or less intelligent.

It's a fascinating question about what is the sweet spot for the human condition. Some of the things we see as bugs might actually be features, may be crucial to our overall happiness. Our limitations might lead to more happiness than less. But again, more intelligence is better at the lower end.

That's something that's less arguable and fascinating, if possible, to increase. - But you know, there's virtually no research that's based on a neuroscience approach to solving that problem. All the solutions that have been proposed to solve that problem or to ameliorate that problem are essentially based on the blank slate assumption that enriching the environment, removing barriers, all good things, by the way, I'm not against any of those things, but there's no empirical evidence that they're going to improve the general reasoning ability or make people more employable.

- Have you read "Flowers for Algernon"? - Yes. - That's to the question of intelligence and happiness. - There are many profound aspects of that story. It was a film that was very good. The film was called "Charly," for the younger people who are listening to this. You might be able to stream it on Netflix or something.

But it was a story about a person with very low IQ who underwent a surgical procedure in the brain and he slowly became a genius. And the tragedy of the story is the effect was temporary. It's a fascinating story, really. - That goes in contrast to the basic human experience that each of us individually have, but it raises the question of the full range of people you might be able to be, given different levels of intelligence.

You've mentioned the normal distribution. So let's talk about it. There's a book called "The Bell Curve," published in 1994, written by psychologist Richard Herrnstein and political scientist Charles Murray. Why was this book so controversial? - This is a fascinating book. I know Charles Murray. I've had many conversations with him.

- Yeah, what is the book about? - The book is about the importance of intelligence in everyday life. That's what the book is about. It's an empirical book. It has statistical analyses of very large databases that show that essentially IQ scores or their equivalent are correlated to all kinds of social problems and social benefits.

And that in itself is not where the controversy about that book came. The controversy was about one chapter in that book. And that is a chapter about the average difference in mean scores between black Americans and white Americans. And these are the terms that were used in the book at the time and are still used to some extent.

And historically, or really for decades, it has been observed that disadvantaged groups score on average lower than Caucasians on academic tests, tests of mental ability, and especially on IQ tests. And the difference is about a standard deviation, which is about 15 points, which is a substantial difference. In the book, Herrnstein and Murray in this one chapter assert clearly and unambiguously that whether this average difference is due to genetics or not, they are agnostic.

They don't know. Moreover, they assert they don't care, because you wouldn't treat anybody differently knowing if there was a genetic component or not, because that's a group average finding. Every individual has to be treated as an individual. You can't make any assumption about what that person's intellectual ability might be from the fact of an average group difference.

They're very clear about this. Nonetheless, people took away, I'm gonna choose my words carefully 'cause I have a feeling that many critics didn't actually read the book. They took away that Herrnstein and Murray were saying that blacks are genetically inferior. That was the take-home message. And if they weren't saying it, they were implying it because they had a chapter that discussed this empirical observation of a difference.

And isn't this horrible? And so the reaction to that book was incendiary. - What do we know about, from that book and the research beyond, about race differences and intelligence? - It's still the most incendiary topic in psychology. Nothing has changed that. Anybody who even discusses it is easily called a racist just for discussing it.

It's become fashionable to find racism in any discussion like this. It's unfortunate. The short answer to your question is there's been very little actual research on this topic since 19-- - Since the Bell Curve. - Since the Bell Curve, even before. This really became incendiary in 1969 with an article published by an educational psychologist named Arthur Jensen.

Let's just take a minute and go back to that to see the Bell Curve in a little bit more historical perspective. Arthur Jensen was an educational psychologist at UC Berkeley. I knew him as well. And in 1969 or '68, the Harvard Educational Review asked him to do a review article on the early childhood education programs that were designed to raise the IQs of minority students.

This was before the federally funded Head Start program. Head Start had not really gotten underway at the time Jensen undertook his review of what were a number of demonstration programs. And these demonstration programs were for young children who were around kindergarten age. And they were specially designed to be cognitively stimulating, to provide lunches, do all the things that people thought would minimize this average gap of intelligence tests.

There was a strong belief among virtually all psychologists that the cause of the gap was unequal opportunity due to racism, due to all negative things in the society. And if you could compensate for this, the gap would go away. So early childhood education back then was called literally compensatory education.

Jensen looked at these programs. He was an empirical guy. He understood psychometrics. And he wrote a, it was over a hundred page article detailing these programs and the flaws in their research design. Some of the programs reported IQ gains of on average five points, but a few reported 10, 20, and even 30 point gains.

One was called the miracle in Milwaukee. That investigator went to jail ultimately for fabricating data. But the point is that Jensen wrote an article that said, look, the opening sentence of his article is classic. The opening sentence is, I may not quote it exactly right, but it's, we have tried compensatory education and it has failed.

And he showed that these gains were essentially nothing. You couldn't really document empirically any gains at all from these really earnest efforts to increase IQ. But he went a step further, a fateful step further. He said, not only have these efforts failed, but because they have had essentially no impact, we have to re-examine our assumption that these differences are caused by environmental things that we can address with education.

We need to consider a genetic influence, whether there's a genetic influence on this group difference. - So you said that this is one of the more controversial works ever in science. - I think it's the most infamous paper in all of psychology, I would go on to say. Because in 1969, the genetic data was very skimpy on this question, skimpy and controversial.

It's always been controversial, but it was even skimpy and controversial. It's kind of a long story that I go into a little bit in more detail in the book, "Neuroscience of Intelligence." But to say he was vilified is an understatement. I mean, he couldn't talk at the American Psychological Association without bomb threats clearing the lecture hall.

Campus security watched him all the time. They opened his mail. He had to retreat to a different address. This was one of the earliest kinds, this is before the internet, and kind of internet social media mobs, but it was that intense. And I have written that overnight, after the publication of this article, all intelligence research became radioactive.

Nobody wanted to talk about it, and nobody was doing more research. And then The Bell Curve came along, and the Jensen controversy was dying down. I have stories that Jensen told me about his interaction with the Nixon White House on this issue. I mean, this was like a really big deal.

It was some unbelievable stories, but he told me this, so I kind of believe these stories. Nonetheless-- - 25 years later. - 25 years later. - So all this silence basically saying, nobody wants to do this kind of research. There's so much pressure, so much attack against this kind of research.

And here are these sort of bold, stupid, crazy people that decide to dive right back in. I wonder how much discussion there was. Do we include this chapter or not? - Murray has said they discussed it, and they felt they should include it, and they were very careful in the way they wrote it, which did them no good.

So as a matter of fact, when the bell curve came out, it was so controversial. I got a call from a television show called Nightline. It was with a broadcaster called Ted Koppel, who had this evening show, I think it was on late at night, talked about news. It was a straight up news thing.

And a producer called and asked if I would be on it to talk about the bell curve. And I said, you know, she asked me what I thought about the bell curve as a book. And I said, look, it's a very good book. It talks about the role of intelligence in society.

And she said, no, no, what do you think about the chapter on race? That's what we want you to talk about. I remember this conversation. I said, well, she said, what would you say if you were on TV? And I said, well, what I would say is that it's not at all clear if there's any genetic component to any intelligence differences, but if there were a strong genetic component, that would be a good thing.

And, you know, complete silence on the other end of the phone. And she said, well, what do you mean? And I said, well, the more genetic any difference is, the more it's biological. And if it's biological, we can figure out how to fix it. - I see, that's interesting.

She said, would you say that on television? I said, yes. And she said, no. (laughing) And so that was the end of that. - So that's more like, biology is within the reach of science, and the environment is a public policy, social thing, all those kinds of things. From your perspective, whichever one you think is more amenable to solutions in the short term is the one that excites you.

But you saying that is good, the truth of genetic differences, no matter what, between groups is a painful, harmful, potentially dangerous thing. Let me ask you this question, whether it's the bell curve or any research on race differences, can that be used to increase the amount of racism in the world?

Can that be used to increase the amount of hate in the world? Do you think about this kind of stuff? - I've thought about this a lot, not as a scientist, but as a person. And my sense is there are such enormous reservoirs of hate and racism that have nothing to do with scientific knowledge or the data that speak against that.

That no, I don't wanna give racist groups a veto power over what scientists study. If you think that the differences, and by the way, virtually no one disagrees that there are differences in scores. It's all about what causes them and how to fix it. So if you think this is a cultural problem, then you must ask the question, do you want to change anything about the culture?

Or are you okay with the culture? 'Cause you don't feel it's appropriate to change a person's culture. So are you okay with that? And the fact that that may lead to disadvantages in school achievement? It's a question. If you think it's environmental, what are the environmental parameters that can be fixed?

I'll tell you one, lead from gasoline in the atmosphere, lead in paint, lead in water. That's an environmental toxin that society has the means to eliminate, and they should. - Yeah, just sort of trying to find some insight and conclusion to this very difficult topic. Has there been research on environment versus genetics, nature versus nurture, on this question of race differences?

- There is not, no one wants to do this research. First of all, it's hard research to do. Second of all, it's a minefield. No one wants to spend their career on it. Tenured people don't want to do it, let alone students. The way I talk about it, well, before I tell you the way I talk about it, I want to say one more thing about Jensen.

He was once asked by a journalist straight out, "Are you a racist?" His answer was very interesting. His answer was, "I've thought about that a lot, and I've concluded it doesn't matter." Now, I know what he meant by this. - The guts to say that, wow. - He was a very unusual person.

I think he had a touch of Asperger's syndrome, to tell you the truth, because I saw him in many circumstances. - He would be canceled on Twitter immediately with that sentence. - Yeah, but what he meant was he had a hypothesis. And with respect to group differences, he called it the default hypothesis.

He said whatever factors affect individual intelligence are likely the same factors that affect group differences. It was the default, but it was a hypothesis. It should be tested. And if it turned out empirical tests didn't support the hypothesis, he was happy to move on to something else. He was absolutely committed to that scientific ideal.

It's an empirical question. We should look at it, and let's see what happens. - The scientific method cannot be racist, from his perspective. If they follow the scientific method, it doesn't matter what the scientists believe. - And if they are biased, and they consciously or unconsciously bias the data, other people will come along to replicate it.

They will fail, and the process over time will work. - So let me push back on this idea, because psychology to me is full of gray areas. And what I've observed about psychology, even replication crisis aside, is that something about the media, something about journalism, something about the virality of ideas in the public sphere, they misinterpret.

They take up things from studies willfully, or from ignorance, misinterpret findings, and tell narratives around that. I personally believe, for me, I'm not saying that broadly about science, but for me, it's my responsibility to anticipate the ways in which findings will be misinterpreted. So I've thought about this a lot, 'cause I publish papers on semi-autonomous vehicles, and those involve, you know, cars, people dying in cars.

There are people that have written me letters, well, emails, nobody writes letters, I wish they did, saying that I have blood on my hands, because of things that I would say, positive or negative, there are consequences. In the same way, when you're a researcher of intelligence, I'm sure you might get emails, or at least people might believe that a finding of your study is going to be used by a large number of people to increase the amount of hate in the world.

I think there's some responsibility on scientists, but for me, I think there's a great responsibility to anticipate the ways things will be misinterpreted, and there, you have to, first of all, decide whether you want to say a thing at all, do the study at all, publish the study at all, and two, the words with which you explain it.

I find this on Twitter a lot, actually, which is, when I write a tweet, and I'm usually just doing so innocently, I'll write it, it takes me five seconds to write it, or whatever, 30 seconds to write it, and then I'll think, all right, I close my eyes and try to see how the world will interpret this, what are the ways in which this will be misinterpreted?

And I'll sometimes adjust that tweet to see, yeah, so in my mind, it's clear, but that's because it's my mind from which this tweet came, but you have to think, in a fresh mind that sees this, and it's spread across a large number of other minds, how will the interpretation morph?

I mean, for a tweet, it's a silly thing, it doesn't matter, but for a scientific paper and study and finding, I think it matters, so I don't know. I don't know what your thoughts about that, 'cause maybe for Jensen, the data's there, what do you want me to do?

This is a scientific process that's been carried out, if you think the data was polluted by bias, do other studies that reveal the bias, but the data's there. I'm not a poet, I'm not a literary writer, what do you want me to do? I'm just presenting you the data.

What do you think on that spectrum? What's the role of a scientist? - The reason I do podcasts, the reason I write books for the public is to explain what I think the data mean and what I think the data don't mean. I don't do very much on Twitter other than to retweet references to papers.

I don't think it's my role to explain these, 'cause they're complicated, they're nuanced, but when you decide not to do a scientific study or not to publish a result because you're afraid the result could be harmful or insensitive, that's not an unreasonable thought, and people will make different conclusions and decisions about that.

I wrote about this, I'm the editor of a journal called Intelligence, which publishes scientific papers. Sometimes we publish papers on group differences. Those papers sometimes are controversial. These papers are written for a scientific audience, they're not written for the Twitter audience, so I don't promote them very much on Twitter, but in a scientific paper, you have to now choose your words carefully also because those papers are picked up by non-scientists, by writers of various kinds, and you have to be available to discuss what you're saying and what you're not saying.

Sometimes you are successful at having a good conversation, like we are today, that doesn't start out pejorative. Other times I have been asked to participate in debates where my role would be to justify race science. Well, you can see, you start out, and that was a BBC request that I received.

- I have so much, it's a love-hate relationship, mostly hate, with these shallow journalism organizations, so they would want to use you, in a debate setting, to communicate, like, there are race differences between groups, and make that into a debate, and put you in the role of-- - Justifying racism.

- Justifying racism. - That's what they're asking me to do. - Versus, like, educating about this field of the science of intelligence, yeah. - I wanna say one more thing before we get off the normal distribution. You also asked me, what is the science after the bell curve? And the short answer is there's not much new work, but whatever work there is supports the idea that there still are group differences.

It's arguable whether those differences have diminished at all or not, and there is still a major problem in underperformance for school achievement for many disadvantaged and minority students, and there so far is no way to fix it. What do we do with this information? Is this now a task?

Now, we'll talk about the future on the neuroscience and the biology side, but in terms of this information as a society in the public policy, in the political space, in the social space, what do we do with this information? - I've thought a lot about this. The first step is to have people interested in policy understand what the data actually show to pay attention to intelligence data.

You can read policy papers about education and using your word processor, you can search for the word intelligence. You can search a 20,000 word document in a second and find out the word intelligence does not appear anywhere. In most discussions about what to do about achievement gaps, I'm not talking about test gaps, I'm talking about actual achievement gaps in schools, which everyone agrees is a problem.

The word intelligence doesn't appear among educators. - That's fascinating. - As a matter of fact, in California, there has been tremendous controversy about recent attempts to revise the curriculum for math in high schools, and we had a Stanford professor of education who was running this review assert there's no such thing as talent, mathematical talent.

And she wanted to get rid of the advanced classes in math because not everyone could do that. Now, of course, this has been very controversial, they've retreated somewhat, but the idea that a university professor was in charge of this who believes that there's no talent, that it doesn't exist, this is rather shocking, let alone the complete absence of intelligence data.

By the way, let me tell you something about what the intelligence data show. Let's take race out of it. Even though the origins of these studies were a long time ago, I'm blocking on the name of the report, the Coleman Report was a famous report about education, and they measured all kinds of variables about schools, about teachers, and they looked at academic achievement as an outcome.

And they found the most predictive variables of education outcome were the variables the student brought with him or her into the school, essentially their ability. And that when you combine the school and the teacher variables together, the quality of the school, the funding of the school, the quality of the teachers, their education, you put all the teacher and school variables together, it barely accounted for 10% of the variance.

And this has been replicated now. So the best research we have shows that school variables and teacher variables together account for about 10% of student academic achievement. Now, you wanna have some policy on improving academic achievement, how much money do you wanna put into teacher education? How much money do you wanna put into the quality of the school administration?

You know who you can ask? You can ask the Gates Foundation, because they spent a tremendous amount of money doing that. And at the end of it, because they're measurement people, they wanna know the data, they found it had no impact at all. And they've kind of pulled out of that kind of program.
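
A small synthetic sketch of what "school and teacher variables account for about 10% of the variance" means in regression terms. The data here are simulated, and the 60% share given to student ability is an arbitrary placeholder chosen only to be much larger, not a figure taken from the Coleman Report or any other study.

```python
# Synthetic illustration (made-up data) of "school and teacher variables account
# for ~10% of the variance": an R^2 of about 0.10 in a regression of achievement
# on those variables.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

school_quality = rng.normal(size=n)    # stand-in for combined school/teacher variables
student_ability = rng.normal(size=n)   # what the student brings with them

# Construct achievement so school factors explain ~10% of variance; the 0.60 share
# for ability is an arbitrary illustrative choice, the rest is noise.
achievement = (np.sqrt(0.10) * school_quality
               + np.sqrt(0.60) * student_ability
               + np.sqrt(0.30) * rng.normal(size=n))

# For a one-predictor linear fit, R^2 is just the squared correlation
r_school = np.corrcoef(school_quality, achievement)[0, 1]
r_ability = np.corrcoef(student_ability, achievement)[0, 1]
print(f"R^2 from school/teacher variables: {r_school**2:.2f}")   # ~0.10
print(f"R^2 from student ability:          {r_ability**2:.2f}")  # ~0.60
```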

- So, oh boy. Let me ask you, this is me talking, but there's-- - Just the two of us. - Just the two of us, but I'm gonna say some funny and ridiculous things, so you surely are not approving of it. But there's a movie called Clerks. - I've seen it, I've seen it, yeah.

- There's a funny scene in there where a lovely couple are talking about the number of previous sexual partners they had. And the woman says that, I believe, she just had a handful, like two or three or something like that, sexual partners, but then she also mentioned that she, what's that called, fellatio, what's the scientific term, but she went, you know, gave a blowjob to 37 guys, I believe it is.

And so that has to do with the truth. So sometimes, knowing the truth can get in the way of a successful relationship, of love, of some of the human flourishing. And it seems to me that's at the core here, that facing some kind of truth that's not able to be changed is limiting as opposed to empowering.

That's the concern. If you sort of test for intelligence and lay the data out, it feels like you will give up on certain people. You will sort of start binning people, it's like, well, this person is like, let's focus on the average people, or let's focus on the very intelligent people.

That's the concern. And there's a kind of intuition that if we just don't measure, and we don't use that data, that we would treat everybody equal and give everybody equal opportunity. If we have the data in front of us, we're likely to misdistribute the amount of sort of attention we allocate, resources we allocate to people.

That's probably the concern. - It's a realistic concern, but I think it's a misplaced concern if you wanna fix the problem. If you wanna fix the problem, you have to know what the problem is. - Yep. - Now, let me tell you this. Let's go back to the bell curve, not the bell curve, but the normal distribution.

- Yes. - 16% of the population on average has an IQ under 85. If you have an IQ under 85, it's very hard to find gainful employment at a salary that sustains you, at least minimally, in modern life. Okay? Not impossible, but it's very difficult.

16% of the population of the United States is about 51 or 52 million people with IQs under 85. This is not a small issue. 14 million children have IQs under 85. Is this something we wanna ignore? What is the Venn diagram between, you know, people with IQs under 85 and achievement in school or achievement in life?
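
As a rough sanity check on those figures, here is the arithmetic under the conventional IQ norming of mean 100 and standard deviation 15; the US population number is my own rough assumption, not a figure from the conversation.

```python
# Quick check of the "16%" and "51 or 52 million" figures, assuming IQ is normed
# to mean 100, SD 15, and a rough US population of ~330 million (an assumption).
from scipy.stats import norm

frac_below_85 = norm.cdf(85, loc=100, scale=15)   # ~0.159, i.e., about 16%
us_population = 330_000_000                       # assumption, rough recent figure

print(f"Fraction below IQ 85: {frac_below_85:.3f}")
print(f"People below IQ 85:   {frac_below_85 * us_population / 1e6:.0f} million")  # ~52 million
```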

There's a lot of overlap there. This is why, to go back to the IQ pill, if there were a way to shift that curve toward the higher end, that would have a big impact. - If I could maybe, before we talk about the impact on life and so on, some of the criticisms of the bell curve.

So Stephen Jay Gould wrote that the bell curve rests on four incorrect assumptions. It would be just interesting to get your thoughts on the four assumptions, which are intelligence must be reducible to a single number, intelligence must be capable of rank ordering people in a linear order, intelligence must be primarily genetically based, and intelligence must be essentially immutable.

Maybe not as criticisms, but as thoughts about intelligence. - Yeah, we could spend a lot of time on him. - On Stephen Jay Gould? - Yes. - Yeah. - He wrote that in what, about 1985, 1984? His views were overtly political, not scientific. He was a scientist, but his views on this were overtly political, and I would encourage people listening to this, if they really wanna understand his criticisms, they should just Google what he had to say, and Google the scientific reviews of his book, "The Mismeasure of Man," and they will take these statements apart.

They were wrong, not only were they wrong, but when he asserted in his first book that there was no biological basis, essentially, to IQ, by the time the second edition came around, there were studies of MRIs showing that brain size, brain volume, were correlated to IQ scores, which he declined to put in his book.

(laughs) - So, okay, I'm learning a lot today. I didn't know, actually, the extent of his work. I was just using a few little snippets of criticism. - That's interesting, so there's a battle here. He wrote a book, "Mismeasure of Man," that's missing a lot of the scientific grounding.

- His book is highly popular in colleges today. You can find it in any college bookstore under assigned reading. It's highly popular. - "The Mismeasure of Man"? - Yes, highly influential. - Can you speak to "The Mismeasure of Man"? I'm undereducated about this, so what, is this the book basically criticizing the ideas in the book?

- Yeah, yeah, where those four things came from. And it is really a book that was really taken apart point by point by a number of people who actually understood the data. And he didn't care. He didn't care, he didn't modify anything. - Listen, because this is such a sensitive topic, like I said, I believe the impact of the work, as it is misinterpreted, has to be considered, because it's not just going to be scientific discourse, it's going to be political discourse, there's going to be debates, there's going to be politically motivated people that will use messages in each direction, make something like the bell curve the enemy or the support for one's racist beliefs.

And so I think you have to consider that, but it's difficult because Nietzsche was used by Hitler to justify a lot of his beliefs, and it's not exactly on Nietzsche to anticipate Hitler, or how his ideas will be misinterpreted and used for evil. But there's a balance there. So I understand, this is really interesting, I didn't know, is there any criticism of the book you find compelling or interesting or challenging to you from a scientific perspective?

- There were factual criticisms about the nature of the statistics that were used, the statistical analyses, these are more technical criticisms, and they were addressed by Murray in a couple of articles where he took all the criticisms and spoke to them. And people listening to this podcast can certainly find all those online.

And it's very interesting. But Murray went on to write some additional books, two in the last couple of years. One about human diversity, where he goes through the data, refuting the idea that race is only a social construct with no biological meaning. He discusses the data, it's a very good discussion, you don't have to agree with it, but he presents data in a cogent way, and he talks about the critics of that, and he talks about their data in a cogent, non-personal way.

It's a very informative discussion. The book is called "Human Diversity." He talks about race and he talks about gender, same thing, about sex differences. And more recently, he's written what might be his final say on this, a book called "Facing Reality," where he talks about this again. So, you know, he can certainly defend himself.

He doesn't need me to do that. But I would urge people who have heard about him and the bell curve, and who think they know what's in it, you are likely incorrect, and you need to read it for yourself. - But it is, scientifically, it's a serious subject, it's a difficult subject.

Ethically, it's a difficult subject. Everything you said here calmly and thoughtfully is difficult. It's difficult for me to even consider that the G factor exists. I don't mean, like, that somehow the G factor is inherently racist or sexist or whatever. It's just, it's difficult in the way that considering the fact that we die one day is difficult.

That we are limited by our biology is difficult. And it's, at least from an American perspective, you like to believe that everything is possible in this world. - Well, that leads us to what I think we should do with this information. And what I think we should do with this information is unusual.

- Uh-oh. - Because I think what we need to do is fund more neuroscience research on the molecular biology of learning and memory. Because one definition of intelligence is based on how much you can learn and how much you can remember. - Yes. - And if you accept that definition of intelligence, then there are molecular studies going on now, and Nobel Prizes being won on molecular biology or molecular neurobiology of learning and memory.

Now, the step those researchers, those scientists need to take when it comes to intelligence is to focus on the concept of individual differences. Intelligence research has individual differences at its heart, because it assumes that people differ on this variable and those differences are meaningful and need understanding. Cognitive psychologists who have morphed into molecular biologists studying learning and memory have historically hated the concept of individual differences.

Some now are coming around to it. I once sat next to a Nobel Prize winner for his work on memory. And I asked him about individual differences. And he said, "Don't go there. It'll set us back 50 years." But I said, "Don't you think they're the key, though, to understand, you know, why can some people remember more than others?" He said, "You don't wanna go there." - I think the 21st century will be remembered by the technology and the science that goes to individual differences.

Because we now have the data, we now have the tools to much, much better start to measure, start to estimate, not just through tests and IQ-test type of things, sort of outside-the-body kind of things, but by measuring all kinds of stuff about the body.

So yeah, truly go into the molecular biology, to the neurobiology, to the neuroscience. Let me ask you about life. (both laugh) How does intelligence correlate with or lead to or has anything to do with career success? You've mentioned these kinds of things. Is there any data, you've had an excellent conversation with Jordan Peterson, for example.

Is there any data on what intelligence means for success in life? - Success in life, there is a tremendous amount of validity data that looked at intelligence test scores and various measures of life success. Now, of course, life success is a pretty broad topic and not everybody agrees on what success means, but there's general agreement on certain aspects of success that can be measured.

And those-- - Including life expectancy, like you said. - Life expectancy, now there's life success. Life expectancy, I mean, that is such an interesting finding but IQ scores are also correlated to things like income. Now, okay, so who thinks income means you're successful? That's not the point. The point is that income is one empirical measure in this culture that says something about your level of success.

You can define success in ways that have nothing to do with income. You can define success based on your evolutionary natural selection success. And even that, by the way, is correlated to IQ in some studies. So however you wanna define success, IQ is important. It's not the only determinant.

People get hung up on, well, what about personality? What about so-called emotional intelligence? Yes, all those things matter. The thing that matters empirically, the single thing that matters the most is your general ability, your general mental intellectual ability, your reasoning ability. And the more complex your vocation, the more complex your job, the more G matters.

A lot of occupations don't require complex thinking, and in occupations like that, G doesn't matter as much. Within an occupation, G might not matter so much. So that if you look at all the professors at MIT, and had a way to rank order them, there's a ceiling effect is what I'm saying.

- Also, when you get past a certain threshold, then there's impact on wealth, for example, or career success, however that's defined in each individual discipline, but after a certain point, it doesn't matter. - Actually, it does matter in certain things. So for example, there is a very classic study that was started at Johns Hopkins when I was a graduate student there.

I actually worked on this study at the very beginning. It's the study of mathematically and scientifically precocious youth. And they gave junior high school students, age 11 and 12, the standard SAT math exam. And they found a very large number of students scored very high on this exam. Not a large number.

I mean, they found many students when they cast the net all over Baltimore. They found a number of students who scored as high on the SAT math when they were 12 years old as incoming Hopkins freshmen. And they said, "Gee, now this is interesting. What shall we do now?" And on a case-by-case basis, they got some of those kids into their local community college math programs.

Many of those kids went on to be very successful. And now there's a 50-year follow-up of those kids. And it turns out, these kids were in the top 1%. Okay, so everybody in this study is in the top 1%. If you take that group, that rarified group, and divide them into quartiles, so that you have the top 25% of the top 1%, and the bottom 25% of the top 1%, you can find, on measurable variables of success, that the top quartile does better than the bottom quartile in the top 1%.

They have more patents, they have more publications, they have more tenure at universities. And this is based on, you're dividing them based on their score at age 12. - I wonder how much interesting data is in the variability, in the differences. So, but that's really, oh boy. That's very interesting, but it's also, I don't know, somehow painful.
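
The study itself divided people by their test scores at age 12, but as a rough illustration of what "quartiles of the top 1%" means on the conventional IQ scale (mean 100, SD 15), here is a back-of-the-envelope calculation; the thresholds are my own, not numbers from the study.

```python
# Rough, illustrative calculation (not the study's own method): where the top 1%
# begins on the conventional IQ scale, and how it splits into quartiles.
from scipy.stats import norm

mean, sd = 100, 15

# IQ needed to enter the top 1% of the distribution
top1 = norm.ppf(0.99, loc=mean, scale=sd)
print(f"Top 1% begins around IQ {top1:.0f}")

# Split the top 1% (percentiles 99 to 100) into four equal slices of 0.25% each
for q in range(4):
    lower = norm.ppf(0.99 + 0.0025 * q, loc=mean, scale=sd)
    if q < 3:
        upper = norm.ppf(0.99 + 0.0025 * (q + 1), loc=mean, scale=sd)
        print(f"Quartile {q + 1} of the top 1%: IQ {lower:.0f} to {upper:.0f}")
    else:
        print(f"Quartile {q + 1} of the top 1%: IQ {lower:.0f} and above")
```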

I don't know why it's so painful. That the G factor is so determinant, even in the nuanced top percent. - Well, this is interesting that you find that painful. Do you find it painful that people with charisma can be very successful in life, even though they have no other attributes other than that they're famous and people like them?

Do you find that painful? - Yes, if that charisma is untrainable. So one of the things, again, this is like I learned psychology from the Johnny Depp trial. But one of the things the psychologist, the personality psychologist, he can maybe speak to this, 'cause he had interest in this for a time, is that she was saying that personality, technically speaking, is the thing that doesn't change over a lifetime.

It's the thing you're, I don't know if she was actually implying that you're born with it. - Well, it's a trait. - It's a trait that's-- - It's a trait that's relatively stable over time. I think that's generally correct. - So to the degree your personality is stable over time, yes, that too is painful.

'Cause what's not painful is the thing, if I'm fat and out of shape, I can exercise and become healthier in that way. If my diet is a giant mess and that's resulting in some kind of conditions that my body's experiencing, I can fix that by having a better diet.

That's sort of my actions, my willed actions can make a change. If charisma is the part of the personality that is stable, yeah, that's painful too. 'Cause it's like, oh shit, I'm stuck with this.

- Well, I mean, and this pretty much generalizes to every aspect of your being. This is who you are, you gotta deal with it. And what it undermines, of course, is a realistic appreciation for this, undermines the fairly recent idea prevalent in this country, that if you work hard, you can be anything you wanna be.

Which has morphed from the original idea that if you work hard, you can be successful. Those are two different things. - Yeah. - And now we have, if you work hard, you can be anything you wanna be. This is completely unrealistic. I'm sorry, it just is. Now you can work hard and be successful, there's no question.

But you know what, I could work very hard and I am not going to be a successful theoretical physicist, I'm just not. - That said, I mean, we should, 'cause we had this conversation already, but it's good to repeat. The fact that you're not going to be a theoretical physicist, is not judgment on your basic humanity.

Returning again to the old "all men are created equal," which means men and women are created equal. So again, some of the differences we're talking about in quote unquote success, wealth, whether you win a Nobel Prize or not, that doesn't put a measure on your basic humanity and basic value and even goodness of you as a human being.

'Cause that, your basic role and value in society is largely within your control. It's some of these measures that we're talking about. It's good to remember this. One question about the Flynn effect. What is it? Are humans getting smarter over the years, over the decades, over the centuries? - The Flynn effect is this: James Flynn, who passed away about a year ago, published a set of analyses, going back a couple of decades to when he first noticed this, that IQ scores, when you looked over the years, seemed to be drifting up.

Now this was not unknown to the people who make the test, because they re-norm the test periodically, and they have to re-norm the test periodically, because what 10 items correct meant relative to other people 50 years ago is not the same as what 10 items correct means relative to other people today. People are getting more things correct.

Now, the scores have been drifting up about three points, IQ scores have been drifting up about three points per decade. This is not a personal effect, this is a cohort effect. Well, it's not for an individual, but-- - The world, how do you, so what's the explanation? - And this has presented intelligence researchers with a great mystery.

Two questions. First, is it an effect on the 50% of the variance that's the G factor, or on the other 50%? And there's evidence that it is a G factor effect. And second, what on earth causes this, and doesn't this mean intelligence and the G factor cannot be genetic, because the timescale of natural selection is much, much longer than a couple of decades.

And so it's been used to try to undermine the idea that there can be a genetic influence on intelligence. But certainly, it can be, the Flynn effect can affect the non-genetic aspects of intelligence, because genes account for maybe 50% of the variance. Maybe higher, it could be as high as 80% for adults, but let's just say 50% for discussion.

So the Flynn effect, it's still a mystery. - It's still a mystery, that's interesting. - It's still a mystery, although the evidence is coming out. I told you before I edited a journal on intelligence, and we're doing a special issue in honor of James Flynn. So I'm starting to see papers now on really the latest research on this.

I think most people who specialize in this area, of trying to understand the Flynn effect, are coming to the view, based on data, that it has to do with advances in nutrition and healthcare. And there's also evidence that the effect is slowing down, and possibly reversing. - Oh boy.

So how would nutrition, so nutrition would still be connected to the G factor. So nutrition as it relates to the G factor, so the biology that leads to the intelligence. - Yes. - That would be the claim, the hypothesis being tested by the research. - Yes, and there's some evidence from infants that nutrition has made a difference.

So it's not an unreasonable connection. But does it negate the idea that there's a genetic influence? Not logically at all. But it is very interesting. So that if you take an IQ test today, but you take the score and use the tables that were available in 1940, you're gonna wind up with a much higher IQ number.

So are we really smarter than a couple of generations ago? No, but we might be able to solve problems a little better. And make use of our G because of things like Sesame Street and other curricula in school. More people are going to school. So there are a lot of factors here to disentangle.
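
To make the re-norming point concrete, here is a minimal numerical sketch assuming a uniform drift of about three points per decade (the figure mentioned earlier) and taking 2022 as the current year; real Flynn-effect gains vary by test, country, and era, so treat the output as an illustration only.

```python
# Illustrative only: how the same raw performance maps to different IQ numbers
# under old vs. new norms, assuming a uniform drift of ~3 points per decade.
POINTS_PER_DECADE = 3.0

def iq_under_old_norms(iq_today: float, norm_year: int, current_year: int = 2022) -> float:
    """IQ the same performance would earn if scored against older norming tables."""
    decades = (current_year - norm_year) / 10.0
    return iq_today + POINTS_PER_DECADE * decades

# Someone scoring exactly average (100) today, scored with 1940 tables:
print(iq_under_old_norms(100, norm_year=1940))  # ~124.6
```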

- It's fascinating though. It's fascinating that there's not clear answers yet. That as a population, we're getting smarter. When you just zoom out, that's what it looks like. As a population, we're getting smarter. And it's interesting to see what the effects of that are. I mean, this raises the question.

We've mentioned it many times, but haven't clearly addressed it, which is nature versus nurture question. So how much of intelligence is nature? How much of it is nurture? How much of it is determined by genetics versus environment? - All of it. - All of it is genetics. - No, all of it is nature and nurture.

- Yeah, so yes. Yes. - Okay. That's not as- - But how much of the variance can you apportion to either? Most of the people who work in this field say the problem is that framing. If the question is framed that way, it can't be answered, because nature and nurture are not two independent influences.

They interact with each other. And understanding those interactions is so complex that many behavioral geneticists say it is today impossible and always will be impossible to disentangle that no matter what kind of advances there are in DNA technology and genomic informatics. - But there's still, to push back on that, that same intuition from behavioral geneticists would lead me to believe that there cannot possibly be a stable G factor.

'Cause it's super complex. - Many of them would assert that as a logical outcome. But because I believe there is a stable G factor from lots of sources of data, not just one study, but lots of sources of data over decades, I am more amenable to the idea that whatever interactions between genes and environment exist, they can be explicated, they can be studied, and that information can be used as a basis for molecular biology of intelligence.

- Yeah, so, let's do this exact question, 'cause doesn't the stability of the G factor give you at least a hint that there is a biological basis for intelligence? - Yes, I think it's clear that the fact that an IQ score is correlated to things like thickness of your cortex, that it's correlated to glucose metabolic rate in your brain, that identical twins reared apart are highly similar in their IQ scores.

These are all important observations that indicate, not just suggest, that there's a biological basis. And does anyone believe intelligence has nothing to do with the brain? I mean, it's so obvious. - Well, indirectly, it definitely has to do with it, but the question is, is it environment interacting with the brain, or is it the actual raw hardware of the brain?

- Well, some would say that the raw hardware of the brain, as it develops from conception through adulthood, or at least through childhood, that that so-called hardware that you are assuming is mostly genetic, in fact, is not as deterministic as you might think, that it is probabilistic, and what affects the probabilities are things like the in utero environment, and other factors like that, including chance, that chance affects the way the neurons are connecting during gestation.

It's not, hey, it's pre-programmed. So there is pushback on the concept that genes provide a blueprint, that it's a lot more fluid. - Well, but also, yeah, so there's a lot, a lot happens in the first few months of development. So in nine months inside the mother's body, and in the few months afterwards, there's a lot of fascinating stuff, like including chance and luck, like you said, how things connect up.

Man, the question is, afterwards, in the plasticity of the brain, how much adjustment there is relative to the environment, how much that affects the G factor, but that's where the whole conclusion of the studies that we've been talking about is: that seems to have less and less of an effect pretty quickly.

- Yes, and I do think there is more of a genetic, by my view, and I'm not an expert on this, I mean, genetics is a highly technical and complex subject. I am not a geneticist, not a behavioral geneticist, but my reading of this, my interpretation of this, is that there is a genetic blueprint, more or less, and that has a profound influence on your subsequent intellectual development, including the G factor.

And that's not to say things can't happen. I mean, if you think of it as genes providing a potential, fine, and then various variables impact that potential. And every parent of a newborn, implicitly or explicitly, wants to maximize that potential. This is why you buy educational toys. This is why you pay attention to organic baby food.

This is why you do all these things, because you want your baby to be as healthy and as smart as possible. And every parent will say that. - Is there a case to be made, can you steelman the case, that genetics is a very tiny component of all of this, and the environment is essential?

- I don't think the data supports that genetics is a tiny component. I think the data support the idea that the genetics is a very important, and I don't say component, I say influence. A very important influence. And the environment is a lot less than people believe. Most people believe environment plays a big role.

I'm not so sure. - I guess what I'm asking you is, can you see where what you just said, it might be wrong? Can you imagine a world, and what kind of evidence would you need to see, to say, you know what, the intuition, the studies so far, like reversing the directions.

So one of the cool things we have now more and more, is we're getting more and more data, and the rate of the data is escalating because of the digital world. So when you start to look at a very large scale of data, both on the biology side and the social side, we might be discovering some very counterintuitive things about society.

We might see the edge cases that reveal that if we actually scale those edge cases, and they become like the norm, that we'll have a complete shift in our understanding, like you'll see the G factor be able to be modified throughout life, in the teens and in later life. So is there any case you can make for where your current intuitions are wrong?

- Yes, and it's a good question, because I think everyone should always be asked, what evidence would change your mind? It's certainly not only a fair question, it is really the key question for anybody working on any aspect of science. I think that if environment was very important, we would have seen it clearly by now.

It would have been obvious that school interventions, compensatory education, early childhood education, all these things that have been earnestly tried, and well-funded, well-designed studies, would show some effect, and they don't. - They don't. What if the school, the way we've tried school, compensatory school sucks, and we need to do better?

- That's what everybody said at the beginning, that's what everybody said to Jensen. He said, well, maybe we need to start earlier. Maybe we need not do pre-kindergarten, but pre-pre-kindergarten. Yeah, it's always an infinite regress of, well, maybe we didn't get it right. But after decades of trying, 50 years, 50 or 60 years of trying, surely something would have worked to the point where you could actually see a result, and not need a probability level at .05 on some means.

So that's the kind of evidence that would change my mind. - Population-level interventions like schooling that you would see, like this actually has an effect. - Yes, and when you take adopted kids, and they grow up in another family, and you find out when those adopted kids are adults, their IQ scores don't correlate with the IQ scores of their adoptive parents, but they do correlate with their IQ scores of their biological parents, whom they've never met.

I mean, these are important, these are powerful observations. - And it would be convincing to you if the reverse was true. - Yes, that would be more. And there is some data on adoption that indicates the adopted children are moving a little bit more toward their adoptive parents. But it's, to me, the overwhelming, I have this concept called the weight of evidence, where I don't interpret any one study too much.

The weight of evidence tells me genes are important. But what does that mean? What does it mean that genes are important, knowing that gene expression, genes don't express themselves in a vacuum, they express themselves in an environment. So the environment has to have something to do with it, especially if the best genetic estimates of the amount of variance are around 50%, or even if it's as high as 80%, it still leaves 20% non-genetic.

Now, maybe that is all luck, maybe that's all chance. I could believe that, I could easily believe that. So, but I do think, after 50 years of trying various interventions, and nothing works, including memory training, including listening to Mozart, including playing computer games, none of that has shown any impact on intelligence test scores.

- Is there data on the intelligence, the IQ of parents as it relates to the children? - Yes, and there is some genetic evidence of kind of an interaction between the parents' IQ and the environment, high IQ parents provide an enriched environment, which then can impact the child in addition to the genes, it's that environment.

So there are all these interactions that, but it's not, you know, think about the number of books in a household. This was a variable that's correlated with IQ, and-- - It is. - Yeah. Well, why? Especially if the kid never reads any of the books. It's because more intelligent people have more books in their house.

And if you're more intelligent, and there's a genetic component to that, the child will get those genes or some of those genes as well as the environment, but it's not the number of books in the house that actually directly impacts the child. So the two scenarios on this are: you find, and this was used to get rid of the SAT test, that, oh, the SAT score is highly correlated with the socioeconomic status of the parents.

So all you're really measuring is how rich the parents are. Okay, well, why are the parents rich? - Yes. - Okay. And so you could, the opposite kind of syllogism is that people who are very bright make more money. They can afford homes in better neighborhoods so their kids get better schools.

Now, the kids grow up bright. Where in that chain of events does that come from? Well, unless you have a genetically informative research design, where you look at siblings that have the same biological parents and so on, you can't really disentangle all that. Most studies of socioeconomic status and intelligence do not have a genetically informed design.

So any conclusions they make about the causality of the socioeconomic status being the cause of the IQ are a stretch. And where you do find genetically informative designs, you find most of the variance in your outcome measures is due to the genetic component. And sometimes the SES adds a little, but the weight of evidence is it doesn't add very much variance to predict what's going on beyond the genetic variance.

So when you actually look at it, and there aren't that many studies that have genetically informed designs, but when you do see those, the genes seem to have an advantage. - Sorry for the strange questions, but is there a connection between fertility or the number of kids that you have and the G factor?

So you know, the kind of conventional wisdom is people of, maybe is it higher economic status or something like that are having fewer children? I just loosely hear these kinds of things. Is there data that you're aware of in one direction or another on this? - Well, strange questions always get strange answers.

- Yes, all right. So do you have a strange answer for that strange question? - The answer is there were some studies that indicated the more children in a family, the firstborn children would be more intelligent than the fourth or fifth or sixth. It's not clear that those studies hold up over time.

And of course, what you see also is that families where there are multiple children, four, five, six, seven, you know, really big families, the socioeconomic status of those families usually in the modern age is not that high. Maybe it used to be that the aristocracy had a lot of kids, I'm not sure exactly.

But there have been reports of correlations between IQ and fertility. But I'm not sure that the data are very strong that the firstborn child is always the smartest. It seems like there's some data to that, but I'm not current on that. - How would that be explained? That would be a nurture.

- Well, it could be nurture. It could be the in utero environment. I mean-- - Boy, biology's complicated. - It's, and this is why, you know, like many areas of science, you said earlier that there are a lot of gray areas and no definitive answers. This is not uncommon in science, that the closer you look at a problem, the more questions you get, not the fewer, because the universe is complicated.

And the idea that we have people on this planet who can study the first nanoseconds of the Big Bang, that's pretty amazing. And I've always said that if they can study the first nanoseconds of the Big Bang, we can certainly figure out something about the intelligence that allows that. - I'm not sure what's more complicated, the human mind or the physics of the universe.

It's unclear to me. I think we overemphasize-- - Well, that's a very humbling statement. (laughs) - Maybe it's a very human-centric, egotistical statement that our mind is somehow super complicated, but biology is a tricky one to unravel. Consciousness, what is that? - Well, I've always believed that consciousness and intelligence are the two real fundamental problems of the human brain.

And therefore, I think they must be related. (both laugh) - Yeah, hard problems, like, walk together, holding hands kind of idea. - You may not know this, but I did some of the early research on anesthetic drugs with brain imaging, trying to answer the question, what part of the brain is the last to turn off when someone loses consciousness?

And is that the first part of the brain to turn on when consciousness is regained? And I was working with an anesthesiologist named Mike Alkire, who's really brilliant at this. These were really the first studies of brain imaging using positron emission tomography long before fMRI. And you would inject a radioactive sugar that labeled the brain, and the harder the brain was working, the more sugar it would take up.

And then you could make a picture of glucose use in the brain. And he was amazing. He managed to do this in normal volunteers; he brought them in and anesthetized them as if they were going into surgery. And he managed all the human subjects requirements on this research. And he was brilliant at this.

And what we did is we had these normal volunteers come in on three occasions. On one occasion, he gave them enough anesthetic drug so they were a little drowsy. And on another occasion, they came in and he fully anesthetized them. And he would say, "Mike, can you hear me?" And the person would say, "Uh, yeah." And then we would scan people under a no-anesthetic condition.

So same person. And we were looking to see if we could see the part of the brain turn off. He subsequently tried to do this with fMRI, which has a faster time resolution. And you could do it in real time as the person went under and then regain consciousness, where you couldn't do that with PET.

You had to have three different occasions. And the results were absolutely fascinating. We did this with different anesthetic drugs and different drugs impacted different parts of the brain. So we were naturally looking for the common one. And it seemed to have something to do with the thalamus and consciousness.

This was actual data on consciousness. Real, actual consciousness. - What part of the brain turns on? What part of the brain turns off? - It's not so clear. - But maybe has something to do with the thalamus. - The sequence of events seemed to have the thalamus in it.

- Boy. - Now, here's the question. Are some people more conscious than others? Are there individual differences in consciousness? And I don't mean it in the psychedelic sense. I don't mean it in the political consciousness sense. I just mean it in everyday life. Do some people go through everyday life more conscious than others?

And are those the people we might actually label more intelligent? So now, the other thing I was looking for is whether the parts of the brain we were seeing in the anesthesia studies were the same parts of the brain we were seeing in the intelligence studies. Now, this was very complicated, expensive research.

We didn't really have funding to do this. We were trying to do it on the fly. I'm not sure anybody has pursued this. I'm retired now. He's gone on to other things. But I think it's an area of research that would be fascinating to see the parts. There are a lot more imaging studies now of consciousness.

I'm just not up on them. - But basically, the question is which imaging, so newer imaging studies to see in high-resolution spatial and temporal way, which part of the brain lights up when you're doing intelligence tasks? And which parts of the brain lights up when you're doing consciousness tasks and see the interplay between them?

Try to infer. I mean, that's the challenge of neuroscience. Without understanding deeply, looking from the outside, try to infer something about how the whole thing works. - Well, imagine this. Here's a simple question. Does it take more anesthetic drug to have a person lose consciousness if their IQ is 140 than a person with an IQ of 70?

- That's an interesting way to study it, yeah. I mean, if there is a, if the answer to that is a stable yes, that's very interesting. - So I tried to find out. And I went to some anesthesiology textbooks about how you dose. And they dose by weight. And what I also learned, this is a little bit off subject, anesthesiologists are never sure how deep you are.

- Yeah. - And they usually tell by poking you with a needle. And if you don't jump, they tell the surgeon to go ahead. I'm not sure that's literally true, but it's-- - Well, it might be very difficult to know precisely how deep you are. It has to do with the same kind of measurements that you were doing with the consciousness.

It's difficult to know. - So I don't lose my train of thought. I couldn't find in the textbooks anything about dosing by intelligence. I asked my friend, the anesthesiologist, he said, no, he doesn't know. I said, can we do a chart review and look at people using their years of education as a proxy for IQ?

Because if someone's gone to graduate school, that tells you something. You can make some inference as opposed to someone who didn't graduate high school. Can we do a chart review? And he says, no, they never really put down the exact dose. No, he said, no. So to this day, the simple question: does it take more anesthetic drug to put someone under if they have a high IQ, or does it take less?

It could go either way. Because by the way, our early PET scan studies of intelligence found the unexpected result of an inverse correlation between glucose metabolic rate and intelligence. It wasn't how much a brain area lit up. How much it lit up was negatively correlated to how well they did on the test, which led to the brain efficiency hypothesis, which is still being studied today.

And there's more and more evidence that the efficiency of brain information processing is more related to intelligence than just more activity. - Yeah, and it'll be interesting, again, it's a total hypothesis how much the relationship between intelligence and consciousness, it's not obvious that those two, if there's correlation, they could be inversely correlated.

Wouldn't that be funny? If you, the consciousness factor, the C factor plus the G factor equals one. It's a nice trade-off. - You get a trade-off, how deeply you experience the world versus how deeply you're able to reason through the world. - What a great hypothesis. Certainly somebody listening to this can do this study.

- Even if it's the aliens analyzing humans a few centuries from now. Let me ask you from an AI perspective. I don't know how much you've thought about machines, but there's the famous Turing test, test of intelligence for machines, which is a beautiful, almost like a cute formulation of intelligence that Alan Turing proposed.

Basically conversation being if you can fool a human to think that a machine is a human that passes the test. I suppose you could do a similar thing for humans. If I can fool you that I'm intelligent, then that's a good test of intelligence. You're talking to two people, and the test is saying who has a higher IQ.

It's an interesting test, 'cause maybe charisma can be very useful there. You're only allowed to use conversation, which is the formulation of the Turing test. Anyway, all that to say is what are good tests of intelligence for machines? What do you think it takes to achieve human-level intelligence for machines?

- Well, I have thought a little bit about this, but every time I think about these things, I rapidly reach the limits of my knowledge and imagination. So when Alexa first came out, and I think there was a competing one. Well, there was Siri with Apple, and Google had Alexa.

- No, no, Amazon had Alexa. - Amazon had Alexa, and Google has something. So I proposed to one of my colleagues that he buy one of these, one of each, and then ask it questions from the IQ test. - Nice. - But it became apparent that they all search the internet so they all can find answers to questions like how far is it between Washington and Miami, and repeat after me.

Now, I don't know, if you said to Alexa, repeat these numbers backwards to me, I don't know what would happen, I've never done it. But so one answer to your question is, try it, you're gonna try it right now, let's try it. Let's try it. - No, no, no.

- Yes, Siri. - So it would actually probably go to Google search, and it would be all confusing kind of stuff. It would fail. - Well, then I guess there's a test that it would fail. - Well, but that's not, that has to do more with the, you know, the language of communication versus the content.

So if you gave an IQ test to a person who doesn't speak English, and the test was administered in English, that's not really the test of-- - Well, let's think about the computers that beat the Jeopardy champions. - Yeah, so that, because I happen to know how those are programmed, those are very hard-coded, and there's definitely a lack of intelligence there.

There are things like IQ tests for machines. There's a guy, an artificial intelligence researcher, Francois Chollet, he's at Google, he's one of the seminal people in machine learning. He also, as a fun aside thing, developed an IQ test for machines. - Oh, I haven't heard that. I would just like to know about that.

- I'll actually email you this, 'cause it'd be very interesting for you. It doesn't get much attention, because people don't know what to do with it. But it deserves a lot of attention. It's basically a pattern type of test, where, you know, one standard one is, you're given three things and you have to produce a fourth, that kind of thing.

So you have to understand the pattern here. And the interesting thing is, he's not trying to achieve high IQ, he's trying to achieve a pretty low bar for IQ. Things that are kind of trivial for humans. And they're actually really tough for machines.

Which is seeing, playing with these concepts of symmetry, of counting. Like if I give you one object, two objects, three objects, you'll know that the last one is four objects, you can like count them. You can cluster objects together. It's both visually and conceptually. We could do all these things with our mind that we take for granted.

The objectness of things. We can like figure out what spatially is an object and isn't. And we can play with those ideas. And machines really struggle with that. So he really cleanly formulated these IQ tests. I wonder what like that would equate to for humans with IQ. But it'd be a very low IQ.

But that's exactly the kind of formulation like, okay, we wanna be able to solve this. How do we solve this? And he does this as a challenge, and nobody's been able to, it's similar to the Alexa Prize, which is a conversational challenge that Amazon is hosting. Nobody's been able to do well on his.
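
For reference, the benchmark being described here is, as far as I can tell, Chollet's Abstraction and Reasoning Corpus (ARC). The snippet below is only a toy sketch of that style of task, with invented grids and an invented left-right-mirror rule, not the actual ARC data format; it shows the "given a few example pairs, produce the output for a new input" structure, and the hard-coded candidate list is exactly the kind of approach that fails to generalize to unseen rules.

```python
# Toy sketch of an ARC-style pattern task (grids and rule invented for illustration):
# each training pair maps an input grid to its left-right mirror, and the "test"
# asks for the fourth output given the first three examples.
import numpy as np

train_pairs = [
    (np.array([[1, 0], [2, 0]]), np.array([[0, 1], [0, 2]])),
    (np.array([[0, 3], [0, 0]]), np.array([[3, 0], [0, 0]])),
    (np.array([[4, 4], [0, 5]]), np.array([[4, 4], [5, 0]])),
]
test_input = np.array([[6, 0], [0, 7]])

def infer_rule(pairs):
    """Check a few hand-coded candidate transformations against the training pairs."""
    candidates = {
        "mirror_lr": np.fliplr,
        "mirror_ud": np.flipud,
        "rotate_180": lambda g: np.rot90(g, 2),
    }
    for name, fn in candidates.items():
        if all(np.array_equal(fn(x), y) for x, y in pairs):
            return name, fn
    return None, None

name, rule = infer_rule(train_pairs)
print("inferred rule:", name)   # mirror_lr
print(rule(test_input))         # predicted fourth grid: [[0 6] [7 0]]
```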

But that's an interesting, those kinds of tests are interesting 'cause we take for granted all the ability of the human mind to play with concepts and to formulate concepts out of novel things. So like things we've never seen before. We're able to use that. I mean, that's, I've talked to a few people that design IQ tests sort of online.

They write IQ tests. And I was trying to get some questions from them. And they spoke to the fact that we can't really share questions with you because, part of the, like first of all, it's really hard work to come up with questions. It's really, really hard work. It takes a lot of research, but it also takes a lot of novelty generation. You're constantly coming up with really new things.

You're constantly coming up with really new things. And part of the point is that they're not supposed to be public. They're supposed to be new to you when you look at them. It's interesting that the novelty is fundamental to the hardness of the problem. At least a part of what makes the problem hard is you've never seen it before.

- Right, that's called fluid intelligence as opposed to what's called crystallized intelligence, which is your knowledge of facts. You know things. But can you use those things to solve a problem? Those are two different things. - Do you think we'll be able to, 'cause we spoke, and I don't wanna miss an opportunity to talk about this, we spoke about the neurobiology, about the molecular biology of intelligence.

Do you think one day we'll be able to modify the biology of, or the genetics of a person to modify their intelligence, to increase their intelligence? We started this conversation by talking about a pill you could take. Do you think that such a pill would exist? - Metaphorically, I do.

And I am supremely confident that it's possible because I am supremely ignorant of the complexities of neurobiology. (Lex laughing) And so I have written-- - Ignorance is bliss. - Well, I have written that the nightmares of neurobiologists, understanding these complexities, this cascade of events that happens at the synaptic level, are what fuel some people to try to solve them.

You have to be undaunted. This is not easy. Look, we're still trying to figure out cancer. It was only recently that they figured out why aspirin works. These are not easy problems, but I also have the perspective that the history of science is the history of solving problems that are extraordinarily complex.

- And seem impossible at the time. - And seem impossible at the time. - And when you look at companies like Neuralink, with brain-computer interfaces, you start to delve into the human mind, to talk about machines measuring but also sending signals to it, and you start to wonder what impact that has on the G factor, modifying in small or large ways the mechanical, electrical, chemical functioning of the brain.

- I look at everything about the brain. There are different levels of explanation. On one hand, you have a behavioral level, but then you have brain circuitry, and then you have neurons, and then you have dendrites, and then you have synapses, and then you have the neurotransmitters and the presynaptic and the postsynaptic terminals.

And then you have all the things that influence neurotransmitters. And then you have the individual differences among people. Yeah, it's complicated, but 51 million people in the United States have IQs under 85 and struggle with everyday life. Shouldn't that motivate people to take a look at this? - Yeah, and to treat it seriously.
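A rough back-of-the-envelope check of that figure, assuming IQ is normed to a mean of 100 with a standard deviation of 15 and a US population of roughly 330 million (both assumptions, not numbers from the conversation): about 16 percent of a normal distribution falls below 85, which works out to around 50 million people.

```python
# Back-of-the-envelope check of the ~51 million figure, assuming IQ scores are
# normed to mean 100 with standard deviation 15 and a US population of roughly
# 330 million (both assumptions made here for illustration).
from statistics import NormalDist

share_below_85 = NormalDist(mu=100, sigma=15).cdf(85)   # about 0.159
us_population = 330_000_000
print(f"share below 85: {share_below_85:.3f}, "
      f"roughly {share_below_85 * us_population / 1e6:.0f} million people")
# -> share below 85: 0.159, roughly 52 million people
```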

Yeah, but I just want to linger on this one more time: we have to remember that the science of intelligence, the measurement of intelligence, is only part of the human condition. What makes life beautiful, the creation of beautiful things in this world, is perhaps loosely correlated with intelligence, but it doesn't depend entirely on it.

- Absolutely, I certainly agree with that. - So for anyone sort of listening, I'm still not convinced that more intelligence is always better if you want to create beauty in this world. I don't know. - Well, I didn't say more intelligence is always better if you want to create beauty.

I just said, all things being equal, more is better than less. That's all I mean. - Yeah, and that's what I want to say, 'cause to me, one of the things that makes life great is the opportunity to create beautiful things, and so I want to empower people to do that no matter what some IQ test says.

At the population level, we do need to look at IQ tests to help people, and also to inspire us to take on some of these extremely difficult scientific questions. Do you have advice for young people in high school, in college, whether they're thinking about a career or about a life they can be proud of?

Is there advice you can give, whether they want to pursue psychology or biology or engineering, or they want to be artists and musicians and poets? - I can't advise anybody on that level of what their passion is. - Poetry. - But I can say, if you're interested in psychology, if you're interested in science, and the science around the big questions of consciousness and intelligence and psychiatric illness (we haven't really talked about brain illnesses and what we might learn from them), consider this: if you are trying to develop a drug to treat Alzheimer's disease, you are trying to develop a drug that impacts learning and memory, which are core to intelligence.

So it could well be that the so-called IQ pill will come from a pharmaceutical company trying to develop a drug for Alzheimer's disease. - Because that's exactly what you're trying to do, right? Yeah, just like you said. - Well, what will that drug do in a college student who doesn't have Alzheimer's disease?

So I would encourage people who are interested in psychology, who are interested in science, to pursue a scientific career and address the big questions. And the most important thing I can tell you, if you're gonna be in kind of a research environment, is you gotta follow the data where the data take you.

You can't decide in advance where you want the data to go. And if the data take you to places that you don't have the technical expertise to follow, like I would like to understand more about molecular biology, but I'm not gonna become a molecular biologist now, but I know people who are.

And my job is to get them interested enough to take their expertise in this direction. And it's not so easy, but... - And if the data take you to a place that's controversial, that's counterintuitive in this world, I would say it's probably a good idea to still push forward boldly, but to communicate the interpretation of the results with skill, with compassion, with a broader understanding of humanity and of the impact of the results, not just the science.

- One famous psychologist wrote about this issue, that somehow a balance has to be found between pursuing the science and communicating it with respect for people's sensitivities, the legitimate sensitivities. Somehow; he didn't say how. - Somehow. - Somehow. And this is-- - Every part of that sentence, the "somehow" and the "balance," is left up to the interpretation of the reader.

Let me ask you, you said big questions, the biggest, or one of the biggest. We already talked about consciousness and intelligence, one of the most fascinating, one of the biggest questions, but let's talk about the why. Why are we here? What's the meaning of life? - Oh, I'm not gonna tell you.

- You know, but you're not gonna tell me? I'm gonna have to wait for your next book. - The meaning of life, you know, we do the best we can to get through the day. - And then there's just a finite number of days. Are you afraid of the finiteness of it?

You think about your death? - I think about it more and more as I get older. - Yeah, I do. And it's one of these human things, that it is finite, we all know it. Most of us deny it and don't wanna think about it. Sometimes you think about it in terms of estate planning, you try to do the rational thing.

Sometimes it makes you work harder 'cause you know your time is more and more limited and you wanna get things done. I don't know where I am on that. It is just one of those things that's always in the back of my mind. And I don't think that's uncommon.

- Well, it's just like G-factor intelligence, it's a hard truth that's there. And sometimes you kinda walk past it and you don't wanna look at it, but it's still there. - Yeah, yes, you can't escape it. And the thing about the G-factor intelligence is everybody knows this is true on a personal daily basis.

Even if you think back to when you were in school, you know who the smart kids were. When you're on the phone with a customer service representative who, in response to your detailed question, just reads a script back to you, you get furious. Have you ever called that person a moron, or wanted to?

You're not listening to me. Everybody has had the experience of dealing with people who they think are not at their level. It's just common, because that's the way human beings are. That's the way life is. - But we also have a poor estimation of our own intelligence. And our judgment of other people's character is not as good as a battery of tests.

That's where bias comes in. That's where our history, our emotions, all of that comes in. People on the internet call each other dumb all the time. And the worry here is that we give up on people.

We put them in a bin based on one interaction, or some small number of interactions, as if that's it, they're hopeless, it's just in their genetics. But I think no matter what the science here says, once again, that does not mean we should not have compassion for our fellow man.

- That's exactly what the science does say. It's not the opposite of what the science says. Everything I know about psychology, everything I've learned about intelligence, points to the inexorable conclusion that you have to treat people as individuals, respectfully and with compassion, because through no fault of their own, some people are not as capable as others.

If you wanna turn a blind eye to it, if you wanna come up with theories about why that might be true, fine. I would like to fix some of it as best I can. - And everybody is deserving of love. Richard, this is a good way to end it, I think.

- I was just getting warmed up here, wasn't I? - I know. I know you can go for many more hours, but to respect your extremely valuable time, this was an amazing conversation. Thank you for the Teaching Company lectures you've given, for The Neuroscience of Intelligence, just the work you're doing.

It's a difficult topic, a topic that's controversial and sensitive for people, and for pushing forward boldly and in that nuanced way, thank you for everything you do. And thank you for asking the big questions of intelligence, of consciousness. - Well, thank you for asking me. I mean, there's nothing like good conversation on these topics.

- Thanks for listening to this conversation with Richard Haier. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Albert Einstein: "It is not that I'm so smart, but I stay with the questions much longer." Thank you for listening, and hope to see you next time.

(upbeat music)