
Garry Nolan: UFOs and Aliens | Lex Fridman Podcast #262


Chapters

0:00 Introduction
0:43 Biology
6:36 Alien civilizations
10:40 UFO encounters
47:40 Atacama skeleton
54:57 UFO materials
66:19 Jacques Vallee
70:27 UFO data
81:33 Alien hardware in US possession
86:10 Bob Lazar
89:05 Avi Loeb and Oumuamua
93:07 Advice for young people
99:55 Meaning of life

Whisper Transcript

00:00:00.000 | How would you as a higher intelligence
00:00:02.640 | represent yourself to a lesser intelligence?
00:00:06.200 | - Do you think they saw what they say they saw?
00:00:09.760 | - It didn't just start showing up in 1947.
00:00:12.040 | - How hard do you think it is
00:00:13.260 | for aliens to communicate with humans?
00:00:15.080 | - What do we believe in?
00:00:15.920 | We believe in technology.
00:00:16.760 | So you show yourself as a form of technology, right?
00:00:20.160 | But the common thread is you're not alone.
00:00:23.080 | And there's something else here with you.
00:00:26.240 | And there's something that's, as you said, watching you.
00:00:29.760 | (air whooshing)
00:00:31.960 | - You are a professor at Stanford
00:00:33.920 | studying the biology of the human organism
00:00:36.300 | at the level of individual cells.
00:00:38.680 | So let me ask first
00:00:40.080 | the big ridiculous philosophical question.
00:00:43.800 | What is the most beautiful or fascinating aspect
00:00:47.040 | of human biology at the level of the cell to you?
00:00:50.040 | - The micromachines and the nanomachines
00:00:52.840 | that proteins make and become.
00:00:54.600 | That to me is the most interesting.
00:00:56.760 | The fact that you have this basically dynamic computer
00:01:01.760 | within every cell
00:01:03.040 | that's constantly processing its environment.
00:01:05.960 | And at the heart of it is DNA,
00:01:08.360 | which is a dynamic machine, a dynamic computation process.
00:01:12.720 | People think of the DNA as a linear code.
00:01:16.640 | It's codes within codes within codes.
00:01:18.600 | And it is the, actually the epigenetic state
00:01:21.920 | that's doing this amazing processing.
00:01:23.920 | I mean, if you ever wanted to believe in God,
00:01:26.280 | just look inside the cell.
00:01:28.200 | - So DNA is both information and computer.
00:01:30.780 | - Exactly.
00:01:32.140 | - How did that computer come about?
00:01:34.760 | Continuing on the big philosophical question.
00:01:37.240 | This is both scientific and philosophical.
00:01:39.400 | How did life originate on earth, do you think?
00:01:42.320 | How did this, at every level?
00:01:44.320 | So the very first step
00:01:46.480 | and the fascinating complex computer that is DNA,
00:01:50.880 | that is multicellular organism,
00:01:52.520 | and then maybe the fascinating complex computer
00:01:56.080 | that is the human mind.
00:01:58.720 | - Well, I think you have to take just one more step back
00:02:01.040 | to the complex computer that is the universe, right?
00:02:04.360 | All of the so-called particles or the waves
00:02:08.280 | that people think the universe is made of
00:02:10.200 | appears to me, at least, to be a computational process.
00:02:14.540 | And embedded in that is biology, right?
00:02:18.120 | So all the atoms of a protein, et cetera,
00:02:21.040 | sit in that computational matrix.
00:02:23.600 | From my point of view, it's computing something,
00:02:26.120 | it's computing towards something.
00:02:28.240 | It was created in some ways, if you wanna believe in God,
00:02:31.400 | and I don't know that I do,
00:02:32.480 | but if you wanna believe in something,
00:02:34.960 | the universe was created or at least enabled
00:02:37.960 | to allow for life to form.
00:02:40.840 | And so the DNA, if you ask, where does DNA come from?
00:02:45.200 | And you can go all the way back to Richard Dawkins
00:02:47.160 | and the selfish gene hypotheses.
00:02:50.600 | The way I look at DNA though is it is not a moment in time.
00:02:55.600 | It assumes the context of the body and the environment
00:02:59.440 | in which it's going to live.
00:03:00.640 | And so if you wanna ask a question of where
00:03:03.980 | and how does information get stored,
00:03:07.240 | DNA, although it's only 3 billion base pairs long,
00:03:10.780 | contains more information than I think
00:03:13.920 | the entire computational memory resources
00:03:18.120 | of our current technology.
00:03:20.160 | Because who and what you are is both what you were as an egg
00:03:24.360 | all the way through to the day you die,
00:03:27.560 | and it embodies all the different cell types
00:03:29.560 | and organs in your body.
00:03:31.600 | And so it's a computational reservoir
00:03:36.400 | of information and expectation that you will become.
00:03:40.620 | So actually I would sort of turn it around a different way
00:03:43.640 | and say, if you wanted to create
00:03:46.420 | the best memory storage system possible,
00:03:50.500 | you could reverse engineer what a human is
00:03:53.300 | and create a DNA memory system
00:03:56.480 | that is not just the linear version,
00:03:59.820 | but is also everything that it could become.
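For scale, here is a rough back-of-the-envelope sketch of the purely linear information content being contrasted above, assuming the textbook 2 bits per base pair (the figures are illustrative, not from the conversation):

```python
# Linear information content of a human genome read as a flat string:
# 4 possible bases -> log2(4) = 2 bits per base pair.
base_pairs = 3_000_000_000        # ~3 billion base pairs
bits = base_pairs * 2
megabytes = bits / 8 / 1e6

print(f"~{megabytes:.0f} MB")     # ~750 MB
```

Read linearly, the genome is only about 750 MB, which is why the argument above rests on developmental context and expectation rather than raw sequence length.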
00:04:03.300 | - When we're talking about DNA, we're talking about Earth
00:04:05.100 | and the environment creating DNA.
00:04:06.940 | So this, you're talking about trying to come up
00:04:09.820 | with an optimal computer for this particular environment.
00:04:12.820 | - Right.
00:04:14.980 | - So if you reverse engineer that computer,
00:04:17.860 | what do you mean by considering
00:04:21.020 | all the possible things it could become?
00:04:22.900 | - So who you are today, right?
00:04:24.760 | So 3 billion bits of information
00:04:27.500 | does not explain Lex Fridman.
00:04:29.780 | It doesn't explain me, right?
00:04:31.620 | But the DNA embodies the expectation of the environment
00:04:36.200 | in which you will live and grow and become.
00:04:41.200 | So all the information that is you, right,
00:04:44.620 | is actually not only embedded in the DNA,
00:04:48.420 | but it's embedded in the context of the world
00:04:50.740 | in which you grow into and develop, right?
00:04:55.740 | So all that information, though,
00:04:57.460 | is packed in the expectation of what the DNA expects to see.
00:05:01.700 | - Interesting, so like some of the information,
00:05:04.840 | is it accurate to say it's stored outside the body?
00:05:08.220 | - Exactly, yeah.
00:05:09.820 | The information is stored outside
00:05:11.260 | because there's a context of expectation.
00:05:13.740 | Isn't that interesting?
00:05:14.620 | - Yeah, it's fascinating.
00:05:15.700 | I mean, to linger on this point,
00:05:20.340 | if we were to run Earth over again a million times,
00:05:24.420 | how many different versions
00:05:26.060 | of this type of computer would we get?
00:05:28.020 | - I think it would be different each time.
00:05:29.820 | I mean, if you assume there's no such thing as fate, right,
00:05:33.060 | and it's not all pre-programmed,
00:05:34.980 | you know, and that there is some sort of, let's say,
00:05:37.420 | variation or randomness at the beginning,
00:05:40.380 | you would get as many different versions of life
00:05:43.900 | as you could imagine.
00:05:45.420 | And I don't think it would all be the same.
00:05:46.980 | Unless there's something built into the, you know,
00:05:49.700 | into the substrate of the universe,
00:05:51.160 | it wouldn't always be left-handed proteins, right?
00:05:55.900 | - But I wonder what the flap of a butterfly wing,
00:05:58.460 | what effects it has,
00:06:01.300 | because it's possible that the system is really good
00:06:05.420 | at finding the efficient answer,
00:06:07.260 | and maybe the efficient answer is,
00:06:11.580 | there's only a small finite set of them
00:06:13.460 | for this particular environment.
00:06:14.820 | - Exactly, exactly.
00:06:16.020 | That's, kind of, in a way,
00:06:17.220 | the anthropic principle version
00:06:19.340 | of the multiverse expectations, right?
00:06:21.460 | That, you know, there's probably a zillion
00:06:23.740 | other kinds of universes out there
00:06:25.260 | if you believe in multiverse theory.
00:06:28.500 | We only live in the ones where the rules are such
00:06:32.500 | that life like ours can exist.
00:06:35.900 | - So using that logic,
00:06:37.260 | how many alien civilizations do you think are out there?
00:06:40.300 | There's like trillions of environments, aka planets,
00:06:45.300 | or maybe you can think even bigger than planets.
00:06:51.740 | How many lifelike organisms do you think
00:06:55.060 | are out there thriving?
00:06:57.620 | And maybe how many do you think are long gone,
00:07:00.220 | but were once here?
00:07:01.860 | - I think, well, innumerable.
00:07:04.820 | I think in terms of the--
00:07:05.860 | - Greater than zero.
00:07:06.900 | - Much greater than zero.
00:07:08.380 | I mean, I would just be surprised.
00:07:09.780 | What a waste, right, of all that space just for us
00:07:12.980 | if we're never gonna get there.
00:07:14.580 | That would be my first way to think about it.
00:07:19.780 | But second, I mean,
00:07:22.060 | I remember when I was about seven or eight years old,
00:07:26.900 | and I would love if any of your listeners
00:07:28.400 | could find this National Geographic.
00:07:31.380 | I remember opening the page of the National Geographic.
00:07:35.340 | I was about, again, seven to 10 years old.
00:07:37.940 | And it was sort of a current picture of the universe.
00:07:42.060 | It was around probably 1968, 1969.
00:07:45.460 | I just remember looking at it and thinking,
00:07:48.320 | what kinds of empires have risen and fallen
00:07:53.060 | across that space that we'll never know about?
00:07:57.980 | And isn't that sad that we know nothing
00:08:01.260 | about something so grand?
00:08:03.560 | And so I've always been a reader of science fiction
00:08:08.940 | because I like the creative ideas
00:08:12.420 | of what people come up with.
00:08:14.420 | And I especially like science fiction writers
00:08:17.060 | that base it in good science,
00:08:19.420 | but base it also in evolution.
00:08:22.060 | That if you evolve a civilization
00:08:24.980 | from something lifelike, right?
00:08:27.940 | Some sort of biology,
00:08:30.140 | its assumptions about the universe will come
00:08:32.300 | from the environment in which it grew up.
00:08:37.180 | So for instance, Larry Niven is a great writer,
00:08:40.660 | and he imagines different kinds of civilizations.
00:08:44.060 | In some cases, what happens if intelligence evolved
00:08:49.060 | from a herd animal, right?
00:08:51.540 | Would you lead from behind, right?
00:08:54.020 | Would you be, you know, in his case,
00:08:57.260 | one of them was the so-called Puppeteers.
00:08:59.940 | And to them, the moral imperative is cowardice.
00:09:04.020 | You put other people forward to run the risk for you, right?
00:09:08.060 | And so he writes entire books around that premise.
00:09:11.820 | There's another guy, Brin, David Brin is his name,
00:09:15.780 | and he writes the so-called Uplift universe books.
00:09:20.380 | And in those, he takes different intelligences,
00:09:25.300 | each from a different evolutionary background.
00:09:29.060 | And then he posits a civilization based around
00:09:33.540 | where and what they came from.
00:09:35.940 | And so to me, I mean, that's just fun,
00:09:38.980 | but I mean, back to your original question,
00:09:41.180 | is how many are there?
00:09:43.020 | I think as many stars as we can see.
00:09:47.020 | Now, how many are currently there?
00:09:49.860 | I don't know.
00:09:50.700 | I mean, that's the whole question of, you know,
00:09:53.740 | how long can a civilization last
00:09:55.740 | before it runs out of steam?
00:09:58.700 | And you, for instance, does it just get bored,
00:10:02.020 | or does it transcend to something else?
00:10:03.860 | Or does it say, "I've seen enough and I'm done"?
00:10:06.580 | - What does running out of steam look like?
00:10:08.220 | It could be destroy itself or get bored.
00:10:10.300 | - You know, or we've done everything we can,
00:10:13.300 | and they just decide to stop.
00:10:16.060 | I don't know.
00:10:17.180 | I just don't know.
00:10:18.020 | - It's that Elon Musk worry that we stop reproducing,
00:10:21.300 | or we slow down the reproduction rate
00:10:23.180 | to where the population can go to zero.
00:10:25.340 | - Can go to zero, and we collapse.
00:10:27.660 | I mean, so the only way to get around that
00:10:30.540 | is perhaps create enough machines with AI to take care of us.
00:10:35.540 | - What could possibly go wrong?
00:10:40.020 | You've talked to people that told stories
00:10:44.820 | of UFO encounters.
00:10:46.220 | What is the most fascinating to you
00:10:48.740 | about the stories of these UFO encounters
00:10:51.420 | that you've heard that people have told you?
00:10:54.420 | - The similarity of them,
00:10:56.700 | the uniformity of the stories.
00:11:00.900 | Now, I just wanna say up front,
00:11:02.740 | a lot of people think that when I speculate,
00:11:08.540 | I believe something.
00:11:09.980 | That's not true, right?
00:11:11.940 | Speculation is just creativity.
00:11:14.020 | Speculation is the beginning of hypothesis.
00:11:16.340 | None of what I hear in terms of the anecdotes
00:11:21.180 | do I necessarily believe to be true.
00:11:24.340 | But I still find them fascinating to listen to
00:11:26.540 | because at some level, they're still raw data.
00:11:29.580 | And you have to listen.
00:11:30.700 | And once you start to hear the same story again and again,
00:11:35.700 | then you have to say, well, there might be something to it.
00:11:37.860 | I mean, maybe it's some kind of a Jungian background
00:11:42.580 | in the human mind and human consciousness
00:11:44.460 | that creates these stories again and again.
00:11:47.420 | And it's coming out of the DNA.
00:11:48.860 | It's coming out of that pre-programmed something.
00:11:51.700 | And Jung talked quite a bit about this kind of thing,
00:11:54.300 | the collective unconscious.
00:11:56.380 | But actually one of the most interesting ones I find
00:11:58.740 | is this constant message
00:12:03.100 | that you're not taking care of your world.
00:12:07.260 | And this came long before climate change.
00:12:11.540 | It came long before many kinds of,
00:12:15.420 | let's say, current day memes around
00:12:19.620 | taking care of our planet, pollution, et cetera.
00:12:25.100 | And so, for instance, perhaps the best example of this,
00:12:28.180 | the one that I find the most fascinating,
00:12:30.220 | is a story out of Zimbabwe.
00:12:32.020 | 50 or 60 children one afternoon in Zimbabwe.
00:12:37.940 | It was a well-educated group of white and black children
00:12:44.980 | who at lunchtime in the playground saw a craft
00:12:49.460 | and they saw a little man.
00:12:52.580 | And they all ran into the teachers
00:12:54.500 | and they told the same story and they drew the same pictures.
00:12:57.860 | And the message several of them got was,
00:13:01.220 | you are not taking care of your planet.
00:13:03.180 | And it got, you know, there's actually a movie coming out
00:13:07.820 | on this episode.
00:13:09.460 | And 30 years later now, the people who were there,
00:13:13.220 | the children who've now grown up, say, it happened to us.
00:13:17.060 | Now, did it happen?
00:13:19.060 | Was it some sort of hallucination
00:13:21.580 | or was it an imposed hallucination by something?
00:13:24.620 | Was it material?
00:13:25.660 | I don't know.
00:13:27.000 | But these kids were seven to 10 years old.
00:13:31.180 | You see them on video.
00:13:33.380 | Seven to 10 year olds can't lie like that.
00:13:36.260 | And so, you know, whether it's real or not, I don't know.
00:13:40.280 | But I find that fascinating data.
00:13:42.340 | And again, it's these unconnected stories
00:13:46.460 | of individuals with the same story.
00:13:50.740 | That is worthy of further inquiry.
00:13:56.620 | - Yeah, so here we are, humans
00:13:59.460 | with limited cognitive capacities
00:14:01.300 | trying to make sense of the world,
00:14:02.640 | trying to understand what is real and not.
00:14:04.860 | We have this DNA that somehow in complex ways
00:14:08.500 | is interacting with the environment.
00:14:10.060 | And then we get these novel ideas
00:14:15.060 | that come from the populace.
00:14:19.860 | And then they make us wonder about what it all means.
00:14:24.300 | And so how to interpret it.
00:14:27.380 | If you think from an alien perspective,
00:14:29.700 | how would you communicate with other lifelike organisms?
00:14:34.540 | You perhaps have to find end points
00:14:38.780 | on this interaction between the DNA
00:14:41.300 | and its manifestations in terms of the human mind
00:14:46.300 | and how it interacts with the environment.
00:14:49.260 | So get some kind of, all right, what is this DNA?
00:14:52.540 | What is this environment?
00:14:53.700 | I have to get in somehow to interact with it,
00:14:57.340 | to perturb the system to where these little ants,
00:15:00.860 | human-like ants get excited and figure stuff out.
00:15:05.660 | - Yeah, and then somehow steer them.
00:15:08.900 | First of all, for investigative purposes,
00:15:11.660 | understand like oftentimes to understand a system,
00:15:14.740 | you have to perturb it.
00:15:15.740 | - Exactly, yeah.
00:15:16.580 | - It's like poke at it.
00:15:17.620 | Do they get excited or not?
00:15:19.420 | And then the other way is you want to,
00:15:22.840 | if you worry about them,
00:15:25.460 | you can steer in one direction or another.
00:15:28.340 | And this kind of idea
00:15:29.740 | that we're not taking care of our world,
00:15:31.960 | that's interesting.
00:15:34.960 | - To me, that's comforting.
00:15:35.940 | That's hopeful because that means the greater intelligence,
00:15:39.580 | which is what I would hope, we want to take care of us.
00:15:42.340 | - Like we want to take care of the gorillas
00:15:44.860 | in the national parks in Africa.
00:15:47.100 | - Yeah, but we don't want to take care of cockroaches.
00:15:49.740 | So there's a line we draw.
00:15:51.500 | So you have to hope that.
00:15:53.380 | - Right now we're a bunch of angry monkeys
00:15:55.580 | and maybe whatever these intelligences are
00:15:59.380 | are also keeping an eye on us.
00:16:03.620 | You don't want the angry monkey troop
00:16:06.000 | stomping around the local galactic arm.
00:16:08.840 | - Do you think these folks are telling the truth?
00:16:11.720 | Do you think they saw what they say they saw?
00:16:15.720 | - I think they saw what they said they saw,
00:16:19.120 | but I also think they saw what they were shown.
00:16:22.440 | I mean, if you go back to the whole notion of,
00:16:25.920 | okay, how long has this been around?
00:16:28.400 | It didn't just start showing up in 1947.
00:16:31.160 | There are stories going back into the 1800s
00:16:36.140 | of people who saw things in their farm fields in the US.
00:16:41.140 | It's in local newspapers from the 1800s.
00:16:44.900 | It's fascinating.
00:16:46.340 | But if you can go even further back,
00:16:48.860 | so to your point of how would you as a higher intelligence
00:16:53.860 | represent yourself to a lesser intelligence?
00:16:58.320 | Well, let's go back to pre-civilization.
00:17:01.940 | Maybe you show yourself as the spirits in the forest
00:17:05.380 | and you give messages through that.
00:17:08.300 | Once you get a little bit more civilized,
00:17:10.820 | then you show yourself as the gods and then you're God.
00:17:13.940 | Well, we don't believe in God anymore necessarily.
00:17:16.100 | Not everybody does.
00:17:17.400 | So what do we believe in?
00:17:18.380 | We believe in technology.
00:17:19.340 | So you show yourself as a form of technology, right?
00:17:22.800 | But the common thread is you're not alone
00:17:26.620 | and there's something else here with you.
00:17:28.840 | And there's something that's, as you said, watching you
00:17:31.480 | and at least watching over your shoulder.
00:17:34.740 | But I think that like any good parent,
00:17:38.760 | you don't tell your student everything.
00:17:42.720 | You make them learn.
00:17:44.400 | And learning requires mistakes
00:17:46.320 | because if you tell them everything, then they get lazy.
00:17:50.100 | - You've looked at the brains of,
00:17:55.600 | or information coming from the brain of some of the people
00:17:58.220 | that have had UFO encounters.
00:17:59.880 | What's common about the brain of people
00:18:01.820 | who encounter UFOs?
00:18:03.020 | - So the study started with a group of,
00:18:07.940 | let's say a cohort of individuals that were brought to me
00:18:11.460 | and their MRIs to ask about the damage
00:18:16.140 | that had been seen in these individuals.
00:18:19.180 | It turns out that the majority of those patients
00:18:21.220 | ended up being, as far as we can tell, Havana syndrome.
00:18:24.100 | And so for me at least, that part of the story
00:18:28.480 | ends in terms of the injury.
00:18:30.160 | It's likely almost all Havana syndrome.
00:18:33.240 | That's somebody else's problem now.
00:18:34.700 | That's not my problem.
00:18:36.580 | But when we were looking at the brains of these individuals,
00:18:41.920 | we noticed something right in the center
00:18:44.140 | of the basal ganglia in many of these individuals
00:18:47.560 | that at first we thought was damage.
00:18:49.280 | It was basically an enriched patch of MRI-dense neurons
00:18:54.280 | that we thought was damage,
00:18:57.500 | but then it was showing up in everybody.
00:18:59.100 | And then we looked and we said, oh, it's actually not.
00:19:01.540 | The other readings on these MRIs
00:19:03.780 | show that actually that's living tissue.
00:19:05.780 | That's actually the head of the caudate and the putamen.
00:19:10.060 | And at the time, and I remember even asking
00:19:12.080 | a good friend of mine at Stanford who is a psychiatrist,
00:19:16.620 | what does the basal ganglia do?
00:19:18.380 | He said, oh, the basal ganglia is just about movement
00:19:21.740 | and nerve and motor control.
00:19:24.380 | I said, well, that's odd because other papers
00:19:27.900 | that we were reading at the time had started to suggest
00:19:30.120 | that it was involved with higher intelligence
00:19:34.560 | and is actually downstream of the executive function
00:19:37.340 | and involved with intuition and planning.
00:19:42.780 | And if you think about it,
00:19:44.360 | if you're gonna have motor control,
00:19:46.700 | which is centralized in one place,
00:19:48.820 | motor control requires knowledge of the environment.
00:19:51.660 | You don't wanna move something and hit the table.
00:19:55.780 | Or if you're walking across a room,
00:19:57.620 | you want to be aware and cognizant
00:20:01.660 | of what you might bump into.
00:20:03.660 | So obviously all of that planning
00:20:06.900 | requires access to all the senses.
00:20:11.400 | It requires access to your desires, memory,
00:20:15.420 | knowledge of where and what you want
00:20:17.260 | and desire to walk nearby.
00:20:19.420 | Like I used the example of you're at a party,
00:20:21.700 | you wanna avoid that person, you like that person,
00:20:23.900 | the waiter's about to drop something.
00:20:26.160 | All without thinking, you maneuver.
00:20:29.400 | So that actually, all that planning is done
00:20:31.920 | in the basal ganglia.
00:20:33.120 | And it's actually now called the brain within the brain.
00:20:37.580 | It's a goal processing system
00:20:41.420 | subservient to executive function.
00:20:44.280 | So what we think we found there was not something
00:20:49.280 | which allows people to talk to UFOs.
00:20:51.180 | I mean, I think the UFO community took it a step too far.
00:20:56.180 | What I think we found was a form
00:20:58.900 | of higher functioning and processing.
00:21:02.820 | So what we then looked at,
00:21:04.400 | and this was the most fascinating part of it,
00:21:06.460 | we looked then at individuals in the families of those,
00:21:12.120 | let's say the index case individuals.
00:21:14.980 | And we found that it was actually in families.
00:21:17.440 | And more so, this is the most fascinating part,
00:21:20.920 | we've probably looked now at about 200 just random cases
00:21:25.040 | that you can download off of databases online.
00:21:28.200 | You don't see this higher connectivity.
00:21:31.760 | You only find it in what Kit Green would have called
00:21:35.360 | or has called higher functioning individuals.
00:21:37.840 | People who are, I mean, he called them savants.
00:21:42.840 | I don't have the means to, we haven't done the testing.
00:21:47.800 | But it turns out my family has it, right?
00:21:51.000 | We found it in me, my brother, my sister, my mother.
00:21:55.760 | We found it as well in other individuals,
00:21:58.560 | husband and wife pairs.
00:22:00.620 | So statistically, if you had a group of 20 individuals
00:22:04.240 | and you found two husband-wife pairs, both of whom had it,
00:22:06.920 | and yet it's only found at about,
00:22:09.000 | we think, one in 200, one in 300 individuals,
00:22:11.640 | the fact that two individuals came together,
00:22:13.840 | two sets of individuals came together, both of whom had it,
00:22:17.640 | implied either a restricted breeding group or attraction.
00:22:22.640 | The reason why it seems to be in, let's say,
00:22:30.240 | so-called experiencers or people who claim,
00:22:34.660 | if intuition is the ability to see something
00:22:37.500 | that other people don't,
00:22:38.460 | and I don't mean that in a paranormal sense,
00:22:41.860 | but being able to see something just in front of you
00:22:44.260 | that other people might just dismiss,
00:22:46.140 | well, maybe that's a function
00:22:48.880 | of a higher kind of intelligence to say,
00:22:50.900 | well, I'm not looking at an artifact.
00:22:55.900 | I'm not looking at something that I should just ignore.
00:22:59.580 | I'm seeing something and I recognize it for,
00:23:01.940 | not what it is, but that it is something different
00:23:04.900 | than what is normally found in my environment.
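A minimal sketch of the statistical argument above, assuming random mating, independence between partners, and a trait frequency of roughly 1 in 250 (splitting the quoted 1-in-200 to 1-in-300 range; all modeling choices are illustrative):

```python
# If partners paired up at random, how likely is even one couple
# where both carry a ~1-in-250 trait, let alone two?
p_trait = 1 / 250                   # assumed population frequency
p_couple = p_trait ** 2             # both partners carry it, independently

print(f"one double-carrier couple:  {p_couple:.1e}")       # ~1.6e-05
print(f"two double-carrier couples: {p_couple ** 2:.1e}")   # ~2.6e-10
```

Even allowing for the many ways two couples can form within a group of 20, the chance figure stays vanishingly small, which is the force of the inference toward a restricted breeding group or assortative attraction.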
00:23:07.900 | - Yeah, I have a little bit of that.
00:23:10.060 | I seem to see the magic in a lot of moments.
00:23:14.900 | Like I have a deep, it's obviously, not obviously,
00:23:17.820 | but it seems to be chemical in nature
00:23:21.220 | that I just am excited about life.
00:23:24.740 | I love life.
00:23:26.460 | I love stupid things.
00:23:27.900 | It feels like I'm high a lot,
00:23:29.940 | like on mushrooms or something like that,
00:23:32.180 | where you'd really appreciate that.
00:23:33.460 | So I'm able to detect something about the environment
00:23:38.140 | that maybe others don't, I don't know,
00:23:41.420 | but I seem to be over the top grateful to be alive
00:23:46.180 | for a lot of stupid reasons,
00:23:48.820 | and that's in there somewhere.
00:23:50.460 | I mean, it's kind of interesting
00:23:52.020 | because it really is true that our brains,
00:23:57.720 | the way we're brought up, but also the genetics,
00:24:00.840 | enables us to see certain slices of the world.
00:24:03.800 | And some people are probably more receptive
00:24:08.800 | to anomalous information.
00:24:12.420 | They see the magic, the possibility in the novel thing,
00:24:17.420 | as opposed to kind of finding the pattern
00:24:22.000 | of the common, of the regular.
00:24:24.360 | Some people are more, wait a minute, this is kind of weird.
00:24:27.360 | I mean, a lot of those people
00:24:28.360 | probably become scientists too.
00:24:30.040 | Like, huh.
00:24:31.940 | Like, there's this pattern happening over and over and over,
00:24:35.080 | and then something weird just happened.
00:24:36.880 | And then you get excited by that weirdness
00:24:41.040 | and start to pull the string
00:24:43.240 | and discover what is at the core of that weirdness.
00:24:46.200 | Perhaps, is that, you know, maybe by way of question,
00:24:51.200 | how does the human perception system
00:24:55.180 | deal with anomalous information?
00:24:57.000 | Do you think?
00:24:57.840 | - Well, it first tries to classify it
00:24:59.320 | and get it out of the way.
00:25:00.960 | If it's not food, if it's not sex, right?
00:25:04.340 | If it's not in the way of my desires,
00:25:08.400 | or if it is in the way of my desires,
00:25:10.680 | then you focus on it.
00:25:13.680 | And so I think the question is
00:25:16.800 | how much spare processing power,
00:25:19.240 | how much CPU cycles do we spend
00:25:22.640 | on things that are not those core desires?
00:25:27.640 | - What is the most kind of memorable,
00:25:32.960 | powerful UFO encounter report you've ever heard?
00:25:38.680 | Just to you on a personal level,
00:25:40.480 | like something that was really powerful.
00:25:42.840 | - Well, I mentioned the Zimbabwe one.
00:25:45.080 | That's particularly interesting.
00:25:47.360 | And one that actually most people don't know about,
00:25:50.640 | but a family driving down the highway,
00:25:54.880 | two little girls in the back,
00:25:57.000 | open glass-topped car,
00:26:00.760 | and the little girls see a craft right over their car.
00:26:05.760 | This is in the middle of the day on a busy highway.
00:26:08.960 | The mother sees it.
00:26:11.100 | Nobody can, they look around, nobody else seems to see it.
00:26:16.360 | So the girls take out their camera, take a picture of it,
00:26:19.920 | and then they get home.
00:26:21.080 | They look at the picture.
00:26:24.480 | There's no craft, but there's a little object
00:26:28.240 | about 30 feet above their car or so,
00:26:32.000 | probably about three feet across, kind of star shaped.
00:26:36.360 | It's not the craft, but it's something else.
00:26:39.320 | Obviously there was something there.
00:26:41.320 | And so what were they seeing?
00:26:42.800 | Were they seeing a projection?
00:26:44.080 | Were they seeing, and why were only they seeing it?
00:26:48.080 | - And the photograph was capturing something
00:26:51.880 | very different than they were seeing,
00:26:53.600 | but there's still an object.
00:26:54.960 | What, can you give a little bit of context?
00:26:58.280 | Is this from modern day?
00:26:59.720 | - It's modern day.
00:27:00.560 | Oh yeah, they had a camera.
00:27:01.380 | I mean, they had a cell phone camera.
00:27:02.520 | - And this was like a--
00:27:03.360 | - About four or five years ago.
00:27:04.560 | - Report provided.
00:27:06.080 | By the way, where's like a central place
00:27:07.760 | to provide a report?
00:27:08.600 | Is this--
00:27:09.420 | - Oh, there's MUFON, but this isn't public.
00:27:11.080 | I've seen the picture.
00:27:12.680 | - Oh, this is something you've directly interacted with.
00:27:14.560 | - Yeah, yeah, I've seen the picture.
00:27:16.800 | - So those moments like that,
00:27:18.660 | they captivate your mind.
00:27:23.280 | - It's so different,
00:27:24.120 | it doesn't fall into the standard story at all.
00:27:26.960 | But it also, but in another way,
00:27:29.640 | it's kind of a, it's a clear enunciation of this notion
00:27:33.840 | that when people see events,
00:27:36.360 | they don't all see the same thing.
00:27:37.940 | Now, we've heard this about like traffic accidents.
00:27:40.400 | Different people will see the color of the car differently
00:27:42.880 | or the chain of events differently.
00:27:44.540 | And this tells you that memory isn't anywhere near
00:27:47.140 | what we think it is.
00:27:48.800 | But the issue around these so-called UFO reports
00:27:52.420 | is that the same people will see a very different thing,
00:27:56.480 | almost as if whatever it is is projecting a,
00:28:01.480 | is projecting something into the mind
00:28:04.300 | rather than it being real, right?
00:28:07.020 | Rather than it being a real manifestation,
00:28:09.140 | you know, material in front of you,
00:28:11.500 | it's actually almost some sort of an altered virtual reality
00:28:16.340 | that is imposed on you.
00:28:18.020 | I mean, you know, I think the company Meta
00:28:21.860 | and all the virtual reality companies
00:28:23.820 | would love to have something like that, right?
00:28:26.500 | Where you don't have to actually wear something
00:28:27.820 | on your face to experience a virtual reality.
00:28:31.500 | What happens if you could just project it?
00:28:33.880 | - Well, that's the fundamental question
00:28:35.180 | from an alien perspective.
00:28:37.140 | When you look at it, or as we humans look at ants,
00:28:40.180 | how does its perception system operate?
00:28:43.380 | So not only how does this thing's mind operate,
00:28:45.740 | how does the human mind operate,
00:28:47.780 | but how does their perception system operate
00:28:50.620 | so that we can like stimulate the perception system
00:28:54.500 | properly to get them to think certain things?
00:28:57.540 | And so, you know, that's a really important question.
00:29:02.540 | Humans think that, you know, the only way to communicate
00:29:06.980 | is in like 3D or 4D space-time.
00:29:10.500 | There's physical objects, or maybe you write things
00:29:13.720 | into some kind of language.
00:29:15.320 | But there could be just so much more richness
00:29:20.320 | in how you can communicate.
00:29:23.740 | And so from an alien perspective,
00:29:25.340 | or somebody that has much greater technological capabilities,
00:29:28.060 | you have to figure out how do I use the skills I have
00:29:31.580 | to stimulate the human, the limited humans?
00:29:35.540 | - Right, well, I mean, let's take the ants,
00:29:37.980 | again, as an example.
00:29:38.980 | Let's say that you wanted to make ants practical.
00:29:43.240 | You wanted to use them for something, right?
00:29:45.300 | You wanted to use them as a form of biological robot.
00:29:47.600 | Now, DARPA and other people have been trying to use insects
00:29:52.600 | for, you know, turn them into biological robots.
00:29:56.100 | But if you wanted to, you would have to interact
00:29:58.740 | with their sense of smell, right?
00:30:02.220 | Their pheromone system that they use
00:30:04.300 | to interact with each other.
00:30:06.620 | So you would either create those molecules
00:30:10.580 | to talk to them, to make them do it.
00:30:12.020 | I'm not saying talk to them as if they're intelligent,
00:30:13.660 | but talk to them to manipulate them in ways that you want.
00:30:16.860 | Or if you were advanced enough,
00:30:19.420 | you would use some sort of electromagnetic
00:30:22.540 | or other means to stimulate their neurons
00:30:26.240 | in ways that would accomplish the same goal
00:30:29.500 | as the pheromones, but by doing it
00:30:31.340 | in a sort of a telefactoring way.
00:30:33.340 | So let's say you wanted to telefactor with humans.
00:30:37.940 | You would interact with them.
00:30:40.900 | And this is, again, this is a technology
00:30:42.600 | which you could imagine possible.
00:30:45.420 | You could telefactor information
00:30:47.420 | into the sensory system of a human, right?
00:30:50.580 | But then each human is a little bit different.
00:30:53.260 | So either you know enough about them
00:30:55.400 | to tailor it to that individual,
00:30:57.220 | or you just basically take advantage
00:30:58.820 | of whatever the sensory net is that that individual has.
00:31:02.220 | So if you happen to be good at sound,
00:31:05.340 | or you happen to be a very visually inclined individual,
00:31:08.300 | then maybe the sensory information that you get
00:31:11.320 | that's most effective in terms of transmitting information
00:31:15.160 | would come through that portal.
00:31:17.220 | - I think the aliens would need to figure out
00:31:19.820 | that humans value physical consistency.
00:31:22.760 | So we've discovered physics.
00:31:25.180 | So we want our perception to make sense.
00:31:28.460 | Maybe they don't, they haven't, you know.
00:31:30.620 | That's not an obvious fact of perception,
00:31:32.920 | that you have to figure out what kind of things
00:31:37.180 | are humans used to observing
00:31:39.180 | in this particular environment of Earth,
00:31:41.540 | and how do we stimulate the perception system
00:31:44.900 | in a way that's not anomalous, or not too,
00:31:49.060 | doesn't cross that threshold of just like,
00:31:51.700 | well, that's way too weird.
00:31:53.580 | So they have to, I mean, that's not obvious
00:31:56.060 | that that should be important.
00:31:57.940 | You know, maybe you wanna err on the side of anomaly,
00:32:02.420 | like lean into the weirdness.
00:32:04.020 | So communication is complicated.
00:32:06.300 | - Yeah, well, that's why I always find this issue
00:32:09.620 | of people talking about the so-called greys as interesting,
00:32:13.860 | because it is related to what you're saying.
00:32:17.300 | They're different enough,
00:32:19.460 | but they're not so different as to be scary, right?
00:32:22.360 | They're not venom-dripping fangs, right?
00:32:25.540 | They're different enough,
00:32:27.860 | but it's also like they're what you could imagine
00:32:32.220 | us becoming in some distant future.
00:32:34.060 | So is that a purposeful representation?
00:32:36.820 | I don't know.
00:32:38.780 | I mean, I don't believe in the greys, for instance,
00:32:41.260 | but I believe that people think that they see it.
00:32:44.580 | So if we're talking about a communication strategy
00:32:47.160 | that says, you know, we're like you,
00:32:50.320 | but not the same as you,
00:32:53.340 | this might be a manifestation that you represent
00:32:57.660 | in terms of a communication strategy.
00:32:59.720 | - What do you make of David Fravor's sighting
00:33:04.060 | of the Tic Tac UFO and other pilots
00:33:07.420 | who have seen these objects
00:33:09.540 | that seem to defy the laws of physics?
00:33:11.500 | - Well, I think you have to take them at their word.
00:33:15.720 | - Are they fascinating to you, these reports?
00:33:18.180 | - Oh, absolutely.
00:33:19.020 | No, I know a lot of these people, right?
00:33:20.940 | So I know Lou Elizondo, Chris Mellon,
00:33:24.620 | the whole crowd.
00:33:26.380 | I saw the videos about three weeks or so
00:33:30.500 | before they went public.
00:33:33.340 | I was at a bar with Lou overlooking the Pentagon
00:33:38.180 | in Crystal City, and they showed them to me,
00:33:40.140 | and my hair stood on end.
00:33:42.020 | And he said, "This is coming out soon."
00:33:45.360 | And I know one of the guys on the inside
00:33:49.340 | who was in Naval Intelligence
00:33:51.360 | who had interviewed all of these pilots,
00:33:53.540 | again, before this came out.
00:33:55.540 | And it was hair-raising to hear this,
00:33:58.740 | but also exciting that here's not just people's testimony.
00:34:03.740 | These are credible individuals.
00:34:05.820 | And if you've seen the 60 Minutes episode
00:34:08.240 | with some of the pilots, they have no monetary gain.
00:34:13.140 | If anything, they've got negative gain from coming out.
00:34:16.920 | But then you also have all of the simultaneous
00:34:19.620 | ship analysis from the USS Princeton
00:34:23.740 | and the radar analysis, et cetera.
00:34:26.220 | So at the end of the day, it's just data.
00:34:31.220 | It's not a conclusion.
00:34:33.780 | I'd be perfectly happy, honestly perfectly happy,
00:34:38.620 | if somebody showed that it was all a hoax.
00:34:41.260 | I'd go back to my day job, right?
00:34:44.100 | - That could be a hoax, but other things might not be.
00:34:47.780 | This is the point.
00:34:49.580 | This is why it's nice to remove some of the stigma
00:34:53.060 | about this topic, because it's all just data.
00:34:55.700 | And anomalous events are such that there's going to be,
00:35:02.260 | they're going to be rare in terms
00:35:04.020 | of how much data they represent.
00:35:05.900 | But we have to consider the full range of data
00:35:08.420 | to discover the things that actually represent something
00:35:11.580 | that, if we pull at it, will reveal something
00:35:14.780 | that's extraterrestrial, or something deep
00:35:18.140 | about the phenomena on Earth that we don't yet understand.
00:35:22.020 | - Right, well, if it only stimulates people,
00:35:25.820 | for instance, to think, okay, well, what happens
00:35:28.100 | if we could move like that with momentumless movement?
00:35:32.260 | And it stimulates young individuals to go
00:35:37.140 | into the sciences to ask those questions.
00:35:40.980 | That to me is fascinating.
00:35:41.980 | I mean, after I've been openly talking about this
00:35:44.580 | in the last year, especially, I've had a number
00:35:47.420 | of students from top schools who aren't my students
00:35:52.420 | come to me and say, "If I can help, let me.
00:35:57.220 | "How can I help?
00:35:58.100 | "I never had thought about this before,
00:35:59.540 | "but you opened, you and others, not just you,
00:36:02.460 | "and others have opened my mind
00:36:03.900 | "to thinking about this matter."
00:36:06.260 | - Yeah, that's why it's actually funny
00:36:07.620 | that Elon Musk doesn't think too much
00:36:10.660 | about these kinds of propulsion systems
00:36:14.780 | that could defy the laws of physics
00:36:16.580 | as we currently understand them.
00:36:18.700 | To me, it's a powerful way to think, well, what is possible?
00:36:22.740 | It's inspiring, even if some of the data
00:36:26.660 | doesn't represent extraterrestrial vehicles.
00:36:30.260 | I think the observation itself,
00:36:32.940 | it's like something you mentioned,
00:36:35.020 | which is hypothesizing, imagining these things,
00:36:40.020 | considering the possibility of these things,
00:36:43.340 | I think opens up your mind in a way
00:36:45.780 | that ultimately can create the technology.
00:36:49.420 | First, you have to believe the technology is possible
00:36:52.140 | before you can create it.
00:36:54.060 | - In my own lab, we always look for, as I've said before,
00:36:59.020 | what is inevitable, and saying,
00:37:02.420 | "Inevitably, this is the kind of data we need,
00:37:05.160 | "but if we need that kind of data,
00:37:06.480 | "the instrument we want doesn't exist."
00:37:10.780 | Okay, so I imagine the perfect instrument, I can't make it.
00:37:14.980 | And you back into something which is practical,
00:37:18.300 | and then you, in a sense, reverse engineer the future
00:37:22.500 | of what it is that you wanna make.
00:37:24.580 | And I've started and sold at least half a dozen
00:37:28.540 | or more companies using that basic premise.
00:37:32.380 | And so it was always something that didn't exist today,
00:37:35.900 | but we imagined what we wanted.
00:37:38.700 | And at the time, many people said it couldn't be done.
00:37:41.940 | I mean, for instance, all the gene therapy
00:37:43.680 | that's done today with retroviruses
00:37:46.140 | came from a group meeting in David Baltimore's lab.
00:37:49.940 | I was a postdoc with him.
00:37:52.100 | And one of the other postdocs wasn't able
00:37:55.500 | to make retroviruses in a way that he wanted to.
00:37:58.940 | And I realized I had a cell line
00:38:00.660 | that would allow us to make retroviruses
00:38:02.780 | in two days rather than two months.
00:38:04.740 | And so he and I then worked together to make that system.
00:38:09.580 | And now all gene therapy with retroviruses
00:38:11.540 | is done using this basic approach around the whole world
00:38:14.700 | because something couldn't be done,
00:38:17.300 | and we wanted to do it better, and we imagined the future.
00:38:22.100 | And so that's, I think,
00:38:24.460 | what the whole UFO phenomenon is doing for people.
00:38:28.140 | It's like, well, let's imagine a future
00:38:30.100 | where these kinds of technologies are,
00:38:33.120 | but also let's imagine a future
00:38:34.320 | where we don't blow ourselves up, right?
00:38:36.220 | So if these things are there,
00:38:37.660 | they manage to not blow themselves up.
00:38:40.500 | So it means that at least one other civilization
00:38:44.780 | got past the inflection point.
00:38:47.200 | - So if some of the encounters
00:38:48.660 | are actually representing alien civilizations visiting us,
00:38:52.600 | why do you think they're doing so?
00:38:55.420 | You suggested that perhaps it's the study
00:39:00.140 | and understand their own past, right?
00:39:02.540 | - Right.
00:39:03.380 | - What are some of the motivations, do you think?
00:39:05.380 | And again, from our perspective, us as humans,
00:39:09.260 | what motivations would we have
00:39:11.100 | when we approach other civilizations
00:39:12.980 | we might discover in the future?
00:39:15.180 | - Well, I think one motivation might be
00:39:17.660 | to steer us away from the precipice, right?
00:39:22.380 | Or on the assumption that, you know,
00:39:25.420 | even if we make it past the precipice,
00:39:27.580 | at least we're not a bunch of psychopaths,
00:39:31.360 | you know, running around.
00:39:32.540 | So maybe there's a little bit of motivation there
00:39:35.300 | to make sure that the neighbor
00:39:37.020 | that's growing up next to you is not, you know, unruly.
00:39:40.520 | You know, but I mean, maybe it's sort of a moral imperative
00:39:46.540 | like what we have with, you know,
00:39:49.020 | creating national parks where animals
00:39:54.020 | can continue to live out their lives in a natural way.
00:39:57.840 | I don't know.
00:39:59.980 | I mean, that would be, I mean,
00:40:01.820 | the problem is we're imagining
00:40:04.420 | from a anthropomorphic viewpoint,
00:40:08.580 | what an alien might think.
00:40:10.300 | And as I've said before, alien means alien, right?
00:40:14.820 | I mean, not Hollywood aliens,
00:40:17.620 | but a whole different way of thinking
00:40:21.340 | and a whole different level of experience.
00:40:23.640 | And let's say wisdom, hopefully,
00:40:25.980 | that we could only hope to understand.
00:40:30.260 | Now, but if we ever get out there,
00:40:32.260 | if we ever make it past our current problems,
00:40:36.200 | and even if we don't have faster than light travel,
00:40:39.100 | and even if we're only using ram scoops
00:40:43.060 | or light sails to get where we wanna go,
00:40:46.860 | and it takes us 10,000 years to get somewhere
00:40:50.660 | or to spread out, we might encounter such things.
00:40:54.020 | And are we just gonna stomp all over it
00:40:56.380 | like we did in colonial South America
00:40:58.980 | or Africa or all the rest?
00:41:02.180 | On our current path, likely.
00:41:03.580 | And so what are we gonna learn?
00:41:06.580 | - Well, we're getting better and better
00:41:08.500 | at understanding what is life.
00:41:10.580 | And I think we're getting better and better
00:41:13.820 | at being careful not to step on it when we see it.
00:41:17.460 | And this is one of the nice things
00:41:19.740 | about talking about UFOs,
00:41:21.660 | is it expands the Overton window.
00:41:24.420 | It expands our understanding
00:41:25.820 | of what possibly could be life.
00:41:28.500 | It gets us to think.
00:41:29.740 | It gets the scientific community to think.
00:41:32.060 | When we go to Mars, when we go to these different moons
00:41:34.860 | that possibly have life,
00:41:36.220 | we're not looking at legged organisms.
00:41:40.300 | We're looking at some kind of complexity
00:41:44.140 | that arises in resistance to the natural world.
00:41:50.340 | And there's a lot of interesting--
00:41:54.140 | - I like that, resistance to the natural world, yeah.
00:41:57.300 | - So somehow there's a rebellious process,
00:42:01.020 | complex system going on here.
00:42:03.380 | And I don't know the many ways it could take form.
00:42:06.620 | And there's a sense for aliens
00:42:09.860 | that as the technology develops,
00:42:12.780 | they take form more and more as information,
00:42:16.820 | as something that can influence the space of ideas,
00:42:21.420 | of the processing of data itself.
00:42:24.860 | So I just, this idea of embodiment
00:42:27.620 | that we humans so admire,
00:42:29.380 | physically visible, perceivable embodiment
00:42:34.420 | may be a very inefficient thing.
00:42:38.380 | - Right. - Right?
00:42:39.740 | - If you think just about your area, AI,
00:42:43.220 | we're trying to make smaller and smaller and smaller
00:42:48.060 | circuitry that is basically close to the surface,
00:42:54.580 | closer and closer to the physics
00:42:56.780 | of how the universe operates, right?
00:43:00.420 | Right down at the level of, I mean,
00:43:01.780 | quantum computers are basically right down
00:43:03.620 | at quantum information storage.
00:43:05.780 | So fast forward 10,000, 100,000 years,
00:43:10.540 | maybe somebody found a way to embody AI directly
00:43:13.500 | into the physics of the universe, right?
00:43:16.380 | And it doesn't require a physical manifestation.
00:43:19.580 | It just sits in space-time.
00:43:22.780 | It's just a locally ordered space.
00:43:25.220 | We're just locally ordered space-time, right?
00:43:28.820 | You know, I mean, people,
00:43:30.340 | but maybe they just, they found a way to embody it there.
00:43:33.780 | - They probably have to get really good at not,
00:43:36.300 | you know, trampling on the ants.
00:43:37.980 | 'Cause the better your technology gets,
00:43:40.340 | the easier it is to accidentally like, oops,
00:43:42.820 | just destroy these simpleton biological systems.
00:43:47.980 | - We constantly think about whatever these things might be.
00:43:50.580 | We think that they are some sort of a unified force.
00:43:55.580 | Well, maybe they're not unified.
00:43:58.840 | Maybe they are as disparate as you and I are.
00:44:03.020 | And maybe what keeps them from stomping all over the ants
00:44:06.620 | is each other, right?
00:44:09.140 | That they are in a self-tension
00:44:11.620 | to prevent one or more of them from running amok.
00:44:16.620 | - Oh, yeah.
00:44:18.780 | I mean, that's kind of the anarchy of nations
00:44:21.020 | that we have on earth.
00:44:21.860 | So there's always going to be this-
00:44:26.860 | - There's a hierarchy.
00:44:28.180 | - This hierarchy that's formed
00:44:29.620 | of greater and greater intelligences.
00:44:31.540 | - Right.
00:44:32.380 | - And they're all probably also wondering,
00:44:34.500 | wait, what's bigger than me?
00:44:36.140 | - Exactly.
00:44:36.980 | That's what I always wonder,
00:44:37.900 | is that maybe that what keeps them in line
00:44:41.140 | is something that is beyond them.
00:44:43.740 | - Like what created the universe?
00:44:45.260 | I mean, that's probably a question that bothers them too.
00:44:49.500 | - What about the communication task itself?
00:44:53.020 | How hard do you think it is for aliens
00:44:54.780 | to communicate with humans?
00:44:56.300 | So is this something you think about,
00:45:00.380 | about this barrier of communication
00:45:02.060 | between biological systems and something else?
00:45:05.020 | How difficult is it to find a common language?
00:45:07.620 | - Well, I think if you're smart enough
00:45:11.240 | or technologically enabled enough,
00:45:13.780 | it's relatively straightforward.
00:45:18.420 | Now, whether your concepts
00:45:21.700 | can ever be dumbed down to us,
00:45:27.260 | that might be hard.
00:45:28.860 | I mean-
00:45:30.900 | - Again, talking to the ants.
00:45:32.260 | - Talking to the ants.
00:45:33.220 | I mean, they don't-
00:45:34.060 | - On Instagram.
00:45:34.900 | You want to look good in this picture.
00:45:39.780 | Let me explain to you-
00:45:40.620 | - Let me explain to you why.
00:45:41.940 | So that's the essential problem of,
00:45:47.220 | you know, perhaps they realize
00:45:50.780 | who it is that they're talking to.
00:45:53.460 | And they say, "Rather than muddy the picture,
00:45:57.900 | we're only gonna give them limited information."
00:46:00.940 | - Yeah.
00:46:01.780 | - Right?
00:46:02.620 | And yeah, maybe we could sit down like you and I
00:46:05.060 | and have a conversation,
00:46:06.580 | but then they would make assumptions.
00:46:09.060 | The humans would then make assumptions about us
00:46:10.820 | that aren't true, 'cause we're not humans, right?
00:46:13.900 | So let's stay at arm's length.
00:46:17.300 | Let's just let them know that we're here, right?
00:46:21.700 | And here's the limited amount of communication.
00:46:23.740 | Again, this notion that
00:46:25.220 | if you give somebody everything, they'll get lazy.
00:46:29.300 | And, you know, if they've been around as long as they have,
00:46:34.060 | they've seen every kind of thing that can go wrong.
00:46:36.740 | And so it's, they know as much as they might want
00:46:41.620 | to step in, that that would be a wrong thing.
00:46:44.420 | - Yeah, you have to also understand
00:46:46.900 | that the amount of wisdom they carry.
00:46:50.300 | - Yeah.
00:46:51.140 | You know, and so it's very easy as well for religions to,
00:46:56.540 | I don't wanna get into a whole religious conversation,
00:46:58.780 | but you could, very easy for,
00:47:00.780 | you could see how religions could call them angels
00:47:04.220 | or devils or what have you,
00:47:06.780 | because again, if you're trying to fit it
00:47:08.980 | into a framework of cultural understanding,
00:47:12.260 | the first thing you reach for is God.
00:47:14.980 | And so it, when you look at what these things are,
00:47:21.060 | and again, with the angels and the devils,
00:47:23.580 | in a similar sort of way, their communication is limited.
00:47:29.380 | They just kind of give little, what's the,
00:47:31.980 | the Oracle of Delphi,
00:47:33.300 | they kind of give these Delphic pronouncements,
00:47:36.620 | and then it's up to you to figure out
00:47:37.860 | what it is that they really mean.
00:47:39.500 | - Steven Greer claimed that a skeleton discovered
00:47:45.220 | in the Atacama region of Chile might be an alien.
00:47:50.100 | You reached out to him and took on the task
00:47:53.740 | of proving or disproving that with the rigor of science.
00:47:57.180 | The result is a paper titled,
00:47:59.260 | "Whole-Genome Sequencing of Atacama Skeleton
00:48:02.060 | Shows Novel Mutations Linked with Dysplasia."
00:48:06.340 | Can you tell this full story?
00:48:07.840 | - The story was, as you put it right there, correct.
00:48:13.820 | Reached out, got a sample of the body,
00:48:18.460 | did the DNA sequencing, then worked with a team
00:48:22.420 | of two other Stanford scientists
00:48:25.060 | and Roche sequencing group, Roche Diagnostics,
00:48:30.060 | and probably a total team of about 11 or so people.
00:48:34.620 | And as is standard in these kinds of things,
00:48:38.180 | the professors actually don't do the work.
00:48:40.460 | The students do the work and figured out the answer.
00:48:43.320 | And then we helped them put together the story.
00:48:46.900 | And the story was simply that it was human, 100%.
00:48:53.540 | I went into it thinking it was originally a monkey
00:48:57.260 | of some sort.
00:48:58.140 | I got kind of excited a few months into the process,
00:49:02.660 | thinking, well, what happens if it is an alien?
00:49:05.060 | - Can you describe some of the characteristics
00:49:08.180 | of the skeleton that makes it unique and interesting?
00:49:10.420 | - Primarily, it had dysmorphias of the bone.
00:49:13.960 | And so the first thing I did, actually,
00:49:15.420 | when I got pictures of it, I took it to a local expert
00:49:19.260 | at Stanford, and he was on the paper.
00:49:24.260 | And he was the world expert in pediatric bone dysmorphias.
00:49:30.700 | He literally wrote the book on this,
00:49:34.620 | 'cause that's what you do.
00:49:35.460 | You go to an expert when it's outside
00:49:37.100 | of your field of interest.
00:49:38.540 | And he said, "Well, I haven't seen this particular
00:49:41.780 | "collection of mutations before, or this physiology before,
00:49:46.780 | "but here's what I think it might be."
00:49:49.880 | And he said, "But based on the size of the thing
00:49:55.060 | "and the bone density, it would appear to be
00:50:00.660 | "like six or seven years old."
00:50:02.940 | Now, again, that's the thing where I think
00:50:06.420 | the lay public doesn't understand
00:50:10.160 | or takes a speculation like that and turns it into a fact.
00:50:15.160 | No one ever said that it was that age.
00:50:17.860 | We only said that the bones made it look
00:50:19.720 | like it was that age.
00:50:21.720 | But then we went back and looked for genetic explanations
00:50:26.620 | of why things might look the way they did.
00:50:29.020 | And if you, again, read the paper,
00:50:31.500 | it's very carefully caveated to say
00:50:34.260 | that these mutations might result in this.
00:50:38.320 | But what we did find was an unexpectedly large number
00:50:43.320 | of mutations associated with bone growth in this individual.
00:50:48.060 | And it was just a bad roll of the dice, right?
00:50:52.120 | You roll the dice enough times
00:50:53.580 | with enough people born every year,
00:50:56.380 | and someone will roll the wrong dice all at once.
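(An aside on the "unexpectedly large number" reasoning: one simple way to ask whether a variant count is surprising is to compare it against a chance model. The sketch below is hypothetical in every particular; the counts, the gene-set fraction, and the Poisson model are illustrative assumptions, not the published analysis.)

```python
# Hedged, illustrative sketch: is the observed count of rare variants in a
# gene set surprisingly high under a simple Poisson chance model?
# Every number here is a hypothetical placeholder, not the paper's data.
from math import exp

def poisson_tail(k, lam):
    """Return P(X >= k) for X ~ Poisson(lam)."""
    term, cdf = exp(-lam), 0.0
    for i in range(k):
        cdf += term            # accumulate pmf(0) .. pmf(k - 1)
        term *= lam / (i + 1)  # pmf(i + 1) from pmf(i)
    return 1.0 - cdf

rare_variants_total = 60   # hypothetical: rare coding variants found genome-wide
gene_set_fraction = 0.02   # hypothetical: fraction of genes tied to bone growth
observed_in_set = 7        # hypothetical: variants landing in that gene set

expected = rare_variants_total * gene_set_fraction
print(f"expected ~{expected:.1f}, observed {observed_in_set}, "
      f"P(X >= {observed_in_set}) ~ {poisson_tail(observed_in_set, expected):.1e}")
```

(A tiny tail probability does not rule out chance: with millions of births a year, even a very unlikely combination will eventually occur in someone, which is exactly the bad-roll-of-the-dice point.)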
00:51:01.380 | So the sad part about it was individuals
00:51:06.100 | in the UFO community who wanted to think
00:51:09.460 | that there was some sort of conspiracy around it, right?
00:51:13.940 | That somebody had somehow convinced
00:51:16.780 | all of my students to lie.
00:51:20.440 | I mean, come on.
00:51:21.280 | I would lose my job, first of all.
00:51:26.020 | And they would all be in trouble forever.
00:51:31.020 | - Yeah, but also it's just projecting malevolence
00:51:34.060 | onto people that doesn't, I don't think,
00:51:37.100 | exist in the normal populace, and especially doesn't exist
00:51:40.660 | in the scientific community.
00:51:42.180 | The kind of people that go into science,
00:51:43.680 | I mean, this is what bothers me
00:51:44.900 | with the current distrust of science,
00:51:47.480 | is they might be naive.
00:51:49.860 | They might not, especially in modern science,
00:51:53.460 | look at the big picture philosophical, ethical questions,
00:51:56.100 | all that kind of stuff.
00:51:57.420 | But ultimately, they're people with integrity
00:52:02.020 | and just a deep curiosity for the discovery
00:52:05.140 | of cool little things.
00:52:06.820 | And there's no malevolence, broadly speaking,
00:52:11.820 | in the scientific community.
00:52:15.780 | So, I mean, there's a bigger story here,
00:52:17.660 | which is there's a hunger in the populace
00:52:22.300 | to discover something anomalous, something new.
00:52:25.700 | And science has to be both open to the anomalous,
00:52:30.700 | but also to reject the anomalous
00:52:35.340 | when the data doesn't support it.
00:52:37.460 | What do you make of that, walking that line for you?
00:52:40.740 | Because you're dealing with UFO encounters,
00:52:42.820 | you're dealing with the anomalous.
00:52:45.660 | - Well, people have said, let's go back
00:52:48.940 | to the Atacama case, that I was debunking it.
00:52:52.060 | Well, debunking is a loaded term.
00:52:54.260 | It sort of assumes that you were going in purposefully
00:52:57.700 | to prove something is wrong.
00:53:00.500 | I wasn't, I was just going in to collect the data.
00:53:03.920 | And I showed that this one was human.
00:53:08.100 | There was another skull that somebody had at one point,
00:53:11.220 | it was called the Starchild,
00:53:12.300 | they called it the Starchild skull.
00:53:14.100 | I said, I looked at it, I looked at the DNA sequencing
00:53:16.980 | that they had done, I said, this is human.
00:53:19.860 | End of story.
00:53:21.900 | The people who owned the thing at the time
00:53:25.460 | disagreed with me, and then eventually,
00:53:27.700 | another group came in and proved that I was right.
00:53:30.660 | And it's not about debunking,
00:53:32.340 | it's about getting the more spectacular
00:53:35.020 | and hyped cases off the table.
00:53:37.580 | I mean, the reason I got interested in it
00:53:39.300 | is 'cause somebody was hyping it.
00:53:41.100 | And not because I wanted to disprove it,
00:53:42.780 | but because I just wanted to know.
00:53:44.500 | And thus, get it off the table,
00:53:45.900 | 'cause it's usually the most extravagant things
00:53:49.780 | that are most likely to be wrong.
00:53:51.680 | Somewhere in the rubble will be something interesting.
00:53:57.340 | And so that's what you do,
00:53:58.660 | you get the dross off the table,
00:54:02.300 | and then somewhere in the data
00:54:04.660 | will be something worth real inquiry.
00:54:09.540 | - And that, if you inquire deeply enough,
00:54:12.380 | will be extravagant as well.
00:54:13.820 | - Yes, exactly.
00:54:15.060 | - And that's what actually excites scientists,
00:54:17.100 | I mean, you want, with the rigor of science,
00:54:21.100 | to actually reveal the extravagant.
00:54:24.100 | - And so look at CRISPR
00:54:25.460 | as probably the most perfect example of that.
00:54:27.920 | These weird sequences in bacterial genomes
00:54:32.740 | all arrayed one after the other
00:54:34.260 | with these strange sequences around them.
00:54:37.240 | But when you looked at the sequences,
00:54:38.620 | they looked like viruses.
00:54:41.100 | And so how did they get there?
00:54:42.780 | And lo and behold, after a lot of effort and work,
00:54:46.100 | well, a couple of Nobel Prizes went out the door,
00:54:48.860 | but these strange things ended up
00:54:52.740 | having extraordinarily extravagant possibilities.
00:54:56.760 | - You've also looked at UFO materials.
00:55:00.060 | You are in possession of UFO materials yourself.
00:55:03.820 | - Claimed UFO materials.
00:55:05.580 | - Claimed. - Alleged.
00:55:07.060 | - Alleged UFO materials, that's right.
00:55:08.980 | So what's another term?
00:55:11.540 | Weird materials that don't seem to,
00:55:15.580 | that have a story.
00:55:16.580 | - They have a story that doesn't seem
00:55:18.700 | to be of natural origins, but it's not,
00:55:22.420 | there's a process to proving that,
00:55:25.980 | and that process may take decades, if not centuries,
00:55:30.680 | because you have to keep pulling at the string
00:55:33.300 | and discover where they could possibly come from.
00:55:35.800 | But anyway, you're in possession of some materials
00:55:40.100 | of this kind.
00:55:41.540 | Can you describe some of them?
00:55:45.140 | And maybe also talk to the process
00:55:47.020 | of how you investigate them, how do you analyze them?
00:55:50.140 | - Right, so let's say that there's two classes of materials
00:55:53.920 | that I've been given by people,
00:55:56.380 | and they're not given by the government or anything,
00:55:58.380 | just given by people who've collected them,
00:56:00.240 | and there's a reasonable chain of evidence
00:56:02.580 | associated with them that you believe is not just a pebble
00:56:05.620 | somebody picked up off a road.
00:56:07.120 | There are almost always things that people have claimed
00:56:12.020 | have either been dropped off as some sort
00:56:15.540 | of a leftover material, molten metals,
00:56:19.260 | or they are from an object that released something
00:56:24.260 | as it kind of exploded.
00:56:29.500 | They're almost always metals.
00:56:31.340 | I have some couple of things that might be biological
00:56:33.540 | that are interesting that I haven't really spent
00:56:35.080 | a lot of time on yet.
00:56:36.500 | When you look at a metal, you basically,
00:56:39.500 | well, okay, what are the elements in it?
00:56:42.220 | And what's it made of?
00:56:43.620 | And so there's pretty standard approaches to doing that.
00:56:47.820 | Most of them involve a technology called mass spectrometry,
00:56:51.540 | and there's probably about five or six different kinds
00:56:53.320 | of mass spectrometry that you could bring to bear
00:56:55.820 | on answering it.
00:56:56.740 | And they either tell you, depending upon the limit
00:57:00.220 | of the resolution of the instrument,
00:57:02.300 | they either tell you the elements that are there,
00:57:05.260 | or they tell you the isotopes that are there.
00:57:07.140 | And you're interested not just in knowing
00:57:08.700 | whether something is there or not,
00:57:10.660 | you're interested in knowing whether there are,
00:57:14.380 | you know, the amounts of it, and in the case of elements,
00:57:19.380 | how many different isotopes are there.
00:57:23.620 | And that's kind of where, in some of these cases,
00:57:25.900 | it gets interesting, right?
00:57:27.580 | Because in at least one of the materials,
00:57:30.700 | as we first studied it, the isotope ratios of,
00:57:33.940 | in this case, magnesium, were way off normal.
00:57:37.460 | And I just don't know why.
00:57:40.100 | It doesn't prove anything.
00:57:42.620 | It just, all it proves is that it was probably accomplished
00:57:47.620 | by some kind of an industrial process.
00:57:51.300 | Whether it's the result of a process, or whether,
00:57:55.140 | and this is sort of the leftover,
00:57:57.460 | or whether it was made that way for a particular purpose,
00:58:01.820 | I don't know.
00:58:02.780 | All I know is that it was engineered.
00:58:07.580 | That's it, right?
00:58:11.540 | But then it's, the question is,
00:58:15.260 | it's sort of, you go one step deeper.
00:58:16.940 | Why would you engineer it?
00:58:18.740 | - Right, why, and what does engineered mean?
00:58:24.260 | There's all kinds of, it could be a byproduct,
00:58:27.260 | it could be the main result of an engineering process,
00:58:33.260 | it could be a small part of the engineering process
00:58:37.380 | that is the main part.
00:58:39.100 | - Well, so the ratios of isotopes for any given element
00:58:43.820 | are basically the result of stellar processes.
00:58:46.920 | A supernova blew up sometime several billion years ago.
00:58:53.540 | That became a cloud.
00:58:56.860 | Those atoms coalesced gravitationally to form another sun.
00:59:02.860 | And a ring that became a rocky planet.
00:59:05.800 | And the ratios of the isotopes were determined
00:59:11.080 | at the time of that explosion.
00:59:14.440 | And so everything in the local solar system
00:59:17.220 | is more or less of that ratio,
00:59:19.780 | depending upon certain gravitational differences,
00:59:22.260 | but by fractions of a percent,
00:59:26.260 | not whole tens of percent difference.
00:59:29.580 | So what do humans use isotopes for?
00:59:32.500 | Mostly to blow stuff up.
00:59:34.180 | I mean, the vast majority of the isotopes
00:59:36.500 | that have been made, per pound or ton,
00:59:40.660 | are things like certain ratios of plutonium and uranium
00:59:44.500 | to blow stuff up.
00:59:45.760 | We don't make or engineer isotopes,
00:59:49.940 | which today is relatively easy to do,
00:59:52.460 | but it's still expensive.
00:59:54.100 | For any other reason, apart from, let's say,
00:59:57.380 | anti-cancer uses. We use stable isotopes in money these days
01:00:02.380 | as an anti-counterfeiting tool.
01:00:04.840 | You basically embed certain ratios of isotopes
01:00:08.180 | in to make it harder for counterfeiters to accomplish.
01:00:11.900 | And so, but other than that, we don't do anything with that.
01:00:16.620 | So why would you make grams of such material
01:00:20.380 | in this one case and drop it around on a beach in Brazil?
01:00:24.780 | - So which case are we talking about?
01:00:26.940 | - This is the Ubatuba case.
01:00:30.620 | - Can you describe this case a little bit further?
01:00:32.620 | Like what material we're talking about,
01:00:34.580 | just the full story of the case.
01:00:36.100 | It's an interesting one.
01:00:37.540 | - It's an interesting one.
01:00:38.380 | So a fisherman saw an object that released something
01:00:43.380 | or it exploded and it was this,
01:00:47.160 | I've got some big chunks of it,
01:00:50.220 | relatively pure magnesium with obviously
01:00:54.680 | something else in it 'cause magnesium burns.
01:00:56.780 | So it had something in it,
01:00:59.340 | other metals, a simple alloy,
01:01:01.700 | that would prevent it from basically burning up.
01:01:05.680 | And so the question is,
01:01:09.820 | and so then we had two pieces that came
01:01:12.420 | from two different chains of custody,
01:01:15.700 | both claimed to be from the same object.
01:01:19.460 | At least physically, when you look at the two things,
01:01:23.180 | they look the same, right?
01:01:27.020 | So we took small fragments of each of them.
01:01:29.860 | We put them in an instrument called
01:01:31.220 | a secondary ion mass spec,
01:01:33.340 | which is an extremely sensitive instrument.
01:01:35.980 | And it can see down to 0.0001 mass units,
01:01:40.420 | which is important for, let's say, more arcane reasons,
01:01:46.100 | but it's a sensitive instrument.
01:01:48.860 | And so one of the chains of custody,
01:01:52.900 | we had two pieces from the same chain of custody,
01:01:55.940 | and then two pieces from the other chain of custody.
01:01:58.820 | One of them had completely normal magnesium isotope ratios,
01:02:03.820 | magnesium 24, 25, 26, and the other was off,
01:02:08.180 | not just like slightly off, way off.
01:02:09.700 | And they were both off to the same extent.
01:02:12.580 | So, I mean, it was sort of like you had an internal control
01:02:17.140 | of what was normal,
01:02:18.900 | and then you had this other one which was wrong.
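(For concreteness, a minimal sketch of this kind of comparison, not the actual SIMS pipeline: normalize measured ion counts for the three magnesium isotopes and report the deviation from terrestrial reference abundances. The reference fractions are the standard natural values; the count data are invented placeholders.)

```python
# Minimal sketch: compare measured Mg isotope fractions against terrestrial
# reference abundances and report the percent deviation per isotope.
# Reference values are the standard natural abundances; counts are hypothetical.
TERRESTRIAL_MG = {24: 0.7899, 25: 0.1000, 26: 0.1101}

def percent_deviation(counts):
    """Normalize raw ion counts and return % deviation from the reference."""
    total = sum(counts.values())
    return {m: 100.0 * (counts[m] / total - ref) / ref
            for m, ref in TERRESTRIAL_MG.items()}

control = {24: 789_000, 25: 100_100, 26: 110_200}    # near-normal fragment
anomalous = {24: 650_000, 25: 160_000, 26: 190_000}  # ratios shifted by tens of %

for name, counts in (("control", control), ("anomalous", anomalous)):
    devs = percent_deviation(counts)
    print(name, {m: f"{d:+.1f}%" for m, d in devs.items()})
```

(Real instruments report ratios against a calibrated standard, often as per-mil delta values, but the flavor is the same: terrestrial material clusters tightly around the reference, so shifts of tens of percent stand out immediately.)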
01:02:22.420 | And so you're left with,
01:02:24.220 | I guess, kind of an open question:
01:02:27.140 | was this a hoax?
01:02:29.700 | Were these two chains of custody, one of them a hoax,
01:02:32.420 | that somebody purposefully introduced those things?
01:02:35.140 | 'Cause you could do it, it would cost a lot.
01:02:38.460 | I mean, at the time that this was found,
01:02:40.840 | I guess the 1970s or so, might've been earlier, I forget,
01:02:45.740 | the amount that I had would have cost
01:02:49.780 | several tens of thousands of dollars to make.
01:02:52.020 | And again, it's not something you would just throw around,
01:02:56.180 | and why would you do it in the hope
01:02:57.660 | that some guy 30 years from then
01:03:00.020 | would pick it up and study it?
01:03:02.500 | - Yeah, it's a very subtle, subtle troll.
01:03:05.100 | It's a long-term plan.
01:03:06.880 | - So I just don't know what to make of it,
01:03:12.340 | except it's interesting.
01:03:13.620 | So a different kind of question that you're asking
01:03:17.580 | is what constitutes evidence, right?
01:03:21.540 | So is this sufficient evidence?
01:03:25.940 | Absolutely not.
01:03:27.660 | But somebody's put it forward, I have the time,
01:03:30.440 | it's my time, I'll study it,
01:03:33.700 | and my objective is to sort of take those
01:03:36.400 | that I think are credible enough
01:03:38.540 | and do a reasonable analysis, put it out there,
01:03:41.260 | and maybe somebody else will come up with an idea
01:03:43.320 | as to what it is.
01:03:44.620 | Now, what would be better
01:03:47.100 | is some sort of true technology, right?
01:03:51.140 | Something that is obviously, we don't have it.
01:03:54.140 | And people like Neil deGrasse Tyson and Seth Shostack
01:04:00.000 | have come out rightfully and have said,
01:04:04.020 | when you show up with something really obviously technological
01:04:11.740 | that we don't understand, then we'll pay attention, right?
01:04:16.180 | - Not just material.
01:04:17.260 | - Not just material.
01:04:18.260 | A piece of metal is interesting,
01:04:21.800 | and several of the things that I've looked at
01:04:26.060 | and other things that people have come to me with,
01:04:29.600 | we found to be completely banal
01:04:32.100 | or were actually pieces of aircraft
01:04:35.780 | that were invented back in the 1940s.
01:04:38.580 | And so take them off the table.
01:04:40.660 | - See, but I think, again,
01:04:45.540 | I think showing up with technology
01:04:47.620 | that we humans would find completely novel
01:04:50.540 | is actually a really difficult task for aliens
01:04:53.460 | because it obviously can't be so novel
01:04:56.820 | that we don't recognize it.
01:04:58.180 | - For what it is, yeah.
01:04:59.100 | - For what it is.
01:04:59.940 | And I would say most of the technology aliens likely have
01:05:04.320 | would be something we don't recognize.
01:05:06.700 | So it's actually a hard problem
01:05:09.380 | how to convince ants.
01:05:11.420 | Like, you first have to understand
01:05:13.420 | what ants are tweeting about,
01:05:15.300 | what they care about in order to inject into their culture
01:05:21.340 | because that's why I think it would be the technology
01:05:26.420 | that you could present is in the space of ideas,
01:05:29.720 | is try to influence individual humans with the encounters
01:05:35.780 | and try to, with this kind of thing that you mentioned,
01:05:39.780 | send messages
01:05:42.380 | about us not taking care of the world.
01:05:44.860 | It's difficult.
01:05:47.980 | I mean, for them to understand
01:05:49.740 | that you have to come up with trinkets that impress us,
01:05:52.420 | I mean, maybe the very technology,
01:05:56.020 | the fascination with the development of technology,
01:06:00.420 | the actual act of innovation itself
01:06:03.140 | is the thing that they're communicating.
01:06:05.540 | - Right.
01:06:06.380 | - I mean, this is kind of what Jacques Vallee thinks about,
01:06:09.740 | is the role of--
01:06:11.380 | - The control system, he calls it.
01:06:13.180 | - The control system.
01:06:14.340 | Well, let me ask about Jacques.
01:06:16.460 | Who is he?
01:06:17.720 | You know him.
01:06:20.420 | Who is Jacques Vallee?
01:06:21.660 | What have you learned from him?
01:06:24.460 | About life, about--
01:06:27.140 | - Oh my God.
01:06:27.980 | - About UFOs, about technology,
01:06:31.180 | about our role in the universe?
01:06:33.500 | - Well, I met Jacques,
01:06:35.620 | actually soon after the whole Atacama thing happened.
01:06:38.820 | I was visited by those people associated
01:06:43.140 | with the government and whatever around the Havana,
01:06:47.620 | what ended up mostly being Havana syndrome patients,
01:06:50.260 | but also Jacques at the same time.
01:06:51.540 | And they were actually working behind the scenes
01:06:53.460 | with each other, that, oh, here's this Stanford professor
01:06:57.300 | who is willing to talk about this stuff
01:06:59.140 | and investigate things.
01:07:00.380 | Maybe we should go talk to him.
01:07:03.100 | And he reached out through a colleague
01:07:06.100 | and he and I had lunch actually at the Rosewood Inn
01:07:09.980 | up on near Sand Hill.
01:07:12.420 | So Jacques is one of the first openly active scientists
01:07:17.420 | and he's really a scientist in this area,
01:07:23.340 | going back to the 1960s.
01:07:25.360 | And he's put forward a number of ideas,
01:07:31.860 | speculations about what it might be
01:07:34.700 | that people are interacting with.
01:07:36.100 | And the first thing that I learned from him
01:07:38.260 | is this notion of what he called Kabuki theater,
01:07:41.660 | that many of the things that people have seen are,
01:07:45.220 | I remember reading his books and thinking,
01:07:47.100 | he uses this word absurd a lot.
01:07:49.280 | He said, "The things that people claim they see are absurd."
01:07:54.020 | A ship doesn't land in a farmer's field
01:08:00.860 | and then come up and knock on the door and say,
01:08:02.580 | "Can I have a glass of water?"
01:08:03.940 | And these are stories literally out of newspapers
01:08:06.380 | from the 1930s, it's absurd.
01:08:09.660 | You know, and the other thing that people say,
01:08:10.940 | "Ships don't crash.
01:08:12.580 | "If you're so technologically advanced, you don't crash.
01:08:15.700 | "It's absurd that they crash."
01:08:18.660 | So he says, "This is put on as a show.
01:08:23.660 | "It's meant to, it's an influence campaign, right?
01:08:29.820 | "It's not meant to influence individuals,
01:08:31.840 | "it's meant to influence a culture as a whole.
01:08:35.660 | "Maybe they don't look at us as individuals.
01:08:37.580 | "Maybe they look at us as an organism
01:08:41.180 | "that lives on a planet, right?"
01:08:43.500 | And perhaps rightfully so.
01:08:45.260 | And so that's how you interact with them.
01:08:47.200 | That's how you influence them.
01:08:48.900 | So that was one of the first things
01:08:50.460 | that kind of took me back and realized,
01:08:52.620 | wow, there's actually a,
01:08:53.840 | maybe there's a puppet master behind the scenes
01:08:58.260 | that's doing this influencing
01:09:00.660 | and that all this stuff about aliens
01:09:02.340 | is just not true per se.
01:09:05.180 | They're just a representation of something
01:09:08.060 | that is meant to influence.
01:09:09.820 | So that was probably the most interesting.
01:09:11.300 | I mean, the man is brilliant.
01:09:13.420 | He's also, it can be, and I'm sorry, Jacques,
01:09:16.060 | he can also be incredibly annoying
01:09:18.460 | to have a conversation with
01:09:20.500 | because he will pick apart your arguments
01:09:23.700 | or anything that you think you know
01:09:25.580 | and show you why you don't know what you think you know.
01:09:28.980 | And he uses the, he used the example that for me,
01:09:32.380 | that is all you need is one counter example
01:09:37.380 | to any conclusion and you're wrong.
01:09:41.620 | And so I learned from him,
01:09:43.580 | I mean, I'm supposed to be a good scientist,
01:09:45.380 | but I learned from him, don't talk about conclusions,
01:09:49.020 | just talk about the data because data's not wrong.
01:09:52.140 | I mean, convince yourself that the data's not wrong
01:09:54.340 | or not an artifact, but be careful about your conclusions
01:09:57.380 | because whatever is going on,
01:09:58.740 | it's much more complicated than we imagine.
01:10:03.020 | - Wow, that's powerful, being able to always step back.
01:10:05.660 | 'Cause we humans get excited.
01:10:07.500 | - Yeah. - We start to jump
01:10:08.860 | to conclusions from the data, but always step back.
01:10:19.700 | - Well, in some of my Twitter feeds,
01:10:21.380 | when I dare to go on Twitter, are full of,
01:10:24.100 | well, when are you gonna give us the answer?
01:10:26.540 | You know, science is not immediate.
01:10:29.180 | You're gonna have to be patient.
01:10:30.980 | And even some of my science colleagues have said,
01:10:33.180 | well, where's the data?
01:10:35.060 | My answer to them has been,
01:10:36.860 | where's been your work to try to produce any?
01:10:39.540 | You know, I'm not here to give you everything
01:10:41.260 | on a silver platter.
01:10:42.420 | - We talked offline how much I love data
01:10:45.860 | and machine learning and so on.
01:10:48.260 | And it's been really disheartening to see
01:10:50.620 | the US government not invest as much as they possibly could
01:10:55.620 | into this whole process.
01:10:56.980 | So let's jump to the most recent thing,
01:11:00.100 | which is what do you make of the report titled,
01:11:02.700 | Preliminary Assessment: Unidentified Aerial Phenomena,
01:11:07.340 | that was released by the Office of the Director
01:11:09.380 | of National Intelligence in June 2021?
01:11:12.740 | So this was like, okay, we're gonna step back
01:11:17.020 | and we're going to like, what, where do we stand
01:11:20.020 | and where do we hope the future is?
01:11:21.500 | What do you make of that report?
01:11:22.540 | Is it hopeful?
01:11:23.700 | Is it--
01:11:24.520 | - I see it as very hopeful, very hopeful.
01:11:26.820 | I think the adults are finally stepping up
01:11:29.500 | and being in charge, right?
01:11:31.900 | - In the good sense of adult.
01:11:33.700 | - What's that?
01:11:34.540 | - In the good sense of adult. - In the good sense of adult.
01:11:36.340 | (Lex laughing)
01:11:37.540 | - You know-- - 'Cause childlike curiosity
01:11:38.860 | is a pretty powerful thing.
01:11:40.380 | - That's true, yeah.
01:11:42.260 | But it's also, I think, the people who were worried
01:11:45.200 | that the populace at large might run screaming
01:11:48.140 | into the streets and riot, you know,
01:11:51.460 | they basically, the empiric evidence is they're wrong.
01:11:55.300 | You know, these videos and all these things
01:11:57.980 | have been out for now, what, five years?
01:12:00.420 | Most people don't even know about it, right?
01:12:02.820 | So as hyped as it's been and all over the newspapers
01:12:06.780 | that it's been and et cetera, you know,
01:12:08.980 | even Tucker Carlson has talked about it many times
01:12:12.220 | on his news program.
01:12:13.340 | Joe Rogan has.
01:12:15.340 | A lot of people don't know about it.
01:12:16.780 | So I think people, if it's not affecting
01:12:18.860 | their day-to-day life, they're going on
01:12:21.220 | with their day-to-day life.
01:12:23.180 | So, but that said, I think it was an important
01:12:27.540 | sea change in the internal discussions going on
01:12:31.340 | in the government because, and the reason being,
01:12:35.460 | that I think this is actually partly true
01:12:38.580 | with the maturation of human social technology.
01:12:43.100 | It was becoming so obvious that this stuff
01:12:46.200 | was showing up again and again and again
01:12:48.140 | around our ships.
01:12:49.120 | They just couldn't keep it quiet anymore, right?
01:12:51.860 | And so it's like, we need to do something about it.
01:12:53.940 | And Lue Elizondo and Chris and others,
01:12:55.820 | to their great credit, found the right angle
01:12:59.220 | to talk about this.
01:13:00.660 | It says, well, okay, let's say it's not out there.
01:13:04.260 | Maybe it's the Russians, the Chinese, or somebody else.
01:13:06.900 | We should know about this 'cause we damn sure
01:13:09.660 | know it's not us.
01:13:11.420 | So that, to me, is an important thing
01:13:16.040 | to finally be a little bit more open about the matter.
01:13:20.720 | But like I often say, I'm not looking for people
01:13:24.160 | to give me permission to do anything.
01:13:26.180 | I'm just gonna do the analysis myself with what I have.
01:13:29.220 | Avi Loeb has taken the same approach.
01:13:31.800 | He said, I'm not gonna wait for the government
01:13:33.680 | to give me telescopic information about technologies
01:13:38.440 | or things that might be even on our own solar system.
01:13:41.940 | I'm just gonna collect it myself.
01:13:44.180 | And that's the right way to do it, right?
01:13:46.460 | Don't wait for somebody else to give it to you.
01:13:48.780 | - It's also possible to inspire a large number of people
01:13:52.580 | to do a wider spread data collection.
01:13:55.580 | - Yes.
01:13:56.420 | - I mean, you yourself can't do a large enough
01:13:59.320 | data collection that would,
01:14:00.700 | if you're talking about anomalous events.
01:14:02.860 | - Right, right.
01:14:04.340 | - You should be collecting high resolution data
01:14:08.760 | about everything that's happening on Earth
01:14:11.040 | in terms of the kind of things that would indicate
01:14:16.040 | to you a strong signal that something weird happened here.
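(A toy illustration of what such wide-net monitoring could look like at its simplest; a hypothetical sketch, not any real project's detector. The idea is just to characterize the ordinary baseline so the anomalous stands out.)

```python
# Toy sketch of wide-net anomaly flagging: mark readings that sit far outside
# the trailing window's distribution. Purely illustrative; a real monitoring
# system would fuse many sensors and far smarter models.
import random
from statistics import mean, stdev

def flag_anomalies(readings, window=100, threshold=5.0):
    """Yield (index, value) where a reading is more than `threshold` sigmas
    from the mean of the preceding `window` samples."""
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            yield i, readings[i]

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(500)]
data[400] = 12.0  # inject one "spot off the graph"
print(list(flag_anomalies(data)))  # expect [(400, 12.0)]
```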
01:14:20.120 | And this is why governments can be good
01:14:24.400 | at funding large scale efforts.
01:14:26.240 | - Yes.
01:14:27.080 | - I mean, NASA and so on, working with SpaceX,
01:14:31.040 | with Blue Origin, fund capitalistic,
01:14:36.040 | sort of fund companies, fund company efforts
01:14:40.160 | to do huge moonshot projects.
01:14:42.300 | - Right.
01:14:43.140 | - And in the same way, do huge moonshot
01:14:45.300 | data collection efforts in terms of UFOs.
01:14:47.900 | I mean, we're not, it needs to be like 10X,
01:14:50.860 | like one or two orders of magnitude more funding.
01:14:53.620 | - Exactly.
01:14:54.460 | - To do this kind of thing.
01:14:55.280 | And I understand on the flip side of that,
01:14:57.540 | if you make it about what are the Russians,
01:14:59.840 | what are the Chinese doing,
01:15:02.420 | you know, make it a question of geopolitics,
01:15:05.440 | it gets touchy because now you're kind of taken away
01:15:09.100 | from the realm of science and--
01:15:11.820 | - Making it military.
01:15:12.740 | - Making it military.
01:15:13.740 | Some of the greatest, this is what makes me,
01:15:15.580 | as an engineer, makes me truly sad
01:15:18.420 | that some of the greatest engineering work ever done
01:15:21.720 | is by Lockheed Martin.
01:15:23.460 | And we will never know about it.
01:15:25.060 | - Yeah, I agree, I agree.
01:15:26.600 | I wish we were, it was different,
01:15:29.660 | but it's the world we live in.
01:15:31.960 | You know, but related to that UAP task force announcement
01:15:37.620 | that you just said, you know, the bill was passed
01:15:40.660 | in the Department of Defense
01:15:41.740 | and now it formally establishes an office
01:15:44.660 | to collate that information
01:15:47.420 | and also to be transparent about it.
01:15:50.420 | Money is now set aside, right?
01:15:52.860 | - What do you think of, just in case people don't know,
01:15:55.580 | the DOD established a new department to study UFOs
01:16:00.100 | called Airborne... naming, come on, but yes, the
01:16:03.540 | Airborne Object Identification
01:16:05.540 | and Management Synchronization Group.
01:16:08.100 | Do you know how to pronounce that?
01:16:09.020 | - No, I do not, no.
01:16:10.140 | - AOIMSG.
01:16:10.980 | - It's stupid and it needs to be renamed, but--
01:16:13.940 | - AOIMSG, AO, all right, is directed by
01:16:18.220 | the Undersecretary of Defense for Intelligence and Security.
01:16:22.460 | What do you make of this office?
01:16:24.220 | Are you hopeful about this office?
01:16:26.620 | - I think there's still a tug of war going on
01:16:28.500 | behind the scenes as to who's gonna control this,
01:16:31.380 | but I do know, though, that money has been set aside
01:16:35.260 | that will be used to make things more public, right,
01:16:40.260 | to start to get others involved.
01:16:45.340 | I'm involved with an effort to get other academics involved.
01:16:51.860 | - So you think that some of that money
01:16:53.540 | could be directed towards funding,
01:16:55.540 | maybe like groups like yours to do some research here.
01:16:59.180 | So they would be open to that, you think?
01:17:01.140 | - I hope so.
01:17:01.980 | I mean, nothing is set in stone yet,
01:17:04.100 | so, you know, and I'm not hiding anything
01:17:06.740 | 'cause I just don't know anything, right,
01:17:08.580 | but I do think that there will be public efforts.
01:17:13.580 | Now, there are being set up other private efforts
01:17:19.100 | to bring monies involved and to use that
01:17:21.980 | to leverage and get access
01:17:25.060 | to some of the internal resources as well.
01:17:28.220 | So what you're seeing is kind of an ecosystem building up
01:17:33.220 | in a positive sense of people
01:17:36.300 | who are willing to do the research.
01:17:39.500 | So, you know, before it would be,
01:17:41.740 | you couldn't even go to a scientist and ask them to help.
01:17:45.060 | Now, if there's money, as I said before,
01:17:48.380 | scientists are essentially capitalists.
01:17:50.220 | We go where the money is, you know,
01:17:51.900 | we go, I mean, the work that I've done,
01:17:53.420 | I did out of my own pocket
01:17:55.020 | and probably about 50, 60, $70,000 of money
01:17:59.500 | went into the paper we published out of my own pocket.
01:18:03.500 | And, you know, but the amount of money that needs to go in
01:18:07.740 | is in at least the few millions
01:18:09.780 | to do a proper analysis of these materials.
01:18:13.660 | The work I know that the Galileo Project is involved with,
01:18:18.300 | it's probably in the, you know,
01:18:20.420 | five to 10 million range to get stuff done.
01:18:23.380 | But that's actually a relatively modest amount of money
01:18:26.940 | to accomplish something that has been
01:18:29.500 | in the zeitgeist for decades.
01:18:32.400 | - I should also push back a little bit
01:18:36.300 | on something you probably will agree with.
01:18:37.900 | You said scientists are essentially capitalists.
01:18:40.980 | What I've noticed is there's certainly an influence of money
01:18:44.700 | but oftentimes when you're talking about basic research
01:18:47.140 | and basic science, the money is a little bit ambiguous
01:18:52.140 | to what direction you're doing the research in.
01:18:56.300 | And the scientists get really good at telling a narrative
01:19:00.900 | of like, yeah, yeah, yeah, we're fulfilling
01:19:03.460 | the purpose of this funding, but we're actually,
01:19:06.060 | they end up doing really what they're curious about.
01:19:08.620 | - Yes.
01:19:09.460 | - And of course you cannot deviate,
01:19:10.420 | like if you're getting funded to study penguins
01:19:13.020 | in Antarctica, you can't start building rockets,
01:19:15.340 | but probably you can because you'll convince some,
01:19:19.020 | you'll concoct a narrative saying rockets
01:19:21.500 | are really important for studying penguins in Antarctica.
01:19:23.980 | - Right.
01:19:24.820 | - I think that's actually, this is one thing
01:19:29.240 | I think people don't generally understand
01:19:31.740 | about the scientific mind is I don't know
01:19:34.920 | how capitalistic it is because if it was,
01:19:37.540 | they would start an effing company.
01:19:39.580 | - No, no, no, no, no.
01:19:40.420 | I mean, when I meant capitalist, I didn't mean in the,
01:19:43.740 | they'll start companies per se.
01:19:46.140 | I mean, we can only do the research where there's money.
01:19:50.020 | And so from, maybe it's a bad use of the term capitalist.
01:19:55.020 | But we will only do the research where there's money.
01:19:59.620 | I mean, why do most people work, many biologists,
01:20:04.540 | work in cancer research?
01:20:06.900 | Because there's a lot of money there.
01:20:08.260 | It's an important problem.
01:20:09.940 | But I might not have ever gotten involved in it
01:20:14.140 | if there wasn't money.
01:20:15.100 | I might've gone and I was gonna be a botanist
01:20:17.940 | when I was a kid.
01:20:20.860 | That's what I wanted to do.
01:20:22.260 | So having money available will bring people to bear.
01:20:28.180 | Now, another mistake that's often actually made,
01:20:31.100 | I think by the lay public about science
01:20:33.500 | is that people think that we're paid to do things.
01:20:36.760 | Just as you said, I get a research grant
01:20:39.380 | and luckily from the NIH,
01:20:40.940 | they give you a fair amount of latitude.
01:20:43.620 | I will go my own way and I'll find something,
01:20:45.780 | I might've proposed something,
01:20:47.740 | but I'll end up somewhere entirely different
01:20:50.060 | by the end of the project.
01:20:51.700 | And that's how good science is done.
01:20:52.980 | You follow the data, you follow the results.
01:20:56.280 | And so that's what I'm hoping can be done here.
01:21:01.660 | I think the worst kind of thing that could be done
01:21:05.180 | with this subject area is to put it inside another company
01:21:09.340 | where they have a set plan of what it is they're gonna do
01:21:13.140 | and the scientists either do what the executives
01:21:16.020 | tell them to do or not.
01:21:17.280 | That isn't how anything will really get discovered.
01:21:21.180 | Put it, get it out into the public,
01:21:24.200 | get open minds thinking about it
01:21:26.880 | and then publishing on it and doing the right kind of work.
01:21:29.940 | That's how real progress will be made with this.
01:21:33.780 | - Let's again put our sort of philosophical hats on.
01:21:37.420 | Do you think the US government or some other government
01:21:40.500 | is in possession of something of extraterrestrial origin
01:21:45.500 | that is far more impressive
01:21:49.420 | than anything we've seen in the public?
01:21:53.620 | - I've not seen anything personally,
01:21:56.340 | but if I believe the people who I don't think can lie, yes.
01:22:00.660 | - How does that make you feel in terms of the way
01:22:06.020 | government works, the way our human civilization works,
01:22:09.380 | that there might be things like that
01:22:11.580 | and they're not public?
01:22:13.700 | Is there a hopeful message for transparency that's possible?
01:22:17.300 | Like if you were in power, and I'm not saying president
01:22:21.820 | 'cause maybe the president is not the source of power here,
01:22:25.380 | would you release this information in some way or form?
01:22:30.560 | - Yes, if I were.
01:22:32.660 | I think it's something that can bring humanity together.
01:22:37.660 | I think that knowledge of this kind of thing
01:22:42.940 | to know that we are more alike than we are different
01:22:47.940 | in comparison to whatever this is,
01:22:50.560 | is a positive thing for us.
01:22:55.980 | And to know, I don't necessarily care
01:23:01.140 | that the government has been hiding it.
01:23:02.740 | And I think people who've been talking about
01:23:04.540 | what we should give government officials
01:23:07.700 | or whatever amnesty, I think that's probably the right answer.
01:23:11.740 | This isn't a time to look back and say,
01:23:14.540 | you did something wrong.
01:23:15.540 | You did whatever you did because that was the data
01:23:17.540 | you had available to you at the time
01:23:18.860 | and you had good reasons for doing it.
01:23:21.120 | Now, if your reasons were selfish,
01:23:23.900 | if your reasons were you wanted to do it
01:23:25.980 | because you wanted to monetize it yourself
01:23:30.620 | to your benefit but against that of others,
01:23:32.940 | then I think maybe there's something else that could be said.
01:23:35.280 | But an opportunity to get all this information out
01:23:39.660 | if I were in charge, I would try to do it.
01:23:43.420 | Now, I might be shown something though that says,
01:23:46.540 | hmm, there's a reason why you don't wanna
01:23:47.860 | let anybody know this.
01:23:49.340 | Maybe you don't want everybody having access
01:23:51.680 | to unlimited energy
01:23:54.580 | because maybe you might turn it into a bomb.
01:23:57.180 | - Or something that gives you hints
01:24:00.100 | that something like unlimited energy is possible
01:24:02.980 | but you haven't figured it out yet.
01:24:04.900 | And if you make it public,
01:24:06.620 | maybe some of the other governments
01:24:08.620 | you have tensions with will figure it out first.
01:24:11.380 | - Right.
01:24:12.220 | It's kind of an arms race going on, I think.
01:24:15.180 | - In all forms.
01:24:16.180 | And it makes me truly sad because it's obvious
01:24:19.620 | that, for example, the origins of the COVID virus,
01:24:25.620 | it's obvious to me that the Chinese government,
01:24:29.700 | whatever the origins are,
01:24:31.700 | is interested in not releasing information about it
01:24:36.020 | because it can only be bad for the Chinese government.
01:24:39.560 | And every government thinks like this.
01:24:43.120 | Every, actually, this has been a disappointment to me,
01:24:47.160 | talking to PR folks at companies.
01:24:51.740 | They're always nervous.
01:24:53.260 | They're always conservative in the sense like,
01:24:57.260 | well, if we release more stuff, it can only be bad.
01:25:02.140 | And then an Elon Musk character comes along
01:25:05.380 | who tweets ridiculous memes and doesn't give a fuck.
01:25:10.380 | And I've been encouraging CEOs,
01:25:12.520 | I've been encouraging people to be transparent.
01:25:14.940 | And of course, government, national security
01:25:17.460 | is really like another level.
01:25:20.260 | It's human lives at stake.
01:25:22.040 | But let's start at the lighter case
01:25:25.340 | of just releasing some of the awesome insights
01:25:28.300 | of how the sausage is made, the technology,
01:25:32.300 | and being transparent about it because it excites people.
01:25:35.660 | Like you said, it connects people.
01:25:40.260 | It inspires them.
01:25:42.520 | It's good for the brand.
01:25:44.160 | It's good for everybody.
01:25:45.180 | I honestly think this kind of idea
01:25:47.440 | that people will steal the information
01:25:49.420 | and use it against you
01:25:50.820 | is an idea that's not true, it's an idea of the 20th century.
01:25:55.820 | Like you said, some of the benefits of the social media,
01:25:59.700 | our social world is that transparency is beneficial.
01:26:04.020 | And I hope governments will learn that lesson.
01:26:06.840 | Of course, they're usually the last to learn such lessons.
01:26:09.940 | What do you make of Bob Lazar's story
01:26:13.580 | in terms of possession of aircraft?
01:26:15.340 | Do you believe him?
01:26:16.180 | - I don't believe in the Bob Lazar story,
01:26:18.560 | to be quite honest.
01:26:19.400 | I mean, Jeremy Corbell has done a great job interviewing him
01:26:24.400 | and has done some beautiful documentaries.
01:26:30.260 | I just don't, I don't know how to interpret it.
01:26:37.920 | And again, some of the people who I fraternize with
01:26:42.920 | think it's all rubbish.
01:26:47.460 | Yeah, but maybe he's right, but I don't know.
01:26:49.380 | I mean, the problem is,
01:26:51.660 | and this is a little bit different
01:26:54.740 | about how I approach the whole area than a lot of others.
01:26:57.660 | I'm less interested in going over old paperwork
01:27:02.680 | and all these old histories of who said what,
01:27:05.820 | the whole he said, she said of the history of UFOs.
01:27:09.600 | I'm a scientist.
01:27:11.840 | I worked on the brain area
01:27:16.180 | because it's something I can collect data on.
01:27:18.740 | I can go back to the same individual,
01:27:20.020 | collect their MRI again and redo it.
01:27:22.920 | I can hand that MRI to somebody else.
01:27:24.580 | They can analyze it.
01:27:26.020 | I can get materials.
01:27:27.240 | I can analyze them.
01:27:28.740 | I can get some of these skeletons.
01:27:30.100 | I won't touch any skeletons ever again,
01:27:32.300 | but I can analyze it
01:27:33.180 | and somebody else can reproduce the data.
01:27:35.440 | I mean, that's what I'm good at.
01:27:37.680 | And so, I'm not going to go into the whole,
01:27:42.020 | I'm not a historian.
01:27:43.620 | - Yeah, that's true, but there's a human side to it.
01:27:46.460 | Sometimes I think with these,
01:27:50.740 | because again, anomalous, rare events,
01:27:53.940 | some of the data is inextricably connected to humans,
01:27:57.580 | the observations.
01:27:58.740 | - Right.
01:28:00.060 | - I mean, I hope in the future
01:28:01.580 | that that sensory data will not be polluted
01:28:07.140 | by human subjectivity,
01:28:08.540 | but that's still powerful data,
01:28:11.660 | even direct observations.
01:28:13.540 | Like if you talk about pilots.
01:28:15.700 | And so, it's an interesting question to me
01:28:17.420 | whether Bob Lazar is telling the truth,
01:28:18.980 | whether he believes he's telling the truth too.
01:28:21.820 | And what also, what impact his story
01:28:25.820 | and stories like his have on the willingness
01:28:28.620 | of governments to be transparent and so on.
01:28:31.980 | So, you have to credit his story
01:28:36.220 | for captivating the imagination of people
01:28:38.860 | and getting the conversation going.
01:28:40.900 | - He's maintained his story for all these years
01:28:43.420 | with little to no change that I'm aware of.
01:28:45.940 | So, but there's so many other people
01:28:48.900 | who are, let's say, experts in that story.
01:28:51.700 | - Their gut, you accumulate a set of
01:28:56.820 | circumstantial evidence where your gut will say
01:29:02.420 | that somebody is not telling the truth.
01:29:04.860 | - Yeah.
01:29:05.700 | - You mentioned Avi Loeb.
01:29:08.100 | I forgot to ask you about Oumuamua.
01:29:12.340 | Because you've analyzed specimens here on Earth,
01:29:16.100 | what do you make of that one?
01:29:17.740 | And what do you make broadly of our efforts
01:29:19.900 | to look at rocks, essentially,
01:29:22.820 | or look at objects flying around in our solar system?
01:29:27.820 | Is that a valuable pursuit?
01:29:29.380 | Or maybe most of the stories can be,
01:29:33.100 | most of the fascinating things could be discovered
01:29:35.060 | here on Earth or on other nearby planets?
01:29:37.780 | - Just going to Oumuamua,
01:29:41.100 | I think Avi's insight is an interesting speculation.
01:29:46.100 | Like I was saying before,
01:29:49.220 | people can sometimes look at something
01:29:50.900 | and not see it for what it is.
01:29:52.420 | Many would just look at that and say,
01:29:55.340 | "Oh, it's an asteroid," and dismiss it.
01:29:58.060 | There was something odd about the data
01:30:00.740 | that Avi picked up on and said,
01:30:02.960 | "Well, here's an alternative explanation
01:30:04.960 | "that doesn't fit, that actually better fits the models
01:30:08.020 | "than it just being a rock."
01:30:10.220 | And to his credit, he just has ignored the critics
01:30:15.220 | because he believes the data is real
01:30:17.580 | and is using that then as a battering ram
01:30:20.740 | to go after other things.
01:30:22.180 | So I think that's great.
01:30:23.920 | - Yeah, what is his main conclusion?
01:30:27.460 | Does he say it could be of alien extraterrestrial origins?
01:30:32.460 | - Well, that's one of the things.
01:30:33.460 | I mean, he's explained how it could be a light sail.
01:30:37.980 | And a light sail is certainly within near human capabilities
01:30:42.860 | to make such a thing.
01:30:44.380 | I think Yuri Milner, he's a Russian billionaire.
01:30:48.900 | He's involved, I think, in a project
01:30:51.820 | to make light sails
01:30:54.380 | and launch them with laser power,
01:30:59.500 | essentially, towards Alpha Centauri.
01:31:01.900 | So it's something that humans could make.
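(The project alluded to here is presumably Breakthrough Starshot. As a hedged back-of-the-envelope, with illustrative numbers rather than the project's actual figures: radiation pressure on a perfect reflector gives a force F = 2P/c, so a gram-scale sail under a roughly 100 GW beam accelerates enormously.)

```python
# Back-of-the-envelope only: acceleration of an ideal, fully reflective
# light sail is a = 2P / (c * m). All figures below are illustrative.
C = 299_792_458.0  # speed of light, m/s

def sail_acceleration(laser_power_w, sail_mass_kg):
    """Acceleration of an ideal, fully reflective light sail."""
    return 2.0 * laser_power_w / (C * sail_mass_kg)

a = sail_acceleration(laser_power_w=100e9, sail_mass_kg=0.001)  # 100 GW on 1 g
print(f"{a:.2e} m/s^2")
# ~6.7e5 m/s^2: about a minute and a half of beam time reaches ~0.2c,
# ignoring relativistic effects, beam divergence, and reflectivity losses.
```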
01:31:07.760 | I think Avi's proposal is perfectly
01:31:10.060 | within the realm of possibility.
01:31:11.900 | I mean, sadly, the thing is now nearly
01:31:14.780 | out of our solar system.
01:31:15.980 | - Yes, I mean, to me, that's inspiring
01:31:17.800 | to do greater levels of data collection
01:31:21.460 | in our solar system, but also here on Earth.
01:31:23.900 | It just seems like we should be constantly collecting data
01:31:26.940 | because the tools of software that we're developing
01:31:29.780 | get better and better at dealing with huge amounts of data.
01:31:32.300 | It's changing the nature of science.
01:31:33.780 | I mean, collect all of the data.
01:31:36.380 | - Right, collect the data.
01:31:37.660 | I mean, the Galileo Project asked me
01:31:40.740 | over the weekend to join, and I did.
01:31:43.140 | So I'm not a specialist in any of the stuff
01:31:46.580 | that they're doing, but in looking at the list of people
01:31:51.060 | who are on there, there are really no biologists on there.
01:31:54.540 | So at some point, if my expertise is required for something.
01:31:58.860 | - What's the goal and the vision of the Galileo Project?
01:32:01.420 | - Better talk to Avi, but my understanding,
01:32:04.240 | and just actually looking at the bylaws,
01:32:07.620 | this morning, literally just got them,
01:32:09.520 | is number one, collect the data on UAP,
01:32:13.900 | and number two, collect data on local,
01:32:18.820 | potentially local technological artifacts.
01:32:22.260 | - I need to look into this.
01:32:23.420 | This is fascinating.
01:32:24.660 | And Avi is heading the Galileo Project?
01:32:26.980 | - Yeah, have you spoken to him?
01:32:28.700 | - On this podcast, yes.
01:32:30.300 | I believe it was before he headed it.
01:32:32.460 | Is this a new creation?
01:32:33.700 | - Yeah, the Galileo Project was,
01:32:35.780 | I think it's about six or seven months old now.
01:32:38.220 | - That's amazing.
01:32:39.140 | And he's getting a group of scientists together
01:32:41.020 | to try to-- - Oh yeah, about 100.
01:32:41.860 | Oh, it's-- - That's awesome.
01:32:43.060 | - Actually, I was looking at some of their stuff
01:32:46.380 | over the weekend.
01:32:47.500 | I'm shocked at the level of organization
01:32:49.360 | that they've already got put together.
01:32:50.780 | - That's amazing.
01:32:51.620 | - It looks like a moonshot project.
01:32:53.660 | I mean, I've been involved with a lot of NIH,
01:32:56.020 | large NIH projects, which involve a lot of people
01:32:59.420 | in coordination, and they're putting it together.
01:33:03.760 | (sighs)
01:33:05.920 | - So, you're extremely well-published
01:33:10.880 | in a lot of the fields we began this conversation with.
01:33:15.380 | So you're a legit scientist. (laughs)
01:33:20.520 | But yet, you're keeping an open mind to a lot of ideas
01:33:23.320 | that maybe require you to take a leap
01:33:28.320 | outside of the conventional.
01:33:33.060 | So what advice would you give to young people today
01:33:36.320 | that are in high school or in college
01:33:40.480 | that are dreaming of having impact in science
01:33:44.980 | or maybe in whatever career path
01:33:47.400 | that goes outside of the conventional,
01:33:49.180 | that really does something new?
01:33:52.620 | - If you believe in something,
01:33:54.920 | you believe that an idea is valuable,
01:33:59.920 | or you have an approach to something,
01:34:02.060 | don't let others shame you into not doing it.
01:34:05.560 | As I've said, shame is a societal control device
01:34:11.000 | to get other people to do what they want you to do
01:34:14.360 | rather than what you wanna do.
01:34:16.520 | So shame sometimes is good to stop you
01:34:18.520 | from doing something unethical or wrong,
01:34:21.160 | but shame also is something
01:34:22.700 | that is circumscribing your environment.
01:34:27.120 | I've never let people who've told me,
01:34:29.680 | you know, you shouldn't do that,
01:34:32.000 | line of science, you should be ashamed of yourself
01:34:34.120 | for even thinking that, give me a break.
01:34:36.120 | Why is it wrong to ask questions about this area?
01:34:41.160 | What's wrong with asking the question?
01:34:43.520 | Frankly, you're the person who's wrong
01:34:46.200 | for trying to stop these questions.
01:34:48.760 | You're the person who's almost acting like a cultist.
01:34:52.960 | You basically have closed your mind
01:34:55.360 | to what the possibilities are.
01:34:57.560 | And if I'm not hurting anybody,
01:34:59.800 | and if it could lead to an advance,
01:35:01.480 | and if it's my time, why does it bother you?
01:35:04.680 | I mean, I had a very well-known scientist once tell me
01:35:08.040 | that I was gonna hurt my career talking about this.
01:35:11.140 | If anything, it's enhanced my career.
01:35:13.840 | - I have a couple of questions on this.
01:35:15.000 | So first of all, just a small comment on that.
01:35:18.400 | I've realized that it feels like a lot of the progress
01:35:22.400 | in science is done by people pursuing an idea
01:35:26.040 | that another senior faculty would probably say,
01:35:29.080 | this is going to hurt your career.
01:35:31.400 | I think it's actually a pretty good indicator
01:35:33.760 | that there's something interesting
01:35:35.000 | when a senior, wise person tells you
01:35:39.240 | this is gonna hurt your career.
01:35:40.600 | I think that's just the one. As a small
01:35:43.560 | piece of advice to young people:
01:35:45.320 | if somebody senior tells you this is gonna hurt your career,
01:35:48.560 | think twice about taking their advice.
01:35:50.320 | - Yeah.
01:35:51.640 | No, I mean, I think that's the primary thing.
01:35:54.240 | And the other, I tell my own students,
01:35:57.680 | I have a lab of about 20, 30 people,
01:35:59.720 | and it's been that big since 1992.
01:36:03.320 | People come and go.
01:36:04.420 | It's not the data that falls in line that's so interesting.
01:36:11.520 | It's the spot off the graph that you wanna understand.
01:36:17.440 | When something is way off the graph,
01:36:23.480 | that's the interesting thing,
01:36:25.000 | because that's usually where discovery is.
01:36:28.160 | And the number of times that I've stopped people in my lab
01:36:31.280 | and said, wait a second, go back a few slides.
01:36:33.840 | What was that?
01:36:34.760 | And then it ended up being something interesting
01:36:38.160 | that made their careers.
01:36:40.440 | I could count on a few hands.
01:36:43.440 | - Yeah, get excited by the extraordinary
01:36:46.880 | that's outside of the thing that you've done in the past.
01:36:52.840 | Just on a personal psychological level,
01:36:55.200 | is there, I'm sure at Stanford,
01:37:00.480 | I'm sure in you exploring some of these ideas,
01:37:03.320 | there's pressure.
01:37:04.920 | How do you not give in to the pressure?
01:37:09.720 | How do you not give in to the people that say,
01:37:12.600 | like, that push you away from these topics?
01:37:15.680 | What would you say is shame?
01:37:19.480 | - I just point to my successes.
01:37:22.080 | I say, you're the ones who told me not to start companies
01:37:25.680 | all this time ago.
01:37:26.800 | And now you're the one coming to me for advice
01:37:30.400 | for how to start a company.
01:37:31.760 | But from the scientific area,
01:37:37.600 | it's you're wanting to take something off the table
01:37:42.600 | that might be an explanation.
01:37:44.740 | How is that the scientific method?
01:37:50.880 | I reverse shame them.
01:37:52.160 | - Reverse shame them.
01:37:53.200 | So purely with reason through conversation,
01:37:56.080 | you're able to do that.
01:37:56.920 | So it doesn't feel, 'cause to me it would just feel lonely.
01:37:59.280 | There's a community.
01:38:01.360 | There's a community of science.
01:38:02.720 | And when you're working on something
01:38:05.120 | that's outside a particular conventional way of thinking,
01:38:10.120 | it can be lonely.
01:38:11.120 | I mean, there's, in the AI field,
01:38:14.320 | if you were working on neural networks in the '90s,
01:38:17.200 | it could be lonely.
01:38:18.720 | I have met some of the most fascinating people ever
01:38:21.680 | that had I stayed the conventional track,
01:38:23.840 | I would never have met.
01:38:25.920 | I mean, truly.
01:38:26.960 | Brilliant people because of this.
01:38:30.980 | So it is, for those worried about,
01:38:35.980 | well, should I step outside of my comfort zone?
01:38:40.320 | You're gonna meet some really interesting people.
01:38:42.720 | And because I'm open about this area,
01:38:47.080 | I'll go and give a talk in Boston, Harvard, or MIT.
01:38:51.240 | And at dinner, inevitably, this subject comes up.
01:38:56.240 | And inevitably, somebody else at the table will admit
01:39:00.440 | both that they're interested or that they've seen something.
01:39:03.460 | And suddenly the whole tone of the conversation changes.
01:39:06.960 | It's kind of like there's safety in numbers.
01:39:09.880 | And then, or I've had people come to me afterwards,
01:39:13.480 | after dinner and say, "Hey, I don't talk about this openly."
01:39:18.200 | So the number of scientists who know
01:39:22.880 | that there's something else going on
01:39:26.040 | is much larger than the scientific community
01:39:29.920 | would like to think.
01:39:31.940 | - That's a really powerful one,
01:39:33.640 | which is I don't talk about this openly,
01:39:37.600 | but here's what I believe.
01:39:40.000 | And you'd be surprised how many people speak like this
01:39:43.280 | and hold those beliefs.
01:39:44.640 | And I am optimistic about social media
01:39:47.760 | and a more connected world to reveal more and more.
01:39:50.680 | Like us not to have these two personalities
01:39:52.840 | where like this public and private one.
01:39:55.800 | We've mentioned the big questions
01:39:57.700 | of the origins of the universe.
01:40:00.060 | What do you think is the meaning of this whole thing?
01:40:02.760 | For us humans, our human existence here on Earth,
01:40:07.520 | or just at the individual level of a human life?
01:40:10.700 | What, Garry, is the meaning of life?
01:40:13.960 | - I think that what we're going through today
01:40:19.720 | with this realization,
01:40:21.560 | it's kind of like you've lived on an island your whole life
01:40:28.340 | and you've looked across the ocean
01:40:31.480 | and you've never imagined there was another island
01:40:33.680 | with anybody else on it.
01:40:35.240 | And then suddenly a ship with sails shows up.
01:40:39.920 | You don't understand it,
01:40:41.180 | but you realize that suddenly your world
01:40:44.060 | just got a lot bigger.
01:40:45.740 | I think we're in one of those moments right now
01:40:48.220 | that our world view, our galactic view is opening, right?
01:40:53.220 | To something a little bit bigger.
01:40:55.780 | And not just that there might be somebody else,
01:40:59.800 | but that there's something else.
01:41:02.660 | And what it is, is yet to be understood.
01:41:06.660 | And the fact that it isn't understood to me
01:41:09.480 | is what's exciting,
01:41:11.120 | because I can fill it with my dreams.
01:41:14.720 | - And with this discovery, our world
01:41:19.520 | is about to get a lot more humbling
01:41:24.740 | and a lot more fascinating once we look out
01:41:27.740 | and realize we were on an island all along.
01:41:30.500 | - It makes us both smaller but larger
01:41:32.420 | at the same time to me.
01:41:34.180 | I can look outside at the stars
01:41:39.260 | and think and imagine what else might be out there.
01:41:42.700 | And although I know that I will never see it all,
01:41:46.800 | it excites me to know that it's there.
01:41:48.700 | - Well, Garry, both to respect your time
01:41:53.980 | and also because at 12 I turn into a princess,
01:41:56.880 | let me just say thank you for doing everything you're doing
01:42:02.940 | as a great scientist,
01:42:04.580 | as a person willing to reject the conventional.
01:42:08.460 | And thank you for spending your extremely valuable time
01:42:10.820 | with me today.
01:42:11.660 | Thanks for talking.
01:42:12.480 | - Thanks so much.
01:42:13.320 | It was great talking.
01:42:14.900 | - Thanks for listening to this conversation
01:42:16.340 | with Garry Nolan.
01:42:17.420 | To support this podcast,
01:42:18.660 | please check out our sponsors in the description.
01:42:21.500 | And now let me leave you with some words
01:42:23.780 | from Stanislaw Lem in Solaris.
01:42:27.340 | How do you expect to communicate with the ocean
01:42:30.820 | when we can't even understand one another?
01:42:33.680 | Thanks for listening and hope to see you next time.
01:42:37.900 | (upbeat music)