
John Abramson: Big Pharma | Lex Fridman Podcast #263


Chapters

0:00 Introduction
7:52 Biggest problem with big pharma
16:54 Advertising
36:45 Corruption
53:01 Pfizer vaccine data
61:53 Vaccine profits
72:48 Censorship
93:34 FDA
102:09 NIH
111:06 Live longer
114:36 Medications
123:00 Doctors
127:07 Advice for young people
128:03 Big pharma's influence
132:47 Mortality
134:38 Meaning of life


00:00:00.000 | The jury found Pfizer guilty of fraud
00:00:02.920 | and racketeering violations.
00:00:04.600 | - How does Big Pharma affect your mind?
00:00:06.520 | - Everyone's allowed their own opinion.
00:00:08.800 | I don't think everyone's allowed their own scientific facts.
00:00:11.920 | - Does Pfizer play by the rules?
00:00:13.800 | - Pfizer isn't battling the FDA.
00:00:16.440 | Pfizer has joined the FDA.
00:00:18.760 | - The following is a conversation with John Abramson,
00:00:24.320 | faculty at Harvard Medical School,
00:00:26.520 | a family physician for over two decades,
00:00:29.160 | and author of the new book "Sickening,"
00:00:32.120 | about how Big Pharma broke American healthcare
00:00:35.000 | and how we can fix it.
00:00:37.360 | This conversation with John Abramson
00:00:40.200 | is a critical exploration of the pharmaceutical industry.
00:00:43.680 | I wanted to talk to John
00:00:45.400 | in order to provide a countervailing perspective
00:00:48.160 | to the one expressed in my podcast episode
00:00:50.720 | with the CEO of Pfizer, Albert Bourla.
00:00:55.080 | And here, please allow me to say a few additional words
00:00:58.920 | about this episode with the Pfizer CEO,
00:01:01.880 | and in general, about why I do these conversations
00:01:04.920 | and how I approach them.
00:01:06.800 | If this is not interesting to you, please skip ahead.
00:01:10.520 | What do I hope to do with this podcast?
00:01:13.080 | I want to understand human nature,
00:01:15.600 | the best and the worst of it.
00:01:18.040 | I want to understand how power, money,
00:01:19.840 | and fame changes people.
00:01:21.840 | I want to understand why atrocities are committed
00:01:24.600 | by crowds that believe they're doing good.
00:01:27.240 | All this, ultimately,
00:01:29.280 | because I want to understand
00:01:30.720 | how we can build a better world together,
00:01:33.240 | to find hope for the future,
00:01:35.400 | and to rediscover each time
00:01:38.520 | through the exploration of ideas
00:01:40.880 | just how beautiful this life is,
00:01:43.360 | this, our human civilization,
00:01:45.360 | in all of its full complexity,
00:01:47.640 | the forces of good and evil,
00:01:49.280 | of war and peace, of hate and love.
00:01:52.280 | I don't think I can do this with a heart and mind
00:01:55.560 | that is not open, fragile,
00:01:57.320 | and willing to empathize with all human beings,
00:02:00.920 | even those in the darkest corners of our world.
00:02:03.400 | To attack is easy.
00:02:06.760 | To understand is hard.
00:02:09.520 | And I choose the hard path.
00:02:11.800 | I have learned over the past few months
00:02:13.840 | that this path involves me getting more and more attacked
00:02:17.360 | from all sides.
00:02:19.000 | I will get attacked when I host people
00:02:21.600 | like Jay Bhattacharya or Francis Collins,
00:02:24.480 | Jamie Metzl or Vincent Racaniello,
00:02:28.120 | when I stand for my friend Joe Rogan,
00:02:31.840 | when I host tech leaders like Mark Zuckerberg,
00:02:34.720 | Elon Musk, and others,
00:02:36.400 | when I eventually talk to Vladimir Putin,
00:02:39.560 | Barack Obama, and other figures
00:02:41.960 | that have turned the tides of history.
00:02:43.880 | I have been and I will get called stupid, naive, weak,
00:02:49.920 | and I will take these words with respect,
00:02:53.640 | humility, and love, and I will get better.
00:02:57.200 | I will listen, think, learn, and improve.
00:03:00.680 | One thing I can promise is there's no amount of money
00:03:04.480 | or fame that can buy my opinion
00:03:06.880 | or make me go against my principles.
00:03:09.240 | There's no amount of pressure that can break my integrity.
00:03:13.480 | There's nothing in this world I need
00:03:16.120 | that I don't already have.
00:03:18.520 | Life itself is the fundamental gift.
00:03:21.400 | Everything else is just a bonus.
00:03:24.240 | That is freedom.
00:03:26.440 | That is happiness.
00:03:28.440 | If I die today, I will die a happy man.
00:03:31.480 | Now, a few comments about my approach
00:03:35.720 | and lessons learned from the Albert Bourla conversation.
00:03:39.360 | The goal was to reveal as much as I could
00:03:41.640 | about the human being before me,
00:03:43.680 | and to give him the opportunity to contemplate
00:03:46.720 | in long form the complexities of his role,
00:03:49.680 | including the tension between making money
00:03:52.960 | and helping people,
00:03:54.400 | the corruption that so often permeates human institutions,
00:03:57.960 | the crafting of narratives through advertisements, and so on.
00:04:02.160 | I only had one hour,
00:04:03.840 | and so this wasn't the time to address these issues deeply,
00:04:07.240 | but to show if Albert struggled with them
00:04:09.960 | in the privacy of his own mind,
00:04:12.160 | and if he would let down the veil of political speak
00:04:16.040 | for a time to let me connect with a man
00:04:19.160 | who decades ago chose to become a veterinarian,
00:04:22.320 | who wanted to help lessen the amount of suffering in the world.
00:04:26.080 | I had no pressure placed on me.
00:04:28.160 | There were no rules.
00:04:29.560 | The questions I was asking were all mine
00:04:31.920 | and not seen by Pfizer folks.
00:04:33.920 | I had no care whether I ever talked to another CEO again.
00:04:38.480 | None of this was part of the calculation
00:04:41.600 | in my limited brain computer.
00:04:43.920 | I didn't want to grill him
00:04:45.720 | the way politicians grill CEOs in Congress.
00:04:48.680 | I thought that this approach was easy, self-serving, dehumanizing,
00:04:53.680 | and it reveals nothing.
00:04:56.040 | I wanted to reveal the genuine intellectual struggle,
00:04:58.840 | vision, and motivation of a human being,
00:05:01.200 | and if that fails,
00:05:02.480 | I trusted the listener to draw their own conclusion and insights
00:05:06.640 | from the result,
00:05:07.920 | whether it's the words spoken,
00:05:09.880 | or the words left unspoken,
00:05:11.880 | or simply the silence.
00:05:14.280 | And that's just it.
00:05:15.640 | I fundamentally trust the intelligence of the listener.
00:05:21.720 | In fact, if I criticize the person too hard
00:05:24.680 | or celebrate the person too much,
00:05:26.640 | I feel I fail to give the listener a picture of the human being
00:05:30.640 | that is uncontaminated by my opinion
00:05:33.760 | or the opinion of the crowd.
00:05:36.320 | I trust that you have the fortitude and the courage
00:05:39.120 | to use your own mind,
00:05:40.920 | to empathize, and to think.
00:05:44.120 | Two practical lessons I took away.
00:05:46.160 | First, I will more strongly push
00:05:48.320 | for longer conversations of three, four, or more hours
00:05:51.760 | versus just one hour.
00:05:53.240 | 60 minutes is too short for the guest to relax
00:05:55.800 | and to think slowly and deeply,
00:05:58.080 | and for me to ask many follow-up questions
00:06:00.680 | or follow interesting tangents.
00:06:02.760 | Ultimately, I think it's in the interest of everyone,
00:06:05.680 | including the guests,
00:06:06.920 | that we talk in true long form for many hours.
00:06:11.400 | Second, these conversations with leaders
00:06:13.640 | can be aided by further conversations
00:06:15.960 | with people who wrote books about those leaders
00:06:18.560 | or their industries,
00:06:20.240 | those that can steel man each perspective
00:06:22.240 | and attempt to give an objective analysis.
00:06:24.960 | I think of Teddy Roosevelt's speech
00:06:26.640 | about the man in the arena.
00:06:28.320 | I want to talk to both the men and women in the arena
00:06:32.200 | and the critics and the supporters in the stands.
00:06:35.920 | For the former, I lean toward wanting to understand
00:06:38.680 | one human being's struggle with the ideas.
00:06:43.200 | For the latter, I lean towards
00:06:44.960 | understanding the ideas themselves.
00:06:48.200 | That's why I wanted to have this conversation
00:06:50.120 | with John Abramson,
00:06:51.600 | who is an outspoken critic of the pharmaceutical industry.
00:06:55.400 | I hope it helps add context and depth
00:06:58.640 | to the conversation I had with the Pfizer CEO.
00:07:02.120 | In the end, I may do worse than I could have or should have.
00:07:06.520 | Always, I will listen to the criticisms without ego,
00:07:09.920 | and I promise I will work hard to improve.
00:07:14.440 | But let me say finally that cynicism is easy.
00:07:19.840 | Optimism, true optimism, is hard.
00:07:24.320 | It is the belief that we can and we will build a better world
00:07:29.760 | and that we can only do it together.
00:07:32.520 | This is the fight worth fighting.
00:07:34.840 | So here we go.
00:07:36.320 | Once more into the breach, dear friends.
00:07:39.000 | I love you all.
00:07:41.480 | This is the Lex Fridman Podcast.
00:07:43.720 | To support it, please check out our sponsors in the description.
00:07:47.160 | And now, here's my conversation with John Abramson.
00:07:52.200 | You're faculty at Harvard Medical School,
00:07:55.040 | you're a family physician for over two decades,
00:07:57.640 | rated one of the best family physicians in Massachusetts.
00:08:00.960 | You wrote the book "Overdosed America"
00:08:03.240 | and the new book coming out now called "Sickening,"
00:08:07.200 | about how Big Pharma broke American healthcare,
00:08:10.000 | including science and research, and how we can fix it.
00:08:14.640 | First question, what is the biggest problem with Big Pharma
00:08:18.600 | that if fixed would be the most impactful?
00:08:21.720 | So if you can snap your fingers and fix one thing,
00:08:24.800 | what would be the most impactful, you think?
00:08:26.640 | - The biggest problem is the way they determine the content,
00:08:34.240 | the accuracy, and the completeness
00:08:37.480 | of what doctors believe to be the full range of knowledge
00:08:42.400 | that they need to best take care of their patients.
00:08:45.880 | So that with the knowledge having been taken over
00:08:51.360 | by the commercial interests,
00:08:52.720 | primarily the pharmaceutical industry,
00:08:55.880 | the purpose of that knowledge is to maximize the profits
00:08:59.400 | that get returned to investors and shareholders,
00:09:03.200 | and not to optimize the health of the American people.
00:09:07.120 | So rebalancing that equation would be the most important
00:09:11.360 | thing to do to get our healthcare back
00:09:14.640 | aimed in the right direction.
00:09:16.360 | - Okay, so there's a tension between helping people
00:09:20.600 | and making money.
00:09:22.160 | So if we look at particularly the task of helping people
00:09:26.280 | in medicine, in healthcare, is it possible
00:09:30.480 | if money is the primary sort of mechanism
00:09:35.320 | by which you achieve that as a motivator,
00:09:38.200 | is it possible to get that right?
00:09:39.720 | - I think it is, Lex, but I think it is not possible
00:09:43.040 | without guardrails that maintain the integrity
00:09:46.640 | and the balance of the knowledge.
00:09:48.640 | Without those guardrails, it's like trying to play
00:09:51.920 | a professional basketball game without referees
00:09:54.800 | and having players call their own fouls.
00:09:57.520 | But the players are paid to win,
00:09:59.840 | and you can't count on them to call their own fouls.
00:10:02.320 | So we have referees who are in charge.
00:10:05.000 | We don't have those referees in American healthcare.
00:10:08.120 | That's the biggest way that American healthcare
00:10:13.120 | is distinguished from healthcare in other wealthy nations.
00:10:17.440 | - So okay, so you mentioned Milton Friedman,
00:10:19.640 | and you mentioned his book called "Capitalism and Freedom."
00:10:24.160 | He writes that there are only three legitimate functions
00:10:27.120 | of government to preserve law and order,
00:10:30.080 | to enforce private contracts,
00:10:31.800 | and to ensure that private markets work.
00:10:35.760 | You said that that was a radical idea at the time,
00:10:40.120 | but we're failing on all three.
00:10:41.800 | How are we failing?
00:10:43.680 | And also maybe the bigger picture is,
00:10:46.680 | what are the strengths and weaknesses of capitalism
00:10:49.640 | when it comes to medicine and healthcare?
00:10:51.840 | - Can we separate those out?
00:10:53.120 | 'Cause those are two huge questions.
00:10:55.200 | So how we're failing on all three,
00:10:58.080 | and these are the minimal functions
00:11:01.240 | that our guru of free market capitalism said
00:11:06.120 | the government should perform.
00:11:07.600 | So this is the absolute baseline.
00:11:10.100 | On preserving law and order,
00:11:13.720 | the drug companies routinely violate the law
00:11:17.440 | in terms of their marketing,
00:11:20.160 | and in terms of their,
00:11:25.240 | presentation of the results of their trials.
00:11:29.200 | I know this because I was an expert in litigation
00:11:32.960 | for about 10 years.
00:11:34.660 | I presented some of what I learned in civil litigation
00:11:40.040 | to the FBI and the Department of Justice,
00:11:42.600 | and that case led to the biggest criminal fine
00:11:46.360 | in US history as of 2009.
00:11:49.040 | And I testified in a federal trial in 2010,
00:11:55.760 | and the jury found Pfizer guilty of fraud
00:12:00.040 | and racketeering violations.
00:12:02.400 | In terms of violating the law, it's a routine occurrence.
00:12:07.400 | The drug companies have paid $38 billion worth of fines
00:12:10.840 | from I think 1991 to 2017.
00:12:14.520 | It's never been enough to stop the misrepresentation
00:12:20.800 | of their data.
00:12:23.240 | And rarely are the fines greater
00:12:25.680 | than the profits that were made.
00:12:27.320 | See, executives have not gone to jail
00:12:31.880 | for misrepresenting data that have involved
00:12:36.460 | even tens of thousands of deaths
00:12:38.480 | in the case of Vioxx, OxyContin as well.
00:12:42.160 | And when companies plead guilty to felonies,
00:12:45.560 | which is not an unusual occurrence,
00:12:48.400 | the government usually allows the companies,
00:12:51.280 | the parent companies to allow subsidiaries
00:12:54.600 | to take the plea so that they are not one step closer
00:12:58.920 | to getting debarred from Medicare,
00:13:01.280 | not being able to participate in Medicare.
00:13:03.400 | So in that sense, there is a mechanism
00:13:10.000 | that is appearing to impose law and order
00:13:15.040 | on drug company behavior, but it's clearly not enough.
00:13:18.180 | It's not working.
00:13:19.520 | - Can you actually speak to human nature here?
00:13:24.520 | Are people corrupt?
00:13:26.320 | Are people malevolent?
00:13:28.400 | Are people ignorant that work at the low level
00:13:32.880 | and at the high level at Pfizer, for example,
00:13:36.480 | at big pharma companies?
00:13:38.640 | How is this possible?
00:13:40.440 | So I believe, just on a small tangent,
00:13:43.280 | that most people are good.
00:13:45.240 | And I actually believe if you join big pharma,
00:13:48.920 | so a company like Pfizer,
00:13:51.280 | your life trajectory often involves dreaming
00:13:54.920 | and wanting and enjoying helping people.
00:13:58.920 | - Yes.
00:13:59.760 | - And so, and then we look at the outcomes
00:14:03.440 | that you're describing and it looks,
00:14:07.040 | and that's why the narrative takes hold,
00:14:09.400 | that Pfizer CEO Albert Bourla, who I talked to,
00:14:13.180 | is malevolent.
00:14:15.440 | The sense is like these companies are evil.
00:14:19.520 | So if the different parts, the people, are good
00:14:24.520 | and they want to do good, how are we getting these outcomes?
00:14:27.360 | - Yeah, I think it has to do with the cultural milieu
00:14:32.360 | that this is unfolding in.
00:14:35.320 | And we need to look at sociology to understand this,
00:14:41.440 | that when the cultural milieu is set up
00:14:46.440 | to maximize the returns on investment for shareholders
00:14:53.680 | and other venture capitalists and hedge funds and so forth,
00:14:57.520 | when that defines the culture
00:15:00.640 | and the higher up you are in the corporation,
00:15:04.240 | the more you're in on the game of getting rewarded
00:15:10.080 | for maximizing the profits of the investors.
00:15:13.040 | That's the culture they live in.
00:15:14.880 | And it becomes normative behavior to do things with science
00:15:20.760 | that look normal in that environment
00:15:26.900 | and are shared values within that environment
00:15:29.980 | by good people whose self-evaluation becomes modified
00:15:34.900 | by the goals that are shared by the people around them.
00:15:39.060 | And within that milieu, you have one set of standards,
00:15:44.060 | and then the rest of good American people
00:15:48.680 | have the expectation that the drug companies
00:15:50.720 | are trying to make money, but that they're playing by rules
00:15:55.280 | that aren't part of the insider milieu.
00:15:58.480 | - That's fascinating.
00:16:00.440 | The game they're playing modifies the culture
00:16:06.280 | inside the meetings, inside the rooms, day to day,
00:16:09.440 | that there's a bubble that forms.
00:16:12.420 | Like we're all in bubbles of different sizes.
00:16:15.660 | And that bubble allows you to drift
00:16:18.340 | in terms of what you see as ethical and unethical,
00:16:23.080 | because you see the game as just part of the game.
00:16:28.500 | So marketing is just part of the game.
00:16:30.500 | Paying the fines is just part of the game of science.
00:16:36.120 | - Yeah, and without guardrails,
00:16:38.240 | it becomes even more part of the game.
00:16:42.200 | You keep moving in that direction
00:16:44.660 | if you're not bumping up against guardrails.
00:16:48.180 | And I think that's how we've gotten
00:16:49.900 | to the extreme situation we're in now.
00:16:52.220 | - So like I mentioned, I spoke with Pfizer CEO,
00:16:57.200 | Albert Bourla, and I'd like to raise with you
00:17:00.980 | some of the concerns I raised with him.
00:17:03.820 | So one, you already mentioned,
00:17:06.160 | I raised the concern that Pfizer's engaged
00:17:08.280 | in aggressive advertising campaigns.
00:17:11.220 | As you can imagine, he said no.
00:17:14.240 | What do you think?
00:17:16.540 | - I think you're both right.
00:17:20.320 | I think that the, I agree with you,
00:17:23.700 | that the aggressive advertising campaigns
00:17:26.280 | do not add value to society.
00:17:30.560 | And I agree with him that they're, for the most part, legal,
00:17:34.980 | and it's the way the game is played.
00:17:36.980 | - Right, so sorry to interrupt,
00:17:38.620 | but oftentimes his responses are,
00:17:42.300 | especially now, he's been CEO
00:17:46.320 | for only like two years, three years,
00:17:48.740 | he says Pfizer was a different company,
00:17:50.500 | we've made mistakes in the past.
00:17:53.840 | We don't make mistakes anymore.
00:17:56.340 | That there's rules, and we play by the rules.
00:18:00.340 | So like, with every concern raised,
00:18:02.760 | there's very, very strict rules, as he says.
00:18:06.020 | In fact, he says sometimes way too strict,
00:18:08.500 | and we play by them.
00:18:10.180 | And so in that sense, advertisement,
00:18:12.180 | it doesn't seem like it's too aggressive,
00:18:14.320 | because it's playing by the rules.
00:18:16.020 | And relative to the other, again, it's the game,
00:18:19.800 | relative to the other companies,
00:18:22.140 | it's actually not that aggressive.
00:18:23.880 | Relative to the other big pharma companies.
00:18:26.540 | - Yes, yes.
00:18:27.500 | I hope we can quickly get back to whether or not
00:18:30.300 | they're playing by the rules, but in general.
00:18:32.780 | But let's just look at the question
00:18:34.380 | of advertising specifically.
00:18:36.660 | I think that's a good example of what it looks like
00:18:39.620 | from within that culture, and from outside that culture.
00:18:43.140 | He's saying that we follow the law on our advertising.
00:18:49.780 | We state the side effects,
00:18:51.540 | and we state the FDA approved indications,
00:18:53.940 | and we do what the law says we have to do for advertising.
00:18:57.700 | And I have not, I've not been an expert in litigation
00:19:01.780 | for a few years, and I don't know what's going on currently,
00:19:04.980 | but let's take him at his word.
00:19:07.060 | It could be true.
00:19:08.020 | It might not be, but it could be.
00:19:09.940 | But if that's true, in his world, in his culture,
00:19:14.940 | that's ethical business behavior.
00:19:17.280 | From a common sense person's point of view,
00:19:22.560 | a drug company paying highly skilled media folks
00:19:27.220 | to take the information about the drug
00:19:30.420 | and create the illusion, the emotional impact,
00:19:34.380 | and the takeaway message for viewers of advertisements
00:19:38.000 | that grossly exaggerate the benefit of the drug
00:19:41.200 | and minimize the harms, it's sociopathic behavior
00:19:45.380 | to have viewers of ads leave the ad
00:19:49.960 | with an unrealistic impression
00:19:52.980 | of the benefits and harms of the drug.
00:19:56.220 | And yet, he's playing by the rules.
00:19:58.900 | He's doing his job as CEO
00:20:01.300 | to maximize the effect of his advertising.
00:20:04.580 | And if he doesn't do it, this is a key point,
00:20:07.620 | if he doesn't do it, he'll get fired and the next guy will.
00:20:11.940 | - So the people that survive in the company,
00:20:13.620 | the people that get raises in the company
00:20:16.700 | and move up in the company
00:20:17.780 | are the ones that play by the rules,
00:20:19.220 | and that's how the game solidifies itself.
00:20:21.540 | But the game is within the bounds of the law.
00:20:24.380 | Sometimes, most of the time, not always.
00:20:27.260 | - We'll return to that question.
00:20:29.220 | I'm actually more concerned
00:20:31.660 | about the effect of advertisement
00:20:34.340 | in a kind of much larger scale
00:20:39.340 | on the people that are getting funded by the advertisement
00:20:44.520 | in self-censorship, just like more subtle,
00:20:47.500 | more passive pressure to not say anything negative.
00:20:52.980 | Because I've seen this and I've been saddened by it,
00:20:57.980 | that people sacrifice integrity in small ways
00:21:03.140 | when they're being funded by a particular company.
00:21:05.720 | They don't see themselves as doing so,
00:21:09.460 | but you could just clearly see that the space of opinions
00:21:12.880 | that they're willing to engage in
00:21:15.220 | or a space of ideas they're willing to play with
00:21:18.460 | is one that doesn't include negative,
00:21:22.460 | anything that could possibly be negative about the company.
00:21:25.580 | They just choose not to, 'cause why?
00:21:28.900 | And that's really sad to me,
00:21:30.620 | that if you give me 100 bucks,
00:21:33.640 | I'm less likely to say something negative about you.
00:21:36.340 | That makes me sad,
00:21:39.900 | because the reason I'd prefer for not saying something negative
00:21:42.580 | about you is the pressure of friendship
00:21:45.600 | and human connection, those kinds of things.
00:21:48.340 | So I understand that.
00:21:50.700 | That's also a problem, by the way,
00:21:52.140 | so they start having dinners and shaking hands
00:21:54.340 | and, "Oh, aren't we friends?"
00:21:56.260 | But the fact that money has that effect is really sad to me.
00:22:00.240 | On the news media, on the journalists, on scientists,
00:22:04.120 | that's scary to me.
00:22:06.900 | But of course, the direct advertisement to consumers,
00:22:09.220 | like you said, is potentially a very negative effect.
00:22:11.300 | I wanted to ask if,
00:22:13.020 | what do you think is the most negative impact
00:22:15.860 | of advertisement?
00:22:17.100 | Is it that direct to consumer on television?
00:22:20.300 | Is it advertisement to doctors,
00:22:22.300 | which, I was surprised to learn when I was vaguely looking into it,
00:22:26.340 | is the bigger of the two,
00:22:28.860 | more is spent on advertising to doctors than to consumers.
00:22:32.660 | That's really confusing to me.
00:22:34.060 | It's fascinating, actually.
00:22:35.820 | And then also, obviously, the law side of things
00:22:38.940 | is the lobbying dollars,
00:22:40.980 | which I think is less than all of those.
00:22:42.660 | But anyway, it's in the ballpark.
00:22:44.700 | What concerns you most?
00:22:46.540 | - Well, it's the whole nexus of influence.
00:22:49.780 | There's not one thing,
00:22:51.100 | and they don't invest all their,
00:22:53.980 | they don't put all their eggs in one basket.
00:22:55.580 | It's a whole surround sound program here.
00:23:00.580 | But in terms of advertisements,
00:23:04.460 | let's take the advertisement,
00:23:06.100 | Trulicity is a diabetes drug,
00:23:09.620 | for type 2 diabetes, an injectable drug.
00:23:12.540 | And it lowers blood sugar just about as well
00:23:15.660 | as metformin does.
00:23:18.520 | Metformin costs about $4 a month.
00:23:21.040 | Trulicity costs, I think, $6,200 a year.
00:23:25.780 | So $48 a year versus $6,200.
00:23:29.500 | Trulicity has distinguished itself
00:23:31.600 | because the manufacturer did a study
00:23:35.100 | that showed that it significantly reduces
00:23:37.340 | the risk of cardiovascular disease in diabetics.
00:23:41.080 | And they got approval on the basis of that study,
00:23:44.380 | that very large study being statistically significant.
00:23:48.100 | What the, so the ad,
00:23:50.120 | the ads obviously extol the virtues of Trulicity
00:23:53.340 | because it reduces the risk of heart disease and stroke.
00:23:56.820 | And that's one of the major morbidities,
00:23:59.340 | risks of type 2 diabetes.
00:24:01.580 | What the ad doesn't say is that you have to treat
00:24:03.660 | 323 people to prevent one non-fatal event
00:24:08.100 | at a cost of $2.7 million.
00:24:10.300 | And even more importantly than that,
00:24:13.820 | what the ad doesn't say is that the evidence shows
00:24:17.720 | that engaging in an active, healthy lifestyle program
00:24:22.020 | reduces the risk of heart disease and strokes far more
00:24:25.860 | than Trulicity does.
00:24:27.460 | Now, to be fair to the company, the sponsor,
00:24:32.140 | there's never been a study that compared Trulicity
00:24:37.140 | to lifestyle changes.
00:24:39.780 | But that's part of the problem of our advertising.
00:24:42.700 | You would think in a rational society
00:24:45.340 | that was way out on a limb as a lone country
00:24:50.300 | besides New Zealand that allows
00:24:52.300 | direct-to-consumer advertising,
00:24:54.340 | that part of allowing direct-to-consumer advertising
00:24:59.140 | would be to mandate that the companies establish
00:25:03.060 | whether their drug is better than, say,
00:25:05.940 | healthy lifestyle adoption to prevent the problems
00:25:09.940 | that they claim to be preventing.
00:25:11.820 | But we don't require that.
00:25:13.820 | So the companies can afford to do very large studies
00:25:17.580 | so that very small differences
00:25:19.540 | become statistically significant.
00:25:21.860 | And their studies are asking the question,
00:25:23.860 | how can we sell more drug?
00:25:25.660 | They're not asking the question,
00:25:27.300 | how can we prevent cardiovascular disease
00:25:30.620 | in people with type 2 diabetes?
00:25:32.700 | And that's how we get off in this,
00:25:34.220 | we're now in the extreme arm of this distortion
00:25:38.540 | of our medical knowledge of studying how to sell more drugs
00:25:43.060 | than how to make people more healthy.
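A quick back-of-envelope sketch of the number-needed-to-treat arithmetic behind those figures; the prices and the NNT of 323 are simply the numbers quoted above, and the cost per event prevented scales with the assumed treatment duration, which is how a figure like the quoted $2.7 million arises.

```python
# Back-of-envelope arithmetic using the figures quoted in the conversation
# (illustrative only; prices and NNT are as quoted, not taken from the trial).
metformin_per_year = 4 * 12        # ~$4/month -> ~$48/year
trulicity_per_year = 6_200         # quoted annual cost of Trulicity

nnt = 323  # number needed to treat to prevent one non-fatal cardiovascular event

# An NNT of 323 means that of every 323 people treated, 1 is expected to avoid
# an event because of the drug and the other 322 are not.
not_benefiting = nnt - 1           # 322

# Drug spend for a group of 323 patients, per year of treatment:
group_cost_trulicity = nnt * trulicity_per_year   # ~$2.0M per treatment-year
group_cost_metformin = nnt * metformin_per_year   # ~$15.5K per treatment-year

print(not_benefiting, f"${group_cost_trulicity:,}", f"${group_cost_metformin:,}")
```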
00:25:45.500 | - That's a really great thing to compare it to,
00:25:48.940 | is lifestyle changes.
00:25:51.380 | 'Cause that should be the bar.
00:25:53.260 | If you do some basic diet, exercise,
00:25:56.680 | all those kinds of things,
00:25:58.420 | how does this drug compare to that?
00:26:00.260 | - Right, right.
00:26:01.420 | And that study was done, actually, in the '90s.
00:26:04.100 | It's called the Diabetes Prevention Program.
00:26:06.060 | It was federally funded by the NIH
00:26:09.180 | so that there wasn't this drug company imperative
00:26:13.260 | to just try to prove your drug was better than nothing.
00:26:16.820 | And it was a very well-designed study,
00:26:19.660 | randomized, controlled trial,
00:26:22.460 | in people who were at high risk of diabetes,
00:26:25.020 | so-called pre-diabetics.
00:26:26.820 | And they were randomized to three different groups,
00:26:30.020 | a placebo group, a group that got treated with metformin,
00:26:33.780 | and a group that got treated
00:26:36.180 | with intensive lifestyle counseling.
00:26:38.860 | So this study really tested
00:26:42.180 | whether you can get people in a randomized, controlled trial
00:26:45.980 | assigned to intensive lifestyle changes,
00:26:49.260 | whether that works.
00:26:50.660 | Now, the common wisdom amongst physicians,
00:26:54.640 | and I think in general,
00:26:56.100 | is that you can't get people to change.
00:26:57.940 | You can do whatever you want.
00:26:59.260 | You can stand on your head.
00:27:00.300 | You can beg and plead.
00:27:01.580 | People won't change.
00:27:02.620 | So give it up, and let's just move on with the drugs
00:27:05.060 | and not waste any time.
00:27:06.440 | Except this study that was published
00:27:08.260 | in the New England Journal, I think, in 2002,
00:27:11.020 | shows that's wrong,
00:27:12.860 | that the people who were in the intensive lifestyle group
00:27:16.180 | ended up losing 10 pounds,
00:27:18.060 | exercising five times a week, maintaining it,
00:27:21.220 | and reduced their risk of getting diabetes by 58%,
00:27:26.220 | compared to the metformin group,
00:27:27.900 | which reduced its risk of getting diabetes by 31%.
00:27:31.940 | So that exact study was done,
00:27:34.880 | and it showed that lifestyle intervention is the winner.
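For readers who want the arithmetic, a minimal sketch of how those relative risk reductions translate into absolute terms; the baseline risk used here is a hypothetical placeholder, and only the 58% and 31% reductions come from the study as described above.

```python
# What a 58% vs 31% relative risk reduction looks like in absolute terms.
# The baseline risk below is a hypothetical placeholder; only the relative
# reductions (58% lifestyle, 31% metformin) come from the conversation.
baseline_risk = 0.30   # hypothetical: 30% of high-risk people progress to diabetes

def absolute_risk_reduction(baseline, relative_reduction):
    # a relative risk reduction scales the baseline risk down proportionally
    return baseline * relative_reduction

for name, rrr in [("lifestyle", 0.58), ("metformin", 0.31)]:
    arr = absolute_risk_reduction(baseline_risk, rrr)
    nnt = 1 / arr  # number needed to treat = 1 / absolute risk reduction
    print(f"{name}: absolute risk reduction {arr:.1%}, NNT ~ {nnt:.0f}")
```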
00:27:37.920 | - Who, as a small tangent, is the leader?
00:27:44.440 | Who is supposed to fight for the side of lifestyle changes?
00:27:49.140 | Where's the big pharma version of lifestyle changes?
00:27:54.140 | Who's supposed to have the big bully pulpit,
00:27:57.240 | the big money behind lifestyle changes, in your sense?
00:28:00.960 | Because that seems to be missing
00:28:03.400 | in a lot of our discussions about health policy.
00:28:06.280 | - Right, that's exactly right.
00:28:08.040 | And the answer is that we assume
00:28:12.800 | that the market has to solve all of these problems,
00:28:15.920 | and the market can't solve all of these problems.
00:28:18.320 | There needs to be some way of protecting the public interest
00:28:23.240 | for things that aren't financially driven,
00:28:26.500 | so that the overriding question has to be
00:28:28.760 | how best to improve Americans' health,
00:28:31.420 | not companies funding studies to try and prove
00:28:36.200 | that their new inexpensive drug is better
00:28:39.260 | and should be used.
00:28:40.880 | - Well, some of that is also people like yourself.
00:28:45.440 | I mean, it's funny, you spoke with Joe Rogan.
00:28:48.800 | He constantly espouses lifestyle changes.
00:28:50.960 | So some of it is almost like understanding the problems
00:28:55.960 | that big pharma is creating in society,
00:28:58.160 | and then sort of these influential voices
00:29:02.460 | speaking up against it.
00:29:03.560 | So whether they're scientists or just regular communicators.
00:29:08.560 | - Yeah, I think you gotta tip your hat to Joe
00:29:11.300 | for getting that message out.
00:29:13.140 | And he clearly believes it and does his best.
00:29:17.340 | But it's not coming out in the legitimate avenues,
00:29:21.020 | in the legitimate channels that are evidence-based medicine
00:29:26.020 | and from the sources that the docs are trained to listen to
00:29:31.020 | and modify their patient care on.
00:29:34.320 | Now, it's not 100%.
00:29:36.460 | I mean, there are articles in the big journals
00:29:40.180 | about the benefits of lifestyle,
00:29:42.160 | but they don't carry the same gravitas
00:29:45.800 | as the randomized controlled trials
00:29:48.220 | that test this drug against placebo
00:29:50.300 | or this drug against another drug.
00:29:52.340 | So the Joe Rogans of the world keep going.
00:29:55.660 | I tip my hat.
00:29:57.020 | But it's not gonna carry the day for most of the people
00:30:00.900 | until it has the legitimacy of the medical establishment.
00:30:04.220 | - Yeah, like something that the doctors
00:30:05.940 | really pay attention to.
00:30:07.100 | Well, there's an entire mechanism
00:30:09.100 | established for testing drugs.
00:30:11.280 | There's not an entire mechanism established
00:30:14.380 | in terms of scientific rigor of testing lifestyle changes.
00:30:17.580 | I mean, it's more difficult.
00:30:20.460 | I mean, everything's difficult in science,
00:30:23.660 | science that involves humans especially,
00:30:27.100 | but it's just, these studies are very expensive.
00:30:30.620 | They're difficult.
00:30:31.940 | It's difficult to find conclusions
00:30:33.420 | and to control all the variables.
00:30:35.460 | And so it's very easy to dismiss them
00:30:37.340 | unless you really do a huge study that's very well-funded.
00:30:40.940 | And so maybe the doctors just lean
00:30:42.820 | towards the simpler studies over and over,
00:30:45.740 | which is what the drug companies fund.
00:30:48.020 | They can control more variables.
00:30:51.380 | See, but the control there is sometimes
00:30:53.660 | by hiding things too, right?
00:31:00.940 | So sometimes you can just say
00:31:03.500 | that this is a well-controlled study
00:31:06.460 | by pretending there's a bunch of other stuff,
00:31:09.300 | just ignoring the stuff that could be correlated,
00:31:13.300 | it could be the real cause of the effects you're seeing,
00:31:15.580 | all that kind of stuff.
00:31:17.380 | So money can buy ignorance, I suppose, in science.
00:31:21.740 | - It buys the kind of blinders that are on,
00:31:24.740 | that don't look outside the reductionist model.
00:31:28.060 | And that's another issue is that we kind of,
00:31:31.500 | nobody says to doctors in training,
00:31:34.340 | only listen to reductionist studies and conclusions
00:31:39.340 | and methods of promoting health.
00:31:42.100 | Nobody says that explicitly,
00:31:44.020 | but the respectable science
00:31:47.660 | has to do with controlling the factors.
00:31:49.780 | And I mean, it just doesn't make sense to me.
00:31:54.220 | I'm gonna pick on Trulicity
00:31:55.460 | 'cause it's such an obvious example,
00:31:57.280 | but it's not more egregious than many others.
00:32:01.360 | It doesn't make sense to me to allow a drug
00:32:03.740 | to be advertised as preventing cardiovascular disease
00:32:07.000 | when you haven't included lifestyle changes
00:32:10.000 | as an arm in the study.
00:32:11.700 | It's just so crystal clear
00:32:14.660 | that the purpose of that study is to sell Trulicity.
00:32:17.460 | It's not to prevent cardiovascular disease.
00:32:20.460 | If we were in charge, I would try to convince you
00:32:24.760 | that anywhere that study,
00:32:26.020 | the results of that study were presented to physicians,
00:32:31.020 | it would be stamped in big red letters,
00:32:33.580 | this study did not compare Trulicity to lifestyle changes.
00:32:37.720 | They need to know that.
00:32:38.980 | And the docs are kind of trained,
00:32:40.660 | these blinders get put on,
00:32:42.660 | and they're trained to kind of forget that that's not there.
00:32:46.300 | - Do you think, so first of all,
00:32:48.220 | that's a small or big change to advertisement
00:32:51.300 | that seems obvious to say,
00:32:53.100 | like in force that it should be compared
00:32:56.500 | to lifestyle changes.
00:32:57.680 | Do you think advertisements period in the United States
00:33:02.580 | for pharmaceutical drugs should be banned?
00:33:05.920 | - I think they can't be banned.
00:33:07.540 | So it doesn't matter what I think.
00:33:09.220 | (Lex laughing)
00:33:10.520 | - Okay.
00:33:11.360 | One, let's say you were a dictator,
00:33:13.180 | and two, why can't they be banned?
00:33:15.180 | - Okay.
00:33:16.740 | - Answer either one.
00:33:17.740 | - I believe, I've been told by lawyers who I trust,
00:33:22.820 | that the freedom of speech in the US Constitution
00:33:27.280 | is such that you can't ban them,
00:33:29.380 | that you could ban cigarettes and alcohol,
00:33:33.300 | which have no therapeutic use,
00:33:35.660 | but drugs have a therapeutic use,
00:33:37.680 | and advertisements about them can't be banned.
00:33:41.600 | Let's assume that they can't be,
00:33:43.680 | 'cause we know they won't be anyway,
00:33:45.840 | but let's assume they can't be,
00:33:49.120 | and especially our Supreme Court now
00:33:51.820 | would be unlikely to take that seriously.
00:33:55.900 | But that's not the issue.
00:33:57.360 | The issue is that if the drug companies
00:34:00.240 | wanna spend their money advertising,
00:34:02.620 | they should have to have independent analysis
00:34:06.880 | of the message that the viewers are left with
00:34:10.440 | about the drug, so that it's realistic.
00:34:13.400 | What's the chance the drug will help them?
00:34:15.520 | Well, with Trulicity, it's one out of 323.
00:34:19.000 | 322 people aren't gonna benefit
00:34:21.120 | from the cardiovascular risk reduction.
00:34:23.700 | What's the true cost?
00:34:26.880 | When drugs advertise that you may be able to get this
00:34:30.640 | for a $25 copay or something,
00:34:33.880 | tens of thousands of dollars a year drug,
00:34:35.960 | for a $25 copay, what an enormous disservice that is,
00:34:40.100 | to misrepresent the cost to society.
00:34:42.600 | That should not be allowed.
00:34:44.040 | So you should have to make it clear to the viewers
00:34:48.680 | how many people are gonna benefit,
00:34:49.960 | what's your chance of benefiting,
00:34:51.680 | how does it compare to lifestyle changes
00:34:53.560 | or less expensive therapies,
00:34:55.840 | what do you give up if you use a less expensive therapy
00:34:58.440 | or gain, perhaps.
00:34:59.960 | - And how much it costs.
00:35:01.160 | - How much it costs.
00:35:02.280 | Now, that can go either way,
00:35:03.560 | 'cause if you say Humira costs $72,000
00:35:06.640 | and it's no more effective as a first-line drug
00:35:08.960 | than methotrexate, which costs $480,
00:35:12.280 | people might say, "I want the expensive drug
00:35:14.960 | "'cause I can get it for a $25 copay."
00:35:17.720 | So you'd have to temper that a little bit.
00:35:21.640 | - Oh, you mean people are so, they don't care.
00:35:25.560 | - They don't care, their insurance is gonna cover it
00:35:27.640 | and it's a $25 copay,
00:35:29.440 | but we could figure out how to deal with that.
00:35:31.580 | The main point is that if we assume
00:35:35.240 | that advertisements are gonna keep going, and they are,
00:35:38.760 | we could require that there be outside evaluation
00:35:43.760 | of the message that reasonable, unbiased viewers
00:35:48.880 | take away from the ads,
00:35:50.960 | and the ads would have to tell the truth about the drug.
00:35:54.220 | - And the truth should have sub-truth guardrails,
00:36:01.100 | meaning like the cost that we talked about,
00:36:03.680 | the effects compared to things that actually,
00:36:07.080 | lifestyle changes, just these details,
00:36:11.800 | very strict guardrails of what actually has to be specified.
00:36:16.520 | - And I would make it against the law
00:36:19.360 | to have family picnics or dogs catching Frisbees in the ads.
00:36:23.360 | - So, (laughs)
00:36:26.160 | you mean 95% of the ads, yes.
00:36:30.600 | I mean, there's something dark and inauthentic
00:36:32.620 | about those advertisements, but they see,
00:36:34.500 | I mean, I'm sure they're being done
00:36:36.240 | 'cause they work for the target audience.
00:36:38.600 | And then the doctors too.
00:36:45.120 | Can you really buy a doctor's opinion?
00:36:48.720 | Why does it have such an effect on doctors,
00:36:51.320 | advertisement to doctors?
00:36:53.520 | Like you as a physician, again,
00:36:55.800 | like from everything I've seen, people love you.
00:36:58.360 | (laughs)
00:36:59.280 | And I've just, people should definitely look you up from,
00:37:04.280 | there's a bunch of videos of you giving talks on YouTube,
00:37:09.280 | and it's just, it's so refreshing to hear
00:37:14.280 | just the clarity of thought about health policy,
00:37:17.440 | about healthcare, just the way you think
00:37:19.640 | throughout the years.
00:37:20.640 | - Thank you.
00:37:21.480 | - So, like it's easy to think about,
00:37:23.120 | like maybe you're criticizing Big Pharma,
00:37:25.360 | that's one part of the message that you're talking about,
00:37:28.800 | but that's not, like your brilliance actually shines
00:37:33.000 | in the positive, in the solutions and how to do it.
00:37:35.440 | So as a doctor, what affects your mind?
00:37:40.440 | And how does Big Pharma affect your mind?
00:37:43.120 | - Number one, the information that comes through
00:37:46.480 | legitimate sources that doctors have been taught
00:37:50.320 | to rely on, evidence-based medicine,
00:37:52.560 | the articles in peer-reviewed journals,
00:37:55.420 | the guidelines that are issued.
00:37:57.020 | Now, those are problematic,
00:37:59.240 | because when an article is peer-reviewed
00:38:03.240 | and published in a respected journal,
00:38:05.140 | people and doctors obviously assume
00:38:10.360 | that the peer reviewers have had access to the data
00:38:15.360 | and they've independently analyzed the data,
00:38:18.400 | and they corroborate the findings in the manuscript
00:38:21.920 | that was submitted, or they give feedback to the authors
00:38:25.760 | and say, "We disagree with you on this point,
00:38:28.180 | "and would you please check our analysis,
00:38:30.680 | "and if you agree with us, make it."
00:38:32.460 | That's what they assume the peer-review process is,
00:38:35.620 | but it's not.
00:38:36.920 | The peer reviewers don't have the data.
00:38:39.240 | The peer reviewers have the manuscript
00:38:41.840 | that's been submitted, usually in conjunction with,
00:38:46.260 | or by the drug company that manufactures the drug.
00:38:51.540 | So peer reviewers are unable to perform the job
00:38:56.540 | that doctors think they're performing
00:38:59.740 | to vet the data to assure that it's accurate
00:39:03.300 | and reasonably complete.
00:39:05.100 | They can't do it.
00:39:06.340 | And then we have the clinical practice guidelines,
00:39:09.500 | which are increasingly more important,
00:39:11.340 | as the information, the flow of information
00:39:15.900 | keeps getting brisker and brisker,
00:39:18.740 | and docs need to get to the bottom line quickly.
00:39:22.020 | Clinical practice guidelines become much more important,
00:39:25.700 | and we assume that the authors
00:39:28.800 | of those clinical practice guidelines
00:39:30.380 | have independently analyzed the data
00:39:32.380 | from the clinical trials and make their recommendations
00:39:35.880 | that set the standards of care based on their analysis.
00:39:39.140 | That's not what happens.
00:39:40.860 | The experts who write the clinical practice guidelines
00:39:44.180 | rely almost entirely on the publications
00:39:49.180 | presenting the results of the clinical trials,
00:39:51.940 | which are peer reviewed, but the peer reviewers
00:39:53.980 | haven't had access to the data.
00:39:56.340 | So we've got a system of the highest level of evidence
00:40:01.140 | that doctors have been trained over and over again
00:40:03.780 | to rely on to practice evidence-based medicine
00:40:06.420 | to be good doctors that has not been verified.
00:40:10.860 | - Do you think that data that's coming
00:40:14.300 | from the pharma companies, do you think they're,
00:40:18.180 | what level of manipulation is going on with that data?
00:40:22.520 | Is it at the study design level?
00:40:28.140 | Is it literally that there's some data
00:40:28.140 | that you just keep off, you know, keep out of the charts,
00:40:33.140 | keep out of the aggregate analysis that you then publish?
00:40:38.580 | Or is it the worst case,
00:40:41.380 | which is just change some of the numbers?
00:40:44.620 | - It happened, all three happened.
00:40:46.160 | I can't, I don't know what the denominator is,
00:40:48.540 | but I spent about 10 years in litigation.
00:40:51.660 | And for example, in Vioxx, which was withdrawn
00:40:55.820 | from the market in 2004 in the biggest drug recall
00:40:59.500 | in American history, the problem was that it got recalled
00:41:06.100 | when a study that Merck sponsored showed
00:41:09.060 | that Vioxx doubled the risk, more than doubled the risk
00:41:11.620 | of heart attacks, strokes, and blood clots,
00:41:15.300 | serious blood clots, it got pulled then.
00:41:18.120 | But there was a study, a bigger study
00:41:20.660 | that had been published in 2000
00:41:22.760 | in the New England Journal of Medicine
00:41:24.660 | that showed that Vioxx was a better drug
00:41:28.580 | for arthritis and pain, not because it was more effective.
00:41:34.240 | It's no more effective than Aleve or Advil,
00:41:36.440 | but because it was less likely
00:41:40.200 | to cause serious GI complications,
00:41:43.200 | bleeds and perforations in the gut.
00:41:45.100 | Now, in that study that was published
00:41:48.160 | in the New England Journal that was never corrected,
00:41:51.620 | it was a little bit modified 15 months
00:41:55.960 | after the drug was taken off the market,
00:41:57.540 | but never corrected, Merck left out three heart attacks.
00:42:02.680 | And the FDA knew that Merck left out three heart attacks,
00:42:05.620 | and the FDA's analysis of the data from that study
00:42:10.420 | said that the FDA wasn't gonna do the analysis
00:42:14.720 | without the three heart attacks in it.
00:42:16.940 | And the important part of this story
00:42:19.660 | is that there were 12 authors listed on that study
00:42:23.100 | in the New England Journal, two were Merck employees,
00:42:26.260 | they knew about the three heart attacks
00:42:27.780 | that had been omitted.
00:42:29.620 | The other 10 authors, the academic authors,
00:42:34.440 | didn't know about it, they hadn't seen that data.
00:42:37.060 | So Merck just, they had an excuse,
00:42:41.920 | it's complicated and the FDA didn't accept it,
00:42:44.240 | so there's no reason to go into it.
00:42:46.800 | But Merck just left out the three heart attacks.
00:42:48.800 | And the three heart attacks, it may seem three heart attacks
00:42:51.300 | in a 10,000 person study may seem like nothing,
00:42:54.160 | except they completely changed the statistics
00:42:57.540 | so that had the three heart attacks been included,
00:43:00.020 | the only conclusion that Merck could have made
00:43:02.600 | was that Vioxx significantly increased
00:43:04.700 | the risk of heart attack.
00:43:06.360 | And they abbreviated their endpoint
00:43:09.600 | from heart attacks, strokes, and blood clots
00:43:12.380 | to just heart attacks.
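Purely to illustrate the statistical point, a minimal sketch using Fisher's exact test shows how three extra events in one arm of a large trial can move a result across the conventional p < 0.05 threshold; the arm size and event counts below are hypothetical, not the actual trial numbers.

```python
# Hypothetical illustration of how three extra events can flip statistical
# significance. These 2x2 counts are invented for illustration and are NOT
# the actual counts from the Vioxx trial.
from scipy.stats import fisher_exact

ARM_SIZE = 4000  # hypothetical number of patients per arm

def table(events_drug, events_comparator):
    # rows: drug arm, comparator arm; columns: had event, no event
    return [[events_drug, ARM_SIZE - events_drug],
            [events_comparator, ARM_SIZE - events_comparator]]

# 17 vs 7 events: not significant at the conventional 0.05 threshold ...
_, p_without = fisher_exact(table(17, 7))
# ... add three more events to the drug arm (20 vs 7) and it crosses it.
_, p_with = fisher_exact(table(20, 7))

print(f"without the 3 extra events: p = {p_without:.3f}")  # roughly 0.06
print(f"with the 3 extra events:    p = {p_with:.3f}")      # roughly 0.02
```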
00:43:13.820 | - Yeah, so those are maybe in their mind,
00:43:17.120 | they're also playing by the rules
00:43:18.300 | because of some technical excuse
00:43:19.820 | that you mentioned that was rejected.
00:43:22.220 | How can this-- - No, no, no,
00:43:23.980 | let me interrupt, no, that's not true.
00:43:27.340 | The study was completed, the blind was broken,
00:43:32.040 | meaning they looked at the data.
00:43:34.420 | In March of 2000, the article was published
00:43:37.360 | in the New England Journal in November of 2000.
00:43:40.060 | In March of 2000, there was an email by the head scientist
00:43:45.060 | that was published in the Wall Street Journal
00:43:48.180 | that said the day that the data were unblinded,
00:43:53.820 | that it's a shame that the cardiovascular events are there,
00:43:57.660 | but the drug will do well and we will do well.
00:44:03.620 | - But removing the three heart attacks,
00:44:10.780 | how does that happen?
00:44:12.300 | Who has to convince themselves?
00:44:16.780 | Is this pure malevolence?
00:44:18.540 | - You have to be the judge of that,
00:44:21.400 | but the person who was in charge
00:44:23.620 | of the Data Safety Monitoring Board
00:44:25.700 | issued a letter that said they'll stop
00:44:29.560 | counting cardiovascular events
00:44:32.380 | a month before the trial is over
00:44:35.100 | and they'll continue counting GI events.
00:44:37.560 | And that person got a contract to consult with Merck
00:44:43.300 | for $5,000 a day, I think for 12 days a year
00:44:47.540 | for one or two years,
00:44:49.540 | that was signed, that contract was signed
00:44:54.540 | within two weeks of the decision
00:44:58.180 | to stop counting heart attacks.
00:45:00.420 | - I wanna understand that man or woman.
00:45:03.100 | I wanna, I want, it's the,
00:45:07.300 | been reading a lot about Nazi Germany
00:45:09.580 | and thinking a lot about the good Germans
00:45:12.260 | because I want to understand so that we can each
00:45:19.040 | encourage each other to take the small heroic actions
00:45:22.100 | that prevents that.
00:45:23.820 | Because it feels to me removing malevolence from the table
00:45:28.420 | where it's just a pure psychopathic person,
00:45:31.260 | that there's just a momentum created by the game,
00:45:34.880 | like you mentioned.
00:45:35.720 | - Yes.
00:45:36.680 | - And so it takes reversing the momentum
00:45:41.020 | within a company, I think requires
00:45:44.180 | many small acts of heroism.
00:45:46.880 | Not gigantic, I'm going to leave and become a whistleblower
00:45:50.660 | and publish a book about it.
00:45:52.480 | But small, quiet acts of pressuring against this.
00:45:57.060 | Like, what are we doing here?
00:45:59.220 | We're trying to help people.
00:46:00.460 | Is this the right thing to do?
00:46:01.700 | Looking in the mirror constantly and asking,
00:46:03.380 | is this the right thing to do?
00:46:05.240 | I mean, that's how, that's what integrity is.
00:46:07.620 | Acknowledging the pressures you're under
00:46:11.220 | and then still be able to zoom out and think,
00:46:13.620 | what is the right thing to do here?
00:46:16.620 | But the data, hiding the data,
00:46:18.600 | makes it too easy to live in ignorance.
00:46:22.660 | So like, within those, inside those companies.
00:46:25.420 | So your idea is that the reviewers should see the data.
00:46:34.500 | That's one step.
00:46:36.340 | So, just to push back on that idea a little:
00:46:39.740 | I assume you mean that data remains private
00:46:43.340 | except to the peer reviewers.
00:46:47.020 | The problem, of course, as you probably know,
00:46:49.580 | is the peer review process is not perfect.
00:46:51.980 | You know, it's individuals.
00:46:55.460 | It feels like there should be a lot more eyes on the data
00:46:58.740 | than just the peer reviewers.
00:47:00.420 | - Yes.
00:47:01.340 | This is not a hard problem to solve.
00:47:03.500 | When a study is completed,
00:47:06.620 | a clinical study report is made.
00:47:09.000 | And it's usually several thousand pages.
00:47:12.260 | And what it does is it takes the raw patient data
00:47:15.900 | and it tabulates it in the ways,
00:47:19.660 | it's supposedly and usually,
00:47:22.020 | in the ways that the company has pre-specified.
00:47:25.620 | So that you then end up with a searchable,
00:47:28.420 | let's say 3,000 page document.
00:47:30.720 | As I became more experienced as an expert in litigation,
00:47:36.180 | I could go through those documents pretty quickly.
00:47:39.660 | Quickly may mean 20 hours or 40 hours,
00:47:42.040 | but it doesn't mean three months of my work.
00:47:44.340 | And see if the company's,
00:47:49.120 | if the way the company has analyzed the data
00:47:51.660 | is consistent with their statistical analysis plan
00:47:55.800 | and their pre-specified outcome measures.
00:48:00.060 | It's not hard.
00:48:01.300 | And I think you're right.
00:48:02.800 | Peer reviewers, I don't peer review clinical trials,
00:48:06.200 | but I peer review other kinds of articles.
00:48:09.300 | I have to do one on the airplane on the way home.
00:48:11.540 | And it's hard.
00:48:12.380 | I mean, we're just ordinary mortal people volunteering.
00:48:15.620 | - Unpaid, the motivation is not clear.
00:48:19.160 | - The motivation is to keep,
00:48:22.080 | to be a good citizen in the medical community
00:48:27.920 | and to be on friendly terms with the journals
00:48:31.120 | so that if you want to get published,
00:48:33.260 | there's sort of an unspoken incentive.
00:48:37.320 | - As somebody who enjoys game theory,
00:48:39.840 | I feel like that motivation is good,
00:48:42.200 | but could be a lot better.
00:48:43.500 | - Yes, you should get more recognition
00:48:46.540 | or in some way academic credit for it.
00:48:50.260 | It should go to your career advancement.
00:48:52.980 | - If it's an important paper
00:48:54.380 | and you recognize it's an important paper
00:48:56.600 | as a great peer reviewer,
00:48:58.420 | that this is not in that area where it's
00:49:01.780 | clearly a piece of crap paper
00:49:05.920 | or clearly an awesome paper
00:49:08.300 | that doesn't have controversial aspects to it
00:49:10.980 | and it's just a beautiful piece of work,
00:49:13.140 | okay, those are easy.
00:49:14.660 | And then there's like the very difficult gray area
00:49:17.740 | which may require many, many days of work
00:49:20.220 | on your part as a peer reviewer.
00:49:21.900 | So it's not just a couple hours,
00:49:24.400 | but really seriously reading.
00:49:27.300 | Like some papers can take months to really understand.
00:49:30.700 | So if you really want to struggle,
00:49:32.780 | there has to be an incentive for that struggle.
00:49:35.900 | - Yes, and billions of dollars ride on some of these studies.
00:49:40.900 | - And lives, right?
00:49:43.780 | Not to mention.
00:49:44.700 | - Right, but it would be easy to have
00:49:48.340 | full-time statisticians hired by the journals
00:49:52.380 | or shared by the journals
00:49:53.860 | who were independent of any other financial incentive
00:50:00.300 | to go over these kind of methodological issues
00:50:03.980 | and take responsibility for certifying the analyses
00:50:08.900 | that are done and then pass it on
00:50:11.180 | to the volunteer peer reviewers.
00:50:14.100 | - See, I believe even in this,
00:50:15.940 | in the sort of capitalism or even social capital,
00:50:19.420 | after watching Twitter in the time of COVID
00:50:23.580 | and just looking at people that investigate themselves,
00:50:27.420 | I believe in the citizenry.
00:50:30.060 | People, if you give them access to the data,
00:50:32.460 | like these citizen scientists arise.
00:50:35.900 | A lot of them on the, it's kind of funny.
00:50:38.360 | A lot of people are just really used to working with data.
00:50:42.220 | They don't know anything about medicine
00:50:44.620 | and they don't have actually the biases
00:50:46.820 | that a lot of doctors and medical
00:50:48.900 | and a lot of the people that read these papers,
00:50:51.060 | they'll just go raw into the data
00:50:53.260 | and look at it with, like they're bored almost,
00:50:56.140 | and they do incredible analysis.
00:50:58.340 | So there's some argument to be made
00:51:01.060 | for a lot of this data to become public.
00:51:04.060 | Like de-anonymized, no, sorry, anonymized,
00:51:07.120 | all that kind of stuff, but for a lot of it to be public,
00:51:11.100 | especially when you're talking about things
00:51:13.340 | as impactful as some of these drugs.
00:51:16.940 | - I agree 100%.
00:51:18.100 | So let's turn the micro,
00:51:19.960 | let's get a little bit more granular.
00:51:22.140 | On the peer review issue,
00:51:24.180 | we're talking about pre-publication transparencies.
00:51:27.780 | And that is critically important.
00:51:29.580 | Once a paper is published, the horses are out of the barn
00:51:33.580 | and docs are gonna read it,
00:51:34.820 | take it as evidence-based medicine.
Economists call what then happens "stickiness": the docs hold on to their beliefs.
00:51:43.340 | And my own voice inside says,
00:51:47.300 | once doctors start doing things to their patients' bodies,
00:51:52.000 | they're really not too enthusiastic
00:51:53.660 | about hearing it was wrong.
00:51:55.420 | - Yeah, that's the stickiness of human nature.
00:51:57.860 | Wow, so that bar, once it's published,
00:52:00.900 | the doctors, that's when the stickiness emerges.
00:52:04.780 | Wow, yeah. - Yeah, yeah.
00:52:05.780 | It's hard to put that toothpaste back in the tube.
00:52:08.260 | Now, that's pre-publication transparency,
00:52:11.540 | which is essential.
00:52:13.020 | And you could have, whoever saw that data pre-publication
00:52:17.480 | could sign confidentiality agreements
00:52:20.000 | so that the drug companies couldn't argue
00:52:22.520 | that we're just opening the spigots of our data
00:52:24.700 | and people can copy it and blah, all the excuses they make.
00:52:28.660 | You could argue that you didn't have to,
00:52:30.540 | but let's just let them do it.
00:52:32.420 | Let the peer reviewers sign confidentiality agreements
00:52:35.180 | and they won't leak the data.
00:52:36.780 | But then you have to go to post-publication transparency,
00:52:39.880 | which is what you were just getting at,
00:52:41.740 | to let the data free and let citizens
00:52:46.740 | and citizen scientists and other doctors
00:52:50.540 | who are interested have at it.
00:52:53.660 | Kind of like Wikipedia, have at it.
00:52:56.640 | Let it out and let people criticize each other.
00:53:00.120 | - Okay, so speaking of the data,
the FDA asked for 55 years to release the Pfizer vaccine data.
00:53:08.140 | This is also something I raised with Albert Bourla.
00:53:12.220 | - What did he say?
00:53:13.900 | - There's several things I didn't like about what he said.
00:53:16.860 | So some things are expected
00:53:18.360 | and some of it is just revealing the human being,
00:53:21.120 | which is what I'm interested in doing.
00:53:23.780 | But he said he wasn't aware of the 75 and the 55.
00:53:28.100 | - I'm sorry, wait a minute.
00:53:29.860 | He wasn't aware of?
00:53:31.100 | - The how long, so here, I'll explain what he, okay.
00:53:34.180 | - Do you know that since you spoke to him,
00:53:37.380 | Pfizer has petitioned the judge to join the suit
on behalf of the FDA's request to release that data
00:53:47.140 | over 55 or 75 years?
00:53:50.060 | Pfizer's fully aware of what's going on.
00:53:52.240 | He's aware, I'm sure he's aware in some formulation,
00:53:56.300 | the exact years he might have not been aware,
00:53:59.100 | but the point is that there is,
00:54:01.240 | that is the FDA, the relationship of Pfizer and the FDA,
00:54:06.640 | in terms of me being able to read human beings,
00:54:11.040 | was the thing he was most uncomfortable with,
00:54:14.380 | that he didn't wanna talk about the FDA.
And it was clear that there was a relationship there where the words you use may do a lot of harm, potentially because, like you're saying,
00:54:28.560 | there might be lawsuits going on,
00:54:30.140 | there's litigation, there's legal stuff,
00:54:32.380 | all that kind of stuff.
00:54:33.500 | And then there's a lot of games being played in this space.
00:54:36.640 | So I don't know how to interpret it,
00:54:40.080 | if he's actually aware or not,
00:54:41.600 | but the deeper truth is that he's deeply uncomfortable
00:54:49.660 | bringing light to this part of the game.
00:54:52.280 | - Yes, and I'm gonna read between the lines
00:54:56.020 | and Albert Bourla certainly didn't ask me to speak for him,
00:54:59.980 | but I think, but when did you speak to him?
00:55:02.500 | How long ago?
00:55:03.460 | - Wow, time flies when you're having fun.
00:55:05.820 | Two months ago.
00:55:06.660 | - Two months ago.
00:55:07.480 | So that was just recently, it's come out,
00:55:12.060 | just in the past week, it's come out,
00:55:14.520 | that Pfizer isn't battling the FDA.
00:55:18.940 | Pfizer has joined the FDA in the opposition to the request
00:55:23.940 | to release these documents in the same amount of time
00:55:29.900 | that the FDA took to evaluate them.
00:55:33.180 | - Yeah.
00:55:34.060 | - So Pfizer is offering to help the FDA
00:55:39.060 | to petition the judge to not release these documents
00:55:48.740 | to not enforce the timeline
00:55:51.540 | that he seems to be moving towards.
00:55:54.100 | - So for people who are not familiar,
00:55:55.620 | we're talking about the Freedom of Information Act request
00:55:59.140 | to release the Pfizer vaccine data, study data,
00:56:04.140 | to release as much of the data as possible,
00:56:07.180 | like the raw data, the details,
00:56:08.940 | or actually not even the raw data, it's data.
00:56:12.460 | Doesn't matter, there's details to it.
00:56:14.620 | And I think the response from the FDA
00:56:17.980 | is that, yes, of course,
00:56:21.440 | but we can only publish some X number of pages a day.
00:56:28.780 | - 500 pages.
00:56:31.020 | - 500 pages of data.
00:56:32.700 | - It's not a day, though.
00:56:33.980 | It's a--
00:56:34.820 | - Whatever.
00:56:35.640 | - A week, I think.
00:56:36.480 | - The point is, whatever they're able to publish
00:56:38.580 | is ridiculous.
00:56:39.420 | It's like, my printer can only print three pages a day
00:56:45.500 | and we cannot afford a second printer.
00:56:47.980 | So it's some kind of bureaucratic language for,
00:56:52.300 | there's a process to this.
00:56:53.960 | And now you're saying that Pfizer is obviously
00:56:58.060 | more engaged in helping this kind of bureaucratic process
00:57:03.520 | prosper in its full absurdity, Kafka-esque absurdity.
00:57:08.520 | So what is this?
00:57:11.860 | This really bothered people.
00:57:13.780 | This really--
00:57:14.620 | - This is really troublesome.
00:57:15.700 | And just to put it in just plain English terms,
00:57:19.660 | Pfizer's making the case that it can't,
00:57:23.500 | the FDA and Pfizer together are making the case
00:57:27.420 | that they can't go through the documents.
It's gonna take them hundreds of times longer to go through the documents
00:57:37.100 | than the FDA required to go through the documents
00:57:39.820 | to approve the vaccines,
00:57:42.260 | to give the vaccines full FDA approval.
00:57:44.900 | And the FDA's argument, talk about Kafka-esque,
00:57:48.900 | is that to do it more rapidly
00:57:51.300 | would cost them $3 million.
00:57:53.020 | $3 million equals one hour of vaccine sales over two years.
00:57:59.820 | One hour of sales.
00:58:04.300 | And they can't come up with the money.
00:58:05.860 | And now Pfizer has joined the suit
00:58:07.980 | to help the FDA fight off this judge,
00:58:10.220 | this mean judge who thinks they ought to release the data.
00:58:12.860 | But evidently Pfizer isn't offering
00:58:14.980 | to come up with the $3 million either.
00:58:17.260 | So bought for $3 million, I mean,
00:58:19.140 | maybe the FDA should do a GoFundMe campaign.
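For anyone who wants to sanity-check the numbers being thrown around here, a minimal back-of-the-envelope sketch, assuming the roughly $65 billion two-year vaccine sales figure cited later in this conversation and a document set of a few hundred thousand pages; both the page count and the release cadence are assumptions, since the exact figures are debated above and below:

```python
# Back-of-the-envelope sketch of the figures discussed above.
# Assumptions (not official numbers): ~65 billion USD of vaccine sales over the
# first two years (a figure cited later in this conversation), and roughly
# 400,000 pages of documents, a midpoint of the range mentioned below.

vaccine_sales_usd = 65e9                 # assumed two-year sales estimate
hours_in_two_years = 2 * 365 * 24
sales_per_hour = vaccine_sales_usd / hours_in_two_years
print(f"Vaccine sales per hour: ${sales_per_hour:,.0f}")   # roughly $3.7 million

pages_total = 400_000                    # assumed page count
for label, releases_per_year in [("500 pages/week", 52), ("500 pages/month", 12)]:
    years = pages_total / 500 / releases_per_year
    print(f"Years to release at {label}: {years:.0f}")      # ~15 or ~67 years
```

Which is how a $3 million processing cost ends up smaller than one hour of sales, and how a fixed pages-per-period cadence stretches into a multi-decade timeline.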
00:58:24.140 | - Well, obviously the money thing,
00:58:27.580 | I'm sure if Elon Musk comes along and says,
00:58:31.300 | "I'll give you 100 million, publish it now,"
00:58:35.460 | I think they'll come up with another.
00:58:37.860 | So, I mean, it's clear that there's cautiousness.
00:58:41.700 | I don't know the source of it from the FDA.
00:58:47.100 | - There's only one explanation that I can think of,
00:58:49.660 | which is that the FDA and Pfizer
00:58:53.020 | don't wanna release the data.
They don't wanna release the 300,000 or 500,000 pages of documents.
00:59:02.040 | And I don't know what's in there.
00:59:05.380 | I wanna say one thing very clearly.
00:59:08.140 | I am not an anti-vaxxer.
00:59:10.120 | I believe the vaccines work.
00:59:11.940 | I believe everybody should get vaccinated.
00:59:15.220 | The evidence is clear that if you're vaccinated,
00:59:17.620 | you reduce your risk of dying of COVID by 20 fold.
00:59:20.900 | And we've got new sub-variants coming along.
00:59:23.460 | And I just wanna be very clear about this.
00:59:26.620 | That said, there's something I would give you
00:59:31.580 | 10 to one odds on a bet that there's something in that data
00:59:35.100 | that is gonna be embarrassing to either FDA or Pfizer
00:59:40.100 | or both.
00:59:41.300 | - So, there's two options.
00:59:42.120 | I agree with you 100%.
00:59:43.740 | One is they know of embarrassing things.
00:59:46.720 | That's option one.
00:59:48.200 | And option two, they haven't invested enough
00:59:51.780 | to truly understand the data.
00:59:54.620 | I mean, it's a lot of data.
00:59:56.460 | That they have a sense there might be something
00:59:59.140 | embarrassing in there.
01:00:00.020 | And if we release it,
01:00:02.100 | surely the world will discover the embarrassing.
01:00:04.420 | And to do a sort of, to steel man their argument,
01:00:08.880 | they'll take the small, the press,
01:00:11.700 | the people will take the small embarrassing things
01:00:14.420 | and blow them up into big things.
01:00:16.420 | - Yes, and support the anti-vax campaign.
01:00:20.260 | I think that's all possible.
01:00:22.700 | Nonetheless, the data are about the original clinical trial.
01:00:27.060 | And the emergency use authorization
01:00:32.180 | was based on the first few months
01:00:34.700 | of the data from that trial.
01:00:36.220 | And it was a two year trial.
01:00:37.820 | The rest of that data has not been opened up.
01:00:40.180 | And there was not an advisory committee meeting
01:00:43.500 | to look at that data
01:00:44.980 | when the FDA granted full authorization.
01:00:47.400 | Again, I am pro-vaccine.
01:00:49.260 | I am not making an anti-vax argument here.
01:00:52.580 | But I suspect that there's something pretty serious
01:00:56.100 | in that data.
01:00:57.420 | And the reason why I'm not an anti-vaxxer,
01:01:01.000 | having not been able to see the data
that the FDA and Pfizer seem willing not just to put effort into preventing the release of, but to invest quite a bit of energy in not releasing.
01:01:16.500 | The reason why that doesn't tip me over
01:01:18.400 | into the anti-vaxxer side
01:01:20.220 | is because that's clinical trial data,
01:01:22.400 | early clinical trial data
01:01:23.680 | that involved several thousand people.
01:01:25.800 | We now have millions of data points
01:01:28.920 | from people who have had the vaccine.
01:01:31.080 | This is real world data
01:01:32.960 | showing the efficacy of the vaccines.
01:01:35.760 | And so far, knock on wood,
01:01:38.120 | there aren't side effects
01:01:41.200 | that overcome the benefits of vaccine.
01:01:45.120 | So I'm with you.
I've now had, I guess, three shots of the vaccine.
01:01:51.360 | But there's a lot of people that are kind of saying,
01:01:55.760 | well, even the data on the real world use,
01:01:58.960 | large scale data,
01:02:00.360 | is messy.
01:02:05.640 | The way it's being reported,
01:02:06.800 | the way it's being interpreted.
01:02:08.760 | Well, one thing is clear to me
01:02:11.520 | that it is being politicized.
01:02:13.840 | I mean, if you just look objectively,
you don't have to go deep, even at the shallow surface level,
01:02:21.740 | it seems like there's two groups
01:02:25.180 | that, I can't even put a term to it
01:02:29.000 | because it's not really pro-vaccine versus anti-vaccine
01:02:32.160 | 'cause it's pro-vaccine, triple mask,
01:02:37.160 | Democrat, liberal,
01:02:41.160 | and then anti-mandate, whatever those groups are.
01:02:44.680 | I can't quite, 'cause they're changing.
01:02:46.520 | Anti-mask, but not really, but kind of.
01:02:50.360 | So those two groups that feel political in nature,
01:02:53.240 | not scientific in nature,
01:02:54.880 | it's, they're bickering,
01:02:56.840 | and then it's clear that this data is being interpreted
01:03:01.200 | by the different groups differently.
01:03:03.120 | It's very difficult for me as a human being
01:03:07.440 | to understand where the truth lies,
01:03:11.160 | especially given how much money's flying around
01:03:14.040 | on all sides.
01:03:15.360 | So the anti-vaxxers can make a lot of money too.
01:03:19.360 | Let's not forget this.
01:03:20.200 | From the individual perspective,
01:03:22.500 | you can become famous being an anti-vaxxer.
01:03:25.360 | And so there's a lot of incentives on all sides here.
01:03:28.060 | And there's real human emotion and fear
01:03:33.320 | and also credibility.
01:03:35.260 | Scientists don't wanna ruin their reputation
01:03:41.120 | if they speak out in whatever,
01:03:43.120 | like speak their opinion or
01:03:45.960 | they look at some slice of the data
01:03:49.560 | and begin to interpret it in some kind of way.
01:03:51.320 | They're very, it's clear that fear
01:03:53.100 | is dominating the discourse here,
01:03:54.960 | especially in the scientific community.
01:03:57.020 | So I don't know what to make of that.
And the only happy party here is Pfizer.
01:04:05.640 | It's just plowing all ahead.
01:04:08.660 | I mean, with every single variant,
01:04:13.240 | there's very, I would say,
01:04:17.760 | outside of arguably a very flawed system,
01:04:21.900 | there's a lot of incredible scientific
01:04:23.960 | and engineering work being done
01:04:25.760 | in constantly developing new antiviral drugs,
01:04:29.160 | new vaccines to deal with the variants.
01:04:33.360 | So they're happily being a capitalist machine.
01:04:37.520 | And it's very difficult to know what to do with that.
01:04:42.520 | - And let's just put this in perspective for folks.
01:04:46.580 | The best-selling drug in the world
01:04:48.480 | has been Humira for a number of years.
01:04:51.400 | It's approved for the treatment of rheumatoid arthritis
01:04:55.520 | and eight other indications.
01:04:57.760 | And it's sold about $20 billion globally
01:05:02.160 | over the past few years.
01:05:03.840 | It leveled out, it peaked at that level.
01:05:07.160 | Pfizer expects to sell $65 billion of vaccine
01:05:12.160 | in the first two years of the pandemic.
01:05:16.280 | So this is by far the biggest selling
01:05:19.960 | and most profitable drug that's ever come along.
01:05:22.700 | - Can I ask you a difficult question here?
01:05:26.980 | In the fog that we're operating in here,
01:05:31.520 | on the Pfizer-BioNTech vaccine,
01:05:38.660 | what was done well and what was done badly
01:05:43.480 | that you can see now?
01:05:45.840 | It seems like we'll know more decades from now.
01:05:50.080 | - Yes.
01:05:51.360 | - But now in the fog of today,
01:05:53.620 | with the $65 billion flying around,
01:05:58.880 | where do you land?
01:06:03.040 | - So we're gonna get to what I think
01:06:07.000 | is one of the key problems
01:06:08.480 | with the pharmaceutical industry model
01:06:10.960 | in the United States about being profit-driven.
01:06:16.040 | So in 2016, the NIH did the key infrastructure work
01:06:21.040 | to make mRNA vaccines.
01:06:25.220 | That gets left out of the discussion a lot.
01:06:29.320 | And Pfizer-BioNTech actually paid royalties
01:06:32.600 | voluntarily to the NIH.
01:06:36.000 | I don't know how much it was.
01:06:36.880 | I don't think it was a whole lot of money,
01:06:38.520 | but I think they wanted to avoid the litigation
01:06:41.200 | that Moderna got itself into
01:06:43.200 | by just taking that 2016 knowledge
01:06:46.840 | and having that be the foundation of their product.
01:06:50.120 | So Pfizer took that and they did their R&D.
01:06:54.640 | They paid for their R&D, having received that technology.
01:06:59.160 | And when they got the genetic code from China
01:07:03.800 | about the virus, they very quickly made a vaccine
01:07:08.800 | and the vaccine works.
01:07:10.920 | And President Trump, to his credit,
01:07:13.840 | launched Operation Warp Speed
01:07:16.080 | and just threw money at the problem.
01:07:18.160 | They just said, "We spent five times more per person
01:07:22.360 | "than the EU early on.
01:07:24.140 | "Just pay them whatever they want.
01:07:26.800 | "Let's just get this going."
01:07:28.400 | And Americans were vaccinated more quickly.
01:07:32.400 | We paid a lot of money.
01:07:34.200 | The one mistake that I think the federal government made
01:07:37.160 | was they were paying these guaranteed fortunes
01:07:40.640 | and they didn't require that the companies participate
01:07:45.360 | in a program to do global vaccinations.
01:07:49.080 | So the companies, doing their business model,
01:07:53.520 | distributed the vaccines
01:07:55.520 | where they would make the most money.
01:07:57.280 | And obviously, they would make the most money
01:07:59.100 | in the first world.
01:08:00.040 | And almost, I think, 85% of the vaccines early on
01:08:04.360 | went to the first world.
01:08:05.960 | And very, very few vaccinations went to the third world.
01:08:10.560 | So what happened is there was such a low vaccination rate.
01:08:15.560 | In May of 2021, there was an all-hands-on-deck cry for help
01:08:21.480 | from the World Trade Organization,
the World Health Organization, the IMF, and the World Bank; they made a plea for $50 billion
01:08:35.520 | so that we could get to 40% vaccination rate
01:08:39.080 | in the third world by the end of 2021.
01:08:42.320 | And it was unrequited.
01:08:46.560 | Nobody answered.
And now Africa has about an 8.9% vaccination rate.
01:08:53.880 | India's coming up, but it's been very low.
01:08:57.120 | The problem with all this is,
01:09:00.040 | I believe those mRNA vaccines are excellent vaccines.
01:09:04.720 | But if we leave the third world unvaccinated,
01:09:07.960 | we're gonna have a constant supply of variants of COVID
01:09:12.780 | that are gonna come back into the United States
01:09:15.800 | and harm Americans exactly like Delta and Omicron have.
01:09:20.760 | So we've made a great drug.
01:09:22.980 | It reduces the risk of mortality
01:09:25.720 | in Americans who get it by a lot.
01:09:28.400 | But we're not doing what we need to do
01:09:31.060 | to protect Americans from Omicron.
01:09:33.300 | You don't have to be an idealist
01:09:34.760 | and worry about global vaccine equity.
01:09:36.960 | If you're just ordinary selfish people like most of us are,
01:09:41.360 | and you're worried about the health of Americans,
01:09:43.620 | you would ensure global vaccine distribution.
01:09:47.280 | Let me just make one more point.
01:09:49.120 | That $50 billion that was requested
01:09:51.760 | by the four organizations back in May of 2021,
01:09:55.280 | 32 billionaires made $50 billion
01:09:59.320 | from the vaccines at that point,
01:10:01.440 | took it into their private wealth.
So this enormous amount of money
01:10:06.260 | that had been taken into private wealth
01:10:08.380 | was enough to do what those organizations said
01:10:11.620 | needed to be done to prevent the sub-variants
01:10:14.780 | from coming back and doing what they're doing.
01:10:16.660 | - So the money was there, but how does the motivation,
01:10:19.100 | the money-driven motivation of big pharma lead to that,
01:10:22.580 | that kind of allocation of vaccines?
01:10:28.500 | - Because they can make-
01:10:29.860 | - More money in the United States.
01:10:31.380 | - Yeah, they're gonna distribute their vaccines
01:10:33.100 | where they can make the most money.
01:10:34.600 | - Right.
01:10:35.440 | Is there a malevolent aspect to this where,
01:10:39.960 | boy, I don't like saying this,
01:10:44.560 | but that they don't see it as a huge problem
01:10:49.560 | that variants will come back to the United States?
01:10:53.200 | - I think it's the issue we were talking about earlier on,
01:10:56.760 | where they're in a different culture
01:10:58.560 | and their culture is that their moral obligation,
01:11:02.500 | as Milton Friedman would say,
01:11:04.600 | is to maximize the profits
01:11:06.280 | that they return to shareholders.
01:11:07.800 | - And don't think about the bigger picture.
01:11:10.600 | - The collateral damage,
01:11:11.560 | don't think about the collateral damage.
01:11:12.760 | - And also kind of believe, convince yourself
01:11:16.800 | that if we give into this capitalist machine
01:11:20.160 | in this very narrow sense of capitalism,
01:11:23.160 | that in the end, they'll do the most good.
01:11:25.920 | This kind of belief that if we just maximize profits,
01:11:30.480 | we'll do the most good.
01:11:32.640 | - Yeah, that's an orthodoxy of several decades ago,
01:11:36.800 | and I don't think people can really say that in good faith.
01:11:40.200 | When you're talking about vaccinating the third world
01:11:43.720 | so we don't get hurt,
01:11:44.920 | it's a little bit hard to make the argument
01:11:47.280 | that the world's a better place
01:11:48.500 | because the profits of the investors went up.
01:11:51.080 | - Yeah, but at the same time,
01:11:54.800 | I think that's a belief you can hold.
01:11:58.080 | I mean, I've interacted with a bunch of folks
01:12:00.220 | that kind of, it's the,
01:12:02.720 | I don't wanna mischaracterize Ayn Rand, okay?
01:12:05.520 | I respect a lot of people,
01:12:07.380 | but there's a belief that can take hold.
01:12:10.120 | If I just focus on this particular maximization,
01:12:13.960 | it will do the most good for the world.
01:12:16.080 | The problem is when you choose what to maximize
01:12:19.240 | and you put blinders on,
it's too easy to start making gigantic mistakes that have a big negative impact on society.
01:12:28.160 | So it really matters what you're maximizing.
01:12:30.720 | - Right, and if we had a true democracy
01:12:33.720 | and everybody had one vote,
01:12:35.300 | everybody got decent information and had one vote,
01:12:39.600 | Ayn Rand's position would get some votes, but not many.
01:12:44.080 | And it would be way outvoted by the common people.
01:12:47.840 | - Let me ask you about this very difficult topic
talking to Mark Zuckerberg of Meta,
01:12:58.880 | the topic of censorship.
01:13:03.040 | I don't know if you've heard,
01:13:04.780 | but there's a guy named Robert Malone and Peter McCullough
01:13:08.960 | that were removed from many platforms
01:13:10.880 | for speaking about the COVID vaccine as being risky.
01:13:14.200 | They were both on Joe Rogan's program.
01:13:16.800 | What do you think about censorship in this space?
01:13:23.600 | In this difficult space where so much is controlled by,
01:13:28.600 | not controlled, but influenced by advertisements
01:13:31.560 | from big pharma.
01:13:32.540 | And science can even be influenced by big pharma.
01:13:38.120 | Where do you lean on this?
01:13:41.280 | Should we allow, should we lean towards freedom
01:13:46.280 | and just allow all the voices,
01:13:50.120 | even those that go against the scientific consensus?
01:13:54.560 | Is that one way to fight the science
01:13:59.560 | that is funded by big pharma?
01:14:01.760 | Or is that do more harm than good,
01:14:05.360 | having too many voices that are contending here?
01:14:08.480 | Should the ultimate battle be fought
01:14:10.640 | in the space of scientific publications?
01:14:15.160 | - And particularly in the era of COVID,
01:14:19.360 | where there are large public health ramifications
01:14:22.600 | to this public discourse, the ante is way up.
01:14:27.440 | So I don't have a simple answer to that.
01:14:30.020 | I think everyone's allowed their own opinion.
01:14:34.800 | I don't think everyone's allowed their own scientific facts.
01:14:38.680 | And how we develop a mechanism
01:14:42.240 | that's other than an open internet
01:14:45.360 | where whoever is shouting the loudest
01:14:48.140 | gets the most clicks and the rage creates value
01:14:52.880 | on the internet.
01:14:54.240 | I think that's not a good mechanism for working this out.
01:14:58.200 | And I don't think we have one.
01:14:59.760 | I don't have a solution to this.
01:15:01.800 | I mean, ideally, if we had a philosopher king,
01:15:05.300 | we could have a panel of people
01:15:08.720 | who were not conflicted by rigid opinions
01:15:13.880 | decide on what the boundaries of public discourse might be.
01:15:18.780 | I don't think it should be fully open.
01:15:21.740 | I don't think people who are making,
01:15:24.380 | who are committed to an anti-vaccine position
01:15:28.300 | and will tailor their interpretation
01:15:31.000 | of complex scientific data to support their opinion,
01:15:34.900 | I think that can be harmful.
01:15:36.780 | Constraining their speech can be harmful as well.
01:15:39.260 | So I don't have an answer here, but yeah.
01:15:42.180 | - I tend to believe that it's more dangerous
01:15:45.760 | to censor anti-vax messages.
01:15:49.320 | The way to defeat anti-vax messages
01:15:53.360 | is by being great communicators,
01:15:56.440 | by being great scientific communicators.
01:15:58.320 | So it's not that we need to censor the things we don't like.
01:16:03.320 | We need to be better at communicating the things we do like
01:16:08.240 | or the things that we do believe represent
01:16:10.760 | the deep scientific truth.
01:16:13.920 | Because I think if you censor,
01:16:18.400 | you get worse at doing science
01:16:20.600 | and you give the wrong people power.
01:16:24.860 | So I tend to believe that you should give power
01:16:30.980 | to the individual scientists
01:16:33.400 | and also give them the responsibility
01:16:35.720 | of being better educators, communicators,
01:16:38.880 | expressors of scientific ideas,
01:16:41.680 | put pressure on them to release data,
01:16:43.480 | to release that data in a way that's easily consumable,
01:16:46.800 | not just like very difficult to understand,
01:16:49.200 | but in a way that it can be understood
01:16:50.760 | by a large number of people.
01:16:52.500 | So the battle should be fought in the open space of ideas
01:16:57.160 | versus in the quiet space of journals.
01:17:02.160 | I think we no longer have that comfort,
01:17:05.960 | especially at the highest of stakes.
01:17:08.280 | So this kind of idea that a couple of peer reviewers
01:17:11.600 | decide the fate of billions
01:17:14.280 | doesn't seem to be sustainable,
01:17:18.920 | especially given a very real observation now
01:17:23.720 | that the reason Robert Malone has a large following
01:17:29.600 | is there's a deep distrust of institutions,
01:17:32.980 | deep distrust of scientists, of science as an institution,
01:17:37.720 | of power centers, of companies, of everything,
01:17:41.360 | and perhaps rightfully so.
01:17:43.940 | But the way to defend against that
01:17:45.500 | is not for the powerful to build a bigger wall,
01:17:49.740 | it's for the powerful to be authentic
01:17:51.720 | and maybe a lot of them to get fired
01:17:55.760 | and for new minds, for new fresh scientists,
01:17:58.900 | ones who are more authentic, more real,
01:18:01.800 | better communicators to step up.
01:18:03.960 | So I fear censorship
01:18:06.480 | because it feels like censorship is an even harder job
01:18:11.480 | to do it well than being good communicators.
01:18:16.720 | And it seems like it's always the C students
01:18:19.200 | that end up doing the censorship.
01:18:21.360 | That it's like, it's always the incompetent people
01:18:25.080 | and not just the incompetent, but the biggest whiners.
01:18:28.800 | So like what happens is the people
01:18:33.000 | that get the most emotional and the most outraged
01:18:36.520 | will drive the censorship.
01:18:38.220 | And it doesn't seem like reason drives the censorship.
01:18:42.560 | That's just objectively observing
how censorship seems to work in this current moment.
01:18:47.960 | So there's so many forms of censorship.
01:18:50.560 | You know, you look at the Soviet Union
01:18:51.960 | with the propaganda or Nazi Germany,
01:18:54.040 | it's a very different level of censorship.
01:18:55.760 | People tend to conflate all of these things together.
01:18:59.120 | You know, social media trying desperately
01:19:01.640 | to have trillions or hundreds of billions
01:19:05.920 | of exchanges a day and like try to make sure
01:19:09.360 | that their platform has some semblance
01:19:12.880 | of like, quote, healthy conversations.
01:19:16.360 | Like people just don't go insane.
01:19:18.560 | They actually like using the platform
01:19:20.840 | and they censor based on that.
01:19:23.400 | That's a different level of censorship.
01:19:24.920 | But even there, you can really run afoul
01:19:28.040 | of the people that get, the whiny C students
01:19:32.440 | controlling too much of the censorship.
01:19:34.880 | I believe that you should actually put the responsibility
01:19:39.480 | on the self-proclaimed holders of truth,
01:19:42.480 | aka scientists, at being better communicators.
01:19:45.620 | - I agree with that.
01:19:47.600 | I'm not advocating for any kind of censorship,
01:19:51.440 | but Marshall McLuhan was very influential
01:19:55.600 | when I was in college.
01:19:57.200 | And his, that meme, the medium is the message.
01:20:02.200 | It's a little bit hard to understand
01:20:04.840 | when you're comparing radio to TV
01:20:06.840 | and saying radio is hotter or TV is hotter or something.
01:20:09.920 | But we now have the medium is the message
01:20:12.520 | in a way that we've never seen,
01:20:14.240 | we've never imagined before,
01:20:16.240 | where rage and anger and polarization
01:20:22.820 | are what drives the traffic on the internet.
01:20:27.820 | And we don't, it's a question of building the commons.
01:20:33.220 | Ideally, I don't know how to get there,
01:20:36.060 | so I'm not pretending to have a solution.
01:20:38.460 | But the commons of discourse about this particular issue
01:20:42.300 | about vaccines has been largely destroyed by the edges,
01:20:47.220 | by the drug companies and the advocates on the one side
01:20:50.100 | and the people who just criticize and think
01:20:54.740 | that even though the data are flawed,
01:20:57.780 | that there's no way vaccines can be beneficial.
01:21:01.060 | And to have those people screaming at each other
01:21:04.140 | does nothing to improve the health
01:21:07.260 | of the 95% of the people in the middle
01:21:10.740 | who want to know what the rational way to go forward is
01:21:15.740 | and protect their families from COVID
01:21:18.620 | and live a good life and be able to participate
01:21:21.500 | in the economy.
01:21:22.620 | And that's the problem.
01:21:25.260 | I don't have a solution.
01:21:26.460 | - Well, there's a difficult problem for Spotify and YouTube.
01:21:29.580 | I don't know if you heard,
01:21:30.420 | this is a thing that Joe Rogan is currently going through
01:21:33.420 | as a platform, whether to censor the conversation
01:21:36.740 | that, for example, Joe's having.
01:21:39.180 | So I don't know if you heard,
01:21:40.140 | but Neil Young and other musicians have kind of spoke out
01:21:43.940 | and saying they're going to leave the platform
01:21:45.820 | because Joe Rogan is allowed to be on this platform
01:21:49.780 | having these kinds of conversations
01:21:51.380 | with the likes of Robert Malone.
01:21:53.020 | And it's clear to me that Spotify and YouTube
01:21:57.820 | are being significantly influenced
01:21:59.860 | by these extreme voices, I can mention on each side.
01:22:03.420 | And it's also clear to me that Facebook is the same
and that they're going back and forth.
01:22:07.700 | In fact, that's why Facebook has been oscillating
01:22:10.340 | on the censorship is like one group gets louder
01:22:12.660 | than the other, depending on whether
01:22:14.980 | it's an election year.
01:22:16.260 | There's several things to say here.
01:22:21.220 | So one, it does seem, I think you put it really well,
01:22:24.620 | it would be amazing if these platforms could find mechanisms
01:22:27.620 | to listen to the center, to the big center
01:22:32.060 | that's actually going to be affected by the results
01:22:35.460 | of our pursuit of scientific truth, right?
01:22:39.360 | And listen to those voices.
01:22:42.140 | I also believe that most people are intelligent enough
01:22:45.820 | to process information and to make up their own minds.
01:22:49.380 | Like they're not in terms of,
01:22:51.720 | it's complicated, of course,
01:22:55.260 | 'cause we've just been talking about advertisement
01:22:57.140 | and how people can be influenced.
01:22:58.920 | But I feel like if you have raw long form podcasts
01:23:03.920 | or programs where people express their mind
01:23:08.460 | and express their argument in full,
01:23:12.420 | I think people can hear it to make up their own mind.
01:23:15.540 | And if those arguments have a platform
01:23:17.780 | on which they can live, then other people
01:23:20.020 | could provide better arguments if they disagree with it.
01:23:23.820 | And now we as human beings, as rational,
01:23:26.780 | as intelligent human beings can look at both
01:23:29.140 | and make up our own minds.
01:23:30.580 | And that's where social media can be very good
01:23:33.100 | at like this collective intelligence.
01:23:35.940 | We together listen to all of these voices
01:23:39.180 | and make up our own mind.
01:23:40.660 | Humble ourselves actually often.
01:23:42.860 | You know, you think you know, like you're an expert,
01:23:46.700 | say you have a PhD in a certain thing,
01:23:48.620 | so there's this confidence that comes with that.
01:23:50.940 | And the collective intelligence, uncensored,
01:23:54.340 | allows you to humble yourself eventually.
Like as you discover, all it takes is a few times,
01:24:01.180 | you know, looking back five years later,
01:24:05.060 | realizing I was wrong.
01:24:07.260 | And that's really healthy for a scientist,
01:24:09.060 | that's really healthy for anybody to go through.
01:24:11.020 | And only through having that open discourse
01:24:13.900 | can you really have that.
That said, Spotify also, just like Pfizer, is a company,
01:24:20.940 | which is why this podcast,
01:24:26.740 | I don't know if you know what RSS feeds are,
01:24:29.200 | but podcasts can't be censored.
01:24:31.520 | So Joe's in the unfortunate position,
01:24:33.260 | he only lives on Spotify.
01:24:35.340 | So Spotify has been actually very good
01:24:37.900 | at saying we're staying out of it for now.
01:24:40.660 | But RSS, this is pirate radio.
01:24:44.780 | Nobody can censor, it's the internet.
01:24:47.040 | So financially, in terms of platforms,
01:24:51.860 | this cannot be censored,
01:24:53.620 | which is why podcasts are really beautiful.
01:24:56.740 | And so if Spotify or YouTube
01:25:00.180 | wants to be the host of podcasts,
01:25:04.240 | I think where they flourish is free expression,
01:25:09.240 | no matter how crazy.
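To make the RSS point concrete: a podcast feed is just a public XML file that any client can fetch directly, with each episode's audio linked in an enclosure tag, so no single platform sits between the publisher and the listener. A minimal sketch in Python, with a hypothetical feed URL:

```python
# Minimal sketch of how podcast distribution over RSS works: the "feed" is just a
# public XML document that any client can fetch and parse. The URL is hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast/feed.xml"  # hypothetical feed address

with urllib.request.urlopen(FEED_URL) as resp:
    feed_xml = resp.read()

channel = ET.fromstring(feed_xml).find("channel")

# Each <item> is one episode; its <enclosure> tag points straight at the audio file,
# so any podcast app can download it without going through a central platform.
for item in channel.findall("item"):
    title = item.findtext("title")
    enclosure = item.find("enclosure")
    audio_url = enclosure.get("url") if enclosure is not None else None
    print(title, "->", audio_url)
```

Any app that speaks this format can subscribe to the same file, which is why a show published this way can't be taken down by any one distributor.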
01:25:12.860 | - Yes, but I do wanna push back a little bit
01:25:16.540 | on what you're saying.
So I have anti-vax friends who I love.
01:25:23.080 | I mean, they're dear, cherished friends.
01:25:26.120 | And they'll send me stuff.
01:25:28.660 | And it'll take me an hour to go through what they sent
01:25:33.180 | to see if it is credible.
01:25:36.580 | And usually it's not.
01:25:40.520 | It's not a random sample of the anti-vax argument.
01:25:42.820 | I'm not saying I can disprove the anti-vax argument,
01:25:46.760 | but I am saying that it's almost like
01:25:49.140 | we were talking about how medical science, clinical trials,
01:25:54.140 | the presentation of clinical trials to physicians
01:25:56.740 | could be improved.
01:25:57.940 | And the first thing we came up with
01:26:00.500 | is to have pre-publication transparency
01:26:04.380 | in the peer review process.
01:26:06.180 | So bad information, biased information,
01:26:08.660 | doesn't get out as if it's legitimate
01:26:11.720 | and you can't put it back, recapture it once it gets out.
01:26:15.280 | I think there's an element of that
01:26:18.380 | in the arguments that are going on about vaccines.
01:26:21.940 | And they're on both sides,
01:26:23.180 | but I think the anti-vax side
01:26:25.780 | puts out more units of information
01:26:30.180 | claiming to show that the vaccines don't work.
01:26:33.460 | And I guess in an ideal situation,
01:26:36.500 | there would be real-time fact-checking by independent people,
01:26:41.100 | not to censor it, but to just say
01:26:43.780 | that study was set up to do this
01:26:45.860 | and this is what the conclusions were.
01:26:47.940 | So the way it was stated is on one side of this argument.
01:26:52.420 | - But that's what I'm arguing.
01:26:53.700 | I agree with you.
01:26:55.060 | What I'm arguing is that this big network of humans
01:26:58.500 | that we have that is the collective intelligence
01:27:00.940 | can do that real-time if you allow it to,
01:27:03.980 | if you encourage people to do it.
01:27:05.780 | And the scientists, as opposed to,
01:27:07.860 | listen, I interact with a lot of colleagues,
01:27:10.380 | a lot of friends that are scientists,
01:27:12.520 | they roll their eyes.
01:27:14.060 | Their response is like, ugh.
01:27:16.500 | Like they don't want to interact with this.
01:27:18.820 | But that's just not the right response.
01:27:22.900 | When a huge number of people believe this,
01:27:26.420 | it is your job as communicators to defend your ideas.
01:27:30.020 | It is no longer the case that you go to a conference
01:27:33.140 | and defend your ideas to two other nerds
01:27:36.420 | that have been working on the same problem forever.
01:27:38.580 | I mean, sure, you can do that,
01:27:40.300 | but then you're rejecting the responsibility
01:27:44.060 | you have explicitly or implicitly accepted
01:27:48.060 | when you go into this field,
01:27:49.820 | that you will defend the ideas of truth
01:27:52.620 | and the way to defend them
01:27:54.180 | is in the open battlefield of ideas
01:27:56.820 | and to become a better communicator.
01:27:59.460 | And I believe that when you have a large,
you said you invested one or two hours in this particular case, but that's little ants interacting at scale,
01:28:07.820 | I think that allows us to progress towards truth,
01:28:12.060 | at least, you know, at least I hope so.
01:28:14.580 | - I think you're an optimist.
01:28:15.900 | I want to work with you a little bit on this.
01:28:18.060 | (Lex laughing)
01:28:18.940 | Let's say a person like Joe Rogan,
01:28:22.500 | who by the way, had me on his podcast and let me-
01:28:26.100 | - It was an amazing conversation.
01:28:27.260 | I really enjoyed it.
01:28:28.100 | - Well, thank you.
01:28:29.060 | I did too.
01:28:30.020 | And I didn't know Joe.
01:28:31.540 | I didn't know much about his podcast.
01:28:32.900 | - He pushed back on Joe a bunch, which is great.
01:28:34.940 | (Lex laughing)
01:28:35.780 | - And he was- - I love it.
01:28:36.620 | - He was a gentleman and we had it out.
01:28:38.460 | In fact, he put one clip,
01:28:40.620 | at one point he said something that was a little bit wrong
01:28:43.140 | and I corrected him.
01:28:44.300 | And he had the guy who- - Jamie.
01:28:46.980 | - Jamie, he had Jamie check it
01:28:48.940 | and was very forthright in saying,
01:28:51.220 | yeah, you know, John got it right here.
01:28:53.580 | We got to modify this.
01:28:54.860 | In any event.
01:28:55.700 | (Lex laughing)
01:28:56.540 | In any event. - You got him.
01:28:57.380 | (Lex laughing)
01:28:58.220 | - Well, I wasn't trying to get him.
01:29:00.020 | I was just trying to- - No, no, no, no.
01:29:01.620 | Totally, it was a beautiful exchange.
01:29:03.380 | There was so much respect in the room,
01:29:04.880 | pushing back and forth.
01:29:05.940 | It was great. - Yeah.
01:29:07.020 | So I respect him.
01:29:08.980 | And I think when he has somebody on
01:29:13.180 | who's a dyed in the wool anti-vaxxer,
01:29:16.720 | the question is how can you balance,
01:29:21.680 | if it needs balance, in real time?
01:29:24.460 | I'm not talking about afterwards.
01:29:26.300 | I'm talking in real time.
01:29:27.720 | Maybe you record, well, he does record it, obviously,
01:29:30.820 | but maybe when there's a statement made
01:29:33.760 | that is made as if it's fact-based,
01:29:38.040 | maybe that statement should be checked by some folks
01:29:43.680 | who, imaginary folks who are trustworthy.
01:29:48.140 | And in real time, as that discussion
01:29:51.680 | is being played on the podcast,
01:29:54.240 | to show what independent experts say about that claim.
01:29:59.120 | - That's a really interesting idea.
01:30:00.240 | By the way, for some reason,
01:30:01.640 | this idea popped into my head now is,
01:30:04.000 | I think real time is very difficult.
01:30:05.640 | And it's not difficult,
01:30:07.320 | but it kind of ruins the conversation
01:30:09.400 | 'cause you want the idea to breathe.
01:30:10.840 | - Yeah.
01:30:11.840 | I think what's very possible is before it's published,
01:30:15.200 | it's the pre-publication,
01:30:17.020 | before it's published, you let a bunch of people review it
01:30:20.400 | and they can add their voices in post before it's published.
01:30:25.280 | They can add arguments,
01:30:27.180 | arguments against certain parts.
01:30:31.520 | That's very interesting to sort of,
01:30:32.880 | as one podcast, publish addendums.
01:30:37.320 | Publish the peer review together with the publication.
01:30:39.880 | - Yes.
01:30:40.720 | - That's very interesting.
01:30:41.920 | I might actually do that.
01:30:44.160 | That's really interesting.
01:30:45.280 | 'Cause I've been doing more debates
01:30:47.160 | where you at the same time have multiple people,
01:30:51.740 | which has a different dynamic because both people,
01:30:54.800 | I mean, it's really nice to have the time to pause
01:30:58.840 | just by yourself to fact check,
01:31:02.080 | to look at the study that was mentioned,
01:31:04.160 | to understand what's going on.
01:31:05.640 | So the peer review process, to have a little bit of time.
01:31:09.520 | That's really interesting.
01:31:10.440 | I actually would, I'd like to try that.
01:31:14.400 | To agree with you on some point in terms of anti-vax,
01:31:17.760 | I've been fascinated by listening to arguments
01:31:20.680 | from this community of folks that's been quite large
01:31:23.840 | called the Flat Earthers,
01:31:25.360 | the people that believe the Earth is flat.
01:31:28.080 | And I don't know if you've ever listened to them
01:31:30.920 | or read their arguments,
01:31:33.760 | but it's fascinating how consistent
01:31:36.160 | and convincing it all sounds
01:31:37.800 | when you just kind of take it in.
01:31:39.600 | Just like, just take it in like listening normally.
01:31:43.720 | It's all very logical.
01:31:45.340 | Like if you don't think very, well, no.
01:31:49.920 | So the thing is, the reality is
01:31:55.560 | at the very basic human level
01:31:57.280 | with our limited cognitive capabilities,
01:32:00.600 | the Earth is pretty flat when you go outside
01:32:03.680 | and you look, it's flat.
01:32:04.880 | So like when you use common sense reasoning,
01:32:08.040 | it's very easy to play to that
01:32:09.960 | to convince you that the Earth is flat.
01:32:12.080 | Plus there's powerful organizations
01:32:13.640 | that want to manipulate you and so on.
01:32:16.260 | But then there's the whole progress of science
01:32:20.920 | and physics of the past,
01:32:22.600 | but that's difficult to integrate into your thought process.
01:32:26.120 | So it's very true that people should listen
01:32:29.640 | to Flat Earthers because it was very revealing to me
01:32:33.400 | how easily it is,
01:32:35.120 | how easy it is to be convinced of basically anything
01:32:37.920 | by charismatic arguments.
01:32:42.440 | - Right, and if we're arguing
01:32:44.280 | about whether the Earth is flat or not,
01:32:46.920 | as long as we're not navigating airplanes
01:32:48.760 | and doing other kinds of things,
01:32:49.920 | trying to get satellites to do transmission,
01:32:53.800 | it's not that important, what I believe.
01:32:56.200 | But if we're arguing about how we approach
01:32:59.480 | the worst public health crisis in,
01:33:02.400 | I don't know how long,
01:33:03.320 | I think we're getting worse than the Spanish flu now.
01:33:06.360 | I don't know what the total global deaths
01:33:07.800 | with Spanish flu were,
01:33:08.680 | but in the United States,
01:33:10.120 | we certainly have more deaths than we had from Spanish flu.
01:33:12.440 | - Plus the economic pain and suffering.
01:33:14.720 | - Yes, yes, and the damage to the kids, school and so forth.
01:33:19.680 | We got a problem and it's not going away, unfortunately.
01:33:23.060 | So when we get a problem like that,
01:33:25.000 | it's not just an interesting bar room conversation
01:33:28.520 | about whether the Earth is flat.
01:33:30.720 | There are millions of lives involved.
01:33:33.580 | - Let me ask you yet another question,
01:33:36.460 | an issue I raised with Pfizer CEO Albert Bourla.
01:33:40.260 | It's the question of revolving doors,
01:33:45.340 | that there seems to be a revolving door
01:33:47.420 | between Pfizer, FDA and CDC.
01:33:51.060 | People that have worked at the FDA
01:33:53.220 | now work at Pfizer and vice versa,
01:33:56.420 | including the CDC and so on.
01:33:58.620 | What do you think about that?
01:34:01.700 | - So first of all, his response once again is,
01:34:04.060 | there's rules, there's very strict rules
01:34:06.380 | and we follow them.
01:34:07.460 | Do you think that's a problem?
01:34:10.260 | - Hoo-ha.
01:34:11.980 | - And also maybe this is a good time to talk about
01:34:16.220 | this Pfizer play by the rules.
01:34:18.220 | - One at a time.
01:34:20.340 | - One at a time.
01:34:21.180 | - Okay, and this isn't even about Pfizer,
01:34:22.700 | but it's an answer to the question.
01:34:24.300 | - Yes.
- So there's this drug, Aduhelm,
01:34:27.500 | that was approved by the FDA maybe six months ago.
01:34:31.340 | It's a drug to prevent the progression
01:34:34.940 | of low-grade Alzheimer's disease.
01:34:37.160 | The target for drug development for Alzheimer's disease
01:34:43.340 | has been the amyloid, reducing the amyloid plaques
01:34:47.020 | in the brain, which correlate with the progression
01:34:50.300 | of Alzheimer's.
And Biogen showed that its drug, Aduhelm,
01:34:57.140 | reduces amyloid plaques in the brain.
01:35:01.020 | They did two clinical trials
01:35:03.060 | to determine the clinical efficacy,
01:35:05.620 | and they found that neither trial
01:35:07.900 | showed a meaningful benefit.
01:35:09.980 | And in those two trials, 33% more people
in the Aduhelm group developed symptomatic brain swelling
01:35:18.080 | and bleeding than people in the placebo group.
01:35:20.820 | There was an advisory committee convened
01:35:25.940 | to debate and determine how they felt
about the approvability of Aduhelm, given those facts.
01:35:34.260 | And those facts aren't in dispute.
01:35:37.140 | They're in Biogen slides, as well as FDA documents.
01:35:41.580 | The advisory committee voted 10 against approval
with one abstention.
So that's an essentially universal, unanimous vote against approving Aduhelm.
01:35:56.260 | Now, the advisory committees have been
01:35:58.280 | pretty much cleansed of financial conflicts of interest.
01:36:03.280 | So this advisory committee votes 10 no, one abstention,
01:36:08.280 | and the FDA overrules the unanimous opinion
01:36:13.180 | of its advisory committee and approves the drug.
01:36:16.180 | Three of the members of the advisory committee resign.
01:36:21.360 | They say, "We're not gonna be part,
01:36:22.320 | "if the FDA's not gonna listen to a unanimous vote
01:36:24.700 | "against approving this drug,
01:36:26.620 | "which shows more harm than benefit, undisputed.
01:36:30.520 | "We're not gonna participate in this."
01:36:33.820 | And the argument against approval
is that the surrogate endpoint, the reduction of amyloid plaques,
01:36:43.340 | is known by the FDA not to be a valid clinical indicator.
01:36:48.020 | It doesn't correlate.
01:36:49.140 | 27 studies have shown it doesn't correlate
01:36:52.000 | with clinical progression.
01:36:53.340 | Interrupting the amyloid plaques doesn't mean
01:36:55.720 | that your Alzheimer's doesn't get worse.
01:37:00.260 | So it seems like it's a slam dunk and the FDA made a mistake
01:37:06.920 | and they should do whatever they do
01:37:09.360 | to protect their bureaucratic reputation.
01:37:12.040 | So the head of the Bureau of the FDA,
01:37:15.280 | the Center for Drug Evaluation and Research
01:37:17.320 | that approves new drugs, who had spent 16 years
01:37:21.880 | as an executive in the pharmaceutical industry,
01:37:25.280 | issued a statement and said,
01:37:28.000 | "What we should do in this situation
01:37:30.900 | "is to loosen the prohibition of financial ties of interest
01:37:35.900 | "with the drug companies
01:37:38.520 | "so we get less emotional responses."
01:37:41.340 | Said this, it's in print.
01:37:46.140 | - People are just too emotional about this.
01:37:51.520 | - People were just too emotional.
01:37:52.920 | The 10 people who voted against it
and the zero people who voted for it, it's all too emotional.
01:37:58.440 | So this gets back,
01:38:00.000 | this is a long answer to your short question.
01:38:02.560 | I think this is a wonderful window
01:38:04.880 | into the thinking of the FDA
01:38:07.120 | that financial conflicts of interest don't matter
01:38:11.120 | in a situation when I think it's obvious
01:38:13.240 | that they would matter.
01:38:14.960 | - But there's not a direct financial conflict of interest.
01:38:18.000 | It's kind of, like it's not,
01:38:21.160 | like Albert said, there's rules.
01:38:26.080 | I mean, you're not allowed
01:38:27.160 | to have direct financial conflicts of interest.
01:38:29.720 | It's indirect.
01:38:32.240 | - Right, but what I'm saying is,
01:38:34.520 | I'm not denying what he said is true,
01:38:36.540 | but the FDA, a high official in the FDA
01:38:41.960 | is saying that we need to allow conflicts of interest
01:38:45.440 | in our advisory committee meetings.
01:38:47.220 | - Wow.
01:38:49.280 | - And that, she wants to change the rules.
01:38:53.320 | - Right.
- So Albert Bourla would still be playing by the rules,
01:38:58.040 | but it just shows how one-sided the thinking here is.
01:39:03.040 | - But you think that's influenced by the fact
01:39:05.280 | that there were pharmaceutical executives
01:39:07.360 | working at the FDA and vice versa.
01:39:09.920 | - And they think that's a great idea.
01:39:11.760 | - Who gets to fix this?
01:39:14.520 | Do you think it should be just banned?
01:39:16.480 | Like if you worked--
01:39:17.320 | - I don't know, two separate questions.
01:39:19.040 | One is, should the officials at the FDA
01:39:22.280 | come from pharma and vice versa?
01:39:24.800 | - Yes.
01:39:25.640 | - That's one question.
01:39:26.460 | And the other question is,
01:39:27.400 | should advisory committee members be allowed
01:39:29.400 | to have financial conflicts of interest?
01:39:31.680 | - Yes.
01:39:33.120 | - I think, in my opinion, and people might say I'm biased,
01:39:38.120 | I think advisory committee people
01:39:40.320 | should not have conflicts of interest.
01:39:42.080 | I think their only interest ought to be the public interest.
01:39:44.880 | And that was true from my understanding of the situation.
It's in the afterword of my book; I spent some time studying Aduhelm.
01:39:54.200 | I think it's a slam dunk
01:39:55.640 | that there ought to be no conflicts of interest.
01:39:57.680 | Now, the head of CDER,
01:39:59.760 | Center for Drug Evaluation Research,
01:40:01.400 | thinks that that's gonna give you a biased result
01:40:04.660 | because we don't have company influence.
01:40:07.380 | And that, I think, shows how biased their thinking is,
01:40:14.360 | that not having company influence is a bias.
01:40:17.340 | - Let me try to load that in.
01:40:21.200 | I'm trying to empathize with the belief
01:40:23.320 | that companies should have a voice at the table.
01:40:26.760 | I mean, yeah, it's part of the game.
01:40:30.440 | They've convinced themselves
01:40:31.400 | that this is how it should be played.
01:40:33.200 | - But they have a voice at the table.
01:40:36.320 | They've designed the studies.
01:40:37.760 | - Right, that's their voice.
01:40:39.360 | That's the whole point. - They've analyzed the data.
01:40:40.960 | I mean, what bigger voice do you deserve?
01:40:43.080 | - But I do also think, on the more challenging question,
01:40:47.080 | I do think that there should be a ban.
01:40:50.200 | If you work at a pharmaceutical company,
01:40:53.640 | you should not be allowed to work at any regulatory agency.
01:40:57.640 | - Yes.
01:41:01.520 | - You should not, I mean, that, going back and forth,
01:41:03.960 | it just, even if it's 30 years later.
01:41:06.960 | - I agree, and I have another nomination for a ban.
01:41:11.020 | We're in this crazy situation where Medicare
01:41:13.760 | is not allowed to negotiate the price of drugs
01:41:16.440 | with the drug companies.
01:41:17.780 | So the drug companies get a patent on a new drug.
01:41:20.820 | Unlike every other developed country,
01:41:22.440 | they can charge whatever they want.
01:41:24.000 | So they have a monopoly on a utility
01:41:27.800 | 'cause no one else can make the drug.
01:41:29.560 | Charge whatever they want, and Medicare has to pay for it.
01:41:32.000 | And you say, how did we get in this crazy situation?
01:41:35.640 | So how we got here is that in 2003,
01:41:39.600 | when Medicare Part D was passed,
Billy Tauzin was head of the Ways and Means Committee
01:41:45.680 | in the House, played a key role in ushering this through
01:41:48.960 | with the non-negotiation clause of it.
01:41:52.440 | And after it was passed,
Billy Tauzin did not finish out his term in Congress.
He went to PhRMA for a $2 million a year job.
01:42:00.400 | - This is, this is incredible.
01:42:05.180 | - You might think that a ban on that would be a good idea.
01:42:09.480 | - I spoke with Francis Collins,
01:42:11.000 | head of the NIH on this podcast.
01:42:13.720 | He and NIH have a lot of power over funding in science.
01:42:18.720 | What are they doing right?
01:42:23.520 | What are they doing wrong?
01:42:24.960 | In this interplay with big pharma,
01:42:28.760 | how connected are they?
01:42:30.420 | Again, returning to the question,
01:42:33.760 | what are they doing right?
01:42:35.480 | What are they doing wrong, in your view?
01:42:37.680 | - So my knowledge of the NIH is not as granular
01:42:41.160 | as my knowledge of pharma.
01:42:43.240 | That said, in broad brushstrokes,
01:42:47.520 | the NIH is doing the infrastructure work
01:42:51.200 | for all drug development.
01:42:53.400 | I think they've participated in 100% of the drugs
01:42:56.680 | that have been approved by the FDA
01:42:58.880 | over the past 10 years or so.
01:43:00.560 | They've done infrastructure work.
01:43:03.080 | And what they do is not work on particular drugs,
01:43:08.080 | but on drug targets,
01:43:12.320 | targets in the human body that can be affected by drugs
01:43:16.880 | and might be beneficial to turn on or off.
01:43:21.520 | And then,
01:43:23.320 | when they find a target that is mutable
01:43:26.360 | and potentially beneficial,
01:43:29.360 | the drug companies can take the research
01:43:32.040 | and choose to invest in the development
01:43:34.640 | of a specific drug.
01:43:35.660 | That's our model.
01:43:38.360 | Now, 96% of the research that's done in clinical trials
01:43:43.360 | in the United States is about drugs and devices.
01:43:47.280 | And only a fraction of the 4% that's left over
01:43:49.960 | is about preventive medicine
01:43:51.760 | and how to make Americans healthier.
01:43:54.480 | I think, again, from the satellite view,
01:43:58.520 | the NIH is investing more in science
01:44:03.520 | that can lead to commercial development.
01:44:07.280 | And as you said at the beginning of the podcast,
01:44:10.160 | there's no big fitness and lifestyle industry
01:44:13.480 | that can counter pharma.
01:44:15.960 | So I think at the NIH level, that countering can be done.
01:44:19.700 | And the Diabetes Prevention Program study
01:44:22.520 | that we talked about before,
01:44:23.640 | where lifestyle was part of a randomized trial
01:44:26.480 | and was shown to be more effective than metformin
01:44:28.880 | at preventing the development of diabetes,
01:44:30.960 | that is absolute proof positive
01:44:34.440 | that investing in that kind of science
01:44:36.120 | can produce good results.
01:44:37.880 | So I think that we're aimed at drug development
01:44:42.880 | and what we ought to be aimed at
01:44:44.840 | is an epidemiological approach
01:44:47.680 | to improving the health of all Americans.
01:44:49.880 | We rank 68th in the world in healthy life expectancy,
01:44:55.040 | despite spending an extra trillion and a half dollars a year.
01:44:58.040 | And I believe strongly that the reason why we've gotten
01:45:03.960 | in this crazy position is because the knowledge
01:45:09.200 | that we're producing is about new drugs and devices
01:45:12.440 | and it's not about improving population health.
01:45:15.720 | In this problem, the NIH is the perfect institution
01:45:19.640 | to play a role in rebalancing our research agenda.
01:45:23.120 | - And some of that is on the leadership side
01:45:24.920 | with Francis Collins and Anthony Fauci,
01:45:27.880 | not just speaking about everything
01:45:32.440 | that leads to drug development, vaccine development,
01:45:34.720 | but also speaking about healthy lifestyles
01:45:36.720 | and speaking about health, not just sickness.
01:45:40.760 | - Yes, and investing.
01:45:41.920 | - Investing. - Investing in health.
01:45:43.160 | - I mean, it's like,
01:46:45.300 | one feeds the other.
01:45:48.960 | One, you have to communicate to the public
01:45:51.160 | the importance of investing in health
01:45:53.960 | and that leads to you getting props for investing in health
01:45:57.800 | and then you can invest in health more and more
01:45:59.520 | and that communicates, I mean,
01:46:01.640 | everything that Anthony Fauci says or Francis Collins says
01:46:05.080 | has an impact on scientists.
01:46:07.200 | I mean, it sets the priorities.
01:46:12.080 | It's the sad thing about leaders,
01:46:16.080 | forgive me for saying the word, but mediocre leaders:
01:46:22.080 | they don't see themselves as part of a game.
01:46:25.980 | They don't see the momentum.
01:46:29.920 | It's like a fish in the water.
01:46:31.160 | They don't see the water.
01:46:32.920 | Great leaders stand up and reverse the direction
01:46:36.080 | of how things are going
01:46:37.120 | and I actually put a lot of responsibility,
01:46:39.920 | some people say too much, but whatever.
01:46:43.520 | I think leaders carry the responsibility.
01:46:46.440 | I put a lot of responsibility on Anthony Fauci
01:46:48.800 | and Francis Collins for not actually speaking
01:46:51.360 | a lot more about health and, bigger than that,
01:46:55.880 | for not inspiring people in the power
01:47:00.880 | and the trustworthiness of science.
01:47:05.540 | You know, that's on the shoulders of Anthony Fauci.
01:47:11.720 | - I'm gonna abstain from that 'cause I'm not expert enough.
01:47:15.800 | - Neither am I, but I'm opinionated.
01:47:17.780 | - I am too, but not on camera.
01:47:21.080 | - Yes. (laughs)
01:47:22.520 | - No, but seriously, the problem is pretty simple,
01:47:27.200 | that we're investing 96% of our funding
01:47:31.400 | of clinical research in drugs and devices
01:47:33.520 | and 80% of our health is determined
01:47:36.840 | by how we live our lives.
01:47:38.120 | - Yes.
01:47:39.180 | - And this is ridiculous.
01:47:41.720 | The United States is going further and further
01:47:45.640 | behind the other wealthy countries in terms of our health.
01:47:49.780 | We ranked 38th in healthy life expectancy in 2000,
01:47:53.600 | and now we're spending a trillion and a half dollars extra,
01:47:57.040 | and we rank 68th.
01:47:58.360 | We've gone down.
01:47:59.520 | - You have this excellent, there's a few charts
01:48:02.400 | that I'll overlay that tell the story
01:48:06.440 | in really powerful ways.
01:48:09.720 | So one is the healthcare spending as percentage of GDP
01:48:13.600 | that on the x-axis is years and the y-axis is percentage,
01:48:17.840 | and the United States as compared to other countries
01:48:20.820 | on average has been much larger and growing.
01:48:25.820 | - Right, we are now spending 7% more of our GDP,
01:48:30.540 | 17.7% versus 10.7% on healthcare.
01:48:35.220 | 7%, and I think GDP is the fairest way
01:48:38.820 | to compare healthcare spending.
01:48:40.700 | Per person in dollars,
01:48:43.460 | the difference is even greater,
01:48:45.540 | but other costs vary with GDP,
01:48:48.200 | so let's stick with the conservative way to do it.
01:48:50.800 | - 17.7 or 18% of GDP,
01:48:55.800 | 18% of GDP spent on healthcare,
01:49:00.740 | 7% higher than the comparable country average.
01:49:04.800 | - Right.
01:49:05.640 | - 17.7% versus 10.7, 7% higher.
01:49:09.960 | - Right, and 7% of $23 trillion GDP
01:49:15.140 | is more than $1.5 trillion a year in excess.
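A rough back-of-the-envelope sketch of the arithmetic quoted above, using the spending shares from the conversation and an assumed US GDP of roughly $23 trillion:

```python
# Back-of-the-envelope check of the excess healthcare spending figure discussed above.
# The spending shares are as quoted in the conversation; the GDP value is an assumption.
us_share = 0.177        # US healthcare spending as a share of GDP (~17.7%)
peer_share = 0.107      # comparable-country average (~10.7%)
gdp_dollars = 23e12     # assumed US GDP of roughly $23 trillion

excess_share = us_share - peer_share          # 7 percentage points of GDP
excess_spending = excess_share * gdp_dollars  # roughly $1.6 trillion per year
print(f"Excess healthcare spending: ${excess_spending / 1e12:.2f} trillion per year")
```

Run as written, this prints about $1.61 trillion per year, consistent with the "more than $1.5 trillion a year in excess" figure.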
01:49:19.040 | - And then you have another chart
01:49:20.520 | that shows healthcare system performance
01:49:23.440 | compared to spending,
01:49:24.840 | and there's a point cloud of different countries,
01:49:29.800 | the x-axis being healthcare spending
01:49:33.160 | as a percentage of GDP, which we just talked about,
01:49:36.380 | that US is 7% higher than the average,
01:49:40.920 | and then on the y-axis is performance.
01:49:44.520 | So x-axis spending, y-axis performance,
01:49:48.280 | and there's a point cloud,
01:49:49.700 | we'll overlay this if you're watching on YouTube,
01:49:52.400 | of a bunch of countries that have high performance
01:49:57.000 | for what they're spending,
01:50:00.080 | and then US is all alone
01:50:04.640 | on the right bottom side of the chart
01:50:07.500 | where it's low performance and high spending.
01:50:10.760 | - Correct.
01:50:12.880 | - So this is a system whose spending
01:50:17.880 | is directed by the most profitable ways
01:50:21.160 | to deliver healthcare.
01:50:22.480 | - So if you put that in the hands of big pharma
01:50:25.040 | and you maximize for profit,
01:50:27.320 | you're going to decrease performance and increase spending.
01:50:31.620 | - Yes, but I wanna qualify that
01:50:34.320 | and say it's not all big pharma's fault.
01:50:36.320 | They're not responsible for all the problems
01:50:39.320 | in our healthcare system.
01:50:41.200 | They're not responsible for the administrative costs,
01:50:43.200 | for example, but they are the largest component
01:50:47.400 | of our rising healthcare costs,
01:50:51.320 | and it has to do with this knowledge issue.
01:50:54.160 | Controlling the knowledge that doctors have
01:50:56.780 | makes it so that doctors can live with this situation,
01:51:01.240 | believing that it's optimal, when it's a wreck.
01:51:04.880 | - Yeah.
01:51:06.160 | Let me ask you the big, so as a physician,
01:51:10.160 | so everything you've seen, we've talked about 80%
01:51:13.680 | of the impact on health is lifestyle.
01:51:15.960 | How do we live longer?
01:51:20.220 | What advice would you give to general people?
01:51:22.720 | What space of ideas result in living longer
01:51:27.720 | and higher quality lives?
01:51:30.760 | - Right, this is a very simple question to answer.
01:51:33.600 | Exercise for at least a half hour,
01:51:37.840 | at least five times a week.
01:51:40.400 | Number one.
01:51:42.400 | Number two, don't smoke.
01:51:44.200 | Number three, maintain a reasonably healthy body weight.
01:51:49.320 | Some people argue that being lower than a BMI of 25
01:51:53.760 | is healthy.
01:51:54.840 | I think that may be true, but I think getting above 30
01:51:59.280 | is unhealthy, and that ought to be avoided.
01:52:01.940 | Now, that's largely impacted by socioeconomic status,
01:52:07.640 | and we don't want to blame the victims here.
01:52:09.960 | So we got to understand that when we talk about
01:52:12.640 | all of these things, not cigarettes, but exercise
01:52:17.080 | and a good diet and maintaining a healthy body weight,
01:52:21.340 | we have to include in doing those things
01:52:26.160 | the impediments to people of lower socioeconomic status
01:52:32.040 | being able to make those changes.
01:52:34.400 | We've got to understand that personal responsibility
01:52:38.160 | accounts for some of this, but also social circumstances
01:52:42.280 | accounts for some of it.
01:52:44.000 | And back to your fishbowl analogy,
01:52:47.000 | if you're swimming in a fishbowl,
01:52:50.040 | if you live in a fish tank that's not being properly
01:52:52.200 | maintained, the approach wouldn't be to treat
01:52:55.720 | individual sick fish, it would be to fix your fish tank
01:53:01.560 | to get the bacteria out of it and whatever bad stuff
01:53:04.720 | is in there and make your fish tank healthier.
01:53:08.440 | Well, we invest far less than the other wealthy countries do.
01:53:12.820 | We're flipped.
01:53:13.660 | We have the mirror image in the spending on social
01:53:17.700 | determinants of health and medical determinants of health.
01:53:20.840 | We have exactly the wrong order.
01:53:22.920 | And not only does that choke off social determinants
01:53:26.640 | of health, which are very important,
01:53:28.320 | but actually just the ratio, even if you were spending,
01:53:32.880 | if we raise the social spending and raise our medical
01:53:37.120 | spending in proportion, it's the ratio of social spending
01:53:40.360 | to medical spending that's the problem.
01:53:42.920 | So, and why do we do that?
01:53:44.560 | Well, the answer is perfectly obvious that the way
01:53:47.400 | to transfer money from working Americans to investors
01:53:51.840 | is through the biomedical model,
01:53:53.660 | not through the social health model.
01:53:57.720 | And that's the problem, and I'd like to discuss this,
01:54:02.720 | because the market isn't gonna get us
01:54:06.320 | to a reasonable allocation.
01:54:08.020 | All the other wealthy countries that are so much healthier
01:54:10.800 | than we are and spending so much less than we are
01:54:14.000 | have some form of government intervention
01:54:17.080 | in the quality of the health data that's available
01:54:20.440 | in the budgeting of health and social factors.
01:54:25.920 | And we don't, we're kind of the wild west
01:54:28.040 | and we let the market determine those allocations.
01:54:31.040 | And it's an awful failure, it's a horrendous failure.
01:54:36.040 | - So one argument against government, or sorry,
01:54:40.800 | an alternative to the government intervention,
01:54:43.720 | is that the market can work better if the citizenry
01:54:49.680 | has better information.
01:54:51.660 | So one argument is that, you know,
01:54:54.180 | communicators, podcasts and so on,
01:54:58.700 | and other channels of communication
01:55:01.100 | will be the way to fight big pharma.
01:55:03.860 | Your book is one way to do that, by providing information.
01:55:07.580 | The alternative to government intervention
01:55:10.360 | on every aspect of this, including communication
01:55:12.780 | with the doctors, is to provide them other information,
01:55:15.300 | to let the market provide that information
01:55:18.620 | by basically making it exciting to buy books,
01:55:23.620 | to make better and better communicators on Twitter,
01:55:28.140 | through books, through op-eds, through podcasts,
01:55:31.860 | and so on, basically,
01:55:33.980 | 'cause there's a lot of incentive to communicate
01:55:38.340 | against the messages of big pharma.
01:55:40.420 | There is incentive because people want to understand
01:55:43.620 | what's good for their lives and they're willing to listen
01:55:45.740 | to charismatic people that are able to clearly explain
01:55:48.700 | what is good for them.
01:55:50.940 | - And they do, and more than 80% of people
01:55:54.000 | think that drugs cost too much and the drug industry
01:55:56.660 | is too interested in profits.
01:55:58.780 | - But they still get influenced.
01:56:02.340 | - They can't, you can't get the vote through Congress.
01:56:05.260 | - Yeah. - You know,
01:56:06.860 | Democrats and Republicans alike
01:56:08.740 | are taking money from pharma,
01:56:10.260 | and somehow it just doesn't work out
01:56:13.940 | that these even small changes,
01:56:17.180 | I mean, the pared-down plan
01:56:21.780 | for letting Medicare negotiate drug prices
01:56:26.780 | in Build Back Better, it's literally gonna reduce
01:56:31.180 | the number of new drugs that are beneficial,
01:56:36.020 | uniquely beneficial, by about one new drug
01:56:39.480 | or two new drugs over 30 years.
01:56:43.860 | It will have a virtually undetectable impact.
01:56:48.420 | And yet pharma is talking about the impact on innovation.
01:56:53.420 | And if you vote for this,
01:56:55.900 | if you let your congressman vote for this,
01:56:58.260 | you're gonna severely slow down drug innovation
01:57:03.260 | and that's gonna affect the quality of your life.
01:57:09.540 | - Let me ask you about over-medication
01:57:14.540 | that we've been talking about from different angles,
01:57:19.700 | but one difficult question for me,
01:57:22.660 | I'll just, I'll pick one of the difficult topics,
01:57:25.500 | depression.
01:57:26.940 | So depression is a serious, painful condition
01:57:31.940 | that leads to a lot of people suffering in the world.
01:57:37.220 | And yet it is likely that antidepressants
01:57:40.500 | are being over-prescribed.
01:57:42.340 | So as a doctor, as a patient, as a healthcare system,
01:57:47.020 | as a society, what do we do with that fact
01:57:50.300 | that people suffer?
01:57:53.000 | There's a lot of people suffering from depression
01:57:56.540 | and there's also people suffering from over-prescribing
01:58:00.500 | of antidepressants.
01:58:01.940 | - Right.
01:58:02.840 | So a paper in the New England Journal by Erick Turner
01:58:06.820 | showed that if you put all the data together
01:58:10.780 | from the antidepressant trials, you find out that antidepressants
01:58:15.620 | are not effective for people who are depressed
01:58:19.340 | but don't have a major depression.
01:58:21.060 | Major depression is a serious problem.
01:58:25.300 | People can't function normally,
01:58:27.260 | they have a hard time getting out,
01:58:31.020 | performing their normal social roles.
01:58:35.780 | But what's happened is that the publicity,
01:58:39.460 | I mean, Prozac Nation was a good example
01:58:43.020 | of making the argument that why should people
01:58:45.980 | settle for normal happiness when they can have
01:58:48.180 | better than normal happiness?
01:58:49.780 | And if you're not having normal happiness,
01:58:52.460 | you should take a drug.
01:58:53.540 | Well, that concept that serotonin metabolism
01:58:58.540 | is the root cause of depression
01:59:03.520 | is really a destructive one.
01:59:05.500 | We have drugs that change serotonin metabolism,
01:59:08.740 | but we don't know if that's why antidepressants
01:59:12.140 | work on major depression.
01:59:14.420 | And they certainly don't work on everybody
01:59:16.020 | with major depression.
01:59:16.860 | I forget what the number needed to treat is.
01:59:18.500 | I think it's around four.
01:59:20.820 | One out of four people have significant improvement.
01:59:23.580 | But the people without major depression don't get better.
01:59:28.180 | And the vast majority of these drugs are used
01:59:30.900 | for people without major depression.
01:59:33.700 | So what's happened is that the feelings
01:59:37.260 | of life satisfaction, of happiness and not sadness
01:59:42.020 | have been medicalized.
01:59:43.940 | The normal range of feelings has been medicalized.
01:59:47.860 | And that's not to say that they shouldn't be attended to,
01:59:51.020 | but the evidence shows that attending to them
01:59:54.340 | by giving somebody a medicine doesn't help
01:59:57.020 | except that they feel like somebody cares about them
01:59:59.660 | and believes that they're suffering.
02:00:01.420 | But there are problems in living that give rise
02:00:05.380 | to much of this symptomatology of less
02:00:08.860 | than major depression.
02:00:10.100 | And let's call it what it is and figure out a way
02:00:13.340 | to help people with individual therapy, group therapy.
02:00:16.720 | Maybe lifestyle modification would work.
02:00:19.880 | We got to try that.
02:00:20.880 | But let's call it what it is instead of saying,
02:00:24.780 | oh, you're in this vast basket of people who are depressed,
02:00:29.860 | so we'll give you an antidepressant,
02:00:31.540 | even though the evidence shows that people
02:00:34.220 | who are suffering from your level of depression
02:00:36.620 | don't get better.
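As context for the number-needed-to-treat figure recalled earlier in this answer, here is a minimal sketch of the standard arithmetic: NNT is the reciprocal of the absolute risk reduction, and the response rates below are hypothetical, chosen only so the result comes out near four.

```python
# Number needed to treat (NNT) is the reciprocal of the absolute risk reduction (ARR):
# how many patients must be treated for one additional patient to benefit versus placebo.
# The response rates below are hypothetical, used only to illustrate an NNT of about 4.
response_on_drug = 0.50     # hypothetical fraction improving on the antidepressant
response_on_placebo = 0.25  # hypothetical fraction improving on placebo

arr = response_on_drug - response_on_placebo  # absolute risk reduction = 0.25
nnt = 1 / arr                                 # 4 patients treated per additional responder
print(f"ARR = {arr:.2f}, NNT = {nnt:.0f}")
```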
02:00:38.220 | - And that's a consequence of not focusing
02:00:42.540 | on preventative medicine, the lifestyle changes,
02:00:46.060 | all that kind of stuff.
02:00:47.140 | - Well, yes, but it's really a consequence
02:00:49.660 | of the drug companies creating the impression
02:00:53.060 | that if you're sad, take a pill.
02:00:56.660 | - If you have non-major depression,
02:01:01.140 | how do you overcome it?
02:01:03.460 | - Well, you have to talk about what the problem is.
02:01:06.700 | - So talk therapy, lifestyle changes.
02:01:09.900 | - Well, no, I'm not jumping to that.
02:01:12.300 | I'm saying that you ought to, A,
02:01:16.140 | the way you feel must be respected.
02:01:19.440 | - Yeah, acknowledge that you're suffering.
02:01:21.180 | - Acknowledge that you're suffering
02:01:22.700 | and deal with healthcare providers
02:01:24.740 | who acknowledge that you're suffering.
02:01:27.260 | So let's take that first step.
02:01:30.220 | - Big first step also.
02:01:32.260 | - Big first step, yeah.
02:01:33.580 | Family docs are pretty good at that.
02:01:36.260 | That's kind of the arena that caused me
02:01:39.880 | to go into family medicine,
02:01:41.920 | the subjective experience of the patient.
02:01:44.300 | Okay, so you're a person who is not getting
02:01:48.100 | the enjoyment out of their life
02:01:49.660 | that they feel they ought to be getting.
02:01:52.080 | Now let's figure out why.
02:01:54.660 | And whether that means some time with a social worker,
02:01:57.300 | some time with a psychiatrist,
02:01:59.060 | some time with a psychiatric nurse,
02:02:01.320 | I'm not sure how you'd best do that,
02:02:04.100 | most effectively and efficiently,
02:02:05.780 | but that's what you need to do.
02:02:07.500 | And it may be that there's a marital problem
02:02:11.680 | and there's something going on
02:02:13.620 | and one of the spouses can't find satisfaction
02:02:18.580 | in the life they have to live within the relationship.
02:02:21.500 | Maybe there's a past history of trauma or abuse
02:02:24.620 | that somebody is projecting onto their current situation.
02:02:28.800 | Maybe there's socioeconomic circumstances
02:02:31.080 | where they can't find a job that gives them self-respect
02:02:34.740 | and enough money to live.
02:02:35.980 | An infinite range of things,
02:02:39.700 | but let's figure out, make a diagnosis first.
02:02:42.080 | The diagnosis isn't that the person feels sadder
02:02:45.460 | than they want to feel.
02:02:48.620 | The diagnosis is why does the person feel sadder
02:02:51.980 | than they want to feel?
02:02:53.140 | - You mentioned this is what made you want to get
02:02:57.780 | into family medicine.
02:02:59.240 | As a doctor, what do you think about the saying,
02:03:03.100 | save one life, save the world?
02:03:05.380 | This was always moving to me about doctors
02:03:13.780 | 'cause you have this human in front of you
02:03:17.700 | and your time is worth money.
02:03:20.920 | What you prescribe and your efforts after the visit
02:03:27.300 | are worth money and it seems like the task of the doctor
02:03:31.860 | is to not think about any of that.
02:03:34.140 | Not the task, but it seems like a great doctor,
02:03:42.020 | despite all that, just forgets it all
02:03:45.100 | and just cares about the one human
02:03:47.060 | and somehow that feels like the love and effort you put
02:03:51.780 | into helping one person is the thing
02:03:53.900 | that will save the world.
02:03:55.460 | It's not like some economic argument
02:03:58.420 | or some political argument or financial argument.
02:04:03.420 | It's a very human drive that ultimately is behind
02:04:09.540 | all of this that will do good for the world.
02:04:13.020 | - Yes, I think that's true.
02:04:15.620 | And at the same time, I think it's equally true
02:04:19.660 | that all physicians need to have a sense of responsibility
02:04:23.660 | about how the common resources are allocated
02:04:28.660 | to serve the whole population's interest best.
02:04:34.140 | That's a tension that you have as a physician.
02:04:36.380 | Let's take the extreme example.
02:04:38.460 | Let's say you had a patient in front of you
02:04:41.420 | who, if you gave a $10 billion pill to,
02:04:46.340 | you would save their life.
02:04:47.660 | I would just be tortured by that as a physician
02:04:52.380 | because I know that $10 billion spent properly
02:04:56.100 | in an epidemiologically guided way
02:05:00.260 | is gonna save a whole lot more lives than one life.
02:05:03.580 | - So it's also your responsibility as a physician
02:05:06.380 | to walk away from that patient.
02:05:08.240 | - I wouldn't say that.
02:05:10.540 | - I think it's your responsibility--
02:05:12.060 | - To be tortured by the choice.
02:05:12.900 | - To be tortured by it.
02:05:14.020 | That's exactly right.
02:05:15.160 | - The human condition.
02:05:18.440 | That's a tough job, but yeah, yeah,
02:05:24.700 | to maintain your humanity through it all.
02:05:27.220 | - Yeah, but you've been asking at different points
02:05:30.260 | in this conversation, why are doctors so complacent
02:05:35.260 | about the tremendous amount of money we're spending?
02:05:38.860 | Why do they accept knowledge from different sources
02:05:41.460 | that may not pan out when they really know the truth?
02:05:44.340 | And the answer is that they're trying to do their best
02:05:48.380 | for their patients.
02:05:49.500 | And it's the same kind of torture
02:05:55.500 | to figure out what the hell is going on with the data.
02:05:59.880 | And that's a sort of future project.
02:06:03.460 | And maybe people will read my book
02:06:06.160 | and maybe they'll get a little more excited about it,
02:06:08.140 | and it'll become more legitimate in practice.
02:06:10.180 | I would feel like my life was worthwhile if that happened.
02:06:13.660 | But at the same time, they've got to do something
02:06:17.180 | with the patient in front of them.
02:06:18.780 | They've got to make a decision.
02:06:21.140 | And they probably, there are not many weirdos like me
02:06:24.860 | who invest their life in figuring out
02:06:27.300 | what's behind the data.
02:06:28.340 | They're trying to get through the day
02:06:29.820 | and do the right thing for their patient.
02:06:31.660 | So they're tortured by that decision too.
02:06:35.020 | - And so if you're not careful,
02:06:38.460 | Big Pharma can manipulate that drive
02:06:43.460 | to try to help the patient,
02:06:44.980 | that humanity of dealing with the uncertainty of it all.
02:06:49.740 | Like what is the best thing to do?
02:06:51.980 | Big Pharma can step in and use money
02:06:53.800 | to manipulate that humanity.
02:06:55.540 | - Yeah, I would state it quite differently.
02:06:57.540 | It's sort of an opt out rather than an opt in.
02:07:00.980 | Big Pharma will do that.
02:07:02.860 | And you need to opt out of it.
02:07:05.460 | (inhales)
02:07:08.020 | - What advice would you give to a young person today
02:07:11.340 | in high school or college,
02:07:13.180 | stepping into this complicated world
02:07:17.020 | full of advertisements, of big powerful institutions,
02:07:22.020 | of big rich companies,
02:07:24.900 | how to have a positive impact in the world,
02:07:27.100 | how to live a life they can be proud of?
02:07:29.460 | - I would say, should that person
02:07:34.300 | who has only good motives go into medicine?
02:07:38.140 | They have an inclination to go into medicine
02:07:39.780 | and they've asked me what I think about that
02:07:42.100 | given what I know about the undermining
02:07:45.460 | of American healthcare at this point.
02:07:47.700 | And my answer is, if you got the calling, you should do it.
02:07:51.500 | You should do it because nobody's gonna do it
02:07:54.360 | better than you.
02:07:55.200 | And if you don't have the calling
02:07:57.860 | and you're in it for the money,
02:08:01.080 | you're not gonna be proud of yourself.
02:08:03.300 | - How do you prevent yourself
02:08:07.740 | from letting the system change you over years and years?
02:08:12.740 | From letting the game of pharmaceutical influence
02:08:17.980 | affect you?
02:08:19.820 | - It's a very hard question
02:08:22.640 | because the sociologic norms are to be affected
02:08:28.080 | and to trust the sources of information
02:08:33.000 | that are largely controlled by the drug industry.
02:08:36.380 | And that's why I wrote "Sickening"
02:08:38.220 | is to try and help those people in the medical profession
02:08:43.220 | to understand that what's going on right now looks normal,
02:08:50.620 | but it's not.
02:08:52.380 | The health of Americans is going downhill.
02:08:55.460 | Our society is getting ruined
02:08:57.060 | by the money that's getting pulled out of other
02:09:01.020 | socially beneficial uses to pay for healthcare
02:09:06.660 | that is not helping us.
02:09:08.560 | - So fundamentally, question the thing that is normal.
02:09:14.600 | Don't just accept the normal, and
02:09:19.120 | if you conform, conform hesitantly.
02:09:23.700 | - Well, you have to conform.
02:09:26.140 | You can't become a doctor without conforming.
02:09:28.960 | I just made it through.
02:09:32.340 | (both laughing)
02:09:35.100 | But there aren't many and it's hard work,
02:09:38.680 | but you have to conform.
02:09:42.460 | And even with my colleagues in my own practice,
02:09:44.980 | I couldn't convince them that some of the beliefs they had
02:09:48.740 | about how best to practice weren't accurate.
02:09:51.500 | There's one scene, a younger physician
02:09:55.500 | had prescribed hormone replacement therapy,
02:09:57.940 | this is back in 2000, 2001,
02:10:00.420 | had prescribed hormone replacement therapy
02:10:02.660 | for one of my patients
02:10:04.340 | who happened to be a really good personal friend.
02:10:07.140 | And I saw that patient covering for my colleague
02:10:12.140 | at one point and I saw that her hormone replacement therapy
02:10:15.660 | had been renewed.
02:10:17.260 | And I said, "Are you having hot flashes or any problem?"
02:10:19.500 | "No, no, no, no, but Dr. So-and-so
02:10:23.180 | said it's better for my health."
02:10:25.140 | And I said, "No, it's not.
02:10:26.700 | The research is showing that it's not.
02:10:28.440 | It's harmful for your health and I think you should stop it."
02:10:31.300 | So my colleague approached me when she saw the chart
02:10:35.820 | and said, "Wait a minute, that's my patient,
02:10:38.060 | maybe your friend, but it's my patient."
02:10:40.520 | And she said she'd gone to a conference at her alma mater,
02:10:45.520 | her medical school, and they said that healthy people
02:10:49.060 | should be given hormone replacement.
02:10:51.220 | And I said, "There's gotta be drug companies
02:10:53.920 | involved in this."
02:10:55.260 | And she said, "No, no, no, it was at my university.
02:10:57.700 | It was not a drug company thing.
02:10:59.900 | We didn't go to a Caribbean island."
02:11:02.320 | I said, "Do you have the syllabus?"
02:11:03.860 | She said, "Yeah."
02:11:05.100 | And she went and got the syllabus and sure enough,
02:11:07.640 | it was sponsored by a drug company.
02:11:09.480 | - They're everywhere.
02:11:11.300 | - They're everywhere.
02:11:12.140 | And it's back to Kuhn that groups of experts
02:11:16.300 | share unspoken assumptions.
02:11:19.780 | And in order to be included in that group of experts,
02:11:22.780 | you have to share those unspoken assumptions.
02:11:25.140 | And what I'm hoping to do with my book, "Sickening,"
02:11:27.860 | and being here, having this wonderful conversation with you,
02:11:31.820 | is to create an alternative to this normal
02:11:36.180 | that people can pursue and practice better medicine
02:11:45.020 | and also prevent burnout.
02:11:47.180 | I mean, about half the doctors complain
02:11:49.320 | that they're burned out and they've had it.
02:11:51.220 | And I think that this is a subject,
02:11:54.180 | I don't have data on this, this is just my opinion,
02:11:57.540 | but I think that a lot of that burnout
02:11:59.860 | is so-called moral injury from practicing in a way
02:12:04.340 | that the docs know isn't working.
02:12:08.380 | - It's not actually providing an alternative
02:12:10.940 | to the normal, it's expanding the normal,
02:12:12.980 | it's shifting the normal, just like with Kuhn.
02:12:14.900 | I mean, you're basically looking
02:12:17.000 | to shift the way medicine is done back to the original,
02:12:23.540 | I mean, to the intent that it represents,
02:12:27.100 | to the ideal of medicine, of healthcare.
02:12:30.580 | - Yeah, in Kuhnian terms, to have a revolution.
02:12:33.500 | And that revolution would be to practice medicine
02:12:36.820 | in a way that will be epidemiologically most effective,
02:12:41.700 | not most profitable for the people who are providing you
02:12:45.300 | with what's called knowledge.
02:12:47.340 | - You helped a lot of people as a doctor, as an educator,
02:12:51.360 | live better lives, live longer,
02:12:56.500 | but you yourself are a mortal being.
02:12:59.140 | Do you think about your own mortality?
02:13:02.220 | Do you think about your death?
02:13:03.580 | Are you afraid of death?
02:13:04.780 | - I'm not, I've faced it.
02:13:10.580 | I've been close.
02:13:11.860 | - With yourself?
02:13:12.740 | - Yeah, yeah.
02:13:14.940 | - How do you think about it?
02:13:16.180 | What wisdom do you gain from having come close to death,
02:13:20.060 | the fact that the whole thing ends?
02:13:21.820 | - It's liberating.
02:13:24.040 | It's very liberating.
02:13:26.580 | I mean, I'm serious, I was close, and not too long ago.
02:13:30.860 | And it was a sense of, you know,
02:13:39.540 | this may be the way it ends, and I've done my best.
02:13:44.540 | It's not been perfect.
02:13:47.040 | And if it ends here, it ends here.
02:13:51.220 | The people around me are trying to do their best.
02:13:54.460 | And in fact, I got pulled out of it,
02:13:57.900 | but it didn't look like I was gonna get pulled out of it.
02:14:01.500 | - Are you ultimately grateful for the ride,
02:14:06.180 | even though it ends?
02:14:07.420 | - Well, I think so.
02:14:12.420 | You know, you can't take the ride
02:14:15.660 | if you know it's gonna end well.
02:14:17.580 | (both laughing)
02:14:18.660 | It's not the real ride, it's just a ride.
02:14:20.980 | But having gone through the whole thing,
02:14:25.180 | it definitely freed me of a sense of anxiety about death.
02:14:30.180 | And it said to me, "Do your best every day,
02:14:35.500 | "'cause it's gonna end sometime."
02:14:37.340 | - I apologize for the ridiculously big question,
02:14:40.780 | but what do you think is the meaning of life,
02:14:45.700 | of our human existence?
02:14:47.560 | - I think it's to care about something
02:14:54.620 | and do your best with it.
02:14:55.860 | Whether it's being a doctor and trying to make sure
02:15:01.540 | that the greatest number of people get the best healthcare,
02:15:05.340 | or it's a gardener who wants to have
02:15:08.140 | the most beautiful plants, or it's a grandparent
02:15:10.420 | who wants to have a good relationship
02:15:12.780 | with their grandchildren.
02:15:13.700 | But whatever it is that gives you a sense of meaning,
02:15:18.700 | as long as it doesn't hurt other people,
02:15:21.860 | to really commit yourself to it.
02:15:24.980 | That commitment, being in that commitment,
02:15:27.460 | for me, is the meaning of life.
02:15:29.860 | - Put your whole heart and soul into the thing.
02:15:32.980 | - Yep.
02:15:34.500 | - What is it, the Bukowski poem, "Go All the Way."
02:15:37.620 | John, you're an incredible human being,
02:15:41.460 | incredible educator.
02:15:42.460 | Like I said, I recommend people listen to your lectures.
02:15:45.020 | It's so refreshing to see that clarity of thought
02:15:48.380 | and brilliance.
02:15:49.300 | And obviously, your criticism of Big Pharma,
02:15:51.860 | or your illumination of the mechanisms of Big Pharma
02:15:56.420 | is really important at this time.
02:15:58.380 | So I really hope people read your book,
02:16:02.260 | "Sickening," that's out today,
02:16:03.900 | or depending on when this comes out.
02:16:05.740 | Thank you so much for spending your extremely valuable time
02:16:11.100 | with me today.
02:16:12.260 | It was amazing.
02:16:13.100 | - Well, Lex, I wanna say back to you:
02:16:15.500 | Thanks for engaging in this conversation,
02:16:18.700 | for creating the space to have it,
02:16:21.420 | and creating a listenership that is interested
02:16:25.140 | in understanding serious ideas.
02:16:27.420 | And I really appreciate the conversation.
02:16:29.500 | - And I should mention that offline,
02:16:30.980 | you told me you listened to the Gilbert Strang episode.
02:16:34.060 | So for anyone who doesn't know Gilbert Strang,
02:16:35.980 | another epic human being that you should check out.
02:16:39.020 | If you don't know anything about mathematics
02:16:41.300 | or linear algebra, go look him up.
02:16:43.260 | He's one of the great mathematics educators of all time.
02:16:47.300 | Of all the people you mentioned to me,
02:16:49.060 | I appreciate that you mentioned him,
02:16:50.820 | 'cause he is a rockstar of mathematics.
02:16:54.100 | John, thank you so much for talking to me.
02:16:55.380 | This was awesome.
02:16:56.220 | - Great, thank you.
02:16:57.900 | - Thanks for listening to this conversation
02:16:59.340 | with John Abramson.
02:17:00.780 | To support this podcast,
02:17:02.140 | please check out our sponsors in the description.
02:17:04.980 | And now, let me leave you some words from Marcus Aurelius.
02:17:08.800 | Waste no time arguing about what a good man should be.
02:17:14.740 | Be one.
02:17:15.620 | Thank you for listening, and hope to see you next time.
02:17:19.860 | (upbeat music)
02:17:22.440 | (upbeat music)