The following is an edited transcript from the Making Sense podcast:
There is a topic on which I appear to have offended many people—about which many of these offended people appear profoundly confused. I think this topic lies somewhere near the center of our most pressing cultural problems—especially the shattering of our information landscape and the resulting hyperpolarization of our politics. The result of this shattering, especially on the right, is an increasingly conspiratorial view of the world. Rather than recognize that bad outcomes are often due to ignorance or incompetence, many people on the right see malevolent competence and coordination everywhere. This shattering is also fueling widespread contempt for institutions. Needless to say, any response to this contempt from the institutions themselves tends to be dismissed as just more sinister machinations on the part of the elites. Some of this populist backlash is understandable—but increasingly, this seems like a cultural death spiral to me. Our institutions simply must regain public trust. The question is, how can they do this?
At the core of this problem lurks a fundamental question about the nature of intellectual authority. When do we rely on it, and when are we right to ignore it, or even repudiate it?
Everyone knows that you shouldn’t argue from authority. You can’t say, “What I’m saying is true because I am saying it,” or “It’s true because Einstein said it,” or “It’s true because it’s been published in a prestigious journal.” If a theory is true, or a fact is really a fact, it is so independent of the source making the claim (leaving aside facts that merely relate to one person’s subjective experience). Consequently, no sane expert ever really argues from authority. What actually happens is something that is easily mistaken for this—which is that people often rely on authority as a proxy for explaining, or even understanding, why something is true. It’s a little like using money as a medium of exchange, rather than hauling around valuable objects or commodities. It’s easier to carry dirty paper in your pocket than a barrel of oil or a bushel of wheat. In the same way, it’s easier to say, or to think, that “gravity is identical to the curvature of space-time because Einstein proved it,” than it is to really understand the general theory of relativity. It’s a shortcut that’s necessary, for just about everyone, most of the time. The crucial point is that there is a difference between rejecting any argument from authority and rejecting the value, or reality, of authority itself.
For instance, I often speak with physicists on this podcast, and when I do, it is appropriate for me to assume that they understand their field better than I do—after all, that is what specialization is. If I spent as much time studying physics as a professional physicist, and proved competent at that task, I would be a physicist. And when talking to a physicist, it is important for me to understand that I’m not one.
Of course, this is true for any area of specialization. If I’m talking to Siddhartha Mukherjee about cancer, it is only decent and sane for me to acknowledge—if merely tacitly by asking questions and listening to the answers—that he, being a celebrated oncologist, knows more about cancer than I do. There is such a thing as expertise, and to not acknowledge this is just idiotic. To move through life not acknowledging it is to turn the whole world into a theater of potential embarrassment.
Relying on authority can produce errors, of course. (In the same way that some of the money in your wallet could prove to be counterfeit.) But not relying on it—shunning it, just “doing one’s own research”—is guaranteed to produce more errors, at least in the aggregate. After all, what is one doing when one is “doing one’s own research”—if not seeking out what the best authorities have to say on a given topic?
What the phrase “doing one’s own research” usually refers to are the efforts that people make to sort through information, mostly online, when they no longer trust what the most mainstream experts have to say. Usually, they have gone in search of other voices that are telling them what they want to hear—or perhaps what they don’t want to hear, but it’s now coming with a compelling, conspiratorial or contrarian slant. You don’t trust what the most respected doctors have to say—because you think they’ve all been captured by big pharma, perhaps—so you’ve found a guy in Tijuana who says he can cure your cancer. You don’t trust what the Mayo Clinic says about vaccines—and now you’re afraid to get your kids vaccinated—because you’ve listened to 14 hours of RFK Jr. on podcasts. And now you’ve started trusting him as… what?… a new authority.
We can’t break free of the circle of authority. Of course, I’m not denying that it’s possible to do truly original research—where you become the new authority—but that is not what we’re talking about here. Doing one’s own research almost never entails running the relevant experiments in virology oneself, or searching the Soviet archives oneself, or translating the speech from Arabic oneself, or interviewing the long-dead politician oneself. Most of the time, we simply have to trust that other people did their work responsibly, that their data isn’t fabricated, that they didn’t devote their entire careers to perpetrating an elaborate hoax. Again, there are exceptions—but they are simply not relevant most of the time. (That is what it means to be an exception.) Most of the time, if you no longer trust the experts, you’ve started trusting someone’s uncle.
Most of the time, real experts, who have been trained in the relevant disciplines, through real institutions, offer the best approximation of our knowledge within a field. This is no more debatable than that, most of the time, our best basketball players are in the NBA. Is it possible to find someone outside the NBA who’s amazing at basketball? Of course. Is it also possible to find someone in the NBA who shouldn’t be there? Probably. (Though it’s also safe to assume that such a person will spend most of his time sitting on the bench.) It simply is a fact that if you had to find the best basketball players in America, in some reasonable time frame, you could do a lot worse than grab the NBA all-stars from any given year. And so it is with scientists, historians, and other specialists at our most elite institutions.
There are two important caveats to this general rule: (1) There are fake disciplines, or those that are mostly fake—whole fields of scholarship that pretend to be scientific, or at least intellectually rigorous, but are mostly, or entirely, a sham. And (2), there are real areas of scholarship that have been corrupted, to one or another degree, by politics or other bad incentives. For instance, you cannot, with any confidence, venture into a department of Middle Eastern studies at an American university and get a morally sane (much less accurate) account of the conflict between Israel and the Palestinians, or between western values and those of conservative Islam. But the reasons for that failure are also knowable, and ultimately correctable. One reason is that Qatar, an Islamic theocracy and patron of terrorists, has given more money to US universities than any other country on Earth. This is a totally bizarre situation that fairly shrieks of intellectual corruption, if not suicide. But, again, the problem here is understandable and can be fixed. And it is simply one version of the problem of bad incentives.
The reason to worry about bad incentives is that we understand how they corrupt people. This is why the Upton Sinclair line is so famous—because it captures a perennial problem in society: “It is difficult to get a man to understand something, when his salary depends on his not understanding it.” Insofar as it’s possible, you want to remove the bad incentives from your life—and, collectively, we have to worry about bad incentives distorting our view of what is important, or even of what is real.
Most of the current skepticism about establishment institutions, and about mainstream expertise generally, is the result of the various failures of scientific thinking and communication that occurred during the Covid pandemic. While many of these failures were significant, there is no question that they have been magnified and distorted by our politics. In a previous podcast, I made an invidious comparison between Anthony Fauci and Francis Collins—two doctors who are now widely demonized right of center—and RFK Jr. My point wasn’t to absolve Fauci and Collins of all responsibility for mismanaging our response to Covid. For all I know, both men should be investigated—and I have no idea what we would find. I was simply pointing out that these guys exist within a culture of science where intellectual embarrassment—and worse—is still possible. RFK Jr. doesn’t. He is a crackpot and a conspiracy nut. That doesn’t mean he’s wrong about everything. And that doesn’t mean that turning him loose on the bureaucracy of HHS might not do some good. Perhaps it will. Is he the best person to do that good? Of course not. But it is possible for the wrong person to occasionally do the right thing.