By: Nathan Nobis

As the dust settles on the Trump presidency, we are left to reflect on its many themes. One of the most fundamental was this: believing without adequate evidence.

Many aspects of the last four years fit this theme, but we saw it most vividly in Trump’s repeated denial of the election results. This denial culminated in the unprecedented riot and attack on the Capitol by Trump’s supporters.

From the beginning, headlines declared there was “no evidence” and “no credible or reliable evidence” to support Trump’s claims of fraud. Trump’s mid-riot tweets—insisting his “sacred landslide election victory” was stolen—were met with a chorus of “again, no evidence!” from TV news anchors.

That Trump’s claims were made on the basis of no evidence is unremarkable—he has a long, well-documented history of making claims that are false and contrary to evidence.

What is remarkable, though, is that so many of us would suddenly care so much about evidence. We often just don’t care about evidence. That’s a problem.

Evidence is information relevant to determining whether a belief is true or false. Without evidence, a belief cannot be knowledge. Some evidence is so strong that we call it “proof”; other evidence is weak, merely suggesting a belief might be true. Some evidence is from our own experience and reflection, but a lot of it comes from what other people tell us—their testimony—and our trusting that what they say is true.

Evidence of all types, however, has an ethics. While the concept of ethical (or unethical) behavior is well known, there is also an “ethics of belief.” The idea is that we shouldn’t be concerned just with whether our beliefs are true or false; we should also be concerned with whether we formed them in responsible, ethical ways. And on most views, this ethics is determined by evidence.

According to a common understanding of the ethics of belief, we should only believe ideas that are supported by strong evidence. So if we lack adequate evidence for a claim, we should not believe it: believing that claim is wrong. Believing against the evidence—believing a claim when there is strong evidence that it is false—is an even graver wrong.

Why is there an ethics of belief? Because, as the insurrection at the Capitol made clear, beliefs have consequences. Beliefs guide how we act. Believing against the evidence tends to have bad results—harms, damage, disrespect—and believing what’s supported by strong evidence tends to yield positive outcomes. This is why, for example, a doctor needs strong evidence that nothing else will work before proposing a high-risk surgery and why you want an independent mechanic to check a used car before you decide it’s reliable enough to be a safe buy. 

People who care about the ethics of belief in the first place would never have supported Trump for so long: they would have recognized that they lacked strong evidence for his claims, especially the claims about election fraud. Trump’s supporters believed they had strong evidence for their beliefs, but they were mistaken: their evidence was not strong, given the reasons to doubt it. The counterevidence against any conspiracy included the independent certification of results by all 50 states and rejection by at least nine federal judges, many of whom are members of Trump’s own political party.

But it is also incorrect to say they had no evidence: Trump’s words—testimony from their hero, although itself based on little to no evidence—and the repeated reports from sources they trust provided some evidence.

But what if Trump’s followers had demanded strong evidence just for his election fraud claims? The riots would never have happened: adherence to the ethics of belief would have prevented unethical action.

Many people, especially those critical of Trump, are apt to judge the MAGA insurrectionists harshly for holding beliefs that are not supported by evidence, and for acting rashly, and wrongly, on the basis of those beliefs. The problem with this judgment, though, is that almost everyone has beliefs that are not supported by strong evidence.

Many of people’s most cherished beliefs—on important matters such as religion, health, science, ethics, justice, and more—are not based on strong evidence. How often do we undertake careful, unbiased research to thoroughly assess the trustworthiness of who and what we accept as sources of good information? Not often. We usually just accept the beliefs of people we consider “like us” and then “rationalize,” dismissing evidence that our beliefs are inaccurate or, when challenged, seeking out any support we can find, no matter how flimsy.

This is true of everyone, whether they identify as “conservative” or “liberal,” and it is true of scientists too: we are all inclined to accept beliefs not on the basis of strong evidence. So deep is the desire to believe certain ideas—ideas that support membership in our “tribes,” the groups that form our social identities—that we call off the search for good evidence.

We readily recognize the error of believing without good evidence in others, especially when the consequences are extreme. But we should also recognize this tendency in ourselves: our own believing without adequate evidence is risky and wrong. Thankfully, this error doesn’t usually lead to violence, as it did at the Capitol, but it is the same type of faulty believing that leads to a false dichotomy between us and “them”: that we are the rational ones and everyone else is “crazy” or worse.

This isn’t “whataboutism”: I’m certainly not defending the MAGA rioters, nor claiming that Biden supporters are generally more likely to hold beliefs supported by strong evidence. Rather, to improve everyone’s ability to make thoughtful judgments, we need first to recognize the flawed ways that everyone forms and sustains many of their beliefs. Perhaps recognizing that commonality is a key to healing our divides.

So what can we do about the problematic ways we often come to hold our beliefs about important matters, including our beliefs about which beliefs to act on? Unfortunately, many common proposed solutions are themselves problematic.

One long-term proposal is to improve our nation’s critical thinking and media literacy skills. Better education would not fully solve the problem—“educated” people also believe on insufficient evidence and are as subject to vice as anyone else—but it could have some positive impact. The difficulty, though, is this: how could we convince a population that widely rejects these skills that it needs them, especially since that rejection spans the political spectrum? How can these skills be introduced in schools if many parents reject them? The proposal is good in theory, but the mechanisms to make it happen may be hard to realize.

A related response encourages evidence-seekers and evidence-deniers to interact more, in positive ways, so that the latter come to appreciate the importance of strong evidence. But this “solution” requires that the need for strong evidence already be appreciated by the very people who don’t seem to care much about it. Given widespread rejection of expertise, it’s unlikely that most people would enthusiastically accept the thought that some basic orientations toward evidence are better than others.

Maybe our tendencies to ignore and disrespect evidence can’t be taught or socialized out of us: we haven’t been able to do it yet. If we can’t be changed, maybe our environments can be altered so that we encounter less disinformation and have fewer opportunities to organize around it. The challenge, though, is making that happen without wrongly restricting freedom of thought and association. Who would control this information and manage these interactions? Who would decide that they should, and how would they gain the power to do it? Great answers here are hard to find.

Another proposal involves changing our political system so that people with strong evidence and knowledge hold political power, while those without strong evidence, or the ability to engage with it, have little political influence. This is “epistocracy,” an idea with roots in Plato’s Republic. But who is deemed a genuine knower? And who would know who knows? Since knowledge can corrupt—there’s no clear connection between being knowledgeable and being fair—we’d need a way to keep these ruling knowers in check, and we don’t know what that would be.

Willingness to try epistocracy might depend most on how it works out for the people deemed to lack knowledge: if it works well for them materially, and they are respected and feel respected, maybe it would be a viable solution. Perhaps improved economic conditions and social status alone would be enough to dissolve what motivates people to believe against the evidence, at least on political matters, and so big political change wouldn’t even be needed. However, money and status don’t always buy decency either.

So solutions to our problems with believing won’t come easily. In his Inaugural Address, though, President Biden said that “each of us has a duty and responsibility, as citizens, as Americans” to “defend the truth and defeat the lies.” Determining how we might meet this duty requires evidence. Biden also proclaimed that this election was about America’s “soul.” Our minds, however, are part of that soul, and working to use our minds better—in seeking and evaluating evidence—will help restore our souls. To adapt a saying from many Christian churches, “Evidence is good, all the time.” Let us agree, and get to work seeking strong evidence for our beliefs on these matters, as the ethics of belief requires.


Nathan Nobis, PhD, is an Associate Professor of Philosophy at Morehouse College, Atlanta, Georgia.

Image: slightly modified illustration from the PERFECT project at the University of Birmingham (birmingham.ac.uk).
