Arielle Scarcella, a lesbian whose Facebook posts deftly deconstruct the claims of the trans lobby, recently reminded me of a cognitive bias I’ve often alluded to but never named explicitly: the illusory truth effect. You’ve likely heard the saying that if a lie is repeated often enough, people will eventually accept it as true. That is precisely the phenomenon. Psychologists—admittedly late to the insight—first identified the effect in the 1970s, and it has since proved remarkably robust. Social science often “discovers” the obvious. The illusory truth effect shapes advertising, political messaging, public discourse, and even personal relationships. The underlying idea was widely recognized long before it acquired the label “big lie.”
A “big lie” is a sweeping distortion or complete falsification of reality, commonly used as a tool of political propaganda. The German phrase große Lüge appeared in Adolf Hitler’s Mein Kampf (1925), where he argued that people could be persuaded to accept an enormous falsehood precisely because they would not expect anyone to have the audacity to invent something so outrageously untrue. Trust in authority is essential to the success of a big lie; conversely, if a person is seen as untrustworthy or illegitimate, even truthful statements may be dismissed as lies. Big lies often function by convincing people that genuine truth-tellers are the deceivers. (See yesterday’s essay, The Fallacies of Appeal to the Authority of Consensus and Expertise.)

We are witnessing a striking example of this dynamic today. Although there is no evidence that Trump engaged in sexual misconduct with minors, millions believe he did. Even exculpatory evidence is interpreted as incriminating (see Epstein, Russia, and Other Hoaxes—and the Pathology that Feeds Their Believability). Paradoxically, many people distrust Democrats and the mass media yet continue to treat them as credible sources of information. This is the credibility or trust paradox—a situation in which individuals claim to distrust an institution but still rely on it, often because its claims align with their worldview or because no alternative appears more reliable. (See What Explains Trump Derangement Syndrome? Ignorance of Background Assumptions in Worldview.)
President Trump has been successfully framed as a liar because progressives dominate the institutions that generate and interpret social meaning. Antonio Gramsci’s concept of ideological hegemony explains this paradox. In his view, a ruling class exercises power not merely by suppressing opposition but by shaping the “common sense” of a society—embedding background assumptions in a social logic that is pressed into the public un/consciousness by repetition. Those who control the narrative exploit humanity’s tendency to believe repeated claims. A long chain of misrepresentations about Trump has, through sheer repetition, hardened into “truths” for millions: “good people on both sides,” “suckers and losers,” “drink fish tank cleaner,” “inject bleach”—all distortions or fabrications. “Trump mocked a disabled journalist.” He did not. “Trump is a fascist.” He is not. “Trump is a pedophile.” No evidence.
Why do repeated statements feel truer simply because they are familiar? Our brains use processing fluency—the ease with which information is absorbed—as a proxy for accuracy. When we encounter a claim repeatedly, it becomes easier to process, and that increased fluency creates a false sense of credibility. As Jonathan Swift wrote in 1710, “Falsehood flies, and the truth comes limping after it.” The more familiar modern phrasing—“A lie can travel halfway around the world before the truth has got its boots on”—is often attributed to Mark Twain, though no evidence links the quote to him. The misattribution itself is an example of the very effect this essay addresses.
The illusory truth effect operates largely below conscious awareness, and people of all backgrounds and education levels are susceptible to it. Its power is compounded by partisan framing. This is one reason misinformation, rumors, and slogans can become widely accepted: repetition breeds belief. Consider the slogan "Hands up, don't shoot." Many still believe its original implication, that Michael Brown was shot with his hands raised in surrender, despite contradictory evidence. Or take the long-standing urban legend involving actor Richard Gere—a defamatory fiction from the 1980s and 1990s that persists in popular memory despite being debunked.
Big lies are everywhere. They're used by dominant institutions to justify atrocities and injustices. Consider the slogan "Transwomen are women." Although the claim is false on its face, and demonstrably so by basic science, those who hear it often enough will come to believe it and repeat it, especially if trusted institutions tell them to. The lie is then reinforced in popular discourse by people committed to an ideological or political worldview.
I distinctly recall the first time I heard the slogan. It was on Facebook several years ago. The person, a former student, stated the slogan as fact because he assumed I didn't believe it. He was right, although I had never explicitly said it was untrue; rather, my arguments suggested that I did not work from the same background assumptions that had infected his brain. He needed me to hear the slogan said, and he then appeared to wait for me to agree, as if he needed me to affirm its truth. When affirmation was not forthcoming, he privately messaged me a few days later to tell me how much I had disappointed him, especially since he had looked up to me as a teacher. He wondered aloud what had happened to me (and expressed hope that I would come home) before unfriending me. He is one of several former students who have reacted this way upon discovering that I hold opinions that do not align with theirs. They had wrongly assumed I was a member of their tribe because I taught in a social science program. Given that social science has been captured by woke progressive ideology, the assumption was not entirely irrational.
The illusory truth effect is so powerful that it can override what people already know to be true. Repetition does not merely make a statement sound familiar—it gradually reshapes memory and weakens the influence of prior knowledge. When individuals repeatedly encounter the same false claim, the brain begins to prioritize fluency over accuracy, and their knowledge is corrupted. The ease with which the information is processed creates a misleading sense of reliability, and that sense of reliability can eventually outweigh a person’s original understanding. In this way, repetition can erode even well-established facts, leading people to adopt beliefs they once recognized as false. This is why persistent misinformation can be so corrosive: it does not need to be persuasive in any rational sense; it merely needs to be ubiquitous. And this is why ideological capture of dominant institutions is so dangerous.
Recognizing this effect highlights the critical importance of rigorous scrutiny in how we evaluate information. Claims should not be accepted simply because they are familiar, presented with confidence, or widely repeated. Instead, they must be weighed against evidence, cross-checked with credible sources, and analyzed for internal coherence. This requires cultivating intellectual habits that push back against cognitive shortcuts—habits such as asking whether a statement aligns with independently verifiable facts, whether the source has a track record of accuracy, and whether contrary evidence exists. Ultimately, the illusory truth effect reminds us of a foundational epistemic principle: familiarity is not proof, and no claim—no matter how often repeated—deserves belief without sufficient evidence to support it.
