Holy Anorexia and Its Analogs

“Being sick means constructing an alternate reality, strapping it in place with sturdy mantras, surrendering to the beguiling logic of an old fairy tale.” —Katy Waldman

“Anorexia is not just about striving for an idealized body image. It is an obsessive, relentless—and futile—quest to be pure, perfect, and clean.” —Cherry Jackson

Holy Anorexia, also known as Anorexia Mirabilis (“miraculous lack of appetite”), refers to a historical and religious phenomenon prevalent during the Middle Ages, in which predominantly young Christian women—often nuns or other aspirants to religious life—engaged in extreme self-starvation as a demonstration of their spiritual piety, purity, and devotion to God. Unlike the modern medical framing of anorexia nervosa as a psychiatric disorder, Anorexia Mirabilis was interpreted at the time as a form of divine grace or saintly asceticism. It’s this type of thing that stuffs my virtual filing cabinet (one of the functions of Freedom and Reason), in this case for when I return to my sociology of religion course.

The roots of Anorexia Mirabilis lie in the Christian tradition of bodily mortification and fasting, especially as practiced by anchorites (religious recluses), ascetics (those living austere lives for God), and mystics. In an age when women had limited access to formal institutional and theological power, the control of the body through chastity, fasting, and suffering offered a spiritual and social outlet for religious agency. Refusing food was not only a way to imitate the suffering of Christ, but also a means of rejecting earthly pleasures and asserting moral superiority over the flesh.

Crucially, these practices were celebrated rather than pathologized. Women who could sustain long periods of fasting were said to be sustained by the Eucharist alone or the direct love of Christ. Here was a manifestation of the power of God. At least that was the explanation; wasted bodies were seen as evidence of their sanctity. This led to the widespread veneration of such women, many of whom became the subjects of hagiographies—i.e., religious biographies extolling their virtues—and, in some cases, were canonized as saints, for example Saint Catherine of Siena, who lived from 1347 to 1380.

In his influential 1985 book Holy Anorexia, historian Rudolph Bell argued that many of these women exhibited behaviors strikingly like modern anorexia nervosa, but within a radically different cultural and theological framework. Bell posits that Anorexia Mirabilis was the historical precursor to today’s eating disorders—especially among women—and served both as spiritual self-expression and a form of resistance to prescribed gender roles. However, more recent theological analyses contend that framing these women as proto-anorexics diminishes their religious agency and overlooks the sincere mystical experiences many reported (I don’t doubt their sincerity). Feminists emphasize the symbolic power of food refusal in a time when marriage and childbirth were seen as a woman’s destiny—abstaining from both sex and food became a path to autonomy and transcendence.

A group of medieval women are addressed by a monk before a table of food (source)

Religion is a powerful force, whether rationalizing or inspiring movements like Holy Anorexia. But we don’t have to travel to the Middle Ages to see this—although Anorexia Mirabilis and what it likely signifies is fascinating and relevant to the balance of this essay. In the late 1990s and early 2000s, with the rise of internet forums and online communities, a phenomenon emerged associated with the “pro-ana” (pro-anorexia) subculture.

While not a formalized cult in the traditional sense (a cult is conventionally understood as a structured, often religious group with a charismatic leader, rigid hierarchy, rituals, and clear boundaries), the pro-ana movement functioned with many cult-like dynamics, developing its own ideology, language, and symbology—even its own icon. The pro-ana cult (on second thought, let’s call it a cult) often centered on the glorification of extreme thinness and anorexia nervosa (a species of eating disorder, or ED) as a lifestyle choice rather than a psychiatric illness. Indeed, pro-ana became an identity, and members of the community affirmed anorexia nervosa as a legitimate way of being.

The roots of the pro-ana movement can be traced back to early internet communities such as LiveJournal, Yahoo groups, and later, Tumblr and Pinterest. Young people—mostly girls and women—who felt isolated or misunderstood in their struggles with eating disorders found camaraderie in these online spaces, as have those with Tourette syndrome and other neurological and psychiatric disorders (see Why Aren’t We Talking More About Social Contagion?). These communities dismissed the framing of anorexia as a disorder, rejected recovery, and instead cast anorexia as a form of discipline, control, and even spiritual transcendence.

Out of this emerged the figure of “Anna,” or “Ana,” a personification or deity-like representation of anorexia. “Anna” was sometimes treated as a guiding force, almost like a guardian spirit—a cruel but revered mistress. This anthropomorphizing of the illness helped users externalize and, paradoxically, embrace it—deepening their entrenchment in disordered behaviors. (People with bulimia also have a personification of their disorder named “Mia.” Personification of desires and phenomena is not uncommon in cultish and religious groups.)

The movement adopted strict “thinspirational” aesthetics—images of skeletal models, motivational quotes promoting starvation, and shared “tips and tricks” for extreme calorie restriction. These behaviors were often ritualized and reinforced communally, mirroring cultic social structures. Some forums developed rules, hierarchies, and even “commandments” issued in the voice of “Anna,” reinforcing the notion that to disobey was to fail a higher calling. Associated with this cult was the modern resurgence of extreme body modification practices like tight-lacing, i.e., extreme corset-wearing.

I followed this closely for a while, even delivering a lecture on the topic in the unit on patriarchy and misogyny in my Freedom and Social Control course some fifteen years ago. Always changing the content of that course to keep it topical, I dropped the lecture from the unit. There was another reason why I dropped it, however—the same reason why I dropped my lectures on circumcision (including female genital mutilation) and stopped showing images of lynching and the Holocaust: thinspirational aesthetics and tight-lacing obviously disturbed students. I never came back to it and, over time, forgot about it.

I was reminded of the Anna cult yesterday morning while thinking about the transgender phenomenon, where, rather than treating body dysmorphia around gender identity as a mental disorder, those suffering from it are instead affirmed in their delusion. It is much the same way that those suffering from anorexia were affirmed, at least in the Anna cult, the facts of which came rushing back to mind once I had reminded myself of the phenomenon. I imagined the absurdity of bariatric surgery or liposuction for girls and women who saw in the mirror obesity instead of emaciation. What psychiatrist would think to affirm girls and women suffering from this disorder in such a way?

You may not remember this cult, but media attention and increased scrutiny led to crackdowns on such content. I oppose censorship, but the response by Internet platforms was the right one for the sake of the vulnerable who get sucked into social contagion. Search engines and social platforms banned or filtered pro-ana material. However, between 2010 and 2015, there was a resurgence of the cult on Tumblr, where the “sad girls” aesthetic, self-harm, and mental illness fetishism overlapped with pro-ana ideas. During this time, the worship of “Anna” as a quasi-religious figure reached its most stylized and mythologized forms.

From 2015 onward, Tumblr’s adult content ban and broader mental health awareness campaigns led to a decline in centralized pro-ana communities. When I went down the rabbit hole on this subject, disclaimers accompanied webpages, directing those seeking this content to hotlines where they could speak to counselors. Instagram and TikTok began aggressively moderating pro-ana content. The movement fractured—some of it morphed into so-called “pro-recovery” spaces, though these often still subtly romanticized disordered behaviors, while other remnants retreated to more encrypted platforms like Discord or Reddit. I am presently fighting the temptation to go down that rabbit hole, but there are so many and I have only so much time.

At any rate, having remembered this, I did inquire as to whether the cult still exists. While the explicit pro-ana culture has largely diminished in mainstream digital spaces due to increased regulation and public awareness, it hasn’t disappeared entirely. Instead, as noted, it has splintered and gone underground. In recent years, more nuanced discussions of “toxic recovery culture” and “eating disorder aesthetics” have emerged, pointing to the persistence of some of the same harmful ideologies in new forms. “Anna” as a deity figure is now far less common, but the spiritual or moralistic framing of anorexia—as a pure or superior way of being—still lingers in corners of the internet. Sad, but a reminder that the pathologies of internalized misogyny persist.

So, there’s good news. What once resembled a cult-like movement built around anorexia has mostly been dismantled. But there’s bad news, too, and misogyny makes sure of it: the psychological and cultural underpinnings that enabled its rise are still active, albeit in less overt ways. Perhaps “less overt” is a poor way of describing how misogyny manifests in other disorders—pathologies that may affect men, as well (not that anorexia was exclusive to girls and women). I have in mind here so-called gender affirming care.

The similarity between the pro-ana cult and the gender identity movement is striking, with an equally striking difference: whereas with anorexia there was a determined effort to arrest a social contagion and get its victims the psychiatric help they needed, gender identity disorder has been normalized, and the psychiatrists, endocrinologists, and surgeons make (literally) billions of dollars by affirming gender identity with puberty blockers, cross-sex hormones, and surgical procedures to manufacture simulated sexual identities. (See Simulated Sexual Identities: Trans as Bad Copy; see also The Persistence of Medical Atrocities: Lobotomy, Nazi Doctors, and Gender Affirming Care; Disordering Bodies for Disordered Minds; Fear and Loathing in the Village of Chamounix: Monstrosity and the Deceits of Trans Joy; Thomas Szasz, Medical Freedom, and the Tyranny of Gender Ideology; The Exploitative Act of Removing Healthy Body Parts.)

There is a growing body of literature—especially from the last decade—that critically examines gender identity disorder (GID), redescribed as gender dysphoria (which is not inaccurate), and the broader transgender identification phenomenon, particularly among adolescents, through the lens of social contagion, online influence, and cultural shifts (if you’re interested in the literature on the normalization of anorexia, both contemporary and historical, I summarize it at the conclusion of this essay). As you might imagine, this line of inquiry is far more contested and politically sensitive than critical analysis of the pro-ana discourse. Still, some researchers have raised concerns about the rapid increase in youth identifying as transgender and the role of peer dynamics, internet subcultures, and sociocultural narratives in shaping identity development.

One of the most prominent and controversial contributions comes from Dr. Lisa Littman, whose 2018 study introduced the concept of “rapid-onset gender dysphoria” (ROGD). Published in PLoS ONE (later revised after peer review), Littman’s work suggested that for some adolescents, particularly natal females, gender dysphoria may emerge suddenly during puberty, potentially influenced by social factors such as friend groups and online content. The study was based on parental reports.

Unlike the work on the pro-ana phenomenon, Littman’s work has been criticized for “methodological limitations,” a standard debunking phrase. The standard line, pumped out for example by Wikipedia (which feeds various AI systems the consensus opinion of corporate-captured sense-making institutions), is that ROGD is not recognized as a legitimate mental health diagnosis by any major professional organization. The American Psychiatric Association (APA), the World Professional Association for Transgender Health (WPATH), and over sixty other medical associations have called for its removal from clinical practice, citing a lack of “credible” scientific evidence and concerns that it promotes stigma against gender-affirming care for transgender youth. Nonetheless, Littman’s work has sparked significant academic and public debate about whether gender identity can, in some cases, be shaped by social contagion mechanisms similar to those observed in eating disorders and self-harm communities.

Further exploration of this idea can be found in Abigail Shrier’s Irreversible Damage: The Transgender Craze Seducing Our Daughters (2020), a journalistic treatment that builds on Littman’s findings and interviews with clinicians, detransitioners, and families. Shrier’s book frames the rise in adolescent transgender identification as part of a broader cultural and ideological movement, drawing comparisons to historical phenomena such as eating disorders and multiple personality disorder (which also has a cult-like appearance). Again, critics (the medical-industrial complex and its army of organic intellectuals) argue that such comparisons risk minimizing the genuine experiences of those with longstanding or medically significant gender dysphoria. Here, those who campaign for legitimizing a mental disorder appeal to emotional blackmail to stifle criticisms of the gender identity movement.

As expected, academic counterpoints have come from gender studies, sociology, and trans-affirming clinicians, who argue that increased visibility and acceptance have simply enabled more individuals to come out safely. But isn’t this what pro-ana advocates sought to do: normalize anorexia nervosa? It appears, based on the phenomenon of Anorexia Mirabilis, that eating disorders have been normalized before—in the same way that a schizophrenic might assume the role of shaman in a gatherer-and-hunter society. I checked, and anorexia nervosa remains a recognized mental health diagnosis in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), and its 2022 revision, the DSM-5-TR. It is classified under the category of “Feeding and Eating Disorders.”

A small but growing number of clinical researchers and bioethicists—including Marcus Evans (former clinician at the UK’s Tavistock Clinic) and Stella O’Malley—have voiced concern about affirmation-only models and the potential for misdiagnosis or premature medicalization. These discussions, while still emerging compared to the well-established literature on eating disorders, indicate that a contested but active field of inquiry into the sociocultural dynamics of gender identity formation in youth is gaining momentum. While the framing of transgender identification as a form of social contagion is more controversial than in the context of eating disorders—this because of politics and profits—a similar debate is unfolding. With this essay, I aim to do my part to keep the debate going.

* * *

Again, if you don’t know about or recall the pro-ana phenomenon, a considerable body of scholarly literature has critically examined the emergence and development of the pro-anorexia (or “pro-ana”) movement, particularly its roots in online communities and its framing of anorexia as a lifestyle or identity. Psychological and psychiatric journals such as the American Journal of Public Health and International Journal of Eating Disorders have published influential studies, including work by Bardone-Cone and Cass (2007) and Borzekowski et al. (2010), which explore how exposure to pro-ana websites can reinforce disordered thinking and behaviors.

Norris et al. (2006) provide a comprehensive review of these websites, shining light on their structure, content, and the risks they pose to vulnerable individuals. From a qualitative standpoint, Gavin, Rodham, and Poyer (2008) analyze how online group dynamics within these communities foster a sense of belonging while deepening participants’ commitment to the disorder, often through the adoption of rituals and coded language.

Beyond clinical psychology, cultural and media studies offer additional context. Sharlene Hesse-Biber’s The Cult of Thinness (2007) presents a sociological critique of Western beauty ideals and explores how thinness functions as a quasi-religious ideal in contemporary society—paralleling the reverence of anorexia seen in pro-ana spaces. In the broader sphere of digital culture, scholars like Alice Marwick have examined how online platforms facilitate identity formation and subcultural bonding, offering frameworks that help explain the stickiness of pro-ana ideology in digital environments. Together, these sources provide a multidisciplinary lens through which to understand how a psychiatric illness became mythologized, even spiritualized, in parts of the internet—and how that mythology still resonates today in fragmented forms.

For more on Holy Anorexia and related matters in the Medieval period, see Caroline Bynum’s 1987 Holy Feast and Holy Fast: The Religious Significance of Food to Medieval Women; Barbara Newman’s 1995 From Virile Woman to WomanChrist: Studies in Medieval Religion and Literature; Amy Hollywood’s 2002 Sensible Ecstasy: Mysticism, Sexual Difference, and the Demands of History; and Virginia Burrus’ 2004 The Sex Lives of Saints: An Erotics of Ancient Hagiography. For a handy summary of these views, see “Holy Anorexia: How Medieval Women Coped With What Was Eating At Them,” by student Whitney May, writing for A Medieval Woman’s Companion.

* * *

I want to add to this essay my introduction of it on social media, to provide more context and underscore the importance of avoiding revisionist history. I have been going back through my inventory of ideas and lectures, and this essay is one I would have published years ago but, like a lot of things I have studied over the years, this one slipped my mind. Once a thought pushes itself into consciousness, I start remembering all sorts of things about the topic—and seeing its relevance for today’s concerns.

As noted above, I followed the Anna cult closely for a while, even delivering a lecture on the topic in the unit on patriarchy and misogyny in my Freedom and Social Control course some fifteen years ago—before I became aware of the current wave of transgenderism, which became a potent force in the early 2010s and gained momentum in the mid-2010s (when I presumed along with millions of others that trans was gay adjacent). It was during this period, when the rise of transgender activism heightened public awareness, that I went down the rabbit hole.

One note here about the history of transgenderism (this is not in my essay, but I may add it now that it occurs to me): I am well aware that what the medical industry now calls “gender affirming care” has a long history (going back to the late-nineteenth and early-twentieth centuries). The history of the belief that women can think they’re men is even longer. Trans activists are quick to argue that trans is not a new thing. It’s true: female-bodied individuals living or identifying as male (and vice versa) is not a new phenomenon.

The fact that anorexics can be found throughout history makes anorexia an even more powerful analog for the gender identity phenomenon. Sociologically, it shows how psychiatric disorders are rationalized in various ways in different times and places. The schizophrenic is a shaman in a gatherer-and-hunter village, etc. Some will flip this and decry the medicalization of ways of being (Gabor Maté, for example, a Canadian pediatrician whom I admire but disagree with on this matter). But this ignores science. It’s a postmodernist notion. Cultural relativism is a useful methodological technique, but to conflate epistemology and ontology is a fallacious move.

Rejecting Crisis Capitalism: The Dramatic Realignment of America’s Political Parties

The Republican Party has long been associated with big business and corporate interests, while the Democrats have long been seen as the party of the working class. At least that’s the perception. But it’s something of a myth. Both parties are bourgeois, as expected in a country as capitalist as the United States. However, it’s the Republican Party that’s the party of the working class, small business, the traditional agricultural, energy, and manufacturing sectors, and forward-looking entrepreneurs and innovators. The Democrats are the party of oligarchic power and technocratic elites.

As I have documented in past essays on this platform, the Republican Party was founded with the above-identified coalition at its core. The political realignment we’re seeing—where working-class voters increasingly support Republicans and corporate donors shift toward Democrats—is a return to the Republican Party as originally constituted, while the globalist ambitions of Democrats remain consistent, now explicit. The shift can be traced back at least to around 2016, with Donald Trump’s rise to political prominence serving as a major inflection point. At the heart of this transformation is the return of populist-nationalism, which emphasizes democratic-republican governance and classical liberal principles.

Trump’s 2016 campaign broke with traditional Republican orthodoxy in several ways. Rather than emphasizing free trade, Trump leaned into economic nationalism, protectionism, and a populist tone that resonated with working-class voters, particularly in the industrial Midwest—not just white voters, but black and brown workers, too. Trump spoke directly to those left behind by deindustrialization and globalization (off-shoring manufacturing and mass immigration), offering a sharp contrast to the multiculturalist, technocratic, and urban-centric cosmopolitan messaging that defines the Democratic Party. Republicans have become the party of the masses, whereas Democrats have become the party of elites.

The facts are clear: corporate and high-income donors—especially from sectors like entertainment, finance, and established technology firms—found a more globalist-friendly and socially progressive partner in the Democratic Party, which fully embraces climate initiatives (population control strategies), identity politics (strategies dividing the proletariat along lines of ethnicity, gender, race, and religion), and (corporate-captured) regulatory command in ways that appeal to large, urban-based multinational and transnationalist corporations and professional-managerial strata.

This shift is apparent not only in political and policy orientation, but in campaign finance trends. Where the oligarchy puts its money is a powerful indicator of political alignment. By 2020, Democrats were raising more money than Republicans from Wall Street and Silicon Valley. Republicans still retain support from traditional industries—such as agriculture, fossil fuels, and manufacturing—but the bulk of individual corporate donations favors Democrats, especially from the top tiers of the finance and tech worlds.

The new alignment (if the Democrats, the party of slavery, Jim Crow, and positive discrimination, ever were the party of labor) is only deepening over time. During the 2024 election cycle, Democrats received significant financial support from finance and big corporate power—much more than did the Republicans. Democratic-aligned dark money groups spent nearly double the amount spent by their Republican counterparts. Harris’ campaign, along with affiliated Democratic entities, raised approximately 2.9 billion dollars during the 2024 election cycle, compared to the Republicans’ 1.8 billion dollars.

This financial shift mirrors changes in the parties’ bases of support. Polling and election data from 2024 show an alignment between the Republicans and the working class, particularly among voters without a college degree—those voters whose interests progressives have long claimed to champion. Gallup polling finds that nearly half of Republicans identify as working or lower class. Trump won 56 percent of non-college-educated voters. Working-class voters trust Republicans more on issues like economic growth, entrepreneurship, immigration, and public safety. This realignment is also evident among entrepreneurs and small business owners. In contrast, fewer Democrats self-identify as working class—only around a third. This reflects the party’s appeal to college-educated, urban professionals (although, even here, college-educated white men are split in party loyalty).

Whatever it was in the past, here’s the reality of today’s America: while the populist and working-class appeal of the Republican Party has grown stronger since 2016, the Democratic Party has increasingly become the political home of the credentialed class and the preferred partner of transnational corporations. To be sure, the realignment isn’t absolute—there are still significant overlaps and exceptions, again expected in the most capitalist country in the world—but the broader trend has held over multiple election cycles and is deepening. The Democratic Party is the party of the oligarchy. The oligarchy embraces the Democratic Party because the Party embraces free trade.

AI generated image (Sora)

As I noted in a recent essay on this platform, in an 1848 speech to the Democratic Association in Brussels, Karl Marx took a stance in favor of free trade—not because he supported capitalist economics, but because he believed free trade would accelerate capitalism’s internal contradictions and hasten its downfall. Marx argued that protectionism served to preserve capitalist economic structures relatively advantageous to the working class and slow the inevitable progression of capitalist contradiction. In contrast, free trade, by unleashing global competition between working classes across the planet and undermining traditional industries and social relations, deepens inequality and (if one’s eyes are open) exposes the exploitative nature of the capitalist system.

For Marx, the destructive dynamism of free trade is a necessary stage in the development of capitalism. Thus, Marx was an accelerationist, advocating for what we might call “crisis capitalism”: the more capitalism expands and destabilizes societies globally, the sooner the conditions emerge for its revolutionary overthrow. In this sense, Marx saw free trade as a catalyst for historical progress (as he understood it)—not toward a stronger market economy, but toward a post-capitalist world reorganized along communist principles.

Ironically, this is the path Democrats, and the big corporations and financiers who support them, have chosen. Thus, in a way, conservatives who describe Democrats as “Marxist,” while not literally correct (Democrats are the party of the corporate oligarchy), are not off the mark, since the ends Democrats seek increase the possibility that something that at least looks like communism will replace capitalism: the reduction of the proletarian to serfdom, managed by a global administrative apparatus run by a technocratic elite and its army of bureaucrats.

The Paradox of Teaching the Rules of Academic Writing that Straitjacket Academic Writers

Recently, I published an essay on my platform explaining my approach to writing (How I Write and Why). I just finished grading essays for the several classes I teach, and this caused me to reflect on that essay and my own writing. I sometimes worry that students might read essays on Freedom and Reason and wonder why, if I require them to use a requisite number of peer-reviewed scholarly sources—academic journals, university press books—my essays contain no parenthetical citations and no works cited page (which is not always true, but for the most part is).

Not Me (AI generated by Sora)

My first answer is simple: if I don’t teach students the rules, they won’t learn them—and if they don’t learn the rules, they can’t break them later with sophistication. Teaching academic writing is like teaching music: first comes theory and practice, then improvisation. I ask students to engage with scholarly discourse not because that’s the end goal, but because it’s the foundation. Only by internalizing the conventions—citation, evidence, structured argument—can they later transcend them, if they find a space safe enough to do that (who knows whether such spaces will continue to exist). I do want them to transcend these conventions. I want them to have opinions—their own opinions—and convention can be, and often is, stifling.

These days, my own work blends critical structure with topical responsiveness. I write quickly about complex, ongoing events, drawing on general knowledge, analytical habit, and a career’s worth of scholarly grounding. But I couldn’t do this—and certainly couldn’t teach others to do it—without first having acquired the discipline of academic writing. One learns the form so that, in time, he can bend it with purpose. Indeed, I cannot teach this. One only learns to do this over time. And I didn’t learn the form until graduate school. I’m giving students a big head start!

But even as I say that, I know it’s more complicated than that. One of the frustrating ironies of teaching today is that, to prepare students for success in graduate school, I must teach them to write in a style I personally find tedious and pretentious. The thicket of citations, the ritualistic referencing of theoretical frameworks, the constant nods to academic trends—these have become the currency of scholarly legitimacy.

In that same recent essay, I noted how different the writing of mid-twentieth-century sociologists feels. Their prose is direct, idea-driven, and strikingly light on citation when viewed through today’s eyes. These were serious scholars—Robert Merton, C. Wright Mills, Gresham Sykes—writing from deep knowledge with deserved confidence, not attempting to prove their intellectual bona fides with every paragraph by festooning their essays with shoutouts to their community.

The academic landscape has changed. Today, even accomplished scholars often cite the work of others more to appear academic than to advance ideas. The form of academic writing has become a kind of credential in itself—the ranking of the journal in which one’s work appears, etc. And too often, the more academic the writing appears, the less substantive it actually is. Those older works outshine the ideological and jargon-laden scholarship being produced today. One knows this because those ideas endure as the true science of the discipline, whereas the new stuff is used to rationalize ideology.

This was Paul Baran and Paul Sweezy’s critique, offered in the preface to their 1966 Monopoly Capital. In that preface, they expressed strong dissatisfaction with the state of modern academia. Conformity and ideological bias in academic institutions, especially in economics, were largely shaped by the needs and interests of capitalist societies. Rather than seeking objective truth or critically examining capitalism, academia often served to justify and sustain the status quo. The organic intellectual engaged in the suppression of critical thought by marginalizing or excluding radical ideas from mainstream academic discourse, often by citing only their side—a form of intellectual repression that stifled meaningful analysis and debate.

(Baran and Sweezy also criticized the increasing compartmentalization and specialization in the social sciences, which discourages holistic, systemic analysis of society and the economy—especially analyses that challenged capitalist structures. This echoed the critique C. Wright Mills made of “abstracted empiricism” in his 1959 book The Sociological Imagination. His criticism was directed at trends in mid-twentieth century American sociology, particularly the dominance of highly quantitative, methodologically rigid research that he believed had lost sight of the broader purpose and potential of sociological inquiry. His main target was understood to be Paul Lazarsfeld, who headed the Bureau of Applied Social Research at Columbia University, where he focused on consumer behavior and public opinion, funded by corporate or government sponsors. For their part, Baran and Sweezy were especially critical of neoclassical economics, which they correctly saw as abstract, unrealistic, and ideologically committed to free-market capitalism, and thus an intellectual tool used to obscure the real dynamics of monopoly capitalism and class struggle, but I digress.)

Still, I have to teach students the rules: the citation formats, the tone, the rhetorical signaling. I have to think of those who will go on to graduate school. I want them to be prepared—to give them an edge. Graduate school is an audition before a room of deeply indoctrinated and ironically conventional functionaries. And if my students can survive that, if they can master the constraints without being mastered by them, then maybe they’ll earn the authority to break free—to write with clarity, with conviction, and with real intellectual power.

Finally, sorry to throw shade at other academics (and this is hardly all of them), but we have to admit that part of the problem is that not everyone has good ideas or a keen analytical mind. Anyone can adopt the academic style—the citations, the jargon, the reverent name-dropping—and produce work that looks like serious scholarship. Form often disguises the absence of substance. A mediocre thinker can sound profound through mimicry. At the same time, institutions corrupted by mediocrity diminish real profundity. Given the ubiquity of progressive thought and technocratic practice in higher education, it feels like an intractable problem. And with the rise of artificial intelligence, I’m not sure that even academia can be a refuge for truly breakthrough ideas.

Nietzsche’s Critique of Christianity and His Impact on Social Theory

In the context of my lectures in Freedom and Social Control (also in Social Theory) on Paul Ricoeur’s thesis of the “masters of suspicion,” Friedrich Nietzsche occupies a central position alongside Karl Marx and Sigmund Freud. Each of these thinkers, according to Ricoeur, embodies a hermeneutics of suspicion—a method of interpretation aimed not at understanding surface meanings, but at exposing the hidden power structures and desires that lie beneath. Nietzsche, in this triad (an unholy trinity, if you will), is the one who most rigorously dismantles the moral and metaphysical scaffolding of Christianity and its cultural inheritance in the West.

Yet Nietzsche’s influence does not end in the realm of philosophy or theology. His radical revaluation of values also helped shape the methodological and cultural sensibilities of modern sociology—most notably in the work of Max Weber. Weber’s concept of the “disenchantment of the world” and his ambivalence toward rationalization reflect a world profoundly shaped by Nietzschean suspicion, especially toward inherited metaphysical meaning. In what follows, I present Nietzsche’s critique of Christianity and then return to how his influence echoes in Weber’s sociological imagination.

Friedrich Nietzsche (AI generated by Sora)

At the heart of Nietzsche’s critique is his distinction between “master morality” and “slave morality.” Master morality, rooted in antiquity, emerges from the affirmation of life and power. It values beauty, power, and self-assertion. Slave morality, on the other hand, is born from weakness and ressentiment—a vengeful revaluation of values by those without power (suggestive of Friedrich Engels’s conceptualization of demoralization in The Condition of the Working Class in England, used to describe the profound moral and psychological degradation experienced by the working class under industrial capitalism). According to Nietzsche, Christianity institutionalized slave morality, portraying humility, meekness, and suffering not as necessary evils, but as moral ideals.

In On the Genealogy of Morals, Nietzsche traces how the early Christians, oppressed by Roman rule, reshaped morality to favor their condition. In doing so, they turned traditional values upside down. What had once been seen as noble and life-affirming—ambition, pride, strength—were rebranded as sinful, while weakness and submission were reimagined as virtues. In The Gay Science and Thus Spoke Zarathustra, Nietzsche’s infamous proclamation—“God is dead”—strikes at the heart of Western metaphysics. This declaration is not atheistic triumphalism but a cultural diagnosis. Nietzsche recognized that modern secular societies continue to rely on the moral assumptions of Christianity even after losing faith in its theological foundations.

By saying that we have killed God, Nietzsche implicates modern humanity in the collapse of the metaphysical order. He warns of an impending nihilism—the absence of meaning, purpose, and objective value. This moment of crisis, however, is not the end but a challenge: can humanity create new values in the aftermath? Nietzsche believed this task required the rise of the Übermensch, one who can live creatively and affirmatively without recourse to transcendent absolutes.

For Nietzsche, Christianity was more than mistaken—it was anti-life. He argued that its teachings encourage a rejection of the body, instinct, and earthly joy in favor of spiritual purity and a promised afterlife. Likewise, Antonio Gramsci in his Prison Notebooks employs the term “animality,” particularly in the section titled “Americanism and Fordism,” to capture the historical struggle to suppress the “element of ‘animality’ in man,” referring to the natural, instinctual behaviors that industrial capitalism seeks to discipline and regulate.

Gramsci explicitly links the concept of animality to Puritanism, which functions as a cultural and moral framework used to discipline the working class in capitalist societies—particularly in the United States—by repressing the instinctual, spontaneous, and sensual aspects of human life. He observes that in the development of American industrial capitalism, especially under Taylorism and Fordism, there was a concerted effort not only to rationalize labor processes but also to morally reform the worker by cultivating habits of punctuality, sobriety, sexual restraint, and self-control. These values, deeply rooted in Puritan religious and cultural traditions, were repurposed by industrialists and reformers to create a more disciplined and efficient labor force.

Gramsci argues that this attempt to eliminate animality is not simply technical but cultural and ethical, aimed at creating a new type of human being suitable for modern industrial production. Ford’s program of moral surveillance—offering bonuses to workers who adopted “respectable” domestic lifestyles—exemplified this intervention. Gramsci interprets these measures as secularized Puritanism: a disciplinary apparatus designed to align workers’ private lives with the demands of capitalist production. Thus, he sees Puritanism as a historical and ideological tool in the struggle to suppress the natural, “animal” aspects of human life that could disrupt the rationalized order of capitalism. This repression, for Gramsci, is not simply about productivity but about constructing a hegemonic moral order that naturalizes capitalist social relations.

In The Antichrist, Nietzsche writes with unmistakable vitriol: “Christianity is a rebellion against natural instincts, a protest against nature. Taken to its logical extreme, Christianity would mean the systematic cultivation of human failure.” In Nietzsche’s view, Christian morality cultivates guilt and shame—particularly through its doctrine of original sin. Instead of empowering individuals to affirm their instincts and embrace life in all its complexity, Christianity demands submission and self-denial. This makes it, in Nietzsche’s words, a “will to nothingness.”

Despite his disdain for Christianity as a doctrine, Nietzsche admired Jesus as a figure who embodied love and inner peace without dogma or resentment. In The Antichrist, Nietzsche claims that Jesus lived and preached an aesthetic, not a moral life—a life of radical inner transformation that was later distorted by Paul and the Church into a system of judgment, doctrine, and power. “The very word ‘Christianity’ is a misunderstanding—at bottom there was only one Christian, and he died on the cross.” This distinction underscores Nietzsche’s central concern: that the Church preserved not the life-affirming example of Jesus, but a perverse moralism that turned life itself into something to be ashamed of.

Nietzsche’s critique of Christianity and his broader cultural diagnosis had a profound influence on Max Weber (and probably through Weber, on Gramsci), though Weber rarely acknowledged it directly. Both thinkers grappled with the consequences of secularization, but where Nietzsche feared the rise of nihilism, Weber analyzed its social forms—especially the bureaucratic rationalization of modern life. Weber’s concept of the “disenchantment of the world” (Entzauberung) echoes Nietzsche’s death of God. In a world increasingly dominated by scientific reason, bureaucratic efficiency, and instrumental logic, traditional sources of meaning—religion, myth, and metaphysics—lose their authority.

While Nietzsche calls for a new kind of individual to create meaning, Weber remains more ambivalent: he sees modernity as at once liberating and constraining. For Weber, the Protestant ethic—shaped by Calvinist Christianity—ironically laid the groundwork for modern capitalism and rational bureaucracy. This irony resonates with Nietzsche’s suspicion: values born in a religious, ascetic context end up fueling a secular, impersonal economic order. The “spirit” of asceticism survives, but stripped of its religious framework—a process Nietzsche would recognize as another transformation of values through history and ressentiment.

Nietzsche also stands in a tense and revealing relation to his fellow “masters of suspicion,” Freud and Marx. Like Nietzsche, Freud understood religion as a psychological projection, an illusion born of human weakness. In The Future of an Illusion, Freud describes religious belief as a collective neurosis—a system of wish-fulfillment designed to shield humanity from the harshness of reality. Nietzsche anticipates this view but goes further: rather than merely reducing religion to illusion, he exposes the value system behind it as a historically contingent moral framework rooted in weakness and ressentiment.

Before both of them, Marx framed religion as ideology and a painkiller—“the opium of the people”—but also as a symptom of material alienation. In the Preface to A Contribution to the Critique of Hegel’s Philosophy of Right, Marx famously describes religion as “the heart of a heartless world,” not merely false consciousness but a protest against real suffering. Nietzsche shared Marx’s insight that religion is historically embedded and socially functional, yet where Marx seeks emancipation through collective material transformation, Nietzsche seeks liberation through individual revaluation.

Each of these figures, in his own way, demands that we see religion not as divine truth but as human product—deeply implicated in structures of desire, power, and social organization. By placing Nietzsche in dialogue with Paul Ricoeur’s “masters of suspicion” thesis and Max Weber’s sociology, we begin to see the depth and range of his influence. Nietzsche does not merely critique Christianity; he inaugurates a deeper suspicion toward all inherited systems of meaning. His work represents both a demolition and a provocation—an insistence that values are not given but made, and that their history is often one of conflict, inversion, and power.

In Max Weber, we see the sociological legacy of this suspicion. While Nietzsche tears down the metaphysical edifice, Weber examines what arises in its place: a world where reason reigns but purpose fades, where institutions thrive but meaning dissolves. And in Freud and Marx, we find parallel expressions of Nietzsche’s impulse—one psychological, the other materialistic—each dismantling the illusions that uphold inherited and illegitimate authority. Together, they form a constellation of modern critique, united by a determination to uncover what lies beneath appearances and to demand a reckoning with the true sources of belief and value.

Nietzsche’s challenge endures: if the old gods are dead, and their shadows still haunt our morals and institutions, what shall we build in their place? His answer is not a system, but a call—to courage, to creativity, and to a life lived without illusion.

Before leaving this essay, I must record a note about Weber’s influence on Gramsci, which I earlier suggested. I believe my assumption that Weber influenced Gramsci is largely accurate, albeit with nuance. Gramsci does not appear to have been directly influenced by Weber in a systematic way, in the sense that he did not engage Weber’s work extensively or explicitly (not in anything I have read). However, there are converging concerns: both thinkers grapple with rationalization, the moral consequences of modernity, and the role of culture and ideology in social control. I have always been struck by the similarity between Weber’s and Gramsci’s critiques of industrialism, both finding Americanism the paradigm of bureaucratic rationality. I must conclude, then, that, indirectly, through debates circulating in early-twentieth-century European Marxism, especially through interlocutors like Georg Lukács, Weber’s influence percolated into broader intellectual currents that shaped Gramsci’s thinking.

For certain, the Frankfurt School—especially thinkers like Max Horkheimer, Theodor Adorno, and later Jürgen Habermas—more deliberately synthesized Marx, Freud, and Weber. They credited Weber with illuminating the cultural and institutional dimensions of capitalist modernity that Marx had only partially addressed. Gramsci, although often treated as a precursor or cousin to Critical Theory, maintained an independent trajectory rooted more in Marx, Machiavelli, and Italian political thought. Still, my framing is justifiable in a pedagogical context that highlights how these traditions intersect—and how Nietzsche casts a long shadow across all of them.

Perhaps one day I will produce a podcast in which I capture the essence of my lectures on the masters of suspicion in my courses Freedom and Social Control and Social Theory. If that never happens, readers of Freedom and Reason will have this essay to know what I talk about in those courses. Some will reasonably ask what, if anything, college students get out of such esoteric matters. I make two assumptions about that. First, I never presume that students are incapable of grasping the more high-minded ideas in social theory and moral philosophy. I do not see them as Hobbits (nor do I see Hobbits the way elites see the ordinary man). And, secondly, as one of my professors in graduate school once remarked to me, “Never hesitate to expand your students’ vocabulary.”

“Resign, You Racist Fuck”: Clarifying for Haters the Meaning of Race

I woke up today to a voicemail from the telephone number (920) 918-1710: “Resign, you racist fuck.” Since this was the entirety of the message, I am uncertain what the caller was reacting to. Obviously it was in reaction to something I had written, but what? Was it my defense of Israel’s war with clerical fascism that has caught civilians in Gaza in the crossfire and my criticism of the anti-Israel protests occurring across Europe and in the United States? Was it my defense of the principle of cultural integrity in the Westphalian system of sovereign nation-states organized around ethnicity, i.e., common history and shared language? I recently defended the right of the English, and by extension of Americans, Swedes, etc., to resist colonization by foreign cultures by restricting immigration and deporting illegal aliens. Or could it be my criticism of anti-white belief and practice prevailing in South Africa and my support for Trump’s policy of providing refuge for Afrikaners fleeing racial persecution?

I do know what it could not have been: any actual racism in my writing. I have on this platform carefully explained why the views I espouse are not racist (indeed they are anti-racist in any real meaning of that term). Racism, or racialism, in its strictest sense, is the belief that the human species can be divided into distinct phenotypic types that possess inherent differences in traits such as behavior, intelligence, and morality and that these differences justify the hierarchical ranking of groups as superior or inferior. On that definition, which is standard, conflating race with culture, ethnicity, nation, or religion is fallacious. Ironically, such conflation is itself an expression of racism, since it suggests that ethnicity and nation, etc., are the projection of racial differences.

Why do I say that this conflation is itself racist? Because the argument assumes that ethnic or national identities (really the same things in this argument) are fundamentally rooted in race—that is, that what defines a people or a nation is a set of inherent, biological characteristics rather than cultural, historical, or linguistic factors. Reducing ethnicity or nationality to racial essence, which one does by labeling the defense of nationalism “racist,” imports the core assumption of racism: that meaningful human groupings are defined by immutable biological traits. But opposition to the mass migration of black and brown people into Europe, for the vast majority of those seeking to preserve their cultures and nations, has nothing to do with the skin color of the new arrivals, but with ethnic differences and the shared experience of the newcomers’ stubborn refusal to assimilate to the culture of the host country.

Put another way, when so-called anti-racists label the preservation of ethnic or national identity as inherently racist, they reveal their own assumption that such identities are racial in nature. This is itself a racialist view, as it treats race as the underlying basis of culture or nationhood and erases the complex, non-biological foundations of those identities.

AI generated image using Sora

This is why I prefaced an October 2021 essay, Multiracialism Versus Multiculturalism, on the difference between multiracialism, which I see as the mark of a tolerant society, and multiculturalism, or cultural pluralism, which I have judged to be destructive to national integrity, with this observation: “Culture and race are not the same things. Culture refers to a social system of beliefs, ideas, norms, and values. Race refers to supposed genetic or otherwise essential variation in our species claimed to be meaningfully organized into types that exhibit concomitant variability in behavioral proclivity, cognitive capacity, and moral integrity. Culture is a real thing. Race is not.” (If I had written this today, I would have said more precisely that the idea of race as constructed by racism is not a real thing, since there is evidence that the large groupings humans have intuited for centuries as racial differences do have a basis in nature, but there is no evidence that race determines behavior, intelligence, or morality, or that there are superior and inferior races.)

I noted in that essay that extremists on both sides of the political-ideological spectrum conflate culture and race. I do not. I’m a humanist and individualist. I’m thus a universalist in that I work from a human rights standpoint in which all individuals, regardless of gender, race, religion, etc., are seen as entitled to the same regard, this because rights inhere in species-being. I am here drawing upon Marx’s concept of Gattungswesen, which captures the essential nature of human beings as conscious, creative, and social creatures who express themselves through purposeful, transformative labor and collective action more broadly. Put simply, we are all members of the same species and thus share a common nature; therefore we all have the same rights.

Unlike animals, Marx argues, humans do not merely react to their environment but actively shape it, and in doing so, realize their potential. This creative capacity is central to what it means to be human. Crucially, as social animals, humans do this through collective action. Under capitalism, Marx argues in his theory of alienation, workers become estranged from their species-being because their labor is no longer an expression of their humanity but a means of survival controlled by others. This suggests a solution: the abolition of social class and the de-alienation of species-being. But in the here and now, enlightened humans organize governments and national communities to protect the rights of all citizens, recognizing that not all collectivities share this understanding of human nature.

It’s a plain fact that not every group of humans possesses a founding document that states plainly: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” The American Declaration of Independence, from which that statement derives, also asserts that “to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed.” Thus is established a republican form of government that makes possible a rule of law based on a recognition of human rights, as well as national borders that protect citizens from those who would undermine the principles such a government was instituted to protect and defend.

What of the Marxist project? The vision of a future world society in which people are no longer alienated from their labor and can freely develop their human capacities in cooperation with others, thus reclaiming their true nature as species-beings, may be, theoretically, a desirable thing (albeit one that hasn’t worked out in practice). However, in a world where cultures and ideologies, particularly Islam, reject the basis of universal human rights, and reshape those societies to which they migrate, whether intentionally or unintentionally, such a world is unattainable for the foreseeable future. Therefore we must guard against the degradation of Western culture.

Keep in mind that, in 1990, the Organization of Islamic Cooperation (OIC) adopted the Cairo Declaration on Human Rights in Islam (CDHRI) as an alternative framework to the UN’s 1948 Universal Declaration of Human Rights, adopted in the aftermath of WWII. While the Cairo Declaration ostensibly affirms many human rights principles, it does so within an explicitly Islamic framework, stating that all rights and freedoms are subject to Sharia law. In other words, the position of the Muslim world is that universal human rights do not exist—until all peoples of the world are subsumed under Islamic doctrine. There is a paradox here: since Islam is a totalitarian doctrine, the moment everybody is subjected to it, universal human rights are negated for everybody.

I have written extensively about this, but it might be helpful for readers if I summarize the meaning of race as constructed by racism, which is based on well-established history and scientific knowledge. The concept of race emerged prominently during European colonial expansion from the sixteenth century onward, when early pseudo-scientific theories were used to rationalize conquest, colonialism, and slavery. It was not the sole basis of rationalization, but it was a major part of it. In the eighteenth and nineteenth centuries, thinkers like Carl Linnaeus and later proponents of “scientific racism” such as Arthur de Gobineau and others attempted to codify these ideas into racial taxonomies, often placing white Europeans at the top.

The terms “racism” and “racialism” did not come into common usage until the early twentieth century, even though, as history records, the ideas and practices associated with them existed much earlier. The word “racialism” appears to have been used by English speakers by the late nineteenth century, often in a somewhat neutral or descriptive sense to refer to belief in racial distinctions. Prior to these terms being coined, the same hierarchical beliefs about human differences were described using other language—such as “race science,” “racial superiority/inferiority,” or simply ideas of civilization and barbarism tied to physical traits. So while the terminology is relatively modern, the underlying ideology has a much longer and deeply entrenched history.

It was not until the 1920s and 1930s that the more charged and politically significant term “racism” gained wider currency, particularly in response to the rise of Nazi ideology and other forms of race nationalism. Race nationalism is typically misdescribed as ethnonationalism, a term complicated by redundancy, in that ethnicity and nation are synonyms if a nation is organized by common history and shared culture and language, as integral nation-states are. National socialism does more than this: it roots ethnic differences in race pseudoscience and segregates society on this basis. This is very different from the assimilationism of free nation-states, which demand that those who wish to become part of the citizenry adopt the culture of the host country in order to preserve the basis upon which individual liberty is sustained.

Racialist praxis and the Westphalian system of sovereign nation-states are in contradiction. So are the practices of colonialism, imperialism, and identity politics. History is messy, and so contradictions exist and persist. These ideologies became embedded in Western institutions and policies, influencing everything from slavery and segregation to apartheid and eugenics. Though modern genetics has discredited the biological basis of race, the legacy of this hierarchical framework continues to shape social structures and inequalities globally. However, one does not resolve the contradiction by promoting multiculturalism; instead one demands assimilation to Western Enlightenment values, which are superior, not because they emerge from white-majority European society, but because they are, in recognizing the objective fact of species-being, universal in character.

In a June 2019 essay, Race, Ethnicity, Religion, and the Problem of Conceptual Conflation and Inflation, I wrote the following: “That populations share genes with greater or lesser frequency is explained by a mundane fact: people tend to mate with people they live around. As a result, their offspring will generally look more like them than they will the parents of unrelated or less related offspring. The further apart the families, the more dissimilar the offspring will appear.” I noted further that, “even in migration, people tend to reproduce with those who look like the people from the places they left. Because of this, the appearance of so-called racial types enjoys stability over time and space. The same is true with language and dialect. People who live around each other will tend to sound like each other. They will also carry themselves similarly. And so on. But that does not mean they are a racial type.” I wrote these words to highlight the problem of conflation: people have come to confuse national integrity with racism. They are very different things.

I reference in that piece an earlier essay, also from June 2019, Kenan Malik: Assimilation, Multiculturalism, and Immigration, in which I discuss Malik’s 1996 book The Meaning of Race. In that book, the author critically examines the historical and ideological development of the concept of race, arguing that it is not a fixed biological reality but a social and political construct shaped by modernity. Malik traces in much greater detail than I do above how scientific racism and colonial expansion contributed to the formation and entrenchment of racial categories. Malik explains that ideas of race evolved alongside changing conceptions of human difference, identity, and power, particularly in Western societies.

Across his work, Malik highlights the dangers of both racism and cultural relativism. Ultimately, The Meaning of Race calls for a humanist, universalist perspective to counter racial thinking and promote genuine equality. Malik does not argue for an end to nation-states, however. Readers will benefit from his arguments, which are not identical to mine, but which support the argument I am making with respect to the history and function of racism and the problem of multiculturalism.

I raise Malik’s work in the present essay because I find the above conversation between Kenan Malik and Coleman Hughes useful for understanding the cross-currents that threaten to disorganize Western society. One of those cross-currents is multiculturalism, which, while recognizing that people bring elements of their culture with them when they migrate—language, religion, traditions, and ways of life—treats these as, if you will, fixed packages to be preserved rather than seeing individuals as adaptive, evolving as they interact with new environments.

Malik challenges the idea that migrants are simply “culture-bearers” in the sense of carrying a pure, unchanging tradition. He argues that treating migrants this way actually limits their agency. Instead of being seen as individuals who can reshape and question their own traditions—like anyone else—they’re boxed into representing a supposedly singular cultural identity. This perspective stifles integration and reinforces stereotypes rather than promoting freedom and individualism.

Thus, the politics of immigration and multiculturalism, both championed by the progressive left, promote two things at once: the importation of culture-bearers who are resistant to assimilation—and who, more than this, often mean to change the culture to which they immigrate—and domestic political forces that insist culture-bearers not integrate with the host country, decrying the demand that they do so as racist, which, as I have established, is a fallacious deployment of that concept.

The effect of this is the Balkanization of the West, a development marked by the emergence of ethnic enclaves and the ghettoization of modern society, where cultural and religious rules are asserted over the universalist practice of the rule of law, in which people are treated as individuals, not as groups. This is the problem of identitarianism, and while versions of it appear on the left and the right, it is leftwing identitarianism that has become the far more destructive force in the West.

One of the ways leftwing identitarianism proceeds is by conflating culture and race and haranguing those who refuse the conflation with accusations of racism.

Still Stumped About the 2020 Election

This post is inspired by Scott Adams’ May 11 Sunday live X feed, “Coffee with Scott Adams.” Adams is often the voice of reason, which seems appropriate for a platform called Freedom and Reason. On this day, one of the things he talked about was the 2020 election. He’s suspicious. He repeated his skepticism on today’s “Coffee with Scott Adams.”

As I was listening to Adams, it occurred to me that I still can’t figure out why Joe Biden received 6,264,244 more votes in 2020 than Kamala Harris received four years later. That’s nearly an 8 percent difference. Those are significant numbers. Where did those votes go? We’re told that Biden’s astonishing vote total was driven by animus towards Trump. Why would the American electorate have less animus towards Trump in 2024 than in 2020? That’s what Scott wondered. I wondered this and more: did a significant number of those voting for Biden in 2020 switch their votes to Trump?

The total vote counts for those voting for the Democratic and Republican candidates for president in 2020 and 2024 were 159,633,396 and 154,925,368, respectively. So 4,708,028 fewer voters voted for either Harris or Trump in 2024, roughly a 3 percent decline. However, Trump received 3,080,204 more votes in 2024 than in 2020, approximately a 4 percent improvement. Did voters realize their mistake in voting for Biden and, comparing his presidency to the previous four years under Trump, vote for Trump in 2024?
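For readers who want to check the arithmetic, here is a minimal Python sketch that reproduces these percentages from the totals quoted above. The inputs are the figures cited in this essay (Trump’s 2020 total is merely implied by them, not stated), so verify everything against official tallies.

```python
# Sanity check of the vote arithmetic cited above. All inputs are the totals
# quoted in this essay; readers should verify them against official tallies.

biden_2020 = 81_283_501          # Biden total, 2020, as cited above
biden_minus_harris = 6_264_244   # Biden margin over Harris (2024), as cited
harris_2024 = biden_2020 - biden_minus_harris

two_party_2020 = 159_633_396     # combined D+R vote, 2020, as cited
two_party_2024 = 154_925_368     # combined D+R vote, 2024, as cited
trump_gain = 3_080_204           # Trump gain, 2024 over 2020, as cited

# Trump total for 2020 implied by the cited figures (not stated directly above)
trump_2020 = two_party_2020 - biden_2020

print(f"Implied Harris 2024 total: {harris_2024:,}")
print(f"Biden-to-Harris decline: {biden_minus_harris / biden_2020:.1%}")
drop = two_party_2020 - two_party_2024
print(f"Two-party decline, 2020 to 2024: {drop:,} ({drop / two_party_2020:.1%})")
print(f"Trump gain: {trump_gain:,} ({trump_gain / trump_2020:.1%})")
```

Run as written, it prints the nearly 8 percent, roughly 3 percent, and approximately 4 percent figures discussed above (7.7, 2.9, and 3.9 percent, to one decimal place).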

(Image Source)

Also, I still can’t figure out how Trump won 6 of the 7 battleground states in 2016, lost all 7 in 2020, then won all 7 in 2024. Adams wondered aloud: isn’t it strange that, in the 2020 election, counties historically considered bellwethers (meaning they vote for the winning candidate) supported Donald Trump, yet Joe Biden won the election? I’ve noted before the republic’s redness. Trump won 2,564 of 3,144 counties in 2020. He won 2,633 counties in 2024. In 2016, Trump won 2,626 counties, roughly the same number he won in 2024. What explains Biden’s success in more counties compared to Clinton and Harris? Assuming a similar county count across cycles, Biden carried 580 counties to Clinton’s roughly 518 and Harris’s 511. The difference is considerable.

Biden received 81,283,501 votes in 2020. That is an astonishing number. I don’t believe it. I continue to hear it said that 2020 was “the most secure election in history.” But repeating something incessantly doesn’t make it true. It is furthermore suspicious that this claim would be repeated so frequently. Men who are genuinely innocent don’t typically feel the need to loudly proclaim their innocence.

Suppose the 2020 election was stolen. How would we know? Presumably the authorities would audit the election and see if there is something amiss. You would think, given the widespread belief that something was amiss, that an audit would help reassure everybody. But instead of an audit, the public was told that it was the most secure election in history. That’s a lot like the medical industry’s resistance to a review of vaccine safety and efficacy, dismissing calls for one by claiming vaccines are safe and effective. Shouldn’t we find out? We don’t have to, because vaccines are safe and effective. Remember the notorious circular argument that God is real because the Bible tells us so, and you can trust the Bible because it’s the inspired word of God? Yeah, that.

Suppose there was an audit and no significant evidence that the election was stolen was found. Could it be that the operation was so sophisticated that it was audit-proof? That was Adams’ point. At best, at their most honest, all the establishment can say is that they don’t know whether 2020 was “the most secure election in history.” But I, along with millions of other people, including Scott Adams, don’t believe that claim. And, speaking for myself, I never will.

Israel’s Blockade of Gaza and the Noise of Leftwing Antisemitism

I confess, I’m continually astonished by the lengths today’s leftists go to in warping normal moral understanding and backing the clerical fascist movement to eliminate Jews from the Levant. I understand it, and I will explain it below, but I remain astonished. This was the side I once proudly stood on. I can’t stand there anymore.

This isn’t anti-Zionist sentiment (even if we presume Jewish nationalism is a bad thing). It’s naked antisemitism. The Jews are literally being compared to Nazis, when the reality is exactly the opposite. The fascist threat today is Islamism—and the transnational forces that use it for their own ends.

Consider the widespread condemnation of Israel’s blockade of Gaza in light of history. The anti-Israel brigade is rampant on social media.

The Allied blockade of Nazi Germany during World War II was a major strategic effort to cut off the Reich’s access to vital imports, including food, medicine, and raw materials. The blockade severely restricted Germany’s ability to trade with neutral countries and obtain supplies from overseas, contributing to widespread shortages throughout the war.

The situation worsened drastically after 1942 as the war turned against Nazi Germany. Food became scarce, especially in urban areas, and the quality, availability, and quantity of medical supplies declined sharply.

The results were terrible. The blockade, combined with Allied bombing campaigns that reduced many German cities to rubble, left the German population struggling with inadequate medical care, malnutrition, and poor health, particularly in the final years of the war and in its aftermath.

Yet, there were no widespread protests in Allied countries against the blockade of Nazi Germany. Nor was there significant public condemnation of the Allies or widespread sympathy for Nazi Germany over the suffering caused by the blockade.

Why? Because the moral compass then pointed true north.

In Allied countries, public opinion overwhelmingly and rightly viewed Nazi Germany as the instigator of a brutal war, responsible for invading countries and committing atrocities. Sympathy for German civilians was limited, especially after revelations of war crimes and the Holocaust began to emerge. If you took Germany’s side, you were seen as a crackpot.

To be sure, toward the end of the war, humanitarian groups and individuals expressed concern about conditions in Europe, including Germany. However, their concern focused on postwar reconstruction and aid—especially for displaced persons, children, and noncombatants—rather than any sympathy for the Nazi regime.

The blockade of Gaza by Israel, much like the Allied blockade of Nazi Germany during World War II, is a strategic effort aimed at choking off a hostile and violent regime’s access to weapons, military supplies, and dual-use materials that could be used to further acts of terror.

As with the Allies during WWII, the intent is not to purposely harm civilians but to weaken a regime that has openly declared its goal to destroy a neighboring state, massacre civilians, and use its own people as human shields.

The blockade restricts Hamas’s ability to import and produce weapons while allowing humanitarian aid to reach the civilian population through controlled channels—unlike the complete stranglehold faced by German civilians in the 1940s.

As the conflict persists, humanitarian conditions in Gaza have worsened. But the root cause, as in WWII, lies with the aggressor. Hamas has diverted aid, embedded military assets in civilian areas, and prolonged suffering for strategic propaganda purposes.

Just as food shortages and medical crises in Germany escalated due to Allied military pressure and Nazi mismanagement, today’s suffering in Gaza results from Hamas’s decisions and the consequences of its violent actions.

In WWII, Allied bombing campaigns destroyed German infrastructure and cities, leaving civilians in dire conditions. But Nazi Germany was the aggressor and guilty of atrocities. Sympathy for German civilians was tempered by the recognition that the alternative, Nazi victory, was morally and strategically unacceptable.

Allowing Hamas to continue in any form is just as unacceptable. Israel must prevail in this struggle and denazify Gaza. It must occupy Gaza until the mission is complete.

Today, moral clarity is too often lost, the compass demagnetized—worse: deliberately reversed. Those who attempt to cast Israel in the role of Nazi Germany ignore the historical and ethical context.

Hamas, like the Nazis, initiated the conflict, targeted civilians, and operated with genocidal intent. That genocidal intent is inherent in Hamas’s ideology. Eliminating Jews is the core of its being. Israel, like the Allies, is responding to ensure its survival and protect its citizens from a genocidal death cult, while attempting to minimize harm to civilians caught in the crossfire.

While humanitarian concern for innocent Gazans is valid and necessary—just as it was for German civilians post-1945—it must not obscure who is responsible for the conflict and the suffering it entails.

Leftists saw things clearly during WWII in the struggle against Nazi Germany and the horrors of Judeocide. The leftist of today is on the other side.

Demonstrators in solidarity with Palestinians in Gaza, amid the ongoing conflict between Israel and the Palestinian Islamist group Hamas, London, October 21, 2023

What changed? Decades of postcolonial theory, anti-Western indoctrination, and postmodern nihilism with respect to truth and moral clarity have corrupted a generation of leftists.

The Old Left stood firm against fascism and genocide, recognizing the necessity of Allied force against an enemy bent on annihilation. Islamism is today the enemy bent on the annihilation of the Jews—just as it was during WWII, when Hajj Amin al-Husseini met with Hitler to plot the elimination of the Jews in the Arab world.

The New Left operates with a different moral framework—one that sees the West, and by extension Israel, as inherently illegitimate.

The protests and acts of civil disobedience on our streets and college campuses are not about standing up for the oppressed; they’re about dismantling a civilization the youth of the West have been taught to loathe. Israel, as the region’s only liberal democracy and an outpost of Enlightenment values, becomes a target not despite its Western character, but because of it.

What we’re witnessing is not principled anti-imperialism—it’s the repackaging of old antisemitism in radical chic. The language of anti-imperialism has become a rhetoric divorced from the principle it pretends to uphold. Like anti-racism and anti-fascism, it has become its opposite.

The Right to Resist: Cultural Survival and the Moral Consistency of Anti-Colonialism

In the modern West, it has become commonplace to label any resistance to demographic and cultural change as “racist” or “xenophobic.” Yet when indigenous populations of the past resisted colonization, they were rightly seen as brave defenders of their culture, land, and way of life. This double standard is morally inconsistent and intellectually dishonest. The right to resist colonization should not be a selective principle applied only to non-European peoples. If indigenous peoples in Australia, North America, and South Africa were justified in resisting cultural erasure and displacement—and they were—then native populations in Europe and the Americas today are equally justified in defending their cultures and nations from analogous forms of transformation.

It is essential to distinguish between reasonable immigration and colonization. Migration, when limited in scale and conducted by those willing to assimilate into the host culture, poses no existential threat to a nation. Many societies have enriched themselves through such exchanges. But colonization is something altogether different. It occurs when migration is so large in scale—or so resistant to assimilation—that it disorganizes and displaces the native culture, alters the political order, and marginalizes the native population.

This is not merely theoretical. Historical colonization worked precisely this way: settlers arrived in overwhelming numbers, imposed their culture, institutions, and language, and often relied on collaborators from among the native elite to legitimize their control. We condemn this history when it involves the British in India or Europeans in Africa, yet we hesitate to apply the same critique when similar dynamics unfold today under the banner of “multiculturalism” or “progress.”

My argument is not about race. Race is a biological concept (and a sketchy one), but nations are cultural and historical entities. A nation is an extended kinship network bound by shared customs, language, memory, and territory. These bonds matter. They create the continuity, shared purpose, and trust that make democratic self-governance possible. Defending these bonds is not racism; it is the defense of a living culture—of a people.

Those who resist mass immigration that threatens to erase their national identity are not animated by hatred of the “other,” but by love for their own. Just as a man loves his family, he loves his nation. He loves his way of life. This is patriotism. This is no different from the motivations of American Indians who resisted European settlers, or Japanese citizens who would rightly object to a hypothetical scenario in which Germany colonized Japan, changed the language, suppressed Japanese culture, and declared it a new society. It would no longer be Japan. The Japanese would be right to resist.

Throughout history, colonization has often been facilitated by members of the native population—political or economic elites who side with the colonizers, either out of self-interest or ideological alignment. These individuals are known in the literature of political economy and history as colonial collaborators. Today, many among the political and cultural elite of Western nations play a similar role, encouraging policies that accelerate demographic transformation while silencing dissent through moral condemnation. They dismiss legitimate concerns about national continuity as mere “racism,” thereby marginalizing those who dare to resist.

Patriotic Resistance (AI generated by Sora)

This is not a conspiracy theory; it is a pattern observable throughout history. Recognizing it does not imply paranoia; it signals an understanding of how power works, and of how cultures are lost not only through invasion but through slow-motion surrender.

Because the principle is clear, one common objection to this argument is to sidestep that principle and substitute others—most notably the notions of civilizational debt and reparative migration. On this account, the modern West, built in significant part on wealth extracted through colonialism, imperial domination, and slavery, carries a moral and historical burden owed to the Global South. This debt is not merely symbolic; it is cultural, material, and ongoing.

Like original sin, the guilt asserted here is not individually chosen but collectively inherited, embedded in borders, economic systems, and institutions that persist long after formal empires have collapsed. The exploitation of land, the drawing of arbitrary borders, the theft of labor and resources—these acts created asymmetries that now manifest in global inequality, migration, and war.

From this perspective, migration is not charity, but restitution owed to the living for the deeds of the dead. To deny mobility to the descendants of the colonized is to preserve the privileges of empire under a new guise—safeguarding the wealth of settler societies while externalizing the costs of their history. The West’s borders, in this view, become a kind of gated inheritance, protecting ill-gotten gain from the rightful claims of those it once dispossessed. Just as intergenerational wealth is passed down, so too, it is argued, are intergenerational debts. Migration becomes a right—not merely of movement, but something of a right of return, a right to reclaim what was taken by passing through the gates of the West. It is a right to participate in the world violently shaped in one’s name.

This argument, though rhetorically powerful, fails to withstand moral scrutiny. Legally, a person inherits his father’s wealth, not his debts. The counter is that moral debts don’t operate under legal logic: if a country has benefited from centuries of exploitation, and that wealth continues to confer advantage—through infrastructure, institutions, or global leverage—then some responsibility must accompany those advantages. Yet holding the living responsible for the crimes of the dead mistakes legacy for cause, inheritance for intent. 

Moreover, colonialism and imperialism were not democratic endeavors—they were elite ventures, orchestrated by aristocracies, merchant classes, and industrial capitalists. The working peoples of empire were not its architects; they were often its cannon fodder, its coerced laborers, and, at times, its resisters. British miners, French conscripts, Irish peasants, and Italian tenant farmers did not colonize India or Africa. They were themselves exploited, conscripted, displaced, ruled, and taxed alongside the colonized subjects of the past. When today’s discourse treats “the West” as a guilty monolith, it obscures the class dimension of historical injustice. It absolves the capitalist class and redirects attention away from the actual mechanics of empire: extraction by a few, benefit for the fewest, and manipulation of the many. It turns the global proletariat—North and South—against itself by moralizing historical suffering rather than dismantling current structures of exploitation. It redirects blame toward those who merely existed—who survived history.

Propaganda rationalizing the colonization of the United Kingdom

It also conceals the actual motive behind contemporary mass migration: to drive down wages in the West, fracture working-class solidarity, and prepare the world for the integration of the peoples of the various nations into a borderless corporate order. The elite want to make you pay for what your ancestors didn’t do in order to advance a political-economic project. A reparative politics, if it is to be just and effective (even if we accept it in principle), must distinguish between those who rule and those who are ruled. Otherwise, it dissimulates class struggle with ethical symbolism serving the interests of the elite and their functionaries; it collapses reality into moral theater—substituting abstraction for justice and absolving the elite while scapegoating the native working class.

The elite are not really asking ordinary people to pay for what they never did. Rather, they are telling the people to accept cultural erasure and social dislocation in service of an economic and ideological project they never chose. They’re asking the masses to give up their way of life for the sake of power and privilege they fear losing as capitalism unwinds in the terminal phase of its existence—an unwinding they hasten with free trade and globalization.

So the argument that, because Europeans were colonizers, they have no moral ground to complain about the colonization of their own nations is not only a primitive and regressive notion of justice but a rhetoric deployed to marginalize the majority who seek to preserve their culture and their nations. Those living are not responsible for the deeds of ghosts. No one alive today colonized North America or subjugated India. To hold modern individuals morally accountable for ancestral actions is to abandon any serious conception of justice. Moreover, whatever affluence the peoples of Western nations inherited from historical deeds does not explain their culture, customs, and traditions, nor does it negate their right to continue them.

The native peoples of the West are their own people, not avatars of their imperial past. They have as much right to keep their nations as those of the Global South. The West need not be haunted by history, but it can learn from it. And history teaches this: when a people lose control over their land, culture, and institutions, they cease to exist as a distinct people. If that loss is lamentable in the case of Native Americans or Aboriginal Australians, then it is no less lamentable, no less mournful, when it happens to the French, the English, or the Afrikaners. Appeals to civilizational guilt do not override this first principle. They are rhetorical constructions designed to obscure it.

A troubling inversion has taken place. In the name of tolerance and inclusion, many now celebrate the decline or displacement of historically Western populations—so long as it is framed as “progress.” When white South African farmers, facing political and social persecution, seek refuge in countries settled by their ancestors, they are mocked, not welcomed. Meanwhile, millions of black and brown migrants are accepted with open arms, regardless of whether they intend to integrate. If this isn’t racialized thinking, what is? What is being celebrated is not diversity, but the diminishment of a particular people—Western, to be sure, often white and often Christian. The resentment this generates is manufactured and manipulated, then dismissed as racism when it emerges. But it is no more racist than Japanese resistance to hypothetical German colonization would be. In such a case, the Japanese would be right to resist. Their cause would be righteous.

If we are to be ethically consistent, we must affirm the right of all peoples to defend their nations—not only those previously colonized. Cultural and demographic self-defense is not inherently racist; it is a form of national self-determination. History cannot be undone. But the future is still being written. If resisting colonization was justified centuries ago, it is no less justified today—regardless of who the colonizers happen to be. To deny this is not to serve justice. It is to participate in a new injustice—one that shames and silences those who wish only to remain who they are, to preserve their republics, and to pass their traditions forward.

The Danger of Missing the Point: Historical Analogies and the Israel-Gaza Conflict

X users shared a graduation speech by NYU student Logan Rozos, who took his golden moment to declare a genocide in Gaza and condemn the state of Israel. Here’s the speech:

I responded (both in the above comment and then in my own feed sharing the post): “It [is] so brave to get up on stage and defend Nazi Germany while condemning the United States and the other countries trying to free the world from the grip of fascist tyranny and terrorism. Also, I just have to say, it’s heartening to see so many young Americans stand and applaud Nazism. I don’t know [if] it will save the Nazi project to exterminate the Jews, but here’s to trying. Good job Logan Rozos. History will always remember your bravery and eloquent defense of the project to free the region from the terrible presence of such an evil force.”

I had thought the point I was making was obvious, and it appears some people grasped it, but others did not, thinking I was claiming that Rozos supported the Nazis. I do need to remind myself periodically that not every adult progresses to the optimal level in childhood development. Some become stuck at an early phase. In Jean Piaget’s theory of cognitive development, children in the Preoperational Stage—which spans roughly from ages 2 to 7—typically take language literally and struggle to understand analogies and metaphors. During this stage, children are developing symbolic thinking but have not yet acquired the ability to perform logical operations and are thus typically incapable of abstract reasoning. As a result, they interpret words and ideas concretely rather than figuratively. Literal thinking is linked to egocentrism, which is when a person has difficulty seeing perspectives other than his own. It is only in the next stage, the Concrete Operational Stage (around ages 7 to 11), that children begin to grasp analogies and metaphors as they develop more logical thinking skills. There are X users who got stuck at the Preoperational Stage.

Rather than waste my time on X explaining an obvious analogy (beyond remarking upon the arrested development of those who cannot grasp a simple one), I thought I would use this platform to explain it, presuming there are others who don’t understand its moral point. Charitably, the reason an X user might take what I said literally is that he has the moral equation inverted in his mind, which has rendered him too blind to grasp the point of the analogy. So I thought an essay might serve a more useful purpose.

Of course, many of them are not victims of arrested development. The power of egocentrism across the life-course, and the phenomenon of antisemitism rampant in Western culture, make reasonably intelligent people dumb and incapable of proper moral reasoning. Much has been said about the morality of Israel’s response to the Hamas-led massacre of Israeli civilians on October 7, 2023, with some voices going so far as to liken Israel to the Nazis. This analogy is not only false but dangerously inverted. The accurate historical parallel lies in the opposite direction: Hamas, not Israel, represents the genocidal ideology of the Nazi regime, and in the historical parallel Israel stands in the place of the Allied powers, who had the moral obligation to crush that ideology utterly and reconstruct the society from which it emerged.

View from Dresden’s town hall of the aftermath of the Allied bombings, February 1945. Roughly 25,000 people were killed, and 90 percent of the city center was turned to rubble.

So, to clarify, my analogy draws from the broader narrative of the twentieth-century global conflicts. World War I serves as a precursor in this framing, with a series of Arab-Israeli wars—particularly the 1948, 1967, and 1973 conflicts—analogous to the buildup of nationalist resentment and militaristic posturing that preceded World War II. Just as Germany invaded France in World War I and again in World War II in pursuit of regional domination, so too have Arab coalitions launched successive military attempts to destroy the Jewish state. Hamas is the genocidal analog to Nazi Germany, both regimes obsessed with eliminating the Jews—from Europe with Hitler; from the Middle East with the Palestinians.

The genesis of Hamas fits this pattern. As an antisemitic, Islamist, and theocratic movement, Hamas’s 1988 charter openly calls for the destruction of Israel and the extermination of Jews. This genocidal objective is not incidental; it’s ideological, foundational, and non-negotiable—just as Nazi Germany’s goals were. The October 7, 2023 attack was not merely a military operation; it was a barbaric massacre aimed at terrorizing and erasing Jewish life. That act of brutality mirrors, in spirit and purpose, the pogroms and atrocities committed by Nazi forces in Eastern Europe. It is a continuation of the effort in the Arab world to eliminate Jews from the region.

Historically, antisemitism in the Arab world has manifested in both rhetoric and policy. After the establishment of Israel, nearly a million Jews were expelled or fled from Arab countries, with their communities—many centuries old—being reduced to a fraction of their original size. Properties were seized, rights revoked, and populations scattered. This ethnic cleansing, though less discussed (for ideological reasons), forms part of the backdrop to the persistent hostility toward Israel. Readers may be unaware of the fact that the elimination of Jews from the Middle East (except for Israel, of course) has reduced the Jewish population there by approximately 97 percent. There is no other way to describe this history but as ethnic cleansing.

The goal of eliminating Jews from the Arab world is a longstanding one. And it is not happenstance. Even before the Jewish homeland was recognized as a nation-state, during World War II, Nazi Germany met with representatives of Palestinian Arab leadership. Amin al-Husseini, the Grand Mufti of Jerusalem, met with Adolf Hitler in November 1941 seeking support from the Nazis for Arab independence, expressing the view that the Nazis and Palestinians shared a common enemy in the Jewish people. He expressed solidarity with Nazi Germany’s goal of annihilating the Jewish population in Europe and encouraged Arab resistance against the Jewish communities in Palestine.

The proposed collaboration between the Palestinians and Nazi Germany symbolized a dark convergence of antisemitic ideologies, linking the genocidal aims of the Nazis with Palestinian nationalism. The Mufti’s alliance with Hitler remains a historical fact that underscores the deadly roots of the ideologies still present today.

It’s important for readers to remember (or learn) that, in the third and final war between the Jewish people and the Roman Empire, after crushing the Jewish Bar Kokhba revolt in 135 CE in Judea, the Roman Emperor Hadrian renamed the region “Syria Palaestina.” The renaming was intentional: by invoking the ancient Philistines—historical enemies of the Israelites—it was meant to minimize Jewish identification with the land, thereby erasing Jewish ties to the territory. Unfortunately, the name stuck and was used through various eras, including the Byzantine Empire, the Islamic caliphates, the Ottoman Empire, and the British Mandate period.

Before the twentieth century, the inhabitants of the region—Jews, along with Arabs, Christians, Druze—primarily identified with their local villages, clans, religious groups, or as part of the broader Arab world rather than a distinct “Palestinian” national identity. The term “Palestinian” was a geographic designation used by outsiders, including the British Mandate authorities (1917–1948), to describe all inhabitants of the territory regardless of ethnicity. The distinct Palestinian Arab national identity began to crystallize in the twentieth century during and after the British Mandate period, largely in response to growing Jewish immigration and Zionist claims.

Crucially, Palestine was not devoid of Jews during that time. The Jewish connection to the land—historically known as Judea, Samaria, and later Palestine—dates back thousands of years, with continuous Jewish presence despite periods of exile and foreign rule. Indeed, Jews are the indigenous people of that territory. Arabs first appear in recorded history around the 1st millennium BCE in the Arabian Peninsula. Arabs are not indigenous to the Levant. They migrated there. And now they threaten the existence of the Jewish state.

Archaeological sites, religious texts, and historical records confirm that Jewish communities lived in the region from ancient times through Roman, Byzantine, Islamic, Ottoman, and British rule. Even after the Roman expulsions and various diasporas, Jewish populations remained in cities like Hebron, Jerusalem, Safed, and Tiberias continuously. Jewish religious, cultural, and national identity has been historically intertwined with this land, which forms the foundation for modern Zionism and Israel’s claim to the territory as their ancestral homeland.

Berlin after the Allied victory over Germany

In light of these realities, Israel’s military campaign in Gaza should not be viewed through the narrow lens of proportionality alone. Rather, it is akin to the Allied invasion of Nazi Germany—a response not just to aggression but to an existential threat. The Allies did not merely repel Nazi advances; they leveled German cities, toppled the regime, and embarked on a comprehensive project of denazification. They understood that the Nazi ideology could not be appeased or contained. It had to be eradicated.

So too with Hamas. Israel is not merely fighting a militant group—it is confronting an ideology that glorifies death, martyrdom, and the annihilation of Jews. Just as Hitler sought to bring all of Europe under Nazi rule, Hamas seeks an Arab nation established on the Jewish homeland, thus expanding Arab—and Islamic—hegemony over the entire region. Just as the Allies bore the moral burden of wartime devastation in order to secure a future free from fascism, so too does Israel now shoulder the moral responsibility of rooting out a death cult that threatens not only its people but the broader values of civilization.

This is not to deny the suffering of Palestinian civilians, any more than acknowledging the justice of the Allied cause denied the suffering of German and Japanese civilians during WWII. Civilian casualties are a tragedy, as they were in Berlin, Dresden, and Tokyo. But the source of that tragedy lies not in the defenders, but in the aggressors who embed themselves among civilians while proclaiming genocidal aims. The path forward must include, as it did in postwar Germany, a full ideological and structural transformation—a denazification of Gaza. And before that objective can be realized, Hamas must be totally annihilated and Gaza occupied until the transformation is complete.

We live in a time when historical analogies are often abused or flattened into caricatures. But if history is to have any moral utility, it must guide us in recognizing evil where it arises and in supporting those who take up the burden of confronting it. In this light, Israel’s war against Hamas is not only justified—it is necessary, and, like the Allied war effort against the Nazis, morally urgent.

The widespread pro-Palestinian protests in the United States and Europe reveal a striking disconnect from the complex historical realities on the ground, bordering on the absurd when they equate Israel with Nazi Germany or ignore Hamas’s genocidal ideology. Their moral reasoning corrupted by the standpoint of postcolonial studies, the protesters advance a reductive and inverted narrative of colonial oppression that fails to account for the historical circumstances of Israel’s founding and survival amid existential threats.

While postcolonial theory can be useful for critiquing imperialism and advocating for oppressed peoples (I am being generous here), its application in this case (as well as in others, such as the South African situation) is fallacious, framing the Israeli-Palestinian conflict purely through a misapplied colonial lens and thus obscuring history and the existential threat Islamism poses not only to Israel but to the West more broadly. Reductionism and historical revisionism distort public understanding and obscure the moral clarity required to confront a movement like Hamas, whose objectives align with the total annihilation of a native people.

This is not a case of postcolonial liberation—at least not for Palestinians. The Palestinians are not colonial subjects. They are colonizers. The leftwing protests on Western campuses and streets are performative gestures rooted in global identity politics shaped by postmodernist irrationalities. Many of the protestors were taught these irrationalities at the universities they attend. The West needs to reclaim its sense-making institutions and point them back towards truth-seeking. Higher education needs a big reset.

Racial Identity Disorder

One frequently encountered argument regarding race is that race is a social construct—that it has no inherent biological basis but is instead rooted in superficial phenotypic characteristics and the stereotypes derived from them. According to this view, the social roles historically assigned to different races are arbitrary and constructed rather than essential or natural. I have long taught this point of view in my sociology courses. It was what I was taught as a student. One finds this point of view across the anthropological and sociological literature.

A parallel argument is often made about gender: that gender, too, is a social construct without essential biological grounding. Like race, it is said to be based on socially imposed stereotypes tied to outward appearance or behavior. The roles historically assigned to men and women are, in this view, culturally constructed and socially imposed rather than biologically determined. These roles are learned and internalized through socialization, including anticipatory socialization.

Yet when it comes to how society reacts to individuals crossing these constructed boundaries, we see a striking inconsistency. When a white man adopts stereotypically black dress, speech, or mannerisms—engaging in what is often termed a performance of blackness—he is accused of “blackface” and branded a bigot. His actions are condemned as offensive, and those who speak out against them are seen as standing up for racial justice. The underlying assumption here is that the white man is appropriating something that does not belong to him, something intrinsic to black identity.

However, if a man adopts stereotypically feminine dress, speech, or mannerisms—engaging in a performance of womanhood—he is not typically accused of “woman-face” or labeled a bigot. Instead, criticism of his performance is often branded as transphobia, and the man is affirmed in his self-identification. In this context, gender is treated as fluid, performative, and open to personal redefinition. Critics of this treatment of gender and personal identification are bigots.

What explains this double standard? Why is it that race, which is claimed to be a social construct, is treated as essential and exclusive, while gender, also claimed to be a social construct, is treated as flexible and inclusive?

The implication is that, despite rhetoric, society essentializes race. A black person is seen as essentially black, such that even a performative act by a non-black person is deemed a violation. This is not true in the inverse. Consider that in recent years, film and theater have increasingly embraced diverse casting, including black actors portraying roles originally written as or traditionally played by white characters—even when those characters are historical figures. This trend is celebrated as a form of artistic reimagining and social progress, aiming to correct longstanding disparities in representation. Casting choices like these are defended as empowering and inclusive, especially given that white actors have historically dominated the stage and screen, often to the exclusion of non-white performers.

The reverse—casting a white actor to play a black historical figure—would likely provoke widespread condemnation. This reaction is rooted in the history of blackface and racial caricature, which makes such portrayals deeply offensive regardless of artistic intent. Critics point to the asymmetry of power: while marginalized groups taking on roles of privilege can challenge dominant narratives, dominant groups portraying the historically marginalized can come across as appropriation or erasure. While this may seem like a double standard, it is justified by the historical and systemic context in which these portrayals occur. The prevailing consensus is that representation cannot be separated from the realities of history and power.

By contrast, gender is not treated in the same way; it is considered open to performance and self-definition in both directions. Men can be women. Women can be men. Thus, while both race and gender are often described as social constructs, the social and moral frameworks surrounding them diverge. In the case of race, society tends to treat the identity as fixed and inviolable—except in performance, where blacks are empowered to assume white roles. In the case of gender, society increasingly embraces the idea that identity is fluid and performative altogether.

We are taught that the “black role”—the cultural identity and expression of blackness—is the result of historical oppression. It emerged within the context of slavery, segregation, and systemic racism, and therefore occupies a space of resistance and survival within a dominant framework of white supremacy. Consequently, when a white man adopts the black role—by imitating black vernacular, dress, or artistic expression—he is seen as trivializing this struggle. His performance is interpreted not as homage but as mockery, a form of exploitation that reasserts racial hierarchy. He is accused of appropriating cultural expressions born of pain and resilience for his own benefit, gaining social currency while bypassing the lived experience of black oppression. Moreover, such appropriation grants him access to spaces—linguistic, social, or symbolic—that have traditionally been carved out for black people as sanctuaries or affirmations of identity.

But if this logic holds for race, why does it not apply to gender? Is the woman role not also the result of historical oppression? Has it not, too, developed under centuries of subjugation—patriarchy, legal exclusion, domestic relegation, and gender-based violence? Just as blackness is shaped by its historical position relative to white dominance, femininity has been shaped by its historical position relative to male dominance. So why is a man’s adoption of the woman role—through dress, speech, or mannerisms—not interpreted as a similar kind of appropriation?

Does it not, in parallel, allow the man to access spaces traditionally reserved for women, including shelters, support groups, athletic competitions, and intimate conversations shaped by shared experience? Does it not enable him to use the language developed within feminist and female-centric contexts—the female experience, menstrual health, or women’s empowerment—without having lived the realities they emerged from? If a white man performing blackness is accused of reinforcing racial superiority under the guise of identification, could a man performing womanhood not be seen, by the same logic, as reinforcing male privilege—using the social authority he retains as a man to redefine what it means to be a woman?

Racial Identity Disorder (AI generated by Sora)

A common defense of the man who adopts the woman role is that he does not merely perform femininity externally but experiences a deeply felt internal sense of being a woman. This sense of identity is described as essential, even if it is subjective and unverifiable. Whether labeled “gender dysphoria” or “gender identity disorder,” it has been medicalized and, in many frameworks, essentialized: the body is said to be misaligned with the true self, and thus medical intervention is appropriate—indeed, necessary—to bring external appearance into alignment with internal identity. Hormones, surgeries, social accommodations—these become pieces of affirming care, steps toward congruence and psychological well-being.

But why does this line of reasoning not apply to race? Why is there no serious public argument for the concept of a white man who believes or feels that his identity is, internally and essentially, that of a black man? You may object that this is not a widespread phenomenon, but neither was transgender identification until, in historical terms, only a little while ago. There are people who identify as the other race. Why are they not affirmed in their identity? Why not speak of “racial identity disorder” or “race dysphoria”? If one can claim that one’s inner sense of gender overrides the physical and historical realities of biological sex, then why can’t the same be said for race?

A person who identifies as black, despite being born white, might seek medical and social interventions to bring his appearance and experience into alignment with his internal identity. This could include skin darkening treatments, changes in hair texture, or speech coaching—what we might call “racial affirming care.” Even without medical procedures, such a person might present himself as black, participate in black spaces, and expect recognition of his racial identity based on personal experience and conviction. Would he be entitled to this by today’s lights? Could he be a diversity hire at the institution or organization?

No. Society overwhelmingly rejects this. Such a person is typically ridiculed, ostracized, and accused of deceit and appropriation. His internal identity is not affirmed but denied, even condemned, regardless of the depth or sincerity of his experience. The case of Rachel Dolezal, for example, was not treated as a matter of racial dysphoria but of racial fraud. The very suggestion that one could “feel black” or possess a black identity absent black ancestry is seen as offensive—a form of theft, not self-expression. But what does it mean to “feel female”? Would you not have to be one?

This again reveals the asymmetry. While gender identity is treated as subjective, psychological, and potentially in conflict with biology or social history, racial identity is treated as fixed, external, and rooted in inherited experience. Gender, we are told, lives in the mind; race, it seems, lives in the blood. Thus, despite claims that both are social constructs, society treats gender as interior and malleable, and race as exterior and immutable.

This development is striking considering the biological foundation—or lack thereof—of the categories in question. As noted at the start, the claim that race is not essential, that it is constructed from superficial phenotypic traits such as skin color, facial features, and hair texture, is widely accepted in the social sciences and in much of biology and physical anthropology. While these traits correlate with geographic ancestry, they do not meaningfully divide humanity into discrete biological races. Instead, these visible traits have been socially organized into categories—“black,” “white,” “Asian,” etc.—which are then imbued with roles, expectations, and stereotypes. In this view, race is imposed from the outside in.

Gender, however, rests on a more robust biological foundation. It is tethered to a host of deeper biological realities: chromosomal configurations (XX vs. XY), gametes (ova vs. sperm), internal and external reproductive anatomy, and secondary sex characteristics. These distinctions exist independent of social categorization and are relevant not only to reproduction but to a wide array of physiological and developmental processes. The essentialist view of sex is not a matter of superficial traits, but of fundamental biological organization.

Paradoxically, though, it is gender that is now treated as mutable, fluid, and internally defined, while race—which rests on far thinner biological ground—is treated as fixed and sacred. This reversal leads to an irony: from a purely biological standpoint, a white man performing the black role and claiming black identity may actually be less of a stretch than a man claiming to be a woman. The white man’s “whiteness” is not encoded in a unique chromosome or gamete; it is a loose proxy for ancestry and phenotype. By contrast, the man’s male body is rooted in a suite of anatomical and genetic realities that cannot be changed in kind, only in appearance.

Moreover, when viewed through a historical lens, the subordination of women by men predates the transatlantic slave trade or European colonization by millennia. Patriarchy is not merely a social system—it is, in many cultures, a civilizational bedrock, deeply interwoven with religion, law, family structure, and language. If one argues that it is offensive for a white man to appropriate black identity because it trivializes centuries of black struggle under white domination, then, by the same logic, one might argue that a man adopting womanhood trivializes thousands of years of female subjugation under male dominance.

In this light, society’s current framework—affirming gender identity while rejecting racial identity—begins to look not only inconsistent but internally contradictory. The man who claims womanhood is asking society to affirm an identity in spite of biology and history. The white man who claims blackness is denied for precisely those same reasons. The standard, then, is not principled but cultural—one built on shifting political sentiments rather than coherent logic.

This asymmetry becomes even more pronounced when we consider those who reject the binary system altogether. In the context of gender, there is growing recognition of individuals who identify as nonbinary—those who do not see themselves as either male or female, or who claim a fluid identity that transcends traditional categories. While sex remains biologically binary, gender is treated as a spectrum, or even as optional. A person can be affirmed not only in transitioning from one role to another, but in refusing to perform any gendered role at all.

But where is the analogous concept in race? Why is there no mainstream recognition of a person who is nonracial—someone who does not identify with any racial group and refuses to perform race altogether? In a society that claims race is a social construct without essential biological grounding, this should be entirely possible. If race is externally imposed and historically constructed, then surely one should be permitted to disavow it, to decline participation in racial identity just as one can decline participation in gender roles.

Yet in practice, society offers no space for racial nonconformity. A person who claims to be “raceless” (spellcheckers don’t recognize the word) is often met with confusion, suspicion, or accusations of privilege and denial. To refuse race is to refuse the terms by which power, identity, and belonging are currently organized. Even in contexts that celebrate nonbinary gender identities, racial identity remains strictly policed. One must belong to some race, even if that race is “mixed” or “other.” Racial categories are treated as compulsory and immutable, despite their acknowledged artificiality.

This raises a final and uncomfortable contradiction: gender, which is grounded in biology, can be dismissed, redefined, or transcended. Race, which is acknowledged to be a social fiction, must be performed, claimed, or affirmed. One cannot opt out of race, even though race has no essential reality. But one can opt out of gender, despite sex being among the most deeply rooted biological features of the human body. The conceptual framework that permits nonbinary gender identities but forbids nonracial ones reflects not a coherent theory of identity, but the selective application of social norms.

In the end, the double standard is not merely an inconsistency within our cultural logic—it is an inversion of biological reality. A society without racial roles and identities is imaginable, perhaps even desirable, if one accepts the premise that race is socially constructed and not biologically essential. Indeed, the human species can thrive without maintaining rigid racial categories. Nothing about reproduction, survival, or social cooperation depends on treating race as real.

But a society without gender—or more precisely, without sex distinctions—is not imaginable in the same way. Human beings are a sexually dimorphic species. The terms “man” and “woman” are not arbitrary labels but describe biological roles in reproduction. While gender is often treated as distinct from sex, in reality the two are inseparable: even if they are not treated as synonyms, gender is the cultural expression of sexual dimorphism. To argue otherwise is to deny that the categories “male” and “female” correspond to any objective natural facts. The perpetuation of the species depends on the existence—and recognition—of those very facts.

A white man and a black woman can produce children. A man and another man cannot. A black woman and an Asian man can form a family in the biological sense. A woman and a trans woman cannot. However much one may wish to reconstruct identity through culture, medicine, or language, sex remains stubbornly real, and its implications universal. It is embedded not only in human history, but in the evolutionary logic of life itself.

Therefore, if we follow the logic of biology rather than ideology, and if we are to find any of this controversial, it is not transracialism that ought to be controversial, but transgenderism. Race is contingent and context-bound; sex is cross-cultural, transhistorical, and essential to life. The irony is unmistakable: the very identity we treat as fluid and performative—gender—is the one rooted in biology, while the one we treat as immutable—race—is arguably nothing more than skin deep. We have built a cultural orthodoxy on a foundation precisely opposite the facts it claims to defend. Gender ideology is a nonsensical position, which would be fine (there are lots of nonsensical positions) but for the harm it causes in practice and its acceptance by governments and the organizations that shape our lives.