Sacred Words—Presumed and Actual Power

Words are presumed to carry power, especially words that offend people. The very idea that a word can “offend” someone depends on an imagined or assumed structure of power. When a term is labeled a slur, it is usually because it is thought to emerge from, reinforce, or call into being some underlying social hierarchy. For example, there are words that black people can use to describe white people that technically qualify as slurs, yet very few white people are seriously offended by them. There is a presumption that whites hold structural power over blacks and thus their words do not injure. Moreover, the thinking goes, whites deserve to suffer slurs since they are the oppressors. The presumed asymmetry of power flows in one direction, and that presumption shapes how the words operate. (Do you see the paradox?)

In the opposite direction, there are words that white people can use toward black people that are deeply hurtful. The assumption is that such words express or invoke a position of power, and that they carry within them the weight of a larger social asymmetry. At the same time, black people may use these same words among themselves and often argue that this usage strips the words of their oppressive power—an act of rhetorically “reclaiming” language from the dominant group.

We see a similar dynamic in words directed at gay people: slurs aimed at gay men or lesbians wound deeply, while parallel slurs thrown at straight people land with far less force. Yet accusations of homophobia, like accusations of racism, can be hurtful because they charge the accused with moral wrongdoing. In that sense, the equivalent offense on one side is the use of a derogatory term; on the other side, it is the accusation that the person is morally tainted for supposedly using or embodying a derogatory attitude that manifests the asymmetry of power.

Over time, some words become so heavily charged that even referencing them without malice becomes taboo. The power dynamic is so baked in that people avoid speaking the word outright and instead reduce it to constructions like “the N-word” or “the F-word.” Yet, everyone who hears the euphemism instantly imagines the actual word in their mind. Even the people who would be offended if they heard the word mentally summon it the moment the euphemism appears. It is in everybody’s head (or else we wouldn’t know what was being conveyed). The taboo becomes paradoxical: the word is forbidden to speak, but impossible not to think.

This dynamic is on my mind today because of the controversy surrounding the word “retarded,” now frequently replaced by “the R-word.” When I was growing up in the early 1960s, words like “idiot,” “imbecile,” and “moron” were understood as synonyms for “retarded.” Yet today “retarded” alone has taken on the status of a sacred or forbidden term. It resembles, in a way, the ancient Jewish taboo against vocalizing the actual name of God; instead, one used circumlocutions. Only priests or scribes could speak the divine name. This taboo was built on the assumption of an asymmetrical power relation between the clerical class and ordinary people. Similarly, our modern pantheon of offensive words functions as a set of secularized sacred terms—words that cannot be uttered because of the social power they are imagined to reveal.

Thus, what we call “offensive language” is really a structure of sacred language embedded within an imagined system of power. This is what postmodern philosophers describe as discursive formation: the idea that language does not so much reflect power as generate and organize it. If one is to have power, one must control language (yet another paradox). While the term is modern, the underlying phenomenon is ancient. Civilizations long before ours used regulated language—taboos, sacred terms, forbidden names—to enforce and perpetuate structures of power. In that sense, nothing about our current landscape of forbidden words is new. The observation is simply that we have reinvented an old form of linguistic sacredness under secular conditions.

When I was growing up in church, I learned something about power that I now see as parallel. I often heard it said that the devil—Satan—has only the power that God allows him. If we imagined Satan as possessing independent, self-generated power, a kind of standalone evil deity, then Judaism and Christianity would be polytheistic rather than monotheistic. But the theology I heard insisted that God alone is sovereign and that anything Satan does occurs only within limits established by God (see the story of Job).

Years ago, during a debate on CNN’s Crossfire between Frank Zappa and a guest—likely someone associated with the Moral Majority, since it occurred during their campaign to ban or label certain song lyrics—Zappa repeatedly emphasized that lyrics are simply “words,” nothing more than letters arranged in a particular order to convey an idea.

Zappa, a well-known atheist, approached the issue from a perspective I share. My objection to any theological system that forbids certain words from being spoken—what is traditionally called blasphemy—has always been strong. I find the creation and exercise of such power offensive. Here, I am not using “offensive” in the sense of hurtful words; rather, I find it offensive when systems restrict people’s freedom to speak. I find it offensive because it is illiberal and totalitarian.

The theological concept of blasphemy has been secularized: the same logic now governs prohibited social words, where uttering them—especially depending on who speaks—can trigger sanctions. This phenomenon shatters the illusion of presumed power. The real power structure is revealed when people find themselves on the disciplinary end of this linguistic control system. This is a situation of inequality; liberty is manifest when everybody enjoys equal access to words to express their thoughts.

It takes a lot of courage, I know, but we should collectively refuse to participate in a system that punishes people for uttering words and should actively work to dismantle such punitive mechanisms. It is not as if we don’t have the tools to wage this fight. The First Amendment to the US Constitution can be understood as a recognition that power structures have historically used punishment for certain forms of speech as a tool of authoritarian control. The Framers rebelled against that power. To allow a system of linguistic control is fundamentally at odds with the free and open society envisioned in American jurisprudence.

Image by Sora

Does Religious Liberty Permit Extreme and Primitive Religious Practices?

A post circulating on X claims that Japan is hostile to Islamic burial practices and that these practices are effectively banned. The claim is not entirely accurate. However, Islamic burial customs indeed face significant constraints in Japan. The post frames the issue as a suppression of religious liberty. My contribution to these threads—posed as a rhetorical question—is whether there are legitimate limits on religious freedom. Of course there are. However, before explaining why, I would like to outline Islamic burial traditions and the current situation in Japan.

Islamic tradition requires burial. It strongly prefers that the deceased be buried as soon as possible—ideally within 24 hours and traditionally before sunset if death occurs earlier in the day. Embalming is generally strongly discouraged or outright prohibited, and cremation is strictly forbidden. In Japan, however, cremation is the overwhelmingly dominant practice (99.8 percent of corpses are cremated).

A small number of Japanese cemeteries accept Muslim burials, but they are few, often far from major Muslim enclaves, and sometimes prohibitively difficult or expensive to access. When local Muslim groups attempt to establish new cemeteries, they frequently encounter strong local resistance based on concerns about cultural identity, groundwater contamination, and property values. As a result, proposed cemeteries are routinely canceled.

Japanese burial grounds (Source: Gareth Jones)

The issue, then, is less one of explicit state prohibition than of de facto exclusion resulting from administrative hurdles, community opposition, and cultural norms. In practical terms, Muslims in Japan face significant obstacles to securing a burial that aligns with their faith—an ongoing problem (for them, at least) even without a formal national ban.

My rhetorical question to posters is whether they believe it would constitute an infringement of religious liberty for Japan (or nearly any other country in the world, for that matter) to prohibit funerary practices involving endocannibalism—anthropologists’ term for the ritual consumption of members of one’s own community as part of mortuary rites. Such practices were not acts of hostility but expressions of cosmological belief, mourning, and reverence for the dead.

This is not a theoretical scenario. Various societies around the world have incorporated ritual cannibalism into their treatment of the dead, viewing it as a compassionate means of honoring the deceased, maintaining spiritual continuity, and strengthening social solidarity.

As an anthropology minor, I took an entire course on cannibalism taught by Dr. Marilyn Wells, whose fieldwork spanned Central America, East and West Africa, and Papua New Guinea. Her lectures and course materials were fascinating. When we reached the topic of endocannibalism in funerary rites, I remember clearly thinking about multiculturalism and whether Western nations should tolerate the practice in the name of religious liberty. Cannibalism is often my go-to example when testing the limits of religious freedom.

One of the best-known examples is the Fore people of Papua New Guinea, who practiced funerary cannibalism into the mid–twentieth century. For the Fore, consuming parts of the deceased preserved the person’s auma—spiritual life force—within the kin group. The auma, the source of vitality, contrasted with the aona, the physical body. Endocannibalism was the consumption of corpses, not the living, and served to protect the community from the kwela, a dangerous spirit believed to linger after death. The practice was eventually suppressed after researchers linked it to kuru, a fatal prion disease.

Concerns about groundwater contamination associated with burial—including those linked to specific Muslim burial methods—are entirely rational, and Japan is within its rights to impose restrictions for public health reasons. But more is at stake. The Japanese have the right to preserve their cultural practices within their own country—a right one might expect cultural relativists to defend.

Yet, within contemporary progressive discourse, the cultural norms of advanced societies such as European and East Asian nations are often treated with contempt and deemed unworthy of protection. Meanwhile, primitive cultures are presumed to possess an absolute right to preserve their traditions, even when doing so imposes significant burdens on the host society. Resistance to extreme and primitive religious practices is thus framed as a violation of the very religious liberties to which advanced populations are expected to subscribe.

The Fore are not the only example. The Wari’ of the Brazilian Amazon also practiced funerary cannibalism. For them, consuming the dead was the most respectful mortuary practice; unlike in Muslim tradition, burial was considered degrading and emotionally harmful. Anthropologist Beth Conklin has written extensively about how Wari’ mortuary cannibalism expressed compassion, reinforced emotional bonds, and strengthened solidarity among survivors.

Various Melanesian groups likewise practiced funerary cannibalism, often as part of cosmological frameworks that guided the spirit of the dead or preserved aspects of their essence within the lineage. In the Amazon Basin, groups such as the Amahuaca and neighboring peoples consumed parts of the body during mourning rituals. Certain indigenous Australian groups historically ingested charred bone powder as a way of symbolically incorporating the spirit of the deceased.

A recurring theme in my cannibalism course, and more broadly in the anthropological and sociological curricula, was the cultural relativist viewpoint, which I can accurately convey: although foreign or unsettling to outsiders, these practices are deeply meaningful to the cultures in which they appear. If one asks why this should matter, that is exactly the right question. Moreover, why should cultural relativism be anything more than an epistemological problem and a methodological approach? It does not follow that it should also be a moral standpoint.

Ultimately, the question comes down to this: Why should the practices of foreign cultures impose burdens on host countries? What moral obligation does a society have to tolerate religious rituals that are profoundly alien to its own traditions—especially when these practices compromise social cohesion, disrupt cultural norms, and threaten public health? While religious liberty is a vital principle, it is not absolute. Host societies have every right—and indeed, a responsibility—to set reasonable limits that protect the culture, values, and welfare of their citizens.

Moreover, tolerance must be reciprocal: just as outsiders should respect the laws and norms of the countries they inhabit, host nations are justified in shaping the boundaries of acceptable practice, particularly when the stakes involve both public safety and the preservation of cultural integrity.

If foreign culture-bearers wish to continue their traditional practices, then they need not enter those countries that do not tolerate extreme or primitive rituals. They can stay where they are. We should prefer that they do. And if they are not allowed to practice their rituals where they are, for example, because their people have been integrated into another superior, more advanced group, then the following generations can thank those who stopped them.

The Virtue of Being Wrong: How Humility Strengthens Thought

When a person discovers that they are wrong about something—especially something of significance—they ought to ask a further question: What else might I be wrong about? A single error can be dismissed as an isolated lapse (everybody makes mistakes or misses something), but recognizing a substantial error should naturally prompt a broader self-examination.

Beneath that lies an even deeper question: Why was I wrong? For it is the “why” that reveals whether there is a flaw in one’s thinking, methods of reasoning, or habits of evaluating evidence. Identifying the cause of an error helps prevent the same underlying problem from quietly generating future mistakes.

Most people do not reach this deeper question until they have first checked whether they might also be wrong about something else (of course, some people never reach the deeper question). If they uncover a second significant error, or several, and still fail to ask why their judgments are misfiring, the issue is no longer a simple mistake—it becomes a matter of cognitive integrity. A pattern of errors suggests that what requires scrutiny is not just one’s conclusions, but one’s intellectual process itself.

Realizing one was wrong about a single belief may be unremarkable. Realizing one was wrong about two or more important matters calls for a harder look at the structure of one’s thinking. Multiple false beliefs rarely occur by accident; more often, they signal a deeper problem in how a person forms, organizes, and justifies their views.

As mistaken beliefs fall away, the result can be a profound reordering of one’s worldview. But it may also result in recovering deep principles. Indeed, the ability to admit one is wrong, and to see that there is a reason one has arrived at wrong conclusions, itself points to deeper principles of which one may be unaware or which one may have forgotten.

Epistemology concerns the nature and justification of knowledge, while ontology concerns what fundamentally exists or is true. The shift I am describing may ultimately reshape both epistemology and ontology: not only how a person acquires and evaluates knowledge, but also what he believes to be true about reality. When a person confronts the roots of his own errors, both dimensions of his thinking may undergo significant revision. At the same time, as I have suggested, it can result in the reclamation of a deeper understanding.

If the latter, then what explains the buried principle or lost understanding? Affinity and ideology play a central role here. By ideology, I mean a way of thinking that systematically distorts a person’s epistemological approach—assuming, of course, that a rational and undistorted approach is possible (which I believe it is, since I believe in objective truth). Ideology does not merely mislead someone about particular facts; it warps the framework through which facts are assessed. The warping corrupts not only the content of one’s knowledge but also one’s cognitive integrity. The individual’s sense of intellectual honesty, his standards for evidence, and his capacity for self-correction can all erode under the weight of an ideology that supplies ready-made answers and shields its adherents from uncomfortable truths.

Partisan loyalty and tribal affinity also play roles in keeping people away from reason and a clear assessment of evidence—even the evidence itself.

In 2018, when I discovered I was wrong about systemic racism in American criminal justice, I wondered what else I was wrong about—and why. I began taking long walks, during which I reconsidered the things I believed, asking which beliefs were worth keeping and which needed jettisoning. Critical self-examination led to a reflection on the deeper structure of belief-formation. This led me to recover something professional development had compromised: common sense. Of course, men can’t be women. There is no science there. And what of my commitment to women’s rights? What was I thinking? That was the problem. I wasn’t. I was following.

This is where humility becomes so important to intellectual development. Humility is the cornerstone of personal growth and meaningful relationships because it allows us to acknowledge that we are not infallible. It’s okay to be wrong. It is not okay to deny oneself the capacity to admit it. It is not fair to others. And it is unfair to oneself.

Recognizing that humans can be wrong requires courage, self-awareness, and a willingness to confront our own limitations. When we admit our errors, we not only correct misunderstandings but also foster trust and openness with others. Of course, we depend on others to extend charity in such situations. Alas, one discovers that some do not wish us to be wrong, especially when they relied upon us for their appeals to authority. If the authority changes his mind on some matter dear to others, it cannot be that he corrected an error; it must be that he has become misguided in his judgment. The error is what they want to continue believing in. They lose faith in something they should never have had faith in: the infallibility of others.

Humility transforms mistakes from sources of shame into opportunities for learning, for getting closer to the truth. By embracing the possibility that one’s perspective may be flawed, a man cultivates empathy, deepens his understanding, and creates space for collaboration, ultimately becoming a wiser and more compassionate individual. Those around him with the same humility can grow with him, or at least acknowledge that opinions can differ. The stubborn can condemn the man for “switching sides.” But that’s their problem, not his.

Image by Sora

Trump and the Battle for Western Civilization

The media are reporting that President Donald Trump’s friendly Oval Office meeting with the soon-to-be mayor of America’s largest city, Zohran Mamdani, on November 21 roiled parts of the MAGA base. The New York Times was somewhat less optimistic in its assessment. “There was one moment when Zohran Mamdani seemed like he might have bit off a little more than he could chew by making his pilgrimage to the lion’s den that is President Trump’s blinged-out Oval Office,” Shawn McCreesh writes. “The 34-year-old mayor-elect of New York was pressed by a reporter if he thought his host, who was sitting about four inches away, was really ‘a fascist.’ How terribly awkward.” Indeed. “But before Mamdani could get out an answer,” McCreesh continues, “Trump jumped in to throw him a lifeline. ‘That’s OK, you could just say, Yes,’ Trump said, looking highly amused by the whole thing. He waved his hand, as if being called the worst term in the political dictionary was no big deal. ‘OK, all right,’ Mamdani said with a smile.”

New York City mayor-elect Mamdani at the side of President Trump in the Oval Office, November 21, 2025

The interpretation of this moment is easy to get right. Contrary to what progressives desperately want the public to believe, Trump is highly intelligent, and he played Mamdani like a fiddle. Smearing someone as a fascist doesn’t play well today. Not just because of overuse, but because calling a liberal businessman from Queens a fascist is so inaccurate that it draws an eyeroll from those who hear it misapplied. Their thought is: “There you go again.” By playing it cool, seated at the Resolute Desk, an imposing figure even while sitting down, Trump made Mamdani look small and insignificant. He let Mamdani, his arms folded in front of him like a schoolboy, do his thing: talk without saying anything. What anybody prepared to accept reality saw was the mayor-elect bending the knee to the President of the United States. Trump gave Democrats nothing. His strategy was obvious: when Mamdani fails, the Muslim’s sycophants won’t be able to talk about a confrontational moment at the White House.

What observers didn’t know was that Trump had something in his pocket. Just 72 hours later, the White House gave supporters (and most Americans, if they understood the situation) something they had long sought: an executive order setting in motion the designation of Muslim Brotherhood chapters as foreign terrorist organizations: “Within 30 days of the date of this order, the Secretary of State and the Secretary of the Treasury, after consultation with the Attorney General and the Director of National Intelligence, shall submit a joint report to the President, through the Assistant to the President for National Security Affairs, concerning the designation of any Muslim Brotherhood chapters or other subdivisions, including those in Lebanon, Jordan, and Egypt, as foreign terrorist organizations.”

Founded in Egypt in 1928, the Muslim Brotherhood is a transnational Islamist movement that has influenced Islamist organizations and parties worldwide. The Brotherhood plays a chief role in the Islamization project. Trump’s EO allows the federal government to investigate, among other things, the Brotherhood’s public relations firm, the Council on American–Islamic Relations (CAIR). Founded in 1994, CAIR describes its mission as advocating for Muslim Americans, fostering understanding of Islam, and protecting civil liberties. The Unity & Justice Fund, CAIR’s super PAC, donated thousands of dollars to New Yorkers for Lower Costs, one of the main PACs backing Mamdani. Mamdani is the smiling face of the Islamization project.

With this EO, Trump is signaling significant movement against the project. But he is doing much more than this. Indeed, even before the November 24 order, following his meeting with Mamdani, Trump ended Temporary Protected Status (TPS) for Somalis in Minnesota. On November 27, the President announced a review of green cards for Afghans—along with holders from 18 other “countries of concern.” The review was triggered by the targeted shooting on November 26 of two National Guard members, who were ambushed near the White House by an Afghan refugee. The shooter, Rahmanullah Lakanwal, a 29-year-old Afghan national who had previously worked with a CIA-backed paramilitary unit in Afghanistan, was one of tens of thousands imported to the United States by the Biden regime in an operation organized by then-DHS Secretary Alejandro Mayorkas.

Readers will recall that Trump has confronted Islam before. In a January 2017 essay, Executive Order 13769: Its Character and Implications, I argued that, if democracy and liberalism are to prevail, “the state must preserve secular values and practices, and every person who enjoys the blessings of liberty should dedicate himself to ensuring the perpetuation of this state of affairs. A liberal democracy must proceed based on reason.” Therefore, I continued, the conversation about Trump’s actions in 2017 should be grounded in “an understanding of the unique problem Islam presents to human freedom, as well as an examination of the European experience with Muslim immigration.” I noted that “[t]he problem that many on the left fail to consider is the corrosive effects of an ideology antithetical to the values and norms of Western society—government, law, politics, and culture—and the need for a policy that deliberately integrates Muslims with these values and norms, as well as promotes these values in the Islamic world.” I saw in the reaction to Trump’s order “an opportunity to have a broader conversation about Islam and immigration.”

Trump’s actions have Steve Bannon of the podcast War Room embracing the late Christopher Hitchens’ warning about Islam: that the Islamization (or Islamification) of the West is an existential problem. Atheists and liberals have long warned conservatives about the Islamization project, and I think I speak for many of us when I say that we welcome conservatives to the fight. We don’t have much time to turn things around, however, so the more robustly Republicans address the problem, the better (and they’d better put a strategy in place before the 2026 midterm elections). Indeed, America (and the West more broadly) should move aggressively to contain Islam in the same way the West contained communism during the Cold War. Just because Islam calls itself a religion is no reason to throw open the doors of Western civilization to Muslims. After all, as Hitchens noted, it’s not as if communism weren’t also effectively a religion; Islam, a species of clerical fascism, represents no less a threat to the internal security of the nations across the trans-Atlantic space.

I addressed this problem in recent essays (see Defensive Intolerance: Confronting the Existential Threat of Enlightenment’s Antithesis; Revisiting the Paradox of Tolerating Intolerance—The Occasion: The Election of Zohran Mamdani; Human Nature and the Limits of Tolerance: When Relativism Becomes Nihilism), as well as in a May 2019 essay, Threat Minimization and Ecumenical Demobilization. In these essays, I warn the West about the extension of the ecumenical spirit—originally aimed at creating understanding and unity across Christian sects—to fellowship with Muslims. Christianity and Islam are radically different ideological systems, and ignoring this fact prepares populations for what Canadian psychologist Gad Saad identifies as suicidal empathy. This progressive desire is, for many of the rank and file, an instantiation of misguided tolerance. For elites, it is a strategy of denationalism and the managed decline of the West.

Christianity is about charity, love, tolerance, and many other good things. But many Christians have forgotten about or never learned the history of Islamic conquest and the reality that our Christian ancestors took up swords and saved Europe from the fate suffered by the Middle East and North Africa, formerly thriving Christian centers of the world and now primitive hellholes where women are treated as second-class citizens and the fate of hundreds of millions has fallen into the hands of clerics working from a plagiarism of Judeo-Christian texts that twists those scriptures into a totalitarian system. It was Christians, including militant monks, who repelled with violence the Muslim barbarians, drove them from Europe, and secured the future for Christianity. Had they not acted when they did, there would be no Europe. No Europe, no America. No Enlightenment. No human rights. Only clerical fascism. Tragically, modern Christianity has made Nietzsche’s critique of the religion a reality by rejecting the militant side of the faith and suppressing the human instinct for self-preservation (see Republican Virtue and the Unchained Prometheus: The Crossroads of Moral Restraint and the Iron Cage of Rationality).

As I noted in those essays, Muslims have now added to the tactic of military aggression the mass migration of their populations to the West and the progressive Islamization of the trans-Atlantic space. The tactic of migration is a strategy to conquer the civilized world from within. The softest parts of Christianity, strategically exploited by transnational elites, persist in the progressive attitude that empathizes with Muslims and the barbarian hordes, while rejecting the militancy necessary to repel the existential threat Islam represents to human dignity and freedom. The failure of Westerners to take up both sides of Christianity—the soft (selectively tolerant) and the hard (militant) sides—portends disaster. At the same time, what militancy remains, progressives have aimed at their fellow Westerners. We must not be shy about calling things what they are; the left has become a fifth column in the West, working with our enemies to bring down Western civilization.

Reflecting on this, I have lost confidence in the United Nations and the efficacy of international law to defend freedom and human rights. The United Nations was founded on Western values of international cooperation and law. The Universal Declaration of Human Rights emerged from this framework. But not all member states endorsed it in substance, even if they formally signed onto it. Moreover, Muslim-majority nations developed their own declarations of rights—most notably the Cairo Declaration on Human Rights in Islam—which is founded on Sharia rather than the Enlightenment principles that gave rise to democratic republicanism and human rights. As a result, the UN includes a wide array of states whose commitments to democracy and rights are not aligned with the Western standards that originally shaped the institution. These Western standards are not arbitrary; they are the product of reason in the context of European culture, made possible by the Protestant Reformation and the broader intellectual currents of Christian civilization.

This matters when we consider cases such as Israel (see my recent essay How Did the Roles Get Reversed? The Moral Confusion Surrounding Israel and Gaza, and embedded links). If the UN or its agencies were asked to adjudicate whether Israel is responsible for genocide after the massacre of Jews in Israel on October 7, 2023, the judgment would ostensibly rest on the legal definition of genocide—a Western juridical concept. In practice, however, the judgment rendered would be heavily influenced by the political alignments and value systems of states that do not share the underlying philosophical commitments from which those legal definitions arose. Many of these states are openly hostile to Israel and to the West. Perhaps the UN won’t make this determination. But one has reason to worry it will. (And then what?)

When reflecting on this dynamic, it is easy to think of the contrast presented in Star Trek’s construct of the United Federation of Planets. Starfleet included many different species and cultures, but they were all integrated into a framework of shared values rooted in Enlightenment-style principles and liberal norms—equality, reason, tolerance, universalism. Diversity existed, but it was anchored in a common civilizational ethic. In contrast, groups like the Klingons and Romulans, who did not share these principles, remained outside the Federation and were recurring sources of conflict because their worldviews diverged so fundamentally. I raise the matter of a 1960s sci-fi TV show and its spin-offs because it shaped the beliefs of many Americans who today contemplate the world situation. Because the show portrays such antagonism as occurring out in space, they do not see the Klingons and Romulans as analogs to Muslims.

However, the contemporary terrestrial situation more closely resembles the dark side of that fictional interstellar situation. The real Earth is divided by profoundly different religious and civilizational traditions, and there is no universally accepted philosophical foundation uniting all nations. Had the West colonized the world and brought it to the principles of individualism and secularism, it would be a different matter. Yet even though the West failed to accomplish this, the mere desire is portrayed as imperial ambition. The UN project to include every state in a single system of international cooperation by tolerating the cultures of barbaric countries and regions has undermined its original purpose. Instead of a mechanism for upholding universal principles, it has become an arena in which illiberal, non-Western, and even totalitarian regimes can leverage their numbers to dilute, reinterpret, or subvert the values the institution was created to advance and defend.

Last night, I revisited an interview conducted with Hitchens (Conversations with History, UC Berkeley’s Harry Kreisler) in which he expresses optimism about the role of international law in holding member nations to account based on a universal standard of treatment. His argument is similar to arguments advanced by pro-Arab intellectuals Noam Chomsky and Norman Finkelstein, who insist on putting Israel’s fate in the United Nations’ hands. However, the validity of their argument depends on a planet-wide uniformity of values aligned with the underlying principles upon which a just international law must rest. It should be obvious that this is not the case. Given this, one must ask whether justice is what these intellectuals desire or whether their sentiments are driven more by a hostility towards the Jewish state.

The reality of the world we live in makes such uniformity impossible: the totalitarian ambitions of China, with its radically different conception of the world and its growing belligerence, along with those of Islam and the rest of the Third World. The universalism desired by those who established the United Nations and developed further the system of international law presumes the hegemony of the Western worldview. There is no such hegemony. Only in a fantasy world like Star Trek could such a situation exist. At this point, we can’t even count on Europe to uphold the foundational values that support the endeavor. Europe is well into its Islamization phase, and the pessimistic side of me has trouble believing that the continent hasn’t passed the point of no return.

We must therefore ask whether the United Nations is something worth continuing in its present form. How can we allow barbarian cultures and corrupt elements of the West to determine the fate of mankind? At the very least, how can we leave the fate of America to such madness? The situation demands a comprehensive rethink. In the meantime, Trump is doing the right thing: halting mass immigration and reviewing the status of those who have entered our country.

* * *

Because of all the anti-Western and anti-white rhetoric the occasion of Thanksgiving has provoked, I want to close with a couple of historical notes. For it was not just the false claim of “stolen land” that progressives rehearsed (see Gratitude and the Genocide Narrative: Thanksgiving and the Ideology of Historical Responsibility), but the African slave trade. “Never forget,” the advocates lecture. I’ll take them up on that.

First, the Asante (an ethnic group of modern-day Ghana) were deeply involved in the slave trade, particularly from the seventeenth through the nineteenth centuries. Readers may remember that Democrats wore the ceremonial garb of the Asante, the Kente cloth, during the BLM riots, a large-scale uprising against the West and white people triggered by the overdose death of convicted felon George Floyd while in the custody of Minneapolis police.

Second, white Europeans (millions of them) were enslaved in the Barbary States for several centuries. The Muslim slave trade—also called the Arab, Islamic, or Trans-Saharan slave trade—was one of the largest and longest-lasting systems of slavery in world history, spanning over 1,300 years, involving multiple regions and empires, and predating and outlasting the Atlantic slave trade. In fact, slavery continues in the Islamic world. I will say more about the Barbary States, in particular Tripoli, today an open-air slave market. I will then bring these closing remarks around to the point about religion and freedom.

During Thomas Jefferson’s presidency, the United States intervened militarily against the Barbary States—Algiers, Morocco, Tripoli, and Tunis—because these North African regimes sponsored piracy and the enslavement or ransoming of captured American and European sailors. For centuries, Barbary corsairs seized ships in the Atlantic and Mediterranean, forcing nations to pay tribute for safe passage. After the American Revolution, the US no longer had British naval protection, and American crews were increasingly captured. Earlier presidents agreed to pay tribute to the Barbary States, but Jefferson believed this was dishonorable and unsustainable.

In 1801, when Tripoli demanded increased payments, Jefferson refused, prompting the ruler of Tripoli to declare war. Jefferson responded by sending the US Navy to the Mediterranean, launching the First Barbary War (1801–1805). The conflict included naval blockades, ship-to-ship battles, and the famous 1804 raid led by Lieutenant Stephen Decatur to destroy the captured USS Philadelphia. The war ultimately forced Tripoli to renounce future tribute demands and release American captives, marking the first major overseas military campaign in US history and establishing America’s willingness to confront piracy and state-sponsored enslavement abroad.

As I noted in a December 2023 essay, Rise of the Domestic Clerical Fascist and the Specter of Christian Nationalism, the Treaty of Peace and Friendship with Tripoli, ratified in 1797 (its assurances were reaffirmed in the 1805 treaty that ended the First Barbary War), included a famous clause emphasizing the secular nature of the US government. “As the Government of the United States of America is not, in any sense, founded on the Christian religion,” Article 11 states, “it is declared that there is no hostility on the part of the United States to the laws, religion, or tranquility of Muslims.” This provision was intended to reassure Tripoli that the US, though largely populated by Christians, was not a religiously motivated state and had no intention of spreading Christianity through its foreign policy.

The inclusion of Article 11, however diplomatically strategic, testifies more profoundly to the American principle of separating religion from government, even in international relations, and is often cited as evidence that the US government was officially secular even while its citizens were predominantly Christian. I have invoked this clause many times in my insistence that the United States is not and should not become a theocratic state.

However, America’s adversaries do not advance such a principle; Islamic countries are not secular even while their citizens are predominantly Muslim. If they did, it might be reasonable to tolerate Muslim immigrants, as they would have been socialized in a secular culture that respected other religious faiths (or, in my case, those who have no faith at all). However, as I have explained many times, since humans are culture-bearers, those bearing cultures incompatible with secular ethics are not suited to reside in America. They should therefore be barred from entering the country.

Whether we are a Christian nation is a point reasonable people can debate, but those who believe all laws derive from Islam are a priori unreasonable people. No discussion is possible with such people. Therefore, the rational policy is to keep those animated by irrational cultures from entering and subverting Western institutions.

Gratitude and the Genocide Narrative: Thanksgiving and the Ideology of Historical Responsibility

“We didn’t land on Plymouth Rock. The rock was landed on us.”—Malcolm X

“Who controls the past controls the future. Who controls the present controls the past.”—George Orwell

In A People’s History of the United States, first published in 1980 and widely adopted in high schools, Howard Zinn argues that all history-writing is shaped by choices, just as mapmaking is. A cartographer decides what to enlarge, what to shrink, and what to leave out entirely; those decisions create a perspective, not a neutral mirror of reality. Historians, Zinn contends, do the same (but more than that, as we shall see): what they highlight or omit reflects ideology, political interests, and values (and, I must add, tribal affinity). He uses the analogy to insist that objectivity in history is impossible, because the historian must always select from an overwhelming number of facts—and those selections inevitably reflect a standpoint, usually that of governments, elites, or victors.

The analogy is true enough of science as well, where selection is likewise unavoidable yet objective knowledge is still attained, and thus Zinn’s argument crashes on the shores of a necessary truth: selection does not by itself preclude objectivity. Yet the analogy has proved useful to those who claim that truth is determined by power and standpoint, and that a marginal standpoint can legitimately revise history in the pursuit of power—a hallmark of postmodernist thought.

Below, I quote Zinn at length so readers can see exactly the perspective and politics I am criticizing in this essay—a politics I once endorsed myself, for example, in a 2012 talk to educators, A Culturally Competent and Democratic Pedagogy.

“To state the facts,” Zinn writes, “and then to bury them in a mass of other information is to say to the reader with a certain infectious calm: yes, mass murder took place, but it’s not that important—it should weigh very little in our final judgments; it should affect very little what we do in the world.”

He then deploys the mapmaker analogy:

“It is not that the historian can avoid emphasis of some facts and not of others. This is as natural to him as to the mapmaker, who, in order to produce a usable drawing for practical purposes, must first flatten and distort the shape of the earth, then choose out of the bewildering mass of geographic information those things needed for the purpose of this or that particular map.”

Zinn concedes that selection, simplification, and emphasis are inevitable for both cartographers and historians. But, he insists,

“My argument cannot be against [them],” he writes. “The map-maker’s distortion is a technical necessity for a common purpose shared by all people who need maps. The historian’s distortion is more than technical; it is ideological; it is released into a world of contending interests, where any chosen emphasis supports (whether the historian means to or not) some kind of interest, whether economic or political or racial or national or sexual.”

The ideological interest, Zinn continues, is never openly expressed the way a mapmaker’s technical interest is obvious. Instead, traditional history is presented “as if all readers of history had a common interest which historians serve to the best of their ability.” This is not intentional deception; historians have simply been trained in a society that treats knowledge as a technical problem of excellence rather than as a weapon in the hands of contending classes, nations, and races.

At the core of Zinn’s project is the smuggling in of a primitive ethic: that the living are responsible—not for historiography, but for the actual deeds of past generations. Otherwise, why would any historian’s “ideological” rendering of the past matter at all? If traditional historians distort history to evade collective, intergenerational responsibility, then the responsible progressive historian must rediscover or emphasize the facts they omit or downplay. The entire endeavor only makes sense if one first accepts that collective, intergenerational responsibility is something the living ought to bear—and bear in a way that justifies altering present arrangements.

An exercise in guilting the living

I reject that premise, as I made clear on Thanksgiving 2021 in Awokening to the Meaning of Thanksgiving. “Thanksgiving is about the living. It’s not about corpses—except for the dearly departed we remember together,” I wrote. “Those who want everybody to dwell in a narrative of collective guilt have way too much influence in today’s world. We need to be more forceful in our insistence that they sit the fuck down.”

I put the matter bluntly, I know. I was frustrated. I still am. Every time I hear a land acknowledgment at a ceremony or meeting, I sigh and roll my eyes. If I were inclined to be more disruptive, I would say something. Instead, I redirect the frustration into essays.

Two years later, in Giving Thanks Amid Uncertainty and Hopeful Developments, I wrote:

“I hope I never have a day in my life when I won’t or can’t be thankful for living in the greatest republic that ever existed—the United States of America. Although I am not responsible for the actions of those now dead and gone, I can be thankful for my ancestors who founded, built, and defended this great nation. I worry about the future, though, not only because of the threats abroad, but also because of the rot inside. The enemies of America are in charge of the machinery of the republic. I’m not religious, but I know many of you are and will pray for America. I’m thankful for that, too. We need more than prayers, though. We need action.”

(We took that action in November 2024 and returned a transformational leader to the White House.)

What I want to do in the remainder of this essay—while I wait to celebrate the day with my nuclear family—is recover from manufactured forgetting key relevant facts about Thanksgiving and show that the claim that the holiday celebrates the genocide of indigenous peoples is a recent, thoroughgoing political reinterpretation, one that emerged long after the holiday’s traditions were firmly established in American culture.

The facts of the case are objective, not ideological. Thanksgiving developed not as a commemoration of conquest but as a moral and religious day of gratitude, shaped far more by nineteenth-century Protestant culture and the exigencies of the Civil War than by early colonial events—though those events supplied moments later generations felt worth remembering.

The colonial antecedents lie in seventeenth-century New England harvest celebrations. The best-known—the 1621 Plymouth gathering—was a modest festival attended by both Pilgrims and Wampanoag during a period of alliance and mutual dependence. It was neither intended nor understood at the time as a celebration of dispossession or violence.

When Malcolm X, in his 1963 Message to the Grassroots speech, uttered the phrase quoted at the top of this essay, he could not possibly have been talking about Africans. There were no African slaves at Plymouth (or for decades after). He was deconstructing the symbolism of Plymouth Rock as the founding of a great and peaceful nation by misleading his audience—just as journalist Nikole Hannah-Jones and The New York Times Magazine would decades later with the 1619 Project—about the history of America.

The national holiday we observe today, however, owes its form to Abraham Lincoln’s 1863 proclamation, issued amid a civil war, designating a day of gratitude, prayer, and unity. Whatever nostalgic connection we retain to the Plymouth story we learned in grade school (complete with hand-traced and crudely-decorated construction-paper turkeys stapled to corkboards), modern Thanksgiving has no connection to the Indian Wars or any narrative of conquest. To teach children that it does is educational malpractice—and malpractice in American public education is as rare as medical injury.

The association of Thanksgiving with genocide is a post-1960s critical narrative born of the convergence of American Indian political mobilization (AIM and related movements), broader progressive civil-rights activism, and the rise of postcolonial, revisionist historiography rooted in postmodern corruption of our sense-making institutions. Beginning with the 1970 National Day of Mourning, activists reframed Thanksgiving as a myth that obscures catastrophic population loss, displacement, and cultural destruction. For the anti-American activist, the holiday now symbolizes the start of a tragic trajectory rather than communal gratitude. To them it means the American project is invalid.

In this telling, the American story is exceptional in terms of ethnic oppression and genocide. Indeed, this is the only kind of American exceptionalism allowed—if one wishes to avoid being smeared as a white supremacist.

The 1621 gathering itself is not a myth; it happened. But turning Thanksgiving into a day of mourning is a political act of repurposing—a classic move of woke ideology, which demands that every American story be reexamined through the lens of power, race, and structural injustice.

(When critics remind me that “woke” is an old word whose meaning has changed, they are half-right: its first mainstream print appearance was in 1962, urging black Americans to “stay woke” to racial injustice. The core purpose, however, has not changed: to make permanent the perception that America is fundamentally unjust.)

The reinterpretation of the holiday as a symbol of genocide thus represents an intentional political shift in cultural sensibilities rather than the uncovering of a hidden historical truth. But the truth of Thanksgiving was never hidden—any more than the history of slavery was hidden. The trope of “hidden history” is itself a rhetorical device for manufacturing historical forgetting.

The youth of today are taught history not as an informative exercise, or even to educate the developing person about discernment in historiography and the importance of understanding biography and history; rather, the purpose of history education since the 1960s has been pitched as the liberation of secret truths concealed by oppressors—white cisgendered Christian supremacists—to advance an imagined status quo that manufactured forgetting is meant to valorize.

Many of us who grew up before the woke era experienced Thanksgiving as a day of family and reflection (even an atheist like me could participate culturally and feel loved), unburdened by subversive political desire. I say that so younger readers may pine for a world where not everything is politicized, where the woke gaze is diminished.

My generation (born 1962) always knew about the fate of indigenous peoples. We were horrified by aspects of that history, but we recognized it as history: deeds done by the dead, for which no living person bears responsibility—even if they inherited the spoils of conquest and colonization.

America is not exceptional in this way: World history is the story of conquest and colonization; American Indians themselves arrived in worlds shaped by earlier conquests.

Progressive history revises the past in order to delegitimize the present on the fallacious premise that each generation is responsible for the sins of its predecessors. That is a primitive ethic, one that the modern world rightly buried. It should never have been resurrected from its grave.

The future is open, but it is also constrained by the present order—some elements of which are worth preserving. When in Nineteen Eighty-Four Orwell quotes O’Brien, an Inner Party member, about the past, present, and future, he highlights for readers the power of shaping history to influence society and maintain authority.

Those in power, or who are in a position to capture it, manipulate collective memory by censoring and rewriting historical events to justify their ambitions. If the ruling class or some other determined group can convince people that past events occurred in a certain way, then they can shape beliefs, values, and expectations—and this control shapes future behavior to align with their interests.

Postmodernists are right about this—the one truth they cannot deny: control over historical narrative is a tool for political domination, as people’s understanding of the present and their vision of the future are deeply influenced by what they are taught about the past. For them, it’s all about discursive power (which depends on corruption and command of society’s institutions). For those who care about facts as really-existing things, it’s about truth and justice. This is why it is vital to the life of the free republic to prevent its youth from being taught to feel guilty about their nation’s past.

(For further reading on this topic, see my July 2021 essay The Zinn Effect: Lies Your Teachers Tell You.)

Why Chicago Mayor Brandon Johnson is Full of Shit

Have you seen this yet?

The chart below illustrates why the mayor of Chicago, Brandon Johnson, is full of shit. He tells his constituents that America will never incarcerate its way out of violent crime. No social system can completely eliminate violent crime. And the best that a society with dense urban populations, widespread idleness and welfare dependency, fractured family structures, the presence in power of policymakers and politicians who promote a culture of resentment and violence, and officials who stand down law enforcement while returning lawbreakers to the street can do is reduce crime and violence to tolerable levels. The most effective way to do that? Incarceration.

Chart based on FBI and Bureau of Justice Statistics (BJS) data.

Incarceration doesn’t reduce violent crime by deterring criminals from preying on the public or warring with one another. Deterrence requires more law enforcement officers on the street and the aggressive policing of the populations there. Incarceration reduces violent crime through incapacitation. Suppose a society removes violent offenders from the streets. In that case, it follows that those who cannot abide by the rules of a decent society will be unable to commit violent crime. This is logically obvious, and the empirical evidence confirms it, as shown in the above chart.

There is no other explanation for the drastic drop in crime associated with mass incarceration. Our society is neither more equal nor less impoverished than it was in the decades before the 1960s. Criminogenic conditions only increased in the period following the 1960s, which explains the drastic rise in crime since then. What exacerbated those conditions? Ghettoization; the vast expansion of the welfare state; mass immigration that idled millions of American citizens; and the practice of defining down deviance. Who is responsible for this? Corporations and their progressive operatives in the Democratic Party, along with Republican collaborators (RINOs).

Given the degree of violent crime in American society—largely the result of decades of progressive social policy that destroyed inner-city neighborhoods and demoralized the people living in them—mass incarceration has proven the most effective intervention if the goal is to make society safer and therefore freer. That should be the aim of anyone who claims to care about other people—especially those who profess that black lives matter. Unfortunately, the same party that for the most part created these conditions continues to perpetuate them for economic and political reasons, and that party remains a significant force at both the federal and state levels. That would be the Democratic Party.

Some people view mass incarceration as an indicator of unfreedom. But the relevant question is whether the deprivation of liberty is justified. Not everybody deserves to be free. Unfreedom is justified under the principle of just deserts: if one breaks the law, there are consequences, and the consequences should keep foremost in mind the safety of those who follow the law. It is the right of the lawbreaker to be punished for his actions. It is the right of the people to be protected from those actions. Some see demographic patterns in criminal justice as evidence of systemic racism. This may be true with respect to the policies that create and exacerbate criminogenic conditions, but it is not true of the institutions that must deal with the consequences of those policies. Demographic patterns in criminal justice reflect demographic patterns in serious criminal offending.

In the final analysis, the deprivation of liberty experienced by those who commit violent crimes is the result of both progressive policies and the voluntary actions of those who suffer them. Those who abide by the law do not deserve to be victimized by those who do not. Regardless of social conditions, those who harm others choose to do so; breaking the law is a choice. Their victims—or those they are likely to victimize—have a legitimate expectation that a good society will use the most effective and immediate means available to enhance public safety. Incarceration is the most effective and immediate means to that end.

Politicians like Brandon Johnson (and JB Pritzker) do not operate from an objective, empirical standpoint. Not because they cannot—although Johnson is plainly a stupid man—but because they operate from an ideology that asks the public to imagine that demographic patterns in criminal justice are driven not by the demographics and patterns of crime but by systemic racism. This is a falsifiable proposition, and it has been repeatedly falsified. If rational and honest people are to reason objectively and scientifically, then ideologues like Johnson are among the worst politicians a city can elect. Yet citizens continue to elect them. Therein lies the deeper problem plaguing the blue city: widespread ignorance and ideological corruption among the populace.

Are there other ways to reduce violent crime? Yes. Among them: closing the borders; deporting illegal aliens; restricting public assistance to those who truly have no other means of support; and insisting that able-bodied Americans go to work. However, these measures must be pursued in tandem with aggressive law enforcement and incarceration. It will take decades to undo the harm Democrats have inflicted on American cities over the last seventy years. Given the depth of ideological corruption, partisan loyalty, tribal affinity, and imposed ignorance in this country—largely a consequence of progressive control over society’s sense-making institutions, e.g., public education—it is unlikely that citizens will be able to keep Democrats out of government and elect those who would rationally address these problems at the scale required to re-order society, restore public safety, and reverse the structural causes of criminogenic conditions (what one properly identifies as the evidence of systemic racism).

I will close by noting that the logic behind the reductions in violent crime between the mid-1990s and around 2014 is the same logic that explains why violent crime increased after 2014: the nation largely abandoned effective law-and-order policies. This was not accidental. Beginning around 2010, the mass media began promoting the myth of systemic racism and white supremacy. Wealthy individuals and organizations created and funded groups like Black Lives Matter, which persuaded millions that depolicing and decarceration were justified based on the false claim that law enforcement was inherently racist. This problem was made worse when, in 2020, Democrats opened the borders and flooded the United States with cheap foreign labor—an intentional action benefiting billionaires while disorganizing working-class communities and diminishing the life chances of American citizens.

The worsening conditions in impoverished inner-city neighborhoods are not the unintended consequences of well-meaning policy. The do-gooders are not doing good. Today’s situation is deliberate in the sense in which criminal law defines and adjudicates intent and culpability. Because of the way violent crime affects all of us, we are victims of a grand political crime perpetrated by the elite and their functionaries in the Democratic Party. As I have noted before, Republicans don’t run the blue cities. Unfortunately, congressional Republicans seem hesitant to act to stop the federal judiciary from undermining Donald Trump’s efforts to rein in violent crime.

How Did the Roles Get Reversed? The Moral Confusion Surrounding Israel and Gaza

Recent polling by Richard Baris (of Big Data Poll) shows that a large share of Americans—particularly younger voters, including many on the political right—believe that Israel committed genocide in Gaza. When asked, a plurality of registered voters (38.4%) believe “what Israel has done in Gaza amounts to genocide.” Less than 3 in 10 (29.0%) say it does not, and roughly one-third (32.6%) are unsure. Republican voters ages 18-29 agree 43.5 to 36.2 percent. That margin widens significantly among the same age group that self-identifies as America First Republicans, with nearly 60 percent agreeing with the statement. Moreover, in every group except Republicans overall, Gazans drew more sympathy than Israel, and even among Republicans sympathy for Israel falls below 50 percent. More striking still, the group with the greatest sympathy for Gaza is young devotees of the America First movement. Note also the ambivalence of many respondents. The sample size of the poll exceeded 2,000. (For Baris’s report, see Poll: Sympathy for Israel Falls to Historic Low Among U.S. Voters.)


As someone well-informed about the conflict, with an in-depth understanding of the laws of genocide and war, I find these numbers troubling. They indicate that a large proportion of the American population does not understand the situation. However, as I will return to at the end of this essay, they suggest something more disturbing: that many Americans hold Israel to a different standard than they do other nations. Assuming, charitably, that these numbers mainly reflect widespread ignorance of genocide law and of a nation’s permissible response when attacked, it is important to state that the belief that Israel perpetrated genocide in Gaza misinterprets both the legal meaning of genocide and Israel’s response to the events of October 7, 2023.

On the matter of genocide: the crime is defined by its motive, the intent to destroy an ethnic population in whole or in part. Israel did not carry out its operations in Gaza with this motive. Israel’s action in Gaza was defensive. Israel was responding to an attack by a belligerent entity on Israeli soil. Indeed, it was responding to a genocidal act, not perpetrating one. To explain this, I will draw a parallel between the Israeli-Gazan situation and Allied operations conducted against Nazi Germany during WWII. Allied actions in Nazi Germany will serve as the moral measuring rod for judging the appropriateness of Israel’s actions.

Under Nazi rule, Germany pursued a genocidal agenda, seeking to eliminate the Jews from German society and from Europe altogether, with plans to do the same in the Middle East (see Jew-Hatred in the Arab-Muslim World: An Ancient and Persistent Hatred). Following this genocidal aggression and Germany’s broader assault on Europe, the Allies unleashed a campaign of overwhelming force on German cities—Berlin, Cologne, Dresden, Frankfurt, and other urban centers—reducing them to rubble. The devastation, when viewed in photographs today (easily obtained by searching Google images, some of which appear in my essay The Danger of Missing the Point: Historical Analogies and the Israel-Gaza Conflict), bears a striking visual resemblance to Gaza. Roughly 600,000 German civilians were killed in Allied bombing alone, tens of thousands of them children, and millions of German civilians died through other causes during the war. Yet the Allied campaign is not understood as genocidal because its motive was defensive and reactive. The scale of devastation, horrific as it was, did not define the moral category. Intent did.

Hamas gunman, October 7, 2023

The Hamas attack of October 7 carried a clearly stated genocidal intention. Hamas’s foundational commitment is the removal of Jews from Palestine, which its slogan “from the river to the sea” and its charter openly articulate. The 1988 Hamas Covenant contains genocidal language, including explicit calls for violence against Jews as a group, promotion of antisemitic conspiracy theories, and framing of the conflict as a religious obligation to eliminate the “Zionist enemy.” The charter contains two particularly inflammatory provisions that are widely regarded as genocidal in intent. Article 7 quotes a well-known hadith declaring that the Day of Judgment will not arrive until Muslims fight and kill the Jews. Article 13 categorically rejects any peaceful solution or negotiation, stating that “there is no solution for the Palestinian question except through Jihad” and dismissing all diplomatic initiatives and international conferences as contrary to Hamas’s principles. Regardless of later revisions to the charter, which do not alter the intent identified above, the ideological core remains: a Jew-free Palestine. October 7 was carried out in furtherance of this genocidal goal.

Israel responded to the horrific attacks of October 7 defensively, striking Hamas targets embedded across Gaza’s densely populated urban environment. Again, crucially, the moral comparison between Germany and Hamas rests not on the scale of devastation (in lives lost, approximately 6-7 percent of the German civilian population, and 3-4 percent of the Gazan population), but on motive: in both cases, one side initiated aggression grounded in genocidal ideology; the other responded with overwhelming force designed to defeat that aggression.

Critics argue that the comparison to World War II is flawed because the Allies fought a sovereign nation-state, whereas Israel faces a non-state militant organization embedded among civilians. However, the structural form of the enemy does not alter the essential moral fact: in each case, a genocidal actor initiated the violence. Israel’s response, like that of the Allies, aimed to neutralize an entity driven by the elimination of a people, as the Hamas Covenant makes clear. Once more, intent, not political form, is the hinge of the moral argument.

Another criticism focuses on foreseeability. Critics claim that even if Israel did not intend civilian casualties, the extent of the destruction was foreseeable and therefore morally condemnable. Yet international law has long distinguished between intent and foreseeable collateral damage. Civilian casualties, even on a large scale, do not constitute genocide unless they arise from a desire to destroy a population. The Allies bombed German cities knowing that civilians would die in enormous numbers, yet their motive—to defeat a belligerent and genocidal regime—remains morally distinct from genocide itself. The same holds for Israel confronting Hamas fighters who systematically embed themselves in civilian structures precisely to produce inflated civilian death tolls.

A further argument asserts that Israel’s overwhelming military superiority imposes a heightened obligation for restraint. But superiority does not alter intent, nor does it erase the right of a nation to defend itself after suffering a genocidal massacre. Indeed, a nation acquires overwhelming military superiority to deter threats to its people and to effectively repel those threats if deterrence fails. The Allies eventually enjoyed overwhelming industrial and military superiority over Germany, yet this never transformed their defensive campaign into genocide. Nor did Israel’s campaign in Gaza become genocidal. Moral categories do not shift based on the balance of forces.

Some critics insist that Israel never truly left Gaza, pointing to border controls and airspace restrictions. This is the “Gaza under siege” narrative, which typically inflates those controls and restrictions into the language of an Israeli blockade. But Israel’s withdrawal in 2005 was complete: every soldier and every Jewish civilian was removed from Gaza. What followed was Hamas’s ascendancy and its decision to militarize Gaza, diverting international aid away from civilian needs and into tunnels and weaponry (Gaza-specific aid for the 2005–2023 period is estimated at $12–15 billion, with $3.5–4 billion coming from USAID). The dire conditions in Gaza reflect this militarization, not an Israeli desire to eliminate the population. Holding Israel responsible for the consequences of Hamas’s governance confuses cause with effect.

Critics also claim that Hamas does not represent the civilian population in the way that the Nazi regime represented Germany, making the analogy inappropriate. Yet Hamas has been the de facto governing authority of Gaza for nearly two decades. (Can it really be said that the Nazi government was representative of German interests?) Hamas has deliberately placed its military infrastructure in hospitals, schools, and residential buildings to maximize civilian exposure and to weaponize civilian casualties for political effect. When a governing authority uses civilians as shields, civilian deaths become part of its strategic calculus, not evidence of genocidal intent by the opposing force.

Some argue that the scale of destruction in Gaza must itself be taken as proof of genocide. But devastation alone does not define genocide. World War II’s destruction of Germany far exceeded what has occurred in Gaza (proportionally, possibly twice as many civilian deaths in Germany as in Gaza), yet the Allies are not remembered as perpetrators of genocide against Germans. The decisive factor in moral reasoning is always intent, not the magnitude of devastation, and Israel’s intent has been the defeat of a genocidal organization, not the extermination of a people.

This brings the analogy to one more important dimension. The Allied demand for Germany’s total surrender was followed by the project of denazification, which aimed to ensure that Germany would not repeat its genocidal aggression. Ending hostilities without uprooting the ideology at its core would have guaranteed future conflict. By contrast, the cease-fire negotiated between Israel and Gaza—despite Israel’s ongoing operations—prevented Israel from securing a total surrender from Hamas or enforcing any ideological disarmament comparable to denazification. Calls for Hamas to disarm have not been accepted by Hamas itself (and the Arab parties involved seem uninterested in pressing the issue), and nothing resembling ideological de-radicalization has occurred in Gaza. The Islamist, clerical-fascist ideology that undergirds Hamas bears a conceptual similarity to the fascism that animated Nazi Germany, but unlike postwar Germany, Gaza has undergone no ideological transformation. This is why I opposed a cease-fire. I believe Israel should have been permitted to completely remove Hamas from the territory.

Thus, Israel is not only wrongly accused of genocide; it is held to a standard that the Allies themselves were never held to. Imagine how unacceptable a resolution to WWII would have been if the war had ended through a cease-fire that left the Nazi regime intact, unreformed, undefeated, and still armed. Such an outcome would have been rightly rejected as dangerous and incomplete. A cease-fire may halt violence temporarily, but it can also freeze a conflict in a form that prevents the defensive side from accomplishing the very goal that made its campaign morally justified. Yet Israel faces precisely this situation. It is judged harshly for doing far less than what the Allies were required to do to end a genocidal threat, and at the same time it is denied the opportunity to achieve the decisive conditions that ended the fascist threat in Europe.

The charge of genocide against Israel not only fails historically, legally, and morally—it inverts the roles of aggressor and defender in a way that obscures the real dynamics of the conflict. So I close by asking readers to consider the source of the double standard. How did the sides get flipped in the minds of so many people? How does Israel become, in the eyes of millions of reasonably intelligent observers, a bad actor when the Allied victory over Germany is celebrated, and the deradicalization of a belligerent entity is seen as necessary? What is the difference between the cases? The only one I can see is that, in the case of Israel’s actions, the ethnic group defending its people from genocide is Jewish. Given the extent and intensity of anti-Jewish sentiment in the West today, perhaps this was a predictable development.

Immigrants, Billionaires, and the Failure to See the Connection

Before I get to the main topic of this essay, which concerns a recent viral video by actor Mark Ruffalo, I must first note a remarkable headline on a related matter. Yesterday, in an article about Border Patrol’s Operation Charlotte’s Web (so named because the site of the action was Charlotte, North Carolina, the city where Iryna Zarutska, a 23-year-old Ukrainian refugee, was brutally murdered by Decarlos Brown Jr.), CBS News buried the lede: “One-third of those arrested by Border Patrol in Charlotte were classified as criminals, internal document says.”

In fact, the author, Camilo Montoya-Galvez, writes, “Fewer than one-third of the individuals arrested by Border Patrol during the Trump administration’s recent immigration enforcement crackdown in Charlotte were classified as criminals, according to an internal Department of Homeland Security document obtained by CBS News.” Later in the article, he notes, “Roughly 200 green-uniformed Border Patrol agents recorded more than 270 immigration arrests during the Charlotte campaign” (his word choice makes it sound like war). Of these, “[f]ewer than 90 of those arrested by Border Patrol were categorized as ‘criminal aliens’ in the document.”

Ninety criminal aliens out of 270 would be exactly one-third, so “fewer than 90” suggests that perhaps 88 or 89 of the 270 arrested by Border Patrol fit that designation, which is determined mainly by criminal convictions but also covers those charged with a crime or engaged in conduct that makes them removable on criminal grounds. Montoya-Galvez could have written “almost” or “nearly” one-third, but that would have changed the obvious intent of the story, which was to manufacture the appearance that the Trump Administration was violating a media-manufactured promise that it would only be targeting criminal aliens for arrest and deportation. This “promise” asks the public to expect that, confronted with a detainee who is in the country illegally, Border Patrol is supposed to release that individual (who may or may not be a danger to society)—as if an illegal alien had not by definition already committed an offense: that of illegally entering a country or overstaying a visa.
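
For readers who want the arithmetic behind “nearly one-third” (a quick check using the article’s own figures of roughly 270 arrests and just under 90 designated criminal aliens):

\[
\frac{90}{270} \approx 33.3\%, \qquad \frac{89}{270} \approx 33.0\%, \qquad \frac{88}{270} \approx 32.6\%
\]

Any of these is far closer to “one-third” than the “fewer than one-third” framing suggests.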

What the headline works overtime to obscure is the remarkable fact that almost one-third of those detained by Border Patrol have a criminal record or, for some other reason, were designated as criminals. We are told, and, as the reader will see, Ruffalo repeats the myth, that the immigrants being detained and deported are much less likely to have a criminal record than citizens. Not that it matters, of course, since removal of illegal aliens would reduce the overall volume of crime, whatever the relative proportions of criminal offenders; but having documented the fact that a large proportion of illegal immigrants detained in Charlotte meet the criteria of criminal aliens, a properly phrased headline would cause the rational observer to question the myth. Moreover, the fact that one-third of those detained in Charlotte (so far) aligns with an equally astonishing statistic that the media attempts to slot into the narrative of systemic racism—that one-third of black men in America have a felony conviction—puts another of Ruffalo’s claims, namely that criminals are by and large white, to the test.

Turning now to Ruffalo’s viral video, my first reaction upon watching it was that the man is as dumb as he looks. But he’s not the only one (even if others aren’t so dumb looking). In the clip, Ruffalo begins by telling us that the immigrants are not the criminals. According to the statistics, he says, white people are the criminals. He then goes on to tell the camera that the “gift of our time” is getting to see who the true villains are, who is really making our lives unbearable, who is making us so desperate: the billionaires. It’s time, Ruffalo says, for Americans to take back our country from the extreme wealth that has its hands all over the power of the nation. Keep the immigrant. Send the billionaires packing. Then we can once more be the “land of the free and the home of the brave.”

Setting aside Ruffalo’s dubious claim about white criminality, how can a man capable of stringing words together to form more or less intelligible sentences fail to see the obvious? What is making life desperate and unbearable for millions of American workers is the billionaires and the weapon they wield against the native labor force: the immigrant. Ruffalo suffers from the same blindness that afflicts the useful idiots protesting outside the migrant detention center in the Florida Everglades (see Protests at Alligator Alcatraz: What Do The Protesters Want?): the failure to see the connection between extreme wealth and mass immigration.

Image by Sora

The pattern—both the capitalist-immigrant connection and the failure of individuals to see it—is older than most people realize. The connection hides behind the slogan “a nation of immigrants,” a foundational myth that functions as ideological mystification: a bourgeois narrative that naturalizes exploitative labor relations and obscures the use of superexploited immigrant labor to depress wages among the native born.

From the late nineteenth century through the early twentieth century, the United States absorbed wave after wave of European immigrants at the precise moment industrial capitalism was exploding. Who benefited most? The industrialists, with intellectuals like Horace Kallen selling the scheme to politicians and the public. The mass of cheap labor was the critical input that supercharged capitalist accumulation. Industrialists and their shills lobbied against any restriction on immigration, dispatched recruiters to Europe (facilitated by ethnic middlemen), and worked hand-in-glove with steamship companies to keep the human cargo flowing. The public justification was always the same: “labor shortages.” Translation: wages were too high, and high wages cut into the profits required to build Newport mansions and corner the steel market.

Capitalism’s inner logic explains why they sought cheap foreign labor. Competition compels every capitalist to maximize profit, which means minimizing labor costs. Surplus value is extracted from labor; the lower the wage, the greater the portion of the working day that is unpaid, and the fatter the owner’s margin. Flooding the labor market with immigrants also disciplines workers: when there is always someone hungrier standing behind you, strikes become risky, and unions lose leverage. At the same time, competition forces capitalists to substitute machines for men. The organic composition of capital rises, productivity increases, and fewer workers are needed. The individual capitalist who automates first gains a cost advantage; when everyone follows suit, which they inevitably do, the tendency of the rate of profit to fall intensifies the hunt for still-cheaper labor. The result is a growing reserve army of the unemployed—first immigrants, then natives—driving down wages across the board.
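
A compact way to see the logic just described, using the standard textbook formalization of Marx’s value categories rather than anything stated in this essay: write c for constant capital, v for variable capital (wages), and s for surplus value. The rate of profit is then

\[
r = \frac{s}{c+v} = \frac{s/v}{(c/v) + 1},
\]

so if the rate of exploitation s/v holds roughly steady while the organic composition c/v rises through mechanization, r falls, which is precisely the pressure that drives the hunt for still-cheaper labor.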

In the 1920s, Congress slammed the door. The Emergency Quota Act of 1921 and the Johnson–Reed Act of 1924 slashed immigration from Europe to a trickle. Industrialists howled; nativists—and, more importantly, American workers—prevailed (see my December 2018 essay Smearing Labor as Racist: The Globalist Project to Discredit the Working Class). With the foreign tap turned off, employers suddenly faced real labor shortages (that is, wages they could no longer suppress).

Their response? They turned south. Labor agents fanned out across Dixie, offering train tickets, running ads in The Chicago Defender, and building company housing. The Great Migration was born. Northern industry actively recruited blacks to replace the immigrant labor that restrictive laws had denied them. Black workers, excluded from most unions and fleeing Jim Crow, were seen as docile, desperate, and—crucially—cheap. The parallel is exact. European peasants spent decades being lured across the Atlantic by the same class that later lured sharecroppers’ sons out of Mississippi. Different skin color, same economic function: a disposable labor pool to keep native wages from rising. (See Shorthanding “Black Jobs.”)

Fast-forward a century. The game is the same. Only the costumes and cultures have changed. When immigration restrictions, postwar prosperity, and strong unions finally forced American wages upward, industry needed a new reserve army that couldn’t vote (not legally anyway), couldn’t easily unionize, and could be deported at the first sign of complaint. Enter the H-1B visa: a modern indenture dressed up as “high-skilled immigration.” (See We Need to Close the Borders; The H-1B visa Controversy: The Tech Bros Make Their Move.)

Tech billionaires and their lobbyists insist America faces a catastrophic STEM shortage, yet they rarely raise starting salaries, fund serious domestic training, or recruit from the millions of laid-off American coders already here. Instead, they fly in planeloads of young workers from India, bind them to their employer with the threat of visa revocation, and pay them 20–40 percent below market while pocketing the difference as profit. The Indian outsourcing firms (the new ethnic middleman) that dominate the program force employees to sign contracts agreeing to pay massive “liquidated damages” if they dare leave for a better job. It is debt bondage with stock options and a Silicon Valley postcode. Once again, a restricted labor pool for natives becomes a glut the moment capital is allowed to import replacements; once again, the loudest voices crying “shortage” are the same ones whose yachts keep getting longer.

Like I said, it’s an old pattern. When policy blocks one source of cheap labor, capital finds another. Close the borders to Europeans, and the factory owner reaches into the South. Open them again to the global poor, and the children of the Great Migration find themselves idled in ghettos (The Defenders of Mass Immigration Insult Native-Born Labor). The billionaire class never runs out of “labor shortages”; it only runs out of workers willing to work for little to nothing—until it imports new ones. (See The Mass Immigration Swindle; The Denationalization Project and the End of Capitalism.)

That is the connection Ruffalo and the rank-and-file progressive cannot (or will not) make. The misery Ruffalo laments is not an accident. It’s the business model he embraces when he defends open borders and condemns ICE operations. On second thought, perhaps Ruffalo isn’t as dumb as he looks. Perhaps he’s doing the dirty work for the wealthy elite who pony up the capital to finance the movies he stars in. Perhaps, like so many other celebrities, he knows who butters his bread. And what of his estate? Does he, like so many of his kind, have groundskeepers and housekeepers? Are they citizens? Probably not.

Protests at Alligator Alcatraz: What Do The Protesters Want?

A group of protesters has been blocking access to an ICE facility in Florida, the so-called Alligator Alcatraz. The protesters are not shy about explaining their motivations. Without prompting, they openly declare that ICE is “kidnapping people” and “separating families,” all in service to “fascist billionaires.”

Protesters outside the Krome detention center in Miami, Florida, November 22, 2025.

Their action and rhetoric raise intriguing questions, especially when viewed against the broader ideological stance often held by such activists. One might reasonably expect that many of these same protesters also advocate defunding the police or even prison abolition. Yet they are not seen blocking jails and prisons across the United States. Nor are they interfering with routine law enforcement interactions involving US citizens. Their actions appear highly selective—segregated across domains—focusing intensely on the treatment of immigrants, to the point of putting their own bodies on the line, while showing far less urgency toward the treatment of citizens.

The protesters’ claim that ICE is engaged in “kidnapping” is fundamentally misleading. Kidnapping is the act of illegally taking someone captive by force for nefarious purposes (perversion, ransom, sex trafficking). ICE, by contrast, operates as a branch of law enforcement, enforcing immigration laws in much the same way that other agencies enforce criminal laws. This involves detaining and arresting individuals, delivering them to the justice system for processing and adjudication, and, in many cases, deporting them. For citizens, the analogous process in other law enforcement domains may involve jail or prison. Comparatively, many more US citizens are sent to jails and prisons each year than immigrants are deported. Family separation frequently occurs in both contexts, although on a much vaster scale for citizens. Yet these protests consistently overlook the vast disparity in scale between the two systems. Why?

I will come to that. But before I do, it must also be noted that the accusation that ICE’s actions serve the interests of “fascist billionaires” is specious at best. Many of these same billionaires and large corporations favor increased immigration, as it tends to drive down wages for native-born workers and legal residents by introducing a labor force willing, or at least compelled by circumstance, to accept lower pay. On the political side, immigration also shifts partisan power dynamics, overwhelmingly benefiting Democrats, particularly progressive Democrats, who align with the transnational agendas of these corporate powers. Thus, the protestors are advancing the interests of the billionaires they describe as “fascist.”

In contrast, the populist-nationalist faction within the Republican Party—exemplified by supporters of Donald Trump—pushes for stricter immigration controls, which directly opposes the interests of many of these so-called fascist billionaires. This contradiction suggests that the protesters’ framing does not align with the complex economic and political realities at play, at least not if left-wing interests are the measure. If they were truly on the left, they would prioritize worker interests over corporate ones. Instead, they defend developments that undermine American workers and superexploit foreign ones—all for the sake of corporate power and profit.

There is an apparent irony here. These activists—who chain themselves to gates and lie down in front of federal vehicles to block the enforcement of immigration law—are unwittingly (or perhaps wittingly, but let’s be charitable) doing the bidding of the very “fascist billionaires” they claim to oppose. The billionaires in question are not the caricatured nationalists of progressive imagination; they are the architects of a post-national order who seek to erode the sovereignty of the United States and other Western nation-states in service of a globalization project led by transnational corporations. These entities envision a future in which populations, especially labor, are managed not by democratic nation-states but by a corporate-administrative regime exercising control through bureaucratic rule, digital surveillance, and technocratic systems of credit and compliance.

The Florida protest is hardly an isolated incident; it fits a recurring pattern we have witnessed across the country for months. A clear parallel can be drawn with Antifa and similar movements (in fact, many of those engaged in anti-ICE protests are Antifa). Far from resisting the corporate-led denationalization project, the protestors seek to accelerate it—by disrupting borders, undermining the legitimacy of republican institutions (in the small-r, classical sense of a self-governing polity), and eroding the very concept of civic cohesion and national integrity that serves as a counterweight to unaccountable elite power.

The irony dissolves once we acknowledge a sentiment common among the protestors: their routine condemnation of the United States as a white-supremacist settler state built on stolen land—an indictment they extend to the entire Western nation-state tradition. Far from being contradictory, their selective outrage and tactical choices reveal a deeper coherence rooted in an explicitly anti-American and anti-Western politics. Immigrants, in this worldview, are not defended primarily out of universal humanitarian concern (what about the people of America and other advanced Western countries?), but because mass immigration is seen as a solvent that erodes the cultural continuity, demographic cohesion, and historical legitimacy of the very nations these activists consider irredeemably illegitimate.

In other words, the protestors champion open borders for the same reason many ordinary citizens demand immigration enforcement: both camps recognize—whether they admit it or not—that large-scale, unassimilated immigration fundamentally disrupts the continuity of the modern nation-state, which is precisely what the transnational corporate agenda seeks. The difference lies in valuation: where one side sees dissolution as justice, the other sees an existential threat. Yet only one side has worker solidarity in mind. The protester and the border hawk agree on the transformative power of demographic change—one rationalizes it as justice and celebrates it as retribution, the other resists it as self-preservation. Self-preservation is the rational instinct.

The question for Americans in choosing comrades is whether they wish the nation-state to give way to transnational corporate control of mankind’s future, or whether they wish the West to remain a system of free nation-states in which each country shapes its own destiny according to republican principles, enshrining individual liberty and collective self-determination in a spirit of mutual interest and respect for differences. The latter requires borders.

It is not a hard choice to make. Those protesting ICE facilities advance the transnational corporate project. That project seeks to establish global corporate statism. This is the New Fascism. Either we stop it, or the world will be what Orwell asked us to imagine in Nineteen Eighty-Four: “If you want a picture of the future, imagine a boot stamping on a human face—forever.”

Marx’s Misstep: Human Nature and the Limits of Class Reductionism

In reflecting on my “sermon” yesterday (Republican Virtue and the Unchained Prometheus: The Crossroads of Moral Restraint and the Iron Cage of Rationality), I thought it necessary to present a critique of Karl Marx’s observation regarding the production of ideas and its relation to control of the means of production, a subject about which I have written many times. In approaching this matter, I have quoted favorably part of a passage from his 1845 The German Ideology, which establishes an essential truth, one I still find compelling:

The ideas of the ruling class are in every epoch the ruling ideas, i.e., the class which is the ruling material force of society, is at the same time its ruling intellectual force. The class that has the means of material production at its disposal has control at the same time over the means of mental production, so that, generally speaking, the ideas of those who lack the means of mental production are subject to it. The ruling ideas are nothing more than the ideal expression of the dominant material relationships, the dominant material relationships grasped as ideas; hence, of the relationships which make the one class the ruling one, therefore, the ideas of its dominance. The individuals composing the ruling class possess, among other things, consciousness and, therefore, think. Insofar, therefore, as they rule as a class and determine the extent and compass of an epoch, it is self-evident that they do this in its whole range, hence among other things rule also as thinkers, as producers of ideas, and regulate the production and distribution of the ideas of their age: thus their ideas are the ruling ideas of the epoch. 

However, Marx immediately follows this with an example that gets to the heart of the problem with communist thinking, that of reductionism: “For instance, in an age and in a country where royal power, aristocracy, and bourgeoisie are contending for mastery and where, therefore, mastery is shared, the doctrine of the separation of powers proves to be the dominant idea and is expressed as an ‘eternal law.’” In this example, the reader is asked to accept that the principle of separation of powers is an ideology that disguises ruling-class power by projecting the principle as a universal one, rather than an emergent or practical doctrine that prevents the domination of any one party within a reasonable system checked by ethical ideals, ideals that may, in fact, be rooted in human nature (Marx has a tortured relationship with human nature, as readers will soon see). In the case of ideals that elevate liberty above tyranny, such as those of a free republic, separation of powers may not be ideological deception but rather an arrangement that preserves liberty for all by constraining both the tyranny of the majority and rule by the minority of the opulent, and by giving a voice to the people.

Let’s allow Marx to continue for a moment longer: 

If now in considering the course of history we detach the ideas of the ruling class from the ruling class itself and attribute to them an independent existence, if we confine ourselves to saying that these or those ideas were dominant at a given time, without bothering ourselves about the conditions of production and the producers of these ideas, if we thus ignore the individuals and world conditions which are the source of the ideas, we can say, for instance, that during the time that the aristocracy was dominant, the concepts honor, loyalty, etc. were dominant, during the dominance of the bourgeoisie the concepts freedom, equality, etc. The ruling class itself, on the whole, imagines this to be so. This conception of history, which is common to all historians, particularly since the eighteenth century, will necessarily come up against the phenomenon that increasingly abstract ideas hold sway, i.e., ideas which increasingly take on the form of universality. For each new class which puts itself in the place of one ruling before it, is compelled, merely in order to carry through its aim, to represent its interest as the common interest of all the members of society, that is, expressed in ideal form: it has to give its ideas the form of universality, and represent them as the only rational, universally valid ones.

The reader might suspect here that Marx talks himself out of his own position, since it is the ideals of duty, freedom, equality, and so forth, ideals that represent the common interests of all members of society, that come to hold sway over the course of development and thus limit the actions of the ruling class. Is that not a good thing? Should we not recognize this before rejecting the separation of powers and putting our fate into the hands of the masses (direct democracy, i.e., majoritarianism) or a vanguard that claims to represent the popular interests with no checks on its power (i.e., the dictatorship of the proletariat)? Rejecting these ideals as inverted projections of aristocratic and bourgeois power risks abandoning them to mob rule or to the channeling of those passions by a new aristocracy for its own ends, whether in the form of a communist or corporate (read fascist) master, rather than grasping that some arrangements allow human nature to find its expression in just social relations, freely and openly entered into—such as those identified in yesterday’s essay.

Image by Sora

Marx’s claim that the ruling class in every era (except the original one, which I will come to) controls not only the material foundations of society but also its intellectual life has long been regarded as one of his most penetrating insights. Again, I have quoted the useful part of his formulation several times on this platform. I do find it useful, especially with the emergence of the corporate state and technocratic rule under late capitalism. But in light of what I have just presented, revisiting that formulation becomes a necessity; I cannot just leave that “out there.” Marx’s assertion that “the ideas of the ruling class are in every epoch the ruling ideas” precludes the possibility that the ideas it manufactures serve interests beyond those of the ruling class. To be sure, ideas are ultimately expressions of underlying material relationships, but these relationships are determined by really-existing human beings; dominant moral or political concepts are not merely notions articulating and justifying the interests of the class that rules. While Marx’s argument rightly underscores the intimate connection between power and the circulation of ideas, he extends the claim in a way that exposes the limitations of his framework, ultimately undermining his attempt to reduce political ideals to mere instruments of class domination.

One might object that The German Ideology was an immature work. Marx was, after all, only 27 years old. But the formulation Marx sets down here informs decades of his work. He repeats the formulation, in so many words, in the Preface to his 1859 A Contribution to the Critique of Political Economy: “In the social production of their existence, men inevitably enter into definite relations, which are independent of their will, namely relations of production appropriate to a given stage in the development of their material forces of production. The totality of these relations of production constitutes the economic structure of society, the real foundation, on which arises a legal and political superstructure and to which correspond definite forms of social consciousness. The mode of production of material life conditions the general process of social, political, and intellectual life. It is not the consciousness of men that determines their existence, but their social existence that determines their consciousness.” This is a solid critique of idealism, but what about human nature? Is it possible that social consciousness is, at least to some degree, rooted in the anthropology of our species?

Marx’s own example of the separation of powers in The German Ideology illustrates the problem. He argues that in a society where the aristocracy, bourgeoisie, and monarchy compete for power, the separation of powers becomes the prevailing doctrine, presented as an “eternal law,” though it merely reflects the accommodation among ruling groups. In this reading, the principle of divided government is not a constitutional innovation grounded in moral or practical insight but a veil concealing shared domination. This interpretation ignores why such principles emerge in the first place: not as disguises or distortions but as carefully crafted mechanisms that prevent precisely the kinds of domination that Marx suggests they hide.

Consider the American Republic. In a republic committed to preserving liberty, the separation of powers operates as a check on both the tyranny of the majority and the concentration of authority in the hands of the wealthy. It is not ideological mystification but a structural arrangement that protects the freedom of all by limiting the capacity of any faction to rule unchecked—the opposite of what is desired by the corporate state represented by the Democratic Party and those elements of the Republican establishment that oppose the return to constitutional principle. (Speaking of young men, Alexander Hamilton, one of the principal designers of federalism, was not much older than Marx when he penned 51 of the Federalist Papers’ 85 installments that helped secure the Constitution’s ratification in 1788.)

The tension in Marx’s account becomes sharper as he goes on. He notes that historians often speak of different ages as being governed by different dominant ideals—again, honor under aristocracy, equality and freedom under the bourgeoisie—and he insists that these are merely the ruling class projecting its own interests in universal form. Yet he also describes the way such ideals assume an increasingly abstract and universal character, appealing to members of all classes. What explains this? The stupidity of the common man? Perhaps. Marx does portray this as a necessary tactic of every new ruling class: its interests must be presented as the interests of all, and its concepts must appear as universally valid principles. But in characterizing the process this way, Marx acknowledges that these ideals take on an authority that exceeds the narrow interests of any particular group. Moreover, by reducing these ideals to class power (as he does in the 1859 Preface), he precludes the possibility that these ideas may be rooted in human nature, finding their expression in social arrangements appropriate to that nature. Could it be that concepts such as duty, equality, liberty, and constitutional restraint resonate across social boundaries not because they serve a ruling class, but because they articulate widely felt moral intuitions and fundamental features of human social life?

I need to bring into the discussion Marx’s concept of “species-being” (Gattungswesen) presented in the Economic and Philosophical Manuscripts of 1844. His conception of human nature provides a critical lens through which to evaluate his account of ruling ideas. Here, for Marx, humans are essentially creative and social beings whose nature is realized through conscious, productive activity shared with others. Labor is not merely a means of survival but a vehicle for self-expression and the fulfillment of human potential (echoes of John Locke). Yet in The German Ideology, he reduces moral and political ideals to instruments of class domination, leaving little room to consider how these ideals might genuinely facilitate the realization of species-being. Principles such as equality and liberty, and their elevation through constitutional government (or their expression under the original conditions of primitive communism, i.e., hunter and gatherer societies), do more than conceal or obscure ruling-class interests—they create social conditions under which humans can exercise their inherent capacities for cooperation, creativity, and rational deliberation.

Viewed through the lens of species-being, then, universal ideals may be understood not merely as ideological projections but as giving rise to structures that enable humans to develop and express their essential nature. Thus, Marx’s framework contains the seeds of a tension (not unexpected in the dialectical working out of opposing ideas if we are to be charitable): if human nature is cooperative and creative, i.e., social in a uniquely human way, some moral and political ideals must have real normative force, independent of ruling-class interests, because they sustain the conditions necessary for human flourishing. How would our species otherwise have survived for hundreds of thousands of years of its existence? Surely, we can assume that such conditions are to some significant extent universal; we are, after all, all members of the same species. Given this, does it now follow that some conditions facilitate the expression of that nature, while other conditions corrupt and suppress it?

There is a normative contradiction in Marx’s theory: Marx denies that universal moral ideals possess genuine validity, yet he relies on a universal moral horizon—human emancipation rooted in a conception of species-being—to condemn class domination. This view is even more problematic given the fact of individual differentiation across a range of attributes (Marx does not deny the Darwinian conception of natural history, nor should he). It follows from the stubborn truth of human differences that, with the complexification of social ecology over time, driven by technological innovation, itself an expression of man’s creativity, social segmentation is an inevitable development. It was his colleague, Friedrich Engels, relying in part on Marx’s notes concerning Lewis Henry Morgan’s 1877 Ancient Society, who made this very argument in explaining the emergence of social class in his 1884 The Origin of the Family, Private Property and the State. To put this another way, Marx treats all universals as ideological illusions while simultaneously appealing to a universal to ground his critique: again, the original conditions of humanity, primitive communism, the condition before the segmentation of human society.

Secondly, there is a historical contradiction: Marx claims that ruling ideas exist to reinforce ruling-class power, but the very political institutions he dismisses as ideological—bills of rights, checks on authority, constitutions, representative assemblies—have, uncorrupted by ideology and money-power, functioned precisely to limit the power of elites, thus creating the grounds for equality before the law, which Marx cannot easily dismiss as an ideological prop. Although he attempts to reduce formal equality and all the rest of it to ideological tools of ruling class power, these institutions have constrained monarchs, curbed aristocratic privilege, and held economic elites accountable to broader publics. Their effect and purpose have been to redistribute power, not conceal it—first civil rights, then political rights, and finally social rights (as T.H. Marshall showed in his seminal 1949 essay “Citizenship and Social Class”). Marx’s framework cannot account for such developments without mischaracterizing them.

What Marx misses—or conveniently skirts—is that many political ideals and institutions endure not because they mystify domination but because they successfully channel enduring features of human nature, which, I argue in this previous essay, are realized through Protestantism. Marx says as much in one of his earliest works. I have discussed this matter before, but it takes on new significance for me considering what I am grappling with in these essays. In “On the Jewish Question,” published in 1844, Marx contrasts what he calls “theoretical Christianity” with “practical Judaism” to illustrate his concern with the relationship between ideas and material life. Anticipating Max Weber, Marx characterizes theoretical Christianity as a religion of abstract, universal principles, emphasizing contemplation and moral ideals rather than concrete human needs or social relations (see Anticipating Weber: Revisiting Marx and the “Jewish Question”). Practical Judaism, in contrast, is oriented toward everyday life, the world of commerce, property, and sensuous (sinnlich) social activity. Yet, as Weber suggests, Protestantism permits Christians to pursue the worldly pursuits Judaism valorizes within moral constraints that emerge from a cultural system that cannot be reduced to material relations.

By drawing this distinction, Marx argues that genuine human liberation requires attention to material and social conditions, not just abstract legal or moral principles. My reaction to this observation now is: of course. But more must be said; for human beings, made aware of their individuality, their creative productive power, desire liberty, resent arbitrary power, respond to principles of equality and fairness, and seek institutions that distribute authority in ways that protect against abuses. Their individuality is an a priori condition unrealized by millennia of subjection.

A communist would find individualism a barrier to the project of reconstructing society along collectivist lines, since he would have to suppose that individuality is not the product of a constrained human nature (as Thomas Sowell puts it), the constraints imposed not by subjection but by natural history, but rather of an infinitely malleable nature, which is to say no nature at all. Yet constitutional structures such as the separation of powers survive because they work for all, not because they allow one group to exploit and oppress the other; they bind rulers and ruled alike, limit the sway of passion and unbridled self-interest, and make room for the exercise of reason regulated by civic responsibility and deliberation, which are simultaneously self-interested and solidarity-building. These practices reflect insights into human nature that transcend class interest, and they represent achievements in political thought that Marx’s reductionist framework cannot—or dare not—fully acknowledge.

I will, of course, defend Marx’s desire for a more equitable social result. He saw collectivism as a means to greater individual liberty; with exclusive control over the means of production, the people would be at liberty to produce for themselves. It may very well be the case that the emerging automated society will, if the people demand it, free all from necessary labor (if they don’t, then neofeudalism and administrative management are likely mankind’s fate). But, in the end, and it pains me to admit this, Marx hobbles his own argument. By reducing ideals such as equality, liberty, and the rule of law to ideological projections, he obscures the fact that these ideals—again, uncorrupted by ideology and money-power—serve as constraints on the very powers he believes they rationalize.

Marx recognizes that universal principles come to dominate political discourse, yet he cannot explain their force without conceding, however much he obscures that concession in a barrage of verbiage, that they speak to genuine human concerns. Thus, in opposition to his point, the universalization of such ideas does not merely disguise class rule—it limits it. In the final analysis, the most coherent conclusion is not Marx’s explicit one, but the one he tries to avoid: that certain political ideals and constitutional forms are not tools of domination but the means by which free people secure just social relations against domination in the first place.

This is why we must reject the claim that the desire for more equitable social arrangements is the exclusive domain of those advocating social justice. Might more just social arrangements be achieved by pushing even further the liberal ideals that have, over the centuries and across the planet, emancipated billions of human beings from communism, fascism, monarchy, and primitive religion? The question answers itself. We certainly don’t need to wonder what will happen to democracy and liberty under communist rule. Humanity already tried that. With terrible results.

* * *

I want to append to this essay a few kind words about Karl Marx, since it may seem that I am abandoning him, especially in light of my recent alignment with populist politics, where so many resist appreciating the man’s contributions to the scientific study of economics and history. I have argued before that Marxian thought—not his political project, but his contribution to anthropology and sociology—ought to serve as a foundational paradigm for the social sciences, including the study of history. In Marx and Darwin: Pioneers of Scientific Inquiry in Social and Natural History, I clarify that when I say I identify as a Marxist, I mean it in the same way one might say they identify as a Darwinist. In the annals of intellectual history, Marx is to social history what Darwin is to natural history.

In Marxist but not Socialist, I elaborate on this point by citing Christopher Hitchens’ remark during a 2006 town hall in Pennsylvania: “I am no longer a socialist, but I still am a Marxist.” Hitchens explained that he remained impressed by Marxism’s analytical rigor and historical insight—its capacity to illuminate the deep structures and internal contradictions of capitalist society, and to reveal the underlying causes of inequality and social unrest. He was particularly drawn to Marxism’s emphasis on economic justice: its vision of a society in which opportunities and resources are more equitably distributed and the needs of the many take precedence over the privileges of the few. (See also Why I am not a Socialist.)

Hitchens, of course, began his political life as a committed socialist, deeply involved with the International Socialists, a Trotskyist organization. Over time, he became disillusioned with socialism as a workable political project. By the 1990s and 2000s, he believed that much of what passed for socialism had degenerated into a form of corrupt populism, and he no longer regarded the international working-class movement he had once envisioned as a plausible engine of global change. This growing disappointment led him to step back from socialism as a political goal. He also came to see capitalism as a far more revolutionary force for good; in his estimation, the bourgeois revolution still had unfinished business.

After 9/11, Hitchens aligned himself with certain strands of neoconservative foreign policy. This shift reflected his deep loathing of clerical fascism, particularly in its contemporary Islamic form. He came to view the struggle against Islam and other totalitarian movements as a moral imperative. This stance placed him at odds with much of the left (this would be even more true today), even though he remained steadfast in his defense of liberal, secular values against what he perceived as existential threats. Throughout all this, he upheld his commitments to civil liberties and human rights. (Then again, today’s left can hardly be counted upon to defend liberal, secular values, civil liberties, and all the rest of it. Indeed, the New Left appears to be very much against these Old Left ideas.)

Despite these political shifts, Hitchens continued to describe himself as a Marxist—intellectually, if not politically. It is in this sense that I echo his formulation: Marxist, but not socialist. What he sought to preserve in Marxism is the same thing I aim to preserve: the method, specifically, the materialist conception of history. This approach holds that economic and material forces—rather than ideals or metaphysical motivations—ultimately drive the development of human societies. No Geist is unfolding the world toward a teleological end. Like Hitchens, I still regard Marx’s analytical framework as an effective tool for understanding historical dynamics and the transformative power of capitalism, even as I reject socialist politics in practice.