A century ago, the most potent moral figure in the West was Jesus Christ. Believers and unbelievers alike accepted him as an ethical exemplar. Not to do so was to make oneself an outcast. But now, our most potent moral figure is Adolf Hitler. In our relativist, pluralist age, he is our one reference point by which we know what is evil.

We still believe that Jesus is good—but not with the fervor and conviction with which we believe that Nazism is evil. Crosses and crucifixes have lost most of their cultural power. They can be played with, even joked about, and no one really minds. The swastika packs a far greater punch. Play or joke with that, and you make yourself a monster.

To understand the postwar age, we need to realize this: It is the age of Hitler. Since 1945, the man with the toothbrush mustache has dominated our moral imaginations. I don’t object to demonizing him. But we should be clear about what we’ve done: We have replaced a positive exemplar with a negative one. We’ve taught ourselves that evil looks like a Nuremberg rally, when in fact evil takes many forms. And we’ve persuaded ourselves that we live in an age that has no religion, when in fact we are believers in the myth of World War II. So I am pleased that the wheels are coming off that myth, that we are losing faith in the age of Hitler. I think there’s a good chance that something better will come next.

Western culture didn’t switch out a Christian moral system for an anti-Nazi one without good reason. Christianity’s moral authority had been decaying for centuries, but what made the decline terminal was World War II: the modern age’s keenest moral test, and a test the Christian churches failed dismally. Not just because so many of them openly collaborated with or consented to Nazism, fascism, and nationalism. Nearly as bad was the slowness of the non-collaborating churches to wake up to what was going on. The most glaring sin was the churches’ ancient complicity in antisemitism, but that was only the centerpiece of a wider tableau of moral failings, which the war had mercilessly exposed.

The devil’s bargain by which Christians in Germany and elsewhere consented to Nazi or fascist rule was not some ghastly blunder. It was based on a Christian hierarchy of values. Most Christians did not actually approve of cruelty, warmongering, and systematic murder; they simply cared more about maintaining social order, about defending Christianity against mockers, profaners, and blasphemers (including Jews), and about reasserting Christian sexual and family morals.

Even groups like the German Confessing Church, who offered some level of resistance to Nazism, were far more concerned with protecting their own independence and preserving their own moral purity than with opposing their rulers’ crimes. A tiny handful of visionaries, such as Dietrich Bonhoeffer, perceived the magnitude of what was happening, but even in the Confessing Church, most only argued about details while being grateful that the trains ran on time and the communists were kept out. As the Confessors’ most important leader, Martin Niemöller, later admitted: It might have been distasteful that the Gestapo were coming for the socialists, the trade unionists, and the Jews, but that was tolerable, as long as they weren’t coming for us. It was a serious, authentic Christian judgment. That is why it was so unforgivable.


When the war came, Winston Churchill and plenty of others tried to cast it as a war for “Christian civilization,” but the label didn’t stick. Franklin Roosevelt found firmer ground with his talk of “Judeo-Christian civilization” and of fundamental freedoms to be defended “everywhere in the world.” By the war’s end, the Allies were calling themselves the “United Nations,” fighting for all of humanity against tyranny. They seamlessly bequeathed that name to the organization they founded to police the new world order, an organization created to speak for all of humanity, not for one country or one faith.

In the late 1940s, as the Western powers moved from World War to Cold War without missing a beat, it almost seemed as if those “universal” ideals might be eclipsed and the death camps forgotten, with Christian civilization having acquired a Jewish asterisk. Godless communism was now the enemy, and in 1956 “In God We Trust” became the United States’ official motto. Church membership surged in America and even recovered in Europe. Hollywood turned out Biblical epics, reminiscent of the Victorian fashion for fictionalized lives of Christ. The Greatest Story Ever Told ran on ABC radio from 1947 to 1956. The long-planned movie of the same name, which finally appeared in 1965, seemed like a license to print money.

And yet its global box-office takings of $15 million covered only three-quarters of its production costs. In part, it was just a bad movie: four ponderous, sententious hours, remembered mostly for John Wayne’s unintentionally comic cameo as the centurion at Calvary. But the main problem was timing. The producers found themselves on the wrong side of a cultural watershed. The early 1960s had happened, and reverence was no longer the order of the day.

There are almost too many reasons why the early 1960s were such a pivotal cultural moment: the Pill, the Bomb, the Beatles. But beneath the seismic shifts of the 1960s lay the delayed shock of World War II, which set off the secularizing tsunami that has since swept over the Western world. Crucially, the secularization of the 1960s was not driven by Europe’s and America’s tiny band of campaigning atheists. The churches were facing something much more dangerous: a terrible loss of faith in themselves, a loss of faith rooted in the crisis of the war against Hitler.

The unwitting avatar of this loss of faith was Dietrich Bonhoeffer, one of the first Christians truly to understand “the radical evilness of evil” that Nazism represented. This understanding drove him to abandon his church’s supine politics and led ultimately to his judicial murder by the Nazis in April 1945. But his now famous letters from prison show him groping toward radical conclusions about where the war had left his faith. As he notoriously put it: “We are proceeding towards a time of no religion at all: men as they are now simply cannot be religious anymore.” It was not a lament. Bonhoeffer had become so appalled by the churches and their parody of Christianity that he believed their demise might be God’s will. Maybe, he wondered, “a world come of age” had outgrown religion? His hope was that “religion is no more than the garment of Christianity”: a soiled and tattered garment, which it was time now to abandon. What was needed was “a religionless Christianity.”

He was aware that this was a paradox, not a plan. If the old garment was to be flung off, what would replace it? As he said at the end of his fullest letter on this subject: “the outward aspect of this religionless Christianity, the form it takes, is something to which I am giving much thought, and I shall be writing to you about it again soon.” In his letters he returned again and again to this question: What might be left when hierarchies, forms, jargon, wealth, and power have been stripped away, leaving a Christlike Christianity that serves the world in weakness from the cross? The letters are full of these sorts of phrases: “I am thinking over the problem at present,” “More about that next time, I hope.” If he found an answer before the Nazis hanged him, his surviving letters do not record it. His martyr’s death was itself a kind of answer, but hardly a practical model for any church.

What he did not expect was that after his death those inconclusive private wrestlings would be published and read as a manifesto, their authority sealed by his sacrifice. A certain kind of Christian in the 1950s was drawn to Bonhoeffer and those like him, ranging from Paul Tillich to John Robinson. These figures represented clarity, urgency, and impatience with the complacent churchiness that, in the 1930s, had ignored or belittled the moral emergency of fascism until it was far too late. In the 1950s, Bonhoeffer’s spirit seemed to be embodied in churchmen such as the English anti-apartheid campaigner Trevor Huddleston and, above all, Martin Luther King Jr. 

And so, when the moment of cultural flux arrived in the early 1960s, some of the most compelling and authoritative Christian voices were advocating, not conventional Christian ethics, but radical or indeed secular applications of them. For the Christians who formed the backbone of the American civil rights movement, it became a point of principle to play down their religious identity and forge broad alliances that paid no heed to faith. The Student Nonviolent Coordinating Committee (SNCC), the leading student civil-rights organization, was created mostly by Christian students who deliberately embraced a secular identity so as to build as broad a coalition as possible. In Britain, the Student Christian Movement (SCM), which had been leading a modest uptick in Christian affiliation in the 1950s, redefined itself in the 1960s by adopting an “openness” policy. Its new general secretary, Ambrose Reeves—a veteran of South Africa’s anti-apartheid struggle and a proud ally of King—declared that “we can best serve the churches by ceasing to be a ‘religious’ society.” The SCM began to define itself, not in traditionally religious terms, but by the political causes it supported. And as a result, it almost vanished. In the ten years before 1973, it lost 90 percent of its members. It did manage to outlive the SNCC, which dissolved in 1970 after growing numbers of radicals decided the Christian-inspired commitment to nonviolence was cowardly rather than principled.

These acts of institutional self-sacrifice were the religious crisis of the 1960s in miniature. It was a time when Bonhoeffer’s inconclusive musings felt like a prophetic summons. Many Christians suddenly felt that the one thing they could no longer do with a good Christian conscience was to assert their Christianity. To claim that their story was the greatest ever told had once seemed innocent, even banal. Now it seemed like sinful arrogance.

So when secularists were emboldened to mock Christianity during the 1960s, prominent Christians rushed to join in. They believed, more fervently than any scoffer, that the churches were part of the problem. Many Christians in the West felt that to assert their traditional doctrines would be exclusive, offensive, or discriminatory. Their concern was at least as much for conscience as for appearances, and it has only grown stronger since. To the extent that Western societies have become secular, this is one of the principal reasons: Many Christians consciously and deliberately decided that it should be so.

Which still leaves Bonhoeffer’s question unanswered. If religion, our traditional arbiter of right and wrong, was to be cast off like a worn-out garment, what should replace it?

In February 1943, the American troop ship Dorchester was torpedoed in the North Atlantic. Four military chaplains—two Protestants, a Catholic, and a Jew—were aboard. According to survivors’ accounts, the chaplains worked together to hurry men into lifeboats, then distributed lifejackets. When the lifejackets ran out, they gave their own to four young soldiers. They then joined hands, singing and praying together as the ship sank. Reportedly, all four were reciting the Shema, the Jewish affirmation of God’s oneness, as the waters took them. The “Four Chaplains” were swiftly commemorated as symbols of America’s war for Judeo-Christian civilization.

Every time I tell their story to a lecture audience, I struggle to get through it without a catch in my throat. Since I usually try to display the cynicism expected of a professor, this is embarrassing. I have to tell myself firmly not to be sentimental. But for me, and not just for me, that story and others like it from World War II have a visceral appeal. In the same way, Winston Churchill’s speeches have, in Britain at least, sunk into our collective consciousness like holy writ. Christopher Nolan’s 2017 movie Dunkirk (which I also struggle to get through dry-eyed) ends with the dazed soldiers, back in the temporary safety of England, being handed a newspaper report of Churchill’s “We shall fight on the beaches” speech of June 4, 1940. One soldier reads the speech to his comrade, without flourishes, stumbling and halting, like an exhausted, uneducated man reading aloud a text he is seeing for the first time. The scene works because we, the audience, know the speech already. We can hear Churchill’s cadences even as the soldier mangles them. The very clumsiness of the reading helps us to hear the familiar words afresh. Usually, the only words that need to be rescued from over-familiarity in this way are the words of Scripture.

Nothing affects me in quite the same way as tales of World War II. In particular, the narrative of heroic self-sacrifice at the heart of my Christian faith—the story that was, once, the greatest ever told—simply does not have the same grip on me. Throughout the Christian centuries, the story of Christ’s Passion has often had that effect. A great many Christians have found themselves emotionally transported or shattered by the Passion narratives. But in our own age—including for many of us who still identify as Christians—that emotional immediacy is simply not there, or is accessible only through conscious devotional effort.

That is, plenty of us still believe in our religions, but often not with the same intuitive immediacy and blithe faith with which we believe in our culture’s true religion: World War II, the greatest story we have ever told.

The French novelist Laurent Binet calls the war “our Trojan War: a landmark, a reference, a source of inexhaustible stories, a collection of epics and tragedies.” It is all that, but it is also our Paradise Lost, our epic of meanings and values, dominated by its endlessly fascinating central villain. It is, more than we sometimes remember, the basis for our most fundamental convictions about what is good and what is evil.

In the age of Hitler, the post–World War II age in which we live, “humanity” is our shared faith. The concept of “human rights” is of course much older, going back to the age of Enlightenment and beyond, and most famously to Thomas Jefferson, who held it to be self-evident “that all men are . . . endowed by their Creator with certain unalienable rights.” And yet this claim is a problem. Not only because Jefferson wrote it while holding hundreds of men and women in slavery, but also because it is simply, factually wrong. Jefferson and a few of his Enlightenment friends thought that the existence of human rights was a self-evident truth; but it can’t be, because a great many people in a great many historical settings have not believed in any such thing. The claim that “We hold these truths to be self-evident” reveals the doctrine of human rights for what it is: a castle in the air, a defiant existential assertion of values.

But that is not the deepest problem. The deepest problem is that most of us, most of the time, neither know nor care whether “human rights” have a solid foundation beneath them: like Jefferson, we have come simply to believe in them in their own right. Now, in the post-1945 era, in the age of Hitler, we really do hold the existence of human rights and human equality to be self-evident. We can’t, intellectually, prove it to be true; but that doesn’t matter, because we feel that it is true. For now.

Why do we believe that human beings have rights? Even asking it feels uncomfortable, a questioning of what ought not to be questioned. To raise the problem is almost to blaspheme. The most honest response is that we simply do believe, down to our core, that human beings have rights, regardless of whether it can be proved. That conviction feels like an answer. In fact, it is the question.

The closest we can come to an actual answer is the one advanced by the United Nations’ Universal Declaration of Human Rights in 1948, a document whose title could not emphasize its totalizing ambitions more strongly. Having asserted, pragmatically, that human rights are “the foundation of freedom, justice and peace in the world,” it then explains that

disregard and contempt for human rights have resulted in barbarous acts which have outraged the conscience of mankind, and the advent of a world in which human beings shall enjoy freedom of speech and belief and freedom from fear and want has been proclaimed as the highest aspiration of the common people.

So, this is all about World War II. Of course it is! In 1948, how could it not be? The experience of exceptionally “barbarous acts,” and the claim that the “common people” of the whole world have united around the desire for Roosevelt’s four freedoms, give the declaration its urgency. Its confidence is that of the victors of a total war, sure that they have earned the right to speak for the “common people” of the world.

It took a decade and a half for the moral shock of the war to work its way to the surface. The most vivid single moment of that change was Adolf Eichmann’s trial in 1961, which confronted the world with horrors we had been trying hard to forget. But the best gauge of how the memory of the war shifted and changed is the movies. The war movies of the 1950s were somber (The Cruel Sea, 1953), rousingly patriotic (The Dam Busters, 1955), or rip-roaring adventures (Ice Cold in Alex, 1958), but they neither demonized Germans nor dwelled on Nazi crimes. The depiction of the German army as honorable opponents in The Longest Day (1962) was underpinned by the involvement of German actors and several former Wehrmacht generals in the production. In the much-loved film The Great Escape (1963), Germany’s regular armed forces are contrasted with the Gestapo and the S.S., who alone are made to bear the responsibility for atrocities. What was novel was that villainous Germans were given screen time at all. Over the following decades, cinematic Nazis became increasingly sinister, until by the twenty-first century we had movies like Fury (2014), in which the entire German war machine is presented as a nihilistic death-cult, or Inglourious Basterds (2009), a fully fictionalized account of an ideological war between Jew-hating Nazis and a squad of Jewish-American avengers.

From the 1960s, the movies began hesitantly to tackle the Nazi genocide itself. The Oscar-winning adaptation of the television play Judgment at Nuremberg appeared in 1961, while Eichmann was awaiting execution. The 1978 American television miniseries Holocaust helped to generalize that term. The 1987 British television movie Escape from Sobibor was one of the first such productions to be set in a death camp. In Schindler’s List (1993), Hollywood finally looked the Holocaust in the eye. The string of Holocaust films since then—Life Is Beautiful (1997), The Pianist (2002), Son of Saul (2015), The Zone of Interest (2023)—shows that the appetite for this story is not fading.

In movies that had nothing to do with World War II, Nazis began to appear as villains par excellence. The “Illinois Nazi Party” in The Blues Brothers (1980) is utterly gratuitous, but what better counterpoint could there be to that film’s anarchic racial inclusivity? And then there are the Indiana Jones movies of 1981, 1989, and 2023. The values of our age are summed up by Dr. Jones in The Last Crusade (1989): “Nazis! I hate these guys!”

By then “Nazism” had come to denote any attempt to wield authority over others. First, workplace bullies were “little Hitlers”; then punctuation pedants became “grammar Nazis.” A 1995 episode of Seinfeld featured “the Soup Nazi,” a New York soup vendor with strict views on his customers’ behavior. The International Holocaust Remembrance Alliance takes a dim view of this kind of trivialization, understandably enough, but it is inevitable. Nazis come as readily to our minds as did Satan to the minds of medieval Christians. A symbol of absolute evil is too useful to be left reverently in a corner. If hypocrisy is the tribute vice pays to virtue, trivialization is the tribute flippancy pays to earnest moral conviction.

In 1990, the internet pioneer Mike Godwin formulated his famous “law”: that as an online discussion grows longer, the chance that someone will compare someone else to Hitler or the Nazis inexorably increases, and that once it happens, the discussion ends. Almost nothing else about the internet is recognizable thirty-four years on, but Godwin’s Law still holds. Calling someone a Nazi is quite literally the final insult. It ends an argument because it is a punch in the face. What can you do in reply, other than punch back? In a relativist, pluralist age, Nazism is our one absolute reference point.

You can defend Stalin, peddle conspiracy theories about 9/11 or Covid-19 vaccines, or claim that most men accused of rape are innocent, and you will probably only boost your online following. But even now, if you deny the Holocaust, you are an intolerable monster, and when you bleat about cancel culture, only other monsters will care. And rightly so. To deny the Holocaust is to reject the deepest moral truth on which our society is built.

The clearest sign of the war’s mythic status is the several ways in which World War II has been translated into fantasy and science-fiction settings, reshaped and purified in the process. It started during the war itself, as J. R. R. Tolkien was writing what would become the twentieth century’s bestselling novel, The Lord of the Rings. Tolkien was quite right to insist that his War of the Ring was not an allegory for the real world’s war. He was a staunch anti-Nazi but also a veteran of World War I who knew that a war—any war—is “an ultimately evil job.” His fear, he wrote in 1944, was that “we are attempting to conquer Sauron with the Ring,” to fight evil with its own weapons. Even if we won such a war, our victory would only “breed new Saurons.” So Tolkien’s novel told the story of World War II, not as it was, but as it should have been: as a struggle against a dehumanizing tyrant in which the heroes, despite great temptations, destroy the power to impose tyranny, rather than use that power to defend themselves.

As Tolkien knew, in political terms this was a fantasy more unbelievable than any elf or wizard. That is why he wrote it as a myth. If we are to use the memory of World War II to reset our moral compasses, then we need purified versions of it, not the morally compromised reality. As Tolkien warned, we have been breeding new Saurons ever since—though not quite in the way he meant. Dark Lords, a series of ersatz Hitlers, have populated the most popular mythologies of the post-war era. In Star Wars, the Dark Lord has stormtroopers and jackbooted interrogators, blasts whole planets into nothingness, and is defeated by plucky farmboys in planes. In the Harry Potter novels and films, the Dark Lord pursues a racialized supremacy, giving his followers a name—the “Death Eaters”—and a set of symbols that evoke the S.S. The same vision is apparent in one of the most compelling creations of the British 1960s: The Daleks, villains of the television series Doctor Who, are miniature tanks whose relentless desire for “extermination” reveals them to be wrecks of life entombed in metal and hate.

These are the myths of Hitler on which generations of children in the post-Christian West have been raised. In these myths, the brutal lessons of World War II are transposed into morality tales. We are apparently determined to teach ourselves, and eager repeatedly to relearn, that this is what evil looks like—even though evils rarely appear in such plain dress.

So if we ask why Christianity went into retreat in the West from the early 1960s on, there is a simple answer: Christianity’s one crucial and virtually uncontested function in Western societies had suddenly failed. Whatever else Christianity had become by then, it was still our store of value. Believers and unbelievers alike accepted the authority of Jesus’s ethics as reflexively as we accept the notion of human rights now. But once a new set of values was in place—once a new lodestone had reset our moral compass, so that what had pointed towards Jesus now pointed away from Hitler—the adjustment of our coordinates made the old maps redundant. And so they were abandoned, or simply and quietly fell out of use.

Perhaps you still hold religious values, mixed in with the anti-Nazi values of our age. Perhaps you even tell yourself that your religious values are fundamental, and that the agreement of your religious and secular values is a happy coincidence. And perhaps you worry that we live in hopelessly divided societies, fractured into irreconcilable pluralities. We are slow to recognize that we do share a deep, strong, and pervasive moral consensus—slow, because of the inability of fish to know what water is. Our myth is that we live in a secular age, based on self-evident truths such as human rights. But in fact, we live in the age of Hitler. Our religion is World War II.

And now, in the 2020s, it is that faith that is crumbling.

World War II is losing its moral centrality. Much of the planet is, reasonably enough, reluctant to accept that “universal values” were defined for all time by the victors of a war among colonial powers. Even in the Western world, on the left, the growing intuition that racism is the defining evil of our time has meant that the Nazis’ racial tyranny looks like only one example among many. Slavery, imperialism, and apartheid are just as terrible. The post-1945 instinct that antisemitism is a case apart, an exceptional evil, has been dulled, not least as Israel has, for many on the left, been reclassified from victim to villain. On the right, the long-shared determination that any hint of fascism was unacceptable has been crumbling for two decades and more. It is easy to vilify political opportunists who have built their careers on populist race-baiting and defiance of the taboos that protect democratic norms, but these people have sprung up like weeds in every Western democracy. They are symptoms of the erosion, not its cause.

I am unfashionably encouraged by these developments. Not that I disagree with the values of the age of Hitler; I simply think they’re not enough. Our myths of World War II have taught us some valuable lessons, but also some misleading and even toxic ones: that evil appears in the guise of a dark lord; that it is best confronted with violence; that prudence, forbearance, and discretion are merely “appeasement.” As Tolkien could have told us, building our values around a war is unwise, since even a just war is, in itself, a very evil thing. More fundamentally, replacing a positive exemplar (Jesus) with a negative one (Hitler) comes at a heavy cost. It teaches us what to hate but not what to love. Our culture assures us that we are each free to pursue our own good, but—quite deliberately—gives us no resources to discern what that good might be. It assures us that we have rights and freedoms. But what are they for? Not, presumably, for triumphantly denouncing one another on social media. To get past that, however, we would need gentler virtues and sharper insights than the value-system the age of Hitler provides. We would need to recognize that evil is infinite in its varieties, and that Nazism is only one of its flavors; that evil is distributed, not personified; and that it is usually rooted inside ourselves. We cannot defeat it by jumping into a Spitfire and shooting at it.

Hence my measured optimism. In this century we will face perils ranging from climate breakdown to economic and demographic turmoil to the impact of artificial intelligence—to say nothing of old-fashioned nuclear weapons. The values we learned from World War II will be essential for confronting these evils; but they will manifestly not be enough. We will need to draw on our deeper cultural wells, namely our religious and philosophical traditions, which offer positive rather than negative values. Our anti-Nazi values separate the world into the black and white of good and evil, or, if we are being sophisticated, into shades of grey. It’s our deeper traditions that show us full color, so that instead of merely passing judgment, we can see beauty. In particular, those deeper traditions teach some of the virtues that we will most need in order to navigate this century, and which our anti-Nazi values conspicuously lack: humility, repentance, and forgiveness.

Since we must do this—bring out of our treasure both what is new and what is old—sooner or later we will. It’s a race: Who will reach this synthesis first? Will the left realize that their brittle secular values become more alienating the more aggressively they are asserted, and underpin them with deeper traditions? Or will the right realize that progressive secularism thrives most when it is under direct attack, and demonstrate that secular-progressive values can be brought to fruition only when they are infused with the beauty, richness, and strength of religion and other deep traditions? Which side will be the first to desist from shrill, alienating assertions of its own perfection, and embrace the subtle, disarming power of humility and repentance?

Eventually one side or the other will win the race, and we will leave the age of Hitler behind. Knowledge of what to love is the only position from which victory in our culture wars is possible. The question is how much damage will be done along the way.

Alec Ryrie is Professor of the History of Christianity at Durham University.